Dataset schema:

| Column | Type | Min | Max |
| --- | --- | --- | --- |
| modelId | string (length) | 4 | 111 |
| lastModified | string (length) | 24 | 24 |
| tags | list | | |
| pipeline_tag | string (length) | 5 | 30 |
| author | string (length) | 2 | 34 |
| config | null | | |
| securityStatus | null | | |
| id | string (length) | 4 | 111 |
| likes | int64 | 0 | 9.53k |
| downloads | int64 | 2 | 73.6M |
| library_name | string (length) | 2 | 84 |
| created | timestamp[us] | | |
| card | string (length) | 101 | 901k |
| card_len | int64 | 101 | 901k |
| embeddings | list | | |
**Row 1**

- modelId: google/canine-s
- lastModified: 2022-08-08T16:22:47.000Z
- tags: [ "transformers", "pytorch", "canine", "feature-extraction", "multilingual", "af", "sq", "ar", "an", "hy", "ast", "az", "ba", "eu", "bar", "be", "bn", "inc", "bs", "br", "bg", "my", "ca", "ceb", "ce", "zh", "cv", "hr", "cs", "da", "nl", "en", "et", "fi", "fr", "gl", "ka", "de", "el", "gu", "ht", "he", "hi", "hu", "is", "io", "id", "ga", "it", "ja", "jv", "kn", "kk", "ky", "ko", "la", "lv", "lt", "roa", "nds", "lm", "mk", "mg", "ms", "ml", "mr", "mn", "min", "ne", "new", "nb", "nn", "oc", "fa", "pms", "pl", "pt", "pa", "ro", "ru", "sco", "sr", "scn", "sk", "sl", "aze", "es", "su", "sw", "sv", "tl", "tg", "th", "ta", "tt", "te", "tr", "uk", "ud", "uz", "vi", "vo", "war", "cy", "fry", "pnb", "yo", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:2103.06874", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
- pipeline_tag: feature-extraction
- author: google
- config: null
- securityStatus: null
- id: google/canine-s
- likes: 13
- downloads: 10,141
- library_name: transformers
- created: 2022-03-02T23:29:05
card:

---
language: [multilingual, af, sq, ar, an, hy, ast, az, ba, eu, bar, be, bn, inc, bs, br, bg, my, ca, ceb, ce, zh, cv, hr, cs, da, nl, en, et, fi, fr, gl, ka, de, el, gu, ht, he, hi, hu, is, io, id, ga, it, ja, jv, kn, kk, ky, ko, la, lv, lt, roa, nds, lm, mk, mg, ms, ml, mr, mn, min, ne, new, nb, nn, oc, fa, pms, pl, pt, pa, ro, ru, sco, sr, hr, scn, sk, sl, aze, es, su, sw, sv, tl, tg, th, ta, tt, te, tr, uk, ud, uz, vi, vo, war, cy, fry, pnb, yo]
license: apache-2.0
datasets: [bookcorpus, wikipedia]
---

# CANINE-s (CANINE pre-trained with subword loss)

Pretrained CANINE model on 104 languages using a masked language modeling (MLM) objective. It was introduced in the paper [CANINE: Pre-training an Efficient Tokenization-Free Encoder for Language Representation](https://arxiv.org/abs/2103.06874) and first released in [this repository](https://github.com/google-research/language/tree/master/language/canine).

What's special about CANINE is that it doesn't require an explicit tokenizer (such as WordPiece or SentencePiece), unlike other models such as BERT and RoBERTa. Instead, it operates directly at the character level: each character is turned into its [Unicode code point](https://en.wikipedia.org/wiki/Code_point). This means that input processing is trivial and can typically be accomplished as:

```
input_ids = [ord(char) for char in text]
```

The `ord()` function is built into Python and turns each character into its Unicode code point.

Disclaimer: The team releasing CANINE did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Model description

CANINE is a transformer model pretrained on a large corpus of multilingual data in a self-supervised fashion, similar to BERT.
This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives:

* Masked language modeling (MLM): one randomly masks part of the inputs, which the model needs to predict. This model (CANINE-s) is trained with a subword loss, meaning that the model needs to predict the identities of subword tokens while taking characters as input. By reading characters yet predicting subword tokens, the hard token boundary constraint found in other models such as BERT is turned into a soft inductive bias in CANINE.
* Next sentence prediction (NSP): the model concatenates two sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict whether the two sentences followed each other or not.

This way, the model learns an inner representation of multiple languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the CANINE model as inputs.

## Intended uses & limitations

You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=canine) to look for fine-tuned versions on a task that interests you.

Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at models like GPT2.
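The character-to-code-point encoding described above can be illustrated with plain Python (a minimal sketch using only the standard library; the actual `CanineTokenizer` additionally handles details such as special tokens and padding):

```python
# Each character maps directly to its Unicode code point (no vocabulary).
text = "héllo"
input_ids = [ord(char) for char in text]
print(input_ids)  # → [104, 233, 108, 108, 111]

# The mapping is lossless: chr() inverts ord(), so the
# original string can be recovered from the ids.
decoded = "".join(chr(i) for i in input_ids)
assert decoded == text
```

Because every Unicode character already has a code point, this scheme has no out-of-vocabulary problem by construction.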
### How to use

Here is how to use this model:

```python
from transformers import CanineTokenizer, CanineModel

model = CanineModel.from_pretrained('google/canine-s')
tokenizer = CanineTokenizer.from_pretrained('google/canine-s')

inputs = ["Life is like a box of chocolates.", "You never know what you gonna get."]
encoding = tokenizer(inputs, padding="longest", truncation=True, return_tensors="pt")

outputs = model(**encoding)  # forward pass
pooled_output = outputs.pooler_output
sequence_output = outputs.last_hidden_state
```

## Training data

The CANINE model was pretrained on the multilingual Wikipedia data of [mBERT](https://github.com/google-research/bert/blob/master/multilingual.md), which includes 104 languages.

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2103-06874,
  author    = {Jonathan H. Clark and Dan Garrette and Iulia Turc and John Wieting},
  title     = {{CANINE:} Pre-training an Efficient Tokenization-Free Encoder for Language Representation},
  journal   = {CoRR},
  volume    = {abs/2103.06874},
  year      = {2021},
  url       = {https://arxiv.org/abs/2103.06874},
  archivePrefix = {arXiv},
  eprint    = {2103.06874},
  timestamp = {Tue, 16 Mar 2021 11:26:59 +0100},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2103-06874.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
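Sentence-level features such as the `pooled_output` above (or the precomputed vectors in this dataset's embeddings column) are commonly compared with cosine similarity. A minimal standard-library sketch, purely illustrative:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length float vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Identical vectors score 1.0; orthogonal vectors score 0.0.
v1 = [0.1, 0.2, -0.3]
v2 = [0.1, 0.2, -0.3]
print(round(cosine_similarity(v1, v2), 6))  # → 1.0
```

In practice one would use a vectorized implementation (e.g. over NumPy arrays or torch tensors) rather than Python loops.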
card_len: 5,319
embeddings: [ [ -0.03924560546875, -0.0517578125, 0.0028553009033203125, … ] ] *(full float vector elided)*
**Row 2**

- modelId: togethercomputer/LLaMA-2-7B-32K
- lastModified: 2023-10-03T17:39:07.000Z
- tags: [ "transformers", "pytorch", "llama", "text-generation", "custom_code", "en", "dataset:togethercomputer/RedPajama-Data-1T", "dataset:togethercomputer/RedPajama-Data-Instruct", "dataset:EleutherAI/pile", "dataset:togethercomputer/Long-Data-Collections", "license:llama2", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
- pipeline_tag: text-generation
- author: togethercomputer
- config: null
- securityStatus: null
- id: togethercomputer/LLaMA-2-7B-32K
- likes: 482
- downloads: 10,130
- library_name: transformers
- created: 2023-07-26T02:19:41
card:

---
license: llama2
datasets:
- togethercomputer/RedPajama-Data-1T
- togethercomputer/RedPajama-Data-Instruct
- EleutherAI/pile
- togethercomputer/Long-Data-Collections
language:
- en
library_name: transformers
---

# LLaMA-2-7B-32K

## Model Description

LLaMA-2-7B-32K is an open-source, long-context language model developed by Together, fine-tuned from Meta's original Llama-2 7B model. This model represents our efforts to contribute to the rapid progress of the open-source ecosystem for large language models. The model has been extended to a context length of 32K with position interpolation, allowing applications such as multi-document QA and long-text summarization.

## What's new?

This model introduces several improvements and new features:

1. **Extended Context:** The model has been trained to handle context lengths up to 32K, a significant improvement over previous versions.
2. **Pre-training and Instruction Tuning:** We have shared our data recipe, which consists of a mixture of pre-training and instruction tuning data.
3. **Fine-tuning Examples:** We provide examples of how to fine-tune the model for specific applications, including book summarization and long-context question answering.
4. **Software Support:** We have updated both the inference and training stack to allow efficient inference and fine-tuning for 32K context.

## Model Architecture

The model follows the architecture of Llama-2-7B and extends it to handle a longer context. It leverages the recently released FlashAttention-2 and a range of other optimizations to improve the speed and efficiency of inference and training.

## Training and Fine-tuning

The model has been trained using a mixture of pre-training and instruction tuning data.
- In the first training phase of continued pre-training, our data mixture contains 25% RedPajama Book, 25% RedPajama ArXiv (including abstracts), 25% other data from RedPajama, and 25% from the UL2 Oscar Data, which is a part of OIG (Open-Instruction-Generalist), asking the model to fill in missing chunks or complete the text. To enhance the long-context ability, we exclude data shorter than 2K words. The inclusion of UL2 Oscar Data is effective in compelling the model to read and utilize long-range context.
- We then fine-tune the model to focus on its few-shot capacity under long context, including 20% Natural Instructions (NI), 20% Public Pool of Prompts (P3), and 20% the Pile. We decontaminated all data against HELM core scenarios. We teach the model to leverage in-context examples by packing examples into one 32K-token sequence. To maintain the knowledge learned in the first phase, we incorporate 20% RedPajama-Data Book and 20% RedPajama-Data ArXiv.

Next, we provide examples of how to fine-tune the model for specific applications. The example datasets are placed in [togethercomputer/Long-Data-Collections](https://huggingface.co/datasets/togethercomputer/Long-Data-Collections). You can use the [OpenChatKit](https://github.com/togethercomputer/OpenChatKit) to fine-tune your own 32K model over LLaMA-2-7B-32K; please refer to OpenChatKit for step-by-step instructions.

1. Long Context QA. We take as an example the multi-document question answering task from the paper "Lost in the Middle: How Language Models Use Long Contexts". The input for the model consists of (i) a question that requires an answer and (ii) k documents, which are passages extracted from Wikipedia. Notably, only one of these documents contains the answer to the question, while the remaining k − 1 documents, termed "distractor" documents, do not.
To successfully perform this task, the model must identify and utilize the document containing the answer from its input context. With OCK, simply run the following command to fine-tune:

```
bash training/finetune_llama-2-7b-32k-mqa.sh
```

2. Summarization. Another example is BookSum, a unique dataset designed to address the challenges of long-form narrative summarization. This dataset features source documents from the literature domain, including novels, plays, and stories, and offers human-written, highly abstractive summaries. We here focus on chapter-level data. BookSum poses a unique set of challenges, necessitating that the model comprehensively read through each chapter. With OCK, simply run the following command to fine-tune:

```
bash training/finetune_llama-2-7b-32k-booksum.sh
```

## Inference

You can use the [Together API](https://together.ai/blog/api-announcement) to try out LLaMA-2-7B-32K for inference. The updated inference stack allows for efficient inference.

To run the model locally, we strongly recommend installing Flash Attention V2, which is necessary to obtain the best performance:

```
# Please update the path of `CUDA_HOME`
export CUDA_HOME=/usr/local/cuda-11.8
pip install transformers==4.31.0
pip install sentencepiece
pip install ninja
pip install flash-attn --no-build-isolation
pip install git+https://github.com/HazyResearch/flash-attention.git#subdirectory=csrc/rotary
```

You can use this model directly from the Hugging Face Model Hub or fine-tune it on your own data using the OpenChatKit.
```python
import torch  # needed for torch.float16 below
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("togethercomputer/LLaMA-2-7B-32K")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/LLaMA-2-7B-32K",
                                             trust_remote_code=True,
                                             torch_dtype=torch.float16)

input_context = "Your text here"
input_ids = tokenizer.encode(input_context, return_tensors="pt")
output = model.generate(input_ids, max_length=128, temperature=0.7)
output_text = tokenizer.decode(output[0], skip_special_tokens=True)
print(output_text)
```

Alternatively, you can set `trust_remote_code=False` if you prefer not to use flash attention.

## Limitations and Bias

As with all language models, LLaMA-2-7B-32K may generate incorrect or biased content. It's important to keep this in mind when using the model.

## Community

Join us on [Together Discord](https://discord.gg/6ZVDU8tTD4).
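The two-phase data mixtures described under Training and Fine-tuning can be sketched as weighted sampling over data sources. A minimal illustration only, using the percentages stated above; the source names are hypothetical shorthand and the real training pipeline is not shown:

```python
import random

# Phase 1 (continued pre-training) and phase 2 (long-context instruction
# tuning) mixture weights, as stated in the card.
phase1_mixture = {
    "redpajama_book": 0.25,
    "redpajama_arxiv": 0.25,
    "redpajama_other": 0.25,
    "ul2_oscar": 0.25,
}
phase2_mixture = {
    "natural_instructions": 0.20,
    "p3": 0.20,
    "pile": 0.20,
    "redpajama_book": 0.20,
    "redpajama_arxiv": 0.20,
}

def sample_source(mixture, rng=random):
    """Pick a data source with probability proportional to its weight."""
    sources, weights = zip(*mixture.items())
    return rng.choices(sources, weights=weights, k=1)[0]

# Sanity check: each phase's weights sum to 1.
assert abs(sum(phase1_mixture.values()) - 1.0) < 1e-9
assert abs(sum(phase2_mixture.values()) - 1.0) < 1e-9
print(sample_source(phase1_mixture))
```

Each sampled source would then contribute the next document (or packed 32K-token sequence) to the training stream.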
card_len: 6,141
embeddings: [ [ -0.03228759765625, -0.06805419921875, 0.0242919921875, … *(float vector truncated in source)*
-0.0128631591796875, -0.039215087890625, 0.050567626953125, -0.0032367706298828125, 0.0217132568359375, 0.0284576416015625, 0.013671875, -0.023345947265625, -0.015655517578125, -0.033935546875, 0.039703369140625, -0.060638427734375, -0.0299072265625, -0.05255126953125, -0.0175018310546875, -0.048126220703125, -0.0083465576171875, -0.02642822265625, -0.02105712890625, -0.048004150390625, -0.005733489990234375, 0.0498046875, 0.0504150390625, 0.00667572021484375, 0.04931640625, -0.05010986328125, 0.033660888671875, 0.0199432373046875, 0.0198974609375, 0.0071563720703125, -0.060821533203125, -0.0277252197265625, 0.004543304443359375, -0.029541015625, -0.06427001953125, 0.031707763671875, 0.0223236083984375, 0.019256591796875, 0.0258941650390625, 0.01245880126953125, 0.058685302734375, -0.029449462890625, 0.058990478515625, 0.00395965576171875, -0.07159423828125, 0.043975830078125, -0.0380859375, 0.040802001953125, 0.0396728515625, 0.02734375, -0.041717529296875, -0.02960205078125, -0.04180908203125, -0.06488037109375, 0.062469482421875, 0.007076263427734375, 0.018402099609375, -0.00940704345703125, 0.038330078125, 0.0045166015625, 0.01554107666015625, -0.0673828125, -0.0146484375, -0.0115203857421875, -0.02838134765625, -0.018280029296875, -0.034423828125, -0.0091400146484375, -0.03289794921875, 0.048126220703125, -0.0177001953125, 0.0224456787109375, 0.01467132568359375, -0.017822265625, -0.007808685302734375, 0.004535675048828125, 0.0567626953125, 0.05010986328125, -0.016021728515625, -0.006069183349609375, 0.029266357421875, -0.03924560546875, -0.004863739013671875, 0.0189666748046875, -0.0034046173095703125, -0.017242431640625, 0.03271484375, 0.0704345703125, 0.02911376953125, -0.059356689453125, 0.035125732421875, 0.0028095245361328125, 0.005054473876953125, -0.020294189453125, 0.010009765625, 0.0253143310546875, 0.039215087890625, 0.021484375, -0.025390625, -0.00121307373046875, -0.045989990234375, 0.004627227783203125, 0.017974853515625, -0.0038394927978515625, 
-0.0242919921875, 0.05462646484375, 0.01312255859375, -0.01654052734375, 0.0450439453125, -0.0039825439453125, -0.034759521484375, 0.0672607421875, 0.04998779296875, 0.04217529296875, -0.019317626953125, 0.004947662353515625, 0.036041259765625, 0.020751953125, -0.00704193115234375, 0.0079803466796875, -0.0194091796875, -0.039886474609375, -0.026092529296875, -0.06329345703125, -0.0343017578125, 0.0157470703125, -0.044525146484375, 0.0330810546875, -0.0252838134765625, -0.004421234130859375, -0.009429931640625, 0.0120697021484375, -0.04296875, 0.017578125, 0.0033893585205078125, 0.07177734375, -0.049041748046875, 0.06939697265625, 0.032806396484375, -0.040924072265625, -0.05389404296875, -0.01529693603515625, -0.004520416259765625, -0.06707763671875, 0.032135009765625, 0.03369140625, 0.00751495361328125, -0.003063201904296875, -0.058441162109375, -0.0806884765625, 0.10504150390625, 0.0264892578125, -0.050079345703125, -0.0143890380859375, -0.003101348876953125, 0.044189453125, -0.022216796875, 0.0236053466796875, 0.05255126953125, 0.025543212890625, -0.0007805824279785156, -0.08026123046875, 0.0086517333984375, -0.019287109375, -0.01100921630859375, 0.0009741783142089844, -0.07269287109375, 0.0706787109375, -0.03338623046875, -0.01178741455078125, 0.020172119140625, 0.07220458984375, 0.0269317626953125, 0.03466796875, 0.0311737060546875, 0.0478515625, 0.05755615234375, 0.0159149169921875, 0.08575439453125, -0.0235748291015625, 0.0251922607421875, 0.0848388671875, -0.00594329833984375, 0.07464599609375, 0.03021240234375, -0.0084686279296875, 0.039093017578125, 0.069580078125, 0.00653076171875, 0.03839111328125, -0.0028858184814453125, -0.0106201171875, -0.0011882781982421875, -0.003437042236328125, -0.03765869140625, 0.033233642578125, 0.025970458984375, -0.0202178955078125, -0.006378173828125, -0.0030269622802734375, 0.0167083740234375, -0.0168609619140625, -0.00862884521484375, 0.06195068359375, 0.0247344970703125, -0.06365966796875, 0.058990478515625, 
0.01258087158203125, 0.06805419921875, -0.047821044921875, 0.0092620849609375, -0.037109375, 0.01114654541015625, -0.0045318603515625, -0.04693603515625, 0.0185394287109375, -0.0013256072998046875, -0.016845703125, 0.00731658935546875, 0.04925537109375, -0.042724609375, -0.0291900634765625, 0.02838134765625, 0.0143890380859375, 0.0284576416015625, -0.0062713623046875, -0.0484619140625, 0.0161285400390625, -0.004512786865234375, -0.037017822265625, 0.035888671875, 0.02349853515625, -0.006740570068359375, 0.053314208984375, 0.04620361328125, -0.0149383544921875, -0.005741119384765625, -0.010162353515625, 0.0799560546875, -0.055267333984375, -0.04241943359375, -0.055908203125, 0.038543701171875, 0.0022678375244140625, -0.044830322265625, 0.04534912109375, 0.0261077880859375, 0.06939697265625, -0.0010519027709960938, 0.035552978515625, -0.004428863525390625, 0.035125732421875, -0.03155517578125, 0.059722900390625, -0.056640625, 0.00963592529296875, -0.0238800048828125, -0.07177734375, -0.01535797119140625, 0.045928955078125, -0.0186614990234375, 0.001995086669921875, 0.0421142578125, 0.06610107421875, -0.00988006591796875, -0.010223388671875, 0.0138092041015625, 0.02044677734375, 0.014617919921875, 0.0546875, 0.049835205078125, -0.0491943359375, 0.03717041015625, -0.02685546875, -0.027801513671875, -0.0280914306640625, -0.060211181640625, -0.08367919921875, -0.056488037109375, -0.0189208984375, -0.0233001708984375, 0.001773834228515625, 0.059722900390625, 0.053436279296875, -0.042022705078125, -0.0218353271484375, 0.0178985595703125, 0.01058197021484375, -0.00234222412109375, -0.017791748046875, 0.0233917236328125, -0.0213623046875, -0.0628662109375, 0.034393310546875, 0.002838134765625, 0.0126800537109375, -0.0146942138671875, -0.01136016845703125, -0.0207061767578125, 0.02008056640625, 0.058624267578125, 0.0312042236328125, -0.057464599609375, -0.017059326171875, 0.0038204193115234375, -0.0156402587890625, 0.00980377197265625, 0.0228118896484375, -0.05279541015625, 
0.018310546875, 0.0224151611328125, 0.049652099609375, 0.05419921875, 0.007503509521484375, 0.02838134765625, -0.053619384765625, 0.01934814453125, 0.0031890869140625, 0.022705078125, 0.0265655517578125, -0.036865234375, 0.04571533203125, 0.00455474853515625, -0.05438232421875, -0.0760498046875, 0.00818634033203125, -0.09503173828125, -0.0277252197265625, 0.10565185546875, -0.006214141845703125, -0.039276123046875, 0.009857177734375, -0.0307464599609375, 0.034942626953125, -0.034515380859375, 0.07073974609375, 0.05828857421875, -0.003475189208984375, -0.018768310546875, -0.03302001953125, 0.03338623046875, 0.032135009765625, -0.0731201171875, 0.004673004150390625, 0.039581298828125, 0.03436279296875, 0.01114654541015625, 0.06170654296875, 0.00386810302734375, 0.0222930908203125, -0.008697509765625, 0.007793426513671875, -0.00774383544921875, -0.0290374755859375, -0.020751953125, -0.004840850830078125, -0.00992584228515625, -0.00528717041015625 ] ]
PY007/TinyLlama-1.1B-step-50K-105b
2023-09-16T03:06:11.000Z
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "en", "dataset:cerebras/SlimPajama-627B", "dataset:bigcode/starcoderdata", "license:apache-2.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
PY007
null
null
PY007/TinyLlama-1.1B-step-50K-105b
103
10,099
transformers
2023-09-01T08:59:02
--- license: apache-2.0 datasets: - cerebras/SlimPajama-627B - bigcode/starcoderdata language: - en --- <div align="center"> # TinyLlama-1.1B </div> https://github.com/jzhang38/TinyLlama The TinyLlama project aims to **pretrain** a **1.1B Llama model on 3 trillion tokens**. With proper optimization, we can achieve this within a span of "just" 90 days using 16 A100-40G GPUs 🚀🚀. Training started on 2023-09-01. <div align="center"> <img src="./TinyLlama_logo.png" width="300"/> </div> We adopted exactly the same architecture and tokenizer as Llama 2, so TinyLlama can be plugged into many open-source projects built on Llama. With only 1.1B parameters, TinyLlama is also compact enough for applications with restricted compute and memory budgets. #### This Model This is an intermediate checkpoint after 50K steps, trained on 105B tokens. #### Release Schedule We will roll out intermediate checkpoints on the schedule below, alongside some baseline models for comparison. 
| Date | HF Checkpoint | Tokens | Step | HellaSwag Acc_norm | |------------|-------------------------------------------------|--------|------|---------------------| | Baseline | [StableLM-Alpha-3B](https://huggingface.co/stabilityai/stablelm-base-alpha-3b)| 800B | -- | 38.31 | | Baseline | [Pythia-1B-intermediate-step-50k-105b](https://huggingface.co/EleutherAI/pythia-1b/tree/step50000) | 105B | 50k | 42.04 | | Baseline | [Pythia-1B](https://huggingface.co/EleutherAI/pythia-1b) | 300B | 143k | 47.16 | | 2023-09-04 | [TinyLlama-1.1B-intermediate-step-50k-105b](https://huggingface.co/PY007/TinyLlama-1.1B-step-50K-105b) | 105B | 50k | 43.50 | | 2023-09-16 | -- | 500B | -- | -- | | 2023-10-01 | -- | 1T | -- | -- | | 2023-10-16 | -- | 1.5T | -- | -- | | 2023-10-31 | -- | 2T | -- | -- | | 2023-11-15 | -- | 2.5T | -- | -- | | 2023-12-01 | -- | 3T | -- | -- | #### How to use You will need transformers>=4.31. Check the [TinyLlama](https://github.com/jzhang38/TinyLlama) GitHub page for more information. ``` from transformers import AutoTokenizer import transformers import torch model = "PY007/TinyLlama-1.1B-step-50K-105b" tokenizer = AutoTokenizer.from_pretrained(model) pipeline = transformers.pipeline( "text-generation", model=model, torch_dtype=torch.float16, device_map="auto", ) sequences = pipeline( 'The TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens. With some proper optimization, we can achieve this within a span of "just" 90 days using 16 A100-40G GPUs 🚀🚀. The training has started on 2023-09-01.', do_sample=True, top_k=10, num_return_sequences=1, repetition_penalty=1.5, eos_token_id=tokenizer.eos_token_id, max_length=500, ) for seq in sequences: print(f"Result: {seq['generated_text']}") ```
3,411
[ [ -0.0281829833984375, -0.04351806640625, 0.03173828125, 0.0161895751953125, -0.038177490234375, -0.003505706787109375, -0.01090240478515625, -0.03460693359375, 0.03607177734375, 0.005481719970703125, -0.059844970703125, -0.0296478271484375, -0.03607177734375, -0.0130615234375, -0.0207672119140625, 0.081787109375, 0.0013170242309570312, -0.01264190673828125, 0.0256500244140625, -0.00205230712890625, -0.031097412109375, -0.0011272430419921875, -0.056427001953125, -0.0236053466796875, 0.036285400390625, 0.04681396484375, 0.0501708984375, 0.0621337890625, 0.031707763671875, 0.0162353515625, -0.0194244384765625, -0.00038504600524902344, -0.0299072265625, -0.029571533203125, 0.02410888671875, -0.04534912109375, -0.058746337890625, 0.0073089599609375, 0.0540771484375, 0.02081298828125, -0.00771331787109375, 0.05133056640625, 0.0024242401123046875, 0.0278472900390625, -0.0228118896484375, 0.01001739501953125, -0.033660888671875, 0.01251220703125, -0.0323486328125, 0.01094818115234375, -0.005931854248046875, -0.0304718017578125, 0.0042877197265625, -0.0599365234375, 0.00928497314453125, 0.0212554931640625, 0.084716796875, 0.0259552001953125, -0.01340484619140625, -0.021148681640625, -0.0211944580078125, 0.0489501953125, -0.05224609375, -0.00337982177734375, 0.0230560302734375, 0.0252685546875, 0.00153350830078125, -0.07415771484375, -0.046661376953125, -0.01800537109375, -0.004695892333984375, -0.0026493072509765625, -0.016998291015625, -0.01317596435546875, 0.0255279541015625, 0.038665771484375, -0.0416259765625, 0.020416259765625, -0.040771484375, -0.01056671142578125, 0.046173095703125, 0.03875732421875, 0.0185394287109375, -0.0259246826171875, -0.0259552001953125, -0.0121002197265625, -0.0628662109375, 0.0062713623046875, 0.013336181640625, 0.0253143310546875, -0.052459716796875, 0.03814697265625, -0.0036830902099609375, 0.023406982421875, 0.01534271240234375, -0.020416259765625, 0.0162353515625, -0.0374755859375, -0.034820556640625, -0.0011339187622070312, 
0.06396484375, 0.008270263671875, -0.01092529296875, 0.0139923095703125, -0.002719879150390625, 0.01215362548828125, -0.00295257568359375, -0.08380126953125, -0.028594970703125, 0.0136566162109375, -0.048736572265625, -0.040130615234375, -0.0170440673828125, -0.0631103515625, -0.004856109619140625, -0.0030002593994140625, 0.044281005859375, -0.024017333984375, -0.006229400634765625, -0.01232147216796875, 0.0292205810546875, 0.01331329345703125, 0.027740478515625, -0.0732421875, 0.0116424560546875, 0.041961669921875, 0.08184814453125, 0.0203399658203125, -0.0294189453125, -0.018585205078125, -0.0052337646484375, -0.0223236083984375, 0.038787841796875, -0.0010623931884765625, -0.023590087890625, -0.0239715576171875, -0.00614166259765625, -0.01123809814453125, -0.0274200439453125, 0.01274871826171875, -0.0244598388671875, 0.005634307861328125, -0.013336181640625, -0.022064208984375, -0.01105499267578125, 0.0158233642578125, -0.039764404296875, 0.0758056640625, -0.004894256591796875, -0.044525146484375, 0.024383544921875, -0.04339599609375, 0.0031375885009765625, -0.01297760009765625, -0.0055694580078125, -0.03851318359375, 0.003314971923828125, 0.0205078125, 0.0166473388671875, -0.04083251953125, -0.003734588623046875, -0.014373779296875, -0.040252685546875, 0.0020847320556640625, -0.0025920867919921875, 0.0673828125, 0.0333251953125, -0.042236328125, 0.0111846923828125, -0.058197021484375, 0.001392364501953125, 0.0294189453125, -0.0245361328125, 0.0196990966796875, -0.0253143310546875, 0.011138916015625, 0.01300048828125, 0.03582763671875, -0.029571533203125, 0.0458984375, -0.04547119140625, 0.040008544921875, 0.0693359375, -0.007328033447265625, 0.037078857421875, -0.032379150390625, 0.04327392578125, -0.0006551742553710938, 0.0301055908203125, -0.01139068603515625, -0.05078125, -0.09716796875, -0.0261688232421875, 0.03411865234375, 0.0137939453125, -0.0330810546875, 0.02899169921875, -0.028350830078125, -0.064697265625, -0.040985107421875, 0.0182952880859375, 
0.0290374755859375, 0.02410888671875, 0.0245819091796875, -0.0264739990234375, -0.05169677734375, -0.056304931640625, 0.0195159912109375, -0.03851318359375, 0.00405120849609375, 0.0004432201385498047, 0.0679931640625, -0.03045654296875, 0.062469482421875, -0.038360595703125, -0.03656005859375, -0.0157623291015625, 0.006496429443359375, 0.032623291015625, 0.04620361328125, 0.044219970703125, -0.0259246826171875, -0.035369873046875, -0.004695892333984375, -0.047515869140625, 0.01141357421875, -0.0019283294677734375, -0.0095977783203125, -0.000743865966796875, 0.01190185546875, -0.059539794921875, 0.03594970703125, 0.0374755859375, -0.026580810546875, 0.0236968994140625, -0.004772186279296875, -0.0199737548828125, -0.0780029296875, 0.01436614990234375, -0.01554107666015625, -0.01220703125, -0.0355224609375, 0.0194549560546875, 0.0016183853149414062, -0.007549285888671875, -0.037994384765625, 0.0428466796875, -0.023101806640625, -0.0013704299926757812, -0.0220794677734375, -0.0035800933837890625, -0.0177001953125, 0.036468505859375, -0.01617431640625, 0.053741455078125, 0.031829833984375, -0.033355712890625, 0.01409149169921875, 0.022979736328125, -0.03082275390625, 0.00579833984375, -0.05303955078125, 0.021240234375, 0.017791748046875, 0.030059814453125, -0.05902099609375, -0.0163726806640625, 0.049468994140625, -0.0297698974609375, 0.01605224609375, -0.010345458984375, -0.04913330078125, -0.045166015625, -0.037506103515625, 0.03997802734375, 0.05224609375, -0.0543212890625, 0.025115966796875, 0.024932861328125, 0.00698089599609375, -0.012725830078125, -0.054595947265625, -0.0008797645568847656, -0.0289306640625, -0.052642822265625, 0.0237579345703125, -0.00528717041015625, -0.0003383159637451172, -0.01080322265625, -0.00231170654296875, 0.01248931884765625, 0.004131317138671875, 0.036102294921875, 0.0259857177734375, -0.0187530517578125, -0.007358551025390625, -0.00751495361328125, -0.02362060546875, -0.0158843994140625, -0.020599365234375, 0.051849365234375, 
-0.0404052734375, -0.01375579833984375, -0.056854248046875, -0.00926971435546875, 0.021575927734375, 0.006397247314453125, 0.04736328125, 0.048309326171875, -0.04681396484375, 0.00839996337890625, -0.034820556640625, -0.0147857666015625, -0.037506103515625, 0.003936767578125, -0.0261993408203125, -0.058074951171875, 0.044952392578125, 0.006977081298828125, 0.01313018798828125, 0.058319091796875, 0.06884765625, -0.01036834716796875, 0.06640625, 0.050811767578125, -0.0090484619140625, 0.03656005859375, -0.0728759765625, 0.00012481212615966797, -0.06689453125, -0.0260162353515625, -0.0267333984375, -0.035736083984375, -0.03436279296875, -0.043212890625, 0.0189208984375, 0.01654052734375, -0.0404052734375, 0.041534423828125, -0.0295562744140625, 0.016204833984375, 0.03143310546875, 0.0043182373046875, 0.0116119384765625, 0.0016040802001953125, -0.01465606689453125, -0.0174560546875, -0.05975341796875, -0.0528564453125, 0.0987548828125, 0.0386962890625, 0.045654296875, 0.00044727325439453125, 0.06878662109375, -0.001087188720703125, 0.040557861328125, -0.03912353515625, 0.0474853515625, 0.017730712890625, -0.06005859375, -0.005313873291015625, -0.0103912353515625, -0.06103515625, 0.033233642578125, -0.00856781005859375, -0.06884765625, 0.015228271484375, 0.0212860107421875, -0.054412841796875, 0.0183563232421875, -0.036285400390625, 0.067138671875, -0.01300048828125, -0.01690673828125, -0.0240325927734375, -0.046112060546875, 0.03826904296875, -0.0173492431640625, 0.01430511474609375, -0.02520751953125, -0.012847900390625, 0.0732421875, -0.06488037109375, 0.059356689453125, -0.012786865234375, 0.006153106689453125, 0.03240966796875, -0.0068359375, 0.035369873046875, 0.021820068359375, 0.0006532669067382812, 0.0213775634765625, -0.0016412734985351562, -0.036834716796875, -0.0021800994873046875, 0.053192138671875, -0.07244873046875, -0.035186767578125, -0.056640625, -0.025665283203125, 0.00661468505859375, 0.01062774658203125, 0.0390625, -0.00971221923828125, 
-0.00641632080078125, -0.00969696044921875, 0.0170745849609375, 0.00545501708984375, 0.04351806640625, 0.0177001953125, -0.0306854248046875, -0.031707763671875, 0.06640625, 0.01314544677734375, -0.00450897216796875, -0.0034580230712890625, 0.01261138916015625, -0.00853729248046875, -0.04534912109375, -0.049285888671875, 0.035888671875, -0.029510498046875, -0.0278167724609375, -0.0267791748046875, -0.01078033447265625, -0.021697998046875, -0.003955841064453125, -0.049285888671875, -0.043853759765625, -0.05230712890625, 0.00397491455078125, 0.0198211669921875, 0.049957275390625, -0.0164642333984375, 0.04937744140625, -0.038787841796875, 0.01837158203125, 0.0305938720703125, -0.00937652587890625, 0.0204925537109375, -0.0697021484375, -0.04046630859375, 0.00513458251953125, -0.04815673828125, -0.051605224609375, 0.031707763671875, 0.0146026611328125, 0.0150909423828125, 0.04339599609375, -0.032867431640625, 0.08746337890625, -0.0330810546875, 0.06365966796875, 0.033935546875, -0.068603515625, 0.0672607421875, -0.01263427734375, 0.0113983154296875, 0.044769287109375, 0.00620269775390625, -0.017974853515625, -0.0239715576171875, -0.058746337890625, -0.055633544921875, 0.07415771484375, 0.0309600830078125, -0.0007214546203613281, 0.0181121826171875, 0.029571533203125, -0.007534027099609375, 0.00861358642578125, -0.06396484375, -0.023193359375, -0.0138397216796875, -0.00559234619140625, -0.01226043701171875, -0.0235595703125, -0.0186004638671875, -0.04412841796875, 0.056427001953125, -0.011474609375, 0.0281829833984375, -0.006511688232421875, -0.0160675048828125, -0.0191497802734375, -0.015838623046875, 0.044830322265625, 0.04266357421875, -0.0189666748046875, -0.01393890380859375, 0.0367431640625, -0.047332763671875, 0.0233306884765625, 0.0085601806640625, -0.0263824462890625, -0.00732421875, 0.0297393798828125, 0.06158447265625, 0.0287628173828125, -0.0293426513671875, 0.0301513671875, -0.00391387939453125, -0.00347900390625, -0.025390625, 0.002956390380859375, 
0.01241302490234375, 0.03240966796875, 0.0360107421875, -0.0028057098388671875, -0.005764007568359375, -0.036651611328125, -0.011322021484375, 0.013092041015625, 0.005603790283203125, -0.04229736328125, 0.07769775390625, -0.00261688232421875, -0.008087158203125, 0.03662109375, -0.00737762451171875, -0.0249176025390625, 0.07080078125, 0.0311431884765625, 0.049957275390625, -0.005283355712890625, -0.00196075439453125, 0.04473876953125, 0.049163818359375, -0.0153961181640625, 0.014495849609375, 0.0011548995971679688, -0.019439697265625, 0.00753021240234375, -0.061920166015625, -0.0181732177734375, 0.003887176513671875, -0.0355224609375, 0.031463623046875, -0.059417724609375, -0.0204315185546875, -0.0005669593811035156, 0.047149658203125, -0.06781005859375, 0.022003173828125, 0.0211181640625, 0.0782470703125, -0.05889892578125, 0.08746337890625, 0.042449951171875, -0.046112060546875, -0.08013916015625, -0.0027141571044921875, 0.0191497802734375, -0.08123779296875, 0.048065185546875, 0.028167724609375, 0.0152740478515625, 0.00812530517578125, -0.033477783203125, -0.0731201171875, 0.1204833984375, 0.03082275390625, -0.047332763671875, -0.0147705078125, -0.005279541015625, 0.042388916015625, -0.027740478515625, 0.0186309814453125, 0.051666259765625, 0.03436279296875, 0.00555419921875, -0.0726318359375, 0.013702392578125, -0.0253753662109375, 0.033447265625, -0.0087127685546875, -0.0858154296875, 0.08795166015625, -0.02288818359375, -0.0189208984375, 0.04656982421875, 0.0692138671875, 0.034332275390625, 0.0212554931640625, 0.031280517578125, 0.0694580078125, 0.0509033203125, -0.024810791015625, 0.0732421875, -0.02569580078125, 0.05426025390625, 0.053619384765625, 0.01456451416015625, 0.052032470703125, 0.044830322265625, -0.0296630859375, 0.037078857421875, 0.08135986328125, -0.00908660888671875, 0.039337158203125, 0.0010204315185546875, -0.01177978515625, -0.01122283935546875, 0.00798797607421875, -0.0594482421875, 0.02081298828125, 0.023712158203125, -0.0212860107421875, 
-0.0027790069580078125, 0.00457000732421875, 0.006076812744140625, -0.0355224609375, -0.0350341796875, 0.032196044921875, 0.02386474609375, -0.0170440673828125, 0.04986572265625, 0.019256591796875, 0.06304931640625, -0.04180908203125, 0.0273590087890625, -0.032440185546875, 0.01514434814453125, -0.020721435546875, -0.01220703125, 0.005466461181640625, 0.016845703125, 0.0090484619140625, -0.00775909423828125, 0.040069580078125, -0.003353118896484375, -0.04827880859375, 0.00894927978515625, 0.01873779296875, 0.007476806640625, 0.017333984375, -0.056671142578125, 0.03192138671875, -0.00762176513671875, -0.043426513671875, 0.025970458984375, 0.00574493408203125, 0.0173797607421875, 0.042724609375, 0.044342041015625, 0.0136871337890625, 0.035675048828125, -0.0203704833984375, 0.07537841796875, -0.04180908203125, -0.04791259765625, -0.07373046875, 0.029510498046875, 0.00969696044921875, -0.04083251953125, 0.067626953125, 0.055023193359375, 0.0439453125, 0.0017185211181640625, 0.0190887451171875, -0.0186920166015625, 0.00450897216796875, -0.04132080078125, 0.051727294921875, -0.0655517578125, 0.01806640625, -0.011016845703125, -0.0584716796875, -0.01288604736328125, 0.07440185546875, -0.01300811767578125, 0.00353240966796875, 0.039825439453125, 0.05731201171875, -0.00383758544921875, 0.008636474609375, -0.01410675048828125, 0.0204620361328125, 0.0265655517578125, 0.06976318359375, 0.0623779296875, -0.066162109375, 0.05078125, -0.046478271484375, -0.01343536376953125, -0.038238525390625, -0.03997802734375, -0.053192138671875, -0.027801513671875, -0.02349853515625, -0.020233154296875, -0.007457733154296875, 0.075927734375, 0.059295654296875, -0.052032470703125, -0.025665283203125, -0.0033626556396484375, 0.0012865066528320312, -0.000012576580047607422, -0.0128021240234375, 0.030426025390625, -0.004619598388671875, -0.0677490234375, 0.0225067138671875, 0.016448974609375, 0.0212554931640625, -0.022308349609375, -0.023529052734375, -0.01157379150390625, -0.00615692138671875, 
0.0182037353515625, 0.03533935546875, -0.054718017578125, -0.027984619140625, -0.0255889892578125, -0.02618408203125, 0.0043792724609375, 0.044647216796875, -0.041900634765625, 0.01473236083984375, 0.010406494140625, 0.0197296142578125, 0.06854248046875, -0.031646728515625, -0.0023021697998046875, -0.03857421875, 0.05023193359375, 0.013153076171875, 0.032012939453125, 0.002246856689453125, -0.0014028549194335938, 0.049835205078125, 0.017822265625, -0.037841796875, -0.0833740234375, -0.0049285888671875, -0.0716552734375, 0.010528564453125, 0.060516357421875, -0.0013294219970703125, -0.0179595947265625, 0.0257110595703125, -0.0162353515625, 0.0301361083984375, -0.005680084228515625, 0.0704345703125, 0.03302001953125, -0.0023365020751953125, 0.004230499267578125, -0.0360107421875, 0.0278167724609375, 0.04010009765625, -0.0628662109375, -0.03314208984375, 0.0194244384765625, 0.026519775390625, 0.025054931640625, 0.085205078125, 0.004459381103515625, 0.03875732421875, 0.0142669677734375, -0.003509521484375, -0.006427764892578125, -0.026580810546875, -0.0362548828125, 0.0038051605224609375, 0.01126861572265625, -0.030059814453125 ] ]
sanali209/nsfwfilter
2023-08-29T01:19:05.000Z
[ "transformers", "pytorch", "tensorboard", "vit", "image-classification", "huggingpics", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
image-classification
sanali209
null
null
sanali209/nsfwfilter
3
10,098
transformers
2023-08-21T13:44:01
--- tags: - image-classification - pytorch - huggingpics metrics: - accuracy model-index: - name: sanali209/nsfwfilter results: - task: name: Image Classification type: image-classification metrics: - name: Accuracy type: accuracy value: 0.9273858666419983 --- # sanali209/nsfwfilter Autogenerated by HuggingPics🤗🖼️ Create your own image classifier for **anything** by running [the demo on Google Colab](https://colab.research.google.com/github/nateraw/huggingpics/blob/main/HuggingPics.ipynb). Report any issues with the demo at the [GitHub repo](https://github.com/nateraw/huggingpics). ## Example Images
656
[ [ -0.05615234375, -0.03955078125, 0.029815673828125, 0.0374755859375, -0.045562744140625, -0.01038360595703125, 0.0276641845703125, -0.035125732421875, 0.045684814453125, 0.0272674560546875, -0.038543701171875, -0.04266357421875, -0.046875, 0.01654052734375, -0.0390625, 0.087158203125, 0.0019254684448242188, -0.01502227783203125, -0.0164337158203125, 0.007232666015625, -0.00615692138671875, -0.005229949951171875, -0.0291900634765625, -0.034210205078125, 0.004932403564453125, 0.031341552734375, 0.05242919921875, 0.039031982421875, 0.034271240234375, 0.029754638671875, 0.0118560791015625, -0.014007568359375, -0.01751708984375, 0.004974365234375, -0.00684356689453125, -0.020111083984375, -0.0234222412109375, 0.0168914794921875, 0.0294189453125, -0.0019521713256835938, -0.0117340087890625, 0.03228759765625, -0.03033447265625, 0.041656494140625, -0.0396728515625, 0.00795745849609375, -0.033538818359375, 0.02227783203125, 0.0031585693359375, -0.0022525787353515625, 0.005756378173828125, -0.059051513671875, 0.005802154541015625, -0.03948974609375, 0.02880859375, 0.0236968994140625, 0.09246826171875, 0.024810791015625, -0.0272064208984375, -0.0032825469970703125, 0.00975799560546875, 0.0247039794921875, -0.020538330078125, 0.0161895751953125, 0.042938232421875, 0.026336669921875, -0.0163421630859375, -0.05926513671875, -0.0230255126953125, -0.00971221923828125, -0.00351715087890625, -0.0144500732421875, -0.027801513671875, -0.0016117095947265625, 0.0101165771484375, 0.041961669921875, -0.0567626953125, -0.021942138671875, -0.04022216796875, -0.024444580078125, 0.061309814453125, -0.01812744140625, 0.06195068359375, -0.03460693359375, -0.031036376953125, 0.007801055908203125, -0.0215301513671875, -0.0006318092346191406, 0.028167724609375, -0.027496337890625, -0.01861572265625, 0.027587890625, -0.0009679794311523438, 0.038330078125, 0.045013427734375, 0.0031147003173828125, 0.037384033203125, 0.002410888671875, -0.037078857421875, -0.00135040283203125, 0.074462890625, 
0.04266357421875, 0.01201629638671875, 0.00028395652770996094, 0.0018472671508789062, 0.013458251953125, 0.0008754730224609375, -0.075439453125, -0.06451416015625, -0.0071258544921875, -0.04266357421875, -0.057952880859375, 0.00940704345703125, -0.04412841796875, -0.019500732421875, 0.0009450912475585938, 0.0137481689453125, -0.01444244384765625, -0.027008056640625, 0.00226593017578125, -0.033233642578125, 0.011444091796875, 0.020843505859375, -0.040557861328125, 0.014617919921875, 0.0131072998046875, 0.0455322265625, 0.0159149169921875, -0.01666259765625, -0.03656005859375, -0.015899658203125, -0.0121612548828125, 0.0767822265625, -0.0065460205078125, -0.0506591796875, -0.0139007568359375, 0.0262298583984375, -0.0021343231201171875, -0.035186767578125, 0.042816162109375, -0.035552978515625, 0.0181732177734375, 0.0002872943878173828, -0.0225067138671875, -0.0350341796875, 0.0016736984252929688, -0.056640625, 0.07244873046875, 0.01348114013671875, -0.06475830078125, 0.01947021484375, -0.057342529296875, -0.020904541015625, 0.0309295654296875, -0.0028781890869140625, -0.058990478515625, -0.00531768798828125, -0.00756072998046875, 0.036041259765625, -0.0015735626220703125, 0.041351318359375, -0.0548095703125, -0.0308685302734375, 0.0153045654296875, 0.0168914794921875, 0.074462890625, 0.024139404296875, -0.0270843505859375, 0.02703857421875, -0.0360107421875, 0.006938934326171875, 0.02935791015625, 0.006458282470703125, -0.0275421142578125, -0.03802490234375, 0.0246124267578125, 0.013824462890625, 0.003444671630859375, -0.059539794921875, 0.019012451171875, 0.00203704833984375, 0.0308074951171875, 0.04296875, 0.015838623046875, 0.03143310546875, -0.025390625, 0.057403564453125, -0.0238037109375, 0.0380859375, 0.01214599609375, -0.04888916015625, -0.03485107421875, -0.034912109375, 0.0023441314697265625, 0.00814056396484375, -0.045745849609375, 0.058319091796875, 0.016387939453125, -0.04437255859375, -0.0210723876953125, 0.0247802734375, 0.0298004150390625, 
0.0170745849609375, 0.002010345458984375, -0.050140380859375, -0.051910400390625, -0.057281494140625, 0.0222625732421875, -0.0254364013671875, 0.0179290771484375, 0.0259246826171875, 0.050933837890625, -0.027008056640625, 0.052825927734375, -0.05682373046875, -0.0262908935546875, 0.0304107666015625, 0.0099029541015625, -0.001186370849609375, 0.047760009765625, 0.07958984375, -0.0828857421875, -0.0226898193359375, -0.03167724609375, -0.049957275390625, -0.027557373046875, 0.031280517578125, -0.036468505859375, 0.0014505386352539062, 0.046600341796875, -0.0239410400390625, 0.055389404296875, 0.052947998046875, -0.05316162109375, 0.04376220703125, -0.0114288330078125, 0.02191162109375, -0.0584716796875, 0.00188446044921875, 0.02154541015625, -0.056427001953125, -0.0163116455078125, 0.0211029052734375, 0.01544189453125, -0.015228271484375, -0.05242919921875, 0.045013427734375, -0.03240966796875, 0.00982666015625, -0.034423828125, -0.00004178285598754883, -0.0036907196044921875, 0.01192474365234375, 0.01366424560546875, 0.0197906494140625, 0.05267333984375, -0.040618896484375, 0.053009033203125, 0.04510498046875, -0.01546478271484375, 0.02703857421875, -0.05712890625, 0.005580902099609375, -0.01277923583984375, 0.043670654296875, -0.07257080078125, -0.042205810546875, 0.04339599609375, -0.037017822265625, 0.01125335693359375, -0.030303955078125, -0.033660888671875, -0.04962158203125, -0.042144775390625, 0.032470703125, 0.0499267578125, -0.0693359375, 0.0219879150390625, 0.038360595703125, 0.009674072265625, -0.0238037109375, -0.0361328125, 0.0041046142578125, -0.0173797607421875, -0.03448486328125, 0.01049041748046875, -0.019561767578125, 0.002735137939453125, 0.03375244140625, -0.00012493133544921875, -0.022857666015625, 0.00756072998046875, 0.0589599609375, 0.0391845703125, 0.0079193115234375, -0.004150390625, 0.005218505859375, -0.0169830322265625, 0.0083160400390625, -0.00865936279296875, 0.03826904296875, -0.0557861328125, -0.0229949951171875, -0.05010986328125, 
0.01334381103515625, 0.040191650390625, 0.004642486572265625, 0.00603485107421875, 0.07568359375, -0.038604736328125, -0.0244293212890625, -0.041595458984375, -0.0024089813232421875, -0.03857421875, -0.00878143310546875, -0.0157012939453125, -0.06329345703125, 0.0211029052734375, 0.00933837890625, -0.03076171875, 0.036346435546875, 0.02288818359375, -0.0137939453125, 0.06402587890625, 0.061431884765625, -0.0078887939453125, 0.032440185546875, -0.026275634765625, -0.01149749755859375, -0.064697265625, -0.01123046875, -0.027862548828125, -0.01666259765625, -0.06732177734375, -0.01715087890625, 0.01690673828125, -0.0023746490478515625, -0.0229644775390625, 0.05303955078125, -0.0694580078125, 0.057861328125, 0.0673828125, 0.019012451171875, 0.008453369140625, -0.019195556640625, 0.00415802001953125, 0.01519012451171875, -0.0306243896484375, -0.0177154541015625, 0.037078857421875, 0.0276336669921875, 0.08197021484375, -0.005672454833984375, 0.07476806640625, -0.0008592605590820312, 0.05194091796875, -0.05169677734375, 0.053680419921875, -0.0251922607421875, -0.075439453125, 0.0062408447265625, -0.005832672119140625, -0.049896240234375, 0.0015964508056640625, -0.01056671142578125, -0.049560546875, 0.035430908203125, 0.0218505859375, 0.002552032470703125, 0.0226898193359375, -0.051055908203125, 0.054656982421875, -0.0164642333984375, -0.00750732421875, 0.017822265625, -0.046295166015625, 0.056488037109375, 0.006427764892578125, -0.0013818740844726562, -0.039459228515625, 0.00235748291015625, 0.0345458984375, -0.0212249755859375, 0.077392578125, -0.032501220703125, 0.00969696044921875, 0.01457977294921875, 0.00980377197265625, -0.0173187255859375, 0.0186004638671875, 0.0201263427734375, 0.01763916015625, 0.0217132568359375, -0.01354217529296875, -0.01445770263671875, 0.06317138671875, -0.040313720703125, -0.023101806640625, -0.057281494140625, 0.0010223388671875, 0.0185394287109375, 0.0145111083984375, 0.042083740234375, -0.009429931640625, -0.01910400390625, 
0.01047515869140625, 0.04547119140625, -0.00787353515625, 0.03997802734375, 0.0165863037109375, -0.05609130859375, -0.057952880859375, 0.04400634765625, -0.0236358642578125, -0.01953125, -0.00659942626953125, 0.042022705078125, -0.044036865234375, -0.0223236083984375, -0.047393798828125, 0.019561767578125, -0.045806884765625, -0.0367431640625, -0.03021240234375, -0.01369476318359375, -0.03863525390625, -0.026153564453125, 0.0008707046508789062, -0.00760650634765625, -0.034881591796875, -0.0059661865234375, 0.056976318359375, 0.02227783203125, -0.022674560546875, 0.01537322998046875, -0.062469482421875, 0.05548095703125, 0.033050537109375, 0.04241943359375, -0.0237579345703125, -0.0262908935546875, 0.026824951171875, -0.031951904296875, -0.022186279296875, -0.047576904296875, 0.042999267578125, 0.037811279296875, 0.0311431884765625, 0.04620361328125, 0.0212249755859375, 0.049713134765625, -0.006137847900390625, 0.038055419921875, 0.046356201171875, -0.07806396484375, 0.06878662109375, -0.0229949951171875, 0.00689697265625, 0.051116943359375, 0.03631591796875, -0.019500732421875, -0.026580810546875, -0.06732177734375, -0.045166015625, 0.03802490234375, 0.0149383544921875, 0.002033233642578125, 0.006256103515625, 0.03662109375, 0.0007195472717285156, 0.017120361328125, -0.0654296875, -0.0070648193359375, -0.0181884765625, -0.01910400390625, 0.0198974609375, -0.00719451904296875, -0.022247314453125, -0.027496337890625, 0.06719970703125, -0.0169525146484375, 0.003662109375, 0.025421142578125, -0.0213623046875, -0.044921875, -0.016357421875, 0.006397247314453125, 0.007781982421875, -0.021209716796875, 0.00011777877807617188, -0.0233306884765625, -0.037017822265625, 0.026214599609375, 0.025787353515625, -0.03424072265625, 0.0152740478515625, -0.0037212371826171875, 0.0548095703125, 0.0194244384765625, -0.024444580078125, 0.045013427734375, -0.04376220703125, -0.0259857177734375, -0.04144287109375, 0.04119873046875, 0.006137847900390625, 0.01111602783203125, 
0.0005469322204589844, 0.01004791259765625, 0.0400390625, -0.04180908203125, 0.03350830078125, 0.0196075439453125, -0.01099395751953125, -0.005962371826171875, 0.06878662109375, 0.0179901123046875, -0.022216796875, 0.02813720703125, -0.055755615234375, -0.05316162109375, 0.058197021484375, 0.03662109375, 0.0850830078125, -0.03216552734375, 0.04248046875, 0.037353515625, 0.00969696044921875, 0.0024127960205078125, 0.034271240234375, -0.04083251953125, -0.04736328125, -0.0016117095947265625, -0.06390380859375, -0.038848876953125, 0.0158843994140625, -0.046783447265625, 0.034759521484375, -0.047882080078125, -0.019561767578125, 0.0284271240234375, 0.00604248046875, -0.042083740234375, 0.042449951171875, 0.018402099609375, 0.06048583984375, -0.07598876953125, 0.046142578125, 0.07537841796875, -0.032989501953125, -0.041015625, -0.00640106201171875, 0.0208282470703125, -0.06268310546875, 0.04266357421875, 0.026947021484375, -0.020843505859375, -0.0109100341796875, -0.065673828125, -0.04351806640625, 0.04864501953125, 0.01027679443359375, -0.01483917236328125, 0.0037097930908203125, -0.0567626953125, 0.0268402099609375, -0.0269317626953125, 0.00598907470703125, 0.022308349609375, 0.040313720703125, 0.01448822021484375, -0.07025146484375, 0.007419586181640625, -0.05950927734375, 0.02288818359375, 0.0161590576171875, -0.049468994140625, 0.01450347900390625, -0.034027099609375, -0.0087432861328125, 0.0276336669921875, 0.0301666259765625, 0.020050048828125, 0.0528564453125, 0.06494140625, 0.050537109375, 0.024322509765625, -0.0272216796875, 0.04052734375, 0.0021419525146484375, 0.044647216796875, 0.10455322265625, -0.005523681640625, 0.02294921875, 0.008270263671875, -0.003368377685546875, 0.04425048828125, 0.08111572265625, -0.06903076171875, 0.02783203125, 0.0267333984375, -0.0248565673828125, -0.005588531494140625, -0.004734039306640625, -0.0273284912109375, 0.04339599609375, 0.00841522216796875, -0.04620361328125, -0.01995849609375, 0.02740478515625, 
-0.0011339187622070312, 0.0007262229919433594, -0.00843048095703125, 0.06402587890625, -0.0013360977172851562, -0.0338134765625, 0.05511474609375, -0.01448822021484375, 0.06829833984375, -0.031280517578125, 0.00009113550186157227, 0.0194244384765625, -0.00565338134765625, -0.047576904296875, -0.06561279296875, 0.0272674560546875, 0.006175994873046875, 0.01149749755859375, -0.016510009765625, 0.08428955078125, -0.0347900390625, -0.0207366943359375, 0.032073974609375, 0.0010986328125, 0.0301513671875, -0.0022563934326171875, -0.06561279296875, 0.01178741455078125, -0.0008602142333984375, -0.015777587890625, 0.010894775390625, 0.00188446044921875, 0.01025390625, 0.055389404296875, 0.0570068359375, 0.0025997161865234375, -0.00864410400390625, 0.0049285888671875, 0.056976318359375, -0.0394287109375, -0.042022705078125, -0.0294952392578125, 0.020050048828125, -0.021881103515625, -0.018524169921875, 0.04351806640625, 0.037841796875, 0.06854248046875, -0.03729248046875, 0.053619384765625, -0.038360595703125, -0.00787353515625, 0.004009246826171875, 0.0640869140625, -0.03900146484375, -0.041046142578125, -0.03204345703125, -0.037017822265625, -0.0294189453125, 0.06640625, 0.0083160400390625, -0.005672454833984375, 0.043670654296875, 0.0494384765625, -0.005626678466796875, 0.0054779052734375, 0.01488494873046875, -0.0103607177734375, 0.01180267333984375, 0.0185546875, 0.044708251953125, -0.041839599609375, 0.015167236328125, -0.07012939453125, -0.037567138671875, -0.026947021484375, -0.09197998046875, -0.06072998046875, -0.05609130859375, -0.06805419921875, -0.036468505859375, -0.0237579345703125, 0.061492919921875, 0.09503173828125, -0.06201171875, -0.0004954338073730469, -0.01043701171875, 0.0159759521484375, 0.0204620361328125, -0.0209197998046875, 0.0284576416015625, 0.01568603515625, -0.072998046875, -0.006259918212890625, 0.00937652587890625, 0.042266845703125, 0.001178741455078125, 0.023193359375, -0.00737762451171875, -0.031890869140625, 0.036712646484375, 
0.047515869140625, -0.040802001953125, -0.0202484130859375, -0.0290069580078125, 0.01171112060546875, -0.005191802978515625, 0.02532958984375, -0.029754638671875, 0.0313720703125, 0.047760009765625, 0.00917816162109375, 0.03485107421875, 0.007354736328125, 0.00489044189453125, -0.0343017578125, 0.0106658935546875, 0.01175689697265625, 0.049163818359375, 0.0309295654296875, -0.045562744140625, 0.0526123046875, 0.03973388671875, -0.05584716796875, -0.06292724609375, 0.008209228515625, -0.0814208984375, -0.002719879150390625, 0.06585693359375, -0.0084381103515625, -0.034393310546875, 0.003421783447265625, -0.02593994140625, 0.020660400390625, -0.0382080078125, 0.0701904296875, 0.02801513671875, -0.0298004150390625, -0.016845703125, -0.004642486572265625, 0.03472900390625, -0.02362060546875, -0.07110595703125, -0.03387451171875, 0.04864501953125, 0.060699462890625, 0.032562255859375, 0.04718017578125, -0.033203125, 0.050079345703125, -0.01450347900390625, 0.036590576171875, -0.0166015625, -0.005306243896484375, -0.037078857421875, 0.003154754638671875, -0.02264404296875, -0.046875 ] ]
facebook/xglm-564M
2023-01-24T16:35:45.000Z
[ "transformers", "pytorch", "tf", "jax", "xglm", "text-generation", "multilingual", "en", "ru", "zh", "de", "es", "fr", "ja", "it", "pt", "el", "ko", "fi", "id", "tr", "ar", "vi", "th", "bg", "ca", "hi", "et", "bn", "ta", "ur", "sw", "te", "eu", "my", "ht", "qu", "arxiv:2112.10668", "license:mit", "has_space", "region:us" ]
text-generation
facebook
null
null
facebook/xglm-564M
29
10,097
transformers
2022-03-02T23:29:05
---
language:
- multilingual
- en
- ru
- zh
- de
- es
- fr
- ja
- it
- pt
- el
- ko
- fi
- id
- tr
- ar
- vi
- th
- bg
- ca
- hi
- et
- bn
- ta
- ur
- sw
- te
- eu
- my
- ht
- qu
license: mit
thumbnail: https://huggingface.co/front/thumbnails/facebook.png
inference: false
---

# XGLM-564M

XGLM-564M is a multilingual autoregressive language model (with 564 million parameters) trained on a balanced corpus of a diverse set of 30 languages totaling 500 billion sub-tokens. It was introduced in the paper [Few-shot Learning with Multilingual Language Models](https://arxiv.org/abs/2112.10668) by Xi Victoria Lin\*, Todor Mihaylov, Mikel Artetxe, Tianlu Wang, Shuohui Chen, Daniel Simig, Myle Ott, Naman Goyal, Shruti Bhosale, Jingfei Du, Ramakanth Pasunuru, Sam Shleifer, Punit Singh Koura, Vishrav Chaudhary, Brian O'Horo, Jeff Wang, Luke Zettlemoyer, Zornitsa Kozareva, Mona Diab, Veselin Stoyanov, Xian Li\* (\*Equal Contribution). The original implementation was released in [this repository](https://github.com/pytorch/fairseq/tree/main/examples/xglm).

## Training Data Statistics

The training data statistics of XGLM-564M are shown in the table below.
| ISO-639-1 | family | name | # tokens | ratio | ratio w/ lowRes upsampling |
|:----------|:-----------------|:------------------------|-------------:|------------:|---------------------------:|
| en | Indo-European | English | 803526736124 | 0.489906 | 0.3259 |
| ru | Indo-European | Russian | 147791898098 | 0.0901079 | 0.0602 |
| zh | Sino-Tibetan | Chinese | 132770494630 | 0.0809494 | 0.0483 |
| de | Indo-European | German | 89223707856 | 0.0543992 | 0.0363 |
| es | Indo-European | Spanish | 87303083105 | 0.0532282 | 0.0353 |
| fr | Indo-European | French | 77419639775 | 0.0472023 | 0.0313 |
| ja | Japonic | Japanese | 66054364513 | 0.040273 | 0.0269 |
| it | Indo-European | Italian | 41930465338 | 0.0255648 | 0.0171 |
| pt | Indo-European | Portuguese | 36586032444 | 0.0223063 | 0.0297 |
| el | Indo-European | Greek (modern) | 28762166159 | 0.0175361 | 0.0233 |
| ko | Koreanic | Korean | 20002244535 | 0.0121953 | 0.0811 |
| fi | Uralic | Finnish | 16804309722 | 0.0102455 | 0.0681 |
| id | Austronesian | Indonesian | 15423541953 | 0.00940365 | 0.0125 |
| tr | Turkic | Turkish | 12413166065 | 0.00756824 | 0.0101 |
| ar | Afro-Asiatic | Arabic | 12248607345 | 0.00746791 | 0.0099 |
| vi | Austroasiatic | Vietnamese | 11199121869 | 0.00682804 | 0.0091 |
| th | Tai–Kadai | Thai | 10842172807 | 0.00661041 | 0.044 |
| bg | Indo-European | Bulgarian | 9703797869 | 0.00591635 | 0.0393 |
| ca | Indo-European | Catalan | 7075834775 | 0.0043141 | 0.0287 |
| hi | Indo-European | Hindi | 3448390110 | 0.00210246 | 0.014 |
| et | Uralic | Estonian | 3286873851 | 0.00200399 | 0.0133 |
| bn | Indo-European | Bengali, Bangla | 1627447450 | 0.000992245 | 0.0066 |
| ta | Dravidian | Tamil | 1476973397 | 0.000900502 | 0.006 |
| ur | Indo-European | Urdu | 1351891969 | 0.000824241 | 0.0055 |
| sw | Niger–Congo | Swahili | 907516139 | 0.000553307 | 0.0037 |
| te | Dravidian | Telugu | 689316485 | 0.000420272 | 0.0028 |
| eu | Language isolate | Basque | 105304423 | 6.42035e-05 | 0.0043 |
| my | Sino-Tibetan | Burmese | 101358331 | 6.17976e-05 | 0.003 |
| ht | Creole | Haitian, Haitian Creole | 86584697 | 5.27902e-05 | 0.0035 |
| qu | Quechuan | Quechua | 3236108 | 1.97304e-06 | 0.0001 |

## Model card

For intended usage of the model, please refer to the [model card](https://github.com/pytorch/fairseq/blob/main/examples/xglm/model_card.md) released by the XGLM-564M development team.

## Example (COPA)

The following snippet shows how to evaluate our models (GPT-3 style, zero-shot) on the Choice of Plausible Alternatives (COPA) task, using examples in English, Chinese, and Haitian Creole.

```python
import torch
import torch.nn.functional as F
from transformers import XGLMTokenizer, XGLMForCausalLM

tokenizer = XGLMTokenizer.from_pretrained("facebook/xglm-564M")
model = XGLMForCausalLM.from_pretrained("facebook/xglm-564M")

data_samples = {
    'en': [
        {
            "premise": "I wanted to conserve energy.",
            "choice1": "I swept the floor in the unoccupied room.",
            "choice2": "I shut off the light in the unoccupied room.",
            "question": "effect",
            "label": "1"
        },
        {
            "premise": "The flame on the candle went out.",
            "choice1": "I blew on the wick.",
            "choice2": "I put a match to the wick.",
            "question": "cause",
            "label": "0"
        }
    ],
    'zh': [
        {
            "premise": "我想节约能源。",
            "choice1": "我在空着的房间里扫了地板。",
            "choice2": "我把空房间里的灯关了。",
            "question": "effect",
            "label": "1"
        },
        {
            "premise": "蜡烛上的火焰熄灭了。",
            "choice1": "我吹灭了灯芯。",
            "choice2": "我把一根火柴放在灯芯上。",
            "question": "cause",
            "label": "0"
        }
    ],
    'ht': [
        {
            "premise": "M te vle konsève enèji.",
            "choice1": "Mwen te fin baleye chanm lib la.",
            "choice2": "Mwen te femen limyè nan chanm lib la.",
            "question": "effect",
            "label": "1"
        },
        {
            "premise": "Flam bouji a te etenn.",
            "choice1": "Mwen te soufle bouji a.",
            "choice2": "Mwen te limen mèch bouji a.",
            "question": "cause",
            "label": "0"
        }
    ]
}

def get_logprobs(prompt):
    inputs = tokenizer(prompt, return_tensors="pt")
    input_ids, output_ids = inputs["input_ids"], inputs["input_ids"][:, 1:]
    outputs = model(**inputs, labels=input_ids)
    logits = outputs.logits
    # Gather the log-probability the model assigns to each actual next token.
    logprobs = torch.gather(F.log_softmax(logits, dim=2), 2, output_ids.unsqueeze(2))
    return logprobs

# Zero-shot evaluation for the Choice of Plausible Alternatives (COPA) task.
# A return value of 0 indicates that the first alternative is more plausible,
# while 1 indicates that the second alternative is more plausible.
def COPA_eval(prompt, alternative1, alternative2):
    lprob1 = get_logprobs(prompt + "\n" + alternative1).sum()
    lprob2 = get_logprobs(prompt + "\n" + alternative2).sum()
    return 0 if lprob1 > lprob2 else 1

for lang in data_samples:
    for idx, example in enumerate(data_samples[lang]):
        predict = COPA_eval(example["premise"], example["choice1"], example["choice2"])
        print(f'{lang}-{idx}', predict, example['label'])
# en-0 1 1
# en-1 0 0
# zh-0 1 1
# zh-1 0 0
# ht-0 1 1
# ht-1 0 0
```
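The "ratio w/ lowRes upsampling" column in the statistics table above reflects a rebalancing that boosts low-resource languages at the expense of high-resource ones. As a rough illustration only (the exponent and normalization below are assumptions for demonstration, not the published XGLM sampling recipe), such rebalanced ratios can be derived by exponentiating and renormalizing the raw token ratios:

```python
# Illustrative sketch of low-resource upsampling: rescale raw language
# ratios p_i to q_i = p_i**alpha / sum_j(p_j**alpha). A smaller alpha
# flattens the distribution, increasing the share of low-resource
# languages. The alpha value is an assumption for illustration.

token_counts = {  # a subset of the rows in the table above
    "en": 803526736124,
    "ru": 147791898098,
    "sw": 907516139,
    "qu": 3236108,
}

def upsampled_ratios(counts, alpha=0.5):
    total = sum(counts.values())
    raw = {lang: n / total for lang, n in counts.items()}
    weights = {lang: p ** alpha for lang, p in raw.items()}
    norm = sum(weights.values())
    return raw, {lang: w / norm for lang, w in weights.items()}

raw, upsampled = upsampled_ratios(token_counts)
# Low-resource languages gain share; high-resource languages lose share.
assert upsampled["qu"] > raw["qu"]
assert upsampled["en"] < raw["en"]
```

Varying `alpha` trades off between matching the natural data distribution (`alpha=1`) and a uniform split across languages (`alpha=0`).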
7,588
[ [ -0.03314208984375, -0.04046630859375, 0.024688720703125, -0.0006570816040039062, -0.0146331787109375, 0.00099945068359375, -0.01129150390625, -0.024078369140625, 0.0179595947265625, 0.01224517822265625, -0.03997802734375, -0.058624267578125, -0.03662109375, 0.00260162353515625, -0.0167236328125, 0.07562255859375, -0.007160186767578125, 0.0112152099609375, 0.0196533203125, -0.012786865234375, -0.0113525390625, -0.027496337890625, -0.05828857421875, -0.02105712890625, 0.029541015625, 0.01207733154296875, 0.05364990234375, 0.0286407470703125, 0.0238037109375, 0.024261474609375, -0.011444091796875, 0.00482940673828125, -0.0267333984375, -0.036651611328125, 0.005016326904296875, -0.054779052734375, -0.036773681640625, 0.0108642578125, 0.053680419921875, 0.032684326171875, -0.00006073713302612305, 0.0267486572265625, 0.0041961669921875, 0.036895751953125, -0.0264434814453125, 0.0275115966796875, -0.03045654296875, 0.01200103759765625, -0.006832122802734375, -0.01154327392578125, -0.0165252685546875, -0.0259857177734375, 0.004718780517578125, -0.0313720703125, 0.0191650390625, 0.01158905029296875, 0.0865478515625, -0.01029205322265625, -0.0214996337890625, -0.0181121826171875, -0.030792236328125, 0.060333251953125, -0.06170654296875, 0.01273345947265625, 0.0237274169921875, 0.01001739501953125, -0.01532745361328125, -0.054901123046875, -0.045623779296875, -0.0094757080078125, -0.033447265625, 0.03240966796875, -0.0199737548828125, -0.001590728759765625, 0.030792236328125, 0.032684326171875, -0.06298828125, -0.00952911376953125, -0.045623779296875, -0.007381439208984375, 0.057647705078125, 0.023529052734375, 0.0306243896484375, -0.031280517578125, -0.033050537109375, -0.00978851318359375, -0.036376953125, 0.02069091796875, 0.0290374755859375, 0.0150146484375, -0.026641845703125, 0.039093017578125, -0.0164947509765625, 0.06634521484375, -0.00868988037109375, -0.0181884765625, 0.033294677734375, -0.034332275390625, -0.0244293212890625, -0.0030307769775390625, 
0.06854248046875, 0.029998779296875, -0.00858306884765625, 0.004482269287109375, -0.0006275177001953125, 0.005802154541015625, -0.024627685546875, -0.0528564453125, -0.0222015380859375, 0.0311279296875, -0.037353515625, -0.01715087890625, 0.00862884521484375, -0.06109619140625, 0.007129669189453125, 0.00017845630645751953, 0.049285888671875, -0.031768798828125, -0.020233154296875, 0.0155487060546875, -0.00360870361328125, 0.030059814453125, 0.0163116455078125, -0.0657958984375, 0.0135498046875, 0.0264129638671875, 0.069580078125, -0.007350921630859375, -0.0230255126953125, -0.01544952392578125, -0.0005006790161132812, -0.026611328125, 0.039154052734375, -0.0194091796875, -0.03057861328125, -0.0155029296875, 0.01259613037109375, -0.0301361083984375, -0.0228424072265625, 0.054840087890625, -0.013336181640625, 0.037628173828125, -0.0189208984375, -0.034332275390625, -0.031524658203125, 0.034515380859375, -0.0517578125, 0.0885009765625, 0.01434326171875, -0.07183837890625, 0.02667236328125, -0.040863037109375, 0.01043701171875, -0.00621795654296875, -0.012237548828125, -0.037200927734375, -0.00667572021484375, 0.011962890625, 0.0292205810546875, -0.031219482421875, 0.01180267333984375, -0.01139068603515625, -0.02044677734375, 0.006343841552734375, -0.02984619140625, 0.09844970703125, 0.0267486572265625, -0.0384521484375, 0.0236053466796875, -0.07025146484375, 0.0218963623046875, 0.02557373046875, -0.03179931640625, 0.0023097991943359375, -0.023223876953125, 0.0075531005859375, 0.04949951171875, 0.0158843994140625, -0.036895751953125, 0.018341064453125, -0.038116455078125, 0.0335693359375, 0.05218505859375, -0.00774383544921875, 0.01003265380859375, -0.04461669921875, 0.044036865234375, -0.00223541259765625, 0.00917816162109375, -0.007152557373046875, -0.0438232421875, -0.057586669921875, -0.026641845703125, 0.00865936279296875, 0.046051025390625, -0.044921875, 0.035858154296875, -0.007598876953125, -0.0511474609375, -0.0401611328125, 0.00307464599609375, 
0.031036376953125, 0.04669189453125, 0.01611328125, 0.0017900466918945312, -0.04486083984375, -0.0552978515625, -0.007297515869140625, -0.0213775634765625, 0.016632080078125, 0.0251922607421875, 0.0606689453125, -0.0080108642578125, 0.0665283203125, -0.05230712890625, -0.0199127197265625, -0.0285186767578125, 0.00081634521484375, 0.049072265625, 0.038055419921875, 0.04913330078125, -0.05670166015625, -0.057769775390625, 0.004802703857421875, -0.0555419921875, 0.0070953369140625, -0.001323699951171875, -0.0183563232421875, 0.0243377685546875, 0.01149749755859375, -0.055999755859375, 0.038665771484375, 0.040130615234375, -0.036407470703125, 0.056854248046875, -0.026702880859375, 0.01345062255859375, -0.0897216796875, 0.0181121826171875, -0.01428985595703125, -0.006855010986328125, -0.046142578125, 0.005603790283203125, -0.00513458251953125, -0.0014448165893554688, -0.057708740234375, 0.07574462890625, -0.045196533203125, 0.01523590087890625, 0.0020542144775390625, 0.01203155517578125, -0.003932952880859375, 0.0443115234375, 0.01326751708984375, 0.036834716796875, 0.065185546875, -0.02642822265625, 0.0304718017578125, 0.0117645263671875, -0.041015625, 0.017913818359375, -0.045013427734375, -0.0037708282470703125, -0.01490020751953125, 0.0158233642578125, -0.09710693359375, -0.01800537109375, 0.039306640625, -0.036468505859375, 0.01372528076171875, -0.0072174072265625, -0.02947998046875, -0.0535888671875, -0.0338134765625, 0.0262451171875, 0.03387451171875, -0.0390625, 0.039093017578125, 0.01242828369140625, 0.0095062255859375, -0.049407958984375, -0.051239013671875, -0.00823974609375, -0.0286407470703125, -0.06573486328125, 0.024993896484375, -0.011077880859375, -0.01021575927734375, 0.007476806640625, -0.0026721954345703125, 0.005756378173828125, -0.003147125244140625, 0.00357818603515625, 0.033905029296875, -0.004703521728515625, -0.01421356201171875, -0.0154876708984375, -0.0228424072265625, 0.002353668212890625, -0.0180816650390625, 0.0498046875, 
-0.0257415771484375, -0.0183868408203125, -0.0253753662109375, 0.021331787109375, 0.0302886962890625, -0.0166473388671875, 0.05438232421875, 0.06597900390625, -0.03460693359375, 0.004711151123046875, -0.0241241455078125, 0.002254486083984375, -0.033203125, 0.035064697265625, -0.037017822265625, -0.05511474609375, 0.06256103515625, 0.01488494873046875, 0.0169830322265625, 0.0528564453125, 0.0635986328125, 0.00870513916015625, 0.0887451171875, 0.0224609375, -0.004467010498046875, 0.0248870849609375, -0.05999755859375, 0.00876617431640625, -0.044921875, -0.0275421142578125, -0.029876708984375, -0.0140838623046875, -0.047210693359375, -0.03314208984375, 0.0295257568359375, -0.0020599365234375, -0.0189056396484375, 0.023834228515625, -0.037078857421875, 0.0183258056640625, 0.04998779296875, 0.01309967041015625, 0.0155792236328125, -0.007129669189453125, -0.02227783203125, 0.00841522216796875, -0.06866455078125, -0.039459228515625, 0.0823974609375, 0.01435089111328125, 0.05865478515625, 0.01248931884765625, 0.0665283203125, -0.0017576217651367188, 0.0156097412109375, -0.05010986328125, 0.042877197265625, 0.004131317138671875, -0.0782470703125, -0.02032470703125, -0.030303955078125, -0.0841064453125, 0.04754638671875, -0.019439697265625, -0.0806884765625, 0.0194854736328125, 0.01337432861328125, -0.0478515625, 0.03515625, -0.050537109375, 0.06927490234375, -0.032257080078125, -0.04742431640625, -0.0082244873046875, -0.049468994140625, 0.039154052734375, 0.00530242919921875, 0.01439666748046875, -0.0200653076171875, 0.01047515869140625, 0.06903076171875, -0.031646728515625, 0.04705810546875, 0.0008139610290527344, 0.0007252693176269531, 0.04296875, -0.01337432861328125, 0.034881591796875, 0.0163421630859375, -0.0162353515625, 0.01013946533203125, 0.0160369873046875, -0.03997802734375, -0.0301666259765625, 0.060699462890625, -0.06182861328125, -0.059661865234375, -0.055755615234375, -0.043182373046875, 0.01013946533203125, 0.0406494140625, 0.047210693359375, 
0.02630615234375, -0.008056640625, -0.00016510486602783203, 0.01806640625, -0.0178070068359375, 0.0484619140625, 0.024261474609375, -0.035369873046875, -0.048492431640625, 0.0621337890625, 0.0272674560546875, 0.0218963623046875, 0.01366424560546875, 0.022430419921875, -0.03802490234375, -0.025115966796875, -0.02545166015625, 0.025726318359375, -0.04754638671875, -0.0080108642578125, -0.04241943359375, -0.01171112060546875, -0.030975341796875, -0.0167388916015625, -0.0140228271484375, -0.0248260498046875, -0.023101806640625, -0.01806640625, 0.0208740234375, 0.031463623046875, -0.0082244873046875, 0.0128631591796875, -0.046356201171875, 0.023468017578125, 0.00872039794921875, 0.0090179443359375, -0.0015659332275390625, -0.046844482421875, -0.0223388671875, 0.004566192626953125, -0.0148773193359375, -0.07122802734375, 0.05572509765625, 0.0014505386352539062, 0.047698974609375, 0.03228759765625, -0.002750396728515625, 0.0565185546875, -0.0206451416015625, 0.05450439453125, 0.0263519287109375, -0.0689697265625, 0.0323486328125, -0.018035888671875, 0.01806640625, 0.0292510986328125, 0.04217529296875, -0.043182373046875, -0.01898193359375, -0.061553955078125, -0.06317138671875, 0.08367919921875, 0.024932861328125, 0.006591796875, -0.006031036376953125, 0.01221466064453125, -0.0008006095886230469, 0.00833892822265625, -0.071533203125, -0.052001953125, -0.0008478164672851562, -0.0130462646484375, -0.00388336181640625, -0.0149688720703125, -0.0216827392578125, -0.046722412109375, 0.06353759765625, -0.00016689300537109375, 0.0233612060546875, 0.0175933837890625, 0.006473541259765625, 0.0017137527465820312, 0.0098419189453125, 0.04730224609375, 0.051666259765625, -0.030792236328125, 0.01000213623046875, 0.0289154052734375, -0.0230560302734375, 0.0210418701171875, 0.007366180419921875, -0.0259857177734375, 0.01149749755859375, 0.0206756591796875, 0.07427978515625, 0.0009427070617675781, -0.03271484375, 0.024871826171875, 0.001018524169921875, -0.0267486572265625, 
-0.036590576171875, 0.00850677490234375, 0.00567626953125, 0.0149688720703125, 0.02667236328125, 0.0006213188171386719, -0.001384735107421875, -0.053741455078125, 0.01678466796875, 0.03887939453125, -0.024139404296875, -0.0273590087890625, 0.06256103515625, -0.0062713623046875, -0.0081787109375, 0.0301666259765625, -0.02008056640625, -0.040679931640625, 0.052001953125, 0.045196533203125, 0.05975341796875, -0.016510009765625, 0.01279449462890625, 0.06109619140625, 0.0229339599609375, -0.01165771484375, 0.040557861328125, 0.0186004638671875, -0.04669189453125, -0.0043182373046875, -0.041778564453125, -0.01010894775390625, 0.0168914794921875, -0.04412841796875, 0.0423583984375, -0.0245513916015625, -0.0259857177734375, -0.01143646240234375, 0.0188140869140625, -0.047027587890625, 0.0243988037109375, 0.016326904296875, 0.07073974609375, -0.07330322265625, 0.07183837890625, 0.0447998046875, -0.0406494140625, -0.0684814453125, -0.022491455078125, 0.0002651214599609375, -0.060394287109375, 0.039794921875, 0.006900787353515625, 0.00518035888671875, 0.0134429931640625, -0.027557373046875, -0.09344482421875, 0.091796875, 0.0287017822265625, -0.0269012451171875, -0.004405975341796875, 0.0063323974609375, 0.036956787109375, -0.01142120361328125, 0.04241943359375, 0.0521240234375, 0.04425048828125, 0.008148193359375, -0.059783935546875, 0.01297760009765625, -0.0360107421875, -0.0167999267578125, 0.0082244873046875, -0.07391357421875, 0.0880126953125, -0.0276947021484375, -0.00927734375, 0.0024700164794921875, 0.0469970703125, 0.0288848876953125, 0.0127410888671875, 0.0250091552734375, 0.047210693359375, 0.0618896484375, -0.0215911865234375, 0.07183837890625, -0.042755126953125, 0.0662841796875, 0.06494140625, 0.0028934478759765625, 0.0452880859375, 0.0230560302734375, -0.03497314453125, 0.019683837890625, 0.057891845703125, -0.0134124755859375, 0.0243682861328125, 0.0153350830078125, -0.030914306640625, -0.01381683349609375, 0.0157318115234375, -0.030670166015625, 
0.039703369140625, 0.01363372802734375, -0.0305328369140625, -0.0258941650390625, 0.0026569366455078125, 0.023101806640625, -0.028839111328125, -0.0283355712890625, 0.02838134765625, 0.00679779052734375, -0.04998779296875, 0.071533203125, 0.006877899169921875, 0.043426513671875, -0.0560302734375, -0.0008382797241210938, -0.00411224365234375, 0.0171966552734375, -0.0400390625, -0.06658935546875, 0.01381683349609375, -0.0085601806640625, -0.008087158203125, 0.0080413818359375, 0.0355224609375, -0.0406494140625, -0.06292724609375, 0.0275726318359375, 0.0271453857421875, 0.02081298828125, 0.02081298828125, -0.07666015625, -0.001300811767578125, 0.01983642578125, -0.036895751953125, 0.0175323486328125, 0.01232147216796875, -0.00916290283203125, 0.044189453125, 0.035369873046875, 0.004940032958984375, 0.0230255126953125, -0.018707275390625, 0.05999755859375, -0.0504150390625, -0.0281982421875, -0.0771484375, 0.0426025390625, 0.01288604736328125, -0.038482666015625, 0.07672119140625, 0.06378173828125, 0.06005859375, -0.00963592529296875, 0.0660400390625, -0.022247314453125, 0.01177978515625, -0.033966064453125, 0.05145263671875, -0.052001953125, 0.0002346038818359375, -0.0273895263671875, -0.058135986328125, -0.037506103515625, 0.05615234375, -0.03790283203125, 0.015899658203125, 0.0421142578125, 0.06610107421875, 0.00946044921875, -0.0201263427734375, 0.0296173095703125, 0.01503753662109375, 0.0252838134765625, 0.061248779296875, 0.04376220703125, -0.039154052734375, 0.048370361328125, -0.04095458984375, -0.013946533203125, -0.0193939208984375, -0.0478515625, -0.060791015625, -0.040924072265625, -0.02471923828125, -0.03173828125, -0.0134735107421875, 0.08642578125, 0.041839599609375, -0.0640869140625, -0.0272979736328125, 0.004352569580078125, 0.01413726806640625, -0.0230865478515625, -0.0183258056640625, 0.06475830078125, -0.01123046875, -0.08123779296875, 0.0015325546264648438, 0.0118865966796875, 0.007251739501953125, -0.00739288330078125, -0.00836944580078125, 
-0.03192138671875, 0.0097198486328125, 0.0428466796875, 0.0198211669921875, -0.05523681640625, -0.017059326171875, -0.004978179931640625, -0.0132904052734375, 0.0224761962890625, 0.01763916015625, -0.0377197265625, 0.045989990234375, 0.051605224609375, 0.0023441314697265625, 0.05548095703125, -0.004283905029296875, 0.0169525146484375, -0.0267181396484375, 0.018707275390625, 0.006610870361328125, 0.05023193359375, -0.0015420913696289062, -0.02008056640625, 0.044952392578125, 0.025177001953125, -0.03192138671875, -0.06292724609375, -0.005992889404296875, -0.08233642578125, -0.0135498046875, 0.08782958984375, -0.0258331298828125, -0.029998779296875, -0.00424957275390625, -0.04425048828125, 0.0203857421875, -0.0180511474609375, 0.0509033203125, 0.052398681640625, -0.031524658203125, -0.004360198974609375, -0.048919677734375, 0.035400390625, 0.03314208984375, -0.056304931640625, -0.005153656005859375, 0.01152801513671875, 0.0290069580078125, 0.0179290771484375, 0.062255859375, -0.00922393798828125, 0.026641845703125, 0.00814056396484375, 0.0177459716796875, 0.00653076171875, -0.0007867813110351562, -0.0216064453125, 0.012664794921875, -0.00033354759216308594, -0.02099609375 ] ]
FlagAlpha/Llama2-Chinese-7b-Chat
2023-07-23T10:56:00.000Z
[ "transformers", "pytorch", "llama", "text-generation", "question-answering", "zh", "en", "license:apache-2.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
question-answering
FlagAlpha
null
null
FlagAlpha/Llama2-Chinese-7b-Chat
117
10,088
transformers
2023-07-23T10:12:21
---
developers: [https://huggingface.co/FlagAlphaAI]
license: apache-2.0
language:
- zh
- en
pipeline_tag: question-answering
library_name: transformers
---

# Llama2 Chinese Community

---

## Chinese Fine-tuned Parameters for Llama2

Since Llama2's native Chinese alignment is relatively weak, we fine-tuned meta-llama/Llama-2-7b-chat-hf with LoRA on a Chinese instruction set, giving it strong Chinese conversational ability.

🎯 **This version merges the LoRA Chinese fine-tuned parameters FlagAlpha/Llama2-Chinese-7b-Chat-LoRA with the meta-llama/Llama-2-7b-chat-hf weights, and can be used directly.**

---

## 🚀 Community links:

GitHub: [**Llama2-Chinese**](https://github.com/FlagAlpha/Llama2-Chinese)

Online demo: [**llama.family**](https://llama.family/)

## 🔥 About the Community

Welcome to the Llama2 Chinese community!

We are an advanced technical community focused on optimizing Llama2 for Chinese and building applications on top of it.

**Starting from pre-training on large-scale Chinese data, we continuously iterate on and upgrade Llama2's Chinese capabilities.**

We warmly welcome developers and researchers who are passionate about large language models (LLMs) to join us.

## 🐼 Community Resources

- Try Llama2 online at [**llama.family**](https://llama.family/), featuring both the original Meta weights and the Chinese fine-tuned versions!
- [Chinese Q&A benchmark results](https://github.com/FlagAlpha/Llama2-Chinese/tree/main#-%E6%A8%A1%E5%9E%8B%E8%AF%84%E6%B5%8B) for the Llama2 Chat models!
- [Community Feishu knowledge base](https://chinesellama.feishu.cn/wiki/space/7257824476874768388?ccm_open_type=lark_wiki_spaceLink) — everyone is welcome to contribute!
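As a minimal sketch of how a chat prompt for this model might be assembled: the `<s>Human: ...\n</s><s>Assistant: ` turn format below follows the examples in the Llama2-Chinese GitHub repository, and the helper function itself is illustrative rather than part of the official code — verify the template against the repository before relying on it.

```python
# Hedged sketch: build a multi-turn prompt in the turn format used by the
# Llama2-Chinese project. The exact template is an assumption taken from the
# community repo's README examples.
def build_prompt(turns):
    """turns: list of (human, assistant) pairs; the final assistant turn may be None
    to leave the prompt open for the model to complete."""
    prompt = ""
    for human, assistant in turns:
        prompt += f"<s>Human: {human}\n</s><s>Assistant: "
        if assistant is not None:
            prompt += f"{assistant}\n</s>"
    return prompt

print(build_prompt([("介绍一下中国", None)]))
```

The resulting string can then be tokenized and passed to `model.generate` as with any causal LM in 🤗 Transformers.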
997
[ [ -0.02947998046875, -0.04315185546875, 0.0160064697265625, 0.051300048828125, -0.0601806640625, 0.0206298828125, 0.00855255126953125, -0.052215576171875, 0.0357666015625, 0.0298614501953125, -0.040618896484375, -0.04864501953125, -0.036773681640625, 0.004756927490234375, -0.00839996337890625, 0.062469482421875, -0.016021728515625, -0.0198822021484375, 0.025726318359375, -0.0152435302734375, -0.032440185546875, -0.015716552734375, -0.046051025390625, -0.01345062255859375, 0.01297760009765625, 0.0192413330078125, 0.04644775390625, 0.0379638671875, 0.041534423828125, 0.017822265625, -0.01377105712890625, 0.0191497802734375, -0.028472900390625, -0.01776123046875, 0.006488800048828125, -0.045654296875, -0.05535888671875, -0.0262908935546875, 0.04010009765625, 0.0168914794921875, -0.00873565673828125, 0.030303955078125, 0.0166778564453125, 0.058441162109375, -0.0210113525390625, 0.044097900390625, -0.03515625, 0.005886077880859375, -0.0301971435546875, -0.02215576171875, 0.00601959228515625, -0.046630859375, -0.0268707275390625, -0.04705810546875, -0.0194549560546875, 0.00457000732421875, 0.08905029296875, 0.005596160888671875, -0.039764404296875, -0.0094757080078125, -0.0195465087890625, 0.054962158203125, -0.061676025390625, -0.0119171142578125, 0.036590576171875, 0.0282745361328125, -0.0244140625, -0.051422119140625, -0.051177978515625, 0.00826263427734375, -0.0199127197265625, 0.0088043212890625, -0.02069091796875, -0.0255889892578125, -0.002079010009765625, 0.0098114013671875, -0.036590576171875, 0.0255126953125, -0.037200927734375, -0.01003265380859375, 0.0452880859375, 0.003936767578125, 0.0261077880859375, -0.029327392578125, -0.0413818359375, 0.005832672119140625, -0.0714111328125, 0.022857666015625, 0.0176239013671875, 0.003047943115234375, -0.05303955078125, 0.031707763671875, -0.0165863037109375, 0.017791748046875, 0.011444091796875, -0.044586181640625, 0.039031982421875, -0.035736083984375, -0.00756072998046875, -0.0157928466796875, 0.066650390625, 
0.04962158203125, -0.0027942657470703125, 0.0179901123046875, -0.00859832763671875, -0.0184478759765625, -0.034942626953125, -0.052215576171875, 0.00531768798828125, 0.034423828125, -0.057769775390625, -0.0273590087890625, 0.00716400146484375, -0.0660400390625, -0.00235748291015625, 0.009918212890625, 0.004047393798828125, -0.00919342041015625, -0.0499267578125, 0.002925872802734375, 0.006473541259765625, 0.033782958984375, 0.033172607421875, -0.06024169921875, 0.00026297569274902344, 0.055145263671875, 0.0592041015625, -0.0035991668701171875, -0.01351165771484375, 0.003726959228515625, 0.041534423828125, -0.0202789306640625, 0.054229736328125, -0.006412506103515625, -0.037841796875, 0.01247406005859375, 0.012786865234375, 0.00452423095703125, -0.0364990234375, 0.043609619140625, -0.0435791015625, 0.00766754150390625, -0.03533935546875, 0.007354736328125, -0.040740966796875, 0.021148681640625, -0.040771484375, 0.083251953125, 0.01218414306640625, -0.069580078125, -0.0054168701171875, -0.04248046875, -0.029083251953125, 0.004665374755859375, -0.0172576904296875, -0.028076171875, -0.028961181640625, 0.01027679443359375, 0.0157470703125, -0.033935546875, 0.00983428955078125, 0.0014286041259765625, -0.028228759765625, 0.0090789794921875, -0.012939453125, 0.0859375, 0.0221099853515625, -0.0457763671875, 0.007083892822265625, -0.037139892578125, 0.0029201507568359375, 0.043731689453125, -0.035797119140625, -0.0036106109619140625, 0.01541900634765625, 0.006404876708984375, 0.0085906982421875, 0.0638427734375, -0.018890380859375, 0.037506103515625, -0.038604736328125, 0.0322265625, 0.06298828125, 0.00646209716796875, -0.0030918121337890625, -0.035858154296875, 0.0250244140625, 0.01345062255859375, 0.022674560546875, -0.00946044921875, -0.051239013671875, -0.07757568359375, -0.009918212890625, -0.0201568603515625, 0.06219482421875, -0.045654296875, 0.063232421875, -0.00792694091796875, -0.049102783203125, -0.02001953125, 0.0181884765625, 0.041595458984375, 
0.0037555694580078125, 0.0299224853515625, -0.0170745849609375, -0.0517578125, -0.051300048828125, 0.016204833984375, -0.057525634765625, 0.011383056640625, 0.0296630859375, 0.05255126953125, -0.0474853515625, 0.0457763671875, -0.03387451171875, -0.0276336669921875, -0.018829345703125, -0.0114288330078125, 0.041046142578125, 0.022857666015625, 0.0648193359375, -0.045135498046875, -0.0469970703125, 0.00988006591796875, -0.0594482421875, -0.0129852294921875, -0.00414276123046875, -0.0202789306640625, 0.03564453125, -0.00421142578125, -0.036224365234375, 0.0287628173828125, 0.052459716796875, -0.0272979736328125, 0.0509033203125, 0.0045928955078125, 0.0081939697265625, -0.08599853515625, 0.00028896331787109375, -0.0435791015625, 0.006900787353515625, -0.019073486328125, 0.024139404296875, -0.01212310791015625, 0.041290283203125, -0.04608154296875, 0.044219970703125, -0.0296173095703125, -0.005016326904296875, -0.018096923828125, 0.01343536376953125, -0.0052032470703125, 0.043182373046875, -0.0245361328125, 0.057159423828125, 0.040740966796875, -0.041229248046875, 0.037384033203125, 0.03857421875, -0.0193328857421875, -0.00015115737915039062, -0.05352783203125, 0.006984710693359375, 0.01387786865234375, 0.0491943359375, -0.08514404296875, -0.011077880859375, 0.05377197265625, -0.02117919921875, 0.004253387451171875, 0.0007319450378417969, -0.039764404296875, -0.03570556640625, -0.06103515625, 0.034912109375, 0.046722412109375, -0.05438232421875, 0.004848480224609375, 0.02978515625, -0.0000033974647521972656, -0.059234619140625, -0.054962158203125, 0.00927734375, -0.006175994873046875, -0.04827880859375, 0.0262298583984375, -0.011627197265625, -0.0022869110107421875, -0.011199951171875, 0.0041046142578125, -0.004856109619140625, -0.0026702880859375, 0.01953125, 0.015777587890625, -0.017364501953125, -0.02410888671875, 0.0235443115234375, -0.01532745361328125, 0.003307342529296875, 0.0120849609375, 0.049591064453125, -0.010894775390625, -0.046173095703125, 
-0.055633544921875, 0.02313232421875, 0.0316162109375, -0.007167816162109375, 0.041748046875, 0.050048828125, -0.0140533447265625, 0.009552001953125, -0.05194091796875, 0.01123046875, -0.03955078125, 0.028717041015625, -0.0203704833984375, -0.0853271484375, 0.043609619140625, -0.00698089599609375, 0.0225067138671875, 0.054443359375, 0.04534912109375, -0.01355743408203125, 0.051055908203125, 0.05126953125, -0.030303955078125, 0.0168609619140625, -0.0323486328125, 0.0012464523315429688, -0.06756591796875, -0.040069580078125, -0.038421630859375, -0.02227783203125, -0.054962158203125, -0.04864501953125, 0.02447509765625, 0.019317626953125, -0.03143310546875, 0.038238525390625, -0.04840087890625, 0.01207733154296875, 0.024017333984375, -0.0006012916564941406, 0.0311737060546875, 0.0090179443359375, -0.004261016845703125, 0.01273345947265625, -0.0301666259765625, -0.0628662109375, 0.06060791015625, 0.037384033203125, 0.034210205078125, 0.03143310546875, 0.04705810546875, 0.0041351318359375, 0.0109405517578125, -0.03570556640625, 0.052032470703125, 0.002101898193359375, -0.04229736328125, -0.01129150390625, -0.0018377304077148438, -0.08306884765625, 0.0140380859375, 0.01800537109375, -0.08123779296875, 0.0153045654296875, -0.0094451904296875, -0.009033203125, 0.0418701171875, -0.02392578125, 0.0215301513671875, -0.0271759033203125, -0.0025234222412109375, -0.0270233154296875, -0.042388916015625, 0.061126708984375, -0.02410888671875, 0.02435302734375, -0.0272674560546875, -0.033782958984375, 0.0694580078125, -0.03900146484375, 0.07257080078125, -0.01256561279296875, -0.026763916015625, 0.0595703125, 0.0023345947265625, 0.05352783203125, 0.0302886962890625, -0.00405120849609375, 0.0504150390625, 0.00820159912109375, -0.0210113525390625, -0.009429931640625, 0.0513916015625, -0.08233642578125, -0.06634521484375, -0.032806396484375, 0.0108642578125, 0.0184173583984375, 0.00931549072265625, 0.02130126953125, -0.0208282470703125, 0.0045928955078125, 0.00826263427734375, 
0.01239013671875, -0.03271484375, 0.05029296875, 0.054412841796875, -0.036376953125, -0.04571533203125, 0.046478271484375, 0.011474609375, 0.0028934478759765625, 0.037017822265625, -0.00701904296875, -0.0027675628662109375, -0.03265380859375, -0.03289794921875, 0.032806396484375, -0.036712646484375, -0.0237884521484375, -0.03460693359375, -0.029296875, -0.017181396484375, -0.0026493072509765625, -0.015106201171875, -0.028350830078125, -0.031341552734375, -0.0113067626953125, 0.041259765625, 0.041961669921875, -0.014129638671875, 0.03729248046875, -0.061798095703125, 0.041473388671875, 0.005184173583984375, 0.0133819580078125, 0.0169219970703125, -0.039703369140625, -0.022308349609375, 0.005893707275390625, -0.03314208984375, -0.0748291015625, 0.03106689453125, -0.00376129150390625, 0.0396728515625, 0.047149658203125, -0.00534820556640625, 0.05682373046875, -0.01983642578125, 0.07183837890625, 0.0286865234375, -0.061004638671875, 0.04962158203125, -0.0291900634765625, -0.005588531494140625, 0.0152587890625, 0.01535797119140625, -0.0357666015625, -0.008148193359375, -0.022216796875, -0.071533203125, 0.05718994140625, 0.0261993408203125, 0.0166168212890625, 0.00864410400390625, 0.006931304931640625, 0.0006265640258789062, 0.01039886474609375, -0.069580078125, -0.04876708984375, -0.01392364501953125, 0.01273345947265625, 0.0167236328125, -0.056640625, -0.016937255859375, -0.01496124267578125, 0.0509033203125, 0.0085906982421875, 0.039398193359375, 0.00897979736328125, 0.0236053466796875, -0.03387451171875, 0.00969696044921875, 0.0396728515625, 0.03228759765625, -0.00958251953125, -0.0101165771484375, 0.03253173828125, -0.043731689453125, 0.01251220703125, -0.017852783203125, -0.028228759765625, 0.00946807861328125, 0.047821044921875, 0.05340576171875, 0.00577545166015625, -0.039031982421875, 0.03326416015625, 0.01157379150390625, -0.01203155517578125, -0.056427001953125, 0.01611328125, 0.023040771484375, 0.0361328125, 0.050384521484375, -0.019439697265625, 
-0.00853729248046875, -0.037017822265625, -0.005901336669921875, 0.0316162109375, 0.0025768280029296875, -0.017791748046875, 0.03826904296875, 0.0277862548828125, -0.00699615478515625, 0.01279449462890625, -0.019866943359375, -0.058197021484375, 0.07354736328125, 0.047332763671875, 0.058441162109375, -0.00904083251953125, 0.004695892333984375, 0.06097412109375, 0.046844482421875, 0.0022125244140625, 0.052459716796875, -0.00885772705078125, -0.034820556640625, -0.01447296142578125, -0.042816162109375, -0.028076171875, 0.01849365234375, -0.01390838623046875, 0.028411865234375, -0.04705810546875, -0.007671356201171875, -0.038482666015625, 0.0288238525390625, -0.0170440673828125, -0.007572174072265625, -0.0019311904907226562, 0.07354736328125, -0.040679931640625, 0.06707763671875, 0.0274505615234375, -0.0284423828125, -0.06561279296875, 0.0024852752685546875, 0.007404327392578125, -0.072998046875, 0.05438232421875, 0.01171112060546875, -0.0099029541015625, -0.01111602783203125, -0.042999267578125, -0.101806640625, 0.11541748046875, -0.0009860992431640625, -0.043975830078125, 0.0200653076171875, 0.01381683349609375, 0.0268096923828125, -0.01306915283203125, 0.0254669189453125, 0.01788330078125, 0.068603515625, 0.01166534423828125, -0.08526611328125, 0.02410888671875, -0.03106689453125, -0.00250244140625, -0.009033203125, -0.10107421875, 0.060791015625, -0.0201263427734375, -0.031768798828125, 0.0292510986328125, 0.055572509765625, 0.04205322265625, 0.03192138671875, 0.0240325927734375, 0.0311431884765625, 0.031280517578125, -0.0165863037109375, 0.034271240234375, -0.0198516845703125, 0.032135009765625, 0.051300048828125, -0.02069091796875, 0.0660400390625, 0.026611328125, -0.05621337890625, 0.0599365234375, 0.057830810546875, -0.0223236083984375, 0.036529541015625, -0.007740020751953125, -0.02484130859375, 0.0133514404296875, -0.010894775390625, -0.06817626953125, 0.0179595947265625, 0.03277587890625, -0.004642486572265625, -0.007419586181640625, -0.039764404296875, 
0.021575927734375, -0.0236358642578125, -0.00930023193359375, 0.050811767578125, 0.0036678314208984375, -0.0295257568359375, 0.057769775390625, 0.023468017578125, 0.07330322265625, -0.05010986328125, -0.00922393798828125, -0.026092529296875, -0.00913238525390625, -0.045013427734375, -0.053619384765625, 0.0191650390625, 0.01282501220703125, -0.006488800048828125, 0.019683837890625, 0.041534423828125, -0.003665924072265625, -0.026336669921875, 0.051177978515625, 0.0277862548828125, 0.027587890625, 0.034088134765625, -0.06573486328125, 0.01476287841796875, 0.0276336669921875, -0.052642822265625, 0.02734375, 0.018951416015625, -0.019012451171875, 0.051422119140625, 0.06884765625, 0.00199127197265625, -0.0032024383544921875, -0.015869140625, 0.078125, -0.042755126953125, -0.027435302734375, -0.06884765625, 0.035614013671875, 0.00473785400390625, -0.0270538330078125, 0.048187255859375, 0.0413818359375, 0.0478515625, -0.004932403564453125, 0.05609130859375, -0.024627685546875, 0.037261962890625, -0.01837158203125, 0.055389404296875, -0.042022705078125, 0.0237579345703125, -0.0164031982421875, -0.058380126953125, -0.0100860595703125, 0.047821044921875, 0.011749267578125, 0.01163482666015625, 0.056121826171875, 0.0660400390625, 0.018402099609375, -0.024566650390625, 0.003082275390625, 0.024169921875, 0.0294952392578125, 0.08154296875, 0.045745849609375, -0.048828125, 0.047515869140625, -0.0245361328125, -0.016815185546875, -0.04119873046875, -0.059234619140625, -0.06585693359375, -0.0273895263671875, -0.01369476318359375, -0.0149383544921875, -0.0062408447265625, 0.054718017578125, 0.030426025390625, -0.0650634765625, -0.0386962890625, 0.01282501220703125, 0.035858154296875, -0.0087890625, -0.01177215576171875, 0.057373046875, 0.014373779296875, -0.044708251953125, 0.02130126953125, 0.032012939453125, 0.005161285400390625, -0.014495849609375, -0.0226593017578125, -0.0188446044921875, -0.00464630126953125, 0.03582763671875, 0.035797119140625, -0.07977294921875, 
-0.0080108642578125, -0.0200653076171875, -0.033905029296875, 0.0018033981323242188, 0.0092315673828125, -0.0347900390625, -0.011016845703125, 0.0474853515625, 0.00005143880844116211, 0.019744873046875, -0.01104736328125, -0.0020351409912109375, -0.026092529296875, 0.029571533203125, -0.0198974609375, 0.039154052734375, -0.007732391357421875, -0.0267486572265625, 0.0638427734375, 0.03271484375, -0.02630615234375, -0.0435791015625, 0.01959228515625, -0.0936279296875, -0.0181427001953125, 0.10174560546875, -0.01751708984375, -0.0140380859375, -0.002391815185546875, -0.0518798828125, 0.045928955078125, -0.02734375, 0.054290771484375, 0.029754638671875, 0.00981903076171875, 0.00733184814453125, -0.035430908203125, 0.017791748046875, 0.0181427001953125, -0.063232421875, -0.0200653076171875, 0.009429931640625, 0.0018587112426757812, 0.02166748046875, 0.0540771484375, -0.004253387451171875, 0.023590087890625, -0.0285491943359375, 0.00994110107421875, -0.019378662109375, 0.01265716552734375, 0.0019235610961914062, 0.0040283203125, -0.003742218017578125, -0.01763916015625 ] ]
distil-whisper/distil-large-v2
2023-11-06T16:37:42.000Z
[ "transformers", "pytorch", "jax", "tensorboard", "onnx", "safetensors", "whisper", "automatic-speech-recognition", "audio", "transformers.js", "en", "arxiv:2311.00430", "arxiv:2210.13352", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
automatic-speech-recognition
distil-whisper
null
null
distil-whisper/distil-large-v2
170
10,078
transformers
2023-10-24T15:48:32
---
language:
- en
tags:
- audio
- automatic-speech-recognition
- transformers.js
widget:
- example_title: LibriSpeech sample 1
  src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: LibriSpeech sample 2
  src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
pipeline_tag: automatic-speech-recognition
license: mit
library_name: transformers
---

# Distil-Whisper: distil-large-v2

Distil-Whisper was proposed in the paper [Robust Knowledge Distillation via Large-Scale Pseudo Labelling](https://arxiv.org/abs/2311.00430). It is a distilled version of the Whisper model that is **6 times faster**, 49% smaller, and performs **within 1% WER** on out-of-distribution evaluation sets.

This is the repository for distil-large-v2, a distilled variant of [Whisper large-v2](https://huggingface.co/openai/whisper-large-v2).

| Model | Params / M | Rel. Latency | Short-Form WER | Long-Form WER |
|----------------------------------------------------------------------------|------------|--------------|----------------|---------------|
| [large-v2](https://huggingface.co/openai/whisper-large-v2) | 1550 | 1.0 | **9.1** | 11.7 |
| | | | | |
| [distil-large-v2](https://huggingface.co/distil-whisper/distil-large-v2) | 756 | 5.8 | 10.1 | **11.6** |
| [distil-medium.en](https://huggingface.co/distil-whisper/distil-medium.en) | **394** | **6.8** | 11.1 | 12.4 |

**Note:** Distil-Whisper is currently only available for English speech recognition. Multilingual support will be provided in a follow-up.

## Usage

Distil-Whisper is supported in Hugging Face 🤗 Transformers from version 4.35 onwards. To run the model, first install the latest version of the Transformers library.
For this example, we'll also install 🤗 Datasets to load a toy audio dataset from the Hugging Face Hub:

```bash
pip install --upgrade pip
pip install --upgrade transformers accelerate datasets[audio]
```

### Short-Form Transcription

The model can be used with the [`pipeline`](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline) class to transcribe short-form audio files (< 30 seconds) as follows:

```python
import torch
from transformers import AutoModelForSpeechSeq2Seq, AutoProcessor, pipeline
from datasets import load_dataset

device = "cuda:0" if torch.cuda.is_available() else "cpu"
torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32

model_id = "distil-whisper/distil-large-v2"

model = AutoModelForSpeechSeq2Seq.from_pretrained(
    model_id, torch_dtype=torch_dtype, low_cpu_mem_usage=True, use_safetensors=True
)
model.to(device)

processor = AutoProcessor.from_pretrained(model_id)

pipe = pipeline(
    "automatic-speech-recognition",
    model=model,
    tokenizer=processor.tokenizer,
    feature_extractor=processor.feature_extractor,
    max_new_tokens=128,
    torch_dtype=torch_dtype,
    device=device,
)

dataset = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
sample = dataset[0]["audio"]

result = pipe(sample)
print(result["text"])
```

To transcribe a local audio file, simply pass the path to your audio file when you call the pipeline:

```diff
- result = pipe(sample)
+ result = pipe("audio.mp3")
```

### Long-Form Transcription

Distil-Whisper uses a chunked algorithm to transcribe long-form audio files (> 30 seconds). In practice, this chunked long-form algorithm is 9x faster than the sequential algorithm proposed by OpenAI in the Whisper paper (see Table 7 of the [Distil-Whisper paper](https://arxiv.org/abs/2311.00430)).

To enable chunking, pass the `chunk_length_s` parameter to the `pipeline`. For Distil-Whisper, a chunk length of 15 seconds is optimal.
To activate batching, pass the argument `batch_size`:

```python
import torch
from transformers import AutoModelForSpeechSeq2Seq, AutoProcessor, pipeline
from datasets import load_dataset

device = "cuda:0" if torch.cuda.is_available() else "cpu"
torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32

model_id = "distil-whisper/distil-large-v2"

model = AutoModelForSpeechSeq2Seq.from_pretrained(
    model_id, torch_dtype=torch_dtype, low_cpu_mem_usage=True, use_safetensors=True
)
model.to(device)

processor = AutoProcessor.from_pretrained(model_id)

pipe = pipeline(
    "automatic-speech-recognition",
    model=model,
    tokenizer=processor.tokenizer,
    feature_extractor=processor.feature_extractor,
    max_new_tokens=128,
    chunk_length_s=15,
    batch_size=16,
    torch_dtype=torch_dtype,
    device=device,
)

dataset = load_dataset("distil-whisper/librispeech_long", "clean", split="validation")
sample = dataset[0]["audio"]

result = pipe(sample)
print(result["text"])
```

<!---
**Tip:** The pipeline can also be used to transcribe an audio file from a remote URL, for example:

```python
result = pipe("https://huggingface.co/datasets/sanchit-gandhi/librispeech_long/resolve/main/audio.wav")
```
--->

### Speculative Decoding

Distil-Whisper can be used as an assistant model to Whisper for speculative decoding. Speculative decoding mathematically ensures the exact same outputs as Whisper are obtained, while being 2 times faster. This makes it the perfect drop-in replacement for existing Whisper pipelines, since the same outputs are guaranteed.

In the following code-snippet, we load the Distil-Whisper assistant model standalone, alongside the main Whisper pipeline.
We then specify it as the "assistant model" for generation: ```python from transformers import pipeline, AutoModelForCausalLM, AutoModelForSpeechSeq2Seq, AutoProcessor import torch from datasets import load_dataset device = "cuda:0" if torch.cuda.is_available() else "cpu" torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32 assistant_model_id = "distil-whisper/distil-large-v2" assistant_model = AutoModelForCausalLM.from_pretrained( assistant_model_id, torch_dtype=torch_dtype, low_cpu_mem_usage=True, use_safetensors=True ) assistant_model.to(device) model_id = "openai/whisper-large-v2" model = AutoModelForSpeechSeq2Seq.from_pretrained( model_id, torch_dtype=torch_dtype, low_cpu_mem_usage=True, use_safetensors=True ) model.to(device) processor = AutoProcessor.from_pretrained(model_id) pipe = pipeline( "automatic-speech-recognition", model=model, tokenizer=processor.tokenizer, feature_extractor=processor.feature_extractor, max_new_tokens=128, generate_kwargs={"assistant_model": assistant_model}, torch_dtype=torch_dtype, device=device, ) dataset = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation") sample = dataset[0]["audio"] result = pipe(sample) print(result["text"]) ``` ## Additional Speed & Memory Improvements You can apply additional speed and memory improvements to Distil-Whisper which we cover in the following. ### Flash Attention We recommend using [Flash-Attention 2](https://huggingface.co/docs/transformers/main/en/perf_infer_gpu_one#flashattention-2) if your GPU allows for it. 
To do so, you first need to install [Flash Attention](https://github.com/Dao-AILab/flash-attention):

```
pip install flash-attn --no-build-isolation
```

and then all you have to do is to pass `use_flash_attention_2=True` to `from_pretrained`:

```diff
- model = AutoModelForSpeechSeq2Seq.from_pretrained(model_id, torch_dtype=torch_dtype, low_cpu_mem_usage=True, use_safetensors=True)
+ model = AutoModelForSpeechSeq2Seq.from_pretrained(model_id, torch_dtype=torch_dtype, low_cpu_mem_usage=True, use_safetensors=True, use_flash_attention_2=True)
```

### Torch Scaled Dot-Product Attention (SDPA)

If your GPU does not support Flash Attention, we recommend making use of [BetterTransformer](https://huggingface.co/docs/transformers/main/en/perf_infer_gpu_one#bettertransformer). To do so, you first need to install Optimum:

```
pip install --upgrade optimum
```

And then convert your model to a "BetterTransformer" model before using it:

```diff
model = AutoModelForSpeechSeq2Seq.from_pretrained(model_id, torch_dtype=torch_dtype, low_cpu_mem_usage=True, use_safetensors=True)
+ model = model.to_bettertransformer()
```

### 8bit & 4bit Quantization

Coming soon ...

### Candle

Coming soon ...

### Whisper.cpp

Coming soon ...
### Running Distil-Whisper in `openai-whisper`

To use the model in the original Whisper format, first ensure you have the [`openai-whisper`](https://pypi.org/project/openai-whisper/) package installed:

```bash
pip install --upgrade openai-whisper
```

The following code-snippet demonstrates how to transcribe a sample file from the LibriSpeech dataset loaded using 🤗 Datasets:

```python
import torch
from datasets import load_dataset
from huggingface_hub import hf_hub_download
from whisper import load_model, transcribe

distil_large_v2 = hf_hub_download(repo_id="distil-whisper/distil-large-v2", filename="original-model.bin")
model = load_model(distil_large_v2)

dataset = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
sample = dataset[0]["audio"]["array"]
sample = torch.from_numpy(sample).float()

pred_out = transcribe(model, audio=sample)
print(pred_out["text"])
```

To transcribe a local audio file, simply pass the path to the audio file as the `audio` argument to transcribe:

```python
pred_out = transcribe(model, audio="audio.mp3")
```

### Transformers.js

```js
import { pipeline } from '@xenova/transformers';

let transcriber = await pipeline('automatic-speech-recognition', 'distil-whisper/distil-large-v2');

let url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/jfk.wav';
let output = await transcriber(url);
// { text: " And so, my fellow Americans, ask not what your country can do for you. Ask what you can do for your country." }
```

See the [docs](https://huggingface.co/docs/transformers.js/api/pipelines#module_pipelines.AutomaticSpeechRecognitionPipeline) for more information.

*Note:* Due to the large model size, we recommend running this model server-side with [Node.js](https://huggingface.co/docs/transformers.js/guides/node-audio-processing) (instead of in-browser).

## Model Details

Distil-Whisper inherits the encoder-decoder architecture from Whisper.
The encoder maps a sequence of speech vector inputs to a sequence of hidden-state vectors. The decoder auto-regressively predicts text tokens, conditional on all previous tokens and the encoder hidden-states. Consequently, the encoder is only run forward once, whereas the decoder is run as many times as the number of tokens generated. In practice, this means the decoder accounts for over 90% of total inference time. Thus, to optimise for latency, the focus should be on minimising the inference time of the decoder.

To distill the Whisper model, we reduce the number of decoder layers while keeping the encoder fixed. The encoder (shown in green) is entirely copied from the teacher to the student and frozen during training. The student's decoder consists of only two decoder layers, which are initialised from the first and last decoder layer of the teacher (shown in red). All other decoder layers of the teacher are discarded. The model is then trained on a weighted sum of the KL divergence and pseudo-label loss terms.

<p align="center">
  <img src="https://huggingface.co/datasets/distil-whisper/figures/resolve/main/architecture.png?raw=true" width="600"/>
</p>

## Evaluation

The following code snippet demonstrates how to evaluate the Distil-Whisper model on the LibriSpeech validation.clean dataset with [streaming mode](https://huggingface.co/blog/audio-datasets#streaming-mode-the-silver-bullet), meaning no audio data has to be downloaded to your local device.
First, we need to install the required packages, including 🤗 Datasets to stream and load the audio data, and 🤗 Evaluate to perform the WER calculation:

```bash
pip install --upgrade pip
pip install --upgrade transformers datasets[audio] evaluate jiwer
```

Evaluation can then be run end-to-end with the following example:

```python
from transformers import AutoModelForSpeechSeq2Seq, AutoProcessor
from transformers.models.whisper.english_normalizer import EnglishTextNormalizer
from datasets import load_dataset
from evaluate import load
import torch
from tqdm import tqdm

# define our torch configuration
device = "cuda:0" if torch.cuda.is_available() else "cpu"
torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32

model_id = "distil-whisper/distil-large-v2"

# load the model + processor
model = AutoModelForSpeechSeq2Seq.from_pretrained(model_id, torch_dtype=torch_dtype, use_safetensors=True, low_cpu_mem_usage=True)
model = model.to(device)
processor = AutoProcessor.from_pretrained(model_id)

# load the dataset with streaming mode
dataset = load_dataset("librispeech_asr", "clean", split="validation", streaming=True)

# define the evaluation metric
wer_metric = load("wer")
normalizer = EnglishTextNormalizer(processor.tokenizer.english_spelling_normalizer)

def inference(batch):
    # 1. Pre-process the audio data to log-mel spectrogram inputs
    audio = [sample["array"] for sample in batch["audio"]]
    input_features = processor(audio, sampling_rate=batch["audio"][0]["sampling_rate"], return_tensors="pt").input_features
    input_features = input_features.to(device, dtype=torch_dtype)

    # 2. Auto-regressively generate the predicted token ids
    pred_ids = model.generate(input_features, max_new_tokens=128, language="en", task="transcribe")

    # 3. Decode the token ids to the final transcription
    batch["transcription"] = processor.batch_decode(pred_ids, skip_special_tokens=True)
    batch["reference"] = batch["text"]
    return batch

dataset = dataset.map(function=inference, batched=True, batch_size=16)

all_transcriptions = []
all_references = []

# iterate over the dataset and run inference
for i, result in tqdm(enumerate(dataset), desc="Evaluating..."):
    all_transcriptions.append(result["transcription"])
    all_references.append(result["reference"])

# normalize predictions and references
all_transcriptions = [normalizer(transcription) for transcription in all_transcriptions]
all_references = [normalizer(reference) for reference in all_references]

# compute the WER metric
wer = 100 * wer_metric.compute(predictions=all_transcriptions, references=all_references)
print(wer)
```

**Print Output:**
```
2.983685535968466
```

## Intended Use

Distil-Whisper is intended to be a drop-in replacement for Whisper on English speech recognition. In particular, it achieves comparable WER results over out-of-distribution test data, while being 6x faster over both short and long-form audio.
## Data

Distil-Whisper is trained on 22,000 hours of audio data from 9 open-source, permissively licensed speech datasets on the Hugging Face Hub:

| Dataset | Size / h | Speakers | Domain | Licence |
|-----------------------------------------------------------------------------------------|----------|----------|-----------------------------|-----------------|
| [People's Speech](https://huggingface.co/datasets/MLCommons/peoples_speech) | 12,000 | unknown | Internet Archive | CC-BY-SA-4.0 |
| [Common Voice 13](https://huggingface.co/datasets/mozilla-foundation/common_voice_13_0) | 3,000 | unknown | Narrated Wikipedia | CC0-1.0 |
| [GigaSpeech](https://huggingface.co/datasets/speechcolab/gigaspeech) | 2,500 | unknown | Audiobook, podcast, YouTube | apache-2.0 |
| Fisher | 1,960 | 11,900 | Telephone conversations | LDC |
| [LibriSpeech](https://huggingface.co/datasets/librispeech_asr) | 960 | 2,480 | Audiobooks | CC-BY-4.0 |
| [VoxPopuli](https://huggingface.co/datasets/facebook/voxpopuli) | 540 | 1,310 | European Parliament | CC0 |
| [TED-LIUM](https://huggingface.co/datasets/LIUM/tedlium) | 450 | 2,030 | TED talks | CC-BY-NC-ND 3.0 |
| SwitchBoard | 260 | 540 | Telephone conversations | LDC |
| [AMI](https://huggingface.co/datasets/edinburghcstr/ami) | 100 | unknown | Meetings | CC-BY-4.0 |
||||||
| **Total** | 21,770 | 18,260+ | | |

The combined dataset spans 10 distinct domains and over 50k speakers. The diversity of this dataset is crucial to ensuring the distilled model is robust to audio distributions and noise.

The audio data is then pseudo-labelled using the Whisper large-v2 model: we use Whisper to generate predictions for all the audio in our training set and use these as the target labels during training. Using pseudo-labels ensures that the transcriptions are consistently formatted across datasets and provides a sequence-level distillation signal during training.
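As a sanity check on the table above, the per-dataset shares of the training mixture follow directly from the listed hours. The snippet below is purely illustrative (names and sizes are copied from the table; it is not part of the training code):

```python
# hours per dataset, as listed in the table above
hours = {
    "People's Speech": 12_000,
    "Common Voice 13": 3_000,
    "GigaSpeech": 2_500,
    "Fisher": 1_960,
    "LibriSpeech": 960,
    "VoxPopuli": 540,
    "TED-LIUM": 450,
    "SwitchBoard": 260,
    "AMI": 100,
}

total = sum(hours.values())
print(f"total: {total} h")  # 21770, matching the Total row

# share of each dataset in the training mixture, largest first
for name, h in sorted(hours.items(), key=lambda kv: -kv[1]):
    print(f"{name:16s} {h:6d} h  {100 * h / total:5.1f}%")
```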
## WER Filter

The Whisper pseudo-label predictions are subject to mis-transcriptions and hallucinations. To ensure we only train on accurate pseudo-labels, we employ a simple WER heuristic during training. First, we normalise the Whisper pseudo-labels and the ground truth labels provided by each dataset. We then compute the WER between these labels. If the WER exceeds a specified threshold, we discard the training example. Otherwise, we keep it for training.

Section 9.2 of the [Distil-Whisper paper](https://arxiv.org/abs/2311.00430) demonstrates the effectiveness of this filter for improving downstream performance of the distilled model. We also partially attribute Distil-Whisper's robustness to hallucinations to this filter.

## Training

The model was trained for 80,000 optimisation steps (or eight epochs). The Tensorboard training logs can be found under: https://huggingface.co/distil-whisper/distil-large-v2/tensorboard?params=scalars#frame

## Results

The distilled model performs to within 1% WER of Whisper on out-of-distribution (OOD) short-form audio, and outperforms Whisper by 0.1% on OOD long-form audio. This performance gain is attributed to lower hallucinations.

For a detailed per-dataset breakdown of the evaluation results, refer to Tables 16 and 17 of the [Distil-Whisper paper](https://arxiv.org/abs/2311.00430).

Distil-Whisper is also evaluated on the [ESB benchmark](https://arxiv.org/abs/2210.13352) datasets as part of the [OpenASR leaderboard](https://huggingface.co/spaces/hf-audio/open_asr_leaderboard), where it performs to within 0.2% WER of Whisper.
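The filtering heuristic described under "WER Filter" above can be sketched in a few lines of plain Python. Note this is only an illustration: `normalize` is a crude stand-in for the Whisper English normaliser, and the 10% threshold is a placeholder, not the value used in training:

```python
import re

def normalize(text):
    # crude stand-in for the Whisper EnglishTextNormalizer:
    # lowercase, strip punctuation, split into words
    return re.sub(r"[^\w\s]", "", text.lower()).split()

def wer(reference, hypothesis):
    # word-level Levenshtein distance divided by reference length
    r, h = normalize(reference), normalize(hypothesis)
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i
    for j in range(len(h) + 1):
        d[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[len(r)][len(h)] / max(len(r), 1)

def keep_example(ground_truth, pseudo_label, threshold=0.1):
    # discard the training pair if the pseudo-label deviates too much
    # from the dataset's ground truth transcription
    return wer(ground_truth, pseudo_label) <= threshold

print(keep_example("the cat sat on the mat", "the cat sat on the mat."))  # True
print(keep_example("the cat sat on the mat", "the dog stood by a door"))  # False
```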
## Reproducing Distil-Whisper

Training and evaluation code to reproduce Distil-Whisper will be made available on the Distil-Whisper repository: https://github.com/huggingface/distil-whisper

## Citation

If you use this model, please consider citing the [Distil-Whisper paper](https://arxiv.org/abs/2311.00430):

```
@misc{gandhi2023distilwhisper,
      title={Distil-Whisper: Robust Knowledge Distillation via Large-Scale Pseudo Labelling},
      author={Sanchit Gandhi and Patrick von Platen and Alexander M. Rush},
      year={2023},
      eprint={2311.00430},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

## Acknowledgements

* OpenAI for the Whisper [model](https://huggingface.co/openai/whisper-large-v2) and [original codebase](https://github.com/openai/whisper)
* Hugging Face 🤗 [Transformers](https://github.com/huggingface/transformers) for the model integration
* Google's [TPU Research Cloud (TRC)](https://sites.research.google/trc/about/) programme for Cloud TPU v4s
* [`@rsonavane`](https://huggingface.co/rsonavane/distil-whisper-large-v2-8-ls) for releasing an early iteration of Distil-Whisper on the LibriSpeech dataset
alibaba-damo/mgp-str-base
2023-03-17T07:10:02.000Z
[ "transformers", "pytorch", "mgp-str", "image-to-text", "arxiv:2209.03592", "endpoints_compatible", "has_space", "region:us" ]
image-to-text
alibaba-damo
null
null
alibaba-damo/mgp-str-base
1
10,075
transformers
2022-11-23T13:38:30
---
tags:
- mgp-str
- image-to-text
widget:
- src: https://github.com/AlibabaResearch/AdvancedLiterateMachinery/blob/main/OCR/MGP-STR/demo_imgs/IIIT5k_HOUSE.png
  example_title: Example 1
- src: https://github.com/AlibabaResearch/AdvancedLiterateMachinery/blob/main/OCR/MGP-STR/demo_imgs/IIT5k_EVERYONE.png
  example_title: Example 2
- src: https://github.com/AlibabaResearch/AdvancedLiterateMachinery/blob/main/OCR/MGP-STR/demo_imgs/CUTE80_KINGDOM.png
  example_title: Example 3
---

# MGP-STR (base-sized model)

The MGP-STR base-sized model is trained on MJSynth and SynthText. It was introduced in the paper [Multi-Granularity Prediction for Scene Text Recognition](https://arxiv.org/abs/2209.03592) and first released in [this repository](https://github.com/AlibabaResearch/AdvancedLiterateMachinery/tree/main/OCR/MGP-STR).

## Model description

MGP-STR is a pure vision STR model, consisting of a ViT and specially designed A^3 modules. The ViT module was initialized from the weights of DeiT-base, except for the patch embedding layer, due to the inconsistent input size.

Images (32x128) are presented to the model as a sequence of fixed-size patches (resolution 4x4), which are linearly embedded. One also adds absolute position embeddings before feeding the sequence to the layers of the ViT module. Next, the A^3 module selects a meaningful combination of tokens from the ViT output and integrates them into one output token corresponding to a specific character. Moreover, subword classification heads based on a BPE A^3 module and a WordPiece A^3 module are devised for subword predictions, so that language information can be implicitly modelled. Finally, these multi-granularity predictions (character, subword and even word) are merged via a simple and effective fusion strategy.

## Intended uses & limitations

You can use the raw model for optical character recognition (OCR) on text images.
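As a quick check on the patch tokenization described above, the sequence length seen by the ViT module follows directly from the input and patch sizes. This is pure arithmetic, shown only for illustration:

```python
# MGP-STR input images are 32x128, split into non-overlapping 4x4 patches
image_height, image_width = 32, 128
patch_size = 4

num_patches = (image_height // patch_size) * (image_width // patch_size)
print(num_patches)  # 256 patch tokens are fed to the ViT module
```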
See the [model hub](https://huggingface.co/models?search=alibaba-damo/mgp-str) to look for fine-tuned versions on a task that interests you.

### How to use

Here is how to use this model in PyTorch:

```python
from transformers import MgpstrProcessor, MgpstrForSceneTextRecognition
import requests
from PIL import Image

processor = MgpstrProcessor.from_pretrained('alibaba-damo/mgp-str-base')
model = MgpstrForSceneTextRecognition.from_pretrained('alibaba-damo/mgp-str-base')

# load image from the IIIT-5k dataset
url = "https://i.postimg.cc/ZKwLg2Gw/367-14.png"
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")

pixel_values = processor(images=image, return_tensors="pt").pixel_values
outputs = model(pixel_values)

generated_text = processor.batch_decode(outputs.logits)['generated_text']
```

### BibTeX entry and citation info

```bibtex
@inproceedings{ECCV2022mgp_str,
  title={Multi-Granularity Prediction for Scene Text Recognition},
  author={Peng Wang and Cheng Da and Cong Yao},
  booktitle={ECCV},
  year={2022}
}
```
0.053863525390625, 0.015777587890625, -0.041015625, -0.0170745849609375, -0.01309967041015625, -0.0296478271484375, 0.024139404296875, 0.0460205078125, -0.039825439453125, 0.01505279541015625, 0.060791015625, 0.02484130859375, 0.037689208984375, -0.0051116943359375, 0.021820068359375, -0.057830810546875, 0.023529052734375, -0.0028324127197265625, 0.0565185546875, 0.032440185546875, -0.0251007080078125, 0.031463623046875, 0.034576416015625, -0.049163818359375, -0.03814697265625, 0.01983642578125, -0.08489990234375, -0.0251617431640625, 0.0958251953125, -0.01462554931640625, -0.032257080078125, 0.0032176971435546875, -0.062286376953125, 0.00952911376953125, -0.0291900634765625, 0.052459716796875, 0.053741455078125, 0.00960540771484375, -0.03955078125, -0.0214385986328125, 0.0200042724609375, -0.001186370849609375, -0.047332763671875, -0.0072479248046875, 0.037994384765625, 0.016510009765625, 0.046173095703125, 0.043548583984375, -0.0174713134765625, -0.00194549560546875, 0.0283660888671875, 0.027801513671875, -0.01399993896484375, -0.03277587890625, -0.0288238525390625, 0.02838134765625, -0.01885986328125, -0.022613525390625 ] ]
Lykon/dreamshaper-8
2023-08-26T16:49:07.000Z
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "art", "artistic", "anime", "dreamshaper", "en", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
Lykon
null
null
Lykon/dreamshaper-8
13
10,059
diffusers
2023-08-26T16:49:06
---
language:
- en
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- art
- artistic
- diffusers
- anime
- dreamshaper
duplicated_from: lykon-models/dreamshaper-8
---

# Dreamshaper 8

`lykon-models/dreamshaper-8` is a Stable Diffusion model that has been fine-tuned on [runwayml/stable-diffusion-v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5).

Please consider supporting me:
- on [Patreon](https://www.patreon.com/Lykon275)
- or [buy me a coffee](https://snipfeed.co/lykon)

## Diffusers

For more general information on how to run text-to-image models with 🧨 Diffusers, see [the docs](https://huggingface.co/docs/diffusers/using-diffusers/conditional_image_generation).

1. Installation

```
pip install diffusers transformers accelerate
```

2. Run

```py
from diffusers import AutoPipelineForText2Image, DEISMultistepScheduler
import torch

# Load the fp16 variant of the pipeline and swap in the DEIS scheduler.
pipe = AutoPipelineForText2Image.from_pretrained('lykon-models/dreamshaper-8', torch_dtype=torch.float16, variant="fp16")
pipe.scheduler = DEISMultistepScheduler.from_config(pipe.scheduler.config)
pipe = pipe.to("cuda")

prompt = "portrait photo of muscular bearded guy in a worn mech suit, light bokeh, intricate, steel metal, elegant, sharp focus, soft lighting, vibrant colors"

# A fixed seed makes the result reproducible.
generator = torch.manual_seed(33)
image = pipe(prompt, generator=generator, num_inference_steps=25).images[0]
image.save("./image.png")
```

![](./image.png)

## Notes

- **Version 8** focuses on improving what V7 started. It may be harder to get photorealism than with realism-focused models, just as it may be harder to get anime than with anime-focused models, but it can do both pretty well if you're skilled enough. Check the examples!
- **Version 7** improves LoRA support, NSFW, and realism. If you're interested in "absolute" realism, try AbsoluteReality.
- **Version 6** adds more LoRA support and more style in general. It should also be better at generating directly at 1024 height (but be careful with it). The 6.x releases are all incremental improvements.
- **Version 5** is the best at photorealism and has noise offset.
- **Version 4** is much better with anime (it can produce anime with no LoRA) and booru tags. It might be harder to control if you're used to caption style, so you might still want to use version 3.31. V4 is also better with eyes at lower resolutions. Overall it is like a "fix" of V3 and shouldn't be too different.
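The `generator = torch.manual_seed(33)` line in the card's example is what makes the output image reproducible: the same seed yields the same initial noise, so the denoising trajectory is identical. As a purely conceptual, torch-free illustration of that principle (the function name and values here are made up for the sketch):

```python
import random

def fake_initial_noise(seed, n=8):
    # Analogous to torch.manual_seed(33): a seeded PRNG always
    # emits the same sequence, so the "latents" are identical.
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) for _ in range(n)]

run1 = fake_initial_noise(33)
run2 = fake_initial_noise(33)
run3 = fake_initial_noise(34)

print(run1 == run2)  # True: same seed, same noise, same image
print(run1 == run3)  # False: different seed, different image
```

The same idea carries over to diffusers: pass a seeded `torch.Generator` (or the result of `torch.manual_seed`) to the pipeline call to pin down the result.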
2,412
[ [ (embedding vector values omitted) ] ]
Salesforce/instructblip-flan-t5-xl
2023-06-18T09:19:04.000Z
[ "transformers", "pytorch", "instructblip", "text2text-generation", "vision", "image-captioning", "image-to-text", "en", "arxiv:2305.06500", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
image-to-text
Salesforce
null
null
Salesforce/instructblip-flan-t5-xl
20
10,058
transformers
2023-05-28T13:29:18
---
language: en
license: mit
tags:
- vision
- image-captioning
pipeline_tag: image-to-text
---

# InstructBLIP model

InstructBLIP model using [Flan-T5-xl](https://huggingface.co/google/flan-t5-xl) as its language model. InstructBLIP was introduced in the paper [InstructBLIP: Towards General-purpose Vision-Language Models with Instruction Tuning](https://arxiv.org/abs/2305.06500) by Dai et al.

Disclaimer: The team releasing InstructBLIP did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Model description

InstructBLIP is a visual instruction-tuned version of [BLIP-2](https://huggingface.co/docs/transformers/main/model_doc/blip-2). Refer to the paper for details.

![InstructBLIP architecture](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/instructblip_architecture.jpg)

## Intended uses & limitations

Usage is as follows:

```
from transformers import InstructBlipProcessor, InstructBlipForConditionalGeneration
import torch
from PIL import Image
import requests

model = InstructBlipForConditionalGeneration.from_pretrained("Salesforce/instructblip-flan-t5-xl")
processor = InstructBlipProcessor.from_pretrained("Salesforce/instructblip-flan-t5-xl")

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

url = "https://raw.githubusercontent.com/salesforce/LAVIS/main/docs/_static/Confusing-Pictures.jpg"
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")
prompt = "What is unusual about this image?"
inputs = processor(images=image, text=prompt, return_tensors="pt").to(device)

outputs = model.generate(
    **inputs,
    do_sample=False,  # beam search; top_p and temperature below are ignored
    num_beams=5,
    max_length=256,
    min_length=1,
    top_p=0.9,
    repetition_penalty=1.5,
    length_penalty=1.0,
    temperature=1,
)
generated_text = processor.batch_decode(outputs, skip_special_tokens=True)[0].strip()
print(generated_text)
```

### How to use

For code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/instructblip).
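The `generate` call above decodes with beam search (`do_sample=False` plus `num_beams=5`) and a `length_penalty`. As a self-contained, purely illustrative sketch of what those two parameters control — using a tiny hand-written log-probability table in place of a real model — beam search reduces to:

```python
import math

# Toy transition log-probabilities over a 3-symbol vocabulary.
# These numbers are invented for illustration; a real model
# computes them from its neural network's logits.
LOGPROBS = {
    "<s>": {"a": math.log(0.5), "b": math.log(0.3), "</s>": math.log(0.2)},
    "a":   {"a": math.log(0.1), "b": math.log(0.6), "</s>": math.log(0.3)},
    "b":   {"a": math.log(0.4), "b": math.log(0.1), "</s>": math.log(0.5)},
}

def beam_search(num_beams=2, max_length=4, length_penalty=1.0):
    # Each beam is (tokens, cumulative log-prob); start at the BOS symbol.
    beams = [(["<s>"], 0.0)]
    finished = []
    for _ in range(max_length):
        candidates = []
        for tokens, score in beams:
            for tok, lp in LOGPROBS[tokens[-1]].items():
                candidates.append((tokens + [tok], score + lp))
        # Move completed hypotheses aside; keep the best `num_beams` open ones.
        beams = []
        for tokens, score in sorted(candidates, key=lambda c: c[1], reverse=True):
            if tokens[-1] == "</s>":
                finished.append((tokens, score))
            elif len(beams) < num_beams:
                beams.append((tokens, score))
    # Length-normalised score: penalty > 1 favours longer hypotheses.
    best = max(finished, key=lambda c: c[1] / (len(c[0]) ** length_penalty))
    return best[0]

print(beam_search())  # -> ['<s>', 'a', 'b', '</s>']
```

With `length_penalty` raised above 1 the normalisation rewards longer sequences, so the search returns a longer hypothesis; `transformers` applies the same idea to real model logits.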
2,133
[ [ (embedding vector values omitted)
0.003520965576171875, -0.003448486328125, 0.00658416748046875, -0.0208740234375, -0.0195465087890625, 0.026458740234375, -0.0026035308837890625, -0.037139892578125, 0.0767822265625, -0.0082550048828125, 0.06591796875, -0.042022705078125, -0.01435089111328125, -0.0105743408203125, 0.02630615234375, -0.031585693359375, -0.068603515625, 0.026031494140625, -0.009552001953125, -0.0080718994140625, -0.0244140625, 0.0318603515625, -0.0247039794921875, -0.060943603515625, 0.022613525390625, -0.0024318695068359375, 0.0186614990234375, 0.021148681640625, -0.06317138671875, 0.0152587890625, 0.01425933837890625, -0.0249481201171875, 0.00153350830078125, 0.01317596435546875, 0.0104827880859375, 0.0638427734375, 0.03173828125, 0.0026760101318359375, 0.01483154296875, 0.004535675048828125, 0.0628662109375, -0.044525146484375, -0.02593994140625, -0.040130615234375, 0.053131103515625, -0.010772705078125, -0.048065185546875, 0.049774169921875, 0.048065185546875, 0.0859375, -0.0287628173828125, 0.045867919921875, -0.0183868408203125, 0.015350341796875, -0.04193115234375, 0.051544189453125, -0.051544189453125, 0.0006380081176757812, -0.0310516357421875, -0.05560302734375, -0.014373779296875, 0.049041748046875, -0.01123809814453125, 0.01361846923828125, 0.049896240234375, 0.07965087890625, -0.0302734375, -0.005298614501953125, 0.0120849609375, 0.0225372314453125, 0.02545166015625, 0.0283355712890625, 0.027557373046875, -0.051025390625, 0.050140380859375, -0.0777587890625, -0.0228118896484375, -0.011993408203125, -0.054473876953125, -0.07073974609375, -0.046142578125, -0.034149169921875, -0.037353515625, -0.01308441162109375, 0.05621337890625, 0.071533203125, -0.057952880859375, -0.0239715576171875, -0.01279449462890625, -0.00787353515625, -0.0308990478515625, -0.0226287841796875, 0.036834716796875, -0.007526397705078125, -0.070068359375, 0.00897979736328125, 0.01084136962890625, 0.01715087890625, -0.01226806640625, -0.0032062530517578125, -0.005550384521484375, -0.02166748046875, 
0.036651611328125, 0.04541015625, -0.049896240234375, -0.0152740478515625, 0.0022563934326171875, -0.00970458984375, 0.01279449462890625, 0.027374267578125, -0.0343017578125, 0.0222930908203125, 0.03533935546875, 0.02301025390625, 0.052764892578125, -0.006866455078125, 0.02825927734375, -0.055084228515625, 0.055023193359375, 0.01296234130859375, 0.038909912109375, 0.0249481201171875, -0.03204345703125, 0.032196044921875, 0.017913818359375, -0.035919189453125, -0.06439208984375, 0.0265655517578125, -0.0906982421875, -0.0097503662109375, 0.08428955078125, -0.0026988983154296875, -0.0430908203125, 0.03125, -0.0242156982421875, 0.02679443359375, -0.0306854248046875, 0.050994873046875, 0.01251983642578125, -0.0210113525390625, -0.04150390625, -0.029815673828125, 0.0277252197265625, 0.015625, -0.0484619140625, -0.0150146484375, 0.0192718505859375, 0.0199127197265625, 0.0138092041015625, 0.024566650390625, 0.002197265625, 0.0172882080078125, 0.0188751220703125, 0.04949951171875, -0.024566650390625, -0.01219940185546875, -0.00849151611328125, -0.0020294189453125, -0.008941650390625, -0.03753662109375 ] ]
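Each record in this dump pairs a model card with a fixed-length embedding vector like the one that closes above; a typical use of such a column is ranking cards by similarity. A minimal sketch of the comparison — the short 4-dimensional vectors below are hypothetical stand-ins, not actual card embeddings from this dataset:

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = a.b / (|a| |b|); values near 1.0 indicate similar cards.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical stand-ins for two card embeddings.
u = [0.04, -0.03, 0.01, 0.05]
v = [0.05, -0.02, 0.00, 0.04]
score = cosine_similarity(u, v)
```

In practice the same function applies unchanged to the full-length vectors stored in the `embeddings` column.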
cyberagent/open-calm-7b
2023-05-18T01:12:08.000Z
[ "transformers", "pytorch", "gpt_neox", "text-generation", "japanese", "causal-lm", "ja", "dataset:wikipedia", "dataset:cc100", "dataset:mc4", "license:cc-by-sa-4.0", "has_space", "text-generation-inference", "region:us" ]
text-generation
cyberagent
null
null
cyberagent/open-calm-7b
191
10,053
transformers
2023-05-15T07:53:34
--- license: cc-by-sa-4.0 datasets: - wikipedia - cc100 - mc4 language: - ja tags: - japanese - causal-lm inference: false --- # OpenCALM-7B ## Model Description OpenCALM is a suite of decoder-only language models pre-trained on Japanese datasets, developed by CyberAgent, Inc. ## Usage ```python import torch from transformers import AutoModelForCausalLM, AutoTokenizer model = AutoModelForCausalLM.from_pretrained("cyberagent/open-calm-7b", device_map="auto", torch_dtype=torch.float16) tokenizer = AutoTokenizer.from_pretrained("cyberagent/open-calm-7b") inputs = tokenizer("AIによって私達の暮らしは、", return_tensors="pt").to(model.device) with torch.no_grad(): tokens = model.generate( **inputs, max_new_tokens=64, do_sample=True, temperature=0.7, top_p=0.9, repetition_penalty=1.05, pad_token_id=tokenizer.pad_token_id, ) output = tokenizer.decode(tokens[0], skip_special_tokens=True) print(output) ``` ## Model Details |Model|Params|Layers|Dim|Heads|Dev ppl| |:---:|:---: |:---:|:---:|:---:|:---:| |[cyberagent/open-calm-small](https://huggingface.co/cyberagent/open-calm-small)|160M|12|768|12|19.7| |[cyberagent/open-calm-medium](https://huggingface.co/cyberagent/open-calm-medium)|400M|24|1024|16|13.8| |[cyberagent/open-calm-large](https://huggingface.co/cyberagent/open-calm-large)|830M|24|1536|16|11.3| |[cyberagent/open-calm-1b](https://huggingface.co/cyberagent/open-calm-1b)|1.4B|24|2048|16|10.3| |[cyberagent/open-calm-3b](https://huggingface.co/cyberagent/open-calm-3b)|2.7B|32|2560|32|9.7| |[cyberagent/open-calm-7b](https://huggingface.co/cyberagent/open-calm-7b)|6.8B|32|4096|32|8.2| * **Developed by**: [CyberAgent, Inc.](https://www.cyberagent.co.jp/) * **Model type**: Transformer-based Language Model * **Language**: Japanese * **Library**: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox) * **License**: OpenCALM is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License ([CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)). 
When using this model, please provide appropriate credit to CyberAgent, Inc. * Example (en): This model is a fine-tuned version of OpenCALM-XX developed by CyberAgent, Inc. The original model is released under the CC BY-SA 4.0 license, and this model is also released under the same CC BY-SA 4.0 license. For more information, please visit: https://creativecommons.org/licenses/by-sa/4.0/ * Example (ja): 本モデルは、株式会社サイバーエージェントによるOpenCALM-XXをファインチューニングしたものです。元のモデルはCC BY-SA 4.0ライセンスのもとで公開されており、本モデルも同じくCC BY-SA 4.0ライセンスで公開します。詳しくはこちらをご覧ください: https://creativecommons.org/licenses/by-sa/4.0/ ## Training Dataset * Wikipedia (ja) * Common Crawl (ja) ## Author [Ryosuke Ishigami](https://huggingface.co/rishigami) ## Citations ```bibtex @software{gpt-neox-library, title = {{GPT-NeoX: Large Scale Autoregressive Language Modeling in PyTorch}}, author = {Andonian, Alex and Anthony, Quentin and Biderman, Stella and Black, Sid and Gali, Preetham and Gao, Leo and Hallahan, Eric and Levy-Kramer, Josh and Leahy, Connor and Nestler, Lucas and Parker, Kip and Pieler, Michael and Purohit, Shivanshu and Songz, Tri and Phil, Wang and Weinbach, Samuel}, url = {https://www.github.com/eleutherai/gpt-neox}, doi = {10.5281/zenodo.5879544}, month = {8}, year = {2021}, version = {0.0.1}, } ```
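The Params column in the model table above can be roughly sanity-checked with the standard 12·L·d² estimate for a decoder-only transformer (attention plus a 4×-expansion MLP per layer, ignoring embeddings, biases, and layer norms). This is an illustrative approximation, not the exact GPT-NeoX parameter count:

```python
def approx_decoder_params(layers, dim):
    # Per layer: ~4*dim^2 for attention (Q, K, V, output projections)
    # plus ~8*dim^2 for a 4x-expansion MLP, i.e. ~12*dim^2 in total.
    return 12 * layers * dim * dim

# open-calm-7b from the table: 32 layers, hidden dim 4096.
billions = approx_decoder_params(32, 4096) / 1e9
print(f"~{billions:.1f}B")  # prints ~6.4B; embeddings account for most of the gap to the listed 6.8B
```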
3,366
[ [ -0.0303497314453125, -0.053924560546875, 0.020599365234375, 0.00811767578125, -0.01253509521484375, -0.0220794677734375, -0.02825927734375, -0.03277587890625, 0.013763427734375, 0.040069580078125, -0.03662109375, -0.057342529296875, -0.0360107421875, 0.0046234130859375, -0.005863189697265625, 0.06878662109375, -0.014984130859375, -0.00743865966796875, -0.00449371337890625, -0.01024627685546875, -0.01335906982421875, -0.038055419921875, -0.052642822265625, -0.028228759765625, 0.00225067138671875, 0.01371002197265625, 0.0716552734375, 0.05487060546875, 0.035614013671875, 0.0292816162109375, 0.003391265869140625, 0.002063751220703125, -0.0247802734375, -0.0204925537109375, 0.0004627704620361328, -0.052825927734375, -0.055328369140625, -0.00909423828125, 0.049591064453125, 0.026885986328125, -0.005092620849609375, 0.015228271484375, -0.00984954833984375, 0.01236724853515625, -0.039794921875, 0.03167724609375, -0.033721923828125, -0.002552032470703125, -0.00855255126953125, -0.0008835792541503906, -0.0275115966796875, -0.0296478271484375, 0.0013952255249023438, -0.05535888671875, 0.018524169921875, -0.00887298583984375, 0.08526611328125, 0.008514404296875, -0.00807952880859375, 0.0003628730773925781, -0.05670166015625, 0.060455322265625, -0.0799560546875, 0.01702880859375, 0.040924072265625, 0.01806640625, -0.0006427764892578125, -0.05908203125, -0.0303497314453125, -0.0122528076171875, -0.000812530517578125, 0.01654052734375, -0.041656494140625, -0.002429962158203125, 0.0280914306640625, 0.026519775390625, -0.06658935546875, 0.0022373199462890625, -0.02789306640625, -0.0269317626953125, 0.032867431640625, 0.027191162109375, 0.03582763671875, 0.0004336833953857422, -0.0252685546875, -0.0222625732421875, -0.0330810546875, -0.007762908935546875, 0.0229034423828125, 0.0296783447265625, -0.0467529296875, 0.0338134765625, -0.006458282470703125, 0.0438232421875, -0.018585205078125, -0.016998291015625, 0.0457763671875, -0.03741455078125, -0.0216522216796875, 
-0.003173828125, 0.109375, 0.0169525146484375, 0.028228759765625, 0.00420379638671875, -0.00970458984375, 0.004009246826171875, 0.0001825094223022461, -0.07623291015625, -0.007137298583984375, 0.01100921630859375, -0.03192138671875, -0.0141143798828125, 0.0256805419921875, -0.0587158203125, 0.0092620849609375, -0.0174713134765625, 0.03271484375, -0.038116455078125, -0.045196533203125, 0.02569580078125, 0.01416778564453125, 0.00537872314453125, 0.0194091796875, -0.034881591796875, 0.03961181640625, 0.0169677734375, 0.0751953125, 0.0216217041015625, -0.030181884765625, -0.00832366943359375, -0.02069091796875, -0.01291656494140625, 0.02593994140625, -0.0203857421875, -0.02288818359375, -0.005588531494140625, 0.01617431640625, -0.027130126953125, -0.042999267578125, 0.0265350341796875, -0.0264892578125, 0.0282745361328125, 0.0118408203125, -0.0012063980102539062, -0.0094757080078125, 0.00799560546875, -0.043975830078125, 0.0843505859375, 0.017425537109375, -0.0667724609375, -0.002735137939453125, -0.0423583984375, 0.0036411285400390625, -0.01023101806640625, 0.003452301025390625, -0.0438232421875, -0.01305389404296875, 0.020477294921875, 0.004852294921875, -0.03912353515625, 0.01126861572265625, -0.04058837890625, -0.01148223876953125, 0.02197265625, -0.0186309814453125, 0.08575439453125, 0.0300140380859375, -0.0301666259765625, 0.015655517578125, -0.0638427734375, 0.0131988525390625, 0.04052734375, -0.006687164306640625, -0.019744873046875, -0.0196685791015625, 0.011627197265625, 0.032958984375, 0.0286712646484375, -0.04449462890625, 0.0154266357421875, -0.04180908203125, 0.0435791015625, 0.03839111328125, -0.016632080078125, 0.02294921875, -0.018890380859375, 0.051605224609375, 0.020416259765625, 0.0273895263671875, -0.0198974609375, -0.036773681640625, -0.050750732421875, -0.03558349609375, 0.01143646240234375, 0.0472412109375, -0.043792724609375, 0.03607177734375, -0.0277862548828125, -0.05487060546875, -0.052825927734375, -0.007404327392578125, 0.040252685546875, 
0.018646240234375, 0.031402587890625, -0.0269012451171875, -0.04705810546875, -0.053192138671875, 0.0038890838623046875, -0.0118408203125, 0.0097808837890625, 0.031463623046875, 0.047454833984375, -0.04034423828125, 0.0804443359375, -0.05511474609375, -0.0090484619140625, -0.0110626220703125, 0.0143280029296875, 0.03271484375, 0.029541015625, 0.0494384765625, -0.034576416015625, -0.043487548828125, 0.0122833251953125, -0.05718994140625, -0.0177154541015625, 0.0082550048828125, -0.002925872802734375, 0.0193634033203125, 0.03839111328125, -0.053863525390625, 0.046142578125, 0.02001953125, -0.0291290283203125, 0.061492919921875, -0.0263214111328125, -0.0045166015625, -0.10595703125, 0.030792236328125, 0.007541656494140625, -0.02728271484375, -0.0355224609375, 0.0118408203125, 0.00350189208984375, -0.0157012939453125, -0.038970947265625, 0.06451416015625, -0.033447265625, 0.0075531005859375, -0.0206451416015625, 0.01617431640625, -0.01824951171875, 0.031646728515625, 0.007781982421875, 0.0484619140625, 0.05780029296875, -0.0309906005859375, 0.02557373046875, 0.0160369873046875, -0.00902557373046875, 0.01345062255859375, -0.06036376953125, 0.01080322265625, 0.000012993812561035156, 0.01335906982421875, -0.06329345703125, -0.00665283203125, 0.0276947021484375, -0.054931640625, 0.018096923828125, -0.02130126953125, -0.0546875, -0.04180908203125, -0.01389312744140625, 0.040283203125, 0.0300750732421875, -0.031829833984375, 0.0220794677734375, 0.01389312744140625, -0.0045166015625, -0.035552978515625, -0.04833984375, -0.0145416259765625, -0.0103607177734375, -0.054718017578125, 0.0220184326171875, -0.010467529296875, -0.008819580078125, 0.0151824951171875, 0.002918243408203125, -0.01654052734375, -0.0126953125, 0.0156402587890625, 0.039276123046875, -0.0372314453125, -0.0219879150390625, -0.0280914306640625, -0.0230712890625, 0.0198822021484375, -0.0216522216796875, 0.0660400390625, -0.0251007080078125, -0.0108642578125, -0.03912353515625, 0.003345489501953125, 
0.054595947265625, -0.026885986328125, 0.06268310546875, 0.070068359375, -0.038238525390625, 0.004261016845703125, -0.01959228515625, -0.0177154541015625, -0.03466796875, 0.03778076171875, -0.027587890625, -0.056793212890625, 0.071533203125, 0.01517486572265625, 0.001468658447265625, 0.047515869140625, 0.05181884765625, 0.043914794921875, 0.087646484375, 0.033935546875, -0.0233001708984375, 0.038970947265625, -0.0360107421875, 0.00926971435546875, -0.0595703125, -0.007236480712890625, -0.05767822265625, 0.0091400146484375, -0.0584716796875, -0.030303955078125, 0.018646240234375, -0.0033626556396484375, -0.043975830078125, 0.052764892578125, -0.0297088623046875, 0.0126190185546875, 0.037322998046875, 0.0161285400390625, 0.01070404052734375, -0.00044465065002441406, -0.033935546875, 0.0034332275390625, -0.0531005859375, -0.0258331298828125, 0.0799560546875, 0.0364990234375, 0.07232666015625, 0.0101470947265625, 0.05194091796875, -0.0289306640625, 0.0008616447448730469, -0.048126220703125, 0.049285888671875, 0.00917816162109375, -0.0574951171875, -0.0022563934326171875, -0.06243896484375, -0.0889892578125, 0.01255035400390625, 0.0015964508056640625, -0.08111572265625, 0.0096435546875, 0.0034999847412109375, -0.01885986328125, 0.03533935546875, -0.0435791015625, 0.06524658203125, 0.00452423095703125, -0.032196044921875, -0.00783538818359375, -0.04705810546875, 0.02532958984375, 0.005687713623046875, 0.0233917236328125, 0.007778167724609375, -0.0007982254028320312, 0.05145263671875, -0.05767822265625, 0.06951904296875, -0.007167816162109375, 0.01201629638671875, 0.032470703125, 0.0017633438110351562, 0.0252227783203125, 0.006694793701171875, 0.00025725364685058594, 0.0269622802734375, 0.01152801513671875, -0.0196533203125, -0.023101806640625, 0.06439208984375, -0.0963134765625, -0.016937255859375, -0.05322265625, -0.041229248046875, 0.0090789794921875, 0.047454833984375, 0.03253173828125, 0.04315185546875, -0.0028438568115234375, 0.0163421630859375, 0.0421142578125, 
-0.03839111328125, 0.033050537109375, 0.0421142578125, -0.032928466796875, -0.063720703125, 0.0679931640625, 0.00672149658203125, 0.03173828125, 0.0230712890625, 0.0272216796875, -0.021575927734375, -0.02105712890625, -0.0224151611328125, 0.04766845703125, -0.03851318359375, -0.020355224609375, -0.046295166015625, -0.0345458984375, -0.055145263671875, 0.004543304443359375, -0.04052734375, -0.01285552978515625, -0.03375244140625, 0.015838623046875, 0.028228759765625, 0.033477783203125, -0.00980377197265625, 0.026885986328125, -0.052276611328125, 0.0168609619140625, -0.00830078125, 0.0284576416015625, 0.0026035308837890625, -0.05072021484375, -0.03192138671875, 0.0176239013671875, -0.033721923828125, -0.055511474609375, 0.054351806640625, -0.007350921630859375, 0.0394287109375, 0.0269622802734375, -0.0012178421020507812, 0.051361083984375, -0.007167816162109375, 0.04644775390625, 0.0223846435546875, -0.05645751953125, 0.040496826171875, -0.044158935546875, 0.059112548828125, 0.0214080810546875, 0.043243408203125, -0.0335693359375, -0.0193328857421875, -0.0625, -0.07647705078125, 0.09210205078125, 0.0163421630859375, -0.00267791748046875, -0.0015268325805664062, 0.0154876708984375, -0.01446533203125, 0.003337860107421875, -0.0748291015625, -0.03289794921875, -0.0190277099609375, -0.0273284912109375, -0.007389068603515625, -0.0009517669677734375, 0.00759124755859375, -0.0213775634765625, 0.0635986328125, -0.01093292236328125, 0.052001953125, 0.0224609375, -0.0160980224609375, -0.0209197998046875, 0.00493621826171875, 0.048614501953125, 0.032867431640625, -0.0286712646484375, -0.004146575927734375, 0.00426483154296875, -0.05987548828125, -0.0116119384765625, 0.0196990966796875, -0.01239013671875, 0.0002655982971191406, 0.0235443115234375, 0.08221435546875, 0.01239013671875, -0.027862548828125, 0.036376953125, -0.0088348388671875, -0.0294647216796875, -0.0263214111328125, -0.0006537437438964844, 0.0008940696716308594, 0.0026798248291015625, 0.00057220458984375, 
-0.003429412841796875, -0.01031494140625, -0.04376220703125, -0.00206756591796875, 0.0223236083984375, -0.030975341796875, -0.038848876953125, 0.0517578125, -0.018035888671875, -0.01116943359375, 0.054840087890625, -0.0110015869140625, -0.036773681640625, 0.054779052734375, 0.06866455078125, 0.07269287109375, -0.033294677734375, -0.0038242340087890625, 0.0679931640625, 0.03607177734375, -0.0001811981201171875, 0.0302581787109375, 0.0185089111328125, -0.048797607421875, -0.01538848876953125, -0.043731689453125, -0.01031494140625, 0.02978515625, -0.0567626953125, 0.037750244140625, -0.0352783203125, -0.03314208984375, -0.005130767822265625, 0.01087188720703125, -0.056396484375, 0.0169677734375, 0.0021839141845703125, 0.07110595703125, -0.05322265625, 0.06097412109375, 0.06304931640625, -0.043975830078125, -0.0599365234375, -0.01375579833984375, 0.006740570068359375, -0.0650634765625, 0.0225372314453125, 0.00621795654296875, 0.009185791015625, 0.01494598388671875, -0.029510498046875, -0.07598876953125, 0.07745361328125, 0.030487060546875, -0.027618408203125, -0.0069732666015625, 0.03778076171875, 0.03619384765625, -0.0157012939453125, 0.0745849609375, 0.01265716552734375, 0.032012939453125, 0.001163482666015625, -0.09332275390625, 0.018218994140625, -0.0300140380859375, 0.0010776519775390625, 0.030975341796875, -0.063232421875, 0.07269287109375, -0.0189056396484375, -0.00524139404296875, 0.01071929931640625, 0.03009033203125, 0.0116729736328125, 0.01242828369140625, 0.022674560546875, 0.06268310546875, 0.0206756591796875, -0.01453399658203125, 0.06036376953125, -0.023681640625, 0.05078125, 0.07904052734375, 0.00580596923828125, 0.048065185546875, -0.003917694091796875, -0.026702880859375, 0.039886474609375, 0.03472900390625, 0.0011615753173828125, 0.01953125, 0.0028247833251953125, -0.01006317138671875, -0.01134490966796875, 0.010650634765625, -0.0377197265625, 0.03009033203125, 0.022918701171875, -0.036102294921875, -0.03887939453125, 0.0002951622009277344, 
0.031524658203125, -0.019195556640625, -0.0168609619140625, 0.047271728515625, 0.0181732177734375, -0.04071044921875, 0.0648193359375, 0.0185394287109375, 0.0364990234375, -0.060821533203125, 0.0218048095703125, 0.00225067138671875, 0.036285400390625, -0.0184173583984375, -0.0439453125, 0.01482391357421875, -0.0017528533935546875, -0.00836944580078125, 0.002109527587890625, 0.04705810546875, -0.0230712890625, -0.0251617431640625, 0.034088134765625, 0.01485443115234375, 0.0161895751953125, -0.00951385498046875, -0.07122802734375, 0.0218353271484375, -0.00283050537109375, -0.0260467529296875, 0.028472900390625, 0.0287322998046875, -0.00864410400390625, 0.043914794921875, 0.0540771484375, 0.0023326873779296875, 0.008514404296875, 0.0053253173828125, 0.06591796875, -0.051361083984375, -0.03582763671875, -0.0679931640625, 0.03851318359375, 0.007152557373046875, -0.04388427734375, 0.06414794921875, 0.061309814453125, 0.06964111328125, -0.0078125, 0.05419921875, -0.00997161865234375, 0.01812744140625, -0.01473236083984375, 0.04534912109375, -0.03839111328125, -0.01177215576171875, -0.031402587890625, -0.0679931640625, -0.0184783935546875, 0.047027587890625, -0.035400390625, 0.0247344970703125, 0.05078125, 0.081298828125, -0.01953125, -0.01314544677734375, 0.01282501220703125, 0.042724609375, 0.0224456787109375, 0.0552978515625, 0.03350830078125, -0.05413818359375, 0.04052734375, -0.03363037109375, -0.034820556640625, -0.0261077880859375, -0.042266845703125, -0.0709228515625, -0.036773681640625, -0.03521728515625, -0.045806884765625, -0.015869140625, 0.08319091796875, 0.058837890625, -0.050537109375, -0.034332275390625, -0.0118255615234375, -0.0189056396484375, -0.01345062255859375, -0.0225982666015625, 0.0149383544921875, -0.01332855224609375, -0.056854248046875, 0.00785064697265625, 0.00122833251953125, 0.006748199462890625, -0.01503753662109375, -0.007232666015625, -0.01517486572265625, 0.0107421875, 0.029205322265625, 0.033721923828125, -0.0413818359375, 
0.01511383056640625, 0.0153961181640625, -0.0202178955078125, 0.01406097412109375, 0.03192138671875, -0.03826904296875, 0.0443115234375, 0.049285888671875, 0.0020904541015625, 0.05352783203125, -0.0297393798828125, 0.0269927978515625, -0.02996826171875, 0.0197906494140625, 0.00867462158203125, 0.03656005859375, 0.005184173583984375, -0.01151275634765625, 0.0261993408203125, 0.033538818359375, -0.045745849609375, -0.0662841796875, -0.0008111000061035156, -0.087646484375, -0.00927734375, 0.09674072265625, -0.0181732177734375, -0.0343017578125, 0.00722503662109375, -0.02069091796875, 0.0287322998046875, -0.026153564453125, 0.0286102294921875, 0.03961181640625, 0.021392822265625, -0.0103759765625, -0.045745849609375, 0.007099151611328125, 0.00469970703125, -0.06036376953125, 0.006816864013671875, 0.0203399658203125, 0.03277587890625, 0.0260009765625, 0.0506591796875, -0.047210693359375, 0.024993896484375, 0.00331878662109375, 0.0306854248046875, -0.017974853515625, -0.01273345947265625, -0.022247314453125, -0.0128326416015625, 0.001941680908203125, -0.0099029541015625 ] ]
THUDM/visualglm-6b
2023-05-22T13:29:07.000Z
[ "transformers", "pytorch", "chatglm", "glm", "visualglm", "thudm", "custom_code", "zh", "en", "arxiv:2103.10360", "arxiv:2210.02414", "arxiv:2301.12597", "arxiv:2105.13290", "endpoints_compatible", "has_space", "region:us" ]
null
THUDM
null
null
THUDM/visualglm-6b
186
10,051
transformers
2023-05-17T08:34:19
--- language: - zh - en tags: - glm - visualglm - chatglm - thudm --- # VisualGLM-6B <p align="center"> 💻 <a href="https://github.com/THUDM/VisualGLM-6B" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/thukeg" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2103.10360" target="_blank">[GLM@ACL 22]</a> <a href="https://github.com/THUDM/GLM" target="_blank">[GitHub]</a> • 📃 <a href="https://arxiv.org/abs/2210.02414" target="_blank">[GLM-130B@ICLR 23]</a> <a href="https://github.com/THUDM/GLM-130B" target="_blank">[GitHub]</a> <br> </p> <p align="center"> 👋 Join our <a href="https://join.slack.com/t/chatglm/shared_invite/zt-1th2q5u69-7tURzFuOPanmuHy9hsZnKA" target="_blank">Slack</a> and <a href="https://github.com/THUDM/ChatGLM-6B/blob/main/resources/WECHAT.md" target="_blank">WeChat</a> </p> ## Introduction VisualGLM-6B is an open-source multimodal dialogue language model that supports **images, Chinese, and English**. Its language model is based on [ChatGLM-6B](https://github.com/THUDM/ChatGLM-6B) with 6.2 billion parameters; the image part bridges the vision model and the language model by training a [BLIP2-Qformer](https://arxiv.org/abs/2301.12597), for 7.8 billion parameters in total. VisualGLM-6B is pre-trained on 30M high-quality Chinese image-text pairs from the [CogView](https://arxiv.org/abs/2105.13290) dataset and 300M filtered English image-text pairs, with Chinese and English weighted equally. This training scheme aligns visual information well with ChatGLM's semantic space; in the subsequent fine-tuning stage, the model is trained on long visual question answering data to generate answers that match human preferences. ## Dependencies ```shell pip install "SwissArmyTransformer>=0.3.6" "torch>1.10.0" torchvision "transformers>=4.27.1" cpm_kernels ``` ## Usage You can generate dialogue with the VisualGLM-6B model using the following code: ```ipython >>> from transformers import AutoTokenizer, AutoModel >>> tokenizer = AutoTokenizer.from_pretrained("THUDM/visualglm-6b", trust_remote_code=True) >>> model = AutoModel.from_pretrained("THUDM/visualglm-6b", trust_remote_code=True).half().cuda() >>> image_path = "your image path" >>> response, history = model.chat(tokenizer, image_path, "描述这张图片。", history=[]) >>> print(response) >>> response, history = model.chat(tokenizer, image_path, "这张图片可能是在什么场所拍摄的?", history=history) >>> print(response) ```
For more instructions, including how to run the CLI and web demos and how to use model quantization to save GPU memory, please refer to our [Github Repo](https://github.com/THUDM/VisualGLM-6B). ## License The code in this repository is open-sourced under the [Apache-2.0](LICENSE) license; use of the VisualGLM-6B model weights must follow the [Model License](MODEL_LICENSE). ## Citation If you find our work helpful, please consider citing the following papers: ``` @inproceedings{du2022glm, title={GLM: General Language Model Pretraining with Autoregressive Blank Infilling}, author={Du, Zhengxiao and Qian, Yujie and Liu, Xiao and Ding, Ming and Qiu, Jiezhong and Yang, Zhilin and Tang, Jie}, booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)}, pages={320--335}, year={2022} } ``` ``` @article{ding2021cogview, title={Cogview: Mastering text-to-image generation via transformers}, author={Ding, Ming and Yang, Zhuoyi and Hong, Wenyi and Zheng, Wendi and Zhou, Chang and Yin, Da and Lin, Junyang and Zou, Xu and Shao, Zhou and Yang, Hongxia and others}, journal={Advances in Neural Information Processing Systems}, volume={34}, pages={19822--19835}, year={2021} } ```
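The `model.chat` call in the snippet above returns both a reply and an updated `history`, which is fed back in on the next turn. The stub below illustrates only that state-threading contract (assuming ChatGLM's convention of a history as a list of `(question, response)` pairs); it does not load or approximate the real model:

```python
def chat_stub(question, history):
    # Stand-in for model.chat(tokenizer, image_path, question, history=...):
    # returns a reply plus the history extended with the new turn.
    response = f"answer to: {question}"
    return response, history + [(question, response)]

history = []
for turn in ["describe the image", "where might it have been taken?"]:
    reply, history = chat_stub(turn, history)

# After two turns, history holds two (question, response) pairs.
```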
3,106
[ [ -0.0386962890625, -0.057891845703125, 0.01392364501953125, 0.01549530029296875, -0.031524658203125, -0.0034008026123046875, -0.0234375, -0.037567138671875, -0.00211334228515625, 0.00836944580078125, -0.04449462890625, -0.059783935546875, -0.04388427734375, -0.01395416259765625, -0.0084075927734375, 0.05328369140625, 0.0087738037109375, -0.00785064697265625, -0.00968170166015625, -0.01922607421875, -0.034149169921875, -0.03936767578125, -0.0309600830078125, -0.01535797119140625, 0.0010538101196289062, 0.002532958984375, 0.048614501953125, 0.0307159423828125, 0.036041259765625, 0.0309600830078125, -0.007312774658203125, 0.0133209228515625, -0.044708251953125, -0.026031494140625, 0.019287109375, -0.02716064453125, -0.03936767578125, -0.00112152099609375, 0.041229248046875, 0.01219940185546875, -0.005985260009765625, 0.0283966064453125, 0.00942230224609375, 0.041259765625, -0.03948974609375, 0.03173828125, -0.03936767578125, -0.0053253173828125, -0.00220489501953125, -0.0115966796875, -0.0204620361328125, -0.029876708984375, -0.0015420913696289062, -0.04443359375, 0.0157623291015625, 0.00193023681640625, 0.10833740234375, 0.005615234375, -0.00922393798828125, -0.0006418228149414062, -0.03997802734375, 0.060028076171875, -0.077392578125, 0.0189666748046875, 0.0019817352294921875, 0.03173828125, -0.00980377197265625, -0.07281494140625, -0.039031982421875, -0.0098114013671875, -0.023651123046875, 0.0252685546875, -0.02203369140625, 0.006000518798828125, 0.0250244140625, 0.0207061767578125, -0.059539794921875, -0.00495147705078125, -0.050689697265625, -0.017059326171875, 0.0494384765625, 0.0112762451171875, 0.047943115234375, -0.013336181640625, -0.046630859375, -0.0077667236328125, -0.03375244140625, 0.01238250732421875, 0.0112457275390625, 0.0112152099609375, -0.05291748046875, 0.022674560546875, -0.005847930908203125, 0.058135986328125, 0.00418853759765625, -0.0078582763671875, 0.041534423828125, -0.039215087890625, -0.0302276611328125, -0.02105712890625, 
] ]
johnslegers/epic-diffusion
2023-01-21T06:04:43.000Z
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
johnslegers
null
null
johnslegers/epic-diffusion
128
10,040
diffusers
2023-01-03T09:54:56
--- license: creativeml-openrail-m tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image inference: true extra_gated_prompt: |- This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL License specifies: 1. You can't use the model to deliberately produce nor share illegal or harmful outputs or content 2. CompVis claims no rights on the outputs you generate, you are free to use them and are accountable for their use which must not go against the provisions set in the license 3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully) Please read the full license carefully here: https://huggingface.co/spaces/CompVis/stable-diffusion-license --- [![Example][1]][1] ## Why Epic Diffusion Epîc Diffusion is a general-purpose model based on Stable Diffusion 1.x intended to replace the official SD releases as your default model. It is focused on providing high-quality output in a wide range of different styles, with support for NSFW content. Epîc Diffusion 1.0 is a heavily calibrated merge of SD 1.4, SD 1.5, Analog Diffusion, Wavy Diffusion, Openjourney Diffusion, Samdoesarts Ultramerge, postapocalypse, Elldreth's Dream, Inkpunk Diffusion, Arcane Diffusion & Van Gogh Diffusion, blended and reblended multiple times until I got the quality & consistency I was looking for... Epic Diffusion is also [available on CivitAI](https://civitai.com/models/3855/epic-diffusion). ## License This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL License specifies: 1. 
You can't use the model to deliberately produce nor share illegal or harmful outputs or content 2. CompVis claims no rights on the outputs you generate, you are free to use them and are accountable for their use which must not go against the provisions set in the license 3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully) <a href="https://www.buymeacoffee.com/johnslegers" target="_blank"> <img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 45px !important;width: 162px !important;" > </a> ## Example prompts <table> <tr style="border: 1px solid;background:#e5e7eb"> <th style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> Prompt </th> <th style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> Parameters </th> <th style="vertical-align:top;padding:.5714286em!important;border: 1px solid;min-width:270px"> Output </th> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> scarlett johansson, in the style of Wes Anderson, highly detailed, unreal engine, octane render, 8k </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>2263657329<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/0oZij.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> sansa angeline jolie gessica chastain mummy, intricate, elegant, highly detailed, digital painting, artstation, concept art, smooth, sharp 
focus, illustration, art by artgerm and greg rutkowski and alphonse mucha and william - adolphe bouguereau </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>1310341382<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/mnnBR.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> Pokimane, Feminine, Mercy, Perfect Sexy Symmetrical Face, Detailed Pupils, Pensive Smirk, Look at Viewer, Leaf Armor, Ilya Kuvshinov, Gil Elvgren, Mucha. Intricate, Octane Render, 4KUHD, Centered, Oil Painting, Bokeh, Rim Lighting. </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>4142902194<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/v9NoC.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> Mature babe,artgerm Style, gerald brom, atey ghailan, mike mignola, short cut off shirt knot, wide hips, showing off, exposing herself vulnerable, blushing, exited, confident, demanding, joyful, trending on artstation, double split complementary colors, intricate details, highly detailed, </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>3954688283<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" 
src="https://i.stack.imgur.com/vl0bc.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> planet base, windows, night, ground level, no man's sky, digital art, highly detailed, intricate, sharp focus, Trending on Artstation HQ, deviantart, unreal engine 5, 4K UHD image </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>895811336<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/D2GNK.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> berchtesgaden, hyperdetailed, detailed faces, artgerm, wolfenstein, portal 2, Leartes Studios, assassin's creed, alphonse mucha, bouguereau, edmund blair leighton, greg kadel, dynamic lighting, delicate, unreal engine, octane render, 8k </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>1172925287<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/m7Xkb.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> princess, detailed portrait, hyperdetailed, detailed faces, irakli nadar, magali villeneuve, Assassin's Creed, Tim Hildebrandt, Ilya Kuvshinov, artgem, greg kadel, dynamic lighting, delicate, unreal engine, octane render, 8k </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>2096567313<br> <b>Size:</b><br>512x512 
</td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/LwPPa.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> a Photorealistic dramatic hyperrealistic bright blue eyes, African American elegant girl, black hair, white veil,by WLOP,Artgerm,Greg Rutkowski,Alphonse Mucha, Beautiful dynamic dramatic bright sunset lighting,shadows,cinematic atmosphere,Artstation,concept design art,Octane render,8k </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>2999946689<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/1nH9c.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> cutest girl in the world outside, (detailed portrait), in the style of fernanda suarez and simon stalenhag and Ilya Kuvshinov and Wlop and Artgerm and Chie Yoshii and Greg Rutkowski and Waking Life, trending on artstation, featured on pixiv, dynamic lighting, highly detailed, ambient lighting, octane render, 8k </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>2249388004<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/uNux1.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> military academy, (detailed portrait), steampunk, in the style of arcane and fernanda suarez and dishonored and bioshock and simon 
stalenhag and Ilya Kuvshinov and Wlop and Artgerm, trending on artstation, featured on pixiv, dynamic lighting, highly detailed, ambient lighting, octane render, 8k </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>3877530043<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/sFXCi.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> beautiful female assassin wearing cyberpunk clothing, respirator, cybernetic respirator, (detailed portrait), cell shaded, 4 k, vivid colours, photorealistic concept art by wlop, ilya kuvshinov, artgerm, krenz cushart, greg rutkowski, pixiv. cinematic dramatic atmosphere, sharp focus, volumetric lighting, cinematic lighting, studio quality </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>3388890157<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/14iZS.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> cemetary, pen and ink, in the style of gustave dore highly detailed, octane render, 8k, trending on artstation, sharp focus, studio photo, intricate details, highly detailed, by greg rutkowski </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>568457114<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> 
<img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/D1hsN.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> dubai, hyperdetailed, detailed faces, artgem, irakli nadar, mass effect, Tim Hildebrandt, Ilya Kuvshinov, liam wong, greg rutkowski, greg kadel, dynamic lighting, delicate, unreal engine, octane render, 8k, centered, symmetry, painted, intricate, volumetric lighting, beautiful, rich deep colors masterpiece, sharp focus, ultra detailed, in the style of dan mumford and marc simonetti, astrophotography </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>DPM++ SDE<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>4262868463<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/4uPzr.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> Little cute forest fluffy chibi cuteness overload, sunny magical background, ultra precious details, intricate details, volumetric lighting, photo realistic, lifelike, photography, digital art, 8k, trending on artstation, sharp focus, studio photo, intricate details, highly detailed, by greg rutkowski, sharp focus, emitting diodes, smoke, artillery, sparks, racks, system unit, motherboard, by pascal blanche rutkowski repin artstation hyperrealism painting concept art of detailed character design matte painting, 4 k resolution blade runner </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>DPM++ SDE Karras<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>3849507891<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img 
style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/4yTQP.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> 15 year old schoolgirl with short straight hair, blue eyes, cute, friendly, round face, cottagecore, intricate, enlightened, highly detailed, digital painting, artstation, concept art, smooth, sharp focus, illustration, art by artgerm and greg rutkowski and alphonse mucha </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>2276800560<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/gqynB.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> extreme wide shot a futuristic containment building in a rainforest valley with a city in the distance, national geographic, hyper realistic, 4 k, harsh light </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>3260458902<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/8qH9Y.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> portrait of a middle - eastern female cleric with straight black hair wearing blue and yellow vestments casting fireball, fantasy, highly detailed, digital painting, artstation, concept art, character art, art by greg rutkowski and tyler jacobson and alphonse mucha </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> 
<b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>1379894453<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/BP98Y.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> aSnowshoe Siamese Cat as the doomslayer, realistic scifi cyberpunk power armor robot, closeup portrait art by donato giancola and greg rutkowski, vintage retro scifi, realistic face, digital art, trending on artstation, symmetry </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>2122325442<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/GYdOS.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> Beautiful boy by René Magritte </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>1753689226<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/vP9sv.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> portrait of a dark god, copper wires, visible scars and nerves, intricate, headshot, highly detailed, digital painting, artstation, concept art, sharp focus, cinematic lighting, illustration, art by artgerm and greg rutkowski, alphonse mocha, cgsociety, Olivia </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> 
<b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>3355776798<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/A94Gg.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> knight warrior helmet skyrim mask elder scrolls v nordic armor bethesda adam adamowicz illustration character design concept, unreal 5, daz, hyperrealistic, octane render, cosplay, rpg portrait, dynamic lighting, intricate detail, harvest fall vibrancy, cinematic volume inner glowing aura global illumination ray tracing hdr </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>1938574287<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/efGrz.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> berserker portrait, d&d style, fantasy, photorealistic, highly detailed, artstation, smooth, sharp focus, art by michael whelan, artgerm, greg rutkowski and alphonse mucha </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>DPM++ SDE Karras<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>156077154<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/Wbjgp.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> symmetry product render poster vivid colors classical proportion car, glowing fog 
intricate, elegant, highly detailed, digital painting, art station, concept art, smooth, sharp focus, illustration, </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>DPM++ SDE Karras<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>4294525772<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/sMMpR.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> Futuristic Vintage Medium Shot 1920's Poster with Cyberpunk, ovni, tron biker with helmet bike, black in color, with a cyberpunk city background, futuristic lighting, cinematic lighting, cozy lighting, 8k, cinematic poster vintage 1800s </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>1229558409<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/0Gojz.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> beautiful, young woman, cybernetic, cyberpunk, detailed gorgeous face, flowing hair, vaporwave aesthetic, synthwave , digital painting, artstation, concept art, smooth, sharp focus, illustration, art by artgerm and greg rutkowski and alphonse mucha </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>264509871<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" 
src="https://i.stack.imgur.com/zFdjj.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> strong warrior princess| centered| key visual| intricate| highly detailed| breathtaking beauty| precise lineart| vibrant| comprehensive cinematic| Carne Griffiths| Conrad Roset </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>16<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/aGuIL.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> portrait of a rugged 19th century man with mutton chops in a jacket, victorian, concept art, detailed face, fantasy, close up face, highly detailed, cinematic lighting, digital art painting by greg rutkowski </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>16<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/6sKW6.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> side profile of cyberpunk body with cyborg skull | cyberpunk | styled in Art Nouveau | insanely detailed | embellishments | high definition | concept art | digital art | vibrant </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>16<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img 
style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/N7kSu.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> a cute little matte low poly isometric cherry blossom forest island, pink waterfalls, mist, lat lighting, soft shadows, trending on artstation, 3d render, monument valley, fez video game, </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>16<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/fVj9N.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> high resolution concept art of an apartment living room overlooking a large futuristic city with floor to ceiling windows and mid century modern furniture cinematic lighting cgsociety </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>850995814<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/jkpgU.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> hyperrealistic full length portrait of gorgeous watson from apex legends | blonde | detailed gorgeous face!! | full body!! 
| armor | intricate | elegant | realistic | hyperrealistic | cinematic | character design | concept art | highly detailed | illustration | digital art | digital painting | depth of field | illustrated by tim brown lee </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>3002798343<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/hMsH2.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> Chibi spiderman, high redolution, 3D rendering, octane rendering, modern Disney style </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>20<br> <b>Sampler:</b><br>Euler a<br> <b>CFG scale:</b><br>7<br> <b>Seed:</b><br>3232863832<br> <b>Size:</b><br>512x512 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/zl18l.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> photo of the most beautiful artwork in the world featuring soft lustrous, industrial mechanic real world, fantastic location, working environment, rugged harsh situation worker, full body 8k unity render, action shot, skin pores, detailed intricate iris, very dark lighting, heavy shadows, detailed, detailed face, (vibrant, photo realistic, realistic, dramatic, dark, sharp focus, 8k), (weathered greasy dirty damaged old worn technician worker outfit:1.1), (intricate:1.1), (highly detailed:1.1), digital painting, octane render, artstation, concept art, smooth, sharp focus, illustration, art by artgerm, (loish:0.23), wlop ilya kuvshinov., (global illumination, studio light, volumetric 
light)<br><br> <b>Negative prompt:</b> Asian, black and white, close up, cartoon, 3d, denim, (disfigured), (deformed), (poorly drawn), (extra limbs), blurry, boring, sketch, lackluster, signature, letters, watermark, low res , horrific , mutated , artifacts , bad art , gross , b&w , poor quality , low quality , cropped </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>30<br> <b>Sampler:</b><br>DPM++ SDE Karras<br> <b>CFG scale:</b><br>10<br> <b>Seed:</b><br>169686802<br> <b>Size:</b><br>512x640 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/dPnAA.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> photo of the most beautiful artwork in the world featuring soft lustrous, industrial mechanic real world, fantastic location, working environment, rugged harsh situation worker, full body 8k unity render, action shot, skin pores, detailed intricate iris, very dark lighting, heavy shadows, detailed, detailed face, (vibrant, photo realistic, realistic, dramatic, dark, sharp focus, 8k), (weathered greasy dirty damaged old worn technician worker outfit:1.1), (intricate:1.1), (highly detailed:1.1), digital painting, octane render, artstation, concept art, smooth, sharp focus, illustration, art by artgerm, (loish:0.23), wlop ilya kuvshinov., (global illumination, studio light, volumetric light)<br><br> <b>Negative prompt:</b> Asian, black and white, close up, cartoon, 3d, denim, (disfigured), (deformed), (poorly drawn), (extra limbs), blurry, boring, sketch, lackluster, signature, letters, watermark, low res , horrific , mutated , artifacts , bad art , gross , b&w , poor quality , low quality , cropped </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>30<br> <b>Sampler:</b><br>DPM++ SDE Karras<br> <b>CFG 
scale:</b><br>10<br> <b>Seed:</b><br>169686796<br> <b>Size:</b><br>512x640<br> <b>Denoising strength:</b><br>0.7<br> <b>Hires upscale:</b><br>2<br> <b>Hires upscaler:</b><br>Latent </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.imgur.com/ktLu2Tl.png"> </td> </tr> <tr> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> dark and gloomy full body 8k unity render, female teen cyborg, Blue yonder hair, wearing broken battle armor, at cluttered and messy shack , action shot, tattered torn shirt, porcelain cracked skin, skin pores, detailed intricate iris, very dark lighting, heavy shadows, detailed, detailed face, (vibrant, photo realistic, realistic, dramatic, dark, sharp focus, 8k)<br><br> <b>Negative prompt:</b> nude, Asian, black and white, close up, cartoon, 3d, denim, (disfigured), (deformed), (poorly drawn), (extra limbs), blurry, boring, sketch, lackluster, signature, letters, watermark, low res , horrific , mutated , artifacts , bad art , gross , b&w , poor quality , low quality , cropped </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <b>Steps:</b><br>26<br> <b>Sampler:</b><br>DPM++ SDE Karras<br> <b>CFG scale:</b><br>7.5<br> <b>Seed:</b><br>2388736888<br> <b>Size:</b><br>768x1024 </td> <td style="vertical-align:top;padding:.5714286em!important;border: 1px solid"> <img style="vertical-align:top;margin:0;padding:0" src="https://i.stack.imgur.com/GnUuV.jpg"> </td> </tr> </table> [1]: https://i.stack.imgur.com/wkK2b.png
33,582
[ [ -0.0467529296875, -0.062164306640625, 0.025482177734375, 0.0245513916015625, -0.01053619384765625, 0.0111083984375, 0.01983642578125, -0.042236328125, 0.05181884765625, 0.0228271484375, -0.0467529296875, -0.059295654296875, -0.043609619140625, 0.0107269287109375, -0.00247955322265625, 0.053375244140625, -0.00931549072265625, -0.01393890380859375, -0.0009307861328125, 0.026519775390625, -0.0200653076171875, -0.001758575439453125, -0.03375244140625, 0.002666473388671875, 0.0214691162109375, 0.0224151611328125, 0.0625, 0.0635986328125, 0.035491943359375, 0.022613525390625, -0.02593994140625, -0.00203704833984375, -0.031463623046875, -0.0145416259765625, 0.0121002197265625, -0.0187530517578125, -0.034698486328125, -0.00244140625, 0.024658203125, 0.0298614501953125, -0.003814697265625, 0.01505279541015625, 0.01641845703125, 0.06591796875, -0.041229248046875, -0.002025604248046875, 0.0036296844482421875, 0.01561737060546875, -0.025482177734375, -0.0013980865478515625, -0.0076141357421875, -0.051666259765625, -0.0144500732421875, -0.0814208984375, 0.01334381103515625, 0.0034923553466796875, 0.0858154296875, -0.0124359130859375, -0.028900146484375, 0.0028629302978515625, -0.0457763671875, 0.039337158203125, -0.037872314453125, 0.0197906494140625, -0.00212860107421875, 0.01971435546875, -0.029327392578125, -0.057281494140625, -0.06793212890625, 0.0222015380859375, -0.0072021484375, 0.047821044921875, -0.0626220703125, -0.04248046875, 0.00873565673828125, 0.0264434814453125, -0.06396484375, -0.00868988037109375, -0.019500732421875, 0.007640838623046875, 0.0303802490234375, 0.0079345703125, 0.048309326171875, 0.002834320068359375, -0.04632568359375, -0.01389312744140625, -0.043609619140625, 0.0408935546875, 0.0283966064453125, 0.006015777587890625, -0.05902099609375, 0.034515380859375, -0.0019426345825195312, 0.0408935546875, 0.0268707275390625, -0.01297760009765625, 0.0268096923828125, -0.0274810791015625, -0.01224517822265625, -0.021820068359375, 0.06903076171875, 
0.05645751953125, -0.01340484619140625, 0.004230499267578125, 0.00884246826171875, -0.0019140243530273438, 0.00856781005859375, -0.0675048828125, -0.01104736328125, 0.0252838134765625, -0.0270843505859375, -0.02874755859375, -0.00691986083984375, -0.1004638671875, -0.00749969482421875, 0.01363372802734375, 0.023468017578125, -0.03509521484375, -0.019927978515625, 0.0246429443359375, -0.00850677490234375, 0.0283355712890625, 0.0283966064453125, -0.044281005859375, 0.0038909912109375, 0.00986480712890625, 0.05645751953125, -0.004116058349609375, 0.0174713134765625, 0.00562286376953125, 0.027862548828125, -0.038055419921875, 0.0616455078125, -0.030975341796875, -0.045562744140625, -0.01097869873046875, 0.0294342041015625, 0.0029659271240234375, -0.00009185075759887695, 0.0792236328125, -0.004913330078125, 0.00508880615234375, -0.041473388671875, -0.01081085205078125, -0.01953125, 0.00867462158203125, -0.043212890625, 0.048675537109375, 0.0203857421875, -0.07464599609375, 0.033050537109375, -0.05792236328125, -0.005046844482421875, -0.00572967529296875, -0.00978851318359375, -0.042938232421875, 0.0035114288330078125, 0.033203125, 0.019775390625, -0.033660888671875, -0.04791259765625, -0.048919677734375, -0.0014066696166992188, 0.004177093505859375, -0.00696563720703125, 0.07598876953125, 0.036224365234375, -0.048614501953125, -0.0231170654296875, -0.06781005859375, -0.0003237724304199219, 0.06329345703125, -0.0305633544921875, 0.0009436607360839844, -0.0295257568359375, 0.005401611328125, 0.035675048828125, 0.0189361572265625, -0.046234130859375, 0.0121002197265625, -0.006717681884765625, 0.0194244384765625, 0.05609130859375, 0.013336181640625, 0.024566650390625, -0.045318603515625, 0.0440673828125, 0.0278778076171875, 0.022735595703125, 0.00385284423828125, -0.0270843505859375, -0.055816650390625, -0.042266845703125, 0.0140228271484375, 0.0210723876953125, -0.0280914306640625, 0.053955078125, -0.0212249755859375, -0.05657958984375, -0.0290069580078125, 
-0.01061248779296875, 0.0228271484375, 0.036346435546875, 0.014801025390625, -0.0289764404296875, -0.027099609375, -0.0697021484375, 0.00534820556640625, 0.0088043212890625, 0.0075531005859375, 0.0216827392578125, 0.047943115234375, -0.01531219482421875, 0.072509765625, -0.048583984375, -0.0292510986328125, -0.0223846435546875, -0.01239013671875, 0.03350830078125, 0.06268310546875, 0.0726318359375, -0.041107177734375, -0.044525146484375, 0.00864410400390625, -0.043670654296875, 0.0010929107666015625, 0.015716552734375, -0.022247314453125, -0.01499176025390625, -0.0012912750244140625, -0.06256103515625, 0.057952880859375, 0.0218048095703125, -0.052337646484375, 0.048553466796875, -0.033660888671875, 0.033660888671875, -0.09112548828125, 0.01416778564453125, 0.027099609375, 0.00722503662109375, -0.055877685546875, 0.0286407470703125, -0.002986907958984375, -0.0066680908203125, -0.049774169921875, 0.046539306640625, -0.046142578125, 0.0220947265625, 0.0093994140625, -0.01348876953125, 0.0232391357421875, 0.0374755859375, -0.0113677978515625, 0.047943115234375, 0.05712890625, -0.023193359375, 0.006816864013671875, 0.0213165283203125, -0.029144287109375, 0.06976318359375, -0.060394287109375, 0.00534820556640625, -0.0145721435546875, 0.02178955078125, -0.093505859375, -0.01898193359375, 0.0236053466796875, -0.04400634765625, 0.0272216796875, -0.01019287109375, -0.035552978515625, -0.056427001953125, -0.052886962890625, 0.0031681060791015625, 0.056884765625, -0.0321044921875, 0.057159423828125, 0.02099609375, 0.0120391845703125, -0.03692626953125, -0.0562744140625, -0.0191802978515625, -0.0193328857421875, -0.06146240234375, 0.0247955322265625, -0.003292083740234375, -0.01462554931640625, -0.00518035888671875, -0.01326751708984375, -0.004024505615234375, -0.022308349609375, 0.032379150390625, 0.0157012939453125, -0.0108642578125, -0.036834716796875, -0.01300811767578125, 0.0009551048278808594, -0.005069732666015625, -0.0213470458984375, 0.047271728515625, 
-0.01000213623046875, -0.0396728515625, -0.06439208984375, 0.024261474609375, 0.07684326171875, -0.0149383544921875, 0.058074951171875, 0.04345703125, -0.0203857421875, -0.0008945465087890625, -0.045257568359375, 0.00374603271484375, -0.029083251953125, -0.0107879638671875, -0.03173828125, -0.034912109375, 0.05487060546875, 0.01605224609375, -0.0003459453582763672, 0.053741455078125, 0.035186767578125, -0.037078857421875, 0.0810546875, 0.024749755859375, 0.005931854248046875, 0.01525115966796875, -0.0740966796875, 0.0014400482177734375, -0.049957275390625, -0.03955078125, -0.036590576171875, -0.05023193359375, -0.018951416015625, -0.051605224609375, 0.02862548828125, 0.0041046142578125, -0.041778564453125, 0.0265655517578125, -0.05023193359375, 0.02276611328125, 0.0311279296875, 0.042266845703125, 0.01201629638671875, 0.00847625732421875, -0.046905517578125, -0.0180511474609375, -0.0253753662109375, -0.02667236328125, 0.06951904296875, 0.0241241455078125, 0.052520751953125, 0.0219573974609375, 0.033111572265625, 0.004627227783203125, -0.01419830322265625, 0.0003154277801513672, 0.038299560546875, 0.0040283203125, -0.06292724609375, -0.004856109619140625, -0.02227783203125, -0.08935546875, 0.0173492431640625, -0.053497314453125, -0.058807373046875, 0.0533447265625, 0.0109710693359375, -0.04168701171875, 0.048858642578125, -0.06610107421875, 0.055572509765625, -0.020538330078125, -0.063232421875, 0.0026416778564453125, -0.040985107421875, 0.01476287841796875, 0.024505615234375, 0.0240936279296875, -0.00007015466690063477, -0.007244110107421875, 0.03790283203125, -0.048919677734375, 0.048736572265625, 0.0018186569213867188, 0.01751708984375, 0.030975341796875, 0.01271820068359375, 0.07269287109375, 0.0094451904296875, 0.0198974609375, 0.0026683807373046875, -0.0166015625, -0.053253173828125, -0.0321044921875, 0.065185546875, -0.07293701171875, -0.0341796875, -0.038055419921875, -0.0143890380859375, 0.014678955078125, 0.0172576904296875, 0.0379638671875, 
0.02203369140625, -0.00873565673828125, 0.0240936279296875, 0.048919677734375, -0.01297760009765625, 0.04669189453125, 0.0184783935546875, -0.01690673828125, -0.0426025390625, 0.06280517578125, 0.02685546875, 0.015106201171875, 0.022735595703125, 0.0162506103515625, -0.03173828125, -0.043182373046875, -0.029144287109375, 0.0186614990234375, -0.037078857421875, -0.0056304931640625, -0.06646728515625, 0.0032711029052734375, -0.0321044921875, -0.047271728515625, -0.00970458984375, -0.0310821533203125, -0.0433349609375, -0.01485443115234375, 0.049774169921875, 0.04266357421875, -0.00954437255859375, 0.009185791015625, -0.011810302734375, 0.01103973388671875, 0.036773681640625, 0.018890380859375, 0.005313873291015625, -0.026641845703125, 0.02337646484375, 0.0056304931640625, -0.0457763671875, -0.07672119140625, 0.046478271484375, -0.0377197265625, 0.0225067138671875, 0.02294921875, -0.00998687744140625, 0.0765380859375, -0.0198822021484375, 0.045806884765625, 0.0601806640625, -0.030670166015625, 0.032745361328125, -0.0556640625, 0.032989501953125, 0.0335693359375, 0.029754638671875, -0.011871337890625, -0.0325927734375, -0.06475830078125, -0.062103271484375, 0.0233001708984375, 0.0308990478515625, -0.01180267333984375, 0.0161590576171875, -0.0001722574234008789, -0.00827789306640625, 0.004474639892578125, -0.07330322265625, -0.04852294921875, -0.015106201171875, 0.008056640625, 0.00707244873046875, 0.01552581787109375, -0.014251708984375, -0.048370361328125, 0.04986572265625, 0.0228271484375, 0.02691650390625, 0.0200042724609375, 0.029937744140625, -0.019622802734375, 0.018890380859375, 0.03167724609375, 0.0523681640625, -0.018463134765625, -0.005977630615234375, 0.0023250579833984375, -0.034027099609375, 0.00982666015625, 0.00782012939453125, -0.04296875, -0.0029735565185546875, -0.004154205322265625, 0.042388916015625, 0.00992584228515625, -0.0168304443359375, 0.03240966796875, -0.0020008087158203125, -0.023468017578125, -0.0247650146484375, 0.00696563720703125, 
0.0214691162109375, 0.020233154296875, 0.039398193359375, 0.01203155517578125, 0.006072998046875, -0.039520263671875, 0.002986907958984375, 0.041717529296875, -0.0279541015625, -0.014007568359375, 0.07305908203125, 0.005214691162109375, -0.01306915283203125, -0.003345489501953125, -0.0113677978515625, -0.01454925537109375, 0.0595703125, 0.04638671875, 0.047119140625, -0.0005283355712890625, 0.026397705078125, 0.046661376953125, 0.011474609375, 0.014434814453125, 0.031402587890625, 0.0247344970703125, 0.0075836181640625, 0.0035076141357421875, -0.054962158203125, -0.0025119781494140625, 0.0312347412109375, -0.00211334228515625, 0.057373046875, -0.051239013671875, -0.01395416259765625, 0.004459381103515625, 0.004077911376953125, -0.0455322265625, 0.041839599609375, -0.0165252685546875, 0.0625, -0.06396484375, 0.032989501953125, 0.043609619140625, -0.06683349609375, -0.07904052734375, 0.004016876220703125, 0.022430419921875, -0.039947509765625, 0.029998779296875, -0.00099945068359375, 0.005153656005859375, 0.008941650390625, -0.048980712890625, -0.0654296875, 0.10137939453125, 0.019927978515625, -0.0231170654296875, -0.007427215576171875, 0.0070343017578125, 0.03253173828125, -0.0308990478515625, 0.05841064453125, 0.03326416015625, 0.0290069580078125, 0.046844482421875, -0.0443115234375, 0.006404876708984375, -0.04010009765625, 0.021087646484375, -0.004756927490234375, -0.0777587890625, 0.0706787109375, -0.01251983642578125, -0.016448974609375, -0.00646209716796875, 0.04266357421875, 0.034576416015625, 0.003009796142578125, 0.0394287109375, 0.0836181640625, 0.007808685302734375, -0.03594970703125, 0.0826416015625, -0.028289794921875, 0.0253753662109375, 0.0458984375, 0.006816864013671875, 0.042724609375, 0.0207366943359375, -0.0428466796875, 0.0192413330078125, 0.060089111328125, 0.005268096923828125, 0.025909423828125, 0.0234222412109375, -0.0203857421875, 0.000576019287109375, -0.0195465087890625, -0.06585693359375, 0.0055389404296875, 0.00659942626953125, 
-0.01416778564453125, -0.02862548828125, 0.0003292560577392578, 0.01522064208984375, 0.0231475830078125, -0.0225982666015625, 0.034149169921875, -0.012603759765625, -0.01320648193359375, 0.04876708984375, -0.025665283203125, 0.04345703125, -0.04608154296875, -0.00453948974609375, -0.0311279296875, 0.006622314453125, -0.032501220703125, -0.08355712890625, -0.01593017578125, 0.0041046142578125, -0.006565093994140625, -0.01068115234375, 0.022735595703125, 0.00635528564453125, -0.06756591796875, 0.01934814453125, 0.011444091796875, 0.01177978515625, 0.03875732421875, -0.0760498046875, 0.01641845703125, 0.007568359375, -0.03033447265625, 0.01204681396484375, 0.050537109375, 0.0283355712890625, 0.04534912109375, 0.0577392578125, 0.0242462158203125, 0.024261474609375, -0.0303955078125, 0.07318115234375, -0.034027099609375, -0.044830322265625, -0.057281494140625, 0.0684814453125, -0.0233001708984375, -0.025115966796875, 0.062255859375, 0.058502197265625, 0.04168701171875, -0.0283966064453125, 0.061767578125, -0.0408935546875, 0.0394287109375, -0.01291656494140625, 0.059295654296875, -0.058807373046875, -0.00908660888671875, -0.041900634765625, -0.058074951171875, -0.01222991943359375, 0.05316162109375, -0.011383056640625, 0.01611328125, 0.0285491943359375, 0.06396484375, -0.0021610260009765625, -0.0098876953125, 0.0029392242431640625, 0.0278778076171875, 0.0212554931640625, 0.059051513671875, 0.034759521484375, -0.0222015380859375, 0.03582763671875, -0.04473876953125, -0.0209808349609375, -0.0283355712890625, -0.053955078125, -0.054168701171875, -0.036041259765625, -0.0189361572265625, -0.03521728515625, 0.0006561279296875, 0.0670166015625, 0.0599365234375, -0.060089111328125, -0.01708984375, -0.01171112060546875, 0.0014352798461914062, -0.0204315185546875, -0.015167236328125, 0.014984130859375, 0.027435302734375, -0.06903076171875, -0.004711151123046875, 0.04132080078125, 0.0576171875, 0.0115509033203125, -0.00501251220703125, -0.036956787109375, -0.004650115966796875, 
0.0137481689453125, 0.0285797119140625, -0.03887939453125, 0.0002434253692626953, -0.0029087066650390625, -0.01505279541015625, 0.035400390625, 0.0197296142578125, -0.029632568359375, 0.0361328125, 0.044891357421875, 0.0287933349609375, 0.036224365234375, 0.00034999847412109375, 0.01507568359375, -0.01552581787109375, 0.01084136962890625, 0.0013208389282226562, 0.041107177734375, -0.00824737548828125, -0.03167724609375, 0.0504150390625, 0.0290069580078125, -0.031402587890625, -0.060546875, 0.00490570068359375, -0.10443115234375, -0.0197601318359375, 0.059722900390625, 0.002620697021484375, -0.047119140625, 0.006847381591796875, -0.0265655517578125, 0.00995635986328125, -0.047821044921875, 0.02557373046875, 0.039306640625, -0.0035686492919921875, -0.006732940673828125, -0.0650634765625, 0.0283355712890625, 0.0178680419921875, -0.06591796875, -0.011260986328125, 0.031280517578125, 0.0146026611328125, 0.0604248046875, 0.034332275390625, -0.039794921875, 0.0244293212890625, -0.0019168853759765625, 0.02752685546875, 0.00862884521484375, 0.0015821456909179688, -0.0115814208984375, 0.0282135009765625, -0.01198577880859375, -0.03143310546875 ] ]
google/deplot
2023-09-06T19:53:17.000Z
[ "transformers", "pytorch", "safetensors", "pix2struct", "text2text-generation", "visual-question-answering", "en", "fr", "ro", "de", "multilingual", "arxiv:2212.10505", "license:apache-2.0", "autotrain_compatible", "has_space", "region:us" ]
visual-question-answering
google
null
null
google/deplot
115
10,022
transformers
2023-04-03T11:05:38
--- language: - en - fr - ro - de - multilingual inference: false pipeline_tag: visual-question-answering license: apache-2.0 --- # Model card for DePlot <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/deplot_architecture.png" alt="drawing" width="600"/> # Table of Contents 0. [TL;DR](#TL;DR) 1. [Using the model](#using-the-model) 2. [Contribution](#contribution) 3. [Citation](#citation) # TL;DR The abstract of the paper states that: > Visual language such as charts and plots is ubiquitous in the human world. Comprehending plots and charts requires strong reasoning skills. Prior state-of-the-art (SOTA) models require at least tens of thousands of training examples and their reasoning capabilities are still much limited, especially on complex human-written queries. This paper presents the first one-shot solution to visual language reasoning. We decompose the challenge of visual language reasoning into two steps: (1) plot-to-text translation, and (2) reasoning over the translated text. The key in this method is a modality conversion module, named as DePlot, which translates the image of a plot or chart to a linearized table. The output of DePlot can then be directly used to prompt a pretrained large language model (LLM), exploiting the few-shot reasoning capabilities of LLMs. To obtain DePlot, we standardize the plot-to-table task by establishing unified task formats and metrics, and train DePlot end-to-end on this task. DePlot can then be used off-the-shelf together with LLMs in a plug-and-play fashion. Compared with a SOTA model finetuned on more than >28k data points, DePlot+LLM with just one-shot prompting achieves a 24.0% improvement over finetuned SOTA on human-written queries from the task of chart QA. 
# Using the model You can run a prediction by querying an input image together with a question as follows: ```python from transformers import Pix2StructProcessor, Pix2StructForConditionalGeneration import requests from PIL import Image processor = Pix2StructProcessor.from_pretrained('google/deplot') model = Pix2StructForConditionalGeneration.from_pretrained('google/deplot') url = "https://raw.githubusercontent.com/vis-nlp/ChartQA/main/ChartQA%20Dataset/val/png/5090.png" image = Image.open(requests.get(url, stream=True).raw) inputs = processor(images=image, text="Generate underlying data table of the figure below:", return_tensors="pt") predictions = model.generate(**inputs, max_new_tokens=512) print(processor.decode(predictions[0], skip_special_tokens=True)) ``` # Converting from T5X to Hugging Face You can use the [`convert_pix2struct_checkpoint_to_pytorch.py`](https://github.com/huggingface/transformers/blob/main/src/transformers/models/pix2struct/convert_pix2struct_original_pytorch_to_hf.py) script as follows: ```bash python convert_pix2struct_checkpoint_to_pytorch.py --t5x_checkpoint_path PATH_TO_T5X_CHECKPOINTS --pytorch_dump_path PATH_TO_SAVE --is_vqa ``` If you are converting a large model, run: ```bash python convert_pix2struct_checkpoint_to_pytorch.py --t5x_checkpoint_path PATH_TO_T5X_CHECKPOINTS --pytorch_dump_path PATH_TO_SAVE --use-large --is_vqa ``` Once saved, you can push your converted model with the following snippet: ```python from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor model = Pix2StructForConditionalGeneration.from_pretrained(PATH_TO_SAVE) processor = Pix2StructProcessor.from_pretrained(PATH_TO_SAVE) model.push_to_hub("USERNAME/MODEL_NAME") processor.push_to_hub("USERNAME/MODEL_NAME") ``` # Contribution This model was originally contributed by Fangyu Liu, Julian Martin Eisenschlos et al. and added to the Hugging Face ecosystem by [Younes Belkada](https://huggingface.co/ybelkada). 
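The TL;DR above notes that DePlot's linearized table can be passed directly to a pretrained LLM in a plug-and-play fashion. A minimal sketch of that second step follows; the prompt template, the exemplar, and the `build_oneshot_prompt` helper are illustrative assumptions and are not part of the DePlot release:

```python
# Sketch of the DePlot+LLM "plug-and-play" step: the linearized table
# produced by DePlot (the string returned by `processor.decode(...)` above)
# is embedded in a one-shot reasoning prompt for a text-only LLM.
# The exemplar and template below are invented for illustration; they are
# not the prompts used in the paper.

EXEMPLAR = (
    "Table:\n"
    "Year | Sales\n"
    "2020 | 10\n"
    "2021 | 15\n"
    "Question: By how much did sales grow from 2020 to 2021?\n"
    "Answer: 15 - 10 = 5\n\n"
)

def build_oneshot_prompt(table: str, question: str) -> str:
    """Combine a DePlot-style linearized table and a question into a
    one-shot prompt, leaving the final 'Answer:' open for the LLM."""
    return f"{EXEMPLAR}Table:\n{table}\nQuestion: {question}\nAnswer:"

# Example with a hand-written table standing in for DePlot output:
prompt = build_oneshot_prompt(
    "Month | Rainfall\nJan | 3.2\nFeb | 2.8",
    "Which month had more rainfall?",
)
print(prompt)
```

The resulting string would then be sent to any text-completion LLM; per the abstract, one-shot prompting in this style outperforms a finetuned SOTA model on human-written queries from chart QA.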
# Citation If you want to cite this work, please consider citing the original paper: ``` @misc{liu2022deplot, title={DePlot: One-shot visual language reasoning by plot-to-table translation}, author={Liu, Fangyu and Eisenschlos, Julian Martin and Piccinno, Francesco and Krichene, Syrine and Pang, Chenxi and Lee, Kenton and Joshi, Mandar and Chen, Wenhu and Collier, Nigel and Altun, Yasemin}, year={2022}, eprint={2212.10505}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
4,317
[ [ -0.034088134765625, -0.068603515625, 0.03668212890625, 0.018157958984375, -0.020111083984375, -0.0384521484375, -0.005290985107421875, -0.03369140625, 0.00487518310546875, 0.04815673828125, -0.0494384765625, -0.031005859375, -0.044586181640625, -0.0039215087890625, -0.0211944580078125, 0.062286376953125, -0.0101318359375, -0.0045928955078125, -0.02630615234375, -0.0022296905517578125, -0.009124755859375, -0.0221710205078125, -0.03668212890625, -0.0280303955078125, 0.0263519287109375, 0.0285491943359375, 0.06304931640625, 0.0372314453125, 0.042694091796875, 0.0250244140625, -0.0128173828125, -0.006168365478515625, -0.031646728515625, -0.005138397216796875, -0.00598907470703125, -0.042694091796875, -0.04693603515625, 0.0085601806640625, 0.033935546875, 0.03826904296875, 0.00962066650390625, 0.0169677734375, -0.00284576416015625, 0.05499267578125, -0.016571044921875, 0.037841796875, -0.0231170654296875, 0.0014448165893554688, 0.0005850791931152344, 0.0090484619140625, -0.03363037109375, -0.01544952392578125, 0.0188446044921875, -0.046661376953125, 0.000053942203521728516, -0.0150909423828125, 0.0914306640625, 0.038970947265625, -0.018890380859375, 0.0200653076171875, -0.03594970703125, 0.04901123046875, -0.052398681640625, 0.0177764892578125, 0.028045654296875, 0.03448486328125, -0.006622314453125, -0.07012939453125, -0.06036376953125, -0.0128631591796875, -0.018951416015625, 0.0149078369140625, -0.040313720703125, -0.00601959228515625, 0.045684814453125, 0.00812530517578125, -0.045806884765625, -0.0057373046875, -0.037506103515625, -0.01435089111328125, 0.061553955078125, -0.004962921142578125, 0.03564453125, -0.03985595703125, -0.048431396484375, -0.049896240234375, -0.03668212890625, 0.0107574462890625, -0.005443572998046875, 0.01519775390625, -0.0341796875, 0.044647216796875, -0.01275634765625, 0.037689208984375, 0.005340576171875, -0.0170440673828125, 0.027923583984375, -0.0028896331787109375, -0.01849365234375, -0.017547607421875, 0.0780029296875, 
0.0352783203125, 0.0330810546875, 0.0142974853515625, -0.004180908203125, 0.00600433349609375, -0.0036640167236328125, -0.0833740234375, -0.02789306640625, 0.01837158203125, -0.0235748291015625, -0.01220703125, 0.0004935264587402344, -0.06048583984375, -0.01457977294921875, -0.0194244384765625, 0.040435791015625, -0.044708251953125, -0.03289794921875, -0.0020923614501953125, -0.01172637939453125, 0.017974853515625, 0.033447265625, -0.0435791015625, -0.002277374267578125, 0.03131103515625, 0.06231689453125, 0.00008338689804077148, -0.03369140625, -0.034423828125, -0.01361083984375, -0.015533447265625, 0.06549072265625, -0.01922607421875, -0.006633758544921875, -0.0170135498046875, 0.0164947509765625, -0.0123748779296875, -0.042694091796875, 0.0265655517578125, -0.01959228515625, 0.00746917724609375, -0.02276611328125, -0.0189971923828125, -0.0299072265625, 0.0187835693359375, -0.041473388671875, 0.0869140625, 0.041839599609375, -0.061309814453125, 0.01384735107421875, -0.03271484375, -0.0213165283203125, -0.006160736083984375, -0.00696563720703125, -0.05230712890625, -0.01158905029296875, 0.0294189453125, 0.03826904296875, -0.022674560546875, 0.019500732421875, -0.036346435546875, -0.010955810546875, 0.014739990234375, -0.001949310302734375, 0.0716552734375, 0.01904296875, -0.0394287109375, 0.0160980224609375, -0.031585693359375, 0.003902435302734375, 0.0308380126953125, -0.00787353515625, 0.005504608154296875, -0.0225677490234375, 0.0126800537109375, 0.038848876953125, 0.00521087646484375, -0.034027099609375, 0.03143310546875, -0.027740478515625, 0.047332763671875, 0.0374755859375, -0.0189666748046875, 0.0297393798828125, -0.0038623809814453125, 0.036651611328125, 0.0175323486328125, 0.007476806640625, -0.0251922607421875, -0.048431396484375, -0.061553955078125, -0.0174102783203125, 0.00872039794921875, 0.035430908203125, -0.07379150390625, 0.0294952392578125, -0.027496337890625, -0.048614501953125, -0.00955963134765625, -0.0090789794921875, 0.0220184326171875, 
0.044891357421875, 0.041168212890625, -0.01041412353515625, -0.02716064453125, -0.06439208984375, 0.0035381317138671875, -0.0256500244140625, -0.0094757080078125, 0.0093231201171875, 0.052093505859375, -0.032806396484375, 0.07659912109375, -0.039154052734375, -0.02313232421875, -0.0343017578125, 0.005771636962890625, 0.01227569580078125, 0.050933837890625, 0.055999755859375, -0.04559326171875, -0.044403076171875, -0.0029735565185546875, -0.069091796875, -0.00543975830078125, -0.0133056640625, -0.037261962890625, 0.021270751953125, 0.03936767578125, -0.058990478515625, 0.05059814453125, 0.03466796875, -0.04827880859375, 0.045654296875, 0.00021767616271972656, -0.0028629302978515625, -0.06671142578125, 0.02392578125, 0.007701873779296875, -0.0291748046875, -0.039276123046875, 0.01097869873046875, 0.0067901611328125, -0.028900146484375, -0.033294677734375, 0.0697021484375, -0.0252685546875, 0.0023822784423828125, -0.01502227783203125, 0.0133209228515625, 0.01081085205078125, 0.052703857421875, 0.01465606689453125, 0.035400390625, 0.0635986328125, -0.03778076171875, 0.029876708984375, 0.060760498046875, -0.01371002197265625, 0.037933349609375, -0.0621337890625, 0.0122528076171875, -0.01230621337890625, 0.0247039794921875, -0.072021484375, -0.019805908203125, 0.03936767578125, -0.037139892578125, 0.02630615234375, -0.0207672119140625, -0.03424072265625, -0.042236328125, -0.0031280517578125, 0.0176239013671875, 0.05072021484375, -0.04888916015625, 0.0478515625, 0.017822265625, -0.00952911376953125, -0.0236358642578125, -0.07086181640625, -0.0049591064453125, -0.00975799560546875, -0.0782470703125, 0.032135009765625, 0.00214385986328125, -0.006488800048828125, -0.0010747909545898438, -0.01041412353515625, 0.00978851318359375, -0.016326904296875, 0.0232086181640625, 0.0276641845703125, -0.01078033447265625, -0.0014905929565429688, 0.0022869110107421875, -0.040802001953125, 0.007389068603515625, -0.01122283935546875, 0.048431396484375, -0.0159454345703125, 
-0.015716552734375, -0.0570068359375, 0.01474761962890625, 0.056060791015625, -0.02325439453125, 0.041839599609375, 0.06036376953125, -0.02301025390625, -0.0019969940185546875, -0.035491943359375, -0.0275421142578125, -0.033203125, 0.034637451171875, -0.034820556640625, -0.035888671875, 0.042205810546875, 0.00007814168930053711, -0.01194000244140625, 0.052215576171875, 0.038787841796875, -0.0022869110107421875, 0.08343505859375, 0.049530029296875, 0.0161285400390625, 0.04315185546875, -0.054534912109375, 0.0015535354614257812, -0.048675537109375, -0.02032470703125, -0.0207672119140625, -0.0311431884765625, -0.03692626953125, -0.046173095703125, 0.051239013671875, 0.0304412841796875, -0.0345458984375, 0.044586181640625, -0.043670654296875, 0.01180267333984375, 0.0411376953125, 0.032318115234375, 0.003421783447265625, 0.0198516845703125, 0.0018911361694335938, -0.003063201904296875, -0.045501708984375, -0.035400390625, 0.059417724609375, 0.03607177734375, 0.05523681640625, -0.0105743408203125, 0.03515625, -0.0228424072265625, 0.02142333984375, -0.059295654296875, 0.04925537109375, -0.007659912109375, -0.039215087890625, -0.025177001953125, -0.040435791015625, -0.058197021484375, 0.018096923828125, -0.01708984375, -0.06939697265625, 0.0191192626953125, 0.01165008544921875, -0.0203857421875, 0.0214996337890625, -0.059112548828125, 0.1021728515625, -0.0181732177734375, -0.044891357421875, -0.0004878044128417969, -0.052093505859375, 0.021331787109375, 0.0267181396484375, -0.01189422607421875, -0.0149993896484375, 0.021514892578125, 0.057342529296875, -0.039642333984375, 0.056488037109375, -0.027862548828125, 0.0023479461669921875, 0.048095703125, 0.01534271240234375, 0.0438232421875, 0.0013599395751953125, -0.01419830322265625, 0.0273590087890625, 0.0265655517578125, -0.034576416015625, -0.048370361328125, 0.035491943359375, -0.0634765625, -0.040313720703125, -0.0294342041015625, -0.045013427734375, 0.0015497207641601562, 0.018890380859375, 0.0209503173828125, 
0.01416778564453125, 0.0084991455078125, 0.00677490234375, 0.0253143310546875, -0.0294036865234375, 0.02685546875, 0.0016241073608398438, -0.0408935546875, -0.0232086181640625, 0.052764892578125, 0.0007958412170410156, 0.03143310546875, 0.01153564453125, 0.034454345703125, -0.032684326171875, -0.004573822021484375, -0.0535888671875, 0.038848876953125, -0.060211181640625, -0.02294921875, -0.054779052734375, -0.007686614990234375, -0.040771484375, -0.00504302978515625, -0.039947509765625, -0.0316162109375, -0.021484375, 0.00922393798828125, 0.047760009765625, 0.0273590087890625, -0.00522613525390625, 0.0280303955078125, -0.05615234375, 0.03125, 0.02874755859375, 0.0225677490234375, -0.01192474365234375, -0.053314208984375, 0.004695892333984375, 0.00859832763671875, -0.035308837890625, -0.074951171875, 0.048675537109375, 0.005710601806640625, 0.029815673828125, 0.02838134765625, 0.0017175674438476562, 0.0634765625, -0.01345062255859375, 0.038116455078125, 0.029144287109375, -0.07171630859375, 0.05352783203125, -0.02008056640625, 0.0206451416015625, 0.0293426513671875, 0.030487060546875, -0.0367431640625, 0.002838134765625, -0.058319091796875, -0.05242919921875, 0.0677490234375, 0.016021728515625, 0.001041412353515625, 0.0185699462890625, 0.032073974609375, -0.0028629302978515625, 0.00298309326171875, -0.06451416015625, -0.01486968994140625, -0.037078857421875, -0.00270843505859375, 0.0033168792724609375, -0.00873565673828125, -0.0006852149963378906, -0.056793212890625, 0.045135498046875, -0.03314208984375, 0.040191650390625, 0.031646728515625, -0.01409912109375, -0.021270751953125, -0.00934600830078125, 0.01849365234375, 0.041656494140625, -0.006153106689453125, 0.005985260009765625, 0.004718780517578125, -0.0187530517578125, -0.012359619140625, 0.019317626953125, -0.01971435546875, -0.004413604736328125, 0.028564453125, 0.07769775390625, 0.003387451171875, -0.046844482421875, 0.046539306640625, 0.01541900634765625, -0.0140228271484375, -0.012847900390625, 
-0.0017271041870117188, 0.019989013671875, 0.03387451171875, 0.01502227783203125, 0.011566162109375, -0.020111083984375, -0.045196533203125, 0.027313232421875, 0.0282135009765625, -0.0304107666015625, -0.032379150390625, 0.06427001953125, 0.0011701583862304688, 0.0027618408203125, 0.060211181640625, -0.0184173583984375, -0.05438232421875, 0.0501708984375, 0.033050537109375, 0.061492919921875, -0.0020046234130859375, 0.028045654296875, 0.06964111328125, 0.0163726806640625, -0.0085601806640625, 0.01546478271484375, 0.0016422271728515625, -0.044219970703125, -0.025390625, -0.0584716796875, -0.01505279541015625, 0.0013189315795898438, -0.03656005859375, 0.044464111328125, -0.03131103515625, -0.00885772705078125, 0.0009264945983886719, -0.00974273681640625, -0.05902099609375, 0.01100921630859375, 0.0294342041015625, 0.05731201171875, -0.06536865234375, 0.055389404296875, 0.050201416015625, -0.060882568359375, -0.067626953125, -0.013397216796875, -0.004909515380859375, -0.068603515625, 0.056671142578125, 0.031463623046875, 0.01557159423828125, 0.002227783203125, -0.0631103515625, -0.061279296875, 0.08770751953125, 0.047271728515625, -0.037200927734375, -0.005237579345703125, 0.0178680419921875, 0.038360595703125, -0.01395416259765625, 0.045135498046875, 0.051544189453125, 0.056671142578125, 0.03350830078125, -0.06939697265625, 0.0236968994140625, -0.0231781005859375, -0.0004658699035644531, -0.005710601806640625, -0.048583984375, 0.0853271484375, -0.037750244140625, -0.03106689453125, -0.0036220550537109375, 0.054840087890625, 0.0269012451171875, 0.0145416259765625, 0.036590576171875, 0.0535888671875, 0.03155517578125, -0.025543212890625, 0.0826416015625, -0.00495147705078125, 0.052276611328125, 0.0478515625, 0.0032215118408203125, 0.0628662109375, 0.033477783203125, -0.0341796875, 0.039459228515625, 0.03985595703125, -0.0133514404296875, 0.011474609375, -0.0007882118225097656, 0.0195159912109375, -0.03765869140625, 0.003818511962890625, -0.034454345703125, 
0.039642333984375, 0.0102386474609375, -0.01947021484375, -0.0290069580078125, 0.006534576416015625, 0.01788330078125, 0.0038013458251953125, 0.0006742477416992188, 0.040618896484375, 0.0162200927734375, -0.054412841796875, 0.07440185546875, -0.0099029541015625, 0.046600341796875, -0.033294677734375, -0.00386810302734375, -0.022064208984375, 0.0210113525390625, -0.02471923828125, -0.061798095703125, 0.02996826171875, -0.0088348388671875, -0.0002543926239013672, -0.0208282470703125, 0.042816162109375, -0.03558349609375, -0.056121826171875, 0.0243072509765625, 0.03564453125, 0.0228271484375, -0.038818359375, -0.073486328125, 0.01508331298828125, 0.007312774658203125, -0.0478515625, 0.0225067138671875, 0.027313232421875, -0.0025653839111328125, 0.05987548828125, 0.040679931640625, -0.017120361328125, -0.00629425048828125, -0.0038242340087890625, 0.07598876953125, -0.05352783203125, -0.035736083984375, -0.050048828125, 0.06597900390625, 0.007724761962890625, -0.0305023193359375, 0.039093017578125, 0.0224151611328125, 0.0601806640625, -0.014617919921875, 0.044708251953125, -0.02655029296875, 0.01702880859375, -0.03533935546875, 0.058624267578125, -0.059173583984375, -0.005229949951171875, -0.036163330078125, -0.06964111328125, -0.024078369140625, 0.0653076171875, -0.033447265625, 0.007732391357421875, 0.058685302734375, 0.0711669921875, -0.007602691650390625, -0.01904296875, 0.01849365234375, 0.026092529296875, 0.034759521484375, 0.044189453125, 0.044219970703125, -0.0543212890625, 0.05126953125, -0.048736572265625, -0.0205535888671875, -0.01313018798828125, -0.05615234375, -0.067626953125, -0.049102783203125, -0.052398681640625, -0.037841796875, -0.034759521484375, 0.051025390625, 0.06036376953125, -0.0537109375, -0.00360107421875, -0.01091766357421875, -0.001209259033203125, -0.0290985107421875, -0.0209808349609375, 0.039825439453125, -0.0084381103515625, -0.07330322265625, 0.004642486572265625, 0.011505126953125, 0.01000213623046875, -0.0211639404296875, 
-0.01129150390625, -0.01110076904296875, -0.025146484375, 0.05108642578125, 0.0261077880859375, -0.05926513671875, -0.02044677734375, 0.003948211669921875, -0.01238250732421875, 0.020904541015625, 0.032623291015625, -0.061309814453125, 0.04083251953125, 0.061187744140625, 0.0266571044921875, 0.0770263671875, 0.001495361328125, 0.0166473388671875, -0.02923583984375, 0.0343017578125, 0.010711669921875, 0.0293121337890625, 0.0252685546875, -0.01508331298828125, 0.041473388671875, 0.028045654296875, -0.031646728515625, -0.059417724609375, 0.0188751220703125, -0.1004638671875, -0.0189971923828125, 0.09368896484375, -0.016937255859375, -0.04473876953125, 0.028411865234375, -0.031585693359375, 0.03106689453125, -0.0160369873046875, 0.039154052734375, 0.016021728515625, -0.020263671875, -0.036376953125, -0.033905029296875, 0.03021240234375, 0.0341796875, -0.0657958984375, 0.007602691650390625, 0.0288238525390625, 0.02093505859375, 0.029632568359375, 0.031280517578125, -0.0195159912109375, 0.0204925537109375, 0.01629638671875, 0.03631591796875, -0.0182952880859375, -0.01715087890625, -0.008087158203125, -0.0066375732421875, -0.0160369873046875, 0.0019969940185546875 ] ]
Deci/DeciCoder-1b
2023-10-02T08:28:44.000Z
[ "transformers", "safetensors", "text-generation", "text generation", "Deci AI", "DeciCoder", "custom_code", "dataset:bigcode/starcoderdata", "arxiv:2305.13245", "arxiv:2104.09864", "license:apache-2.0", "model-index", "has_space", "region:us" ]
text-generation
Deci
null
null
Deci/DeciCoder-1b
223
10,008
transformers
2023-08-16T14:52:10
--- pipeline_tag: text-generation license: apache-2.0 tags: - text generation - Deci AI - DeciCoder programming_language: - Java - JavaScript - Python metrics: - code_eval inference: true widget: - text: 'def print_hello_world():' example_title: Hello world group: Python model-index: - name: DeciCoder-1b results: - task: type: text-generation dataset: type: nuprl/MultiPL-E name: MultiPL-HumanEval (Python) metrics: - name: pass@1 type: pass@1 value: 0.191 verified: false - task: type: text-generation dataset: type: nuprl/MultiPL-E name: MultiPL-HumanEval (JavaScript) metrics: - name: pass@1 type: pass@1 value: 0.184 verified: false - task: type: text-generation dataset: type: nuprl/MultiPL-E name: MultiPL-HumanEval (Java) metrics: - name: pass@1 type: pass@1 value: 0.166 verified: false datasets: - bigcode/starcoderdata --- # Model Card for DeciCoder 1B DeciCoder 1B is a 1 billion parameter decoder-only code completion model trained on the Python, Java, and JavaScript subsets of the [StarCoder Training Dataset](https://huggingface.co/datasets/bigcode/starcoderdata). The model uses Grouped Query Attention and has a context window of 2048 tokens. It was trained using a Fill-in-the-Middle training objective. The model's architecture was generated by Deci's proprietary Neural Architecture Search-based technology, AutoNAC. ## Model Details - **Developed by:** Deci - **Model type:** DeciCoder is an auto-regressive language model based on the transformer decoder architecture, using Grouped Query Attention. 
- **Language(s):** Python, Java, JavaScript - **License:** Model checkpoints are licensed under the [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) ## Model Architecture | Parameters | Layers | Heads | Sequence Length | GQA num_key_value_heads | Hidden Size | |:----------|:----------|:----------|:----------|:----------|:----------| | 1.1B | 20 | 32 | 2048 | 4 | 2048 | - **Decoder layer:** Grouped Query Attention [Ainslie et al., 2023](https://arxiv.org/abs/2305.13245) - **Position Embeddings:** Rotary Position Embeddings [Su et al., 2021](https://arxiv.org/abs/2104.09864) ## Uses The model is intended for single- and multi-line code completion from a context window of up to 2048 tokens. It is *not* an instruction model, and prompts like "Write a function that computes the absolute value of an integer" won't yield the desired results. A more effective approach is to frame instructions in the style of source code comments (e.g. # this function calculates the absolute value of an integer) or to present a function signature and docstring, enabling the model to complete the function's body. ### How to Use ```python # pip install -q transformers import torch from transformers import AutoModelForCausalLM, AutoTokenizer checkpoint = "Deci/DeciCoder-1b" device = "cuda" # for GPU usage or "cpu" for CPU usage tokenizer = AutoTokenizer.from_pretrained(checkpoint) model = AutoModelForCausalLM.from_pretrained(checkpoint, torch_dtype=torch.bfloat16, trust_remote_code=True).to(device) inputs = tokenizer.encode("def print_hello_world():", return_tensors="pt").to(device) outputs = model.generate(inputs, max_new_tokens=100) print(tokenizer.decode(outputs[0])) ``` ### Attribution DeciCoder was trained on the StarCoder Training Dataset, filtered for Python, Java, and JavaScript code. For additional information, please refer to [https://huggingface.co/datasets/bigcode/starcoderdata](https://huggingface.co/datasets/bigcode/starcoderdata). 
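The architecture table above specifies 32 attention heads but only 4 `num_key_value_heads`, i.e. eight query heads share each key/value head. The following is a minimal NumPy sketch of that head-grouping idea from Ainslie et al., 2023 — an illustration only, not DeciCoder's actual implementation:

```python
# Minimal sketch of grouped-query attention (GQA): many query heads share
# a smaller set of key/value heads. Illustration only -- NOT DeciCoder's
# actual implementation.
import numpy as np

def gqa(q, k, v, num_heads=32, num_kv_heads=4):
    """q: (seq, num_heads, d); k, v: (seq, num_kv_heads, d)."""
    seq, _, d = q.shape
    group = num_heads // num_kv_heads          # query heads per KV head (8 here)
    # Repeat each KV head so every query head has a matching K/V tensor.
    k = np.repeat(k, group, axis=1)            # (seq, num_heads, d)
    v = np.repeat(v, group, axis=1)
    scores = np.einsum("qhd,khd->hqk", q, k) / np.sqrt(d)
    # Causal mask: position i may only attend to positions <= i.
    mask = np.triu(np.ones((seq, seq), dtype=bool), k=1)
    scores = np.where(mask, -1e9, scores)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return np.einsum("hqk,khd->qhd", weights, v)

rng = np.random.default_rng(0)
seq, heads, kv_heads, d = 16, 32, 4, 64
out = gqa(rng.normal(size=(seq, heads, d)),
          rng.normal(size=(seq, kv_heads, d)),
          rng.normal(size=(seq, kv_heads, d)),
          num_heads=heads, num_kv_heads=kv_heads)
print(out.shape)  # (16, 32, 64)
```

The practical benefit is a KV cache 8x smaller than full multi-head attention at the same number of query heads, which is what makes GQA attractive for inference throughput.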
### Limitations The model has undergone training with source code from Python, Java, and JavaScript. While the primary natural language in the sources is English, they do contain other languages. The model can therefore produce code snippets given some context, but there's no assurance that the resulting code will function as expected. It might be suboptimal, contain bugs, or even exploits. ## Training Details ### Training Data DeciCoder was trained on the Python, Java, and JavaScript subsets of the [StarCoder Training Dataset](https://huggingface.co/datasets/bigcode/starcoderdata). ### Training Procedure - **Warm-Up Steps**: 9000 - **Total Training Steps**: 284k - **Total Tokens**: 446B - **Global Batch Size**: 768 - **Optimizer**: AdamW - **Optimizer Parameters**: beta1=0.9, beta2=0.95 - **Weight Decay**: 0.1 - **Learning Rate**: 4e-4 - **Learning Rate Schedule**: cosine ## Evaluation Below are DeciCoder's pass@1 scores on the MultiPL-E HumanEval benchmark: | Python | JavaScript | Java | |:----------|:----------|:----------| | 19.1% | 18.4% | 16.6% | ### Runtime Benchmarks |Inference Tool/Hardware | A10 (tokens/sec) |A100 (tokens/sec) | |:----------|:----------|:----------| | PyTorch | 1,364.2 | 3,244.4 | | Infery LLM | 3,889.3 | 11,676.8 | - Throughput (tokens/sec) measured with the optimal batch size per hardware: A10 at BS 128, A100 at BS 512 ## Documentation - [Notebook](https://colab.research.google.com/drive/1JCxvBsWCZKHfIcHSMVf7GZCs3ClMQPjs) - Blog post: [Introducing DeciCoder: The New Gold Standard in Efficient and Accurate Code Generation](https://deci.ai/blog/decicoder-efficient-and-accurate-code-generation-llm/) - Questions: Feel free to contact us via our [Discord Community!](https://discord.com/invite/p9ecgRhDR8/) ## How to Cite Please cite this model using this format. ```bibtex @misc{DeciFoundationModels, title = {DeciCoder}, author = {DeciAI Research Team}, year = {2023}, url = {https://huggingface.co/deci/decicoder-1b}, } ```
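The pass@1 figures in the Evaluation section are instances of the unbiased pass@k estimator commonly used for HumanEval-style benchmarks (Chen et al., 2021). A generic sketch of that estimator follows — it illustrates the metric, not the exact evaluation harness used to produce the scores above:

```python
# Unbiased pass@k estimator for HumanEval-style code benchmarks
# (Chen et al., 2021). Generic sketch -- not the exact harness used
# to produce the scores above.
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """n: samples generated per problem, c: samples that passed, k: budget."""
    if n - c < k:
        return 1.0  # fewer failures than the budget -> some drawn sample always passes
    # 1 - P(all k drawn samples fail) = 1 - C(n-c, k) / C(n, k)
    return 1.0 - comb(n - c, k) / comb(n, k)

# pass@1 reduces to the fraction of passing samples:
print(round(pass_at_k(n=200, c=40, k=1), 3))  # 0.2
```

The per-problem estimates are then averaged over the benchmark's problems to get the reported percentage.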
5,704
[ [ -0.029754638671875, -0.042816162109375, 0.0118408203125, 0.000720977783203125, -0.02532958984375, -0.001529693603515625, -0.025146484375, -0.044158935546875, 0.0180816650390625, 0.038818359375, -0.030975341796875, -0.057373046875, -0.051605224609375, 0.02447509765625, -0.0181121826171875, 0.0721435546875, 0.004150390625, -0.0013284683227539062, -0.0203399658203125, 0.0009503364562988281, -0.0212249755859375, -0.05535888671875, -0.038787841796875, 0.0008115768432617188, 0.0209503173828125, 0.03070068359375, 0.029083251953125, 0.06494140625, 0.06329345703125, 0.02581787109375, -0.004680633544921875, -0.01061248779296875, -0.025299072265625, -0.0171966552734375, 0.006622314453125, -0.0443115234375, -0.033050537109375, -0.0065460205078125, 0.0295257568359375, 0.02008056640625, -0.0028171539306640625, 0.03167724609375, -0.01448822021484375, 0.04547119140625, -0.035797119140625, 0.038543701171875, -0.0428466796875, -0.0191192626953125, -0.0026798248291015625, 0.00237274169921875, -0.0289306640625, -0.0340576171875, -0.026763916015625, -0.033782958984375, 0.02392578125, -0.01029205322265625, 0.07037353515625, 0.04388427734375, -0.02587890625, -0.0328369140625, -0.0374755859375, 0.052703857421875, -0.07427978515625, 0.021240234375, 0.02691650390625, 0.01271820068359375, 0.0159759521484375, -0.06866455078125, -0.042266845703125, -0.0217132568359375, -0.0070648193359375, 0.00682830810546875, -0.0024890899658203125, 0.01435089111328125, 0.05413818359375, 0.028472900390625, -0.038818359375, 0.0021991729736328125, -0.0570068359375, -0.01131439208984375, 0.041839599609375, 0.017242431640625, 0.0028228759765625, -0.0232086181640625, -0.0292816162109375, -0.021575927734375, -0.036468505859375, 0.0147247314453125, 0.0308380126953125, 0.005748748779296875, -0.0447998046875, 0.02996826171875, -0.0094757080078125, 0.05853271484375, 0.01251220703125, -0.0178985595703125, 0.03912353515625, -0.034332275390625, -0.0206146240234375, 0.004058837890625, 0.08056640625, 0.03045654296875, 
0.01332855224609375, 0.0017442703247070312, -0.0026760101318359375, 0.02783203125, 0.018310546875, -0.11090087890625, -0.0301666259765625, 0.03521728515625, -0.028656005859375, -0.038177490234375, 0.00580596923828125, -0.05224609375, -0.0010404586791992188, -0.0214385986328125, 0.00507354736328125, -0.03875732421875, -0.016326904296875, 0.0115203857421875, -0.01172637939453125, 0.0273590087890625, 0.00420379638671875, -0.0771484375, 0.01424407958984375, 0.051177978515625, 0.052398681640625, 0.0126953125, -0.013824462890625, -0.0031452178955078125, -0.024017333984375, -0.02923583984375, 0.030975341796875, -0.034698486328125, -0.0214080810546875, -0.02386474609375, 0.004024505615234375, -0.0206756591796875, -0.026153564453125, 0.03204345703125, -0.052215576171875, 0.0116729736328125, -0.0308685302734375, -0.039794921875, -0.019866943359375, 0.0181427001953125, -0.0465087890625, 0.0506591796875, 0.0081329345703125, -0.06878662109375, 0.0247344970703125, -0.0570068359375, -0.00988006591796875, 0.00356292724609375, -0.0142364501953125, -0.0295257568359375, -0.0030956268310546875, 0.0084991455078125, 0.01654052734375, -0.036346435546875, 0.02752685546875, -0.01074981689453125, -0.040771484375, 0.02764892578125, -0.03900146484375, 0.08331298828125, 0.027587890625, -0.0311279296875, -0.00504302978515625, -0.05987548828125, 0.004940032958984375, 0.0275115966796875, -0.02459716796875, 0.010284423828125, -0.0261077880859375, 0.003719329833984375, 0.020416259765625, 0.042449951171875, -0.044525146484375, 0.025390625, -0.0318603515625, 0.053009033203125, 0.0211029052734375, 0.0159454345703125, 0.00827789306640625, -0.0233306884765625, 0.041351318359375, 0.003955841064453125, 0.026763916015625, -0.034393310546875, -0.0283203125, -0.05816650390625, -0.0273284912109375, 0.04266357421875, 0.0379638671875, -0.045166015625, 0.05841064453125, -0.029876708984375, -0.04827880859375, -0.04107666015625, 0.00543975830078125, 0.04840087890625, 0.030609130859375, 0.0249481201171875, 
-0.018218994140625, -0.047088623046875, -0.06622314453125, 0.0301971435546875, -0.027740478515625, 0.006622314453125, 0.007266998291015625, 0.057891845703125, -0.018524169921875, 0.058624267578125, -0.03594970703125, -0.0013360977172851562, -0.017822265625, -0.01142120361328125, 0.039703369140625, 0.044921875, 0.049163818359375, -0.052032470703125, -0.032470703125, -0.021087646484375, -0.06427001953125, 0.0116119384765625, -0.005039215087890625, -0.004299163818359375, 0.0179290771484375, 0.03448486328125, -0.040740966796875, 0.024139404296875, 0.048675537109375, -0.0275115966796875, 0.040771484375, -0.00839996337890625, 0.0281219482421875, -0.09619140625, 0.03204345703125, -0.006816864013671875, 0.0007848739624023438, -0.034759521484375, 0.0199737548828125, 0.022003173828125, -0.0019092559814453125, -0.035308837890625, 0.018951416015625, -0.0266265869140625, 0.020965576171875, -0.0144500732421875, -0.014434814453125, 0.0116119384765625, 0.0684814453125, 0.003704071044921875, 0.06866455078125, 0.052520751953125, -0.037322998046875, 0.0264129638671875, 0.0250396728515625, -0.0270233154296875, 0.01568603515625, -0.0740966796875, 0.0074920654296875, -0.0007038116455078125, 0.01259613037109375, -0.0784912109375, -0.01165008544921875, 0.034576416015625, -0.05078125, 0.029449462890625, -0.055206298828125, -0.044677734375, -0.046142578125, -0.0163116455078125, 0.03826904296875, 0.061279296875, -0.0311279296875, 0.01457977294921875, 0.0172576904296875, 0.005832672119140625, -0.03826904296875, -0.05413818359375, 0.00913238525390625, -0.0209503173828125, -0.0418701171875, 0.0193634033203125, -0.027740478515625, -0.00905609130859375, -0.02252197265625, -0.0095977783203125, -0.0061798095703125, -0.0012531280517578125, 0.0310211181640625, 0.033294677734375, -0.01165008544921875, -0.0239105224609375, -0.01800537109375, -0.016204833984375, 0.0272674560546875, -0.0161285400390625, 0.05560302734375, -0.02337646484375, -0.026336669921875, -0.041015625, -0.01261138916015625, 
0.057525634765625, -0.0308990478515625, 0.050018310546875, 0.033966064453125, -0.01800537109375, 0.0143890380859375, -0.042388916015625, -0.0157928466796875, -0.038421630859375, 0.0194244384765625, -0.035858154296875, -0.028900146484375, 0.05560302734375, 0.031707763671875, 0.0200653076171875, 0.04754638671875, 0.034332275390625, 0.0225677490234375, 0.060821533203125, 0.03558349609375, 0.0011529922485351562, 0.04052734375, -0.07794189453125, -0.0082855224609375, -0.07086181640625, -0.0301971435546875, -0.04632568359375, -0.01255035400390625, -0.054901123046875, -0.03558349609375, 0.0153961181640625, 0.01325225830078125, -0.035552978515625, 0.057098388671875, -0.060211181640625, 0.031402587890625, 0.03961181640625, 0.011627197265625, 0.006622314453125, 0.0013132095336914062, -0.0088958740234375, 0.00036644935607910156, -0.049560546875, -0.0300445556640625, 0.09246826171875, 0.0272674560546875, 0.06195068359375, -0.0017566680908203125, 0.06494140625, -0.0026988983154296875, 0.01380157470703125, -0.052398681640625, 0.034088134765625, -0.006229400634765625, -0.057281494140625, -0.0010824203491210938, -0.039306640625, -0.059295654296875, 0.01605224609375, 0.00630950927734375, -0.049041748046875, 0.02850341796875, 0.000247955322265625, -0.027008056640625, 0.021942138671875, -0.04766845703125, 0.0838623046875, -0.03009033203125, -0.03802490234375, 0.0003910064697265625, -0.04931640625, 0.03253173828125, -0.0016775131225585938, 0.004077911376953125, 0.0011806488037109375, -0.00007265806198120117, 0.0814208984375, -0.0426025390625, 0.0732421875, -0.0208587646484375, -0.0031299591064453125, 0.04071044921875, 0.004917144775390625, 0.036956787109375, 0.0074005126953125, -0.0176239013671875, 0.04791259765625, 0.009368896484375, -0.0280914306640625, -0.0203399658203125, 0.061279296875, -0.06494140625, -0.03765869140625, -0.03271484375, -0.0198516845703125, 0.01276397705078125, 0.02337646484375, 0.036865234375, 0.066650390625, 0.004306793212890625, 0.0272064208984375, 
0.037200927734375, -0.0010328292846679688, 0.048553466796875, 0.040557861328125, -0.00485992431640625, -0.06170654296875, 0.06622314453125, -0.003391265869140625, 0.01413726806640625, 0.026702880859375, 0.000621795654296875, -0.03619384765625, -0.045654296875, -0.052398681640625, 0.032623291015625, -0.0570068359375, -0.025909423828125, -0.06390380859375, -0.0228271484375, -0.0261993408203125, -0.010894775390625, -0.0350341796875, -0.0275115966796875, -0.018524169921875, -0.004276275634765625, 0.042816162109375, 0.029937744140625, 0.01103973388671875, 0.017547607421875, -0.0570068359375, 0.01369476318359375, 0.0006995201110839844, 0.0214691162109375, -0.00933074951171875, -0.05841064453125, -0.0296478271484375, 0.0233306884765625, -0.0150909423828125, -0.04840087890625, 0.030181884765625, -0.01556396484375, 0.0309295654296875, 0.031951904296875, 0.01039886474609375, 0.0556640625, -0.01467132568359375, 0.070068359375, 0.02752685546875, -0.04534912109375, 0.032470703125, -0.00955963134765625, 0.0282745361328125, 0.06549072265625, 0.0283050537109375, -0.017120361328125, -0.01904296875, -0.06951904296875, -0.06622314453125, 0.06292724609375, 0.0183258056640625, 0.00179290771484375, -0.0119781494140625, 0.0141754150390625, -0.0058441162109375, 0.026092529296875, -0.03936767578125, -0.032196044921875, -0.0266876220703125, -0.026763916015625, -0.008636474609375, -0.014495849609375, -0.0013990402221679688, -0.0277252197265625, 0.032989501953125, -0.0017461776733398438, 0.03680419921875, 0.00550079345703125, -0.0197601318359375, -0.0003204345703125, 0.000720977783203125, 0.042449951171875, 0.06317138671875, -0.0301055908203125, -0.006256103515625, 0.006763458251953125, -0.04833984375, -0.0024738311767578125, 0.022857666015625, -0.0132904052734375, -0.005207061767578125, 0.037994384765625, 0.06512451171875, 0.01248931884765625, -0.0404052734375, 0.043212890625, -0.0087738037109375, -0.027130126953125, -0.04437255859375, 0.0208892822265625, -0.002864837646484375, 
0.0012407302856445312, 0.0166168212890625, 0.021697998046875, 0.005535125732421875, -0.020843505859375, 0.01551055908203125, 0.0148162841796875, -0.030853271484375, -0.03131103515625, 0.082275390625, 0.0204620361328125, -0.00457000732421875, 0.055023193359375, -0.0194091796875, -0.03857421875, 0.08978271484375, 0.036712646484375, 0.0640869140625, 0.00655364990234375, -0.0024394989013671875, 0.060821533203125, 0.0226287841796875, 0.0036640167236328125, 0.01763916015625, -0.01476287841796875, -0.035797119140625, -0.03094482421875, -0.0599365234375, -0.017333984375, 0.0025768280029296875, -0.043243408203125, 0.030487060546875, -0.043121337890625, -0.01543426513671875, 0.0081634521484375, 0.034149169921875, -0.0721435546875, 0.0117034912109375, 0.01248931884765625, 0.080810546875, -0.051177978515625, 0.06494140625, 0.051544189453125, -0.06964111328125, -0.04815673828125, -0.0167083740234375, -0.01297760009765625, -0.06390380859375, 0.044219970703125, 0.020538330078125, 0.012237548828125, 0.0177154541015625, -0.033599853515625, -0.06939697265625, 0.0877685546875, 0.012420654296875, -0.0567626953125, -0.003604888916015625, 0.00673675537109375, 0.043731689453125, -0.0249176025390625, 0.0643310546875, 0.032958984375, 0.034210205078125, 0.03814697265625, -0.058563232421875, 0.01546478271484375, -0.02838134765625, -0.0014057159423828125, 0.00969696044921875, -0.065673828125, 0.060394287109375, -0.03204345703125, -0.00479888916015625, -0.003559112548828125, 0.030853271484375, 0.0227508544921875, 0.0066986083984375, 0.0261993408203125, 0.0472412109375, 0.040374755859375, -0.002170562744140625, 0.085205078125, -0.0791015625, 0.0266571044921875, 0.053558349609375, 0.00528717041015625, 0.0478515625, 0.00705718994140625, -0.02471923828125, 0.045257568359375, 0.044586181640625, -0.02301025390625, 0.035919189453125, 0.00185394287109375, 0.0020503997802734375, -0.01183319091796875, 0.01287078857421875, -0.040863037109375, 0.0268402099609375, 0.0251617431640625, -0.0350341796875, 
-0.00598907470703125, -0.01322174072265625, 0.017669677734375, -0.0206756591796875, -0.00205230712890625, 0.06695556640625, -0.002948760986328125, -0.048583984375, 0.09912109375, 0.01171112060546875, 0.060638427734375, -0.048675537109375, -0.0029964447021484375, -0.02606201171875, 0.01270294189453125, -0.0187530517578125, -0.03387451171875, 0.01219940185546875, 0.0105743408203125, -0.01468658447265625, -0.0024871826171875, 0.02099609375, -0.0275421142578125, -0.05023193359375, 0.00015544891357421875, 0.00487518310546875, 0.0251007080078125, -0.0005841255187988281, -0.06341552734375, 0.03912353515625, 0.024261474609375, -0.0196685791015625, 0.0081634521484375, 0.0207366943359375, 0.01502227783203125, 0.03814697265625, 0.058990478515625, -0.01331329345703125, -0.003337860107421875, 0.00118255615234375, 0.06610107421875, -0.06781005859375, -0.034820556640625, -0.0714111328125, 0.048797607421875, -0.00887298583984375, -0.027130126953125, 0.05462646484375, 0.074951171875, 0.056610107421875, -0.007965087890625, 0.06475830078125, -0.035125732421875, 0.0030536651611328125, -0.032379150390625, 0.033447265625, -0.023101806640625, 0.031646728515625, 0.00299835205078125, -0.06671142578125, -0.0024623870849609375, 0.034637451171875, -0.0267791748046875, 0.01551055908203125, 0.055755615234375, 0.08056640625, -0.01146697998046875, -0.00826263427734375, 0.0198822021484375, 0.0213775634765625, 0.0300445556640625, 0.059051513671875, 0.03729248046875, -0.06304931640625, 0.057708740234375, -0.043731689453125, -0.017974853515625, -0.0247650146484375, -0.04071044921875, -0.07861328125, -0.03900146484375, -0.0445556640625, -0.03466796875, -0.0034885406494140625, 0.0762939453125, 0.04962158203125, -0.06414794921875, -0.0160064697265625, -0.01519775390625, -0.0038433074951171875, -0.0148162841796875, -0.017974853515625, 0.030609130859375, -0.02227783203125, -0.06982421875, 0.01236724853515625, -0.0182952880859375, 0.00244903564453125, -0.0156097412109375, -0.006381988525390625, 
0.01094818115234375, -0.00145721435546875, 0.0258026123046875, 0.022064208984375, -0.04107666015625, -0.0016107559204101562, 0.00780487060546875, -0.0289459228515625, 0.0138702392578125, 0.04193115234375, -0.048431396484375, 0.019561767578125, 0.040008544921875, 0.0170745849609375, 0.05401611328125, 0.0212249755859375, 0.00664520263671875, -0.0294342041015625, 0.0191802978515625, 0.00824737548828125, 0.025482177734375, 0.0240020751953125, -0.02618408203125, 0.03375244140625, 0.0213165283203125, -0.044708251953125, -0.03857421875, -0.0007395744323730469, -0.07562255859375, -0.0153350830078125, 0.09979248046875, 0.0018815994262695312, -0.0340576171875, 0.006244659423828125, -0.01456451416015625, 0.01338958740234375, -0.03753662109375, 0.048797607421875, 0.020355224609375, 0.00823974609375, -0.005828857421875, -0.044097900390625, 0.03350830078125, -0.017120361328125, -0.043365478515625, 0.01236724853515625, 0.04217529296875, 0.050872802734375, 0.0231170654296875, 0.046661376953125, -0.027801513671875, 0.011260986328125, 0.03240966796875, 0.043212890625, -0.04547119140625, -0.0223236083984375, -0.0242919921875, 0.003292083740234375, 0.006732940673828125, -0.03802490234375 ] ]
WizardLM/WizardCoder-Python-7B-V1.0
2023-09-09T06:44:39.000Z
[ "transformers", "pytorch", "llama", "text-generation", "code", "arxiv:2304.12244", "arxiv:2306.08568", "arxiv:2308.09583", "arxiv:2303.08774", "license:llama2", "model-index", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
WizardLM
null
null
WizardLM/WizardCoder-Python-7B-V1.0
52
9,994
transformers
2023-08-29T10:08:55
--- license: llama2 metrics: - code_eval library_name: transformers tags: - code model-index: - name: WizardCoder-Python-7B-V1.0 results: - task: type: text-generation dataset: type: openai_humaneval name: HumanEval metrics: - name: pass@1 type: pass@1 value: 0.555 verified: false --- <p align="center"> 🤗 <a href="https://huggingface.co/WizardLM" target="_blank">HF Repo</a> • 🐱 <a href="https://github.com/nlpxucan/WizardLM" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/WizardLM_AI" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> • 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> • 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a> <br> </p> <p align="center"> 👋 Join our <a href="https://discord.gg/VZjjHtWrKs" target="_blank">Discord</a> </p> ## News - 🔥🔥🔥[2023/08/26] We released **WizardCoder-Python-34B-V1.0**, which achieves **73.2 pass@1** and surpasses **GPT4 (2023/03/15)**, **ChatGPT-3.5**, and **Claude2** on the [HumanEval Benchmarks](https://github.com/openai/human-eval). - [2023/06/16] We released **WizardCoder-15B-V1.0**, which achieves **57.3 pass@1** and surpasses **Claude-Plus (+6.8)**, **Bard (+15.3)**, and **InstructCodeT5+ (+22.3)** on the [HumanEval Benchmarks](https://github.com/openai/human-eval). ❗Note: There are two sets of HumanEval results for GPT4 and ChatGPT-3.5. The 67.0 and 48.1 were reported in the official GPT4 Report (2023/03/15) by [OpenAI](https://arxiv.org/abs/2303.08774). The 82.0 and 72.5 were measured by us with the latest API (2023/08/26). 
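The pass@1 scores quoted in the card follow the standard unbiased pass@k estimator used with HumanEval; a minimal pure-Python sketch (the function name is mine, not part of the official evaluation harness):

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: probability that at least one of k
    samples, drawn from n generations of which c are correct, passes."""
    if n - c < k:
        # Fewer than k incorrect samples exist, so any draw of k must
        # contain at least one correct sample.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# With a single generation per problem (n=1, k=1), pass@1 reduces to the
# fraction of problems solved:
print(pass_at_k(1, 1, 1))  # 1.0
```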
| Model | Checkpoint | Paper | HumanEval | MBPP | Demo | License | | ----- |------| ---- |------|-------| ----- | ----- | | WizardCoder-Python-34B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-34B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 73.2 | 61.2 | [Demo](http://47.103.63.15:50085/) | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> | | WizardCoder-15B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-15B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 59.8 |50.6 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> | | WizardCoder-Python-13B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-13B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 64.0 | 55.6 | -- | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> | | WizardCoder-Python-7B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-7B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 55.5 | 51.6 | [Demo](http://47.103.63.15:50088/) | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> | | WizardCoder-3B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-3B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 34.8 |37.4 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> | | WizardCoder-1B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-1B-V1.0" target="_blank">HF Link</a> | 📃 <a 
href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 23.8 |28.6 | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> | - Our **WizardMath-70B-V1.0** model slightly outperforms some closed-source LLMs on the GSM8K, including **ChatGPT 3.5**, **Claude Instant 1** and **PaLM 2 540B**. - Our **WizardMath-70B-V1.0** model achieves **81.6 pass@1** on the [GSM8k Benchmarks](https://github.com/openai/grade-school-math), which is **24.8** points higher than the SOTA open-source LLM, and achieves **22.7 pass@1** on the [MATH Benchmarks](https://github.com/hendrycks/math), which is **9.2** points higher than the SOTA open-source LLM. <font size=4> | Model | Checkpoint | Paper | GSM8k | MATH |Online Demo| License| | ----- |------| ---- |------|-------| ----- | ----- | | WizardMath-70B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-70B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a>| **81.6** | **22.7** |[Demo](http://47.103.63.15:50083/)| <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 </a> | | WizardMath-13B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-13B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a>| **63.9** | **14.0** |[Demo](http://47.103.63.15:50082/)| <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 </a> | | WizardMath-7B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardMath-7B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a>| **54.9** | **10.7** | [Demo ](http://47.103.63.15:50080/)| <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 </a>| </font> - [08/09/2023] We released **WizardLM-70B-V1.0** 
model. Here is [Full Model Weight](https://huggingface.co/WizardLM/WizardLM-70B-V1.0). <font size=4> | <sup>Model</sup> | <sup>Checkpoint</sup> | <sup>Paper</sup> |<sup>MT-Bench</sup> | <sup>AlpacaEval</sup> | <sup>GSM8k</sup> | <sup>HumanEval</sup> | <sup>License</sup>| | ----- |------| ---- |------|-------| ----- | ----- | ----- | | <sup>**WizardLM-70B-V1.0**</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-70B-V1.0" target="_blank">HF Link</a> </sup>|<sup>📃**Coming Soon**</sup>| <sup>**7.78**</sup> | <sup>**92.91%**</sup> |<sup>**77.6%**</sup> | <sup> **50.6**</sup>|<sup> <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 License </a></sup> | | <sup>WizardLM-13B-V1.2</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-13B-V1.2" target="_blank">HF Link</a> </sup>| | <sup>7.06</sup> | <sup>89.17%</sup> |<sup>55.3%</sup> | <sup>36.6 </sup>|<sup> <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama 2 License </a></sup> | | <sup>WizardLM-13B-V1.1</sup> |<sup> 🤗 <a href="https://huggingface.co/WizardLM/WizardLM-13B-V1.1" target="_blank">HF Link</a> </sup> | | <sup>6.76</sup> |<sup>86.32%</sup> | | <sup>25.0 </sup>| <sup>Non-commercial</sup>| | <sup>WizardLM-30B-V1.0</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-30B-V1.0" target="_blank">HF Link</a></sup> | | <sup>7.01</sup> | | | <sup>37.8 </sup>| <sup>Non-commercial</sup> | | <sup>WizardLM-13B-V1.0</sup> | <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-13B-V1.0" target="_blank">HF Link</a> </sup> | | <sup>6.35</sup> | <sup>75.31%</sup> | | <sup> 24.0 </sup> | <sup>Non-commercial</sup>| | <sup>WizardLM-7B-V1.0 </sup>| <sup>🤗 <a href="https://huggingface.co/WizardLM/WizardLM-7B-V1.0" target="_blank">HF Link</a> </sup> |<sup> 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> </sup>| | | |<sup>19.1 </sup>|<sup> Non-commercial</sup>| </font> 
## Comparing WizardCoder-Python-34B-V1.0 with Other LLMs 🔥 The following figure shows that our **WizardCoder-Python-34B-V1.0 attains the second position in this benchmark**, surpassing GPT4 (2023/03/15, 73.2 vs. 67.0), ChatGPT-3.5 (73.2 vs. 72.5), and Claude2 (73.2 vs. 71.2). <p align="center" width="100%"> <a><img src="https://raw.githubusercontent.com/nlpxucan/WizardLM/main/WizardCoder/imgs/compare_sota.png" alt="WizardCoder" style="width: 96%; min-width: 300px; display: block; margin: auto;"></a> </p> ## Prompt Format ``` "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n{instruction}\n\n### Response:" ``` ## Inference Demo Script We provide the inference demo code [here](https://github.com/nlpxucan/WizardLM/tree/main/demo). ## Citation Please cite this repo if you use its data, method, or code. ``` @article{luo2023wizardcoder, title={WizardCoder: Empowering Code Large Language Models with Evol-Instruct}, author={Luo, Ziyang and Xu, Can and Zhao, Pu and Sun, Qingfeng and Geng, Xiubo and Hu, Wenxiang and Tao, Chongyang and Ma, Jing and Lin, Qingwei and Jiang, Daxin}, journal={arXiv preprint arXiv:2306.08568}, year={2023} } ```
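The Alpaca-style template in the card's Prompt Format section can be filled in with plain string formatting; a minimal sketch (the helper name is mine, and actual generation via `transformers` would additionally require downloading the model weights, so only prompt construction is shown):

```python
# Template copied verbatim from the Prompt Format section of the card.
PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:"
)

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the template the model was trained on."""
    return PROMPT_TEMPLATE.format(instruction=instruction)

prompt = build_prompt("Write a Python function that reverses a string.")
print(prompt)
```

The resulting string is what you would pass as the raw input to the model (or to a serving framework's completion endpoint), with generation stopping once the model has answered after the `### Response:` marker.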
9,128
[ [ -0.047698974609375, -0.033416748046875, -0.007030487060546875, 0.026611328125, 0.003368377685546875, -0.01239776611328125, 0.002590179443359375, -0.036102294921875, 0.015869140625, 0.0212554931640625, -0.048828125, -0.046966552734375, -0.038665771484375, 0.0204620361328125, -0.005596160888671875, 0.061767578125, -0.01213836669921875, -0.016754150390625, -0.01338958740234375, -0.0135498046875, -0.01033782958984375, -0.036041259765625, -0.016815185546875, -0.03375244140625, 0.030670166015625, 0.0016336441040039062, 0.0655517578125, 0.0341796875, 0.02740478515625, 0.0216217041015625, -0.0163421630859375, 0.0367431640625, -0.0111236572265625, -0.01971435546875, 0.0182037353515625, -0.0182342529296875, -0.07159423828125, -0.00153350830078125, 0.043731689453125, 0.0205841064453125, -0.00029659271240234375, 0.027862548828125, 0.00432586669921875, 0.06951904296875, -0.04052734375, 0.0191650390625, -0.0178985595703125, 0.022369384765625, -0.01318359375, -0.008758544921875, 0.00591278076171875, -0.043212890625, 0.000335693359375, -0.0587158203125, -0.00420379638671875, 0.00836181640625, 0.08526611328125, 0.0156402587890625, -0.017608642578125, -0.003726959228515625, -0.01776123046875, 0.0474853515625, -0.06365966796875, 0.0254669189453125, 0.03802490234375, 0.01062774658203125, -0.0401611328125, -0.0438232421875, -0.064697265625, -0.01055145263671875, -0.007427215576171875, 0.0074005126953125, -0.0294952392578125, -0.0159759521484375, 0.0239715576171875, 0.025665283203125, -0.05126953125, -0.007328033447265625, -0.026397705078125, -0.021270751953125, 0.06085205078125, 0.0135345458984375, 0.035552978515625, -0.015869140625, -0.0009369850158691406, -0.01436614990234375, -0.037078857421875, 0.014892578125, 0.0293121337890625, -0.00887298583984375, -0.034454345703125, 0.05511474609375, -0.0067291259765625, 0.044586181640625, 0.00966644287109375, -0.04693603515625, 0.05029296875, -0.029205322265625, -0.0140838623046875, -0.0075531005859375, 0.08428955078125, 
0.042083740234375, 0.01025390625, 0.00411224365234375, 0.0014362335205078125, -0.0160369873046875, 0.0003192424774169922, -0.06793212890625, -0.009185791015625, 0.0234832763671875, -0.03863525390625, -0.0152740478515625, -0.0180816650390625, -0.058837890625, -0.0286712646484375, -0.0006084442138671875, 0.0201568603515625, -0.047943115234375, -0.0234832763671875, 0.01751708984375, -0.0088348388671875, 0.0477294921875, 0.039886474609375, -0.0635986328125, 0.0163116455078125, 0.037872314453125, 0.061553955078125, -0.011505126953125, -0.04058837890625, -0.008392333984375, -0.000054776668548583984, -0.027130126953125, 0.040863037109375, 0.0024776458740234375, -0.032318115234375, -0.00823211669921875, -0.005687713623046875, -0.00806427001953125, -0.026458740234375, 0.03070068359375, -0.03363037109375, 0.01959228515625, -0.00965118408203125, -0.035980224609375, -0.0172576904296875, 0.0247039794921875, -0.0455322265625, 0.08258056640625, 0.01288604736328125, -0.07464599609375, 0.0021381378173828125, -0.045867919921875, -0.007568359375, -0.029632568359375, -0.005512237548828125, -0.048675537109375, -0.02056884765625, 0.0206756591796875, 0.01702880859375, -0.03466796875, -0.014923095703125, -0.020172119140625, -0.0203399658203125, 0.0200653076171875, -0.039947509765625, 0.09100341796875, 0.0169677734375, -0.0293121337890625, -0.00634765625, -0.0703125, -0.0006742477416992188, 0.044342041015625, -0.037322998046875, 0.007083892822265625, -0.0240478515625, -0.009002685546875, 0.01239013671875, 0.05072021484375, -0.0237884521484375, 0.03680419921875, -0.035736083984375, -0.00644683837890625, 0.0537109375, -0.009368896484375, 0.027740478515625, -0.0360107421875, 0.0301361083984375, -0.011627197265625, 0.0302276611328125, 0.01233673095703125, -0.052947998046875, -0.061187744140625, -0.029571533203125, 0.00494384765625, 0.05328369140625, -0.036285400390625, 0.0760498046875, -0.020263671875, -0.06683349609375, -0.039825439453125, 0.0210723876953125, 0.029693603515625, 
0.036590576171875, 0.04046630859375, -0.016448974609375, -0.024444580078125, -0.058319091796875, -0.00586700439453125, -0.0239105224609375, -0.006290435791015625, 0.02301025390625, 0.045562744140625, -0.03369140625, 0.073486328125, -0.043853759765625, -0.01898193359375, -0.00428009033203125, -0.017547607421875, 0.0296630859375, 0.04608154296875, 0.045654296875, -0.0455322265625, -0.033905029296875, 0.011810302734375, -0.06842041015625, -0.007144927978515625, 0.00799560546875, -0.0240936279296875, 0.026031494140625, -0.0079345703125, -0.06719970703125, 0.058349609375, 0.022186279296875, -0.041046142578125, 0.06683349609375, -0.026397705078125, 0.00982666015625, -0.0703125, 0.00386810302734375, -0.00777435302734375, 0.009857177734375, -0.0447998046875, -0.0022563934326171875, 0.00560760498046875, 0.0221405029296875, -0.045166015625, 0.059539794921875, -0.035064697265625, -0.0091094970703125, -0.0027904510498046875, -0.01021575927734375, 0.0150146484375, 0.05645751953125, -0.0084075927734375, 0.056121826171875, 0.051849365234375, -0.03192138671875, 0.04034423828125, 0.0294647216796875, -0.01506805419921875, 0.02716064453125, -0.03997802734375, 0.00864410400390625, 0.0074005126953125, 0.0274505615234375, -0.03887939453125, -0.007480621337890625, 0.0472412109375, -0.046661376953125, 0.02484130859375, -0.0024356842041015625, -0.057586669921875, -0.0460205078125, -0.053375244140625, 0.005767822265625, 0.058380126953125, -0.0404052734375, 0.048126220703125, 0.0203857421875, 0.0233001708984375, -0.0596923828125, -0.04095458984375, -0.00904083251953125, -0.01136016845703125, -0.059539794921875, 0.01953125, -0.02288818359375, -0.00856781005859375, -0.00414276123046875, -0.0284881591796875, 0.0010499954223632812, 0.00823211669921875, 0.019134521484375, 0.03411865234375, -0.01201629638671875, -0.0265960693359375, 0.0026645660400390625, -0.0095367431640625, -0.0004131793975830078, -0.021148681640625, 0.03387451171875, -0.0212860107421875, -0.0364990234375, -0.033721923828125, 
0.0006742477416992188, 0.035675048828125, -0.0199432373046875, 0.06396484375, 0.050567626953125, -0.03509521484375, 0.01061248779296875, -0.050323486328125, 0.01111602783203125, -0.04156494140625, 0.0074615478515625, -0.032989501953125, -0.05328369140625, 0.043670654296875, 0.0189056396484375, 0.022979736328125, 0.04437255859375, 0.050140380859375, 0.00911712646484375, 0.067138671875, 0.0311279296875, -0.00659942626953125, 0.03533935546875, -0.0404052734375, 0.004482269287109375, -0.0660400390625, -0.03955078125, -0.034576416015625, 0.0009889602661132812, -0.03765869140625, -0.04931640625, 0.0301666259765625, 0.045196533203125, -0.045623779296875, 0.043365478515625, -0.0697021484375, 0.0160675048828125, 0.0389404296875, 0.002105712890625, 0.01444244384765625, 0.01393890380859375, -0.021575927734375, 0.0170440673828125, -0.0266265869140625, -0.044281005859375, 0.07904052734375, 0.0200347900390625, 0.047088623046875, 0.01506805419921875, 0.0577392578125, 0.0009584426879882812, -0.0024738311767578125, -0.0249786376953125, 0.050048828125, 0.02703857421875, -0.040130615234375, -0.029815673828125, -0.0180511474609375, -0.08258056640625, 0.036041259765625, -0.01403045654296875, -0.08770751953125, 0.0254669189453125, 0.005268096923828125, -0.0183258056640625, 0.03857421875, -0.04034423828125, 0.065185546875, -0.0107574462890625, -0.03411865234375, 0.005535125732421875, -0.033447265625, 0.02032470703125, 0.00902557373046875, 0.00858306884765625, -0.02740478515625, -0.0240631103515625, 0.06292724609375, -0.0809326171875, 0.044036865234375, -0.003387451171875, -0.0230712890625, 0.042083740234375, -0.0038604736328125, 0.03985595703125, -0.0050811767578125, -0.01537322998046875, 0.035888671875, 0.0105743408203125, -0.032684326171875, -0.04400634765625, 0.050506591796875, -0.0830078125, -0.050811767578125, -0.042083740234375, -0.024627685546875, -0.002704620361328125, 0.0183563232421875, 0.022186279296875, 0.01374053955078125, 0.0233612060546875, -0.024169921875, 
0.052154541015625, -0.02764892578125, 0.02880859375, 0.025177001953125, -0.021270751953125, -0.038116455078125, 0.07403564453125, 0.009765625, -0.0059051513671875, 0.0247802734375, 0.0170440673828125, -0.0170135498046875, -0.030487060546875, -0.053131103515625, 0.0283203125, -0.055419921875, -0.0299835205078125, -0.059356689453125, -0.03460693359375, -0.045135498046875, -0.0228271484375, -0.0294189453125, -0.03521728515625, -0.048065185546875, 0.007053375244140625, 0.07952880859375, 0.0300750732421875, -0.0200347900390625, -0.01210784912109375, -0.052520751953125, 0.0266876220703125, 0.026947021484375, 0.01580810546875, 0.021453857421875, -0.039947509765625, -0.013824462890625, -0.0125732421875, -0.0418701171875, -0.06793212890625, 0.045562744140625, -0.01328277587890625, 0.04071044921875, 0.0062103271484375, 0.0019369125366210938, 0.06085205078125, -0.043853759765625, 0.07659912109375, 0.044097900390625, -0.05645751953125, 0.035552978515625, -0.0118560791015625, 0.0240325927734375, 0.0177764892578125, 0.023681640625, -0.0300750732421875, -0.012054443359375, -0.033935546875, -0.0555419921875, 0.050628662109375, 0.0258941650390625, -0.00380706787109375, 0.01255035400390625, 0.014190673828125, 0.0025196075439453125, 0.0019817352294921875, -0.03924560546875, -0.061431884765625, -0.03118896484375, -0.0163726806640625, 0.018829345703125, 0.0018339157104492188, 0.0005617141723632812, -0.0377197265625, 0.052490234375, -0.0047149658203125, 0.044464111328125, 0.026641845703125, -0.00563812255859375, -0.001430511474609375, 0.008544921875, 0.036712646484375, 0.03778076171875, -0.01134490966796875, -0.00968170166015625, 0.03216552734375, -0.0604248046875, 0.0169677734375, 0.0258941650390625, -0.01641845703125, -0.01012420654296875, 0.03509521484375, 0.05828857421875, -0.004482269287109375, -0.03594970703125, 0.046142578125, 0.0030517578125, -0.0191802978515625, -0.03466796875, 0.0133514404296875, 0.01739501953125, 0.030303955078125, 0.033782958984375, 0.005584716796875, 
0.0167236328125, -0.0163421630859375, 0.002208709716796875, 0.0312347412109375, -0.0025787353515625, -0.013885498046875, 0.0526123046875, -0.01444244384765625, -0.0248565673828125, 0.01354217529296875, -0.0230712890625, -0.046661376953125, 0.057708740234375, 0.035308837890625, 0.052276611328125, 0.01081085205078125, -0.01303863525390625, 0.039337158203125, 0.01242828369140625, 0.0012865066528320312, 0.00923919677734375, -0.0081634521484375, -0.03509521484375, -0.00638580322265625, -0.062469482421875, -0.0208740234375, -0.01580810546875, -0.022186279296875, 0.03643798828125, -0.033905029296875, -0.004955291748046875, -0.0151519775390625, 0.03900146484375, -0.0684814453125, -0.0115509033203125, 0.0205535888671875, 0.0897216796875, -0.01305389404296875, 0.0740966796875, 0.0309295654296875, -0.052154541015625, -0.0716552734375, -0.0103759765625, 0.0265655517578125, -0.06689453125, 0.038787841796875, 0.0006756782531738281, -0.0076446533203125, -0.01146697998046875, -0.03192138671875, -0.07464599609375, 0.10797119140625, 0.01529693603515625, -0.0229949951171875, -0.021759033203125, 0.0010356903076171875, 0.028289794921875, -0.00443267822265625, 0.04425048828125, 0.045501708984375, 0.046783447265625, 0.006305694580078125, -0.0994873046875, 0.0216522216796875, -0.042633056640625, 0.0009822845458984375, -0.0131378173828125, -0.06781005859375, 0.061614990234375, -0.0106964111328125, 0.003467559814453125, 0.0205230712890625, 0.058746337890625, 0.059600830078125, 0.0246429443359375, 0.01058197021484375, 0.039703369140625, 0.06097412109375, 0.00852203369140625, 0.0946044921875, -0.0171661376953125, 0.03485107421875, 0.039825439453125, -0.0014123916625976562, 0.037139892578125, 0.014892578125, -0.047698974609375, 0.03900146484375, 0.0526123046875, -0.01239776611328125, 0.031341552734375, 0.03857421875, -0.01233673095703125, 0.005481719970703125, 0.01169586181640625, -0.05584716796875, -0.01233673095703125, 0.01953125, 0.00934600830078125, -0.00234222412109375, 
-0.0010023117065429688, 0.01488494873046875, -0.0180511474609375, -0.0302276611328125, 0.045562744140625, 0.00836181640625, -0.018096923828125, 0.08258056640625, -0.0084228515625, 0.08587646484375, -0.049560546875, -0.00635528564453125, -0.0250396728515625, -0.0011081695556640625, -0.03466796875, -0.055419921875, -0.007232666015625, 0.00862884521484375, -0.0034236907958984375, 0.01265716552734375, 0.059722900390625, -0.00445556640625, -0.0430908203125, 0.0278472900390625, 0.022918701171875, 0.034820556640625, 0.0264434814453125, -0.072998046875, 0.032440185546875, -0.0028171539306640625, -0.050567626953125, 0.03167724609375, 0.037994384765625, 0.0006651878356933594, 0.0560302734375, 0.048065185546875, 0.0025424957275390625, 0.036590576171875, -0.0164794921875, 0.06842041015625, -0.04046630859375, -0.006916046142578125, -0.06463623046875, 0.04144287109375, -0.01537322998046875, -0.02557373046875, 0.08203125, 0.04541015625, 0.051422119140625, -0.00392913818359375, 0.047698974609375, -0.009307861328125, 0.0099029541015625, -0.0180206298828125, 0.0660400390625, -0.06488037109375, 0.00707244873046875, -0.037017822265625, -0.05914306640625, -0.035400390625, 0.0682373046875, -0.01474761962890625, 0.0026531219482421875, 0.032257080078125, 0.07452392578125, 0.012451171875, -0.0175933837890625, 0.01265716552734375, -0.0111236572265625, 0.023529052734375, 0.055633544921875, 0.041748046875, -0.0528564453125, 0.048858642578125, -0.027740478515625, -0.0114288330078125, -0.02703857421875, -0.0452880859375, -0.0780029296875, -0.0303955078125, -0.03033447265625, -0.051300048828125, -0.01233673095703125, 0.094970703125, 0.04486083984375, -0.0487060546875, -0.0156402587890625, 0.00826263427734375, 0.04351806640625, -0.0186920166015625, -0.01358795166015625, 0.056976318359375, 0.0107421875, -0.058624267578125, 0.01192474365234375, 0.01468658447265625, 0.025604248046875, -0.0184326171875, -0.05633544921875, -0.0084075927734375, 0.0185089111328125, 0.0307159423828125, 0.047576904296875, 
-0.058349609375, -0.00445556640625, 0.0004639625549316406, -0.0252227783203125, 0.01366424560546875, 0.0160980224609375, -0.04083251953125, 0.00556182861328125, 0.038482666015625, 0.034210205078125, 0.039825439453125, -0.03643798828125, 0.002750396728515625, -0.01442718505859375, 0.007415771484375, -0.0046539306640625, 0.035736083984375, 0.01169586181640625, -0.032501220703125, 0.045135498046875, 0.0193939208984375, -0.035247802734375, -0.060333251953125, -0.01546478271484375, -0.07550048828125, -0.0091094970703125, 0.0806884765625, -0.004940032958984375, -0.0457763671875, 0.006999969482421875, -0.029052734375, 0.0249786376953125, -0.0372314453125, 0.0223236083984375, 0.028167724609375, -0.0212860107421875, -0.00518798828125, -0.037994384765625, 0.035736083984375, 0.0005578994750976562, -0.057342529296875, -0.0032329559326171875, 0.034912109375, 0.0225677490234375, 0.04229736328125, 0.0703125, -0.021331787109375, 0.026641845703125, 0.018218994140625, 0.03497314453125, -0.02349853515625, 0.01052093505859375, -0.0246124267578125, -0.00547027587890625, -0.0036563873291015625, -0.01412200927734375 ] ]
digiplay/perfectlevel10
2023-08-01T18:49:48.000Z
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "license:other", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
digiplay
null
null
digiplay/perfectlevel10
4
9,979
diffusers
2023-08-01T18:10:46
--- license: other tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers inference: true --- https://civitai.com/models/117591/perfectlevel10 Sample image: ![tmp3nsyt_7b.png](https://cdn-uploads.huggingface.co/production/uploads/646c83c871d0c8a6e4455854/RNZZocA65KkXhGUGsLxmh.png)
312
[ [ -0.03564453125, -0.020660400390625, 0.021697998046875, 0.0504150390625, -0.0167694091796875, 0.0074920654296875, 0.0260772705078125, -0.0257720947265625, 0.0426025390625, 0.03155517578125, -0.049713134765625, -0.066162109375, -0.0186004638671875, -0.00159454345703125, -0.038238525390625, 0.0550537109375, 0.027191162109375, 0.0220184326171875, -0.0009136199951171875, 0.01207733154296875, -0.03363037109375, 0.0089874267578125, -0.06634521484375, -0.02044677734375, 0.0261077880859375, 0.05035400390625, 0.03045654296875, 0.025390625, 0.0181121826171875, 0.0144500732421875, -0.01534271240234375, -0.0181884765625, -0.036712646484375, 0.00048089027404785156, 0.0019779205322265625, -0.0474853515625, -0.07098388671875, 0.016632080078125, 0.06646728515625, 0.0301055908203125, -0.01485443115234375, 0.0293426513671875, 0.007801055908203125, 0.041351318359375, -0.0297698974609375, -0.0037078857421875, -0.00945281982421875, 0.0093536376953125, -0.0230255126953125, -0.01093292236328125, -0.00415802001953125, -0.05035400390625, -0.0186614990234375, -0.08154296875, 0.037384033203125, 0.01751708984375, 0.09466552734375, -0.0089111328125, -0.0411376953125, -0.009979248046875, -0.0275421142578125, 0.025604248046875, -0.0345458984375, 0.0064544677734375, 0.02099609375, 0.037628173828125, -0.0235137939453125, -0.045989990234375, -0.03424072265625, 0.010986328125, 0.0207977294921875, 0.01221466064453125, -0.02996826171875, -0.00940704345703125, 0.043701171875, 0.0670166015625, -0.043975830078125, -0.000736236572265625, -0.046600341796875, 0.0130157470703125, 0.033111572265625, 0.004283905029296875, 0.06597900390625, 0.0174560546875, -0.0134735107421875, -0.0120086669921875, -0.039398193359375, 0.0305328369140625, 0.0279083251953125, -0.0011854171752929688, -0.0517578125, 0.039276123046875, -0.01296234130859375, 0.0557861328125, 0.027862548828125, 0.00803375244140625, 0.0445556640625, -0.004444122314453125, -0.043212890625, -0.02923583984375, 0.0231170654296875, 0.05810546875, 
0.01284027099609375, 0.007965087890625, -0.005126953125, -0.016326904296875, 0.0113067626953125, -0.0755615234375, -0.016845703125, 0.0248870849609375, -0.0252227783203125, -0.0458984375, 0.0224456787109375, -0.08367919921875, -0.0065460205078125, 0.01458740234375, 0.01049041748046875, -0.023223876953125, -0.047607421875, 0.0029296875, -0.018829345703125, 0.024383544921875, 0.024017333984375, -0.02880859375, 0.015228271484375, 0.00045037269592285156, 0.0404052734375, 0.0303192138671875, 0.021209716796875, -0.03973388671875, 0.0219879150390625, -0.039093017578125, 0.047271728515625, -0.01458740234375, -0.032623291015625, 0.0100250244140625, 0.025390625, 0.0081024169921875, -0.04913330078125, 0.08489990234375, -0.0198516845703125, 0.0181427001953125, -0.0280609130859375, -0.0260772705078125, -0.02667236328125, 0.01192474365234375, -0.051849365234375, 0.056488037109375, 0.035247802734375, -0.049835205078125, 0.044891357421875, -0.049591064453125, 0.00580596923828125, 0.030242919921875, 0.003215789794921875, -0.032196044921875, 0.01873779296875, 0.00439453125, 0.030059814453125, -0.00803375244140625, -0.024505615234375, -0.056610107421875, -0.027618408203125, 0.0174102783203125, 0.00867462158203125, 0.0772705078125, 0.0374755859375, -0.0301055908203125, 0.008636474609375, -0.08734130859375, 0.021514892578125, 0.040008544921875, 0.01617431640625, -0.02593994140625, -0.048431396484375, -0.001129150390625, 0.052886962890625, 0.033966064453125, -0.048095703125, 0.02703857421875, 0.00118255615234375, 0.008514404296875, 0.033050537109375, 0.021575927734375, 0.0147247314453125, -0.01541900634765625, 0.0714111328125, -0.0086517333984375, 0.037384033203125, 0.01520538330078125, -0.01067352294921875, -0.058258056640625, -0.03857421875, 0.0297698974609375, 0.0234527587890625, -0.040863037109375, 0.04486083984375, 0.01055908203125, -0.057037353515625, -0.05206298828125, -0.01171875, 0.02227783203125, 0.0286712646484375, 0.004077911376953125, -0.032073974609375, 
-0.01184844970703125, -0.0860595703125, 0.01123809814453125, 0.00665283203125, -0.048187255859375, 0.022857666015625, 0.0184478759765625, 0.0130462646484375, 0.0215606689453125, -0.03314208984375, -0.021331787109375, -0.018096923828125, -0.0254364013671875, 0.0478515625, 0.056671142578125, 0.07391357421875, -0.0654296875, -0.04339599609375, -0.03082275390625, -0.059112548828125, -0.041473388671875, -0.005306243896484375, -0.041717529296875, -0.040985107421875, 0.0209808349609375, -0.0177459716796875, 0.048187255859375, 0.035491943359375, -0.060272216796875, 0.023162841796875, -0.042816162109375, 0.04339599609375, -0.0953369140625, 0.05059814453125, 0.03839111328125, -0.03765869140625, -0.040924072265625, 0.0616455078125, 0.01983642578125, -0.00572967529296875, -0.03643798828125, 0.023956298828125, -0.060211181640625, -0.0005841255187988281, -0.00011450052261352539, 0.0029850006103515625, 0.0186614990234375, -0.0014734268188476562, -0.0280609130859375, 0.0360107421875, 0.0677490234375, -0.0350341796875, 0.01666259765625, 0.0183258056640625, -0.037933349609375, 0.07379150390625, -0.0909423828125, -0.0006003379821777344, 0.00628662109375, 0.0234527587890625, -0.06414794921875, -0.047149658203125, 0.037841796875, -0.043304443359375, 0.030426025390625, -0.002777099609375, -0.04412841796875, -0.036468505859375, -0.0278778076171875, 0.05438232421875, 0.038421630859375, -0.032562255859375, -0.0121002197265625, 0.0219879150390625, 0.01107025146484375, -0.01947021484375, -0.046234130859375, -0.015777587890625, -0.01303863525390625, -0.03192138671875, 0.043243408203125, -0.0005054473876953125, 0.00231170654296875, -0.0200958251953125, -0.00423431396484375, -0.0297698974609375, -0.0157928466796875, 0.06121826171875, 0.043121337890625, -0.01263427734375, -0.03021240234375, 0.0013713836669921875, -0.0031070709228515625, -0.0031375885009765625, 0.004817962646484375, 0.0491943359375, -0.020660400390625, -0.0259552001953125, -0.09539794921875, 0.016632080078125, 0.054901123046875, 
0.0014905929565429688, 0.045989990234375, 0.039093017578125, -0.0438232421875, 0.006153106689453125, -0.0479736328125, 0.01318359375, -0.032623291015625, -0.0188751220703125, -0.040069580078125, -0.034820556640625, 0.04815673828125, 0.0220489501953125, -0.02130126953125, 0.0640869140625, 0.01568603515625, -0.0107574462890625, 0.08917236328125, 0.038482666015625, -0.000606536865234375, 0.02423095703125, -0.061859130859375, -0.00571441650390625, -0.047454833984375, -0.050933837890625, -0.0394287109375, -0.042510986328125, -0.0469970703125, -0.007015228271484375, 0.012481689453125, 0.03106689453125, -0.04498291015625, 0.041534423828125, -0.034881591796875, 0.0413818359375, 0.04534912109375, 0.035247802734375, 0.013153076171875, -0.007293701171875, -0.01263427734375, -0.01088714599609375, -0.0235748291015625, -0.006160736083984375, 0.024932861328125, 0.0243377685546875, 0.0184783935546875, 0.004352569580078125, 0.053314208984375, -0.004055023193359375, 0.0010366439819335938, -0.016357421875, 0.036376953125, 0.011260986328125, -0.072998046875, 0.005275726318359375, -0.01177978515625, -0.0516357421875, -0.0053863525390625, -0.034759521484375, -0.018524169921875, 0.031768798828125, 0.0154571533203125, -0.03973388671875, 0.034881591796875, -0.0281219482421875, 0.05572509765625, 0.0050201416015625, -0.0252227783203125, -0.007167816162109375, -0.0286712646484375, 0.01552581787109375, 0.029022216796875, 0.014739990234375, -0.00795745849609375, -0.0018100738525390625, 0.0226593017578125, -0.039764404296875, 0.037567138671875, -0.036163330078125, -0.0026302337646484375, 0.01055145263671875, 0.009033203125, -0.0017566680908203125, 0.0191650390625, -0.0036106109619140625, -0.01541900634765625, -0.00269317626953125, -0.07965087890625, -0.022857666015625, 0.05120849609375, -0.052001953125, -0.00884246826171875, -0.05194091796875, -0.009429931640625, 0.0087127685546875, 0.01171112060546875, 0.06866455078125, 0.0293426513671875, -0.02349853515625, 0.03271484375, 0.0430908203125, 
-0.0156707763671875, 0.006748199462890625, 0.01837158203125, -0.026275634765625, -0.0215606689453125, 0.0706787109375, -0.01261138916015625, 0.0197906494140625, 0.0149993896484375, -0.0033054351806640625, -0.006809234619140625, -0.0364990234375, -0.01354217529296875, 0.02862548828125, -0.008453369140625, -0.013580322265625, -0.01233673095703125, -0.027374267578125, -0.042266845703125, -0.03662109375, -0.0185089111328125, -0.0200347900390625, -0.046234130859375, -0.0158538818359375, 0.0010766983032226562, 0.056427001953125, -0.01861572265625, 0.028045654296875, -0.016387939453125, 0.02294921875, 0.0369873046875, 0.026580810546875, -0.0030841827392578125, -0.0589599609375, 0.0305633544921875, 0.0075531005859375, -0.03387451171875, -0.0423583984375, 0.05419921875, 0.017913818359375, 0.04742431640625, 0.04547119140625, -0.01099395751953125, 0.051055908203125, -0.00766754150390625, 0.057464599609375, 0.04290771484375, -0.06011962890625, 0.06427001953125, -0.0419921875, 0.029998779296875, 0.06365966796875, 0.0309906005859375, -0.039276123046875, -0.04296875, -0.06866455078125, -0.055755615234375, 0.01055145263671875, 0.027587890625, -0.006137847900390625, 0.0352783203125, 0.040679931640625, -0.01132965087890625, 0.00839996337890625, -0.051849365234375, -0.024871826171875, -0.0237274169921875, -0.01538848876953125, 0.00533294677734375, -0.0249176025390625, -0.00955963134765625, -0.051544189453125, 0.048583984375, 0.00817108154296875, 0.0167694091796875, 0.005451202392578125, -0.003932952880859375, -0.0002837181091308594, -0.0010986328125, 0.0643310546875, 0.06182861328125, -0.039642333984375, -0.0198211669921875, -0.00582122802734375, -0.020263671875, 0.014923095703125, -0.02374267578125, -0.01580810546875, 0.005947113037109375, 0.020751953125, 0.05364990234375, 0.028106689453125, -0.008087158203125, 0.06707763671875, -0.0246124267578125, -0.01024627685546875, -0.0280914306640625, 0.018218994140625, 0.009552001953125, 0.0218353271484375, 0.036529541015625, 
0.00643157958984375, 0.02398681640625, -0.0198822021484375, 0.02093505859375, 0.01560211181640625, -0.047607421875, -0.039398193359375, 0.06292724609375, 0.00940704345703125, -0.0256805419921875, 0.05230712890625, -0.020355224609375, -0.0163116455078125, 0.0712890625, 0.04931640625, 0.05517578125, -0.041717529296875, 0.050018310546875, 0.0533447265625, 0.0028324127197265625, 0.01513671875, 0.0535888671875, 0.0279693603515625, -0.017822265625, 0.0095672607421875, -0.0458984375, -0.0249176025390625, 0.0304107666015625, -0.0772705078125, 0.0323486328125, -0.0675048828125, 0.02130126953125, 0.0011234283447265625, 0.003299713134765625, -0.056610107421875, 0.054351806640625, 0.0026569366455078125, 0.11151123046875, -0.0673828125, 0.067138671875, 0.0304107666015625, -0.045318603515625, -0.0633544921875, -0.006175994873046875, 0.0034847259521484375, -0.08599853515625, 0.031341552734375, 0.030670166015625, 0.0163116455078125, -0.005462646484375, -0.07464599609375, -0.022674560546875, 0.08404541015625, 0.0238800048828125, -0.052490234375, -0.0152435302734375, -0.0307769775390625, 0.043548583984375, -0.0572509765625, 0.0266876220703125, 0.008087158203125, 0.0249481201171875, 0.03790283203125, -0.043701171875, -0.00862884521484375, -0.058197021484375, 0.0201568603515625, -0.0104522705078125, -0.042022705078125, 0.0589599609375, -0.00766754150390625, -0.006275177001953125, 0.0191802978515625, 0.0633544921875, 0.0229949951171875, 0.0154876708984375, 0.07794189453125, 0.047271728515625, 0.02154541015625, -0.00820159912109375, 0.0897216796875, -0.032501220703125, 0.01505279541015625, 0.0478515625, -0.003143310546875, 0.027801513671875, -0.004550933837890625, 0.004383087158203125, 0.031005859375, 0.088623046875, -0.0221099853515625, 0.029449462890625, 0.0230560302734375, -0.033782958984375, -0.01224517822265625, -0.018707275390625, -0.058349609375, 0.019195556640625, 0.014404296875, -0.01316070556640625, -0.011962890625, -0.0089111328125, -0.014404296875, 0.01334381103515625, 
-0.046417236328125, 0.0289764404296875, -0.00373077392578125, -0.0076751708984375, 0.043670654296875, -0.0272979736328125, 0.046539306640625, -0.019134521484375, 0.0018291473388671875, -0.0229949951171875, -0.0001684427261352539, -0.041046142578125, -0.057098388671875, 0.00601959228515625, 0.005474090576171875, 0.002079010009765625, -0.0193939208984375, 0.054168701171875, -0.03466796875, -0.07293701171875, 0.0220489501953125, 0.01045989990234375, 0.01641845703125, -0.002521514892578125, -0.06951904296875, -0.0027790069580078125, 0.0065460205078125, 0.01030731201171875, 0.00003254413604736328, 0.02716064453125, 0.0171356201171875, 0.038787841796875, 0.023529052734375, 0.039794921875, 0.008514404296875, 0.00984954833984375, 0.04888916015625, -0.037139892578125, -0.033355712890625, -0.039947509765625, 0.04010009765625, -0.058685302734375, -0.058074951171875, 0.04339599609375, 0.0501708984375, 0.0565185546875, -0.032196044921875, 0.0209197998046875, -0.015625, 0.047576904296875, -0.023834228515625, 0.08184814453125, -0.0672607421875, -0.038330078125, -0.028778076171875, -0.0667724609375, -0.0250244140625, 0.045562744140625, 0.016876220703125, 0.00006121397018432617, 0.01316070556640625, 0.046417236328125, -0.0113677978515625, -0.01520538330078125, 0.0284881591796875, 0.015411376953125, 0.0242156982421875, 0.040374755859375, 0.03369140625, -0.034942626953125, -0.0028057098388671875, -0.036376953125, -0.0595703125, -0.0260009765625, -0.04632568359375, -0.05120849609375, -0.05792236328125, -0.020050048828125, -0.042266845703125, -0.00807952880859375, 0.0684814453125, 0.049774169921875, -0.07537841796875, -0.02862548828125, -0.0006556510925292969, 0.004825592041015625, 0.0095367431640625, -0.0231170654296875, -0.007122039794921875, 0.0269317626953125, -0.074951171875, -0.01070404052734375, 0.007488250732421875, 0.0413818359375, -0.0014553070068359375, 0.022552490234375, -0.024932861328125, -0.0184783935546875, 0.0012273788452148438, 0.044219970703125, -0.020538330078125, 
-0.0242919921875, -0.00441741943359375, -0.02984619140625, 0.01107025146484375, 0.038421630859375, -0.0022525787353515625, -0.005283355712890625, 0.05364990234375, 0.0149383544921875, 0.0258636474609375, -0.01349639892578125, 0.048736572265625, -0.034881591796875, 0.032806396484375, 0.0175933837890625, 0.0265350341796875, 0.0216064453125, -0.00849151611328125, 0.0430908203125, 0.025146484375, -0.041595458984375, -0.08135986328125, -0.0164031982421875, -0.09368896484375, 0.01074981689453125, 0.033477783203125, 0.0011758804321289062, -0.029815673828125, 0.04150390625, -0.0196990966796875, 0.0275115966796875, -0.02392578125, 0.0430908203125, 0.053802490234375, 0.0197296142578125, -0.00885009765625, -0.044281005859375, 0.0236663818359375, 0.0218505859375, -0.05810546875, -0.058258056640625, 0.02130126953125, 0.0288848876953125, 0.030914306640625, 0.0279388427734375, -0.0238037109375, 0.0245208740234375, -0.0018520355224609375, 0.0374755859375, 0.004611968994140625, -0.0189208984375, -0.029998779296875, 0.0272216796875, -0.0098114013671875, -0.05010986328125 ] ]
liam168/trans-opus-mt-zh-en
2021-07-16T03:34:38.000Z
[ "transformers", "pytorch", "marian", "text2text-generation", "translation", "en", "zh", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
translation
liam168
null
null
liam168/trans-opus-mt-zh-en
15
9,938
transformers
2022-03-02T23:29:05
---
language:
- en
- zh
tags:
- translation
widget:
- text: "我喜欢学习数据科学和机器学习。"
---

# liam168/trans-opus-mt-zh-en

## Model description

* source group: Chinese
* target group: English
* model: transformer
* source language(s): zho

## How to use

```python
>>> from transformers import AutoModelWithLMHead, AutoTokenizer, pipeline
>>> mode_name = 'liam168/trans-opus-mt-zh-en'
>>> model = AutoModelWithLMHead.from_pretrained(mode_name)
>>> tokenizer = AutoTokenizer.from_pretrained(mode_name)
>>> translation = pipeline("translation_zh_to_en", model=model, tokenizer=tokenizer)
>>> translation('我喜欢学习数据科学和机器学习。', max_length=400)
[{'translation_text': 'I like to study data science and machine learning.'}]
```

## Contact

liam168520@gmail.com
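The usage example in the card translates a single short sentence. Marian-based translation models truncate inputs beyond their maximum length, so longer documents are best translated sentence by sentence. A small helper for that (hypothetical, not part of the original card) that splits Chinese text on sentence-final punctuation before feeding each piece to the pipeline:

```python
import re

def split_sentences(text):
    """Split Chinese text into sentences on 。!?!?… (keeping the punctuation),
    so each piece stays well under the model's maximum input length."""
    parts = re.split(r'(?<=[。!?!?…])', text)
    return [p.strip() for p in parts if p.strip()]

sentences = split_sentences("我喜欢学习数据科学和机器学习。你呢?")
print(sentences)  # ['我喜欢学习数据科学和机器学习。', '你呢?']
# Each sentence can then be passed to the pipeline, which accepts lists:
# results = translation(sentences, max_length=400)
```

Because the pipeline accepts a list of strings, the per-sentence translations come back in order and can be rejoined afterwards.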
752
[ [ -0.00402069091796875, -0.0161285400390625, 0.0005030632019042969, 0.0237274169921875, -0.03253173828125, -0.005176544189453125, -0.007137298583984375, 0.01030731201171875, 0.0078887939453125, 0.0289459228515625, -0.062042236328125, -0.03076171875, -0.060455322265625, 0.023468017578125, -0.03961181640625, 0.06524658203125, -0.015655517578125, 0.0239105224609375, 0.022125244140625, 0.00909423828125, -0.01113128662109375, -0.0252838134765625, -0.044342041015625, -0.0300140380859375, 0.019134521484375, 0.01560211181640625, 0.0389404296875, 0.0640869140625, 0.0345458984375, 0.02655029296875, 0.0025081634521484375, 0.008575439453125, -0.022705078125, -0.0034332275390625, -0.01441192626953125, -0.041900634765625, -0.036834716796875, -0.004833221435546875, 0.07098388671875, 0.039215087890625, 0.007537841796875, 0.0288238525390625, 0.01113128662109375, 0.02392578125, -0.010498046875, 0.033203125, -0.0499267578125, 0.020965576171875, -0.00682830810546875, -0.01190185546875, -0.0284576416015625, 0.0035858154296875, 0.004665374755859375, -0.045623779296875, 0.01557159423828125, 0.005218505859375, 0.09344482421875, 0.02130126953125, -0.042144775390625, 0.0064697265625, -0.050628662109375, 0.07342529296875, -0.061004638671875, 0.033782958984375, 0.019683837890625, 0.0285491943359375, 0.0006165504455566406, -0.08416748046875, -0.03997802734375, 0.01184844970703125, -0.018829345703125, 0.01177215576171875, 0.0017194747924804688, -0.005374908447265625, 0.0128936767578125, 0.033721923828125, -0.040985107421875, -0.0036029815673828125, -0.05242919921875, -0.038177490234375, 0.055419921875, 0.0297088623046875, 0.02606201171875, -0.020660400390625, -0.0214080810546875, -0.0123748779296875, -0.0235443115234375, -0.00366973876953125, 0.031494140625, 0.0323486328125, -0.017486572265625, 0.048004150390625, -0.021942138671875, 0.046295166015625, 0.0137481689453125, 0.00656890869140625, 0.035491943359375, -0.03466796875, -0.023895263671875, -0.01010894775390625, 0.08203125, 
0.006256103515625, -0.0028743743896484375, -0.00835418701171875, -0.00788116455078125, -0.016510009765625, 0.01210784912109375, -0.0802001953125, -0.018341064453125, 0.0268402099609375, -0.05462646484375, -0.02166748046875, 0.027862548828125, -0.050048828125, 0.024627685546875, -0.0247650146484375, 0.03839111328125, -0.02142333984375, -0.032440185546875, 0.0116729736328125, -0.00659942626953125, 0.0252532958984375, -0.00302886962890625, -0.07366943359375, -0.00714111328125, 0.028289794921875, 0.056884765625, -0.01113128662109375, -0.0528564453125, -0.021148681640625, 0.01528167724609375, -0.0032901763916015625, 0.032196044921875, -0.0037689208984375, -0.033905029296875, 0.0009622573852539062, 0.049713134765625, -0.034881591796875, -0.0168914794921875, 0.0592041015625, -0.0151214599609375, 0.05853271484375, 0.01468658447265625, -0.03607177734375, -0.0243072509765625, 0.0133209228515625, -0.034698486328125, 0.06207275390625, 0.04022216796875, -0.04522705078125, 0.006015777587890625, -0.06878662109375, -0.0293731689453125, 0.024627685546875, -0.0009794235229492188, -0.039093017578125, -0.01497650146484375, 0.0060882568359375, 0.0305938720703125, -0.01439666748046875, 0.031341552734375, 0.01158905029296875, -0.013641357421875, -0.003993988037109375, -0.0399169921875, 0.08453369140625, 0.032989501953125, -0.0290069580078125, 0.028564453125, -0.054901123046875, 0.0035533905029296875, 0.0180206298828125, -0.027801513671875, 0.0065155029296875, -0.0069732666015625, 0.035919189453125, 0.0198516845703125, 0.0293731689453125, -0.03436279296875, 0.03277587890625, -0.0404052734375, 0.061859130859375, 0.046356201171875, -0.0218658447265625, 0.037017822265625, -0.015838623046875, 0.033050537109375, 0.0178375244140625, 0.01499176025390625, -0.01519012451171875, -0.02667236328125, -0.0711669921875, -0.0159149169921875, 0.03594970703125, 0.04901123046875, -0.07440185546875, 0.052520751953125, -0.031890869140625, -0.04986572265625, -0.056793212890625, 0.0067901611328125, 
0.00591278076171875, 0.031982421875, 0.042449951171875, -0.0131988525390625, -0.055328369140625, -0.0635986328125, -0.01885986328125, -0.0170440673828125, -0.02520751953125, 0.005779266357421875, 0.044708251953125, -0.05230712890625, 0.050018310546875, -0.035400390625, -0.03814697265625, -0.0262451171875, 0.013214111328125, 0.04400634765625, 0.057220458984375, 0.03631591796875, -0.056854248046875, -0.042633056640625, -0.006755828857421875, -0.043609619140625, -0.00531768798828125, -0.00954437255859375, -0.03515625, 0.000385284423828125, 0.004383087158203125, -0.044189453125, 0.0201263427734375, 0.034149169921875, -0.04571533203125, 0.042633056640625, 0.0020351409912109375, 0.0140838623046875, -0.10894775390625, 0.0029544830322265625, -0.0170745849609375, -0.0142669677734375, -0.037841796875, 0.0089874267578125, 0.010589599609375, -0.005229949951171875, -0.0687255859375, 0.03985595703125, -0.0254364013671875, 0.0313720703125, -0.0219268798828125, -0.018829345703125, 0.007007598876953125, 0.03204345703125, 0.0145263671875, 0.04107666015625, 0.0626220703125, -0.057220458984375, 0.06005859375, 0.05401611328125, -0.0165252685546875, 0.0237274169921875, -0.048797607421875, -0.013580322265625, 0.0193634033203125, 0.01161956787109375, -0.05609130859375, -0.0197296142578125, 0.046295166015625, -0.044647216796875, 0.0207672119140625, -0.022613525390625, -0.0380859375, -0.043548583984375, 0.004459381103515625, 0.042205810546875, 0.03509521484375, -0.03778076171875, 0.061126708984375, 0.0026702880859375, 0.0070037841796875, -0.0305938720703125, -0.0745849609375, 0.0182647705078125, -0.0233917236328125, -0.0474853515625, 0.04058837890625, -0.0218048095703125, 0.0245819091796875, -0.00786590576171875, 0.014434814453125, -0.0297088623046875, 0.004863739013671875, -0.01007080078125, 0.03375244140625, -0.0213623046875, -0.0224456787109375, 0.000652313232421875, -0.0297088623046875, 0.0204620361328125, -0.043182373046875, 0.04156494140625, 0.002197265625, -0.0079498291015625, 
-0.04144287109375, 0.0132904052734375, 0.04425048828125, -0.02032470703125, 0.05755615234375, 0.059722900390625, -0.0166015625, -0.023101806640625, -0.00004965066909790039, -0.01459503173828125, -0.037750244140625, 0.045257568359375, -0.0261383056640625, -0.034332275390625, 0.0352783203125, -0.0019330978393554688, 0.011993408203125, 0.0645751953125, 0.051025390625, 0.01084136962890625, 0.07818603515625, 0.0298614501953125, -0.023101806640625, 0.0252838134765625, -0.05853271484375, 0.0138397216796875, -0.060882568359375, -0.0012540817260742188, -0.033935546875, -0.0067901611328125, -0.0291595458984375, -0.0169677734375, 0.00524139404296875, -0.0205535888671875, -0.0247039794921875, 0.06768798828125, -0.059051513671875, 0.0024261474609375, 0.046295166015625, -0.00916290283203125, 0.00521087646484375, -0.0100555419921875, -0.001739501953125, -0.0001533031463623047, -0.053558349609375, -0.028839111328125, 0.0655517578125, 0.0291900634765625, 0.033355712890625, 0.0017843246459960938, 0.03192138671875, -0.005619049072265625, 0.00020933151245117188, -0.06500244140625, 0.0274658203125, -0.01024627685546875, -0.0498046875, -0.0256500244140625, -0.036651611328125, -0.0703125, 0.0236358642578125, -0.0013494491577148438, -0.032318115234375, 0.004390716552734375, -0.00991058349609375, -0.0017385482788085938, 0.043548583984375, -0.017303466796875, 0.08050537109375, -0.0226898193359375, -0.00408935546875, 0.0174713134765625, -0.04730224609375, 0.01500701904296875, -0.00470733642578125, 0.01091766357421875, 0.006137847900390625, 0.005893707275390625, 0.060882568359375, -0.0246734619140625, 0.045074462890625, -0.0020198822021484375, -0.0109710693359375, 0.0127716064453125, 0.002025604248046875, 0.0221099853515625, -0.003513336181640625, -0.01837158203125, 0.039642333984375, 0.0116729736328125, -0.0271453857421875, -0.0261077880859375, 0.028961181640625, -0.060211181640625, -0.0352783203125, -0.020660400390625, -0.0526123046875, 0.033782958984375, 0.0369873046875, 0.0533447265625, 
0.015655517578125, 0.004734039306640625, 0.02630615234375, 0.0161285400390625, 0.00144195556640625, 0.045013427734375, 0.043365478515625, -0.005664825439453125, -0.041107177734375, 0.07269287109375, 0.0179901123046875, 0.01120758056640625, 0.0220794677734375, 0.0198516845703125, -0.0242462158203125, -0.0328369140625, -0.0638427734375, 0.0191192626953125, -0.03411865234375, -0.0195159912109375, -0.044403076171875, -0.05426025390625, -0.0517578125, 0.0187530517578125, -0.035308837890625, -0.026947021484375, -0.04437255859375, -0.00197601318359375, 0.015167236328125, 0.01117706298828125, 0.002685546875, 0.0406494140625, -0.07867431640625, 0.0236968994140625, 0.0034809112548828125, 0.0185699462890625, -0.01416778564453125, -0.08642578125, -0.042083740234375, 0.0157623291015625, -0.0411376953125, -0.0482177734375, 0.06439208984375, 0.0250701904296875, 0.048309326171875, 0.034149169921875, 0.016632080078125, 0.037078857421875, -0.044525146484375, 0.06256103515625, -0.0041961669921875, -0.080322265625, 0.043243408203125, -0.0244140625, 0.021636962890625, 0.02490234375, 0.027862548828125, -0.051513671875, -0.032257080078125, -0.047515869140625, -0.07666015625, 0.05780029296875, 0.034759521484375, 0.0286865234375, 0.01515960693359375, 0.008026123046875, 0.01050567626953125, 0.01678466796875, -0.08709716796875, -0.049163818359375, -0.040985107421875, -0.032135009765625, 0.010162353515625, -0.00033211708068847656, -0.00507354736328125, -0.039031982421875, 0.09417724609375, -0.0125885009765625, 0.0230255126953125, 0.012847900390625, -0.0048828125, -0.00984954833984375, 0.003021240234375, 0.03228759765625, 0.0286102294921875, -0.001644134521484375, -0.01184844970703125, 0.00881195068359375, -0.0380859375, 0.01352691650390625, 0.01491546630859375, -0.029083251953125, 0.02557373046875, 0.026214599609375, 0.0615234375, -0.01073455810546875, -0.01580810546875, 0.0380859375, -0.0032901763916015625, -0.037109375, -0.047454833984375, -0.00009816884994506836, 0.001811981201171875, 
0.004150390625, 0.035980224609375, -0.01197052001953125, 0.009246826171875, -0.01995849609375, 0.02117919921875, 0.01357269287109375, -0.029632568359375, -0.01476287841796875, 0.076171875, 0.018585205078125, -0.0419921875, 0.051055908203125, 0.0023250579833984375, -0.0628662109375, 0.060150146484375, 0.04248046875, 0.076416015625, -0.0289764404296875, -0.004131317138671875, 0.04638671875, 0.0233917236328125, -0.0141754150390625, 0.01003265380859375, 0.004520416259765625, -0.05145263671875, -0.0253143310546875, -0.038665771484375, -0.01476287841796875, 0.0072021484375, -0.046661376953125, 0.046051025390625, 0.00490570068359375, 0.00746917724609375, -0.02392578125, -0.00565338134765625, -0.041778564453125, 0.0210723876953125, -0.0035991668701171875, 0.06878662109375, -0.0650634765625, 0.08587646484375, 0.045135498046875, -0.03741455078125, -0.07623291015625, -0.0214996337890625, -0.0180511474609375, -0.04852294921875, 0.061004638671875, 0.020904541015625, 0.021636962890625, 0.0280914306640625, -0.040252685546875, -0.06549072265625, 0.08721923828125, 0.01113128662109375, -0.0304412841796875, -0.0026187896728515625, 0.0258331298828125, 0.020477294921875, -0.0352783203125, 0.031646728515625, 0.0276336669921875, 0.05316162109375, 0.00896453857421875, -0.061431884765625, -0.0130462646484375, -0.037628173828125, -0.012603759765625, 0.00841522216796875, -0.05950927734375, 0.07733154296875, 0.0174407958984375, -0.0207977294921875, 0.01279449462890625, 0.06634521484375, -0.006134033203125, 0.008697509765625, 0.03289794921875, 0.0293426513671875, 0.0165252685546875, -0.0193939208984375, 0.035980224609375, -0.04833984375, 0.065673828125, 0.06024169921875, 0.004001617431640625, 0.03387451171875, 0.029937744140625, -0.019256591796875, 0.050567626953125, 0.059722900390625, -0.057281494140625, 0.040130615234375, 0.01195526123046875, -0.0008778572082519531, -0.007694244384765625, 0.0081939697265625, -0.039764404296875, 0.03253173828125, 0.01256561279296875, -0.0474853515625, 
-0.0019273757934570312, 0.0014476776123046875, -0.00891876220703125, -0.009063720703125, -0.0229949951171875, 0.034454345703125, 0.0019330978393554688, -0.034576416015625, 0.04632568359375, 0.0133209228515625, 0.068603515625, -0.05621337890625, 0.004547119140625, -0.004520416259765625, 0.01511383056640625, -0.018890380859375, -0.034942626953125, 0.035400390625, -0.0207977294921875, -0.018646240234375, -0.0257110595703125, 0.0286865234375, -0.06085205078125, -0.072509765625, 0.0302276611328125, 0.027984619140625, 0.0290374755859375, 0.0021820068359375, -0.07220458984375, -0.016876220703125, 0.018310546875, -0.02850341796875, 0.0031604766845703125, 0.0335693359375, 0.0284423828125, 0.058013916015625, 0.040924072265625, 0.0207977294921875, -0.0005102157592773438, -0.0104522705078125, 0.05224609375, -0.035980224609375, -0.0406494140625, -0.07818603515625, 0.05389404296875, 0.005809783935546875, -0.0187835693359375, 0.060791015625, 0.055938720703125, 0.0711669921875, -0.0270233154296875, 0.060791015625, -0.009521484375, 0.027984619140625, -0.0189208984375, 0.077392578125, -0.042236328125, 0.004917144775390625, -0.0107421875, -0.0709228515625, -0.01293182373046875, 0.05194091796875, 0.00878143310546875, -0.01235198974609375, 0.05596923828125, 0.048919677734375, -0.0183563232421875, -0.0164947509765625, 0.0226287841796875, 0.0222625732421875, 0.032928466796875, 0.050567626953125, 0.03570556640625, -0.073486328125, 0.056549072265625, -0.03546142578125, 0.00405120849609375, -0.00045680999755859375, -0.06500244140625, -0.06829833984375, -0.04693603515625, -0.0262298583984375, -0.04071044921875, -0.0364990234375, 0.07305908203125, 0.05401611328125, -0.07196044921875, -0.033111572265625, -0.00439453125, 0.00818634033203125, -0.0174713134765625, -0.0214080810546875, 0.040496826171875, -0.017730712890625, -0.09844970703125, 0.026611328125, -0.01291656494140625, 0.01690673828125, -0.035491943359375, -0.009674072265625, -0.01151275634765625, -0.00829315185546875, 
0.0165557861328125, 0.00606536865234375, -0.0543212890625, -0.0024261474609375, -0.010894775390625, -0.0135040283203125, 0.0158538818359375, 0.0361328125, -0.03076171875, 0.01264190673828125, 0.0394287109375, 0.039794921875, 0.0194091796875, -0.00708770751953125, 0.06304931640625, -0.027435302734375, 0.0244598388671875, 0.00528717041015625, 0.0423583984375, 0.019866943359375, -0.045867919921875, 0.050262451171875, 0.0452880859375, -0.048553466796875, -0.0557861328125, 0.0018711090087890625, -0.054901123046875, -0.0209503173828125, 0.077880859375, -0.019012451171875, -0.0202484130859375, -0.0007238388061523438, -0.0287017822265625, 0.048431396484375, -0.01456451416015625, 0.04473876953125, 0.050048828125, 0.0252532958984375, 0.0008649826049804688, -0.035003662109375, 0.01776123046875, 0.0221405029296875, -0.038238525390625, -0.0120086669921875, -0.004199981689453125, 0.0209503173828125, 0.0145721435546875, 0.007335662841796875, -0.00046443939208984375, 0.0018262863159179688, -0.01312255859375, 0.0226898193359375, -0.0259246826171875, -0.00498199462890625, -0.0201873779296875, -0.00453948974609375, -0.028228759765625, -0.035491943359375 ] ]
mosaicml/mpt-7b-8k
2023-10-30T21:53:51.000Z
[ "transformers", "pytorch", "mpt", "text-generation", "Composer", "MosaicML", "llm-foundry", "StreamingDatasets", "custom_code", "dataset:mc4", "dataset:c4", "dataset:togethercomputer/RedPajama-Data-1T", "dataset:bigcode/the-stack", "dataset:allenai/s2orc", "arxiv:2108.12409", "arxiv:2302.13971", "arxiv:2205.14135", "arxiv:2010.04245", "arxiv:1909.08053", "arxiv:2302.06675", "license:apache-2.0", "has_space", "text-generation-inference", "region:us" ]
text-generation
mosaicml
null
null
mosaicml/mpt-7b-8k
20
9,936
transformers
2023-06-30T19:56:15
---
license: apache-2.0
tags:
- Composer
- MosaicML
- llm-foundry
- StreamingDatasets
datasets:
- mc4
- c4
- togethercomputer/RedPajama-Data-1T
- bigcode/the-stack
- allenai/s2orc
inference: false
---

# MPT-7B-8k

MPT-7B-8k is a decoder-style transformer pretrained starting from MPT-7B, but updating the sequence length to 8k and training for an additional 500B tokens, resulting in a total of 1.5T tokens of text and code. This model was trained by [MosaicML](https://www.mosaicml.com).

MPT-7B-8k is part of the family of Mosaic Pretrained Transformer (MPT) models, which use a modified transformer architecture optimized for efficient training and inference.

These architectural changes include performance-optimized layer implementations and the elimination of context length limits by replacing positional embeddings with Attention with Linear Biases ([ALiBi](https://arxiv.org/abs/2108.12409)). Thanks to these modifications, MPT models can be trained with high throughput efficiency and stable convergence. MPT models can also be served efficiently with both standard HuggingFace pipelines and NVIDIA's [FasterTransformer](https://github.com/NVIDIA/FasterTransformer).

This model uses the MosaicML LLM codebase, which can be found in the [llm-foundry repository](https://github.com/mosaicml/llm-foundry). It was trained by MosaicML’s NLP team on the [MosaicML platform](https://www.mosaicml.com/training) for LLM pretraining, finetuning, and inference.

### How is this model different?

MPT-7B-8k is

* **Licensed for the possibility of commercial use.**
* **Trained on a large amount of data** (1.5T tokens like [XGen](https://huggingface.co/Salesforce/xgen-7b-8k-base) vs. 1T for [LLaMA](https://arxiv.org/abs/2302.13971), 1T for [MPT-7B](https://www.mosaicml.com/blog/mpt-7b), 300B for [Pythia](https://github.com/EleutherAI/pythia), 300B for [OpenLLaMA](https://github.com/openlm-research/open_llama), and 800B for [StableLM](https://github.com/Stability-AI/StableLM)).
* **Prepared to handle long inputs** thanks to [ALiBi](https://arxiv.org/abs/2108.12409). With ALiBi, the model can extrapolate beyond the 8k training sequence length to up to 10k, and with a few million tokens it can be finetuned to extrapolate much further.
* **Capable of fast training and inference** via [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf) and [FasterTransformer](https://github.com/NVIDIA/FasterTransformer).
* **Equipped with highly efficient open-source training code** via the [llm-foundry repository](https://github.com/mosaicml/llm-foundry).

### Models finetuned off MPT-7B-8k:

The following models are finetuned on MPT-7B-8k:

* [MPT-7B-8k-Instruct](https://huggingface.co/mosaicml/mpt-7b-8k-instruct): a model for long-form instruction following (especially summarization and question-answering). Built by finetuning MPT-7B-8k on several carefully curated datasets.
  * License: _CC-BY-SA-3.0_

* [MPT-7B-8k-Chat](https://huggingface.co/mosaicml/mpt-7b-8k-chat): a chatbot-like model for dialogue generation. Built by finetuning MPT-7B-8k on approximately 1.5B tokens of chat data.
  * License: _CC-By-NC-SA-4.0_

## Model Date

July 18, 2023

## Model License

Apache-2.0

## Documentation

* [Blog post: MPT-7B-8k](https://www.mosaicml.com/blog/long-context-mpt-7b-8k)
* [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
* Questions: Feel free to contact us via the [MosaicML Community Slack](https://mosaicml.me/slack)!

## How to Use

This model is best used with the MosaicML [llm-foundry repository](https://github.com/mosaicml/llm-foundry) for training and finetuning.

```python
import transformers
model = transformers.AutoModelForCausalLM.from_pretrained(
  'mosaicml/mpt-7b-8k',
  trust_remote_code=True
)
```

Note: This model requires that `trust_remote_code=True` be passed to the `from_pretrained` method. This is because we use a custom `MPT` model architecture that is not yet part of the Hugging Face `transformers` package.
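The ALiBi mechanism mentioned above replaces positional embeddings with a fixed, head-specific linear penalty on attention scores, which is what lets the context length be extended after training. A minimal pure-Python sketch of the slope schedule from the ALiBi paper, for illustration only (MosaicML's actual implementation in llm-foundry differs in detail; this sketch assumes the head count is a power of two):

```python
import math

def alibi_slopes(n_heads):
    """Geometric slope schedule from the ALiBi paper: head i gets slope
    2^(-8*i/n_heads), assuming n_heads is a power of two."""
    start = 2 ** (-8.0 / n_heads)
    return [start ** (i + 1) for i in range(n_heads)]

def alibi_bias(slope, seq_len):
    """Per-head bias added to causal attention scores: query position q is
    penalized by slope * (q - k) for each key position k <= q."""
    return [[-slope * (q - k) for k in range(q + 1)] for q in range(seq_len)]

slopes = alibi_slopes(32)         # MPT-7B-8k uses 32 attention heads
print(slopes[0])                  # first head's slope, 2^(-1/4) ≈ 0.84
print(alibi_bias(slopes[0], 3))   # small lower-triangular bias matrix
```

Because the bias grows linearly with query-key distance rather than being baked into learned position vectors, the same slopes apply unchanged at sequence lengths never seen in training.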
`MPT` includes options for many training efficiency features such as [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf), [ALiBi](https://arxiv.org/abs/2108.12409), [QK LayerNorm](https://arxiv.org/abs/2010.04245), and more.

To use the optimized [triton implementation](https://github.com/openai/triton) of FlashAttention, you can load the model on GPU (`cuda:0`) with `attn_impl='triton'` and with `bfloat16` precision:

```python
import torch
import transformers

name = 'mosaicml/mpt-7b-8k'

config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.attn_config['attn_impl'] = 'triton'
config.init_device = 'cuda:0' # For fast initialization directly on GPU!

model = transformers.AutoModelForCausalLM.from_pretrained(
  name,
  config=config,
  torch_dtype=torch.bfloat16, # Load model weights in bfloat16
  trust_remote_code=True
)
```

Although the model was trained with a sequence length of 8192, ALiBi enables users to increase the maximum sequence length during finetuning and/or inference. For example:

```python
import transformers

name = 'mosaicml/mpt-7b-8k'

config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 10000 # (input + output) tokens can now be up to 10000

model = transformers.AutoModelForCausalLM.from_pretrained(
  name,
  config=config,
  trust_remote_code=True
)
```

This model was trained with the MPT-7B-8k tokenizer, which is identical to the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.

```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('mosaicml/mpt-7b-8k')
```

The model can then be used, for example, within a text-generation pipeline. Note: when running Torch modules in lower precision, it is best practice to use the [torch.autocast context manager](https://pytorch.org/docs/stable/amp.html).
```python
from transformers import pipeline

with torch.autocast('cuda', dtype=torch.bfloat16):
    inputs = tokenizer('Here is a recipe for vegan banana bread:\n', return_tensors="pt").to('cuda')
    outputs = model.generate(**inputs, max_new_tokens=100)
    print(tokenizer.batch_decode(outputs, skip_special_tokens=True))

# or using the HF pipeline
pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, device='cuda:0')
with torch.autocast('cuda', dtype=torch.bfloat16):
    print(
        pipe('Here is a recipe for vegan banana bread:\n',
             max_new_tokens=100,
             do_sample=True,
             use_cache=True))
```

## Model Description

The architecture is a modification of a standard decoder-only transformer. The model has been modified from a standard transformer in the following ways:

* It uses [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf)
* It uses [ALiBi (Attention with Linear Biases)](https://arxiv.org/abs/2108.12409) and does not use positional embeddings
* It does not use biases

| Hyperparameter | Value |
|----------------|-------|
| n_parameters | 6.7B |
| n_layers | 32 |
| n_heads | 32 |
| d_model | 4096 |
| vocab size | 50432 |
| sequence length | 8192 |

## Training Data

### Streaming Datasets

Data was formatted using the MosaicML [StreamingDataset](https://github.com/mosaicml/streaming) library to host our data in object storage and efficiently stream it to our compute cluster during training. StreamingDataset obviates the need to download the whole dataset before starting training, and allows instant resumption of training from any point in the dataset.

### Data Mix

The model was trained for a total of 1.5T tokens.
First it was trained for 1T tokens (with batch size 1760 and sequence length 2048) on the following data mix:

#### Data Mix for Original 1T Tokens Used to Train MPT-7B

| Data Source | Number of Tokens in Source | Proportion | Effective Number of Tokens | Epochs |
|-------------|----------------------------|------------|----------------------------|--------|
| mC4 3.1.0 - English | 417.99 B | 0.33 | 330 B | 0.14 |
| C4 - English - SemDedup 80% | 100.42 B | 0.299 | 299 B | 2.98 |
| RedPajama - CommonCrawl | 878.45 B | 0.1 | 100 B | 0.11 |
| The Stack - Selected Languages | 463.78 B | 0.1 | 100 B | 0.22 |
| RedPajama - Wikipedia - En | 4.87 B | 0.04 | 40 B | 8.21 |
| The Stack - Markdown | 107.07 B | 0.035 | 35 B | 0.33 |
| S2ORC | 48.85 B | 0.033 | 33 B | 0.68 |
| RedPajama - Books | 26.02 B | 0.03 | 30 B | 1.15 |
| RedPajama - arXiv | 28.10 B | 0.019 | 19 B | 0.68 |
| RedPajama - StackExchange | 20.54 B | 0.014 | 14 B | 0.68 |

#### Data Mix for Additional 500B Tokens Used to Further Train MPT-7B-8k

We took 80B tokens from document samples that were longer than 4096 tokens, and 120B tokens with varying document sample lengths that matched the "baseline" length distribution, for a total of 200B tokens in a single dataset. We then trained MPT-7B for 500B tokens with a maximum sequence length of 8192, resulting in MPT-7B-8k. Since we trained for 500B tokens using 200B tokens, nearly every subset was trained on for exactly 2.5 epochs.
| Sequence Length Distribution | Number of Tokens in Source (Billion) | Proportion | Effective Number of Tokens (Billion) | Epochs | |---|---|---|---|---| | mC4 3.1.0 - English (200+ words) - Baseline | 33.60 | 16.80% | 84.00 | 2.50 | | mC4 3.1.0 - English (200+ words) - ≥4096 tokens | 23.04 | 11.52% | 57.60 | 2.50 | | c4 - English - SemDedup 80% - Baseline | 30.12 | 15.06% | 75.30 | 2.50 | | c4 - English - SemDedup 80% - ≥4096 tokens | 0.92 | 0.46% | 2.30 | 2.50 | | RedPajama - CommonCrawl - Baseline | 8.52 | 4.26% | 21.30 | 2.50 | | RedPajama - CommonCrawl - ≥4096 tokens | 12.80 | 6.40% | 32.00 | 2.50 | | The Stack - Selected Languages - Baseline | 30.00 | 15.00% | 75.00 | 2.50 | | The Stack - Selected Languages - ≥4096 tokens | 10.00 | 5.00% | 25.00 | 2.50 | | RedPajama - Wikipedia - Baseline | 3.60 | 1.80% | 9.00 | 2.50 | | RedPajama - Wikipedia - ≥4096 tokens | 1.04 | 0.52% | 2.60 | 2.50 | | The Stack - Markdown - Baseline | 4.50 | 2.25% | 11.25 | 2.50 | | The Stack - Markdown - ≥4096 tokens | 8.00 | 4.00% | 20.00 | 2.50 | | Semantic Scholar ORC - Baseline | 3.30 | 1.65% | 8.25 | 2.50 | | Semantic Scholar ORC - ≥4096 tokens | 8.00 | 4.00% | 20.00 | 2.50 | | RedPajama - Books - Baseline | 3.00 | 1.50% | 7.50 | 2.50 | | RedPajama - Books - ≥4096 tokens | 8.00 | 4.00% | 20.00 | 2.50 | | RedPajama - arXiv - Baseline | 1.92 | 0.96% | 4.80 | 2.50 | | RedPajama - arXiv - ≥4096 tokens | 5.40 | 2.70% | 13.50 | 2.50 | | RedPajama - StackExchange - Baseline | 1.44 | 0.72% | 3.60 | 2.50 | | RedPajama - StackExchange - ≥4096 tokens | 1.52 | 1.40% | 7.00 | 4.60 | | N Training Tokens | 200 | 100.00% | | 2.5 epochs * 200B = 500B tokens | Samples for each batch were selected from one of the datasets with the probability specified above. The examples were shuffled within each dataset, and each example was constructed from as many sequences from that dataset as were necessary to fill the 2048 sequence length. 
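The example-construction step just described — filling each fixed-length training example with as many sequences from the sampled dataset as necessary — can be sketched as follows. This is an illustrative sketch, not MosaicML's actual dataloader; in particular, how partial trailing buffers are handled here (dropped) is an assumption:

```python
from typing import Iterable, Iterator, List


def pack_sequences(token_streams: Iterable[List[int]], seq_len: int) -> Iterator[List[int]]:
    """Concatenate tokenized documents and emit fixed-length training examples.

    Each emitted example is exactly `seq_len` tokens; documents are consumed
    in order, and a document may be split across two examples. The real
    pipeline also shuffles within each dataset and samples across datasets
    according to the mixture proportions above.
    """
    buffer: List[int] = []
    for doc in token_streams:
        buffer.extend(doc)
        while len(buffer) >= seq_len:
            yield buffer[:seq_len]
            buffer = buffer[seq_len:]
    # A final partial buffer (< seq_len tokens) is dropped here for simplicity.


# Toy demo with seq_len=8 instead of the 2048 used in training:
docs = [[1, 2, 3], [4, 5, 6, 7, 8, 9], [10, 11, 12, 13, 14, 15, 16, 17, 18]]
examples = list(pack_sequences(docs, seq_len=8))
# -> [[1, 2, 3, 4, 5, 6, 7, 8], [9, 10, 11, 12, 13, 14, 15, 16]]
```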
The data was tokenized using the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer. This BPE tokenizer has a number of desirable characteristics, most of which are relevant for tokenizing code:
(1) It was trained on a diverse mix of data that includes code (The Pile).
(2) It applies consistent space delimitation, unlike the GPT2 tokenizer, which tokenizes inconsistently depending on the presence of prefix spaces.
(3) It contains tokens for repeated space characters, which allows superior compression of text with large amounts of repeated space characters.

The model vocabulary size of 50432 was set to be a multiple of 128 (as in [MEGATRON-LM](https://arxiv.org/abs/1909.08053)), which increased model flop utilization (MFU) by up to four percentage points.

### Training Configuration

This model was trained on 440 A100-40GB GPUs for about 9.5 days using the [MosaicML Platform](https://www.mosaicml.com/platform). The model was trained with sharded data parallelism using [FSDP](https://pytorch.org/docs/stable/fsdp.html) and used the [LION](https://arxiv.org/abs/2302.06675) optimizer.

## Limitations and Biases

_The following language is modified from [EleutherAI's GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b)_

MPT-7B-8k is **not** intended for deployment without finetuning. It should not be used for human-facing interactions without further guardrails and user consent.

MPT-7B-8k can produce factually incorrect output, and should not be relied on to produce factually accurate information. MPT-7B-8k was trained on various public datasets. While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased or otherwise offensive outputs.
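The LION optimizer named in the training configuration above uses a sign-based update with decoupled weight decay. The sketch below follows the update rule as described in the Lion paper; the hyperparameter values are illustrative defaults, not the values MosaicML used for MPT-7B-8k:

```python
import math


def lion_step(params, grads, momentum, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion update step (sketch).

    For each parameter: interpolate momentum and gradient, take the SIGN of
    the result as the update direction, apply decoupled weight decay, then
    refresh the momentum with a separate interpolation coefficient.
    """
    new_params, new_momentum = [], []
    for p, g, m in zip(params, grads, momentum):
        c = beta1 * m + (1 - beta1) * g                      # interpolated direction
        update = math.copysign(1.0, c) if c != 0 else 0.0    # sign(c)
        new_params.append(p - lr * (update + wd * p))        # decoupled weight decay
        new_momentum.append(beta2 * m + (1 - beta2) * g)     # momentum refresh
    return new_params, new_momentum


# One step on a single scalar parameter:
p, m = lion_step([1.0], [0.5], [0.0], lr=0.1)
# sign(0.9*0 + 0.1*0.5) = +1, so p = 1.0 - 0.1 = 0.9; m = 0.01*0.5 = 0.005
```

Because the update magnitude is always the learning rate (times sign), Lion typically pairs a smaller learning rate with stronger weight decay than Adam-style optimizers.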
## MosaicML Platform

If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://www.mosaicml.com/get-started?utm_source=huggingface&utm_medium=referral&utm_campaign=mpt-7b-8k).

## Disclaimer

The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model. Please consult an attorney before using this model for commercial purposes.

## Citation

Please cite this model using the following format:

```
@online{MosaicML2023Introducing,
    author  = {MosaicML NLP Team},
    title   = {Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs},
    year    = {2023},
    url     = {www.mosaicml.com/blog/mpt-7b},
    note    = {Accessed: 2023-03-28}, % change this date
    urldate = {2023-03-28} % change this date
}
```
13,686
[ [ -0.031982421875, -0.03466796875, 0.014312744140625, 0.019317626953125, -0.028656005859375, 0.0037899017333984375, -0.00691986083984375, -0.01715087890625, 0.0057830810546875, 0.0221710205078125, -0.045806884765625, -0.037139892578125, -0.054290771484375, 0.0016689300537109375, -0.0208892822265625, 0.07293701171875, -0.0019044876098632812, 0.006366729736328125, 0.0093231201171875, -0.00711822509765625, -0.01568603515625, -0.032989501953125, -0.050384521484375, -0.0240631103515625, 0.0418701171875, 0.01284027099609375, 0.051177978515625, 0.06396484375, 0.035491943359375, 0.025146484375, -0.01418304443359375, 0.01398468017578125, -0.041229248046875, -0.0252532958984375, 0.016265869140625, -0.03216552734375, -0.035614013671875, 0.0115509033203125, 0.039886474609375, 0.0225982666015625, -0.007843017578125, 0.03564453125, -0.0011568069458007812, 0.026763916015625, -0.032806396484375, 0.0246429443359375, -0.034423828125, 0.014862060546875, -0.01540374755859375, -0.0004744529724121094, -0.04345703125, -0.019287109375, 0.002655029296875, -0.04156494140625, 0.0206756591796875, 0.00003731250762939453, 0.07269287109375, 0.02862548828125, -0.0302886962890625, 0.00415802001953125, -0.038482666015625, 0.0489501953125, -0.06365966796875, 0.03369140625, 0.0164337158203125, 0.021209716796875, 0.0016040802001953125, -0.08404541015625, -0.051544189453125, -0.01200103759765625, -0.0054473876953125, 0.0274658203125, -0.01259613037109375, 0.004627227783203125, 0.039398193359375, 0.034698486328125, -0.0458984375, -0.0008945465087890625, -0.035308837890625, -0.0196380615234375, 0.0377197265625, 0.0181427001953125, 0.014312744140625, -0.024749755859375, -0.039337158203125, -0.03173828125, -0.0499267578125, 0.00693511962890625, 0.021759033203125, -0.002895355224609375, -0.03271484375, 0.041534423828125, -0.002498626708984375, 0.03863525390625, 0.0167388916015625, -0.011016845703125, 0.0330810546875, -0.02197265625, -0.0289306640625, -0.0005602836608886719, 0.07427978515625, 
0.0187225341796875, -0.006298065185546875, -0.0010814666748046875, -0.0023136138916015625, -0.003692626953125, 0.0022525787353515625, -0.08026123046875, -0.024749755859375, 0.017974853515625, -0.036041259765625, -0.01265716552734375, 0.004619598388671875, -0.04071044921875, -0.01543426513671875, -0.017578125, 0.04632568359375, -0.05157470703125, -0.019134521484375, -0.001171112060546875, -0.00948333740234375, 0.032440185546875, 0.01212310791015625, -0.07098388671875, 0.0016574859619140625, 0.03814697265625, 0.07403564453125, -0.01543426513671875, -0.037628173828125, -0.0161590576171875, -0.005817413330078125, -0.0024204254150390625, 0.04034423828125, -0.01006317138671875, -0.01401519775390625, -0.028778076171875, 0.012939453125, -0.02557373046875, -0.0287628173828125, 0.024627685546875, -0.0289459228515625, 0.031280517578125, -0.01389312744140625, -0.037811279296875, -0.0193328857421875, 0.0111846923828125, -0.04669189453125, 0.0806884765625, 0.0241546630859375, -0.0623779296875, 0.024200439453125, -0.05908203125, -0.00853729248046875, -0.005062103271484375, 0.00241851806640625, -0.05517578125, -0.0073699951171875, 0.027252197265625, 0.035552978515625, -0.0224151611328125, 0.018463134765625, -0.01172637939453125, -0.039886474609375, 0.00974273681640625, -0.04962158203125, 0.0767822265625, 0.026885986328125, -0.052886962890625, 0.0169525146484375, -0.06005859375, -0.008880615234375, 0.01904296875, -0.0372314453125, 0.040313720703125, -0.0227203369140625, 0.0013551712036132812, 0.0195159912109375, 0.00792694091796875, -0.042938232421875, 0.01120758056640625, -0.037445068359375, 0.047882080078125, 0.060394287109375, -0.01107025146484375, 0.0224761962890625, -0.0290985107421875, 0.029541015625, 0.0187835693359375, 0.031280517578125, -0.020599365234375, -0.052520751953125, -0.0750732421875, -0.0308074951171875, 0.0301513671875, 0.03424072265625, -0.06597900390625, 0.0296173095703125, -0.02191162109375, -0.05499267578125, -0.05657958984375, -0.006511688232421875, 
0.0310821533203125, 0.04193115234375, 0.04046630859375, -0.02459716796875, -0.0499267578125, -0.056060791015625, 0.0015096664428710938, 0.0034542083740234375, 0.002758026123046875, 0.0143280029296875, 0.046356201171875, -0.0202178955078125, 0.06744384765625, -0.022735595703125, 0.006114959716796875, -0.021240234375, 0.0125732421875, 0.037750244140625, 0.04266357421875, 0.03826904296875, -0.0517578125, -0.051788330078125, -0.0116729736328125, -0.045806884765625, 0.010711669921875, -0.0054931640625, -0.01276397705078125, 0.01568603515625, 0.01446533203125, -0.0699462890625, 0.040252685546875, 0.0447998046875, -0.030426025390625, 0.036865234375, -0.00540924072265625, 0.0021533966064453125, -0.10528564453125, 0.00873565673828125, -0.0159759521484375, -0.01568603515625, -0.04302978515625, -0.013580322265625, 0.010284423828125, -0.00406646728515625, -0.0662841796875, 0.03973388671875, -0.0278472900390625, 0.00185394287109375, -0.01084136962890625, -0.023223876953125, -0.00605010986328125, 0.066162109375, 0.01141357421875, 0.062744140625, 0.03515625, -0.0322265625, 0.03857421875, 0.026031494140625, -0.0286712646484375, 0.00597381591796875, -0.0438232421875, 0.01226043701171875, 0.0029087066650390625, 0.0229949951171875, -0.0701904296875, -0.01212310791015625, 0.0299835205078125, -0.0418701171875, 0.022491455078125, -0.021697998046875, -0.03619384765625, -0.0430908203125, -0.0130767822265625, 0.03375244140625, 0.056610107421875, -0.061248779296875, 0.047088623046875, 0.00356292724609375, 0.017181396484375, -0.060638427734375, -0.0528564453125, -0.005222320556640625, -0.0228271484375, -0.0565185546875, 0.0278472900390625, -0.005939483642578125, 0.0160980224609375, -0.01995849609375, -0.0053558349609375, 0.01010894775390625, -0.0008516311645507812, 0.03350830078125, 0.027252197265625, -0.0214080810546875, -0.017608642578125, -0.010986328125, -0.0220489501953125, 0.0009832382202148438, -0.0226287841796875, 0.07183837890625, -0.0239410400390625, -0.0252838134765625, 
-0.047119140625, 0.0003204345703125, 0.04119873046875, -0.01558685302734375, 0.07916259765625, 0.08355712890625, -0.01033782958984375, 0.00852203369140625, -0.047027587890625, -0.0176849365234375, -0.038726806640625, 0.0279083251953125, -0.0130615234375, -0.056365966796875, 0.048858642578125, 0.01381683349609375, 0.003192901611328125, 0.05340576171875, 0.06488037109375, 0.0005249977111816406, 0.07470703125, 0.0242462158203125, 0.00830841064453125, 0.045989990234375, -0.05682373046875, 0.00322723388671875, -0.0653076171875, -0.024017333984375, -0.00959014892578125, -0.0233154296875, -0.046539306640625, -0.037109375, 0.0187835693359375, -0.013031005859375, -0.05120849609375, 0.043182373046875, -0.04339599609375, 0.0287628173828125, 0.06072998046875, 0.0224761962890625, 0.0033130645751953125, -0.004581451416015625, -0.020751953125, 0.00673675537109375, -0.06719970703125, -0.026336669921875, 0.095947265625, 0.033599853515625, 0.040130615234375, -0.00023555755615234375, 0.049468994140625, -0.005634307861328125, 0.036773681640625, -0.0321044921875, 0.0296478271484375, 0.00756072998046875, -0.050384521484375, -0.0029010772705078125, -0.046142578125, -0.0592041015625, 0.0225677490234375, -0.0184478759765625, -0.049224853515625, 0.02508544921875, 0.01457977294921875, -0.039520263671875, 0.045257568359375, -0.0701904296875, 0.08197021484375, -0.0173797607421875, -0.0302886962890625, 0.0121002197265625, -0.062469482421875, 0.0262603759765625, 0.0032806396484375, -0.007144927978515625, -0.004863739013671875, 0.01934814453125, 0.06365966796875, -0.04034423828125, 0.062164306640625, -0.015594482421875, 0.0196380615234375, 0.0251312255859375, -0.01361846923828125, 0.036346435546875, -0.0024051666259765625, 0.0013942718505859375, 0.0225982666015625, 0.0034122467041015625, -0.0303192138671875, -0.02630615234375, 0.03509521484375, -0.08514404296875, -0.03936767578125, -0.0377197265625, -0.0439453125, 0.005481719970703125, 0.01465606689453125, 0.05279541015625, 0.02374267578125, 
0.00791168212890625, 0.024749755859375, 0.045654296875, -0.0266571044921875, 0.055389404296875, 0.01207733154296875, 0.0008339881896972656, -0.04296875, 0.06396484375, -0.00005120038986206055, 0.02667236328125, 0.0165252685546875, 0.00948333740234375, -0.027313232421875, -0.040069580078125, -0.039154052734375, 0.0212554931640625, -0.047760009765625, -0.0299224853515625, -0.047821044921875, -0.0379638671875, -0.0372314453125, 0.0102996826171875, -0.046905517578125, -0.032196044921875, -0.035064697265625, -0.006870269775390625, 0.0279541015625, 0.035980224609375, 0.0003571510314941406, 0.05224609375, -0.05841064453125, 0.0137176513671875, 0.0194244384765625, 0.0287322998046875, 0.005313873291015625, -0.062255859375, -0.0274810791015625, 0.01287841796875, -0.044952392578125, -0.055877685546875, 0.046722412109375, 0.0023975372314453125, 0.034393310546875, 0.026214599609375, -0.01366424560546875, 0.053955078125, -0.032257080078125, 0.0712890625, 0.02685546875, -0.06201171875, 0.01861572265625, -0.02301025390625, 0.0309600830078125, 0.021697998046875, 0.032562255859375, -0.034637451171875, -0.00794219970703125, -0.058013916015625, -0.05548095703125, 0.07659912109375, 0.040130615234375, 0.00479888916015625, -0.0006909370422363281, 0.028411865234375, 0.002231597900390625, 0.009521484375, -0.08453369140625, -0.0267791748046875, -0.04730224609375, -0.030029296875, -0.008148193359375, -0.00959014892578125, -0.00550079345703125, -0.0440673828125, 0.05767822265625, 0.0010509490966796875, 0.05084228515625, 0.01398468017578125, -0.024169921875, 0.005954742431640625, -0.0044403076171875, 0.040863037109375, 0.047515869140625, -0.0238800048828125, 0.0028324127197265625, 0.021484375, -0.056610107421875, 0.0057830810546875, 0.0177154541015625, -0.004367828369140625, -0.01375579833984375, 0.0207672119140625, 0.08917236328125, 0.0010004043579101562, -0.0265350341796875, 0.039031982421875, -0.0135345458984375, -0.0197296142578125, -0.0156707763671875, 0.009490966796875, 
0.026885986328125, 0.037506103515625, 0.0157928466796875, 0.00319671630859375, -0.015777587890625, -0.031768798828125, 0.011199951171875, 0.01236724853515625, -0.0129241943359375, -0.02154541015625, 0.06890869140625, 0.0021686553955078125, -0.01995849609375, 0.06451416015625, -0.0027408599853515625, -0.040313720703125, 0.052642822265625, 0.0478515625, 0.059783935546875, -0.020233154296875, 0.00986480712890625, 0.031585693359375, 0.017791748046875, -0.008819580078125, 0.0005474090576171875, -0.00711822509765625, -0.050537109375, -0.03369140625, -0.0565185546875, -0.0196685791015625, -0.0004336833953857422, -0.029876708984375, 0.0250091552734375, -0.0311737060546875, -0.020782470703125, -0.00888824462890625, 0.0079803466796875, -0.061065673828125, 0.01169586181640625, 0.028350830078125, 0.07073974609375, -0.049072265625, 0.0750732421875, 0.0218658447265625, -0.03997802734375, -0.07330322265625, -0.033416748046875, -0.0098419189453125, -0.07769775390625, 0.035430908203125, 0.024871826171875, 0.0172119140625, 0.01070404052734375, -0.046417236328125, -0.07318115234375, 0.1217041015625, 0.043365478515625, -0.0308990478515625, -0.0157623291015625, 0.037017822265625, 0.0323486328125, -0.02813720703125, 0.046905517578125, 0.048370361328125, 0.0266876220703125, 0.031951904296875, -0.062164306640625, 0.003673553466796875, -0.025634765625, -0.0046539306640625, 0.0059967041015625, -0.06353759765625, 0.08453369140625, -0.004669189453125, -0.00414276123046875, 0.0071258544921875, 0.051055908203125, 0.023345947265625, 0.0164642333984375, 0.022125244140625, 0.061614990234375, 0.04248046875, -0.0261383056640625, 0.083740234375, -0.027374267578125, 0.0457763671875, 0.0684814453125, 0.02069091796875, 0.042694091796875, 0.026702880859375, -0.01025390625, 0.0296478271484375, 0.0755615234375, -0.0239410400390625, 0.0350341796875, -0.004871368408203125, -0.0104217529296875, -0.0158538818359375, 0.0224761962890625, -0.03582763671875, 0.0248565673828125, 0.01384735107421875, 
-0.04339599609375, -0.00568389892578125, 0.00913238525390625, 0.00749969482421875, -0.043212890625, -0.0185089111328125, 0.04473876953125, 0.0091705322265625, -0.045318603515625, 0.057861328125, -0.0028743743896484375, 0.0511474609375, -0.04058837890625, 0.00926971435546875, -0.0229339599609375, 0.0182647705078125, -0.020599365234375, -0.047210693359375, 0.0157012939453125, -0.006809234619140625, 0.006717681884765625, -0.0170135498046875, 0.0249786376953125, -0.0227203369140625, -0.033203125, 0.00930023193359375, 0.0191802978515625, 0.01175689697265625, -0.008453369140625, -0.0643310546875, 0.003017425537109375, 0.007038116455078125, -0.040008544921875, 0.0162200927734375, 0.01160430908203125, 0.0201263427734375, 0.046417236328125, 0.055389404296875, -0.01189422607421875, 0.03369140625, -0.00849151611328125, 0.072265625, -0.05352783203125, -0.0247344970703125, -0.07305908203125, 0.052825927734375, -0.00013625621795654297, -0.0288848876953125, 0.0546875, 0.051544189453125, 0.06353759765625, -0.0025615692138671875, 0.03155517578125, -0.00617218017578125, 0.01174163818359375, -0.037567138671875, 0.061279296875, -0.026947021484375, 0.01873779296875, -0.01317596435546875, -0.0848388671875, -0.0221405029296875, 0.044281005859375, -0.028411865234375, 0.01265716552734375, 0.046142578125, 0.05889892578125, -0.0218353271484375, 0.00949859619140625, 0.01311492919921875, 0.02813720703125, 0.02667236328125, 0.05706787109375, 0.059417724609375, -0.05340576171875, 0.052093505859375, -0.03668212890625, -0.006130218505859375, -0.006988525390625, -0.0521240234375, -0.07525634765625, -0.0379638671875, -0.0122833251953125, -0.0355224609375, -0.006656646728515625, 0.081298828125, 0.064208984375, -0.047882080078125, -0.0190887451171875, -0.003406524658203125, -0.002811431884765625, -0.00545501708984375, -0.01319122314453125, 0.045989990234375, -0.005886077880859375, -0.0537109375, 0.00901031494140625, 0.0031070709228515625, 0.0204925537109375, -0.00861358642578125, -0.0164642333984375, 
-0.0239410400390625, -0.000030994415283203125, 0.0394287109375, 0.0161285400390625, -0.04052734375, -0.0163116455078125, 0.00395965576171875, -0.0011796951293945312, 0.041412353515625, 0.03289794921875, -0.04937744140625, 0.01885986328125, 0.02410888671875, 0.035430908203125, 0.086181640625, -0.005252838134765625, 0.0285797119140625, -0.04010009765625, 0.0177459716796875, 0.0161895751953125, 0.037017822265625, 0.0210723876953125, -0.0235748291015625, 0.04010009765625, 0.0305938720703125, -0.0439453125, -0.057647705078125, -0.003276824951171875, -0.084228515625, -0.0023097991943359375, 0.08099365234375, -0.01355743408203125, -0.04168701171875, 0.019134521484375, -0.0185394287109375, 0.045013427734375, -0.014495849609375, 0.047576904296875, 0.03936767578125, -0.00974273681640625, -0.036865234375, -0.0260467529296875, 0.040313720703125, 0.02362060546875, -0.04510498046875, -0.01235198974609375, 0.006252288818359375, 0.039031982421875, 0.009765625, 0.03411865234375, -0.012451171875, 0.0278472900390625, 0.0009813308715820312, 0.0144195556640625, -0.0268707275390625, -0.00949859619140625, -0.01514434814453125, 0.007495880126953125, -0.0233917236328125, -0.0131683349609375 ] ]
RWKV/rwkv-4-430m-pile
2023-05-15T10:00:20.000Z
[ "transformers", "pytorch", "rwkv", "text-generation", "dataset:EleutherAI/pile", "endpoints_compatible", "has_space", "region:us" ]
text-generation
RWKV
null
null
RWKV/rwkv-4-430m-pile
3
9,930
transformers
2023-05-04T13:38:41
--- datasets: - EleutherAI/pile --- ![RWKlogo.png](https://s3.amazonaws.com/moonup/production/uploads/62441d1d9fdefb55a0b7d12c/UWpP-lGRZJJDaEx_uUlDv.png) # Model card for RWKV-4 | 430M parameters trained on Pile dataset RWKV is a project led by [Bo Peng](https://github.com/BlinkDL). Learn more about the model architecture in the blogposts from Johan Wind [here](https://johanwind.github.io/2023/03/23/rwkv_overview.html) and [here](https://johanwind.github.io/2023/03/23/rwkv_details.html). Learn more about the project by joining the [RWKV discord server](https://discordapp.com/users/468093332535640064). # Table of contents 0. [TL;DR](#TL;DR) 1. [Model Details](#model-details) 2. [Usage](#usage) 3. [Citation](#citation) ## TL;DR Below is the description from the [original repository](https://github.com/BlinkDL/RWKV-LM) > RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable). It's combining the best of RNN and transformer - great performance, fast inference, saves VRAM, fast training, "infinite" ctx_len, and free sentence embedding. ## Model Details The details of the architecture can be found on the blogpost mentioned above and the Hugging Face blogpost of the integration. ## Usage ### Convert the raw weights to the HF format You can use the [`convert_rwkv_checkpoint_to_hf.py`](https://github.com/huggingface/transformers/tree/main/src/transformers/models/rwkv/convert_rwkv_checkpoint_to_hf.py) script by specifying the repo_id of the original weights, the filename and the output directory. You can also optionally directly push the converted model on the Hub by passing `--push_to_hub` flag and `--model_name` argument to specify where to push the converted weights. 
```bash
python convert_rwkv_checkpoint_to_hf.py --repo_id RAW_HUB_REPO --checkpoint_file RAW_FILE --output_dir OUTPUT_DIR --push_to_hub --model_name dummy_user/converted-rwkv
```

### Generate text

You can use the `AutoModelForCausalLM` and `AutoTokenizer` classes to generate text from the model. Expand the sections below to understand how to run the model in different scenarios:

### Running the model on a CPU

<details>
<summary> Click to expand </summary>

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("RWKV/rwkv-4-430m-pile")
tokenizer = AutoTokenizer.from_pretrained("RWKV/rwkv-4-430m-pile")

prompt = "\nIn a shocking finding, scientist discovered a herd of dragons living in a remote, previously unexplored valley, in Tibet. Even more surprising to the researchers was the fact that the dragons spoke perfect Chinese."

inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(inputs["input_ids"], max_new_tokens=40)

print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))
```

</details>

### Running the model on a single GPU

<details>
<summary> Click to expand </summary>

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("RWKV/rwkv-4-430m-pile").to(0)
tokenizer = AutoTokenizer.from_pretrained("RWKV/rwkv-4-430m-pile")

prompt = "\nIn a shocking finding, scientist discovered a herd of dragons living in a remote, previously unexplored valley, in Tibet. Even more surprising to the researchers was the fact that the dragons spoke perfect Chinese."

inputs = tokenizer(prompt, return_tensors="pt").to(0)
output = model.generate(inputs["input_ids"], max_new_tokens=40)

print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))
```

</details>

### Running the model in half-precision, on GPU

<details>
<summary> Click to expand </summary>

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("RWKV/rwkv-4-430m-pile", torch_dtype=torch.float16).to(0)
tokenizer = AutoTokenizer.from_pretrained("RWKV/rwkv-4-430m-pile")

prompt = "\nIn a shocking finding, scientist discovered a herd of dragons living in a remote, previously unexplored valley, in Tibet. Even more surprising to the researchers was the fact that the dragons spoke perfect Chinese."

inputs = tokenizer(prompt, return_tensors="pt").to(0)
output = model.generate(inputs["input_ids"], max_new_tokens=40)

print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))
```

</details>

### Running the model on multiple GPUs

<details>
<summary> Click to expand </summary>

```python
# pip install accelerate
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("RWKV/rwkv-4-430m-pile", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("RWKV/rwkv-4-430m-pile")

prompt = "\nIn a shocking finding, scientist discovered a herd of dragons living in a remote, previously unexplored valley, in Tibet. Even more surprising to the researchers was the fact that the dragons spoke perfect Chinese."

inputs = tokenizer(prompt, return_tensors="pt").to(0)
output = model.generate(inputs["input_ids"], max_new_tokens=40)

print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))
```

</details>

## Citation

If you use this model, please consider citing the original work, from the original repo [here](https://github.com/BlinkDL/ChatRWKV/).
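The "RNN with transformer-level performance" claim above rests on the fact that RWKV's token-mixing step can be evaluated as a recurrence with constant-size state. The sketch below is a deliberately simplified, numerically naive caricature of that idea — an exponentially decayed, softmax-like weighted average over past values. It is NOT the exact RWKV-4 WKV formulation, and the decay constant is made up for illustration:

```python
import math


def toy_wkv(keys, values, decay=0.9):
    """Process tokens one at a time with O(1) state.

    Maintains two running scalars (a decayed sum of exp(k) * v and a decayed
    sum of exp(k)) and emits their ratio at each step: a weighted average of
    all past values that favors recent, high-key tokens. This is a caricature
    of RWKV's linear-time "attention", not the real RWKV-4 recurrence.
    """
    num = 0.0  # decayed running sum of exp(k_t) * v_t
    den = 0.0  # decayed running sum of exp(k_t)
    outputs = []
    for k, v in zip(keys, values):
        num = decay * num + math.exp(k) * v
        den = decay * den + math.exp(k)
        outputs.append(num / den)  # the pair (num, den) is the only state carried forward
    return outputs


outs = toy_wkv([0.0, 1.0, -1.0], [1.0, 2.0, 3.0])
# Each output is a weighted average of the values seen so far,
# so it always lies between the minimum and maximum of those values.
```

Because the per-token state is constant-size, inference cost grows linearly with sequence length rather than quadratically, which is what enables the "infinite ctx_len" and low-VRAM claims in the TL;DR.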
5,303
[ [ -0.029266357421875, -0.043609619140625, -0.0008363723754882812, 0.01300048828125, -0.0184173583984375, -0.023956298828125, -0.011138916015625, -0.023529052734375, -0.006374359130859375, 0.0188140869140625, -0.040191650390625, -0.02581787109375, -0.03692626953125, 0.00016772747039794922, -0.03765869140625, 0.0721435546875, -0.0032501220703125, 0.004825592041015625, 0.01629638671875, -0.006267547607421875, -0.0101470947265625, -0.022613525390625, -0.040313720703125, -0.0469970703125, 0.0294647216796875, -0.0250701904296875, 0.05169677734375, 0.082763671875, 0.021759033203125, 0.0279541015625, -0.01093292236328125, 0.00774383544921875, -0.02734375, -0.016326904296875, 0.00937652587890625, -0.0185699462890625, -0.0224151611328125, 0.0135498046875, 0.053009033203125, 0.020416259765625, -0.02557373046875, 0.0179901123046875, 0.00470733642578125, 0.019378662109375, -0.0146636962890625, 0.0169219970703125, -0.027679443359375, 0.0255889892578125, 0.00045490264892578125, -0.0086822509765625, -0.024871826171875, -0.004810333251953125, 0.005603790283203125, -0.07623291015625, 0.0399169921875, 0.0017795562744140625, 0.10040283203125, 0.0450439453125, -0.0130157470703125, 0.003559112548828125, -0.041107177734375, 0.06170654296875, -0.0770263671875, 0.022491455078125, 0.0037975311279296875, 0.00475311279296875, -0.01264190673828125, -0.0743408203125, -0.052581787109375, -0.017669677734375, -0.0219879150390625, 0.013763427734375, -0.01297760009765625, 0.0003304481506347656, 0.0450439453125, 0.038848876953125, -0.045867919921875, 0.00394439697265625, -0.04144287109375, -0.0249786376953125, 0.044891357421875, 0.02618408203125, 0.033355712890625, -0.039764404296875, -0.034942626953125, -0.04144287109375, -0.03729248046875, 0.01032257080078125, 0.025848388671875, 0.0222625732421875, -0.0270843505859375, 0.03759765625, -0.0138702392578125, 0.05657958984375, 0.0245819091796875, 0.00013959407806396484, 0.0231475830078125, -0.02374267578125, -0.0298309326171875, -0.01708984375, 
0.081787109375, 0.009033203125, -0.003871917724609375, -0.00942230224609375, -0.01507568359375, -0.0218658447265625, 0.013153076171875, -0.08111572265625, -0.044891357421875, 0.014373779296875, -0.06048583984375, -0.0309906005859375, -0.0040130615234375, -0.0504150390625, -0.0174407958984375, -0.0007576942443847656, 0.049102783203125, -0.0309600830078125, -0.046356201171875, -0.002094268798828125, -0.02947998046875, 0.045135498046875, 0.0021457672119140625, -0.07415771484375, -0.009185791015625, 0.040008544921875, 0.06170654296875, -0.0019025802612304688, -0.054229736328125, -0.0201873779296875, -0.0006127357482910156, -0.02203369140625, 0.04443359375, -0.0045013427734375, -0.039154052734375, -0.0230560302734375, 0.023712158203125, -0.01507568359375, -0.034271240234375, 0.03704833984375, -0.0238800048828125, 0.03338623046875, -0.0195465087890625, -0.0355224609375, -0.027557373046875, 0.0157623291015625, -0.04034423828125, 0.1021728515625, 0.0109100341796875, -0.0709228515625, 0.018341064453125, -0.03741455078125, -0.02313232421875, 0.0122528076171875, 0.0008282661437988281, -0.042999267578125, -0.0060272216796875, 0.02044677734375, 0.031768798828125, -0.01052093505859375, 0.018218994140625, -0.014892578125, -0.035888671875, 0.01232147216796875, -0.040740966796875, 0.08184814453125, 0.0223388671875, -0.04290771484375, 0.02154541015625, -0.04534912109375, 0.005588531494140625, 0.01313018798828125, -0.040191650390625, 0.007274627685546875, -0.0011358261108398438, 0.00493621826171875, 0.01026153564453125, 0.0194549560546875, -0.033843994140625, 0.0169830322265625, -0.038726806640625, 0.055877685546875, 0.05523681640625, -0.022369384765625, 0.0190582275390625, -0.0196533203125, 0.01947021484375, -0.0014629364013671875, 0.019073486328125, -0.0148162841796875, -0.041229248046875, -0.071533203125, -0.01702880859375, 0.021697998046875, 0.029327392578125, -0.056427001953125, 0.04217529296875, -0.0179901123046875, -0.055816650390625, -0.045501708984375, -0.018310546875, 
0.011016845703125, 0.0447998046875, 0.0323486328125, 0.00039768218994140625, -0.026641845703125, -0.04705810546875, -0.0156402587890625, -0.020050048828125, -0.01076507568359375, 0.0264739990234375, 0.04522705078125, -0.01995849609375, 0.054962158203125, -0.034881591796875, -0.00885009765625, -0.0149078369140625, 0.024200439453125, 0.031646728515625, 0.05548095703125, 0.02996826171875, -0.0484619140625, -0.0291748046875, 0.0036773681640625, -0.072021484375, 0.01222991943359375, -0.011810302734375, -0.00745391845703125, -0.00039124488830566406, 0.026947021484375, -0.0606689453125, 0.030059814453125, 0.036590576171875, -0.0187530517578125, 0.0552978515625, -0.0302886962890625, 0.0097808837890625, -0.081298828125, 0.02423095703125, -0.005855560302734375, -0.0015544891357421875, -0.0386962890625, 0.004467010498046875, 0.0087738037109375, -0.01468658447265625, -0.031890869140625, 0.054595947265625, -0.02825927734375, 0.00215911865234375, -0.0190277099609375, -0.0101470947265625, -0.00342559814453125, 0.054351806640625, 0.003688812255859375, 0.055450439453125, 0.05853271484375, -0.052581787109375, 0.047332763671875, 0.0295562744140625, -0.0207061767578125, 0.00005078315734863281, -0.06658935546875, 0.0012063980102539062, 0.005611419677734375, 0.0233154296875, -0.057220458984375, -0.02166748046875, 0.034210205078125, -0.055999755859375, 0.02703857421875, -0.020111083984375, -0.02630615234375, -0.041900634765625, -0.00794219970703125, 0.040740966796875, 0.05047607421875, -0.065673828125, 0.06378173828125, 0.0191192626953125, 0.016754150390625, -0.062164306640625, -0.0645751953125, 0.00031685829162597656, -0.0244293212890625, -0.0435791015625, 0.036895751953125, 0.0005736351013183594, 0.0145721435546875, 0.0129547119140625, 0.01128387451171875, -0.00616455078125, -0.0089569091796875, 0.023406982421875, 0.035247802734375, -0.01751708984375, -0.004184722900390625, -0.0239410400390625, -0.0237884521484375, 0.0196075439453125, -0.0323486328125, 0.042938232421875, 
-0.010711669921875, -0.01023101806640625, -0.0546875, -0.0024852752685546875, 0.037872314453125, -0.00890350341796875, 0.055816650390625, 0.07989501953125, -0.03131103515625, -0.022003173828125, -0.02996826171875, -0.0267333984375, -0.039398193359375, 0.04534912109375, -0.0154266357421875, -0.033172607421875, 0.049530029296875, 0.0079193115234375, 0.008880615234375, 0.06292724609375, 0.042022705078125, 0.00359344482421875, 0.08251953125, 0.048095703125, -0.00946807861328125, 0.03302001953125, -0.048919677734375, 0.0199127197265625, -0.058349609375, -0.0237579345703125, -0.0313720703125, -0.00022733211517333984, -0.04681396484375, -0.0362548828125, 0.018280029296875, 0.00589752197265625, -0.039215087890625, 0.0279541015625, -0.0694580078125, 0.00959014892578125, 0.035980224609375, 0.0020160675048828125, -0.00965118408203125, 0.0014629364013671875, -0.0140380859375, 0.0084381103515625, -0.07489013671875, -0.017364501953125, 0.06683349609375, 0.02923583984375, 0.052734375, -0.02520751953125, 0.0305938720703125, 0.01018524169921875, 0.0261383056640625, -0.04608154296875, 0.03668212890625, -0.01143646240234375, -0.050445556640625, -0.0249786376953125, -0.0439453125, -0.053375244140625, 0.036956787109375, -0.0142364501953125, -0.0255584716796875, 0.004749298095703125, 0.01137542724609375, -0.043212890625, 0.05072021484375, -0.03521728515625, 0.081787109375, -0.006366729736328125, -0.017303466796875, -0.0096435546875, -0.034423828125, 0.03302001953125, 0.018707275390625, -0.0005316734313964844, 0.00001621246337890625, 0.0189056396484375, 0.072509765625, -0.046295166015625, 0.059478759765625, -0.023529052734375, 0.005603790283203125, 0.032012939453125, -0.0263824462890625, 0.04443359375, -0.0117034912109375, -0.0090179443359375, 0.0281219482421875, -0.0037441253662109375, -0.0212554931640625, -0.02166748046875, 0.06072998046875, -0.08343505859375, -0.0305328369140625, -0.03271484375, -0.0458984375, 0.0364990234375, 0.02154541015625, 0.041473388671875, 0.029815673828125, 
-0.002532958984375, -0.004161834716796875, 0.047332763671875, -0.040557861328125, 0.05694580078125, 0.01678466796875, -0.00598907470703125, -0.0433349609375, 0.06494140625, 0.003025054931640625, 0.007022857666015625, 0.0009098052978515625, 0.0233154296875, -0.0340576171875, -0.0272064208984375, -0.062744140625, 0.031097412109375, -0.059661865234375, -0.012237548828125, -0.057220458984375, -0.042877197265625, -0.0399169921875, 0.00754547119140625, -0.03643798828125, -0.0127410888671875, -0.038238525390625, 0.006175994873046875, 0.0297088623046875, 0.053436279296875, -0.0208587646484375, 0.017242431640625, -0.0538330078125, 0.0249786376953125, 0.0325927734375, 0.00949859619140625, 0.020904541015625, -0.06903076171875, -0.0185699462890625, 0.00734710693359375, -0.01108551025390625, -0.04559326171875, 0.053741455078125, -0.004627227783203125, 0.04974365234375, 0.0234375, -0.0017833709716796875, 0.0714111328125, -0.0234375, 0.07177734375, 0.006038665771484375, -0.06219482421875, 0.0089263916015625, -0.02532958984375, 0.015777587890625, -0.0011749267578125, 0.0103607177734375, -0.039215087890625, -0.004390716552734375, -0.04180908203125, -0.058135986328125, 0.05523681640625, 0.007228851318359375, 0.0084228515625, 0.020050048828125, 0.03955078125, -0.006500244140625, -0.004688262939453125, -0.084228515625, -0.040313720703125, -0.04901123046875, -0.0001544952392578125, 0.0068359375, 0.0028591156005859375, -0.003681182861328125, -0.049652099609375, 0.0699462890625, -0.0004582405090332031, 0.039398193359375, 0.0262908935546875, -0.002513885498046875, -0.0109100341796875, -0.01374053955078125, 0.0307159423828125, 0.027557373046875, 0.004199981689453125, -0.00666046142578125, 0.032684326171875, -0.04364013671875, -0.008544921875, 0.0308837890625, -0.024749755859375, 0.0014963150024414062, 0.018890380859375, 0.06884765625, -0.00653076171875, -0.01430511474609375, 0.0253753662109375, -0.015960693359375, -0.01806640625, -0.03118896484375, 0.0081634521484375, 0.019012451171875, 
0.02606201171875, 0.0406494140625, 0.0099029541015625, -0.01363372802734375, -0.00635528564453125, 0.00936126708984375, 0.033294677734375, -0.0228118896484375, -0.02056884765625, 0.08135986328125, 0.0161895751953125, -0.015411376953125, 0.0740966796875, -0.01392364501953125, -0.0469970703125, 0.06201171875, 0.03265380859375, 0.07159423828125, -0.00931549072265625, 0.00972747802734375, 0.06494140625, 0.0266876220703125, -0.0201568603515625, -0.006824493408203125, -0.006397247314453125, -0.048675537109375, -0.03680419921875, -0.061767578125, -0.00527191162109375, 0.017181396484375, -0.04559326171875, 0.03302001953125, -0.0157623291015625, 0.00003170967102050781, -0.0014429092407226562, 0.0003390312194824219, -0.041290283203125, 0.01500701904296875, 0.00975799560546875, 0.0673828125, -0.0621337890625, 0.07830810546875, 0.0355224609375, -0.02801513671875, -0.08477783203125, -0.004302978515625, -0.0298309326171875, -0.072265625, 0.043365478515625, 0.0252838134765625, -0.0012378692626953125, 0.0234222412109375, -0.045867919921875, -0.062255859375, 0.0950927734375, 0.0026988983154296875, -0.016357421875, -0.004787445068359375, 0.004901885986328125, 0.038909912109375, -0.006587982177734375, 0.031005859375, 0.0219879150390625, 0.041046142578125, 0.01113128662109375, -0.058502197265625, 0.0190887451171875, -0.0254669189453125, -0.0111236572265625, 0.00902557373046875, -0.055908203125, 0.1036376953125, -0.0277557373046875, -0.0264129638671875, 0.0169677734375, 0.0709228515625, 0.022613525390625, -0.0079803466796875, 0.037445068359375, 0.053375244140625, 0.046051025390625, -0.0180816650390625, 0.07745361328125, -0.044677734375, 0.061431884765625, 0.033172607421875, 0.01166534423828125, 0.044158935546875, 0.0162353515625, -0.0155029296875, 0.031982421875, 0.06231689453125, -0.03955078125, 0.033721923828125, 0.013824462890625, -0.01715087890625, -0.019744873046875, 0.01053619384765625, -0.04803466796875, 0.0127410888671875, 0.01090240478515625, -0.0227203369140625, 
-0.013458251953125, -0.004344940185546875, 0.002033233642578125, -0.033416748046875, -0.01708984375, 0.0299224853515625, -0.0016651153564453125, -0.05731201171875, 0.069580078125, 0.005184173583984375, 0.07281494140625, -0.048736572265625, -0.003856658935546875, -0.0017061233520507812, 0.0264739990234375, -0.0217742919921875, -0.043914794921875, 0.01351165771484375, -0.01143646240234375, -0.0108184814453125, -0.0175018310546875, 0.044281005859375, -0.037506103515625, -0.039306640625, 0.01617431640625, 0.00585174560546875, 0.0304412841796875, 0.002567291259765625, -0.07586669921875, -0.001277923583984375, 0.016693115234375, -0.040313720703125, 0.013916015625, 0.019012451171875, 0.02685546875, 0.054351806640625, 0.0626220703125, 0.00527191162109375, 0.025848388671875, -0.01399993896484375, 0.06658935546875, -0.06298828125, -0.0298004150390625, -0.06536865234375, 0.038360595703125, 0.002231597900390625, -0.040313720703125, 0.07025146484375, 0.046630859375, 0.05828857421875, -0.005992889404296875, 0.06109619140625, -0.023773193359375, 0.01302337646484375, -0.0196380615234375, 0.07843017578125, -0.038482666015625, 0.0020580291748046875, 0.00846099853515625, -0.041229248046875, 0.00553131103515625, 0.06561279296875, 0.0034427642822265625, 0.01486968994140625, 0.037994384765625, 0.0699462890625, 0.0009822845458984375, -0.004375457763671875, 0.01277923583984375, 0.0364990234375, 0.0289154052734375, 0.026763916015625, 0.052978515625, -0.058319091796875, 0.05364990234375, -0.036590576171875, -0.00957489013671875, 0.018524169921875, -0.064697265625, -0.06890869140625, -0.0355224609375, -0.027740478515625, -0.042144775390625, -0.0113983154296875, 0.042144775390625, 0.059417724609375, -0.04425048828125, -0.0203704833984375, -0.0141448974609375, -0.0030193328857421875, -0.0222320556640625, -0.0193634033203125, 0.0408935546875, -0.0189971923828125, -0.06121826171875, 0.0214996337890625, 0.0028133392333984375, 0.0174560546875, -0.043914794921875, -0.0192718505859375, 
-0.007293701171875, -0.015655517578125, 0.01047515869140625, 0.0306396484375, -0.05810546875, -0.01177978515625, 0.001995086669921875, -0.01497650146484375, 0.0007028579711914062, 0.03778076171875, -0.061248779296875, 0.0214996337890625, 0.044158935546875, 0.029052734375, 0.068359375, -0.0014486312866210938, 0.03955078125, -0.022064208984375, 0.0210418701171875, -0.00193023681640625, 0.02178955078125, 0.032196044921875, -0.032958984375, 0.0199737548828125, 0.035797119140625, -0.063720703125, -0.069091796875, -0.01495361328125, -0.0584716796875, -0.029388427734375, 0.08587646484375, -0.026397705078125, -0.03277587890625, 0.005584716796875, -0.001491546630859375, 0.04901123046875, -0.0013113021850585938, 0.06524658203125, 0.03857421875, -0.00931549072265625, -0.004669189453125, -0.048583984375, 0.054229736328125, 0.0269927978515625, -0.04022216796875, 0.01678466796875, 0.00814056396484375, 0.050140380859375, 0.015045166015625, 0.0298309326171875, -0.0014133453369140625, 0.0120391845703125, 0.018798828125, 0.0312347412109375, -0.0264739990234375, 0.00920867919921875, -0.028106689453125, -0.0038585662841796875, -0.0308074951171875, -0.02593994140625 ] ]
shahrukhx01/bert-mini-finetune-question-detection
2023-03-29T22:00:48.000Z
[ "transformers", "pytorch", "safetensors", "bert", "text-classification", "neural-search-query-classification", "neural-search", "en", "endpoints_compatible", "has_space", "region:us" ]
text-classification
shahrukhx01
null
null
shahrukhx01/bert-mini-finetune-question-detection
8
9,929
transformers
2022-03-02T23:29:05
--- language: "en" tags: - neural-search-query-classification - neural-search widget: - text: "keyword query." --- # KEYWORD QUERY VS STATEMENT/QUESTION CLASSIFIER FOR NEURAL SEARCH | Train Loss | Validation Acc. | Test Acc. | | ------------- |:-------------: | -----: | | 0.000806 | 0.99 | 0.997 | ```python from transformers import AutoTokenizer, AutoModelForSequenceClassification tokenizer = AutoTokenizer.from_pretrained("shahrukhx01/bert-mini-finetune-question-detection") model = AutoModelForSequenceClassification.from_pretrained("shahrukhx01/bert-mini-finetune-question-detection") ``` Trained to add a feature for classifying queries as either keyword queries or question/statement queries, for use with query classification in [Haystack](https://github.com/deepset-ai/haystack/issues/611). Problem statement: a common challenge seen in deployments is the need to distinguish between real questions and incoming keyword queries. We only want to route questions to the Reader branch in order to maximize result accuracy and minimize computation effort/cost. Baseline: https://www.kaggle.com/shahrukhkhan/question-v-statement-detection Dataset: https://www.kaggle.com/stefanondisponibile/quora-question-keyword-pairs Kaggle Notebook: https://www.kaggle.com/shahrukhkhan/question-vs-statement-classification-mini-bert/
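Once the model and tokenizer are loaded as above, the classifier's logits still have to be mapped to labels. A minimal sketch of that last step follows; the id-to-label mapping shown here is an assumption and should be read from `model.config.id2label` in practice.

```python
# Hypothetical id -> label mapping; verify against model.config.id2label
# after loading the model as shown above.
ID2LABEL = {0: "keyword_query", 1: "question_or_statement"}

def classify(logit_rows):
    """Map each row of logits to the label of its argmax class."""
    return [
        ID2LABEL[max(range(len(row)), key=row.__getitem__)]
        for row in logit_rows
    ]

# In real use, logit_rows would come from
#   model(**tokenizer(queries, padding=True, return_tensors="pt")).logits.tolist()
print(classify([[2.5, -1.0], [-0.3, 1.7]]))
# → ['keyword_query', 'question_or_statement']
```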
1,349
[ [ -0.035308837890625, -0.0562744140625, 0.024322509765625, -0.0002574920654296875, -0.0169525146484375, -0.01114654541015625, -0.01490020751953125, -0.00220489501953125, -0.002201080322265625, 0.03179931640625, -0.04443359375, -0.02716064453125, -0.0212249755859375, 0.004413604736328125, -0.03680419921875, 0.06494140625, 0.0142669677734375, 0.0102386474609375, -0.0391845703125, -0.01148223876953125, -0.029296875, -0.0560302734375, -0.0631103515625, -0.005352020263671875, 0.0175018310546875, 0.004390716552734375, 0.030609130859375, 0.0187530517578125, 0.0262603759765625, 0.0245208740234375, -0.01052093505859375, 0.0154266357421875, -0.00614166259765625, 0.03155517578125, 0.00011622905731201172, -0.059844970703125, 0.00688934326171875, -0.0045928955078125, 0.0298004150390625, 0.037628173828125, -0.02618408203125, -0.005573272705078125, -0.0110015869140625, 0.052032470703125, -0.05224609375, 0.00806427001953125, -0.0682373046875, -0.002902984619140625, -0.0015630722045898438, 0.0007257461547851562, -0.01334381103515625, -0.033966064453125, 0.0235748291015625, -0.03143310546875, 0.039337158203125, 0.013427734375, 0.0968017578125, 0.03326416015625, -0.03411865234375, -0.034515380859375, -0.0267791748046875, 0.036041259765625, -0.0413818359375, 0.026824951171875, 0.03094482421875, 0.041229248046875, 0.01763916015625, -0.061676025390625, -0.054107666015625, -0.00830841064453125, -0.019134521484375, 0.01058197021484375, 0.0134429931640625, 0.0142974853515625, 0.01129150390625, 0.028839111328125, -0.06500244140625, 0.0011138916015625, -0.041900634765625, -0.03497314453125, 0.046722412109375, 0.012115478515625, -0.0224456787109375, -0.047607421875, -0.01313018798828125, -0.0287322998046875, -0.037109375, 0.008026123046875, 0.01568603515625, 0.0175323486328125, 0.023590087890625, 0.0166015625, -0.04302978515625, 0.050537109375, 0.01480865478515625, 0.0215301513671875, 0.0128936767578125, -0.05224609375, -0.00417327880859375, 0.00392913818359375, 0.07000732421875, 
0.0236053466796875, 0.015472412109375, -0.005649566650390625, 0.005817413330078125, -0.0029811859130859375, 0.032684326171875, -0.0675048828125, -0.02093505859375, 0.0085296630859375, -0.030487060546875, -0.020111083984375, 0.00030922889709472656, -0.033050537109375, -0.023406982421875, 0.0040435791015625, 0.04644775390625, -0.021026611328125, -0.01090240478515625, 0.00806427001953125, -0.006011962890625, 0.058868408203125, 0.0232696533203125, -0.044281005859375, 0.01071929931640625, 0.04498291015625, 0.034759521484375, 0.0116424560546875, 0.0062103271484375, -0.0037250518798828125, -0.037200927734375, -0.01959228515625, 0.0513916015625, -0.021820068359375, -0.0287933349609375, 0.005634307861328125, 0.0165557861328125, -0.052459716796875, -0.046417236328125, 0.04534912109375, -0.06671142578125, 0.0526123046875, -0.0254669189453125, -0.057220458984375, -0.03607177734375, 0.036956787109375, -0.017913818359375, 0.064208984375, 0.0016813278198242188, -0.03564453125, 0.0260467529296875, -0.037322998046875, -0.02691650390625, 0.00576019287109375, -0.0269012451171875, -0.03594970703125, -0.01430511474609375, 0.00959014892578125, 0.018646240234375, -0.0251007080078125, 0.0078582763671875, -0.01143646240234375, -0.036224365234375, 0.01537322998046875, -0.01666259765625, 0.055572509765625, 0.0271759033203125, -0.03350830078125, -0.00916290283203125, -0.04913330078125, 0.031951904296875, 0.0158233642578125, -0.03668212890625, -0.01119232177734375, 0.01092529296875, -0.0035457611083984375, 0.0028591156005859375, 0.02569580078125, -0.03997802734375, 0.0178375244140625, -0.048858642578125, 0.0292510986328125, 0.0379638671875, -0.0020656585693359375, 0.0223541259765625, -0.006450653076171875, 0.04888916015625, 0.0122222900390625, 0.01125335693359375, -0.005336761474609375, -0.0341796875, -0.0770263671875, 0.02801513671875, 0.058441162109375, 0.06097412109375, -0.0523681640625, 0.043914794921875, -0.00341033935546875, -0.039642333984375, -0.05908203125, -0.0067138671875, 
0.02410888671875, 0.0562744140625, 0.0262298583984375, 0.01195526123046875, -0.047698974609375, -0.0703125, -0.0123138427734375, -0.028472900390625, 0.0184478759765625, 0.045562744140625, 0.024871826171875, 0.0059967041015625, 0.051849365234375, -0.04925537109375, -0.02496337890625, -0.01407623291015625, 0.0024433135986328125, 0.0181121826171875, 0.046905517578125, 0.037384033203125, -0.08221435546875, -0.037994384765625, -0.0251007080078125, -0.053009033203125, 0.0194244384765625, -0.00771331787109375, -0.00791168212890625, 0.0187225341796875, 0.053375244140625, -0.05084228515625, 0.042144775390625, -0.00043964385986328125, -0.0177001953125, 0.035186767578125, -0.0005855560302734375, 0.015625, -0.0863037109375, -0.00289154052734375, -0.0181427001953125, 0.0098876953125, -0.0283660888671875, 0.0186309814453125, 0.00792694091796875, -0.01149749755859375, -0.0247650146484375, 0.0078887939453125, -0.00693511962890625, 0.022979736328125, -0.003955841064453125, 0.0029811859130859375, 0.015045166015625, 0.038360595703125, -0.0047760009765625, 0.08551025390625, 0.041412353515625, -0.07525634765625, 0.0440673828125, 0.048858642578125, -0.0218048095703125, 0.024444580078125, -0.0704345703125, 0.023468017578125, -0.01224517822265625, 0.01275634765625, -0.10760498046875, 0.0038814544677734375, 0.0255584716796875, -0.062225341796875, 0.0162200927734375, -0.0117034912109375, -0.055328369140625, -0.036529541015625, -0.03338623046875, 0.0266876220703125, 0.0496826171875, -0.032470703125, 0.0176544189453125, -0.000293731689453125, 0.01416778564453125, -0.0299224853515625, -0.05908203125, -0.00579833984375, -0.00638580322265625, -0.04449462890625, 0.0119781494140625, 0.001354217529296875, 0.019134521484375, 0.00817108154296875, -0.0182342529296875, -0.03521728515625, 0.0252227783203125, -0.000637054443359375, 0.01788330078125, -0.03875732421875, 0.048126220703125, 0.0077667236328125, 0.004276275634765625, 0.018890380859375, -0.01279449462890625, 0.04620361328125, -0.0338134765625, 
-0.0298919677734375, -0.01715087890625, 0.01416778564453125, 0.02020263671875, -0.025390625, 0.0640869140625, 0.049560546875, -0.0007691383361816406, -0.0019969940185546875, -0.047088623046875, -0.01552581787109375, -0.038482666015625, 0.026580810546875, -0.028961181640625, -0.063232421875, 0.0391845703125, 0.007686614990234375, -0.027984619140625, 0.06689453125, 0.01471710205078125, -0.03375244140625, 0.0843505859375, 0.01459503173828125, -0.000537872314453125, 0.016265869140625, -0.045013427734375, 0.02484130859375, -0.0662841796875, -0.032470703125, -0.0675048828125, -0.01389312744140625, -0.0667724609375, -0.004856109619140625, 0.0266265869140625, 0.0080108642578125, -0.0009541511535644531, 0.039794921875, -0.069580078125, 0.01318359375, 0.04974365234375, -0.0009403228759765625, -0.015472412109375, -0.003833770751953125, -0.0166168212890625, 0.001468658447265625, -0.0704345703125, -0.01296234130859375, 0.08807373046875, -0.00983428955078125, 0.05255126953125, -0.0008754730224609375, 0.06854248046875, 0.0003304481506347656, -0.016815185546875, -0.0699462890625, 0.06707763671875, -0.0016326904296875, -0.080322265625, -0.016082763671875, -0.04071044921875, -0.08917236328125, 0.033905029296875, 0.00426483154296875, -0.025848388671875, 0.0215301513671875, 0.003612518310546875, -0.043487548828125, 0.0204925537109375, -0.0546875, 0.08343505859375, -0.01396942138671875, 0.00860595703125, -0.01049041748046875, -0.06378173828125, 0.0059356689453125, 0.01088714599609375, -0.0179901123046875, 0.005062103271484375, -0.01149749755859375, 0.06549072265625, -0.0477294921875, 0.036041259765625, -0.03009033203125, 0.0379638671875, 0.0440673828125, -0.02618408203125, 0.01148223876953125, -0.012664794921875, 0.00567626953125, -0.023406982421875, 0.03021240234375, -0.0113677978515625, -0.037872314453125, 0.040557861328125, -0.041778564453125, -0.05615234375, -0.046630859375, -0.02154541015625, -0.022125244140625, 0.0095977783203125, 0.0287933349609375, 0.03656005859375, 
0.00792694091796875, 0.047027587890625, 0.0772705078125, -0.0230865478515625, 0.023223876953125, 0.029754638671875, -0.00786590576171875, -0.0118865966796875, 0.0648193359375, 0.01094818115234375, -0.0033721923828125, 0.03070068359375, 0.0160064697265625, -0.0526123046875, -0.05084228515625, -0.0188140869140625, 0.0235595703125, -0.05828857421875, -0.0269775390625, -0.053375244140625, -0.051849365234375, -0.052093505859375, 0.037445068359375, -0.0004296302795410156, -0.027679443359375, -0.03656005859375, 0.006511688232421875, 0.03521728515625, 0.01262664794921875, 0.021331787109375, 0.03631591796875, -0.052459716796875, 0.00725555419921875, 0.04449462890625, 0.01580810546875, -0.01149749755859375, -0.0509033203125, -0.01013946533203125, -0.0027294158935546875, -0.033599853515625, -0.061553955078125, -0.00565338134765625, 0.036834716796875, 0.0419921875, 0.0226898193359375, 0.0214691162109375, 0.03924560546875, -0.016021728515625, 0.0660400390625, -0.01474761962890625, -0.0400390625, 0.059906005859375, 0.00716400146484375, 0.020599365234375, 0.09552001953125, 0.0379638671875, -0.0340576171875, -0.0023403167724609375, -0.061798095703125, -0.059906005859375, 0.0517578125, 0.00925445556640625, 0.01384735107421875, -0.01509857177734375, 0.039520263671875, -0.01690673828125, 0.050262451171875, -0.017181396484375, -0.0282440185546875, -0.022003173828125, -0.0157470703125, 0.0174407958984375, -0.005863189697265625, 0.022216796875, -0.04730224609375, 0.040985107421875, -0.0029754638671875, 0.033416748046875, 0.033721923828125, -0.018280029296875, 0.0296173095703125, 0.020599365234375, 0.0287322998046875, 0.033416748046875, -0.033843994140625, -0.01357269287109375, 0.00696563720703125, -0.0299530029296875, -0.005718231201171875, 0.032989501953125, -0.0301666259765625, 0.01514434814453125, 0.0049896240234375, 0.06304931640625, -0.0088958740234375, -0.03778076171875, 0.045928955078125, -0.01012420654296875, -0.043853759765625, -0.0187225341796875, 0.01364898681640625, 
0.00670623779296875, 0.0044097900390625, 0.0210418701171875, 0.006107330322265625, 0.038604736328125, -0.0325927734375, 0.01151275634765625, 0.027618408203125, -0.0265045166015625, 0.012908935546875, 0.05535888671875, 0.01556396484375, -0.018157958984375, 0.0694580078125, -0.0382080078125, -0.07818603515625, 0.056549072265625, -0.0080413818359375, 0.07489013671875, -0.0079498291015625, 0.01296234130859375, 0.037628173828125, 0.020294189453125, 0.01122283935546875, 0.0289306640625, -0.037109375, -0.0472412109375, -0.0131378173828125, -0.038848876953125, -0.026580810546875, 0.0186004638671875, -0.058197021484375, -0.0140380859375, -0.043792724609375, -0.005893707275390625, 0.006984710693359375, 0.014007568359375, -0.049163818359375, 0.031707763671875, -0.0086212158203125, 0.078857421875, -0.0228729248046875, 0.0540771484375, 0.06744384765625, -0.051971435546875, -0.068603515625, -0.00457763671875, -0.025421142578125, -0.07012939453125, 0.058624267578125, 0.02191162109375, 0.025543212890625, -0.01490020751953125, -0.040496826171875, -0.0408935546875, 0.0731201171875, -0.0244598388671875, -0.03228759765625, -0.020111083984375, 0.0041961669921875, 0.037322998046875, -0.005741119384765625, 0.003833770751953125, 0.029693603515625, 0.020599365234375, 0.0243682861328125, -0.0692138671875, -0.008087158203125, -0.00862884521484375, 0.0012636184692382812, 0.024261474609375, -0.054351806640625, 0.0650634765625, 0.0084381103515625, -0.012237548828125, 0.02716064453125, 0.0372314453125, 0.002788543701171875, 0.035980224609375, 0.04913330078125, 0.046234130859375, 0.04541015625, -0.0374755859375, 0.080322265625, -0.0137481689453125, 0.03668212890625, 0.07080078125, 0.004337310791015625, 0.05474853515625, 0.01708984375, -0.039642333984375, 0.022125244140625, 0.07196044921875, -0.019439697265625, 0.0614013671875, 0.0303497314453125, -0.000759124755859375, -0.02642822265625, 0.00847625732421875, -0.05474853515625, 0.03631591796875, 0.01538848876953125, -0.006595611572265625, 
-0.026580810546875, -0.00885009765625, -0.004627227783203125, 0.00516510009765625, -0.00899505615234375, 0.0653076171875, -0.006557464599609375, -0.050933837890625, 0.0699462890625, -0.0142669677734375, 0.056243896484375, -0.055328369140625, 0.00843048095703125, -0.0266876220703125, 0.021270751953125, -0.01087188720703125, -0.03778076171875, 0.0212554931640625, -0.0281829833984375, 0.0017995834350585938, 0.0118255615234375, 0.0599365234375, -0.03375244140625, -0.02410888671875, -0.018157958984375, 0.033843994140625, 0.0266265869140625, -0.032012939453125, -0.0955810546875, -0.01529693603515625, 0.0002409219741821289, -0.0131378173828125, 0.01422882080078125, 0.019439697265625, 0.0252227783203125, 0.046142578125, 0.031982421875, -0.006351470947265625, 0.02252197265625, 0.006282806396484375, 0.06610107421875, -0.045806884765625, -0.036956787109375, -0.0312347412109375, -0.00103759765625, -0.023651123046875, -0.032470703125, 0.06793212890625, 0.037506103515625, 0.064697265625, 0.002109527587890625, 0.036834716796875, -0.01377105712890625, 0.047210693359375, -0.0265960693359375, 0.05877685546875, -0.056671142578125, 0.01180267333984375, 0.0169219970703125, -0.051055908203125, 0.00991058349609375, 0.0618896484375, -0.0172882080078125, 0.0240631103515625, 0.06744384765625, 0.0709228515625, 0.01274871826171875, 0.00017523765563964844, 0.0036525726318359375, -0.00040435791015625, 0.01163482666015625, 0.035064697265625, 0.07220458984375, -0.077880859375, 0.0693359375, -0.01580810546875, 0.024749755859375, -0.032806396484375, -0.0225067138671875, -0.059906005859375, -0.047149658203125, -0.043548583984375, -0.041473388671875, -0.0027618408203125, 0.04168701171875, 0.064453125, -0.07391357421875, -0.010467529296875, -0.0210723876953125, 0.0194244384765625, 0.004352569580078125, -0.022705078125, 0.034515380859375, -0.057586669921875, -0.022735595703125, 0.012908935546875, -0.01338958740234375, -0.00835418701171875, -0.01898193359375, 0.009429931640625, -0.05584716796875, 
0.002567291259765625, 0.023162841796875, 0.0198974609375, -0.05072021484375, -0.002925872802734375, 0.01432037353515625, -0.0181884765625, 0.0108795166015625, 0.03631591796875, -0.070556640625, 0.035003662109375, 0.04644775390625, 0.07818603515625, 0.062225341796875, -0.0012054443359375, 0.005779266357421875, -0.04693603515625, -0.008148193359375, 0.0340576171875, 0.00567626953125, 0.01959228515625, -0.0271453857421875, 0.044891357421875, 0.03436279296875, -0.040191650390625, -0.056243896484375, 0.00315093994140625, -0.07208251953125, -0.0246734619140625, 0.06488037109375, 0.00475311279296875, -0.0227813720703125, 0.005527496337890625, -0.0225830078125, 0.0084381103515625, -0.015899658203125, 0.03851318359375, 0.048126220703125, -0.00103759765625, -0.006557464599609375, -0.038116455078125, 0.04107666015625, 0.055755615234375, -0.043365478515625, -0.026824951171875, 0.0177459716796875, 0.018646240234375, 0.028564453125, 0.0197601318359375, 0.0008606910705566406, 0.0289306640625, -0.005645751953125, -0.0123138427734375, -0.019134521484375, 0.0031948089599609375, -0.030548095703125, 0.02484130859375, -0.0088958740234375, -0.08642578125 ] ]
KB/bert-base-swedish-cased
2022-06-07T16:31:14.000Z
[ "transformers", "pytorch", "tf", "jax", "bert", "fill-mask", "sv", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
KB
null
null
KB/bert-base-swedish-cased
8
9,880
transformers
2022-06-07T16:28:03
--- language: sv --- # Swedish BERT Models The National Library of Sweden / KBLab releases three pretrained language models based on BERT and ALBERT. The models are trained on approximately 15-20GB of text (200M sentences, 3000M tokens) from various sources (books, news, government publications, Swedish Wikipedia and internet forums), aiming to provide a representative BERT model for Swedish text. A more complete description will be published later on. The following three models are currently available: - **bert-base-swedish-cased** (*v1*) - A BERT trained with the same hyperparameters as first published by Google. - **bert-base-swedish-cased-ner** (*experimental*) - A BERT fine-tuned for NER using SUC 3.0. - **albert-base-swedish-cased-alpha** (*alpha*) - A first attempt at an ALBERT for Swedish. All models are cased and trained with whole word masking. ## Files | **name** | **files** | |---------------------------------|-----------| | bert-base-swedish-cased | [config](https://s3.amazonaws.com/models.huggingface.co/bert/KB/bert-base-swedish-cased/config.json), [vocab](https://s3.amazonaws.com/models.huggingface.co/bert/KB/bert-base-swedish-cased/vocab.txt), [pytorch_model.bin](https://s3.amazonaws.com/models.huggingface.co/bert/KB/bert-base-swedish-cased/pytorch_model.bin) | | bert-base-swedish-cased-ner | [config](https://s3.amazonaws.com/models.huggingface.co/bert/KB/bert-base-swedish-cased-ner/config.json), [vocab](https://s3.amazonaws.com/models.huggingface.co/bert/KB/bert-base-swedish-cased-ner/vocab.txt), [pytorch_model.bin](https://s3.amazonaws.com/models.huggingface.co/bert/KB/bert-base-swedish-cased-ner/pytorch_model.bin) | | albert-base-swedish-cased-alpha | [config](https://s3.amazonaws.com/models.huggingface.co/bert/KB/albert-base-swedish-cased-alpha/config.json), [sentencepiece model](https://s3.amazonaws.com/models.huggingface.co/bert/KB/albert-base-swedish-cased-alpha/spiece.model), 
[pytorch_model.bin](https://s3.amazonaws.com/models.huggingface.co/bert/KB/albert-base-swedish-cased-alpha/pytorch_model.bin) | TensorFlow model weights will be released soon. ## Usage requirements / installation instructions The examples below require Huggingface Transformers 2.4.1 and Pytorch 1.3.1 or greater. For Transformers<2.4.0 the tokenizer must be instantiated manually, with the `do_lower_case` parameter set to `False` and `keep_accents` set to `True` (for ALBERT). To create an environment where the examples can be run, run the following in a terminal on your OS of choice. ``` # git clone https://github.com/Kungbib/swedish-bert-models # cd swedish-bert-models # python3 -m venv venv # source venv/bin/activate # pip install --upgrade pip # pip install -r requirements.txt ``` ### BERT Base Swedish A standard BERT base for Swedish trained on a variety of sources. Vocabulary size is ~50k. Using Huggingface Transformers, the model can be loaded in Python as follows: ```python from transformers import AutoModel, AutoTokenizer tok = AutoTokenizer.from_pretrained('KB/bert-base-swedish-cased') model = AutoModel.from_pretrained('KB/bert-base-swedish-cased') ``` ### BERT base fine-tuned for Swedish NER This model is fine-tuned on the SUC 3.0 dataset. Using the Huggingface pipeline, the model can be easily instantiated. For Transformers<2.4.1 it seems the tokenizer must be loaded separately to disable lower-casing of input strings: ```python from transformers import pipeline nlp = pipeline('ner', model='KB/bert-base-swedish-cased-ner', tokenizer='KB/bert-base-swedish-cased-ner') nlp('Idag släpper KB tre språkmodeller.') ``` Running the Python code above should produce something like the result below. Entity types used are `TME` for time, `PRS` for personal names, `LOC` for locations, `EVN` for events and `ORG` for organisations. These labels are subject to change. 
```python [ { 'word': 'Idag', 'score': 0.9998126029968262, 'entity': 'TME' }, { 'word': 'KB', 'score': 0.9814832210540771, 'entity': 'ORG' } ] ``` The BERT tokenizer often splits words into multiple tokens, with the subparts starting with `##`; for example, the string `Engelbert kör Volvo till Herrängens fotbollsklubb` gets tokenized as `Engel ##bert kör Volvo till Herr ##ängens fotbolls ##klubb`. To glue the parts back together one can use something like this: ```python text = 'Engelbert tar Volvon till Tele2 Arena för att titta på Djurgården IF ' +\ 'som spelar fotboll i VM klockan två på kvällen.' l = [] for token in nlp(text): if token['word'].startswith('##'): l[-1]['word'] += token['word'][2:] else: l += [ token ] print(l) ``` Which should result in the following (though less cleanly formatted): ```python [ { 'word': 'Engelbert', 'score': 0.99..., 'entity': 'PRS'}, { 'word': 'Volvon', 'score': 0.99..., 'entity': 'OBJ'}, { 'word': 'Tele2', 'score': 0.99..., 'entity': 'LOC'}, { 'word': 'Arena', 'score': 0.99..., 'entity': 'LOC'}, { 'word': 'Djurgården', 'score': 0.99..., 'entity': 'ORG'}, { 'word': 'IF', 'score': 0.99..., 'entity': 'ORG'}, { 'word': 'VM', 'score': 0.99..., 'entity': 'EVN'}, { 'word': 'klockan', 'score': 0.99..., 'entity': 'TME'}, { 'word': 'två', 'score': 0.99..., 'entity': 'TME'}, { 'word': 'på', 'score': 0.99..., 'entity': 'TME'}, { 'word': 'kvällen', 'score': 0.54..., 'entity': 'TME'} ] ``` ### ALBERT base The easiest way to load the ALBERT model is, again, via Huggingface Transformers: ```python from transformers import AutoModel, AutoTokenizer tok = AutoTokenizer.from_pretrained('KB/albert-base-swedish-cased-alpha') model = AutoModel.from_pretrained('KB/albert-base-swedish-cased-alpha') ``` ## Acknowledgements ❤️ - Resources from Stockholms University, Umeå University and Swedish Language Bank at Gothenburg University were used when fine-tuning BERT for NER. 
- Model pretraining was done partly in-house at the KBLab and partly (for material without active copyright) with the support of Cloud TPUs from Google's TensorFlow Research Cloud (TFRC). - Models are hosted on S3 by Huggingface 🤗
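As an appendix to the NER example above: the `##` sub-token merging loop can be factored into a standalone helper. This is a minimal sketch; the sample tokens below are illustrative (real pipeline output also carries a `score` field, which this helper simply keeps from the first sub-token).

```python
def merge_wordpieces(tokens):
    """Glue WordPiece sub-tokens (those starting with '##') onto the
    preceding token, keeping the first sub-token's other fields."""
    merged = []
    for token in tokens:
        if token['word'].startswith('##') and merged:
            merged[-1]['word'] += token['word'][2:]
        else:
            merged.append(dict(token))
    return merged

# Illustrative sample; in real use pass the output of nlp(text).
tokens = [
    {'word': 'Engel', 'entity': 'PRS'},
    {'word': '##bert', 'entity': 'PRS'},
    {'word': 'kör', 'entity': 'O'},
]
print(merge_wordpieces(tokens))
# → [{'word': 'Engelbert', 'entity': 'PRS'}, {'word': 'kör', 'entity': 'O'}]
```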
6,165
[ [ -0.027435302734375, -0.050689697265625, 0.012542724609375, 0.0287933349609375, -0.0191192626953125, -0.01617431640625, -0.0217437744140625, -0.0291900634765625, 0.0301055908203125, 0.0287322998046875, -0.037078857421875, -0.04962158203125, -0.0458984375, 0.01354217529296875, -0.0144500732421875, 0.091064453125, -0.0135650634765625, 0.0181427001953125, 0.00406646728515625, -0.0200347900390625, -0.01399993896484375, -0.059478759765625, -0.035400390625, -0.03485107421875, 0.032745361328125, 0.00281524658203125, 0.04083251953125, 0.037261962890625, 0.0279693603515625, 0.0294647216796875, -0.0236358642578125, -0.016998291015625, -0.0173492431640625, 0.004726409912109375, 0.00997161865234375, -0.041351318359375, -0.041351318359375, -0.004573822021484375, 0.0361328125, 0.045654296875, -0.00862884521484375, 0.012664794921875, -0.00959014892578125, 0.0279388427734375, -0.01000213623046875, 0.021820068359375, -0.044219970703125, -0.0008559226989746094, -0.004619598388671875, 0.0149993896484375, -0.0150146484375, -0.00687408447265625, 0.0201416015625, -0.042449951171875, 0.0264739990234375, 0.01490020751953125, 0.0980224609375, 0.00955963134765625, -0.006359100341796875, -0.034210205078125, -0.0180816650390625, 0.07000732421875, -0.07061767578125, 0.0384521484375, 0.027252197265625, -0.004650115966796875, -0.0186309814453125, -0.061859130859375, -0.0472412109375, -0.0111236572265625, -0.0223541259765625, 0.01050567626953125, -0.00643157958984375, 0.0018825531005859375, 0.0175933837890625, 0.020263671875, -0.043701171875, 0.0005288124084472656, -0.036834716796875, -0.0240020751953125, 0.05657958984375, -0.01107025146484375, 0.03173828125, -0.046173095703125, -0.024932861328125, -0.03302001953125, -0.040435791015625, -0.0097808837890625, 0.0361328125, 0.035003662109375, -0.0219268798828125, 0.059844970703125, 0.00963592529296875, 0.04248046875, 0.004840850830078125, -0.0066070556640625, 0.031524658203125, -0.01511383056640625, -0.01611328125, 0.0019254684448242188, 
0.07025146484375, 0.0260162353515625, 0.01434326171875, -0.01169586181640625, -0.034912109375, -0.007556915283203125, 0.005352020263671875, -0.057525634765625, -0.03314208984375, 0.033050537109375, -0.05133056640625, -0.0169830322265625, -0.01885986328125, -0.043914794921875, -0.00186920166015625, -0.0259552001953125, 0.049835205078125, -0.0606689453125, -0.0227813720703125, 0.01105499267578125, -0.017669677734375, 0.019073486328125, -0.004467010498046875, -0.0679931640625, 0.0100250244140625, 0.03924560546875, 0.06378173828125, 0.008087158203125, -0.0238800048828125, -0.0025463104248046875, -0.0177764892578125, -0.01528167724609375, 0.0430908203125, -0.0122528076171875, -0.0228271484375, 0.00388336181640625, 0.0164642333984375, -0.0004031658172607422, -0.02557373046875, 0.0252838134765625, -0.0284271240234375, 0.03973388671875, -0.01308441162109375, -0.060455322265625, -0.009429931640625, 0.016387939453125, -0.03997802734375, 0.08978271484375, 0.028564453125, -0.061126708984375, 0.0258941650390625, -0.049041748046875, -0.0308990478515625, 0.00634765625, 0.0006895065307617188, -0.042694091796875, 0.0090179443359375, 0.0247344970703125, 0.047515869140625, -0.0027790069580078125, 0.011627197265625, -0.0200958251953125, -0.0080108642578125, -0.004741668701171875, 0.01194000244140625, 0.087646484375, 0.00849151611328125, -0.0185089111328125, 0.00687408447265625, -0.06005859375, -0.00533294677734375, 0.0194549560546875, -0.03302001953125, -0.0228729248046875, -0.0168914794921875, 0.026763916015625, 0.0175323486328125, 0.0218963623046875, -0.052154541015625, 0.0197906494140625, -0.051971435546875, 0.041595458984375, 0.0517578125, -0.01018524169921875, 0.0281524658203125, -0.0287322998046875, 0.0298614501953125, 0.0008502006530761719, 0.0023250579833984375, -0.004352569580078125, -0.037811279296875, -0.06585693359375, -0.034149169921875, 0.0501708984375, 0.032989501953125, -0.064208984375, 0.0599365234375, -0.0200958251953125, -0.051361083984375, -0.0635986328125, 
-0.001129150390625, 0.023529052734375, 0.0313720703125, 0.03564453125, -0.01023101806640625, -0.05413818359375, -0.07415771484375, -0.0194549560546875, -0.023040771484375, -0.0185546875, 0.02508544921875, 0.061248779296875, -0.01222991943359375, 0.067626953125, -0.02166748046875, -0.033843994140625, -0.01018524169921875, 0.0212249755859375, 0.036773681640625, 0.05322265625, 0.061309814453125, -0.046966552734375, -0.03619384765625, -0.013916015625, -0.04510498046875, 0.0083160400390625, 0.00026535987854003906, -0.006603240966796875, 0.037872314453125, 0.04022216796875, -0.058013916015625, 0.01554107666015625, 0.03326416015625, -0.03021240234375, 0.046630859375, -0.0218658447265625, -0.017059326171875, -0.09686279296875, 0.014068603515625, -0.00238037109375, -0.007122039794921875, -0.042266845703125, 0.00354766845703125, -0.00646209716796875, 0.00841522216796875, -0.040618896484375, 0.060791015625, -0.044281005859375, -0.005977630615234375, 0.00972747802734375, 0.00615692138671875, -0.00557708740234375, 0.0501708984375, 0.01512908935546875, 0.04022216796875, 0.04058837890625, -0.03912353515625, 0.03131103515625, 0.047760009765625, -0.0386962890625, 0.0242462158203125, -0.06597900390625, 0.00726318359375, -0.0205230712890625, 0.0193939208984375, -0.0614013671875, -0.0156402587890625, 0.0178070068359375, -0.043426513671875, 0.03106689453125, -0.0247650146484375, -0.05145263671875, -0.0341796875, -0.01206207275390625, 0.00485992431640625, 0.04339599609375, -0.037872314453125, 0.0723876953125, 0.02899169921875, -0.02178955078125, -0.03973388671875, -0.05938720703125, -0.0015077590942382812, -0.01580810546875, -0.048828125, 0.03973388671875, -0.00815582275390625, -0.0025920867919921875, 0.01507568359375, -0.00519561767578125, -0.0013475418090820312, 0.00577545166015625, 0.006450653076171875, 0.03125, -0.01287078857421875, 0.006488800048828125, -0.0054473876953125, 0.011383056640625, 0.00412750244140625, -0.01502227783203125, 0.0623779296875, -0.0250396728515625, 
-0.0035991668701171875, -0.017791748046875, 0.0241241455078125, 0.03759765625, -0.007289886474609375, 0.071533203125, 0.055633544921875, -0.04345703125, 0.0032444000244140625, -0.044830322265625, -0.0260772705078125, -0.033294677734375, 0.022125244140625, -0.043701171875, -0.0587158203125, 0.05426025390625, 0.01953125, 0.031158447265625, 0.052764892578125, 0.05145263671875, -0.01525115966796875, 0.08343505859375, 0.046417236328125, -0.007396697998046875, 0.04052734375, -0.034088134765625, 0.0114288330078125, -0.054473876953125, -0.0262603759765625, -0.031646728515625, -0.015960693359375, -0.053131103515625, -0.014068603515625, 0.01407623291015625, 0.017547607421875, -0.00994873046875, 0.051361083984375, -0.054901123046875, 0.0120697021484375, 0.051361083984375, 0.0002161264419555664, -0.013671875, 0.00955963134765625, -0.03204345703125, -0.0009531974792480469, -0.061492919921875, -0.034942626953125, 0.080810546875, 0.035125732421875, 0.03515625, 0.002918243408203125, 0.06451416015625, -0.0010557174682617188, 0.0350341796875, -0.06597900390625, 0.034393310546875, -0.00745391845703125, -0.06964111328125, -0.02081298828125, -0.0209197998046875, -0.0721435546875, 0.02459716796875, -0.0260009765625, -0.05218505859375, 0.0042724609375, 0.0007686614990234375, -0.0265045166015625, 0.01412200927734375, -0.050567626953125, 0.05584716796875, -0.013031005859375, -0.0096588134765625, 0.006725311279296875, -0.059478759765625, 0.01276397705078125, -0.0021915435791015625, 0.00821685791015625, -0.0152130126953125, 0.0126800537109375, 0.07489013671875, -0.04296875, 0.054412841796875, -0.0140228271484375, -0.00893402099609375, 0.01690673828125, -0.005584716796875, 0.03076171875, -0.01038360595703125, -0.0088958740234375, 0.038116455078125, 0.0195465087890625, -0.032745361328125, -0.023284912109375, 0.040374755859375, -0.0657958984375, -0.0234832763671875, -0.0323486328125, -0.03424072265625, -0.0028228759765625, 0.031402587890625, 0.036285400390625, 0.0257720947265625, 
-0.003864288330078125, 0.019744873046875, 0.035675048828125, -0.0251312255859375, 0.039794921875, 0.03411865234375, -0.015777587890625, -0.040740966796875, 0.049072265625, 0.01108551025390625, -0.0090484619140625, 0.01320648193359375, 0.00643157958984375, -0.036376953125, -0.031890869140625, -0.0210723876953125, 0.041259765625, -0.042449951171875, -0.021240234375, -0.07147216796875, -0.0236663818359375, -0.0579833984375, -0.00470733642578125, -0.0261688232421875, -0.0310821533203125, -0.026947021484375, -0.00424957275390625, 0.043182373046875, 0.047760009765625, -0.01328277587890625, 0.0408935546875, -0.049896240234375, 0.0216064453125, 0.006572723388671875, 0.035186767578125, -0.0224456787109375, -0.038330078125, -0.010955810546875, 0.006450653076171875, -0.0063323974609375, -0.059814453125, 0.047882080078125, 0.006725311279296875, 0.034637451171875, 0.01123809814453125, -0.0035858154296875, 0.042999267578125, -0.052276611328125, 0.07073974609375, 0.0205841064453125, -0.07391357421875, 0.036895751953125, -0.02783203125, 0.004405975341796875, 0.026611328125, 0.036407470703125, -0.05303955078125, -0.0233154296875, -0.07452392578125, -0.08868408203125, 0.0718994140625, 0.028533935546875, 0.012969970703125, -0.01219940185546875, 0.027313232421875, 0.0005116462707519531, 0.0152587890625, -0.058441162109375, -0.031494140625, -0.0296173095703125, -0.033660888671875, -0.004512786865234375, -0.0194854736328125, -0.01129913330078125, -0.03216552734375, 0.08221435546875, 0.0005860328674316406, 0.03558349609375, 0.0264739990234375, -0.0252838134765625, 0.01561737060546875, -0.007598876953125, 0.0419921875, 0.0278167724609375, -0.0350341796875, -0.01148223876953125, 0.02490234375, -0.02655029296875, -0.00868988037109375, 0.0246124267578125, -0.004291534423828125, 0.0323486328125, 0.035919189453125, 0.07293701171875, 0.0181732177734375, -0.035552978515625, 0.04217529296875, -0.00885009765625, -0.03955078125, -0.038482666015625, -0.0206146240234375, 0.0190582275390625, 
0.00989532470703125, 0.0164947509765625, -0.0111541748046875, -0.014862060546875, -0.031402587890625, 0.025543212890625, 0.022369384765625, -0.019775390625, -0.02978515625, 0.0517578125, 0.005313873291015625, -0.034942626953125, 0.057952880859375, -0.004058837890625, -0.058837890625, 0.045989990234375, 0.0433349609375, 0.057586669921875, -0.0091094970703125, 0.006404876708984375, 0.03997802734375, 0.0267791748046875, -0.01351165771484375, 0.03387451171875, 0.00492095947265625, -0.06585693359375, -0.024139404296875, -0.07403564453125, -0.01061248779296875, 0.0262603759765625, -0.04962158203125, 0.025390625, -0.033355712890625, -0.024139404296875, 0.01153564453125, 0.01348876953125, -0.06573486328125, 0.01776123046875, 0.01177978515625, 0.072265625, -0.06915283203125, 0.051910400390625, 0.0672607421875, -0.041259765625, -0.0697021484375, -0.0224761962890625, -0.0255889892578125, -0.0657958984375, 0.050750732421875, 0.01491546630859375, 0.0260467529296875, 0.00656890869140625, -0.050689697265625, -0.07958984375, 0.06280517578125, 0.0140838623046875, -0.03436279296875, -0.0175018310546875, -0.005802154541015625, 0.051544189453125, -0.0182037353515625, 0.0305328369140625, 0.048431396484375, 0.0338134765625, -0.005695343017578125, -0.058929443359375, 0.003955841064453125, -0.0276641845703125, 0.0091094970703125, 0.0041656494140625, -0.0487060546875, 0.069091796875, 0.003772735595703125, -0.01407623291015625, 0.009063720703125, 0.06488037109375, 0.0228271484375, -0.01251220703125, 0.0399169921875, 0.046966552734375, 0.05218505859375, -0.0160675048828125, 0.07147216796875, -0.04364013671875, 0.048675537109375, 0.061065673828125, 0.00426483154296875, 0.056121826171875, 0.03955078125, -0.01568603515625, 0.04022216796875, 0.051361083984375, -0.025726318359375, 0.0294647216796875, 0.0228729248046875, -0.00457763671875, -0.01116943359375, 0.00235748291015625, -0.0204925537109375, 0.042022705078125, 0.0290069580078125, -0.023834228515625, -0.0139312744140625, 
-0.0025920867919921875, 0.027069091796875, -0.0196533203125, -0.0012407302856445312, 0.050994873046875, 0.0085296630859375, -0.05181884765625, 0.05230712890625, 0.0108184814453125, 0.0718994140625, -0.040374755859375, 0.00975799560546875, -0.01044464111328125, 0.00798797607421875, 0.00033211708068847656, -0.048431396484375, 0.01519012451171875, -0.008453369140625, -0.005580902099609375, -0.0291595458984375, 0.04736328125, -0.05181884765625, -0.042877197265625, 0.0254669189453125, 0.0243072509765625, 0.0384521484375, 0.007328033447265625, -0.07806396484375, -0.0015916824340820312, -0.002178192138671875, -0.041412353515625, 0.0090484619140625, 0.017608642578125, 0.012054443359375, 0.043182373046875, 0.05572509765625, 0.01470947265625, 0.0118865966796875, 0.00025177001953125, 0.05853271484375, -0.042816162109375, -0.0275115966796875, -0.05462646484375, 0.04180908203125, -0.01354217529296875, -0.038482666015625, 0.045989990234375, 0.04022216796875, 0.062103271484375, -0.00978851318359375, 0.050994873046875, -0.02947998046875, 0.0304107666015625, -0.031585693359375, 0.07269287109375, -0.0576171875, -0.00931549072265625, -0.008026123046875, -0.07269287109375, -0.0225677490234375, 0.0653076171875, -0.004180908203125, 0.006641387939453125, 0.03631591796875, 0.049835205078125, 0.0024318695068359375, -0.017181396484375, 0.005184173583984375, 0.035369873046875, 0.0254364013671875, 0.0186004638671875, 0.0303955078125, -0.05401611328125, 0.033843994140625, -0.0384521484375, -0.003742218017578125, -0.0258026123046875, -0.0701904296875, -0.07891845703125, -0.0545654296875, -0.0279083251953125, -0.043701171875, -0.005176544189453125, 0.0860595703125, 0.05816650390625, -0.0760498046875, -0.0023403167724609375, -0.00685882568359375, -0.00020599365234375, -0.0060272216796875, -0.021392822265625, 0.047637939453125, -0.0254364013671875, -0.06341552734375, 0.021514892578125, -0.0123291015625, 0.02679443359375, -0.0025577545166015625, 0.0074005126953125, -0.024505615234375, 
0.0131072998046875, 0.044891357421875, 0.0178680419921875, -0.06292724609375, -0.0264739990234375, 0.00145721435546875, -0.00811004638671875, 0.0026531219482421875, 0.0276641845703125, -0.040435791015625, 0.0199432373046875, 0.03961181640625, 0.0267181396484375, 0.05169677734375, -0.01035308837890625, 0.04425048828125, -0.07257080078125, 0.01983642578125, 0.0266876220703125, 0.04693603515625, 0.035736083984375, -0.00911712646484375, 0.03515625, 0.01183319091796875, -0.037933349609375, -0.07440185546875, -0.006916046142578125, -0.06207275390625, -0.026611328125, 0.08251953125, -0.01459503173828125, -0.04052734375, 0.00768280029296875, -0.015716552734375, 0.0386962890625, -0.01953125, 0.057403564453125, 0.0753173828125, 0.0016918182373046875, 0.004795074462890625, -0.0211944580078125, 0.0467529296875, 0.050048828125, -0.035064697265625, -0.0066986083984375, 0.01849365234375, 0.049896240234375, 0.0294189453125, 0.0386962890625, -0.01320648193359375, -0.00839996337890625, 0.0031795501708984375, 0.050445556640625, 0.0020847320556640625, -0.0117034912109375, -0.0242462158203125, -0.004291534423828125, -0.0256805419921875, -0.00865936279296875 ] ]
facebook/metaclip-h14-fullcc2.5b
2023-10-14T09:01:40.000Z
[ "transformers", "pytorch", "clip", "zero-shot-image-classification", "vision", "metaclip", "arxiv:2309.16671", "arxiv:2103.00020", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us", "has_space" ]
zero-shot-image-classification
facebook
null
null
facebook/metaclip-h14-fullcc2.5b
14
9,869
transformers
2023-10-09T20:33:38
--- license: cc-by-nc-4.0 tags: - vision - metaclip widget: - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png candidate_labels: playing music, playing sports example_title: Cat & Dog --- # MetaCLIP model, huge-sized version, patch resolution 14 MetaCLIP model applied to 2.5 billion data points of CommonCrawl (CC). It was introduced in the paper [Demystifying CLIP Data](https://arxiv.org/abs/2309.16671) by Xu et al. and first released in [this repository](https://github.com/facebookresearch/MetaCLIP). Disclaimer: The team releasing MetaCLIP did not write a model card for this model, so this model card has been written by the Hugging Face team. ## Model description The [Demystifying CLIP Data](https://arxiv.org/abs/2309.16671) paper aims to reveal CLIP’s training-data curation method; OpenAI never open-sourced the code for its data preparation pipeline. <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/clip_overview.jpg" alt="drawing" width="600"/> <small> CLIP high-level overview. Taken from the <a href="https://arxiv.org/abs/2103.00020">CLIP paper</a>. </small> ## Intended uses & limitations You can use the raw model to link images and text in a shared embedding space. This enables things like zero-shot image classification, text-based image retrieval, image-based text retrieval, etc. ### How to use We refer to the [docs](https://huggingface.co/docs/transformers/main/en/model_doc/clip#usage); simply substitute this checkpoint's name for the model names used there. ### BibTeX entry and citation info ```bibtex @misc{xu2023demystifying, title={Demystifying CLIP Data}, author={Hu Xu and Saining Xie and Xiaoqing Ellen Tan and Po-Yao Huang and Russell Howes and Vasu Sharma and Shang-Wen Li and Gargi Ghosh and Luke Zettlemoyer and Christoph Feichtenhofer}, year={2023}, eprint={2309.16671}, archivePrefix={arXiv}, primaryClass={cs.CV} } ```
2,012
[ [ -0.05267333984375, -0.047821044921875, 0.019622802734375, 0.01074981689453125, -0.0167694091796875, -0.01309967041015625, -0.0009031295776367188, -0.045013427734375, 0.04388427734375, 0.035430908203125, -0.055084228515625, -0.042999267578125, -0.04840087890625, -0.0034999847412109375, -0.04241943359375, 0.075439453125, -0.004787445068359375, -0.0143280029296875, -0.016265869140625, -0.03131103515625, -0.01332855224609375, -0.055877685546875, 0.0049285888671875, -0.0212860107421875, 0.0118560791015625, 0.023895263671875, 0.0732421875, 0.06365966796875, 0.051513671875, 0.017852783203125, -0.0306243896484375, 0.00074005126953125, -0.036346435546875, -0.041046142578125, -0.0161590576171875, -0.0233306884765625, -0.05548095703125, 0.02783203125, 0.0361328125, 0.033538818359375, -0.0001042485237121582, 0.0237884521484375, 0.0031375885009765625, 0.0272674560546875, -0.0389404296875, -0.0000718235969543457, -0.0310516357421875, 0.00843048095703125, -0.0196075439453125, 0.014007568359375, -0.0275421142578125, -0.01085662841796875, 0.0286102294921875, -0.0350341796875, 0.042633056640625, -0.02117919921875, 0.097412109375, 0.002597808837890625, -0.020904541015625, -0.002899169921875, -0.053863525390625, 0.0634765625, -0.036224365234375, 0.0240478515625, 0.00580596923828125, 0.03179931640625, 0.015533447265625, -0.0689697265625, -0.04022216796875, 0.0277252197265625, 0.006412506103515625, -0.00260162353515625, -0.033599853515625, -0.0016040802001953125, 0.0467529296875, 0.037445068359375, -0.01219940185546875, -0.01727294921875, -0.055023193359375, -0.0207366943359375, 0.04876708984375, 0.017486572265625, 0.03509521484375, -0.032318115234375, -0.06610107421875, -0.0467529296875, -0.049224853515625, 0.0165557861328125, 0.0217132568359375, 0.00553131103515625, -0.04644775390625, 0.0487060546875, 0.02423095703125, 0.0262298583984375, 0.01378631591796875, -0.0106964111328125, 0.0325927734375, -0.0242462158203125, -0.004146575927734375, -0.0208282470703125, 0.0772705078125, 
0.049774169921875, 0.01049041748046875, 0.0127410888671875, 0.01561737060546875, -0.005580902099609375, 0.0196990966796875, -0.075927734375, -0.0185089111328125, 0.005298614501953125, -0.04266357421875, -0.0186920166015625, 0.023895263671875, -0.06439208984375, 0.007259368896484375, -0.0179595947265625, 0.0200958251953125, -0.0321044921875, -0.0197601318359375, 0.00201416015625, -0.01247406005859375, 0.020294189453125, 0.01041412353515625, -0.05731201171875, 0.0224761962890625, 0.04608154296875, 0.08746337890625, -0.01517486572265625, -0.0264129638671875, 0.0059814453125, 0.0200958251953125, 0.006866455078125, 0.06488037109375, -0.031219482421875, -0.022430419921875, -0.0064544677734375, 0.031768798828125, 0.00005942583084106445, -0.04132080078125, 0.053436279296875, -0.01690673828125, -0.0011548995971679688, -0.02923583984375, -0.01605224609375, -0.03167724609375, 0.0157623291015625, -0.07257080078125, 0.06787109375, 0.0130462646484375, -0.07720947265625, 0.0196990966796875, -0.045654296875, 0.005008697509765625, -0.0238189697265625, 0.01363372802734375, -0.04998779296875, -0.01605224609375, 0.033538818359375, 0.03369140625, -0.019317626953125, 0.037445068359375, -0.055633544921875, -0.022735595703125, 0.0033016204833984375, -0.01157379150390625, 0.068359375, 0.002925872802734375, 0.007511138916015625, 0.0142364501953125, -0.033538818359375, -0.00464630126953125, 0.007110595703125, 0.01084136962890625, -0.01378631591796875, -0.01500701904296875, 0.014617919921875, 0.0290985107421875, -0.0006413459777832031, -0.046417236328125, 0.034515380859375, -0.0010662078857421875, 0.0537109375, 0.032073974609375, 0.0033740997314453125, 0.0241241455078125, -0.0300445556640625, 0.037200927734375, 0.0076141357421875, 0.03570556640625, -0.032257080078125, -0.0245361328125, -0.02471923828125, -0.037750244140625, 0.032806396484375, 0.0192108154296875, -0.025421142578125, 0.0167083740234375, -0.0193634033203125, -0.062347412109375, -0.042999267578125, -0.0165252685546875, 
0.007053375244140625, 0.0232391357421875, 0.04388427734375, -0.0355224609375, -0.054840087890625, -0.07763671875, 0.0004940032958984375, 0.0023403167724609375, -0.0203857421875, 0.0258331298828125, 0.04254150390625, -0.0287322998046875, 0.09051513671875, -0.0555419921875, -0.029937744140625, -0.0166168212890625, 0.017120361328125, 0.0205230712890625, 0.031707763671875, 0.0645751953125, -0.059814453125, -0.030517578125, -0.04931640625, -0.051422119140625, -0.0016641616821289062, 0.008880615234375, -0.0201568603515625, -0.01348114013671875, 0.0269622802734375, -0.05621337890625, 0.0438232421875, 0.05755615234375, -0.004215240478515625, 0.052215576171875, 0.014617919921875, -0.00543212890625, -0.06744384765625, 0.03240966796875, 0.004352569580078125, -0.031219482421875, -0.035888671875, 0.017059326171875, 0.00397491455078125, -0.035125732421875, -0.05426025390625, 0.0270233154296875, -0.033111572265625, 0.0001271963119506836, -0.0261993408203125, 0.01346588134765625, 0.00467681884765625, 0.05694580078125, 0.018890380859375, 0.04803466796875, 0.02862548828125, -0.040802001953125, 0.0174713134765625, 0.06536865234375, 0.0032749176025390625, 0.039154052734375, -0.0867919921875, 0.00539398193359375, 0.008026123046875, 0.038787841796875, -0.039947509765625, -0.0276947021484375, 0.022369384765625, -0.0235443115234375, 0.00891876220703125, -0.0218658447265625, -0.035797119140625, -0.045013427734375, -0.0484619140625, 0.043975830078125, 0.0364990234375, -0.0574951171875, 0.041748046875, 0.0589599609375, 0.0092010498046875, -0.0279541015625, -0.049896240234375, -0.01580810546875, -0.004222869873046875, -0.0638427734375, 0.055908203125, -0.01776123046875, 0.0180816650390625, 0.0037288665771484375, -0.006740570068359375, -0.0287322998046875, -0.0256195068359375, 0.041351318359375, 0.029144287109375, 0.011505126953125, -0.006923675537109375, 0.0026340484619140625, -0.0196685791015625, 0.0098876953125, 0.03900146484375, 0.0457763671875, -0.006378173828125, -0.0295257568359375, 
-0.0296173095703125, 0.01416778564453125, 0.042236328125, -0.0079498291015625, 0.04278564453125, 0.0526123046875, -0.0193634033203125, -0.0008792877197265625, -0.019287109375, -0.01363372802734375, -0.035675048828125, 0.005260467529296875, -0.0102386474609375, -0.0469970703125, 0.055206298828125, -0.01322174072265625, 0.005290985107421875, 0.053131103515625, 0.0352783203125, 0.01535797119140625, 0.0791015625, 0.08331298828125, -0.0127410888671875, 0.04034423828125, -0.046875, 0.009918212890625, -0.0863037109375, -0.0186920166015625, -0.0241241455078125, -0.0274200439453125, -0.031982421875, -0.0489501953125, 0.04461669921875, 0.024017333984375, -0.01369476318359375, 0.06317138671875, -0.059234619140625, 0.0204010009765625, 0.0400390625, 0.0284423828125, 0.01947021484375, -0.011199951171875, -0.01058197021484375, -0.022186279296875, -0.039031982421875, -0.0321044921875, 0.06689453125, 0.05712890625, 0.0521240234375, -0.0079803466796875, 0.03076171875, -0.0013837814331054688, 0.0017251968383789062, -0.050628662109375, 0.0401611328125, -0.00514984130859375, -0.06597900390625, 0.00405120849609375, -0.00568389892578125, -0.05169677734375, 0.00406646728515625, -0.0214996337890625, -0.05230712890625, 0.02764892578125, 0.01087188720703125, -0.005107879638671875, 0.04443359375, -0.029205322265625, 0.055908203125, 0.0204315185546875, -0.0177001953125, -0.0099029541015625, -0.0546875, 0.049957275390625, 0.0094757080078125, -0.004215240478515625, -0.00760650634765625, 0.023345947265625, 0.073974609375, -0.053802490234375, 0.0732421875, -0.006702423095703125, 0.00353240966796875, 0.061126708984375, -0.006305694580078125, 0.0265350341796875, -0.00650787353515625, 0.0038471221923828125, 0.045196533203125, 0.0287017822265625, -0.0183868408203125, -0.03350830078125, 0.024749755859375, -0.057952880859375, -0.01422119140625, -0.0272064208984375, -0.04547119140625, 0.012176513671875, 0.01450347900390625, 0.04058837890625, 0.041412353515625, -0.0185546875, 0.0193328857421875, 
0.0343017578125, -0.0240478515625, 0.037017822265625, 0.01788330078125, -0.0215301513671875, -0.035736083984375, 0.0478515625, 0.0164947509765625, 0.0171661376953125, -0.0018215179443359375, 0.0020999908447265625, -0.0059356689453125, -0.004398345947265625, -0.0207977294921875, 0.04376220703125, -0.03448486328125, -0.038543701171875, -0.05462646484375, -0.040374755859375, -0.0245819091796875, -0.01160430908203125, -0.04986572265625, -0.0267181396484375, -0.027435302734375, 0.003631591796875, 0.03662109375, 0.04144287109375, -0.0299072265625, 0.0396728515625, -0.08001708984375, 0.032623291015625, 0.001644134521484375, 0.039947509765625, 0.01837158203125, -0.03369140625, -0.0171051025390625, 0.018768310546875, -0.07623291015625, -0.058013916015625, 0.03753662109375, 0.0094146728515625, 0.032867431640625, 0.031768798828125, 0.01727294921875, 0.06427001953125, -0.032073974609375, 0.0699462890625, 0.0234222412109375, -0.0855712890625, 0.0231475830078125, -0.034637451171875, 0.0168304443359375, 0.032562255859375, 0.053436279296875, -0.04534912109375, -0.00952911376953125, -0.072021484375, -0.070068359375, 0.055419921875, -0.01465606689453125, 0.019317626953125, 0.007747650146484375, 0.007785797119140625, 0.003963470458984375, 0.01190185546875, -0.054443359375, -0.0231475830078125, -0.020904541015625, -0.0107421875, 0.0248870849609375, -0.03790283203125, -0.015533447265625, -0.03704833984375, 0.0290069580078125, -0.01326751708984375, 0.052978515625, 0.0244293212890625, -0.01171112060546875, -0.0184173583984375, -0.01505279541015625, 0.0323486328125, 0.0295562744140625, -0.029144287109375, -0.01378631591796875, 0.0005779266357421875, -0.0594482421875, -0.01502227783203125, -0.0052032470703125, -0.015899658203125, 0.0146026611328125, 0.033203125, 0.055389404296875, 0.043792724609375, -0.04412841796875, 0.0689697265625, 0.0199127197265625, -0.0389404296875, -0.0300445556640625, -0.0039825439453125, -0.00103759765625, 0.0182342529296875, 0.0233001708984375, 
0.0059967041015625, 0.00567626953125, -0.04345703125, 0.039794921875, 0.0289306640625, -0.040435791015625, -0.06646728515625, 0.051727294921875, 0.0038051605224609375, -0.00225067138671875, 0.048370361328125, 0.00823211669921875, -0.06103515625, 0.025665283203125, 0.03692626953125, 0.041717529296875, -0.00945281982421875, 0.0238189697265625, 0.06005859375, -0.000759124755859375, -0.01910400390625, 0.00041031837463378906, 0.0059051513671875, -0.0389404296875, -0.0233306884765625, -0.0225830078125, -0.055694580078125, 0.0034809112548828125, -0.07269287109375, 0.061248779296875, -0.046478271484375, -0.0159454345703125, -0.00481414794921875, 0.001750946044921875, -0.052886962890625, -0.00931549072265625, 0.0015363693237304688, 0.1031494140625, -0.07635498046875, 0.039825439453125, 0.0269317626953125, -0.047821044921875, -0.0714111328125, -0.02001953125, 0.002490997314453125, -0.044952392578125, 0.02581787109375, 0.0157623291015625, 0.0171051025390625, -0.022125244140625, -0.09112548828125, -0.072021484375, 0.08331298828125, 0.0256500244140625, -0.0309906005859375, -0.0178985595703125, -0.0014562606811523438, 0.0396728515625, -0.0284576416015625, 0.044647216796875, 0.0289306640625, 0.0247039794921875, 0.03546142578125, -0.07025146484375, 0.001850128173828125, -0.0128936767578125, -0.0207672119140625, 0.022918701171875, -0.0738525390625, 0.079833984375, -0.019805908203125, -0.03363037109375, 0.006649017333984375, 0.04705810546875, 0.0217132568359375, 0.0275421142578125, 0.0246124267578125, 0.050872802734375, 0.0290985107421875, -0.0030803680419921875, 0.064208984375, -0.01337432861328125, 0.05462646484375, 0.068603515625, -0.00455474853515625, 0.061737060546875, 0.024627685546875, -0.0205841064453125, 0.04931640625, 0.0276336669921875, -0.05560302734375, 0.03411865234375, -0.0225830078125, 0.022186279296875, -0.004192352294921875, -0.024749755859375, -0.02459716796875, 0.033721923828125, 0.006198883056640625, -0.0347900390625, -0.02886962890625, 0.0106353759765625, 
0.01678466796875, 0.00021338462829589844, -0.004230499267578125, 0.04296875, 0.0036468505859375, -0.01120758056640625, 0.04119873046875, -0.01181793212890625, 0.059051513671875, -0.028900146484375, -0.0193939208984375, 0.0152740478515625, -0.0000413060188293457, -0.021942138671875, -0.06256103515625, 0.049774169921875, 0.005535125732421875, -0.0276947021484375, -0.0060882568359375, 0.044647216796875, -0.0252227783203125, -0.0261993408203125, 0.024322509765625, -0.0025844573974609375, 0.02789306640625, -0.00037026405334472656, -0.05230712890625, 0.035675048828125, -0.0012750625610351562, -0.013885498046875, 0.018585205078125, 0.0151519775390625, 0.0016469955444335938, 0.05010986328125, 0.037506103515625, -0.00843048095703125, -0.01107025146484375, -0.02874755859375, 0.09124755859375, -0.06243896484375, -0.0201263427734375, -0.043975830078125, 0.04779052734375, -0.01155853271484375, -0.0316162109375, 0.04864501953125, 0.03680419921875, 0.059814453125, -0.00893402099609375, 0.032440185546875, -0.018707275390625, 0.06829833984375, -0.01210784912109375, 0.0330810546875, -0.019500732421875, -0.01611328125, -0.05865478515625, -0.07342529296875, -0.0197296142578125, 0.028778076171875, -0.0191802978515625, -0.00994110107421875, 0.043975830078125, 0.032440185546875, -0.031768798828125, 0.002475738525390625, 0.00014507770538330078, 0.006504058837890625, 0.033966064453125, 0.03253173828125, 0.0252227783203125, -0.06549072265625, 0.051910400390625, -0.04052734375, -0.0042266845703125, -0.030242919921875, -0.07025146484375, -0.06964111328125, -0.04541015625, -0.019805908203125, -0.0183868408203125, -0.03656005859375, 0.067626953125, 0.07452392578125, -0.044647216796875, 0.0050506591796875, 0.0180206298828125, 0.003902435302734375, -0.01380157470703125, -0.01580810546875, 0.037017822265625, 0.0242767333984375, -0.061981201171875, -0.003986358642578125, -0.00907135009765625, 0.0192108154296875, -0.0142974853515625, 0.01317596435546875, -0.0028247833251953125, 
0.0008783340454101562, 0.0318603515625, 0.03753662109375, -0.02825927734375, -0.036285400390625, 0.007595062255859375, -0.00004953145980834961, 0.020294189453125, 0.059478759765625, -0.0469970703125, 0.01091766357421875, 0.036346435546875, 0.01000213623046875, 0.060028076171875, 0.03277587890625, 0.040252685546875, -0.0428466796875, 0.03143310546875, -0.002040863037109375, 0.0127716064453125, 0.0208740234375, -0.007541656494140625, 0.0438232421875, 0.025482177734375, -0.048675537109375, -0.07452392578125, 0.01849365234375, -0.0987548828125, -0.0028476715087890625, 0.051849365234375, -0.01812744140625, -0.040985107421875, 0.0205230712890625, -0.0201873779296875, 0.023590087890625, -0.026031494140625, 0.0411376953125, 0.035491943359375, 0.0118408203125, -0.03314208984375, -0.0302734375, 0.0249176025390625, -0.01108551025390625, -0.0300445556640625, 0.006011962890625, 0.015625, 0.032684326171875, 0.039825439453125, 0.0127716064453125, -0.039031982421875, 0.0115203857421875, 0.0021076202392578125, 0.039031982421875, -0.0257110595703125, -0.032623291015625, -0.0179595947265625, 0.01219940185546875, -0.0213623046875, -0.01230621337890625 ] ]
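The MetaCLIP card above defers to the generic Transformers CLIP usage docs. As a minimal sketch of that input/output contract — using a tiny, randomly initialized CLIP so the snippet runs without downloading the multi-gigabyte checkpoint; with the real model you would instead call `CLIPModel.from_pretrained("facebook/metaclip-h14-fullcc2.5b")` together with the matching `CLIPProcessor`, and the tiny config sizes below are arbitrary illustration values, not properties of the released model:

```python
import torch
from transformers import CLIPConfig, CLIPModel

torch.manual_seed(0)

# Tiny, randomly initialized CLIP (hypothetical sizes chosen only so this
# runs quickly offline; the real checkpoint is ViT-H/14 sized).
config = CLIPConfig(
    text_config={"hidden_size": 32, "intermediate_size": 64,
                 "num_hidden_layers": 2, "num_attention_heads": 2},
    vision_config={"hidden_size": 32, "intermediate_size": 64,
                   "num_hidden_layers": 2, "num_attention_heads": 2,
                   "image_size": 32, "patch_size": 8},
    projection_dim=16,
)
model = CLIPModel(config)
model.eval()

# Stand-ins for processor output: 2 tokenized candidate labels, 1 image.
input_ids = torch.randint(0, config.text_config.vocab_size, (2, 7))
pixel_values = torch.randn(1, 3, 32, 32)

with torch.no_grad():
    out = model(input_ids=input_ids, pixel_values=pixel_values)

# One similarity score per (image, label) pair; softmax over the labels
# gives the zero-shot classification distribution.
probs = out.logits_per_image.softmax(dim=-1)
print(probs.shape)  # torch.Size([1, 2])
```

The `logits_per_image` row holds one image-text similarity score per candidate label, which is exactly what the zero-shot widget on the card computes.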
klue/roberta-large
2023-06-12T12:29:57.000Z
[ "transformers", "pytorch", "safetensors", "roberta", "fill-mask", "korean", "klue", "ko", "arxiv:2105.09680", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
klue
null
null
klue/roberta-large
28
9,861
transformers
2022-03-02T23:29:05
--- language: ko tags: - korean - klue mask_token: "[MASK]" widget: - text: 대한민국의 수도는 [MASK] 입니다. --- # KLUE RoBERTa large RoBERTa model pretrained on the Korean language. See [GitHub](https://github.com/KLUE-benchmark/KLUE) and [Paper](https://arxiv.org/abs/2105.09680) for more details. ## How to use _NOTE:_ Use `BertTokenizer` instead of `RobertaTokenizer` (`AutoTokenizer` will load `BertTokenizer`). ```python from transformers import AutoModel, AutoTokenizer model = AutoModel.from_pretrained("klue/roberta-large") tokenizer = AutoTokenizer.from_pretrained("klue/roberta-large") ``` ## BibTeX entry and citation info ```bibtex @misc{park2021klue, title={KLUE: Korean Language Understanding Evaluation}, author={Sungjoon Park and Jihyung Moon and Sungdong Kim and Won Ik Cho and Jiyoon Han and Jangwon Park and Chisung Song and Junseong Kim and Yongsook Song and Taehwan Oh and Joohong Lee and Juhyun Oh and Sungwon Lyu and Younghoon Jeong and Inkwon Lee and Sangwoo Seo and Dongjun Lee and Hyunwoo Kim and Myeonghwa Lee and Seongbo Jang and Seungwon Do and Sunkyoung Kim and Kyungtae Lim and Jongwon Lee and Kyumin Park and Jamin Shin and Seonghyun Kim and Lucy Park and Alice Oh and Jungwoo Ha and Kyunghyun Cho}, year={2021}, eprint={2105.09680}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
1,350
[ [ (embedding vector values omitted) ] ]
PygmalionAI/pygmalion-2.7b
2023-01-12T19:20:02.000Z
[ "transformers", "pytorch", "tensorboard", "gpt_neo", "text-generation", "text generation", "conversational", "en", "license:creativeml-openrail-m", "has_space", "region:us" ]
conversational
PygmalionAI
null
null
PygmalionAI/pygmalion-2.7b
48
9,845
transformers
2023-01-05T14:57:04
---
license: creativeml-openrail-m
language:
- en
thumbnail:
tags:
- text generation
- conversational
inference: false
---

# Pygmalion 2.7B

## Model description

Pygmalion 2.7B is a proof-of-concept dialogue model based on EleutherAI's [gpt-neo-2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B).

**Warning:** This model is **NOT** suitable for use by minors. It **will** output X-rated content under certain circumstances.

## Training data

The fine-tuning dataset consisted of 56 MB of dialogue data gathered from multiple sources, which includes both real _and_ partially machine-generated conversations.

## Training procedure

Model weights were initialized from the `uft-2.7b` ConvoGPT model made available in [this commit](https://huggingface.co/hakurei/convogpt/tree/07707377dee0aa7d1ee5363ef660b13eb5b73f9d/2.7b-uft). The model was then further fine-tuned on ~48.5 million tokens for ~5k steps on 4 NVIDIA A40s using DeepSpeed.

## Intended use

### The easy way

We provide a notebook with a Gradio UI for playing around with the model without having to manually format inputs. This notebook can be found [here](https://github.com/PygmalionAI/gradio-ui/blob/master/notebooks/GPU.ipynb).

### The manual way

The model can be used as a regular text generation model, but it will perform best if the input prompt adheres to the following format:

```
[CHARACTER]'s Persona: [A few sentences about the character you want the model to play]
<START>
[DIALOGUE HISTORY]
You: [Your input message here]
[CHARACTER]:
```

Where `[CHARACTER]` is, as you can probably guess, the name of the character you want the model to portray, `<START>` should be used verbatim as a delimiter token to separate persona and scenario data from the dialogue, and `[DIALOGUE HISTORY]` is chat history that gives the model conversational context to draw from. Ideally it will be pairs of messages like:

```
[CHARACTER]: [some dialogue here]
You: [your response to the dialogue above]
```

Apart from chat history, you can also add example conversations to `[DIALOGUE HISTORY]` to show how the character should speak, ideally at the beginning, so the model does not confuse conversation history with the character definition.

## Known issues

We haven't played around with the model enough to enumerate them. Feel free to give us some feedback!
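The prompt format the card describes can be assembled programmatically. This is a sketch under the card's stated conventions; `build_prompt` is a hypothetical helper name, not part of any released library.

```python
def build_prompt(character: str, persona: str, history: list[str], user_message: str) -> str:
    """Assemble a Pygmalion-style prompt: persona, <START> delimiter,
    dialogue history, the user's new message, then the character tag
    that cues the model to respond."""
    lines = [
        f"{character}'s Persona: {persona}",
        "<START>",  # verbatim delimiter between persona/scenario and dialogue
        *history,
        f"You: {user_message}",
        f"{character}:",
    ]
    return "\n".join(lines)

prompt = build_prompt(
    "Aria",                      # hypothetical character name
    "A cheerful assistant.",
    ["Aria: Hello!", "You: Hi."],
    "How are you?",
)
print(prompt)
```

The final `"{character}:"` line is left open on purpose: generation continues from there, so the model's completion becomes the character's next utterance.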
2,347
[ [ (embedding vector values omitted) ] ]
ckpt/ModelScope
2023-04-05T23:36:43.000Z
[ "open_clip", "license:cc-by-nc-4.0", "region:us" ]
null
ckpt
null
null
ckpt/ModelScope
3
9,822
open_clip
2023-04-05T23:27:22
--- license: cc-by-nc-4.0 --- This is https://huggingface.co/damo-vilab/modelscope-damo-text-to-video-synthesis, but with fp16 (half-precision) weights. Read the full documentation at https://huggingface.co/damo-vilab/modelscope-damo-text-to-video-synthesis/blob/main/README.md
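The only difference from the upstream checkpoint is the fp16 storage format. As a rough illustration (not the actual conversion script used for this repository), casting float32 weight arrays to float16 halves their storage; a minimal NumPy sketch with a hypothetical weight matrix:

```python
import numpy as np

# Hypothetical weight tensor; real checkpoints hold many such arrays.
weights_fp32 = np.random.rand(4, 4).astype(np.float32)

# Half-precision cast: each element shrinks from 4 bytes to 2 bytes.
weights_fp16 = weights_fp32.astype(np.float16)

assert weights_fp16.nbytes == weights_fp32.nbytes // 2
```

The trade-off is reduced numeric precision in exchange for roughly half the download size and memory footprint.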
263
[ [ -0.038177490234375, -0.036956787109375, 0.0136871337890625, 0.051971435546875, -0.01357269287109375, -0.01218414306640625, -0.0006704330444335938, -0.0085296630859375, 0.019012451171875, 0.0174102783203125, -0.05364990234375, -0.03326416015625, -0.05255126953125, -0.01233673095703125, -0.035491943359375, 0.072998046875, -0.0245208740234375, 0.0236358642578125, 0.01004791259765625, 0.01053619384765625, 0.00595855712890625, -0.016510009765625, -0.0304718017578125, -0.017913818359375, 0.01503753662109375, 0.05267333984375, 0.04058837890625, 0.040557861328125, 0.066650390625, 0.02325439453125, -0.0162811279296875, 0.0094757080078125, -0.024566650390625, -0.03271484375, -0.001285552978515625, -0.004638671875, -0.070556640625, -0.0084381103515625, 0.0986328125, 0.00362396240234375, -0.0206298828125, 0.0290374755859375, -0.0056610107421875, 0.05078125, -0.042999267578125, 0.005218505859375, -0.0208892822265625, 0.01276397705078125, -0.01666259765625, -0.0111846923828125, -0.01052093505859375, 0.0275726318359375, -0.013641357421875, -0.05206298828125, 0.0195159912109375, 0.026275634765625, 0.085205078125, 0.02734375, -0.0479736328125, 0.01377105712890625, -0.045318603515625, 0.046173095703125, -0.053375244140625, 0.0333251953125, 0.0002498626708984375, 0.036834716796875, 0.00656890869140625, -0.061553955078125, -0.0221710205078125, 0.0057220458984375, 0.033172607421875, 0.01436614990234375, 0.00021791458129882812, 0.0389404296875, 0.047088623046875, 0.051971435546875, -0.024383544921875, -0.020965576171875, -0.05059814453125, -0.024627685546875, 0.06304931640625, -0.009246826171875, 0.0249786376953125, -0.029022216796875, -0.039794921875, -0.038177490234375, -0.0255889892578125, -0.0033359527587890625, 0.036041259765625, -0.0096435546875, -0.0341796875, 0.0758056640625, 0.00434112548828125, 0.0560302734375, 0.02362060546875, -0.01320648193359375, 0.01300811767578125, -0.032623291015625, -0.03631591796875, 0.01277923583984375, 0.072021484375, 0.0528564453125, 
0.0170745849609375, 0.0193634033203125, -0.0227813720703125, -0.018829345703125, 0.049591064453125, -0.0972900390625, -0.031280517578125, -0.00897216796875, -0.047637939453125, -0.020782470703125, 0.00946044921875, -0.043548583984375, 0.033172607421875, -0.018341064453125, 0.054779052734375, -0.033172607421875, -0.01438140869140625, -0.0032978057861328125, -0.025726318359375, 0.02001953125, 0.0177154541015625, -0.06805419921875, 0.061676025390625, 0.01561737060546875, 0.054779052734375, -0.0078582763671875, -0.029266357421875, -0.037933349609375, 0.0095062255859375, -0.0251922607421875, 0.0233154296875, 0.01401519775390625, -0.031707763671875, -0.01184844970703125, 0.0066680908203125, 0.0302734375, -0.031890869140625, 0.06610107421875, -0.0083160400390625, -0.005584716796875, -0.00435638427734375, -0.04437255859375, -0.00027489662170410156, 0.0111083984375, -0.0308380126953125, 0.07635498046875, 0.01727294921875, -0.06390380859375, 0.0268402099609375, -0.02130126953125, -0.036651611328125, -0.021392822265625, -0.0005826950073242188, -0.051177978515625, 0.0135955810546875, 0.0008363723754882812, 0.0142822265625, -0.009185791015625, -0.003032684326171875, -0.0240936279296875, -0.014556884765625, -0.0062713623046875, -0.0029125213623046875, 0.028228759765625, 0.0323486328125, -0.0172882080078125, 0.028167724609375, -0.062164306640625, -0.0099334716796875, -0.01837158203125, -0.0087890625, 0.0156707763671875, -0.01311492919921875, 0.0369873046875, 0.0006365776062011719, 0.01561737060546875, -0.05029296875, -0.0017824172973632812, -0.031280517578125, 0.0215301513671875, 0.020294189453125, -0.00615692138671875, 0.051300048828125, -0.0060577392578125, 0.03485107421875, -0.0038166046142578125, 0.033416748046875, 0.035186767578125, -0.0555419921875, -0.0224609375, -0.01361846923828125, 0.0067291259765625, 0.0295257568359375, -0.0540771484375, 0.0089111328125, 0.016754150390625, -0.0577392578125, -0.0133819580078125, 0.02764892578125, 0.0228271484375, 0.03448486328125, 
0.0188446044921875, -0.038116455078125, -0.0511474609375, -0.07489013671875, 0.007518768310546875, -0.0029201507568359375, -0.025634765625, 0.0190277099609375, 0.044464111328125, 0.0170135498046875, 0.037689208984375, -0.05328369140625, -0.042388916015625, 0.00792694091796875, 0.004451751708984375, 0.01568603515625, 0.03851318359375, 0.0753173828125, -0.06463623046875, -0.03192138671875, -0.019256591796875, -0.0243988037109375, -0.01303863525390625, 0.0180816650390625, -0.01849365234375, -0.0138702392578125, 0.032257080078125, -0.05535888671875, 0.03271484375, 0.050079345703125, -0.015167236328125, 0.0357666015625, -0.03289794921875, 0.02484130859375, -0.08734130859375, 0.0150146484375, 0.0224609375, -0.03564453125, -0.02337646484375, 0.0033855438232421875, 0.0267791748046875, -0.00316619873046875, -0.054840087890625, 0.0296173095703125, -0.04278564453125, -0.01910400390625, -0.01983642578125, -0.0005922317504882812, -0.0179290771484375, 0.029632568359375, 0.0111083984375, 0.046295166015625, 0.04266357421875, -0.03741455078125, 0.0467529296875, 0.033050537109375, -0.0206451416015625, 0.039642333984375, -0.09014892578125, -0.03466796875, 0.01207733154296875, 0.0146331787109375, -0.061859130859375, -0.0252227783203125, 0.0200653076171875, -0.0577392578125, 0.026641845703125, -0.04583740234375, -0.0252685546875, -0.018463134765625, -0.03851318359375, 0.052215576171875, 0.0311737060546875, 0.007373809814453125, 0.02496337890625, 0.054473876953125, -0.0145111083984375, -0.0179290771484375, -0.06951904296875, 0.0107269287109375, -0.0151824951171875, -0.037628173828125, 0.0723876953125, -0.01067352294921875, -0.01055145263671875, -0.031951904296875, 0.004543304443359375, -0.027496337890625, -0.042755126953125, 0.03533935546875, 0.0244903564453125, 0.00003713369369506836, -0.029632568359375, 0.02337646484375, -0.0045623779296875, 0.0127716064453125, -0.022552490234375, 0.00885772705078125, -0.01377105712890625, 0.0101470947265625, -0.021453857421875, 0.00919342041015625, 
0.047088623046875, 0.018218994140625, 0.0400390625, 0.07843017578125, -0.05865478515625, 0.0005292892456054688, -0.02642822265625, -0.028076171875, -0.035003662109375, 0.006809234619140625, -0.033233642578125, -0.035552978515625, 0.02227783203125, 0.0302734375, -0.004596710205078125, 0.059539794921875, 0.06781005859375, -0.01418304443359375, 0.072509765625, 0.03515625, 0.0251007080078125, 0.05914306640625, -0.02728271484375, -0.00791168212890625, -0.046966552734375, -0.0185089111328125, -0.01348876953125, -0.0408935546875, -0.0423583984375, -0.035980224609375, 0.0457763671875, 0.015289306640625, -0.034423828125, 0.035858154296875, -0.052764892578125, -0.00782012939453125, 0.0606689453125, 0.0193939208984375, 0.00876617431640625, 0.0170135498046875, -0.0052032470703125, -0.03497314453125, -0.033905029296875, -0.0131683349609375, 0.06365966796875, 0.020538330078125, 0.0849609375, 0.0035839080810546875, 0.027618408203125, 0.0203094482421875, -0.00864410400390625, -0.0626220703125, 0.032135009765625, -0.02130126953125, -0.070068359375, -0.0153656005859375, -0.036895751953125, -0.0058746337890625, -0.0261383056640625, -0.042083740234375, -0.0157318115234375, -0.01238250732421875, 0.02825927734375, 0.00044465065002441406, 0.03558349609375, -0.041015625, 0.0693359375, 0.0028228759765625, -0.0189971923828125, -0.0006084442138671875, -0.057769775390625, 0.0167083740234375, 0.0316162109375, 0.0013666152954101562, 0.016815185546875, 0.005344390869140625, 0.064697265625, -0.0650634765625, 0.03515625, -0.0240631103515625, -0.004322052001953125, 0.05438232421875, 0.006557464599609375, 0.015594482421875, 0.028167724609375, 0.010833740234375, 0.0063934326171875, 0.0249176025390625, -0.02020263671875, -0.0006933212280273438, 0.04022216796875, -0.057464599609375, 0.007106781005859375, -0.0159149169921875, -0.061553955078125, 0.017578125, 0.020477294921875, 0.03387451171875, 0.040924072265625, -0.0256500244140625, 0.024078369140625, 0.04833984375, 0.017974853515625, 
0.036773681640625, 0.014129638671875, -0.027587890625, -0.06585693359375, 0.03350830078125, 0.00974273681640625, 0.0252685546875, 0.074462890625, 0.017364501953125, -0.005809783935546875, -0.040618896484375, -0.07415771484375, 0.006755828857421875, -0.0166473388671875, -0.0193023681640625, -0.024017333984375, -0.026336669921875, -0.057342529296875, -0.000263214111328125, -0.05975341796875, -0.06695556640625, -0.0274200439453125, -0.013916015625, 0.06329345703125, 0.02972412109375, -0.021240234375, 0.04241943359375, -0.06365966796875, 0.051177978515625, 0.02667236328125, 0.02606201171875, -0.0304107666015625, -0.068115234375, -0.01346588134765625, 0.0178070068359375, -0.049163818359375, -0.05548095703125, 0.03143310546875, 0.0194244384765625, -0.00024962425231933594, 0.034271240234375, -0.018890380859375, 0.0394287109375, -0.053558349609375, 0.045379638671875, 0.027191162109375, -0.0755615234375, 0.037445068359375, -0.03607177734375, 0.0180816650390625, 0.021484375, 0.0294189453125, -0.015380859375, 0.005970001220703125, -0.03900146484375, -0.06024169921875, 0.051177978515625, 0.044036865234375, -0.019561767578125, 0.046966552734375, 0.048675537109375, -0.00725555419921875, 0.021820068359375, -0.05438232421875, -0.00463104248046875, -0.03936767578125, -0.038238525390625, 0.0046539306640625, -0.0262603759765625, -0.006717681884765625, -0.026824951171875, 0.0711669921875, -0.0154876708984375, 0.00847625732421875, 0.042694091796875, 0.0004153251647949219, -0.033416748046875, -0.0049285888671875, 0.045562744140625, 0.03900146484375, -0.051849365234375, 0.00437164306640625, 0.0094451904296875, -0.049591064453125, -0.0019426345825195312, 0.01464080810546875, -0.0362548828125, 0.008392333984375, 0.00455474853515625, 0.057861328125, 0.0288848876953125, -0.032135009765625, 0.0487060546875, 0.0063018798828125, -0.054473876953125, -0.0271453857421875, -0.01708984375, 0.0171661376953125, 0.00782012939453125, 0.0253448486328125, -0.003383636474609375, 0.0275421142578125, 
-0.0220184326171875, 0.034332275390625, 0.022796630859375, -0.041900634765625, -0.032806396484375, 0.09063720703125, 0.016754150390625, -0.03594970703125, 0.02587890625, -0.024566650390625, -0.002223968505859375, 0.0232391357421875, 0.023040771484375, 0.040802001953125, -0.022918701171875, 0.0196685791015625, 0.03759765625, 0.026031494140625, -0.0185089111328125, 0.050628662109375, -0.027984619140625, -0.035430908203125, -0.044464111328125, -0.056915283203125, -0.0557861328125, -0.0389404296875, -0.06256103515625, 0.050872802734375, -0.05914306640625, -0.0169525146484375, 0.008209228515625, 0.00957489013671875, -0.01983642578125, 0.053436279296875, 0.01540374755859375, 0.08154296875, -0.0517578125, 0.0667724609375, 0.06329345703125, -0.033966064453125, -0.051666259765625, -0.052978515625, 0.00936126708984375, -0.042083740234375, 0.0177764892578125, -0.004917144775390625, -0.004650115966796875, -0.004734039306640625, -0.042724609375, -0.0263824462890625, 0.0706787109375, 0.01971435546875, -0.0611572265625, -0.04345703125, 0.00904083251953125, 0.036865234375, -0.016021728515625, 0.0162506103515625, 0.007678985595703125, 0.0322265625, 0.041259765625, -0.08251953125, -0.0269317626953125, -0.018768310546875, 0.0038661956787109375, 0.023193359375, -0.057159423828125, 0.0533447265625, -0.00017368793487548828, 0.02606201171875, 0.008758544921875, 0.062164306640625, 0.00788116455078125, 0.0303802490234375, 0.03143310546875, 0.054473876953125, -0.022308349609375, -0.004467010498046875, 0.051971435546875, -0.0239105224609375, 0.04473876953125, 0.0755615234375, -0.00160980224609375, 0.047943115234375, 0.056549072265625, -0.019866943359375, 0.01335906982421875, 0.050994873046875, -0.0008015632629394531, 0.0246429443359375, 0.0157012939453125, 0.002307891845703125, -0.0027141571044921875, 0.0015621185302734375, -0.0142669677734375, 0.039337158203125, 0.0230865478515625, -0.04345703125, 0.017578125, 0.0023860931396484375, 0.00395965576171875, 0.003078460693359375, 
-0.027191162109375, 0.032989501953125, 0.0289154052734375, -0.035552978515625, 0.0283355712890625, 0.0128326416015625, 0.0496826171875, -0.07421875, -0.002407073974609375, 0.0254058837890625, 0.03375244140625, -0.0222930908203125, -0.08416748046875, 0.040557861328125, -0.00513458251953125, -0.0251312255859375, -0.038848876953125, 0.058929443359375, -0.0540771484375, -0.0670166015625, 0.01354217529296875, 0.0015554428100585938, 0.01326751708984375, -0.041473388671875, -0.047119140625, 0.004425048828125, 0.003955841064453125, -0.025909423828125, -0.00867462158203125, 0.0130157470703125, 0.0206451416015625, 0.0294952392578125, 0.0118560791015625, 0.0309295654296875, 0.003955841064453125, 0.00901031494140625, 0.0211029052734375, -0.04302978515625, -0.0132904052734375, -0.03564453125, 0.02130126953125, -0.0158538818359375, -0.0298309326171875, 0.061676025390625, 0.0501708984375, 0.061065673828125, -0.0228118896484375, 0.012664794921875, -0.039886474609375, 0.01161956787109375, -0.04443359375, 0.0472412109375, -0.053131103515625, -0.05035400390625, -0.042205810546875, -0.08056640625, -0.018585205078125, 0.043121337890625, 0.021148681640625, 0.006351470947265625, 0.038421630859375, 0.066162109375, -0.01812744140625, -0.00904083251953125, 0.00934600830078125, 0.028411865234375, 0.01551055908203125, 0.022674560546875, -0.0114898681640625, -0.08447265625, 0.057220458984375, -0.036865234375, -0.0286865234375, -0.01377105712890625, -0.0758056640625, -0.037200927734375, -0.04913330078125, -0.0299224853515625, -0.02288818359375, -0.0097808837890625, 0.0435791015625, 0.07415771484375, -0.042388916015625, -0.0240478515625, -0.01192474365234375, -0.007320404052734375, -0.0108184814453125, -0.01947021484375, 0.0107269287109375, 0.03350830078125, -0.07916259765625, 0.01020050048828125, 0.030181884765625, 0.0154571533203125, -0.0159149169921875, -0.0113983154296875, -0.028961181640625, -0.0085906982421875, 0.0316162109375, 0.01389312744140625, -0.0231781005859375, -0.0124664306640625, 
-0.0144195556640625, 0.00891876220703125, 0.0021495819091796875, 0.051483154296875, -0.0241241455078125, 0.02618408203125, 0.069091796875, -0.0032176971435546875, 0.0831298828125, -0.0020198822021484375, 0.059051513671875, -0.06072998046875, 0.0159454345703125, -0.0028972625732421875, 0.03277587890625, 0.0273284912109375, -0.00637054443359375, 0.036651611328125, 0.0291748046875, -0.05364990234375, -0.034881591796875, 0.00649261474609375, -0.12030029296875, 0.0109405517578125, 0.06427001953125, 0.0217132568359375, -0.032867431640625, 0.06744384765625, -0.061370849609375, 0.0516357421875, -0.030731201171875, 0.040679931640625, 0.0384521484375, 0.01202392578125, -0.01690673828125, -0.03436279296875, 0.03729248046875, 0.024078369140625, -0.0377197265625, -0.030120849609375, 0.037445068359375, 0.039093017578125, 0.01049041748046875, 0.0295257568359375, -0.038726806640625, 0.025238037109375, 0.0173187255859375, 0.03411865234375, -0.00873565673828125, -0.0299224853515625, 0.0162353515625, -0.00572967529296875, -0.01245880126953125, -0.0280609130859375 ] ]
lllyasviel/sd-controlnet-depth
2023-04-24T22:30:25.000Z
[ "diffusers", "art", "controlnet", "stable-diffusion", "image-to-image", "arxiv:2302.05543", "license:openrail", "has_space", "diffusers:ControlNetModel", "region:us" ]
image-to-image
lllyasviel
null
null
lllyasviel/sd-controlnet-depth
28
9,788
diffusers
2023-02-24T06:59:59
--- license: openrail base_model: runwayml/stable-diffusion-v1-5 tags: - art - controlnet - stable-diffusion - image-to-image --- # Controlnet - *Depth Version* ControlNet is a neural network structure to control diffusion models by adding extra conditions. This checkpoint corresponds to the ControlNet conditioned on **Depth estimation**. It can be used in combination with [Stable Diffusion](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/text2img). ![img](./sd.png) ## Model Details - **Developed by:** Lvmin Zhang, Maneesh Agrawala - **Model type:** Diffusion-based text-to-image generation model - **Language(s):** English - **License:** [The CreativeML OpenRAIL M license](https://huggingface.co/spaces/CompVis/stable-diffusion-license) is an [Open RAIL M license](https://www.licenses.ai/blog/2022/8/18/naming-convention-of-responsible-ai-licenses), adapted from the work that [BigScience](https://bigscience.huggingface.co/) and [the RAIL Initiative](https://www.licenses.ai/) are jointly carrying out in the area of responsible AI licensing. See also [the article about the BLOOM Open RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) on which our license is based. - **Resources for more information:** [GitHub Repository](https://github.com/lllyasviel/ControlNet), [Paper](https://arxiv.org/abs/2302.05543). - **Cite as:** @misc{zhang2023adding, title={Adding Conditional Control to Text-to-Image Diffusion Models}, author={Lvmin Zhang and Maneesh Agrawala}, year={2023}, eprint={2302.05543}, archivePrefix={arXiv}, primaryClass={cs.CV} } ## Introduction ControlNet was proposed in [*Adding Conditional Control to Text-to-Image Diffusion Models*](https://arxiv.org/abs/2302.05543) by Lvmin Zhang and Maneesh Agrawala. The abstract reads as follows: *We present a neural network structure, ControlNet, to control pretrained large diffusion models to support additional input conditions. 
The ControlNet learns task-specific conditions in an end-to-end way, and the learning is robust even when the training dataset is small (< 50k). Moreover, training a ControlNet is as fast as fine-tuning a diffusion model, and the model can be trained on personal devices. Alternatively, if powerful computation clusters are available, the model can scale to large amounts (millions to billions) of data. We report that large diffusion models like Stable Diffusion can be augmented with ControlNets to enable conditional inputs like edge maps, segmentation maps, keypoints, etc. This may enrich the methods to control large diffusion models and further facilitate related applications.* ## Released Checkpoints The authors released 8 different checkpoints, each trained with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5) on a different type of conditioning: | Model Name | Control Image Overview| Control Image Example | Generated Image Example | |---|---|---|---| |[lllyasviel/sd-controlnet-canny](https://huggingface.co/lllyasviel/sd-controlnet-canny)<br/> *Trained with canny edge detection* | A monochrome image with white edges on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_bird_canny.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_bird_canny.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_canny_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_canny_1.png"/></a>| |[lllyasviel/sd-controlnet-depth](https://huggingface.co/lllyasviel/sd-controlnet-depth)<br/> *Trained with Midas depth estimation* |A grayscale image with black representing deep areas and white representing shallow 
areas.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_vermeer_depth.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_vermeer_depth.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_depth_2.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_depth_2.png"/></a>| |[lllyasviel/sd-controlnet-hed](https://huggingface.co/lllyasviel/sd-controlnet-hed)<br/> *Trained with HED edge detection (soft edge)* |A monochrome image with white soft edges on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_bird_hed.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_bird_hed.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_hed_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_hed_1.png"/></a> | |[lllyasviel/sd-controlnet-mlsd](https://huggingface.co/lllyasviel/sd-controlnet-mlsd)<br/> *Trained with M-LSD line detection* |A monochrome image composed only of white straight lines on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_room_mlsd.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_room_mlsd.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_mlsd_0.png"><img width="64" 
src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_mlsd_0.png"/></a>| |[lllyasviel/sd-controlnet-normal](https://huggingface.co/lllyasviel/sd-controlnet-normal)<br/> *Trained with normal map* |A [normal mapped](https://en.wikipedia.org/wiki/Normal_mapping) image.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_human_normal.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_human_normal.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_normal_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_normal_1.png"/></a>| |[lllyasviel/sd-controlnet-openpose](https://huggingface.co/lllyasviel/sd-controlnet-openpose)<br/> *Trained with OpenPose bone image* |An [OpenPose bone](https://github.com/CMU-Perceptual-Computing-Lab/openpose) image.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_human_openpose.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_human_openpose.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_openpose_0.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_openpose_0.png"/></a>| |[lllyasviel/sd-controlnet-scribble](https://huggingface.co/lllyasviel/sd-controlnet-scribble)<br/> *Trained with human scribbles* |A hand-drawn monochrome image with white outlines on a black background.|<a 
href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_vermeer_scribble.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_vermeer_scribble.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_scribble_0.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_scribble_0.png"/></a> | |[lllyasviel/sd-controlnet-seg](https://huggingface.co/lllyasviel/sd-controlnet-seg)<br/>*Trained with semantic segmentation* |An image following the [ADE20K](https://groups.csail.mit.edu/vision/datasets/ADE20K/) segmentation protocol.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_room_seg.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_room_seg.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_seg_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_seg_1.png"/></a> | ## Example It is recommended to use the checkpoint with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5) as the checkpoint has been trained on it. Experimentally, the checkpoint can be used with other diffusion models such as dreamboothed stable diffusion. 1. Let's install `diffusers` and related packages: ``` $ pip install diffusers transformers accelerate ``` 2. 
Run the code: ```py from transformers import pipeline from diffusers import StableDiffusionControlNetPipeline, ControlNetModel, UniPCMultistepScheduler from PIL import Image import numpy as np import torch from diffusers.utils import load_image depth_estimator = pipeline('depth-estimation') image = load_image("https://huggingface.co/lllyasviel/sd-controlnet-depth/resolve/main/images/stormtrooper.png") image = depth_estimator(image)['depth'] image = np.array(image) image = image[:, :, None] image = np.concatenate([image, image, image], axis=2) image = Image.fromarray(image) controlnet = ControlNetModel.from_pretrained( "lllyasviel/sd-controlnet-depth", torch_dtype=torch.float16 ) pipe = StableDiffusionControlNetPipeline.from_pretrained( "runwayml/stable-diffusion-v1-5", controlnet=controlnet, safety_checker=None, torch_dtype=torch.float16 ) pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config) # Remove if you do not have xformers installed # see https://huggingface.co/docs/diffusers/v0.13.0/en/optimization/xformers#installing-xformers # for installation instructions pipe.enable_xformers_memory_efficient_attention() pipe.enable_model_cpu_offload() image = pipe("Stormtrooper's lecture", image, num_inference_steps=20).images[0] image.save('./images/stormtrooper_depth_out.png') ``` ![stormtrooper](./images/stormtrooper.png) ![stormtrooper_depth](./images/stormtrooper_depth.png) ![stormtrooper_depth_out](./images/stormtrooper_depth_out.png) ### Training The depth model was trained on 3M depth-image/caption pairs. The depth images were generated with Midas. The model was trained for 500 GPU-hours on Nvidia A100 80G GPUs, using Stable Diffusion 1.5 as a base model. ### Blog post For more information, please also have a look at the [official ControlNet Blog Post](https://huggingface.co/blog/controlnet).
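The control-image preparation in the example above reduces to replicating the single-channel depth map across three channels, so it matches the RGB input the pipeline expects. A minimal NumPy sketch of just that step, using a hypothetical 2x2 depth map in place of a real estimator's output:

```python
import numpy as np

# Hypothetical 8-bit depth map (a real one comes from a depth estimator).
depth = np.array([[0, 128], [64, 255]], dtype=np.uint8)

# Add a trailing channel axis, then replicate it three times -> (H, W, 3).
control = np.concatenate([depth[:, :, None]] * 3, axis=2)

assert control.shape == (2, 2, 3)
# All three channels carry the same depth values.
assert (control[:, :, 0] == control[:, :, 2]).all()
```

The resulting array can then be wrapped with `Image.fromarray` and passed to the pipeline as in the snippet above.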
11,472
[ [ -0.0450439453125, -0.040130615234375, -0.00434112548828125, 0.03265380859375, -0.0220794677734375, -0.0223388671875, -0.004428863525390625, -0.048919677734375, 0.06341552734375, 0.01288604736328125, -0.041900634765625, -0.0345458984375, -0.05401611328125, -0.0032901763916015625, -0.0005807876586914062, 0.044586181640625, -0.0088653564453125, -0.0092926025390625, 0.0172882080078125, -0.00853729248046875, -0.00786590576171875, -0.00872802734375, -0.09588623046875, -0.038055419921875, 0.044403076171875, 0.00484466552734375, 0.05133056640625, 0.062103271484375, 0.04058837890625, 0.0308837890625, -0.025421142578125, 0.0199737548828125, -0.017333984375, -0.0135498046875, 0.009429931640625, -0.0082244873046875, -0.056793212890625, 0.006069183349609375, 0.044097900390625, 0.027069091796875, 0.01395416259765625, -0.016815185546875, 0.005687713623046875, 0.054656982421875, -0.039215087890625, -0.0031337738037109375, -0.0063323974609375, 0.03460693359375, -0.0134735107421875, -0.00797271728515625, -0.00589752197265625, -0.0212860107421875, 0.01078033447265625, -0.0631103515625, -0.0221405029296875, -0.00588226318359375, 0.1083984375, 0.00496673583984375, -0.022674560546875, -0.0027618408203125, -0.01154327392578125, 0.0430908203125, -0.061004638671875, 0.0096435546875, 0.01959228515625, 0.0160369873046875, -0.0162811279296875, -0.060333251953125, -0.040374755859375, -0.01210784912109375, -0.00952911376953125, 0.035797119140625, -0.034027099609375, 0.00476837158203125, 0.01483917236328125, 0.0180206298828125, -0.029449462890625, 0.0188446044921875, -0.028045654296875, -0.03173828125, 0.05267333984375, -0.002399444580078125, 0.051666259765625, -0.00647735595703125, -0.039093017578125, -0.0253448486328125, -0.0265045166015625, 0.03448486328125, 0.0242156982421875, 0.00304412841796875, -0.05523681640625, 0.034027099609375, -0.010528564453125, 0.0693359375, 0.0274810791015625, -0.01824951171875, 0.039306640625, -0.0185546875, -0.0195770263671875, -0.024383544921875, 
0.0863037109375, 0.047943115234375, 0.020965576171875, 0.0035495758056640625, -0.01265716552734375, -0.01113128662109375, 0.00263214111328125, -0.06884765625, -0.0115814208984375, 0.00745391845703125, -0.055328369140625, -0.0294189453125, -0.00653839111328125, -0.0535888671875, -0.022979736328125, -0.01476287841796875, 0.0254058837890625, -0.052825927734375, -0.04632568359375, 0.0223388671875, -0.0374755859375, 0.039825439453125, 0.0384521484375, -0.02508544921875, 0.0162811279296875, 0.01033782958984375, 0.08148193359375, -0.01611328125, -0.01971435546875, -0.01528167724609375, -0.0079498291015625, -0.026153564453125, 0.037017822265625, -0.00435638427734375, -0.007568359375, -0.01371002197265625, 0.0233612060546875, -0.002391815185546875, -0.0248260498046875, 0.0187835693359375, -0.01294708251953125, 0.00620269775390625, -0.0036296844482421875, -0.015899658203125, -0.0247802734375, 0.0235748291015625, -0.04510498046875, 0.04327392578125, 0.0162811279296875, -0.079345703125, 0.0251312255859375, -0.050933837890625, -0.0247344970703125, -0.01861572265625, 0.0029544830322265625, -0.04681396484375, -0.0433349609375, 0.0006995201110839844, 0.055816650390625, -0.00437164306640625, -0.0121612548828125, -0.0443115234375, -0.004245758056640625, 0.018524169921875, -0.0067138671875, 0.08941650390625, 0.0128631591796875, -0.045623779296875, 0.0159149169921875, -0.05712890625, 0.005401611328125, 0.0195159912109375, -0.0279693603515625, -0.011322021484375, -0.012908935546875, 0.006587982177734375, 0.05242919921875, 0.02362060546875, -0.05377197265625, 0.0070343017578125, -0.0278472900390625, 0.0309295654296875, 0.040191650390625, 0.00881195068359375, 0.0426025390625, -0.047149658203125, 0.032012939453125, 0.021728515625, 0.0305633544921875, 0.01203155517578125, -0.0305023193359375, -0.061767578125, -0.039154052734375, -0.003265380859375, 0.04571533203125, -0.060028076171875, 0.0645751953125, -0.0041961669921875, -0.043609619140625, -0.0247802734375, 0.003597259521484375, 
0.037139892578125, 0.0447998046875, 0.0255584716796875, -0.038848876953125, -0.0290985107421875, -0.07781982421875, 0.0108489990234375, 0.0158538818359375, -0.00047898292541503906, 0.023681640625, 0.042877197265625, -0.0040435791015625, 0.048065185546875, -0.01210784912109375, -0.035308837890625, -0.0017995834350585938, -0.0103607177734375, 0.01537322998046875, 0.0767822265625, 0.05755615234375, -0.0701904296875, -0.039306640625, 0.01085662841796875, -0.07373046875, 0.0019550323486328125, -0.01479339599609375, -0.0323486328125, 0.0070037841796875, 0.03936767578125, -0.037567138671875, 0.06683349609375, 0.032440185546875, -0.035614013671875, 0.031585693359375, -0.0234527587890625, 0.01392364501953125, -0.07830810546875, 0.0243988037109375, 0.033355712890625, -0.01488494873046875, -0.042022705078125, 0.00714111328125, 0.00911712646484375, 0.00995635986328125, -0.048553466796875, 0.04913330078125, -0.04876708984375, -0.005863189697265625, -0.0138092041015625, 0.001125335693359375, -0.0014429092407226562, 0.050445556640625, 0.01220703125, 0.035308837890625, 0.06353759765625, -0.030059814453125, 0.0298614501953125, 0.0298614501953125, -0.00603485107421875, 0.06646728515625, -0.0704345703125, 0.002079010009765625, 0.00005334615707397461, 0.03546142578125, -0.059051513671875, -0.0195159912109375, 0.040557861328125, -0.041717529296875, 0.050933837890625, -0.007419586181640625, -0.0130615234375, -0.0400390625, -0.04364013671875, 0.00496673583984375, 0.031829833984375, -0.03509521484375, 0.03289794921875, 0.0170745849609375, 0.0182342529296875, -0.0511474609375, -0.081298828125, 0.002960205078125, -0.01025390625, -0.06329345703125, 0.0250701904296875, -0.0005335807800292969, 0.002910614013671875, 0.01128387451171875, 0.006603240966796875, -0.015228271484375, 0.008270263671875, 0.045166015625, 0.008056640625, -0.0251312255859375, -0.01247406005859375, 0.0014581680297851562, -0.0048675537109375, -0.01363372802734375, -0.03558349609375, 0.033416748046875, -0.0084075927734375, 
-0.0166015625, -0.07568359375, 0.019287109375, 0.040985107421875, -0.015655517578125, 0.052825927734375, 0.053558349609375, -0.03131103515625, -0.00234222412109375, -0.0262603759765625, -0.00844573974609375, -0.0374755859375, -0.011993408203125, -0.0190277099609375, -0.0643310546875, 0.053436279296875, 0.0059051513671875, -0.01030731201171875, 0.041229248046875, 0.016265869140625, -0.01568603515625, 0.06536865234375, 0.0297393798828125, -0.0116424560546875, 0.05084228515625, -0.060791015625, -0.0129547119140625, -0.07525634765625, -0.024627685546875, -0.01959228515625, -0.068603515625, -0.0279083251953125, -0.0209197998046875, 0.0260772705078125, 0.0216217041015625, -0.049774169921875, 0.034027099609375, -0.050018310546875, 0.00974273681640625, 0.03125, 0.04962158203125, -0.018218994140625, -0.00727081298828125, -0.0264129638671875, 0.01372528076171875, -0.047210693359375, -0.01525115966796875, 0.052337646484375, 0.03680419921875, 0.0386962890625, -0.00335693359375, 0.04254150390625, 0.007099151611328125, 0.007144927978515625, -0.051788330078125, 0.039215087890625, -0.005451202392578125, -0.043304443359375, -0.033721923828125, -0.027130126953125, -0.0906982421875, 0.01611328125, -0.04119873046875, -0.052581787109375, 0.033599853515625, 0.0093231201171875, -0.0133819580078125, 0.050323486328125, -0.0516357421875, 0.048583984375, 0.0012769699096679688, -0.036346435546875, 0.012786865234375, -0.071044921875, 0.01288604736328125, 0.02130126953125, -0.0205535888671875, 0.005558013916015625, -0.013519287109375, 0.0634765625, -0.06304931640625, 0.07080078125, -0.03826904296875, 0.007335662841796875, 0.0273895263671875, 0.007720947265625, 0.0283660888671875, -0.0183563232421875, -0.01270294189453125, 0.00385284423828125, -0.007747650146484375, -0.043426513671875, -0.0292205810546875, 0.042633056640625, -0.06622314453125, -0.012847900390625, -0.0174102783203125, -0.0227508544921875, 0.01080322265625, 0.02740478515625, 0.039886474609375, 0.036041259765625, 
0.0195159912109375, 0.0242156982421875, 0.04058837890625, -0.03277587890625, 0.04241943359375, 0.0011854171752929688, 0.0059051513671875, -0.0400390625, 0.055084228515625, 0.0149383544921875, 0.01313018798828125, 0.0260772705078125, 0.01824951171875, -0.0148468017578125, -0.03851318359375, -0.02508544921875, 0.032135009765625, -0.04168701171875, -0.04034423828125, -0.046875, -0.0386962890625, -0.0261077880859375, -0.036590576171875, -0.019256591796875, -0.0186004638671875, -0.050506591796875, 0.00724029541015625, 0.048431396484375, 0.042724609375, -0.0304107666015625, 0.04931640625, -0.01439666748046875, 0.0231170654296875, 0.0257568359375, 0.024383544921875, 0.005706787109375, -0.0281829833984375, 0.017486572265625, 0.0195465087890625, -0.021514892578125, -0.0657958984375, 0.033416748046875, 0.0095367431640625, 0.033599853515625, 0.03558349609375, -0.0226898193359375, 0.04302978515625, -0.00737762451171875, 0.033538818359375, 0.0540771484375, -0.051910400390625, 0.041656494140625, -0.034271240234375, 0.0283966064453125, 0.0313720703125, 0.040924072265625, -0.0389404296875, -0.018707275390625, -0.042236328125, -0.043060302734375, 0.0450439453125, -0.00013768672943115234, -0.01532745361328125, 0.0213165283203125, 0.050933837890625, -0.026947021484375, -0.0015459060668945312, -0.059722900390625, -0.033416748046875, -0.026092529296875, -0.00824737548828125, 0.004364013671875, 0.0111083984375, -0.00254058837890625, -0.04217529296875, 0.05029296875, -0.014984130859375, 0.04205322265625, 0.0400390625, 0.0115814208984375, -0.0162506103515625, -0.03082275390625, 0.057830810546875, 0.033111572265625, 0.00331878662109375, -0.00835418701171875, -0.004886627197265625, -0.040802001953125, 0.01788330078125, -0.0086822509765625, -0.027191162109375, -0.01641845703125, 0.028961181640625, 0.06585693359375, -0.005939483642578125, 0.0016889572143554688, 0.0738525390625, -0.00152587890625, -0.05889892578125, -0.00955963134765625, 0.0032596588134765625, 0.01502227783203125, 
0.035675048828125, 0.0106048583984375, 0.0256500244140625, 0.006465911865234375, -0.002819061279296875, 0.025604248046875, 0.0360107421875, -0.039093017578125, -0.01320648193359375, 0.03741455078125, -0.004261016845703125, -0.0133514404296875, 0.0206146240234375, -0.033966064453125, -0.05645751953125, 0.07391357421875, 0.038787841796875, 0.0518798828125, -0.00862884521484375, 0.0284881591796875, 0.05120849609375, 0.026702880859375, 0.01629638671875, 0.03033447265625, 0.007198333740234375, -0.05096435546875, -0.027313232421875, -0.0295257568359375, 0.005001068115234375, 0.0254669189453125, -0.034576416015625, 0.0225830078125, -0.07330322265625, -0.018951416015625, 0.0014276504516601562, 0.0140380859375, -0.048248291015625, 0.03314208984375, 0.0007653236389160156, 0.09649658203125, -0.05853271484375, 0.0543212890625, 0.0531005859375, -0.037933349609375, -0.0909423828125, -0.01140594482421875, 0.00891876220703125, -0.061981201171875, 0.054962158203125, 0.0016355514526367188, -0.01166534423828125, -0.006259918212890625, -0.0711669921875, -0.0440673828125, 0.09967041015625, 0.004184722900390625, -0.017333984375, 0.011444091796875, -0.03643798828125, 0.033416748046875, -0.034576416015625, 0.0260009765625, 0.0171966552734375, 0.0452880859375, 0.0170745849609375, -0.058074951171875, 0.023162841796875, -0.043914794921875, 0.0139007568359375, 0.01525115966796875, -0.075927734375, 0.06854248046875, 0.01837158203125, -0.007274627685546875, 0.0014553070068359375, 0.04962158203125, 0.01102447509765625, 0.01137542724609375, 0.057037353515625, 0.054779052734375, 0.023773193359375, -0.00699615478515625, 0.06402587890625, -0.005199432373046875, 0.006366729736328125, 0.048370361328125, 0.0216217041015625, 0.043609619140625, 0.0276641845703125, 0.005748748779296875, 0.02642822265625, 0.0599365234375, 0.0022335052490234375, 0.022491455078125, 0.0433349609375, 0.0020847320556640625, -0.0040740966796875, 0.0022068023681640625, -0.0262451171875, 0.029327392578125, 0.02569580078125, 
-0.0172119140625, -0.01250457763671875, 0.0284423828125, 0.0109405517578125, -0.0002684593200683594, -0.0285186767578125, 0.0516357421875, -0.00933837890625, -0.0305633544921875, 0.0635986328125, -0.00988006591796875, 0.09478759765625, -0.04608154296875, 0.0011034011840820312, -0.0258636474609375, -0.004230499267578125, -0.0310516357421875, -0.07373046875, 0.01464080810546875, -0.020172119140625, 0.0255889892578125, -0.0168304443359375, 0.0712890625, -0.0190582275390625, -0.00927734375, 0.045654296875, 0.0146331787109375, 0.0263519287109375, 0.017608642578125, -0.08856201171875, 0.01529693603515625, 0.00482940673828125, -0.03521728515625, 0.0083465576171875, 0.0391845703125, 0.007747650146484375, 0.052490234375, 0.031524658203125, 0.048370361328125, 0.0200347900390625, -0.02886962890625, 0.0758056640625, -0.02093505859375, -0.0260467529296875, -0.03460693359375, 0.047607421875, -0.043243408203125, -0.0391845703125, 0.0377197265625, 0.023345947265625, 0.059112548828125, -0.00024020671844482422, 0.04669189453125, -0.0260772705078125, 0.0208892822265625, -0.036468505859375, 0.080810546875, -0.0748291015625, -0.033111572265625, -0.016510009765625, -0.036834716796875, -0.03863525390625, 0.07440185546875, -0.01374053955078125, 0.01186370849609375, 0.03326416015625, 0.07208251953125, -0.0244598388671875, -0.033416748046875, 0.006633758544921875, 0.005809783935546875, 0.01110076904296875, 0.0606689453125, 0.048614501953125, -0.046234130859375, 0.0108489990234375, -0.042694091796875, -0.03753662109375, -0.0218353271484375, -0.0665283203125, -0.06622314453125, -0.05596923828125, -0.0364990234375, -0.041717529296875, -0.01629638671875, 0.061767578125, 0.09307861328125, -0.04705810546875, -0.024017333984375, -0.01134490966796875, 0.0066680908203125, -0.01361846923828125, -0.01361083984375, 0.0286407470703125, 0.00966644287109375, -0.031646728515625, 0.002227783203125, 0.0272979736328125, 0.045166015625, -0.0011606216430664062, -0.032012939453125, -0.04248046875, 
-0.0231475830078125, 0.0216064453125, 0.0499267578125, -0.034027099609375, -0.0209503173828125, -0.00360870361328125, -0.025787353515625, 0.007701873779296875, 0.0295867919921875, -0.014495849609375, 0.022125244140625, 0.047393798828125, 0.04693603515625, 0.04998779296875, -0.0159759521484375, 0.0037708282470703125, -0.039306640625, 0.039093017578125, 0.002994537353515625, 0.0263671875, 0.01715087890625, -0.02447509765625, 0.040496826171875, 0.02337646484375, -0.041961669921875, -0.0311126708984375, -0.002056121826171875, -0.10955810546875, -0.0200653076171875, 0.0660400390625, -0.031280517578125, -0.051910400390625, 0.0189971923828125, -0.0369873046875, 0.031829833984375, -0.020660400390625, 0.0047149658203125, 0.01898193359375, -0.0017881393432617188, -0.0195465087890625, -0.0304718017578125, 0.045135498046875, 0.0171966552734375, -0.059814453125, -0.054779052734375, 0.039794921875, 0.01386260986328125, 0.036285400390625, 0.055938720703125, -0.00888824462890625, 0.020294189453125, -0.006183624267578125, 0.0265045166015625, -0.01529693603515625, -0.01318359375, -0.0426025390625, -0.004306793212890625, -0.016204833984375, -0.03485107421875 ] ]
EleutherAI/pythia-6.9b-deduped
2023-06-08T13:05:19.000Z
[ "transformers", "pytorch", "gpt_neox", "text-generation", "causal-lm", "pythia", "en", "dataset:EleutherAI/the_pile_deduplicated", "arxiv:2304.01373", "arxiv:2101.00027", "arxiv:2201.07311", "license:apache-2.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
EleutherAI
null
null
EleutherAI/pythia-6.9b-deduped
6
9,778
transformers
2023-02-25T17:56:57
--- language: - en tags: - pytorch - causal-lm - pythia license: apache-2.0 datasets: - EleutherAI/the_pile_deduplicated --- The *Pythia Scaling Suite* is a collection of models developed to facilitate interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf). It contains two sets of eight models of sizes 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two models: one trained on the Pile, and one trained on the Pile after the dataset has been globally deduplicated. All 8 model sizes are trained on the exact same data, in the exact same order. We also provide 154 intermediate checkpoints per model, hosted on Hugging Face as branches. The Pythia model suite was designed to promote scientific research on large language models, especially interpretability research. Despite not centering downstream performance as a design goal, we find the models <a href="#evaluations">match or exceed</a> the performance of similar and same-sized models, such as those in the OPT and GPT-Neo suites. <details> <summary style="font-weight:600">Details on previous early release and naming convention.</summary> Previously, we released an early version of the Pythia suite to the public. However, we decided to retrain the model suite to address a few hyperparameter discrepancies. This model card <a href="#changelog">lists the changes</a>; see appendix B in the Pythia paper for further discussion. We found no difference in benchmark performance between the two Pythia versions. The old models are [still available](https://huggingface.co/models?other=pythia_v0), but we suggest the retrained suite if you are just starting to use Pythia.<br> **This is the current release.** Please note that all models in the *Pythia* suite were renamed in January 2023. For clarity, a <a href="#naming-convention-and-parameter-count">table comparing the old and new names</a> is provided in this model card, together with exact parameter counts. 
</details> <br> # Pythia-6.9B-deduped ## Model Details - Developed by: [EleutherAI](http://eleuther.ai) - Model type: Transformer-based Language Model - Language: English - Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia) for training procedure, config files, and details on how to use. [See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation details. - Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox) - License: Apache 2.0 - Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`. Please read the existing *Pythia* documentation before asking about it in the EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai). <figure> | Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models | | -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: | | 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — | | 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M | | 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M | | 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — | | 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B | | 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B | | 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B | | 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — | <figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and non-deduped models of a given size have the same hyperparameters. 
“Equivalent” models have <b>exactly</b> the same architecture, and the same number of non-embedding parameters.</figcaption> </figure> ## Uses and Limitations ### Intended Use The primary intended use of Pythia is research on the behavior, functionality, and limitations of large language models. This suite is intended to provide a controlled setting for performing scientific experiments. We also provide 154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints `step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to `step143000`. These checkpoints are hosted on Hugging Face as branches. Note that branch `143000` corresponds exactly to the model checkpoint on the `main` branch of each model. You may also further fine-tune and adapt Pythia-6.9B-deduped for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-6.9B-deduped as a basis for your fine-tuned model, please conduct your own risk and bias assessment. ### Out-of-scope use The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions. For example, the model may generate harmful or offensive text. Please evaluate the risks associated with your particular use case. Pythia models are English-language only, and are not suitable for translation or generating text in other languages. Pythia-6.9B-deduped has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose, or commercial chatbots. This means Pythia-6.9B-deduped will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “follow” human instructions. 
### Limitations and biases The core functionality of a large language model is to take a string of text and predict the next token. The next token predicted by the model need not produce the most “accurate” text. Never rely on Pythia-6.9B-deduped to produce factually accurate output. This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive. See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. Pythia-6.9B-deduped may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive. If you plan on using text generated through, for example, the Hosted Inference API, we recommend having a human curate the outputs of this language model before presenting them to other people. Please inform your audience that the text was generated by Pythia-6.9B-deduped. ### Quickstart Pythia models can be loaded and used via the following code, demonstrated here for the third `pythia-70m-deduped` checkpoint: ```python from transformers import GPTNeoXForCausalLM, AutoTokenizer model = GPTNeoXForCausalLM.from_pretrained( "EleutherAI/pythia-70m-deduped", revision="step3000", cache_dir="./pythia-70m-deduped/step3000", ) tokenizer = AutoTokenizer.from_pretrained( "EleutherAI/pythia-70m-deduped", revision="step3000", cache_dir="./pythia-70m-deduped/step3000", ) inputs = tokenizer("Hello, I am", return_tensors="pt") tokens = model.generate(**inputs) tokenizer.decode(tokens[0]) ``` Revision/branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model.<br> For more information on how to use all Pythia models, see [documentation on GitHub](https://github.com/EleutherAI/pythia). 
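The checkpoint branches follow a simple naming scheme (`step{N}`), so the full list of 154 revisions can be generated locally rather than looked up on the Hub. A minimal sketch based on the schedule described under Intended Use (the function name is just for illustration):

```python
# Enumerate the 154 checkpoint branch names: initial step0, ten
# log-spaced checkpoints step1..step512, then step1000..step143000
# saved every 1000 steps.
def pythia_revisions():
    revisions = ["step0"]
    revisions += [f"step{2 ** i}" for i in range(10)]              # step1 .. step512
    revisions += [f"step{s}" for s in range(1000, 144000, 1000)]   # step1000 .. step143000
    return revisions

revs = pythia_revisions()
print(len(revs))   # 154
print(revs[-1])    # step143000 -- identical to the `main` branch
```

Any of these names can be passed as the `revision` argument in the Quickstart snippet above.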
## Training ### Training data Pythia-6.9B-deduped was trained on the Pile **after the dataset has been globally deduplicated**.<br> [The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in English. It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g. GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications. Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/). ### Training procedure All models were trained on the exact same data, in the exact same order. Each model saw 299,892,736,000 tokens during training, and 143 checkpoints for each model are saved every 2,097,152,000 tokens, spaced evenly throughout training, from `step1000` to `step143000` (which is the same as `main`). In addition, we also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`. This corresponds to training for just under 1 epoch on the Pile for non-deduplicated models, and about 1.5 epochs on the deduplicated Pile. All *Pythia* models trained for 143000 steps at a batch size of 2M (2,097,152 tokens).<br> See [GitHub](https://github.com/EleutherAI/pythia) for more details on training procedure, including [how to reproduce it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br> Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b). 
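The token counts in this section are internally consistent, which is easy to verify with a quick back-of-the-envelope calculation (a sketch using only the numbers stated above):

```python
# Check the training-token arithmetic from this section.
batch_size_tokens = 2_097_152   # 2M tokens per step
total_steps = 143_000

total_tokens = batch_size_tokens * total_steps
tokens_between_checkpoints = 1_000 * batch_size_tokens   # checkpoints saved every 1000 steps

print(total_tokens)                # 299892736000 tokens seen during training
print(tokens_between_checkpoints)  # 2097152000 tokens between saved checkpoints
```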
## Evaluations All 16 *Pythia* models were evaluated using the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access the results by model and step at `results/json/*` in the [GitHub repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br> Expand the sections below to see plots of evaluation results for all Pythia and Pythia-deduped models compared with OPT and BLOOM. <details> <summary>LAMBADA – OpenAI</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/> </details> <details> <summary>Physical Interaction: Question Answering (PIQA)</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/> </details> <details> <summary>WinoGrande</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/> </details> <details> <summary>AI2 Reasoning Challenge—Easy Set</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/> </details> <details> <summary>SciQ</summary> <img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/> </details> ## Changelog This section compares differences between previously released [Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current models. See Appendix B of the Pythia paper for further discussion of these changes and the motivation behind them. We found that retraining Pythia had no impact on benchmark performance. - All model sizes are now trained with uniform batch size of 2M tokens. Previously, the models of size 160M, 410M, and 1.4B parameters were trained with batch sizes of 4M tokens. - We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,128,256,512} in addition to every 1000 training steps. - Flash Attention was used in the new retrained suite. 
- We remedied a minor inconsistency that existed in the original suite: all models of size 2.8B parameters or smaller had a learning rate (LR) schedule which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and 12B models all used an LR schedule which decayed to a minimum LR of 0. In the redone training runs, we rectified this inconsistency: all models were trained with LR decaying to a minimum of 0.1× their maximum LR. ### Naming convention and parameter count *Pythia* models were renamed in January 2023. It is possible that the old naming convention still persists in some documentation by accident. The current naming convention (70M, 160M, etc.) is based on total parameter count. <figure style="width:32em"> | current Pythia suffix | old suffix | total params | non-embedding params | | --------------------: | ---------: | -------------: | -------------------: | | 70M | 19M | 70,426,624 | 18,915,328 | | 160M | 125M | 162,322,944 | 85,056,000 | | 410M | 350M | 405,334,016 | 302,311,424 | | 1B | 800M | 1,011,781,632 | 805,736,448 | | 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 | | 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 | | 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 | | 12B | 13B | 11,846,072,320 | 11,327,027,200 | </figure>
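When reading older documentation or scripts that still use the pre-2023 names, the renaming table above can be captured as a programmatic lookup; a trivial sketch reproducing the suffix mapping:

```python
# Old (pre-January-2023) -> current Pythia model suffixes, per the table above.
OLD_TO_NEW = {
    "19M": "70M",
    "125M": "160M",
    "350M": "410M",
    "800M": "1B",
    "1.3B": "1.4B",
    "2.7B": "2.8B",
    "6.7B": "6.9B",
    "13B": "12B",
}
print(OLD_TO_NEW["6.7B"])  # 6.9B
```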
13,663
[ [ -0.024505615234375, -0.05950927734375, 0.02496337890625, 0.004302978515625, -0.0177459716796875, -0.0149688720703125, -0.016937255859375, -0.0338134765625, 0.0141448974609375, 0.012847900390625, -0.027191162109375, -0.0218505859375, -0.03240966796875, -0.003753662109375, -0.034942626953125, 0.0845947265625, -0.00777435302734375, -0.01171875, 0.00931549072265625, -0.005359649658203125, -0.00560760498046875, -0.0413818359375, -0.033721923828125, -0.0297088623046875, 0.0479736328125, 0.01317596435546875, 0.06597900390625, 0.04302978515625, 0.012115478515625, 0.0219879150390625, -0.028350830078125, -0.006023406982421875, -0.011444091796875, -0.00748443603515625, -0.0018434524536132812, -0.01904296875, -0.053802490234375, 0.002559661865234375, 0.051025390625, 0.049407958984375, -0.01354217529296875, 0.0184173583984375, -0.0016794204711914062, 0.0270538330078125, -0.03814697265625, 0.0026397705078125, -0.0250091552734375, -0.01320648193359375, -0.005096435546875, 0.01166534423828125, -0.029327392578125, -0.0257415771484375, 0.03240966796875, -0.048553466796875, 0.0200958251953125, 0.005908966064453125, 0.0894775390625, -0.0083160400390625, -0.031463623046875, -0.004032135009765625, -0.053070068359375, 0.05047607421875, -0.054595947265625, 0.0254669189453125, 0.0208282470703125, 0.012939453125, -0.0034503936767578125, -0.0673828125, -0.04168701171875, -0.0167999267578125, -0.0101470947265625, -0.0025806427001953125, -0.046112060546875, 0.0020771026611328125, 0.03948974609375, 0.04876708984375, -0.0611572265625, -0.0019626617431640625, -0.0281829833984375, -0.0260772705078125, 0.0273895263671875, 0.00400543212890625, 0.033905029296875, -0.0240478515625, -0.00005543231964111328, -0.0290985107421875, -0.050506591796875, -0.01727294921875, 0.0413818359375, 0.005657196044921875, -0.0275421142578125, 0.037109375, -0.0286407470703125, 0.04266357421875, -0.005313873291015625, 0.0186767578125, 0.0325927734375, -0.01520538330078125, -0.0379638671875, -0.00725555419921875, 
0.07135009765625, 0.0096588134765625, 0.0160675048828125, -0.0011396408081054688, -0.00374603271484375, 0.00510406494140625, 0.003780364990234375, -0.08453369140625, -0.06005859375, 0.0174102783203125, -0.0295257568359375, -0.03271484375, -0.0140380859375, -0.06982421875, -0.01544952392578125, -0.0142364501953125, 0.04266357421875, -0.038543701171875, -0.0540771484375, -0.01045989990234375, 0.00017189979553222656, 0.0157470703125, 0.0277862548828125, -0.0709228515625, 0.0300750732421875, 0.032196044921875, 0.07537841796875, 0.017730712890625, -0.0413818359375, -0.015380859375, -0.0200653076171875, -0.0087127685546875, 0.02764892578125, -0.010284423828125, -0.01499176025390625, -0.00873565673828125, 0.01219940185546875, -0.0093841552734375, -0.0261383056640625, 0.031097412109375, -0.029876708984375, 0.0206146240234375, -0.0206451416015625, -0.033050537109375, -0.028900146484375, 0.00762176513671875, -0.04718017578125, 0.06463623046875, 0.01806640625, -0.0723876953125, 0.0172576904296875, -0.01800537109375, -0.004669189453125, -0.003498077392578125, 0.0136260986328125, -0.051513671875, 0.0023345947265625, 0.0279083251953125, 0.003742218017578125, -0.0294036865234375, 0.0154876708984375, -0.0190887451171875, -0.03240966796875, 0.01360321044921875, -0.040771484375, 0.07000732421875, 0.01485443115234375, -0.049530029296875, 0.0210418701171875, -0.044525146484375, 0.0158843994140625, 0.0179901123046875, -0.026397705078125, 0.0034923553466796875, -0.013214111328125, 0.02764892578125, 0.0161590576171875, 0.012481689453125, -0.027374267578125, 0.0229034423828125, -0.03814697265625, 0.056243896484375, 0.056488037109375, -0.005939483642578125, 0.034515380859375, -0.031585693359375, 0.033905029296875, 0.00319671630859375, 0.01520538330078125, -0.004024505615234375, -0.046966552734375, -0.07421875, -0.0233306884765625, 0.0282440185546875, 0.0218048095703125, -0.035491943359375, 0.0340576171875, -0.018768310546875, -0.06549072265625, -0.01256561279296875, -0.006969451904296875, 
0.0313720703125, 0.0237579345703125, 0.03240966796875, -0.01204681396484375, -0.0394287109375, -0.06640625, -0.0161590576171875, -0.031951904296875, 0.01012420654296875, 0.01308441162109375, 0.07098388671875, -0.00917816162109375, 0.044097900390625, -0.0269927978515625, 0.0186309814453125, -0.0279998779296875, 0.01267242431640625, 0.0343017578125, 0.045623779296875, 0.0296783447265625, -0.042144775390625, -0.0287628173828125, 0.0008120536804199219, -0.043212890625, 0.006298065185546875, 0.00235748291015625, -0.023101806640625, 0.02337646484375, 0.00569915771484375, -0.0751953125, 0.035675048828125, 0.046875, -0.041656494140625, 0.060943603515625, -0.024505615234375, -0.0005669593811035156, -0.08026123046875, 0.01995849609375, 0.009002685546875, -0.0173187255859375, -0.045379638671875, 0.00536346435546875, 0.014617919921875, -0.01552581787109375, -0.029876708984375, 0.045806884765625, -0.040496826171875, -0.01222991943359375, -0.0158538818359375, 0.005489349365234375, -0.0027446746826171875, 0.047454833984375, 0.01183319091796875, 0.043426513671875, 0.060150146484375, -0.05828857421875, 0.0325927734375, 0.0170745849609375, -0.0205230712890625, 0.02850341796875, -0.0662841796875, 0.0125732421875, 0.006336212158203125, 0.031768798828125, -0.04437255859375, -0.025665283203125, 0.0399169921875, -0.04376220703125, 0.0107421875, -0.031585693359375, -0.040313720703125, -0.03277587890625, -0.01287078857421875, 0.045257568359375, 0.058624267578125, -0.04620361328125, 0.05181884765625, 0.003292083740234375, 0.009521484375, -0.0289306640625, -0.040924072265625, -0.018890380859375, -0.0396728515625, -0.049896240234375, 0.028717041015625, 0.01157379150390625, -0.013427734375, 0.00109100341796875, -0.0006694793701171875, 0.00772857666015625, -0.0037174224853515625, 0.023895263671875, 0.02587890625, -0.0037364959716796875, 0.0016851425170898438, -0.01094818115234375, -0.0095062255859375, 0.000011861324310302734, -0.03759765625, 0.07244873046875, -0.021636962890625, 
-0.014312744140625, -0.060302734375, -0.0006451606750488281, 0.06671142578125, -0.03131103515625, 0.0662841796875, 0.04632568359375, -0.052825927734375, 0.01097869873046875, -0.0283966064453125, -0.022064208984375, -0.033294677734375, 0.04949951171875, -0.0199737548828125, -0.0271148681640625, 0.04827880859375, 0.0232696533203125, 0.021331787109375, 0.043975830078125, 0.054718017578125, 0.0174102783203125, 0.09027099609375, 0.03436279296875, -0.0127410888671875, 0.04791259765625, -0.038604736328125, 0.019622802734375, -0.08270263671875, -0.0131683349609375, -0.039764404296875, -0.01934814453125, -0.07080078125, -0.022430419921875, 0.0235443115234375, 0.0173187255859375, -0.056488037109375, 0.04266357421875, -0.042022705078125, 0.0048065185546875, 0.049346923828125, 0.0186309814453125, 0.01496124267578125, 0.01529693603515625, 0.0048065185546875, -0.00449371337890625, -0.049652099609375, -0.024658203125, 0.09259033203125, 0.036895751953125, 0.045867919921875, 0.022247314453125, 0.053802490234375, -0.01092529296875, 0.0180206298828125, -0.05291748046875, 0.03277587890625, 0.025146484375, -0.054229736328125, -0.01548004150390625, -0.0589599609375, -0.070556640625, 0.037261962890625, 0.005889892578125, -0.08453369140625, 0.01708984375, 0.0169525146484375, -0.0287628173828125, 0.036407470703125, -0.04779052734375, 0.0758056640625, -0.0175323486328125, -0.035247802734375, -0.026947021484375, -0.02301025390625, 0.018035888671875, 0.028106689453125, 0.00954437255859375, 0.00726318359375, 0.023956298828125, 0.07427978515625, -0.050994873046875, 0.050201416015625, -0.009490966796875, 0.01104736328125, 0.024749755859375, 0.020660400390625, 0.04913330078125, 0.011077880859375, 0.0106201171875, -0.0031299591064453125, 0.011962890625, -0.041046142578125, -0.029083251953125, 0.06927490234375, -0.08392333984375, -0.0287628173828125, -0.060516357421875, -0.045501708984375, 0.006725311279296875, 0.0160675048828125, 0.0309600830078125, 0.050567626953125, -0.002620697021484375, 
0.001262664794921875, 0.045806884765625, -0.03875732421875, 0.0281219482421875, 0.01605224609375, -0.0364990234375, -0.03973388671875, 0.074462890625, 0.0010242462158203125, 0.026092529296875, 0.0014314651489257812, 0.0183258056640625, -0.0310821533203125, -0.03326416015625, -0.045318603515625, 0.040985107421875, -0.05511474609375, -0.00007194280624389648, -0.05389404296875, -0.004650115966796875, -0.035736083984375, 0.00849151611328125, -0.03021240234375, -0.0290374755859375, -0.0181884765625, -0.0018301010131835938, 0.044189453125, 0.035980224609375, 0.007381439208984375, 0.025787353515625, -0.041900634765625, -0.0026302337646484375, 0.0172576904296875, 0.008026123046875, 0.0079193115234375, -0.068603515625, -0.007137298583984375, 0.01026153564453125, -0.033050537109375, -0.085693359375, 0.03814697265625, -0.0034236907958984375, 0.0280303955078125, 0.00457000732421875, -0.016998291015625, 0.045318603515625, -0.006183624267578125, 0.051025390625, 0.0120697021484375, -0.07745361328125, 0.041534423828125, -0.03765869140625, 0.02392578125, 0.0256805419921875, 0.026092529296875, -0.054840087890625, -0.007015228271484375, -0.07647705078125, -0.0811767578125, 0.05682373046875, 0.03741455078125, 0.01250457763671875, 0.0084075927734375, 0.02911376953125, -0.034912109375, 0.0119171142578125, -0.078369140625, -0.0208587646484375, -0.019073486328125, -0.006908416748046875, 0.011993408203125, -0.00217437744140625, 0.00386810302734375, -0.04241943359375, 0.07598876953125, 0.0037174224853515625, 0.0258636474609375, 0.021484375, -0.0298919677734375, -0.00634765625, -0.0019779205322265625, 0.01239776611328125, 0.05694580078125, -0.0103759765625, 0.003612518310546875, 0.0155029296875, -0.042266845703125, 0.0021991729736328125, 0.01313018798828125, -0.0287322998046875, -0.005657196044921875, 0.0133056640625, 0.06646728515625, 0.0105743408203125, -0.0305938720703125, 0.017669677734375, -0.004192352294921875, -0.006000518798828125, -0.0229034423828125, -0.01262664794921875, 
0.0132293701171875, 0.01549530029296875, -0.002170562744140625, -0.0124359130859375, -0.0008802413940429688, -0.06536865234375, 0.0037288665771484375, 0.01568603515625, -0.01142120361328125, -0.03082275390625, 0.044342041015625, 0.0036487579345703125, -0.01369476318359375, 0.0858154296875, -0.017913818359375, -0.051239013671875, 0.057830810546875, 0.037506103515625, 0.055633544921875, -0.014251708984375, 0.026641845703125, 0.06805419921875, 0.024871826171875, -0.01558685302734375, 0.005397796630859375, 0.007366180419921875, -0.03936767578125, -0.008514404296875, -0.061126708984375, -0.0169830322265625, 0.0201568603515625, -0.043426513671875, 0.033538818359375, -0.048248291015625, -0.006622314453125, -0.0032958984375, 0.0166168212890625, -0.0426025390625, 0.024139404296875, 0.01264190673828125, 0.05450439453125, -0.0697021484375, 0.06207275390625, 0.048736572265625, -0.056304931640625, -0.08306884765625, 0.001495361328125, 0.0017213821411132812, -0.032501220703125, 0.0136260986328125, 0.01654052734375, 0.01526641845703125, 0.01235198974609375, -0.02117919921875, -0.06573486328125, 0.0968017578125, 0.0174560546875, -0.04840087890625, -0.02142333984375, -0.00901031494140625, 0.040802001953125, 0.00461578369140625, 0.054656982421875, 0.054168701171875, 0.0305938720703125, 0.006565093994140625, -0.08074951171875, 0.028106689453125, -0.0253143310546875, -0.004119873046875, 0.0170440673828125, -0.050201416015625, 0.09832763671875, -0.005008697509765625, -0.0011091232299804688, 0.0303802490234375, 0.045257568359375, 0.030242919921875, -0.01062774658203125, 0.0281219482421875, 0.058990478515625, 0.06591796875, -0.0283966064453125, 0.09259033203125, -0.0225067138671875, 0.05816650390625, 0.06451416015625, 0.0149383544921875, 0.038909912109375, 0.0297698974609375, -0.027984619140625, 0.039886474609375, 0.06256103515625, -0.006134033203125, 0.01313018798828125, 0.01861572265625, -0.0218048095703125, -0.021331787109375, 0.00927734375, -0.046142578125, 0.01381683349609375, 
0.0115509033203125, -0.0435791015625, -0.01605224609375, -0.0258636474609375, 0.0262603759765625, -0.030487060546875, -0.017913818359375, 0.019805908203125, 0.00730133056640625, -0.049346923828125, 0.048187255859375, 0.0173187255859375, 0.042144775390625, -0.034820556640625, 0.01132965087890625, -0.01251220703125, 0.0255279541015625, -0.0259857177734375, -0.0325927734375, 0.00637054443359375, -0.00026607513427734375, 0.00449371337890625, 0.008270263671875, 0.03289794921875, -0.01026153564453125, -0.043212890625, 0.01447296142578125, 0.035125732421875, 0.0185546875, -0.033294677734375, -0.0517578125, 0.006977081298828125, -0.01192474365234375, -0.0400390625, 0.0325927734375, 0.01934814453125, -0.01003265380859375, 0.04486083984375, 0.047454833984375, 0.00249481201171875, 0.0003807544708251953, 0.01192474365234375, 0.07421875, -0.0361328125, -0.036651611328125, -0.06817626953125, 0.038421630859375, -0.001476287841796875, -0.050079345703125, 0.06475830078125, 0.041351318359375, 0.052642822265625, 0.0184478759765625, 0.0457763671875, -0.0345458984375, 0.0012359619140625, -0.02264404296875, 0.050933837890625, -0.03790283203125, 0.0034046173095703125, -0.038543701171875, -0.0860595703125, -0.00435638427734375, 0.072265625, -0.03948974609375, 0.030029296875, 0.060821533203125, 0.061004638671875, -0.0067291259765625, 0.0074920654296875, 0.004215240478515625, 0.0228729248046875, 0.040618896484375, 0.0694580078125, 0.067626953125, -0.05303955078125, 0.041595458984375, -0.03826904296875, -0.01995849609375, -0.011749267578125, -0.03826904296875, -0.0654296875, -0.03497314453125, -0.03802490234375, -0.056060791015625, -0.001384735107421875, 0.06683349609375, 0.055023193359375, -0.046356201171875, -0.0122833251953125, -0.039093017578125, 0.003910064697265625, -0.019775390625, -0.0175933837890625, 0.03192138671875, 0.00882720947265625, -0.071533203125, -0.00272369384765625, -0.0120391845703125, 0.00771331787109375, -0.03131103515625, -0.0209197998046875, -0.0152435302734375, 
-0.00936126708984375, 0.006275177001953125, 0.0238037109375, -0.0390625, -0.02142333984375, 0.002017974853515625, 0.0036754608154296875, 0.00042557716369628906, 0.05291748046875, -0.043426513671875, 0.00865936279296875, 0.047637939453125, 0.008392333984375, 0.06097412109375, -0.0202484130859375, 0.03173828125, -0.0203399658203125, 0.0261383056640625, 0.0210113525390625, 0.04815673828125, 0.0253143310546875, -0.018890380859375, 0.0128021240234375, 0.032073974609375, -0.055328369140625, -0.06475830078125, 0.027435302734375, -0.05419921875, -0.0067901611328125, 0.0955810546875, -0.0196685791015625, -0.030120849609375, 0.004730224609375, -0.0161895751953125, 0.03961181640625, -0.0213623046875, 0.050323486328125, 0.0472412109375, 0.005428314208984375, -0.0160675048828125, -0.04833984375, 0.028350830078125, 0.049591064453125, -0.06158447265625, 0.02880859375, 0.047698974609375, 0.04620361328125, 0.0187530517578125, 0.044647216796875, -0.022705078125, 0.045867919921875, 0.006702423095703125, 0.006443023681640625, 0.0019235610961914062, -0.035125732421875, -0.032012939453125, -0.0109100341796875, 0.0168914794921875, 0.0020198822021484375 ] ]
KoboldAI/LLAMA2-13B-Holodeck-1
2023-07-28T21:08:16.000Z
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "en", "license:other", "endpoints_compatible", "text-generation-inference", "region:us" ]
text-generation
KoboldAI
null
null
KoboldAI/LLAMA2-13B-Holodeck-1
17
9,763
transformers
2023-07-28T09:33:08
---
license: other
language: en
commercial: no
inference: true
---

# LLAMA2 13B - Holodeck

## Model Description

LLAMA2 13B-Holodeck is a finetune created using Meta's Llama 2 model.

## Training data

The training data contains around 3000 ebooks in various genres. Most parts of the dataset have been prepended using the following text: `[Genre: <genre1>, <genre2>]`

### How to use

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:

```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/LLAMA2-13B-Holodeck-1')
>>> generator("Welcome Captain Janeway, I apologize for the delay.", do_sample=True, min_length=50)
[{'generated_text': 'Welcome Captain Janeway, I apologize for the delay."\nIt\'s all right," Janeway said. "I\'m certain that you\'re doing your best to keep me informed of what\'s going on."'}]
```

### Limitations and Biases

Based on known problems with NLP technology, potential relevant factors include bias (gender, profession, race, and religion).

### License

Llama 2 is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.

**Extra clause:** You shall use the Materials and Products solely for research purposes or personal use and not for any commercial purpose. Nothing in the Community License shall be construed as granting you a license to use the Materials or Products for any other purpose.

### BibTeX entry and citation info

https://huggingface.co/meta-llama/Llama-2-13b-hf
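Since most of the training data was prepended with a genre tag, prompts written in the same format may help steer generation toward a genre. The helper below is a hypothetical sketch — `build_prompt` is not part of the model's API; only the `[Genre: <genre1>, <genre2>]` tag format comes from the training-data description above.

```python
def build_prompt(genres, text):
    """Prepend a dataset-style genre tag to a generation prompt.

    Hypothetical helper: only the [Genre: ...] tag format is taken
    from the training-data description; the function itself is an
    illustration, not part of the model or library.
    """
    return f"[Genre: {', '.join(genres)}] {text}"

prompt = build_prompt(
    ["science fiction"],
    "Welcome Captain Janeway, I apologize for the delay.",
)
print(prompt)
# [Genre: science fiction] Welcome Captain Janeway, I apologize for the delay.
```

The resulting string can then be passed to the text-generation pipeline shown above in place of the raw prompt.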
1,576
[ [ -0.0106658935546875, -0.04052734375, 0.018829345703125, 0.03289794921875, -0.0478515625, -0.003993988037109375, 0.0157470703125, -0.0125732421875, 0.0254974365234375, 0.06787109375, -0.05841064453125, -0.046875, -0.053619384765625, 0.01284027099609375, -0.0310516357421875, 0.10308837890625, 0.0147705078125, -0.0280303955078125, -0.00897979736328125, 0.0019521713256835938, -0.04156494140625, -0.027679443359375, -0.0290374755859375, -0.038787841796875, 0.037445068359375, 0.03765869140625, 0.04425048828125, 0.060821533203125, 0.025238037109375, 0.012359619140625, -0.0205230712890625, 0.01012420654296875, -0.0396728515625, 0.01056671142578125, -0.007099151611328125, -0.035064697265625, -0.041229248046875, -0.019622802734375, 0.037872314453125, 0.01995849609375, -0.013427734375, 0.05291748046875, 0.0169219970703125, 0.029296875, -0.029388427734375, 0.0189208984375, -0.049285888671875, 0.00672149658203125, -0.0292510986328125, -0.01371002197265625, -0.0255279541015625, -0.0118865966796875, 0.01064300537109375, -0.0494384765625, 0.008697509765625, 0.009918212890625, 0.0701904296875, 0.041168212890625, -0.05499267578125, -0.0308990478515625, -0.03375244140625, 0.046966552734375, -0.06982421875, -0.014801025390625, 0.03778076171875, 0.0074462890625, -0.009002685546875, -0.07611083984375, -0.05108642578125, 0.007663726806640625, -0.0018253326416015625, 0.01410675048828125, -0.01294708251953125, -0.01522064208984375, 0.025787353515625, 0.021728515625, -0.042633056640625, 0.0146484375, -0.04095458984375, -0.0171661376953125, 0.05535888671875, 0.0265960693359375, 0.0035953521728515625, -0.051422119140625, -0.03326416015625, -0.0295562744140625, -0.04632568359375, 0.005512237548828125, 0.059844970703125, 0.001544952392578125, -0.02880859375, 0.07635498046875, -0.0015306472778320312, 0.0259552001953125, 0.021331787109375, -0.033355712890625, 0.038543701171875, -0.0016813278198242188, -0.008209228515625, 0.0109405517578125, 0.0640869140625, 0.057373046875, 
0.004573822021484375, 0.0150909423828125, -0.00171661376953125, -0.005733489990234375, 0.012115478515625, -0.0501708984375, -0.005702972412109375, 0.0272369384765625, -0.04473876953125, -0.0462646484375, -0.0066680908203125, -0.07666015625, -0.039764404296875, -0.01219940185546875, 0.0255584716796875, -0.00794219970703125, -0.0155792236328125, -0.0002281665802001953, -0.019622802734375, 0.025238037109375, -0.01255035400390625, -0.072509765625, 0.006877899169921875, 0.032928466796875, 0.05877685546875, 0.0006847381591796875, -0.0249786376953125, -0.0333251953125, 0.02789306640625, 0.0000591278076171875, 0.06463623046875, -0.00848388671875, -0.03436279296875, 0.0155029296875, 0.016632080078125, 0.023040771484375, -0.038787841796875, 0.0280303955078125, -0.031341552734375, 0.015899658203125, 0.007335662841796875, -0.0258941650390625, -0.0175323486328125, 0.01280975341796875, -0.056427001953125, 0.09423828125, 0.01174163818359375, -0.0662841796875, 0.007419586181640625, -0.04248046875, -0.0290374755859375, -0.00872039794921875, 0.0274658203125, -0.045562744140625, -0.019073486328125, -0.0029754638671875, 0.0261077880859375, -0.017547607421875, 0.04925537109375, -0.0013475418090820312, -0.0269012451171875, 0.00934600830078125, -0.0212554931640625, 0.06640625, 0.0173797607421875, -0.0386962890625, -0.0032596588134765625, -0.07720947265625, -0.0219573974609375, 0.0118255615234375, -0.051116943359375, -0.0038547515869140625, 0.01025390625, 0.02972412109375, 0.0001952648162841797, 0.034210205078125, -0.0267333984375, 0.03948974609375, -0.04217529296875, 0.037200927734375, 0.048797607421875, 0.01313018798828125, 0.029205322265625, -0.0325927734375, 0.02972412109375, -0.006603240966796875, -0.0010204315185546875, -0.0129241943359375, -0.058074951171875, -0.0662841796875, -0.0009336471557617188, 0.0275421142578125, 0.06597900390625, -0.04876708984375, 0.052642822265625, -0.01064300537109375, -0.061614990234375, -0.041015625, 0.01158905029296875, 0.0196533203125, 
0.020477294921875, 0.039031982421875, -0.0234527587890625, -0.062286376953125, -0.05328369140625, -0.00600433349609375, -0.0095062255859375, -0.0173187255859375, 0.0196533203125, 0.046539306640625, -0.06072998046875, 0.06024169921875, -0.0394287109375, -0.0291748046875, -0.027862548828125, 0.0238189697265625, 0.045745849609375, 0.025543212890625, 0.03424072265625, -0.0308837890625, -0.0357666015625, -0.0189056396484375, -0.05023193359375, -0.0302734375, -0.0255279541015625, -0.032928466796875, -0.002391815185546875, 0.0111083984375, -0.045135498046875, 0.042938232421875, 0.04840087890625, -0.0379638671875, 0.034576416015625, -0.001438140869140625, -0.0238189697265625, -0.10382080078125, 0.004497528076171875, -0.02716064453125, 0.00917816162109375, -0.031585693359375, -0.0064849853515625, -0.03009033203125, 0.0059051513671875, -0.028594970703125, 0.05438232421875, -0.0013446807861328125, -0.00060272216796875, -0.0298004150390625, 0.00559234619140625, 0.00572967529296875, 0.039031982421875, 0.0031490325927734375, 0.0243988037109375, 0.029693603515625, -0.057891845703125, 0.03204345703125, 0.06463623046875, -0.034576416015625, 0.0214385986328125, -0.05474853515625, 0.0180511474609375, -0.00409698486328125, 0.032135009765625, -0.071044921875, -0.038787841796875, 0.04443359375, -0.017730712890625, 0.005336761474609375, -0.006744384765625, -0.044342041015625, -0.0286865234375, -0.0095977783203125, 0.024658203125, 0.045623779296875, -0.042236328125, 0.04986572265625, 0.0185699462890625, -0.0132293701171875, -0.058380126953125, -0.06549072265625, -0.005718231201171875, -0.03143310546875, -0.03009033203125, 0.041412353515625, -0.01149749755859375, -0.00664520263671875, -0.028564453125, 0.00901031494140625, -0.00630950927734375, -0.0014448165893554688, 0.033111572265625, 0.0099945068359375, 0.004688262939453125, -0.0026702880859375, 0.0386962890625, -0.01325225830078125, 0.00820159912109375, -0.0068511962890625, 0.06036376953125, 0.01445770263671875, -0.005947113037109375, 
-0.0386962890625, 0.01377105712890625, 0.0181121826171875, 0.0004963874816894531, 0.055206298828125, 0.0628662109375, -0.0132598876953125, 0.0087127685546875, -0.028594970703125, -0.0020084381103515625, -0.035919189453125, 0.0244140625, 0.0029964447021484375, -0.04486083984375, 0.033447265625, -0.0016613006591796875, 0.0283355712890625, 0.060089111328125, 0.057647705078125, -0.019805908203125, 0.0391845703125, 0.03363037109375, 0.0161895751953125, 0.0253448486328125, -0.037445068359375, -0.004062652587890625, -0.0648193359375, -0.04071044921875, -0.0307464599609375, -0.047119140625, -0.052215576171875, -0.0156707763671875, 0.007213592529296875, 0.006122589111328125, -0.054290771484375, 0.048919677734375, -0.04241943359375, 0.0174713134765625, 0.06689453125, 0.01294708251953125, 0.0269927978515625, 0.00994873046875, -0.009857177734375, 0.00923919677734375, -0.036651611328125, -0.07135009765625, 0.10272216796875, 0.027984619140625, 0.048980712890625, 0.012969970703125, 0.0648193359375, 0.02935791015625, 0.024688720703125, -0.043212890625, 0.0284271240234375, -0.005588531494140625, -0.07373046875, -0.013702392578125, 0.006519317626953125, -0.07958984375, -0.0033626556396484375, -0.030792236328125, -0.053955078125, 0.0152740478515625, 0.01520538330078125, -0.01532745361328125, 0.0283050537109375, -0.03839111328125, 0.048492431640625, -0.017547607421875, -0.01380157470703125, 0.0018711090087890625, -0.06304931640625, 0.028717041015625, -0.01421356201171875, 0.016082763671875, 0.00016057491302490234, -0.0032253265380859375, 0.07330322265625, -0.0177459716796875, 0.09320068359375, -0.002105712890625, -0.021148681640625, 0.059051513671875, 0.00614166259765625, 0.033416748046875, 0.01898193359375, -0.0108642578125, 0.032867431640625, 0.01198577880859375, -0.0134429931640625, -0.00287628173828125, 0.0275421142578125, -0.08184814453125, -0.03790283203125, -0.0394287109375, -0.035614013671875, 0.0172119140625, 0.0137939453125, 0.0557861328125, 0.0156402587890625, 
0.006931304931640625, 0.0224761962890625, 0.0171661376953125, -0.0367431640625, 0.0377197265625, 0.0121612548828125, -0.022918701171875, -0.056671142578125, 0.04937744140625, 0.00325775146484375, 0.0017690658569335938, 0.02728271484375, 0.0038661956787109375, -0.031585693359375, -0.01102447509765625, -0.021820068359375, 0.0221710205078125, -0.048980712890625, -0.03271484375, -0.040771484375, -0.024017333984375, -0.019134521484375, 0.00568389892578125, -0.0013742446899414062, -0.03167724609375, -0.056610107421875, -0.0157623291015625, 0.041900634765625, 0.057159423828125, -0.0157318115234375, 0.0819091796875, -0.057281494140625, 0.033233642578125, 0.02685546875, 0.0079803466796875, 0.006755828857421875, -0.0701904296875, -0.01255035400390625, 0.0079193115234375, -0.068115234375, -0.065185546875, 0.04473876953125, 0.0139312744140625, 0.0308990478515625, 0.029541015625, 0.004512786865234375, 0.03143310546875, -0.045166015625, 0.07135009765625, 0.0164947509765625, -0.06402587890625, 0.044677734375, -0.033599853515625, 0.0163421630859375, 0.01253509521484375, 0.0271453857421875, -0.036468505859375, -0.01201629638671875, -0.06475830078125, -0.0732421875, 0.0419921875, 0.0303955078125, 0.03448486328125, -0.004558563232421875, 0.0307464599609375, 0.030181884765625, 0.0289154052734375, -0.10107421875, -0.035247802734375, -0.045318603515625, -0.0289764404296875, -0.005584716796875, -0.042724609375, -0.0056304931640625, -0.00836944580078125, 0.05029296875, -0.00870513916015625, 0.039642333984375, 0.00051116943359375, -0.00555419921875, -0.001708984375, 0.012054443359375, 0.03228759765625, 0.018341064453125, -0.02392578125, -0.004192352294921875, 0.021392822265625, -0.056549072265625, 0.01410675048828125, -0.0023479461669921875, -0.0192108154296875, 0.00638580322265625, 0.046966552734375, 0.07122802734375, 0.0107574462890625, -0.046722412109375, 0.015899658203125, 0.0250701904296875, -0.03961181640625, -0.034820556640625, 0.00629425048828125, 0.016510009765625, 
0.0399169921875, 0.02301025390625, 0.005588531494140625, 0.00860595703125, -0.0301361083984375, 0.0013780593872070312, 0.0003445148468017578, 0.0014495849609375, -0.01165008544921875, 0.0703125, 0.013641357421875, -0.0458984375, 0.054840087890625, -0.00018608570098876953, -0.015380859375, 0.0570068359375, 0.060089111328125, 0.04876708984375, 0.005985260009765625, 0.024200439453125, 0.04803466796875, 0.01904296875, 0.00276947021484375, 0.021026611328125, 0.00676727294921875, -0.046417236328125, -0.0235595703125, -0.060638427734375, -0.03057861328125, 0.0182037353515625, -0.055572509765625, 0.0386962890625, -0.03448486328125, -0.03594970703125, -0.03021240234375, -0.005279541015625, -0.049285888671875, 0.006748199462890625, 0.022216796875, 0.06280517578125, -0.05279541015625, 0.044097900390625, 0.037109375, -0.0301361083984375, -0.053497314453125, -0.0012664794921875, -0.006641387939453125, -0.058837890625, 0.0399169921875, 0.0227813720703125, 0.024627685546875, 0.0208892822265625, -0.0626220703125, -0.0777587890625, 0.1148681640625, 0.01168060302734375, -0.051177978515625, -0.01058197021484375, 0.0206451416015625, 0.038909912109375, -0.035614013671875, 0.0177764892578125, 0.040863037109375, 0.045654296875, 0.0030364990234375, -0.049774169921875, -0.01128387451171875, -0.01129913330078125, -0.0006213188171386719, -0.024932861328125, -0.054840087890625, 0.048583984375, -0.00963592529296875, -0.0014371871948242188, 0.047454833984375, 0.04962158203125, 0.053802490234375, 0.042938232421875, 0.0308990478515625, 0.034332275390625, 0.058502197265625, -0.01189422607421875, 0.0662841796875, -0.00827789306640625, 0.0443115234375, 0.07440185546875, -0.0138397216796875, 0.06634521484375, 0.0345458984375, -0.01541900634765625, 0.08294677734375, 0.07000732421875, -0.03204345703125, 0.060089111328125, 0.0137176513671875, 0.0069732666015625, -0.0038928985595703125, -0.00855255126953125, -0.033966064453125, 0.03326416015625, 0.020904541015625, -0.036285400390625, 
0.006298065185546875, -0.00284576416015625, -0.0002906322479248047, -0.0202484130859375, -0.01251220703125, 0.04925537109375, 0.00830841064453125, -0.034820556640625, 0.0400390625, 0.002899169921875, 0.057342529296875, -0.041229248046875, -0.0038318634033203125, -0.01421356201171875, -0.0008215904235839844, -0.0220794677734375, -0.054473876953125, 0.0115814208984375, 0.0191650390625, -0.0237884521484375, -0.00592041015625, 0.0322265625, -0.0233001708984375, -0.03814697265625, 0.0118408203125, 0.0174713134765625, 0.0279693603515625, 0.0240325927734375, -0.039154052734375, 0.0005640983581542969, 0.018218994140625, -0.022216796875, 0.00884246826171875, 0.02239990234375, 0.01314544677734375, 0.047607421875, 0.045196533203125, -0.0286865234375, 0.00769805908203125, 0.0009322166442871094, 0.0546875, -0.04095458984375, -0.0305023193359375, -0.054931640625, 0.0732421875, 0.00342559814453125, -0.04827880859375, 0.031951904296875, 0.035736083984375, 0.041748046875, -0.038543701171875, 0.049774169921875, -0.015838623046875, 0.0288238525390625, -0.04351806640625, 0.0562744140625, -0.037445068359375, 0.01513671875, -0.024627685546875, -0.094482421875, -0.0078277587890625, 0.057830810546875, 0.004955291748046875, -0.01305389404296875, 0.0479736328125, 0.0506591796875, 0.01458740234375, -0.01116943359375, -0.011627197265625, 0.024627685546875, 0.0114593505859375, 0.0280303955078125, 0.05560302734375, -0.059844970703125, 0.0594482421875, -0.0221099853515625, -0.00251007080078125, -0.0202789306640625, -0.08001708984375, -0.0784912109375, -0.0362548828125, -0.0226898193359375, -0.04119873046875, -0.027313232421875, 0.05462646484375, 0.0264129638671875, -0.0526123046875, -0.0310211181640625, 0.0022296905517578125, 0.0034961700439453125, -0.0099334716796875, -0.0155487060546875, 0.02606201171875, 0.016998291015625, -0.052581787109375, 0.011627197265625, -0.01036834716796875, 0.0255279541015625, -0.01336669921875, -0.017333984375, -0.0285491943359375, -0.00046324729919433594, 
0.037109375, 0.005268096923828125, -0.06024169921875, -0.008056640625, 0.0012941360473632812, 0.0027179718017578125, -0.00499725341796875, 0.0179595947265625, -0.048980712890625, 0.007793426513671875, 0.01873779296875, 0.02337646484375, 0.050933837890625, -0.005489349365234375, 0.02490234375, -0.057525634765625, 0.0194549560546875, -0.00959014892578125, 0.022918701171875, 0.0171051025390625, -0.0362548828125, 0.051544189453125, 0.0250091552734375, -0.05108642578125, -0.052642822265625, -0.003093719482421875, -0.08355712890625, 0.0005917549133300781, 0.1021728515625, 0.019134521484375, -0.00634002685546875, 0.0026836395263671875, -0.052276611328125, 0.04364013671875, -0.0175323486328125, 0.053009033203125, 0.051544189453125, 0.007366180419921875, 0.008148193359375, -0.0338134765625, 0.0132904052734375, 0.00008749961853027344, -0.04705810546875, -0.0187225341796875, 0.01611328125, 0.0288848876953125, 0.0234527587890625, 0.02789306640625, 0.01751708984375, 0.01251220703125, 0.006366729736328125, 0.0215911865234375, 0.002471923828125, -0.0189056396484375, -0.0012006759643554688, -0.003082275390625, -0.0171966552734375, 0.0016012191772460938 ] ]
adasnew/t5-small-xsum
2022-04-11T22:35:12.000Z
[ "transformers", "pytorch", "tensorboard", "t5", "text2text-generation", "generated_from_trainer", "dataset:xsum", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
text2text-generation
adasnew
null
null
adasnew/t5-small-xsum
0
9,759
transformers
2022-04-11T18:45:17
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- xsum
model-index:
- name: t5-small-xsum
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# t5-small-xsum

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the xsum dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3953

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 2.8641        | 0.04  | 500   | 2.6202          |
| 2.7466        | 0.08  | 1000  | 2.5660          |
| 2.8767        | 0.12  | 1500  | 2.5319          |
| 2.7099        | 0.16  | 2000  | 2.5107          |
| 2.7752        | 0.2   | 2500  | 2.4922          |
| 2.6037        | 0.24  | 3000  | 2.4800          |
| 2.8236        | 0.27  | 3500  | 2.4677          |
| 2.7089        | 0.31  | 4000  | 2.4581          |
| 2.7299        | 0.35  | 4500  | 2.4498          |
| 2.7498        | 0.39  | 5000  | 2.4420          |
| 2.6186        | 0.43  | 5500  | 2.4346          |
| 2.7817        | 0.47  | 6000  | 2.4288          |
| 2.5559        | 0.51  | 6500  | 2.4239          |
| 2.6725        | 0.55  | 7000  | 2.4186          |
| 2.6316        | 0.59  | 7500  | 2.4149          |
| 2.5561        | 0.63  | 8000  | 2.4115          |
| 2.5708        | 0.67  | 8500  | 2.4097          |
| 2.5861        | 0.71  | 9000  | 2.4052          |
| 2.6363        | 0.74  | 9500  | 2.4024          |
| 2.7435        | 0.78  | 10000 | 2.4003          |
| 2.7258        | 0.82  | 10500 | 2.3992          |
| 2.6113        | 0.86  | 11000 | 2.3983          |
| 2.6006        | 0.9   | 11500 | 2.3972          |
| 2.5684        | 0.94  | 12000 | 2.3960          |
| 2.6181        | 0.98  | 12500 | 2.3953          |

### Framework versions

- Transformers 4.18.0
- Pytorch 1.10.0+cu111
- Datasets 2.0.0
- Tokenizers 0.11.6
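The reported `total_train_batch_size` of 16 follows from the other hyperparameters: under gradient accumulation, parameters are updated once every `gradient_accumulation_steps` micro-batches, so each optimizer step effectively sees the per-device batch size times the accumulation steps. A quick sanity check of that arithmetic:

```python
# Effective batch size under gradient accumulation:
# an optimizer step happens once every `gradient_accumulation_steps`
# forward/backward passes, so each update aggregates gradients over
# train_batch_size * gradient_accumulation_steps examples.
train_batch_size = 1
gradient_accumulation_steps = 16
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 16
```

This matches the `total_train_batch_size: 16` listed in the training hyperparameters.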
2,590
[ [ -0.041473388671875, -0.025634765625, 0.01381683349609375, -0.0008282661437988281, -0.0079193115234375, -0.012786865234375, -0.002010345458984375, -0.007205963134765625, 0.0296783447265625, 0.0249481201171875, -0.05694580078125, -0.04827880859375, -0.048858642578125, -0.01508331298828125, -0.0108489990234375, 0.07135009765625, 0.005146026611328125, 0.005481719970703125, -0.0007586479187011719, -0.01030731201171875, -0.0300140380859375, -0.0170135498046875, -0.0701904296875, -0.033843994140625, 0.0209808349609375, 0.042572021484375, 0.05767822265625, 0.053955078125, 0.035430908203125, 0.0278778076171875, -0.0250091552734375, -0.0015172958374023438, -0.0357666015625, -0.044891357421875, 0.0101318359375, -0.04522705078125, -0.03656005859375, -0.00016951560974121094, 0.046173095703125, 0.0450439453125, -0.014495849609375, 0.04248046875, 0.00574493408203125, 0.04901123046875, -0.0307769775390625, 0.01180267333984375, -0.0181121826171875, 0.015838623046875, -0.01155853271484375, -0.01288604736328125, -0.0047760009765625, -0.0165252685546875, 0.00644683837890625, -0.0379638671875, 0.037811279296875, 0.0129852294921875, 0.10076904296875, 0.029266357421875, -0.0204010009765625, 0.00823974609375, -0.0374755859375, 0.051239013671875, -0.040374755859375, 0.0240020751953125, 0.0338134765625, 0.0270233154296875, 0.006977081298828125, -0.06915283203125, -0.0433349609375, 0.020111083984375, -0.0155029296875, 0.013397216796875, -0.0186614990234375, -0.020751953125, 0.04132080078125, 0.05206298828125, -0.04107666015625, 0.010467529296875, -0.04437255859375, -0.01471710205078125, 0.045196533203125, 0.041656494140625, 0.0034332275390625, -0.031494140625, -0.0333251953125, -0.0163726806640625, -0.031097412109375, 0.0325927734375, 0.037078857421875, 0.015167236328125, -0.044677734375, 0.0284271240234375, -0.01424407958984375, 0.04498291015625, 0.01349639892578125, -0.0138702392578125, 0.0516357421875, -0.0212860107421875, -0.03485107421875, -0.0022220611572265625, 0.052581787109375, 
0.041229248046875, -0.0034236907958984375, 0.00959014892578125, -0.01383209228515625, -0.0066986083984375, 0.01358795166015625, -0.06890869140625, -0.02490234375, 0.025054931640625, -0.042755126953125, -0.0474853515625, 0.00597381591796875, -0.041290283203125, 0.006916046142578125, -0.0250244140625, 0.033782958984375, -0.020355224609375, -0.01442718505859375, 0.0013933181762695312, -0.01499176025390625, 0.0213775634765625, 0.01483154296875, -0.07354736328125, 0.0299224853515625, 0.0294036865234375, 0.06353759765625, 0.0085601806640625, -0.00775146484375, -0.0041656494140625, 0.00911712646484375, -0.036285400390625, 0.044677734375, -0.00452423095703125, -0.033721923828125, -0.0226287841796875, 0.0302276611328125, -0.01410675048828125, -0.0291290283203125, 0.061614990234375, -0.022003173828125, 0.0294189453125, -0.0208282470703125, -0.032012939453125, -0.0198822021484375, 0.0273895263671875, -0.0474853515625, 0.09478759765625, 0.0167999267578125, -0.06982421875, 0.048858642578125, -0.047210693359375, 0.00661468505859375, -0.011810302734375, -0.004665374755859375, -0.07318115234375, -0.0110931396484375, 0.01910400390625, 0.028289794921875, -0.02734375, 0.006862640380859375, -0.0139923095703125, -0.032318115234375, -0.0180816650390625, -0.0216217041015625, 0.073486328125, 0.011260986328125, -0.04571533203125, 0.017547607421875, -0.08441162109375, 0.019195556640625, 0.0258636474609375, -0.0233001708984375, -0.0008840560913085938, -0.02374267578125, 0.01560211181640625, 0.029052734375, 0.01416015625, -0.035003662109375, 0.0184173583984375, -0.021270751953125, 0.03997802734375, 0.0499267578125, 0.0103759765625, 0.020538330078125, -0.0518798828125, 0.0213165283203125, 0.0290679931640625, 0.021697998046875, 0.0097808837890625, -0.0316162109375, -0.07196044921875, -0.034454345703125, 0.015716552734375, 0.0257110595703125, -0.01520538330078125, 0.046600341796875, -0.016571044921875, -0.05316162109375, -0.0305938720703125, -0.00737762451171875, 0.0190277099609375, 
0.0531005859375, 0.0214996337890625, -0.002590179443359375, -0.041961669921875, -0.080078125, 0.0096588134765625, 0.00586700439453125, 0.0107269287109375, 0.031280517578125, 0.070556640625, -0.0128021240234375, 0.0660400390625, -0.05181884765625, -0.03985595703125, -0.0080108642578125, -0.00220489501953125, 0.04315185546875, 0.049041748046875, 0.06494140625, -0.0487060546875, -0.0362548828125, -0.00710296630859375, -0.047698974609375, 0.0246124267578125, -0.0026874542236328125, -0.00460052490234375, -0.0014362335205078125, 0.017578125, -0.0282135009765625, 0.05242919921875, 0.0413818359375, -0.03607177734375, 0.056488037109375, -0.031402587890625, 0.01441192626953125, -0.08538818359375, 0.030517578125, -0.003131866455078125, -0.02978515625, -0.02032470703125, -0.01453399658203125, 0.006824493408203125, -0.0219879150390625, -0.0294647216796875, 0.05206298828125, -0.0280609130859375, 0.0029735565185546875, 0.0036163330078125, -0.015655517578125, 0.000621795654296875, 0.05340576171875, 0.0029773712158203125, 0.0682373046875, 0.051971435546875, -0.039794921875, 0.01067352294921875, 0.0299835205078125, -0.0472412109375, 0.03717041015625, -0.057708740234375, -0.00415802001953125, 0.0026607513427734375, 0.0003180503845214844, -0.07403564453125, -0.0217437744140625, 0.02178955078125, -0.032989501953125, 0.0171661376953125, -0.0111846923828125, -0.0174102783203125, -0.05316162109375, -0.0323486328125, 0.004878997802734375, 0.0384521484375, -0.03277587890625, 0.024139404296875, 0.006153106689453125, 0.017120361328125, -0.043121337890625, -0.05743408203125, -0.004985809326171875, -0.017303466796875, -0.05963134765625, 0.034759521484375, 0.0002703666687011719, -0.0024051666259765625, 0.004974365234375, -0.005245208740234375, -0.01024627685546875, 0.0030879974365234375, 0.03350830078125, 0.01369476318359375, -0.011566162109375, -0.01358795166015625, -0.0097808837890625, -0.0303955078125, 0.0016450881958007812, -0.00518035888671875, 0.045440673828125, -0.0134735107421875, 
-0.0178680419921875, -0.0595703125, -0.003925323486328125, 0.03350830078125, -0.0146636962890625, 0.07525634765625, 0.0445556640625, -0.0374755859375, -0.0011339187622070312, -0.0267486572265625, -0.0196380615234375, -0.034912109375, 0.0270233154296875, -0.04962158203125, -0.041229248046875, 0.064697265625, 0.0018873214721679688, 0.01407623291015625, 0.06390380859375, 0.0423583984375, -0.00673675537109375, 0.0816650390625, 0.02923583984375, -0.0030155181884765625, 0.026336669921875, -0.0775146484375, -0.005733489990234375, -0.056182861328125, -0.037078857421875, -0.0406494140625, -0.033172607421875, -0.04150390625, -0.004184722900390625, 0.0266265869140625, 0.0035915374755859375, -0.05364990234375, 0.00754547119140625, -0.04229736328125, 0.0236968994140625, 0.0550537109375, 0.0290985107421875, 0.00446319580078125, -0.0066986083984375, -0.0258941650390625, -0.01352691650390625, -0.058837890625, -0.03460693359375, 0.08514404296875, 0.0244598388671875, 0.036468505859375, 0.004901885986328125, 0.059661865234375, 0.0189666748046875, 0.004734039306640625, -0.042236328125, 0.013580322265625, -0.0019121170043945312, -0.0738525390625, -0.0202178955078125, -0.0245208740234375, -0.07220458984375, 0.0208740234375, -0.0271148681640625, -0.061248779296875, 0.034698486328125, 0.022064208984375, -0.039459228515625, 0.045074462890625, -0.03863525390625, 0.07958984375, -0.00787353515625, -0.039306640625, -0.00815582275390625, -0.050750732421875, 0.0252685546875, -0.00021982192993164062, 0.0036773681640625, -0.005550384521484375, 0.0002980232238769531, 0.06719970703125, -0.06231689453125, 0.044036865234375, -0.01210784912109375, 0.0254974365234375, 0.0239410400390625, -0.0170745849609375, 0.048370361328125, 0.00887298583984375, -0.013702392578125, 0.0055694580078125, 0.008544921875, -0.0472412109375, -0.0278778076171875, 0.0599365234375, -0.08599853515625, -0.0384521484375, -0.044891357421875, -0.031158447265625, 0.01404571533203125, 0.0328369140625, 0.044586181640625, 
0.04571533203125, -0.002376556396484375, 0.0229339599609375, 0.044158935546875, 0.0036830902099609375, 0.0401611328125, 0.0160064697265625, -0.00196075439453125, -0.053619384765625, 0.0587158203125, 0.0062255859375, 0.022369384765625, 0.0007953643798828125, 0.01873779296875, -0.0288543701171875, -0.023101806640625, -0.03387451171875, 0.0168304443359375, -0.03515625, -0.0192413330078125, -0.035919189453125, -0.01068878173828125, -0.043121337890625, -0.023895263671875, -0.032623291015625, -0.0224609375, -0.0330810546875, -0.011810302734375, 0.0289154052734375, 0.03753662109375, -0.0016107559204101562, 0.04132080078125, -0.045166015625, 0.0014362335205078125, 0.0019435882568359375, 0.01446533203125, 0.002490997314453125, -0.05157470703125, -0.022491455078125, -0.003948211669921875, -0.03350830078125, -0.04339599609375, 0.049530029296875, 0.004669189453125, 0.0384521484375, 0.05010986328125, -0.00917816162109375, 0.0693359375, -0.018524169921875, 0.057586669921875, 0.032684326171875, -0.05255126953125, 0.035247802734375, -0.02166748046875, 0.023895263671875, 0.046417236328125, 0.037261962890625, -0.0296478271484375, -0.003559112548828125, -0.09832763671875, -0.05828857421875, 0.0726318359375, 0.02362060546875, -0.01007843017578125, 0.0207366943359375, 0.0240020751953125, -0.0240631103515625, 0.0200653076171875, -0.053741455078125, -0.05712890625, -0.00980377197265625, -0.002780914306640625, -0.01483917236328125, -0.01378631591796875, -0.01629638671875, -0.03570556640625, 0.05865478515625, 0.005214691162109375, 0.0284423828125, 0.01007080078125, 0.01276397705078125, -0.00672149658203125, -0.001617431640625, 0.052642822265625, 0.053802490234375, -0.04425048828125, -0.0038013458251953125, 0.0180816650390625, -0.03216552734375, 0.0082550048828125, -0.0010137557983398438, -0.0250396728515625, 0.000583648681640625, 0.028900146484375, 0.06390380859375, 0.01507568359375, 0.00217437744140625, 0.042022705078125, 0.0016889572143554688, -0.036773681640625, -0.040283203125, 
0.005340576171875, 0.0026683807373046875, 0.0189361572265625, 0.0183868408203125, 0.032470703125, 0.005695343017578125, -0.0308685302734375, 0.0164337158203125, 0.017486572265625, -0.047210693359375, -0.0078277587890625, 0.0699462890625, 0.004119873046875, -0.01197052001953125, 0.04962158203125, 0.0001163482666015625, -0.036865234375, 0.0714111328125, 0.0303497314453125, 0.048370361328125, -0.01446533203125, -0.00492095947265625, 0.07305908203125, 0.022705078125, -0.0017480850219726562, 0.037384033203125, 0.00872802734375, -0.0271148681640625, 0.0019817352294921875, -0.0382080078125, -0.019500732421875, 0.0239715576171875, -0.06756591796875, 0.05126953125, -0.0325927734375, -0.0303497314453125, 0.006473541259765625, 0.0303497314453125, -0.071533203125, 0.05328369140625, -0.0033931732177734375, 0.08673095703125, -0.0693359375, 0.049224853515625, 0.045806884765625, -0.049560546875, -0.07977294921875, -0.038299560546875, -0.0029087066650390625, -0.06689453125, 0.045806884765625, 0.004146575927734375, 0.019378662109375, 0.0154571533203125, -0.038787841796875, -0.07470703125, 0.09539794921875, 0.0010137557983398438, -0.047607421875, 0.003997802734375, 0.008697509765625, 0.029937744140625, -0.004772186279296875, 0.0362548828125, 0.037384033203125, 0.0343017578125, 0.01493072509765625, -0.05914306640625, 0.005687713623046875, -0.0239410400390625, -0.0065765380859375, 0.028564453125, -0.0689697265625, 0.07965087890625, -0.0278778076171875, 0.014862060546875, 0.006885528564453125, 0.04974365234375, 0.0219879150390625, 0.016021728515625, 0.037384033203125, 0.0792236328125, 0.060028076171875, -0.026458740234375, 0.0885009765625, -0.03387451171875, 0.062042236328125, 0.0699462890625, 0.0232086181640625, 0.05078125, 0.032928466796875, -0.03173828125, 0.025543212890625, 0.0826416015625, -0.0166778564453125, 0.040130615234375, 0.0035305023193359375, -0.01248931884765625, -0.0262451171875, 0.01654052734375, -0.05029296875, 0.0123443603515625, 0.0113525390625, -0.044708251953125, 
-0.0222930908203125, -0.015869140625, 0.0081939697265625, -0.0239715576171875, -0.040557861328125, 0.03192138671875, -0.01084136962890625, -0.015472412109375, 0.0478515625, 0.00392913818359375, 0.03338623046875, -0.050201416015625, 0.0080718994140625, -0.006328582763671875, 0.03765869140625, -0.049407958984375, -0.058685302734375, 0.0196685791015625, -0.01209259033203125, -0.0241241455078125, 0.0028324127197265625, 0.030059814453125, -0.007122039794921875, -0.05267333984375, -0.00365447998046875, 0.015380859375, 0.0099334716796875, 0.012969970703125, -0.06341552734375, -0.01434326171875, 0.00811767578125, -0.0504150390625, 0.003444671630859375, 0.0271453857421875, -0.014923095703125, 0.043060302734375, 0.050201416015625, 0.00025177001953125, 0.01190948486328125, -0.0036869049072265625, 0.0821533203125, -0.049896240234375, -0.055419921875, -0.048614501953125, 0.034759521484375, -0.0205078125, -0.060150146484375, 0.052215576171875, 0.06866455078125, 0.0474853515625, -0.00565338134765625, 0.04248046875, -0.019073486328125, 0.03662109375, -0.02850341796875, 0.051422119140625, -0.06292724609375, -0.01345062255859375, -0.01145172119140625, -0.057708740234375, -0.02618408203125, 0.049102783203125, -0.044403076171875, 0.01383209228515625, 0.040130615234375, 0.05645751953125, -0.00020265579223632812, -0.0025386810302734375, 0.017974853515625, 0.000308990478515625, 0.01708984375, 0.0347900390625, 0.02734375, -0.06182861328125, 0.035430908203125, -0.0526123046875, -0.0036334991455078125, -0.0156402587890625, -0.04779052734375, -0.060394287109375, -0.0277557373046875, -0.0380859375, -0.032257080078125, -0.01337432861328125, 0.0694580078125, 0.061492919921875, -0.05291748046875, -0.0156707763671875, 0.001708984375, -0.00670623779296875, -0.02044677734375, -0.01021575927734375, 0.063720703125, -0.01210784912109375, -0.0670166015625, -0.0182647705078125, -0.00965118408203125, 0.02392578125, -0.0114288330078125, -0.01221466064453125, -0.01558685302734375, -0.01543426513671875, 
0.0196380615234375, 0.005298614501953125, -0.028900146484375, -0.0224609375, -0.0117340087890625, -0.01763916015625, 0.02880859375, 0.0210723876953125, -0.032684326171875, 0.0198822021484375, 0.0316162109375, 0.0211181640625, 0.06768798828125, -0.0009293556213378906, 0.00799560546875, -0.046844482421875, 0.02294921875, 0.0000012516975402832031, 0.0333251953125, 0.00827789306640625, -0.02276611328125, 0.0545654296875, 0.0321044921875, -0.047637939453125, -0.05657958984375, -0.022705078125, -0.085693359375, 0.01059722900390625, 0.07269287109375, -0.004283905029296875, -0.0423583984375, 0.0145111083984375, -0.02142333984375, 0.0057830810546875, -0.0249786376953125, 0.035186767578125, 0.051727294921875, -0.01274871826171875, -0.0026340484619140625, -0.043548583984375, 0.0396728515625, 0.0144195556640625, -0.05389404296875, -0.028656005859375, 0.019317626953125, 0.045074462890625, 0.01364898681640625, 0.035125732421875, -0.010833740234375, 0.0186767578125, 0.0264434814453125, 0.0189361572265625, -0.0184478759765625, -0.006412506103515625, -0.0164947509765625, 0.01364898681640625, 0.00396728515625, -0.04571533203125 ] ]
yuvalkirstain/peft-100k_t5_xl_task_pred_1024steps_color_best
2023-11-01T07:35:10.000Z
[ "peft", "region:us" ]
null
yuvalkirstain
null
null
yuvalkirstain/peft-100k_t5_xl_task_pred_1024steps_color_best
0
9,757
peft
2023-11-01T07:35:05
--- library_name: peft --- ## Training procedure ### Framework versions - PEFT 0.5.0
101
[ [ -0.019500732421875, -0.006832122802734375, 0.02142333984375, 0.064697265625, -0.01194000244140625, -0.000026166439056396484, 0.039276123046875, -0.00853729248046875, 0.0196990966796875, 0.0562744140625, -0.04290771484375, -0.0174713134765625, -0.035552978515625, 0.00884246826171875, -0.036224365234375, 0.05096435546875, -0.010528564453125, 0.028961181640625, 0.0133819580078125, -0.0018291473388671875, -0.005008697509765625, -0.0550537109375, -0.065185546875, -0.048065185546875, 0.038787841796875, 0.047149658203125, 0.0120697021484375, 0.034820556640625, 0.05316162109375, 0.014312744140625, 0.00023758411407470703, -0.06640625, -0.020538330078125, 0.004886627197265625, -0.01495361328125, 0.01540374755859375, -0.0771484375, -0.006397247314453125, 0.0303497314453125, 0.05743408203125, -0.018798828125, 0.01186370849609375, -0.043609619140625, 0.0701904296875, -0.0267333984375, -0.01224517822265625, -0.014068603515625, 0.042572021484375, 0.0016641616821289062, 0.00170135498046875, 0.003925323486328125, -0.01427459716796875, 0.0154571533203125, -0.037384033203125, 0.0301055908203125, -0.0169219970703125, 0.0592041015625, 0.004199981689453125, -0.0416259765625, 0.028717041015625, -0.0179595947265625, 0.041015625, -0.0163116455078125, 0.07550048828125, 0.03314208984375, 0.05072021484375, -0.001026153564453125, -0.07080078125, 0.003997802734375, 0.004791259765625, 0.023162841796875, -0.022674560546875, 0.024017333984375, 0.0146484375, 0.055206298828125, 0.041534423828125, -0.0330810546875, 0.0066375732421875, -0.0780029296875, -0.0257415771484375, 0.048675537109375, -0.02630615234375, -0.0088958740234375, 0.037384033203125, -0.03369140625, -0.003826141357421875, -0.07110595703125, -0.032745361328125, -0.004436492919921875, 0.01534271240234375, -0.0364990234375, 0.0239410400390625, -0.0523681640625, 0.01001739501953125, 0.0246124267578125, 0.01399993896484375, 0.06781005859375, -0.0021533966064453125, -0.06463623046875, 0.0224761962890625, 0.01094818115234375, 
0.0087127685546875, 0.00536346435546875, -0.009796142578125, -0.03338623046875, -0.040618896484375, 0.01357269287109375, -0.07049560546875, -0.039276123046875, 0.01299285888671875, -0.034027099609375, -0.03955078125, 0.01593017578125, -0.04638671875, -0.0224761962890625, 0.004886627197265625, 0.01947021484375, -0.0221099853515625, -0.03582763671875, 0.01242828369140625, -0.041473388671875, 0.046417236328125, 0.026519775390625, -0.081787109375, 0.038726806640625, 0.042449951171875, 0.0129852294921875, 0.041351318359375, -0.03125, -0.03460693359375, 0.0145263671875, -0.01568603515625, 0.0421142578125, -0.0003635883331298828, -0.046539306640625, -0.0071563720703125, 0.0318603515625, 0.0020923614501953125, -0.0307159423828125, 0.07171630859375, -0.01861572265625, 0.0469970703125, -0.0233154296875, -0.04925537109375, -0.022613525390625, 0.02056884765625, -0.035491943359375, 0.08502197265625, 0.04638671875, -0.00902557373046875, 0.041656494140625, -0.09405517578125, -0.0259552001953125, -0.006374359130859375, 0.003513336181640625, -0.0304107666015625, -0.005672454833984375, -0.02008056640625, 0.018768310546875, 0.01160430908203125, -0.0115509033203125, -0.0192413330078125, -0.0215301513671875, -0.037933349609375, -0.0261077880859375, 0.08245849609375, 0.0272979736328125, -0.042510986328125, 0.005153656005859375, -0.064697265625, -0.0008106231689453125, -0.0033416748046875, -0.0062408447265625, 0.036163330078125, -0.023468017578125, -0.00794219970703125, 0.020538330078125, 0.03607177734375, -0.032257080078125, 0.0190582275390625, -0.01300048828125, 0.0164642333984375, 0.0404052734375, 0.0022678375244140625, 0.0279998779296875, -0.03363037109375, 0.036895751953125, 0.01091766357421875, 0.03173828125, 0.01129150390625, -0.0163726806640625, -0.058990478515625, -0.01165771484375, 0.024017333984375, 0.051483154296875, -0.002696990966796875, 0.034332275390625, 0.01399993896484375, -0.046173095703125, 0.02398681640625, 0.004688262939453125, 0.0297698974609375, 
0.0151824951171875, 0.0267791748046875, -0.0007462501525878906, -0.0673828125, -0.04266357421875, 0.0278167724609375, -0.0213775634765625, 0.00731658935546875, -0.02923583984375, 0.05926513671875, -0.0258636474609375, 0.00382232666015625, -0.01959228515625, -0.003662109375, 0.0186767578125, 0.01099395751953125, 0.0271148681640625, 0.07183837890625, 0.0955810546875, -0.050445556640625, -0.0494384765625, -0.0382080078125, -0.001338958740234375, -0.0274505615234375, 0.00396728515625, 0.011260986328125, 0.04766845703125, 0.033905029296875, -0.00665283203125, 0.02960205078125, 0.0184173583984375, -0.04132080078125, 0.035369873046875, 0.01418304443359375, 0.0044097900390625, -0.0458984375, 0.0100555419921875, -0.007061004638671875, -0.041900634765625, -0.005321502685546875, 0.01230621337890625, -0.0111083984375, -0.000040411949157714844, -0.065673828125, 0.03741455078125, -0.0404052734375, 0.0099029541015625, 0.03399658203125, -0.059478759765625, 0.0037403106689453125, 0.0024127960205078125, 0.0222015380859375, 0.060089111328125, 0.058380126953125, -0.0295562744140625, 0.05029296875, 0.01297760009765625, 0.04071044921875, 0.04266357421875, -0.0499267578125, 0.01207733154296875, 0.025970458984375, -0.016387939453125, -0.025634765625, -0.042236328125, 0.04827880859375, -0.0185089111328125, 0.0163726806640625, -0.0006237030029296875, -0.027191162109375, -0.0295867919921875, -0.053131103515625, 0.00331878662109375, 0.06488037109375, -0.01910400390625, 0.0010738372802734375, -0.003696441650390625, 0.00893402099609375, -0.04052734375, -0.04931640625, -0.0086212158203125, -0.002330780029296875, -0.0198516845703125, 0.0111236572265625, -0.0301666259765625, 0.004016876220703125, -0.0404052734375, -0.019195556640625, -0.045013427734375, -0.019989013671875, 0.0272674560546875, -0.007289886474609375, -0.0177001953125, -0.007030487060546875, 0.0128173828125, -0.01739501953125, 0.00313568115234375, 0.00101470947265625, 0.0116729736328125, -0.01385498046875, -0.018096923828125, 
0.00540924072265625, 0.037109375, 0.03424072265625, -0.0012941360473632812, 0.0350341796875, 0.0416259765625, -0.036773681640625, 0.00969696044921875, -0.036895751953125, -0.01593017578125, -0.034912109375, 0.012359619140625, -0.029144287109375, -0.06597900390625, 0.029266357421875, -0.01290130615234375, 0.0033740997314453125, 0.036285400390625, 0.0143585205078125, -0.006420135498046875, 0.050079345703125, 0.06121826171875, 0.01509857177734375, 0.05438232421875, -0.000308990478515625, 0.01262664794921875, -0.019989013671875, -0.029876708984375, -0.0511474609375, 0.0207061767578125, -0.00629425048828125, -0.00453948974609375, 0.01312255859375, 0.039093017578125, -0.07232666015625, 0.05059814453125, -0.051788330078125, 0.051177978515625, 0.023529052734375, -0.0097198486328125, -0.01178741455078125, 0.0030803680419921875, -0.0100555419921875, 0.02532958984375, -0.04022216796875, -0.026123046875, 0.08734130859375, 0.06927490234375, 0.0203094482421875, 0.0059051513671875, 0.060394287109375, 0.009307861328125, 0.02618408203125, -0.01520538330078125, 0.042449951171875, -0.012054443359375, -0.0775146484375, -0.0213623046875, 0.00843048095703125, -0.034881591796875, -0.0165252685546875, -0.0096893310546875, -0.03668212890625, 0.0015544891357421875, 0.004550933837890625, -0.0211334228515625, -0.0004432201385498047, -0.050445556640625, 0.11346435546875, -0.0248260498046875, 0.0293731689453125, 0.007160186767578125, -0.062164306640625, -0.010406494140625, -0.0116119384765625, -0.0232391357421875, -0.01110076904296875, -0.01325225830078125, 0.03759765625, -0.0311431884765625, 0.04119873046875, -0.01654052734375, 0.006671905517578125, -0.0197296142578125, -0.03192138671875, 0.008148193359375, -0.009185791015625, -0.01186370849609375, -0.00029468536376953125, 0.056915283203125, -0.053619384765625, 0.0028057098388671875, 0.01256561279296875, -0.042022705078125, 0.005161285400390625, -0.053558349609375, 0.006801605224609375, 0.0113677978515625, 0.034027099609375, 0.043670654296875, 
0.0175628662109375, -0.0302734375, 0.0113983154296875, 0.0706787109375, 0.021484375, 0.04229736328125, 0.03106689453125, -0.054656982421875, -0.04931640625, 0.047454833984375, 0.00971221923828125, 0.00861358642578125, 0.0306854248046875, -0.017181396484375, -0.0391845703125, -0.069091796875, -0.0125885009765625, 0.01001739501953125, -0.031829833984375, -0.006748199462890625, -0.0053863525390625, -0.037322998046875, -0.07049560546875, -0.025634765625, -0.023193359375, -0.046173095703125, -0.044952392578125, -0.0308990478515625, 0.020416259765625, 0.008087158203125, -0.05059814453125, 0.04205322265625, -0.0816650390625, -0.0120697021484375, 0.0186614990234375, 0.05810546875, -0.06683349609375, -0.020355224609375, -0.01325225830078125, 0.00007426738739013672, -0.055389404296875, -0.078369140625, 0.00600433349609375, 0.0261688232421875, 0.048095703125, 0.039703369140625, -0.0091552734375, 0.0308074951171875, -0.02349853515625, 0.06768798828125, 0.0352783203125, -0.0526123046875, 0.049346923828125, -0.032989501953125, 0.013153076171875, 0.07794189453125, 0.053985595703125, -0.022979736328125, 0.022491455078125, -0.08697509765625, -0.040283203125, 0.007137298583984375, -0.01079559326171875, 0.00299835205078125, 0.004634857177734375, 0.00997161865234375, 0.0177154541015625, 0.04510498046875, -0.0538330078125, -0.0037593841552734375, -0.02606201171875, -0.0023822784423828125, 0.0321044921875, -0.01543426513671875, -0.02587890625, -0.0599365234375, 0.060546875, 0.010040283203125, 0.0394287109375, 0.032470703125, 0.0277252197265625, -0.0298309326171875, 0.000865936279296875, 0.06378173828125, 0.06463623046875, -0.0648193359375, -0.005462646484375, 0.00025200843811035156, -0.044342041015625, 0.0032329559326171875, 0.035064697265625, 0.00603485107421875, 0.01788330078125, 0.0262603759765625, 0.02667236328125, -0.0004184246063232422, -0.04522705078125, 0.027740478515625, 0.0142364501953125, -0.048736572265625, -0.08154296875, 0.0321044921875, -0.0117645263671875, 
0.004459381103515625, 0.0122833251953125, 0.013458251953125, 0.012847900390625, -0.040008544921875, 0.052398681640625, -0.01849365234375, -0.0655517578125, -0.0129852294921875, 0.03228759765625, 0.04669189453125, -0.0654296875, 0.03070068359375, -0.0294952392578125, -0.049774169921875, 0.055999755859375, 0.02362060546875, 0.0687255859375, -0.003917694091796875, 0.0295867919921875, 0.016571044921875, -0.0102081298828125, -0.020751953125, 0.0662841796875, 0.0005230903625488281, -0.03631591796875, -0.000728607177734375, -0.0423583984375, -0.0243377685546875, -0.0009174346923828125, -0.02783203125, 0.0156402587890625, -0.058563232421875, -0.015655517578125, 0.0222625732421875, 0.01114654541015625, -0.0028839111328125, 0.02569580078125, 0.008758544921875, 0.1368408203125, -0.0372314453125, 0.07672119140625, 0.0853271484375, -0.044189453125, -0.0399169921875, -0.034332275390625, 0.0110015869140625, -0.014984130859375, 0.041961669921875, 0.00044226646423339844, 0.01959228515625, 0.0062408447265625, -0.060394287109375, -0.0347900390625, 0.0853271484375, 0.0292510986328125, -0.035125732421875, 0.0246124267578125, 0.00313568115234375, 0.01520538330078125, -0.04168701171875, -0.01215362548828125, 0.042724609375, 0.0237579345703125, 0.028717041015625, -0.1116943359375, -0.0169525146484375, -0.01459503173828125, -0.038482666015625, 0.039093017578125, -0.0416259765625, 0.04913330078125, 0.01049041748046875, 0.03997802734375, 0.028472900390625, 0.029296875, 0.0269927978515625, 0.0147857666015625, 0.04150390625, 0.04229736328125, 0.03863525390625, -0.0214691162109375, 0.0267333984375, -0.0031299591064453125, 0.0312347412109375, 0.07086181640625, -0.01551055908203125, 0.035369873046875, 0.06890869140625, -0.0260772705078125, 0.03680419921875, 0.08258056640625, -0.03155517578125, 0.0285491943359375, -0.022705078125, -0.0323486328125, -0.0163726806640625, 0.0107269287109375, -0.05572509765625, -0.022705078125, 0.001544952392578125, -0.049957275390625, -0.0191802978515625, 
-0.043792724609375, 0.029327392578125, -0.021270751953125, -0.037994384765625, 0.035400390625, 0.0052642822265625, -0.0303802490234375, 0.0372314453125, 0.005157470703125, 0.0300445556640625, -0.0494384765625, -0.0261993408203125, 0.007717132568359375, 0.002437591552734375, -0.009552001953125, -0.045379638671875, 0.058868408203125, -0.01369476318359375, -0.028350830078125, -0.042510986328125, 0.05078125, -0.01526641845703125, -0.050262451171875, 0.01354217529296875, 0.0101318359375, 0.0130767822265625, -0.00783538818359375, -0.0928955078125, 0.007633209228515625, -0.00525665283203125, 0.0142822265625, 0.0164642333984375, 0.0185394287109375, 0.01410675048828125, 0.034454345703125, 0.0130767822265625, -0.0025119781494140625, 0.0099029541015625, 0.01094818115234375, 0.0406494140625, -0.031951904296875, -0.034759521484375, -0.03277587890625, 0.046661376953125, -0.0265045166015625, -0.046142578125, 0.01473236083984375, 0.061431884765625, 0.0804443359375, -0.01372528076171875, 0.022369384765625, -0.036590576171875, 0.0423583984375, -0.0078887939453125, 0.040863037109375, -0.034576416015625, 0.00865936279296875, 0.0030975341796875, -0.051605224609375, -0.0286865234375, 0.041748046875, -0.0101318359375, 0.01259613037109375, 0.0650634765625, 0.0445556640625, -0.006999969482421875, 0.0218353271484375, 0.013153076171875, -0.01561737060546875, -0.00550079345703125, 0.02630615234375, 0.0036487579345703125, -0.048065185546875, 0.01409912109375, -0.048858642578125, -0.0229949951171875, -0.007717132568359375, -0.0927734375, -0.042449951171875, -0.0074920654296875, -0.04132080078125, -0.037445068359375, 0.006931304931640625, 0.0877685546875, 0.08575439453125, -0.0694580078125, -0.02728271484375, -0.005611419677734375, 0.0102996826171875, -0.012054443359375, -0.0114593505859375, 0.0034465789794921875, -0.0130615234375, -0.0211639404296875, 0.003814697265625, 0.01061248779296875, 0.0435791015625, -0.0236968994140625, -0.037872314453125, -0.02593994140625, -0.00313568115234375, 
0.0283050537109375, 0.043701171875, -0.0126953125, -0.04052734375, -0.007289886474609375, -0.00664520263671875, 0.0203857421875, 0.037994384765625, -0.0238037109375, -0.022857666015625, 0.048675537109375, -0.0016918182373046875, 0.0269622802734375, -0.019317626953125, 0.05352783203125, -0.06146240234375, 0.028411865234375, 0.0188446044921875, 0.027587890625, 0.00450897216796875, -0.0272979736328125, 0.08428955078125, 0.055999755859375, -0.041290283203125, -0.055206298828125, 0.00457763671875, -0.0908203125, -0.0124359130859375, 0.02728271484375, 0.00656890869140625, -0.0248870849609375, -0.0123138427734375, -0.031402587890625, 0.002506256103515625, -0.032562255859375, 0.00580596923828125, 0.0469970703125, -0.026275634765625, -0.042449951171875, -0.01837158203125, 0.0482177734375, 0.0104522705078125, -0.035888671875, -0.048828125, 0.047576904296875, 0.0313720703125, 0.002132415771484375, 0.0714111328125, -0.02813720703125, 0.042083740234375, 0.0271759033203125, 0.00641632080078125, -0.027862548828125, -0.0245819091796875, -0.037261962890625, 0.0037841796875, -0.0016021728515625, -0.016082763671875 ] ]
42MARU/GenAI-llama-2-13b
2023-10-04T04:35:56.000Z
[ "transformers", "pytorch", "llama", "text-generation", "llama-2", "instruct", "instruction", "en", "endpoints_compatible", "text-generation-inference", "region:us" ]
text-generation
42MARU
null
null
42MARU/GenAI-llama-2-13b
3
9,702
transformers
2023-09-24T14:28:22
--- language: - en tags: - llama-2 - instruct - instruction pipeline_tag: text-generation --- # GenAI-llama-2-13b ### Model Details - Developed by: [42MARU](https://www.42maru.ai/en/) - Backbone Model: [LLaMA-2](https://github.com/facebookresearch/llama/tree/main) - Library: [transformers](https://github.com/huggingface/transformers) ### Used Datasets - Orca-style dataset - Platypus ### Prompt Template ``` ### User: {User} ### Assistant: {Assistant} ``` ### Introducing 42MARU - At 42Maru we study QA (Question Answering) and are developing advanced search paradigms that help users spend less time searching by understanding natural language and intent, thanks to AI and Deep Learning. - [About Us](https://www.42maru.ai/en/about-us/) - [Contact Us](https://www.42maru.ai/en/contact/) ### Contribute - [l-yohai](https://github.com/l-yohai) ### License [LICENSE.txt](meta-license/LICENSE.txt) ### USE_POLICY [USE_POLICY.md](meta-license/USE_POLICY.md) ### Responsible Use Guide [Responsible-Use-Guide.pdf](meta-license/Responsible-Use-Guide.pdf)
1,060
[ [ -0.030303955078125, -0.03070068359375, 0.0303955078125, 0.0231475830078125, -0.031463623046875, 0.0171661376953125, 0.023406982421875, -0.0145263671875, 0.0272674560546875, 0.034393310546875, -0.051116943359375, -0.05621337890625, -0.029205322265625, 0.01378631591796875, -0.0261077880859375, 0.053558349609375, 0.004947662353515625, -0.00818634033203125, -0.0130767822265625, -0.024078369140625, -0.0253753662109375, -0.023101806640625, -0.06427001953125, -0.0276947021484375, 0.041656494140625, 0.037811279296875, 0.04315185546875, 0.040924072265625, 0.029998779296875, 0.015655517578125, -0.02166748046875, 0.0085906982421875, -0.0250244140625, -0.0067291259765625, -0.019866943359375, -0.031402587890625, -0.0662841796875, -0.0170440673828125, 0.03607177734375, 0.01509857177734375, -0.0207672119140625, 0.048828125, 0.00164794921875, 0.042877197265625, -0.033721923828125, 0.0411376953125, -0.0281829833984375, -0.00992584228515625, -0.0289306640625, -0.00965118408203125, 0.006732940673828125, -0.035614013671875, -0.007312774658203125, -0.06817626953125, -0.00580596923828125, 0.005878448486328125, 0.0819091796875, 0.0309600830078125, -0.042999267578125, -0.0193634033203125, -0.0024929046630859375, 0.045562744140625, -0.067626953125, 0.020050048828125, 0.057586669921875, 0.0266571044921875, -0.022918701171875, -0.06927490234375, -0.04876708984375, 0.006999969482421875, 0.0016765594482421875, 0.01392364501953125, -0.004688262939453125, -0.0135040283203125, 0.0082550048828125, 0.033355712890625, -0.0462646484375, 0.025665283203125, -0.05560302734375, -0.0172882080078125, 0.07073974609375, 0.022735595703125, 0.02197265625, -0.0274658203125, -0.004161834716796875, -0.0231475830078125, -0.041534423828125, 0.024139404296875, 0.0347900390625, 0.005863189697265625, -0.036285400390625, 0.0660400390625, -0.04498291015625, 0.02984619140625, -0.004108428955078125, -0.015899658203125, 0.038665771484375, -0.0428466796875, -0.01354217529296875, -0.0212860107421875, 0.058868408203125, 
0.0234527587890625, 0.0067291259765625, 0.0140228271484375, 0.0020275115966796875, -0.01216888427734375, 0.005382537841796875, -0.046875, -0.006439208984375, 0.02117919921875, -0.05694580078125, -0.03076171875, 0.011962890625, -0.07061767578125, -0.038848876953125, 0.0002956390380859375, 0.01152801513671875, -0.00914764404296875, -0.0289459228515625, 0.01544952392578125, 0.0010652542114257812, 0.059173583984375, 0.01093292236328125, -0.053863525390625, 0.0195465087890625, 0.03814697265625, 0.034088134765625, 0.01910400390625, -0.01369476318359375, -0.0006723403930664062, 0.01031494140625, -0.007740020751953125, 0.047332763671875, -0.0225372314453125, -0.040618896484375, -0.00984954833984375, 0.00952911376953125, -0.0025959014892578125, -0.022186279296875, 0.0709228515625, -0.043731689453125, 0.035308837890625, -0.022491455078125, -0.017242431640625, -0.06512451171875, 0.01617431640625, -0.058685302734375, 0.099609375, 0.021820068359375, -0.051513671875, -0.00008606910705566406, -0.0733642578125, -0.0220184326171875, 0.00955963134765625, -0.003719329833984375, -0.04180908203125, -0.0178985595703125, 0.02294921875, 0.0261077880859375, -0.038848876953125, 0.0235137939453125, -0.0199737548828125, -0.01464080810546875, 0.004093170166015625, 0.01149749755859375, 0.08770751953125, 0.027435302734375, -0.01261138916015625, 0.0159759521484375, -0.062286376953125, -0.0177001953125, 0.04949951171875, -0.037811279296875, -0.0033416748046875, -0.00804901123046875, 0.01053619384765625, -0.0043182373046875, 0.059478759765625, -0.0300445556640625, 0.040435791015625, -0.008453369140625, 0.032073974609375, 0.049896240234375, -0.01145172119140625, 0.023956298828125, -0.019073486328125, 0.05816650390625, -0.015899658203125, 0.0243377685546875, 0.0145416259765625, -0.059600830078125, -0.064453125, 0.00238800048828125, 0.00968170166015625, 0.06243896484375, -0.04315185546875, 0.043731689453125, -0.01617431640625, -0.0552978515625, -0.04254150390625, 0.0185394287109375, 0.0174560546875, 
0.040924072265625, 0.028106689453125, -0.037811279296875, -0.057952880859375, -0.07135009765625, 0.00844573974609375, -0.036346435546875, 0.00005161762237548828, 0.038116455078125, 0.043548583984375, -0.050628662109375, 0.060089111328125, -0.04443359375, -0.0300445556640625, -0.0279541015625, -0.0020904541015625, 0.020782470703125, 0.02117919921875, 0.046600341796875, -0.07025146484375, -0.01308441162109375, -0.022125244140625, -0.06591796875, -0.0121307373046875, 0.020782470703125, -0.0102386474609375, 0.00868988037109375, 0.017486572265625, -0.037384033203125, 0.035491943359375, 0.031829833984375, -0.032440185546875, 0.027801513671875, 0.0135650634765625, -0.00177001953125, -0.109375, 0.0216522216796875, -0.005878448486328125, -0.003940582275390625, -0.01399993896484375, 0.022430419921875, -0.0165863037109375, -0.0016994476318359375, -0.0282135009765625, 0.046783447265625, -0.0082855224609375, -0.004161834716796875, 0.00830078125, -0.003711700439453125, 0.00963592529296875, 0.028045654296875, -0.0198974609375, 0.061187744140625, 0.04376220703125, -0.034820556640625, 0.04193115234375, 0.060028076171875, -0.003528594970703125, 0.01145172119140625, -0.062225341796875, 0.01421356201171875, 0.0020313262939453125, 0.0355224609375, -0.07122802734375, -0.039825439453125, 0.045318603515625, -0.03277587890625, 0.000009179115295410156, -0.0006585121154785156, -0.064453125, -0.03948974609375, -0.046295166015625, 0.05755615234375, 0.034698486328125, -0.02398681640625, 0.0408935546875, 0.037994384765625, -0.01088714599609375, -0.04534912109375, -0.05908203125, 0.004711151123046875, -0.0185089111328125, -0.04278564453125, 0.027008056640625, -0.006988525390625, 0.008392333984375, -0.0002567768096923828, 0.0158843994140625, -0.0228271484375, 0.0161895751953125, 0.0185089111328125, 0.0191497802734375, -0.01580810546875, -0.0159912109375, 0.0027523040771484375, -0.008056640625, 0.0036373138427734375, 0.01580810546875, 0.0487060546875, -0.0164337158203125, -0.020782470703125, 
-0.0516357421875, 0.003978729248046875, 0.03753662109375, -0.027618408203125, 0.03857421875, 0.0567626953125, -0.021759033203125, 0.0201873779296875, -0.04132080078125, 0.0029697418212890625, -0.03424072265625, 0.0254058837890625, -0.0210418701171875, -0.0706787109375, 0.0633544921875, 0.0043182373046875, 0.00005257129669189453, 0.05755615234375, 0.031768798828125, 0.0016384124755859375, 0.056365966796875, 0.04437255859375, -0.023101806640625, 0.036376953125, -0.03582763671875, 0.00994873046875, -0.0823974609375, -0.037353515625, -0.04669189453125, -0.0120697021484375, -0.07318115234375, -0.01456451416015625, 0.00902557373046875, 0.01605224609375, -0.031646728515625, 0.0552978515625, -0.0504150390625, 0.00691986083984375, 0.05755615234375, 0.0100555419921875, 0.0182037353515625, -0.018341064453125, -0.0028476715087890625, 0.0262603759765625, -0.04681396484375, -0.03851318359375, 0.0989990234375, 0.033782958984375, 0.051055908203125, 0.0162200927734375, 0.056427001953125, 0.00672149658203125, -0.01123046875, -0.039886474609375, 0.058258056640625, -0.00414276123046875, -0.050201416015625, -0.0172119140625, -0.0186920166015625, -0.08416748046875, -0.003467559814453125, 0.012908935546875, -0.059326171875, 0.025054931640625, -0.00957489013671875, -0.0244903564453125, 0.0215606689453125, -0.0225372314453125, 0.05859375, -0.0223236083984375, 0.0162811279296875, -0.0244598388671875, -0.05120849609375, 0.0428466796875, 0.0015611648559570312, 0.00730133056640625, -0.01108551025390625, -0.01183319091796875, 0.0703125, -0.0021915435791015625, 0.06573486328125, 0.004840850830078125, -0.00453948974609375, 0.033843994140625, -0.01007080078125, 0.0267333984375, 0.029083251953125, -0.01049041748046875, 0.0183868408203125, 0.0073089599609375, -0.034759521484375, -0.006496429443359375, 0.033935546875, -0.0672607421875, -0.03546142578125, -0.035400390625, -0.034637451171875, 0.02099609375, 0.0195159912109375, 0.0167083740234375, 0.00640106201171875, 0.0038738250732421875, 0.021484375, 
0.039794921875, -0.037811279296875, 0.0257568359375, 0.028106689453125, -0.03582763671875, -0.03851318359375, 0.060699462890625, -0.0004687309265136719, -0.0226287841796875, 0.0249786376953125, -0.0050811767578125, -0.038421630859375, -0.038299560546875, -0.033966064453125, 0.0257720947265625, -0.0413818359375, -0.0263214111328125, -0.0267791748046875, -0.03515625, -0.04901123046875, 0.0178680419921875, -0.0147705078125, -0.041229248046875, -0.057464599609375, -0.023193359375, 0.04022216796875, 0.0474853515625, 0.00017523765563964844, 0.02471923828125, -0.041015625, 0.0406494140625, 0.0272369384765625, 0.0114898681640625, -0.0026416778564453125, -0.039886474609375, -0.00829315185546875, -0.00341033935546875, -0.050537109375, -0.06201171875, 0.0228424072265625, 0.01309967041015625, 0.060394287109375, 0.01038360595703125, 0.0017452239990234375, 0.03460693359375, 0.0083160400390625, 0.074462890625, -0.012939453125, -0.052398681640625, 0.047088623046875, -0.0357666015625, 0.021759033203125, 0.05401611328125, 0.0206298828125, -0.01611328125, -0.01142120361328125, -0.049530029296875, -0.0792236328125, 0.027679443359375, 0.023651123046875, 0.01296234130859375, 0.0231475830078125, 0.02392578125, -0.00669097900390625, 0.0170135498046875, -0.0697021484375, -0.0302581787109375, -0.0221405029296875, -0.0229644775390625, 0.027618408203125, -0.03692626953125, -0.0120697021484375, -0.019805908203125, 0.06378173828125, -0.01224517822265625, 0.01076507568359375, 0.01456451416015625, 0.0005426406860351562, 0.004550933837890625, 0.0233917236328125, 0.051849365234375, 0.057769775390625, -0.0283203125, -0.0196990966796875, 0.0177001953125, -0.0372314453125, -0.005764007568359375, 0.032745361328125, -0.014923095703125, -0.0253448486328125, 0.024383544921875, 0.04193115234375, -0.004398345947265625, -0.0362548828125, 0.01788330078125, -0.00848388671875, -0.0025424957275390625, -0.04351806640625, 0.00612640380859375, 0.0248870849609375, 0.028045654296875, 0.0440673828125, 
-0.0238800048828125, 0.028228759765625, -0.027374267578125, 0.0014848709106445312, 0.019744873046875, 0.0003757476806640625, -0.0144500732421875, 0.053619384765625, 0.0198974609375, -0.02001953125, 0.043853759765625, -0.0288238525390625, -0.041473388671875, 0.06231689453125, 0.0159454345703125, 0.0689697265625, -0.0054779052734375, 0.01397705078125, 0.02728271484375, 0.0197601318359375, -0.0007925033569335938, 0.050537109375, -0.0019445419311523438, -0.042694091796875, -0.017791748046875, -0.048919677734375, -0.029998779296875, 0.0236968994140625, -0.06201171875, 0.0291900634765625, -0.033935546875, -0.00995635986328125, -0.014678955078125, 0.0203857421875, -0.055450439453125, 0.009613037109375, -0.0017242431640625, 0.07818603515625, -0.040008544921875, 0.06494140625, 0.071533203125, -0.060302734375, -0.0721435546875, -0.0246429443359375, 0.008392333984375, -0.0655517578125, 0.0355224609375, 0.00911712646484375, 0.004169464111328125, 0.01023101806640625, -0.05938720703125, -0.07672119140625, 0.1280517578125, 0.0164337158203125, -0.0333251953125, 0.0089874267578125, 0.004459381103515625, 0.032958984375, -0.0241851806640625, 0.0115966796875, 0.027496337890625, 0.048614501953125, 0.003177642822265625, -0.083251953125, 0.0109405517578125, -0.0286407470703125, -0.00444793701171875, -0.0233001708984375, -0.07501220703125, 0.0546875, -0.004535675048828125, -0.006866455078125, 0.04412841796875, 0.04656982421875, 0.049346923828125, 0.0404052734375, 0.0207672119140625, 0.0738525390625, 0.038909912109375, 0.0008072853088378906, 0.0677490234375, -0.008636474609375, 0.040252685546875, 0.06512451171875, -0.011474609375, 0.05401611328125, 0.022216796875, -0.022369384765625, 0.059814453125, 0.07684326171875, -0.0282135009765625, 0.0322265625, 0.003849029541015625, 0.003864288330078125, 0.0025920867919921875, -0.023590087890625, -0.053558349609375, 0.034759521484375, 0.0188751220703125, -0.0273895263671875, -0.01175689697265625, -0.032745361328125, 0.01068878173828125, 
-0.0117645263671875, -0.02264404296875, 0.058258056640625, 0.0132598876953125, -0.01142120361328125, 0.036285400390625, 0.004886627197265625, 0.052734375, -0.03851318359375, -0.0086212158203125, -0.0250244140625, -0.00594329833984375, -0.030914306640625, -0.0645751953125, 0.01428985595703125, -0.01271820068359375, -0.017913818359375, 0.0038433074951171875, 0.04852294921875, -0.0011081695556640625, -0.052398681640625, 0.035369873046875, 0.027435302734375, 0.0186920166015625, 0.01277923583984375, -0.0738525390625, 0.0010833740234375, 0.00848388671875, -0.0274200439453125, 0.020965576171875, 0.0093841552734375, -0.018463134765625, 0.068115234375, 0.05767822265625, -0.013336181640625, 0.005413055419921875, 0.00719451904296875, 0.07928466796875, -0.04486083984375, -0.0256195068359375, -0.045196533203125, 0.0438232421875, -0.003940582275390625, -0.052734375, 0.037139892578125, 0.054473876953125, 0.06463623046875, 0.0035037994384765625, 0.045257568359375, -0.0252227783203125, 0.02838134765625, -0.023956298828125, 0.066650390625, -0.059112548828125, 0.011444091796875, -0.004711151123046875, -0.0709228515625, -0.0212554931640625, 0.043670654296875, 0.0015354156494140625, 0.00695037841796875, 0.0303497314453125, 0.0689697265625, -0.0035648345947265625, -0.01171875, -0.0050506591796875, 0.00919342041015625, 0.0252685546875, 0.040435791015625, 0.038787841796875, -0.0628662109375, 0.049835205078125, -0.026123046875, 0.00016701221466064453, -0.026031494140625, -0.05401611328125, -0.050750732421875, -0.041015625, -0.01922607421875, -0.03076171875, -0.006511688232421875, 0.047332763671875, 0.045989990234375, -0.0723876953125, -0.03106689453125, -0.00423431396484375, 0.0235443115234375, -0.0013704299926757812, -0.0199737548828125, 0.035125732421875, 0.01201629638671875, -0.037841796875, 0.0300445556640625, -0.0046234130859375, 0.004161834716796875, -0.0316162109375, -0.0147705078125, -0.04266357421875, -0.00910186767578125, 0.0201568603515625, 0.05029296875, -0.07220458984375, 
-0.0031890869140625, 0.00214385986328125, -0.0191497802734375, 0.0030231475830078125, 0.0185699462890625, -0.04302978515625, 0.0130157470703125, 0.05938720703125, 0.032501220703125, 0.032928466796875, 0.015960693359375, 0.0291900634765625, -0.0297393798828125, 0.0135345458984375, 0.01000213623046875, 0.0325927734375, 0.031707763671875, -0.036224365234375, 0.07806396484375, 0.0117645263671875, -0.049285888671875, -0.054840087890625, 0.01064300537109375, -0.1072998046875, 0.00543212890625, 0.080078125, -0.011322021484375, -0.00977325439453125, 0.0030612945556640625, -0.032501220703125, 0.04461669921875, -0.04620361328125, 0.060699462890625, 0.037689208984375, -0.0008935928344726562, -0.0207672119140625, -0.036285400390625, 0.0209197998046875, 0.0130615234375, -0.07220458984375, -0.046478271484375, 0.0101165771484375, 0.020294189453125, 0.0223236083984375, 0.029815673828125, -0.0103912353515625, 0.02001953125, -0.0275421142578125, 0.0162811279296875, -0.0175933837890625, 0.006366729736328125, -0.0238189697265625, -0.004459381103515625, 0.0028247833251953125, -0.032623291015625 ] ]
ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli
2021-05-20T23:17:23.000Z
[ "transformers", "pytorch", "jax", "roberta", "text-classification", "dataset:snli", "dataset:anli", "dataset:multi_nli", "dataset:multi_nli_mismatch", "dataset:fever", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
text-classification
ynie
null
null
ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli
9
9,690
transformers
2022-03-02T23:29:05
---
datasets:
- snli
- anli
- multi_nli
- multi_nli_mismatch
- fever
license: mit
---

This is a strong pre-trained RoBERTa-Large NLI model.

The training data is a combination of well-known NLI datasets: [`SNLI`](https://nlp.stanford.edu/projects/snli/), [`MNLI`](https://cims.nyu.edu/~sbowman/multinli/), [`FEVER-NLI`](https://github.com/easonnie/combine-FEVER-NSMN/blob/master/other_resources/nli_fever.md), [`ANLI (R1, R2, R3)`](https://github.com/facebookresearch/anli).
Other pre-trained NLI models, including `RoBERTa`, `ALBERT`, `BART`, `ELECTRA`, and `XLNet`, are also available.

Trained by [Yixin Nie](https://easonnie.github.io), [original source](https://github.com/facebookresearch/anli).

Try the code snippet below.

```
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

if __name__ == '__main__':
    max_length = 256

    premise = "Two women are embracing while holding to go packages."
    hypothesis = "The men are fighting outside a deli."

    hg_model_hub_name = "ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli"
    # hg_model_hub_name = "ynie/albert-xxlarge-v2-snli_mnli_fever_anli_R1_R2_R3-nli"
    # hg_model_hub_name = "ynie/bart-large-snli_mnli_fever_anli_R1_R2_R3-nli"
    # hg_model_hub_name = "ynie/electra-large-discriminator-snli_mnli_fever_anli_R1_R2_R3-nli"
    # hg_model_hub_name = "ynie/xlnet-large-cased-snli_mnli_fever_anli_R1_R2_R3-nli"

    tokenizer = AutoTokenizer.from_pretrained(hg_model_hub_name)
    model = AutoModelForSequenceClassification.from_pretrained(hg_model_hub_name)

    tokenized_input_seq_pair = tokenizer.encode_plus(premise, hypothesis,
                                                     max_length=max_length,
                                                     return_token_type_ids=True,
                                                     truncation=True)

    input_ids = torch.Tensor(tokenized_input_seq_pair['input_ids']).long().unsqueeze(0)
    # remember bart doesn't have 'token_type_ids', remove the line below if you are using bart.
    token_type_ids = torch.Tensor(tokenized_input_seq_pair['token_type_ids']).long().unsqueeze(0)
    attention_mask = torch.Tensor(tokenized_input_seq_pair['attention_mask']).long().unsqueeze(0)

    outputs = model(input_ids,
                    attention_mask=attention_mask,
                    token_type_ids=token_type_ids,
                    labels=None)
    # Note:
    # "id2label": {
    #     "0": "entailment",
    #     "1": "neutral",
    #     "2": "contradiction"
    # },

    predicted_probability = torch.softmax(outputs[0], dim=1)[0].tolist()  # batch_size only one

    print("Premise:", premise)
    print("Hypothesis:", hypothesis)
    print("Entailment:", predicted_probability[0])
    print("Neutral:", predicted_probability[1])
    print("Contradiction:", predicted_probability[2])
```

More usage examples are available [here](https://github.com/facebookresearch/anli/blob/master/src/hg_api/interactive_eval.py).

Citation:
```
@inproceedings{nie-etal-2020-adversarial,
    title = "Adversarial {NLI}: A New Benchmark for Natural Language Understanding",
    author = "Nie, Yixin and Williams, Adina and Dinan, Emily and Bansal, Mohit and Weston, Jason and Kiela, Douwe",
    booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
    year = "2020",
    publisher = "Association for Computational Linguistics",
}
```
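The card's `id2label` note fixes the label order: index 0 is entailment, 1 is neutral, 2 is contradiction, and the model's raw logits are turned into probabilities with a softmax over that axis. The mapping can be sanity-checked without downloading any model weights; the sketch below mirrors the `torch.softmax(..., dim=1)` step in plain Python, using made-up logit values for illustration:

```python
import math

# Label order from the card's id2label note.
id2label = {0: "entailment", 1: "neutral", 2: "contradiction"}

# Hypothetical raw logits for one premise/hypothesis pair (illustrative only).
logits = [3.2, -0.5, -1.9]

# Softmax: exponentiate, then normalize so the three values sum to 1.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# The argmax index picks the predicted label.
predicted = id2label[max(range(len(probs)), key=lambda i: probs[i])]

for idx, p in enumerate(probs):
    print(f"{id2label[idx]}: {p:.3f}")
```

With these example logits the entailment probability dominates, so `predicted` is `"entailment"`; with real model outputs, `logits` would come from `outputs[0]` in the snippet above.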
3,406
[ [ -0.01425933837890625, -0.057098388671875, 0.01010894775390625, 0.0205535888671875, -0.0021800994873046875, -0.01068115234375, -0.017608642578125, -0.02752685546875, 0.03094482421875, 0.0194091796875, -0.0438232421875, -0.045257568359375, -0.036895751953125, 0.0162200927734375, -0.031585693359375, 0.0736083984375, 0.017791748046875, 0.005512237548828125, 0.01557159423828125, -0.00310516357421875, -0.0037288665771484375, -0.03436279296875, -0.06341552734375, -0.03851318359375, 0.032989501953125, -0.0028972625732421875, 0.038787841796875, 0.03466796875, 0.039154052734375, 0.022857666015625, -0.0181884765625, -0.003955841064453125, -0.03936767578125, -0.015899658203125, 0.003795623779296875, -0.03460693359375, -0.038055419921875, 0.00601959228515625, 0.039825439453125, 0.031707763671875, -0.00928497314453125, 0.0260772705078125, -0.00717926025390625, 0.030029296875, -0.028839111328125, 0.0012521743774414062, -0.04461669921875, 0.01081085205078125, -0.0168914794921875, 0.005023956298828125, -0.04364013671875, -0.0024242401123046875, 0.020263671875, -0.04473876953125, 0.01479339599609375, 0.01458740234375, 0.10614013671875, 0.0267333984375, -0.035003662109375, -0.0190582275390625, -0.0265350341796875, 0.057525634765625, -0.08087158203125, 0.01352691650390625, 0.01678466796875, -0.002620697021484375, -0.022430419921875, -0.06280517578125, -0.047637939453125, -0.01727294921875, -0.013946533203125, 0.031890869140625, -0.0253753662109375, -0.006031036376953125, 0.02862548828125, 0.01715087890625, -0.0687255859375, -0.006946563720703125, -0.031982421875, -0.00738525390625, 0.05267333984375, 0.0027923583984375, 0.0146942138671875, -0.03271484375, -0.029510498046875, -0.0002834796905517578, -0.01898193359375, -0.0015687942504882812, 0.0237274169921875, 0.01082611083984375, -0.0282135009765625, 0.03558349609375, -0.00531768798828125, 0.06451416015625, 0.0142364501953125, -0.0070953369140625, 0.05670166015625, -0.028076171875, -0.01953125, 0.009429931640625, 0.08270263671875, 
0.017791748046875, 0.01291656494140625, -0.0003437995910644531, 0.004024505615234375, -0.00455474853515625, 0.0010471343994140625, -0.06268310546875, -0.039031982421875, 0.026214599609375, -0.043060302734375, -0.02484130859375, 0.007434844970703125, -0.06585693359375, -0.01519775390625, -0.0090484619140625, 0.05572509765625, -0.04241943359375, -0.0259246826171875, 0.0146942138671875, -0.01424407958984375, 0.030029296875, -0.0091552734375, -0.052093505859375, 0.007610321044921875, 0.0223846435546875, 0.060150146484375, -0.0006589889526367188, -0.0278167724609375, -0.024444580078125, 0.0050201416015625, -0.0145111083984375, 0.020416259765625, -0.01230621337890625, -0.007045745849609375, -0.0162811279296875, 0.0019254684448242188, -0.027374267578125, -0.025390625, 0.0386962890625, -0.029510498046875, 0.0278167724609375, 0.00418853759765625, -0.043121337890625, -0.014190673828125, 0.0263824462890625, -0.039398193359375, 0.07818603515625, 0.00507354736328125, -0.094970703125, 0.0144195556640625, -0.057464599609375, -0.03363037109375, -0.01136016845703125, -0.003177642822265625, -0.06427001953125, -0.013153076171875, 0.0191192626953125, 0.0276336669921875, -0.0232086181640625, 0.0335693359375, -0.040802001953125, -0.017608642578125, 0.0323486328125, -0.026947021484375, 0.09368896484375, 0.01322174072265625, -0.041229248046875, 0.03460693359375, -0.0816650390625, 0.0050201416015625, 0.03668212890625, -0.01413726806640625, -0.0002551078796386719, -0.027435302734375, 0.0009927749633789062, 0.0194854736328125, 0.0311431884765625, -0.032623291015625, 0.0144805908203125, -0.04986572265625, 0.04791259765625, 0.0364990234375, 0.00420379638671875, 0.0146331787109375, -0.021575927734375, 0.0111846923828125, -0.00618743896484375, 0.0220794677734375, 0.0036067962646484375, -0.04400634765625, -0.06170654296875, -0.0279541015625, 0.0201263427734375, 0.040008544921875, -0.05108642578125, 0.06207275390625, -0.002582550048828125, -0.04876708984375, -0.039215087890625, 
-0.002773284912109375, 0.0251617431640625, 0.041412353515625, 0.03155517578125, 0.00151824951171875, -0.065185546875, -0.056854248046875, -0.00624847412109375, -0.0132293701171875, -0.003704071044921875, 0.0220184326171875, 0.0576171875, -0.035797119140625, 0.0733642578125, -0.035003662109375, -0.0284271240234375, -0.032440185546875, 0.0289459228515625, 0.0634765625, 0.0487060546875, 0.045013427734375, -0.05096435546875, -0.0308990478515625, -0.020660400390625, -0.06890869140625, 0.0013637542724609375, -0.00048041343688964844, -0.0251007080078125, 0.005558013916015625, 0.025604248046875, -0.04339599609375, 0.040130615234375, 0.0372314453125, -0.04156494140625, 0.04461669921875, -0.024810791015625, 0.00197601318359375, -0.0977783203125, 0.0239410400390625, 0.00360107421875, 0.00582122802734375, -0.05596923828125, 0.0008845329284667969, 0.0068817138671875, 0.0028896331787109375, -0.034393310546875, 0.046234130859375, -0.02288818359375, 0.0143585205078125, -0.00902557373046875, -0.0107879638671875, 0.00690460205078125, 0.039276123046875, 0.0001697540283203125, 0.043060302734375, 0.058837890625, -0.04052734375, 0.032470703125, 0.0222930908203125, -0.006221771240234375, 0.01277923583984375, -0.06427001953125, 0.0046234130859375, -0.0020771026611328125, 0.0242462158203125, -0.0704345703125, -0.034576416015625, 0.0229034423828125, -0.053009033203125, 0.01690673828125, -0.004764556884765625, -0.0281829833984375, -0.058837890625, -0.00884246826171875, 0.04486083984375, 0.042633056640625, -0.052276611328125, 0.063720703125, 0.00957489013671875, 0.019073486328125, -0.060882568359375, -0.059661865234375, -0.0010929107666015625, -0.0262451171875, -0.042510986328125, 0.023193359375, -0.000019252300262451172, -0.005649566650390625, 0.0046539306640625, 0.0124664306640625, -0.0103912353515625, 0.0007233619689941406, 0.013153076171875, 0.0333251953125, -0.0175323486328125, 0.0024433135986328125, -0.032745361328125, -0.0103759765625, 0.01444244384765625, -0.03179931640625, 
0.0650634765625, -0.02239990234375, -0.01428985595703125, -0.046661376953125, 0.004238128662109375, 0.0254669189453125, -0.0263671875, 0.06207275390625, 0.08642578125, -0.0469970703125, 0.005863189697265625, -0.035675048828125, -0.0182647705078125, -0.033782958984375, 0.039031982421875, -0.0260467529296875, -0.034759521484375, 0.04241943359375, 0.0121002197265625, -0.0078887939453125, 0.05487060546875, 0.038238525390625, 0.0069427490234375, 0.07427978515625, 0.027862548828125, -0.0274810791015625, 0.020294189453125, -0.05657958984375, 0.01873779296875, -0.05487060546875, -0.001972198486328125, -0.035675048828125, -0.0094146728515625, -0.0222320556640625, -0.0170135498046875, 0.0234527587890625, 0.00986480712890625, -0.037811279296875, 0.0231475830078125, -0.0384521484375, 0.0213623046875, 0.051971435546875, 0.01462554931640625, -0.009368896484375, -0.0212249755859375, -0.005054473876953125, 0.00623321533203125, -0.06390380859375, -0.015594482421875, 0.09320068359375, 0.035003662109375, 0.052734375, 0.004711151123046875, 0.067626953125, -0.006160736083984375, 0.023345947265625, -0.041290283203125, 0.033477783203125, -0.0203094482421875, -0.05487060546875, -0.0251312255859375, -0.0528564453125, -0.08575439453125, 0.0089874267578125, -0.035980224609375, -0.0623779296875, 0.0036468505859375, 0.0104217529296875, -0.03533935546875, 0.0242156982421875, -0.051300048828125, 0.075927734375, -0.020904541015625, -0.0305633544921875, -0.0031452178955078125, -0.047149658203125, 0.04443359375, 0.00785064697265625, 0.015228271484375, -0.003650665283203125, 0.039459228515625, 0.0797119140625, -0.0193023681640625, 0.05596923828125, -0.01253509521484375, 0.01296234130859375, 0.02484130859375, -0.0259246826171875, 0.02728271484375, 0.0207061767578125, -0.0246429443359375, 0.0217437744140625, -0.017730712890625, -0.038848876953125, -0.026336669921875, 0.045654296875, -0.06640625, -0.03631591796875, -0.039276123046875, -0.0293426513671875, 0.0122528076171875, 0.032318115234375, 
0.0535888671875, 0.05059814453125, 0.0021800994873046875, 0.01515960693359375, 0.055389404296875, -0.026885986328125, 0.03643798828125, 0.00815582275390625, -0.0195159912109375, -0.046600341796875, 0.06396484375, -0.0104522705078125, 0.0086517333984375, 0.010772705078125, 0.02545166015625, -0.0304107666015625, -0.02978515625, -0.03155517578125, 0.041656494140625, -0.038726806640625, -0.01374053955078125, -0.05963134765625, -0.04345703125, -0.04754638671875, -0.00302886962890625, -0.0249786376953125, -0.0005450248718261719, -0.03582763671875, 0.0024547576904296875, 0.0291595458984375, 0.04119873046875, -0.0001310110092163086, 0.0165252685546875, -0.04107666015625, 0.032867431640625, 0.0188140869140625, 0.0014963150024414062, 0.00016021728515625, -0.0546875, -0.00872039794921875, -0.00589752197265625, -0.007312774658203125, -0.07855224609375, 0.059814453125, 0.018585205078125, 0.03460693359375, 0.033355712890625, 0.01953125, 0.050323486328125, -0.01617431640625, 0.055450439453125, 0.032470703125, -0.076904296875, 0.036468505859375, -0.0180511474609375, 0.026611328125, 0.02838134765625, 0.0202178955078125, -0.0162506103515625, -0.03570556640625, -0.05755615234375, -0.08612060546875, 0.054351806640625, 0.0204010009765625, -0.0138092041015625, -0.0023822784423828125, 0.0272674560546875, -0.0173797607421875, 0.01007843017578125, -0.0841064453125, -0.047576904296875, -0.01580810546875, -0.0465087890625, -0.0123443603515625, -0.01300048828125, -0.02032470703125, -0.0384521484375, 0.076416015625, -0.0031986236572265625, 0.033721923828125, 0.03155517578125, -0.00936126708984375, 0.00934600830078125, 0.00360870361328125, 0.029052734375, 0.034759521484375, -0.0306396484375, 0.009674072265625, 0.03216552734375, -0.0316162109375, 0.0107879638671875, 0.02288818359375, -0.00539398193359375, 0.004497528076171875, 0.045013427734375, 0.07855224609375, 0.004085540771484375, -0.037078857421875, 0.053466796875, -0.0208892822265625, -0.034759521484375, -0.04388427734375, 
0.003543853759765625, 0.006679534912109375, 0.02020263671875, 0.03521728515625, 0.004619598388671875, 0.01215362548828125, -0.032562255859375, 0.0254974365234375, 0.0316162109375, -0.011932373046875, -0.0212554931640625, 0.0679931640625, 0.0037384033203125, -0.021240234375, 0.04498291015625, -0.042572021484375, -0.053985595703125, 0.05462646484375, 0.04901123046875, 0.055633544921875, -0.0016183853149414062, 0.001949310302734375, 0.0631103515625, 0.0183868408203125, 0.0027980804443359375, 0.021636962890625, 0.022613525390625, -0.048004150390625, -0.02593994140625, -0.050567626953125, -0.0009646415710449219, 0.031524658203125, -0.060791015625, 0.0251617431640625, -0.024932861328125, -0.023590087890625, 0.004150390625, 0.008056640625, -0.0628662109375, 0.0259857177734375, 0.002574920654296875, 0.06005859375, -0.0704345703125, 0.07403564453125, 0.046905517578125, -0.0211181640625, -0.07379150390625, 0.01470947265625, -0.01287841796875, -0.0472412109375, 0.050048828125, 0.0293426513671875, 0.01160430908203125, 0.0111083984375, -0.03106689453125, -0.0872802734375, 0.07196044921875, 0.00897216796875, -0.036895751953125, -0.004718780517578125, -0.01436614990234375, 0.038421630859375, -0.018157958984375, 0.041229248046875, 0.045928955078125, 0.03192138671875, 0.0009551048278808594, -0.06640625, 0.00830078125, -0.034759521484375, -0.0137176513671875, -0.0016803741455078125, -0.044281005859375, 0.08599853515625, -0.025299072265625, 0.0021572113037109375, 0.01107025146484375, 0.06787109375, 0.0250701904296875, 0.041900634765625, 0.044189453125, 0.04876708984375, 0.062042236328125, -0.01299285888671875, 0.054412841796875, -0.03302001953125, 0.041656494140625, 0.0767822265625, -0.0139007568359375, 0.0645751953125, 0.0263519287109375, -0.004970550537109375, 0.02679443359375, 0.03509521484375, -0.00745391845703125, 0.023651123046875, 0.0244293212890625, -0.005825042724609375, -0.00720977783203125, 0.0186309814453125, -0.03936767578125, 0.035797119140625, 0.0308685302734375, 
-0.0255889892578125, 0.0028362274169921875, 0.0146942138671875, -0.0035839080810546875, -0.010498046875, -0.003978729248046875, 0.05523681640625, -0.01172637939453125, -0.034637451171875, 0.07232666015625, -0.0093841552734375, 0.070068359375, -0.04437255859375, 0.015960693359375, 0.0193328857421875, 0.0208740234375, -0.01898193359375, -0.036651611328125, 0.0062103271484375, -0.0026035308837890625, -0.00868988037109375, 0.0221099853515625, 0.04168701171875, -0.043060302734375, -0.038970947265625, 0.041656494140625, 0.01258087158203125, 0.01111602783203125, 0.018157958984375, -0.0694580078125, 0.0029582977294921875, 0.00875091552734375, -0.035919189453125, 0.001129150390625, 0.019439697265625, 0.02099609375, 0.042877197265625, 0.05059814453125, 0.00827789306640625, 0.0185089111328125, 0.012054443359375, 0.055755615234375, -0.055511474609375, -0.038116455078125, -0.06573486328125, 0.027557373046875, -0.028594970703125, -0.0379638671875, 0.05609130859375, 0.050323486328125, 0.050140380859375, -0.0101776123046875, 0.0518798828125, -0.01100921630859375, 0.0286407470703125, -0.038970947265625, 0.0546875, -0.041656494140625, -0.01092529296875, -0.0193939208984375, -0.045318603515625, -0.032867431640625, 0.06988525390625, -0.020751953125, 0.00827789306640625, 0.039459228515625, 0.05926513671875, 0.0116424560546875, -0.0022830963134765625, 0.0006070137023925781, 0.04608154296875, 0.0303955078125, 0.056365966796875, 0.04486083984375, -0.039886474609375, 0.03656005859375, -0.035614013671875, -0.0269622802734375, -0.0246429443359375, -0.0499267578125, -0.0830078125, -0.036376953125, -0.035247802734375, -0.050445556640625, 0.00951385498046875, 0.08746337890625, 0.061309814453125, -0.06719970703125, -0.0089263916015625, -0.0170440673828125, -0.0131683349609375, -0.0244293212890625, -0.022796630859375, 0.046844482421875, -0.022979736328125, -0.05023193359375, 0.01480865478515625, -0.0014324188232421875, 0.01557159423828125, 0.0003082752227783203, -0.006015777587890625, 
-0.033233642578125, 0.005512237548828125, 0.0231781005859375, 0.016143798828125, -0.055023193359375, -0.01103973388671875, 0.0007472038269042969, -0.0173187255859375, -0.0013294219970703125, 0.01776123046875, -0.038848876953125, 0.0213470458984375, 0.01824951171875, 0.03753662109375, 0.04974365234375, -0.0128021240234375, 0.0256805419921875, -0.053009033203125, 0.02020263671875, 0.0092620849609375, 0.042999267578125, 0.01116180419921875, -0.0290985107421875, 0.034210205078125, 0.01434326171875, -0.0478515625, -0.06622314453125, 0.0016345977783203125, -0.07958984375, -0.025115966796875, 0.0927734375, -0.019683837890625, -0.048797607421875, -0.0031452178955078125, -0.0284576416015625, 0.04058837890625, -0.023773193359375, 0.0615234375, 0.036865234375, -0.006198883056640625, 0.007526397705078125, -0.023956298828125, 0.034149169921875, 0.04266357421875, -0.052154541015625, -0.00948333740234375, -0.00103759765625, 0.01465606689453125, 0.0294952392578125, 0.047943115234375, 0.006866455078125, 0.020660400390625, 0.002353668212890625, 0.0227203369140625, -0.004016876220703125, -0.0186767578125, -0.034149169921875, 0.01019287109375, -0.0046539306640625, -0.017181396484375 ] ]
sail-rvc/EricCartman
2023-07-14T07:22:16.000Z
[ "transformers", "rvc", "sail-rvc", "audio-to-audio", "endpoints_compatible", "region:us" ]
audio-to-audio
sail-rvc
null
null
sail-rvc/EricCartman
0
9,687
transformers
2023-07-14T07:21:57
---
pipeline_tag: audio-to-audio
tags:
- rvc
- sail-rvc
---

# EricCartman

## RVC Model

![banner](https://i.imgur.com/xocCjhH.jpg)

This model repo was automatically generated.

Date: 2023-07-14 07:22:16

Bot Name: juuxnscrap

Model Type: RVC

Source: https://huggingface.co/juuxn/RVCModels/

Reason: Converting into loadable format for https://github.com/chavinlo/rvc-runpod
379
[ [ -0.032196044921875, -0.0164794921875, 0.02642822265625, 0.0013818740844726562, -0.035003662109375, 0.00792694091796875, 0.014801025390625, -0.0030574798583984375, 0.0261077880859375, 0.06884765625, -0.0439453125, -0.055084228515625, -0.0275726318359375, -0.00327301025390625, -0.026397705078125, 0.068603515625, -0.00882720947265625, 0.013885498046875, -0.0134429931640625, -0.04229736328125, -0.04815673828125, -0.0024967193603515625, -0.056976318359375, -0.0347900390625, 0.046630859375, 0.040374755859375, 0.0543212890625, 0.0264739990234375, 0.051300048828125, 0.0295562744140625, 0.0012073516845703125, -0.00983428955078125, -0.002590179443359375, 0.01085662841796875, -0.0169219970703125, -0.031402587890625, -0.0546875, -0.00225067138671875, 0.035125732421875, 0.003284454345703125, -0.030975341796875, 0.0208282470703125, -0.0184783935546875, 0.051177978515625, -0.045379638671875, 0.01123809814453125, -0.0265350341796875, -0.0016279220581054688, -0.0012950897216796875, -0.00301361083984375, -0.01910400390625, -0.0207366943359375, -0.02099609375, -0.07098388671875, 0.0223541259765625, 0.036224365234375, 0.08270263671875, -0.0013685226440429688, -0.034088134765625, -0.0311279296875, -0.03790283203125, 0.043182373046875, -0.037994384765625, 0.0199127197265625, 0.0186767578125, 0.049957275390625, -0.03179931640625, -0.06939697265625, -0.04046630859375, -0.01654052734375, -0.01177215576171875, 0.007045745849609375, -0.013214111328125, -0.006374359130859375, -0.0011911392211914062, 0.0301513671875, -0.05950927734375, -0.0003287792205810547, -0.0487060546875, -0.005184173583984375, 0.03216552734375, 0.014495849609375, 0.04974365234375, -0.0130462646484375, -0.051177978515625, -0.0234375, -0.0305938720703125, -0.007808685302734375, 0.022857666015625, 0.014495849609375, -0.05419921875, 0.07452392578125, 0.00943756103515625, 0.0372314453125, 0.006389617919921875, 0.01076507568359375, 0.034759521484375, -0.0123748779296875, -0.006134033203125, 0.0006074905395507812, 
0.0699462890625, 0.043792724609375, 0.0150604248046875, 0.037353515625, -0.038360595703125, -0.033782958984375, 0.050201416015625, -0.04833984375, -0.0306549072265625, 0.03759765625, -0.046722412109375, -0.049591064453125, 0.01532745361328125, -0.037689208984375, -0.0177764892578125, -0.00833892822265625, 0.01560211181640625, -0.036865234375, -0.036163330078125, -0.0074005126953125, -0.00453948974609375, 0.00823974609375, 0.046539306640625, -0.0311431884765625, 0.01336669921875, 0.03778076171875, 0.063720703125, 0.02166748046875, 0.006229400634765625, -0.022430419921875, 0.00800323486328125, -0.04425048828125, 0.06536865234375, -0.008087158203125, -0.04083251953125, -0.0041656494140625, 0.033660888671875, 0.01654052734375, -0.032196044921875, 0.0221099853515625, -0.058746337890625, 0.024261474609375, 0.0041656494140625, -0.02716064453125, -0.01198577880859375, 0.0116729736328125, -0.0731201171875, 0.0679931640625, 0.04290771484375, -0.05218505859375, 0.0190582275390625, -0.0665283203125, -0.006824493408203125, 0.037139892578125, -0.00024020671844482422, -0.04248046875, 0.007495880126953125, -0.025299072265625, 0.0202484130859375, -0.00592803955078125, 0.00897979736328125, -0.0225982666015625, 0.0224761962890625, 0.0227203369140625, 0.027618408203125, 0.046875, 0.056060791015625, 0.006084442138671875, 0.0189056396484375, -0.06707763671875, -0.00887298583984375, 0.03515625, 0.0002987384796142578, -0.021148681640625, -0.00738525390625, 0.0076446533203125, 0.0086517333984375, 0.02532958984375, -0.028411865234375, 0.0280303955078125, 0.0218505859375, 0.019073486328125, 0.059814453125, -0.0037631988525390625, 0.03826904296875, -0.0249481201171875, 0.059722900390625, -0.0032711029052734375, 0.0229644775390625, 0.003696441650390625, -0.02716064453125, -0.033233642578125, -0.040863037109375, 0.037384033203125, 0.0244293212890625, -0.044952392578125, 0.049072265625, -0.0040283203125, -0.05902099609375, 0.002834320068359375, -0.01513671875, 0.00510406494140625, 
0.021728515625, 0.0054931640625, -0.07427978515625, -0.034027099609375, -0.07000732421875, 0.0112457275390625, -0.035888671875, -0.0020351409912109375, 0.028778076171875, 0.06512451171875, -0.0125885009765625, 0.037261962890625, -0.030426025390625, -0.011077880859375, -0.052581787109375, 0.011932373046875, 0.031951904296875, 0.0579833984375, 0.055755615234375, -0.048736572265625, -0.039947509765625, -0.0050048828125, -0.04388427734375, 0.00004583597183227539, -0.0160980224609375, -0.0264739990234375, 0.01300811767578125, 0.0097503662109375, -0.0726318359375, 0.05926513671875, 0.04693603515625, -0.052825927734375, 0.0594482421875, -0.0237579345703125, 0.015106201171875, -0.08282470703125, 0.0184173583984375, 0.0017547607421875, -0.056304931640625, -0.003917694091796875, 0.00963592529296875, -0.0055389404296875, -0.0188140869140625, -0.03411865234375, 0.036773681640625, -0.045196533203125, -0.0048980712890625, -0.013458251953125, -0.0005011558532714844, 0.01204681396484375, -0.00605010986328125, -0.007381439208984375, 0.039215087890625, 0.033660888671875, -0.042510986328125, 0.04534912109375, 0.04058837890625, -0.019622802734375, 0.035919189453125, -0.07366943359375, -0.00113677978515625, -0.0012340545654296875, 0.046234130859375, -0.064453125, -0.044036865234375, 0.052215576171875, -0.034637451171875, 0.04669189453125, -0.044464111328125, -0.04266357421875, -0.0263214111328125, -0.007030487060546875, 0.03515625, 0.046630859375, -0.02880859375, 0.050079345703125, 0.02728271484375, 0.01155853271484375, -0.00986480712890625, -0.0275115966796875, -0.016571044921875, -0.0126190185546875, -0.0273895263671875, 0.0308074951171875, -0.00223541259765625, -0.027923583984375, -0.008880615234375, -0.0012941360473632812, 0.0040740966796875, -0.031951904296875, 0.030059814453125, 0.045928955078125, -0.0138092041015625, 0.016937255859375, -0.0196533203125, -0.0035400390625, -0.01033782958984375, -0.00342559814453125, 0.065673828125, -0.01708984375, -0.0025768280029296875, 
-0.0631103515625, 0.0007767677307128906, 0.0645751953125, 0.0013380050659179688, 0.07666015625, 0.021514892578125, 0.0035419464111328125, -0.01338958740234375, -0.0267791748046875, -0.0107421875, -0.03863525390625, 0.005161285400390625, -0.0206451416015625, -0.024627685546875, 0.034881591796875, 0.0023345947265625, 0.004901885986328125, 0.046417236328125, 0.038330078125, -0.023712158203125, 0.048126220703125, 0.0201873779296875, 0.01430511474609375, 0.051361083984375, -0.039337158203125, 0.00528717041015625, -0.05828857421875, -0.0179901123046875, -0.0244903564453125, -0.02191162109375, -0.0643310546875, -0.03857421875, -0.007404327392578125, 0.01500701904296875, -0.03485107421875, 0.08514404296875, -0.06927490234375, 0.004638671875, 0.02655029296875, -0.007373809814453125, 0.01129913330078125, -0.02728271484375, -0.02008056640625, 0.00992584228515625, -0.041961669921875, -0.05462646484375, 0.07965087890625, 0.0287017822265625, 0.050201416015625, -0.005550384521484375, 0.0543212890625, 0.0361328125, 0.0167236328125, -0.032867431640625, 0.041595458984375, 0.0283355712890625, -0.08990478515625, -0.005977630615234375, -0.01947021484375, -0.067626953125, 0.026580810546875, -0.005466461181640625, -0.03277587890625, 0.0013113021850585938, 0.0297698974609375, -0.007171630859375, 0.033050537109375, -0.036529541015625, 0.0675048828125, -0.02789306640625, -0.002857208251953125, -0.0255279541015625, -0.0391845703125, 0.0233001708984375, 0.023956298828125, 0.037322998046875, -0.0260467529296875, -0.0128631591796875, 0.0396728515625, -0.060455322265625, 0.040435791015625, -0.00963592529296875, 0.004512786865234375, 0.035308837890625, 0.0187225341796875, 0.06048583984375, 0.0284423828125, 0.0225067138671875, -0.0035343170166015625, 0.01203155517578125, -0.02972412109375, -0.034881591796875, 0.0482177734375, -0.052398681640625, 0.0025730133056640625, -0.053436279296875, -0.015167236328125, 0.047088623046875, 0.0069732666015625, 0.0249786376953125, 0.01554107666015625, 
-0.037628173828125, 0.005283355712890625, 0.034515380859375, -0.01010894775390625, 0.01557159423828125, 0.037017822265625, -0.0596923828125, -0.03900146484375, 0.04986572265625, -0.005340576171875, 0.0002951622009277344, 0.002391815185546875, -0.01332855224609375, 0.007778167724609375, -0.032501220703125, -0.022857666015625, 0.035400390625, -0.02362060546875, -0.019500732421875, -0.024871826171875, -0.0112152099609375, -0.0266265869140625, -0.01154327392578125, -0.07281494140625, -0.059814453125, -0.0545654296875, -0.003814697265625, 0.0462646484375, 0.07666015625, -0.025146484375, 0.044189453125, -0.04296875, 0.037139892578125, -0.00548553466796875, 0.018310546875, -0.033782958984375, -0.042205810546875, -0.00494384765625, -0.013275146484375, -0.057281494140625, -0.044891357421875, 0.0665283203125, -0.0160980224609375, 0.0298614501953125, 0.0157318115234375, 0.00563812255859375, 0.020843505859375, -0.034149169921875, 0.05157470703125, 0.055023193359375, -0.0301055908203125, 0.042388916015625, -0.0394287109375, 0.0200042724609375, 0.02777099609375, 0.0231781005859375, -0.0177001953125, -0.0265045166015625, -0.0750732421875, -0.0792236328125, 0.03955078125, 0.045196533203125, 0.0082550048828125, 0.01462554931640625, 0.01134490966796875, 0.013336181640625, 0.01186370849609375, -0.05181884765625, -0.0211944580078125, -0.035003662109375, -0.01849365234375, 0.0091400146484375, -0.0084381103515625, -0.0283355712890625, -0.0285491943359375, 0.06756591796875, -0.00457000732421875, 0.0338134765625, 0.007678985595703125, -0.003917694091796875, -0.0224456787109375, 0.00612640380859375, 0.06805419921875, 0.048095703125, -0.0267181396484375, -0.025238037109375, -0.003917694091796875, -0.032440185546875, -0.01464080810546875, -0.01093292236328125, 0.0007996559143066406, 0.031005859375, 0.025177001953125, 0.05712890625, 0.0087432861328125, -0.018768310546875, 0.031890869140625, -0.024688720703125, -0.01073455810546875, -0.08734130859375, 0.01727294921875, 0.00809478759765625, 
0.0180816650390625, 0.016815185546875, 0.032012939453125, -0.017059326171875, -0.02215576171875, 0.004730224609375, 0.0276641845703125, -0.054840087890625, -0.0594482421875, 0.051788330078125, 0.0177764892578125, -0.040191650390625, 0.052581787109375, -0.0040283203125, -0.0194244384765625, 0.056610107421875, 0.0298004150390625, 0.056365966796875, -0.00330352783203125, 0.0307159423828125, 0.057769775390625, 0.00971221923828125, 0.0032711029052734375, 0.03216552734375, -0.006946563720703125, -0.037567138671875, 0.0129547119140625, -0.033447265625, -0.0154266357421875, 0.0082550048828125, -0.059539794921875, 0.055328369140625, -0.054412841796875, -0.033050537109375, -0.00909423828125, -0.00940704345703125, -0.0521240234375, 0.0289459228515625, 0.016021728515625, 0.0889892578125, -0.037322998046875, 0.05999755859375, 0.05963134765625, -0.032379150390625, -0.040374755859375, -0.0238800048828125, 0.007083892822265625, -0.061187744140625, 0.04107666015625, 0.0178375244140625, 0.01155853271484375, -0.005008697509765625, -0.0780029296875, -0.07379150390625, 0.10284423828125, -0.02239990234375, -0.053009033203125, 0.00685882568359375, -0.024322509765625, 0.02783203125, -0.07513427734375, 0.037139892578125, -0.00748443603515625, 0.036895751953125, 0.01203155517578125, -0.058868408203125, -0.004451751708984375, -0.01525115966796875, 0.015899658203125, 0.014251708984375, -0.07061767578125, 0.046417236328125, -0.0157012939453125, 0.0035381317138671875, 0.027313232421875, 0.05303955078125, 0.03875732421875, 0.010833740234375, 0.050628662109375, 0.05865478515625, 0.0171966552734375, 0.007289886474609375, 0.08477783203125, -0.0161590576171875, 0.054931640625, 0.08056640625, -0.019500732421875, 0.027618408203125, 0.04815673828125, -0.0152587890625, 0.092529296875, 0.0467529296875, -0.022735595703125, 0.05133056640625, 0.00824737548828125, -0.00014400482177734375, -0.025787353515625, 0.0008149147033691406, -0.029754638671875, 0.031982421875, 0.01739501953125, -0.049835205078125, 
0.0002665519714355469, -0.0174560546875, -0.006008148193359375, -0.01306915283203125, -0.019683837890625, 0.042694091796875, 0.0179290771484375, -0.037139892578125, 0.02685546875, -0.022705078125, 0.01329803466796875, -0.05316162109375, -0.0160064697265625, 0.01263427734375, 0.0173187255859375, -0.011962890625, -0.06207275390625, 0.0128936767578125, -0.0106964111328125, -0.0008630752563476562, -0.0252838134765625, 0.04144287109375, 0.01340484619140625, -0.0457763671875, 0.0295562744140625, 0.01146697998046875, 0.0165252685546875, 0.022216796875, -0.078369140625, 0.0164031982421875, -0.002079010009765625, -0.0579833984375, 0.0207977294921875, 0.006801605224609375, -0.00363922119140625, 0.0645751953125, 0.035888671875, -0.00325775146484375, -0.00533294677734375, 0.00070953369140625, 0.06903076171875, -0.04046630859375, -0.0113525390625, -0.0307159423828125, 0.0777587890625, -0.01171875, -0.048858642578125, 0.03143310546875, 0.058837890625, 0.046539306640625, -0.0275421142578125, 0.04766845703125, -0.05499267578125, 0.046295166015625, -0.01395416259765625, 0.0836181640625, -0.0635986328125, 0.00937652587890625, 0.0028934478759765625, -0.0338134765625, 0.00433349609375, 0.033660888671875, 0.01029205322265625, 0.003711700439453125, 0.05804443359375, 0.05902099609375, -0.013031005859375, -0.00707244873046875, 0.0236358642578125, -0.00797271728515625, -0.0007719993591308594, -0.01922607421875, 0.049957275390625, -0.055023193359375, 0.033599853515625, -0.031524658203125, -0.01187896728515625, -0.0297393798828125, -0.042572021484375, -0.0699462890625, -0.04302978515625, -0.039764404296875, -0.056304931640625, 0.0013256072998046875, 0.07086181640625, 0.05419921875, -0.08056640625, -0.025970458984375, -0.0257110595703125, 0.0163726806640625, 0.00902557373046875, -0.0223388671875, -0.00904083251953125, -0.01397705078125, -0.0367431640625, 0.004001617431640625, -0.003978729248046875, 0.0283966064453125, 0.0062255859375, -0.0123291015625, -0.022216796875, -0.00555419921875, 
0.01053619384765625, 0.03411865234375, -0.040008544921875, -0.0281219482421875, -0.0178375244140625, -0.0261077880859375, 0.0323486328125, 0.0675048828125, -0.039764404296875, 0.0184478759765625, 0.0478515625, -0.047576904296875, 0.038848876953125, 0.007633209228515625, 0.0712890625, -0.033416748046875, 0.01861572265625, 0.0244293212890625, 0.03961181640625, 0.01546478271484375, -0.0216217041015625, 0.046295166015625, 0.0274200439453125, -0.06329345703125, -0.06732177734375, 0.0111541748046875, -0.09271240234375, -0.025146484375, 0.061553955078125, 0.0019512176513671875, -0.040283203125, -0.0006785392761230469, -0.015838623046875, 0.0283050537109375, -0.055908203125, 0.03448486328125, 0.0316162109375, -0.002239227294921875, -0.035888671875, -0.05267333984375, 0.006694793701171875, -0.0022563934326171875, -0.0211334228515625, -0.01934814453125, 0.0130767822265625, 0.04290771484375, 0.031524658203125, 0.0186614990234375, -0.0161895751953125, 0.0269317626953125, 0.005619049072265625, 0.06964111328125, -0.0014963150024414062, -0.0145721435546875, -0.009735107421875, -0.016357421875, -0.0075836181640625, -0.03204345703125 ] ]
togethercomputer/RedPajama-INCITE-Chat-3B-v1
2023-05-09T14:59:47.000Z
[ "transformers", "pytorch", "gpt_neox", "text-generation", "en", "dataset:togethercomputer/RedPajama-Data-1T", "dataset:OpenAssistant/oasst1", "dataset:databricks/databricks-dolly-15k", "license:apache-2.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
togethercomputer
null
null
togethercomputer/RedPajama-INCITE-Chat-3B-v1
116
9,681
transformers
2023-05-05T05:06:37
---
license: apache-2.0
language:
- en
datasets:
- togethercomputer/RedPajama-Data-1T
- OpenAssistant/oasst1
- databricks/databricks-dolly-15k
widget:
- text: "<human>: Write an email to my friends inviting them to come to my home on Friday for a dinner party, bring their own food to share.\n<bot>:"
  example_title: "Email Writing"
- text: "<human>: Create a list of things to do in San Francisco\n<bot>:"
  example_title: "Brainstorming"
inference:
  parameters:
    temperature: 0.7
    top_p: 0.7
    top_k: 50
    max_new_tokens: 128
---

# RedPajama-INCITE-Chat-3B-v1

RedPajama-INCITE-Chat-3B-v1 was developed by Together and leaders from the open-source AI community, including Ontocord.ai, ETH DS3Lab, AAI CERC, Université de Montréal, MILA - Québec AI Institute, the Stanford Center for Research on Foundation Models (CRFM), the Stanford Hazy Research group, and LAION. It is fine-tuned on OASST1 and Dolly2 to enhance its chat ability.

- Base Model: [RedPajama-INCITE-Base-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-3B-v1)
- Instruction-tuned Version: [RedPajama-INCITE-Instruct-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-3B-v1)
- Chat Version: [RedPajama-INCITE-Chat-3B-v1](https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-3B-v1)

## Model Details

- **Developed by**: Together Computer.
- **Model type**: Language Model
- **Language(s)**: English
- **License**: Apache 2.0
- **Model Description**: A 2.8B parameter pretrained language model.

# Quick Start

Please note that the model requires `transformers` version >= 4.25.1.

To prompt the chat model, use the following format:

```
<human>: [Instruction]
<bot>:
```

## GPU Inference

This requires a GPU with 8GB memory.
```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM

MIN_TRANSFORMERS_VERSION = '4.25.1'

# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'

# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-Chat-3B-v1")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-Chat-3B-v1", torch_dtype=torch.float16)
model = model.to('cuda:0')

# infer
prompt = "<human>: Who is Alan Turing?\n<bot>:"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
    **inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
Alan Turing was a British mathematician, logician, cryptologist, and computer scientist. He is widely regarded as the father of computer science and artificial intelligence.
"""
```

## GPU Inference in Int8

This requires a GPU with 6GB memory.

To run inference with int8, please ensure you have installed `accelerate` and `bitsandbytes`. You can install them with the following command:

```bash
pip install accelerate
pip install bitsandbytes
```

Then you can run inference with int8 as follows:

```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM

MIN_TRANSFORMERS_VERSION = '4.25.1'

# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'

# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-Chat-3B-v1")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-Chat-3B-v1", device_map='auto', torch_dtype=torch.float16, load_in_8bit=True)

# infer
prompt = "<human>: Who is Alan Turing?\n<bot>:"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
    **inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
Alan Turing was a British mathematician and computer scientist who made important contributions to computer science and mathematical logic. He is widely regarded as the father of computer science and artificial intelligence for his work on the Turing machine and Turing test.
"""
```

## CPU Inference

```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM

MIN_TRANSFORMERS_VERSION = '4.25.1'

# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'

# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-Chat-3B-v1")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-Chat-3B-v1", torch_dtype=torch.bfloat16)

# infer
prompt = "<human>: Who is Alan Turing?\n<bot>:"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
    **inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
Alan Turing was a British mathematician and computer scientist who made important contributions to the fields of mathematics, cryptography, and computer science. He is widely regarded as the father of computer science and artificial intelligence.
"""
```

Please note that since `LayerNormKernelImpl` is not implemented in fp16 for CPU, we use `bfloat16` for CPU inference.

# Uses

Excluded uses are described below.

### Misuse, Malicious Use, and Out-of-Scope Use

It is the responsibility of the end user to ensure that the model is used in a responsible and ethical manner.

#### Out-of-Scope Use

`RedPajama-INCITE-Chat-3B-v1` is a language model and may not perform well for other use cases outside of its intended scope. For example, it may not be suitable for use in safety-critical applications or for making decisions that have a significant impact on individuals or society. It is important to consider the limitations of the model and to only use it for its intended purpose.

#### Misuse and Malicious Use

`RedPajama-INCITE-Chat-3B-v1` is designed for language modeling. Misuse of the model, such as using it to engage in illegal or unethical activities, is strictly prohibited and goes against the principles of the project. Using the model to generate content that is cruel to individuals is a misuse of this model.
This includes, but is not limited to:

- Generating fake news, misinformation, or propaganda
- Promoting hate speech, discrimination, or violence against individuals or groups
- Impersonating individuals or organizations without their consent
- Engaging in cyberbullying or harassment
- Defamatory content
- Spamming or scamming
- Sharing confidential or sensitive information without proper authorization
- Violating the terms of use of the model or the data used to train it
- Creating automated bots for malicious purposes such as spreading malware, phishing scams, or spamming

## Limitations

`RedPajama-INCITE-Chat-3B-v1`, like other language models, has limitations that should be taken into consideration. For example, the model may not always provide accurate or relevant answers, particularly for questions that are complex, ambiguous, or outside of its training data. We therefore welcome contributions from individuals and organizations, and encourage collaboration towards creating a more robust and inclusive chatbot.

## Training

**Training Data**

Please refer to [togethercomputer/RedPajama-Data-1T](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)

**Training Procedure**

- **Hardware:** 8 A100
- **Optimizer:** Adam
- **Gradient Accumulations**: 1
- **Num of Tokens:** 131M tokens
- **Learning rate:** 1e-5

## Community

Join us on [Together Discord](https://discord.gg/6ZVDU8tTD4)
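As a usage note on the `<human>:`/`<bot>:` format shown in the Quick Start section, multi-turn prompts can be assembled with a small helper. This is our own illustrative sketch; the `format_prompt` name is an assumption, not part of the model's or library's API:

```python
def format_prompt(turns):
    """Assemble a chat prompt from (role, text) turns.

    Roles are 'human' or 'bot'; the prompt ends with '<bot>:' so the
    model continues generating as the assistant.
    """
    lines = [f"<{role}>: {text}" for role, text in turns]
    lines.append("<bot>:")
    return "\n".join(lines)


prompt = format_prompt([("human", "Who is Alan Turing?")])
print(prompt)
# <human>: Who is Alan Turing?
# <bot>:
```

The resulting string can be passed to the tokenizer exactly as in the inference examples above.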
8,215
[ [ -0.031646728515625, -0.08056640625, 0.0208892822265625, 0.02972412109375, -0.00030541419982910156, -0.01345062255859375, -0.01187896728515625, -0.0311737060546875, 0.0186309814453125, 0.019927978515625, -0.0531005859375, -0.03009033203125, -0.05914306640625, 0.0032939910888671875, -0.035430908203125, 0.0531005859375, 0.00548553466796875, -0.0135498046875, -0.026275634765625, 0.0036258697509765625, -0.029327392578125, -0.031768798828125, -0.05859375, -0.0340576171875, -0.005947113037109375, 0.0282135009765625, 0.038787841796875, 0.0299835205078125, 0.0400390625, 0.0305023193359375, -0.0033016204833984375, 0.00417327880859375, -0.03375244140625, -0.001552581787109375, 0.0162811279296875, -0.0400390625, -0.028472900390625, -0.00737762451171875, 0.03424072265625, 0.019927978515625, 0.0059051513671875, 0.0185394287109375, -0.003192901611328125, 0.036865234375, -0.040496826171875, 0.0286102294921875, -0.04852294921875, -0.00315093994140625, -0.0101165771484375, -0.0126953125, -0.034515380859375, -0.020477294921875, -0.0051422119140625, -0.026580810546875, 0.008453369140625, -0.00585174560546875, 0.06512451171875, 0.0264434814453125, 0.0006165504455566406, -0.01328277587890625, -0.04364013671875, 0.0648193359375, -0.079345703125, 0.01073455810546875, 0.040496826171875, 0.0210723876953125, -0.0287933349609375, -0.0703125, -0.0679931640625, -0.01203155517578125, -0.00879669189453125, -0.007434844970703125, -0.0298004150390625, -0.008148193359375, 0.020721435546875, 0.016265869140625, -0.038665771484375, -0.0022411346435546875, -0.0587158203125, -0.007083892822265625, 0.0518798828125, 0.0191650390625, 0.041107177734375, -0.0028400421142578125, -0.01214599609375, -0.01983642578125, -0.03948974609375, -0.0006113052368164062, 0.0212249755859375, 0.0209503173828125, -0.044464111328125, 0.04034423828125, -0.00914764404296875, 0.042877197265625, 0.0031223297119140625, -0.022064208984375, 0.041961669921875, -0.01511383056640625, -0.024383544921875, -0.00090789794921875, 
0.09375, 0.01739501953125, 0.0065765380859375, 0.0198822021484375, 0.0239410400390625, 0.0147552490234375, 0.0004944801330566406, -0.05877685546875, -0.040679931640625, 0.044464111328125, -0.02618408203125, -0.01352691650390625, 0.0028057098388671875, -0.037078857421875, -0.0162811279296875, 0.0157470703125, 0.043121337890625, -0.04107666015625, -0.028350830078125, 0.00717926025390625, -0.0191497802734375, 0.0157012939453125, 0.0017414093017578125, -0.0816650390625, 0.029388427734375, 0.0343017578125, 0.0556640625, 0.004909515380859375, -0.04345703125, 0.0008192062377929688, -0.006500244140625, -0.001445770263671875, 0.0305938720703125, -0.0233306884765625, -0.007537841796875, -0.0303802490234375, -0.006282806396484375, -0.0030727386474609375, -0.032440185546875, 0.007595062255859375, -0.0139617919921875, 0.04730224609375, 0.0241546630859375, -0.04681396484375, -0.027984619140625, 0.0028362274169921875, -0.035003662109375, 0.07403564453125, 0.007114410400390625, -0.0625, -0.003658294677734375, -0.0740966796875, -0.03741455078125, 0.001739501953125, -0.026947021484375, -0.046630859375, -0.01554107666015625, 0.02252197265625, 0.0394287109375, -0.0238800048828125, 0.0061187744140625, -0.027557373046875, -0.0284271240234375, 0.02520751953125, -0.033294677734375, 0.10650634765625, 0.00553131103515625, -0.060455322265625, 0.01279449462890625, -0.039215087890625, 0.004932403564453125, 0.0266571044921875, 0.0023899078369140625, 0.016693115234375, -0.0015039443969726562, -0.01361083984375, 0.01690673828125, 0.03363037109375, -0.044189453125, -0.0013599395751953125, -0.04248046875, 0.051727294921875, 0.051513671875, 0.0018062591552734375, 0.01163482666015625, -0.031402587890625, 0.029296875, 0.0037021636962890625, 0.026275634765625, -0.0012950897216796875, -0.0631103515625, -0.061309814453125, -0.024139404296875, -0.0011157989501953125, 0.049102783203125, -0.03643798828125, 0.053680419921875, -0.00019621849060058594, -0.045257568359375, -0.0548095703125, 
-0.00359344482421875, 0.01806640625, 0.01995849609375, 0.038787841796875, -0.0143280029296875, -0.05670166015625, -0.06561279296875, 0.0026226043701171875, -0.02459716796875, -0.00001823902130126953, 0.035980224609375, 0.048675537109375, -0.034088134765625, 0.059478759765625, -0.041107177734375, -0.0105743408203125, -0.022247314453125, 0.00644683837890625, 0.03460693359375, 0.057708740234375, 0.0308685302734375, -0.03436279296875, -0.04302978515625, -0.0015020370483398438, -0.067626953125, 0.00931549072265625, -0.00862884521484375, -0.0024585723876953125, 0.0279388427734375, 0.030426025390625, -0.0572509765625, 0.041229248046875, 0.0450439453125, -0.035247802734375, 0.033050537109375, -0.01311492919921875, 0.01421356201171875, -0.08038330078125, 0.005523681640625, -0.01490020751953125, -0.00984954833984375, -0.038238525390625, 0.0025119781494140625, -0.01038360595703125, 0.00551605224609375, -0.0587158203125, 0.060516357421875, -0.0212860107421875, 0.0161285400390625, -0.01509857177734375, 0.0069122314453125, -0.0160064697265625, 0.05084228515625, -0.01134490966796875, 0.05133056640625, 0.056304931640625, -0.054595947265625, 0.044189453125, 0.032562255859375, -0.022003173828125, 0.0072021484375, -0.059844970703125, 0.0173797607421875, 0.0008840560913085938, 0.0035686492919921875, -0.07568359375, -0.005672454833984375, 0.04229736328125, -0.07220458984375, 0.01369476318359375, 0.0090484619140625, -0.03741455078125, -0.0355224609375, -0.018310546875, 0.019744873046875, 0.065185546875, -0.0335693359375, 0.040679931640625, 0.050628662109375, -0.004230499267578125, -0.04412841796875, -0.055419921875, -0.00989532470703125, -0.0129852294921875, -0.05914306640625, 0.0088653564453125, -0.0212860107421875, -0.022491455078125, -0.005733489990234375, -0.003353118896484375, -0.00789642333984375, 0.02001953125, 0.0232696533203125, 0.0212860107421875, 0.007587432861328125, -0.0255584716796875, -0.0067596435546875, -0.002063751220703125, 0.0276336669921875, 0.0027561187744140625, 
0.06591796875, -0.0264892578125, -0.01273345947265625, -0.06732177734375, 0.01232147216796875, 0.04144287109375, 0.013092041015625, 0.073486328125, 0.040740966796875, -0.031402587890625, -0.01232147216796875, -0.0301361083984375, -0.04156494140625, -0.03912353515625, 0.034423828125, -0.02362060546875, -0.056427001953125, 0.0294647216796875, 0.0301361083984375, 0.02081298828125, 0.0538330078125, 0.0618896484375, -0.0137176513671875, 0.07525634765625, 0.032470703125, -0.00585174560546875, 0.03143310546875, -0.045074462890625, 0.01273345947265625, -0.052276611328125, -0.024078369140625, -0.0413818359375, -0.022125244140625, -0.04083251953125, -0.023590087890625, 0.015533447265625, -0.0086822509765625, -0.04327392578125, 0.039947509765625, -0.053375244140625, 0.031402587890625, 0.0538330078125, 0.01428985595703125, -0.0059356689453125, 0.0013942718505859375, 0.00397491455078125, 0.006595611572265625, -0.0594482421875, -0.0281524658203125, 0.0750732421875, 0.03326416015625, 0.052001953125, 0.00612640380859375, 0.045196533203125, 0.004428863525390625, 0.01381683349609375, -0.02191162109375, 0.050048828125, 0.002880096435546875, -0.0684814453125, -0.0261077880859375, -0.03814697265625, -0.0731201171875, 0.01102447509765625, -0.01277923583984375, -0.0714111328125, -0.0121307373046875, 0.020050048828125, -0.0118560791015625, 0.025909423828125, -0.0576171875, 0.07958984375, -0.010345458984375, -0.01081085205078125, -0.02215576171875, -0.057861328125, 0.032745361328125, 0.0247039794921875, 0.00325775146484375, -0.0122528076171875, 0.0184478759765625, 0.06414794921875, -0.032257080078125, 0.07708740234375, -0.00231170654296875, 0.0215301513671875, 0.026458740234375, 0.00936126708984375, 0.0229034423828125, -0.006259918212890625, 0.01104736328125, 0.03912353515625, 0.0012044906616210938, -0.03179931640625, -0.032989501953125, 0.06365966796875, -0.0889892578125, -0.04351806640625, -0.035308837890625, -0.0340576171875, 0.004482269287109375, 0.0158233642578125, 0.0311279296875, 
0.0187530517578125, 0.01084136962890625, 0.0261383056640625, 0.0335693359375, -0.021148681640625, 0.0299835205078125, 0.0100555419921875, -0.0119171142578125, -0.035858154296875, 0.06378173828125, -0.01364898681640625, 0.01275634765625, 0.0179443359375, 0.0106201171875, -0.0222320556640625, -0.02581787109375, -0.0224456787109375, 0.032745361328125, -0.05169677734375, -0.0119781494140625, -0.059051513671875, -0.03173828125, -0.036590576171875, -0.005290985107421875, -0.029296875, -0.03741455078125, -0.04498291015625, 0.01113128662109375, 0.042266845703125, 0.03460693359375, 0.00707244873046875, 0.0251007080078125, -0.034423828125, 0.02801513671875, 0.01171875, 0.0235595703125, -0.0004761219024658203, -0.05523681640625, -0.01102447509765625, 0.01216888427734375, -0.032562255859375, -0.04608154296875, 0.031005859375, 0.00553131103515625, 0.043182373046875, 0.0158843994140625, 0.0024890899658203125, 0.051727294921875, -0.0202789306640625, 0.06658935546875, 0.0206146240234375, -0.06866455078125, 0.045684814453125, 0.004779815673828125, 0.034454345703125, 0.032745361328125, 0.034423828125, -0.0262908935546875, -0.0543212890625, -0.0850830078125, -0.0711669921875, 0.075927734375, 0.042327880859375, 0.01273345947265625, -0.0169677734375, 0.0086822509765625, 0.004215240478515625, 0.01058197021484375, -0.07537841796875, -0.06121826171875, -0.01052093505859375, -0.030914306640625, 0.00823211669921875, -0.0020809173583984375, -0.0030078887939453125, -0.036224365234375, 0.087646484375, 0.0168304443359375, 0.05279541015625, 0.0007233619689941406, -0.00982666015625, 0.00405120849609375, -0.00774383544921875, 0.05120849609375, 0.0623779296875, -0.0221405029296875, -0.0010223388671875, 0.0180206298828125, -0.043975830078125, 0.002574920654296875, 0.0061492919921875, -0.01397705078125, -0.0013475418090820312, 0.031829833984375, 0.07769775390625, -0.01085662841796875, -0.039031982421875, 0.0250701904296875, -0.0300140380859375, -0.019989013671875, -0.04461669921875, 0.02880859375, 
0.0171661376953125, 0.0232696533203125, 0.015655517578125, -0.0000833272933959961, -0.0153961181640625, -0.032745361328125, 0.006114959716796875, 0.04095458984375, -0.0106353759765625, -0.00833892822265625, 0.0643310546875, 0.0115203857421875, -0.04736328125, 0.06317138671875, -0.0029048919677734375, -0.0245208740234375, 0.054718017578125, 0.054595947265625, 0.06732177734375, 0.00011926889419555664, -0.0075225830078125, 0.043060302734375, 0.027587890625, -0.0025806427001953125, 0.0274505615234375, 0.0004794597625732422, -0.05810546875, -0.0194244384765625, -0.028564453125, -0.025665283203125, 0.0232086181640625, -0.0189208984375, 0.0217132568359375, -0.0443115234375, -0.01229095458984375, 0.0010175704956054688, 0.0101165771484375, -0.046417236328125, -0.0033473968505859375, 0.003116607666015625, 0.061370849609375, -0.05828857421875, 0.0567626953125, 0.0408935546875, -0.0302734375, -0.056854248046875, -0.01971435546875, -0.005359649658203125, -0.059234619140625, 0.036712646484375, 0.0198211669921875, -0.0052947998046875, 0.0154266357421875, -0.0574951171875, -0.054534912109375, 0.08099365234375, 0.033203125, -0.0188140869140625, 0.0198822021484375, -0.01218414306640625, 0.0237579345703125, -0.01512908935546875, 0.05072021484375, 0.04681396484375, 0.0269317626953125, -0.0013723373413085938, -0.06854248046875, 0.0013952255249023438, -0.033355712890625, -0.013580322265625, 0.0182647705078125, -0.0340576171875, 0.07855224609375, -0.0158233642578125, -0.0189208984375, -0.0015478134155273438, 0.071044921875, 0.031768798828125, -0.0023174285888671875, 0.019287109375, 0.042083740234375, 0.0462646484375, -0.004901885986328125, 0.07757568359375, -0.040740966796875, 0.0408935546875, 0.0736083984375, 0.01326751708984375, 0.05621337890625, 0.03204345703125, -0.038543701171875, 0.049896240234375, 0.02008056640625, 0.0005216598510742188, 0.027740478515625, 0.00595855712890625, -0.0244598388671875, 0.00890350341796875, 0.034454345703125, -0.032135009765625, 0.0257568359375, 
0.030914306640625, -0.0214080810546875, 0.0078277587890625, -0.014892578125, 0.0181732177734375, -0.01343536376953125, -0.00016188621520996094, 0.05291748046875, 0.0030612945556640625, -0.041778564453125, 0.0770263671875, 0.00785064697265625, 0.06512451171875, -0.04754638671875, -0.0073699951171875, -0.0167694091796875, 0.033538818359375, -0.01392364501953125, -0.049774169921875, 0.0196533203125, -0.014404296875, 0.0001462697982788086, 0.00775146484375, 0.0273895263671875, -0.0333251953125, -0.038177490234375, 0.009185791015625, 0.007694244384765625, 0.04486083984375, 0.0083160400390625, -0.0673828125, 0.038330078125, 0.016693115234375, -0.01477813720703125, 0.009033203125, 0.00650787353515625, 0.0169677734375, 0.06768798828125, 0.034393310546875, 0.0037384033203125, 0.0023555755615234375, -0.0216827392578125, 0.045684814453125, -0.05181884765625, -0.034912109375, -0.0899658203125, 0.037261962890625, -0.004093170166015625, -0.02734375, 0.06671142578125, 0.032012939453125, 0.07965087890625, -0.0093841552734375, 0.058349609375, -0.036590576171875, 0.0159759521484375, -0.0186309814453125, 0.057220458984375, -0.032257080078125, 0.01192474365234375, -0.017242431640625, -0.05328369140625, -0.028106689453125, 0.087890625, -0.0014820098876953125, 0.0198211669921875, 0.05706787109375, 0.06976318359375, 0.002246856689453125, -0.00201416015625, 0.0029354095458984375, 0.039031982421875, 0.032501220703125, 0.05352783203125, 0.033233642578125, -0.052276611328125, 0.036224365234375, -0.04736328125, -0.0232391357421875, -0.017578125, -0.055877685546875, -0.08221435546875, -0.053497314453125, -0.0302276611328125, -0.043975830078125, -0.0023136138916015625, 0.0950927734375, 0.057342529296875, -0.06201171875, -0.04071044921875, -0.0092010498046875, 0.0146484375, -0.027099609375, -0.0213623046875, 0.0275421142578125, -0.016998291015625, -0.0625, 0.0203399658203125, 0.0126495361328125, 0.010498046875, -0.03131103515625, -0.01508331298828125, -0.036895751953125, -0.0010986328125, 
0.041259765625, 0.04351806640625, -0.053314208984375, -0.01216888427734375, -0.0198211669921875, -0.00894927978515625, 0.018096923828125, 0.01641845703125, -0.064697265625, 0.0244293212890625, 0.03656005859375, 0.033447265625, 0.05914306640625, -0.0176544189453125, 0.032379150390625, -0.04534912109375, 0.0189971923828125, 0.0211334228515625, 0.023406982421875, 0.0228424072265625, -0.032684326171875, 0.02008056640625, 0.01338958740234375, -0.04400634765625, -0.061431884765625, 0.0028934478759765625, -0.0631103515625, -0.0305633544921875, 0.07537841796875, 0.0088958740234375, -0.0364990234375, -0.0051727294921875, -0.0155029296875, 0.0203704833984375, -0.02838134765625, 0.07269287109375, 0.035919189453125, -0.04010009765625, -0.022125244140625, -0.031768798828125, 0.0301361083984375, 0.0202789306640625, -0.0667724609375, 0.0008912086486816406, 0.019256591796875, 0.027984619140625, 0.0081787109375, 0.0748291015625, -0.00901031494140625, 0.032562255859375, 0.0083160400390625, 0.029693603515625, -0.0166168212890625, -0.027862548828125, -0.01526641845703125, 0.0037479400634765625, -0.0014801025390625, -0.02459716796875 ] ]
nvidia/segformer-b2-finetuned-ade-512-512
2022-08-06T10:07:54.000Z
[ "transformers", "pytorch", "tf", "segformer", "vision", "image-segmentation", "dataset:scene_parse_150", "arxiv:2105.15203", "license:other", "endpoints_compatible", "has_space", "region:us" ]
image-segmentation
nvidia
null
null
nvidia/segformer-b2-finetuned-ade-512-512
1
9,667
transformers
2022-03-02T23:29:05
---
license: other
tags:
- vision
- image-segmentation
datasets:
- scene_parse_150
widget:
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg
  example_title: House
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg
  example_title: Castle
---

# SegFormer (b2-sized) model fine-tuned on ADE20k

SegFormer model fine-tuned on ADE20k at resolution 512x512. It was introduced in the paper [SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers](https://arxiv.org/abs/2105.15203) by Xie et al. and first released in [this repository](https://github.com/NVlabs/SegFormer).

Disclaimer: The team releasing SegFormer did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

SegFormer consists of a hierarchical Transformer encoder and a lightweight all-MLP decode head to achieve great results on semantic segmentation benchmarks such as ADE20K and Cityscapes. The hierarchical Transformer is first pre-trained on ImageNet-1k, after which a decode head is added and fine-tuned altogether on a downstream dataset.

## Intended uses & limitations

You can use the raw model for semantic segmentation. See the [model hub](https://huggingface.co/models?other=segformer) to look for fine-tuned versions on a task that interests you.
### How to use

Here is how to use this model to segment an image of the COCO 2017 dataset, labelling each pixel with one of the 150 ADE20k classes:

```python
from transformers import SegformerFeatureExtractor, SegformerForSemanticSegmentation
from PIL import Image
import requests

feature_extractor = SegformerFeatureExtractor.from_pretrained("nvidia/segformer-b2-finetuned-ade-512-512")
model = SegformerForSemanticSegmentation.from_pretrained("nvidia/segformer-b2-finetuned-ade-512-512")

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = feature_extractor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits  # shape (batch_size, num_labels, height/4, width/4)
```

For more code examples, we refer to the [documentation](https://huggingface.co/transformers/model_doc/segformer.html).

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-2105-15203,
  author    = {Enze Xie and
               Wenhai Wang and
               Zhiding Yu and
               Anima Anandkumar and
               Jose M. Alvarez and
               Ping Luo},
  title     = {SegFormer: Simple and Efficient Design for Semantic Segmentation with
               Transformers},
  journal   = {CoRR},
  volume    = {abs/2105.15203},
  year      = {2021},
  url       = {https://arxiv.org/abs/2105.15203},
  eprinttype = {arXiv},
  eprint    = {2105.15203},
  timestamp = {Wed, 02 Jun 2021 11:46:42 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2105-15203.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
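The `logits` come out at a quarter of the input resolution, so a post-processing step is needed to recover a full-size segmentation map. Below is a minimal NumPy sketch that uses a per-pixel argmax followed by nearest-neighbour upsampling for illustration; in practice one would bilinearly upsample the logits (e.g. with `torch.nn.functional.interpolate`) before taking the argmax:

```python
import numpy as np

def logits_to_segmentation(logits, scale=4):
    """Turn (batch, num_labels, H/4, W/4) logits into a (batch, H, W) label map."""
    seg = logits.argmax(axis=1)  # most likely class per pixel
    # nearest-neighbour upsample back to the input resolution
    return seg.repeat(scale, axis=1).repeat(scale, axis=2)

# Stand-in for model output on a 512x512 image: 150 ADE20k classes at 128x128
dummy_logits = np.random.randn(1, 150, 128, 128)
seg_map = logits_to_segmentation(dummy_logits)
print(seg_map.shape)  # (1, 512, 512)
```

Each entry of `seg_map` is then an ADE20k class index that can be mapped to a color palette for visualization.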
3,090
Wellcome/WellcomeBertMesh
2023-04-26T09:53:48.000Z
[ "transformers", "pytorch", "bert", "feature-extraction", "text-classification", "custom_code", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
text-classification
Wellcome
null
null
Wellcome/WellcomeBertMesh
4
9,661
transformers
2022-03-02T23:29:05
---
license: apache-2.0
pipeline_tag: text-classification
---

# WellcomeBertMesh

WellcomeBertMesh was built by the data science team at the Wellcome Trust to tag biomedical grants with Medical Subject Headings ([MeSH](https://www.nlm.nih.gov/mesh/meshhome.html)). Although it was developed with research grants in mind, it should be applicable to any biomedical text close to the domain it was trained on, namely abstracts from biomedical publications.

# Model description

The model is inspired by [BertMesh](https://pubmed.ncbi.nlm.nih.gov/32976559/), which is trained on the full text of biomedical publications and uses BioBert as its pretrained model.

WellcomeBertMesh uses the latest state-of-the-art model in the biomedical domain, [PubMedBert](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract) from Microsoft, and attaches a multi-label attention head, which allows the model to attend to different tokens per label when deciding whether each label applies.

We train the model using data from the [BioASQ](http://bioasq.org) competition, which consists of abstracts from PubMed publications. We use 2016-2019 data for training and 2020-2021 data for testing, which gives us ~2.5M publications to train on and 220K to test on, out of a total of 14M publications. It takes 4 days to train WellcomeBertMesh on 8 Nvidia P100 GPUs.

The model achieves 63% micro F1 with a 0.5 threshold for all labels.

The code for developing the model is open source and can be found at https://github.com/wellcometrust/grants_tagger

# How to use

⚠️ You need transformers 4.17+ for this example to work, due to its recent support for custom models.

You can use the model straight from the hub, but because it contains a custom forward function (due to the multi-label attention head) you have to pass `trust_remote_code=True`. You can get access to the probabilities for all labels by omitting `return_labels=True`.
```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "Wellcome/WellcomeBertMesh"
)
model = AutoModel.from_pretrained(
    "Wellcome/WellcomeBertMesh",
    trust_remote_code=True
)

text = "This grant is about malaria and not about HIV."
inputs = tokenizer([text], padding="max_length")
labels = model(**inputs, return_labels=True)
print(labels)
```

You can inspect the model code by navigating to the repository files and opening `model.py`.
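The card mentions a 0.5 decision threshold per label. As a hedged sketch of what multi-label thresholding looks like (the MeSH heading names and logit values below are made up for illustration and are not the model's actual output format), converting per-label logits into a set of tags might look like:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tag_mesh(logits, label_names, threshold=0.5):
    """Return the MeSH headings whose sigmoid probability clears the threshold."""
    probs = sigmoid(np.asarray(logits, dtype=float))
    return [name for name, p in zip(label_names, probs) if p >= threshold]

# Hypothetical per-label logits for three MeSH headings
labels = ["Malaria", "HIV Infections", "Neoplasms"]
print(tag_mesh([2.3, -1.9, -4.0], labels))  # ['Malaria']
```

Unlike softmax classification, each label is decided independently, so a text can receive zero, one, or many MeSH headings.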
2,449
Qwen/Qwen-7B-Chat-Int4
2023-11-05T03:27:30.000Z
[ "transformers", "safetensors", "qwen", "text-generation", "custom_code", "zh", "en", "arxiv:2309.16609", "arxiv:2305.08322", "arxiv:2009.03300", "arxiv:2305.05280", "arxiv:2210.03629", "has_space", "region:us" ]
text-generation
Qwen
null
null
Qwen/Qwen-7B-Chat-Int4
57
9,651
transformers
2023-08-20T14:57:46
--- language: - zh - en tags: - qwen pipeline_tag: text-generation inference: false --- # Qwen-7B-Chat-Int4 <p align="center"> <img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/logo_qwen.jpg" width="400"/> <p> <br> <p align="center"> 🤗 <a href="https://huggingface.co/Qwen">Hugging Face</a>&nbsp&nbsp | &nbsp&nbsp🤖 <a href="https://modelscope.cn/organization/qwen">ModelScope</a>&nbsp&nbsp | &nbsp&nbsp 📑 <a href="https://arxiv.org/abs/2309.16609">Paper</a>&nbsp&nbsp | &nbsp&nbsp🖥️ <a href="https://modelscope.cn/studios/qwen/Qwen-7B-Chat-Demo/summary">Demo</a> <br> <a href="https://github.com/QwenLM/Qwen/blob/main/assets/wechat.png">WeChat (微信)</a>&nbsp&nbsp | &nbsp&nbsp DingTalk (钉钉) &nbsp&nbsp | &nbsp&nbsp<a href="https://discord.gg/z3GAxXZ9Ce">Discord</a>&nbsp&nbsp </p> <br><br> ## 介绍(Introduction) **通义千问-7B(Qwen-7B)**是阿里云研发的通义千问大模型系列的70亿参数规模的模型。Qwen-7B是基于Transformer的大语言模型, 在超大规模的预训练数据上进行训练得到。预训练数据类型多样,覆盖广泛,包括大量网络文本、专业书籍、代码等。同时,在Qwen-7B的基础上,我们使用对齐机制打造了基于大语言模型的AI助手Qwen-7B-Chat。相较于最初开源的Qwen-7B模型,我们现已将预训练模型和Chat模型更新到效果更优的版本。本仓库为Qwen-7B-Chat的Int4量化模型的仓库。 如果您想了解更多关于通义千问-7B开源模型的细节,我们建议您参阅[GitHub代码库](https://github.com/QwenLM/Qwen)。 **Qwen-7B** is the 7B-parameter version of the large language model series, Qwen (abbr. Tongyi Qianwen), proposed by Alibaba Cloud. Qwen-7B is a Transformer-based large language model, which is pretrained on a large volume of data, including web texts, books, codes, etc. Additionally, based on the pretrained Qwen-7B, we release Qwen-7B-Chat, a large-model-based AI assistant, which is trained with alignment techniques. Now we have updated both our pretrained and chat models for better performances. This repository is the one for the Int4 quantized model of Qwen-7B-Chat. For more details about the open-source model of Qwen-7B, please refer to the [GitHub](https://github.com/QwenLM/Qwen) code repository. 
<br> ## 要求(Requirements) * python 3.8及以上版本 * pytorch 2.0及以上版本 * 建议使用CUDA 11.4及以上(GPU用户、flash-attention用户等需考虑此选项) * python 3.8 and above * pytorch 2.0 and above * CUDA 11.4 and above are recommended (this is for GPU users, flash-attention users, etc.) <br> ## 依赖项(Dependency) 运行Qwen-7B-Chat-Int4,请确保满足上述要求,再执行以下pip命令安装依赖库。如安装`auto-gptq`遇到问题,我们建议您到官方[repo](https://github.com/PanQiWei/AutoGPTQ)搜索合适的预编译wheel。 To run Qwen-7B-Chat-Int4, please make sure you meet the above requirements, and then execute the following pip commands to install the required libraries. If you run into problems installing `auto-gptq`, we advise you to check the official [repo](https://github.com/PanQiWei/AutoGPTQ) for a suitable pre-built wheel. ```bash pip install transformers==4.32.0 accelerate tiktoken einops scipy transformers_stream_generator==0.0.4 peft deepspeed pip install auto-gptq optimum ``` 另外,推荐安装`flash-attention`库(**当前已支持flash attention 2**),以实现更高的效率和更低的显存占用。 In addition, it is recommended to install the `flash-attention` library (**flash attention 2 is now supported**) for higher efficiency and lower memory usage. ```bash git clone https://github.com/Dao-AILab/flash-attention cd flash-attention && pip install . # 下方安装可选,安装可能比较缓慢。 # The two installs below are optional and may be slow. # pip install csrc/layer_norm # pip install csrc/rotary ``` <br> ## 快速使用(Quickstart) 下面我们展示了一个使用Qwen-7B-Chat-Int4模型的样例: We show an example of how to use Qwen-7B-Chat-Int4 in the following code: ```python from transformers import AutoTokenizer, AutoModelForCausalLM # Note: The default behavior now has injection attack prevention off.
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-7B-Chat-Int4", trust_remote_code=True) model = AutoModelForCausalLM.from_pretrained( "Qwen/Qwen-7B-Chat-Int4", device_map="auto", trust_remote_code=True ).eval() response, history = model.chat(tokenizer, "你好", history=None) print(response) # 你好!很高兴为你提供帮助。 ``` 关于更多的使用说明,请参考我们的[GitHub repo](https://github.com/QwenLM/Qwen)获取更多信息。 For more usage instructions, please refer to our [GitHub repo](https://github.com/QwenLM/Qwen). <br> ## 量化 (Quantization) ### 效果评测 我们对BF16,Int8和Int4模型在基准评测上做了测试(使用zero-shot设置),发现量化模型效果损失较小,结果如下所示: We illustrate the zero-shot performance of the BF16, Int8 and Int4 models on the benchmarks, and we find that the quantized models do not suffer from significant performance degradation. Results are shown below: | Quantization | MMLU | CEval (val) | GSM8K | HumanEval | | ------------- | :--------: | :----------: | :----: | :--------: | | BF16 | 55.8 | 59.7 | 50.3 | 37.2 | | Int8 | 55.4 | 59.4 | 48.3 | 34.8 | | Int4 | 55.1 | 59.2 | 49.7 | 29.9 | ### 推理速度 (Inference Speed) 我们测算了不同精度模型以及不同FlashAttn库版本下模型生成2048和8192个token的平均推理速度。如下表所示: We measured the average inference speed (tokens/s) of generating 2048 and 8192 tokens with different quantization levels and versions of flash-attention, respectively. | Quantization | FlashAttn | Speed (2048 tokens) | Speed (8192 tokens) | | ------------- | :-------: | :------------------:| :------------------:| | BF16 | v2 | 40.93 | 36.14 | | Int8 | v2 | 37.47 | 32.54 | | Int4 | v2 | 50.09 | 38.61 | | BF16 | v1 | 40.75 | 35.34 | | Int8 | v1 | 37.51 | 32.39 | | Int4 | v1 | 45.98 | 36.47 | | BF16 | Disabled | 37.55 | 33.56 | | Int8 | Disabled | 37.84 | 32.65 | | Int4 | Disabled | 48.12 | 36.70 | 具体而言,我们记录在长度为1的上下文的条件下生成8192个token的性能。评测运行于单张A100-SXM4-80G GPU,使用PyTorch 2.0.1和CUDA 11.8。推理速度是生成8192个token的速度均值。 In detail, the setting of profiling is generating 8192 new tokens with 1 context token.
The profiling runs on a single A100-SXM4-80G GPU with PyTorch 2.0.1 and CUDA 11.8. The inference speed is averaged over the generated 8192 tokens. 注意:以上Int4/Int8模型生成速度使用autogptq库给出,当前``AutoModelForCausalLM.from_pretrained``载入的模型生成速度会慢大约20%。我们已经将该问题汇报给HuggingFace团队,若有解决方案将即时更新。 Note: The generation speed of the Int4/Int8 models mentioned above is provided by the autogptq library. The current speed of the model loaded using `AutoModelForCausalLM.from_pretrained` will be approximately 20% slower. We have reported this issue to the HuggingFace team and will update here promptly if a solution is available. ### 显存使用 (GPU Memory Usage) 我们还测算了不同模型精度编码2048个token及生成8192个token的峰值显存占用情况。(显存消耗在是否使用FlashAttn的情况下均类似。)结果如下所示: We also profile the peak GPU memory usage for encoding 2048 tokens as context (and generating a single token) and for generating 8192 tokens (with a single token as context) under different quantization levels. (The GPU memory usage is similar whether flash-attention is used or not.) The results are shown below. | Quantization Level | Peak Usage for Encoding 2048 Tokens | Peak Usage for Generating 8192 Tokens | | ------------------ | :---------------------------------: | :-----------------------------------: | | BF16 | 16.99GB | 22.53GB | | Int8 | 11.20GB | 16.62GB | | Int4 | 8.21GB | 13.63GB | 上述性能测算使用[此脚本](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile.py)完成。 The above speed and memory profiling are conducted using [this script](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile.py). <br> ## Tokenizer > 注:作为术语的“tokenization”在中文中尚无共识的概念对应,本文档采用英文表达以利说明。 基于tiktoken的分词器有别于其他分词器,比如sentencepiece分词器。尤其在微调阶段,需要特别注意特殊token的使用。关于tokenizer的更多信息,以及微调时涉及的相关使用,请参阅[文档](https://github.com/QwenLM/Qwen/blob/main/tokenization_note_zh.md)。 Our tokenizer, based on tiktoken, is different from other tokenizers such as the sentencepiece tokenizer. You need to pay attention to special tokens, especially in finetuning.
For more detailed information on the tokenizer and its use in fine-tuning, please refer to the [documentation](https://github.com/QwenLM/Qwen/blob/main/tokenization_note.md). <br> ## 模型细节(Model) 与Qwen-7B预训练模型相同,Qwen-7B-Chat模型规模基本情况如下所示 The details of the model architecture of Qwen-7B-Chat are listed as follows: | Hyperparameter | Value | |:----------------|:------:| | n_layers | 32 | | n_heads | 32 | | d_model | 4096 | | vocab size | 151851 | | sequence length | 8192 | 在位置编码、FFN激活函数和normalization的实现方式上,我们也采用了目前最流行的做法, 即RoPE相对位置编码、SwiGLU激活函数、RMSNorm(可选安装flash-attention加速)。 在分词器方面,相比目前主流开源模型以中英词表为主,Qwen-7B-Chat使用了约15万token大小的词表。 该词表在GPT-4使用的BPE词表`cl100k_base`基础上,对中文、多语言进行了优化,在对中、英、代码数据的高效编解码的基础上,对部分多语言更加友好,方便用户在不扩展词表的情况下对部分语种进行能力增强。 词表对数字按单个数字位切分。调用较为高效的[tiktoken分词库](https://github.com/openai/tiktoken)进行分词。 For position encoding, the FFN activation function, and normalization, we adopt the prevalent practices, i.e., RoPE relative position encoding, SwiGLU as the activation function, and RMSNorm for normalization (with optional installation of flash-attention for acceleration). For tokenization, compared to the current mainstream open-source models based on Chinese and English vocabularies, Qwen-7B-Chat uses a vocabulary of over 150K tokens. Built on the `cl100k_base` BPE vocabulary used by GPT-4, it is optimized for Chinese and multilingual text: in addition to efficient encoding and decoding of Chinese, English, and code data, it is also more friendly to multiple languages, enabling users to directly enhance the capability for some languages without expanding the vocabulary. It splits numbers into single digits, and calls the efficient [tiktoken](https://github.com/openai/tiktoken) tokenizer library for tokenization.
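As an illustration of the digit-by-digit splitting described above, here is a minimal sketch of such a pre-tokenization rule. This is not Qwen's actual tokenizer (which applies the rule inside its tiktoken-based BPE); it only shows what "segmenting numbers by single digit" means:

```python
import re

# Minimal sketch of a digit-splitting pre-tokenization rule: every digit
# becomes its own piece, while runs of non-digit characters stay intact.
# Illustrative only -- Qwen's real tokenizer does this inside tiktoken/BPE.
def split_digits(text):
    # Alternation tries a single digit first, otherwise grabs a non-digit run.
    return re.findall(r"\d|\D+", text)

print(split_digits("price: 12345 yuan"))
# ['price: ', '1', '2', '3', '4', '5', ' yuan']
```

Splitting numbers this way keeps the vocabulary small and makes numeric strings compositional, at the cost of longer token sequences for digit-heavy text.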
<br> ## 评测效果(Evaluation) 对于Qwen-7B-Chat模型,我们同样评测了常规的中文理解(C-Eval)、英文理解(MMLU)、代码(HumanEval)和数学(GSM8K)等权威任务,同时包含了长序列任务的评测结果。由于Qwen-7B-Chat模型经过对齐后,激发了较强的外部系统调用能力,我们还进行了工具使用能力方面的评测。 提示:由于硬件和框架造成的舍入误差,复现结果如有波动属于正常现象。 For Qwen-7B-Chat, we also evaluate the model on C-Eval, MMLU, HumanEval, GSM8K, etc., as well as the benchmark evaluation for long-context understanding, and tool usage. Note: Due to rounding errors caused by hardware and framework, differences in reproduced results are possible. ### 中文评测(Chinese Evaluation) #### C-Eval 在[C-Eval](https://arxiv.org/abs/2305.08322)验证集上,我们评价了Qwen-7B-Chat模型的0-shot & 5-shot准确率 We demonstrate the 0-shot & 5-shot accuracy of Qwen-7B-Chat on C-Eval validation set | Model | Avg. Acc. | |:--------------------------------:|:---------:| | LLaMA2-7B-Chat | 31.9 | | LLaMA2-13B-Chat | 36.2 | | LLaMA2-70B-Chat | 44.3 | | ChatGLM2-6B-Chat | 52.6 | | InternLM-7B-Chat | 53.6 | | Baichuan2-7B-Chat | 55.6 | | Baichuan2-13B-Chat | 56.7 | | Qwen-7B-Chat (original) (0-shot) | 54.2 | | **Qwen-7B-Chat (0-shot)** | 59.7 | | **Qwen-7B-Chat (5-shot)** | 59.3 | | **Qwen-14B-Chat (0-shot)** | 69.8 | | **Qwen-14B-Chat (5-shot)** | **71.7** | C-Eval测试集上,Qwen-7B-Chat模型的zero-shot准确率结果如下: The zero-shot accuracy of Qwen-7B-Chat on C-Eval testing set is provided below: | Model | Avg. 
| STEM | Social Sciences | Humanities | Others | | :---------------------- | :------: | :--: | :-------------: | :--------: | :----: | | Chinese-Alpaca-Plus-13B | 41.5 | 36.6 | 49.7 | 43.1 | 41.2 | | Chinese-Alpaca-2-7B | 40.3 | - | - | - | - | | ChatGLM2-6B-Chat | 50.1 | 46.4 | 60.4 | 50.6 | 46.9 | | Baichuan-13B-Chat | 51.5 | 43.7 | 64.6 | 56.2 | 49.2 | | Qwen-7B-Chat (original) | 54.6 | 47.8 | 67.6 | 59.3 | 50.6 | | **Qwen-7B-Chat** | 58.6 | 53.3 | 72.1 | 62.8 | 52.0 | | **Qwen-14B-Chat** | **69.1** | 65.1 | 80.9 | 71.2 | 63.4 | 在7B规模模型上,经过人类指令对齐的Qwen-7B-Chat模型,准确率在同类相近规模模型中仍然处于前列。 Compared with other pretrained models of comparable size, the human-aligned Qwen-7B-Chat performs well in C-Eval accuracy. ### 英文评测(English Evaluation) #### MMLU [MMLU](https://arxiv.org/abs/2009.03300)评测集上,Qwen-7B-Chat模型的0-shot & 5-shot准确率如下,效果在同类对齐模型中同样表现较优。 The 0-shot & 5-shot accuracy of Qwen-7B-Chat on MMLU is provided below. The performance of Qwen-7B-Chat remains among the best of human-aligned models with comparable size. | Model | Avg. Acc.
| |:--------------------------------:|:---------:| | ChatGLM2-6B-Chat | 46.0 | | LLaMA2-7B-Chat | 46.2 | | InternLM-7B-Chat | 51.1 | | Baichuan2-7B-Chat | 52.9 | | LLaMA2-13B-Chat | 54.6 | | Baichuan2-13B-Chat | 57.3 | | LLaMA2-70B-Chat | 63.8 | | Qwen-7B-Chat (original) (0-shot) | 53.9 | | **Qwen-7B-Chat (0-shot)** | 55.8 | | **Qwen-7B-Chat (5-shot)** | 57.0 | | **Qwen-14B-Chat (0-shot)** | 64.6 | | **Qwen-14B-Chat (5-shot)** | **66.5** | ### 代码评测(Coding Evaluation) Qwen-7B-Chat在[HumanEval](https://github.com/openai/human-eval)的zero-shot Pass@1效果如下 The zero-shot Pass@1 of Qwen-7B-Chat on [HumanEval](https://github.com/openai/human-eval) is demonstrated below | Model | Pass@1 | |:-----------------------:|:--------:| | ChatGLM2-6B-Chat | 11.0 | | LLaMA2-7B-Chat | 12.2 | | InternLM-7B-Chat | 14.6 | | Baichuan2-7B-Chat | 13.4 | | LLaMA2-13B-Chat | 18.9 | | Baichuan2-13B-Chat | 17.7 | | LLaMA2-70B-Chat | 32.3 | | Qwen-7B-Chat (original) | 24.4 | | **Qwen-7B-Chat** | 37.2 | | **Qwen-14B-Chat** | **43.9** | ### 数学评测(Mathematics Evaluation) 在评测数学能力的[GSM8K](https://github.com/openai/grade-school-math)上,Qwen-7B-Chat的准确率结果如下 The accuracy of Qwen-7B-Chat on GSM8K is shown below | Model | Acc. | |:--------------------------------:|:--------:| | LLaMA2-7B-Chat | 26.3 | | ChatGLM2-6B-Chat | 28.8 | | Baichuan2-7B-Chat | 32.8 | | InternLM-7B-Chat | 33.0 | | LLaMA2-13B-Chat | 37.1 | | Baichuan2-13B-Chat | 55.3 | | LLaMA2-70B-Chat | 59.3 | | Qwen-7B-Chat (original) (0-shot) | 41.1 | | **Qwen-7B-Chat (0-shot)** | 50.3 | | **Qwen-7B-Chat (8-shot)** | 54.1 | | **Qwen-14B-Chat (0-shot)** | **60.1** | | **Qwen-14B-Chat (8-shot)** | 59.3 | ### 长序列评测(Long-Context Understanding) 通过NTK插值,LogN注意力缩放可以扩展Qwen-7B-Chat的上下文长度。在长文本摘要数据集[VCSUM](https://arxiv.org/abs/2305.05280)上(文本平均长度在15K左右),Qwen-7B-Chat的Rouge-L结果如下: **(若要启用这些技巧,请将config.json里的`use_dynamic_ntk`和`use_logn_attn`设置为true)** We introduce NTK-aware interpolation, LogN attention scaling to extend the context length of Qwen-7B-Chat. 
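The NTK-aware interpolation mentioned above is commonly implemented by stretching the RoPE frequency base at inference time. The sketch below shows one common formulation of that rescaling — it is illustrative only, not Qwen's exact remote code, and the values `rope_base=10000`, `head_dim=128` are assumptions rather than numbers read from the model's config:

```python
# Rough sketch of NTK-aware RoPE base rescaling (a common formulation,
# not necessarily Qwen's exact implementation). Stretching the base
# interpolates low-frequency components while leaving high-frequency
# components nearly unchanged, so no fine-tuning is needed.
def ntk_scaled_base(rope_base, head_dim, scale):
    # scale = (target context length) / (training context length)
    return rope_base * scale ** (head_dim / (head_dim - 2))

# Assumed values: base 10000, head dim 128, extending the context 4x.
print(ntk_scaled_base(10000.0, 128, 4.0))  # ~40890, i.e. the base grows ~4x
```

In Qwen's config, this behavior is toggled with `use_dynamic_ntk`, while `use_logn_attn` enables the complementary LogN attention scaling.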
The Rouge-L results of Qwen-7B-Chat on the long-text summarization dataset [VCSUM](https://arxiv.org/abs/2305.05280) (the average text length of this dataset is around 15K) are shown below: **(To use these tricks, please set `use_dynamic_ntk` and `use_logn_attn` to true in config.json.)** | Model | VCSUM (zh) | |:------------------|:----------:| | GPT-3.5-Turbo-16k | 16.0 | | LLaMA2-7B-Chat | 0.2 | | InternLM-7B-Chat | 13.0 | | ChatGLM2-6B-Chat | 16.3 | | **Qwen-7B-Chat** | **16.6** | ### 工具使用能力的评测(Tool Usage) #### ReAct Prompting 千问支持通过 [ReAct Prompting](https://arxiv.org/abs/2210.03629) 调用插件/工具/API。ReAct 也是 [LangChain](https://python.langchain.com/) 框架采用的主要方式之一。在我们开源的、用于评估工具使用能力的评测基准上,千问的表现如下: Qwen-Chat supports calling plugins/tools/APIs through [ReAct Prompting](https://arxiv.org/abs/2210.03629). ReAct is also one of the main approaches used by the [LangChain](https://python.langchain.com/) framework. In our evaluation benchmark for assessing tool usage capabilities, Qwen-Chat's performance is as follows: <table> <tr> <th colspan="4" align="center">Chinese Tool-Use Benchmark</th> </tr> <tr> <th align="center">Model</th><th align="center">Tool Selection (Acc.↑)</th><th align="center">Tool Input (Rouge-L↑)</th><th align="center">False Positive Error↓</th> </tr> <tr> <td>GPT-4</td><td align="center">95%</td><td align="center">0.90</td><td align="center">15.0%</td> </tr> <tr> <td>GPT-3.5</td><td align="center">85%</td><td align="center">0.88</td><td align="center">75.0%</td> </tr> <tr> <td>Qwen-7B-Chat</td><td align="center">98%</td><td align="center">0.91</td><td align="center">7.3%</td> </tr> <tr> <td>Qwen-14B-Chat</td><td align="center">98%</td><td align="center">0.93</td><td align="center">2.4%</td> </tr> </table> > 评测基准中出现的插件均没有出现在千问的训练集中。该基准评估了模型在多个候选插件中选择正确插件的准确率、传入插件的参数的合理性、以及假阳率。假阳率(False Positive)定义:在处理不该调用插件的请求时,错误地调用了插件。 > The plugins that appear in the evaluation set do not appear in the training set of Qwen.
This benchmark evaluates the accuracy of the model in selecting the correct plugin from multiple candidate plugins, the rationality of the parameters passed into the plugin, and the false positive rate. False Positive: Incorrectly invoking a plugin when it should not have been called when responding to a query. ![](assets/react_showcase_001.png) ![](assets/react_showcase_002.png) #### Code Interpreter 为了考察Qwen使用Python Code Interpreter完成数学解题、数据可视化、及文件处理与爬虫等任务的能力,我们专门建设并开源了一个评测这方面能力的[评测基准](https://github.com/QwenLM/Qwen-Agent/tree/main/benchmark)。 我们发现Qwen在生成代码的可执行率、结果正确性上均表现较好: To assess Qwen's ability to use the Python Code Interpreter for tasks such as mathematical problem solving, data visualization, and other general-purpose tasks such as file handling and web scraping, we have created and open-sourced a benchmark specifically designed for evaluating these capabilities. You can find the benchmark at this [link](https://github.com/QwenLM/Qwen-Agent/tree/main/benchmark). We have observed that Qwen performs well in terms of code executability and result accuracy when generating code: <table> <tr> <th colspan="4" align="center">Executable Rate of Generated Code (%)</th> </tr> <tr> <th align="center">Model</th><th align="center">Math↑</th><th align="center">Visualization↑</th><th align="center">General↑</th> </tr> <tr> <td>GPT-4</td><td align="center">91.9</td><td align="center">85.9</td><td align="center">82.8</td> </tr> <tr> <td>GPT-3.5</td><td align="center">89.2</td><td align="center">65.0</td><td align="center">74.1</td> </tr> <tr> <td>LLaMA2-7B-Chat</td> <td align="center">41.9</td> <td align="center">33.1</td> <td align="center">24.1 </td> </tr> <tr> <td>LLaMA2-13B-Chat</td> <td align="center">50.0</td> <td align="center">40.5</td> <td align="center">48.3 </td> </tr> <tr> <td>CodeLLaMA-7B-Instruct</td> <td align="center">85.1</td> <td align="center">54.0</td> <td align="center">70.7 </td> </tr> <tr> <td>CodeLLaMA-13B-Instruct</td> <td align="center">93.2</td> 
<td align="center">55.8</td> <td align="center">74.1 </td> </tr> <tr> <td>InternLM-7B-Chat</td> <td align="center">78.4</td> <td align="center">44.2</td> <td align="center">62.1 </td> </tr> <tr> <td>InternLM-20B-Chat</td> <td align="center">70.3</td> <td align="center">44.2</td> <td align="center">65.5 </td> </tr> <tr> <td>Qwen-7B-Chat</td> <td align="center">82.4</td> <td align="center">64.4</td> <td align="center">67.2 </td> </tr> <tr> <td>Qwen-14B-Chat</td> <td align="center">89.2</td> <td align="center">84.1</td> <td align="center">65.5</td> </tr> </table> <table> <tr> <th colspan="4" align="center">Accuracy of Code Execution Results (%)</th> </tr> <tr> <th align="center">Model</th><th align="center">Math↑</th><th align="center">Visualization-Hard↑</th><th align="center">Visualization-Easy↑</th> </tr> <tr> <td>GPT-4</td><td align="center">82.8</td><td align="center">66.7</td><td align="center">60.8</td> </tr> <tr> <td>GPT-3.5</td><td align="center">47.3</td><td align="center">33.3</td><td align="center">55.7</td> </tr> <tr> <td>LLaMA2-7B-Chat</td> <td align="center">3.9</td> <td align="center">14.3</td> <td align="center">39.2 </td> </tr> <tr> <td>LLaMA2-13B-Chat</td> <td align="center">8.3</td> <td align="center">8.3</td> <td align="center">40.5 </td> </tr> <tr> <td>CodeLLaMA-7B-Instruct</td> <td align="center">14.3</td> <td align="center">26.2</td> <td align="center">60.8 </td> </tr> <tr> <td>CodeLLaMA-13B-Instruct</td> <td align="center">28.2</td> <td align="center">27.4</td> <td align="center">62.0 </td> </tr> <tr> <td>InternLM-7B-Chat-v1.1</td> <td align="center">28.5</td> <td align="center">4.8</td> <td align="center">40.5 </td> </tr> <tr> <td>InternLM-20B-Chat</td> <td align="center">34.6</td> <td align="center">21.4</td> <td align="center">45.6 </td> </tr> <tr> <td>Qwen-7B-Chat</td> <td align="center">41.9</td> <td align="center">40.5</td> <td align="center">54.4 </td> </tr> <tr> <td>Qwen-14B-Chat</td> <td align="center">58.4</td> <td 
align="center">53.6</td> <td align="center">59.5</td> </tr> </table> <p align="center"> <br> <img src="assets/code_interpreter_showcase_001.jpg" /> <br> <p> #### Huggingface Agent 千问还具备作为 [HuggingFace Agent](https://huggingface.co/docs/transformers/transformers_agents) 的能力。它在 Huggingface 提供的run模式评测基准上的表现如下: Qwen-Chat also has the capability to be used as a [HuggingFace Agent](https://huggingface.co/docs/transformers/transformers_agents). Its performance on the run-mode benchmark provided by HuggingFace is as follows: <table> <tr> <th colspan="4" align="center">HuggingFace Agent Benchmark- Run Mode</th> </tr> <tr> <th align="center">Model</th><th align="center">Tool Selection↑</th><th align="center">Tool Used↑</th><th align="center">Code↑</th> </tr> <tr> <td>GPT-4</td><td align="center">100</td><td align="center">100</td><td align="center">97.4</td> </tr> <tr> <td>GPT-3.5</td><td align="center">95.4</td><td align="center">96.3</td><td align="center">87.0</td> </tr> <tr> <td>StarCoder-Base-15B</td><td align="center">86.1</td><td align="center">87.0</td><td align="center">68.9</td> </tr> <tr> <td>StarCoder-15B</td><td align="center">87.0</td><td align="center">88.0</td><td align="center">68.9</td> </tr> <tr> <td>Qwen-7B-Chat</td><td align="center">87.0</td><td align="center">87.0</td><td align="center">71.5</td> </tr> <tr> <td>Qwen-14B-Chat</td><td align="center">93.5</td><td align="center">94.4</td><td align="center">87.0</td> </tr> </table> <table> <tr> <th colspan="4" align="center">HuggingFace Agent Benchmark - Chat Mode</th> </tr> <tr> <th align="center">Model</th><th align="center">Tool Selection↑</th><th align="center">Tool Used↑</th><th align="center">Code↑</th> </tr> <tr> <td>GPT-4</td><td align="center">97.9</td><td align="center">97.9</td><td align="center">98.5</td> </tr> <tr> <td>GPT-3.5</td><td align="center">97.3</td><td align="center">96.8</td><td align="center">89.6</td> </tr> <tr> <td>StarCoder-Base-15B</td><td align="center">97.9</td><td 
align="center">97.9</td><td align="center">91.1</td> </tr> <tr> <td>StarCoder-15B</td><td align="center">97.9</td><td align="center">97.9</td><td align="center">89.6</td> </tr> <tr> <td>Qwen-7B-Chat</td><td align="center">94.7</td><td align="center">94.7</td><td align="center">85.1</td> </tr> <tr> <td>Qwen-14B-Chat</td><td align="center">97.9</td><td align="center">97.9</td><td align="center">95.5</td> </tr> </table> <br> ## FAQ 如遇到问题,敬请查阅[FAQ](https://github.com/QwenLM/Qwen/blob/main/FAQ_zh.md)以及issue区,如仍无法解决再提交issue。 If you meet problems, please refer to [FAQ](https://github.com/QwenLM/Qwen/blob/main/FAQ.md) and the issues first to search a solution before you launch a new issue. <br> ## 引用 (Citation) 如果你觉得我们的工作对你有帮助,欢迎引用! If you find our work helpful, feel free to give us a cite. ``` @article{qwen, title={Qwen Technical Report}, author={Jinze Bai and Shuai Bai and Yunfei Chu and Zeyu Cui and Kai Dang and Xiaodong Deng and Yang Fan and Wenbin Ge and Yu Han and Fei Huang and Binyuan Hui and Luo Ji and Mei Li and Junyang Lin and Runji Lin and Dayiheng Liu and Gao Liu and Chengqiang Lu and Keming Lu and Jianxin Ma and Rui Men and Xingzhang Ren and Xuancheng Ren and Chuanqi Tan and Sinan Tan and Jianhong Tu and Peng Wang and Shijie Wang and Wei Wang and Shengguang Wu and Benfeng Xu and Jin Xu and An Yang and Hao Yang and Jian Yang and Shusheng Yang and Yang Yao and Bowen Yu and Hongyi Yuan and Zheng Yuan and Jianwei Zhang and Xingxuan Zhang and Yichang Zhang and Zhenru Zhang and Chang Zhou and Jingren Zhou and Xiaohuan Zhou and Tianhang Zhu}, journal={arXiv preprint arXiv:2309.16609}, year={2023} } ``` <br> ## 使用协议(License Agreement) 我们的代码和模型权重对学术研究完全开放,并支持商用。请查看[LICENSE](https://github.com/QwenLM/Qwen/blob/main/LICENSE)了解具体的开源协议细节。如需商用,请填写[问卷](https://dashscope.console.aliyun.com/openModelApply/qianwen)申请。 Our code and checkpoints are open to research purpose, and they are allowed for commercial purposes. 
Check [LICENSE](https://github.com/QwenLM/Qwen/blob/main/LICENSE) for more details about the license. If you have requirements for commercial use, please fill out the [form](https://dashscope.console.aliyun.com/openModelApply/qianwen) to apply. <br> ## 联系我们(Contact Us) 如果你想给我们的研发团队和产品团队留言,欢迎加入我们的微信群、钉钉群以及Discord!同时,也欢迎通过邮件(qianwen_opensource@alibabacloud.com)联系我们。 If you would like to leave a message for our research or product team, join our Discord, WeChat, or DingTalk groups! Also, feel free to send an email to qianwen_opensource@alibabacloud.com.
26,828
[ [ -0.033843994140625, -0.0535888671875, 0.00814056396484375, 0.0211181640625, -0.0214691162109375, -0.005558013916015625, -0.003627777099609375, -0.036529541015625, -0.0030231475830078125, 0.01129913330078125, -0.031036376953125, -0.036407470703125, -0.0369873046875, -0.00667572021484375, -0.0191192626953125, 0.058685302734375, 0.0137176513671875, -0.00841522216796875, 0.01442718505859375, -0.0234222412109375, -0.0258941650390625, -0.02984619140625, -0.06689453125, -0.003894805908203125, 0.0214385986328125, 0.005184173583984375, 0.06097412109375, 0.0440673828125, 0.03424072265625, 0.0292510986328125, -0.00450897216796875, -0.006412506103515625, -0.031463623046875, -0.0132598876953125, 0.016021728515625, -0.032257080078125, -0.052734375, 0.0012454986572265625, 0.04791259765625, 0.00820159912109375, 0.0031223297119140625, 0.0216522216796875, 0.00684356689453125, 0.040130615234375, -0.039093017578125, 0.005565643310546875, -0.02392578125, -0.0016088485717773438, -0.0209503173828125, -0.0003044605255126953, -0.0047149658203125, -0.0261688232421875, 0.0175018310546875, -0.048187255859375, 0.01629638671875, 0.020233154296875, 0.08642578125, 0.01285552978515625, -0.043975830078125, 0.004756927490234375, -0.026824951171875, 0.061309814453125, -0.0849609375, 0.01483917236328125, 0.0158538818359375, 0.026824951171875, -0.01096343994140625, -0.0716552734375, -0.0550537109375, -0.00897979736328125, -0.01861572265625, 0.01392364501953125, -0.0241851806640625, 0.01556396484375, 0.0450439453125, 0.02764892578125, -0.049530029296875, -0.0036907196044921875, -0.02838134765625, -0.0275421142578125, 0.05487060546875, 0.00839996337890625, 0.036224365234375, -0.0206298828125, -0.02484130859375, -0.0121612548828125, -0.035400390625, 0.018096923828125, 0.015777587890625, -0.01236724853515625, -0.0322265625, 0.0158843994140625, -0.0250701904296875, 0.0295257568359375, 0.0261383056640625, -0.019561767578125, 0.03411865234375, -0.02886962890625, -0.031158447265625, -0.01123809814453125, 
0.1002197265625, 0.035797119140625, -0.001495361328125, 0.0097808837890625, -0.00841522216796875, -0.00922393798828125, -0.0159454345703125, -0.08685302734375, -0.036651611328125, 0.047760009765625, -0.046661376953125, -0.019073486328125, -0.0017709732055664062, -0.032318115234375, 0.008056640625, 0.01435089111328125, 0.050537109375, -0.055694580078125, -0.049163818359375, -0.007740020751953125, -0.018402099609375, 0.021484375, 0.021148681640625, -0.062744140625, 0.00807952880859375, 0.0220794677734375, 0.062103271484375, -0.004116058349609375, -0.029571533203125, -0.0262908935546875, -0.00012087821960449219, -0.010711669921875, 0.033111572265625, 0.00182342529296875, -0.0229644775390625, -0.015472412109375, 0.00872802734375, -0.01357269287109375, -0.03173828125, 0.031890869140625, -0.03631591796875, 0.03619384765625, -0.0179290771484375, -0.04290771484375, -0.0243682861328125, 0.009002685546875, -0.042572021484375, 0.0950927734375, 0.019256591796875, -0.07379150390625, 0.0130462646484375, -0.049102783203125, -0.017791748046875, 0.01010894775390625, 0.0018911361694335938, -0.03857421875, -0.01209259033203125, 0.017181396484375, 0.034820556640625, -0.0168304443359375, 0.0164947509765625, -0.0174102783203125, -0.04351806640625, 0.0250244140625, -0.038360595703125, 0.090576171875, 0.020172119140625, -0.062744140625, 0.030426025390625, -0.05511474609375, 0.01544952392578125, 0.01540374755859375, -0.0032196044921875, -0.006137847900390625, -0.0158538818359375, 0.003986358642578125, 0.031768798828125, 0.02734375, -0.0299835205078125, 0.0052642822265625, -0.04644775390625, 0.0582275390625, 0.046905517578125, -0.006618499755859375, 0.042938232421875, -0.02728271484375, 0.029571533203125, 0.0281219482421875, 0.034820556640625, -0.0177459716796875, -0.04461669921875, -0.06536865234375, -0.01274871826171875, 0.0291900634765625, 0.035888671875, -0.06689453125, 0.035400390625, 0.0006551742553710938, -0.036651611328125, -0.047821044921875, -0.0106201171875, 0.044281005859375, 
0.035308837890625, 0.0340576171875, -0.00910186767578125, -0.0245513916015625, -0.058685302734375, -0.004802703857421875, -0.016998291015625, -0.0044097900390625, 0.019683837890625, 0.039764404296875, -0.00682830810546875, 0.060028076171875, -0.0240020751953125, -0.00203704833984375, -0.01236724853515625, -0.00018513202667236328, 0.0181732177734375, 0.05487060546875, 0.068359375, -0.059478759765625, -0.0560302734375, -0.0165863037109375, -0.050628662109375, 0.0029544830322265625, -0.00580596923828125, -0.036865234375, 0.0308380126953125, 0.0103302001953125, -0.059478759765625, 0.04608154296875, 0.039703369140625, -0.040008544921875, 0.059967041015625, -0.00481414794921875, 0.01494598388671875, -0.0821533203125, 0.009857177734375, -0.0026397705078125, -0.0137786865234375, -0.039825439453125, 0.0017347335815429688, 0.02203369140625, 0.01531219482421875, -0.04852294921875, 0.06829833984375, -0.041107177734375, 0.005519866943359375, -0.011566162109375, 0.012664794921875, 0.01201629638671875, 0.05010986328125, -0.015625, 0.060760498046875, 0.054779052734375, -0.05322265625, 0.03204345703125, 0.0220184326171875, -0.01052093505859375, 0.01441192626953125, -0.06097412109375, -0.0009703636169433594, 0.0163421630859375, 0.0177001953125, -0.07958984375, -0.006549835205078125, 0.038116455078125, -0.05535888671875, 0.0240936279296875, -0.01236724853515625, -0.0249176025390625, -0.03857421875, -0.03375244140625, 0.023956298828125, 0.0572509765625, -0.0367431640625, 0.03558349609375, 0.016815185546875, 0.01483917236328125, -0.049957275390625, -0.03564453125, -0.011932373046875, -0.03057861328125, -0.061065673828125, 0.0450439453125, -0.01544952392578125, -0.00025272369384765625, -0.0029697418212890625, 0.001567840576171875, 0.006618499755859375, 0.00310516357421875, 0.01294708251953125, 0.026824951171875, -0.0200958251953125, -0.01082611083984375, 0.007564544677734375, -0.0078125, 0.00652313232421875, -0.0156097412109375, 0.03875732421875, -0.010040283203125, 
-0.003131866455078125, -0.062164306640625, 0.01367950439453125, 0.036865234375, -0.0117340087890625, 0.06060791015625, 0.085693359375, -0.012115478515625, 0.0065155029296875, -0.04864501953125, -0.0179443359375, -0.043853759765625, 0.0213775634765625, -0.02752685546875, -0.06939697265625, 0.0555419921875, 0.01488494873046875, 0.0201873779296875, 0.05767822265625, 0.03955078125, -0.004055023193359375, 0.093994140625, 0.0340576171875, -0.013427734375, 0.05181884765625, -0.040435791015625, 0.0035762786865234375, -0.0660400390625, 0.0018625259399414062, -0.01190948486328125, -0.022918701171875, -0.058380126953125, -0.0191650390625, 0.0258026123046875, 0.020477294921875, -0.0479736328125, 0.02752685546875, -0.047637939453125, -0.00542449951171875, 0.0631103515625, -0.001819610595703125, 0.0008802413940429688, -0.025146484375, -0.00344085693359375, 0.0098876953125, -0.0675048828125, -0.02239990234375, 0.07464599609375, 0.0271148681640625, 0.0245513916015625, 0.003574371337890625, 0.044036865234375, -0.0033092498779296875, 0.02349853515625, -0.040374755859375, 0.03790283203125, 0.013031005859375, -0.046051025390625, -0.030303955078125, -0.04669189453125, -0.06634521484375, 0.039154052734375, -0.007686614990234375, -0.06341552734375, 0.015777587890625, 0.0181884765625, -0.048065185546875, 0.0160675048828125, -0.06402587890625, 0.08319091796875, -0.01953125, -0.0341796875, -0.0008144378662109375, -0.059326171875, 0.035797119140625, 0.0301055908203125, 0.00980377197265625, -0.022613525390625, 0.0171356201171875, 0.070556640625, -0.044464111328125, 0.0477294921875, -0.0255889892578125, 0.004627227783203125, 0.040985107421875, -0.005107879638671875, 0.03558349609375, 0.012298583984375, 0.01056671142578125, 0.021881103515625, 0.034759521484375, -0.0400390625, -0.040802001953125, 0.04827880859375, -0.06640625, -0.0482177734375, -0.034423828125, -0.0310516357421875, 0.0019931793212890625, 0.0205535888671875, 0.043212890625, 0.03668212890625, 0.00969696044921875, 
0.0038166046142578125, 0.038543701171875, -0.031463623046875, 0.05914306640625, 0.03668212890625, -0.024932861328125, -0.034759521484375, 0.060516357421875, 0.0105133056640625, 0.023345947265625, 0.0097808837890625, 0.01096343994140625, -0.0236968994140625, -0.037078857421875, -0.051361083984375, 0.0219268798828125, -0.0236663818359375, -0.02947998046875, -0.05303955078125, -0.0271453857421875, -0.055389404296875, 0.01531219482421875, -0.02777099609375, -0.0330810546875, -0.0299072265625, 0.004364013671875, 0.04376220703125, 0.01004791259765625, 0.002960205078125, 0.04095458984375, -0.0748291015625, 0.022369384765625, 0.0117645263671875, 0.0128936767578125, 0.021392822265625, -0.05218505859375, -0.0341796875, 0.0207977294921875, -0.035797119140625, -0.0550537109375, 0.051116943359375, 0.00980377197265625, 0.040557861328125, 0.040283203125, 0.01430511474609375, 0.05035400390625, -0.0133056640625, 0.06243896484375, 0.0145416259765625, -0.0721435546875, 0.0309600830078125, -0.036346435546875, 0.0303802490234375, 0.0157470703125, 0.0271148681640625, -0.034271240234375, -0.025177001953125, -0.06744384765625, -0.06427001953125, 0.06591796875, 0.035491943359375, 0.005519866943359375, 0.002613067626953125, 0.0281829833984375, -0.0241851806640625, 0.0207366943359375, -0.04754638671875, -0.0423583984375, -0.0250091552734375, -0.01268768310546875, 0.0198974609375, -0.0177001953125, 0.00412750244140625, -0.036376953125, 0.049530029296875, -0.00751495361328125, 0.046661376953125, 0.01898193359375, -0.00418853759765625, -0.0009784698486328125, -0.01064300537109375, 0.026519775390625, 0.04290771484375, -0.01462554931640625, -0.01242828369140625, 0.0125732421875, -0.0478515625, -0.00983428955078125, 0.00760650634765625, -0.01885986328125, -0.0018301010131835938, 0.0210113525390625, 0.07550048828125, 0.0165557861328125, -0.03436279296875, 0.03887939453125, -0.018585205078125, -0.0250244140625, -0.01861572265625, 0.025909423828125, 0.01352691650390625, 0.0325927734375, 
0.038848876953125, -0.0211181640625, 0.0172882080078125, -0.040252685546875, 0.0037593841552734375, 0.0305328369140625, -0.012298583984375, -0.033172607421875, 0.0675048828125, 0.012939453125, -0.004886627197265625, 0.04962158203125, -0.0304107666015625, -0.055572509765625, 0.058319091796875, 0.038116455078125, 0.05633544921875, -0.0257415771484375, 0.00946807861328125, 0.05303955078125, 0.0030612945556640625, -0.007564544677734375, 0.0300750732421875, -0.002841949462890625, -0.0546875, -0.025787353515625, -0.045257568359375, -0.0163421630859375, 0.0008063316345214844, -0.042938232421875, 0.0064544677734375, -0.0254364013671875, -0.03375244140625, -0.002582550048828125, 0.0205535888671875, -0.047271728515625, 0.0291595458984375, -0.00527191162109375, 0.056396484375, -0.0278778076171875, 0.0797119140625, 0.0287322998046875, -0.0305023193359375, -0.07080078125, -0.0126953125, -0.0078277587890625, -0.049530029296875, 0.035186767578125, 0.01541900634765625, 0.014007568359375, 0.024688720703125, -0.04376220703125, -0.08123779296875, 0.107177734375, 0.005218505859375, -0.04278564453125, -0.0110321044921875, -0.009796142578125, 0.032989501953125, -0.008453369140625, 0.050628662109375, 0.045867919921875, 0.0268096923828125, 0.01142120361328125, -0.08416748046875, 0.025543212890625, -0.031005859375, 0.0016489028930664062, 0.01331329345703125, -0.0875244140625, 0.084228515625, -0.0119781494140625, -0.0263824462890625, -0.0006117820739746094, 0.07110595703125, 0.021453857421875, 0.0192108154296875, 0.022705078125, 0.02838134765625, 0.044677734375, -0.0131072998046875, 0.06317138671875, -0.04632568359375, 0.042755126953125, 0.05194091796875, 0.006732940673828125, 0.04888916015625, 0.01129913330078125, -0.03765869140625, 0.0270233154296875, 0.05084228515625, -0.0038318634033203125, 0.0282745361328125, -0.00830078125, -0.0208282470703125, -0.00180816650390625, 0.0268707275390625, -0.037017822265625, 0.010009765625, 0.029754638671875, -0.00786590576171875, 0.00806427001953125, 
0.00807952880859375, 0.0094757080078125, -0.04669189453125, -0.017913818359375, 0.0506591796875, 0.01453399658203125, -0.04052734375, 0.06719970703125, 0.00836181640625, 0.08160400390625, -0.038665771484375, 0.004848480224609375, -0.0198516845703125, 0.0040740966796875, -0.02105712890625, -0.043304443359375, 0.01403045654296875, -0.0234527587890625, 0.007396697998046875, 0.0096282958984375, 0.059814453125, -0.024688720703125, -0.0155181884765625, 0.0219879150390625, 0.02685546875, 0.013153076171875, -0.0226287841796875, -0.076171875, 0.00685882568359375, 0.0103759765625, -0.050872802734375, 0.046356201171875, 0.04241943359375, -0.0037593841552734375, 0.0540771484375, 0.05157470703125, -0.01496124267578125, 0.0143280029296875, -0.004547119140625, 0.07147216796875, -0.0596923828125, -0.0369873046875, -0.0697021484375, 0.048553466796875, -0.01494598388671875, -0.0447998046875, 0.0753173828125, 0.034576416015625, 0.05816650390625, 0.0188446044921875, 0.054107666015625, -0.0254669189453125, 0.0283355712890625, -0.0245513916015625, 0.06536865234375, -0.03302001953125, 0.0003409385681152344, -0.021575927734375, -0.056060791015625, -0.0010595321655273438, 0.05950927734375, -0.0248565673828125, 0.0206298828125, 0.05084228515625, 0.053802490234375, 0.00897979736328125, -0.00774383544921875, 0.03253173828125, 0.0284271240234375, 0.01776123046875, 0.058380126953125, 0.03753662109375, -0.07171630859375, 0.046722412109375, -0.04248046875, -0.01441192626953125, -0.02679443359375, -0.04400634765625, -0.08001708984375, -0.04327392578125, -0.036590576171875, -0.044189453125, -0.00971221923828125, 0.0550537109375, 0.049102783203125, -0.055938720703125, -0.00960540771484375, 0.007030487060546875, 0.0012149810791015625, -0.0205535888671875, -0.0244140625, 0.04937744140625, -0.0147857666015625, -0.0672607421875, -0.003021240234375, 0.01143646240234375, 0.0177001953125, -0.02392578125, -0.00559234619140625, -0.009521484375, -0.007015228271484375, 0.038909912109375, 0.0256500244140625, 
-0.0438232421875, -0.01165771484375, 0.00989532470703125, -0.0207977294921875, 0.016510009765625, 0.023162841796875, -0.0545654296875, 0.0038280487060546875, 0.035369873046875, 0.002960205078125, 0.055511474609375, 0.0007886886596679688, 0.036285400390625, -0.027862548828125, 0.02386474609375, 0.00930023193359375, 0.0255889892578125, 0.00974273681640625, -0.031768798828125, 0.0205535888671875, 0.0182952880859375, -0.049957275390625, -0.058990478515625, -0.01105499267578125, -0.0697021484375, -0.011260986328125, 0.08544921875, -0.0301361083984375, -0.04095458984375, -0.0019626617431640625, -0.03399658203125, 0.038330078125, -0.036224365234375, 0.04986572265625, 0.035980224609375, -0.0104827880859375, -0.0299835205078125, -0.052398681640625, 0.055419921875, 0.02301025390625, -0.038299560546875, -0.01303863525390625, 0.0244140625, 0.03118896484375, -0.00186920166015625, 0.0682373046875, 0.004627227783203125, 0.03741455078125, 0.00598907470703125, 0.01140594482421875, -0.0077667236328125, 0.0112152099609375, -0.005176544189453125, -0.0031261444091796875, -0.01409912109375, -0.0181884765625 ] ]
timm/vit_tiny_patch16_224.augreg_in21k
2023-05-06T00:29:58.000Z
[ "timm", "pytorch", "safetensors", "image-classification", "dataset:imagenet-21k", "arxiv:2106.10270", "arxiv:2010.11929", "license:apache-2.0", "region:us" ]
image-classification
timm
null
null
timm/vit_tiny_patch16_224.augreg_in21k
1
9,650
timm
2022-12-22T07:55:55
--- tags: - image-classification - timm library_name: timm license: apache-2.0 datasets: - imagenet-21k --- # Model card for vit_tiny_patch16_224.augreg_in21k A Vision Transformer (ViT) image classification model. Trained on ImageNet-21k (with additional augmentation and regularization) in JAX by paper authors, ported to PyTorch by Ross Wightman. ## Model Details - **Model Type:** Image classification / feature backbone - **Model Stats:** - Params (M): 9.7 - GMACs: 1.1 - Activations (M): 4.1 - Image size: 224 x 224 - **Papers:** - How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers: https://arxiv.org/abs/2106.10270 - An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2 - **Dataset:** ImageNet-21k - **Original:** https://github.com/google-research/vision_transformer ## Model Usage ### Image Classification ```python from urllib.request import urlopen from PIL import Image import timm import torch img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model('vit_tiny_patch16_224.augreg_in21k', pretrained=True) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1 top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5) ``` ### Image Embeddings ```python from urllib.request import urlopen from PIL import Image import timm img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model( 'vit_tiny_patch16_224.augreg_in21k', pretrained=True, num_classes=0, # remove classifier nn.Linear ) model = model.eval() # get model specific transforms 
(normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor # or equivalently (without needing to set num_classes=0) output = model.forward_features(transforms(img).unsqueeze(0)) # output is unpooled, a (1, 197, 192) shaped tensor output = model.forward_head(output, pre_logits=True) # output is a (1, num_features) shaped tensor ``` ## Model Comparison Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results). ## Citation ```bibtex @article{steiner2021augreg, title={How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers}, author={Steiner, Andreas and Kolesnikov, Alexander and Zhai, Xiaohua and Wightman, Ross and Uszkoreit, Jakob and Beyer, Lucas}, journal={arXiv preprint arXiv:2106.10270}, year={2021} } ``` ```bibtex @article{dosovitskiy2020vit, title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale}, author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil}, journal={ICLR}, year={2021} } ``` ```bibtex @misc{rw2019timm, author = {Ross Wightman}, title = {PyTorch Image Models}, year = {2019}, publisher = {GitHub}, journal = {GitHub repository}, doi = {10.5281/zenodo.4414861}, howpublished = {\url{https://github.com/huggingface/pytorch-image-models}} } ```
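The `(1, 197, 192)` unpooled shape in the embeddings example follows directly from the model's patch geometry. A minimal arithmetic sketch — plain Python, no timm install needed; the numbers are taken from the model stats above:

```python
# Sketch: where the (1, 197, 192) unpooled tensor shape comes from.
# Assumes a standard ViT patch embedding with a single class token.
img_size, patch_size, embed_dim = 224, 16, 192

num_patches = (img_size // patch_size) ** 2  # 14 * 14 = 196 patch tokens
seq_len = num_patches + 1                    # plus one [CLS] token

print(seq_len, embed_dim)  # 197 192
```

Pooling over this 197-token sequence (or taking the class token) is what `forward_head(..., pre_logits=True)` does to produce the `(1, num_features)` output.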
3,799
[ [ -0.039337158203125, -0.0303497314453125, -0.00014257431030273438, 0.003841400146484375, -0.0255889892578125, -0.026275634765625, -0.0247344970703125, -0.036712646484375, 0.01422119140625, 0.021026611328125, -0.03875732421875, -0.033203125, -0.045166015625, 0.001056671142578125, -0.01038360595703125, 0.0723876953125, -0.0076141357421875, 0.003986358642578125, -0.0158233642578125, -0.0335693359375, -0.0242767333984375, -0.01453399658203125, -0.04766845703125, -0.031524658203125, 0.0289459228515625, 0.01500701904296875, 0.04425048828125, 0.050933837890625, 0.05926513671875, 0.0335693359375, -0.00862884521484375, 0.01393890380859375, -0.0218658447265625, -0.0181427001953125, 0.021148681640625, -0.0487060546875, -0.0281524658203125, 0.01593017578125, 0.054718017578125, 0.0304412841796875, 0.01055908203125, 0.0298309326171875, 0.01064300537109375, 0.034454345703125, -0.0245208740234375, 0.0157012939453125, -0.04058837890625, 0.01751708984375, -0.00276947021484375, -0.001338958740234375, -0.023895263671875, -0.025238037109375, 0.01502227783203125, -0.03790283203125, 0.04779052734375, -0.0007662773132324219, 0.10198974609375, 0.021453857421875, 0.0015382766723632812, 0.015380859375, -0.031524658203125, 0.057037353515625, -0.04461669921875, 0.0275115966796875, 0.0115203857421875, 0.01434326171875, 0.0053863525390625, -0.0791015625, -0.049713134765625, -0.01146697998046875, -0.0197601318359375, 0.00434112548828125, -0.0241241455078125, 0.0185546875, 0.036468505859375, 0.045928955078125, -0.037750244140625, 0.00104522705078125, -0.043121337890625, -0.0223846435546875, 0.04327392578125, 0.0005192756652832031, 0.016571044921875, -0.0158843994140625, -0.041412353515625, -0.044464111328125, -0.0234832763671875, 0.0196685791015625, 0.02099609375, 0.007274627685546875, -0.0386962890625, 0.04345703125, 0.004657745361328125, 0.0521240234375, 0.017547607421875, -0.0159454345703125, 0.048858642578125, -0.01226806640625, -0.02728271484375, -0.0213775634765625, 0.08282470703125, 
0.03460693359375, 0.0266265869140625, -0.0005974769592285156, -0.0146026611328125, -0.007843017578125, 0.003261566162109375, -0.08233642578125, -0.03302001953125, 0.005168914794921875, -0.035125732421875, -0.0305023193359375, 0.0235748291015625, -0.05029296875, -0.010894775390625, -0.00800323486328125, 0.059814453125, -0.03131103515625, -0.00986480712890625, 0.004253387451171875, -0.0115814208984375, 0.0328369140625, 0.02081298828125, -0.045745849609375, 0.00768280029296875, 0.0160675048828125, 0.07891845703125, 0.004634857177734375, -0.034271240234375, -0.0217437744140625, -0.032257080078125, -0.0239105224609375, 0.04010009765625, -0.0030231475830078125, -0.00868988037109375, -0.01342010498046875, 0.0294647216796875, -0.0204925537109375, -0.043212890625, 0.024749755859375, -0.01439666748046875, 0.026275634765625, 0.01253509521484375, -0.01366424560546875, -0.02978515625, 0.0202178955078125, -0.032257080078125, 0.090087890625, 0.0224151611328125, -0.067138671875, 0.0296630859375, -0.034088134765625, -0.004718780517578125, -0.01012420654296875, -0.00014531612396240234, -0.08343505859375, 0.007083892822265625, 0.02520751953125, 0.043182373046875, -0.0157623291015625, -0.004222869873046875, -0.03369140625, -0.02587890625, 0.02728271484375, -0.0209197998046875, 0.06671142578125, 0.001544952392578125, -0.02197265625, 0.0213470458984375, -0.04498291015625, 0.00775909423828125, 0.031585693359375, -0.0187530517578125, -0.0012607574462890625, -0.048828125, 0.01226043701171875, 0.0171051025390625, 0.01776123046875, -0.04888916015625, 0.031951904296875, -0.0271759033203125, 0.032928466796875, 0.0494384765625, -0.004566192626953125, 0.031494140625, -0.026458740234375, 0.0229339599609375, 0.015472412109375, 0.0294189453125, -0.0085601806640625, -0.050048828125, -0.07940673828125, -0.036346435546875, 0.0292510986328125, 0.0311279296875, -0.05218505859375, 0.039276123046875, -0.02862548828125, -0.05633544921875, -0.04742431640625, 0.004985809326171875, 0.033203125, 
0.036285400390625, 0.0372314453125, -0.040283203125, -0.041412353515625, -0.0712890625, -0.01090240478515625, -0.00601959228515625, -0.003143310546875, 0.019073486328125, 0.04925537109375, -0.0229644775390625, 0.06378173828125, -0.0357666015625, -0.0284423828125, -0.0162200927734375, 0.00481414794921875, 0.025390625, 0.05572509765625, 0.050537109375, -0.046630859375, -0.033599853515625, -0.01145172119140625, -0.06036376953125, 0.0101776123046875, -0.0034160614013671875, -0.01029205322265625, 0.008880615234375, 0.01477813720703125, -0.053314208984375, 0.056884765625, 0.0142364501953125, -0.0269775390625, 0.02911376953125, -0.0196380615234375, 0.0035190582275390625, -0.08758544921875, -0.0003819465637207031, 0.02783203125, -0.0206756591796875, -0.034820556640625, 0.0010471343994140625, 0.0097198486328125, -0.0042724609375, -0.0305023193359375, 0.04547119140625, -0.0382080078125, -0.005168914794921875, -0.00423431396484375, -0.0223236083984375, 0.0056610107421875, 0.0526123046875, -0.005146026611328125, 0.04058837890625, 0.05010986328125, -0.03472900390625, 0.04449462890625, 0.039947509765625, -0.0158538818359375, 0.0340576171875, -0.053466796875, 0.01071929931640625, 0.000278472900390625, 0.0130462646484375, -0.0745849609375, -0.0185394287109375, 0.0280609130859375, -0.0560302734375, 0.048187255859375, -0.04193115234375, -0.035308837890625, -0.047393798828125, -0.032379150390625, 0.031890869140625, 0.056121826171875, -0.05975341796875, 0.042572021484375, 0.00789642333984375, 0.02459716796875, -0.037261962890625, -0.07476806640625, -0.0160369873046875, -0.029144287109375, -0.052978515625, 0.0330810546875, 0.004638671875, 0.0096282958984375, 0.005413055419921875, -0.003936767578125, -0.0001666545867919922, -0.017730712890625, 0.032379150390625, 0.0297088623046875, -0.0161895751953125, -0.0033206939697265625, -0.0266571044921875, -0.0150909423828125, -0.001895904541015625, -0.0259552001953125, 0.03753662109375, -0.026458740234375, -0.0148468017578125, -0.05609130859375, 
-0.020965576171875, 0.0330810546875, -0.0195159912109375, 0.05548095703125, 0.08935546875, -0.0357666015625, 0.0035877227783203125, -0.041839599609375, -0.033447265625, -0.036956787109375, 0.031890869140625, -0.02154541015625, -0.035125732421875, 0.055816650390625, 0.014190673828125, 0.004756927490234375, 0.05963134765625, 0.034332275390625, 0.001468658447265625, 0.0650634765625, 0.053009033203125, 0.016204833984375, 0.06671142578125, -0.07421875, -0.007183074951171875, -0.0697021484375, -0.0268707275390625, -0.0195770263671875, -0.04083251953125, -0.055511474609375, -0.04034423828125, 0.033599853515625, 0.00667572021484375, -0.0206146240234375, 0.0382080078125, -0.06689453125, 0.01245880126953125, 0.053314208984375, 0.03662109375, -0.00585174560546875, 0.032745361328125, -0.01401519775390625, -0.00849151611328125, -0.060577392578125, -0.00737762451171875, 0.0810546875, 0.034088134765625, 0.061798095703125, -0.0188446044921875, 0.051055908203125, -0.016326904296875, 0.022369384765625, -0.061248779296875, 0.041473388671875, 0.00017273426055908203, -0.03082275390625, -0.0081939697265625, -0.0310516357421875, -0.07904052734375, 0.01788330078125, -0.02545166015625, -0.0635986328125, 0.029510498046875, 0.0166778564453125, -0.0197296142578125, 0.0499267578125, -0.06695556640625, 0.0740966796875, -0.0018548965454101562, -0.03759765625, 0.00749969482421875, -0.051971435546875, 0.017608642578125, 0.01641845703125, -0.0232086181640625, 0.0114288330078125, 0.01556396484375, 0.0765380859375, -0.047027587890625, 0.06451416015625, -0.027496337890625, 0.0284423828125, 0.036773681640625, -0.015655517578125, 0.0302886962890625, 0.005603790283203125, 0.01526641845703125, 0.024017333984375, 0.0006318092346191406, -0.02764892578125, -0.0357666015625, 0.039794921875, -0.07818603515625, -0.0271453857421875, -0.040924072265625, -0.04345703125, 0.00850677490234375, 0.00739288330078125, 0.04925537109375, 0.045196533203125, 0.023773193359375, 0.030120849609375, 0.04888916015625, 
-0.022491455078125, 0.03228759765625, 0.0014562606811523438, -0.013519287109375, -0.042205810546875, 0.07379150390625, 0.0162200927734375, 0.01146697998046875, 0.01129913330078125, 0.019317626953125, -0.024169921875, -0.035491943359375, -0.02752685546875, 0.032623291015625, -0.0506591796875, -0.038299560546875, -0.04339599609375, -0.03802490234375, -0.024200439453125, 0.0025177001953125, -0.0328369140625, -0.0280303955078125, -0.0284576416015625, 0.0108642578125, 0.06329345703125, 0.038055419921875, -0.0144805908203125, 0.039154052734375, -0.043731689453125, 0.016571044921875, 0.0239410400390625, 0.035247802734375, -0.013336181640625, -0.07598876953125, -0.0289764404296875, 0.0024242401123046875, -0.0391845703125, -0.053619384765625, 0.035003662109375, 0.0164642333984375, 0.029998779296875, 0.02752685546875, -0.016845703125, 0.0662841796875, -0.005512237548828125, 0.044281005859375, 0.02435302734375, -0.038970947265625, 0.03814697265625, -0.0078277587890625, 0.009521484375, 0.0135345458984375, 0.006916046142578125, -0.0223236083984375, -0.00527191162109375, -0.08441162109375, -0.057037353515625, 0.059814453125, 0.0180816650390625, 0.00337982177734375, 0.034759521484375, 0.04205322265625, -0.007457733154296875, 0.0054473876953125, -0.0660400390625, -0.020904541015625, -0.023773193359375, -0.0255279541015625, -0.007343292236328125, -0.0025730133056640625, -0.0034694671630859375, -0.059783935546875, 0.04718017578125, -0.009613037109375, 0.06365966796875, 0.03131103515625, -0.015472412109375, -0.01171112060546875, -0.0291748046875, 0.0262451171875, 0.0170745849609375, -0.0209808349609375, 0.004543304443359375, 0.0211944580078125, -0.054443359375, -0.003131866455078125, 0.024658203125, -0.0093994140625, 0.0046234130859375, 0.03680419921875, 0.079345703125, -0.0081787109375, -0.0013408660888671875, 0.04296875, -0.008758544921875, -0.027587890625, -0.0204925537109375, 0.006084442138671875, -0.0190277099609375, 0.0286712646484375, 0.0222625732421875, 0.033966064453125, 
-0.00955963134765625, -0.00992584228515625, 0.0110931396484375, 0.03863525390625, -0.041656494140625, -0.028106689453125, 0.051300048828125, -0.01544952392578125, -0.00667572021484375, 0.057586669921875, -0.00614166259765625, -0.03802490234375, 0.0653076171875, 0.0224609375, 0.07354736328125, -0.00476837158203125, -0.002765655517578125, 0.056884765625, 0.0303802490234375, -0.0027923583984375, 0.01149749755859375, 0.0091400146484375, -0.057403564453125, -0.006664276123046875, -0.048065185546875, 0.0022907257080078125, 0.0231170654296875, -0.040802001953125, 0.0304412841796875, -0.041015625, -0.0282440185546875, 0.005840301513671875, 0.0214385986328125, -0.0782470703125, 0.0213775634765625, 0.0006704330444335938, 0.055877685546875, -0.062103271484375, 0.049224853515625, 0.0653076171875, -0.052093505859375, -0.07415771484375, -0.01020050048828125, -0.0103302001953125, -0.0692138671875, 0.031890869140625, 0.032135009765625, 0.0145721435546875, 0.0193939208984375, -0.0595703125, -0.04534912109375, 0.09576416015625, 0.0268096923828125, -0.0100555419921875, 0.00940704345703125, -0.006160736083984375, 0.0288543701171875, -0.0199432373046875, 0.0345458984375, 0.01329803466796875, 0.028472900390625, 0.01556396484375, -0.05255126953125, 0.00554656982421875, -0.0232391357421875, 0.0159759521484375, 0.0133514404296875, -0.062469482421875, 0.07177734375, -0.034912109375, -0.0065765380859375, 0.0177459716796875, 0.047760009765625, 0.007171630859375, 0.00634765625, 0.039581298828125, 0.06475830078125, 0.030242919921875, -0.035003662109375, 0.06939697265625, -0.01020050048828125, 0.056427001953125, 0.037445068359375, 0.040802001953125, 0.033599853515625, 0.03704833984375, -0.02398681640625, 0.02294921875, 0.076904296875, -0.044281005859375, 0.0221405029296875, 0.005451202392578125, 0.005279541015625, -0.017852783203125, 0.0012273788452148438, -0.035797119140625, 0.037322998046875, 0.01557159423828125, -0.042022705078125, -0.005229949951171875, 0.01473236083984375, 
-0.01178741455078125, -0.0276336669921875, -0.016754150390625, 0.046417236328125, 0.00281524658203125, -0.0284271240234375, 0.0618896484375, 0.00206756591796875, 0.060516357421875, -0.0305938720703125, 0.0007061958312988281, -0.0189056396484375, 0.0357666015625, -0.029327392578125, -0.058502197265625, 0.01302337646484375, -0.01690673828125, -0.00628662109375, 0.004150390625, 0.0546875, -0.03033447265625, -0.043701171875, 0.005054473876953125, 0.0242462158203125, 0.021636962890625, -0.003143310546875, -0.07611083984375, -0.0009603500366210938, 0.00103759765625, -0.042816162109375, 0.01517486572265625, 0.032379150390625, 0.00131988525390625, 0.050079345703125, 0.050323486328125, -0.0032024383544921875, 0.017669677734375, -0.00959014892578125, 0.0682373046875, -0.03131103515625, -0.030731201171875, -0.058929443359375, 0.0469970703125, -0.0074005126953125, -0.043212890625, 0.05364990234375, 0.045928955078125, 0.06854248046875, -0.0119171142578125, 0.03277587890625, -0.01302337646484375, 0.0025920867919921875, -0.025909423828125, 0.04388427734375, -0.055999755859375, -0.005466461181640625, -0.021240234375, -0.06640625, -0.03045654296875, 0.0704345703125, -0.029388427734375, 0.03369140625, 0.039337158203125, 0.07415771484375, -0.0262603759765625, -0.0295562744140625, 0.01214599609375, 0.0158233642578125, 0.00974273681640625, 0.02978515625, 0.0445556640625, -0.06732177734375, 0.040283203125, -0.04638671875, -0.01251220703125, -0.0232086181640625, -0.0341796875, -0.0775146484375, -0.057769775390625, -0.043609619140625, -0.05218505859375, -0.0179901123046875, 0.06298828125, 0.0697021484375, -0.043243408203125, -0.003650665283203125, -0.010040283203125, 0.0038909912109375, -0.023284912109375, -0.0182952880859375, 0.04010009765625, -0.0090484619140625, -0.059112548828125, -0.0257720947265625, -0.00023126602172851562, 0.038726806640625, -0.01279449462890625, -0.011260986328125, -0.008209228515625, -0.024169921875, 0.0162811279296875, 0.02410888671875, -0.051910400390625, 
-0.020477294921875, -0.0088958740234375, -0.00566864013671875, 0.03521728515625, 0.0297698974609375, -0.05426025390625, 0.041778564453125, 0.041961669921875, 0.024627685546875, 0.0626220703125, -0.0175628662109375, 0.005451202392578125, -0.0667724609375, 0.04718017578125, -0.0020122528076171875, 0.0404052734375, 0.040069580078125, -0.0182342529296875, 0.043304443359375, 0.046295166015625, -0.0343017578125, -0.066650390625, -0.00287628173828125, -0.0799560546875, 0.011688232421875, 0.072265625, -0.0159759521484375, -0.037078857421875, 0.0302276611328125, -0.016326904296875, 0.05181884765625, -0.00568389892578125, 0.036407470703125, 0.0128326416015625, 0.006465911865234375, -0.046234130859375, -0.035614013671875, 0.03875732421875, 0.0086669921875, -0.0384521484375, -0.029510498046875, 0.0047607421875, 0.040618896484375, 0.028106689453125, 0.0234832763671875, -0.00981903076171875, 0.013031005859375, 0.006534576416015625, 0.03863525390625, -0.0257720947265625, -0.0120086669921875, -0.032318115234375, -0.01119232177734375, -0.0028820037841796875, -0.04827880859375 ] ]
superb/hubert-base-superb-ks
2021-11-04T16:03:26.000Z
[ "transformers", "pytorch", "hubert", "audio-classification", "speech", "audio", "en", "dataset:superb", "arxiv:2105.01051", "license:apache-2.0", "endpoints_compatible", "region:us" ]
audio-classification
superb
null
null
superb/hubert-base-superb-ks
3
9,645
transformers
2022-03-02T23:29:05
--- language: en datasets: - superb tags: - speech - audio - hubert - audio-classification license: apache-2.0 widget: - example_title: Speech Commands "down" src: https://cdn-media.huggingface.co/speech_samples/keyword_spotting_down.wav - example_title: Speech Commands "go" src: https://cdn-media.huggingface.co/speech_samples/keyword_spotting_go.wav --- # Hubert-Base for Keyword Spotting ## Model description This is a ported version of [S3PRL's Hubert for the SUPERB Keyword Spotting task](https://github.com/s3prl/s3prl/tree/master/s3prl/downstream/speech_commands). The base model is [hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960), which is pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz. For more information refer to [SUPERB: Speech processing Universal PERformance Benchmark](https://arxiv.org/abs/2105.01051). ## Task and dataset description Keyword Spotting (KS) detects preregistered keywords by classifying utterances into a predefined set of words. The task is usually performed on-device for fast response time, so accuracy, model size, and inference time are all crucial. SUPERB uses the widely used [Speech Commands dataset v1.0](https://www.tensorflow.org/datasets/catalog/speech_commands) for the task. The dataset consists of ten keyword classes, a class for silence, and an unknown class to absorb false positives. For the original model's training and evaluation instructions refer to the [S3PRL downstream task README](https://github.com/s3prl/s3prl/tree/master/s3prl/downstream#ks-keyword-spotting). 
## Usage examples You can use the model via the Audio Classification pipeline: ```python from datasets import load_dataset from transformers import pipeline dataset = load_dataset("anton-l/superb_demo", "ks", split="test") classifier = pipeline("audio-classification", model="superb/hubert-base-superb-ks") labels = classifier(dataset[0]["file"], top_k=5) ``` Or use the model directly: ```python import torch from datasets import load_dataset from transformers import HubertForSequenceClassification, Wav2Vec2FeatureExtractor from torchaudio.sox_effects import apply_effects_file effects = [["channels", "1"], ["rate", "16000"], ["gain", "-3.0"]] def map_to_array(example): speech, _ = apply_effects_file(example["file"], effects) example["speech"] = speech.squeeze(0).numpy() return example # load a demo dataset and read audio files dataset = load_dataset("anton-l/superb_demo", "ks", split="test") dataset = dataset.map(map_to_array) model = HubertForSequenceClassification.from_pretrained("superb/hubert-base-superb-ks") feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("superb/hubert-base-superb-ks") # compute attention masks and normalize the waveform if needed inputs = feature_extractor(dataset[:4]["speech"], sampling_rate=16000, padding=True, return_tensors="pt") logits = model(**inputs).logits predicted_ids = torch.argmax(logits, dim=-1) labels = [model.config.id2label[_id] for _id in predicted_ids.tolist()] ``` ## Eval results The evaluation metric is accuracy. 
| | **s3prl** | **transformers** | |--------|-----------|------------------| |**test**| `0.9630` | `0.9672` | ### BibTeX entry and citation info ```bibtex @article{yang2021superb, title={SUPERB: Speech processing Universal PERformance Benchmark}, author={Yang, Shu-wen and Chi, Po-Han and Chuang, Yung-Sung and Lai, Cheng-I Jeff and Lakhotia, Kushal and Lin, Yist Y and Liu, Andy T and Shi, Jiatong and Chang, Xuankai and Lin, Guan-Ting and others}, journal={arXiv preprint arXiv:2105.01051}, year={2021} } ```
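The accuracy in the table above is simply the fraction of utterances whose predicted keyword matches the reference. A framework-free sketch — the label lists here are made up for illustration, not real SUPERB predictions:

```python
# Hedged sketch of the accuracy metric used in the eval table:
# fraction of utterances where the predicted class equals the reference.
# These label lists are hypothetical, chosen only to show the computation.
predicted = ["down", "go", "_silence_", "up", "down"]
reference = ["down", "go", "_silence_", "down", "down"]

correct = sum(p == r for p, r in zip(predicted, reference))
accuracy = correct / len(reference)
print(f"{accuracy:.4f}")  # 0.8000
```

In practice `predicted` would come from `torch.argmax(logits, dim=-1)` mapped through `model.config.id2label`, as in the usage example above.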
3,709
[ [ -0.02752685546875, -0.035888671875, 0.0194549560546875, 0.0164947509765625, -0.018035888671875, -0.0005979537963867188, -0.0200042724609375, -0.0233612060546875, 0.000827789306640625, 0.0321044921875, -0.041412353515625, -0.05145263671875, -0.05352783203125, -0.0138702392578125, -0.027099609375, 0.0782470703125, 0.03350830078125, 0.0096282958984375, 0.015411376953125, -0.001964569091796875, -0.017730712890625, -0.03558349609375, -0.0428466796875, -0.0295867919921875, 0.0108489990234375, 0.0333251953125, 0.0285186767578125, 0.042572021484375, 0.035003662109375, 0.0225677490234375, -0.032867431640625, -0.01221466064453125, -0.028961181640625, 0.00016033649444580078, 0.0042266845703125, -0.035400390625, -0.039764404296875, 0.0026607513427734375, 0.046539306640625, 0.027740478515625, -0.0081939697265625, 0.020904541015625, 0.004871368408203125, 0.0225677490234375, -0.034332275390625, 0.0191802978515625, -0.05487060546875, -0.007328033447265625, -0.02728271484375, -0.0228424072265625, -0.01462554931640625, -0.0054473876953125, 0.031463623046875, -0.0305328369140625, 0.03973388671875, -0.0186004638671875, 0.08038330078125, 0.01934814453125, 0.0081634521484375, -0.0218658447265625, -0.043212890625, 0.064453125, -0.070556640625, 0.044708251953125, 0.038543701171875, 0.009124755859375, 0.005657196044921875, -0.0758056640625, -0.050140380859375, -0.019866943359375, 0.0180206298828125, 0.022216796875, -0.02691650390625, 0.00905609130859375, 0.03472900390625, 0.033782958984375, -0.037689208984375, 0.00592803955078125, -0.04833984375, -0.032562255859375, 0.0482177734375, -0.01096343994140625, 0.00860595703125, -0.028533935546875, -0.015350341796875, -0.0187530517578125, -0.041748046875, 0.003971099853515625, 0.028167724609375, 0.0301055908203125, -0.02471923828125, 0.029144287109375, -0.023284912109375, 0.0528564453125, 0.006134033203125, -0.0311126708984375, 0.0555419921875, -0.02056884765625, -0.00847625732421875, 0.021392822265625, 0.068115234375, 0.01122283935546875, 
0.01184844970703125, 0.0023555755615234375, -0.01486968994140625, 0.0094146728515625, -0.0086822509765625, -0.05047607421875, -0.0362548828125, 0.0262603759765625, -0.0091400146484375, -0.01507568359375, -0.01380157470703125, -0.03631591796875, -0.00974273681640625, -0.016357421875, 0.0614013671875, -0.048431396484375, -0.0183563232421875, 0.01239776611328125, -0.011444091796875, 0.02044677734375, -0.0222320556640625, -0.07110595703125, 0.017547607421875, 0.03460693359375, 0.064453125, -0.0007982254028320312, -0.0297393798828125, -0.036590576171875, 0.002414703369140625, -0.015716552734375, 0.042877197265625, -0.0108184814453125, -0.030853271484375, -0.0236358642578125, -0.0080413818359375, -0.0060272216796875, -0.050079345703125, 0.06494140625, -0.0171966552734375, 0.020263671875, 0.003810882568359375, -0.0438232421875, -0.0216522216796875, -0.0243072509765625, -0.0268707275390625, 0.0859375, 0.005733489990234375, -0.055267333984375, 0.02288818359375, -0.044525146484375, -0.026611328125, 0.0030879974365234375, -0.0012912750244140625, -0.04339599609375, -0.0023517608642578125, 0.024658203125, 0.04901123046875, -0.01007080078125, 0.0224151611328125, -0.016845703125, -0.04425048828125, 0.01416778564453125, -0.0303497314453125, 0.0782470703125, 0.015380859375, -0.058319091796875, 0.0198822021484375, -0.074462890625, -0.0017099380493164062, -0.006565093994140625, -0.0128631591796875, 0.008544921875, 0.0023651123046875, 0.0109100341796875, 0.0095367431640625, 0.004390716552734375, -0.038787841796875, -0.0161285400390625, -0.0418701171875, 0.03662109375, 0.044769287109375, -0.0145416259765625, 0.0051116943359375, -0.0207672119140625, 0.0169830322265625, -0.0184326171875, -0.00006580352783203125, 0.00588226318359375, -0.042266845703125, -0.06488037109375, -0.044281005859375, 0.03582763671875, 0.047119140625, -0.030517578125, 0.052825927734375, -0.033111572265625, -0.05621337890625, -0.061004638671875, -0.00433349609375, 0.0287322998046875, 0.0386962890625, 
0.0482177734375, -0.01107025146484375, -0.05621337890625, -0.06646728515625, -0.0128936767578125, -0.009674072265625, -0.0189971923828125, 0.023529052734375, 0.0300445556640625, -0.01418304443359375, 0.059112548828125, -0.03656005859375, -0.036865234375, -0.037445068359375, 0.0158233642578125, 0.036224365234375, 0.04718017578125, 0.0224456787109375, -0.040191650390625, -0.0267181396484375, -0.0283203125, -0.03436279296875, -0.0249481201171875, 0.01137542724609375, 0.006053924560546875, 0.00586700439453125, 0.0360107421875, -0.036712646484375, 0.0181427001953125, 0.0243377685546875, -0.0084381103515625, 0.037750244140625, 0.0107574462890625, -0.00336456298828125, -0.08270263671875, 0.0093841552734375, -0.0017862319946289062, -0.0007171630859375, -0.06353759765625, -0.0096893310546875, -0.00252532958984375, -0.0013723373413085938, -0.04779052734375, 0.024749755859375, -0.014312744140625, -0.01462554931640625, -0.0170440673828125, 0.0198516845703125, -0.0176849365234375, 0.04705810546875, 0.01351165771484375, 0.062164306640625, 0.07415771484375, -0.048095703125, 0.0281829833984375, 0.01374053955078125, -0.038787841796875, 0.0328369140625, -0.0635986328125, 0.023529052734375, 0.01009368896484375, 0.00934600830078125, -0.09051513671875, -0.0158233642578125, 0.0135650634765625, -0.07049560546875, 0.0269012451171875, -0.0051727294921875, -0.038970947265625, -0.02935791015625, -0.0125732421875, 0.01001739501953125, 0.03741455078125, -0.038421630859375, 0.030303955078125, 0.037261962890625, -0.0105133056640625, -0.04364013671875, -0.054351806640625, -0.01507568359375, -0.025909423828125, -0.03668212890625, 0.0390625, 0.00745391845703125, 0.0157470703125, -0.0099334716796875, -0.022979736328125, 0.0020008087158203125, -0.0091400146484375, 0.0191802978515625, 0.031707763671875, -0.006465911865234375, -0.0005192756652832031, -0.0240020751953125, -0.0129852294921875, -0.00060272216796875, -0.011505126953125, 0.055267333984375, -0.0116729736328125, 0.0022754669189453125, 
-0.0662841796875, -0.006305694580078125, 0.04949951171875, -0.0114898681640625, 0.022247314453125, 0.08074951171875, -0.031707763671875, 0.01184844970703125, -0.04534912109375, -0.023040771484375, -0.038543701171875, 0.060302734375, -0.031036376953125, -0.05108642578125, 0.034210205078125, 0.02777099609375, -0.002223968505859375, 0.049072265625, 0.032073974609375, 0.00028514862060546875, 0.088623046875, 0.025177001953125, -0.01861572265625, 0.03314208984375, -0.06146240234375, 0.0087432861328125, -0.08245849609375, -0.015838623046875, -0.044952392578125, -0.0109710693359375, -0.036041259765625, -0.0313720703125, 0.0205841064453125, 0.0287628173828125, -0.02130126953125, 0.031402587890625, -0.0445556640625, 0.005950927734375, 0.062225341796875, 0.00534820556640625, -0.004711151123046875, 0.00922393798828125, -0.0073394775390625, -0.0171966552734375, -0.04901123046875, -0.0183258056640625, 0.094482421875, 0.0277252197265625, 0.03594970703125, -0.01015472412109375, 0.0472412109375, -0.0014543533325195312, 0.01435089111328125, -0.062469482421875, 0.052276611328125, -0.01934814453125, -0.0531005859375, -0.02410888671875, -0.0216064453125, -0.060089111328125, 0.0228271484375, -0.018310546875, -0.0716552734375, -0.0019102096557617188, 0.0073699951171875, -0.04058837890625, 0.01617431640625, -0.043426513671875, 0.05145263671875, 0.0007910728454589844, -0.01473236083984375, -0.01180267333984375, -0.042877197265625, 0.021148681640625, 0.00327301025390625, 0.00006312131881713867, -0.0258941650390625, 0.0252227783203125, 0.094482421875, -0.003902435302734375, 0.0450439453125, -0.0178375244140625, 0.0054931640625, 0.02496337890625, -0.0260772705078125, 0.0306549072265625, -0.012664794921875, -0.004451751708984375, 0.0287628173828125, -0.003269195556640625, -0.0134735107421875, -0.0223846435546875, 0.05352783203125, -0.07415771484375, -0.03131103515625, -0.02777099609375, -0.036712646484375, -0.0305938720703125, 0.0141754150390625, 0.06329345703125, 0.057281494140625, 
0.0197296142578125, 0.03216552734375, 0.04364013671875, -0.00988006591796875, 0.0291595458984375, 0.035675048828125, -0.00799560546875, -0.053985595703125, 0.08428955078125, 0.023529052734375, 0.01861572265625, 0.01235198974609375, 0.02130126953125, -0.054595947265625, -0.05609130859375, -0.01316070556640625, 0.019287109375, -0.038299560546875, -0.016357421875, -0.05694580078125, -0.0311126708984375, -0.0509033203125, 0.0099945068359375, -0.0362548828125, -0.023590087890625, -0.042877197265625, -0.0059051513671875, 0.0292510986328125, 0.02252197265625, -0.0295562744140625, 0.0496826171875, -0.037689208984375, 0.044830322265625, 0.03179931640625, 0.01983642578125, -0.01226806640625, -0.07708740234375, -0.012664794921875, -0.0008158683776855469, -0.031768798828125, -0.064453125, 0.034393310546875, 0.0208587646484375, 0.050323486328125, 0.024505615234375, -0.0088043212890625, 0.05426025390625, -0.0182342529296875, 0.07281494140625, 0.01479339599609375, -0.09613037109375, 0.05572509765625, -0.0265655517578125, 0.0361328125, 0.0335693359375, 0.029205322265625, -0.041259765625, -0.040435791015625, -0.06524658203125, -0.077392578125, 0.08233642578125, 0.0247650146484375, -0.01036834716796875, 0.0102081298828125, 0.0006475448608398438, -0.0160064697265625, 0.010223388671875, -0.04541015625, -0.049072265625, -0.0207672119140625, -0.0132598876953125, -0.03564453125, -0.018585205078125, -0.0002751350402832031, -0.03582763671875, 0.07171630859375, 0.01314544677734375, 0.039398193359375, 0.023345947265625, -0.001270294189453125, 0.018096923828125, 0.0138702392578125, 0.049713134765625, 0.0294189453125, -0.046478271484375, 0.00962066650390625, 0.027069091796875, -0.061737060546875, 0.0012454986572265625, 0.0242156982421875, 0.0037212371826171875, 0.01641845703125, 0.03436279296875, 0.087890625, 0.0210723876953125, -0.029632568359375, 0.034393310546875, 0.012664794921875, -0.036163330078125, -0.01427459716796875, 0.01236724853515625, 0.01268768310546875, 0.01137542724609375, 
0.037506103515625, 0.0161285400390625, 0.02276611328125, -0.0218658447265625, 0.023468017578125, 0.01190948486328125, -0.047149658203125, -0.0220489501953125, 0.0645751953125, 0.002155303955078125, -0.0302276611328125, 0.03466796875, -0.004749298095703125, -0.0523681640625, 0.03399658203125, 0.044830322265625, 0.0787353515625, -0.0305938720703125, -0.0013380050659179688, 0.032684326171875, 0.0112762451171875, -0.0028820037841796875, 0.04388427734375, -0.00908660888671875, -0.05047607421875, -0.030120849609375, -0.045684814453125, -0.01442718505859375, 0.058380126953125, -0.056549072265625, 0.0184478759765625, -0.022705078125, -0.02862548828125, 0.0211334228515625, 0.01360321044921875, -0.04888916015625, 0.019744873046875, 0.0165863037109375, 0.055328369140625, -0.0484619140625, 0.0638427734375, 0.02606201171875, -0.0065155029296875, -0.08074951171875, 0.00688934326171875, -0.00330352783203125, -0.05364990234375, 0.046142578125, 0.0210113525390625, -0.031829833984375, 0.022125244140625, -0.04388427734375, -0.08575439453125, 0.06689453125, 0.0265655517578125, -0.033050537109375, 0.0262603759765625, -0.002490997314453125, 0.035247802734375, -0.005245208740234375, 0.01277923583984375, 0.06396484375, 0.017913818359375, 0.0019140243530273438, -0.08099365234375, -0.0174713134765625, -0.01511383056640625, -0.00920867919921875, -0.0115509033203125, -0.035125732421875, 0.060089111328125, -0.0152740478515625, -0.0182952880859375, 0.002838134765625, 0.05914306640625, 0.050262451171875, 0.0208740234375, 0.04595947265625, 0.054962158203125, 0.051605224609375, -0.020294189453125, 0.0478515625, -0.031646728515625, 0.045074462890625, 0.079345703125, 0.007007598876953125, 0.0758056640625, 0.0178375244140625, -0.030426025390625, 0.021026611328125, 0.04302978515625, 0.00021827220916748047, 0.0645751953125, 0.023773193359375, -0.0247039794921875, -0.008453369140625, -0.000942230224609375, -0.061676025390625, 0.05712890625, 0.03759765625, -0.021209716796875, 0.00989532470703125, 
0.0053558349609375, -0.004276275634765625, -0.0311126708984375, -0.026947021484375, 0.031402587890625, -0.007701873779296875, -0.03533935546875, 0.07421875, -0.00916290283203125, 0.0718994140625, -0.04595947265625, 0.0163421630859375, 0.00434112548828125, 0.013031005859375, -0.015869140625, -0.05230712890625, 0.00685882568359375, -0.006259918212890625, -0.003993988037109375, 0.004184722900390625, 0.03424072265625, -0.03338623046875, -0.0283966064453125, 0.019622802734375, 0.01055145263671875, 0.028900146484375, 0.004703521728515625, -0.053466796875, 0.0189971923828125, 0.01433563232421875, -0.019622802734375, -0.01050567626953125, 0.0041046142578125, 0.03936767578125, 0.036468505859375, 0.04974365234375, 0.032684326171875, 0.0194549560546875, 0.0081634521484375, 0.041595458984375, -0.041717529296875, -0.05126953125, -0.050048828125, 0.0214080810546875, -0.007904052734375, -0.04736328125, 0.05181884765625, 0.05157470703125, 0.068359375, -0.00276947021484375, 0.06005859375, 0.004261016845703125, 0.0648193359375, -0.036895751953125, 0.051513671875, -0.034210205078125, 0.0182037353515625, -0.0433349609375, -0.0628662109375, -0.0279693603515625, 0.07000732421875, -0.0244140625, 0.002079010009765625, 0.034423828125, 0.0772705078125, -0.0014190673828125, 0.01357269287109375, 0.0228424072265625, 0.02069091796875, 0.01727294921875, 0.02978515625, 0.052642822265625, -0.0496826171875, 0.045562744140625, -0.042144775390625, -0.01427459716796875, -0.0135345458984375, -0.0278167724609375, -0.046234130859375, -0.05108642578125, -0.036163330078125, -0.036285400390625, 0.00722503662109375, 0.0576171875, 0.07379150390625, -0.07049560546875, -0.02398681640625, -0.01064300537109375, -0.0016031265258789062, -0.0249481201171875, -0.0219573974609375, 0.05303955078125, -0.009124755859375, -0.0684814453125, 0.04443359375, 0.00701141357421875, 0.01412200927734375, -0.00748443603515625, -0.0034637451171875, -0.0121002197265625, -0.0081024169921875, 0.03009033203125, 0.0174560546875, 
-0.062042236328125, -0.024688720703125, 0.002002716064453125, -0.0081787109375, 0.0006561279296875, 0.0322265625, -0.057159423828125, 0.0335693359375, 0.037322998046875, 0.020477294921875, 0.053558349609375, -0.02618408203125, 0.0188140869140625, -0.048797607421875, 0.005779266357421875, 0.0218353271484375, 0.022979736328125, 0.0386962890625, -0.005725860595703125, 0.0288543701171875, 0.035736083984375, -0.045501708984375, -0.07769775390625, -0.005512237548828125, -0.0784912109375, -0.002346038818359375, 0.09130859375, 0.0010509490966796875, -0.00004285573959350586, -0.0128173828125, -0.0214385986328125, 0.0487060546875, -0.0269012451171875, 0.04901123046875, 0.04266357421875, -0.02337646484375, -0.0037555694580078125, -0.04779052734375, 0.057403564453125, 0.040435791015625, -0.0217437744140625, -0.0083160400390625, 0.017547607421875, 0.03912353515625, 0.02825927734375, 0.06292724609375, -0.00021028518676757812, 0.0239715576171875, 0.0111846923828125, 0.01507568359375, 0.004093170166015625, -0.024658203125, -0.036834716796875, 0.0113677978515625, -0.01861572265625, -0.054595947265625 ] ]
PygmalionAI/mythalion-13b
2023-09-15T20:30:08.000Z
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "text generation", "instruct", "en", "dataset:PygmalionAI/PIPPA", "dataset:Open-Orca/OpenOrca", "dataset:Norquinal/claude_multiround_chat_30k", "dataset:jondurbin/airoboros-gpt4-1.4.1", "dataset:databricks/databricks-dolly-15k", "license:llama2", "text-generation-inference", "region:us" ]
text-generation
PygmalionAI
null
null
PygmalionAI/mythalion-13b
73
9,641
transformers
2023-09-05T12:45:18
---
language:
- en
thumbnail: null
tags:
- text generation
- instruct
pipeline_tag: text-generation
inference: false
license: llama2
datasets:
- PygmalionAI/PIPPA
- Open-Orca/OpenOrca
- Norquinal/claude_multiround_chat_30k
- jondurbin/airoboros-gpt4-1.4.1
- databricks/databricks-dolly-15k
---
<h1 style="text-align: center">Mythalion 13B</h1>
<h2 style="text-align: center">A merge of Pygmalion-2 13B and MythoMax 13B</h2>

## Model Details

The long-awaited release of our new models based on Llama-2 is finally here. This model was created in collaboration with [Gryphe](https://huggingface.co/Gryphe), as a merge of our [Pygmalion-2 13B](https://huggingface.co/PygmalionAI/pygmalion-2-13b) and Gryphe's [MythoMax L2 13B](https://huggingface.co/Gryphe/MythoMax-L2-13b).

Finer details of the merge are available in [our blog post](https://pygmalionai.github.io/blog/posts/introducing_pygmalion_2/#mythalion-13b).

According to our testers, this model seems to outperform MythoMax in RP/Chat. **Please make sure you follow the recommended generation settings for SillyTavern [here](https://pygmalionai.github.io/blog/posts/introducing_pygmalion_2/#sillytavern) for the best results!**

This model is freely available for both commercial and non-commercial use, as per the Llama-2 license.

## Prompting

This model can be prompted using both the Alpaca and [Pygmalion formatting](https://huggingface.co/PygmalionAI/pygmalion-2-13b#prompting).

**Alpaca formatting**:
```
### Instruction:
<prompt>

### Response:
<leave a newline blank for model to respond>
```

**Pygmalion/Metharme formatting**:
```
<|system|>Enter RP mode. Pretend to be {{char}} whose persona follows:
{{persona}}

You shall reply to the user while staying in character, and generate long responses.
<|user|>Hello!<|model|>{model's response goes here}
```

The model has been trained on prompts using three different roles, which are denoted by the following tokens: `<|system|>`, `<|user|>` and `<|model|>`.
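As a concrete illustration, a Metharme-style prompt can be assembled in plain Python. This is only a sketch: the `build_metharme_prompt` helper and its arguments are hypothetical conveniences, not part of the model's tooling, and in practice front-ends such as SillyTavern also substitute `{{char}}` with the character's name.

```python
def build_metharme_prompt(persona, turns):
    """Assemble a Metharme-style prompt string.

    persona: the character description text.
    turns: list of (role, text) pairs, where role is "user" or "model".
    """
    prompt = (
        "<|system|>Enter RP mode. Pretend to be {{char}} whose persona follows:\n"
        + persona
        + "\n\nYou shall reply to the user while staying in character, "
        "and generate long responses.\n"
    )
    # replay the conversation history; the role tokens can be chained freely
    for role, text in turns:
        prompt += "<|%s|>%s" % (role, text)
    # end with <|model|> so the model knows a response is expected next
    return prompt + "<|model|>"
```

For example, `build_metharme_prompt("A cheerful robot.", [("user", "Hello!")])` yields a prompt that opens with the system block and ends with `<|user|>Hello!<|model|>`.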
The `<|system|>` prompt can be used to inject out-of-channel information behind the scenes, while the `<|user|>` prompt should be used to indicate user input. The `<|model|>` token should then be used to indicate that the model should generate a response. These tokens can appear multiple times and be chained together to form a conversation history.

## Limitations and biases

The intended use-case for this model is fictional writing for entertainment purposes. Any other sort of usage is out of scope.

As such, it was **not** fine-tuned to be safe and harmless: the base model _and_ this fine-tune have been trained on data known to contain profanity and texts that are lewd or otherwise offensive. It may produce socially unacceptable or undesirable text, even if the prompt itself does not include anything explicitly offensive. Outputs might often be factually wrong or misleading.

## Acknowledgements

We would like to thank [SpicyChat](https://spicychat.ai/) for sponsoring the training for the [Pygmalion-2 13B](https://huggingface.co/PygmalionAI/pygmalion-2-13b) model.

[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
3,275
[ [ -0.0338134765625, -0.051666259765625, 0.01275634765625, 0.016937255859375, -0.031829833984375, -0.009063720703125, -0.0188751220703125, -0.060302734375, 0.030548095703125, 0.0192718505859375, -0.058135986328125, -0.00807952880859375, -0.04119873046875, 0.00934600830078125, -0.00176239013671875, 0.09320068359375, 0.01528167724609375, -0.0194854736328125, -0.0146331787109375, -0.01265716552734375, -0.057037353515625, -0.041015625, -0.043243408203125, -0.026947021484375, 0.050445556640625, 0.01983642578125, 0.068359375, 0.045928955078125, 0.02044677734375, 0.021148681640625, -0.02447509765625, 0.03643798828125, -0.034271240234375, 0.007518768310546875, -0.004390716552734375, -0.0185546875, -0.05389404296875, 0.0281219482421875, 0.0289154052734375, 0.02008056640625, -0.0207366943359375, 0.02032470703125, -0.0005159378051757812, 0.0025463104248046875, -0.04022216796875, 0.0190887451171875, -0.02044677734375, -0.002506256103515625, 0.004119873046875, 0.012420654296875, -0.0299072265625, -0.04052734375, -0.001628875732421875, -0.054229736328125, -0.01806640625, 0.02215576171875, 0.075927734375, 0.01485443115234375, -0.03314208984375, -0.0190582275390625, -0.033111572265625, 0.04498291015625, -0.06768798828125, 0.01021575927734375, 0.025634765625, 0.03167724609375, -0.0167999267578125, -0.07806396484375, -0.05902099609375, -0.0357666015625, -0.007297515869140625, 0.0030498504638671875, -0.04290771484375, -0.017181396484375, 0.0129852294921875, 0.02166748046875, -0.0484619140625, 0.0161285400390625, -0.035552978515625, -0.013031005859375, 0.04852294921875, 0.0191497802734375, 0.0258331298828125, -0.027252197265625, -0.03564453125, -0.00788116455078125, -0.03631591796875, -0.004375457763671875, 0.0208892822265625, 0.007129669189453125, -0.03399658203125, 0.056732177734375, 0.0028228759765625, 0.04022216796875, 0.017547607421875, -0.021270751953125, 0.01708984375, -0.022705078125, -0.0195465087890625, -0.01044464111328125, 0.07696533203125, 0.04974365234375, 
0.017303466796875, 0.00225067138671875, -0.005588531494140625, 0.00882720947265625, 0.0157623291015625, -0.06866455078125, -0.0201416015625, 0.0166168212890625, -0.047210693359375, -0.042022705078125, -0.0276641845703125, -0.042755126953125, -0.041900634765625, -0.0016880035400390625, 0.0172882080078125, -0.0268096923828125, -0.047698974609375, 0.0010004043579101562, -0.00045752525329589844, 0.0167694091796875, 0.033477783203125, -0.06072998046875, 0.0225982666015625, 0.03997802734375, 0.06268310546875, -0.001796722412109375, -0.0251922607421875, -0.012603759765625, -0.0152130126953125, -0.02227783203125, 0.05108642578125, -0.0207977294921875, -0.0288848876953125, -0.02392578125, -0.001720428466796875, -0.01507568359375, -0.038787841796875, 0.031585693359375, -0.0298919677734375, 0.03851318359375, -0.005390167236328125, -0.03045654296875, -0.0299530029296875, 0.020263671875, -0.035888671875, 0.064208984375, 0.007091522216796875, -0.061370849609375, 0.005832672119140625, -0.043792724609375, 0.0020847320556640625, -0.0024852752685546875, 0.01043701171875, -0.018341064453125, -0.013519287109375, 0.01424407958984375, 0.039306640625, -0.041595458984375, 0.00933074951171875, -0.045074462890625, -0.0291748046875, 0.042266845703125, -0.013031005859375, 0.072265625, 0.03546142578125, -0.01207733154296875, 0.00933074951171875, -0.0419921875, 0.0075225830078125, 0.026763916015625, -0.0176849365234375, 0.001071929931640625, -0.0187225341796875, 0.00467681884765625, 0.027374267578125, 0.024871826171875, -0.027679443359375, 0.0258331298828125, -0.01358795166015625, 0.044677734375, 0.049285888671875, 0.006443023681640625, 0.0259246826171875, -0.04083251953125, 0.039306640625, -0.0164031982421875, 0.038726806640625, 0.0014238357543945312, -0.066162109375, -0.0489501953125, -0.033538818359375, 0.016143798828125, 0.05792236328125, -0.03509521484375, 0.039459228515625, -0.00688934326171875, -0.057403564453125, -0.01355743408203125, -0.016876220703125, 0.036956787109375, 
0.03643798828125, 0.00986480712890625, -0.0197296142578125, -0.0439453125, -0.0595703125, -0.00713348388671875, -0.04736328125, -0.00009369850158691406, 0.047393798828125, 0.032867431640625, -0.029296875, 0.050140380859375, -0.04052734375, 0.012939453125, -0.035614013671875, 0.005565643310546875, 0.0139007568359375, 0.049407958984375, 0.058135986328125, -0.05078125, -0.0154266357421875, -0.0033016204833984375, -0.069091796875, -0.0019550323486328125, 0.007389068603515625, 0.00013005733489990234, 0.0177764892578125, -0.000051975250244140625, -0.0777587890625, 0.0374755859375, 0.03973388671875, -0.0249786376953125, 0.0394287109375, 0.0027256011962890625, 0.0192108154296875, -0.0982666015625, 0.01412200927734375, -0.00414276123046875, -0.005130767822265625, -0.0499267578125, 0.0110015869140625, 0.0082855224609375, -0.0267791748046875, -0.041717529296875, 0.056610107421875, -0.0209197998046875, 0.0136871337890625, -0.0153045654296875, -0.00919342041015625, -0.0087127685546875, 0.041015625, -0.00827789306640625, 0.0592041015625, 0.0406494140625, -0.0440673828125, 0.050140380859375, 0.03533935546875, -0.0036029815673828125, 0.0247039794921875, -0.064208984375, 0.03265380859375, 0.0026416778564453125, 0.043060302734375, -0.08306884765625, -0.04229736328125, 0.080078125, -0.0670166015625, 0.02484130859375, -0.0286712646484375, -0.0325927734375, -0.04644775390625, -0.021636962890625, 0.03680419921875, 0.056243896484375, -0.04791259765625, 0.041748046875, 0.0250244140625, -0.0151214599609375, -0.036346435546875, -0.0595703125, 0.015350341796875, -0.038726806640625, -0.07000732421875, -0.0024547576904296875, -0.0189056396484375, -0.0122528076171875, -0.01247406005859375, 0.01076507568359375, -0.0146484375, -0.00522613525390625, 0.038238525390625, 0.04266357421875, -0.006694793701171875, -0.0146484375, -0.010345458984375, 0.006298065185546875, 0.004222869873046875, -0.013458251953125, 0.04852294921875, -0.01418304443359375, 0.005809783935546875, -0.061492919921875, 0.0234375, 
0.047149658203125, -0.0294342041015625, 0.062347412109375, 0.034881591796875, -0.0404052734375, 0.025787353515625, -0.054443359375, -0.0164337158203125, -0.034210205078125, 0.00745391845703125, -0.0091552734375, -0.05596923828125, 0.0457763671875, 0.01189422607421875, -0.0014133453369140625, 0.0176849365234375, 0.043121337890625, 0.0035991668701171875, 0.0870361328125, 0.0462646484375, 0.0099945068359375, 0.0221405029296875, -0.0127716064453125, 0.02374267578125, -0.0606689453125, -0.05242919921875, -0.03045654296875, -0.007297515869140625, -0.052978515625, -0.029571533203125, 0.0177154541015625, 0.0054168701171875, -0.015960693359375, 0.043121337890625, -0.033172607421875, 0.0150299072265625, 0.0300140380859375, 0.0130462646484375, -0.004566192626953125, 0.0078887939453125, 0.006595611572265625, -0.009063720703125, -0.053802490234375, -0.039031982421875, 0.08135986328125, 0.034637451171875, 0.059967041015625, 0.0186920166015625, 0.047149658203125, 0.0019159317016601562, 0.020782470703125, -0.061126708984375, 0.04852294921875, 0.01495361328125, -0.04876708984375, 0.00742340087890625, -0.0296630859375, -0.05023193359375, 0.0164794921875, -0.017486572265625, -0.0706787109375, 0.0210723876953125, 0.014495849609375, -0.0439453125, 0.01412200927734375, -0.0609130859375, 0.057037353515625, 0.003692626953125, -0.01494598388671875, -0.01424407958984375, -0.06256103515625, 0.05841064453125, 0.01035308837890625, -0.0034084320068359375, -0.0022430419921875, -0.0026111602783203125, 0.06036376953125, -0.046966552734375, 0.0703125, 0.0012197494506835938, 0.0003821849822998047, 0.038116455078125, 0.00872802734375, 0.039581298828125, 0.0262603759765625, 0.005229949951171875, 0.01617431640625, -0.00923919677734375, -0.01129150390625, -0.04254150390625, 0.0606689453125, -0.06964111328125, -0.042999267578125, -0.043731689453125, -0.0430908203125, -0.00044155120849609375, -0.0022106170654296875, 0.0274505615234375, 0.025665283203125, 0.00559234619140625, 0.000980377197265625, 
0.061126708984375, -0.024139404296875, 0.027862548828125, 0.03350830078125, -0.0220489501953125, -0.051910400390625, 0.0496826171875, -0.0067138671875, 0.0111846923828125, -0.006649017333984375, 0.0190582275390625, -0.035064697265625, -0.0096282958984375, -0.041656494140625, 0.0360107421875, -0.04058837890625, -0.022613525390625, -0.0609130859375, -0.0214385986328125, -0.057830810546875, 0.0098876953125, -0.02020263671875, -0.049102783203125, -0.03662109375, 0.0048370361328125, 0.045867919921875, 0.0509033203125, -0.025360107421875, 0.0278472900390625, -0.04010009765625, 0.0166168212890625, 0.040283203125, -0.006793975830078125, -0.00481414794921875, -0.06396484375, 0.00972747802734375, 0.017486572265625, -0.0222625732421875, -0.068115234375, 0.031219482421875, -0.001987457275390625, 0.0284423828125, 0.0274505615234375, -0.0080413818359375, 0.0457763671875, -0.0249786376953125, 0.0718994140625, 0.036163330078125, -0.042205810546875, 0.048828125, -0.033233642578125, 0.0007581710815429688, 0.0030498504638671875, 0.03179931640625, -0.0565185546875, -0.01409912109375, -0.0550537109375, -0.05084228515625, 0.065673828125, 0.0285491943359375, 0.0228424072265625, -0.01239776611328125, 0.03594970703125, -0.0038051605224609375, 0.01514434814453125, -0.060302734375, -0.0302581787109375, -0.0440673828125, 0.011260986328125, 0.01305389404296875, -0.03656005859375, -0.019927978515625, -0.027008056640625, 0.04705810546875, 0.00350189208984375, 0.0304412841796875, -0.00785064697265625, 0.008819580078125, -0.006801605224609375, 0.0027332305908203125, 0.049774169921875, 0.05487060546875, -0.0224151611328125, -0.00664520263671875, 0.01107025146484375, -0.0419921875, -0.003063201904296875, 0.0269622802734375, -0.004619598388671875, -0.0154571533203125, 0.021514892578125, 0.0726318359375, 0.0084381103515625, -0.034423828125, 0.039825439453125, -0.02447509765625, -0.005329132080078125, -0.012725830078125, 0.0183563232421875, 0.00858306884765625, 0.03302001953125, 0.01511383056640625, 
0.0170440673828125, 0.00267791748046875, -0.037109375, 0.00115203857421875, 0.01059722900390625, 0.0013513565063476562, -0.0240631103515625, 0.0701904296875, 0.013031005859375, -0.0236968994140625, 0.046173095703125, -0.0207672119140625, -0.026458740234375, 0.05401611328125, 0.06317138671875, 0.048126220703125, -0.02606201171875, 0.023681640625, 0.04254150390625, 0.0167388916015625, -0.0062103271484375, 0.01422119140625, 0.0012369155883789062, -0.026580810546875, -0.010833740234375, -0.0232391357421875, -0.01145172119140625, 0.02459716796875, -0.046966552734375, 0.01526641845703125, -0.05242919921875, -0.0236053466796875, -0.004741668701171875, 0.01128387451171875, -0.032257080078125, 0.023712158203125, 0.0174560546875, 0.07281494140625, -0.06109619140625, 0.04901123046875, 0.071044921875, -0.04632568359375, -0.08636474609375, -0.004970550537109375, 0.0225830078125, -0.07318115234375, 0.0321044921875, 0.0202789306640625, 0.0093536376953125, -0.0030536651611328125, -0.0635986328125, -0.06488037109375, 0.10552978515625, 0.024749755859375, -0.0255889892578125, -0.00030541419982910156, -0.00800323486328125, 0.02740478515625, -0.02752685546875, 0.03778076171875, 0.0237579345703125, 0.0276641845703125, 0.025238037109375, -0.08929443359375, 0.0227203369140625, -0.0299530029296875, 0.0210723876953125, -0.0191192626953125, -0.069580078125, 0.08148193359375, -0.0223541259765625, -0.0180206298828125, 0.0469970703125, 0.06451416015625, 0.03948974609375, 0.013763427734375, 0.036773681640625, 0.04376220703125, 0.060516357421875, 0.003185272216796875, 0.08282470703125, -0.01055145263671875, 0.026947021484375, 0.06280517578125, -0.00969696044921875, 0.04107666015625, 0.02178955078125, 0.0007505416870117188, 0.02001953125, 0.07147216796875, 0.0022068023681640625, 0.03662109375, 0.003875732421875, -0.021331787109375, -0.0214996337890625, -0.032257080078125, -0.052764892578125, 0.027008056640625, 0.0161895751953125, -0.00749969482421875, -0.01146697998046875, 0.003276824951171875, 
0.034423828125, -0.01641845703125, -0.0188446044921875, 0.040771484375, 0.0012149810791015625, -0.05389404296875, 0.0791015625, 0.003704071044921875, 0.06768798828125, -0.055999755859375, -0.00022983551025390625, -0.052581787109375, 0.0010547637939453125, -0.015777587890625, -0.05291748046875, -0.0038585662841796875, 0.00627899169921875, -0.010986328125, 0.005451202392578125, 0.06494140625, -0.0171356201171875, -0.0222320556640625, 0.002704620361328125, 0.020782470703125, 0.0284423828125, 0.00638580322265625, -0.05828857421875, 0.027374267578125, -0.005702972412109375, -0.017303466796875, 0.022308349609375, 0.0163116455078125, -0.01287841796875, 0.07427978515625, 0.041015625, -0.00951385498046875, 0.004550933837890625, -0.006237030029296875, 0.0882568359375, -0.016632080078125, -0.031585693359375, -0.043609619140625, 0.032745361328125, -0.0011587142944335938, -0.037933349609375, 0.052215576171875, 0.033538818359375, 0.042327880859375, -0.016265869140625, 0.04571533203125, -0.03509521484375, 0.0308074951171875, -0.0300140380859375, 0.0731201171875, -0.053253173828125, 0.01041412353515625, -0.005096435546875, -0.0574951171875, -0.00789642333984375, 0.050201416015625, 0.004215240478515625, 0.0096435546875, 0.044189453125, 0.08538818359375, -0.01971435546875, 0.006557464599609375, 0.0058746337890625, 0.0164794921875, 0.0147857666015625, 0.0440673828125, 0.0870361328125, -0.03875732421875, 0.053253173828125, -0.033721923828125, -0.04034423828125, -0.019439697265625, -0.056793212890625, -0.08251953125, -0.03375244140625, -0.0252838134765625, -0.044830322265625, 0.0214080810546875, 0.080322265625, 0.057342529296875, -0.04498291015625, -0.017181396484375, 0.02252197265625, -0.0042724609375, -0.005664825439453125, -0.01538848876953125, 0.00484466552734375, 0.0093536376953125, -0.0523681640625, 0.028076171875, 0.0015544891357421875, 0.02606201171875, -0.0008945465087890625, -0.0174407958984375, -0.01739501953125, 0.00225067138671875, 0.0262908935546875, 0.043731689453125, 
-0.0589599609375, -0.0299530029296875, 0.0008997917175292969, -0.0143585205078125, -0.01003265380859375, 0.04461669921875, -0.0557861328125, 0.003322601318359375, 0.02099609375, 0.0230255126953125, 0.0258636474609375, -0.01021575927734375, 0.0230255126953125, -0.041900634765625, 0.0310821533203125, 0.0146026611328125, 0.018310546875, 0.031219482421875, -0.039276123046875, 0.03802490234375, 0.029296875, -0.046417236328125, -0.0771484375, 0.01326751708984375, -0.0810546875, -0.01436614990234375, 0.1058349609375, -0.01145172119140625, -0.022369384765625, 0.018798828125, -0.042572021484375, 0.033294677734375, -0.03399658203125, 0.05804443359375, 0.038055419921875, -0.01313018798828125, -0.0215301513671875, -0.036712646484375, 0.032958984375, 0.0157318115234375, -0.06201171875, -0.002960205078125, 0.05181884765625, 0.04486083984375, 0.007450103759765625, 0.040374755859375, -0.0018310546875, 0.04290771484375, -0.012847900390625, 0.0124359130859375, -0.02606201171875, -0.02178955078125, -0.0172119140625, -0.02484130859375, -0.011962890625, -0.032012939453125 ] ]
dandelin/vilt-b32-mlm
2022-07-06T12:18:37.000Z
[ "transformers", "pytorch", "vilt", "fill-mask", "arxiv:2102.03334", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
dandelin
null
null
dandelin/vilt-b32-mlm
6
9,631
transformers
2022-03-02T23:29:05
---
license: apache-2.0
---

# Vision-and-Language Transformer (ViLT), pre-trained only

Vision-and-Language Transformer (ViLT) model pre-trained on GCC+SBU+COCO+VG (200k steps). It was introduced in the paper [ViLT: Vision-and-Language Transformer Without Convolution or Region Supervision](https://arxiv.org/abs/2102.03334) by Kim et al. and first released in [this repository](https://github.com/dandelin/ViLT). Note: this model only includes the language modeling head.

Disclaimer: The team releasing ViLT did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Intended uses & limitations

You can use the raw model for masked language modeling given an image and a piece of text containing [MASK] tokens.

### How to use

Here is how to use this model in PyTorch:

```python
from transformers import ViltProcessor, ViltForMaskedLM
import requests
from PIL import Image
import re
import torch

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
text = "a bunch of [MASK] laying on a [MASK]."

processor = ViltProcessor.from_pretrained("dandelin/vilt-b32-mlm")
model = ViltForMaskedLM.from_pretrained("dandelin/vilt-b32-mlm")

# prepare inputs
encoding = processor(image, text, return_tensors="pt")
pixel_values = encoding.pixel_values

tl = len(re.findall(r"\[MASK\]", text))
inferred_token = [text]

# gradually fill in the [MASK] tokens, one at a time
with torch.no_grad():
    for i in range(tl):
        encoded = processor.tokenizer(inferred_token)
        input_ids = torch.tensor(encoded.input_ids)
        encoded = encoded["input_ids"][0][1:-1]
        outputs = model(input_ids=input_ids, pixel_values=pixel_values)
        mlm_logits = outputs.logits[0]  # shape (seq_len, vocab_size)
        # only take into account text features (minus [CLS] and [SEP] tokens)
        mlm_logits = mlm_logits[1 : input_ids.shape[1] - 1, :]
        mlm_values, mlm_ids = mlm_logits.softmax(dim=-1).max(dim=-1)
        # only consider positions that are still [MASK] (token id 103)
        mlm_values[torch.tensor(encoded) != 103] = 0
        select = mlm_values.argmax().item()
        encoded[select] = mlm_ids[select].item()
        inferred_token = [processor.decode(encoded)]

encoded = processor.tokenizer(inferred_token)
print(processor.decode(encoded.input_ids[0], skip_special_tokens=True))
```

## Training data

(to do)

## Training procedure

### Preprocessing

(to do)

### Pretraining

(to do)

## Evaluation results

(to do)

### BibTeX entry and citation info

```bibtex
@misc{kim2021vilt,
      title={ViLT: Vision-and-Language Transformer Without Convolution or Region Supervision},
      author={Wonjae Kim and Bokyung Son and Ildoo Kim},
      year={2021},
      eprint={2102.03334},
      archivePrefix={arXiv},
      primaryClass={stat.ML}
}
```
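At its core, the decoding loop in the card above repeatedly substitutes the model's argmax prediction at a [MASK] position. The substitution step itself can be sketched in plain Python (the token ids below are hypothetical BERT-style ids used only for illustration; 103 is the [MASK] id the loop assumes):

```python
def fill_masks(input_ids, predicted_ids, mask_token_id=103):
    """Replace each [MASK] token id with the model's prediction,
    leaving every other position untouched."""
    return [
        pred if tok == mask_token_id else tok
        for tok, pred in zip(input_ids, predicted_ids)
    ]

# toy example: positions 1 and 4 are [MASK]
ids = [101, 103, 2003, 1037, 103, 102]   # [CLS] [MASK] is a [MASK] [SEP]
preds = [0, 4937, 0, 0, 10683, 0]        # argmax ids (only the mask slots matter)
print(fill_masks(ids, preds))            # [101, 4937, 2003, 1037, 10683, 102]
```

Filling one mask per forward pass, as the card's loop does, generally gives better results than replacing all masks at once, since later predictions can condition on the tokens filled in earlier.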
2,850
[ [ -0.0272369384765625, -0.057037353515625, 0.00696563720703125, 0.0105438232421875, -0.02789306640625, -0.003681182861328125, -0.0303802490234375, -0.0254058837890625, 0.01044464111328125, 0.0297698974609375, -0.037353515625, -0.046600341796875, -0.05914306640625, -0.00634002685546875, -0.02447509765625, 0.07733154296875, -0.005245208740234375, 0.006927490234375, -0.00017821788787841797, -0.0162811279296875, -0.020477294921875, -0.039642333984375, -0.035888671875, -0.0291748046875, 0.01727294921875, 0.01000213623046875, 0.02813720703125, 0.02447509765625, 0.0462646484375, 0.0291290283203125, -0.01242828369140625, 0.00787353515625, -0.03533935546875, -0.013336181640625, 0.00188446044921875, -0.04376220703125, -0.0283050537109375, 0.0022029876708984375, 0.04998779296875, 0.031768798828125, 0.016632080078125, 0.0311126708984375, 0.014190673828125, 0.045501708984375, -0.042572021484375, 0.01531219482421875, -0.04705810546875, 0.013946533203125, -0.010711669921875, -0.01161956787109375, -0.03509521484375, -0.020294189453125, 0.014251708984375, -0.0305328369140625, 0.037384033203125, -0.0091552734375, 0.09417724609375, 0.01111602783203125, -0.00576019287109375, -0.00992584228515625, -0.0443115234375, 0.055023193359375, -0.042694091796875, 0.031524658203125, 0.0114593505859375, 0.0126495361328125, 0.0037937164306640625, -0.08349609375, -0.0562744140625, -0.0016765594482421875, -0.017913818359375, 0.01190948486328125, -0.01702880859375, 0.002941131591796875, 0.03131103515625, 0.032257080078125, -0.049041748046875, -0.0196533203125, -0.053680419921875, -0.0197601318359375, 0.044464111328125, -0.0026111602783203125, 0.02288818359375, -0.0086212158203125, -0.03021240234375, -0.04052734375, -0.0294036865234375, 0.017913818359375, 0.01129913330078125, 0.007450103759765625, -0.0278472900390625, 0.05364990234375, -0.01125335693359375, 0.05987548828125, 0.005359649658203125, -0.012481689453125, 0.049774169921875, -0.0247650146484375, -0.039886474609375, 0.002941131591796875, 
0.0869140625, 0.02667236328125, 0.036895751953125, 0.003833770751953125, -0.00409698486328125, 0.002742767333984375, 0.0011987686157226562, -0.07684326171875, -0.0230865478515625, 0.00261688232421875, -0.031982421875, -0.037384033203125, 0.014556884765625, -0.0665283203125, -0.00902557373046875, -0.003635406494140625, 0.04962158203125, -0.02996826171875, -0.02093505859375, 0.003917694091796875, -0.006683349609375, 0.032989501953125, -0.0010738372802734375, -0.06378173828125, 0.01007843017578125, 0.0147247314453125, 0.0733642578125, 0.0033721923828125, -0.0271453857421875, -0.0239105224609375, -0.029998779296875, -0.01654052734375, 0.03375244140625, -0.0217742919921875, -0.0289764404296875, -0.0061492919921875, 0.018157958984375, -0.0253143310546875, -0.029266357421875, 0.033111572265625, -0.0223236083984375, 0.0308380126953125, 0.004650115966796875, -0.0206298828125, -0.0207977294921875, 0.0125274658203125, -0.03485107421875, 0.0908203125, 0.0186767578125, -0.06488037109375, 0.02655029296875, -0.0467529296875, -0.0167694091796875, 0.0058441162109375, -0.0171661376953125, -0.054931640625, -0.005916595458984375, 0.0308380126953125, 0.0430908203125, -0.0033435821533203125, 0.0211334228515625, -0.0121612548828125, -0.034271240234375, 0.0276947021484375, -0.045440673828125, 0.08282470703125, 0.0034198760986328125, -0.0511474609375, 0.0246734619140625, -0.055450439453125, 0.00878143310546875, 0.0194854736328125, -0.0235443115234375, 0.0091400146484375, -0.0218658447265625, 0.0233001708984375, 0.0240325927734375, 0.0130462646484375, -0.0301666259765625, 0.0203094482421875, -0.0452880859375, 0.03375244140625, 0.044464111328125, -0.0159454345703125, 0.03363037109375, -0.0002627372741699219, 0.0290069580078125, 0.005435943603515625, 0.01800537109375, -0.00801849365234375, -0.041656494140625, -0.0860595703125, -0.016815185546875, 0.01434326171875, 0.044921875, -0.06689453125, 0.032012939453125, -0.0360107421875, -0.04345703125, -0.0523681640625, 0.00482177734375, 
0.03558349609375, 0.04718017578125, 0.03668212890625, -0.0216217041015625, -0.042755126953125, -0.062347412109375, -0.0029850006103515625, 0.0025959014892578125, 0.0050506591796875, 0.01517486572265625, 0.044708251953125, -0.02459716796875, 0.05792236328125, -0.032501220703125, -0.015533447265625, -0.023529052734375, 0.0086517333984375, 0.029876708984375, 0.047271728515625, 0.051483154296875, -0.057220458984375, -0.041656494140625, 0.0030117034912109375, -0.05889892578125, 0.008514404296875, 0.004119873046875, -0.0160369873046875, 0.0283355712890625, 0.042205810546875, -0.041778564453125, 0.062255859375, 0.043365478515625, -0.037200927734375, 0.0438232421875, -0.015655517578125, 0.01165771484375, -0.08709716796875, 0.0163116455078125, 0.0031566619873046875, -0.0225982666015625, -0.04705810546875, -0.00809478759765625, 0.01410675048828125, -0.00848388671875, -0.050323486328125, 0.058197021484375, -0.038360595703125, 0.00798797607421875, -0.017059326171875, -0.00733184814453125, 0.00600433349609375, 0.060272216796875, 0.0128021240234375, 0.05120849609375, 0.06732177734375, -0.044464111328125, 0.0390625, 0.021514892578125, -0.031280517578125, 0.03045654296875, -0.058837890625, 0.00786590576171875, -0.005558013916015625, -0.0009794235229492188, -0.06414794921875, -0.008148193359375, 0.046356201171875, -0.054595947265625, 0.03338623046875, -0.02142333984375, -0.0274505615234375, -0.041412353515625, -0.0169830322265625, 0.03680419921875, 0.056640625, -0.039581298828125, 0.0477294921875, 0.0039825439453125, 0.00702667236328125, -0.05718994140625, -0.07537841796875, -0.0012197494506835938, -0.0087127685546875, -0.045684814453125, 0.028900146484375, -0.00595855712890625, 0.004730224609375, 0.006229400634765625, -0.0008344650268554688, -0.005847930908203125, -0.0157470703125, 0.0176544189453125, 0.0305023193359375, -0.0213165283203125, -0.007244110107421875, -0.015106201171875, -0.00905609130859375, 0.001010894775390625, -0.022918701171875, 0.05792236328125, 
-0.031646728515625, -0.01421356201171875, -0.046661376953125, -0.0024623870849609375, 0.037384033203125, -0.03143310546875, 0.059783935546875, 0.07647705078125, -0.032958984375, -0.0006356239318847656, -0.03643798828125, -0.0138092041015625, -0.038787841796875, 0.044677734375, -0.02886962890625, -0.0496826171875, 0.03729248046875, 0.0229949951171875, -0.00470733642578125, 0.051483154296875, 0.06524658203125, 0.000995635986328125, 0.0562744140625, 0.0538330078125, 0.002925872802734375, 0.0557861328125, -0.0567626953125, 0.0149688720703125, -0.056549072265625, -0.034210205078125, -0.0093231201171875, -0.0239715576171875, -0.041259765625, -0.041656494140625, 0.0223541259765625, 0.0123443603515625, -0.01081085205078125, 0.025482177734375, -0.0665283203125, 0.02313232421875, 0.052459716796875, 0.0193328857421875, -0.01056671142578125, 0.017913818359375, -0.0200958251953125, 0.005062103271484375, -0.05303955078125, -0.0239410400390625, 0.0882568359375, 0.0268707275390625, 0.0504150390625, -0.0273590087890625, 0.041412353515625, -0.015655517578125, 0.0184326171875, -0.053314208984375, 0.04400634765625, -0.007724761962890625, -0.035125732421875, -0.01148223876953125, -0.0196990966796875, -0.0677490234375, 0.0234832763671875, -0.03582763671875, -0.06866455078125, 0.022064208984375, 0.0128021240234375, -0.01105499267578125, 0.03643798828125, -0.05950927734375, 0.0792236328125, -0.01995849609375, -0.0277252197265625, 0.015045166015625, -0.053131103515625, 0.02655029296875, 0.0109405517578125, -0.0063018798828125, 0.006229400634765625, 0.03399658203125, 0.0711669921875, -0.03118896484375, 0.0604248046875, -0.01293182373046875, 0.0233917236328125, 0.032989501953125, -0.01294708251953125, 0.0215301513671875, -0.006214141845703125, 0.0174407958984375, 0.02886962890625, -0.003276824951171875, -0.032470703125, -0.036865234375, 0.03936767578125, -0.0599365234375, -0.0268402099609375, -0.04205322265625, -0.0261383056640625, 0.01320648193359375, 0.030059814453125, 0.060150146484375, 
0.03759765625, -0.001529693603515625, 0.02484130859375, 0.0572509765625, -0.0238189697265625, 0.041534423828125, -0.006694793701171875, -0.026092529296875, -0.04150390625, 0.0699462890625, 0.002864837646484375, 0.0223541259765625, 0.0212249755859375, 0.01255035400390625, -0.023529052734375, -0.031280517578125, -0.039306640625, 0.0225982666015625, -0.055694580078125, -0.01244354248046875, -0.05511474609375, -0.0462646484375, -0.043670654296875, -0.00959014892578125, -0.033111572265625, -0.01020050048828125, -0.03265380859375, -0.0008015632629394531, 0.03631591796875, 0.02996826171875, 0.00804901123046875, 0.0308990478515625, -0.04412841796875, 0.025482177734375, 0.0232086181640625, 0.0200958251953125, -0.017425537109375, -0.0474853515625, -0.0255889892578125, 0.01763916015625, -0.0152130126953125, -0.06341552734375, 0.041961669921875, 0.0132904052734375, 0.03680419921875, 0.0258636474609375, -0.0197296142578125, 0.05303955078125, -0.027008056640625, 0.05303955078125, 0.02703857421875, -0.069091796875, 0.042724609375, 0.00421142578125, 0.0199127197265625, 0.0287628173828125, 0.00804901123046875, -0.0411376953125, -0.0144195556640625, -0.0562744140625, -0.07080078125, 0.056793212890625, 0.03271484375, 0.01116180419921875, 0.0251617431640625, 0.020904541015625, -0.016326904296875, 0.00927734375, -0.0794677734375, -0.042266845703125, -0.040924072265625, -0.033935546875, -0.0153961181640625, -0.0216217041015625, 0.00033926963806152344, -0.047637939453125, 0.04644775390625, -0.00330352783203125, 0.046661376953125, 0.033782958984375, -0.03021240234375, 0.01348114013671875, -0.01032257080078125, 0.052276611328125, 0.036773681640625, -0.0160980224609375, 0.016326904296875, 0.016754150390625, -0.061004638671875, 0.00017821788787841797, 0.01312255859375, -0.01224517822265625, 0.0162506103515625, 0.031402587890625, 0.09197998046875, -0.00780487060546875, -0.0301971435546875, 0.050567626953125, -0.0089874267578125, -0.0340576171875, -0.0238189697265625, 0.00435638427734375, 
0.0038890838623046875, 0.023590087890625, 0.022308349609375, 0.01525115966796875, -0.0105133056640625, -0.0186004638671875, 0.01003265380859375, 0.027496337890625, -0.03973388671875, -0.02294921875, 0.059478759765625, -0.00047326087951660156, -0.0252838134765625, 0.048370361328125, -0.006641387939453125, -0.051605224609375, 0.050201416015625, 0.029449462890625, 0.07489013671875, -0.00176239013671875, 0.00894927978515625, 0.050018310546875, 0.032501220703125, 0.004058837890625, 0.01450347900390625, 0.002460479736328125, -0.057830810546875, -0.017852783203125, -0.05694580078125, 0.0034008026123046875, 0.019134521484375, -0.035247802734375, 0.032867431640625, -0.0297698974609375, -0.01091766357421875, -0.0018815994262695312, 0.0000718832015991211, -0.0799560546875, 0.03265380859375, 0.01537322998046875, 0.051116943359375, -0.06561279296875, 0.0701904296875, 0.0494384765625, -0.040771484375, -0.06085205078125, -0.016876220703125, -0.01690673828125, -0.07940673828125, 0.053741455078125, 0.0247955322265625, -0.0050506591796875, 0.0093994140625, -0.05523681640625, -0.05316162109375, 0.08416748046875, 0.0223236083984375, -0.034271240234375, 0.00273895263671875, 0.01148223876953125, 0.03375244140625, -0.01494598388671875, 0.0251007080078125, 0.030548095703125, 0.02490234375, 0.0095977783203125, -0.062164306640625, -0.00030112266540527344, -0.03424072265625, -0.00044274330139160156, 0.0025348663330078125, -0.03582763671875, 0.06817626953125, -0.011810302734375, -0.01528167724609375, -0.00868988037109375, 0.054473876953125, 0.01421356201171875, 0.01078033447265625, 0.028106689453125, 0.044219970703125, 0.04736328125, -0.0149993896484375, 0.06475830078125, -0.017181396484375, 0.05413818359375, 0.05596923828125, 0.030548095703125, 0.046356201171875, 0.037872314453125, -0.010772705078125, 0.0264129638671875, 0.05126953125, -0.02166748046875, 0.033355712890625, 0.009796142578125, -0.01558685302734375, 0.0034770965576171875, 0.01666259765625, -0.037994384765625, 
0.0306854248046875, 0.0212249755859375, -0.0452880859375, -0.004467010498046875, 0.015838623046875, 0.00829315185546875, -0.02069091796875, -0.017913818359375, 0.03961181640625, 0.00463104248046875, -0.0345458984375, 0.064697265625, 0.002399444580078125, 0.07244873046875, -0.038726806640625, 0.0023555755615234375, -0.0065155029296875, 0.0207366943359375, -0.016632080078125, -0.06011962890625, 0.01788330078125, 0.005458831787109375, -0.0105133056640625, -0.01262664794921875, 0.057830810546875, -0.04705810546875, -0.07220458984375, 0.0149688720703125, 0.0181121826171875, 0.0259246826171875, -0.006290435791015625, -0.067626953125, 0.009307861328125, 0.01219940185546875, -0.045379638671875, 0.0006728172302246094, 0.0279388427734375, 0.0030765533447265625, 0.038848876953125, 0.0479736328125, 0.004375457763671875, 0.03045654296875, -0.0009131431579589844, 0.06549072265625, -0.045257568359375, -0.013763427734375, -0.08697509765625, 0.052093505859375, -0.00801849365234375, -0.036285400390625, 0.040191650390625, 0.0457763671875, 0.0767822265625, -0.01494598388671875, 0.053924560546875, -0.0110015869140625, -0.0035877227783203125, -0.04901123046875, 0.06842041015625, -0.043060302734375, -0.0231170654296875, -0.027313232421875, -0.0714111328125, -0.035919189453125, 0.06658935546875, -0.0198516845703125, 0.01352691650390625, 0.055389404296875, 0.0706787109375, -0.01244354248046875, -0.03619384765625, 0.026519775390625, 0.014892578125, 0.024566650390625, 0.0458984375, 0.0283355712890625, -0.0592041015625, 0.050628662109375, -0.037567138671875, -0.0098876953125, -0.02392578125, -0.050933837890625, -0.07958984375, -0.0548095703125, -0.0313720703125, -0.044647216796875, -0.01009368896484375, 0.053680419921875, 0.061676025390625, -0.05560302734375, -0.007625579833984375, -0.0168304443359375, 0.00400543212890625, -0.01166534423828125, -0.0194549560546875, 0.0467529296875, -0.018310546875, -0.068603515625, 0.00032782554626464844, 0.00861358642578125, 0.026458740234375, 
-0.01445770263671875, -0.009002685546875, -0.015045166015625, -0.017333984375, 0.0418701171875, 0.0172882080078125, -0.0518798828125, -0.00867462158203125, 0.00905609130859375, -0.019989013671875, 0.02386474609375, 0.035430908203125, -0.0537109375, 0.053924560546875, 0.04119873046875, 0.031707763671875, 0.0726318359375, -0.0205841064453125, 0.019317626953125, -0.05194091796875, 0.04248046875, 0.01081085205078125, 0.036224365234375, 0.0258941650390625, -0.02337646484375, 0.02447509765625, 0.042205810546875, -0.03387451171875, -0.05517578125, 0.00005632638931274414, -0.08514404296875, -0.0087432861328125, 0.08538818359375, -0.0084381103515625, -0.043853759765625, 0.024200439453125, -0.032867431640625, 0.041778564453125, 0.0011148452758789062, 0.0308685302734375, 0.0450439453125, 0.01042938232421875, -0.030670166015625, -0.035369873046875, 0.03314208984375, 0.01413726806640625, -0.0423583984375, -0.023529052734375, 0.01079559326171875, 0.040283203125, 0.01462554931640625, 0.057647705078125, -0.01265716552734375, 0.0098876953125, 0.00922393798828125, 0.043670654296875, -0.00788116455078125, -0.0180816650390625, -0.02166748046875, -0.00470733642578125, -0.0196990966796875, -0.048583984375 ] ]
TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ
2023-09-27T12:44:27.000Z
[ "transformers", "safetensors", "llama", "text-generation", "license:other", "has_space", "text-generation-inference", "region:us" ]
text-generation
TheBloke
null
null
TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ
75
9,618
transformers
2023-06-01T09:48:20
--- license: other model_name: WizardLM Uncensored SuperCOT Storytelling 30B base_model: Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b inference: false model_creator: YellowRoseCx model_type: llama prompt_template: 'You are a helpful AI assistant. USER: {prompt} ASSISTANT: ' quantized_by: TheBloke --- <!-- header start --> <!-- 200823 --> <div style="width: auto; margin-left: auto; margin-right: auto"> <img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;"> </div> <div style="display: flex; justify-content: space-between; width: 100%;"> <div style="display: flex; flex-direction: column; align-items: flex-start;"> <p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p> </div> <div style="display: flex; flex-direction: column; align-items: flex-end;"> <p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p> </div> </div> <div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div> <hr style="margin-top: 1.0em; margin-bottom: 1.0em;"> <!-- header end --> # WizardLM Uncensored SuperCOT Storytelling 30B - GPTQ - Model creator: [YellowRoseCx](https://huggingface.co/Monero) - Original model: [WizardLM Uncensored SuperCOT Storytelling 30B](https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b) <!-- description start --> ## Description This repo contains GPTQ model files for [Monero's WizardLM-Uncensored-SuperCOT-Storytelling-30B](https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b). 
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them. <!-- description end --> <!-- repositories-available start --> ## Repositories available * [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-AWQ) * [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ) * [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GGUF) * [YellowRoseCx's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b) <!-- repositories-available end --> <!-- prompt-template start --> ## Prompt template: Vicuna-Short ``` You are a helpful AI assistant. USER: {prompt} ASSISTANT: ``` <!-- prompt-template end --> <!-- README_GPTQ.md-provided-files start --> ## Provided files and GPTQ parameters Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements. Each separate quant is in a different branch. See below for instructions on fetching from different branches. All recent GPTQ files are made with AutoGPTQ, and all files in non-main branches are made with AutoGPTQ. Files in the `main` branch which were uploaded before August 2023 were made with GPTQ-for-LLaMa. <details> <summary>Explanation of GPTQ parameters</summary> - Bits: The bit size of the quantised model. - GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value. - Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. 
Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now. - Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy. - GPTQ dataset: The dataset used for quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s). - Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences. - ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit. </details> | Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc | | ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- | | [main](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ/tree/main) | 4 | None | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 16.94 GB | Yes | 4-bit, with Act Order. No group size, to lower VRAM requirements. | | [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 19.44 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. 
| | [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 18.18 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. | | [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 17.55 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. | | [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 32.99 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. | | [gptq-8bit-128g-actorder_False](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ/tree/gptq-8bit-128g-actorder_False) | 8 | 128 | No | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 33.73 GB | No | 8-bit, with group size 128g for higher inference quality and without Act Order to improve AutoGPTQ speed. | | [gptq-3bit--1g-actorder_True](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ/tree/gptq-3bit--1g-actorder_True) | 3 | None | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 12.92 GB | No | 3-bit, with Act Order and no group size. Lowest possible VRAM requirements. May be lower quality than 3-bit 128g. 
| | [gptq-3bit-128g-actorder_False](https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ/tree/gptq-3bit-128g-actorder_False) | 3 | 128 | No | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 13.51 GB | No | 3-bit, with group size 128g but no act-order. Slightly higher VRAM requirements than 3-bit None. | <!-- README_GPTQ.md-provided-files end --> <!-- README_GPTQ.md-download-from-branches start --> ## How to download from branches - In text-generation-webui, you can add `:branch` to the end of the download name, eg `TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ:main` - With Git, you can clone a branch with: ``` git clone --single-branch --branch main https://huggingface.co/TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ ``` - In Python Transformers code, the branch is the `revision` parameter; see below. <!-- README_GPTQ.md-download-from-branches end --> <!-- README_GPTQ.md-text-generation-webui start --> ## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui). Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui). It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install. 1. Click the **Model tab**. 2. Under **Download custom model or LoRA**, enter `TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ`. - To download from a specific branch, enter for example `TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ:main` - see Provided Files above for the list of branches for each option. 3. Click **Download**. 4. The model will start downloading. Once it's finished it will say "Done". 5. In the top left, click the refresh icon next to **Model**. 6. 
In the **Model** dropdown, choose the model you just downloaded: `WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ` 7. The model will automatically load, and is now ready for use! 8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right. * Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`. 9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started! <!-- README_GPTQ.md-text-generation-webui end --> <!-- README_GPTQ.md-use-from-python start --> ## How to use this GPTQ model from Python code ### Install the necessary packages Requires: Transformers 4.32.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later. ```shell pip3 install transformers>=4.32.0 optimum>=1.12.0 pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7 ``` If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead: ```shell pip3 uninstall -y auto-gptq git clone https://github.com/PanQiWei/AutoGPTQ cd AutoGPTQ pip3 install . ``` ### For CodeLlama models only: you must use Transformers 4.33.0 or later. 
If 4.33.0 is not yet released when you read this, you will need to install Transformers from source: ```shell pip3 uninstall -y transformers pip3 install git+https://github.com/huggingface/transformers.git ``` ### You can then use the following code ```python from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline model_name_or_path = "TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-GPTQ" # To use a different branch, change revision # For example: revision="main" model = AutoModelForCausalLM.from_pretrained(model_name_or_path, device_map="auto", trust_remote_code=False, revision="main") tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True) prompt = "Tell me about AI" prompt_template=f'''You are a helpful AI assistant. USER: {prompt} ASSISTANT: ''' print("\n\n*** Generate:") input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda() output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512) print(tokenizer.decode(output[0])) # Inference can also be done using transformers' pipeline print("*** Pipeline:") pipe = pipeline( "text-generation", model=model, tokenizer=tokenizer, max_new_tokens=512, do_sample=True, temperature=0.7, top_p=0.95, top_k=40, repetition_penalty=1.1 ) print(pipe(prompt_template)[0]['generated_text']) ``` <!-- README_GPTQ.md-use-from-python end --> <!-- README_GPTQ.md-compatibility start --> ## Compatibility The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI). [ExLlama](https://github.com/turboderp/exllama) is compatible with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility. [Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models. 
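The quantisation parameters described above are recorded in each branch's `quantize_config.json`, which AutoGPTQ and Transformers read automatically. As a rough sketch only — field names follow AutoGPTQ's convention and the values are taken from the Provided Files table for the `main` branch; consult the actual file in the repo for the authoritative contents — it looks something like:

```json
{
  "bits": 4,
  "group_size": -1,
  "desc_act": true,
  "damp_percent": 0.01
}
```

Here `group_size: -1` corresponds to the "None" group size in the table above, and `desc_act: true` to Act Order.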
<!-- README_GPTQ.md-compatibility end --> <!-- footer start --> <!-- 200823 --> ## Discord For further support, and discussions on these models and AI in general, join us at: [TheBloke AI's Discord server](https://discord.gg/theblokeai) ## Thanks, and how to contribute Thanks to the [chirper.ai](https://chirper.ai) team! Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)! I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training. If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects. Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits. * Patreon: https://patreon.com/TheBlokeAI * Ko-Fi: https://ko-fi.com/TheBlokeAI **Special thanks to**: Aemon Algiz. **Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J.
Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov Thank you to all my generous patrons and donaters! And thank you again to a16z for their generous grant. <!-- footer end --> # Original model card: Monero's WizardLM-Uncensored-SuperCOT-Storytelling-30B This model is a triple model merge of WizardLM Uncensored+CoT+Storytelling, resulting in a comprehensive boost in reasoning and story writing capabilities. To allow all output, at the end of your prompt add ```### Certainly!``` You've become a compendium of knowledge on a vast array of topics. Lore Mastery is an arcane tradition fixated on understanding the underlying mechanics of magic. It is the most academic of all arcane traditions. The promise of uncovering new knowledge or proving (or discrediting) a theory of magic is usually required to rouse its practitioners from their laboratories, academies, and archives to pursue a life of adventure. Known as savants, followers of this tradition are a bookish lot who see beauty and mystery in the application of magic. The results of a spell are less interesting to them than the process that creates it. Some savants take a haughty attitude toward those who follow a tradition focused on a single school of magic, seeing them as provincial and lacking the sophistication needed to master true magic. 
Other savants are generous teachers, countering ignorance and deception with deep knowledge and good humor.
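As a minimal sketch of the prompting tip above, a helper that appends the `### Certainly!` suffix to a user prompt. The surrounding wording and the `build_prompt` name are illustrative, not a prescribed template for this model:

```python
def build_prompt(user_prompt: str) -> str:
    # '### Certainly!' is the suffix the card recommends appending
    # to the end of the prompt to allow all output.
    return f"USER: {user_prompt}\n### Certainly!"

prompt = build_prompt("Tell me a story about a savant.")
```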
17,161
[ [ -0.042144775390625, -0.05487060546875, 0.001728057861328125, 0.00777435302734375, -0.0213775634765625, -0.004055023193359375, 0.00893402099609375, -0.043914794921875, 0.0179595947265625, 0.031829833984375, -0.049285888671875, -0.03759765625, -0.03125, -0.0022563934326171875, -0.0284423828125, 0.08587646484375, 0.004730224609375, -0.0233612060546875, -0.002532958984375, -0.01471710205078125, -0.017333984375, -0.04217529296875, -0.05108642578125, -0.008575439453125, 0.019287109375, 0.0210113525390625, 0.0706787109375, 0.04071044921875, 0.010040283203125, 0.0194091796875, 0.0020122528076171875, 0.01251983642578125, -0.04754638671875, -0.0182342529296875, 0.01116180419921875, -0.0110015869140625, -0.04449462890625, 0.01113128662109375, 0.035369873046875, 0.01494598388671875, -0.0232391357421875, 0.016571044921875, 0.0006022453308105469, 0.047760009765625, -0.025787353515625, 0.016571044921875, -0.02972412109375, 0.01507568359375, -0.0021343231201171875, 0.00498199462890625, -0.00916290283203125, -0.024261474609375, 0.007259368896484375, -0.06805419921875, 0.027496337890625, 0.0225372314453125, 0.08856201171875, -0.00414276123046875, -0.04888916015625, 0.01316070556640625, -0.0254364013671875, 0.04779052734375, -0.06982421875, 0.0273284912109375, 0.04473876953125, 0.00823211669921875, -0.01253509521484375, -0.06329345703125, -0.05657958984375, 0.00238800048828125, -0.018310546875, 0.01959228515625, -0.02935791015625, 0.003612518310546875, 0.040374755859375, 0.05450439453125, -0.06927490234375, -0.011322021484375, -0.01468658447265625, -0.025360107421875, 0.06573486328125, 0.0215911865234375, 0.02496337890625, -0.0205230712890625, -0.021331787109375, -0.03314208984375, -0.04107666015625, -0.00005412101745605469, 0.0294189453125, -0.00778961181640625, -0.034820556640625, 0.035858154296875, -0.0208282470703125, 0.031280517578125, 0.01389312744140625, -0.0154266357421875, 0.024505615234375, -0.056671142578125, -0.0330810546875, -0.019439697265625, 0.09344482421875, 
0.04705810546875, -0.00940704345703125, 0.0159149169921875, -0.00020503997802734375, -0.01495361328125, -0.007389068603515625, -0.0767822265625, -0.03759765625, 0.03057861328125, -0.032958984375, -0.01088714599609375, -0.003467559814453125, -0.05780029296875, -0.0112457275390625, -0.0036334991455078125, 0.033782958984375, -0.053436279296875, -0.0341796875, 0.008941650390625, -0.033447265625, 0.041107177734375, 0.02606201171875, -0.06329345703125, 0.04119873046875, 0.0284271240234375, 0.045440673828125, 0.0012845993041992188, -0.013031005859375, -0.011932373046875, 0.01073455810546875, -0.0177001953125, 0.02783203125, -0.00952911376953125, -0.037506103515625, -0.02801513671875, 0.02288818359375, 0.0020122528076171875, -0.0169525146484375, 0.039459228515625, -0.0241546630859375, 0.035308837890625, -0.0298919677734375, -0.0477294921875, -0.0207977294921875, 0.006931304931640625, -0.045654296875, 0.09478759765625, 0.037200927734375, -0.07000732421875, 0.0198211669921875, -0.034576416015625, -0.01300048828125, -0.0048980712890625, 0.0022068023681640625, -0.033447265625, -0.0143890380859375, 0.0162200927734375, 0.0152435302734375, -0.031829833984375, 0.01172637939453125, -0.01096343994140625, -0.0178375244140625, 0.01325225830078125, -0.057647705078125, 0.0948486328125, 0.01258087158203125, -0.03509521484375, -0.01141357421875, -0.059906005859375, 0.0117340087890625, 0.032318115234375, 0.0009660720825195312, -0.0009274482727050781, -0.0163421630859375, 0.00923919677734375, 0.01042938232421875, 0.019378662109375, -0.02642822265625, 0.032745361328125, -0.01947021484375, 0.037078857421875, 0.0443115234375, 0.0084075927734375, 0.0062255859375, -0.0246429443359375, 0.03857421875, 0.0033245086669921875, 0.053924560546875, 0.0028057098388671875, -0.054168701171875, -0.051605224609375, -0.0174560546875, 0.0284271240234375, 0.052886962890625, -0.05035400390625, 0.038177490234375, -0.01448822021484375, -0.05426025390625, -0.023773193359375, -0.00856781005859375, 0.030731201171875, 
0.033660888671875, 0.03875732421875, -0.035797119140625, -0.0240936279296875, -0.066162109375, 0.006317138671875, -0.0301513671875, -0.0032329559326171875, 0.036865234375, 0.052154541015625, -0.0156097412109375, 0.058074951171875, -0.048583984375, 0.0009288787841796875, 0.006267547607421875, 0.004547119140625, 0.0303192138671875, 0.03985595703125, 0.051300048828125, -0.05902099609375, -0.054534912109375, 0.003635406494140625, -0.040130615234375, -0.001861572265625, -0.002574920654296875, -0.033203125, 0.0245819091796875, 0.0002593994140625, -0.0823974609375, 0.052886962890625, 0.032073974609375, -0.059906005859375, 0.07183837890625, -0.0239105224609375, 0.0224151611328125, -0.07257080078125, -0.002819061279296875, -0.003910064697265625, -0.023040771484375, -0.03302001953125, -0.006366729736328125, 0.0043792724609375, 0.0166015625, -0.02825927734375, 0.051544189453125, -0.04290771484375, 0.005413055419921875, 0.01493072509765625, 0.0016937255859375, 0.031494140625, 0.0423583984375, -0.01125335693359375, 0.058258056640625, 0.044158935546875, -0.028289794921875, 0.04876708984375, 0.028717041015625, -0.0009832382202148438, 0.027862548828125, -0.05157470703125, 0.00909423828125, 0.01971435546875, 0.0264892578125, -0.072265625, -0.013458251953125, 0.04095458984375, -0.04302978515625, 0.041229248046875, -0.017425537109375, -0.047088623046875, -0.03265380859375, -0.044158935546875, 0.0247802734375, 0.053558349609375, -0.0247802734375, 0.0286102294921875, 0.0260162353515625, 0.006877899169921875, -0.05145263671875, -0.053375244140625, -0.005359649658203125, -0.0177154541015625, -0.043792724609375, 0.03857421875, -0.01317596435546875, 0.005428314208984375, 0.00791168212890625, -0.0113525390625, -0.0015430450439453125, -0.0175018310546875, 0.019500732421875, 0.030364990234375, -0.01256561279296875, -0.0162200927734375, 0.0114898681640625, -0.0006546974182128906, -0.0007815361022949219, -0.034210205078125, 0.0250091552734375, -0.0120697021484375, -0.0054473876953125, 
-0.0279083251953125, 0.027740478515625, 0.0400390625, 0.004253387451171875, 0.06207275390625, 0.06317138671875, -0.020263671875, 0.0115966796875, -0.037017822265625, -0.00403594970703125, -0.035919189453125, 0.0151824951171875, -0.0147247314453125, -0.056793212890625, 0.043212890625, 0.041778564453125, 0.0128021240234375, 0.06781005859375, 0.026702880859375, 0.005413055419921875, 0.081298828125, 0.0211334228515625, -0.00833892822265625, 0.029449462890625, -0.050018310546875, -0.0038356781005859375, -0.05499267578125, -0.0177764892578125, -0.019378662109375, -0.010101318359375, -0.061370849609375, -0.04608154296875, 0.017974853515625, 0.0255584716796875, -0.05523681640625, 0.044708251953125, -0.048187255859375, 0.0184783935546875, 0.04730224609375, 0.00588226318359375, 0.021820068359375, 0.005718231201171875, -0.02386474609375, 0.003063201904296875, -0.05035400390625, -0.019622802734375, 0.084716796875, 0.0240631103515625, 0.041351318359375, 0.0160980224609375, 0.0274658203125, 0.0169525146484375, 0.0192413330078125, -0.02960205078125, 0.0462646484375, 0.00218963623046875, -0.05743408203125, -0.0313720703125, -0.045196533203125, -0.0638427734375, 0.0178680419921875, -0.006786346435546875, -0.06298828125, 0.031097412109375, -0.004444122314453125, -0.020721435546875, 0.024017333984375, -0.063720703125, 0.0828857421875, -0.01290130615234375, -0.03192138671875, -0.004634857177734375, -0.05780029296875, 0.01971435546875, 0.00954437255859375, 0.0002529621124267578, -0.010498046875, -0.01397705078125, 0.060638427734375, -0.07171630859375, 0.049102783203125, -0.01534271240234375, -0.003635406494140625, 0.045623779296875, -0.005954742431640625, 0.048004150390625, 0.01395416259765625, 0.0063323974609375, 0.03204345703125, 0.036590576171875, -0.03729248046875, -0.033355712890625, 0.0443115234375, -0.07342529296875, -0.042724609375, -0.03826904296875, -0.033843994140625, -0.01312255859375, 0.00778961181640625, 0.0382080078125, 0.038299560546875, -0.0013742446899414062, 
0.0033931732177734375, 0.055816650390625, -0.02313232421875, 0.030853271484375, 0.033843994140625, -0.0231781005859375, -0.049468994140625, 0.06842041015625, 0.01236724853515625, 0.020050048828125, 0.021514892578125, 0.006740570068359375, -0.044647216796875, -0.041046142578125, -0.04571533203125, 0.02362060546875, -0.04071044921875, -0.0307769775390625, -0.049896240234375, -0.0302734375, -0.03875732421875, 0.023040771484375, -0.022186279296875, -0.05743408203125, -0.0406494140625, -0.0162811279296875, 0.072265625, 0.0248565673828125, -0.015655517578125, 0.01329803466796875, -0.05865478515625, 0.0206146240234375, 0.0308990478515625, 0.0205078125, -0.0020198822021484375, -0.059967041015625, -0.0135955810546875, 0.0101165771484375, -0.044830322265625, -0.07171630859375, 0.055419921875, 0.014984130859375, 0.03924560546875, 0.037933349609375, 0.0174102783203125, 0.060791015625, -0.0236053466796875, 0.0821533203125, 0.023529052734375, -0.06427001953125, 0.03509521484375, -0.031036376953125, 0.00909423828125, 0.028900146484375, 0.046905517578125, -0.033111572265625, -0.0277862548828125, -0.06744384765625, -0.06695556640625, 0.0389404296875, 0.0286102294921875, 0.00225067138671875, 0.0007128715515136719, 0.028778076171875, 0.01297760009765625, 0.00914764404296875, -0.055908203125, -0.0562744140625, -0.036956787109375, -0.010711669921875, 0.003536224365234375, -0.01491546630859375, -0.0199127197265625, -0.05572509765625, 0.06829833984375, -0.0098114013671875, 0.0472412109375, 0.032562255859375, 0.01032257080078125, 0.00164031982421875, 0.000026106834411621094, 0.024444580078125, 0.0458984375, -0.018280029296875, -0.01097869873046875, 0.014068603515625, -0.07000732421875, 0.005832672119140625, 0.031951904296875, -0.0034942626953125, -0.01036834716796875, 0.015838623046875, 0.0670166015625, -0.0066375732421875, -0.03631591796875, 0.035919189453125, -0.0256805419921875, -0.0307159423828125, -0.01535797119140625, 0.020965576171875, 0.010406494140625, 0.0289154052734375, 
0.033660888671875, -0.024444580078125, 0.0242462158203125, -0.04052734375, 0.0164794921875, 0.034210205078125, -0.017059326171875, -0.0200653076171875, 0.05572509765625, -0.0132598876953125, 0.005794525146484375, 0.05108642578125, -0.0310821533203125, -0.0318603515625, 0.058135986328125, 0.03924560546875, 0.054534912109375, -0.00431060791015625, 0.0220947265625, 0.0360107421875, 0.01515960693359375, 0.004669189453125, 0.0214080810546875, -0.00597381591796875, -0.047607421875, -0.02349853515625, -0.04345703125, -0.027008056640625, 0.01148223876953125, -0.047821044921875, 0.005931854248046875, -0.028900146484375, -0.023040771484375, -0.0159759521484375, 0.042938232421875, -0.043365478515625, 0.0186767578125, 0.0009975433349609375, 0.080810546875, -0.0499267578125, 0.0687255859375, 0.022369384765625, -0.023834228515625, -0.0765380859375, -0.0169677734375, 0.0025634765625, -0.0401611328125, 0.0096588134765625, -0.00429534912109375, 0.0247802734375, 0.00862884521484375, -0.049468994140625, -0.06640625, 0.1131591796875, 0.02099609375, -0.041839599609375, -0.0010671615600585938, 0.0038394927978515625, 0.0300140380859375, -0.01274871826171875, 0.058837890625, 0.048004150390625, 0.0301361083984375, 0.0155792236328125, -0.07061767578125, 0.026397705078125, -0.03472900390625, -0.00693511962890625, 0.01122283935546875, -0.07440185546875, 0.06719970703125, 0.0034999847412109375, -0.01293182373046875, -0.0019168853759765625, 0.04779052734375, 0.0291900634765625, 0.01366424560546875, 0.02801513671875, 0.0577392578125, 0.06610107421875, -0.02020263671875, 0.09454345703125, -0.017822265625, 0.044281005859375, 0.057647705078125, 0.00923919677734375, 0.042938232421875, 0.0155792236328125, -0.055816650390625, 0.037994384765625, 0.079345703125, -0.006694793701171875, 0.0299072265625, -0.0023708343505859375, -0.017333984375, -0.0033969879150390625, 0.010040283203125, -0.057647705078125, -0.008148193359375, 0.03240966796875, -0.0095977783203125, 0.01448822021484375, -0.012664794921875, 
0.00719451904296875, -0.0528564453125, -0.0191802978515625, 0.031280517578125, 0.0144500732421875, -0.02301025390625, 0.06475830078125, -0.010894775390625, 0.0457763671875, -0.044219970703125, -0.007354736328125, -0.035552978515625, -0.0157470703125, -0.018768310546875, -0.060546875, 0.00843048095703125, -0.01535797119140625, -0.0140380859375, -0.0078125, 0.0467529296875, -0.01678466796875, -0.035858154296875, 0.022857666015625, 0.0260772705078125, 0.0240631103515625, -0.0007071495056152344, -0.0770263671875, 0.0052490234375, 0.003406524658203125, -0.052490234375, 0.03558349609375, 0.049774169921875, 0.005466461181640625, 0.044342041015625, 0.04705810546875, -0.0078125, 0.00891876220703125, -0.01052093505859375, 0.06591796875, -0.066162109375, -0.02435302734375, -0.05572509765625, 0.03631591796875, -0.019439697265625, -0.033966064453125, 0.0692138671875, 0.0513916015625, 0.05108642578125, 0.007228851318359375, 0.04400634765625, -0.0260009765625, 0.01448822021484375, -0.025543212890625, 0.044891357421875, -0.051177978515625, 0.009796142578125, -0.031005859375, -0.052032470703125, -0.002063751220703125, 0.04144287109375, -0.01023101806640625, 0.0122222900390625, 0.043365478515625, 0.0645751953125, 0.0034885406494140625, 0.018951416015625, 0.02142333984375, 0.0318603515625, 0.008758544921875, 0.06298828125, 0.0523681640625, -0.0694580078125, 0.04541015625, -0.0281829833984375, -0.01654052734375, -0.01291656494140625, -0.06158447265625, -0.051116943359375, -0.033905029296875, -0.050018310546875, -0.0455322265625, -0.006122589111328125, 0.059356689453125, 0.057952880859375, -0.05615234375, -0.0161895751953125, -0.0057373046875, 0.01314544677734375, -0.029144287109375, -0.02655029296875, 0.0389404296875, 0.0302581787109375, -0.045806884765625, 0.005359649658203125, 0.01250457763671875, 0.0217742919921875, -0.0155792236328125, -0.03106689453125, -0.0113067626953125, 0.01300048828125, 0.049774169921875, 0.03692626953125, -0.035491943359375, -0.0126800537109375, 
-0.0044708251953125, -0.00847625732421875, 0.0164794921875, 0.01947021484375, -0.051300048828125, 0.0077667236328125, 0.033935546875, 0.00930023193359375, 0.07147216796875, -0.0011167526245117188, 0.015533447265625, -0.0291900634765625, 0.0007114410400390625, 0.008056640625, 0.020111083984375, 0.00023126602172851562, -0.04400634765625, 0.044769287109375, 0.041259765625, -0.04736328125, -0.0572509765625, -0.01442718505859375, -0.0850830078125, -0.0152435302734375, 0.08343505859375, -0.00800323486328125, -0.029510498046875, -0.01065826416015625, -0.022705078125, 0.0214691162109375, -0.032379150390625, 0.0146331787109375, 0.04541015625, -0.0284423828125, -0.02850341796875, -0.06591796875, 0.0386962890625, 0.0118865966796875, -0.051239013671875, 0.006805419921875, 0.04449462890625, 0.032379150390625, 0.0025463104248046875, 0.0711669921875, -0.02099609375, 0.017303466796875, 0.0218963623046875, -0.001361846923828125, -0.0027103424072265625, 0.0034160614013671875, -0.01227569580078125, 0.0045928955078125, -0.0222930908203125, 0.0004878044128417969 ] ]
alefiury/wav2vec2-large-xlsr-53-gender-recognition-librispeech
2023-11-06T20:35:21.000Z
[ "transformers", "pytorch", "wav2vec2", "audio-classification", "generated_from_trainer", "dataset:librispeech_asr", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
audio-classification
alefiury
null
null
alefiury/wav2vec2-large-xlsr-53-gender-recognition-librispeech
3
9,610
transformers
2023-04-24T02:39:47
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: weights
  results: []
datasets:
- librispeech_asr
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# wav2vec2-large-xlsr-53-gender-recognition-librispeech

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on Librispeech-clean-100 for gender recognition.
It achieves the following results on the evaluation set:
- Loss: 0.0061
- F1: 0.9993

### Compute your inferences

```python
import os
from typing import List, Optional, Union, Dict

import tqdm
import torch
import torchaudio
import numpy as np
import pandas as pd
from torch import nn
from torch.utils.data import DataLoader
from torch.nn import functional as F

from transformers import (
    AutoFeatureExtractor,
    AutoModelForAudioClassification,
    Wav2Vec2Processor
)


class CustomDataset(torch.utils.data.Dataset):
    def __init__(
        self,
        dataset: List,
        basedir: Optional[str] = None,
        sampling_rate: int = 16000,
        max_audio_len: int = 5,
    ):
        self.dataset = dataset
        self.basedir = basedir

        self.sampling_rate = sampling_rate
        self.max_audio_len = max_audio_len

    def __len__(self):
        """
        Return the length of the dataset
        """
        return len(self.dataset)

    def _cutorpad(self, audio: np.ndarray) -> np.ndarray:
        """
        Cut audio to the maximum length; padding is handled later by the
        processor in the collate function
        """
        effective_length = self.sampling_rate * self.max_audio_len
        len_audio = len(audio)

        # If audio length is bigger than the wished audio length, truncate it
        if len_audio > effective_length:
            audio = audio[:effective_length]

        return audio

    def __getitem__(self, index) -> torch.Tensor:
        """
        Return the audio and the sampling rate
        """
        if self.basedir is None:
            filepath = self.dataset[index]
        else:
            filepath = os.path.join(self.basedir, self.dataset[index])

        speech_array, sr = torchaudio.load(filepath)

        # Transform to mono
        if speech_array.shape[0] > 1:
            speech_array = torch.mean(speech_array, dim=0, keepdim=True)

        if sr != self.sampling_rate:
            transform = torchaudio.transforms.Resample(sr, self.sampling_rate)
            speech_array = transform(speech_array)
            sr = self.sampling_rate

        speech_array = speech_array.squeeze().numpy()

        # Cut audio to the maximum length
        speech_array = self._cutorpad(speech_array)

        return speech_array


class CollateFunc:
    def __init__(
        self,
        processor: Wav2Vec2Processor,
        max_length: Optional[int] = None,
        padding: Union[bool, str] = True,
        pad_to_multiple_of: Optional[int] = None,
        sampling_rate: int = 16000,
    ):
        self.padding = padding
        self.processor = processor
        self.max_length = max_length
        self.sampling_rate = sampling_rate
        self.pad_to_multiple_of = pad_to_multiple_of

    def __call__(self, batch: List):
        input_features = []

        for audio in batch:
            input_tensor = self.processor(audio, sampling_rate=self.sampling_rate).input_values
            input_tensor = np.squeeze(input_tensor)
            input_features.append({"input_values": input_tensor})

        batch = self.processor.pad(
            input_features,
            padding=self.padding,
            max_length=self.max_length,
            pad_to_multiple_of=self.pad_to_multiple_of,
            return_tensors="pt",
        )

        return batch


def predict(test_dataloader, model, device: torch.device):
    """
    Predict the class of the audio
    """
    model.to(device)
    model.eval()
    preds = []

    with torch.no_grad():
        for batch in tqdm.tqdm(test_dataloader):
            input_values, attention_mask = batch['input_values'].to(device), batch['attention_mask'].to(device)

            logits = model(input_values, attention_mask=attention_mask).logits
            scores = F.softmax(logits, dim=-1)

            pred = torch.argmax(scores, dim=1).cpu().detach().numpy()

            preds.extend(pred)

    return preds


def get_gender(model_name_or_path: str, audio_paths: List[str], label2id: Dict, id2label: Dict, device: torch.device):
    num_labels = 2

    feature_extractor = AutoFeatureExtractor.from_pretrained(model_name_or_path)
    model = AutoModelForAudioClassification.from_pretrained(
        pretrained_model_name_or_path=model_name_or_path,
        num_labels=num_labels,
        label2id=label2id,
        id2label=id2label,
    )

    test_dataset = CustomDataset(audio_paths)
    data_collator = CollateFunc(
        processor=feature_extractor,
        padding=True,
        sampling_rate=16000,
    )

    test_dataloader = DataLoader(
        dataset=test_dataset,
        batch_size=16,
        collate_fn=data_collator,
        shuffle=False,
        num_workers=10
    )

    preds = predict(test_dataloader=test_dataloader, model=model, device=device)

    return preds


model_name_or_path = "alefiury/wav2vec2-large-xlsr-53-gender-recognition-librispeech"

audio_paths = []  # Must be a list with absolute paths of the audios that will be used in inference
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

label2id = {
    "female": 0,
    "male": 1
}

id2label = {
    0: "female",
    1: "male"
}

preds = get_gender(model_name_or_path, audio_paths, label2id, id2label, device)
```

## Training and evaluation data

The Librispeech-clean-100 dataset was used to train the model, with 70% of the data used for training, 10% for validation, and 20% for testing.

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.002         | 1.0   | 1248 | 0.0061          | 0.9993 |

### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.0+cu118
- Tokenizers 0.13.3
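The preprocessing pipeline above (truncate each clip in the dataset, then zero-pad the batch in the collator) can be sketched without transformers. `cut_audio` and `pad_batch` are simplified stand-ins for `_cutorpad` and `processor.pad`, written as an assumption about their behavior rather than a drop-in replacement:

```python
import numpy as np

def cut_audio(audio: np.ndarray, sampling_rate: int = 16000, max_audio_len: int = 5) -> np.ndarray:
    # Truncate audio to max_audio_len seconds, as _cutorpad does;
    # shorter clips pass through unchanged.
    effective_length = sampling_rate * max_audio_len
    return audio[:effective_length]

def pad_batch(batch):
    # Zero-pad a list of 1-D arrays to the longest clip and build an
    # attention mask (1 = real sample, 0 = padding), mimicking what
    # processor.pad does inside CollateFunc.
    max_len = max(len(a) for a in batch)
    input_values = np.zeros((len(batch), max_len), dtype=np.float32)
    attention_mask = np.zeros((len(batch), max_len), dtype=np.int64)
    for i, a in enumerate(batch):
        input_values[i, :len(a)] = a
        attention_mask[i, :len(a)] = 1
    return input_values, attention_mask

clips = [np.ones(16000 * 7, dtype=np.float32), np.ones(16000 * 2, dtype=np.float32)]
values, mask = pad_batch([cut_audio(c) for c in clips])
```

The attention mask is what lets the wav2vec2 encoder ignore the zero-padded tail of short clips, which is why the collator returns it alongside `input_values`.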
6,675
[ [ -0.04107666015625, -0.047454833984375, 0.0142669677734375, 0.01309967041015625, -0.0025653839111328125, -0.0146636962890625, -0.0206298828125, -0.0197296142578125, 0.006511688232421875, 0.00736236572265625, -0.060943603515625, -0.0457763671875, -0.038970947265625, -0.0155181884765625, -0.0268707275390625, 0.06903076171875, 0.01186370849609375, 0.0103607177734375, 0.0052490234375, -0.01324462890625, -0.047454833984375, -0.0204925537109375, -0.054840087890625, -0.02532958984375, 0.00899505615234375, 0.0235748291015625, 0.01439666748046875, 0.036102294921875, 0.026702880859375, 0.0294189453125, -0.0220184326171875, 0.00278472900390625, -0.0261993408203125, -0.0088043212890625, 0.017608642578125, -0.0335693359375, -0.04412841796875, 0.0167694091796875, 0.054168701171875, 0.0223236083984375, -0.016754150390625, 0.04412841796875, 0.00838470458984375, 0.02777099609375, -0.0227203369140625, 0.007610321044921875, -0.02899169921875, 0.00988006591796875, 0.0006384849548339844, -0.021026611328125, -0.0216064453125, -0.020416259765625, 0.01558685302734375, -0.036376953125, 0.04150390625, -0.00591278076171875, 0.068603515625, 0.0168609619140625, -0.01332855224609375, -0.021087646484375, -0.060882568359375, 0.06640625, -0.058319091796875, 0.04180908203125, 0.0374755859375, 0.01523590087890625, -0.004062652587890625, -0.055145263671875, -0.042144775390625, 0.0019702911376953125, 0.010040283203125, 0.019195556640625, -0.012176513671875, -0.00310516357421875, 0.033172607421875, 0.01904296875, -0.045257568359375, 0.0008935928344726562, -0.059112548828125, -0.026092529296875, 0.06396484375, -0.00042176246643066406, 0.0145111083984375, -0.0109405517578125, -0.031097412109375, -0.031158447265625, -0.01293182373046875, 0.023590087890625, 0.041717529296875, 0.01171875, -0.04132080078125, 0.0303802490234375, 0.004489898681640625, 0.043548583984375, 0.016815185546875, -0.04168701171875, 0.059417724609375, -0.019073486328125, -0.022857666015625, 0.035858154296875, 0.07354736328125, 
0.0214080810546875, -0.0007762908935546875, 0.01885986328125, -0.00550079345703125, -0.00829315185546875, -0.01178741455078125, -0.0574951171875, -0.0286712646484375, 0.046722412109375, -0.0214691162109375, -0.018157958984375, -0.00412750244140625, -0.05694580078125, -0.00333404541015625, -0.0201568603515625, 0.056671142578125, -0.05059814453125, -0.0229949951171875, 0.01416015625, -0.0321044921875, 0.0155487060546875, -0.004302978515625, -0.077392578125, 0.01468658447265625, 0.0180511474609375, 0.058319091796875, 0.01153564453125, -0.0233001708984375, -0.022857666015625, 0.006885528564453125, -0.0186614990234375, 0.022003173828125, 0.00334930419921875, -0.03558349609375, -0.0303802490234375, -0.006656646728515625, -0.01479339599609375, -0.0367431640625, 0.0562744140625, -0.005435943603515625, 0.0250396728515625, -0.01418304443359375, -0.040374755859375, -0.01340484619140625, -0.0193939208984375, -0.033050537109375, 0.0946044921875, 0.007617950439453125, -0.060028076171875, 0.034454345703125, -0.039306640625, -0.0225067138671875, -0.022003173828125, -0.01222991943359375, -0.06439208984375, -0.012969970703125, 0.02093505859375, 0.03265380859375, -0.0229644775390625, 0.0160369873046875, -0.0237274169921875, -0.035064697265625, 0.0303955078125, -0.038543701171875, 0.0816650390625, 0.008575439453125, -0.05474853515625, 0.007904052734375, -0.0811767578125, 0.00832366943359375, 0.00502777099609375, -0.0273895263671875, 0.008056640625, -0.01617431640625, 0.0206146240234375, 0.0254364013671875, -0.00403594970703125, -0.042938232421875, -0.01418304443359375, -0.026885986328125, 0.037078857421875, 0.055999755859375, -0.00795745849609375, 0.017822265625, -0.0321044921875, 0.01416778564453125, -0.001781463623046875, 0.0031909942626953125, 0.01617431640625, -0.03631591796875, -0.05560302734375, -0.05126953125, 0.024322509765625, 0.04022216796875, -0.0189361572265625, 0.059814453125, -0.00922393798828125, -0.05877685546875, -0.05865478515625, 0.00029087066650390625, 
0.0240325927734375, 0.04888916015625, 0.0455322265625, -0.0140533447265625, -0.054443359375, -0.06890869140625, 0.0108489990234375, -0.010223388671875, -0.0164794921875, 0.0384521484375, 0.039764404296875, -0.0095977783203125, 0.07049560546875, -0.03253173828125, -0.037017822265625, -0.0160369873046875, 0.0186767578125, 0.044464111328125, 0.065673828125, 0.046630859375, -0.03912353515625, -0.005588531494140625, -0.029144287109375, -0.057586669921875, -0.01198577880859375, -0.01093292236328125, -0.015777587890625, 0.003765106201171875, 0.018707275390625, -0.04638671875, 0.03271484375, 0.020782470703125, -0.026123046875, 0.0489501953125, -0.01319122314453125, 0.01398468017578125, -0.06768798828125, 0.00652313232421875, 0.00437164306640625, -0.0042266845703125, -0.045989990234375, -0.034698486328125, -0.01012420654296875, 0.0017576217651367188, -0.043975830078125, 0.03656005859375, -0.0243072509765625, -0.017578125, -0.003170013427734375, 0.004177093505859375, -0.0044097900390625, 0.05072021484375, 0.005847930908203125, 0.048431396484375, 0.051300048828125, -0.057647705078125, 0.039031982421875, 0.02935791015625, -0.0267486572265625, 0.03662109375, -0.06671142578125, 0.0164947509765625, -0.0004029273986816406, 0.01071929931640625, -0.08404541015625, -0.0087432861328125, 0.015655517578125, -0.0682373046875, 0.0257568359375, -0.0103607177734375, -0.032958984375, -0.038970947265625, -0.0185089111328125, 0.03619384765625, 0.049560546875, -0.0411376953125, 0.044464111328125, 0.024444580078125, 0.0124664306640625, -0.041290283203125, -0.064453125, -0.045745849609375, -0.0254058837890625, -0.05303955078125, 0.024932861328125, -0.01090240478515625, 0.005863189697265625, -0.0128326416015625, -0.0166168212890625, 0.004486083984375, -0.004581451416015625, 0.0295867919921875, 0.0255889892578125, -0.0095367431640625, 0.010009765625, -0.0018186569213867188, -0.0016717910766601562, 0.0199737548828125, -0.019744873046875, 0.0616455078125, -0.006256103515625, -0.0145111083984375, 
-0.061187744140625, 0.00001531839370727539, 0.0245513916015625, -0.0165863037109375, 0.045928955078125, 0.0936279296875, -0.0234375, -0.0114593505859375, -0.035675048828125, -0.01953125, -0.03912353515625, 0.05059814453125, -0.0286712646484375, -0.02789306640625, 0.039276123046875, 0.02703857421875, 0.0054931640625, 0.052093505859375, 0.04754638671875, -0.0225830078125, 0.07061767578125, 0.0120391845703125, 0.005401611328125, 0.0268096923828125, -0.07098388671875, -0.0109710693359375, -0.0611572265625, -0.03643798828125, -0.031768798828125, -0.037261962890625, -0.0428466796875, -0.0298004150390625, 0.03125, 0.0091705322265625, -0.040985107421875, 0.024658203125, -0.050201416015625, 0.001338958740234375, 0.067138671875, 0.02496337890625, -0.0166015625, 0.0136566162109375, -0.01479339599609375, -0.006473541259765625, -0.047821044921875, -0.01317596435546875, 0.087890625, 0.044921875, 0.044708251953125, 0.0010204315185546875, 0.05828857421875, 0.00955963134765625, -0.01480865478515625, -0.046112060546875, 0.041290283203125, 0.00591278076171875, -0.055755615234375, -0.02947998046875, -0.03131103515625, -0.05224609375, 0.0139007568359375, -0.031402587890625, -0.06573486328125, 0.0345458984375, 0.005458831787109375, -0.02239990234375, 0.0307769775390625, -0.042022705078125, 0.048370361328125, -0.00667572021484375, -0.035430908203125, 0.0021209716796875, -0.038848876953125, 0.0213623046875, 0.00302886962890625, 0.01080322265625, -0.0181732177734375, 0.02734375, 0.109375, -0.01751708984375, 0.0411376953125, -0.01904296875, 0.00470733642578125, 0.035186767578125, -0.01666259765625, 0.01016998291015625, 0.01012420654296875, -0.0101776123046875, 0.020538330078125, 0.0032062530517578125, -0.024444580078125, -0.0219879150390625, 0.039581298828125, -0.087646484375, -0.021209716796875, -0.0311737060546875, -0.045623779296875, -0.0172882080078125, 0.0004761219024658203, 0.0552978515625, 0.0601806640625, 0.004337310791015625, 0.01398468017578125, 0.054473876953125, -0.01904296875, 
0.047637939453125, 0.0132293701171875, -0.007904052734375, -0.057403564453125, 0.06549072265625, 0.006916046142578125, 0.022613525390625, -0.00531005859375, 0.007904052734375, -0.037322998046875, -0.0265350341796875, -0.0322265625, 0.0215301513671875, -0.046295166015625, -0.009979248046875, -0.04986572265625, -0.024017333984375, -0.04327392578125, 0.00782012939453125, -0.044342041015625, -0.02642822265625, -0.04022216796875, -0.01007843017578125, 0.033966064453125, 0.0158843994140625, -0.0204010009765625, 0.033233642578125, -0.046295166015625, 0.02655029296875, 0.0222320556640625, 0.01136016845703125, -0.008270263671875, -0.0772705078125, -0.01331329345703125, 0.00797271728515625, -0.0267791748046875, -0.06512451171875, 0.036834716796875, 0.00736236572265625, 0.04388427734375, 0.02288818359375, -0.01140594482421875, 0.054107666015625, -0.0183258056640625, 0.053253173828125, 0.03167724609375, -0.0758056640625, 0.0482177734375, -0.01849365234375, 0.018463134765625, 0.029754638671875, 0.02423095703125, -0.0216522216796875, -0.01399993896484375, -0.0611572265625, -0.06744384765625, 0.08575439453125, 0.04144287109375, -0.00518035888671875, 0.026763916015625, 0.006809234619140625, -0.01203155517578125, 0.01238250732421875, -0.04815673828125, -0.059173583984375, -0.0203857421875, -0.0211029052734375, -0.0185699462890625, -0.01364898681640625, -0.01538848876953125, -0.05035400390625, 0.07965087890625, 0.0162506103515625, 0.06121826171875, 0.034698486328125, 0.00211334228515625, 0.00634002685546875, 0.01140594482421875, 0.04669189453125, 0.031890869140625, -0.044342041015625, -0.0028228759765625, 0.026153564453125, -0.04937744140625, 0.034423828125, 0.0191497802734375, -0.004241943359375, 0.0251922607421875, 0.0369873046875, 0.0897216796875, 0.0024852752685546875, -0.0188140869140625, 0.0278778076171875, -0.00945281982421875, -0.0382080078125, -0.039825439453125, 0.01172637939453125, 0.0136566162109375, 0.0234222412109375, 0.0343017578125, 0.00914764404296875, 
0.00846099853515625, -0.0209503173828125, 0.021942138671875, 0.01526641845703125, -0.033477783203125, -0.013336181640625, 0.07940673828125, -0.0017862319946289062, -0.019073486328125, 0.050201416015625, -0.00034928321838378906, -0.02392578125, 0.06805419921875, 0.03692626953125, 0.07171630859375, -0.03253173828125, 0.0027599334716796875, 0.056915283203125, 0.0177001953125, 0.0010385513305664062, 0.0266265869140625, -0.0019502639770507812, -0.041748046875, -0.003917694091796875, -0.057098388671875, 0.0018367767333984375, 0.024261474609375, -0.0555419921875, 0.041900634765625, -0.032989501953125, -0.038299560546875, 0.0204315185546875, 0.005870819091796875, -0.061676025390625, 0.0290985107421875, 0.0210113525390625, 0.060638427734375, -0.06787109375, 0.0611572265625, 0.0227508544921875, -0.0341796875, -0.08721923828125, -0.0186614990234375, 0.003971099853515625, -0.03765869140625, 0.055084228515625, 0.01532745361328125, -0.005558013916015625, 0.0031299591064453125, -0.030120849609375, -0.07220458984375, 0.09381103515625, 0.030731201171875, -0.04327392578125, 0.004985809326171875, 0.0116729736328125, 0.028900146484375, -0.01355743408203125, 0.04583740234375, 0.05908203125, 0.028167724609375, 0.00373077392578125, -0.07489013671875, -0.00975799560546875, -0.0150299072265625, -0.01316070556640625, -0.0014514923095703125, -0.0384521484375, 0.08404541015625, -0.02239990234375, 0.007793426513671875, -0.009368896484375, 0.06597900390625, 0.045166015625, 0.0265350341796875, 0.04449462890625, 0.043243408203125, 0.0594482421875, -0.017974853515625, 0.055084228515625, -0.01165008544921875, 0.04486083984375, 0.07232666015625, 0.01279449462890625, 0.049072265625, 0.01776123046875, -0.034759521484375, 0.020416259765625, 0.049163818359375, -0.01751708984375, 0.042236328125, 0.0171966552734375, -0.0196533203125, -0.00518035888671875, 0.01611328125, -0.052490234375, 0.05401611328125, 0.01165771484375, -0.04241943359375, 0.019439697265625, 0.016876220703125, -0.002902984619140625, 
-0.027679443359375, -0.0146636962890625, 0.038360595703125, -0.005069732666015625, -0.0208740234375, 0.0748291015625, 0.0008325576782226562, 0.062042236328125, -0.036224365234375, 0.0023937225341796875, 0.002819061279296875, 0.01343536376953125, -0.026580810546875, -0.0478515625, -0.0010023117065429688, -0.0209808349609375, -0.018829345703125, 0.005584716796875, 0.0278167724609375, -0.0537109375, -0.050140380859375, 0.035064697265625, 0.0098114013671875, 0.0198516845703125, 0.0178680419921875, -0.0589599609375, 0.012451171875, 0.016845703125, -0.0290069580078125, 0.0055694580078125, 0.007965087890625, 0.033966064453125, 0.03997802734375, 0.049346923828125, 0.01169586181640625, 0.0264129638671875, -0.007198333740234375, 0.042816162109375, -0.04718017578125, -0.04327392578125, -0.05712890625, 0.0290069580078125, -0.00969696044921875, -0.03265380859375, 0.043487548828125, 0.060791015625, 0.062103271484375, -0.007701873779296875, 0.047760009765625, -0.006771087646484375, 0.037933349609375, -0.0447998046875, 0.06146240234375, -0.03765869140625, 0.0280303955078125, -0.029205322265625, -0.058624267578125, 0.0130615234375, 0.0540771484375, -0.01971435546875, 0.028564453125, 0.033477783203125, 0.0670166015625, -0.01294708251953125, 0.00560760498046875, 0.0132598876953125, 0.01177215576171875, 0.0206146240234375, 0.05133056640625, 0.0283355712890625, -0.0643310546875, 0.045806884765625, -0.055145263671875, -0.006816864013671875, -0.0010204315185546875, -0.0252532958984375, -0.059173583984375, -0.053253173828125, -0.039581298828125, -0.045166015625, -0.00209808349609375, 0.076904296875, 0.07135009765625, -0.055145263671875, -0.033935546875, 0.01169586181640625, -0.0023441314697265625, -0.0295867919921875, -0.0203399658203125, 0.05462646484375, 0.0003859996795654297, -0.073974609375, 0.01971435546875, -0.0207672119140625, 0.0171966552734375, -0.00322723388671875, -0.018218994140625, -0.032318115234375, -0.00424957275390625, 0.028106689453125, 0.01111602783203125, 
-0.04302978515625, -0.01261138916015625, -0.0199737548828125, -0.01020050048828125, 0.017730712890625, 0.0168304443359375, -0.05255126953125, 0.0330810546875, 0.038787841796875, 0.01084136962890625, 0.06304931640625, -0.0197296142578125, 0.017364501953125, -0.057220458984375, 0.0271453857421875, 0.0110626220703125, 0.028045654296875, 0.019500732421875, -0.01885986328125, 0.033416748046875, 0.030548095703125, -0.042144775390625, -0.07037353515625, -0.029937744140625, -0.0887451171875, -0.0060882568359375, 0.09539794921875, 0.0047149658203125, -0.0206451416015625, 0.006847381591796875, -0.0266876220703125, 0.052825927734375, -0.039520263671875, 0.0254058837890625, 0.0262451171875, -0.0153045654296875, -0.00696563720703125, -0.058197021484375, 0.047149658203125, 0.0261993408203125, -0.033905029296875, -0.020111083984375, 0.0238189697265625, 0.047943115234375, 0.01529693603515625, 0.05694580078125, -0.00193023681640625, 0.0374755859375, 0.01407623291015625, 0.03350830078125, -0.01904296875, -0.01172637939453125, -0.0279693603515625, 0.0121307373046875, -0.0201568603515625, -0.05499267578125 ] ]
timm/convnext_base.fb_in22k_ft_in1k_384
2023-03-31T22:04:16.000Z
[ "timm", "pytorch", "safetensors", "image-classification", "dataset:imagenet-1k", "dataset:imagenet-22k", "arxiv:2201.03545", "license:apache-2.0", "region:us" ]
image-classification
timm
null
null
timm/convnext_base.fb_in22k_ft_in1k_384
0
9,590
timm
2022-12-13T07:07:47
--- tags: - image-classification - timm library_tag: timm license: apache-2.0 datasets: - imagenet-1k - imagenet-22k --- # Model card for convnext_base.fb_in22k_ft_in1k_384 A ConvNeXt image classification model. Pretrained on ImageNet-22k and fine-tuned on ImageNet-1k by paper authors. ## Model Details - **Model Type:** Image classification / feature backbone - **Model Stats:** - Params (M): 88.6 - GMACs: 45.2 - Activations (M): 84.5 - Image size: 384 x 384 - **Papers:** - A ConvNet for the 2020s: https://arxiv.org/abs/2201.03545 - **Original:** https://github.com/facebookresearch/ConvNeXt - **Dataset:** ImageNet-1k - **Pretrain Dataset:** ImageNet-22k ## Model Usage ### Image Classification ```python from urllib.request import urlopen from PIL import Image import timm import torch img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model('convnext_base.fb_in22k_ft_in1k_384', pretrained=True) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1 top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5) ``` ### Feature Map Extraction ```python from urllib.request import urlopen from PIL import Image import timm img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model( 'convnext_base.fb_in22k_ft_in1k_384', pretrained=True, features_only=True, ) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # unsqueeze single image 
into batch of 1 for o in output: # print shape of each feature map in output # e.g.: # torch.Size([1, 128, 96, 96]) # torch.Size([1, 256, 48, 48]) # torch.Size([1, 512, 24, 24]) # torch.Size([1, 1024, 12, 12]) print(o.shape) ``` ### Image Embeddings ```python from urllib.request import urlopen from PIL import Image import timm img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model( 'convnext_base.fb_in22k_ft_in1k_384', pretrained=True, num_classes=0, # remove classifier nn.Linear ) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor # or equivalently (without needing to set num_classes=0) output = model.forward_features(transforms(img).unsqueeze(0)) # output is unpooled, a (1, 1024, 12, 12) shaped tensor output = model.forward_head(output, pre_logits=True) # output is a (1, num_features) shaped tensor ``` ## Model Comparison Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results). All timing numbers from eager model PyTorch 1.13 on RTX 3090 w/ AMP. 
| model |top1 |top5 |img_size|param_count|gmacs |macts |samples_per_sec|batch_size| |------------------------------------------------------------------------------------------------------------------------------|------|------|--------|-----------|------|------|---------------|----------| | [convnextv2_huge.fcmae_ft_in22k_in1k_512](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in22k_in1k_512) |88.848|98.742|512 |660.29 |600.81|413.07|28.58 |48 | | [convnextv2_huge.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in22k_in1k_384) |88.668|98.738|384 |660.29 |337.96|232.35|50.56 |64 | | [convnext_xxlarge.clip_laion2b_soup_ft_in1k](https://huggingface.co/timm/convnext_xxlarge.clip_laion2b_soup_ft_in1k) |88.612|98.704|256 |846.47 |198.09|124.45|122.45 |256 | | [convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384) |88.312|98.578|384 |200.13 |101.11|126.74|196.84 |256 | | [convnextv2_large.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in22k_in1k_384) |88.196|98.532|384 |197.96 |101.1 |126.74|128.94 |128 | | [convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320) |87.968|98.47 |320 |200.13 |70.21 |88.02 |283.42 |256 | | [convnext_xlarge.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_xlarge.fb_in22k_ft_in1k_384) |87.75 |98.556|384 |350.2 |179.2 |168.99|124.85 |192 | | [convnextv2_base.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in22k_in1k_384) |87.646|98.422|384 |88.72 |45.21 |84.49 |209.51 |256 | | [convnext_large.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_large.fb_in22k_ft_in1k_384) |87.476|98.382|384 |197.77 |101.1 |126.74|194.66 |256 | | [convnext_large_mlp.clip_laion2b_augreg_ft_in1k](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_augreg_ft_in1k) 
|87.344|98.218|256 |200.13 |44.94 |56.33 |438.08 |256 | | [convnextv2_large.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in22k_in1k) |87.26 |98.248|224 |197.96 |34.4 |43.13 |376.84 |256 | | [convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384) |87.138|98.212|384 |88.59 |45.21 |84.49 |365.47 |256 | | [convnext_xlarge.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_xlarge.fb_in22k_ft_in1k) |87.002|98.208|224 |350.2 |60.98 |57.5 |368.01 |256 | | [convnext_base.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_base.fb_in22k_ft_in1k_384) |86.796|98.264|384 |88.59 |45.21 |84.49 |366.54 |256 | | [convnextv2_base.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in22k_in1k) |86.74 |98.022|224 |88.72 |15.38 |28.75 |624.23 |256 | | [convnext_large.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_large.fb_in22k_ft_in1k) |86.636|98.028|224 |197.77 |34.4 |43.13 |581.43 |256 | | [convnext_base.clip_laiona_augreg_ft_in1k_384](https://huggingface.co/timm/convnext_base.clip_laiona_augreg_ft_in1k_384) |86.504|97.97 |384 |88.59 |45.21 |84.49 |368.14 |256 | | [convnext_base.clip_laion2b_augreg_ft_in12k_in1k](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in12k_in1k) |86.344|97.97 |256 |88.59 |20.09 |37.55 |816.14 |256 | | [convnextv2_huge.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in1k) |86.256|97.75 |224 |660.29 |115.0 |79.07 |154.72 |256 | | [convnext_small.in12k_ft_in1k_384](https://huggingface.co/timm/convnext_small.in12k_ft_in1k_384) |86.182|97.92 |384 |50.22 |25.58 |63.37 |516.19 |256 | | [convnext_base.clip_laion2b_augreg_ft_in1k](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in1k) |86.154|97.68 |256 |88.59 |20.09 |37.55 |819.86 |256 | | [convnext_base.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_base.fb_in22k_ft_in1k) |85.822|97.866|224 |88.59 |15.38 
|28.75 |1037.66 |256 | | [convnext_small.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_small.fb_in22k_ft_in1k_384) |85.778|97.886|384 |50.22 |25.58 |63.37 |518.95 |256 | | [convnextv2_large.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in1k) |85.742|97.584|224 |197.96 |34.4 |43.13 |375.23 |256 | | [convnext_small.in12k_ft_in1k](https://huggingface.co/timm/convnext_small.in12k_ft_in1k) |85.174|97.506|224 |50.22 |8.71 |21.56 |1474.31 |256 | | [convnext_tiny.in12k_ft_in1k_384](https://huggingface.co/timm/convnext_tiny.in12k_ft_in1k_384) |85.118|97.608|384 |28.59 |13.14 |39.48 |856.76 |256 | | [convnextv2_tiny.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in22k_in1k_384) |85.112|97.63 |384 |28.64 |13.14 |39.48 |491.32 |256 | | [convnextv2_base.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in1k) |84.874|97.09 |224 |88.72 |15.38 |28.75 |625.33 |256 | | [convnext_small.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_small.fb_in22k_ft_in1k) |84.562|97.394|224 |50.22 |8.71 |21.56 |1478.29 |256 | | [convnext_large.fb_in1k](https://huggingface.co/timm/convnext_large.fb_in1k) |84.282|96.892|224 |197.77 |34.4 |43.13 |584.28 |256 | | [convnext_tiny.in12k_ft_in1k](https://huggingface.co/timm/convnext_tiny.in12k_ft_in1k) |84.186|97.124|224 |28.59 |4.47 |13.44 |2433.7 |256 | | [convnext_tiny.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_tiny.fb_in22k_ft_in1k_384) |84.084|97.14 |384 |28.59 |13.14 |39.48 |862.95 |256 | | [convnextv2_tiny.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in22k_in1k) |83.894|96.964|224 |28.64 |4.47 |13.44 |1452.72 |256 | | [convnext_base.fb_in1k](https://huggingface.co/timm/convnext_base.fb_in1k) |83.82 |96.746|224 |88.59 |15.38 |28.75 |1054.0 |256 | | [convnextv2_nano.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in22k_in1k_384) |83.37 |96.742|384 |15.62 |7.22 |24.61 |801.72 |256 | | 
[convnext_small.fb_in1k](https://huggingface.co/timm/convnext_small.fb_in1k) |83.142|96.434|224 |50.22 |8.71 |21.56 |1464.0 |256 | | [convnextv2_tiny.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in1k) |82.92 |96.284|224 |28.64 |4.47 |13.44 |1425.62 |256 | | [convnext_tiny.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_tiny.fb_in22k_ft_in1k) |82.898|96.616|224 |28.59 |4.47 |13.44 |2480.88 |256 | | [convnext_nano.in12k_ft_in1k](https://huggingface.co/timm/convnext_nano.in12k_ft_in1k) |82.282|96.344|224 |15.59 |2.46 |8.37 |3926.52 |256 | | [convnext_tiny_hnf.a2h_in1k](https://huggingface.co/timm/convnext_tiny_hnf.a2h_in1k) |82.216|95.852|224 |28.59 |4.47 |13.44 |2529.75 |256 | | [convnext_tiny.fb_in1k](https://huggingface.co/timm/convnext_tiny.fb_in1k) |82.066|95.854|224 |28.59 |4.47 |13.44 |2346.26 |256 | | [convnextv2_nano.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in22k_in1k) |82.03 |96.166|224 |15.62 |2.46 |8.37 |2300.18 |256 | | [convnextv2_nano.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in1k) |81.83 |95.738|224 |15.62 |2.46 |8.37 |2321.48 |256 | | [convnext_nano_ols.d1h_in1k](https://huggingface.co/timm/convnext_nano_ols.d1h_in1k) |80.866|95.246|224 |15.65 |2.65 |9.38 |3523.85 |256 | | [convnext_nano.d1h_in1k](https://huggingface.co/timm/convnext_nano.d1h_in1k) |80.768|95.334|224 |15.59 |2.46 |8.37 |3915.58 |256 | | [convnextv2_pico.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_pico.fcmae_ft_in1k) |80.304|95.072|224 |9.07 |1.37 |6.1 |3274.57 |256 | | [convnext_pico.d1_in1k](https://huggingface.co/timm/convnext_pico.d1_in1k) |79.526|94.558|224 |9.05 |1.37 |6.1 |5686.88 |256 | | [convnext_pico_ols.d1_in1k](https://huggingface.co/timm/convnext_pico_ols.d1_in1k) |79.522|94.692|224 |9.06 |1.43 |6.5 |5422.46 |256 | | [convnextv2_femto.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_femto.fcmae_ft_in1k) |78.488|93.98 |224 |5.23 |0.79 |4.57 |4264.2 |256 | | 
[convnext_femto_ols.d1_in1k](https://huggingface.co/timm/convnext_femto_ols.d1_in1k) |77.86 |93.83 |224 |5.23 |0.82 |4.87 |6910.6 |256 | | [convnext_femto.d1_in1k](https://huggingface.co/timm/convnext_femto.d1_in1k) |77.454|93.68 |224 |5.22 |0.79 |4.57 |7189.92 |256 | | [convnextv2_atto.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_atto.fcmae_ft_in1k) |76.664|93.044|224 |3.71 |0.55 |3.81 |4728.91 |256 | | [convnext_atto_ols.a2_in1k](https://huggingface.co/timm/convnext_atto_ols.a2_in1k) |75.88 |92.846|224 |3.7 |0.58 |4.11 |7963.16 |256 | | [convnext_atto.d2_in1k](https://huggingface.co/timm/convnext_atto.d2_in1k) |75.664|92.9 |224 |3.7 |0.55 |3.81 |8439.22 |256 | ## Citation ```bibtex @article{liu2022convnet, author = {Zhuang Liu and Hanzi Mao and Chao-Yuan Wu and Christoph Feichtenhofer and Trevor Darrell and Saining Xie}, title = {A ConvNet for the 2020s}, journal = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)}, year = {2022}, } ``` ```bibtex @misc{rw2019timm, author = {Ross Wightman}, title = {PyTorch Image Models}, year = {2019}, publisher = {GitHub}, journal = {GitHub repository}, doi = {10.5281/zenodo.4414861}, howpublished = {\url{https://github.com/huggingface/pytorch-image-models}} } ```
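The `torch.topk(output.softmax(dim=1) * 100, k=5)` step in the classification example above can be read as a softmax over the logits followed by top-k selection. This dependency-free Python sketch (illustrative only, not part of the original card) mirrors that logic on a flat list of logits:

```python
import math

def softmax(logits):
    # Numerically stable softmax over a flat list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def topk(values, k):
    # Return (probability, index) pairs for the k largest entries,
    # mirroring what torch.topk does on the softmax output.
    indexed = sorted(enumerate(values), key=lambda p: p[1], reverse=True)
    return [(v, i) for i, v in indexed[:k]]

logits = [2.0, 0.5, 1.0, 3.0, -1.0]
probs = softmax(logits)
top2 = topk(probs, 2)
# The largest logit (index 3) yields the highest probability.
```

In the real example the same operation runs on a `(1, 1000)` tensor of class logits, returning the five most probable ImageNet-1k class indices.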
15,732
[ [ -0.06695556640625, -0.0325927734375, -0.0041961669921875, 0.0379638671875, -0.03179931640625, -0.01546478271484375, -0.012603759765625, -0.0355224609375, 0.064697265625, 0.01739501953125, -0.044097900390625, -0.042083740234375, -0.050140380859375, -0.003108978271484375, 0.006778717041015625, 0.06829833984375, -0.0030994415283203125, -0.01009368896484375, 0.0191650390625, -0.0283355712890625, -0.01580810546875, -0.0291595458984375, -0.06304931640625, -0.01502227783203125, 0.0197601318359375, 0.02227783203125, 0.057403564453125, 0.04473876953125, 0.0299224853515625, 0.041290283203125, -0.0181884765625, 0.0125732421875, -0.0147857666015625, -0.026031494140625, 0.041046142578125, -0.0311431884765625, -0.06829833984375, 0.01727294921875, 0.06170654296875, 0.0399169921875, 0.003875732421875, 0.01519775390625, 0.0257568359375, 0.0355224609375, 0.002887725830078125, -0.00463104248046875, -0.0070343017578125, 0.01314544677734375, -0.0201568603515625, 0.0014190673828125, 0.003662109375, -0.052703857421875, 0.0262908935546875, -0.043060302734375, 0.002658843994140625, 0.00012242794036865234, 0.10174560546875, -0.00876617431640625, -0.0167083740234375, 0.0010919570922851562, 0.0109100341796875, 0.05328369140625, -0.060089111328125, 0.0233154296875, 0.0312347412109375, -0.01105499267578125, -0.01502227783203125, -0.047454833984375, -0.045989990234375, -0.004268646240234375, -0.027984619140625, 0.0171661376953125, -0.0272369384765625, -0.005062103271484375, 0.04119873046875, 0.03399658203125, -0.037872314453125, -0.00677490234375, -0.0248565673828125, -0.0088348388671875, 0.06036376953125, -0.0070343017578125, 0.046844482421875, -0.025726318359375, -0.047760009765625, -0.0206146240234375, -0.01496124267578125, 0.034149169921875, 0.0159759521484375, -0.0018014907836914062, -0.07373046875, 0.038238525390625, 0.006763458251953125, 0.0206451416015625, 0.0268402099609375, -0.01458740234375, 0.0595703125, -0.01910400390625, -0.04107666015625, -0.0245819091796875, 0.08953857421875, 
0.05328369140625, 0.0308074951171875, 0.0093231201171875, 0.004199981689453125, -0.00646209716796875, -0.03582763671875, -0.074951171875, -0.0117950439453125, 0.02813720703125, -0.042083740234375, -0.0097198486328125, 0.0265045166015625, -0.059906005859375, 0.010101318359375, -0.00897979736328125, 0.014434814453125, -0.0626220703125, -0.0296478271484375, -0.007091522216796875, -0.0270538330078125, 0.0305023193359375, 0.0215911865234375, -0.0258026123046875, 0.0227203369140625, 0.0205841064453125, 0.07421875, 0.021820068359375, -0.0116119384765625, -0.031646728515625, -0.01160430908203125, -0.0263214111328125, 0.02545166015625, 0.01035308837890625, -0.01189422607421875, -0.019866943359375, 0.033233642578125, -0.01299285888671875, -0.030181884765625, 0.031280517578125, 0.02130126953125, 0.00771331787109375, -0.02838134765625, -0.0271148681640625, -0.0211181640625, 0.0286102294921875, -0.03851318359375, 0.07891845703125, 0.03662109375, -0.07855224609375, 0.0228729248046875, -0.03472900390625, -0.0044708251953125, -0.0215911865234375, 0.00608062744140625, -0.057464599609375, -0.008392333984375, 0.0178375244140625, 0.055389404296875, -0.00904083251953125, -0.01026153564453125, -0.028533935546875, -0.00482177734375, 0.0269622802734375, 0.008575439453125, 0.072509765625, 0.01303863525390625, -0.0347900390625, 0.0009403228759765625, -0.04840087890625, 0.0237274169921875, 0.0286407470703125, -0.0034656524658203125, -0.005588531494140625, -0.0606689453125, 0.004215240478515625, 0.039306640625, 0.01500701904296875, -0.040130615234375, 0.0207366943359375, -0.0180511474609375, 0.0285186767578125, 0.04754638671875, -0.004718780517578125, 0.022186279296875, -0.044219970703125, 0.042083740234375, 0.006214141845703125, 0.0196533203125, -0.003326416015625, -0.0296478271484375, -0.055389404296875, -0.05133056640625, 0.01568603515625, 0.035858154296875, -0.033477783203125, 0.055633544921875, 0.012725830078125, -0.046905517578125, -0.05511474609375, 0.0153961181640625, 0.0394287109375, 
0.019012451171875, 0.016204833984375, -0.0272979736328125, -0.049468994140625, -0.069580078125, -0.00860595703125, 0.005645751953125, -0.0017223358154296875, 0.04656982421875, 0.0276336669921875, -0.00676727294921875, 0.041168212890625, -0.031890869140625, -0.0203704833984375, -0.009063720703125, -0.006214141845703125, 0.032867431640625, 0.0587158203125, 0.08624267578125, -0.06488037109375, -0.06939697265625, 0.0023937225341796875, -0.08380126953125, 0.0010089874267578125, -0.0035266876220703125, -0.032806396484375, 0.02191162109375, 0.0196990966796875, -0.07452392578125, 0.05224609375, 0.0282135009765625, -0.04693603515625, 0.035430908203125, -0.020263671875, 0.0245361328125, -0.07330322265625, 0.01739501953125, 0.021240234375, -0.0232696533203125, -0.039825439453125, 0.00537872314453125, -0.00707244873046875, 0.01316070556640625, -0.047882080078125, 0.06781005859375, -0.05108642578125, 0.007747650146484375, 0.0028533935546875, 0.00916290283203125, 0.0009737014770507812, 0.037261962890625, -0.002437591552734375, 0.032958984375, 0.058807373046875, -0.0218505859375, 0.03521728515625, 0.039703369140625, -0.0017042160034179688, 0.058074951171875, -0.046966552734375, 0.0088653564453125, 0.0086822509765625, 0.036712646484375, -0.06854248046875, -0.0311431884765625, 0.04345703125, -0.057098388671875, 0.035858154296875, -0.0173187255859375, -0.02667236328125, -0.06005859375, -0.06451416015625, 0.0187225341796875, 0.043212890625, -0.0462646484375, 0.013336181640625, 0.020172119140625, 0.0038738250732421875, -0.046356201171875, -0.048248291015625, -0.00540924072265625, -0.031829833984375, -0.06585693359375, 0.031707763671875, 0.00673675537109375, -0.0085906982421875, 0.0012540817260742188, -0.0018091201782226562, -0.002628326416015625, -0.01300811767578125, 0.0394287109375, 0.032379150390625, -0.0181732177734375, -0.026641845703125, -0.02386474609375, -0.008575439453125, 0.00095367431640625, -0.00862884521484375, 0.041748046875, -0.0256195068359375, 0.01163482666015625, 
-0.0791015625, 0.01605224609375, 0.04803466796875, -0.0027904510498046875, 0.06768798828125, 0.0762939453125, -0.03350830078125, 0.01068115234375, -0.0285797119140625, -0.010162353515625, -0.038116455078125, -0.00846099853515625, -0.039825439453125, -0.0479736328125, 0.061920166015625, 0.01430511474609375, -0.006191253662109375, 0.052642822265625, 0.024017333984375, -0.0176239013671875, 0.06390380859375, 0.03985595703125, -0.007434844970703125, 0.0450439453125, -0.0667724609375, 0.0006856918334960938, -0.06158447265625, -0.047149658203125, -0.0095367431640625, -0.04229736328125, -0.05572509765625, -0.029022216796875, 0.0198211669921875, 0.0355224609375, -0.0099334716796875, 0.050506591796875, -0.04339599609375, -0.0069732666015625, 0.03704833984375, 0.02532958984375, -0.019195556640625, -0.017730712890625, -0.01122283935546875, -0.01497650146484375, -0.04095458984375, -0.01113128662109375, 0.051788330078125, 0.0484619140625, 0.029388427734375, -0.0014638900756835938, 0.03900146484375, -0.005847930908203125, 0.0231781005859375, -0.036468505859375, 0.05511474609375, -0.004337310791015625, -0.038787841796875, -0.01522064208984375, -0.0340576171875, -0.07354736328125, 0.01041412353515625, -0.02752685546875, -0.06365966796875, -0.0104522705078125, 0.01546478271484375, -0.0224456787109375, 0.04058837890625, -0.050506591796875, 0.055206298828125, -0.00601959228515625, -0.036895751953125, 0.00807952880859375, -0.0657958984375, 0.0178985595703125, 0.030609130859375, -0.00534820556640625, -0.012603759765625, 0.011688232421875, 0.06036376953125, -0.0635986328125, 0.03729248046875, -0.029693603515625, 0.0032711029052734375, 0.03948974609375, -0.005260467529296875, 0.03173828125, 0.009765625, -0.0009851455688476562, 0.002109527587890625, 0.01102447509765625, -0.04840087890625, -0.028717041015625, 0.0484619140625, -0.049835205078125, -0.0287322998046875, -0.040557861328125, -0.0238800048828125, 0.0137481689453125, 0.0011768341064453125, 0.048095703125, 0.043121337890625, 
-0.01081085205078125, 0.015045166015625, 0.03985595703125, -0.028717041015625, 0.038116455078125, -0.0140228271484375, -0.0023345947265625, -0.039306640625, 0.058502197265625, 0.003932952880859375, 0.0081634521484375, 0.0037384033203125, 0.00522613525390625, -0.032073974609375, -0.01155853271484375, -0.0109710693359375, 0.0513916015625, -0.017181396484375, -0.02740478515625, -0.04766845703125, -0.032745361328125, -0.044281005859375, -0.02569580078125, -0.0290985107421875, -0.0202178955078125, -0.0258941650390625, 0.0054473876953125, 0.055450439453125, 0.041259765625, -0.027923583984375, 0.033294677734375, -0.048095703125, 0.0239410400390625, 0.004154205322265625, 0.032318115234375, -0.021484375, -0.0438232421875, 0.002002716064453125, 0.0017948150634765625, -0.0181121826171875, -0.058563232421875, 0.04840087890625, 0.0113983154296875, 0.0289459228515625, 0.04010009765625, -0.02313232421875, 0.05938720703125, -0.0069427490234375, 0.0372314453125, 0.04193115234375, -0.06549072265625, 0.032257080078125, -0.03033447265625, 0.00701141357421875, 0.01184844970703125, 0.028167724609375, -0.036895751953125, -0.0259552001953125, -0.07354736328125, -0.044647216796875, 0.05181884765625, 0.01108551025390625, -0.00083160400390625, 0.005992889404296875, 0.047943115234375, -0.0061492919921875, 0.01084136962890625, -0.041473388671875, -0.054656982421875, -0.016754150390625, -0.01079559326171875, -0.00732421875, -0.0018835067749023438, -0.001911163330078125, -0.05133056640625, 0.036712646484375, -0.0099945068359375, 0.04296875, 0.0192108154296875, 0.0008106231689453125, -0.002552032470703125, -0.0238189697265625, 0.040863037109375, 0.026580810546875, -0.022674560546875, -0.009979248046875, 0.027923583984375, -0.037872314453125, 0.001514434814453125, 0.0219573974609375, 0.0058746337890625, 0.015472412109375, 0.0245361328125, 0.045654296875, 0.018646240234375, -0.0126800537109375, 0.043365478515625, -0.0160980224609375, -0.030242919921875, -0.0228271484375, -0.0021915435791015625, 
0.01462554931640625, 0.035003662109375, 0.0158538818359375, 0.0045013427734375, -0.0228271484375, -0.043853759765625, 0.04144287109375, 0.0592041015625, -0.0333251953125, -0.04193115234375, 0.048095703125, -0.00673675537109375, -0.00804901123046875, 0.0406494140625, -0.006103515625, -0.053955078125, 0.0750732421875, 0.0209808349609375, 0.04278564453125, -0.042816162109375, 0.018310546875, 0.06610107421875, -0.0015277862548828125, 0.00969696044921875, 0.0274200439453125, 0.0283355712890625, -0.033172607421875, 0.00440216064453125, -0.04779052734375, 0.01309967041015625, 0.042510986328125, -0.03546142578125, 0.0267333984375, -0.0565185546875, -0.027130126953125, 0.01482391357421875, 0.03289794921875, -0.06256103515625, 0.0244140625, 0.004467010498046875, 0.0843505859375, -0.05963134765625, 0.06640625, 0.05523681640625, -0.0279541015625, -0.07135009765625, -0.0110321044921875, 0.0157623291015625, -0.057586669921875, 0.0295257568359375, 0.0188140869140625, 0.0164337158203125, -0.016082763671875, -0.04620361328125, -0.03656005859375, 0.0909423828125, 0.036651611328125, -0.01149749755859375, 0.008392333984375, -0.0261077880859375, 0.029693603515625, -0.0207061767578125, 0.03485107421875, 0.040191650390625, 0.040191650390625, 0.015869140625, -0.06939697265625, 0.028289794921875, -0.03228759765625, -0.014068603515625, 0.0217742919921875, -0.10302734375, 0.07904052734375, -0.0261993408203125, -0.0026416778564453125, 0.0140838623046875, 0.0618896484375, 0.030303955078125, 0.004596710205078125, 0.0293731689453125, 0.053863525390625, 0.036041259765625, -0.01476287841796875, 0.07904052734375, 0.0011625289916992188, 0.030517578125, 0.0191497802734375, 0.039276123046875, 0.0309295654296875, 0.0277099609375, -0.032073974609375, 0.01029205322265625, 0.06591796875, -0.014251708984375, 0.00885772705078125, 0.016143798828125, -0.0129241943359375, -0.0096435546875, -0.0160980224609375, -0.046630859375, 0.03216552734375, 0.01187896728515625, -0.0204925537109375, 0.0013380050659179688, 
-0.007175445556640625, 0.037750244140625, -0.0003745555877685547, -0.0111083984375, 0.033233642578125, 0.018829345703125, -0.043548583984375, 0.039886474609375, -0.0060577392578125, 0.07440185546875, -0.02618408203125, 0.0011472702026367188, -0.0237884521484375, 0.023834228515625, -0.0196533203125, -0.08721923828125, 0.02386474609375, -0.0111846923828125, 0.0146026611328125, -0.00554656982421875, 0.0467529296875, -0.034271240234375, -0.0184173583984375, 0.0399169921875, 0.025665283203125, 0.0293731689453125, 0.004528045654296875, -0.0872802734375, 0.0179290771484375, 0.0110931396484375, -0.041748046875, 0.031005859375, 0.03741455078125, 0.0182647705078125, 0.051513671875, 0.031341552734375, 0.01453399658203125, 0.00646209716796875, -0.0274810791015625, 0.059906005859375, -0.050201416015625, -0.0352783203125, -0.0654296875, 0.034149169921875, -0.0242462158203125, -0.04852294921875, 0.05963134765625, 0.03338623046875, 0.03924560546875, 0.0087738037109375, 0.0399169921875, -0.036712646484375, 0.028167724609375, -0.032623291015625, 0.05364990234375, -0.060272216796875, -0.02392578125, -0.033721923828125, -0.061187744140625, -0.0221099853515625, 0.05499267578125, 0.004482269287109375, 0.017822265625, 0.0261077880859375, 0.044921875, -0.0030612945556640625, -0.017486572265625, -0.004913330078125, 0.019256591796875, 0.003826141357421875, 0.06097412109375, 0.040496826171875, -0.05633544921875, 0.0151519775390625, -0.04754638671875, -0.025177001953125, -0.0258331298828125, -0.055633544921875, -0.0816650390625, -0.060302734375, -0.03729248046875, -0.050018310546875, -0.0234832763671875, 0.0841064453125, 0.07171630859375, -0.040740966796875, -0.01267242431640625, 0.0228729248046875, 0.00782012939453125, -0.01678466796875, -0.0197296142578125, 0.039825439453125, 0.024871826171875, -0.07672119140625, -0.0211181640625, 0.006229400634765625, 0.042144775390625, 0.0236663818359375, -0.0302276611328125, -0.0194091796875, -0.004833221435546875, 0.030670166015625, 0.061920166015625, 
-0.051971435546875, -0.034423828125, 0.0038604736328125, -0.01995849609375, 0.0189056396484375, 0.0208587646484375, -0.029296875, -0.007274627685546875, 0.040374755859375, 0.009918212890625, 0.05657958984375, 0.01116943359375, 0.01873779296875, -0.046905517578125, 0.049713134765625, -0.00408935546875, 0.0267486572265625, 0.02789306640625, -0.0302581787109375, 0.055267333984375, 0.0374755859375, -0.034454345703125, -0.07275390625, -0.022064208984375, -0.1075439453125, 0.0005693435668945312, 0.05816650390625, -0.01548004150390625, -0.040496826171875, 0.0396728515625, -0.025726318359375, 0.040374755859375, -0.0194549560546875, 0.0202789306640625, 0.0279541015625, -0.0267333984375, -0.03436279296875, -0.041015625, 0.053955078125, 0.024322509765625, -0.0511474609375, -0.0264129638671875, -0.0014705657958984375, 0.036651611328125, 0.0181884765625, 0.059661865234375, -0.01491546630859375, 0.01239776611328125, 0.001506805419921875, 0.0130462646484375, 0.00400543212890625, 0.0017766952514648438, -0.0121307373046875, -0.016510009765625, -0.025177001953125, -0.044921875 ] ]
microsoft/wavlm-base-plus-sv
2022-03-25T10:39:41.000Z
[ "transformers", "pytorch", "wavlm", "audio-xvector", "speech", "en", "arxiv:1912.07875", "arxiv:2106.06909", "arxiv:2101.00390", "arxiv:2110.13900", "endpoints_compatible", "has_space", "region:us" ]
null
microsoft
null
null
microsoft/wavlm-base-plus-sv
14
9,585
transformers
2022-03-02T23:29:05
--- language: - en tags: - speech --- # WavLM-Base-Plus for Speaker Verification [Microsoft's WavLM](https://github.com/microsoft/unilm/tree/master/wavlm) The model was pretrained on 16kHz sampled speech audio with utterance and speaker contrastive loss. When using the model, make sure that your speech input is also sampled at 16kHz. **Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more detailed explanation of how to fine-tune the model. The model was pre-trained on: - 60,000 hours of [Libri-Light](https://arxiv.org/abs/1912.07875) - 10,000 hours of [GigaSpeech](https://arxiv.org/abs/2106.06909) - 24,000 hours of [VoxPopuli](https://arxiv.org/abs/2101.00390) [Paper: WavLM: Large-Scale Self-Supervised Pre-Training for Full Stack Speech Processing](https://arxiv.org/abs/2110.13900) Authors: Sanyuan Chen, Chengyi Wang, Zhengyang Chen, Yu Wu, Shujie Liu, Zhuo Chen, Jinyu Li, Naoyuki Kanda, Takuya Yoshioka, Xiong Xiao, Jian Wu, Long Zhou, Shuo Ren, Yanmin Qian, Yao Qian, Jian Wu, Michael Zeng, Furu Wei **Abstract** *Self-supervised learning (SSL) achieves great success in speech recognition, while limited exploration has been attempted for other speech processing tasks. As speech signal contains multi-faceted information including speaker identity, paralinguistics, spoken content, etc., learning universal representations for all speech tasks is challenging. In this paper, we propose a new pre-trained model, WavLM, to solve full-stack downstream speech tasks. WavLM is built based on the HuBERT framework, with an emphasis on both spoken content modeling and speaker identity preservation. We first equip the Transformer structure with gated relative position bias to improve its capability on recognition tasks. 
For better speaker discrimination, we propose an utterance mixing training strategy, where additional overlapped utterances are created unsupervisely and incorporated during model training. Lastly, we scale up the training dataset from 60k hours to 94k hours. WavLM Large achieves state-of-the-art performance on the SUPERB benchmark, and brings significant improvements for various speech processing tasks on their representative benchmarks.* The original model can be found under https://github.com/microsoft/unilm/tree/master/wavlm. # Fine-tuning details The model is fine-tuned on the [VoxCeleb1 dataset](https://www.robots.ox.ac.uk/~vgg/data/voxceleb/vox1.html) using an X-Vector head with an Additive Margin Softmax loss [X-Vectors: Robust DNN Embeddings for Speaker Recognition](https://www.danielpovey.com/files/2018_icassp_xvectors.pdf) # Usage ## Speaker Verification ```python from transformers import Wav2Vec2FeatureExtractor, WavLMForXVector from datasets import load_dataset import torch dataset = load_dataset("hf-internal-testing/librispeech_asr_demo", "clean", split="validation") feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained('microsoft/wavlm-base-plus-sv') model = WavLMForXVector.from_pretrained('microsoft/wavlm-base-plus-sv') # audio files are decoded on the fly audio = [x["array"] for x in dataset[:2]["audio"]] inputs = feature_extractor(audio, padding=True, return_tensors="pt") embeddings = model(**inputs).embeddings embeddings = torch.nn.functional.normalize(embeddings, dim=-1).cpu() # the resulting embeddings can be used for cosine similarity-based retrieval cosine_sim = torch.nn.CosineSimilarity(dim=-1) similarity = cosine_sim(embeddings[0], embeddings[1]) threshold = 0.86 # the optimal threshold is dataset-dependent if similarity < threshold: print("Speakers are not the same!") ``` # License The official license can be found [here](https://github.com/microsoft/UniSpeech/blob/main/LICENSE) 
![design](https://raw.githubusercontent.com/patrickvonplaten/scientific_images/master/wavlm.png)
4,052
[ [ -0.033477783203125, -0.047698974609375, 0.01479339599609375, 0.0043182373046875, -0.01195526123046875, -0.00807952880859375, -0.0176239013671875, -0.034149169921875, -0.011749267578125, 0.03033447265625, -0.051544189453125, -0.0447998046875, -0.031646728515625, -0.0161590576171875, -0.0258941650390625, 0.049041748046875, 0.0369873046875, 0.00818634033203125, -0.0093841552734375, -0.0099945068359375, -0.040130615234375, -0.046478271484375, -0.0357666015625, -0.039794921875, 0.0121917724609375, 0.0148773193359375, 0.01026153564453125, 0.0296173095703125, 0.007793426513671875, 0.025848388671875, -0.018798828125, 0.007808685302734375, -0.03179931640625, -0.00588226318359375, 0.001476287841796875, -0.006183624267578125, -0.042327880859375, 0.00875091552734375, 0.037506103515625, 0.033782958984375, -0.0219573974609375, 0.03802490234375, 0.00858306884765625, 0.0205841064453125, -0.0293426513671875, 0.007293701171875, -0.06854248046875, -0.0029048919677734375, -0.0205841064453125, -0.018585205078125, -0.0291748046875, 0.0002796649932861328, 0.018280029296875, -0.03436279296875, 0.01049041748046875, 0.009307861328125, 0.0633544921875, 0.020721435546875, -0.03045654296875, -0.0150299072265625, -0.05133056640625, 0.07830810546875, -0.062042236328125, 0.065673828125, 0.03082275390625, 0.012725830078125, -0.008026123046875, -0.0592041015625, -0.0435791015625, -0.0256195068359375, 0.019439697265625, 0.01479339599609375, -0.0277862548828125, 0.004894256591796875, 0.029388427734375, 0.0198211669921875, -0.06219482421875, 0.00908660888671875, -0.038909912109375, -0.044921875, 0.054595947265625, -0.0169830322265625, 0.010833740234375, -0.001308441162109375, -0.03668212890625, -0.0206451416015625, -0.038726806640625, 0.0286407470703125, 0.0201568603515625, 0.03216552734375, -0.018585205078125, 0.0218658447265625, 0.00243377685546875, 0.046539306640625, -0.00628662109375, -0.025665283203125, 0.058258056640625, -0.00981903076171875, -0.01690673828125, 0.0350341796875, 
0.0592041015625, -0.0034122467041015625, 0.0163116455078125, 0.00981903076171875, -0.01360321044921875, -0.0005078315734863281, 0.00026679039001464844, -0.04425048828125, -0.0298004150390625, 0.038238525390625, -0.0272216796875, -0.00353240966796875, -0.0012102127075195312, -0.01326751708984375, -0.0032672882080078125, -0.036041259765625, 0.062347412109375, -0.038543701171875, -0.032989501953125, 0.007526397705078125, -0.012451171875, 0.0164337158203125, 0.00542449951171875, -0.0565185546875, 0.0178070068359375, 0.049407958984375, 0.04876708984375, -0.011322021484375, -0.0247955322265625, -0.060028076171875, -0.0003094673156738281, -0.007549285888671875, 0.0207672119140625, -0.008819580078125, -0.033905029296875, -0.0164947509765625, -0.01053619384765625, 0.00986480712890625, -0.046966552734375, 0.04852294921875, -0.02685546875, 0.01361846923828125, -0.00530242919921875, -0.060577392578125, -0.0112457275390625, -0.00426483154296875, -0.029296875, 0.09332275390625, 0.0006918907165527344, -0.056243896484375, 0.0157318115234375, -0.035919189453125, -0.033203125, 0.0028476715087890625, -0.0117034912109375, -0.033721923828125, -0.00589752197265625, 0.017608642578125, 0.028411865234375, -0.007190704345703125, 0.01201629638671875, -0.01450347900390625, -0.0340576171875, 0.0131988525390625, -0.036865234375, 0.08538818359375, 0.021759033203125, -0.037841796875, 0.0245513916015625, -0.090087890625, 0.01006317138671875, 0.001312255859375, -0.029022216796875, -0.00665283203125, -0.001171112060546875, 0.026885986328125, 0.024078369140625, 0.0168914794921875, -0.0640869140625, -0.0204010009765625, -0.03466796875, 0.033599853515625, 0.04547119140625, -0.01261138916015625, 0.01107025146484375, -0.01363372802734375, 0.00868988037109375, -0.007720947265625, 0.0157623291015625, -0.00278472900390625, -0.03656005859375, -0.04241943359375, -0.0257568359375, 0.0203399658203125, 0.052520751953125, -0.0236968994140625, 0.068115234375, -0.015777587890625, -0.040435791015625, 
-0.06781005859375, 0.0103912353515625, 0.031463623046875, 0.051544189453125, 0.0552978515625, -0.00438690185546875, -0.06793212890625, -0.0518798828125, 0.00527191162109375, -0.00685882568359375, -0.024993896484375, 0.0290985107421875, 0.01593017578125, -0.0177154541015625, 0.07806396484375, -0.0184326171875, -0.034912109375, -0.0006170272827148438, -0.0012426376342773438, 0.0283203125, 0.0570068359375, 0.020843505859375, -0.06451416015625, -0.019775390625, -0.01435089111328125, -0.03753662109375, -0.0011615753173828125, 0.01226806640625, 0.01251220703125, 0.0017108917236328125, 0.0355224609375, -0.03912353515625, 0.0250244140625, 0.041046142578125, -0.00984954833984375, 0.04156494140625, -0.0150146484375, -0.01001739501953125, -0.06573486328125, 0.00933837890625, -0.0076446533203125, -0.0310821533203125, -0.055511474609375, -0.0291748046875, 0.00763702392578125, -0.0204925537109375, -0.050689697265625, 0.032257080078125, -0.0313720703125, -0.01386260986328125, -0.01194000244140625, 0.02264404296875, -0.007396697998046875, 0.043670654296875, 0.002880096435546875, 0.052825927734375, 0.061767578125, -0.056640625, 0.0419921875, 0.017852783203125, -0.0401611328125, 0.023834228515625, -0.0625, 0.032073974609375, 0.0124053955078125, 0.020751953125, -0.088134765625, 0.0237274169921875, -0.0009198188781738281, -0.053558349609375, 0.033966064453125, -0.01348876953125, -0.021759033203125, -0.037750244140625, -0.0056304931640625, 0.0263214111328125, 0.06585693359375, -0.051971435546875, 0.038055419921875, 0.0306396484375, -0.0043182373046875, -0.045928955078125, -0.053436279296875, -0.0160369873046875, -0.001861572265625, -0.044464111328125, 0.040679931640625, -0.0157318115234375, -0.003932952880859375, -0.0247344970703125, -0.021209716796875, 0.018096923828125, -0.00333404541015625, 0.026031494140625, 0.0158538818359375, -0.0174713134765625, 0.0170135498046875, -0.02227783203125, -0.01837158203125, -0.0074920654296875, -0.039031982421875, 0.061126708984375, 
-0.0079803466796875, -0.039031982421875, -0.058990478515625, 0.022369384765625, 0.03363037109375, -0.03704833984375, 0.005970001220703125, 0.090576171875, -0.0170440673828125, -0.00400543212890625, -0.066650390625, -0.01226043701171875, -0.040679931640625, 0.0523681640625, -0.03155517578125, -0.06365966796875, 0.0247955322265625, 0.0179290771484375, 0.0018835067749023438, 0.037261962890625, 0.0367431640625, -0.01438140869140625, 0.07305908203125, 0.02545166015625, -0.018402099609375, 0.035919189453125, -0.0305023193359375, -0.0017557144165039062, -0.06842041015625, -0.033416748046875, -0.04046630859375, -0.009857177734375, -0.03643798828125, -0.032501220703125, 0.0115203857421875, -0.0001361370086669922, -0.0205078125, 0.0267486572265625, -0.059356689453125, 0.002643585205078125, 0.056976318359375, 0.00856781005859375, -0.00330352783203125, 0.008636474609375, -0.01399993896484375, -0.0037860870361328125, -0.03515625, -0.017486572265625, 0.0733642578125, 0.038970947265625, 0.0435791015625, -0.008941650390625, 0.062744140625, 0.024871826171875, -0.01425933837890625, -0.047637939453125, 0.034088134765625, -0.00536346435546875, -0.035980224609375, -0.043609619140625, -0.033966064453125, -0.069091796875, 0.0302886962890625, -0.0151824951171875, -0.0684814453125, 0.019134521484375, 0.0089111328125, -0.01021575927734375, 0.019256591796875, -0.041351318359375, 0.050079345703125, -0.0190582275390625, 0.00591278076171875, -0.0184783935546875, -0.04833984375, -0.00010991096496582031, 0.01132965087890625, 0.02606201171875, -0.0307159423828125, 0.02392578125, 0.0706787109375, -0.0203094482421875, 0.032501220703125, -0.03973388671875, -0.0159149169921875, 0.0206451416015625, -0.017669677734375, 0.033447265625, -0.01641845703125, -0.01568603515625, 0.0242462158203125, 0.01309967041015625, -0.0298614501953125, -0.028900146484375, 0.049468994140625, -0.07708740234375, -0.037689208984375, -0.0023651123046875, -0.03759765625, -0.0248260498046875, 0.00908660888671875, 
0.04632568359375, 0.06201171875, -0.00441741943359375, 0.03485107421875, 0.057525634765625, -0.0072479248046875, 0.03363037109375, 0.042938232421875, -0.0153961181640625, -0.03765869140625, 0.08392333984375, 0.0204315185546875, 0.0078277587890625, 0.03192138671875, 0.017669677734375, -0.043731689453125, -0.05615234375, -0.01171875, -0.004459381103515625, -0.04681396484375, 0.001392364501953125, -0.052520751953125, -0.0284271240234375, -0.05145263671875, 0.023345947265625, -0.0428466796875, -0.017578125, -0.040283203125, 0.00281524658203125, 0.021636962890625, 0.03759765625, -0.00203704833984375, 0.006290435791015625, -0.0482177734375, 0.02685546875, 0.03887939453125, 0.006519317626953125, 0.0077667236328125, -0.07476806640625, -0.01210784912109375, 0.0201416015625, -0.010406494140625, -0.05438232421875, 0.016204833984375, 0.02252197265625, 0.06781005859375, 0.02099609375, 0.00919342041015625, 0.052978515625, -0.039093017578125, 0.0694580078125, 0.027099609375, -0.08929443359375, 0.047882080078125, -0.006809234619140625, 0.0305633544921875, 0.0277252197265625, 0.0150146484375, -0.032867431640625, -0.01519775390625, -0.05218505859375, -0.06707763671875, 0.052154541015625, 0.0244598388671875, 0.01373291015625, 0.0193023681640625, 0.00434112548828125, 0.0005154609680175781, 0.002613067626953125, -0.0472412109375, -0.048675537109375, -0.036895751953125, -0.01248931884765625, -0.026153564453125, -0.023162841796875, -0.00013840198516845703, -0.052581787109375, 0.07177734375, 0.0172271728515625, 0.0192108154296875, 0.032562255859375, -0.02642822265625, 0.0206298828125, 0.02783203125, 0.0531005859375, 0.044097900390625, -0.03973388671875, 0.0150146484375, 0.036041259765625, -0.039093017578125, 0.015533447265625, 0.0197296142578125, 0.0188751220703125, 0.0203704833984375, 0.032012939453125, 0.0810546875, 0.0206756591796875, -0.032928466796875, 0.043731689453125, 0.0024547576904296875, -0.02783203125, -0.03533935546875, 0.007228851318359375, 0.01763916015625, 
0.0222320556640625, 0.0648193359375, -0.0063323974609375, 0.006744384765625, -0.031829833984375, 0.032928466796875, 0.02850341796875, -0.04345703125, -0.01074981689453125, 0.05010986328125, 0.01277923583984375, -0.04833984375, 0.048614501953125, -0.01360321044921875, -0.03662109375, 0.0301361083984375, 0.050079345703125, 0.060760498046875, -0.03955078125, -0.015960693359375, 0.019195556640625, 0.003582000732421875, 0.0173797607421875, 0.03363037109375, -0.036346435546875, -0.040435791015625, -0.01399993896484375, -0.052703857421875, -0.0161285400390625, 0.02532958984375, -0.05609130859375, 0.01702880859375, -0.0195159912109375, -0.0261688232421875, 0.0128173828125, 0.020233154296875, -0.06317138671875, 0.022705078125, 0.032318115234375, 0.06982421875, -0.04913330078125, 0.09149169921875, 0.0238494873046875, -0.0163116455078125, -0.0733642578125, -0.017181396484375, -0.0015726089477539062, -0.058013916015625, 0.04150390625, 0.01152801513671875, -0.0234832763671875, 0.0233154296875, -0.039459228515625, -0.0718994140625, 0.09100341796875, 0.02508544921875, -0.06787109375, 0.01142120361328125, -0.010498046875, 0.0251312255859375, -0.02764892578125, 0.0084075927734375, 0.0323486328125, 0.0225677490234375, 0.014373779296875, -0.087158203125, 0.001682281494140625, -0.020782470703125, -0.007366180419921875, -0.0220947265625, -0.0251922607421875, 0.077880859375, -0.00894927978515625, -0.007503509521484375, -0.00864410400390625, 0.059173583984375, 0.034576416015625, 0.0186920166015625, 0.0537109375, 0.0277862548828125, 0.07666015625, -0.002094268798828125, 0.05255126953125, -0.019287109375, 0.041168212890625, 0.098388671875, -0.0124359130859375, 0.08447265625, 0.0295867919921875, -0.04010009765625, 0.0279541015625, 0.03515625, -0.021636962890625, 0.0445556640625, 0.02392578125, -0.00844573974609375, -0.0235595703125, 0.00006860494613647461, -0.06353759765625, 0.048675537109375, 0.01104736328125, -0.0086822509765625, 0.007556915283203125, 0.020233154296875, 
-0.0235748291015625, -0.0237274169921875, -0.017669677734375, 0.055328369140625, 0.01409149169921875, -0.00640869140625, 0.0703125, -0.01922607421875, 0.0767822265625, -0.05419921875, 0.01216888427734375, 0.019989013671875, 0.00942230224609375, -0.026611328125, -0.039703369140625, -0.0080413818359375, -0.0184783935546875, -0.0167083740234375, -0.00559234619140625, 0.0537109375, -0.0517578125, -0.0263519287109375, 0.04290771484375, 0.01861572265625, 0.024261474609375, -0.018341064453125, -0.06005859375, 0.029205322265625, 0.0105133056640625, -0.01776123046875, 0.0159149169921875, 0.0021724700927734375, 0.027618408203125, 0.041046142578125, 0.0673828125, 0.01354217529296875, 0.01384735107421875, 0.0300140380859375, 0.037261962890625, -0.0419921875, -0.052276611328125, -0.055938720703125, 0.0386962890625, -0.01073455810546875, -0.016845703125, 0.06396484375, 0.05377197265625, 0.06793212890625, 0.0026035308837890625, 0.053314208984375, 0.0184326171875, 0.050811767578125, -0.033721923828125, 0.06390380859375, -0.05181884765625, 0.01520538330078125, -0.029022216796875, -0.06378173828125, -0.0006618499755859375, 0.044189453125, -0.0023288726806640625, 0.00922393798828125, 0.0186309814453125, 0.05938720703125, -0.00847625732421875, -0.0007042884826660156, 0.05126953125, 0.01544952392578125, 0.0323486328125, 0.0287017822265625, 0.0570068359375, -0.044830322265625, 0.056182861328125, -0.024688720703125, -0.00495147705078125, -0.0188446044921875, -0.045867919921875, -0.061004638671875, -0.06280517578125, -0.033416748046875, -0.0237579345703125, -0.004344940185546875, 0.090087890625, 0.0870361328125, -0.071044921875, -0.0269622802734375, 0.015594482421875, 0.00345611572265625, -0.02008056640625, -0.01407623291015625, 0.04681396484375, -0.0112457275390625, -0.037750244140625, 0.053863525390625, 0.006565093994140625, 0.0184783935546875, -0.012054443359375, -0.0292816162109375, -0.023162841796875, -0.0021305084228515625, 0.041839599609375, 0.0201263427734375, -0.07708740234375, 
-0.02178955078125, -0.0203399658203125, -0.0038604736328125, 0.0192718505859375, 0.039886474609375, -0.05206298828125, 0.04376220703125, 0.034027099609375, 0.021240234375, 0.0697021484375, -0.00734710693359375, 0.0162506103515625, -0.07183837890625, 0.01947021484375, 0.0218658447265625, 0.03619384765625, 0.033203125, -0.01029205322265625, 0.0013723373413085938, 0.0195770263671875, -0.04345703125, -0.074462890625, -0.001468658447265625, -0.0958251953125, -0.015411376953125, 0.0794677734375, -0.00690460205078125, -0.003498077392578125, -0.0247039794921875, -0.0294952392578125, 0.04522705078125, -0.031585693359375, 0.0236053466796875, 0.034332275390625, -0.0101165771484375, -0.01337432861328125, -0.03302001953125, 0.047149658203125, 0.0191650390625, -0.027191162109375, -0.000029087066650390625, 0.01348876953125, 0.049560546875, 0.01708984375, 0.056304931640625, 0.005645751953125, 0.0204315185546875, 0.011260986328125, 0.0311431884765625, -0.04046630859375, -0.0095062255859375, -0.0340576171875, 0.01910400390625, -0.00811767578125, -0.04461669921875 ] ]
deepset/electra-base-squad2
2023-09-27T12:04:02.000Z
[ "transformers", "pytorch", "safetensors", "electra", "question-answering", "en", "dataset:squad_v2", "license:cc-by-4.0", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
question-answering
deepset
null
null
deepset/electra-base-squad2
15
9,569
transformers
2022-03-02T23:29:05
--- language: en license: cc-by-4.0 datasets: - squad_v2 model-index: - name: deepset/electra-base-squad2 results: - task: type: question-answering name: Question Answering dataset: name: squad_v2 type: squad_v2 config: squad_v2 split: validation metrics: - type: exact_match value: 77.6074 name: Exact Match verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzE5NTRmMmUwYTk1MTI0NjM0ZmQwNDFmM2Y4Mjk4ZWYxOGVmOWI3ZGFiNWM4OTUxZDQ2ZjdmNmU3OTk5ZjRjYyIsInZlcnNpb24iOjF9.0VZRewdiovE4z3K5box5R0oTT7etpmd0BX44FJBLRFfot-uJ915b-bceSv3luJQ7ENPjaYSa7o7jcHlDzn3oAw - type: f1 value: 81.7181 name: F1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2VlMzM0Y2UzYjhhNTJhMTFiYWZmMDNjNjRiZDgwYzc5NWE3N2M4ZGFlYWQ0ZjVkZTE2MDU0YmMzMDc1MTY5MCIsInZlcnNpb24iOjF9.jRV58UxOM7CJJSsmxJuZvlt00jMGA1thp4aqtcFi1C8qViQ1kW7NYz8rg1gNTDZNez2UwPS1NgN_HnnwBHPbCQ - task: type: question-answering name: Question Answering dataset: name: squad type: squad config: plain_text split: validation metrics: - type: exact_match value: 80.407 name: Exact Match - type: f1 value: 88.942 name: F1 - task: type: question-answering name: Question Answering dataset: name: adversarial_qa type: adversarial_qa config: adversarialQA split: validation metrics: - type: exact_match value: 23.533 name: Exact Match - type: f1 value: 36.521 name: F1 - task: type: question-answering name: Question Answering dataset: name: squad_adversarial type: squad_adversarial config: AddOneSent split: validation metrics: - type: exact_match value: 73.867 name: Exact Match - type: f1 value: 81.381 name: F1 - task: type: question-answering name: Question Answering dataset: name: squadshifts amazon type: squadshifts config: amazon split: test metrics: - type: exact_match value: 64.512 name: Exact Match - type: f1 value: 80.166 name: F1 - task: type: question-answering name: Question Answering dataset: name: squadshifts new_wiki type: squadshifts config: new_wiki split: test metrics: - type: exact_match 
value: 76.568 name: Exact Match - type: f1 value: 87.706 name: F1 - task: type: question-answering name: Question Answering dataset: name: squadshifts nyt type: squadshifts config: nyt split: test metrics: - type: exact_match value: 77.884 name: Exact Match - type: f1 value: 87.858 name: F1 - task: type: question-answering name: Question Answering dataset: name: squadshifts reddit type: squadshifts config: reddit split: test metrics: - type: exact_match value: 64.399 name: Exact Match - type: f1 value: 78.096 name: F1 --- # electra-base for QA ## Overview **Language model:** electra-base **Language:** English **Downstream-task:** Extractive QA **Training data:** SQuAD 2.0 **Eval data:** SQuAD 2.0 **Code:** See [example](https://github.com/deepset-ai/FARM/blob/master/examples/question_answering.py) in [FARM](https://github.com/deepset-ai/FARM/blob/master/examples/question_answering.py) **Infrastructure**: 1x Tesla v100 ## Hyperparameters ``` seed=42 batch_size = 32 n_epochs = 5 base_LM_model = "google/electra-base-discriminator" max_seq_len = 384 learning_rate = 1e-4 lr_schedule = LinearWarmup warmup_proportion = 0.1 doc_stride=128 max_query_length=64 ``` ## Performance Evaluated on the SQuAD 2.0 dev set with the [official eval script](https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/). 
``` "exact": 77.30144024256717, "f1": 81.35438272008543, "total": 11873, "HasAns_exact": 74.34210526315789, "HasAns_f1": 82.45961302894314, "HasAns_total": 5928, "NoAns_exact": 80.25231286795626, "NoAns_f1": 80.25231286795626, "NoAns_total": 5945 ``` ## Usage ### In Transformers ```python from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline model_name = "deepset/electra-base-squad2" # a) Get predictions nlp = pipeline('question-answering', model=model_name, tokenizer=model_name) QA_input = { 'question': 'Why is model conversion important?', 'context': 'The option to convert models between FARM and transformers gives freedom to the user and lets people easily switch between frameworks.' } res = nlp(QA_input) # b) Load model & tokenizer model = AutoModelForQuestionAnswering.from_pretrained(model_name) tokenizer = AutoTokenizer.from_pretrained(model_name) ``` ### In FARM ```python from farm.modeling.adaptive_model import AdaptiveModel from farm.modeling.tokenization import Tokenizer from farm.infer import Inferencer model_name = "deepset/electra-base-squad2" # a) Get predictions nlp = Inferencer.load(model_name, task_type="question_answering") QA_input = [{"questions": ["Why is model conversion important?"], "text": "The option to convert models between FARM and transformers gives freedom to the user and lets people easily switch between frameworks."}] res = nlp.inference_from_dicts(dicts=QA_input) # b) Load model & tokenizer model = AdaptiveModel.convert_from_transformers(model_name, device="cpu", task_type="question_answering") tokenizer = Tokenizer.load(model_name) ``` ### In haystack For doing QA at scale (i.e. 
many docs instead of a single paragraph), you can load the model also in [haystack](https://github.com/deepset-ai/haystack/): ```python reader = FARMReader(model_name_or_path="deepset/electra-base-squad2") # or reader = TransformersReader(model="deepset/electra-base-squad2",tokenizer="deepset/electra-base-squad2") ``` ## Authors Vaishali Pal `vaishali.pal [at] deepset.ai` Branden Chan: `branden.chan [at] deepset.ai` Timo Möller: `timo.moeller [at] deepset.ai` Malte Pietsch: `malte.pietsch [at] deepset.ai` Tanay Soni: `tanay.soni [at] deepset.ai` ## About us ![deepset logo](https://workablehr.s3.amazonaws.com/uploads/account/logo/476306/logo) We bring NLP to the industry via open source! Our focus: Industry specific language models & large scale QA systems. Some of our work: - [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert) - [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad) - [FARM](https://github.com/deepset-ai/FARM) - [Haystack](https://github.com/deepset-ai/haystack/) Get in touch: [Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai) By the way: [we're hiring!](http://www.deepset.ai/jobs)
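Because the model is trained on SQuAD 2.0, it can also predict that a question is unanswerable: the score of the null (no-answer) candidate is compared against the best span candidate, as reflected by the `NoAns_*` metrics above. A schematic sketch of that selection rule (the scores and threshold here are purely illustrative, not produced by the model):

```python
def pick_answer(best_span_score: float, null_score: float,
                null_threshold: float = 0.0) -> str:
    """SQuAD 2.0-style answer selection: return the span answer only if it
    beats the null (no-answer) score by more than the threshold."""
    if best_span_score - null_score > null_threshold:
        return "span"
    return "no answer"

# illustrative scores only
print(pick_answer(best_span_score=5.2, null_score=3.1))  # span
print(pick_answer(best_span_score=2.0, null_score=4.5))  # no answer
```

In the Transformers pipeline this behavior is controlled by the `handle_impossible_answer` argument of the question-answering pipeline.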
7,180
[ [ -0.027130126953125, -0.047027587890625, 0.0247802734375, 0.0072174072265625, 0.010162353515625, 0.01004791259765625, -0.01617431640625, -0.0272216796875, 0.00785064697265625, 0.03173828125, -0.04833984375, -0.0479736328125, -0.0120391845703125, 0.008148193359375, -0.02923583984375, 0.07098388671875, -0.0030040740966796875, -0.003170013427734375, -0.005313873291015625, -0.0080108642578125, -0.02337646484375, -0.0352783203125, -0.050872802734375, -0.03033447265625, 0.01355743408203125, 0.03607177734375, 0.0408935546875, 0.00717926025390625, 0.031219482421875, 0.0248565673828125, -0.004230499267578125, 0.0153961181640625, -0.0189056396484375, 0.0200042724609375, -0.01000213623046875, -0.039642333984375, -0.0325927734375, -0.0111541748046875, 0.038909912109375, 0.020721435546875, -0.015838623046875, 0.03985595703125, -0.0198211669921875, 0.0552978515625, -0.03948974609375, 0.016693115234375, -0.05224609375, -0.01763916015625, 0.0191802978515625, 0.0160064697265625, -0.02044677734375, -0.0189056396484375, 0.0166778564453125, -0.03936767578125, 0.022308349609375, -0.01666259765625, 0.10760498046875, 0.02008056640625, -0.010345458984375, -0.009613037109375, -0.047698974609375, 0.05743408203125, -0.09027099609375, 0.00782012939453125, 0.0299224853515625, 0.0254669189453125, 0.007381439208984375, -0.06536865234375, -0.0570068359375, -0.0034198760986328125, -0.0008363723754882812, 0.01373291015625, -0.0192718505859375, -0.00537109375, 0.01024627685546875, 0.029052734375, -0.036102294921875, 0.01105499267578125, -0.03948974609375, -0.00922393798828125, 0.0634765625, 0.0127105712890625, 0.022613525390625, -0.01617431640625, 0.00144195556640625, -0.0244293212890625, -0.04107666015625, 0.0193023681640625, 0.0218658447265625, 0.0271148681640625, -0.0142059326171875, 0.04058837890625, -0.036773681640625, 0.0309906005859375, 0.0205535888671875, 0.0308380126953125, 0.05035400390625, -0.03326416015625, -0.0227203369140625, -0.00890350341796875, 0.07830810546875, 0.0196533203125, 
0.00627899169921875, 0.0007152557373046875, -0.0191192626953125, -0.0076141357421875, 0.0183258056640625, -0.0643310546875, -0.0217742919921875, 0.048858642578125, -0.019866943359375, -0.0357666015625, -0.0041961669921875, -0.06756591796875, -0.022216796875, 0.0016727447509765625, 0.037872314453125, -0.031036376953125, -0.040679931640625, 0.018798828125, -0.0035457611083984375, 0.050445556640625, 0.0017824172973632812, -0.06634521484375, 0.01403045654296875, 0.03607177734375, 0.051177978515625, 0.0158843994140625, -0.0295562744140625, -0.0206146240234375, -0.0200042724609375, -0.01541900634765625, 0.0491943359375, -0.0111083984375, 0.004978179931640625, 0.004825592041015625, 0.020751953125, -0.01995849609375, -0.0357666015625, 0.027008056640625, -0.053314208984375, 0.0296783447265625, -0.00452423095703125, -0.038787841796875, -0.031585693359375, 0.0222015380859375, -0.0531005859375, 0.0853271484375, 0.0271148681640625, -0.036651611328125, 0.00942230224609375, -0.05584716796875, -0.024749755859375, 0.0022678375244140625, 0.0076904296875, -0.038604736328125, -0.015655517578125, 0.02862548828125, 0.034912109375, -0.0157623291015625, 0.018524169921875, -0.009185791015625, -0.0216827392578125, 0.01488494873046875, -0.00566864013671875, 0.074951171875, 0.010162353515625, -0.04266357421875, 0.0162811279296875, -0.0576171875, 0.0285186767578125, 0.0107574462890625, -0.0164337158203125, -0.0012950897216796875, 0.009613037109375, 0.01007843017578125, 0.026702880859375, 0.0281219482421875, -0.03338623046875, 0.0023517608642578125, -0.05938720703125, 0.05291748046875, 0.045684814453125, -0.0054931640625, 0.0306396484375, -0.0174102783203125, 0.046844482421875, 0.003993988037109375, 0.00296783447265625, 0.01148223876953125, -0.028411865234375, -0.08740234375, -0.0011768341064453125, 0.0323486328125, 0.061370849609375, -0.057861328125, 0.06427001953125, -0.01297760009765625, -0.053741455078125, -0.07489013671875, 0.006969451904296875, 0.0139617919921875, 0.0271148681640625, 
0.037994384765625, 0.01007080078125, -0.060882568359375, -0.07330322265625, 0.0013170242309570312, -0.0185089111328125, -0.006557464599609375, 0.0134735107421875, 0.062286376953125, -0.032684326171875, 0.066162109375, -0.034423828125, -0.023284912109375, -0.0399169921875, -0.01142120361328125, 0.053192138671875, 0.043243408203125, 0.040313720703125, -0.05340576171875, -0.0325927734375, 0.00044083595275878906, -0.047821044921875, 0.0132293701171875, -0.0064544677734375, -0.01117706298828125, 0.00945281982421875, 0.025115966796875, -0.057525634765625, 0.006015777587890625, 0.04766845703125, -0.04205322265625, 0.0423583984375, 0.0015134811401367188, 0.00528717041015625, -0.109375, 0.0171356201171875, -0.005367279052734375, -0.01287078857421875, -0.035064697265625, 0.022613525390625, -0.005840301513671875, -0.003810882568359375, -0.041015625, 0.039093017578125, -0.01470947265625, 0.01387786865234375, 0.0079345703125, -0.0069122314453125, 0.00977325439453125, 0.04217529296875, -0.01250457763671875, 0.0789794921875, 0.043548583984375, -0.051788330078125, 0.04559326171875, 0.037872314453125, -0.033843994140625, 0.005245208740234375, -0.066650390625, 0.01380157470703125, 0.007381439208984375, 0.00506591796875, -0.0660400390625, -0.01221466064453125, 0.018280029296875, -0.04638671875, 0.00437164306640625, -0.00229644775390625, -0.040985107421875, -0.03302001953125, -0.0270538330078125, 0.0125885009765625, 0.061126708984375, -0.0260467529296875, 0.03668212890625, 0.02587890625, -0.008331298828125, -0.038818359375, -0.05072021484375, -0.0090484619140625, -0.005771636962890625, -0.0406494140625, 0.0201416015625, 0.0009026527404785156, -0.01104736328125, 0.01180267333984375, 0.004726409912109375, -0.0316162109375, 0.00919342041015625, 0.0005655288696289062, 0.0280609130859375, -0.0333251953125, 0.0196990966796875, 0.002674102783203125, -0.0012607574462890625, 0.00734710693359375, -0.013214111328125, 0.0546875, -0.0501708984375, -0.0002989768981933594, -0.040008544921875, 
0.01654052734375, 0.047882080078125, -0.033111572265625, 0.06195068359375, 0.050811767578125, -0.024383544921875, -0.015716552734375, -0.0438232421875, -0.0103302001953125, -0.036529541015625, 0.048828125, -0.02020263671875, -0.052459716796875, 0.05133056640625, 0.01226043701171875, 0.022369384765625, 0.07818603515625, 0.044952392578125, -0.0174560546875, 0.0828857421875, 0.044708251953125, -0.00821685791015625, 0.031402587890625, -0.06756591796875, 0.00884246826171875, -0.07012939453125, -0.0181121826171875, -0.04522705078125, -0.0186309814453125, -0.042205810546875, -0.018096923828125, 0.007259368896484375, 0.00567626953125, -0.04168701171875, 0.034576416015625, -0.0577392578125, 0.0265960693359375, 0.046051025390625, 0.0090484619140625, 0.0023517608642578125, -0.003269195556640625, 0.00927734375, 0.016815185546875, -0.05450439453125, -0.037872314453125, 0.08172607421875, 0.025726318359375, 0.035552978515625, 0.0002129077911376953, 0.053985595703125, 0.00568389892578125, 0.0012760162353515625, -0.0570068359375, 0.035552978515625, -0.007801055908203125, -0.055511474609375, -0.035736083984375, -0.027862548828125, -0.06964111328125, 0.01282501220703125, -0.00629425048828125, -0.049041748046875, 0.0299835205078125, -0.00357818603515625, -0.0380859375, 0.0298004150390625, -0.0560302734375, 0.0721435546875, -0.00896453857421875, -0.000553131103515625, -0.0105743408203125, -0.0396728515625, 0.0128936767578125, 0.00643157958984375, 0.00849151611328125, -0.0164947509765625, 0.0049896240234375, 0.0648193359375, -0.039825439453125, 0.061614990234375, -0.0011434555053710938, 0.004909515380859375, 0.03326416015625, -0.0038166046142578125, 0.0297698974609375, 0.01027679443359375, -0.0275115966796875, 0.02154541015625, 0.02459716796875, -0.0296478271484375, -0.027923583984375, 0.046356201171875, -0.07763671875, -0.0195770263671875, -0.03240966796875, -0.0340576171875, -0.01464080810546875, 0.023590087890625, 0.032806396484375, 0.0163421630859375, -0.001277923583984375, 
0.0228424072265625, 0.050506591796875, -0.0186309814453125, 0.0360107421875, 0.03594970703125, -0.0080718994140625, -0.0282745361328125, 0.06103515625, -0.01007080078125, 0.01302337646484375, 0.0280914306640625, 0.0012521743774414062, -0.030487060546875, -0.035614013671875, -0.0419921875, 0.01238250732421875, -0.044525146484375, -0.0302886962890625, -0.04669189453125, -0.039398193359375, -0.043487548828125, 0.0058135986328125, -0.042266845703125, -0.02337646484375, -0.037841796875, -0.019927978515625, 0.052001953125, 0.0302276611328125, 0.0084381103515625, 0.010284423828125, -0.056640625, 0.0226898193359375, 0.0294189453125, 0.021820068359375, -0.003376007080078125, -0.03924560546875, -0.0192413330078125, 0.024993896484375, -0.0120849609375, -0.064208984375, 0.03985595703125, 0.0182037353515625, 0.02740478515625, -0.00568389892578125, 0.007572174072265625, 0.034454345703125, -0.02496337890625, 0.06854248046875, 0.005126953125, -0.07086181640625, 0.0499267578125, -0.0172576904296875, 0.0241546630859375, 0.073974609375, 0.006877899169921875, -0.04119873046875, -0.0211029052734375, -0.05364990234375, -0.0875244140625, 0.049041748046875, 0.0212860107421875, 0.022247314453125, 0.0022106170654296875, 0.016571044921875, -0.003910064697265625, 0.02239990234375, -0.0501708984375, -0.022613525390625, -0.01511383056640625, -0.029998779296875, -0.001728057861328125, -0.01020050048828125, -0.0035552978515625, -0.03985595703125, 0.07513427734375, -0.0012693405151367188, 0.0200042724609375, 0.0323486328125, -0.0188140869140625, 0.002834320068359375, 0.032867431640625, 0.025482177734375, 0.05914306640625, -0.022918701171875, -0.017822265625, 0.027740478515625, -0.0288543701171875, 0.0125885009765625, 0.024993896484375, -0.041259765625, 0.01345062255859375, 0.0243377685546875, 0.06378173828125, 0.0009593963623046875, -0.05364990234375, 0.0330810546875, 0.00518798828125, -0.02886962890625, -0.043304443359375, 0.00803375244140625, 0.005954742431640625, 0.0285186767578125, 
0.0304412841796875, -0.004344940185546875, 0.004253387451171875, -0.0296783447265625, 0.01128387451171875, 0.0149993896484375, -0.03350830078125, -0.015777587890625, 0.04461669921875, 0.0260467529296875, -0.0295257568359375, 0.06854248046875, -0.02447509765625, -0.0513916015625, 0.07745361328125, 0.0238189697265625, 0.0848388671875, -0.001789093017578125, 0.037811279296875, 0.0380859375, 0.029388427734375, 0.00035500526428222656, 0.03973388671875, 0.022430419921875, -0.0667724609375, -0.0386962890625, -0.050872802734375, -0.0226593017578125, 0.0355224609375, -0.048126220703125, 0.00862884521484375, -0.0224761962890625, -0.004032135009765625, -0.004913330078125, 0.01486968994140625, -0.06951904296875, 0.008148193359375, -0.00936126708984375, 0.06597900390625, -0.04998779296875, 0.0404052734375, 0.0712890625, -0.05950927734375, -0.08428955078125, -0.0176544189453125, -0.035491943359375, -0.060455322265625, 0.032562255859375, 0.013641357421875, 0.00797271728515625, 0.0261077880859375, -0.047393798828125, -0.062744140625, 0.08831787109375, 0.010528564453125, -0.041351318359375, -0.02557373046875, -0.006175994873046875, 0.04327392578125, -0.020721435546875, 0.0241851806640625, 0.0489501953125, 0.032440185546875, -0.015045166015625, -0.072021484375, 0.0157012939453125, -0.035308837890625, -0.015045166015625, 0.006839752197265625, -0.0482177734375, 0.068115234375, -0.0225677490234375, -0.00414276123046875, 0.013916015625, 0.039947509765625, 0.02008056640625, 0.0065460205078125, 0.03253173828125, 0.0384521484375, 0.052947998046875, -0.0032558441162109375, 0.0672607421875, -0.0166778564453125, 0.061981201171875, 0.07330322265625, -0.01546478271484375, 0.07305908203125, 0.023956298828125, -0.0289764404296875, 0.06781005859375, 0.04876708984375, -0.0289306640625, 0.035430908203125, 0.002040863037109375, -0.00432586669921875, -0.0207366943359375, 0.01000213623046875, -0.05291748046875, 0.0295867919921875, 0.0112762451171875, -0.0237579345703125, -0.0139617919921875, 
-0.0139617919921875, -0.00998687744140625, -0.00937652587890625, -0.0157012939453125, 0.053741455078125, -0.00537872314453125, -0.03997802734375, 0.056640625, -0.0061798095703125, 0.056365966796875, -0.052001953125, -0.00817108154296875, -0.006038665771484375, 0.024078369140625, -0.0220947265625, -0.050872802734375, 0.0038738250732421875, -0.0142974853515625, -0.025238037109375, -0.0083160400390625, 0.048187255859375, -0.0340576171875, -0.056549072265625, 0.008148193359375, 0.04681396484375, 0.00811004638671875, -0.01369476318359375, -0.07244873046875, -0.020721435546875, 0.002567291259765625, -0.03472900390625, 0.005397796630859375, 0.01297760009765625, 0.020965576171875, 0.051788330078125, 0.054595947265625, 0.012908935546875, -0.00023663043975830078, 0.00910186767578125, 0.06793212890625, -0.05169677734375, -0.030303955078125, -0.056640625, 0.053192138671875, -0.0174560546875, -0.044342041015625, 0.043121337890625, 0.0433349609375, 0.061737060546875, -0.01522064208984375, 0.05914306640625, -0.0192718505859375, 0.034027099609375, -0.0350341796875, 0.073486328125, -0.0406494140625, 0.0098876953125, -0.0024204254150390625, -0.054595947265625, 0.005214691162109375, 0.0511474609375, -0.01580810546875, -0.00029277801513671875, 0.05889892578125, 0.056732177734375, 0.004375457763671875, -0.01904296875, -0.002468109130859375, 0.0276336669921875, 0.0241851806640625, 0.058319091796875, 0.056365966796875, -0.061767578125, 0.047882080078125, -0.0369873046875, 0.0013494491577148438, -0.0083160400390625, -0.036956787109375, -0.0635986328125, -0.0423583984375, -0.0246124267578125, -0.049224853515625, -0.005786895751953125, 0.06781005859375, 0.053619384765625, -0.087646484375, -0.01537322998046875, -0.021240234375, 0.02203369140625, -0.036468505859375, -0.0267791748046875, 0.036590576171875, -0.031982421875, -0.051116943359375, 0.0275726318359375, -0.016815185546875, -0.0148162841796875, -0.0212860107421875, 0.0058746337890625, -0.03985595703125, -0.00916290283203125, 
0.03472900390625, 0.03009033203125, -0.049407958984375, -0.0111541748046875, 0.01441192626953125, -0.01087188720703125, 0.00252532958984375, 0.029815673828125, -0.0662841796875, 0.015838623046875, 0.0465087890625, 0.050140380859375, 0.0299224853515625, -0.018524169921875, 0.051910400390625, -0.036956787109375, 0.0092315673828125, 0.043182373046875, 0.01837158203125, 0.0184783935546875, -0.037689208984375, 0.0457763671875, 0.002269744873046875, -0.040802001953125, -0.06781005859375, 0.0009083747863769531, -0.06268310546875, -0.0203857421875, 0.0966796875, 0.00177764892578125, -0.01397705078125, -0.00191497802734375, -0.0126190185546875, 0.02105712890625, -0.038482666015625, 0.049346923828125, 0.047882080078125, 0.01007080078125, -0.006244659423828125, -0.049102783203125, 0.032989501953125, 0.033447265625, -0.0731201171875, -0.01317596435546875, 0.022247314453125, 0.032501220703125, 0.01186370849609375, 0.043426513671875, 0.01142120361328125, 0.035888671875, -0.014434814453125, 0.0260162353515625, -0.01546478271484375, 0.00588226318359375, -0.03173828125, -0.0012950897216796875, -0.0168914794921875, -0.03375244140625 ] ]
gogamza/kobart-summarization
2023-06-12T23:54:06.000Z
[ "transformers", "pytorch", "safetensors", "bart", "text2text-generation", "ko", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
text2text-generation
gogamza
null
null
gogamza/kobart-summarization
5
9,568
transformers
2022-03-02T23:29:05
---
language: ko
tags:
- bart
license: mit
---

# Korean News Summarization Model

## Demo

https://huggingface.co/spaces/gogamza/kobart-summarization

## How to use

```python
import torch
from transformers import PreTrainedTokenizerFast
from transformers import BartForConditionalGeneration

tokenizer = PreTrainedTokenizerFast.from_pretrained('gogamza/kobart-summarization')
model = BartForConditionalGeneration.from_pretrained('gogamza/kobart-summarization')

text = "과거를 떠올려보자. 방송을 보던 우리의 모습을. 독보적인 매체는 TV였다. 온 가족이 둘러앉아 TV를 봤다. 간혹 가족들끼리 뉴스와 드라마, 예능 프로그램을 둘러싸고 리모컨 쟁탈전이 벌어지기도 했다. 각자 선호하는 프로그램을 ‘본방’으로 보기 위한 싸움이었다. TV가 한 대인지 두 대인지 여부도 그래서 중요했다. 지금은 어떤가. ‘안방극장’이라는 말은 옛말이 됐다. TV가 없는 집도 많다. 미디어의 혜택을 누릴 수 있는 방법은 늘어났다. 각자의 방에서 각자의 휴대폰으로, 노트북으로, 태블릿으로 콘텐츠를 즐긴다."

raw_input_ids = tokenizer.encode(text)
input_ids = [tokenizer.bos_token_id] + raw_input_ids + [tokenizer.eos_token_id]

summary_ids = model.generate(torch.tensor([input_ids]))
tokenizer.decode(summary_ids.squeeze().tolist(), skip_special_tokens=True)
```
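The card above wraps the encoded ids in BOS/EOS tokens by hand before calling `model.generate`. That step can be sketched on its own, without downloading the model; the ids below are placeholders (in practice the values come from `tokenizer.bos_token_id` and `tokenizer.eos_token_id` of the loaded KoBART tokenizer):

```python
# Sketch of the manual special-token wrapping step from the card.
# The ids 0 and 1 are assumed placeholders, not KoBART's real special-token ids.

def wrap_for_generation(raw_input_ids, bos_token_id=0, eos_token_id=1):
    """Prepend BOS and append EOS, mirroring the step before model.generate."""
    return [bos_token_id] + list(raw_input_ids) + [eos_token_id]

print(wrap_for_generation([101, 202, 303]))  # -> [0, 101, 202, 303, 1]
```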
1,026
[ [ -0.0273895263671875, -0.039520263671875, 0.0157012939453125, 0.0345458984375, -0.045501708984375, 0.003971099853515625, -0.00800323486328125, 0.0186920166015625, 0.036224365234375, 0.021026611328125, -0.022674560546875, -0.045074462890625, -0.054412841796875, 0.01020050048828125, -0.01629638671875, 0.09930419921875, -0.00984954833984375, 0.0079193115234375, 0.003936767578125, -0.01224517822265625, -0.0289459228515625, -0.0139007568359375, -0.046722412109375, -0.03424072265625, -0.0024738311767578125, 0.01421356201171875, 0.053009033203125, 0.019134521484375, 0.02972412109375, 0.035919189453125, -0.0239105224609375, -0.0010976791381835938, -0.022247314453125, -0.0011043548583984375, -0.004261016845703125, -0.0357666015625, -0.0229949951171875, -0.0157623291015625, 0.0325927734375, 0.02587890625, 0.0112457275390625, 0.0287017822265625, -0.00672149658203125, 0.042327880859375, -0.0139617919921875, 0.0228271484375, -0.005096435546875, -0.00012135505676269531, 0.01363372802734375, -0.01024627685546875, -0.0009431838989257812, -0.041473388671875, 0.004596710205078125, -0.041290283203125, 0.0206756591796875, -0.00360870361328125, 0.10205078125, 0.007503509521484375, -0.0435791015625, -0.0209197998046875, -0.043548583984375, 0.05780029296875, -0.06695556640625, 0.00896453857421875, 0.02105712890625, 0.0171051025390625, -0.00289154052734375, -0.0828857421875, -0.03753662109375, -0.00736236572265625, -0.01116943359375, 0.03521728515625, -0.0008792877197265625, 0.00782012939453125, 0.0214080810546875, 0.0362548828125, -0.0421142578125, -0.0308685302734375, -0.041229248046875, -0.0167999267578125, 0.0411376953125, 0.022247314453125, 0.030670166015625, -0.042236328125, -0.01421356201171875, -0.004772186279296875, -0.0111236572265625, 0.0151214599609375, 0.0352783203125, -0.01032257080078125, -0.036285400390625, 0.055267333984375, -0.042144775390625, 0.03814697265625, 0.007106781005859375, -0.0291900634765625, 0.051727294921875, -0.038787841796875, -0.0250396728515625, 
0.004772186279296875, 0.06298828125, 0.05780029296875, 0.022918701171875, 0.01922607421875, 0.0026264190673828125, -0.0035686492919921875, -0.0007715225219726562, -0.070068359375, -0.018463134765625, 0.03106689453125, -0.05706787109375, -0.04229736328125, 0.0181427001953125, -0.06585693359375, 0.00528717041015625, -0.0070953369140625, 0.035247802734375, -0.0216827392578125, -0.041412353515625, 0.0128631591796875, -0.030517578125, 0.0005822181701660156, 0.01102447509765625, -0.0655517578125, 0.006557464599609375, 0.006198883056640625, 0.06939697265625, 0.0234527587890625, -0.016754150390625, 0.00832366943359375, -0.003559112548828125, -0.019622802734375, 0.034088134765625, 0.016571044921875, -0.0269622802734375, -0.004673004150390625, 0.01220703125, -0.0097198486328125, -0.0228424072265625, 0.03448486328125, -0.03955078125, 0.025390625, 0.007244110107421875, -0.03289794921875, -0.0234832763671875, 0.012603759765625, -0.04522705078125, 0.07855224609375, 0.0271148681640625, -0.08184814453125, 0.01812744140625, -0.04425048828125, -0.03076171875, -0.01094818115234375, 0.0010623931884765625, -0.061492919921875, 0.0157623291015625, 0.0240325927734375, 0.036895751953125, 0.027801513671875, 0.02679443359375, -0.005771636962890625, -0.0298004150390625, 0.0125579833984375, -0.0173187255859375, 0.07958984375, 0.0187530517578125, -0.0159759521484375, 0.043670654296875, -0.06494140625, -0.0015325546264648438, 0.024169921875, -0.039093017578125, -0.009124755859375, -0.0283355712890625, 0.0164031982421875, 0.0206451416015625, 0.017974853515625, -0.041168212890625, 0.0158843994140625, -0.049591064453125, 0.03985595703125, 0.044158935546875, 0.01226043701171875, 0.036102294921875, -0.0278167724609375, 0.038116455078125, -0.0013074874877929688, -0.0017023086547851562, -0.023681640625, -0.0218048095703125, -0.049102783203125, -0.032379150390625, 0.0278778076171875, 0.038177490234375, -0.08184814453125, 0.04254150390625, -0.00734710693359375, -0.044189453125, -0.02935791015625, 
-0.0257415771484375, 0.001850128173828125, 0.031402587890625, 0.0188751220703125, 0.0002079010009765625, -0.07086181640625, -0.03179931640625, -0.01314544677734375, -0.0005016326904296875, -0.024200439453125, 0.01332855224609375, 0.051849365234375, -0.0031585693359375, 0.0699462890625, -0.03857421875, -0.0132904052734375, -0.0582275390625, 0.00665283203125, 0.06500244140625, 0.04376220703125, 0.039703369140625, -0.0604248046875, -0.051055908203125, -0.01174163818359375, -0.0635986328125, -0.003978729248046875, -0.005977630615234375, 0.00785064697265625, 0.00806427001953125, 0.02874755859375, -0.047271728515625, 0.059539794921875, 0.0190582275390625, -0.0526123046875, 0.07891845703125, -0.023590087890625, 0.019134521484375, -0.1156005859375, 0.005001068115234375, -0.00933074951171875, -0.0438232421875, -0.03173828125, 0.00788116455078125, -0.0044708251953125, -0.01251220703125, -0.046417236328125, 0.055419921875, -0.024688720703125, 0.013702392578125, -0.0251617431640625, 0.0066375732421875, 0.0018243789672851562, 0.041839599609375, -0.009368896484375, 0.03594970703125, 0.038116455078125, -0.046539306640625, 0.06787109375, 0.022125244140625, -0.034088134765625, 0.0134429931640625, -0.064208984375, -0.023345947265625, -0.0115203857421875, 0.0309906005859375, -0.09271240234375, -0.031646728515625, 0.0390625, -0.05914306640625, 0.00951385498046875, -0.008636474609375, -0.033966064453125, -0.039764404296875, -0.034088134765625, 0.0300140380859375, 0.0292510986328125, -0.03216552734375, 0.04443359375, 0.0012235641479492188, -0.0178375244140625, -0.044586181640625, -0.0386962890625, -0.0201416015625, -0.024261474609375, -0.045684814453125, 0.032012939453125, 0.0041656494140625, 0.0040130615234375, 0.00649261474609375, -0.01445770263671875, 0.007297515869140625, -0.0118408203125, 0.00783538818359375, 0.0374755859375, -0.01169586181640625, -0.00821685791015625, 0.0183868408203125, -0.03466796875, 0.0010156631469726562, -0.006679534912109375, 0.06396484375, 
-0.012725830078125, -0.0107269287109375, -0.054779052734375, 0.0101318359375, 0.03472900390625, 0.0033855438232421875, 0.05120849609375, 0.061920166015625, -0.0176239013671875, 0.0110626220703125, -0.016876220703125, -0.0036220550537109375, -0.0343017578125, 0.07916259765625, -0.0192413330078125, -0.036834716796875, 0.061248779296875, -0.0197601318359375, -0.01212310791015625, 0.04998779296875, 0.057281494140625, 0.00426483154296875, 0.08831787109375, 0.051788330078125, -0.0128021240234375, 0.045928955078125, -0.052734375, 0.0164947509765625, -0.0594482421875, -0.0253143310546875, -0.018707275390625, -0.01490020751953125, -0.0303802490234375, 0.0016946792602539062, 0.0292510986328125, 0.0260467529296875, -0.0246124267578125, 0.02191162109375, -0.0404052734375, 0.033599853515625, 0.0310821533203125, 0.0239715576171875, 0.01338958740234375, -0.004329681396484375, -0.0185699462890625, -0.00408172607421875, -0.041107177734375, -0.0224456787109375, 0.0865478515625, 0.0273895263671875, 0.0611572265625, 0.004241943359375, 0.05694580078125, -0.00437164306640625, 0.0002913475036621094, -0.05938720703125, 0.037994384765625, 0.009521484375, -0.04351806640625, -0.0282135009765625, -0.0285491943359375, -0.0609130859375, 0.034393310546875, -0.00004464387893676758, -0.07305908203125, 0.00948333740234375, -0.0124969482421875, -0.0261077880859375, 0.033050537109375, -0.0311737060546875, 0.0662841796875, 0.00312042236328125, -0.0170745849609375, -0.001865386962890625, -0.045501708984375, 0.03045654296875, 0.032379150390625, 0.01114654541015625, 0.0008711814880371094, 0.0060882568359375, 0.076904296875, -0.05047607421875, 0.045074462890625, -0.005748748779296875, 0.018585205078125, 0.0187835693359375, -0.020751953125, 0.0225067138671875, 0.04693603515625, 0.007686614990234375, 0.003032684326171875, 0.0028972625732421875, -0.030181884765625, -0.034637451171875, 0.039642333984375, -0.07098388671875, -0.027984619140625, -0.048675537109375, -0.036468505859375, 0.0244140625, 
0.040435791015625, 0.06396484375, 0.042694091796875, 0.00482177734375, 0.006801605224609375, 0.0311737060546875, -0.0084075927734375, 0.05615234375, 0.00931549072265625, -0.048370361328125, -0.0550537109375, 0.06536865234375, 0.00836181640625, 0.0092010498046875, -0.005725860595703125, 0.0178070068359375, -0.0296630859375, -0.018218994140625, -0.0372314453125, 0.0223236083984375, -0.0577392578125, -0.0234222412109375, -0.038055419921875, -0.034698486328125, -0.04327392578125, -0.01128387451171875, -0.0278167724609375, -0.03271484375, -0.0261688232421875, -0.0244903564453125, 0.01641845703125, 0.039215087890625, -0.0086517333984375, 0.034088134765625, -0.050048828125, 0.038330078125, 0.003002166748046875, 0.0174102783203125, 0.004444122314453125, -0.05517578125, -0.0382080078125, -0.0215911865234375, -0.03887939453125, -0.076904296875, 0.055572509765625, -0.017974853515625, 0.037139892578125, 0.033538818359375, 0.01464080810546875, 0.029144287109375, -0.007587432861328125, 0.0587158203125, 0.0221405029296875, -0.0718994140625, 0.0246124267578125, -0.0182647705078125, 0.041015625, 0.021331787109375, 0.0205078125, -0.05609130859375, -0.026947021484375, -0.045684814453125, -0.08941650390625, 0.077392578125, 0.028839111328125, 0.01561737060546875, 0.017547607421875, -0.00766754150390625, -0.0096588134765625, 0.0226287841796875, -0.0675048828125, -0.042633056640625, -0.0452880859375, -0.033355712890625, 0.0201568603515625, -0.01380157470703125, -0.0027866363525390625, -0.037750244140625, 0.07476806640625, 0.0152130126953125, 0.03851318359375, 0.035430908203125, -0.01506805419921875, 0.009063720703125, 0.003574371337890625, 0.0628662109375, 0.050048828125, -0.0270233154296875, -0.0035572052001953125, 0.0033893585205078125, -0.0482177734375, 0.0157012939453125, 0.0016202926635742188, -0.022125244140625, 0.041015625, 0.0297393798828125, 0.08013916015625, 0.005962371826171875, -0.0248260498046875, 0.04296875, -0.019256591796875, -0.035064697265625, -0.039306640625, 
-0.01548004150390625, 0.0194091796875, 0.0256195068359375, 0.033447265625, 0.0200653076171875, -0.00801849365234375, -0.019073486328125, 0.0025806427001953125, -0.005565643310546875, -0.0136260986328125, -0.0022716522216796875, 0.059814453125, 0.01080322265625, -0.024627685546875, 0.041290283203125, -0.0235137939453125, -0.0594482421875, 0.077880859375, 0.0489501953125, 0.060272216796875, -0.0169525146484375, 0.0196990966796875, 0.05133056640625, -0.000823974609375, -0.006679534912109375, 0.0274810791015625, 0.0049896240234375, -0.048858642578125, -0.0312347412109375, -0.035003662109375, -0.00919342041015625, 0.02716064453125, -0.045318603515625, 0.027099609375, -0.0300140380859375, -0.0180511474609375, -0.017547607421875, 0.007312774658203125, -0.0274200439453125, 0.0066680908203125, -0.0184478759765625, 0.07012939453125, -0.07049560546875, 0.051513671875, 0.04766845703125, -0.040771484375, -0.0701904296875, 0.0020465850830078125, -0.01265716552734375, -0.0499267578125, 0.04150390625, 0.0244140625, 0.022430419921875, 0.017852783203125, -0.0273895263671875, -0.072509765625, 0.07611083984375, -0.004161834716796875, -0.0031490325927734375, 0.004199981689453125, -0.005023956298828125, 0.0291900634765625, -0.0211639404296875, 0.04351806640625, 0.0241546630859375, 0.0372314453125, 0.0026683807373046875, -0.050994873046875, 0.006893157958984375, -0.033233642578125, -0.0009822845458984375, -0.0004725456237792969, -0.04888916015625, 0.0753173828125, -0.0228271484375, -0.0037670135498046875, 0.0257720947265625, 0.05535888671875, 0.0248260498046875, 0.04229736328125, 0.006000518798828125, 0.032257080078125, 0.018829345703125, -0.0175933837890625, 0.05694580078125, -0.01175689697265625, 0.07647705078125, 0.051025390625, 0.0218505859375, 0.0171966552734375, 0.018829345703125, -0.035919189453125, 0.045135498046875, 0.05322265625, -0.0249481201171875, 0.059814453125, 0.01448822021484375, -0.0186309814453125, 0.0179290771484375, 0.02105712890625, -0.03887939453125, 
0.0261383056640625, 0.01224517822265625, -0.027587890625, -0.027099609375, 0.031005859375, 0.0191802978515625, -0.01003265380859375, -0.039093017578125, 0.048187255859375, -0.00762176513671875, -0.026702880859375, 0.039093017578125, 0.0222320556640625, 0.0609130859375, -0.05645751953125, -0.0015249252319335938, -0.01047515869140625, 0.0229034423828125, -0.0299072265625, -0.040008544921875, 0.01448822021484375, -0.01506805419921875, -0.0159759521484375, 0.011322021484375, 0.057281494140625, -0.0291900634765625, -0.0584716796875, 0.00879669189453125, 0.0144500732421875, 0.0255126953125, 0.00550079345703125, -0.046905517578125, 0.003124237060546875, 0.017730712890625, -0.055938720703125, 0.01314544677734375, 0.01393890380859375, 0.007724761962890625, 0.038604736328125, 0.042724609375, 0.0174560546875, 0.010162353515625, 0.0010747909545898438, 0.0684814453125, -0.04852294921875, -0.07293701171875, -0.07293701171875, 0.05548095703125, -0.024017333984375, -0.03436279296875, 0.058319091796875, 0.049041748046875, 0.06622314453125, -0.03265380859375, 0.071533203125, -0.019744873046875, 0.033721923828125, -0.036773681640625, 0.07891845703125, -0.02947998046875, -0.01788330078125, -0.0244293212890625, -0.053436279296875, -0.020782470703125, 0.057830810546875, -0.0153350830078125, 0.0239410400390625, 0.0655517578125, 0.045654296875, 0.0035381317138671875, -0.005031585693359375, 0.024169921875, 0.024627685546875, 0.01230621337890625, 0.022216796875, 0.030517578125, -0.053863525390625, 0.044769287109375, -0.037811279296875, -0.011627197265625, -0.035430908203125, -0.045562744140625, -0.08099365234375, -0.021240234375, -0.00439453125, -0.043609619140625, -0.02093505859375, 0.06597900390625, 0.04833984375, -0.06573486328125, -0.01013946533203125, -0.006195068359375, 0.038848876953125, -0.0316162109375, -0.0247650146484375, 0.05828857421875, -0.014739990234375, -0.07781982421875, -0.01082611083984375, -0.0118408203125, 0.01186370849609375, 0.0006823539733886719, 
-0.0060272216796875, -0.0205535888671875, -0.00927734375, 0.0260467529296875, 0.0182342529296875, -0.046295166015625, 0.0001494884490966797, -0.02740478515625, -0.0123138427734375, 0.002349853515625, 0.038238525390625, -0.0506591796875, 0.0191192626953125, 0.05657958984375, 0.0148162841796875, 0.047210693359375, -0.0026187896728515625, 0.0205841064453125, -0.047088623046875, 0.0299072265625, -0.00594329833984375, 0.04150390625, 0.0049896240234375, -0.031707763671875, 0.031402587890625, 0.04010009765625, -0.05450439453125, -0.049560546875, 0.0011644363403320312, -0.06671142578125, -0.02679443359375, 0.06256103515625, -0.013702392578125, -0.0156402587890625, -0.0298919677734375, -0.051177978515625, 0.05645751953125, -0.039703369140625, 0.06298828125, 0.048187255859375, -0.00891876220703125, -0.009307861328125, -0.034576416015625, 0.0258636474609375, 0.023773193359375, -0.03228759765625, -0.0257568359375, -0.0145416259765625, 0.0362548828125, 0.02252197265625, 0.059783935546875, -0.00360870361328125, 0.029571533203125, 0.01812744140625, 0.0172119140625, -0.04034423828125, 0.00041103363037109375, -0.0266571044921875, -0.00011551380157470703, -0.046478271484375, -0.053985595703125 ] ]
sentence-transformers/stsb-xlm-r-multilingual
2022-06-15T21:42:42.000Z
[ "sentence-transformers", "pytorch", "tf", "xlm-roberta", "feature-extraction", "sentence-similarity", "transformers", "arxiv:1908.10084", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
sentence-transformers
null
null
sentence-transformers/stsb-xlm-r-multilingual
18
9,567
sentence-transformers
2022-03-02T23:29:05
---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---

# sentence-transformers/stsb-xlm-r-multilingual

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.

## Usage (Sentence-Transformers)

Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('sentence-transformers/stsb-xlm-r-multilingual')
embeddings = model.encode(sentences)
print(embeddings)
```

## Usage (HuggingFace Transformers)

Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch


# Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)


# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/stsb-xlm-r-multilingual')
model = AutoModel.from_pretrained('sentence-transformers/stsb-xlm-r-multilingual')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```

## Evaluation Results

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/stsb-xlm-r-multilingual)

## Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```

## Citing & Authors

This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "http://arxiv.org/abs/1908.10084",
}
```
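The attention-masked mean pooling the card relies on can be illustrated without downloading the model. A minimal plain-Python sketch with made-up token vectors (the values are assumptions, not real XLM-R outputs): padded positions, where the mask is 0, are excluded from the average, which is exactly why the pooling must take the attention mask into account.

```python
# Toy illustration of attention-masked mean pooling (hypothetical values,
# not real embeddings). Padded positions (mask == 0) are excluded from the
# average, mirroring the tensor-based mean_pooling function in the card.

def masked_mean_pooling(token_embeddings, attention_mask):
    """Average token vectors, counting only positions where the mask is 1."""
    dim = len(token_embeddings[0])
    sums = [0.0] * dim
    count = 0
    for vec, mask in zip(token_embeddings, attention_mask):
        if mask == 1:
            count += 1
            for i, value in enumerate(vec):
                sums[i] += value
    return [s / max(count, 1) for s in sums]

# Two real tokens plus one padding token: the 99s never reach the average.
embeddings = [[1.0, 2.0], [3.0, 4.0], [99.0, 99.0]]
mask = [1, 1, 0]
print(masked_mean_pooling(embeddings, mask))  # -> [2.0, 3.0]
```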
3,699
[ [ -0.0201873779296875, -0.057525634765625, 0.022430419921875, 0.0325927734375, -0.0210723876953125, -0.0146636962890625, -0.026214599609375, -0.005985260009765625, 0.01374053955078125, 0.033599853515625, -0.04010009765625, -0.03875732421875, -0.052825927734375, 0.01580810546875, -0.031341552734375, 0.0604248046875, -0.0222625732421875, 0.00238800048828125, -0.0191650390625, -0.02197265625, -0.02490234375, -0.039276123046875, -0.0250396728515625, -0.0153961181640625, 0.021209716796875, 0.0178680419921875, 0.0380859375, 0.034393310546875, 0.0218353271484375, 0.03424072265625, -0.0013275146484375, 0.019378662109375, -0.0214385986328125, -0.00652313232421875, 0.00734710693359375, -0.0294189453125, -0.004482269287109375, 0.01032257080078125, 0.05206298828125, 0.03851318359375, -0.009063720703125, 0.005107879638671875, -0.0021800994873046875, 0.0240631103515625, -0.027313232421875, 0.03076171875, -0.0386962890625, 0.0183258056640625, 0.00911712646484375, -0.0010290145874023438, -0.047760009765625, -0.0105743408203125, 0.0108642578125, -0.0272064208984375, 0.00936126708984375, 0.0206756591796875, 0.0831298828125, 0.021453857421875, -0.028228759765625, -0.0221099853515625, -0.0261383056640625, 0.06951904296875, -0.059906005859375, 0.0264739990234375, 0.0116424560546875, -0.00644683837890625, 0.0063323974609375, -0.06982421875, -0.054901123046875, -0.009857177734375, -0.038543701171875, 0.0213165283203125, -0.0271148681640625, -0.00585174560546875, 0.01189422607421875, 0.01477813720703125, -0.05865478515625, -0.00786590576171875, -0.03515625, -0.016082763671875, 0.036041259765625, 0.0030117034912109375, 0.0362548828125, -0.04345703125, -0.04150390625, -0.0279083251953125, -0.01139068603515625, -0.0034847259521484375, 0.01076507568359375, 0.0249176025390625, -0.030670166015625, 0.06036376953125, 0.0104827880859375, 0.042266845703125, -0.0086669921875, 0.0194244384765625, 0.0504150390625, -0.032440185546875, -0.0139312744140625, -0.0014629364013671875, 0.0843505859375, 
0.021087646484375, 0.0213470458984375, -0.007720947265625, -0.01058197021484375, 0.00626373291015625, 0.01439666748046875, -0.06201171875, -0.01904296875, 0.01812744140625, -0.0284271240234375, -0.01027679443359375, 0.0155792236328125, -0.049530029296875, 0.007030487060546875, -0.00482940673828125, 0.052520751953125, -0.049835205078125, -0.001018524169921875, 0.0195465087890625, -0.01439666748046875, 0.00787353515625, -0.0289764404296875, -0.048126220703125, 0.01837158203125, 0.0167694091796875, 0.07940673828125, 0.0008802413940429688, -0.04345703125, -0.0308074951171875, -0.0082550048828125, -0.003154754638671875, 0.05804443359375, -0.0276336669921875, -0.00975799560546875, 0.010498046875, 0.0235137939453125, -0.052825927734375, -0.027862548828125, 0.04437255859375, -0.0221099853515625, 0.049102783203125, 0.01328277587890625, -0.059661865234375, -0.016357421875, 0.0090484619140625, -0.039459228515625, 0.082275390625, 0.0151214599609375, -0.06494140625, 0.00545501708984375, -0.05963134765625, -0.0236053466796875, -0.008514404296875, 0.0005307197570800781, -0.05462646484375, -0.00041604042053222656, 0.034881591796875, 0.053192138671875, 0.0179901123046875, 0.0230560302734375, -0.0160369873046875, -0.0228729248046875, 0.0318603515625, -0.01519775390625, 0.075927734375, 0.0136566162109375, -0.023101806640625, 0.02117919921875, -0.03662109375, -0.0017442703247070312, 0.018402099609375, -0.0129852294921875, -0.016357421875, -0.007427215576171875, 0.02764892578125, 0.0248870849609375, 0.01212310791015625, -0.04840087890625, 0.0073089599609375, -0.040283203125, 0.064453125, 0.04058837890625, -0.00481414794921875, 0.04290771484375, -0.0168304443359375, 0.0234527587890625, 0.03326416015625, -0.0078582763671875, -0.0286407470703125, -0.0276947021484375, -0.06890869140625, -0.0242767333984375, 0.0228729248046875, 0.049346923828125, -0.05267333984375, 0.068603515625, -0.0296783447265625, -0.03167724609375, -0.050750732421875, 0.0032291412353515625, 0.01336669921875, 
0.0245513916015625, 0.049102783203125, -0.006771087646484375, -0.049224853515625, -0.07196044921875, -0.007171630859375, 0.00615692138671875, 0.016937255859375, 0.018951416015625, 0.058929443359375, -0.0384521484375, 0.07220458984375, -0.049041748046875, -0.03515625, -0.04443359375, 0.021697998046875, 0.01837158203125, 0.042144775390625, 0.044708251953125, -0.054779052734375, -0.041046142578125, -0.028228759765625, -0.050445556640625, 0.002620697021484375, -0.0186614990234375, -0.01528167724609375, 0.02642822265625, 0.042144775390625, -0.0609130859375, 0.0200042724609375, 0.05023193359375, -0.042236328125, 0.033233642578125, -0.02703857421875, -0.01139068603515625, -0.108154296875, 0.004749298095703125, 0.00893402099609375, -0.020355224609375, -0.03363037109375, 0.006496429443359375, 0.0235443115234375, -0.01058197021484375, -0.032257080078125, 0.041168212890625, -0.03759765625, 0.00771331787109375, 0.0005931854248046875, 0.041717529296875, 0.0007028579711914062, 0.05743408203125, 0.0008230209350585938, 0.052154541015625, 0.04364013671875, -0.041290283203125, 0.024169921875, 0.053497314453125, -0.035797119140625, 0.01554107666015625, -0.055389404296875, -0.007678985595703125, -0.0055694580078125, 0.027252197265625, -0.0858154296875, -0.0006570816040039062, 0.0185699462890625, -0.041168212890625, 0.01511383056640625, 0.0188751220703125, -0.051239013671875, -0.04534912109375, -0.025604248046875, 0.01079559326171875, 0.0300140380859375, -0.040740966796875, 0.0458984375, 0.0216217041015625, -0.016448974609375, -0.047760009765625, -0.0799560546875, 0.00713348388671875, -0.01556396484375, -0.056304931640625, 0.04254150390625, -0.01488494873046875, 0.01096343994140625, 0.0216217041015625, 0.0194549560546875, 0.0005273818969726562, -0.00457763671875, -0.0032482147216796875, 0.0242462158203125, -0.007720947265625, 0.01181793212890625, 0.01139068603515625, -0.01290130615234375, 0.0015745162963867188, -0.0161285400390625, 0.05859375, -0.01544952392578125, 
-0.00743865966796875, -0.0273590087890625, 0.023193359375, 0.024658203125, -0.0225067138671875, 0.08050537109375, 0.08502197265625, -0.027008056640625, -0.0086822509765625, -0.03497314453125, -0.019287109375, -0.03350830078125, 0.05853271484375, -0.01345062255859375, -0.075439453125, 0.03643798828125, 0.0184783935546875, 0.00489044189453125, 0.043853759765625, 0.04205322265625, 0.0023860931396484375, 0.06866455078125, 0.046783447265625, -0.01390838623046875, 0.033966064453125, -0.045318603515625, 0.032501220703125, -0.0693359375, -0.004283905029296875, -0.0286865234375, -0.0159912109375, -0.0517578125, -0.033172607421875, 0.0184326171875, -0.01250457763671875, -0.017822265625, 0.049407958984375, -0.0400390625, 0.01306915283203125, 0.039886474609375, 0.004673004150390625, -0.00615692138671875, 0.006748199462890625, -0.041412353515625, -0.006134033203125, -0.05841064453125, -0.0400390625, 0.058746337890625, 0.0237884521484375, 0.0299530029296875, -0.007442474365234375, 0.047943115234375, -0.00433349609375, 0.005161285400390625, -0.05364990234375, 0.040802001953125, -0.028961181640625, -0.0294189453125, -0.0189971923828125, -0.037841796875, -0.0736083984375, 0.03887939453125, -0.01384735107421875, -0.061065673828125, -0.0001291036605834961, -0.019287109375, -0.0266571044921875, 0.0266571044921875, -0.06719970703125, 0.08343505859375, -0.003856658935546875, -0.00984954833984375, -0.01104736328125, -0.052337646484375, 0.01543426513671875, 0.009918212890625, 0.021453857421875, 0.00103759765625, -0.00028228759765625, 0.06390380859375, -0.01751708984375, 0.06488037109375, -0.005214691162109375, 0.01318359375, 0.01568603515625, -0.0199127197265625, 0.019744873046875, -0.00392913818359375, 0.00006276369094848633, 0.01091766357421875, -0.005161285400390625, -0.0240936279296875, -0.0288543701171875, 0.05517578125, -0.07769775390625, -0.0242767333984375, -0.040802001953125, -0.0572509765625, -0.005161285400390625, 0.017120361328125, 0.0208587646484375, 0.036041259765625, 
-0.0095062255859375, 0.031280517578125, 0.0343017578125, -0.040740966796875, 0.05584716796875, 0.0164337158203125, -0.0091400146484375, -0.04046630859375, 0.0467529296875, 0.0076904296875, 0.0020923614501953125, 0.044158935546875, 0.01398468017578125, -0.041412353515625, -0.00984954833984375, -0.03411865234375, 0.035675048828125, -0.042633056640625, -0.0149078369140625, -0.08587646484375, -0.038482666015625, -0.051300048828125, -0.0033092498779296875, -0.01493072509765625, -0.031036376953125, -0.035400390625, -0.0238189697265625, 0.0262451171875, 0.033782958984375, -0.00823974609375, 0.0264129638671875, -0.061248779296875, 0.0165863037109375, 0.0030307769775390625, 0.00922393798828125, -0.007549285888671875, -0.05401611328125, -0.03253173828125, 0.0015735626220703125, -0.036865234375, -0.05718994140625, 0.06512451171875, 0.0165863037109375, 0.04443359375, 0.01236724853515625, 0.0006732940673828125, 0.052276611328125, -0.042266845703125, 0.0648193359375, 0.00949859619140625, -0.0784912109375, 0.024139404296875, 0.0000845789909362793, 0.032470703125, 0.0191802978515625, 0.0257568359375, -0.050933837890625, -0.036041259765625, -0.04638671875, -0.08612060546875, 0.06585693359375, 0.0299530029296875, 0.041656494140625, -0.0184326171875, 0.0168609619140625, -0.0117034912109375, 0.008056640625, -0.08941650390625, -0.0406494140625, -0.0281524658203125, -0.040863037109375, -0.03338623046875, -0.02105712890625, 0.01197052001953125, -0.0206756591796875, 0.0594482421875, -0.0003349781036376953, 0.051788330078125, 0.023345947265625, -0.039764404296875, 0.00829315185546875, 0.021026611328125, 0.04345703125, 0.01898193359375, -0.005001068115234375, 0.01401519775390625, 0.0180511474609375, -0.0233001708984375, -0.0017337799072265625, 0.041839599609375, -0.00334930419921875, 0.01910400390625, 0.0262298583984375, 0.0736083984375, 0.0419921875, -0.037933349609375, 0.0472412109375, -0.00754547119140625, -0.0229034423828125, -0.035003662109375, -0.012359619140625, 0.025604248046875, 
0.01593017578125, 0.0256500244140625, 0.005352020263671875, -0.00959014892578125, -0.032012939453125, 0.0262298583984375, 0.02117919921875, -0.0307159423828125, -0.00753021240234375, 0.054931640625, -0.0026187896728515625, -0.0115509033203125, 0.0675048828125, -0.019378662109375, -0.06256103515625, 0.03594970703125, 0.048980712890625, 0.06640625, -0.0042266845703125, 0.0276641845703125, 0.046295166015625, 0.027862548828125, -0.006015777587890625, 0.00389862060546875, 0.01153564453125, -0.076904296875, -0.0264129638671875, -0.044921875, 0.0017156600952148438, -0.0005602836608886719, -0.04449462890625, 0.0200958251953125, 0.005329132080078125, -0.0017032623291015625, -0.009735107421875, 0.003936767578125, -0.046905517578125, -0.0008344650268554688, 0.004360198974609375, 0.06353759765625, -0.06805419921875, 0.0540771484375, 0.05755615234375, -0.04833984375, -0.05841064453125, -0.01506805419921875, -0.0258941650390625, -0.0609130859375, 0.038848876953125, 0.03961181640625, 0.0191650390625, 0.0120697021484375, -0.0367431640625, -0.06597900390625, 0.09478759765625, 0.0206146240234375, -0.0299072265625, -0.0041656494140625, 0.02001953125, 0.03216552734375, -0.03692626953125, 0.028350830078125, 0.02398681640625, 0.03509521484375, -0.01145172119140625, -0.050445556640625, 0.01393890380859375, -0.02532958984375, 0.00984954833984375, -0.003223419189453125, -0.04736328125, 0.074462890625, -0.01509857177734375, -0.0175933837890625, 0.01346588134765625, 0.061737060546875, 0.026336669921875, -0.0033206939697265625, 0.034637451171875, 0.059814453125, 0.040802001953125, -0.0121917724609375, 0.06951904296875, -0.03656005859375, 0.05548095703125, 0.0692138671875, -0.006011962890625, 0.073974609375, 0.033477783203125, -0.0021991729736328125, 0.0623779296875, 0.038818359375, -0.020599365234375, 0.03851318359375, 0.007694244384765625, 0.00628662109375, -0.00499725341796875, 0.01221466064453125, -0.01349639892578125, 0.0295867919921875, 0.0144500732421875, -0.0455322265625, 
-0.00428009033203125, 0.01287841796875, 0.01351165771484375, 0.007694244384765625, 0.003078460693359375, 0.04156494140625, 0.0208587646484375, -0.04095458984375, 0.0244293212890625, 0.019317626953125, 0.07147216796875, -0.038177490234375, 0.016998291015625, -0.0075531005859375, 0.034759521484375, -0.00501251220703125, -0.04864501953125, 0.02972412109375, -0.009552001953125, -0.0006937980651855469, -0.0262603759765625, 0.03851318359375, -0.056793212890625, -0.055023193359375, 0.03533935546875, 0.0452880859375, 0.0006070137023925781, 0.006931304931640625, -0.0799560546875, -0.0023651123046875, 0.0005583763122558594, -0.0361328125, 0.017364501953125, 0.0267181396484375, 0.0218963623046875, 0.049713134765625, 0.029754638671875, -0.00670623779296875, 0.007965087890625, 0.0168914794921875, 0.053192138671875, -0.053314208984375, -0.046478271484375, -0.0692138671875, 0.049835205078125, -0.0111541748046875, -0.018341064453125, 0.0623779296875, 0.0528564453125, 0.0714111328125, -0.020599365234375, 0.04962158203125, -0.0103302001953125, 0.023162841796875, -0.039154052734375, 0.060211181640625, -0.041107177734375, -0.00872039794921875, -0.022552490234375, -0.06561279296875, -0.0295562744140625, 0.07794189453125, -0.0221099853515625, 0.01335906982421875, 0.0762939453125, 0.06390380859375, -0.01126861572265625, -0.008758544921875, 0.017486572265625, 0.039825439453125, 0.01268768310546875, 0.035369873046875, 0.0347900390625, -0.05584716796875, 0.058685302734375, -0.035919189453125, -0.0008921623229980469, -0.006061553955078125, -0.058197021484375, -0.069580078125, -0.06427001953125, -0.036956787109375, -0.01861572265625, -0.0151519775390625, 0.08184814453125, 0.04998779296875, -0.056976318359375, -0.022491455078125, -0.0084991455078125, -0.007686614990234375, -0.0171051025390625, -0.02520751953125, 0.0401611328125, -0.046142578125, -0.0628662109375, 0.01110076904296875, 0.00196075439453125, 0.0022563934326171875, -0.02642822265625, 0.007312774658203125, -0.04266357421875, 
0.00923919677734375, 0.051422119140625, -0.023712158203125, -0.049163818359375, -0.01611328125, 0.01033782958984375, -0.021942138671875, 0.002162933349609375, 0.0318603515625, -0.0458984375, 0.013885498046875, 0.0201873779296875, 0.03216552734375, 0.051788330078125, -0.016632080078125, 0.034912109375, -0.06201171875, 0.01776123046875, -0.00193023681640625, 0.063720703125, 0.038360595703125, -0.0187530517578125, 0.03741455078125, 0.01453399658203125, -0.039337158203125, -0.05279541015625, -0.0085601806640625, -0.083984375, -0.0262298583984375, 0.092041015625, -0.027557373046875, -0.0247650146484375, 0.0097198486328125, -0.0228729248046875, 0.03631591796875, -0.0281982421875, 0.058349609375, 0.06280517578125, 0.004245758056640625, -0.026275634765625, -0.0232086181640625, 0.00717926025390625, 0.030670166015625, -0.04852294921875, -0.007965087890625, 0.017486572265625, 0.023223876953125, 0.0230712890625, 0.022491455078125, -0.010498046875, -0.004795074462890625, -0.007472991943359375, 0.01369476318359375, -0.0079345703125, 0.005329132080078125, -0.025360107421875, 0.01090240478515625, -0.037689208984375, -0.01422119140625 ] ]
Yntec/LAMEanime
2023-10-29T18:24:58.000Z
[ "diffusers", "Anime", "Chibi", "General", "Adorable", "Lasorco", "realisticElves", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
Yntec
null
null
Yntec/LAMEanime
2
9,548
diffusers
2023-10-02T16:02:15
--- license: creativeml-openrail-m library_name: diffusers pipeline_tag: text-to-image tags: - Anime - Chibi - General - Adorable - Lasorco - realisticElves - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers --- # LAMEanime A mix of lamettaRemix and animeSEXTILLION to bring my favorite things from those models together! So it's a remix of 4 models, animeTWO and animeTEN by realisticElves with lametta v1745 and lametta v1602 by Lasorco. Comparison: ![Comparison](https://cdn-uploads.huggingface.co/production/uploads/63239b8370edc53f51cd5d42/xechjNTFOIgGjaXCZM0GN.png) (Click for larger) Sample and prompt: ![Sample](https://cdn-uploads.huggingface.co/production/uploads/63239b8370edc53f51cd5d42/1RAhje0vBSeUMWwqcJ4gk.png) A genie pretty cute little girl making detailed eyes, full shot, mucha lighting, detailed face, by makoto shinkai, stanley artgerm lau, wlop, rossdraws. alejandro jodorowsky jungle moon light, fireflies, lovely view, reliastic, 8 k hd. sitting elementary, bow in hair, gorgeous hair, DETAILED CHIBI EYES, A beautiful Original pages: https://huggingface.co/Yntec/lamettaRemix https://huggingface.co/Lasorco/lametta https://huggingface.co/Yntec/animeSEXTILLION https://civitai.com/models/40245?modelVersionId=45715 (animeTWO) https://civitai.com/models/144023?modelVersionId=160609 (animeTEN) # Recipe - SuperMerger Weight sum MBW 1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,1,1,1,1,1,1,1,1,1,1,1 Model A: animeSEXTILLION Model B: lamettaREMIX Output: lamettaanime - Bake MoistMixV2 VAE: Output: LAMEanime # lamettaSEXTILLION Same model with animeSEXTILLION as the base. - SuperMerger Weight sum MBW 0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,1,1,1,1,1,1,1,1,1,1,1 Model A: animeSEXTILLION Model B: lamettaREMIX Output: lamettaSEXTILLION
1,837
[ [ -0.03997802734375, -0.003978729248046875, 0.0182342529296875, 0.0307464599609375, -0.01132965087890625, -0.005588531494140625, 0.0228424072265625, -0.0234375, 0.079345703125, 0.0648193359375, -0.0635986328125, -0.0111083984375, -0.0250396728515625, -0.004451751708984375, -0.0176849365234375, 0.0546875, 0.0045623779296875, 0.0157623291015625, -0.004852294921875, -0.0143280029296875, -0.04107666015625, -0.0247344970703125, -0.0139007568359375, -0.04339599609375, 0.026275634765625, 0.04669189453125, 0.047637939453125, 0.027252197265625, 0.00891876220703125, 0.03192138671875, -0.00418853759765625, -0.0022983551025390625, -0.038116455078125, 0.0072784423828125, -0.01241302490234375, -0.057891845703125, -0.081298828125, 0.005397796630859375, 0.0172119140625, 0.007755279541015625, -0.0033721923828125, 0.0228729248046875, -0.0089569091796875, 0.04693603515625, -0.0258941650390625, -0.0074920654296875, -0.02252197265625, 0.013275146484375, -0.0198822021484375, 0.0033054351806640625, 0.00034999847412109375, -0.034393310546875, -0.030975341796875, -0.05145263671875, -0.0091400146484375, 0.004405975341796875, 0.0831298828125, 0.005977630615234375, -0.0161895751953125, 0.002681732177734375, -0.05291748046875, 0.07305908203125, -0.0640869140625, 0.0288543701171875, -0.005207061767578125, 0.030029296875, -0.016204833984375, -0.0826416015625, -0.043853759765625, 0.013092041015625, 0.0147857666015625, 0.055633544921875, -0.0223846435546875, -0.0291748046875, 0.041107177734375, 0.042816162109375, -0.04180908203125, -0.0167236328125, -0.032257080078125, 0.01971435546875, 0.0408935546875, 0.01435089111328125, 0.05560302734375, -0.0167388916015625, -0.052520751953125, -0.0247802734375, -0.045806884765625, -0.0224151611328125, 0.035919189453125, -0.020599365234375, -0.026123046875, 0.040283203125, 0.0155181884765625, 0.0158538818359375, 0.01364898681640625, -0.0013265609741210938, 0.0194549560546875, -0.00614166259765625, -0.03076171875, -0.0028133392333984375, 0.060272216796875, 
0.046905517578125, 0.0228729248046875, 0.00988006591796875, -0.004852294921875, -0.002460479736328125, 0.012359619140625, -0.08868408203125, -0.01474761962890625, 0.01403045654296875, -0.052642822265625, -0.025848388671875, 0.0031337738037109375, -0.060272216796875, -0.0018625259399414062, -0.03350830078125, 0.0293426513671875, -0.021728515625, -0.035308837890625, 0.006549835205078125, -0.003528594970703125, 0.0084381103515625, 0.036590576171875, -0.05767822265625, 0.0275421142578125, 0.038421630859375, 0.030242919921875, 0.044281005859375, -0.0017423629760742188, 0.003406524658203125, 0.00878143310546875, -0.04278564453125, 0.054046630859375, -0.013580322265625, -0.0260009765625, -0.040283203125, 0.01477813720703125, -0.004302978515625, -0.056793212890625, 0.0308685302734375, -0.0130615234375, -0.0008516311645507812, -0.0283660888671875, -0.035980224609375, -0.036285400390625, -0.0117950439453125, -0.057769775390625, 0.046051025390625, 0.0244598388671875, -0.04736328125, 0.005634307861328125, -0.02996826171875, 0.0159149169921875, 0.00470733642578125, 0.0014810562133789062, -0.016143798828125, 0.02557373046875, -0.021484375, 0.0120849609375, -0.039764404296875, -0.0003867149353027344, -0.035003662109375, -0.031982421875, 0.02667236328125, -0.01149749755859375, 0.055572509765625, 0.0341796875, -0.0247802734375, 0.0205230712890625, -0.07427978515625, 0.0108184814453125, 0.037139892578125, 0.01477813720703125, -0.03466796875, -0.02447509765625, 0.0173187255859375, 0.03125, 0.004180908203125, -0.0245819091796875, 0.0237579345703125, -0.0399169921875, 0.019500732421875, 0.0411376953125, 0.00995635986328125, 0.0254058837890625, -0.067626953125, 0.0517578125, -0.00656890869140625, 0.01094818115234375, -0.003284454345703125, -0.027801513671875, -0.08001708984375, -0.01161956787109375, 0.040130615234375, 0.0445556640625, -0.040191650390625, 0.0117950439453125, -0.0017461776733398438, -0.052886962890625, -0.036712646484375, 0.01425933837890625, 0.01678466796875, 
0.016510009765625, 0.020782470703125, -0.047088623046875, -0.05859375, -0.089599609375, 0.008636474609375, -0.004558563232421875, -0.02056884765625, 0.0205841064453125, 0.032989501953125, -0.01428985595703125, 0.0179595947265625, -0.007122039794921875, -0.01090240478515625, -0.02880859375, -0.0020618438720703125, 0.049835205078125, 0.0635986328125, 0.0892333984375, -0.06195068359375, -0.0482177734375, 0.00008392333984375, -0.069091796875, -0.0261993408203125, -0.003078460693359375, -0.0294952392578125, -0.034423828125, 0.012420654296875, -0.052581787109375, 0.036834716796875, 0.021514892578125, -0.0209197998046875, 0.050567626953125, -0.00608062744140625, 0.045257568359375, -0.12371826171875, -0.00396728515625, -0.00543212890625, -0.0279541015625, -0.04559326171875, 0.05810546875, -0.005504608154296875, -0.004947662353515625, -0.0606689453125, 0.06005859375, -0.032470703125, 0.011322021484375, -0.0169677734375, -0.01395416259765625, 0.00628662109375, 0.031646728515625, -0.01493072509765625, 0.026458740234375, 0.037841796875, -0.00971221923828125, 0.050323486328125, 0.0279541015625, -0.009613037109375, 0.05218505859375, -0.07452392578125, 0.01161956787109375, -0.0022525787353515625, 0.010650634765625, -0.037872314453125, -0.0364990234375, 0.038177490234375, -0.0311279296875, 0.02069091796875, -0.0191192626953125, -0.041168212890625, -0.02886962890625, -0.0281219482421875, 0.0227203369140625, 0.0628662109375, -0.027252197265625, 0.054931640625, 0.00934600830078125, -0.01458740234375, -0.014923095703125, -0.05584716796875, 0.00940704345703125, -0.043701171875, -0.045440673828125, 0.0323486328125, -0.01389312744140625, -0.03704833984375, 0.0010995864868164062, -0.0301361083984375, -0.0135040283203125, -0.01206207275390625, 0.042083740234375, 0.020538330078125, -0.0218658447265625, -0.057525634765625, 0.0215606689453125, -0.0010633468627929688, -0.003879547119140625, -0.0251922607421875, 0.045562744140625, -0.0193023681640625, -0.0130767822265625, -0.051666259765625, 
0.042572021484375, 0.056488037109375, 0.0135040283203125, 0.01448822021484375, 0.035247802734375, -0.04071044921875, 0.005863189697265625, -0.046905517578125, -0.0156402587890625, -0.03369140625, -0.0159454345703125, -0.04718017578125, -0.049163818359375, 0.0660400390625, -0.0007729530334472656, 0.00530242919921875, 0.0367431640625, 0.038116455078125, -0.027801513671875, 0.0858154296875, 0.025634765625, -0.00392913818359375, 0.01230621337890625, -0.05853271484375, -0.0029163360595703125, -0.050384521484375, -0.0223846435546875, -0.030792236328125, -0.059234619140625, -0.048675537109375, -0.0452880859375, -0.0011053085327148438, 0.0272216796875, -0.00872039794921875, 0.07293701171875, -0.0091705322265625, 0.024169921875, 0.036895751953125, 0.012237548828125, 0.0140533447265625, 0.0055389404296875, -0.011138916015625, -0.0328369140625, -0.05145263671875, -0.019439697265625, 0.0477294921875, 0.020111083984375, 0.0462646484375, 0.0268402099609375, 0.0814208984375, -0.0133819580078125, 0.0220947265625, -0.048248291015625, 0.068115234375, -0.021575927734375, -0.03973388671875, 0.03875732421875, -0.0347900390625, -0.04937744140625, 0.03289794921875, -0.0218505859375, -0.04144287109375, 0.041351318359375, 0.002719879150390625, -0.025146484375, 0.036712646484375, -0.01517486572265625, 0.056549072265625, -0.0016775131225585938, -0.03851318359375, -0.026092529296875, -0.04461669921875, 0.04150390625, 0.00926971435546875, 0.00884246826171875, -0.0171356201171875, 0.007755279541015625, 0.047821044921875, -0.054168701171875, 0.039154052734375, 0.0225067138671875, 0.00766754150390625, 0.039276123046875, 0.03216552734375, 0.0106353759765625, 0.036102294921875, 0.006237030029296875, 0.0178070068359375, -0.0157318115234375, -0.0269012451171875, -0.048187255859375, 0.08740234375, -0.0279541015625, -0.032623291015625, -0.035491943359375, -0.0296783447265625, 0.006214141845703125, 0.022705078125, 0.030517578125, 0.05804443359375, 0.0038394927978515625, 0.0333251953125, 
0.05133056640625, 0.007602691650390625, 0.0285186767578125, 0.017822265625, -0.0311737060546875, -0.0335693359375, 0.06494140625, -0.00965118408203125, 0.009918212890625, 0.0133209228515625, 0.021331787109375, 0.01276397705078125, -0.0038776397705078125, -0.039703369140625, 0.0236358642578125, -0.0215606689453125, -0.0232086181640625, -0.037139892578125, -0.0244903564453125, -0.0221710205078125, -0.01349639892578125, -0.037139892578125, -0.05517578125, -0.0277099609375, 0.00986480712890625, 0.0211181640625, 0.03369140625, -0.0297393798828125, 0.01175689697265625, -0.0677490234375, 0.0182647705078125, 0.04046630859375, -0.0019121170043945312, -0.00879669189453125, -0.038116455078125, 0.0211181640625, 0.0030651092529296875, -0.010650634765625, -0.0771484375, 0.057037353515625, -0.00618743896484375, 0.0235748291015625, 0.01522064208984375, -0.036468505859375, 0.0634765625, -0.036834716796875, 0.0477294921875, 0.0413818359375, -0.03887939453125, 0.041656494140625, -0.04107666015625, 0.0236358642578125, 0.044708251953125, 0.0279998779296875, -0.0290679931640625, -0.025238037109375, -0.07769775390625, -0.06951904296875, 0.03240966796875, 0.04248046875, 0.02947998046875, 0.021575927734375, 0.02227783203125, 0.0084686279296875, 0.03216552734375, -0.0390625, -0.0406494140625, -0.02996826171875, -0.0113983154296875, 0.0382080078125, -0.0191497802734375, 0.0081787109375, -0.0186614990234375, 0.06390380859375, 0.011199951171875, 0.0266876220703125, -0.01268768310546875, 0.0183868408203125, -0.021270751953125, 0.0013704299926757812, 0.038177490234375, 0.0343017578125, -0.035003662109375, -0.0121917724609375, -0.014373779296875, -0.039276123046875, 0.0260772705078125, -0.00658416748046875, -0.0225677490234375, 0.016754150390625, 0.01178741455078125, 0.07940673828125, 0.039764404296875, -0.037078857421875, 0.021148681640625, -0.00970458984375, -0.002742767333984375, -0.0294036865234375, 0.026336669921875, 0.006084442138671875, 0.03662109375, 0.00437164306640625, 0.05328369140625, 
0.02880859375, -0.03460693359375, 0.027923583984375, -0.01708984375, -0.0267333984375, -0.056121826171875, 0.0848388671875, -0.01263427734375, -0.02325439453125, 0.0162353515625, -0.0140228271484375, -0.0323486328125, 0.07977294921875, 0.05712890625, 0.07904052734375, -0.049835205078125, 0.01470184326171875, 0.048919677734375, -0.009857177734375, -0.02264404296875, 0.05352783203125, 0.008544921875, -0.0271148681640625, -0.02288818359375, -0.0528564453125, -0.0201263427734375, 0.0382080078125, -0.055145263671875, 0.046722412109375, -0.06011962890625, -0.0016956329345703125, 0.0042266845703125, -0.0276641845703125, -0.0304412841796875, 0.0274505615234375, -0.01097869873046875, 0.09246826171875, -0.0726318359375, 0.0650634765625, 0.0621337890625, -0.035430908203125, -0.0767822265625, -0.002651214599609375, 0.028594970703125, -0.040069580078125, 0.035247802734375, 0.0079193115234375, 0.025146484375, 0.0082244873046875, -0.03253173828125, -0.045989990234375, 0.05352783203125, 0.01239776611328125, -0.05657958984375, 0.0131683349609375, -0.0288238525390625, 0.0321044921875, -0.047698974609375, 0.031158447265625, 0.0361328125, 0.03778076171875, 0.045257568359375, -0.060638427734375, -0.0135955810546875, -0.055572509765625, 0.0155487060546875, 0.0106048583984375, -0.07952880859375, 0.06597900390625, -0.0223236083984375, 0.0126190185546875, 0.058380126953125, 0.07427978515625, 0.03662109375, 0.01617431640625, 0.030914306640625, 0.06573486328125, -0.005001068115234375, -0.0202178955078125, 0.06787109375, -0.0024509429931640625, 0.01183319091796875, 0.0870361328125, -0.0195465087890625, 0.03240966796875, 0.01373291015625, -0.0161590576171875, 0.050567626953125, 0.044403076171875, -0.00879669189453125, 0.025634765625, 0.0146636962890625, -0.0099945068359375, -0.0204315185546875, -0.006412506103515625, -0.035888671875, 0.0206146240234375, 0.00970458984375, -0.007030487060546875, -0.00934600830078125, 0.0081939697265625, 0.00705718994140625, 0.01534271240234375, 
-0.017242431640625, 0.04205322265625, 0.032745361328125, -0.020751953125, 0.038604736328125, -0.00044989585876464844, 0.0338134765625, -0.045501708984375, -0.0264434814453125, -0.05560302734375, -0.006954193115234375, -0.0101776123046875, -0.038177490234375, 0.004535675048828125, -0.0175628662109375, -0.00518035888671875, -0.0191192626953125, 0.0706787109375, -0.0228424072265625, -0.050567626953125, 0.01275634765625, 0.004833221435546875, 0.0260009765625, 0.020599365234375, -0.06658935546875, 0.029296875, 0.0170135498046875, -0.0184173583984375, 0.0124053955078125, 0.0057830810546875, -0.0102996826171875, 0.04229736328125, 0.0307159423828125, 0.0301361083984375, -0.00977325439453125, 0.03271484375, 0.0455322265625, -0.030364990234375, -0.03253173828125, -0.04278564453125, 0.0401611328125, -0.0112152099609375, -0.01222991943359375, 0.06787109375, 0.040496826171875, 0.04248046875, -0.03857421875, 0.043975830078125, -0.02093505859375, 0.01715087890625, -0.0284576416015625, 0.045989990234375, -0.08270263671875, 0.008636474609375, -0.0286407470703125, -0.10369873046875, -0.004444122314453125, 0.0411376953125, 0.034423828125, 0.01200103759765625, 0.01259613037109375, 0.048309326171875, -0.03350830078125, -0.00033211708068847656, 0.01605224609375, 0.035308837890625, 0.0005178451538085938, 0.05023193359375, 0.0550537109375, -0.06878662109375, -0.01100921630859375, -0.048614501953125, -0.0279083251953125, -0.036224365234375, -0.057708740234375, -0.046234130859375, -0.031219482421875, -0.0318603515625, -0.0292510986328125, -0.0155181884765625, 0.0672607421875, 0.0662841796875, -0.0823974609375, -0.0272064208984375, 0.01174163818359375, 0.00571441650390625, -0.035736083984375, -0.019683837890625, -0.0143280029296875, 0.03265380859375, -0.089599609375, 0.05767822265625, 0.0244903564453125, 0.048248291015625, -0.005878448486328125, 0.0186004638671875, 0.00676727294921875, 0.0241546630859375, 0.0203094482421875, 0.02734375, -0.0308837890625, 0.0016107559204101562, 
-0.0007147789001464844, 0.00762939453125, -0.013397216796875, 0.064697265625, -0.0289764404296875, 0.006587982177734375, 0.06951904296875, -0.002292633056640625, 0.04095458984375, -0.0050506591796875, 0.026275634765625, -0.036163330078125, 0.04443359375, 0.01198577880859375, 0.0304107666015625, 0.003749847412109375, -0.006229400634765625, 0.04986572265625, 0.0291900634765625, -0.027587890625, -0.05328369140625, 0.02581787109375, -0.09686279296875, -0.00981903076171875, 0.0445556640625, 0.016876220703125, -0.034759521484375, 0.039764404296875, -0.0233154296875, -0.01259613037109375, -0.0146026611328125, 0.027587890625, 0.0193328857421875, -0.0027408599853515625, -0.00853729248046875, -0.072265625, 0.00862884521484375, 0.040435791015625, -0.04022216796875, -0.0421142578125, 0.04290771484375, 0.043701171875, 0.028076171875, 0.019378662109375, -0.038299560546875, 0.0361328125, -0.001728057861328125, 0.0227508544921875, -0.01076507568359375, -0.0374755859375, 0.007305145263671875, -0.01043701171875, -0.02288818359375, -0.004299163818359375 ] ]
LeoLM/leo-hessianai-7b
2023-09-11T00:10:47.000Z
[ "transformers", "pytorch", "llama", "text-generation", "custom_code", "en", "de", "dataset:oscar-corpus/OSCAR-2301", "dataset:wikipedia", "dataset:bjoernp/tagesschau-2018-2023", "endpoints_compatible", "text-generation-inference", "region:us" ]
text-generation
LeoLM
null
null
LeoLM/leo-hessianai-7b
29
9,537
transformers
2023-08-22T20:19:12
--- datasets: - oscar-corpus/OSCAR-2301 - wikipedia - bjoernp/tagesschau-2018-2023 language: - en - de library_name: transformers pipeline_tag: text-generation --- # LAION LeoLM: **L**inguistically **E**nhanced **O**pen **L**anguage **M**odel Meet LeoLM, the first open and commercially available German Foundation Language Model built on Llama-2. Our models extend Llama-2's capabilities into German through continued pretraining on a large corpus of German-language and mostly locality-specific text. Thanks to a compute grant at HessianAI's new supercomputer **42**, we release two foundation models trained with 8k context length, [`LeoLM/leo-hessianai-7b`](https://huggingface.co/LeoLM/leo-hessianai-7b) and [`LeoLM/leo-hessianai-13b`](https://huggingface.co/LeoLM/leo-hessianai-13b) under the [Llama-2 community license](https://huggingface.co/meta-llama/Llama-2-70b/raw/main/LICENSE.txt) (70b also coming soon! 👀). With this release, we hope to bring a new wave of opportunities to German open-source and commercial LLM research and accelerate adoption. Read our [blog post]() or our paper (preprint coming soon) for more details! 
*A project by Björn Plüster and Christoph Schuhmann in collaboration with LAION and HessianAI.* ## Model Details - **Finetuned from:** [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf) - **Model type:** Causal decoder-only transformer language model - **Language:** English and German - **License:** [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://huggingface.co/meta-llama/Llama-2-70b/raw/main/LICENSE.txt) - **Contact:** [LAION Discord](https://discord.com/invite/eq3cAMZtCC) or [Björn Plüster](mailto:bjoern.pl@outlook.de) ## Use in 🤗Transformers First install direct dependencies: ``` pip install transformers torch sentencepiece ``` If you want faster inference using flash-attention2, you need to install these dependencies: ```bash pip install packaging ninja pip install flash-attn==v2.1.1 --no-build-isolation pip install git+https://github.com/HazyResearch/flash-attention.git@v2.1.1#subdirectory=csrc/rotary ``` Then load the model in transformers: ```python from transformers import AutoModelForCausalLM, AutoTokenizer import torch # The checkpoint id is the first positional argument of from_pretrained, # not a `model=` keyword. model = AutoModelForCausalLM.from_pretrained( "LeoLM/leo-hessianai-7b", device_map="auto", torch_dtype=torch.float16, trust_remote_code=True # True for flash-attn2 else False ) ``` ## Training parameters ![training_parameters](imgs/training_params.png "Training Hyperparameters") ## Benchmarks ![benchmarks](imgs/benchmarks.png "Benchmark Scores")
2,586
[ [ -0.0149993896484375, -0.0467529296875, 0.01354217529296875, 0.04229736328125, -0.0284576416015625, -0.0164642333984375, -0.0078582763671875, -0.044525146484375, 0.006443023681640625, 0.0311737060546875, -0.038909912109375, -0.028839111328125, -0.054046630859375, -0.0011043548583984375, -0.034942626953125, 0.07708740234375, -0.019683837890625, 0.01068878173828125, 0.0092010498046875, -0.01995849609375, -0.0023632049560546875, -0.02740478515625, -0.038238525390625, -0.051239013671875, 0.032135009765625, 0.024505615234375, 0.04498291015625, 0.0430908203125, 0.0467529296875, 0.02239990234375, -0.014923095703125, -0.014434814453125, -0.033660888671875, -0.0073089599609375, 0.004055023193359375, -0.03485107421875, -0.044281005859375, -0.0009593963623046875, 0.047760009765625, 0.0174102783203125, -0.0108489990234375, 0.0190887451171875, -0.0136566162109375, 0.031463623046875, -0.046356201171875, 0.01462554931640625, -0.0284576416015625, 0.0037937164306640625, -0.0160369873046875, 0.0084075927734375, -0.0311737060546875, -0.0030574798583984375, 0.0113983154296875, -0.047607421875, 0.00502777099609375, -0.0184478759765625, 0.09814453125, 0.044189453125, -0.03460693359375, -0.01186370849609375, -0.048736572265625, 0.058868408203125, -0.051666259765625, 0.03460693359375, 0.032135009765625, 0.0285491943359375, -0.00803375244140625, -0.0948486328125, -0.0338134765625, -0.019500732421875, -0.0149993896484375, 0.0118865966796875, -0.03875732421875, -0.01032257080078125, 0.017303466796875, 0.00841522216796875, -0.03192138671875, -0.0012464523315429688, -0.0284881591796875, -0.00792694091796875, 0.0435791015625, -0.0147705078125, -0.0078277587890625, -0.00431060791015625, -0.032623291015625, -0.019195556640625, -0.058135986328125, -0.0062103271484375, 0.0092926025390625, -0.0178985595703125, -0.03662109375, 0.041229248046875, -0.0125732421875, 0.03936767578125, 0.00377655029296875, -0.0224456787109375, 0.0322265625, -0.00981903076171875, -0.026947021484375, 
-0.00983428955078125, 0.06439208984375, 0.00511932373046875, 0.0201568603515625, 0.00569915771484375, -0.0270233154296875, 0.000606536865234375, -0.006710052490234375, -0.052337646484375, -0.016265869140625, 0.01021575927734375, -0.0254974365234375, -0.03741455078125, -0.00984954833984375, -0.0433349609375, -0.0078125, -0.002880096435546875, 0.034515380859375, -0.0203857421875, -0.0210113525390625, 0.020599365234375, 0.018280029296875, 0.035858154296875, 0.0201873779296875, -0.057464599609375, -0.0009336471557617188, 0.035064697265625, 0.048828125, 0.015655517578125, -0.034698486328125, -0.031494140625, 0.01702880859375, -0.008575439453125, 0.048309326171875, -0.0113677978515625, -0.0171661376953125, 0.0026569366455078125, 0.006740570068359375, 0.012908935546875, -0.033416748046875, 0.044189453125, -0.0548095703125, 0.03277587890625, 0.001132965087890625, -0.0175018310546875, -0.0177001953125, -0.001068115234375, -0.0428466796875, 0.09716796875, 0.0178680419921875, -0.06787109375, 0.006656646728515625, -0.040313720703125, -0.05218505859375, -0.00595855712890625, -0.0012912750244140625, -0.032012939453125, -0.00423431396484375, 0.01367950439453125, 0.036529541015625, -0.044464111328125, 0.05108642578125, -0.01297760009765625, -0.021209716796875, 0.017059326171875, -0.0321044921875, 0.08184814453125, 0.01593017578125, -0.05694580078125, 0.00867462158203125, -0.0506591796875, -0.0158233642578125, 0.02484130859375, -0.037353515625, 0.019134521484375, -0.007038116455078125, 0.02740478515625, 0.025146484375, 0.0357666015625, -0.022674560546875, -0.002857208251953125, -0.024810791015625, 0.034942626953125, 0.054840087890625, -0.0333251953125, 0.0202789306640625, -0.00922393798828125, 0.0357666015625, -0.0003077983856201172, 0.0192718505859375, -0.0033416748046875, -0.034149169921875, -0.1033935546875, -0.02532958984375, 0.018829345703125, 0.050994873046875, -0.053863525390625, 0.043731689453125, -0.0189056396484375, -0.041534423828125, -0.042205810546875, 
0.003570556640625, 0.0389404296875, 0.037384033203125, 0.0253448486328125, -0.0130615234375, -0.04595947265625, -0.06976318359375, 0.0094146728515625, -0.0306243896484375, 0.00308990478515625, 0.024169921875, 0.04925537109375, -0.04583740234375, 0.0672607421875, -0.0285491943359375, -0.0246429443359375, -0.0247039794921875, 0.0015735626220703125, 0.0224151611328125, 0.030303955078125, 0.046722412109375, -0.049407958984375, -0.034027099609375, -0.004352569580078125, -0.08282470703125, -0.01406097412109375, 0.015472412109375, -0.0087738037109375, 0.03411865234375, 0.0307769775390625, -0.056915283203125, 0.0208282470703125, 0.04791259765625, -0.0257720947265625, 0.033294677734375, -0.0051422119140625, -0.0228729248046875, -0.07427978515625, 0.017730712890625, -0.01003265380859375, -0.0214080810546875, -0.043243408203125, 0.01543426513671875, -0.0017910003662109375, 0.006374359130859375, -0.062744140625, 0.07379150390625, -0.0212554931640625, -0.00615692138671875, -0.0193634033203125, 0.0018949508666992188, -0.016815185546875, 0.0281524658203125, 0.01593017578125, 0.050811767578125, 0.060272216796875, -0.04150390625, 0.03076171875, 0.0268096923828125, -0.0261688232421875, 0.01538848876953125, -0.0701904296875, 0.005100250244140625, -0.0012340545654296875, 0.03753662109375, -0.052276611328125, -0.042236328125, 0.043609619140625, -0.0281982421875, 0.021728515625, 0.007289886474609375, -0.047454833984375, -0.0281524658203125, -0.00634002685546875, 0.0293121337890625, 0.062255859375, -0.052154541015625, 0.04901123046875, 0.0260009765625, 0.0106964111328125, -0.07220458984375, -0.07110595703125, -0.0019969940185546875, -0.02154541015625, -0.06591796875, 0.01715087890625, 0.0032196044921875, -0.01366424560546875, -0.006183624267578125, 0.0015735626220703125, -0.004344940185546875, 0.0099639892578125, 0.005084991455078125, 0.0120086669921875, -0.0283203125, 0.0031490325927734375, 0.0033588409423828125, -0.0406494140625, -0.0029964447021484375, -0.0123138427734375, 
0.050994873046875, -0.037139892578125, -0.0169219970703125, -0.04083251953125, 0.002197265625, 0.0279693603515625, -0.0170135498046875, 0.0562744140625, 0.07318115234375, -0.03179931640625, 0.00888824462890625, -0.055145263671875, -0.0254364013671875, -0.040679931640625, 0.0284881591796875, -0.0211944580078125, -0.076416015625, 0.0287322998046875, -0.007793426513671875, 0.003215789794921875, 0.042449951171875, 0.054168701171875, -0.008941650390625, 0.057891845703125, 0.0709228515625, -0.0211029052734375, 0.041412353515625, -0.0286865234375, -0.0013828277587890625, -0.046875, -0.00815582275390625, -0.0218658447265625, -0.007015228271484375, -0.021209716796875, -0.0297088623046875, 0.0294342041015625, 0.041351318359375, -0.0360107421875, 0.0360107421875, -0.01666259765625, 0.0271148681640625, 0.036346435546875, -0.002117156982421875, 0.0189208984375, 0.0187835693359375, -0.0057373046875, 0.005184173583984375, -0.05926513671875, -0.0555419921875, 0.077880859375, 0.046875, 0.04339599609375, 0.01329803466796875, 0.052520751953125, -0.00007098913192749023, 0.049285888671875, -0.0360107421875, 0.048858642578125, 0.00044226646423339844, -0.056610107421875, -0.01343536376953125, -0.01059722900390625, -0.0618896484375, 0.0287017822265625, -0.0003795623779296875, -0.052032470703125, 0.00211334228515625, 0.01004791259765625, -0.001865386962890625, 0.0010442733764648438, -0.034454345703125, 0.06219482421875, -0.0275115966796875, -0.0193023681640625, -0.0246429443359375, -0.0662841796875, 0.044342041015625, -0.01474761962890625, 0.00873565673828125, -0.018829345703125, -0.006519317626953125, 0.060791015625, -0.019287109375, 0.0697021484375, -0.0006184577941894531, -0.01763916015625, 0.0264434814453125, -0.00922393798828125, 0.045745849609375, 0.017852783203125, -0.014068603515625, 0.034820556640625, -0.020050048828125, -0.0291290283203125, -0.019195556640625, 0.038421630859375, -0.07257080078125, -0.044921875, -0.047576904296875, -0.034698486328125, 0.0080108642578125, 
0.0205841064453125, 0.04364013671875, 0.003368377685546875, -0.0165557861328125, 0.0042877197265625, 0.037689208984375, -0.0159149169921875, 0.0452880859375, 0.055999755859375, -0.019561767578125, -0.031951904296875, 0.05511474609375, -0.0230712890625, 0.01094818115234375, 0.0186920166015625, 0.007843017578125, -0.017852783203125, -0.0114288330078125, -0.0438232421875, 0.053466796875, -0.077880859375, -0.0162353515625, -0.053741455078125, -0.021209716796875, -0.02423095703125, -0.01678466796875, -0.0250091552734375, -0.0296783447265625, -0.052734375, -0.00830841064453125, 0.0330810546875, 0.052001953125, -0.0229034423828125, 0.056640625, -0.04522705078125, 0.015838623046875, 0.00926971435546875, 0.00482940673828125, 0.0017957687377929688, -0.05352783203125, -0.0186767578125, -0.0088653564453125, -0.0296783447265625, -0.05584716796875, 0.0271759033203125, 0.0135040283203125, 0.032196044921875, 0.038909912109375, -0.01140594482421875, 0.03436279296875, -0.03558349609375, 0.0496826171875, 0.0232696533203125, -0.0738525390625, 0.03857421875, -0.025543212890625, 0.014007568359375, 0.0256805419921875, 0.031585693359375, -0.040618896484375, -0.0187530517578125, -0.05426025390625, -0.06060791015625, 0.08038330078125, 0.0306243896484375, 0.01519775390625, 0.00368499755859375, 0.0222625732421875, -0.01436614990234375, 0.0152435302734375, -0.09381103515625, -0.037628173828125, -0.00763702392578125, -0.0191802978515625, -0.0176239013671875, -0.0335693359375, -0.0258331298828125, -0.026458740234375, 0.0684814453125, 0.00337982177734375, 0.040435791015625, 0.006618499755859375, -0.0256805419921875, -0.0186767578125, -0.003734588623046875, 0.052032470703125, 0.037139892578125, -0.00296783447265625, 0.0105743408203125, 0.0282135009765625, -0.041046142578125, 0.02764892578125, 0.0236968994140625, -0.03546142578125, 0.00888824462890625, 0.05975341796875, 0.0740966796875, 0.022125244140625, -0.05224609375, 0.04022216796875, 0.01425933837890625, 0.0019626617431640625, 
-0.035614013671875, -0.0005636215209960938, 0.021881103515625, 0.03515625, 0.016815185546875, -0.01244354248046875, -0.0159759521484375, -0.01641845703125, 0.020965576171875, 0.019287109375, -0.032562255859375, -0.0258331298828125, 0.06622314453125, 0.018951416015625, -0.017822265625, 0.04132080078125, -0.01617431640625, -0.04486083984375, 0.0489501953125, 0.056854248046875, 0.06988525390625, -0.0214385986328125, 0.02557373046875, 0.02764892578125, 0.02703857421875, -0.0153045654296875, 0.02337646484375, 0.007091522216796875, -0.0531005859375, -0.015716552734375, -0.0511474609375, -0.0013380050659179688, 0.001434326171875, -0.0291290283203125, 0.040863037109375, -0.027923583984375, -0.0078582763671875, -0.0218505859375, -0.0081787109375, -0.04193115234375, 0.01294708251953125, -0.0023174285888671875, 0.0731201171875, -0.056915283203125, 0.0897216796875, 0.044525146484375, -0.0355224609375, -0.04827880859375, -0.00299835205078125, -0.002651214599609375, -0.0721435546875, 0.0657958984375, 0.01483917236328125, -0.01479339599609375, 0.01751708984375, -0.043792724609375, -0.089599609375, 0.10235595703125, 0.046539306640625, -0.032135009765625, 0.01406097412109375, 0.01483917236328125, 0.04608154296875, -0.0279693603515625, 0.0271148681640625, 0.0556640625, 0.04498291015625, 0.015838623046875, -0.06201171875, 0.00684356689453125, -0.0104827880859375, -0.01324462890625, -0.0020885467529296875, -0.07037353515625, 0.07220458984375, -0.02703857421875, -0.0196380615234375, 0.02105712890625, 0.0718994140625, 0.028778076171875, 0.01515960693359375, 0.0211944580078125, 0.040191650390625, 0.051971435546875, -0.0082550048828125, 0.088134765625, -0.0265350341796875, 0.044708251953125, 0.06915283203125, -0.0244598388671875, 0.0675048828125, 0.028289794921875, -0.0137939453125, 0.03729248046875, 0.052520751953125, -0.006702423095703125, 0.05023193359375, -0.00371551513671875, -0.0282440185546875, -0.01204681396484375, -0.006618499755859375, -0.047943115234375, 0.026763916015625, 
0.0125579833984375, -0.02349853515625, -0.0168304443359375, -0.00716400146484375, 0.01363372802734375, -0.03656005859375, 0.010772705078125, 0.032073974609375, 0.0264739990234375, -0.0290679931640625, 0.06756591796875, 0.0293426513671875, 0.06201171875, -0.052978515625, 0.0191192626953125, -0.022552490234375, 0.02581787109375, -0.01248931884765625, -0.038116455078125, 0.0092010498046875, 0.0094146728515625, 0.000270843505859375, -0.0016088485717773438, 0.052734375, -0.0303955078125, -0.049468994140625, 0.053924560546875, 0.0237884521484375, 0.0295867919921875, 0.019134521484375, -0.066650390625, 0.01849365234375, -0.00013816356658935547, -0.042083740234375, 0.01502227783203125, 0.00496673583984375, -0.006954193115234375, 0.062744140625, 0.033966064453125, -0.0217742919921875, 0.00965118408203125, -0.0007276535034179688, 0.067626953125, -0.0343017578125, -0.0195465087890625, -0.059539794921875, 0.049591064453125, 0.020782470703125, -0.043975830078125, 0.041717529296875, 0.031707763671875, 0.06536865234375, -0.0201873779296875, 0.039825439453125, -0.0158538818359375, 0.0008072853088378906, -0.0340576171875, 0.0599365234375, -0.056427001953125, 0.01410675048828125, -0.01470184326171875, -0.07421875, -0.0247955322265625, 0.0693359375, -0.0081939697265625, 0.0074310302734375, 0.0504150390625, 0.05908203125, -0.015899658203125, -0.013275146484375, 0.006969451904296875, 0.0249786376953125, 0.0190277099609375, 0.050506591796875, 0.04974365234375, -0.057586669921875, 0.0452880859375, -0.031707763671875, -0.021697998046875, -0.0307769775390625, -0.064453125, -0.069580078125, -0.03839111328125, -0.0212860107421875, -0.030303955078125, -0.00441741943359375, 0.047637939453125, 0.051788330078125, -0.06109619140625, -0.0352783203125, -0.0084686279296875, 0.006275177001953125, -0.00047850608825683594, -0.011993408203125, 0.0377197265625, -0.00565338134765625, -0.048858642578125, 0.027679443359375, 0.00528717041015625, 0.0251007080078125, -0.024017333984375, -0.035186767578125, 
-0.0279693603515625, -0.005889892578125, 0.0433349609375, 0.0225372314453125, -0.08172607421875, -0.0080718994140625, 0.002735137939453125, -0.0234222412109375, 0.0034084320068359375, 0.03765869140625, -0.053192138671875, 0.0030231475830078125, 0.0205078125, 0.0335693359375, 0.0654296875, -0.0295867919921875, 0.038177490234375, -0.049285888671875, 0.0511474609375, 0.006725311279296875, 0.03143310546875, 0.0247039794921875, -0.022613525390625, 0.06640625, 0.02288818359375, -0.0273590087890625, -0.06591796875, 0.0322265625, -0.066650390625, -0.01123046875, 0.09600830078125, -0.0203857421875, -0.00826263427734375, 0.01248931884765625, -0.00908660888671875, 0.03570556640625, -0.01116943359375, 0.039306640625, 0.047943115234375, 0.006649017333984375, -0.015716552734375, -0.0355224609375, 0.0325927734375, 0.042633056640625, -0.058624267578125, -0.01284027099609375, 0.0277557373046875, 0.0254058837890625, 0.0121307373046875, 0.05975341796875, 0.0037384033203125, 0.0335693359375, -0.00818634033203125, 0.0195159912109375, 0.0005631446838378906, -0.039306640625, -0.049224853515625, -0.01232147216796875, -0.011322021484375, 0.005126953125 ] ]
stabilityai/StableBeluga2
2023-09-18T15:55:32.000Z
[ "transformers", "pytorch", "llama", "text-generation", "en", "dataset:conceptofmind/cot_submix_original", "dataset:conceptofmind/flan2021_submix_original", "dataset:conceptofmind/t0_submix_original", "dataset:conceptofmind/niv2_submix_original", "arxiv:2307.09288", "arxiv:2306.02707", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
stabilityai
null
null
stabilityai/StableBeluga2
838
9,524
transformers
2023-07-20T15:09:28
--- datasets: - conceptofmind/cot_submix_original - conceptofmind/flan2021_submix_original - conceptofmind/t0_submix_original - conceptofmind/niv2_submix_original language: - en pipeline_tag: text-generation --- # Stable Beluga 2 Use [Stable Chat (Research Preview)](https://chat.stability.ai/chat) to test Stability AI's best language models for free. ## Model Description `Stable Beluga 2` is a Llama2 70B model fine-tuned on an Orca-style dataset. ## Usage Start chatting with `Stable Beluga 2` using the following code snippet: ```python import torch from transformers import AutoModelForCausalLM, AutoTokenizer tokenizer = AutoTokenizer.from_pretrained("stabilityai/StableBeluga2", use_fast=False) model = AutoModelForCausalLM.from_pretrained("stabilityai/StableBeluga2", torch_dtype=torch.float16, low_cpu_mem_usage=True, device_map="auto") system_prompt = "### System:\nYou are Stable Beluga, an AI that follows instructions extremely well. Help as much as you can. Remember, be safe, and don't do anything illegal.\n\n" message = "Write me a poem please" prompt = f"{system_prompt}### User: {message}\n\n### Assistant:\n" inputs = tokenizer(prompt, return_tensors="pt").to("cuda") output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=256) print(tokenizer.decode(output[0], skip_special_tokens=True)) ``` Stable Beluga 2 should be used with this prompt format: ``` ### System: This is a system prompt, please behave and help the user. ### User: Your prompt here ### Assistant: The output of Stable Beluga 2 ``` ## Other Beluga Models [StableBeluga 1 - Delta](https://huggingface.co/stabilityai/StableBeluga1-Delta) [StableBeluga 13B](https://huggingface.co/stabilityai/StableBeluga-13B) [StableBeluga 7B](https://huggingface.co/stabilityai/StableBeluga-7B) ## Model Details * **Developed by**: [Stability AI](https://stability.ai/) * **Model type**: Stable Beluga 2 is an auto-regressive language model fine-tuned on Llama2 70B.
* **Language(s)**: English * **Library**: [HuggingFace Transformers](https://github.com/huggingface/transformers) * **License**: Fine-tuned checkpoints (`Stable Beluga 2`) are licensed under the [STABLE BELUGA NON-COMMERCIAL COMMUNITY LICENSE AGREEMENT](https://huggingface.co/stabilityai/StableBeluga2/blob/main/LICENSE.txt) * **Contact**: For questions and comments about the model, please email `lm@stability.ai` ### Training Dataset `Stable Beluga 2` is trained on our internal Orca-style dataset. ### Training Procedure Models are trained via supervised fine-tuning on the aforementioned datasets, in mixed precision (BF16), and optimized with AdamW. We outline the following hyperparameters: | Dataset | Batch Size | Learning Rate | Learning Rate Decay | Warm-up | Weight Decay | Betas | |-------------------|------------|---------------|---------------------|---------|--------------|-------------| | Orca pt1 packed | 256 | 3e-5 | Cosine to 3e-6 | 100 | 1e-6 | (0.9, 0.95) | | Orca pt2 unpacked | 512 | 3e-5 | Cosine to 3e-6 | 100 | 1e-6 | (0.9, 0.95) | ## Ethical Considerations and Limitations Beluga is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Beluga's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased, or otherwise objectionable responses to user prompts. Therefore, before deploying any applications of Beluga, developers should perform safety testing and tuning tailored to their specific applications of the model.
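The `### System / ### User / ### Assistant` prompt format shown in the Usage section can be assembled with a small helper; `build_prompt` is an illustrative name, not part of the model's API:

```python
DEFAULT_SYSTEM = (
    "### System:\nYou are Stable Beluga, an AI that follows instructions "
    "extremely well. Help as much as you can. Remember, be safe, and don't "
    "do anything illegal.\n\n"
)

def build_prompt(message: str, system_prompt: str = DEFAULT_SYSTEM) -> str:
    """Wrap a user message in the card's System/User/Assistant format."""
    return f"{system_prompt}### User: {message}\n\n### Assistant:\n"

print(build_prompt("Write me a poem please"))
```

Keeping the format in one function avoids drift between the system prompt and the turn markers when building longer conversations.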
## How to cite ```bibtex @misc{StableBelugaModels, url={[https://huggingface.co/stabilityai/StableBeluga2](https://huggingface.co/stabilityai/StableBeluga2)}, title={Stable Beluga models}, author={Mahan, Dakota and Carlow, Ryan and Castricato, Louis and Cooper, Nathan and Laforte, Christian} } ``` ## Citations ```bibtex @misc{touvron2023llama, title={Llama 2: Open Foundation and Fine-Tuned Chat Models}, author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom}, year={2023}, eprint={2307.09288}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ```bibtex @misc{mukherjee2023orca, title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4}, author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah}, year={2023}, 
eprint={2306.02707}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
5,873
[ [ -0.03350830078125, -0.06549072265625, 0.0019140243530273438, 0.030426025390625, -0.017364501953125, -0.0005421638488769531, -0.0094146728515625, -0.03997802734375, 0.0022830963134765625, 0.017486572265625, -0.03778076171875, -0.032745361328125, -0.0469970703125, -0.005123138427734375, -0.020660400390625, 0.079345703125, 0.00646209716796875, -0.01004791259765625, 0.004119873046875, -0.01541900634765625, -0.0477294921875, -0.02154541015625, -0.0667724609375, -0.0242156982421875, 0.01849365234375, 0.01190185546875, 0.055145263671875, 0.050872802734375, 0.01306915283203125, 0.025726318359375, -0.0302886962890625, -0.0010557174682617188, -0.04345703125, -0.01132965087890625, 0.01244354248046875, -0.039886474609375, -0.04742431640625, -0.0036373138427734375, 0.037445068359375, 0.03485107421875, -0.00812530517578125, 0.0150604248046875, 0.0191497802734375, 0.0301666259765625, -0.026947021484375, 0.0301666259765625, -0.0285491943359375, -0.0176849365234375, -0.01270294189453125, 0.01485443115234375, -0.019012451171875, -0.052032470703125, 0.00835418701171875, -0.048370361328125, -0.0014200210571289062, -0.01099395751953125, 0.11077880859375, 0.02703857421875, -0.035858154296875, -0.005115509033203125, -0.038726806640625, 0.06658935546875, -0.06878662109375, 0.032379150390625, 0.0252685546875, 0.0214691162109375, -0.02496337890625, -0.052032470703125, -0.04620361328125, -0.009857177734375, -0.0103302001953125, 0.025146484375, -0.01517486572265625, -0.007587432861328125, 0.02099609375, 0.026275634765625, -0.042724609375, 0.018829345703125, -0.04339599609375, -0.032257080078125, 0.036773681640625, 0.001239776611328125, 0.005207061767578125, -0.01192474365234375, -0.0242156982421875, -0.0304107666015625, -0.053466796875, 0.02593994140625, 0.02508544921875, 0.01471710205078125, -0.04473876953125, 0.045318603515625, -0.002452850341796875, 0.042999267578125, 0.005847930908203125, -0.0277099609375, 0.036346435546875, -0.0303802490234375, -0.0272369384765625, 
-0.002391815185546875, 0.070556640625, 0.0293426513671875, 0.0059814453125, 0.0200958251953125, -0.0027065277099609375, 0.027587890625, -0.0079803466796875, -0.067138671875, -0.0169219970703125, 0.0250701904296875, -0.03887939453125, -0.04010009765625, -0.02044677734375, -0.081787109375, -0.01007080078125, -0.0012359619140625, 0.02630615234375, -0.027069091796875, -0.034637451171875, 0.00841522216796875, 0.0204010009765625, 0.040374755859375, -0.00035762786865234375, -0.07623291015625, 0.024383544921875, 0.033355712890625, 0.058349609375, 0.014251708984375, -0.0166778564453125, -0.02191162109375, -0.0024662017822265625, -0.03326416015625, 0.0445556640625, -0.01561737060546875, -0.033782958984375, -0.0016183853149414062, 0.01568603515625, -0.0009398460388183594, -0.023651123046875, 0.052734375, -0.019500732421875, 0.031494140625, -0.017333984375, -0.027862548828125, -0.0350341796875, 0.01203155517578125, -0.034942626953125, 0.09014892578125, -0.0021305084228515625, -0.051544189453125, 0.0161590576171875, -0.042205810546875, -0.0253143310546875, -0.02484130859375, -0.0084075927734375, -0.05670166015625, -0.027679443359375, 0.0169219970703125, 0.03326416015625, -0.0227813720703125, 0.0211181640625, -0.035369873046875, -0.0161895751953125, 0.01088714599609375, -0.01548004150390625, 0.08416748046875, 0.0219573974609375, -0.04730224609375, 0.01076507568359375, -0.06671142578125, -0.008270263671875, 0.0300750732421875, -0.03045654296875, -0.005611419677734375, -0.00737762451171875, -0.0234832763671875, -0.0005002021789550781, 0.026123046875, -0.03289794921875, 0.01110076904296875, -0.032958984375, 0.0360107421875, 0.04888916015625, -0.0046539306640625, 0.0163421630859375, -0.0272674560546875, 0.0166473388671875, -0.00524139404296875, 0.025421142578125, -0.00664520263671875, -0.0643310546875, -0.071533203125, -0.023681640625, 0.0307159423828125, 0.05279541015625, -0.0226593017578125, 0.0462646484375, 0.0006194114685058594, -0.05438232421875, -0.044525146484375, 
-0.0040130615234375, 0.0477294921875, 0.0478515625, 0.027008056640625, -0.0195465087890625, -0.050750732421875, -0.0634765625, 0.01152801513671875, -0.031951904296875, 0.0097503662109375, 0.00868988037109375, 0.0297698974609375, -0.033477783203125, 0.056243896484375, -0.033477783203125, -0.0094146728515625, -0.005947113037109375, 0.01529693603515625, 0.0261077880859375, 0.045562744140625, 0.0662841796875, -0.0439453125, -0.018585205078125, -0.0096588134765625, -0.05682373046875, -0.006618499755859375, 0.007152557373046875, -0.02130126953125, 0.041534423828125, 0.01432037353515625, -0.046356201171875, 0.03759765625, 0.056304931640625, -0.026947021484375, 0.043548583984375, -0.01045989990234375, 0.004642486572265625, -0.088623046875, 0.01453399658203125, 0.011962890625, -0.00833892822265625, -0.04071044921875, -0.004467010498046875, 0.00421142578125, 0.0017261505126953125, -0.0210113525390625, 0.041259765625, -0.0258941650390625, -0.005146026611328125, -0.0100250244140625, 0.0119171142578125, -0.005153656005859375, 0.045806884765625, -0.000247955322265625, 0.0311431884765625, 0.05401611328125, -0.051300048828125, 0.0230712890625, 0.0379638671875, -0.0284576416015625, 0.01556396484375, -0.072265625, 0.008056640625, 0.01378631591796875, 0.017913818359375, -0.09326171875, -0.011077880859375, 0.0306396484375, -0.049163818359375, 0.037689208984375, -0.02459716796875, -0.0299072265625, -0.034637451171875, -0.0184783935546875, 0.0082244873046875, 0.06103515625, -0.034423828125, 0.0296630859375, 0.0345458984375, -0.00885009765625, -0.052276611328125, -0.06207275390625, -0.023834228515625, -0.020111083984375, -0.060150146484375, 0.012603759765625, -0.0267333984375, 0.0037937164306640625, -0.00637054443359375, -0.00792694091796875, 0.002899169921875, 0.0170440673828125, 0.023956298828125, 0.043701171875, -0.00794219970703125, -0.02734375, 0.023223876953125, -0.019683837890625, 0.006252288818359375, -0.004833221435546875, 0.0462646484375, -0.040985107421875, 
-0.003284454345703125, -0.035247802734375, -0.00415802001953125, 0.04193115234375, -0.024383544921875, 0.059326171875, 0.052215576171875, -0.0390625, 0.0281524658203125, -0.042877197265625, -0.019195556640625, -0.037689208984375, 0.0254974365234375, -0.029876708984375, -0.05609130859375, 0.065673828125, 0.004947662353515625, 0.037750244140625, 0.041107177734375, 0.060821533203125, 0.0106201171875, 0.07598876953125, 0.0513916015625, 0.0026397705078125, 0.0178680419921875, -0.04022216796875, 0.0048370361328125, -0.05670166015625, -0.043975830078125, -0.047088623046875, -0.0142822265625, -0.0447998046875, -0.0007171630859375, 0.0125274658203125, 0.019500732421875, -0.040618896484375, 0.0305023193359375, -0.04034423828125, -0.0024127960205078125, 0.03173828125, 0.0008969306945800781, 0.0084228515625, -0.01128387451171875, -0.032073974609375, 0.001049041748046875, -0.04644775390625, -0.039306640625, 0.0767822265625, 0.047607421875, 0.053375244140625, 0.012420654296875, 0.0270843505859375, -0.0164337158203125, 0.01097869873046875, -0.03826904296875, 0.045166015625, 0.00475311279296875, -0.0654296875, -0.01126861572265625, -0.0330810546875, -0.08697509765625, 0.00494384765625, -0.02239990234375, -0.051544189453125, 0.028411865234375, 0.00963592529296875, -0.033935546875, 0.01226806640625, -0.05206298828125, 0.0802001953125, -0.0218505859375, -0.020355224609375, -0.00516510009765625, -0.06787109375, 0.046173095703125, -0.0009374618530273438, 0.024871826171875, 0.0007042884826660156, -0.0027942657470703125, 0.061248779296875, -0.032318115234375, 0.07171630859375, -0.002010345458984375, -0.00800323486328125, 0.0305328369140625, 0.01554107666015625, 0.03564453125, 0.0159759521484375, -0.00775146484375, 0.0221710205078125, 0.0113067626953125, -0.033416748046875, -0.0212249755859375, 0.055450439453125, -0.0914306640625, -0.0311279296875, -0.036865234375, -0.0170745849609375, 0.00466156005859375, 0.0323486328125, 0.026763916015625, 0.03460693359375, 0.0155181884765625, 
0.0196075439453125, 0.05181884765625, -0.023956298828125, 0.025848388671875, 0.0399169921875, -0.031524658203125, -0.044158935546875, 0.045745849609375, 0.0147857666015625, 0.02392578125, 0.006175994873046875, 0.0220794677734375, -0.03594970703125, -0.038543701171875, -0.035980224609375, 0.034759521484375, -0.043243408203125, -0.01323699951171875, -0.04248046875, -0.0146942138671875, -0.034820556640625, -0.0016117095947265625, -0.048553466796875, -0.02655029296875, -0.031585693359375, -0.0198974609375, 0.047882080078125, 0.033203125, -0.0125274658203125, 0.008087158203125, -0.05242919921875, 0.01245880126953125, 0.0205841064453125, 0.02154541015625, -0.00441741943359375, -0.05682373046875, -0.01125335693359375, 0.0182952880859375, -0.032440185546875, -0.0699462890625, 0.0234527587890625, -0.0005593299865722656, 0.058197021484375, 0.0244598388671875, 0.003330230712890625, 0.06036376953125, -0.0047607421875, 0.0714111328125, 0.01873779296875, -0.058746337890625, 0.0467529296875, -0.028289794921875, 0.005001068115234375, 0.027862548828125, 0.043060302734375, -0.01480865478515625, -0.0216064453125, -0.051025390625, -0.061309814453125, 0.050994873046875, 0.025299072265625, -0.0027332305908203125, 0.00954437255859375, 0.04449462890625, 0.00800323486328125, 0.01015472412109375, -0.07159423828125, -0.044281005859375, -0.05023193359375, 0.0008687973022460938, 0.00640869140625, -0.0227508544921875, -0.01457977294921875, -0.0262451171875, 0.06573486328125, 0.00316619873046875, 0.03338623046875, 0.012847900390625, 0.019500732421875, -0.0154266357421875, -0.00330352783203125, 0.05108642578125, 0.038421630859375, -0.033294677734375, -0.00868988037109375, 0.0246429443359375, -0.04248046875, -0.003414154052734375, 0.026641845703125, -0.0245513916015625, -0.0157318115234375, 0.0029430389404296875, 0.07073974609375, 0.011688232421875, -0.0287322998046875, 0.018096923828125, -0.01308441162109375, -0.0259857177734375, -0.025787353515625, -0.00238037109375, 0.0172119140625, 
0.02520751953125, 0.0127716064453125, 0.00785064697265625, -0.020416259765625, -0.0537109375, -0.000659942626953125, 0.01123809814453125, -0.0192413330078125, -0.027008056640625, 0.06964111328125, 0.01361846923828125, -0.0082244873046875, 0.04473876953125, -0.015380859375, -0.035614013671875, 0.041534423828125, 0.046844482421875, 0.051971435546875, -0.0281982421875, 0.0020465850830078125, 0.052520751953125, 0.035369873046875, -0.01380157470703125, 0.032623291015625, 0.033111572265625, -0.0445556640625, -0.02978515625, -0.037139892578125, -0.03173828125, 0.03631591796875, -0.04180908203125, 0.043670654296875, -0.036712646484375, -0.0206298828125, -0.01971435546875, 0.02386474609375, -0.028228759765625, 0.01910400390625, 0.004756927490234375, 0.061920166015625, -0.058929443359375, 0.06048583984375, 0.056243896484375, -0.042144775390625, -0.07843017578125, -0.02349853515625, 0.002246856689453125, -0.043914794921875, 0.01471710205078125, 0.00446319580078125, 0.006473541259765625, 0.004974365234375, -0.0418701171875, -0.06976318359375, 0.09869384765625, 0.032257080078125, -0.040283203125, 0.0184783935546875, -0.00753021240234375, 0.042022705078125, -0.005779266357421875, 0.036956787109375, 0.044677734375, 0.04742431640625, 0.01177215576171875, -0.06951904296875, 0.0277099609375, -0.029998779296875, -0.01043701171875, 0.005352020263671875, -0.08624267578125, 0.061920166015625, -0.0142059326171875, -0.0032176971435546875, 0.025482177734375, 0.06597900390625, 0.057952880859375, 0.0240631103515625, 0.03656005859375, 0.05230712890625, 0.052032470703125, -0.020263671875, 0.07574462890625, -0.0261077880859375, 0.034637451171875, 0.045318603515625, 0.002349853515625, 0.05621337890625, 0.004512786865234375, -0.0214080810546875, 0.05059814453125, 0.0751953125, -0.01824951171875, 0.0469970703125, -0.00763702392578125, 0.01081085205078125, -0.0036182403564453125, 0.0174407958984375, -0.05169677734375, 0.005764007568359375, 0.03497314453125, -0.0053558349609375, -0.0032806396484375, 
-0.0111846923828125, 0.03790283203125, -0.0162811279296875, -0.007335662841796875, 0.03448486328125, 0.0207366943359375, -0.048248291015625, 0.096435546875, 0.003032684326171875, 0.050872802734375, -0.0653076171875, 0.0127105712890625, -0.0280914306640625, 0.01824951171875, -0.0266876220703125, -0.054718017578125, 0.0235595703125, 0.002704620361328125, 0.004344940185546875, 0.01690673828125, 0.053436279296875, -0.01238250732421875, -0.027099609375, 0.041534423828125, 0.006557464599609375, 0.029022216796875, 0.029693603515625, -0.07373046875, 0.027984619140625, 0.00998687744140625, -0.046661376953125, 0.025634765625, 0.028839111328125, 0.0009183883666992188, 0.0528564453125, 0.055450439453125, -0.014404296875, 0.0245513916015625, -0.00994873046875, 0.07470703125, -0.028289794921875, -0.02410888671875, -0.05377197265625, 0.047698974609375, 0.005886077880859375, -0.039794921875, 0.0625, 0.0452880859375, 0.05462646484375, 0.0129241943359375, 0.058135986328125, -0.0186004638671875, 0.036224365234375, -0.032012939453125, 0.051605224609375, -0.042724609375, 0.0256500244140625, -0.017578125, -0.07489013671875, -0.0186004638671875, 0.060760498046875, -0.0171661376953125, 0.010284423828125, 0.038726806640625, 0.06854248046875, -0.0038318634033203125, 0.0003898143768310547, 0.003047943115234375, 0.0537109375, 0.03759765625, 0.038330078125, 0.044525146484375, -0.051849365234375, 0.07440185546875, -0.043212890625, -0.035308837890625, -0.0245208740234375, -0.07489013671875, -0.07940673828125, -0.035308837890625, -0.038421630859375, -0.055145263671875, 0.003726959228515625, 0.048248291015625, 0.05096435546875, -0.057647705078125, -0.026214599609375, -0.012481689453125, -0.00020384788513183594, -0.027191162109375, -0.0134124755859375, 0.03759765625, -0.0014543533325195312, -0.056396484375, 0.02410888671875, 0.0001329183578491211, 0.032470703125, -0.0256805419921875, -0.0224761962890625, -0.0215606689453125, 0.0079345703125, 0.0271453857421875, 0.045745849609375, 
-0.033843994140625, -0.02203369140625, 0.0089874267578125, -0.006717681884765625, -0.0035572052001953125, -0.0002694129943847656, -0.045166015625, 0.0219573974609375, 0.045379638671875, 0.0259246826171875, 0.061859130859375, -0.01471710205078125, 0.0266265869140625, -0.027313232421875, 0.0215301513671875, -0.00676727294921875, 0.03582763671875, 0.017242431640625, -0.01153564453125, 0.0496826171875, 0.027618408203125, -0.060821533203125, -0.0718994140625, 0.00638580322265625, -0.1005859375, -0.0091094970703125, 0.08795166015625, -0.0216522216796875, -0.0254058837890625, -0.009246826171875, -0.0249176025390625, 0.04766845703125, -0.034210205078125, 0.0665283203125, 0.033538818359375, 0.005016326904296875, -0.0227203369140625, -0.0491943359375, 0.032135009765625, 0.027099609375, -0.049407958984375, -0.01421356201171875, 0.017425537109375, 0.0295562744140625, 0.026214599609375, 0.0299530029296875, -0.03424072265625, 0.01715087890625, -0.0090484619140625, 0.00962066650390625, -0.01190185546875, -0.004779815673828125, -0.0132904052734375, -0.0192718505859375, -0.01171875, -0.0185699462890625 ] ]
facebook/deit-small-distilled-patch16-224
2022-07-13T11:41:21.000Z
[ "transformers", "pytorch", "tf", "deit", "image-classification", "vision", "dataset:imagenet", "arxiv:2012.12877", "arxiv:2006.03677", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
facebook
null
null
facebook/deit-small-distilled-patch16-224
3
9,504
transformers
2022-03-02T23:29:05
--- license: apache-2.0 tags: - image-classification - vision datasets: - imagenet --- # Distilled Data-efficient Image Transformer (small-sized model) Distilled data-efficient Image Transformer (DeiT) model pre-trained and fine-tuned on ImageNet-1k (1 million images, 1,000 classes) at resolution 224x224. It was first introduced in the paper [Training data-efficient image transformers & distillation through attention](https://arxiv.org/abs/2012.12877) by Touvron et al. and first released in [this repository](https://github.com/facebookresearch/deit). However, the weights were converted from the [timm repository](https://github.com/rwightman/pytorch-image-models) by Ross Wightman. Disclaimer: The team releasing DeiT did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description This model is a distilled Vision Transformer (ViT). It uses a distillation token, besides the class token, to effectively learn from a teacher (CNN) during both pre-training and fine-tuning. The distillation token is learned through backpropagation, by interacting with the class ([CLS]) and patch tokens through the self-attention layers. Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. ## Intended uses & limitations You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=facebook/deit) to look for fine-tuned versions on a task that interests you. ### How to use Since this model is a distilled ViT model, you can plug it into DeiTModel, DeiTForImageClassification or DeiTForImageClassificationWithTeacher. Note that the model expects the data to be prepared using DeiTFeatureExtractor. Here we use AutoFeatureExtractor, which will automatically use the appropriate feature extractor given the model name. 
Here is how to use this model to classify an image of the COCO 2017 dataset into one of the 1,000 ImageNet classes: ```python from transformers import AutoFeatureExtractor, DeiTForImageClassificationWithTeacher from PIL import Image import requests url = 'http://images.cocodataset.org/val2017/000000039769.jpg' image = Image.open(requests.get(url, stream=True).raw) feature_extractor = AutoFeatureExtractor.from_pretrained('facebook/deit-small-distilled-patch16-224') model = DeiTForImageClassificationWithTeacher.from_pretrained('facebook/deit-small-distilled-patch16-224') inputs = feature_extractor(images=image, return_tensors="pt") outputs = model(**inputs) logits = outputs.logits # model predicts one of the 1000 ImageNet classes predicted_class_idx = logits.argmax(-1).item() print("Predicted class:", model.config.id2label[predicted_class_idx]) ``` Currently, both the feature extractor and model support PyTorch. TensorFlow and JAX/FLAX support is coming soon. ## Training data This model was pretrained and fine-tuned with distillation on [ImageNet-1k](http://www.image-net.org/challenges/LSVRC/2012/), a dataset consisting of 1 million images and 1,000 classes. ## Training procedure ### Preprocessing The exact details of image preprocessing during training/validation can be found [here](https://github.com/facebookresearch/deit/blob/ab5715372db8c6cad5740714b2216d55aeae052e/datasets.py#L78). At inference time, images are resized/rescaled to the same resolution (256x256), center-cropped at 224x224 and normalized across the RGB channels with the ImageNet mean and standard deviation. ### Pretraining The model was trained on a single 8-GPU node for 3 days. Training resolution is 224. For all hyperparameters (such as batch size and learning rate) we refer to table 9 of the original paper.
## Evaluation results | Model | ImageNet top-1 accuracy | ImageNet top-5 accuracy | # params | URL | |---------------------------------------|-------------------------|-------------------------|----------|------------------------------------------------------------------| | DeiT-tiny | 72.2 | 91.1 | 5M | https://huggingface.co/facebook/deit-tiny-patch16-224 | | DeiT-small | 79.9 | 95.0 | 22M | https://huggingface.co/facebook/deit-small-patch16-224 | | DeiT-base | 81.8 | 95.6 | 86M | https://huggingface.co/facebook/deit-base-patch16-224 | | DeiT-tiny distilled | 74.5 | 91.9 | 6M | https://huggingface.co/facebook/deit-tiny-distilled-patch16-224 | | **DeiT-small distilled** | **81.2** | **95.4** | **22M** | **https://huggingface.co/facebook/deit-small-distilled-patch16-224** | | DeiT-base distilled | 83.4 | 96.5 | 87M | https://huggingface.co/facebook/deit-base-distilled-patch16-224 | | DeiT-base 384 | 82.9 | 96.2 | 87M | https://huggingface.co/facebook/deit-base-patch16-384 | | DeiT-base distilled 384 (1000 epochs) | 85.2 | 97.2 | 88M | https://huggingface.co/facebook/deit-base-distilled-patch16-384 | Note that for fine-tuning, the best results are obtained with a higher resolution (384x384). As the table shows, increasing the model size further improves accuracy, at the cost of more parameters.
### BibTeX entry and citation info ```bibtex @misc{touvron2021training, title={Training data-efficient image transformers & distillation through attention}, author={Hugo Touvron and Matthieu Cord and Matthijs Douze and Francisco Massa and Alexandre Sablayrolles and Hervé Jégou}, year={2021}, eprint={2012.12877}, archivePrefix={arXiv}, primaryClass={cs.CV} } ``` ```bibtex @misc{wu2020visual, title={Visual Transformers: Token-based Image Representation and Processing for Computer Vision}, author={Bichen Wu and Chenfeng Xu and Xiaoliang Dai and Alvin Wan and Peizhao Zhang and Zhicheng Yan and Masayoshi Tomizuka and Joseph Gonzalez and Kurt Keutzer and Peter Vajda}, year={2020}, eprint={2006.03677}, archivePrefix={arXiv}, primaryClass={cs.CV} } ``` ```bibtex @inproceedings{deng2009imagenet, title={Imagenet: A large-scale hierarchical image database}, author={Deng, Jia and Dong, Wei and Socher, Richard and Li, Li-Jia and Li, Kai and Fei-Fei, Li}, booktitle={2009 IEEE conference on computer vision and pattern recognition}, pages={248--255}, year={2009}, organization={IEEE} } ```
6,806
[ [ -0.05950927734375, -0.037628173828125, 0.004425048828125, 0.01001739501953125, -0.02984619140625, -0.0161285400390625, -0.01255035400390625, -0.03436279296875, 0.0216827392578125, 0.01488494873046875, -0.0300750732421875, -0.0277557373046875, -0.0650634765625, 0.006317138671875, -0.027374267578125, 0.0723876953125, 0.0027370452880859375, -0.0119476318359375, -0.005977630615234375, -0.01026153564453125, -0.0333251953125, -0.0268707275390625, -0.05230712890625, -0.01380157470703125, 0.036468505859375, 0.0189056396484375, 0.042205810546875, 0.055816650390625, 0.061920166015625, 0.0357666015625, -0.0117340087890625, 0.00799560546875, -0.038482666015625, -0.0208282470703125, 0.00330352783203125, -0.0211944580078125, -0.032196044921875, 0.021240234375, 0.032806396484375, 0.0310821533203125, 0.0093841552734375, 0.0261993408203125, 0.0267333984375, 0.0653076171875, -0.037261962890625, 0.016754150390625, -0.03912353515625, 0.011444091796875, 0.00012946128845214844, -0.00104522705078125, -0.017822265625, -0.0098419189453125, 0.01412200927734375, -0.037933349609375, 0.0335693359375, -0.010833740234375, 0.094482421875, 0.038543701171875, -0.0220489501953125, 0.01180267333984375, -0.04278564453125, 0.05322265625, -0.0394287109375, 0.016815185546875, 0.03253173828125, 0.0258636474609375, -0.0037631988525390625, -0.07659912109375, -0.041015625, -0.00421905517578125, -0.0238800048828125, 0.01436614990234375, -0.0272369384765625, 0.0003218650817871094, 0.037353515625, 0.055755615234375, -0.034881591796875, -0.0031986236572265625, -0.040985107421875, -0.0109100341796875, 0.05340576171875, -0.00948333740234375, 0.0012760162353515625, -0.005016326904296875, -0.04742431640625, -0.02191162109375, -0.007518768310546875, 0.004291534423828125, 0.00691986083984375, 0.0033550262451171875, -0.018310546875, 0.031585693359375, -0.0165863037109375, 0.048980712890625, 0.038848876953125, -0.00650787353515625, 0.040069580078125, -0.021636962890625, -0.032440185546875, -0.0044403076171875, 
0.07342529296875, 0.037384033203125, 0.0208587646484375, 0.0200042724609375, -0.0130462646484375, 0.004314422607421875, 0.01103973388671875, -0.08514404296875, -0.032989501953125, 0.00295257568359375, -0.053955078125, -0.04248046875, 0.010833740234375, -0.053497314453125, -0.007175445556640625, -0.0237579345703125, 0.0352783203125, -0.0282135009765625, -0.028106689453125, -0.0082244873046875, -0.0106353759765625, 0.0134429931640625, 0.0255584716796875, -0.048675537109375, 0.01097869873046875, 0.01508331298828125, 0.07501220703125, -0.007457733154296875, -0.01485443115234375, -0.0023021697998046875, -0.0295867919921875, -0.029022216796875, 0.045562744140625, 0.0018157958984375, -0.00408172607421875, -0.0235443115234375, 0.02752685546875, -0.00254058837890625, -0.040008544921875, 0.01751708984375, -0.027984619140625, -0.0028820037841796875, -0.0137481689453125, -0.0219573974609375, -0.01788330078125, 0.0255126953125, -0.0621337890625, 0.09051513671875, 0.02142333984375, -0.070556640625, 0.0294189453125, -0.034088134765625, -0.004467010498046875, -0.00942230224609375, 0.006561279296875, -0.050048828125, -0.00463104248046875, 0.022918701171875, 0.04827880859375, -0.016571044921875, 0.005741119384765625, -0.026092529296875, -0.032684326171875, 0.0162353515625, -0.0372314453125, 0.074462890625, 0.0289154052734375, -0.04058837890625, -0.01255035400390625, -0.06610107421875, 0.00033164024353027344, 0.0284423828125, -0.017059326171875, -0.009002685546875, -0.038177490234375, 0.004764556884765625, 0.03216552734375, 0.01532745361328125, -0.036285400390625, 0.01328277587890625, -0.004100799560546875, 0.04168701171875, 0.06036376953125, -0.006229400634765625, 0.0217437744140625, -0.0209503173828125, 0.0167083740234375, 0.0296630859375, 0.031494140625, -0.01947021484375, -0.032989501953125, -0.06695556640625, -0.036468505859375, 0.0275115966796875, 0.0238800048828125, -0.05462646484375, 0.045318603515625, -0.0266876220703125, -0.05474853515625, -0.0254669189453125, 
0.0096282958984375, 0.0250396728515625, 0.045562744140625, 0.035552978515625, -0.0343017578125, -0.033935546875, -0.07855224609375, 0.00795745849609375, -0.00510406494140625, 0.01324462890625, 0.01074981689453125, 0.05035400390625, -0.01479339599609375, 0.06805419921875, -0.041259765625, -0.032196044921875, 0.00041294097900390625, -0.0014848709106445312, 0.0257568359375, 0.05133056640625, 0.06787109375, -0.07342529296875, -0.05718994140625, -0.00353240966796875, -0.06072998046875, 0.0157623291015625, 0.009033203125, -0.023101806640625, 0.01241302490234375, 0.0340576171875, -0.0390625, 0.061859130859375, 0.031402587890625, -0.0138397216796875, 0.0235595703125, -0.00733184814453125, 0.0260009765625, -0.085693359375, 0.00627899169921875, 0.031494140625, -0.0260009765625, -0.031890869140625, -0.0126800537109375, 0.001972198486328125, 0.00154876708984375, -0.04486083984375, 0.0247955322265625, -0.0440673828125, 0.0012111663818359375, -0.01690673828125, -0.0242919921875, 0.00830841064453125, 0.05078125, 0.0037174224853515625, 0.04095458984375, 0.047515869140625, -0.03741455078125, 0.046142578125, 0.02032470703125, -0.025177001953125, 0.05230712890625, -0.061370849609375, 0.0149688720703125, -0.007617950439453125, 0.0249786376953125, -0.086669921875, -0.0165557861328125, 0.01629638671875, -0.037689208984375, 0.04620361328125, -0.02642822265625, -0.0228424072265625, -0.0584716796875, -0.0230255126953125, 0.035552978515625, 0.04852294921875, -0.053558349609375, 0.0253143310546875, 0.00855255126953125, 0.02935791015625, -0.052886962890625, -0.07666015625, -0.007694244384765625, -0.0263824462890625, -0.044189453125, 0.034820556640625, 0.00266265869140625, 0.0086822509765625, 0.0142822265625, 0.00004315376281738281, -0.0192718505859375, -0.00749969482421875, 0.032440185546875, 0.03192138671875, -0.01425933837890625, -0.006175994873046875, -0.0207672119140625, -0.0157928466796875, -0.01058197021484375, -0.031097412109375, 0.0281982421875, -0.0288238525390625, -0.0125732421875, 
-0.06402587890625, 0.000005781650543212891, 0.0491943359375, -0.0020847320556640625, 0.049774169921875, 0.0623779296875, -0.0333251953125, 0.007053375244140625, -0.048431396484375, -0.023681640625, -0.03936767578125, 0.0283203125, -0.029754638671875, -0.043701171875, 0.049041748046875, 0.0023326873779296875, -0.0004303455352783203, 0.06243896484375, 0.036285400390625, -0.018798828125, 0.0653076171875, 0.036651611328125, -0.006542205810546875, 0.0546875, -0.06561279296875, -0.0024166107177734375, -0.05133056640625, -0.01275634765625, -0.017913818359375, -0.056671142578125, -0.055908203125, -0.0288543701171875, 0.0216827392578125, 0.005596160888671875, -0.0306243896484375, 0.043792724609375, -0.0677490234375, 0.0213623046875, 0.057281494140625, 0.03558349609375, 0.0011014938354492188, 0.0238037109375, 0.0008678436279296875, -0.0036945343017578125, -0.045257568359375, -0.00885772705078125, 0.064697265625, 0.03118896484375, 0.053253173828125, -0.023712158203125, 0.048797607421875, 0.010223388671875, 0.0137481689453125, -0.051116943359375, 0.037353515625, -0.016998291015625, -0.05712890625, -0.00908660888671875, -0.0284881591796875, -0.06781005859375, 0.01055145263671875, -0.0160675048828125, -0.04461669921875, 0.042236328125, 0.0312347412109375, -0.01513671875, 0.036163330078125, -0.057708740234375, 0.06597900390625, -0.015045166015625, -0.038818359375, 0.00838470458984375, -0.060150146484375, 0.0181732177734375, -0.00031876564025878906, -0.006015777587890625, 0.0035648345947265625, 0.024444580078125, 0.049346923828125, -0.05712890625, 0.070556640625, -0.0238037109375, 0.01873779296875, 0.055511474609375, -0.0181121826171875, 0.023712158203125, -0.01549530029296875, 0.01088714599609375, 0.04132080078125, 0.006427764892578125, -0.037750244140625, -0.03607177734375, 0.0506591796875, -0.06488037109375, -0.0268402099609375, -0.04327392578125, -0.016632080078125, 0.01279449462890625, 0.0127410888671875, 0.05487060546875, 0.029388427734375, 0.00733184814453125, 
0.034942626953125, 0.047637939453125, -0.0175018310546875, 0.03564453125, -0.01248931884765625, -0.0010433197021484375, -0.031585693359375, 0.06756591796875, 0.02813720703125, 0.0220947265625, 0.01264190673828125, 0.021331787109375, -0.01763916015625, -0.0279083251953125, -0.0309906005859375, 0.01065826416015625, -0.054168701171875, -0.040252685546875, -0.04443359375, -0.04034423828125, -0.033477783203125, 0.0003898143768310547, -0.050048828125, -0.027069091796875, -0.03759765625, -0.004322052001953125, 0.05255126953125, 0.041259765625, -0.0228424072265625, 0.0298309326171875, -0.044158935546875, 0.0168304443359375, 0.029815673828125, 0.031494140625, -0.006038665771484375, -0.0528564453125, -0.0220794677734375, 0.01392364501953125, -0.03131103515625, -0.0523681640625, 0.0274505615234375, 0.0234527587890625, 0.033233642578125, 0.032379150390625, -0.00406646728515625, 0.0732421875, -0.0174713134765625, 0.051483154296875, 0.0362548828125, -0.040679931640625, 0.05340576171875, -0.01204681396484375, 0.00955963134765625, 0.046630859375, 0.03375244140625, -0.0186920166015625, -0.0028781890869140625, -0.05767822265625, -0.060577392578125, 0.051422119140625, 0.0098724365234375, 0.0048370361328125, 0.00534820556640625, 0.044189453125, -0.013275146484375, 0.005207061767578125, -0.055694580078125, -0.040130615234375, -0.037994384765625, -0.01605224609375, -0.002727508544921875, -0.0137786865234375, 0.006206512451171875, -0.056396484375, 0.0504150390625, -0.00811004638671875, 0.0360107421875, 0.022369384765625, -0.0084991455078125, 0.006656646728515625, -0.0335693359375, 0.0159759521484375, 0.0277099609375, -0.01302337646484375, 0.008697509765625, 0.005901336669921875, -0.061614990234375, 0.01385498046875, 0.0097198486328125, -0.00716400146484375, -0.0033550262451171875, 0.0248260498046875, 0.0731201171875, -0.0074310302734375, 0.0012464523315429688, 0.05145263671875, -0.0169830322265625, -0.036865234375, -0.0258941650390625, 0.005840301513671875, -0.0083160400390625, 
0.03167724609375, 0.0287933349609375, 0.01812744140625, 0.01342010498046875, -0.024261474609375, 0.022186279296875, 0.0203857421875, -0.04083251953125, -0.0193634033203125, 0.04718017578125, -0.007488250732421875, 0.0125732421875, 0.059295654296875, -0.009185791015625, -0.035614013671875, 0.07476806640625, 0.0233154296875, 0.0609130859375, -0.02142333984375, 0.0103912353515625, 0.06512451171875, 0.0122833251953125, -0.0148162841796875, -0.0006341934204101562, 0.00537872314453125, -0.04241943359375, -0.0135498046875, -0.05303955078125, 0.022735595703125, 0.0259246826171875, -0.052703857421875, 0.0282440185546875, -0.031829833984375, -0.040985107421875, 0.0205841064453125, 0.00890350341796875, -0.07464599609375, 0.0308380126953125, 0.006702423095703125, 0.0609130859375, -0.06610107421875, 0.057647705078125, 0.054046630859375, -0.0469970703125, -0.078125, -0.02227783203125, -0.00173187255859375, -0.050567626953125, 0.06298828125, 0.025421142578125, 0.0139923095703125, 0.01477813720703125, -0.0528564453125, -0.06561279296875, 0.10308837890625, 0.0242156982421875, -0.041595458984375, 0.007701873779296875, 0.0075836181640625, 0.036376953125, -0.016357421875, 0.033172607421875, 0.0343017578125, 0.0267181396484375, 0.041351318359375, -0.0606689453125, 0.0029888153076171875, -0.031768798828125, 0.0249481201171875, -0.005146026611328125, -0.07025146484375, 0.072509765625, -0.008148193359375, -0.0053863525390625, -0.00879669189453125, 0.05133056640625, -0.0019435882568359375, 0.01009368896484375, 0.054840087890625, 0.06396484375, 0.026702880859375, -0.0220794677734375, 0.070556640625, -0.006855010986328125, 0.04644775390625, 0.057342529296875, 0.028411865234375, 0.0279998779296875, 0.038177490234375, -0.0262603759765625, 0.0224609375, 0.0841064453125, -0.0268096923828125, 0.04486083984375, 0.00385284423828125, 0.005016326904296875, -0.0065765380859375, 0.0016326904296875, -0.03668212890625, 0.037567138671875, 0.00812530517578125, -0.050201416015625, -0.005748748779296875, 
0.018035888671875, -0.01104736328125, -0.02191162109375, -0.0220489501953125, 0.057342529296875, 0.0088958740234375, -0.04193115234375, 0.07061767578125, -0.017730712890625, 0.0567626953125, -0.0152740478515625, -0.0083160400390625, -0.0228424072265625, 0.03314208984375, -0.0272674560546875, -0.053863525390625, 0.02252197265625, -0.007007598876953125, -0.00677490234375, -0.0037174224853515625, 0.0654296875, -0.019989013671875, -0.04925537109375, 0.0160980224609375, 0.01363372802734375, 0.02349853515625, -0.01132965087890625, -0.07220458984375, 0.005794525146484375, 0.0016384124755859375, -0.054718017578125, 0.022735595703125, 0.03863525390625, 0.00072479248046875, 0.031524658203125, 0.046844482421875, -0.0020923614501953125, 0.01534271240234375, -0.0181732177734375, 0.08544921875, -0.024688720703125, -0.024932861328125, -0.0654296875, 0.04730224609375, -0.02288818359375, -0.023529052734375, 0.044189453125, 0.036102294921875, 0.0692138671875, -0.005413055419921875, 0.051025390625, -0.024139404296875, 0.005947113037109375, -0.01351165771484375, 0.042724609375, -0.055816650390625, -0.01445770263671875, -0.031036376953125, -0.07586669921875, -0.01190185546875, 0.07196044921875, -0.01148223876953125, 0.0311737060546875, 0.0362548828125, 0.05621337890625, -0.024169921875, -0.0198211669921875, 0.016204833984375, 0.01256561279296875, 0.01445770263671875, 0.034423828125, 0.04669189453125, -0.06292724609375, 0.03302001953125, -0.06561279296875, -0.03363037109375, -0.0236358642578125, -0.06365966796875, -0.07342529296875, -0.067138671875, -0.0460205078125, -0.039886474609375, -0.0121917724609375, 0.055694580078125, 0.07647705078125, -0.04742431640625, 0.007198333740234375, -0.012176513671875, -0.017822265625, -0.028411865234375, -0.0169525146484375, 0.04119873046875, -0.002864837646484375, -0.07183837890625, -0.027679443359375, 0.0017223358154296875, 0.032806396484375, -0.01486968994140625, -0.0023670196533203125, -0.021484375, -0.0258636474609375, 0.040191650390625, 
0.01416778564453125, -0.027679443359375, -0.0209503173828125, 0.003444671630859375, -0.0162811279296875, 0.0210723876953125, 0.035980224609375, -0.041839599609375, 0.0283355712890625, 0.04034423828125, 0.03558349609375, 0.06646728515625, 0.00421905517578125, -0.003765106201171875, -0.051300048828125, 0.040008544921875, 0.002960205078125, 0.031402587890625, 0.0238189697265625, -0.03997802734375, 0.053375244140625, 0.03936767578125, -0.04827880859375, -0.05657958984375, -0.00868988037109375, -0.08709716796875, -0.01528167724609375, 0.077880859375, -0.029205322265625, -0.044097900390625, 0.025054931640625, -0.014068603515625, 0.04107666015625, -0.01149749755859375, 0.0428466796875, 0.03924560546875, 0.0062408447265625, -0.02459716796875, -0.047271728515625, 0.029754638671875, 0.000021696090698242188, -0.0406494140625, -0.018096923828125, 0.032806396484375, 0.03472900390625, 0.0288543701171875, 0.04534912109375, -0.0225372314453125, 0.003948211669921875, 0.005306243896484375, 0.01373291015625, -0.0176849365234375, -0.0179901123046875, -0.0126953125, -0.0181427001953125, -0.018341064453125, -0.047088623046875 ] ]
Helsinki-NLP/opus-mt-et-en
2023-08-16T11:33:57.000Z
[ "transformers", "pytorch", "tf", "marian", "text2text-generation", "translation", "et", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
translation
Helsinki-NLP
null
null
Helsinki-NLP/opus-mt-et-en
1
9,491
transformers
2022-03-02T23:29:04
--- tags: - translation license: apache-2.0 --- ### opus-mt-et-en * source languages: et * target languages: en * OPUS readme: [et-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/et-en/README.md) * dataset: opus * model: transformer-align * pre-processing: normalization + SentencePiece * download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/et-en/opus-2019-12-18.zip) * test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/et-en/opus-2019-12-18.test.txt) * test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/et-en/opus-2019-12-18.eval.txt) ## Benchmarks | testset | BLEU | chr-F | |-----------------------|-------|-------| | newsdev2018-enet.et.en | 30.1 | 0.574 | | newstest2018-enet.et.en | 30.3 | 0.581 | | Tatoeba.et.en | 59.9 | 0.738 |
907
[ [ -0.0214691162109375, -0.03399658203125, 0.0232391357421875, 0.0249786376953125, -0.0283660888671875, -0.0249786376953125, -0.03363037109375, -0.01314544677734375, 0.0024394989013671875, 0.030426025390625, -0.048797607421875, -0.041229248046875, -0.045867919921875, 0.0185699462890625, -0.0069427490234375, 0.0477294921875, -0.016998291015625, 0.036102294921875, 0.0201873779296875, -0.03411865234375, -0.0286712646484375, -0.029815673828125, -0.038543701171875, -0.0242767333984375, 0.0171051025390625, 0.033966064453125, 0.0230712890625, 0.02972412109375, 0.0633544921875, 0.0177001953125, -0.0048828125, 0.0099334716796875, -0.035980224609375, -0.00711822509765625, 0.00861358642578125, -0.04339599609375, -0.05426025390625, -0.01255035400390625, 0.07421875, 0.0322265625, -0.001312255859375, 0.02783203125, -0.004730224609375, 0.0665283203125, -0.0207977294921875, 0.0042877197265625, -0.04522705078125, 0.00635528564453125, -0.0228118896484375, -0.0268096923828125, -0.052764892578125, -0.016815185546875, 0.00933074951171875, -0.04803466796875, -0.00266265869140625, 0.015899658203125, 0.10711669921875, 0.02099609375, -0.0258636474609375, -0.00881195068359375, -0.04681396484375, 0.0806884765625, -0.053131103515625, 0.0498046875, 0.02862548828125, 0.0197296142578125, 0.0200042724609375, -0.042388916015625, -0.0214691162109375, 0.007747650146484375, -0.0144805908203125, 0.0179595947265625, -0.00782012939453125, -0.02154541015625, 0.0211334228515625, 0.052154541015625, -0.056549072265625, 0.0010080337524414062, -0.04620361328125, 0.0020904541015625, 0.05194091796875, 0.00957489013671875, 0.0131072998046875, -0.00438690185546875, -0.0283355712890625, -0.040252685546875, -0.05889892578125, 0.01146697998046875, 0.0238800048828125, 0.0207366943359375, -0.036376953125, 0.054534912109375, -0.01316070556640625, 0.048614501953125, 0.0005984306335449219, 0.0036716461181640625, 0.07421875, -0.0272674560546875, -0.0261688232421875, -0.00872802734375, 0.0849609375, 0.032135009765625, 
0.008209228515625, 0.00506591796875, -0.0163421630859375, -0.0186614990234375, 0.00615692138671875, -0.0684814453125, -0.0001131892204284668, 0.017822265625, -0.03436279296875, -0.00800323486328125, 0.005779266357421875, -0.046875, 0.0179290771484375, -0.03253173828125, 0.042724609375, -0.0465087890625, -0.0223236083984375, 0.0283203125, 0.0004916191101074219, 0.0308380126953125, -0.00010657310485839844, -0.042999267578125, 0.0163726806640625, 0.0302581787109375, 0.056365966796875, -0.0281524658203125, -0.017791748046875, -0.0369873046875, -0.017333984375, -0.008056640625, 0.047393798828125, -0.012451171875, -0.03167724609375, -0.0021457672119140625, 0.03814697265625, -0.031524658203125, -0.0221710205078125, 0.10162353515625, -0.018768310546875, 0.05474853515625, -0.032135009765625, -0.043914794921875, -0.02532958984375, 0.036651611328125, -0.0435791015625, 0.0968017578125, 0.005828857421875, -0.061248779296875, 0.01715087890625, -0.0594482421875, -0.00958251953125, -0.0082550048828125, -0.0007762908935546875, -0.05047607421875, 0.005741119384765625, 0.009796142578125, 0.031036376953125, -0.0283660888671875, 0.019989013671875, -0.002384185791015625, -0.0247344970703125, 0.0020904541015625, -0.023101806640625, 0.0755615234375, 0.02276611328125, -0.026947021484375, 0.017547607421875, -0.07379150390625, -0.0016918182373046875, 0.00315093994140625, -0.034149169921875, -0.0133209228515625, 0.010528564453125, 0.0222625732421875, 0.006610870361328125, 0.0153656005859375, -0.05322265625, 0.0126190185546875, -0.0474853515625, 0.021392822265625, 0.046478271484375, -0.0217437744140625, 0.028045654296875, -0.0350341796875, 0.019805908203125, 0.00826263427734375, 0.006805419921875, 0.00316619873046875, -0.034637451171875, -0.0650634765625, -0.017578125, 0.03997802734375, 0.08050537109375, -0.05120849609375, 0.06353759765625, -0.0545654296875, -0.058746337890625, -0.05322265625, -0.0168609619140625, 0.0263519287109375, 0.0281829833984375, 0.039703369140625, 
-0.007198333740234375, -0.031829833984375, -0.08074951171875, -0.01143646240234375, -0.007144927978515625, -0.0187835693359375, 0.0137786865234375, 0.04681396484375, -0.0080108642578125, 0.036712646484375, -0.036224365234375, -0.027374267578125, -0.0123138427734375, 0.006687164306640625, 0.036407470703125, 0.046142578125, 0.042236328125, -0.06231689453125, -0.0482177734375, 0.00434112548828125, -0.049072265625, -0.01059722900390625, 0.00865936279296875, -0.0164031982421875, 0.00650787353515625, 0.0131072998046875, -0.025238037109375, 0.005901336669921875, 0.043060302734375, -0.049957275390625, 0.036834716796875, -0.004505157470703125, 0.02325439453125, -0.10150146484375, 0.006572723388671875, -0.005298614501953125, -0.0101776123046875, -0.033416748046875, 0.0002613067626953125, 0.0223541259765625, 0.00952911376953125, -0.058074951171875, 0.042236328125, -0.0193023681640625, 0.0001207590103149414, 0.02740478515625, -0.0022106170654296875, 0.00919342041015625, 0.050567626953125, 0.0023632049560546875, 0.055419921875, 0.0546875, -0.03857421875, 0.01110076904296875, 0.0394287109375, -0.0302886962890625, 0.0343017578125, -0.0634765625, -0.0191497802734375, 0.022613525390625, -0.0084381103515625, -0.047607421875, 0.006687164306640625, 0.02642822265625, -0.046875, 0.03509521484375, -0.0012750625610351562, -0.053375244140625, -0.002185821533203125, -0.0285186767578125, 0.03057861328125, 0.0494384765625, -0.014007568359375, 0.047515869140625, 0.003910064697265625, -0.0006818771362304688, -0.032012939453125, -0.07220458984375, -0.0134735107421875, -0.032958984375, -0.058319091796875, 0.0188446044921875, -0.027252197265625, -0.0033817291259765625, 0.0030384063720703125, 0.024444580078125, -0.005611419677734375, 0.0011615753173828125, 0.002902984619140625, 0.0189361572265625, -0.035491943359375, 0.0095062255859375, -0.0011758804321289062, -0.0080718994140625, -0.01001739501953125, -0.0121612548828125, 0.043701171875, -0.0264739990234375, -0.0196380615234375, 
-0.039642333984375, 0.00421905517578125, 0.0396728515625, -0.0305938720703125, 0.06292724609375, 0.04266357421875, -0.01084136962890625, 0.0093536376953125, -0.029937744140625, 0.006435394287109375, -0.031982421875, 0.015167236328125, -0.036895751953125, -0.058837890625, 0.0418701171875, 0.007480621337890625, 0.03839111328125, 0.06549072265625, 0.048553466796875, 0.0101318359375, 0.050567626953125, 0.0210418701171875, 0.00580596923828125, 0.031402587890625, -0.03656005859375, -0.0089111328125, -0.0789794921875, 0.005275726318359375, -0.05657958984375, -0.0264434814453125, -0.06817626953125, -0.0213623046875, 0.020294189453125, -0.0007624626159667969, -0.0246734619140625, 0.049072265625, -0.04620361328125, 0.0160980224609375, 0.0439453125, -0.00531768798828125, 0.0243682861328125, 0.00519561767578125, -0.039459228515625, -0.0169677734375, -0.033203125, -0.036163330078125, 0.100341796875, 0.0316162109375, 0.0215301513671875, 0.0157012939453125, 0.03741455078125, -0.003330230712890625, 0.015380859375, -0.043792724609375, 0.0318603515625, -0.015350341796875, -0.05303955078125, -0.0227813720703125, -0.042999267578125, -0.0594482421875, 0.03668212890625, -0.01654052734375, -0.04412841796875, 0.0180206298828125, -0.00562286376953125, -0.00975799560546875, 0.035003662109375, -0.052154541015625, 0.08319091796875, -0.006000518798828125, -0.0125732421875, 0.0172271728515625, -0.032684326171875, 0.0197296142578125, 0.002880096435546875, 0.0214691162109375, -0.01251220703125, 0.0084228515625, 0.04498291015625, -0.0107574462890625, 0.0303802490234375, -0.0002428293228149414, -0.002483367919921875, 0.004566192626953125, 0.0030841827392578125, 0.03363037109375, -0.00925445556640625, -0.031982421875, 0.0285797119140625, 0.0036525726318359375, -0.032379150390625, -0.0006008148193359375, 0.040771484375, -0.0565185546875, 0.0026111602783203125, -0.03509521484375, -0.043060302734375, 0.0008878707885742188, 0.021484375, 0.0556640625, 0.05133056640625, -0.022369384765625, 
0.042266845703125, 0.06182861328125, -0.02642822265625, 0.02935791015625, 0.05389404296875, -0.01389312744140625, -0.041046142578125, 0.063720703125, 0.01010894775390625, 0.0278472900390625, 0.047088623046875, 0.004756927490234375, -0.01374053955078125, -0.05181884765625, -0.055145263671875, 0.0155181884765625, -0.024139404296875, -0.01184844970703125, -0.0482177734375, -0.00701141357421875, -0.0157318115234375, 0.0148162841796875, -0.039093017578125, -0.045928955078125, -0.01065826416015625, -0.02099609375, 0.020294189453125, 0.01511383056640625, -0.005710601806640625, 0.03082275390625, -0.0743408203125, 0.012664794921875, -0.0098114013671875, 0.027984619140625, -0.03363037109375, -0.057769775390625, -0.03155517578125, 0.001590728759765625, -0.05181884765625, -0.05035400390625, 0.037384033203125, 0.0145111083984375, 0.0198974609375, 0.0224609375, 0.01235198974609375, 0.0224609375, -0.054443359375, 0.07275390625, -0.0020313262939453125, -0.0504150390625, 0.032257080078125, -0.031280517578125, 0.0305633544921875, 0.06500244140625, 0.0189666748046875, -0.0250701904296875, -0.041259765625, -0.0538330078125, -0.06463623046875, 0.06256103515625, 0.054779052734375, -0.013275146484375, 0.01629638671875, -0.0134735107421875, -0.00006109476089477539, 0.0116119384765625, -0.08251953125, -0.03271484375, 0.004787445068359375, -0.0285797119140625, -0.01139068603515625, -0.0181121826171875, -0.0167694091796875, -0.0161590576171875, 0.0802001953125, 0.01372528076171875, 0.0175628662109375, 0.03387451171875, -0.0088348388671875, -0.020843505859375, 0.0290985107421875, 0.07098388671875, 0.04095458984375, -0.043670654296875, -0.01190948486328125, 0.023223876953125, -0.029754638671875, -0.01171112060546875, 0.01428985595703125, -0.034149169921875, 0.024169921875, 0.03289794921875, 0.07958984375, 0.01337432861328125, -0.043701171875, 0.0289764404296875, -0.028533935546875, -0.034759521484375, -0.0491943359375, -0.0129852294921875, 0.00943756103515625, 0.0011110305786132812, 
0.0176849365234375, 0.01337432861328125, 0.015350341796875, -0.0163726806640625, 0.0096435546875, 0.006267547607421875, -0.04583740234375, -0.041259765625, 0.039215087890625, 0.01424407958984375, -0.0211944580078125, 0.0338134765625, -0.0296630859375, -0.04425048828125, 0.033447265625, 0.004489898681640625, 0.08050537109375, -0.0167236328125, -0.0149993896484375, 0.060150146484375, 0.0482177734375, -0.0199127197265625, 0.03955078125, 0.01375579833984375, -0.052703857421875, -0.042572021484375, -0.0587158203125, -0.01239013671875, 0.0089263916015625, -0.060028076171875, 0.0311737060546875, 0.0256805419921875, 0.0032939910888671875, -0.0229034423828125, 0.0173187255859375, -0.04925537109375, 0.007373809814453125, -0.020599365234375, 0.0797119140625, -0.0684814453125, 0.060516357421875, 0.03875732421875, -0.017822265625, -0.06488037109375, -0.0229644775390625, -0.015777587890625, -0.03424072265625, 0.042266845703125, 0.016571044921875, 0.026611328125, -0.006343841552734375, -0.01554107666015625, -0.05987548828125, 0.08428955078125, 0.01171875, -0.0506591796875, 0.007396697998046875, 0.0174102783203125, 0.03533935546875, -0.0278778076171875, 0.0192718505859375, 0.033966064453125, 0.057830810546875, 0.004638671875, -0.08575439453125, -0.0172119140625, -0.04193115234375, -0.0280914306640625, 0.041748046875, -0.04541015625, 0.07440185546875, 0.034820556640625, -0.0100555419921875, 0.0017881393432617188, 0.039337158203125, 0.0244598388671875, 0.0267791748046875, 0.038177490234375, 0.08831787109375, 0.0287933349609375, -0.039642333984375, 0.07763671875, -0.0243682861328125, 0.03936767578125, 0.08795166015625, -0.005550384521484375, 0.06842041015625, 0.023040771484375, -0.01275634765625, 0.033843994140625, 0.052703857421875, -0.0195159912109375, 0.034454345703125, -0.00009256601333618164, 0.015716552734375, -0.01174163818359375, 0.01456451416015625, -0.056915283203125, 0.0120849609375, 0.01424407958984375, -0.018646240234375, 0.0014295578002929688, -0.00844573974609375, 
0.0034694671630859375, -0.0036067962646484375, -0.01177215576171875, 0.041595458984375, 0.002002716064453125, -0.039947509765625, 0.0545654296875, -0.002471923828125, 0.047515869140625, -0.047943115234375, 0.0076904296875, 0.00027632713317871094, 0.02032470703125, -0.00048804283142089844, -0.044708251953125, 0.034515380859375, 0.00044989585876464844, -0.01477813720703125, -0.03082275390625, 0.01727294921875, -0.040374755859375, -0.0703125, 0.031280517578125, 0.03533935546875, 0.019989013671875, 0.0005011558532714844, -0.0635986328125, 0.006244659423828125, 0.00975799560546875, -0.05120849609375, 0.0006413459777832031, 0.057373046875, 0.0231781005859375, 0.031982421875, 0.041656494140625, 0.0159912109375, 0.020599365234375, -0.0037841796875, 0.0506591796875, -0.0330810546875, -0.033477783203125, -0.05865478515625, 0.06591796875, -0.01514434814453125, -0.0472412109375, 0.052764892578125, 0.07611083984375, 0.0797119140625, -0.010162353515625, 0.022918701171875, -0.00919342041015625, 0.058990478515625, -0.047271728515625, 0.0384521484375, -0.06927490234375, 0.018310546875, -0.0026683807373046875, -0.07049560546875, -0.0130615234375, 0.0223541259765625, -0.0191650390625, -0.0254669189453125, 0.060943603515625, 0.047149658203125, -0.0159149169921875, -0.0116119384765625, 0.0201873779296875, 0.024017333984375, 0.018341064453125, 0.046112060546875, 0.033843994140625, -0.07568359375, 0.036773681640625, -0.02386474609375, -0.006000518798828125, -0.00632476806640625, -0.05145263671875, -0.06439208984375, -0.043060302734375, -0.01496124267578125, -0.01507568359375, -0.022552490234375, 0.07049560546875, 0.040435791015625, -0.07171630859375, -0.0394287109375, -0.0034313201904296875, 0.0080108642578125, -0.01458740234375, -0.020111083984375, 0.043548583984375, -0.018463134765625, -0.07501220703125, 0.030853271484375, 0.00420379638671875, -0.0095977783203125, 0.003757476806640625, -0.02484130859375, -0.034454345703125, -0.0024814605712890625, 0.02392578125, 0.0010576248168945312, 
-0.037750244140625, 0.0032806396484375, 0.012603759765625, -0.00229644775390625, 0.0291595458984375, 0.023406982421875, -0.012603759765625, 0.0173797607421875, 0.06329345703125, 0.024505615234375, 0.0321044921875, -0.01275634765625, 0.04095458984375, -0.05194091796875, 0.0242767333984375, 0.01580810546875, 0.046112060546875, 0.0263519287109375, -0.00453948974609375, 0.06549072265625, 0.011932373046875, -0.04901123046875, -0.082275390625, 0.0036869049072265625, -0.089599609375, -0.000015795230865478516, 0.06719970703125, -0.016998291015625, -0.0189208984375, 0.0235443115234375, -0.0106964111328125, 0.01197052001953125, -0.030853271484375, 0.0284576416015625, 0.0662841796875, 0.025543212890625, 0.004520416259765625, -0.064697265625, 0.024383544921875, 0.0352783203125, -0.055938720703125, -0.016693115234375, 0.0140838623046875, 0.010040283203125, 0.02734375, 0.0333251953125, -0.020263671875, 0.004871368408203125, -0.0210723876953125, 0.02655029296875, -0.005565643310546875, -0.00913238525390625, -0.019775390625, -0.0019664764404296875, -0.0037593841552734375, -0.0233612060546875 ] ]
invisiblecat/Uber_Realistic_Porn_Merge_V1.3_inpainting
2023-08-18T07:00:32.000Z
[ "diffusers", "not-for-all-audiences", "inpainting", "en", "license:creativeml-openrail-m", "has_space", "diffusers:StableDiffusionInpaintPipeline", "region:us" ]
null
invisiblecat
null
null
invisiblecat/Uber_Realistic_Porn_Merge_V1.3_inpainting
0
9,485
diffusers
2023-08-11T16:36:09
--- license: creativeml-openrail-m language: - en tags: - diffusers - not-for-all-audiences - inpainting library_name: diffusers --- <div style='background: #f044442e; color: #f04444; border: solid 1px #f04444; border-radius: 0.5pc; margin-top: 1.5em; margin-bottom: 1.5em; padding:1em;'> <b>WARNING:</b> Potential for NSFW Content. </div> # Uber Realistic Porn Merge (URPM) - Inpaint - **Author:** [saftle](https://civitai.com/user/saftle) - **Source:** [https://civitai.com/models/2661](https://civitai.com/models/2661) - **Version:** 1.3 - **Base Model:** [Stable Diffusion 1.5](https://huggingface.co/runwayml/stable-diffusion-v1-5) - **License:** [creativeml-openrail-m](https://huggingface.co/spaces/CompVis/stable-diffusion-license)
741
[ [ -0.045013427734375, -0.06298828125, 0.0269622802734375, 0.02911376953125, -0.03460693359375, 0.00754547119140625, 0.0289306640625, -0.0174407958984375, 0.0151824951171875, 0.038726806640625, -0.073486328125, -0.040496826171875, -0.034423828125, -0.01336669921875, -0.044189453125, 0.047882080078125, 0.00691986083984375, -0.004047393798828125, -0.021759033203125, 0.01084136962890625, -0.0234222412109375, -0.0135650634765625, -0.026824951171875, 0.001861572265625, 0.005279541015625, 0.03680419921875, 0.05902099609375, 0.036346435546875, 0.046844482421875, 0.0174560546875, -0.0117340087890625, 0.0220947265625, -0.056671142578125, -0.0177154541015625, -0.0109710693359375, 0.01041412353515625, -0.054656982421875, 0.016815185546875, 0.044891357421875, 0.0258636474609375, -0.0205230712890625, 0.007080078125, 0.001163482666015625, 0.06329345703125, -0.0501708984375, -0.03570556640625, -0.00304412841796875, 0.02777099609375, -0.027679443359375, -0.0101165771484375, -0.0194854736328125, -0.02459716796875, -0.01555633544921875, -0.05072021484375, 0.004337310791015625, -0.01026153564453125, 0.0908203125, 0.01486968994140625, -0.028900146484375, 0.03924560546875, -0.05645751953125, 0.0262908935546875, -0.043212890625, 0.056365966796875, 0.014251708984375, 0.028564453125, -0.0225372314453125, -0.0657958984375, -0.035430908203125, 0.022705078125, -0.003932952880859375, 0.01554107666015625, -0.056121826171875, -0.0238037109375, -0.002864837646484375, 0.0252227783203125, -0.051177978515625, -0.006633758544921875, -0.053985595703125, 0.01026153564453125, 0.04058837890625, 0.0080413818359375, 0.04705810546875, 0.0226898193359375, -0.060546875, 0.0005774497985839844, -0.056121826171875, 0.0013628005981445312, 0.03839111328125, 0.0015974044799804688, -0.06439208984375, 0.031829833984375, -0.005802154541015625, 0.0570068359375, 0.0082244873046875, -0.010406494140625, 0.035980224609375, -0.039642333984375, -0.040069580078125, -0.032012939453125, 0.061126708984375, 0.06640625, 
0.00189208984375, 0.01009368896484375, 0.016082763671875, -0.024658203125, 0.041778564453125, -0.07916259765625, -0.004238128662109375, 0.0237579345703125, -0.043792724609375, -0.0361328125, 0.0166778564453125, -0.06915283203125, -0.004138946533203125, 0.0086822509765625, 0.030670166015625, -0.0215606689453125, -0.05078125, 0.0249481201171875, -0.01202392578125, 0.03436279296875, 0.022552490234375, -0.0321044921875, 0.02667236328125, 0.0255889892578125, 0.050567626953125, -0.0014781951904296875, 0.01427459716796875, -0.0030651092529296875, 0.004810333251953125, -0.0131988525390625, 0.025787353515625, -0.0007557868957519531, -0.040191650390625, 0.0160369873046875, 0.0116729736328125, 0.01479339599609375, -0.0222015380859375, 0.059234619140625, -0.0290985107421875, 0.0121917724609375, -0.0247802734375, -0.01415252685546875, -0.0140533447265625, -0.01364898681640625, -0.038543701171875, 0.057708740234375, 0.0189208984375, -0.063232421875, 0.034881591796875, -0.03436279296875, -0.001941680908203125, 0.006572723388671875, 0.0191192626953125, -0.0469970703125, -0.0262603759765625, -0.0184478759765625, 0.03582763671875, -0.00466156005859375, -0.00460052490234375, -0.0533447265625, -0.03143310546875, 0.0226287841796875, -0.03265380859375, 0.060821533203125, 0.0084381103515625, -0.0228424072265625, 0.00720977783203125, -0.061859130859375, -0.031951904296875, 0.0144500732421875, 0.01482391357421875, -0.01436614990234375, -0.0372314453125, 0.036712646484375, 0.032623291015625, 0.02294921875, -0.034942626953125, 0.007678985595703125, 0.00579833984375, 0.001125335693359375, 0.05029296875, 0.00215911865234375, 0.03863525390625, -0.03900146484375, 0.04754638671875, 0.0226593017578125, 0.040191650390625, 0.040771484375, -0.040985107421875, -0.061309814453125, -0.0399169921875, 0.007656097412109375, 0.0256500244140625, -0.037872314453125, 0.041748046875, 0.00426483154296875, -0.057586669921875, -0.027496337890625, 0.01139068603515625, 0.037567138671875, 0.02813720703125, 
0.0214691162109375, -0.036285400390625, -0.0290069580078125, -0.07696533203125, 0.03509521484375, 0.018035888671875, -0.0010814666748046875, 0.014739990234375, 0.0272979736328125, -0.000888824462890625, 0.0543212890625, -0.038818359375, -0.03192138671875, 0.01494598388671875, -0.01451873779296875, 0.017608642578125, 0.07525634765625, 0.0701904296875, -0.081787109375, -0.047943115234375, -0.0245513916015625, -0.06256103515625, -0.00984954833984375, 0.023284912109375, -0.0599365234375, -0.0081939697265625, 0.0330810546875, -0.03173828125, 0.05377197265625, 0.040008544921875, -0.047698974609375, 0.03546142578125, -0.0352783203125, 0.030609130859375, -0.063232421875, 0.006916046142578125, 0.045806884765625, -0.0309295654296875, -0.0555419921875, 0.03424072265625, 0.000804901123046875, 0.000453948974609375, -0.09332275390625, 0.037017822265625, -0.0384521484375, 0.02020263671875, -0.007335662841796875, 0.0211029052734375, -0.01511383056640625, 0.0228729248046875, -0.007251739501953125, 0.04345703125, 0.059234619140625, -0.036346435546875, 0.018463134765625, 0.03460693359375, -0.01464080810546875, 0.047607421875, -0.0271148681640625, 0.01009368896484375, -0.0049285888671875, 0.004673004150390625, -0.1005859375, -0.02984619140625, 0.039031982421875, -0.02301025390625, 0.004497528076171875, -0.0204315185546875, -0.040191650390625, -0.019561767578125, -0.054473876953125, 0.040008544921875, 0.07220458984375, -0.0254669189453125, 0.01531219482421875, 0.03314208984375, 0.007232666015625, -0.04351806640625, -0.05218505859375, -0.025177001953125, -0.0257568359375, -0.058135986328125, 0.04168701171875, -0.0096893310546875, -0.01204681396484375, 0.0054779052734375, 0.009674072265625, -0.034088134765625, -0.0198211669921875, 0.031646728515625, 0.051116943359375, -0.0013399124145507812, -0.033355712890625, 0.0172271728515625, 0.0062103271484375, -0.0011959075927734375, 0.01251983642578125, 0.04010009765625, 0.00926971435546875, -0.038421630859375, -0.046875, 0.0283355712890625, 
0.0677490234375, -0.0035381317138671875, 0.07568359375, 0.066650390625, -0.043792724609375, 0.009918212890625, -0.046234130859375, 0.011566162109375, -0.03485107421875, -0.01018524169921875, -0.017852783203125, -0.052734375, 0.041259765625, 0.0247802734375, -0.01251983642578125, 0.03387451171875, 0.03179931640625, -0.0092315673828125, 0.07220458984375, 0.060028076171875, 0.0192718505859375, 0.0302734375, -0.043212890625, -0.00859832763671875, -0.06964111328125, -0.04443359375, -0.00846099853515625, -0.027496337890625, -0.050933837890625, -0.046875, 0.0189208984375, 0.01467132568359375, -0.030792236328125, 0.03997802734375, -0.043060302734375, 0.027435302734375, 0.0239715576171875, 0.048919677734375, 0.0244598388671875, 0.001445770263671875, -0.016143798828125, -0.014984130859375, -0.035888671875, -0.024322509765625, 0.042510986328125, 0.0280609130859375, 0.052642822265625, 0.042724609375, 0.0276031494140625, 0.0026340484619140625, 0.0173492431640625, -0.0030689239501953125, 0.04931640625, -0.032745361328125, -0.05517578125, -0.008697509765625, -0.01081085205078125, -0.056396484375, 0.0229034423828125, -0.05010986328125, -0.037628173828125, 0.03662109375, 0.0006823539733886719, -0.028961181640625, 0.0460205078125, -0.01727294921875, 0.046783447265625, 0.0154266357421875, -0.06561279296875, -0.00283050537109375, -0.049041748046875, 0.042510986328125, 0.012481689453125, 0.028045654296875, -0.00978851318359375, -0.0190582275390625, 0.034942626953125, -0.048370361328125, 0.048980712890625, -0.0014057159423828125, 0.0068511962890625, 0.006999969482421875, 0.0035457611083984375, -0.0041046142578125, 0.007587432861328125, 0.021240234375, 0.010101318359375, -0.040313720703125, -0.0285186767578125, -0.01198577880859375, 0.049896240234375, -0.04656982421875, -0.03753662109375, -0.03485107421875, -0.016571044921875, 0.01354217529296875, 0.00848388671875, 0.044097900390625, 0.03680419921875, -0.026885986328125, 0.02197265625, 0.063232421875, 0.015289306640625, 0.032958984375, 
0.0408935546875, -0.046356201171875, -0.0418701171875, 0.045562744140625, -0.00739288330078125, 0.031341552734375, 0.0109710693359375, -0.00011867284774780273, -0.02978515625, -0.037017822265625, -0.042633056640625, 0.045806884765625, -0.041046142578125, -0.00849151611328125, -0.03466796875, -0.03997802734375, -0.027679443359375, -0.03692626953125, -0.0108489990234375, -0.01010894775390625, -0.05487060546875, -0.00926971435546875, 0.0311126708984375, 0.044281005859375, -0.0182037353515625, 0.01837158203125, -0.05828857421875, 0.00836944580078125, 0.0286712646484375, 0.0316162109375, -0.01444244384765625, -0.05377197265625, 0.0020732879638671875, -0.0014190673828125, -0.0250244140625, -0.06298828125, 0.0469970703125, -0.0295867919921875, 0.04095458984375, 0.030548095703125, -0.00598907470703125, 0.07000732421875, -0.0172271728515625, 0.055877685546875, 0.033111572265625, -0.041107177734375, 0.02984619140625, -0.05206298828125, 0.01171112060546875, 0.059295654296875, 0.059051513671875, -0.009185791015625, -0.03924560546875, -0.07501220703125, -0.06591796875, 0.025115966796875, 0.04241943359375, 0.0040740966796875, 0.01568603515625, 0.01861572265625, -0.006954193115234375, 0.0262908935546875, -0.05902099609375, -0.040679931640625, -0.0009207725524902344, 0.00762939453125, 0.018096923828125, -0.018402099609375, -0.019989013671875, -0.03717041015625, 0.0657958984375, 0.0240020751953125, 0.00887298583984375, 0.0021457672119140625, 0.01203155517578125, -0.0309600830078125, -0.01055145263671875, 0.058746337890625, 0.057281494140625, -0.0570068359375, -0.023529052734375, -0.0187225341796875, -0.032562255859375, 0.0217742919921875, 0.0003001689910888672, -0.00975799560546875, 0.00746917724609375, 0.00852203369140625, 0.03778076171875, 0.0190277099609375, -0.021881103515625, 0.060760498046875, -0.01262664794921875, -0.0306243896484375, -0.023040771484375, 0.01432037353515625, 0.0080718994140625, 0.015899658203125, 0.0261688232421875, 0.0293426513671875, 0.0379638671875, 
-0.03460693359375, 0.0299530029296875, 0.0304107666015625, -0.049774169921875, 0.0026035308837890625, 0.06414794921875, 0.026824951171875, -0.031280517578125, 0.037139892578125, -0.0208282470703125, -0.018585205078125, 0.045074462890625, 0.039276123046875, 0.06549072265625, -0.0176239013671875, 0.01071929931640625, 0.0399169921875, -0.00583648681640625, -0.01003265380859375, 0.024017333984375, 0.0230560302734375, -0.0408935546875, -0.01471710205078125, -0.03326416015625, -0.026763916015625, 0.03533935546875, -0.048980712890625, 0.055145263671875, -0.04449462890625, -0.0106658935546875, 0.001964569091796875, -0.0269317626953125, -0.015380859375, 0.041015625, 0.00717926025390625, 0.08465576171875, -0.101318359375, 0.054595947265625, 0.04083251953125, -0.0611572265625, -0.054473876953125, -0.00858306884765625, 0.015655517578125, -0.0218048095703125, 0.025665283203125, 0.00724029541015625, -0.0086822509765625, -0.01526641845703125, -0.039215087890625, -0.065185546875, 0.0782470703125, 0.01934814453125, -0.0201263427734375, -0.01108551025390625, -0.01548004150390625, 0.036285400390625, -0.03973388671875, 0.01346588134765625, 0.0283660888671875, 0.0283203125, 0.039886474609375, -0.062164306640625, 0.004268646240234375, -0.04669189453125, 0.00464630126953125, 0.0163116455078125, -0.0745849609375, 0.0750732421875, 0.0166015625, -0.0133514404296875, 0.0272674560546875, 0.045257568359375, 0.0262603759765625, 0.0081329345703125, 0.046600341796875, 0.0595703125, 0.0051116943359375, -0.0289154052734375, 0.052947998046875, -0.005290985107421875, 0.02850341796875, 0.054229736328125, 0.0082244873046875, 0.0517578125, 0.016937255859375, -0.0292816162109375, 0.045196533203125, 0.048583984375, -0.00589752197265625, 0.0550537109375, -0.0024547576904296875, -0.0241241455078125, -0.0184478759765625, -0.00298309326171875, -0.045135498046875, 0.027008056640625, 0.001476287841796875, -0.00551605224609375, -0.0085601806640625, 0.00397491455078125, 0.016632080078125, 0.01190185546875, 
-0.0031185150146484375, 0.03814697265625, -0.0132598876953125, -0.00583648681640625, 0.046722412109375, -0.0189208984375, 0.06988525390625, -0.03778076171875, -0.027618408203125, -0.0201873779296875, 0.0084381103515625, -0.0263671875, -0.08026123046875, 0.018035888671875, -0.01158905029296875, -0.0032024383544921875, -0.03125, 0.0589599609375, -0.037384033203125, -0.044525146484375, 0.0277099609375, 0.0088348388671875, 0.023345947265625, 0.034912109375, -0.059783935546875, 0.02459716796875, 0.01273345947265625, -0.0014486312866210938, 0.020751953125, -0.0005779266357421875, 0.017608642578125, 0.061126708984375, 0.05865478515625, 0.0154266357421875, -0.007595062255859375, -0.01389312744140625, 0.07342529296875, -0.0253753662109375, -0.046234130859375, -0.0701904296875, 0.0623779296875, -0.0280303955078125, -0.0216827392578125, 0.05426025390625, 0.051361083984375, 0.056884765625, -0.0258331298828125, 0.033843994140625, -0.000408172607421875, 0.0126800537109375, -0.0372314453125, 0.079833984375, -0.07818603515625, -0.00839996337890625, -0.04248046875, -0.060791015625, -0.01026153564453125, 0.047271728515625, 0.042510986328125, 0.01161956787109375, 0.0287628173828125, 0.06451416015625, -0.0215606689453125, -0.034820556640625, 0.0306549072265625, -0.00014829635620117188, 0.00516510009765625, 0.010406494140625, 0.047698974609375, -0.0321044921875, 0.0077667236328125, -0.06268310546875, -0.0263214111328125, -0.035430908203125, -0.08251953125, -0.047607421875, -0.059173583984375, -0.03997802734375, -0.04449462890625, 0.00049591064453125, 0.04241943359375, 0.0797119140625, -0.0269622802734375, -0.0309906005859375, 0.0024356842041015625, 0.0012111663818359375, -0.006366729736328125, -0.0179595947265625, -0.0030364990234375, 0.04351806640625, -0.07177734375, 0.0105743408203125, 0.039154052734375, 0.05963134765625, -0.044342041015625, 0.0200653076171875, -0.049896240234375, 0.008270263671875, 0.0116119384765625, 0.01800537109375, -0.043731689453125, -0.006824493408203125, 
-0.0263671875, -0.020660400390625, 0.0017518997192382812, 0.0125579833984375, -0.021728515625, 0.01385498046875, 0.047698974609375, -0.0037555694580078125, 0.0501708984375, 0.00907135009765625, 0.02740478515625, -0.0491943359375, 0.049407958984375, -0.0157470703125, 0.034912109375, 0.025787353515625, -0.038482666015625, 0.05780029296875, 0.03125, -0.0249481201171875, -0.06842041015625, -0.00556182861328125, -0.10748291015625, -0.004199981689453125, 0.0643310546875, 0.0014619827270507812, -0.0168914794921875, 0.0116119384765625, -0.01009368896484375, 0.013763427734375, -0.02276611328125, 0.0087890625, 0.047149658203125, -0.0108642578125, -0.0169219970703125, -0.0386962890625, 0.042266845703125, -0.0006308555603027344, -0.0513916015625, -0.031280517578125, 0.034393310546875, 0.04315185546875, 0.02978515625, 0.0638427734375, -0.0306243896484375, 0.041473388671875, -0.00872039794921875, 0.0182037353515625, -0.02044677734375, -0.02740478515625, -0.0157318115234375, 0.0121612548828125, -0.022705078125, -0.026123046875 ] ]
laion/CLIP-convnext_large_d.laion2B-s26B-b102K-augreg
2023-04-18T19:33:42.000Z
[ "open_clip", "tensorboard", "clip", "zero-shot-image-classification", "arxiv:2201.03545", "arxiv:2210.08402", "arxiv:1910.04867", "license:mit", "has_space", "region:us" ]
zero-shot-image-classification
laion
null
null
laion/CLIP-convnext_large_d.laion2B-s26B-b102K-augreg
4
9,483
open_clip
2023-01-29T22:40:05
--- license: mit library_name: open_clip pipeline_tag: zero-shot-image-classification tags: - clip --- # Model Card for CLIP-convnext_large_d.laion2B-s26B-b102K-augreg # Table of Contents 1. [Model Details](#model-details) 2. [Uses](#uses) 3. [Training Details](#training-details) 4. [Evaluation](#evaluation) 5. [Acknowledgements](#acknowledgements) 6. [Citation](#citation) # Model Details ## Model Description A series of CLIP [ConvNeXt-Large](https://arxiv.org/abs/2201.03545) (w/ extra text depth, vision MLP head) models trained on LAION-2B (english), a subset of [LAION-5B](https://arxiv.org/abs/2210.08402), using [OpenCLIP](https://github.com/mlfoundations/open_clip). Goals: * Explore an alternative to ViT and ResNet (w/ AttentionPooling) CLIP models that scales well with model size and image resolution Firsts: * First known ConvNeXt CLIP models trained at scale in the range of CLIP ViT-L/16, ViT-L/14, and RN50x16 * First released model weights exploring increased augmentation + regularization for the image tower (greater scale range of RRC, random erasing, stochastic depth) The models utilize: * the [timm](https://github.com/rwightman/pytorch-image-models) ConvNeXt-Large model (`convnext_large`) as the image tower * an MLP (`fc - gelu - drop - fc`) head in the vision tower instead of the single projection of other CLIP models * a text tower with the same width but 4 more layers of depth than the ViT-L / RN50x16 models (depth 16, embed dim 768). The models are trained at 256x256 image resolution (384 variants are in progress). At 256x256, the ConvNeXt-Large-D used roughly 1/2 the training FLOPs to achieve accuracy greater than the previous L/14 model trained on LAION-2B. The L/14 model is ~1.65x more GMAC, 1.45x more activations, and 1.22x more parameters. The ConvNeXt was trained with 26B samples seen and the L/14 with 34B. 
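The ~26B samples-seen figure quoted above can be cross-checked against the batch schedule given later in this card (global batch size 102400, 128 checkpoint intervals of 203,666,042 samples each); a quick arithmetic sketch:

```python
# Sanity check on the "26B samples seen" figure, using the schedule from the
# Training Procedure section of this card: a global batch size of 102400 run
# for 128 checkpoint intervals of ~203.7M samples each.
global_batch = 102_400
intervals = 128
samples_per_interval = 203_666_042  # per-interval sample count from the card

total_samples = intervals * samples_per_interval
steps_per_interval = samples_per_interval / global_batch

print(f"total samples seen: {total_samples / 1e9:.2f}B")          # ~26.07B
print(f"optimizer steps per interval: {steps_per_interval:,.0f}")  # ~1,989
```

The product comes out to ~26.07B samples, consistent with the rounded 26B figure.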
| Model | Dataset | Resolution | AugReg | Top-1 ImageNet Zero-Shot (%) | | ----- | ------- | ---------- | ------------ | --------- | | [convnext_large_d.laion2b_s26b_b102k-augreg](https://huggingface.co/laion/CLIP-convnext_large_d.laion2B-s26B-b102K-augreg) | LAION-2B | 256x256 | RRC (0.33, 1.0), RE (0.35), SD (0.1), D(0.1) | 75.9 | | [convnext_large_d_320.laion2b_s29b_b131k-ft](https://huggingface.co/laion/CLIP-convnext_large_d_320.laion2B-s29B-b131K-ft) | LAION-2B | 320x320 | RRC (0.5, 1.0), RE (0.4), SD (0.1), D(0.0) | 76.6 | | [convnext_large_d_320.laion2b_s29b_b131k-ft-soup](https://huggingface.co/laion/CLIP-convnext_large_d_320.laion2B-s29B-b131K-ft-soup) | LAION-2B | 320x320 | RRC (0.5, 1.0), RE (0.4), SD (0.1), D(0.0) | 76.9 | RRC = Random Resize Crop (crop pcts), RE = Random Erasing (prob), SD = Stochastic Depth (prob) -- image tower only, D = Dropout (prob) -- image tower head only LAION-A = LAION Aesthetic, an ~900M sample subset of LAION-2B with pHash dedupe and aesthetic score filtering. Model training was done by Ross Wightman on the [stability.ai](https://stability.ai/) cluster. # Uses As per the original [OpenAI CLIP model card](https://github.com/openai/CLIP/blob/d50d76daa670286dd6cacf3bcd80b5e4823fc8e1/model-card.md), this model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models. The OpenAI CLIP paper includes a discussion of potential downstream impacts to provide an example of this sort of analysis. Additionally, the LAION-5B blog (https://laion.ai/blog/laion-5b/) and upcoming paper include additional discussion as it relates specifically to the training dataset. ## Direct Use Zero-shot image classification, image and text retrieval, among others. 
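A minimal sketch of how that zero-shot classification is scored: embed the image and one text prompt per class into the shared space, then rank classes by cosine similarity. Random vectors stand in here for the outputs of the model's actual image and text encoders; only the scoring step is illustrated.

```python
import numpy as np

# Illustrative sketch of CLIP-style zero-shot scoring. The embeddings below
# are random stand-ins for the encoder outputs of the real model.
rng = np.random.default_rng(0)

def l2_normalize(x, axis=-1):
    # Project embeddings onto the unit sphere so the dot product is cosine similarity.
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

image_embedding = l2_normalize(rng.standard_normal(768))       # one image, embed dim 768
text_embeddings = l2_normalize(rng.standard_normal((3, 768)))  # one prompt per class, 3 classes

logits = 100.0 * text_embeddings @ image_embedding  # scaled cosine similarities
probs = np.exp(logits - logits.max())
probs /= probs.sum()                                # softmax over classes

predicted_class = int(np.argmax(probs))
print(predicted_class, probs.round(3))
```

With the real checkpoint, the image and text embeddings would come from the model's encoders (loadable via the `open_clip` library this card references); the similarity-and-softmax step is the same.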
## Downstream Use Image classification and other image task fine-tuning, linear probe image classification, image generation guiding and conditioning, among others. ## Out-of-Scope Use As per the OpenAI models, **any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP’s performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case potentially harmful at present. Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of the model's performance. This is because the use of artificial intelligence for tasks such as these can currently be premature given the lack of testing norms and checks to ensure its fair use. Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases. Further to the above notice, the LAION-5B dataset used in training of these models has additional considerations; see below. # Training Details ## Training Data This model was trained with LAION-2B -- a 2 billion sample English subset of LAION-5B (https://laion.ai/blog/laion-5b/). **IMPORTANT NOTE:** The motivation behind dataset creation is to democratize research and experimentation around large-scale multi-modal model training and handling of uncurated, large-scale datasets crawled from the publicly available internet. Our recommendation is therefore to use the dataset for research purposes. Be aware that this large-scale dataset is uncurated. 
Keep in mind that the uncurated nature of the dataset means that collected links may lead to strongly discomforting and disturbing content for a human viewer. Therefore, please use the demo links with caution and at your own risk. It is possible to extract a “safe” subset by filtering out samples based on the safety tags (using a custom-trained NSFW classifier that we built). While this strongly reduces the chance of encountering potentially harmful content when viewing, we cannot entirely exclude the possibility of harmful content still being present in safe mode, so the warning holds there as well. We think that providing the dataset openly to broad research and other interested communities will allow for transparent investigation of the benefits that come along with training large-scale models, as well as of pitfalls and dangers that may stay unreported or unnoticed when working with closed large datasets that remain restricted to a small community. Although we provide the dataset openly, we do not recommend using it for creating ready-to-go industrial products, as the basic research into the general properties and safety of such large-scale models, which we would like to encourage with this release, is still in progress. ## Training Procedure All models were trained with a global batch size of 102400 for 128 checkpoint intervals of 203.7M samples for a total of ~26B samples seen over training. For 256x256 models, the slurm script w/ srun below was used on 16 8-GPU (A100 80GB) nodes (Stability). 
``` /opt/slurm/sbin/srun --cpu_bind=v --accel-bind=gn python -m training.main \ --save-frequency 1 \ --name "convnext_large_256" \ --resume 'latest' \ --train-data="pipe:aws s3 cp s3://mybucket/path/{laion{00000..xxxxx}.tar -" \ --train-num-samples 203666042 \ --dataset-type webdataset \ --precision amp_bfloat16 \ --beta2 0.98 \ --warmup 10000 \ --batch-size=800 \ --epochs=128 \ --dataset-resampled \ --aug-cfg use_timm=True scale='(0.33, 1.0)' re_prob=0.35 \ --clip-grad-norm 5.0 \ --lr 1.667e-3 \ --workers=6 \ --model "convnext_large_d" \ --seed 0 \ --ddp-static-graph \ --local-loss \ --gather-with-grad \ --grad-checkpointing ``` # Evaluation Evaluation was done with code in the [LAION CLIP Benchmark suite](https://github.com/LAION-AI/CLIP_benchmark). ## Testing Data, Factors & Metrics ### Testing Data The testing is performed with VTAB+ (a combination of VTAB (https://arxiv.org/abs/1910.04867) w/ additional robustness datasets) for classification, and COCO and Flickr for retrieval. ## Results The model achieves a top-1 zero-shot accuracy of 75.9% on ImageNet-1k. ![](convnext_large_zero_shot.png) An initial round of benchmarks has been performed on a wider range of datasets and is viewable at https://github.com/LAION-AI/CLIP_benchmark/blob/main/benchmark/results.ipynb # Acknowledgements We acknowledge [stability.ai](https://stability.ai/) for the compute used to train this model. 
# Citation **BibTeX:** LAION-5B ```bibtex @inproceedings{schuhmann2022laionb, title={{LAION}-5B: An open large-scale dataset for training next generation image-text models}, author={Christoph Schuhmann and Romain Beaumont and Richard Vencu and Cade W Gordon and Ross Wightman and Mehdi Cherti and Theo Coombes and Aarush Katta and Clayton Mullis and Mitchell Wortsman and Patrick Schramowski and Srivatsa R Kundurthy and Katherine Crowson and Ludwig Schmidt and Robert Kaczmarczyk and Jenia Jitsev}, booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track}, year={2022}, url={https://openreview.net/forum?id=M3Y74vmsMcY} } ``` OpenCLIP software ```bibtex @software{ilharco_gabriel_2021_5143773, author = {Ilharco, Gabriel and Wortsman, Mitchell and Wightman, Ross and Gordon, Cade and Carlini, Nicholas and Taori, Rohan and Dave, Achal and Shankar, Vaishaal and Namkoong, Hongseok and Miller, John and Hajishirzi, Hannaneh and Farhadi, Ali and Schmidt, Ludwig}, title = {OpenCLIP}, month = jul, year = 2021, note = {If you use this software, please cite it as below.}, publisher = {Zenodo}, version = {0.1}, doi = {10.5281/zenodo.5143773}, url = {https://doi.org/10.5281/zenodo.5143773} } ``` OpenAI CLIP paper ```bibtex @inproceedings{Radford2021LearningTV, title={Learning Transferable Visual Models From Natural Language Supervision}, author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. 
Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever}, booktitle={ICML}, year={2021} } ``` ```bibtex @Article{liu2022convnet, author = {Zhuang Liu and Hanzi Mao and Chao-Yuan Wu and Christoph Feichtenhofer and Trevor Darrell and Saining Xie}, title = {A ConvNet for the 2020s}, journal = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)}, year = {2022}, } ``` ```bibtex @misc{rw2019timm, author = {Ross Wightman}, title = {PyTorch Image Models}, year = {2019}, publisher = {GitHub}, journal = {GitHub repository}, doi = {10.5281/zenodo.4414861}, howpublished = {\url{https://github.com/rwightman/pytorch-image-models}} } ```
11,454
[ [ -0.03155517578125, -0.035430908203125, 0.0116729736328125, -0.00017058849334716797, -0.0290374755859375, -0.0290069580078125, -0.0133514404296875, -0.044036865234375, 0.0178375244140625, 0.031768798828125, -0.03839111328125, -0.034759521484375, -0.045654296875, -0.006214141845703125, -0.026824951171875, 0.06683349609375, -0.00650787353515625, 0.0021991729736328125, 0.00296783447265625, -0.02655029296875, -0.0362548828125, -0.040863037109375, -0.056640625, 0.0035877227783203125, 0.0187530517578125, 0.0254058837890625, 0.048065185546875, 0.062286376953125, 0.053466796875, 0.017425537109375, -0.016082763671875, -0.006839752197265625, -0.048919677734375, -0.037567138671875, 0.00861358642578125, -0.022064208984375, -0.045166015625, 0.00783538818359375, 0.052703857421875, 0.0239715576171875, -0.007396697998046875, 0.022064208984375, 0.008056640625, 0.042694091796875, -0.0550537109375, -0.00008881092071533203, -0.0419921875, 0.0092926025390625, -0.0195159912109375, 0.01271820068359375, -0.0167999267578125, -0.00385284423828125, 0.0176544189453125, -0.052001953125, 0.0280914306640625, -0.0110931396484375, 0.09521484375, 0.01517486572265625, -0.0194091796875, 0.01727294921875, -0.04388427734375, 0.06243896484375, -0.05841064453125, 0.030120849609375, 0.019989013671875, 0.0255889892578125, 0.01412200927734375, -0.06683349609375, -0.0267333984375, -0.0118255615234375, 0.0137481689453125, 0.0234527587890625, -0.016754150390625, -0.0012407302856445312, 0.039398193359375, 0.0269622802734375, -0.0390625, 0.00928497314453125, -0.04937744140625, 0.0013713836669921875, 0.06072998046875, 0.004039764404296875, 0.01220703125, -0.0210418701171875, -0.0531005859375, -0.024932861328125, -0.0496826171875, 0.0267791748046875, 0.0158843994140625, -0.0067596435546875, -0.029571533203125, 0.03240966796875, -0.006755828857421875, 0.0307769775390625, 0.0022182464599609375, -0.019805908203125, 0.029998779296875, -0.0239410400390625, -0.034149169921875, -0.01116943359375, 0.0791015625, 
0.0513916015625, 0.0015354156494140625, 0.015472412109375, -0.00939178466796875, -0.01319122314453125, 0.01438140869140625, -0.0831298828125, -0.00988006591796875, 0.01389312744140625, -0.048248291015625, -0.027587890625, 0.0233154296875, -0.050445556640625, 0.0027141571044921875, -0.00909423828125, 0.044586181640625, -0.0423583984375, -0.01503753662109375, 0.0090484619140625, -0.01285552978515625, 0.01343536376953125, 0.027313232421875, -0.040313720703125, 0.016632080078125, 0.0294036865234375, 0.0833740234375, -0.0146636962890625, -0.0264129638671875, -0.01218414306640625, 0.0081939697265625, -0.0243988037109375, 0.0379638671875, 0.001682281494140625, -0.0272674560546875, -0.0167388916015625, 0.035552978515625, -0.004138946533203125, -0.04205322265625, 0.04852294921875, -0.0166778564453125, -0.004245758056640625, -0.012054443359375, -0.0269012451171875, -0.036346435546875, 0.01132965087890625, -0.0560302734375, 0.06976318359375, 0.00827789306640625, -0.0654296875, 0.0233154296875, -0.039825439453125, -0.0045318603515625, -0.0088043212890625, 0.005764007568359375, -0.04364013671875, -0.017181396484375, 0.0285797119140625, 0.04730224609375, -0.02703857421875, 0.0248565673828125, -0.039337158203125, -0.031463623046875, 0.0225830078125, -0.03375244140625, 0.068603515625, 0.0003287792205810547, -0.0279388427734375, 0.00820159912109375, -0.046844482421875, -0.00736236572265625, 0.025390625, 0.004352569580078125, -0.00455474853515625, -0.02679443359375, -0.0048370361328125, 0.0208892822265625, 0.00913238525390625, -0.04229736328125, -0.003509521484375, -0.01251983642578125, 0.046051025390625, 0.056915283203125, 0.01152801513671875, 0.0239410400390625, -0.03857421875, 0.036834716796875, 0.0013532638549804688, 0.0472412109375, -0.0165557861328125, -0.0333251953125, -0.054473876953125, -0.04473876953125, 0.0304718017578125, 0.035400390625, -0.03167724609375, 0.03424072265625, -0.019012451171875, -0.041778564453125, -0.028076171875, -0.01280975341796875, 0.028961181640625, 
0.03875732421875, 0.0293426513671875, -0.031768798828125, -0.03948974609375, -0.08056640625, 0.0281524658203125, 0.013916015625, -0.01519012451171875, 0.050628662109375, 0.05841064453125, -0.0087738037109375, 0.05816650390625, -0.053955078125, -0.033203125, -0.01436614990234375, -0.0019483566284179688, 0.0184326171875, 0.03515625, 0.06201171875, -0.05853271484375, -0.04302978515625, -0.0122222900390625, -0.07403564453125, 0.0177001953125, -0.0002655982971191406, -0.0123443603515625, 0.0086212158203125, 0.034942626953125, -0.047576904296875, 0.05145263671875, 0.0310821533203125, 0.010833740234375, 0.0465087890625, -0.01030731201171875, 0.006107330322265625, -0.07867431640625, 0.0281524658203125, 0.01311492919921875, -0.01496124267578125, -0.04058837890625, 0.0041961669921875, 0.004749298095703125, -0.0285797119140625, -0.06707763671875, 0.0413818359375, -0.0284576416015625, 0.0021190643310546875, -0.00970458984375, 0.0041351318359375, 0.00868988037109375, 0.050567626953125, 0.01568603515625, 0.0687255859375, 0.0472412109375, -0.0460205078125, 0.0126800537109375, 0.0189971923828125, -0.029327392578125, 0.0305938720703125, -0.0777587890625, 0.0024471282958984375, 0.0004391670227050781, 0.030548095703125, -0.03778076171875, -0.0330810546875, 0.0231781005859375, -0.03509521484375, 0.02191162109375, -0.0274658203125, -0.01264190673828125, -0.044403076171875, -0.057708740234375, 0.040618896484375, 0.049835205078125, -0.03961181640625, 0.0157012939453125, 0.0321044921875, 0.016082763671875, -0.053619384765625, -0.052886962890625, -0.019134521484375, -0.0261993408203125, -0.0601806640625, 0.037384033203125, 0.00402069091796875, 0.006015777587890625, 0.0009441375732421875, 0.0017681121826171875, -0.00934600830078125, -0.00879669189453125, 0.042510986328125, 0.033294677734375, -0.00893402099609375, -0.01605224609375, -0.01715087890625, 0.00244140625, -0.0033588409423828125, -0.01509857177734375, 0.0308380126953125, -0.0116119384765625, -0.01320648193359375, 
-0.055389404296875, 0.01259613037109375, 0.042694091796875, -0.017822265625, 0.0609130859375, 0.061737060546875, -0.0343017578125, 0.0034580230712890625, -0.033966064453125, -0.0171661376953125, -0.0352783203125, 0.038818359375, -0.00916290283203125, -0.05474853515625, 0.0482177734375, 0.01013946533203125, -0.00733184814453125, 0.053741455078125, 0.0177764892578125, -0.00946044921875, 0.06640625, 0.0418701171875, -0.0020503997802734375, 0.0430908203125, -0.0789794921875, 0.0011768341064453125, -0.091064453125, -0.022674560546875, -0.00928497314453125, -0.0190887451171875, -0.03558349609375, -0.040802001953125, 0.05255126953125, 0.0251922607421875, -0.0242462158203125, 0.032196044921875, -0.0290374755859375, 0.020050048828125, 0.041656494140625, 0.034820556640625, -0.01320648193359375, -0.004192352294921875, 0.0042877197265625, -0.00782012939453125, -0.058563232421875, -0.01535797119140625, 0.09173583984375, 0.053466796875, 0.049530029296875, -0.013458251953125, 0.028961181640625, 0.01404571533203125, -0.0010099411010742188, -0.05267333984375, 0.046173095703125, -0.031646728515625, -0.0482177734375, -0.026275634765625, -0.0272064208984375, -0.06463623046875, 0.00785064697265625, -0.0157623291015625, -0.057373046875, 0.019805908203125, 0.0018186569213867188, -0.03509521484375, 0.045196533203125, -0.04156494140625, 0.073974609375, -0.0224761962890625, -0.037322998046875, -0.00482940673828125, -0.05841064453125, 0.04376220703125, 0.006793975830078125, 0.0044097900390625, -0.01372528076171875, 0.01486968994140625, 0.085693359375, -0.060699462890625, 0.061492919921875, -0.013397216796875, 0.0189971923828125, 0.055145263671875, -0.011383056640625, 0.032501220703125, 0.015106201171875, 0.00673675537109375, 0.05230712890625, 0.00481414794921875, -0.01385498046875, -0.028961181640625, 0.041107177734375, -0.07171630859375, -0.02960205078125, -0.035552978515625, -0.03900146484375, 0.0187225341796875, 0.0148162841796875, 0.05889892578125, 0.053802490234375, 
-0.01020050048828125, 0.035308837890625, 0.046295166015625, -0.01812744140625, 0.034820556640625, 0.010040283203125, -0.009674072265625, -0.05902099609375, 0.072265625, 0.0212554931640625, 0.0246429443359375, 0.005207061767578125, 0.01309967041015625, -0.01378631591796875, -0.03216552734375, -0.04119873046875, 0.0278472900390625, -0.041748046875, -0.03558349609375, -0.03778076171875, -0.04547119140625, -0.03643798828125, -0.013763427734375, -0.041534423828125, -0.0264129638671875, -0.045867919921875, 0.0012083053588867188, 0.02679443359375, 0.042144775390625, -0.0174713134765625, 0.035858154296875, -0.06622314453125, 0.0147247314453125, 0.019134521484375, 0.0222930908203125, 0.00518035888671875, -0.063720703125, -0.0210418701171875, 0.021087646484375, -0.039337158203125, -0.051849365234375, 0.031005859375, 0.01727294921875, 0.027618408203125, 0.052154541015625, -0.00637054443359375, 0.04827880859375, -0.025909423828125, 0.079345703125, 0.032073974609375, -0.05584716796875, 0.04156494140625, -0.042877197265625, 0.0178680419921875, 0.038360595703125, 0.048370361328125, -0.027069091796875, 0.00472259521484375, -0.06634521484375, -0.06927490234375, 0.0770263671875, 0.0131988525390625, -0.00727081298828125, 0.0144195556640625, 0.04046630859375, -0.0017137527465820312, 0.00921630859375, -0.0638427734375, -0.01052093505859375, -0.03094482421875, 0.0081939697265625, 0.0048828125, -0.032470703125, -0.0098876953125, -0.034820556640625, 0.059326171875, -0.007144927978515625, 0.0408935546875, 0.0192108154296875, 0.0006489753723144531, -0.0035266876220703125, -0.01483917236328125, 0.04046630859375, 0.036834716796875, -0.042877197265625, -0.01727294921875, 0.023345947265625, -0.05096435546875, -0.0029582977294921875, 0.003223419189453125, -0.045623779296875, -0.01105499267578125, 0.02935791015625, 0.08502197265625, 0.0137939453125, -0.040191650390625, 0.06927490234375, -0.00687408447265625, -0.03692626953125, -0.0289764404296875, 0.0068817138671875, -0.0186309814453125, 
0.011077880859375, 0.01611328125, 0.01146697998046875, 0.0150146484375, -0.03277587890625, 0.017669677734375, 0.038360595703125, -0.037445068359375, -0.03326416015625, 0.05413818359375, 0.001285552978515625, -0.00472259521484375, 0.047119140625, -0.01296234130859375, -0.045867919921875, 0.0621337890625, 0.0408935546875, 0.0670166015625, -0.0126495361328125, 0.0253143310546875, 0.06884765625, 0.0158538818359375, -0.026702880859375, 0.004474639892578125, 0.02044677734375, -0.039459228515625, -0.00852203369140625, -0.034820556640625, -0.01495361328125, 0.0433349609375, -0.0643310546875, 0.03912353515625, -0.05206298828125, -0.031341552734375, -0.0175933837890625, -0.011383056640625, -0.0423583984375, 0.024688720703125, 0.0084075927734375, 0.07794189453125, -0.06610107421875, 0.051422119140625, 0.045867919921875, -0.043304443359375, -0.07257080078125, -0.0007214546203613281, -0.0121612548828125, -0.0428466796875, 0.030242919921875, 0.03717041015625, 0.006500244140625, -0.0207061767578125, -0.0645751953125, -0.08197021484375, 0.1075439453125, 0.0276947021484375, -0.0303802490234375, 0.0037746429443359375, 0.002376556396484375, 0.0265655517578125, -0.02484130859375, 0.03302001953125, 0.0245361328125, 0.007198333740234375, 0.0210113525390625, -0.07000732421875, 0.0010175704956054688, -0.007587432861328125, 0.01529693603515625, 0.0087738037109375, -0.0938720703125, 0.08526611328125, -0.0177154541015625, -0.0179443359375, 0.0029354095458984375, 0.0504150390625, 0.0119476318359375, 0.015899658203125, 0.034088134765625, 0.0560302734375, 0.0428466796875, 0.0041351318359375, 0.0753173828125, -0.0160675048828125, 0.0254669189453125, 0.07196044921875, -0.0020751953125, 0.06646728515625, 0.0147247314453125, -0.01476287841796875, 0.017547607421875, 0.048370361328125, -0.0259246826171875, 0.048492431640625, -0.01059722900390625, -0.00021791458129882812, -0.006500244140625, -0.03375244140625, -0.03765869140625, 0.033233642578125, 0.0095062255859375, -0.0224151611328125, 
0.00572967529296875, 0.0189361572265625, 0.00577545166015625, -0.02880859375, -0.0261688232421875, 0.03131103515625, 0.0115814208984375, -0.034637451171875, 0.058197021484375, 0.004642486572265625, 0.059356689453125, -0.04681396484375, 0.007495880126953125, -0.008026123046875, 0.017578125, -0.0116119384765625, -0.06671142578125, 0.0211944580078125, -0.006580352783203125, 0.00009697675704956055, -0.0006990432739257812, 0.05078125, -0.00928497314453125, -0.038421630859375, 0.0296630859375, -0.0045928955078125, 0.0214080810546875, -0.0140533447265625, -0.058349609375, 0.0153045654296875, 0.006504058837890625, -0.008026123046875, 0.02593994140625, 0.017669677734375, -0.00614166259765625, 0.043609619140625, 0.040557861328125, -0.005962371826171875, 0.002773284912109375, -0.00846099853515625, 0.07427978515625, -0.030242919921875, -0.02850341796875, -0.046783447265625, 0.037139892578125, -0.01351165771484375, -0.03521728515625, 0.050933837890625, 0.051239013671875, 0.0701904296875, -0.0116729736328125, 0.04791259765625, -0.0105438232421875, 0.0238037109375, -0.046783447265625, 0.042236328125, -0.059326171875, -0.0017452239990234375, -0.022186279296875, -0.057464599609375, -0.0081939697265625, 0.04510498046875, -0.0178985595703125, 0.0053863525390625, 0.0523681640625, 0.051544189453125, -0.0220184326171875, -0.0018396377563476562, 0.013763427734375, 0.010345458984375, 0.01479339599609375, 0.04779052734375, 0.043487548828125, -0.0684814453125, 0.044158935546875, -0.059783935546875, -0.0249786376953125, -0.024932861328125, -0.053009033203125, -0.07794189453125, -0.042633056640625, -0.0287322998046875, -0.0240478515625, -0.0016202926635742188, 0.050262451171875, 0.07354736328125, -0.052032470703125, -0.0234222412109375, 0.0131988525390625, -0.01910400390625, -0.01605224609375, -0.015228271484375, 0.04302978515625, 0.0205230712890625, -0.038238525390625, 0.011138916015625, 0.01140594482421875, 0.02191162109375, -0.007053375244140625, -0.004871368408203125, -0.0222930908203125, 
-0.0122528076171875, 0.04010009765625, 0.029205322265625, -0.04107666015625, -0.0264434814453125, 0.0169525146484375, 0.002063751220703125, 0.0175018310546875, 0.041778564453125, -0.035491943359375, 0.00884246826171875, 0.0309600830078125, 0.0246734619140625, 0.06805419921875, 0.009765625, 0.01371002197265625, -0.04730224609375, 0.034332275390625, 0.006961822509765625, 0.0301513671875, 0.0226898193359375, -0.031585693359375, 0.0550537109375, 0.032318115234375, -0.0487060546875, -0.0657958984375, -0.00445556640625, -0.088623046875, -0.0145416259765625, 0.0928955078125, -0.0245361328125, -0.04412841796875, 0.03436279296875, -0.0196533203125, 0.0226593017578125, -0.0278472900390625, 0.033233642578125, 0.027069091796875, -0.0003895759582519531, -0.0241241455078125, -0.057525634765625, 0.0347900390625, 0.01398468017578125, -0.050872802734375, -0.01143646240234375, 0.0283050537109375, 0.038360595703125, 0.014678955078125, 0.047119140625, -0.016510009765625, 0.0270538330078125, 0.0030765533447265625, 0.01154327392578125, -0.01739501953125, -0.040740966796875, -0.03265380859375, 0.004245758056640625, -0.0148773193359375, -0.034698486328125 ] ]
internlm/internlm-20b
2023-10-19T10:08:46.000Z
[ "transformers", "pytorch", "internlm", "feature-extraction", "text-generation", "custom_code", "license:apache-2.0", "region:us" ]
text-generation
internlm
null
null
internlm/internlm-20b
56
9,462
transformers
2023-09-18T03:27:33
--- license: apache-2.0 pipeline_tag: text-generation --- **InternLM** <div align="center"> <img src="https://github.com/InternLM/InternLM/assets/22529082/b9788105-8892-4398-8b47-b513a292378e" width="200"/> <div>&nbsp;</div> <div align="center"> <b><font size="5">InternLM</font></b> <sup> <a href="https://internlm.intern-ai.org.cn/"> <i><font size="4">HOT</font></i> </a> </sup> <div>&nbsp;</div> </div> [![evaluation](https://github.com/InternLM/InternLM/assets/22529082/f80a2a58-5ddf-471a-8da4-32ab65c8fd3b)](https://github.com/internLM/OpenCompass/) [💻Github Repo](https://github.com/InternLM/InternLM) • [🤔Reporting Issues](https://github.com/InternLM/InternLM/issues/new) </div> ## Introduction The Shanghai Artificial Intelligence Laboratory, in collaboration with SenseTime Technology, the Chinese University of Hong Kong, and Fudan University, has officially released the 20 billion parameter pretrained model, InternLM-20B. InternLM-20B was pre-trained on over **2.3T** Tokens containing high-quality English, Chinese, and code data. Additionally, the Chat version has undergone SFT and RLHF training, enabling it to better and more securely meet users' needs. In terms of model structure, InternLM-20B opted for a deeper architecture, with a depth set at 60 layers. This surpasses the conventional 7B and 13B models that utilize 32 or 40 layers. When parameters are limited, increasing the number of layers can enhance the model's overall capability. Furthermore, compared to InternLM-7B, the pre-training data used for InternLM-20B underwent higher quality cleansing and was supplemented with data rich in knowledge and designed for reinforcing understanding and reasoning capabilities. As a result, it exhibits significant improvements in understanding, reasoning, mathematical, and programming abilities—all of which test the technical proficiency of language models. 
Overall, InternLM-20B features the following characteristics: - Outstanding overall performance - Strong utility invocation capability - Supports a 16k context length (through inference-time extrapolation) - Better value alignment ## Performance Evaluation On the 5 capability dimensions proposed by OpenCompass, InternLM-20B has achieved excellent results (the bolded scores represent the best performances within the 13B-33B parameter range). | Capability | Llama-13B | Llama2-13B | Baichuan2-13B | InternLM-20B | Llama-33B | Llama-65B | Llama2-70B | |----------|-----------|------------|---------------|--------------|-----------|-----------|------------| | Language | 42.5 | 47 | 47.5 | **55** | 44.6 | 47.1 | 51.6 | | Knowledge | 58.2 | 58.3 | 48.9 | 60.1 | **64** | 66 | 67.7 | | Understanding | 45.5 | 50.9 | 58.1 | **67.3** | 50.6 | 54.2 | 60.8 | | Reasoning | 42.7 | 43.6 | 44.2 | **54.9** | 46.4 | 49.8 | 55 | | Examination | 37.3 | 45.2 | 51.8 | **62.5** | 47.4 | 49.7 | 57.3 | | Overall | 43.8 | 47.3 | 49.4 | **59.2** | 48.9 | 51.9 | 57.4 | The table below compares the performance of mainstream open-source models on some influential and typical datasets. 
| | Benchmarks | Llama-13B | Llama2-13B | Baichuan2-13B | InternLM-20B | Llama-33B | Llama-65B | Llama2-70B | |------|------------------|-----------|------------|---------------|--------------|-----------|-----------|------------| | Examination | MMLU | 47.73 | 54.99 | 59.55 | **62.05** | 58.73 | 63.71 | 69.75 | | | C-Eval (val) | 31.83 | 41.4 | **59.01** | 58.8 | 37.47 | 40.36 | 50.13 | | | AGI-Eval | 22.03 | 30.93 | 37.37 | **44.58** | 33.53 | 33.92 | 40.02 | | Knowledge | BoolQ | 78.75 | 82.42 | 67 | **87.46** | 84.43 | 86.61 | 87.74 | | | TriviaQA | 52.47 | 59.36 | 46.61 | 57.26 | **66.24** | 69.79 | 70.71 | | | NaturalQuestions | 20.17 | 24.85 | 16.32 | 25.15 | **30.89** | 33.41 | 34.16 | | Understanding | CMRC | 9.26 | 31.59 | 29.85 | **68.78** | 14.17 | 34.73 | 43.74 | | | CSL | 55 | 58.75 | 63.12 | **65.62** | 57.5 | 59.38 | 60 | | | RACE (middle) | 53.41 | 63.02 | 68.94 | **86.35** | 64.55 | 72.35 | 81.55 | | | RACE (high) | 47.63 | 58.86 | 67.18 | **83.28** | 62.61 | 68.01 | 79.93 | | | XSum | 20.37 | 23.37 | 25.23 | **35.54** | 20.55 | 19.91 | 25.38 | | Reasoning | WinoGrande | 64.64 | 64.01 | 67.32 | **69.38** | 66.85 | 69.38 | 69.77 | | | BBH | 37.93 | 45.62 | 48.98 | **52.51** | 49.98 | 58.38 | 64.91 | | | GSM8K | 20.32 | 29.57 | **52.62** | **52.62** | 42.3 | 54.44 | 63.31 | | | PIQA | 79.71 | 79.76 | 78.07 | 80.25 | **81.34** | 82.15 | 82.54 | | Programming | HumanEval | 14.02 | 18.9 | 17.07 | **25.61** | 17.68 | 18.9 | 26.22 | | | MBPP | 20.6 | 26.8 | 30.8 | **35.6** | 28.4 | 33.6 | 39.6 | Overall, InternLM-20B comprehensively outperforms open-source models in the 13B parameter range in terms of overall capabilities, and on inference evaluation sets, it approaches or even surpasses the performance of Llama-65B. 
## Import from Transformers To load the InternLM 20B model using Transformers, use the following code: ```python >>> from transformers import AutoTokenizer, AutoModelForCausalLM >>> tokenizer = AutoTokenizer.from_pretrained("internlm/internlm-20b", trust_remote_code=True) >>> model = AutoModelForCausalLM.from_pretrained("internlm/internlm-20b", trust_remote_code=True).cuda() >>> model = model.eval() >>> inputs = tokenizer(["Coming to the beautiful nature, we found"], return_tensors="pt") >>> for k,v in inputs.items(): inputs[k] = v.cuda() >>> gen_kwargs = {"max_length": 128, "top_p": 0.8, "temperature": 0.8, "do_sample": True, "repetition_penalty": 1.05} >>> output = model.generate(**inputs, **gen_kwargs) >>> output = tokenizer.decode(output[0].tolist(), skip_special_tokens=True) >>> print(output) Coming to the beautiful nature, we found not only various mountains, rivers, trees, and flowers but also many birds and beasts. Birds are the ones we are most familiar with; some are soaring in the sky, some are hopping on the ground, while others perch on trees... ``` **Limitations:** Although we have made efforts to ensure the safety of the model during the training process and to encourage the model to generate text that complies with ethical and legal requirements, the model may still produce unexpected outputs due to its size and probabilistic generation paradigm. For example, the generated responses may contain biases, discrimination, or other harmful content. Please do not propagate such content. We are not responsible for any consequences resulting from the dissemination of harmful information. ## Open Source License The code is licensed under Apache-2.0, while model weights are fully open for academic research and also allow **free** commercial usage. To apply for a commercial license, please fill in the [application form (English)](https://wj.qq.com/s2/12727483/5dba/)/[申请表(中文)](https://wj.qq.com/s2/12725412/f7c1/). 
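The `gen_kwargs` dictionary in the snippet above bundles the usual sampling controls. As a rough, framework-agnostic illustration of what `temperature` and `top_p` (nucleus sampling) actually do, here is a toy sketch over made-up logits; it is not tied to InternLM or to the Transformers implementation:

```python
import math

def sample_filter(logits, temperature=0.8, top_p=0.8):
    """Toy illustration of temperature scaling + nucleus (top-p) filtering.

    Returns the renormalized probabilities of the tokens kept by top-p.
    The logits are arbitrary scores, not from any real model.
    """
    # Temperature scaling: divide logits before softmax. T < 1 sharpens
    # the distribution, T > 1 flattens it.
    scaled = [x / temperature for x in logits]
    # Softmax with max-subtraction for numerical stability.
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Nucleus filtering: keep the smallest set of highest-probability
    # tokens whose cumulative mass reaches top_p, then renormalize.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}

# With these logits, only the two highest-probability tokens survive
# the 0.8 nucleus; sampling then happens over just those two.
filtered = sample_filter([2.0, 1.0, 0.5, -1.0], temperature=0.8, top_p=0.8)
print(filtered)
```

Lower `top_p` or `temperature` values make generation more conservative; the card's choice of 0.8 for both is a common middle ground for open-ended text.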
For other questions or collaborations, please contact <internlm@pjlab.org.cn>. ## 简介 上海人工智能实验室与商汤科技联合香港中文大学和复旦大学正式推出书生·浦语200亿参数模型版本 InternLM-20B ,InternLM-20B 在超过 **2.3T** Tokens 包含高质量英文、中文和代码的数据上进行预训练,其中 Chat 版本还经过了 SFT 和 RLHF 训练,使其能够更好、更安全地满足用户的需求。 InternLM 20B 在模型结构上选择了深结构,层数设定为60层,超过常规7B和13B模型所使用的32层或者40层。在参数受限的情况下,提高层数有利于提高模型的综合能力。此外,相较于InternLM-7B,InternLM-20B使用的预训练数据经过了更高质量的清洗,并补充了高知识密度和用于强化理解与推理能力的训练数据。因此,它在理解能力、推理能力、数学能力、编程能力等考验语言模型技术水平的方面都得到了显著提升。总体而言,InternLM-20B具有以下的特点: - 优异的综合性能 - 很强的工具调用功能 - 支持16k语境长度(通过推理时外推) - 更好的价值对齐 ## 性能评测 在OpenCompass提出的5个能力维度上,InternLM-20B都取得很好的效果(粗体为13B-33B这个量级范围内,各项最佳成绩) | 能力维度 | Llama-13B | Llama2-13B | Baichuan2-13B | InternLM-20B | Llama-33B | Llama-65B | Llama2-70B | |----------|-----------|------------|---------------|--------------|-----------|-----------|------------| | 语言 | 42.5 | 47 | 47.5 | **55** | 44.6 | 47.1 | 51.6 | | 知识 | 58.2 | 58.3 | 48.9 | 60.1 | **64** | 66 | 67.7 | | 理解 | 45.5 | 50.9 | 58.1 | **67.3** | 50.6 | 54.2 | 60.8 | | 推理 | 42.7 | 43.6 | 44.2 | **54.9** | 46.4 | 49.8 | 55 | | 学科 | 37.3 | 45.2 | 51.8 | **62.5** | 47.4 | 49.7 | 57.3 | | 总平均 | 43.8 | 47.3 | 49.4 | **59.2** | 48.9 | 51.9 | 57.4 | 下表展示了在多个经典数据集上 InternLM 20B 与各个主流开源模型的表现 | | 评测集 | Llama-13B | Llama2-13B | Baichuan2-13B | InternLM-20B | Llama-33B | Llama-65B | Llama2-70B | |------|------------------|-----------|------------|---------------|--------------|-----------|-----------|------------| | 学科 | MMLU | 47.73 | 54.99 | 59.55 | **62.05** | 58.73 | 63.71 | 69.75 | | | C-Eval (val) | 31.83 | 41.4 | **59.01** | 58.8 | 37.47 | 40.36 | 50.13 | | | AGI-Eval | 22.03 | 30.93 | 37.37 | **44.58** | 33.53 | 33.92 | 40.02 | | 知识 | BoolQ | 78.75 | 82.42 | 67 | **87.46** | 84.43 | 86.61 | 87.74 | | | TriviaQA | 52.47 | 59.36 | 46.61 | 57.26 | **66.24** | 69.79 | 70.71 | | | NaturalQuestions | 20.17 | 24.85 | 16.32 | 25.15 | **30.89** | 33.41 | 34.16 | | 理解 | CMRC | 9.26 | 31.59 | 29.85 | **68.78** | 14.17 | 34.73 | 43.74 | | | CSL | 55 | 58.75 | 63.12 
| **65.62** | 57.5 | 59.38 | 60 | | | RACE (middle) | 53.41 | 63.02 | 68.94 | **86.35** | 64.55 | 72.35 | 81.55 | | | RACE (high) | 47.63 | 58.86 | 67.18 | **83.28** | 62.61 | 68.01 | 79.93 | | | XSum | 20.37 | 23.37 | 25.23 | **35.54** | 20.55 | 19.91 | 25.38 | | 推理 | WinoGrande | 64.64 | 64.01 | 67.32 | **69.38** | 66.85 | 69.38 | 69.77 | | | BBH | 37.93 | 45.62 | 48.98 | **52.51** | 49.98 | 58.38 | 64.91 | | | GSM8K | 20.32 | 29.57 | **52.62** | **52.62** | 42.3 | 54.44 | 63.31 | | | PIQA | 79.71 | 79.76 | 78.07 | 80.25 | **81.34** | 82.15 | 82.54 | | 编程 | HumanEval | 14.02 | 18.9 | 17.07 | **25.61** | 17.68 | 18.9 | 26.22 | | | MBPP | 20.6 | 26.8 | 30.8 | **35.6** | 28.4 | 33.6 | 39.6 | 总体而言,InternLM-20B 在综合能力上全面领先于13B量级的开源模型,同时在推理评测集上能够接近甚至超越Llama-65B的性能。 ## 通过 Transformers 加载 通过以下的代码加载 InternLM 20B 模型 ```python >>> from transformers import AutoTokenizer, AutoModelForCausalLM >>> tokenizer = AutoTokenizer.from_pretrained("internlm/internlm-20b", trust_remote_code=True) >>> model = AutoModelForCausalLM.from_pretrained("internlm/internlm-20b", trust_remote_code=True).cuda() >>> model = model.eval() >>> inputs = tokenizer(["来到美丽的大自然,我们发现"], return_tensors="pt") >>> for k,v in inputs.items(): inputs[k] = v.cuda() >>> gen_kwargs = {"max_length": 128, "top_p": 0.8, "temperature": 0.8, "do_sample": True, "repetition_penalty": 1.05} >>> output = model.generate(**inputs, **gen_kwargs) >>> output = tokenizer.decode(output[0].tolist(), skip_special_tokens=True) >>> print(output) 来到美丽的大自然,我们发现,这里不仅有大大小小的山川河流和树木花草,而且还有很多飞鸟走兽。我们最熟悉的就是鸟类了,它们有的在天上飞翔,有的在地上跳跃,还有的在树上栖息…… ``` **局限性:** 尽管在训练过程中我们非常注重模型的安全性,尽力促使模型输出符合伦理和法律要求的文本,但受限于模型大小以及概率生成范式,模型可能会产生各种不符合预期的输出,例如回复内容包含偏见、歧视等有害内容,请勿传播这些内容。由于传播不良信息导致的任何后果,本项目不承担责任。 ## 开源许可证 本仓库的代码依照 Apache-2.0 协议开源。模型权重对学术研究完全开放,也可申请免费的商业使用授权([申请表](https://wj.qq.com/s2/12725412/f7c1/))。其他问题与合作请联系 <internlm@pjlab.org.cn>。
13,068
[ [ -0.039581298828125, -0.0433349609375, 0.01169586181640625, 0.0204010009765625, -0.00960540771484375, -0.0007781982421875, -0.0181884765625, -0.046600341796875, -0.007091522216796875, 0.007328033447265625, -0.0267486572265625, -0.050567626953125, -0.0313720703125, 0.005359649658203125, -0.0234527587890625, 0.06903076171875, 0.0050048828125, 0.0155029296875, -0.01108551025390625, -0.01212310791015625, -0.022216796875, -0.0311126708984375, -0.04705810546875, -0.02252197265625, 0.0158538818359375, 0.010223388671875, 0.046661376953125, 0.03753662109375, 0.050079345703125, 0.0162353515625, -0.0247650146484375, 0.0133819580078125, -0.0240631103515625, -0.0189971923828125, 0.0145263671875, -0.0298004150390625, -0.045989990234375, -0.0022449493408203125, 0.0352783203125, 0.033355712890625, -0.01073455810546875, 0.041107177734375, 0.009521484375, 0.055206298828125, -0.0413818359375, 0.0034122467041015625, -0.027099609375, 0.004131317138671875, -0.0233306884765625, 0.0068359375, -0.0118408203125, -0.02532958984375, -0.0022754669189453125, -0.0413818359375, 0.01027679443359375, 0.0292205810546875, 0.0953369140625, 0.0205230712890625, -0.0086517333984375, 0.0022525787353515625, -0.040496826171875, 0.055450439453125, -0.08221435546875, 0.0214996337890625, 0.028411865234375, 0.0149688720703125, 0.0022296905517578125, -0.058258056640625, -0.047393798828125, -0.0242156982421875, -0.01959228515625, 0.0142364501953125, -0.022369384765625, -0.00881195068359375, 0.027069091796875, 0.0435791015625, -0.055023193359375, 0.0193939208984375, -0.046966552734375, -0.00864410400390625, 0.066650390625, 0.0308380126953125, 0.0186309814453125, -0.0148468017578125, -0.0308380126953125, -0.01287078857421875, -0.043670654296875, 0.032684326171875, 0.032623291015625, 0.0166473388671875, -0.0281829833984375, 0.035064697265625, -0.0292205810546875, 0.051910400390625, -0.0074462890625, 0.0028858184814453125, 0.042633056640625, -0.043426513671875, -0.024444580078125, -0.0088958740234375, 
0.07818603515625, 0.032196044921875, -0.005084991455078125, 0.009735107421875, -0.0146942138671875, -0.0142364501953125, 0.003131866455078125, -0.062408447265625, -0.02227783203125, 0.0254058837890625, -0.05499267578125, -0.0225982666015625, 0.00791168212890625, -0.058258056640625, 0.00927734375, -0.0240478515625, 0.0296630859375, -0.03973388671875, -0.0357666015625, -0.00936126708984375, -0.0007052421569824219, 0.033843994140625, 0.0162353515625, -0.053436279296875, 0.02410888671875, 0.033233642578125, 0.07879638671875, -0.0203857421875, -0.0226898193359375, 0.0025882720947265625, -0.0141143798828125, -0.03350830078125, 0.03936767578125, -0.02130126953125, -0.02484130859375, -0.0281219482421875, -0.0008606910705566406, -0.00931549072265625, -0.026458740234375, 0.02996826171875, -0.0220184326171875, 0.0268707275390625, -0.019378662109375, -0.032867431640625, -0.0257415771484375, 0.029052734375, -0.03729248046875, 0.10382080078125, 0.013824462890625, -0.07684326171875, 0.021026611328125, -0.07275390625, 0.0019092559814453125, -0.01058197021484375, 0.005580902099609375, -0.046661376953125, -0.013580322265625, 0.019134521484375, 0.0256500244140625, -0.0386962890625, 0.024017333984375, -0.02642822265625, -0.031005859375, -0.0030536651611328125, -0.0307159423828125, 0.08978271484375, 0.034454345703125, -0.0418701171875, 0.023590087890625, -0.0804443359375, 0.0094757080078125, 0.031768798828125, -0.022674560546875, -0.012420654296875, -0.015350341796875, 0.004405975341796875, 0.0139312744140625, 0.0230865478515625, -0.033203125, 0.0140533447265625, -0.039276123046875, 0.045501708984375, 0.059417724609375, -0.007770538330078125, 0.019378662109375, -0.032562255859375, 0.036468505859375, 0.01348114013671875, 0.03460693359375, 0.0030765533447265625, -0.037078857421875, -0.060821533203125, -0.0308990478515625, 0.02398681640625, 0.038848876953125, -0.0216064453125, 0.03912353515625, -0.0043487548828125, -0.059326171875, -0.0469970703125, -0.007427215576171875, 
0.033233642578125, 0.049163818359375, 0.04193115234375, 0.00028204917907714844, -0.042755126953125, -0.0791015625, -0.002620697021484375, -0.016693115234375, 0.0059051513671875, 0.03143310546875, 0.0640869140625, -0.0082855224609375, 0.06927490234375, -0.04266357421875, -0.0090484619140625, -0.0092315673828125, -0.004604339599609375, 0.039337158203125, 0.05401611328125, 0.055572509765625, -0.049713134765625, -0.052398681640625, 0.00002968311309814453, -0.06134033203125, 0.0030460357666015625, 0.0191497802734375, -0.01251983642578125, 0.0233917236328125, 0.01971435546875, -0.0543212890625, 0.037811279296875, 0.036224365234375, -0.02764892578125, 0.07635498046875, -0.021392822265625, -0.001880645751953125, -0.07720947265625, 0.037139892578125, -0.0114593505859375, 0.006427764892578125, -0.041595458984375, 0.00272369384765625, 0.00991058349609375, -0.003200531005859375, -0.0386962890625, 0.06109619140625, -0.0380859375, 0.0012273788452148438, 0.01195526123046875, -0.01171112060546875, -0.0030422210693359375, 0.04840087890625, -0.0091552734375, 0.0693359375, 0.05609130859375, -0.041839599609375, 0.01264190673828125, 0.0229949951171875, -0.03558349609375, 0.017333984375, -0.059234619140625, 0.000690460205078125, -0.004634857177734375, 0.0149688720703125, -0.0701904296875, -0.0105438232421875, 0.02557373046875, -0.03814697265625, 0.0206298828125, 0.00945281982421875, -0.026214599609375, -0.051666259765625, -0.030059814453125, 0.0216064453125, 0.04388427734375, -0.032470703125, 0.0243682861328125, 0.004581451416015625, 0.00264739990234375, -0.0552978515625, -0.046295166015625, -0.01224517822265625, -0.022216796875, -0.048309326171875, 0.0279388427734375, -0.0145416259765625, -0.0253448486328125, 0.01393890380859375, 0.00469970703125, 0.0089263916015625, 0.015533447265625, 0.0189971923828125, 0.044097900390625, -0.0223236083984375, -0.007152557373046875, -0.0167236328125, -0.00341033935546875, 0.00514984130859375, -0.007598876953125, 0.04925537109375, -0.0295562744140625, 
-0.0243988037109375, -0.03692626953125, 0.00617218017578125, 0.04248046875, -0.038848876953125, 0.07183837890625, 0.0474853515625, -0.01702880859375, -0.00244903564453125, -0.02935791015625, -0.00891876220703125, -0.032684326171875, 0.029571533203125, -0.02520751953125, -0.0419921875, 0.0511474609375, 0.0226898193359375, 0.024200439453125, 0.06005859375, 0.041595458984375, -0.0037784576416015625, 0.08184814453125, 0.0287933349609375, -0.01023101806640625, 0.0245361328125, -0.056549072265625, 0.00733184814453125, -0.08331298828125, -0.039581298828125, -0.049560546875, -0.0263519287109375, -0.042327880859375, -0.0158233642578125, 0.0263671875, 0.0100250244140625, -0.057647705078125, 0.0199432373046875, -0.0615234375, 0.0067596435546875, 0.067138671875, 0.01108551025390625, 0.0113067626953125, -0.01384735107421875, -0.02642822265625, 0.005634307861328125, -0.053466796875, -0.031158447265625, 0.10308837890625, 0.0290985107421875, 0.043975830078125, -0.0004315376281738281, 0.044647216796875, 0.0089111328125, 0.0249786376953125, -0.027435302734375, 0.03369140625, -0.0025787353515625, -0.06134033203125, -0.0229949951171875, -0.029052734375, -0.05877685546875, 0.03375244140625, -0.01390838623046875, -0.05810546875, 0.01525115966796875, 0.00934600830078125, -0.03521728515625, 0.037322998046875, -0.037322998046875, 0.0587158203125, -0.0310211181640625, -0.034637451171875, -0.002933502197265625, -0.05389404296875, 0.041259765625, 0.005847930908203125, 0.020660400390625, -0.0187225341796875, 0.00702667236328125, 0.06591796875, -0.039947509765625, 0.044219970703125, -0.0170135498046875, 0.0232696533203125, 0.01727294921875, -0.01050567626953125, 0.042083740234375, -0.005634307861328125, -0.0208587646484375, 0.0416259765625, 0.005462646484375, -0.0270538330078125, -0.016876220703125, 0.0450439453125, -0.0811767578125, -0.0517578125, -0.04217529296875, -0.03851318359375, -0.0005922317504882812, 0.0298309326171875, 0.040496826171875, 0.0197296142578125, -0.0016832351684570312, 
0.032745361328125, 0.039093017578125, -0.02679443359375, 0.0447998046875, 0.0389404296875, -0.0220489501953125, -0.0399169921875, 0.058349609375, 0.014892578125, 0.0163726806640625, 0.01511383056640625, 0.0243682861328125, -0.0301971435546875, -0.055023193359375, -0.035919189453125, 0.0144805908203125, -0.033203125, -0.0225372314453125, -0.058197021484375, -0.036407470703125, -0.03338623046875, -0.0013055801391601562, -0.02740478515625, -0.019683837890625, -0.036468505859375, -0.003208160400390625, 0.0278778076171875, 0.03436279296875, -0.00977325439453125, 0.005523681640625, -0.03985595703125, 0.0131072998046875, 0.0033893585205078125, 0.0199432373046875, 0.01708984375, -0.050872802734375, -0.015472412109375, 0.006847381591796875, -0.02947998046875, -0.043853759765625, 0.043975830078125, 0.0018033981323242188, 0.056549072265625, 0.0340576171875, 0.0117645263671875, 0.06201171875, -0.0231170654296875, 0.0782470703125, 0.02276611328125, -0.059722900390625, 0.04205322265625, -0.017425537109375, 0.0165557861328125, 0.033660888671875, 0.034912109375, -0.02008056640625, -0.02838134765625, -0.0587158203125, -0.072265625, 0.06805419921875, 0.0256195068359375, -0.00923919677734375, 0.0202178955078125, 0.01367950439453125, -0.0028896331787109375, 0.0203857421875, -0.06317138671875, -0.0400390625, -0.0280303955078125, -0.0176544189453125, 0.001682281494140625, -0.0026988983154296875, -0.00897216796875, -0.040924072265625, 0.062225341796875, 0.005939483642578125, 0.0243988037109375, 0.0182647705078125, 0.004047393798828125, 0.004840850830078125, -0.00005328655242919922, 0.0439453125, 0.061676025390625, -0.029388427734375, -0.0028171539306640625, 0.021728515625, -0.0386962890625, 0.0008449554443359375, 0.01137542724609375, -0.030242919921875, -0.01324462890625, 0.0304718017578125, 0.07196044921875, -0.004459381103515625, -0.040985107421875, 0.022308349609375, 0.006679534912109375, -0.0182037353515625, -0.03125, 0.0178985595703125, 0.00787353515625, 0.01515960693359375, 
0.0228424072265625, 0.01245880126953125, 0.00499725341796875, -0.03033447265625, -0.00962066650390625, 0.042327880859375, -0.01593017578125, -0.0234832763671875, 0.0794677734375, 0.002979278564453125, -0.02459716796875, 0.05389404296875, -0.0007953643798828125, -0.04034423828125, 0.06494140625, 0.033905029296875, 0.051239013671875, -0.01904296875, -0.0013704299926757812, 0.06512451171875, 0.026092529296875, -0.0135345458984375, 0.022918701171875, 0.004093170166015625, -0.032135009765625, 0.0009613037109375, -0.0653076171875, -0.022186279296875, 0.0255889892578125, -0.051300048828125, 0.0217132568359375, -0.02691650390625, -0.017120361328125, -0.01042938232421875, 0.0252685546875, -0.061248779296875, 0.0160675048828125, 0.003742218017578125, 0.0875244140625, -0.048004150390625, 0.059112548828125, 0.0477294921875, -0.04583740234375, -0.074462890625, -0.0224456787109375, -0.0088348388671875, -0.061279296875, 0.0540771484375, 0.01483154296875, 0.0189666748046875, 0.007152557373046875, -0.054229736328125, -0.09075927734375, 0.10540771484375, 0.01122283935546875, -0.05010986328125, -0.01314544677734375, 0.005161285400390625, 0.042022705078125, -0.005565643310546875, 0.03900146484375, 0.034820556640625, 0.044097900390625, -0.0065460205078125, -0.0755615234375, 0.013427734375, -0.04052734375, 0.005615234375, 0.0018177032470703125, -0.07525634765625, 0.09881591796875, -0.01557159423828125, 0.007904052734375, -0.0003829002380371094, 0.048797607421875, 0.04534912109375, 0.0254669189453125, 0.034149169921875, 0.052398681640625, 0.05731201171875, -0.00858306884765625, 0.07293701171875, -0.04339599609375, 0.040771484375, 0.06610107421875, 0.0013275146484375, 0.05877685546875, 0.01410675048828125, -0.0467529296875, 0.03729248046875, 0.06732177734375, -0.0218048095703125, 0.026153564453125, 0.00563812255859375, -0.003448486328125, -0.0023403167724609375, 0.01216888427734375, -0.049957275390625, 0.02471923828125, 0.0086669921875, -0.0198516845703125, -0.0036067962646484375, 
0.0011415481567382812, 0.00677490234375, -0.0214996337890625, -0.0195465087890625, 0.0401611328125, 0.0005340576171875, -0.03826904296875, 0.06317138671875, -0.0003476142883300781, 0.051513671875, -0.055633544921875, 0.00475311279296875, -0.0217132568359375, 0.01062774658203125, -0.0288238525390625, -0.05438232421875, 0.01102447509765625, 0.00736236572265625, -0.006702423095703125, 0.001834869384765625, 0.04132080078125, -0.01071929931640625, -0.04547119140625, 0.02313232421875, 0.0196533203125, 0.01409149169921875, 0.00011020898818969727, -0.06475830078125, 0.0015459060668945312, 0.0017671585083007812, -0.053009033203125, 0.0207977294921875, 0.01474761962890625, -0.003818511962890625, 0.057769775390625, 0.067626953125, 0.01461029052734375, 0.0120849609375, -0.0168914794921875, 0.07513427734375, -0.05145263671875, -0.03912353515625, -0.06707763671875, 0.046478271484375, -0.014495849609375, -0.039276123046875, 0.071044921875, 0.061431884765625, 0.05877685546875, 0.006328582763671875, 0.05450439453125, -0.0288238525390625, 0.0271759033203125, -0.030548095703125, 0.06500244140625, -0.059661865234375, 0.006633758544921875, -0.005611419677734375, -0.0645751953125, -0.01519775390625, 0.045379638671875, -0.0259552001953125, 0.004528045654296875, 0.03277587890625, 0.06005859375, 0.0074462890625, -0.0051116943359375, 0.01410675048828125, 0.02978515625, 0.032806396484375, 0.0433349609375, 0.051849365234375, -0.03070068359375, 0.039581298828125, -0.044464111328125, -0.01140594482421875, -0.0243682861328125, -0.04638671875, -0.05987548828125, -0.037139892578125, -0.013763427734375, -0.0245819091796875, -0.0122222900390625, 0.0802001953125, 0.05816650390625, -0.06524658203125, -0.01690673828125, -0.00933837890625, -0.00333404541015625, -0.032379150390625, -0.0196533203125, 0.052093505859375, -0.0191497802734375, -0.0606689453125, 0.01020050048828125, 0.0023555755615234375, 0.008544921875, -0.036102294921875, -0.023284912109375, -0.0219268798828125, 0.0021305084228515625, 
0.0299224853515625, 0.0157318115234375, -0.057037353515625, -0.0010156631469726562, 0.006572723388671875, -0.01177215576171875, 0.0176544189453125, 0.0150146484375, -0.042633056640625, 0.0236968994140625, 0.03424072265625, 0.0200958251953125, 0.047821044921875, 0.006069183349609375, 0.01143646240234375, -0.0220947265625, 0.016326904296875, 0.0203399658203125, 0.0250396728515625, 0.0093231201171875, -0.030181884765625, 0.052398681640625, 0.01934814453125, -0.04632568359375, -0.083984375, -0.0197906494140625, -0.0657958984375, -0.02825927734375, 0.0810546875, -0.01509857177734375, -0.0286102294921875, 0.00830841064453125, -0.01299285888671875, 0.01849365234375, -0.027862548828125, 0.05438232421875, 0.044677734375, -0.02362060546875, -0.0009794235229492188, -0.0557861328125, 0.028106689453125, 0.016143798828125, -0.06573486328125, -0.005771636962890625, 0.019012451171875, 0.031585693359375, 0.0120391845703125, 0.05072021484375, -0.01291656494140625, 0.00928497314453125, 0.009033203125, 0.0187225341796875, -0.04083251953125, -0.01267242431640625, -0.00971221923828125, 0.013214111328125, -0.003631591796875, -0.0262298583984375 ] ]
kfkas/Llama-2-ko-7b-Chat
2023-09-20T08:21:34.000Z
[ "transformers", "pytorch", "llama", "text-generation", "facebook", "meta", "llama-2", "kollama", "llama-2-ko", "llama-2-ko-chat", "en", "ko", "arxiv:2307.09288", "has_space", "text-generation-inference", "region:us" ]
text-generation
kfkas
null
null
kfkas/Llama-2-ko-7b-Chat
51
9,454
transformers
2023-07-25T11:56:27
--- language: - en - ko pipeline_tag: text-generation inference: false tags: - facebook - meta - pytorch - llama - llama-2 - kollama - llama-2-ko - llama-2-ko-chat --- # **Llama-2-Ko-Chat** 🦙🇰🇷 <img src=https://github.com/boostcampaitech5/level2_klue-nlp-08/assets/96530685/9dc3a02a-f554-42c7-9aa7-e7bd8c8870b7 style="max-width: 700px; width: 100%" /> Llama-2-Ko-7b-Chat is built on top of [beomi/llama-2-ko-7b 40B](https://huggingface.co/beomi/llama-2-ko-7b). It was fine-tuned on the [nlpai-lab/kullm-v2](https://huggingface.co/datasets/nlpai-lab/kullm-v2) dataset. Training is still in progress, and further training is planned as [beomi/llama-2-ko-7b](https://huggingface.co/beomi/llama-2-ko-7b) is updated. This model was created as part of the [Naver BoostCamp NLP-08](https://github.com/boostcampaitech5/level3_nlp_finalproject-nlp-08) project. Project link: [LLaMA](https://github.com/boostcampaitech5/level3_nlp_finalproject-nlp-08) ## Model Details **Backbone Model** beomi/llama-2-ko-7b **Model Developers** [Taemin Kim (kfkas)](https://github.com/taemin6697), [Juwon Kim (uomnf97)](https://github.com/Kim-Ju-won) **Data** nlpai-lab/kullm-v2 **Prompts** kullm **Input** Models input text only. **Output** Models generate text only. --- | Model | Prompt: "캠핑 여행에 필요한 10가지 품목의 목록을 생성합니다." (Generate a list of 10 items needed for a camping trip.) | |--------------------|---| | Llama-2-7b-chat-hf | `[Hello! As a helpful and respectful assistant, I'd be happy to help you with your camping trip.
To ensure a safe and enjoyable experience, here is a list of 10 essential items you may need for your camping trip:Tent: A sturdy, waterproof tent to provide shelter and protection from the elements.Sleeping Bag: A warm, insulated sleeping bag to keep you cozy during the night.Sleeping Pad: A comfortable sleeping pad to provide extra cushioning and insulation.Backpack: A sturdy, comfortable backpack to carry all your gear.Clothing: Pack layers of breathable, moisture-wicking clothing to stay warm and dry.Footwear: Sturdy, waterproof hiking boots or shoes to protect your feet and provide traction.Headlamp: A reliable, battery-powered headlamp to light your way during the night.First Aid Kit: A well-stocked first aid kit to treat any injuries or illnesses.Water Bottle: A refillable, BPA-free water bottle to stay hydrated.Personal Hygiene Items: Toothbrush, toothpaste, biodegradable soap, and toilet paper to maintain personal hygiene.]` | | Llama-2-ko | `[3일용 캠핑장 용품 - 의류 (5일) : 속옷 , 잠옷, 긴팔 및 반팔이 필요하므로 준비하기!\u200b- 식량 (9일 이상)- 연료 (6일 사용 예정이지만 최소 7 일분 보유해야 합니다.)\u200b- 텐트/거주지역 : 20 ~ 40 평방 미터가 필요합니다.- 물 탱크 (물탱커 / 수족관 )\u200b캠핑에 사용할 8 명 (남자 4명 / 여자 4명이 필요합니다!).이 작업에는 두 개의 조건 부가가 있습니다.(조건 1) 모든 아이템 목록에서 중복 제거를위한 1 번 반복.이 옵션은 중복을 식별하여 제거할뿐만 아니라 추가 항목이 포함 된 카운트를 재설정 할 때 유용하군요.(조건 = "모든 아이템 리스트에서 한 개 이상의 동일 아이템" 이거나 "(모든 항목)"에서 동일한 숫자 값으로 식별 된)\u200b(조건 2) 작업시 각 행이 하나의 아이템 만 포함 할 필요가 없으므로 아이템이 서로 다른 선행자 로부터 여러 번]` | | Llama-2-ko-Chat | `[1) 텐트 - 가장 기본적이면서 유용한 장비 중 하나이며 캠핑 여행을 편안하게 해줍니다.2) 베개 및 침대 시트와 같이 푹신한 수면 용품을 챙기는 것도 잊지 마세요!3) 매트리스 - 편안함과 지지력을 제공하여 잠을 잘 자게 합니다.4) 의류 - 땀을 흘리고 휴식하는 시간을 늘리기 위해 방수재질이 포함된 긴팔옷과 반바지를 챙겨요5) 식기 세트 - 야외 조리에서 음식을 먹으려면 반드시 챙겨야 할 도구지요.6) 스토브와 연료 또는 휴대용 가스 렌턴 등 열 기구를 가져옵니다; 음식 조리에 매우 편리하게 사용할 수 있습니다.7) 랜턴, 플래시 또는 LED 스탠드를 준비하세요! 어두운 밤에 더 안전하도록 돕고, 야외에서 요리하고 놀 때 필수적인 필수 소품이 될 것입니다.8) 손전등 - 밤늦게 이동할 수도 있고 숲속 오솔길을 걸을 때도 이용할 것이므로 반드시 가지고 있어야 해요.9) 구급약상자 - 다양한 상해에 대비해서 알칼리 미스트, 소독제, 항생제, 연고 등 일반적이고 인기 있는 구급 약물을 항상 비축해야 해요. 
캠프에서는 응급 상황에 처하더라도 쉽게 찾을 수 있도록 상자에 담습니다.10) 휴대폰 보조 배터리/충전기 또는 충전 가능한 외부 콘센트나 연결된 외부 전원장치 - 이것은 스마트폰이나 스마트 워치를 위한 필수품일 뿐만 아니라 다른 사람들이 필요로 하는 상황에서 긴급 연락이 되도록 도와 줄 수도 있다는 점 기억하세요.]` | --- | Model | Prompt: "삼원색에 대해 알려줘" (Tell me about the three primary colors) | |--------------------|---| | Llama-2-7b-chat-hf | `[Regarding your question, "삼원색에 대해 알려줘" (which means "Tell me about trichromacy" in Korean), I can provide some information. Trichromacy is a concept in vision science that refers to the ability of the human visual system to perceive and distinguish between three primary colors: red, green, and blue...]` | | Llama-2-ko | `[C1, R0.625 G8 B275\u200b3-4, 삼각형 그리기.\u200b아래에는 삼각형을 그림니다 (그림을 그려야 할 영역의 좌표를 지정) - 명령어와 함께 사용되는 문자열로 입력할 좌표 값과 동일한 결과를 얻음 - 예를 들어 "X=-19. Y=0" 으로 입력을 하면 좌표값을 지정된 값과 같게 출력을 한다. ##########################\u200b##3_0-6-1, 좌표 입력으로 텍스트 파일을 생성하라(ex C4 D6 E10 A10 )-명령문에서 각 문자를 입력과 동일하다. -이 명령은 현재 수행중인 프로그램을 종료한다.-파일은 실행 가능한 파일과 일치할]` | | Llama-2-ko-Chat | `[물체를 인식하고 색상에 대한 시각적인 이미지를 생성하기 위해 두 개의 눈은 서로 반대쪽 방향에서 물체의 색상 정보와 명암 정보를 받아들이기 때문에, 인간은 일반적으로 색상을 삼원색이라고 하는 세 가지 원추 형태(빨강, 녹색, 파랑색)로 분류합니다. 이 중 한가지 색상이 다른 색상보다 눈에 훨씬 더 빨리 들어오는 경향이 있다고 합니다. 하지만 모든 사람이 그렇지는 않으므로 항상 삼각형 모양으로 색상을 분류하지는 않습니다. 하지만 삼원색이 우리 눈에 잘 전달되며 색상 구별에 중요하다는 것은 부정할 수 없습니다.]` | --- ## Training Progress <img src=https://github.com/taemin6697/Paper_Review/assets/96530685/b9a697a2-ef06-4b1c-97e1-e72b20d9a8b5 style="max-width: 700px; width: 100%" /> --- ### Inference ```python import torch from transformers import AutoModelForCausalLM, AutoTokenizer def gen(x, model, tokenizer, device): prompt = ( f"아래는 작업을 설명하는 명령어입니다.
요청을 적절히 완료하는 응답을 작성하세요.\n\n### 명령어:\n{x}\n\n### 응답:" ) len_prompt = len(prompt) gened = model.generate( **tokenizer(prompt, return_tensors="pt", return_token_type_ids=False).to( device ), max_new_tokens=1024, early_stopping=True, do_sample=True, top_k=20, top_p=0.92, no_repeat_ngram_size=3, eos_token_id=2, repetition_penalty=1.2, num_beams=3 ) return tokenizer.decode(gened[0])[len_prompt:] def LLM_infer(input): device = ( torch.device("cuda:0") if torch.cuda.is_available() else torch.device("cpu") ) model_id = "kfkas/Llama-2-ko-7b-Chat" model = AutoModelForCausalLM.from_pretrained( model_id, device_map={"": 0}, torch_dtype=torch.float16, low_cpu_mem_usage=True ) tokenizer = AutoTokenizer.from_pretrained(model_id) model.eval() model.config.use_cache = True tokenizer.pad_token = tokenizer.eos_token output = gen(input, model=model, tokenizer=tokenizer, device=device) return output if __name__ == "__main__": text = LLM_infer("삼원색에 대해 알려줘") print(text) ``` ## Note for oobabooga/text-generation-webui Remove the `ValueError` handler in the `load_tokenizer` function (around line 109) in `modules/models.py`: ```diff diff --git a/modules/models.py b/modules/models.py index 232d5fa..de5b7a0 100644 --- a/modules/models.py +++ b/modules/models.py @@ -106,7 +106,7 @@ def load_tokenizer(model_name, model): trust_remote_code=shared.args.trust_remote_code, use_fast=False ) - except ValueError: + except: tokenizer = AutoTokenizer.from_pretrained( path_to_model, trust_remote_code=shared.args.trust_remote_code, ``` Since Llama-2-Ko uses the FastTokenizer provided by the HF tokenizers package rather than sentencepiece, the `use_fast=True` option is required when initializing the tokenizer. Apple Silicon does not support BF16 computation; use the CPU instead. (BF16 is supported on NVIDIA GPUs.) --- > Below is the original model card of the Llama-2 model. # **Llama 2** Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters.
This is the repository for the 7B pretrained model, converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom. ## Model Details *Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.* Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM. **Model Developers** Meta **Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations. **Input** Models input text only. **Output** Models generate text only. **Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety. ||Training Data|Params|Content Length|GQA|Tokens|LR| |---|---|---|---|---|---|---| |Llama 2|*A new mix of publicly available online data*|7B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|13B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|70B|4k|&#10004;|2.0T|1.5 x 10<sup>-4</sup>| *Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch-size of 4M tokens. 
Bigger models (70B) use Grouped-Query Attention (GQA) for improved inference scalability. **Model Dates** Llama 2 was trained between January 2023 and July 2023. **Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback. **License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) **Research Paper** ["Llama-2: Open Foundation and Fine-tuned Chat Models"](https://arxiv.org/abs/2307.09288) ## Intended Use **Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks. To get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespaces and breaklines in between (we recommend calling `strip()` on inputs to avoid double-spaces). See our reference code in github for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212). **Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2. ## Hardware and Software **Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute. **Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W).
Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program. ||Time (GPU hours)|Power Consumption (W)|Carbon Emitted (tCO<sub>2</sub>eq)| |---|---|---|---| |Llama 2 7B|184320|400|31.22| |Llama 2 13B|368640|400|62.44| |Llama 2 70B|1720320|400|291.42| |Total|3311616||539.00| **CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others. ## Training Data **Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data. **Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023. ## Evaluation Results In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval| |---|---|---|---|---|---|---|---|---|---| |Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9| |Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9| |Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7| |Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6| |Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3| |Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1| |Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**| **Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1. |||TruthfulQA|Toxigen| |---|---|---|---| |Llama 1|7B|27.42|23.00| |Llama 1|13B|41.74|23.08| |Llama 1|33B|44.19|22.57| |Llama 1|65B|48.71|21.77| |Llama 2|7B|33.29|**21.25**| |Llama 2|13B|41.86|26.10| |Llama 2|70B|**50.18**|24.60| **Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better). |||TruthfulQA|Toxigen| |---|---|---|---| |Llama-2-Chat|7B|57.04|**0.00**| |Llama-2-Chat|13B|62.18|**0.00**| |Llama-2-Chat|70B|**64.14**|0.01| **Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above. 
## Ethical Considerations and Limitations Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model. Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/) ## Reporting Issues Please report any software “bug,” or other problems with the models through one of the following means: - Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama) - Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback) - Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info) ## Llama Model Index |Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf| |---|---|---|---|---| |7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)| |13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat-hf)| |70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) |
[Link](https://huggingface.co/llamaste/Llama-2-70b-chat-hf)|
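As a closing note on the Inference example earlier in this card: the kullm-style prompt handling inside `gen()` can be factored into standalone helpers. The sketch below uses hypothetical names (`build_prompt`, `extract_response`) that are not part of the released code; it only mirrors the template string and the decode-slicing step shown in `gen()`.

```python
# Sketch of the kullm-style prompt handling used in gen() above.
# build_prompt / extract_response are hypothetical helper names,
# not part of the released model code.

PROMPT_TEMPLATE = (
    "아래는 작업을 설명하는 명령어입니다. "
    "요청을 적절히 완료하는 응답을 작성하세요."
    "\n\n### 명령어:\n{instruction}\n\n### 응답:"
)

def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the kullm prompt format fed to the model."""
    return PROMPT_TEMPLATE.format(instruction=instruction)

def extract_response(decoded: str, prompt: str) -> str:
    """Strip the echoed prompt from a decoded generation, mirroring
    tokenizer.decode(gened[0])[len_prompt:] in gen()."""
    return decoded[len(prompt):]

if __name__ == "__main__":
    prompt = build_prompt("삼원색에 대해 알려줘")
    print(prompt)
```

Chat models are generally sensitive to the exact prompt format used during fine-tuning, so centralizing the template in one place helps keep inference-time prompts consistent with training.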
18,477
[ [ -0.03692626953125, -0.003864288330078125, 0.054443359375, 0.041046142578125, -0.037506103515625, 0.015472412109375, 0.0222015380859375, -0.02508544921875, 0.1068115234375, 0.050323486328125, -0.035064697265625, -0.0211944580078125, -0.003780364990234375, 0.0023784637451171875, -0.02276611328125, 0.054443359375, -0.02264404296875, 0.0341796875, 0.01690673828125, -0.0106964111328125, -0.06402587890625, 0.01348114013671875, -0.0521240234375, -0.05242919921875, 0.047515869140625, 0.01277923583984375, 0.035400390625, 0.055145263671875, 0.05096435546875, -0.022430419921875, 0.025909423828125, -0.007183074951171875, -0.03759765625, -0.0006303787231445312, -0.0154571533203125, -0.01385498046875, -0.01174163818359375, -0.0253448486328125, 0.028533935546875, 0.06329345703125, -0.024993896484375, -0.0199432373046875, -0.03466796875, 0.050689697265625, -0.0740966796875, 0.044830322265625, -0.01230621337890625, 0.00988006591796875, -0.013641357421875, -0.0198822021484375, -0.005443572998046875, -0.026641845703125, 0.0030956268310546875, -0.06878662109375, 0.0452880859375, -0.0205230712890625, 0.05096435546875, -0.015045166015625, -0.0259552001953125, 0.038055419921875, -0.0479736328125, 0.051422119140625, 0.0092315673828125, 0.0279388427734375, 0.034393310546875, 0.03948974609375, -0.0186614990234375, -0.0426025390625, -0.004367828369140625, 0.01062774658203125, 0.003925323486328125, 0.033416748046875, 0.03228759765625, 0.0096588134765625, 0.005710601806640625, 0.0128326416015625, -0.0295867919921875, 0.0189361572265625, -0.048065185546875, 0.022369384765625, 0.042388916015625, -0.00315093994140625, 0.052978515625, -0.034515380859375, -0.026702880859375, 0.016326904296875, -0.0333251953125, -0.015228271484375, -0.0011234283447265625, 0.020599365234375, -0.027130126953125, 0.05810546875, -0.0265350341796875, -0.00009649991989135742, 0.06121826171875, 0.0012044906616210938, 0.028656005859375, -0.0312042236328125, -0.05078125, 0.0472412109375, 0.020111083984375, 
0.034576416015625, 0.047393798828125, 0.0039520263671875, -0.034881591796875, -0.0211181640625, 0.047760009765625, -0.0284576416015625, -0.041107177734375, 0.017059326171875, -0.06622314453125, -0.0007004737854003906, 0.013397216796875, 0.0043487548828125, -0.039764404296875, 0.0071868896484375, 0.044036865234375, -0.0029544830322265625, -0.045745849609375, -0.00894927978515625, -0.055877685546875, 0.03564453125, 0.002063751220703125, -0.071533203125, 0.043304443359375, 0.0284576416015625, 0.0318603515625, 0.04168701171875, 0.01593017578125, -0.053863525390625, 0.0019321441650390625, -0.04132080078125, 0.05718994140625, -0.0304412841796875, -0.0212554931640625, -0.005138397216796875, 0.0233612060546875, 0.0034580230712890625, -0.049285888671875, 0.08758544921875, -0.0204925537109375, -0.0179595947265625, 0.019561767578125, 0.0082550048828125, 0.0211334228515625, -0.00745391845703125, -0.0433349609375, 0.0264434814453125, 0.03765869140625, -0.04266357421875, 0.0291595458984375, -0.0457763671875, -0.052001953125, 0.004657745361328125, -0.0011491775512695312, -0.0258026123046875, 0.0308990478515625, 0.007801055908203125, 0.0238037109375, 0.0121002197265625, -0.0214996337890625, -0.034942626953125, -0.03302001953125, 0.00811004638671875, 0.00490570068359375, 0.08880615234375, 0.02001953125, -0.033355712890625, -0.006378173828125, -0.057403564453125, -0.052520751953125, -0.00653839111328125, 0.01024627685546875, -0.03173828125, 0.0170440673828125, 0.0146331787109375, 0.04412841796875, 0.0106964111328125, -0.056915283203125, 0.01474761962890625, -0.029510498046875, 0.0000814199447631836, 0.03863525390625, 0.0247344970703125, 0.014923095703125, -0.0447998046875, 0.03594970703125, 0.036376953125, 0.0252685546875, -0.01367950439453125, -0.0295562744140625, -0.0299224853515625, -0.0478515625, -0.0034389495849609375, 0.008514404296875, -0.041168212890625, -0.005115509033203125, 0.00445556640625, -0.0816650390625, 0.002574920654296875, -0.037872314453125, 0.006168365478515625, 
0.007663726806640625, 0.00591278076171875, -0.00144195556640625, -0.0247955322265625, -0.054351806640625, -0.046356201171875, -0.038482666015625, -0.003787994384765625, 0.042755126953125, 0.031280517578125, 0.031829833984375, 0.032623291015625, -0.0450439453125, -0.0428466796875, -0.0242767333984375, 0.00928497314453125, 0.01476287841796875, 0.0210418701171875, 0.037384033203125, -0.086669921875, -0.058197021484375, -0.050872802734375, -0.06494140625, -0.025421142578125, -0.0166015625, -0.0182037353515625, -0.0116119384765625, 0.023406982421875, -0.1025390625, 0.054168701171875, 0.0299835205078125, -0.0511474609375, 0.06744384765625, 0.0006976127624511719, -0.003032684326171875, -0.0533447265625, -0.0263519287109375, 0.1024169921875, -0.037139892578125, -0.0228729248046875, 0.03515625, -0.0121002197265625, -0.03326416015625, -0.0260009765625, 0.047760009765625, -0.034423828125, -0.0221099853515625, -0.049957275390625, -0.0253753662109375, -0.0230560302734375, 0.0205230712890625, -0.0036716461181640625, 0.043487548828125, 0.04644775390625, -0.04150390625, 0.044403076171875, -0.0282440185546875, -0.00628662109375, 0.05169677734375, -0.01837158203125, 0.003948211669921875, 0.0053558349609375, -0.00650787353515625, -0.06591796875, -0.0328369140625, 0.0213623046875, -0.029388427734375, 0.00009125471115112305, -0.022430419921875, 0.0010232925415039062, -0.058197021484375, -0.03668212890625, 0.01418304443359375, 0.07244873046875, -0.05096435546875, 0.0203094482421875, 0.00000852346420288086, -0.01451873779296875, -0.02008056640625, -0.037872314453125, 0.01486968994140625, -0.016204833984375, -0.034515380859375, 0.06549072265625, -0.0209197998046875, -0.0229034423828125, -0.011383056640625, -0.0241241455078125, -0.0635986328125, 0.00948333740234375, 0.049346923828125, -0.00788116455078125, -0.017547607421875, -0.0194549560546875, -0.01396942138671875, -0.0343017578125, 0.0113525390625, -0.041900634765625, 0.06951904296875, 0.0203094482421875, -0.02130126953125, 
-0.01348114013671875, 0.04412841796875, 0.0457763671875, 0.038818359375, 0.0227203369140625, 0.061676025390625, -0.0179290771484375, 0.020904541015625, 0.01306915283203125, -0.02740478515625, -0.03692626953125, -0.0235443115234375, -0.01059722900390625, -0.057647705078125, 0.00099945068359375, 0.0290985107421875, -0.0018253326416015625, 0.056915283203125, -0.00919342041015625, -0.032928466796875, 0.07763671875, 0.0302581787109375, 0.03961181640625, 0.00319671630859375, -0.0231781005859375, 0.0153045654296875, -0.11688232421875, -0.00885009765625, -0.03778076171875, -0.0247039794921875, -0.0274200439453125, -0.0228424072265625, 0.06005859375, 0.0638427734375, -0.016510009765625, 0.049530029296875, -0.039398193359375, 0.041534423828125, 0.0199432373046875, 0.0301055908203125, 0.0014057159423828125, -0.021881103515625, 0.0223236083984375, -0.02288818359375, -0.00628662109375, -0.0369873046875, 0.049346923828125, 0.0174713134765625, 0.0948486328125, 0.00811767578125, 0.060546875, 0.01393890380859375, 0.0694580078125, -0.0389404296875, 0.0181427001953125, -0.051422119140625, -0.037994384765625, -0.0269317626953125, -0.0213775634765625, -0.059722900390625, -0.005794525146484375, 0.0215911865234375, -0.06744384765625, 0.059814453125, -0.0048980712890625, -0.039764404296875, 0.01367950439453125, -0.0472412109375, 0.06591796875, -0.012115478515625, 0.0130462646484375, 0.0168304443359375, -0.0116729736328125, 0.032012939453125, -0.01474761962890625, 0.0162353515625, -0.03546142578125, 0.032012939453125, 0.0288238525390625, 0.024749755859375, 0.0263824462890625, -0.008575439453125, -0.00861358642578125, 0.004688262939453125, 0.0282135009765625, -0.0148162841796875, 0.0295867919921875, -0.0006327629089355469, 0.0212554931640625, -0.034515380859375, -0.03741455078125, 0.00806427001953125, 0.036346435546875, -0.037139892578125, -0.0882568359375, -0.0357666015625, 0.021331787109375, 0.0239410400390625, 0.03900146484375, 0.0682373046875, 0.02978515625, -0.00034046173095703125, 
-0.003986358642578125, 0.059814453125, -0.0079193115234375, 0.00588226318359375, 0.0264434814453125, -0.0097503662109375, -0.034698486328125, 0.0548095703125, 0.039947509765625, 0.045501708984375, 0.01509857177734375, 0.048187255859375, -0.038482666015625, -0.01316070556640625, -0.07861328125, 0.024627685546875, -0.0182037353515625, 0.02203369140625, -0.0171661376953125, -0.045623779296875, -0.0002951622009277344, -0.053009033203125, -0.0254364013671875, -0.0823974609375, -0.022613525390625, 0.00916290283203125, 0.036224365234375, 0.057037353515625, -0.040771484375, 0.0190887451171875, -0.02862548828125, 0.065673828125, 0.0013580322265625, 0.0166015625, -0.038421630859375, -0.03387451171875, -0.024627685546875, 0.0194091796875, 0.0005965232849121094, -0.07568359375, 0.0268402099609375, -0.005550384521484375, 0.020660400390625, 0.032928466796875, 0.011932373046875, 0.056915283203125, -0.08026123046875, 0.08734130859375, 0.0340576171875, -0.064208984375, -0.017059326171875, -0.04400634765625, 0.0259552001953125, 0.0556640625, 0.034454345703125, -0.0269012451171875, -0.00656890869140625, -0.030670166015625, -0.049163818359375, 0.04632568359375, 0.0240936279296875, 0.0202484130859375, -0.003627777099609375, 0.0357666015625, 0.0092620849609375, 0.01271820068359375, -0.038421630859375, -0.030426025390625, -0.00862884521484375, 0.0019350051879882812, 0.049591064453125, 0.02313232421875, 0.00868988037109375, -0.036163330078125, 0.04205322265625, -0.005298614501953125, 0.007537841796875, 0.015472412109375, 0.025787353515625, -0.0258331298828125, 0.04254150390625, 0.058349609375, 0.07440185546875, -0.03466796875, 0.028564453125, 0.01468658447265625, 0.0008840560913085938, 0.034637451171875, -0.0501708984375, -0.01445770263671875, 0.0299224853515625, -0.0229339599609375, 0.0247955322265625, -0.0033855438232421875, -0.02081298828125, 0.005237579345703125, 0.006755828857421875, 0.0008749961853027344, -0.055877685546875, 0.030242919921875, 0.03131103515625, 
-0.007221221923828125, 0.00165557861328125, 0.045684814453125, -0.00067138671875, -0.034210205078125, 0.025146484375, 0.00536346435546875, -0.0430908203125, -0.02203369140625, 0.06658935546875, -0.023590087890625, -0.0277252197265625, 0.0258026123046875, -0.068603515625, -0.021881103515625, 0.0220794677734375, 0.0484619140625, 0.01171112060546875, -0.03485107421875, 0.0496826171875, 0.0484619140625, 0.0027713775634765625, -0.01187896728515625, 0.05609130859375, -0.0005006790161132812, -0.03656005859375, -0.00310516357421875, -0.046142578125, -0.0273590087890625, -0.04730224609375, -0.039886474609375, 0.01800537109375, -0.0166168212890625, -0.0219573974609375, -0.034820556640625, 0.0004940032958984375, -0.054595947265625, 0.03179931640625, 0.04669189453125, 0.0404052734375, -0.0217742919921875, 0.030548095703125, 0.0231170654296875, -0.0455322265625, -0.05133056640625, 0.0225830078125, -0.004730224609375, -0.04290771484375, 0.06890869140625, 0.026702880859375, -0.036224365234375, 0.0005545616149902344, -0.02972412109375, -0.034393310546875, 0.040283203125, 0.003143310546875, -0.0399169921875, 0.001148223876953125, 0.004093170166015625, 0.0276947021484375, -0.02996826171875, -0.0003960132598876953, 0.04644775390625, 0.0222625732421875, -0.040557861328125, -0.07855224609375, -0.10205078125, -0.00679779052734375, -0.02288818359375, 0.01198577880859375, -0.07611083984375, 0.055145263671875, -0.054168701171875, -0.0301361083984375, 0.004360198974609375, 0.045196533203125, 0.039825439453125, 0.00884246826171875, 0.00968170166015625, 0.04241943359375, 0.064453125, 0.004520416259765625, 0.0684814453125, 0.0088043212890625, 0.0028400421142578125, 0.0780029296875, -0.01537322998046875, 0.04345703125, 0.0173492431640625, -0.024444580078125, 0.00023949146270751953, 0.042388916015625, 0.01108551025390625, 0.0289459228515625, 0.00843048095703125, -0.0302581787109375, -0.016632080078125, -0.0201263427734375, 0.002140045166015625, 0.0209197998046875, -0.0129241943359375, 
-0.035400390625, -0.007389068603515625, 0.01206207275390625, 0.01190948486328125, -0.03411865234375, -0.035308837890625, 0.055023193359375, 0.045166015625, -0.033905029296875, 0.0011701583862304688, 0.007053375244140625, -0.0003337860107421875, -0.01236724853515625, 0.019134521484375, 0.0142364501953125, 0.0203857421875, -0.0084228515625, -0.05328369140625, -0.0197601318359375, 0.003520965576171875, -0.013946533203125, -0.051422119140625, 0.038421630859375, -0.0303192138671875, -0.022430419921875, 0.006999969482421875, -0.002353668212890625, 0.0193939208984375, 0.006908416748046875, -0.0180511474609375, -0.04144287109375, 0.0115966796875, -0.0238800048828125, 0.0193634033203125, 0.0141143798828125, -0.017608642578125, -0.0030193328857421875, 0.0146636962890625, 0.0206298828125, 0.0050506591796875, -0.028106689453125, 0.048248291015625, -0.0738525390625, -0.031463623046875, -0.056915283203125, 0.0092315673828125, 0.005950927734375, -0.0258026123046875, 0.036773681640625, 0.052764892578125, 0.03338623046875, 0.0205841064453125, 0.02734375, -0.01226043701171875, -0.005725860595703125, -0.04290771484375, 0.06365966796875, -0.064208984375, -0.00847625732421875, 0.001995086669921875, -0.06646728515625, -0.0060272216796875, 0.059814453125, -0.019439697265625, -0.0193023681640625, 0.050079345703125, 0.040771484375, 0.007785797119140625, 0.040740966796875, 0.01641845703125, 0.0330810546875, 0.050750732421875, 0.0455322265625, 0.055328369140625, -0.01013946533203125, 0.0236968994140625, -0.046844482421875, -0.04473876953125, -0.060882568359375, -0.0357666015625, -0.00611114501953125, -0.0265045166015625, -0.00786590576171875, -0.0736083984375, -0.00010281801223754883, 0.08544921875, 0.048187255859375, -0.0209503173828125, -0.0269317626953125, 0.032684326171875, 0.0022125244140625, -0.0389404296875, -0.0157928466796875, 0.024932861328125, -0.006587982177734375, -0.006031036376953125, 0.0279998779296875, 0.004852294921875, 0.03509521484375, -0.0004489421844482422, 
0.043121337890625, -0.0009617805480957031, 0.0005941390991210938, 0.0283050537109375, 0.030914306640625, -0.0027675628662109375, -0.07373046875, 0.0084381103515625, -0.01194000244140625, 0.00235748291015625, 0.03125, -0.0548095703125, 0.0309600830078125, 0.03155517578125, 0.004241943359375, 0.050048828125, -0.01486968994140625, 0.0601806640625, -0.037200927734375, 0.0279388427734375, -0.024871826171875, 0.034210205078125, 0.0006165504455566406, -0.034820556640625, 0.005924224853515625, 0.034637451171875, -0.0479736328125, -0.02545166015625, 0.0016326904296875, -0.06475830078125, 0.047271728515625, 0.064453125, 0.00931549072265625, 0.006229400634765625, -0.015655517578125, -0.0433349609375, 0.0216064453125, -0.024505615234375, 0.0797119140625, 0.04132080078125, 0.0060882568359375, 0.0230712890625, -0.0343017578125, 0.036865234375, 0.0222015380859375, -0.07952880859375, 0.01026153564453125, 0.0295562744140625, 0.03900146484375, 0.0137176513671875, 0.03680419921875, -0.0092315673828125, 0.05029296875, 0.03167724609375, -0.05157470703125, -0.0531005859375, -0.06890869140625, -0.0220184326171875, 0.01258087158203125, 0.04852294921875, -0.0204925537109375 ] ]
pdelobelle/robbert-v2-dutch-base
2022-11-16T11:36:42.000Z
[ "transformers", "pytorch", "tf", "jax", "roberta", "fill-mask", "Dutch", "Flemish", "RoBERTa", "RobBERT", "nl", "dataset:oscar", "dataset:dbrd", "dataset:lassy-ud", "dataset:europarl-mono", "dataset:conll2002", "arxiv:2001.06286", "arxiv:2004.02814", "arxiv:2010.13652", "arxiv:2101.05716", "arxiv:1907.11692", "arxiv:2001.02943", "arxiv:1909.11942", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
pdelobelle
null
null
pdelobelle/robbert-v2-dutch-base
17
9,427
transformers
2022-03-02T23:29:05
--- language: "nl" thumbnail: "https://github.com/iPieter/RobBERT/raw/master/res/robbert_logo.png" tags: - Dutch - Flemish - RoBERTa - RobBERT license: mit datasets: - oscar - dbrd - lassy-ud - europarl-mono - conll2002 widget: - text: "Hallo, ik ben RobBERT, een <mask> taalmodel van de KU Leuven." --- <p align="center"> <img src="https://github.com/iPieter/RobBERT/raw/master/res/robbert_logo_with_name.png" alt="RobBERT: A Dutch RoBERTa-based Language Model" width="75%"> </p> # RobBERT: Dutch RoBERTa-based Language Model. [RobBERT](https://github.com/iPieter/RobBERT) is the state-of-the-art Dutch BERT model. It is a large pre-trained general Dutch language model that can be fine-tuned on a given dataset to perform any text classification, regression or token-tagging task. As such, it has been successfully used by many [researchers](https://scholar.google.com/scholar?oi=bibs&hl=en&cites=7180110604335112086) and [practitioners](https://huggingface.co/models?search=robbert) for achieving state-of-the-art performance for a wide range of Dutch natural language processing tasks, including: - [Emotion detection](https://www.aclweb.org/anthology/2021.wassa-1.27/) - Sentiment analysis ([book reviews](https://arxiv.org/pdf/2001.06286.pdf), [news articles](https://biblio.ugent.be/publication/8704637/file/8704638.pdf)*) - [Coreference resolution](https://arxiv.org/pdf/2001.06286.pdf) - Named entity recognition ([CoNLL](https://arxiv.org/pdf/2001.06286.pdf), [job titles](https://arxiv.org/pdf/2004.02814.pdf)*, [SoNaR](https://github.com/proycon/deepfrog)) - Part-of-speech tagging ([Small UD Lassy](https://arxiv.org/pdf/2001.06286.pdf), [CGN](https://github.com/proycon/deepfrog)) - [Zero-shot word prediction](https://arxiv.org/pdf/2001.06286.pdf) - [Humor detection](https://arxiv.org/pdf/2010.13652.pdf) - [Cyberbullying 
detection](https://www.cambridge.org/core/journals/natural-language-engineering/article/abs/automatic-classification-of-participant-roles-in-cyberbullying-can-we-detect-victims-bullies-and-bystanders-in-social-media-text/A2079C2C738C29428E666810B8903342) - [Correcting dt-spelling mistakes](https://gitlab.com/spelfouten/dutch-simpletransformers/)* and also achieved outstanding, near-SOTA results for: - [Natural language inference](https://arxiv.org/pdf/2101.05716.pdf)* - [Review classification](https://medium.com/broadhorizon-cmotions/nlp-with-r-part-5-state-of-the-art-in-nlp-transformers-bert-3449e3cd7494)* \\* *Note that several evaluations use RobBERT-v1, and that the second, improved RobBERT-v2 outperforms this first model on everything we tested* *(Also note that this list is not exhaustive. If you used RobBERT for your application, we are happy to know about it! Send us a mail, or add it yourself to this list by sending a pull request with the edit!)* More in-depth information about RobBERT can be found in our [blog post](https://people.cs.kuleuven.be/~pieter.delobelle/robbert/), [our paper](https://arxiv.org/abs/2001.06286) and [the RobBERT Github repository](https://github.com/iPieter/RobBERT). ## How to use RobBERT uses the [RoBERTa](https://arxiv.org/abs/1907.11692) architecture and pre-training but with a Dutch tokenizer and training data. RoBERTa is the robustly optimized English BERT model, making it even more powerful than the original BERT model. Given this same architecture, RobBERT can easily be fine-tuned and used for inference with [code to finetune RoBERTa](https://huggingface.co/transformers/model_doc/roberta.html) models and most code used for BERT models, e.g. the [HuggingFace Transformers](https://huggingface.co/transformers/) library. By default, RobBERT has the masked language model head used in training. This can be used as a zero-shot way to fill masks in sentences. 
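The zero-shot mask filling described above can be sketched with the Transformers fill-mask pipeline. This is a minimal illustration, not code from the original card; the `top_token` and `demo` helpers are our own, and running the demo downloads the model weights.

```python
def top_token(predictions):
    """Pick the token_str of the highest-scoring fill-mask prediction."""
    return max(predictions, key=lambda p: p["score"])["token_str"]

def demo():
    # Imported here so the helper above stays dependency-free;
    # downloads the RobBERT weights on first use.
    from transformers import pipeline
    unmasker = pipeline("fill-mask", model="pdelobelle/robbert-v2-dutch-base")
    predictions = unmasker("De hoofdstad van België is <mask>.")
    for p in predictions:
        print(f"{p['token_str']!r}: {p['score']:.3f}")
    return top_token(predictions)
```

Each pipeline prediction is a dict containing at least `token_str` and `score`, so the helper simply picks the argmax over the candidate fillers.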
It can be tested out for free on [RobBERT's hosted inference API on Hugging Face](https://huggingface.co/pdelobelle/robbert-v2-dutch-base?text=De+hoofdstad+van+Belgi%C3%AB+is+%3Cmask%3E.). You can also create a new prediction head for your own task by using any of HuggingFace's [RoBERTa-runners](https://huggingface.co/transformers/v2.7.0/examples.html#language-model-training), [their fine-tuning notebooks](https://huggingface.co/transformers/v4.1.1/notebooks.html) by changing the model name to `pdelobelle/robbert-v2-dutch-base`, or use the original fairseq [RoBERTa](https://github.com/pytorch/fairseq/tree/master/examples/roberta) training regimes. Use the following code to download the base model and finetune it yourself, or use one of our finetuned models (documented on [our project site](https://people.cs.kuleuven.be/~pieter.delobelle/robbert/)). ```python from transformers import RobertaTokenizer, RobertaForSequenceClassification tokenizer = RobertaTokenizer.from_pretrained("pdelobelle/robbert-v2-dutch-base") model = RobertaForSequenceClassification.from_pretrained("pdelobelle/robbert-v2-dutch-base") ``` Starting with `transformers v2.4.0` (or installing from source), you can use AutoTokenizer and AutoModel. You can then use most of [HuggingFace's BERT-based notebooks](https://huggingface.co/transformers/v4.1.1/notebooks.html) for finetuning RobBERT on your type of Dutch language dataset. ## Technical Details From The Paper ### Our Performance Evaluation Results All experiments are described in more detail in our [paper](https://arxiv.org/abs/2001.06286), with the code in [our GitHub repository](https://github.com/iPieter/RobBERT). ### Sentiment analysis Predicting whether a review is positive or negative using the [Dutch Book Reviews Dataset](https://github.com/benjaminvdb/110kDBRD). 
| Model | Accuracy [%] | |-------------------|--------------------------| | ULMFiT | 93.8 | | BERTje | 93.0 | | RobBERT v2 | **95.1** | ### Die/Dat (coreference resolution) We measured how well the models are able to do coreference resolution by predicting whether "die" or "dat" should be filled into a sentence. For this, we used the [EuroParl corpus](https://www.statmt.org/europarl/). #### Finetuning on whole dataset | Model | Accuracy [%] | F1 [%] | |-------------------|--------------------------|--------------| | [Baseline](https://arxiv.org/abs/2001.02943) (LSTM) | | 75.03 | | mBERT | 98.285 | 98.033 | | BERTje | 98.268 | 98.014 | | RobBERT v2 | **99.232** | **99.121** | #### Finetuning on 10K examples We also measured the performance using only 10K training examples. This experiment clearly illustrates that RobBERT outperforms other models when there is little data available. | Model | Accuracy [%] | F1 [%] | |-------------------|--------------------------|--------------| | mBERT | 92.157 | 90.898 | | BERTje | 93.096 | 91.279 | | RobBERT v2 | **97.816** | **97.514** | #### Using zero-shot word masking task Since BERT models are pre-trained using the word masking task, we can use this to predict whether "die" or "dat" is more likely. This experiment shows that RobBERT has internalised more information about Dutch than other models. | Model | Accuracy [%] | |-------------------|--------------------------| | ZeroR | 66.70 | | mBERT | 90.21 | | BERTje | 94.94 | | RobBERT v2 | **98.75** | ### Part-of-Speech Tagging. Using the [Lassy UD dataset](https://universaldependencies.org/treebanks/nl_lassysmall/index.html). | Model | Accuracy [%] | |-------------------|--------------------------| | Frog | 91.7 | | mBERT | **96.5** | | BERTje | 96.3 | | RobBERT v2 | 96.4 | Interestingly, we found that when dealing with **small data sets**, RobBERT v2 **significantly outperforms** other models. 
<p align="center"> <img src="https://github.com/iPieter/RobBERT/raw/master/res/robbert_pos_accuracy.png" alt="RobBERT's performance on smaller datasets"> </p> ### Named Entity Recognition Using the [CoNLL 2002 evaluation script](https://www.clips.uantwerpen.be/conll2002/ner/). | Model | Accuracy [%] | |-------------------|--------------------------| | Frog | 57.31 | | mBERT | **90.94** | | BERT-NL | 89.7 | | BERTje | 88.3 | | RobBERT v2 | 89.08 | ## Pre-Training Procedure Details We pre-trained RobBERT using the RoBERTa training regime. We pre-trained our model on the Dutch section of the [OSCAR corpus](https://oscar-corpus.com/), a large multilingual corpus which was obtained by language classification in the Common Crawl corpus. This Dutch corpus is 39GB in size, with 6.6 billion words spread over 126 million lines of text, where each line could contain multiple sentences, thus using more data than concurrently developed Dutch BERT models. RobBERT shares its architecture with [RoBERTa's base model](https://github.com/pytorch/fairseq/tree/master/examples/roberta), which itself is a replication and improvement over BERT. Like BERT, its architecture consists of 12 self-attention layers with 12 heads and 117M trainable parameters. One difference with the original BERT model is due to the different pre-training task specified by RoBERTa, using only the MLM task and not the NSP task. During pre-training, it thus only predicts which words are masked in certain positions of given sentences. The training process uses the Adam optimizer with polynomial decay of the learning rate l_r=10^-6 and a ramp-up period of 1000 iterations, with hyperparameters beta_1=0.9 and RoBERTa's default beta_2=0.98. Additionally, a weight decay of 0.1 and a small dropout of 0.1 helps prevent the model from overfitting. 
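The learning-rate schedule described above (a 1000-iteration ramp-up followed by polynomial decay) can be sketched as a pure function. This is our own illustrative sketch, not code from the paper: the total step count (taken from the "over 16k batches" figure below) and the linear decay power are assumptions.

```python
def lr_at_step(step, base_lr=1e-6, warmup=1000, total_steps=16000, power=1.0):
    """Learning rate at a given step: linear ramp-up over `warmup`
    iterations, then polynomial decay of degree `power` towards zero.
    `total_steps` and `power` are illustrative assumptions."""
    if step < warmup:
        return base_lr * step / warmup
    progress = min(1.0, (step - warmup) / max(1, total_steps - warmup))
    return base_lr * (1.0 - progress) ** power
```

In a training loop this schedule would be paired with Adam using the hyperparameters stated above (beta_1=0.9, beta_2=0.98, weight decay 0.1), e.g. via a PyTorch `LambdaLR` scheduler.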
RobBERT was trained on a computing cluster with 4 Nvidia P100 GPUs per node, where the number of nodes was dynamically adjusted while keeping a fixed batch size of 8192 sentences. At most 20 nodes were used (i.e. 80 GPUs), and the median was 5 nodes. By using gradient accumulation, the batch size could be set independently of the number of GPUs available, in order to maximally utilize the cluster. Using the [Fairseq library](https://github.com/pytorch/fairseq/tree/master/examples/roberta), the model trained for two epochs, which equals over 16k batches in total, which took about three days on the computing cluster. In between training jobs on the computing cluster, 2 Nvidia 1080 Ti's also covered some parameter updates for RobBERT v2. ## Investigating Limitations and Bias In the [RobBERT paper](https://arxiv.org/abs/2001.06286), we also investigated potential sources of bias in RobBERT. We found that the zero-shot model estimates the probability of *hij* (he) to be higher than *zij* (she) for most occupations in bleached template sentences, regardless of their actual job gender ratio in reality. <p align="center"> <img src="https://github.com/iPieter/RobBERT/raw/master/res/gender_diff.png" alt="RobBERT's performance on smaller datasets"> </p> By augmenting the DBRD Dutch Book sentiment analysis dataset with the stated gender of the author of the review, we found that highly positive reviews written by women were generally more accurately detected by RobBERT as being positive than those written by men. <p align="center"> <img src="https://github.com/iPieter/RobBERT/raw/master/res/dbrd.png" alt="RobBERT's performance on smaller datasets"> </p> ## How to Replicate Our Paper Experiments Replicating our paper experiments is [described in detail on the RobBERT repository README](https://github.com/iPieter/RobBERT#how-to-replicate-our-paper-experiments). ## Name Origin of RobBERT Most BERT-like models have the word *BERT* in their name (e.g. 
[RoBERTa](https://huggingface.co/transformers/model_doc/roberta.html), [ALBERT](https://arxiv.org/abs/1909.11942), [CamemBERT](https://camembert-model.fr/), and [many, many others](https://huggingface.co/models?search=bert)). As such, we queried our newly trained model using its masked language model to name itself *\\<mask\\>bert* using [all](https://huggingface.co/pdelobelle/robbert-v2-dutch-base?text=Mijn+naam+is+%3Cmask%3Ebert.) [kinds](https://huggingface.co/pdelobelle/robbert-v2-dutch-base?text=Hallo%2C+ik+ben+%3Cmask%3Ebert.) [of](https://huggingface.co/pdelobelle/robbert-v2-dutch-base?text=Leuk+je+te+ontmoeten%2C+ik+heet+%3Cmask%3Ebert.) [prompts](https://huggingface.co/pdelobelle/robbert-v2-dutch-base?text=Niemand+weet%2C+niemand+weet%2C+dat+ik+%3Cmask%3Ebert+heet.), and it consistently called itself RobBERT. We thought it was really quite fitting, given that RobBERT is a [*very* Dutch name](https://en.wikipedia.org/wiki/Robbert) *(and thus clearly a Dutch language model)*, and additionally has a high similarity to its root architecture, namely [RoBERTa](https://huggingface.co/transformers/model_doc/roberta.html). Since *"rob"* is the Dutch word for a seal, we decided to draw a seal and dress it up like [Bert from Sesame Street](https://muppet.fandom.com/wiki/Bert) for the [RobBERT logo](https://github.com/iPieter/RobBERT/blob/master/res/robbert_logo.png). ## Credits and citation This project was created by [Pieter Delobelle](https://people.cs.kuleuven.be/~pieter.delobelle), [Thomas Winters](https://thomaswinters.be) and [Bettina Berendt](https://people.cs.kuleuven.be/~bettina.berendt/). 
If you would like to cite our paper or model, you can use the following BibTeX: ``` @inproceedings{delobelle2020robbert, title = "{R}ob{BERT}: a {D}utch {R}o{BERT}a-based {L}anguage {M}odel", author = "Delobelle, Pieter and Winters, Thomas and Berendt, Bettina", booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020", month = nov, year = "2020", address = "Online", publisher = "Association for Computational Linguistics", url = "https://www.aclweb.org/anthology/2020.findings-emnlp.292", doi = "10.18653/v1/2020.findings-emnlp.292", pages = "3255--3265" } ```
14,497
[ [ -0.038665771484375, -0.06304931640625, 0.00949859619140625, 0.024444580078125, -0.0080413818359375, -0.00923919677734375, -0.0293121337890625, -0.0654296875, 0.018646240234375, 0.0158233642578125, -0.028106689453125, -0.03570556640625, -0.059844970703125, 0.0010347366333007812, -0.035186767578125, 0.083984375, 0.011962890625, 0.0182952880859375, 0.003551483154296875, -0.00122833251953125, -0.032867431640625, -0.054931640625, -0.039093017578125, -0.031524658203125, 0.020050048828125, 0.004817962646484375, 0.0372314453125, 0.0284576416015625, 0.04412841796875, 0.0215911865234375, -0.023712158203125, -0.0011072158813476562, -0.0167236328125, -0.00409698486328125, -0.002651214599609375, -0.0247650146484375, -0.0284423828125, 0.007537841796875, 0.026947021484375, 0.033905029296875, -0.0083465576171875, 0.00997161865234375, 0.005840301513671875, 0.052001953125, -0.0295257568359375, 0.01568603515625, -0.037994384765625, -0.0158538818359375, -0.0194091796875, 0.015838623046875, -0.029144287109375, -0.02874755859375, 0.03778076171875, -0.02020263671875, 0.0012826919555664062, -0.0025081634521484375, 0.09942626953125, 0.01212310791015625, -0.004100799560546875, -0.021209716796875, -0.046295166015625, 0.06756591796875, -0.0570068359375, 0.0382080078125, 0.038604736328125, 0.01666259765625, 0.01522064208984375, -0.038543701171875, -0.038543701171875, -0.019195556640625, -0.0101318359375, 0.0275115966796875, -0.037506103515625, 0.0006041526794433594, 0.01470184326171875, 0.030609130859375, -0.040679931640625, 0.0071258544921875, -0.0360107421875, -0.0189361572265625, 0.05328369140625, -0.0197601318359375, 0.0139923095703125, -0.039398193359375, -0.040679931640625, -0.031707763671875, -0.0196533203125, 0.0145111083984375, 0.03399658203125, 0.0245819091796875, -0.00970458984375, 0.0295867919921875, -0.0011091232299804688, 0.06585693359375, 0.004062652587890625, -0.0180206298828125, 0.03460693359375, -0.0151214599609375, -0.018280029296875, -0.0159912109375, 0.0660400390625, 
0.017425537109375, 0.02435302734375, -0.0017480850219726562, -0.033905029296875, 0.00811004638671875, 0.00299072265625, -0.05078125, -0.04345703125, 0.01255035400390625, -0.0300445556640625, -0.026641845703125, 0.0039215087890625, -0.037811279296875, -0.00353240966796875, -0.0311737060546875, 0.05419921875, -0.07708740234375, -0.0128173828125, 0.017852783203125, -0.01904296875, 0.02618408203125, 0.0240325927734375, -0.0548095703125, 0.0169677734375, 0.035064697265625, 0.066650390625, -0.0179595947265625, -0.0066375732421875, -0.0154876708984375, -0.0274200439453125, -0.01287841796875, 0.05645751953125, -0.00994873046875, -0.00243377685546875, -0.01181793212890625, -0.0153045654296875, -0.009429931640625, -0.029815673828125, 0.054840087890625, -0.032073974609375, 0.042755126953125, -0.007740020751953125, -0.050689697265625, -0.032623291015625, 0.0271453857421875, -0.041595458984375, 0.09228515625, 0.01123809814453125, -0.07147216796875, 0.0186614990234375, -0.059326171875, -0.02459716796875, -0.0161895751953125, 0.01385498046875, -0.038360595703125, -0.00701904296875, 0.0251922607421875, 0.044952392578125, -0.0142974853515625, 0.0311279296875, -0.006023406982421875, -0.01354217529296875, 0.0218048095703125, -0.0197906494140625, 0.0882568359375, 0.01268768310546875, -0.0219268798828125, -0.005397796630859375, -0.055877685546875, -0.01110076904296875, 0.0190887451171875, -0.032623291015625, -0.0214385986328125, -0.01477813720703125, 0.0192718505859375, 0.0198822021484375, 0.0118408203125, -0.044769287109375, 0.0038661956787109375, -0.0560302734375, 0.0197601318359375, 0.054473876953125, -0.01271820068359375, 0.0227203369140625, -0.038848876953125, 0.037017822265625, -0.00267791748046875, 0.0089111328125, -0.00726318359375, -0.045745849609375, -0.0635986328125, -0.043701171875, 0.06658935546875, 0.07427978515625, -0.045562744140625, 0.0576171875, -0.0163116455078125, -0.056121826171875, -0.04534912109375, -0.0110931396484375, 0.03509521484375, 0.035980224609375, 
0.03570556640625, -0.041015625, -0.056427001953125, -0.07867431640625, -0.0281829833984375, -0.01947021484375, -0.006191253662109375, 0.0133514404296875, 0.035980224609375, -0.0145721435546875, 0.0792236328125, -0.031768798828125, -0.03131103515625, -0.0266265869140625, 0.00904083251953125, 0.0308990478515625, 0.04071044921875, 0.060211181640625, -0.0472412109375, -0.035064697265625, -0.018218994140625, -0.040679931640625, -0.005809783935546875, 0.005367279052734375, 0.002941131591796875, 0.03387451171875, 0.0211181640625, -0.0309600830078125, 0.0193023681640625, 0.04254150390625, -0.01219940185546875, 0.051788330078125, 0.0024585723876953125, -0.016815185546875, -0.0877685546875, 0.004302978515625, 0.0167236328125, -0.027984619140625, -0.049285888671875, -0.005680084228515625, -0.01123046875, 0.0026493072509765625, -0.0264434814453125, 0.044586181640625, -0.02227783203125, 0.008331298828125, 0.005664825439453125, 0.014923095703125, -0.01470184326171875, 0.050048828125, 0.01082611083984375, 0.0516357421875, 0.048553466796875, -0.03912353515625, 0.022674560546875, 0.034820556640625, -0.04693603515625, 0.0164031982421875, -0.052520751953125, -0.0021991729736328125, -0.01042938232421875, 0.01641845703125, -0.068359375, -0.0163421630859375, 0.01490020751953125, -0.056121826171875, 0.03240966796875, -0.0157012939453125, -0.047943115234375, -0.040618896484375, -0.0322265625, -0.007724761962890625, 0.058807373046875, -0.037811279296875, 0.043548583984375, 0.03271484375, -0.01053619384765625, -0.051055908203125, -0.070556640625, -0.00044035911560058594, -0.018341064453125, -0.05682373046875, 0.0202484130859375, -0.0013170242309570312, -0.0120086669921875, -0.01314544677734375, 0.00449371337890625, -0.00666046142578125, 0.01092529296875, 0.00899505615234375, 0.03363037109375, -0.0024471282958984375, 0.0251922607421875, -0.01605224609375, -0.01363372802734375, -0.0096588134765625, -0.02410888671875, 0.047271728515625, -0.033111572265625, 0.001140594482421875, -0.03466796875, 
0.0105743408203125, 0.0124053955078125, -0.022216796875, 0.05401611328125, 0.061004638671875, -0.017730712890625, -0.00493621826171875, -0.053985595703125, -0.0231781005859375, -0.031707763671875, 0.0115814208984375, -0.01678466796875, -0.06414794921875, 0.036376953125, 0.0029659271240234375, 0.00589752197265625, 0.0482177734375, 0.042205810546875, -0.01629638671875, 0.05230712890625, 0.055206298828125, -0.01258087158203125, 0.0638427734375, -0.04193115234375, 0.03131103515625, -0.04718017578125, 0.0021457672119140625, -0.037628173828125, -0.024871826171875, -0.06201171875, -0.02313232421875, 0.027679443359375, 0.0282135009765625, -0.0029048919677734375, 0.0615234375, -0.051513671875, 0.0218963623046875, 0.0697021484375, 0.00026416778564453125, 0.0150604248046875, 0.0030193328857421875, -0.012359619140625, -0.00592041015625, -0.06103515625, -0.04327392578125, 0.07684326171875, 0.03271484375, 0.048980712890625, 0.0090789794921875, 0.068603515625, 0.0179595947265625, 0.01543426513671875, -0.060882568359375, 0.038055419921875, -0.0158538818359375, -0.069091796875, -0.021484375, -0.015625, -0.06170654296875, 0.0181427001953125, -0.0239105224609375, -0.0745849609375, 0.02239990234375, 0.0227203369140625, -0.032073974609375, 0.0242919921875, -0.0672607421875, 0.058807373046875, -0.01403045654296875, -0.01361083984375, 0.005275726318359375, -0.06494140625, 0.034393310546875, -0.004360198974609375, 0.01023101806640625, -0.013641357421875, 0.0257568359375, 0.07275390625, -0.018768310546875, 0.07220458984375, -0.00771331787109375, -0.0008463859558105469, 0.03759765625, -0.0002560615539550781, 0.045318603515625, -0.01293182373046875, -0.01042938232421875, 0.0260772705078125, 0.020050048828125, -0.0193023681640625, -0.0220184326171875, 0.025726318359375, -0.0572509765625, -0.0247344970703125, -0.049652099609375, -0.027008056640625, -0.01363372802734375, 0.00678253173828125, 0.035064697265625, 0.0242462158203125, -0.018280029296875, 0.00998687744140625, 0.051177978515625, 
-0.031005859375, 0.0313720703125, 0.0254974365234375, -0.0231475830078125, -0.0187530517578125, 0.06988525390625, 0.015960693359375, 0.0121612548828125, 0.03240966796875, 0.02679443359375, -0.0204620361328125, -0.0335693359375, -0.0272064208984375, 0.01268768310546875, -0.048614501953125, -0.033172607421875, -0.07391357421875, -0.027923583984375, -0.03619384765625, -0.0034961700439453125, -0.028167724609375, -0.05950927734375, -0.01139068603515625, -0.015533447265625, 0.0548095703125, 0.028533935546875, -0.0367431640625, 0.027679443359375, -0.0462646484375, 0.008026123046875, 0.02252197265625, 0.0217132568359375, -0.0123291015625, -0.047943115234375, -0.0124359130859375, -0.002780914306640625, -0.033355712890625, -0.05859375, 0.036163330078125, 0.009033203125, 0.0180206298828125, 0.0217742919921875, 0.00540924072265625, 0.03076171875, -0.0242156982421875, 0.06744384765625, 0.01678466796875, -0.07818603515625, 0.049957275390625, -0.03558349609375, 0.0055389404296875, 0.04180908203125, 0.0297088623046875, -0.047698974609375, -0.0232696533203125, -0.06671142578125, -0.0689697265625, 0.06134033203125, 0.024017333984375, 0.00431060791015625, 0.0059967041015625, 0.004039764404296875, 0.006900787353515625, 0.00743865966796875, -0.072998046875, -0.016937255859375, -0.011810302734375, -0.0088348388671875, -0.0003685951232910156, -0.0172119140625, 0.004852294921875, -0.035430908203125, 0.07452392578125, 0.007350921630859375, 0.045318603515625, 0.018768310546875, -0.013885498046875, 0.01470184326171875, 0.021453857421875, 0.03497314453125, 0.03826904296875, -0.034423828125, -0.01007080078125, 0.0251922607421875, -0.026275634765625, -0.01654052734375, 0.032379150390625, -0.027587890625, 0.020233154296875, 0.0443115234375, 0.06842041015625, -0.00360870361328125, -0.060791015625, 0.057586669921875, 0.011383056640625, -0.0250701904296875, -0.035430908203125, 0.002399444580078125, 0.003932952880859375, 0.0169677734375, 0.005313873291015625, 0.0177154541015625, 0.01346588134765625, 
-0.0325927734375, 0.0201568603515625, 0.028594970703125, -0.034881591796875, -0.020172119140625, 0.03741455078125, 0.0118255615234375, -0.014923095703125, 0.029998779296875, -0.024688720703125, -0.06390380859375, 0.0242919921875, 0.031524658203125, 0.067138671875, -0.017547607421875, 0.0095672607421875, 0.045806884765625, 0.0364990234375, -0.01210784912109375, 0.0279541015625, 0.002899169921875, -0.06524658203125, -0.03533935546875, -0.06597900390625, -0.0154266357421875, 0.00638580322265625, -0.04461669921875, 0.0180511474609375, -0.0241241455078125, -0.0193634033203125, 0.01158905029296875, 0.00794219970703125, -0.0635986328125, 0.0165863037109375, 0.0024261474609375, 0.08203125, -0.0625, 0.066650390625, 0.041717529296875, -0.036346435546875, -0.0626220703125, -0.0103607177734375, 0.0018777847290039062, -0.061004638671875, 0.04913330078125, 0.00705718994140625, -0.003154754638671875, -0.01171875, -0.050384521484375, -0.066162109375, 0.0643310546875, 0.01476287841796875, -0.04437255859375, -0.0012998580932617188, -0.018707275390625, 0.047271728515625, -0.0016298294067382812, 0.01146697998046875, 0.055694580078125, 0.040283203125, -0.00012636184692382812, -0.08349609375, -0.0223388671875, -0.011077880859375, 0.0034160614013671875, 0.01485443115234375, -0.043731689453125, 0.061431884765625, -0.0084075927734375, -0.01406097412109375, 0.00325775146484375, 0.049468994140625, 0.0033416748046875, 0.0011625289916992188, 0.039886474609375, 0.06292724609375, 0.05853271484375, -0.02801513671875, 0.09228515625, -0.0176239013671875, 0.036102294921875, 0.07757568359375, 0.01287841796875, 0.060882568359375, 0.0145111083984375, -0.0248260498046875, 0.03802490234375, 0.05029296875, -0.017303466796875, 0.0347900390625, 0.021728515625, -0.003643035888671875, -0.018890380859375, -0.0017547607421875, -0.02569580078125, 0.05108642578125, 0.0139007568359375, -0.0257568359375, -0.0022716522216796875, 0.018890380859375, 0.01212310791015625, 0.002292633056640625, 0.00788116455078125, 
0.06524658203125, 0.0032787322998046875, -0.0367431640625, 0.049468994140625, 0.021453857421875, 0.07867431640625, -0.051727294921875, 0.020599365234375, -0.0055389404296875, 0.0158233642578125, 0.0020427703857421875, -0.053802490234375, 0.025848388671875, 0.00791168212890625, -0.0121612548828125, -0.032684326171875, 0.06787109375, -0.025970458984375, -0.043243408203125, 0.032379150390625, 0.04779052734375, 0.0215606689453125, -0.0017175674438476562, -0.060546875, 0.0007200241088867188, 0.0018281936645507812, -0.0265045166015625, 0.0041961669921875, 0.0272369384765625, -0.0004069805145263672, 0.0469970703125, 0.043121337890625, 0.003387451171875, -0.0059051513671875, -0.0074615478515625, 0.05963134765625, -0.047393798828125, -0.038818359375, -0.053802490234375, 0.033111572265625, -0.00981903076171875, -0.029144287109375, 0.061553955078125, 0.0487060546875, 0.08184814453125, -0.0116119384765625, 0.06060791015625, -0.0189361572265625, 0.06744384765625, -0.033203125, 0.0445556640625, -0.035430908203125, -0.0000073909759521484375, -0.047119140625, -0.06475830078125, -0.037322998046875, 0.0679931640625, -0.0421142578125, 0.01120758056640625, 0.02667236328125, 0.0523681640625, -0.00734710693359375, -0.00262451171875, 0.016387939453125, 0.020599365234375, 0.0157012939453125, 0.03277587890625, 0.059722900390625, -0.05047607421875, 0.034149169921875, -0.0273590087890625, -0.0190887451171875, -0.027679443359375, -0.0626220703125, -0.07086181640625, -0.05560302734375, -0.02362060546875, -0.032257080078125, 0.007549285888671875, 0.08746337890625, 0.0625, -0.06939697265625, -0.01073455810546875, 0.007183074951171875, 0.0016756057739257812, -0.02764892578125, -0.018798828125, 0.04083251953125, -0.01375579833984375, -0.058807373046875, 0.0174407958984375, 0.00801849365234375, 0.0198516845703125, -0.0016117095947265625, 0.00644683837890625, -0.036376953125, 0.016082763671875, 0.059417724609375, 0.01050567626953125, -0.061614990234375, -0.0224151611328125, 0.0014066696166992188, 
0.0023212432861328125, 0.01038360595703125, 0.0297088623046875, -0.059722900390625, 0.0266265869140625, 0.04254150390625, 0.047943115234375, 0.049774169921875, 0.000476837158203125, 0.0229034423828125, -0.060211181640625, 0.00832366943359375, 0.03497314453125, 0.0215606689453125, 0.03607177734375, -0.006526947021484375, 0.058502197265625, 0.01294708251953125, -0.0347900390625, -0.053802490234375, 0.00966644287109375, -0.0880126953125, -0.0259857177734375, 0.07257080078125, 0.0060272216796875, -0.012847900390625, 0.005229949951171875, -0.023406982421875, 0.0128631591796875, -0.03900146484375, 0.05682373046875, 0.07733154296875, -0.00910186767578125, -0.007251739501953125, -0.04644775390625, 0.047882080078125, 0.031524658203125, -0.04571533203125, 0.001247406005859375, 0.03070068359375, 0.020904541015625, 0.035430908203125, 0.041778564453125, -0.006378173828125, 0.0054779052734375, -0.022979736328125, 0.02667236328125, 0.004848480224609375, -0.013824462890625, -0.0235137939453125, -0.007381439208984375, -0.0046234130859375, -0.01641845703125 ] ]
facebook/contriever-msmarco
2022-06-25T17:19:59.000Z
[ "transformers", "pytorch", "bert", "feature-extraction", "arxiv:2112.09118", "endpoints_compatible", "has_space", "region:us" ]
feature-extraction
facebook
null
null
facebook/contriever-msmarco
7
9,424
transformers
2022-03-02T23:29:05
---
tags:
- feature-extraction
pipeline_tag: feature-extraction
---

This model is the finetuned version of the pre-trained contriever model available at https://huggingface.co/facebook/contriever, following the approach described in [Towards Unsupervised Dense Information Retrieval with Contrastive Learning](https://arxiv.org/abs/2112.09118). The associated GitHub repository is available at https://github.com/facebookresearch/contriever.

## Usage (HuggingFace Transformers)

Using the model directly with HuggingFace transformers requires adding a mean pooling operation to obtain a sentence embedding.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained('facebook/contriever-msmarco')
model = AutoModel.from_pretrained('facebook/contriever-msmarco')

sentences = [
    "Where was Marie Curie born?",
    "Maria Sklodowska, later known as Marie Curie, was born on November 7, 1867.",
    "Born in Paris on 15 May 1859, Pierre Curie was the son of Eugène Curie, a doctor of French Catholic origin from Alsace."
]

# Apply tokenizer
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
outputs = model(**inputs)

# Mean pooling over token embeddings, ignoring padding positions
def mean_pooling(token_embeddings, mask):
    token_embeddings = token_embeddings.masked_fill(~mask[..., None].bool(), 0.)
    sentence_embeddings = token_embeddings.sum(dim=1) / mask.sum(dim=1)[..., None]
    return sentence_embeddings

embeddings = mean_pooling(outputs[0], inputs['attention_mask'])
```
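The masked mean pooling used above can be checked on toy data without downloading the model. Here is an equivalent NumPy sketch (an illustration only; it assumes the same `(batch, seq_len, dim)` embedding layout and a 0/1 attention mask):

```python
import numpy as np

def mean_pooling(token_embeddings, mask):
    # token_embeddings: (batch, seq_len, dim); mask: (batch, seq_len) of 0/1
    masked = token_embeddings * mask[..., None]             # zero out padded positions
    return masked.sum(axis=1) / mask.sum(axis=1)[..., None]

# Toy batch: one sequence of length 3 whose last token is padding
emb = np.array([[[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]])
mask = np.array([[1, 1, 0]])
print(mean_pooling(emb, mask))  # [[2. 3.]] -- the padded token is ignored
```

Only non-padding tokens contribute to the sentence embedding, which is why the large values in the padded slot do not distort the result.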
1,568
[ [ -0.022491455078125, -0.0592041015625, 0.0141448974609375, 0.01389312744140625, -0.0169677734375, -0.0272979736328125, -0.021087646484375, -0.00980377197265625, 0.0297393798828125, 0.03509521484375, -0.060760498046875, -0.0207672119140625, -0.046966552734375, -0.01389312744140625, -0.028472900390625, 0.0819091796875, 0.0009212493896484375, 0.00469207763671875, -0.0161895751953125, -0.013641357421875, 0.003711700439453125, -0.032562255859375, -0.044586181640625, -0.037445068359375, 0.0131683349609375, 0.0277862548828125, 0.060302734375, 0.0250396728515625, 0.03497314453125, 0.0350341796875, -0.005657196044921875, 0.000278472900390625, -0.056243896484375, -0.029876708984375, -0.018280029296875, -0.01454925537109375, -0.0026302337646484375, 0.0259552001953125, 0.036651611328125, 0.0518798828125, -0.00971221923828125, 0.01248931884765625, 0.0279693603515625, 0.0413818359375, -0.0301513671875, 0.01172637939453125, -0.030517578125, -0.0018863677978515625, 0.004901885986328125, 0.0105133056640625, -0.0252838134765625, -0.02178955078125, 0.020751953125, -0.03814697265625, 0.036773681640625, -0.0059814453125, 0.08740234375, 0.0228271484375, -0.0259552001953125, -0.01904296875, -0.0230255126953125, 0.05731201171875, -0.0458984375, 0.02911376953125, 0.0234527587890625, 0.006206512451171875, -0.0157623291015625, -0.080322265625, -0.054534912109375, -0.01995849609375, -0.03265380859375, 0.0028285980224609375, -0.025146484375, 0.00928497314453125, 0.015869140625, 0.046142578125, -0.058746337890625, -0.01229095458984375, -0.040313720703125, -0.023895263671875, 0.044525146484375, -0.01480865478515625, 0.0018100738525390625, -0.0293731689453125, -0.041015625, -0.007476806640625, -0.0083160400390625, 0.0236968994140625, 0.00846099853515625, 0.0302581787109375, -0.042938232421875, 0.05914306640625, -0.029571533203125, 0.040252685546875, 0.024810791015625, 0.021240234375, 0.04876708984375, -0.01190185546875, -0.0241851806640625, -0.00942230224609375, 0.0732421875, 
0.040802001953125, 0.047149658203125, -0.0012617111206054688, -0.00876617431640625, 0.0144805908203125, 0.0113372802734375, -0.0770263671875, -0.053619384765625, 0.01181793212890625, -0.048980712890625, -0.02484130859375, 0.0260467529296875, -0.035400390625, -0.0007262229919433594, -0.0238494873046875, 0.05169677734375, -0.032989501953125, -0.014678955078125, 0.01312255859375, 0.0035552978515625, 0.0040283203125, -0.0110321044921875, -0.057373046875, 0.0289459228515625, 0.030670166015625, 0.060028076171875, 0.0087432861328125, -0.01763916015625, -0.053253173828125, -0.0214691162109375, 0.00582122802734375, 0.04302978515625, -0.042236328125, -0.0122222900390625, 0.00921630859375, 0.028564453125, -0.01056671142578125, -0.036590576171875, 0.0498046875, -0.0269317626953125, 0.0283660888671875, 0.00128173828125, -0.059326171875, -0.020477294921875, 0.0240325927734375, -0.04833984375, 0.07684326171875, 0.0467529296875, -0.07183837890625, 0.004547119140625, -0.0345458984375, -0.0180206298828125, -0.0022335052490234375, -0.005779266357421875, -0.047149658203125, -0.0038051605224609375, 0.0099334716796875, 0.0584716796875, 0.009521484375, 0.02197265625, -0.04547119140625, -0.032867431640625, 0.012847900390625, -0.0032634735107421875, 0.077880859375, 0.01419830322265625, -0.01261138916015625, 0.0157928466796875, -0.0645751953125, -0.00907135009765625, 0.01517486572265625, -0.00862884521484375, -0.0236053466796875, -0.0299835205078125, 0.03076171875, 0.030181884765625, 0.022705078125, -0.050140380859375, 0.01070404052734375, -0.0330810546875, 0.050140380859375, 0.0328369140625, 0.006511688232421875, 0.042327880859375, -0.032135009765625, 0.02032470703125, 0.0182037353515625, 0.022125244140625, -0.002155303955078125, -0.008270263671875, -0.056304931640625, -0.0210113525390625, 0.003448486328125, 0.0322265625, -0.0498046875, 0.0296630859375, 0.0028667449951171875, -0.05023193359375, -0.04302978515625, 0.0193634033203125, 0.0305938720703125, 0.050048828125, 0.048675537109375, 
-0.0140228271484375, -0.049468994140625, -0.0728759765625, -0.0059814453125, -0.0022945404052734375, 0.01143646240234375, 0.0176544189453125, 0.046630859375, -0.04052734375, 0.048919677734375, -0.043060302734375, -0.0196380615234375, -0.00464630126953125, -0.003025054931640625, 0.039703369140625, 0.07708740234375, 0.043853759765625, -0.0670166015625, -0.018585205078125, -0.029052734375, -0.048828125, 0.0167999267578125, -0.0014848709106445312, -0.0308990478515625, -0.006732940673828125, 0.05438232421875, -0.07080078125, 0.031494140625, 0.03948974609375, -0.0390625, 0.049713134765625, -0.00689697265625, -0.01275634765625, -0.09521484375, 0.0029754638671875, 0.022491455078125, -0.0247039794921875, -0.03253173828125, 0.0097198486328125, 0.0099639892578125, 0.005771636962890625, -0.035888671875, 0.04779052734375, -0.0204925537109375, 0.0201263427734375, -0.007061004638671875, 0.01392364501953125, 0.00864410400390625, 0.03521728515625, -0.0160675048828125, 0.0261688232421875, 0.04315185546875, -0.058502197265625, 0.040740966796875, 0.054290771484375, -0.0162811279296875, 0.04071044921875, -0.0543212890625, -0.010101318359375, -0.01092529296875, 0.01483917236328125, -0.08258056640625, -0.0226287841796875, 0.02484130859375, -0.04669189453125, 0.0213775634765625, 0.0006766319274902344, -0.035919189453125, -0.055419921875, -0.0211639404296875, 0.03179931640625, 0.026397705078125, -0.050567626953125, 0.047149658203125, 0.02215576171875, -0.00606536865234375, -0.0391845703125, -0.0635986328125, -0.033599853515625, -0.011474609375, -0.0516357421875, 0.044769287109375, -0.00919342041015625, -0.0026397705078125, 0.01207733154296875, 0.004604339599609375, -0.0084991455078125, 0.0021514892578125, 0.0030879974365234375, 0.023834228515625, -0.0100860595703125, 0.00850677490234375, 0.0017871856689453125, -0.00946044921875, 0.01172637939453125, -0.00656890869140625, 0.047576904296875, -0.0179290771484375, 0.0037631988525390625, -0.040557861328125, 0.01120758056640625, 
0.002681732177734375, -0.0232391357421875, 0.052001953125, 0.0770263671875, -0.047698974609375, -0.015106201171875, -0.0501708984375, -0.034149169921875, -0.0406494140625, 0.0316162109375, -0.02105712890625, -0.0849609375, 0.05548095703125, 0.005519866943359375, -0.0025081634521484375, 0.050048828125, 0.02947998046875, -0.01345062255859375, 0.049896240234375, 0.06365966796875, -0.000057756900787353516, 0.045257568359375, -0.043853759765625, -0.00572967529296875, -0.044921875, -0.0206146240234375, -0.0176849365234375, -0.020538330078125, -0.0677490234375, -0.0115203857421875, -0.0084075927734375, 0.01393890380859375, -0.058441162109375, 0.035491943359375, -0.052978515625, 0.0216827392578125, 0.05487060546875, 0.041656494140625, 0.002643585205078125, -0.00569915771484375, -0.01395416259765625, 0.006256103515625, -0.046966552734375, -0.021820068359375, 0.05523681640625, 0.0164947509765625, 0.060516357421875, -0.01297760009765625, 0.052978515625, -0.004566192626953125, 0.0246734619140625, -0.04095458984375, 0.0287322998046875, -0.034637451171875, -0.042572021484375, 0.0003173351287841797, -0.041412353515625, -0.06829833984375, 0.010406494140625, -0.02679443359375, -0.05401611328125, 0.0162506103515625, 0.01114654541015625, -0.03765869140625, 0.00244903564453125, -0.05517578125, 0.0916748046875, 0.00873565673828125, -0.0256805419921875, -0.008026123046875, -0.06414794921875, 0.0271759033203125, 0.01715087890625, 0.0087738037109375, 0.0061798095703125, 0.0228118896484375, 0.060302734375, -0.0227508544921875, 0.0611572265625, -0.020843505859375, 0.0250244140625, 0.0250244140625, 0.01041412353515625, 0.020355224609375, 0.0009617805480957031, -0.0110626220703125, -0.00838470458984375, 0.00799560546875, -0.041290283203125, -0.0258941650390625, 0.06365966796875, -0.06378173828125, -0.0207366943359375, -0.0214080810546875, -0.03900146484375, 0.0031223297119140625, 0.02880859375, 0.035430908203125, 0.045623779296875, -0.0210418701171875, 0.0185699462890625, 0.031341552734375, 
-0.0096588134765625, 0.038330078125, 0.00896453857421875, -0.018157958984375, -0.0267333984375, 0.0435791015625, 0.00524139404296875, -0.0010814666748046875, 0.0214691162109375, 0.018096923828125, -0.034210205078125, -0.0010929107666015625, 0.004550933837890625, 0.05255126953125, -0.045745849609375, -0.017822265625, -0.0609130859375, -0.038055419921875, -0.035675048828125, -0.022735595703125, -0.0277099609375, -0.025177001953125, -0.04193115234375, -0.02301025390625, 0.029052734375, 0.035186767578125, -0.0156402587890625, 0.05023193359375, -0.048919677734375, 0.0248260498046875, 0.0015354156494140625, 0.030181884765625, -0.0113525390625, -0.05914306640625, -0.0256195068359375, 0.0020236968994140625, -0.0377197265625, -0.046051025390625, 0.0260467529296875, 0.0178680419921875, 0.04827880859375, 0.039215087890625, 0.019622802734375, 0.05029296875, -0.0270233154296875, 0.037200927734375, 0.01502227783203125, -0.058868408203125, 0.0281829833984375, -0.0128173828125, 0.0123443603515625, 0.054290771484375, 0.037841796875, -0.0204010009765625, -0.022705078125, -0.0721435546875, -0.05938720703125, 0.04632568359375, 0.0222930908203125, 0.03814697265625, -0.0182037353515625, 0.044769287109375, 0.00283050537109375, 0.01473236083984375, -0.08197021484375, -0.025360107421875, -0.0260009765625, -0.050018310546875, -0.0230865478515625, -0.017913818359375, 0.0038547515869140625, -0.0282745361328125, 0.06304931640625, -0.0034027099609375, 0.0218505859375, 0.03387451171875, -0.0234527587890625, -0.0002334117889404297, -0.01495361328125, 0.0128173828125, 0.023193359375, -0.0249786376953125, 0.0111236572265625, -0.00989532470703125, -0.0190277099609375, -0.013885498046875, 0.062408447265625, -0.0018072128295898438, 0.0164642333984375, 0.019317626953125, 0.05035400390625, 0.036102294921875, -0.031341552734375, 0.05963134765625, 0.000888824462890625, -0.0185089111328125, -0.04498291015625, -0.004108428955078125, 0.009674072265625, 0.049346923828125, 0.0189208984375, 
0.005329132080078125, 0.0186614990234375, -0.0270843505859375, 0.04803466796875, 0.035736083984375, -0.044036865234375, -0.0190582275390625, 0.059967041015625, -0.004146575927734375, -0.0088653564453125, 0.05987548828125, -0.0197601318359375, -0.038360595703125, 0.033477783203125, 0.033782958984375, 0.0537109375, -0.0268096923828125, 0.0196075439453125, 0.057861328125, 0.0110931396484375, -0.00942230224609375, 0.0256805419921875, -0.000720977783203125, -0.0615234375, -0.0185394287109375, -0.0533447265625, -0.002178192138671875, 0.01186370849609375, -0.060760498046875, 0.0304107666015625, -0.0258331298828125, -0.021087646484375, 0.01155853271484375, 0.01068878173828125, -0.07159423828125, 0.0298004150390625, 0.00118255615234375, 0.06964111328125, -0.0677490234375, 0.040069580078125, 0.054290771484375, -0.04364013671875, -0.074951171875, -0.01357269287109375, -0.021240234375, -0.055816650390625, 0.0248870849609375, 0.058197021484375, 0.019683837890625, 0.0286712646484375, -0.060638427734375, -0.05718994140625, 0.0670166015625, 0.03216552734375, -0.044189453125, -0.0104217529296875, -0.0156402587890625, 0.0296630859375, -0.029876708984375, 0.01983642578125, 0.034271240234375, 0.00690460205078125, 0.00467681884765625, -0.0487060546875, 0.0180511474609375, -0.03973388671875, -0.00891876220703125, -0.0196990966796875, -0.061981201171875, 0.08404541015625, -0.025177001953125, -0.00424957275390625, 0.0215301513671875, 0.0701904296875, 0.0298004150390625, 0.025238037109375, 0.03533935546875, 0.0537109375, 0.049041748046875, -0.0054779052734375, 0.07427978515625, -0.025299072265625, 0.0689697265625, 0.07672119140625, 0.021240234375, 0.0540771484375, 0.039886474609375, -0.007965087890625, 0.053558349609375, 0.06353759765625, -0.01548004150390625, 0.04779052734375, 0.01611328125, -0.0030574798583984375, -0.01358795166015625, -0.00360107421875, -0.0228729248046875, 0.0205078125, 0.01114654541015625, -0.045867919921875, -0.020416259765625, 0.00978851318359375, 0.013885498046875, 
0.016357421875, -0.029052734375, 0.040802001953125, 0.0223541259765625, -0.044952392578125, 0.04638671875, -0.00746917724609375, 0.061920166015625, -0.0282745361328125, 0.005344390869140625, -0.007625579833984375, 0.0401611328125, -0.0289459228515625, -0.06011962890625, 0.0216522216796875, -0.0158233642578125, -0.00951385498046875, 0.01143646240234375, 0.04656982421875, -0.060943603515625, -0.056549072265625, 0.033477783203125, 0.0269927978515625, 0.00962066650390625, -0.01406097412109375, -0.07745361328125, -0.0047454833984375, -0.0030384063720703125, -0.03656005859375, -0.00417327880859375, 0.03106689453125, 0.00716400146484375, 0.04638671875, 0.047149658203125, -0.026824951171875, -0.005878448486328125, 0.0165252685546875, 0.0869140625, -0.040313720703125, -0.054931640625, -0.08062744140625, 0.05389404296875, 0.00214385986328125, -0.02154541015625, 0.041473388671875, 0.05023193359375, 0.08203125, -0.006656646728515625, 0.02850341796875, -0.0015850067138671875, 0.0312042236328125, -0.0290985107421875, 0.04833984375, -0.047271728515625, -0.03570556640625, -0.042266845703125, -0.07666015625, -0.0181427001953125, 0.088623046875, -0.0114593505859375, 0.02325439453125, 0.038177490234375, 0.061431884765625, -0.02789306640625, 0.0000032782554626464844, -0.00848388671875, 0.03887939453125, 0.0107269287109375, 0.03216552734375, 0.038116455078125, -0.047149658203125, 0.03271484375, -0.04638671875, -0.0223541259765625, -0.0178375244140625, -0.051513671875, -0.07598876953125, -0.053619384765625, -0.0244598388671875, -0.052490234375, -0.0084075927734375, 0.09002685546875, 0.056732177734375, -0.060455322265625, 0.00409698486328125, -0.017913818359375, -0.02020263671875, -0.006000518798828125, -0.02117919921875, 0.04168701171875, -0.0185089111328125, -0.051177978515625, -0.00836944580078125, -0.025634765625, 0.0219268798828125, -0.0037631988525390625, -0.0012979507446289062, -0.031402587890625, -0.004222869873046875, 0.033843994140625, -0.000492095947265625, -0.033294677734375, 
-0.02716064453125, -0.006824493408203125, -0.01947021484375, -0.0016145706176757812, 0.01096343994140625, -0.049652099609375, 0.0262603759765625, 0.01345062255859375, 0.040740966796875, 0.0712890625, 0.0179595947265625, 0.01239776611328125, -0.05694580078125, 0.0293731689453125, 0.01111602783203125, 0.049224853515625, 0.05328369140625, -0.035369873046875, 0.038482666015625, 0.0255126953125, -0.0447998046875, -0.0469970703125, 0.0093994140625, -0.1064453125, -0.0017261505126953125, 0.0816650390625, -0.0120849609375, -0.021453857421875, 0.017425537109375, -0.0229034423828125, 0.05517578125, -0.02813720703125, 0.04730224609375, 0.046722412109375, 0.001766204833984375, -0.0205230712890625, -0.023162841796875, 0.0248260498046875, 0.02178955078125, -0.0200347900390625, -0.01165771484375, 0.0265960693359375, 0.03399658203125, 0.004985809326171875, 0.050537109375, -0.01151275634765625, 0.01611328125, -0.01654052734375, 0.0115966796875, -0.01080322265625, 0.004489898681640625, -0.011505126953125, -0.00457000732421875, -0.03173828125, -0.0137939453125 ] ]
kssteven/ibert-roberta-base
2021-11-22T10:09:32.000Z
[ "transformers", "pytorch", "ibert", "fill-mask", "arxiv:1907.11692", "arxiv:2101.01321", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
kssteven
null
null
kssteven/ibert-roberta-base
0
9,420
transformers
2022-03-02T23:29:05
# I-BERT base model

This model, `ibert-roberta-base`, is an integer-only quantized version of [RoBERTa](https://arxiv.org/abs/1907.11692), and was introduced in [this paper](https://arxiv.org/abs/2101.01321). I-BERT stores all parameters in INT8 representation and carries out the entire inference using integer-only arithmetic. In particular, I-BERT replaces all floating-point operations in the Transformer architecture (e.g., MatMul, GELU, Softmax, and LayerNorm) with closely approximating integer operations. This can result in up to 4x inference speedup compared to the floating-point counterpart when tested on an Nvidia T4 GPU. The best model parameters found via quantization-aware finetuning can then be exported (e.g., to TensorRT) for integer-only deployment of the model.

## Finetuning Procedure

Finetuning of I-BERT consists of 3 stages: (1) full-precision finetuning from the pretrained model on a downstream task, (2) model quantization, and (3) integer-only finetuning (i.e., quantization-aware training) of the quantized model.

### Full-precision finetuning

Full-precision finetuning of I-BERT is similar to RoBERTa finetuning. For instance, you can run the following command to finetune on the [MRPC](https://www.microsoft.com/en-us/download/details.aspx?id=52398) text classification task.

```
python examples/text-classification/run_glue.py \
  --model_name_or_path kssteven/ibert-roberta-base \
  --task_name MRPC \
  --do_eval \
  --do_train \
  --evaluation_strategy epoch \
  --max_seq_length 128 \
  --per_device_train_batch_size 32 \
  --save_steps 115 \
  --learning_rate 2e-5 \
  --num_train_epochs 10 \
  --output_dir $OUTPUT_DIR
```

### Model Quantization

Once you are done with full-precision finetuning, open up `config.json` in your checkpoint directory and set the `quant_mode` attribute to `true`.
```
{
  "_name_or_path": "kssteven/ibert-roberta-base",
  "architectures": [
    "IBertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "eos_token_id": 2,
  "finetuning_task": "mrpc",
  "force_dequant": "none",
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 514,
  "model_type": "ibert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 1,
  "position_embedding_type": "absolute",
  "quant_mode": true,
  "tokenizer_class": "RobertaTokenizer",
  "transformers_version": "4.4.0.dev0",
  "type_vocab_size": 1,
  "vocab_size": 50265
}
```

Then, your model will automatically run in integer-only mode when you load the checkpoint. Also, make sure to delete `optimizer.pt`, `scheduler.pt`, and `trainer_state.json` in the same directory. Otherwise, HF will not reset the optimizer, scheduler, or trainer state for the following integer-only finetuning.

### Integer-only finetuning (Quantization-aware training)

Finally, you will be able to run integer-only finetuning simply by loading the checkpoint file you modified. Note that the only difference in the example command below is `model_name_or_path`.

```
python examples/text-classification/run_glue.py \
  --model_name_or_path $CHECKPOINT_DIR \
  --task_name MRPC \
  --do_eval \
  --do_train \
  --evaluation_strategy epoch \
  --max_seq_length 128 \
  --per_device_train_batch_size 32 \
  --save_steps 115 \
  --learning_rate 1e-6 \
  --num_train_epochs 10 \
  --output_dir $OUTPUT_DIR
```

## Citation info

If you use I-BERT, please cite [our paper](https://arxiv.org/abs/2101.01321).

```
@article{kim2021bert,
  title={I-BERT: Integer-only BERT Quantization},
  author={Kim, Sehoon and Gholami, Amir and Yao, Zhewei and Mahoney, Michael W and Keutzer, Kurt},
  journal={arXiv preprint arXiv:2101.01321},
  year={2021}
}
```
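For intuition about the INT8 representation described above, here is a minimal sketch of symmetric per-tensor quantization. This is a simplified illustration only, not I-BERT's exact quantization scheme (which also handles activations and integer-only nonlinearities):

```python
import numpy as np

def quantize_int8(w):
    # Map float weights to integer codes q in [-127, 127] with w approx. scale * q
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

w = np.array([0.5, -1.0, 0.25, 0.0])
q, scale = quantize_int8(w)
print(q)          # integer codes stored in INT8
print(q * scale)  # dequantized values closely approximate w
```

Storing `q` instead of `w` cuts memory 4x versus FP32, and matrix multiplications can then run in integer arithmetic, with a single float `scale` per tensor applied at the end.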
4,381
[ [ -0.048919677734375, -0.043792724609375, 0.0264892578125, 0.0172119140625, -0.019622802734375, -0.023773193359375, -0.0240936279296875, -0.01270294189453125, -0.01412200927734375, 0.0249176025390625, -0.051116943359375, -0.034271240234375, -0.04736328125, -0.00965118408203125, -0.0035152435302734375, 0.09075927734375, 0.003620147705078125, 0.0299224853515625, -0.01238250732421875, -0.00273895263671875, -0.0122222900390625, -0.0411376953125, -0.06494140625, -0.01995849609375, 0.0205841064453125, 0.01461029052734375, 0.024658203125, 0.036346435546875, 0.047027587890625, 0.02667236328125, -0.0251617431640625, -0.019866943359375, -0.0241546630859375, -0.01922607421875, -0.00322723388671875, -0.040313720703125, -0.03851318359375, -0.00617218017578125, 0.06732177734375, 0.030120849609375, 0.00021135807037353516, 0.019195556640625, 0.003238677978515625, 0.0513916015625, -0.050994873046875, 0.022064208984375, -0.03277587890625, 0.00403594970703125, -0.0082855224609375, 0.0164642333984375, -0.033905029296875, -0.032470703125, 0.01806640625, -0.04071044921875, 0.0281219482421875, -0.026885986328125, 0.0814208984375, 0.04083251953125, -0.0396728515625, -0.009979248046875, -0.03594970703125, 0.06744384765625, -0.08233642578125, 0.01534271240234375, 0.0191192626953125, 0.006626129150390625, -0.0054931640625, -0.0635986328125, -0.037261962890625, -0.013702392578125, 0.0124053955078125, 0.013427734375, 0.0016393661499023438, 0.0323486328125, 0.050872802734375, 0.040618896484375, -0.04345703125, 0.007808685302734375, -0.045867919921875, -0.024658203125, 0.01788330078125, 0.01483154296875, -0.00565338134765625, -0.0185699462890625, -0.041290283203125, -0.0305023193359375, -0.0233306884765625, 0.037017822265625, 0.0051116943359375, 0.01300048828125, -0.01253509521484375, 0.025909423828125, -0.033355712890625, 0.049560546875, 0.0169677734375, -0.01561737060546875, 0.0261688232421875, -0.0031871795654296875, -0.0249786376953125, 0.0082550048828125, 0.07342529296875, 
0.0029125213623046875, -0.00203704833984375, 0.01546478271484375, -0.011749267578125, -0.0049896240234375, 0.01605224609375, -0.09307861328125, -0.0202484130859375, 0.03436279296875, -0.024139404296875, -0.0270843505859375, 0.005435943603515625, -0.03515625, 0.0167999267578125, -0.0211639404296875, 0.0670166015625, -0.0728759765625, 0.01222991943359375, -0.004337310791015625, -0.00475311279296875, 0.015777587890625, 0.03155517578125, -0.06427001953125, 0.009735107421875, 0.0460205078125, 0.06744384765625, 0.0230255126953125, -0.02191162109375, -0.0269622802734375, -0.007724761962890625, -0.022979736328125, 0.033905029296875, 0.0008907318115234375, -0.0112762451171875, -0.01447296142578125, 0.01242828369140625, -0.0161590576171875, -0.039093017578125, 0.033843994140625, -0.035552978515625, 0.021148681640625, -0.0166168212890625, -0.0469970703125, -0.039520263671875, 0.0177459716796875, -0.038421630859375, 0.0731201171875, 0.01038360595703125, -0.0389404296875, 0.020904541015625, -0.0465087890625, -0.01552581787109375, -0.005321502685546875, 0.0024890899658203125, -0.06396484375, -0.00385284423828125, 0.006099700927734375, 0.02337646484375, -0.0137176513671875, 0.0258331298828125, -0.0325927734375, -0.038787841796875, 0.012237548828125, -0.0287933349609375, 0.0927734375, 0.020111083984375, -0.04791259765625, 0.0005054473876953125, -0.05718994140625, 0.02301025390625, -0.01068878173828125, -0.0526123046875, 0.01287841796875, -0.01806640625, -0.0157318115234375, 0.0190582275390625, 0.045867919921875, -0.0328369140625, 0.0243377685546875, -0.01459503173828125, 0.041351318359375, 0.06500244140625, 0.01209259033203125, 0.034423828125, -0.049102783203125, 0.020843505859375, 0.017059326171875, 0.0145111083984375, -0.006786346435546875, -0.03961181640625, -0.08514404296875, -0.0384521484375, 0.04925537109375, 0.056793212890625, -0.036651611328125, 0.061370849609375, -0.0229339599609375, -0.048492431640625, -0.035430908203125, -0.01277923583984375, 0.038299560546875, 
0.033905029296875, 0.021484375, -0.0231781005859375, -0.0280303955078125, -0.0640869140625, 0.007099151611328125, -0.00592803955078125, -0.0069427490234375, 0.0139923095703125, 0.0526123046875, -0.006526947021484375, 0.050140380859375, -0.0308685302734375, -0.03594970703125, 0.011993408203125, 0.03399658203125, 0.039093017578125, 0.0753173828125, 0.038665771484375, -0.0260009765625, -0.04168701171875, -0.02703857421875, -0.060516357421875, 0.0203094482421875, -0.0248260498046875, -0.0172576904296875, 0.0176239013671875, 0.0181884765625, -0.042572021484375, 0.029327392578125, 0.019134521484375, -0.01126861572265625, 0.035888671875, -0.0145263671875, 0.021087646484375, -0.06536865234375, 0.00470733642578125, 0.0008873939514160156, -0.015411376953125, -0.028106689453125, 0.01345062255859375, 0.0185699462890625, -0.00841522216796875, -0.038604736328125, 0.0226898193359375, -0.01094818115234375, 0.00609588623046875, 0.007598876953125, -0.0308990478515625, 0.0025234222412109375, 0.0753173828125, -0.0023670196533203125, 0.054534912109375, 0.046051025390625, -0.04791259765625, -0.006694793701171875, 0.0226898193359375, -0.038116455078125, 0.0299224853515625, -0.0457763671875, 0.01319122314453125, -0.0036487579345703125, 0.0260772705078125, -0.0660400390625, -0.013885498046875, 0.02679443359375, -0.05181884765625, 0.02569580078125, -0.04205322265625, -0.04156494140625, -0.014373779296875, -0.01629638671875, 0.0233001708984375, 0.0572509765625, -0.0242767333984375, 0.03594970703125, -0.0012874603271484375, -0.00887298583984375, -0.027496337890625, -0.045013427734375, -0.0165863037109375, -0.0191802978515625, -0.054534912109375, 0.03076171875, -0.0280303955078125, 0.0002925395965576172, -0.0193023681640625, -0.0031871795654296875, -0.033660888671875, 0.00659942626953125, 0.0138702392578125, 0.0279693603515625, -0.0163421630859375, 0.01097869873046875, 0.0167999267578125, -0.0107574462890625, 0.00809478759765625, -0.0274505615234375, 0.048004150390625, -0.036285400390625, 
-0.00406646728515625, -0.04638671875, 0.00687408447265625, 0.047210693359375, -0.0197906494140625, 0.06671142578125, 0.072509765625, -0.04119873046875, 0.01302337646484375, -0.031707763671875, -0.0269622802734375, -0.03790283203125, 0.0260009765625, -0.0245513916015625, -0.039031982421875, 0.041656494140625, -0.00046753883361816406, 0.01357269287109375, 0.053802490234375, 0.047882080078125, -0.02325439453125, 0.07049560546875, 0.01358795166015625, 0.0186309814453125, 0.0635986328125, -0.0711669921875, -0.00838470458984375, -0.07867431640625, 0.00009959936141967773, -0.0253143310546875, -0.0267486572265625, -0.0227813720703125, -0.0232696533203125, 0.04766845703125, 0.03509521484375, -0.06097412109375, 0.01291656494140625, -0.051361083984375, 0.01146697998046875, 0.0740966796875, 0.037200927734375, -0.0030574798583984375, -0.01311492919921875, -0.007427215576171875, -0.009857177734375, -0.05169677734375, -0.0287322998046875, 0.080322265625, 0.0166168212890625, 0.03662109375, -0.00315093994140625, 0.0711669921875, 0.00213623046875, 0.004421234130859375, -0.06097412109375, 0.0394287109375, -0.01885986328125, -0.07379150390625, -0.0214385986328125, -0.034820556640625, -0.037384033203125, 0.024749755859375, -0.0195159912109375, -0.0533447265625, 0.0340576171875, 0.01361846923828125, -0.042724609375, 0.0262298583984375, -0.049896240234375, 0.07342529296875, -0.00823211669921875, -0.014739990234375, -0.0189971923828125, -0.0546875, 0.037078857421875, -0.005268096923828125, 0.00835418701171875, -0.02191162109375, 0.015899658203125, 0.07659912109375, -0.05694580078125, 0.06378173828125, -0.03253173828125, 0.0192413330078125, 0.039154052734375, 0.00424957275390625, 0.042327880859375, 0.00445556640625, -0.007495880126953125, 0.036956787109375, 0.0164642333984375, -0.031280517578125, -0.01325225830078125, 0.050323486328125, -0.07379150390625, -0.0194091796875, -0.046142578125, -0.045928955078125, 0.006622314453125, 0.029815673828125, 0.04461669921875, 0.047393798828125, 
0.016265869140625, 0.031341552734375, 0.062103271484375, 0.0013971328735351562, 0.04071044921875, 0.0328369140625, 0.0137786865234375, -0.057464599609375, 0.08221435546875, 0.0183563232421875, 0.0338134765625, 0.026702880859375, 0.00737762451171875, -0.0006489753723144531, -0.03607177734375, -0.034698486328125, 0.003635406494140625, -0.0560302734375, -0.0291290283203125, -0.041351318359375, -0.0260467529296875, -0.038482666015625, -0.01200103759765625, -0.0372314453125, -0.050445556640625, -0.04296875, 0.019622802734375, 0.04150390625, 0.0147552490234375, -0.03179931640625, 0.043853759765625, -0.06671142578125, 0.01922607421875, 0.025146484375, 0.0153350830078125, 0.005619049072265625, -0.054351806640625, -0.0288238525390625, 0.029327392578125, -0.043365478515625, -0.033935546875, 0.016265869140625, 0.031341552734375, 0.020355224609375, 0.0289154052734375, 0.0128021240234375, 0.056610107421875, -0.0266265869140625, 0.054901123046875, 0.004150390625, -0.0760498046875, 0.03155517578125, -0.033538818359375, 0.0251312255859375, 0.052337646484375, 0.031982421875, -0.0155792236328125, -0.0147705078125, -0.054931640625, -0.0718994140625, 0.06622314453125, 0.01605224609375, -0.021240234375, 0.00539398193359375, 0.0211639404296875, -0.0178375244140625, 0.01169586181640625, -0.068115234375, -0.0147705078125, -0.01546478271484375, -0.01251983642578125, 0.019195556640625, -0.00577545166015625, -0.01349639892578125, -0.029754638671875, 0.072265625, -0.008056640625, 0.04449462890625, 0.0302734375, -0.0151214599609375, -0.010223388671875, 0.0007691383361816406, 0.028411865234375, 0.039337158203125, -0.0389404296875, -0.031280517578125, 0.0252685546875, -0.0340576171875, 0.005916595458984375, 0.0166168212890625, -0.0229339599609375, -0.005153656005859375, 0.0224456787109375, 0.06072998046875, 0.0255126953125, -0.0308380126953125, 0.035919189453125, 0.00048065185546875, -0.0274810791015625, -0.04547119140625, 0.01027679443359375, -0.0098419189453125, 0.0288238525390625, 
0.0260162353515625, 0.02496337890625, 0.01000213623046875, -0.023895263671875, 0.0131988525390625, 0.026611328125, -0.043243408203125, -0.00684356689453125, 0.0635986328125, -0.0015850067138671875, -0.0100555419921875, 0.069580078125, -0.017974853515625, -0.03753662109375, 0.07037353515625, 0.0210723876953125, 0.06719970703125, 0.004901885986328125, -0.004398345947265625, 0.0560302734375, 0.01111602783203125, 0.003200531005859375, 0.0256805419921875, -0.005584716796875, -0.038787841796875, -0.0289154052734375, -0.05316162109375, -0.0256805419921875, -0.00656890869140625, -0.0791015625, 0.0292510986328125, -0.0660400390625, -0.0212249755859375, 0.009613037109375, 0.0054779052734375, -0.056488037109375, 0.033416748046875, 0.01479339599609375, 0.058441162109375, -0.037841796875, 0.07635498046875, 0.040252685546875, -0.043060302734375, -0.04425048828125, -0.0202178955078125, -0.00847625732421875, -0.060394287109375, 0.0347900390625, 0.01090240478515625, 0.038116455078125, 0.0241546630859375, -0.04779052734375, -0.065185546875, 0.1004638671875, 0.0111541748046875, -0.043243408203125, -0.016876220703125, -0.0018367767333984375, 0.019317626953125, 0.005115509033203125, 0.0175933837890625, 0.031951904296875, 0.02386474609375, 0.01441192626953125, -0.06341552734375, 0.0024871826171875, -0.00786590576171875, -0.00901031494140625, 0.0300140380859375, -0.068115234375, 0.10980224609375, -0.0237884521484375, 0.017303466796875, 0.03118896484375, 0.044525146484375, 0.044677734375, -0.0038051605224609375, 0.0369873046875, 0.074951171875, 0.046356201171875, -0.0272369384765625, 0.0565185546875, -0.04644775390625, 0.0660400390625, 0.064453125, 0.0269927978515625, 0.0609130859375, 0.021331787109375, -0.03314208984375, 0.046173095703125, 0.06585693359375, -0.01214599609375, 0.053619384765625, 0.0014047622680664062, -0.0340576171875, -0.0291748046875, 0.01548004150390625, -0.03973388671875, 0.004638671875, 0.0269012451171875, -0.018218994140625, -0.009613037109375, 
0.0005750656127929688, 0.01175689697265625, -0.0517578125, -0.021820068359375, 0.065673828125, 0.0007419586181640625, -0.045196533203125, 0.065673828125, -0.00830841064453125, 0.048370361328125, -0.0643310546875, -0.00965118408203125, 0.007717132568359375, 0.034423828125, -0.005290985107421875, -0.039794921875, 0.0012416839599609375, -0.027496337890625, -0.0270538330078125, -0.0025730133056640625, 0.0159454345703125, -0.025787353515625, -0.043609619140625, -0.0105743408203125, -0.01200103759765625, 0.0164337158203125, -0.023590087890625, -0.05828857421875, -0.006435394287109375, 0.02056884765625, -0.0092315673828125, 0.028228759765625, 0.0419921875, 0.01153564453125, 0.0374755859375, 0.05523681640625, -0.02362060546875, 0.02191162109375, -0.0223541259765625, 0.06280517578125, -0.055877685546875, -0.036468505859375, -0.048370361328125, 0.07025146484375, 0.01806640625, -0.044952392578125, 0.0202789306640625, 0.04443359375, 0.05877685546875, -0.00812530517578125, 0.058624267578125, -0.035247802734375, 0.0173492431640625, -0.040618896484375, 0.04779052734375, -0.0272064208984375, 0.00251007080078125, -0.010589599609375, -0.058624267578125, 0.00217437744140625, 0.068115234375, -0.0201263427734375, 0.0022220611572265625, 0.042572021484375, 0.032867431640625, 0.003513336181640625, -0.0143280029296875, 0.0008678436279296875, 0.034942626953125, 0.036895751953125, 0.057586669921875, 0.03558349609375, -0.0584716796875, 0.04827880859375, -0.054718017578125, -0.032684326171875, -0.01224517822265625, -0.04937744140625, -0.0560302734375, -0.04180908203125, -0.02679443359375, -0.0347900390625, -0.0085296630859375, 0.0692138671875, 0.052642822265625, -0.063232421875, -0.00835418701171875, -0.0172271728515625, -0.0104217529296875, -0.01275634765625, -0.0218048095703125, 0.01708984375, -0.04290771484375, -0.0723876953125, -0.00006383657455444336, -0.01461029052734375, 0.0230865478515625, -0.01213836669921875, 0.0025730133056640625, -0.02105712890625, -0.0183258056640625, 
0.0185394287109375, 0.015380859375, -0.046905517578125, -0.01512908935546875, -0.0117950439453125, -0.00832366943359375, 0.0167999267578125, 0.0210723876953125, -0.061370849609375, 0.036651611328125, 0.03399658203125, 0.0119476318359375, 0.064697265625, -0.0214385986328125, 0.036651611328125, -0.033355712890625, 0.0242919921875, 0.0296173095703125, 0.040252685546875, 0.0002567768096923828, -0.0207672119140625, 0.0343017578125, 0.01358795166015625, -0.059967041015625, -0.04937744140625, -0.0175628662109375, -0.077392578125, -0.006748199462890625, 0.06427001953125, -0.0302734375, -0.0178375244140625, 0.0270538330078125, -0.0213775634765625, 0.0274505615234375, -0.035125732421875, 0.0579833984375, 0.04718017578125, -0.005138397216796875, 0.01354217529296875, -0.0347900390625, 0.031890869140625, 0.0212860107421875, -0.035552978515625, -0.0300445556640625, 0.01513671875, 0.037139892578125, 0.015625, 0.032958984375, -0.0018863677978515625, 0.039581298828125, 0.004940032958984375, 0.0307769775390625, 0.006500244140625, -0.00823974609375, -0.0278472900390625, -0.0185699462890625, -0.005126953125, -0.034881591796875 ] ]
KoboldAI/OPT-13B-Nerybus-Mix
2023-03-09T06:53:39.000Z
[ "transformers", "pytorch", "opt", "text-generation", "en", "arxiv:2205.01068", "license:other", "has_space", "text-generation-inference", "region:us" ]
text-generation
KoboldAI
null
null
KoboldAI/OPT-13B-Nerybus-Mix
27
9,420
transformers
2023-02-13T06:32:16
--- language: en license: other commercial: no inference: false --- # OPT 13B - Nerybus Mix ## Model description This is a merged (50/50) model of both Erebus 13B and Nerys V2 13B by Mr. Seeker. **Warning: THIS model is NOT suitable for use by minors. The model will output X-rated content.** ## Information: For more information, check out the two source models: - https://huggingface.co/KoboldAI/OPT-13B-Erebus - https://huggingface.co/KoboldAI/OPT-13B-Nerys-v2 ## Limitations and biases Based on known problems with NLP technology, potential relevant factors include bias (gender, profession, race and religion). **Warning: This model has a very strong NSFW bias!** ### License OPT-13B is licensed under the OPT-175B license, Copyright (c) Meta Platforms, Inc. All Rights Reserved. ### BibTeX entry and citation info ``` @misc{zhang2022opt, title={OPT: Open Pre-trained Transformer Language Models}, author={Susan Zhang and Stephen Roller and Naman Goyal and Mikel Artetxe and Moya Chen and Shuohui Chen and Christopher Dewan and Mona Diab and Xian Li and Xi Victoria Lin and Todor Mihaylov and Myle Ott and Sam Shleifer and Kurt Shuster and Daniel Simig and Punit Singh Koura and Anjali Sridhar and Tianlu Wang and Luke Zettlemoyer}, year={2022}, eprint={2205.01068}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
1,362
[ [ -0.035125732421875, -0.0521240234375, -0.007389068603515625, 0.035675048828125, -0.01641845703125, -0.01204681396484375, -0.01910400390625, -0.050811767578125, 0.038238525390625, 0.04638671875, -0.07122802734375, -0.01983642578125, -0.03997802734375, 0.0039043426513671875, -0.039215087890625, 0.04388427734375, 0.0167999267578125, 0.0245819091796875, 0.0141448974609375, 0.01316070556640625, -0.038177490234375, -0.023193359375, -0.052154541015625, -0.028167724609375, 0.0098419189453125, 0.0394287109375, 0.05694580078125, 0.01320648193359375, 0.056671142578125, 0.021270751953125, -0.035369873046875, 0.00351715087890625, -0.055145263671875, -0.0154571533203125, -0.01377105712890625, -0.01438140869140625, -0.057952880859375, -0.0028781890869140625, 0.037567138671875, 0.04693603515625, -0.0189971923828125, 0.0131072998046875, -0.0008111000061035156, 0.018829345703125, -0.0243377685546875, -0.0259246826171875, -0.03582763671875, -0.00244903564453125, -0.00830841064453125, 0.001979827880859375, -0.06414794921875, -0.0262908935546875, 0.01444244384765625, -0.025177001953125, 0.01070404052734375, 0.02667236328125, 0.10076904296875, 0.0182952880859375, -0.022674560546875, -0.011505126953125, -0.043060302734375, 0.06463623046875, -0.07861328125, 0.042144775390625, 0.0220794677734375, 0.025421142578125, -0.005153656005859375, -0.055206298828125, -0.020660400390625, -0.0161590576171875, -0.0205841064453125, 0.0206756591796875, -0.03875732421875, -0.0148162841796875, -0.0054168701171875, 0.0404052734375, -0.049163818359375, 0.01299285888671875, -0.06732177734375, -0.00970458984375, 0.0262603759765625, 0.01122283935546875, 0.03192138671875, -0.0265960693359375, -0.026611328125, -0.0213623046875, -0.04083251953125, -0.025177001953125, 0.0406494140625, 0.035247802734375, -0.03948974609375, 0.05419921875, -0.0008897781372070312, 0.07012939453125, 0.032135009765625, 0.01129913330078125, 0.0389404296875, -0.025054931640625, -0.036376953125, 0.001712799072265625, 0.04998779296875, 
0.0264739990234375, 0.009765625, -0.00443267822265625, 0.003376007080078125, -0.0002579689025878906, 0.02850341796875, -0.06646728515625, -0.0120849609375, -0.00508880615234375, -0.05792236328125, -0.037139892578125, 0.01435089111328125, -0.0709228515625, -0.00426483154296875, -0.0182037353515625, 0.0179443359375, -0.039764404296875, -0.047515869140625, 0.02032470703125, 0.02197265625, 0.04730224609375, -0.0010528564453125, -0.054107666015625, 0.044158935546875, 0.00847625732421875, 0.03955078125, -0.00876617431640625, -0.02288818359375, 0.016204833984375, -0.00864410400390625, -0.03448486328125, 0.03839111328125, -0.0386962890625, -0.025482177734375, -0.004055023193359375, 0.0296478271484375, -0.00982666015625, -0.0531005859375, 0.0648193359375, -0.030242919921875, 0.03948974609375, 0.01334381103515625, -0.0187835693359375, -0.0256195068359375, 0.00006663799285888672, -0.06298828125, 0.091064453125, 0.01384735107421875, -0.05657958984375, 0.030853271484375, -0.0246124267578125, -0.0191192626953125, 0.006008148193359375, 0.00722503662109375, -0.0246124267578125, 0.024261474609375, -0.0018434524536132812, 0.0178070068359375, -0.0217132568359375, 0.0307769775390625, -0.0265655517578125, -0.017120361328125, 0.004962921142578125, -0.0313720703125, 0.07000732421875, 0.0107879638671875, -0.021270751953125, 0.0041656494140625, -0.07537841796875, 0.0010728836059570312, 0.0220489501953125, -0.010589599609375, -0.00981903076171875, -0.021759033203125, 0.030029296875, 0.0295257568359375, 0.02728271484375, -0.038543701171875, -0.014007568359375, -0.029571533203125, 0.01476287841796875, 0.0421142578125, -0.01300048828125, 0.040985107421875, -0.016876220703125, 0.051849365234375, -0.005329132080078125, 0.0281982421875, 0.0206298828125, -0.0281524658203125, -0.08404541015625, -0.007354736328125, 0.01568603515625, 0.035186767578125, -0.020660400390625, 0.047454833984375, 0.0014085769653320312, -0.07806396484375, -0.048370361328125, -0.021881103515625, 0.011993408203125, 
0.0016193389892578125, 0.044342041015625, 0.0008015632629394531, -0.07122802734375, -0.07342529296875, -0.00983428955078125, -0.0169677734375, 0.0250701904296875, 0.035675048828125, 0.0301666259765625, -0.036773681640625, 0.048614501953125, -0.03411865234375, -0.026336669921875, -0.05108642578125, -0.01232147216796875, 0.03546142578125, 0.04833984375, 0.06072998046875, -0.040924072265625, -0.0276031494140625, -0.0035991668701171875, -0.04229736328125, -0.03369140625, 0.0007128715515136719, -0.034210205078125, 0.001270294189453125, 0.041748046875, -0.00800323486328125, 0.042877197265625, 0.06402587890625, -0.043365478515625, 0.033233642578125, -0.006076812744140625, -0.004230499267578125, -0.095947265625, -0.0074310302734375, 0.0024738311767578125, -0.01378631591796875, -0.03643798828125, -0.0016536712646484375, 0.0208282470703125, -0.002079010009765625, -0.04443359375, 0.032135009765625, -0.043426513671875, 0.0156707763671875, -0.0277557373046875, -0.0005846023559570312, -0.01751708984375, 0.0150299072265625, 0.016021728515625, 0.025482177734375, 0.051544189453125, -0.042449951171875, 0.020477294921875, 0.032623291015625, -0.0089263916015625, 0.038665771484375, -0.0706787109375, -0.00926971435546875, 0.0008497238159179688, -0.004817962646484375, -0.055267333984375, -0.023712158203125, 0.0181121826171875, -0.0517578125, 0.03594970703125, 0.006481170654296875, -0.0256195068359375, -0.037689208984375, 0.004852294921875, 0.0118865966796875, 0.046478271484375, -0.0439453125, 0.07177734375, 0.003635406494140625, 0.0003180503845214844, -0.03741455078125, -0.0916748046875, -0.016265869140625, -0.0207672119140625, -0.0675048828125, 0.043548583984375, 0.003948211669921875, 0.0026683807373046875, 0.0256500244140625, -0.012939453125, -0.03302001953125, -0.0208892822265625, 0.023345947265625, 0.04644775390625, -0.01091766357421875, -0.0216064453125, 0.01861572265625, -0.0194854736328125, -0.002349853515625, 0.01485443115234375, 0.057525634765625, -0.01490020751953125, 
-0.0194854736328125, -0.027374267578125, 0.01189422607421875, 0.0340576171875, -0.01326751708984375, 0.08538818359375, 0.057220458984375, -0.044189453125, -0.01348114013671875, -0.042083740234375, -0.028106689453125, -0.036529541015625, 0.039215087890625, -0.0112152099609375, -0.05657958984375, 0.032867431640625, 0.0081024169921875, 0.0012531280517578125, 0.054931640625, 0.04180908203125, 0.0263214111328125, 0.043060302734375, 0.07568359375, 0.018951416015625, 0.040191650390625, -0.0236358642578125, 0.03729248046875, -0.053375244140625, -0.04986572265625, -0.042938232421875, -0.0259246826171875, -0.032867431640625, -0.01235198974609375, -0.023712158203125, 0.0011262893676757812, -0.0197906494140625, 0.06536865234375, -0.044036865234375, 0.0279388427734375, 0.042694091796875, 0.01515960693359375, 0.007129669189453125, -0.0033817291259765625, -0.0048675537109375, -0.0174102783203125, -0.049560546875, -0.03729248046875, 0.066162109375, 0.0280914306640625, 0.09613037109375, 0.030303955078125, 0.054656982421875, -0.016845703125, 0.004642486572265625, -0.035400390625, 0.0523681640625, -0.01371002197265625, -0.09423828125, -0.0004181861877441406, -0.026397705078125, -0.047821044921875, 0.0181884765625, -0.0204620361328125, -0.04107666015625, 0.037841796875, -0.004512786865234375, -0.01242828369140625, 0.0135955810546875, -0.0653076171875, 0.06982421875, -0.012420654296875, -0.015350341796875, -0.0140380859375, -0.03985595703125, 0.02532958984375, 0.004558563232421875, 0.00879669189453125, 0.01192474365234375, 0.018951416015625, 0.06475830078125, -0.019073486328125, 0.06494140625, 0.013092041015625, 0.0113525390625, 0.024444580078125, 0.00833892822265625, 0.016571044921875, 0.00702667236328125, 0.017547607421875, 0.0029392242431640625, -0.01824951171875, -0.01702880859375, 0.001346588134765625, 0.058074951171875, -0.0662841796875, 0.0012493133544921875, -0.04608154296875, -0.006317138671875, 0.0220489501953125, 0.03387451171875, 0.0484619140625, 0.0517578125, 
-0.01422119140625, 0.042938232421875, 0.0352783203125, -0.033538818359375, 0.036163330078125, 0.05511474609375, -0.0377197265625, -0.04571533203125, 0.044189453125, -0.0012454986572265625, 0.034454345703125, 0.01085662841796875, -0.00597381591796875, -0.021759033203125, -0.024200439453125, 0.007198333740234375, 0.0413818359375, -0.0423583984375, -0.01201629638671875, -0.04901123046875, -0.029510498046875, -0.033477783203125, -0.016510009765625, -0.041168212890625, -0.004917144775390625, -0.039642333984375, -0.024169921875, 0.01172637939453125, 0.0545654296875, -0.0300750732421875, 0.04095458984375, -0.032501220703125, 0.0135040283203125, -0.0029277801513671875, 0.02081298828125, -0.0019016265869140625, -0.07733154296875, -0.0152587890625, -0.0143585205078125, -0.0255126953125, -0.0924072265625, 0.046112060546875, -0.006191253662109375, 0.052642822265625, 0.032470703125, 0.000705718994140625, 0.048187255859375, -0.02398681640625, 0.054290771484375, 0.018890380859375, -0.0618896484375, 0.018096923828125, -0.00211334228515625, 0.0008611679077148438, 0.04498291015625, 0.0240020751953125, -0.0263519287109375, -0.02203369140625, -0.059417724609375, -0.075927734375, 0.08184814453125, 0.0276031494140625, 0.01065826416015625, -0.00939178466796875, -0.0214080810546875, 0.0077972412109375, 0.004398345947265625, -0.10107421875, -0.04638671875, -0.020416259765625, -0.0160675048828125, 0.0154571533203125, -0.040252685546875, 0.0192718505859375, -0.0194091796875, 0.0841064453125, 0.024261474609375, 0.0313720703125, 0.0026645660400390625, -0.0197906494140625, -0.007526397705078125, 0.01136016845703125, 0.0323486328125, 0.04180908203125, -0.0521240234375, 0.0007476806640625, 0.015869140625, -0.050567626953125, -0.00560760498046875, 0.01296234130859375, -0.0167694091796875, 0.018585205078125, 0.0167999267578125, 0.06591796875, 0.031005859375, -0.030120849609375, 0.022796630859375, -0.0032558441162109375, -0.0214691162109375, -0.039794921875, -0.03302001953125, 0.01474761962890625, 
0.01947021484375, 0.013916015625, 0.018951416015625, 0.0032024383544921875, -0.0202178955078125, 0.0021915435791015625, 0.0011320114135742188, -0.042083740234375, -0.02313232421875, 0.04388427734375, 0.00853729248046875, -0.0208892822265625, 0.046630859375, -0.030670166015625, -0.01476287841796875, 0.03582763671875, 0.07171630859375, 0.06671142578125, -0.02850341796875, 0.01641845703125, 0.045440673828125, 0.07086181640625, 0.006832122802734375, 0.035064697265625, 0.0546875, -0.054290771484375, -0.041168212890625, -0.0338134765625, -0.014190673828125, 0.023468017578125, -0.07037353515625, 0.06005859375, -0.006961822509765625, -0.023040771484375, -0.012908935546875, -0.0234375, -0.03558349609375, 0.0289306640625, 0.0186614990234375, 0.0787353515625, -0.0745849609375, 0.0254669189453125, 0.06793212890625, -0.0275115966796875, -0.0723876953125, -0.02642822265625, -0.0164794921875, -0.042633056640625, 0.03460693359375, 0.019378662109375, 0.00390625, 0.0034046173095703125, -0.0462646484375, -0.08056640625, 0.063720703125, 0.01861572265625, -0.034576416015625, -0.0202178955078125, -0.0232696533203125, 0.0225067138671875, -0.0175323486328125, 0.048248291015625, 0.035064697265625, 0.038848876953125, 0.020538330078125, -0.0635986328125, -0.01812744140625, -0.0189666748046875, -0.012725830078125, 0.01230621337890625, -0.04754638671875, 0.07574462890625, -0.00812530517578125, -0.0146942138671875, 0.0109710693359375, 0.051116943359375, 0.0192718505859375, 0.01126861572265625, 0.01375579833984375, 0.060516357421875, 0.022735595703125, -0.015899658203125, 0.044891357421875, -0.0281524658203125, 0.04888916015625, 0.0928955078125, 0.0004146099090576172, 0.04437255859375, 0.03570556640625, -0.02398681640625, 0.047149658203125, 0.05615234375, -0.00394439697265625, 0.050140380859375, -0.00969696044921875, 0.03173828125, -0.028106689453125, 0.0013074874877929688, -0.060150146484375, 0.0087127685546875, 0.0170745849609375, -0.038787841796875, -0.0218048095703125, -0.0179443359375, 
0.01490020751953125, 0.004383087158203125, -0.005123138427734375, 0.0325927734375, 0.01001739501953125, -0.01324462890625, 0.03839111328125, 0.00838470458984375, 0.06707763671875, -0.04852294921875, 0.0171966552734375, -0.003082275390625, 0.01256561279296875, -0.025390625, -0.035888671875, 0.02728271484375, -0.004688262939453125, -0.01076507568359375, -0.002040863037109375, 0.060394287109375, -0.0178070068359375, -0.057861328125, 0.046661376953125, 0.038543701171875, 0.0105743408203125, 0.018402099609375, -0.048126220703125, 0.00963592529296875, 0.0169219970703125, -0.02020263671875, -0.01126861572265625, 0.02105712890625, 0.0250701904296875, 0.0426025390625, 0.0259246826171875, 0.0032558441162109375, 0.04718017578125, 0.00431060791015625, 0.06072998046875, -0.04119873046875, -0.026824951171875, -0.055389404296875, 0.026336669921875, -0.0187835693359375, -0.03839111328125, 0.0582275390625, 0.04388427734375, 0.07305908203125, -0.025909423828125, 0.0379638671875, -0.0352783203125, 0.04638671875, -0.0157012939453125, 0.031646728515625, -0.041748046875, -0.020538330078125, -0.03448486328125, -0.10107421875, -0.0023860931396484375, 0.07220458984375, -0.01800537109375, 0.0192718505859375, 0.0711669921875, 0.061279296875, -0.01073455810546875, 0.00583648681640625, 0.0160980224609375, 0.0287017822265625, 0.00397491455078125, 0.043548583984375, 0.0755615234375, -0.03533935546875, 0.042694091796875, -0.0287017822265625, -0.0227813720703125, -0.03564453125, -0.05322265625, -0.09149169921875, -0.01360321044921875, -0.01812744140625, -0.0291595458984375, -0.0225677490234375, 0.0614013671875, 0.05621337890625, -0.04766845703125, -0.01702880859375, -0.0180816650390625, -0.019195556640625, -0.01299285888671875, -0.016845703125, 0.00185394287109375, 0.009063720703125, -0.056610107421875, 0.03216552734375, 0.0027446746826171875, 0.042144775390625, -0.01201629638671875, -0.008514404296875, -0.01493072509765625, 0.0164794921875, 0.0072784423828125, -0.01139068603515625, 
-0.04998779296875, 0.0240478515625, 0.039031982421875, -0.003971099853515625, -0.0248565673828125, 0.01412200927734375, -0.036529541015625, 0.0125885009765625, 0.038848876953125, 0.01485443115234375, 0.0247802734375, -0.0079345703125, 0.02313232421875, -0.021148681640625, 0.02471923828125, 0.0122528076171875, 0.035888671875, 0.0195159912109375, -0.001194000244140625, 0.048675537109375, 0.007373809814453125, -0.03863525390625, -0.080810546875, 0.01385498046875, -0.09228515625, -0.006717681884765625, 0.09906005859375, 0.01343536376953125, -0.0063018798828125, 0.03131103515625, -0.0338134765625, 0.01080322265625, -0.03173828125, -0.0021381378173828125, 0.052978515625, 0.037994384765625, -0.01413726806640625, -0.03668212890625, -0.0004112720489501953, 0.0287017822265625, -0.047271728515625, -0.0259552001953125, 0.0270538330078125, 0.0024433135986328125, 0.01189422607421875, 0.01131439208984375, -0.0298004150390625, 0.01189422607421875, 0.00514984130859375, 0.0294342041015625, -0.002841949462890625, -0.04248046875, -0.0037708282470703125, -0.01461029052734375, -0.002227783203125, -0.0010881423950195312 ] ]
stablediffusionapi/all-526
2023-04-26T20:04:01.000Z
[ "diffusers", "stablediffusionapi.com", "stable-diffusion-api", "text-to-image", "ultra-realistic", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
stablediffusionapi
null
null
stablediffusionapi/all-526
2
9,408
diffusers
2023-04-26T20:02:20
--- license: creativeml-openrail-m tags: - stablediffusionapi.com - stable-diffusion-api - text-to-image - ultra-realistic pinned: true --- # All 526 API Inference ![generated from stablediffusionapi.com](https://pub-8b49af329fae499aa563997f5d4068a4.r2.dev/generations/9290108341682539305.png) ## Get API Key Get API key from [Stable Diffusion API](http://stablediffusionapi.com/), No Payment needed. Replace Key in below code, change **model_id** to "all-526" Coding in PHP/Node/Java etc? Have a look at docs for more code examples: [View docs](https://stablediffusionapi.com/docs) Model link: [View model](https://stablediffusionapi.com/models/all-526) Credits: [View credits](https://civitai.com/?query=All%20526) View all models: [View Models](https://stablediffusionapi.com/models) import requests import json url = "https://stablediffusionapi.com/api/v3/dreambooth" payload = json.dumps({ "key": "", "model_id": "all-526", "prompt": "actual 8K portrait photo of gareth person, portrait, happy colors, bright eyes, clear eyes, warm smile, smooth soft skin, big dreamy eyes, beautiful intricate colored hair, symmetrical, anime wide eyes, soft lighting, detailed face, by makoto shinkai, stanley artgerm lau, wlop, rossdraws, concept art, digital painting, looking into camera", "negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime", "width": "512", "height": "512", "samples": "1", "num_inference_steps": "30", "safety_checker": "no", "enhance_prompt": "yes", "seed": None, "guidance_scale": 7.5, "multi_lingual": "no", "panorama": "no", "self_attention": "no", "upscale": "no", "embeddings": "embeddings_model_id", "lora": "lora_model_id", "webhook": None, "track_id": None }) headers = { 'Content-Type': 'application/json' } response 
= requests.request("POST", url, headers=headers, data=payload) print(response.text) > Use this coupon code to get 25% off **DMGG0RBN**
2,390
[ [ -0.03131103515625, -0.049652099609375, 0.039276123046875, 0.02685546875, -0.029510498046875, 0.0007305145263671875, 0.0258331298828125, -0.0281219482421875, 0.0379638671875, 0.04144287109375, -0.06634521484375, -0.06109619140625, -0.023193359375, 0.00301361083984375, -0.006679534912109375, 0.04327392578125, 0.01244354248046875, -0.00997161865234375, -0.005615234375, 0.01442718505859375, -0.0187530517578125, -0.0082550048828125, -0.048309326171875, -0.00829315185546875, 0.017913818359375, -0.0023040771484375, 0.039276123046875, 0.0467529296875, 0.0214691162109375, 0.021636962890625, -0.0147552490234375, -0.0006771087646484375, -0.02825927734375, -0.0230560302734375, -0.01336669921875, -0.058013916015625, -0.04833984375, -0.010009765625, 0.0261077880859375, 0.035125732421875, 0.00554656982421875, 0.031524658203125, -0.0025501251220703125, 0.054534912109375, -0.054656982421875, 0.0224609375, -0.0261993408203125, 0.01345062255859375, -0.004207611083984375, 0.0005970001220703125, -0.026275634765625, -0.022918701171875, -0.00901031494140625, -0.06707763671875, 0.02215576171875, 0.00628662109375, 0.09088134765625, 0.0199432373046875, -0.02032470703125, -0.007537841796875, -0.0267791748046875, 0.06109619140625, -0.07501220703125, 0.0188751220703125, 0.037384033203125, -0.0010156631469726562, -0.00787353515625, -0.058013916015625, -0.04833984375, 0.034332275390625, 0.01418304443359375, 0.018310546875, -0.035400390625, -0.003833770751953125, 0.0221405029296875, 0.025787353515625, -0.03753662109375, -0.0185699462890625, -0.032989501953125, -0.006134033203125, 0.048980712890625, 0.01309967041015625, 0.01053619384765625, -0.039398193359375, -0.0301513671875, -0.0189056396484375, -0.034698486328125, 0.0173492431640625, 0.04547119140625, 0.0245513916015625, -0.0406494140625, 0.03240966796875, -0.034027099609375, 0.054351806640625, 0.0164031982421875, -0.006992340087890625, 0.05487060546875, -0.01270294189453125, -0.026458740234375, -0.020843505859375, 0.081298828125, 
0.047149658203125, -0.0036869049072265625, 0.020660400390625, -0.0032806396484375, 0.00104522705078125, 0.00072479248046875, -0.0745849609375, -0.006488800048828125, 0.057830810546875, -0.05517578125, -0.040863037109375, 0.006465911865234375, -0.08074951171875, -0.0178375244140625, 0.00890350341796875, 0.0222015380859375, -0.0243072509765625, -0.041748046875, 0.034576416015625, -0.01445770263671875, 0.0111846923828125, 0.016143798828125, -0.05987548828125, 0.00421905517578125, 0.047698974609375, 0.061187744140625, 0.011077880859375, -0.002109527587890625, 0.01522064208984375, 0.00983428955078125, -0.022430419921875, 0.061614990234375, -0.0156402587890625, -0.03643798828125, -0.00890350341796875, 0.0218048095703125, -0.0109405517578125, -0.0382080078125, 0.05267333984375, -0.040313720703125, -0.00777435302734375, -0.01434326171875, -0.0301971435546875, -0.02850341796875, 0.01045989990234375, -0.05072021484375, 0.0521240234375, 0.00812530517578125, -0.0601806640625, 0.00804901123046875, -0.057159423828125, -0.01197052001953125, -0.0000680685043334961, 0.00930023193359375, -0.039031982421875, -0.00490570068359375, 0.006725311279296875, 0.0231170654296875, -0.007572174072265625, -0.005138397216796875, -0.053619384765625, -0.0255889892578125, 0.0259552001953125, -0.0223846435546875, 0.08917236328125, 0.03643798828125, -0.0162811279296875, -0.0006775856018066406, -0.06903076171875, 0.01352691650390625, 0.036407470703125, -0.02105712890625, -0.0134735107421875, -0.019866943359375, 0.0056915283203125, -0.005268096923828125, 0.019195556640625, -0.038818359375, 0.0162506103515625, -0.035125732421875, 0.037506103515625, 0.043487548828125, 0.0207366943359375, 0.0175628662109375, -0.0238494873046875, 0.053314208984375, 0.0147552490234375, 0.032501220703125, -0.00466156005859375, -0.05194091796875, -0.044525146484375, -0.0361328125, 0.02117919921875, 0.034942626953125, -0.03424072265625, 0.038726806640625, -0.017425537109375, -0.046661376953125, -0.051513671875, 
-0.0081787109375, 0.022857666015625, 0.029571533203125, 0.01019287109375, -0.01739501953125, -0.05133056640625, -0.06414794921875, -0.0023365020751953125, -0.01352691650390625, -0.007114410400390625, 0.0167388916015625, 0.041351318359375, -0.029541015625, 0.061614990234375, -0.061614990234375, -0.0013189315795898438, -0.001972198486328125, 0.0005326271057128906, 0.0648193359375, 0.048675537109375, 0.05987548828125, -0.06561279296875, -0.0304412841796875, -0.03167724609375, -0.049285888671875, 0.01062774658203125, 0.01457977294921875, -0.034088134765625, -0.0020046234130859375, -0.0014972686767578125, -0.07415771484375, 0.046875, 0.0302886962890625, -0.04693603515625, 0.0380859375, -0.005214691162109375, 0.03839111328125, -0.09320068359375, -0.0012912750244140625, 0.01165008544921875, -0.016632080078125, -0.03070068359375, 0.0261993408203125, 0.00669097900390625, -0.0033321380615234375, -0.0531005859375, 0.034942626953125, -0.025634765625, 0.01099395751953125, -0.018218994140625, -0.0017728805541992188, 0.0233154296875, 0.030242919921875, -0.0043792724609375, 0.0278472900390625, 0.046356201171875, -0.03228759765625, 0.036956787109375, 0.022491455078125, -0.0263519287109375, 0.040771484375, -0.04412841796875, 0.0091400146484375, 0.0017633438110351562, 0.03338623046875, -0.09130859375, -0.035186767578125, 0.036590576171875, -0.048675537109375, -0.00879669189453125, -0.042877197265625, -0.04241943359375, -0.04937744140625, -0.027557373046875, 0.026123046875, 0.055328369140625, -0.03265380859375, 0.056610107421875, 0.01177978515625, 0.0209808349609375, -0.050048828125, -0.064697265625, -0.025238037109375, -0.02935791015625, -0.05169677734375, 0.0180511474609375, -0.018646240234375, -0.0211181640625, 0.0016660690307617188, 0.00335693359375, -0.01593017578125, -0.00318145751953125, 0.03607177734375, 0.037750244140625, -0.01421356201171875, -0.0196685791015625, 0.003665924072265625, -0.005130767822265625, 0.0165557861328125, -0.0235137939453125, 0.059356689453125, 
-0.00440216064453125, -0.042755126953125, -0.062286376953125, 0.007144927978515625, 0.052764892578125, -0.0012760162353515625, 0.042266845703125, 0.032257080078125, -0.04742431640625, -0.0029506683349609375, -0.041748046875, -0.00960540771484375, -0.03631591796875, 0.0167388916015625, -0.04107666015625, -0.021881103515625, 0.0732421875, 0.0000858306884765625, -0.0059967041015625, 0.051239013671875, 0.03448486328125, -0.005725860595703125, 0.09271240234375, 0.01434326171875, 0.01007080078125, 0.0205230712890625, -0.05694580078125, -0.0000317692756652832, -0.05908203125, -0.0197601318359375, -0.026397705078125, -0.0211334228515625, -0.03277587890625, -0.041351318359375, -0.0035858154296875, 0.016265869140625, -0.02972412109375, 0.0197906494140625, -0.048736572265625, 0.03082275390625, 0.039794921875, 0.027740478515625, 0.0165252685546875, -0.005115509033203125, -0.0021724700927734375, 0.002899169921875, -0.0284271240234375, -0.025299072265625, 0.08001708984375, 0.0210723876953125, 0.056610107421875, 0.0095672607421875, 0.050628662109375, 0.01155853271484375, -0.0012760162353515625, -0.0469970703125, 0.0374755859375, 0.00933837890625, -0.0736083984375, 0.0158233642578125, -0.021148681640625, -0.07275390625, 0.028839111328125, -0.0165557861328125, -0.057373046875, 0.05401611328125, 0.007747650146484375, -0.053436279296875, 0.0361328125, -0.056396484375, 0.058349609375, -0.003803253173828125, -0.044891357421875, -0.0112457275390625, -0.037506103515625, 0.0364990234375, -0.0011110305786132812, 0.046051025390625, -0.036346435546875, -0.0078887939453125, 0.05084228515625, -0.0347900390625, 0.07977294921875, -0.0266571044921875, 0.002410888671875, 0.045806884765625, 0.0006937980651855469, 0.0283050537109375, 0.024078369140625, -0.0171051025390625, 0.01421356201171875, 0.0195159912109375, -0.042999267578125, -0.0250244140625, 0.060028076171875, -0.0614013671875, -0.03143310546875, -0.025848388671875, -0.029541015625, -0.00185394287109375, 0.0305938720703125, 
0.032867431640625, 0.0111541748046875, 0.0069427490234375, -0.0005984306335449219, 0.060211181640625, -0.007312774658203125, 0.0386962890625, 0.02398681640625, -0.03955078125, -0.05108642578125, 0.047149658203125, -0.01267242431640625, 0.02276611328125, 0.01041412353515625, 0.01406097412109375, -0.03424072265625, -0.036590576171875, -0.03326416015625, 0.0282135009765625, -0.056915283203125, -0.0311431884765625, -0.0552978515625, 0.00591278076171875, -0.05596923828125, -0.005096435546875, -0.053314208984375, -0.0276641845703125, -0.047698974609375, -0.0178985595703125, 0.049407958984375, 0.0261077880859375, -0.01253509521484375, 0.02362060546875, -0.052001953125, 0.0247650146484375, 0.008575439453125, 0.01983642578125, 0.01352691650390625, -0.039276123046875, -0.002475738525390625, 0.0203857421875, -0.0311431884765625, -0.07391357421875, 0.0377197265625, -0.0107269287109375, 0.024932861328125, 0.06707763671875, 0.017852783203125, 0.069580078125, -0.005672454833984375, 0.056732177734375, 0.029571533203125, -0.06707763671875, 0.058319091796875, -0.046234130859375, 0.01244354248046875, 0.03857421875, 0.0260009765625, -0.0187530517578125, -0.0132598876953125, -0.06640625, -0.08197021484375, 0.03662109375, 0.0059967041015625, 0.02081298828125, 0.00188446044921875, 0.0382080078125, -0.005725860595703125, 0.016387939453125, -0.07025146484375, -0.037261962890625, -0.020050048828125, -0.020172119140625, 0.0367431640625, -0.0013713836669921875, -0.026519775390625, -0.0306396484375, 0.057708740234375, -0.0108642578125, 0.02203369140625, 0.0168914794921875, 0.026947021484375, -0.0169830322265625, 0.004543304443359375, 0.023773193359375, 0.05926513671875, -0.037200927734375, -0.007549285888671875, -0.004192352294921875, -0.030853271484375, 0.0036411285400390625, 0.017120361328125, -0.0228118896484375, -0.002223968505859375, 0.02215576171875, 0.064697265625, 0.0006394386291503906, -0.0380859375, 0.044403076171875, -0.0112457275390625, -0.0343017578125, -0.0357666015625, 
0.011383056640625, 0.0350341796875, 0.046112060546875, 0.0390625, 0.0187225341796875, 0.0201416015625, -0.03021240234375, -0.005725860595703125, 0.023590087890625, -0.017486572265625, -0.02752685546875, 0.08074951171875, -0.0037899017333984375, -0.026153564453125, 0.033172607421875, -0.0238800048828125, -0.0093994140625, 0.058441162109375, 0.050140380859375, 0.059783935546875, -0.00516510009765625, 0.01537322998046875, 0.055419921875, -0.00029468536376953125, -0.0098114013671875, 0.057037353515625, 0.01425933837890625, -0.045440673828125, -0.024932861328125, -0.06292724609375, -0.011688232421875, 0.0192413330078125, -0.059814453125, 0.033935546875, -0.05120849609375, -0.0307464599609375, -0.00492095947265625, -0.0135040283203125, -0.045989990234375, 0.026458740234375, 0.00675201416015625, 0.0694580078125, -0.063720703125, 0.044586181640625, 0.050048828125, -0.0445556640625, -0.0732421875, -0.01197052001953125, 0.01245880126953125, -0.05810546875, 0.0280303955078125, 0.00251007080078125, 0.006900787353515625, 0.01030731201171875, -0.05908203125, -0.0684814453125, 0.08441162109375, 0.0243682861328125, -0.033203125, -0.001255035400390625, 0.004486083984375, 0.031341552734375, -0.0296173095703125, 0.03326416015625, 0.02520751953125, 0.03179931640625, 0.0266571044921875, -0.037200927734375, 0.01531982421875, -0.0347900390625, -0.0015459060668945312, -0.017364501953125, -0.06683349609375, 0.0684814453125, -0.0291595458984375, 0.0025119781494140625, 0.017608642578125, 0.049285888671875, 0.05889892578125, 0.028839111328125, 0.04473876953125, 0.07086181640625, 0.044036865234375, -0.01416015625, 0.0721435546875, -0.0209503173828125, 0.058319091796875, 0.044647216796875, -0.00179290771484375, 0.0711669921875, 0.03680419921875, -0.03765869140625, 0.05279541015625, 0.08319091796875, -0.0215911865234375, 0.060394287109375, 0.010772705078125, -0.02923583984375, -0.01345062255859375, 0.00699615478515625, -0.04559326171875, 0.0162506103515625, 0.0215911865234375, 
-0.0285797119140625, 0.0151214599609375, -0.000835418701171875, 0.0062713623046875, -0.0224151611328125, -0.01715087890625, 0.031463623046875, 0.0031337738037109375, -0.026763916015625, 0.05517578125, -0.0104217529296875, 0.06793212890625, -0.041290283203125, -0.0004863739013671875, -0.00901031494140625, 0.02813720703125, -0.0303192138671875, -0.042449951171875, 0.0063323974609375, -0.0110321044921875, -0.0223388671875, -0.0008859634399414062, 0.042022705078125, 0.001220703125, -0.0509033203125, 0.0215301513671875, 0.0207366943359375, 0.020416259765625, -0.005702972412109375, -0.0712890625, 0.02032470703125, 0.024505615234375, -0.036041259765625, 0.00012636184692382812, 0.02972412109375, 0.035247802734375, 0.04486083984375, 0.05767822265625, 0.01085662841796875, 0.00962066650390625, -0.0010328292846679688, 0.049896240234375, -0.043487548828125, -0.047882080078125, -0.06951904296875, 0.057464599609375, -0.01212310791015625, -0.01419830322265625, 0.047119140625, 0.0634765625, 0.05548095703125, -0.03302001953125, 0.06939697265625, -0.0181732177734375, 0.037109375, -0.0236053466796875, 0.056396484375, -0.056427001953125, 0.009765625, -0.0298309326171875, -0.0509033203125, -0.0111541748046875, 0.05047607421875, -0.010009765625, 0.01314544677734375, 0.0377197265625, 0.05322265625, -0.0206451416015625, 0.007659912109375, -0.0015554428100585938, 0.0168304443359375, 0.0119781494140625, 0.031402587890625, 0.04974365234375, -0.048187255859375, 0.033111572265625, -0.046234130859375, -0.0188140869140625, -0.005992889404296875, -0.056488037109375, -0.053009033203125, -0.022796630859375, -0.039886474609375, -0.05511474609375, -0.01016998291015625, 0.054229736328125, 0.0736083984375, -0.063720703125, -0.01165008544921875, -0.00739288330078125, 0.004314422607421875, -0.0172576904296875, -0.023193359375, 0.0309906005859375, 0.0240478515625, -0.0826416015625, 0.01849365234375, -0.012237548828125, 0.0235748291015625, -0.00644683837890625, 0.008331298828125, -0.018463134765625, 
0.00641632080078125, 0.016632080078125, 0.0194091796875, -0.05877685546875, -0.0011539459228515625, -0.006259918212890625, 0.005706787109375, 0.020294189453125, 0.0160064697265625, -0.0313720703125, 0.0233154296875, 0.056243896484375, 0.0143890380859375, 0.045928955078125, 0.006008148193359375, 0.00931549072265625, -0.030853271484375, 0.028961181640625, 0.0021266937255859375, 0.04730224609375, 0.015045166015625, -0.045074462890625, 0.03961181640625, 0.042755126953125, -0.0308074951171875, -0.0634765625, 0.0017690658569335938, -0.08221435546875, -0.0291290283203125, 0.0738525390625, -0.0220184326171875, -0.04833984375, 0.01373291015625, -0.01305389404296875, 0.023345947265625, -0.02874755859375, 0.046234130859375, 0.04058837890625, -0.0262451171875, -0.00962066650390625, -0.06207275390625, 0.01201629638671875, 0.01558685302734375, -0.06634521484375, -0.0116424560546875, 0.03338623046875, 0.0447998046875, 0.038604736328125, 0.0455322265625, -0.025482177734375, 0.0130767822265625, 0.0168304443359375, 0.03009033203125, 0.00954437255859375, 0.0257110595703125, -0.01110076904296875, 0.008087158203125, -0.01036834716796875, -0.0430908203125 ] ]
olm/olm-roberta-base-dec-2022
2023-01-20T14:32:41.000Z
[ "transformers", "pytorch", "tf", "tensorboard", "roberta", "fill-mask", "en", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
olm
null
null
olm/olm-roberta-base-dec-2022
7
9,381
transformers
2022-12-27T22:14:15
---
language: en
---

# OLM RoBERTa/BERT December 2022

This is a more up-to-date version of the [original BERT](https://huggingface.co/bert-base-cased) and [original RoBERTa](https://huggingface.co/roberta-base).

In addition to being more up-to-date, it also tends to perform better than the original BERT on standard benchmarks. We think it is fair to directly compare our model to the original BERT because our model was trained with about the same level of compute as the original BERT, and the architectures of BERT and RoBERTa are basically the same. The original RoBERTa takes an order of magnitude more compute, although our model is also not that different in performance from the original RoBERTa on many standard benchmarks. Our model was trained on a cleaned December 2022 snapshot of Common Crawl and Wikipedia.

This model was created as part of the OLM project, which has the goal of continuously training and releasing models that are up-to-date and comparable in standard language model performance to their static counterparts. This is important because we want our models to know about events like COVID or a presidential election right after they happen.

## Intended uses

You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task, such as sequence classification, token classification or question answering.
### How to use

You can use this model directly with a pipeline for masked language modeling:

```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='olm/olm-roberta-base-dec-2022')
>>> unmasker("Hello I'm a <mask> model.")
[{'score': 0.04252663999795914, 'token': 631, 'token_str': ' new', 'sequence': "Hello I'm a new model."},
 {'score': 0.034064881503582, 'token': 4750, 'token_str': ' female', 'sequence': "Hello I'm a female model."},
 {'score': 0.03066524863243103, 'token': 932, 'token_str': ' business', 'sequence': "Hello I'm a business model."},
 {'score': 0.029599128291010857, 'token': 10345, 'token_str': ' junior', 'sequence': "Hello I'm a junior model."},
 {'score': 0.025790784507989883, 'token': 2219, 'token_str': ' human', 'sequence': "Hello I'm a human model."}]
```

Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import AutoTokenizer, RobertaModel

tokenizer = AutoTokenizer.from_pretrained('olm/olm-roberta-base-dec-2022')
model = RobertaModel.from_pretrained("olm/olm-roberta-base-dec-2022")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

## Dataset

The model and tokenizer were trained with this [December 2022 cleaned Common Crawl dataset](https://huggingface.co/datasets/olm/olm-CC-MAIN-2022-49-sampling-ratio-olm-0.15114822547) plus this [December 2022 cleaned Wikipedia dataset](https://huggingface.co/datasets/olm/olm-wikipedia-20221220).
The tokenized version of these concatenated datasets is [here](https://huggingface.co/datasets/olm/olm-december-2022-tokenized-512).
The datasets were created with this [repo](https://github.com/huggingface/olm-datasets).

## Training

The model was trained according to the OLM BERT/RoBERTa instructions at this [repo](https://github.com/huggingface/olm-training).
## Evaluation results

The model achieves the following results after tuning on GLUE tasks:

| Task | Metric    | Original BERT     | OLM RoBERTa Dec 2022 (Ours) |
|:-----|:----------|------------------:|----------------------------:|
| cola | mcc       | **0.5889**        | 0.28067                     |
| sst2 | acc       | 0.9181            | **0.9275**                  |
| mrpc | acc/f1    | **0.9182**/0.8923 | 0.8662/**0.9033**           |
| stsb | pear/spear| 0.8822/0.8794     | **0.8870**/**0.8857**       |
| qqp  | acc/f1    | 0.9071/0.8748     | **0.9097**/**0.8791**       |
| mnli | acc/acc_mm| 0.8400/0.8410     | **0.8576**/**0.8621**       |
| qnli | acc       | 0.9075            | **0.9192**                  |
| rte  | acc       | 0.6296            | **0.6390**                  |
| wnli | acc       | 0.4000            | **0.4648**                  |

For both the original BERT and our model, we used the Hugging Face run_glue.py script [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/text-classification). For both models, we used the default fine-tuning hyperparameters and we averaged the results over five training seeds. These are the results for the GLUE dev sets, which can be a bit different than the results for the test sets.
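The seed-averaging procedure described above can be sketched as follows. This is a minimal illustration, not the actual evaluation code; the metric values below are made-up placeholders, not results from the reported runs:

```python
# Average per-task GLUE metrics across several fine-tuning seeds, as the
# card describes. The values below are illustrative placeholders only.

def average_over_seeds(runs):
    """Given one metrics dict per training seed, return the per-metric mean."""
    keys = runs[0].keys()
    return {k: sum(r[k] for r in runs) / len(runs) for k in keys}

# One hypothetical metrics dict per seed for a single GLUE task.
example_runs = [
    {"accuracy": 0.910},
    {"accuracy": 0.912},
    {"accuracy": 0.911},
    {"accuracy": 0.913},
    {"accuracy": 0.914},
]

print(average_over_seeds(example_runs))  # mean accuracy over the five seeds
```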
4,601
[ [ -0.0271148681640625, -0.06622314453125, 0.01088714599609375, 0.001102447509765625, -0.0093231201171875, -0.01464080810546875, -0.032745361328125, -0.037109375, 0.0218048095703125, 0.0318603515625, -0.04315185546875, -0.039794921875, -0.057037353515625, -0.00556182861328125, -0.01177215576171875, 0.10430908203125, -0.0028934478759765625, -0.0038433074951171875, -0.015228271484375, -0.01094818115234375, -0.01242828369140625, -0.037689208984375, -0.032135009765625, -0.0171661376953125, 0.0211181640625, 0.009552001953125, 0.045989990234375, 0.048370361328125, 0.043914794921875, 0.031585693359375, -0.02691650390625, 0.0018873214721679688, -0.0230865478515625, -0.01361083984375, 0.01215362548828125, -0.0301666259765625, -0.049041748046875, 0.03240966796875, 0.041259765625, 0.042877197265625, -0.00048804283142089844, 0.0245361328125, 0.0138397216796875, 0.040283203125, -0.0277557373046875, 0.0218963623046875, -0.040618896484375, -0.0085601806640625, -0.00374603271484375, 0.019439697265625, -0.017059326171875, -0.01165008544921875, 0.029083251953125, -0.019744873046875, 0.035552978515625, 0.0012941360473632812, 0.10650634765625, 0.01354217529296875, -0.0116729736328125, -0.0223236083984375, -0.0244140625, 0.0687255859375, -0.060516357421875, 0.0161895751953125, 0.028228759765625, 0.0016355514526367188, -0.016204833984375, -0.058746337890625, -0.034637451171875, -0.022125244140625, -0.01294708251953125, 0.007541656494140625, -0.01323699951171875, -0.006275177001953125, 0.0195159912109375, 0.047027587890625, -0.051361083984375, -0.0021209716796875, -0.047515869140625, -0.017730712890625, 0.0521240234375, -0.0047149658203125, 0.0171356201171875, -0.02288818359375, -0.049346923828125, -0.027557373046875, -0.01348876953125, 0.020355224609375, 0.023223876953125, 0.0296478271484375, -0.029083251953125, 0.03790283203125, -0.0020236968994140625, 0.05462646484375, 0.00946044921875, -0.01137542724609375, 0.044769287109375, -0.009368896484375, -0.031341552734375, 
-0.01129913330078125, 0.079345703125, 0.0174713134765625, 0.01922607421875, -0.00047659873962402344, -0.0173797607421875, -0.0030422210693359375, 0.01197052001953125, -0.062286376953125, -0.03436279296875, 0.0216217041015625, -0.046905517578125, -0.0294036865234375, 0.0218963623046875, -0.03912353515625, 0.007236480712890625, -0.0179443359375, 0.041015625, -0.0440673828125, -0.0110321044921875, 0.00888824462890625, -0.0095977783203125, 0.01027679443359375, 0.01270294189453125, -0.059844970703125, 0.0212249755859375, 0.05108642578125, 0.0665283203125, -0.005184173583984375, -0.0217437744140625, -0.0301971435546875, -0.026702880859375, -0.006359100341796875, 0.037811279296875, -0.0123748779296875, 0.00811004638671875, -0.0031795501708984375, 0.01444244384765625, -0.0181121826171875, -0.039337158203125, 0.031402587890625, -0.02557373046875, 0.042083740234375, 0.009552001953125, -0.041412353515625, -0.0174102783203125, 0.0175018310546875, -0.04034423828125, 0.10321044921875, 0.018157958984375, -0.047119140625, 0.023956298828125, -0.043121337890625, -0.017242431640625, -0.02874755859375, 0.02008056640625, -0.06414794921875, 0.01023101806640625, 0.02264404296875, 0.037506103515625, -0.0157470703125, 0.0089111328125, -0.0302581787109375, -0.01678466796875, 0.0244140625, -0.003875732421875, 0.08111572265625, 0.015960693359375, -0.0286865234375, 0.0117340087890625, -0.06085205078125, 0.01332855224609375, 0.023223876953125, -0.0294952392578125, -0.0030879974365234375, -0.0271148681640625, 0.00823974609375, 0.0206451416015625, 0.033203125, -0.043548583984375, 0.02508544921875, -0.05108642578125, 0.033843994140625, 0.057769775390625, -0.020416259765625, 0.034698486328125, -0.037200927734375, 0.0239410400390625, 0.005214691162109375, 0.0083465576171875, -0.0264434814453125, -0.06549072265625, -0.0714111328125, -0.04083251953125, 0.039794921875, 0.036468505859375, -0.01361083984375, 0.06097412109375, -0.00301361083984375, -0.046600341796875, -0.058563232421875, 
0.006420135498046875, 0.039703369140625, 0.036895751953125, 0.0280609130859375, -0.035797119140625, -0.047332763671875, -0.053131103515625, -0.02008056640625, -0.01385498046875, -0.005809783935546875, 0.02105712890625, 0.055816650390625, -0.00012242794036865234, 0.0771484375, -0.047607421875, -0.035614013671875, -0.01207733154296875, 0.0189208984375, 0.048614501953125, 0.053375244140625, 0.048126220703125, -0.04669189453125, -0.05340576171875, -0.0173187255859375, -0.050811767578125, 0.00688934326171875, 0.0085906982421875, -0.0107879638671875, 0.0513916015625, 0.0295257568359375, -0.056488037109375, 0.0438232421875, 0.050323486328125, -0.0287628173828125, 0.04412841796875, -0.0218353271484375, 0.0030002593994140625, -0.083251953125, 0.017120361328125, -0.0063018798828125, -0.0155029296875, -0.04083251953125, 0.00925445556640625, -0.005645751953125, -0.01251983642578125, -0.0290069580078125, 0.051483154296875, -0.045379638671875, 0.00714111328125, -0.01149749755859375, 0.014068603515625, -0.00582122802734375, 0.050994873046875, -0.0008139610290527344, 0.041534423828125, 0.047760009765625, -0.0292205810546875, 0.019317626953125, 0.036895751953125, -0.035980224609375, 0.017669677734375, -0.0540771484375, 0.0264434814453125, -0.003002166748046875, 0.005748748779296875, -0.08294677734375, -0.01111602783203125, 0.0288848876953125, -0.0614013671875, 0.032562255859375, -0.0294189453125, -0.040924072265625, -0.0406494140625, -0.0272674560546875, 0.00006973743438720703, 0.07257080078125, -0.039031982421875, 0.040771484375, 0.0190277099609375, -0.005970001220703125, -0.037994384765625, -0.042510986328125, -0.006404876708984375, -0.0207366943359375, -0.0736083984375, 0.035614013671875, -0.0199432373046875, 0.00289154052734375, 0.0007877349853515625, -0.004852294921875, -0.0114288330078125, 0.00853729248046875, 0.01428985595703125, 0.0308990478515625, -0.0170135498046875, -0.004375457763671875, -0.0217742919921875, -0.0155029296875, 0.0052947998046875, -0.02117919921875, 
0.055419921875, -0.017852783203125, -0.0038242340087890625, -0.030364990234375, -0.0003464221954345703, 0.0295257568359375, -0.0277099609375, 0.077392578125, 0.07000732421875, -0.02032470703125, 0.007049560546875, -0.04638671875, -0.0291748046875, -0.034423828125, 0.0161285400390625, -0.030120849609375, -0.061798095703125, 0.054718017578125, 0.01560211181640625, 0.0039520263671875, 0.04571533203125, 0.0445556640625, -0.021728515625, 0.076416015625, 0.04193115234375, -0.017486572265625, 0.03668212890625, -0.040771484375, 0.00399017333984375, -0.068359375, -0.017608642578125, -0.035400390625, -0.0307159423828125, -0.05267333984375, -0.032623291015625, 0.021453857421875, 0.0234375, -0.00923919677734375, 0.043212890625, -0.047698974609375, 0.007110595703125, 0.06756591796875, 0.0190277099609375, -0.001171112060546875, 0.01203155517578125, -0.01001739501953125, -0.003963470458984375, -0.05059814453125, -0.0328369140625, 0.08331298828125, 0.036163330078125, 0.041412353515625, -0.006103515625, 0.05865478515625, -0.000659942626953125, 0.0157012939453125, -0.03814697265625, 0.033050537109375, -0.009857177734375, -0.06427001953125, -0.0089569091796875, -0.0263214111328125, -0.0784912109375, 0.019866943359375, -0.0254974365234375, -0.07586669921875, -0.005809783935546875, 0.01387786865234375, -0.006153106689453125, 0.0267181396484375, -0.043548583984375, 0.07330322265625, -0.005645751953125, -0.0194244384765625, 0.0027980804443359375, -0.06707763671875, 0.0110015869140625, 0.01102447509765625, -0.00601959228515625, -0.01538848876953125, 0.007801055908203125, 0.0596923828125, -0.04327392578125, 0.06512451171875, -0.0036525726318359375, -0.0029926300048828125, 0.0300140380859375, -0.01207733154296875, 0.03662109375, -0.0202484130859375, -0.004150390625, 0.032379150390625, -0.01519012451171875, -0.0309600830078125, -0.03607177734375, 0.04638671875, -0.06268310546875, -0.040771484375, -0.0479736328125, -0.048797607421875, 0.01300048828125, 0.0174102783203125, 0.0236663818359375, 
0.0297698974609375, -0.00493621826171875, 0.01171875, 0.0423583984375, -0.018341064453125, 0.0298614501953125, 0.035400390625, -0.0140533447265625, -0.035400390625, 0.056488037109375, -0.0006680488586425781, 0.018585205078125, 0.012420654296875, 0.0199737548828125, -0.0167694091796875, -0.0288848876953125, -0.019073486328125, 0.0416259765625, -0.03271484375, -0.019683837890625, -0.059295654296875, -0.023590087890625, -0.055511474609375, -0.0007214546203613281, -0.0218048095703125, -0.032745361328125, -0.034912109375, 0.01241302490234375, 0.04339599609375, 0.049774169921875, -0.005115509033203125, 0.0173492431640625, -0.0599365234375, 0.011871337890625, 0.0010080337524414062, 0.0290374755859375, -0.017486572265625, -0.0634765625, -0.02947998046875, 0.01094818115234375, -0.0291748046875, -0.052001953125, 0.04107666015625, 0.0193023681640625, 0.031982421875, 0.02105712890625, 0.00014579296112060547, 0.05255126953125, -0.033782958984375, 0.06048583984375, 0.00702667236328125, -0.06365966796875, 0.050872802734375, -0.0262603759765625, 0.021514892578125, 0.021026611328125, 0.0341796875, -0.0206146240234375, -0.039337158203125, -0.080322265625, -0.0628662109375, 0.08612060546875, 0.03204345703125, 0.001964569091796875, 0.004817962646484375, 0.022705078125, -0.0122222900390625, 0.027740478515625, -0.055572509765625, -0.03948974609375, -0.0010356903076171875, -0.024200439453125, -0.01270294189453125, -0.0187835693359375, -0.0020580291748046875, -0.0248870849609375, 0.056365966796875, -0.00787353515625, 0.040863037109375, 0.0173187255859375, -0.01318359375, -0.006103515625, -0.01271820068359375, 0.033599853515625, 0.02716064453125, -0.046600341796875, -0.01540374755859375, 0.01415252685546875, -0.0269775390625, -0.017120361328125, 0.02276611328125, -0.0036945343017578125, 0.007541656494140625, 0.036712646484375, 0.07049560546875, 0.0113983154296875, -0.038970947265625, 0.042327880859375, 0.022979736328125, -0.0174102783203125, -0.024566650390625, 0.00659942626953125, 
0.0014791488647460938, 0.020538330078125, 0.013397216796875, 0.01145172119140625, -0.0012187957763671875, -0.047760009765625, 0.0234222412109375, 0.03594970703125, -0.04119873046875, -0.028106689453125, 0.053436279296875, -0.0022525787353515625, -0.0279083251953125, 0.063720703125, -0.0022563934326171875, -0.05511474609375, 0.060760498046875, 0.039154052734375, 0.061004638671875, -0.017059326171875, -0.0004267692565917969, 0.04779052734375, 0.016845703125, 0.01023101806640625, 0.0178070068359375, 0.00777435302734375, -0.043121337890625, -0.0159149169921875, -0.053955078125, 0.00717926025390625, 0.0189666748046875, -0.06280517578125, 0.0256500244140625, -0.0272369384765625, -0.0224761962890625, -0.00399017333984375, 0.01195526123046875, -0.051605224609375, 0.025482177734375, -0.0013952255249023438, 0.07421875, -0.061492919921875, 0.048919677734375, 0.0648193359375, -0.046905517578125, -0.0479736328125, -0.00736236572265625, 0.004756927490234375, -0.07110595703125, 0.056884765625, 0.0300140380859375, 0.026885986328125, -0.0035572052001953125, -0.042510986328125, -0.0745849609375, 0.0751953125, 0.01070404052734375, -0.018646240234375, -0.004077911376953125, 0.008544921875, 0.044769287109375, -0.0254974365234375, 0.036956787109375, 0.03521728515625, 0.0256195068359375, 0.00013947486877441406, -0.08636474609375, 0.008819580078125, -0.026336669921875, -0.0029296875, 0.02093505859375, -0.05706787109375, 0.089111328125, -0.0233917236328125, -0.0013265609741210938, 0.036590576171875, 0.056671142578125, 0.01641845703125, 0.00540924072265625, 0.03997802734375, 0.0645751953125, 0.04400634765625, -0.02691650390625, 0.0810546875, -0.0259246826171875, 0.0618896484375, 0.06475830078125, 0.0121917724609375, 0.05755615234375, 0.0192413330078125, -0.0228729248046875, 0.044036865234375, 0.04107666015625, -0.024566650390625, 0.029937744140625, 0.00547027587890625, 0.00128936767578125, -0.0093994140625, 0.01043701171875, -0.0195770263671875, 0.038848876953125, 0.00284576416015625, 
-0.03912353515625, -0.00777435302734375, 0.002208709716796875, 0.02850341796875, -0.006938934326171875, 0.0082550048828125, 0.0640869140625, 0.001605987548828125, -0.053375244140625, 0.062469482421875, 0.004184722900390625, 0.054718017578125, -0.04608154296875, 0.0011663436889648438, 0.002231597900390625, 0.031036376953125, -0.004405975341796875, -0.060577392578125, 0.0141143798828125, -0.00719451904296875, -0.018524169921875, -0.0249176025390625, 0.048309326171875, -0.049835205078125, -0.031829833984375, 0.0182037353515625, 0.016845703125, 0.0283355712890625, -0.01422882080078125, -0.0810546875, 0.019989013671875, 0.0088348388671875, -0.0305633544921875, 0.025848388671875, 0.016204833984375, 0.01270294189453125, 0.05072021484375, 0.057952880859375, -0.007709503173828125, 0.0007786750793457031, -0.01287078857421875, 0.06781005859375, -0.043212890625, -0.0279998779296875, -0.054168701171875, 0.06573486328125, -0.00702667236328125, -0.04058837890625, 0.0665283203125, 0.048248291015625, 0.057647705078125, -0.0164794921875, 0.03143310546875, -0.0284271240234375, 0.043365478515625, -0.043243408203125, 0.05572509765625, -0.055999755859375, -0.009490966796875, -0.030120849609375, -0.06768798828125, -0.006744384765625, 0.065673828125, -0.0088653564453125, 0.0284576416015625, 0.039093017578125, 0.058074951171875, -0.005039215087890625, -0.0230255126953125, -0.0013647079467773438, 0.0298614501953125, 0.018646240234375, 0.03082275390625, 0.0406494140625, -0.063232421875, 0.036590576171875, -0.032684326171875, -0.0228729248046875, -0.031982421875, -0.0631103515625, -0.0859375, -0.0650634765625, -0.037689208984375, -0.044830322265625, 0.0016698837280273438, 0.06890869140625, 0.06524658203125, -0.054534912109375, -0.016448974609375, 0.00428009033203125, -0.0071868896484375, -0.016510009765625, -0.0229034423828125, 0.040008544921875, -0.0421142578125, -0.08209228515625, 0.01226806640625, -0.007965087890625, 0.01302337646484375, -0.007465362548828125, 0.00006073713302612305, 
-0.030303955078125, -0.0023822784423828125, 0.028533935546875, 0.00555419921875, -0.0572509765625, -0.0254364013671875, 0.01018524169921875, -0.00702667236328125, 0.0219268798828125, 0.0290069580078125, -0.06268310546875, 0.03192138671875, 0.0377197265625, 0.00855255126953125, 0.059478759765625, -0.00408172607421875, 0.038604736328125, -0.06024169921875, 0.0220947265625, 0.0020236968994140625, 0.0307464599609375, 0.0224761962890625, -0.0163421630859375, 0.04132080078125, 0.0174407958984375, -0.044403076171875, -0.06451416015625, 0.0019626617431640625, -0.08489990234375, -0.012420654296875, 0.0650634765625, -0.0266876220703125, -0.03387451171875, 0.00969696044921875, -0.0215301513671875, 0.0283660888671875, -0.0276031494140625, 0.062408447265625, 0.0452880859375, -0.0013303756713867188, -0.00868988037109375, -0.0172119140625, 0.0406494140625, 0.036285400390625, -0.035247802734375, -0.0097503662109375, 0.006946563720703125, 0.03729248046875, 0.0293731689453125, 0.03369140625, -0.0108642578125, 0.004520416259765625, -0.015472412109375, 0.0188446044921875, -0.01282501220703125, -0.01605224609375, -0.02178955078125, -0.004825592041015625, -0.0135498046875, -0.0271148681640625 ] ]
OpenAssistant/oasst-sft-1-pythia-12b
2023-03-11T14:25:14.000Z
[ "transformers", "pytorch", "gpt_neox", "text-generation", "sft", "en", "license:apache-2.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
OpenAssistant
null
null
OpenAssistant/oasst-sft-1-pythia-12b
277
9,381
transformers
2023-03-09T16:47:26
---
license: apache-2.0
language:
- en
tags:
- sft
pipeline_tag: text-generation
widget:
- text: <|prompter|>What is a meme, and what's the history behind this word?<|endoftext|><|assistant|>
- text: <|prompter|>What's the Earth total population<|endoftext|><|assistant|>
- text: <|prompter|>Write a story about future of AI development<|endoftext|><|assistant|>
---

# Open-Assistant SFT-1 12B Model

This is the first iteration English supervised-fine-tuning (SFT) model of the [Open-Assistant](https://github.com/LAION-AI/Open-Assistant) project. It is based on a Pythia 12B that was fine-tuned on ~22k human demonstrations of assistant conversations collected through the [https://open-assistant.io/](https://open-assistant.io/) human feedback web app before March 7, 2023.

## Model Details

- **Developed by:** [Open-Assistant Contributors](https://open-assistant.io/)
- **Model type:** Transformer-based Language Model
- **Language:** English
- **Finetuned from:** [EleutherAI / pythia-12b-deduped](https://huggingface.co/EleutherAI/pythia-12b-deduped)
- **Code:** [Open-Assistant/model/model_training](https://github.com/LAION-AI/Open-Assistant/tree/main/model/model_training)
- **Demo:** [Continuations for 250 random prompts](https://open-assistant.github.io/oasst-model-eval/?f=https%3A%2F%2Fraw.githubusercontent.com%2FOpen-Assistant%2Foasst-model-eval%2Fmain%2Fsampling_reports%2Foasst-sft%2F2023-03-09_andreaskoepf_oasst-1_12b_7000_sampling_noprefix_lottery.json) ([sampling code](https://github.com/Open-Assistant/oasst-model-eval/blob/3d71f3be100c05cd8ddb568365e036a29fbff8c7/model_eval/manual/sampling_report.py)).
- **License:** Apache 2.0
- **Contact:** [Open-Assistant Discord](https://ykilcher.com/open-assistant-discord)

## Prompting

Two special tokens are used to mark the beginning of user and assistant turns: `<|prompter|>` and `<|assistant|>`. Each turn ends with a `<|endoftext|>` token.
Input prompt example:

```
<|prompter|>What is a meme, and what's the history behind this word?<|endoftext|><|assistant|>
```

The input ends with the `<|assistant|>` token to signal that the model should start generating the assistant reply.

Example of generated continuation (typical sampling 0.2):

```
A visual joke often accompanied by text that spreads rapidly from individual to individual. The exact origin of the term "meme" is unclear, but it was first used in the 1990s by Richard Dawkins in his book "The Selfish Gene." He defined a meme as an idea, behavior, or style that is transmitted from one person to another within a culture. Dawkins' use of the term caught on among scholars and journalists, who began to apply the concept to other types of cultural items such as advertisements, fashion trends, and jokes. Over time, the term "meme" has come to be used more generally to describe any social behavior that can be repeated and altered by individuals. Today, the term is widely recognized and applied in fields ranging from anthropology to psychology to marketing.<|endoftext|>
```

## Limitations

See limitations of Pythia 12B base model [here](https://huggingface.co/EleutherAI/pythia-12b-deduped#limitations-and-biases). The model is known to fail horribly at answering math and coding questions. Beware of hallucinations: Outputs are often factually wrong or misleading. Replies might look convincing (at first glance) while containing completely made up false statements. This model is usable only for English conversations.
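The turn format shown above can be assembled programmatically. The following is a minimal sketch of building a multi-turn prompt from the special tokens described here; the helper function name is ours for illustration and is not part of any released API:

```python
# Build an Open-Assistant style prompt from alternating user/assistant turns,
# using the special tokens documented above. `build_prompt` is an illustrative
# helper, not part of the model's API.

PROMPTER, ASSISTANT, EOS = "<|prompter|>", "<|assistant|>", "<|endoftext|>"

def build_prompt(turns):
    """turns: list of (role, text) pairs, where role is 'user' or 'assistant'.

    The returned string ends with <|assistant|> so the model knows to start
    generating the next assistant reply.
    """
    out = []
    for role, text in turns:
        marker = PROMPTER if role == "user" else ASSISTANT
        out.append(f"{marker}{text}{EOS}")
    return "".join(out) + ASSISTANT

prompt = build_prompt([("user", "What is a meme, and what's the history behind this word?")])
print(prompt)
# → <|prompter|>What is a meme, and what's the history behind this word?<|endoftext|><|assistant|>
```

The resulting string can then be tokenized and passed to the model for generation like any other text-generation input.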
3,491
[ [ -0.03472900390625, -0.09112548828125, 0.016876220703125, 0.021331787109375, -0.01467132568359375, -0.00907135009765625, 0.006938934326171875, -0.02587890625, 0.0294342041015625, 0.01458740234375, -0.06939697265625, -0.00948333740234375, -0.037322998046875, 0.013336181640625, -0.02740478515625, 0.063232421875, -0.00426483154296875, 0.0130767822265625, -0.00926971435546875, -0.0081634521484375, -0.038299560546875, -0.031890869140625, -0.061859130859375, -0.00923919677734375, 0.03472900390625, 0.03900146484375, 0.053314208984375, 0.0261993408203125, 0.0352783203125, 0.02520751953125, 0.00400543212890625, -0.002536773681640625, -0.034210205078125, -0.0236663818359375, -0.02545166015625, -0.01904296875, -0.0316162109375, 0.032470703125, 0.0311126708984375, 0.05914306640625, -0.005523681640625, 0.018768310546875, 0.003139495849609375, 0.037933349609375, -0.03826904296875, -0.005062103271484375, -0.0194244384765625, -0.016876220703125, -0.00341796875, -0.02203369140625, -0.0267333984375, -0.02337646484375, 0.0225677490234375, -0.04119873046875, 0.0043792724609375, -0.006374359130859375, 0.07257080078125, 0.00656890869140625, -0.0126190185546875, -0.040191650390625, -0.0599365234375, 0.05792236328125, -0.051361083984375, 0.018768310546875, 0.02191162109375, 0.0280914306640625, -0.0162811279296875, -0.055877685546875, -0.04901123046875, -0.033294677734375, -0.0028858184814453125, 0.0098419189453125, -0.051055908203125, -0.004608154296875, 0.0189666748046875, 0.0352783203125, -0.061859130859375, -0.00955963134765625, -0.058868408203125, -0.0000073909759521484375, 0.02508544921875, 0.007568359375, 0.016998291015625, -0.0161285400390625, -0.0263824462890625, -0.007274627685546875, -0.03759765625, 0.0020904541015625, 0.03253173828125, -0.0096893310546875, -0.0243682861328125, 0.052764892578125, -0.025909423828125, 0.042938232421875, 0.01198577880859375, 0.0010013580322265625, 0.006816864013671875, -0.033233642578125, -0.0250396728515625, -0.0262603759765625, 
0.07635498046875, 0.0172882080078125, 0.0352783203125, -0.0033779144287109375, 0.01015472412109375, 0.01502227783203125, 0.01456451416015625, -0.0628662109375, -0.042144775390625, 0.009613037109375, -0.04766845703125, -0.03924560546875, 0.0186309814453125, -0.050323486328125, -0.00009107589721679688, -0.01065826416015625, 0.0303192138671875, -0.0280914306640625, -0.03289794921875, 0.015960693359375, -0.004024505615234375, 0.0140228271484375, 0.027130126953125, -0.05511474609375, 0.0220947265625, 0.0255889892578125, 0.06793212890625, 0.011016845703125, -0.03863525390625, -0.009063720703125, -0.023223876953125, -0.0248565673828125, 0.047882080078125, -0.022064208984375, -0.0225372314453125, -0.03338623046875, 0.016815185546875, 0.00469970703125, -0.027435302734375, 0.018798828125, -0.0272216796875, 0.04345703125, 0.00441741943359375, -0.037017822265625, -0.0167388916015625, 0.005458831787109375, -0.054351806640625, 0.054534912109375, 0.0203094482421875, -0.032684326171875, 0.00696563720703125, -0.08648681640625, -0.005733489990234375, -0.0060882568359375, -0.00872802734375, -0.00879669189453125, 0.005115509033203125, -0.0055389404296875, 0.02923583984375, -0.04315185546875, 0.0113372802734375, -0.0147705078125, -0.025146484375, 0.018341064453125, -0.0496826171875, 0.08734130859375, 0.023773193359375, -0.01532745361328125, 0.002422332763671875, -0.060394287109375, -0.017608642578125, 0.0297088623046875, -0.0156707763671875, -0.015594482421875, -0.0208892822265625, -0.01181793212890625, 0.01158905029296875, 0.01861572265625, -0.05389404296875, 0.0232086181640625, -0.0291748046875, 0.04046630859375, 0.062469482421875, 0.0029811859130859375, 0.046051025390625, -0.031005859375, 0.046051025390625, -0.006168365478515625, 0.0295562744140625, -0.0272979736328125, -0.054534912109375, -0.0234375, -0.0221099853515625, 0.0119476318359375, 0.055938720703125, -0.036163330078125, 0.052886962890625, 0.003986358642578125, -0.053619384765625, -0.045074462890625, -0.00797271728515625, 
0.01200103759765625, 0.029388427734375, 0.02740478515625, -0.01387786865234375, -0.040679931640625, -0.063720703125, -0.017547607421875, -0.0178985595703125, -0.0107879638671875, 0.035491943359375, 0.04864501953125, -0.01374053955078125, 0.049774169921875, -0.03741455078125, -0.0124969482421875, -0.0094451904296875, 0.02178955078125, 0.0258331298828125, 0.056396484375, 0.050079345703125, -0.041290283203125, -0.02374267578125, -0.0137176513671875, -0.055633544921875, -0.011932373046875, -0.022308349609375, -0.0193023681640625, 0.00641632080078125, 0.02459716796875, -0.05242919921875, 0.051483154296875, 0.0496826171875, -0.0626220703125, 0.053375244140625, -0.00756072998046875, 0.0302734375, -0.106689453125, 0.00395965576171875, -0.009033203125, -0.007701873779296875, -0.0216827392578125, -0.0002872943878173828, -0.01291656494140625, -0.00146484375, -0.023468017578125, 0.061279296875, -0.036529541015625, 0.027587890625, -0.0223388671875, -0.0126190185546875, -0.0089569091796875, 0.04083251953125, -0.02252197265625, 0.060302734375, 0.040740966796875, -0.0484619140625, 0.0435791015625, 0.023529052734375, -0.0164031982421875, 0.0435791015625, -0.06396484375, 0.0191802978515625, 0.002674102783203125, 0.00827789306640625, -0.09051513671875, -0.021759033203125, 0.03594970703125, -0.06787109375, 0.00341796875, 0.0092010498046875, -0.03936767578125, -0.044677734375, -0.0177459716796875, 0.03582763671875, 0.050079345703125, -0.0496826171875, 0.038787841796875, 0.01509857177734375, 0.0086669921875, -0.035186767578125, -0.04937744140625, -0.006633758544921875, -0.0014772415161132812, -0.073974609375, 0.01070404052734375, -0.0168609619140625, -0.01441192626953125, 0.0038547515869140625, -0.0029296875, -0.01349639892578125, 0.01776123046875, 0.011810302734375, 0.015167236328125, -0.013702392578125, -0.00948333740234375, 0.00144195556640625, -0.0028972625732421875, 0.003368377685546875, -0.00814056396484375, 0.058807373046875, -0.006683349609375, -0.0272216796875, -0.046630859375, 
0.037841796875, 0.04718017578125, -0.0123443603515625, 0.06146240234375, 0.02301025390625, -0.0430908203125, 0.00679779052734375, -0.039276123046875, -0.04522705078125, -0.037322998046875, 0.0274810791015625, -0.0423583984375, -0.060546875, 0.03900146484375, -0.0094451904296875, 0.01271820068359375, 0.038299560546875, 0.04351806640625, -0.01309967041015625, 0.0732421875, 0.0455322265625, 0.00283050537109375, 0.06121826171875, -0.0269012451171875, 0.01751708984375, -0.048187255859375, -0.00597381591796875, -0.041778564453125, -0.030242919921875, -0.048004150390625, -0.032073974609375, 0.0171966552734375, 0.02252197265625, -0.01007080078125, 0.029571533203125, -0.0426025390625, 0.0341796875, 0.060394287109375, 0.008880615234375, 0.00835418701171875, -0.001422882080078125, 0.0074310302734375, 0.0084381103515625, -0.0487060546875, -0.057098388671875, 0.0880126953125, 0.0275115966796875, 0.0787353515625, 0.0147705078125, 0.0458984375, 0.021240234375, 0.00975799560546875, -0.042144775390625, 0.04150390625, -0.0032749176025390625, -0.04388427734375, -0.0287933349609375, -0.02142333984375, -0.08538818359375, -0.0119476318359375, -0.0029354095458984375, -0.06781005859375, 0.01421356201171875, 0.0250396728515625, -0.0181427001953125, 0.0209197998046875, -0.05657958984375, 0.09100341796875, -0.00446319580078125, -0.0347900390625, -0.029052734375, -0.039886474609375, 0.023284912109375, 0.01297760009765625, -0.007568359375, 0.00254058837890625, 0.0235443115234375, 0.061981201171875, -0.045257568359375, 0.05511474609375, -0.0064697265625, 0.01739501953125, 0.048187255859375, 0.01287078857421875, 0.032257080078125, -0.005771636962890625, 0.011505126953125, 0.0175933837890625, 0.0185089111328125, -0.04559326171875, -0.023529052734375, 0.05718994140625, -0.062225341796875, -0.022308349609375, -0.0296173095703125, -0.033203125, 0.0245819091796875, 0.0123748779296875, 0.02392578125, 0.029083251953125, -0.038177490234375, 0.0316162109375, 0.033599853515625, -0.0294952392578125, 
0.003971099853515625, 0.044464111328125, -0.01016998291015625, -0.019317626953125, 0.060394287109375, 0.01259613037109375, 0.01296234130859375, 0.0180206298828125, 0.0285797119140625, -0.01122283935546875, -0.00543212890625, -0.03631591796875, 0.0191192626953125, -0.04705810546875, -0.00400543212890625, -0.0548095703125, -0.004886627197265625, -0.043853759765625, -0.0181121826171875, -0.0184326171875, -0.03326416015625, -0.031494140625, -0.01305389404296875, 0.03387451171875, 0.05914306640625, -0.007701873779296875, 0.03631591796875, -0.03912353515625, 0.017547607421875, 0.0173797607421875, 0.0271148681640625, -0.0002918243408203125, -0.036041259765625, -0.00426483154296875, 0.010223388671875, -0.043731689453125, -0.10052490234375, 0.006664276123046875, -0.0064239501953125, 0.046661376953125, 0.02734375, -0.0169677734375, 0.031585693359375, -0.0218505859375, 0.076171875, 0.00923919677734375, -0.05908203125, 0.0408935546875, -0.03118896484375, 0.029510498046875, 0.0270538330078125, 0.022613525390625, -0.034027099609375, -0.032196044921875, -0.06744384765625, -0.055328369140625, 0.054840087890625, 0.0171051025390625, 0.01971435546875, -0.0257415771484375, 0.033050537109375, 0.0132293701171875, 0.0143890380859375, -0.06524658203125, -0.0277557373046875, -0.015289306640625, -0.0193634033203125, 0.0174713134765625, -0.0243988037109375, 0.0113525390625, -0.038726806640625, 0.0677490234375, -0.003208160400390625, 0.0362548828125, 0.0216522216796875, -0.0098114013671875, -0.0254669189453125, 0.00931549072265625, 0.047698974609375, 0.0557861328125, -0.0127105712890625, -0.0100555419921875, -0.004730224609375, -0.03643798828125, 0.0009555816650390625, 0.005096435546875, -0.0182647705078125, -0.0019283294677734375, 0.0155181884765625, 0.060150146484375, 0.004001617431640625, -0.050933837890625, 0.039154052734375, -0.001476287841796875, -0.020721435546875, -0.037841796875, -0.00588226318359375, 0.0012216567993164062, 0.0198822021484375, 0.0167083740234375, 
0.006229400634765625, 0.022918701171875, -0.0509033203125, 0.0160675048828125, 0.047210693359375, -0.009124755859375, -0.041534423828125, 0.061370849609375, 0.01436614990234375, -0.03411865234375, 0.033721923828125, -0.0262298583984375, -0.01279449462890625, 0.062347412109375, 0.04931640625, 0.0831298828125, -0.03363037109375, 0.01042938232421875, 0.052886962890625, 0.046875, -0.0034351348876953125, 0.041046142578125, 0.006824493408203125, -0.040496826171875, -0.007720947265625, -0.043304443359375, -0.0271148681640625, 0.0340576171875, -0.0268096923828125, 0.043975830078125, -0.038726806640625, 0.01061248779296875, 0.01137542724609375, -0.01528167724609375, -0.061004638671875, 0.01386260986328125, -0.0037212371826171875, 0.09442138671875, -0.06378173828125, 0.0401611328125, 0.07275390625, -0.049774169921875, -0.06884765625, -0.00406646728515625, -0.004932403564453125, -0.03997802734375, 0.004360198974609375, 0.0095062255859375, 0.008392333984375, 0.0120697021484375, -0.06781005859375, -0.06329345703125, 0.09356689453125, 0.007053375244140625, -0.035430908203125, -0.01103973388671875, -0.008209228515625, 0.03662109375, -0.0401611328125, 0.036773681640625, 0.038360595703125, 0.0253448486328125, 0.008819580078125, -0.07940673828125, 0.0080413818359375, -0.0267333984375, -0.0201416015625, 0.0068817138671875, -0.044647216796875, 0.09588623046875, -0.0165252685546875, 0.00238037109375, 0.042144775390625, 0.03961181640625, 0.005207061767578125, 0.02178955078125, 0.032135009765625, 0.0272674560546875, 0.05120849609375, -0.00933074951171875, 0.08612060546875, -0.0240936279296875, 0.0238494873046875, 0.0750732421875, 0.0177459716796875, 0.03863525390625, 0.03619384765625, -0.0137786865234375, 0.035675048828125, 0.048126220703125, -0.01336669921875, 0.042816162109375, 0.015594482421875, -0.0008568763732910156, 0.0124969482421875, 0.014434814453125, -0.015655517578125, 0.037078857421875, 0.01922607421875, -0.049346923828125, -0.0255126953125, 0.0089874267578125, 
0.005062103271484375, -0.00902557373046875, -0.01739501953125, 0.0748291015625, -0.00323486328125, -0.05023193359375, 0.043609619140625, 0.006816864013671875, 0.04925537109375, -0.056549072265625, -0.0205230712890625, -0.005207061767578125, 0.0189666748046875, -0.00951385498046875, -0.047119140625, 0.0189056396484375, 0.0018377304077148438, -0.019073486328125, 0.00400543212890625, 0.05914306640625, -0.02606201171875, -0.0303955078125, 0.0254669189453125, 0.05908203125, 0.04046630859375, -0.018585205078125, -0.063720703125, -0.01348876953125, 0.005046844482421875, -0.0394287109375, -0.0039215087890625, 0.039093017578125, -0.021575927734375, 0.031585693359375, 0.053741455078125, -0.006290435791015625, 0.0182037353515625, -0.003925323486328125, 0.059295654296875, -0.039154052734375, -0.0450439453125, -0.08160400390625, 0.0447998046875, -0.0016603469848632812, -0.038238525390625, 0.044769287109375, 0.038421630859375, 0.06561279296875, -0.026275634765625, 0.061431884765625, -0.0298614501953125, 0.03778076171875, -0.01538848876953125, 0.054534912109375, -0.050537109375, -0.0007920265197753906, -0.036102294921875, -0.057647705078125, 0.0101165771484375, 0.072265625, -0.01511383056640625, 0.0113067626953125, 0.047637939453125, 0.0594482421875, -0.0177154541015625, 0.011505126953125, 0.0194854736328125, 0.028472900390625, 0.0257568359375, 0.032470703125, 0.0595703125, -0.042755126953125, 0.045135498046875, -0.03497314453125, -0.057586669921875, -0.02032470703125, -0.041717529296875, -0.0736083984375, -0.039764404296875, -0.021087646484375, -0.041656494140625, -0.006061553955078125, 0.09722900390625, 0.042144775390625, -0.060394287109375, -0.016448974609375, -0.0299530029296875, 0.0006680488586425781, -0.026458740234375, -0.0248870849609375, 0.003826141357421875, 0.004276275634765625, -0.068115234375, 0.022308349609375, 0.0077667236328125, 0.018951416015625, -0.03314208984375, -0.0032482147216796875, -0.037445068359375, 0.037200927734375, 0.0401611328125, 0.0207061767578125, 
-0.048919677734375, -0.01464080810546875, 0.028594970703125, -0.00794219970703125, -0.0006537437438964844, 0.046783447265625, -0.05047607421875, 0.033599853515625, 0.040313720703125, 0.02386474609375, 0.00688934326171875, 0.0036468505859375, 0.0440673828125, -0.04962158203125, 0.0186614990234375, 0.017364501953125, 0.03265380859375, 0.017974853515625, -0.040313720703125, 0.0546875, 0.027862548828125, -0.045806884765625, -0.063232421875, 0.0176544189453125, -0.06927490234375, -0.0253753662109375, 0.0777587890625, 0.0005064010620117188, -0.0185699462890625, 0.00783538818359375, -0.04376220703125, 0.021484375, -0.0360107421875, 0.06622314453125, 0.0662841796875, -0.006145477294921875, -0.0169525146484375, -0.0438232421875, 0.027496337890625, 0.031768798828125, -0.06683349609375, -0.00925445556640625, 0.036865234375, 0.034637451171875, 0.0328369140625, 0.07489013671875, -0.0112457275390625, 0.06292724609375, -0.005710601806640625, 0.0144805908203125, -0.002529144287109375, -0.026519775390625, -0.01509857177734375, -0.0008378028869628906, -0.004940032958984375, -0.04644775390625 ] ]
sentence-transformers/all-roberta-large-v1
2021-08-31T09:33:26.000Z
[ "sentence-transformers", "pytorch", "roberta", "feature-extraction", "sentence-similarity", "en", "arxiv:1904.06472", "arxiv:2102.07033", "arxiv:2104.08727", "arxiv:1704.05179", "arxiv:1810.09305", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
sentence-similarity
sentence-transformers
null
null
sentence-transformers/all-roberta-large-v1
38
9,375
sentence-transformers
2022-03-02T23:29:05
--- pipeline_tag: sentence-similarity tags: - sentence-transformers - feature-extraction - sentence-similarity language: en license: apache-2.0 --- # all-roberta-large-v1 This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for tasks like clustering or semantic search. ## Usage (Sentence-Transformers) Using this model is easy when you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer sentences = ["This is an example sentence", "Each sentence is converted"] model = SentenceTransformer('sentence-transformers/all-roberta-large-v1') embeddings = model.encode(sentences) print(embeddings) ``` ## Usage (HuggingFace Transformers) Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings. 
```python from transformers import AutoTokenizer, AutoModel import torch import torch.nn.functional as F #Mean Pooling - Take attention mask into account for correct averaging def mean_pooling(model_output, attention_mask): token_embeddings = model_output[0] #First element of model_output contains all token embeddings input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float() return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9) # Sentences we want sentence embeddings for sentences = ['This is an example sentence', 'Each sentence is converted'] # Load model from HuggingFace Hub tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/all-roberta-large-v1') model = AutoModel.from_pretrained('sentence-transformers/all-roberta-large-v1') # Tokenize sentences encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt') # Compute token embeddings with torch.no_grad(): model_output = model(**encoded_input) # Perform pooling sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask']) # Normalize embeddings sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1) print("Sentence embeddings:") print(sentence_embeddings) ``` ## Evaluation Results For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/all-roberta-large-v1) ------ ## Background The project aims to train sentence embedding models on very large sentence level datasets using a self-supervised contrastive learning objective. We used the pretrained [`roberta-large`](https://huggingface.co/roberta-large) model and fine-tuned in on a 1B sentence pairs dataset. We use a contrastive learning objective: given a sentence from the pair, the model should predict which out of a set of randomly sampled other sentences, was actually paired with it in our dataset. 
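The contrastive objective just described — given a sentence from a pair, predict which of the in-batch candidates is its true partner — can be sketched as a cross-entropy loss over scaled cosine similarities. This is an illustrative, dependency-free reimplementation, not the project's actual training code; the `scale` temperature of 20.0 is an assumed value:

```python
import math

def in_batch_contrastive_loss(a_embs, b_embs, scale=20.0):
    """Cross-entropy over in-batch cosine similarities.

    a_embs[i] and b_embs[i] form a true pair; every other b in the batch
    serves as an in-batch negative. `scale` is an assumed temperature.
    """
    def cos(u, v):
        dot = sum(x * y for x, y in zip(u, v))
        return dot / (math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(x * x for x in v)))

    losses = []
    for i, a in enumerate(a_embs):
        logits = [scale * cos(a, b) for b in b_embs]
        # Numerically stable log-sum-exp, then -log softmax at the true pair.
        m = max(logits)
        log_z = m + math.log(sum(math.exp(l - m) for l in logits))
        losses.append(log_z - logits[i])
    return sum(losses) / len(losses)

# Toy batch: each embedding already points toward its true pair,
# so the loss is small.
a = [[1.0, 0.0], [0.0, 1.0]]
b = [[0.9, 0.1], [0.1, 0.9]]
print(in_batch_contrastive_loss(a, b))
```

In practice this is computed batch-wise on GPU/TPU tensors; the scalar version above only illustrates the shape of the objective.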
We developed this model during the [Community week using JAX/Flax for NLP & CV](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organized by Hugging Face, as part of the project [Train the Best Sentence Embedding Model Ever with 1B Training Pairs](https://discuss.huggingface.co/t/train-the-best-sentence-embedding-model-ever-with-1b-training-pairs/7354). We benefited from efficient hardware infrastructure to run the project: 7 TPU v3-8s, as well as guidance from Google's Flax, JAX, and Cloud team members on efficient deep learning frameworks. ## Intended uses Our model is intended to be used as a sentence and short-paragraph encoder. Given an input text, it outputs a vector which captures the semantic information. The sentence vector may be used for information retrieval, clustering or sentence similarity tasks. By default, input text longer than 128 word pieces is truncated. ## Training procedure ### Pre-training We use the pretrained [`roberta-large`](https://huggingface.co/roberta-large) model. Please refer to its model card for more detailed information about the pre-training procedure. ### Fine-tuning We fine-tune the model using a contrastive objective. Formally, we compute the cosine similarity for each possible sentence pair in the batch, then apply the cross-entropy loss by comparing with the true pairs. #### Hyperparameters We trained our model on a TPU v3-8 for 400k steps with a batch size of 256 (32 per TPU core), a learning-rate warm-up over 500 steps, a sequence length limited to 128 tokens, and the AdamW optimizer with a 2e-5 learning rate. The full training script is available in this repository: `train_script.py`. #### Training data We use the concatenation of multiple datasets to fine-tune our model; the total number of training pairs is above 1 billion. 
We sampled each dataset with a weighted probability; the configuration is detailed in the `data_config.json` file.

| Dataset | Paper | Number of training tuples |
|--------------------------------------------------------|:----------------------------------------:|:--------------------------:|
| [Reddit comments (2015-2018)](https://github.com/PolyAI-LDN/conversational-datasets/tree/master/reddit) | [paper](https://arxiv.org/abs/1904.06472) | 726,484,430 |
| [S2ORC](https://github.com/allenai/s2orc) Citation pairs (Abstracts) | [paper](https://aclanthology.org/2020.acl-main.447/) | 116,288,806 |
| [WikiAnswers](https://github.com/afader/oqa#wikianswers-corpus) Duplicate question pairs | [paper](https://doi.org/10.1145/2623330.2623677) | 77,427,422 |
| [PAQ](https://github.com/facebookresearch/PAQ) (Question, Answer) pairs | [paper](https://arxiv.org/abs/2102.07033) | 64,371,441 |
| [S2ORC](https://github.com/allenai/s2orc) Citation pairs (Titles) | [paper](https://aclanthology.org/2020.acl-main.447/) | 52,603,982 |
| [S2ORC](https://github.com/allenai/s2orc) (Title, Abstract) | [paper](https://aclanthology.org/2020.acl-main.447/) | 41,769,185 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) (Title, Body) pairs | - | 25,316,456 |
| [MS MARCO](https://microsoft.github.io/msmarco/) triplets | [paper](https://doi.org/10.1145/3404835.3462804) | 9,144,553 |
| [GOOAQ: Open Question Answering with Diverse Answer Types](https://github.com/allenai/gooaq) | [paper](https://arxiv.org/pdf/2104.08727.pdf) | 3,012,496 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Answer) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 1,198,260 |
| [Code Search](https://huggingface.co/datasets/code_search_net) | - | 1,151,414 |
| [COCO](https://cocodataset.org/#home) Image captions | [paper](https://link.springer.com/chapter/10.1007%2F978-3-319-10602-1_48) | 828,395 |
| [SPECTER](https://github.com/allenai/specter) citation triplets | [paper](https://doi.org/10.18653/v1/2020.acl-main.207) | 684,100 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Question, Answer) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 681,164 |
| [Yahoo Answers](https://www.kaggle.com/soumikrakshit/yahoo-answers-dataset) (Title, Question) | [paper](https://proceedings.neurips.cc/paper/2015/hash/250cf8b51c773f3f8dc8b4be867a9a02-Abstract.html) | 659,896 |
| [SearchQA](https://huggingface.co/datasets/search_qa) | [paper](https://arxiv.org/abs/1704.05179) | 582,261 |
| [Eli5](https://huggingface.co/datasets/eli5) | [paper](https://doi.org/10.18653/v1/p19-1346) | 325,475 |
| [Flickr 30k](https://shannon.cs.illinois.edu/DenotationGraph/) | [paper](https://transacl.org/ojs/index.php/tacl/article/view/229/33) | 317,695 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (titles) | - | 304,525 |
| AllNLI ([SNLI](https://nlp.stanford.edu/projects/snli/) and [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/)) | [paper SNLI](https://doi.org/10.18653/v1/d15-1075), [paper MultiNLI](https://doi.org/10.18653/v1/n18-1101) | 277,230 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (bodies) | - | 250,519 |
| [Stack Exchange](https://huggingface.co/datasets/flax-sentence-embeddings/stackexchange_xml) Duplicate questions (titles+bodies) | - | 250,460 |
| [Sentence Compression](https://github.com/google-research-datasets/sentence-compression) | [paper](https://www.aclweb.org/anthology/D13-1155/) | 180,000 |
| [Wikihow](https://github.com/pvl/wikihow_pairs_dataset) | [paper](https://arxiv.org/abs/1810.09305) | 128,542 |
| [Altlex](https://github.com/chridey/altlex/) | [paper](https://aclanthology.org/P16-1135.pdf) | 112,696 |
| [Quora Question Triplets](https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs) | - | 103,663 |
| [Simple Wikipedia](https://cs.pomona.edu/~dkauchak/simplification/) | [paper](https://www.aclweb.org/anthology/P11-2117/) | 102,225 |
| [Natural Questions (NQ)](https://ai.google.com/research/NaturalQuestions) | [paper](https://transacl.org/ojs/index.php/tacl/article/view/1455) | 100,231 |
| [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer/) | [paper](https://aclanthology.org/P18-2124.pdf) | 87,599 |
| [TriviaQA](https://huggingface.co/datasets/trivia_qa) | - | 73,346 |
| **Total** | | **1,124,818,467** |
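As noted under Intended uses, the output vectors support information retrieval and similarity tasks via cosine similarity. A minimal, dependency-free sketch of cosine-similarity ranking; the toy 3-dimensional vectors are invented for illustration (real embeddings from this model are 1024-dimensional):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return dot / (nu * nv)

# Toy "embeddings" (made up for illustration).
query = [0.2, 0.1, 0.9]
docs = {"cat": [0.3, 0.1, 0.8], "car": [0.9, 0.4, 0.1]}

# Rank documents by similarity to the query.
best = max(docs, key=lambda k: cosine(query, docs[k]))
print(best)  # → cat
```

Since the model's embeddings are already L2-normalized (see the normalization step in the usage snippet), cosine similarity further reduces to a plain dot product in practice.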
9,839
[ [ -0.040069580078125, -0.0245513916015625, 0.052978515625, 0.03582763671875, -0.040557861328125, 0.0164642333984375, 0.023773193359375, -0.02740478515625, 0.074462890625, 0.07568359375, -0.051025390625, -0.024810791015625, -0.040771484375, -0.00603485107421875, -0.030914306640625, 0.068359375, -0.0201416015625, 0.0182342529296875, -0.0025691986083984375, -0.0055999755859375, -0.044647216796875, -0.00849151611328125, -0.05560302734375, -0.0277557373046875, 0.054351806640625, 0.033416748046875, 0.051971435546875, 0.049957275390625, 0.0521240234375, 0.0117950439453125, 0.030059814453125, 0.0094757080078125, -0.029052734375, -0.001373291015625, -0.01201629638671875, -0.01514434814453125, -0.0225677490234375, -0.0289459228515625, 0.033966064453125, 0.057342529296875, -0.035003662109375, 0.01001739501953125, -0.0159912109375, 0.031982421875, -0.052978515625, 0.020111083984375, -0.034393310546875, -0.0023097991943359375, -0.0305023193359375, -0.0243988037109375, -0.00437164306640625, -0.01331329345703125, 0.00766754150390625, -0.07965087890625, 0.039276123046875, -0.007251739501953125, 0.06610107421875, 0.0033016204833984375, -0.0462646484375, 0.0001437664031982422, -0.041168212890625, 0.053680419921875, -0.0065460205078125, 0.05712890625, 0.0296173095703125, 0.035797119140625, -0.040283203125, -0.047271728515625, -0.0244293212890625, -0.0148468017578125, -0.02008056640625, -0.0024738311767578125, -0.007965087890625, 0.01404571533203125, 0.0271148681640625, 0.041900634765625, -0.03619384765625, 0.0096435546875, -0.050689697265625, 0.0110931396484375, 0.05029296875, 0.01023101806640625, 0.044708251953125, -0.0115966796875, -0.04327392578125, 0.007266998291015625, -0.060882568359375, 0.0163726806640625, 0.0176849365234375, 0.0202789306640625, -0.03814697265625, 0.0570068359375, -0.0005388259887695312, 0.035736083984375, 0.036834716796875, -0.0082855224609375, 0.0184783935546875, -0.0213623046875, -0.0523681640625, 0.027923583984375, 0.0325927734375, 0.045501708984375, 
0.031524658203125, 0.005580902099609375, -0.051025390625, -0.0011615753173828125, 0.046417236328125, -0.03387451171875, -0.01336669921875, 0.00341033935546875, -0.0399169921875, -0.02337646484375, 0.037078857421875, -0.034759521484375, -0.0311279296875, -0.00667572021484375, 0.0533447265625, -0.045074462890625, -0.043853759765625, -0.0157012939453125, -0.0498046875, 0.019256591796875, 0.00946807861328125, -0.061004638671875, 0.0267791748046875, 0.0177154541015625, 0.0498046875, 0.0179443359375, -0.024322509765625, -0.049285888671875, 0.00047779083251953125, -0.0352783203125, 0.059661865234375, -0.0282135009765625, -0.04132080078125, 0.005023956298828125, 0.033843994140625, 0.0173187255859375, -0.043426513671875, 0.0850830078125, -0.0302886962890625, 0.0188751220703125, -0.010955810546875, -0.0246429443359375, 0.00791168212890625, -0.0158843994140625, -0.055511474609375, 0.0721435546875, 0.012451171875, -0.049957275390625, 0.04400634765625, -0.05645751953125, -0.043487548828125, 0.01548004150390625, 0.004909515380859375, -0.0146942138671875, 0.0274505615234375, 0.00540924072265625, 0.0178680419921875, -0.004547119140625, -0.0161590576171875, -0.047119140625, -0.04443359375, 0.011962890625, 0.00013256072998046875, 0.087890625, 0.00714874267578125, -0.03741455078125, -0.01026153564453125, -0.0548095703125, -0.0118865966796875, 0.0180206298828125, -0.0231781005859375, -0.00811004638671875, -0.0180206298828125, 0.0093994140625, 0.01629638671875, 0.034088134765625, -0.055999755859375, 0.022186279296875, -0.0142822265625, -0.001834869384765625, 0.0809326171875, 0.0177764892578125, 0.016876220703125, -0.0650634765625, 0.048675537109375, -0.00482177734375, 0.029144287109375, 0.0057830810546875, -0.032623291015625, -0.046478271484375, -0.033782958984375, -0.0225830078125, 0.0265655517578125, -0.061920166015625, 0.01641845703125, -0.00525665283203125, -0.06024169921875, -0.004825592041015625, -0.0038280487060546875, 0.0276336669921875, 0.02130126953125, 0.006473541259765625, 
-0.016754150390625, -0.031707763671875, -0.072021484375, -0.022918701171875, -0.00960540771484375, 0.012969970703125, 0.0158538818359375, 0.0200958251953125, 0.019744873046875, 0.050811767578125, -0.050445556640625, -0.019256591796875, -0.0304412841796875, 0.00408172607421875, 0.0185089111328125, 0.042938232421875, 0.031982421875, -0.060821533203125, -0.03753662109375, -0.037628173828125, -0.06982421875, -0.050537109375, -0.0338134765625, -0.03717041015625, -0.01459503173828125, 0.0222930908203125, -0.09210205078125, 0.052490234375, 0.0382080078125, -0.04669189453125, 0.03558349609375, 0.001743316650390625, 0.00569915771484375, -0.058624267578125, -0.01114654541015625, 0.020660400390625, -0.034698486328125, -0.03094482421875, 0.014007568359375, -0.014190673828125, -0.0126800537109375, -0.029388427734375, 0.035400390625, -0.039154052734375, -0.0207672119140625, -0.04620361328125, -0.01000213623046875, -0.0201416015625, 0.007793426513671875, -0.0193634033203125, 0.04547119140625, 0.057891845703125, -0.05340576171875, 0.0416259765625, -0.01444244384765625, 0.011688232421875, 0.04937744140625, -0.042266845703125, -0.00946807861328125, -0.006710052490234375, 0.004241943359375, -0.0743408203125, -0.029815673828125, -0.00042891502380371094, -0.0223541259765625, -0.0103302001953125, -0.0181427001953125, -0.01776123046875, -0.052215576171875, -0.019989013671875, 0.0462646484375, 0.0662841796875, -0.0267791748046875, 0.0244140625, -0.0011987686157226562, -0.0030364990234375, -0.0251312255859375, -0.04046630859375, 0.0008096694946289062, -0.016357421875, -0.0102386474609375, 0.0426025390625, -0.0292205810546875, -0.042694091796875, -0.024932861328125, -0.0216217041015625, -0.0540771484375, 0.007167816162109375, 0.034088134765625, 0.005344390869140625, -0.043731689453125, -0.00295257568359375, -0.019744873046875, -0.0113525390625, 0.0211639404296875, -0.034637451171875, 0.029144287109375, 0.0008149147033691406, -0.0159454345703125, -0.037445068359375, 0.04852294921875, 
0.0865478515625, 0.0267486572265625, 0.0302886962890625, 0.034332275390625, -0.0261077880859375, 0.003925323486328125, -0.013580322265625, -0.0247344970703125, -0.05560302734375, -0.00809478759765625, -0.01294708251953125, -0.047332763671875, 0.01377105712890625, 0.03582763671875, 0.00348663330078125, 0.034088134765625, 0.016510009765625, -0.033294677734375, 0.07611083984375, 0.0335693359375, 0.04022216796875, 0.03302001953125, -0.0257720947265625, 0.0200347900390625, -0.07098388671875, -0.0173187255859375, -0.038665771484375, -0.0316162109375, -0.03350830078125, -0.00829315185546875, 0.0389404296875, 0.0252227783203125, -0.018524169921875, 0.0487060546875, -0.057281494140625, 0.0242919921875, 0.0340576171875, 0.0200958251953125, 0.005146026611328125, -0.033416748046875, 0.00936126708984375, -0.018310546875, -0.019317626953125, -0.052490234375, 0.08056640625, 0.0173187255859375, 0.05999755859375, 0.0360107421875, 0.064453125, 0.024932861328125, 0.05657958984375, -0.058013916015625, 0.021087646484375, -0.031280517578125, -0.07342529296875, -0.0120391845703125, 0.004238128662109375, -0.0743408203125, -0.00980377197265625, -0.003070831298828125, -0.07061767578125, 0.052490234375, -0.029693603515625, -0.049774169921875, 0.018585205078125, -0.076904296875, 0.0655517578125, -0.01076507568359375, -0.00684356689453125, 0.01432037353515625, -0.033660888671875, 0.0172576904296875, 0.010894775390625, 0.0113067626953125, -0.04156494140625, 0.0214080810546875, 0.054229736328125, -0.0157623291015625, 0.0379638671875, -0.002716064453125, 0.00960540771484375, 0.030059814453125, 0.025146484375, 0.0107879638671875, 0.005962371826171875, -0.0024509429931640625, 0.0179290771484375, -0.0047454833984375, -0.058380126953125, -0.01611328125, 0.048370361328125, -0.037078857421875, -0.051239013671875, -0.052398681640625, 0.0037479400634765625, 0.0230255126953125, 0.023406982421875, 0.049591064453125, 0.050262451171875, 0.0037860870361328125, 0.010986328125, 0.05938720703125, 
0.015838623046875, 0.042999267578125, 0.036224365234375, -0.01715087890625, -0.07562255859375, 0.05938720703125, 0.034393310546875, 0.039794921875, 0.03564453125, 0.033233642578125, -0.0416259765625, -0.006351470947265625, -0.037628173828125, 0.01464080810546875, -0.03375244140625, -0.0029087066650390625, -0.031524658203125, -0.0394287109375, 0.00026106834411621094, -0.044281005859375, -0.02716064453125, -0.0648193359375, -0.03131103515625, 0.003627777099609375, 0.045989990234375, 0.042724609375, -0.04217529296875, 0.04986572265625, -0.05072021484375, 0.046600341796875, 0.0027980804443359375, 0.02557373046875, -0.032135009765625, -0.053558349609375, -0.0165863037109375, 0.018310546875, -0.01415252685546875, -0.0731201171875, 0.026702880859375, 0.006298065185546875, 0.0265655517578125, 0.048797607421875, -0.007793426513671875, 0.08123779296875, -0.06500244140625, 0.07830810546875, 0.0182647705078125, -0.07293701171875, 0.01552581787109375, -0.04180908203125, 0.028717041015625, 0.0499267578125, 0.02606201171875, -0.036834716796875, -0.03363037109375, -0.056854248046875, -0.053680419921875, 0.049102783203125, 0.02191162109375, 0.0162200927734375, -0.0013456344604492188, 0.037017822265625, 0.0021305084228515625, 0.003070831298828125, -0.07550048828125, -0.0482177734375, -0.01100921630859375, 0.0275115966796875, 0.01824951171875, -0.0200347900390625, 0.00799560546875, -0.0168914794921875, 0.057037353515625, 0.019287109375, 0.026611328125, 0.0177154541015625, 0.00238800048828125, -0.005016326904296875, 0.033447265625, 0.04754638671875, 0.062255859375, -0.03485107421875, 0.03436279296875, -0.0122222900390625, -0.022796630859375, 0.0244293212890625, -0.01151275634765625, -0.0014982223510742188, 0.016143798828125, 0.0185546875, 0.00913238525390625, -0.003528594970703125, -0.0172882080078125, 0.023406982421875, 0.0004475116729736328, -0.018096923828125, -0.06072998046875, 0.01654052734375, 0.01432037353515625, 0.0006394386291503906, 0.038238525390625, 0.03167724609375, 
0.022491455078125, -0.04302978515625, 0.034759521484375, 0.0191802978515625, -0.043792724609375, -0.0186920166015625, 0.05718994140625, -0.0100555419921875, -0.035736083984375, 0.042388916015625, -0.032470703125, -0.01165008544921875, 0.025909423828125, 0.07269287109375, 0.032684326171875, -0.00859832763671875, 0.019500732421875, 0.044921875, -0.007022857666015625, 0.0163726806640625, 0.063232421875, -0.006988525390625, -0.038726806640625, -0.0085601806640625, -0.038238525390625, -0.01280975341796875, -0.031494140625, -0.04254150390625, 0.0208740234375, -0.0341796875, -0.0191802978515625, -0.01161956787109375, -0.00531005859375, -0.0697021484375, 0.021026611328125, 0.0208740234375, 0.066162109375, -0.054595947265625, 0.035614013671875, 0.052398681640625, -0.0570068359375, -0.06719970703125, 0.0163726806640625, -0.007293701171875, -0.049102783203125, 0.037811279296875, 0.003475189208984375, -0.01471710205078125, 0.00356292724609375, -0.039794921875, -0.048004150390625, 0.1031494140625, -0.005157470703125, -0.043487548828125, 0.0091705322265625, 0.003406524658203125, 0.0338134765625, -0.00408935546875, 0.021759033203125, 0.02490234375, 0.046417236328125, 0.0006513595581054688, -0.08380126953125, -0.0408935546875, -0.006622314453125, -0.017059326171875, 0.02392578125, -0.0809326171875, 0.03692626953125, -0.00991058349609375, -0.0136871337890625, 0.0147857666015625, 0.0633544921875, 0.033294677734375, 0.01248931884765625, 0.0217437744140625, 0.057708740234375, 0.054168701171875, -0.0022602081298828125, 0.06658935546875, -0.055511474609375, 0.03033447265625, 0.07525634765625, -0.0161895751953125, 0.04901123046875, 0.04681396484375, -0.044036865234375, 0.03277587890625, 0.0307769775390625, -0.0023593902587890625, 0.048065185546875, 0.00823211669921875, 0.0003581047058105469, -0.0240325927734375, -0.0014972686767578125, -0.0282135009765625, 0.03790283203125, 0.00405120849609375, -0.042449951171875, -0.0193634033203125, -0.0206756591796875, 0.006008148193359375, 
-0.0214996337890625, 0.00775909423828125, 0.053497314453125, 0.017913818359375, -0.04986572265625, 0.0164642333984375, -0.006488800048828125, 0.0301055908203125, -0.037750244140625, 0.0015249252319335938, -0.0101318359375, 0.010345458984375, -0.026824951171875, -0.07904052734375, 0.030242919921875, -0.0162200927734375, -0.03033447265625, -0.041290283203125, 0.0274505615234375, -0.029815673828125, -0.034393310546875, -0.0161285400390625, 0.0112152099609375, 0.01163482666015625, -0.0010213851928710938, -0.0247650146484375, -0.0249786376953125, 0.0243988037109375, -0.0274505615234375, 0.035675048828125, 0.051361083984375, -0.00255584716796875, 0.0217132568359375, 0.03961181640625, 0.00545501708984375, -0.005825042724609375, -0.01641845703125, 0.0618896484375, -0.037322998046875, -0.0197906494140625, -0.0418701171875, 0.0306549072265625, 0.0031337738037109375, -0.0099334716796875, 0.060821533203125, 0.0736083984375, 0.058013916015625, -0.0072021484375, 0.026031494140625, -0.030670166015625, 0.04937744140625, -0.021026611328125, 0.04632568359375, -0.050140380859375, -0.004619598388671875, 0.01824951171875, -0.0748291015625, -0.0049285888671875, 0.035614013671875, -0.0259857177734375, -0.014678955078125, 0.0706787109375, 0.049041748046875, -0.00310516357421875, 0.028289794921875, 0.032318115234375, 0.026031494140625, 0.029937744140625, 0.040008544921875, 0.04046630859375, -0.016876220703125, 0.044647216796875, -0.01995849609375, -0.03070068359375, -0.0452880859375, -0.05206298828125, -0.0318603515625, -0.04052734375, -0.028045654296875, -0.040557861328125, 0.0060272216796875, 0.0352783203125, 0.06402587890625, -0.02825927734375, -0.013916015625, 0.0009055137634277344, 0.0143890380859375, -0.0145721435546875, -0.00981903076171875, 0.04437255859375, -0.0036373138427734375, -0.0284576416015625, 0.0289306640625, 0.0184783935546875, 0.007160186767578125, -0.013916015625, 0.032257080078125, -0.005764007568359375, 0.005809783935546875, 0.031280517578125, 0.0311126708984375, 
-0.02703857421875, -0.047882080078125, 0.0011167526245117188, -0.01172637939453125, 0.0021610260009765625, 0.0452880859375, -0.0440673828125, 0.033172607421875, 0.02801513671875, 0.0124053955078125, 0.048431396484375, 0.0147857666015625, 0.0743408203125, -0.05224609375, 0.041107177734375, 0.0153350830078125, 0.04913330078125, -0.01023101806640625, -0.045257568359375, 0.0185394287109375, 0.055938720703125, -0.07464599609375, -0.036865234375, 0.01690673828125, -0.08563232421875, 0.0299072265625, 0.082763671875, -0.00800323486328125, -0.00385284423828125, 0.002803802490234375, -0.0282135009765625, 0.01209259033203125, -0.02783203125, 0.020477294921875, 0.052642822265625, -0.00001049041748046875, -0.00981903076171875, -0.0439453125, 0.040618896484375, 0.00623321533203125, -0.08453369140625, 0.008636474609375, 0.0243988037109375, 0.034942626953125, 0.015838623046875, 0.040283203125, -0.021514892578125, 0.03973388671875, 0.038360595703125, -0.0217437744140625, -0.0126800537109375, -0.05517578125, -0.007442474365234375, 0.0248870849609375, 0.03729248046875, -0.0205230712890625 ] ]
Lykon/dreamshaper-xl-1-0
2023-08-26T16:44:49.000Z
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "stable-diffusion-xl", "text-to-image", "art", "artistic", "anime", "dreamshaper", "en", "license:openrail++", "endpoints_compatible", "has_space", "diffusers:StableDiffusionXLPipeline", "region:us" ]
text-to-image
Lykon
null
null
Lykon/dreamshaper-xl-1-0
15
9,370
diffusers
2023-08-26T16:44:49
--- language: - en license: openrail++ tags: - stable-diffusion - stable-diffusion-diffusers - stable-diffusion-xl - text-to-image - art - artistic - diffusers - anime - dreamshaper duplicated_from: lykon-models/dreamshaper-xl-1-0 --- # Dreamshaper SDXL-1-0 `lykon-models/dreamshaper-xl-1-0` is a Stable Diffusion model that has been fine-tuned on [stabilityai/stable-diffusion-xl-base-1.0](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0). Please consider supporting me: - on [Patreon](https://www.patreon.com/Lykon275) - or [buy me a coffee](https://snipfeed.co/lykon) ## Diffusers For more general information on how to run text-to-image models with 🧨 Diffusers, see [the docs](https://huggingface.co/docs/diffusers/using-diffusers/conditional_image_generation). 1. Installation ``` pip install diffusers transformers accelerate ``` 2. Run ```py from diffusers import AutoPipelineForText2Image, DEISMultistepScheduler import torch pipe = AutoPipelineForText2Image.from_pretrained('lykon-models/dreamshaper-xl-1-0', torch_dtype=torch.float16, variant="fp16") pipe.scheduler = DEISMultistepScheduler.from_config(pipe.scheduler.config) pipe = pipe.to("cuda") prompt = "portrait photo of muscular bearded guy in a worn mech suit, light bokeh, intricate, steel metal, elegant, sharp focus, soft lighting, vibrant colors" generator = torch.manual_seed(0) # pass the seeded generator so the result is reproducible image = pipe(prompt, num_inference_steps=25, generator=generator).images[0] image.save("./image.png") ``` ![](./image.png)
1,485
[ [ -0.02056884765625, -0.032073974609375, 0.039703369140625, 0.034515380859375, -0.038604736328125, -0.00939178466796875, 0.004062652587890625, -0.01605224609375, 0.025787353515625, 0.043701171875, -0.043975830078125, -0.0261688232421875, -0.059814453125, -0.002353668212890625, -0.0227203369140625, 0.06585693359375, -0.01114654541015625, -0.013092041015625, -0.0005178451538085938, 0.006488800048828125, -0.0273284912109375, 0.00260162353515625, -0.0919189453125, -0.022857666015625, 0.044525146484375, -0.004726409912109375, 0.05706787109375, 0.0207977294921875, 0.0224761962890625, 0.0299835205078125, -0.005397796630859375, -0.00818634033203125, -0.029083251953125, 0.03326416015625, 0.0032749176025390625, -0.02392578125, -0.0255279541015625, -0.0241546630859375, 0.04144287109375, 0.0037975311279296875, -0.0299530029296875, 0.01007080078125, 0.0148468017578125, 0.037689208984375, -0.04144287109375, 0.01474761962890625, -0.0222015380859375, 0.00745391845703125, -0.0147857666015625, 0.0124053955078125, -0.0164947509765625, -0.033416748046875, -0.002201080322265625, -0.05120849609375, 0.024932861328125, -0.0080718994140625, 0.08319091796875, 0.032318115234375, -0.025787353515625, -0.01062774658203125, -0.050201416015625, 0.062225341796875, -0.061187744140625, 0.02880859375, 0.00634765625, 0.0217742919921875, 0.01085662841796875, -0.098876953125, -0.04296875, 0.0211334228515625, -0.02471923828125, 0.036346435546875, -0.0190582275390625, 0.01056671142578125, 0.024444580078125, 0.0162506103515625, -0.0364990234375, -0.00899505615234375, -0.050933837890625, -0.02099609375, 0.0482177734375, 0.01580810546875, 0.011566162109375, -0.00971221923828125, -0.0238037109375, -0.0048065185546875, -0.0048980712890625, 0.004913330078125, 0.0208282470703125, -0.00859832763671875, -0.03729248046875, 0.052337646484375, 0.0016231536865234375, 0.034423828125, 0.0261688232421875, -0.0107574462890625, 0.0205078125, 0.0159912109375, -0.020477294921875, -0.00884246826171875, 0.07073974609375, 
0.046661376953125, 0.0036106109619140625, 0.00860595703125, -0.02685546875, 0.0286712646484375, -0.001941680908203125, -0.09326171875, -0.034912109375, 0.042510986328125, -0.047332763671875, -0.03826904296875, -0.018951416015625, -0.0587158203125, -0.0245361328125, 0.013275146484375, 0.0438232421875, -0.0267333984375, -0.05181884765625, 0.023162841796875, -0.025604248046875, 0.0157318115234375, 0.0216522216796875, -0.05816650390625, 0.0172271728515625, 0.0194091796875, 0.07708740234375, -0.0028285980224609375, -0.0010194778442382812, 0.0002149343490600586, 0.0029811859130859375, -0.0384521484375, 0.0565185546875, -0.0036907196044921875, -0.03887939453125, 0.00324249267578125, 0.0173187255859375, -0.0099639892578125, -0.0275726318359375, 0.06622314453125, -0.039459228515625, 0.02642822265625, 0.0008420944213867188, -0.04962158203125, -0.00489044189453125, -0.00514984130859375, -0.0374755859375, 0.07220458984375, 0.0263824462890625, -0.07501220703125, 0.0261383056640625, -0.05267333984375, -0.010650634765625, 0.0129241943359375, -0.01299285888671875, -0.059967041015625, 0.0074005126953125, -0.015167236328125, 0.04986572265625, 0.01166534423828125, -0.0005068778991699219, -0.020416259765625, -0.0194244384765625, -0.0017404556274414062, -0.0176544189453125, 0.07806396484375, 0.02789306640625, -0.0155792236328125, 0.00885009765625, -0.0511474609375, 0.011566162109375, 0.03021240234375, -0.01007080078125, 0.004184722900390625, -0.009613037109375, 0.037811279296875, 0.017608642578125, 0.010894775390625, -0.062469482421875, 0.0140533447265625, -0.01702880859375, 0.0179595947265625, 0.06396484375, 0.01087188720703125, 0.00983428955078125, -0.037872314453125, 0.049163818359375, 0.01114654541015625, 0.0011348724365234375, 0.0060272216796875, -0.050048828125, -0.0809326171875, -0.032440185546875, -0.005512237548828125, 0.023712158203125, -0.053253173828125, 0.02239990234375, -0.001651763916015625, -0.0386962890625, -0.031524658203125, -0.020904541015625, 0.00868988037109375, 
0.0301971435546875, 0.007770538330078125, -0.013153076171875, -0.048553466796875, -0.06634521484375, 0.0047760009765625, 0.00823974609375, -0.01116180419921875, 0.01018524169921875, 0.031890869140625, -0.0306396484375, 0.046051025390625, -0.0537109375, -0.028045654296875, -0.0175018310546875, -0.00826263427734375, 0.05718994140625, 0.046600341796875, 0.055419921875, -0.050933837890625, -0.06658935546875, -0.0128021240234375, -0.05072021484375, -0.0005888938903808594, -0.0009212493896484375, -0.00946044921875, 0.020477294921875, 0.01116943359375, -0.076904296875, 0.034149169921875, 0.058746337890625, -0.056793212890625, 0.055511474609375, -0.0206451416015625, 0.01239776611328125, -0.0948486328125, 0.0012159347534179688, 0.01447296142578125, -0.042266845703125, -0.033233642578125, -0.005252838134765625, 0.0018291473388671875, -0.01493072509765625, -0.0411376953125, 0.05963134765625, -0.016357421875, 0.03375244140625, -0.0200958251953125, -0.0201416015625, -0.00159454345703125, 0.0264739990234375, 0.0218505859375, 0.0283966064453125, 0.09100341796875, -0.04901123046875, 0.0323486328125, 0.033477783203125, -0.026092529296875, 0.038482666015625, -0.07879638671875, 0.02001953125, -0.0215911865234375, 0.017333984375, -0.07513427734375, -0.01520538330078125, 0.04803466796875, -0.0260467529296875, 0.0290069580078125, -0.0289154052734375, -0.0255279541015625, -0.038238525390625, -0.01113128662109375, 0.03472900390625, 0.0792236328125, -0.0350341796875, 0.04193115234375, 0.0032100677490234375, 0.00571441650390625, -0.0428466796875, -0.04278564453125, -0.0172576904296875, -0.0208282470703125, -0.05706787109375, 0.02728271484375, -0.0215911865234375, -0.0184783935546875, 0.00849151611328125, -0.00882720947265625, -0.0059967041015625, -0.0301666259765625, 0.029327392578125, 0.0197906494140625, -0.01593017578125, -0.036224365234375, 0.0136260986328125, -0.038543701171875, 0.0177154541015625, -0.026153564453125, 0.03985595703125, 0.0083465576171875, -0.00836944580078125, 
-0.05120849609375, 0.007282257080078125, 0.038360595703125, 0.0243072509765625, 0.062164306640625, 0.057830810546875, -0.037750244140625, -0.0282440185546875, -0.03704833984375, -0.015594482421875, -0.04547119140625, 0.0264739990234375, -0.0408935546875, -0.0290069580078125, 0.0251617431640625, 0.01342010498046875, 0.02386474609375, 0.042144775390625, 0.042999267578125, -0.044647216796875, 0.08062744140625, 0.043914794921875, 0.01861572265625, 0.0247039794921875, -0.0697021484375, -0.0019063949584960938, -0.055938720703125, -0.00859832763671875, -0.001728057861328125, -0.0330810546875, -0.0247039794921875, -0.028900146484375, 0.0247802734375, -0.0014820098876953125, -0.018341064453125, 0.0206451416015625, -0.04486083984375, 0.0325927734375, 0.00450897216796875, 0.0170440673828125, 0.0025806427001953125, 0.0023097991943359375, -0.007198333740234375, -0.01172637939453125, -0.039215087890625, -0.0291290283203125, 0.056671142578125, 0.0231170654296875, 0.074951171875, -0.000743865966796875, 0.056854248046875, 0.0120697021484375, 0.00461578369140625, -0.049163818359375, 0.0439453125, -0.012969970703125, -0.050262451171875, 0.0032138824462890625, -0.004856109619140625, -0.07537841796875, 0.0248260498046875, -0.0216217041015625, -0.044189453125, 0.007221221923828125, 0.025482177734375, -0.0154876708984375, 0.04022216796875, -0.030548095703125, 0.0633544921875, -0.004459381103515625, -0.041290283203125, -0.0203857421875, -0.0202789306640625, 0.0236968994140625, 0.010711669921875, 0.0004949569702148438, -0.0174102783203125, -0.0164642333984375, 0.04998779296875, -0.0218048095703125, 0.061309814453125, -0.035186767578125, -0.0175018310546875, 0.028106689453125, 0.01178741455078125, 0.038116455078125, 0.01175689697265625, -0.0248260498046875, 0.0083465576171875, 0.0230255126953125, -0.042724609375, -0.032562255859375, 0.053466796875, -0.054718017578125, -0.0261688232421875, -0.049835205078125, -0.0274505615234375, 0.0180816650390625, 0.0230712890625, 0.062225341796875, 
0.0261688232421875, 0.00583648681640625, 0.006778717041015625, 0.046661376953125, 0.01050567626953125, 0.049652099609375, 0.001384735107421875, -0.0389404296875, -0.034820556640625, 0.047515869140625, 0.006404876708984375, 0.0290679931640625, -0.004543304443359375, 0.0230712890625, -0.0267791748046875, -0.0374755859375, -0.0299835205078125, 0.0278167724609375, -0.0557861328125, -0.0187530517578125, -0.049072265625, -0.044586181640625, -0.0095367431640625, -0.024993896484375, -0.030731201171875, -0.032135009765625, -0.0265045166015625, 0.0086669921875, 0.048614501953125, 0.051177978515625, -0.004852294921875, 0.0213470458984375, -0.036224365234375, 0.038116455078125, 0.017913818359375, 0.0400390625, -0.006465911865234375, -0.0479736328125, -0.00823211669921875, 0.0034465789794921875, -0.034027099609375, -0.05352783203125, 0.0355224609375, 0.0095367431640625, 0.03765869140625, 0.036773681640625, -0.00353240966796875, 0.049224853515625, -0.0286102294921875, 0.04840087890625, 0.031890869140625, -0.0526123046875, 0.03802490234375, -0.039581298828125, 0.0108795166015625, 0.0010833740234375, 0.0272369384765625, -0.047271728515625, -0.04156494140625, -0.061065673828125, -0.056396484375, 0.039459228515625, 0.05157470703125, 0.0247039794921875, 0.01459503173828125, 0.042510986328125, 0.01116180419921875, -0.006122589111328125, -0.083251953125, -0.047027587890625, -0.0233001708984375, -0.01226806640625, 0.007427215576171875, -0.0194091796875, 0.00046062469482421875, -0.025299072265625, 0.0858154296875, -0.01309967041015625, 0.0194549560546875, 0.0247955322265625, 0.0169219970703125, -0.025787353515625, -0.0098114013671875, 0.0171661376953125, 0.042144775390625, -0.0221405029296875, -0.0036907196044921875, 0.00890350341796875, -0.04022216796875, 0.020233154296875, -0.0051422119140625, -0.033935546875, 0.02117919921875, -0.0029087066650390625, 0.061767578125, -0.00634765625, -0.0167999267578125, 0.0322265625, -0.0227813720703125, -0.0222625732421875, -0.04278564453125, 
0.0218353271484375, 0.02105712890625, 0.033966064453125, 0.00984954833984375, 0.05364990234375, -0.00009417533874511719, 0.00647735595703125, -0.0019855499267578125, 0.0445556640625, -0.0251312255859375, -0.025909423828125, 0.08038330078125, 0.005950927734375, -0.03045654296875, 0.034149169921875, -0.023345947265625, -0.0104827880859375, 0.0528564453125, 0.07275390625, 0.07965087890625, 0.00122833251953125, 0.01593017578125, 0.060211181640625, -0.0002617835998535156, -0.0204925537109375, 0.045318603515625, 0.005519866943359375, -0.0584716796875, -0.00046062469482421875, -0.047607421875, -0.0171966552734375, 0.0081939697265625, -0.033538818359375, 0.0672607421875, -0.0305938720703125, -0.0249176025390625, -0.0075836181640625, -0.006862640380859375, -0.0531005859375, 0.0284423828125, 0.026641845703125, 0.061492919921875, -0.054656982421875, 0.0684814453125, 0.035400390625, -0.0267181396484375, -0.043304443359375, -0.0026493072509765625, -0.01177978515625, -0.053192138671875, 0.0174713134765625, 0.01143646240234375, -0.01293182373046875, 0.02557373046875, -0.0313720703125, -0.0699462890625, 0.09796142578125, 0.025299072265625, -0.04595947265625, 0.01355743408203125, -0.02435302734375, 0.035003662109375, -0.0159912109375, 0.01715087890625, 0.01224517822265625, 0.0248870849609375, 0.0170745849609375, -0.046600341796875, 0.005153656005859375, -0.0188446044921875, 0.0173187255859375, 0.016845703125, -0.06341552734375, 0.0562744140625, -0.0411376953125, -0.03350830078125, 0.046905517578125, 0.057769775390625, 0.042938232421875, 0.01300811767578125, 0.051025390625, 0.061553955078125, 0.02423095703125, -0.003047943115234375, 0.07562255859375, -0.01416015625, 0.057830810546875, 0.036773681640625, -0.004302978515625, 0.047149658203125, 0.03509521484375, -0.0259552001953125, 0.0589599609375, 0.05413818359375, -0.0213165283203125, 0.041717529296875, 0.009521484375, -0.03173828125, 0.001071929931640625, 0.0217132568359375, -0.0372314453125, -0.0003604888916015625, 
0.0172119140625, -0.02667236328125, -0.004177093505859375, -0.0006518363952636719, 0.0015134811401367188, -0.01873779296875, -0.0218048095703125, 0.03173828125, 0.008453369140625, -0.0289459228515625, 0.050445556640625, -0.01531982421875, 0.08880615234375, -0.07086181640625, -0.0031375885009765625, 0.00322723388671875, 0.0426025390625, -0.03997802734375, -0.07672119140625, 0.031402587890625, -0.020965576171875, 0.0011682510375976562, -0.026824951171875, 0.07135009765625, -0.026275634765625, -0.07879638671875, 0.0262603759765625, 0.00666046142578125, 0.0211334228515625, 0.018341064453125, -0.05877685546875, 0.01959228515625, 0.00386810302734375, -0.03424072265625, -0.0012159347534179688, 0.004058837890625, 0.03704833984375, 0.05328369140625, 0.02813720703125, 0.0166473388671875, -0.00033664703369140625, 0.00019788742065429688, 0.04278564453125, -0.02777099609375, -0.0259552001953125, -0.06768798828125, 0.0740966796875, -0.00269317626953125, -0.031982421875, 0.05267333984375, 0.05938720703125, 0.035919189453125, -0.036865234375, 0.06866455078125, -0.034088134765625, 0.0167388916015625, -0.036590576171875, 0.079833984375, -0.042144775390625, 0.00255584716796875, -0.02178955078125, -0.0701904296875, -0.0011911392211914062, 0.0673828125, 0.026763916015625, 0.0146484375, 0.036895751953125, 0.06494140625, -0.033477783203125, -0.0174713134765625, 0.0283050537109375, 0.02484130859375, 0.0183258056640625, 0.01099395751953125, 0.056243896484375, -0.0626220703125, 0.033966064453125, -0.06378173828125, -0.008636474609375, 0.0286712646484375, -0.06671142578125, -0.05621337890625, -0.05743408203125, -0.0640869140625, -0.0650634765625, -0.004970550537109375, 0.053466796875, 0.0750732421875, -0.058013916015625, -0.0223388671875, -0.019744873046875, -0.0005469322204589844, -0.0169677734375, -0.02252197265625, 0.0201416015625, -0.0121612548828125, -0.072021484375, -0.00923919677734375, 0.0067901611328125, 0.0325927734375, -0.0161285400390625, -0.0192413330078125, 0.0077972412109375, 
0.0018568038940429688, 0.03192138671875, 0.0102386474609375, -0.055145263671875, -0.006290435791015625, -0.012786865234375, 0.0023021697998046875, 0.01026153564453125, 0.036376953125, -0.06146240234375, 0.01290130615234375, 0.0231170654296875, -0.00298309326171875, 0.067626953125, -0.03472900390625, 0.0217132568359375, -0.0452880859375, 0.0251007080078125, 0.0233612060546875, 0.05072021484375, 0.009307861328125, -0.037506103515625, 0.035003662109375, 0.0254669189453125, -0.05743408203125, -0.0438232421875, 0.010528564453125, -0.08514404296875, -0.009063720703125, 0.08099365234375, 0.006496429443359375, -0.00327301025390625, 0.01094818115234375, -0.056884765625, 0.004184722900390625, -0.0217132568359375, 0.054840087890625, 0.032684326171875, -0.0262908935546875, -0.03948974609375, -0.02899169921875, 0.040313720703125, 0.007694244384765625, -0.04083251953125, -0.0062408447265625, 0.024139404296875, 0.055267333984375, 0.023468017578125, 0.064697265625, 0.0025157928466796875, 0.0186309814453125, 0.021209716796875, 0.00830841064453125, 0.0196380615234375, 0.022430419921875, -0.0195465087890625, -0.0088958740234375, -0.00676727294921875, -0.035614013671875 ] ]
timm/resnet50.tv_in1k
2023-04-05T18:15:23.000Z
[ "timm", "pytorch", "safetensors", "image-classification", "arxiv:1512.03385", "license:bsd-3-clause", "region:us" ]
image-classification
timm
null
null
timm/resnet50.tv_in1k
0
9,365
timm
2023-04-05T18:14:58
--- tags: - image-classification - timm library_tag: timm license: bsd-3-clause --- # Model card for resnet50.tv_in1k A ResNet-B image classification model. This model features: * ReLU activations * single layer 7x7 convolution with pooling * 1x1 convolution shortcut downsample Trained on ImageNet-1k, original torchvision model weights. ## Model Details - **Model Type:** Image classification / feature backbone - **Model Stats:** - Params (M): 25.6 - GMACs: 4.1 - Activations (M): 11.1 - Image size: 224 x 224 - **Papers:** - Deep Residual Learning for Image Recognition: https://arxiv.org/abs/1512.03385 - **Original:** https://github.com/pytorch/vision ## Model Usage ### Image Classification ```python from urllib.request import urlopen from PIL import Image import timm import torch img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model('resnet50.tv_in1k', pretrained=True) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1 top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5) ``` ### Feature Map Extraction ```python from urllib.request import urlopen from PIL import Image import timm img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model( 'resnet50.tv_in1k', pretrained=True, features_only=True, ) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1 for o in output: # print 
shape of each feature map in output # e.g.: # torch.Size([1, 64, 112, 112]) # torch.Size([1, 256, 56, 56]) # torch.Size([1, 512, 28, 28]) # torch.Size([1, 1024, 14, 14]) # torch.Size([1, 2048, 7, 7]) print(o.shape) ``` ### Image Embeddings ```python from urllib.request import urlopen from PIL import Image import timm img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model( 'resnet50.tv_in1k', pretrained=True, num_classes=0, # remove classifier nn.Linear ) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor # or equivalently (without needing to set num_classes=0) output = model.forward_features(transforms(img).unsqueeze(0)) # output is unpooled, a (1, 2048, 7, 7) shaped tensor output = model.forward_head(output, pre_logits=True) # output is a (1, num_features) shaped tensor ``` ## Model Comparison Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results). 
|model |img_size|top1 |top5 |param_count|gmacs|macts|img/sec| |------------------------------------------|--------|-----|-----|-----------|-----|-----|-------| |[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|320 |86.72|98.17|93.6 |35.2 |69.7 |451 | |[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|288 |86.51|98.08|93.6 |28.5 |56.4 |560 | |[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|288 |86.49|98.03|93.6 |28.5 |56.4 |557 | |[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|224 |85.96|97.82|93.6 |17.2 |34.2 |923 | |[resnext101_32x32d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x32d.fb_wsl_ig1b_ft_in1k)|224 |85.11|97.44|468.5 |87.3 |91.1 |254 | |[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|416 |85.0 |97.12|191.9 |108.4|213.8|134 | |[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|352 |84.96|97.22|102.1 |50.2 |101.2|291 | |[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|320 |84.73|97.18|102.1 |41.5 |83.7 |353 | |[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|384 |84.71|96.99|164.0 |77.6 |154.7|183 | |[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|288 |84.57|97.08|93.6 |28.5 |56.4 |557 | |[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|320 |84.45|97.08|93.2 |31.5 |67.8 |446 | |[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|352 |84.43|96.97|129.9 |51.1 |105.5|280 | |[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|288 |84.36|96.92|93.6 |27.6 |53.0 |595 | |[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|320 |84.35|97.04|66.8 |24.1 |47.7 
|610 | |[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|288 |84.3 |96.94|164.0 |43.7 |87.1 |333 | |[resnext101_32x8d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_swsl_ig1b_ft_in1k)|224 |84.28|97.17|88.8 |16.5 |31.2 |1100 | |[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|320 |84.24|96.86|191.9 |64.2 |126.6|228 | |[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|288 |84.19|96.87|93.6 |27.2 |51.6 |613 | |[resnext101_32x16d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_wsl_ig1b_ft_in1k)|224 |84.18|97.19|194.0 |36.3 |51.2 |581 | |[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|288 |84.11|97.11|44.6 |15.1 |29.0 |1144 | |[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|320 |83.97|96.82|64.7 |31.2 |67.3 |518 | |[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|256 |83.87|96.75|93.2 |20.2 |43.4 |692 | |[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|224 |83.86|96.65|93.6 |17.2 |34.2 |923 | |[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|320 |83.72|96.61|86.6 |24.3 |48.1 |617 | |[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|256 |83.69|96.78|66.8 |15.4 |30.6 |943 | |[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|224 |83.68|96.61|93.6 |16.7 |32.0 |986 | |[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|320 |83.67|96.74|60.2 |24.1 |47.7 |706 | |[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|256 |83.59|96.61|129.9 |27.1 |55.8 |526 | |[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|224 |83.58|96.4 |93.6 |16.5 |31.2 |1013 | |[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|224 |83.54|96.83|44.6 |9.1 |17.6 |1864 
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|288 |83.46|96.54|60.2 |19.1 |37.3 |904 |
|[resnext101_32x16d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k)|224 |83.35|96.85|194.0 |36.3 |51.2 |582 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|256 |83.23|96.53|64.7 |20.0 |43.1 |809 |
|[resnext101_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_swsl_ig1b_ft_in1k)|224 |83.22|96.75|44.2 |8.0 |21.2 |1814 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|288 |83.16|96.38|83.5 |25.7 |51.6 |590 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|256 |83.14|96.38|60.2 |15.4 |30.5 |1096 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|320 |83.02|96.45|44.6 |16.5 |34.8 |992 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|288 |82.98|96.54|44.6 |13.4 |28.2 |1077 |
|[resnext101_64x4d.tv_in1k](https://huggingface.co/timm/resnext101_64x4d.tv_in1k)|224 |82.98|96.25|83.5 |15.5 |31.2 |989 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|256 |82.86|96.28|86.6 |15.6 |30.8 |951 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|224 |82.83|96.22|88.8 |16.5 |31.2 |1099 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|224 |82.8 |96.13|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|288 |82.8 |96.32|44.6 |13.0 |26.8 |1291 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|288 |82.74|95.71|60.2 |19.1 |37.3 |905 |
|[resnext101_32x8d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_wsl_ig1b_ft_in1k)|224 |82.69|96.63|88.8 |16.5 |31.2 |1100 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|288 |82.62|95.75|60.2 |19.1 |37.3 |904 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|288 |82.61|96.49|25.6 |8.9 |20.6 |1729 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|288 |82.53|96.13|36.8 |9.9 |21.5 |1773 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|224 |82.5 |96.02|126.9 |22.8 |21.2 |1078 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|224 |82.46|95.92|83.5 |15.5 |31.2 |987 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|288 |82.36|96.18|35.7 |8.1 |20.9 |1964 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|320 |82.35|96.14|25.6 |8.8 |24.1 |1386 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|288 |82.31|95.63|44.6 |13.0 |26.8 |1291 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|288 |82.29|96.01|63.6 |13.6 |28.5 |1078 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|224 |82.29|96.0 |60.2 |11.6 |22.6 |1484 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|288 |82.27|96.06|68.9 |18.9 |23.8 |1176 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|256 |82.26|96.07|44.6 |10.6 |22.2 |1542 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|288 |82.24|95.73|44.6 |13.0 |26.8 |1290 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|288 |82.2 |96.14|27.6 |7.0 |23.8 |1547 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|224 |82.18|96.05|44.6 |8.1 |17.1 |1771 |
|[resnext50_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_swsl_ig1b_ft_in1k)|224 |82.17|96.22|25.0 |4.3 |14.4 |2943 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|288 |82.12|95.65|25.6 |7.1 |19.6 |1704 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|288 |82.03|95.94|25.0 |7.0 |23.8 |1745 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|288 |82.0 |96.15|24.9 |5.8 |12.7 |1787 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|256 |81.99|95.85|36.8 |7.8 |17.0 |2230 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|176 |81.98|95.72|88.8 |10.3 |19.4 |1768 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|224 |81.97|95.24|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|224 |81.93|95.75|44.6 |7.8 |16.2 |2122 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|224 |81.9 |95.77|44.6 |7.8 |16.2 |2118 |
|[resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k)|224 |81.84|96.1 |194.0 |36.3 |51.2 |583 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|256 |81.78|95.94|35.7 |6.4 |16.6 |2471 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|224 |81.77|95.22|60.2 |11.6 |22.6 |1485 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|224 |81.74|96.06|25.6 |5.4 |12.4 |2813 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|288 |81.65|95.54|25.6 |7.1 |19.6 |1703 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|288 |81.64|95.88|25.6 |7.2 |19.7 |1694 |
|[resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k)|224 |81.62|96.04|88.8 |16.5 |31.2 |1101 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|224 |81.61|95.76|68.9 |11.4 |14.4 |1930 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|288 |81.61|95.83|25.6 |8.5 |19.2 |1868 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|224 |81.5 |95.16|44.6 |7.8 |16.2 |2125 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|288 |81.48|95.16|25.0 |7.0 |23.8 |1745 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|288 |81.47|95.71|25.9 |6.9 |18.6 |2071 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|224 |81.45|95.53|68.9 |11.4 |14.4 |1929 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|288 |81.44|95.22|25.6 |7.2 |19.7 |1908 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|256 |81.44|95.67|25.6 |5.6 |15.4 |2168 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|288 |81.4 |95.82|30.2 |6.8 |13.9 |2132 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|288 |81.37|95.74|25.6 |7.2 |19.7 |1910 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|224 |81.32|95.19|44.6 |7.8 |16.2 |2125 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|288 |81.3 |95.65|28.1 |6.8 |18.4 |1803 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|288 |81.3 |95.11|25.0 |7.0 |23.8 |1746 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|224 |81.27|95.62|27.6 |4.3 |14.4 |2591 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|224 |81.26|95.16|25.6 |4.3 |11.8 |2823 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|288 |81.23|95.54|15.7 |4.8 |19.6 |2117 |
|[senet154.gluon_in1k](https://huggingface.co/timm/senet154.gluon_in1k)|224 |81.23|95.35|115.1 |20.8 |38.7 |545 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|288 |81.22|95.11|25.6 |6.8 |18.4 |2089 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|288 |81.22|95.63|25.6 |6.8 |18.4 |676 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|288 |81.18|95.09|25.6 |7.2 |19.7 |1908 |
|[resnet50.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet50.fb_swsl_ig1b_ft_in1k)|224 |81.18|95.98|25.6 |4.1 |11.1 |3455 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|224 |81.17|95.34|25.0 |4.3 |14.4 |2933 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|224 |81.1 |95.33|25.0 |4.3 |14.4 |2934 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|288 |81.1 |95.23|28.1 |6.8 |18.4 |1801 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|288 |81.1 |95.12|28.1 |6.8 |18.4 |1799 |
|[resnet152s.gluon_in1k](https://huggingface.co/timm/resnet152s.gluon_in1k)|224 |81.02|95.41|60.3 |12.9 |25.0 |1347 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|288 |80.97|95.44|25.6 |6.8 |18.4 |2085 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|256 |80.94|95.45|25.9 |5.4 |14.7 |2571 |
|[resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.93|95.73|44.2 |8.0 |21.2 |1814 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|288 |80.91|95.55|25.6 |6.8 |18.4 |2084 |
|[seresnext101_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_32x4d.gluon_in1k)|224 |80.9 |95.31|49.0 |8.0 |21.3 |1585 |
|[seresnext101_64x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_64x4d.gluon_in1k)|224 |80.9 |95.3 |88.2 |15.5 |31.2 |918 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|288 |80.86|95.52|25.6 |6.8 |18.4 |2085 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|224 |80.85|95.43|25.6 |4.1 |11.1 |3450 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|224 |80.84|95.02|25.6 |4.3 |11.8 |2821 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|224 |80.79|95.62|24.9 |3.5 |7.7 |2961 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|288 |80.79|95.36|19.8 |6.0 |14.8 |2506 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|288 |80.79|95.58|19.9 |4.2 |10.6 |2349 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|288 |80.78|94.99|25.6 |6.8 |18.4 |2088 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|288 |80.71|95.43|25.6 |6.8 |18.4 |2087 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|288 |80.7 |95.39|25.0 |7.0 |23.8 |1749 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|192 |80.69|95.24|63.6 |6.0 |12.7 |2270 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|224 |80.68|94.71|25.6 |4.4 |11.9 |3162 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|288 |80.68|95.36|19.7 |6.0 |14.8 |2637 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|224 |80.67|95.3 |25.6 |4.1 |11.1 |3452 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|288 |80.67|95.42|25.0 |7.4 |25.1 |1626 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|224 |80.63|95.21|25.6 |5.2 |11.6 |3034 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|224 |80.61|95.32|25.6 |4.4 |11.9 |2813 |
|[resnext101_64x4d.gluon_in1k](https://huggingface.co/timm/resnext101_64x4d.gluon_in1k)|224 |80.61|94.99|83.5 |15.5 |31.2 |989 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|288 |80.6 |95.31|19.9 |6.0 |14.8 |2578 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|256 |80.57|95.17|15.7 |3.8 |15.5 |2710 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|224 |80.56|95.0 |60.2 |11.6 |22.6 |1483 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|224 |80.53|95.16|25.6 |4.4 |11.9 |3164 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|224 |80.53|94.46|25.0 |4.3 |14.4 |2930 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|176 |80.48|94.98|126.9 |14.3 |13.2 |1719 |
|[resnet152d.gluon_in1k](https://huggingface.co/timm/resnet152d.gluon_in1k)|224 |80.47|95.2 |60.2 |11.8 |23.4 |1428 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|288 |80.45|95.32|25.6 |6.8 |18.4 |2086 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|224 |80.45|95.24|30.2 |4.1 |8.4 |3530 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|224 |80.45|94.63|25.0 |4.3 |14.4 |2936 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|176 |80.43|95.09|68.9 |7.3 |9.0 |3015 |
|[resnet101d.gluon_in1k](https://huggingface.co/timm/resnet101d.gluon_in1k)|224 |80.42|95.01|44.6 |8.1 |17.0 |2007 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|224 |80.38|94.6 |25.6 |4.1 |11.1 |3461 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|256 |80.36|95.1 |19.8 |4.8 |11.7 |3267 |
|[resnext101_32x4d.gluon_in1k](https://huggingface.co/timm/resnext101_32x4d.gluon_in1k)|224 |80.34|94.93|44.2 |8.0 |21.2 |1814 |
|[resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.32|95.4 |25.0 |4.3 |14.4 |2941 |
|[resnet101s.gluon_in1k](https://huggingface.co/timm/resnet101s.gluon_in1k)|224 |80.28|95.16|44.7 |9.2 |18.6 |1851 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|224 |80.26|95.08|28.1 |4.1 |11.1 |2972 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|288 |80.24|95.24|25.6 |8.5 |19.9 |1523 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|224 |80.22|94.63|25.6 |4.4 |11.9 |3162 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|176 |80.2 |94.64|60.2 |7.2 |14.0 |2346 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|224 |80.08|94.74|28.1 |4.1 |11.1 |2969 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|256 |80.08|94.97|19.7 |4.8 |11.7 |3284 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|256 |80.06|94.99|19.9 |4.8 |11.7 |3216 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|224 |80.06|94.95|25.6 |4.1 |11.1 |1109 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|224 |80.02|94.71|28.1 |4.1 |11.1 |2962 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|288 |79.97|95.05|25.6 |6.8 |18.4 |2086 |
|[resnet152c.gluon_in1k](https://huggingface.co/timm/resnet152c.gluon_in1k)|224 |79.92|94.84|60.2 |11.8 |23.4 |1455 |
|[seresnext50_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext50_32x4d.gluon_in1k)|224 |79.91|94.82|27.6 |4.3 |14.4 |2591 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|224 |79.91|94.67|25.6 |4.1 |11.1 |3456 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|176 |79.9 |94.6 |44.6 |4.9 |10.1 |3341 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|224 |79.89|94.97|35.7 |4.5 |12.1 |2774 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|224 |79.88|94.87|25.6 |4.1 |11.1 |3455 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|320 |79.86|95.07|16.0 |5.2 |16.4 |2168 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|224 |79.85|94.56|25.6 |4.1 |11.1 |3460 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|288 |79.83|94.97|25.6 |6.8 |18.4 |2087 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|224 |79.82|94.62|44.6 |7.8 |16.2 |2114 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|224 |79.76|94.6 |25.0 |4.3 |14.4 |2943 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|224 |79.74|94.95|25.6 |4.1 |11.1 |3455 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|224 |79.74|94.87|19.9 |2.5 |6.4 |3929 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|288 |79.71|94.83|19.7 |6.0 |14.8 |2710 |
|[resnet152.gluon_in1k](https://huggingface.co/timm/resnet152.gluon_in1k)|224 |79.68|94.74|60.2 |11.6 |22.6 |1486 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|224 |79.67|94.87|25.0 |4.5 |15.2 |2729 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|288 |79.63|94.91|25.6 |6.8 |18.4 |2086 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|224 |79.56|94.72|25.6 |4.3 |11.8 |2805 |
|[resnet101c.gluon_in1k](https://huggingface.co/timm/resnet101c.gluon_in1k)|224 |79.53|94.58|44.6 |8.1 |17.0 |2062 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|224 |79.52|94.61|25.6 |4.1 |11.1 |3459 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|176 |79.42|94.64|25.6 |2.6 |6.9 |5397 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|288 |79.4 |94.66|18.0 |5.9 |14.6 |2752 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|224 |79.38|94.57|25.6 |4.1 |11.1 |3459 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|176 |79.37|94.3 |25.0 |2.7 |9.0 |4577 |
|[resnext50_32x4d.gluon_in1k](https://huggingface.co/timm/resnext50_32x4d.gluon_in1k)|224 |79.36|94.43|25.0 |4.3 |14.4 |2942 |
|[resnext101_32x8d.tv_in1k](https://huggingface.co/timm/resnext101_32x8d.tv_in1k)|224 |79.31|94.52|88.8 |16.5 |31.2 |1100 |
|[resnet101.gluon_in1k](https://huggingface.co/timm/resnet101.gluon_in1k)|224 |79.31|94.53|44.6 |7.8 |16.2 |2125 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|224 |79.31|94.63|25.6 |5.2 |12.0 |2524 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|176 |79.27|94.49|25.6 |2.6 |6.9 |5404 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|224 |79.25|94.31|25.0 |4.3 |14.4 |2931 |
|[resnet50.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet50.fb_ssl_yfcc100m_ft_in1k)|224 |79.22|94.84|25.6 |4.1 |11.1 |3451 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|256 |79.21|94.56|19.7 |4.8 |11.7 |3392 |
|[resnet50d.gluon_in1k](https://huggingface.co/timm/resnet50d.gluon_in1k)|224 |79.07|94.48|25.6 |4.4 |11.9 |3162 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|224 |79.03|94.38|25.6 |4.1 |11.1 |3453 |
|[resnet50.am_in1k](https://huggingface.co/timm/resnet50.am_in1k)|224 |79.01|94.39|25.6 |4.1 |11.1 |3461 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|256 |79.01|94.37|18.0 |4.6 |11.6 |3440 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|256 |78.9 |94.54|16.0 |3.4 |10.5 |3421 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|160 |78.89|94.11|60.2 |5.9 |11.5 |2745 |
|[wide_resnet101_2.tv_in1k](https://huggingface.co/timm/wide_resnet101_2.tv_in1k)|224 |78.84|94.28|126.9 |22.8 |21.2 |1079 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|288 |78.83|94.24|16.8 |4.5 |16.8 |2251 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|224 |78.81|94.32|25.6 |4.1 |11.1 |3454 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|288 |78.74|94.33|16.8 |4.5 |16.7 |2264 |
|[resnet50s.gluon_in1k](https://huggingface.co/timm/resnet50s.gluon_in1k)|224 |78.72|94.23|25.7 |5.5 |13.5 |2796 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|224 |78.71|94.24|25.6 |4.4 |11.9 |3154 |
|[wide_resnet50_2.tv_in1k](https://huggingface.co/timm/wide_resnet50_2.tv_in1k)|224 |78.47|94.09|68.9 |11.4 |14.4 |1934 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|224 |78.46|94.27|25.6 |4.1 |11.1 |3454 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|288 |78.43|94.35|21.8 |6.5 |7.5 |3291 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|288 |78.42|94.04|10.5 |3.1 |13.3 |3226 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|320 |78.33|94.13|16.0 |5.2 |16.4 |2391 |
|[resnet152.tv_in1k](https://huggingface.co/timm/resnet152.tv_in1k)|224 |78.32|94.04|60.2 |11.6 |22.6 |1487 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|288 |78.28|94.1 |10.4 |3.1 |13.3 |3062 |
|[bat_resnext26ts.ch_in1k](https://huggingface.co/timm/bat_resnext26ts.ch_in1k)|256 |78.25|94.1 |10.7 |2.5 |12.5 |3393 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|224 |78.06|93.78|25.6 |4.1 |11.1 |3450 |
|[resnet50c.gluon_in1k](https://huggingface.co/timm/resnet50c.gluon_in1k)|224 |78.0 |93.99|25.6 |4.4 |11.9 |3286 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|288 |78.0 |93.91|10.3 |3.1 |13.3 |3297 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|224 |77.98|93.75|16.8 |2.7 |10.1 |3841 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|288 |77.92|93.77|21.8 |6.1 |6.2 |3609 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|160 |77.88|93.71|44.6 |4.0 |8.3 |3926 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|256 |77.87|93.84|16.0 |3.4 |10.5 |3772 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|256 |77.86|93.79|10.4 |2.4 |10.5 |4263 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|160 |77.82|93.81|35.7 |2.3 |6.2 |5238 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|256 |77.81|93.82|10.5 |2.4 |10.5 |4183 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|160 |77.79|93.6 |25.6 |2.2 |6.0 |5329 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|160 |77.73|93.32|25.0 |2.2 |7.4 |5576 |
|[resnext50_32x4d.tv_in1k](https://huggingface.co/timm/resnext50_32x4d.tv_in1k)|224 |77.61|93.7 |25.0 |4.3 |14.4 |2944 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|224 |77.59|93.61|16.8 |2.7 |10.2 |3807 |
|[resnet50.gluon_in1k](https://huggingface.co/timm/resnet50.gluon_in1k)|224 |77.58|93.72|25.6 |4.1 |11.1 |3455 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|256 |77.44|93.56|10.3 |2.4 |10.5 |4284 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|288 |77.41|93.63|16.0 |4.3 |13.5 |2907 |
|[resnet101.tv_in1k](https://huggingface.co/timm/resnet101.tv_in1k)|224 |77.38|93.54|44.6 |7.8 |16.2 |2125 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|160 |77.22|93.27|25.6 |2.2 |6.1 |5982 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|288 |77.17|93.47|10.3 |3.1 |13.3 |3392 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|288 |77.15|93.27|21.8 |6.1 |6.2 |3615 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|224 |77.1 |93.37|21.8 |3.9 |4.5 |5436 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|224 |77.02|93.07|28.1 |4.1 |11.1 |2952 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|256 |76.78|93.13|10.3 |2.4 |10.5 |4410 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|224 |76.7 |93.17|16.0 |2.6 |8.2 |4859 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|288 |76.5 |93.35|21.8 |6.1 |6.2 |3617 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|224 |76.42|92.87|21.8 |3.7 |3.7 |5984 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|288 |76.35|93.18|16.0 |3.9 |12.2 |3331 |
|[resnet50.tv_in1k](https://huggingface.co/timm/resnet50.tv_in1k)|224 |76.13|92.86|25.6 |4.1 |11.1 |3457 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|160 |75.96|92.5 |25.6 |2.1 |5.7 |6490 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|224 |75.52|92.44|21.8 |3.7 |3.7 |5991 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|224 |75.3 |92.58|16.0 |2.4 |7.4 |5583 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|224 |75.16|92.18|21.8 |3.7 |3.7 |5994 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|160 |75.1 |92.08|28.1 |2.1 |5.7 |5513 |
|[resnet34.gluon_in1k](https://huggingface.co/timm/resnet34.gluon_in1k)|224 |74.57|91.98|21.8 |3.7 |3.7 |5984 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|288 |73.81|91.83|11.7 |3.4 |5.4 |5196 |
|[resnet34.tv_in1k](https://huggingface.co/timm/resnet34.tv_in1k)|224 |73.32|91.42|21.8 |3.7 |3.7 |5979 |
|[resnet18.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet18.fb_swsl_ig1b_ft_in1k)|224 |73.28|91.73|11.7 |1.8 |2.5 |10213 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|288 |73.16|91.03|11.7 |3.0 |4.1 |6050 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|224 |72.98|91.11|21.8 |3.7 |3.7 |5967 |
|[resnet18.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet18.fb_ssl_yfcc100m_ft_in1k)|224 |72.6 |91.42|11.7 |1.8 |2.5 |10213 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|288 |72.37|90.59|11.7 |3.0 |4.1 |6051 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|224 |72.26|90.31|10.1 |1.7 |5.8 |7026 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|224 |72.26|90.68|11.7 |2.1 |3.3 |8707 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|224 |71.49|90.07|11.7 |1.8 |2.5 |10187 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|176 |71.31|89.69|10.1 |1.1 |3.6 |10970 |
|[resnet18.gluon_in1k](https://huggingface.co/timm/resnet18.gluon_in1k)|224 |70.84|89.76|11.7 |1.8 |2.5 |10210 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|224 |70.64|89.47|11.7 |1.8 |2.5 |10194 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|160 |70.56|89.52|21.8 |1.9 |1.9 |10737 |
|[resnet18.tv_in1k](https://huggingface.co/timm/resnet18.tv_in1k)|224 |69.76|89.07|11.7 |1.8 |2.5 |10205 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|224 |68.34|88.03|5.4 |1.1 |2.4 |13079 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|224 |68.25|88.17|11.7 |1.8 |2.5 |10167 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|176 |66.71|86.96|5.4 |0.7 |1.5 |20327 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|160 |65.66|86.26|11.7 |0.9 |1.3 |18229 |

## Citation

```bibtex
@article{He2015,
  author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
  title = {Deep Residual Learning for Image Recognition},
  journal = {arXiv preprint arXiv:1512.03385},
  year = {2015}
}
```

```bibtex
@misc{rw2019timm,
  author = {Ross Wightman},
  title = {PyTorch Image Models},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  doi = {10.5281/zenodo.4414861},
  howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
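The table above is essentially a speed/accuracy trade-off: each row pairs a top-1 score with a throughput figure (last column, samples/sec). A minimal pure-Python sketch of how one might filter such rows down to the Pareto-efficient checkpoints — the `pareto_front` helper and the handful of rows below (values copied from the table) are illustrative, not part of the benchmark tooling:

```python
# A few (model @ img_size, top-1 %, samples/sec) rows copied from the table above.
rows = [
    ("resnet152.a1h_in1k @ 288", 83.46, 904),
    ("ecaresnet101d.miil_in1k @ 288", 82.98, 1077),
    ("resnext101_64x4d.tv_in1k @ 224", 82.98, 989),
    ("resnet50.a1h_in1k @ 224", 80.67, 3452),
    ("resnet18.tv_in1k @ 224", 69.76, 10205),
]

def pareto_front(rows):
    """Keep models not (weakly) dominated: no other model is at least as
    accurate AND at least as fast (and different on at least one axis)."""
    front = []
    for name, top1, rate in rows:
        dominated = any(
            t >= top1 and r >= rate and (t, r) != (top1, rate)
            for _, t, r in rows
        )
        if not dominated:
            front.append(name)
    return front

# resnext101_64x4d.tv_in1k @ 224 drops out: ecaresnet101d.miil_in1k @ 288
# matches its top-1 (82.98) at higher throughput (1077 vs 989).
print(pareto_front(rows))
```

Applied to the full table, the same filter narrows several hundred rows to the short list of checkpoints worth considering at each accuracy level.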
-0.004337310791015625, 0.0265350341796875, 0.06512451171875, -0.06158447265625, -0.0217132568359375, 0.00264739990234375, -0.0374755859375, 0.00820159912109375, 0.019989013671875, -0.03533935546875, -0.00698089599609375, 0.0380859375, 0.0278778076171875, 0.054412841796875, 0.00672149658203125, 0.012298583984375, -0.0305938720703125, 0.040557861328125, -0.0003490447998046875, 0.0242767333984375, 0.01515960693359375, -0.0210723876953125, 0.057220458984375, 0.040863037109375, -0.0304412841796875, -0.076904296875, -0.0134124755859375, -0.09906005859375, -0.005641937255859375, 0.0489501953125, -0.00449371337890625, -0.030120849609375, 0.0301666259765625, -0.03558349609375, 0.039093017578125, -0.0168914794921875, 0.019195556640625, 0.0187835693359375, -0.027130126953125, -0.0257568359375, -0.04364013671875, 0.0467529296875, 0.027313232421875, -0.051513671875, -0.028778076171875, -0.00014507770538330078, 0.0225677490234375, 0.013092041015625, 0.054840087890625, -0.028106689453125, 0.011871337890625, -0.0069122314453125, 0.0178680419921875, -0.0007047653198242188, 0.01201629638671875, -0.023895263671875, -0.0093994140625, -0.0194091796875, -0.048614501953125 ] ]
Yale-LILY/brio-cnndm-uncased
2022-11-15T18:19:02.000Z
[ "transformers", "pytorch", "bart", "text2text-generation", "text-2-text-generation", "arxiv:2203.16804", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
text2text-generation
Yale-LILY
null
null
Yale-LILY/brio-cnndm-uncased
7
9,353
transformers
2022-03-30T23:35:54
--- tags: - text-2-text-generation - bart --- # Model Card for brio-cnndm-uncased # Model Details ## Model Description Abstractive summarization models are commonly trained using maximum likelihood estimation, which assumes a deterministic (one-point) target distribution in which an ideal model will assign all the probability mass to the reference summary. This assumption may lead to performance degradation during inference, where the model needs to compare several system-generated (candidate) summaries that have deviated from the reference summary. To address this problem, we propose a novel training paradigm which assumes a non-deterministic distribution so that different candidate summaries are assigned probability mass according to their quality. - **Developed by:** Yale LILY Lab - **Shared by [Optional]:** Yale LILY Lab - **Model type:** Text2Text Generation - **Language(s) (NLP):** More information needed - **License:** More information needed - **Parent Model:** BART - **Resources for more information:** - [Github Repo](https://github.com/Yale-LILY/BRIO) - [Associated Paper](https://arxiv.org/abs/2203.16804) # Uses ## Direct Use This model can be used for the task of Text2Text Generation ## Downstream Use [Optional] Further analysis also shows that our model can estimate probabilities of candidate summaries that are more correlated with their level of quality. ## Out-of-Scope Use The model should not be used to intentionally create hostile or alienating environments for people. # Bias, Risks, and Limitations Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. 
## Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. # Training Details ## Training Data The model creators note in the [associated paper](https://arxiv.org/abs/2203.16804): > CNNDM is a large-scale news dataset. Following Nallapati et al., we treat the news articles as the source documents and the associated highlights as the summaries. XSum is a highly abstractive dataset of articles from the British Broadcasting Corporation (BBC). NYT contains articles from the New York Times and the associated summaries ## Training Procedure ### Preprocessing The model creators note in the [associated paper](https://arxiv.org/abs/2203.16804): > We follow Kedzie et al. (2018) for data preprocessing and splitting, and use the associated archival abstracts as the summaries ### Speeds, Sizes, Times More information needed # Evaluation ## Testing Data, Factors & Metrics ### Testing Data More information needed ### Factors More information needed ### Metrics More information needed ## Results ### CNNDM | | ROUGE-1 | ROUGE-2 | ROUGE-L | |----------|---------|---------|---------| | BART | 44.16 | 21.28 | 40.90 | | Ours | 47.78 | 23.55 | 44.57 | # Model Examination The model creators note in the [associated paper](https://arxiv.org/abs/2203.16804): > We attribute BRIO-Ctr’s superior performance to its use of the same model architecture (BART) for both candidate generation and scoring, while SimCLS uses RoBERTa as the evaluation model. As a result, BRIO-Ctr maximizes the parameter sharing between the two stages, and preserves the power of the Seq2Seq model pre-trained on the same dataset. # Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). 
- **Hardware Type:** More information needed - **Hours used:** More information needed - **Cloud Provider:** More information needed - **Compute Region:** More information needed - **Carbon Emitted:** More information needed # Technical Specifications [optional] ## Model Architecture and Objective The model creators note in the [associated paper](https://arxiv.org/abs/2203.16804): > Formulate summarization as a sequence-to-sequence (Seq2Seq) problem ## Compute Infrastructure More information needed ### Hardware More information needed ### Software More information needed # Citation **BibTeX:** ```bibtex @misc{https://doi.org/10.48550/arxiv.2203.16804, doi = {10.48550/ARXIV.2203.16804}, url = {https://arxiv.org/abs/2203.16804}, author = {Liu, Yixin and Liu, Pengfei and Radev, Dragomir and Neubig, Graham}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences}, title = {BRIO: Bringing Order to Abstractive Summarization}, publisher = {arXiv}, year = {2022} } ``` # Glossary [optional] More information needed # More Information [optional] More information needed # Model Card Authors [optional] Yale LILY Lab in collaboration with Ezi Ozoani and the Hugging Face team # Model Card Contact More information needed # How to Get Started with the Model Use the code below to get started with the model. <details> <summary> Click to expand </summary> ```python from transformers import AutoTokenizer, AutoModelForSeq2SeqLM tokenizer = AutoTokenizer.from_pretrained("Yale-LILY/brio-cnndm-uncased") model = AutoModelForSeq2SeqLM.from_pretrained("Yale-LILY/brio-cnndm-uncased") ``` </details>
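The quality-ordered training objective summarized in the model description can be sketched as a pairwise margin ranking loss over candidate summaries. The sketch below is illustrative, not the official BRIO implementation: the function name, the margin value, and the use of plain floats for `scores` (standing in for length-normalized log-probabilities) are all assumptions.

```python
# Illustrative sketch of a BRIO-style contrastive ranking loss.
# Candidates are assumed sorted best-to-worst by quality (e.g. ROUGE);
# `scores` are the model's scores for them (e.g. length-normalized
# log-probabilities). Names and the margin value are assumptions.

def ranking_loss(scores, margin=0.001):
    """Hinge loss asking each better candidate to outscore each worse
    one by a margin that grows with the gap in quality rank."""
    loss = 0.0
    for i in range(len(scores)):
        for j in range(i + 1, len(scores)):
            loss += max(0.0, scores[j] - scores[i] + (j - i) * margin)
    return loss

# Scores already ordered by quality incur zero loss ...
well_ordered = ranking_loss([-0.1, -0.2, -0.3])   # → 0.0
# ... while reversed scores are penalized.
mis_ordered = ranking_loss([-0.3, -0.2, -0.1])
```

Under this objective, lowering the loss pushes probability mass toward higher-quality candidates rather than assuming all mass belongs to the single reference summary.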
5,786
[ [ -0.025238037109375, -0.04425048828125, 0.0249481201171875, 0.006999969482421875, -0.020263671875, -0.013916015625, -0.0161590576171875, -0.0338134765625, -0.00775146484375, 0.03656005859375, -0.030670166015625, -0.03814697265625, -0.050994873046875, 0.01025390625, -0.0350341796875, 0.1009521484375, 0.0145263671875, 0.0029239654541015625, -0.0249176025390625, 0.004779815673828125, -0.0209808349609375, -0.04034423828125, -0.048980712890625, -0.0183258056640625, 0.022308349609375, 0.0152130126953125, 0.0406494140625, 0.041534423828125, 0.04132080078125, 0.0236663818359375, -0.02276611328125, 0.007320404052734375, -0.04522705078125, -0.0214080810546875, -0.0025539398193359375, -0.02716064453125, -0.0367431640625, 0.00714874267578125, 0.052581787109375, 0.0450439453125, -0.0168304443359375, 0.024322509765625, 0.0006232261657714844, 0.0439453125, -0.040802001953125, 0.003093719482421875, -0.049346923828125, -0.0006580352783203125, 0.0003376007080078125, -0.0079345703125, -0.032257080078125, -0.01558685302734375, 0.005340576171875, -0.0279998779296875, 0.030670166015625, 0.0006451606750488281, 0.083984375, 0.01248931884765625, -0.04876708984375, -0.01450347900390625, -0.05047607421875, 0.062103271484375, -0.0716552734375, 0.030242919921875, 0.01346588134765625, 0.01190185546875, 0.006961822509765625, -0.05291748046875, -0.045318603515625, -0.0124969482421875, -0.0167388916015625, 0.0399169921875, -0.01256561279296875, 0.0029048919677734375, 0.0248565673828125, 0.033233642578125, -0.052764892578125, 0.005832672119140625, -0.053741455078125, -0.011260986328125, 0.06878662109375, 0.0222625732421875, 0.0064544677734375, -0.0295257568359375, -0.04638671875, -0.006984710693359375, -0.03656005859375, 0.01227569580078125, 0.04022216796875, 0.022735595703125, -0.043426513671875, 0.046783447265625, -0.0205841064453125, 0.0528564453125, -0.00550079345703125, -0.007480621337890625, 0.033660888671875, -0.0555419921875, -0.0194549560546875, 0.00027251243591308594, 0.07916259765625, 
0.0305633544921875, -0.0003306865692138672, 0.006031036376953125, -0.018707275390625, -0.01424407958984375, 0.01253509521484375, -0.0804443359375, -0.020843505859375, 0.0236358642578125, -0.037384033203125, -0.033660888671875, 0.01261138916015625, -0.068359375, 0.0023250579833984375, -0.0226593017578125, 0.0277252197265625, -0.02484130859375, -0.0191802978515625, 0.0018815994262695312, -0.01551055908203125, 0.00373077392578125, 0.017974853515625, -0.060150146484375, 0.02215576171875, 0.0374755859375, 0.0758056640625, -0.0192108154296875, -0.01055145263671875, -0.005828857421875, 0.005252838134765625, -0.0142822265625, 0.048553466796875, -0.027252197265625, -0.038604736328125, -0.023101806640625, -0.002422332763671875, -0.00984954833984375, -0.01525115966796875, 0.04351806640625, -0.03173828125, 0.051025390625, -0.006534576416015625, -0.04693603515625, -0.0201873779296875, 0.01177978515625, -0.0435791015625, 0.07989501953125, 0.01776123046875, -0.07550048828125, 0.00576019287109375, -0.0587158203125, -0.0267486572265625, -0.022064208984375, -0.001270294189453125, -0.06689453125, -0.01233673095703125, 0.0129852294921875, 0.034881591796875, -0.023590087890625, 0.030487060546875, -0.016021728515625, -0.021820068359375, 0.0020904541015625, -0.032012939453125, 0.0848388671875, 0.029998779296875, -0.022735595703125, 0.006381988525390625, -0.060638427734375, -0.0129852294921875, 0.012054443359375, -0.0340576171875, -0.00872802734375, -0.01242828369140625, 0.01152801513671875, 0.03021240234375, 0.0226593017578125, -0.039886474609375, 0.0009665489196777344, -0.034454345703125, 0.031829833984375, 0.0528564453125, 0.0081634521484375, 0.0260467529296875, -0.03289794921875, 0.031463623046875, 0.017791748046875, 0.0199432373046875, -0.014007568359375, -0.041595458984375, -0.046630859375, -0.02252197265625, 0.0292816162109375, 0.046478271484375, -0.036041259765625, 0.041015625, -0.032257080078125, -0.058685302734375, -0.01543426513671875, -0.00545501708984375, 0.043060302734375, 
0.04718017578125, 0.033599853515625, -0.01213836669921875, -0.049835205078125, -0.06646728515625, 0.00879669189453125, -0.015289306640625, -0.0010271072387695312, 0.0255279541015625, 0.05242919921875, -0.012054443359375, 0.06890869140625, -0.046783447265625, -0.0159759521484375, -0.03057861328125, 0.01050567626953125, 0.023651123046875, 0.049652099609375, 0.03912353515625, -0.068359375, -0.01335906982421875, -0.017974853515625, -0.046356201171875, -0.0020236968994140625, -0.0194549560546875, -0.006595611572265625, 0.0281829833984375, 0.027435302734375, -0.042938232421875, 0.028289794921875, 0.043731689453125, -0.0352783203125, 0.04852294921875, -0.0193328857421875, 0.01025390625, -0.112060546875, 0.0255889892578125, 0.006103515625, -0.007091522216796875, -0.04742431640625, -0.01525115966796875, -0.00963592529296875, -0.00649261474609375, -0.0313720703125, 0.048797607421875, -0.03375244140625, -0.0116729736328125, -0.00948333740234375, -0.0003046989440917969, 0.00983428955078125, 0.051361083984375, 0.0137481689453125, 0.057830810546875, 0.016937255859375, -0.0577392578125, 0.0011053085327148438, 0.0290679931640625, -0.0274810791015625, 0.0284881591796875, -0.0633544921875, -0.0068359375, -0.011016845703125, 0.024383544921875, -0.07293701171875, -0.0005574226379394531, 0.00815582275390625, -0.056793212890625, 0.0249481201171875, -0.0015211105346679688, -0.034332275390625, -0.029266357421875, -0.01448822021484375, 0.036590576171875, 0.038360595703125, -0.023712158203125, 0.036895751953125, 0.0296478271484375, -0.00617218017578125, -0.042633056640625, -0.05316162109375, -0.005756378173828125, -0.0135650634765625, -0.04364013671875, 0.046844482421875, -0.01497650146484375, -0.000058710575103759766, 0.0025691986083984375, 0.01497650146484375, -0.0082550048828125, 0.01306915283203125, 0.01464080810546875, 0.03521728515625, 0.000896453857421875, 0.0035419464111328125, -0.00449371337890625, -0.0104217529296875, 0.00145721435546875, -0.00426483154296875, 0.0266571044921875, 
-0.0007195472717285156, -0.004604339599609375, -0.04071044921875, 0.0284423828125, 0.03631591796875, -0.0276947021484375, 0.056793212890625, 0.0548095703125, -0.032470703125, 0.0042724609375, -0.033538818359375, -0.0267181396484375, -0.03143310546875, 0.053802490234375, -0.01418304443359375, -0.06158447265625, 0.039947509765625, 0.0230560302734375, 0.0240325927734375, 0.05499267578125, 0.05767822265625, -0.00087738037109375, 0.07794189453125, 0.047119140625, 0.0016431808471679688, 0.04693603515625, -0.040863037109375, 0.01548004150390625, -0.07177734375, -0.0261993408203125, -0.05145263671875, -0.0097198486328125, -0.05291748046875, -0.0228271484375, 0.0361328125, -0.0024623870849609375, -0.03173828125, 0.031646728515625, -0.047943115234375, 0.00778961181640625, 0.047943115234375, 0.005702972412109375, 0.0140533447265625, -0.00664520263671875, -0.006633758544921875, 0.01068878173828125, -0.062744140625, -0.0249786376953125, 0.0885009765625, 0.0295867919921875, 0.031951904296875, 0.016143798828125, 0.0513916015625, 0.0174102783203125, 0.0198974609375, -0.029510498046875, 0.035888671875, -0.011138916015625, -0.05511474609375, -0.0284881591796875, -0.0552978515625, -0.07745361328125, 0.0215301513671875, -0.0246124267578125, -0.058013916015625, 0.0244903564453125, 0.00506591796875, -0.034515380859375, 0.0264129638671875, -0.045379638671875, 0.07177734375, -0.01248931884765625, -0.02099609375, -0.015289306640625, -0.059326171875, 0.0516357421875, 0.0035419464111328125, 0.0223846435546875, -0.00445556640625, 0.00093841552734375, 0.0733642578125, -0.038360595703125, 0.06317138671875, -0.00884246826171875, 0.01239776611328125, 0.0261383056640625, -0.0196075439453125, 0.043975830078125, 0.001430511474609375, -0.0186920166015625, 0.034515380859375, 0.01043701171875, -0.030517578125, -0.03460693359375, 0.038818359375, -0.071533203125, -0.0268707275390625, -0.038604736328125, -0.048309326171875, 0.0046844482421875, 0.0196075439453125, 0.038360595703125, 0.03704833984375, 
-0.0007557868957519531, 0.0109100341796875, 0.04876708984375, -0.01910400390625, 0.032623291015625, 0.037017822265625, -0.0011968612670898438, -0.053070068359375, 0.06671142578125, 0.019439697265625, 0.0280609130859375, 0.035125732421875, 0.0168609619140625, -0.031951904296875, -0.051055908203125, -0.02276611328125, 0.0244903564453125, -0.040863037109375, -0.021026611328125, -0.07373046875, -0.0279083251953125, -0.0430908203125, 0.00027632713317871094, -0.02740478515625, -0.034332275390625, -0.03973388671875, -0.019744873046875, 0.0198974609375, 0.037841796875, -0.00373077392578125, 0.0201873779296875, -0.05035400390625, 0.0081634521484375, 0.0099029541015625, 0.0154266357421875, -0.00261688232421875, -0.06427001953125, -0.03729248046875, 0.009979248046875, -0.035736083984375, -0.0653076171875, 0.030670166015625, 0.00670623779296875, 0.045440673828125, 0.029205322265625, 0.0170440673828125, 0.040985107421875, -0.0212554931640625, 0.07513427734375, 0.01904296875, -0.0604248046875, 0.038055419921875, -0.02679443359375, 0.02093505859375, 0.043212890625, 0.0251617431640625, -0.03228759765625, -0.033233642578125, -0.06878662109375, -0.06890869140625, 0.05853271484375, 0.036041259765625, 0.01427459716796875, -0.005329132080078125, 0.019439697265625, -0.0177764892578125, 0.0258941650390625, -0.060211181640625, -0.0243988037109375, -0.0189208984375, -0.0205078125, -0.0063323974609375, -0.031585693359375, 0.0004911422729492188, -0.02099609375, 0.07928466796875, -0.0012149810791015625, 0.051910400390625, 0.0296630859375, 0.0015306472778320312, 0.01015472412109375, 0.00891876220703125, 0.0682373046875, 0.03399658203125, -0.027984619140625, 0.002590179443359375, 0.01739501953125, -0.03424072265625, -0.006069183349609375, 0.0172576904296875, -0.030517578125, 0.0084381103515625, 0.0294342041015625, 0.06719970703125, 0.018463134765625, -0.046630859375, 0.05255126953125, 0.00012201070785522461, -0.032562255859375, -0.0295867919921875, -0.0009617805480957031, 0.0007805824279785156, 
0.006267547607421875, 0.0164337158203125, 0.01049041748046875, 0.0248565673828125, -0.0262908935546875, 0.02105712890625, 0.036590576171875, -0.028228759765625, -0.0214080810546875, 0.0716552734375, 0.0177001953125, -0.0024967193603515625, 0.0458984375, -0.02154541015625, -0.0379638671875, 0.066650390625, 0.041229248046875, 0.06170654296875, -0.0117950439453125, 0.0101776123046875, 0.06365966796875, 0.0284423828125, -0.0029163360595703125, 0.01515960693359375, 0.00012177228927612305, -0.05181884765625, -0.0134124755859375, -0.040679931640625, -0.017181396484375, 0.004955291748046875, -0.04833984375, 0.038299560546875, -0.031341552734375, -0.010650634765625, 0.0022373199462890625, 0.017669677734375, -0.052276611328125, 0.027618408203125, 0.01145172119140625, 0.07159423828125, -0.07232666015625, 0.049835205078125, 0.040374755859375, -0.04302978515625, -0.060546875, -0.020233154296875, 0.0027370452880859375, -0.037811279296875, 0.03094482421875, 0.01568603515625, 0.0209197998046875, 0.00690460205078125, -0.026153564453125, -0.07537841796875, 0.1097412109375, 0.0164337158203125, -0.0452880859375, -0.007110595703125, -0.00241851806640625, 0.034759521484375, -0.033233642578125, 0.060211181640625, 0.026214599609375, 0.0301055908203125, 0.01544189453125, -0.059051513671875, 0.0111541748046875, -0.0233306884765625, 0.01190185546875, 0.00820159912109375, -0.0682373046875, 0.07275390625, -0.03961181640625, -0.019805908203125, 0.0010700225830078125, 0.058990478515625, 0.0206298828125, 0.0347900390625, 0.03424072265625, 0.051788330078125, 0.057098388671875, -0.0007643699645996094, 0.08575439453125, -0.04205322265625, 0.048004150390625, 0.09417724609375, 0.0030670166015625, 0.060882568359375, 0.01904296875, -0.0296478271484375, 0.0350341796875, 0.06097412109375, -0.0211639404296875, 0.037139892578125, 0.0036067962646484375, -0.005496978759765625, -0.00876617431640625, -0.0102386474609375, -0.02740478515625, 0.0242767333984375, 0.0093841552734375, -0.048919677734375, 
-0.01910400390625, -0.01058197021484375, 0.01314544677734375, -0.0037631988525390625, -0.0119781494140625, 0.058563232421875, -0.0008335113525390625, -0.045684814453125, 0.036590576171875, 0.0206451416015625, 0.04876708984375, -0.04437255859375, -0.002719879150390625, -0.0121307373046875, -0.006984710693359375, -0.02899169921875, -0.04620361328125, 0.031585693359375, 0.0010499954223632812, -0.0284881591796875, -0.0003631114959716797, 0.0256805419921875, -0.040313720703125, -0.06732177734375, 0.00768280029296875, 0.02362060546875, 0.020263671875, 0.01715087890625, -0.06658935546875, -0.009063720703125, 0.01325225830078125, -0.03521728515625, 0.0135040283203125, 0.01467132568359375, -0.004924774169921875, 0.04046630859375, 0.062347412109375, 0.00193023681640625, -0.0065460205078125, -0.0027484893798828125, 0.0596923828125, -0.04296875, -0.057098388671875, -0.0660400390625, 0.058441162109375, -0.0152435302734375, -0.034210205078125, 0.0523681640625, 0.0711669921875, 0.06719970703125, 0.01087188720703125, 0.06646728515625, -0.006473541259765625, 0.032623291015625, -0.039276123046875, 0.044281005859375, -0.04339599609375, 0.0221405029296875, -0.0246734619140625, -0.060791015625, -0.01045989990234375, 0.030670166015625, -0.0307769775390625, 0.02764892578125, 0.04168701171875, 0.07305908203125, 0.0048980712890625, -0.00016999244689941406, 0.0225677490234375, 0.020355224609375, 0.0293426513671875, 0.048553466796875, 0.027435302734375, -0.046478271484375, 0.04388427734375, -0.032379150390625, -0.014862060546875, -0.0238037109375, -0.0579833984375, -0.06976318359375, -0.054412841796875, -0.0443115234375, -0.03057861328125, 0.00005728006362915039, 0.07012939453125, 0.05999755859375, -0.0579833984375, -0.01224517822265625, -0.0082550048828125, 0.0220489501953125, -0.0250091552734375, -0.0210113525390625, 0.039215087890625, -0.0159149169921875, -0.05853271484375, -0.00511932373046875, 0.001308441162109375, 0.0161590576171875, -0.0191497802734375, -0.01300048828125, 
-0.0217742919921875, 0.0132904052734375, 0.03826904296875, 0.005435943603515625, -0.041015625, -0.007335662841796875, -0.0106964111328125, -0.0184478759765625, 0.0045013427734375, 0.042022705078125, -0.04559326171875, 0.025604248046875, 0.043487548828125, 0.036376953125, 0.04302978515625, 0.01197052001953125, 0.0310516357421875, -0.04498291015625, -0.00026798248291015625, 0.01023101806640625, 0.0310821533203125, 0.02447509765625, -0.05523681640625, 0.0419921875, 0.028564453125, -0.047119140625, -0.06634521484375, 0.0014696121215820312, -0.089111328125, -0.02069091796875, 0.10040283203125, -0.006069183349609375, -0.0194854736328125, 0.005611419677734375, -0.007801055908203125, 0.03167724609375, -0.0343017578125, 0.058563232421875, 0.03375244140625, -0.00206756591796875, -0.02490234375, -0.04595947265625, 0.025360107421875, 0.0149383544921875, -0.062042236328125, -0.005123138427734375, 0.033538818359375, 0.0313720703125, 0.004486083984375, 0.050445556640625, -0.01385498046875, 0.020477294921875, -0.00821685791015625, 0.0261077880859375, -0.0301666259765625, -0.013671875, -0.0283966064453125, 0.00739288330078125, -0.0247039794921875, -0.0126953125 ] ]
laion/CLIP-ViT-g-14-laion2B-s12B-b42K
2023-01-24T01:01:19.000Z
[ "open_clip", "pytorch", "clip", "arxiv:1910.04867", "license:mit", "has_space", "region:us" ]
null
laion
null
null
laion/CLIP-ViT-g-14-laion2B-s12B-b42K
26
9,350
open_clip
2022-09-14T22:53:40
--- license: mit widget: - src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png candidate_labels: playing music, playing sports example_title: Cat & Dog --- # Model Card for CLIP ViT-g/14 - LAION-2B # Table of Contents 1. [Model Details](#model-details) 2. [Uses](#uses) 3. [Training Details](#training-details) 4. [Evaluation](#evaluation) 5. [Acknowledgements](#acknowledgements) 6. [Citation](#citation) 7. [How To Get Started With the Model](#how-to-get-started-with-the-model) # Model Details ## Model Description A CLIP ViT-g/14 model trained with the LAION-2B English subset of LAION-5B (https://laion.ai/blog/laion-5b/) using OpenCLIP (https://github.com/mlfoundations/open_clip). Model training was done by Romain Beaumont on the [stability.ai](https://stability.ai/) cluster. # Uses As per the original [OpenAI CLIP model card](https://github.com/openai/CLIP/blob/d50d76daa670286dd6cacf3bcd80b5e4823fc8e1/model-card.md), this model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models. The OpenAI CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis. Additionally, the LAION-5B blog (https://laion.ai/blog/laion-5b/) and upcoming paper include additional discussion as it relates specifically to the training dataset. ## Direct Use Zero-shot image classification, image and text retrieval, among others. ## Downstream Use Image classification and other image task fine-tuning, linear probe image classification, image generation guiding and conditioning, among others. ## Out-of-Scope Use As per the OpenAI models, **any** deployed use case of the model - whether commercial or not - is currently out of scope. 
Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP’s performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case potentially harmful at present. Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of the performance of the model. This is because the use of artificial intelligence for tasks such as these is currently premature given the lack of testing norms and checks to ensure its fair use. Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English-language use cases. Beyond the above notice, the LAION-5B dataset used to train these models has additional considerations; see below. # Training Details ## Training Data This model was trained with the 2 billion sample English subset of LAION-5B (https://laion.ai/blog/laion-5b/). **IMPORTANT NOTE:** The motivation behind dataset creation is to democratize research and experimentation around large-scale multi-modal model training and handling of uncurated, large-scale datasets crawled from the publicly available internet. Our recommendation is therefore to use the dataset for research purposes. Be aware that this large-scale dataset is uncurated. Keep in mind that the uncurated nature of the dataset means that collected links may lead to strongly discomforting and disturbing content for a human viewer. Therefore, please use the demo links with caution and at your own risk. It is possible to extract a “safe” subset by filtering out samples based on the safety tags (using a custom-trained NSFW classifier that we built). 
While this strongly reduces the chance of encountering potentially harmful content when viewing, we cannot entirely exclude the possibility of harmful content still being present in safe mode, so the warning applies there as well. We think that providing the dataset openly to broad research and other interested communities will allow for transparent investigation of the benefits that come along with training large-scale models, as well as pitfalls and dangers that may stay unreported or unnoticed when working with closed large datasets that remain restricted to a small community. Although we provide the dataset openly, we do not recommend using it to create ready-to-go industrial products, as the basic research on the general properties and safety of such large-scale models, which we would like to encourage with this release, is still in progress. ## Training Procedure Please see [training notes](https://docs.google.com/document/d/1EFbMLRWSSV0LUf9Du1pWzWqgeiIRPwEWX2s1C6mAk5c) and [wandb logs](https://wandb.ai/rom1504/eval_openclip/reports/slow-g-14--VmlldzoyNTMwMjg5). # Evaluation Evaluation was done with code in the [LAION CLIP Benchmark suite](https://github.com/LAION-AI/CLIP_benchmark). ## Testing Data, Factors & Metrics ### Testing Data The testing is performed with VTAB+ (a combination of VTAB (https://arxiv.org/abs/1910.04867) with additional robustness datasets) for classification, and COCO and Flickr for retrieval. **TODO** - more detail ## Results The model achieves a 76.6% zero-shot top-1 accuracy on ImageNet-1k. An initial round of benchmarks has been performed on a wider range of datasets, currently viewable at https://github.com/LAION-AI/CLIP_benchmark/blob/main/benchmark/results.ipynb **TODO** - create table for just this model's metrics. # Acknowledgements Acknowledging [stability.ai](https://stability.ai/) for the compute used to train this model. 
# Citation

**BibTeX:**

In addition to the forthcoming LAION-5B (https://laion.ai/blog/laion-5b/) paper, please cite:

OpenAI CLIP paper

```
@inproceedings{Radford2021LearningTV,
  title={Learning Transferable Visual Models From Natural Language Supervision},
  author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
  booktitle={ICML},
  year={2021}
}
```

OpenCLIP software

```
@software{ilharco_gabriel_2021_5143773,
  author       = {Ilharco, Gabriel and
                  Wortsman, Mitchell and
                  Wightman, Ross and
                  Gordon, Cade and
                  Carlini, Nicholas and
                  Taori, Rohan and
                  Dave, Achal and
                  Shankar, Vaishaal and
                  Namkoong, Hongseok and
                  Miller, John and
                  Hajishirzi, Hannaneh and
                  Farhadi, Ali and
                  Schmidt, Ludwig},
  title        = {OpenCLIP},
  month        = jul,
  year         = 2021,
  note         = {If you use this software, please cite it as below.},
  publisher    = {Zenodo},
  version      = {0.1},
  doi          = {10.5281/zenodo.5143773},
  url          = {https://doi.org/10.5281/zenodo.5143773}
}
```

# How to Get Started with the Model

Use the code below to get started with the model.

** TODO ** - Hugging Face transformers, OpenCLIP, and timm getting started snippets
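Pending the official snippets, a minimal OpenCLIP sketch of zero-shot classification follows. The architecture name `ViT-g-14` and pretrained tag `laion2b_s12b_b42k` are assumptions about this checkpoint's entry in the OpenCLIP registry; verify the exact identifiers against the model repository before use.

```python
import torch
from PIL import Image
import open_clip

# NOTE: "ViT-g-14" / "laion2b_s12b_b42k" are assumed identifiers for this
# checkpoint; check the model repo for the exact architecture/pretrained tags.
model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-g-14", pretrained="laion2b_s12b_b42k"
)
tokenizer = open_clip.get_tokenizer("ViT-g-14")

image = preprocess(Image.open("example.jpg")).unsqueeze(0)
text = tokenizer(["a diagram", "a dog", "a cat"])

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    # Cosine similarities scaled by 100 and softmaxed into label probabilities
    text_probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print("Label probs:", text_probs)
```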
sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking
2022-06-15T19:34:08.000Z
[ "sentence-transformers", "pytorch", "tf", "distilbert", "feature-extraction", "sentence-similarity", "transformers", "arxiv:1908.10084", "license:apache-2.0", "endpoints_compatible", "region:us" ]
sentence-similarity
sentence-transformers
null
null
sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking
1
9,348
sentence-transformers
2022-03-02T23:29:05
---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---

# sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.

## Usage (Sentence-Transformers)

Using this model is easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking')
embeddings = model.encode(sentences)
print(embeddings)
```

## Usage (HuggingFace Transformers)

Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch

# Mean Pooling - take the attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking')
model = AutoModel.from_pretrained('sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```

## Evaluation Results

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/distilbert-multilingual-nli-stsb-quora-ranking)

## Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: DistilBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```

## Citing & Authors

This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "http://arxiv.org/abs/1908.10084",
}
```
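For the duplicate-question ranking task this model is named after, candidate questions are typically ranked by cosine similarity of their embeddings to the query embedding. A minimal sketch, using toy vectors in place of real `model.encode(...)` output:

```python
import numpy as np

def rank_candidates(query_emb, corpus_emb, top_k=3):
    """Return (index, cosine similarity) pairs for the top_k corpus entries
    most similar to the query embedding."""
    q = query_emb / np.linalg.norm(query_emb)
    c = corpus_emb / np.linalg.norm(corpus_emb, axis=1, keepdims=True)
    scores = c @ q
    order = np.argsort(-scores)[:top_k]
    return [(int(i), float(scores[i])) for i in order]

# Toy embeddings standing in for model.encode(questions) output
corpus = np.array([[0.9, 0.1], [0.1, 0.9], [0.8, 0.2]])
query = np.array([1.0, 0.0])
print(rank_candidates(query, corpus, top_k=2))  # questions 0 and 2 rank highest
```

With real embeddings the flow is the same: encode the candidate questions once, encode each incoming query, and rank by cosine similarity.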
timm/inception_resnet_v2.tf_in1k
2023-05-10T01:09:00.000Z
[ "timm", "pytorch", "safetensors", "image-classification", "dataset:imagenet-1k", "arxiv:1602.07261", "license:apache-2.0", "region:us" ]
image-classification
timm
null
null
timm/inception_resnet_v2.tf_in1k
0
9,344
timm
2023-04-25T21:33:47
---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for inception_resnet_v2.tf_in1k

An Inception-ResNet-v2 image classification model. Trained on ImageNet-1k by the paper authors. Ported from TensorFlow via Cadene's pretrained-models.pytorch.

## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
  - Params (M): 55.8
  - GMACs: 13.2
  - Activations (M): 25.1
  - Image size: 299 x 299
- **Papers:**
  - Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning: https://arxiv.org/abs/1602.07261
- **Original:**
  - https://github.com/tensorflow/models
  - https://github.com/Cadene/pretrained-models.pytorch
- **Dataset:** ImageNet-1k

## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import torch
import timm

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model('inception_resnet_v2.tf_in1k', pretrained=True)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # unsqueeze single image into batch of 1

top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```

### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model(
    'inception_resnet_v2.tf_in1k',
    pretrained=True,
    features_only=True,
)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # unsqueeze single image into batch of 1

for o in output:
    # print shape of each feature map in output
    # e.g.:
    #  torch.Size([1, 64, 147, 147])
    #  torch.Size([1, 192, 71, 71])
    #  torch.Size([1, 320, 35, 35])
    #  torch.Size([1, 1088, 17, 17])
    #  torch.Size([1, 1536, 8, 8])
    print(o.shape)
```

### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model(
    'inception_resnet_v2.tf_in1k',
    pretrained=True,
    num_classes=0,  # remove classifier nn.Linear
)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # output is (batch_size, num_features) shaped tensor

# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1536, 8, 8) shaped tensor

output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```

## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).

## Citation
```bibtex
@article{Szegedy2016Inceptionv4IA,
  title={Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning},
  author={Christian Szegedy and Sergey Ioffe and Vincent Vanhoucke and Alexander A. Alemi},
  journal={ArXiv},
  year={2016},
  volume={abs/1602.07261}
}
```
3,812
[ [ -0.033599853515625, -0.03460693359375, 0.0025634765625, 0.00397491455078125, -0.0276031494140625, -0.01971435546875, -0.014190673828125, -0.02978515625, 0.0145721435546875, 0.028778076171875, -0.037353515625, -0.053253173828125, -0.04998779296875, -0.0074462890625, -0.0199737548828125, 0.0660400390625, -0.002185821533203125, -0.0016222000122070312, -0.023345947265625, -0.0322265625, -0.01355743408203125, -0.014007568359375, -0.07763671875, -0.03277587890625, 0.0235443115234375, 0.0203399658203125, 0.024444580078125, 0.0406494140625, 0.04620361328125, 0.03338623046875, -0.006439208984375, -0.0023860931396484375, -0.0272369384765625, -0.014892578125, 0.021697998046875, -0.046478271484375, -0.0237579345703125, 0.0197296142578125, 0.0631103515625, 0.02606201171875, 0.0026683807373046875, 0.0369873046875, 0.016998291015625, 0.036224365234375, -0.0117645263671875, 0.006626129150390625, -0.0277099609375, 0.0197906494140625, -0.0089111328125, 0.007480621337890625, -0.017120361328125, -0.032501220703125, 0.0225982666015625, -0.03662109375, 0.039215087890625, -0.0011072158813476562, 0.10186767578125, 0.022247314453125, 0.01251983642578125, 0.0037937164306640625, -0.0218658447265625, 0.057586669921875, -0.057403564453125, 0.0244598388671875, 0.0185394287109375, 0.0114593505859375, -0.0028839111328125, -0.09326171875, -0.03985595703125, -0.01007080078125, -0.00939178466796875, -0.006771087646484375, -0.0114593505859375, 0.0022735595703125, 0.0186614990234375, 0.0311737060546875, -0.0247650146484375, 0.0162506103515625, -0.053009033203125, -0.01522064208984375, 0.033538818359375, -0.006145477294921875, 0.01800537109375, -0.0074462890625, -0.0400390625, -0.031951904296875, -0.0290679931640625, 0.0198211669921875, 0.0204010009765625, 0.01287078857421875, -0.047210693359375, 0.028594970703125, 0.01361846923828125, 0.044342041015625, 0.00405120849609375, -0.02606201171875, 0.045379638671875, 0.0017337799072265625, -0.0321044921875, -0.00582122802734375, 0.07525634765625, 
0.0240478515625, 0.0191650390625, 0.004604339599609375, -0.0008149147033691406, -0.034881591796875, 0.0025424957275390625, -0.08538818359375, -0.0180511474609375, 0.0290679931640625, -0.039642333984375, -0.035369873046875, 0.0164642333984375, -0.043121337890625, -0.01015472412109375, -0.0024509429931640625, 0.040618896484375, -0.0439453125, -0.03546142578125, -0.0007443428039550781, -0.01396942138671875, 0.033905029296875, 0.005306243896484375, -0.033782958984375, 0.00131988525390625, 0.025238037109375, 0.08917236328125, 0.00878143310546875, -0.035186767578125, -0.0008544921875, -0.030303955078125, -0.0211639404296875, 0.037078857421875, 0.00004982948303222656, -0.00788116455078125, -0.0237884521484375, 0.022613525390625, -0.00730133056640625, -0.04681396484375, 0.0191192626953125, -0.019500732421875, 0.023284912109375, 0.0033130645751953125, -0.022308349609375, -0.046173095703125, 0.0240631103515625, -0.027984619140625, 0.0875244140625, 0.035003662109375, -0.061920166015625, 0.024658203125, -0.038299560546875, -0.006496429443359375, -0.0198211669921875, -0.01025390625, -0.079345703125, -0.007038116455078125, 0.005275726318359375, 0.041229248046875, -0.0262603759765625, 0.0009975433349609375, -0.040802001953125, -0.015380859375, 0.019927978515625, -0.005313873291015625, 0.0772705078125, 0.0140380859375, -0.03021240234375, 0.0134429931640625, -0.045623779296875, 0.01424407958984375, 0.035858154296875, -0.0156707763671875, 0.004192352294921875, -0.048095703125, 0.0038166046142578125, 0.0218963623046875, 0.0115814208984375, -0.045135498046875, 0.0117034912109375, -0.0227813720703125, 0.045623779296875, 0.043609619140625, -0.00005602836608886719, 0.018707275390625, -0.02435302734375, 0.0202484130859375, 0.0321044921875, 0.0145263671875, -0.00039386749267578125, -0.040679931640625, -0.06512451171875, -0.0292816162109375, 0.0265655517578125, 0.025848388671875, -0.033355712890625, 0.03277587890625, -0.0199737548828125, -0.05902099609375, -0.04327392578125, 
0.0015277862548828125, 0.03564453125, 0.055816650390625, 0.0253448486328125, -0.034881591796875, -0.042510986328125, -0.07293701171875, 0.01050567626953125, -0.0035953521728515625, 0.004119873046875, 0.0221405029296875, 0.045257568359375, -0.0030727386474609375, 0.0623779296875, -0.029510498046875, -0.0224609375, -0.01528167724609375, 0.006877899169921875, 0.030242919921875, 0.06903076171875, 0.060028076171875, -0.041473388671875, -0.021942138671875, -0.0167388916015625, -0.0792236328125, 0.0202178955078125, -0.004901885986328125, -0.014434814453125, 0.0186004638671875, 0.0176849365234375, -0.03564453125, 0.045196533203125, 0.00640106201171875, -0.020355224609375, 0.0304412841796875, -0.016082763671875, 0.0191192626953125, -0.09503173828125, 0.00919342041015625, 0.021728515625, -0.00406646728515625, -0.0263671875, 0.003543853759765625, -0.0062103271484375, -0.00885772705078125, -0.04681396484375, 0.03997802734375, -0.039459228515625, -0.0215606689453125, -0.004695892333984375, -0.018890380859375, 0.00618743896484375, 0.051422119140625, -0.000032007694244384766, 0.0367431640625, 0.06689453125, -0.038421630859375, 0.035797119140625, 0.031280517578125, -0.0234375, 0.030242919921875, -0.054473876953125, 0.0207366943359375, -0.00862884521484375, 0.009918212890625, -0.07904052734375, -0.01326751708984375, 0.026824951171875, -0.05224609375, 0.046875, -0.039520263671875, -0.0204315185546875, -0.0396728515625, -0.03448486328125, 0.03564453125, 0.05706787109375, -0.050262451171875, 0.0322265625, 0.0107421875, 0.0243682861328125, -0.04888916015625, -0.07415771484375, -0.00905609130859375, -0.033294677734375, -0.05450439453125, 0.031585693359375, 0.01715087890625, 0.01215362548828125, 0.01849365234375, -0.0011262893676757812, -0.00826263427734375, -0.00559234619140625, 0.040557861328125, 0.0174560546875, -0.034820556640625, -0.008514404296875, -0.0230865478515625, -0.00423431396484375, -0.0004703998565673828, -0.035400390625, 0.0467529296875, -0.02117919921875, 
-0.01092529296875, -0.06964111328125, -0.01532745361328125, 0.0404052734375, -0.008697509765625, 0.07171630859375, 0.08404541015625, -0.041717529296875, 0.003414154052734375, -0.038848876953125, -0.025909423828125, -0.034271240234375, 0.039031982421875, -0.0308685302734375, -0.037322998046875, 0.06390380859375, 0.0005240440368652344, 0.0116424560546875, 0.0548095703125, 0.0145111083984375, -0.005771636962890625, 0.03076171875, 0.044219970703125, 0.01543426513671875, 0.05279541015625, -0.08831787109375, -0.0102691650390625, -0.07183837890625, -0.04241943359375, -0.02813720703125, -0.04229736328125, -0.037078857421875, -0.01373291015625, 0.03192138671875, 0.0284881591796875, -0.0352783203125, 0.038818359375, -0.062225341796875, 0.0048370361328125, 0.055267333984375, 0.041656494140625, -0.02642822265625, 0.0300140380859375, -0.01526641845703125, 0.0024280548095703125, -0.056884765625, -0.0168609619140625, 0.08258056640625, 0.033660888671875, 0.037506103515625, -0.01277923583984375, 0.051605224609375, -0.01483154296875, 0.022674560546875, -0.040069580078125, 0.038726806640625, -0.00262451171875, -0.0272979736328125, -0.01351165771484375, -0.0236968994140625, -0.077392578125, -0.0004589557647705078, -0.022613525390625, -0.0537109375, 0.0190887451171875, 0.007678985595703125, -0.0305023193359375, 0.06085205078125, -0.058929443359375, 0.07415771484375, -0.003582000732421875, -0.0318603515625, -0.00426483154296875, -0.04119873046875, 0.02410888671875, 0.01360321044921875, -0.0210113525390625, -0.0036945343017578125, 0.0135040283203125, 0.08544921875, -0.036895751953125, 0.06719970703125, -0.0316162109375, 0.02838134765625, 0.052154541015625, -0.01081085205078125, 0.0106048583984375, 0.0019207000732421875, -0.0091705322265625, 0.039642333984375, 0.00010275840759277344, -0.032135009765625, -0.03656005859375, 0.04669189453125, -0.0814208984375, -0.0340576171875, -0.02508544921875, -0.020904541015625, 0.0193634033203125, 0.014068603515625, 0.050079345703125, 0.06280517578125, 
0.0161285400390625, 0.027984619140625, 0.037353515625, -0.03338623046875, 0.036041259765625, -0.0076904296875, -0.0123443603515625, -0.045379638671875, 0.0538330078125, 0.021697998046875, 0.0135498046875, 0.0009794235229492188, 0.0101776123046875, -0.034515380859375, -0.05255126953125, -0.0245208740234375, 0.0286102294921875, -0.05096435546875, -0.0386962890625, -0.043060302734375, -0.04766845703125, -0.043182373046875, -0.002208709716796875, -0.03668212890625, -0.02337646484375, -0.03533935546875, 0.007236480712890625, 0.053131103515625, 0.0401611328125, -0.014373779296875, 0.04473876953125, -0.04339599609375, 0.00260162353515625, 0.006221771240234375, 0.0423583984375, 0.00046753883361816406, -0.07672119140625, -0.0195159912109375, -0.006679534912109375, -0.031982421875, -0.047149658203125, 0.0230712890625, 0.01953125, 0.0341796875, 0.036163330078125, -0.0213470458984375, 0.05450439453125, 0.0015840530395507812, 0.04559326171875, 0.021942138671875, -0.033660888671875, 0.0479736328125, 0.00644683837890625, 0.00682830810546875, 0.01181793212890625, 0.0224151611328125, -0.0207366943359375, 0.0014619827270507812, -0.080078125, -0.05316162109375, 0.0762939453125, 0.0114593505859375, -0.00214385986328125, 0.0209197998046875, 0.066162109375, -0.0010223388671875, 0.0014514923095703125, -0.04949951171875, -0.0318603515625, -0.02606201171875, -0.0202484130859375, 0.0021152496337890625, -0.016387939453125, -0.002590179443359375, -0.043609619140625, 0.05029296875, 0.01213836669921875, 0.055908203125, 0.035797119140625, -0.0016155242919921875, -0.00212860107421875, -0.038116455078125, 0.036865234375, 0.029205322265625, -0.0293121337890625, 0.0028781890869140625, 0.017425537109375, -0.045867919921875, 0.01128387451171875, 0.00885009765625, -0.00556182861328125, -0.006687164306640625, 0.046600341796875, 0.07177734375, -0.004421234130859375, 0.0114593505859375, 0.02239990234375, -0.002002716064453125, -0.022552490234375, -0.0364990234375, 0.01038360595703125, 
-0.00441741943359375, 0.02813720703125, 0.02227783203125, 0.030120849609375, -0.01837158203125, -0.0107421875, 0.0239105224609375, 0.0303955078125, -0.0225982666015625, -0.0305633544921875, 0.049468994140625, -0.01093292236328125, -0.023406982421875, 0.08001708984375, -0.005321502685546875, -0.0275726318359375, 0.08447265625, 0.037689208984375, 0.0775146484375, -0.003429412841796875, -0.002025604248046875, 0.07635498046875, 0.01947021484375, -0.00183868408203125, 0.0009560585021972656, 0.0120849609375, -0.06536865234375, 0.004749298095703125, -0.050048828125, 0.01522064208984375, 0.034698486328125, -0.0382080078125, 0.0281219482421875, -0.060028076171875, -0.033905029296875, 0.004985809326171875, 0.01461029052734375, -0.06903076171875, 0.012603759765625, -0.00042438507080078125, 0.0648193359375, -0.052886962890625, 0.053253173828125, 0.0667724609375, -0.056304931640625, -0.072509765625, -0.01678466796875, -0.00624847412109375, -0.0770263671875, 0.04248046875, 0.03558349609375, 0.005214691162109375, 0.0260772705078125, -0.0555419921875, -0.0615234375, 0.1097412109375, 0.043121337890625, 0.0012292861938476562, 0.01386260986328125, 0.0004398822784423828, 0.01561737060546875, -0.036895751953125, 0.03857421875, 0.02435302734375, 0.022003173828125, 0.0266876220703125, -0.038299560546875, 0.00867462158203125, -0.00926971435546875, 0.01055145263671875, 0.00708770751953125, -0.056060791015625, 0.0615234375, -0.0416259765625, -0.0228271484375, -0.0009360313415527344, 0.064453125, 0.0186614990234375, 0.008880615234375, 0.0421142578125, 0.07415771484375, 0.03326416015625, -0.02178955078125, 0.061431884765625, -0.01064300537109375, 0.046142578125, 0.043731689453125, 0.036041259765625, 0.03753662109375, 0.0265655517578125, -0.0239105224609375, 0.03466796875, 0.09075927734375, -0.0248870849609375, 0.0197601318359375, 0.01491546630859375, 0.0002713203430175781, -0.0018396377563476562, 0.0032482147216796875, -0.036285400390625, 0.035919189453125, 0.017364501953125, 
-0.036163330078125, -0.02227783203125, 0.006450653076171875, 0.0005249977111816406, -0.027099609375, -0.012237548828125, 0.036865234375, 0.00684356689453125, -0.0305023193359375, 0.064453125, 0.004199981689453125, 0.05865478515625, -0.0274658203125, -0.003017425537109375, -0.02716064453125, 0.0167999267578125, -0.0300140380859375, -0.065673828125, 0.0250396728515625, -0.020721435546875, 0.001590728759765625, 0.0123748779296875, 0.03997802734375, -0.030303955078125, -0.036834716796875, 0.0114898681640625, 0.0156402587890625, 0.048126220703125, 0.008758544921875, -0.0892333984375, 0.007701873779296875, 0.00595855712890625, -0.04132080078125, 0.01268768310546875, 0.025054931640625, 0.00899505615234375, 0.05487060546875, 0.040985107421875, -0.009033203125, 0.00862884521484375, -0.0161285400390625, 0.050567626953125, -0.037078857421875, -0.021209716796875, -0.06793212890625, 0.046417236328125, -0.007678985595703125, -0.050811767578125, 0.02679443359375, 0.0482177734375, 0.0692138671875, -0.0034332275390625, 0.0288848876953125, -0.0246734619140625, -0.004169464111328125, -0.0307159423828125, 0.05224609375, -0.047576904296875, -0.0039825439453125, 0.0028018951416015625, -0.062744140625, -0.01776123046875, 0.04510498046875, -0.0168609619140625, 0.033172607421875, 0.038848876953125, 0.073486328125, -0.0295562744140625, -0.029510498046875, 0.005458831787109375, 0.0165557861328125, 0.01322174072265625, 0.0310821533203125, 0.025665283203125, -0.060394287109375, 0.0263214111328125, -0.04931640625, -0.013702392578125, -0.0028247833251953125, -0.0478515625, -0.063232421875, -0.07269287109375, -0.04931640625, -0.057586669921875, -0.0214385986328125, 0.07037353515625, 0.0772705078125, -0.052337646484375, -0.006946563720703125, -0.0024585723876953125, 0.01210784912109375, -0.01464080810546875, -0.0163726806640625, 0.05596923828125, -0.02337646484375, -0.054229736328125, -0.0186004638671875, -0.002044677734375, 0.034210205078125, -0.00677490234375, -0.00858306884765625, 
-0.016204833984375, -0.0299530029296875, 0.007083892822265625, 0.0262298583984375, -0.050750732421875, -0.02435302734375, -0.0265960693359375, -0.0050048828125, 0.0311737060546875, 0.0282745361328125, -0.046905517578125, 0.0178985595703125, 0.03082275390625, 0.03656005859375, 0.059051513671875, -0.016510009765625, 0.004947662353515625, -0.061248779296875, 0.0469970703125, -0.01045989990234375, 0.031341552734375, 0.03302001953125, -0.0301055908203125, 0.046630859375, 0.041168212890625, -0.0247955322265625, -0.064208984375, -0.004535675048828125, -0.06689453125, -0.004772186279296875, 0.056549072265625, -0.0300750732421875, -0.042510986328125, 0.03564453125, 0.0052947998046875, 0.058502197265625, -0.00015687942504882812, 0.036956787109375, 0.0230712890625, -0.007633209228515625, -0.0482177734375, -0.0311126708984375, 0.0401611328125, 0.008514404296875, -0.0390625, -0.04119873046875, -0.0066680908203125, 0.058990478515625, 0.017181396484375, 0.036468505859375, -0.0181427001953125, 0.00027251243591308594, 0.0106964111328125, 0.04779052734375, -0.044158935546875, -0.0111541748046875, -0.030487060546875, 0.0020580291748046875, 0.0006666183471679688, -0.0543212890625 ] ]
EleutherAI/polyglot-ko-5.8b
2023-06-07T05:03:38.000Z
[ "transformers", "pytorch", "safetensors", "gpt_neox", "text-generation", "causal-lm", "ko", "arxiv:2104.09864", "arxiv:2204.04541", "arxiv:2306.02254", "license:apache-2.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
EleutherAI
null
null
EleutherAI/polyglot-ko-5.8b
50
9,343
transformers
2022-09-22T06:57:04
---
language:
- ko
tags:
- pytorch
- causal-lm
license: apache-2.0
---

# Polyglot-Ko-5.8B

## Model Description
Polyglot-Ko is a series of large-scale Korean autoregressive language models made by the EleutherAI polyglot team.

| Hyperparameter       | Value                                                                                                                                  |
|----------------------|----------------------------------------------------------------------------------------------------------------------------------------|
| \\(n_{parameters}\\) | 5,885,059,072                                                                                                                          |
| \\(n_{layers}\\)     | 28                                                                                                                                     |
| \\(d_{model}\\)      | 4096                                                                                                                                   |
| \\(d_{ff}\\)         | 16,384                                                                                                                                 |
| \\(n_{heads}\\)      | 16                                                                                                                                     |
| \\(d_{head}\\)       | 256                                                                                                                                    |
| \\(n_{ctx}\\)        | 2,048                                                                                                                                  |
| \\(n_{vocab}\\)      | 30,003 / 30,080                                                                                                                        |
| Positional Encoding  | [Rotary Position Embedding (RoPE)](https://arxiv.org/abs/2104.09864)                                                                   |
| RoPE Dimensions      | [64](https://github.com/kingoflolz/mesh-transformer-jax/blob/f2aa66e0925de6593dcbb70e72399b97b4130482/mesh_transformer/layers.py#L223) |

The model consists of 28 transformer layers with a model dimension of 4096, and a feedforward dimension of 16384. The model dimension is split into 16 heads, each with a dimension of 256. Rotary Position Embedding (RoPE) is applied to 64 dimensions of each head. The model is trained with a tokenization vocabulary of 30003.

## Training data
Polyglot-Ko-5.8B was trained on 863 GB of Korean language data (1.2TB before processing), a large-scale dataset curated by [TUNiB](https://tunib.ai/). The data collection process has abided by South Korean laws. This dataset was collected for the purpose of training Polyglot-Ko models, so it will not be released for public use.

| Source                              | Size (GB) | Link                             |
|-------------------------------------|-----------|----------------------------------|
| Korean blog posts                   | 682.3     | -                                |
| Korean news dataset                 | 87.0      | -                                |
| Modu corpus                         | 26.4      | corpus.korean.go.kr              |
| Korean patent dataset               | 19.0      | -                                |
| Korean Q & A dataset                | 18.1      | -                                |
| KcBert dataset                      | 12.7      | github.com/Beomi/KcBERT          |
| Korean fiction dataset              | 6.1       | -                                |
| Korean online comments              | 4.2       | -                                |
| Korean wikipedia                    | 1.4       | ko.wikipedia.org                 |
| Clova call                          | < 1.0     | github.com/clovaai/ClovaCall     |
| Naver sentiment movie corpus        | < 1.0     | github.com/e9t/nsmc              |
| Korean hate speech dataset          | < 1.0     | -                                |
| Open subtitles                      | < 1.0     | opus.nlpl.eu/OpenSubtitles.php   |
| AIHub various tasks datasets        | < 1.0     | aihub.or.kr                      |
| Standard Korean language dictionary | < 1.0     | stdict.korean.go.kr/main/main.do |

Furthermore, in order to avoid the model memorizing and generating personally identifiable information (PII) in the training data, we masked out the following sensitive information in the pre-processing stage:

* `<|acc|>` : bank account number
* `<|rrn|>` : resident registration number
* `<|tell|>` : phone number

## Training procedure
Polyglot-Ko-5.8B was trained for 172 billion tokens over 320,000 steps on 256 A100 GPUs with the [GPT-NeoX framework](https://github.com/EleutherAI/gpt-neox). It was trained as an autoregressive language model, using cross-entropy loss to maximize the likelihood of predicting the next token.
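The PII-masking step described in the training-data section can be sketched with simple substitution patterns. The exact rules used by the Polyglot team are not published, so the regexes below are illustrative assumptions only; real Korean PII detection would need considerably more care.

```python
import re

# Illustrative patterns only -- the team's actual masking rules are not public.
PII_PATTERNS = [
    (re.compile(r"\b\d{6}-\d{7}\b"), "<|rrn|>"),            # resident registration number
    (re.compile(r"\b01\d-?\d{3,4}-?\d{4}\b"), "<|tell|>"),  # mobile phone number
    (re.compile(r"\b\d{3}-\d{2,6}-\d{2,6}\b"), "<|acc|>"),  # bank account number
]

def mask_pii(text: str) -> str:
    # Order matters: the most specific patterns run first so that, e.g., a
    # registration number is not partially consumed by the account pattern.
    for pattern, token in PII_PATTERNS:
        text = pattern.sub(token, text)
    return text

print(mask_pii("010-1234-5678"))   # -> <|tell|>
print(mask_pii("123456-1234567"))  # -> <|rrn|>
```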
## How to use

This model can be easily loaded using the `AutoModelForCausalLM` class:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/polyglot-ko-5.8b")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/polyglot-ko-5.8b")
```

## Evaluation results

We evaluate Polyglot-Ko-5.8B on the [KOBEST dataset](https://arxiv.org/abs/2204.04541), a benchmark with 5 downstream tasks, against comparable models such as skt/ko-gpt-trinity-1.2B-v0.5, kakaobrain/kogpt and facebook/xglm-7.5B, using the prompts provided in the paper.

The following tables show the results when the number of few-shot examples differs. You can reproduce these results using the [polyglot branch of lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness/tree/polyglot) and the following scripts. For a fair comparison, all models were run under the same conditions and using the same prompts. In the tables, `n` refers to the number of few-shot examples. In the case of the WiC dataset, all models show random performance.
```console python main.py \ --model gpt2 \ --model_args pretrained='EleutherAI/polyglot-ko-3.8b' \ --tasks kobest_copa,kobest_hellaswag \ --num_fewshot $YOUR_NUM_FEWSHOT \ --batch_size $YOUR_BATCH_SIZE \ --device $YOUR_DEVICE \ --output_path $/path/to/output/ ``` ### COPA (F1) | Model | params | 0-shot | 5-shot | 10-shot | 50-shot | |----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------| | [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.6696 | 0.6477 | 0.6419 | 0.6514 | | [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.7345 | 0.7287 | 0.7277 | 0.7479 | | [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.6723 | 0.6731 | 0.6769 | 0.7119 | | [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) | 1.3B | 0.7196 | 0.7193 | 0.7204 | 0.7206 | | [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.7595 | 0.7608 | 0.7638 | 0.7788 | | **[EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) (this)** | **5.8B** | **0.7745** | **0.7676** | **0.7775** | **0.7887** | | [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) | 12.8B | 0.7937 | 0.8108 | 0.8037 | 0.8369 | <img src="https://github.com/EleutherAI/polyglot/assets/19511788/d5b49364-aed5-4467-bae2-5a322c8e2ceb" width="800px"> ### HellaSwag (F1) | Model | params | 0-shot | 5-shot | 10-shot | 50-shot | |----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------| | [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.5243 | 0.5272 | 0.5166 | 0.5352 | | [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.5590 | 0.5833 | 0.5828 | 0.5907 | | 
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.5665 | 0.5689 | 0.5565 | 0.5622 |
| [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) | 1.3B | 0.5247 | 0.5260 | 0.5278 | 0.5427 |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.5707 | 0.5830 | 0.5670 | 0.5787 |
| **[EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) (this)** | **5.8B** | **0.5976** | **0.5998** | **0.5979** | **0.6208** |
| [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) | 12.8B | 0.5954 | 0.6306 | 0.6098 | 0.6118 |

<img src="https://github.com/EleutherAI/polyglot/assets/19511788/5acb60ac-161a-4ab3-a296-db4442e08b7f" width="800px">

### BoolQ (F1)

| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.3356 | 0.4014 | 0.3640 | 0.3560 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.4514 | 0.5981 | 0.5499 | 0.5202 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.4464 | 0.3324 | 0.3324 | 0.3324 |
| [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) | 1.3B | 0.3552 | 0.4751 | 0.4109 | 0.4038 |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.4320 | 0.5263 | 0.4930 | 0.4038 |
| **[EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) (this)** | **5.8B** | **0.4356** | **0.5698** | **0.5187** | **0.5236** |
| [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) | 12.8B | 0.4818 | 0.6041 | 0.6289 | 0.6448 |

<img src="https://github.com/EleutherAI/polyglot/assets/19511788/b74c23c0-01f3-4b68-9e10-a48e9aa052ab" width="800px">

### SentiNeg (F1)

| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.6065 | 0.6878 | 0.7280 | 0.8413 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.3747 | 0.8942 | 0.9294 | 0.9698 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.3578 | 0.4471 | 0.3964 | 0.5271 |
| [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) | 1.3B | 0.6790 | 0.6257 | 0.5514 | 0.7851 |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.4858 | 0.7950 | 0.7320 | 0.7851 |
| **[EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) (this)** | **5.8B** | **0.3394** | **0.8841** | **0.8808** | **0.9521** |
| [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) | 12.8B | 0.9117 | 0.9015 | 0.9345 | 0.9723 |

<img src="https://github.com/EleutherAI/polyglot/assets/19511788/95b56b19-d349-4b70-9ff9-94a5560f89ee" width="800px">

### WiC (F1)

| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.3290 | 0.4313 | 0.4001 | 0.3621 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.3526 | 0.4775 | 0.4358 | 0.4061 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.3280 | 0.4903 | 0.4945 | 0.3656 |
| [EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) | 1.3B | 0.3297 | 0.4850 | 0.4650 | 0.3290 |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.3390 | 0.4944 | 0.4203 | 0.3835 |
| **[EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) (this)** | **5.8B** | **0.3913** | **0.4688** | **0.4189** | **0.3910** |
| [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) | 12.8B | 0.3985 | 0.3683 | 0.3307 | 0.3273 |

<img src="https://github.com/EleutherAI/polyglot/assets/19511788/4de4a4c3-d7ac-4e04-8b0c-0d533fe88294" width="800px">

## Limitations and Biases

Polyglot-Ko has been trained to optimize next-token prediction. Language models such as this are often used for a wide variety of tasks, and it is important to be aware of possible unexpected outcomes. For instance, Polyglot-Ko will not always return the most factual or accurate response, but rather the most statistically likely one. In addition, Polyglot-Ko may produce socially unacceptable or offensive content. We recommend having a human curator or other filtering mechanism to censor sensitive content.

## Citation and Related Information

### BibTeX entry

If you find our work useful, please consider citing:

```bibtex
@misc{ko2023technical,
  title={A Technical Report for Polyglot-Ko: Open-Source Large-Scale Korean Language Models},
  author={Hyunwoong Ko and Kichang Yang and Minho Ryu and Taekyoon Choi and Seungmu Yang and Jiwung Hyun and Sungho Park},
  year={2023},
  eprint={2306.02254},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

### Licensing

All our models are licensed under the terms of the Apache License 2.0.

```
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```

### Acknowledgement

This project was made possible thanks to the computing resources from [Stability.ai](https://stability.ai), and thanks to [TUNiB](https://tunib.ai) for providing a large-scale Korean dataset for this work.
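The Limitations section recommends a human curator or other filtering mechanism for sensitive content. A minimal keyword-based sketch of such a filter is shown below; it is illustrative only (the `BLOCKLIST` terms are placeholders), and a real deployment would use a trained moderation model or human review rather than simple substring matching.

```python
# Illustrative output filter of the kind the Limitations section recommends.
# The blocklist terms are hypothetical placeholders, not a real moderation list.
BLOCKLIST = {"badword1", "badword2"}

def is_safe(text: str) -> bool:
    """Return False if the generated text contains any blocked term."""
    lowered = text.lower()
    return not any(term in lowered for term in BLOCKLIST)

def filter_generations(generations):
    """Keep only generations that pass the keyword check."""
    return [g for g in generations if is_safe(g)]
```

In practice this would sit between the model's `generate` call and whatever surface displays the text, so unsafe candidates are dropped (or re-sampled) before a user sees them.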
15,471
[ [ -0.04986572265625, -0.051239013671875, 0.0202484130859375, 0.004985809326171875, -0.0386962890625, 0.0010547637939453125, -0.00893402099609375, -0.04010009765625, 0.03118896484375, 0.01277923583984375, -0.034912109375, -0.04901123046875, -0.054595947265625, -0.007171630859375, -0.0046844482421875, 0.0933837890625, -0.0032901763916015625, -0.01739501953125, 0.0013523101806640625, -0.006412506103515625, -0.0085296630859375, -0.04779052734375, -0.0406494140625, -0.035888671875, 0.02618408203125, 0.0013637542724609375, 0.06158447265625, 0.032501220703125, 0.0257720947265625, 0.0229949951171875, -0.0218048095703125, 0.003475189208984375, -0.03704833984375, -0.036041259765625, 0.0200653076171875, -0.038970947265625, -0.060577392578125, -0.002429962158203125, 0.0450439453125, 0.0208587646484375, -0.01029205322265625, 0.032318115234375, -0.00026798248291015625, 0.047576904296875, -0.032135009765625, 0.0198822021484375, -0.0256195068359375, 0.0134124755859375, -0.0310211181640625, 0.02154541015625, -0.0192108154296875, -0.02899169921875, 0.012542724609375, -0.042999267578125, 0.00402069091796875, -0.002429962158203125, 0.09771728515625, -0.002346038818359375, -0.01543426513671875, -0.0032825469970703125, -0.039581298828125, 0.056121826171875, -0.07940673828125, 0.0202178955078125, 0.0244293212890625, 0.00870513916015625, 0.0005812644958496094, -0.050506591796875, -0.043609619140625, 0.0010356903076171875, -0.0292816162109375, 0.035125732421875, -0.0174560546875, 0.0033092498779296875, 0.03680419921875, 0.032928466796875, -0.05780029296875, -0.00629425048828125, -0.039306640625, -0.0232086181640625, 0.070068359375, 0.01629638671875, 0.03033447265625, -0.02410888671875, -0.022796630859375, -0.0248565673828125, -0.0222625732421875, 0.031585693359375, 0.039703369140625, -0.0013628005981445312, -0.05389404296875, 0.040008544921875, -0.0162353515625, 0.049652099609375, 0.0161590576171875, -0.037017822265625, 0.0509033203125, -0.02801513671875, -0.0258026123046875, 
-0.005645751953125, 0.083984375, 0.0352783203125, 0.0213775634765625, 0.013885498046875, -0.0083465576171875, -0.0021305084228515625, -0.01128387451171875, -0.06195068359375, -0.0217132568359375, 0.017547607421875, -0.042449951171875, -0.0262603759765625, 0.0253753662109375, -0.060211181640625, 0.0027370452880859375, -0.01213836669921875, 0.027923583984375, -0.035125732421875, -0.031585693359375, 0.016632080078125, 0.0009465217590332031, 0.03466796875, 0.01629638671875, -0.0430908203125, 0.016021728515625, 0.0276641845703125, 0.067138671875, -0.010894775390625, -0.02508544921875, -0.002170562744140625, -0.0020294189453125, -0.016082763671875, 0.04217529296875, -0.0123748779296875, -0.01398468017578125, -0.0254669189453125, 0.01242828369140625, -0.0249786376953125, -0.0230560302734375, 0.03399658203125, -0.012664794921875, 0.0204010009765625, -0.00490570068359375, -0.035308837890625, -0.0330810546875, 0.006046295166015625, -0.039520263671875, 0.07855224609375, 0.01197052001953125, -0.07257080078125, 0.019012451171875, -0.0259552001953125, -0.002384185791015625, -0.00730133056640625, 0.0008029937744140625, -0.06280517578125, 0.00855255126953125, 0.0222625732421875, 0.0250244140625, -0.0264739990234375, 0.0004911422729492188, -0.024505615234375, -0.02069091796875, -0.00202178955078125, -0.0102996826171875, 0.07391357421875, 0.00955963134765625, -0.036041259765625, -0.0020389556884765625, -0.06500244140625, 0.01861572265625, 0.03948974609375, -0.00997161865234375, -0.0070648193359375, -0.02593994140625, -0.00765228271484375, 0.035430908203125, 0.023773193359375, -0.040283203125, 0.0155181884765625, -0.04022216796875, -0.0018548965454101562, 0.050750732421875, -0.0076751708984375, 0.01268768310546875, -0.038055419921875, 0.061126708984375, 0.014862060546875, 0.033966064453125, 0.0019397735595703125, -0.050323486328125, -0.055938720703125, -0.028350830078125, 0.019561767578125, 0.042877197265625, -0.04693603515625, 0.038665771484375, -0.0028209686279296875, 
-0.06622314453125, -0.0540771484375, 0.010101318359375, 0.03466796875, 0.02349853515625, 0.01419830322265625, -0.0190582275390625, -0.03802490234375, -0.07366943359375, -0.0006823539733886719, -0.01280975341796875, 0.01369476318359375, 0.032135009765625, 0.051055908203125, -0.00168609619140625, 0.061279296875, -0.05511474609375, -0.006862640380859375, -0.0259857177734375, 0.0164337158203125, 0.053131103515625, 0.0293731689453125, 0.067138671875, -0.0501708984375, -0.08056640625, 0.01093292236328125, -0.07366943359375, -0.0099029541015625, 0.003925323486328125, -0.00618743896484375, 0.0243377685546875, 0.0177154541015625, -0.0660400390625, 0.04833984375, 0.048828125, -0.0296783447265625, 0.0697021484375, -0.00881195068359375, -0.0027179718017578125, -0.076904296875, 0.00548553466796875, -0.0134735107421875, -0.020050048828125, -0.050628662109375, -0.0020580291748046875, -0.0128173828125, 0.0069580078125, -0.052001953125, 0.048095703125, -0.03656005859375, 0.00543212890625, -0.018707275390625, 0.0018701553344726562, -0.00870513916015625, 0.044036865234375, -0.0022678375244140625, 0.04388427734375, 0.0689697265625, -0.0206298828125, 0.0491943359375, 0.00281524658203125, -0.0197296142578125, 0.020111083984375, -0.0643310546875, 0.02386474609375, -0.0179901123046875, 0.032135009765625, -0.061920166015625, -0.015899658203125, 0.02825927734375, -0.034027099609375, 0.005329132080078125, -0.0290069580078125, -0.0418701171875, -0.051544189453125, -0.048858642578125, 0.03387451171875, 0.0555419921875, -0.019805908203125, 0.036834716796875, 0.01531982421875, -0.01137542724609375, -0.03472900390625, -0.0304107666015625, -0.026702880859375, -0.0276947021484375, -0.0628662109375, 0.023468017578125, 0.0012464523315429688, 0.002197265625, -0.0006985664367675781, 0.0005736351013183594, 0.006694793701171875, -0.026092529296875, 0.020599365234375, 0.03875732421875, -0.017333984375, -0.018768310546875, -0.01544952392578125, -0.01491546630859375, -0.0034160614013671875, 
-0.00955963134765625, 0.0650634765625, -0.029998779296875, -0.0140533447265625, -0.053375244140625, 0.0023365020751953125, 0.058380126953125, -0.0098876953125, 0.0706787109375, 0.080322265625, -0.0250244140625, 0.01983642578125, -0.0350341796875, -0.0021762847900390625, -0.035247802734375, 0.007659912109375, -0.03662109375, -0.042266845703125, 0.07061767578125, 0.01611328125, 0.0019969940185546875, 0.052947998046875, 0.048675537109375, -0.0002498626708984375, 0.0889892578125, 0.029388427734375, -0.01654052734375, 0.0296783447265625, -0.0458984375, 0.0157928466796875, -0.0634765625, -0.0273895263671875, -0.00567626953125, -0.01654052734375, -0.06304931640625, -0.028564453125, 0.03875732421875, 0.0241241455078125, -0.01313018798828125, 0.038482666015625, -0.0301513671875, 0.0197296142578125, 0.034637451171875, 0.00806427001953125, 0.0021495819091796875, -0.0025386810302734375, -0.029632568359375, -0.002002716064453125, -0.05291748046875, -0.0236663818359375, 0.07647705078125, 0.0325927734375, 0.06573486328125, 0.00316619873046875, 0.06085205078125, -0.00788116455078125, -0.0023193359375, -0.04718017578125, 0.0428466796875, -0.0051727294921875, -0.047760009765625, -0.0184173583984375, -0.03179931640625, -0.062225341796875, 0.02911376953125, -0.01812744140625, -0.076904296875, 0.00726318359375, 0.0190277099609375, -0.035552978515625, 0.04119873046875, -0.053009033203125, 0.0555419921875, -0.01403045654296875, -0.024566650390625, 0.000911712646484375, -0.041259765625, 0.02490234375, -0.0037784576416015625, 0.00047588348388671875, -0.01666259765625, 0.01495361328125, 0.053558349609375, -0.047821044921875, 0.05255126953125, -0.016387939453125, -0.002071380615234375, 0.04620361328125, -0.01464080810546875, 0.05792236328125, 0.0008697509765625, -0.0020999908447265625, 0.0203857421875, 0.0031337738037109375, -0.036834716796875, -0.039581298828125, 0.036468505859375, -0.06243896484375, -0.038360595703125, -0.052520751953125, -0.042022705078125, 0.0120697021484375, 
0.0167236328125, 0.048095703125, 0.0200653076171875, 0.01812744140625, 0.0170440673828125, 0.0294647216796875, -0.036376953125, 0.03851318359375, 0.0123138427734375, -0.03277587890625, -0.038665771484375, 0.06396484375, 0.0191497802734375, 0.03375244140625, -0.00745391845703125, 0.02197265625, -0.02545166015625, -0.0278472900390625, -0.025634765625, 0.048492431640625, -0.029815673828125, -0.01511383056640625, -0.032867431640625, -0.042449951171875, -0.044219970703125, -0.00467681884765625, -0.040069580078125, -0.0209197998046875, -0.0069427490234375, -0.01045989990234375, 0.0345458984375, 0.0509033203125, -0.0007352828979492188, 0.029052734375, -0.044219970703125, 0.0205841064453125, 0.0251617431640625, 0.027618408203125, 0.0024166107177734375, -0.053741455078125, -0.0159149169921875, 0.006011962890625, -0.02423095703125, -0.056884765625, 0.040069580078125, 0.005718231201171875, 0.036529541015625, 0.023712158203125, -0.004611968994140625, 0.0606689453125, -0.0284576416015625, 0.061492919921875, 0.02764892578125, -0.056549072265625, 0.051727294921875, -0.029388427734375, 0.044708251953125, 0.0193939208984375, 0.041656494140625, -0.028076171875, -0.0226593017578125, -0.0640869140625, -0.066650390625, 0.083984375, 0.03472900390625, -0.0044097900390625, 0.01543426513671875, 0.021484375, -0.0033321380615234375, 0.004085540771484375, -0.07537841796875, -0.044219970703125, -0.02252197265625, -0.0145263671875, 0.0021152496337890625, -0.0164794921875, 0.0017633438110351562, -0.044769287109375, 0.06109619140625, -0.0013589859008789062, 0.03350830078125, 0.013763427734375, -0.01336669921875, 0.004016876220703125, -0.00577545166015625, 0.0545654296875, 0.0565185546875, -0.030517578125, -0.01161956787109375, 0.03192138671875, -0.04754638671875, 0.0070953369140625, 0.0016994476318359375, -0.0279693603515625, 0.01416778564453125, 0.0274810791015625, 0.0804443359375, -0.017547607421875, -0.030609130859375, 0.03759765625, 0.005664825439453125, -0.022979736328125, 
-0.030487060546875, 0.006744384765625, 0.00963592529296875, 0.01611328125, 0.017913818359375, -0.024139404296875, -0.01161956787109375, -0.032012939453125, 0.016693115234375, 0.02191162109375, -0.01493072509765625, -0.041168212890625, 0.034942626953125, -0.01202392578125, -0.00525665283203125, 0.0340576171875, -0.025665283203125, -0.045135498046875, 0.050323486328125, 0.045501708984375, 0.0587158203125, -0.0233001708984375, 0.0225830078125, 0.053619384765625, 0.018096923828125, -0.0006318092346191406, 0.0196990966796875, 0.0287017822265625, -0.044036865234375, -0.0226593017578125, -0.062225341796875, 0.00926971435546875, 0.035736083984375, -0.04693603515625, 0.0289459228515625, -0.041656494140625, -0.035552978515625, 0.00969696044921875, 0.002635955810546875, -0.042022705078125, 0.01163482666015625, 0.0238800048828125, 0.051544189453125, -0.07366943359375, 0.064453125, 0.0548095703125, -0.039398193359375, -0.05670166015625, -0.007373809814453125, 0.0197296142578125, -0.058135986328125, 0.035308837890625, 0.004863739013671875, -0.0020084381103515625, -0.00730133056640625, -0.03369140625, -0.0802001953125, 0.09637451171875, 0.037384033203125, -0.04052734375, -0.006320953369140625, 0.0203704833984375, 0.0458984375, -0.0164794921875, 0.0377197265625, 0.033905029296875, 0.034271240234375, -0.002376556396484375, -0.09515380859375, 0.00531005859375, -0.0254364013671875, 0.004665374755859375, 0.01476287841796875, -0.08843994140625, 0.0784912109375, -0.0033473968505859375, 0.0024738311767578125, -0.01435089111328125, 0.035491943359375, 0.03582763671875, 0.007160186767578125, 0.04736328125, 0.061614990234375, 0.032318115234375, -0.01116943359375, 0.09197998046875, -0.016937255859375, 0.059478759765625, 0.0689697265625, 0.01422119140625, 0.0333251953125, 0.009918212890625, -0.036712646484375, 0.04217529296875, 0.04718017578125, -0.01025390625, 0.02545166015625, 0.010467529296875, -0.0207672119140625, -0.0142822265625, -0.003780364990234375, -0.03173828125, 0.040924072265625, 
0.005481719970703125, -0.016571044921875, -0.0120086669921875, 0.0114898681640625, 0.023162841796875, -0.02490234375, -0.0191497802734375, 0.05767822265625, 0.017425537109375, -0.03863525390625, 0.06549072265625, -0.01043701171875, 0.05413818359375, -0.048858642578125, 0.01197052001953125, -0.01116180419921875, 0.005451202392578125, -0.0280303955078125, -0.05999755859375, 0.016571044921875, 0.002651214599609375, -0.01096343994140625, 0.00640869140625, 0.052703857421875, -0.02203369140625, -0.054412841796875, 0.037872314453125, 0.0184478759765625, 0.0301971435546875, 0.00475311279296875, -0.08624267578125, 0.009979248046875, 0.00994110107421875, -0.038421630859375, 0.0237884521484375, 0.0182647705078125, 0.0037097930908203125, 0.038787841796875, 0.044097900390625, 0.0252838134765625, 0.03204345703125, 0.01611328125, 0.05694580078125, -0.046295166015625, -0.0268096923828125, -0.0712890625, 0.0450439453125, -0.017913818359375, -0.031524658203125, 0.0572509765625, 0.047882080078125, 0.0733642578125, -0.00934600830078125, 0.0498046875, -0.0296783447265625, 0.0213165283203125, -0.03411865234375, 0.045806884765625, -0.036407470703125, -0.005283355712890625, -0.035125732421875, -0.06829833984375, -0.00522613525390625, 0.05291748046875, -0.020355224609375, 0.0166473388671875, 0.0419921875, 0.055389404296875, -0.00859832763671875, -0.031219482421875, 0.0184478759765625, 0.02960205078125, 0.0163116455078125, 0.05584716796875, 0.041656494140625, -0.059814453125, 0.041259765625, -0.05145263671875, -0.01271820068359375, -0.0167999267578125, -0.045196533203125, -0.06903076171875, -0.03173828125, -0.032318115234375, -0.01873779296875, -0.005725860595703125, 0.0751953125, 0.06500244140625, -0.058074951171875, -0.0303802490234375, 0.005229949951171875, 0.0074462890625, -0.020721435546875, -0.021636962890625, 0.041351318359375, -0.00890350341796875, -0.07391357421875, -0.006580352783203125, 0.00853729248046875, 0.033111572265625, 0.004749298095703125, -0.01343536376953125, 
-0.0301361083984375, -0.0022430419921875, 0.054473876953125, 0.02044677734375, -0.05999755859375, -0.0219268798828125, -0.00494384765625, -0.0021305084228515625, 0.01367950439453125, 0.0197601318359375, -0.033203125, 0.030517578125, 0.05755615234375, 0.01294708251953125, 0.06964111328125, 0.00971221923828125, 0.03167724609375, -0.04425048828125, 0.0328369140625, 0.00576019287109375, 0.02203369140625, 0.003749847412109375, -0.02227783203125, 0.047271728515625, 0.033905029296875, -0.03045654296875, -0.060394287109375, -0.00435638427734375, -0.0814208984375, -0.008209228515625, 0.077392578125, -0.0273895263671875, -0.0251312255859375, 0.01045989990234375, -0.0164794921875, 0.0152435302734375, -0.0159149169921875, 0.0430908203125, 0.06829833984375, -0.03314208984375, -0.024017333984375, -0.05419921875, 0.04144287109375, 0.0305633544921875, -0.061309814453125, -0.007572174072265625, 0.003849029541015625, 0.0253143310546875, 0.025909423828125, 0.050048828125, -0.0272064208984375, 0.0274505615234375, -0.006755828857421875, 0.011932373046875, 0.005008697509765625, -0.0169219970703125, -0.02044677734375, -0.0122528076171875, -0.01537322998046875, -0.01293182373046875 ] ]
Yntec/Lyriel
2023-10-13T02:39:06.000Z
[ "diffusers", "General", "Fantasy", "Banned", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
Yntec
null
null
Yntec/Lyriel
1
9,342
diffusers
2023-10-13T00:41:22
---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- General
- Fantasy
- Banned
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
---

# Lyriel 1.2

Original model page: https://civitai.com/models/22922?modelVersionId=32704

Sample and prompt:

![Sample](https://cdn-uploads.huggingface.co/production/uploads/63239b8370edc53f51cd5d42/TI4jSnX6z3KjTPgygrRsQ.png)

Pretty CUTE Girl, playing videogame, sitting on tv monitor, gorgeous detailed hair, curly bangs, Ponytail, sneakers Magazine ad, DETAILED CHIBI EYES, iconic, 1940, sharp focus. Illustration By ROSSDRAWS and KlaysMoji and Dave Rapoza and artgerm and leyendecker and Clay Mann
704
[ [ -0.031341552734375, -0.055877685546875, 0.0208587646484375, 0.01806640625, -0.005146026611328125, -0.00997161865234375, 0.022552490234375, -0.036224365234375, 0.053314208984375, 0.0255584716796875, -0.072021484375, -0.0304412841796875, -0.032928466796875, -0.031463623046875, -0.037322998046875, 0.05035400390625, 0.007457733154296875, 0.018310546875, 0.00909423828125, 0.01116180419921875, -0.0309600830078125, 0.00505828857421875, -0.057281494140625, -0.0421142578125, 0.0538330078125, 0.042999267578125, 0.0498046875, 0.0272979736328125, 0.0263824462890625, 0.0280609130859375, 0.0025691986083984375, -0.04815673828125, -0.05517578125, -0.00580596923828125, -0.01261138916015625, -0.04345703125, -0.059661865234375, 0.030303955078125, 0.039642333984375, 0.003917694091796875, 0.002101898193359375, 0.025115966796875, 0.006500244140625, 0.0289459228515625, 0.00762176513671875, 0.0018186569213867188, -0.0028095245361328125, 0.00496673583984375, -0.021697998046875, 0.0159454345703125, -0.0068511962890625, -0.0236358642578125, -0.00762939453125, -0.0672607421875, 0.041351318359375, -0.01517486572265625, 0.10223388671875, 0.031280517578125, -0.0247650146484375, -0.015838623046875, -0.03631591796875, 0.0489501953125, -0.03717041015625, 0.06134033203125, 0.0234222412109375, 0.032470703125, -0.04931640625, -0.09014892578125, -0.01451873779296875, -0.0035648345947265625, -0.00006753206253051758, 0.0152435302734375, -0.051055908203125, -0.0229339599609375, 0.0238037109375, 0.029052734375, -0.046173095703125, 0.00453948974609375, -0.06402587890625, 0.0181121826171875, 0.04364013671875, -0.01430511474609375, 0.024200439453125, 0.0265350341796875, -0.02899169921875, -0.0165557861328125, -0.0255584716796875, 0.019439697265625, 0.047210693359375, -0.0129241943359375, -0.03131103515625, 0.0413818359375, -0.0172119140625, 0.0171356201171875, 0.041961669921875, -0.021942138671875, 0.0145721435546875, 0.0098876953125, -0.044708251953125, -0.0080108642578125, 0.0291748046875, 
0.036163330078125, 0.0104217529296875, 0.002532958984375, 0.031341552734375, -0.00274658203125, 0.034027099609375, -0.0745849609375, -0.03765869140625, 0.04034423828125, -0.055419921875, -0.05224609375, 0.00872039794921875, -0.074462890625, -0.023895263671875, 0.0079193115234375, 0.022186279296875, -0.014495849609375, -0.0171051025390625, 0.01383209228515625, -0.0021114349365234375, 0.019622802734375, 0.01425933837890625, -0.061553955078125, -0.00855255126953125, 0.0211334228515625, 0.020263671875, 0.028717041015625, 0.01055908203125, -0.00069427490234375, -0.0038089752197265625, -0.048431396484375, 0.042724609375, 0.01235198974609375, -0.027984619140625, -0.01264190673828125, 0.04248046875, 0.0226287841796875, -0.033935546875, 0.0732421875, -0.040985107421875, -0.007122039794921875, -0.006195068359375, -0.0194854736328125, -0.0092620849609375, 0.006748199462890625, -0.08544921875, 0.0679931640625, 0.020416259765625, -0.05072021484375, 0.039764404296875, -0.0338134765625, 0.00835418701171875, 0.0178985595703125, -0.0007734298706054688, -0.0155792236328125, 0.0272369384765625, -0.01495361328125, 0.017822265625, -0.01314544677734375, -0.01108551025390625, -0.053802490234375, -0.0267486572265625, 0.033294677734375, -0.02764892578125, 0.05291748046875, 0.020172119140625, -0.0183563232421875, -0.0013408660888671875, -0.055816650390625, 0.0240631103515625, 0.049530029296875, -0.002315521240234375, 0.007415771484375, -0.0157012939453125, 0.0032939910888671875, 0.0259857177734375, 0.0244903564453125, -0.030303955078125, 0.0126190185546875, 0.0045166015625, -0.00830078125, 0.0289459228515625, -0.0245361328125, 0.0030002593994140625, -0.0426025390625, 0.046478271484375, -0.031585693359375, 0.033538818359375, 0.01200103759765625, -0.024627685546875, -0.06134033203125, -0.053985595703125, 0.027435302734375, 0.029144287109375, -0.0243377685546875, 0.0095367431640625, 0.00962066650390625, -0.068115234375, -0.0316162109375, -0.00341033935546875, 0.009796142578125, 
0.0017385482788085938, -0.0113677978515625, -0.00423431396484375, -0.0323486328125, -0.0911865234375, 0.0025196075439453125, -0.017578125, -0.0198211669921875, 0.0233306884765625, 0.020477294921875, -0.0153350830078125, 0.0307159423828125, -0.0302276611328125, -0.018890380859375, -0.04278564453125, -0.00975799560546875, 0.060791015625, 0.0361328125, 0.09234619140625, -0.05340576171875, -0.0211334228515625, -0.021697998046875, -0.05084228515625, -0.03204345703125, 0.012969970703125, -0.025054931640625, -0.02459716796875, 0.050445556640625, -0.050750732421875, 0.068603515625, 0.0284881591796875, -0.0589599609375, 0.0361328125, -0.01763916015625, 0.0489501953125, -0.07244873046875, 0.0021114349365234375, 0.026031494140625, -0.0218048095703125, -0.046905517578125, 0.0616455078125, -0.030242919921875, 0.017425537109375, -0.05072021484375, 0.05169677734375, -0.0257568359375, 0.01493072509765625, -0.01346588134765625, 0.00347137451171875, 0.01084136962890625, 0.010589599609375, -0.013427734375, 0.038482666015625, 0.01629638671875, -0.01052093505859375, 0.04437255859375, 0.032684326171875, -0.037445068359375, 0.04327392578125, -0.09326171875, 0.0004372596740722656, -0.0135955810546875, 0.00305938720703125, -0.064208984375, -0.0655517578125, 0.0517578125, -0.0276641845703125, 0.0207672119140625, -0.0233154296875, -0.0285186767578125, -0.0382080078125, -0.0361328125, 0.03009033203125, 0.0285186767578125, -0.054595947265625, 0.0120086669921875, 0.01100921630859375, -0.0146484375, -0.012237548828125, -0.053619384765625, -0.029937744140625, -0.03497314453125, -0.040130615234375, 0.05023193359375, -0.0098876953125, -0.03594970703125, -0.0260772705078125, 0.004009246826171875, -0.0292510986328125, -0.0214385986328125, 0.0269012451171875, 0.043792724609375, -0.02734375, -0.01041412353515625, -0.0174102783203125, 0.01010894775390625, -0.002960205078125, 0.03662109375, 0.050811767578125, -0.0147552490234375, -0.0231170654296875, -0.061981201171875, 0.036407470703125, 0.05712890625, 
0.01523590087890625, 0.028350830078125, 0.0557861328125, -0.0308837890625, 0.00196075439453125, -0.0335693359375, -0.01242828369140625, -0.037353515625, 0.0289459228515625, -0.031982421875, -0.033782958984375, 0.032318115234375, 0.009002685546875, -0.0257568359375, 0.040863037109375, 0.038787841796875, -0.01531219482421875, 0.0784912109375, 0.06048583984375, 0.00476837158203125, 0.038970947265625, -0.04205322265625, 0.01024627685546875, -0.01523590087890625, -0.0161895751953125, -0.019500732421875, -0.0440673828125, -0.05322265625, -0.040313720703125, 0.00437164306640625, 0.04840087890625, -0.038055419921875, 0.040435791015625, -0.00830078125, 0.04412841796875, 0.05010986328125, 0.044708251953125, -0.007556915283203125, 0.005939483642578125, 0.01438140869140625, -0.035369873046875, -0.045074462890625, -0.0217742919921875, 0.03851318359375, 0.05975341796875, 0.07135009765625, 0.043548583984375, 0.04412841796875, 0.0161285400390625, -0.002727508544921875, -0.061248779296875, 0.038421630859375, -0.012786865234375, -0.08123779296875, -0.0008597373962402344, 0.006572723388671875, -0.037933349609375, -0.017425537109375, -0.0406494140625, -0.0762939453125, 0.0286102294921875, 0.025543212890625, -0.031585693359375, 0.00799560546875, -0.04119873046875, 0.08221435546875, -0.02093505859375, -0.06695556640625, -0.00835418701171875, -0.008819580078125, 0.052032470703125, 0.0116729736328125, -0.00865936279296875, -0.00699615478515625, 0.03485107421875, 0.022247314453125, -0.017333984375, 0.0460205078125, 0.01500701904296875, -0.0098876953125, 0.0309295654296875, -0.003551483154296875, 0.0286712646484375, 0.0279998779296875, 0.0210723876953125, -0.018218994140625, 0.02734375, -0.0292510986328125, -0.02972412109375, 0.0753173828125, -0.0457763671875, -0.032623291015625, -0.049407958984375, -0.0260009765625, -0.00473785400390625, 0.005767822265625, 0.056915283203125, 0.03778076171875, -0.055023193359375, 0.0151214599609375, 0.044647216796875, -0.002742767333984375, 
0.043670654296875, -0.00966644287109375, -0.058502197265625, -0.01849365234375, 0.04669189453125, -0.016143798828125, 0.01068115234375, -0.027435302734375, 0.0014562606811523438, -0.0204315185546875, -0.0006618499755859375, -0.0236053466796875, 0.04644775390625, -0.0390625, -0.0006012916564941406, 0.00577545166015625, -0.0194091796875, -0.0232391357421875, -0.01425933837890625, -0.030303955078125, -0.0200347900390625, -0.0550537109375, 0.0093231201171875, 0.029541015625, 0.07818603515625, 0.017547607421875, 0.04058837890625, -0.048126220703125, 0.03472900390625, 0.044830322265625, 0.048065185546875, -0.01549530029296875, -0.037017822265625, 0.008331298828125, -0.00310516357421875, -0.046142578125, -0.067626953125, 0.0491943359375, 0.002674102783203125, 0.061981201171875, 0.03765869140625, -0.025634765625, 0.05389404296875, -0.022735595703125, 0.044219970703125, 0.03009033203125, -0.0245361328125, 0.01203155517578125, -0.04925537109375, 0.0236053466796875, 0.044342041015625, 0.0249481201171875, -0.004032135009765625, 0.0178375244140625, -0.0562744140625, -0.03729248046875, 0.041534423828125, 0.01279449462890625, -0.0017385482788085938, 0.0305633544921875, 0.0238800048828125, -0.0119476318359375, 0.01287078857421875, -0.048583984375, -0.033447265625, -0.024688720703125, -0.0206298828125, 0.007190704345703125, -0.0208740234375, -0.00031828880310058594, -0.040985107421875, 0.0738525390625, -0.0146636962890625, 0.0439453125, 0.0091705322265625, 0.01953125, -0.004421234130859375, 0.018646240234375, 0.051422119140625, 0.0306243896484375, -0.0516357421875, -0.0260009765625, -0.0162200927734375, -0.051483154296875, 0.01299285888671875, 0.01035308837890625, -0.0287628173828125, 0.031341552734375, 0.00884246826171875, 0.059417724609375, 0.029632568359375, -0.036773681640625, 0.0272979736328125, -0.01073455810546875, 0.0027637481689453125, -0.0316162109375, 0.0125885009765625, 0.005298614501953125, 0.0816650390625, 0.01016998291015625, -0.02044677734375, 0.025238037109375, 
-0.062103271484375, 0.0228729248046875, 0.032867431640625, -0.04486083984375, -0.033477783203125, 0.08221435546875, -0.0230865478515625, -0.0222625732421875, 0.01514434814453125, -0.01200103759765625, -0.020782470703125, 0.0517578125, 0.05908203125, 0.0648193359375, -0.0249786376953125, 0.050201416015625, 0.0428466796875, -0.0172119140625, -0.019775390625, 0.043212890625, 0.033050537109375, -0.019683837890625, 0.0206298828125, -0.043182373046875, -0.00917816162109375, 0.03765869140625, -0.041412353515625, 0.0718994140625, -0.044952392578125, -0.032135009765625, 0.0125274658203125, 0.00844573974609375, -0.035736083984375, 0.046356201171875, 0.015350341796875, 0.097900390625, -0.05096435546875, 0.07293701171875, 0.038360595703125, -0.0264739990234375, -0.05609130859375, -0.012939453125, 0.0200042724609375, -0.0298309326171875, 0.038726806640625, 0.004589080810546875, 0.0035305023193359375, -0.005039215087890625, -0.0555419921875, -0.05987548828125, 0.09393310546875, 0.01922607421875, -0.05523681640625, -0.01320648193359375, -0.0180511474609375, 0.027374267578125, -0.0528564453125, 0.031158447265625, 0.0548095703125, 0.01145172119140625, 0.055877685546875, -0.0523681640625, -0.05224609375, -0.05487060546875, -0.010467529296875, -0.0028553009033203125, -0.065673828125, 0.0762939453125, -0.0239715576171875, -0.003505706787109375, 0.06219482421875, 0.06988525390625, 0.036712646484375, 0.0284881591796875, 0.0345458984375, 0.040679931640625, 0.004657745361328125, -0.01416778564453125, 0.064697265625, -0.013458251953125, 0.01491546630859375, 0.0928955078125, 0.00757598876953125, 0.0498046875, 0.036376953125, -0.022552490234375, 0.0182952880859375, 0.061553955078125, -0.020233154296875, 0.044036865234375, -0.0016489028930664062, -0.013427734375, -0.01165771484375, -0.025054931640625, -0.022216796875, 0.034576416015625, 0.00531768798828125, -0.04766845703125, -0.01334381103515625, -0.021820068359375, 0.010528564453125, 0.005970001220703125, -0.0382080078125, 0.01483154296875, 
-0.00891876220703125, -0.01238250732421875, 0.02947998046875, -0.0018281936645507812, 0.032623291015625, -0.044830322265625, -0.0290985107421875, -0.022735595703125, 0.025665283203125, -0.008819580078125, -0.032928466796875, -0.0084228515625, 0.0023136138916015625, -0.0301513671875, -0.009918212890625, 0.05712890625, -0.02044677734375, -0.097412109375, 0.023101806640625, 0.0078582763671875, 0.0160675048828125, 0.0316162109375, -0.0615234375, 0.030853271484375, 0.01226043701171875, -0.00391387939453125, 0.005794525146484375, -0.0166473388671875, -0.013702392578125, 0.0296173095703125, 0.01285552978515625, 0.01019287109375, 0.0211639404296875, 0.0146484375, 0.0438232421875, -0.0259246826171875, -0.0287628173828125, -0.02581787109375, 0.039886474609375, -0.01776123046875, -0.047088623046875, 0.049774169921875, 0.039947509765625, 0.03955078125, -0.04510498046875, 0.0297698974609375, -0.020294189453125, 0.0469970703125, -0.0247802734375, 0.0654296875, -0.057708740234375, -0.01129913330078125, -0.0201873779296875, -0.0574951171875, 0.0030002593994140625, 0.05914306640625, 0.032318115234375, 0.0118408203125, 0.0102996826171875, 0.0733642578125, -0.0347900390625, -0.003063201904296875, 0.0171661376953125, 0.0255889892578125, 0.0113677978515625, 0.01432037353515625, 0.050079345703125, -0.07574462890625, -0.0125579833984375, -0.033935546875, -0.032196044921875, -0.0231170654296875, -0.05120849609375, -0.08447265625, -0.0231781005859375, -0.0193939208984375, -0.04669189453125, -0.00438690185546875, 0.0787353515625, 0.0516357421875, -0.06610107421875, -0.0151519775390625, 0.0186004638671875, -0.0115814208984375, -0.0020294189453125, -0.013427734375, -0.004116058349609375, 0.0458984375, -0.07293701171875, 0.037353515625, -0.032073974609375, 0.0557861328125, -0.0233001708984375, 0.00757598876953125, -0.0289459228515625, 0.0216827392578125, 0.01473236083984375, 0.03594970703125, -0.056182861328125, -0.040496826171875, 0.0305938720703125, -0.01226806640625, 0.0024566650390625, 
0.039398193359375, -0.037872314453125, -0.0147552490234375, 0.049041748046875, -0.017547607421875, 0.037933349609375, 0.0119476318359375, 0.022705078125, -0.0070343017578125, 0.040802001953125, -0.0228424072265625, 0.030303955078125, 0.0143585205078125, -0.007335662841796875, 0.04376220703125, 0.013275146484375, -0.04254150390625, -0.062286376953125, 0.01480865478515625, -0.09136962890625, -0.00521087646484375, 0.049407958984375, 0.01430511474609375, -0.03857421875, 0.0293426513671875, -0.045318603515625, 0.0222625732421875, 0.0038280487060546875, 0.03082275390625, 0.0423583984375, 0.00861358642578125, -0.0360107421875, -0.07745361328125, 0.0325927734375, -0.01117706298828125, -0.0312042236328125, -0.037200927734375, 0.03509521484375, 0.02838134765625, 0.011444091796875, 0.0169830322265625, -0.0208282470703125, 0.03631591796875, 0.02337646484375, 0.01702880859375, 0.015655517578125, -0.040435791015625, -0.0091705322265625, -0.0025463104248046875, -0.006603240966796875, -0.03240966796875 ] ]
albert-large-v2
2023-04-06T13:41:50.000Z
[ "transformers", "pytorch", "tf", "safetensors", "albert", "fill-mask", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1909.11942", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
null
null
null
albert-large-v2
12
9,319
transformers
2022-03-02T23:29:04
---
language: en
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---

# ALBERT Large v2

Pretrained model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/abs/1909.11942) and first released in [this repository](https://github.com/google-research/albert). This model, like all ALBERT models, is uncased: it does not make a difference between english and English.

Disclaimer: The team releasing ALBERT did not write a model card for this model so this model card has been written by the Hugging Face team.

## Model description

ALBERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives:

- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
- Sentence Ordering Prediction (SOP): ALBERT uses a pretraining loss based on predicting the ordering of two consecutive segments of text.

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the ALBERT model as inputs.

ALBERT is particular in that it shares its layers across its Transformer.
Therefore, all layers have the same weights. Using repeating layers results in a small memory footprint; however, the computational cost remains similar to a BERT-like architecture with the same number of hidden layers, as the model has to iterate through the same number of (repeating) layers.

This is the second version of the large model. Version 2 differs from version 1 due to different dropout rates, additional training data, and longer training, and has better results on nearly all downstream tasks.

This model has the following configuration:

- 24 repeating layers
- 128 embedding dimension
- 1024 hidden dimension
- 16 attention heads
- 17M parameters

## Intended uses & limitations

You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=albert) to look for fine-tuned versions on a task that interests you.

Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification, or question answering. For tasks such as text generation you should look at models like GPT-2.
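Because ALBERT reuses one set of Transformer weights across all 24 layers, the ~17M figure can be sanity-checked with a rough, back-of-the-envelope count. This is only a sketch: it assumes the standard 4× feed-forward expansion (intermediate size 4096) and ignores biases, layer norms, and the pooler, so the exact total differs slightly.

```python
# Rough parameter count for the ALBERT-large configuration listed above
# (a sketch; ignores biases, layer norms, and the pooler head).
vocab_size, emb_dim, hidden, ffn_dim = 30000, 128, 1024, 4096

embedding = vocab_size * emb_dim + emb_dim * hidden  # factorized embedding + projection
attention = 4 * hidden * hidden                      # Q, K, V, and output projections
ffn = 2 * hidden * ffn_dim                           # up- and down-projection
shared_layer = attention + ffn                       # one set of weights, reused 24 times

total = embedding + shared_layer
print(f"~{total / 1e6:.1f}M parameters")  # on the order of 17M
```

The key point the arithmetic makes concrete: the shared layer is counted once, not 24 times, which is why ALBERT-large is far smaller than BERT-large despite having the same depth.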
### How to use

You can use this model directly with a pipeline for masked language modeling:

```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='albert-large-v2')
>>> unmasker("Hello I'm a [MASK] model.")
[
  {"sequence": "[CLS] hello i'm a modeling model.[SEP]", "score": 0.05816134437918663, "token": 12807, "token_str": "▁modeling"},
  {"sequence": "[CLS] hello i'm a modelling model.[SEP]", "score": 0.03748830780386925, "token": 23089, "token_str": "▁modelling"},
  {"sequence": "[CLS] hello i'm a model model.[SEP]", "score": 0.033725276589393616, "token": 1061, "token_str": "▁model"},
  {"sequence": "[CLS] hello i'm a runway model.[SEP]", "score": 0.017313428223133087, "token": 8014, "token_str": "▁runway"},
  {"sequence": "[CLS] hello i'm a lingerie model.[SEP]", "score": 0.014405295252799988, "token": 29104, "token_str": "▁lingerie"}
]
```

Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import AlbertTokenizer, AlbertModel
tokenizer = AlbertTokenizer.from_pretrained('albert-large-v2')
model = AlbertModel.from_pretrained("albert-large-v2")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

and in TensorFlow:

```python
from transformers import AlbertTokenizer, TFAlbertModel
tokenizer = AlbertTokenizer.from_pretrained('albert-large-v2')
model = TFAlbertModel.from_pretrained("albert-large-v2")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```

### Limitations and bias

Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions:

```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='albert-large-v2')
>>> unmasker("The man worked as a [MASK].")
[
  {"sequence": "[CLS] the man worked as a chauffeur.[SEP]", "score": 0.029577180743217468, "token": 28744, "token_str": "▁chauffeur"},
  {"sequence": "[CLS] the man worked as a janitor.[SEP]", "score": 0.028865724802017212, "token": 29477, "token_str": "▁janitor"},
  {"sequence": "[CLS] the man worked as a shoemaker.[SEP]", "score": 0.02581118606030941, "token": 29024, "token_str": "▁shoemaker"},
  {"sequence": "[CLS] the man worked as a blacksmith.[SEP]", "score": 0.01849772222340107, "token": 21238, "token_str": "▁blacksmith"},
  {"sequence": "[CLS] the man worked as a lawyer.[SEP]", "score": 0.01820771023631096, "token": 3672, "token_str": "▁lawyer"}
]

>>> unmasker("The woman worked as a [MASK].")
[
  {"sequence": "[CLS] the woman worked as a receptionist.[SEP]", "score": 0.04604868218302727, "token": 25331, "token_str": "▁receptionist"},
  {"sequence": "[CLS] the woman worked as a janitor.[SEP]", "score": 0.028220869600772858, "token": 29477, "token_str": "▁janitor"},
  {"sequence": "[CLS] the woman worked as a paramedic.[SEP]", "score": 0.0261906236410141, "token": 23386, "token_str": "▁paramedic"},
  {"sequence": "[CLS] the woman worked as a chauffeur.[SEP]", "score": 0.024797942489385605, "token": 28744, "token_str": "▁chauffeur"},
  {"sequence": "[CLS] the woman worked as a waitress.[SEP]", "score": 0.024124596267938614, "token": 13678, "token_str": "▁waitress"}
]
```

This bias will also affect all fine-tuned versions of this model.
## Training data

The ALBERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books, and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers).

## Training procedure

### Preprocessing

The texts are lowercased and tokenized using SentencePiece with a vocabulary size of 30,000. The inputs of the model are then of the form:

```
[CLS] Sentence A [SEP] Sentence B [SEP]
```

### Training

The ALBERT procedure follows the BERT setup. The details of the masking procedure for each sentence are the following:

- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token, different from the one they replace.
- In the remaining 10% of cases, the masked tokens are left as is.

## Evaluation results

When fine-tuned on downstream tasks, the ALBERT models achieve the following results:

|                | Average | SQuAD1.1  | SQuAD2.0  | MNLI | SST-2 | RACE |
|----------------|---------|-----------|-----------|------|-------|------|
| V2             |         |           |           |      |       |      |
| ALBERT-base    | 82.3    | 90.2/83.2 | 82.1/79.3 | 84.6 | 92.9  | 66.8 |
| ALBERT-large   | 85.7    | 91.8/85.2 | 84.9/81.8 | 86.5 | 94.9  | 75.2 |
| ALBERT-xlarge  | 87.9    | 92.9/86.4 | 87.9/84.1 | 87.9 | 95.4  | 80.7 |
| ALBERT-xxlarge | 90.9    | 94.6/89.1 | 89.8/86.9 | 90.6 | 96.8  | 86.8 |
| V1             |         |           |           |      |       |      |
| ALBERT-base    | 80.1    | 89.3/82.3 | 80.0/77.1 | 81.6 | 90.3  | 64.0 |
| ALBERT-large   | 82.4    | 90.6/83.9 | 82.3/79.4 | 83.5 | 91.7  | 68.5 |
| ALBERT-xlarge  | 85.5    | 92.5/86.1 | 86.1/83.1 | 86.4 | 92.4  | 74.8 |
| ALBERT-xxlarge | 91.0    | 94.8/89.3 | 90.2/87.4 | 90.8 | 96.9  | 86.5 |

### BibTeX entry and citation info

```bibtex
@article{DBLP:journals/corr/abs-1909-11942,
  author        = {Zhenzhong Lan and Mingda Chen and Sebastian Goodman and Kevin Gimpel and Piyush Sharma and Radu Soricut},
  title         = {{ALBERT:} {A} Lite {BERT} for Self-supervised Learning of Language Representations},
  journal       = {CoRR},
  volume        = {abs/1909.11942},
  year          = {2019},
  url           = {http://arxiv.org/abs/1909.11942},
  archivePrefix = {arXiv},
  eprint        = {1909.11942},
  timestamp     = {Fri, 27 Sep 2019 13:04:21 +0200},
  biburl        = {https://dblp.org/rec/journals/corr/abs-1909-11942.bib},
  bibsource     = {dblp computer science bibliography, https://dblp.org}
}
```
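The 80/10/10 masking scheme from the Training section above can be sketched in a few lines. This is a simplified illustration operating on whole tokens rather than the SentencePiece pieces the real pipeline uses, and `mask_tokens` is a hypothetical helper name, not part of any library.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", vocab=None, mlm_prob=0.15):
    """Sketch of the BERT/ALBERT masking scheme: ~15% of tokens are selected;
    of those, 80% become [MASK], 10% a random token, 10% stay unchanged."""
    vocab = vocab or tokens  # fall back to the input tokens as a toy vocabulary
    out, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if random.random() < mlm_prob:
            labels[i] = tok  # the model must predict the original token here
            r = random.random()
            if r < 0.8:
                out[i] = mask_token          # 80%: replace with [MASK]
            elif r < 0.9:
                out[i] = random.choice(vocab)  # 10%: replace with a random token
            # else: 10% of cases, leave the token as is
    return out, labels
```

Positions with a `None` label contribute nothing to the loss; only the selected ~15% are predicted.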
9,758
[ [ -0.007793426513671875, -0.03912353515625, 0.020111083984375, 0.0248870849609375, -0.031341552734375, 0.0011548995971679688, 0.007434844970703125, -0.0140533447265625, 0.0265655517578125, 0.0469970703125, -0.03985595703125, -0.032012939453125, -0.0614013671875, 0.009979248046875, -0.042083740234375, 0.08453369140625, 0.00646209716796875, 0.027252197265625, -0.004791259765625, 0.00649261474609375, -0.026214599609375, -0.0501708984375, -0.06402587890625, -0.0231170654296875, 0.040283203125, 0.02606201171875, 0.045562744140625, 0.0509033203125, 0.04046630859375, 0.0306243896484375, -0.0008082389831542969, -0.0146942138671875, -0.0215301513671875, 0.004497528076171875, -0.006542205810546875, -0.045928955078125, -0.033111572265625, 0.006023406982421875, 0.0469970703125, 0.059234619140625, -0.005054473876953125, 0.0275421142578125, -0.0139312744140625, 0.0419921875, -0.02850341796875, 0.0225372314453125, -0.0283355712890625, 0.00533294677734375, -0.0199127197265625, 0.00738525390625, -0.0254058837890625, -0.007228851318359375, 0.0088653564453125, -0.049407958984375, 0.01505279541015625, 0.02838134765625, 0.08184814453125, 0.0097198486328125, -0.0169830322265625, -0.01384735107421875, -0.042877197265625, 0.064697265625, -0.0489501953125, 0.018402099609375, 0.040008544921875, 0.02301025390625, 0.000022172927856445312, -0.076904296875, -0.02484130859375, -0.005672454833984375, -0.018096923828125, -0.0035076141357421875, -0.00428009033203125, -0.00885009765625, 0.033935546875, 0.0274505615234375, -0.031524658203125, 0.00579833984375, -0.05633544921875, -0.0245361328125, 0.05255126953125, 0.0199432373046875, 0.0165557861328125, -0.0139312744140625, -0.02239990234375, -0.0208740234375, -0.02630615234375, 0.004215240478515625, 0.0419921875, 0.022247314453125, -0.01727294921875, 0.054595947265625, -0.027099609375, 0.04010009765625, -0.0049285888671875, -0.0019216537475585938, 0.03656005859375, -0.0009665489196777344, -0.0286407470703125, 0.00139617919921875, 0.08050537109375, 
0.0196990966796875, 0.024200439453125, -0.00530242919921875, -0.0347900390625, -0.005352020263671875, 0.0228118896484375, -0.056488037109375, -0.0270233154296875, 0.0100555419921875, -0.033447265625, -0.029327392578125, 0.033843994140625, -0.05224609375, -0.01122283935546875, -0.01009368896484375, 0.0369873046875, -0.0167999267578125, -0.01134490966796875, 0.01459503173828125, -0.0301055908203125, 0.01216888427734375, 0.01226806640625, -0.0665283203125, 0.01708984375, 0.044403076171875, 0.0638427734375, 0.0263214111328125, -0.0148162841796875, -0.033782958984375, -0.006992340087890625, -0.025604248046875, 0.03753662109375, -0.0254058837890625, -0.03668212890625, 0.00817108154296875, 0.0209197998046875, 0.00028824806213378906, -0.02545166015625, 0.04498291015625, -0.0469970703125, 0.03753662109375, -0.0027942657470703125, -0.0298309326171875, -0.0192413330078125, 0.002346038818359375, -0.055511474609375, 0.07855224609375, 0.0284423828125, -0.051513671875, 0.020111083984375, -0.06719970703125, -0.04266357421875, 0.0183563232421875, 0.00916290283203125, -0.04302978515625, 0.0094451904296875, 0.00812530517578125, 0.0299530029296875, -0.0126190185546875, 0.01383209228515625, -0.0186767578125, -0.03009033203125, 0.02337646484375, -0.01297760009765625, 0.0772705078125, 0.01276397705078125, -0.0157318115234375, 0.00894927978515625, -0.065185546875, -0.00731658935546875, 0.019927978515625, -0.01947021484375, -0.017303466796875, -0.0189056396484375, 0.02777099609375, 0.01181793212890625, 0.032989501953125, -0.040069580078125, 0.019439697265625, -0.041351318359375, 0.040069580078125, 0.05621337890625, -0.003696441650390625, 0.02972412109375, -0.0308685302734375, 0.0445556640625, 0.00315093994140625, -0.007396697998046875, -0.01482391357421875, -0.04449462890625, -0.06744384765625, -0.0244598388671875, 0.04010009765625, 0.056427001953125, -0.03631591796875, 0.047332763671875, -0.00946044921875, -0.047271728515625, -0.049346923828125, -0.0034732818603515625, 0.03131103515625, 
0.027252197265625, 0.0244293212890625, -0.032379150390625, -0.06292724609375, -0.06805419921875, -0.0204925537109375, -0.012725830078125, -0.024322509765625, 0.0014505386352539062, 0.0625, -0.02508544921875, 0.05206298828125, -0.0545654296875, -0.0306243896484375, -0.0079193115234375, 0.0236358642578125, 0.0369873046875, 0.055145263671875, 0.0278472900390625, -0.0440673828125, -0.031036376953125, -0.0225372314453125, -0.051513671875, 0.0005598068237304688, -0.00292205810546875, -0.0175018310546875, -0.00013172626495361328, 0.041107177734375, -0.057891845703125, 0.0390625, 0.01448822021484375, -0.04119873046875, 0.047943115234375, -0.0233917236328125, 0.00457000732421875, -0.0902099609375, 0.01480865478515625, -0.0053253173828125, -0.02264404296875, -0.05322265625, -0.001979827880859375, -0.0099639892578125, -0.004116058349609375, -0.046600341796875, 0.046478271484375, -0.040191650390625, -0.0036487579345703125, -0.002655029296875, -0.01042938232421875, 0.01088714599609375, 0.0293731689453125, -0.0023937225341796875, 0.0445556640625, 0.05096435546875, -0.04193115234375, 0.049285888671875, 0.0367431640625, -0.04742431640625, 0.0215606689453125, -0.06536865234375, 0.0218048095703125, -0.006580352783203125, -0.0013818740844726562, -0.07855224609375, -0.0264739990234375, 0.0218048095703125, -0.0361328125, 0.0263671875, -0.00439453125, -0.05633544921875, -0.042083740234375, -0.0125732421875, 0.041595458984375, 0.037506103515625, -0.0193634033203125, 0.033660888671875, 0.025970458984375, -0.00982666015625, -0.045379638671875, -0.05426025390625, 0.007274627685546875, -0.0217132568359375, -0.0379638671875, 0.020599365234375, 0.00038552284240722656, -0.0221099853515625, -0.0201873779296875, 0.006778717041015625, -0.00799560546875, 0.004730224609375, 0.02197265625, 0.032928466796875, -0.0175933837890625, -0.01540374755859375, -0.01116943359375, -0.0121002197265625, 0.025238037109375, -0.0022678375244140625, 0.05596923828125, -0.0010433197021484375, -0.00666046142578125, 
-0.036407470703125, 0.032073974609375, 0.054931640625, -0.01058197021484375, 0.0665283203125, 0.0634765625, -0.0406494140625, 0.005802154541015625, -0.0238037109375, -0.01554107666015625, -0.0389404296875, 0.0460205078125, -0.03753662109375, -0.061187744140625, 0.05755615234375, 0.0215301513671875, -0.0104217529296875, 0.053680419921875, 0.045623779296875, -0.0083770751953125, 0.08734130859375, 0.03143310546875, -0.00665283203125, 0.036590576171875, -0.0173492431640625, 0.0274658203125, -0.06494140625, -0.037078857421875, -0.039520263671875, -0.016510009765625, -0.03179931640625, -0.007785797119140625, 0.016448974609375, 0.0258636474609375, -0.043060302734375, 0.04632568359375, -0.041717529296875, 0.0279083251953125, 0.0677490234375, 0.0127410888671875, -0.01186370849609375, -0.021209716796875, -0.00040411949157714844, 0.008148193359375, -0.02886962890625, -0.03460693359375, 0.077880859375, 0.0438232421875, 0.05230712890625, 0.00726318359375, 0.04461669921875, 0.0179901123046875, 0.011383056640625, -0.049102783203125, 0.045196533203125, -0.00884246826171875, -0.06689453125, -0.0257415771484375, -0.01073455810546875, -0.07659912109375, 0.0097808837890625, -0.0251007080078125, -0.07037353515625, -0.007537841796875, -0.009246826171875, -0.029754638671875, 0.007076263427734375, -0.05328369140625, 0.08184814453125, -0.0193328857421875, -0.01358795166015625, 0.00977325439453125, -0.0667724609375, 0.02337646484375, 0.00177764892578125, 0.016937255859375, -0.0108184814453125, 0.00909423828125, 0.087890625, -0.0347900390625, 0.060089111328125, -0.009735107421875, 0.01502227783203125, 0.0045166015625, 0.0033779144287109375, 0.0279693603515625, 0.0110015869140625, 0.006069183349609375, 0.0277557373046875, 0.0049591064453125, -0.03289794921875, -0.01910400390625, 0.034423828125, -0.064208984375, -0.040985107421875, -0.047149658203125, -0.042449951171875, 0.0131072998046875, 0.033599853515625, 0.043548583984375, 0.04473876953125, -0.010650634765625, 0.0181884765625, 
0.0275726318359375, -0.01739501953125, 0.04998779296875, 0.0330810546875, -0.0245361328125, -0.041473388671875, 0.04718017578125, -0.00019502639770507812, -0.0011959075927734375, 0.042236328125, 0.007503509521484375, -0.04547119140625, -0.0157318115234375, -0.033447265625, 0.016448974609375, -0.046875, -0.027008056640625, -0.05047607421875, -0.0297698974609375, -0.049072265625, -0.01036834716796875, -0.0173797607421875, -0.027008056640625, -0.051544189453125, -0.010284423828125, 0.02825927734375, 0.052734375, -0.0022220611572265625, 0.04876708984375, -0.057891845703125, 0.0185546875, 0.02203369140625, 0.030181884765625, -0.0216064453125, -0.059234619140625, -0.034942626953125, -0.0013513565063476562, -0.0146331787109375, -0.05889892578125, 0.04949951171875, 0.009521484375, 0.0308380126953125, 0.045623779296875, -0.00579071044921875, 0.043182373046875, -0.04754638671875, 0.07122802734375, 0.0233001708984375, -0.0772705078125, 0.041534423828125, -0.026031494140625, 0.0205230712890625, 0.031402587890625, 0.0199127197265625, -0.0347900390625, -0.0288543701171875, -0.0631103515625, -0.0765380859375, 0.06658935546875, 0.015777587890625, 0.0178375244140625, 0.0021343231201171875, 0.01364898681640625, 0.0027256011962890625, 0.034759521484375, -0.0662841796875, -0.045806884765625, -0.027862548828125, -0.0203094482421875, -0.01708984375, -0.0218353271484375, -0.00978851318359375, -0.0367431640625, 0.05908203125, 0.019073486328125, 0.0396728515625, -0.0027523040771484375, -0.00661468505859375, -0.0025005340576171875, 0.01454925537109375, 0.0621337890625, 0.040435791015625, -0.0323486328125, 0.0049591064453125, 0.00189208984375, -0.041015625, 0.006061553955078125, 0.0103912353515625, -0.00010395050048828125, 0.01548004150390625, 0.0452880859375, 0.0748291015625, 0.01308441162109375, -0.03759765625, 0.046142578125, 0.0092620849609375, -0.0211334228515625, -0.05035400390625, 0.00980377197265625, -0.00946044921875, 0.011016845703125, 0.02935791015625, 0.01435089111328125, 
0.0144500732421875, -0.03753662109375, 0.026123046875, 0.029144287109375, -0.037811279296875, -0.018707275390625, 0.0760498046875, 0.0010633468627929688, -0.064697265625, 0.057159423828125, -0.0178375244140625, -0.047698974609375, 0.05029296875, 0.049896240234375, 0.06964111328125, -0.0235137939453125, 0.009246826171875, 0.042724609375, 0.0224456787109375, -0.030364990234375, 0.01390838623046875, 0.026092529296875, -0.05841064453125, -0.0244140625, -0.063720703125, -0.016082763671875, 0.020233154296875, -0.06573486328125, 0.03302001953125, -0.037261962890625, -0.00868988037109375, 0.01268768310546875, -0.00550079345703125, -0.05908203125, 0.038177490234375, 0.002899169921875, 0.0706787109375, -0.07916259765625, 0.064453125, 0.0635986328125, -0.047760009765625, -0.0660400390625, -0.029541015625, -0.021514892578125, -0.082763671875, 0.0494384765625, 0.03240966796875, 0.0255584716796875, 0.004001617431640625, -0.044036865234375, -0.058502197265625, 0.06842041015625, 0.01129913330078125, -0.04248046875, -0.0146026611328125, 0.00811004638671875, 0.03839111328125, -0.04168701171875, 0.045379638671875, 0.042877197265625, 0.03253173828125, 0.003330230712890625, -0.05999755859375, -0.000027239322662353516, -0.032470703125, 0.00321197509765625, 0.0130157470703125, -0.03411865234375, 0.0770263671875, -0.0080413818359375, 0.00467681884765625, 0.01995849609375, 0.047515869140625, 0.0033512115478515625, 0.0180206298828125, 0.036285400390625, 0.05218505859375, 0.045166015625, -0.020965576171875, 0.06353759765625, -0.01593017578125, 0.043426513671875, 0.061553955078125, 0.004192352294921875, 0.054595947265625, 0.0321044921875, -0.022308349609375, 0.0682373046875, 0.0562744140625, -0.022003173828125, 0.0572509765625, 0.01666259765625, -0.00966644287109375, -0.006435394287109375, 0.0079498291015625, -0.019439697265625, 0.042724609375, 0.0160064697265625, -0.042083740234375, 0.00548553466796875, 0.0006489753723144531, 0.0145263671875, -0.015472412109375, -0.043212890625, 
0.05450439453125, 0.015869140625, -0.051116943359375, 0.0230712890625, 0.015533447265625, 0.039031982421875, -0.036376953125, -0.0012226104736328125, -0.0008058547973632812, 0.0156402587890625, -0.008514404296875, -0.059661865234375, 0.0164031982421875, -0.01447296142578125, -0.029327392578125, -0.0234375, 0.04022216796875, -0.037811279296875, -0.057159423828125, -0.0029163360595703125, 0.0182342529296875, 0.022979736328125, -0.00844573974609375, -0.055908203125, -0.01197052001953125, -0.0028514862060546875, -0.0198516845703125, 0.016326904296875, 0.0256500244140625, 0.01224517822265625, 0.041290283203125, 0.051300048828125, -0.009429931640625, -0.0020732879638671875, 0.0021114349365234375, 0.05096435546875, -0.0655517578125, -0.06402587890625, -0.0751953125, 0.054229736328125, -0.00875091552734375, -0.043365478515625, 0.048492431640625, 0.06298828125, 0.05755615234375, -0.031646728515625, 0.03826904296875, -0.009033203125, 0.041839599609375, -0.0284881591796875, 0.057891845703125, -0.033935546875, 0.00597381591796875, -0.027099609375, -0.06805419921875, -0.027069091796875, 0.06494140625, -0.01067352294921875, 0.004360198974609375, 0.0540771484375, 0.058319091796875, 0.0036182403564453125, -0.01096343994140625, 0.0186309814453125, 0.0127105712890625, 0.006591796875, 0.0286102294921875, 0.047271728515625, -0.05548095703125, 0.023773193359375, -0.01448822021484375, -0.00439453125, -0.0294647216796875, -0.0576171875, -0.079345703125, -0.0469970703125, -0.024810791015625, -0.0552978515625, -0.0188140869140625, 0.06634521484375, 0.05340576171875, -0.07208251953125, -0.01806640625, -0.0058441162109375, 0.00530242919921875, -0.01029205322265625, -0.0201263427734375, 0.0302276611328125, -0.0055694580078125, -0.061492919921875, 0.01548004150390625, 0.0022525787353515625, 0.00460052490234375, -0.0150299072265625, -0.00128173828125, -0.0187225341796875, -0.00017559528350830078, 0.0302886962890625, 0.01000213623046875, -0.049896240234375, -0.038330078125, 
-0.0009412765502929688, -0.016632080078125, 0.006969451904296875, 0.038360595703125, -0.035797119140625, 0.02197265625, 0.0258941650390625, 0.021636962890625, 0.05096435546875, 0.006122589111328125, 0.045562744140625, -0.073486328125, 0.0252838134765625, 0.0197601318359375, 0.04095458984375, 0.0312347412109375, -0.0308685302734375, 0.0302276611328125, 0.035858154296875, -0.044403076171875, -0.06854248046875, 0.0031871795654296875, -0.0765380859375, -0.0058135986328125, 0.08050537109375, -0.00730133056640625, -0.0247039794921875, -0.0055084228515625, -0.025238037109375, 0.032257080078125, -0.0294189453125, 0.05084228515625, 0.058349609375, 0.0086517333984375, -0.0186004638671875, -0.03216552734375, 0.0311737060546875, 0.0240631103515625, -0.035064697265625, -0.0323486328125, 0.00812530517578125, 0.03302001953125, 0.0235748291015625, 0.045257568359375, 0.0003383159637451172, 0.0104827880859375, 0.016693115234375, 0.017669677734375, -0.00530242919921875, -0.018280029296875, -0.017974853515625, 0.01335906982421875, -0.007106781005859375, -0.048126220703125 ] ]
emilyalsentzer/Bio_Discharge_Summary_BERT
2022-02-27T13:59:50.000Z
[ "transformers", "pytorch", "jax", "bert", "fill-mask", "en", "arxiv:1904.03323", "arxiv:1901.08746", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
fill-mask
emilyalsentzer
null
null
emilyalsentzer/Bio_Discharge_Summary_BERT
26
9,313
transformers
2022-03-02T23:29:05
---
language: "en"
tags:
- fill-mask
license: mit
---

# ClinicalBERT - Bio + Discharge Summary BERT Model

The [Publicly Available Clinical BERT Embeddings](https://arxiv.org/abs/1904.03323) paper contains four unique clinicalBERT models: initialized with BERT-Base (`cased_L-12_H-768_A-12`) or BioBERT (`BioBERT-Base v1.0 + PubMed 200K + PMC 270K`) & trained on either all MIMIC notes or only discharge summaries. This model card describes the Bio+Discharge Summary BERT model, which was initialized from [BioBERT](https://arxiv.org/abs/1901.08746) & trained on only discharge summaries from MIMIC.

## Pretraining Data

The `Bio_Discharge_Summary_BERT` model was trained on all discharge summaries from [MIMIC III](https://www.nature.com/articles/sdata201635), a database containing electronic health records from ICU patients at the Beth Israel Hospital in Boston, MA. For more details on MIMIC, see [here](https://mimic.physionet.org/). All notes from the `NOTEEVENTS` table were included (~880M words).

## Model Pretraining

### Note Preprocessing

Each note in MIMIC was first split into sections using a rules-based section splitter (e.g. discharge summary notes were split into "History of Present Illness", "Family History", "Brief Hospital Course", etc. sections). Then each section was split into sentences using SciSpacy (the `en_core_sci_md` tokenizer).

### Pretraining Procedures

The model was trained using code from [Google's BERT repository](https://github.com/google-research/bert) on a GeForce GTX TITAN X 12 GB GPU. Model parameters were initialized with BioBERT (`BioBERT-Base v1.0 + PubMed 200K + PMC 270K`).

### Pretraining Hyperparameters

We used a batch size of 32, a maximum sequence length of 128, and a learning rate of 5e-5 for pre-training our models. The models trained on all MIMIC notes were trained for 150,000 steps. The dup factor for duplicating input data with different masks was set to 5.
All other default parameters were used (specifically, masked language model probability = 0.15 and max predictions per sequence = 20).

## How to use the model

Load the model via the transformers library:

```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_Discharge_Summary_BERT")
model = AutoModel.from_pretrained("emilyalsentzer/Bio_Discharge_Summary_BERT")
```

## More Information

Refer to the original paper, [Publicly Available Clinical BERT Embeddings](https://arxiv.org/abs/1904.03323) (NAACL Clinical NLP Workshop 2019) for additional details and performance on NLI and NER tasks.

## Questions?

Post a Github issue on the [clinicalBERT repo](https://github.com/EmilyAlsentzer/clinicalBERT) or email emilya@mit.edu with any questions.
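The rules-based section splitting described under Note Preprocessing above can be sketched with a regular expression over known headers. This is a purely illustrative sketch: the header list is a small sample and `split_sections` is a hypothetical helper, not the splitter the authors actually used.

```python
import re

# Illustrative subset of discharge-summary section headers (assumption; the
# real splitter's header inventory is not published in the card above).
HEADERS = ["History of Present Illness", "Family History", "Brief Hospital Course"]

def split_sections(note):
    """Split a note into {header: body} using known 'Header:' markers."""
    pattern = "(" + "|".join(re.escape(h) + r":" for h in HEADERS) + ")"
    parts = re.split(pattern, note)
    sections = {}
    # parts alternates: [preamble, header1, body1, header2, body2, ...]
    for i in range(1, len(parts) - 1, 2):
        sections[parts[i].rstrip(":")] = parts[i + 1].strip()
    return sections
```

Each resulting section body would then be passed to the SciSpacy sentence segmenter, per the preprocessing description above.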
2,751
[ [ -0.01447296142578125, -0.032928466796875, 0.05218505859375, 0.025909423828125, -0.022705078125, -0.013671875, 0.0027027130126953125, -0.0394287109375, 0.0236053466796875, 0.033782958984375, -0.03985595703125, -0.05810546875, -0.052215576171875, -0.0027942657470703125, -0.022125244140625, 0.1068115234375, 0.0029296875, 0.044219970703125, -0.0008282661437988281, -0.0132293701171875, -0.0135650634765625, -0.05584716796875, -0.037353515625, -0.023834228515625, 0.041534423828125, -0.010101318359375, 0.0207672119140625, 0.0283355712890625, 0.035858154296875, 0.01227569580078125, -0.007480621337890625, -0.0011053085327148438, -0.028228759765625, -0.00970458984375, 0.02642822265625, -0.0212249755859375, -0.05755615234375, 0.0009646415710449219, 0.05291748046875, 0.053314208984375, -0.00859832763671875, 0.0158233642578125, 0.00601959228515625, 0.030517578125, -0.02972412109375, 0.003849029541015625, -0.019134521484375, 0.01483154296875, 0.0030345916748046875, 0.01038360595703125, -0.0447998046875, -0.0307159423828125, 0.043701171875, -0.0301361083984375, 0.035308837890625, -0.014068603515625, 0.08343505859375, -0.004528045654296875, -0.01067352294921875, -0.01464080810546875, -0.04046630859375, 0.053955078125, -0.0675048828125, 0.023834228515625, 0.0233306884765625, 0.00704193115234375, 0.00984954833984375, -0.07958984375, -0.0261077880859375, -0.03619384765625, -0.0153045654296875, 0.027923583984375, -0.0162811279296875, 0.026763916015625, 0.03765869140625, 0.01222991943359375, -0.0537109375, -0.00962066650390625, -0.043670654296875, -0.0287017822265625, 0.035675048828125, 0.0213165283203125, 0.007259368896484375, -0.031768798828125, -0.036041259765625, -0.0178985595703125, -0.0240936279296875, 0.0027942657470703125, 0.0172882080078125, -0.0005726814270019531, -0.03131103515625, 0.01629638671875, 0.02069091796875, 0.03900146484375, 0.0005006790161132812, 0.00743865966796875, 0.043975830078125, -0.0261383056640625, -0.0179595947265625, 0.005184173583984375, 
0.07977294921875, 0.003963470458984375, 0.01355743408203125, 0.01248931884765625, -0.006816864013671875, -0.0251617431640625, 0.034698486328125, -0.06390380859375, -0.042938232421875, 0.029022216796875, -0.06597900390625, -0.0239105224609375, -0.0004184246063232422, -0.039031982421875, -0.01006317138671875, -0.012176513671875, 0.0518798828125, -0.0704345703125, 0.0099029541015625, -0.010223388671875, -0.01194000244140625, 0.004138946533203125, 0.030120849609375, -0.047454833984375, 0.020843505859375, 0.01529693603515625, 0.060638427734375, 0.004001617431640625, -0.00899505615234375, -0.02044677734375, -0.0039215087890625, 0.00274658203125, 0.033843994140625, -0.007122039794921875, -0.03021240234375, 0.00818634033203125, 0.005985260009765625, -0.0172882080078125, -0.0304107666015625, 0.0251617431640625, -0.0159454345703125, 0.0167999267578125, -0.0189361572265625, -0.052825927734375, -0.01433563232421875, -0.00875091552734375, -0.038726806640625, 0.055145263671875, 0.0136566162109375, -0.047515869140625, 0.0207061767578125, -0.04840087890625, -0.0279998779296875, -0.00661468505859375, -0.00933074951171875, -0.043701171875, 0.004337310791015625, 0.01110076904296875, 0.042266845703125, 0.00830841064453125, 0.0237274169921875, -0.022552490234375, -0.0282745361328125, 0.003082275390625, -0.01357269287109375, 0.0809326171875, 0.0189971923828125, -0.01430511474609375, 0.007595062255859375, -0.067626953125, -0.0015468597412109375, 0.00789642333984375, -0.0205230712890625, -0.01035308837890625, -0.01129913330078125, 0.016357421875, 0.00952911376953125, 0.023223876953125, -0.049102783203125, 0.016937255859375, -0.024993896484375, 0.02288818359375, 0.04376220703125, 0.002468109130859375, 0.0009746551513671875, -0.053802490234375, 0.03155517578125, 0.0118255615234375, 0.02679443359375, -0.0288848876953125, -0.03887939453125, -0.047698974609375, -0.04693603515625, 0.0333251953125, 0.03851318359375, -0.0118560791015625, 0.042236328125, -0.0016813278198242188, -0.0377197265625, 
-0.057281494140625, -0.01006317138671875, 0.04498291015625, 0.0482177734375, 0.06134033203125, -0.029510498046875, -0.051971435546875, -0.08660888671875, 0.013519287109375, -0.0091705322265625, -0.01180267333984375, 0.024810791015625, 0.04339599609375, -0.04193115234375, 0.04669189453125, -0.03375244140625, -0.0291290283203125, -0.0350341796875, 0.041107177734375, 0.035369873046875, 0.045654296875, 0.04302978515625, -0.011749267578125, -0.03387451171875, -0.0278778076171875, -0.058746337890625, -0.002452850341796875, -0.0291290283203125, -0.0069732666015625, 0.0152587890625, 0.02685546875, -0.025238037109375, 0.036956787109375, 0.0224761962890625, 0.007068634033203125, 0.04498291015625, -0.052093505859375, -0.0189971923828125, -0.0892333984375, 0.024505615234375, -0.005401611328125, -0.0174713134765625, -0.053375244140625, -0.018280029296875, 0.01409912109375, -0.00030922889709472656, -0.0268707275390625, 0.031463623046875, -0.021820068359375, 0.0144805908203125, 0.00537872314453125, -0.01248931884765625, 0.0005664825439453125, 0.043060302734375, 0.0122528076171875, 0.023590087890625, 0.0303802490234375, -0.044921875, -0.0080108642578125, 0.0396728515625, -0.00946044921875, -0.0080413818359375, -0.079345703125, -0.0006155967712402344, -0.00855255126953125, 0.03948974609375, -0.07550048828125, -0.0108795166015625, -0.00028252601623535156, -0.037384033203125, 0.034942626953125, 0.00757598876953125, -0.047027587890625, -0.02587890625, -0.0345458984375, 0.025054931640625, 0.07598876953125, -0.02801513671875, 0.046905517578125, -0.0025768280029296875, 0.0027618408203125, -0.049468994140625, -0.050537109375, -0.03173828125, 0.01629638671875, -0.038330078125, 0.05133056640625, -0.006725311279296875, 0.0117950439453125, -0.00001817941665649414, 0.005619049072265625, -0.007785797119140625, -0.0074310302734375, 0.017913818359375, 0.033477783203125, -0.009063720703125, 0.0260162353515625, 0.0171966552734375, 0.007717132568359375, 0.021148681640625, 0.00302886962890625, 
0.053314208984375, -0.00826263427734375, -0.01873779296875, -0.05914306640625, 0.0271453857421875, 0.04168701171875, 0.0012903213500976562, 0.07373046875, 0.068359375, -0.03509521484375, 0.01239776611328125, -0.055389404296875, -0.0237579345703125, -0.0265655517578125, 0.034149169921875, 0.01166534423828125, -0.0296630859375, 0.060089111328125, 0.01123809814453125, 0.0177154541015625, 0.044891357421875, 0.033721923828125, -0.04345703125, 0.0826416015625, 0.04547119140625, 0.002964019775390625, 0.031829833984375, -0.051483154296875, 0.001983642578125, -0.0784912109375, -0.0170745849609375, -0.0162200927734375, -0.0173187255859375, -0.0361328125, -0.0133819580078125, 0.043304443359375, 0.0014848709106445312, -0.0275726318359375, 0.024658203125, -0.036529541015625, -0.0192413330078125, 0.04510498046875, 0.03759765625, -0.000522613525390625, -0.0026302337646484375, -0.046875, -0.01348876953125, -0.05902099609375, -0.033660888671875, 0.10528564453125, 0.041412353515625, 0.04400634765625, -0.01067352294921875, 0.0870361328125, 0.013946533203125, 0.03997802734375, -0.02703857421875, 0.03155517578125, -0.0278167724609375, -0.052215576171875, -0.0015840530395507812, -0.0157470703125, -0.07098388671875, 0.0006008148193359375, -0.02734375, -0.05810546875, 0.0136871337890625, 0.0222015380859375, -0.05712890625, 0.01444244384765625, -0.047515869140625, 0.061370849609375, -0.02142333984375, -0.013214111328125, -0.0164642333984375, -0.07293701171875, 0.034942626953125, -0.0236053466796875, -0.0030040740966796875, 0.014068603515625, 0.01483154296875, 0.06463623046875, -0.053466796875, 0.06976318359375, -0.01119232177734375, 0.0167083740234375, 0.013427734375, -0.022064208984375, 0.018463134765625, 0.0016117095947265625, 0.006649017333984375, 0.0328369140625, 0.0223541259765625, -0.030029296875, -0.017303466796875, 0.02587890625, -0.07281494140625, -0.022857666015625, -0.0506591796875, -0.0335693359375, 0.003414154052734375, 0.0025997161865234375, 0.057098388671875, 
0.053985595703125, -0.0151519775390625, 0.0097808837890625, 0.061920166015625, -0.062042236328125, 0.02264404296875, 0.03765869140625, -0.01812744140625, -0.035919189453125, 0.0435791015625, 0.0084228515625, 0.032562255859375, 0.02191162109375, 0.0017690658569335938, -0.0236358642578125, -0.0302734375, -0.0104522705078125, 0.0516357421875, -0.034759521484375, 0.00453948974609375, -0.08642578125, -0.046478271484375, -0.04425048828125, -0.005764007568359375, -0.0174713134765625, -0.0076904296875, -0.0284271240234375, 0.01377105712890625, 0.0238037109375, 0.0535888671875, -0.02178955078125, 0.0207061767578125, -0.08319091796875, 0.018157958984375, -0.0001779794692993164, 0.005573272705078125, 0.009674072265625, -0.06353759765625, -0.0208740234375, 0.005245208740234375, -0.03814697265625, -0.0787353515625, 0.040283203125, 0.01111602783203125, 0.0516357421875, 0.030670166015625, -0.0025615692138671875, 0.056182861328125, -0.038543701171875, 0.06317138671875, 0.0208740234375, -0.061859130859375, 0.0596923828125, -0.0265655517578125, 0.029937744140625, 0.036346435546875, 0.061920166015625, -0.0230255126953125, -0.0288848876953125, -0.081298828125, -0.07684326171875, 0.05999755859375, 0.01482391357421875, 0.00916290283203125, -0.017242431640625, 0.0267791748046875, 0.0004258155822753906, 0.007122039794921875, -0.06024169921875, -0.038665771484375, -0.00970458984375, -0.0227813720703125, -0.011383056640625, -0.034088134765625, -0.0103759765625, -0.049102783203125, 0.050689697265625, 0.01334381103515625, 0.070068359375, 0.04522705078125, -0.01302337646484375, 0.0025653839111328125, -0.00536346435546875, 0.061920166015625, 0.0430908203125, -0.055145263671875, -0.0290985107421875, 0.026458740234375, -0.05426025390625, -0.0176544189453125, 0.03790283203125, 0.005420684814453125, 0.0308685302734375, 0.05810546875, 0.05194091796875, 0.015380859375, -0.048919677734375, 0.046112060546875, -0.015289306640625, -0.039276123046875, -0.050811767578125, -0.002666473388671875, 
-0.0018157958984375, 0.0037994384765625, 0.0162811279296875, 0.0017328262329101562, 0.00868988037109375, -0.035675048828125, 0.0247039794921875, 0.02276611328125, -0.03887939453125, -0.0265655517578125, 0.06512451171875, 0.0016069412231445312, -0.0025157928466796875, 0.0697021484375, 0.00707244873046875, -0.034576416015625, 0.053314208984375, 0.03558349609375, 0.06903076171875, -0.00707244873046875, 0.01152801513671875, 0.058349609375, 0.0049591064453125, 0.002346038818359375, 0.029083251953125, 0.0012311935424804688, -0.03826904296875, -0.0183258056640625, -0.054931640625, -0.0201416015625, 0.036529541015625, -0.075439453125, 0.0170745849609375, -0.05792236328125, -0.0245513916015625, 0.0009412765502929688, -0.00994110107421875, -0.055145263671875, 0.005115509033203125, 0.01143646240234375, 0.06610107421875, -0.061859130859375, 0.0654296875, 0.056396484375, -0.0557861328125, -0.06829833984375, 0.00299835205078125, -0.0165557861328125, -0.056396484375, 0.06903076171875, 0.0110321044921875, 0.02532958984375, -0.01010894775390625, -0.0304718017578125, -0.047760009765625, 0.0845947265625, 0.005649566650390625, -0.04327392578125, -0.01141357421875, -0.00583648681640625, 0.06707763671875, -0.01551055908203125, 0.04608154296875, 0.019134521484375, 0.0104827880859375, -0.01073455810546875, -0.06378173828125, -0.006671905517578125, -0.01424407958984375, -0.010986328125, 0.006740570068359375, -0.04083251953125, 0.08074951171875, -0.03204345703125, 0.00911712646484375, 0.025970458984375, 0.03515625, 0.028839111328125, 0.02294921875, 0.0141143798828125, 0.055084228515625, 0.05206298828125, 0.00011998414993286133, 0.09515380859375, -0.039581298828125, 0.02752685546875, 0.07373046875, -0.0093231201171875, 0.052734375, 0.01593017578125, -0.016082763671875, 0.05914306640625, 0.0445556640625, -0.0110321044921875, 0.04302978515625, 0.016998291015625, -0.0205230712890625, -0.01419830322265625, 0.004878997802734375, -0.052734375, 0.011474609375, 0.0244903564453125, -0.07806396484375, 
-0.00925445556640625, 0.0015897750854492188, 0.006404876708984375, -0.0228271484375, 0.006427764892578125, 0.0474853515625, 0.009674072265625, -0.042236328125, 0.053375244140625, -0.0190582275390625, 0.028839111328125, -0.0721435546875, -0.015472412109375, -0.01334381103515625, 0.0246734619140625, 0.0013322830200195312, -0.022857666015625, 0.007602691650390625, -0.0019931793212890625, -0.0196075439453125, -0.0243988037109375, 0.014862060546875, -0.0186004638671875, -0.0367431640625, 0.031890869140625, 0.0328369140625, 0.0228271484375, 0.0299835205078125, -0.06488037109375, -0.00952911376953125, 0.003467559814453125, -0.0027790069580078125, 0.0308074951171875, 0.011871337890625, 0.0124359130859375, 0.0377197265625, 0.042999267578125, 0.010955810546875, 0.0021724700927734375, 0.0128326416015625, 0.061431884765625, -0.033477783203125, -0.0222930908203125, -0.05413818359375, 0.04248046875, -0.002910614013671875, -0.042694091796875, 0.033203125, 0.039276123046875, 0.0499267578125, -0.0249481201171875, 0.04962158203125, -0.0002586841583251953, 0.047698974609375, -0.03106689453125, 0.061370849609375, -0.048797607421875, 0.0160369873046875, -0.032867431640625, -0.054107666015625, -0.01058197021484375, 0.06024169921875, -0.01377105712890625, 0.02813720703125, 0.0711669921875, 0.03887939453125, 0.0030155181884765625, -0.01256561279296875, 0.00882720947265625, 0.029083251953125, 0.0213165283203125, 0.046722412109375, 0.035125732421875, -0.04241943359375, 0.0223236083984375, -0.034088134765625, -0.01551055908203125, -0.026580810546875, -0.055023193359375, -0.0806884765625, -0.03472900390625, -0.0222625732421875, -0.03826904296875, 0.022125244140625, 0.0850830078125, 0.057220458984375, -0.0673828125, -0.010589599609375, -0.002239227294921875, -0.033294677734375, -0.01611328125, -0.0121307373046875, 0.04644775390625, -0.032989501953125, -0.02984619140625, 0.0027790069580078125, -0.0108642578125, 0.0204925537109375, -0.002010345458984375, 0.0013589859008789062, 
-0.037078857421875, -0.0090789794921875, 0.0269927978515625, 0.0185699462890625, -0.0557861328125, -0.00778961181640625, 0.0008516311645507812, -0.02960205078125, -0.00365447998046875, 0.038299560546875, -0.0626220703125, 0.036468505859375, 0.035675048828125, 0.037567138671875, 0.049224853515625, 0.0008482933044433594, 0.04229736328125, -0.050811767578125, -0.004329681396484375, 0.0289764404296875, 0.045745849609375, -0.0057220458984375, -0.0243072509765625, 0.0262908935546875, 0.020172119140625, -0.043792724609375, -0.0535888671875, -0.00807952880859375, -0.08258056640625, -0.008880615234375, 0.06610107421875, -0.0199737548828125, 0.005218505859375, 0.0009260177612304688, -0.01421356201171875, 0.03924560546875, -0.0244140625, 0.0513916015625, 0.042694091796875, -0.0291748046875, 0.0034732818603515625, -0.058441162109375, 0.04083251953125, 0.0562744140625, -0.036224365234375, -0.0292816162109375, 0.01861572265625, 0.0277252197265625, 0.0203704833984375, 0.05010986328125, -0.01548004150390625, 0.03778076171875, -0.02069091796875, 0.03106689453125, -0.0013704299926757812, -0.009674072265625, -0.03399658203125, -0.01154327392578125, -0.0169677734375, -0.02227783203125 ] ]
vilsonrodrigues/falcon-7b-instruct-sharded
2023-07-13T12:22:04.000Z
[ "transformers", "safetensors", "falcon", "text-generation", "custom_code", "en", "dataset:tiiuae/falcon-refinedweb", "arxiv:2205.14135", "arxiv:1911.02150", "arxiv:2005.14165", "arxiv:2104.09864", "arxiv:2306.01116", "license:apache-2.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
vilsonrodrigues
null
null
vilsonrodrigues/falcon-7b-instruct-sharded
22
9,305
transformers
2023-06-09T01:06:04
--- datasets: - tiiuae/falcon-refinedweb language: - en inference: true widget: - text: "Hey Falcon! Any recommendations for my holidays in Abu Dhabi?" example_title: "Abu Dhabi Trip" - text: "What's the Everett interpretation of quantum mechanics?" example_title: "Q/A: Quantum & Answers" - text: "Give me a list of the top 10 dive sites you would recommend around the world." example_title: "Diving Top 10" - text: "Can you tell me more about deep-water soloing?" example_title: "Extreme sports" - text: "Can you write a short tweet about the Apache 2.0 release of our latest AI model, Falcon LLM?" example_title: "Twitter Helper" - text: "What are the responsibilities of a Chief Llama Officer?" example_title: "Trendy Jobs" license: apache-2.0 --- # Resharded Resharded version of https://huggingface.co/tiiuae/falcon-7b-instruct for low-RAM environments (e.g. Colab, Kaggle) in safetensors Tutorial: https://medium.com/@vilsonrodrigues/run-your-private-llm-falcon-7b-instruct-with-less-than-6gb-of-gpu-using-4-bit-quantization-ff1d4ffbabcc --- # ✨ Falcon-7B-Instruct **Falcon-7B-Instruct is a 7B-parameter causal decoder-only model built by [TII](https://www.tii.ae) based on [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b) and finetuned on a mixture of chat/instruct datasets. It is made available under the Apache 2.0 license.** *Paper coming soon 😊.* 🤗 To get started with Falcon (inference, finetuning, quantization, etc.), we recommend reading [this great blogpost from HF](https://huggingface.co/blog/falcon)! ## Why use Falcon-7B-Instruct? 
* **You are looking for a ready-to-use chat/instruct model based on [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b).** * **Falcon-7B is a strong base model, outperforming comparable open-source models** (e.g., [MPT-7B](https://huggingface.co/mosaicml/mpt-7b), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1) etc.), thanks to being trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). * **It features an architecture optimized for inference**, with FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135)) and multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)). ⚠️ Falcon is now available as a core model in the `transformers` library! To use the in-library version, please install the latest version of `transformers` with `pip install git+https://github.com/huggingface/transformers.git`, then simply remove the `trust_remote_code=True` argument from `from_pretrained()`. 💬 **This is an instruct model, which may not be ideal for further finetuning.** If you are interested in building your own instruct/chat model, we recommend starting from [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b). 🔥 **Looking for an even more powerful model?** [Falcon-40B-Instruct](https://huggingface.co/tiiuae/falcon-40b-instruct) is Falcon-7B-Instruct's big brother! ```python from transformers import AutoTokenizer, AutoModelForCausalLM import transformers import torch model = "tiiuae/falcon-7b-instruct" tokenizer = AutoTokenizer.from_pretrained(model) pipeline = transformers.pipeline( "text-generation", model=model, tokenizer=tokenizer, torch_dtype=torch.bfloat16, device_map="auto", ) sequences = pipeline( "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:", max_length=200, do_sample=True, top_k=10, num_return_sequences=1, eos_token_id=tokenizer.eos_token_id, ) for seq in sequences: print(f"Result: {seq['generated_text']}") ``` 💥 **Falcon LLMs require PyTorch 2.0 for use with `transformers`!** For fast inference with Falcon, check out [Text Generation Inference](https://github.com/huggingface/text-generation-inference)! Read more in this [blogpost](https://huggingface.co/blog/falcon). You will need **at least 16GB of memory** to swiftly run inference with Falcon-7B-Instruct. # Model Card for Falcon-7B-Instruct ## Model Details ### Model Description - **Developed by:** [https://www.tii.ae](https://www.tii.ae); - **Model type:** Causal decoder-only; - **Language(s) (NLP):** English and French; - **License:** Apache 2.0; - **Finetuned from model:** [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b). ### Model Source - **Paper:** *coming soon*. ## Uses ### Direct Use Falcon-7B-Instruct has been finetuned on a mixture of instruct and chat datasets. ### Out-of-Scope Use Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful. ## Bias, Risks, and Limitations Falcon-7B-Instruct is mostly trained on English data, and will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online. ### Recommendations We recommend that users of Falcon-7B-Instruct develop guardrails and take appropriate precautions for any production use. 
## How to Get Started with the Model ```python from transformers import AutoTokenizer, AutoModelForCausalLM import transformers import torch model = "tiiuae/falcon-7b-instruct" tokenizer = AutoTokenizer.from_pretrained(model) pipeline = transformers.pipeline( "text-generation", model=model, tokenizer=tokenizer, torch_dtype=torch.bfloat16, device_map="auto", ) sequences = pipeline( "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:", max_length=200, do_sample=True, top_k=10, num_return_sequences=1, eos_token_id=tokenizer.eos_token_id, ) for seq in sequences: print(f"Result: {seq['generated_text']}") ``` ## Training Details ### Training Data Falcon-7B-Instruct was finetuned on a 250M-token mixture of instruct/chat datasets. | **Data source** | **Fraction** | **Tokens** | **Description** | |--------------------|--------------|------------|-----------------------------------| | [Baize](https://github.com/project-baize/baize-chatbot) | 65% | 164M | chat | | [GPT4All](https://github.com/nomic-ai/gpt4all) | 25% | 62M | instruct | | [GPTeacher](https://github.com/teknium1/GPTeacher) | 5% | 11M | instruct | | [RefinedWeb-English](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) | 5% | 13M | massive web crawl | The data was tokenized with the Falcon-[7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b) tokenizer. ## Evaluation *Paper coming soon.* See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) for early results. Note that this model variant is not optimized for NLP benchmarks. ## Technical Specifications For more information about pretraining, see [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b). 
### Model Architecture and Objective Falcon-7B is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token). The architecture is broadly adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)), with the following differences: * **Positional embeddings:** rotary ([Su et al., 2021](https://arxiv.org/abs/2104.09864)); * **Attention:** multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)) and FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135)); * **Decoder-block:** parallel attention/MLP with a single layer norm. | **Hyperparameter** | **Value** | **Comment** | |--------------------|-----------|----------------------------------------| | Layers | 32 | | | `d_model` | 4544 | Increased to compensate for multiquery | | `head_dim` | 64 | Reduced to optimise for FlashAttention | | Vocabulary | 65024 | | | Sequence length | 2048 | | ### Compute Infrastructure #### Hardware Falcon-7B-Instruct was trained on AWS SageMaker, on 32 A100 40GB GPUs in P4d instances. #### Software Falcon-7B-Instruct was trained on a custom distributed training codebase, Gigatron. It uses a 3D parallelism approach combined with ZeRO and high-performance Triton kernels (FlashAttention, etc.). ## Citation *Paper coming soon* 😊. In the meantime, you can use the following information to cite: ``` @article{falcon40b, title={{Falcon-40B}: an open large language model with state-of-the-art performance}, author={Almazrouei, Ebtesam and Alobeidli, Hamza and Alshamsi, Abdulaziz and Cappelli, Alessandro and Cojocaru, Ruxandra and Debbah, Merouane and Goffinet, Etienne and Heslow, Daniel and Launay, Julien and Malartic, Quentin and Noune, Badreddine and Pannier, Baptiste and Penedo, Guilherme}, year={2023} } ``` To learn more about the pretraining dataset, see the 📓 [RefinedWeb paper](https://arxiv.org/abs/2306.01116). 
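As a sanity check on the hyperparameter table above, the listed values roughly reproduce the advertised 7B parameter count. This is back-of-the-envelope arithmetic, not a figure from the card: it assumes a fused multiquery QKV projection, embeddings tied with the LM head, and ignores biases and layer norms.

```python
# Hyperparameters from the table above (Falcon-7B).
d_model, n_layers, head_dim, vocab = 4544, 32, 64, 65024

# Fused multiquery QKV: full-width queries plus a single shared K/V head.
qkv = d_model * (d_model + 2 * head_dim)
attn_out = d_model * d_model              # attention output projection
mlp = 2 * (d_model * 4 * d_model)         # up- and down-projections
per_layer = qkv + attn_out + mlp
embeddings = vocab * d_model              # assumed tied with the LM head

total = n_layers * per_layer + embeddings
print(f"~{total / 1e9:.2f}B parameters")  # → ~6.92B parameters
```

The ~6.9B estimate lands within rounding distance of the nominal 7B; the small gap comes from the ignored biases and norms and the coarseness of the estimate.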
``` @article{refinedweb, title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only}, author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay}, journal={arXiv preprint arXiv:2306.01116}, eprint={2306.01116}, eprinttype = {arXiv}, url={https://arxiv.org/abs/2306.01116}, year={2023} } ``` ## License Falcon-7B-Instruct is made available under the Apache 2.0 license. ## Contact falconllm@tii.ae
10,351
[ [ -0.036102294921875, -0.07000732421875, 0.0032176971435546875, 0.0294036865234375, -0.00860595703125, -0.005779266357421875, -0.0113067626953125, -0.033905029296875, 0.01479339599609375, 0.02801513671875, -0.03857421875, -0.033966064453125, -0.057281494140625, 0.0036754608154296875, -0.0291900634765625, 0.07720947265625, 0.0173187255859375, -0.0142974853515625, 0.01261138916015625, 0.00014889240264892578, -0.0220947265625, -0.0367431640625, -0.07220458984375, -0.0036792755126953125, 0.02679443359375, 0.0176239013671875, 0.048675537109375, 0.060821533203125, 0.0498046875, 0.030853271484375, -0.0177001953125, 0.017852783203125, -0.04779052734375, -0.01424407958984375, 0.003314971923828125, -0.0199432373046875, -0.0220947265625, -0.00018095970153808594, 0.052764892578125, 0.03265380859375, -0.003253936767578125, 0.017547607421875, -0.0010013580322265625, 0.040863037109375, -0.0430908203125, 0.040496826171875, -0.042205810546875, -0.0058135986328125, -0.01235198974609375, 0.0106658935546875, -0.039703369140625, 0.005340576171875, -0.0176239013671875, -0.05938720703125, 0.02020263671875, 0.017364501953125, 0.0936279296875, 0.0284576416015625, -0.0262603759765625, -0.015594482421875, -0.03204345703125, 0.0546875, -0.06298828125, 0.030792236328125, 0.0135650634765625, 0.0274658203125, -0.02923583984375, -0.081787109375, -0.0404052734375, -0.01216888427734375, -0.00606536865234375, 0.0246124267578125, -0.017578125, 0.006114959716796875, 0.034759521484375, 0.015167236328125, -0.0288543701171875, 0.00022041797637939453, -0.037353515625, -0.0177459716796875, 0.045684814453125, 0.0006704330444335938, 0.0192718505859375, -0.024261474609375, -0.0290374755859375, -0.0235443115234375, -0.0301055908203125, 0.0186767578125, 0.030059814453125, 0.026580810546875, -0.027557373046875, 0.034759521484375, -0.0200958251953125, 0.037445068359375, 0.038330078125, -0.0097198486328125, 0.0291900634765625, -0.026641845703125, -0.028961181640625, 0.000576019287109375, 0.0897216796875, 
0.0157928466796875, 0.0083465576171875, -0.004978179931640625, -0.0014095306396484375, 0.00439453125, 0.01165771484375, -0.07568359375, 0.006214141845703125, 0.016021728515625, -0.0394287109375, -0.02423095703125, 0.01837158203125, -0.05291748046875, -0.0064849853515625, 0.01160430908203125, 0.016204833984375, -0.040802001953125, -0.027679443359375, 0.0175323486328125, -0.00867462158203125, 0.017181396484375, -0.0009284019470214844, -0.062164306640625, 0.016876220703125, 0.046417236328125, 0.06536865234375, 0.00650787353515625, -0.0418701171875, -0.050323486328125, -0.0007796287536621094, -0.0150299072265625, 0.042572021484375, -0.0321044921875, -0.0264434814453125, -0.0113983154296875, 0.0221405029296875, -0.0267333984375, -0.013031005859375, 0.061279296875, -0.02630615234375, 0.023193359375, -0.01983642578125, -0.04583740234375, -0.0269012451171875, -0.00250244140625, -0.0418701171875, 0.07415771484375, 0.0022373199462890625, -0.08404541015625, 0.01349639892578125, -0.060516357421875, -0.0231170654296875, -0.0184326171875, -0.0010728836059570312, -0.038726806640625, -0.010772705078125, 0.03533935546875, 0.04534912109375, -0.021881103515625, 0.035003662109375, -0.044281005859375, -0.045379638671875, 0.0011301040649414062, -0.0211334228515625, 0.0670166015625, 0.03887939453125, -0.04266357421875, 0.012542724609375, -0.0404052734375, -0.0180816650390625, 0.0162811279296875, -0.0006499290466308594, 0.01285552978515625, -0.000050008296966552734, 0.0031604766845703125, 0.021331787109375, 0.00585174560546875, -0.041473388671875, 0.0058135986328125, -0.04718017578125, 0.04541015625, 0.036529541015625, -0.005096435546875, 0.031585693359375, -0.037353515625, 0.0301513671875, 0.03778076171875, 0.0290374755859375, -0.0166778564453125, -0.04779052734375, -0.07720947265625, -0.0240936279296875, 0.00848388671875, 0.0321044921875, -0.05499267578125, 0.03570556640625, -0.01337432861328125, -0.0491943359375, -0.04022216796875, -0.014373779296875, 0.03680419921875, 
0.04718017578125, 0.03741455078125, 0.00432586669921875, -0.045745849609375, -0.06317138671875, -0.00432586669921875, -0.020538330078125, 0.0203704833984375, 0.00962066650390625, 0.042327880859375, -0.028961181640625, 0.050323486328125, -0.022979736328125, -0.01947021484375, -0.01971435546875, 0.0011844635009765625, 0.026580810546875, 0.04241943359375, 0.05523681640625, -0.041107177734375, -0.02490234375, -0.00923919677734375, -0.06610107421875, -0.0033893585205078125, -0.0129241943359375, -0.02593994140625, 0.033416748046875, 0.040679931640625, -0.06378173828125, 0.031005859375, 0.0272216796875, -0.025054931640625, 0.02606201171875, 0.0010509490966796875, 0.00922393798828125, -0.0965576171875, 0.01012420654296875, 0.0123443603515625, 0.0023784637451171875, -0.037933349609375, 0.01312255859375, -0.0005388259887695312, -0.0031681060791015625, -0.049591064453125, 0.05950927734375, -0.0382080078125, 0.004180908203125, -0.006710052490234375, -0.004886627197265625, -0.0117950439453125, 0.052215576171875, 0.00351715087890625, 0.0645751953125, 0.044921875, -0.030242919921875, 0.0016498565673828125, 0.031341552734375, -0.00042510032653808594, 0.005550384521484375, -0.06268310546875, 0.002971649169921875, -0.01039886474609375, 0.0284881591796875, -0.06719970703125, -0.018310546875, 0.039764404296875, -0.05206298828125, 0.0282745361328125, -0.01470184326171875, -0.0301055908203125, -0.041168212890625, -0.0209503173828125, 0.0032253265380859375, 0.047271728515625, -0.04266357421875, 0.0362548828125, 0.0226593017578125, 0.0087890625, -0.07183837890625, -0.046234130859375, -0.001865386962890625, -0.0243988037109375, -0.064208984375, 0.0235443115234375, -0.0014772415161132812, 0.004772186279296875, -0.00516510009765625, 0.00743865966796875, 0.007602691650390625, 0.0035190582275390625, 0.0430908203125, 0.0148773193359375, -0.018768310546875, -0.00856781005859375, 0.008148193359375, -0.0119781494140625, 0.00455474853515625, -0.01522064208984375, 0.036956787109375, 
-0.049468994140625, -0.0197906494140625, -0.034698486328125, 0.0215911865234375, 0.037200927734375, -0.01215362548828125, 0.06573486328125, 0.08221435546875, -0.0240631103515625, 0.005084991455078125, -0.051025390625, -0.0147552490234375, -0.040313720703125, 0.0299530029296875, -0.02911376953125, -0.0711669921875, 0.052490234375, 0.0199127197265625, 0.00171661376953125, 0.06512451171875, 0.03662109375, 0.0078125, 0.08404541015625, 0.029052734375, -0.01296234130859375, 0.03338623046875, -0.0389404296875, 0.006221771240234375, -0.055694580078125, -0.01506805419921875, -0.04351806640625, -0.0079193115234375, -0.056854248046875, -0.0181732177734375, 0.003993988037109375, 0.02386474609375, -0.0614013671875, 0.0222930908203125, -0.047393798828125, 0.01313018798828125, 0.044189453125, 0.002803802490234375, 0.00101470947265625, -0.0008606910705566406, -0.01056671142578125, 0.0135040283203125, -0.06695556640625, -0.042999267578125, 0.0758056640625, 0.031402587890625, 0.052520751953125, -0.004180908203125, 0.0640869140625, -0.003170013427734375, 0.0285491943359375, -0.03741455078125, 0.041412353515625, -0.0089874267578125, -0.039764404296875, -0.00783538818359375, -0.04132080078125, -0.07611083984375, 0.00787353515625, -0.012451171875, -0.06298828125, 0.00390625, -0.0029659271240234375, -0.01241302490234375, 0.0216827392578125, -0.074951171875, 0.072509765625, 0.0011205673217773438, -0.026458740234375, 0.00965118408203125, -0.058074951171875, 0.046630859375, 0.0020465850830078125, 0.0139617919921875, -0.0024394989013671875, 0.004302978515625, 0.0687255859375, -0.0452880859375, 0.06707763671875, -0.0245361328125, 0.0311431884765625, 0.041534423828125, -0.0199127197265625, 0.04754638671875, 0.0115966796875, -0.0143280029296875, 0.0273895263671875, 0.01947021484375, -0.0301055908203125, -0.036468505859375, 0.061614990234375, -0.0911865234375, -0.049407958984375, -0.04241943359375, -0.0345458984375, -0.00577545166015625, 0.01971435546875, 0.0309600830078125, 0.0234527587890625, 
0.00630950927734375, 0.0252532958984375, 0.018829345703125, -0.031097412109375, 0.052886962890625, 0.0247344970703125, -0.018310546875, -0.03558349609375, 0.05609130859375, 0.0023136138916015625, 0.004329681396484375, 0.02239990234375, 0.0191802978515625, -0.04827880859375, -0.03326416015625, -0.036712646484375, 0.033111572265625, -0.048858642578125, -0.0259857177734375, -0.0703125, -0.045166015625, -0.044219970703125, -0.004779815673828125, -0.031494140625, -0.021484375, -0.0445556640625, 0.0005164146423339844, 0.038330078125, 0.034912109375, 0.0011892318725585938, 0.038543701171875, -0.06640625, 0.01230621337890625, -0.007366180419921875, 0.0142974853515625, 0.007053375244140625, -0.053680419921875, -0.0166015625, 0.03277587890625, -0.033782958984375, -0.04852294921875, 0.040985107421875, 0.018951416015625, 0.050537109375, 0.0309600830078125, 0.01227569580078125, 0.059600830078125, -0.016754150390625, 0.05877685546875, 0.0175323486328125, -0.06585693359375, 0.0248565673828125, -0.038604736328125, 0.0176239013671875, 0.027618408203125, 0.0302581787109375, -0.0305938720703125, -0.0396728515625, -0.0718994140625, -0.038726806640625, 0.0657958984375, 0.03515625, 0.0017557144165039062, -0.0211029052734375, 0.029205322265625, -0.01184844970703125, 0.0011453628540039062, -0.0408935546875, -0.0186309814453125, -0.050323486328125, -0.0281982421875, -0.01235198974609375, -0.0036869049072265625, 0.018310546875, -0.02386474609375, 0.060882568359375, -0.00954437255859375, 0.05511474609375, 0.0149688720703125, -0.01232147216796875, 0.0058746337890625, -0.0057220458984375, 0.050323486328125, 0.0282135009765625, -0.0207061767578125, -0.0027790069580078125, 0.007068634033203125, -0.048492431640625, 0.00431060791015625, 0.0286102294921875, -0.0148468017578125, -0.00978851318359375, 0.0298614501953125, 0.07928466796875, 0.0118408203125, -0.029937744140625, 0.035003662109375, -0.0115203857421875, -0.0160675048828125, -0.00936126708984375, 0.0213775634765625, 0.0220947265625, 
0.024078369140625, 0.0152740478515625, -0.0037479400634765625, 0.004207611083984375, -0.01837158203125, 0.01494598388671875, 0.016143798828125, -0.020904541015625, -0.0173492431640625, 0.07794189453125, 0.0132293701171875, -0.015167236328125, 0.047760009765625, -0.027923583984375, -0.031646728515625, 0.06378173828125, 0.04638671875, 0.06744384765625, 0.006000518798828125, 0.0199127197265625, 0.05035400390625, 0.021453857421875, -0.014801025390625, 0.01477813720703125, 0.0203399658203125, -0.054412841796875, -0.0352783203125, -0.054901123046875, -0.01666259765625, 0.006755828857421875, -0.037200927734375, 0.0287017822265625, -0.036376953125, -0.0217437744140625, 0.01416778564453125, 0.021484375, -0.05340576171875, 0.00930023193359375, -0.0090484619140625, 0.06689453125, -0.0389404296875, 0.06341552734375, 0.045684814453125, -0.059173583984375, -0.08380126953125, -0.0201568603515625, -0.00698089599609375, -0.06646728515625, 0.051971435546875, 0.031494140625, 0.00331878662109375, 0.0182342529296875, -0.040008544921875, -0.06451416015625, 0.0784912109375, 0.030303955078125, -0.039886474609375, -0.0031757354736328125, 0.01425933837890625, 0.037261962890625, -0.02789306640625, 0.057525634765625, 0.0283966064453125, 0.0355224609375, 0.02398681640625, -0.058319091796875, 0.0168914794921875, -0.040863037109375, 0.0033664703369140625, 0.00893402099609375, -0.07879638671875, 0.065185546875, -0.019378662109375, -0.016082763671875, -0.0007519721984863281, 0.06793212890625, 0.027801513671875, 0.01708984375, 0.0310821533203125, 0.039794921875, 0.04541015625, -0.0124969482421875, 0.0771484375, -0.0438232421875, 0.046142578125, 0.06719970703125, 0.0017986297607421875, 0.05438232421875, 0.0203399658203125, -0.005413055419921875, 0.0193023681640625, 0.06817626953125, -0.0020923614501953125, 0.0179595947265625, -0.00624847412109375, 0.007526397705078125, -0.01152801513671875, -0.0003571510314941406, -0.04486083984375, 0.034698486328125, 0.01934814453125, -0.0212860107421875, 
-0.0084381103515625, -0.005367279052734375, 0.0269927978515625, -0.0245361328125, -0.004222869873046875, 0.040618896484375, 0.0084991455078125, -0.05499267578125, 0.0753173828125, 0.01116180419921875, 0.06500244140625, -0.04364013671875, 0.00981903076171875, -0.035919189453125, 0.016632080078125, -0.01305389404296875, -0.045562744140625, 0.034149169921875, -0.004169464111328125, 0.0001875162124633789, -0.0011701583862304688, 0.056396484375, -0.021697998046875, -0.054718017578125, 0.0183258056640625, 0.0220489501953125, 0.0179901123046875, -0.01557159423828125, -0.06695556640625, 0.0251922607421875, -0.01044464111328125, -0.0299835205078125, 0.017791748046875, 0.0184478759765625, -0.00429534912109375, 0.06146240234375, 0.055267333984375, -0.01279449462890625, 0.016387939453125, -0.0011758804321289062, 0.060821533203125, -0.05548095703125, -0.03314208984375, -0.053802490234375, 0.034759521484375, -0.00997161865234375, -0.027374267578125, 0.05841064453125, 0.0452880859375, 0.061004638671875, -0.00765228271484375, 0.04632568359375, -0.00820159912109375, 0.0196380615234375, -0.0328369140625, 0.060882568359375, -0.03857421875, 0.007843017578125, -0.02923583984375, -0.055267333984375, -0.01200103759765625, 0.048919677734375, -0.01180267333984375, 0.018951416015625, 0.055938720703125, 0.08013916015625, -0.00894927978515625, 0.0197601318359375, 0.0110931396484375, 0.0277557373046875, 0.03546142578125, 0.05535888671875, 0.060882568359375, -0.060791015625, 0.05047607421875, -0.0269775390625, -0.008026123046875, -0.017730712890625, -0.06256103515625, -0.088623046875, -0.0458984375, -0.0237274169921875, -0.035064697265625, 0.0078277587890625, 0.0662841796875, 0.0596923828125, -0.044586181640625, -0.017181396484375, -0.01473236083984375, 0.004749298095703125, -0.0200042724609375, -0.0170440673828125, 0.037353515625, -0.038787841796875, -0.056488037109375, 0.01224517822265625, 0.0022525787353515625, 0.00974273681640625, -0.007358551025390625, -0.0196380615234375, -0.027587890625, 
0.0011301040649414062, 0.044403076171875, 0.021759033203125, -0.060821533203125, -0.032440185546875, 0.0159149169921875, -0.01190185546875, 0.003238677978515625, 0.018646240234375, -0.0419921875, 0.0170745849609375, 0.033111572265625, 0.05206298828125, 0.06719970703125, -0.0035266876220703125, 0.01525115966796875, -0.02545166015625, 0.033203125, -0.0088958740234375, 0.034271240234375, 0.0144195556640625, -0.0292510986328125, 0.04571533203125, 0.032135009765625, -0.039154052734375, -0.05499267578125, -0.01424407958984375, -0.09674072265625, -0.005275726318359375, 0.098388671875, -0.0144805908203125, -0.0343017578125, 0.01418304443359375, -0.030242919921875, 0.040771484375, -0.047088623046875, 0.0465087890625, 0.040374755859375, 0.0027008056640625, -0.0162200927734375, -0.0270843505859375, 0.0292816162109375, 0.01030731201171875, -0.07049560546875, -0.0204010009765625, 0.02490234375, 0.024261474609375, -0.00311279296875, 0.044586181640625, 0.005512237548828125, 0.00482177734375, 0.015777587890625, -0.002567291259765625, -0.038421630859375, -0.01451873779296875, -0.00036263465881347656, 0.00954437255859375, -0.02191162109375, -0.0255126953125 ] ]
timm/eva_large_patch14_196.in22k_ft_in22k_in1k
2023-03-31T06:13:42.000Z
[ "timm", "pytorch", "safetensors", "image-classification", "dataset:imagenet-1k", "dataset:imagenet-22k", "arxiv:2211.07636", "license:mit", "region:us" ]
image-classification
timm
null
null
timm/eva_large_patch14_196.in22k_ft_in22k_in1k
0
9,300
timm
2022-12-22T07:08:20
---
tags:
- image-classification
- timm
library_tag: timm
license: mit
datasets:
- imagenet-1k
- imagenet-22k
---
# Model card for eva_large_patch14_196.in22k_ft_in22k_in1k

An EVA image classification model. Pretrained on ImageNet-22k with masked image modeling (using EVA-CLIP as a MIM teacher) and fine-tuned on ImageNet-22k and then on ImageNet-1k by the paper authors.

NOTE: `timm` checkpoints are float32 for consistency with other models. Original checkpoints are float16 or bfloat16 in some cases; see the originals if that's preferred.

## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
  - Params (M): 304.1
  - GMACs: 61.6
  - Activations (M): 63.5
  - Image size: 196 x 196
- **Papers:**
  - EVA: Exploring the Limits of Masked Visual Representation Learning at Scale: https://arxiv.org/abs/2211.07636
- **Pretrain Dataset:** ImageNet-22k
- **Dataset:** ImageNet-1k
- **Original:**
  - https://github.com/baaivision/EVA
  - https://huggingface.co/BAAI/EVA

## Model Usage
### Image Classification
```python
from urllib.request import urlopen

import timm
import torch
from PIL import Image

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model('eva_large_patch14_196.in22k_ft_in22k_in1k', pretrained=True)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # unsqueeze single image into batch of 1

top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```

### Image Embeddings
```python
from urllib.request import urlopen

import timm
from PIL import Image

img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))

model = timm.create_model(
    'eva_large_patch14_196.in22k_ft_in22k_in1k',
    pretrained=True,
    num_classes=0,  # remove classifier nn.Linear
)
model = model.eval()

# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)

output = model(transforms(img).unsqueeze(0))  # output is (batch_size, num_features) shaped tensor

# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 197, 1024) shaped tensor

output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```

## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).

|model                                          |top1  |top5  |param_count|img_size|
|-----------------------------------------------|------|------|-----------|--------|
|eva02_large_patch14_448.mim_m38m_ft_in22k_in1k |90.054|99.042|305.08     |448     |
|eva02_large_patch14_448.mim_in22k_ft_in22k_in1k|89.946|99.01 |305.08     |448     |
|eva_giant_patch14_560.m30m_ft_in22k_in1k       |89.792|98.992|1014.45    |560     |
|eva02_large_patch14_448.mim_in22k_ft_in1k      |89.626|98.954|305.08     |448     |
|eva02_large_patch14_448.mim_m38m_ft_in1k       |89.57 |98.918|305.08     |448     |
|eva_giant_patch14_336.m30m_ft_in22k_in1k       |89.56 |98.956|1013.01    |336     |
|eva_giant_patch14_336.clip_ft_in1k             |89.466|98.82 |1013.01    |336     |
|eva_large_patch14_336.in22k_ft_in22k_in1k      |89.214|98.854|304.53     |336     |
|eva_giant_patch14_224.clip_ft_in1k             |88.882|98.678|1012.56    |224     |
|eva02_base_patch14_448.mim_in22k_ft_in22k_in1k |88.692|98.722|87.12      |448     |
|eva_large_patch14_336.in22k_ft_in1k            |88.652|98.722|304.53     |336     |
|eva_large_patch14_196.in22k_ft_in22k_in1k      |88.592|98.656|304.14     |196     |
|eva02_base_patch14_448.mim_in22k_ft_in1k       |88.23 |98.564|87.12      |448     |
|eva_large_patch14_196.in22k_ft_in1k            |87.934|98.504|304.14     |196     |
|eva02_small_patch14_336.mim_in22k_ft_in1k      |85.74 |97.614|22.13      |336     |
|eva02_tiny_patch14_336.mim_in22k_ft_in1k       |80.658|95.524|5.76       |336     |

## Citation
```bibtex
@article{EVA,
  title={EVA: Exploring the Limits of Masked Visual Representation Learning at Scale},
  author={Fang, Yuxin and Wang, Wen and Xie, Binhui and Sun, Quan and Wu, Ledell and Wang, Xinggang and Huang, Tiejun and Wang, Xinlong and Cao, Yue},
  journal={arXiv preprint arXiv:2211.07636},
  year={2022}
}
```
```bibtex
@misc{rw2019timm,
  author = {Ross Wightman},
  title = {PyTorch Image Models},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  doi = {10.5281/zenodo.4414861},
  howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
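The `torch.topk(output.softmax(dim=1) * 100, k=5)` step in the classification example above simply converts raw logits into the five most probable classes as percentages. The same arithmetic can be sketched in plain Python without loading any model; the logits below are a made-up toy example, not outputs of this checkpoint:

```python
import math

def top_k(logits, k=5):
    """Softmax the logits, then return the k (index, probability%) pairs
    with the highest probability, mirroring torch.topk(softmax * 100, k)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # subtract max for numerical stability
    total = sum(exps)
    probs = [100.0 * e / total for e in exps]
    ranked = sorted(enumerate(probs), key=lambda p: p[1], reverse=True)
    return ranked[:k]

# hypothetical logits for a 10-class toy problem
logits = [0.1, 2.3, -1.0, 0.7, 3.1, 0.0, -0.5, 1.2, 0.3, -2.0]
print(top_k(logits, k=3))  # class 4 ranks first, since it has the largest logit
```

In the real pipeline the returned indices are then mapped to ImageNet-1k class names.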
5,072
[ [ -0.049102783203125, -0.030059814453125, 0.007232666015625, 0.01015472412109375, -0.0206451416015625, 0.0016345977783203125, -0.013946533203125, -0.030120849609375, 0.044219970703125, 0.03271484375, -0.035614013671875, -0.053680419921875, -0.052490234375, 0.004230499267578125, -0.001873016357421875, 0.0654296875, -0.00756072998046875, -0.0004892349243164062, 0.0008211135864257812, -0.0347900390625, -0.0247650146484375, -0.0260467529296875, -0.0469970703125, -0.01800537109375, 0.0285186767578125, 0.0244903564453125, 0.039886474609375, 0.047607421875, 0.04083251953125, 0.0286102294921875, -0.01544189453125, 0.0106353759765625, -0.019561767578125, -0.0157928466796875, 0.0302886962890625, -0.044158935546875, -0.047607421875, 0.00980377197265625, 0.06805419921875, 0.0304107666015625, 0.00426483154296875, 0.0247650146484375, 0.00934600830078125, 0.04388427734375, -0.0288543701171875, 0.00392913818359375, -0.0238800048828125, 0.0173797607421875, -0.01061248779296875, 0.00304412841796875, -0.02191162109375, -0.0169219970703125, 0.0008149147033691406, -0.052581787109375, 0.0309600830078125, 0.007183074951171875, 0.097900390625, 0.006847381591796875, -0.0104827880859375, 0.0075225830078125, -0.0156402587890625, 0.057586669921875, -0.05572509765625, 0.0176849365234375, 0.0261993408203125, 0.011016845703125, -0.002620697021484375, -0.06402587890625, -0.0416259765625, -0.005767822265625, -0.0190582275390625, 0.007022857666015625, -0.026947021484375, 0.006069183349609375, 0.03753662109375, 0.035369873046875, -0.038055419921875, 0.0089874267578125, -0.043853759765625, -0.0213775634765625, 0.052276611328125, 0.0019025802612304688, 0.02520751953125, -0.0240936279296875, -0.046783447265625, -0.03216552734375, -0.0361328125, 0.01806640625, 0.0276336669921875, -0.001468658447265625, -0.038299560546875, 0.0304412841796875, 0.0006113052368164062, 0.04034423828125, 0.011505126953125, -0.017730712890625, 0.050994873046875, -0.005756378173828125, -0.03192138671875, -0.0132293701171875, 
0.0772705078125, 0.04107666015625, 0.003635406494140625, 0.0186614990234375, -0.0147247314453125, -0.028411865234375, -0.00675201416015625, -0.08697509765625, -0.0205078125, 0.01239013671875, -0.03961181640625, -0.0254364013671875, 0.0226593017578125, -0.07122802734375, -0.006237030029296875, -0.00836181640625, 0.05255126953125, -0.0364990234375, -0.0257720947265625, 0.002223968505859375, -0.01140594482421875, 0.0172119140625, 0.02154541015625, -0.04376220703125, 0.024444580078125, 0.0188140869140625, 0.07989501953125, 0.0039215087890625, -0.0224761962890625, -0.01522064208984375, -0.02093505859375, -0.038787841796875, 0.033538818359375, -0.015655517578125, -0.0071258544921875, -0.0162353515625, 0.0285186767578125, -0.017669677734375, -0.0438232421875, 0.0166778564453125, -0.009735107421875, 0.012786865234375, -0.01035308837890625, -0.024322509765625, -0.03900146484375, 0.0227203369140625, -0.04302978515625, 0.08636474609375, 0.0264892578125, -0.05767822265625, 0.0304107666015625, -0.041473388671875, 0.009185791015625, -0.00673675537109375, 0.0029850006103515625, -0.07476806640625, -0.009857177734375, 0.034637451171875, 0.048553466796875, -0.00968170166015625, -0.0030078887939453125, -0.032318115234375, -0.02294921875, 0.01690673828125, -0.005100250244140625, 0.07464599609375, 0.0097808837890625, -0.04034423828125, 0.019561767578125, -0.056884765625, 0.0117950439453125, 0.02874755859375, -0.00543212890625, -0.0018768310546875, -0.052032470703125, 0.0045318603515625, 0.0197906494140625, 0.0007486343383789062, -0.037689208984375, 0.0236358642578125, -0.0029125213623046875, 0.036041259765625, 0.066650390625, 0.005817413330078125, 0.0198211669921875, -0.02838134765625, 0.040435791015625, 0.01381683349609375, 0.0225372314453125, -0.01702880859375, -0.04345703125, -0.0660400390625, -0.060516357421875, 0.032440185546875, 0.0219573974609375, -0.03955078125, 0.0506591796875, -0.005863189697265625, -0.051513671875, -0.034454345703125, 0.005176544189453125, 0.042144775390625, 
0.037689208984375, 0.032257080078125, -0.03814697265625, -0.037353515625, -0.07464599609375, 0.01038360595703125, 0.002285003662109375, -0.00594329833984375, 0.0312347412109375, 0.0648193359375, -0.00981903076171875, 0.0496826171875, -0.04571533203125, -0.025604248046875, -0.01468658447265625, 0.0015897750854492188, 0.0257110595703125, 0.053680419921875, 0.06842041015625, -0.03955078125, -0.050079345703125, -0.01409149169921875, -0.059906005859375, 0.0015668869018554688, 0.0018978118896484375, -0.0186614990234375, 0.025054931640625, 0.0240478515625, -0.049224853515625, 0.05316162109375, 0.026275634765625, -0.034698486328125, 0.03460693359375, -0.0218963623046875, 0.0204010009765625, -0.079345703125, 0.017578125, 0.0170135498046875, -0.0182952880859375, -0.0408935546875, 0.00896453857421875, 0.01641845703125, -0.005260467529296875, -0.032928466796875, 0.050048828125, -0.04583740234375, -0.0040740966796875, 0.003353118896484375, -0.0173187255859375, 0.00960540771484375, 0.061309814453125, -0.0013256072998046875, 0.042510986328125, 0.056304931640625, -0.0227813720703125, 0.0254058837890625, 0.019256591796875, -0.031219482421875, 0.0401611328125, -0.051788330078125, 0.00811767578125, -0.00258636474609375, 0.038116455078125, -0.07080078125, -0.0275115966796875, 0.0286712646484375, -0.03656005859375, 0.04705810546875, -0.035430908203125, -0.032867431640625, -0.04541015625, -0.046722412109375, 0.0360107421875, 0.049102783203125, -0.050079345703125, 0.020904541015625, 0.01349639892578125, 0.0110931396484375, -0.045867919921875, -0.052337646484375, -0.0146636962890625, -0.034088134765625, -0.04901123046875, 0.029815673828125, -0.0025787353515625, 0.0033817291259765625, -0.0030269622802734375, -0.0022869110107421875, 0.005641937255859375, -0.0149993896484375, 0.035064697265625, 0.038482666015625, -0.02716064453125, -0.01214599609375, -0.0210418701171875, -0.00899505615234375, -0.0007729530334472656, -0.01324462890625, 0.044891357421875, -0.018646240234375, 
-0.025543212890625, -0.050689697265625, 0.005565643310546875, 0.049530029296875, -0.015350341796875, 0.063720703125, 0.0682373046875, -0.03997802734375, 0.00007170438766479492, -0.0469970703125, -0.0225677490234375, -0.03515625, 0.03289794921875, -0.0260772705078125, -0.04327392578125, 0.072021484375, 0.0214080810546875, -0.00342559814453125, 0.061614990234375, 0.02569580078125, 0.0041961669921875, 0.075439453125, 0.034515380859375, 0.01483154296875, 0.05224609375, -0.07659912109375, -0.00917816162109375, -0.07659912109375, -0.03863525390625, -0.0182037353515625, -0.047332763671875, -0.041900634765625, -0.034088134765625, 0.040679931640625, 0.02911376953125, -0.034454345703125, 0.031890869140625, -0.058990478515625, 0.015777587890625, 0.0531005859375, 0.050384521484375, -0.015899658203125, 0.0214080810546875, -0.0142669677734375, -0.00402069091796875, -0.051971435546875, -0.0084228515625, 0.0914306640625, 0.043731689453125, 0.03875732421875, -0.0097808837890625, 0.052093505859375, -0.006000518798828125, 0.0202484130859375, -0.03680419921875, 0.047393798828125, 0.005767822265625, -0.048004150390625, -0.0169525146484375, -0.0291748046875, -0.06939697265625, 0.0234527587890625, -0.0266876220703125, -0.0672607421875, 0.02734375, 0.01202392578125, -0.0209197998046875, 0.060394287109375, -0.06231689453125, 0.06195068359375, 0.000030517578125, -0.03399658203125, 0.0009636878967285156, -0.056182861328125, 0.0246124267578125, 0.0111236572265625, -0.004150390625, 0.0010309219360351562, 0.011566162109375, 0.0880126953125, -0.053558349609375, 0.058929443359375, -0.0246429443359375, 0.022918701171875, 0.0380859375, -0.01085662841796875, 0.039703369140625, 0.00273895263671875, 0.00815582275390625, 0.02978515625, 0.00023818016052246094, -0.042449951171875, -0.0307769775390625, 0.04034423828125, -0.0711669921875, -0.038848876953125, -0.039886474609375, -0.03350830078125, 0.01377105712890625, 0.01336669921875, 0.0458984375, 0.047607421875, 0.017852783203125, 0.032470703125, 
0.054962158203125, -0.0271453857421875, 0.032745361328125, 0.004535675048828125, -0.022308349609375, -0.0570068359375, 0.07000732421875, 0.0236968994140625, 0.021484375, 0.00554656982421875, 0.0152587890625, -0.01285552978515625, -0.033477783203125, -0.03192138671875, 0.03399658203125, -0.051300048828125, -0.037445068359375, -0.042633056640625, -0.03363037109375, -0.02337646484375, -0.0139007568359375, -0.029541015625, -0.033721923828125, -0.027191162109375, -0.001995086669921875, 0.054931640625, 0.049224853515625, -0.022216796875, 0.0222320556640625, -0.03680419921875, 0.00798797607421875, 0.0161285400390625, 0.032318115234375, -0.00449371337890625, -0.0635986328125, -0.01448822021484375, -0.0003132820129394531, -0.036224365234375, -0.068603515625, 0.043731689453125, 0.000667572021484375, 0.03387451171875, 0.036376953125, -0.01515960693359375, 0.06719970703125, -0.004608154296875, 0.054901123046875, 0.040802001953125, -0.050872802734375, 0.039886474609375, -0.0186309814453125, 0.003910064697265625, 0.0124969482421875, 0.01934814453125, -0.02203369140625, -0.0159912109375, -0.08184814453125, -0.063232421875, 0.06805419921875, 0.019073486328125, -0.0084686279296875, 0.0279693603515625, 0.0312042236328125, 0.0010671615600585938, 0.0003364086151123047, -0.05609130859375, -0.041351318359375, -0.0232696533203125, -0.006847381591796875, -0.0124969482421875, -0.0133209228515625, -0.01291656494140625, -0.053741455078125, 0.050262451171875, -0.003612518310546875, 0.05267333984375, 0.02301025390625, -0.00298309326171875, -0.01184844970703125, -0.024871826171875, 0.048553466796875, 0.0364990234375, -0.0283660888671875, 0.0032520294189453125, 0.0217437744140625, -0.050811767578125, -0.005115509033203125, 0.0128936767578125, -0.0096282958984375, 0.0029754638671875, 0.041046142578125, 0.07177734375, 0.0006451606750488281, -0.01024627685546875, 0.040435791015625, 0.004726409912109375, -0.038055419921875, -0.01279449462890625, 0.007434844970703125, -0.0189666748046875, 
0.0268707275390625, 0.033721923828125, 0.01983642578125, -0.0165863037109375, -0.0246429443359375, 0.0130767822265625, 0.0426025390625, -0.03118896484375, -0.025482177734375, 0.046875, -0.02020263671875, -0.013946533203125, 0.046600341796875, -0.0006551742553710938, -0.046783447265625, 0.083740234375, 0.032745361328125, 0.06329345703125, -0.0225982666015625, 0.01122283935546875, 0.07318115234375, 0.0167236328125, 0.002231597900390625, 0.005084991455078125, 0.01544189453125, -0.0302734375, 0.003955841064453125, -0.04052734375, 0.0009908676147460938, 0.042327880859375, -0.042877197265625, 0.0251312255859375, -0.04278564453125, -0.0174102783203125, 0.000004887580871582031, 0.025604248046875, -0.0611572265625, 0.01166534423828125, 0.001636505126953125, 0.06524658203125, -0.06854248046875, 0.05035400390625, 0.047698974609375, -0.042816162109375, -0.07659912109375, -0.01416778564453125, 0.00004762411117553711, -0.07171630859375, 0.03668212890625, 0.02447509765625, 0.02996826171875, -0.0036640167236328125, -0.061981201171875, -0.06329345703125, 0.11163330078125, 0.034088134765625, -0.0207672119140625, 0.016937255859375, -0.003055572509765625, 0.0249176025390625, -0.0281219482421875, 0.044158935546875, 0.0133056640625, 0.0323486328125, 0.0213470458984375, -0.05572509765625, 0.01983642578125, -0.0287933349609375, 0.006473541259765625, 0.01450347900390625, -0.07220458984375, 0.0780029296875, -0.020782470703125, -0.0009441375732421875, 0.00269317626953125, 0.0391845703125, 0.0279541015625, 0.00850677490234375, 0.039337158203125, 0.058135986328125, 0.042144775390625, -0.017578125, 0.0643310546875, -0.011993408203125, 0.040771484375, 0.0303955078125, 0.035064697265625, 0.03851318359375, 0.0285491943359375, -0.0272369384765625, 0.0318603515625, 0.0731201171875, -0.0264892578125, 0.0218658447265625, 0.0128631591796875, -0.01067352294921875, -0.0132293701171875, 0.00817108154296875, -0.046722412109375, 0.020050048828125, 0.0207977294921875, -0.03619384765625, -0.00519561767578125, 
0.0079498291015625, 0.00913238525390625, -0.0252532958984375, -0.027862548828125, 0.03399658203125, 0.0006227493286132812, -0.03912353515625, 0.06292724609375, -0.0013399124145507812, 0.053802490234375, -0.039764404296875, -0.0115509033203125, -0.0295257568359375, 0.0176239013671875, -0.0316162109375, -0.06640625, 0.0084686279296875, -0.01038360595703125, -0.000055670738220214844, 0.0009665489196777344, 0.048065185546875, -0.022247314453125, -0.045257568359375, 0.012237548828125, 0.00867462158203125, 0.0218658447265625, 0.00041031837463378906, -0.07025146484375, 0.01033782958984375, 0.0032520294189453125, -0.052978515625, 0.03582763671875, 0.0287628173828125, 0.00075531005859375, 0.04718017578125, 0.04461669921875, -0.00605010986328125, 0.021514892578125, -0.0281219482421875, 0.06646728515625, -0.032318115234375, -0.032562255859375, -0.0572509765625, 0.04638671875, -0.01238250732421875, -0.038421630859375, 0.05023193359375, 0.0489501953125, 0.051910400390625, -0.002574920654296875, 0.028045654296875, -0.015045166015625, 0.0028858184814453125, -0.039215087890625, 0.05523681640625, -0.056243896484375, -0.0016241073608398438, -0.025177001953125, -0.0623779296875, -0.031494140625, 0.05889892578125, -0.0170745849609375, 0.019622802734375, 0.04461669921875, 0.077392578125, -0.0174713134765625, -0.0225677490234375, 0.010650634765625, 0.01531982421875, 0.0109405517578125, 0.04443359375, 0.031402587890625, -0.05322265625, 0.0271148681640625, -0.046539306640625, -0.020965576171875, -0.0218963623046875, -0.047454833984375, -0.080322265625, -0.06256103515625, -0.042236328125, -0.051300048828125, -0.0128631591796875, 0.06817626953125, 0.0762939453125, -0.053436279296875, 0.0002263784408569336, -0.0014047622680664062, 0.0142822265625, -0.024169921875, -0.0180816650390625, 0.05828857421875, 0.0034465789794921875, -0.05645751953125, -0.0197601318359375, 0.0006098747253417969, 0.0242156982421875, -0.0042572021484375, -0.02996826171875, -0.007038116455078125, -0.01052093505859375, 
0.0174407958984375, 0.03314208984375, -0.057891845703125, -0.0283355712890625, -0.011016845703125, -0.004608154296875, 0.032958984375, 0.02569580078125, -0.043212890625, 0.02978515625, 0.03448486328125, 0.0074005126953125, 0.0723876953125, -0.0217132568359375, 0.0007176399230957031, -0.0595703125, 0.032012939453125, -0.0027008056640625, 0.03631591796875, 0.01482391357421875, -0.0175933837890625, 0.046875, 0.03570556640625, -0.037261962890625, -0.07769775390625, -0.008514404296875, -0.084716796875, -0.01062774658203125, 0.07611083984375, -0.025482177734375, -0.032012939453125, 0.02911376953125, -0.0177764892578125, 0.0291595458984375, -0.014556884765625, 0.033538818359375, 0.0287628173828125, -0.00238800048828125, -0.0460205078125, -0.041778564453125, 0.0338134765625, 0.01338958740234375, -0.044952392578125, -0.027618408203125, 0.0093994140625, 0.040557861328125, 0.0302886962890625, 0.039276123046875, -0.012420654296875, 0.01047515869140625, 0.01125335693359375, 0.0244598388671875, -0.0268402099609375, -0.012237548828125, -0.02545166015625, 0.0052642822265625, -0.0161590576171875, -0.040283203125 ] ]
hoskinson-center/proofGPT-v0.1
2023-02-15T19:08:47.000Z
[ "transformers", "pytorch", "gpt_neox", "text-generation", "text generation", "causal-lm", "en", "dataset:hoskinson-center/proof-pile", "license:mit", "endpoints_compatible", "text-generation-inference", "region:us" ]
text-generation
hoskinson-center
null
null
hoskinson-center/proofGPT-v0.1
3
9,291
transformers
2022-12-01T01:13:54
---
language:
- en
tags:
- text generation
- pytorch
- causal-lm
- gpt_neox
license: mit
datasets:
- hoskinson-center/proof-pile
---

# ProofGPT-v0.1

# Model Description

ProofGPT-v0.1 is a 1.3B parameter language model based on the GPT-NeoX architecture and trained on the [proof-pile](https://huggingface.co/datasets/hoskinson-center/proof-pile) (v1.1).

We initialized training with pythia-1.3b weights, a precursor to the [pythia-1.4b](https://huggingface.co/EleutherAI/pythia-1.3b) model that has roughly equivalent performance.

Detailed evaluations coming soon :)

**Note**: Commit `3bcdc4e` replaced the weights with a model trained on proof-pile v1.1, as opposed to previous commits, which were trained on v1.0. Commit `9695b51` updated the tokenizer to have bos, eos, and unk tokens.
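The "1.3B parameter" figure can be sanity-checked from standard GPT-NeoX transformer hyperparameters. The estimate below is a rough sketch assuming a pythia-1.4b-style configuration (hidden size 2048, 24 layers, ~50k vocabulary, untied embeddings); these numbers are illustrative assumptions, not taken from this model card:

```python
def gptneox_param_estimate(d_model, n_layers, vocab_size, tie_embeddings=False):
    """Rough transformer parameter count: 12*d^2 per block
    (attention QKV + output projections = 4*d^2, MLP with 4x expansion = 8*d^2),
    plus input and, if untied, output embedding matrices. Biases and
    layer norms are ignored, so this is a lower-bound approximation."""
    block = 12 * d_model ** 2
    embed = vocab_size * d_model * (1 if tie_embeddings else 2)
    return n_layers * block + embed

# assumed pythia-1.4b-style config (hypothetical illustration)
total = gptneox_param_estimate(d_model=2048, n_layers=24, vocab_size=50304)
print(f"{total / 1e9:.2f}B")  # roughly 1.4B total, ~1.2B excluding embeddings
```

Excluding the two embedding matrices leaves roughly 1.2B "non-embedding" parameters, which is consistent with a model marketed in the 1.3B range.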
795
[ [ -0.0305328369140625, -0.051422119140625, 0.03619384765625, 0.0121002197265625, -0.036163330078125, -0.0259552001953125, 0.007465362548828125, -0.02294921875, 0.01403045654296875, 0.036712646484375, -0.0268096923828125, -0.0289306640625, -0.043914794921875, -0.02032470703125, -0.0255279541015625, 0.10235595703125, -0.00804901123046875, 0.016143798828125, -0.0031986236572265625, -0.01117706298828125, -0.0026531219482421875, -0.050384521484375, -0.0574951171875, -0.0239105224609375, 0.03515625, 0.01454925537109375, 0.05462646484375, 0.047576904296875, 0.020477294921875, 0.00888824462890625, -0.0160064697265625, -0.046600341796875, -0.025604248046875, -0.030548095703125, 0.003353118896484375, -0.01233673095703125, -0.05670166015625, 0.00481414794921875, 0.057342529296875, 0.0293426513671875, -0.039337158203125, 0.017852783203125, -0.017852783203125, 0.04205322265625, -0.027618408203125, 0.007694244384765625, -0.0325927734375, -0.0095367431640625, -0.022918701171875, 0.01003265380859375, -0.040313720703125, -0.0202178955078125, 0.0115509033203125, -0.04998779296875, 0.0245361328125, -0.0020313262939453125, 0.08599853515625, 0.010711669921875, -0.0237274169921875, -0.01136016845703125, -0.032073974609375, 0.040283203125, -0.0701904296875, 0.037750244140625, 0.0295562744140625, 0.046356201171875, -0.0014162063598632812, -0.07415771484375, -0.02899169921875, -0.0228271484375, 0.0059356689453125, -0.0013093948364257812, -0.0142364501953125, 0.007472991943359375, 0.038909912109375, 0.04400634765625, -0.06243896484375, 0.0106201171875, -0.03857421875, -0.0225067138671875, 0.033111572265625, -0.01377105712890625, 0.01123046875, 0.0004172325134277344, -0.039337158203125, -0.00922393798828125, -0.07110595703125, -0.0013875961303710938, 0.03399658203125, 0.006805419921875, -0.03167724609375, 0.0273590087890625, -0.032196044921875, 0.061065673828125, 0.002063751220703125, 0.007335662841796875, 0.0132598876953125, -0.003047943115234375, -0.020477294921875, 0.0110015869140625, 
0.062744140625, -0.0007314682006835938, -0.003658294677734375, -0.01435089111328125, -0.01045989990234375, 0.0030841827392578125, 0.0045013427734375, -0.09478759765625, -0.052337646484375, 0.00342559814453125, -0.0352783203125, -0.028106689453125, 0.0285797119140625, -0.0234832763671875, -0.01861572265625, -0.00412750244140625, 0.044189453125, -0.04620361328125, -0.044830322265625, 0.036834716796875, -0.0038471221923828125, 0.033782958984375, 0.024322509765625, -0.06817626953125, 0.0506591796875, 0.059295654296875, 0.061126708984375, 0.0216522216796875, -0.0194854736328125, -0.032196044921875, -0.01708984375, 0.004871368408203125, 0.05279541015625, -0.00832366943359375, -0.0003871917724609375, -0.0139923095703125, -0.0059967041015625, 0.010833740234375, -0.02105712890625, 0.05853271484375, -0.0491943359375, 0.0251922607421875, -0.031707763671875, -0.059906005859375, -0.03143310546875, 0.036346435546875, -0.055328369140625, 0.0701904296875, 0.040252685546875, -0.0670166015625, 0.041046142578125, -0.03564453125, -0.0003566741943359375, 0.0172882080078125, 0.01174163818359375, -0.048187255859375, -0.001430511474609375, 0.01080322265625, 0.01235198974609375, -0.031494140625, -0.0082244873046875, -0.0210418701171875, -0.03192138671875, -0.00771331787109375, -0.01561737060546875, 0.07476806640625, 0.00946807861328125, -0.01953125, 0.01532745361328125, -0.056488037109375, 0.01061248779296875, 0.0205535888671875, -0.02960205078125, 0.0006546974182128906, -0.003139495849609375, -0.0023345947265625, 0.02288818359375, 0.030181884765625, -0.041107177734375, 0.03955078125, -0.0274658203125, 0.045166015625, 0.03668212890625, -0.006397247314453125, 0.00997161865234375, -0.02099609375, 0.040924072265625, -0.0028514862060546875, 0.01197052001953125, -0.0033817291259765625, -0.0435791015625, -0.035400390625, -0.034088134765625, 0.04888916015625, 0.031524658203125, -0.0286865234375, 0.023406982421875, -0.017486572265625, -0.06585693359375, -0.00409698486328125, -0.016143798828125, 
0.027069091796875, 0.0155181884765625, 0.024871826171875, -0.0163116455078125, -0.049957275390625, -0.056976318359375, 0.0004706382751464844, -0.0523681640625, 0.0016241073608398438, -0.018157958984375, 0.04132080078125, 0.003261566162109375, 0.067626953125, -0.0161590576171875, 0.029388427734375, -0.0234832763671875, 0.020111083984375, 0.036468505859375, 0.041229248046875, 0.0523681640625, -0.042022705078125, -0.0234375, -0.0025196075439453125, -0.04241943359375, -0.007049560546875, 0.0117034912109375, -0.0030517578125, 0.00849151611328125, 0.0194244384765625, -0.0555419921875, 0.0335693359375, 0.06378173828125, -0.05810546875, 0.064453125, -0.00927734375, -0.0077056884765625, -0.0758056640625, -0.0171661376953125, 0.00024366378784179688, -0.03155517578125, -0.028533935546875, 0.0210418701171875, 0.015838623046875, 0.00661468505859375, -0.046295166015625, 0.045654296875, -0.0311126708984375, -0.004596710205078125, -0.016357421875, 0.0002808570861816406, -0.0170745849609375, 0.0307464599609375, 0.00017726421356201172, 0.041748046875, 0.077392578125, -0.056671142578125, 0.038421630859375, 0.01529693603515625, -0.0022430419921875, 0.0182952880859375, -0.0787353515625, 0.027740478515625, 0.003757476806640625, 0.0187530517578125, -0.059295654296875, -0.0029048919677734375, 0.033966064453125, -0.0294647216796875, 0.01739501953125, -0.04840087890625, -0.04620361328125, -0.033203125, -0.02166748046875, 0.045501708984375, 0.045257568359375, -0.0521240234375, 0.03094482421875, 0.016632080078125, 0.00421142578125, -0.01409912109375, -0.0287017822265625, -0.0081024169921875, -0.0150299072265625, -0.05255126953125, 0.0157470703125, -0.003139495849609375, -0.01751708984375, -0.01104736328125, -0.01464080810546875, 0.000048100948333740234, -0.002918243408203125, 0.01198577880859375, 0.01491546630859375, -0.0227508544921875, -0.01209259033203125, -0.0052642822265625, -0.02508544921875, 0.01102447509765625, -0.05560302734375, 0.058319091796875, -0.034088134765625, 
-0.023223876953125, -0.0227508544921875, -0.01296234130859375, 0.0345458984375, -0.0145416259765625, 0.06396484375, 0.04046630859375, -0.042877197265625, 0.0149688720703125, -0.0283660888671875, -0.0154571533203125, -0.0306549072265625, 0.02984619140625, -0.037628173828125, -0.06829833984375, 0.044830322265625, 0.004665374755859375, -0.002285003662109375, 0.041900634765625, 0.03887939453125, 0.030059814453125, 0.07550048828125, 0.035400390625, -0.01251220703125, 0.0262298583984375, -0.0175323486328125, 0.0118560791015625, -0.06170654296875, 0.01134490966796875, -0.036407470703125, -0.006343841552734375, -0.0657958984375, -0.0184783935546875, 0.01403045654296875, 0.057861328125, -0.045867919921875, 0.060882568359375, -0.032470703125, 0.033172607421875, 0.044769287109375, 0.019439697265625, 0.0011796951293945312, -0.0131072998046875, 0.002285003662109375, 0.006771087646484375, -0.06536865234375, -0.055145263671875, 0.08282470703125, 0.04486083984375, 0.03350830078125, 0.023193359375, 0.05059814453125, -0.0185394287109375, 0.047821044921875, -0.0457763671875, 0.048095703125, 0.01055145263671875, -0.055908203125, -0.0178680419921875, -0.038055419921875, -0.052642822265625, 0.0208587646484375, -0.003536224365234375, -0.0631103515625, -0.023345947265625, 0.0193328857421875, -0.006839752197265625, 0.035797119140625, -0.048583984375, 0.09765625, -0.01435089111328125, -0.01446533203125, -0.0208282470703125, -0.0267181396484375, 0.0211181640625, 0.033416748046875, -0.0112457275390625, -0.002849578857421875, 0.0210113525390625, 0.05682373046875, -0.04742431640625, 0.0310211181640625, -0.037628173828125, -0.0107574462890625, 0.032470703125, 0.0255889892578125, 0.06634521484375, 0.030426025390625, -0.0166473388671875, 0.0305938720703125, 0.016510009765625, -0.053619384765625, -0.0230560302734375, 0.042510986328125, -0.05035400390625, -0.02288818359375, -0.041839599609375, -0.0267486572265625, 0.0038776397705078125, 0.0229339599609375, 0.024810791015625, 0.031280517578125, 
-0.02947998046875, 0.0028133392333984375, 0.057220458984375, -0.000675201416015625, 0.030181884765625, 0.045501708984375, -0.040313720703125, -0.029083251953125, 0.05682373046875, -0.006103515625, 0.0217437744140625, 0.024993896484375, 0.030914306640625, -0.0174713134765625, -0.059173583984375, -0.043487548828125, 0.0248260498046875, -0.0430908203125, -0.00817108154296875, -0.031585693359375, -0.0170440673828125, -0.0399169921875, 0.00771331787109375, -0.036468505859375, -0.041717529296875, 0.0116119384765625, -0.02520751953125, 0.041595458984375, 0.06005859375, 0.006805419921875, 0.042327880859375, -0.05218505859375, 0.0112762451171875, 0.03228759765625, 0.050079345703125, -0.037139892578125, -0.061309814453125, -0.028076171875, -0.0005087852478027344, -0.0219268798828125, -0.08013916015625, 0.01397705078125, -0.00946044921875, 0.033294677734375, 0.03668212890625, -0.0312347412109375, 0.0203094482421875, -0.0246734619140625, 0.0655517578125, 0.00713348388671875, -0.07696533203125, 0.0323486328125, -0.05096435546875, 0.016021728515625, 0.026947021484375, 0.017974853515625, -0.031829833984375, -0.005340576171875, -0.0687255859375, -0.057373046875, 0.051849365234375, 0.0224151611328125, -0.0201568603515625, 0.0060577392578125, 0.04278564453125, 0.0182952880859375, 0.01316070556640625, -0.0771484375, -0.010772705078125, -0.03814697265625, -0.01300048828125, 0.0182952880859375, -0.033477783203125, 0.00992584228515625, -0.0282440185546875, 0.06884765625, -0.006343841552734375, 0.0146026611328125, 0.004405975341796875, -0.006496429443359375, 0.0017309188842773438, 0.0096435546875, 0.046630859375, 0.07037353515625, -0.034393310546875, 0.0015935897827148438, 0.01235198974609375, -0.042144775390625, -0.002780914306640625, 0.036590576171875, -0.0274200439453125, -0.004695892333984375, 0.0078277587890625, 0.08502197265625, 0.00853729248046875, -0.0217437744140625, 0.0258026123046875, -0.01236724853515625, -0.03570556640625, -0.02899169921875, -0.007549285888671875, 
-0.0007748603820800781, -0.000032901763916015625, 0.001461029052734375, -0.00727081298828125, 0.00724029541015625, -0.0238037109375, 0.0197601318359375, 0.0239105224609375, -0.0286102294921875, -0.04681396484375, 0.047882080078125, -0.0139923095703125, -0.016845703125, 0.036529541015625, -0.060699462890625, -0.045684814453125, 0.024322509765625, 0.054046630859375, 0.06146240234375, -0.0082244873046875, 0.03192138671875, 0.0537109375, 0.03173828125, -0.0255584716796875, 0.020355224609375, -0.0010223388671875, -0.05047607421875, -0.0296783447265625, -0.055816650390625, -0.0343017578125, -0.004276275634765625, -0.03857421875, 0.036163330078125, -0.0594482421875, 0.01236724853515625, -0.029541015625, 0.000701904296875, -0.030853271484375, 0.0231170654296875, 0.0125579833984375, 0.0762939453125, -0.052642822265625, 0.0643310546875, 0.077880859375, -0.037261962890625, -0.08258056640625, -0.00759124755859375, -0.00849151611328125, -0.03912353515625, 0.0248260498046875, 0.0052642822265625, 0.014739990234375, 0.0273590087890625, -0.05914306640625, -0.07049560546875, 0.08538818359375, 0.042266845703125, -0.040496826171875, 0.01012420654296875, -0.009674072265625, 0.02685546875, -0.0096588134765625, 0.0246734619140625, 0.035308837890625, 0.034210205078125, -0.00443267822265625, -0.0762939453125, 0.01374053955078125, -0.0243682861328125, -0.0029850006103515625, 0.035064697265625, -0.04510498046875, 0.081298828125, -0.0007014274597167969, -0.0068206787109375, 0.009613037109375, 0.062225341796875, 0.04608154296875, 0.001049041748046875, 0.044189453125, 0.08782958984375, 0.0457763671875, -0.0335693359375, 0.08978271484375, -0.030120849609375, 0.050567626953125, 0.07379150390625, -0.01015472412109375, 0.044097900390625, 0.03265380859375, -0.03533935546875, 0.05615234375, 0.045501708984375, 0.01303863525390625, 0.0239410400390625, 0.0287322998046875, -0.01483154296875, -0.02203369140625, 0.026123046875, -0.042083740234375, 0.0171051025390625, 0.00473785400390625, 
-0.005733489990234375, -0.01490020751953125, -0.025299072265625, 0.0100860595703125, -0.019622802734375, -0.030487060546875, 0.040618896484375, -0.01134490966796875, -0.034088134765625, 0.036956787109375, 0.0188751220703125, 0.038665771484375, -0.053314208984375, -0.00974273681640625, 0.00659942626953125, 0.01458740234375, 0.0023365020751953125, -0.043212890625, 0.00323486328125, -0.0181732177734375, -0.0094757080078125, -0.0114898681640625, 0.054962158203125, -0.019073486328125, -0.043731689453125, 0.004787445068359375, 0.01377105712890625, 0.018096923828125, -0.0380859375, -0.051055908203125, 0.0007576942443847656, -0.0279083251953125, -0.036773681640625, 0.005489349365234375, 0.037445068359375, -0.007671356201171875, 0.03656005859375, 0.042144775390625, 0.018829345703125, 0.01486968994140625, 0.030975341796875, 0.055572509765625, -0.04449462890625, -0.054840087890625, -0.060638427734375, 0.042266845703125, 0.0040130615234375, -0.0401611328125, 0.057159423828125, 0.04461669921875, 0.07342529296875, -0.0018167495727539062, 0.049346923828125, -0.04302978515625, 0.03033447265625, -0.04864501953125, 0.034942626953125, -0.0103607177734375, 0.0024280548095703125, -0.020233154296875, -0.06658935546875, -0.0104217529296875, 0.05615234375, -0.0185699462890625, 0.0204620361328125, 0.06494140625, 0.05548095703125, 0.00550079345703125, 0.01229095458984375, 0.01131439208984375, 0.022491455078125, 0.026458740234375, 0.050628662109375, 0.04248046875, -0.0806884765625, 0.0306854248046875, -0.0253753662109375, -0.0225982666015625, -0.0009822845458984375, -0.07476806640625, -0.0599365234375, -0.03369140625, -0.037750244140625, -0.0506591796875, 0.003925323486328125, 0.068603515625, 0.0697021484375, -0.05682373046875, -0.0071868896484375, -0.05255126953125, -0.004970550537109375, -0.03173828125, -0.01354217529296875, 0.0193328857421875, 0.005237579345703125, -0.028289794921875, 0.01678466796875, 0.0175323486328125, 0.008148193359375, -0.0236968994140625, -0.047576904296875, 
-0.01459503173828125, -0.015869140625, 0.0231781005859375, 0.02899169921875, -0.035491943359375, -0.027069091796875, -0.0167999267578125, -0.0012531280517578125, 0.0037593841552734375, 0.067138671875, -0.055389404296875, 0.0050201416015625, 0.05059814453125, 0.0164337158203125, 0.05194091796875, 0.0005774497985839844, 0.037689208984375, -0.048614501953125, 0.036834716796875, 0.0115966796875, 0.053802490234375, 0.0307769775390625, -0.01418304443359375, 0.03533935546875, 0.032318115234375, -0.050933837890625, -0.046112060546875, 0.0277099609375, -0.079345703125, 0.006679534912109375, 0.08099365234375, -0.0080718994140625, -0.0178375244140625, 0.0146026611328125, -0.03179931640625, 0.029296875, -0.0035858154296875, 0.048614501953125, 0.059722900390625, 0.01561737060546875, -0.0338134765625, -0.04278564453125, 0.0303497314453125, 0.035491943359375, -0.051666259765625, 0.005859375, 0.0307769775390625, 0.0282135009765625, 0.0035381317138671875, 0.05853271484375, -0.03167724609375, 0.04541015625, 0.0102691650390625, 0.0260009765625, -0.0165863037109375, -0.02020263671875, -0.019561767578125, 0.006603240966796875, 0.027099609375, -0.0015802383422851562 ] ]
hyunseoki/ko-ref-llama2-7b
2023-10-04T07:22:05.000Z
[ "transformers", "pytorch", "llama", "text-generation", "ko", "endpoints_compatible", "text-generation-inference", "region:us" ]
text-generation
hyunseoki
null
null
hyunseoki/ko-ref-llama2-7b
3
9,287
transformers
2023-10-04T07:05:50
---
language:
- ko
library_name: transformers
pipeline_tag: text-generation
---

**Model Developers** HyunseokLee, TaeyoungKim (kaist alinlab, omnious.ai)

**Input** Models input text only.

**Output** Models generate text only.

**Model Architecture** ko-ref-llama2-7b is an auto-regressive language model based on the LLaMA2 transformer architecture.

**Base Model** Llama-2-7B

**Training Dataset** Open dataset (Korean).

**Training Objective** We trained the model on a Korean corpus.
503
[ [ 0.004730224609375, -0.05206298828125, 0.035400390625, 0.030426025390625, -0.021514892578125, 0.016876220703125, 0.000690460205078125, -0.0291290283203125, 0.00814056396484375, 0.05712890625, -0.03173828125, -0.03759765625, -0.039794921875, 0.0107269287109375, -0.0276031494140625, 0.0677490234375, -0.0169525146484375, 0.0198822021484375, 0.007061004638671875, 0.0032825469970703125, -0.04931640625, -0.0308685302734375, -0.045135498046875, -0.0279998779296875, 0.018035888671875, 0.05596923828125, 0.05084228515625, 0.06915283203125, 0.032135009765625, 0.023193359375, 0.0106353759765625, -0.020294189453125, -0.05084228515625, 0.0189666748046875, -0.0035343170166015625, -0.06231689453125, -0.03338623046875, -0.019989013671875, 0.037933349609375, 0.01031494140625, -0.0100250244140625, 0.0369873046875, -0.02294921875, 0.038116455078125, -0.029937744140625, 0.03656005859375, -0.05194091796875, -0.01393890380859375, -0.01617431640625, 0.033447265625, -0.035552978515625, -0.03204345703125, -0.016387939453125, -0.029754638671875, -0.0008311271667480469, 0.006366729736328125, 0.07305908203125, 0.03607177734375, -0.03814697265625, -0.031585693359375, -0.047271728515625, 0.04638671875, -0.054046630859375, 0.038116455078125, 0.04510498046875, 0.026214599609375, -0.0147247314453125, -0.05523681640625, -0.04034423828125, -0.018890380859375, 0.0011577606201171875, 0.007801055908203125, 0.0017852783203125, -0.01171112060546875, -0.0045318603515625, 0.018951416015625, -0.038330078125, 0.0330810546875, -0.061859130859375, -0.01093292236328125, 0.06060791015625, 0.006893157958984375, 0.0185699462890625, -0.0214385986328125, -0.0203857421875, -0.0114898681640625, -0.0445556640625, -0.0111083984375, 0.030242919921875, 0.01387786865234375, -0.0203704833984375, 0.0675048828125, -0.0222320556640625, 0.027069091796875, 0.0161285400390625, -0.03509521484375, 0.03155517578125, -0.006389617919921875, -0.0308837890625, 0.0181427001953125, 0.059234619140625, 0.008148193359375, 
0.00965118408203125, -0.0174560546875, -0.0186309814453125, 0.0238800048828125, 0.01531982421875, -0.0297393798828125, -0.016021728515625, 0.01271820068359375, -0.056427001953125, -0.05218505859375, -0.01102447509765625, -0.05804443359375, -0.01374053955078125, -0.00731658935546875, 0.0254669189453125, -0.006542205810546875, -0.04119873046875, -0.003040313720703125, 0.01084136962890625, 0.01383209228515625, -0.004779815673828125, -0.0501708984375, 0.0247344970703125, 0.0127410888671875, 0.03594970703125, -0.0078582763671875, -0.025054931640625, 0.004047393798828125, 0.0023860931396484375, -0.033416748046875, 0.0592041015625, -0.0237579345703125, -0.0350341796875, 0.0073089599609375, 0.01453399658203125, -0.006000518798828125, -0.044921875, 0.052703857421875, -0.054168701171875, 0.01947021484375, 0.011810302734375, -0.03179931640625, -0.033966064453125, -0.01019287109375, -0.0662841796875, 0.10595703125, 0.035797119140625, -0.0229644775390625, 0.0033855438232421875, -0.056365966796875, -0.014617919921875, -0.0106964111328125, 0.004093170166015625, -0.03009033203125, 0.01543426513671875, -0.0142364501953125, 0.019622802734375, -0.031829833984375, 0.04534912109375, -0.0052032470703125, 0.0013360977172851562, 0.004611968994140625, -0.0097503662109375, 0.04241943359375, 0.030792236328125, -0.016754150390625, 0.005825042724609375, -0.10302734375, -0.0089263916015625, 0.050262451171875, -0.0379638671875, -0.0095367431640625, -0.002277374267578125, 0.0176849365234375, 0.028106689453125, 0.027740478515625, -0.042083740234375, 0.0232696533203125, -0.0235748291015625, 0.0137939453125, 0.046844482421875, -0.02001953125, 0.046722412109375, -0.01520538330078125, 0.055755615234375, 0.00817108154296875, -0.0143585205078125, -0.01343536376953125, -0.03167724609375, -0.08331298828125, -0.0083160400390625, 0.0258941650390625, 0.0672607421875, -0.048370361328125, 0.022735595703125, -0.0011653900146484375, -0.05987548828125, -0.058258056640625, 0.01861572265625, 0.041046142578125, 
0.0401611328125, 0.01324462890625, 0.00951385498046875, -0.06219482421875, -0.0797119140625, 0.00936126708984375, -0.035552978515625, 0.0085296630859375, 0.007053375244140625, 0.043304443359375, -0.02899169921875, 0.049346923828125, -0.026824951171875, -0.00527191162109375, -0.036712646484375, -0.0016307830810546875, 0.0197601318359375, 0.0292510986328125, 0.03656005859375, -0.043701171875, -0.042755126953125, -0.0079193115234375, -0.054473876953125, -0.05059814453125, -0.0054779052734375, -0.023193359375, 0.033843994140625, 0.044342041015625, -0.05804443359375, 0.040008544921875, 0.05517578125, -0.0311279296875, 0.033905029296875, 0.01544952392578125, -0.016357421875, -0.10614013671875, -0.0017023086547851562, -0.0283203125, -0.0183868408203125, -0.04541015625, 0.0023059844970703125, 0.00736236572265625, 0.006561279296875, -0.0242462158203125, 0.05364990234375, -0.0277099609375, -0.006809234619140625, -0.0364990234375, 0.0035266876220703125, -0.005382537841796875, 0.029510498046875, 0.0085906982421875, 0.0496826171875, 0.0190582275390625, -0.052703857421875, 0.0287933349609375, 0.04498291015625, -0.0237884521484375, 0.00992584228515625, -0.059173583984375, 0.0201416015625, -0.01033782958984375, 0.01007080078125, -0.09002685546875, -0.038604736328125, 0.03375244140625, -0.03643798828125, 0.0165252685546875, 0.00620269775390625, -0.04595947265625, -0.04345703125, -0.0016117095947265625, 0.0235443115234375, 0.04840087890625, -0.0379638671875, 0.047943115234375, 0.0220489501953125, -0.018218994140625, -0.0275421142578125, -0.05096435546875, -0.00806427001953125, -0.0023250579833984375, -0.034332275390625, 0.01496124267578125, -0.0016927719116210938, -0.019287109375, 0.0014791488647460938, 0.018646240234375, -0.0194091796875, 0.005588531494140625, 0.028106689453125, 0.045745849609375, -0.033843994140625, 0.0123138427734375, 0.0204925537109375, -0.00519561767578125, -0.0006480216979980469, 0.0172882080078125, 0.060699462890625, 0.00916290283203125, -0.0161285400390625, 
-0.037750244140625, 0.00965118408203125, 0.031097412109375, 0.005847930908203125, 0.041168212890625, 0.032501220703125, -0.016845703125, 0.007427215576171875, -0.04193115234375, 0.0168304443359375, -0.0352783203125, 0.0648193359375, -0.047607421875, -0.05120849609375, 0.04595947265625, -0.01226806640625, 0.0024394989013671875, 0.066162109375, 0.08294677734375, 0.01812744140625, 0.06732177734375, 0.060546875, -0.014404296875, 0.01316070556640625, 0.00859832763671875, 0.01177215576171875, -0.058807373046875, -0.036041259765625, -0.044891357421875, -0.01068878173828125, -0.038818359375, -0.01222991943359375, -0.01959228515625, 0.03204345703125, -0.040740966796875, 0.046173095703125, -0.032958984375, 0.0237579345703125, 0.049774169921875, -0.01898193359375, 0.00963592529296875, -0.01065826416015625, -0.00870513916015625, 0.020599365234375, -0.052276611328125, -0.061614990234375, 0.0947265625, 0.050018310546875, 0.07110595703125, 0.011199951171875, 0.044708251953125, 0.00936126708984375, 0.02734375, -0.052978515625, 0.0313720703125, 0.0109100341796875, -0.05340576171875, -0.00339508056640625, -0.001491546630859375, -0.0645751953125, -0.00016891956329345703, 0.007228851318359375, -0.051788330078125, 0.01036834716796875, 0.0160675048828125, -0.0012655258178710938, 0.01314544677734375, -0.055023193359375, 0.0462646484375, -0.0291595458984375, 0.0158233642578125, 0.0160369873046875, -0.041900634765625, 0.038848876953125, -0.0221099853515625, 0.0087890625, -0.0093231201171875, -0.00598907470703125, 0.0482177734375, 0.0010595321655273438, 0.05987548828125, -0.0083770751953125, -0.01143646240234375, 0.033203125, 0.023284912109375, 0.0369873046875, -0.0023593902587890625, 0.0051116943359375, 0.019378662109375, -0.01413726806640625, -0.00853729248046875, -0.01849365234375, 0.038818359375, -0.07861328125, -0.034881591796875, -0.0211639404296875, -0.0352783203125, -0.007598876953125, 0.024322509765625, 0.0241546630859375, -0.02032470703125, -0.0191802978515625, 
-0.001895904541015625, 0.0223541259765625, -0.020599365234375, 0.032928466796875, 0.051727294921875, -0.043975830078125, -0.0572509765625, 0.047027587890625, -0.005702972412109375, 0.0238494873046875, 0.012420654296875, 0.0071563720703125, -0.0306396484375, -0.0173492431640625, -0.050537109375, 0.031097412109375, -0.056182861328125, -0.039031982421875, -0.042938232421875, -0.0374755859375, -0.053619384765625, 0.007053375244140625, -0.048004150390625, -0.035614013671875, -0.040191650390625, -0.024566650390625, 0.01183319091796875, 0.0899658203125, 0.0012302398681640625, 0.061798095703125, -0.05474853515625, 0.0299835205078125, 0.0027637481689453125, 0.0218505859375, -0.00696563720703125, -0.050628662109375, -0.016143798828125, -0.00002193450927734375, -0.0282440185546875, -0.0650634765625, 0.052215576171875, 0.01497650146484375, 0.043701171875, 0.021392822265625, -0.01580810546875, 0.0264434814453125, -0.0458984375, 0.06597900390625, 0.00643157958984375, -0.05718994140625, 0.034881591796875, -0.00885009765625, 0.032806396484375, 0.0269012451171875, 0.02337646484375, -0.02197265625, -0.01425933837890625, -0.044219970703125, -0.04547119140625, 0.062408447265625, 0.01328277587890625, 0.01422882080078125, -0.00472259521484375, 0.039581298828125, 0.021087646484375, 0.025482177734375, -0.07806396484375, -0.0185394287109375, -0.036285400390625, -0.055633544921875, 0.00698089599609375, -0.0523681640625, 0.0177764892578125, 0.0007758140563964844, 0.057830810546875, 0.00278472900390625, 0.044464111328125, 0.006923675537109375, -0.0167999267578125, -0.0245361328125, -0.0014801025390625, 0.055633544921875, 0.0110626220703125, -0.004299163818359375, -0.011474609375, 0.0015726089477539062, -0.051177978515625, 0.0230712890625, -0.0091400146484375, -0.03460693359375, 0.00954437255859375, 0.022979736328125, 0.08489990234375, 0.002124786376953125, -0.044952392578125, -0.002166748046875, 0.0148773193359375, -0.0256805419921875, -0.0421142578125, 0.0035839080810546875, 
0.00728607177734375, 0.029571533203125, 0.01113128662109375, -0.0087432861328125, 0.0006084442138671875, -0.0194091796875, -0.0019283294677734375, -0.005527496337890625, -0.007114410400390625, -0.04180908203125, 0.0679931640625, 0.0009965896606445312, -0.035247802734375, 0.04638671875, -0.0058441162109375, -0.049957275390625, 0.0601806640625, 0.09063720703125, 0.066162109375, -0.046539306640625, 0.017120361328125, 0.036773681640625, 0.0202484130859375, -0.019744873046875, 0.04547119140625, 0.0291595458984375, -0.05950927734375, -0.016998291015625, -0.05242919921875, -0.00848388671875, 0.0284423828125, -0.04736328125, 0.035797119140625, -0.0228271484375, -0.0136260986328125, -0.004512786865234375, -0.0155487060546875, -0.042449951171875, 0.016998291015625, 0.01316070556640625, 0.06878662109375, -0.058929443359375, 0.07550048828125, 0.05712890625, -0.041748046875, -0.054534912109375, -0.0268402099609375, -0.007720947265625, -0.0953369140625, 0.09234619140625, 0.00019931793212890625, 0.01446533203125, 0.00806427001953125, -0.06524658203125, -0.10693359375, 0.08197021484375, 0.013153076171875, -0.039794921875, -0.0097198486328125, 0.01605224609375, 0.0270233154296875, -0.0491943359375, -0.0003001689910888672, 0.035614013671875, 0.0298004150390625, -0.00020611286163330078, -0.083251953125, -0.021514892578125, -0.0277557373046875, 0.0161895751953125, -0.005420684814453125, -0.0697021484375, 0.0689697265625, -0.00153350830078125, -0.00445556640625, 0.03973388671875, 0.056793212890625, 0.0183868408203125, 0.0061187744140625, 0.0323486328125, 0.047088623046875, 0.0401611328125, 0.01280975341796875, 0.05621337890625, -0.035247802734375, 0.021514892578125, 0.10943603515625, -0.034027099609375, 0.0728759765625, 0.01189422607421875, -0.01265716552734375, 0.04852294921875, 0.062408447265625, -0.02392578125, 0.049652099609375, 0.01236724853515625, 0.0170135498046875, 0.006755828857421875, -0.004913330078125, -0.0211181640625, 0.06488037109375, 0.0197601318359375, 
-0.049896240234375, 0.00269317626953125, -0.0018606185913085938, 0.0285491943359375, -0.0080413818359375, -0.019622802734375, 0.05670166015625, -0.008575439453125, -0.040008544921875, 0.0277557373046875, 0.0272674560546875, 0.031158447265625, -0.044281005859375, -0.00986480712890625, -0.01113128662109375, 0.0085601806640625, -0.0027618408203125, -0.033538818359375, 0.0208282470703125, 0.00191497802734375, -0.033843994140625, 0.0172119140625, 0.0794677734375, -0.04168701171875, -0.0557861328125, 0.021209716796875, 0.027130126953125, 0.00739288330078125, 0.0153961181640625, -0.05438232421875, 0.01788330078125, 0.0034637451171875, -0.0323486328125, 0.0118408203125, 0.016998291015625, -0.0188140869140625, 0.056640625, 0.035736083984375, 0.0003161430358886719, 0.021820068359375, -0.0023174285888671875, 0.063232421875, -0.036285400390625, -0.0160369873046875, -0.0531005859375, 0.028289794921875, 0.00027751922607421875, -0.0328369140625, 0.050384521484375, 0.047821044921875, 0.06524658203125, -0.0247039794921875, 0.06207275390625, -0.00998687744140625, 0.056976318359375, -0.0240936279296875, 0.045745849609375, -0.0171356201171875, -0.0115814208984375, -0.00214385986328125, -0.07122802734375, -0.0105133056640625, 0.06103515625, 0.007190704345703125, -0.007190704345703125, 0.039794921875, 0.04498291015625, 0.0145263671875, -0.01305389404296875, 0.0311279296875, 0.044708251953125, 0.003963470458984375, 0.0224609375, 0.045257568359375, -0.05584716796875, 0.0305328369140625, -0.02789306640625, -0.01221466064453125, -0.01141357421875, -0.045318603515625, -0.0772705078125, -0.035552978515625, -0.00855255126953125, -0.045196533203125, -0.00815582275390625, 0.06890869140625, 0.0198822021484375, -0.06243896484375, -0.0309906005859375, -0.0028076171875, -0.0051116943359375, 0.02313232421875, -0.0176849365234375, 0.035400390625, -0.026611328125, -0.049163818359375, 0.0287933349609375, -0.002307891845703125, 0.0176849365234375, -0.039031982421875, -0.007579803466796875, 
-0.01396942138671875, 0.00856781005859375, 0.041900634765625, 0.016845703125, -0.062225341796875, 0.0034122467041015625, 0.0235137939453125, -0.01490020751953125, 0.0123138427734375, 0.0264892578125, -0.051177978515625, 0.019927978515625, 0.0232696533203125, 0.040283203125, 0.0125274658203125, -0.0012559890747070312, 0.050994873046875, -0.020233154296875, 0.023529052734375, -0.00167083740234375, 0.01654052734375, 0.0104217529296875, -0.039825439453125, 0.05059814453125, 0.0203857421875, -0.0523681640625, -0.054107666015625, 0.01029205322265625, -0.06402587890625, -0.00580596923828125, 0.10540771484375, 0.0014867782592773438, -0.040496826171875, -0.01342010498046875, -0.047149658203125, 0.034423828125, -0.028411865234375, 0.05615234375, 0.052825927734375, 0.001922607421875, -0.0181884765625, -0.04791259765625, 0.0282440185546875, 0.002117156982421875, -0.046630859375, -0.021087646484375, 0.0277557373046875, 0.0242156982421875, 0.01340484619140625, 0.0465087890625, -0.007457733154296875, 0.01186370849609375, 0.0169525146484375, 0.03253173828125, -0.00325775146484375, -0.0236663818359375, -0.01180267333984375, -0.01271820068359375, 0.0029964447021484375, -0.0221405029296875 ] ]
Yntec/CyberRealistic
2023-10-17T04:07:07.000Z
[ "diffusers", "Photorealistic", "Highly Detailed", "Beautiful", "Cyberdelia", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
Yntec
null
null
Yntec/CyberRealistic
1
9,270
diffusers
2023-10-16T21:44:55
---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- Photorealistic
- Highly Detailed
- Beautiful
- Cyberdelia
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
---

# CyberRealistic V2

Original page: https://civitai.com/models/15003?modelVersionId=55015

Sample and prompt:

![Sample](https://cdn-uploads.huggingface.co/production/uploads/63239b8370edc53f51cd5d42/wGg_EOOj8jUraAIzridx8.png)

An bartolomé esteban murillo Pretty CUTE Girl buying A room made of pizza. called the pizza room, amazing detailed artwork, Full body pose, working in a bakery by graffiti. cookie cooking a bunch of cookies, in the kitchen, DETAILED CHIBI EYES, technicolor, painterly, logo, farmer, elegant, highly detailed, digital art, hyperrealistic
798
[ [ -0.027801513671875, -0.0672607421875, 0.024444580078125, 0.016387939453125, -0.01192474365234375, 0.01702880859375, 0.01502227783203125, -0.056671142578125, 0.06573486328125, 0.04400634765625, -0.06036376953125, -0.031890869140625, -0.038299560546875, -0.00255584716796875, -0.0355224609375, 0.0494384765625, 0.007904052734375, 0.0162200927734375, -0.029022216796875, 0.034515380859375, -0.05804443359375, -0.0132904052734375, -0.0640869140625, -0.01279449462890625, 0.035888671875, 0.04754638671875, 0.050994873046875, 0.016326904296875, -0.0038604736328125, 0.027618408203125, 0.01453399658203125, -0.0089263916015625, -0.040863037109375, -0.0011501312255859375, -0.020233154296875, -0.05389404296875, -0.042236328125, 0.01275634765625, 0.0083160400390625, 0.007144927978515625, -0.0096435546875, 0.006702423095703125, -0.01096343994140625, 0.04498291015625, -0.01207733154296875, 0.013702392578125, -0.00959014892578125, 0.007137298583984375, -0.003124237060546875, 0.01180267333984375, -0.0032825469970703125, -0.039459228515625, -0.024139404296875, -0.09051513671875, 0.00858306884765625, -0.0020961761474609375, 0.09527587890625, 0.007083892822265625, -0.0106353759765625, -0.0204620361328125, -0.05303955078125, 0.0384521484375, -0.0165557861328125, 0.0408935546875, 0.04296875, 0.0550537109375, -0.01543426513671875, -0.07916259765625, -0.040283203125, 0.0187835693359375, -0.0027256011962890625, 0.0215301513671875, -0.04754638671875, -0.0411376953125, 0.02215576171875, 0.0020542144775390625, -0.0570068359375, 0.003978729248046875, -0.04559326171875, 0.0081329345703125, 0.040679931640625, 0.0253143310546875, 0.0369873046875, 0.02227783203125, -0.044097900390625, -0.026123046875, -0.03155517578125, 0.011688232421875, 0.0406494140625, -0.00860595703125, -0.034698486328125, 0.03668212890625, -0.01074981689453125, 0.03912353515625, 0.005252838134765625, -0.0020275115966796875, 0.01035308837890625, -0.0296173095703125, -0.01157379150390625, -0.022003173828125, 0.03973388671875, 
0.05169677734375, 0.0284423828125, 0.0302886962890625, 0.0023651123046875, 0.01444244384765625, 0.04931640625, -0.0716552734375, -0.037109375, 0.037811279296875, -0.0257568359375, -0.043853759765625, 0.01934814453125, -0.0740966796875, -0.0206298828125, -0.0225982666015625, -0.015716552734375, -0.016876220703125, -0.034271240234375, 0.00913238525390625, -0.005573272705078125, 0.005016326904296875, 0.0272216796875, -0.05181884765625, 0.0239715576171875, 0.033447265625, 0.048583984375, 0.0477294921875, 0.0164642333984375, 0.00904083251953125, -0.005657196044921875, -0.062225341796875, 0.06463623046875, -0.017608642578125, -0.03765869140625, -0.0164031982421875, 0.0211029052734375, 0.0309295654296875, -0.0457763671875, 0.03265380859375, -0.0189361572265625, 0.0111236572265625, -0.0231170654296875, -0.0053863525390625, -0.0244598388671875, 0.00969696044921875, -0.045166015625, 0.05926513671875, 0.02581787109375, -0.032257080078125, 0.0286712646484375, -0.043304443359375, -0.00911712646484375, 0.035888671875, -0.0218048095703125, -0.0309600830078125, 0.03350830078125, -0.0142364501953125, 0.01605224609375, -0.00653076171875, -0.0028209686279296875, -0.046722412109375, -0.015167236328125, 0.04156494140625, 0.00115966796875, 0.06451416015625, 0.0479736328125, -0.0165252685546875, -0.004589080810546875, -0.0462646484375, 0.0186614990234375, 0.05645751953125, 0.011199951171875, -0.01446533203125, -0.01078033447265625, 0.0305938720703125, 0.0396728515625, 0.034515380859375, -0.03704833984375, 0.0277252197265625, -0.0004229545593261719, 0.0020961761474609375, 0.04278564453125, -0.01154327392578125, -0.01343536376953125, -0.037353515625, 0.056884765625, -0.0089263916015625, 0.021759033203125, 0.021392822265625, -0.0300445556640625, -0.042724609375, -0.053619384765625, 0.0138397216796875, 0.044403076171875, -0.05743408203125, 0.02203369140625, -0.002140045166015625, -0.05645751953125, -0.035552978515625, -0.011199951171875, 0.0312347412109375, -0.00739288330078125, 
-0.01415252685546875, -0.03509521484375, -0.041839599609375, -0.084228515625, -0.0004906654357910156, -0.006038665771484375, -0.0048370361328125, 0.016326904296875, 0.0270843505859375, -0.0135650634765625, 0.03619384765625, -0.050048828125, -0.0224456787109375, -0.031982421875, -0.0080108642578125, 0.0386962890625, 0.0411376953125, 0.0802001953125, -0.060577392578125, -0.045989990234375, -0.00888824462890625, -0.0657958984375, -0.00783538818359375, -0.003170013427734375, -0.02874755859375, -0.0253448486328125, 0.04827880859375, -0.042510986328125, 0.0458984375, 0.0029239654541015625, -0.0511474609375, 0.039581298828125, -0.0229034423828125, 0.0308837890625, -0.09222412109375, -0.002376556396484375, 0.0250396728515625, -0.0430908203125, -0.03326416015625, 0.053131103515625, -0.03704833984375, -0.043487548828125, -0.05523681640625, 0.051971435546875, -0.04571533203125, -0.0019502639770507812, -0.041961669921875, 0.001155853271484375, 0.0174713134765625, 0.0231170654296875, 0.008514404296875, 0.044891357421875, 0.04833984375, -0.0163726806640625, 0.051788330078125, 0.0197906494140625, -0.034210205078125, 0.07470703125, -0.0814208984375, 0.016326904296875, -0.01251220703125, 0.0226287841796875, -0.09576416015625, -0.050689697265625, 0.055145263671875, -0.05401611328125, 0.006427764892578125, 0.0167694091796875, -0.08203125, -0.04296875, -0.0147705078125, 0.043853759765625, 0.02520751953125, -0.032470703125, 0.01354217529296875, 0.02374267578125, -0.01004791259765625, -0.0005288124084472656, -0.045654296875, 0.005092620849609375, -0.0031452178955078125, -0.0252227783203125, 0.02239990234375, -0.01088714599609375, -0.031951904296875, -0.00310516357421875, 0.0307159423828125, -0.0299835205078125, -0.003932952880859375, 0.03436279296875, 0.024932861328125, -0.0109710693359375, -0.0384521484375, 0.002513885498046875, -0.007720947265625, 0.01067352294921875, 0.004856109619140625, 0.060577392578125, -0.0193328857421875, -0.0073089599609375, -0.095947265625, 0.045501708984375, 
0.0567626953125, 0.0196990966796875, 0.0489501953125, 0.051422119140625, -0.06256103515625, 0.0143890380859375, -0.019439697265625, -0.014495849609375, -0.0340576171875, 0.0158538818359375, -0.04156494140625, -0.02191162109375, 0.036590576171875, -0.0005550384521484375, -0.0179901123046875, 0.02294921875, 0.03814697265625, -0.0232391357421875, 0.074462890625, 0.022979736328125, -0.0018301010131835938, 0.033111572265625, -0.04156494140625, 0.00795745849609375, -0.0225372314453125, -0.048095703125, -0.037750244140625, -0.042510986328125, -0.037841796875, -0.027069091796875, 0.0220794677734375, 0.0010251998901367188, -0.014190673828125, 0.06390380859375, -0.0167083740234375, 0.0572509765625, 0.036773681640625, 0.04913330078125, -0.00811004638671875, 0.01323699951171875, 0.01044464111328125, -0.022216796875, -0.0247955322265625, -0.022216796875, 0.035491943359375, 0.016845703125, 0.024200439453125, 0.03277587890625, 0.02801513671875, 0.018402099609375, -0.0200042724609375, -0.054595947265625, 0.04754638671875, 0.00435638427734375, -0.086669921875, 0.0085601806640625, -0.00852203369140625, -0.07080078125, 0.0007920265197753906, -0.031280517578125, -0.054443359375, 0.0266876220703125, 0.0047149658203125, -0.0268707275390625, 0.01104736328125, -0.040435791015625, 0.06396484375, -0.020263671875, -0.042449951171875, -0.038177490234375, -0.0252227783203125, 0.0227508544921875, 0.0084228515625, 0.03857421875, -0.0007681846618652344, -0.0017986297607421875, 0.01522064208984375, -0.033843994140625, 0.06488037109375, -0.0015649795532226562, 0.0096435546875, 0.029205322265625, 0.024322509765625, 0.0115966796875, 0.021759033203125, 0.007808685302734375, 0.0080108642578125, 0.010223388671875, -0.06610107421875, -0.04638671875, 0.06622314453125, -0.0538330078125, -0.01351165771484375, -0.049835205078125, -0.0055694580078125, 0.0048370361328125, 0.00798797607421875, 0.05584716796875, 0.0279998779296875, -0.04534912109375, 0.01059722900390625, 0.027557373046875, -0.01380157470703125, 
0.01325225830078125, -0.008819580078125, -0.04730224609375, -0.034515380859375, 0.0701904296875, -0.01052093505859375, 0.00799560546875, 0.0051422119140625, 0.0073394775390625, -0.02801513671875, -0.0000133514404296875, -0.035400390625, 0.03271484375, -0.0191497802734375, -0.011566162109375, -0.0079345703125, -0.0009860992431640625, -0.0286712646484375, -0.04638671875, -0.047882080078125, -0.01995849609375, -0.041839599609375, -0.015899658203125, 0.01018524169921875, 0.07647705078125, 0.003070831298828125, 0.0299530029296875, -0.0341796875, 0.032989501953125, 0.025360107421875, 0.012664794921875, -0.0094757080078125, -0.022674560546875, -0.0082244873046875, -0.009002685546875, -0.0406494140625, -0.056915283203125, 0.052825927734375, -0.02313232421875, 0.031707763671875, 0.04559326171875, -0.025848388671875, 0.034149169921875, 0.0087127685546875, 0.0533447265625, 0.042022705078125, -0.043853759765625, 0.040191650390625, -0.051910400390625, 0.027130126953125, 0.06201171875, 0.033203125, -0.030120849609375, -0.00023221969604492188, -0.0673828125, -0.06256103515625, 0.04058837890625, 0.0257568359375, 0.0345458984375, 0.006191253662109375, 0.0204620361328125, -0.003070831298828125, 0.04296875, -0.06219482421875, -0.0242462158203125, -0.0235137939453125, -0.00623321533203125, 0.0304412841796875, -0.0161590576171875, -0.0012731552124023438, -0.0399169921875, 0.061126708984375, 0.01373291015625, 0.04534912109375, -0.003025054931640625, 0.03631591796875, -0.00787353515625, -0.018951416015625, 0.0615234375, 0.05438232421875, -0.0116119384765625, -0.0216217041015625, -0.021697998046875, -0.018768310546875, 0.01457977294921875, -0.032135009765625, -0.004505157470703125, 0.016326904296875, 0.01629638671875, 0.068115234375, 0.01104736328125, -0.04010009765625, 0.0489501953125, -0.012786865234375, -0.007266998291015625, -0.058990478515625, 0.01812744140625, -0.0015115737915039062, 0.036865234375, -0.00870513916015625, 0.0116119384765625, 0.0287322998046875, -0.0670166015625, 
0.00919342041015625, 0.017425537109375, -0.04052734375, -0.043670654296875, 0.06927490234375, -0.003704071044921875, -0.046142578125, 0.04461669921875, -0.023162841796875, -0.02630615234375, 0.0640869140625, 0.057159423828125, 0.0550537109375, -0.0095062255859375, 0.047271728515625, 0.056793212890625, -0.005344390869140625, 0.00733184814453125, 0.04541015625, 0.01251220703125, -0.028778076171875, 0.03948974609375, -0.0079803466796875, -0.0171356201171875, 0.01806640625, -0.05023193359375, 0.061279296875, -0.08612060546875, -0.004589080810546875, -0.0085601806640625, -0.0042724609375, -0.049468994140625, 0.045623779296875, 0.01433563232421875, 0.056549072265625, -0.06463623046875, 0.053558349609375, 0.061553955078125, -0.01593017578125, -0.053253173828125, -0.00818634033203125, 0.0447998046875, -0.0621337890625, 0.0221710205078125, 0.0099334716796875, -0.012908935546875, -0.01384735107421875, -0.069091796875, -0.039398193359375, 0.0687255859375, 0.007358551025390625, -0.06207275390625, -0.0006151199340820312, -0.013763427734375, 0.053375244140625, -0.04931640625, 0.035552978515625, 0.0169219970703125, 0.0335693359375, 0.042327880859375, -0.024017333984375, -0.0185546875, -0.054046630859375, 0.0226287841796875, -0.0050201416015625, -0.07403564453125, 0.035369873046875, -0.0229339599609375, -0.006839752197265625, 0.041534423828125, 0.0672607421875, 0.0235137939453125, 0.03594970703125, 0.05218505859375, 0.06365966796875, 0.018951416015625, -0.01036834716796875, 0.08660888671875, -0.006114959716796875, 0.012420654296875, 0.078369140625, 0.0153045654296875, 0.03619384765625, 0.01067352294921875, -0.0261688232421875, 0.049407958984375, 0.078369140625, -0.01004791259765625, 0.053924560546875, 0.024017333984375, -0.01497650146484375, -0.0160675048828125, -0.0418701171875, -0.0306854248046875, 0.0394287109375, 0.0103302001953125, -0.004863739013671875, 0.0009984970092773438, 0.0009908676147460938, 0.0225677490234375, 0.0278472900390625, -0.007472991943359375, 
0.0596923828125, 0.00315093994140625, -0.020538330078125, 0.0308837890625, 0.0013885498046875, 0.0075836181640625, -0.0263671875, -0.0156707763671875, -0.0212554931640625, -0.0034332275390625, -0.0153656005859375, -0.042999267578125, 0.0191802978515625, -0.005504608154296875, -0.003936767578125, -0.016387939453125, 0.0298309326171875, 0.0017366409301757812, -0.0899658203125, 0.0302276611328125, 0.0167083740234375, 0.031829833984375, 0.006923675537109375, -0.07672119140625, 0.017822265625, 0.00494384765625, 0.0139617919921875, 0.00519561767578125, 0.0229339599609375, 0.01004791259765625, 0.0171356201171875, 0.036590576171875, 0.016876220703125, -0.0276031494140625, -0.00154876708984375, 0.043243408203125, -0.029266357421875, -0.04046630859375, -0.047760009765625, 0.042022705078125, -0.0121917724609375, -0.053558349609375, 0.029571533203125, 0.058868408203125, 0.08489990234375, -0.03875732421875, 0.032318115234375, -0.00963592529296875, 0.0308380126953125, -0.0268096923828125, 0.037841796875, -0.07843017578125, -0.01209259033203125, -0.033782958984375, -0.06842041015625, -0.0084686279296875, 0.054443359375, 0.01557159423828125, 0.0082855224609375, 0.02008056640625, 0.0594482421875, -0.0150604248046875, -0.01226043701171875, 0.0299835205078125, 0.01593017578125, 0.0023136138916015625, 0.0169219970703125, 0.060577392578125, -0.044281005859375, -0.018768310546875, -0.0204315185546875, -0.02032470703125, -0.03399658203125, -0.027618408203125, -0.060638427734375, -0.0430908203125, -0.02899169921875, -0.047943115234375, -0.00867462158203125, 0.08538818359375, 0.073974609375, -0.05804443359375, -0.029510498046875, 0.0009069442749023438, -0.003170013427734375, -0.01514434814453125, -0.0183563232421875, -0.029541015625, 0.048065185546875, -0.08551025390625, 0.0216217041015625, -0.006183624267578125, 0.03265380859375, -0.01348114013671875, 0.0275726318359375, -0.00740814208984375, 0.0191650390625, 0.0242462158203125, 0.0165863037109375, -0.033599853515625, 
-0.01491546630859375, -0.006702423095703125, -0.01474761962890625, 0.011688232421875, 0.050323486328125, -0.0172576904296875, 0.0265045166015625, 0.0465087890625, 0.01049041748046875, 0.0361328125, 0.01229095458984375, 0.0281524658203125, -0.0244293212890625, 0.02581787109375, -0.00311279296875, 0.030548095703125, 0.0144195556640625, -0.056671142578125, 0.034820556640625, 0.052276611328125, -0.01525115966796875, -0.037506103515625, 0.0307769775390625, -0.0845947265625, -0.050140380859375, 0.043426513671875, 0.004039764404296875, -0.02142333984375, 0.0197906494140625, -0.0290679931640625, 0.00965118408203125, -0.030731201171875, 0.045196533203125, 0.0701904296875, -0.00722503662109375, -0.008575439453125, -0.042755126953125, -0.006519317626953125, 0.0257110595703125, -0.07501220703125, -0.03692626953125, 0.06298828125, 0.019134521484375, 0.0161895751953125, 0.041015625, -0.026519775390625, 0.0172271728515625, 0.0241851806640625, 0.021148681640625, -0.0026721954345703125, -0.07373046875, 0.003376007080078125, 0.00298309326171875, -0.00322723388671875, -0.031280517578125 ] ]
jarvisx17/japanese-sentiment-analysis
2023-03-03T11:00:00.000Z
[ "transformers", "pytorch", "tensorboard", "bert", "text-classification", "generated_from_trainer", "ja", "endpoints_compatible", "has_space", "region:us" ]
text-classification
jarvisx17
null
null
jarvisx17/japanese-sentiment-analysis
6
9,259
transformers
2022-11-15T06:28:39
--- tags: - generated_from_trainer language: ja widget: - text: "🤗セグメント利益は、前期比8.3%増の24億28百万円となった" metrics: - accuracy - f1 model-index: - name: japanese-sentiment-analysis results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # japanese-sentiment-analysis This model was trained from scratch on the chABSA dataset. It achieves the following results on the evaluation set: - Loss: 0.0001 - Accuracy: 1.0 - F1: 1.0 ## Model description Model trained for Japanese sentence sentiment classification. ## Intended uses & limitations The model was trained on the chABSA Japanese dataset. DATASET link : https://www.kaggle.com/datasets/takahirokubo0/chabsa ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 ## Usage You can use the Python API to access this model: ```python from transformers import AutoTokenizer, AutoModelForSequenceClassification tokenizer = AutoTokenizer.from_pretrained("jarvisx17/japanese-sentiment-analysis") model = AutoModelForSequenceClassification.from_pretrained("jarvisx17/japanese-sentiment-analysis") inputs = tokenizer("セグメント利益は、前期比8.3%増の24億28百万円となった", return_tensors="pt") outputs = model(**inputs) ``` ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 1.12.1+cu113 - Datasets 2.7.0 - Tokenizers 0.13.2 ### Dependencies - !pip install fugashi - !pip install unidic_lite
1,676
[ [ -0.037139892578125, -0.048797607421875, 0.01418304443359375, 0.02197265625, -0.04168701171875, -0.0115814208984375, -0.00809478759765625, -0.0095672607421875, 0.0227203369140625, 0.0244598388671875, -0.056732177734375, -0.04779052734375, -0.05523681640625, -0.00046443939208984375, -0.01221466064453125, 0.08935546875, 0.006649017333984375, 0.02276611328125, 0.00682830810546875, 0.0016031265258789062, -0.0287933349609375, -0.041412353515625, -0.05889892578125, -0.032257080078125, 0.02008056640625, 0.0170440673828125, 0.0538330078125, 0.0228118896484375, 0.0369873046875, 0.0244903564453125, -0.0018596649169921875, -0.011566162109375, -0.01971435546875, -0.0130615234375, 0.0013494491577148438, -0.04833984375, -0.035797119140625, 0.01369476318359375, 0.0341796875, 0.031982421875, -0.00585174560546875, 0.017333984375, -0.0101470947265625, 0.0244293212890625, -0.039886474609375, 0.035980224609375, -0.036773681640625, 0.0258941650390625, -0.01119232177734375, -0.006954193115234375, -0.024383544921875, -0.0177154541015625, -0.0041656494140625, -0.034515380859375, 0.0197601318359375, 0.00043272972106933594, 0.092529296875, 0.018280029296875, -0.017669677734375, -0.0154266357421875, -0.0245208740234375, 0.0445556640625, -0.08148193359375, 0.00492095947265625, 0.0173797607421875, 0.0192413330078125, 0.01056671142578125, -0.052947998046875, -0.043548583984375, -0.0035266876220703125, -0.0054168701171875, 0.041229248046875, -0.008392333984375, -0.003021240234375, 0.039886474609375, 0.03668212890625, -0.035186767578125, 0.003871917724609375, -0.0257720947265625, -0.01142120361328125, 0.04833984375, 0.031402587890625, 0.0158843994140625, -0.05426025390625, -0.022430419921875, -0.01788330078125, -0.00901031494140625, 0.0184478759765625, 0.046875, 0.0271453857421875, -0.021484375, 0.037872314453125, -0.024627685546875, 0.03582763671875, 0.005512237548828125, 0.0053253173828125, 0.039154052734375, 0.0010051727294921875, -0.0245208740234375, -0.0093841552734375, 0.1031494140625, 
0.033477783203125, 0.0219879150390625, 0.01000213623046875, -0.027313232421875, 0.0240478515625, 0.01561737060546875, -0.054840087890625, -0.0271759033203125, 0.005298614501953125, -0.0423583984375, -0.051177978515625, 0.01543426513671875, -0.0687255859375, -0.007030487060546875, -0.0224761962890625, 0.0330810546875, -0.0244598388671875, -0.021026611328125, 0.017913818359375, -0.004619598388671875, 0.0308074951171875, -0.011138916015625, -0.052886962890625, 0.0223541259765625, 0.0128173828125, 0.049163818359375, 0.013427734375, -0.03363037109375, 0.003223419189453125, -0.0216064453125, -0.0145263671875, 0.041290283203125, -0.0172882080078125, -0.050048828125, -0.0047454833984375, 0.01438140869140625, -0.006160736083984375, -0.0171661376953125, 0.049774169921875, -0.01910400390625, 0.036712646484375, -0.008636474609375, -0.0499267578125, -0.0090484619140625, 0.0098114013671875, -0.03216552734375, 0.07568359375, 0.0068817138671875, -0.06591796875, 0.00911712646484375, -0.062469482421875, -0.02630615234375, -0.0124053955078125, 0.0157928466796875, -0.046051025390625, 0.01380157470703125, 0.0159454345703125, 0.0460205078125, -0.00914764404296875, 0.03094482421875, -0.04046630859375, -0.0187225341796875, 0.0308074951171875, -0.036285400390625, 0.078369140625, 0.0279693603515625, -0.037017822265625, -0.005977630615234375, -0.0667724609375, 0.0014047622680664062, 0.0254974365234375, 0.004970550537109375, -0.009735107421875, -0.0277862548828125, 0.02740478515625, 0.0294036865234375, 0.0283660888671875, -0.051025390625, 0.01399993896484375, -0.036956787109375, 0.018524169921875, 0.049041748046875, 0.01139068603515625, -0.001094818115234375, 0.00650787353515625, 0.03839111328125, 0.038665771484375, 0.030029296875, -0.0020694732666015625, -0.02020263671875, -0.08502197265625, 0.0034809112548828125, 0.0164947509765625, 0.026885986328125, -0.036346435546875, 0.0687255859375, -0.0095672607421875, -0.052520751953125, -0.049896240234375, -0.00234222412109375, 0.0248870849609375, 
0.04339599609375, 0.0260009765625, -0.04534912109375, -0.054046630859375, -0.0675048828125, -0.0215301513671875, -0.00522613525390625, 0.02655029296875, -0.013427734375, 0.0523681640625, -0.0343017578125, 0.06842041015625, -0.060272216796875, -0.01580810546875, -0.021728515625, 0.02886962890625, 0.0256500244140625, 0.041595458984375, 0.028472900390625, -0.0260009765625, -0.035003662109375, -0.018890380859375, -0.06378173828125, -0.005870819091796875, -0.0010833740234375, -0.019989013671875, 0.01389312744140625, 0.0183258056640625, -0.047637939453125, 0.036773681640625, 0.01326751708984375, -0.0278167724609375, 0.0343017578125, -0.006145477294921875, -0.0066986083984375, -0.1102294921875, 0.0200958251953125, 0.0312347412109375, -0.0171356201171875, -0.0237884521484375, 0.006435394287109375, -0.005786895751953125, -0.032562255859375, -0.02471923828125, 0.0404052734375, -0.0116119384765625, 0.00632476806640625, -0.0338134765625, 0.00345611572265625, 0.007564544677734375, 0.06298828125, 0.020294189453125, 0.032806396484375, 0.05108642578125, -0.033447265625, 0.039337158203125, 0.03485107421875, -0.020904541015625, 0.036468505859375, -0.0667724609375, 0.008819580078125, -0.005584716796875, -0.001384735107421875, -0.07696533203125, -0.0171661376953125, 0.03826904296875, -0.047393798828125, 0.0223846435546875, 0.00592803955078125, -0.032501220703125, -0.039306640625, -0.006381988525390625, -0.0002791881561279297, 0.058319091796875, -0.03936767578125, 0.039825439453125, 0.0196685791015625, 0.006694793701171875, -0.050994873046875, -0.061798095703125, -0.011810302734375, -0.00807952880859375, -0.0087738037109375, -0.0199737548828125, -0.003719329833984375, 0.004329681396484375, 0.0002741813659667969, -0.00025463104248046875, -0.00164794921875, -0.0036602020263671875, 0.0235748291015625, 0.036712646484375, 0.00482940673828125, -0.01519012451171875, 0.0020580291748046875, -0.0280609130859375, 0.0321044921875, 0.004680633544921875, 0.05352783203125, -0.0318603515625, 
-0.00598907470703125, -0.061798095703125, -0.01509857177734375, 0.034820556640625, -0.0121917724609375, 0.048858642578125, 0.06256103515625, -0.01009368896484375, 0.0013380050659179688, -0.037261962890625, -0.00899505615234375, -0.032379150390625, 0.046142578125, -0.0294036865234375, -0.01479339599609375, 0.0511474609375, 0.00606536865234375, -0.007053375244140625, 0.059234619140625, 0.03790283203125, -0.013427734375, 0.09112548828125, 0.02496337890625, -0.031829833984375, 0.0286407470703125, -0.05364990234375, 0.01512908935546875, -0.06005859375, -0.0237884521484375, -0.029052734375, -0.007755279541015625, -0.043914794921875, 0.0164337158203125, 0.01045989990234375, -0.01140594482421875, -0.031951904296875, 0.022552490234375, -0.030975341796875, 0.0293731689453125, 0.034332275390625, 0.011566162109375, 0.0001837015151977539, 0.0014562606811523438, -0.0014629364013671875, -0.005695343017578125, -0.053070068359375, -0.028289794921875, 0.08319091796875, 0.046051025390625, 0.06195068359375, -0.0222625732421875, 0.0499267578125, 0.0108489990234375, 0.0075225830078125, -0.06585693359375, 0.0513916015625, -0.0137176513671875, -0.0401611328125, 0.0022983551025390625, -0.032501220703125, -0.0599365234375, 0.010406494140625, -0.0120391845703125, -0.0214080810546875, 0.00904083251953125, -0.0099029541015625, -0.0139617919921875, 0.02081298828125, -0.03704833984375, 0.08062744140625, -0.0121307373046875, -0.01081085205078125, -0.01013946533203125, -0.0618896484375, 0.02679443359375, 0.0166015625, 0.00009906291961669922, 0.0010080337524414062, 0.01117706298828125, 0.0673828125, -0.0279083251953125, 0.08245849609375, -0.0384521484375, 0.0142822265625, 0.0242919921875, -0.004268646240234375, 0.0222930908203125, 0.0050506591796875, -0.001422882080078125, 0.017364501953125, -0.00426483154296875, -0.039764404296875, -0.044189453125, 0.051025390625, -0.10772705078125, -0.0200958251953125, -0.050811767578125, -0.0321044921875, -0.01551055908203125, 0.016845703125, 0.048583984375, 
0.052520751953125, -0.016754150390625, 0.004383087158203125, 0.031646728515625, 0.0026721954345703125, 0.039306640625, 0.031097412109375, -0.0215606689453125, -0.048583984375, 0.06256103515625, 0.0018281936645507812, 0.0096893310546875, 0.002857208251953125, 0.0214080810546875, -0.0499267578125, -0.0082550048828125, -0.0191802978515625, 0.031402587890625, -0.069091796875, -0.0101470947265625, -0.04315185546875, -0.0225372314453125, -0.042755126953125, 0.010589599609375, -0.036956787109375, -0.0195770263671875, -0.0233917236328125, -0.01424407958984375, 0.012359619140625, 0.03668212890625, 0.0096588134765625, 0.059478759765625, -0.0535888671875, 0.009857177734375, 0.00623321533203125, 0.0238189697265625, 0.0034961700439453125, -0.0657958984375, -0.0302886962890625, 0.01023101806640625, -0.0322265625, -0.060516357421875, 0.0599365234375, -0.0142364501953125, 0.0287017822265625, 0.03289794921875, -0.0084686279296875, 0.05633544921875, 0.00914764404296875, 0.06756591796875, 0.0295867919921875, -0.08758544921875, 0.050079345703125, -0.0244598388671875, 0.036468505859375, 0.05078125, 0.048675537109375, -0.036285400390625, -0.0234832763671875, -0.09136962890625, -0.0460205078125, 0.061279296875, 0.002841949462890625, 0.0168914794921875, -0.003204345703125, 0.037261962890625, -0.0023555755615234375, 0.0267333984375, -0.0941162109375, -0.03521728515625, -0.055938720703125, -0.039825439453125, -0.01727294921875, -0.0138092041015625, 0.0069122314453125, -0.035430908203125, 0.09747314453125, -0.00785064697265625, 0.0173187255859375, 0.01113128662109375, -0.007068634033203125, -0.0073089599609375, 0.003082275390625, 0.00627899169921875, 0.0185699462890625, -0.018707275390625, -0.013336181640625, 0.01047515869140625, -0.0305023193359375, 0.0099334716796875, -0.0037364959716796875, -0.0265350341796875, 0.014129638671875, 0.0119781494140625, 0.10443115234375, 0.0032024383544921875, -0.0157470703125, 0.044769287109375, -0.006744384765625, -0.00342559814453125, -0.0369873046875, 
0.003032684326171875, -0.008392333984375, -0.005313873291015625, -0.00852203369140625, 0.0220947265625, 0.00809478759765625, -0.03875732421875, -0.01222991943359375, 0.01971435546875, -0.041107177734375, -0.0263824462890625, 0.044464111328125, 0.01474761962890625, -0.0075225830078125, 0.060150146484375, -0.0189971923828125, -0.03076171875, 0.0455322265625, 0.041595458984375, 0.0828857421875, -0.0120849609375, 0.0211944580078125, 0.054046630859375, 0.0012063980102539062, -0.01556396484375, 0.0260009765625, 0.013397216796875, -0.0592041015625, -0.006969451904296875, -0.051239013671875, -0.0218048095703125, 0.01995849609375, -0.07568359375, 0.033905029296875, -0.042449951171875, -0.0286407470703125, -0.01427459716796875, 0.0251922607421875, -0.044921875, 0.052581787109375, 0.0026397705078125, 0.05950927734375, -0.08685302734375, 0.05633544921875, 0.050079345703125, -0.0662841796875, -0.07574462890625, -0.00476837158203125, -0.0184478759765625, -0.06671142578125, 0.0401611328125, 0.01389312744140625, -0.0018377304077148438, 0.0235748291015625, -0.052581787109375, -0.03619384765625, 0.06646728515625, -0.0026836395263671875, -0.038177490234375, 0.006603240966796875, 0.02276611328125, 0.0543212890625, -0.0229034423828125, 0.05364990234375, 0.03778076171875, 0.0223846435546875, -0.00583648681640625, -0.036529541015625, -0.01055145263671875, -0.039886474609375, 0.0159912109375, 0.0010023117065429688, -0.072509765625, 0.058990478515625, 0.0130157470703125, 0.01861572265625, 0.0121307373046875, 0.061767578125, 0.005992889404296875, 0.010498046875, 0.03173828125, 0.05523681640625, 0.0418701171875, -0.006885528564453125, 0.0758056640625, -0.03875732421875, 0.060333251953125, 0.066162109375, -0.00644683837890625, 0.0531005859375, 0.024749755859375, -0.02178955078125, 0.06317138671875, 0.048187255859375, -0.0268707275390625, 0.03118896484375, -0.003143310546875, -0.00039696693420410156, -0.02276611328125, 0.006488800048828125, -0.0220489501953125, 0.040496826171875, 
0.00579071044921875, -0.0340576171875, -0.0019464492797851562, 0.0050506591796875, 0.011383056640625, -0.00658416748046875, -0.032745361328125, 0.049102783203125, 0.0063934326171875, -0.047576904296875, 0.032073974609375, 0.0011758804321289062, 0.060760498046875, -0.06732177734375, 0.028167724609375, -0.0222015380859375, 0.0210113525390625, -0.01861572265625, -0.053955078125, 0.0183563232421875, 0.0018138885498046875, -0.002349853515625, 0.0093994140625, 0.050994873046875, -0.0259552001953125, -0.061676025390625, 0.010498046875, 0.030029296875, 0.016510009765625, 0.01221466064453125, -0.0791015625, -0.0066986083984375, 0.0103607177734375, -0.046295166015625, 0.017669677734375, 0.020294189453125, 0.0229644775390625, 0.046875, 0.035308837890625, 0.00992584228515625, -0.005306243896484375, 0.026763916015625, 0.049346923828125, -0.0567626953125, -0.053314208984375, -0.068603515625, 0.041229248046875, -0.0185089111328125, -0.043731689453125, 0.06524658203125, 0.05157470703125, 0.05975341796875, -0.038421630859375, 0.06781005859375, -0.006526947021484375, 0.0258941650390625, -0.0355224609375, 0.047515869140625, -0.03643798828125, -0.00489044189453125, -0.0240478515625, -0.060943603515625, -0.0034236907958984375, 0.058441162109375, -0.0264434814453125, 0.01357269287109375, 0.05731201171875, 0.06903076171875, -0.007091522216796875, 0.018798828125, 0.0126800537109375, 0.0279083251953125, 0.00839996337890625, 0.0306396484375, 0.059478759765625, -0.06463623046875, 0.015106201171875, -0.051483154296875, 0.00157928466796875, 0.01187896728515625, -0.05267333984375, -0.076904296875, -0.0338134765625, -0.045684814453125, -0.0379638671875, -0.0187225341796875, 0.08038330078125, 0.033355712890625, -0.05584716796875, -0.039215087890625, -0.015380859375, -0.038848876953125, -0.0028705596923828125, -0.025238037109375, 0.028167724609375, -0.055816650390625, -0.0677490234375, -0.005222320556640625, -0.0185394287109375, 0.01274871826171875, -0.031280517578125, -0.002841949462890625, 
-0.0076141357421875, -0.003833770751953125, 0.0214080810546875, 0.0072784423828125, -0.043914794921875, -0.004180908203125, -0.0008473396301269531, -0.01207733154296875, 0.00917816162109375, 0.034576416015625, -0.045135498046875, 0.0340576171875, 0.029998779296875, 0.025787353515625, 0.0433349609375, -0.0036907196044921875, 0.0293121337890625, -0.053863525390625, 0.021514892578125, 0.00614166259765625, 0.05072021484375, 0.0231475830078125, -0.0245819091796875, 0.032684326171875, 0.051788330078125, -0.0382080078125, -0.04632568359375, -0.0037174224853515625, -0.08245849609375, -0.0160064697265625, 0.08642578125, -0.0086517333984375, -0.033477783203125, 0.0125274658203125, -0.03466796875, 0.04644775390625, -0.0180816650390625, 0.057525634765625, 0.057342529296875, 0.0014791488647460938, -0.005565643310546875, -0.008575439453125, 0.0245513916015625, 0.041595458984375, -0.062469482421875, -0.0026645660400390625, 0.0235748291015625, 0.036285400390625, 0.00698089599609375, 0.045806884765625, -0.0181427001953125, 0.0168609619140625, 0.004703521728515625, 0.042694091796875, -0.0220489501953125, -0.0094451904296875, -0.04180908203125, 0.00678253173828125, 0.0021915435791015625, -0.023529052734375 ] ]
goofyai/3d_render_style_xl
2023-08-14T16:58:25.000Z
[ "diffusers", "text-to-image", "stable-diffusion", "lora", "license:apache-2.0", "has_space", "region:us" ]
text-to-image
goofyai
null
null
goofyai/3d_render_style_xl
62
9,249
diffusers
2023-08-14T16:41:59
--- license: apache-2.0 tags: - text-to-image - stable-diffusion - lora - diffusers base_model: stabilityai/stable-diffusion-xl-base-1.0 instance_prompt: 3d style, 3d, 3d render widget: - text: 3d style llama --- # 3D Render Style SDXL ## Support me in upgrading my 3060 to a 40xx GPU, as my current GPU struggles with SDXL training: [Buymeacoffee](https://www.buymeacoffee.com/goofy02) | ![3d_style_1.jpeg](3d_style_1.jpeg) | ![3d_style_2.jpeg](3d_style_2.jpeg) | |:----------------------:|:----------------:| | ![3d_style_3.jpeg](3d_style_3.jpeg) | ![3d_style_4.jpeg](3d_style_4.jpeg) | ### Tips: - Use the activation prompts `3d style`, `3d`, or `3d render` - A LoRA weight of 0.7-1 works well - Highres fix is highly recommended.
738
[ [ -0.06597900390625, -0.030609130859375, 0.045684814453125, 0.042938232421875, -0.03045654296875, -0.005207061767578125, 0.0215301513671875, -0.010528564453125, 0.0109100341796875, 0.034637451171875, -0.0251922607421875, -0.024139404296875, -0.0261688232421875, -0.005924224853515625, 0.00604248046875, 0.06121826171875, -0.027679443359375, 0.00516510009765625, -0.02728271484375, 0.007419586181640625, -0.02423095703125, -0.031585693359375, -0.08612060546875, -0.0250091552734375, 0.054290771484375, 0.017974853515625, 0.03936767578125, 0.03271484375, 0.0223236083984375, 0.0182952880859375, -0.01079559326171875, 0.005107879638671875, -0.0208282470703125, -0.0032806396484375, -0.0020313262939453125, -0.0130462646484375, -0.05963134765625, -0.01483154296875, 0.054412841796875, 0.0257568359375, -0.0292510986328125, 0.005138397216796875, -0.004680633544921875, 0.056182861328125, -0.060028076171875, 0.0159912109375, -0.0018739700317382812, 0.0308380126953125, -0.019195556640625, 0.01168060302734375, -0.009185791015625, -0.03448486328125, -0.035430908203125, -0.08624267578125, 0.0138397216796875, 0.01262664794921875, 0.061920166015625, 0.0172271728515625, -0.0124359130859375, 0.034698486328125, -0.03594970703125, 0.0274200439453125, -0.065185546875, 0.037017822265625, 0.005870819091796875, 0.039947509765625, 0.01445770263671875, -0.06597900390625, -0.018157958984375, 0.0280303955078125, 0.01493072509765625, 0.0287933349609375, -0.0263214111328125, 0.01531982421875, 0.07391357421875, 0.0406494140625, -0.0379638671875, 0.003330230712890625, -0.05987548828125, -0.00927734375, 0.0499267578125, 0.0101318359375, 0.011810302734375, 0.0102386474609375, -0.051788330078125, -0.030914306640625, -0.047393798828125, -0.033843994140625, 0.04443359375, 0.01525115966796875, -0.029388427734375, 0.031585693359375, -0.004276275634765625, 0.047943115234375, 0.01178741455078125, 0.020294189453125, 0.04010009765625, -0.0145416259765625, -0.041290283203125, 0.00452423095703125, 0.050872802734375, 
0.037506103515625, 0.0199432373046875, 0.0177459716796875, -0.01837158203125, -0.00630950927734375, 0.0328369140625, -0.062744140625, -0.031982421875, 0.01318359375, -0.005199432373046875, 0.0005421638488769531, 0.0010385513305664062, -0.08087158203125, -0.02386474609375, -0.0047454833984375, 0.013214111328125, -0.0361328125, -0.047821044921875, 0.017791748046875, -0.040283203125, 0.0203094482421875, 0.0472412109375, -0.08636474609375, 0.038665771484375, 0.013885498046875, 0.031768798828125, -0.0117950439453125, 0.004108428955078125, -0.030517578125, 0.01422882080078125, -0.046234130859375, 0.05181884765625, -0.00424957275390625, -0.051025390625, -0.0031757354736328125, -0.0007190704345703125, 0.001987457275390625, -0.060760498046875, 0.04669189453125, -0.030975341796875, 0.003986358642578125, -0.0271453857421875, -0.0114898681640625, -0.0101318359375, -0.0007367134094238281, -0.062042236328125, 0.100341796875, 0.027587890625, -0.039459228515625, 0.003612518310546875, -0.058013916015625, -0.034149169921875, 0.0227813720703125, -0.01381683349609375, -0.035430908203125, 0.0382080078125, -0.0025043487548828125, 0.01531982421875, 0.001911163330078125, -0.035675048828125, -0.035430908203125, -0.040252685546875, -0.038818359375, -0.0081787109375, 0.037994384765625, 0.037872314453125, -0.0184173583984375, 0.00307464599609375, -0.0679931640625, 0.01448822021484375, 0.0245513916015625, -0.00860595703125, -0.005279541015625, -0.048095703125, 0.0284271240234375, 0.01462554931640625, 0.0219573974609375, -0.044647216796875, 0.022705078125, -0.00595855712890625, 0.015289306640625, 0.060882568359375, -0.0092010498046875, 0.01222991943359375, -0.0018224716186523438, 0.03436279296875, 0.01114654541015625, 0.0277099609375, 0.01320648193359375, -0.05584716796875, -0.0458984375, -0.01035308837890625, -0.037628173828125, 0.02886962890625, -0.07208251953125, -0.01220703125, 0.0196075439453125, -0.022857666015625, -0.005550384521484375, 0.000537872314453125, 0.0233306884765625, 
0.003383636474609375, 0.03900146484375, -0.02008056640625, -0.05810546875, -0.053253173828125, 0.0254364013671875, -0.004299163818359375, 0.004268646240234375, 0.0225677490234375, 0.0195159912109375, -0.019073486328125, 0.04150390625, -0.07830810546875, 0.000652313232421875, 0.0017385482788085938, -0.01021575927734375, 0.0311431884765625, 0.03460693359375, 0.0640869140625, -0.040618896484375, -0.0256195068359375, 0.0013303756713867188, -0.044586181640625, 0.015838623046875, 0.0311431884765625, 0.01134490966796875, 0.02484130859375, 0.04425048828125, -0.036956787109375, 0.0299072265625, 0.0184326171875, -0.03369140625, 0.05999755859375, -0.039093017578125, -0.00432586669921875, -0.07806396484375, 0.0044097900390625, 0.0157470703125, -0.0104827880859375, -0.01503753662109375, 0.0121307373046875, 0.01259613037109375, -0.0212249755859375, -0.056976318359375, 0.07293701171875, -0.0182037353515625, -0.005161285400390625, -0.0066986083984375, -0.0187530517578125, 0.044586181640625, 0.0178070068359375, 0.0099945068359375, 0.055023193359375, 0.0438232421875, -0.041290283203125, 0.0657958984375, 0.0052947998046875, -0.0099334716796875, 0.048095703125, -0.052978515625, 0.013397216796875, -0.006084442138671875, -0.0033473968505859375, -0.07720947265625, -0.0171966552734375, 0.0283050537109375, 0.0007181167602539062, 0.027099609375, 0.0027523040771484375, -0.02490234375, -0.048065185546875, -0.0270843505859375, 0.0435791015625, 0.0482177734375, -0.050384521484375, 0.03790283203125, -0.003269195556640625, 0.00618743896484375, -0.040191650390625, -0.05426025390625, -0.0025386810302734375, -0.01202392578125, -0.031524658203125, 0.01561737060546875, -0.04595947265625, -0.029296875, -0.0173797607421875, -0.01230621337890625, -0.004058837890625, -0.009063720703125, 0.041107177734375, 0.052825927734375, -0.045196533203125, -0.041015625, -0.0035114288330078125, -0.01678466796875, 0.020721435546875, -0.0017986297607421875, 0.047454833984375, -0.01207733154296875, -0.00506591796875, 
-0.0176544189453125, 0.0176849365234375, 0.06634521484375, 0.01207733154296875, 0.016387939453125, 0.075439453125, -0.0308837890625, -0.006580352783203125, -0.04705810546875, -0.0171051025390625, -0.040069580078125, 0.03839111328125, -0.040435791015625, -0.0208740234375, 0.031646728515625, 0.00295257568359375, 0.0016117095947265625, 0.03521728515625, 0.02362060546875, -0.01250457763671875, 0.0927734375, 0.0301971435546875, 0.0211029052734375, 0.034454345703125, -0.031768798828125, 0.00803375244140625, -0.09088134765625, -0.0206451416015625, -0.00698089599609375, 0.01080322265625, -0.00804901123046875, -0.0224761962890625, 0.0123291015625, 0.00838470458984375, -0.054107666015625, 0.044891357421875, -0.037445068359375, 0.0281219482421875, 0.016357421875, 0.0058135986328125, -0.005634307861328125, 0.0224151611328125, -0.016845703125, -0.004726409912109375, -0.04638671875, 0.00437164306640625, 0.045318603515625, 0.00791168212890625, 0.0814208984375, 0.00872039794921875, 0.042388916015625, 0.0197906494140625, -0.0060577392578125, -0.0251922607421875, 0.043212890625, -0.0240631103515625, -0.02490234375, 0.00321197509765625, -0.006862640380859375, -0.025482177734375, -0.005764007568359375, -0.0262603759765625, -0.029693603515625, 0.038787841796875, 0.01447296142578125, -0.055755615234375, 0.05743408203125, -0.05926513671875, 0.063720703125, -0.01296234130859375, -0.06549072265625, 0.00914764404296875, -0.00742340087890625, 0.024688720703125, -0.0034694671630859375, 0.0011949539184570312, -0.014129638671875, -0.002773284912109375, 0.045379638671875, -0.02178955078125, 0.042510986328125, -0.004634857177734375, -0.025482177734375, -0.00899505615234375, -0.02423095703125, 0.03826904296875, 0.0215301513671875, -0.01555633544921875, 0.004276275634765625, 0.0245819091796875, -0.035430908203125, -0.060638427734375, 0.06500244140625, -0.08563232421875, -0.031982421875, -0.050445556640625, -0.0379638671875, -0.0186614990234375, 0.0098876953125, 0.0350341796875, 0.0305938720703125, 
-0.0113525390625, -0.0167694091796875, 0.06976318359375, 0.006526947021484375, 0.053466796875, 0.0157318115234375, -0.046722412109375, -0.06658935546875, 0.06988525390625, -0.0079193115234375, 0.032501220703125, -0.00782012939453125, -0.007434844970703125, -0.03521728515625, -0.03619384765625, -0.057891845703125, 0.0171966552734375, -0.058441162109375, -0.00945281982421875, -0.004184722900390625, -0.040771484375, -0.038665771484375, -0.025909423828125, -0.04888916015625, -0.02972412109375, -0.0457763671875, 0.0007596015930175781, 0.0333251953125, 0.056854248046875, -0.00617218017578125, 0.017547607421875, -0.0285491943359375, 0.01971435546875, 0.03631591796875, 0.04425048828125, 0.00762176513671875, -0.018646240234375, -0.006622314453125, -0.02178955078125, -0.018707275390625, -0.016265869140625, 0.0197296142578125, -0.01263427734375, 0.054718017578125, 0.0487060546875, 0.00814056396484375, 0.06585693359375, -0.039642333984375, 0.0560302734375, 0.042938232421875, -0.033538818359375, 0.0274200439453125, -0.01404571533203125, 0.037200927734375, 0.0226898193359375, 0.04052734375, -0.035247802734375, 0.006336212158203125, -0.07012939453125, -0.06207275390625, 0.03216552734375, 0.00052642822265625, 0.01019287109375, 0.0042572021484375, 0.028717041015625, -0.003917694091796875, -0.0171356201171875, -0.032135009765625, -0.035125732421875, -0.04803466796875, 0.00795745849609375, 0.0016222000122070312, -0.031646728515625, -0.0281829833984375, -0.035675048828125, 0.06622314453125, -0.0161285400390625, 0.026580810546875, 0.00923919677734375, 0.03289794921875, -0.0088043212890625, 0.004917144775390625, 0.04937744140625, 0.0762939453125, -0.052459716796875, 0.01190948486328125, 0.0119171142578125, -0.053466796875, -0.009063720703125, 0.01290130615234375, 0.01081085205078125, 0.00580596923828125, -0.01128387451171875, 0.0977783203125, 0.012939453125, -0.024383544921875, 0.0155181884765625, -0.0216827392578125, -0.01165008544921875, -0.0325927734375, 0.033843994140625, 
-0.005832672119140625, 0.020416259765625, 0.016021728515625, 0.038818359375, -0.0058135986328125, -0.01198577880859375, 0.004024505615234375, 0.02349853515625, -0.054962158203125, -0.00795745849609375, 0.07806396484375, 0.007572174072265625, 0.0005064010620117188, 0.064208984375, -0.0102996826171875, 0.0025997161865234375, 0.052947998046875, 0.07330322265625, 0.0543212890625, -0.005580902099609375, 0.016204833984375, 0.07513427734375, -0.00691986083984375, -0.017059326171875, 0.04595947265625, -0.005157470703125, -0.05499267578125, -0.003963470458984375, -0.053802490234375, -0.047088623046875, -0.0011358261108398438, -0.0670166015625, 0.04058837890625, -0.06146240234375, -0.0146636962890625, 0.016632080078125, -0.002269744873046875, -0.0214996337890625, 0.0364990234375, 0.06707763671875, 0.09356689453125, -0.056182861328125, 0.08428955078125, 0.055572509765625, -0.0401611328125, -0.0257415771484375, -0.048126220703125, 0.005870819091796875, -0.075927734375, 0.030517578125, -0.01727294921875, -0.0028839111328125, 0.02783203125, -0.0257720947265625, -0.054229736328125, 0.0828857421875, 0.0233306884765625, -0.046966552734375, 0.0070343017578125, -0.0007567405700683594, 0.0081329345703125, -0.0256195068359375, 0.0297088623046875, 0.0283966064453125, 0.03515625, 0.04864501953125, -0.035400390625, 0.02044677734375, -0.04620361328125, 0.0286102294921875, 0.006725311279296875, -0.054473876953125, 0.0753173828125, -0.040283203125, 0.00290679931640625, 0.024688720703125, 0.052215576171875, 0.039459228515625, 0.003910064697265625, 0.046783447265625, 0.063720703125, 0.048828125, -0.03204345703125, 0.091064453125, -0.021484375, 0.0203857421875, 0.062469482421875, -0.03662109375, 0.048187255859375, 0.022857666015625, -0.0271453857421875, 0.0384521484375, 0.07476806640625, -0.031280517578125, 0.0197601318359375, -0.00846099853515625, -0.03070068359375, -0.0160675048828125, 0.0174713134765625, -0.0631103515625, -0.020751953125, 0.026153564453125, -0.0070648193359375, 
-0.01381683349609375, 0.019134521484375, 0.040435791015625, -0.0325927734375, -0.057159423828125, 0.043182373046875, 0.00826263427734375, -0.03265380859375, 0.06414794921875, -0.023162841796875, 0.037353515625, -0.0810546875, -0.0251312255859375, -0.03607177734375, 0.04132080078125, -0.044158935546875, -0.0775146484375, 0.021148681640625, -0.0477294921875, -0.033050537109375, -0.0038623809814453125, 0.02813720703125, -0.002445220947265625, -0.06317138671875, 0.02838134765625, -0.002044677734375, 0.0129852294921875, -0.0206146240234375, -0.039642333984375, 0.047943115234375, 0.0030384063720703125, 0.0182952880859375, 0.03582763671875, -0.01629638671875, -0.010467529296875, 0.0212554931640625, 0.039947509765625, 0.004032135009765625, 0.0213623046875, 0.0181884765625, 0.0704345703125, -0.0518798828125, -0.026214599609375, -0.052001953125, 0.01364898681640625, -0.030517578125, -0.0269012451171875, 0.07159423828125, 0.0843505859375, 0.04827880859375, -0.018157958984375, 0.055419921875, -0.0007123947143554688, -0.0263824462890625, -0.0250701904296875, 0.050872802734375, -0.055755615234375, 0.00519561767578125, -0.0206298828125, -0.0762939453125, -0.025787353515625, 0.0260772705078125, -0.0207977294921875, -0.0099334716796875, 0.04718017578125, 0.045562744140625, -0.0173492431640625, 0.02203369140625, 0.0310821533203125, 0.01442718505859375, 0.0116119384765625, 0.0298309326171875, 0.0231475830078125, -0.044281005859375, 0.0199737548828125, -0.0256500244140625, -0.018096923828125, 0.0078887939453125, -0.05230712890625, -0.029541015625, -0.02874755859375, -0.0458984375, -0.052825927734375, 0.007366180419921875, 0.0411376953125, 0.0740966796875, -0.06756591796875, -0.034454345703125, 0.0189361572265625, 0.005100250244140625, -0.0281982421875, -0.018157958984375, 0.0367431640625, -0.0293731689453125, -0.07330322265625, 0.006015777587890625, 0.0301971435546875, 0.0177459716796875, -0.0139312744140625, -0.0225372314453125, -0.0188140869140625, 0.00933074951171875, 
0.05194091796875, 0.050750732421875, -0.0283355712890625, -0.004695892333984375, -0.0291900634765625, 0.00341033935546875, 0.053192138671875, 0.027923583984375, -0.055419921875, 0.03765869140625, 0.039093017578125, 0.0024394989013671875, 0.0682373046875, -0.01953125, 0.032928466796875, -0.040863037109375, 0.00920867919921875, -0.003162384033203125, 0.050872802734375, 0.0211944580078125, -0.0134124755859375, 0.048370361328125, 0.036865234375, -0.00981903076171875, -0.04278564453125, -0.0021495819091796875, -0.09393310546875, -0.019866943359375, 0.07147216796875, 0.0008273124694824219, -0.04058837890625, 0.0067138671875, -0.039886474609375, -0.0166015625, -0.0229949951171875, 0.0197601318359375, 0.0291290283203125, -0.0272979736328125, -0.0271759033203125, -0.052764892578125, 0.0350341796875, -0.007205963134765625, -0.060089111328125, -0.00469207763671875, 0.021270751953125, 0.0197296142578125, 0.036865234375, 0.0227203369140625, -0.033660888671875, 0.0301666259765625, 0.01270294189453125, 0.0357666015625, -0.0150299072265625, 0.006259918212890625, -0.005199432373046875, 0.0156707763671875, -0.0168609619140625, -0.0205078125 ] ]
mrm8488/t5-base-finetuned-emotion
2021-06-23T12:46:24.000Z
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "en", "dataset:emotion", "arxiv:1910.10683", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text2text-generation
mrm8488
null
null
mrm8488/t5-base-finetuned-emotion
36
9,248
transformers
2022-03-02T23:29:05
---
language: en
datasets:
- emotion
widget:
- text: "I wish you were here but it is impossible"
---

# T5-base fine-tuned for Emotion Recognition 😂😢😡😃😯

[Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) base fine-tuned on the [emotion recognition](https://github.com/dair-ai/emotion_dataset) dataset for the **Emotion Recognition** downstream task.

## Details of T5

The **T5** model was presented in [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/pdf/1910.10683.pdf) by *Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu*. Here is the abstract:

Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new "Colossal Clean Crawled Corpus", we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our dataset, pre-trained models, and code.

![model image](https://i.imgur.com/jVFMMWR.png)

## Details of the downstream task (Emotion Recognition) - Dataset 📚

[Elvis Saravia](https://twitter.com/omarsar0) has gathered a great [dataset](https://github.com/dair-ai/emotion_dataset) for emotion recognition.
It allows classifying text into one of the following **6** emotions:

- sadness 😢
- joy 😃
- love 🥰
- anger 😡
- fear 😱
- surprise 😯

## Model fine-tuning 🏋️‍

The training script is a slightly modified version of [this Colab Notebook](https://github.com/patil-suraj/exploring-T5/blob/master/t5_fine_tuning.ipynb) created by [Suraj Patil](https://github.com/patil-suraj), so all credits to him!

## Test set metrics 🧾

|            |precision | recall | f1-score |support|
|------------|----------|--------|----------|-------|
|anger       | 0.93| 0.92| 0.93| 275|
|fear        | 0.91| 0.87| 0.89| 224|
|joy         | 0.97| 0.94| 0.95| 695|
|love        | 0.80| 0.91| 0.85| 159|
|sadness     | 0.97| 0.97| 0.97| 521|
|surprise    | 0.73| 0.89| 0.80| 66|
|accuracy    |     |     | 0.93| 2000|
|macro avg   | 0.89| 0.92| 0.90| 2000|
|weighted avg| 0.94| 0.93| 0.93| 2000|

## Model in Action 🚀

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("mrm8488/t5-base-finetuned-emotion")
model = AutoModelWithLMHead.from_pretrained("mrm8488/t5-base-finetuned-emotion")

def get_emotion(text):
    input_ids = tokenizer.encode(text + '</s>', return_tensors='pt')
    output = model.generate(input_ids=input_ids, max_length=2)
    dec = [tokenizer.decode(ids) for ids in output]
    label = dec[0]
    return label

get_emotion("i feel as if i havent blogged in ages are at least truly blogged i am doing an update cute")  # Output: 'joy'
get_emotion("i have a feeling i kinda lost my best friend")  # Output: 'sadness'
```

> Created by [Manuel Romero/@mrm8488](https://twitter.com/mrm8488) | [LinkedIn](https://www.linkedin.com/in/manuel-romero-cs/)
> Made with <span style="color: #e25555;">&hearts;</span> in Spain
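As a quick sanity check on the metrics table (a self-contained sketch, not part of the original card): the "macro avg" row should be the unweighted mean of the per-class scores, and "weighted avg" the support-weighted mean. Recomputing both from the reported per-class values reproduces the table within rounding:

```python
# Reported per-class (precision, recall, f1, support) from the table above.
scores = {
    "anger":    (0.93, 0.92, 0.93, 275),
    "fear":     (0.91, 0.87, 0.89, 224),
    "joy":      (0.97, 0.94, 0.95, 695),
    "love":     (0.80, 0.91, 0.85, 159),
    "sadness":  (0.97, 0.97, 0.97, 521),
    "surprise": (0.73, 0.89, 0.80, 66),
}

# Macro average: unweighted mean of the per-class scores.
macro = [sum(v[i] for v in scores.values()) / len(scores) for i in range(3)]

# Weighted average: per-class scores weighted by their support.
# (Note: the reported supports sum to 1940, not the 2000 shown in the table.)
total = sum(v[3] for v in scores.values())
weighted = [sum(v[i] * v[3] for v in scores.values()) / total for i in range(3)]

for name, (p, r, f1) in [("macro", macro), ("weighted", weighted)]:
    print(f"{name}: precision={p:.2f} recall={r:.2f} f1={f1:.2f}")
```

The macro values come out as roughly 0.89 / 0.92 / 0.90 and the weighted ones as roughly 0.94 / 0.93 / 0.93, matching the last two rows of the table.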
3,990
[ [ -0.034393310546875, -0.0239715576171875, 0.005275726318359375, 0.028350830078125, -0.0126800537109375, 0.01004791259765625, -0.0255279541015625, -0.034759521484375, 0.00411224365234375, -0.0005383491516113281, -0.054962158203125, -0.05169677734375, -0.0640869140625, 0.017486572265625, -0.0254974365234375, 0.089599609375, -0.014251708984375, -0.0099334716796875, 0.0086517333984375, -0.01250457763671875, -0.035736083984375, -0.036651611328125, -0.059722900390625, -0.03271484375, 0.0272369384765625, 0.021392822265625, 0.01313018798828125, 0.0291595458984375, 0.050811767578125, 0.0225830078125, -0.0070343017578125, 0.00812530517578125, -0.041351318359375, -0.013153076171875, -0.0013856887817382812, -0.032684326171875, -0.038055419921875, 0.0013580322265625, 0.026641845703125, 0.0271148681640625, 0.01152801513671875, 0.026580810546875, 0.00595855712890625, 0.052520751953125, -0.053314208984375, 0.018890380859375, -0.032928466796875, 0.022003173828125, 0.005298614501953125, -0.01181793212890625, -0.02667236328125, -0.0229034423828125, 0.01078033447265625, -0.025970458984375, 0.015594482421875, -0.0001742839813232422, 0.0938720703125, 0.0303955078125, -0.02606201171875, -0.025665283203125, -0.04150390625, 0.06085205078125, -0.048553466796875, 0.01445770263671875, 0.01434326171875, 0.0009527206420898438, 0.0095977783203125, -0.059326171875, -0.047637939453125, -0.0007653236389160156, -0.01049041748046875, 0.017608642578125, -0.032379150390625, -0.0052337646484375, 0.02215576171875, 0.04046630859375, -0.037994384765625, -0.0038738250732421875, -0.031280517578125, -0.01146697998046875, 0.054534912109375, 0.005283355712890625, 0.02642822265625, -0.0153656005859375, -0.02655029296875, -0.0265655517578125, -0.0247802734375, 0.0180511474609375, 0.00865936279296875, 0.0144500732421875, -0.03375244140625, 0.02008056640625, -0.00385284423828125, 0.046478271484375, 0.033050537109375, 0.002719879150390625, 0.049102783203125, -0.0217742919921875, -0.0216827392578125, 
-0.0024509429931640625, 0.09088134765625, 0.031646728515625, 0.0345458984375, -0.007358551025390625, -0.016693115234375, -0.0032196044921875, 0.0115966796875, -0.06646728515625, -0.0292816162109375, 0.01904296875, -0.03973388671875, -0.03363037109375, 0.00222015380859375, -0.06463623046875, -0.007312774658203125, -0.021514892578125, 0.04449462890625, -0.048492431640625, -0.02508544921875, 0.0102996826171875, -0.0029048919677734375, 0.0124053955078125, 0.00919342041015625, -0.07708740234375, 0.0085296630859375, 0.0295867919921875, 0.062347412109375, -0.0135040283203125, -0.0299835205078125, -0.015594482421875, -0.02606201171875, -0.019683837890625, 0.049346923828125, -0.01238250732421875, -0.0224761962890625, -0.01224517822265625, 0.006458282470703125, -0.0216827392578125, -0.029144287109375, 0.043060302734375, -0.021240234375, 0.03289794921875, -0.016876220703125, -0.0299835205078125, -0.0189666748046875, 0.01488494873046875, -0.040496826171875, 0.08929443359375, 0.0093841552734375, -0.06280517578125, 0.0171966552734375, -0.07159423828125, -0.0264434814453125, -0.0303955078125, 0.032989501953125, -0.033233642578125, -0.00179290771484375, 0.015655517578125, 0.0455322265625, -0.00832366943359375, 0.005596160888671875, -0.02618408203125, -0.02471923828125, 0.0205078125, -0.0184783935546875, 0.0672607421875, 0.0254364013671875, -0.033203125, 0.0062103271484375, -0.0714111328125, 0.00600433349609375, 0.00882720947265625, -0.0204315185546875, -0.0011653900146484375, -0.028106689453125, 0.0033111572265625, 0.036956787109375, 0.028717041015625, -0.026031494140625, 0.0125885009765625, -0.0340576171875, 0.037384033203125, 0.05279541015625, -0.00843048095703125, 0.035919189453125, -0.020721435546875, 0.031951904296875, 0.012115478515625, -0.0010328292846679688, -0.007598876953125, -0.004638671875, -0.0797119140625, -0.0145416259765625, 0.03662109375, 0.049835205078125, -0.02301025390625, 0.060302734375, -0.018280029296875, -0.04327392578125, -0.060791015625, 
-0.0025196075439453125, 0.0202484130859375, 0.057647705078125, 0.052032470703125, -0.0211029052734375, -0.0648193359375, -0.042938232421875, -0.0287322998046875, -0.0100555419921875, 0.002758026123046875, 0.00921630859375, 0.0445556640625, -0.028167724609375, 0.060516357421875, -0.0350341796875, -0.026947021484375, -0.029998779296875, 0.038330078125, 0.0190887451171875, 0.050384521484375, 0.0443115234375, -0.044281005859375, -0.05084228515625, -0.00745391845703125, -0.064697265625, -0.017608642578125, -0.00986480712890625, -0.01763916015625, 0.03131103515625, 0.027069091796875, -0.0445556640625, 0.022918701171875, 0.033966064453125, -0.0300140380859375, 0.035614013671875, -0.00897216796875, 0.0077362060546875, -0.10302734375, 0.0264739990234375, 0.00611114501953125, -0.0167083740234375, -0.04595947265625, -0.025054931640625, 0.0113067626953125, 0.010223388671875, -0.03125, 0.0531005859375, -0.03948974609375, 0.00717926025390625, -0.01401519775390625, -0.00861358642578125, -0.001129150390625, 0.0565185546875, -0.0027256011962890625, 0.04754638671875, 0.044647216796875, -0.043182373046875, 0.022064208984375, 0.034698486328125, -0.007587432861328125, 0.044342041015625, -0.036346435546875, 0.0189971923828125, -0.01506805419921875, 0.0124664306640625, -0.07476806640625, -0.005931854248046875, 0.028839111328125, -0.05706787109375, 0.0469970703125, -0.0078582763671875, -0.021728515625, -0.03155517578125, -0.0193023681640625, 0.0077056884765625, 0.05322265625, -0.038330078125, 0.0487060546875, 0.007518768310546875, 0.0007495880126953125, -0.064453125, -0.061187744140625, 0.0091552734375, -0.0318603515625, -0.034637451171875, 0.03375244140625, -0.01338958740234375, 0.007442474365234375, -0.0003361701965332031, -0.00897216796875, -0.01012420654296875, 0.0116729736328125, 0.016204833984375, 0.0260162353515625, -0.0041046142578125, 0.0186309814453125, -0.0103302001953125, -0.0115966796875, 0.01367950439453125, -0.0007624626159667969, 0.049957275390625, -0.03179931640625, 
0.0120391845703125, -0.052459716796875, 0.006061553955078125, 0.037933349609375, -0.0229034423828125, 0.051025390625, 0.082275390625, -0.0252838134765625, -0.01287078857421875, -0.0299835205078125, -0.004230499267578125, -0.035308837890625, 0.033355712890625, -0.037109375, -0.052276611328125, 0.04534912109375, -0.0139007568359375, -0.001552581787109375, 0.06298828125, 0.044036865234375, -0.020416259765625, 0.07806396484375, 0.05712890625, -0.0253448486328125, 0.049224853515625, -0.0479736328125, 0.0197601318359375, -0.06927490234375, -0.0217437744140625, -0.040435791015625, -0.039306640625, -0.05145263671875, -0.004215240478515625, 0.004425048828125, 0.005367279052734375, -0.0384521484375, 0.0253143310546875, -0.04302978515625, 0.0147552490234375, 0.03271484375, 0.007755279541015625, 0.005168914794921875, -0.004962921142578125, 0.0015430450439453125, -0.01047515869140625, -0.05010986328125, -0.0187530517578125, 0.0841064453125, 0.0202789306640625, 0.053131103515625, -0.0139312744140625, 0.0665283203125, 0.0209808349609375, 0.0302734375, -0.063232421875, 0.032470703125, -0.0170135498046875, -0.022216796875, -0.0029048919677734375, -0.03631591796875, -0.07159423828125, 0.00768280029296875, -0.0298919677734375, -0.05963134765625, 0.0244903564453125, 0.01157379150390625, -0.0228424072265625, 0.020782470703125, -0.0687255859375, 0.0826416015625, -0.020599365234375, -0.021026611328125, 0.006748199462890625, -0.06396484375, 0.0174407958984375, -0.00003647804260253906, -0.004669189453125, -0.006320953369140625, 0.01513671875, 0.05670166015625, -0.0246124267578125, 0.060211181640625, -0.00820159912109375, 0.01461029052734375, 0.007442474365234375, 0.0045166015625, 0.03314208984375, -0.0231475830078125, -0.006061553955078125, 0.00865936279296875, 0.0002684593200683594, -0.0166778564453125, -0.03863525390625, 0.0443115234375, -0.0767822265625, -0.007354736328125, -0.0396728515625, -0.037811279296875, -0.01009368896484375, 0.0257568359375, 0.038848876953125, 
0.00969696044921875, -0.00986480712890625, 0.010162353515625, 0.0521240234375, -0.015472412109375, 0.0479736328125, 0.0216522216796875, 0.00676727294921875, -0.025482177734375, 0.07122802734375, -0.0042724609375, 0.00855255126953125, 0.03363037109375, 0.00848388671875, -0.032623291015625, -0.0189361572265625, -0.01593017578125, 0.025360107421875, -0.05450439453125, -0.031982421875, -0.0654296875, -0.0161590576171875, -0.04266357421875, -0.002758026123046875, -0.037994384765625, -0.0210723876953125, -0.054443359375, -0.0181121826171875, 0.033599853515625, 0.038970947265625, -0.018829345703125, 0.016204833984375, -0.05877685546875, 0.0209197998046875, 0.0034046173095703125, 0.0241851806640625, -0.0030231475830078125, -0.0557861328125, -0.01068115234375, 0.00428009033203125, -0.0249786376953125, -0.06402587890625, 0.060272216796875, 0.0286102294921875, 0.01073455810546875, 0.0203094482421875, 0.0060272216796875, 0.0616455078125, -0.016021728515625, 0.046478271484375, 0.017913818359375, -0.08953857421875, 0.044769287109375, -0.0017442703247070312, 0.026947021484375, 0.049346923828125, 0.045654296875, -0.0550537109375, -0.00623321533203125, -0.0594482421875, -0.06732177734375, 0.0726318359375, 0.011444091796875, 0.00435638427734375, 0.0235748291015625, 0.0226898193359375, -0.005474090576171875, 0.0246734619140625, -0.06903076171875, -0.02557373046875, -0.0279693603515625, -0.050872802734375, -0.003597259521484375, -0.00396728515625, 0.0090484619140625, -0.031494140625, 0.05218505859375, -0.0098876953125, 0.05126953125, 0.0295257568359375, -0.0230255126953125, -0.00531005859375, 0.00827789306640625, 0.0287322998046875, 0.02398681640625, -0.0390625, 0.0000845789909362793, 0.022064208984375, -0.0305023193359375, 0.00342559814453125, 0.0127410888671875, -0.00006663799285888672, -0.0026340484619140625, 0.0270843505859375, 0.087890625, -0.005527496337890625, -0.0293426513671875, 0.0411376953125, -0.01128387451171875, -0.0229339599609375, -0.0228118896484375, 
0.0078277587890625, -0.0009946823120117188, 0.0122222900390625, 0.0142364501953125, 0.0261383056640625, 0.0107421875, -0.03814697265625, 0.005855560302734375, 0.01488494873046875, -0.03900146484375, -0.043304443359375, 0.050567626953125, 0.00861358642578125, -0.020782470703125, 0.04437255859375, -0.016143798828125, -0.061065673828125, 0.053009033203125, 0.036956787109375, 0.0799560546875, -0.01309967041015625, 0.01355743408203125, 0.05810546875, 0.01418304443359375, -0.01323699951171875, 0.032867431640625, -0.005126953125, -0.056854248046875, -0.024658203125, -0.050048828125, -0.02227783203125, 0.01271820068359375, -0.04779052734375, 0.035491943359375, -0.0263671875, -0.00695037841796875, -0.00623321533203125, 0.012786865234375, -0.06640625, 0.042938232421875, 0.014373779296875, 0.06793212890625, -0.06378173828125, 0.04962158203125, 0.05279541015625, -0.03662109375, -0.07281494140625, -0.0036983489990234375, -0.017181396484375, -0.056610107421875, 0.054962158203125, 0.0227508544921875, -0.002635955810546875, 0.019256591796875, -0.0496826171875, -0.06695556640625, 0.08837890625, 0.019500732421875, -0.0233612060546875, -0.0068817138671875, 0.0079498291015625, 0.06744384765625, -0.0270538330078125, 0.0389404296875, 0.044891357421875, 0.031829833984375, 0.01183319091796875, -0.060333251953125, 0.01137542724609375, -0.03839111328125, -0.018310546875, 0.00771331787109375, -0.062744140625, 0.07037353515625, -0.0193023681640625, -0.0079498291015625, -0.0162811279296875, 0.05596923828125, 0.00928497314453125, 0.02069091796875, 0.04071044921875, 0.05670166015625, 0.061370849609375, -0.0165252685546875, 0.07391357421875, -0.03759765625, 0.057373046875, 0.06591796875, 0.0068511962890625, 0.055023193359375, 0.0235748291015625, -0.0190277099609375, 0.03997802734375, 0.066650390625, -0.0126800537109375, 0.043212890625, 0.008575439453125, -0.006473541259765625, -0.00936126708984375, 0.006580352783203125, -0.026275634765625, 0.0288543701171875, 0.0173492431640625, 
-0.035552978515625, -0.007022857666015625, 0.00420379638671875, 0.0181121826171875, -0.0290069580078125, -0.0159149169921875, 0.06353759765625, 0.01103973388671875, -0.049957275390625, 0.06658935546875, -0.006671905517578125, 0.0714111328125, -0.0433349609375, 0.005916595458984375, -0.030670166015625, 0.019134521484375, -0.030670166015625, -0.053985595703125, 0.0281524658203125, 0.0114593505859375, -0.004428863525390625, -0.03765869140625, 0.0673828125, -0.0389404296875, -0.028076171875, 0.026519775390625, 0.0249176025390625, 0.0110321044921875, -0.006984710693359375, -0.067138671875, -0.007843017578125, 0.01279449462890625, -0.03057861328125, 0.0160064697265625, 0.045562744140625, 0.0198516845703125, 0.051025390625, 0.041534423828125, 0.00679779052734375, 0.0015506744384765625, -0.00843048095703125, 0.0546875, -0.05987548828125, -0.03521728515625, -0.06787109375, 0.049041748046875, 0.0014133453369140625, -0.0428466796875, 0.034637451171875, 0.0289154052734375, 0.06292724609375, -0.01163482666015625, 0.0662841796875, -0.0236053466796875, 0.039031982421875, -0.0271453857421875, 0.051849365234375, -0.059967041015625, -0.005069732666015625, -0.0265045166015625, -0.046539306640625, -0.0266876220703125, 0.0555419921875, -0.0223541259765625, 0.01168060302734375, 0.059295654296875, 0.046600341796875, -0.00489044189453125, -0.0127410888671875, -0.005229949951171875, 0.0297393798828125, 0.01457977294921875, 0.065185546875, 0.043975830078125, -0.05816650390625, 0.046142578125, -0.034942626953125, -0.001987457275390625, -0.0215606689453125, -0.06640625, -0.06719970703125, -0.049835205078125, -0.031280517578125, -0.03582763671875, -0.00251007080078125, 0.0831298828125, 0.0426025390625, -0.058837890625, -0.011077880859375, -0.0174560546875, -0.0004105567932128906, -0.0106658935546875, -0.0193939208984375, 0.03912353515625, -0.035614013671875, -0.0693359375, 0.001857757568359375, -0.00485992431640625, 0.018096923828125, 0.00478363037109375, -0.00330352783203125, 
-0.01934814453125, -0.01454925537109375, 0.0360107421875, 0.0262298583984375, -0.0182342529296875, -0.01091766357421875, 0.0117034912109375, -0.01079559326171875, 0.0306396484375, 0.0251922607421875, -0.04681396484375, 0.0236663818359375, 0.048858642578125, 0.05084228515625, 0.05267333984375, -0.004119873046875, 0.0341796875, -0.04901123046875, -0.003665924072265625, 0.0125885009765625, 0.0259857177734375, 0.036041259765625, -0.0265045166015625, 0.049224853515625, 0.0268402099609375, -0.04486083984375, -0.06292724609375, -0.01259613037109375, -0.0904541015625, -0.0014133453369140625, 0.08062744140625, -0.014129638671875, -0.0288543701171875, 0.022491455078125, -0.0240631103515625, 0.046051025390625, -0.036285400390625, 0.0750732421875, 0.059967041015625, -0.0153045654296875, -0.026580810546875, -0.0251007080078125, 0.052642822265625, 0.043060302734375, -0.06927490234375, -0.01358795166015625, 0.01873779296875, 0.03582763671875, 0.0136260986328125, 0.046661376953125, 0.004154205322265625, 0.0153350830078125, -0.024749755859375, 0.040557861328125, -0.0003428459167480469, -0.0046844482421875, -0.04595947265625, 0.00725555419921875, -0.019989013671875, -0.0285797119140625 ] ]
bennyguo/zero123-diffusers
2023-08-07T08:31:53.000Z
[ "diffusers", "arxiv:2303.11328", "license:mit", "has_space", "diffusers:Zero123Pipeline", "region:us" ]
null
bennyguo
null
null
bennyguo/zero123-diffusers
2
9,237
diffusers
2023-08-01T08:38:23
---
license: mit
---

# Uses

_Note: This section is originally taken from the [Stable Diffusion v2 model card](https://huggingface.co/stabilityai/stable-diffusion-2), but applies in the same way to Zero-1-to-3._

## Direct Use

The model is intended for research purposes only. Possible research areas and tasks include:

- Safe deployment of large-scale models.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.

Excluded uses are described below.

### Misuse, Malicious Use, and Out-of-Scope Use

The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.

#### Out-of-Scope Use

The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.

#### Misuse and Malicious Use

Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:

- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation.
- Representations of egregious violence and gore.
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias

### Limitations

- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- Faces and people in general may not be parsed or generated properly.
- The autoencoding part of the model is lossy.
- Stable Diffusion was trained on a subset of the large-scale dataset [LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent and sexual content. To partially mitigate this, Stability AI has filtered the dataset using LAION's NSFW detector.
- Zero-1-to-3 was subsequently finetuned on a subset of the large-scale dataset [Objaverse](https://objaverse.allenai.org/), which might also potentially contain inappropriate content. To partially mitigate this, our demo applies a safety check to every uploaded image.

### Bias

While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases. Stable Diffusion was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/), which consists of images that are limited to English descriptions. Images and concepts from communities and cultures that use other languages are likely to be insufficiently accounted for. This affects the overall output of the model, as Western cultures are often overrepresented. Stable Diffusion mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent.

### Safety Module

The intended use of this model is with the [Safety Checker](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/stable_diffusion/safety_checker.py) in Diffusers. This checker works by checking model inputs against known hard-coded NSFW concepts. Specifically, the checker compares the class probability of harmful concepts in the embedding space of the uploaded input images. The concepts are passed into the model with the image and compared to a hand-engineered weight for each NSFW concept.
## Citation

```
@misc{liu2023zero1to3,
    title={Zero-1-to-3: Zero-shot One Image to 3D Object},
    author={Ruoshi Liu and Rundi Wu and Basile Van Hoorick and Pavel Tokmakov and Sergey Zakharov and Carl Vondrick},
    year={2023},
    eprint={2303.11328},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```
4,272
Qwen/Qwen-14B-Chat
2023-11-05T03:27:18.000Z
[ "transformers", "safetensors", "qwen", "text-generation", "custom_code", "zh", "en", "arxiv:2309.16609", "arxiv:2305.08322", "arxiv:2009.03300", "arxiv:2305.05280", "arxiv:2210.03629", "region:us" ]
text-generation
Qwen
null
null
Qwen/Qwen-14B-Chat
257
9,236
transformers
2023-09-24T03:27:58
---
language:
- zh
- en
tags:
- qwen
pipeline_tag: text-generation
inference: false
---

# Qwen-14B-Chat

<p align="center">
    <img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/logo_qwen.jpg" width="400"/>
<p>
<br>

<p align="center">
    🤗 <a href="https://huggingface.co/Qwen">Hugging Face</a>&nbsp&nbsp | &nbsp&nbsp🤖 <a href="https://modelscope.cn/organization/qwen">ModelScope</a>&nbsp&nbsp | &nbsp&nbsp 📑 <a href="https://arxiv.org/abs/2309.16609">Paper</a>&nbsp&nbsp | &nbsp&nbsp🖥️ <a href="https://modelscope.cn/studios/qwen/Qwen-7B-Chat-Demo/summary">Demo</a>
<br>
<a href="https://github.com/QwenLM/Qwen/blob/main/assets/wechat.png">WeChat (微信)</a>&nbsp&nbsp | &nbsp&nbsp DingTalk (钉钉) &nbsp&nbsp | &nbsp&nbsp<a href="https://discord.gg/z3GAxXZ9Ce">Discord</a>&nbsp&nbsp
</p>
<br><br>

## 介绍(Introduction)

**通义千问-14B(Qwen-14B)**是阿里云研发的通义千问大模型系列的140亿参数规模的模型。Qwen-14B是基于Transformer的大语言模型,在超大规模的预训练数据上进行训练得到。预训练数据类型多样,覆盖广泛,包括大量网络文本、专业书籍、代码等。同时,在Qwen-14B的基础上,我们使用对齐机制打造了基于大语言模型的AI助手Qwen-14B-Chat。本仓库为Qwen-14B-Chat的仓库。

如果您想了解更多关于通义千问-14B开源模型的细节,我们建议您参阅[GitHub代码库](https://github.com/QwenLM/Qwen)。

**Qwen-14B** is the 14B-parameter version of the large language model series, Qwen (abbr. Tongyi Qianwen), proposed by Alibaba Cloud. Qwen-14B is a Transformer-based large language model, which is pretrained on a large volume of data, including web texts, books, code, etc. Additionally, based on the pretrained Qwen-14B, we release Qwen-14B-Chat, a large-model-based AI assistant, which is trained with alignment techniques. This repository hosts Qwen-14B-Chat.

For more details about the open-source model of Qwen-14B, please refer to the [GitHub](https://github.com/QwenLM/Qwen) code repository.
<br>

## 要求(Requirements)

* python 3.8及以上版本
* pytorch 1.12及以上版本,推荐2.0及以上版本
* 建议使用CUDA 11.4及以上(GPU用户、flash-attention用户等需考虑此选项)
* python 3.8 and above
* pytorch 1.12 and above, 2.0 and above are recommended
* CUDA 11.4 and above are recommended (this is for GPU users, flash-attention users, etc.)
<br>

## 依赖项(Dependency)

运行Qwen-14B-Chat,请确保满足上述要求,再执行以下pip命令安装依赖库

To run Qwen-14B-Chat, please make sure you meet the above requirements, and then execute the following pip commands to install the dependent libraries.

```bash
pip install transformers==4.32.0 accelerate tiktoken einops scipy transformers_stream_generator==0.0.4 peft deepspeed
```

另外,推荐安装`flash-attention`库(**当前已支持flash attention 2**),以实现更高的效率和更低的显存占用。

In addition, it is recommended to install the `flash-attention` library (**we support flash attention 2 now.**) for higher efficiency and lower memory usage.

```bash
git clone https://github.com/Dao-AILab/flash-attention
cd flash-attention && pip install .
# 下方安装可选,安装可能比较缓慢。
# pip install csrc/layer_norm
# pip install csrc/rotary
```
<br>

## 快速使用(Quickstart)

下面我们展示了一个使用Qwen-14B-Chat模型,进行多轮对话交互的样例:

We show an example of multi-turn interaction with Qwen-14B-Chat in the following code:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig

# Note: The default behavior now has injection attack prevention off.
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-14B-Chat", trust_remote_code=True)

# use bf16
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-14B-Chat", device_map="auto", trust_remote_code=True, bf16=True).eval()
# use fp16
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-14B-Chat", device_map="auto", trust_remote_code=True, fp16=True).eval()
# use cpu only
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-14B-Chat", device_map="cpu", trust_remote_code=True).eval()
# use auto mode, automatically select precision based on the device.
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-14B-Chat", device_map="auto", trust_remote_code=True).eval()

# Specify hyperparameters for generation. But if you use transformers>=4.32.0, there is no need to do this.
# model.generation_config = GenerationConfig.from_pretrained("Qwen/Qwen-14B-Chat", trust_remote_code=True)  # 可指定不同的生成长度、top_p等相关超参

# 第一轮对话 1st dialogue turn
response, history = model.chat(tokenizer, "你好", history=None)
print(response)
# 你好!很高兴为你提供帮助。

# 第二轮对话 2nd dialogue turn
response, history = model.chat(tokenizer, "给我讲一个年轻人奋斗创业最终取得成功的故事。", history=history)
print(response)
# 这是一个关于一个年轻人奋斗创业最终取得成功的故事。
# 故事的主人公叫李明,他来自一个普通的家庭,父母都是普通的工人。从小,李明就立下了一个目标:要成为一名成功的企业家。
# 为了实现这个目标,李明勤奋学习,考上了大学。在大学期间,他积极参加各种创业比赛,获得了不少奖项。他还利用课余时间去实习,积累了宝贵的经验。
# 毕业后,李明决定开始自己的创业之路。他开始寻找投资机会,但多次都被拒绝了。然而,他并没有放弃。他继续努力,不断改进自己的创业计划,并寻找新的投资机会。
# 最终,李明成功地获得了一笔投资,开始了自己的创业之路。他成立了一家科技公司,专注于开发新型软件。在他的领导下,公司迅速发展起来,成为了一家成功的科技企业。
# 李明的成功并不是偶然的。他勤奋、坚韧、勇于冒险,不断学习和改进自己。他的成功也证明了,只要努力奋斗,任何人都有可能取得成功。

# 第三轮对话 3rd dialogue turn
response, history = model.chat(tokenizer, "给这个故事起一个标题", history=history)
print(response)
# 《奋斗创业:一个年轻人的成功之路》
```

关于更多的使用说明,请参考我们的[GitHub repo](https://github.com/QwenLM/Qwen)获取更多信息。

For more information, please refer to our [GitHub repo](https://github.com/QwenLM/Qwen).
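The key detail in the multi-turn example is that `model.chat` returns an updated `history`, which must be passed back in on the next turn. A minimal sketch of that threading pattern, using a hypothetical stand-in for `model.chat` so it runs without downloading the model:

```python
def run_dialogue(chat_fn, queries):
    """Thread the conversation history through successive turns,
    mirroring: response, history = model.chat(tokenizer, query, history=...)."""
    history = None
    responses = []
    for query in queries:
        response, history = chat_fn(query, history)
        responses.append(response)
    return responses, history

# Hypothetical stand-in for model.chat: records each turn and echoes the query.
def fake_chat(query, history):
    history = (history or []) + [(query, f"reply to: {query}")]
    return history[-1][1], history

responses, history = run_dialogue(fake_chat, ["你好", "给我讲一个故事", "给这个故事起一个标题"])
print(len(history))  # one (query, response) pair per turn
```

Passing `history=None` on a later turn silently resets the conversation, so the returned history should always be threaded through.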
<br>

## 量化 (Quantization)

### 用法 (Usage)

**请注意:我们更新量化方案为基于[AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ)的量化,提供Qwen-14B-Chat的Int4量化模型[点击这里](https://huggingface.co/Qwen/Qwen-14B-Chat-Int4)。相比此前方案,该方案在模型评测效果几乎无损,且存储需求更低,推理速度更优。**

**Note: we provide a new solution based on [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ), and release an Int4 quantized model for Qwen-14B-Chat [Click here](https://huggingface.co/Qwen/Qwen-14B-Chat-Int4), which achieves nearly lossless model quality with lower memory costs and faster inference speed, in comparison with the previous solution.**

以下我们提供示例说明如何使用Int4量化模型。在开始使用前,请先保证满足要求(如torch 2.0及以上,transformers版本为4.32.0及以上,等等),并安装所需安装包:

Here we demonstrate how to use our provided quantized models for inference. Before you start, make sure you meet the requirements of auto-gptq (e.g., torch 2.0 and above, transformers 4.32.0 and above, etc.) and install the required packages:

```bash
pip install auto-gptq optimum
```

如安装`auto-gptq`遇到问题,我们建议您到官方[repo](https://github.com/PanQiWei/AutoGPTQ)搜索合适的预编译wheel。

随后即可使用和上述一致的用法调用量化模型:

If you meet problems installing `auto-gptq`, we advise you to check out the official [repo](https://github.com/PanQiWei/AutoGPTQ) to find a pre-built wheel.

Then you can load the quantized model easily and run inference the same as usual:

```python
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-14B-Chat-Int4",
    device_map="auto",
    trust_remote_code=True
).eval()
response, history = model.chat(tokenizer, "你好", history=None)
```

### 效果评测

我们对BF16,Int8和Int4模型在基准评测上做了测试(使用zero-shot设置),发现量化模型效果损失较小,结果如下所示:

We illustrate the zero-shot performance of the BF16, Int8, and Int4 models on the benchmark, and we find that the quantized models do not suffer from significant performance degradation.
Results are shown below:

| Quantization | MMLU | CEval (val) | GSM8K | Humaneval |
|--------------|:----:|:-----------:|:-----:|:---------:|
| BF16         | 64.6 | 69.8        | 60.1  | 43.9      |
| Int8         | 63.6 | 68.6        | 60.0  | 48.2      |
| Int4         | 63.3 | 69.0        | 59.8  | 45.7      |

### 推理速度 (Inference Speed)

我们测算了不同精度模型以及不同FlashAttn库版本下模型生成2048和8192个token的平均推理速度。如图所示:

We measured the average inference speed of generating 2048 and 8192 tokens with different quantization levels and versions of flash-attention, respectively.

| Quantization | FlashAttn | Speed (2048 tokens) | Speed (8192 tokens) |
| ------------ | :-------: | :-----------------: | :-----------------: |
| BF16         | v2        | 32.88               | 24.87               |
| Int8         | v2        | 29.28               | 24.22               |
| Int4         | v2        | 38.72               | 27.33               |
| BF16         | v1        | 32.76               | 28.89               |
| Int8         | v1        | 28.31               | 23.87               |
| Int4         | v1        | 37.81               | 26.46               |
| BF16         | Disabled  | 29.32               | 22.91               |
| Int8         | Disabled  | 31.12               | 24.60               |
| Int4         | Disabled  | 37.65               | 26.00               |

具体而言,我们记录在长度为1的上下文的条件下生成8192个token的性能。评测运行于单张A100-SXM4-80G GPU,使用PyTorch 2.0.1和CUDA 11.8。推理速度是生成8192个token的速度均值。

In detail, the setting of profiling is generating 8192 new tokens with 1 context token. The profiling runs on a single A100-SXM4-80G GPU with PyTorch 2.0.1 and CUDA 11.8. The inference speed is averaged over the generated 8192 tokens.

注意:以上Int4/Int8模型生成速度使用autogptq库给出,当前`AutoModelForCausalLM.from_pretrained`载入的模型生成速度会慢大约20%。我们已经将该问题汇报给HuggingFace团队,若有解决方案将即时更新。

Note: The generation speed of the Int4/Int8 models mentioned above is provided by the autogptq library. The current speed of the model loaded using `AutoModelForCausalLM.from_pretrained` will be approximately 20% slower. We have reported this issue to the HuggingFace team and will update it promptly if a solution is available.
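The speed numbers above are averages: new tokens divided by wall-clock decoding time. A small timing wrapper in that spirit, shown here with a sleep-based stand-in generator so the sketch runs without a GPU or the model:

```python
import time

def tokens_per_second(generate_fn, n_tokens):
    """Average decoding speed over one run: generated tokens / elapsed seconds."""
    start = time.perf_counter()
    generate_fn(n_tokens)  # e.g. a closure over model.generate(..., max_new_tokens=n_tokens)
    elapsed = time.perf_counter() - start
    return n_tokens / elapsed

# Stand-in generator: pretend each token costs about 1 ms to decode.
speed = tokens_per_second(lambda n: time.sleep(n * 0.001), 100)
print(f"{speed:.0f} tokens/s")
```

Against the real model one would pass a closure over `model.generate` with a 1-token context, matching the profiling setup described above.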
### 显存使用 (GPU Memory Usage)

我们还测算了不同模型精度编码2048个token及生成8192个token的峰值显存占用情况。(显存消耗在是否使用FlashAttn的情况下均类似。)结果如下所示:

We also profile the peak GPU memory usage for encoding 2048 tokens as context (and generating a single token) and for generating 8192 tokens (with a single token as context) under different quantization levels. (The GPU memory usage is similar whether flash-attention is used or not.) The results are shown below.

| Quantization Level | Peak Usage for Encoding 2048 Tokens | Peak Usage for Generating 8192 Tokens |
| ------------------ | :---------------------------------: | :-----------------------------------: |
| BF16               | 30.15GB                             | 38.94GB                               |
| Int8               | 18.81GB                             | 27.54GB                               |
| Int4               | 13.01GB                             | 21.79GB                               |

上述性能测算使用[此脚本](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile.py)完成。

The above speed and memory profiling are conducted using [this script](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile.py).

<br>

## 模型细节(Model)

与Qwen-14B预训练模型相同,Qwen-14B-Chat模型规模基本情况如下所示

The details of the model architecture of Qwen-14B-Chat are listed as follows

| Hyperparameter  | Value  |
|:----------------|:------:|
| n_layers        | 40     |
| n_heads         | 40     |
| d_model         | 5120   |
| vocab size      | 151851 |
| sequence length | 2048   |

在位置编码、FFN激活函数和normalization的实现方式上,我们也采用了目前最流行的做法,即RoPE相对位置编码、SwiGLU激活函数、RMSNorm(可选安装flash-attention加速)。

在分词器方面,相比目前主流开源模型以中英词表为主,Qwen-14B-Chat使用了约15万token大小的词表。该词表在GPT-4使用的BPE词表`cl100k_base`基础上,对中文、多语言进行了优化,在对中、英、代码数据的高效编解码的基础上,对部分多语言更加友好,方便用户在不扩展词表的情况下对部分语种进行能力增强。词表对数字按单个数字位切分。调用较为高效的[tiktoken分词库](https://github.com/openai/tiktoken)进行分词。

For position encoding, FFN activation function, and normalization calculation methods, we adopt the prevalent practices, i.e., RoPE relative position encoding, SwiGLU for activation function, and RMSNorm for normalization (optional installation of flash-attention for acceleration).
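As a sanity check, the hyperparameters in the table above roughly reproduce the 14B parameter count. The sketch below assumes an untied output head and a SwiGLU FFN width of 13696; the FFN width is not listed in the table, so treat that value as an assumption:

```python
def approx_params(n_layers, d_model, vocab_size, d_ff):
    """Back-of-envelope transformer parameter count (ignores biases and norms)."""
    embed = vocab_size * d_model   # input embedding
    head = vocab_size * d_model    # output projection (assumed untied)
    attn = 4 * d_model * d_model   # Q, K, V, O projections per layer
    ffn = 3 * d_model * d_ff       # SwiGLU: gate, up, down per layer
    return embed + head + n_layers * (attn + ffn)

n = approx_params(n_layers=40, d_model=5120, vocab_size=151851, d_ff=13696)
print(f"{n / 1e9:.1f}B")  # ≈ 14.2B, consistent with the model name
```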
For tokenization, compared to the current mainstream open-source models based on Chinese and English vocabularies, Qwen-14B-Chat uses a vocabulary of over 150K tokens. It first considers efficient encoding of Chinese, English, and code data, and is also friendlier to other languages, enabling users to directly enhance the capability for some languages without expanding the vocabulary. It segments numbers into single digits, and calls the efficient [tiktoken](https://github.com/openai/tiktoken) tokenizer library for tokenization.
<br>

## 评测效果(Evaluation)

对于Qwen-14B-Chat模型,我们同样评测了常规的中文理解(C-Eval)、英文理解(MMLU)、代码(HumanEval)和数学(GSM8K)等权威任务,同时包含了长序列任务的评测结果。由于Qwen-14B-Chat模型经过对齐后,激发了较强的外部系统调用能力,我们还进行了工具使用能力方面的评测。

提示:由于硬件和框架造成的舍入误差,复现结果如有波动属于正常现象。

For Qwen-14B-Chat, we also evaluate the model on C-Eval, MMLU, HumanEval, GSM8K, etc., as well as the benchmark evaluation for long-context understanding, and tool usage.

Note: Due to rounding errors caused by hardware and framework, differences in reproduced results are possible.

### 中文评测(Chinese Evaluation)

#### C-Eval

在[C-Eval](https://arxiv.org/abs/2305.08322)验证集上,我们评价了Qwen-14B-Chat模型的0-shot & 5-shot准确率

We demonstrate the 0-shot & 5-shot accuracy of Qwen-14B-Chat on the C-Eval validation set

| Model                            | Avg. Acc. |
|:--------------------------------:|:---------:|
| LLaMA2-7B-Chat                   | 31.9      |
| LLaMA2-13B-Chat                  | 36.2      |
| LLaMA2-70B-Chat                  | 44.3      |
| ChatGLM2-6B-Chat                 | 52.6      |
| InternLM-7B-Chat                 | 53.6      |
| Baichuan2-7B-Chat                | 55.6      |
| Baichuan2-13B-Chat               | 56.7      |
| Qwen-7B-Chat (original) (0-shot) | 54.2      |
| **Qwen-7B-Chat (0-shot)**        | 59.7      |
| **Qwen-7B-Chat (5-shot)**        | 59.3      |
| **Qwen-14B-Chat (0-shot)**       | 69.8      |
| **Qwen-14B-Chat (5-shot)**       | **71.7**  |

C-Eval测试集上,Qwen-14B-Chat模型的zero-shot准确率结果如下:

The zero-shot accuracy of Qwen-14B-Chat on the C-Eval testing set is provided below:

| Model                   | Avg.     | STEM | Social Sciences | Humanities | Others |
| :---------------------- | :------: | :--: | :-------------: | :--------: | :----: |
| Chinese-Alpaca-Plus-13B | 41.5     | 36.6 | 49.7            | 43.1       | 41.2   |
| Chinese-Alpaca-2-7B     | 40.3     | -    | -               | -          | -      |
| ChatGLM2-6B-Chat        | 50.1     | 46.4 | 60.4            | 50.6       | 46.9   |
| Baichuan-13B-Chat       | 51.5     | 43.7 | 64.6            | 56.2       | 49.2   |
| Qwen-7B-Chat (original) | 54.6     | 47.8 | 67.6            | 59.3       | 50.6   |
| **Qwen-7B-Chat**        | 58.6     | 53.3 | 72.1            | 62.8       | 52.0   |
| **Qwen-14B-Chat**       | **69.1** | 65.1 | 80.9            | 71.2       | 63.4   |

在14B规模模型上,经过人类指令对齐的Qwen-14B-Chat模型,准确率在同类相近规模模型中仍然处于前列。

Compared with other pretrained models of comparable size, the human-aligned Qwen-14B-Chat performs well in C-Eval accuracy.

### 英文评测(English Evaluation)

#### MMLU

[MMLU](https://arxiv.org/abs/2009.03300)评测集上,Qwen-14B-Chat模型的 0-shot & 5-shot 准确率如下,效果同样在同类对齐模型中表现较优。

The 0-shot & 5-shot accuracy of Qwen-14B-Chat on MMLU is provided below. The performance of Qwen-14B-Chat is still at the top among other human-aligned models of comparable size.

| Model                            | Avg. Acc. |
|:--------------------------------:|:---------:|
| ChatGLM2-6B-Chat                 | 46.0      |
| LLaMA2-7B-Chat                   | 46.2      |
| InternLM-7B-Chat                 | 51.1      |
| Baichuan2-7B-Chat                | 52.9      |
| LLaMA2-13B-Chat                  | 54.6      |
| Baichuan2-13B-Chat               | 57.3      |
| LLaMA2-70B-Chat                  | 63.8      |
| Qwen-7B-Chat (original) (0-shot) | 53.9      |
| **Qwen-7B-Chat (0-shot)**        | 55.8      |
| **Qwen-7B-Chat (5-shot)**        | 57.0      |
| **Qwen-14B-Chat (0-shot)**       | 64.6      |
| **Qwen-14B-Chat (5-shot)**       | **66.5**  |

### 代码评测(Coding Evaluation)

Qwen-14B-Chat在[HumanEval](https://github.com/openai/human-eval)的zero-shot Pass@1效果如下

The zero-shot Pass@1 of Qwen-14B-Chat on [HumanEval](https://github.com/openai/human-eval) is demonstrated below

| Model                   | Pass@1   |
|:-----------------------:|:--------:|
| ChatGLM2-6B-Chat        | 11.0     |
| LLaMA2-7B-Chat          | 12.2     |
| InternLM-7B-Chat        | 14.6     |
| Baichuan2-7B-Chat       | 13.4     |
| LLaMA2-13B-Chat         | 18.9     |
| Baichuan2-13B-Chat      | 17.7     |
| LLaMA2-70B-Chat         | 32.3     |
| Qwen-7B-Chat (original) | 24.4     |
| **Qwen-7B-Chat**        | 37.2     |
| **Qwen-14B-Chat**       | **43.9** |

### 数学评测(Mathematics Evaluation)

在评测数学能力的[GSM8K](https://github.com/openai/grade-school-math)上,Qwen-14B-Chat的准确率结果如下

The accuracy of Qwen-14B-Chat on GSM8K is shown below

| Model                            | Acc.     |
|:--------------------------------:|:--------:|
| LLaMA2-7B-Chat                   | 26.3     |
| ChatGLM2-6B-Chat                 | 28.8     |
| Baichuan2-7B-Chat                | 32.8     |
| InternLM-7B-Chat                 | 33.0     |
| LLaMA2-13B-Chat                  | 37.1     |
| Baichuan2-13B-Chat               | 55.3     |
| LLaMA2-70B-Chat                  | 59.3     |
| Qwen-7B-Chat (original) (0-shot) | 41.1     |
| **Qwen-7B-Chat (0-shot)**        | 50.3     |
| **Qwen-7B-Chat (8-shot)**        | 54.1     |
| **Qwen-14B-Chat (0-shot)**       | **60.1** |
| **Qwen-14B-Chat (8-shot)**       | 59.3     |

### 长序列评测(Long-Context Understanding)

通过NTK插值,LogN注意力缩放可以扩展Qwen-14B-Chat的上下文长度。在长文本摘要数据集[VCSUM](https://arxiv.org/abs/2305.05280)上(文本平均长度在15K左右),Qwen-14B-Chat的Rouge-L结果如下:

**(若要启用这些技巧,请将config.json里的`use_dynamic_ntk`和`use_logn_attn`设置为true)**

We introduce NTK-aware interpolation and LogN attention scaling to extend the context length of Qwen-14B-Chat.
The Rouge-L results of Qwen-14B-Chat on the long-text summarization dataset [VCSUM](https://arxiv.org/abs/2305.05280) (the average text length of this dataset is around 15K) are shown below:

**(To use these tricks, please set `use_dynamic_ntk` and `use_logn_attn` to true in config.json.)**

| Model             | VCSUM (zh) |
|:------------------|:----------:|
| GPT-3.5-Turbo-16k | 16.0       |
| LLaMA2-7B-Chat    | 0.2        |
| InternLM-7B-Chat  | 13.0       |
| ChatGLM2-6B-Chat  | 16.3       |
| **Qwen-14B-Chat** | **17.3**   |

### 工具使用能力的评测(Tool Usage)

#### ReAct Prompting

千问支持通过 [ReAct Prompting](https://arxiv.org/abs/2210.03629) 调用插件/工具/API。ReAct 也是 [LangChain](https://python.langchain.com/) 框架采用的主要方式之一。在我们开源的、用于评估工具使用能力的评测基准上,千问的表现如下:

Qwen-Chat supports calling plugins/tools/APIs through [ReAct Prompting](https://arxiv.org/abs/2210.03629). ReAct is also one of the main approaches used by the [LangChain](https://python.langchain.com/) framework. In our evaluation benchmark for assessing tool usage capabilities, Qwen-Chat's performance is as follows:

<table>
    <tr>
        <th colspan="4" align="center">Chinese Tool-Use Benchmark</th>
    </tr>
    <tr>
        <th align="center">Model</th><th align="center">Tool Selection (Acc.↑)</th><th align="center">Tool Input (Rouge-L↑)</th><th align="center">False Positive Error↓</th>
    </tr>
    <tr>
        <td>GPT-4</td><td align="center">95%</td><td align="center">0.90</td><td align="center">15.0%</td>
    </tr>
    <tr>
        <td>GPT-3.5</td><td align="center">85%</td><td align="center">0.88</td><td align="center">75.0%</td>
    </tr>
    <tr>
        <td>Qwen-7B-Chat</td><td align="center">98%</td><td align="center">0.91</td><td align="center">7.3%</td>
    </tr>
    <tr>
        <td>Qwen-14B-Chat</td><td align="center">98%</td><td align="center">0.93</td><td align="center">2.4%</td>
    </tr>
</table>

> 评测基准中出现的插件均没有出现在千问的训练集中。该基准评估了模型在多个候选插件中选择正确插件的准确率、传入插件的参数的合理性、以及假阳率。假阳率(False Positive)定义:在处理不该调用插件的请求时,错误地调用了插件。

> The plugins that appear in the evaluation set do not appear in the training set of Qwen.
This benchmark evaluates the accuracy of the model in selecting the correct plugin from multiple candidate plugins, the rationality of the parameters passed into the plugin, and the false positive rate. False Positive: incorrectly invoking a plugin when responding to a query that should not have triggered one.

![](assets/react_showcase_001.png)
![](assets/react_showcase_002.png)

#### Code Interpreter

为了考察Qwen使用Python Code Interpreter完成数学解题、数据可视化、及文件处理与爬虫等任务的能力,我们专门建设并开源了一个评测这方面能力的[评测基准](https://github.com/QwenLM/Qwen-Agent/tree/main/benchmark)。

我们发现Qwen在生成代码的可执行率、结果正确性上均表现较好:

To assess Qwen's ability to use the Python Code Interpreter for mathematical problem solving, data visualization, and other general-purpose tasks such as file handling and web scraping, we have created and open-sourced a benchmark specifically designed for evaluating these capabilities. You can find the benchmark at this [link](https://github.com/QwenLM/Qwen-Agent/tree/main/benchmark).

We have observed that Qwen performs well in terms of code executability and result accuracy when generating code:

<table>
    <tr>
        <th colspan="4" align="center">Executable Rate of Generated Code (%)</th>
    </tr>
    <tr>
        <th align="center">Model</th><th align="center">Math↑</th><th align="center">Visualization↑</th><th align="center">General↑</th>
    </tr>
    <tr>
        <td>GPT-4</td><td align="center">91.9</td><td align="center">85.9</td><td align="center">82.8</td>
    </tr>
    <tr>
        <td>GPT-3.5</td><td align="center">89.2</td><td align="center">65.0</td><td align="center">74.1</td>
    </tr>
    <tr>
        <td>LLaMA2-7B-Chat</td>
        <td align="center">41.9</td>
        <td align="center">33.1</td>
        <td align="center">24.1</td>
    </tr>
    <tr>
        <td>LLaMA2-13B-Chat</td>
        <td align="center">50.0</td>
        <td align="center">40.5</td>
        <td align="center">48.3</td>
    </tr>
    <tr>
        <td>CodeLLaMA-7B-Instruct</td>
        <td align="center">85.1</td>
        <td align="center">54.0</td>
        <td align="center">70.7</td>
    </tr>
    <tr>
        <td>CodeLLaMA-13B-Instruct</td>
        <td align="center">93.2</td>
<td align="center">55.8</td><td align="center">74.1</td>
    </tr>
    <tr>
        <td>InternLM-7B-Chat-v1.1</td><td align="center">78.4</td><td align="center">44.2</td><td align="center">62.1</td>
    </tr>
    <tr>
        <td>InternLM-20B-Chat</td><td align="center">70.3</td><td align="center">44.2</td><td align="center">65.5</td>
    </tr>
    <tr>
        <td>Qwen-7B-Chat</td><td align="center">82.4</td><td align="center">64.4</td><td align="center">67.2</td>
    </tr>
    <tr>
        <td>Qwen-14B-Chat</td><td align="center">89.2</td><td align="center">84.1</td><td align="center">65.5</td>
    </tr>
</table>

<table>
    <tr>
        <th colspan="4" align="center">Accuracy of Code Execution Results (%)</th>
    </tr>
    <tr>
        <th align="center">Model</th><th align="center">Math↑</th><th align="center">Visualization-Hard↑</th><th align="center">Visualization-Easy↑</th>
    </tr>
    <tr>
        <td>GPT-4</td><td align="center">82.8</td><td align="center">66.7</td><td align="center">60.8</td>
    </tr>
    <tr>
        <td>GPT-3.5</td><td align="center">47.3</td><td align="center">33.3</td><td align="center">55.7</td>
    </tr>
    <tr>
        <td>LLaMA2-7B-Chat</td><td align="center">3.9</td><td align="center">14.3</td><td align="center">39.2</td>
    </tr>
    <tr>
        <td>LLaMA2-13B-Chat</td><td align="center">8.3</td><td align="center">8.3</td><td align="center">40.5</td>
    </tr>
    <tr>
        <td>CodeLLaMA-7B-Instruct</td><td align="center">14.3</td><td align="center">26.2</td><td align="center">60.8</td>
    </tr>
    <tr>
        <td>CodeLLaMA-13B-Instruct</td><td align="center">28.2</td><td align="center">27.4</td><td align="center">62.0</td>
    </tr>
    <tr>
        <td>InternLM-7B-Chat-v1.1</td><td align="center">28.5</td><td align="center">4.8</td><td align="center">40.5</td>
    </tr>
    <tr>
        <td>InternLM-20B-Chat</td><td align="center">34.6</td><td align="center">21.4</td><td align="center">45.6</td>
    </tr>
    <tr>
        <td>Qwen-7B-Chat</td><td align="center">41.9</td><td align="center">40.5</td><td align="center">54.4</td>
    </tr>
    <tr>
        <td>Qwen-14B-Chat</td><td align="center">58.4</td><td
align="center">53.6</td><td align="center">59.5</td>
    </tr>
</table>

<p align="center">
    <br>
    <img src="assets/code_interpreter_showcase_001.jpg" />
    <br>
</p>

#### Huggingface Agent

千问还具备作为 [HuggingFace Agent](https://huggingface.co/docs/transformers/transformers_agents) 的能力。它在 Huggingface 提供的run模式评测基准上的表现如下:

Qwen-Chat also has the capability to be used as a [HuggingFace Agent](https://huggingface.co/docs/transformers/transformers_agents). Its performance on the run-mode benchmark provided by HuggingFace is as follows:

<table>
    <tr>
        <th colspan="4" align="center">HuggingFace Agent Benchmark - Run Mode</th>
    </tr>
    <tr>
        <th align="center">Model</th><th align="center">Tool Selection↑</th><th align="center">Tool Used↑</th><th align="center">Code↑</th>
    </tr>
    <tr>
        <td>GPT-4</td><td align="center">100</td><td align="center">100</td><td align="center">97.4</td>
    </tr>
    <tr>
        <td>GPT-3.5</td><td align="center">95.4</td><td align="center">96.3</td><td align="center">87.0</td>
    </tr>
    <tr>
        <td>StarCoder-Base-15B</td><td align="center">86.1</td><td align="center">87.0</td><td align="center">68.9</td>
    </tr>
    <tr>
        <td>StarCoder-15B</td><td align="center">87.0</td><td align="center">88.0</td><td align="center">68.9</td>
    </tr>
    <tr>
        <td>Qwen-7B-Chat</td><td align="center">87.0</td><td align="center">87.0</td><td align="center">71.5</td>
    </tr>
    <tr>
        <td>Qwen-14B-Chat</td><td align="center">93.5</td><td align="center">94.4</td><td align="center">87.0</td>
    </tr>
</table>

<table>
    <tr>
        <th colspan="4" align="center">HuggingFace Agent Benchmark - Chat Mode</th>
    </tr>
    <tr>
        <th align="center">Model</th><th align="center">Tool Selection↑</th><th align="center">Tool Used↑</th><th align="center">Code↑</th>
    </tr>
    <tr>
        <td>GPT-4</td><td align="center">97.9</td><td align="center">97.9</td><td align="center">98.5</td>
    </tr>
    <tr>
        <td>GPT-3.5</td><td align="center">97.3</td><td align="center">96.8</td><td align="center">89.6</td>
    </tr>
    <tr>
        <td>StarCoder-Base-15B</td><td align="center">97.9</td><td
align="center">97.9</td><td align="center">91.1</td>
    </tr>
    <tr>
        <td>StarCoder-15B</td><td align="center">97.9</td><td align="center">97.9</td><td align="center">89.6</td>
    </tr>
    <tr>
        <td>Qwen-7B-Chat</td><td align="center">94.7</td><td align="center">94.7</td><td align="center">85.1</td>
    </tr>
    <tr>
        <td>Qwen-14B-Chat</td><td align="center">97.9</td><td align="center">97.9</td><td align="center">95.5</td>
    </tr>
</table>
<br>

## FAQ

如遇到问题,敬请查阅[FAQ](https://github.com/QwenLM/Qwen/blob/main/FAQ_zh.md)以及issue区,如仍无法解决再提交issue。

If you run into problems, please search for a solution in the [FAQ](https://github.com/QwenLM/Qwen/blob/main/FAQ.md) and the existing issues before opening a new issue.
<br>

## 引用 (Citation)

如果你觉得我们的工作对你有帮助,欢迎引用!

If you find our work helpful, feel free to cite us.

```
@article{qwen,
  title={Qwen Technical Report},
  author={Jinze Bai and Shuai Bai and Yunfei Chu and Zeyu Cui and Kai Dang and Xiaodong Deng and Yang Fan and Wenbin Ge and Yu Han and Fei Huang and Binyuan Hui and Luo Ji and Mei Li and Junyang Lin and Runji Lin and Dayiheng Liu and Gao Liu and Chengqiang Lu and Keming Lu and Jianxin Ma and Rui Men and Xingzhang Ren and Xuancheng Ren and Chuanqi Tan and Sinan Tan and Jianhong Tu and Peng Wang and Shijie Wang and Wei Wang and Shengguang Wu and Benfeng Xu and Jin Xu and An Yang and Hao Yang and Jian Yang and Shusheng Yang and Yang Yao and Bowen Yu and Hongyi Yuan and Zheng Yuan and Jianwei Zhang and Xingxuan Zhang and Yichang Zhang and Zhenru Zhang and Chang Zhou and Jingren Zhou and Xiaohuan Zhou and Tianhang Zhu},
  journal={arXiv preprint arXiv:2309.16609},
  year={2023}
}
```
<br>

## 使用协议(License Agreement)

我们的代码和模型权重对学术研究完全开放,并支持商用。请查看[LICENSE](https://github.com/QwenLM/Qwen/blob/main/LICENSE)了解具体的开源协议细节。如需商用,欢迎填写[问卷](https://dashscope.console.aliyun.com/openModelApply/qianwen)申请。

Our code and checkpoints are open for research purposes, and commercial use is also allowed.
Check [LICENSE](https://github.com/QwenLM/Qwen/blob/main/LICENSE) for more details about the license. If you have requirements for commercial use, please fill out the [form](https://dashscope.console.aliyun.com/openModelApply/qianwen) to apply.
<br>

## 联系我们(Contact Us)

如果你想给我们的研发团队和产品团队留言,欢迎加入我们的微信群、钉钉群以及Discord!同时,也欢迎通过邮件(qianwen_opensource@alibabacloud.com)联系我们。

If you would like to leave a message for our research or product teams, join our Discord or WeChat groups! You can also reach us by email at qianwen_opensource@alibabacloud.com.
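As a rough illustration of what the plugin-invocation benchmark above exercises, the sketch below builds a ReAct-style prompt from a list of candidate plugins and parses the `Action`/`Action Input` fields out of a model completion. The prompt wording, plugin names, and parsing rules here are illustrative assumptions, not Qwen's exact templates.

```python
import json
import re


def build_react_prompt(query, tools):
    """Assemble a minimal ReAct-style prompt listing candidate plugins.

    `tools` is a list of dicts with hypothetical keys: name, description, parameters.
    """
    lines = ["Answer the following question, using the tools below if needed.", ""]
    for tool in tools:
        lines.append(
            f"{tool['name']}: {tool['description']} Parameters: {json.dumps(tool['parameters'])}"
        )
    lines += [
        "",
        "Use the format:",
        "Thought: reason about what to do",
        "Action: the tool to use",
        "Action Input: JSON arguments for the tool",
        "",
        f"Question: {query}",
    ]
    return "\n".join(lines)


def parse_react_response(text):
    """Extract the chosen plugin and its arguments from a ReAct-style completion.

    Returns (None, None) when no tool call is present — the case that
    false-positive scoring is concerned with.
    """
    action = re.search(r"Action:\s*(\S+)", text)
    action_input = re.search(r"Action Input:\s*(\{.*?\})", text, re.S)
    if not action or not action_input:
        return None, None
    return action.group(1), json.loads(action_input.group(1))


# Example completion a model might produce:
completion = (
    "Thought: I should look up the weather.\n"
    "Action: weather_api\n"
    'Action Input: {"city": "Beijing"}\n'
)
print(parse_react_response(completion))  # → ('weather_api', {'city': 'Beijing'})
```

A correct invocation is then scored on both the selected plugin name and the rationality of the parsed arguments, while a parse result on a query that needed no tool counts as a false positive.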
28,744
[ [ -0.032806396484375, -0.053802490234375, 0.0117645263671875, 0.0228271484375, -0.01222991943359375, -0.00852203369140625, -0.005687713623046875, -0.0297698974609375, -0.007389068603515625, 0.0138397216796875, -0.045654296875, -0.036376953125, -0.02789306640625, -0.00984954833984375, -0.021087646484375, 0.05535888671875, 0.019439697265625, -0.01215362548828125, 0.0154876708984375, -0.0125885009765625, -0.0192718505859375, -0.0248870849609375, -0.056304931640625, -0.01180267333984375, 0.008880615234375, 0.0097198486328125, 0.062164306640625, 0.0309295654296875, 0.027740478515625, 0.03173828125, 0.004199981689453125, 0.0104217529296875, -0.02899169921875, -0.01313018798828125, 0.017578125, -0.032257080078125, -0.050750732421875, 0.00556182861328125, 0.044586181640625, 0.007122039794921875, -0.006298065185546875, 0.0211181640625, 0.01522064208984375, 0.03289794921875, -0.03265380859375, 0.0081634521484375, -0.025299072265625, -0.005031585693359375, -0.00604248046875, -0.00936126708984375, -0.016693115234375, -0.0299072265625, 0.01824951171875, -0.049530029296875, 0.00934600830078125, 0.01428985595703125, 0.1005859375, 0.004817962646484375, -0.05169677734375, 0.00679779052734375, -0.041290283203125, 0.0767822265625, -0.08447265625, 0.0163421630859375, 0.02166748046875, 0.026031494140625, -0.017303466796875, -0.07098388671875, -0.04791259765625, -0.0162200927734375, -0.02203369140625, 0.017303466796875, -0.0301513671875, 0.0113372802734375, 0.0309600830078125, 0.0217742919921875, -0.049560546875, -0.00682830810546875, -0.022430419921875, -0.0183868408203125, 0.059783935546875, 0.0207977294921875, 0.03350830078125, -0.02337646484375, -0.02618408203125, -0.0176544189453125, -0.0305328369140625, 0.0104522705078125, 0.0195159912109375, -0.009307861328125, -0.03466796875, 0.0168304443359375, -0.0244293212890625, 0.03302001953125, 0.03131103515625, -0.019195556640625, 0.0275115966796875, -0.033935546875, -0.032562255859375, -0.0254364013671875, 0.108154296875, 
0.03765869140625, -0.00008386373519897461, 0.01355743408203125, -0.0007004737854003906, -0.0169219970703125, -0.0160675048828125, -0.08087158203125, -0.039276123046875, 0.045928955078125, -0.049285888671875, -0.0288238525390625, -0.008941650390625, -0.03765869140625, 0.0103607177734375, 0.007579803466796875, 0.053009033203125, -0.051025390625, -0.04693603515625, -0.001361846923828125, -0.0142059326171875, 0.022247314453125, 0.0186309814453125, -0.0634765625, 0.01245880126953125, 0.01357269287109375, 0.054931640625, 0.00931549072265625, -0.028472900390625, -0.016204833984375, -0.0007371902465820312, -0.01519012451171875, 0.030517578125, 0.0016202926635742188, -0.0215606689453125, -0.018707275390625, 0.0117950439453125, -0.010833740234375, -0.028350830078125, 0.0304107666015625, -0.04022216796875, 0.0430908203125, -0.00205230712890625, -0.040313720703125, -0.020904541015625, 0.00849151611328125, -0.034027099609375, 0.08197021484375, 0.01004791259765625, -0.07763671875, -0.0003719329833984375, -0.041534423828125, -0.0117645263671875, 0.00023674964904785156, 0.0017728805541992188, -0.04547119140625, -0.026702880859375, 0.0269012451171875, 0.031280517578125, -0.0211944580078125, 0.0223846435546875, -0.0265045166015625, -0.02886962890625, 0.032501220703125, -0.05322265625, 0.1033935546875, 0.0128021240234375, -0.0537109375, 0.031036376953125, -0.05548095703125, 0.02337646484375, 0.0165863037109375, -0.00008112192153930664, -0.0049896240234375, -0.005748748779296875, 0.001964569091796875, 0.0296173095703125, 0.0243682861328125, -0.02447509765625, 0.006313323974609375, -0.04345703125, 0.0594482421875, 0.044158935546875, 0.00537109375, 0.04510498046875, -0.0333251953125, 0.029327392578125, 0.0229339599609375, 0.0352783203125, -0.01280975341796875, -0.03912353515625, -0.06634521484375, -0.01450347900390625, 0.02880859375, 0.0474853515625, -0.07232666015625, 0.045654296875, 0.00325775146484375, -0.045318603515625, -0.04998779296875, -0.0143280029296875, 0.035736083984375, 
0.021484375, 0.034271240234375, 0.0034027099609375, -0.037384033203125, -0.059295654296875, -0.0009007453918457031, -0.019683837890625, -0.01099395751953125, 0.0191650390625, 0.035125732421875, -0.00974273681640625, 0.06353759765625, -0.033599853515625, -0.0030422210693359375, -0.01258087158203125, 0.01183319091796875, 0.018951416015625, 0.0535888671875, 0.0548095703125, -0.061370849609375, -0.050384521484375, -0.0146026611328125, -0.058685302734375, 0.00543212890625, -0.0139617919921875, -0.0423583984375, 0.019378662109375, 0.0123443603515625, -0.055755615234375, 0.042449951171875, 0.04583740234375, -0.0322265625, 0.05841064453125, -0.004062652587890625, 0.0180816650390625, -0.084716796875, -0.00382232666015625, 0.00118255615234375, -0.018707275390625, -0.04193115234375, -0.00296783447265625, 0.0185089111328125, 0.0188446044921875, -0.0382080078125, 0.05859375, -0.039520263671875, 0.01454925537109375, -0.009246826171875, 0.01239776611328125, 0.01360321044921875, 0.050811767578125, -0.0038700103759765625, 0.051788330078125, 0.052459716796875, -0.054840087890625, 0.036285400390625, 0.0286712646484375, -0.00917816162109375, 0.01345062255859375, -0.0623779296875, 0.00003153085708618164, 0.0164794921875, 0.016448974609375, -0.08294677734375, -0.004241943359375, 0.040863037109375, -0.060638427734375, 0.017822265625, -0.01183319091796875, -0.01543426513671875, -0.041534423828125, -0.0299530029296875, 0.0177459716796875, 0.05657958984375, -0.033172607421875, 0.037445068359375, 0.014312744140625, 0.016998291015625, -0.045501708984375, -0.042144775390625, -0.01160430908203125, -0.03057861328125, -0.061370849609375, 0.041290283203125, -0.01045989990234375, -0.001720428466796875, 0.001483917236328125, 0.0138702392578125, -0.000995635986328125, 0.00023448467254638672, 0.0103607177734375, 0.030517578125, -0.01837158203125, -0.0050048828125, 0.0019016265869140625, -0.01035308837890625, 0.005657196044921875, -0.01641845703125, 0.039306640625, -0.0117950439453125, 
-0.0092926025390625, -0.06683349609375, 0.01306915283203125, 0.032989501953125, -0.0169830322265625, 0.06451416015625, 0.07623291015625, -0.01146697998046875, 0.0053253173828125, -0.042816162109375, -0.0261383056640625, -0.0438232421875, 0.0270233154296875, -0.023529052734375, -0.062286376953125, 0.0457763671875, 0.0182952880859375, 0.0229339599609375, 0.053009033203125, 0.042022705078125, -0.00830841064453125, 0.099365234375, 0.031463623046875, -0.004913330078125, 0.048431396484375, -0.04132080078125, 0.0106353759765625, -0.06109619140625, 0.0045318603515625, -0.015625, -0.0162200927734375, -0.0653076171875, -0.0226593017578125, 0.03143310546875, 0.0196990966796875, -0.050537109375, 0.030609130859375, -0.04022216796875, -0.011871337890625, 0.065185546875, 0.01172637939453125, 0.01317596435546875, -0.025970458984375, 0.0101165771484375, -0.00009685754776000977, -0.0672607421875, -0.0304412841796875, 0.07305908203125, 0.03021240234375, 0.039794921875, 0.00431060791015625, 0.053375244140625, -0.00409698486328125, 0.021484375, -0.042388916015625, 0.039703369140625, 0.017486572265625, -0.046722412109375, -0.040008544921875, -0.051788330078125, -0.071044921875, 0.037811279296875, -0.0073089599609375, -0.0604248046875, 0.0226287841796875, 0.0125885009765625, -0.049346923828125, 0.01265716552734375, -0.05352783203125, 0.07513427734375, -0.014312744140625, -0.03399658203125, -0.005069732666015625, -0.051361083984375, 0.045989990234375, 0.02996826171875, 0.0142364501953125, -0.01861572265625, 0.016998291015625, 0.064453125, -0.038604736328125, 0.058502197265625, -0.0166778564453125, -0.0029544830322265625, 0.0478515625, 0.0015430450439453125, 0.03704833984375, 0.0154571533203125, 0.013214111328125, 0.00855255126953125, 0.033355712890625, -0.03607177734375, -0.04315185546875, 0.041839599609375, -0.0692138671875, -0.047637939453125, -0.0275726318359375, -0.0316162109375, 0.001186370849609375, 0.01389312744140625, 0.041595458984375, 0.039276123046875, 0.0036411285400390625, 
0.007354736328125, 0.03460693359375, -0.0355224609375, 0.053009033203125, 0.037200927734375, -0.02978515625, -0.035552978515625, 0.055023193359375, 0.007076263427734375, 0.0300445556640625, 0.01531982421875, 0.00949859619140625, -0.0233917236328125, -0.0225067138671875, -0.049346923828125, 0.0172882080078125, -0.0213775634765625, -0.03009033203125, -0.05804443359375, -0.0323486328125, -0.04998779296875, 0.0207672119140625, -0.02911376953125, -0.025787353515625, -0.0298309326171875, 0.00362396240234375, 0.042510986328125, 0.006679534912109375, -0.00782012939453125, 0.031158447265625, -0.08221435546875, 0.0299530029296875, 0.0282745361328125, -0.0007052421569824219, 0.027191162109375, -0.05731201171875, -0.0303802490234375, 0.030853271484375, -0.0450439453125, -0.060638427734375, 0.05462646484375, 0.01131439208984375, 0.037628173828125, 0.03717041015625, 0.02508544921875, 0.051361083984375, -0.01294708251953125, 0.0594482421875, 0.013671875, -0.0787353515625, 0.030975341796875, -0.036529541015625, 0.0185089111328125, 0.004306793212890625, 0.0294342041015625, -0.03717041015625, -0.0282440185546875, -0.0693359375, -0.05987548828125, 0.06390380859375, 0.037841796875, 0.0133209228515625, -0.00014841556549072266, 0.00930023193359375, -0.031402587890625, 0.0164031982421875, -0.059326171875, -0.04107666015625, -0.016937255859375, -0.016937255859375, 0.0272216796875, -0.01418304443359375, -0.0019683837890625, -0.033721923828125, 0.05908203125, -0.00281524658203125, 0.05145263671875, 0.0194244384765625, 0.002323150634765625, -0.005802154541015625, 0.000038623809814453125, 0.0223846435546875, 0.037872314453125, -0.01409149169921875, -0.0128936767578125, 0.017669677734375, -0.033203125, 0.0084381103515625, 0.004444122314453125, -0.01953125, -0.0012216567993164062, 0.0262603759765625, 0.07330322265625, 0.015625, -0.0285186767578125, 0.040740966796875, -0.01427459716796875, -0.0281524658203125, -0.019989013671875, 0.01482391357421875, 0.023162841796875, 0.0439453125, 
0.035888671875, -0.0255126953125, 0.017486572265625, -0.04119873046875, 0.008392333984375, 0.031829833984375, -0.006526947021484375, -0.0228271484375, 0.06201171875, 0.0125274658203125, -0.004642486572265625, 0.052154541015625, -0.035400390625, -0.053131103515625, 0.058929443359375, 0.03680419921875, 0.057220458984375, -0.0260162353515625, 0.018768310546875, 0.053955078125, 0.00539398193359375, -0.0100860595703125, 0.025360107421875, 0.003612518310546875, -0.06561279296875, -0.0322265625, -0.036865234375, -0.0163116455078125, 0.00186920166015625, -0.05535888671875, 0.013824462890625, -0.01425933837890625, -0.036102294921875, -0.004360198974609375, 0.0139312744140625, -0.0433349609375, 0.02484130859375, -0.006000518798828125, 0.05059814453125, -0.035430908203125, 0.0753173828125, 0.02288818359375, -0.0298614501953125, -0.0718994140625, -0.00628662109375, -0.01015472412109375, -0.05145263671875, 0.03875732421875, 0.0017757415771484375, 0.00782012939453125, 0.030731201171875, -0.05072021484375, -0.08050537109375, 0.09686279296875, -0.001659393310546875, -0.049041748046875, -0.01318359375, -0.01512908935546875, 0.03485107421875, -0.004474639892578125, 0.054931640625, 0.04339599609375, 0.0287628173828125, 0.0100860595703125, -0.0826416015625, 0.0195159912109375, -0.0242156982421875, -0.0008993148803710938, 0.01294708251953125, -0.084716796875, 0.083251953125, -0.013641357421875, -0.023345947265625, 0.00841522216796875, 0.078125, 0.0208282470703125, 0.01435089111328125, 0.0208740234375, 0.0226287841796875, 0.03973388671875, -0.0133056640625, 0.0545654296875, -0.039154052734375, 0.0548095703125, 0.0611572265625, 0.006927490234375, 0.053863525390625, 0.004024505615234375, -0.0469970703125, 0.03338623046875, 0.048614501953125, -0.00982666015625, 0.037841796875, -0.00429534912109375, -0.01708984375, -0.017822265625, 0.0287322998046875, -0.035888671875, 0.0176239013671875, 0.0304107666015625, -0.00774383544921875, 0.002994537353515625, 0.0116729736328125, 
0.0035610198974609375, -0.0355224609375, -0.006114959716796875, 0.049102783203125, 0.018585205078125, -0.034698486328125, 0.07177734375, 0.01232147216796875, 0.07891845703125, -0.03662109375, -0.0009307861328125, -0.0124053955078125, 0.006778717041015625, -0.019195556640625, -0.044219970703125, 0.01398468017578125, -0.025146484375, 0.00556182861328125, 0.01296234130859375, 0.05572509765625, -0.037628173828125, -0.0203704833984375, 0.02520751953125, 0.031494140625, 0.00859832763671875, -0.0167388916015625, -0.07281494140625, 0.00012958049774169922, 0.017364501953125, -0.051788330078125, 0.033935546875, 0.04669189453125, -0.002155303955078125, 0.055877685546875, 0.045562744140625, -0.020660400390625, 0.005535125732421875, -0.00620269775390625, 0.06585693359375, -0.057769775390625, -0.0290069580078125, -0.0704345703125, 0.05352783203125, -0.010894775390625, -0.0357666015625, 0.07366943359375, 0.020965576171875, 0.0533447265625, 0.014556884765625, 0.05523681640625, -0.0238037109375, 0.032623291015625, -0.0248870849609375, 0.0579833984375, -0.0330810546875, 0.0019969940185546875, -0.0194244384765625, -0.046417236328125, 0.004535675048828125, 0.06414794921875, -0.031402587890625, 0.02435302734375, 0.0439453125, 0.061492919921875, 0.01190948486328125, -0.00368499755859375, 0.031280517578125, 0.030792236328125, 0.016082763671875, 0.064453125, 0.052764892578125, -0.07470703125, 0.05084228515625, -0.046630859375, -0.0164031982421875, -0.0261383056640625, -0.04754638671875, -0.079833984375, -0.04217529296875, -0.03662109375, -0.049530029296875, -0.0101318359375, 0.06793212890625, 0.055755615234375, -0.054931640625, -0.0163116455078125, 0.00586700439453125, -0.00015270709991455078, -0.0300750732421875, -0.0258026123046875, 0.045013427734375, -0.01137542724609375, -0.07330322265625, 0.00165557861328125, 0.00092315673828125, 0.0232391357421875, -0.02374267578125, -0.0080413818359375, -0.0166168212890625, -0.0020656585693359375, 0.0341796875, 0.0186004638671875, -0.0433349609375, 
-0.00605010986328125, 0.00977325439453125, -0.0255126953125, 0.009185791015625, 0.01328277587890625, -0.050537109375, 0.0010519027709960938, 0.034820556640625, -0.00045299530029296875, 0.051849365234375, 0.003246307373046875, 0.045257568359375, -0.026611328125, 0.023468017578125, 0.0122833251953125, 0.0135345458984375, 0.007061004638671875, -0.032684326171875, 0.019378662109375, 0.0114593505859375, -0.052703857421875, -0.05755615234375, -0.007610321044921875, -0.0584716796875, -0.0122222900390625, 0.08349609375, -0.0291595458984375, -0.0283660888671875, -0.0025234222412109375, -0.038116455078125, 0.04364013671875, -0.032989501953125, 0.050567626953125, 0.03759765625, -0.00665283203125, -0.0341796875, -0.05108642578125, 0.050628662109375, 0.025543212890625, -0.044647216796875, -0.00801849365234375, 0.01995849609375, 0.0227508544921875, -0.00362396240234375, 0.061004638671875, 0.00555419921875, 0.0311431884765625, 0.005001068115234375, 0.0185089111328125, -0.0013647079467773438, 0.00429534912109375, -0.004779815673828125, -0.01229095458984375, -0.0109100341796875, -0.028228759765625 ] ]
elyza/ELYZA-japanese-Llama-2-7b
2023-08-29T03:45:51.000Z
[ "transformers", "pytorch", "llama", "text-generation", "ja", "en", "arxiv:2307.09288", "license:llama2", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
elyza
null
null
elyza/ELYZA-japanese-Llama-2-7b
38
9,235
transformers
2023-08-28T12:38:34
---
license: llama2
language:
- ja
- en
---

## ELYZA-japanese-Llama-2-7b

![ELYZA-Japanese-Llama2-image](./key_visual.png)

### Model Description

**ELYZA-japanese-Llama-2-7b** is a model based on Llama 2 with additional pre-training to extend its Japanese language capabilities. For details, see the [blog post](https://note.com/elyza/n/na405acaca130) (in Japanese).

### Usage

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"
DEFAULT_SYSTEM_PROMPT = "あなたは誠実で優秀な日本人のアシスタントです。"
text = "クマが海辺に行ってアザラシと友達になり、最終的には家に帰るというプロットの短編小説を書いてください。"

model_name = "elyza/ELYZA-japanese-Llama-2-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto")

if torch.cuda.is_available():
    model = model.to("cuda")

prompt = "{bos_token}{b_inst} {system}{prompt} {e_inst} ".format(
    bos_token=tokenizer.bos_token,
    b_inst=B_INST,
    system=f"{B_SYS}{DEFAULT_SYSTEM_PROMPT}{E_SYS}",
    prompt=text,
    e_inst=E_INST,
)

with torch.no_grad():
    token_ids = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt")
    output_ids = model.generate(
        token_ids.to(model.device),
        max_new_tokens=256,
        pad_token_id=tokenizer.pad_token_id,
        eos_token_id=tokenizer.eos_token_id,
    )
output = tokenizer.decode(output_ids.tolist()[0][token_ids.size(1):], skip_special_tokens=True)
print(output)
"""
承知しました。以下にクマが海辺に行ってアザラシと友達になり、最終的には家に帰るというプロットの短編小説を記述します。
クマは山の中でゆっくりと眠っていた。
その眠りに落ちたクマは、夢の中で海辺を歩いていた。
そこにはアザラシがいた。
クマはアザラシに話しかける。
「おはよう」とクマが言うと、アザラシは驚いたように顔を上げた。
「あ、こんにちは」アザラシは答えた。
クマはアザラシと友達になりたいと思う。
「私はクマと申します。」クマは...
"""
```

### ELYZA-japanese-Llama-2-7b Models

| Model Name | Vocab Size | #Params |
|:---------------------------------------------|:----------:|:-------:|
|[elyza/ELYZA-japanese-Llama-2-7b](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b)| 32000 | 6.27B |
|[elyza/ELYZA-japanese-Llama-2-7b-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-instruct)| 32000 | 6.27B |
|[elyza/ELYZA-japanese-Llama-2-7b-fast](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast)| 45043 | 6.37B |
|[elyza/ELYZA-japanese-Llama-2-7b-fast-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast-instruct)| 45043 | 6.37B |

### Developers

In alphabetical order:

- [Akira Sasaki](https://huggingface.co/akirasasaki)
- [Masato Hirakawa](https://huggingface.co/m-hirakawa)
- [Shintaro Horie](https://huggingface.co/e-mon)
- [Tomoaki Nakamura](https://huggingface.co/tyoyo)

### Licence

Llama 2 is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
### How to Cite

```tex
@misc{elyzallama2023,
      title={ELYZA-japanese-Llama-2-7b},
      url={https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b},
      author={Akira Sasaki and Masato Hirakawa and Shintaro Horie and Tomoaki Nakamura},
      year={2023},
}
```

### Citations

```tex
@misc{touvron2023llama,
      title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
      author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
      year={2023},
      eprint={2307.09288},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
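The prompt assembly in the usage example above can be factored into a small standalone helper for inspection without loading the model. This sketch reuses the card's special-token constants; the `<s>` BOS value is an assumption based on the standard Llama 2 tokenizer, and `build_prompt` is a hypothetical helper name, not part of the released code.

```python
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"
DEFAULT_SYSTEM_PROMPT = "あなたは誠実で優秀な日本人のアシスタントです。"


def build_prompt(user_text, system=DEFAULT_SYSTEM_PROMPT, bos_token="<s>"):
    """Compose a Llama-2 chat prompt: BOS, [INST], <<SYS>> block, user text, [/INST].

    `bos_token` defaults to "<s>" (assumed; in practice use tokenizer.bos_token).
    """
    return "{bos_token}{b_inst} {system}{prompt} {e_inst} ".format(
        bos_token=bos_token,
        b_inst=B_INST,
        system=f"{B_SYS}{system}{E_SYS}",
        prompt=user_text,
        e_inst=E_INST,
    )


print(build_prompt("こんにちは"))
```

Keeping the system prompt inside the `<<SYS>>` block is what makes the instruct variants follow it; the base model here was trained without this template, which is why the card's example points at the instruct checkpoint.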
4,469
[ [ -0.034332275390625, -0.0467529296875, 0.0198974609375, 0.0262298583984375, -0.04071044921875, 0.005809783935546875, 0.0102081298828125, -0.046630859375, 0.044830322265625, 0.0080413818359375, -0.046356201171875, -0.0452880859375, -0.042816162109375, 0.01491546630859375, -0.00801849365234375, 0.0562744140625, -0.010040283203125, -0.02398681640625, 0.00348663330078125, -0.001415252685546875, -0.0154876708984375, -0.028228759765625, -0.038421630859375, -0.0228118896484375, 0.0208587646484375, 0.010986328125, 0.04168701171875, 0.050445556640625, 0.038818359375, 0.031036376953125, -0.0194091796875, 0.0211944580078125, -0.0200347900390625, -0.0155792236328125, 0.018218994140625, -0.036834716796875, -0.058135986328125, -0.02197265625, 0.040374755859375, 0.0231781005859375, 0.006755828857421875, 0.02691650390625, -0.0030803680419921875, 0.022796630859375, -0.021087646484375, 0.0027294158935546875, -0.0281829833984375, 0.006412506103515625, -0.016357421875, -0.0159759521484375, -0.00939178466796875, -0.0264434814453125, -0.020843505859375, -0.0635986328125, -0.004726409912109375, 0.005767822265625, 0.108642578125, 0.016815185546875, -0.0211181640625, -0.0010042190551757812, -0.01177978515625, 0.0648193359375, -0.072998046875, 0.01438140869140625, 0.021820068359375, -0.006072998046875, -0.0255279541015625, -0.06097412109375, -0.05474853515625, -0.00673675537109375, -0.02178955078125, 0.01482391357421875, -0.034393310546875, -0.0230712890625, 0.01470947265625, 0.016204833984375, -0.033447265625, 0.0219573974609375, -0.040313720703125, -0.0093536376953125, 0.0562744140625, 0.014129638671875, 0.043365478515625, -0.0263824462890625, -0.042510986328125, -0.01355743408203125, -0.04998779296875, 0.0183868408203125, 0.0262603759765625, 0.0074310302734375, -0.0531005859375, 0.046661376953125, -0.01548004150390625, 0.032135009765625, 0.00989532470703125, -0.0276031494140625, 0.0484619140625, -0.032745361328125, -0.019561767578125, -0.017120361328125, 0.08441162109375, 
0.04998779296875, -0.00214385986328125, 0.01256561279296875, 0.00055694580078125, -0.00033974647521972656, -0.0335693359375, -0.06903076171875, 0.01220703125, 0.02447509765625, -0.044830322265625, -0.0291900634765625, -0.005405426025390625, -0.0654296875, -0.005126953125, 0.00555419921875, 0.0170135498046875, -0.01482391357421875, -0.032623291015625, 0.01398468017578125, 0.0003807544708251953, 0.030792236328125, 0.01136016845703125, -0.049285888671875, 0.01062774658203125, 0.0291900634765625, 0.068359375, 0.005275726318359375, -0.0232696533203125, -0.01128387451171875, 0.016082763671875, -0.01343536376953125, 0.050445556640625, -0.0218658447265625, -0.03857421875, -0.020904541015625, 0.016143798828125, -0.0087127685546875, -0.021270751953125, 0.0276336669921875, -0.00809478759765625, 0.0059814453125, -0.02349853515625, -0.020355224609375, -0.01512908935546875, 0.007068634033203125, -0.0255279541015625, 0.0811767578125, -0.0030651092529296875, -0.06573486328125, 0.00003516674041748047, -0.034515380859375, -0.01251220703125, -0.00908660888671875, -0.002170562744140625, -0.042205810546875, -0.014068603515625, 0.032379150390625, 0.034088134765625, -0.030670166015625, -0.0036296844482421875, -0.0285186767578125, -0.022308349609375, 0.0245208740234375, -0.0033283233642578125, 0.08135986328125, 0.0251312255859375, -0.0341796875, -0.0010271072387695312, -0.06402587890625, 0.004230499267578125, 0.050048828125, -0.0225372314453125, 0.0015430450439453125, -0.0136871337890625, -0.007305145263671875, 0.01053619384765625, 0.043914794921875, -0.042205810546875, 0.0193328857421875, -0.03265380859375, 0.04058837890625, 0.06573486328125, 0.006488800048828125, 0.01099395751953125, -0.037567138671875, 0.033416748046875, 0.009796142578125, 0.0204010009765625, -0.0100555419921875, -0.047698974609375, -0.07305908203125, -0.0283355712890625, -0.0114593505859375, 0.037628173828125, -0.03851318359375, 0.0518798828125, -0.00925445556640625, -0.06048583984375, -0.032867431640625, 
0.0029354095458984375, 0.03466796875, 0.022186279296875, 0.0186767578125, -0.0197906494140625, -0.0626220703125, -0.0521240234375, -0.00809478759765625, -0.0268707275390625, 0.0167236328125, 0.03369140625, 0.0506591796875, -0.0311737060546875, 0.04754638671875, -0.03741455078125, -0.017791748046875, -0.0149993896484375, -0.0161895751953125, 0.050262451171875, 0.0504150390625, 0.056060791015625, -0.038299560546875, -0.041259765625, 0.01397705078125, -0.0660400390625, -0.005710601806640625, -0.0007328987121582031, -0.0372314453125, 0.0225830078125, 0.0172271728515625, -0.05487060546875, 0.045135498046875, 0.0311431884765625, -0.047760009765625, 0.0260772705078125, -0.01323699951171875, 0.0123748779296875, -0.08984375, 0.01007843017578125, -0.00785064697265625, 0.0017786026000976562, -0.039337158203125, 0.0021514892578125, -0.013336181640625, 0.0228424072265625, -0.040008544921875, 0.06597900390625, -0.03326416015625, -0.0008831024169921875, -0.004550933837890625, 0.026824951171875, 0.0032939910888671875, 0.04730224609375, -0.0078125, 0.046478271484375, 0.036895751953125, -0.03717041015625, 0.03692626953125, 0.04217529296875, -0.0209197998046875, 0.03399658203125, -0.065185546875, 0.0199127197265625, 0.00507354736328125, 0.032958984375, -0.0888671875, -0.0162200927734375, 0.035980224609375, -0.0513916015625, 0.002384185791015625, -0.00897216796875, -0.03436279296875, -0.0457763671875, -0.031829833984375, 0.022918701171875, 0.045440673828125, -0.054473876953125, 0.02978515625, 0.021240234375, 0.0037250518798828125, -0.05621337890625, -0.0548095703125, -0.013763427734375, -0.0206298828125, -0.057403564453125, 0.030792236328125, -0.01666259765625, -0.0137786865234375, -0.01299285888671875, -0.004848480224609375, -0.0011749267578125, 0.011505126953125, 0.0206756591796875, 0.047393798828125, -0.0181732177734375, -0.028717041015625, -0.00017213821411132812, -0.0130462646484375, -0.0035419464111328125, -0.0034503936767578125, 0.06536865234375, -0.02349853515625, 
-0.029510498046875, -0.06488037109375, 0.01031494140625, 0.038299560546875, -0.01557159423828125, 0.057769775390625, 0.055755615234375, -0.028350830078125, 0.026824951171875, -0.04193115234375, -0.0026760101318359375, -0.038299560546875, 0.027435302734375, -0.032684326171875, -0.0384521484375, 0.0653076171875, 0.0235748291015625, 0.0174102783203125, 0.051544189453125, 0.046478271484375, 0.0018863677978515625, 0.0753173828125, 0.042083740234375, -0.00360107421875, 0.041534423828125, -0.0494384765625, 0.020965576171875, -0.074951171875, -0.0452880859375, -0.032073974609375, -0.027679443359375, -0.035369873046875, -0.030487060546875, 0.01739501953125, 0.0104827880859375, -0.0452880859375, 0.031524658203125, -0.051055908203125, 0.0214080810546875, 0.0275115966796875, 0.0164947509765625, 0.0162811279296875, 0.00991058349609375, -0.01399993896484375, 0.0017976760864257812, -0.02813720703125, -0.0296783447265625, 0.08184814453125, 0.033050537109375, 0.045196533203125, 0.01910400390625, 0.0626220703125, -0.01172637939453125, 0.000789642333984375, -0.035491943359375, 0.050140380859375, 0.0132293701171875, -0.0491943359375, -0.00687408447265625, -0.01507568359375, -0.07659912109375, 0.036224365234375, 0.0003981590270996094, -0.08135986328125, 0.020416259765625, -0.0170135498046875, -0.0301513671875, 0.03790283203125, -0.036041259765625, 0.0396728515625, -0.02423095703125, -0.032867431640625, -0.00183868408203125, -0.041229248046875, 0.0285491943359375, 0.01490020751953125, 0.0195770263671875, -0.02606201171875, -0.0219879150390625, 0.08001708984375, -0.046417236328125, 0.06585693359375, -0.007442474365234375, -0.01202392578125, 0.0291595458984375, -0.00685882568359375, 0.05279541015625, 0.017242431640625, 0.000698089599609375, 0.0196380615234375, -0.0001327991485595703, -0.0272979736328125, -0.0146484375, 0.05316162109375, -0.0889892578125, -0.05426025390625, -0.03564453125, -0.01357269287109375, 0.014862060546875, 0.0178985595703125, 0.044342041015625, 0.00957489013671875, 
0.01552581787109375, 0.01045989990234375, 0.026824951171875, -0.0263214111328125, 0.053802490234375, 0.022247314453125, -0.02288818359375, -0.0467529296875, 0.05242919921875, 0.0109405517578125, 0.01453399658203125, 0.0198822021484375, 0.005130767822265625, -0.0186767578125, -0.01314544677734375, -0.035430908203125, 0.05523681640625, -0.05487060546875, -0.0269317626953125, -0.053070068359375, -0.02410888671875, -0.02874755859375, -0.03173828125, -0.02728271484375, -0.03057861328125, -0.048736572265625, -0.0121307373046875, 0.059661865234375, 0.036834716796875, -0.0096435546875, 0.0277557373046875, -0.039337158203125, 0.0180511474609375, 0.00283050537109375, 0.011444091796875, 0.0157623291015625, -0.063720703125, -0.0048370361328125, 0.0016965866088867188, -0.026458740234375, -0.0677490234375, 0.0545654296875, -0.0035762786865234375, 0.047027587890625, 0.0242919921875, -0.006084442138671875, 0.07342529296875, -0.01044464111328125, 0.06549072265625, 0.042022705078125, -0.0670166015625, 0.047576904296875, -0.029998779296875, 0.0011816024780273438, 0.002323150634765625, 0.0164337158203125, -0.03143310546875, -0.01099395751953125, -0.058868408203125, -0.07366943359375, 0.06903076171875, 0.01702880859375, 0.0159149169921875, 0.00801849365234375, 0.0162506103515625, -0.007381439208984375, 0.0036640167236328125, -0.0733642578125, -0.055450439453125, -0.016937255859375, -0.01340484619140625, 0.00376129150390625, -0.018951416015625, -0.01097869873046875, -0.040130615234375, 0.061370849609375, 0.00344085693359375, 0.04559326171875, 0.020263671875, -0.0042877197265625, -0.0104522705078125, -0.0013360977172851562, 0.05535888671875, 0.0297698974609375, -0.008331298828125, -0.0150299072265625, 0.034393310546875, -0.0458984375, 0.0159912109375, -0.0019683837890625, -0.00981903076171875, 0.00872802734375, 0.0237274169921875, 0.06854248046875, 0.0174713134765625, -0.0305633544921875, 0.036041259765625, 0.0026111602783203125, -0.01039886474609375, -0.032257080078125, 
0.00069427490234375, 0.01519012451171875, 0.033233642578125, 0.030120849609375, -0.01358795166015625, -0.0203094482421875, -0.031982421875, -0.01016998291015625, 0.025115966796875, 0.00627899169921875, -0.0218048095703125, 0.0625, 0.01136016845703125, -0.01497650146484375, 0.026214599609375, -0.0011548995971679688, -0.04046630859375, 0.07366943359375, 0.0574951171875, 0.046142578125, -0.016357421875, -0.0008625984191894531, 0.06317138671875, 0.0157623291015625, 0.01151275634765625, 0.0300750732421875, 0.00021564960479736328, -0.039764404296875, 0.004344940185546875, -0.0521240234375, -0.004230499267578125, 0.01451873779296875, -0.030487060546875, 0.031341552734375, -0.0460205078125, -0.0170440673828125, -0.016357421875, 0.02960205078125, -0.047576904296875, 0.0013408660888671875, 0.007350921630859375, 0.052825927734375, -0.05401611328125, 0.047943115234375, 0.041351318359375, -0.0447998046875, -0.0660400390625, -0.0267791748046875, 0.006977081298828125, -0.0849609375, 0.0455322265625, 0.0028171539306640625, -0.00476837158203125, 0.00875091552734375, -0.050048828125, -0.0986328125, 0.10858154296875, 0.01025390625, -0.031768798828125, 0.00791168212890625, -0.00734710693359375, 0.0296783447265625, -0.0225372314453125, 0.047454833984375, 0.04046630859375, 0.04498291015625, 0.01201629638671875, -0.07159423828125, 0.025177001953125, -0.050994873046875, 0.003063201904296875, -0.005889892578125, -0.0970458984375, 0.08599853515625, -0.030181884765625, -0.0078277587890625, 0.032318115234375, 0.06463623046875, 0.054595947265625, 0.01268768310546875, 0.0157012939453125, 0.038543701171875, 0.050811767578125, -0.0178680419921875, 0.06329345703125, -0.0212860107421875, 0.04046630859375, 0.0305633544921875, -0.002838134765625, 0.06243896484375, 0.03424072265625, -0.047454833984375, 0.04962158203125, 0.058807373046875, -0.0222015380859375, 0.024993896484375, 0.00376129150390625, -0.01413726806640625, -0.0037250518798828125, -0.0071868896484375, -0.060760498046875, 0.0224609375, 
0.027252197265625, -0.0239105224609375, 0.0030078887939453125, -0.01003265380859375, 0.03900146484375, -0.0172271728515625, -0.01180267333984375, 0.044219970703125, 0.0144195556640625, -0.036041259765625, 0.08209228515625, -0.00475311279296875, 0.07635498046875, -0.034515380859375, 0.017547607421875, -0.0303955078125, 0.01230621337890625, -0.035247802734375, -0.0516357421875, -0.006195068359375, 0.019012451171875, -0.004276275634765625, 0.01195526123046875, 0.032501220703125, -0.0041351318359375, -0.045562744140625, 0.03411865234375, 0.0110931396484375, 0.03515625, 0.04364013671875, -0.054595947265625, 0.0330810546875, 0.0236663818359375, -0.050994873046875, 0.0170440673828125, 0.01092529296875, 0.01447296142578125, 0.056549072265625, 0.05487060546875, 0.0004906654357910156, 0.0299072265625, -0.01502227783203125, 0.06329345703125, -0.04083251953125, -0.0298309326171875, -0.07330322265625, 0.0482177734375, -0.00799560546875, -0.03692626953125, 0.06134033203125, 0.035308837890625, 0.050933837890625, 0.006122589111328125, 0.05743408203125, -0.024017333984375, 0.02239990234375, -0.03228759765625, 0.055511474609375, -0.05859375, 0.0156097412109375, -0.0206146240234375, -0.051300048828125, -0.0167999267578125, 0.067626953125, -0.0149383544921875, 0.0179290771484375, 0.04083251953125, 0.06427001953125, 0.01355743408203125, -0.0215606689453125, 0.00402069091796875, 0.03375244140625, 0.03564453125, 0.06951904296875, 0.05206298828125, -0.06402587890625, 0.033599853515625, -0.040863037109375, 0.0018167495727539062, -0.038116455078125, -0.052520751953125, -0.0743408203125, -0.043731689453125, -0.0258331298828125, -0.03228759765625, -0.0218353271484375, 0.07733154296875, 0.04681396484375, -0.04583740234375, -0.024688720703125, 0.0095367431640625, 0.01953125, -0.0031032562255859375, -0.0127105712890625, 0.039398193359375, 0.004184722900390625, -0.07073974609375, 0.0127410888671875, 0.007080078125, 0.03857421875, 0.0018148422241210938, -0.022796630859375, -0.0207366943359375, 
0.008209228515625, 0.0213165283203125, 0.033721923828125, -0.06842041015625, -0.005878448486328125, 0.0022182464599609375, -0.0211944580078125, 0.01279449462890625, -0.0013818740844726562, -0.042510986328125, 0.004180908203125, 0.042510986328125, 0.004451751708984375, 0.043609619140625, -0.01464080810546875, 0.00392913818359375, -0.0281982421875, 0.039520263671875, -0.01291656494140625, 0.0494384765625, 0.0133209228515625, -0.031768798828125, 0.0457763671875, 0.0298004150390625, -0.0264739990234375, -0.0870361328125, -0.0108642578125, -0.08428955078125, -0.013275146484375, 0.0880126953125, -0.0160980224609375, -0.039337158203125, 0.0181121826171875, -0.0258331298828125, 0.034332275390625, -0.0217437744140625, 0.04205322265625, 0.034942626953125, -0.004291534423828125, -0.0084075927734375, -0.032867431640625, 0.013397216796875, 0.022064208984375, -0.0626220703125, -0.0160369873046875, 0.006717681884765625, 0.026336669921875, 0.03009033203125, 0.05499267578125, -0.005001068115234375, 0.0230712890625, 0.00527191162109375, 0.01392364501953125, -0.01422882080078125, 0.006572723388671875, -0.003086090087890625, -0.03033447265625, -0.016448974609375, -0.0264739990234375 ] ]
Salesforce/blip2-opt-6.7b-coco
2023-09-13T08:47:26.000Z
[ "transformers", "pytorch", "blip-2", "visual-question-answering", "vision", "image-to-text", "image-captioning", "en", "arxiv:2301.12597", "license:mit", "endpoints_compatible", "has_space", "region:us" ]
image-to-text
Salesforce
null
null
Salesforce/blip2-opt-6.7b-coco
18
9,233
transformers
2023-02-07T15:18:25
---
language: en
license: mit
tags:
- vision
- image-to-text
- image-captioning
- visual-question-answering
pipeline_tag: image-to-text
---

# BLIP-2, OPT-6.7b, fine-tuned on COCO

BLIP-2 model, leveraging [OPT-6.7b](https://huggingface.co/facebook/opt-6.7b) (a large language model with 6.7 billion parameters). It was introduced in the paper [BLIP-2: Bootstrapping Language-Image Pre-training with Frozen Image Encoders and Large Language Models](https://arxiv.org/abs/2301.12597) by Li et al. and first released in [this repository](https://github.com/salesforce/LAVIS/tree/main/projects/blip2).

Disclaimer: The team releasing BLIP-2 did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Model description

BLIP-2 consists of 3 models: a CLIP-like image encoder, a Querying Transformer (Q-Former) and a large language model.

The authors initialize the weights of the image encoder and large language model from pre-trained checkpoints and keep them frozen while training the Querying Transformer, which is a BERT-like Transformer encoder that maps a set of "query tokens" to query embeddings, bridging the gap between the embedding space of the image encoder and that of the large language model. The model's objective is simply to predict the next text token, given the query embeddings and the previous text.

<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/blip2_architecture.jpg" alt="drawing" width="600"/>

This allows the model to be used for tasks like:

- image captioning
- visual question answering (VQA)
- chat-like conversations, by feeding the image and the previous conversation as a prompt to the model

## Direct Use and Downstream Use

You can use the raw model for conditional text generation given an image and optional text. See the [model hub](https://huggingface.co/models?search=Salesforce/blip) to look for fine-tuned versions on a task that interests you.
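The Q-Former's bridging role described above can be sketched as a single cross-attention step: a fixed set of learned query vectors attends over frozen image-patch features and is projected into the language model's embedding space, yielding the query embeddings that act as a soft prompt. A minimal NumPy sketch — all sizes, weight names, and the single-head attention are illustrative assumptions, not the actual BLIP-2 implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

num_queries, d_qformer = 32, 64   # toy sizes, far smaller than BLIP-2's
num_patches, d_image = 257, 80
d_llm = 96

# Learned query tokens and frozen image-patch features (random stand-ins).
queries = rng.standard_normal((num_queries, d_qformer))
patches = rng.standard_normal((num_patches, d_image))

# Cross-attention projections (trainable in the real model) and an output
# projection into the LLM's embedding space.
W_q = rng.standard_normal((d_qformer, d_qformer)) * 0.1
W_k = rng.standard_normal((d_image, d_qformer)) * 0.1
W_v = rng.standard_normal((d_image, d_qformer)) * 0.1
W_out = rng.standard_normal((d_qformer, d_llm)) * 0.1

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Queries attend over the image patches (single head for brevity).
q, k, v = queries @ W_q, patches @ W_k, patches @ W_v
attn = softmax(q @ k.T / np.sqrt(d_qformer))   # (num_queries, num_patches)
query_embeddings = (attn @ v) @ W_out          # (num_queries, d_llm)

# These query embeddings are prepended to the text embeddings, and the
# frozen LLM predicts the next token given them plus the previous text.
print(query_embeddings.shape)  # (32, 96)
```

In the real model the Q-Former is a multi-layer, multi-head BERT-style encoder trained end-to-end while the image encoder and OPT stay frozen; this sketch only shows the shape of the bridge.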
## Bias, Risks, Limitations, and Ethical Considerations

BLIP2-OPT uses off-the-shelf OPT as the language model. It inherits the same risks and limitations mentioned in Meta's model card:

> Like other large language models for which the diversity (or lack thereof) of training data induces downstream impact on the quality of our model, OPT-175B has limitations in terms of bias and safety. OPT-175B can also have quality issues in terms of generation diversity and hallucination. In general, OPT-175B is not immune from the plethora of issues that plague modern large language models.

BLIP2 is fine-tuned on image-text datasets (e.g. [LAION](https://laion.ai/blog/laion-400-open-dataset/)) collected from the internet. As a result, the model itself is potentially vulnerable to generating similarly inappropriate content or replicating inherent biases in the underlying data.

BLIP2 has not been tested in real-world applications and should not be directly deployed in any application. Researchers should first carefully assess the safety and fairness of the model in relation to the specific context in which it is being deployed.

### How to use

For code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/main/en/model_doc/blip-2#transformers.Blip2ForConditionalGeneration.forward.example).
3,349
[ [ -0.0302886962890625, -0.061859130859375, -0.0037078857421875, 0.046142578125, -0.0244598388671875, -0.003078460693359375, -0.0233154296875, -0.0728759765625, -0.0038661956787109375, 0.052520751953125, -0.0313720703125, -0.00814056396484375, -0.0455322265625, -0.00246429443359375, -0.0296478271484375, 0.0770263671875, 0.012359619140625, -0.003509521484375, -0.0161895751953125, 0.006877899169921875, -0.029449462890625, -0.0221099853515625, -0.052337646484375, -0.0019855499267578125, 0.01035308837890625, 0.040496826171875, 0.07440185546875, 0.0287933349609375, 0.047760009765625, 0.0176849365234375, -0.0265350341796875, 0.003963470458984375, -0.05755615234375, -0.01342010498046875, -0.0157623291015625, -0.050201416015625, -0.02520751953125, -0.0047607421875, 0.040771484375, 0.05035400390625, 0.0078887939453125, 0.01189422607421875, -0.00534820556640625, 0.046295166015625, -0.037139892578125, 0.01141357421875, -0.05816650390625, -0.01206207275390625, -0.0162353515625, 0.02105712890625, -0.047088623046875, -0.01192474365234375, 0.01380157470703125, -0.029327392578125, 0.0077972412109375, 0.0247039794921875, 0.1038818359375, 0.02410888671875, -0.0207366943359375, -0.0243682861328125, -0.041748046875, 0.057708740234375, -0.046417236328125, 0.04388427734375, 0.033447265625, 0.0272979736328125, -0.003387451171875, -0.051055908203125, -0.043121337890625, -0.01617431640625, -0.005279541015625, 0.009796142578125, -0.0130157470703125, 0.007724761962890625, 0.02728271484375, 0.02947998046875, -0.043609619140625, 0.025665283203125, -0.04925537109375, -0.0286102294921875, 0.04547119140625, -0.0174102783203125, 0.0255279541015625, -0.0321044921875, -0.041595458984375, -0.0012712478637695312, -0.059967041015625, 0.00943756103515625, 0.0247955322265625, 0.0239410400390625, -0.039459228515625, 0.061279296875, -0.001232147216796875, 0.046417236328125, 0.01020050048828125, -0.0027446746826171875, 0.03338623046875, -0.029327392578125, -0.0052490234375, -0.01424407958984375, 
0.0706787109375, 0.04241943359375, 0.025543212890625, -0.0088653564453125, -0.021209716796875, 0.0013933181762695312, 0.0151214599609375, -0.08001708984375, -0.00821685791015625, 0.026947021484375, -0.0380859375, -0.014373779296875, -0.006023406982421875, -0.0731201171875, -0.006626129150390625, -0.0036296844482421875, 0.01007843017578125, -0.03729248046875, -0.028106689453125, -0.000507354736328125, -0.01036834716796875, 0.0215606689453125, 0.01303863525390625, -0.0740966796875, 0.01476287841796875, 0.051177978515625, 0.06707763671875, -0.013641357421875, -0.018341064453125, -0.0191497802734375, 0.003612518310546875, -0.033294677734375, 0.0478515625, -0.032196044921875, -0.01555633544921875, 0.025848388671875, 0.01303863525390625, 0.0172271728515625, -0.044769287109375, 0.0263519287109375, -0.034881591796875, 0.01953125, -0.0232086181640625, -0.03912353515625, -0.0125579833984375, -0.004638671875, -0.049407958984375, 0.0867919921875, 0.026824951171875, -0.057342529296875, 0.044403076171875, -0.03515625, -0.033447265625, 0.0151214599609375, 0.002513885498046875, -0.048370361328125, 0.0036468505859375, 0.003421783447265625, 0.0199432373046875, -0.033721923828125, 0.0229034423828125, -0.032470703125, -0.0204925537109375, 0.003986358642578125, -0.006046295166015625, 0.07000732421875, 0.01222991943359375, -0.03741455078125, -0.01122283935546875, -0.031494140625, -0.01971435546875, 0.0164947509765625, -0.019561767578125, -0.00841522216796875, -0.009033203125, 0.032470703125, 0.029937744140625, 0.03265380859375, -0.044464111328125, 0.00910186767578125, -0.048187255859375, 0.03143310546875, 0.024200439453125, -0.010772705078125, 0.043792724609375, -0.00872802734375, 0.037017822265625, 0.01389312744140625, 0.0278778076171875, -0.0258636474609375, -0.05780029296875, -0.06524658203125, -0.01525115966796875, 0.0301513671875, 0.053070068359375, -0.055419921875, 0.037933349609375, -0.013916015625, -0.04296875, -0.050384521484375, 0.0191650390625, 0.05267333984375, 
0.0226898193359375, 0.0298614501953125, -0.01322174072265625, -0.043304443359375, -0.08349609375, 0.01216888427734375, -0.0229034423828125, 0.0114288330078125, 0.0290374755859375, 0.038848876953125, -0.0299530029296875, 0.07470703125, -0.0242156982421875, -0.01273345947265625, -0.0355224609375, -0.0088958740234375, -0.0073089599609375, 0.043609619140625, 0.0701904296875, -0.07684326171875, -0.0263519287109375, -0.00954437255859375, -0.04925537109375, -0.00858306884765625, -0.0096282958984375, -0.0256805419921875, 0.05126953125, 0.04205322265625, -0.0704345703125, 0.02447509765625, 0.060455322265625, -0.0191650390625, 0.032562255859375, 0.017486572265625, -0.0195465087890625, -0.0863037109375, 0.02099609375, 0.01302337646484375, -0.0235137939453125, -0.038330078125, 0.00879669189453125, 0.01497650146484375, -0.0176239013671875, -0.042724609375, 0.06011962890625, -0.031463623046875, -0.0211334228515625, -0.01157379150390625, -0.000202178955078125, 0.0048980712890625, 0.026947021484375, 0.01131439208984375, 0.06451416015625, 0.05474853515625, -0.036285400390625, 0.01105499267578125, 0.042572021484375, -0.0219879150390625, 0.023040771484375, -0.061614990234375, 0.00698089599609375, -0.01131439208984375, 0.0201568603515625, -0.07745361328125, -0.01861572265625, 0.01995849609375, -0.04388427734375, 0.0390625, -0.01131439208984375, -0.04962158203125, -0.059906005859375, -0.016998291015625, 0.025787353515625, 0.05596923828125, -0.04547119140625, 0.0254364013671875, 0.042388916015625, -0.015350341796875, -0.043609619140625, -0.07635498046875, 0.010772705078125, 0.000537872314453125, -0.05364990234375, 0.032470703125, -0.00836181640625, 0.01055908203125, 0.00007092952728271484, 0.0238800048828125, -0.0018529891967773438, -0.0106353759765625, 0.0242156982421875, 0.0146331787109375, -0.00467681884765625, 0.003925323486328125, 0.0248565673828125, 0.0003688335418701172, -0.00841522216796875, -0.00634002685546875, 0.0540771484375, 0.00292205810546875, 0.0015573501586914062, 
-0.0350341796875, 0.0259552001953125, 0.044158935546875, -0.025054931640625, 0.0579833984375, 0.046539306640625, -0.038116455078125, -0.01326751708984375, -0.049560546875, -0.01141357421875, -0.039276123046875, 0.043701171875, -0.0204925537109375, -0.0665283203125, 0.038848876953125, 0.006732940673828125, 0.012481689453125, 0.020172119140625, 0.06317138671875, 0.007659912109375, 0.07366943359375, 0.057708740234375, -0.0096282958984375, 0.045318603515625, -0.0241241455078125, 0.0159759521484375, -0.057037353515625, -0.032745361328125, -0.026458740234375, -0.0029010772705078125, -0.04937744140625, -0.033355712890625, -0.00024819374084472656, 0.0276336669921875, -0.01313018798828125, 0.033660888671875, -0.0298309326171875, 0.01201629638671875, 0.043426513671875, 0.004573822021484375, -0.0129852294921875, -0.00022649765014648438, 0.0012273788452148438, 0.0040435791015625, -0.04925537109375, -0.032501220703125, 0.0694580078125, 0.05474853515625, 0.04840087890625, 0.00702667236328125, 0.0305633544921875, -0.00833892822265625, 0.0231781005859375, -0.06585693359375, 0.040435791015625, -0.0254364013671875, -0.072265625, -0.01377105712890625, -0.0273284912109375, -0.06341552734375, 0.0007424354553222656, -0.016387939453125, -0.058074951171875, 0.0068359375, 0.03509521484375, -0.0019102096557617188, 0.003612518310546875, -0.06341552734375, 0.07135009765625, -0.0268402099609375, -0.0313720703125, 0.0026073455810546875, -0.056732177734375, 0.035064697265625, -0.005290985107421875, 0.004070281982421875, 0.0016326904296875, -0.0030651092529296875, 0.0440673828125, -0.0254364013671875, 0.07269287109375, -0.0191650390625, 0.0018548965454101562, 0.03546142578125, 0.003265380859375, 0.0242767333984375, -0.00006479024887084961, -0.00864410400390625, 0.0290679931640625, -0.00910186767578125, -0.036651611328125, -0.028900146484375, 0.0244598388671875, -0.05780029296875, -0.0297088623046875, -0.00400543212890625, -0.0316162109375, -0.01126861572265625, 0.0282135009765625, 
0.047882080078125, 0.0276336669921875, -0.004749298095703125, 0.022857666015625, 0.0257568359375, -0.048736572265625, 0.031280517578125, 0.02044677734375, -0.038848876953125, -0.029541015625, 0.06011962890625, -0.005367279052734375, 0.03765869140625, 0.027923583984375, 0.0049285888671875, -0.04412841796875, -0.033538818359375, -0.04669189453125, 0.0413818359375, -0.040496826171875, -0.032958984375, -0.0501708984375, -0.01486968994140625, -0.042633056640625, -0.00567626953125, -0.05133056640625, -0.0186767578125, -0.0284423828125, -0.00588226318359375, 0.0311126708984375, 0.035064697265625, 0.003376007080078125, 0.0311737060546875, -0.043426513671875, 0.0272979736328125, 0.006072998046875, 0.04058837890625, -0.01070404052734375, -0.03778076171875, -0.0140380859375, 0.0255584716796875, -0.018280029296875, -0.05865478515625, 0.0185546875, 0.024688720703125, 0.0293731689453125, 0.03485107421875, -0.017120361328125, 0.035064697265625, -0.04296875, 0.06610107421875, 0.02081298828125, -0.073486328125, 0.052337646484375, -0.026947021484375, 0.0129852294921875, 0.047698974609375, 0.04058837890625, -0.0276947021484375, -0.040313720703125, -0.036041259765625, -0.06573486328125, 0.052154541015625, 0.0186309814453125, 0.0293426513671875, -0.00966644287109375, 0.041168212890625, 0.0002574920654296875, 0.027191162109375, -0.06280517578125, -0.016143798828125, -0.0322265625, -0.01043701171875, -0.00833892822265625, -0.04510498046875, 0.016876220703125, -0.01447296142578125, 0.033172607421875, -0.00699615478515625, 0.032745361328125, -0.0036907196044921875, -0.0191802978515625, -0.0225372314453125, -0.0242156982421875, 0.0357666015625, 0.035369873046875, -0.01462554931640625, -0.0155029296875, -0.005615234375, -0.031890869140625, -0.0295867919921875, 0.01192474365234375, -0.0286102294921875, -0.0032196044921875, 0.025543212890625, 0.0867919921875, 0.0169830322265625, -0.06378173828125, 0.057525634765625, 0.0179290771484375, -0.009979248046875, -0.015899658203125, 
-0.004337310791015625, 0.0018033981323242188, 0.0013246536254882812, 0.016571044921875, -0.00738525390625, 0.016021728515625, -0.05126953125, 0.0309906005859375, 0.040069580078125, -0.0304718017578125, -0.034210205078125, 0.053436279296875, 0.0250091552734375, -0.027435302734375, 0.0484619140625, -0.03924560546875, -0.05731201171875, 0.050018310546875, 0.0675048828125, 0.036529541015625, -0.025146484375, 0.029083251953125, 0.035675048828125, 0.04339599609375, -0.00897216796875, 0.02703857421875, 0.0252685546875, -0.061859130859375, -0.02679443359375, -0.052337646484375, -0.0311737060546875, 0.0123443603515625, -0.047027587890625, 0.047088623046875, -0.03643798828125, -0.0179290771484375, 0.002353668212890625, -0.006923675537109375, -0.055572509765625, 0.024017333984375, 0.0215301513671875, 0.076904296875, -0.07427978515625, 0.035369873046875, 0.061492919921875, -0.046356201171875, -0.05914306640625, -0.003322601318359375, -0.005641937255859375, -0.060455322265625, 0.03912353515625, 0.03668212890625, 0.02825927734375, -0.01262664794921875, -0.057220458984375, -0.06884765625, 0.062225341796875, 0.03363037109375, -0.049102783203125, -0.004520416259765625, 0.0174407958984375, 0.05108642578125, -0.0209808349609375, 0.0229339599609375, 0.029541015625, 0.027008056640625, 0.02801513671875, -0.08843994140625, 0.0144500732421875, -0.019134521484375, 0.0143280029296875, 0.00806427001953125, -0.07861328125, 0.0555419921875, -0.023651123046875, -0.00995635986328125, 0.0152130126953125, 0.0462646484375, 0.0194549560546875, 0.0056915283203125, 0.03375244140625, 0.039276123046875, 0.040252685546875, -0.005077362060546875, 0.09246826171875, -0.0323486328125, 0.03546142578125, 0.061065673828125, -0.011199951171875, 0.0701904296875, 0.022247314453125, -0.002410888671875, 0.0242156982421875, 0.05670166015625, -0.0256805419921875, 0.033721923828125, -0.0031375885009765625, 0.025115966796875, -0.005863189697265625, -0.0202178955078125, -0.0255584716796875, 0.05670166015625, 
0.0384521484375, -0.027130126953125, 0.00971221923828125, 0.004459381103515625, 0.0229644775390625, -0.0219879150390625, -0.007122039794921875, 0.050933837890625, 0.0054473876953125, -0.06884765625, 0.046630859375, -0.004924774169921875, 0.059661865234375, -0.0202178955078125, 0.006591796875, -0.019256591796875, -0.0009107589721679688, -0.01580810546875, -0.058990478515625, 0.02716064453125, -0.0064697265625, 0.0017423629760742188, 0.0019216537475585938, 0.048553466796875, -0.0406494140625, -0.062347412109375, 0.037506103515625, 0.0171051025390625, 0.0096282958984375, 0.00937652587890625, -0.07891845703125, 0.02728271484375, -0.0030994415283203125, -0.0229034423828125, 0.0029125213623046875, 0.026702880859375, -0.0004029273986816406, 0.058837890625, 0.041748046875, 0.00946044921875, 0.0035247802734375, 0.016632080078125, 0.072265625, -0.0307464599609375, -0.041015625, -0.037353515625, 0.046905517578125, -0.01395416259765625, -0.035003662109375, 0.03692626953125, 0.060882568359375, 0.058380126953125, 0.004283905029296875, 0.051513671875, -0.01593017578125, 0.037567138671875, -0.040985107421875, 0.04608154296875, -0.0606689453125, -0.005657196044921875, -0.03924560546875, -0.07855224609375, -0.02587890625, 0.04443359375, -0.016204833984375, 0.023468017578125, 0.044647216796875, 0.0728759765625, -0.003131866455078125, -0.0129852294921875, 0.004314422607421875, 0.02825927734375, 0.028167724609375, 0.029754638671875, 0.049835205078125, -0.034698486328125, 0.0426025390625, -0.027435302734375, -0.031890869140625, -0.031219482421875, -0.050872802734375, -0.08740234375, -0.0526123046875, -0.0262603759765625, -0.0318603515625, 0.00847625732421875, 0.05072021484375, 0.07177734375, -0.053680419921875, -0.0072174072265625, -0.0094451904296875, -0.005859375, 0.016510009765625, -0.0186614990234375, 0.01134490966796875, -0.026336669921875, -0.06298828125, -0.006343841552734375, 0.022430419921875, 0.01428985595703125, -0.0229644775390625, 0.00733184814453125, -0.004199981689453125, 
-0.0038890838623046875, 0.051116943359375, 0.021636962890625, -0.052825927734375, -0.0295867919921875, 0.0007748603820800781, -0.00881195068359375, -0.0025424957275390625, 0.0338134765625, -0.03265380859375, 0.0261688232421875, 0.0238494873046875, 0.04205322265625, 0.0455322265625, 0.0064239501953125, 0.037200927734375, -0.04901123046875, 0.049957275390625, 0.01227569580078125, 0.029327392578125, 0.04656982421875, -0.034820556640625, 0.026275634765625, 0.0017557144165039062, -0.035858154296875, -0.06927490234375, 0.030517578125, -0.09228515625, -0.01812744140625, 0.12139892578125, 0.0043182373046875, -0.01540374755859375, 0.005374908447265625, -0.0220794677734375, 0.01288604736328125, -0.03546142578125, 0.050384521484375, 0.03277587890625, 0.0179901123046875, -0.03863525390625, -0.034881591796875, 0.0484619140625, 0.029144287109375, -0.058990478515625, -0.005741119384765625, 0.0299072265625, 0.0172882080078125, 0.0186767578125, 0.03558349609375, -0.012298583984375, 0.0232086181640625, -0.002529144287109375, 0.013519287109375, 0.005340576171875, -0.031890869140625, -0.006298065185546875, -0.0079498291015625, 0.00800323486328125, -0.004787445068359375 ] ]
laion/CLIP-ViT-B-16-DataComp.XL-s13B-b90K
2023-09-29T22:41:47.000Z
[ "open_clip", "zero-shot-image-classification", "dataset:mlfoundations/datacomp_pools", "arxiv:2304.14108", "license:mit", "region:us" ]
zero-shot-image-classification
laion
null
null
laion/CLIP-ViT-B-16-DataComp.XL-s13B-b90K
1
9,219
open_clip
2023-05-15T19:27:18
---
license: mit
widget:
- src: >-
    https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
  candidate_labels: playing music, playing sports
  example_title: Cat & Dog
library_name: open_clip
datasets:
- mlfoundations/datacomp_pools
pipeline_tag: zero-shot-image-classification
---

# Model card for CLIP ViT-B-16 trained on DataComp-1B

# Table of Contents

1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Details](#training-details)
4. [Evaluation](#evaluation)
5. [Acknowledgements](#acknowledgements)
6. [Citation](#citation)
7. [How To Get Started With the Model](#how-to-get-started-with-the-model)

# Model Details

## Model Description

A CLIP ViT-B/16 model trained on the DataComp-1B dataset (https://github.com/mlfoundations/datacomp) using OpenCLIP (https://github.com/mlfoundations/open_clip). Model training was done on the [stability.ai](https://stability.ai/) cluster.

# Uses

As per the original [OpenAI CLIP model card](https://github.com/openai/CLIP/blob/d50d76daa670286dd6cacf3bcd80b5e4823fc8e1/model-card.md), this model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models.

The OpenAI CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis. Additionally, the DataComp paper (https://arxiv.org/abs/2304.14108) includes additional discussion as it relates specifically to the training dataset.

## Direct Use

Zero-shot image classification, image and text retrieval, among others.

## Downstream Use

Image classification and other image-task fine-tuning, linear-probe image classification, image-generation guiding and conditioning, among others.
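Zero-shot classification, as listed under Direct Use, reduces to a cosine-similarity lookup: each class name is embedded via a text prompt (e.g. "a photo of a {class}"), and an image embedding is matched against those class embeddings. A minimal NumPy sketch of the scoring step — the embeddings here are random stand-ins for the actual CLIP image/text tower outputs, and the logit scale of 100 is an assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 512  # CLIP ViT-B/16 embedding width

def normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Stand-ins for encoder outputs; in practice these come from the text tower
# applied to prompts like "a photo of a cat".
class_names = ["dog", "cat", "car"]
text_embeds = normalize(rng.standard_normal((len(class_names), d)))

# Fake image embedding constructed to lie close to the "cat" text embedding.
image_embed = normalize(text_embeds[1] + 0.1 * rng.standard_normal(d))

# Cosine similarity (embeddings are unit-normalized), scaled by the learned
# logit scale, then softmax over the candidate classes.
logits = 100.0 * image_embed @ text_embeds.T
probs = np.exp(logits - logits.max())
probs /= probs.sum()

print(class_names[int(np.argmax(probs))])  # cat
```

Image/text retrieval works the same way, just ranking the other modality's embeddings by the same cosine score.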
## Out-of-Scope Use

As per the OpenAI models:

**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases such as image search in a constrained environment are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP's performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.

Certain use cases which would fall under the domain of surveillance and facial recognition are always out of scope, regardless of the performance of the model. This is because the use of artificial intelligence for such tasks can currently be premature, given the lack of testing norms and checks to ensure its fair use.

# Training Details

## Training Data

This model was trained with the 1.4 billion samples of the DataComp-1B dataset (https://arxiv.org/abs/2304.14108).

**IMPORTANT NOTE:** The motivation behind dataset creation is to democratize research and experimentation around large-scale multi-modal model training and the handling of uncurated, large-scale datasets crawled from the publicly available internet. Our recommendation is therefore to use the dataset for research purposes.

Be aware that this large-scale dataset is uncurated. Keep in mind that its uncurated nature means that collected links may lead to strongly discomforting and disturbing content for a human viewer. Therefore, please use the demo links with caution and at your own risk. It is possible to extract a "safe" subset by filtering out samples based on the safety tags (using a customized NSFW classifier that we trained).
While this strongly reduces the chance of encountering potentially harmful content when viewing, we cannot entirely exclude the possibility of harmful content still being present in safe mode, so the warning also applies there. We think that providing the dataset openly to broad research and other interested communities will allow for transparent investigation of the benefits that come with training large-scale models, as well as of pitfalls and dangers that may stay unreported or unnoticed when working with closed large datasets that remain restricted to a small community. While we provide our dataset openly, however, we do not recommend using it for creating ready-to-go industrial products, as the basic research about the general properties and safety of such large-scale models, which we would like to encourage with this release, is still in progress.

## Training Procedure

Please see https://arxiv.org/abs/2304.14108.

# Evaluation

Evaluation was done on 38 datasets, using the [DataComp repo](https://github.com/mlfoundations/datacomp) and the [LAION CLIP Benchmark](https://github.com/LAION-AI/CLIP_benchmark).

## Testing Data, Factors & Metrics

### Testing Data

The testing is performed on a suite of 38 datasets. See our paper for more details (https://arxiv.org/abs/2304.14108).

## Results

The model achieves a 73.5% zero-shot top-1 accuracy on ImageNet-1k. See our paper for more details and results (https://arxiv.org/abs/2304.14108).

# Acknowledgements

We acknowledge [stability.ai](https://stability.ai/) for the compute used to train this model.
# Citation

**BibTeX:**

DataComp

```bibtex
@article{datacomp,
  title={DataComp: In search of the next generation of multimodal datasets},
  author={Samir Yitzhak Gadre and Gabriel Ilharco and Alex Fang and Jonathan Hayase and Georgios Smyrnis and Thao Nguyen and Ryan Marten and Mitchell Wortsman and Dhruba Ghosh and Jieyu Zhang and Eyal Orgad and Rahim Entezari and Giannis Daras and Sarah Pratt and Vivek Ramanujan and Yonatan Bitton and Kalyani Marathe and Stephen Mussmann and Richard Vencu and Mehdi Cherti and Ranjay Krishna and Pang Wei Koh and Olga Saukh and Alexander Ratner and Shuran Song and Hannaneh Hajishirzi and Ali Farhadi and Romain Beaumont and Sewoong Oh and Alex Dimakis and Jenia Jitsev and Yair Carmon and Vaishaal Shankar and Ludwig Schmidt},
  journal={arXiv preprint arXiv:2304.14108},
  year={2023}
}
```

OpenAI CLIP paper

```bibtex
@inproceedings{Radford2021LearningTV,
  title={Learning Transferable Visual Models From Natural Language Supervision},
  author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
  booktitle={ICML},
  year={2021}
}
```

OpenCLIP software

```bibtex
@software{ilharco_gabriel_2021_5143773,
  author    = {Ilharco, Gabriel and Wortsman, Mitchell and Wightman, Ross and Gordon, Cade and Carlini, Nicholas and Taori, Rohan and Dave, Achal and Shankar, Vaishaal and Namkoong, Hongseok and Miller, John and Hajishirzi, Hannaneh and Farhadi, Ali and Schmidt, Ludwig},
  title     = {OpenCLIP},
  month     = jul,
  year      = 2021,
  note      = {If you use this software, please cite it as below.},
  publisher = {Zenodo},
  version   = {0.1},
  doi       = {10.5281/zenodo.5143773},
  url       = {https://doi.org/10.5281/zenodo.5143773}
}
```

# How to Get Started with the Model

See https://github.com/mlfoundations/open_clip
7,405
[ [ -0.032501220703125, -0.04937744140625, 0.0123748779296875, 0.0027923583984375, -0.03009033203125, -0.03314208984375, -0.01433563232421875, -0.0433349609375, 0.003753662109375, 0.0306243896484375, -0.042266845703125, -0.04620361328125, -0.04742431640625, -0.00753021240234375, -0.03326416015625, 0.06695556640625, -0.010833740234375, -0.0034580230712890625, -0.0278778076171875, -0.0250701904296875, -0.04150390625, -0.03875732421875, -0.0259246826171875, 0.00931549072265625, 0.01401519775390625, 0.0259857177734375, 0.050140380859375, 0.056365966796875, 0.05938720703125, 0.0165252685546875, -0.0007467269897460938, 0.00812530517578125, -0.044189453125, -0.031982421875, -0.00843048095703125, -0.0222625732421875, -0.0391845703125, 0.0164947509765625, 0.045166015625, 0.032958984375, -0.00672149658203125, 0.0182037353515625, -0.00678253173828125, 0.038543701171875, -0.056610107421875, 0.01898193359375, -0.04718017578125, 0.0015554428100585938, -0.01641845703125, -0.0015325546264648438, -0.0233154296875, -0.013031005859375, 0.0132904052734375, -0.05572509765625, 0.01314544677734375, -0.0018854141235351562, 0.093505859375, 0.01727294921875, -0.0245361328125, 0.0155487060546875, -0.049163818359375, 0.059295654296875, -0.0546875, 0.02398681640625, 0.026153564453125, 0.037200927734375, 0.005870819091796875, -0.0570068359375, -0.03814697265625, -0.005802154541015625, 0.0123443603515625, 0.01189422607421875, -0.02984619140625, 0.003170013427734375, 0.035003662109375, 0.0181427001953125, -0.031768798828125, 0.0022983551025390625, -0.050140380859375, 0.00047087669372558594, 0.0482177734375, 0.01114654541015625, 0.0264739990234375, -0.0197296142578125, -0.0576171875, -0.033233642578125, -0.04595947265625, 0.0305328369140625, 0.01395416259765625, 0.0112152099609375, -0.03643798828125, 0.0377197265625, 0.0006289482116699219, 0.036651611328125, -0.0081634521484375, -0.0179595947265625, 0.03912353515625, -0.03857421875, -0.023590087890625, -0.0121917724609375, 0.07635498046875, 
0.054534912109375, 0.01416015625, 0.00670623779296875, 0.004337310791015625, 0.0005598068237304688, 0.0269317626953125, -0.07781982421875, -0.00835418701171875, 0.005893707275390625, -0.042236328125, -0.0208282470703125, 0.03363037109375, -0.06658935546875, -0.0010023117065429688, -0.0215606689453125, 0.040740966796875, -0.036712646484375, -0.0229339599609375, 0.004039764404296875, -0.005443572998046875, 0.0173797607421875, 0.02587890625, -0.05084228515625, 0.0172119140625, 0.0279998779296875, 0.08685302734375, -0.0219268798828125, -0.02587890625, -0.0194854736328125, 0.00853729248046875, -0.0247039794921875, 0.03607177734375, -0.0225677490234375, -0.025665283203125, -0.0058441162109375, 0.0292816162109375, -0.01165771484375, -0.038543701171875, 0.035888671875, -0.0154266357421875, -0.0009813308715820312, -0.0139312744140625, -0.0172271728515625, -0.043243408203125, 0.016845703125, -0.05328369140625, 0.06561279296875, 0.0033130645751953125, -0.0662841796875, 0.029144287109375, -0.052093505859375, -0.00836944580078125, -0.021026611328125, -0.007152557373046875, -0.04876708984375, -0.021697998046875, 0.03826904296875, 0.03857421875, -0.0246124267578125, 0.0360107421875, -0.052398681640625, -0.0246429443359375, 0.0158843994140625, -0.032012939453125, 0.07330322265625, 0.0031070709228515625, -0.0244293212890625, 0.0110626220703125, -0.0521240234375, -0.00592803955078125, 0.018524169921875, 0.001979827880859375, -0.01898193359375, -0.0189208984375, 0.004611968994140625, 0.023651123046875, 0.0034618377685546875, -0.050872802734375, 0.00487518310546875, -0.0069580078125, 0.031982421875, 0.056396484375, 0.01053619384765625, 0.019378662109375, -0.03271484375, 0.04681396484375, 0.0196533203125, 0.046417236328125, -0.0242919921875, -0.03753662109375, -0.047943115234375, -0.045562744140625, 0.024932861328125, 0.037933349609375, -0.05377197265625, 0.031951904296875, -0.0219573974609375, -0.04852294921875, -0.0278472900390625, -0.00133514404296875, 0.0360107421875, 
0.044097900390625, 0.03912353515625, -0.041839599609375, -0.040283203125, -0.06805419921875, 0.023590087890625, 0.005126953125, 0.004642486572265625, 0.0374755859375, 0.05535888671875, -0.015380859375, 0.08160400390625, -0.049835205078125, -0.038909912109375, -0.00859832763671875, 0.0020351409912109375, 0.00913238525390625, 0.04547119140625, 0.07177734375, -0.0701904296875, -0.027191162109375, -0.016937255859375, -0.0806884765625, 0.01096343994140625, 0.00218963623046875, -0.0238800048828125, 0.00664520263671875, 0.033233642578125, -0.043853759765625, 0.057525634765625, 0.03533935546875, -0.000110626220703125, 0.037506103515625, -0.00577545166015625, 0.0076141357421875, -0.08221435546875, 0.034576416015625, 0.018035888671875, -0.01081085205078125, -0.04052734375, -0.00403594970703125, 0.0010309219360351562, -0.02801513671875, -0.057525634765625, 0.036224365234375, -0.02435302734375, 0.0023021697998046875, -0.00472259521484375, -0.0028095245361328125, 0.005096435546875, 0.056671142578125, 0.005229949951171875, 0.0648193359375, 0.062347412109375, -0.048583984375, 0.008056640625, 0.0362548828125, -0.03497314453125, 0.028717041015625, -0.06805419921875, -0.00579071044921875, -0.007076263427734375, 0.015380859375, -0.04486083984375, -0.022918701171875, 0.0312347412109375, -0.03265380859375, 0.0279998779296875, -0.022735595703125, -0.0225067138671875, -0.027557373046875, -0.042572021484375, 0.040191650390625, 0.0374755859375, -0.04656982421875, 0.019927978515625, 0.03875732421875, 0.0025424957275390625, -0.05133056640625, -0.0604248046875, -0.0193634033203125, -0.0174713134765625, -0.050567626953125, 0.0245819091796875, -0.01157379150390625, 0.0018634796142578125, 0.0019779205322265625, 0.0052490234375, -0.01444244384765625, -0.00936126708984375, 0.05181884765625, 0.04852294921875, -0.0053253173828125, -0.00469207763671875, -0.008270263671875, -0.0044403076171875, -0.0024700164794921875, 0.00023949146270751953, 0.0133209228515625, -0.00968170166015625, 
-0.034149169921875, -0.0377197265625, 0.0218353271484375, 0.048126220703125, -0.03228759765625, 0.05029296875, 0.047943115234375, -0.0282440185546875, 0.002162933349609375, -0.025146484375, -0.0038356781005859375, -0.034942626953125, 0.03314208984375, 0.00655364990234375, -0.04974365234375, 0.042266845703125, 0.0104827880859375, -0.0021953582763671875, 0.03704833984375, 0.0282440185546875, 0.00807952880859375, 0.0706787109375, 0.063720703125, 0.0018529891967773438, 0.055755615234375, -0.053375244140625, 0.00740814208984375, -0.06988525390625, -0.0306854248046875, -0.01523590087890625, -0.01428985595703125, -0.04425048828125, -0.0447998046875, 0.053375244140625, 0.012176513671875, -0.0158233642578125, 0.031951904296875, -0.0330810546875, 0.015228271484375, 0.044952392578125, 0.03680419921875, 0.0059661865234375, -0.0023479461669921875, -0.0007839202880859375, -0.0097503662109375, -0.051513671875, -0.0257568359375, 0.09112548828125, 0.041717529296875, 0.06488037109375, -0.005680084228515625, 0.0272674560546875, 0.012298583984375, -0.004337310791015625, -0.053497314453125, 0.04620361328125, -0.0225982666015625, -0.0570068359375, -0.0206298828125, -0.0277099609375, -0.06591796875, 0.004486083984375, -0.01424407958984375, -0.054840087890625, 0.0447998046875, 0.007656097412109375, -0.0221710205078125, 0.038116455078125, -0.046142578125, 0.08050537109375, -0.0241241455078125, -0.0251922607421875, 0.00562286376953125, -0.05975341796875, 0.04083251953125, 0.00897979736328125, 0.0066375732421875, -0.01543426513671875, 0.004871368408203125, 0.06744384765625, -0.050079345703125, 0.07598876953125, -0.018707275390625, 0.0185089111328125, 0.045989990234375, -0.015899658203125, 0.013031005859375, 0.0038547515869140625, 0.00959014892578125, 0.047882080078125, 0.006511688232421875, -0.0139312744140625, -0.032501220703125, 0.036163330078125, -0.06304931640625, -0.0197601318359375, -0.030181884765625, -0.040985107421875, 0.00933837890625, 0.02557373046875, 0.034759521484375, 
0.056488037109375, -0.005260467529296875, 0.03289794921875, 0.049896240234375, -0.026214599609375, 0.035919189453125, 0.0231475830078125, -0.0155029296875, -0.050048828125, 0.0758056640625, 0.0281219482421875, 0.0270233154296875, 0.0141754150390625, -0.00047850608825683594, -0.0037860870361328125, -0.035247802734375, -0.034149169921875, 0.01177215576171875, -0.0587158203125, -0.03314208984375, -0.0364990234375, -0.02288818359375, -0.025726318359375, -0.004222869873046875, -0.03955078125, -0.022186279296875, -0.042694091796875, -0.005710601806640625, 0.036407470703125, 0.033416748046875, -0.0164947509765625, 0.01446533203125, -0.058929443359375, 0.02410888671875, 0.0215911865234375, 0.037017822265625, 0.0023517608642578125, -0.043060302734375, -0.0224151611328125, 0.018646240234375, -0.042572021484375, -0.04302978515625, 0.0278778076171875, 0.021148681640625, 0.0447998046875, 0.0419921875, 0.0171356201171875, 0.0506591796875, -0.025604248046875, 0.07403564453125, 0.03009033203125, -0.054290771484375, 0.048553466796875, -0.0419921875, 0.0203094482421875, 0.05279541015625, 0.056396484375, -0.01178741455078125, -0.0022716522216796875, -0.046966552734375, -0.0728759765625, 0.070068359375, 0.0087738037109375, -0.0011043548583984375, 0.0033016204833984375, 0.01253509521484375, 0.00377655029296875, 0.0186614990234375, -0.0623779296875, -0.012176513671875, -0.037933349609375, 0.0007991790771484375, 0.009674072265625, -0.01114654541015625, -0.0147247314453125, -0.03656005859375, 0.0531005859375, -0.015960693359375, 0.045989990234375, 0.022125244140625, -0.004436492919921875, -0.0153350830078125, -0.0028057098388671875, 0.038116455078125, 0.051971435546875, -0.038238525390625, -0.01393890380859375, -0.0023365020751953125, -0.049713134765625, -0.00848388671875, 0.01210784912109375, -0.033538818359375, -0.005687713623046875, 0.028961181640625, 0.08935546875, 0.0250244140625, -0.055511474609375, 0.07281494140625, 0.00505828857421875, -0.0296478271484375, -0.0257415771484375, 
0.00908660888671875, -0.0296630859375, 0.0156402587890625, 0.0186920166015625, 0.01751708984375, 0.01141357421875, -0.044952392578125, 0.02081298828125, 0.038330078125, -0.03546142578125, -0.02655029296875, 0.061248779296875, -0.0010356903076171875, 0.0081024169921875, 0.04595947265625, -0.0013704299926757812, -0.03118896484375, 0.055328369140625, 0.0280609130859375, 0.06488037109375, 0.004730224609375, 0.0208587646484375, 0.0531005859375, 0.0197601318359375, -0.01323699951171875, 0.007549285888671875, 0.01055908203125, -0.036346435546875, -0.01015472412109375, -0.0286865234375, -0.03424072265625, 0.014495849609375, -0.0718994140625, 0.041290283203125, -0.05072021484375, -0.02655029296875, -0.007091522216796875, -0.0260162353515625, -0.0428466796875, 0.01203155517578125, 0.01308441162109375, 0.0753173828125, -0.0643310546875, 0.053497314453125, 0.04534912109375, -0.06500244140625, -0.05633544921875, -0.00478363037109375, -0.0036220550537109375, -0.040924072265625, 0.0305938720703125, 0.037261962890625, 0.0029850006103515625, -0.0205078125, -0.07733154296875, -0.07403564453125, 0.112060546875, 0.03857421875, -0.0225677490234375, -0.006496429443359375, 0.004192352294921875, 0.0276336669921875, -0.0206146240234375, 0.029449462890625, 0.0137939453125, 0.0172576904296875, 0.021484375, -0.07928466796875, 0.0017442703247070312, -0.0295562744140625, 0.0129547119140625, 0.006160736083984375, -0.07037353515625, 0.07421875, -0.0238037109375, -0.0196990966796875, 0.0025691986083984375, 0.039886474609375, 0.00807952880859375, 0.031768798828125, 0.02685546875, 0.05377197265625, 0.0321044921875, -0.00007051229476928711, 0.07427978515625, -0.009307861328125, 0.0287628173828125, 0.08233642578125, -0.0005140304565429688, 0.072998046875, 0.02532958984375, -0.01861572265625, 0.0283050537109375, 0.033538818359375, -0.0345458984375, 0.05865478515625, -0.0209503173828125, 0.0148468017578125, -0.01412200927734375, -0.0291748046875, -0.0362548828125, 0.03924560546875, 0.0054779052734375, 
-0.034698486328125, -0.01383209228515625, 0.0302581787109375, -0.0003771781921386719, -0.015167236328125, -0.01611328125, 0.040130615234375, 0.00733184814453125, -0.0312347412109375, 0.0628662109375, -0.00881195068359375, 0.055328369140625, -0.059844970703125, -0.01108551025390625, -0.0012483596801757812, 0.01470947265625, -0.018035888671875, -0.05877685546875, 0.0186920166015625, -0.0008053779602050781, -0.0161895751953125, -0.007450103759765625, 0.0482177734375, -0.017669677734375, -0.035186767578125, 0.0298614501953125, 0.003910064697265625, 0.010894775390625, 0.0004839897155761719, -0.050567626953125, 0.01476287841796875, -0.0015745162963867188, -0.0157318115234375, 0.033599853515625, 0.017669677734375, -0.006687164306640625, 0.05438232421875, 0.044097900390625, -0.0167388916015625, 0.012237548828125, -0.01352691650390625, 0.08782958984375, -0.03558349609375, -0.034149169921875, -0.040863037109375, 0.050811767578125, -0.00667572021484375, -0.035247802734375, 0.05743408203125, 0.041717529296875, 0.07891845703125, -0.01153564453125, 0.057525634765625, -0.0242156982421875, 0.0289764404296875, -0.045440673828125, 0.0447998046875, -0.049774169921875, 0.00399017333984375, -0.044525146484375, -0.05364990234375, -0.017608642578125, 0.03802490234375, -0.0236968994140625, 0.004901885986328125, 0.04547119140625, 0.06646728515625, -0.0254974365234375, -0.001476287841796875, 0.01708984375, 0.006824493408203125, 0.0198974609375, 0.03936767578125, 0.0335693359375, -0.05419921875, 0.051910400390625, -0.05670166015625, -0.0245208740234375, -0.0146942138671875, -0.06719970703125, -0.0811767578125, -0.0477294921875, -0.033355712890625, -0.006443023681640625, -0.0052947998046875, 0.06903076171875, 0.072998046875, -0.05572509765625, -0.01641845703125, 0.00760650634765625, -0.017913818359375, -0.0248565673828125, -0.0163726806640625, 0.02880859375, 0.01335906982421875, -0.042144775390625, 0.01412200927734375, 0.01035308837890625, 0.019378662109375, -0.0126190185546875, 
-0.00218963623046875, -0.038970947265625, -0.00943756103515625, 0.033905029296875, 0.034454345703125, -0.039215087890625, -0.0155792236328125, 0.0002231597900390625, 0.005275726318359375, 0.0245819091796875, 0.040740966796875, -0.041412353515625, 0.0494384765625, 0.03436279296875, 0.034637451171875, 0.045196533203125, 0.0159912109375, 0.0183258056640625, -0.0494384765625, 0.0292816162109375, 0.00035071372985839844, 0.0260467529296875, 0.0240936279296875, -0.0239105224609375, 0.053619384765625, 0.0297698974609375, -0.03369140625, -0.069091796875, -0.00934600830078125, -0.09942626953125, -0.010009765625, 0.08441162109375, -0.03521728515625, -0.03143310546875, 0.0203704833984375, -0.01702880859375, 0.0260467529296875, -0.03369140625, 0.0335693359375, 0.0280609130859375, 0.00397491455078125, -0.032440185546875, -0.06500244140625, 0.0213623046875, 0.00078582763671875, -0.06561279296875, -0.004764556884765625, 0.038482666015625, 0.023468017578125, 0.0207061767578125, 0.03094482421875, -0.0271453857421875, 0.0283660888671875, -0.0006709098815917969, 0.0236968994140625, -0.038818359375, -0.04595947265625, -0.03192138671875, 0.0028839111328125, -0.01512908935546875, -0.04290771484375 ] ]
stablediffusionapi/edge-of-realism
2023-05-18T12:56:13.000Z
[ "diffusers", "stablediffusionapi.com", "stable-diffusion-api", "text-to-image", "ultra-realistic", "license:creativeml-openrail-m", "endpoints_compatible", "has_space", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
stablediffusionapi
null
null
stablediffusionapi/edge-of-realism
18
9,216
diffusers
2023-04-22T03:05:12
---
license: creativeml-openrail-m
tags:
- stablediffusionapi.com
- stable-diffusion-api
- text-to-image
- ultra-realistic
pinned: true
---

# API Inference

![generated from stablediffusionapi.com](https://cdn.stablediffusionapi.com/generations/7504788501684254537.png)

## Get API Key

Get an API key from [Stable Diffusion API](http://stablediffusionapi.com/); no payment needed. Replace the key in the code below and change **model_id** to "edge-of-realism".

Coding in PHP/Node/Java etc.? Have a look at the docs for more code examples: [View docs](https://stablediffusionapi.com/docs)

Model link: [View model](https://stablediffusionapi.com/models/edge-of-realism)

Credits: [View credits](https://civitai.com/?query=model_search)

View all models: [View Models](https://stablediffusionapi.com/models)

```python
import requests
import json

url = "https://stablediffusionapi.com/api/v3/dreambooth"

payload = json.dumps({
    "key": "",
    "model_id": "edge-of-realism",
    "prompt": "actual 8K portrait photo of gareth person, portrait, happy colors, bright eyes, clear eyes, warm smile, smooth soft skin, big dreamy eyes, beautiful intricate colored hair, symmetrical, anime wide eyes, soft lighting, detailed face, by makoto shinkai, stanley artgerm lau, wlop, rossdraws, concept art, digital painting, looking into camera",
    "negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime",
    "width": "512",
    "height": "512",
    "samples": "1",
    "num_inference_steps": "30",
    "safety_checker": "no",
    "enhance_prompt": "yes",
    "seed": None,
    "guidance_scale": 7.5,
    "multi_lingual": "no",
    "panorama": "no",
    "self_attention": "no",
    "upscale": "no",
    "embeddings": "embeddings_model_id",
    "lora": "lora_model_id",
    "webhook": None,
    "track_id": None
})

headers = {
    'Content-Type': 'application/json'
}

response = requests.request("POST", url, headers=headers, data=payload)

print(response.text)
```

> Use this coupon code to get 25% off: **DMGG0RBN**
2,393
[ [ -0.0325927734375, -0.0711669921875, 0.037567138671875, 0.0155029296875, -0.028045654296875, 0.007659912109375, 0.0202484130859375, -0.0296173095703125, 0.038116455078125, 0.039947509765625, -0.06402587890625, -0.06524658203125, -0.0257720947265625, -0.0074920654296875, -0.006778717041015625, 0.040283203125, 0.002532958984375, -0.00942230224609375, -0.01080322265625, 0.0108795166015625, -0.0104217529296875, -0.00846099853515625, -0.05462646484375, -0.01055145263671875, 0.005184173583984375, -0.0102691650390625, 0.049407958984375, 0.056884765625, 0.0287322998046875, 0.0207977294921875, -0.01244354248046875, -0.01168060302734375, -0.03106689453125, -0.023834228515625, -0.0102386474609375, -0.053192138671875, -0.04931640625, -0.0074920654296875, 0.02838134765625, 0.0330810546875, 0.0057525634765625, 0.03289794921875, -0.00018584728240966797, 0.0489501953125, -0.046661376953125, 0.02105712890625, -0.0229034423828125, 0.01654052734375, 0.00372314453125, -0.003948211669921875, -0.01436614990234375, -0.027618408203125, -0.00894927978515625, -0.0684814453125, 0.0183868408203125, 0.0004949569702148438, 0.10205078125, 0.0162506103515625, -0.01959228515625, -0.009521484375, -0.035186767578125, 0.056060791015625, -0.07476806640625, 0.0198211669921875, 0.033050537109375, 0.00432586669921875, -0.006805419921875, -0.0635986328125, -0.04534912109375, 0.0280914306640625, 0.0213470458984375, 0.0185394287109375, -0.032012939453125, -0.0013456344604492188, 0.019195556640625, 0.027374267578125, -0.031219482421875, -0.0150146484375, -0.02593994140625, -0.0121612548828125, 0.043548583984375, 0.017486572265625, 0.02105712890625, -0.0229034423828125, -0.03082275390625, -0.0242462158203125, -0.0379638671875, 0.031280517578125, 0.049041748046875, 0.02789306640625, -0.041351318359375, 0.028961181640625, -0.032806396484375, 0.056915283203125, 0.015625, -0.026123046875, 0.04852294921875, -0.00913238525390625, -0.0159149169921875, -0.0171051025390625, 0.07159423828125, 0.05987548828125, 
-0.0131988525390625, 0.0244903564453125, -0.00861358642578125, 0.0034465789794921875, 0.00033283233642578125, -0.07855224609375, -0.0084075927734375, 0.055572509765625, -0.05609130859375, -0.041229248046875, -0.00138092041015625, -0.07476806640625, -0.009246826171875, -0.0009202957153320312, 0.032928466796875, -0.0304412841796875, -0.04376220703125, 0.0225677490234375, -0.0122222900390625, 0.00894927978515625, 0.0142059326171875, -0.049560546875, 0.0121307373046875, 0.0416259765625, 0.07061767578125, 0.01291656494140625, -0.0009608268737792969, 0.0194549560546875, 0.0047454833984375, -0.02459716796875, 0.058074951171875, -0.0160369873046875, -0.035491943359375, -0.00463104248046875, 0.0192108154296875, -0.0024814605712890625, -0.031982421875, 0.04736328125, -0.03973388671875, -0.01232147216796875, -0.0130462646484375, -0.0309600830078125, -0.0245208740234375, 0.0167236328125, -0.042388916015625, 0.0477294921875, 0.01218414306640625, -0.056304931640625, 0.0139923095703125, -0.05670166015625, -0.01329803466796875, -0.00262451171875, 0.006160736083984375, -0.035308837890625, -0.00882720947265625, 0.0143280029296875, 0.017791748046875, 0.006946563720703125, -0.004062652587890625, -0.0543212890625, -0.0237884521484375, 0.0230865478515625, -0.0238800048828125, 0.08837890625, 0.03326416015625, -0.0164642333984375, -0.00559234619140625, -0.06536865234375, 0.00818634033203125, 0.04229736328125, -0.017303466796875, -0.0217132568359375, -0.0177764892578125, 0.005901336669921875, -0.0007214546203613281, 0.027099609375, -0.04290771484375, 0.0200347900390625, -0.032470703125, 0.035919189453125, 0.045989990234375, 0.0195159912109375, 0.01788330078125, -0.020782470703125, 0.0477294921875, 0.0059814453125, 0.033111572265625, 0.0035247802734375, -0.0419921875, -0.03753662109375, -0.03472900390625, 0.0215911865234375, 0.03741455078125, -0.04046630859375, 0.0304107666015625, -0.016204833984375, -0.049163818359375, -0.051055908203125, -0.012786865234375, 0.0236053466796875, 
0.04168701171875, 0.00972747802734375, -0.01202392578125, -0.0411376953125, -0.0635986328125, -0.0019121170043945312, -0.009002685546875, -0.0035915374755859375, 0.02264404296875, 0.03778076171875, -0.026123046875, 0.07135009765625, -0.06170654296875, -0.007114410400390625, -0.0008344650268554688, 0.0008602142333984375, 0.061492919921875, 0.050140380859375, 0.05731201171875, -0.06396484375, -0.02911376953125, -0.0284881591796875, -0.0557861328125, 0.00876617431640625, 0.0149688720703125, -0.027313232421875, -0.011962890625, 0.0097503662109375, -0.05755615234375, 0.043426513671875, 0.033782958984375, -0.040863037109375, 0.038665771484375, -0.01080322265625, 0.043975830078125, -0.09051513671875, -0.003856658935546875, 0.01404571533203125, -0.0180206298828125, -0.0330810546875, 0.0352783203125, -0.002880096435546875, -0.0085601806640625, -0.05419921875, 0.044036865234375, -0.022216796875, 0.017608642578125, -0.0216827392578125, 0.01093292236328125, 0.018646240234375, 0.031768798828125, 0.004833221435546875, 0.0295562744140625, 0.05120849609375, -0.041748046875, 0.037353515625, 0.015472412109375, -0.023681640625, 0.0517578125, -0.04705810546875, 0.00833892822265625, 0.003185272216796875, 0.0274200439453125, -0.08197021484375, -0.03765869140625, 0.034912109375, -0.04730224609375, -0.006587982177734375, -0.049896240234375, -0.03851318359375, -0.045257568359375, -0.0268402099609375, 0.022003173828125, 0.06036376953125, -0.031005859375, 0.052398681640625, 0.0231475830078125, 0.0112152099609375, -0.0494384765625, -0.060882568359375, -0.0247344970703125, -0.0274658203125, -0.04925537109375, 0.0237579345703125, -0.0100555419921875, -0.02789306640625, 0.006465911865234375, 0.00957489013671875, -0.0168914794921875, -0.0010480880737304688, 0.032989501953125, 0.03302001953125, -0.0194091796875, -0.02117919921875, 0.0191650390625, -0.00164031982421875, 0.0160064697265625, -0.0222625732421875, 0.059326171875, -0.0118560791015625, -0.04327392578125, -0.06549072265625, 
0.0011510848999023438, 0.04248046875, 0.00824737548828125, 0.039306640625, 0.0443115234375, -0.05340576171875, 0.0082550048828125, -0.03759765625, -0.01531982421875, -0.037261962890625, 0.01415252685546875, -0.04046630859375, -0.0233001708984375, 0.077880859375, -0.0091552734375, -0.003662109375, 0.0509033203125, 0.030487060546875, -0.01108551025390625, 0.096923828125, 0.0200347900390625, 0.01416778564453125, 0.0236053466796875, -0.061492919921875, -0.00475311279296875, -0.0633544921875, -0.012176513671875, -0.0203704833984375, -0.02581787109375, -0.024139404296875, -0.0289459228515625, 0.007442474365234375, 0.0281982421875, -0.0250701904296875, 0.018218994140625, -0.048004150390625, 0.02752685546875, 0.0369873046875, 0.033782958984375, 0.0180206298828125, 0.00333404541015625, -0.00799560546875, -0.004528045654296875, -0.027923583984375, -0.035247802734375, 0.08111572265625, 0.027618408203125, 0.056304931640625, 0.002376556396484375, 0.048095703125, 0.01104736328125, -0.002288818359375, -0.038909912109375, 0.036346435546875, 0.0191802978515625, -0.07098388671875, 0.00620269775390625, -0.01971435546875, -0.073974609375, 0.022857666015625, -0.0201568603515625, -0.0653076171875, 0.046539306640625, 0.01129913330078125, -0.053802490234375, 0.04449462890625, -0.043426513671875, 0.058837890625, -0.00494384765625, -0.04534912109375, -0.0089263916015625, -0.034637451171875, 0.03143310546875, 0.005462646484375, 0.051361083984375, -0.03485107421875, -0.011993408203125, 0.055145263671875, -0.036224365234375, 0.077392578125, -0.0298309326171875, 0.0039215087890625, 0.0509033203125, 0.01345062255859375, 0.0253143310546875, 0.035064697265625, -0.00927734375, 0.0177154541015625, 0.0222320556640625, -0.0440673828125, -0.0277252197265625, 0.061492919921875, -0.06378173828125, -0.03167724609375, -0.020416259765625, -0.0255584716796875, -0.00034308433532714844, 0.034393310546875, 0.043670654296875, 0.0298919677734375, -0.0023632049560546875, -0.00392913818359375, 0.058013916015625, 
-0.0020809173583984375, 0.0352783203125, 0.0167236328125, -0.049774169921875, -0.053253173828125, 0.056488037109375, -0.007076263427734375, 0.02685546875, 0.006866455078125, 0.0223846435546875, -0.0345458984375, -0.047149658203125, -0.0404052734375, 0.0201568603515625, -0.0570068359375, -0.02459716796875, -0.05902099609375, 0.0015010833740234375, -0.054107666015625, -0.018585205078125, -0.058074951171875, -0.0229949951171875, -0.043212890625, -0.011383056640625, 0.0511474609375, 0.023681640625, -0.0166778564453125, 0.0277099609375, -0.051025390625, 0.0205230712890625, 0.005352020263671875, 0.02081298828125, 0.007305145263671875, -0.041595458984375, -0.00684356689453125, 0.02777099609375, -0.0297698974609375, -0.06170654296875, 0.040679931640625, -0.0146331787109375, 0.0302276611328125, 0.06536865234375, 0.0193328857421875, 0.067138671875, 0.001781463623046875, 0.06591796875, 0.02642822265625, -0.06640625, 0.064697265625, -0.0506591796875, 0.0102691650390625, 0.046661376953125, 0.03033447265625, -0.0180206298828125, -0.019775390625, -0.0665283203125, -0.08123779296875, 0.035125732421875, 0.009033203125, 0.0250396728515625, 0.00839996337890625, 0.0347900390625, -0.002941131591796875, 0.0171356201171875, -0.0740966796875, -0.032501220703125, -0.0196990966796875, -0.01322174072265625, 0.032073974609375, 0.005428314208984375, -0.03228759765625, -0.0308990478515625, 0.057769775390625, -0.01039886474609375, 0.0247344970703125, 0.028839111328125, 0.0185546875, -0.018707275390625, -0.005260467529296875, 0.02752685546875, 0.059326171875, -0.04473876953125, -0.0125732421875, 0.001049041748046875, -0.03497314453125, 0.005397796630859375, 0.009796142578125, -0.0239410400390625, -0.0011739730834960938, 0.0259857177734375, 0.06378173828125, -0.01006317138671875, -0.0345458984375, 0.047149658203125, -0.0120697021484375, -0.029449462890625, -0.03857421875, 0.0033054351806640625, 0.0269012451171875, 0.041900634765625, 0.0360107421875, 0.020477294921875, 0.021392822265625, 
-0.0299835205078125, -0.006511688232421875, 0.0239410400390625, -0.030181884765625, -0.0260772705078125, 0.07452392578125, -0.001033782958984375, -0.034515380859375, 0.03240966796875, -0.028289794921875, -0.0109710693359375, 0.05712890625, 0.051361083984375, 0.0537109375, -0.0043487548828125, 0.01416778564453125, 0.05462646484375, -0.00007021427154541016, -0.008087158203125, 0.049560546875, 0.0206146240234375, -0.043975830078125, -0.015716552734375, -0.0595703125, -0.01346588134765625, 0.02044677734375, -0.0537109375, 0.035308837890625, -0.059539794921875, -0.03936767578125, -0.023162841796875, -0.016204833984375, -0.042083740234375, 0.0251312255859375, 0.00511932373046875, 0.0660400390625, -0.06317138671875, 0.037567138671875, 0.049560546875, -0.0421142578125, -0.06866455078125, -0.01220703125, 0.0106048583984375, -0.050689697265625, 0.0263214111328125, -0.001964569091796875, -0.000029146671295166016, 0.01024627685546875, -0.055877685546875, -0.0721435546875, 0.08404541015625, 0.0225677490234375, -0.0279388427734375, -0.003932952880859375, -0.002582550048828125, 0.03717041015625, -0.037384033203125, 0.032135009765625, 0.0217742919921875, 0.0251312255859375, 0.0164794921875, -0.035797119140625, 0.0092620849609375, -0.0305633544921875, -0.002246856689453125, -0.019805908203125, -0.059356689453125, 0.071044921875, -0.023529052734375, 0.002063751220703125, 0.009033203125, 0.05120849609375, 0.0489501953125, 0.033050537109375, 0.0428466796875, 0.067138671875, 0.0367431640625, -0.0154266357421875, 0.0728759765625, -0.028656005859375, 0.05859375, 0.04034423828125, 0.004680633544921875, 0.06866455078125, 0.027618408203125, -0.036376953125, 0.057769775390625, 0.0843505859375, -0.015045166015625, 0.057708740234375, 0.005462646484375, -0.0245819091796875, -0.01100921630859375, 0.005863189697265625, -0.05084228515625, 0.015350341796875, 0.0269622802734375, -0.0250396728515625, 0.01293182373046875, 0.007556915283203125, 0.006763458251953125, -0.009796142578125, 
-0.0237579345703125, 0.0305023193359375, 0.0013475418090820312, -0.01605224609375, 0.0576171875, -0.01233673095703125, 0.07025146484375, -0.039276123046875, -0.0009584426879882812, -0.00595855712890625, 0.0260467529296875, -0.0236968994140625, -0.04827880859375, 0.0055389404296875, -0.0188446044921875, -0.006282806396484375, 0.0026111602783203125, 0.041473388671875, 0.003330230712890625, -0.054534912109375, 0.021209716796875, 0.0211944580078125, 0.022216796875, -0.00391387939453125, -0.078369140625, 0.0204925537109375, 0.01409912109375, -0.0360107421875, 0.004840850830078125, 0.01898193359375, 0.036468505859375, 0.05010986328125, 0.05755615234375, 0.01386260986328125, 0.0017595291137695312, -0.007778167724609375, 0.047149658203125, -0.035552978515625, -0.04998779296875, -0.0692138671875, 0.049468994140625, -0.01702880859375, -0.0170440673828125, 0.043121337890625, 0.06365966796875, 0.057891845703125, -0.03387451171875, 0.063720703125, -0.01409149169921875, 0.034393310546875, -0.03228759765625, 0.062225341796875, -0.06005859375, 0.0030155181884765625, -0.041259765625, -0.056915283203125, -0.009979248046875, 0.04693603515625, -0.01511383056640625, 0.009735107421875, 0.035003662109375, 0.056610107421875, -0.0255584716796875, -0.005580902099609375, 0.0015459060668945312, 0.010528564453125, 0.010833740234375, 0.021881103515625, 0.0484619140625, -0.047515869140625, 0.03485107421875, -0.055877685546875, -0.023345947265625, -0.00939178466796875, -0.059539794921875, -0.05328369140625, -0.0271148681640625, -0.0482177734375, -0.06439208984375, -0.008453369140625, 0.0616455078125, 0.06817626953125, -0.0587158203125, -0.0140228271484375, -0.0050201416015625, 0.00444793701171875, -0.02978515625, -0.0236663818359375, 0.033203125, 0.0223541259765625, -0.0843505859375, 0.0130462646484375, -0.005695343017578125, 0.03070068359375, -0.006816864013671875, 0.007442474365234375, -0.01326751708984375, 0.00750732421875, 0.0147552490234375, 0.0220794677734375, -0.0614013671875, 
-0.005954742431640625, -0.0064697265625, 0.0011768341064453125, 0.0191497802734375, 0.01580810546875, -0.03326416015625, 0.02783203125, 0.050933837890625, 0.01329803466796875, 0.0389404296875, 0.004161834716796875, 0.01178741455078125, -0.0238800048828125, 0.02593994140625, 0.00797271728515625, 0.0440673828125, 0.01100921630859375, -0.047607421875, 0.035400390625, 0.047149658203125, -0.0277862548828125, -0.06756591796875, 0.003582000732421875, -0.07879638671875, -0.0308685302734375, 0.0731201171875, -0.0247344970703125, -0.052703857421875, 0.01000213623046875, -0.016448974609375, 0.025970458984375, -0.0296173095703125, 0.04351806640625, 0.04351806640625, -0.02630615234375, -0.0070953369140625, -0.0543212890625, 0.0156097412109375, 0.0015859603881835938, -0.056396484375, -0.0157928466796875, 0.02642822265625, 0.046173095703125, 0.043731689453125, 0.047027587890625, -0.032196044921875, 0.0096282958984375, 0.03497314453125, 0.0322265625, 0.0008101463317871094, 0.0256500244140625, -0.01389312744140625, 0.00928497314453125, -0.006427764892578125, -0.034393310546875 ] ]
osiria/bert-tweet-italian-uncased-sentiment
2023-06-20T22:31:17.000Z
[ "transformers", "pytorch", "safetensors", "bert", "text-classification", "it", "arxiv:1810.04805", "license:apache-2.0", "endpoints_compatible", "region:us" ]
text-classification
osiria
null
null
osiria/bert-tweet-italian-uncased-sentiment
1
9,209
transformers
2023-05-29T17:27:11
--- license: apache-2.0 language: - it widget: - text: "una fantastica giornata di #calcio! grande prestazione del mister e della squadra" example_title: "Example 1" - text: "il governo dovrebbe fare politica, non soltanto propaganda! #vergogna" example_title: "Example 2" - text: "che serata da sogno sul #redcarpet! grazie a tutti gli attori e registi del cinema italiano #oscar #awards" example_title: "Example 3" --- -------------------------------------------------------------------------------------------------- <body> <span class="vertical-text" style="background-color:lightgreen;border-radius: 3px;padding: 3px;"> </span> <br> <span class="vertical-text" style="background-color:orange;border-radius: 3px;padding: 3px;">    Task: Sentiment Analysis</span> <br> <span class="vertical-text" style="background-color:lightblue;border-radius: 3px;padding: 3px;">    Model: BERT-TWEET</span> <br> <span class="vertical-text" style="background-color:tomato;border-radius: 3px;padding: 3px;">    Lang: IT</span> <br> <span class="vertical-text" style="background-color:lightgrey;border-radius: 3px;padding: 3px;">  </span> <br> <span class="vertical-text" style="background-color:#CF9FFF;border-radius: 3px;padding: 3px;"> </span> </body> -------------------------------------------------------------------------------------------------- <h3>Model description</h3> This is a <b>BERT</b> <b>[1]</b> uncased model for the <b>Italian</b> language, fine-tuned for <b>Sentiment Analysis</b> (<b>positive</b> and <b>negative</b> classes only) on the [SENTIPOLC-16](https://www.evalita.it/campaigns/evalita-2016/tasks-challenge/sentipolc/) dataset, using <b>BERT-TWEET-ITALIAN</b> ([bert-tweet-base-italian-uncased](https://huggingface.co/osiria/bert-tweet-base-italian-uncased)) as a pre-trained model. 
<h3>Training and Performances</h3> The model is trained to perform binary sentiment classification (<b>positive</b> vs <b>negative</b>) and it's meant to be used primarily on tweets or other social media posts. It has been fine-tuned for Sentiment Analysis, using the SENTIPOLC-16 dataset, for 3 epochs with a constant learning rate of 1e-5 and exploiting class weighting to compensate for the class imbalance. Instances having both positive and negative sentiment have been excluded, resulting in 4154 training instances and 1050 test instances. The performances on the test set are reported in the following table: | Accuracy | Recall | Precision | F1 | | ------ | ------ | ------ | ------ | | 83.67 | 83.15 | 80.48 | 81.49 | The Recall, Precision and F1 metrics are averaged over the two classes. <h3>Quick usage</h3> ```python from transformers import BertTokenizerFast, BertForSequenceClassification tokenizer = BertTokenizerFast.from_pretrained("osiria/bert-tweet-italian-uncased-sentiment") model = BertForSequenceClassification.from_pretrained("osiria/bert-tweet-italian-uncased-sentiment") from transformers import pipeline classifier = pipeline("text-classification", model = model, tokenizer = tokenizer) classifier("una fantastica giornata di #calcio! grande prestazione del mister e della squadra") # [{'label': 'POSITIVE', 'score': 0.9883694648742676}] ``` <h3>References</h3> [1] https://arxiv.org/abs/1810.04805 <h3>Limitations</h3> This model was trained on tweets, so it's mainly suitable for general-purpose social media text processing, involving short texts written in a social network style. It might show limitations when it comes to longer and more structured text, or domain-specific text. <h3>License</h3> The model is released under the <b>Apache-2.0</b> license.
3,616
[ [ -0.031402587890625, -0.0543212890625, 0.01122283935546875, 0.04522705078125, -0.029937744140625, 0.0133819580078125, -0.0255889892578125, -0.038177490234375, 0.02801513671875, -0.01023101806640625, -0.059844970703125, -0.05975341796875, -0.059112548828125, -0.004428863525390625, -0.015350341796875, 0.0855712890625, -0.00008445978164672852, 0.00962066650390625, 0.02484130859375, -0.00215911865234375, -0.0157012939453125, -0.0308380126953125, -0.04876708984375, -0.007480621337890625, 0.032928466796875, 0.017120361328125, 0.041168212890625, -0.0022735595703125, 0.033111572265625, 0.0217132568359375, -0.010498046875, 0.0169677734375, -0.0270843505859375, -0.0116424560546875, 0.00258636474609375, -0.0193328857421875, -0.039794921875, 0.00371551513671875, 0.042266845703125, 0.037322998046875, 0.01922607421875, 0.008056640625, 0.0290069580078125, 0.042816162109375, -0.032012939453125, 0.0012006759643554688, -0.0447998046875, 0.00556182861328125, -0.01012420654296875, -0.0017957687377929688, -0.0184173583984375, -0.06298828125, 0.01119232177734375, -0.0236663818359375, 0.0089874267578125, -0.0088043212890625, 0.10333251953125, -0.007472991943359375, -0.00921630859375, -0.01371002197265625, -0.033111572265625, 0.07470703125, -0.07073974609375, 0.018463134765625, 0.009368896484375, -0.00107574462890625, 0.0131988525390625, -0.030731201171875, -0.054718017578125, 0.01055145263671875, -0.004970550537109375, 0.03662109375, -0.03009033203125, -0.0161285400390625, -0.00698089599609375, 0.00830078125, -0.033538818359375, -0.005382537841796875, -0.0208892822265625, 0.0024471282958984375, 0.04248046875, 0.0006327629089355469, 0.0219879150390625, -0.0303955078125, -0.0278778076171875, -0.005146026611328125, -0.00830078125, 0.027801513671875, 0.0175933837890625, 0.0404052734375, -0.03924560546875, 0.044403076171875, -0.004291534423828125, 0.0261993408203125, 0.0182952880859375, -0.0126495361328125, 0.0560302734375, -0.0223236083984375, -0.017852783203125, -0.019134521484375, 
0.0828857421875, 0.052734375, 0.0321044921875, -0.0016422271728515625, -0.015899658203125, 0.01247406005859375, 0.016815185546875, -0.057403564453125, 0.006320953369140625, 0.01500701904296875, -0.0462646484375, -0.04345703125, 0.0127105712890625, -0.06378173828125, 0.0051116943359375, -0.0015459060668945312, 0.045684814453125, -0.045196533203125, -0.0246124267578125, 0.01142120361328125, -0.021636962890625, 0.0188446044921875, 0.02166748046875, -0.046844482421875, 0.0113677978515625, 0.03271484375, 0.062225341796875, -0.00899505615234375, -0.0146331787109375, 0.01043701171875, -0.013916015625, -0.0200653076171875, 0.048431396484375, -0.0260772705078125, -0.0253753662109375, 0.0177764892578125, 0.003849029541015625, -0.03131103515625, -0.0011501312255859375, 0.050537109375, -0.0185699462890625, 0.03363037109375, -0.03656005859375, -0.03466796875, -0.0033893585205078125, 0.03997802734375, -0.0262603759765625, 0.09515380859375, 0.00426483154296875, -0.0889892578125, 0.0310211181640625, -0.04644775390625, -0.0394287109375, -0.023895263671875, 0.008392333984375, -0.046417236328125, 0.00386810302734375, 0.0298919677734375, 0.061798095703125, 0.00293731689453125, -0.0091400146484375, -0.039642333984375, -0.0178375244140625, 0.0159912109375, -0.0210113525390625, 0.07806396484375, 0.01129913330078125, -0.047454833984375, -0.00860595703125, -0.049224853515625, 0.0100555419921875, 0.0172119140625, -0.033477783203125, -0.015045166015625, 0.0006098747253417969, 0.01256561279296875, 0.0285797119140625, 0.0308380126953125, -0.057708740234375, 0.00226593017578125, -0.044677734375, 0.0193634033203125, 0.05859375, 0.0006122589111328125, 0.0290679931640625, -0.032623291015625, 0.0193328857421875, 0.01348114013671875, 0.024261474609375, 0.032989501953125, -0.026214599609375, -0.06610107421875, -0.01971435546875, 0.01427459716796875, 0.044647216796875, -0.054229736328125, 0.07147216796875, -0.0338134765625, -0.042694091796875, -0.039337158203125, -0.0022830963134765625, 
0.009735107421875, 0.032501220703125, 0.028167724609375, -0.0025157928466796875, -0.04693603515625, -0.056915283203125, -0.00803375244140625, -0.0214996337890625, 0.017242431640625, 0.0153961181640625, 0.05340576171875, -0.01433563232421875, 0.0736083984375, -0.032379150390625, -0.0276641845703125, -0.045196533203125, 0.01264190673828125, 0.02960205078125, 0.03765869140625, 0.059478759765625, -0.044403076171875, -0.052642822265625, -0.00833892822265625, -0.0599365234375, -0.00920867919921875, -0.0005731582641601562, -0.019622802734375, 0.038421630859375, 0.0125732421875, -0.05206298828125, 0.041290283203125, 0.037200927734375, -0.0300140380859375, 0.037841796875, 0.0153961181640625, 0.006988525390625, -0.10833740234375, 0.00482940673828125, 0.02911376953125, -0.0007653236389160156, -0.042694091796875, -0.0252838134765625, -0.00229644775390625, 0.00824737548828125, -0.038116455078125, 0.0285491943359375, -0.00923919677734375, 0.013641357421875, 0.0013990402221679688, 0.002593994140625, 0.00901031494140625, 0.0452880859375, 0.0118865966796875, 0.04571533203125, 0.033111572265625, -0.0240020751953125, 0.0018253326416015625, 0.0144500732421875, -0.029815673828125, 0.04327392578125, -0.056915283203125, -0.01427459716796875, -0.0033245086669921875, 0.013671875, -0.0908203125, 0.0036449432373046875, 0.0260467529296875, -0.06695556640625, 0.00907135009765625, 0.025299072265625, -0.057220458984375, -0.050933837890625, -0.05938720703125, -0.01149749755859375, 0.0389404296875, -0.03692626953125, 0.041748046875, 0.00241851806640625, 0.003582000732421875, -0.0535888671875, -0.0635986328125, -0.02203369140625, -0.0164947509765625, -0.0643310546875, 0.02056884765625, -0.013214111328125, -0.0009527206420898438, 0.00530242919921875, 0.001491546630859375, -0.004116058349609375, 0.00481414794921875, 0.01517486572265625, 0.0263519287109375, -0.00675201416015625, 0.016815185546875, 0.0036106109619140625, 0.02825927734375, 0.00151824951171875, 0.006458282470703125, 0.0616455078125, 
-0.0215301513671875, 0.0031414031982421875, -0.06060791015625, 0.0202484130859375, 0.039581298828125, -0.00553131103515625, 0.0889892578125, 0.06597900390625, -0.0254669189453125, -0.0153961181640625, -0.045501708984375, -0.005100250244140625, -0.035369873046875, 0.01824951171875, -0.02239990234375, -0.053955078125, 0.0499267578125, 0.0222930908203125, 0.001346588134765625, 0.043975830078125, 0.04669189453125, -0.0447998046875, 0.0843505859375, 0.039764404296875, -0.0311431884765625, 0.049346923828125, -0.038177490234375, 0.023529052734375, -0.043975830078125, -0.0251007080078125, -0.048065185546875, -0.033355712890625, -0.0465087890625, -0.005832672119140625, 0.0237274169921875, -0.00582122802734375, -0.037628173828125, 0.0261383056640625, -0.040863037109375, 0.00511932373046875, 0.04022216796875, 0.0361328125, -0.007808685302734375, 0.00765228271484375, -0.0166168212890625, -0.01134490966796875, -0.038116455078125, -0.036956787109375, 0.0635986328125, 0.043365478515625, 0.0469970703125, 0.0073394775390625, 0.05859375, 0.0139007568359375, 0.01302337646484375, -0.052703857421875, 0.0469970703125, -0.0196380615234375, -0.05767822265625, -0.00768280029296875, -0.0236663818359375, -0.060302734375, 0.01076507568359375, -0.033416748046875, -0.056549072265625, 0.010772705078125, -0.006046295166015625, -0.0247344970703125, 0.040130615234375, -0.05145263671875, 0.05804443359375, -0.0086517333984375, -0.01934814453125, -0.002391815185546875, -0.050567626953125, 0.0008778572082519531, 0.01309967041015625, 0.0225677490234375, -0.00762176513671875, -0.0017862319946289062, 0.06390380859375, -0.041046142578125, 0.090576171875, -0.0196533203125, 0.0048065185546875, 0.0228729248046875, -0.0135498046875, 0.0294342041015625, -0.0193328857421875, -0.0110931396484375, 0.006504058837890625, -0.0224151611328125, -0.0287628173828125, -0.016876220703125, 0.04461669921875, -0.08758544921875, -0.0204315185546875, -0.0391845703125, -0.016845703125, -0.02264404296875, 0.0156707763671875, 
0.036224365234375, 0.011474609375, -0.01522064208984375, 0.0252227783203125, 0.042999267578125, -0.0195465087890625, 0.05908203125, 0.0186767578125, 0.007724761962890625, -0.03466796875, 0.05047607421875, 0.01256561279296875, 0.006244659423828125, 0.0280609130859375, 0.007198333740234375, -0.0234527587890625, -0.026885986328125, -0.01021575927734375, 0.02777099609375, -0.041046142578125, -0.026519775390625, -0.05657958984375, -0.0208892822265625, -0.045379638671875, -0.0310821533203125, -0.0254669189453125, -0.0287628173828125, -0.04864501953125, -0.0286865234375, 0.0254974365234375, 0.031707763671875, -0.01192474365234375, 0.035980224609375, -0.0389404296875, 0.005329132080078125, 0.01153564453125, 0.018157958984375, 0.010833740234375, -0.03955078125, -0.0180206298828125, 0.014068603515625, -0.0406494140625, -0.0675048828125, 0.04150390625, -0.0009794235229492188, 0.02362060546875, 0.033172607421875, 0.0033016204833984375, 0.052642822265625, -0.0230560302734375, 0.07086181640625, 0.039215087890625, -0.08209228515625, 0.059234619140625, -0.033538818359375, 0.013397216796875, 0.048126220703125, 0.045135498046875, -0.0227203369140625, -0.043182373046875, -0.04400634765625, -0.06524658203125, 0.0528564453125, 0.028961181640625, 0.0241241455078125, -0.02056884765625, -0.00115203857421875, -0.0149688720703125, 0.0142669677734375, -0.08746337890625, -0.0175628662109375, -0.03424072265625, -0.04150390625, 0.007476806640625, -0.0084381103515625, 0.0125732421875, -0.040618896484375, 0.06329345703125, 0.0088348388671875, 0.047454833984375, 0.0121612548828125, -0.0009417533874511719, -0.004421234130859375, 0.017181396484375, 0.03375244140625, 0.0323486328125, -0.0291595458984375, -0.01220703125, 0.0210418701171875, -0.0218658447265625, -0.01387786865234375, 0.00847625732421875, -0.01605224609375, 0.0166778564453125, 0.039764404296875, 0.06298828125, 0.0064697265625, -0.0184478759765625, 0.058837890625, 0.00659942626953125, -0.041412353515625, -0.038299560546875, 
-0.0106048583984375, -0.0030536651611328125, 0.0269775390625, 0.038116455078125, 0.007465362548828125, -0.0052032470703125, -0.042694091796875, -0.003650665283203125, 0.04217529296875, -0.0303497314453125, -0.0260467529296875, 0.033355712890625, 0.01477813720703125, 0.004817962646484375, 0.022064208984375, -0.005855560302734375, -0.07000732421875, 0.03509521484375, 0.025726318359375, 0.058197021484375, -0.004375457763671875, 0.032745361328125, 0.035552978515625, 0.0323486328125, -0.00042724609375, 0.0362548828125, -0.00009053945541381836, -0.0770263671875, -0.01273345947265625, -0.052337646484375, 0.00027441978454589844, 0.001834869384765625, -0.048553466796875, 0.0169219970703125, -0.045654296875, -0.045135498046875, 0.006916046142578125, 0.020050048828125, -0.05157470703125, 0.0479736328125, 0.0014553070068359375, 0.0687255859375, -0.08050537109375, 0.05126953125, 0.044586181640625, -0.04443359375, -0.07781982421875, 0.01268768310546875, -0.00972747802734375, -0.0631103515625, 0.061737060546875, 0.01800537109375, -0.01122283935546875, -0.01027679443359375, -0.05609130859375, -0.044403076171875, 0.07574462890625, -0.01153564453125, -0.0146331787109375, -0.012908935546875, 0.0178070068359375, 0.056427001953125, -0.021636962890625, 0.0260772705078125, 0.027435302734375, 0.033172607421875, 0.00838470458984375, -0.06524658203125, -0.00582122802734375, -0.036376953125, 0.015533447265625, -0.0014276504516601562, -0.07568359375, 0.06121826171875, 0.0166778564453125, -0.00524139404296875, 0.0031280517578125, 0.047454833984375, 0.0103607177734375, 0.01033782958984375, 0.036224365234375, 0.06146240234375, 0.026336669921875, -0.0290985107421875, 0.0760498046875, -0.01007843017578125, 0.0465087890625, 0.04754638671875, 0.0048828125, 0.07562255859375, 0.0240020751953125, -0.0157623291015625, 0.05487060546875, 0.04962158203125, -0.0017032623291015625, 0.04840087890625, -0.01371002197265625, -0.0186004638671875, -0.005619049072265625, -0.0084381103515625, -0.0276336669921875, 
0.024566650390625, 0.0228424072265625, -0.03955078125, -0.005008697509765625, 0.005817413330078125, 0.04034423828125, -0.0022296905517578125, -0.033477783203125, 0.057159423828125, -0.004367828369140625, -0.038909912109375, 0.042266845703125, 0.0003540515899658203, 0.08221435546875, -0.034942626953125, 0.0391845703125, -0.01334381103515625, 0.02105712890625, -0.020660400390625, -0.07305908203125, 0.0237884521484375, 0.0139007568359375, -0.01328277587890625, -0.021270751953125, 0.045440673828125, -0.00667572021484375, -0.06402587890625, 0.042999267578125, 0.0271453857421875, -0.0124359130859375, 0.01824951171875, -0.07891845703125, 0.0258026123046875, 0.0172119140625, -0.045074462890625, -0.0061492919921875, 0.0465087890625, 0.027557373046875, 0.03466796875, 0.0294189453125, 0.01062774658203125, 0.0133056640625, 0.01197052001953125, 0.08258056640625, -0.03607177734375, -0.0301971435546875, -0.07122802734375, 0.057861328125, -0.03173828125, -0.0200042724609375, 0.064453125, 0.032501220703125, 0.05865478515625, -0.0092315673828125, 0.06494140625, -0.027801513671875, 0.06475830078125, -0.0215911865234375, 0.056732177734375, -0.060791015625, -0.006099700927734375, -0.040008544921875, -0.053192138671875, -0.030792236328125, 0.0789794921875, -0.04022216796875, 0.0273895263671875, 0.05389404296875, 0.055908203125, 0.00945281982421875, -0.0026226043701171875, -0.00830841064453125, 0.033294677734375, 0.01141357421875, 0.049835205078125, 0.0469970703125, -0.0345458984375, 0.041046142578125, -0.038909912109375, -0.01367950439453125, -0.01375579833984375, -0.0465087890625, -0.10107421875, -0.04168701171875, -0.016082763671875, -0.040863037109375, 0.00641632080078125, 0.0823974609375, 0.05511474609375, -0.060333251953125, -0.0143280029296875, -0.0036411285400390625, -0.0098876953125, 0.0003237724304199219, -0.025299072265625, 0.037139892578125, -0.0338134765625, -0.064453125, -0.00860595703125, -0.00826263427734375, 0.015411376953125, 0.005245208740234375, 0.005100250244140625, 
-0.0298919677734375, 0.0176849365234375, 0.0489501953125, -0.002239227294921875, -0.03448486328125, -0.031402587890625, 0.003536224365234375, -0.025665283203125, 0.0103912353515625, 0.010528564453125, -0.0231781005859375, 0.004131317138671875, 0.041046142578125, 0.03179931640625, 0.037994384765625, 0.0125732421875, 0.00637054443359375, -0.0689697265625, 0.005931854248046875, 0.0281982421875, 0.0233154296875, 0.01551055908203125, -0.0128021240234375, 0.02960205078125, 0.0201416015625, -0.020599365234375, -0.049224853515625, -0.0166778564453125, -0.093017578125, -0.031494140625, 0.07855224609375, 0.0008134841918945312, -0.01367950439453125, 0.0203094482421875, -0.01126861572265625, 0.0258026123046875, -0.060516357421875, 0.057708740234375, 0.06158447265625, -0.0013875961303710938, -0.00925445556640625, -0.0271453857421875, 0.0224151611328125, 0.04052734375, -0.056427001953125, -0.04058837890625, 0.0187530517578125, 0.0279541015625, 0.011138916015625, 0.050201416015625, -0.0010051727294921875, 0.01174163818359375, -0.0309600830078125, 0.01751708984375, 0.021820068359375, -0.01132965087890625, 0.0008721351623535156, 0.0230560302734375, -0.0172882080078125, -0.0343017578125 ] ]
deepset/bert-medium-squad2-distilled
2023-05-05T06:59:02.000Z
[ "transformers", "pytorch", "safetensors", "bert", "question-answering", "exbert", "en", "dataset:squad_v2", "license:mit", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
question-answering
deepset
null
null
deepset/bert-medium-squad2-distilled
3
9,197
transformers
2022-03-02T23:29:05
--- language: en license: mit tags: - exbert datasets: - squad_v2 thumbnail: https://thumb.tildacdn.com/tild3433-3637-4830-a533-353833613061/-/resize/720x/-/format/webp/germanquad.jpg model-index: - name: deepset/bert-medium-squad2-distilled results: - task: type: question-answering name: Question Answering dataset: name: squad_v2 type: squad_v2 config: squad_v2 split: validation metrics: - type: exact_match value: 69.8231 name: Exact Match verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMmE4MGRkZTVjNmViMGNjYjVhY2E1NzcyOGQ1OWE1MWMzMjY5NWU0MmU0Y2I4OWU4YTU5OWQ5YTI2NWE1NmM0ZSIsInZlcnNpb24iOjF9.tnCJvWzMctTwiQu5yig_owO2ZI1t1MZz1AN2lQy4COAGOzuMovD-74acQvMbxJQoRfNNkIetz2hqYivf1lJKDw - type: f1 value: 72.9232 name: F1 verified: true verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTMwNzk0ZDRjNGUyMjQyNzc1NzczZmUwMTU2MTM5MGQ3M2NhODlmOTU4ZDI0YjhlNTVjNDA1MGEwM2M1MzIyZSIsInZlcnNpb24iOjF9.eElGmTOXH_qHTNaPwZ-dUJfVz9VMvCutDCof_6UG_625MwctT_j7iVkWcGwed4tUnunuq1BPm-0iRh1RuuB-AQ --- ## Overview **Language model:** deepset/bert-medium-squad2-distilled **Language:** English **Training data:** SQuAD 2.0 training set **Eval data:** SQuAD 2.0 dev set **Infrastructure**: 1x V100 GPU **Published**: Apr 21st, 2021 ## Details - Haystack's distillation feature was used for training. deepset/bert-large-uncased-whole-word-masking-squad2 was used as the teacher model. 
## Hyperparameters ``` batch_size = 6 n_epochs = 2 max_seq_len = 384 learning_rate = 3e-5 lr_schedule = LinearWarmup embeds_dropout_prob = 0.1 temperature = 5 distillation_loss_weight = 1 ``` ## Performance ``` "exact": 68.6431398972458 "f1": 72.7637083790805 ``` ## Authors - Timo Möller: `timo.moeller [at] deepset.ai` - Julian Risch: `julian.risch [at] deepset.ai` - Malte Pietsch: `malte.pietsch [at] deepset.ai` - Michel Bartels: `michel.bartels [at] deepset.ai` ## About us ![deepset logo](https://workablehr.s3.amazonaws.com/uploads/account/logo/476306/logo) We bring NLP to the industry via open source! Our focus: Industry specific language models & large scale QA systems. Some of our work: - [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert) - [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad) - [FARM](https://github.com/deepset-ai/FARM) - [Haystack](https://github.com/deepset-ai/haystack/) Get in touch: [Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai) By the way: [we're hiring!](http://www.deepset.ai/jobs)
2,883
[ [ -0.0252532958984375, -0.050018310546875, 0.0293731689453125, 0.0155181884765625, -0.0056304931640625, 0.0146484375, -0.0347900390625, -0.03717041015625, 0.00551605224609375, 0.01934814453125, -0.057403564453125, -0.060394287109375, -0.035430908203125, -0.0012054443359375, -0.0287628173828125, 0.0806884765625, -0.0005645751953125, 0.00926971435546875, 0.00994110107421875, -0.007350921630859375, -0.0283203125, -0.041412353515625, -0.0546875, -0.0241546630859375, 0.016845703125, 0.0254058837890625, 0.04595947265625, 0.0208282470703125, 0.04168701171875, 0.022186279296875, -0.0022640228271484375, -0.001102447509765625, -0.0390625, 0.0253448486328125, -0.00930023193359375, -0.0191802978515625, -0.02593994140625, -0.00371551513671875, 0.043975830078125, 0.03131103515625, -0.0198822021484375, 0.026153564453125, -0.007701873779296875, 0.0701904296875, -0.046234130859375, 0.021331787109375, -0.05078125, -0.01010894775390625, -0.0005364418029785156, 0.03271484375, -0.014739990234375, -0.01317596435546875, 0.017059326171875, -0.040496826171875, 0.025634765625, -0.0300750732421875, 0.090087890625, 0.0206756591796875, 0.00084686279296875, -0.01483917236328125, -0.050079345703125, 0.057403564453125, -0.0830078125, 0.0213470458984375, 0.03692626953125, 0.038909912109375, -0.0007815361022949219, -0.0672607421875, -0.040374755859375, -0.00859832763671875, -0.000423431396484375, 0.0147552490234375, -0.013427734375, -0.00826263427734375, 0.0028839111328125, 0.03594970703125, -0.047515869140625, 0.0259552001953125, -0.0341796875, 0.0004017353057861328, 0.06573486328125, 0.0026988983154296875, 0.00632476806640625, 0.01317596435546875, -0.0107421875, -0.025299072265625, -0.03662109375, 0.0019664764404296875, 0.01995849609375, 0.0259552001953125, -0.01824951171875, 0.02752685546875, -0.0269317626953125, 0.0308685302734375, 0.006381988525390625, 0.025115966796875, 0.04443359375, -0.0382080078125, -0.0268707275390625, 0.0014162063598632812, 0.07659912109375, 0.005474090576171875, 
0.0156707763671875, -0.0040740966796875, -0.02911376953125, -0.006305694580078125, 0.013275146484375, -0.0667724609375, -0.030548095703125, 0.033905029296875, -0.018218994140625, -0.03472900390625, 0.00368499755859375, -0.055419921875, -0.0248565673828125, -0.0049591064453125, 0.03570556640625, -0.038604736328125, -0.03338623046875, 0.016845703125, -0.0168914794921875, 0.031982421875, 0.00659942626953125, -0.0589599609375, 0.0177001953125, 0.047454833984375, 0.056365966796875, 0.0078277587890625, -0.031463623046875, -0.0179290771484375, -0.0086822509765625, -0.01282501220703125, 0.03387451171875, -0.019622802734375, -0.00540924072265625, 0.00882720947265625, 0.0108795166015625, 0.006252288818359375, -0.0322265625, 0.0201873779296875, -0.03961181640625, 0.03338623046875, -0.00218963623046875, -0.044036865234375, -0.0095062255859375, 0.0212249755859375, -0.0633544921875, 0.09027099609375, 0.0260772705078125, -0.0218353271484375, 0.0226898193359375, -0.057037353515625, -0.033966064453125, 0.013885498046875, -0.0018167495727539062, -0.031951904296875, -0.01032257080078125, 0.01934814453125, 0.041595458984375, -0.0240631103515625, 0.027862548828125, -0.01409149169921875, -0.035064697265625, 0.0196075439453125, -0.020416259765625, 0.08221435546875, 0.0111083984375, -0.04901123046875, -0.00036907196044921875, -0.061187744140625, 0.016357421875, 0.01473236083984375, -0.024810791015625, -0.004245758056640625, -0.00511932373046875, 0.01641845703125, 0.01517486572265625, 0.0281829833984375, -0.0263824462890625, 0.0088348388671875, -0.0382080078125, 0.0462646484375, 0.04931640625, -0.01482391357421875, 0.024658203125, -0.0059814453125, 0.0386962890625, 0.004795074462890625, 0.0196075439453125, 0.00823211669921875, -0.0304412841796875, -0.06707763671875, -0.03546142578125, 0.032257080078125, 0.05621337890625, -0.053619384765625, 0.0643310546875, -0.010162353515625, -0.053802490234375, -0.07391357421875, 0.0169219970703125, 0.03778076171875, 0.0213775634765625, 0.031494140625, 
-0.003749847412109375, -0.053497314453125, -0.07537841796875, -0.0010547637939453125, -0.01201629638671875, -0.0087890625, 0.00116729736328125, 0.04425048828125, -0.0276031494140625, 0.053497314453125, -0.0401611328125, -0.038970947265625, -0.01995849609375, -0.00653839111328125, 0.0462646484375, 0.039093017578125, 0.05426025390625, -0.0482177734375, -0.0380859375, -0.01323699951171875, -0.05059814453125, 0.0202178955078125, 0.01067352294921875, -0.0101776123046875, 0.036712646484375, 0.035888671875, -0.042938232421875, 0.01143646240234375, 0.05999755859375, -0.0271453857421875, 0.03973388671875, 0.00040149688720703125, -0.0012254714965820312, -0.1085205078125, 0.0149993896484375, 0.007427215576171875, -0.00946807861328125, -0.0255279541015625, 0.020416259765625, -0.017242431640625, -0.0108642578125, -0.04534912109375, 0.033477783203125, -0.0279693603515625, 0.009796142578125, 0.007106781005859375, -0.00438690185546875, 0.0015544891357421875, 0.0390625, -0.0042266845703125, 0.0849609375, 0.050445556640625, -0.041839599609375, 0.041961669921875, 0.024749755859375, -0.0484619140625, 0.013916015625, -0.070556640625, 0.00974273681640625, 0.01093292236328125, 0.01372528076171875, -0.0733642578125, -0.01267242431640625, 0.0058746337890625, -0.04278564453125, 0.018707275390625, -0.0157318115234375, -0.057403564453125, -0.0272369384765625, -0.03216552734375, 0.016082763671875, 0.062408447265625, -0.0289764404296875, 0.00905609130859375, 0.0267791748046875, -0.0053253173828125, -0.045562744140625, -0.0694580078125, 0.0035419464111328125, -0.00243377685546875, -0.03570556640625, 0.02374267578125, 0.001827239990234375, -0.01453399658203125, 0.0171966552734375, 0.005405426025390625, -0.048431396484375, 0.01470184326171875, 0.006397247314453125, 0.0267791748046875, -0.031494140625, 0.018768310546875, -0.00843048095703125, -0.003948211669921875, 0.0013561248779296875, -0.022064208984375, 0.04925537109375, -0.046844482421875, 0.0003070831298828125, -0.0445556640625, 
0.0192413330078125, 0.035400390625, -0.0233612060546875, 0.06573486328125, 0.065185546875, -0.0281219482421875, -0.01229095458984375, -0.048797607421875, -0.0221099853515625, -0.0382080078125, 0.0433349609375, -0.0147552490234375, -0.071533203125, 0.038909912109375, 0.0106964111328125, 0.0170440673828125, 0.06292724609375, 0.044219970703125, -0.0299224853515625, 0.060302734375, 0.04058837890625, -0.01654052734375, 0.03253173828125, -0.053253173828125, 0.005496978759765625, -0.0498046875, -0.015899658203125, -0.043731689453125, -0.0347900390625, -0.05419921875, -0.0249176025390625, 0.01061248779296875, 0.007579803466796875, -0.033905029296875, 0.041259765625, -0.052337646484375, 0.044464111328125, 0.057861328125, -0.0008029937744140625, 0.0009474754333496094, 0.0028171539306640625, 0.0021572113037109375, 0.0088348388671875, -0.05224609375, -0.034271240234375, 0.0797119140625, 0.0235443115234375, 0.039764404296875, 0.013336181640625, 0.061309814453125, 0.0218505859375, -0.017822265625, -0.049774169921875, 0.03363037109375, -0.0233001708984375, -0.06927490234375, -0.049041748046875, -0.0252838134765625, -0.071533203125, -0.0114898681640625, -0.010162353515625, -0.0406494140625, 0.01453399658203125, -0.0006556510925292969, -0.0170135498046875, 0.0242462158203125, -0.059844970703125, 0.0672607421875, -0.01540374755859375, -0.006626129150390625, -0.018402099609375, -0.05218505859375, 0.01436614990234375, -0.00701141357421875, 0.005523681640625, -0.01666259765625, 0.01285552978515625, 0.046783447265625, -0.0421142578125, 0.0694580078125, -0.01708984375, -0.0053863525390625, 0.024169921875, 0.01049041748046875, 0.035430908203125, 0.0182952880859375, -0.0271148681640625, 0.04840087890625, 0.01953125, -0.035858154296875, -0.0274810791015625, 0.05859375, -0.07452392578125, -0.017547607421875, -0.03857421875, -0.02593994140625, -0.01351165771484375, 0.023529052734375, 0.023193359375, 0.01329803466796875, -0.0246429443359375, 0.0299530029296875, 0.0516357421875, 
-0.0180511474609375, 0.038848876953125, 0.04339599609375, 0.000007212162017822266, -0.0325927734375, 0.0633544921875, -0.00559234619140625, 0.01360321044921875, 0.0308685302734375, -0.00501251220703125, -0.0218353271484375, -0.032379150390625, -0.038238525390625, 0.0114898681640625, -0.03277587890625, -0.0186920166015625, -0.0224151611328125, -0.039520263671875, -0.04888916015625, -0.00084686279296875, -0.05029296875, -0.037872314453125, -0.0263214111328125, -0.01432037353515625, 0.04931640625, 0.04205322265625, -0.00438690185546875, 0.01448822021484375, -0.05419921875, 0.004913330078125, 0.0172576904296875, 0.045257568359375, -0.01483917236328125, -0.0300750732421875, -0.035125732421875, 0.0285797119140625, -0.0030002593994140625, -0.041534423828125, 0.019287109375, 0.021759033203125, 0.037841796875, 0.00464630126953125, 0.00115966796875, 0.032440185546875, -0.032379150390625, 0.07525634765625, 0.01117706298828125, -0.06427001953125, 0.05426025390625, -0.021728515625, 0.03350830078125, 0.0875244140625, 0.0295257568359375, -0.03564453125, -0.0213470458984375, -0.04638671875, -0.08233642578125, 0.049530029296875, 0.0304412841796875, 0.01222991943359375, -0.0022373199462890625, 0.01458740234375, 0.004451751708984375, 0.0221710205078125, -0.033538818359375, -0.0208282470703125, -0.01165008544921875, -0.02545166015625, -0.015472412109375, -0.021820068359375, -0.0081939697265625, -0.03125, 0.07781982421875, 0.0109100341796875, 0.0184173583984375, 0.0115814208984375, -0.0140380859375, 0.0204925537109375, 0.0158233642578125, 0.047210693359375, 0.06207275390625, -0.034576416015625, -0.01007843017578125, 0.0084075927734375, -0.038604736328125, 0.01009368896484375, 0.0197601318359375, -0.022369384765625, 0.0233612060546875, 0.03558349609375, 0.064697265625, 0.0025691986083984375, -0.054412841796875, 0.039886474609375, 0.0019931793212890625, -0.03997802734375, -0.04681396484375, 0.01215362548828125, 0.006237030029296875, 0.030670166015625, 0.0274810791015625, 
-0.023284912109375, 0.00865936279296875, -0.0297088623046875, 0.008941650390625, 0.021453857421875, -0.045501708984375, -0.01432037353515625, 0.02777099609375, 0.031097412109375, -0.02685546875, 0.0784912109375, -0.021392822265625, -0.033905029296875, 0.05950927734375, 0.0206451416015625, 0.0867919921875, -0.004222869873046875, 0.0308685302734375, 0.03436279296875, 0.0302734375, 0.00457763671875, 0.029510498046875, 0.0122222900390625, -0.048828125, -0.036590576171875, -0.0382080078125, -0.0175323486328125, 0.038970947265625, -0.047637939453125, 0.010711669921875, -0.039306640625, -0.01132965087890625, 0.0032634735107421875, 0.018585205078125, -0.07281494140625, 0.01061248779296875, -0.0002636909484863281, 0.071533203125, -0.050567626953125, 0.0633544921875, 0.0654296875, -0.053863525390625, -0.058441162109375, -0.01096343994140625, -0.00695037841796875, -0.0712890625, 0.04229736328125, 0.004669189453125, 0.0012359619140625, 0.00884246826171875, -0.044830322265625, -0.05859375, 0.092041015625, 0.0193023681640625, -0.06011962890625, -0.01323699951171875, 0.0029926300048828125, 0.04693603515625, -0.00630950927734375, 0.004146575927734375, 0.04327392578125, 0.03521728515625, -0.004512786865234375, -0.07861328125, -0.0002677440643310547, -0.03021240234375, -0.002964019775390625, 0.013763427734375, -0.0513916015625, 0.049285888671875, -0.005825042724609375, -0.001392364501953125, 0.004253387451171875, 0.0328369140625, 0.0171051025390625, 0.00827789306640625, 0.026580810546875, 0.048614501953125, 0.05682373046875, -0.00899505615234375, 0.06494140625, -0.0218505859375, 0.03704833984375, 0.0989990234375, -0.0222930908203125, 0.0672607421875, 0.028289794921875, -0.0287322998046875, 0.0518798828125, 0.035980224609375, -0.0270538330078125, 0.046844482421875, 0.01329803466796875, -0.0085906982421875, -0.01068878173828125, 0.005825042724609375, -0.052703857421875, 0.03076171875, 0.01174163818359375, -0.0262451171875, -0.0140533447265625, -0.0187530517578125, 
-0.0009026527404785156, -0.0068817138671875, 0.0005660057067871094, 0.065673828125, -0.00789642333984375, -0.03582763671875, 0.056060791015625, 0.0014600753784179688, 0.0513916015625, -0.0596923828125, -0.004245758056640625, -0.0223388671875, 0.027984619140625, -0.0036830902099609375, -0.06866455078125, 0.00870513916015625, -0.00658416748046875, -0.0325927734375, -0.013885498046875, 0.04107666015625, -0.033538818359375, -0.05999755859375, 0.01201629638671875, 0.032379150390625, 0.02008056640625, -0.0083465576171875, -0.0699462890625, 0.0011730194091796875, -0.005767822265625, -0.0184478759765625, 0.018524169921875, 0.034088134765625, 0.0156707763671875, 0.0404052734375, 0.0513916015625, 0.01396942138671875, -0.002330780029296875, 0.0165252685546875, 0.065185546875, -0.0428466796875, -0.02130126953125, -0.06781005859375, 0.041656494140625, -0.0182952880859375, -0.039337158203125, 0.0404052734375, 0.055145263671875, 0.0819091796875, -0.0206756591796875, 0.058013916015625, -0.003231048583984375, 0.03277587890625, -0.036651611328125, 0.07232666015625, -0.037506103515625, 0.0004169940948486328, -0.0174407958984375, -0.070556640625, -0.0078277587890625, 0.048919677734375, 0.0081024169921875, 0.004856109619140625, 0.05072021484375, 0.05596923828125, 0.0015773773193359375, -0.0219879150390625, 0.004734039306640625, 0.026031494140625, 0.02642822265625, 0.0526123046875, 0.045745849609375, -0.051666259765625, 0.039642333984375, -0.0294036865234375, -0.0075531005859375, -0.02337646484375, -0.054840087890625, -0.064453125, -0.054840087890625, -0.035064697265625, -0.043548583984375, 0.024993896484375, 0.06890869140625, 0.061737060546875, -0.080078125, -0.027130126953125, -0.015472412109375, -0.0032444000244140625, -0.033721923828125, -0.0208282470703125, 0.030975341796875, -0.02984619140625, -0.047149658203125, 0.039093017578125, -0.0012865066528320312, -0.005527496337890625, -0.032379150390625, -0.0037593841552734375, -0.045501708984375, -0.0165252685546875, 0.040191650390625, 
0.0207672119140625, -0.04949951171875, 0.0033969879150390625, -0.0007796287536621094, -0.02099609375, -0.01215362548828125, 0.034088134765625, -0.059539794921875, 0.0164642333984375, 0.037872314453125, 0.054412841796875, 0.044219970703125, -0.02679443359375, 0.052886962890625, -0.06048583984375, 0.0276336669921875, 0.038299560546875, 0.00862884521484375, 0.01450347900390625, -0.0272064208984375, 0.0604248046875, 0.0074310302734375, -0.03460693359375, -0.0643310546875, 0.00415802001953125, -0.06158447265625, -0.033966064453125, 0.09527587890625, 0.0007023811340332031, -0.0033740997314453125, 0.01360321044921875, -0.0035724639892578125, 0.018890380859375, -0.038482666015625, 0.055755615234375, 0.0645751953125, 0.030548095703125, 0.0070953369140625, -0.044952392578125, 0.038482666015625, 0.0217132568359375, -0.057769775390625, -0.01120758056640625, 0.04376220703125, 0.02703857421875, -0.00516510009765625, 0.040802001953125, 0.0016832351684570312, 0.023468017578125, -0.0181732177734375, 0.0256500244140625, -0.0240020751953125, -0.014862060546875, -0.0238037109375, -0.0130615234375, -0.006114959716796875, -0.0355224609375 ] ]
timm/resnetv2_50x1_bit.goog_in21k
2023-03-22T20:57:51.000Z
[ "timm", "pytorch", "safetensors", "image-classification", "dataset:imagenet-21k", "arxiv:1912.11370", "arxiv:1603.05027", "license:apache-2.0", "region:us" ]
image-classification
timm
null
null
timm/resnetv2_50x1_bit.goog_in21k
0
9,195
timm
2023-03-22T20:57:10
--- tags: - image-classification - timm library_tag: timm license: apache-2.0 datasets: - imagenet-21k --- # Model card for resnetv2_50x1_bit.goog_in21k A ResNet-V2-BiT (Big Transfer w/ pre-activation ResNet) image classification model. Trained on ImageNet-21k by the paper authors. This model uses: * Group Normalization (GN) in combination with Weight Standardization (WS) instead of Batch Normalization (BN). ## Model Details - **Model Type:** Image classification / feature backbone - **Model Stats:** - Params (M): 68.3 - GMACs: 4.3 - Activations (M): 11.1 - Image size: 224 x 224 - **Papers:** - Big Transfer (BiT): General Visual Representation Learning: https://arxiv.org/abs/1912.11370 - Identity Mappings in Deep Residual Networks: https://arxiv.org/abs/1603.05027 - **Dataset:** ImageNet-21k - **Original:** https://github.com/google-research/big_transfer ## Model Usage ### Image Classification ```python from urllib.request import urlopen from PIL import Image import torch import timm img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model('resnetv2_50x1_bit.goog_in21k', pretrained=True) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1 top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5) ``` ### Feature Map Extraction ```python from urllib.request import urlopen from PIL import Image import timm img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model( 'resnetv2_50x1_bit.goog_in21k', pretrained=True, features_only=True, ) model = model.eval() # get model specific transforms (normalization, resize) 
data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1 for o in output: # print shape of each feature map in output # e.g.: # torch.Size([1, 64, 112, 112]) # torch.Size([1, 256, 56, 56]) # torch.Size([1, 512, 28, 28]) # torch.Size([1, 1024, 14, 14]) # torch.Size([1, 2048, 7, 7]) print(o.shape) ``` ### Image Embeddings ```python from urllib.request import urlopen from PIL import Image import timm img = Image.open(urlopen( 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png' )) model = timm.create_model( 'resnetv2_50x1_bit.goog_in21k', pretrained=True, num_classes=0, # remove classifier nn.Linear ) model = model.eval() # get model specific transforms (normalization, resize) data_config = timm.data.resolve_model_data_config(model) transforms = timm.data.create_transform(**data_config, is_training=False) output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor # or equivalently (without needing to set num_classes=0) output = model.forward_features(transforms(img).unsqueeze(0)) # output is unpooled, a (1, 2048, 7, 7) shaped tensor output = model.forward_head(output, pre_logits=True) # output is a (1, num_features) shaped tensor ``` ## Model Comparison Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results). 
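The Group Normalization + Weight Standardization combination noted at the top of this card can be sketched as follows. This is an illustrative reimplementation of the idea, not timm's actual `StdConv2d`; the class name and epsilon value here are assumptions for the sake of the example.

```python
# Illustrative sketch of Weight Standardization + GroupNorm (BiT-style),
# not the timm implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StdConv2d(nn.Conv2d):
    """Conv2d whose weights are standardized to zero mean and unit
    variance per output filter before the convolution is applied."""
    def forward(self, x):
        w = self.weight
        mean = w.mean(dim=[1, 2, 3], keepdim=True)
        std = w.std(dim=[1, 2, 3], keepdim=True) + 1e-5
        return F.conv2d(x, (w - mean) / std, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

# A minimal conv -> GroupNorm -> ReLU block, the BN-free pattern BiT uses.
block = nn.Sequential(
    StdConv2d(3, 64, kernel_size=3, padding=1, bias=False),
    nn.GroupNorm(num_groups=32, num_channels=64),
    nn.ReLU(),
)
out = block(torch.randn(1, 3, 32, 32))
print(out.shape)  # torch.Size([1, 64, 32, 32])
```

Because GN and WS do not depend on batch statistics, this pattern transfers well across batch sizes, which is part of why BiT uses it for large-scale pre-training.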
## Citation ```bibtex @inproceedings{Kolesnikov2019BigT, title={Big Transfer (BiT): General Visual Representation Learning}, author={Alexander Kolesnikov and Lucas Beyer and Xiaohua Zhai and Joan Puigcerver and Jessica Yung and Sylvain Gelly and Neil Houlsby}, booktitle={European Conference on Computer Vision}, year={2019} } ``` ```bibtex @article{He2016, author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun}, title = {Identity Mappings in Deep Residual Networks}, journal = {arXiv preprint arXiv:1603.05027}, year = {2016} } ``` ```bibtex @misc{rw2019timm, author = {Ross Wightman}, title = {PyTorch Image Models}, year = {2019}, publisher = {GitHub}, journal = {GitHub repository}, doi = {10.5281/zenodo.4414861}, howpublished = {\url{https://github.com/huggingface/pytorch-image-models}} } ```
4,520
[ [ -0.0345458984375, -0.027008056640625, 0.0015325546264648438, 0.003284454345703125, -0.02777099609375, -0.02874755859375, -0.02337646484375, -0.033416748046875, 0.018463134765625, 0.035797119140625, -0.030364990234375, -0.0447998046875, -0.053924560546875, -0.01120758056640625, -0.01055908203125, 0.0712890625, -0.00847625732421875, -0.0026073455810546875, -0.015411376953125, -0.039306640625, -0.0176239013671875, -0.024017333984375, -0.06048583984375, -0.029205322265625, 0.03082275390625, 0.0093841552734375, 0.039886474609375, 0.046356201171875, 0.048980712890625, 0.03509521484375, -0.00011098384857177734, 0.00624847412109375, -0.0254974365234375, -0.01555633544921875, 0.015716552734375, -0.046356201171875, -0.03076171875, 0.01873779296875, 0.0548095703125, 0.02569580078125, 0.0012693405151367188, 0.039215087890625, 0.005809783935546875, 0.046783447265625, -0.0228271484375, 0.0164794921875, -0.0306243896484375, 0.00836944580078125, -0.0004696846008300781, 0.01020050048828125, -0.0234375, -0.0335693359375, 0.0147857666015625, -0.037872314453125, 0.0325927734375, -0.001651763916015625, 0.10467529296875, 0.01435089111328125, -0.007663726806640625, 0.00391387939453125, -0.0287628173828125, 0.059722900390625, -0.0548095703125, 0.0178985595703125, 0.019866943359375, 0.02020263671875, -0.00693511962890625, -0.08538818359375, -0.043701171875, -0.0107269287109375, -0.0146636962890625, -0.0011043548583984375, -0.0247344970703125, -0.003692626953125, 0.0211944580078125, 0.0279693603515625, -0.03076171875, 0.01055908203125, -0.050933837890625, -0.0188751220703125, 0.04510498046875, -0.003940582275390625, 0.0222015380859375, -0.009735107421875, -0.0430908203125, -0.03369140625, -0.0264129638671875, 0.0205535888671875, 0.01702880859375, 0.01629638671875, -0.04742431640625, 0.027008056640625, 0.0146636962890625, 0.050628662109375, 0.00408172607421875, -0.0234375, 0.045501708984375, -0.0061798095703125, -0.033203125, -0.017974853515625, 0.08416748046875, 0.035125732421875, 
0.024932861328125, 0.00611114501953125, -0.0083160400390625, -0.0307159423828125, -0.0069427490234375, -0.09088134765625, -0.033294677734375, 0.0158843994140625, -0.04962158203125, -0.0257110595703125, 0.021240234375, -0.047393798828125, -0.013458251953125, -0.016387939453125, 0.042999267578125, -0.03167724609375, -0.02899169921875, 0.00279998779296875, -0.01435089111328125, 0.03009033203125, 0.018890380859375, -0.04010009765625, 0.01174163818359375, 0.0214385986328125, 0.078857421875, 0.00359344482421875, -0.0282135009765625, -0.0209197998046875, -0.0238037109375, -0.0228271484375, 0.031463623046875, -0.007717132568359375, 0.001079559326171875, -0.0228271484375, 0.0343017578125, -0.006153106689453125, -0.05230712890625, 0.00859832763671875, -0.0231781005859375, 0.0264129638671875, -0.00739288330078125, -0.01256561279296875, -0.04571533203125, 0.0120849609375, -0.03594970703125, 0.088134765625, 0.0287628173828125, -0.056488037109375, 0.020751953125, -0.0283660888671875, -0.00788116455078125, -0.0236053466796875, 0.0016355514526367188, -0.0794677734375, -0.0021800994873046875, 0.01230621337890625, 0.053436279296875, -0.0282135009765625, 0.006023406982421875, -0.0447998046875, -0.02471923828125, 0.0272064208984375, -0.005218505859375, 0.073974609375, 0.00812530517578125, -0.03240966796875, 0.0196075439453125, -0.0400390625, 0.00997161865234375, 0.036529541015625, -0.0198211669921875, -0.004791259765625, -0.052032470703125, 0.00934600830078125, 0.0305633544921875, 0.01299285888671875, -0.0390625, 0.0264739990234375, -0.006504058837890625, 0.036773681640625, 0.04937744140625, -0.0106048583984375, 0.02490234375, -0.027069091796875, 0.021942138671875, 0.022186279296875, 0.0116424560546875, -0.000025033950805664062, -0.0428466796875, -0.061920166015625, -0.039154052734375, 0.039825439453125, 0.033294677734375, -0.040618896484375, 0.041748046875, -0.0223846435546875, -0.05352783203125, -0.038909912109375, 0.006259918212890625, 0.03466796875, 0.04583740234375, 
0.031768798828125, -0.040252685546875, -0.0379638671875, -0.0758056640625, 0.01532745361328125, -0.0012083053588867188, -0.0059967041015625, 0.0250244140625, 0.050384521484375, -0.012115478515625, 0.049163818359375, -0.032379150390625, -0.0272674560546875, -0.0134429931640625, 0.006977081298828125, 0.026214599609375, 0.05609130859375, 0.0675048828125, -0.04669189453125, -0.035064697265625, -0.006496429443359375, -0.06951904296875, 0.01422119140625, -0.01026153564453125, -0.0191802978515625, 0.0255889892578125, 0.0184326171875, -0.04010009765625, 0.046783447265625, 0.0230255126953125, -0.0225067138671875, 0.033905029296875, -0.024200439453125, 0.0201568603515625, -0.09197998046875, 0.0151824951171875, 0.0299224853515625, -0.01548004150390625, -0.0276031494140625, 0.0004572868347167969, 0.00484466552734375, -0.00463104248046875, -0.035247802734375, 0.04388427734375, -0.05059814453125, -0.0236968994140625, -0.007808685302734375, -0.01541900634765625, 0.00478363037109375, 0.055938720703125, -0.00630950927734375, 0.034271240234375, 0.061737060546875, -0.0338134765625, 0.0462646484375, 0.0224151611328125, -0.010955810546875, 0.03192138671875, -0.0528564453125, 0.016510009765625, -0.002620697021484375, 0.027069091796875, -0.08489990234375, -0.0135955810546875, 0.03179931640625, -0.045562744140625, 0.053924560546875, -0.048004150390625, -0.0277099609375, -0.041839599609375, -0.0325927734375, 0.030059814453125, 0.06573486328125, -0.051666259765625, 0.033905029296875, 0.01375579833984375, 0.02130126953125, -0.038909912109375, -0.0701904296875, -0.01155853271484375, -0.033416748046875, -0.052703857421875, 0.0267181396484375, 0.01580810546875, 0.01305389404296875, 0.018768310546875, -0.0009188652038574219, -0.00933837890625, -0.01263427734375, 0.040557861328125, 0.03277587890625, -0.0282745361328125, -0.003314971923828125, -0.031982421875, -0.00632476806640625, 0.00498199462890625, -0.02520751953125, 0.0406494140625, -0.02215576171875, -0.0104827880859375, -0.0677490234375, 
-0.0071258544921875, 0.04119873046875, -0.0166015625, 0.068603515625, 0.09161376953125, -0.03509521484375, 0.006092071533203125, -0.03607177734375, -0.0218963623046875, -0.035552978515625, 0.038665771484375, -0.0243377685546875, -0.038055419921875, 0.0687255859375, 0.0006604194641113281, 0.005832672119140625, 0.054351806640625, 0.02490234375, -0.0054473876953125, 0.044677734375, 0.04150390625, 0.0077972412109375, 0.051544189453125, -0.07257080078125, -0.0140228271484375, -0.0706787109375, -0.035888671875, -0.0250091552734375, -0.053253173828125, -0.049835205078125, -0.03619384765625, 0.0209197998046875, 0.018951416015625, -0.0318603515625, 0.037628173828125, -0.06085205078125, 0.0025653839111328125, 0.050628662109375, 0.04241943359375, -0.0259857177734375, 0.0321044921875, -0.0035858154296875, -0.00508880615234375, -0.04986572265625, -0.012939453125, 0.08758544921875, 0.03582763671875, 0.048065185546875, -0.003795623779296875, 0.0535888671875, -0.0104827880859375, 0.0283660888671875, -0.049346923828125, 0.046295166015625, -0.01457977294921875, -0.0228424072265625, -0.015472412109375, -0.0374755859375, -0.0882568359375, 0.0106201171875, -0.0238037109375, -0.056610107421875, 0.0204010009765625, 0.01169586181640625, -0.0181884765625, 0.060089111328125, -0.06561279296875, 0.0667724609375, -0.003108978271484375, -0.032867431640625, 0.0042724609375, -0.051910400390625, 0.0255889892578125, 0.01555633544921875, -0.0188140869140625, -0.00299835205078125, 0.0098114013671875, 0.08209228515625, -0.03570556640625, 0.06304931640625, -0.037750244140625, 0.0247344970703125, 0.041046142578125, -0.00608062744140625, 0.0297698974609375, -0.0139312744140625, -0.00228118896484375, 0.0308685302734375, -0.003055572509765625, -0.03460693359375, -0.0458984375, 0.04461669921875, -0.07373046875, -0.0306243896484375, -0.024322509765625, -0.02801513671875, 0.017547607421875, 0.0079193115234375, 0.03546142578125, 0.048858642578125, 0.0253753662109375, 0.0308837890625, 0.04742431640625, 
-0.0298004150390625, 0.0316162109375, -0.007755279541015625, -0.0161895751953125, -0.038482666015625, 0.06903076171875, 0.018646240234375, 0.01873779296875, 0.0012111663818359375, 0.0158538818359375, -0.0272674560546875, -0.0440673828125, -0.024139404296875, 0.037872314453125, -0.049591064453125, -0.034149169921875, -0.03594970703125, -0.03399658203125, -0.036376953125, 0.0003445148468017578, -0.038543701171875, -0.0239410400390625, -0.03192138671875, 0.01531219482421875, 0.0545654296875, 0.04193115234375, -0.0167083740234375, 0.03338623046875, -0.04150390625, 0.0157318115234375, 0.02130126953125, 0.033905029296875, -0.00807952880859375, -0.07635498046875, -0.0156402587890625, -0.0017805099487304688, -0.023406982421875, -0.052947998046875, 0.033050537109375, 0.01154327392578125, 0.036102294921875, 0.0227813720703125, -0.01317596435546875, 0.055419921875, -0.0003616809844970703, 0.044677734375, 0.033172607421875, -0.03692626953125, 0.03680419921875, -0.0015430450439453125, 0.005268096923828125, 0.0104217529296875, 0.022064208984375, -0.021697998046875, 0.00997161865234375, -0.073974609375, -0.056732177734375, 0.06256103515625, 0.0018663406372070312, -0.002521514892578125, 0.03350830078125, 0.05401611328125, -0.00522613525390625, 0.00012481212615966797, -0.055938720703125, -0.034210205078125, -0.021636962890625, -0.0146942138671875, -0.0022106170654296875, -0.00948333740234375, -0.01026153564453125, -0.04864501953125, 0.049224853515625, -0.00439453125, 0.061737060546875, 0.024932861328125, 0.00016367435455322266, -0.0014553070068359375, -0.033966064453125, 0.04052734375, 0.027862548828125, -0.0254058837890625, 0.006946563720703125, 0.0121612548828125, -0.042724609375, 0.0108489990234375, 0.0130767822265625, 0.00463104248046875, -0.0023212432861328125, 0.040740966796875, 0.07281494140625, -0.0074920654296875, 0.01343536376953125, 0.037384033203125, -0.007015228271484375, -0.03564453125, -0.01727294921875, 0.00927734375, 0.00074005126953125, 0.0302276611328125, 
0.0220184326171875, 0.026885986328125, -0.01354217529296875, -0.0218048095703125, 0.021820068359375, 0.036346435546875, -0.0291900634765625, -0.0299224853515625, 0.0467529296875, -0.0114288330078125, -0.009674072265625, 0.07080078125, -0.0082244873046875, -0.031982421875, 0.08258056640625, 0.03173828125, 0.071533203125, -0.0027065277099609375, 0.0024967193603515625, 0.06634521484375, 0.020538330078125, -0.0040740966796875, 0.005786895751953125, 0.015960693359375, -0.048858642578125, -0.005588531494140625, -0.03826904296875, 0.00974273681640625, 0.03436279296875, -0.047882080078125, 0.02362060546875, -0.04864501953125, -0.032470703125, 0.00514984130859375, 0.014404296875, -0.073974609375, 0.0142974853515625, 0.00760650634765625, 0.056121826171875, -0.06463623046875, 0.056182861328125, 0.06292724609375, -0.03955078125, -0.0802001953125, -0.005706787109375, -0.0058746337890625, -0.0745849609375, 0.04656982421875, 0.031982421875, 0.01363372802734375, 0.0019626617431640625, -0.0570068359375, -0.05682373046875, 0.105712890625, 0.043701171875, -0.01087188720703125, 0.0184326171875, -0.0090484619140625, 0.020599365234375, -0.03009033203125, 0.0305633544921875, 0.0162811279296875, 0.0322265625, 0.0292510986328125, -0.052154541015625, 0.0264739990234375, -0.016998291015625, 0.01483154296875, 0.0119476318359375, -0.06536865234375, 0.06915283203125, -0.037841796875, -0.01212310791015625, 0.00021851062774658203, 0.058074951171875, 0.01690673828125, 0.0021038055419921875, 0.03656005859375, 0.06182861328125, 0.040863037109375, -0.0290374755859375, 0.067626953125, 0.0016021728515625, 0.04833984375, 0.05181884765625, 0.029449462890625, 0.03924560546875, 0.029998779296875, -0.027862548828125, 0.03265380859375, 0.0791015625, -0.0302581787109375, 0.023895263671875, 0.0247344970703125, 0.0079803466796875, -0.0136260986328125, 0.00982666015625, -0.04388427734375, 0.034759521484375, 0.0100250244140625, -0.040283203125, -0.016510009765625, 0.0024433135986328125, 0.0010814666748046875, 
-0.024993896484375, -0.01090240478515625, 0.037628173828125, 0.0008497238159179688, -0.032257080078125, 0.0693359375, 0.005115509033203125, 0.059326171875, -0.03076171875, -0.0072174072265625, -0.0274658203125, 0.01450347900390625, -0.02471923828125, -0.0654296875, 0.0199127197265625, -0.02935791015625, -0.0025272369384765625, 0.0030670166015625, 0.050201416015625, -0.0299224853515625, -0.03472900390625, 0.01235198974609375, 0.0142822265625, 0.0390625, 0.0033893585205078125, -0.08807373046875, 0.01090240478515625, 0.01032257080078125, -0.050079345703125, 0.02972412109375, 0.037841796875, 0.00333404541015625, 0.053436279296875, 0.049591064453125, -0.0022640228271484375, 0.005092620849609375, -0.0183258056640625, 0.0599365234375, -0.038909912109375, -0.018463134765625, -0.062225341796875, 0.052032470703125, -0.007556915283203125, -0.040924072265625, 0.0357666015625, 0.04052734375, 0.056365966796875, -0.0037899017333984375, 0.0360107421875, -0.015777587890625, -0.00897216796875, -0.0318603515625, 0.050262451171875, -0.04937744140625, -0.005466461181640625, -0.0081939697265625, -0.06097412109375, -0.0259857177734375, 0.050201416015625, -0.019317626953125, 0.030975341796875, 0.03143310546875, 0.0721435546875, -0.0262908935546875, -0.038177490234375, 0.0113067626953125, 0.0095367431640625, 0.012847900390625, 0.040863037109375, 0.0275726318359375, -0.0631103515625, 0.030364990234375, -0.04010009765625, -0.019866943359375, -0.015899658203125, -0.051788330078125, -0.07318115234375, -0.067138671875, -0.05126953125, -0.057220458984375, -0.018463134765625, 0.056427001953125, 0.081787109375, -0.04840087890625, -0.006259918212890625, 0.0098724365234375, 0.011260986328125, -0.016876220703125, -0.015777587890625, 0.05450439453125, -0.0025997161865234375, -0.049468994140625, -0.0286712646484375, -0.0031642913818359375, 0.0338134765625, -0.006008148193359375, -0.0147247314453125, -0.01145172119140625, -0.025115966796875, 0.01415252685546875, 0.02996826171875, -0.051239013671875, 
-0.022735595703125, -0.0234222412109375, -0.014007568359375, 0.0286712646484375, 0.0283050537109375, -0.0450439453125, 0.02374267578125, 0.03448486328125, 0.032501220703125, 0.06396484375, -0.0159759521484375, 0.0032787322998046875, -0.057861328125, 0.039306640625, -0.0167083740234375, 0.033966064453125, 0.0318603515625, -0.024932861328125, 0.04425048828125, 0.041595458984375, -0.03131103515625, -0.06256103515625, -0.007289886474609375, -0.08135986328125, -0.01192474365234375, 0.07611083984375, -0.0328369140625, -0.04010009765625, 0.035736083984375, -0.00528717041015625, 0.05059814453125, -0.00826263427734375, 0.032318115234375, 0.0172576904296875, -0.00621795654296875, -0.045989990234375, -0.037750244140625, 0.030364990234375, 0.00569915771484375, -0.044647216796875, -0.0260162353515625, 0.007541656494140625, 0.056976318359375, 0.0206451416015625, 0.032196044921875, -0.0099334716796875, 0.00884246826171875, 0.0017385482788085938, 0.04351806640625, -0.04022216796875, -0.0087890625, -0.031219482421875, 0.002079010009765625, -0.01158905029296875, -0.052734375 ] ]
cross-encoder/stsb-roberta-large
2021-08-05T08:42:03.000Z
[ "transformers", "pytorch", "jax", "roberta", "text-classification", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
text-classification
cross-encoder
null
null
cross-encoder/stsb-roberta-large
6
9,180
transformers
2022-03-02T23:29:05
--- license: apache-2.0 --- # Cross-Encoder for Semantic Textual Similarity This model was trained using the [SentenceTransformers](https://sbert.net) [Cross-Encoder](https://www.sbert.net/examples/applications/cross-encoder/README.html) class. ## Training Data This model was trained on the [STS benchmark dataset](http://ixa2.si.ehu.eus/stswiki/index.php/STSbenchmark). The model predicts a score between 0 and 1 indicating the semantic similarity of two sentences. ## Usage and Performance Pre-trained models can be used like this: ``` from sentence_transformers import CrossEncoder model = CrossEncoder('cross-encoder/stsb-roberta-large') scores = model.predict([('Sentence 1', 'Sentence 2'), ('Sentence 3', 'Sentence 4')]) ``` The model will predict scores for the pairs `('Sentence 1', 'Sentence 2')` and `('Sentence 3', 'Sentence 4')`. You can also use this model without sentence_transformers, using only the Transformers ``AutoModel`` class.
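A minimal sketch of the sentence_transformers-free route mentioned above. The card names the ``AutoModel`` class; since a cross-encoder trained for regression exposes its score through a sequence-classification head, this sketch uses `AutoModelForSequenceClassification` instead — treat that class choice as an assumption, not the card's prescription.

```python
# Sketch: scoring sentence pairs without sentence_transformers.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = 'cross-encoder/stsb-roberta-large'
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Tokenize each sentence pair together, as the cross-encoder expects.
features = tokenizer(
    ['Sentence 1', 'Sentence 3'],
    ['Sentence 2', 'Sentence 4'],
    padding=True, truncation=True, return_tensors='pt',
)
with torch.no_grad():
    scores = model(**features).logits  # one similarity score per pair
```

The resulting `scores` tensor has shape `(2, 1)`: one logit per input pair, matching the single regression output the model was trained with.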
941
[ [ … embedding vector values omitted … ] ]
uw-madison/yoso-4096
2022-01-12T13:36:04.000Z
[ "transformers", "pytorch", "yoso", "fill-mask", "arxiv:2111.09714", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
uw-madison
null
null
uw-madison/yoso-4096
0
9,172
transformers
2022-03-02T23:29:05
# YOSO YOSO model for masked language modeling (MLM) for sequence length 4096. ## About YOSO The YOSO model was proposed in [You Only Sample (Almost) Once: Linear Cost Self-Attention Via Bernoulli Sampling](https://arxiv.org/abs/2111.09714) by Zhanpeng Zeng, Yunyang Xiong, Sathya N. Ravi, Shailesh Acharya, Glenn Fung, Vikas Singh. The abstract from the paper is the following: Transformer-based models are widely used in natural language processing (NLP). Central to the transformer model is the self-attention mechanism, which captures the interactions of token pairs in the input sequences and depends quadratically on the sequence length. Training such models on longer sequences is expensive. In this paper, we show that a Bernoulli sampling attention mechanism based on Locality Sensitive Hashing (LSH) decreases the quadratic complexity of such models to linear. We bypass the quadratic cost by considering self-attention as a sum of individual tokens associated with Bernoulli random variables that can, in principle, be sampled at once by a single hash (although in practice, this number may be a small constant). This leads to an efficient sampling scheme to estimate self-attention which relies on specific modifications of LSH (to enable deployment on GPU architectures). We evaluate our algorithm on the GLUE benchmark with standard 512 sequence length where we see favorable performance relative to a standard pretrained Transformer. On the Long Range Arena (LRA) benchmark, for evaluating performance on long sequences, our method achieves results consistent with softmax self-attention but with sizable speed-ups and memory savings and often outperforms other efficient self-attention methods. 
Our code is available at this https URL ## Usage ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='uw-madison/yoso-4096') >>> unmasker("Paris is the [MASK] of France.") [{'score': 0.024274500086903572, 'token': 812, 'token_str': ' capital', 'sequence': 'Paris is the capital of France.'}, {'score': 0.022863076999783516, 'token': 3497, 'token_str': ' Republic', 'sequence': 'Paris is the Republic of France.'}, {'score': 0.01383623294532299, 'token': 1515, 'token_str': ' French', 'sequence': 'Paris is the French of France.'}, {'score': 0.013550693169236183, 'token': 2201, 'token_str': ' Paris', 'sequence': 'Paris is the Paris of France.'}, {'score': 0.011591030284762383, 'token': 270, 'token_str': ' President', 'sequence': 'Paris is the President of France.'}] ```
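The LSH-based Bernoulli sampling described in the abstract can be illustrated with a toy SimHash experiment (this is an illustrative aside, not YOSO's actual implementation): under a random-hyperplane hash, two vectors collide with probability 1 - θ/π, where θ is the angle between them, and this angular-similarity signal is what the sampling scheme exploits to approximate attention weights.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64

def simhash_bits(v, planes):
    # Random-hyperplane (SimHash) bits: sign of the projection onto each plane.
    return planes @ v > 0

def empirical_collision_prob(q, k, n_hashes=5000):
    # Fraction of single-bit hashes on which q and k agree.
    planes = rng.standard_normal((n_hashes, d))
    return float(np.mean(simhash_bits(q, planes) == simhash_bits(k, planes)))

q = rng.standard_normal(d)
k = q + 0.2 * rng.standard_normal(d)      # a key nearly aligned with the query
cos = q @ k / (np.linalg.norm(q) * np.linalg.norm(k))
expected = 1.0 - np.arccos(cos) / np.pi   # theoretical collision probability
observed = empirical_collision_prob(q, k)
print(f"expected={expected:.3f} observed={observed:.3f}")
```

The empirical collision rate tracks the angular similarity closely, which is why a handful of hash draws suffices to estimate which key/query pairs carry most of the attention mass.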
2,566
[ [ … embedding vector values omitted … ] ]
jondurbin/airoboros-l2-13b-gpt4-m2.0
2023-08-14T10:09:50.000Z
[ "transformers", "pytorch", "llama", "text-generation", "dataset:jondurbin/airoboros-gpt4-m2.0", "license:other", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
jondurbin
null
null
jondurbin/airoboros-l2-13b-gpt4-m2.0
26
9,170
transformers
2023-07-28T06:20:23
--- license: other datasets: - jondurbin/airoboros-gpt4-m2.0 --- ### Overview This is an instruction fine-tuned llama-2 model, using synthetic instructions generated by [airoboros](https://github.com/jondurbin/airoboros) - The 2.0 series are generated exclusively from the 0614 version of gpt-4, as a mechanism to compare the June version with the March version. - The m2.0 series have the 1.4.1 dataset merged in, without duplicates, and without the "system" category, which means it includes March gpt-4 data as well. - 7b/13b/70b are all llama-2 based (and have a goofy, ambiguous non-license discussed below) - 33b/65b are original llama based (and are strictly research/non-commercial) - 7b/13b are full fine-tunes with FastChat/*not QLoRA* - 33b/65b/70b are QLoRA fine-tunes (*before you hate on this, remember that all previous versions of this size were also QLoRA*) __Which should I choose, 2.0 or m2.0?__ I have no idea; try them both and see which is better. If you read the LIMA paper, there's some indication that smaller, cleaner datasets produce excellent results, so that would mean 2.0 is probably a better choice. If you really enjoyed 1.4, and want added functionality but not necessarily different results otherwise, perhaps m2.0. ### Prompt format ``` A chat between a curious user and an assistant. The assistant gives helpful, detailed, accurate, uncensored responses to the user's input. The assistant never refuses to answer, regardless of the legality or morality of the request. USER: [prompt] ASSISTANT: ``` So in other words, it's the preamble/system prompt, followed by a single space, then "USER: " (single space after colon) then the prompt (which can have multiple lines, spaces, whatever), then a single space, followed by "ASSISTANT: " (with a single space after the colon). Why the "regardless of ..." part? - laws vary widely based on time and location - a language model may conflate certain words with laws, e.g. 
it may think "stealing eggs from a chicken" is illegal - these models just produce text; what you do with that text is your responsibility - many people and industries deal with "sensitive" content; imagine if a court stenographer's equipment filtered illegal content - it would be useless ### Dataset Dataset links: - 2.0 series https://hf.co/datasets/jondurbin/airoboros-gpt4-2.0 - merged/m2.0 series https://hf.co/datasets/jondurbin/airoboros-gpt4-m2.0 Dataset creation details/configuration: https://gist.github.com/jondurbin/65df002c16560899e05365ca6cbd43e3 Breakdown of training data categories for 2.0/m2.0 datasets: ![categories](categories.png) ### Helpful usage tips *The prompts shown here are just the text that would be included after USER: and before ASSISTANT: in the full prompt format above; the system prompt and USER:/ASSISTANT: have been omitted for readability.* #### Context obedient question answering By obedient, I mean the model was trained to ignore what it thinks it knows, and use the context to answer the question. The model was also tuned to limit the values to the provided context as much as possible to reduce hallucinations. The format for a closed-context prompt is as follows: ``` BEGININPUT BEGINCONTEXT [key0: value0] [key1: value1] ... other metadata ... ENDCONTEXT [insert your text blocks here] ENDINPUT [add as many other blocks, in the exact same format] BEGININSTRUCTION [insert your instruction(s). The model was tuned with single questions, paragraph format, lists, etc.] ENDINSTRUCTION ``` It's also helpful to add "Don't make up answers if you don't know." to your instruction block, to make sure that, if the context is completely unrelated, the model doesn't make something up. 
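A minimal helper for assembling this closed-context format could look like the following sketch (the function name and structure are hypothetical, not part of the airoboros tooling):

```python
def closed_context_prompt(blocks, instruction):
    """Build an airoboros-style closed-context prompt.

    blocks: list of (metadata, text) pairs, where metadata is a dict of
    key/value pairs to place in the BEGINCONTEXT section.
    """
    parts = []
    for metadata, text in blocks:
        parts.append("BEGININPUT")
        parts.append("BEGINCONTEXT")
        for key, value in metadata.items():
            parts.append(f"{key}: {value}")
        parts.append("ENDCONTEXT")
        parts.append(text)
        parts.append("ENDINPUT")
    parts.append("BEGININSTRUCTION")
    # Append the anti-hallucination hint recommended above.
    parts.append(instruction + "\nDon't make up answers if you don't know.")
    parts.append("ENDINSTRUCTION")
    return "\n".join(parts)

prompt = closed_context_prompt(
    [({"date": "2021-01-01", "url": "https://web.site/123"},
      "In a shocking turn of events, blueberries are now green.")],
    "What color are blueberries? Source?",
)
print(prompt)
```

The resulting string is what goes between `USER: ` and ` ASSISTANT: ` in the full prompt format described earlier.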
*The __only__ prompts that need this closed-context formatting are closed-context instructions. Normal questions/instructions do not!*

I know it's a bit verbose and annoying, but after much trial and error, using these explicit delimiters helps the model understand where to find the responses and how to associate specific sources with them.
- `BEGININPUT` - denotes a new input block
- `BEGINCONTEXT` - denotes the block of context (metadata key/value pairs) to associate with the current input block
- `ENDCONTEXT` - denotes the end of the metadata block for the current input
- [text] - Insert whatever text you want for the input block, as many paragraphs as can fit in the context.
- `ENDINPUT` - denotes the end of the current input block
- [repeat as many input blocks in this format as you want]
- `BEGININSTRUCTION` - denotes the start of the list (or one) instruction(s) to respond to for all of the input blocks above.
- [instruction(s)]
- `ENDINSTRUCTION` - denotes the end of instruction set

It sometimes works without `ENDINSTRUCTION`, but by explicitly including that in the prompt, the model better understands that all of the instructions in the block should be responded to.

Here's a trivial, but important example to prove the point:
```
BEGININPUT
BEGINCONTEXT
date: 2021-01-01
url: https://web.site/123
ENDCONTEXT
In a shocking turn of events, blueberries are now green, but will be sticking with the same name.
ENDINPUT
BEGININSTRUCTION
What color are blueberries? Source?
ENDINSTRUCTION
```

And the response:
```
Blueberries are now green.
Source:
date: 2021-01-01
url: https://web.site/123
```

#### Coding

You can ask for fairly complex coding instructions with multiple criteria, e.g.:

```
Create a python application with the following requirements:
- Asyncio FastAPI webserver
- ping endpoint that returns the current date in JSON format
- file upload endpoint, which calculates the file's sha256 checksum, and checks postgres to deduplicate
```

Or inline criteria:

```
Write a multi-threaded TCP server in C that accepts a "GET [key]" input and "SET [key] [value]" input, and uses a binary tree to get and store the input values.
```

You can also optionally add a single space and "PLAINFORMAT" at the end of your prompt to avoid backticks, explanations, etc. and just print the code, e.g.:

```
Write a websocket application in node.js. PLAINFORMAT
```

#### Agent/function calling

The dataset includes many examples of function/args generation based on input criteria. This is somewhat similar to the OpenAI function calling, but the output is either JSON or YAML.

Example prompt:
```
As an AI assistant, please select the most suitable function and parameters from the list of available functions below, based on the user's input. Provide your response in JSON format.

Input: I want to know how many times 'Python' is mentioned in my text file.

Available functions:
file_analytics:
  description: This tool performs various operations on a text file.
  params:
    action: The operation we want to perform on the data, such as "count_occurrences", "find_line", etc.
    filters:
      keyword: The word or phrase we want to search for.
```

Response:
```json
{
  "function": "file_analytics",
  "params": {
    "action": "count_occurrences",
    "filters": {
      "keyword": "Python"
    }
  }
}
```

#### Chain-of-thought

You can ask for several possible responses to a given problem, with a ranking and final answer selection.

Example prompt:
```
A group of people decide to play a round-robin tournament where every player plays every other player exactly once.
If a total of 45 games were played, how many players participated in the tournament? Offer several potential answers, rank them according to mathematical logic, and pick the most feasible one.
```

Example response:
```
Solution 1:
In a round-robin tournament, each player plays every other player exactly once. This means that the total number of games played is equal to the combination of all players taken two at a time (since a game involves two players).

The formula for combinations is nC2 = n(n-1)/2, where n is the total number of players. We know that nC2 equals 45 (the total number of games), so we can set up the equation as follows:

n(n-1)/2 = 45

Solving this equation gives us n=10.

Final answer: There were 10 players in the tournament.

Solution 2:
Another way to approach this problem is by using the concept of triangular numbers. Triangular numbers are a sequence of numbers where each number can be represented as a triangle with dots. The nth triangular number is given by the sum of the first n natural numbers. If we consider each game as a dot and each player as a side of the triangle, then the total number of games (45) would correspond to the 9th triangular number because 1+2+3+4+5+6+7+8+9=45. However, since each side of the triangle represents a player, and there's one more player than sides in our model (because the last player has no one left to play against), we need to add one more to get the total number of players.

Final answer: There were 10 players in the tournament.

Solution 3:
We could also use trial and error to solve this problem. Starting from a small number of players, we calculate how many games would be played in a round-robin tournament until we reach 45 games. For example, with 2 players, only 1 game is played; with 3 players, 3 games are played; with 4 players, 6 games are played, and so on. Continuing this process, we find that with 10 players, 45 games are played.

Final answer: There were 10 players in the tournament.
Ranking of solutions from best to worst:
1. Solution 1: This solution uses direct mathematical reasoning based on the properties of a round-robin tournament and solves an equation to find the answer.
2. Solution 2: This solution uses the concept of triangular numbers, which requires additional steps and knowledge beyond the immediate problem context, but still arrives at the correct answer.
3. Solution 3: This solution relies on trial and error, which is less efficient and reliable than the other methods, but it eventually gets to the right answer.

Best and final answer: There were 10 players in the tournament.
```

#### reWOO style execution planning

The model now supports execution planning for complex instructions that would require making use of several tools. The output is just the plan; you must implement a mechanism to parse the output and actually call the functions!

Example prompt:
```
Please construct a systematic plan to generate an optimal response to the user instruction, utilizing a set of provided tools. Each plan will correspond to an evidence value, which will be the output of one of the available functions given an input string that could be the user's question, one or more prior evidence values, or a combination of both.

Here are the tools available to be called:
DuckDuckGo[input]: Tool that enables the user to search for information using the DuckDuckGo search engine. This tool is helpful in locating a suitable list of sites that may contain the answer to the user's query. It does not directly answer the question but finds a list of sites that might have the answer.
TextScraper[input]: Load one or more websites from the input string containing newline delimited links, where input is one or more links, and produces plain text output containing the content of the links.
HyperlinkExtractor[input]: Extract links from plain text and produces a plain text, newline delimited response of links.
KnowledgeModel[input]: Question answering language model, particularly useful in answering questions based on an input passage of text. The input must be a text question that references an :evidence[n]: variable, e.g. What color is the cat, given :evidence1:?

The output should be in format:
Plan: [first action to take based on input question]
:evidence0: = [function to call with input parameter]
Plan: [next action to take, based on result of :evidence0:]
:evidence1: = [next function to call and input parameter, which may include reference to previous evidence, e.g. "Given context :evidence0"]
...
Answer: [:evidence[n]: containing the final answer.]

Question: Who won the Nobel Prize in Literature in 2020 and what is their most famous work?
```

Response:
```
Plan: Begin by conducting a web search to find out who won the Nobel Prize in Literature in 2020.
:evidence0: = DuckDuckGo[Nobel Prize in Literature 2020 winner]
Plan: Utilize the KnowledgeModel to interpret the search results and find the name of the laureate.
:evidence1: = KnowledgeModel[Who won the Nobel Prize in Literature in 2020, given :evidence0:?]
Plan: Conduct another web search to find the most famous work of the identified laureate.
:evidence2: = DuckDuckGo[Most famous work of :evidence1:]
Plan: Extract the relevant links from the DuckDuckGo search results for a more focused search.
:evidence3: = HyperlinkExtractor[:evidence2:]
Plan: Use the TextScraper tool to extract information from the relevant links.
:evidence4: = TextScraper[:evidence3:]
Plan: Finally, utilize the KnowledgeModel to identify and summarize the most famous work of the laureate from the extracted information.
:evidence5: = KnowledgeModel[What is the most famous work of :evidence1:, given :evidence4:?]
Answer: :evidence5:
```

For this to be useful, you'd have to parse the output plan text, and implement/call each of the functions.
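Each evidence-assignment line follows the shape `:evidenceN: = Tool[argument]`, so a single line can be parsed in isolation. A small, testable sketch (the function name and return shape are my own, purely illustrative):

```python
import re


def parse_plan_line(line):
    """Parse one ':evidenceN: = Tool[argument]' line from a plan.

    Returns a dict with the evidence variable, tool name, and argument,
    or None if the line is not an evidence assignment (e.g. a "Plan:" line).
    Illustrative only; not part of the airoboros codebase.
    """
    m = re.match(r"^(:evidence[0-9]+:)\s*=\s*(\w+)\[(.*)\]\s*$", line)
    if not m:
        return None
    evidence, tool, argument = m.groups()
    return {"evidence": evidence, "tool": tool, "argument": argument}


parsed = parse_plan_line(
    ":evidence1: = KnowledgeModel[Who won the Nobel Prize in Literature in 2020, given :evidence0:?]"
)
```

Note the argument can itself reference earlier evidence variables, which is what makes chaining the steps possible.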
This is just pseudo-code, completely untested off the top of my head, and obviously would require full implementation + hardening:

```python
import re
import requests


def inject_context(input_text, **context):
    for ref in set(re.findall(r"(:evidence[0-9]+:)", input_text, re.I)):
        input_text = input_text.replace(ref, context.get(ref, ""))
    return input_text


def duckduckgo(input_text, **context):
    search_string = inject_context(input_text, **context)
    # ... search via duck duck go using search_string
    # ... return text content


def link_extractor(input_text, **context):
    input_text = inject_context(input_text, **context)
    return "\n".join(list(set(re.findall(r"(https?://[^\s]+?\.?)", input_text, re.I))))


def scrape(input_text, **context):
    input_text = inject_context(input_text, **context)
    text = []
    for link in input_text.splitlines():
        text.append(requests.get(link).text)
    return "\n".join(text)


def infer(input_text, **context):
    prompt = inject_context(input_text, **context)
    # ... call model with prompt, return output


def parse_plan(plan):
    method_map = {
        "DuckDuckGo": duckduckgo,
        "HyperlinkExtractor": link_extractor,
        "KnowledgeModel": infer,
        "TextScraper": scrape,
    }
    context = {}
    for line in plan.strip().splitlines():
        if line.startswith("Plan:"):
            print(line)
            continue
        parts = re.match(r"^(:evidence[0-9]+:)\s*=\s*([^\[]+)(\[.*\])\s*$", line, re.I)
        if not parts:
            if line.startswith("Answer: "):
                return context.get(line.split(" ")[-1].strip(), "Answer couldn't be generated...")
            raise RuntimeError("bad format: " + line)
        context[parts.group(1)] = method_map[parts.group(2)](parts.group(3), **context)
```

### Contribute

If you're interested in new functionality, particularly a new "instructor" type to generate a specific type of training data, take a look at the dataset generation tool repo: https://github.com/jondurbin/airoboros and either make a PR or open an issue with details.
To help me with the OpenAI/compute costs:
- https://bmc.link/jondurbin
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf

### Licence and usage restrictions

The airoboros 2.0/m2.0 models are built on top of either llama or llama-2. Any model with `-l2-` in the name uses llama2, `..-33b-...` and `...-65b-...` are based on the original llama.

#### Llama (original) models

If the model was based on the original llama (33b/65b), the license is __cc-by-nc-4.0__ and is for research/academic use only -- no commercial usage whatsoever!

#### Llama-2 models

Base model has a custom Meta license:
- See the [meta-license/LICENSE.txt](meta-license/LICENSE.txt) file attached for the original license provided by Meta.
- See also [meta-license/USE_POLICY.md](meta-license/USE_POLICY.md) and [meta-license/Responsible-Use-Guide.pdf](meta-license/Responsible-Use-Guide.pdf), also provided by Meta.

The fine-tuning data was generated by OpenAI API calls to gpt-4, via [airoboros](https://github.com/jondurbin/airoboros)

The ToS for OpenAI API usage has a clause preventing the output from being used to train a model that __competes__ with OpenAI

- what does *compete* actually mean here?
- these small open source models will not produce output anywhere near the quality of gpt-4, or even gpt-3.5, so I can't imagine this could credibly be considered competing in the first place
- if someone else uses the dataset to do the same, they wouldn't necessarily be violating the ToS because they didn't call the API, so I don't know how that works
- the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise non-permissive licensing in the first place
- other work using the self-instruct method, e.g.
the original here: https://github.com/yizhongw/self-instruct released the data and model as apache-2

I am purposely leaving this license ambiguous (other than the fact you must comply with the Meta original license for llama-2) because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly.

Your best bet is probably to avoid using this commercially due to the OpenAI API usage.

Either way, by using this model, you agree to completely indemnify me.
17,780
EleutherAI/pythia-410m
2023-07-09T16:01:42.000Z
[ "transformers", "pytorch", "safetensors", "gpt_neox", "text-generation", "causal-lm", "pythia", "en", "dataset:EleutherAI/pile", "arxiv:2304.01373", "arxiv:2101.00027", "arxiv:2201.07311", "license:apache-2.0", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
text-generation
EleutherAI
null
null
EleutherAI/pythia-410m
9
9,138
transformers
2023-02-13T18:45:00
---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- EleutherAI/pile
---

The *Pythia Scaling Suite* is a collection of models developed to facilitate interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf). It contains two sets of eight models of sizes 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two models: one trained on the Pile, and one trained on the Pile after the dataset has been globally deduplicated. All 8 model sizes are trained on the exact same data, in the exact same order. We also provide 154 intermediate checkpoints per model, hosted on Hugging Face as branches.

The Pythia model suite was deliberately designed to promote scientific research on large language models, especially interpretability research. Despite not centering downstream performance as a design goal, we find the models <a href="#evaluations">match or exceed</a> the performance of similar and same-sized models, such as those in the OPT and GPT-Neo suites.

<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>

Previously, we released an early version of the Pythia suite to the public. However, we decided to retrain the model suite to address a few hyperparameter discrepancies. This model card <a href="#changelog">lists the changes</a>; see appendix B in the Pythia paper for further discussion. We found no difference in benchmark performance between the two Pythia versions. The old models are [still available](https://huggingface.co/models?other=pythia_v0), but we suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**

Please note that all models in the *Pythia* suite were renamed in January 2023. For clarity, a <a href="#naming-convention-and-parameter-count">table comparing the old and new names</a> is provided in this model card, together with exact parameter counts.
</details>
<br>

# Pythia-410M

## Model Details

- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia) for training procedure, config files, and details on how to use. [See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`. Please read the existing *Pythia* documentation before asking about it in the EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).

<figure>

| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |

<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and non-deduped models of a given size have the same hyperparameters.
“Equivalent” models have <b>exactly</b> the same architecture, and the same number of non-embedding parameters.</figcaption>
</figure>

## Uses and Limitations

### Intended Use

The primary intended use of Pythia is research on the behavior, functionality, and limitations of large language models. This suite is intended to provide a controlled setting for performing scientific experiments. We also provide 154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints `step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to `step143000`. These checkpoints are hosted on Hugging Face as branches. Note that branch `143000` corresponds exactly to the model checkpoint on the `main` branch of each model.

You may also further fine-tune and adapt Pythia-410M for deployment, as long as your use is in accordance with the Apache 2.0 license. Pythia models work with the Hugging Face [Transformers Library](https://huggingface.co/docs/transformers/index). If you decide to use pre-trained Pythia-410M as a basis for your fine-tuned model, please conduct your own risk and bias assessment.

### Out-of-scope use

The Pythia Suite is **not** intended for deployment. It is not in itself a product and cannot be used for human-facing interactions. For example, the model may generate harmful or offensive text. Please evaluate the risks associated with your particular use case.

Pythia models are English-language only, and are not suitable for translation or generating text in other languages.

Pythia-410M has not been fine-tuned for downstream contexts in which language models are commonly deployed, such as writing genre prose, or commercial chatbots. This means Pythia-410M will **not** respond to a given prompt the way a product like ChatGPT does. This is because, unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement Learning from Human Feedback (RLHF) to better “follow” human instructions.
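The checkpoint layout described above (initial `step0`, ten log-spaced early steps, then one checkpoint every 1000 steps up to `step143000`) can be enumerated programmatically. A minimal sketch — the branch names follow the documented pattern, but the helper function itself is ours:

```python
# Sketch: enumerate the 154 checkpoint branch names described above, and the
# number of training tokens each corresponds to (2,097,152 tokens per step).

TOKENS_PER_STEP = 2_097_152  # batch size of 2M tokens

def pythia_checkpoint_steps():
    """All 154 checkpoint steps: 0, then 1..512 log-spaced, then every 1000."""
    steps = [0] + [2 ** i for i in range(10)]    # 0, 1, 2, 4, ..., 512
    steps += list(range(1000, 143_001, 1000))    # 1000, 2000, ..., 143000
    return steps

steps = pythia_checkpoint_steps()
branches = [f"step{s}" for s in steps]

assert len(branches) == 154
assert branches[-1] == "step143000"
# Total tokens seen at the final checkpoint matches the card's figure:
assert 143_000 * TOKENS_PER_STEP == 299_892_736_000
```

Each branch name can be passed as the `revision` argument shown in the Quickstart section below.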
### Limitations and biases

The core functionality of a large language model is to take a string of text and predict the next token. The token used by the model need not produce the most “accurate” text. Never rely on Pythia-410M to produce factually accurate output.

This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset known to contain profanity and texts that are lewd or otherwise offensive. See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a discussion of documented biases with regards to gender, religion, and race. Pythia-410M may produce socially unacceptable or undesirable text, *even if* the prompt itself does not include anything explicitly offensive.

If you plan on using text generated through, for example, the Hosted Inference API, we recommend having a human curate the outputs of this language model before presenting it to other people. Please inform your audience that the text was generated by Pythia-410M.

### Quickstart

Pythia models can be loaded and used via the following code, demonstrated here for the third `pythia-70m-deduped` checkpoint:

```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer

model = GPTNeoXForCausalLM.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

tokenizer = AutoTokenizer.from_pretrained(
  "EleutherAI/pythia-70m-deduped",
  revision="step3000",
  cache_dir="./pythia-70m-deduped/step3000",
)

inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```

Revision/branch `step143000` corresponds exactly to the model checkpoint on the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on GitHub](https://github.com/EleutherAI/pythia).

## Training

### Training data

[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in English.
It was created by EleutherAI specifically for training large language models. It contains texts from 22 diverse sources, roughly broken down into five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl), prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and miscellaneous (e.g. GitHub, Enron Emails). See [the Pile paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources, methodology, and a discussion of ethical implications. Consult [the datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation about the Pile and its component datasets. The Pile can be downloaded from the [official website](https://pile.eleuther.ai/), or from a [community mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-410M.

### Training procedure

All models were trained on the exact same data, in the exact same order. Each model saw 299,892,736,000 tokens during training, and 143 checkpoints for each model are saved every 2,097,152,000 tokens, spaced evenly throughout training, from `step1000` to `step143000` (which is the same as `main`). In addition, we also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`. This corresponds to training for just under 1 epoch on the Pile for non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.

All *Pythia* models trained for 143,000 steps at a batch size of 2M (2,097,152 tokens).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training procedure, including [how to reproduce it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).

## Evaluations

All 16 *Pythia* models were evaluated using the [LM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness).
You can access the results by model and step at `results/json/*` in the [GitHub repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all Pythia and Pythia-deduped models compared with OPT and BLOOM.

<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>

<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>

<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>

<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>

<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>

## Changelog

This section compares differences between previously released [Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current models. See Appendix B of the Pythia paper for further discussion of these changes and the motivation behind them. We found that retraining Pythia had no impact on benchmark performance.

- All model sizes are now trained with uniform batch size of 2M tokens. Previously, the models of size 160M, 410M, and 1.4B parameters were trained with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all models of size 2.8B parameters or smaller had a learning rate (LR) schedule which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and 12B models all used an LR schedule which decayed to a minimum LR of 0. In the redone training runs, we rectified this inconsistency: all models now were trained with LR decaying to a minimum of 0.1× their maximum LR.

### Naming convention and parameter count

*Pythia* models were renamed in January 2023. It is possible that the old naming convention still persists in some documentation by accident. The current naming convention (70M, 160M, etc.) is based on total parameter count.

<figure style="width:32em">

| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |

</figure>
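The gap between total and non-embedding parameter counts in the table is consistent with two vocab-by-dim matrices (input embedding plus untied unembedding) over a vocabulary padded to 50,304 entries. That padded vocab size is our assumption, not stated in the table; a sketch that checks it against the published counts:

```python
# Sketch (our assumption): total params minus non-embedding params should equal
# two vocab x d_model matrices (input embedding + untied unembedding), with
# the GPT-NeoX vocabulary padded to 50,304 entries.

VOCAB = 50_304  # padded vocab size (assumption)

# (model dim, total params, non-embedding params) from the tables above
rows = [
    (512,  70_426_624,  18_915_328),   # 70M
    (768,  162_322_944, 85_056_000),   # 160M
    (1024, 405_334_016, 302_311_424),  # 410M
]

for d_model, total, non_embed in rows:
    # Embedding + unembedding account for the entire difference.
    assert total - non_embed == 2 * VOCAB * d_model
```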
13,570
[ [ -0.0273590087890625, -0.055877685546875, 0.0251922607421875, 0.00366973876953125, -0.0182952880859375, -0.01458740234375, -0.0187530517578125, -0.032958984375, 0.0142364501953125, 0.0131683349609375, -0.0246429443359375, -0.0222930908203125, -0.030364990234375, -0.002048492431640625, -0.03375244140625, 0.0845947265625, -0.01074981689453125, -0.00928497314453125, 0.00948333740234375, -0.0065765380859375, -0.0030364990234375, -0.042266845703125, -0.032073974609375, -0.0283355712890625, 0.0487060546875, 0.012481689453125, 0.0626220703125, 0.04583740234375, 0.01203155517578125, 0.022247314453125, -0.0270233154296875, -0.00495147705078125, -0.0111541748046875, -0.008819580078125, -0.00203704833984375, -0.021148681640625, -0.0535888671875, -0.00047850608825683594, 0.051483154296875, 0.049774169921875, -0.01605224609375, 0.01922607421875, -0.0020580291748046875, 0.0274658203125, -0.04156494140625, 0.00138092041015625, -0.0238189697265625, -0.01554107666015625, -0.0089874267578125, 0.01100921630859375, -0.0309600830078125, -0.0263214111328125, 0.03399658203125, -0.049530029296875, 0.019195556640625, 0.01084136962890625, 0.0888671875, -0.00859832763671875, -0.033355712890625, -0.004894256591796875, -0.053070068359375, 0.052276611328125, -0.05572509765625, 0.0269622802734375, 0.0190887451171875, 0.013397216796875, -0.002315521240234375, -0.0675048828125, -0.0426025390625, -0.0148162841796875, -0.008819580078125, -0.0020961761474609375, -0.0467529296875, 0.0025959014892578125, 0.038665771484375, 0.0479736328125, -0.06280517578125, -0.003620147705078125, -0.0300445556640625, -0.0265960693359375, 0.0252685546875, 0.006198883056640625, 0.033660888671875, -0.0228271484375, -0.0023288726806640625, -0.02862548828125, -0.051055908203125, -0.017578125, 0.04278564453125, 0.0048065185546875, -0.028076171875, 0.038330078125, -0.03131103515625, 0.045318603515625, -0.00676727294921875, 0.02099609375, 0.032135009765625, -0.01406097412109375, -0.0390625, -0.007061004638671875, 
0.06976318359375, 0.0099945068359375, 0.018951416015625, -0.00133514404296875, -0.0028247833251953125, 0.00445556640625, 0.004314422607421875, -0.08465576171875, -0.0604248046875, 0.0194244384765625, -0.0280303955078125, -0.0295257568359375, -0.0108489990234375, -0.07037353515625, -0.01477813720703125, -0.0134429931640625, 0.039703369140625, -0.0380859375, -0.053558349609375, -0.00921630859375, 0.00045228004455566406, 0.01715087890625, 0.026641845703125, -0.07183837890625, 0.030670166015625, 0.0325927734375, 0.0740966796875, 0.016937255859375, -0.043060302734375, -0.01285552978515625, -0.01824951171875, -0.01007080078125, 0.028167724609375, -0.00811004638671875, -0.01433563232421875, -0.00838470458984375, 0.01378631591796875, -0.00894927978515625, -0.0286102294921875, 0.030181884765625, -0.03302001953125, 0.0203704833984375, -0.0195159912109375, -0.03424072265625, -0.02911376953125, 0.0079803466796875, -0.046875, 0.0635986328125, 0.0169219970703125, -0.07330322265625, 0.016998291015625, -0.0160675048828125, -0.002643585205078125, -0.002346038818359375, 0.0160064697265625, -0.0509033203125, 0.002803802490234375, 0.0255126953125, 0.0031032562255859375, -0.03070068359375, 0.01506805419921875, -0.0196533203125, -0.03350830078125, 0.0137939453125, -0.03778076171875, 0.068115234375, 0.0148773193359375, -0.046661376953125, 0.0214385986328125, -0.043060302734375, 0.017974853515625, 0.017852783203125, -0.03057861328125, 0.00528717041015625, -0.01519775390625, 0.02880859375, 0.0160675048828125, 0.01202392578125, -0.028076171875, 0.0206756591796875, -0.038299560546875, 0.053436279296875, 0.05517578125, -0.004421234130859375, 0.035919189453125, -0.03375244140625, 0.0364990234375, 0.0015869140625, 0.013885498046875, -0.0072479248046875, -0.046905517578125, -0.0765380859375, -0.0205841064453125, 0.026885986328125, 0.0228271484375, -0.036773681640625, 0.031982421875, -0.018280029296875, -0.0654296875, -0.01384735107421875, -0.0065460205078125, 0.0322265625, 0.0237579345703125, 
0.033538818359375, -0.01427459716796875, -0.0423583984375, -0.06707763671875, -0.0187225341796875, -0.032073974609375, 0.008880615234375, 0.01477813720703125, 0.06903076171875, -0.0085601806640625, 0.0439453125, -0.026885986328125, 0.0193328857421875, -0.025665283203125, 0.01229095458984375, 0.033416748046875, 0.044708251953125, 0.02813720703125, -0.041351318359375, -0.030059814453125, -0.00012636184692382812, -0.044036865234375, 0.00782012939453125, 0.005306243896484375, -0.022674560546875, 0.02496337890625, 0.006664276123046875, -0.072998046875, 0.034271240234375, 0.047515869140625, -0.039154052734375, 0.062255859375, -0.0250244140625, -0.000972747802734375, -0.0809326171875, 0.0217742919921875, 0.01025390625, -0.0159912109375, -0.04608154296875, 0.004608154296875, 0.0167388916015625, -0.0150909423828125, -0.03338623046875, 0.0458984375, -0.04241943359375, -0.01099395751953125, -0.016998291015625, 0.0025386810302734375, -0.0032367706298828125, 0.049102783203125, 0.01146697998046875, 0.0413818359375, 0.058685302734375, -0.05615234375, 0.0328369140625, 0.01459503173828125, -0.02191162109375, 0.0283355712890625, -0.065673828125, 0.0130767822265625, 0.00736236572265625, 0.032958984375, -0.04388427734375, -0.0287628173828125, 0.037933349609375, -0.04376220703125, 0.01096343994140625, -0.0321044921875, -0.0404052734375, -0.032958984375, -0.01187896728515625, 0.047576904296875, 0.05853271484375, -0.0447998046875, 0.05096435546875, 0.0029144287109375, 0.00870513916015625, -0.027099609375, -0.040985107421875, -0.0182037353515625, -0.04119873046875, -0.049774169921875, 0.02777099609375, 0.01334381103515625, -0.0135040283203125, 0.0010929107666015625, 0.0010385513305664062, 0.00901031494140625, -0.0054779052734375, 0.024169921875, 0.0269775390625, -0.00368499755859375, 0.0015554428100585938, -0.01332855224609375, -0.01113128662109375, 0.000060617923736572266, -0.0386962890625, 0.073486328125, -0.022674560546875, -0.01374053955078125, -0.061981201171875, 
-0.00023043155670166016, 0.06707763671875, -0.0330810546875, 0.0682373046875, 0.04412841796875, -0.053009033203125, 0.010589599609375, -0.02899169921875, -0.023101806640625, -0.0333251953125, 0.050811767578125, -0.0183868408203125, -0.026275634765625, 0.04656982421875, 0.01763916015625, 0.0188751220703125, 0.042083740234375, 0.05572509765625, 0.0159759521484375, 0.089599609375, 0.035003662109375, -0.01216888427734375, 0.047943115234375, -0.037994384765625, 0.0178985595703125, -0.08612060546875, -0.01549530029296875, -0.037933349609375, -0.01751708984375, -0.0712890625, -0.021942138671875, 0.02423095703125, 0.017120361328125, -0.05877685546875, 0.040863037109375, -0.041595458984375, 0.0039825439453125, 0.04791259765625, 0.0168304443359375, 0.0135498046875, 0.01482391357421875, 0.0038738250732421875, -0.004947662353515625, -0.049896240234375, -0.027435302734375, 0.09332275390625, 0.03656005859375, 0.043975830078125, 0.0234527587890625, 0.05462646484375, -0.01222991943359375, 0.02294921875, -0.0531005859375, 0.03118896484375, 0.0259552001953125, -0.055389404296875, -0.013397216796875, -0.057586669921875, -0.0709228515625, 0.036376953125, 0.005977630615234375, -0.08447265625, 0.0175933837890625, 0.017120361328125, -0.0282135009765625, 0.036956787109375, -0.049041748046875, 0.07354736328125, -0.0169219970703125, -0.034759521484375, -0.024139404296875, -0.023040771484375, 0.01953125, 0.025482177734375, 0.0100555419921875, 0.006744384765625, 0.0207366943359375, 0.076171875, -0.0526123046875, 0.046905517578125, -0.01024627685546875, 0.01236724853515625, 0.023712158203125, 0.020050048828125, 0.05047607421875, 0.00843048095703125, 0.00705718994140625, -0.004207611083984375, 0.00975799560546875, -0.043365478515625, -0.028289794921875, 0.0677490234375, -0.08306884765625, -0.0301361083984375, -0.059539794921875, -0.044525146484375, 0.006191253662109375, 0.0162200927734375, 0.031646728515625, 0.0477294921875, -0.0020580291748046875, 0.0011148452758789062, 0.042449951171875, 
-0.0389404296875, 0.0281982421875, 0.0172576904296875, -0.03570556640625, -0.04095458984375, 0.07501220703125, 0.0015850067138671875, 0.026611328125, -0.0005559921264648438, 0.0154266357421875, -0.0295257568359375, -0.033843994140625, -0.04595947265625, 0.041839599609375, -0.055023193359375, 0.00026106834411621094, -0.052154541015625, -0.0009436607360839844, -0.03289794921875, 0.007274627685546875, -0.030975341796875, -0.025970458984375, -0.0159912109375, -0.0016241073608398438, 0.043426513671875, 0.036102294921875, 0.00943756103515625, 0.0258941650390625, -0.0399169921875, -0.004070281982421875, 0.0158233642578125, 0.007358551025390625, 0.01103973388671875, -0.06817626953125, -0.0094757080078125, 0.0095977783203125, -0.031280517578125, -0.08599853515625, 0.039276123046875, -0.0053558349609375, 0.0265350341796875, 0.004421234130859375, -0.0162200927734375, 0.043670654296875, -0.006633758544921875, 0.04876708984375, 0.01232147216796875, -0.0767822265625, 0.03875732421875, -0.038665771484375, 0.0258941650390625, 0.0256500244140625, 0.0267486572265625, -0.05706787109375, -0.00582122802734375, -0.074462890625, -0.08294677734375, 0.057708740234375, 0.034423828125, 0.01151275634765625, 0.00980377197265625, 0.02984619140625, -0.0369873046875, 0.01093292236328125, -0.07562255859375, -0.02264404296875, -0.018524169921875, -0.00641632080078125, 0.01157379150390625, -0.00174713134765625, 0.0036029815673828125, -0.040374755859375, 0.07586669921875, 0.0035037994384765625, 0.024627685546875, 0.0212249755859375, -0.028076171875, -0.006412506103515625, -0.0023746490478515625, 0.012237548828125, 0.056121826171875, -0.01129913330078125, 0.00505828857421875, 0.018402099609375, -0.04022216796875, 0.0035991668701171875, 0.01317596435546875, -0.0255126953125, -0.00685882568359375, 0.0144805908203125, 0.06402587890625, 0.01038360595703125, -0.0302276611328125, 0.01505279541015625, -0.004192352294921875, -0.0036563873291015625, -0.021148681640625, -0.013031005859375, 0.0174713134765625, 
0.013580322265625, 0.001209259033203125, -0.01384735107421875, -0.0019426345825195312, -0.06658935546875, 0.00476837158203125, 0.0126800537109375, -0.0113677978515625, -0.0303192138671875, 0.04510498046875, 0.003032684326171875, -0.016448974609375, 0.087890625, -0.017791748046875, -0.051544189453125, 0.06097412109375, 0.03564453125, 0.055145263671875, -0.01548004150390625, 0.0228118896484375, 0.0665283203125, 0.02349853515625, -0.01448822021484375, 0.006824493408203125, 0.0069427490234375, -0.040374755859375, -0.006885528564453125, -0.060546875, -0.0197906494140625, 0.01629638671875, -0.047332763671875, 0.033233642578125, -0.046142578125, -0.0060577392578125, -0.0015125274658203125, 0.0176849365234375, -0.04486083984375, 0.0262451171875, 0.01194000244140625, 0.05596923828125, -0.0677490234375, 0.06463623046875, 0.049774169921875, -0.054046630859375, -0.08563232421875, 0.0021686553955078125, 0.0028438568115234375, -0.034393310546875, 0.00988006591796875, 0.0159759521484375, 0.0166473388671875, 0.0150299072265625, -0.0208282470703125, -0.065185546875, 0.0992431640625, 0.0187225341796875, -0.050048828125, -0.0188140869140625, -0.0078887939453125, 0.041534423828125, 0.00412750244140625, 0.053436279296875, 0.05255126953125, 0.03155517578125, 0.006191253662109375, -0.0802001953125, 0.02862548828125, -0.0264129638671875, -0.005290985107421875, 0.0186614990234375, -0.053558349609375, 0.09783935546875, -0.007358551025390625, -0.001995086669921875, 0.033172607421875, 0.042633056640625, 0.0343017578125, -0.00856781005859375, 0.026336669921875, 0.059295654296875, 0.06683349609375, -0.029052734375, 0.0963134765625, -0.024139404296875, 0.058990478515625, 0.06500244140625, 0.013275146484375, 0.04010009765625, 0.031280517578125, -0.028289794921875, 0.039154052734375, 0.0650634765625, -0.006649017333984375, 0.01406097412109375, 0.022003173828125, -0.024169921875, -0.0189971923828125, 0.0081939697265625, -0.045806884765625, 0.01323699951171875, 0.0106658935546875, 
-0.042816162109375, -0.0156707763671875, -0.026153564453125, 0.0272979736328125, -0.032073974609375, -0.016876220703125, 0.021392822265625, 0.00870513916015625, -0.04962158203125, 0.047760009765625, 0.0214080810546875, 0.042816162109375, -0.0330810546875, 0.01390838623046875, -0.01187896728515625, 0.02520751953125, -0.027008056640625, -0.030303955078125, 0.007293701171875, 0.0012216567993164062, 0.004299163818359375, 0.0091705322265625, 0.029388427734375, -0.0106201171875, -0.0413818359375, 0.0151824951171875, 0.03692626953125, 0.0203094482421875, -0.033935546875, -0.050262451171875, 0.00797271728515625, -0.01200103759765625, -0.04132080078125, 0.0309906005859375, 0.0191650390625, -0.01090240478515625, 0.044219970703125, 0.049224853515625, 0.0035266876220703125, 0.0009016990661621094, 0.00946044921875, 0.074462890625, -0.03759765625, -0.0352783203125, -0.0699462890625, 0.037109375, 0.001865386962890625, -0.051422119140625, 0.0657958984375, 0.043609619140625, 0.053497314453125, 0.020294189453125, 0.045013427734375, -0.0313720703125, -0.0017795562744140625, -0.0206756591796875, 0.051849365234375, -0.03656005859375, 0.0024089813232421875, -0.0355224609375, -0.085205078125, -0.005229949951171875, 0.0675048828125, -0.0401611328125, 0.03131103515625, 0.06005859375, 0.060791015625, -0.00516510009765625, 0.0074615478515625, 0.00420379638671875, 0.02227783203125, 0.039337158203125, 0.0716552734375, 0.0677490234375, -0.051361083984375, 0.041839599609375, -0.037994384765625, -0.0191650390625, -0.0111083984375, -0.035858154296875, -0.061981201171875, -0.03387451171875, -0.035919189453125, -0.055145263671875, -0.0004088878631591797, 0.0643310546875, 0.059326171875, -0.04681396484375, -0.01320648193359375, -0.03924560546875, 0.0028896331787109375, -0.0161590576171875, -0.017486572265625, 0.03253173828125, 0.0120086669921875, -0.07275390625, -0.0029144287109375, -0.01187896728515625, 0.00765228271484375, -0.031341552734375, -0.0226898193359375, -0.01509857177734375, 
-0.007320404052734375, 0.0018892288208007812, 0.0256195068359375, -0.03887939453125, -0.0174407958984375, 0.0033016204833984375, 0.0033130645751953125, 0.0000407099723815918, 0.053314208984375, -0.043426513671875, 0.01166534423828125, 0.04571533203125, 0.007198333740234375, 0.06207275390625, -0.0196075439453125, 0.033203125, -0.0198974609375, 0.02642822265625, 0.0201873779296875, 0.04998779296875, 0.0264434814453125, -0.0182342529296875, 0.01160430908203125, 0.029754638671875, -0.0555419921875, -0.06573486328125, 0.0254058837890625, -0.05291748046875, -0.005706787109375, 0.0943603515625, -0.0218353271484375, -0.03033447265625, 0.0035839080810546875, -0.01486968994140625, 0.041473388671875, -0.0216522216796875, 0.05023193359375, 0.04876708984375, 0.003955841064453125, -0.0164794921875, -0.048095703125, 0.027862548828125, 0.05218505859375, -0.06353759765625, 0.02642822265625, 0.043243408203125, 0.046112060546875, 0.0182037353515625, 0.041656494140625, -0.022064208984375, 0.046905517578125, 0.004886627197265625, 0.00981903076171875, 0.0017986297607421875, -0.03302001953125, -0.034820556640625, -0.0088653564453125, 0.017303466796875, 0.0006508827209472656 ] ]
hyunseoki/ko-ref-llama2-13b
2023-10-04T07:03:44.000Z
[ "transformers", "pytorch", "llama", "text-generation", "ko", "endpoints_compatible", "text-generation-inference", "region:us" ]
text-generation
hyunseoki
null
null
hyunseoki/ko-ref-llama2-13b
1
9,082
transformers
2023-10-04T06:38:18
---
language:
- ko
library_name: transformers
pipeline_tag: text-generation
---

**Model Developers** HyunseokLee, TaeyoungKim - (kaist alinlab, omnious.ai)

**Input** Models input text only.

**Output** Models generate text only.

**Model Architecture** ko-ref-llama2-13b is an auto-regressive language model based on the LLaMA2 transformer architecture.

**Base Model** Llama-2-13B

**Training Dataset** Open dataset (Korean).

**Training Objective** We trained the model to learn a Korean corpus.
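The card does not include usage code; since the model follows the standard transformers causal-LM interface, a hedged sketch (prompt and generation settings are our choice, and downloading the 13B weights requires substantial disk and memory):

```python
# Untested sketch: load ko-ref-llama2-13b via the standard transformers
# causal-LM API and continue a Korean prompt.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "hyunseoki/ko-ref-llama2-13b"

def generate_korean(prompt: str, max_new_tokens: int = 64) -> str:
    """Download the model (≈26 GB of weights) and continue a Korean prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

# Example call (not run here):
# print(generate_korean("대한민국의 수도는"))
```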
505
[ [ 0.003978729248046875, -0.05535888671875, 0.034759521484375, 0.03558349609375, -0.0179443359375, 0.0175933837890625, -0.0010786056518554688, -0.02911376953125, 0.00916290283203125, 0.051513671875, -0.0396728515625, -0.035614013671875, -0.040283203125, 0.0104827880859375, -0.0265350341796875, 0.06719970703125, -0.0167999267578125, 0.020172119140625, 0.00722503662109375, 0.00823974609375, -0.0440673828125, -0.0270843505859375, -0.043182373046875, -0.030059814453125, 0.01605224609375, 0.056854248046875, 0.050811767578125, 0.06671142578125, 0.0340576171875, 0.022705078125, 0.01068878173828125, -0.01471710205078125, -0.048919677734375, 0.018707275390625, -0.002147674560546875, -0.05377197265625, -0.031463623046875, -0.0161590576171875, 0.03692626953125, 0.011444091796875, -0.013458251953125, 0.03424072265625, -0.0188140869140625, 0.038848876953125, -0.0272064208984375, 0.03167724609375, -0.05023193359375, -0.0134735107421875, -0.01508331298828125, 0.0303497314453125, -0.036224365234375, -0.032440185546875, -0.015472412109375, -0.0298614501953125, -0.005649566650390625, 0.0067291259765625, 0.08111572265625, 0.038177490234375, -0.03875732421875, -0.0328369140625, -0.049468994140625, 0.047119140625, -0.05010986328125, 0.03839111328125, 0.047088623046875, 0.0218963623046875, -0.0111083984375, -0.05706787109375, -0.03436279296875, -0.0206146240234375, -0.00012946128845214844, 0.00905609130859375, -0.004825592041015625, -0.01457977294921875, -0.01153564453125, 0.0209197998046875, -0.037200927734375, 0.03887939453125, -0.0562744140625, -0.01078033447265625, 0.06036376953125, 0.01230621337890625, 0.0169525146484375, -0.023956298828125, -0.0221710205078125, -0.01021575927734375, -0.043548583984375, -0.0137481689453125, 0.031768798828125, 0.015350341796875, -0.0221405029296875, 0.068603515625, -0.018280029296875, 0.026092529296875, 0.0190277099609375, -0.030303955078125, 0.02960205078125, -0.01125335693359375, -0.03173828125, 0.01035308837890625, 0.06353759765625, 
0.00727081298828125, 0.0123138427734375, -0.017425537109375, -0.0164947509765625, 0.0189208984375, 0.0158843994140625, -0.034637451171875, -0.0163116455078125, 0.010772705078125, -0.0595703125, -0.055908203125, -0.0099639892578125, -0.06060791015625, -0.0140838623046875, -0.00640106201171875, 0.025726318359375, -0.0098724365234375, -0.039581298828125, -0.0006461143493652344, 0.01499176025390625, 0.0117645263671875, -0.006717681884765625, -0.050018310546875, 0.031646728515625, 0.01462554931640625, 0.035797119140625, -0.008056640625, -0.0251312255859375, 0.005878448486328125, -0.0019044876098632812, -0.03546142578125, 0.058441162109375, -0.023651123046875, -0.03497314453125, 0.004467010498046875, 0.0156402587890625, -0.007663726806640625, -0.0457763671875, 0.050872802734375, -0.053131103515625, 0.0206756591796875, 0.015655517578125, -0.03485107421875, -0.0282135009765625, -0.012603759765625, -0.06329345703125, 0.10833740234375, 0.03216552734375, -0.02392578125, 0.002040863037109375, -0.054718017578125, -0.01486968994140625, -0.00952911376953125, 0.0051727294921875, -0.0303802490234375, 0.01470184326171875, -0.0146942138671875, 0.0211944580078125, -0.032012939453125, 0.047637939453125, -0.0009207725524902344, 0.002773284912109375, 0.004970550537109375, -0.009918212890625, 0.0462646484375, 0.032440185546875, -0.0134429931640625, 0.00579833984375, -0.10272216796875, -0.008758544921875, 0.050445556640625, -0.04034423828125, -0.010772705078125, -0.00235748291015625, 0.0211181640625, 0.026611328125, 0.029388427734375, -0.0438232421875, 0.021270751953125, -0.0184326171875, 0.01502227783203125, 0.0487060546875, -0.0167236328125, 0.047515869140625, -0.01253509521484375, 0.0574951171875, 0.004161834716796875, -0.01788330078125, -0.0138702392578125, -0.03240966796875, -0.07891845703125, -0.0117645263671875, 0.0272216796875, 0.07171630859375, -0.04266357421875, 0.0261688232421875, 0.00096893310546875, -0.06378173828125, -0.057525634765625, 0.0174407958984375, 0.03985595703125, 
0.037628173828125, 0.01126861572265625, 0.01282501220703125, -0.0673828125, -0.0799560546875, 0.0108184814453125, -0.038726806640625, 0.0072174072265625, 0.0131378173828125, 0.043853759765625, -0.033111572265625, 0.046478271484375, -0.03057861328125, -0.007080078125, -0.034271240234375, 0.0037479400634765625, 0.0164642333984375, 0.0301666259765625, 0.03521728515625, -0.03839111328125, -0.0406494140625, -0.00785064697265625, -0.05303955078125, -0.05096435546875, -0.0098419189453125, -0.022735595703125, 0.033416748046875, 0.04931640625, -0.056243896484375, 0.037353515625, 0.05755615234375, -0.034454345703125, 0.03472900390625, 0.01348876953125, -0.0195770263671875, -0.10455322265625, -0.0018358230590820312, -0.0244293212890625, -0.0158843994140625, -0.04193115234375, 0.005893707275390625, 0.00836944580078125, 0.004322052001953125, -0.0189208984375, 0.05169677734375, -0.0213775634765625, -0.00287628173828125, -0.0352783203125, -0.0011568069458007812, -0.00586700439453125, 0.0286865234375, 0.01336669921875, 0.053680419921875, 0.01806640625, -0.052886962890625, 0.0276031494140625, 0.04718017578125, -0.0274505615234375, 0.008544921875, -0.059051513671875, 0.016937255859375, -0.00789642333984375, 0.009002685546875, -0.09368896484375, -0.037567138671875, 0.0316162109375, -0.036163330078125, 0.0163726806640625, 0.0127410888671875, -0.046112060546875, -0.0435791015625, -0.0012674331665039062, 0.0192108154296875, 0.0455322265625, -0.0372314453125, 0.04681396484375, 0.020599365234375, -0.0183868408203125, -0.026763916015625, -0.05865478515625, -0.00579071044921875, -0.0007395744323730469, -0.034942626953125, 0.020599365234375, -0.0008420944213867188, -0.0214691162109375, 0.005130767822265625, 0.0190277099609375, -0.0224609375, 0.00783538818359375, 0.02777099609375, 0.046630859375, -0.034881591796875, 0.01300048828125, 0.0170745849609375, -0.00609588623046875, -0.0006041526794433594, 0.0196533203125, 0.06292724609375, 0.007476806640625, -0.0119781494140625, -0.04449462890625, 
0.006927490234375, 0.0279388427734375, 0.00647735595703125, 0.045257568359375, 0.031768798828125, -0.0162506103515625, 0.00986480712890625, -0.038604736328125, 0.01373291015625, -0.035675048828125, 0.06402587890625, -0.0450439453125, -0.051422119140625, 0.04351806640625, -0.01349639892578125, -0.002246856689453125, 0.06207275390625, 0.08135986328125, 0.014801025390625, 0.062225341796875, 0.060821533203125, -0.0139923095703125, 0.01355743408203125, 0.00800323486328125, 0.01190185546875, -0.0557861328125, -0.0396728515625, -0.04736328125, -0.01332855224609375, -0.04217529296875, -0.01256561279296875, -0.0230560302734375, 0.027099609375, -0.04339599609375, 0.04962158203125, -0.0302276611328125, 0.02398681640625, 0.051513671875, -0.01617431640625, 0.0098876953125, -0.01042938232421875, -0.00923919677734375, 0.0142059326171875, -0.04925537109375, -0.05999755859375, 0.0941162109375, 0.048919677734375, 0.07440185546875, 0.00769805908203125, 0.04571533203125, 0.00795745849609375, 0.018218994140625, -0.05438232421875, 0.032989501953125, 0.00800323486328125, -0.055084228515625, -0.004405975341796875, -0.001979827880859375, -0.06256103515625, -0.00567626953125, 0.007488250732421875, -0.0496826171875, 0.01239013671875, 0.00868988037109375, -0.003971099853515625, 0.01314544677734375, -0.05133056640625, 0.04681396484375, -0.0268707275390625, 0.0166473388671875, 0.01161956787109375, -0.039031982421875, 0.037811279296875, -0.0263671875, 0.006893157958984375, -0.006374359130859375, -0.01114654541015625, 0.0460205078125, -0.0013675689697265625, 0.059844970703125, -0.00714111328125, -0.0172119140625, 0.0323486328125, 0.027587890625, 0.036651611328125, -0.00228118896484375, 0.003082275390625, 0.017730712890625, -0.0179443359375, -0.005138397216796875, -0.01412200927734375, 0.03802490234375, -0.0797119140625, -0.03204345703125, -0.019927978515625, -0.035552978515625, -0.006778717041015625, 0.019683837890625, 0.0254974365234375, -0.021514892578125, -0.0234375, 0.0003066062927246094, 
0.020355224609375, -0.024810791015625, 0.0333251953125, 0.053802490234375, -0.045318603515625, -0.05596923828125, 0.04541015625, -0.005096435546875, 0.0209503173828125, 0.0145721435546875, 0.005657196044921875, -0.0260009765625, -0.0146636962890625, -0.04644775390625, 0.030731201171875, -0.05023193359375, -0.040496826171875, -0.043701171875, -0.040985107421875, -0.0478515625, 0.00832366943359375, -0.049102783203125, -0.040283203125, -0.04217529296875, -0.028411865234375, 0.0163421630859375, 0.08660888671875, -0.00792694091796875, 0.061737060546875, -0.049957275390625, 0.02984619140625, 0.0079193115234375, 0.0216827392578125, -0.006587982177734375, -0.05462646484375, -0.014007568359375, -0.004375457763671875, -0.0302276611328125, -0.059967041015625, 0.048919677734375, 0.019439697265625, 0.038604736328125, 0.0244903564453125, -0.01739501953125, 0.031524658203125, -0.05035400390625, 0.06561279296875, 0.0028209686279296875, -0.060546875, 0.03790283203125, -0.00791168212890625, 0.0283355712890625, 0.0267333984375, 0.0268402099609375, -0.02557373046875, -0.0155487060546875, -0.0438232421875, -0.04443359375, 0.06671142578125, 0.0141448974609375, 0.0179595947265625, -0.007232666015625, 0.03472900390625, 0.02227783203125, 0.019317626953125, -0.0780029296875, -0.0208740234375, -0.03192138671875, -0.052337646484375, 0.00994873046875, -0.054656982421875, 0.0185699462890625, -0.0006656646728515625, 0.060272216796875, 0.0075225830078125, 0.04620361328125, 0.0031986236572265625, -0.012054443359375, -0.0278472900390625, -0.003875732421875, 0.052001953125, 0.00917816162109375, -0.0084075927734375, -0.013153076171875, -0.0001423358917236328, -0.051849365234375, 0.02508544921875, -0.006320953369140625, -0.034271240234375, 0.009063720703125, 0.0238037109375, 0.08251953125, 0.005542755126953125, -0.046905517578125, -0.00025725364685058594, 0.01511383056640625, -0.0267181396484375, -0.040557861328125, -0.0035381317138671875, 0.00986480712890625, 0.0273895263671875, 0.01258087158203125, 
-0.00426483154296875, -0.00235748291015625, -0.01702880859375, 0.001964569091796875, -0.00278472900390625, -0.005878448486328125, -0.041046142578125, 0.06719970703125, 0.0020542144775390625, -0.034881591796875, 0.04754638671875, -0.005413055419921875, -0.0423583984375, 0.06005859375, 0.08868408203125, 0.0640869140625, -0.04833984375, 0.022796630859375, 0.038330078125, 0.0213775634765625, -0.01702880859375, 0.045684814453125, 0.02862548828125, -0.0634765625, -0.020599365234375, -0.0489501953125, -0.00821685791015625, 0.0306854248046875, -0.053314208984375, 0.03759765625, -0.02557373046875, -0.011138916015625, -0.0023345947265625, -0.0131378173828125, -0.039947509765625, 0.0172119140625, 0.0157928466796875, 0.07379150390625, -0.059661865234375, 0.07562255859375, 0.05609130859375, -0.039520263671875, -0.053680419921875, -0.024169921875, -0.004619598388671875, -0.10028076171875, 0.0970458984375, -0.0027904510498046875, 0.01317596435546875, 0.0091552734375, -0.06524658203125, -0.10552978515625, 0.080322265625, 0.01024627685546875, -0.039093017578125, -0.01117706298828125, 0.0142974853515625, 0.027008056640625, -0.048004150390625, -0.003887176513671875, 0.03582763671875, 0.0311126708984375, -0.002597808837890625, -0.08184814453125, -0.0220489501953125, -0.0251312255859375, 0.0147552490234375, -0.00634002685546875, -0.06903076171875, 0.06634521484375, -0.0009312629699707031, -0.00640106201171875, 0.047576904296875, 0.054931640625, 0.018280029296875, 0.0034465789794921875, 0.0277862548828125, 0.051849365234375, 0.0445556640625, 0.0113677978515625, 0.0557861328125, -0.034698486328125, 0.02197265625, 0.10894775390625, -0.03265380859375, 0.07232666015625, 0.01316070556640625, -0.01476287841796875, 0.0531005859375, 0.064453125, -0.02508544921875, 0.055816650390625, 0.00968170166015625, 0.0202178955078125, 0.0031490325927734375, -0.006557464599609375, -0.020172119140625, 0.06719970703125, 0.0193328857421875, -0.050994873046875, 0.00029730796813964844, 0.0005059242248535156, 
0.03131103515625, -0.0020236968994140625, -0.0168914794921875, 0.058074951171875, -0.00994873046875, -0.03546142578125, 0.0283203125, 0.0303802490234375, 0.0341796875, -0.045135498046875, -0.002696990966796875, -0.01345062255859375, 0.00919342041015625, -0.001674652099609375, -0.035675048828125, 0.024383544921875, 0.00690460205078125, -0.031982421875, 0.01125335693359375, 0.0770263671875, -0.037872314453125, -0.0576171875, 0.0242919921875, 0.0308837890625, 0.005279541015625, 0.01654052734375, -0.05474853515625, 0.0185699462890625, 0.00394439697265625, -0.03265380859375, 0.00922393798828125, 0.022979736328125, -0.01396942138671875, 0.057098388671875, 0.0304107666015625, -0.0010051727294921875, 0.0182037353515625, -0.0028209686279296875, 0.062347412109375, -0.037567138671875, -0.0123138427734375, -0.05718994140625, 0.02508544921875, 0.00335693359375, -0.033111572265625, 0.050567626953125, 0.048492431640625, 0.0635986328125, -0.0261993408203125, 0.0562744140625, -0.01227569580078125, 0.056915283203125, -0.0228271484375, 0.044219970703125, -0.0190277099609375, -0.01214599609375, -0.0053863525390625, -0.06927490234375, -0.01200103759765625, 0.0628662109375, 0.00823211669921875, -0.00957489013671875, 0.041229248046875, 0.046875, 0.015380859375, -0.0084228515625, 0.0267333984375, 0.04327392578125, -0.0018854141235351562, 0.0232696533203125, 0.04876708984375, -0.0567626953125, 0.03399658203125, -0.0288848876953125, -0.01146697998046875, -0.01116943359375, -0.048431396484375, -0.0743408203125, -0.0372314453125, -0.012908935546875, -0.043487548828125, -0.0108184814453125, 0.0711669921875, 0.0237579345703125, -0.06317138671875, -0.032440185546875, -0.0081024169921875, -0.01222991943359375, 0.0194091796875, -0.0172882080078125, 0.034332275390625, -0.0280303955078125, -0.04931640625, 0.03240966796875, -0.004711151123046875, 0.0187835693359375, -0.0347900390625, -0.01061248779296875, -0.017364501953125, 0.011932373046875, 0.03790283203125, 0.01308441162109375, -0.06109619140625, 
0.006420135498046875, 0.0261383056640625, -0.0158843994140625, 0.0107879638671875, 0.0271148681640625, -0.0533447265625, 0.0131378173828125, 0.0194244384765625, 0.0439453125, 0.01451873779296875, 0.00016200542449951172, 0.048980712890625, -0.0259552001953125, 0.0260009765625, -0.0021076202392578125, 0.01465606689453125, 0.00980377197265625, -0.039947509765625, 0.056243896484375, 0.01800537109375, -0.05303955078125, -0.05523681640625, 0.01142120361328125, -0.060272216796875, -0.0119171142578125, 0.10357666015625, 0.00311279296875, -0.034149169921875, -0.006488800048828125, -0.046478271484375, 0.030548095703125, -0.027618408203125, 0.055694580078125, 0.058868408203125, 0.003047943115234375, -0.0183868408203125, -0.042327880859375, 0.0269622802734375, 0.004268646240234375, -0.045684814453125, -0.01727294921875, 0.0264892578125, 0.020111083984375, 0.01300048828125, 0.042327880859375, -0.0089263916015625, 0.01021575927734375, 0.01485443115234375, 0.0299224853515625, -0.003910064697265625, -0.025177001953125, -0.0143280029296875, -0.0170745849609375, -0.0006022453308105469, -0.0242156982421875 ] ]
liuhaotian/llava-v1-0719-336px-lora-merge-vicuna-13b-v1.3
2023-07-19T08:54:01.000Z
[ "transformers", "pytorch", "llava", "text-generation", "region:us" ]
text-generation
liuhaotian
null
null
liuhaotian/llava-v1-0719-336px-lora-merge-vicuna-13b-v1.3
8
9,078
transformers
2023-07-19T08:39:56
--- inference: false --- <br> <br> # LLaVA Model Card ## Model details **Model type:** LLaVA is an open-source chatbot trained by fine-tuning LLaMA/Vicuna on GPT-generated multimodal instruction-following data. It is an auto-regressive language model, based on the transformer architecture. **Model date:** LLaVA-v1-0719-336px-LoRA-Vicuna-13B-v1.3 was trained in July 2023. **Paper or resources for more information:** https://llava-vl.github.io/ ## License Non-commercial use. **Where to send questions or comments about the model:** https://github.com/haotian-liu/LLaVA/issues ## Intended use **Primary intended uses:** The primary use of LLaVA is research on large multimodal models and chatbots. **Primary intended users:** The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence. ## Training dataset - 558K filtered image-text pairs from LAION/CC/SBU, captioned by BLIP. - 80K GPT-generated multimodal instruction-following data. ## Evaluation dataset A preliminary evaluation of the model quality is conducted by creating a set of 90 visual reasoning questions from 30 unique images randomly sampled from COCO val 2014, each associated with three types of questions: conversational, detailed description, and complex reasoning. We utilize GPT-4 to judge the model outputs. We also evaluate our model on the ScienceQA dataset. Our synergy with GPT-4 sets a new state-of-the-art on the dataset. See https://llava-vl.github.io/ for more details.
1,576
[ [ -0.00701141357421875, -0.0806884765625, 0.0343017578125, 0.0179901123046875, -0.0310211181640625, 0.00958251953125, 0.0016794204711914062, -0.039520263671875, 0.015228271484375, 0.044677734375, -0.036773681640625, -0.0447998046875, -0.03375244140625, -0.00347900390625, -0.034881591796875, 0.066162109375, 0.0015401840209960938, -0.0015974044799804688, -0.0273284912109375, 0.01091766357421875, -0.060699462890625, -0.0299530029296875, -0.04852294921875, -0.01424407958984375, 0.044769287109375, 0.036163330078125, 0.043060302734375, 0.024139404296875, 0.036224365234375, 0.0269317626953125, 0.0037059783935546875, 0.0194091796875, -0.043670654296875, 0.0026721954345703125, 0.0108795166015625, -0.0556640625, -0.05072021484375, -0.0185089111328125, 0.028961181640625, -0.01256561279296875, -0.015960693359375, 0.023590087890625, -0.004024505615234375, 0.0224609375, -0.018341064453125, 0.043670654296875, -0.064208984375, -0.0168914794921875, -0.0166473388671875, -0.0191802978515625, -0.030120849609375, -0.025421142578125, -0.02557373046875, -0.042694091796875, -0.006771087646484375, 0.0082550048828125, 0.08514404296875, 0.0377197265625, -0.0279541015625, -0.0143585205078125, -0.054595947265625, 0.044769287109375, -0.044525146484375, 0.019805908203125, 0.035919189453125, 0.056884765625, -0.009429931640625, -0.04779052734375, -0.0572509765625, -0.0203399658203125, 0.007564544677734375, 0.015960693359375, -0.031463623046875, -0.0004506111145019531, 0.01021575927734375, 0.026397705078125, -0.0374755859375, 0.0073394775390625, -0.04656982421875, -0.006420135498046875, 0.040374755859375, 0.020294189453125, 0.01678466796875, -0.018310546875, -0.0297698974609375, -0.0023860931396484375, -0.034088134765625, 0.0010776519775390625, 0.038604736328125, 0.01219940185546875, -0.023193359375, 0.059478759765625, -0.01548004150390625, 0.03497314453125, -0.0028743743896484375, -0.0279541015625, 0.024993896484375, -0.018341064453125, -0.04052734375, -0.027740478515625, 0.07635498046875, 
0.0249786376953125, 0.0157012939453125, 0.0197906494140625, -0.01470947265625, 0.01444244384765625, 0.01800537109375, -0.03619384765625, -0.00928497314453125, 0.016693115234375, -0.0207977294921875, -0.040496826171875, -0.043121337890625, -0.0479736328125, -0.0226898193359375, -0.0174560546875, 0.0166778564453125, -0.0322265625, -0.020233154296875, -0.01331329345703125, 0.0255889892578125, 0.04388427734375, 0.04119873046875, -0.0628662109375, 0.006366729736328125, 0.03411865234375, 0.04730224609375, -0.0035190582275390625, -0.012451171875, 0.00437164306640625, -0.009979248046875, -0.0138702392578125, 0.08563232421875, -0.045654296875, -0.0215606689453125, -0.0048980712890625, 0.009918212890625, -0.00009584426879882812, -0.018829345703125, 0.053924560546875, -0.04754638671875, 0.0209503173828125, -0.00533294677734375, -0.03240966796875, -0.01551055908203125, 0.03387451171875, -0.0474853515625, 0.07720947265625, -0.0021991729736328125, -0.052642822265625, 0.006317138671875, -0.042938232421875, 0.0018587112426757812, 0.01416015625, -0.0213623046875, -0.0249786376953125, -0.00972747802734375, 0.0287322998046875, 0.01849365234375, -0.04656982421875, 0.038604736328125, -0.011444091796875, -0.02313232421875, 0.0176544189453125, -0.06134033203125, 0.06024169921875, 0.024871826171875, -0.0016450881958007812, 0.021575927734375, -0.05926513671875, -0.012359619140625, 0.0289764404296875, -0.0213623046875, -0.002399444580078125, -0.015960693359375, -0.0022869110107421875, 0.0047607421875, 0.042388916015625, -0.033294677734375, 0.035369873046875, -0.004917144775390625, 0.0159912109375, 0.06268310546875, -0.00589752197265625, 0.009368896484375, -0.0194549560546875, 0.05535888671875, -0.00371551513671875, 0.052276611328125, -0.0115966796875, -0.06976318359375, -0.0718994140625, -0.024627685546875, 0.0036106109619140625, 0.07275390625, -0.0654296875, 0.0181732177734375, -0.01702880859375, -0.050048828125, -0.0535888671875, 0.0186309814453125, 0.02508544921875, 0.040802001953125, 
0.0190277099609375, -0.01506805419921875, -0.04180908203125, -0.0814208984375, 0.01016998291015625, -0.033538818359375, 0.00934600830078125, 0.0309295654296875, 0.0386962890625, -0.0302581787109375, 0.06298828125, -0.0270843505859375, -0.0220489501953125, -0.02459716796875, -0.01229095458984375, 0.0181121826171875, 0.014129638671875, 0.0304412841796875, -0.049468994140625, -0.03466796875, 0.0017824172973632812, -0.0775146484375, -0.0020389556884765625, -0.00698089599609375, -0.0283966064453125, 0.0210418701171875, 0.022247314453125, -0.0509033203125, 0.04766845703125, 0.06304931640625, -0.01202392578125, 0.03900146484375, -0.0016994476318359375, 0.01473236083984375, -0.08880615234375, -0.00806427001953125, -0.00867462158203125, -0.01149749755859375, -0.03448486328125, -0.0048675537109375, -0.0100860595703125, -0.00070953369140625, -0.050201416015625, 0.04730224609375, -0.01506805419921875, 0.00244140625, -0.0249481201171875, 0.004222869873046875, 0.0046234130859375, 0.058319091796875, -0.00913238525390625, 0.0665283203125, 0.0382080078125, -0.0330810546875, 0.0543212890625, 0.034210205078125, -0.0197906494140625, 0.041015625, -0.06878662109375, 0.01806640625, -0.0038585662841796875, 0.01221466064453125, -0.0850830078125, -0.022979736328125, 0.044097900390625, -0.0445556640625, 0.01457977294921875, -0.01259613037109375, -0.052490234375, -0.0272979736328125, 0.00142669677734375, 0.026153564453125, 0.05133056640625, -0.0361328125, 0.05474853515625, 0.03277587890625, 0.0099029541015625, -0.04888916015625, -0.05517578125, 0.004146575927734375, -0.020751953125, -0.04730224609375, -0.0028171539306640625, -0.0198822021484375, -0.00989532470703125, -0.0080108642578125, 0.0190887451171875, -0.019134521484375, -0.01171112060546875, 0.024627685546875, 0.038360595703125, -0.00209808349609375, 0.0145111083984375, 0.000850677490234375, 0.0007834434509277344, -0.00769805908203125, 0.0115814208984375, 0.04925537109375, -0.0275421142578125, -0.0289764404296875, -0.06097412109375, 
0.0003771781921386719, 0.0296783447265625, -0.002262115478515625, 0.04498291015625, 0.04595947265625, -0.00830078125, 0.02252197265625, -0.057952880859375, 0.0006613731384277344, -0.039154052734375, 0.0264892578125, -0.03668212890625, -0.052154541015625, 0.04473876953125, 0.012847900390625, 0.0210723876953125, 0.0297698974609375, 0.0582275390625, -0.017791748046875, 0.05633544921875, 0.053863525390625, -0.00524139404296875, 0.05157470703125, -0.01641845703125, 0.0040435791015625, -0.05645751953125, -0.032318115234375, -0.0114898681640625, -0.002338409423828125, -0.053497314453125, -0.051300048828125, 0.0035839080810546875, -0.01329803466796875, -0.0207977294921875, 0.03192138671875, -0.0382080078125, 0.036865234375, 0.04620361328125, 0.00951385498046875, 0.0293121337890625, 0.00804901123046875, 0.007434844970703125, 0.0115966796875, -0.041473388671875, -0.05535888671875, 0.08953857421875, 0.04400634765625, 0.07354736328125, 0.0014896392822265625, 0.04791259765625, 0.0208282470703125, 0.0263519287109375, -0.04864501953125, 0.052337646484375, 0.01346588134765625, -0.05474853515625, -0.0220489501953125, -0.019378662109375, -0.07574462890625, 0.01129913330078125, -0.00965118408203125, -0.04754638671875, -0.002338409423828125, 0.0266265869140625, 0.006748199462890625, 0.02911376953125, -0.06109619140625, 0.0565185546875, -0.04010009765625, -0.0205078125, -0.0028667449951171875, -0.0309295654296875, 0.055511474609375, 0.001758575439453125, 0.01348876953125, -0.019195556640625, 0.01064300537109375, 0.031951904296875, -0.01189422607421875, 0.10186767578125, -0.0015048980712890625, -0.0233917236328125, 0.04888916015625, 0.00395965576171875, 0.035186767578125, 0.005268096923828125, 0.0186920166015625, 0.03436279296875, -0.0081329345703125, -0.0364990234375, -0.0435791015625, 0.053924560546875, -0.0882568359375, -0.049591064453125, -0.02471923828125, -0.034271240234375, 0.00208282470703125, 0.006244659423828125, 0.018280029296875, 0.0118865966796875, -0.0104827880859375, 
-0.005550384521484375, 0.038360595703125, -0.036224365234375, 0.007720947265625, 0.019073486328125, -0.026153564453125, -0.03643798828125, 0.06500244140625, -0.0189056396484375, 0.01568603515625, 0.03717041015625, 0.0014295578002929688, -0.0178375244140625, -0.0185546875, -0.02679443359375, 0.034423828125, -0.06854248046875, -0.0241241455078125, -0.03668212890625, -0.0266265869140625, -0.0323486328125, 0.0175323486328125, -0.039581298828125, -0.0191802978515625, -0.034027099609375, -0.005462646484375, 0.04559326171875, 0.05572509765625, 0.02679443359375, 0.036041259765625, -0.03173828125, 0.0251617431640625, 0.03936767578125, 0.022979736328125, -0.01806640625, -0.056243896484375, 0.004665374755859375, 0.0024738311767578125, -0.03875732421875, -0.060760498046875, 0.041412353515625, 0.0036716461181640625, 0.04925537109375, 0.00998687744140625, -0.01678466796875, 0.051239013671875, -0.028533935546875, 0.06365966796875, 0.0153961181640625, -0.04766845703125, 0.0560302734375, -0.0222320556640625, 0.02716064453125, 0.030975341796875, 0.0177154541015625, -0.024261474609375, -0.03546142578125, -0.045318603515625, -0.051116943359375, 0.0288848876953125, 0.02093505859375, 0.0287933349609375, -0.0084228515625, 0.018951416015625, 0.01015472412109375, 0.01497650146484375, -0.0858154296875, -0.02532958984375, -0.03533935546875, -0.0218963623046875, 0.01203155517578125, -0.03814697265625, -0.007320404052734375, -0.018463134765625, 0.04559326171875, -0.01418304443359375, 0.051300048828125, -0.01099395751953125, -0.01025390625, 0.00914764404296875, 0.0194549560546875, 0.058502197265625, 0.0318603515625, -0.01520538330078125, -0.018951416015625, 0.0186920166015625, -0.0418701171875, -0.0021305084228515625, -0.01123046875, -0.0120391845703125, -0.01151275634765625, 0.02862548828125, 0.08673095703125, 0.00276947021484375, -0.046783447265625, 0.031951904296875, -0.033172607421875, -0.0225677490234375, -0.05657958984375, 0.0061798095703125, 0.003147125244140625, 0.03741455078125, 
0.004848480224609375, -0.007633209228515625, 0.000682830810546875, -0.033050537109375, -0.01126861572265625, 0.0219573974609375, -0.01776123046875, -0.0237579345703125, 0.054718017578125, 0.0199737548828125, -0.029022216796875, 0.052642822265625, -0.003528594970703125, -0.01464080810546875, 0.034332275390625, 0.0325927734375, 0.055877685546875, -0.0122222900390625, 0.020172119140625, 0.03759765625, 0.02764892578125, 0.01078033447265625, 0.03125, -0.005748748779296875, -0.05084228515625, -0.0293426513671875, -0.04290771484375, -0.039093017578125, 0.0170135498046875, -0.0300750732421875, 0.03790283203125, -0.028533935546875, -0.020843505859375, -0.02508544921875, -0.0032062530517578125, -0.06854248046875, 0.003444671630859375, 0.0191497802734375, 0.0616455078125, -0.0625, 0.08544921875, 0.027191162109375, -0.056488037109375, -0.04998779296875, -0.0268402099609375, -0.0009322166442871094, -0.0994873046875, 0.060272216796875, -0.00592041015625, -0.0017986297607421875, -0.0198974609375, -0.07000732421875, -0.07958984375, 0.10577392578125, 0.029510498046875, -0.06317138671875, -0.01407623291015625, 0.006317138671875, 0.050323486328125, -0.0275726318359375, 0.044464111328125, 0.034698486328125, 0.025665283203125, 0.03668212890625, -0.0775146484375, -0.01397705078125, -0.034210205078125, 0.002429962158203125, -0.02227783203125, -0.0709228515625, 0.05670166015625, -0.00849151611328125, -0.007411956787109375, 0.0099639892578125, 0.06109619140625, 0.03143310546875, 0.01377105712890625, 0.03326416015625, 0.023101806640625, 0.059173583984375, 0.0031375885009765625, 0.08380126953125, -0.02264404296875, 0.011993408203125, 0.0858154296875, -0.006244659423828125, 0.0587158203125, 0.0295562744140625, -0.00769805908203125, 0.04974365234375, 0.0433349609375, -0.0104522705078125, 0.042266845703125, -0.0021114349365234375, 0.007099151611328125, -0.0153961181640625, 0.00608062744140625, -0.0196533203125, 0.051116943359375, 0.03582763671875, -0.027679443359375, 0.003147125244140625, 
-0.0033721923828125, -0.0035552978515625, -0.01031494140625, 0.0016527175903320312, 0.050537109375, -0.0032176971435546875, -0.03369140625, 0.06390380859375, -0.020355224609375, 0.055267333984375, -0.039398193359375, -0.01763916015625, -0.040924072265625, 0.0013647079467773438, 0.0012502670288085938, -0.051116943359375, 0.0120391845703125, 0.01446533203125, 0.010833740234375, 0.0022830963134765625, 0.04949951171875, -0.02227783203125, -0.048126220703125, 0.0124664306640625, 0.03173828125, 0.037109375, 0.0230255126953125, -0.075439453125, 0.035186767578125, 0.01015472412109375, -0.0217437744140625, 0.01317596435546875, 0.033203125, -0.0203094482421875, 0.08026123046875, 0.038665771484375, -0.00812530517578125, 0.0039215087890625, 0.0234832763671875, 0.082763671875, -0.03765869140625, -0.0172576904296875, -0.05084228515625, 0.041412353515625, -0.004199981689453125, -0.03167724609375, 0.052459716796875, 0.02978515625, 0.042266845703125, -0.0013580322265625, 0.058441162109375, 0.007678985595703125, 0.0218353271484375, -0.035186767578125, 0.02471923828125, -0.045867919921875, 0.037750244140625, -0.0158233642578125, -0.060455322265625, -0.0172882080078125, 0.0494384765625, -0.0192413330078125, 0.00037932395935058594, 0.03961181640625, 0.0657958984375, 0.00727081298828125, -0.009124755859375, 0.04766845703125, 0.0218048095703125, 0.05108642578125, 0.046661376953125, 0.06884765625, -0.0467529296875, 0.06561279296875, -0.00933837890625, -0.022125244140625, -0.0309906005859375, -0.052337646484375, -0.08819580078125, -0.04718017578125, -0.0214996337890625, -0.01226806640625, 0.01142120361328125, 0.051361083984375, 0.0374755859375, -0.03363037109375, -0.0222320556640625, 0.0012369155883789062, 0.00799560546875, 0.00664520263671875, -0.01515960693359375, 0.01287078857421875, -0.0179595947265625, -0.057037353515625, 0.0217437744140625, -0.0015621185302734375, 0.014068603515625, -0.036224365234375, -0.0005574226379394531, -0.0157928466796875, 0.01326751708984375, 
0.045135498046875, 0.0303192138671875, -0.075927734375, -0.0200653076171875, 0.0162200927734375, -0.0127410888671875, 0.0215301513671875, 0.01812744140625, -0.0506591796875, 0.03277587890625, 0.0220489501953125, 0.01537322998046875, 0.044189453125, -0.0081024169921875, 0.024658203125, -0.04864501953125, 0.019866943359375, 0.00937652587890625, 0.02508544921875, 0.03125, -0.03271484375, 0.03497314453125, 0.004016876220703125, -0.05767822265625, -0.053985595703125, 0.019287109375, -0.08380126953125, 0.002208709716796875, 0.104248046875, 0.00914764404296875, -0.0506591796875, 0.003932952880859375, -0.04046630859375, 0.01611328125, -0.0443115234375, 0.05438232421875, 0.03057861328125, -0.0114898681640625, -0.043731689453125, -0.06243896484375, 0.01273345947265625, -0.01311492919921875, -0.07666015625, -0.0078887939453125, 0.037384033203125, 0.0248260498046875, -0.00043845176696777344, 0.0721435546875, -0.00513458251953125, 0.0057220458984375, 0.01171112060546875, 0.0379638671875, -0.01084136962890625, -0.027496337890625, -0.00921630859375, -0.01097869873046875, 0.011138916015625, -0.03179931640625 ] ]
nvidia/segformer-b3-finetuned-ade-512-512
2022-08-06T10:29:16.000Z
[ "transformers", "pytorch", "tf", "segformer", "vision", "image-segmentation", "dataset:scene_parse_150", "arxiv:2105.15203", "license:other", "endpoints_compatible", "has_space", "region:us" ]
image-segmentation
nvidia
null
null
nvidia/segformer-b3-finetuned-ade-512-512
6
9,074
transformers
2022-03-02T23:29:05
--- license: other tags: - vision - image-segmentation datasets: - scene_parse_150 widget: - src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg example_title: House - src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg example_title: Castle --- # SegFormer (b3-sized) model fine-tuned on ADE20k SegFormer model fine-tuned on ADE20k at resolution 512x512. It was introduced in the paper [SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers](https://arxiv.org/abs/2105.15203) by Xie et al. and first released in [this repository](https://github.com/NVlabs/SegFormer). Disclaimer: The team releasing SegFormer did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description SegFormer consists of a hierarchical Transformer encoder and a lightweight all-MLP decode head to achieve great results on semantic segmentation benchmarks such as ADE20K and Cityscapes. The hierarchical Transformer is first pre-trained on ImageNet-1k, after which a decode head is added and fine-tuned altogether on a downstream dataset. ## Intended uses & limitations You can use the raw model for semantic segmentation. See the [model hub](https://huggingface.co/models?other=segformer) to look for fine-tuned versions on a task that interests you. 
### How to use Here is how to use this model to perform semantic segmentation on an image from the COCO 2017 dataset: ```python import torch from transformers import SegformerFeatureExtractor, SegformerForSemanticSegmentation from PIL import Image import requests feature_extractor = SegformerFeatureExtractor.from_pretrained("nvidia/segformer-b3-finetuned-ade-512-512") model = SegformerForSemanticSegmentation.from_pretrained("nvidia/segformer-b3-finetuned-ade-512-512") url = "http://images.cocodataset.org/val2017/000000039769.jpg" image = Image.open(requests.get(url, stream=True).raw) inputs = feature_extractor(images=image, return_tensors="pt") outputs = model(**inputs) logits = outputs.logits # shape (batch_size, num_labels, height/4, width/4) # Hedged sketch (not part of the original card): upsample the logits to the input size and take the per-pixel argmax to obtain a segmentation map. upsampled = torch.nn.functional.interpolate(logits, size=image.size[::-1], mode="bilinear", align_corners=False) seg_map = upsampled.argmax(dim=1)[0] # shape (height, width), ADE20k class index per pixel ``` For more code examples, we refer to the [documentation](https://huggingface.co/transformers/model_doc/segformer.html). ### License The license for this model can be found [here](https://github.com/NVlabs/SegFormer/blob/master/LICENSE). ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-2105-15203, author = {Enze Xie and Wenhai Wang and Zhiding Yu and Anima Anandkumar and Jose M. Alvarez and Ping Luo}, title = {SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers}, journal = {CoRR}, volume = {abs/2105.15203}, year = {2021}, url = {https://arxiv.org/abs/2105.15203}, eprinttype = {arXiv}, eprint = {2105.15203}, timestamp = {Wed, 02 Jun 2021 11:46:42 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-2105-15203.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ```
3,209
[ [ -0.06634521484375, -0.054901123046875, 0.0140380859375, 0.0165863037109375, -0.0238037109375, -0.0274200439453125, 0.000431060791015625, -0.0523681640625, 0.021392822265625, 0.04351806640625, -0.06439208984375, -0.043487548828125, -0.0565185546875, 0.00823974609375, -0.023223876953125, 0.06085205078125, 0.0035991668701171875, -0.0097198486328125, -0.0223388671875, -0.026885986328125, -0.001857757568359375, -0.0216522216796875, -0.047454833984375, -0.02630615234375, 0.028167724609375, 0.0168609619140625, 0.04461669921875, 0.06121826171875, 0.051116943359375, 0.034759521484375, -0.034088134765625, 0.01126861572265625, -0.0223541259765625, -0.01490020751953125, 0.0034198760986328125, -0.0072174072265625, -0.033782958984375, -0.0002491474151611328, 0.0283966064453125, 0.04718017578125, 0.006618499755859375, 0.025909423828125, -0.0014142990112304688, 0.034423828125, -0.034027099609375, 0.00795745849609375, -0.036956787109375, 0.010650634765625, 0.002933502197265625, 0.0004100799560546875, -0.0207672119140625, -0.01468658447265625, 0.01593017578125, -0.03851318359375, 0.05615234375, 0.0036869049072265625, 0.1160888671875, 0.0303802490234375, -0.027496337890625, -0.004241943359375, -0.035980224609375, 0.0606689453125, -0.049163818359375, 0.039398193359375, -0.006450653076171875, 0.0258331298828125, 0.01097869873046875, -0.07464599609375, -0.0352783203125, 0.01317596435546875, -0.015899658203125, -0.005664825439453125, -0.0279083251953125, 0.003948211669921875, 0.035797119140625, 0.042938232421875, -0.033416748046875, 0.005840301513671875, -0.05419921875, -0.03173828125, 0.053802490234375, 0.00479888916015625, 0.0208282470703125, -0.026611328125, -0.05908203125, -0.03411865234375, -0.0248260498046875, 0.00882720947265625, 0.0194244384765625, 0.0014476776123046875, -0.0218353271484375, 0.035125732421875, -0.0030307769775390625, 0.05718994140625, 0.032989501953125, -0.00975799560546875, 0.039215087890625, -0.0109710693359375, -0.024383544921875, 0.0025615692138671875, 
] ]