| modelId (string, 4–111 chars) | lastModified (string, 24 chars) | tags (list) | pipeline_tag (string, 5–30 chars) ⌀ | author (string, 2–34 chars) ⌀ | config (null) | securityStatus (null) | id (string, 4–111 chars) | likes (int64, 0–9.53k) | downloads (int64, 2–73.6M) | library_name (string, 2–84 chars) ⌀ | created (timestamp[us]) | card (string, 101–901k chars) | card_len (int64, 101–901k) | embeddings (list) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
huawei-noah/TinyBERT_General_4L_312D | 2021-05-19T20:03:32.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"arxiv:1909.10351",
"endpoints_compatible",
"region:us",
"has_space"
] | null | huawei-noah | null | null | huawei-noah/TinyBERT_General_4L_312D | 13 | 31,353 | transformers | 2022-03-02T23:29:05 | TinyBERT: Distilling BERT for Natural Language Understanding
========
TinyBERT is 7.5x smaller and 9.4x faster at inference than BERT-base, and achieves competitive performance on natural language understanding tasks. It uses a novel Transformer distillation performed at both the pre-training and the task-specific learning stages. In general distillation, we use the original BERT-base without fine-tuning as the teacher and a large-scale text corpus as the learning data. By performing Transformer distillation on text from the general domain, we obtain a general TinyBERT that provides a good initialization for task-specific distillation. We here provide the general TinyBERT for your tasks at hand.
For more details about the techniques of TinyBERT, refer to our paper:
[TinyBERT: Distilling BERT for Natural Language Understanding](https://arxiv.org/abs/1909.10351)
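As a minimal, hedged usage sketch (assuming the standard `transformers` Auto classes; the example sentence is illustrative), the general checkpoint can be loaded as a drop-in BERT-style encoder:
```py
from transformers import AutoTokenizer, AutoModel

# Load the general TinyBERT checkpoint (4 layers, hidden size 312)
tokenizer = AutoTokenizer.from_pretrained("huawei-noah/TinyBERT_General_4L_312D")
model = AutoModel.from_pretrained("huawei-noah/TinyBERT_General_4L_312D")

inputs = tokenizer("TinyBERT provides a compact BERT encoder.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # expected: (1, sequence_length, 312)
```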
Citation
========
If you find TinyBERT useful in your research, please cite the following paper:
```
@article{jiao2019tinybert,
title={Tinybert: Distilling bert for natural language understanding},
author={Jiao, Xiaoqi and Yin, Yichun and Shang, Lifeng and Jiang, Xin and Chen, Xiao and Li, Linlin and Wang, Fang and Liu, Qun},
journal={arXiv preprint arXiv:1909.10351},
year={2019}
}
```
| 1,280 | [
[
-0.03094482421875,
-0.065185546875,
0.04718017578125,
0.0214385986328125,
-0.03277587890625,
0.00024271011352539062,
-0.048431396484375,
-0.018890380859375,
-0.00569915771484375,
0.010833740234375,
-0.03936767578125,
-0.0180206298828125,
-0.05609130859375,
0.0080108642578125,
-0.0245361328125,
0.08831787109375,
0.02685546875,
0.00130462646484375,
0.00606536865234375,
-0.00304412841796875,
-0.01568603515625,
-0.020538330078125,
-0.053619384765625,
-0.047088623046875,
0.03826904296875,
0.03558349609375,
0.0440673828125,
0.029815673828125,
0.02117919921875,
0.0242462158203125,
-0.033782958984375,
-0.0082550048828125,
-0.042510986328125,
0.0006437301635742188,
-0.00142669677734375,
-0.05218505859375,
-0.0035266876220703125,
0.0031375885009765625,
0.035064697265625,
0.0562744140625,
0.009613037109375,
0.021240234375,
0.0295562744140625,
0.07232666015625,
-0.03411865234375,
0.01555633544921875,
-0.04852294921875,
-0.0008096694946289062,
-0.00019252300262451172,
0.0222625732421875,
-0.04296875,
-0.0386962890625,
0.04461669921875,
-0.0246429443359375,
0.049072265625,
0.0142974853515625,
0.0679931640625,
0.0294647216796875,
-0.029327392578125,
-0.0341796875,
-0.0526123046875,
0.080810546875,
-0.039154052734375,
0.0246124267578125,
0.00882720947265625,
0.04254150390625,
0.003726959228515625,
-0.08544921875,
-0.059722900390625,
-0.0178680419921875,
-0.0281524658203125,
-0.0180511474609375,
-0.033050537109375,
0.011444091796875,
0.01035308837890625,
0.0306854248046875,
-0.034698486328125,
-0.01132965087890625,
-0.02899169921875,
-0.0029621124267578125,
0.02642822265625,
-0.00576019287109375,
0.004390716552734375,
-0.034576416015625,
-0.0186767578125,
-0.0223388671875,
-0.01436614990234375,
0.00946044921875,
0.0006380081176757812,
0.0258026123046875,
0.0003838539123535156,
0.0027790069580078125,
0.019500732421875,
0.0478515625,
0.0284881591796875,
-0.0228118896484375,
0.0175018310546875,
-0.0253448486328125,
-0.040618896484375,
0.0218505859375,
0.0592041015625,
-0.004627227783203125,
0.032928466796875,
-0.01611328125,
0.0092926025390625,
-0.0259857177734375,
0.015899658203125,
-0.06915283203125,
-0.069091796875,
0.01256561279296875,
-0.055206298828125,
-0.0262603759765625,
-0.01352691650390625,
-0.039276123046875,
-0.02362060546875,
-0.025421142578125,
0.028533935546875,
-0.052978515625,
0.010589599609375,
-0.019134521484375,
-0.02593994140625,
0.0155792236328125,
0.0194091796875,
-0.091064453125,
0.02215576171875,
0.035980224609375,
0.06915283203125,
0.0036602020263671875,
-0.0065765380859375,
-0.03363037109375,
-0.0183868408203125,
-0.005718231201171875,
0.04736328125,
-0.010528564453125,
-0.01483154296875,
-0.01873779296875,
-0.00982666015625,
0.0027713775634765625,
-0.053375244140625,
0.05718994140625,
-0.034912109375,
0.0120391845703125,
-0.007904052734375,
-0.0195770263671875,
-0.0303955078125,
0.0068817138671875,
-0.051788330078125,
0.06866455078125,
-0.018524169921875,
-0.057708740234375,
0.03759765625,
-0.046173095703125,
-0.0291748046875,
-0.013763427734375,
0.0025882720947265625,
-0.0174102783203125,
0.0128173828125,
0.031341552734375,
0.03271484375,
-0.0182037353515625,
0.0111083984375,
-0.0057220458984375,
-0.01568603515625,
-0.00017559528350830078,
-0.037139892578125,
0.0889892578125,
0.0224151611328125,
-0.0178070068359375,
-0.0062103271484375,
-0.054229736328125,
-0.0057525634765625,
0.0075225830078125,
-0.0160980224609375,
-0.040802001953125,
0.00981903076171875,
0.00609588623046875,
0.0014600753784179688,
0.022491455078125,
-0.04608154296875,
0.01371002197265625,
-0.040374755859375,
0.0421142578125,
0.0487060546875,
0.006519317626953125,
0.061767578125,
-0.005825042724609375,
-0.0010890960693359375,
-0.01708984375,
0.00713348388671875,
0.012359619140625,
-0.045379638671875,
-0.07403564453125,
-0.042999267578125,
0.062744140625,
0.032073974609375,
-0.0635986328125,
0.043701171875,
-0.0271148681640625,
-0.04534912109375,
-0.06756591796875,
0.0279693603515625,
0.01186370849609375,
0.034332275390625,
0.031463623046875,
0.029022216796875,
-0.047149658203125,
-0.0718994140625,
-0.0149078369140625,
-0.01641845703125,
-0.0274200439453125,
0.005374908447265625,
0.03277587890625,
-0.02264404296875,
0.09808349609375,
-0.050994873046875,
-0.04034423828125,
-0.051727294921875,
0.055206298828125,
0.037261962890625,
0.0147552490234375,
0.061767578125,
-0.05450439453125,
-0.07000732421875,
-0.02093505859375,
-0.041412353515625,
-0.0035552978515625,
-0.009002685546875,
-0.00909423828125,
-0.0037689208984375,
0.016571044921875,
-0.01611328125,
0.01146697998046875,
0.034210205078125,
-0.004955291748046875,
0.0104827880859375,
-0.0321044921875,
-0.007099151611328125,
-0.05657958984375,
-0.00007289648056030273,
0.0004382133483886719,
-0.01922607421875,
-0.06982421875,
-0.025115966796875,
0.0156707763671875,
0.018280029296875,
-0.034759521484375,
0.0192718505859375,
-0.030731201171875,
0.038299560546875,
-0.021636962890625,
0.0379638671875,
0.01358795166015625,
0.03350830078125,
-0.00926971435546875,
0.052398681640625,
0.0214080810546875,
-0.04290771484375,
0.0202789306640625,
0.034027099609375,
-0.0294647216796875,
0.00104522705078125,
-0.0579833984375,
0.0131683349609375,
-0.0230865478515625,
0.01204681396484375,
-0.08062744140625,
0.02783203125,
0.00565338134765625,
-0.04071044921875,
0.020233154296875,
-0.0179901123046875,
-0.05438232421875,
-0.021209716796875,
-0.0369873046875,
0.0015001296997070312,
0.0596923828125,
-0.0692138671875,
0.040130615234375,
-0.0026721954345703125,
-0.00925445556640625,
-0.03656005859375,
-0.07196044921875,
-0.01541900634765625,
-0.0172119140625,
-0.059967041015625,
0.01953125,
-0.013336181640625,
-0.032196044921875,
-0.00423431396484375,
-0.0110015869140625,
-0.0263824462890625,
-0.00745391845703125,
-0.0027027130126953125,
0.0204315185546875,
-0.0096588134765625,
0.0245361328125,
0.01324462890625,
-0.01349639892578125,
-0.004241943359375,
-0.0130767822265625,
0.020263671875,
-0.0240020751953125,
0.0103759765625,
-0.02130126953125,
0.011138916015625,
0.0171661376953125,
0.0262603759765625,
0.0712890625,
0.054107666015625,
-0.015472412109375,
-0.01149749755859375,
-0.025421142578125,
-0.0291748046875,
-0.039276123046875,
-0.006824493408203125,
-0.0200347900390625,
-0.056976318359375,
0.022735595703125,
0.01456451416015625,
0.016632080078125,
0.05328369140625,
0.050994873046875,
-0.035064697265625,
0.048370361328125,
0.0618896484375,
-0.020721435546875,
0.04144287109375,
-0.027862548828125,
0.051971435546875,
-0.033843994140625,
-0.0028781890869140625,
-0.05096435546875,
-0.050079345703125,
-0.0286407470703125,
-0.0284881591796875,
0.016754150390625,
0.0179901123046875,
-0.004150390625,
0.037506103515625,
-0.043121337890625,
0.0245361328125,
0.061370849609375,
-0.0127105712890625,
0.033905029296875,
-0.00655364990234375,
-0.0191497802734375,
-0.03265380859375,
-0.046051025390625,
-0.055206298828125,
0.06512451171875,
0.0256500244140625,
0.0743408203125,
0.0179595947265625,
0.06884765625,
0.0438232421875,
0.020965576171875,
-0.07025146484375,
0.02557373046875,
-0.01509857177734375,
-0.064697265625,
-0.0275726318359375,
-0.024139404296875,
-0.0718994140625,
0.038116455078125,
-0.01065826416015625,
-0.0780029296875,
0.0212554931640625,
0.03546142578125,
-0.040863037109375,
0.00860595703125,
-0.093017578125,
0.065673828125,
-0.00995635986328125,
0.007709503173828125,
-0.00655364990234375,
-0.061737060546875,
0.0261993408203125,
-0.007160186767578125,
0.010223388671875,
-0.0116729736328125,
0.01511383056640625,
0.06488037109375,
-0.0262451171875,
0.0557861328125,
-0.025665283203125,
0.0002665519714355469,
0.033355712890625,
-0.007755279541015625,
0.02838134765625,
0.0193023681640625,
0.004253387451171875,
0.014129638671875,
0.00905609130859375,
-0.01062774658203125,
-0.05462646484375,
0.0433349609375,
-0.058441162109375,
-0.026092529296875,
-0.039306640625,
-0.014801025390625,
-0.002521514892578125,
0.01132965087890625,
0.03009033203125,
-0.00023174285888671875,
-0.0190277099609375,
0.032257080078125,
0.056610107421875,
0.0008006095886230469,
0.04705810546875,
0.0294647216796875,
0.0088653564453125,
-0.0016679763793945312,
0.0672607421875,
0.0008440017700195312,
0.0113525390625,
0.049713134765625,
0.0207366943359375,
-0.029632568359375,
-0.025390625,
-0.023712158203125,
0.0159454345703125,
-0.04730224609375,
-0.0059356689453125,
-0.058349609375,
-0.0165557861328125,
-0.02166748046875,
-0.007659912109375,
-0.0297698974609375,
-0.04150390625,
-0.03802490234375,
0.015472412109375,
0.03521728515625,
0.03704833984375,
-0.0153656005859375,
0.044189453125,
-0.07464599609375,
0.046142578125,
0.04754638671875,
-0.00213623046875,
0.01062774658203125,
-0.057281494140625,
-0.0208892822265625,
0.00977325439453125,
-0.0347900390625,
-0.0482177734375,
0.0274505615234375,
0.032745361328125,
0.037841796875,
0.050537109375,
0.0297698974609375,
0.042938232421875,
-0.055908203125,
0.07391357421875,
0.02032470703125,
-0.08575439453125,
0.04901123046875,
0.0182037353515625,
0.01517486572265625,
0.067138671875,
0.0217437744140625,
-0.03692626953125,
-0.0201263427734375,
-0.05535888671875,
-0.0634765625,
0.06689453125,
0.043121337890625,
-0.0054473876953125,
0.01360321044921875,
-0.01001739501953125,
0.018585205078125,
0.0184478759765625,
-0.05389404296875,
-0.05120849609375,
0.0014162063598632812,
-0.023345947265625,
-0.018341064453125,
-0.0254974365234375,
-0.012603759765625,
-0.045135498046875,
0.051666259765625,
0.00020623207092285156,
0.0498046875,
0.0028362274169921875,
-0.0380859375,
0.0335693359375,
0.03277587890625,
0.034698486328125,
0.0196533203125,
-0.042633056640625,
0.02325439453125,
0.0299835205078125,
-0.051361083984375,
0.0034961700439453125,
0.015899658203125,
0.019622802734375,
0.025390625,
0.03857421875,
0.046356201171875,
0.00524139404296875,
-0.045135498046875,
0.0421142578125,
-0.0276031494140625,
-0.0268096923828125,
-0.01132965087890625,
0.01806640625,
0.010650634765625,
0.03326416015625,
0.04150390625,
0.0016717910766601562,
0.027618408203125,
-0.035064697265625,
0.003177642822265625,
-0.0014247894287109375,
-0.0232086181640625,
-0.0048065185546875,
0.03704833984375,
0.0216217041015625,
-0.0155792236328125,
0.050048828125,
-0.029632568359375,
-0.023529052734375,
0.0217742919921875,
0.0175018310546875,
0.07330322265625,
0.01995849609375,
-0.0005321502685546875,
0.0091400146484375,
0.0316162109375,
0.018798828125,
0.004627227783203125,
-0.03863525390625,
-0.051239013671875,
-0.0248870849609375,
-0.047576904296875,
-0.028350830078125,
0.006061553955078125,
-0.045745849609375,
-0.007495880126953125,
-0.0248260498046875,
-0.00988006591796875,
0.03448486328125,
0.011077880859375,
-0.050689697265625,
0.0008206367492675781,
0.01270294189453125,
0.0811767578125,
-0.06378173828125,
0.09124755859375,
0.045684814453125,
-0.0205535888671875,
-0.0382080078125,
0.0275726318359375,
-0.0023822784423828125,
-0.045135498046875,
0.07464599609375,
0.000021398067474365234,
-0.0086669921875,
-0.01371002197265625,
-0.035675048828125,
-0.02960205078125,
0.07843017578125,
0.0299224853515625,
-0.036895751953125,
-0.01045989990234375,
-0.01226806640625,
0.05084228515625,
-0.00601959228515625,
0.007480621337890625,
0.0833740234375,
0.032379150390625,
0.0017147064208984375,
-0.058746337890625,
-0.0251617431640625,
-0.01145172119140625,
0.0214996337890625,
0.017913818359375,
-0.036895751953125,
0.06195068359375,
-0.0157318115234375,
-0.00376129150390625,
-0.007137298583984375,
0.061767578125,
-0.004291534423828125,
0.0128021240234375,
0.047576904296875,
0.0177001953125,
0.06060791015625,
-0.0229339599609375,
0.037506103515625,
-0.00690460205078125,
0.059539794921875,
0.09332275390625,
0.0230560302734375,
0.05889892578125,
0.05865478515625,
-0.024658203125,
0.04681396484375,
0.029388427734375,
0.004131317138671875,
0.07373046875,
-0.0017147064208984375,
-0.026611328125,
-0.00673675537109375,
0.031707763671875,
-0.0235443115234375,
0.02301025390625,
0.0249176025390625,
-0.049591064453125,
-0.032684326171875,
0.0211029052734375,
-0.0014276504516601562,
0.0003094673156738281,
0.005626678466796875,
0.05706787109375,
0.01027679443359375,
-0.0250244140625,
0.061676025390625,
0.00420379638671875,
0.059112548828125,
-0.033660888671875,
0.006572723388671875,
0.0011415481567382812,
0.0273895263671875,
0.00016677379608154297,
-0.057586669921875,
0.0227813720703125,
0.015289306640625,
-0.03594970703125,
-0.03460693359375,
0.05877685546875,
-0.0160675048828125,
-0.0236968994140625,
0.0250244140625,
0.0271759033203125,
0.01456451416015625,
-0.0024929046630859375,
-0.043487548828125,
-0.0013427734375,
0.02191162109375,
-0.019622802734375,
0.0001709461212158203,
0.039306640625,
0.0435791015625,
0.0416259765625,
0.0131683349609375,
-0.0140228271484375,
0.01385498046875,
-0.0006260871887207031,
0.05670166015625,
-0.033935546875,
-0.033294677734375,
-0.0780029296875,
0.03155517578125,
-0.0249786376953125,
-0.00670623779296875,
0.057373046875,
0.019012451171875,
0.039093017578125,
-0.0226593017578125,
0.058074951171875,
-0.036407470703125,
0.0445556640625,
-0.01861572265625,
0.064453125,
-0.03497314453125,
0.0011949539184570312,
-0.01087188720703125,
-0.05865478515625,
-0.01117706298828125,
0.074951171875,
-0.035888671875,
-0.00970458984375,
0.059844970703125,
0.026458740234375,
-0.00890350341796875,
0.0196533203125,
0.0213470458984375,
0.021148681640625,
0.0258026123046875,
0.0289306640625,
0.0328369140625,
-0.0283050537109375,
0.06781005859375,
-0.037017822265625,
-0.0219573974609375,
-0.039642333984375,
-0.062042236328125,
-0.09515380859375,
-0.031951904296875,
-0.005390167236328125,
-0.01062774658203125,
-0.01355743408203125,
0.07586669921875,
0.051849365234375,
-0.07427978515625,
0.007572174072265625,
-0.00534820556640625,
0.024658203125,
-0.0175933837890625,
-0.019195556640625,
0.0248260498046875,
-0.046844482421875,
-0.080322265625,
0.01251220703125,
0.0082855224609375,
0.0001322031021118164,
0.0019741058349609375,
0.0036525726318359375,
-0.0215606689453125,
0.0009164810180664062,
0.053314208984375,
0.010162353515625,
-0.054595947265625,
-0.035003662109375,
-0.01538848876953125,
-0.013916015625,
0.0165557861328125,
0.05413818359375,
-0.06182861328125,
0.044036865234375,
0.04046630859375,
0.040283203125,
0.04949951171875,
-0.018524169921875,
0.0211334228515625,
-0.0787353515625,
0.0266571044921875,
0.0141448974609375,
0.024383544921875,
-0.00021731853485107422,
-0.0155029296875,
0.0191650390625,
0.0038547515869140625,
-0.042022705078125,
-0.035125732421875,
-0.002452850341796875,
-0.08636474609375,
-0.01195526123046875,
0.06585693359375,
-0.0030155181884765625,
-0.00542449951171875,
-0.0118865966796875,
-0.043121337890625,
0.0270538330078125,
-0.04534912109375,
0.047607421875,
0.045654296875,
-0.0210113525390625,
-0.002925872802734375,
-0.043243408203125,
0.03973388671875,
0.0188446044921875,
-0.0301971435546875,
-0.0034942626953125,
0.0037631988525390625,
0.0162811279296875,
0.03009033203125,
0.027252197265625,
0.034271240234375,
-0.003482818603515625,
0.00257110595703125,
-0.00013899803161621094,
0.02398681640625,
-0.0272979736328125,
-0.01215362548828125,
0.00792694091796875,
0.0218505859375,
-0.0498046875
]
] |
classla/bcms-bertic-ner | 2023-06-23T06:30:26.000Z | [
"transformers",
"pytorch",
"safetensors",
"electra",
"token-classification",
"hr",
"bs",
"sr",
"cnr",
"hbs",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | classla | null | null | classla/bcms-bertic-ner | 4 | 31,339 | transformers | 2022-03-02T23:29:05 | ---
language:
- hr
- bs
- sr
- cnr
- hbs
widget:
- text: "Zovem se Marko i živim u Zagrebu. Studirao sam u Beogradu na Filozofskom fakultetu. Obožavam album Moanin."
license: apache-2.0
---
# The [BERTić](https://huggingface.co/classla/bcms-bertic)* [bert-ich] /bɜrtitʃ/ model fine-tuned for the task of named entity recognition in Bosnian, Croatian, Montenegrin and Serbian (BCMS)
* The name reflects two facts: (1) the model was trained in Zagreb, Croatia, where diminutives ending in -ić (as in fotić, smajlić, hengić, etc.) are very popular, and (2) most surnames in the countries where these languages are spoken end in -ić (which likewise has a diminutive etymology).
This is a fine-tuned version of the [BERTić](https://huggingface.co/classla/bcms-bertic) model for the task of named entity recognition (PER, LOC, ORG, MISC). The fine-tuning was performed on the following datasets:
- the [hr500k](http://hdl.handle.net/11356/1183) dataset, 500 thousand tokens in size, standard Croatian
- the [SETimes.SR](http://hdl.handle.net/11356/1200) dataset, 87 thousand tokens in size, standard Serbian
- the [ReLDI-hr](http://hdl.handle.net/11356/1241) dataset, 89 thousand tokens in size, Internet (Twitter) Croatian
- the [ReLDI-sr](http://hdl.handle.net/11356/1240) dataset, 92 thousand tokens in size, Internet (Twitter) Serbian
The data was augmented with versions of the text from which the diacritics were removed, and the standard-language data was additionally over-represented. The F1 obtained on dev data (train and test were merged into train) is 91.38. For a more detailed per-dataset evaluation of the BERTić model on the NER task, have a look at the [main model page](https://huggingface.co/classla/bcms-bertic).
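For quick experimentation, the model should also be usable through the standard `transformers` token-classification pipeline. This is a hedged sketch (the `aggregation_strategy` setting and the example sentence are illustrative assumptions); the officially documented route via `simpletransformers` is shown further below.
```py
from transformers import pipeline

# Hypothetical quick-start via the generic transformers pipeline
ner = pipeline("token-classification", model="classla/bcms-bertic-ner", aggregation_strategy="simple")
print(ner("Zovem se Marko i živim u Zagrebu."))  # expected entities: Marko -> PER, Zagrebu -> LOC
```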
If you use this fine-tuned model, please cite the following paper:
```
@inproceedings{ljubesic-lauc-2021-bertic,
title = "{BERT}i{\'c} - The Transformer Language Model for {B}osnian, {C}roatian, {M}ontenegrin and {S}erbian",
author = "Ljube{\v{s}}i{\'c}, Nikola and Lauc, Davor",
booktitle = "Proceedings of the 8th Workshop on Balto-Slavic Natural Language Processing",
month = apr,
year = "2021",
address = "Kiyv, Ukraine",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2021.bsnlp-1.5",
pages = "37--42",
}
```
When running the model in `simpletransformers`, the label list (in this exact order) has to be set as well.
```py
from simpletransformers.ner import NERModel, NERArgs

# The label list and its order must match the ones used during fine-tuning
model_args = NERArgs()
model_args.labels_list = ['B-LOC','B-MISC','B-ORG','B-PER','I-LOC','I-MISC','I-ORG','I-PER','O']
model = NERModel('electra', 'classla/bcms-bertic-ner', args=model_args)
# Example prediction on an illustrative sentence
predictions, raw_outputs = model.predict(["Zovem se Marko i živim u Zagrebu."])
``` | 2,643 | [
[
-0.041595458984375,
-0.03521728515625,
0.0215301513671875,
0.01486968994140625,
-0.028228759765625,
-0.00047087669372558594,
-0.0347900390625,
-0.0545654296875,
0.0174713134765625,
0.0282135009765625,
-0.053497314453125,
-0.04693603515625,
-0.048828125,
0.031341552734375,
-0.00397491455078125,
0.09576416015625,
-0.01160430908203125,
0.0260772705078125,
0.00928497314453125,
-0.01983642578125,
-0.0184478759765625,
-0.04217529296875,
-0.04327392578125,
-0.0266571044921875,
0.0474853515625,
0.0258941650390625,
0.037567138671875,
0.03204345703125,
0.03765869140625,
0.0201568603515625,
-0.02813720703125,
0.019775390625,
-0.0226287841796875,
0.006496429443359375,
-0.0159454345703125,
-0.0211181640625,
-0.034912109375,
-0.00832366943359375,
0.05010986328125,
0.060272216796875,
0.0004734992980957031,
0.0258331298828125,
-0.0098419189453125,
0.041534423828125,
-0.0239105224609375,
0.007205963134765625,
-0.051055908203125,
-0.030181884765625,
-0.0263824462890625,
0.0169525146484375,
-0.036346435546875,
-0.0290985107421875,
0.0226287841796875,
-0.0207977294921875,
0.019287109375,
0.018829345703125,
0.1077880859375,
0.0005097389221191406,
-0.0204925537109375,
-0.022064208984375,
-0.0633544921875,
0.061279296875,
-0.06451416015625,
0.03680419921875,
0.01434326171875,
0.0141143798828125,
-0.025238037109375,
-0.035491943359375,
-0.037933349609375,
-0.01385498046875,
-0.0014715194702148438,
-0.00044655799865722656,
-0.025238037109375,
0.00968170166015625,
0.0338134765625,
0.0035495758056640625,
-0.042999267578125,
0.01885986328125,
-0.05401611328125,
-0.01715087890625,
0.040008544921875,
0.008514404296875,
0.01910400390625,
-0.00797271728515625,
-0.0281219482421875,
0.0100250244140625,
-0.054168701171875,
0.00029659271240234375,
0.037200927734375,
0.06463623046875,
-0.0206298828125,
0.046875,
-0.02203369140625,
0.0621337890625,
0.016815185546875,
0.0021266937255859375,
0.041748046875,
-0.0205078125,
-0.024444580078125,
0.01019287109375,
0.05206298828125,
-0.0006928443908691406,
0.0242462158203125,
-0.026641845703125,
-0.01259613037109375,
-0.0005941390991210938,
0.007160186767578125,
-0.06744384765625,
-0.0278472900390625,
0.006134033203125,
-0.02593994140625,
0.002285003662109375,
0.006519317626953125,
-0.0258331298828125,
0.002960205078125,
-0.039154052734375,
0.044403076171875,
-0.05047607421875,
0.00431060791015625,
-0.007495880126953125,
0.014739990234375,
0.0174407958984375,
0.0255584716796875,
-0.07427978515625,
0.02252197265625,
0.03558349609375,
0.04541015625,
-0.006519317626953125,
-0.00525665283203125,
-0.01181793212890625,
-0.0098114013671875,
-0.00670623779296875,
0.0628662109375,
-0.033447265625,
-0.0092315673828125,
-0.0175628662109375,
0.0159454345703125,
-0.0188751220703125,
-0.03717041015625,
0.04608154296875,
-0.048492431640625,
0.0237884521484375,
-0.0209503173828125,
-0.04681396484375,
-0.0255584716796875,
0.01052093505859375,
-0.04058837890625,
0.07781982421875,
0.003658294677734375,
-0.065185546875,
0.044219970703125,
-0.037628173828125,
-0.037872314453125,
0.0129852294921875,
0.0071563720703125,
-0.03363037109375,
0.0093994140625,
0.0023593902587890625,
0.038177490234375,
-0.0226287841796875,
0.020599365234375,
-0.01383209228515625,
-0.01837158203125,
0.010040283203125,
-0.0167388916015625,
0.07305908203125,
0.0139007568359375,
-0.0191802978515625,
-0.004207611083984375,
-0.078857421875,
0.01184844970703125,
-0.01078033447265625,
-0.056854248046875,
-0.035614013671875,
-0.00034928321838378906,
0.017852783203125,
0.0219268798828125,
0.018402099609375,
-0.055328369140625,
0.027984619140625,
-0.04840087890625,
0.0157623291015625,
0.041656494140625,
-0.006015777587890625,
0.03924560546875,
-0.0177764892578125,
0.0202178955078125,
0.01105499267578125,
0.0029430389404296875,
0.02606201171875,
-0.043365478515625,
-0.07012939453125,
-0.04010009765625,
0.07281494140625,
0.037933349609375,
-0.051239013671875,
0.0257110595703125,
-0.03631591796875,
-0.0576171875,
-0.04681396484375,
-0.016265869140625,
0.024627685546875,
0.015716552734375,
0.0274658203125,
-0.015716552734375,
-0.06842041015625,
-0.08648681640625,
-0.0061798095703125,
-0.0089569091796875,
0.01105499267578125,
0.02288818359375,
0.053497314453125,
-0.03057861328125,
0.0677490234375,
-0.01042938232421875,
-0.006275177001953125,
-0.026641845703125,
0.0255584716796875,
0.05523681640625,
0.03961181640625,
0.042877197265625,
-0.05645751953125,
-0.054107666015625,
-0.01184844970703125,
-0.0204925537109375,
-0.0171966552734375,
-0.0017595291137695312,
-0.0195770263671875,
0.04559326171875,
0.00799560546875,
-0.05279541015625,
0.0177459716796875,
0.0289764404296875,
-0.0301361083984375,
0.05523681640625,
-0.01354217529296875,
-0.00420379638671875,
-0.08721923828125,
0.0154876708984375,
-0.00021791458129882812,
-0.01806640625,
-0.061279296875,
-0.01136016845703125,
0.00406646728515625,
0.0224609375,
-0.03472900390625,
0.04522705078125,
-0.050750732421875,
0.00920867919921875,
-0.00960540771484375,
0.0020122528076171875,
-0.022552490234375,
0.040924072265625,
-0.0013074874877929688,
0.04010009765625,
0.04754638671875,
-0.061492919921875,
0.0243988037109375,
0.019134521484375,
-0.039520263671875,
0.03875732421875,
-0.034515380859375,
-0.0035800933837890625,
-0.00675201416015625,
-0.006195068359375,
-0.0654296875,
-0.0026531219482421875,
0.042999267578125,
-0.04388427734375,
0.05224609375,
-0.030670166015625,
-0.059326171875,
-0.0221099853515625,
0.0081787109375,
0.0271148681640625,
0.023284912109375,
-0.035125732421875,
0.04595947265625,
0.02410888671875,
-0.01357269287109375,
-0.03582763671875,
-0.049957275390625,
0.0101318359375,
-0.0166015625,
-0.0458984375,
0.038818359375,
-0.0267333984375,
-0.007160186767578125,
-0.00624847412109375,
-0.00797271728515625,
-0.0194549560546875,
-0.003292083740234375,
0.012054443359375,
0.040374755859375,
-0.014739990234375,
0.007183074951171875,
0.01027679443359375,
-0.002716064453125,
-0.00881195068359375,
-0.0123291015625,
0.041046142578125,
-0.0057830810546875,
0.0098419189453125,
-0.0293426513671875,
0.00495147705078125,
0.043853759765625,
0.00037217140197753906,
0.07830810546875,
0.045135498046875,
-0.041046142578125,
0.000057578086853027344,
-0.052947998046875,
-0.0034275054931640625,
-0.0284271240234375,
0.0149078369140625,
-0.02947998046875,
-0.048065185546875,
0.050933837890625,
-0.0000750422477722168,
0.004486083984375,
0.04595947265625,
0.054779052734375,
-0.008453369140625,
0.06939697265625,
0.04937744140625,
-0.014068603515625,
0.034393310546875,
-0.0204620361328125,
0.0220489501953125,
-0.07122802734375,
-0.0293426513671875,
-0.050567626953125,
-0.0301055908203125,
-0.07012939453125,
-0.01033782958984375,
-0.00026679039001464844,
0.002704620361328125,
-0.01611328125,
0.0635986328125,
-0.02984619140625,
0.0146636962890625,
0.04376220703125,
0.0024261474609375,
0.00540924072265625,
0.003368377685546875,
-0.0374755859375,
-0.014739990234375,
-0.05224609375,
-0.056640625,
0.054290771484375,
0.0142974853515625,
0.05523681640625,
0.02337646484375,
0.06793212890625,
-0.00020599365234375,
0.0177001953125,
-0.051971435546875,
0.04534912109375,
-0.0165252685546875,
-0.072509765625,
-0.01434326171875,
-0.030853271484375,
-0.078857421875,
0.01361083984375,
-0.028167724609375,
-0.08154296875,
0.042083740234375,
-0.004169464111328125,
-0.02899169921875,
0.01343536376953125,
-0.038055419921875,
0.04974365234375,
-0.01206207275390625,
-0.002902984619140625,
0.00243377685546875,
-0.058502197265625,
0.00897216796875,
0.0030498504638671875,
0.0173797607421875,
-0.0263671875,
0.0160980224609375,
0.06878662109375,
-0.018280029296875,
0.056915283203125,
-0.006290435791015625,
0.00939178466796875,
0.0037994384765625,
0.0014972686767578125,
0.028045654296875,
-0.00881195068359375,
-0.00814056396484375,
0.0269012451171875,
0.00992584228515625,
-0.0132293701171875,
-0.0216827392578125,
0.050750732421875,
-0.049224853515625,
-0.01375579833984375,
-0.04241943359375,
-0.03143310546875,
0.00916290283203125,
0.0268096923828125,
0.044342041015625,
0.0423583984375,
-0.0097503662109375,
0.01052093505859375,
0.0618896484375,
-0.0135040283203125,
0.0305023193359375,
0.057373046875,
0.0022029876708984375,
-0.0261688232421875,
0.052734375,
0.03277587890625,
0.004703521728515625,
0.031219482421875,
-0.00864410400390625,
-0.003993988037109375,
-0.05670166015625,
-0.040679931640625,
0.032684326171875,
-0.044464111328125,
-0.037506103515625,
-0.0517578125,
-0.0288848876953125,
-0.02789306640625,
0.0128173828125,
-0.025360107421875,
-0.045135498046875,
-0.044403076171875,
0.0013036727905273438,
0.0191802978515625,
0.046417236328125,
-0.01421356201171875,
0.0208892822265625,
-0.050872802734375,
0.0159912109375,
-0.007598876953125,
0.031402587890625,
-0.007251739501953125,
-0.06671142578125,
-0.036285400390625,
0.0175018310546875,
-0.01412200927734375,
-0.07080078125,
0.045074462890625,
0.0189361572265625,
0.0516357421875,
0.01486968994140625,
0.0018224716186523438,
0.047607421875,
-0.04974365234375,
0.037078857421875,
-0.0074005126953125,
-0.061676025390625,
0.03631591796875,
-0.022186279296875,
0.0182647705078125,
0.046417236328125,
0.0340576171875,
-0.03729248046875,
-0.0311126708984375,
-0.055145263671875,
-0.09417724609375,
0.0833740234375,
0.045867919921875,
0.0142059326171875,
-0.01195526123046875,
0.030731201171875,
0.01454925537109375,
0.01126861572265625,
-0.075927734375,
-0.04241943359375,
0.0270538330078125,
-0.015411376953125,
0.01544189453125,
-0.037322998046875,
0.017333984375,
-0.03778076171875,
0.09442138671875,
0.0241241455078125,
0.036346435546875,
0.005859375,
-0.0251617431640625,
-0.005710601806640625,
0.028167724609375,
0.059783935546875,
0.040740966796875,
-0.026397705078125,
-0.0038604736328125,
0.030181884765625,
-0.0286407470703125,
-0.0164947509765625,
0.0253753662109375,
-0.01546478271484375,
0.0278472900390625,
0.0106048583984375,
0.07513427734375,
0.032806396484375,
-0.02728271484375,
0.01268768310546875,
-0.00916290283203125,
-0.00815582275390625,
-0.0267333984375,
-0.00843048095703125,
-0.0027523040771484375,
0.028228759765625,
0.032470703125,
0.01727294921875,
0.01177978515625,
-0.036712646484375,
0.00043964385986328125,
0.029937744140625,
-0.0191650390625,
-0.0362548828125,
0.03326416015625,
-0.0021686553955078125,
-0.02496337890625,
0.054290771484375,
-0.022857666015625,
-0.057830810546875,
0.04168701171875,
0.035675048828125,
0.0526123046875,
-0.0169525146484375,
0.00611114501953125,
0.046875,
0.0347900390625,
-0.010009765625,
0.03887939453125,
-0.0025005340576171875,
-0.061004638671875,
-0.0107574462890625,
-0.0711669921875,
-0.0140838623046875,
-0.00328826904296875,
-0.053619384765625,
0.04254150390625,
-0.01873779296875,
-0.0279541015625,
0.029327392578125,
0.01305389404296875,
-0.07159423828125,
0.02154541015625,
0.035247802734375,
0.07257080078125,
-0.058868408203125,
0.06732177734375,
0.07159423828125,
-0.0258941650390625,
-0.058319091796875,
-0.0207672119140625,
-0.03265380859375,
-0.039520263671875,
0.07537841796875,
0.0087432861328125,
0.0104217529296875,
0.005985260009765625,
-0.04876708984375,
-0.09454345703125,
0.0703125,
0.02294921875,
-0.0531005859375,
-0.01224517822265625,
-0.002155303955078125,
0.0604248046875,
-0.0264739990234375,
0.0258026123046875,
0.0347900390625,
0.047454833984375,
0.0128936767578125,
-0.08154296875,
0.0030975341796875,
-0.015594482421875,
0.00519561767578125,
0.0300750732421875,
-0.03338623046875,
0.055572509765625,
-0.022979736328125,
-0.030181884765625,
0.032928466796875,
0.044464111328125,
0.017333984375,
0.0097503662109375,
0.01348114013671875,
0.043853759765625,
0.051300048828125,
-0.0290985107421875,
0.056915283203125,
-0.021514892578125,
0.045806884765625,
0.08154296875,
-0.00033092498779296875,
0.05609130859375,
0.03240966796875,
-0.0186004638671875,
0.057159423828125,
0.0640869140625,
-0.03057861328125,
0.05511474609375,
0.0160980224609375,
-0.005596160888671875,
-0.0157928466796875,
0.00644683837890625,
-0.010009765625,
0.040679931640625,
0.027008056640625,
-0.0374755859375,
-0.018524169921875,
0.004238128662109375,
0.012969970703125,
-0.03253173828125,
-0.0180816650390625,
0.058502197265625,
0.00628662109375,
-0.03765869140625,
0.0516357421875,
0.0176239013671875,
0.05426025390625,
-0.034576416015625,
0.01018524169921875,
0.0033111572265625,
0.0005421638488769531,
-0.01058197021484375,
-0.029754638671875,
0.01537322998046875,
0.0022029876708984375,
-0.041778564453125,
-0.0025577545166015625,
0.04376220703125,
-0.044281005859375,
-0.05914306640625,
0.0200347900390625,
0.017425537109375,
0.0333251953125,
0.01538848876953125,
-0.055511474609375,
-0.00862884521484375,
-0.0060882568359375,
-0.0203857421875,
0.01273345947265625,
0.034576416015625,
-0.007518768310546875,
0.046783447265625,
0.032440185546875,
0.021697998046875,
0.0132904052734375,
0.01529693603515625,
0.056976318359375,
-0.040191650390625,
-0.0222015380859375,
-0.06866455078125,
0.0400390625,
0.00885009765625,
-0.0149078369140625,
0.047149658203125,
0.0460205078125,
0.06597900390625,
-0.0113983154296875,
0.037750244140625,
-0.029205322265625,
0.044342041015625,
-0.0291595458984375,
0.0594482421875,
-0.0273284912109375,
0.015716552734375,
-0.0296478271484375,
-0.08978271484375,
-0.032379150390625,
0.048095703125,
-0.031951904296875,
0.0096435546875,
0.031951904296875,
0.04998779296875,
-0.0091094970703125,
-0.02252197265625,
0.02069091796875,
0.046051025390625,
-0.0022792816162109375,
0.030853271484375,
0.040557861328125,
-0.0458984375,
0.02947998046875,
-0.0254364013671875,
-0.00406646728515625,
-0.0255279541015625,
-0.05902099609375,
-0.06976318359375,
-0.0531005859375,
-0.03759765625,
-0.018798828125,
0.0160369873046875,
0.09527587890625,
0.049896240234375,
-0.08636474609375,
-0.0033931732177734375,
0.01122283935546875,
0.0078125,
-0.004817962646484375,
-0.01515960693359375,
0.0213623046875,
-0.01029205322265625,
-0.06646728515625,
0.0135650634765625,
0.003749847412109375,
0.0230255126953125,
0.0135040283203125,
0.00970458984375,
-0.03936767578125,
0.0065765380859375,
0.03564453125,
0.0208587646484375,
-0.065673828125,
-0.024566650390625,
-0.006496429443359375,
-0.0204620361328125,
0.00838470458984375,
0.023406982421875,
-0.0596923828125,
0.031494140625,
0.042236328125,
0.00415802001953125,
0.0287322998046875,
-0.00783538818359375,
0.0310211181640625,
-0.0557861328125,
-0.0082244873046875,
0.017120361328125,
0.0439453125,
0.029296875,
-0.0266265869140625,
0.0247039794921875,
0.013824462890625,
-0.043243408203125,
-0.06658935546875,
-0.00495147705078125,
-0.0860595703125,
-0.0066375732421875,
0.086669921875,
-0.00324249267578125,
-0.006500244140625,
-0.013153076171875,
-0.0222625732421875,
0.044586181640625,
-0.036346435546875,
0.06524658203125,
0.063720703125,
-0.0018625259399414062,
0.004802703857421875,
-0.0212249755859375,
0.0272064208984375,
0.007640838623046875,
-0.0303955078125,
-0.0032978057861328125,
0.0206146240234375,
0.047760009765625,
0.013275146484375,
0.023040771484375,
0.006305694580078125,
0.007625579833984375,
-0.03765869140625,
0.0239410400390625,
0.007289886474609375,
-0.031890869140625,
-0.02490234375,
-0.0269622802734375,
-0.0037899017333984375,
-0.007312774658203125
]
] |
facebook/mms-lid-126 | 2023-06-13T10:15:48.000Z | [
"transformers",
"pytorch",
"safetensors",
"wav2vec2",
"audio-classification",
"mms",
"ab",
"af",
"ak",
"am",
"ar",
"as",
"av",
"ay",
"az",
"ba",
"bm",
"be",
"bn",
"bi",
"bo",
"sh",
"br",
"bg",
"ca",
"cs",
"ce",
"cv",
"ku",
"cy",
"da",
"de",
"dv",
"dz",
"el",
"en",
"eo",
"et",
"eu",
"ee",
"fo",
"fa",
"fj",
"fi",
"fr",
"fy",
"ff",
"ga",
"gl",
"gn",
"gu",
"zh",
"ht",
"ha",
"he",
"hi",
"hu",
"hy",
"ig",
"ia",
"ms",
"is",
"it",
"jv",
"ja",
"kn",
"ka",
"kk",
"kr",
"km",
"ki",
"rw",
"ky",
"ko",
"kv",
"lo",
"la",
"lv",
"ln",
"lt",
"lb",
"lg",
"mh",
"ml",
"mr",
"mk",
"mg",
"mt",
"mn",
"mi",
"my",
"nl",
"no",
"ne",
"ny",
"oc",
"om",
"or",
"os",
"pa",
"pl",
"pt",
"ps",
"qu",
"ro",
"rn",
"ru",
"sg",
"sk",
"sl",
"sm",
"sn",
"sd",
"so",
"es",
"sq",
"su",
"sv",
"sw",
"ta",
"tt",
"te",
"tg",
"tl",
"th",
"ti",
"ts",
"tr",
"uk",
"vi",
"wo",
"xh",
"yo",
"zu",
"za",
"dataset:google/fleurs",
"arxiv:2305.13516",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"has_space",
"region:us"
] | audio-classification | facebook | null | null | facebook/mms-lid-126 | 11 | 31,333 | transformers | 2023-06-13T08:44:05 | ---
tags:
- mms
language:
- ab
- af
- ak
- am
- ar
- as
- av
- ay
- az
- ba
- bm
- be
- bn
- bi
- bo
- sh
- br
- bg
- ca
- cs
- ce
- cv
- ku
- cy
- da
- de
- dv
- dz
- el
- en
- eo
- et
- eu
- ee
- fo
- fa
- fj
- fi
- fr
- fy
- ff
- ga
- gl
- gn
- gu
- zh
- ht
- ha
- he
- hi
- sh
- hu
- hy
- ig
- ia
- ms
- is
- it
- jv
- ja
- kn
- ka
- kk
- kr
- km
- ki
- rw
- ky
- ko
- kv
- lo
- la
- lv
- ln
- lt
- lb
- lg
- mh
- ml
- mr
- ms
- mk
- mg
- mt
- mn
- mi
- my
- zh
- nl
- 'no'
- 'no'
- ne
- ny
- oc
- om
- or
- os
- pa
- pl
- pt
- ms
- ps
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- qu
- ro
- rn
- ru
- sg
- sk
- sl
- sm
- sn
- sd
- so
- es
- sq
- su
- sv
- sw
- ta
- tt
- te
- tg
- tl
- th
- ti
- ts
- tr
- uk
- ms
- vi
- wo
- xh
- ms
- yo
- ms
- zu
- za
license: cc-by-nc-4.0
datasets:
- google/fleurs
metrics:
- acc
---
# Massively Multilingual Speech (MMS) - Finetuned LID
This checkpoint is a model fine-tuned for speech language identification (LID) and part of Facebook's [Massive Multilingual Speech project](https://research.facebook.com/publications/scaling-speech-technology-to-1000-languages/).
This checkpoint is based on the [Wav2Vec2 architecture](https://huggingface.co/docs/transformers/model_doc/wav2vec2) and classifies raw audio input to a probability distribution over 126 output classes (each class representing a language).
The checkpoint consists of **1 billion parameters** and has been fine-tuned from [facebook/mms-1b](https://huggingface.co/facebook/mms-1b) on 126 languages.
## Table of Contents
- [Example](#example)
- [Supported Languages](#supported-languages)
- [Model details](#model-details)
- [Additional links](#additional-links)
## Example
This MMS checkpoint can be used with [Transformers](https://github.com/huggingface/transformers) to identify
the spoken language of an audio sample. It can recognize the [following 126 languages](#supported-languages).
Let's look at a simple example.
First, we install `transformers` and some other libraries:
```
pip install torch accelerate torchaudio datasets
pip install --upgrade transformers
```
**Note**: In order to use MMS you need to have at least `transformers >= 4.30` installed. If the `4.30` version
is not yet available [on PyPI](https://pypi.org/project/transformers/) make sure to install `transformers` from
source:
```
pip install git+https://github.com/huggingface/transformers.git
```
Next, we load a couple of audio samples via `datasets`. Make sure that the audio data is sampled at 16,000 Hz (16 kHz).
```py
from datasets import load_dataset, Audio
# English
stream_data = load_dataset("mozilla-foundation/common_voice_13_0", "en", split="test", streaming=True)
stream_data = stream_data.cast_column("audio", Audio(sampling_rate=16000))
en_sample = next(iter(stream_data))["audio"]["array"]
# Arabic
stream_data = load_dataset("mozilla-foundation/common_voice_13_0", "ar", split="test", streaming=True)
stream_data = stream_data.cast_column("audio", Audio(sampling_rate=16000))
ar_sample = next(iter(stream_data))["audio"]["array"]
```
Next, we load the model and processor
```py
from transformers import Wav2Vec2ForSequenceClassification, AutoFeatureExtractor
import torch
model_id = "facebook/mms-lid-126"
processor = AutoFeatureExtractor.from_pretrained(model_id)
model = Wav2Vec2ForSequenceClassification.from_pretrained(model_id)
```
Now we process the audio data and pass it to the model to classify it into a language, just as we usually do for Wav2Vec2 audio classification models such as [ehcalabres/wav2vec2-lg-xlsr-en-speech-emotion-recognition](https://huggingface.co/harshit345/xlsr-wav2vec-speech-emotion-recognition)
```py
# English
inputs = processor(en_sample, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs).logits
lang_id = torch.argmax(outputs, dim=-1)[0].item()
detected_lang = model.config.id2label[lang_id]
# 'eng'
# Arabic
inputs = processor(ar_sample, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs).logits
lang_id = torch.argmax(outputs, dim=-1)[0].item()
detected_lang = model.config.id2label[lang_id]
# 'ara'
```
To see all the supported languages of a checkpoint, you can print out the language ids as follows:
```py
processor.id2label.values()
```
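If you need more than the arg-max prediction, the following hedged sketch (reusing `outputs` from the example above) converts the logits into the five most probable languages:
```py
import torch

# Turn logits into probabilities and list the top-5 candidate languages
probs = torch.nn.functional.softmax(outputs, dim=-1)[0]
top5 = torch.topk(probs, k=5)
for p, idx in zip(top5.values.tolist(), top5.indices.tolist()):
    print(f"{model.config.id2label[idx]}: {p:.3f}")
```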
For more details about the architecture, please have a look at [the official docs](https://huggingface.co/docs/transformers/main/en/model_doc/mms).
## Supported Languages
This model supports 126 languages. Click below to toggle the full list of languages supported by this checkpoint, given as [ISO 639-3 codes](https://en.wikipedia.org/wiki/ISO_639-3).
You can find more details about the languages and their ISO 639-3 codes in the [MMS Language Coverage Overview](https://dl.fbaipublicfiles.com/mms/misc/language_coverage_mms.html).
<details>
<summary>Click to toggle</summary>
- ara
- cmn
- eng
- spa
- fra
- mlg
- swe
- por
- vie
- ful
- sun
- asm
- ben
- zlm
- kor
- ind
- hin
- tuk
- urd
- aze
- slv
- mon
- hau
- tel
- swh
- bod
- rus
- tur
- heb
- mar
- som
- tgl
- tat
- tha
- cat
- ron
- mal
- bel
- pol
- yor
- nld
- bul
- hat
- afr
- isl
- amh
- tam
- hun
- hrv
- lit
- cym
- fas
- mkd
- ell
- bos
- deu
- sqi
- jav
- nob
- uzb
- snd
- lat
- nya
- grn
- mya
- orm
- lin
- hye
- yue
- pan
- jpn
- kaz
- npi
- kat
- guj
- kan
- tgk
- ukr
- ces
- lav
- bak
- khm
- fao
- glg
- ltz
- lao
- mlt
- sin
- sna
- ita
- srp
- mri
- nno
- pus
- eus
- ory
- lug
- bre
- luo
- slk
- fin
- dan
- yid
- est
- ceb
- war
- san
- kir
- oci
- wol
- haw
- kam
- umb
- xho
- epo
- zul
- ibo
- abk
- ckb
- nso
- gle
- kea
- ast
- sco
- glv
- ina
</details>
## Model details
- **Developed by:** Vineel Pratap et al.
- **Model type:** Multilingual speech language identification (audio classification) model
- **Language(s):** 126 languages, see [supported languages](#supported-languages)
- **License:** CC-BY-NC 4.0 license
- **Num parameters**: 1 billion
- **Audio sampling rate**: 16,000 Hz (16 kHz)
- **Cite as:**
@article{pratap2023mms,
title={Scaling Speech Technology to 1,000+ Languages},
author={Vineel Pratap and Andros Tjandra and Bowen Shi and Paden Tomasello and Arun Babu and Sayani Kundu and Ali Elkahky and Zhaoheng Ni and Apoorv Vyas and Maryam Fazel-Zarandi and Alexei Baevski and Yossi Adi and Xiaohui Zhang and Wei-Ning Hsu and Alexis Conneau and Michael Auli},
journal={arXiv},
year={2023}
}
## Additional Links
- [Blog post](https://ai.facebook.com/blog/multilingual-model-speech-recognition/)
- [Transformers documentation](https://huggingface.co/docs/transformers/main/en/model_doc/mms)
- [Paper](https://arxiv.org/abs/2305.13516)
- [GitHub Repository](https://github.com/facebookresearch/fairseq/tree/main/examples/mms#asr)
- [Other **MMS** checkpoints](https://huggingface.co/models?other=mms)
- MMS base checkpoints:
- [facebook/mms-1b](https://huggingface.co/facebook/mms-1b)
- [facebook/mms-300m](https://huggingface.co/facebook/mms-300m)
- [Official Space](https://huggingface.co/spaces/facebook/MMS)
| 7,100 | [
[
-0.043548583984375,
-0.0278472900390625,
0.01038360595703125,
0.031158447265625,
-0.004039764404296875,
0.005039215087890625,
-0.016845703125,
-0.0350341796875,
0.015106201171875,
0.0159149169921875,
-0.051177978515625,
-0.041595458984375,
-0.05047607421875,
0.0019626617431640625,
-0.0183868408203125,
0.061065673828125,
0.0020294189453125,
0.01226043701171875,
0.0290069580078125,
-0.0202484130859375,
-0.0203399658203125,
-0.0245361328125,
-0.055267333984375,
-0.013214111328125,
0.0196685791015625,
0.03094482421875,
0.026641845703125,
0.043548583984375,
0.033538818359375,
0.0263671875,
-0.02020263671875,
-0.00045180320739746094,
-0.00960540771484375,
-0.014129638671875,
0.01165008544921875,
-0.01488494873046875,
-0.035186767578125,
0.0012845993041992188,
0.07305908203125,
0.039031982421875,
-0.006519317626953125,
0.0173492431640625,
0.00045561790466308594,
0.038665771484375,
-0.0212860107421875,
0.021026611328125,
-0.030975341796875,
-0.0035724639892578125,
-0.0210723876953125,
-0.003269195556640625,
-0.02978515625,
-0.0020751953125,
0.011474609375,
-0.035186767578125,
0.00678253173828125,
-0.016937255859375,
0.0810546875,
0.0077667236328125,
-0.028472900390625,
-0.0258941650390625,
-0.05621337890625,
0.06829833984375,
-0.047882080078125,
0.055755615234375,
0.0290374755859375,
0.02593994140625,
0.0023097991943359375,
-0.05389404296875,
-0.05047607421875,
-0.0009403228759765625,
0.00787353515625,
0.0243682861328125,
-0.0224761962890625,
-0.004756927490234375,
0.032989501953125,
0.02587890625,
-0.044219970703125,
0.01464080810546875,
-0.055450439453125,
-0.048583984375,
0.0400390625,
-0.005764007568359375,
0.0262603759765625,
-0.0246124267578125,
-0.01338958740234375,
-0.0228118896484375,
-0.0291290283203125,
0.02203369140625,
0.0176849365234375,
0.0323486328125,
-0.04730224609375,
0.040679931640625,
-0.0237884521484375,
0.052825927734375,
-0.0007772445678710938,
-0.038787841796875,
0.06024169921875,
-0.0284271240234375,
-0.006107330322265625,
0.00841522216796875,
0.071044921875,
0.01453399658203125,
0.0103607177734375,
0.00279998779296875,
-0.0010585784912109375,
0.0161590576171875,
-0.024261474609375,
-0.07208251953125,
-0.0022258758544921875,
0.032958984375,
-0.0213165283203125,
0.0037403106689453125,
-0.0037994384765625,
-0.045745849609375,
0.006458282470703125,
-0.017333984375,
0.034149169921875,
-0.04815673828125,
-0.035858154296875,
0.005786895751953125,
-0.020721435546875,
0.025360107421875,
0.0017957687377929688,
-0.06463623046875,
-0.00231170654296875,
0.0276336669921875,
0.0789794921875,
0.00720977783203125,
-0.032684326171875,
-0.037445068359375,
0.0111846923828125,
-0.00925445556640625,
0.036651611328125,
-0.0177459716796875,
-0.0310516357421875,
-0.005886077880859375,
0.015106201171875,
-0.0201873779296875,
-0.037384033203125,
0.051544189453125,
-0.0186004638671875,
0.0306396484375,
-0.025634765625,
-0.0293731689453125,
-0.0175018310546875,
-0.004123687744140625,
-0.0396728515625,
0.08673095703125,
0.00943756103515625,
-0.0621337890625,
0.030059814453125,
-0.041229248046875,
-0.0322265625,
-0.004199981689453125,
-0.0033206939697265625,
-0.032684326171875,
-0.0184478759765625,
0.028900146484375,
0.0352783203125,
-0.013946533203125,
0.003326416015625,
-0.007045745849609375,
-0.0263671875,
-0.0008478164672851562,
-0.0295257568359375,
0.096435546875,
0.0413818359375,
-0.047515869140625,
0.0031108856201171875,
-0.06036376953125,
-0.0035991668701171875,
-0.00313568115234375,
-0.044952392578125,
0.01885986328125,
0.0031833648681640625,
0.019866943359375,
0.040313720703125,
0.01018524169921875,
-0.0543212890625,
-0.01168060302734375,
-0.0386962890625,
0.046173095703125,
0.039276123046875,
-0.01502227783203125,
0.032073974609375,
-0.032958984375,
0.0249481201171875,
0.014312744140625,
0.000423431396484375,
-0.02520751953125,
-0.05194091796875,
-0.0709228515625,
-0.054534912109375,
0.011688232421875,
0.064697265625,
-0.05010986328125,
0.04302978515625,
-0.0318603515625,
-0.0604248046875,
-0.06072998046875,
0.0010318756103515625,
0.0255584716796875,
0.033355712890625,
0.035980224609375,
-0.021026611328125,
-0.05419921875,
-0.058929443359375,
-0.0007762908935546875,
-0.0272064208984375,
-0.01934814453125,
0.023468017578125,
0.02313232421875,
-0.033447265625,
0.06781005859375,
-0.0086517333984375,
-0.036529541015625,
-0.019866943359375,
0.00084686279296875,
0.01983642578125,
0.04278564453125,
0.0433349609375,
-0.048583984375,
-0.033538818359375,
-0.002475738525390625,
-0.047271728515625,
-0.01064300537109375,
0.0063018798828125,
0.006855010986328125,
0.0284271240234375,
0.0390625,
-0.035980224609375,
0.0070037841796875,
0.05804443359375,
-0.0175018310546875,
0.036163330078125,
-0.00009578466415405273,
0.0119171142578125,
-0.0987548828125,
0.004425048828125,
0.00328826904296875,
-0.0078125,
-0.04742431640625,
-0.006816864013671875,
0.0002803802490234375,
-0.00608062744140625,
-0.051177978515625,
0.04998779296875,
-0.0293426513671875,
-0.005718231201171875,
-0.007160186767578125,
0.01323699951171875,
-0.0204010009765625,
0.045806884765625,
0.01180267333984375,
0.05419921875,
0.072509765625,
-0.05316162109375,
0.0300445556640625,
0.01505279541015625,
-0.0377197265625,
0.043121337890625,
-0.047332763671875,
-0.00435638427734375,
0.00432586669921875,
0.031280517578125,
-0.0826416015625,
-0.00981903076171875,
0.0169830322265625,
-0.06591796875,
0.02215576171875,
-0.02215576171875,
-0.05084228515625,
-0.04058837890625,
-0.0033016204833984375,
0.0323486328125,
0.04852294921875,
-0.038909912109375,
0.042694091796875,
0.046478271484375,
-0.01203155517578125,
-0.044219970703125,
-0.0638427734375,
-0.006229400634765625,
-0.028961181640625,
-0.06622314453125,
0.021759033203125,
-0.008941650390625,
0.0064697265625,
-0.0252532958984375,
0.00011557340621948242,
-0.00142669677734375,
-0.0105133056640625,
0.017578125,
0.0158538818359375,
-0.01007843017578125,
-0.004032135009765625,
0.0031566619873046875,
-0.0232391357421875,
-0.004673004150390625,
-0.0275115966796875,
0.0614013671875,
-0.0173797607421875,
-0.0123138427734375,
-0.06011962890625,
0.0171356201171875,
0.054931640625,
-0.033416748046875,
0.0477294921875,
0.088623046875,
-0.035736083984375,
0.0182952880859375,
-0.051025390625,
-0.0131072998046875,
-0.040802001953125,
0.04327392578125,
-0.037017822265625,
-0.0712890625,
0.06298828125,
0.004512786865234375,
0.01276397705078125,
0.05511474609375,
0.06402587890625,
-0.004772186279296875,
0.08599853515625,
0.035308837890625,
-0.0302886962890625,
0.043426513671875,
-0.034515380859375,
0.00372314453125,
-0.06597900390625,
-0.026153564453125,
-0.056243896484375,
-0.0108184814453125,
-0.055877685546875,
-0.044830322265625,
0.0306243896484375,
-0.0006923675537109375,
-0.022186279296875,
0.042572021484375,
-0.037109375,
0.003330230712890625,
0.04327392578125,
-0.0088348388671875,
0.003719329833984375,
0.016082763671875,
-0.03436279296875,
-0.0111236572265625,
-0.0455322265625,
-0.031524658203125,
0.06536865234375,
0.0399169921875,
0.0238494873046875,
0.01384735107421875,
0.045654296875,
-0.00534820556640625,
0.002666473388671875,
-0.057708740234375,
0.042266845703125,
-0.009124755859375,
-0.06787109375,
-0.0205841064453125,
-0.040740966796875,
-0.0618896484375,
0.0161285400390625,
-0.0016183853149414062,
-0.07464599609375,
0.01788330078125,
-0.005931854248046875,
-0.0142669677734375,
0.0212249755859375,
-0.0535888671875,
0.053070068359375,
0.0115814208984375,
0.0007085800170898438,
-0.019134521484375,
-0.04901123046875,
0.0124969482421875,
0.003337860107421875,
0.03155517578125,
-0.023345947265625,
0.020263671875,
0.08953857421875,
-0.017242431640625,
0.0467529296875,
-0.022918701171875,
-0.00717926025390625,
0.032379150390625,
-0.01149749755859375,
0.02392578125,
-0.006275177001953125,
-0.01371002197265625,
0.034820556640625,
0.01366424560546875,
-0.0291290283203125,
-0.01219940185546875,
0.04901123046875,
-0.06719970703125,
-0.017822265625,
-0.0267791748046875,
-0.042266845703125,
-0.00955963134765625,
0.019500732421875,
0.044403076171875,
0.036285400390625,
0.00804901123046875,
0.01513671875,
0.037200927734375,
-0.03021240234375,
0.044952392578125,
0.048370361328125,
-0.01276397705078125,
-0.040802001953125,
0.0789794921875,
0.024322509765625,
0.028106689453125,
0.0209197998046875,
0.005100250244140625,
-0.0295867919921875,
-0.025848388671875,
-0.052581787109375,
0.028900146484375,
-0.0469970703125,
-0.0033779144287109375,
-0.063232421875,
-0.0295257568359375,
-0.054901123046875,
-0.005710601806640625,
-0.039520263671875,
-0.0362548828125,
-0.01454925537109375,
-0.004390716552734375,
0.03375244140625,
0.01113128662109375,
-0.02044677734375,
0.029876708984375,
-0.055511474609375,
0.03216552734375,
0.009552001953125,
0.0145111083984375,
-0.01045989990234375,
-0.0771484375,
-0.0283203125,
0.0228271484375,
-0.0171356201171875,
-0.06390380859375,
0.0308990478515625,
0.01287078857421875,
0.040130615234375,
0.034698486328125,
-0.0107574462890625,
0.05743408203125,
-0.044189453125,
0.055816650390625,
0.01134490966796875,
-0.08349609375,
0.038360595703125,
-0.033905029296875,
0.0355224609375,
0.0245361328125,
0.0312347412109375,
-0.06634521484375,
-0.042694091796875,
-0.0357666015625,
-0.06842041015625,
0.07696533203125,
0.023193359375,
0.0106048583984375,
0.005710601806640625,
0.001956939697265625,
-0.0277557373046875,
-0.00814056396484375,
-0.05609130859375,
-0.040771484375,
-0.01314544677734375,
-0.0199432373046875,
-0.01338958740234375,
-0.01488494873046875,
-0.0028400421142578125,
-0.037933349609375,
0.06085205078125,
0.01149749755859375,
0.0384521484375,
0.02099609375,
-0.00424957275390625,
-0.004108428955078125,
0.01300048828125,
0.04351806640625,
0.0190887451171875,
-0.01971435546875,
-0.0026645660400390625,
0.023895263671875,
-0.050537109375,
0.008392333984375,
0.0234222412109375,
-0.00933074951171875,
0.01316070556640625,
0.023895263671875,
0.06329345703125,
-0.0019044876098632812,
-0.045654296875,
0.02679443359375,
0.00213623046875,
-0.0162506103515625,
-0.0301055908203125,
-0.00878143310546875,
0.024261474609375,
0.020294189453125,
0.03302001953125,
-0.004344940185546875,
0.0023403167724609375,
-0.042205810546875,
0.0255279541015625,
0.0303955078125,
-0.039642333984375,
-0.01690673828125,
0.054473876953125,
0.01129150390625,
-0.0177154541015625,
0.057037353515625,
-0.0193939208984375,
-0.048370361328125,
0.046417236328125,
0.03485107421875,
0.059112548828125,
-0.04998779296875,
0.008636474609375,
0.05694580078125,
0.040283203125,
0.00334930419921875,
0.0253448486328125,
-0.005218505859375,
-0.0574951171875,
-0.0270233154296875,
-0.056427001953125,
-0.01313018798828125,
0.00033926963806152344,
-0.0455322265625,
0.04290771484375,
-0.0225677490234375,
-0.01091766357421875,
0.00981903076171875,
0.01349639892578125,
-0.048126220703125,
0.022979736328125,
0.02099609375,
0.06732177734375,
-0.0654296875,
0.08013916015625,
0.0278472900390625,
-0.0269927978515625,
-0.075439453125,
-0.0199432373046875,
0.01200103759765625,
-0.054718017578125,
0.033843994140625,
0.0257720947265625,
-0.01113128662109375,
0.011932373046875,
-0.01177215576171875,
-0.0767822265625,
0.08203125,
0.0254669189453125,
-0.033233642578125,
0.0095367431640625,
0.0191802978515625,
0.03680419921875,
-0.0099945068359375,
0.0244140625,
0.048370361328125,
0.04046630859375,
0.0249786376953125,
-0.0955810546875,
0.0009522438049316406,
-0.0291748046875,
-0.02685546875,
0.0017795562744140625,
-0.050689697265625,
0.059814453125,
-0.0217742919921875,
-0.0262451171875,
0.0014858245849609375,
0.0509033203125,
0.041748046875,
0.0205078125,
0.038848876953125,
0.0474853515625,
0.04229736328125,
-0.02197265625,
0.064453125,
-0.01953125,
0.027252197265625,
0.0560302734375,
0.00676727294921875,
0.06610107421875,
0.01776123046875,
-0.023040771484375,
0.03057861328125,
0.0574951171875,
-0.00569915771484375,
0.030364990234375,
-0.003627777099609375,
-0.0226593017578125,
0.004901885986328125,
-0.005962371826171875,
-0.046234130859375,
0.04730224609375,
0.036407470703125,
-0.01983642578125,
0.0007205009460449219,
0.01025390625,
0.02337646484375,
-0.037384033203125,
-0.008819580078125,
0.044952392578125,
0.0017728805541992188,
-0.034332275390625,
0.058258056640625,
0.0262603759765625,
0.063720703125,
-0.049530029296875,
0.0212249755859375,
-0.0050506591796875,
0.0115814208984375,
-0.0230255126953125,
-0.042205810546875,
0.0233612060546875,
0.00266265869140625,
-0.01456451416015625,
-0.005496978759765625,
0.01457977294921875,
-0.0428466796875,
-0.048583984375,
0.035186767578125,
0.0205535888671875,
0.0297698974609375,
-0.005229949951171875,
-0.05621337890625,
0.0187835693359375,
0.00983428955078125,
-0.0157623291015625,
0.011749267578125,
0.021820068359375,
0.00954437255859375,
0.049530029296875,
0.04815673828125,
0.0134429931640625,
0.03094482421875,
0.00664520263671875,
0.0517578125,
-0.039520263671875,
-0.04266357421875,
-0.04534912109375,
0.045989990234375,
0.0109100341796875,
-0.020172119140625,
0.07171630859375,
0.056549072265625,
0.080322265625,
0.01297760009765625,
0.050689697265625,
-0.0161895751953125,
0.04986572265625,
-0.031768798828125,
0.05560302734375,
-0.041107177734375,
0.0157623291015625,
-0.046295166015625,
-0.068115234375,
-0.0227813720703125,
0.0589599609375,
-0.0075531005859375,
0.0214996337890625,
0.039886474609375,
0.070556640625,
-0.0027980804443359375,
-0.015289306640625,
0.0116119384765625,
0.0204010009765625,
0.03814697265625,
0.044219970703125,
0.031494140625,
-0.049652099609375,
0.06011962890625,
-0.0372314453125,
-0.0169219970703125,
-0.00458526611328125,
-0.031707763671875,
-0.04815673828125,
-0.0743408203125,
-0.0299835205078125,
-0.037628173828125,
-0.029052734375,
0.053131103515625,
0.06524658203125,
-0.058502197265625,
-0.034881591796875,
0.00795745849609375,
-0.0014324188232421875,
-0.02178955078125,
-0.0167083740234375,
0.0419921875,
-0.00437164306640625,
-0.0863037109375,
0.031585693359375,
0.0169830322265625,
0.01025390625,
-0.0095672607421875,
-0.025054931640625,
-0.013580322265625,
0.003826141357421875,
0.029998779296875,
0.033355712890625,
-0.0535888671875,
-0.00923919677734375,
-0.0013580322265625,
-0.01149749755859375,
0.00286102294921875,
0.033966064453125,
-0.044219970703125,
0.044769287109375,
0.03955078125,
0.0223388671875,
0.048980712890625,
-0.033538818359375,
0.0159149169921875,
-0.046630859375,
0.04278564453125,
0.00839996337890625,
0.03765869140625,
0.0269622802734375,
0.0019626617431640625,
0.0239715576171875,
0.02825927734375,
-0.03863525390625,
-0.06549072265625,
0.016387939453125,
-0.0950927734375,
-0.00141143798828125,
0.094970703125,
-0.00440216064453125,
0.0061798095703125,
-0.01305389404296875,
-0.020050048828125,
0.0474853515625,
-0.0297698974609375,
0.04327392578125,
0.054351806640625,
0.003963470458984375,
-0.00847625732421875,
-0.04388427734375,
0.03863525390625,
0.035003662109375,
-0.039306640625,
-0.007198333740234375,
0.0286865234375,
0.029632568359375,
0.0245819091796875,
0.06317138671875,
-0.0251312255859375,
0.0219268798828125,
0.0005879402160644531,
0.01251220703125,
0.010894775390625,
-0.0216217041015625,
-0.03082275390625,
-0.01108551025390625,
-0.01399993896484375,
-0.0278778076171875
]
] |
Helsinki-NLP/opus-mt-en-jap | 2023-08-16T11:30:07.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"en",
"jap",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-en-jap | 5 | 31,230 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-en-jap
* source languages: en
* target languages: jap
* OPUS readme: [en-jap](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/en-jap/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-01-08.zip](https://object.pouta.csc.fi/OPUS-MT-models/en-jap/opus-2020-01-08.zip)
* test set translations: [opus-2020-01-08.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-jap/opus-2020-01-08.test.txt)
* test set scores: [opus-2020-01-08.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-jap/opus-2020-01-08.eval.txt)
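Not part of the original OPUS-MT release notes, but the converted checkpoint can be loaded with the `transformers` Marian classes. A minimal sketch (the example sentence is illustrative):

```python
from transformers import MarianMTModel, MarianTokenizer

model_name = 'Helsinki-NLP/opus-mt-en-jap'
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# The tokenizer bundles the SentencePiece model from the
# pre-processing step, so raw English text can be passed directly.
batch = tokenizer(['How are you?'], return_tensors='pt', padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```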
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| bible-uedin.en.jap | 42.1 | 0.960 |
| 830 | [
[
-0.01983642578125,
-0.033721923828125,
0.0155181884765625,
0.0292816162109375,
-0.03240966796875,
-0.02886962890625,
-0.0345458984375,
-0.0065460205078125,
0.0029087066650390625,
0.0367431640625,
-0.05206298828125,
-0.04010009765625,
-0.044525146484375,
0.0146026611328125,
-0.004638671875,
0.054779052734375,
-0.0094757080078125,
0.0350341796875,
0.01450347900390625,
-0.0309906005859375,
-0.0272064208984375,
-0.0247039794921875,
-0.037078857421875,
-0.0223236083984375,
0.0155181884765625,
0.02996826171875,
0.0340576171875,
0.0293731689453125,
0.0694580078125,
0.018280029296875,
-0.007720947265625,
0.0008525848388671875,
-0.031494140625,
0.00394439697265625,
0.00878143310546875,
-0.044189453125,
-0.051971435546875,
-0.011322021484375,
0.07843017578125,
0.03509521484375,
0.00006014108657836914,
0.0268707275390625,
0.0015516281127929688,
0.07025146484375,
-0.0269012451171875,
0.0116729736328125,
-0.04168701171875,
0.01003265380859375,
-0.0247039794921875,
-0.022186279296875,
-0.05316162109375,
-0.01934814453125,
0.01549530029296875,
-0.051727294921875,
-0.0036983489990234375,
0.0108184814453125,
0.1046142578125,
0.017608642578125,
-0.021209716796875,
-0.01136016845703125,
-0.04229736328125,
0.07952880859375,
-0.061676025390625,
0.04071044921875,
0.032318115234375,
0.0189971923828125,
0.01255035400390625,
-0.035675048828125,
-0.02056884765625,
0.0062255859375,
-0.01181793212890625,
0.01995849609375,
-0.003177642822265625,
-0.0167694091796875,
0.0267181396484375,
0.05657958984375,
-0.05657958984375,
-0.0081634521484375,
-0.03533935546875,
0.0017795562744140625,
0.048187255859375,
0.002223968505859375,
0.016845703125,
-0.016845703125,
-0.032562255859375,
-0.04534912109375,
-0.051971435546875,
0.00791168212890625,
0.0215911865234375,
0.0240936279296875,
-0.032073974609375,
0.051666259765625,
-0.010894775390625,
0.044708251953125,
-0.002475738525390625,
-0.00734710693359375,
0.07135009765625,
-0.02862548828125,
-0.0295257568359375,
-0.01039886474609375,
0.0860595703125,
0.025390625,
0.0095977783203125,
0.00920867919921875,
-0.01502227783203125,
-0.0240936279296875,
0.00838470458984375,
-0.0675048828125,
-0.00360870361328125,
0.004669189453125,
-0.03265380859375,
-0.0056304931640625,
0.0012121200561523438,
-0.0430908203125,
0.01409912109375,
-0.032073974609375,
0.046783447265625,
-0.054473876953125,
-0.0176544189453125,
0.027099609375,
-0.0011606216430664062,
0.02886962890625,
0.00113677978515625,
-0.05078125,
0.011260986328125,
0.025726318359375,
0.056427001953125,
-0.0233001708984375,
-0.016571044921875,
-0.0292205810546875,
-0.00931549072265625,
-0.006809234619140625,
0.044647216796875,
-0.009521484375,
-0.032470703125,
-0.0023212432861328125,
0.035003662109375,
-0.027435302734375,
-0.021881103515625,
0.09393310546875,
-0.02496337890625,
0.056121826171875,
-0.038116455078125,
-0.0426025390625,
-0.0290069580078125,
0.034515380859375,
-0.038909912109375,
0.0902099609375,
0.01206207275390625,
-0.0645751953125,
0.0169525146484375,
-0.062164306640625,
-0.01529693603515625,
0.004856109619140625,
0.004177093505859375,
-0.05206298828125,
0.00879669189453125,
0.0160064697265625,
0.0289764404296875,
-0.026580810546875,
0.02520751953125,
-0.002162933349609375,
-0.0250701904296875,
0.00566864013671875,
-0.0309906005859375,
0.08026123046875,
0.0213165283203125,
-0.0236968994140625,
0.004673004150390625,
-0.0667724609375,
0.0008325576782226562,
0.0026683807373046875,
-0.031707763671875,
-0.0209197998046875,
0.01043701171875,
0.016632080078125,
0.0055999755859375,
0.0253753662109375,
-0.048492431640625,
0.025390625,
-0.054718017578125,
0.002544403076171875,
0.04705810546875,
-0.0203094482421875,
0.0242156982421875,
-0.03009033203125,
0.0283355712890625,
0.005237579345703125,
0.00913238525390625,
-0.003936767578125,
-0.03778076171875,
-0.0689697265625,
-0.01137542724609375,
0.042144775390625,
0.07855224609375,
-0.060455322265625,
0.063720703125,
-0.052154541015625,
-0.058135986328125,
-0.0548095703125,
-0.01061248779296875,
0.0404052734375,
0.0238494873046875,
0.039520263671875,
-0.023101806640625,
-0.031585693359375,
-0.0836181640625,
-0.0168609619140625,
-0.01200103759765625,
-0.0175628662109375,
0.01198577880859375,
0.043853759765625,
-0.00954437255859375,
0.04241943359375,
-0.04241943359375,
-0.0281982421875,
-0.012908935546875,
0.01549530029296875,
0.03759765625,
0.048126220703125,
0.03936767578125,
-0.0633544921875,
-0.050445556640625,
0.0010690689086914062,
-0.0552978515625,
-0.0147552490234375,
0.0016393661499023438,
-0.01453399658203125,
-0.0001634359359741211,
0.00414276123046875,
-0.0191497802734375,
0.006923675537109375,
0.0440673828125,
-0.04180908203125,
0.03790283203125,
-0.0086822509765625,
0.027496337890625,
-0.10369873046875,
0.006988525390625,
-0.01039886474609375,
-0.0112762451171875,
-0.031951904296875,
0.003658294677734375,
0.0234527587890625,
0.004657745361328125,
-0.057647705078125,
0.033477783203125,
-0.0235748291015625,
-0.008331298828125,
0.01541900634765625,
-0.0018768310546875,
0.0045928955078125,
0.052490234375,
-0.00447845458984375,
0.06884765625,
0.0550537109375,
-0.036529541015625,
0.0137481689453125,
0.044769287109375,
-0.032470703125,
0.0286712646484375,
-0.055633544921875,
-0.0230255126953125,
0.0240325927734375,
-0.0082855224609375,
-0.04681396484375,
0.0034885406494140625,
0.025787353515625,
-0.047698974609375,
0.036590576171875,
-0.007251739501953125,
-0.053741455078125,
-0.0007662773132324219,
-0.0223541259765625,
0.034454345703125,
0.053436279296875,
-0.01187896728515625,
0.043670654296875,
0.007720947265625,
-0.00943756103515625,
-0.035247802734375,
-0.07623291015625,
-0.0079193115234375,
-0.025299072265625,
-0.054107666015625,
0.0145263671875,
-0.0283966064453125,
0.0005888938903808594,
-0.003086090087890625,
0.0232086181640625,
-0.00707244873046875,
0.0030803680419921875,
0.0095367431640625,
0.01334381103515625,
-0.040985107421875,
0.0086517333984375,
0.005115509033203125,
-0.015777587890625,
-0.00901031494140625,
-0.0096435546875,
0.042236328125,
-0.03033447265625,
-0.0243682861328125,
-0.0450439453125,
0.006771087646484375,
0.039886474609375,
-0.02587890625,
0.0631103515625,
0.038055419921875,
-0.0031299591064453125,
0.01506805419921875,
-0.0292205810546875,
0.0116119384765625,
-0.03363037109375,
0.007419586181640625,
-0.0345458984375,
-0.057342529296875,
0.03375244140625,
0.0037326812744140625,
0.035308837890625,
0.06683349609375,
0.04766845703125,
0.0083465576171875,
0.050018310546875,
0.023284912109375,
0.00016379356384277344,
0.035736083984375,
-0.03167724609375,
-0.01104736328125,
-0.07843017578125,
0.00806427001953125,
-0.0494384765625,
-0.022705078125,
-0.06390380859375,
-0.0203094482421875,
0.022674560546875,
0.005245208740234375,
-0.01506805419921875,
0.053741455078125,
-0.044525146484375,
0.0248260498046875,
0.0440673828125,
-0.0122222900390625,
0.0238037109375,
0.004009246826171875,
-0.039276123046875,
-0.0210113525390625,
-0.029876708984375,
-0.039276123046875,
0.09820556640625,
0.0272064208984375,
0.023162841796875,
0.018798828125,
0.03643798828125,
0.0036773681640625,
0.019287109375,
-0.04449462890625,
0.03759765625,
-0.0159759521484375,
-0.0560302734375,
-0.0227203369140625,
-0.0406494140625,
-0.06329345703125,
0.035614013671875,
-0.0191650390625,
-0.03900146484375,
0.00788116455078125,
-0.0015125274658203125,
-0.0070037841796875,
0.0293121337890625,
-0.0528564453125,
0.08099365234375,
-0.00566864013671875,
-0.0026035308837890625,
0.02130126953125,
-0.039764404296875,
0.0228424072265625,
-0.0006966590881347656,
0.0184326171875,
-0.0142059326171875,
0.01448822021484375,
0.047027587890625,
-0.00629425048828125,
0.02996826171875,
-0.00417327880859375,
0.00012814998626708984,
0.004673004150390625,
0.0078277587890625,
0.0246124267578125,
-0.006404876708984375,
-0.0283355712890625,
0.0254669189453125,
0.0019054412841796875,
-0.034454345703125,
-0.0110321044921875,
0.0428466796875,
-0.0582275390625,
-0.00711822509765625,
-0.03900146484375,
-0.048980712890625,
-0.0005707740783691406,
0.0236968994140625,
0.04925537109375,
0.049102783203125,
-0.016510009765625,
0.044708251953125,
0.06304931640625,
-0.0265960693359375,
0.0306243896484375,
0.056182861328125,
-0.01739501953125,
-0.037384033203125,
0.062408447265625,
0.01007843017578125,
0.03216552734375,
0.04632568359375,
0.012908935546875,
-0.006229400634765625,
-0.0528564453125,
-0.0557861328125,
0.0167388916015625,
-0.0233612060546875,
-0.012237548828125,
-0.03765869140625,
-0.0102081298828125,
-0.0227813720703125,
0.01416778564453125,
-0.039031982421875,
-0.04205322265625,
-0.01053619384765625,
-0.008636474609375,
0.01448822021484375,
0.0145111083984375,
-0.0006275177001953125,
0.038360595703125,
-0.078857421875,
0.01255035400390625,
-0.01053619384765625,
0.02716064453125,
-0.035308837890625,
-0.06268310546875,
-0.036956787109375,
0.005977630615234375,
-0.056396484375,
-0.055908203125,
0.039764404296875,
0.01096343994140625,
0.0106048583984375,
0.0297088623046875,
0.0130157470703125,
0.0309295654296875,
-0.051727294921875,
0.077392578125,
0.00020301342010498047,
-0.055511474609375,
0.039520263671875,
-0.035491943359375,
0.03875732421875,
0.07098388671875,
0.0213775634765625,
-0.0301361083984375,
-0.045440673828125,
-0.053863525390625,
-0.058135986328125,
0.0611572265625,
0.054595947265625,
-0.01134490966796875,
0.01544189453125,
-0.005126953125,
0.00025010108947753906,
0.0082855224609375,
-0.0908203125,
-0.02886962890625,
0.007358551025390625,
-0.025482177734375,
-0.01174163818359375,
-0.018035888671875,
-0.01279449462890625,
-0.017242431640625,
0.0848388671875,
0.014984130859375,
0.01198577880859375,
0.030487060546875,
-0.0139007568359375,
-0.0193939208984375,
0.029022216796875,
0.067626953125,
0.03643798828125,
-0.042510986328125,
-0.0184478759765625,
0.02410888671875,
-0.0333251953125,
-0.00656890869140625,
0.00775146484375,
-0.0379638671875,
0.0228424072265625,
0.0350341796875,
0.0762939453125,
0.0155029296875,
-0.046295166015625,
0.038818359375,
-0.0284576416015625,
-0.0273284912109375,
-0.053436279296875,
-0.00501251220703125,
0.0082244873046875,
0.0002923011779785156,
0.01425933837890625,
0.010009765625,
0.01556396484375,
-0.01496124267578125,
0.007747650146484375,
0.0095062255859375,
-0.046844482421875,
-0.0360107421875,
0.033416748046875,
0.00798797607421875,
-0.026031494140625,
0.034912109375,
-0.0302734375,
-0.0400390625,
0.0259246826171875,
0.013824462890625,
0.0792236328125,
-0.02044677734375,
-0.0178375244140625,
0.055816650390625,
0.042755126953125,
-0.0187835693359375,
0.0300445556640625,
0.0048065185546875,
-0.05078125,
-0.0419921875,
-0.0672607421875,
-0.01519775390625,
0.00934600830078125,
-0.062103271484375,
0.0251617431640625,
0.021453857421875,
0.0023136138916015625,
-0.0242919921875,
0.0186920166015625,
-0.037841796875,
0.0121307373046875,
-0.0162506103515625,
0.0718994140625,
-0.0731201171875,
0.06463623046875,
0.0345458984375,
-0.022979736328125,
-0.06097412109375,
-0.01442718505859375,
-0.0140838623046875,
-0.0367431640625,
0.043731689453125,
0.018707275390625,
0.021759033203125,
-0.01329803466796875,
-0.0150146484375,
-0.060882568359375,
0.08148193359375,
0.012939453125,
-0.052459716796875,
0.0035762786865234375,
0.0170745849609375,
0.035552978515625,
-0.0212554931640625,
0.01522064208984375,
0.0305938720703125,
0.05419921875,
0.005016326904296875,
-0.07830810546875,
-0.022796630859375,
-0.038726806640625,
-0.0179901123046875,
0.044219970703125,
-0.04180908203125,
0.070556640625,
0.03912353515625,
-0.012115478515625,
0.006961822509765625,
0.046844482421875,
0.018829345703125,
0.0249786376953125,
0.04119873046875,
0.08807373046875,
0.0200958251953125,
-0.0369873046875,
0.07977294921875,
-0.024658203125,
0.03912353515625,
0.08319091796875,
-0.0013418197631835938,
0.06561279296875,
0.021453857421875,
-0.0126495361328125,
0.0369873046875,
0.0438232421875,
-0.0202484130859375,
0.032806396484375,
0.0014543533325195312,
0.008697509765625,
-0.0142059326171875,
0.0184478759765625,
-0.058319091796875,
0.0207672119140625,
0.0138397216796875,
-0.020172119140625,
0.00411224365234375,
-0.0026149749755859375,
0.004657745361328125,
-0.0038166046142578125,
-0.00725555419921875,
0.04736328125,
-0.0007448196411132812,
-0.048309326171875,
0.051361083984375,
-0.0030364990234375,
0.04876708984375,
-0.056121826171875,
0.013214111328125,
-0.0027637481689453125,
0.018798828125,
0.00656890869140625,
-0.042999267578125,
0.033477783203125,
-0.0007839202880859375,
-0.0201416015625,
-0.03582763671875,
0.0174102783203125,
-0.04180908203125,
-0.068359375,
0.026519775390625,
0.03326416015625,
0.026580810546875,
0.005329132080078125,
-0.06988525390625,
-0.0016613006591796875,
0.01593017578125,
-0.053375244140625,
0.0101318359375,
0.050323486328125,
0.0257720947265625,
0.0361328125,
0.0465087890625,
0.0181121826171875,
0.01025390625,
0.002086639404296875,
0.048583984375,
-0.03759765625,
-0.0305938720703125,
-0.061676025390625,
0.057647705078125,
-0.012939453125,
-0.052154541015625,
0.05999755859375,
0.0762939453125,
0.0716552734375,
-0.01192474365234375,
0.0224456787109375,
-0.0030059814453125,
0.0562744140625,
-0.048095703125,
0.045562744140625,
-0.07293701171875,
0.0177764892578125,
-0.015594482421875,
-0.06903076171875,
-0.0196075439453125,
0.030853271484375,
-0.01568603515625,
-0.027557373046875,
0.0582275390625,
0.04931640625,
-0.01593017578125,
-0.00595855712890625,
0.0204010009765625,
0.0222625732421875,
0.0174102783203125,
0.04736328125,
0.0297698974609375,
-0.07379150390625,
0.037200927734375,
-0.020263671875,
-0.0081024169921875,
-0.003696441650390625,
-0.0543212890625,
-0.060272216796875,
-0.046539306640625,
-0.0135498046875,
-0.01367950439453125,
-0.0226287841796875,
0.06829833984375,
0.034454345703125,
-0.0673828125,
-0.0421142578125,
0.0005321502685546875,
0.006900787353515625,
-0.013946533203125,
-0.0205078125,
0.0433349609375,
-0.0269622802734375,
-0.0704345703125,
0.03533935546875,
0.0117034912109375,
-0.006427764892578125,
0.0011072158813476562,
-0.0176239013671875,
-0.036865234375,
-0.0019330978393554688,
0.0233917236328125,
0.0034637451171875,
-0.04132080078125,
0.005290985107421875,
0.0024738311767578125,
-0.006961822509765625,
0.03033447265625,
0.0278167724609375,
-0.0190887451171875,
0.027557373046875,
0.058746337890625,
0.0308837890625,
0.034423828125,
-0.009002685546875,
0.0401611328125,
-0.0535888671875,
0.025115966796875,
0.0169525146484375,
0.045654296875,
0.0279693603515625,
0.0010814666748046875,
0.05657958984375,
0.0179901123046875,
-0.052947998046875,
-0.073486328125,
0.009674072265625,
-0.090087890625,
-0.00339508056640625,
0.06524658203125,
-0.0242919921875,
-0.0200958251953125,
0.02520751953125,
-0.011627197265625,
0.01495361328125,
-0.0249786376953125,
0.033477783203125,
0.06622314453125,
0.0290679931640625,
0.006839752197265625,
-0.05712890625,
0.0277557373046875,
0.043701171875,
-0.054107666015625,
-0.013824462890625,
0.01493072509765625,
0.00865936279296875,
0.0335693359375,
0.03643798828125,
-0.0168609619140625,
0.007572174072265625,
-0.01959228515625,
0.0301361083984375,
-0.00312042236328125,
-0.01114654541015625,
-0.01947021484375,
0.0034618377685546875,
-0.005100250244140625,
-0.020111083984375
]
] |
timm/twins_pcpvt_base.in1k | 2023-04-23T23:21:45.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2104.13840",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/twins_pcpvt_base.in1k | 0 | 31,122 | timm | 2023-04-23T23:21:13 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for twins_pcpvt_base.in1k
A Twins-PCPVT image classification model. Trained on ImageNet-1k by the paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 43.8
- GMACs: 6.7
- Activations (M): 25.2
- Image size: 224 x 224
- **Papers:**
- Twins: Revisiting the Design of Spatial Attention in Vision Transformers: https://arxiv.org/abs/2104.13840
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/Meituan-AutoML/Twins
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('twins_pcpvt_base.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'twins_pcpvt_base.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 49, 512) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{chu2021Twins,
title={Twins: Revisiting the Design of Spatial Attention in Vision Transformers},
author={Xiangxiang Chu and Zhi Tian and Yuqing Wang and Bo Zhang and Haibing Ren and Xiaolin Wei and Huaxia Xia and Chunhua Shen},
booktitle={NeurIPS 2021},
url={https://openreview.net/forum?id=5kTlVBkzSRx},
year={2021}
}
```
| 2,841 | [
[
-0.038848876953125,
-0.033447265625,
0.004669189453125,
0.02899169921875,
-0.01029205322265625,
-0.03106689453125,
-0.00872802734375,
-0.0262908935546875,
0.016815185546875,
0.032073974609375,
-0.044097900390625,
-0.035430908203125,
-0.04925537109375,
-0.007236480712890625,
-0.005008697509765625,
0.06280517578125,
-0.0088958740234375,
0.004180908203125,
-0.0240020751953125,
-0.0245208740234375,
-0.025390625,
-0.033355712890625,
-0.058807373046875,
-0.0251922607421875,
0.0235443115234375,
0.0214385986328125,
0.04205322265625,
0.038238525390625,
0.060943603515625,
0.03558349609375,
-0.0032787322998046875,
0.00716400146484375,
-0.0186614990234375,
-0.0288238525390625,
0.025543212890625,
-0.040374755859375,
-0.032623291015625,
0.0246124267578125,
0.0576171875,
0.04205322265625,
0.008575439453125,
0.018585205078125,
0.023681640625,
0.05291748046875,
-0.01253509521484375,
0.022003173828125,
-0.026031494140625,
0.032958984375,
0.0034656524658203125,
-0.0071868896484375,
-0.0222625732421875,
-0.03277587890625,
0.0304107666015625,
-0.037445068359375,
0.04425048828125,
-0.0030670166015625,
0.0845947265625,
0.0345458984375,
-0.00843048095703125,
0.011810302734375,
-0.023681640625,
0.063720703125,
-0.039794921875,
0.0182952880859375,
0.0164337158203125,
0.010589599609375,
-0.0020961761474609375,
-0.06060791015625,
-0.040374755859375,
-0.012908935546875,
-0.022918701171875,
0.004589080810546875,
-0.0222930908203125,
-0.0036869049072265625,
0.03057861328125,
0.00984954833984375,
-0.04229736328125,
-0.002628326416015625,
-0.040679931640625,
-0.0020427703857421875,
0.038543701171875,
0.01192474365234375,
0.0220489501953125,
-0.0170440673828125,
-0.055450439453125,
-0.027618408203125,
-0.01099395751953125,
0.015106201171875,
0.016815185546875,
0.00789642333984375,
-0.050048828125,
0.030670166015625,
0.016143798828125,
0.0322265625,
0.037750244140625,
-0.0001595020294189453,
0.056976318359375,
-0.01568603515625,
-0.03466796875,
-0.020599365234375,
0.07672119140625,
0.035003662109375,
0.0224761962890625,
-0.002246856689453125,
0.01241302490234375,
-0.0230712890625,
0.01024627685546875,
-0.06500244140625,
-0.0254058837890625,
0.0211181640625,
-0.04345703125,
-0.024261474609375,
0.0238494873046875,
-0.05316162109375,
0.00258636474609375,
-0.0238494873046875,
0.046661376953125,
-0.0333251953125,
-0.026214599609375,
0.0176544189453125,
-0.0196075439453125,
0.0428466796875,
0.001293182373046875,
-0.053680419921875,
0.00971221923828125,
0.031890869140625,
0.072509765625,
-0.00882720947265625,
-0.057647705078125,
-0.006427764892578125,
-0.01273345947265625,
-0.0110626220703125,
0.04681396484375,
-0.0014123916625976562,
-0.004062652587890625,
-0.024810791015625,
0.026397705078125,
-0.0284881591796875,
-0.047088623046875,
0.0240936279296875,
-0.006046295166015625,
0.0283966064453125,
-0.0167236328125,
-0.0024852752685546875,
-0.02325439453125,
0.0203399658203125,
-0.0330810546875,
0.08746337890625,
0.01348114013671875,
-0.076171875,
0.03302001953125,
-0.04827880859375,
-0.0101165771484375,
-0.0034046173095703125,
-0.0016145706176757812,
-0.08074951171875,
-0.0017261505126953125,
0.02276611328125,
0.048095703125,
0.003936767578125,
0.005672454833984375,
-0.043914794921875,
-0.017425537109375,
0.016021728515625,
-0.007198333740234375,
0.07659912109375,
-0.013031005859375,
-0.035858154296875,
0.0179290771484375,
-0.037750244140625,
0.0152435302734375,
0.040069580078125,
-0.01427459716796875,
-0.00432586669921875,
-0.03338623046875,
0.0130767822265625,
0.0253448486328125,
0.0172271728515625,
-0.03643798828125,
0.021728515625,
-0.020965576171875,
0.0330810546875,
0.04571533203125,
-0.0106353759765625,
0.02667236328125,
-0.0257415771484375,
0.019775390625,
0.0250701904296875,
0.0394287109375,
0.0137786865234375,
-0.050048828125,
-0.06817626953125,
-0.0235137939453125,
0.0028934478759765625,
0.036163330078125,
-0.042236328125,
0.054229736328125,
-0.0109710693359375,
-0.0440673828125,
-0.03289794921875,
-0.005756378173828125,
0.037384033203125,
0.03155517578125,
0.022613525390625,
-0.03961181640625,
-0.043487548828125,
-0.07843017578125,
-0.0094146728515625,
0.004688262939453125,
0.007411956787109375,
0.0394287109375,
0.053497314453125,
-0.01502227783203125,
0.0504150390625,
-0.035491943359375,
-0.019866943359375,
-0.01175689697265625,
0.005603790283203125,
0.03271484375,
0.055267333984375,
0.06915283203125,
-0.054840087890625,
-0.04339599609375,
-0.020111083984375,
-0.06512451171875,
0.005405426025390625,
0.00029349327087402344,
-0.0242919921875,
0.0260162353515625,
0.0101776123046875,
-0.04931640625,
0.054840087890625,
0.01171875,
-0.040313720703125,
0.039337158203125,
-0.0240020751953125,
0.01226806640625,
-0.07550048828125,
0.0141448974609375,
0.0169525146484375,
-0.0011720657348632812,
-0.038543701171875,
-0.0014324188232421875,
0.004993438720703125,
-0.002429962158203125,
-0.041748046875,
0.035980224609375,
-0.041717529296875,
-0.0191802978515625,
-0.01334381103515625,
-0.01971435546875,
-0.00003618001937866211,
0.073974609375,
0.011322021484375,
0.016082763671875,
0.054718017578125,
-0.030792236328125,
0.04022216796875,
0.041168212890625,
-0.007293701171875,
0.0394287109375,
-0.054412841796875,
0.018341064453125,
-0.0118865966796875,
0.01325225830078125,
-0.08587646484375,
-0.00731658935546875,
0.02899169921875,
-0.05224609375,
0.03973388671875,
-0.030517578125,
-0.04156494140625,
-0.055938720703125,
-0.03448486328125,
0.040252685546875,
0.038482666015625,
-0.07037353515625,
0.02557373046875,
0.017242431640625,
0.01039886474609375,
-0.044189453125,
-0.077880859375,
-0.0173797607421875,
-0.0230255126953125,
-0.059112548828125,
0.023193359375,
-0.005229949951171875,
0.01302337646484375,
0.01479339599609375,
0.006412506103515625,
-0.01439666748046875,
-0.0255584716796875,
0.036895751953125,
0.0292816162109375,
-0.01983642578125,
-0.0008482933044433594,
-0.0206146240234375,
-0.025848388671875,
0.01548004150390625,
-0.027069091796875,
0.042236328125,
-0.0273590087890625,
-0.015869140625,
-0.06524658203125,
0.0131683349609375,
0.0479736328125,
-0.00435638427734375,
0.060943603515625,
0.0838623046875,
-0.03863525390625,
0.008819580078125,
-0.04742431640625,
-0.0174560546875,
-0.037689208984375,
0.032958984375,
-0.0172119140625,
-0.02911376953125,
0.04339599609375,
0.02923583984375,
0.002277374267578125,
0.047393798828125,
0.035125732421875,
-0.00482940673828125,
0.06787109375,
0.05194091796875,
0.005878448486328125,
0.051849365234375,
-0.06610107421875,
-0.004848480224609375,
-0.0665283203125,
-0.04638671875,
-0.0232391357421875,
-0.054412841796875,
-0.03875732421875,
-0.035491943359375,
0.038482666015625,
-0.011566162109375,
-0.0272216796875,
0.031951904296875,
-0.07171630859375,
0.028350830078125,
0.0654296875,
0.032867431640625,
-0.019500732421875,
0.037445068359375,
-0.020599365234375,
-0.01837158203125,
-0.05633544921875,
-0.00514984130859375,
0.07684326171875,
0.033538818359375,
0.0362548828125,
-0.01435089111328125,
0.035491943359375,
-0.0300140380859375,
0.01055908203125,
-0.04034423828125,
0.041595458984375,
-0.0180206298828125,
-0.0307464599609375,
-0.0011587142944335938,
-0.0291748046875,
-0.07684326171875,
0.0218963623046875,
-0.0171051025390625,
-0.044708251953125,
0.026092529296875,
0.0097503662109375,
-0.033050537109375,
0.0556640625,
-0.0693359375,
0.06793212890625,
-0.02496337890625,
-0.042572021484375,
0.0009698867797851562,
-0.04547119140625,
0.01751708984375,
0.0166168212890625,
-0.035430908203125,
0.01263427734375,
0.00554656982421875,
0.0723876953125,
-0.037384033203125,
0.07476806640625,
-0.0300140380859375,
0.034393310546875,
0.0301666259765625,
-0.0270233154296875,
0.01226043701171875,
-0.0070648193359375,
0.01470184326171875,
0.037689208984375,
0.0012664794921875,
-0.04248046875,
-0.030303955078125,
0.0594482421875,
-0.08160400390625,
-0.027496337890625,
-0.035247802734375,
-0.0499267578125,
0.01059722900390625,
0.003326416015625,
0.042205810546875,
0.037139892578125,
0.0247344970703125,
0.0225982666015625,
0.045318603515625,
-0.0355224609375,
0.040374755859375,
0.00838470458984375,
-0.00795745849609375,
-0.042877197265625,
0.047210693359375,
0.00844573974609375,
0.0030574798583984375,
0.016571044921875,
0.01381683349609375,
-0.0246124267578125,
-0.02740478515625,
-0.007595062255859375,
0.043304443359375,
-0.0264129638671875,
-0.01263427734375,
-0.053741455078125,
-0.03582763671875,
-0.04010009765625,
-0.01047515869140625,
-0.0297698974609375,
-0.026275634765625,
-0.028106689453125,
0.01568603515625,
0.0301666259765625,
0.055999755859375,
-0.01474761962890625,
0.0265045166015625,
-0.041900634765625,
0.0192413330078125,
0.0267486572265625,
0.044708251953125,
-0.00817108154296875,
-0.062255859375,
-0.0091400146484375,
-0.006839752197265625,
-0.04205322265625,
-0.05987548828125,
0.051300048828125,
-0.00946044921875,
0.036041259765625,
0.022796630859375,
-0.0369873046875,
0.0628662109375,
-0.0043792724609375,
0.033416748046875,
0.0227508544921875,
-0.040069580078125,
0.049468994140625,
-0.01300811767578125,
0.024627685546875,
0.00525665283203125,
0.02349853515625,
-0.017364501953125,
0.013427734375,
-0.07843017578125,
-0.05169677734375,
0.0799560546875,
0.013916015625,
-0.009735107421875,
0.026947021484375,
0.0606689453125,
-0.00013840198516845703,
0.005779266357421875,
-0.0677490234375,
-0.03466796875,
-0.0162506103515625,
-0.00994110107421875,
-0.00634765625,
-0.01544952392578125,
0.01210784912109375,
-0.066162109375,
0.04638671875,
0.0045928955078125,
0.06707763671875,
0.033111572265625,
0.01038360595703125,
-0.019927978515625,
-0.037261962890625,
0.0297698974609375,
0.01519012451171875,
-0.0177154541015625,
-0.00634002685546875,
0.017364501953125,
-0.046051025390625,
-0.01207733154296875,
0.00946044921875,
-0.0022068023681640625,
0.00298309326171875,
0.029693603515625,
0.061798095703125,
0.01081085205078125,
0.00765228271484375,
0.04278564453125,
-0.0177001953125,
-0.039886474609375,
-0.03253173828125,
-0.0031795501708984375,
-0.0017910003662109375,
0.0340576171875,
0.01371002197265625,
0.034759521484375,
-0.0154876708984375,
-0.01139068603515625,
0.01235198974609375,
0.044830322265625,
-0.0243682861328125,
-0.0291900634765625,
0.04974365234375,
0.0006232261657714844,
-0.0160675048828125,
0.070556640625,
-0.005573272705078125,
-0.048919677734375,
0.0880126953125,
0.02923583984375,
0.08447265625,
-0.01210784912109375,
-0.00627899169921875,
0.070068359375,
0.0281982421875,
0.0098419189453125,
0.0245513916015625,
0.004119873046875,
-0.054351806640625,
0.003650665283203125,
-0.03436279296875,
0.0018949508666992188,
0.03240966796875,
-0.037445068359375,
0.022552490234375,
-0.050689697265625,
-0.022613525390625,
0.0018091201782226562,
0.03021240234375,
-0.07159423828125,
-0.003383636474609375,
0.0086669921875,
0.05328369140625,
-0.06317138671875,
0.051788330078125,
0.07098388671875,
-0.031585693359375,
-0.05621337890625,
-0.00769805908203125,
-0.001964569091796875,
-0.07659912109375,
0.044281005859375,
0.0374755859375,
0.01424407958984375,
0.000885009765625,
-0.0662841796875,
-0.05712890625,
0.0958251953125,
0.0227508544921875,
-0.0222930908203125,
0.0005040168762207031,
0.01277923583984375,
0.0296630859375,
-0.02703857421875,
0.026336669921875,
0.022918701171875,
0.036102294921875,
0.01453399658203125,
-0.056793212890625,
-0.00518035888671875,
-0.028900146484375,
0.0015964508056640625,
0.0207977294921875,
-0.07177734375,
0.066162109375,
-0.050872802734375,
-0.01284027099609375,
0.005344390869140625,
0.037933349609375,
0.0172882080078125,
0.0185089111328125,
0.039337158203125,
0.0693359375,
0.02435302734375,
-0.025177001953125,
0.06427001953125,
-0.005908966064453125,
0.047607421875,
0.05218505859375,
0.0268402099609375,
0.04742431640625,
0.039520263671875,
-0.0254364013671875,
0.0201263427734375,
0.05816650390625,
-0.038238525390625,
0.02294921875,
0.0055999755859375,
0.0137786865234375,
-0.01445770263671875,
0.014892578125,
-0.040374755859375,
0.028167724609375,
0.015106201171875,
-0.039398193359375,
-0.002960205078125,
0.00109100341796875,
-0.0134429931640625,
-0.02191162109375,
-0.01044464111328125,
0.0399169921875,
-0.00232696533203125,
-0.050140380859375,
0.04638671875,
-0.01363372802734375,
0.06573486328125,
-0.0193328857421875,
0.002681732177734375,
-0.019989013671875,
0.036590576171875,
-0.024658203125,
-0.0638427734375,
0.0274810791015625,
-0.0242919921875,
-0.0203399658203125,
-0.0140838623046875,
0.04144287109375,
-0.036102294921875,
-0.04345703125,
0.021240234375,
0.0169219970703125,
0.027923583984375,
-0.005138397216796875,
-0.10009765625,
-0.0011587142944335938,
0.00962066650390625,
-0.0248260498046875,
0.007022857666015625,
0.0249786376953125,
0.0207366943359375,
0.05377197265625,
0.041961669921875,
-0.01078033447265625,
0.0248870849609375,
-0.01120758056640625,
0.0452880859375,
-0.04638671875,
-0.035858154296875,
-0.057708740234375,
0.0313720703125,
0.0005688667297363281,
-0.039031982421875,
0.039886474609375,
0.048065185546875,
0.06622314453125,
-0.017852783203125,
0.03204345703125,
-0.0032863616943359375,
0.00737762451171875,
-0.016357421875,
0.050872802734375,
-0.043121337890625,
-0.007213592529296875,
-0.016204833984375,
-0.06329345703125,
-0.033447265625,
0.04949951171875,
-0.0310821533203125,
0.040069580078125,
0.058563232421875,
0.0662841796875,
-0.02496337890625,
-0.0203399658203125,
0.0116729736328125,
0.0291900634765625,
-0.0012121200561523438,
0.071044921875,
0.044647216796875,
-0.060821533203125,
0.0304412841796875,
-0.0565185546875,
-0.0208282470703125,
-0.01519012451171875,
-0.041534423828125,
-0.0814208984375,
-0.061920166015625,
-0.053802490234375,
-0.0271148681640625,
-0.01593017578125,
0.050933837890625,
0.078857421875,
-0.042388916015625,
-0.003787994384765625,
0.00217437744140625,
0.0018644332885742188,
-0.0187225341796875,
-0.0187835693359375,
0.0498046875,
-0.01097869873046875,
-0.0667724609375,
-0.03253173828125,
0.01357269287109375,
0.032989501953125,
-0.005859375,
-0.005756378173828125,
-0.0175628662109375,
-0.0260467529296875,
0.017669677734375,
0.0177459716796875,
-0.051300048828125,
-0.01338958740234375,
-0.0205078125,
-0.01806640625,
0.03314208984375,
0.03558349609375,
-0.053863525390625,
0.017547607421875,
0.045928955078125,
0.01544189453125,
0.04718017578125,
-0.019805908203125,
0.00342559814453125,
-0.05999755859375,
0.033447265625,
-0.01105499267578125,
0.0445556640625,
0.02593994140625,
-0.034637451171875,
0.040557861328125,
0.041351318359375,
-0.04498291015625,
-0.04827880859375,
-0.00946044921875,
-0.08978271484375,
0.00010526180267333984,
0.0675048828125,
-0.006561279296875,
-0.040313720703125,
0.020233154296875,
-0.01258087158203125,
0.03436279296875,
0.002712249755859375,
0.04669189453125,
0.0201416015625,
-0.00769805908203125,
-0.02996826171875,
-0.029449462890625,
0.023712158203125,
0.039031982421875,
-0.0484619140625,
-0.0223236083984375,
0.0010328292846679688,
0.048187255859375,
0.0203399658203125,
0.02581787109375,
-0.0234832763671875,
0.0231475830078125,
0.0011796951293945312,
0.039215087890625,
-0.021392822265625,
-0.0176849365234375,
-0.041778564453125,
0.00974273681640625,
-0.0261993408203125,
-0.047698974609375
]
] |
cross-encoder/stsb-distilroberta-base | 2021-08-05T08:41:53.000Z | [
"transformers",
"pytorch",
"jax",
"roberta",
"text-classification",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | cross-encoder | null | null | cross-encoder/stsb-distilroberta-base | 2 | 31,099 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
---
# Cross-Encoder for Semantic Textual Similarity
This model was trained using the [SentenceTransformers](https://sbert.net) [Cross-Encoder](https://www.sbert.net/examples/applications/cross-encoder/README.html) class.
## Training Data
This model was trained on the [STS benchmark dataset](http://ixa2.si.ehu.eus/stswiki/index.php/STSbenchmark). The model predicts a score between 0 and 1 indicating the semantic similarity of two sentences.
## Usage and Performance
Pre-trained models can be used like this:
```python
from sentence_transformers import CrossEncoder

model = CrossEncoder('cross-encoder/stsb-distilroberta-base')
scores = model.predict([('Sentence 1', 'Sentence 2'), ('Sentence 3', 'Sentence 4')])
```
The model will predict scores for the pairs `('Sentence 1', 'Sentence 2')` and `('Sentence 3', 'Sentence 4')`.
You can also use this model without sentence_transformers, loading it directly with the Transformers ``AutoModel`` class.
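A minimal sketch of that route, assuming the checkpoint loads under `AutoModelForSequenceClassification` with a single-logit head; the sigmoid mirrors `CrossEncoder`'s default activation for single-label models (an assumption worth verifying), and the sentence pairs are illustrative:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = 'cross-encoder/stsb-distilroberta-base'
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Each sentence pair is encoded jointly in a single forward pass.
features = tokenizer(['Sentence 1', 'Sentence 3'],
                     ['Sentence 2', 'Sentence 4'],
                     padding=True, truncation=True, return_tensors='pt')

with torch.no_grad():
    logits = model(**features).logits.squeeze(-1)

# Assumed: sigmoid maps the raw logits to the 0-1 similarity range,
# matching CrossEncoder's default activation when num_labels == 1.
scores = torch.sigmoid(logits)
print(scores)
```

Scoring each pair jointly in one forward pass is what makes this a cross-encoder: it tends to be more accurate than comparing independently computed sentence embeddings, at the cost of not producing reusable embeddings.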
[
-0.0168914794921875,
-0.060394287109375,
0.01953125,
0.01727294921875,
-0.028411865234375,
-0.00251007080078125,
0.0121612548828125,
-0.0128936767578125,
0.0089874267578125,
0.0518798828125,
-0.051177978515625,
-0.031951904296875,
-0.034912109375,
0.033111572265625,
-0.06365966796875,
0.0767822265625,
-0.011260986328125,
0.0297698974609375,
-0.052093505859375,
-0.025177001953125,
-0.0271148681640625,
-0.032196044921875,
-0.0362548828125,
-0.0186614990234375,
0.00988006591796875,
0.02593994140625,
0.028564453125,
0.00867462158203125,
0.0244903564453125,
0.0301513671875,
-0.00115203857421875,
0.00783538818359375,
-0.0177764892578125,
0.005352020263671875,
-0.023223876953125,
-0.0455322265625,
0.0126953125,
-0.01319122314453125,
0.0279388427734375,
0.018524169921875,
-0.01026153564453125,
0.0350341796875,
0.002777099609375,
0.0221099853515625,
-0.010772705078125,
-0.01508331298828125,
-0.052947998046875,
0.01479339599609375,
0.005435943603515625,
-0.0208892822265625,
-0.00428009033203125,
-0.0465087890625,
-0.0051116943359375,
-0.041900634765625,
0.0286865234375,
0.0009136199951171875,
0.0823974609375,
0.0259857177734375,
-0.0367431640625,
-0.01611328125,
-0.0311126708984375,
0.0611572265625,
-0.038787841796875,
0.0136871337890625,
0.0271453857421875,
0.0240325927734375,
0.0067596435546875,
-0.0626220703125,
-0.07562255859375,
0.0089263916015625,
-0.01654052734375,
0.00896453857421875,
-0.032623291015625,
-0.0399169921875,
0.020721435546875,
0.0262451171875,
-0.07452392578125,
-0.0038051605224609375,
-0.05712890625,
-0.02667236328125,
0.029296875,
0.033355712890625,
0.0189971923828125,
-0.043304443359375,
-0.056396484375,
-0.01454925537109375,
-0.0093994140625,
0.0186614990234375,
0.0263824462890625,
-0.0028400421142578125,
-0.0074462890625,
0.044647216796875,
-0.0308380126953125,
0.03997802734375,
0.0159149169921875,
0.016754150390625,
0.055145263671875,
-0.039337158203125,
-0.001003265380859375,
-0.019317626953125,
0.078857421875,
0.0191650390625,
0.03619384765625,
0.004238128662109375,
0.0001424551010131836,
0.012939453125,
0.031494140625,
-0.044708251953125,
-0.00634002685546875,
0.0201568603515625,
-0.0298919677734375,
-0.0318603515625,
0.0103912353515625,
-0.034576416015625,
0.0103759765625,
-0.015716552734375,
0.059051513671875,
-0.038330078125,
0.0010986328125,
0.02783203125,
-0.02850341796875,
0.0308990478515625,
-0.0014667510986328125,
-0.0306854248046875,
0.0340576171875,
0.0419921875,
0.045074462890625,
-0.01186370849609375,
-0.048126220703125,
-0.0256500244140625,
-0.0245819091796875,
0.00907135009765625,
0.0718994140625,
-0.02874755859375,
-0.0128173828125,
-0.01216888427734375,
0.0218658447265625,
-0.011505126953125,
-0.03912353515625,
0.045135498046875,
-0.04241943359375,
0.0694580078125,
-0.0224456787109375,
-0.050750732421875,
-0.035064697265625,
0.043304443359375,
-0.0560302734375,
0.09112548828125,
0.0227508544921875,
-0.06787109375,
0.012359619140625,
-0.0290069580078125,
-0.035308837890625,
0.00626373291015625,
-0.01056671142578125,
-0.05194091796875,
-0.01708984375,
0.0209808349609375,
0.01348876953125,
-0.024078369140625,
0.009246826171875,
-0.00626373291015625,
-0.0300445556640625,
0.0279998779296875,
-0.01971435546875,
0.06048583984375,
0.003326416015625,
-0.0214691162109375,
0.007781982421875,
-0.048828125,
0.0283966064453125,
0.01473236083984375,
-0.0214691162109375,
-0.02032470703125,
-0.034942626953125,
0.0216522216796875,
0.0293731689453125,
0.0276031494140625,
-0.05657958984375,
-0.0172271728515625,
-0.0162811279296875,
0.0216217041015625,
0.03546142578125,
0.01372528076171875,
0.0139312744140625,
-0.03472900390625,
0.068115234375,
0.0206146240234375,
0.01323699951171875,
0.01139068603515625,
-0.041961669921875,
-0.0526123046875,
0.02398681640625,
0.023590087890625,
0.06494140625,
-0.05987548828125,
0.060302734375,
-0.0110015869140625,
-0.04150390625,
-0.05859375,
0.0190887451171875,
0.0230865478515625,
0.025146484375,
0.04931640625,
-0.0167236328125,
-0.054962158203125,
-0.07318115234375,
-0.038330078125,
0.0024585723876953125,
0.001068115234375,
0.0009469985961914062,
0.0675048828125,
-0.007843017578125,
0.07763671875,
-0.03466796875,
-0.0159454345703125,
-0.028167724609375,
0.0110931396484375,
0.004039764404296875,
0.050048828125,
0.033935546875,
-0.07745361328125,
-0.049041748046875,
-0.0280914306640625,
-0.05645751953125,
0.003192901611328125,
0.00011354684829711914,
-0.0127716064453125,
0.006900787353515625,
0.03094482421875,
-0.055023193359375,
0.0307769775390625,
0.031646728515625,
-0.0174102783203125,
0.0283203125,
0.0010013580322265625,
0.02490234375,
-0.10723876953125,
-0.000885009765625,
-0.01044464111328125,
-0.0255126953125,
-0.029754638671875,
0.005725860595703125,
0.0022754669189453125,
-0.0032806396484375,
-0.031646728515625,
0.03546142578125,
-0.00739288330078125,
0.00952911376953125,
-0.014739990234375,
0.0082244873046875,
0.0233306884765625,
0.040924072265625,
0.0016088485717773438,
0.0516357421875,
0.035003662109375,
-0.037933349609375,
0.04052734375,
0.050506591796875,
-0.051727294921875,
0.03521728515625,
-0.07879638671875,
0.0258636474609375,
-0.003368377685546875,
0.0232696533203125,
-0.0732421875,
0.012725830078125,
0.027557373046875,
-0.052642822265625,
-0.03436279296875,
0.00856781005859375,
-0.041229248046875,
-0.05438232421875,
-0.026947021484375,
0.0482177734375,
0.03619384765625,
-0.0465087890625,
0.04461669921875,
0.0165252685546875,
-0.004543304443359375,
-0.05352783203125,
-0.0704345703125,
-0.0272064208984375,
-0.0023746490478515625,
-0.043853759765625,
0.01111602783203125,
-0.0141143798828125,
0.020263671875,
0.0191802978515625,
0.0037364959716796875,
-0.01568603515625,
-0.0064849853515625,
0.0216522216796875,
0.00801849365234375,
-0.023468017578125,
0.0152130126953125,
0.0115814208984375,
-0.0113372802734375,
0.007785797119140625,
-0.02362060546875,
0.06439208984375,
-0.00850677490234375,
-0.023468017578125,
-0.03375244140625,
0.034820556640625,
0.0267333984375,
-0.0228118896484375,
0.051605224609375,
0.0496826171875,
-0.0287933349609375,
-0.0170745849609375,
-0.048553466796875,
-0.0034046173095703125,
-0.0352783203125,
0.03607177734375,
-0.03326416015625,
-0.07122802734375,
0.03729248046875,
0.023590087890625,
-0.032745361328125,
0.038604736328125,
0.031494140625,
0.00608062744140625,
0.055877685546875,
0.0357666015625,
-0.014678955078125,
0.0178680419921875,
-0.01499176025390625,
0.01556396484375,
-0.036285400390625,
-0.03338623046875,
-0.0341796875,
-0.01551055908203125,
-0.036865234375,
-0.04461669921875,
0.012725830078125,
-0.00032901763916015625,
-0.01027679443359375,
0.048370361328125,
-0.048980712890625,
0.05194091796875,
0.05230712890625,
0.0128173828125,
0.00850677490234375,
0.0159759521484375,
0.007152557373046875,
0.007068634033203125,
-0.052215576171875,
-0.02008056640625,
0.0751953125,
-0.0021076202392578125,
0.045745849609375,
-0.0208892822265625,
0.04864501953125,
0.0143280029296875,
-0.0172576904296875,
-0.03265380859375,
0.05194091796875,
-0.021697998046875,
-0.042938232421875,
-0.0196533203125,
-0.04376220703125,
-0.08343505859375,
0.022064208984375,
-0.0120391845703125,
-0.01385498046875,
0.00510406494140625,
-0.01666259765625,
-0.03924560546875,
0.0191802978515625,
-0.0216827392578125,
0.08660888671875,
-0.0233612060546875,
0.004138946533203125,
-0.025421142578125,
-0.038604736328125,
0.0207977294921875,
-0.0133209228515625,
-0.0060577392578125,
0.00254058837890625,
-0.0022487640380859375,
0.06402587890625,
-0.0276641845703125,
0.044677734375,
0.0059814453125,
0.01453399658203125,
0.0283966064453125,
-0.0132904052734375,
0.00856781005859375,
-0.003662109375,
-0.00362396240234375,
0.016143798828125,
0.00780487060546875,
-0.031036376953125,
-0.022979736328125,
0.05767822265625,
-0.07061767578125,
-0.0272979736328125,
-0.027069091796875,
-0.02642822265625,
0.007904052734375,
0.013214111328125,
0.045379638671875,
0.022979736328125,
-0.03009033203125,
0.0043182373046875,
0.0286865234375,
-0.022796630859375,
0.00707244873046875,
0.045318603515625,
-0.010711669921875,
-0.041900634765625,
0.041748046875,
-0.003246307373046875,
0.012969970703125,
0.04376220703125,
0.004337310791015625,
-0.0286712646484375,
-0.01372528076171875,
-0.01224517822265625,
0.012908935546875,
-0.057830810546875,
-0.0296478271484375,
-0.07684326171875,
-0.04559326171875,
-0.042572021484375,
0.022796630859375,
-0.00876617431640625,
-0.0440673828125,
-0.0228271484375,
-0.0193939208984375,
0.05438232421875,
0.04400634765625,
-0.0095977783203125,
0.015106201171875,
-0.06329345703125,
0.051361083984375,
0.0308685302734375,
0.0083160400390625,
-0.005878448486328125,
-0.06121826171875,
-0.0107879638671875,
-0.0060577392578125,
-0.020263671875,
-0.04156494140625,
0.032623291015625,
-0.0046234130859375,
0.040435791015625,
0.001007080078125,
0.00412750244140625,
0.039276123046875,
-0.0244140625,
0.054595947265625,
0.01207733154296875,
-0.073974609375,
0.0438232421875,
0.01244354248046875,
0.03387451171875,
0.0555419921875,
0.05413818359375,
-0.04541015625,
-0.0267486572265625,
-0.0457763671875,
-0.05645751953125,
0.054443359375,
0.02032470703125,
0.0308837890625,
-0.007076263427734375,
0.01346588134765625,
0.037506103515625,
0.00933074951171875,
-0.07659912109375,
-0.0171051025390625,
-0.038330078125,
-0.031219482421875,
-0.0013284683227539062,
-0.011077880859375,
0.01026153564453125,
-0.037872314453125,
0.047760009765625,
0.0090789794921875,
-0.0039825439453125,
0.014678955078125,
-0.018829345703125,
-0.00653839111328125,
0.02313232421875,
0.0214385986328125,
0.022003173828125,
-0.0017557144165039062,
-0.0168304443359375,
0.0301666259765625,
-0.0249786376953125,
0.0012874603271484375,
0.01947021484375,
-0.027862548828125,
0.0219573974609375,
0.0217742919921875,
0.06488037109375,
0.00724029541015625,
-0.03668212890625,
0.050445556640625,
-0.01308441162109375,
-0.0227813720703125,
-0.0633544921875,
-0.00392913818359375,
0.00506591796875,
0.034149169921875,
0.00946807861328125,
0.017578125,
0.006313323974609375,
-0.03594970703125,
0.029327392578125,
0.01378631591796875,
-0.0228271484375,
-0.014251708984375,
0.050079345703125,
0.008087158203125,
-0.046844482421875,
0.05889892578125,
-0.01071929931640625,
-0.084228515625,
0.05072021484375,
0.0218658447265625,
0.05572509765625,
-0.0063629150390625,
0.0276641845703125,
0.050750732421875,
0.0152740478515625,
-0.022186279296875,
0.050750732421875,
-0.011474609375,
-0.072509765625,
0.002880096435546875,
-0.042938232421875,
-0.0252532958984375,
0.02105712890625,
-0.07586669921875,
0.0250091552734375,
-0.0223236083984375,
-0.009307861328125,
-0.0006732940673828125,
-0.01033782958984375,
-0.060394287109375,
0.0189971923828125,
-0.0015859603881835938,
0.058135986328125,
-0.0687255859375,
0.055206298828125,
0.044281005859375,
-0.061126708984375,
-0.054473876953125,
-0.00336456298828125,
-0.011383056640625,
-0.06390380859375,
0.049591064453125,
0.0269317626953125,
0.00714111328125,
-0.01149749755859375,
-0.043487548828125,
-0.05889892578125,
0.07757568359375,
-0.0121917724609375,
-0.03656005859375,
-0.0007786750793457031,
0.037689208984375,
0.047393798828125,
-0.0235137939453125,
0.027862548828125,
0.036956787109375,
0.0175323486328125,
-0.017242431640625,
-0.06365966796875,
0.01090240478515625,
-0.051361083984375,
-0.01495361328125,
0.003223419189453125,
-0.046112060546875,
0.08843994140625,
0.00152587890625,
-0.0029773712158203125,
0.045135498046875,
0.037322998046875,
0.02703857421875,
0.023468017578125,
0.046905517578125,
0.055419921875,
0.042388916015625,
0.00640106201171875,
0.0740966796875,
-0.0195465087890625,
0.034820556640625,
0.08966064453125,
-0.0241851806640625,
0.07159423828125,
0.0229949951171875,
-0.01073455810546875,
0.0709228515625,
0.0258636474609375,
-0.045745849609375,
0.03448486328125,
0.017791748046875,
0.0125732421875,
-0.033905029296875,
0.0147247314453125,
-0.04150390625,
0.039306640625,
-0.00579071044921875,
-0.0247039794921875,
-0.0032138824462890625,
0.0168609619140625,
-0.02850341796875,
0.0311431884765625,
0.0007839202880859375,
0.043487548828125,
0.0016469955444335938,
-0.051971435546875,
0.036468505859375,
-0.006931304931640625,
0.06658935546875,
-0.0582275390625,
0.0017833709716796875,
-0.0232391357421875,
0.025909423828125,
-0.002254486083984375,
-0.07183837890625,
0.013702392578125,
-0.0024433135986328125,
-0.0303192138671875,
-0.0005397796630859375,
0.059112548828125,
-0.050750732421875,
-0.052093505859375,
0.043121337890625,
0.04632568359375,
0.01654052734375,
-0.01421356201171875,
-0.07666015625,
-0.0217437744140625,
0.00453948974609375,
0.0024013519287109375,
-0.0009336471557617188,
0.0435791015625,
0.00618743896484375,
0.0498046875,
0.044219970703125,
-0.01959228515625,
0.01202392578125,
0.0304412841796875,
0.05950927734375,
-0.07623291015625,
-0.04815673828125,
-0.04022216796875,
0.01434326171875,
-0.024749755859375,
-0.0487060546875,
0.06890869140625,
0.07122802734375,
0.0780029296875,
-0.0271759033203125,
0.054840087890625,
0.0137786865234375,
0.0689697265625,
-0.028564453125,
0.046173095703125,
-0.053009033203125,
0.00937652587890625,
-0.0012607574462890625,
-0.04534912109375,
0.01047515869140625,
0.03857421875,
-0.0182647705078125,
0.0038471221923828125,
0.07159423828125,
0.07391357421875,
-0.01195526123046875,
0.0189208984375,
0.0019521713256835938,
0.01256561279296875,
-0.027191162109375,
0.057159423828125,
0.0816650390625,
-0.0682373046875,
0.07562255859375,
-0.0224609375,
0.032135009765625,
0.004791259765625,
-0.00933074951171875,
-0.07501220703125,
-0.038665771484375,
-0.0328369140625,
-0.034210205078125,
-0.00841522216796875,
0.046783447265625,
0.03466796875,
-0.0831298828125,
-0.005645751953125,
0.004741668701171875,
0.010467529296875,
-0.0128936767578125,
-0.0202178955078125,
0.0168304443359375,
0.001125335693359375,
-0.042938232421875,
0.00830078125,
-0.0045318603515625,
-0.0034637451171875,
-0.0049285888671875,
0.0024013519287109375,
-0.037506103515625,
0.0134735107421875,
0.0244903564453125,
0.00882720947265625,
-0.0528564453125,
-0.021881103515625,
0.0003345012664794922,
-0.0295562744140625,
0.0027599334716796875,
0.0399169921875,
-0.080322265625,
-0.0025634765625,
0.05963134765625,
0.0362548828125,
0.05096435546875,
0.017578125,
0.0290985107421875,
-0.033111572265625,
0.0023956298828125,
0.017333984375,
0.0276031494140625,
0.038970947265625,
-0.0201873779296875,
0.039703369140625,
0.0252532958984375,
-0.03228759765625,
-0.045440673828125,
-0.00264739990234375,
-0.08258056640625,
-0.0290985107421875,
0.07977294921875,
-0.005649566650390625,
-0.0194244384765625,
-0.01309967041015625,
-0.0099945068359375,
0.02288818359375,
-0.0142669677734375,
0.053985595703125,
0.032073974609375,
0.01378631591796875,
-0.0086822509765625,
-0.005641937255859375,
0.008636474609375,
0.0556640625,
-0.069580078125,
-0.035125732421875,
0.003902435302734375,
0.06439208984375,
0.0128021240234375,
0.03326416015625,
-0.005207061767578125,
0.037933349609375,
0.0113983154296875,
0.007701873779296875,
-0.0023288726806640625,
0.01348114013671875,
-0.0303802490234375,
0.01381683349609375,
-0.04815673828125,
-0.044158935546875
]
] |
timm/vit_small_patch14_dinov2.lvd142m | 2023-09-03T21:58:45.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:2304.07193",
"arxiv:2010.11929",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/vit_small_patch14_dinov2.lvd142m | 1 | 30,921 | timm | 2023-05-09T21:09:24 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
---
# Model card for vit_small_patch14_dinov2.lvd142m
A Vision Transformer (ViT) image feature model. Pretrained on LVD-142M with the self-supervised DINOv2 method.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 22.1
- GMACs: 46.8
- Activations (M): 198.8
- Image size: 518 x 518
- **Papers:**
- DINOv2: Learning Robust Visual Features without Supervision: https://arxiv.org/abs/2304.07193
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Original:** https://github.com/facebookresearch/dinov2
- **Pretrain Dataset:** LVD-142M
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_small_patch14_dinov2.lvd142m', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_small_patch14_dinov2.lvd142m',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1370, 384) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{oquab2023dinov2,
title={DINOv2: Learning Robust Visual Features without Supervision},
author={Oquab, Maxime and Darcet, Timothée and Moutakanni, Theo and Vo, Huy V. and Szafraniec, Marc and Khalidov, Vasil and Fernandez, Pierre and Haziza, Daniel and Massa, Francisco and El-Nouby, Alaaeldin and Howes, Russell and Huang, Po-Yao and Xu, Hu and Sharma, Vasu and Li, Shang-Wen and Galuba, Wojciech and Rabbat, Mike and Assran, Mido and Ballas, Nicolas and Synnaeve, Gabriel and Misra, Ishan and Jegou, Herve and Mairal, Julien and Labatut, Patrick and Joulin, Armand and Bojanowski, Piotr},
journal={arXiv:2304.07193},
year={2023}
}
```
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
``` | 3,987 | [
[
-0.03692626953125,
-0.02923583984375,
0.006744384765625,
0.0026874542236328125,
-0.0350341796875,
-0.026947021484375,
-0.020233154296875,
-0.0323486328125,
0.01087188720703125,
0.0187225341796875,
-0.034027099609375,
-0.03826904296875,
-0.050567626953125,
-0.0037403106689453125,
-0.0032806396484375,
0.06298828125,
-0.00965118408203125,
-0.0035572052001953125,
-0.02294921875,
-0.034393310546875,
-0.0123748779296875,
-0.0104217529296875,
-0.039337158203125,
-0.0258026123046875,
0.0312042236328125,
0.00893402099609375,
0.044403076171875,
0.059661865234375,
0.046417236328125,
0.0309295654296875,
-0.0112457275390625,
0.0160064697265625,
-0.0238189697265625,
-0.01380157470703125,
0.01503753662109375,
-0.047637939453125,
-0.03179931640625,
0.0140838623046875,
0.0537109375,
0.0211334228515625,
0.0029468536376953125,
0.0274200439453125,
0.019378662109375,
0.030487060546875,
-0.0233306884765625,
0.02337646484375,
-0.035003662109375,
0.01459503173828125,
-0.00948333740234375,
0.0035381317138671875,
-0.0177154541015625,
-0.0237274169921875,
0.01227569580078125,
-0.044189453125,
0.03082275390625,
-0.0045928955078125,
0.09942626953125,
0.023651123046875,
-0.0090789794921875,
0.010528564453125,
-0.0212860107421875,
0.064697265625,
-0.038970947265625,
0.0240325927734375,
0.02056884765625,
0.0036334991455078125,
0.0021305084228515625,
-0.076904296875,
-0.044677734375,
0.002819061279296875,
-0.0182952880859375,
0.006847381591796875,
-0.03240966796875,
0.0024547576904296875,
0.025604248046875,
0.021392822265625,
-0.032196044921875,
0.015625,
-0.04248046875,
-0.025970458984375,
0.032318115234375,
-0.007534027099609375,
0.01383209228515625,
-0.0287628173828125,
-0.0452880859375,
-0.04144287109375,
-0.02777099609375,
0.030120849609375,
0.01549530029296875,
0.004657745361328125,
-0.0391845703125,
0.0382080078125,
0.01403045654296875,
0.039947509765625,
0.01136016845703125,
-0.0266571044921875,
0.045501708984375,
-0.017059326171875,
-0.02935791015625,
-0.0214080810546875,
0.087158203125,
0.037261962890625,
0.025634765625,
0.0128021240234375,
-0.010101318359375,
-0.0081787109375,
-0.01013946533203125,
-0.08380126953125,
-0.034515380859375,
0.01078033447265625,
-0.038330078125,
-0.03277587890625,
0.01544189453125,
-0.0682373046875,
-0.0142669677734375,
-0.004245758056640625,
0.051605224609375,
-0.0312347412109375,
-0.0241241455078125,
-0.01194000244140625,
-0.008544921875,
0.035491943359375,
0.0177154541015625,
-0.05230712890625,
0.01087188720703125,
0.0191497802734375,
0.07952880859375,
-0.0019092559814453125,
-0.031982421875,
-0.031280517578125,
-0.025726318359375,
-0.0288848876953125,
0.04205322265625,
-0.0026416778564453125,
-0.01326751708984375,
-0.0121612548828125,
0.027374267578125,
-0.01137542724609375,
-0.047027587890625,
0.0206298828125,
-0.0073394775390625,
0.02801513671875,
-0.0037841796875,
-0.004703521728515625,
-0.0234222412109375,
0.0152435302734375,
-0.0307769775390625,
0.09747314453125,
0.0276031494140625,
-0.0675048828125,
0.035491943359375,
-0.01959228515625,
-0.00666046142578125,
-0.0151824951171875,
-0.0085601806640625,
-0.09088134765625,
-0.00884246826171875,
0.028533935546875,
0.042938232421875,
-0.016387939453125,
-0.0010442733764648438,
-0.040069580078125,
-0.0209808349609375,
0.0213623046875,
-0.01324462890625,
0.06939697265625,
0.0023403167724609375,
-0.037322998046875,
0.02545166015625,
-0.049102783203125,
0.01027679443359375,
0.036376953125,
-0.0222625732421875,
-0.005157470703125,
-0.040771484375,
0.0104217529296875,
0.024017333984375,
0.016204833984375,
-0.031951904296875,
0.024383544921875,
-0.0129547119140625,
0.0362548828125,
0.056243896484375,
-0.0165557861328125,
0.0305938720703125,
-0.02264404296875,
0.0241546630859375,
0.0184173583984375,
0.0225982666015625,
-0.004589080810546875,
-0.049652099609375,
-0.06378173828125,
-0.05181884765625,
0.02392578125,
0.040374755859375,
-0.05419921875,
0.04327392578125,
-0.0179595947265625,
-0.044891357421875,
-0.03863525390625,
0.0224151611328125,
0.041351318359375,
0.040191650390625,
0.041046142578125,
-0.041717529296875,
-0.037628173828125,
-0.064697265625,
0.00400543212890625,
-0.01049041748046875,
-0.00756072998046875,
0.033172607421875,
0.04400634765625,
-0.01506805419921875,
0.054840087890625,
-0.02923583984375,
-0.03631591796875,
-0.0163421630859375,
0.0017004013061523438,
0.017608642578125,
0.04833984375,
0.061981201171875,
-0.046112060546875,
-0.031829833984375,
-0.017608642578125,
-0.0675048828125,
0.00940704345703125,
0.0026874542236328125,
-0.007808685302734375,
0.026031494140625,
0.0194854736328125,
-0.061309814453125,
0.0478515625,
0.01499176025390625,
-0.02618408203125,
0.022491455078125,
-0.0257110595703125,
-0.003269195556640625,
-0.0899658203125,
0.003322601318359375,
0.029876708984375,
-0.0175628662109375,
-0.0278472900390625,
-0.00823974609375,
0.01511383056640625,
0.003566741943359375,
-0.034515380859375,
0.040069580078125,
-0.045135498046875,
-0.0156402587890625,
0.005657196044921875,
-0.007251739501953125,
0.00479888916015625,
0.052154541015625,
-0.0012407302856445312,
0.036834716796875,
0.06591796875,
-0.030029296875,
0.040924072265625,
0.0312042236328125,
-0.0305938720703125,
0.041259765625,
-0.053497314453125,
0.0056304931640625,
-0.0006394386291503906,
0.01338958740234375,
-0.0828857421875,
-0.0203399658203125,
0.033966064453125,
-0.044189453125,
0.053314208984375,
-0.039459228515625,
-0.031280517578125,
-0.047119140625,
-0.042449951171875,
0.0322265625,
0.061187744140625,
-0.05804443359375,
0.031646728515625,
0.0119171142578125,
0.01445770263671875,
-0.04534912109375,
-0.07098388671875,
-0.019622802734375,
-0.035919189453125,
-0.04498291015625,
0.0239410400390625,
0.01352691650390625,
0.00969696044921875,
0.006710052490234375,
-0.00556182861328125,
0.001476287841796875,
-0.0140838623046875,
0.03204345703125,
0.0350341796875,
-0.020965576171875,
-0.00713348388671875,
-0.0219573974609375,
-0.00887298583984375,
0.005290985107421875,
-0.021087646484375,
0.035369873046875,
-0.0149383544921875,
-0.00799560546875,
-0.06390380859375,
-0.013336181640625,
0.0439453125,
-0.0259857177734375,
0.054107666015625,
0.09259033203125,
-0.0321044921875,
-0.00391387939453125,
-0.02740478515625,
-0.0229339599609375,
-0.037994384765625,
0.04608154296875,
-0.023101806640625,
-0.03570556640625,
0.058135986328125,
0.01027679443359375,
-0.0133819580078125,
0.053070068359375,
0.034942626953125,
-0.004108428955078125,
0.0606689453125,
0.046783447265625,
0.0205535888671875,
0.05694580078125,
-0.067626953125,
-0.0058441162109375,
-0.068359375,
-0.042938232421875,
-0.0157470703125,
-0.033416748046875,
-0.05670166015625,
-0.04888916015625,
0.02978515625,
0.01425933837890625,
-0.01532745361328125,
0.03375244140625,
-0.0614013671875,
0.00379180908203125,
0.06109619140625,
0.03997802734375,
-0.0200347900390625,
0.02838134765625,
-0.02313232421875,
-0.0134429931640625,
-0.055084228515625,
-0.00445556640625,
0.08355712890625,
0.0361328125,
0.0611572265625,
-0.00499725341796875,
0.047454833984375,
-0.0157470703125,
0.00696563720703125,
-0.05914306640625,
0.04541015625,
-0.0019521713256835938,
-0.0217742919921875,
-0.0198516845703125,
-0.0309906005859375,
-0.07757568359375,
0.01220703125,
-0.025726318359375,
-0.062408447265625,
0.0350341796875,
0.0181732177734375,
-0.022979736328125,
0.04803466796875,
-0.059326171875,
0.073974609375,
-0.00682830810546875,
-0.045654296875,
0.00841522216796875,
-0.05316162109375,
0.0213165283203125,
0.00896453857421875,
-0.0165557861328125,
0.0073394775390625,
0.00852203369140625,
0.07208251953125,
-0.041473388671875,
0.06805419921875,
-0.02838134765625,
0.01702880859375,
0.0413818359375,
-0.01380157470703125,
0.03765869140625,
0.005535125732421875,
0.016876220703125,
0.0178680419921875,
0.006580352783203125,
-0.028472900390625,
-0.034576416015625,
0.04180908203125,
-0.081787109375,
-0.0338134765625,
-0.0421142578125,
-0.0297393798828125,
0.014434814453125,
0.016510009765625,
0.0496826171875,
0.04180908203125,
0.0225677490234375,
0.032012939453125,
0.058929443359375,
-0.0243377685546875,
0.034881591796875,
-0.00600433349609375,
-0.0188446044921875,
-0.049957275390625,
0.07025146484375,
0.0218505859375,
0.0177001953125,
0.011383056640625,
0.013885498046875,
-0.0298309326171875,
-0.036956787109375,
-0.0260162353515625,
0.031494140625,
-0.053680419921875,
-0.034576416015625,
-0.039581298828125,
-0.0259246826171875,
-0.030487060546875,
0.009490966796875,
-0.03759765625,
-0.0249786376953125,
-0.04266357421875,
0.006259918212890625,
0.04925537109375,
0.038177490234375,
-0.022552490234375,
0.0206298828125,
-0.0262603759765625,
0.0218963623046875,
0.0419921875,
0.032958984375,
-0.0204315185546875,
-0.07025146484375,
-0.01416015625,
-0.0028400421142578125,
-0.03369140625,
-0.045745849609375,
0.03558349609375,
0.01983642578125,
0.035552978515625,
0.031646728515625,
-0.01486968994140625,
0.05682373046875,
-0.0054473876953125,
0.045623779296875,
0.030120849609375,
-0.046417236328125,
0.04669189453125,
-0.0107574462890625,
0.006565093994140625,
0.004241943359375,
0.017486572265625,
-0.00440216064453125,
0.005779266357421875,
-0.0689697265625,
-0.05706787109375,
0.06378173828125,
0.0142822265625,
-0.0050811767578125,
0.027191162109375,
0.041717529296875,
-0.01326751708984375,
-0.002349853515625,
-0.0633544921875,
-0.0267181396484375,
-0.03668212890625,
-0.020904541015625,
-0.0082550048828125,
-0.0174407958984375,
-0.01081085205078125,
-0.05035400390625,
0.040557861328125,
-0.01233673095703125,
0.05908203125,
0.029876708984375,
-0.00313568115234375,
-0.0179901123046875,
-0.024932861328125,
0.0362548828125,
0.023956298828125,
-0.025909423828125,
0.0164031982421875,
0.01415252685546875,
-0.04644775390625,
-0.0032863616943359375,
0.0250701904296875,
-0.006099700927734375,
-0.001689910888671875,
0.0355224609375,
0.0748291015625,
-0.005298614501953125,
-0.0028228759765625,
0.049957275390625,
-0.005802154541015625,
-0.0367431640625,
-0.01519012451171875,
0.0111083984375,
-0.00859832763671875,
0.036041259765625,
0.0206298828125,
0.033203125,
-0.0196990966796875,
-0.0264892578125,
0.021087646484375,
0.04412841796875,
-0.04150390625,
-0.03131103515625,
0.0628662109375,
-0.018646240234375,
-0.0036830902099609375,
0.057403564453125,
-0.0037593841552734375,
-0.0309295654296875,
0.07037353515625,
0.0318603515625,
0.060546875,
-0.0168304443359375,
0.005279541015625,
0.05621337890625,
0.02349853515625,
-0.0092315673828125,
0.01483154296875,
0.00458526611328125,
-0.050933837890625,
0.0035343170166015625,
-0.041900634765625,
0.0028285980224609375,
0.024383544921875,
-0.0496826171875,
0.0340576171875,
-0.047882080078125,
-0.030029296875,
0.00930023193359375,
0.01319122314453125,
-0.071044921875,
0.017303466796875,
0.007518768310546875,
0.0491943359375,
-0.059326171875,
0.068359375,
0.0576171875,
-0.046173095703125,
-0.07012939453125,
-0.00804901123046875,
0.004489898681640625,
-0.0718994140625,
0.036041259765625,
0.03240966796875,
0.006298065185546875,
0.01287078857421875,
-0.056121826171875,
-0.058349609375,
0.1141357421875,
0.0335693359375,
-0.0105438232421875,
0.0140380859375,
-0.00449371337890625,
0.027496337890625,
-0.0184326171875,
0.0239105224609375,
0.012542724609375,
0.0225982666015625,
0.027374267578125,
-0.059051513671875,
0.01218414306640625,
-0.021697998046875,
0.014190673828125,
0.0105438232421875,
-0.072998046875,
0.06976318359375,
-0.043853759765625,
-0.00958251953125,
0.01812744140625,
0.049560546875,
0.0147857666015625,
0.00797271728515625,
0.034515380859375,
0.060882568359375,
0.0413818359375,
-0.0296783447265625,
0.06280517578125,
-0.00464630126953125,
0.0555419921875,
0.04559326171875,
0.0309295654296875,
0.0455322265625,
0.0380859375,
-0.030364990234375,
0.0269622802734375,
0.07403564453125,
-0.040374755859375,
0.039886474609375,
0.00739288330078125,
0.004283905029296875,
-0.0239105224609375,
0.0008535385131835938,
-0.035980224609375,
0.038421630859375,
0.0163116455078125,
-0.0458984375,
0.0003418922424316406,
0.0031414031982421875,
0.0012216567993164062,
-0.022796630859375,
-0.0129852294921875,
0.041473388671875,
0.00843048095703125,
-0.034088134765625,
0.0653076171875,
-0.00154876708984375,
0.056243896484375,
-0.0310516357421875,
0.004924774169921875,
-0.0230255126953125,
0.0361328125,
-0.02459716796875,
-0.06591796875,
0.01004791259765625,
-0.0175628662109375,
-0.005474090576171875,
0.0014667510986328125,
0.051361083984375,
-0.03082275390625,
-0.04205322265625,
0.0167236328125,
0.0223236083984375,
0.0159912109375,
0.007648468017578125,
-0.0738525390625,
0.0007901191711425781,
0.002292633056640625,
-0.047027587890625,
0.028167724609375,
0.03985595703125,
0.0037288665771484375,
0.051971435546875,
0.042266845703125,
-0.00958251953125,
0.01302337646484375,
-0.01479339599609375,
0.070068359375,
-0.0309295654296875,
-0.0233306884765625,
-0.060882568359375,
0.0439453125,
-0.004184722900390625,
-0.040618896484375,
0.046630859375,
0.03863525390625,
0.060760498046875,
-0.0012788772583007812,
0.03125,
-0.01276397705078125,
0.0001958608627319336,
-0.031890869140625,
0.048095703125,
-0.050079345703125,
0.0019464492797851562,
-0.02020263671875,
-0.0662841796875,
-0.0352783203125,
0.06488037109375,
-0.0185089111328125,
0.0289154052734375,
0.039520263671875,
0.075927734375,
-0.0274200439453125,
-0.048309326171875,
0.0142669677734375,
0.0214385986328125,
0.0103607177734375,
0.033416748046875,
0.03277587890625,
-0.058441162109375,
0.05126953125,
-0.04351806640625,
-0.015838623046875,
-0.0125579833984375,
-0.04248046875,
-0.0894775390625,
-0.05792236328125,
-0.04620361328125,
-0.056854248046875,
-0.01383209228515625,
0.06024169921875,
0.0703125,
-0.0484619140625,
0.003742218017578125,
-0.004150390625,
0.011627197265625,
-0.0242767333984375,
-0.01580810546875,
0.05511474609375,
-0.0004131793975830078,
-0.053680419921875,
-0.022857666015625,
0.00421142578125,
0.036773681640625,
-0.01241302490234375,
-0.0174102783203125,
-0.00975799560546875,
-0.0171966552734375,
0.01666259765625,
0.029083251953125,
-0.06268310546875,
-0.019622802734375,
-0.0110626220703125,
-0.01233673095703125,
0.04180908203125,
0.0150146484375,
-0.053314208984375,
0.03814697265625,
0.037139892578125,
0.01338958740234375,
0.060150146484375,
-0.0140533447265625,
-0.0011682510375976562,
-0.05755615234375,
0.04266357421875,
-0.01488494873046875,
0.037933349609375,
0.041107177734375,
-0.0173187255859375,
0.036407470703125,
0.052093505859375,
-0.03094482421875,
-0.064208984375,
-0.004108428955078125,
-0.08740234375,
0.0006642341613769531,
0.07769775390625,
-0.031585693359375,
-0.037841796875,
0.03167724609375,
-0.0084686279296875,
0.047454833984375,
-0.006069183349609375,
0.03875732421875,
0.01337432861328125,
0.0059814453125,
-0.0625,
-0.029937744140625,
0.04608154296875,
0.0060882568359375,
-0.040557861328125,
-0.0284271240234375,
0.007747650146484375,
0.048309326171875,
0.0224609375,
0.0221405029296875,
-0.01100921630859375,
0.017822265625,
0.00830078125,
0.0278472900390625,
-0.02301025390625,
-0.01357269287109375,
-0.028106689453125,
-0.003543853759765625,
-0.0032253265380859375,
-0.044281005859375
]
] |
minimaxir/sdxl-wrong-lora | 2023-08-24T02:59:47.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"lora",
"license:mit",
"has_space",
"region:us"
] | text-to-image | minimaxir | null | null | minimaxir/sdxl-wrong-lora | 88 | 30,878 | diffusers | 2023-08-14T04:26:16 | ---
license: mit
base_model: stabilityai/stable-diffusion-xl-base-1.0
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- lora
inference: true
---
# sdxl-wrong-lora

A LoRA for SDXL 1.0 Base that improves output image quality when the LoRA is loaded and `wrong` is used as a negative prompt during inference. You can demo image generation using this LoRA in [this Colab Notebook](https://colab.research.google.com/github/minimaxir/sdxl-experiments/blob/main/sdxl_image_generation.ipynb).
The LoRA is also available in a `safetensors` format for other UIs such as A1111; however, it was created using `diffusers`, and I cannot guarantee its efficacy outside of that library.
Benefits of using this LoRA:
- Higher detail in textures/fabrics, particularly at full 1024x1024 resolution.
- Higher color saturation and vibrance.
- Higher sharpness for blurry/background objects.
- More likely to render anatomically correct hands.
- Less likely to have random artifacts.
- Appears to let the model follow the input prompt more predictably, particularly with prompt weighting such as the [Compel](https://github.com/damian0815/compel) syntax.
## Usage
The LoRA can be loaded using `load_lora_weights` like any other LoRA in `diffusers`:
```py
import torch
from diffusers import DiffusionPipeline, AutoencoderKL
vae = AutoencoderKL.from_pretrained(
"madebyollin/sdxl-vae-fp16-fix",
torch_dtype=torch.float16
)
base = DiffusionPipeline.from_pretrained(
"stabilityai/stable-diffusion-xl-base-1.0",
vae=vae,
torch_dtype=torch.float16,
variant="fp16",
use_safetensors=True
)
base.load_lora_weights("minimaxir/sdxl-wrong-lora")
_ = base.to("cuda")
```
During image generation, use `wrong` as the negative prompt. That's it!
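For illustration, a minimal generation sketch (the prompt text, step count, and filename below are assumptions for demonstration, not from the card):
```py
# A sketch of inference with the LoRA's negative prompt; the prompt and
# settings here are illustrative assumptions.
prompt = "portrait photo of an astronaut riding a horse"
image = base(
    prompt=prompt,
    negative_prompt="wrong",
    num_inference_steps=30,
).images[0]
image.save("example.png")
```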
## Examples
**Left image** is the base model output (no LoRA) + refiner, **right image** is base (w/ LoRA) + refiner + `wrong` negative prompt. Both generations use the same seed.
I have also [released a Colab Notebook](https://colab.research.google.com/github/minimaxir/sdxl-experiments/blob/main/sdxl_wrong_comparison.ipynb) to generate these kinds of side-by-side comparison images, although the seeds listed will not reproduce the same results, since the originals were generated on a different GPU/CUDA configuration than the Colab Notebook uses.
`realistic human Shrek blogging at a computer workstation, hyperrealistic award-winning photo for vanity fair` (cfg = 13, seed = 56583700)

`pepperoni pizza in the shape of a heart, hyperrealistic award-winning professional food photography` (cfg = 13, seed = 75789081)

`presidential painting of realistic human Spongebob Squarepants wearing a suit, (oil on canvas)+++++` (cfg = 13, seed = 85588026)

`San Francisco panorama attacked by (one massive kitten)++++, hyperrealistic award-winning photo by the Associated Press` (cfg = 13, seed = 45454868)

`hyperrealistic death metal album cover featuring edgy moody realistic (human Super Mario)++, edgy and moody` (cfg = 13, seed = 30416580)

## Methodology
The methodology and motivation for creating this LoRA are similar to my [wrong SD 2.0 textual inversion embedding](https://huggingface.co/minimaxir/wrong_embedding_sd_2_0): train on a balanced variety of undesirable outputs, except here as a LoRA, since textual inversion with SDXL is complicated. The base images were generated from SDXL itself, with some prompt weighting to emphasize undesirable attributes for the test images.
You can see the code to generate the wrong images [in this Jupyter Notebook](https://github.com/minimaxir/sdxl-experiments/blob/main/wrong_image_generator.ipynb).
## Notes
- The intuitive way to think about how this LoRA works is that during training it marks an undesirable region of the vast high-dimensional latent space, which the rest of the diffusion process then moves away from. This may work more effectively than textual inversion, but more testing needs to be done.
- The description of this LoRA is careful not to state that the output is objectively _better_ than not using the LoRA, because everything is subjective and there are use cases where vibrant output is not desired. For most use cases, however, the output should be preferable.
- It's possible to use `not wrong` in the normal prompt itself, but in testing it had little effect.
- You can use other negative prompts in conjunction with the `wrong` prompt, but you may want to weight them appropriately.
- All the Notebooks noted here are available [in this GitHub repo](https://github.com/minimaxir/sdxl-experiments).
| 4,691 | [
[
-0.0390625,
-0.0697021484375,
0.04034423828125,
0.01425933837890625,
-0.016845703125,
-0.0171051025390625,
0.01532745361328125,
-0.0352783203125,
0.0298919677734375,
0.032196044921875,
-0.04345703125,
-0.0279693603515625,
-0.039703369140625,
-0.00041866302490234375,
-0.045318603515625,
0.0816650390625,
-0.00008004903793334961,
-0.0110015869140625,
-0.005687713623046875,
-0.0240325927734375,
-0.0316162109375,
-0.00406646728515625,
-0.0667724609375,
-0.02484130859375,
0.057037353515625,
0.01507568359375,
0.058441162109375,
0.046875,
0.0113677978515625,
0.0221099853515625,
-0.00555419921875,
-0.004611968994140625,
-0.048675537109375,
-0.0301055908203125,
-0.0024662017822265625,
-0.0352783203125,
-0.0235595703125,
0.01238250732421875,
0.03173828125,
0.0218353271484375,
-0.007732391357421875,
0.0219573974609375,
0.002178192138671875,
0.051025390625,
-0.05059814453125,
-0.0206298828125,
-0.027557373046875,
0.01541900634765625,
-0.00018227100372314453,
-0.0021610260009765625,
-0.025634765625,
-0.036773681640625,
-0.0201568603515625,
-0.057342529296875,
0.0250396728515625,
0.01007843017578125,
0.09368896484375,
0.032135009765625,
-0.022552490234375,
-0.028411865234375,
-0.045257568359375,
0.061309814453125,
-0.047271728515625,
0.003948211669921875,
0.01056671142578125,
0.015716552734375,
0.00959014892578125,
-0.04693603515625,
-0.06292724609375,
-0.0019197463989257812,
-0.005481719970703125,
0.0207061767578125,
-0.030364990234375,
-0.017242431640625,
0.047210693359375,
0.0043182373046875,
-0.043212890625,
0.01255035400390625,
-0.037689208984375,
-0.011199951171875,
0.04864501953125,
0.01532745361328125,
0.022613525390625,
-0.01464080810546875,
-0.047607421875,
-0.0238494873046875,
-0.0657958984375,
0.0015249252319335938,
0.031829833984375,
0.00864410400390625,
-0.026092529296875,
0.05902099609375,
0.01849365234375,
0.0447998046875,
0.02264404296875,
-0.0186004638671875,
0.038116455078125,
-0.028564453125,
-0.0218353271484375,
0.0013170242309570312,
0.052978515625,
0.0390625,
-0.010650634765625,
0.0259246826171875,
-0.0213165283203125,
0.019287109375,
-0.0163726806640625,
-0.0675048828125,
-0.0031604766845703125,
0.033966064453125,
-0.0389404296875,
-0.032379150390625,
-0.013275146484375,
-0.058258056640625,
-0.032745361328125,
-0.00464630126953125,
0.06353759765625,
-0.0279388427734375,
-0.00879669189453125,
-0.002689361572265625,
-0.033843994140625,
0.019256591796875,
0.039825439453125,
-0.0283355712890625,
0.00441741943359375,
0.007251739501953125,
0.04864501953125,
-0.003276824951171875,
-0.01317596435546875,
-0.037353515625,
-0.0028743743896484375,
-0.0270538330078125,
0.055816650390625,
-0.013214111328125,
-0.029296875,
-0.0187225341796875,
0.004673004150390625,
0.0015859603881835938,
-0.0528564453125,
0.054931640625,
-0.028076171875,
0.025604248046875,
-0.0300750732421875,
-0.0220947265625,
0.0054779052734375,
-0.006282806396484375,
-0.05230712890625,
0.08624267578125,
0.016876220703125,
-0.07147216796875,
0.0200958251953125,
-0.049041748046875,
-0.002941131591796875,
-0.0123443603515625,
-0.020294189453125,
-0.032379150390625,
-0.00029540061950683594,
-0.0009322166442871094,
0.0086517333984375,
-0.01253509521484375,
-0.0231475830078125,
-0.01145172119140625,
-0.043792724609375,
0.0267181396484375,
-0.00959014892578125,
0.0899658203125,
0.0291900634765625,
-0.0284423828125,
0.0024662017822265625,
-0.078125,
0.0189056396484375,
0.036346435546875,
-0.0201263427734375,
-0.031341552734375,
-0.0292816162109375,
0.0214691162109375,
0.0208740234375,
0.01403045654296875,
-0.058837890625,
0.006229400634765625,
-0.018096923828125,
0.025634765625,
0.06878662109375,
-0.0049896240234375,
0.0258636474609375,
-0.019561767578125,
0.0270843505859375,
0.015045166015625,
0.0031299591064453125,
-0.002056121826171875,
-0.049835205078125,
-0.07373046875,
-0.02044677734375,
-0.001995086669921875,
0.0467529296875,
-0.07281494140625,
0.021697998046875,
0.01010894775390625,
-0.04254150390625,
-0.0309906005859375,
0.01654052734375,
0.04083251953125,
0.0531005859375,
0.0208892822265625,
-0.0096435546875,
-0.051361083984375,
-0.058929443359375,
0.01221466064453125,
-0.0105743408203125,
0.0192718505859375,
0.0254974365234375,
0.047271728515625,
-0.037872314453125,
0.07623291015625,
-0.06268310546875,
-0.029205322265625,
-0.0170745849609375,
0.01102447509765625,
0.051544189453125,
0.040802001953125,
0.05169677734375,
-0.0628662109375,
-0.0584716796875,
-0.0181732177734375,
-0.04473876953125,
-0.01010894775390625,
-0.0107574462890625,
-0.01256561279296875,
0.0259552001953125,
0.014190673828125,
-0.0712890625,
0.04156494140625,
0.0269317626953125,
-0.041961669921875,
0.053436279296875,
-0.03558349609375,
-0.0102081298828125,
-0.07061767578125,
0.0191192626953125,
0.01250457763671875,
-0.01055145263671875,
-0.037353515625,
0.01519775390625,
0.00852203369140625,
-0.003948211669921875,
-0.038177490234375,
0.060333251953125,
-0.01348114013671875,
0.00623321533203125,
-0.016693115234375,
0.004840850830078125,
0.0209808349609375,
0.0266571044921875,
-0.0026302337646484375,
0.059967041015625,
0.0232696533203125,
-0.033416748046875,
0.0330810546875,
0.0144195556640625,
-0.0274505615234375,
0.0489501953125,
-0.061614990234375,
0.01493072509765625,
-0.03216552734375,
0.0184326171875,
-0.08477783203125,
-0.00319671630859375,
0.0307159423828125,
-0.024322509765625,
0.01352691650390625,
0.0032863616943359375,
-0.0310821533203125,
-0.04425048828125,
-0.0352783203125,
0.01751708984375,
0.0697021484375,
-0.047882080078125,
0.03515625,
0.0003135204315185547,
0.012481689453125,
-0.06402587890625,
-0.0604248046875,
-0.009674072265625,
-0.029022216796875,
-0.046051025390625,
0.01425933837890625,
-0.039703369140625,
-0.0288238525390625,
-0.00011438131332397461,
0.0069427490234375,
-0.0027008056640625,
-0.01287078857421875,
0.032562255859375,
0.0303955078125,
-0.02728271484375,
-0.0236358642578125,
0.00018703937530517578,
-0.0225830078125,
-0.002933502197265625,
0.007770538330078125,
0.036712646484375,
-0.005443572998046875,
-0.0159149169921875,
-0.04229736328125,
0.023162841796875,
0.021759033203125,
0.031158447265625,
0.039947509765625,
0.06573486328125,
-0.020599365234375,
-0.00804901123046875,
-0.038177490234375,
-0.0007047653198242188,
-0.038909912109375,
0.019439697265625,
-0.0179443359375,
-0.0111541748046875,
0.035125732421875,
0.02532958984375,
-0.00281524658203125,
0.0399169921875,
0.034759521484375,
-0.04248046875,
0.09454345703125,
0.054931640625,
0.022369384765625,
0.0297393798828125,
-0.0648193359375,
-0.0022792816162109375,
-0.06219482421875,
-0.00159454345703125,
-0.0399169921875,
-0.020172119140625,
-0.0183563232421875,
-0.034271240234375,
0.032012939453125,
0.0139312744140625,
-0.0313720703125,
0.0308074951171875,
-0.0203094482421875,
0.048065185546875,
0.028228759765625,
0.011260986328125,
0.0150146484375,
0.0133056640625,
-0.0103912353515625,
-0.014129638671875,
-0.037872314453125,
-0.0308990478515625,
0.061676025390625,
0.010833740234375,
0.09259033203125,
0.003192901611328125,
0.06231689453125,
0.0308380126953125,
0.021392822265625,
-0.03656005859375,
0.047576904296875,
-0.0190277099609375,
-0.0394287109375,
-0.007282257080078125,
-0.0226287841796875,
-0.07574462890625,
0.0163726806640625,
-0.0160675048828125,
-0.05084228515625,
0.040008544921875,
0.0245208740234375,
-0.046661376953125,
0.00750732421875,
-0.0784912109375,
0.051788330078125,
-0.0171966552734375,
-0.0496826171875,
0.00003999471664428711,
-0.0428466796875,
0.0198822021484375,
-0.00801849365234375,
0.023590087890625,
0.0069732666015625,
-0.0120697021484375,
0.04443359375,
-0.0309906005859375,
0.06640625,
-0.03521728515625,
-0.031494140625,
0.023590087890625,
-0.005950927734375,
0.05255126953125,
-0.0025157928466796875,
-0.0113677978515625,
-0.0009593963623046875,
0.01702880859375,
-0.03704833984375,
-0.03704833984375,
0.0677490234375,
-0.07891845703125,
-0.036895751953125,
-0.0306549072265625,
-0.022216796875,
0.0181884765625,
0.0227813720703125,
0.053436279296875,
0.0211029052734375,
-0.002841949462890625,
-0.0146942138671875,
0.06109619140625,
-0.0428466796875,
0.0224456787109375,
0.02386474609375,
-0.048828125,
-0.01389312744140625,
0.09246826171875,
0.02142333984375,
0.0276031494140625,
0.0106048583984375,
0.0191497802734375,
-0.018829345703125,
-0.0230560302734375,
-0.052703857421875,
0.026275634765625,
-0.078857421875,
-0.0186309814453125,
-0.04254150390625,
-0.0211029052734375,
-0.013458251953125,
-0.0202484130859375,
-0.0229644775390625,
-0.04443359375,
-0.04266357421875,
0.0096282958984375,
0.0027923583984375,
0.0477294921875,
-0.0183868408203125,
0.018280029296875,
-0.0308990478515625,
0.033782958984375,
0.01287078857421875,
-0.00923919677734375,
-0.00038814544677734375,
-0.029815673828125,
-0.0117034912109375,
0.0170745849609375,
-0.0299072265625,
-0.04833984375,
0.033172607421875,
0.01177215576171875,
0.026702880859375,
0.046295166015625,
-0.00565338134765625,
0.05682373046875,
-0.0192108154296875,
0.057159423828125,
0.03363037109375,
-0.048828125,
0.04931640625,
-0.006328582763671875,
0.00891876220703125,
0.007762908935546875,
0.0286102294921875,
-0.0206756591796875,
-0.012908935546875,
-0.0594482421875,
-0.054534912109375,
0.045318603515625,
0.0197296142578125,
0.0170135498046875,
0.01154327392578125,
0.0211029052734375,
0.0283050537109375,
-0.00653076171875,
-0.078125,
-0.03741455078125,
-0.0211029052734375,
-0.0016298294067382812,
-0.00811767578125,
-0.0203399658203125,
-0.0260009765625,
-0.018646240234375,
0.061798095703125,
-0.011505126953125,
0.0311431884765625,
0.0214080810546875,
-0.00803375244140625,
-0.0307159423828125,
0.004878997802734375,
0.034515380859375,
0.045074462890625,
-0.03240966796875,
0.003143310546875,
0.00708770751953125,
-0.0269927978515625,
0.017547607421875,
-0.004852294921875,
-0.038482666015625,
0.0003330707550048828,
0.019439697265625,
0.054718017578125,
-0.023345947265625,
-0.059906005859375,
0.046478271484375,
-0.0180816650390625,
0.0012874603271484375,
-0.036163330078125,
0.0283050537109375,
0.000324249267578125,
0.01486968994140625,
0.0286102294921875,
0.035003662109375,
0.006999969482421875,
-0.041229248046875,
-0.01995849609375,
0.0369873046875,
-0.01352691650390625,
-0.0250244140625,
0.051422119140625,
0.00750732421875,
-0.00730133056640625,
0.036468505859375,
-0.0226593017578125,
-0.00565338134765625,
0.0703125,
0.06890869140625,
0.030792236328125,
0.014739990234375,
0.03155517578125,
0.04937744140625,
0.016845703125,
0.01148223876953125,
0.041900634765625,
0.00696563720703125,
-0.057037353515625,
-0.006877899169921875,
-0.038299560546875,
-0.0087127685546875,
0.007678985595703125,
-0.036407470703125,
0.03729248046875,
-0.0311431884765625,
-0.018524169921875,
0.006420135498046875,
0.0007929801940917969,
-0.05169677734375,
0.01242828369140625,
0.0183563232421875,
0.08154296875,
-0.07952880859375,
0.07269287109375,
0.043670654296875,
-0.04864501953125,
-0.03375244140625,
-0.016815185546875,
0.0008611679077148438,
-0.07635498046875,
0.07623291015625,
0.0106658935546875,
-0.00963592529296875,
-0.0008015632629394531,
-0.0709228515625,
-0.0784912109375,
0.09552001953125,
0.0288848876953125,
-0.0234527587890625,
-0.01020050048828125,
0.0122833251953125,
0.043975830078125,
-0.02191162109375,
0.017425537109375,
0.0207061767578125,
0.0270538330078125,
0.0288543701171875,
-0.032196044921875,
0.0202484130859375,
-0.018585205078125,
0.021148681640625,
-0.000530242919921875,
-0.048828125,
0.0865478515625,
-0.05908203125,
-0.034912109375,
0.060943603515625,
0.038299560546875,
0.033050537109375,
0.034027099609375,
0.0469970703125,
0.0706787109375,
0.049774169921875,
-0.003162384033203125,
0.08721923828125,
0.0092620849609375,
0.0308990478515625,
0.07281494140625,
-0.027984619140625,
0.050018310546875,
0.057647705078125,
-0.0186004638671875,
0.0413818359375,
0.04534912109375,
-0.00485992431640625,
0.04443359375,
0.00286102294921875,
0.0033931732177734375,
0.00789642333984375,
-0.0093841552734375,
-0.047882080078125,
-0.002361297607421875,
0.03369140625,
-0.035400390625,
-0.0170135498046875,
0.0166168212890625,
0.00446319580078125,
-0.017852783203125,
-0.0082244873046875,
0.0266571044921875,
-0.00035381317138671875,
-0.0260772705078125,
0.072265625,
-0.00399017333984375,
0.0667724609375,
-0.055938720703125,
-0.01544952392578125,
-0.036285400390625,
0.030029296875,
-0.03326416015625,
-0.0628662109375,
0.0093231201171875,
0.0001329183578491211,
-0.00844573974609375,
-0.01132965087890625,
0.058380126953125,
-0.00585174560546875,
-0.04345703125,
0.027496337890625,
0.0288848876953125,
0.0227813720703125,
0.0205078125,
-0.07257080078125,
0.028289794921875,
0.01230621337890625,
-0.01331329345703125,
0.0245361328125,
0.014617919921875,
0.01519775390625,
0.03729248046875,
0.049774169921875,
0.0013055801391601562,
0.004344940185546875,
-0.0086517333984375,
0.06292724609375,
-0.0447998046875,
-0.0201263427734375,
-0.046539306640625,
0.034912109375,
-0.01091766357421875,
-0.0171661376953125,
0.06256103515625,
0.050201416015625,
0.0567626953125,
-0.019500732421875,
0.050201416015625,
-0.037506103515625,
0.02105712890625,
-0.050140380859375,
0.04742431640625,
-0.057830810546875,
0.0224456787109375,
-0.025390625,
-0.074462890625,
0.006053924560546875,
0.04833984375,
-0.002838134765625,
0.0021114349365234375,
0.049346923828125,
0.0794677734375,
-0.01406097412109375,
-0.020782470703125,
0.01519775390625,
0.051177978515625,
0.0163726806640625,
0.038909912109375,
0.0750732421875,
-0.05059814453125,
0.039947509765625,
-0.042205810546875,
-0.0179595947265625,
-0.0054779052734375,
-0.06640625,
-0.04058837890625,
-0.047882080078125,
-0.03582763671875,
-0.06756591796875,
-0.003795623779296875,
0.06640625,
0.04541015625,
-0.04595947265625,
-0.015838623046875,
0.01535797119140625,
0.0119781494140625,
-0.01312255859375,
-0.021240234375,
0.028045654296875,
0.01080322265625,
-0.059326171875,
0.006229400634765625,
0.005741119384765625,
0.028594970703125,
-0.0218658447265625,
-0.0194091796875,
-0.013397216796875,
0.00959014892578125,
0.044403076171875,
0.032318115234375,
-0.046112060546875,
-0.008819580078125,
-0.0214080810546875,
-0.022186279296875,
0.0222625732421875,
0.031829833984375,
-0.072998046875,
0.00872802734375,
0.0382080078125,
0.004985809326171875,
0.03350830078125,
-0.01763916015625,
-0.005237579345703125,
-0.045623779296875,
0.01052093505859375,
-0.0059661865234375,
0.0406494140625,
0.020782470703125,
-0.03363037109375,
0.047821044921875,
0.026397705078125,
-0.023529052734375,
-0.0716552734375,
0.00014078617095947266,
-0.09063720703125,
-0.025146484375,
0.09588623046875,
0.0003337860107421875,
-0.0343017578125,
0.01171112060546875,
-0.06365966796875,
-0.01084136962890625,
-0.02679443359375,
0.054046630859375,
0.0287322998046875,
-0.0255126953125,
-0.0022602081298828125,
-0.0301055908203125,
0.0152740478515625,
0.0162506103515625,
-0.053314208984375,
-0.01152801513671875,
0.03277587890625,
0.032440185546875,
0.046142578125,
0.05694580078125,
-0.0059661865234375,
0.018035888671875,
0.0018463134765625,
-0.00373077392578125,
0.00916290283203125,
0.005199432373046875,
-0.0171966552734375,
0.0022563934326171875,
-0.002796173095703125,
0.011688232421875
]
] |
camembert/camembert-large | 2020-12-11T21:35:25.000Z | [
"transformers",
"pytorch",
"camembert",
"fr",
"arxiv:1911.03894",
"endpoints_compatible",
"region:us"
] | null | camembert | null | null | camembert/camembert-large | 6 | 30,757 | transformers | 2022-03-02T23:29:05 | ---
language: fr
---
# CamemBERT: a Tasty French Language Model
## Introduction
[CamemBERT](https://arxiv.org/abs/1911.03894) is a state-of-the-art language model for French based on the RoBERTa model.
It is now available on Hugging Face in 6 different versions with varying numbers of parameters, amounts of pretraining data, and pretraining data source domains.
For further information or requests, please visit the [CamemBERT website](https://camembert-model.fr/).
## Pre-trained models
| Model | #params | Arch. | Training data |
|--------------------------------|--------------------------------|-------|-----------------------------------|
| `camembert-base` | 110M | Base | OSCAR (138 GB of text) |
| `camembert/camembert-large` | 335M | Large | CCNet (135 GB of text) |
| `camembert/camembert-base-ccnet` | 110M | Base | CCNet (135 GB of text) |
| `camembert/camembert-base-wikipedia-4gb` | 110M | Base | Wikipedia (4 GB of text) |
| `camembert/camembert-base-oscar-4gb` | 110M | Base | Subsample of OSCAR (4 GB of text) |
| `camembert/camembert-base-ccnet-4gb` | 110M | Base | Subsample of CCNet (4 GB of text) |
## How to use CamemBERT with HuggingFace
##### Load CamemBERT and its sub-word tokenizer:
```python
from transformers import CamembertModel, CamembertTokenizer
# You can replace "camembert-base" with any other model from the table, e.g. "camembert/camembert-large".
tokenizer = CamembertTokenizer.from_pretrained("camembert/camembert-large")
camembert = CamembertModel.from_pretrained("camembert/camembert-large")
camembert.eval() # disable dropout (or leave in train mode to finetune)
```
##### Filling masks using pipeline
```python
from transformers import pipeline
camembert_fill_mask = pipeline("fill-mask", model="camembert/camembert-large", tokenizer="camembert/camembert-large")
results = camembert_fill_mask("Le camembert est <mask> :)")
# results
#[{'sequence': '<s> Le camembert est bon :)</s>', 'score': 0.15560828149318695, 'token': 305},
#{'sequence': '<s> Le camembert est excellent :)</s>', 'score': 0.06821336597204208, 'token': 3497},
#{'sequence': '<s> Le camembert est délicieux :)</s>', 'score': 0.060438305139541626, 'token': 11661},
#{'sequence': '<s> Le camembert est ici :)</s>', 'score': 0.02023460529744625, 'token': 373},
#{'sequence': '<s> Le camembert est meilleur :)</s>', 'score': 0.01778135634958744, 'token': 876}]
```
##### Extract contextual embedding features from Camembert output
```python
import torch
# Tokenize in sub-words with SentencePiece
tokenized_sentence = tokenizer.tokenize("J'aime le camembert !")
# ['▁J', "'", 'aime', '▁le', '▁cam', 'ember', 't', '▁!']
# Map sub-word tokens to vocabulary ids and add special start and end tokens
encoded_sentence = tokenizer.encode(tokenized_sentence)
# [5, 133, 22, 1250, 16, 12034, 14324, 81, 76, 6]
# NB: This can be done in one step: tokenizer.encode("J'aime le camembert !")
# Feed tokens to Camembert as a torch tensor (batch dim 1)
encoded_sentence = torch.tensor(encoded_sentence).unsqueeze(0)
embeddings, _ = camembert(encoded_sentence)
# embeddings.detach()
# torch.Size([1, 10, 1024])
#tensor([[[-0.1284, 0.2643, 0.4374, ..., 0.1627, 0.1308, -0.2305],
# [ 0.4576, -0.6345, -0.2029, ..., -0.1359, -0.2290, -0.6318],
# [ 0.0381, 0.0429, 0.5111, ..., -0.1177, -0.1913, -0.1121],
# ...,
```
##### Extract contextual embedding features from all Camembert layers
```python
from transformers import CamembertConfig
# (Need to reload the model with new config)
config = CamembertConfig.from_pretrained("camembert/camembert-large", output_hidden_states=True)
camembert = CamembertModel.from_pretrained("camembert/camembert-large", config=config)
embeddings, _, all_layer_embeddings = camembert(encoded_sentence)
# all_layer_embeddings is a list of length 25 (input embedding layer + 24 self-attention layers)
all_layer_embeddings[5]
# layer 5 contextual embedding : size torch.Size([1, 10, 1024])
#tensor([[[-0.0600, 0.0742, 0.0332, ..., -0.0525, -0.0637, -0.0287],
# [ 0.0950, 0.2840, 0.1985, ..., 0.2073, -0.2172, -0.6321],
# [ 0.1381, 0.1872, 0.1614, ..., -0.0339, -0.2530, -0.1182],
# ...,
```
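Note that recent versions of `transformers` return a `ModelOutput` object rather than a tuple; a minimal sketch under that assumption, reusing the `output_hidden_states=True` configuration above:
```python
# A sketch assuming a recent transformers version, where the forward pass
# returns a ModelOutput instead of a tuple.
outputs = camembert(encoded_sentence)
embeddings = outputs.last_hidden_state        # torch.Size([1, 10, 1024])
all_layer_embeddings = outputs.hidden_states  # tuple of 25 tensors
```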
## Authors
CamemBERT was trained and evaluated by Louis Martin\*, Benjamin Muller\*, Pedro Javier Ortiz Suárez\*, Yoann Dupont, Laurent Romary, Éric Villemonte de la Clergerie, Djamé Seddah and Benoît Sagot.
## Citation
If you use our work, please cite:
```bibtex
@inproceedings{martin2020camembert,
title={CamemBERT: a Tasty French Language Model},
author={Martin, Louis and Muller, Benjamin and Su{\'a}rez, Pedro Javier Ortiz and Dupont, Yoann and Romary, Laurent and de la Clergerie, {\'E}ric Villemonte and Seddah, Djam{\'e} and Sagot, Beno{\^\i}t},
booktitle={Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
year={2020}
}
```
| 5,041 | [
[
-0.013946533203125,
-0.06378173828125,
0.0215911865234375,
0.02642822265625,
-0.0148468017578125,
-0.011260986328125,
-0.0293121337890625,
-0.0029506683349609375,
0.038238525390625,
0.0311279296875,
-0.04107666015625,
-0.049224853515625,
-0.04632568359375,
0.002166748046875,
-0.026092529296875,
0.074951171875,
0.0006227493286132812,
0.00609588623046875,
0.003265380859375,
-0.0038585662841796875,
-0.0152130126953125,
-0.04486083984375,
-0.0450439453125,
-0.0330810546875,
0.032958984375,
0.0074920654296875,
0.03497314453125,
0.034698486328125,
0.018890380859375,
0.0285491943359375,
-0.01081085205078125,
0.0003807544708251953,
-0.03216552734375,
-0.009307861328125,
0.003314971923828125,
-0.040191650390625,
-0.04302978515625,
0.005970001220703125,
0.0369873046875,
0.037750244140625,
-0.0018205642700195312,
0.005001068115234375,
0.007709503173828125,
0.04931640625,
-0.01081085205078125,
0.029815673828125,
-0.034332275390625,
0.007190704345703125,
-0.00820159912109375,
-0.00862884521484375,
-0.031768798828125,
-0.022186279296875,
0.0135345458984375,
-0.043670654296875,
0.007335662841796875,
0.0007510185241699219,
0.087890625,
0.006809234619140625,
-0.0199127197265625,
-0.006072998046875,
-0.0186767578125,
0.06842041015625,
-0.06622314453125,
0.043182373046875,
0.0291748046875,
0.001922607421875,
-0.0178070068359375,
-0.08331298828125,
-0.055267333984375,
-0.015960693359375,
-0.0269927978515625,
0.02716064453125,
-0.0181121826171875,
-0.01471710205078125,
0.010406494140625,
0.0259552001953125,
-0.052276611328125,
-0.0198974609375,
-0.017547607421875,
-0.01509857177734375,
0.052154541015625,
-0.00916290283203125,
0.035919189453125,
-0.0312042236328125,
-0.0269622802734375,
-0.033294677734375,
-0.0284881591796875,
0.001987457275390625,
0.008697509765625,
0.0243682861328125,
-0.032073974609375,
0.05133056640625,
-0.0139312744140625,
0.052642822265625,
-0.00513458251953125,
-0.00440216064453125,
0.055023193359375,
-0.0196685791015625,
-0.0204925537109375,
0.0039520263671875,
0.0797119140625,
0.027252197265625,
0.02752685546875,
-0.00191497802734375,
-0.0093231201171875,
-0.00360107421875,
-0.0008025169372558594,
-0.05712890625,
-0.031494140625,
0.024627685546875,
-0.0223388671875,
-0.0150146484375,
0.0046234130859375,
-0.0517578125,
0.007808685302734375,
-0.004413604736328125,
0.039794921875,
-0.057647705078125,
-0.01169586181640625,
0.01611328125,
-0.01654052734375,
0.019561767578125,
0.0097808837890625,
-0.05413818359375,
0.005535125732421875,
0.0322265625,
0.07177734375,
0.002552032470703125,
-0.0309906005859375,
-0.028472900390625,
-0.00971221923828125,
-0.006671905517578125,
0.036895751953125,
-0.0276336669921875,
-0.0176849365234375,
-0.00601959228515625,
0.023681640625,
-0.0238189697265625,
-0.0183868408203125,
0.0401611328125,
-0.0213165283203125,
0.03656005859375,
-0.00939178466796875,
-0.0511474609375,
-0.0287628173828125,
0.033294677734375,
-0.0438232421875,
0.0833740234375,
0.034210205078125,
-0.0714111328125,
0.016387939453125,
-0.038330078125,
-0.0234222412109375,
0.002246856689453125,
-0.0234375,
-0.04083251953125,
-0.0016050338745117188,
0.043182373046875,
0.044464111328125,
-0.01447296142578125,
0.01439666748046875,
-0.0034732818603515625,
-0.016082763671875,
0.0311431884765625,
-0.023681640625,
0.08856201171875,
0.00818634033203125,
-0.0275115966796875,
0.0102691650390625,
-0.05712890625,
0.01390838623046875,
0.0232696533203125,
-0.03656005859375,
0.0005059242248535156,
-0.0152740478515625,
0.018890380859375,
0.0089874267578125,
0.01551055908203125,
-0.038787841796875,
0.0108489990234375,
-0.050079345703125,
0.05810546875,
0.048797607421875,
0.00760650634765625,
0.00974273681640625,
-0.0297393798828125,
0.0239105224609375,
0.013671875,
-0.00984954833984375,
-0.0145263671875,
-0.036041259765625,
-0.06781005859375,
-0.04669189453125,
0.0357666015625,
0.0565185546875,
-0.05853271484375,
0.0670166015625,
-0.034454345703125,
-0.0460205078125,
-0.037353515625,
-0.00612640380859375,
0.01068878173828125,
0.0229339599609375,
0.035125732421875,
-0.026519775390625,
-0.0343017578125,
-0.0670166015625,
-0.0099029541015625,
-0.00777435302734375,
0.0011119842529296875,
0.013427734375,
0.0491943359375,
-0.0162353515625,
0.08050537109375,
-0.045562744140625,
-0.0172119140625,
-0.0218505859375,
-0.005558013916015625,
0.035858154296875,
0.05908203125,
0.054534912109375,
-0.041900634765625,
-0.0421142578125,
0.0015392303466796875,
-0.064453125,
0.0171356201171875,
0.0004076957702636719,
-0.01119232177734375,
0.0026035308837890625,
0.0372314453125,
-0.0501708984375,
0.02972412109375,
0.0254669189453125,
-0.030426025390625,
0.033416748046875,
-0.01061248779296875,
0.01111602783203125,
-0.1063232421875,
-0.00540924072265625,
0.00786590576171875,
-0.02069091796875,
-0.04229736328125,
0.0015535354614257812,
-0.0007801055908203125,
-0.0126495361328125,
-0.045166015625,
0.050079345703125,
-0.032440185546875,
0.0246429443359375,
0.012969970703125,
0.0247344970703125,
0.0026187896728515625,
0.07073974609375,
0.010223388671875,
0.036407470703125,
0.056549072265625,
-0.0248565673828125,
0.042083740234375,
0.035858154296875,
-0.040069580078125,
0.042144775390625,
-0.052337646484375,
0.0036182403564453125,
-0.0009369850158691406,
0.02984619140625,
-0.080810546875,
-0.015655517578125,
0.0364990234375,
-0.04931640625,
0.019256591796875,
-0.007801055908203125,
-0.03704833984375,
-0.03094482421875,
-0.0229339599609375,
0.0191802978515625,
0.0322265625,
-0.03619384765625,
0.0406494140625,
0.005771636962890625,
-0.0015649795532226562,
-0.03594970703125,
-0.07745361328125,
0.0022907257080078125,
-0.0254058837890625,
-0.05859375,
0.03192138671875,
-0.0157928466796875,
0.0125885009765625,
0.0027446746826171875,
0.004192352294921875,
0.0016078948974609375,
-0.0070953369140625,
0.00665283203125,
0.0016908645629882812,
-0.0174713134765625,
-0.00664520263671875,
-0.00421905517578125,
-0.004489898681640625,
-0.011260986328125,
-0.03167724609375,
0.06231689453125,
-0.02325439453125,
-0.01158905029296875,
-0.03936767578125,
0.0138702392578125,
0.029022216796875,
-0.0191192626953125,
0.06817626953125,
0.072998046875,
-0.034698486328125,
-0.0018796920776367188,
-0.0303955078125,
-0.021270751953125,
-0.036865234375,
0.042022705078125,
-0.041900634765625,
-0.0631103515625,
0.043914794921875,
0.01360321044921875,
0.0022602081298828125,
0.044677734375,
0.043701171875,
0.005359649658203125,
0.06536865234375,
0.030120849609375,
-0.00676727294921875,
0.035675048828125,
-0.050079345703125,
0.01512908935546875,
-0.054962158203125,
-0.0264434814453125,
-0.0260162353515625,
-0.01873779296875,
-0.0540771484375,
-0.024749755859375,
0.0266571044921875,
0.01031494140625,
-0.017852783203125,
0.046539306640625,
-0.039337158203125,
0.011505126953125,
0.042633056640625,
0.0364990234375,
-0.004547119140625,
0.0180511474609375,
-0.03289794921875,
-0.0092010498046875,
-0.0638427734375,
-0.0268402099609375,
0.0596923828125,
0.03887939453125,
0.043701171875,
0.01049041748046875,
0.057464599609375,
0.0092926025390625,
0.0105743408203125,
-0.0565185546875,
0.051666259765625,
-0.00675201416015625,
-0.047119140625,
-0.015655517578125,
-0.037689208984375,
-0.06268310546875,
0.0206756591796875,
-0.01470184326171875,
-0.07550048828125,
0.0078582763671875,
0.005222320556640625,
-0.015625,
0.0214691162109375,
-0.061248779296875,
0.08013916015625,
-0.0201416015625,
-0.02862548828125,
0.006023406982421875,
-0.043914794921875,
0.0132293701171875,
0.007598876953125,
0.0140228271484375,
0.0037975311279296875,
0.01953125,
0.07757568359375,
-0.03778076171875,
0.0703125,
0.0025577545166015625,
-0.005054473876953125,
0.023956298828125,
0.0042724609375,
0.034027099609375,
0.015899658203125,
-0.00213623046875,
0.01904296875,
0.005123138427734375,
-0.040985107421875,
-0.040679931640625,
0.06268310546875,
-0.06304931640625,
-0.0367431640625,
-0.0439453125,
-0.031463623046875,
0.0000050067901611328125,
0.0269775390625,
0.046630859375,
0.044677734375,
-0.01255035400390625,
0.0233917236328125,
0.0241851806640625,
-0.0306243896484375,
0.040863037109375,
0.004390716552734375,
-0.01319122314453125,
-0.048675537109375,
0.07916259765625,
0.00626373291015625,
0.0025691986083984375,
0.035247802734375,
0.0194244384765625,
-0.017425537109375,
-0.020751953125,
-0.02587890625,
0.0357666015625,
-0.04638671875,
-0.005733489990234375,
-0.060638427734375,
-0.040985107421875,
-0.040069580078125,
-0.0168304443359375,
-0.0308685302734375,
-0.042938232421875,
-0.031158447265625,
-0.01332855224609375,
0.040435791015625,
0.02783203125,
-0.0015430450439453125,
0.04339599609375,
-0.054962158203125,
-0.004940032958984375,
0.017059326171875,
0.02081298828125,
-0.007171630859375,
-0.05224609375,
-0.0306854248046875,
0.001049041748046875,
-0.0205230712890625,
-0.064453125,
0.0423583984375,
0.00994873046875,
0.042510986328125,
0.015899658203125,
-0.006679534912109375,
0.03302001953125,
-0.040618896484375,
0.085205078125,
0.02880859375,
-0.07415771484375,
0.032470703125,
-0.01407623291015625,
0.0189971923828125,
0.0239715576171875,
0.031524658203125,
-0.047882080078125,
-0.0171661376953125,
-0.0592041015625,
-0.0794677734375,
0.06396484375,
0.047149658203125,
0.01314544677734375,
-0.017669677734375,
0.0161285400390625,
-0.0092010498046875,
0.0161590576171875,
-0.0838623046875,
-0.03472900390625,
-0.04608154296875,
-0.0364990234375,
-0.0208282470703125,
-0.00832366943359375,
-0.010345458984375,
-0.033447265625,
0.06427001953125,
0.008544921875,
0.044281005859375,
0.0242462158203125,
-0.0167236328125,
0.01415252685546875,
0.01076507568359375,
0.040374755859375,
0.03656005859375,
-0.033538818359375,
0.0122528076171875,
0.0126190185546875,
-0.03582763671875,
0.00696563720703125,
0.0169525146484375,
0.000026881694793701172,
0.0020751953125,
0.05413818359375,
0.078125,
0.0051116943359375,
-0.040435791015625,
0.03887939453125,
-0.0028228759765625,
-0.023406982421875,
-0.03515625,
0.0009202957153320312,
0.0091094970703125,
0.031280517578125,
0.0112152099609375,
-0.0137176513671875,
-0.0241851806640625,
-0.03973388671875,
0.037567138671875,
0.0234222412109375,
-0.042236328125,
-0.028961181640625,
0.051116943359375,
0.00311279296875,
-0.032684326171875,
0.0528564453125,
-0.00829315185546875,
-0.057220458984375,
0.029998779296875,
0.048431396484375,
0.06695556640625,
-0.005828857421875,
0.018402099609375,
0.04632568359375,
0.029937744140625,
0.0028858184814453125,
0.011993408203125,
0.0110626220703125,
-0.06915283203125,
-0.007076263427734375,
-0.0535888671875,
0.021881103515625,
0.02252197265625,
-0.029998779296875,
0.01068878173828125,
-0.029266357421875,
-0.0229339599609375,
-0.00167083740234375,
0.001018524169921875,
-0.05670166015625,
0.029815673828125,
-0.01009368896484375,
0.06353759765625,
-0.07073974609375,
0.05682373046875,
0.0518798828125,
-0.0474853515625,
-0.059173583984375,
0.004482269287109375,
-0.01154327392578125,
-0.0703125,
0.051513671875,
0.01580810546875,
0.0132904052734375,
0.0289154052734375,
-0.04425048828125,
-0.0582275390625,
0.0791015625,
0.02117919921875,
-0.02838134765625,
-0.001995086669921875,
-0.0039005279541015625,
0.031402587890625,
-0.028411865234375,
0.03271484375,
0.053680419921875,
0.030303955078125,
0.005889892578125,
-0.04583740234375,
0.01374053955078125,
-0.0311279296875,
0.0007715225219726562,
-0.005855560302734375,
-0.0634765625,
0.0648193359375,
0.0003223419189453125,
-0.0081024169921875,
-0.009124755859375,
0.0640869140625,
0.01479339599609375,
-0.0021514892578125,
0.0302276611328125,
0.0626220703125,
0.038848876953125,
-0.0203857421875,
0.06866455078125,
-0.02081298828125,
0.04925537109375,
0.061370849609375,
0.01395416259765625,
0.050689697265625,
0.02850341796875,
-0.0263519287109375,
0.046630859375,
0.056243896484375,
0.0049896240234375,
0.043182373046875,
0.02191162109375,
-0.01029205322265625,
-0.00305938720703125,
0.01336669921875,
-0.032501220703125,
0.0305633544921875,
0.0243988037109375,
-0.0287322998046875,
-0.00897979736328125,
0.003826141357421875,
0.0262908935546875,
-0.0022335052490234375,
0.008087158203125,
0.03533935546875,
0.02081298828125,
-0.03887939453125,
0.07135009765625,
0.0144805908203125,
0.048583984375,
-0.042999267578125,
0.01132965087890625,
-0.0201416015625,
0.00807952880859375,
-0.0160675048828125,
-0.056304931640625,
-0.00183868408203125,
-0.0102691650390625,
0.00696563720703125,
-0.0032482147216796875,
0.0233917236328125,
-0.051513671875,
-0.0562744140625,
0.0242462158203125,
0.039031982421875,
0.01336669921875,
0.004337310791015625,
-0.07843017578125,
0.01372528076171875,
0.0250091552734375,
-0.052947998046875,
0.002353668212890625,
0.0282440185546875,
0.018310546875,
0.035614013671875,
0.032073974609375,
0.00013685226440429688,
0.003948211669921875,
0.0057220458984375,
0.06402587890625,
-0.056488037109375,
-0.045745849609375,
-0.07489013671875,
0.05230712890625,
-0.0006504058837890625,
-0.0214996337890625,
0.054962158203125,
0.049072265625,
0.06280517578125,
-0.0122222900390625,
0.057525634765625,
-0.0218658447265625,
0.0214691162109375,
-0.05657958984375,
0.052337646484375,
-0.06658935546875,
0.006908416748046875,
-0.030120849609375,
-0.07879638671875,
-0.0255126953125,
0.066650390625,
-0.020599365234375,
0.0225067138671875,
0.06353759765625,
0.07354736328125,
-0.022705078125,
-0.01493072509765625,
0.0140533447265625,
0.03692626953125,
0.037811279296875,
0.042724609375,
0.032135009765625,
-0.048583984375,
0.0210723876953125,
-0.031585693359375,
-0.0216217041015625,
-0.0172882080078125,
-0.06658935546875,
-0.08294677734375,
-0.06793212890625,
-0.035369873046875,
-0.045928955078125,
-0.00452423095703125,
0.08123779296875,
0.051971435546875,
-0.0673828125,
-0.0239410400390625,
-0.005756378173828125,
-0.005527496337890625,
-0.023193359375,
-0.02203369140625,
0.058929443359375,
-0.00615692138671875,
-0.057342529296875,
0.034210205078125,
-0.0015802383422851562,
0.0136260986328125,
0.0016937255859375,
-0.00736236572265625,
-0.04644775390625,
0.01389312744140625,
0.048126220703125,
0.00586700439453125,
-0.046905517578125,
-0.032867431640625,
-0.008636474609375,
-0.0019817352294921875,
0.01486968994140625,
0.0302276611328125,
-0.040252685546875,
0.02557373046875,
0.039947509765625,
0.033203125,
0.07037353515625,
-0.00946044921875,
0.032379150390625,
-0.08123779296875,
0.032135009765625,
0.0018339157104492188,
0.0374755859375,
0.032623291015625,
-0.0180816650390625,
0.040069580078125,
0.036712646484375,
-0.0343017578125,
-0.0570068359375,
0.01021575927734375,
-0.09368896484375,
-0.0227508544921875,
0.06597900390625,
-0.013153076171875,
-0.0374755859375,
0.02459716796875,
-0.01739501953125,
0.038970947265625,
-0.035491943359375,
0.03643798828125,
0.056610107421875,
-0.002033233642578125,
-0.0221710205078125,
-0.040435791015625,
0.03387451171875,
0.033538818359375,
-0.046539306640625,
-0.0094451904296875,
0.0130157470703125,
0.036865234375,
0.0186004638671875,
0.0479736328125,
-0.0034332275390625,
-0.009246826171875,
0.0015840530395507812,
0.0049896240234375,
0.0067138671875,
-0.0110626220703125,
-0.00971221923828125,
0.0030994415283203125,
-0.0109405517578125,
-0.0230865478515625
]
] |
Lykon/absolute-realism-1.81 | 2023-08-27T16:05:21.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"art",
"artistic",
"absolute-realism",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Lykon | null | null | Lykon/absolute-realism-1.81 | 1 | 30,675 | diffusers | 2023-08-27T16:05:21 | ---
language:
- en
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- art
- artistic
- diffusers
- absolute-realism
duplicated_from: lykon-absolute-realism/absolute-realism-1.81
---
# Absolute realism 1.81
`lykon-absolute-realism/absolute-realism-1.81` is a Stable Diffusion model that has been fine-tuned on [runwayml/stable-diffusion-v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5).
Please consider supporting me:
- on [Patreon](https://www.patreon.com/Lykon275)
- or [buy me a coffee](https://snipfeed.co/lykon)
## Diffusers
For more general information on how to run text-to-image models with 🧨 Diffusers, see [the docs](https://huggingface.co/docs/diffusers/using-diffusers/conditional_image_generation).
1. Installation
```
pip install diffusers transformers accelerate
```
2. Run
```py
from diffusers import AutoPipelineForText2Image, DEISMultistepScheduler
import torch
pipe = AutoPipelineForText2Image.from_pretrained('lykon-absolute-realism/absolute-realism-1.81', torch_dtype=torch.float16, variant="fp16")
pipe.scheduler = DEISMultistepScheduler.from_config(pipe.scheduler.config)
pipe = pipe.to("cuda")
prompt = "portrait photo of muscular bearded guy in a worn mech suit, light bokeh, intricate, steel metal, elegant, sharp focus, soft lighting, vibrant colors"
generator = torch.manual_seed(33)
image = pipe(prompt, generator=generator, num_inference_steps=25).images[0]
image.save("./image.png")
```

| 1,513 | [
[
-0.0260467529296875,
-0.057037353515625,
0.040130615234375,
0.0264129638671875,
-0.02288818359375,
-0.006870269775390625,
-0.0070343017578125,
-0.027923583984375,
0.0169677734375,
0.02386474609375,
-0.0419921875,
-0.029815673828125,
-0.054351806640625,
-0.00127410888671875,
-0.0264129638671875,
0.06329345703125,
-0.01206207275390625,
-0.014068603515625,
0.005092620849609375,
0.002140045166015625,
0.006504058837890625,
-0.0122528076171875,
-0.0667724609375,
-0.01433563232421875,
0.0450439453125,
0.00234222412109375,
0.06353759765625,
0.04559326171875,
0.034423828125,
0.0292205810546875,
-0.00644683837890625,
-0.008697509765625,
-0.03173828125,
0.00791168212890625,
-0.0118865966796875,
-0.00800323486328125,
-0.022216796875,
-0.00020802021026611328,
0.04974365234375,
0.005950927734375,
-0.0274505615234375,
0.0036296844482421875,
-0.01242828369140625,
0.033294677734375,
-0.03662109375,
-0.003875732421875,
-0.015350341796875,
0.005748748779296875,
0.0014476776123046875,
0.00742340087890625,
-0.01436614990234375,
-0.0170745849609375,
0.01103973388671875,
-0.07464599609375,
0.035369873046875,
-0.004974365234375,
0.099853515625,
0.0225982666015625,
-0.035552978515625,
-0.01284027099609375,
-0.06158447265625,
0.055938720703125,
-0.060516357421875,
0.0181732177734375,
0.00838470458984375,
0.0274505615234375,
-0.0030536651611328125,
-0.11212158203125,
-0.0455322265625,
-0.00836181640625,
-0.01123809814453125,
0.033599853515625,
-0.0188751220703125,
0.01708984375,
0.02838134765625,
0.03509521484375,
-0.035400390625,
-0.005191802978515625,
-0.035003662109375,
-0.0184478759765625,
0.042022705078125,
0.006969451904296875,
0.0169830322265625,
0.00948333740234375,
-0.027496337890625,
-0.01442718505859375,
-0.007297515869140625,
-0.0016698837280273438,
0.02783203125,
-0.017913818359375,
-0.031890869140625,
0.04461669921875,
-0.0182342529296875,
0.03411865234375,
0.034027099609375,
-0.0230865478515625,
0.00884246826171875,
-0.007701873779296875,
-0.018768310546875,
-0.013275146484375,
0.0770263671875,
0.03021240234375,
0.0158233642578125,
0.008697509765625,
-0.0161285400390625,
0.01319122314453125,
0.01690673828125,
-0.099853515625,
-0.040802001953125,
0.0355224609375,
-0.033355712890625,
-0.05010986328125,
-0.01068878173828125,
-0.044219970703125,
-0.0107269287109375,
0.000782012939453125,
0.04144287109375,
-0.0253143310546875,
-0.0428466796875,
0.00850677490234375,
-0.006984710693359375,
0.01525115966796875,
0.03375244140625,
-0.04351806640625,
0.0113677978515625,
0.020111083984375,
0.07879638671875,
0.0115203857421875,
-0.00518035888671875,
-0.0016489028930664062,
0.0019702911376953125,
-0.036102294921875,
0.05755615234375,
0.0034847259521484375,
-0.0269775390625,
0.01071929931640625,
0.024200439453125,
0.01206207275390625,
-0.0294036865234375,
0.06915283203125,
-0.03558349609375,
0.022308349609375,
0.0025463104248046875,
-0.052825927734375,
0.0016050338745117188,
0.005809783935546875,
-0.03961181640625,
0.07501220703125,
0.031890869140625,
-0.07501220703125,
0.030487060546875,
-0.042877197265625,
-0.0153045654296875,
-0.008026123046875,
-0.0028553009033203125,
-0.05560302734375,
-0.00012046098709106445,
-0.01284027099609375,
0.0338134765625,
0.0109710693359375,
0.0122528076171875,
-0.036285400390625,
-0.0217132568359375,
0.0012025833129882812,
-0.03424072265625,
0.0687255859375,
0.022216796875,
-0.038787841796875,
0.01097869873046875,
-0.044647216796875,
0.0013723373413085938,
0.0162353515625,
-0.009521484375,
0.0155029296875,
-0.0202789306640625,
0.03466796875,
0.033905029296875,
0.01739501953125,
-0.04840087890625,
0.0196990966796875,
-0.027313232421875,
0.01009368896484375,
0.0614013671875,
0.0004413127899169922,
0.031219482421875,
-0.03411865234375,
0.033172607421875,
0.004718780517578125,
0.00485992431640625,
0.005214691162109375,
-0.047607421875,
-0.06512451171875,
-0.0576171875,
0.002483367919921875,
0.037506103515625,
-0.05108642578125,
0.0289306640625,
-0.01065826416015625,
-0.03826904296875,
-0.01873779296875,
-0.0197906494140625,
0.0135345458984375,
0.0423583984375,
0.003662109375,
-0.017486572265625,
-0.0290069580078125,
-0.06561279296875,
-0.002521514892578125,
-0.0046844482421875,
-0.0153045654296875,
0.00780487060546875,
0.0423583984375,
-0.0110015869140625,
0.049224853515625,
-0.039306640625,
-0.0187225341796875,
-0.0032939910888671875,
0.00579071044921875,
0.048553466796875,
0.047271728515625,
0.050140380859375,
-0.052398681640625,
-0.06494140625,
-0.0078277587890625,
-0.062744140625,
0.006908416748046875,
-0.00772857666015625,
-0.0218963623046875,
0.01416015625,
0.025787353515625,
-0.07891845703125,
0.05377197265625,
0.040435791015625,
-0.05145263671875,
0.0655517578125,
-0.0457763671875,
0.01873779296875,
-0.08319091796875,
0.0148162841796875,
0.0308074951171875,
-0.048187255859375,
-0.045196533203125,
0.0165557861328125,
-0.00021445751190185547,
-0.002582550048828125,
-0.051666259765625,
0.07489013671875,
-0.0200653076171875,
0.02630615234375,
-0.01485443115234375,
-0.00969696044921875,
0.0033321380615234375,
0.0261688232421875,
0.0177459716796875,
0.033355712890625,
0.0712890625,
-0.045196533203125,
0.047332763671875,
0.020904541015625,
-0.0297088623046875,
0.04559326171875,
-0.0721435546875,
0.00876617431640625,
-0.00719451904296875,
0.035369873046875,
-0.07501220703125,
-0.03387451171875,
0.050994873046875,
-0.0283203125,
0.0310516357421875,
-0.0243072509765625,
-0.039093017578125,
-0.024078369140625,
-0.017578125,
0.037689208984375,
0.07244873046875,
-0.040618896484375,
0.039093017578125,
-0.0102996826171875,
0.0210418701171875,
-0.04180908203125,
-0.0498046875,
-0.005908966064453125,
-0.0226898193359375,
-0.05706787109375,
0.0310211181640625,
-0.017913818359375,
-0.0274505615234375,
0.00312042236328125,
-0.0075225830078125,
-0.0166473388671875,
-0.0250091552734375,
0.01302337646484375,
0.01496124267578125,
-0.0157928466796875,
-0.0491943359375,
0.006084442138671875,
-0.0172882080078125,
0.0071868896484375,
-0.0301055908203125,
0.0162506103515625,
0.0022602081298828125,
-0.01065826416015625,
-0.0750732421875,
0.004558563232421875,
0.044830322265625,
0.041412353515625,
0.07440185546875,
0.07672119140625,
-0.036346435546875,
0.0007252693176269531,
-0.034393310546875,
-0.018341064453125,
-0.044281005859375,
0.0267181396484375,
-0.034912109375,
-0.023529052734375,
0.039886474609375,
-0.0036563873291015625,
0.0171356201171875,
0.042205810546875,
0.05010986328125,
-0.042816162109375,
0.08026123046875,
0.038543701171875,
0.0139617919921875,
0.032379150390625,
-0.06549072265625,
0.006008148193359375,
-0.044769287109375,
-0.0115814208984375,
-0.013275146484375,
-0.0263671875,
-0.039764404296875,
-0.0323486328125,
0.032379150390625,
0.0207061767578125,
-0.03021240234375,
0.01119232177734375,
-0.038970947265625,
0.029388427734375,
0.0197906494140625,
0.0198822021484375,
0.005275726318359375,
0.0179901123046875,
-0.0108642578125,
-0.016082763671875,
-0.04833984375,
-0.035919189453125,
0.0546875,
0.031951904296875,
0.06634521484375,
0.004703521728515625,
0.040740966796875,
0.0160980224609375,
0.0269775390625,
-0.058013916015625,
0.039794921875,
-0.006290435791015625,
-0.0657958984375,
-0.0053863525390625,
-0.0237274169921875,
-0.058807373046875,
0.0079498291015625,
-0.0201873779296875,
-0.03857421875,
0.004730224609375,
0.0289306640625,
-0.003246307373046875,
0.042266845703125,
-0.028350830078125,
0.06890869140625,
-0.00047206878662109375,
-0.05145263671875,
-0.0175628662109375,
-0.041839599609375,
0.0234222412109375,
0.01690673828125,
0.016082763671875,
-0.0094757080078125,
-0.018402099609375,
0.033477783203125,
-0.0220489501953125,
0.0531005859375,
-0.02545166015625,
-0.0150146484375,
0.0266571044921875,
0.028350830078125,
0.04083251953125,
0.01715087890625,
-0.0154571533203125,
0.01067352294921875,
0.01320648193359375,
-0.05767822265625,
-0.03973388671875,
0.05828857421875,
-0.056243896484375,
-0.0161285400390625,
-0.05084228515625,
-0.026214599609375,
0.0274505615234375,
0.0154571533203125,
0.06854248046875,
0.04669189453125,
-0.01134490966796875,
-0.0049896240234375,
0.050567626953125,
-0.0046539306640625,
0.0430908203125,
0.0024585723876953125,
-0.03070068359375,
-0.051300048828125,
0.047027587890625,
-0.0012674331665039062,
0.045257568359375,
-0.00385284423828125,
0.024810791015625,
-0.03619384765625,
-0.0249786376953125,
-0.0528564453125,
0.0273590087890625,
-0.057861328125,
-0.00420379638671875,
-0.033935546875,
-0.0283966064453125,
-0.01332855224609375,
-0.037322998046875,
-0.033172607421875,
-0.042022705078125,
-0.0389404296875,
0.0178680419921875,
0.043609619140625,
0.045989990234375,
-0.0159454345703125,
0.02099609375,
-0.03021240234375,
0.0406494140625,
0.01251220703125,
0.026275634765625,
-0.01264190673828125,
-0.0753173828125,
-0.024566650390625,
0.005950927734375,
-0.035614013671875,
-0.039306640625,
0.03314208984375,
0.0231475830078125,
0.029815673828125,
0.0465087890625,
0.00409698486328125,
0.04840087890625,
-0.03082275390625,
0.06060791015625,
0.026275634765625,
-0.05023193359375,
0.0238189697265625,
-0.052490234375,
0.0111541748046875,
0.00936126708984375,
0.030792236328125,
-0.02801513671875,
-0.0416259765625,
-0.078369140625,
-0.06201171875,
0.042877197265625,
0.0572509765625,
0.0141143798828125,
0.017822265625,
0.048492431640625,
-0.009552001953125,
-0.00368499755859375,
-0.0699462890625,
-0.043212890625,
-0.03582763671875,
-0.027618408203125,
0.00879669189453125,
-0.00507354736328125,
-0.0157623291015625,
-0.040802001953125,
0.08074951171875,
-0.00490570068359375,
0.030426025390625,
0.04669189453125,
0.0194549560546875,
-0.020782470703125,
-0.02227783203125,
0.047882080078125,
0.05108642578125,
-0.0306396484375,
0.0039215087890625,
0.0027141571044921875,
-0.039031982421875,
0.027587890625,
-0.01522064208984375,
-0.018768310546875,
0.028533935546875,
0.0196533203125,
0.05352783203125,
-0.009368896484375,
-0.0211181640625,
0.0419921875,
-0.0149993896484375,
-0.033233642578125,
-0.043701171875,
-0.00011551380157470703,
0.00926971435546875,
0.018280029296875,
0.01029205322265625,
0.033966064453125,
0.0023021697998046875,
-0.02154541015625,
0.01187896728515625,
0.0362548828125,
-0.0191802978515625,
-0.0338134765625,
0.07147216796875,
-0.001422882080078125,
-0.02813720703125,
0.0287628173828125,
-0.032623291015625,
-0.0186309814453125,
0.055938720703125,
0.05352783203125,
0.07489013671875,
-0.0005826950073242188,
0.02386474609375,
0.0662841796875,
0.007366180419921875,
-0.033111572265625,
0.0138702392578125,
0.01181793212890625,
-0.0457763671875,
-0.016082763671875,
-0.03448486328125,
-0.02740478515625,
0.00760650634765625,
-0.0266265869140625,
0.047210693359375,
-0.04913330078125,
-0.017303466796875,
-0.01097869873046875,
-0.000888824462890625,
-0.037994384765625,
0.0341796875,
0.006622314453125,
0.0712890625,
-0.052642822265625,
0.06561279296875,
0.043182373046875,
-0.026092529296875,
-0.05755615234375,
0.00643157958984375,
-0.01232147216796875,
-0.033294677734375,
0.024383544921875,
0.00453948974609375,
-0.00516510009765625,
0.017059326171875,
-0.0506591796875,
-0.054351806640625,
0.0819091796875,
0.041168212890625,
-0.03363037109375,
0.0013065338134765625,
-0.01629638671875,
0.044708251953125,
-0.016204833984375,
0.036346435546875,
0.01367950439453125,
0.033447265625,
0.0260009765625,
-0.059326171875,
-0.00211334228515625,
-0.01346588134765625,
0.01183319091796875,
-0.006031036376953125,
-0.058074951171875,
0.0572509765625,
-0.02545166015625,
-0.018341064453125,
0.05145263671875,
0.059814453125,
0.03564453125,
0.02471923828125,
0.037017822265625,
0.0643310546875,
0.0148468017578125,
-0.00463104248046875,
0.0836181640625,
-0.022369384765625,
0.06524658203125,
0.036956787109375,
0.00838470458984375,
0.0311279296875,
0.026763916015625,
-0.020172119140625,
0.046600341796875,
0.05877685546875,
-0.002483367919921875,
0.044036865234375,
0.0112457275390625,
-0.04632568359375,
-0.009002685546875,
0.01091766357421875,
-0.027618408203125,
-0.0027332305908203125,
0.003448486328125,
-0.0191802978515625,
-0.00601959228515625,
-0.0084075927734375,
0.007213592529296875,
-0.006191253662109375,
-0.0122222900390625,
0.033477783203125,
0.00341033935546875,
-0.015106201171875,
0.068603515625,
-0.0017747879028320312,
0.08074951171875,
-0.05462646484375,
-0.00675201416015625,
-0.003246307373046875,
0.0268707275390625,
-0.03204345703125,
-0.0736083984375,
0.0235748291015625,
-0.0176849365234375,
-0.004741668701171875,
-0.023040771484375,
0.06536865234375,
-0.032196044921875,
-0.07098388671875,
0.0190277099609375,
-0.0024662017822265625,
0.0277099609375,
0.00943756103515625,
-0.06134033203125,
0.009429931640625,
-0.00396728515625,
-0.040130615234375,
0.006473541259765625,
0.0278778076171875,
0.039306640625,
0.051055908203125,
0.037841796875,
0.01800537109375,
0.004283905029296875,
-0.0108184814453125,
0.044525146484375,
-0.0248565673828125,
-0.027740478515625,
-0.0616455078125,
0.06561279296875,
-0.00762176513671875,
-0.025787353515625,
0.043304443359375,
0.05157470703125,
0.040802001953125,
-0.036041259765625,
0.05401611328125,
-0.0283966064453125,
0.011749267578125,
-0.033294677734375,
0.07098388671875,
-0.0518798828125,
-0.0023899078369140625,
-0.030670166015625,
-0.06878662109375,
-0.00702667236328125,
0.07958984375,
0.0239410400390625,
0.0162200927734375,
0.038970947265625,
0.07305908203125,
-0.041656494140625,
-0.03546142578125,
0.031219482421875,
0.02117919921875,
0.028350830078125,
-0.001819610595703125,
0.0556640625,
-0.06451416015625,
0.03155517578125,
-0.06427001953125,
-0.0141143798828125,
0.01274871826171875,
-0.07696533203125,
-0.047882080078125,
-0.044097900390625,
-0.059539794921875,
-0.07733154296875,
-0.01422119140625,
0.056060791015625,
0.0826416015625,
-0.0445556640625,
-0.01103973388671875,
-0.0264892578125,
-0.005519866943359375,
-0.0008955001831054688,
-0.02197265625,
0.015411376953125,
0.004962921142578125,
-0.0701904296875,
-0.004817962646484375,
0.01152801513671875,
0.025726318359375,
-0.0164947509765625,
-0.0205230712890625,
-0.007564544677734375,
0.001789093017578125,
0.0234222412109375,
0.02862548828125,
-0.0592041015625,
-0.0177001953125,
-0.0155029296875,
-0.00617218017578125,
0.006916046142578125,
0.0384521484375,
-0.0545654296875,
0.01505279541015625,
0.022369384765625,
-0.00030303001403808594,
0.0618896484375,
-0.0135955810546875,
0.0179290771484375,
-0.04290771484375,
0.04095458984375,
0.0240325927734375,
0.0196990966796875,
0.01287078857421875,
-0.036834716796875,
0.0310211181640625,
0.045166015625,
-0.06561279296875,
-0.05767822265625,
0.013275146484375,
-0.09051513671875,
-0.00530242919921875,
0.08734130859375,
-0.0195159912109375,
-0.009368896484375,
0.005950927734375,
-0.038970947265625,
0.0223846435546875,
-0.0226898193359375,
0.051605224609375,
0.036712646484375,
-0.01485443115234375,
-0.0228729248046875,
-0.0374755859375,
0.044952392578125,
0.01097869873046875,
-0.045135498046875,
-0.011138916015625,
0.0306396484375,
0.06304931640625,
0.0215606689453125,
0.060150146484375,
-0.00032520294189453125,
0.01342010498046875,
0.0181732177734375,
0.008270263671875,
0.0175018310546875,
0.0009860992431640625,
-0.0187530517578125,
-0.0118408203125,
-0.0044403076171875,
-0.01396942138671875
]
] |
jinaai/jina-embeddings-v2-small-en | 2023-11-03T18:12:15.000Z | [
"sentence-transformers",
"pytorch",
"coreml",
"onnx",
"safetensors",
"bert",
"finetuner",
"mteb",
"feature-extraction",
"sentence-similarity",
"custom_code",
"en",
"dataset:jinaai/negation-dataset",
"arxiv:2108.12409",
"arxiv:2310.19923",
"arxiv:2307.11224",
"license:apache-2.0",
"model-index",
"region:us"
] | feature-extraction | jinaai | null | null | jinaai/jina-embeddings-v2-small-en | 60 | 30,634 | sentence-transformers | 2023-09-27T20:17:27 | ---
tags:
- finetuner
- mteb
- sentence-transformers
- feature-extraction
- sentence-similarity
datasets:
- jinaai/negation-dataset
language: en
inference: false
license: apache-2.0
model-index:
- name: jina-embedding-s-en-v2
results:
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (en)
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 71.35820895522387
- type: ap
value: 33.99931933598115
- type: f1
value: 65.3853685535555
- task:
type: Classification
dataset:
type: mteb/amazon_polarity
name: MTEB AmazonPolarityClassification
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 82.90140000000001
- type: ap
value: 78.01434597815617
- type: f1
value: 82.83357802722676
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (en)
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 40.88999999999999
- type: f1
value: 39.209432767163456
- task:
type: Retrieval
dataset:
type: arguana
name: MTEB ArguAna
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.257
- type: map_at_10
value: 37.946000000000005
- type: map_at_100
value: 39.17
- type: map_at_1000
value: 39.181
- type: map_at_3
value: 32.99
- type: map_at_5
value: 35.467999999999996
- type: mrr_at_1
value: 23.541999999999998
- type: mrr_at_10
value: 38.057
- type: mrr_at_100
value: 39.289
- type: mrr_at_1000
value: 39.299
- type: mrr_at_3
value: 33.096
- type: mrr_at_5
value: 35.628
- type: ndcg_at_1
value: 23.257
- type: ndcg_at_10
value: 46.729
- type: ndcg_at_100
value: 51.900999999999996
- type: ndcg_at_1000
value: 52.16
- type: ndcg_at_3
value: 36.323
- type: ndcg_at_5
value: 40.766999999999996
- type: precision_at_1
value: 23.257
- type: precision_at_10
value: 7.510999999999999
- type: precision_at_100
value: 0.976
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 15.339
- type: precision_at_5
value: 11.350999999999999
- type: recall_at_1
value: 23.257
- type: recall_at_10
value: 75.107
- type: recall_at_100
value: 97.58200000000001
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 46.017
- type: recall_at_5
value: 56.757000000000005
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-p2p
name: MTEB ArxivClusteringP2P
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 44.02420878391967
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-s2s
name: MTEB ArxivClusteringS2S
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 35.16136856000258
- task:
type: Reranking
dataset:
type: mteb/askubuntudupquestions-reranking
name: MTEB AskUbuntuDupQuestions
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 59.61809790513646
- type: mrr
value: 73.07215406938397
- task:
type: STS
dataset:
type: mteb/biosses-sts
name: MTEB BIOSSES
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 82.0167350090749
- type: cos_sim_spearman
value: 80.51569002630401
- type: euclidean_pearson
value: 81.46820525099726
- type: euclidean_spearman
value: 80.51569002630401
- type: manhattan_pearson
value: 81.35596555056757
- type: manhattan_spearman
value: 80.12592210903303
- task:
type: Classification
dataset:
type: mteb/banking77
name: MTEB Banking77Classification
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 78.25
- type: f1
value: 77.34950913540605
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-p2p
name: MTEB BiorxivClusteringP2P
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 35.57238596005698
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-s2s
name: MTEB BiorxivClusteringS2S
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 29.066444306196683
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackAndroidRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 31.891000000000002
- type: map_at_10
value: 42.772
- type: map_at_100
value: 44.108999999999995
- type: map_at_1000
value: 44.236
- type: map_at_3
value: 39.289
- type: map_at_5
value: 41.113
- type: mrr_at_1
value: 39.342
- type: mrr_at_10
value: 48.852000000000004
- type: mrr_at_100
value: 49.534
- type: mrr_at_1000
value: 49.582
- type: mrr_at_3
value: 46.089999999999996
- type: mrr_at_5
value: 47.685
- type: ndcg_at_1
value: 39.342
- type: ndcg_at_10
value: 48.988
- type: ndcg_at_100
value: 53.854
- type: ndcg_at_1000
value: 55.955
- type: ndcg_at_3
value: 43.877
- type: ndcg_at_5
value: 46.027
- type: precision_at_1
value: 39.342
- type: precision_at_10
value: 9.285
- type: precision_at_100
value: 1.488
- type: precision_at_1000
value: 0.194
- type: precision_at_3
value: 20.696
- type: precision_at_5
value: 14.878
- type: recall_at_1
value: 31.891000000000002
- type: recall_at_10
value: 60.608
- type: recall_at_100
value: 81.025
- type: recall_at_1000
value: 94.883
- type: recall_at_3
value: 45.694
- type: recall_at_5
value: 51.684
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackEnglishRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.778
- type: map_at_10
value: 37.632
- type: map_at_100
value: 38.800000000000004
- type: map_at_1000
value: 38.934999999999995
- type: map_at_3
value: 35.293
- type: map_at_5
value: 36.547000000000004
- type: mrr_at_1
value: 35.35
- type: mrr_at_10
value: 42.936
- type: mrr_at_100
value: 43.69
- type: mrr_at_1000
value: 43.739
- type: mrr_at_3
value: 41.062
- type: mrr_at_5
value: 42.097
- type: ndcg_at_1
value: 35.35
- type: ndcg_at_10
value: 42.528
- type: ndcg_at_100
value: 46.983000000000004
- type: ndcg_at_1000
value: 49.187999999999995
- type: ndcg_at_3
value: 39.271
- type: ndcg_at_5
value: 40.654
- type: precision_at_1
value: 35.35
- type: precision_at_10
value: 7.828
- type: precision_at_100
value: 1.3010000000000002
- type: precision_at_1000
value: 0.17700000000000002
- type: precision_at_3
value: 18.96
- type: precision_at_5
value: 13.120999999999999
- type: recall_at_1
value: 28.778
- type: recall_at_10
value: 50.775000000000006
- type: recall_at_100
value: 69.66799999999999
- type: recall_at_1000
value: 83.638
- type: recall_at_3
value: 40.757
- type: recall_at_5
value: 44.86
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGamingRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 37.584
- type: map_at_10
value: 49.69
- type: map_at_100
value: 50.639
- type: map_at_1000
value: 50.702999999999996
- type: map_at_3
value: 46.61
- type: map_at_5
value: 48.486000000000004
- type: mrr_at_1
value: 43.009
- type: mrr_at_10
value: 52.949999999999996
- type: mrr_at_100
value: 53.618
- type: mrr_at_1000
value: 53.65299999999999
- type: mrr_at_3
value: 50.605999999999995
- type: mrr_at_5
value: 52.095
- type: ndcg_at_1
value: 43.009
- type: ndcg_at_10
value: 55.278000000000006
- type: ndcg_at_100
value: 59.134
- type: ndcg_at_1000
value: 60.528999999999996
- type: ndcg_at_3
value: 50.184
- type: ndcg_at_5
value: 52.919000000000004
- type: precision_at_1
value: 43.009
- type: precision_at_10
value: 8.821
- type: precision_at_100
value: 1.161
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 22.424
- type: precision_at_5
value: 15.436
- type: recall_at_1
value: 37.584
- type: recall_at_10
value: 68.514
- type: recall_at_100
value: 85.099
- type: recall_at_1000
value: 95.123
- type: recall_at_3
value: 55.007
- type: recall_at_5
value: 61.714999999999996
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGisRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.7
- type: map_at_10
value: 32.804
- type: map_at_100
value: 33.738
- type: map_at_1000
value: 33.825
- type: map_at_3
value: 30.639
- type: map_at_5
value: 31.781
- type: mrr_at_1
value: 26.328000000000003
- type: mrr_at_10
value: 34.679
- type: mrr_at_100
value: 35.510000000000005
- type: mrr_at_1000
value: 35.577999999999996
- type: mrr_at_3
value: 32.58
- type: mrr_at_5
value: 33.687
- type: ndcg_at_1
value: 26.328000000000003
- type: ndcg_at_10
value: 37.313
- type: ndcg_at_100
value: 42.004000000000005
- type: ndcg_at_1000
value: 44.232
- type: ndcg_at_3
value: 33.076
- type: ndcg_at_5
value: 34.966
- type: precision_at_1
value: 26.328000000000003
- type: precision_at_10
value: 5.627
- type: precision_at_100
value: 0.8410000000000001
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 14.011000000000001
- type: precision_at_5
value: 9.582
- type: recall_at_1
value: 24.7
- type: recall_at_10
value: 49.324
- type: recall_at_100
value: 71.018
- type: recall_at_1000
value: 87.905
- type: recall_at_3
value: 37.7
- type: recall_at_5
value: 42.281
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackMathematicaRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 14.350999999999999
- type: map_at_10
value: 21.745
- type: map_at_100
value: 22.731
- type: map_at_1000
value: 22.852
- type: map_at_3
value: 19.245
- type: map_at_5
value: 20.788
- type: mrr_at_1
value: 18.159
- type: mrr_at_10
value: 25.833000000000002
- type: mrr_at_100
value: 26.728
- type: mrr_at_1000
value: 26.802
- type: mrr_at_3
value: 23.383000000000003
- type: mrr_at_5
value: 24.887999999999998
- type: ndcg_at_1
value: 18.159
- type: ndcg_at_10
value: 26.518000000000004
- type: ndcg_at_100
value: 31.473000000000003
- type: ndcg_at_1000
value: 34.576
- type: ndcg_at_3
value: 21.907
- type: ndcg_at_5
value: 24.39
- type: precision_at_1
value: 18.159
- type: precision_at_10
value: 4.938
- type: precision_at_100
value: 0.853
- type: precision_at_1000
value: 0.125
- type: precision_at_3
value: 10.655000000000001
- type: precision_at_5
value: 7.985
- type: recall_at_1
value: 14.350999999999999
- type: recall_at_10
value: 37.284
- type: recall_at_100
value: 59.11300000000001
- type: recall_at_1000
value: 81.634
- type: recall_at_3
value: 24.753
- type: recall_at_5
value: 30.979
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackPhysicsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 26.978
- type: map_at_10
value: 36.276
- type: map_at_100
value: 37.547000000000004
- type: map_at_1000
value: 37.678
- type: map_at_3
value: 33.674
- type: map_at_5
value: 35.119
- type: mrr_at_1
value: 32.916000000000004
- type: mrr_at_10
value: 41.798
- type: mrr_at_100
value: 42.72
- type: mrr_at_1000
value: 42.778
- type: mrr_at_3
value: 39.493
- type: mrr_at_5
value: 40.927
- type: ndcg_at_1
value: 32.916000000000004
- type: ndcg_at_10
value: 41.81
- type: ndcg_at_100
value: 47.284
- type: ndcg_at_1000
value: 49.702
- type: ndcg_at_3
value: 37.486999999999995
- type: ndcg_at_5
value: 39.597
- type: precision_at_1
value: 32.916000000000004
- type: precision_at_10
value: 7.411
- type: precision_at_100
value: 1.189
- type: precision_at_1000
value: 0.158
- type: precision_at_3
value: 17.581
- type: precision_at_5
value: 12.397
- type: recall_at_1
value: 26.978
- type: recall_at_10
value: 52.869
- type: recall_at_100
value: 75.78399999999999
- type: recall_at_1000
value: 91.545
- type: recall_at_3
value: 40.717
- type: recall_at_5
value: 46.168
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackProgrammersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.641
- type: map_at_10
value: 32.916000000000004
- type: map_at_100
value: 34.165
- type: map_at_1000
value: 34.286
- type: map_at_3
value: 30.335
- type: map_at_5
value: 31.569000000000003
- type: mrr_at_1
value: 30.593999999999998
- type: mrr_at_10
value: 38.448
- type: mrr_at_100
value: 39.299
- type: mrr_at_1000
value: 39.362
- type: mrr_at_3
value: 36.244
- type: mrr_at_5
value: 37.232
- type: ndcg_at_1
value: 30.593999999999998
- type: ndcg_at_10
value: 38.2
- type: ndcg_at_100
value: 43.742
- type: ndcg_at_1000
value: 46.217000000000006
- type: ndcg_at_3
value: 33.925
- type: ndcg_at_5
value: 35.394
- type: precision_at_1
value: 30.593999999999998
- type: precision_at_10
value: 6.895
- type: precision_at_100
value: 1.1320000000000001
- type: precision_at_1000
value: 0.153
- type: precision_at_3
value: 16.096
- type: precision_at_5
value: 11.05
- type: recall_at_1
value: 24.641
- type: recall_at_10
value: 48.588
- type: recall_at_100
value: 72.841
- type: recall_at_1000
value: 89.535
- type: recall_at_3
value: 36.087
- type: recall_at_5
value: 40.346
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.79425
- type: map_at_10
value: 33.12033333333333
- type: map_at_100
value: 34.221333333333334
- type: map_at_1000
value: 34.3435
- type: map_at_3
value: 30.636583333333338
- type: map_at_5
value: 31.974083333333326
- type: mrr_at_1
value: 29.242416666666664
- type: mrr_at_10
value: 37.11675
- type: mrr_at_100
value: 37.93783333333334
- type: mrr_at_1000
value: 38.003083333333336
- type: mrr_at_3
value: 34.904666666666664
- type: mrr_at_5
value: 36.12916666666667
- type: ndcg_at_1
value: 29.242416666666664
- type: ndcg_at_10
value: 38.03416666666667
- type: ndcg_at_100
value: 42.86674999999999
- type: ndcg_at_1000
value: 45.34550000000001
- type: ndcg_at_3
value: 33.76466666666666
- type: ndcg_at_5
value: 35.668666666666674
- type: precision_at_1
value: 29.242416666666664
- type: precision_at_10
value: 6.589833333333334
- type: precision_at_100
value: 1.0693333333333332
- type: precision_at_1000
value: 0.14641666666666667
- type: precision_at_3
value: 15.430749999999998
- type: precision_at_5
value: 10.833833333333333
- type: recall_at_1
value: 24.79425
- type: recall_at_10
value: 48.582916666666655
- type: recall_at_100
value: 69.88499999999999
- type: recall_at_1000
value: 87.211
- type: recall_at_3
value: 36.625499999999995
- type: recall_at_5
value: 41.553999999999995
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackStatsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.767
- type: map_at_10
value: 28.450999999999997
- type: map_at_100
value: 29.332
- type: map_at_1000
value: 29.426000000000002
- type: map_at_3
value: 26.379
- type: map_at_5
value: 27.584999999999997
- type: mrr_at_1
value: 25.46
- type: mrr_at_10
value: 30.974
- type: mrr_at_100
value: 31.784000000000002
- type: mrr_at_1000
value: 31.857999999999997
- type: mrr_at_3
value: 28.962
- type: mrr_at_5
value: 30.066
- type: ndcg_at_1
value: 25.46
- type: ndcg_at_10
value: 32.041
- type: ndcg_at_100
value: 36.522
- type: ndcg_at_1000
value: 39.101
- type: ndcg_at_3
value: 28.152
- type: ndcg_at_5
value: 30.03
- type: precision_at_1
value: 25.46
- type: precision_at_10
value: 4.893
- type: precision_at_100
value: 0.77
- type: precision_at_1000
value: 0.107
- type: precision_at_3
value: 11.605
- type: precision_at_5
value: 8.19
- type: recall_at_1
value: 22.767
- type: recall_at_10
value: 40.71
- type: recall_at_100
value: 61.334999999999994
- type: recall_at_1000
value: 80.567
- type: recall_at_3
value: 30.198000000000004
- type: recall_at_5
value: 34.803
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackTexRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 16.722
- type: map_at_10
value: 22.794
- type: map_at_100
value: 23.7
- type: map_at_1000
value: 23.822
- type: map_at_3
value: 20.781
- type: map_at_5
value: 22.024
- type: mrr_at_1
value: 20.061999999999998
- type: mrr_at_10
value: 26.346999999999998
- type: mrr_at_100
value: 27.153
- type: mrr_at_1000
value: 27.233
- type: mrr_at_3
value: 24.375
- type: mrr_at_5
value: 25.593
- type: ndcg_at_1
value: 20.061999999999998
- type: ndcg_at_10
value: 26.785999999999998
- type: ndcg_at_100
value: 31.319999999999997
- type: ndcg_at_1000
value: 34.346
- type: ndcg_at_3
value: 23.219
- type: ndcg_at_5
value: 25.107000000000003
- type: precision_at_1
value: 20.061999999999998
- type: precision_at_10
value: 4.78
- type: precision_at_100
value: 0.83
- type: precision_at_1000
value: 0.125
- type: precision_at_3
value: 10.874
- type: precision_at_5
value: 7.956
- type: recall_at_1
value: 16.722
- type: recall_at_10
value: 35.204
- type: recall_at_100
value: 55.797
- type: recall_at_1000
value: 77.689
- type: recall_at_3
value: 25.245
- type: recall_at_5
value: 30.115
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackUnixRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.842
- type: map_at_10
value: 32.917
- type: map_at_100
value: 33.961000000000006
- type: map_at_1000
value: 34.069
- type: map_at_3
value: 30.595
- type: map_at_5
value: 31.837
- type: mrr_at_1
value: 29.011
- type: mrr_at_10
value: 36.977
- type: mrr_at_100
value: 37.814
- type: mrr_at_1000
value: 37.885999999999996
- type: mrr_at_3
value: 34.966
- type: mrr_at_5
value: 36.043
- type: ndcg_at_1
value: 29.011
- type: ndcg_at_10
value: 37.735
- type: ndcg_at_100
value: 42.683
- type: ndcg_at_1000
value: 45.198
- type: ndcg_at_3
value: 33.650000000000006
- type: ndcg_at_5
value: 35.386
- type: precision_at_1
value: 29.011
- type: precision_at_10
value: 6.259
- type: precision_at_100
value: 0.984
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 15.329999999999998
- type: precision_at_5
value: 10.541
- type: recall_at_1
value: 24.842
- type: recall_at_10
value: 48.304
- type: recall_at_100
value: 70.04899999999999
- type: recall_at_1000
value: 87.82600000000001
- type: recall_at_3
value: 36.922
- type: recall_at_5
value: 41.449999999999996
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWebmastersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.252000000000002
- type: map_at_10
value: 32.293
- type: map_at_100
value: 33.816
- type: map_at_1000
value: 34.053
- type: map_at_3
value: 29.781999999999996
- type: map_at_5
value: 31.008000000000003
- type: mrr_at_1
value: 29.051
- type: mrr_at_10
value: 36.722
- type: mrr_at_100
value: 37.663000000000004
- type: mrr_at_1000
value: 37.734
- type: mrr_at_3
value: 34.354
- type: mrr_at_5
value: 35.609
- type: ndcg_at_1
value: 29.051
- type: ndcg_at_10
value: 37.775999999999996
- type: ndcg_at_100
value: 43.221
- type: ndcg_at_1000
value: 46.116
- type: ndcg_at_3
value: 33.403
- type: ndcg_at_5
value: 35.118
- type: precision_at_1
value: 29.051
- type: precision_at_10
value: 7.332
- type: precision_at_100
value: 1.49
- type: precision_at_1000
value: 0.23600000000000002
- type: precision_at_3
value: 15.415000000000001
- type: precision_at_5
value: 11.107
- type: recall_at_1
value: 24.252000000000002
- type: recall_at_10
value: 47.861
- type: recall_at_100
value: 72.21600000000001
- type: recall_at_1000
value: 90.886
- type: recall_at_3
value: 35.533
- type: recall_at_5
value: 39.959
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWordpressRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 20.025000000000002
- type: map_at_10
value: 27.154
- type: map_at_100
value: 28.118
- type: map_at_1000
value: 28.237000000000002
- type: map_at_3
value: 25.017
- type: map_at_5
value: 25.832
- type: mrr_at_1
value: 21.627
- type: mrr_at_10
value: 28.884999999999998
- type: mrr_at_100
value: 29.741
- type: mrr_at_1000
value: 29.831999999999997
- type: mrr_at_3
value: 26.741
- type: mrr_at_5
value: 27.628000000000004
- type: ndcg_at_1
value: 21.627
- type: ndcg_at_10
value: 31.436999999999998
- type: ndcg_at_100
value: 36.181000000000004
- type: ndcg_at_1000
value: 38.986
- type: ndcg_at_3
value: 27.025
- type: ndcg_at_5
value: 28.436
- type: precision_at_1
value: 21.627
- type: precision_at_10
value: 5.009
- type: precision_at_100
value: 0.7929999999999999
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 11.522
- type: precision_at_5
value: 7.763000000000001
- type: recall_at_1
value: 20.025000000000002
- type: recall_at_10
value: 42.954
- type: recall_at_100
value: 64.67500000000001
- type: recall_at_1000
value: 85.301
- type: recall_at_3
value: 30.892999999999997
- type: recall_at_5
value: 34.288000000000004
- task:
type: Retrieval
dataset:
type: climate-fever
name: MTEB ClimateFEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 10.079
- type: map_at_10
value: 16.930999999999997
- type: map_at_100
value: 18.398999999999997
- type: map_at_1000
value: 18.561
- type: map_at_3
value: 14.294
- type: map_at_5
value: 15.579
- type: mrr_at_1
value: 22.606
- type: mrr_at_10
value: 32.513
- type: mrr_at_100
value: 33.463
- type: mrr_at_1000
value: 33.513999999999996
- type: mrr_at_3
value: 29.479
- type: mrr_at_5
value: 31.3
- type: ndcg_at_1
value: 22.606
- type: ndcg_at_10
value: 24.053
- type: ndcg_at_100
value: 30.258000000000003
- type: ndcg_at_1000
value: 33.516
- type: ndcg_at_3
value: 19.721
- type: ndcg_at_5
value: 21.144
- type: precision_at_1
value: 22.606
- type: precision_at_10
value: 7.55
- type: precision_at_100
value: 1.399
- type: precision_at_1000
value: 0.2
- type: precision_at_3
value: 14.701
- type: precision_at_5
value: 11.192
- type: recall_at_1
value: 10.079
- type: recall_at_10
value: 28.970000000000002
- type: recall_at_100
value: 50.805
- type: recall_at_1000
value: 69.378
- type: recall_at_3
value: 18.199
- type: recall_at_5
value: 22.442
- task:
type: Retrieval
dataset:
type: dbpedia-entity
name: MTEB DBPedia
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 7.794
- type: map_at_10
value: 15.165999999999999
- type: map_at_100
value: 20.508000000000003
- type: map_at_1000
value: 21.809
- type: map_at_3
value: 11.568000000000001
- type: map_at_5
value: 13.059000000000001
- type: mrr_at_1
value: 56.49999999999999
- type: mrr_at_10
value: 65.90899999999999
- type: mrr_at_100
value: 66.352
- type: mrr_at_1000
value: 66.369
- type: mrr_at_3
value: 64.0
- type: mrr_at_5
value: 65.10000000000001
- type: ndcg_at_1
value: 44.25
- type: ndcg_at_10
value: 32.649
- type: ndcg_at_100
value: 36.668
- type: ndcg_at_1000
value: 43.918
- type: ndcg_at_3
value: 37.096000000000004
- type: ndcg_at_5
value: 34.048
- type: precision_at_1
value: 56.49999999999999
- type: precision_at_10
value: 25.45
- type: precision_at_100
value: 8.055
- type: precision_at_1000
value: 1.7489999999999999
- type: precision_at_3
value: 41.0
- type: precision_at_5
value: 32.85
- type: recall_at_1
value: 7.794
- type: recall_at_10
value: 20.101
- type: recall_at_100
value: 42.448
- type: recall_at_1000
value: 65.88000000000001
- type: recall_at_3
value: 12.753
- type: recall_at_5
value: 15.307
- task:
type: Classification
dataset:
type: mteb/emotion
name: MTEB EmotionClassification
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 44.01
- type: f1
value: 38.659680951114964
- task:
type: Retrieval
dataset:
type: fever
name: MTEB FEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 49.713
- type: map_at_10
value: 61.79
- type: map_at_100
value: 62.28
- type: map_at_1000
value: 62.297000000000004
- type: map_at_3
value: 59.361
- type: map_at_5
value: 60.92100000000001
- type: mrr_at_1
value: 53.405
- type: mrr_at_10
value: 65.79899999999999
- type: mrr_at_100
value: 66.219
- type: mrr_at_1000
value: 66.227
- type: mrr_at_3
value: 63.431000000000004
- type: mrr_at_5
value: 64.98
- type: ndcg_at_1
value: 53.405
- type: ndcg_at_10
value: 68.01899999999999
- type: ndcg_at_100
value: 70.197
- type: ndcg_at_1000
value: 70.571
- type: ndcg_at_3
value: 63.352
- type: ndcg_at_5
value: 66.018
- type: precision_at_1
value: 53.405
- type: precision_at_10
value: 9.119
- type: precision_at_100
value: 1.03
- type: precision_at_1000
value: 0.107
- type: precision_at_3
value: 25.602999999999998
- type: precision_at_5
value: 16.835
- type: recall_at_1
value: 49.713
- type: recall_at_10
value: 83.306
- type: recall_at_100
value: 92.92
- type: recall_at_1000
value: 95.577
- type: recall_at_3
value: 70.798
- type: recall_at_5
value: 77.254
- task:
type: Retrieval
dataset:
type: fiqa
name: MTEB FiQA2018
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 15.310000000000002
- type: map_at_10
value: 26.204
- type: map_at_100
value: 27.932000000000002
- type: map_at_1000
value: 28.121000000000002
- type: map_at_3
value: 22.481
- type: map_at_5
value: 24.678
- type: mrr_at_1
value: 29.784
- type: mrr_at_10
value: 39.582
- type: mrr_at_100
value: 40.52
- type: mrr_at_1000
value: 40.568
- type: mrr_at_3
value: 37.114000000000004
- type: mrr_at_5
value: 38.596000000000004
- type: ndcg_at_1
value: 29.784
- type: ndcg_at_10
value: 33.432
- type: ndcg_at_100
value: 40.281
- type: ndcg_at_1000
value: 43.653999999999996
- type: ndcg_at_3
value: 29.612
- type: ndcg_at_5
value: 31.223
- type: precision_at_1
value: 29.784
- type: precision_at_10
value: 9.645
- type: precision_at_100
value: 1.645
- type: precision_at_1000
value: 0.22499999999999998
- type: precision_at_3
value: 20.165
- type: precision_at_5
value: 15.401000000000002
- type: recall_at_1
value: 15.310000000000002
- type: recall_at_10
value: 40.499
- type: recall_at_100
value: 66.643
- type: recall_at_1000
value: 87.059
- type: recall_at_3
value: 27.492
- type: recall_at_5
value: 33.748
- task:
type: Retrieval
dataset:
type: hotpotqa
name: MTEB HotpotQA
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 33.599000000000004
- type: map_at_10
value: 47.347
- type: map_at_100
value: 48.191
- type: map_at_1000
value: 48.263
- type: map_at_3
value: 44.698
- type: map_at_5
value: 46.278999999999996
- type: mrr_at_1
value: 67.19800000000001
- type: mrr_at_10
value: 74.054
- type: mrr_at_100
value: 74.376
- type: mrr_at_1000
value: 74.392
- type: mrr_at_3
value: 72.849
- type: mrr_at_5
value: 73.643
- type: ndcg_at_1
value: 67.19800000000001
- type: ndcg_at_10
value: 56.482
- type: ndcg_at_100
value: 59.694
- type: ndcg_at_1000
value: 61.204
- type: ndcg_at_3
value: 52.43299999999999
- type: ndcg_at_5
value: 54.608000000000004
- type: precision_at_1
value: 67.19800000000001
- type: precision_at_10
value: 11.613999999999999
- type: precision_at_100
value: 1.415
- type: precision_at_1000
value: 0.16199999999999998
- type: precision_at_3
value: 32.726
- type: precision_at_5
value: 21.349999999999998
- type: recall_at_1
value: 33.599000000000004
- type: recall_at_10
value: 58.069
- type: recall_at_100
value: 70.736
- type: recall_at_1000
value: 80.804
- type: recall_at_3
value: 49.088
- type: recall_at_5
value: 53.376000000000005
- task:
type: Classification
dataset:
type: mteb/imdb
name: MTEB ImdbClassification
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 73.64359999999999
- type: ap
value: 67.54685976014599
- type: f1
value: 73.55148707559482
- task:
type: Retrieval
dataset:
type: msmarco
name: MTEB MSMARCO
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 19.502
- type: map_at_10
value: 30.816
- type: map_at_100
value: 32.007999999999996
- type: map_at_1000
value: 32.067
- type: map_at_3
value: 27.215
- type: map_at_5
value: 29.304000000000002
- type: mrr_at_1
value: 20.072000000000003
- type: mrr_at_10
value: 31.406
- type: mrr_at_100
value: 32.549
- type: mrr_at_1000
value: 32.602
- type: mrr_at_3
value: 27.839000000000002
- type: mrr_at_5
value: 29.926000000000002
- type: ndcg_at_1
value: 20.086000000000002
- type: ndcg_at_10
value: 37.282
- type: ndcg_at_100
value: 43.206
- type: ndcg_at_1000
value: 44.690000000000005
- type: ndcg_at_3
value: 29.932
- type: ndcg_at_5
value: 33.668
- type: precision_at_1
value: 20.086000000000002
- type: precision_at_10
value: 5.961
- type: precision_at_100
value: 0.898
- type: precision_at_1000
value: 0.10200000000000001
- type: precision_at_3
value: 12.856000000000002
- type: precision_at_5
value: 9.596
- type: recall_at_1
value: 19.502
- type: recall_at_10
value: 57.182
- type: recall_at_100
value: 84.952
- type: recall_at_1000
value: 96.34700000000001
- type: recall_at_3
value: 37.193
- type: recall_at_5
value: 46.157
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (en)
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.96488828089375
- type: f1
value: 93.32119260543482
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (en)
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 72.4965800273598
- type: f1
value: 49.34896217536082
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (en)
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 67.60928043039678
- type: f1
value: 64.34244712074538
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (en)
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 69.75453934095493
- type: f1
value: 68.39224867489249
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-p2p
name: MTEB MedrxivClusteringP2P
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 31.862573504920082
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-s2s
name: MTEB MedrxivClusteringS2S
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 27.511123551196803
- task:
type: Reranking
dataset:
type: mteb/mind_small
name: MTEB MindSmallReranking
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.99145104942086
- type: mrr
value: 32.03606480418627
- task:
type: Retrieval
dataset:
type: nfcorpus
name: MTEB NFCorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.015
- type: map_at_10
value: 11.054
- type: map_at_100
value: 13.773
- type: map_at_1000
value: 15.082999999999998
- type: map_at_3
value: 8.253
- type: map_at_5
value: 9.508999999999999
- type: mrr_at_1
value: 42.105
- type: mrr_at_10
value: 50.44499999999999
- type: mrr_at_100
value: 51.080000000000005
- type: mrr_at_1000
value: 51.129999999999995
- type: mrr_at_3
value: 48.555
- type: mrr_at_5
value: 49.84
- type: ndcg_at_1
value: 40.402
- type: ndcg_at_10
value: 30.403000000000002
- type: ndcg_at_100
value: 28.216
- type: ndcg_at_1000
value: 37.021
- type: ndcg_at_3
value: 35.53
- type: ndcg_at_5
value: 33.202999999999996
- type: precision_at_1
value: 42.105
- type: precision_at_10
value: 22.353
- type: precision_at_100
value: 7.266
- type: precision_at_1000
value: 2.011
- type: precision_at_3
value: 32.921
- type: precision_at_5
value: 28.297
- type: recall_at_1
value: 5.015
- type: recall_at_10
value: 14.393
- type: recall_at_100
value: 28.893
- type: recall_at_1000
value: 60.18
- type: recall_at_3
value: 9.184000000000001
- type: recall_at_5
value: 11.39
- task:
type: Retrieval
dataset:
type: nq
name: MTEB NQ
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.524
- type: map_at_10
value: 44.182
- type: map_at_100
value: 45.228
- type: map_at_1000
value: 45.265
- type: map_at_3
value: 39.978
- type: map_at_5
value: 42.482
- type: mrr_at_1
value: 33.256
- type: mrr_at_10
value: 46.661
- type: mrr_at_100
value: 47.47
- type: mrr_at_1000
value: 47.496
- type: mrr_at_3
value: 43.187999999999995
- type: mrr_at_5
value: 45.330999999999996
- type: ndcg_at_1
value: 33.227000000000004
- type: ndcg_at_10
value: 51.589
- type: ndcg_at_100
value: 56.043
- type: ndcg_at_1000
value: 56.937000000000005
- type: ndcg_at_3
value: 43.751
- type: ndcg_at_5
value: 47.937000000000005
- type: precision_at_1
value: 33.227000000000004
- type: precision_at_10
value: 8.556999999999999
- type: precision_at_100
value: 1.103
- type: precision_at_1000
value: 0.11900000000000001
- type: precision_at_3
value: 19.921
- type: precision_at_5
value: 14.396999999999998
- type: recall_at_1
value: 29.524
- type: recall_at_10
value: 71.615
- type: recall_at_100
value: 91.056
- type: recall_at_1000
value: 97.72800000000001
- type: recall_at_3
value: 51.451
- type: recall_at_5
value: 61.119
- task:
type: Retrieval
dataset:
type: quora
name: MTEB QuoraRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 69.596
- type: map_at_10
value: 83.281
- type: map_at_100
value: 83.952
- type: map_at_1000
value: 83.97200000000001
- type: map_at_3
value: 80.315
- type: map_at_5
value: 82.223
- type: mrr_at_1
value: 80.17
- type: mrr_at_10
value: 86.522
- type: mrr_at_100
value: 86.644
- type: mrr_at_1000
value: 86.64500000000001
- type: mrr_at_3
value: 85.438
- type: mrr_at_5
value: 86.21799999999999
- type: ndcg_at_1
value: 80.19
- type: ndcg_at_10
value: 87.19
- type: ndcg_at_100
value: 88.567
- type: ndcg_at_1000
value: 88.70400000000001
- type: ndcg_at_3
value: 84.17999999999999
- type: ndcg_at_5
value: 85.931
- type: precision_at_1
value: 80.19
- type: precision_at_10
value: 13.209000000000001
- type: precision_at_100
value: 1.518
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 36.717
- type: precision_at_5
value: 24.248
- type: recall_at_1
value: 69.596
- type: recall_at_10
value: 94.533
- type: recall_at_100
value: 99.322
- type: recall_at_1000
value: 99.965
- type: recall_at_3
value: 85.911
- type: recall_at_5
value: 90.809
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering
name: MTEB RedditClustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 49.27650627571912
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering-p2p
name: MTEB RedditClusteringP2P
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 57.08550946534183
- task:
type: Retrieval
dataset:
type: scidocs
name: MTEB SCIDOCS
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.568
- type: map_at_10
value: 10.862
- type: map_at_100
value: 12.757
- type: map_at_1000
value: 13.031
- type: map_at_3
value: 7.960000000000001
- type: map_at_5
value: 9.337
- type: mrr_at_1
value: 22.5
- type: mrr_at_10
value: 32.6
- type: mrr_at_100
value: 33.603
- type: mrr_at_1000
value: 33.672000000000004
- type: mrr_at_3
value: 29.299999999999997
- type: mrr_at_5
value: 31.25
- type: ndcg_at_1
value: 22.5
- type: ndcg_at_10
value: 18.605
- type: ndcg_at_100
value: 26.029999999999998
- type: ndcg_at_1000
value: 31.256
- type: ndcg_at_3
value: 17.873
- type: ndcg_at_5
value: 15.511
- type: precision_at_1
value: 22.5
- type: precision_at_10
value: 9.58
- type: precision_at_100
value: 2.033
- type: precision_at_1000
value: 0.33
- type: precision_at_3
value: 16.633
- type: precision_at_5
value: 13.54
- type: recall_at_1
value: 4.568
- type: recall_at_10
value: 19.402
- type: recall_at_100
value: 41.277
- type: recall_at_1000
value: 66.963
- type: recall_at_3
value: 10.112
- type: recall_at_5
value: 13.712
- task:
type: STS
dataset:
type: mteb/sickr-sts
name: MTEB SICK-R
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 83.31992291680787
- type: cos_sim_spearman
value: 76.7212346922664
- type: euclidean_pearson
value: 80.42189271706478
- type: euclidean_spearman
value: 76.7212342532493
- type: manhattan_pearson
value: 80.33171093031578
- type: manhattan_spearman
value: 76.63192883074694
- task:
type: STS
dataset:
type: mteb/sts12-sts
name: MTEB STS12
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 83.16654278886763
- type: cos_sim_spearman
value: 73.66390263429565
- type: euclidean_pearson
value: 79.7485360086639
- type: euclidean_spearman
value: 73.66389870373436
- type: manhattan_pearson
value: 79.73652237443706
- type: manhattan_spearman
value: 73.65296117151647
- task:
type: STS
dataset:
type: mteb/sts13-sts
name: MTEB STS13
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 82.40389689929246
- type: cos_sim_spearman
value: 83.29727595993955
- type: euclidean_pearson
value: 82.23970587854079
- type: euclidean_spearman
value: 83.29727595993955
- type: manhattan_pearson
value: 82.18823600831897
- type: manhattan_spearman
value: 83.20746192209594
- task:
type: STS
dataset:
type: mteb/sts14-sts
name: MTEB STS14
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 81.73505246913413
- type: cos_sim_spearman
value: 79.1686548248754
- type: euclidean_pearson
value: 80.48889135993412
- type: euclidean_spearman
value: 79.16864112930354
- type: manhattan_pearson
value: 80.40720651057302
- type: manhattan_spearman
value: 79.0640155089286
- task:
type: STS
dataset:
type: mteb/sts15-sts
name: MTEB STS15
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.3953512879065
- type: cos_sim_spearman
value: 87.29947322714338
- type: euclidean_pearson
value: 86.59759438529645
- type: euclidean_spearman
value: 87.29947511092824
- type: manhattan_pearson
value: 86.52097806169155
- type: manhattan_spearman
value: 87.22987242146534
- task:
type: STS
dataset:
type: mteb/sts16-sts
name: MTEB STS16
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 82.48565753792056
- type: cos_sim_spearman
value: 83.6049720319893
- type: euclidean_pearson
value: 82.56452023172913
- type: euclidean_spearman
value: 83.60490168191697
- type: manhattan_pearson
value: 82.58079941137872
- type: manhattan_spearman
value: 83.60975807374051
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-en)
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 88.18239976618212
- type: cos_sim_spearman
value: 88.23061724730616
- type: euclidean_pearson
value: 87.78482472776658
- type: euclidean_spearman
value: 88.23061724730616
- type: manhattan_pearson
value: 87.75059641730239
- type: manhattan_spearman
value: 88.22527413524622
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (en)
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 63.42816418706765
- type: cos_sim_spearman
value: 63.4569864520124
- type: euclidean_pearson
value: 64.35405409953853
- type: euclidean_spearman
value: 63.4569864520124
- type: manhattan_pearson
value: 63.96649236073056
- type: manhattan_spearman
value: 63.01448583722708
- task:
type: STS
dataset:
type: mteb/stsbenchmark-sts
name: MTEB STSBenchmark
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 83.41659638047614
- type: cos_sim_spearman
value: 84.03893866106175
- type: euclidean_pearson
value: 84.2251203953798
- type: euclidean_spearman
value: 84.03893866106175
- type: manhattan_pearson
value: 84.22733643205514
- type: manhattan_spearman
value: 84.06504411263612
- task:
type: Reranking
dataset:
type: mteb/scidocs-reranking
name: MTEB SciDocsRR
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 79.75608022582414
- type: mrr
value: 94.0947732369301
- task:
type: Retrieval
dataset:
type: scifact
name: MTEB SciFact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 50.161
- type: map_at_10
value: 59.458999999999996
- type: map_at_100
value: 60.156
- type: map_at_1000
value: 60.194
- type: map_at_3
value: 56.45400000000001
- type: map_at_5
value: 58.165
- type: mrr_at_1
value: 53.333
- type: mrr_at_10
value: 61.050000000000004
- type: mrr_at_100
value: 61.586
- type: mrr_at_1000
value: 61.624
- type: mrr_at_3
value: 58.889
- type: mrr_at_5
value: 60.122
- type: ndcg_at_1
value: 53.333
- type: ndcg_at_10
value: 63.888999999999996
- type: ndcg_at_100
value: 66.963
- type: ndcg_at_1000
value: 68.062
- type: ndcg_at_3
value: 59.01
- type: ndcg_at_5
value: 61.373999999999995
- type: precision_at_1
value: 53.333
- type: precision_at_10
value: 8.633000000000001
- type: precision_at_100
value: 1.027
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 23.111
- type: precision_at_5
value: 15.467
- type: recall_at_1
value: 50.161
- type: recall_at_10
value: 75.922
- type: recall_at_100
value: 90.0
- type: recall_at_1000
value: 98.667
- type: recall_at_3
value: 62.90599999999999
- type: recall_at_5
value: 68.828
- task:
type: PairClassification
dataset:
type: mteb/sprintduplicatequestions-pairclassification
name: MTEB SprintDuplicateQuestions
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.81188118811882
- type: cos_sim_ap
value: 95.11619225962413
- type: cos_sim_f1
value: 90.35840484603736
- type: cos_sim_precision
value: 91.23343527013252
- type: cos_sim_recall
value: 89.5
- type: dot_accuracy
value: 99.81188118811882
- type: dot_ap
value: 95.11619225962413
- type: dot_f1
value: 90.35840484603736
- type: dot_precision
value: 91.23343527013252
- type: dot_recall
value: 89.5
- type: euclidean_accuracy
value: 99.81188118811882
- type: euclidean_ap
value: 95.11619225962413
- type: euclidean_f1
value: 90.35840484603736
- type: euclidean_precision
value: 91.23343527013252
- type: euclidean_recall
value: 89.5
- type: manhattan_accuracy
value: 99.80891089108911
- type: manhattan_ap
value: 95.07294266220966
- type: manhattan_f1
value: 90.21794221996959
- type: manhattan_precision
value: 91.46968139773895
- type: manhattan_recall
value: 89.0
- type: max_accuracy
value: 99.81188118811882
- type: max_ap
value: 95.11619225962413
- type: max_f1
value: 90.35840484603736
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering
name: MTEB StackExchangeClustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 55.3481874105239
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering-p2p
name: MTEB StackExchangeClusteringP2P
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 34.421291695525
- task:
type: Reranking
dataset:
type: mteb/stackoverflowdupquestions-reranking
name: MTEB StackOverflowDupQuestions
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.98746633276634
- type: mrr
value: 50.63143249724133
- task:
type: Summarization
dataset:
type: mteb/summeval
name: MTEB SummEval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.009961979844036
- type: cos_sim_spearman
value: 30.558416108881044
- type: dot_pearson
value: 31.009964941134253
- type: dot_spearman
value: 30.545760761761393
- task:
type: Retrieval
dataset:
type: trec-covid
name: MTEB TRECCOVID
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.207
- type: map_at_10
value: 1.6
- type: map_at_100
value: 8.594
- type: map_at_1000
value: 20.213
- type: map_at_3
value: 0.585
- type: map_at_5
value: 0.9039999999999999
- type: mrr_at_1
value: 78.0
- type: mrr_at_10
value: 87.4
- type: mrr_at_100
value: 87.4
- type: mrr_at_1000
value: 87.4
- type: mrr_at_3
value: 86.667
- type: mrr_at_5
value: 87.06700000000001
- type: ndcg_at_1
value: 73.0
- type: ndcg_at_10
value: 65.18
- type: ndcg_at_100
value: 49.631
- type: ndcg_at_1000
value: 43.498999999999995
- type: ndcg_at_3
value: 71.83800000000001
- type: ndcg_at_5
value: 69.271
- type: precision_at_1
value: 78.0
- type: precision_at_10
value: 69.19999999999999
- type: precision_at_100
value: 50.980000000000004
- type: precision_at_1000
value: 19.426
- type: precision_at_3
value: 77.333
- type: precision_at_5
value: 74.0
- type: recall_at_1
value: 0.207
- type: recall_at_10
value: 1.822
- type: recall_at_100
value: 11.849
- type: recall_at_1000
value: 40.492
- type: recall_at_3
value: 0.622
- type: recall_at_5
value: 0.9809999999999999
- task:
type: Retrieval
dataset:
type: webis-touche2020
name: MTEB Touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.001
- type: map_at_10
value: 10.376000000000001
- type: map_at_100
value: 16.936999999999998
- type: map_at_1000
value: 18.615000000000002
- type: map_at_3
value: 5.335999999999999
- type: map_at_5
value: 7.374
- type: mrr_at_1
value: 20.408
- type: mrr_at_10
value: 38.29
- type: mrr_at_100
value: 39.33
- type: mrr_at_1000
value: 39.347
- type: mrr_at_3
value: 32.993
- type: mrr_at_5
value: 36.973
- type: ndcg_at_1
value: 17.347
- type: ndcg_at_10
value: 23.515
- type: ndcg_at_100
value: 37.457
- type: ndcg_at_1000
value: 49.439
- type: ndcg_at_3
value: 22.762999999999998
- type: ndcg_at_5
value: 22.622
- type: precision_at_1
value: 20.408
- type: precision_at_10
value: 22.448999999999998
- type: precision_at_100
value: 8.184
- type: precision_at_1000
value: 1.608
- type: precision_at_3
value: 25.85
- type: precision_at_5
value: 25.306
- type: recall_at_1
value: 2.001
- type: recall_at_10
value: 17.422
- type: recall_at_100
value: 51.532999999999994
- type: recall_at_1000
value: 87.466
- type: recall_at_3
value: 6.861000000000001
- type: recall_at_5
value: 10.502
- task:
type: Classification
dataset:
type: mteb/toxic_conversations_50k
name: MTEB ToxicConversationsClassification
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 71.54419999999999
- type: ap
value: 14.372170450843907
- type: f1
value: 54.94420257390529
- task:
type: Classification
dataset:
type: mteb/tweet_sentiment_extraction
name: MTEB TweetSentimentExtractionClassification
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 59.402942840973395
- type: f1
value: 59.4166538875571
- task:
type: Clustering
dataset:
type: mteb/twentynewsgroups-clustering
name: MTEB TwentyNewsgroupsClustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 41.569064336457906
- task:
type: PairClassification
dataset:
type: mteb/twittersemeval2015-pairclassification
name: MTEB TwitterSemEval2015
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 85.31322644096085
- type: cos_sim_ap
value: 72.14518894837381
- type: cos_sim_f1
value: 66.67489813557229
- type: cos_sim_precision
value: 62.65954977953121
- type: cos_sim_recall
value: 71.2401055408971
- type: dot_accuracy
value: 85.31322644096085
- type: dot_ap
value: 72.14521480685293
- type: dot_f1
value: 66.67489813557229
- type: dot_precision
value: 62.65954977953121
- type: dot_recall
value: 71.2401055408971
- type: euclidean_accuracy
value: 85.31322644096085
- type: euclidean_ap
value: 72.14520820485349
- type: euclidean_f1
value: 66.67489813557229
- type: euclidean_precision
value: 62.65954977953121
- type: euclidean_recall
value: 71.2401055408971
- type: manhattan_accuracy
value: 85.21785778148656
- type: manhattan_ap
value: 72.01177147657364
- type: manhattan_f1
value: 66.62594673833374
- type: manhattan_precision
value: 62.0336669699727
- type: manhattan_recall
value: 71.95250659630607
- type: max_accuracy
value: 85.31322644096085
- type: max_ap
value: 72.14521480685293
- type: max_f1
value: 66.67489813557229
- task:
type: PairClassification
dataset:
type: mteb/twitterurlcorpus-pairclassification
name: MTEB TwitterURLCorpus
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.12756626693057
- type: cos_sim_ap
value: 86.05430786440826
- type: cos_sim_f1
value: 78.27759692216631
- type: cos_sim_precision
value: 75.33466248931929
- type: cos_sim_recall
value: 81.45980905451185
- type: dot_accuracy
value: 89.12950673341872
- type: dot_ap
value: 86.05431161145492
- type: dot_f1
value: 78.27759692216631
- type: dot_precision
value: 75.33466248931929
- type: dot_recall
value: 81.45980905451185
- type: euclidean_accuracy
value: 89.12756626693057
- type: euclidean_ap
value: 86.05431303247397
- type: euclidean_f1
value: 78.27759692216631
- type: euclidean_precision
value: 75.33466248931929
- type: euclidean_recall
value: 81.45980905451185
- type: manhattan_accuracy
value: 89.04994760740482
- type: manhattan_ap
value: 86.00860610892074
- type: manhattan_f1
value: 78.1846776005392
- type: manhattan_precision
value: 76.10438839480975
- type: manhattan_recall
value: 80.3818909762858
- type: max_accuracy
value: 89.12950673341872
- type: max_ap
value: 86.05431303247397
- type: max_f1
value: 78.27759692216631
---
<br><br>
<p align="center">
<img src="https://github.com/jina-ai/finetuner/blob/main/docs/_static/finetuner-logo-ani.svg?raw=true" alt="Finetuner logo: Finetuner helps you to create experiments in order to improve embeddings on search tasks. It accompanies you to deliver the last mile of performance-tuning for neural search applications." width="150px">
</p>
<p align="center">
<b>The text embedding set trained by <a href="https://jina.ai/"><b>Jina AI</b></a>, <a href="https://github.com/jina-ai/finetuner"><b>Finetuner</b></a> team.</b>
</p>
## Intended Usage & Model Info
`jina-embeddings-v2-small-en` is an English, monolingual **embedding model** supporting an **8192-token sequence length**.
It is based on a BERT architecture (JinaBert) that uses the symmetric bidirectional variant of [ALiBi](https://arxiv.org/abs/2108.12409) to allow longer sequence lengths (a minimal sketch of the idea follows below).
The backbone `jina-bert-v2-small-en` is pretrained on the C4 dataset.
The model is further trained on Jina AI's collection of more than 400 million sentence pairs and hard negatives.
These pairs were obtained from various domains and were carefully selected through a thorough cleaning process.
The embedding model was trained with a 512-token sequence length, but it extrapolates to an 8k sequence length (or even longer) thanks to ALiBi.
This makes our model useful for a range of use cases that involve processing long documents, including long-document retrieval, semantic textual similarity, text reranking, recommendation, and RAG/LLM-based generative search.
This model has 33 million parameters, which enables lightning-fast and memory-efficient inference while still delivering impressive performance.
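ALiBi replaces positional embeddings with a distance-based penalty added directly to the attention logits, which is what lets a model trained at 512 tokens extrapolate to much longer inputs. The sketch below illustrates how a symmetric (bidirectional) ALiBi bias can be constructed; it is not this model's actual implementation, and the helper name is our own:
```python
import torch

def symmetric_alibi_bias(seq_len: int, num_heads: int) -> torch.Tensor:
    # Head-specific slopes: the geometric sequence from the ALiBi paper
    # (assumes num_heads is a power of two).
    slopes = torch.tensor([2.0 ** (-8.0 * (h + 1) / num_heads) for h in range(num_heads)])
    positions = torch.arange(seq_len)
    distance = (positions[None, :] - positions[:, None]).abs().float()  # |i - j|, symmetric in both directions
    return -slopes[:, None, None] * distance[None, :, :]                # shape: (num_heads, seq_len, seq_len)

bias = symmetric_alibi_bias(seq_len=8, num_heads=4)
print(bias.shape)  # torch.Size([4, 8, 8])
```
Because the penalty depends only on token distance, the same bias formula applies at any sequence length, including lengths never seen during training.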
Additionally, we provide the following embedding models:
**V1 (Based on T5, 512 Seq)**
- [`jina-embedding-s-en-v1`](https://huggingface.co/jinaai/jina-embedding-s-en-v1): 35 million parameters.
- [`jina-embedding-b-en-v1`](https://huggingface.co/jinaai/jina-embedding-b-en-v1): 110 million parameters.
- [`jina-embedding-l-en-v1`](https://huggingface.co/jinaai/jina-embedding-l-en-v1): 330 million parameters.
**V2 (Based on JinaBert, 8k Seq)**
- [`jina-embeddings-v2-small-en`](https://huggingface.co/jinaai/jina-embeddings-v2-small-en): 33 million parameters **(you are here)**.
- [`jina-embeddings-v2-base-en`](https://huggingface.co/jinaai/jina-embeddings-v2-base-en): 137 million parameters.
- `jina-embeddings-v2-large-en`: 435 million parameters (releasing soon).
## Data & Parameters
For details on the training data and parameters, see the Jina Embeddings V2 [technical report](https://arxiv.org/abs/2310.19923).
## Usage
You can use Jina embedding models directly via the `transformers` package:
```python
# pip install transformers
from transformers import AutoModel
from numpy.linalg import norm

cos_sim = lambda a, b: (a @ b.T) / (norm(a) * norm(b))  # cosine similarity between two vectors

model = AutoModel.from_pretrained('jinaai/jina-embeddings-v2-small-en', trust_remote_code=True)  # trust_remote_code is needed to use the encode method
embeddings = model.encode(['How is the weather today?', 'What is the current weather like today?'])
print(cos_sim(embeddings[0], embeddings[1]))
```
If you only need to handle shorter sequences, such as 2k tokens, pass the `max_length` parameter to the `encode` function:
```python
embeddings = model.encode(
['Very long ... document'],
max_length=2048
)
```
*Alternatively, you can use Jina AI's [Embeddings platform](https://jina.ai/embeddings/) for fully-managed access to Jina Embeddings models*.
## Fine-tuning
To fine-tune the model on your own data, please consider [Finetuner](https://github.com/jina-ai/finetuner).
## Plans
The development of new bilingual models is currently underway; we will mainly target German and Spanish.
The upcoming models will be called `jina-embeddings-v2-small-de/es`.
## Contact
Join our [Discord community](https://discord.jina.ai) and chat with other community members about ideas.
## Citation
If you find Jina Embeddings useful in your research, please cite the following paper:
```
@misc{günther2023jina,
title={Jina Embeddings 2: 8192-Token General-Purpose Text Embeddings for Long Documents},
author={Michael Günther and Jackmin Ong and Isabelle Mohr and Alaeddine Abdessalem and Tanguy Abel and Mohammad Kalim Akram and Susana Guzman and Georgios Mastrapas and Saba Sturua and Bo Wang and Maximilian Werk and Nan Wang and Han Xiao},
year={2023},
eprint={2310.19923},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 68,090 | [
[
-0.030303955078125,
-0.06732177734375,
0.03546142578125,
0.029083251953125,
-0.02227783203125,
-0.036163330078125,
-0.0237884521484375,
-0.0247802734375,
0.0284423828125,
0.0105133056640625,
-0.0193023681640625,
-0.042205810546875,
-0.04510498046875,
0.01143646240234375,
-0.0164947509765625,
0.058624267578125,
-0.01285552978515625,
0.00847625732421875,
-0.0229339599609375,
-0.0113983154296875,
-0.0253753662109375,
-0.0284881591796875,
-0.041290283203125,
0.00037741661071777344,
0.030426025390625,
0.00818634033203125,
0.047515869140625,
0.0626220703125,
0.01380157470703125,
0.023651123046875,
-0.0069732666015625,
0.019805908203125,
-0.0236663818359375,
0.003765106201171875,
-0.003849029541015625,
-0.039764404296875,
-0.03802490234375,
0.0012531280517578125,
0.054168701171875,
0.027862548828125,
0.006847381591796875,
0.0190582275390625,
-0.0015573501586914062,
0.03948974609375,
-0.039764404296875,
0.02520751953125,
-0.0249481201171875,
-0.003551483154296875,
-0.0183258056640625,
0.01508331298828125,
-0.03662109375,
-0.020599365234375,
0.017547607421875,
-0.04547119140625,
-0.0103607177734375,
0.019744873046875,
0.10003662109375,
0.01427459716796875,
-0.03582763671875,
-0.01177978515625,
-0.023223876953125,
0.07525634765625,
-0.0701904296875,
0.040283203125,
0.0301666259765625,
-0.00852203369140625,
-0.0134735107421875,
-0.04986572265625,
-0.048431396484375,
-0.00847625732421875,
0.0007524490356445312,
0.01239013671875,
-0.006999969482421875,
-0.00495147705078125,
0.020172119140625,
0.017425537109375,
-0.041839599609375,
0.008575439453125,
-0.0195770263671875,
-0.000045239925384521484,
0.05963134765625,
-0.0014772415161132812,
0.022186279296875,
-0.044189453125,
-0.0220489501953125,
-0.0293731689453125,
-0.0247344970703125,
0.0004680156707763672,
0.039215087890625,
0.01166534423828125,
-0.03485107421875,
0.0379638671875,
-0.01045989990234375,
0.04522705078125,
0.00531768798828125,
0.0005927085876464844,
0.038299560546875,
-0.0298919677734375,
-0.0206756591796875,
-0.02703857421875,
0.0845947265625,
0.01465606689453125,
0.0190582275390625,
0.004360198974609375,
-0.01200103759765625,
-0.0250091552734375,
0.006526947021484375,
-0.064208984375,
-0.0015316009521484375,
0.0184326171875,
-0.048431396484375,
-0.0168304443359375,
0.01464080810546875,
-0.065185546875,
-0.0165863037109375,
-0.006610870361328125,
0.049346923828125,
-0.0478515625,
-0.007122039794921875,
0.0255584716796875,
-0.01143646240234375,
0.0244293212890625,
0.01055145263671875,
-0.05828857421875,
0.005931854248046875,
0.035797119140625,
0.06353759765625,
0.0216064453125,
-0.0238494873046875,
-0.019439697265625,
-0.009063720703125,
-0.01247406005859375,
0.032928466796875,
-0.031890869140625,
-0.00039887428283691406,
0.016632080078125,
0.007190704345703125,
-0.0167999267578125,
-0.01068115234375,
0.0394287109375,
-0.0394287109375,
0.023223876953125,
-0.0018777847290039062,
-0.051025390625,
-0.02960205078125,
0.01207733154296875,
-0.0576171875,
0.06646728515625,
0.008270263671875,
-0.05792236328125,
0.004131317138671875,
-0.039154052734375,
-0.022613525390625,
-0.03997802734375,
0.0107269287109375,
-0.0609130859375,
0.000037610530853271484,
0.0406494140625,
0.059539794921875,
-0.04254150390625,
0.01105499267578125,
-0.030426025390625,
-0.0181732177734375,
0.009735107421875,
-0.02294921875,
0.06842041015625,
0.0186920166015625,
-0.0156707763671875,
-0.0105438232421875,
-0.04779052734375,
-0.005428314208984375,
0.03948974609375,
-0.017913818359375,
-0.034515380859375,
-0.00568389892578125,
0.01444244384765625,
0.01555633544921875,
0.0416259765625,
-0.041015625,
0.0257568359375,
-0.043182373046875,
0.0311126708984375,
0.05242919921875,
0.0006704330444335938,
0.032073974609375,
-0.032684326171875,
0.028076171875,
0.00864410400390625,
0.0141754150390625,
-0.0204620361328125,
-0.04931640625,
-0.0693359375,
-0.0157623291015625,
0.05670166015625,
0.031005859375,
-0.0386962890625,
0.053619384765625,
-0.0343017578125,
-0.04730224609375,
-0.05218505859375,
0.0153961181640625,
0.032989501953125,
-0.005191802978515625,
0.03924560546875,
-0.007350921630859375,
-0.048736572265625,
-0.0704345703125,
-0.0045928955078125,
-0.003009796142578125,
-0.0022144317626953125,
0.01532745361328125,
0.052734375,
-0.02099609375,
0.05926513671875,
-0.04010009765625,
-0.0306396484375,
-0.012115478515625,
-0.00507354736328125,
0.028839111328125,
0.0458984375,
0.052825927734375,
-0.04522705078125,
-0.056884765625,
-0.013519287109375,
-0.060943603515625,
0.025299072265625,
-0.0237274169921875,
-0.0099945068359375,
0.0252685546875,
0.0275115966796875,
-0.05926513671875,
0.02447509765625,
0.040130615234375,
-0.0287933349609375,
0.0202178955078125,
-0.0216217041015625,
-0.004642486572265625,
-0.1192626953125,
-0.00196075439453125,
0.0178680419921875,
-0.0111541748046875,
-0.0306396484375,
0.011810302734375,
0.0185546875,
0.00323486328125,
-0.025848388671875,
0.04425048828125,
-0.04779052734375,
0.0212554931640625,
0.01534271240234375,
0.0196075439453125,
0.0022068023681640625,
0.049896240234375,
-0.0219268798828125,
0.03472900390625,
0.040618896484375,
-0.0305328369140625,
0.0168914794921875,
0.04547119140625,
-0.032806396484375,
0.0222930908203125,
-0.054534912109375,
0.0187530517578125,
-0.006557464599609375,
0.0263214111328125,
-0.05706787109375,
-0.01122283935546875,
0.0250091552734375,
-0.039306640625,
0.036865234375,
-0.014129638671875,
-0.061981201171875,
-0.0299530029296875,
-0.0538330078125,
0.019561767578125,
0.04339599609375,
-0.03790283203125,
0.047607421875,
0.0217437744140625,
-0.005413055419921875,
-0.046630859375,
-0.046142578125,
-0.00833892822265625,
-0.0082550048828125,
-0.062103271484375,
0.031524658203125,
0.0033016204833984375,
0.00678253173828125,
0.007511138916015625,
0.0151214599609375,
0.00826263427734375,
-0.00042510032653808594,
0.020416259765625,
0.01300811767578125,
-0.01532745361328125,
0.008880615234375,
0.0271759033203125,
-0.0027942657470703125,
-0.0003504753112792969,
-0.021575927734375,
0.06439208984375,
-0.0258636474609375,
-0.01328277587890625,
-0.046112060546875,
0.0259552001953125,
0.038909912109375,
-0.0301971435546875,
0.08258056640625,
0.06640625,
-0.043182373046875,
-0.005466461181640625,
-0.0384521484375,
-0.0007758140563964844,
-0.03717041015625,
0.03857421875,
-0.0304718017578125,
-0.04156494140625,
0.04254150390625,
0.01178741455078125,
0.006595611572265625,
0.0634765625,
0.0271453857421875,
-0.01055145263671875,
0.08551025390625,
0.032318115234375,
-0.0211944580078125,
0.038543701171875,
-0.06048583984375,
0.00955963134765625,
-0.06878662109375,
-0.027252197265625,
-0.0273895263671875,
-0.0231170654296875,
-0.06243896484375,
-0.0469970703125,
0.00930023193359375,
0.0156402587890625,
-0.02197265625,
0.04254150390625,
-0.0194091796875,
0.00893402099609375,
0.042999267578125,
0.0111846923828125,
-0.011962890625,
-0.004467010498046875,
-0.01409149169921875,
0.0012559890747070312,
-0.04071044921875,
-0.0278472900390625,
0.09326171875,
0.05169677734375,
0.052642822265625,
-0.002162933349609375,
0.06829833984375,
0.004779815673828125,
-0.01045989990234375,
-0.05670166015625,
0.035858154296875,
-0.0197906494140625,
-0.037445068359375,
-0.0235748291015625,
-0.013336181640625,
-0.07659912109375,
0.00862884521484375,
-0.02178955078125,
-0.049713134765625,
0.0218048095703125,
-0.0195770263671875,
-0.031585693359375,
0.0313720703125,
-0.047088623046875,
0.058258056640625,
0.00701904296875,
-0.0242767333984375,
-0.0206451416015625,
-0.04058837890625,
0.00030303001403808594,
-0.00469207763671875,
0.0033016204833984375,
-0.0017242431640625,
-0.007785797119140625,
0.07525634765625,
-0.032989501953125,
0.059967041015625,
-0.0193023681640625,
-0.0067596435546875,
0.017333984375,
-0.0203094482421875,
0.062469482421875,
0.0258636474609375,
-0.01508331298828125,
0.02166748046875,
0.00020635128021240234,
-0.03948974609375,
-0.0416259765625,
0.06280517578125,
-0.0721435546875,
-0.044769287109375,
-0.03955078125,
-0.03631591796875,
-0.01049041748046875,
0.0178985595703125,
0.015380859375,
0.0276947021484375,
-0.00009208917617797852,
0.04461669921875,
0.045928955078125,
-0.039306640625,
0.033966064453125,
0.01464080810546875,
-0.0036907196044921875,
-0.0440673828125,
0.07257080078125,
0.0205230712890625,
-0.00616455078125,
0.0311737060546875,
0.007549285888671875,
-0.02081298828125,
-0.036865234375,
-0.03668212890625,
0.046905517578125,
-0.039642333984375,
-0.026947021484375,
-0.069580078125,
-0.031280517578125,
-0.042755126953125,
-0.01148223876953125,
-0.0428466796875,
-0.0298309326171875,
-0.048187255859375,
-0.006938934326171875,
0.03302001953125,
0.049560546875,
0.0032329559326171875,
0.00640106201171875,
-0.049896240234375,
0.0222625732421875,
0.011260986328125,
0.0308837890625,
0.01335906982421875,
-0.048309326171875,
-0.048492431640625,
0.0162811279296875,
-0.02154541015625,
-0.061065673828125,
0.037200927734375,
0.01287841796875,
0.042327880859375,
0.0343017578125,
0.00634765625,
0.047454833984375,
-0.03887939453125,
0.0640869140625,
0.001068115234375,
-0.0687255859375,
0.039306640625,
-0.0257568359375,
0.007720947265625,
0.028778076171875,
0.0439453125,
-0.0243988037109375,
-0.02587890625,
-0.0450439453125,
-0.0703125,
0.057830810546875,
0.0174560546875,
0.04119873046875,
-0.01178741455078125,
0.0316162109375,
-0.0037631988525390625,
-0.016937255859375,
-0.04803466796875,
-0.035919189453125,
-0.01861572265625,
-0.025390625,
0.0024509429931640625,
-0.031890869140625,
0.007061004638671875,
-0.040435791015625,
0.0587158203125,
-0.0038928985595703125,
0.0592041015625,
0.01229095458984375,
-0.0242156982421875,
0.017974853515625,
0.0034999847412109375,
0.035675048828125,
0.01702880859375,
-0.0251312255859375,
-0.0183868408203125,
0.030242919921875,
-0.0272064208984375,
0.0006356239318847656,
0.0345458984375,
-0.01148223876953125,
0.0016107559204101562,
0.0243682861328125,
0.068115234375,
0.01174163818359375,
-0.0307464599609375,
0.057525634765625,
-0.0035552978515625,
-0.0167388916015625,
-0.03717041015625,
-0.00455474853515625,
0.0157928466796875,
0.0257568359375,
0.0267333984375,
-0.00936126708984375,
-0.0036106109619140625,
-0.0309600830078125,
0.01258087158203125,
0.0238037109375,
-0.01898193359375,
-0.0330810546875,
0.05560302734375,
0.00878143310546875,
-0.008331298828125,
0.049652099609375,
-0.0205078125,
-0.0283203125,
0.04931640625,
0.055816650390625,
0.0721435546875,
-0.004489898681640625,
0.0177764892578125,
0.051513671875,
0.0211944580078125,
-0.008026123046875,
0.02099609375,
0.0219268798828125,
-0.049652099609375,
-0.00972747802734375,
-0.05938720703125,
-0.00775146484375,
0.010955810546875,
-0.05194091796875,
0.026458740234375,
-0.0440673828125,
-0.016632080078125,
0.0006542205810546875,
0.0274810791015625,
-0.0762939453125,
0.01045989990234375,
0.007080078125,
0.0726318359375,
-0.05596923828125,
0.0679931640625,
0.053436279296875,
-0.06463623046875,
-0.044219970703125,
0.0164031982421875,
-0.017333984375,
-0.047088623046875,
0.028839111328125,
0.031219482421875,
0.0189056396484375,
0.0034694671630859375,
-0.032470703125,
-0.08087158203125,
0.09747314453125,
0.0284881591796875,
-0.029815673828125,
-0.0159759521484375,
-0.0023937225341796875,
0.04119873046875,
-0.0160369873046875,
0.02252197265625,
0.02325439453125,
0.046905517578125,
-0.0377197265625,
-0.05645751953125,
0.0341796875,
-0.042572021484375,
0.0252685546875,
-0.0133056640625,
-0.07958984375,
0.0697021484375,
-0.02032470703125,
-0.01403045654296875,
0.027984619140625,
0.07403564453125,
0.0006508827209472656,
0.00446319580078125,
0.031280517578125,
0.052825927734375,
0.047515869140625,
-0.00661468505859375,
0.10272216796875,
-0.0221099853515625,
0.031707763671875,
0.05511474609375,
0.0181732177734375,
0.07537841796875,
0.0198822021484375,
-0.0229644775390625,
0.060791015625,
0.046051025390625,
-0.0178680419921875,
0.037933349609375,
0.004383087158203125,
0.0101470947265625,
-0.01187896728515625,
-0.00385284423828125,
-0.04296875,
0.040252685546875,
0.0283355712890625,
-0.044677734375,
0.00919342041015625,
0.00699615478515625,
0.0218048095703125,
-0.00634765625,
0.01386260986328125,
0.037261962890625,
0.0186614990234375,
-0.030120849609375,
0.0599365234375,
0.035125732421875,
0.07318115234375,
-0.037322998046875,
0.0178985595703125,
-0.018798828125,
0.01165008544921875,
-0.00165557861328125,
-0.049072265625,
-0.00061798095703125,
-0.01213836669921875,
-0.0143585205078125,
-0.00347900390625,
0.03558349609375,
-0.03375244140625,
-0.0296478271484375,
0.03814697265625,
0.0247344970703125,
-0.00708770751953125,
-0.0037212371826171875,
-0.062347412109375,
-0.0018157958984375,
0.00021767616271972656,
-0.052215576171875,
0.0189666748046875,
0.0174713134765625,
0.0033092498779296875,
0.04095458984375,
0.05426025390625,
-0.01276397705078125,
0.00421905517578125,
0.0035991668701171875,
0.05914306640625,
-0.0589599609375,
-0.0301055908203125,
-0.05987548828125,
0.04034423828125,
-0.01360321044921875,
-0.03228759765625,
0.06427001953125,
0.046905517578125,
0.06060791015625,
-0.023895263671875,
0.04681396484375,
-0.010589599609375,
0.007511138916015625,
-0.041961669921875,
0.06988525390625,
-0.0665283203125,
-0.00531768798828125,
-0.0260162353515625,
-0.0692138671875,
-0.0225372314453125,
0.0517578125,
-0.01404571533203125,
0.00507354736328125,
0.046783447265625,
0.057342529296875,
0.001636505126953125,
-0.0176849365234375,
0.01424407958984375,
0.0300445556640625,
0.0250091552734375,
0.05133056640625,
0.0384521484375,
-0.0771484375,
0.03399658203125,
-0.03289794921875,
0.00884246826171875,
-0.034576416015625,
-0.050994873046875,
-0.08026123046875,
-0.07489013671875,
-0.019195556640625,
-0.0267791748046875,
0.00363922119140625,
0.082275390625,
0.04559326171875,
-0.0640869140625,
0.01031494140625,
0.01242828369140625,
0.0142669677734375,
-0.0173187255859375,
-0.01824951171875,
0.0538330078125,
-0.0200653076171875,
-0.08782958984375,
0.0283355712890625,
-0.00489044189453125,
0.00955963134765625,
-0.01751708984375,
-0.0176239013671875,
-0.05322265625,
0.031463623046875,
0.035797119140625,
0.0141754150390625,
-0.05059814453125,
-0.0229644775390625,
0.0039043426513671875,
-0.0209197998046875,
-0.006816864013671875,
0.0015497207641601562,
-0.0413818359375,
0.0364990234375,
0.0626220703125,
0.052642822265625,
0.048675537109375,
-0.021148681640625,
0.04315185546875,
-0.0517578125,
0.016082763671875,
-0.003322601318359375,
0.0478515625,
0.020416259765625,
-0.0221405029296875,
0.041015625,
0.0117034912109375,
-0.021759033203125,
-0.05926513671875,
-0.021087646484375,
-0.08197021484375,
-0.00804901123046875,
0.09173583984375,
-0.0262298583984375,
-0.017913818359375,
0.01239776611328125,
-0.00807952880859375,
0.04302978515625,
-0.032867431640625,
0.04901123046875,
0.046630859375,
0.0251617431640625,
0.0002536773681640625,
-0.049346923828125,
0.030670166015625,
0.04559326171875,
-0.05072021484375,
-0.0277252197265625,
0.008575439453125,
0.0277557373046875,
0.030120849609375,
0.0205535888671875,
-0.01300811767578125,
0.0208587646484375,
0.00887298583984375,
-0.00518798828125,
-0.02996826171875,
-0.0192413330078125,
-0.0214385986328125,
0.0027713775634765625,
-0.017730712890625,
-0.03045654296875
]
] |
timm/visformer_small.in1k | 2023-04-26T16:47:32.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2104.12533",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/visformer_small.in1k | 0 | 30,603 | timm | 2023-04-26T16:47:02 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for visformer_small.in1k
A Visformer image classification model. Trained on ImageNet-1k by https://github.com/hzhang57 and https://github.com/developer0hye.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 40.2
- GMACs: 4.9
- Activations (M): 11.4
- Image size: 224 x 224
- **Papers:**
- Visformer: The Vision-friendly Transformer: https://arxiv.org/abs/2104.12533
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/danczs/Visformer
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('visformer_small.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
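To turn the top-5 class indices into human-readable labels, you can map them through any ImageNet-1k class-name list. The snippet below continues from the block above and is only a sketch: the label-file URL is an assumption, not part of the timm API.
```python
# Sketch: decode top-5 indices to label strings (the URL is an assumption;
# substitute any ImageNet-1k class-name list you trust).
from urllib.request import urlopen

IMAGENET_CLASSES_URL = 'https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt'
class_names = urlopen(IMAGENET_CLASSES_URL).read().decode().splitlines()

for prob, idx in zip(top5_probabilities[0], top5_class_indices[0]):
    print(f'{class_names[idx]}: {prob.item():.2f}%')
```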
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'visformer_small.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 768, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{chen2021visformer,
title={Visformer: The vision-friendly transformer},
author={Chen, Zhengsu and Xie, Lingxi and Niu, Jianwei and Liu, Xuefeng and Wei, Longhui and Tian, Qi},
booktitle={Proceedings of the IEEE/CVF international conference on computer vision},
pages={589--598},
year={2021}
}
```
| 2,819 | [
[
-0.0282440185546875,
-0.0223388671875,
0.005527496337890625,
0.0077972412109375,
-0.027069091796875,
-0.035675048828125,
-0.01030731201171875,
-0.0306854248046875,
0.0149383544921875,
0.02740478515625,
-0.037933349609375,
-0.045196533203125,
-0.05169677734375,
-0.0106353759765625,
-0.01473236083984375,
0.064208984375,
-0.00958251953125,
0.006862640380859375,
-0.021270751953125,
-0.032135009765625,
-0.017822265625,
-0.0176239013671875,
-0.05902099609375,
-0.0310516357421875,
0.030548095703125,
0.021484375,
0.038909912109375,
0.034942626953125,
0.053863525390625,
0.0321044921875,
-0.01519012451171875,
0.0001481771469116211,
-0.0289154052734375,
-0.0160369873046875,
0.032379150390625,
-0.0611572265625,
-0.042236328125,
0.0182037353515625,
0.055145263671875,
0.03631591796875,
-0.00238800048828125,
0.03790283203125,
0.018707275390625,
0.044219970703125,
-0.0247802734375,
0.018890380859375,
-0.0305633544921875,
0.0258636474609375,
-0.0160369873046875,
0.00988006591796875,
-0.0260009765625,
-0.03900146484375,
0.0223388671875,
-0.0390625,
0.0309295654296875,
0.007770538330078125,
0.0911865234375,
0.028228759765625,
0.0022525787353515625,
0.0177764892578125,
-0.0218658447265625,
0.0572509765625,
-0.0594482421875,
0.0306396484375,
0.01171875,
0.0275421142578125,
0.002155303955078125,
-0.07928466796875,
-0.0523681640625,
-0.007518768310546875,
-0.01107025146484375,
0.00567626953125,
-0.018798828125,
0.007564544677734375,
0.0269317626953125,
0.032623291015625,
-0.034271240234375,
0.002582550048828125,
-0.05352783203125,
-0.01580810546875,
0.0290985107421875,
0.005901336669921875,
0.0119781494140625,
-0.0226898193359375,
-0.043487548828125,
-0.035797119140625,
-0.024322509765625,
0.0411376953125,
0.0172271728515625,
0.00868988037109375,
-0.046356201171875,
0.043182373046875,
0.0106353759765625,
0.048065185546875,
0.0231475830078125,
-0.0269927978515625,
0.044036865234375,
0.01163482666015625,
-0.04376220703125,
-0.0101165771484375,
0.07025146484375,
0.039520263671875,
0.0186309814453125,
0.00726318359375,
-0.007579803466796875,
-0.0198211669921875,
0.00328826904296875,
-0.08807373046875,
-0.020965576171875,
0.0187530517578125,
-0.05462646484375,
-0.044342041015625,
0.0259246826171875,
-0.04119873046875,
-0.01293182373046875,
-0.012420654296875,
0.05078125,
-0.033111572265625,
-0.01346588134765625,
0.00908660888671875,
-0.0158538818359375,
0.04425048828125,
0.019805908203125,
-0.033447265625,
0.01137542724609375,
0.01531982421875,
0.07781982421875,
-0.01141357421875,
-0.03179931640625,
-0.0177154541015625,
-0.0255889892578125,
-0.0210113525390625,
0.039215087890625,
-0.00030803680419921875,
-0.0096893310546875,
-0.017425537109375,
0.03350830078125,
-0.016754150390625,
-0.054901123046875,
0.0243988037109375,
-0.0174407958984375,
0.034027099609375,
0.006839752197265625,
-0.01157379150390625,
-0.047637939453125,
0.025665283203125,
-0.036407470703125,
0.08404541015625,
0.0275726318359375,
-0.068603515625,
0.0296783447265625,
-0.037811279296875,
-0.0009713172912597656,
-0.01165771484375,
-0.0036163330078125,
-0.07794189453125,
-0.0002868175506591797,
0.0006966590881347656,
0.056121826171875,
-0.026458740234375,
0.0052337646484375,
-0.035552978515625,
-0.01763916015625,
0.0188140869140625,
-0.01544189453125,
0.07666015625,
-0.00054168701171875,
-0.02978515625,
0.0267181396484375,
-0.06036376953125,
0.008331298828125,
0.040557861328125,
-0.0189208984375,
0.0017442703247070312,
-0.04925537109375,
0.021942138671875,
0.0205078125,
0.0096282958984375,
-0.04461669921875,
0.01959228515625,
-0.0228271484375,
0.020660400390625,
0.0526123046875,
-0.01055145263671875,
0.022705078125,
-0.032684326171875,
0.0290069580078125,
0.0277099609375,
0.036285400390625,
0.01430511474609375,
-0.0426025390625,
-0.07623291015625,
-0.029632568359375,
0.0185699462890625,
0.0343017578125,
-0.048583984375,
0.024688720703125,
-0.01318359375,
-0.06365966796875,
-0.0304412841796875,
0.0013713836669921875,
0.03314208984375,
0.04937744140625,
0.023712158203125,
-0.039306640625,
-0.03485107421875,
-0.07281494140625,
0.00878143310546875,
0.00335693359375,
-0.001434326171875,
0.025054931640625,
0.0389404296875,
-0.0141754150390625,
0.054840087890625,
-0.039306640625,
-0.0252227783203125,
-0.016448974609375,
-0.006877899169921875,
0.03582763671875,
0.05267333984375,
0.0657958984375,
-0.055450439453125,
-0.04132080078125,
-0.0247039794921875,
-0.0743408203125,
0.01806640625,
0.0098114013671875,
-0.0151824951171875,
0.0178985595703125,
0.01378631591796875,
-0.037261962890625,
0.054229736328125,
0.0183258056640625,
-0.0266876220703125,
0.043731689453125,
-0.01568603515625,
0.00666046142578125,
-0.0782470703125,
0.00785064697265625,
0.0233001708984375,
-0.018035888671875,
-0.0372314453125,
-0.01016998291015625,
0.00311279296875,
-0.0033416748046875,
-0.056793212890625,
0.0484619140625,
-0.035491943359375,
0.00873565673828125,
-0.00792694091796875,
-0.023345947265625,
0.0012483596801757812,
0.05548095703125,
-0.005672454833984375,
0.0294342041015625,
0.07037353515625,
-0.046783447265625,
0.041778564453125,
0.0418701171875,
-0.0157928466796875,
0.0396728515625,
-0.056915283203125,
0.005558013916015625,
-0.01276397705078125,
0.00682830810546875,
-0.0751953125,
-0.015228271484375,
0.031158447265625,
-0.033843994140625,
0.053436279296875,
-0.034454345703125,
-0.018402099609375,
-0.044036865234375,
-0.0435791015625,
0.040618896484375,
0.04766845703125,
-0.05670166015625,
0.04180908203125,
0.0221099853515625,
0.017974853515625,
-0.039825439453125,
-0.06396484375,
-0.0249786376953125,
-0.0289459228515625,
-0.066162109375,
0.03277587890625,
0.0017213821411132812,
0.0024852752685546875,
0.0025386810302734375,
-0.01302337646484375,
-0.005329132080078125,
-0.0222930908203125,
0.039337158203125,
0.04278564453125,
-0.02886962890625,
-0.01605224609375,
-0.02484130859375,
-0.01543426513671875,
0.00384521484375,
-0.0248565673828125,
0.0372314453125,
-0.02642822265625,
-0.01248931884765625,
-0.058197021484375,
-0.0151824951171875,
0.052276611328125,
-0.005252838134765625,
0.0654296875,
0.0762939453125,
-0.038818359375,
-0.002979278564453125,
-0.039337158203125,
-0.0205078125,
-0.037261962890625,
0.0277252197265625,
-0.032501220703125,
-0.0249786376953125,
0.05926513671875,
0.00937652587890625,
0.0016813278198242188,
0.0555419921875,
0.02825927734375,
0.00421142578125,
0.052825927734375,
0.069091796875,
0.020538330078125,
0.058563232421875,
-0.0748291015625,
-0.01226806640625,
-0.06805419921875,
-0.04254150390625,
-0.01544189453125,
-0.039520263671875,
-0.048614501953125,
-0.030487060546875,
0.04052734375,
0.00487518310546875,
-0.0233154296875,
0.038665771484375,
-0.06683349609375,
0.00388336181640625,
0.053466796875,
0.0361328125,
-0.0228424072265625,
0.01453399658203125,
-0.023651123046875,
0.0011320114135742188,
-0.064208984375,
-0.0187835693359375,
0.07794189453125,
0.035125732421875,
0.06378173828125,
-0.0121002197265625,
0.040130615234375,
-0.0275421142578125,
0.018585205078125,
-0.05511474609375,
0.0406494140625,
-0.01351165771484375,
-0.0216064453125,
-0.00901031494140625,
-0.014801025390625,
-0.06854248046875,
0.00966644287109375,
-0.0254974365234375,
-0.0537109375,
0.0294647216796875,
0.01427459716796875,
-0.0242919921875,
0.05352783203125,
-0.056427001953125,
0.08465576171875,
-0.00922393798828125,
-0.032196044921875,
0.01800537109375,
-0.049041748046875,
0.0227813720703125,
0.0118865966796875,
-0.025421142578125,
0.0013513565063476562,
0.02197265625,
0.08795166015625,
-0.045501708984375,
0.0684814453125,
-0.039154052734375,
0.018524169921875,
0.03790283203125,
-0.01079559326171875,
0.02056884765625,
-0.0103912353515625,
0.0032024383544921875,
0.025665283203125,
0.01248931884765625,
-0.0273590087890625,
-0.033416748046875,
0.03607177734375,
-0.0635986328125,
-0.0284271240234375,
-0.0308685302734375,
-0.032806396484375,
0.0099334716796875,
0.00927734375,
0.053680419921875,
0.055419921875,
0.0166473388671875,
0.03179931640625,
0.049285888671875,
-0.025726318359375,
0.02886962890625,
-0.0014247894287109375,
-0.0209808349609375,
-0.026702880859375,
0.061492919921875,
0.0149078369140625,
0.0153656005859375,
0.007610321044921875,
0.01983642578125,
-0.0190887451171875,
-0.035003662109375,
-0.0213623046875,
0.028900146484375,
-0.052825927734375,
-0.0352783203125,
-0.03759765625,
-0.042266845703125,
-0.0304412841796875,
-0.001766204833984375,
-0.03607177734375,
-0.025848388671875,
-0.0204620361328125,
0.0157623291015625,
0.04754638671875,
0.045379638671875,
-0.004039764404296875,
0.035125732421875,
-0.0400390625,
0.01549530029296875,
0.0175018310546875,
0.049835205078125,
-0.00531005859375,
-0.07208251953125,
-0.0102996826171875,
-0.012725830078125,
-0.03448486328125,
-0.05633544921875,
0.0347900390625,
0.0078125,
0.043060302734375,
0.0361328125,
-0.0157012939453125,
0.0513916015625,
-0.01385498046875,
0.043243408203125,
0.0304107666015625,
-0.03814697265625,
0.0462646484375,
0.0025424957275390625,
0.00855255126953125,
0.008331298828125,
0.0241546630859375,
-0.0238800048828125,
-0.0084228515625,
-0.0670166015625,
-0.056121826171875,
0.06793212890625,
0.0125732421875,
-0.01000213623046875,
0.037109375,
0.047454833984375,
0.006748199462890625,
0.00310516357421875,
-0.06488037109375,
-0.045989990234375,
-0.03143310546875,
-0.0236053466796875,
0.0017957687377929688,
-0.01300811767578125,
0.004299163818359375,
-0.05548095703125,
0.056610107421875,
-0.00605010986328125,
0.05218505859375,
0.02972412109375,
-0.0092315673828125,
-0.006587982177734375,
-0.030609130859375,
0.036651611328125,
0.0241546630859375,
-0.0296783447265625,
0.0011272430419921875,
0.0193328857421875,
-0.053192138671875,
-0.0021762847900390625,
0.01544189453125,
-0.005901336669921875,
-0.0001245737075805664,
0.017669677734375,
0.07305908203125,
0.0007181167602539062,
-0.0017499923706054688,
0.0428466796875,
-0.0156402587890625,
-0.0274505615234375,
-0.0305633544921875,
0.0151519775390625,
-0.0137786865234375,
0.034149169921875,
0.019622802734375,
0.03704833984375,
-0.004486083984375,
-0.0097503662109375,
0.0118560791015625,
0.039825439453125,
-0.045013427734375,
-0.02508544921875,
0.0638427734375,
-0.016387939453125,
-0.01287841796875,
0.044403076171875,
-0.004154205322265625,
-0.039764404296875,
0.07257080078125,
0.030364990234375,
0.07666015625,
-0.00014472007751464844,
-0.00705718994140625,
0.05584716796875,
0.01499176025390625,
0.0008521080017089844,
0.01250457763671875,
0.01149749755859375,
-0.055389404296875,
0.00611114501953125,
-0.04962158203125,
-0.00719451904296875,
0.0254974365234375,
-0.040313720703125,
0.037384033203125,
-0.040863037109375,
-0.0260467529296875,
0.0166015625,
0.01430511474609375,
-0.0748291015625,
0.01522064208984375,
0.01300048828125,
0.0692138671875,
-0.047027587890625,
0.050811767578125,
0.0628662109375,
-0.047210693359375,
-0.0758056640625,
-0.01226043701171875,
0.0081634521484375,
-0.06756591796875,
0.04541015625,
0.03619384765625,
-0.00510406494140625,
0.004604339599609375,
-0.06317138671875,
-0.04766845703125,
0.10699462890625,
0.03369140625,
-0.00815582275390625,
0.00901031494140625,
-0.004871368408203125,
0.01236724853515625,
-0.0243988037109375,
0.024688720703125,
0.00505828857421875,
0.032806396484375,
0.021697998046875,
-0.059844970703125,
0.01141357421875,
-0.0259857177734375,
0.005634307861328125,
0.0037441253662109375,
-0.062164306640625,
0.071533203125,
-0.033416748046875,
-0.00768280029296875,
0.00489044189453125,
0.0501708984375,
0.0054779052734375,
0.00904083251953125,
0.047088623046875,
0.0482177734375,
0.04364013671875,
-0.01502227783203125,
0.06817626953125,
-0.00908660888671875,
0.046356201171875,
0.036529541015625,
0.0277252197265625,
0.035552978515625,
0.030242919921875,
-0.019805908203125,
0.0305633544921875,
0.06353759765625,
-0.0330810546875,
0.02178955078125,
0.012420654296875,
0.007537841796875,
-0.007762908935546875,
0.01346588134765625,
-0.036285400390625,
0.050811767578125,
0.0011091232299804688,
-0.02978515625,
-0.0130767822265625,
0.017333984375,
-0.01105499267578125,
-0.032440185546875,
-0.0238800048828125,
0.0360107421875,
0.005046844482421875,
-0.0280303955078125,
0.06536865234375,
0.004596710205078125,
0.0755615234375,
-0.03118896484375,
0.00322723388671875,
-0.0133209228515625,
0.033477783203125,
-0.025634765625,
-0.056610107421875,
0.0215911865234375,
-0.018035888671875,
-0.0158843994140625,
0.001163482666015625,
0.0572509765625,
-0.028106689453125,
-0.044769287109375,
0.01119232177734375,
0.0269622802734375,
0.03192138671875,
-0.003841400146484375,
-0.07281494140625,
-0.00624847412109375,
0.004184722900390625,
-0.0438232421875,
0.01312255859375,
0.0301513671875,
0.006687164306640625,
0.055816650390625,
0.041351318359375,
-0.008026123046875,
0.0194244384765625,
-0.00562286376953125,
0.06005859375,
-0.04119873046875,
-0.022705078125,
-0.051971435546875,
0.04486083984375,
-0.004352569580078125,
-0.039764404296875,
0.040008544921875,
0.051971435546875,
0.06341552734375,
-0.01776123046875,
0.02508544921875,
-0.01378631591796875,
0.00817108154296875,
-0.03173828125,
0.06524658203125,
-0.0555419921875,
-0.01180267333984375,
-0.01453399658203125,
-0.06036376953125,
-0.032135009765625,
0.06329345703125,
-0.018798828125,
0.0201873779296875,
0.039825439453125,
0.0701904296875,
-0.033172607421875,
-0.0280609130859375,
0.0238189697265625,
0.01849365234375,
0.01018524169921875,
0.0272064208984375,
0.0293426513671875,
-0.0635986328125,
0.033538818359375,
-0.04742431640625,
-0.018280029296875,
-0.012054443359375,
-0.045623779296875,
-0.07647705078125,
-0.06353759765625,
-0.043975830078125,
-0.046234130859375,
-0.03045654296875,
0.06597900390625,
0.07598876953125,
-0.05438232421875,
-0.00894927978515625,
0.01178741455078125,
0.0070648193359375,
-0.0174713134765625,
-0.020294189453125,
0.043243408203125,
-0.0142059326171875,
-0.060516357421875,
-0.030242919921875,
0.004055023193359375,
0.037628173828125,
-0.007228851318359375,
-0.01291656494140625,
-0.001430511474609375,
-0.0184173583984375,
0.0308074951171875,
0.0202789306640625,
-0.049072265625,
-0.0208892822265625,
-0.0005002021789550781,
-0.01349639892578125,
0.035614013671875,
0.0205841064453125,
-0.054473876953125,
0.022308349609375,
0.0295257568359375,
0.01751708984375,
0.06378173828125,
-0.0305633544921875,
0.0031757354736328125,
-0.053680419921875,
0.0426025390625,
-0.00859832763671875,
0.046295166015625,
0.0274200439453125,
-0.02252197265625,
0.036895751953125,
0.0352783203125,
-0.026824951171875,
-0.061248779296875,
-0.01100921630859375,
-0.09661865234375,
0.005878448486328125,
0.0625,
-0.01076507568359375,
-0.052459716796875,
0.032806396484375,
-0.0160064697265625,
0.04541015625,
0.00540924072265625,
0.0341796875,
0.014556884765625,
-0.00638580322265625,
-0.036224365234375,
-0.03778076171875,
0.037933349609375,
0.003337860107421875,
-0.044677734375,
-0.037384033203125,
-0.0007367134094238281,
0.056884765625,
0.02294921875,
0.0321044921875,
-0.0157470703125,
0.01513671875,
0.00449371337890625,
0.038330078125,
-0.032073974609375,
-0.0133056640625,
-0.02508544921875,
0.00665283203125,
-0.00798797607421875,
-0.053009033203125
]
] |
m-a-p/MERT-v1-95M | 2023-06-04T04:46:31.000Z | [
"transformers",
"pytorch",
"mert_model",
"feature-extraction",
"music",
"custom_code",
"arxiv:2306.00107",
"license:cc-by-nc-4.0",
"has_space",
"region:us"
] | feature-extraction | m-a-p | null | null | m-a-p/MERT-v1-95M | 11 | 30,579 | transformers | 2023-03-17T10:57:16 | ---
license: cc-by-nc-4.0
inference: false
tags:
- music
---
# Introduction to our series work
The development log of our Music Audio Pre-training (m-a-p) model family:
- 02/06/2023: [arXiv pre-print](https://arxiv.org/abs/2306.00107) and training [code](https://github.com/yizhilll/MERT) released.
- 17/03/2023: we released two advanced music understanding models, [MERT-v1-95M](https://huggingface.co/m-a-p/MERT-v1-95M) and [MERT-v1-330M](https://huggingface.co/m-a-p/MERT-v1-330M), trained with a new paradigm and dataset. They outperform the previous models and generalize better to more tasks.
- 14/03/2023: we retrained the MERT-v0 model with an open-source-only music dataset: [MERT-v0-public](https://huggingface.co/m-a-p/MERT-v0-public).
- 29/12/2022: [MERT-v0](https://huggingface.co/m-a-p/MERT-v0), a music understanding model trained with the **MLM** paradigm, which performs better on downstream tasks.
- 29/10/2022: [music2vec](https://huggingface.co/m-a-p/music2vec-v1), a pre-trained MIR model trained with the **BYOL** paradigm.
Here is a table for quick model pick-up:
| Name | Pre-train Paradigm | Training Data (hour) | Pre-train Context (second) | Model Size | Transformer Layer-Dimension | Feature Rate | Sample Rate | Release Date |
| ------------------------------------------------------------ | ------------------ | -------------------- | ---------------------------- | ---------- | --------------------------- | ------------ | ----------- | ------------ |
| [MERT-v1-330M](https://huggingface.co/m-a-p/MERT-v1-330M) | MLM | 160K | 5 | 330M | 24-1024 | 75 Hz | 24 kHz | 17/03/2023 |
| [MERT-v1-95M](https://huggingface.co/m-a-p/MERT-v1-95M) | MLM | 20K | 5 | 95M | 12-768 | 75 Hz | 24 kHz | 17/03/2023 |
| [MERT-v0-public](https://huggingface.co/m-a-p/MERT-v0-public) | MLM | 900 | 5 | 95M | 12-768 | 50 Hz | 16 kHz | 14/03/2023 |
| [MERT-v0](https://huggingface.co/m-a-p/MERT-v0) | MLM | 1000 | 5 | 95M | 12-768 | 50 Hz | 16 kHz | 29/12/2022 |
| [music2vec-v1](https://huggingface.co/m-a-p/music2vec-v1) | BYOL | 1000 | 30 | 95M | 12-768 | 50 Hz | 16 kHz | 30/10/2022 |
## Explanation
The m-a-p models share a similar model architecture; the most notable difference between them is the pre-training paradigm. Beyond that, there are several technical configurations to be aware of before use:
- **Model Size**: the number of parameters loaded into memory. Please select the size appropriate for your hardware.
- **Transformer Layer-Dimension**: the number of transformer layers and the corresponding feature dimensions output by our model. This is listed because features extracted from **different layers can perform differently depending on the task**.
- **Feature Rate**: the number of feature vectors the model outputs per second of audio input (see the quick example after this list).
- **Sample Rate**: the audio sampling frequency the model was trained with.
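As a quick illustration of the feature-rate arithmetic (a toy calculation, not model code):
```python
# Number of feature vectors per layer = feature rate (Hz) * clip duration (s).
# E.g. a MERT-v1 model (75 Hz) over its 5-second pre-training context:
feature_rate_hz = 75
clip_seconds = 5
print(feature_rate_hz * clip_seconds)  # 375 feature vectors per transformer layer
```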
# Introduction to MERT-v1
Compared to MERT-v0, we introduced several changes in MERT-v1 pre-training:
- Changed the pseudo labels to the 8 codebooks from [EnCodec](https://github.com/facebookresearch/encodec), which potentially offer higher quality and enable our model to support music generation.
- MLM prediction with in-batch noise mixture.
- Trained at a higher audio sample rate (24 kHz).
- Trained with more audio data (up to 160 thousand hours).
- More model sizes available: 95M and 330M.
More details can be found in our [arXiv pre-print](https://arxiv.org/abs/2306.00107).
# Model Usage
```python
# from transformers import Wav2Vec2Processor
from transformers import Wav2Vec2FeatureExtractor
from transformers import AutoModel
import torch
from torch import nn
import torchaudio.transforms as T
from datasets import load_dataset
# loading our model weights
model = AutoModel.from_pretrained("m-a-p/MERT-v1-95M", trust_remote_code=True)
# loading the corresponding preprocessor config
processor = Wav2Vec2FeatureExtractor.from_pretrained("m-a-p/MERT-v1-95M", trust_remote_code=True)
# load demo audio and set processor
dataset = load_dataset("hf-internal-testing/librispeech_asr_demo", "clean", split="validation")
dataset = dataset.sort("id")
sampling_rate = dataset.features["audio"].sampling_rate
resample_rate = processor.sampling_rate
# make sure the sample rates are aligned
if resample_rate != sampling_rate:
print(f'setting rate from {sampling_rate} to {resample_rate}')
resampler = T.Resample(sampling_rate, resample_rate)
else:
resampler = None
# audio file is decoded on the fly
if resampler is None:
input_audio = dataset[0]["audio"]["array"]
else:
input_audio = resampler(torch.from_numpy(dataset[0]["audio"]["array"]))
inputs = processor(input_audio, sampling_rate=resample_rate, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs, output_hidden_states=True)
# take a look at the output shape, there are 13 layers of representation
# each layer performs differently in different downstream tasks, you should choose empirically
all_layer_hidden_states = torch.stack(outputs.hidden_states).squeeze()
print(all_layer_hidden_states.shape) # [13 layer, Time steps, 768 feature_dim]
# for utterance level classification tasks, you can simply reduce the representation in time
time_reduced_hidden_states = all_layer_hidden_states.mean(-2)
print(time_reduced_hidden_states.shape) # [13, 768]
# you can even use a learnable weighted average representation
aggregator = nn.Conv1d(in_channels=13, out_channels=1, kernel_size=1)
weighted_avg_hidden_states = aggregator(time_reduced_hidden_states.unsqueeze(0)).squeeze()
print(weighted_avg_hidden_states.shape) # [768]
```
# Citation
```bibtex
@misc{li2023mert,
title={MERT: Acoustic Music Understanding Model with Large-Scale Self-supervised Training},
author={Yizhi Li and Ruibin Yuan and Ge Zhang and Yinghao Ma and Xingran Chen and Hanzhi Yin and Chenghua Lin and Anton Ragni and Emmanouil Benetos and Norbert Gyenge and Roger Dannenberg and Ruibo Liu and Wenhu Chen and Gus Xia and Yemin Shi and Wenhao Huang and Yike Guo and Jie Fu},
year={2023},
eprint={2306.00107},
archivePrefix={arXiv},
primaryClass={cs.SD}
}
``` | 6,721 | [
[
-0.051177978515625,
-0.033447265625,
0.01557159423828125,
0.00893402099609375,
-0.008026123046875,
-0.01203155517578125,
-0.02081298828125,
-0.022003173828125,
0.01251220703125,
0.02581787109375,
-0.06689453125,
-0.034759521484375,
-0.0384521484375,
-0.01433563232421875,
-0.01453399658203125,
0.0728759765625,
0.01206207275390625,
0.005523681640625,
0.00992584228515625,
-0.0137939453125,
-0.043060302734375,
-0.0328369140625,
-0.05157470703125,
-0.0272979736328125,
0.015045166015625,
0.02777099609375,
0.022918701171875,
0.040802001953125,
0.04461669921875,
0.0233001708984375,
-0.0139923095703125,
-0.01788330078125,
-0.0220184326171875,
0.000759124755859375,
0.0288238525390625,
-0.0261077880859375,
-0.0582275390625,
0.0206298828125,
0.033447265625,
0.025390625,
0.0007500648498535156,
0.044403076171875,
0.01038360595703125,
0.0479736328125,
-0.016571044921875,
0.00923919677734375,
-0.0218963623046875,
0.01422119140625,
-0.007495880126953125,
-0.0260162353515625,
-0.0308990478515625,
0.0013523101806640625,
0.0174560546875,
-0.05255126953125,
0.01000213623046875,
0.003597259521484375,
0.079345703125,
0.034027099609375,
-0.0220489501953125,
-0.01361846923828125,
-0.0574951171875,
0.062042236328125,
-0.059173583984375,
0.0555419921875,
0.04486083984375,
0.032958984375,
0.0163421630859375,
-0.046142578125,
-0.0307464599609375,
-0.0204010009765625,
-0.002239227294921875,
0.0274810791015625,
-0.006595611572265625,
-0.009613037109375,
0.032073974609375,
0.046630859375,
-0.0491943359375,
-0.001209259033203125,
-0.042724609375,
-0.0199432373046875,
0.053009033203125,
-0.00891876220703125,
0.0066375732421875,
-0.022125244140625,
-0.041168212890625,
-0.032958984375,
-0.03509521484375,
0.018951416015625,
0.025909423828125,
-0.00021886825561523438,
-0.036285400390625,
0.020843505859375,
0.00830841064453125,
0.040374755859375,
0.0255279541015625,
-0.036102294921875,
0.05035400390625,
-0.022125244140625,
-0.022064208984375,
0.0095062255859375,
0.060943603515625,
0.01094818115234375,
-0.0018291473388671875,
0.0146331787109375,
-0.027435302734375,
-0.00738525390625,
0.0011434555053710938,
-0.0570068359375,
-0.033111572265625,
0.0199432373046875,
-0.045379638671875,
-0.0107269287109375,
0.00044417381286621094,
-0.033050537109375,
0.00272369384765625,
-0.045318603515625,
0.06719970703125,
-0.046844482421875,
-0.034698486328125,
0.0155181884765625,
-0.0069427490234375,
0.0230255126953125,
-0.0030879974365234375,
-0.052154541015625,
0.018951416015625,
0.031646728515625,
0.054718017578125,
0.0030117034912109375,
-0.020111083984375,
-0.007678985595703125,
-0.00733184814453125,
-0.0034351348876953125,
0.0252685546875,
0.007232666015625,
-0.0173797607421875,
-0.034759521484375,
-0.0002319812774658203,
-0.01139068603515625,
-0.0394287109375,
0.049346923828125,
-0.0304718017578125,
0.0355224609375,
-0.0209197998046875,
-0.04205322265625,
-0.03643798828125,
-0.0023956298828125,
-0.039581298828125,
0.07440185546875,
0.0227813720703125,
-0.0648193359375,
0.031036376953125,
-0.0546875,
-0.01403045654296875,
-0.0311279296875,
0.01132965087890625,
-0.05810546875,
-0.01000213623046875,
0.020599365234375,
0.03485107421875,
-0.02423095703125,
0.006290435791015625,
-0.0158538818359375,
-0.03680419921875,
0.01617431640625,
-0.029541015625,
0.07196044921875,
0.033599853515625,
-0.045806884765625,
0.016143798828125,
-0.08282470703125,
-0.006992340087890625,
0.01690673828125,
-0.03143310546875,
0.00461578369140625,
-0.02215576171875,
0.0255279541015625,
0.035614013671875,
0.00811767578125,
-0.0279541015625,
-0.006076812744140625,
-0.0287933349609375,
0.044921875,
0.046905517578125,
0.001117706298828125,
0.02593994140625,
-0.0465087890625,
0.035675048828125,
-0.00574493408203125,
0.007488250732421875,
-0.0212249755859375,
-0.0216827392578125,
-0.0572509765625,
-0.031890869140625,
0.02972412109375,
0.0477294921875,
-0.0136566162109375,
0.062103271484375,
-0.0101318359375,
-0.052032470703125,
-0.0657958984375,
-0.0033473968505859375,
0.03143310546875,
0.039154052734375,
0.043487548828125,
-0.0280609130859375,
-0.0601806640625,
-0.07305908203125,
0.00246429443359375,
0.0027751922607421875,
-0.02215576171875,
0.03411865234375,
0.0273284912109375,
-0.00917816162109375,
0.06201171875,
-0.021484375,
-0.0201416015625,
-0.0141143798828125,
0.009368896484375,
0.0457763671875,
0.06378173828125,
0.047576904296875,
-0.04461669921875,
-0.0251617431640625,
-0.02557373046875,
-0.0538330078125,
-0.00914764404296875,
-0.01187896728515625,
-0.01104736328125,
-0.0018701553344726562,
0.0206146240234375,
-0.043914794921875,
0.0188446044921875,
0.0289306640625,
-0.0259246826171875,
0.050323486328125,
0.006649017333984375,
0.0224151611328125,
-0.0799560546875,
0.00499725341796875,
-0.00077056884765625,
-0.00777435302734375,
-0.041656494140625,
-0.016998291015625,
-0.00856781005859375,
-0.005878448486328125,
-0.04217529296875,
0.018646240234375,
-0.04095458984375,
-0.0238494873046875,
-0.0167236328125,
-0.0028400421142578125,
-0.00688934326171875,
0.0687255859375,
0.00450897216796875,
0.052642822265625,
0.05548095703125,
-0.051055908203125,
0.0240936279296875,
0.016265869140625,
-0.033172607421875,
0.0316162109375,
-0.0626220703125,
0.0167999267578125,
0.00727081298828125,
0.0258636474609375,
-0.0714111328125,
-0.0031681060791015625,
0.019683837890625,
-0.05792236328125,
0.0413818359375,
-0.0210418701171875,
-0.031951904296875,
-0.04534912109375,
-0.004901885986328125,
0.03985595703125,
0.05792236328125,
-0.036651611328125,
0.023101806640625,
0.0251617431640625,
0.0124664306640625,
-0.03057861328125,
-0.051055908203125,
-0.03564453125,
-0.044921875,
-0.06317138671875,
0.0222625732421875,
-0.005039215087890625,
-0.004512786865234375,
-0.00865936279296875,
-0.022613525390625,
0.0011320114135742188,
0.00708770751953125,
0.036590576171875,
0.01348876953125,
-0.035858154296875,
0.01042938232421875,
-0.016571044921875,
-0.0215606689453125,
0.0174407958984375,
-0.038482666015625,
0.06610107421875,
-0.0158843994140625,
-0.0299224853515625,
-0.07025146484375,
0.001125335693359375,
0.042938232421875,
-0.030609130859375,
0.0299530029296875,
0.07537841796875,
-0.0124664306640625,
0.0146026611328125,
-0.037322998046875,
-0.0245513916015625,
-0.037506103515625,
0.02801513671875,
-0.0309295654296875,
-0.050933837890625,
0.04254150390625,
0.01380157470703125,
0.0079345703125,
0.05682373046875,
0.03289794921875,
-0.0196380615234375,
0.06622314453125,
0.02777099609375,
-0.00897979736328125,
0.03936767578125,
-0.0650634765625,
-0.0115203857421875,
-0.06915283203125,
-0.01214599609375,
-0.022979736328125,
-0.04522705078125,
-0.04376220703125,
-0.0261077880859375,
0.041839599609375,
0.0034198760986328125,
-0.05047607421875,
0.044769287109375,
-0.041412353515625,
0.0023136138916015625,
0.06494140625,
0.0153350830078125,
-0.006885528564453125,
-0.0011510848999023438,
0.00383758544921875,
-0.006458282470703125,
-0.039520263671875,
-0.015228271484375,
0.1016845703125,
0.05181884765625,
0.03814697265625,
-0.0041351318359375,
0.06060791015625,
0.005615234375,
-0.01035308837890625,
-0.056182861328125,
0.0201873779296875,
0.01499176025390625,
-0.051788330078125,
-0.0209197998046875,
-0.0186614990234375,
-0.051422119140625,
0.02734375,
-0.024810791015625,
-0.038482666015625,
0.01506805419921875,
0.01158905029296875,
-0.032135009765625,
0.029876708984375,
-0.043792724609375,
0.0596923828125,
-0.01523590087890625,
-0.01497650146484375,
-0.0055084228515625,
-0.054443359375,
0.033355712890625,
0.0012044906616210938,
0.02020263671875,
-0.00978851318359375,
0.035003662109375,
0.0894775390625,
-0.0287628173828125,
0.03448486328125,
-0.031402587890625,
-0.0015096664428710938,
0.04376220703125,
-0.0008502006530761719,
0.0224456787109375,
-0.0009984970092773438,
0.0007534027099609375,
0.00843048095703125,
-0.007602691650390625,
-0.02655029296875,
-0.0290069580078125,
0.03192138671875,
-0.07391357421875,
-0.0295562744140625,
-0.0137939453125,
-0.058074951171875,
-0.0168304443359375,
0.00396728515625,
0.052764892578125,
0.0457763671875,
0.0115203857421875,
0.0263824462890625,
0.054168701171875,
-0.0166778564453125,
0.03643798828125,
0.0269775390625,
-0.0081634521484375,
-0.040618896484375,
0.0631103515625,
0.0078582763671875,
0.0233612060546875,
0.0191802978515625,
0.016937255859375,
-0.0255889892578125,
-0.032501220703125,
-0.023834228515625,
0.020111083984375,
-0.032073974609375,
-0.009063720703125,
-0.058380126953125,
-0.01251220703125,
-0.039520263671875,
0.0039520263671875,
-0.048004150390625,
-0.04461669921875,
-0.0166015625,
-0.005859375,
0.033233642578125,
0.038909912109375,
-0.0169525146484375,
0.050811767578125,
-0.054779052734375,
0.0159149169921875,
0.01739501953125,
0.01020050048828125,
-0.00771331787109375,
-0.08856201171875,
-0.0258941650390625,
-0.0013551712036132812,
-0.028228759765625,
-0.0750732421875,
0.0279541015625,
0.00009578466415405273,
0.032745361328125,
0.043182373046875,
-0.0191802978515625,
0.04168701171875,
-0.0264129638671875,
0.052337646484375,
0.0158233642578125,
-0.072509765625,
0.056121826171875,
-0.037261962890625,
0.0096893310546875,
0.040435791015625,
0.025970458984375,
-0.03216552734375,
-0.0052337646484375,
-0.06500244140625,
-0.058502197265625,
0.0782470703125,
0.0178680419921875,
-0.0216064453125,
0.039276123046875,
0.01517486572265625,
0.0024929046630859375,
0.0174560546875,
-0.0693359375,
-0.049346923828125,
-0.0360107421875,
-0.0020904541015625,
-0.0268402099609375,
-0.0093231201171875,
-0.0031871795654296875,
-0.054931640625,
0.067626953125,
0.024322509765625,
0.041656494140625,
0.0164642333984375,
-0.00945281982421875,
0.0006303787231445312,
0.0140838623046875,
0.045989990234375,
0.0278167724609375,
-0.042083740234375,
-0.0007276535034179688,
0.01189422607421875,
-0.03802490234375,
0.02374267578125,
-0.002658843994140625,
-0.00263214111328125,
-0.0025501251220703125,
0.037078857421875,
0.0867919921875,
0.0222625732421875,
-0.0171661376953125,
0.037628173828125,
0.01338958740234375,
-0.0273590087890625,
-0.03790283203125,
0.005847930908203125,
0.0190277099609375,
0.021270751953125,
0.01381683349609375,
0.013885498046875,
0.01220703125,
-0.0273284912109375,
0.039703369140625,
0.00667572021484375,
-0.038909912109375,
-0.02874755859375,
0.06854248046875,
-0.0059356689453125,
-0.0191192626953125,
0.043426513671875,
-0.014556884765625,
-0.0435791015625,
0.062042236328125,
0.041748046875,
0.0714111328125,
-0.03369140625,
-0.01141357421875,
0.055694580078125,
0.023651123046875,
0.00421142578125,
0.0204010009765625,
-0.0218048095703125,
-0.027435302734375,
-0.017547607421875,
-0.05438232421875,
0.003101348876953125,
0.02764892578125,
-0.06646728515625,
0.0220947265625,
-0.028228759765625,
-0.017364501953125,
-0.0046844482421875,
0.0207672119140625,
-0.057586669921875,
0.0257568359375,
0.005672454833984375,
0.0777587890625,
-0.07275390625,
0.0721435546875,
0.030487060546875,
-0.039031982421875,
-0.0885009765625,
-0.0097503662109375,
0.0156402587890625,
-0.052581787109375,
0.045928955078125,
0.0116424560546875,
0.006992340087890625,
0.00724029541015625,
-0.03955078125,
-0.08160400390625,
0.10406494140625,
0.0100555419921875,
-0.048187255859375,
0.01346588134765625,
-0.006900787353515625,
0.034698486328125,
-0.029541015625,
0.021240234375,
0.0628662109375,
0.044525146484375,
0.0131378173828125,
-0.07421875,
-0.0081329345703125,
-0.0281982421875,
-0.0263519287109375,
0.0031070709228515625,
-0.049041748046875,
0.09808349609375,
-0.0175018310546875,
-0.0010890960693359375,
-0.0172576904296875,
0.06304931640625,
0.0308837890625,
0.03179931640625,
0.05810546875,
0.05792236328125,
0.047882080078125,
-0.0131072998046875,
0.069091796875,
-0.0281982421875,
0.0390625,
0.0802001953125,
0.0227813720703125,
0.04425048828125,
0.0156097412109375,
-0.0377197265625,
0.0156707763671875,
0.06793212890625,
-0.0185699462890625,
0.043426513671875,
0.022125244140625,
-0.00559234619140625,
-0.022857666015625,
0.0181884765625,
-0.0662841796875,
0.04815673828125,
0.01239013671875,
-0.029144287109375,
0.0225830078125,
0.02130126953125,
0.0016231536865234375,
-0.0261383056640625,
-0.015899658203125,
0.042938232421875,
0.0005164146423339844,
-0.0292205810546875,
0.06646728515625,
-0.0056610107421875,
0.06787109375,
-0.0267181396484375,
0.0206146240234375,
-0.0013294219970703125,
-0.0027618408203125,
-0.0175628662109375,
-0.039886474609375,
0.0072479248046875,
-0.0225677490234375,
-0.0250701904296875,
-0.0130615234375,
0.035888671875,
-0.03790283203125,
-0.023681640625,
0.0299224853515625,
0.0295562744140625,
0.0271453857421875,
-0.0135955810546875,
-0.060333251953125,
0.002956390380859375,
0.003665924072265625,
-0.039703369140625,
0.00778961181640625,
0.017425537109375,
0.027130126953125,
0.035675048828125,
0.041015625,
0.00615692138671875,
0.0278778076171875,
0.0003154277801513672,
0.04638671875,
-0.0504150390625,
-0.04278564453125,
-0.0355224609375,
0.031951904296875,
0.0170135498046875,
-0.024383544921875,
0.0538330078125,
0.050689697265625,
0.0723876953125,
-0.01007080078125,
0.0426025390625,
-0.0004982948303222656,
0.0496826171875,
-0.042724609375,
0.06402587890625,
-0.05340576171875,
0.0303802490234375,
-0.02862548828125,
-0.06927490234375,
0.0081024169921875,
0.050567626953125,
-0.019134521484375,
0.0247650146484375,
0.031707763671875,
0.048828125,
-0.0194244384765625,
0.00896453857421875,
0.005741119384765625,
0.0236968994140625,
0.01169586181640625,
0.06256103515625,
0.054229736328125,
-0.0654296875,
0.033294677734375,
-0.0574951171875,
-0.0125885009765625,
-0.01049041748046875,
-0.0249786376953125,
-0.053680419921875,
-0.05596923828125,
-0.02410888671875,
-0.0270538330078125,
-0.0152130126953125,
0.08349609375,
0.0716552734375,
-0.059478759765625,
-0.0291595458984375,
0.01561737060546875,
0.0013284683227539062,
-0.038238525390625,
-0.0174713134765625,
0.055328369140625,
-0.005672454833984375,
-0.056671142578125,
0.045623779296875,
-0.007587432861328125,
0.019744873046875,
-0.01202392578125,
-0.0260467529296875,
-0.017608642578125,
-0.0064239501953125,
0.0297698974609375,
0.01214599609375,
-0.04248046875,
-0.0161590576171875,
-0.0038013458251953125,
0.0016384124755859375,
0.0276947021484375,
0.045989990234375,
-0.0423583984375,
0.0120086669921875,
0.04010009765625,
0.01528167724609375,
0.050689697265625,
-0.00983428955078125,
0.0098114013671875,
-0.043121337890625,
0.0173492431640625,
0.012054443359375,
0.0295257568359375,
0.01148223876953125,
-0.01335906982421875,
0.03900146484375,
0.032196044921875,
-0.053802490234375,
-0.06658935546875,
-0.0036373138427734375,
-0.09368896484375,
-0.0095977783203125,
0.07666015625,
-0.00289154052734375,
-0.0169677734375,
0.00579833984375,
-0.0165252685546875,
0.043487548828125,
-0.0196075439453125,
0.028228759765625,
0.03594970703125,
-0.01490020751953125,
-0.00809478759765625,
-0.060821533203125,
0.054412841796875,
0.034210205078125,
-0.03485107421875,
-0.0167694091796875,
0.03155517578125,
0.04132080078125,
0.0183258056640625,
0.058746337890625,
-0.0166778564453125,
0.02972412109375,
0.01291656494140625,
0.0230865478515625,
-0.025726318359375,
-0.034759521484375,
-0.036224365234375,
0.015777587890625,
-0.0164031982421875,
-0.042327880859375
]
] |
timm/pnasnet5large.tf_in1k | 2023-05-10T01:08:21.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1712.00559",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/pnasnet5large.tf_in1k | 0 | 30,536 | timm | 2023-04-25T21:35:29 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for pnasnet5large.tf_in1k
A PNASNet image classification model. Trained on ImageNet-1k by paper authors. Ported from TensorFlow via Cadene's pretrained-models.pytorch.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 86.1
- GMACs: 25.0
- Activations (M): 92.9
- Image size: 331 x 331
- **Papers:**
- Progressive Neural Architecture Search: https://arxiv.org/abs/1712.00559
- **Original:**
- https://github.com/tensorflow/models
- https://github.com/Cadene/pretrained-models.pytorch
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('pnasnet5large.tf_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
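The returned indices are positions in the ImageNet-1k label space. As a minimal follow-up sketch (assuming the standard class-name list hosted in the pytorch/hub repository, which is not part of this model repo), the top-5 predictions can be printed with human-readable labels:

```python
from urllib.request import urlopen

# assumption: the widely used ImageNet-1k class-name list from pytorch/hub
IMAGENET_CLASSES_URL = 'https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt'
class_names = urlopen(IMAGENET_CLASSES_URL).read().decode('utf-8').splitlines()

# pair up the probabilities and indices produced by the snippet above
for prob, idx in zip(top5_probabilities[0].tolist(), top5_class_indices[0].tolist()):
    print(f'{class_names[idx]}: {prob:.2f}%')
```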
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'pnasnet5large.tf_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 96, 165, 165])
# torch.Size([1, 270, 83, 83])
# torch.Size([1, 1080, 42, 42])
# torch.Size([1, 2160, 21, 21])
# torch.Size([1, 4320, 11, 11])
print(o.shape)
```
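To inspect the channel counts and strides of the returned maps without running an image, timm's feature-extraction models expose a `feature_info` attribute; a small sketch, continuing from the snippet above (the printed values are expectations based on the shapes listed in the comments):

```python
# channels and reduction factor (stride w.r.t. the input) of each feature map
print(model.feature_info.channels())   # e.g. [96, 270, 1080, 2160, 4320]
print(model.feature_info.reduction())  # e.g. [2, 4, 8, 16, 32]
```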
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'pnasnet5large.tf_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 4320, 11, 11) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
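Such embeddings are commonly used for image retrieval. A minimal sketch, assuming a hypothetical second image `img2` has been loaded the same way as `img` above:

```python
import torch.nn.functional as F

# img2: a second PIL image, loaded like img above (hypothetical)
# embed both images with the classifier-less model from above
emb1 = model(transforms(img).unsqueeze(0))   # (1, num_features)
emb2 = model(transforms(img2).unsqueeze(0))  # (1, num_features)

# cosine similarity in [-1, 1]; larger values indicate more similar images
print(F.cosine_similarity(emb1, emb2).item())
```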
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{liu2018progressive,
title={Progressive Neural Architecture Search},
author={Chenxi Liu and Barret Zoph and Maxim Neumann and Jonathon Shlens and Wei Hua and Li-Jia Li and Li Fei-Fei and Alan Yuille and Jonathan Huang and Kevin Murphy},
year={2018},
eprint={1712.00559},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
| 3,843 | [
[
-0.028472900390625,
-0.032684326171875,
0.0004763603210449219,
0.0157623291015625,
-0.0254364013671875,
-0.021392822265625,
-0.0184173583984375,
-0.0185699462890625,
0.0239410400390625,
0.02899169921875,
-0.032867431640625,
-0.05712890625,
-0.050079345703125,
-0.0130767822265625,
-0.009246826171875,
0.0692138671875,
-0.0120086669921875,
-0.0029144287109375,
-0.00997161865234375,
-0.031158447265625,
-0.00881195068359375,
-0.0234222412109375,
-0.0662841796875,
-0.037841796875,
0.0278167724609375,
0.0111083984375,
0.03472900390625,
0.040740966796875,
0.046112060546875,
0.03631591796875,
-0.013031005859375,
0.0012140274047851562,
-0.0175018310546875,
-0.01552581787109375,
0.0328369140625,
-0.045806884765625,
-0.041351318359375,
0.016632080078125,
0.058349609375,
0.035430908203125,
0.0065155029296875,
0.032867431640625,
0.0072021484375,
0.036865234375,
-0.019012451171875,
0.00768280029296875,
-0.0240325927734375,
0.02813720703125,
-0.0160980224609375,
0.0033321380615234375,
-0.019561767578125,
-0.036163330078125,
0.0287322998046875,
-0.0374755859375,
0.03314208984375,
-0.00704193115234375,
0.10235595703125,
0.0186614990234375,
0.0009250640869140625,
-0.004596710205078125,
-0.01393890380859375,
0.05303955078125,
-0.0657958984375,
0.01678466796875,
0.0251922607421875,
0.009307861328125,
-0.0114288330078125,
-0.0867919921875,
-0.038116455078125,
-0.019134521484375,
-0.00978851318359375,
-0.005886077880859375,
-0.01145172119140625,
0.009033203125,
0.0194549560546875,
0.0275115966796875,
-0.037353515625,
0.0078582763671875,
-0.04193115234375,
-0.0158843994140625,
0.036376953125,
-0.0009398460388183594,
0.02166748046875,
-0.0100250244140625,
-0.038848876953125,
-0.0300750732421875,
-0.0235748291015625,
0.0218658447265625,
0.0279693603515625,
0.0117950439453125,
-0.0401611328125,
0.0244293212890625,
0.015411376953125,
0.0426025390625,
0.0097198486328125,
-0.0247650146484375,
0.0535888671875,
0.003345489501953125,
-0.041290283203125,
-0.002838134765625,
0.08013916015625,
0.0178680419921875,
0.0157623291015625,
0.0011539459228515625,
-0.0106048583984375,
-0.027374267578125,
0.0007724761962890625,
-0.08489990234375,
-0.034912109375,
0.0251922607421875,
-0.050872802734375,
-0.03472900390625,
0.0231781005859375,
-0.04766845703125,
-0.0090789794921875,
0.00437164306640625,
0.045867919921875,
-0.0311126708984375,
-0.037841796875,
0.0078277587890625,
-0.01139068603515625,
0.0218505859375,
0.004058837890625,
-0.034454345703125,
0.010040283203125,
0.0195159912109375,
0.09051513671875,
0.00464630126953125,
-0.03863525390625,
-0.007564544677734375,
-0.031219482421875,
-0.0243072509765625,
0.0287628173828125,
-0.0007214546203613281,
-0.007350921630859375,
-0.023223876953125,
0.0266571044921875,
-0.01377105712890625,
-0.0548095703125,
0.0224609375,
-0.018829345703125,
0.016693115234375,
-0.00043392181396484375,
-0.0191192626953125,
-0.04150390625,
0.021392822265625,
-0.0364990234375,
0.087646484375,
0.022613525390625,
-0.05694580078125,
0.0242767333984375,
-0.05462646484375,
-0.013946533203125,
-0.022857666015625,
0.004123687744140625,
-0.09051513671875,
-0.00707244873046875,
0.01030731201171875,
0.057891845703125,
-0.01285552978515625,
0.005950927734375,
-0.041656494140625,
-0.0196380615234375,
0.0277099609375,
-0.00472259521484375,
0.07684326171875,
0.0091094970703125,
-0.038970947265625,
0.0235443115234375,
-0.041351318359375,
0.00919342041015625,
0.038421630859375,
-0.02410888671875,
0.0070037841796875,
-0.042633056640625,
0.00392913818359375,
0.02374267578125,
0.01337432861328125,
-0.044097900390625,
0.01079559326171875,
-0.0166015625,
0.037933349609375,
0.049072265625,
-0.00780487060546875,
0.0262908935546875,
-0.0295257568359375,
0.022216796875,
0.032501220703125,
0.016448974609375,
-0.001361846923828125,
-0.0390625,
-0.06597900390625,
-0.036773681640625,
0.028472900390625,
0.021697998046875,
-0.03472900390625,
0.038482666015625,
-0.010467529296875,
-0.055328369140625,
-0.039215087890625,
0.007389068603515625,
0.0369873046875,
0.045318603515625,
0.0239715576171875,
-0.039764404296875,
-0.04046630859375,
-0.06256103515625,
0.01116180419921875,
-0.001163482666015625,
0.0012826919555664062,
0.0200042724609375,
0.050994873046875,
-0.0006771087646484375,
0.040863037109375,
-0.0286102294921875,
-0.021240234375,
-0.01161956787109375,
0.00927734375,
0.03448486328125,
0.0635986328125,
0.057342529296875,
-0.047454833984375,
-0.0299835205078125,
-0.016265869140625,
-0.07177734375,
0.018829345703125,
-0.00811767578125,
-0.0189666748046875,
0.020721435546875,
0.01137542724609375,
-0.045196533203125,
0.04412841796875,
0.01186370849609375,
-0.0279693603515625,
0.031768798828125,
-0.0170135498046875,
0.01537322998046875,
-0.0810546875,
0.014404296875,
0.0295257568359375,
-0.006305694580078125,
-0.035186767578125,
0.0043182373046875,
-0.0015392303466796875,
-0.006866455078125,
-0.042694091796875,
0.041046142578125,
-0.045684814453125,
-0.020233154296875,
-0.005794525146484375,
-0.0255279541015625,
0.0018529891967773438,
0.059295654296875,
-0.005474090576171875,
0.031524658203125,
0.061279296875,
-0.042449951171875,
0.03765869140625,
0.03228759765625,
-0.0188140869140625,
0.020233154296875,
-0.0496826171875,
0.0243072509765625,
-0.006229400634765625,
0.021942138671875,
-0.07586669921875,
-0.017974853515625,
0.029205322265625,
-0.04833984375,
0.04730224609375,
-0.036346435546875,
-0.0288543701171875,
-0.042327880859375,
-0.03466796875,
0.026275634765625,
0.04962158203125,
-0.0560302734375,
0.033538818359375,
0.019744873046875,
0.0242767333984375,
-0.041412353515625,
-0.073974609375,
-0.0169219970703125,
-0.03179931640625,
-0.054718017578125,
0.02972412109375,
0.01349639892578125,
0.01175689697265625,
0.01248931884765625,
-0.009735107421875,
-0.01186370849609375,
-0.0038585662841796875,
0.0374755859375,
0.0251007080078125,
-0.039031982421875,
-0.004001617431640625,
-0.028106689453125,
0.003139495849609375,
0.0033130645751953125,
-0.0325927734375,
0.052825927734375,
-0.024871826171875,
-0.0079803466796875,
-0.06646728515625,
-0.01275634765625,
0.04046630859375,
-0.013092041015625,
0.06304931640625,
0.0810546875,
-0.040283203125,
-0.0012969970703125,
-0.0274658203125,
-0.0209197998046875,
-0.036468505859375,
0.039215087890625,
-0.0322265625,
-0.03057861328125,
0.06280517578125,
0.0003695487976074219,
0.0086822509765625,
0.04888916015625,
0.0290679931640625,
-0.007293701171875,
0.0482177734375,
0.036041259765625,
0.0164337158203125,
0.05413818359375,
-0.07904052734375,
-0.01380157470703125,
-0.063720703125,
-0.042449951171875,
-0.024505615234375,
-0.056671142578125,
-0.045501708984375,
-0.022613525390625,
0.0298919677734375,
0.01166534423828125,
-0.035614013671875,
0.03460693359375,
-0.06512451171875,
0.007049560546875,
0.04669189453125,
0.049560546875,
-0.035247802734375,
0.01776123046875,
-0.0206146240234375,
0.00395965576171875,
-0.057281494140625,
-0.0194549560546875,
0.0880126953125,
0.043914794921875,
0.038848876953125,
-0.0079193115234375,
0.05303955078125,
-0.02301025390625,
0.022247314453125,
-0.044769287109375,
0.04669189453125,
-0.01776123046875,
-0.0306854248046875,
-0.00957489013671875,
-0.02947998046875,
-0.0804443359375,
0.001583099365234375,
-0.0212249755859375,
-0.047576904296875,
0.01502227783203125,
0.00894927978515625,
-0.0246429443359375,
0.06280517578125,
-0.06622314453125,
0.07745361328125,
-0.01398468017578125,
-0.0303497314453125,
0.00859832763671875,
-0.046356201171875,
0.020599365234375,
0.0179290771484375,
-0.031402587890625,
-0.0041351318359375,
0.00809478759765625,
0.0858154296875,
-0.03778076171875,
0.06512451171875,
-0.043121337890625,
0.0347900390625,
0.03350830078125,
-0.00923919677734375,
0.018096923828125,
-0.01270294189453125,
-0.0107574462890625,
0.0260772705078125,
-0.00370025634765625,
-0.042877197265625,
-0.03887939453125,
0.047454833984375,
-0.0770263671875,
-0.022796630859375,
-0.0272064208984375,
-0.031585693359375,
0.013458251953125,
0.003925323486328125,
0.0374755859375,
0.048065185546875,
0.0188751220703125,
0.024078369140625,
0.049163818359375,
-0.031768798828125,
0.038330078125,
-0.0137481689453125,
-0.01279449462890625,
-0.033935546875,
0.06256103515625,
0.017974853515625,
0.0058746337890625,
0.005504608154296875,
0.01338958740234375,
-0.0302276611328125,
-0.054443359375,
-0.0195465087890625,
0.0221099853515625,
-0.04974365234375,
-0.0306396484375,
-0.042327880859375,
-0.0345458984375,
-0.028472900390625,
-0.0011606216430664062,
-0.03204345703125,
-0.019134521484375,
-0.0347900390625,
0.01422882080078125,
0.044647216796875,
0.043243408203125,
-0.001434326171875,
0.04620361328125,
-0.035797119140625,
0.005001068115234375,
0.0030803680419921875,
0.04205322265625,
-0.001583099365234375,
-0.064453125,
-0.01515960693359375,
-0.01065826416015625,
-0.0361328125,
-0.056884765625,
0.037261962890625,
0.0169830322265625,
0.041412353515625,
0.0288543701171875,
-0.021453857421875,
0.058807373046875,
0.00485992431640625,
0.0435791015625,
0.0256805419921875,
-0.03924560546875,
0.04339599609375,
0.0036258697509765625,
0.0180816650390625,
0.00881195068359375,
0.0298919677734375,
-0.0113677978515625,
0.0010128021240234375,
-0.07330322265625,
-0.057952880859375,
0.064453125,
0.00896453857421875,
-0.001613616943359375,
0.0242156982421875,
0.056304931640625,
0.0005431175231933594,
0.005107879638671875,
-0.04974365234375,
-0.0294647216796875,
-0.033111572265625,
-0.023284912109375,
0.0008645057678222656,
0.002422332763671875,
0.0030765533447265625,
-0.049224853515625,
0.0543212890625,
-0.0017900466918945312,
0.06512451171875,
0.03167724609375,
-0.006328582763671875,
-0.004985809326171875,
-0.03790283203125,
0.038421630859375,
0.025421142578125,
-0.028411865234375,
0.0031642913818359375,
0.01416778564453125,
-0.04730224609375,
0.006397247314453125,
0.009124755859375,
0.00269317626953125,
0.006595611572265625,
0.032562255859375,
0.080810546875,
0.0014705657958984375,
0.01129150390625,
0.026519775390625,
-0.004764556884765625,
-0.0292816162109375,
-0.0256195068359375,
0.006488800048828125,
-0.0009188652038574219,
0.0309295654296875,
0.02325439453125,
0.0299530029296875,
-0.015228271484375,
-0.02044677734375,
0.0226593017578125,
0.034912109375,
-0.01529693603515625,
-0.03131103515625,
0.047271728515625,
-0.0127105712890625,
-0.0207061767578125,
0.06640625,
-0.0066680908203125,
-0.041290283203125,
0.083984375,
0.030517578125,
0.07708740234375,
-0.0018625259399414062,
0.004482269287109375,
0.06561279296875,
0.01263427734375,
-0.00482177734375,
0.01074981689453125,
0.01056671142578125,
-0.053131103515625,
0.002532958984375,
-0.052581787109375,
0.00728607177734375,
0.037445068359375,
-0.03253173828125,
0.0211181640625,
-0.06573486328125,
-0.035552978515625,
0.0167236328125,
0.0298004150390625,
-0.067138671875,
0.016998291015625,
0.0007138252258300781,
0.06390380859375,
-0.0513916015625,
0.061553955078125,
0.07171630859375,
-0.05133056640625,
-0.0806884765625,
0.00569915771484375,
-0.0035610198974609375,
-0.06549072265625,
0.046234130859375,
0.03045654296875,
0.0148468017578125,
0.010833740234375,
-0.05670166015625,
-0.045318603515625,
0.10809326171875,
0.0447998046875,
-0.0022640228271484375,
0.006626129150390625,
-0.00585174560546875,
0.0191192626953125,
-0.035247802734375,
0.0345458984375,
0.01971435546875,
0.0264739990234375,
0.026519775390625,
-0.054443359375,
0.01541900634765625,
-0.021392822265625,
0.0072021484375,
0.0179443359375,
-0.058258056640625,
0.07696533203125,
-0.0299530029296875,
-0.007671356201171875,
0.0023956298828125,
0.056884765625,
0.0174713134765625,
0.010772705078125,
0.038360595703125,
0.062103271484375,
0.045501708984375,
-0.0227203369140625,
0.060455322265625,
-0.00039386749267578125,
0.04364013671875,
0.040740966796875,
0.0309295654296875,
0.03857421875,
0.0291900634765625,
-0.018890380859375,
0.03216552734375,
0.08935546875,
-0.0272216796875,
0.023284912109375,
0.01727294921875,
-0.00325775146484375,
-0.0033092498779296875,
0.01435089111328125,
-0.0277099609375,
0.040435791015625,
0.01092529296875,
-0.04638671875,
-0.01715087890625,
-0.0001100301742553711,
0.002719879150390625,
-0.0275421142578125,
-0.0215911865234375,
0.035858154296875,
0.0004055500030517578,
-0.03350830078125,
0.0650634765625,
0.00556182861328125,
0.072509765625,
-0.0309295654296875,
0.005016326904296875,
-0.0215911865234375,
0.022247314453125,
-0.0303955078125,
-0.06268310546875,
0.0275421142578125,
-0.024810791015625,
0.007297515869140625,
0.0006909370422363281,
0.050567626953125,
-0.028350830078125,
-0.02838134765625,
0.0114288330078125,
0.0192413330078125,
0.044036865234375,
0.004657745361328125,
-0.09234619140625,
0.0088348388671875,
0.001495361328125,
-0.0479736328125,
0.025726318359375,
0.023284912109375,
0.0088348388671875,
0.051055908203125,
0.04449462890625,
-0.00217437744140625,
0.0181884765625,
-0.0195159912109375,
0.05694580078125,
-0.032684326171875,
-0.026519775390625,
-0.0611572265625,
0.046661376953125,
-0.00739288330078125,
-0.053375244140625,
0.0282440185546875,
0.0419921875,
0.06573486328125,
-0.0034236907958984375,
0.027984619140625,
-0.0182342529296875,
-0.00884246826171875,
-0.032562255859375,
0.06494140625,
-0.049041748046875,
-0.007415771484375,
-0.0037670135498046875,
-0.048736572265625,
-0.0255126953125,
0.058502197265625,
-0.0220947265625,
0.03045654296875,
0.03875732421875,
0.07733154296875,
-0.0283966064453125,
-0.0220489501953125,
0.01244354248046875,
0.0177001953125,
0.004375457763671875,
0.037628173828125,
0.0232391357421875,
-0.05621337890625,
0.0245819091796875,
-0.0457763671875,
-0.01000213623046875,
-0.0020198822021484375,
-0.04656982421875,
-0.07666015625,
-0.07098388671875,
-0.04638671875,
-0.055877685546875,
-0.011199951171875,
0.076171875,
0.08380126953125,
-0.052581787109375,
-0.011962890625,
-0.00032806396484375,
0.00583648681640625,
-0.01428985595703125,
-0.01580810546875,
0.05059814453125,
-0.0181427001953125,
-0.051025390625,
-0.020416259765625,
-0.00403594970703125,
0.0295562744140625,
-0.0124664306640625,
-0.0202178955078125,
-0.0230560302734375,
-0.0236968994140625,
0.0112152099609375,
0.024871826171875,
-0.045989990234375,
-0.00807952880859375,
-0.015716552734375,
-0.0090789794921875,
0.037261962890625,
0.03387451171875,
-0.043701171875,
0.0176239013671875,
0.03179931640625,
0.03570556640625,
0.06951904296875,
-0.032379150390625,
0.002590179443359375,
-0.06640625,
0.04266357421875,
0.0018720626831054688,
0.0310516357421875,
0.032867431640625,
-0.0271453857421875,
0.038421630859375,
0.03326416015625,
-0.0291748046875,
-0.0670166015625,
-0.01186370849609375,
-0.0694580078125,
-0.00887298583984375,
0.05987548828125,
-0.042327880859375,
-0.04876708984375,
0.031951904296875,
0.0049591064453125,
0.058746337890625,
-0.0062103271484375,
0.038299560546875,
0.0247650146484375,
-0.0098114013671875,
-0.049560546875,
-0.03887939453125,
0.034942626953125,
0.0159912109375,
-0.047149658203125,
-0.045074462890625,
-0.008270263671875,
0.05413818359375,
0.016845703125,
0.038177490234375,
-0.01448822021484375,
0.01360321044921875,
0.00620269775390625,
0.034088134765625,
-0.038360595703125,
-0.00995635986328125,
-0.0322265625,
0.0125885009765625,
-0.00836181640625,
-0.058380126953125
]
] |
MCG-NJU/videomae-base | 2023-04-22T11:30:29.000Z | [
"transformers",
"pytorch",
"videomae",
"pretraining",
"vision",
"video-classification",
"arxiv:2203.12602",
"arxiv:2111.06377",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"has_space",
"region:us"
] | video-classification | MCG-NJU | null | null | MCG-NJU/videomae-base | 22 | 30,517 | transformers | 2022-08-03T09:27:59 | ---
license: "cc-by-nc-4.0"
tags:
- vision
- video-classification
---
# VideoMAE (base-sized model, pre-trained only)
VideoMAE model pre-trained on Kinetics-400 for 1600 epochs in a self-supervised way. It was introduced in the paper [VideoMAE: Masked Autoencoders are Data-Efficient Learners for Self-Supervised Video Pre-Training](https://arxiv.org/abs/2203.12602) by Tong et al. and first released in [this repository](https://github.com/MCG-NJU/VideoMAE).
Disclaimer: The team releasing VideoMAE did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
VideoMAE is an extension of [Masked Autoencoders (MAE)](https://arxiv.org/abs/2111.06377) to video. The architecture of the model is very similar to that of a standard Vision Transformer (ViT), with a decoder on top for predicting pixel values for masked patches.
Videos are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. A [CLS] token is added to the beginning of the sequence, to be used for classification tasks. Fixed sine/cosine position embeddings are also added before the sequence is fed to the layers of the Transformer encoder.
Through pre-training, the model learns an inner representation of videos that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled videos, for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places the linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire video.
## Intended uses & limitations
You can use the raw model for predicting pixel values for masked patches of a video, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=videomae) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model to predict pixel values for randomly masked patches:
```python
from transformers import VideoMAEImageProcessor, VideoMAEForPreTraining
import numpy as np
import torch
num_frames = 16
# dummy input: a list of 16 RGB frames of shape (3, 224, 224); replace with real video frames
video = list(np.random.randn(num_frames, 3, 224, 224))
processor = VideoMAEImageProcessor.from_pretrained("MCG-NJU/videomae-base")
model = VideoMAEForPreTraining.from_pretrained("MCG-NJU/videomae-base")
pixel_values = processor(video, return_tensors="pt").pixel_values
# sequence length = number of spatio-temporal patches (tubelets) in the video
num_patches_per_frame = (model.config.image_size // model.config.patch_size) ** 2
seq_length = (num_frames // model.config.tubelet_size) * num_patches_per_frame
# boolean mask over the sequence; True marks patches to mask and reconstruct
bool_masked_pos = torch.randint(0, 2, (1, seq_length)).bool()
outputs = model(pixel_values, bool_masked_pos=bool_masked_pos)
loss = outputs.loss  # reconstruction loss on the masked patches
```
For more code examples, we refer to the [documentation](https://huggingface.co/transformers/main/model_doc/videomae.html).
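For the classification setup described above (a linear head on top of the pre-trained encoder), `VideoMAEForVideoClassification` is the usual entry point in Transformers. A minimal sketch; note that `num_labels=10` is an arbitrary assumption for illustration, and the head is randomly initialized until fine-tuned:

```python
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification
import numpy as np
import torch

video = list(np.random.randn(16, 3, 224, 224))  # replace with real video frames

processor = VideoMAEImageProcessor.from_pretrained("MCG-NJU/videomae-base")
# attaches a randomly initialized classification head to the pre-trained encoder;
# num_labels=10 is an arbitrary assumption for illustration
model = VideoMAEForVideoClassification.from_pretrained("MCG-NJU/videomae-base", num_labels=10)

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 10); meaningful only after fine-tuning
```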
## Training data
(to do, feel free to open a PR)
## Training procedure
### Preprocessing
(to do, feel free to open a PR)
### Pretraining
(to do, feel free to open a PR)
## Evaluation results
(to do, feel free to open a PR)
### BibTeX entry and citation info
```bibtex
@misc{https://doi.org/10.48550/arxiv.2203.12602,
doi = {10.48550/ARXIV.2203.12602},
url = {https://arxiv.org/abs/2203.12602},
author = {Tong, Zhan and Song, Yibing and Wang, Jue and Wang, Limin},
keywords = {Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {VideoMAE: Masked Autoencoders are Data-Efficient Learners for Self-Supervised Video Pre-Training},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` | 3,768 | [
[
-0.04736328125,
-0.0279693603515625,
0.005573272705078125,
-0.00389862060546875,
-0.032989501953125,
-0.0019121170043945312,
0.00817108154296875,
-0.004123687744140625,
0.03167724609375,
0.0416259765625,
-0.051239013671875,
-0.029998779296875,
-0.07403564453125,
-0.0291290283203125,
-0.0338134765625,
0.07745361328125,
-0.0130462646484375,
-0.0002899169921875,
-0.0007681846618652344,
-0.00293731689453125,
-0.03082275390625,
-0.04693603515625,
-0.0256500244140625,
-0.01953125,
0.0152587890625,
0.007488250732421875,
0.038543701171875,
0.0751953125,
0.051910400390625,
0.03375244140625,
0.0140380859375,
-0.0089874267578125,
-0.027008056640625,
-0.0231170654296875,
0.01033782958984375,
-0.0251922607421875,
-0.0302734375,
0.0191192626953125,
0.044921875,
0.0205078125,
0.0091705322265625,
0.032989501953125,
-0.005218505859375,
0.01531219482421875,
-0.060150146484375,
0.0001933574676513672,
-0.03643798828125,
0.031219482421875,
0.0010557174682617188,
-0.019989013671875,
-0.016448974609375,
-0.006107330322265625,
0.00482177734375,
-0.02734375,
0.040618896484375,
-0.00936126708984375,
0.08551025390625,
0.03375244140625,
-0.0174407958984375,
0.00704193115234375,
-0.0584716796875,
0.042572021484375,
-0.035919189453125,
0.031463623046875,
0.0194091796875,
0.04681396484375,
0.00955963134765625,
-0.0743408203125,
-0.0400390625,
-0.0037841796875,
-0.002338409423828125,
0.004955291748046875,
-0.00775909423828125,
0.0226898193359375,
0.043701171875,
0.04510498046875,
-0.035675048828125,
0.003910064697265625,
-0.03466796875,
-0.03240966796875,
0.046234130859375,
-0.00011718273162841797,
0.0193634033203125,
-0.0159759521484375,
-0.049652099609375,
-0.025299072265625,
-0.0199737548828125,
0.0148162841796875,
0.01904296875,
-0.0195770263671875,
-0.028289794921875,
0.035858154296875,
-0.004302978515625,
0.042266845703125,
0.029205322265625,
-0.016357421875,
0.03497314453125,
-0.01155853271484375,
-0.052215576171875,
-0.0074310302734375,
0.0576171875,
0.0225982666015625,
0.018585205078125,
0.0029201507568359375,
-0.0260772705078125,
0.01727294921875,
0.0272674560546875,
-0.073974609375,
-0.033111572265625,
0.0001932382583618164,
-0.036529541015625,
-0.024505615234375,
0.03033447265625,
-0.04449462890625,
-0.00272369384765625,
-0.032989501953125,
0.070556640625,
-0.0035400390625,
-0.01520538330078125,
0.0010652542114257812,
-0.005184173583984375,
0.0246734619140625,
0.012603759765625,
-0.058319091796875,
0.035308837890625,
0.020416259765625,
0.0650634765625,
-0.004665374755859375,
-0.0233612060546875,
-0.032012939453125,
-0.00677490234375,
-0.0206451416015625,
0.044891357421875,
-0.01303863525390625,
-0.0022525787353515625,
0.00449371337890625,
0.033355712890625,
-0.004970550537109375,
-0.0377197265625,
0.021484375,
-0.03863525390625,
0.01398468017578125,
-0.0014095306396484375,
-0.0267333984375,
-0.0190582275390625,
0.009033203125,
-0.047271728515625,
0.08489990234375,
0.005214691162109375,
-0.045257568359375,
0.042877197265625,
-0.041656494140625,
0.0072021484375,
-0.007663726806640625,
-0.002956390380859375,
-0.054656982421875,
-0.00572967529296875,
0.0237579345703125,
0.041107177734375,
0.0158843994140625,
0.0111236572265625,
-0.016448974609375,
-0.0312347412109375,
0.0194091796875,
-0.0311126708984375,
0.04583740234375,
0.0195159912109375,
-0.04010009765625,
0.023834228515625,
-0.0596923828125,
-0.01206207275390625,
-0.01036834716796875,
0.006694793701171875,
0.01183319091796875,
-0.0258026123046875,
0.0028934478759765625,
0.039520263671875,
-0.0009670257568359375,
-0.052154541015625,
0.006366729736328125,
-0.0072479248046875,
0.034515380859375,
0.06683349609375,
0.0023632049560546875,
0.0384521484375,
-0.0108642578125,
0.0469970703125,
0.018341064453125,
0.0367431640625,
-0.031005859375,
-0.0299224853515625,
-0.060333251953125,
-0.0352783203125,
0.0203857421875,
0.02813720703125,
-0.0440673828125,
0.0305938720703125,
-0.005817413330078125,
-0.035491943359375,
-0.038787841796875,
0.0122528076171875,
0.030181884765625,
0.0294647216796875,
0.032440185546875,
-0.0465087890625,
-0.06982421875,
-0.057708740234375,
0.020294189453125,
0.015655517578125,
-0.0116729736328125,
0.01319122314453125,
0.056427001953125,
-0.01273345947265625,
0.0662841796875,
-0.0322265625,
-0.0235748291015625,
0.006198883056640625,
0.0009908676147460938,
0.022003173828125,
0.054351806640625,
0.0304412841796875,
-0.059051513671875,
-0.0279998779296875,
-0.0225830078125,
-0.057952880859375,
0.003009796142578125,
-0.01045989990234375,
-0.0179901123046875,
-0.00838470458984375,
0.037872314453125,
-0.04156494140625,
0.06280517578125,
0.03704833984375,
-0.018585205078125,
0.023162841796875,
-0.019073486328125,
0.0169830322265625,
-0.0714111328125,
-0.006866455078125,
-0.002162933349609375,
-0.030181884765625,
-0.044677734375,
0.00006771087646484375,
-0.006298065185546875,
-0.0187225341796875,
-0.050048828125,
0.031524658203125,
-0.03460693359375,
-0.0276641845703125,
-0.0276031494140625,
-0.017852783203125,
-0.0088043212890625,
0.059051513671875,
0.014923095703125,
0.037200927734375,
0.0535888671875,
-0.051666259765625,
0.039703369140625,
0.0061187744140625,
-0.029144287109375,
0.022369384765625,
-0.051422119140625,
0.0134124755859375,
-0.0189056396484375,
0.0132904052734375,
-0.06610107421875,
-0.03271484375,
0.0232391357421875,
-0.0400390625,
0.034332275390625,
-0.0270843505859375,
-0.0132904052734375,
-0.0439453125,
-0.011871337890625,
0.057403564453125,
0.054290771484375,
-0.04376220703125,
0.0325927734375,
0.04534912109375,
0.03790283203125,
-0.052215576171875,
-0.0546875,
-0.010406494140625,
-0.0234375,
-0.03900146484375,
0.035064697265625,
-0.007091522216796875,
0.0158233642578125,
0.002109527587890625,
-0.0015649795532226562,
-0.0226898193359375,
-0.016998291015625,
0.044097900390625,
0.0211334228515625,
-0.0192718505859375,
-0.00208282470703125,
-0.03131103515625,
-0.023040771484375,
0.0079803466796875,
-0.0307769775390625,
0.043670654296875,
-0.01102447509765625,
-0.03179931640625,
-0.03497314453125,
0.015167236328125,
0.0301971435546875,
-0.016387939453125,
0.038238525390625,
0.07598876953125,
-0.057098388671875,
-0.0021839141845703125,
-0.051025390625,
-0.0206146240234375,
-0.040771484375,
0.03424072265625,
-0.028228759765625,
-0.040802001953125,
0.058746337890625,
0.019775390625,
-0.0258331298828125,
0.044586181640625,
0.053070068359375,
-0.015777587890625,
0.06597900390625,
0.056915283203125,
0.01309967041015625,
0.05206298828125,
-0.055145263671875,
-0.0039215087890625,
-0.055419921875,
-0.03436279296875,
-0.0028324127197265625,
-0.03302001953125,
-0.0372314453125,
-0.04205322265625,
0.0256500244140625,
0.0193328857421875,
-0.040283203125,
0.03692626953125,
-0.046966552734375,
0.039947509765625,
0.04583740234375,
0.02923583984375,
-0.0165863037109375,
0.006565093994140625,
0.0012493133544921875,
-0.00931549072265625,
-0.05560302734375,
-0.0159759521484375,
0.068115234375,
0.05029296875,
0.05157470703125,
-0.0204620361328125,
0.05206298828125,
0.0241241455078125,
0.0106201171875,
-0.05108642578125,
0.040740966796875,
-0.007160186767578125,
-0.0386962890625,
-0.002979278564453125,
-0.00563812255859375,
-0.063232421875,
-0.005878448486328125,
-0.0242919921875,
-0.046844482421875,
0.024810791015625,
0.030364990234375,
-0.0218048095703125,
0.045196533203125,
-0.044921875,
0.0771484375,
-0.005100250244140625,
-0.0184173583984375,
0.005229949951171875,
-0.05145263671875,
0.019775390625,
0.0083160400390625,
0.0014944076538085938,
0.0231781005859375,
0.0182647705078125,
0.070068359375,
-0.051788330078125,
0.07110595703125,
-0.0273895263671875,
0.0305328369140625,
0.0498046875,
-0.014923095703125,
0.0189971923828125,
-0.004619598388671875,
0.040771484375,
0.0236663818359375,
-0.005550384521484375,
-0.039154052734375,
-0.0528564453125,
0.0246734619140625,
-0.0709228515625,
-0.035980224609375,
-0.03253173828125,
-0.02899169921875,
0.0235137939453125,
0.01000213623046875,
0.051361083984375,
0.04815673828125,
0.01267242431640625,
0.016998291015625,
0.05877685546875,
-0.0208587646484375,
0.04974365234375,
-0.0019702911376953125,
-0.015655517578125,
-0.04058837890625,
0.05657958984375,
0.017974853515625,
0.0305023193359375,
0.0206756591796875,
0.006526947021484375,
-0.0088653564453125,
-0.02880859375,
-0.0281219482421875,
0.00768280029296875,
-0.061737060546875,
-0.02325439453125,
-0.030364990234375,
-0.05596923828125,
-0.0322265625,
-0.01473236083984375,
-0.036407470703125,
-0.017486572265625,
-0.0168914794921875,
-0.0240478515625,
0.02099609375,
0.043426513671875,
-0.0295257568359375,
0.04803466796875,
-0.048614501953125,
0.026275634765625,
0.042205810546875,
0.0357666015625,
-0.017669677734375,
-0.0626220703125,
-0.036956787109375,
0.0031681060791015625,
-0.022674560546875,
-0.055328369140625,
0.042236328125,
0.0089874267578125,
0.0518798828125,
0.036163330078125,
-0.02691650390625,
0.06817626953125,
-0.033660888671875,
0.0601806640625,
0.026214599609375,
-0.0625,
0.042083740234375,
-0.0101470947265625,
0.017852783203125,
0.0130157470703125,
0.0301971435546875,
-0.017425537109375,
0.00630950927734375,
-0.0633544921875,
-0.03955078125,
0.047698974609375,
0.017608642578125,
0.01654052734375,
0.01239013671875,
0.03448486328125,
-0.0035495758056640625,
0.01044464111328125,
-0.071044921875,
-0.021697998046875,
-0.06585693359375,
-0.0001665353775024414,
-0.01338958740234375,
-0.0225830078125,
-0.006866455078125,
-0.03509521484375,
0.03900146484375,
0.0028133392333984375,
0.06231689453125,
0.0283355712890625,
-0.02496337890625,
-0.0198822021484375,
-0.0216522216796875,
0.0411376953125,
0.021728515625,
-0.04559326171875,
0.01247406005859375,
0.0128936767578125,
-0.058929443359375,
0.00945281982421875,
-0.0185394287109375,
-0.006877899169921875,
0.01019287109375,
0.02685546875,
0.087158203125,
0.01085662841796875,
-0.010406494140625,
0.061309814453125,
0.022369384765625,
-0.0237274169921875,
-0.04010009765625,
-0.0006380081176757812,
-0.03509521484375,
0.018951416015625,
0.019775390625,
0.02471923828125,
0.00449371337890625,
-0.03594970703125,
0.041168212890625,
0.0268402099609375,
-0.03900146484375,
-0.0311126708984375,
0.062164306640625,
-0.00931549072265625,
-0.028564453125,
0.037353515625,
0.001750946044921875,
-0.053314208984375,
0.06793212890625,
0.0292816162109375,
0.078125,
-0.02734375,
0.021697998046875,
0.052276611328125,
0.0228729248046875,
-0.00004500150680541992,
-0.0028972625732421875,
-0.01068115234375,
-0.052764892578125,
-0.040771484375,
-0.042205810546875,
-0.00141143798828125,
0.0164947509765625,
-0.054351806640625,
0.043914794921875,
-0.040985107421875,
-0.0173797607421875,
-0.005390167236328125,
-0.005214691162109375,
-0.083251953125,
0.04559326171875,
0.03643798828125,
0.06536865234375,
-0.07537841796875,
0.07598876953125,
0.030975341796875,
-0.04132080078125,
-0.06005859375,
-0.03558349609375,
-0.01303863525390625,
-0.0693359375,
0.055999755859375,
0.0202484130859375,
0.01190185546875,
-0.005184173583984375,
-0.048675537109375,
-0.07244873046875,
0.0848388671875,
0.0179443359375,
-0.01666259765625,
-0.005340576171875,
0.0168609619140625,
0.031982421875,
-0.054656982421875,
0.05450439453125,
0.006378173828125,
0.005828857421875,
0.033599853515625,
-0.059539794921875,
-0.01186370849609375,
-0.023345947265625,
0.004535675048828125,
0.00725555419921875,
-0.039459228515625,
0.07940673828125,
-0.00785064697265625,
0.005977630615234375,
-0.01412200927734375,
0.047515869140625,
-0.0101318359375,
0.0206146240234375,
0.04693603515625,
0.048095703125,
0.0203094482421875,
-0.004215240478515625,
0.0753173828125,
-0.00867462158203125,
0.03375244140625,
0.06231689453125,
0.032012939453125,
0.04437255859375,
0.0256805419921875,
-0.01088714599609375,
0.05157470703125,
0.0660400390625,
-0.034027099609375,
0.05499267578125,
0.005153656005859375,
0.002307891845703125,
-0.0302734375,
0.016998291015625,
-0.032867431640625,
0.04443359375,
0.01445770263671875,
-0.048492431640625,
-0.0030918121337890625,
0.0251617431640625,
-0.0178070068359375,
-0.0206146240234375,
-0.0445556640625,
0.0478515625,
0.0032939910888671875,
-0.042755126953125,
0.04949951171875,
-0.017547607421875,
0.034881591796875,
-0.04815673828125,
-0.00978851318359375,
-0.01224517822265625,
0.0189056396484375,
-0.035400390625,
-0.047607421875,
0.02728271484375,
-0.00597381591796875,
-0.016265869140625,
-0.0118408203125,
0.04705810546875,
-0.0312347412109375,
-0.0440673828125,
0.00606536865234375,
0.00024509429931640625,
0.026611328125,
0.002819061279296875,
-0.04150390625,
-0.002796173095703125,
-0.01470184326171875,
-0.025787353515625,
0.036285400390625,
0.01247406005859375,
-0.0030727386474609375,
0.043731689453125,
0.04534912109375,
-0.00994873046875,
0.040740966796875,
-0.00385284423828125,
0.07025146484375,
-0.043426513671875,
-0.04595947265625,
-0.056488037109375,
0.06353759765625,
-0.007648468017578125,
-0.01873779296875,
0.051788330078125,
0.034942626953125,
0.0753173828125,
-0.0218353271484375,
0.03033447265625,
0.00661468505859375,
0.00482940673828125,
-0.0465087890625,
0.0288543701171875,
-0.023681640625,
0.004901885986328125,
-0.03057861328125,
-0.07501220703125,
-0.021728515625,
0.0606689453125,
-0.01043701171875,
0.0072479248046875,
0.040191650390625,
0.06842041015625,
-0.0286102294921875,
-0.0195159912109375,
0.0283203125,
0.0204620361328125,
0.00792694091796875,
0.040283203125,
0.040740966796875,
-0.06500244140625,
0.03546142578125,
-0.05126953125,
-0.0274200439453125,
-0.00254058837890625,
-0.06524658203125,
-0.07861328125,
-0.048858642578125,
-0.04449462890625,
-0.040771484375,
-0.00667572021484375,
0.063720703125,
0.0902099609375,
-0.067626953125,
-0.01288604736328125,
-0.00799560546875,
-0.014129638671875,
-0.01537322998046875,
-0.01285552978515625,
0.037811279296875,
0.0004935264587402344,
-0.05255126953125,
-0.007106781005859375,
0.0032596588134765625,
0.007015228271484375,
-0.0300140380859375,
-0.01433563232421875,
-0.01229095458984375,
-0.0036678314208984375,
0.035888671875,
0.0279998779296875,
-0.05267333984375,
-0.045501708984375,
0.0016536712646484375,
0.0012054443359375,
0.0261383056640625,
0.0653076171875,
-0.08056640625,
0.04620361328125,
0.03277587890625,
0.030364990234375,
0.0849609375,
-0.01113128662109375,
0.029052734375,
-0.06231689453125,
0.031768798828125,
-0.0007448196411132812,
0.03863525390625,
0.01432037353515625,
-0.0272216796875,
0.03143310546875,
0.03363037109375,
-0.045257568359375,
-0.06549072265625,
0.01415252685546875,
-0.0894775390625,
0.0026340484619140625,
0.07830810546875,
-0.0252227783203125,
-0.0163726806640625,
0.01313018798828125,
-0.0167388916015625,
0.04541015625,
0.005859375,
0.051361083984375,
0.0275726318359375,
0.010009765625,
-0.05145263671875,
-0.037994384765625,
0.04400634765625,
0.00785064697265625,
-0.0323486328125,
-0.044769287109375,
0.007053375244140625,
0.0276641845703125,
0.01416778564453125,
0.040130615234375,
-0.00942230224609375,
0.027801513671875,
0.016998291015625,
0.0178070068359375,
-0.0262603759765625,
-0.025482177734375,
-0.034637451171875,
0.01529693603515625,
-0.033935546875,
-0.055633544921875
]
] |
timm/vgg19.tv_in1k | 2023-04-25T20:16:16.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1409.1556",
"license:bsd-3-clause",
"has_space",
"region:us"
] | image-classification | timm | null | null | timm/vgg19.tv_in1k | 0 | 30,439 | timm | 2023-04-25T20:14:08 | ---
tags:
- image-classification
- timm
library_name: timm
license: bsd-3-clause
datasets:
- imagenet-1k
---
# Model card for vgg19.tv_in1k
A VGG image classification model. Trained on ImageNet-1k using the original torchvision weights.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 143.7
- GMACs: 19.6
- Activations (M): 14.9
- Image size: 224 x 224
- **Papers:**
- Very Deep Convolutional Networks for Large-Scale Image Recognition: https://arxiv.org/abs/1409.1556
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/pytorch/vision
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vgg19.tv_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vgg19.tv_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 224, 224])
# torch.Size([1, 128, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 512, 14, 14])
# torch.Size([1, 512, 7, 7])
print(o.shape)
```
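If only some of those maps are needed, `timm.create_model` generally accepts an `out_indices` argument together with `features_only=True`; a small sketch (the chosen indices are just an example):

```python
import timm

# assumption for illustration: keep only the two deepest of the six stages above
model = timm.create_model(
    'vgg19.tv_in1k',
    pretrained=True,
    features_only=True,
    out_indices=(4, 5),
)
model = model.eval()
print(model.feature_info.channels())  # e.g. [512, 512]
```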
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vgg19.tv_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 512, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{Simonyan2014VeryDC,
title={Very Deep Convolutional Networks for Large-Scale Image Recognition},
author={Karen Simonyan and Andrew Zisserman},
journal={CoRR},
year={2014},
volume={abs/1409.1556}
}
```
| 3,634 | [
[
-0.036590576171875,
-0.0362548828125,
-0.00010943412780761719,
0.0019741058349609375,
-0.0296783447265625,
-0.0210418701171875,
-0.0204925537109375,
-0.02960205078125,
0.01172637939453125,
0.0312347412109375,
-0.0303192138671875,
-0.058746337890625,
-0.055877685546875,
-0.01534271240234375,
-0.01099395751953125,
0.07550048828125,
0.0009169578552246094,
0.00994873046875,
-0.00730133056640625,
-0.032135009765625,
-0.00923919677734375,
-0.0267181396484375,
-0.05780029296875,
-0.0396728515625,
0.0175933837890625,
0.01416778564453125,
0.03497314453125,
0.040130615234375,
0.041534423828125,
0.03680419921875,
-0.009185791015625,
0.0034389495849609375,
-0.02545166015625,
-0.027862548828125,
0.029052734375,
-0.04498291015625,
-0.0260162353515625,
0.0190277099609375,
0.05023193359375,
0.0219879150390625,
0.003253936767578125,
0.026519775390625,
0.0062713623046875,
0.0281524658203125,
-0.01015472412109375,
0.0106964111328125,
-0.036376953125,
0.0205230712890625,
-0.0077362060546875,
0.0081329345703125,
-0.0187530517578125,
-0.0284271240234375,
0.0224456787109375,
-0.037933349609375,
0.03765869140625,
-0.0015554428100585938,
0.1060791015625,
0.01073455810546875,
-0.0016832351684570312,
-0.00473785400390625,
-0.019378662109375,
0.061309814453125,
-0.066650390625,
0.01325225830078125,
0.0235748291015625,
0.0159912109375,
-0.007354736328125,
-0.07672119140625,
-0.045196533203125,
-0.02008056640625,
-0.0113372802734375,
-0.0044708251953125,
-0.01169586181640625,
0.003734588623046875,
0.0270538330078125,
0.0267181396484375,
-0.038238525390625,
0.01116180419921875,
-0.04681396484375,
-0.02020263671875,
0.04345703125,
0.0014858245849609375,
0.0198974609375,
-0.01515960693359375,
-0.040191650390625,
-0.026611328125,
-0.02752685546875,
0.020904541015625,
0.020904541015625,
0.01227569580078125,
-0.04669189453125,
0.0369873046875,
0.00931549072265625,
0.0400390625,
0.01052093505859375,
-0.02642822265625,
0.053619384765625,
0.0030231475830078125,
-0.04473876953125,
0.0021209716796875,
0.08013916015625,
0.0290985107421875,
0.0241546630859375,
0.0125732421875,
-0.0007796287536621094,
-0.0274200439453125,
-0.004878997802734375,
-0.0875244140625,
-0.034423828125,
0.025146484375,
-0.04254150390625,
-0.031646728515625,
0.0234375,
-0.047393798828125,
-0.01580810546875,
-0.00811004638671875,
0.054931640625,
-0.034393310546875,
-0.0207061767578125,
0.0133056640625,
-0.014801025390625,
0.036529541015625,
0.0200653076171875,
-0.041778564453125,
0.0068511962890625,
0.021697998046875,
0.0733642578125,
0.01409912109375,
-0.033050537109375,
-0.02752685546875,
-0.0275421142578125,
-0.0175933837890625,
0.0302734375,
-0.01009368896484375,
-0.0111236572265625,
-0.01554107666015625,
0.0362548828125,
-0.002498626708984375,
-0.0556640625,
0.02093505859375,
-0.01340484619140625,
0.0236968994140625,
-0.006130218505859375,
-0.0211181640625,
-0.032623291015625,
0.0233154296875,
-0.025054931640625,
0.089111328125,
0.0238037109375,
-0.057952880859375,
0.03656005859375,
-0.0245819091796875,
-0.00844573974609375,
-0.00949859619140625,
-0.005706787109375,
-0.08892822265625,
-0.00016582012176513672,
0.0152435302734375,
0.058319091796875,
-0.017333984375,
0.003582000732421875,
-0.042388916015625,
-0.0229339599609375,
0.022125244140625,
-0.007289886474609375,
0.071044921875,
0.0037326812744140625,
-0.037322998046875,
0.024749755859375,
-0.0469970703125,
0.0159149169921875,
0.037445068359375,
-0.0269317626953125,
0.00016319751739501953,
-0.05047607421875,
0.0206298828125,
0.0222015380859375,
0.01169586181640625,
-0.043548583984375,
0.03594970703125,
-0.003238677978515625,
0.032989501953125,
0.050872802734375,
-0.021881103515625,
0.016998291015625,
-0.01543426513671875,
0.0167083740234375,
0.019989013671875,
0.0158843994140625,
0.00537109375,
-0.04241943359375,
-0.055816650390625,
-0.040252685546875,
0.0278472900390625,
0.0292510986328125,
-0.0391845703125,
0.042327880859375,
-0.005550384521484375,
-0.05633544921875,
-0.03564453125,
0.00998687744140625,
0.038055419921875,
0.037750244140625,
0.0207672119140625,
-0.0355224609375,
-0.0413818359375,
-0.06024169921875,
0.01690673828125,
0.004337310791015625,
-0.0041351318359375,
0.02191162109375,
0.051300048828125,
-0.0078887939453125,
0.036529541015625,
-0.03424072265625,
-0.0263671875,
-0.0197906494140625,
0.00867462158203125,
0.035186767578125,
0.051788330078125,
0.06146240234375,
-0.0472412109375,
-0.03662109375,
-0.002132415771484375,
-0.07891845703125,
0.01377105712890625,
-0.00580596923828125,
-0.01105499267578125,
0.015533447265625,
0.020416259765625,
-0.051910400390625,
0.044952392578125,
0.0169677734375,
-0.03143310546875,
0.044281005859375,
-0.026824951171875,
0.01708984375,
-0.078369140625,
0.0037097930908203125,
0.03448486328125,
-0.0157928466796875,
-0.031280517578125,
-0.0012302398681640625,
-0.0017414093017578125,
0.0019683837890625,
-0.045684814453125,
0.045654296875,
-0.038055419921875,
-0.0125732421875,
-0.01186370849609375,
-0.0192108154296875,
-0.0038433074951171875,
0.052978515625,
-0.005985260009765625,
0.030670166015625,
0.068603515625,
-0.036224365234375,
0.047637939453125,
0.0338134765625,
-0.0249786376953125,
0.03338623046875,
-0.051727294921875,
0.00885009765625,
-0.005352020263671875,
0.016632080078125,
-0.0775146484375,
-0.023406982421875,
0.0276947021484375,
-0.04559326171875,
0.0521240234375,
-0.040924072265625,
-0.0267181396484375,
-0.047393798828125,
-0.039459228515625,
0.024505615234375,
0.056121826171875,
-0.0540771484375,
0.021209716796875,
0.0150909423828125,
0.025146484375,
-0.04345703125,
-0.06365966796875,
-0.01512908935546875,
-0.032318115234375,
-0.0406494140625,
0.0267181396484375,
0.00811004638671875,
0.009979248046875,
0.0088348388671875,
-0.00550079345703125,
-0.0012359619140625,
-0.0176544189453125,
0.034912109375,
0.0263671875,
-0.0265350341796875,
-0.003971099853515625,
-0.030670166015625,
-0.0017900466918945312,
0.0012359619140625,
-0.0238494873046875,
0.05267333984375,
-0.018218994140625,
-0.01111602783203125,
-0.060516357421875,
-0.00836944580078125,
0.0391845703125,
-0.0087127685546875,
0.0635986328125,
0.099365234375,
-0.040863037109375,
-0.00033164024353027344,
-0.0255279541015625,
-0.01552581787109375,
-0.03759765625,
0.040771484375,
-0.028076171875,
-0.036163330078125,
0.0633544921875,
-0.0008044242858886719,
0.0005807876586914062,
0.05072021484375,
0.0391845703125,
-0.003917694091796875,
0.05224609375,
0.041595458984375,
0.005306243896484375,
0.0562744140625,
-0.080322265625,
-0.011871337890625,
-0.060211181640625,
-0.043609619140625,
-0.024627685546875,
-0.04266357421875,
-0.047821044921875,
-0.0238800048828125,
0.028076171875,
0.0184783935546875,
-0.02874755859375,
0.0364990234375,
-0.0621337890625,
0.00923919677734375,
0.0565185546875,
0.04559326171875,
-0.0283966064453125,
0.0181884765625,
-0.0106964111328125,
-0.0003809928894042969,
-0.0469970703125,
-0.019805908203125,
0.09014892578125,
0.03173828125,
0.05865478515625,
-0.0179595947265625,
0.051116943359375,
-0.0109710693359375,
0.0170745849609375,
-0.051849365234375,
0.0445556640625,
-0.0118255615234375,
-0.033782958984375,
-0.018157958984375,
-0.025115966796875,
-0.079833984375,
0.00785064697265625,
-0.0270538330078125,
-0.06561279296875,
0.006252288818359375,
0.0092010498046875,
-0.015625,
0.05731201171875,
-0.065185546875,
0.0751953125,
-0.011566162109375,
-0.0303802490234375,
0.0065765380859375,
-0.046539306640625,
0.021820068359375,
0.0124053955078125,
-0.0265655517578125,
-0.0050201416015625,
0.025360107421875,
0.078857421875,
-0.042449951171875,
0.060699462890625,
-0.0364990234375,
0.0235748291015625,
0.039459228515625,
-0.0226898193359375,
0.02947998046875,
-0.004016876220703125,
-0.005184173583984375,
0.0225372314453125,
0.00165557861328125,
-0.040374755859375,
-0.0325927734375,
0.050018310546875,
-0.0758056640625,
-0.032012939453125,
-0.030609130859375,
-0.0296630859375,
0.019287109375,
0.003978729248046875,
0.04931640625,
0.05023193359375,
0.0142822265625,
0.0242156982421875,
0.053192138671875,
-0.03399658203125,
0.034698486328125,
-0.0181884765625,
-0.0199127197265625,
-0.033416748046875,
0.061614990234375,
0.01837158203125,
0.01073455810546875,
0.0027637481689453125,
0.01287078857421875,
-0.0235748291015625,
-0.05157470703125,
-0.022796630859375,
0.0299072265625,
-0.045501708984375,
-0.0311279296875,
-0.041534423828125,
-0.0401611328125,
-0.030059814453125,
-0.006591796875,
-0.0261383056640625,
-0.01139068603515625,
-0.031494140625,
0.00811004638671875,
0.05389404296875,
0.047698974609375,
-0.0067291259765625,
0.0494384765625,
-0.0419921875,
0.0160675048828125,
0.02099609375,
0.032196044921875,
-0.01468658447265625,
-0.06622314453125,
-0.02740478515625,
-0.009613037109375,
-0.040802001953125,
-0.049957275390625,
0.04132080078125,
0.01270294189453125,
0.040985107421875,
0.031158447265625,
-0.021820068359375,
0.056793212890625,
-0.00450897216796875,
0.048248291015625,
0.02886962890625,
-0.048980712890625,
0.036376953125,
-0.004535675048828125,
0.0193939208984375,
0.002727508544921875,
0.023040771484375,
-0.01197052001953125,
-0.00010919570922851562,
-0.06988525390625,
-0.050811767578125,
0.06378173828125,
0.00933074951171875,
0.0025768280029296875,
0.035552978515625,
0.045867919921875,
0.007480621337890625,
0.01282501220703125,
-0.054046630859375,
-0.030059814453125,
-0.0220794677734375,
-0.0199737548828125,
-0.00862884521484375,
-0.003170013427734375,
-0.0060272216796875,
-0.056854248046875,
0.0484619140625,
-0.006298065185546875,
0.065185546875,
0.02471923828125,
0.0002818107604980469,
-0.00791168212890625,
-0.032196044921875,
0.037322998046875,
0.026458740234375,
-0.0303192138671875,
0.0032520294189453125,
0.0196685791015625,
-0.05023193359375,
0.004749298095703125,
0.0147857666015625,
0.01432037353515625,
0.007793426513671875,
0.03753662109375,
0.067626953125,
-0.00690460205078125,
0.0000820159912109375,
0.033599853515625,
-0.004795074462890625,
-0.03436279296875,
-0.0189971923828125,
0.00876617431640625,
-0.004535675048828125,
0.040435791015625,
0.031494140625,
0.01447296142578125,
-0.0126190185546875,
-0.031524658203125,
0.02630615234375,
0.050018310546875,
-0.024810791015625,
-0.0293731689453125,
0.05267333984375,
-0.0190582275390625,
-0.0142364501953125,
0.06329345703125,
-0.0131072998046875,
-0.04217529296875,
0.0865478515625,
0.0301513671875,
0.0687255859375,
-0.007709503173828125,
0.0108642578125,
0.057098388671875,
0.0166168212890625,
0.0032138824462890625,
0.0179443359375,
0.017425537109375,
-0.04730224609375,
-0.0010852813720703125,
-0.04339599609375,
0.0019702911376953125,
0.04449462890625,
-0.03240966796875,
0.03631591796875,
-0.055938720703125,
-0.0302581787109375,
0.0190582275390625,
0.0280303955078125,
-0.07598876953125,
0.02685546875,
0.00728607177734375,
0.05364990234375,
-0.06097412109375,
0.06597900390625,
0.060638427734375,
-0.034881591796875,
-0.0670166015625,
-0.006717681884765625,
-0.0007309913635253906,
-0.0787353515625,
0.03704833984375,
0.0377197265625,
0.0233154296875,
0.00015211105346679688,
-0.0740966796875,
-0.04119873046875,
0.10009765625,
0.0401611328125,
-0.0138092041015625,
0.01226043701171875,
-0.007740020751953125,
0.0221099853515625,
-0.030609130859375,
0.030487060546875,
0.01163482666015625,
0.0265960693359375,
0.027801513671875,
-0.059051513671875,
0.01202392578125,
-0.0155792236328125,
-0.0041351318359375,
0.0210418701171875,
-0.060546875,
0.06884765625,
-0.03851318359375,
-0.0002263784408569336,
0.0023784637451171875,
0.048431396484375,
0.017974853515625,
0.01824951171875,
0.02984619140625,
0.06317138671875,
0.039459228515625,
-0.0242156982421875,
0.0584716796875,
0.0100555419921875,
0.05621337890625,
0.042266845703125,
0.027801513671875,
0.03033447265625,
0.0302276611328125,
-0.025543212890625,
0.0181121826171875,
0.081298828125,
-0.03656005859375,
0.032012939453125,
0.0272674560546875,
-0.0006937980651855469,
-0.0140533447265625,
0.005504608154296875,
-0.0396728515625,
0.03985595703125,
0.0152435302734375,
-0.042938232421875,
-0.0191650390625,
0.01287841796875,
0.004791259765625,
-0.0311431884765625,
-0.0203094482421875,
0.034210205078125,
0.0021648406982421875,
-0.016845703125,
0.067626953125,
-0.0015993118286132812,
0.06488037109375,
-0.0361328125,
-0.003139495849609375,
-0.0106964111328125,
0.018524169921875,
-0.0290679931640625,
-0.068359375,
0.014892578125,
-0.021331787109375,
0.00818634033203125,
0.0111541748046875,
0.04345703125,
-0.0291290283203125,
-0.038055419921875,
0.0123443603515625,
0.0193939208984375,
0.039947509765625,
-0.000919342041015625,
-0.085205078125,
0.01488494873046875,
0.007080078125,
-0.05023193359375,
0.029754638671875,
0.037811279296875,
0.00971221923828125,
0.052032470703125,
0.042724609375,
-0.007015228271484375,
0.0151519775390625,
-0.017425537109375,
0.061676025390625,
-0.04400634765625,
-0.009185791015625,
-0.06353759765625,
0.05023193359375,
-0.0091400146484375,
-0.048431396484375,
0.03765869140625,
0.04248046875,
0.072021484375,
-0.0096588134765625,
0.040557861328125,
-0.01995849609375,
-0.02032470703125,
-0.03851318359375,
0.050323486328125,
-0.051025390625,
-0.01270294189453125,
-0.018341064453125,
-0.0548095703125,
-0.0290069580078125,
0.0489501953125,
-0.0199737548828125,
0.0277557373046875,
0.033294677734375,
0.06805419921875,
-0.028472900390625,
-0.033905029296875,
0.0242462158203125,
0.01357269287109375,
0.01482391357421875,
0.040557861328125,
0.023834228515625,
-0.062255859375,
0.027740478515625,
-0.037506103515625,
-0.0128936767578125,
-0.0171661376953125,
-0.04345703125,
-0.0806884765625,
-0.06658935546875,
-0.0513916015625,
-0.05438232421875,
-0.0188140869140625,
0.06903076171875,
0.08526611328125,
-0.05242919921875,
-0.007099151611328125,
0.00719451904296875,
0.0091552734375,
-0.016204833984375,
-0.0155792236328125,
0.04986572265625,
-0.00417327880859375,
-0.0633544921875,
-0.032379150390625,
-0.0062103271484375,
0.03314208984375,
-0.0049896240234375,
-0.01386260986328125,
-0.01436614990234375,
-0.021514892578125,
0.02459716796875,
0.02801513671875,
-0.050384521484375,
-0.0238494873046875,
-0.0218658447265625,
-0.0123291015625,
0.035888671875,
0.0245819091796875,
-0.046142578125,
0.0265960693359375,
0.02935791015625,
0.026519775390625,
0.07159423828125,
-0.024810791015625,
-0.001987457275390625,
-0.05279541015625,
0.038330078125,
-0.018341064453125,
0.034637451171875,
0.0292205810546875,
-0.0286102294921875,
0.036956787109375,
0.032135009765625,
-0.038909912109375,
-0.064208984375,
-0.007717132568359375,
-0.09033203125,
-0.00794219970703125,
0.06640625,
-0.0273590087890625,
-0.040802001953125,
0.034332275390625,
-0.01995849609375,
0.05224609375,
-0.00969696044921875,
0.040130615234375,
0.0260009765625,
-0.0059356689453125,
-0.0496826171875,
-0.03997802734375,
0.03564453125,
0.0009136199951171875,
-0.05059814453125,
-0.036224365234375,
-0.000797271728515625,
0.05322265625,
0.01397705078125,
0.028167724609375,
-0.006130218505859375,
0.00724029541015625,
0.004486083984375,
0.036163330078125,
-0.025146484375,
-0.00670623779296875,
-0.0293426513671875,
0.006134033203125,
-0.0091705322265625,
-0.057098388671875
]
] |
mosaicml/mpt-7b-instruct | 2023-10-30T21:53:34.000Z | [
"transformers",
"pytorch",
"mpt",
"text-generation",
"Composer",
"MosaicML",
"llm-foundry",
"custom_code",
"dataset:mosaicml/dolly_hhrlhf",
"arxiv:2205.14135",
"arxiv:2108.12409",
"arxiv:2010.04245",
"license:cc-by-sa-3.0",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | mosaicml | null | null | mosaicml/mpt-7b-instruct | 441 | 30,280 | transformers | 2023-05-05T00:52:12 | ---
license: cc-by-sa-3.0
datasets:
- mosaicml/dolly_hhrlhf
tags:
- Composer
- MosaicML
- llm-foundry
inference: false
---
# MPT-7B-Instruct
MPT-7B-Instruct is a model for short-form instruction following.
It is built by finetuning [MPT-7B](https://huggingface.co/mosaicml/mpt-7b) on a [dataset](https://huggingface.co/datasets/sam-mosaic/dolly_hhrlhf) derived from the [Databricks Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) and the [Anthropic Helpful and Harmless (HH-RLHF)](https://huggingface.co/datasets/Anthropic/hh-rlhf) datasets.
* License: _CC-By-SA-3.0_
* [Demo on Hugging Face Spaces](https://huggingface.co/spaces/mosaicml/mpt-7b-instruct)
This model was trained by [MosaicML](https://www.mosaicml.com) and follows a modified decoder-only transformer architecture.
## Model Date
May 5, 2023
## Model License
CC-By-SA-3.0
## Documentation
* [Blog post: Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs](https://www.mosaicml.com/blog/mpt-7b)
* [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
* Questions: Feel free to contact us via the [MosaicML Community Slack](https://mosaicml.me/slack)!
### Example Question/Instruction
**Longboi24**:
> What is a quoll?
**MPT-7B-Instruct**:
>A Quoll (pronounced “cool”) is one of Australia’s native carnivorous marsupial mammals, which are also known as macropods or wallabies in other parts around Asia and South America
## How to Use
Note: This model requires that `trust_remote_code=True` be passed to the `from_pretrained` method. This is because we use a custom `MPT` model architecture that is not yet part of the Hugging Face `transformers` package.
`MPT` includes options for many training efficiency features such as [FlashAttention (Dao et al. 2022)](https://arxiv.org/pdf/2205.14135.pdf), [ALiBi](https://arxiv.org/abs/2108.12409), [QK LayerNorm](https://arxiv.org/abs/2010.04245), and more.
```python
import transformers
model = transformers.AutoModelForCausalLM.from_pretrained(
'mosaicml/mpt-7b-instruct',
trust_remote_code=True
)
```
To use the optimized [triton implementation](https://github.com/openai/triton) of FlashAttention, you can load the model on GPU (`cuda:0`) with `attn_impl='triton'` and with `bfloat16` precision:
```python
import torch
import transformers
name = 'mosaicml/mpt-7b-instruct'
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.attn_config['attn_impl'] = 'triton'
config.init_device = 'cuda:0' # For fast initialization directly on GPU!
model = transformers.AutoModelForCausalLM.from_pretrained(
name,
config=config,
torch_dtype=torch.bfloat16, # Load model weights in bfloat16
trust_remote_code=True
)
```
Although the model was trained with a sequence length of 2048, ALiBi enables users to increase the maximum sequence length during finetuning and/or inference. For example:
```python
import transformers
name = 'mosaicml/mpt-7b-instruct'
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 4096 # (input + output) tokens can now be up to 4096
model = transformers.AutoModelForCausalLM.from_pretrained(
name,
config=config,
trust_remote_code=True
)
```
This model was trained with the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
```
The model can then be used, for example, within a text-generation pipeline.
Note: when running Torch modules in lower precision, it is best practice to use the [torch.autocast context manager](https://pytorch.org/docs/stable/amp.html).
```python
import torch
from transformers import pipeline
pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, device='cuda:0')
with torch.autocast('cuda', dtype=torch.bfloat16):
    print(
        pipe('Here is a recipe for vegan banana bread:\n',
             max_new_tokens=100,
             do_sample=True,
             use_cache=True))
```
### Formatting
This model was trained on data formatted in the dolly-15k format:
```python
INSTRUCTION_KEY = "### Instruction:"
RESPONSE_KEY = "### Response:"
INTRO_BLURB = "Below is an instruction that describes a task. Write a response that appropriately completes the request."
PROMPT_FOR_GENERATION_FORMAT = """{intro}
{instruction_key}
{instruction}
{response_key}
""".format(
intro=INTRO_BLURB,
instruction_key=INSTRUCTION_KEY,
instruction="{instruction}",
response_key=RESPONSE_KEY,
)
example = "James decides to run 3 sprints 3 times a week. He runs 60 meters each sprint. How many total meters does he run a week? Explain before answering."
fmt_ex = PROMPT_FOR_GENERATION_FORMAT.format(instruction=example)
```
In the above example, `fmt_ex` is ready to be tokenized and sent through the model.
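For illustration, here is a minimal generation sketch using that formatted prompt; it assumes the `model` and `tokenizer` objects from the snippets above are in scope:
```python
import torch

# A minimal sketch; assumes `model`, `tokenizer`, and `fmt_ex` from the
# snippets above are already defined.
inputs = tokenizer(fmt_ex, return_tensors='pt').to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=100, do_sample=True)
# Decode only the tokens generated after the prompt
response = tokenizer.decode(output_ids[0, inputs['input_ids'].shape[1]:],
                            skip_special_tokens=True)
print(response)
```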
## Model Description
The architecture is a modification of a standard decoder-only transformer.
The model has been modified from a standard transformer in the following ways:
* It uses [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf)
* It uses [ALiBi (Attention with Linear Biases)](https://arxiv.org/abs/2108.12409) and does not use positional embeddings
* It does not use biases
| Hyperparameter | Value |
|----------------|-------|
| n_parameters | 6.7B |
| n_layers | 32 |
| n_heads | 32 |
| d_model | 4096 |
| vocab size | 50432 |
| sequence length | 2048 |
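As a sanity check, the 6.7B parameter count can be roughly reproduced from the other hyperparameters. The sketch below assumes a standard 4x MLP expansion ratio (not stated in this card) and no biases, and ignores LayerNorm weights:
```python
# Back-of-the-envelope parameter count from the hyperparameter table.
# The 4x MLP expansion ratio is an assumption; biases are omitted as stated.
d_model, n_layers, vocab_size = 4096, 32, 50432

attn_params = 4 * d_model ** 2            # Wq, Wk, Wv, Wo projections
mlp_params = 2 * d_model * (4 * d_model)  # up- and down-projections
embed_params = vocab_size * d_model       # token embeddings only (ALiBi: no positional embeddings)

total = n_layers * (attn_params + mlp_params) + embed_params
print(f'{total / 1e9:.2f}B')              # ~6.65B, consistent with the reported 6.7B
```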
## PreTraining Data
For more details on the pretraining process, see [MPT-7B](https://huggingface.co/mosaicml/mpt-7b).
The data was tokenized using the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
### Training Configuration
This model was trained on 8 A100-40GBs for about 2.3 hours using the [MosaicML Platform](https://www.mosaicml.com/platform).
The model was trained with sharded data parallelism using [FSDP](https://pytorch.org/docs/stable/fsdp.html) and used the AdamW optimizer.
## Limitations and Biases
_The following language is modified from [EleutherAI's GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b)_
MPT-7B-Instruct can produce factually incorrect output, and should not be relied on to produce factually accurate information.
MPT-7B-Instruct was trained on various public datasets.
While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased or otherwise offensive outputs.
## Acknowledgements
This model was finetuned by Sam Havens and the MosaicML NLP team.
## MosaicML Platform
If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo?utm_source=huggingface&utm_medium=referral&utm_campaign=mpt-7b).
## Disclaimer
The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model. Please consult an attorney before using this model for commercial purposes.
## Citation
Please cite this model using the following format:
```
@online{MosaicML2023Introducing,
author = {MosaicML NLP Team},
title = {Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs},
year = {2023},
url = {www.mosaicml.com/blog/mpt-7b},
note = {Accessed: 2023-03-28}, % change this date
urldate = {2023-03-28} % change this date
}
```
| 7,951 | [
[
-0.031646728515625,
-0.04302978515625,
0.0182647705078125,
0.0264129638671875,
-0.0312347412109375,
-0.007694244384765625,
-0.0012388229370117188,
-0.02435302734375,
0.0038928985595703125,
0.0281219482421875,
-0.0487060546875,
-0.04498291015625,
-0.046478271484375,
0.0094146728515625,
-0.0232391357421875,
0.07342529296875,
-0.004634857177734375,
-0.0036449432373046875,
-0.0006155967712402344,
-0.0057373046875,
-0.01442718505859375,
-0.03363037109375,
-0.0421142578125,
-0.02569580078125,
0.03338623046875,
0.005687713623046875,
0.059661865234375,
0.0638427734375,
0.0274200439453125,
0.027862548828125,
-0.01715087890625,
0.014617919921875,
-0.031158447265625,
-0.033050537109375,
0.0023975372314453125,
-0.030059814453125,
-0.043182373046875,
0.019775390625,
0.03662109375,
0.025360107421875,
-0.0116729736328125,
0.0276336669921875,
0.0018377304077148438,
0.022735595703125,
-0.02923583984375,
0.01522064208984375,
-0.032806396484375,
0.01189422607421875,
-0.00241851806640625,
0.00031113624572753906,
-0.037841796875,
-0.032379150390625,
0.007083892822265625,
-0.045623779296875,
0.023406982421875,
0.0022735595703125,
0.07830810546875,
0.0136871337890625,
-0.0283355712890625,
0.0002582073211669922,
-0.04083251953125,
0.051788330078125,
-0.06353759765625,
0.0229949951171875,
0.02587890625,
0.01535797119140625,
0.003467559814453125,
-0.07611083984375,
-0.053802490234375,
-0.026031494140625,
-0.01448822021484375,
0.0259246826171875,
-0.00919342041015625,
0.005519866943359375,
0.04486083984375,
0.0272369384765625,
-0.0467529296875,
-0.0161895751953125,
-0.031768798828125,
-0.017303466796875,
0.039642333984375,
0.0166015625,
0.019439697265625,
-0.035614013671875,
-0.049652099609375,
-0.02490234375,
-0.04754638671875,
0.004383087158203125,
0.0289459228515625,
0.0013904571533203125,
-0.043701171875,
0.04547119140625,
0.0002970695495605469,
0.04022216796875,
0.0161590576171875,
-0.003902435302734375,
0.0288238525390625,
-0.0244293212890625,
-0.0198211669921875,
-0.0101318359375,
0.08251953125,
0.0227203369140625,
0.007266998291015625,
-0.0030918121337890625,
-0.0118408203125,
-0.006755828857421875,
0.0091552734375,
-0.077392578125,
-0.0259552001953125,
0.01056671142578125,
-0.0357666015625,
-0.0174560546875,
0.00917816162109375,
-0.038299560546875,
-0.01357269287109375,
-0.025390625,
0.062744140625,
-0.056427001953125,
-0.0230865478515625,
0.01032257080078125,
-0.00951385498046875,
0.027069091796875,
0.01175689697265625,
-0.05340576171875,
0.0120849609375,
0.0310821533203125,
0.07208251953125,
0.0038280487060546875,
-0.04022216796875,
-0.0125579833984375,
0.0013523101806640625,
-0.0005855560302734375,
0.0335693359375,
-0.016326904296875,
-0.02490234375,
-0.021759033203125,
0.019134521484375,
-0.0290069580078125,
-0.035919189453125,
0.0211334228515625,
-0.0283660888671875,
0.04229736328125,
-0.0166168212890625,
-0.0306854248046875,
-0.01708984375,
0.0107574462890625,
-0.040924072265625,
0.08197021484375,
0.035919189453125,
-0.06982421875,
0.0135345458984375,
-0.053497314453125,
-0.0162506103515625,
-0.013885498046875,
0.00580596923828125,
-0.054473876953125,
-0.00995635986328125,
0.022186279296875,
0.031951904296875,
-0.025634765625,
0.0086822509765625,
-0.017913818359375,
-0.034881591796875,
0.0207366943359375,
-0.033050537109375,
0.07696533203125,
0.01422882080078125,
-0.048828125,
0.0123291015625,
-0.06329345703125,
-0.007457733154296875,
0.01983642578125,
-0.029144287109375,
0.0199127197265625,
-0.02252197265625,
0.0010280609130859375,
0.0218505859375,
0.01332855224609375,
-0.042999267578125,
0.01788330078125,
-0.02850341796875,
0.0347900390625,
0.0567626953125,
-0.01270294189453125,
0.0267791748046875,
-0.03466796875,
0.0309600830078125,
0.0112152099609375,
0.0400390625,
-0.011474609375,
-0.0416259765625,
-0.068359375,
-0.025146484375,
0.0231475830078125,
0.038330078125,
-0.07220458984375,
0.0301055908203125,
-0.00911712646484375,
-0.049468994140625,
-0.0430908203125,
-0.00991058349609375,
0.040191650390625,
0.04144287109375,
0.0516357421875,
-0.0233154296875,
-0.047607421875,
-0.0572509765625,
0.00013113021850585938,
0.00667572021484375,
-0.0019273757934570312,
0.0244293212890625,
0.04339599609375,
-0.024810791015625,
0.06292724609375,
-0.0272216796875,
0.006381988525390625,
-0.024871826171875,
0.01959228515625,
0.0428466796875,
0.047882080078125,
0.040313720703125,
-0.052398681640625,
-0.04693603515625,
-0.01448822021484375,
-0.053558349609375,
0.005523681640625,
-0.00531005859375,
-0.0177764892578125,
0.006793975830078125,
0.0146636962890625,
-0.07733154296875,
0.04205322265625,
0.035369873046875,
-0.032012939453125,
0.0516357421875,
-0.00917816162109375,
-0.0011014938354492188,
-0.10198974609375,
0.00389862060546875,
-0.0068817138671875,
-0.016937255859375,
-0.0379638671875,
-0.01415252685546875,
0.00881195068359375,
-0.0006527900695800781,
-0.057891845703125,
0.03558349609375,
-0.032318115234375,
0.004730224609375,
-0.0162506103515625,
-0.0194549560546875,
-0.0024776458740234375,
0.047698974609375,
0.01715087890625,
0.052032470703125,
0.04058837890625,
-0.042999267578125,
0.03424072265625,
0.0238494873046875,
-0.0126190185546875,
0.01554107666015625,
-0.056915283203125,
0.0181427001953125,
0.01007843017578125,
0.018280029296875,
-0.05328369140625,
-0.009735107421875,
0.037841796875,
-0.039581298828125,
0.02740478515625,
-0.027618408203125,
-0.02777099609375,
-0.0457763671875,
-0.0139312744140625,
0.036102294921875,
0.05865478515625,
-0.055572509765625,
0.054168701171875,
-0.00537872314453125,
0.021209716796875,
-0.05975341796875,
-0.03289794921875,
-0.0175323486328125,
-0.019439697265625,
-0.055694580078125,
0.0287322998046875,
0.00024247169494628906,
0.01470184326171875,
-0.0103759765625,
-0.0093841552734375,
0.004119873046875,
-0.00962066650390625,
0.0237579345703125,
0.03106689453125,
-0.019439697265625,
-0.01953125,
-0.01284027099609375,
-0.021392822265625,
0.01264190673828125,
-0.0205535888671875,
0.069091796875,
-0.0239410400390625,
-0.0182647705078125,
-0.05303955078125,
0.00882720947265625,
0.036285400390625,
-0.01357269287109375,
0.073974609375,
0.0838623046875,
-0.01285552978515625,
0.00009846687316894531,
-0.04095458984375,
-0.024810791015625,
-0.0406494140625,
0.0246429443359375,
-0.01168060302734375,
-0.042205810546875,
0.040924072265625,
0.010711669921875,
-0.00775909423828125,
0.045684814453125,
0.047332763671875,
-0.0077667236328125,
0.07122802734375,
0.0416259765625,
0.0205535888671875,
0.04046630859375,
-0.060699462890625,
-0.001827239990234375,
-0.0711669921875,
-0.0216827392578125,
-0.01480865478515625,
-0.0185699462890625,
-0.046142578125,
-0.046630859375,
0.0210723876953125,
-0.002330780029296875,
-0.0513916015625,
0.056976318359375,
-0.044830322265625,
0.0286102294921875,
0.059478759765625,
0.0300445556640625,
0.00042629241943359375,
-0.0126190185546875,
-0.02154541015625,
0.01187896728515625,
-0.061920166015625,
-0.032470703125,
0.08734130859375,
0.025543212890625,
0.0576171875,
-0.006641387939453125,
0.058746337890625,
-0.01009368896484375,
0.03466796875,
-0.035125732421875,
0.039520263671875,
0.00882720947265625,
-0.052215576171875,
-0.007476806640625,
-0.049774169921875,
-0.0635986328125,
0.021636962890625,
-0.018646240234375,
-0.05242919921875,
0.021942138671875,
0.0182952880859375,
-0.0347900390625,
0.036041259765625,
-0.06591796875,
0.0750732421875,
-0.01904296875,
-0.04046630859375,
0.005985260009765625,
-0.06292724609375,
0.0278472900390625,
0.01300811767578125,
-0.00494384765625,
-0.003513336181640625,
0.0210723876953125,
0.06011962890625,
-0.040283203125,
0.07012939453125,
-0.0214691162109375,
0.0224151611328125,
0.032958984375,
-0.009490966796875,
0.0377197265625,
0.00818634033203125,
0.0044708251953125,
0.0164031982421875,
0.0129852294921875,
-0.03271484375,
-0.02496337890625,
0.0287017822265625,
-0.08447265625,
-0.0394287109375,
-0.04156494140625,
-0.0472412109375,
0.0055389404296875,
0.0126190185546875,
0.0518798828125,
0.025360107421875,
0.007793426513671875,
0.0195159912109375,
0.0504150390625,
-0.0369873046875,
0.053375244140625,
0.0206756591796875,
-0.00890350341796875,
-0.038726806640625,
0.064208984375,
-0.0018415451049804688,
0.02880859375,
0.02215576171875,
0.02142333984375,
-0.031951904296875,
-0.03289794921875,
-0.02923583984375,
0.033050537109375,
-0.038299560546875,
-0.0263519287109375,
-0.05072021484375,
-0.035064697265625,
-0.0328369140625,
0.005767822265625,
-0.04638671875,
-0.0245513916015625,
-0.0301055908203125,
0.0011005401611328125,
0.0245208740234375,
0.04290771484375,
-0.007904052734375,
0.05194091796875,
-0.06396484375,
0.0289459228515625,
0.026611328125,
0.0207366943359375,
-0.0026874542236328125,
-0.056915283203125,
-0.025299072265625,
0.0137481689453125,
-0.0482177734375,
-0.06451416015625,
0.037139892578125,
-0.0010900497436523438,
0.036895751953125,
0.0223541259765625,
-0.0030117034912109375,
0.04449462890625,
-0.0286712646484375,
0.05865478515625,
0.029144287109375,
-0.06976318359375,
0.0264129638671875,
-0.029296875,
0.026702880859375,
0.006855010986328125,
0.045684814453125,
-0.034881591796875,
-0.007110595703125,
-0.06561279296875,
-0.060760498046875,
0.0704345703125,
0.037322998046875,
0.01153564453125,
0.0015516281127929688,
0.030426025390625,
0.00322723388671875,
0.01125335693359375,
-0.09063720703125,
-0.01551055908203125,
-0.044219970703125,
-0.0174407958984375,
-0.0009670257568359375,
-0.0080718994140625,
-0.01470184326171875,
-0.045806884765625,
0.056915283203125,
0.001560211181640625,
0.052703857421875,
0.019744873046875,
-0.0255584716796875,
0.002513885498046875,
-0.0016050338745117188,
0.035400390625,
0.050811767578125,
-0.024871826171875,
-0.00011056661605834961,
0.0233917236328125,
-0.05419921875,
0.0082244873046875,
0.01898193359375,
-0.006153106689453125,
-0.006351470947265625,
0.02862548828125,
0.07061767578125,
-0.0006899833679199219,
-0.0088043212890625,
0.045623779296875,
-0.01273345947265625,
-0.017364501953125,
-0.016845703125,
0.01202392578125,
0.0268402099609375,
0.032806396484375,
0.01019287109375,
0.0047607421875,
-0.01554107666015625,
-0.03375244140625,
0.0127410888671875,
0.0181121826171875,
-0.0203857421875,
-0.03692626953125,
0.061798095703125,
0.0022487640380859375,
-0.01898193359375,
0.056549072265625,
-0.009368896484375,
-0.040496826171875,
0.05548095703125,
0.050140380859375,
0.06060791015625,
-0.02593994140625,
0.021636962890625,
0.03887939453125,
0.0269775390625,
-0.0020999908447265625,
0.015106201171875,
-0.0042724609375,
-0.050018310546875,
-0.0305328369140625,
-0.061065673828125,
-0.014312744140625,
0.00372314453125,
-0.04229736328125,
0.0272369384765625,
-0.0362548828125,
-0.0205841064453125,
-0.01470947265625,
-0.0019464492797851562,
-0.0611572265625,
0.02203369140625,
0.0183868408203125,
0.06884765625,
-0.06243896484375,
0.06982421875,
0.032196044921875,
-0.041015625,
-0.07977294921875,
-0.0206451416015625,
-0.01218414306640625,
-0.0726318359375,
0.0301971435546875,
0.0239715576171875,
0.007114410400390625,
0.00014412403106689453,
-0.055328369140625,
-0.0648193359375,
0.1075439453125,
0.038604736328125,
-0.0281982421875,
-0.0160675048828125,
0.03369140625,
0.034881591796875,
-0.0236663818359375,
0.05096435546875,
0.040863037109375,
0.0288543701171875,
0.0229034423828125,
-0.060211181640625,
0.0155792236328125,
-0.018463134765625,
0.00013065338134765625,
-0.00014972686767578125,
-0.061981201171875,
0.08990478515625,
-0.0172576904296875,
-0.005908966064453125,
0.01241302490234375,
0.04913330078125,
0.0176544189453125,
0.017364501953125,
0.029388427734375,
0.058990478515625,
0.036041259765625,
-0.0224456787109375,
0.0908203125,
-0.0232086181640625,
0.0477294921875,
0.07366943359375,
0.0178985595703125,
0.040618896484375,
0.0262908935546875,
-0.0215301513671875,
0.033355712890625,
0.06500244140625,
-0.01525115966796875,
0.0321044921875,
0.0018310546875,
-0.01209259033203125,
-0.0225677490234375,
0.017822265625,
-0.051727294921875,
0.0242462158203125,
0.0135345458984375,
-0.044189453125,
-0.01012420654296875,
0.0019445419311523438,
0.0094451904296875,
-0.035308837890625,
-0.00928497314453125,
0.044830322265625,
0.0159759521484375,
-0.035400390625,
0.063232421875,
-0.01015472412109375,
0.0513916015625,
-0.042633056640625,
0.00797271728515625,
-0.020416259765625,
0.0166015625,
-0.022064208984375,
-0.04803466796875,
0.01505279541015625,
-0.0159454345703125,
-0.0033206939697265625,
-0.012847900390625,
0.0273895263671875,
-0.0254364013671875,
-0.035369873046875,
0.0199432373046875,
0.0251312255859375,
0.007183074951171875,
-0.01300811767578125,
-0.07061767578125,
-0.0074920654296875,
0.0028629302978515625,
-0.038665771484375,
0.018707275390625,
0.0203857421875,
0.0194549560546875,
0.045501708984375,
0.053863525390625,
-0.0118865966796875,
0.025238037109375,
-0.004673004150390625,
0.078125,
-0.0574951171875,
-0.0196685791015625,
-0.066162109375,
0.05413818359375,
-0.0050048828125,
-0.0282745361328125,
0.055328369140625,
0.04754638671875,
0.067626953125,
-0.01175689697265625,
0.036224365234375,
-0.01479339599609375,
0.0212860107421875,
-0.040679931640625,
0.058135986328125,
-0.0292205810546875,
0.01666259765625,
-0.0236663818359375,
-0.09405517578125,
-0.004665374755859375,
0.05108642578125,
-0.035797119140625,
0.0193023681640625,
0.0621337890625,
0.07012939453125,
-0.027618408203125,
0.006023406982421875,
0.0135345458984375,
0.031158447265625,
0.018890380859375,
0.048736572265625,
0.07659912109375,
-0.05010986328125,
0.049102783203125,
-0.050506591796875,
-0.01053619384765625,
-0.0008563995361328125,
-0.0557861328125,
-0.07989501953125,
-0.035614013671875,
-0.0182647705078125,
-0.044708251953125,
-0.0015325546264648438,
0.07904052734375,
0.06597900390625,
-0.049102783203125,
-0.0235137939453125,
-0.0019216537475585938,
0.002880096435546875,
-0.01245880126953125,
-0.01505279541015625,
0.04193115234375,
-0.004878997802734375,
-0.053497314453125,
0.0040740966796875,
0.00032138824462890625,
0.026031494140625,
0.005550384521484375,
-0.009674072265625,
-0.028045654296875,
0.00021004676818847656,
0.039886474609375,
0.0161590576171875,
-0.037078857421875,
-0.02099609375,
0.0069427490234375,
-0.00421142578125,
0.0303955078125,
0.034637451171875,
-0.044708251953125,
0.01396942138671875,
0.02203369140625,
0.019439697265625,
0.0738525390625,
0.002719879150390625,
0.0335693359375,
-0.038726806640625,
0.01447296142578125,
0.020782470703125,
0.036834716796875,
0.0243682861328125,
-0.0220947265625,
0.0341796875,
0.03887939453125,
-0.04290771484375,
-0.04974365234375,
0.00528717041015625,
-0.08514404296875,
-0.00644683837890625,
0.09295654296875,
-0.01479339599609375,
-0.0423583984375,
0.01392364501953125,
-0.0275726318359375,
0.04595947265625,
-0.009613037109375,
0.056915283203125,
0.0285186767578125,
-0.00872039794921875,
-0.029144287109375,
-0.025390625,
0.03955078125,
0.02154541015625,
-0.048980712890625,
-0.01313018798828125,
0.00897979736328125,
0.039581298828125,
0.01305389404296875,
0.0278472900390625,
-0.01409149169921875,
0.0294342041015625,
0.0036449432373046875,
0.018707275390625,
-0.0208740234375,
-0.01296234130859375,
-0.003719329833984375,
0.00311279296875,
-0.026214599609375,
-0.0164031982421875
]
] |
skt/kogpt2-base-v2 | 2021-09-23T16:29:28.000Z | [
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"ko",
"license:cc-by-nc-sa-4.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | skt | null | null | skt/kogpt2-base-v2 | 26 | 30,077 | transformers | 2022-03-02T23:29:05 | ---
language: ko
tags:
- gpt2
license: cc-by-nc-sa-4.0
---
For more details: https://github.com/SKT-AI/KoGPT2
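The card itself gives no usage snippet; as a minimal sketch, the model should load through the standard `transformers` text-generation pipeline, since it uses the stock GPT-2 architecture. The prompt below is illustrative only, and the tokenizer's special-token configuration is not set here; see the linked repository for the recommended setup:
```python
# Minimal sketch; the linked repository documents the exact tokenizer
# configuration (special tokens), which this sketch does not set.
from transformers import pipeline

generator = pipeline('text-generation', model='skt/kogpt2-base-v2')
# Prompt: "For muscles to grow, ..."
print(generator('근육이 커지기 위해서는', max_new_tokens=50)[0]['generated_text'])
```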
| 111 | [
[
-0.03240966796875,
-0.029876708984375,
0.0517578125,
0.01099395751953125,
-0.0250396728515625,
0.006359100341796875,
0.003520965576171875,
-0.0302276611328125,
-0.000637054443359375,
0.0275726318359375,
-0.044189453125,
-0.0211181640625,
-0.0219879150390625,
-0.040496826171875,
0.004489898681640625,
0.0799560546875,
-0.0187530517578125,
0.024322509765625,
0.0018014907836914062,
-0.0234222412109375,
-0.047119140625,
-0.0101776123046875,
-0.054412841796875,
-0.053466796875,
0.016876220703125,
0.036224365234375,
0.026092529296875,
0.02838134765625,
0.037567138671875,
0.0016756057739257812,
0.00872039794921875,
-0.05438232421875,
-0.0140228271484375,
0.01006317138671875,
-0.0173797607421875,
-0.02349853515625,
-0.033935546875,
0.00457000732421875,
0.042755126953125,
-0.0008077621459960938,
0.0028514862060546875,
0.01371002197265625,
-0.007213592529296875,
0.07318115234375,
-0.007617950439453125,
0.050689697265625,
-0.002704620361328125,
-0.001033782958984375,
-0.0034637451171875,
-0.0008792877197265625,
-0.006378173828125,
-0.029937744140625,
0.0169677734375,
-0.06817626953125,
0.0016498565673828125,
-0.027679443359375,
0.0665283203125,
0.0250244140625,
-0.05145263671875,
-0.0382080078125,
-0.04144287109375,
-0.0008339881896972656,
-0.04010009765625,
0.0289764404296875,
0.01629638671875,
0.05633544921875,
-0.004123687744140625,
-0.0601806640625,
-0.0249176025390625,
-0.039154052734375,
-0.0212554931640625,
0.044281005859375,
0.0242462158203125,
0.039398193359375,
0.0270538330078125,
0.0127716064453125,
-0.07269287109375,
-0.038543701171875,
-0.062744140625,
-0.021270751953125,
0.035858154296875,
-0.0169525146484375,
0.0367431640625,
-0.0081787109375,
-0.03936767578125,
-0.01078033447265625,
-0.03973388671875,
-0.013946533203125,
0.040985107421875,
-0.01194000244140625,
-0.027435302734375,
0.05902099609375,
-0.04046630859375,
0.0164337158203125,
-0.005401611328125,
0.0269012451171875,
0.0295867919921875,
-0.034210205078125,
0.01207733154296875,
0.025390625,
0.0101776123046875,
0.031494140625,
0.00704193115234375,
-0.01200103759765625,
-0.0001665353775024414,
0.0245819091796875,
0.0156097412109375,
-0.045928955078125,
-0.0992431640625,
0.02227783203125,
-0.023223876953125,
-0.003421783447265625,
0.01080322265625,
-0.05645751953125,
0.002655029296875,
-0.03631591796875,
0.054534912109375,
-0.0311431884765625,
-0.0517578125,
-0.0196533203125,
-0.045501708984375,
0.0572509765625,
0.035430908203125,
-0.0156707763671875,
0.01345062255859375,
0.054534912109375,
0.061279296875,
-0.0000871419906616211,
-0.01355743408203125,
-0.0295257568359375,
-0.00551605224609375,
0.00634002685546875,
0.051177978515625,
-0.0201416015625,
-0.0064849853515625,
-0.005596160888671875,
0.0010986328125,
0.003604888916015625,
-0.006725311279296875,
0.039093017578125,
-0.04345703125,
-0.023345947265625,
-0.0294647216796875,
-0.02252197265625,
-0.0311431884765625,
0.03363037109375,
-0.044586181640625,
0.08758544921875,
0.03997802734375,
-0.049530029296875,
0.014923095703125,
-0.08062744140625,
-0.01438140869140625,
0.029571533203125,
-0.0027713775634765625,
-0.0233154296875,
-0.0006361007690429688,
-0.0256500244140625,
-0.00905609130859375,
0.0235137939453125,
-0.002117156982421875,
-0.02972412109375,
-0.01125335693359375,
-0.0137939453125,
0.00917816162109375,
0.04217529296875,
0.034149169921875,
-0.01467132568359375,
0.008941650390625,
-0.03533935546875,
0.032470703125,
0.002471923828125,
-0.0309295654296875,
-0.01256561279296875,
-0.015716552734375,
0.006015777587890625,
0.0180816650390625,
0.020965576171875,
-0.0300140380859375,
0.05853271484375,
-0.01366424560546875,
0.056182861328125,
0.0296783447265625,
0.0030803680419921875,
0.0491943359375,
-0.00839996337890625,
0.037078857421875,
-0.0249786376953125,
0.0193328857421875,
-0.01274871826171875,
-0.056182861328125,
-0.036102294921875,
0.01531982421875,
0.046630859375,
0.061981201171875,
-0.079345703125,
-0.015777587890625,
0.004520416259765625,
-0.04547119140625,
-0.00472259521484375,
-0.00861358642578125,
0.027557373046875,
-0.01076507568359375,
-0.0086822509765625,
-0.028289794921875,
-0.04620361328125,
-0.0487060546875,
-0.0024051666259765625,
-0.0257720947265625,
0.01148223876953125,
0.0313720703125,
0.06689453125,
-0.01450347900390625,
0.0855712890625,
-0.0323486328125,
0.00847625732421875,
0.0004968643188476562,
0.02276611328125,
-0.00370025634765625,
0.037567138671875,
0.052886962890625,
-0.05859375,
-0.07843017578125,
0.02093505859375,
-0.0207977294921875,
-0.020538330078125,
0.00980377197265625,
-0.0248870849609375,
-0.0069732666015625,
0.0072021484375,
-0.0794677734375,
0.04644775390625,
0.046356201171875,
-0.0236053466796875,
0.064697265625,
-0.0096588134765625,
0.001491546630859375,
-0.0936279296875,
0.01459503173828125,
-0.0214691162109375,
-0.030426025390625,
-0.035003662109375,
0.0170745849609375,
0.0163116455078125,
-0.0291290283203125,
-0.022735595703125,
0.027801513671875,
-0.037750244140625,
-0.0279541015625,
-0.01251220703125,
0.0010919570922851562,
-0.043060302734375,
0.034576416015625,
0.00814056396484375,
0.07720947265625,
0.03167724609375,
-0.0250244140625,
0.041015625,
-0.0017681121826171875,
-0.008056640625,
0.011138916015625,
-0.049041748046875,
0.03228759765625,
0.0264892578125,
0.03326416015625,
-0.09808349609375,
-0.06524658203125,
0.055450439453125,
-0.05548095703125,
0.0054473876953125,
-0.0794677734375,
-0.038177490234375,
-0.0200042724609375,
-0.0271453857421875,
0.0458984375,
0.061431884765625,
-0.042999267578125,
0.0112152099609375,
0.038421630859375,
-0.0379638671875,
0.01219940185546875,
-0.018707275390625,
-0.0127410888671875,
-0.013153076171875,
-0.0114898681640625,
0.00782012939453125,
-0.01529693603515625,
-0.0181884765625,
0.007106781005859375,
-0.0026035308837890625,
-0.0310516357421875,
-0.02227783203125,
0.0187835693359375,
-0.00057220458984375,
-0.026336669921875,
0.0178985595703125,
0.017303466796875,
-0.049102783203125,
0.010589599609375,
-0.055267333984375,
0.054443359375,
-0.0208587646484375,
-0.02484130859375,
-0.0455322265625,
0.011932373046875,
0.0545654296875,
0.00865936279296875,
0.003086090087890625,
0.072265625,
-0.0230560302734375,
0.0252227783203125,
-0.0267333984375,
-0.0140228271484375,
-0.0302276611328125,
0.0223541259765625,
-0.005645751953125,
-0.04327392578125,
0.042999267578125,
-0.049407958984375,
-0.01043701171875,
0.068115234375,
0.0472412109375,
0.0001729726791381836,
0.09039306640625,
0.0284881591796875,
-0.0026569366455078125,
0.027374267578125,
-0.027130126953125,
0.0256195068359375,
-0.054595947265625,
-0.0193023681640625,
-0.0252838134765625,
-0.00714874267578125,
-0.048004150390625,
-0.0169219970703125,
0.0161895751953125,
0.055450439453125,
-0.0254364013671875,
0.049835205078125,
-0.062744140625,
0.043670654296875,
0.046844482421875,
-0.01055908203125,
-0.00876617431640625,
-0.004421234130859375,
-0.00970458984375,
0.0013904571533203125,
-0.0411376953125,
-0.0321044921875,
0.06060791015625,
0.023956298828125,
0.04278564453125,
0.00970458984375,
0.04248046875,
0.040283203125,
0.007511138916015625,
-0.04400634765625,
0.004764556884765625,
0.048126220703125,
-0.052215576171875,
-0.0186309814453125,
0.0005502700805664062,
-0.01812744140625,
-0.008087158203125,
0.019317626953125,
-0.041107177734375,
-0.0138702392578125,
0.039459228515625,
-0.021392822265625,
0.020355224609375,
-0.07574462890625,
0.05517578125,
-0.0230865478515625,
0.00080108642578125,
0.005153656005859375,
-0.053955078125,
0.015472412109375,
0.00978851318359375,
-0.0204925537109375,
0.01090240478515625,
0.00971221923828125,
0.041961669921875,
-0.037841796875,
0.050689697265625,
-0.025848388671875,
0.0012149810791015625,
0.047454833984375,
-0.005950927734375,
0.05279541015625,
0.060943603515625,
0.00984954833984375,
0.011932373046875,
0.016815185546875,
-0.032806396484375,
-0.0184478759765625,
0.062744140625,
-0.047119140625,
0.0218048095703125,
-0.048614501953125,
-0.033294677734375,
0.02630615234375,
0.0201263427734375,
0.039520263671875,
0.00917816162109375,
0.00811004638671875,
0.011993408203125,
0.045654296875,
0.006443023681640625,
0.014190673828125,
0.0259552001953125,
-0.0088043212890625,
-0.055633544921875,
0.059600830078125,
0.0170440673828125,
0.014617919921875,
0.003215789794921875,
0.005283355712890625,
-0.0367431640625,
-0.036224365234375,
-0.06024169921875,
-0.0013952255249023438,
-0.04083251953125,
-0.014984130859375,
-0.0218353271484375,
-0.0248870849609375,
-0.041748046875,
-0.019256591796875,
-0.035125732421875,
-0.05169677734375,
0.0194091796875,
-0.0014257431030273438,
0.07318115234375,
0.07440185546875,
-0.010955810546875,
0.05609130859375,
-0.034698486328125,
0.00972747802734375,
0.0124969482421875,
0.05950927734375,
-0.019378662109375,
-0.01004791259765625,
-0.0208740234375,
0.0223846435546875,
-0.037200927734375,
-0.0504150390625,
0.00826263427734375,
-0.01349639892578125,
0.03692626953125,
0.0189361572265625,
0.0175323486328125,
0.027099609375,
-0.036102294921875,
0.070556640625,
0.0265960693359375,
-0.03973388671875,
0.058929443359375,
-0.04095458984375,
0.0208587646484375,
0.045135498046875,
-0.004425048828125,
-0.032806396484375,
0.0116119384765625,
-0.0517578125,
-0.07183837890625,
0.06195068359375,
0.0491943359375,
-0.0118865966796875,
0.0231781005859375,
0.07135009765625,
0.01715087890625,
0.001659393310546875,
-0.0633544921875,
-0.02606201171875,
-0.00954437255859375,
-0.01062774658203125,
-0.005462646484375,
-0.039703369140625,
-0.0012760162353515625,
-0.0267333984375,
0.057373046875,
0.0159759521484375,
0.0452880859375,
0.026397705078125,
-0.0213165283203125,
0.0187835693359375,
0.0166473388671875,
0.073486328125,
0.044921875,
-0.005420684814453125,
-0.00823974609375,
0.027252197265625,
-0.049041748046875,
-0.01248931884765625,
0.01468658447265625,
-0.01026153564453125,
0.0045318603515625,
0.021728515625,
0.0601806640625,
-0.005847930908203125,
-0.02435302734375,
0.0130157470703125,
0.0206756591796875,
-0.06182861328125,
-0.058685302734375,
0.0129241943359375,
0.0091552734375,
0.054290771484375,
0.00887298583984375,
-0.008819580078125,
0.0174407958984375,
-0.0297088623046875,
0.026611328125,
-0.003993988037109375,
-0.04815673828125,
-0.04876708984375,
0.04644775390625,
0.020050048828125,
-0.050201416015625,
0.039398193359375,
-0.0386962890625,
-0.08453369140625,
0.057525634765625,
0.035675048828125,
0.06109619140625,
-0.00860595703125,
0.030975341796875,
0.031768798828125,
0.019195556640625,
-0.0149383544921875,
0.035247802734375,
-0.031280517578125,
0.01715087890625,
-0.020599365234375,
-0.0082550048828125,
-0.024261474609375,
0.0106964111328125,
-0.03033447265625,
-0.00594329833984375,
-0.00992584228515625,
-0.00399017333984375,
-0.031494140625,
0.0386962890625,
-0.032196044921875,
-0.00827789306640625,
-0.01421356201171875,
0.06109619140625,
-0.06689453125,
0.0830078125,
0.0882568359375,
-0.0220947265625,
-0.047210693359375,
0.01097869873046875,
-0.005901336669921875,
-0.0169219970703125,
0.0301971435546875,
0.014801025390625,
0.01331329345703125,
0.01457977294921875,
-0.0469970703125,
-0.05609130859375,
0.0960693359375,
0.0163116455078125,
-0.040985107421875,
0.0083465576171875,
0.01082611083984375,
0.0262603759765625,
0.0052490234375,
0.00972747802734375,
0.03436279296875,
0.053375244140625,
-0.000766754150390625,
-0.07550048828125,
-0.043670654296875,
-0.0386962890625,
-0.01467132568359375,
0.03985595703125,
-0.042938232421875,
0.043853759765625,
-0.002162933349609375,
0.01084136962890625,
0.024139404296875,
0.003131866455078125,
0.000057220458984375,
0.0587158203125,
0.02838134765625,
0.0516357421875,
0.02435302734375,
-0.038238525390625,
0.0643310546875,
-0.031585693359375,
0.0673828125,
0.07342529296875,
-0.01715087890625,
0.0211181640625,
0.022003173828125,
-0.0191192626953125,
-0.0173797607421875,
0.07220458984375,
0.0022430419921875,
0.05499267578125,
-0.004108428955078125,
-0.0229034423828125,
0.0085296630859375,
0.044097900390625,
-0.05853271484375,
-0.0020732879638671875,
0.00962066650390625,
-0.0181884765625,
-0.025634765625,
0.0030155181884765625,
-0.004528045654296875,
-0.052032470703125,
-0.0145416259765625,
0.058563232421875,
-0.0084381103515625,
-0.03900146484375,
-0.0151519775390625,
-0.01279449462890625,
-0.0010585784912109375,
-0.05767822265625,
-0.038482666015625,
0.0185394287109375,
0.027191162109375,
0.00732421875,
-0.080322265625,
0.022674560546875,
-0.012237548828125,
-0.011260986328125,
0.0099334716796875,
0.080322265625,
-0.03717041015625,
-0.0228118896484375,
0.01557159423828125,
-0.004016876220703125,
0.032562255859375,
-0.01433563232421875,
-0.06787109375,
0.004825592041015625,
0.01386260986328125,
-0.026336669921875,
0.0182037353515625,
0.00789642333984375,
-0.019317626953125,
0.036590576171875,
0.0386962890625,
0.0005517005920410156,
0.004383087158203125,
-0.009674072265625,
0.0496826171875,
-0.0256195068359375,
-0.0462646484375,
-0.040679931640625,
0.03314208984375,
-0.026092529296875,
-0.0228271484375,
0.035064697265625,
0.071533203125,
0.07647705078125,
-0.023162841796875,
0.0760498046875,
-0.027374267578125,
0.0084991455078125,
-0.034088134765625,
0.06854248046875,
-0.01468658447265625,
-0.01641845703125,
-0.007320404052734375,
-0.06829833984375,
-0.0221710205078125,
0.0628662109375,
-0.025726318359375,
0.00429534912109375,
0.034210205078125,
0.0458984375,
-0.0266265869140625,
0.0037746429443359375,
0.007793426513671875,
0.015350341796875,
0.026031494140625,
0.01016998291015625,
0.0401611328125,
-0.032623291015625,
0.049041748046875,
-0.04595947265625,
-0.00943756103515625,
-0.03875732421875,
-0.0478515625,
-0.0489501953125,
-0.048065185546875,
-0.028289794921875,
-0.0201873779296875,
0.035675048828125,
0.051025390625,
0.06854248046875,
-0.044952392578125,
0.020965576171875,
-0.0272979736328125,
-0.0120697021484375,
-0.01464080810546875,
-0.0165252685546875,
0.054412841796875,
-0.01062774658203125,
-0.028717041015625,
-0.024688720703125,
0.02264404296875,
-0.007354736328125,
0.00732421875,
-0.048736572265625,
-0.0003719329833984375,
-0.01505279541015625,
0.05145263671875,
0.034759521484375,
-0.044586181640625,
-0.033782958984375,
-0.04071044921875,
0.00096893310546875,
0.00653839111328125,
0.0660400390625,
-0.0290069580078125,
0.0296783447265625,
0.060028076171875,
0.0133819580078125,
0.02349853515625,
0.0198822021484375,
0.026763916015625,
-0.0068359375,
-0.00717926025390625,
-0.00968170166015625,
0.00290679931640625,
-0.034210205078125,
-0.047210693359375,
0.065185546875,
0.042999267578125,
-0.05987548828125,
-0.05419921875,
0.019256591796875,
-0.10772705078125,
0.0185546875,
0.0635986328125,
-0.0031452178955078125,
0.00875091552734375,
-0.0181732177734375,
-0.07366943359375,
0.045013427734375,
-0.03912353515625,
0.039459228515625,
0.07025146484375,
-0.0206756591796875,
-0.0192413330078125,
-0.1104736328125,
0.0296783447265625,
-0.0225677490234375,
-0.039154052734375,
-0.0010595321655273438,
0.01654052734375,
0.053375244140625,
-0.0002918243408203125,
0.043304443359375,
-0.01064300537109375,
0.04461669921875,
0.0191192626953125,
0.01629638671875,
-0.005580902099609375,
-0.028228759765625,
-0.00850677490234375,
-0.0014867782592773438,
-0.003631591796875,
-0.02569580078125
]
] |
timm/tf_efficientnetv2_s.in21k_ft_in1k | 2023-04-27T22:17:54.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:imagenet-21k",
"arxiv:2104.00298",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/tf_efficientnetv2_s.in21k_ft_in1k | 0 | 30,043 | timm | 2022-12-13T00:19:21 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
- imagenet-21k
---
# Model card for tf_efficientnetv2_s.in21k_ft_in1k
An EfficientNetV2 image classification model, trained on ImageNet-21k and fine-tuned on ImageNet-1k in TensorFlow by the paper authors, then ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 21.5
- GMACs: 5.4
- Activations (M): 22.7
- Image size: train = 300 x 300, test = 384 x 384
- **Papers:**
- EfficientNetV2: Smaller Models and Faster Training: https://arxiv.org/abs/2104.00298
- **Dataset:** ImageNet-1k
- **Pretrain Dataset:** ImageNet-21k
- **Original:** https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('tf_efficientnetv2_s.in21k_ft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
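To turn the class indices into human-readable labels, one option is to pair them with an ImageNet-1k class list. The label-file URL below is an assumption (a commonly used PyTorch Hub list), not something this card ships:
```python
# Map the top-5 class indices to ImageNet-1k label names.
# The label-file URL is an assumption, not part of this model card.
from urllib.request import urlopen

labels = urlopen(
    'https://raw.githubusercontent.com/pytorch/hub/master/imagenet_classes.txt'
).read().decode().splitlines()

for prob, idx in zip(top5_probabilities[0], top5_class_indices[0]):
    print(f'{labels[idx]}: {prob.item():.2f}%')
```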
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnetv2_s.in21k_ft_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
    # print shape of each feature map in output
    # e.g.:
    #  torch.Size([1, 24, 150, 150])
    #  torch.Size([1, 48, 75, 75])
    #  torch.Size([1, 64, 38, 38])
    #  torch.Size([1, 160, 19, 19])
    #  torch.Size([1, 256, 10, 10])
    print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnetv2_s.in21k_ft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1280, 10, 10) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{tan2021efficientnetv2,
title={Efficientnetv2: Smaller models and faster training},
author={Tan, Mingxing and Le, Quoc},
booktitle={International conference on machine learning},
pages={10096--10106},
year={2021},
organization={PMLR}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,188 | [
[
-0.0286712646484375,
-0.034149169921875,
-0.004840850830078125,
0.00856781005859375,
-0.0247650146484375,
-0.0294647216796875,
-0.019927978515625,
-0.0293121337890625,
0.0127410888671875,
0.0291290283203125,
-0.027679443359375,
-0.046600341796875,
-0.054443359375,
-0.015106201171875,
-0.00945281982421875,
0.06475830078125,
-0.01021575927734375,
0.0021800994873046875,
-0.0136566162109375,
-0.038604736328125,
-0.00849151611328125,
-0.00904083251953125,
-0.065673828125,
-0.03515625,
0.02587890625,
0.0241241455078125,
0.036590576171875,
0.052642822265625,
0.050506591796875,
0.03582763671875,
-0.005039215087890625,
0.00720977783203125,
-0.0256805419921875,
-0.01233673095703125,
0.0303192138671875,
-0.043609619140625,
-0.03131103515625,
0.01149749755859375,
0.05267333984375,
0.0257415771484375,
0.0031566619873046875,
0.03436279296875,
0.0115509033203125,
0.041839599609375,
-0.0208587646484375,
0.01367950439453125,
-0.02734375,
0.0139923095703125,
-0.00899505615234375,
0.005268096923828125,
-0.018096923828125,
-0.028106689453125,
0.0152435302734375,
-0.042633056640625,
0.03131103515625,
-0.0041961669921875,
0.0977783203125,
0.0227508544921875,
-0.007843017578125,
0.000005125999450683594,
-0.015045166015625,
0.053619384765625,
-0.054718017578125,
0.0160980224609375,
0.0236053466796875,
0.0155792236328125,
-0.0048065185546875,
-0.0833740234375,
-0.034820556640625,
-0.012481689453125,
-0.0152130126953125,
-0.004268646240234375,
-0.0235443115234375,
0.004856109619140625,
0.023529052734375,
0.0152130126953125,
-0.03887939453125,
0.0174102783203125,
-0.044891357421875,
-0.017181396484375,
0.041107177734375,
0.0005726814270019531,
0.0178680419921875,
-0.0170135498046875,
-0.033905029296875,
-0.036865234375,
-0.0257110595703125,
0.024627685546875,
0.0234375,
0.01137542724609375,
-0.039886474609375,
0.0343017578125,
0.004730224609375,
0.0440673828125,
-0.0018157958984375,
-0.0258331298828125,
0.042724609375,
0.00019240379333496094,
-0.031524658203125,
-0.01300048828125,
0.0823974609375,
0.033477783203125,
0.0164642333984375,
0.006168365478515625,
-0.009918212890625,
-0.0276336669921875,
-0.0083770751953125,
-0.0989990234375,
-0.0325927734375,
0.0253143310546875,
-0.048797607421875,
-0.03363037109375,
0.0199432373046875,
-0.041900634765625,
-0.0079498291015625,
-0.0009222030639648438,
0.044464111328125,
-0.0306396484375,
-0.03369140625,
-0.00655364990234375,
-0.0204010009765625,
0.021759033203125,
0.01483917236328125,
-0.037841796875,
0.01111602783203125,
0.0300445556640625,
0.08929443359375,
0.00504302978515625,
-0.0283355712890625,
-0.0210113525390625,
-0.0288848876953125,
-0.0262603759765625,
0.033172607421875,
-0.000888824462890625,
-0.0003952980041503906,
-0.0219268798828125,
0.0222625732421875,
-0.006763458251953125,
-0.05340576171875,
0.0129241943359375,
-0.0178375244140625,
0.015045166015625,
-0.0025119781494140625,
-0.016204833984375,
-0.0394287109375,
0.017120361328125,
-0.03350830078125,
0.09979248046875,
0.0305328369140625,
-0.0657958984375,
0.0197906494140625,
-0.04241943359375,
-0.00681304931640625,
-0.019622802734375,
0.004497528076171875,
-0.080078125,
-0.00789642333984375,
0.00701141357421875,
0.0601806640625,
-0.0219268798828125,
0.002613067626953125,
-0.044464111328125,
-0.0200653076171875,
0.017669677734375,
0.0018138885498046875,
0.0797119140625,
0.01526641845703125,
-0.03704833984375,
0.017578125,
-0.0382080078125,
0.0159912109375,
0.0396728515625,
-0.0204620361328125,
-0.0022869110107421875,
-0.047943115234375,
0.01605224609375,
0.0247344970703125,
0.00768280029296875,
-0.037933349609375,
0.01959228515625,
-0.0147552490234375,
0.03851318359375,
0.04412841796875,
-0.0144195556640625,
0.024627685546875,
-0.0245513916015625,
0.0165252685546875,
0.019195556640625,
0.01221466064453125,
0.0043182373046875,
-0.03973388671875,
-0.0679931640625,
-0.036529541015625,
0.0293426513671875,
0.030364990234375,
-0.04669189453125,
0.031585693359375,
-0.01934814453125,
-0.061065673828125,
-0.0308380126953125,
0.0090484619140625,
0.040313720703125,
0.0439453125,
0.0219573974609375,
-0.03277587890625,
-0.034637451171875,
-0.06951904296875,
0.0001652240753173828,
-0.00421142578125,
0.00228118896484375,
0.0259552001953125,
0.056549072265625,
-0.005184173583984375,
0.04248046875,
-0.0300445556640625,
-0.021148681640625,
-0.0174102783203125,
0.005084991455078125,
0.024078369140625,
0.059844970703125,
0.06427001953125,
-0.0400390625,
-0.0391845703125,
-0.0092315673828125,
-0.0703125,
0.013580322265625,
0.0014133453369140625,
-0.021392822265625,
0.0222320556640625,
0.0177764892578125,
-0.0426025390625,
0.043853759765625,
0.016387939453125,
-0.032135009765625,
0.0262298583984375,
-0.0178375244140625,
0.0156402587890625,
-0.0848388671875,
0.01165771484375,
0.0271148681640625,
-0.0178985595703125,
-0.03692626953125,
0.00627899169921875,
0.00302886962890625,
0.0010242462158203125,
-0.03887939453125,
0.05029296875,
-0.042083740234375,
-0.01739501953125,
-0.01062774658203125,
-0.02197265625,
-0.0008835792541503906,
0.048675537109375,
-0.00766754150390625,
0.0285797119140625,
0.060089111328125,
-0.033477783203125,
0.039031982421875,
0.0303192138671875,
-0.021575927734375,
0.0226593017578125,
-0.053558349609375,
0.01812744140625,
-0.00011450052261352539,
0.017364501953125,
-0.07489013671875,
-0.025360107421875,
0.030487060546875,
-0.046630859375,
0.044647216796875,
-0.038238525390625,
-0.037261962890625,
-0.04083251953125,
-0.03692626953125,
0.02496337890625,
0.0533447265625,
-0.05743408203125,
0.031585693359375,
0.0174407958984375,
0.019866943359375,
-0.044830322265625,
-0.07611083984375,
-0.0189971923828125,
-0.0297393798828125,
-0.06036376953125,
0.0240478515625,
0.0171051025390625,
0.0057373046875,
0.01515960693359375,
-0.003276824951171875,
-0.01241302490234375,
-0.003154754638671875,
0.03765869140625,
0.0233001708984375,
-0.024169921875,
-0.002857208251953125,
-0.0212554931640625,
-0.005275726318359375,
0.002475738525390625,
-0.03021240234375,
0.041412353515625,
-0.0209197998046875,
-0.0032196044921875,
-0.06610107421875,
-0.007137298583984375,
0.028106689453125,
-0.0027790069580078125,
0.06439208984375,
0.0897216796875,
-0.037811279296875,
-0.003978729248046875,
-0.032318115234375,
-0.02587890625,
-0.036590576171875,
0.0462646484375,
-0.0258636474609375,
-0.040252685546875,
0.0550537109375,
0.004486083984375,
0.00797271728515625,
0.0550537109375,
0.0297393798828125,
-0.007781982421875,
0.04888916015625,
0.037628173828125,
0.022216796875,
0.058013916015625,
-0.08148193359375,
-0.0193023681640625,
-0.058624267578125,
-0.040435791015625,
-0.0299072265625,
-0.048614501953125,
-0.053253173828125,
-0.03076171875,
0.03204345703125,
0.0204010009765625,
-0.0382080078125,
0.036285400390625,
-0.0615234375,
0.00846099853515625,
0.0531005859375,
0.043670654296875,
-0.0301513671875,
0.0301361083984375,
-0.01111602783203125,
0.002841949462890625,
-0.06317138671875,
-0.0168914794921875,
0.08660888671875,
0.03350830078125,
0.035430908203125,
-0.004573822021484375,
0.04833984375,
-0.019134521484375,
0.021759033203125,
-0.047637939453125,
0.0413818359375,
-0.014312744140625,
-0.0312347412109375,
-0.0100555419921875,
-0.039825439453125,
-0.078369140625,
0.01123809814453125,
-0.0224609375,
-0.0582275390625,
0.00867462158203125,
0.019439697265625,
-0.01678466796875,
0.058837890625,
-0.06610107421875,
0.0765380859375,
-0.00907135009765625,
-0.037933349609375,
0.0022735595703125,
-0.049957275390625,
0.0230712890625,
0.020599365234375,
-0.02392578125,
-0.006256103515625,
0.003894805908203125,
0.0904541015625,
-0.04962158203125,
0.054443359375,
-0.04241943359375,
0.0396728515625,
0.0413818359375,
-0.007251739501953125,
0.03204345703125,
-0.00864410400390625,
-0.01045989990234375,
0.0272369384765625,
0.0011472702026367188,
-0.03594970703125,
-0.037841796875,
0.045684814453125,
-0.078125,
-0.019561767578125,
-0.0266265869140625,
-0.0277862548828125,
0.017364501953125,
0.00804901123046875,
0.042877197265625,
0.050018310546875,
0.018310546875,
0.0274658203125,
0.040924072265625,
-0.025848388671875,
0.038848876953125,
-0.010986328125,
-0.01003265380859375,
-0.038543701171875,
0.06463623046875,
0.01953125,
0.009429931640625,
0.007709503173828125,
0.018402099609375,
-0.0292205810546875,
-0.04669189453125,
-0.0242156982421875,
0.0228729248046875,
-0.05364990234375,
-0.0391845703125,
-0.053558349609375,
-0.0262603759765625,
-0.0304718017578125,
0.00036525726318359375,
-0.04132080078125,
-0.034210205078125,
-0.037139892578125,
0.01427459716796875,
0.059906005859375,
0.045440673828125,
-0.0165252685546875,
0.04693603515625,
-0.033660888671875,
0.01233673095703125,
0.0105133056640625,
0.0360107421875,
-0.002628326416015625,
-0.06585693359375,
-0.01450347900390625,
-0.00991058349609375,
-0.0299835205078125,
-0.048553466796875,
0.034698486328125,
0.0192413330078125,
0.0316162109375,
0.0305328369140625,
-0.0193023681640625,
0.054901123046875,
0.003353118896484375,
0.03826904296875,
0.033599853515625,
-0.037109375,
0.04010009765625,
-0.0012874603271484375,
0.006893157958984375,
0.0133819580078125,
0.0196685791015625,
-0.0147247314453125,
0.005558013916015625,
-0.07110595703125,
-0.058929443359375,
0.07318115234375,
0.0129241943359375,
-0.005382537841796875,
0.03509521484375,
0.054443359375,
0.0011453628540039062,
0.0012845993041992188,
-0.048004150390625,
-0.03594970703125,
-0.0295257568359375,
-0.022613525390625,
-0.0014753341674804688,
-0.00856781005859375,
-0.001438140869140625,
-0.051361083984375,
0.05548095703125,
-0.004268646240234375,
0.065185546875,
0.0184173583984375,
-0.00542449951171875,
-0.0021648406982421875,
-0.037200927734375,
0.034393310546875,
0.0197601318359375,
-0.0240478515625,
0.006259918212890625,
0.0114288330078125,
-0.040069580078125,
0.007476806640625,
0.01450347900390625,
-0.0018568038940429688,
0.0004744529724121094,
0.0380859375,
0.079833984375,
-0.00856781005859375,
0.010650634765625,
0.036041259765625,
-0.002307891845703125,
-0.03387451171875,
-0.0203399658203125,
0.0151214599609375,
0.0032024383544921875,
0.035400390625,
0.01477813720703125,
0.0311737060546875,
-0.009002685546875,
-0.01555633544921875,
0.019317626953125,
0.04034423828125,
-0.023223876953125,
-0.022369384765625,
0.050933837890625,
-0.00847625732421875,
-0.014678955078125,
0.06964111328125,
-0.01412200927734375,
-0.037933349609375,
0.0841064453125,
0.02557373046875,
0.0689697265625,
0.0034332275390625,
0.004638671875,
0.06829833984375,
0.018096923828125,
-0.005939483642578125,
0.0128631591796875,
0.01128387451171875,
-0.05145263671875,
0.0054931640625,
-0.0390625,
0.00946044921875,
0.0260772705078125,
-0.0382080078125,
0.024078369140625,
-0.05108642578125,
-0.031524658203125,
0.011138916015625,
0.0302581787109375,
-0.07745361328125,
0.0125885009765625,
-0.004131317138671875,
0.06610107421875,
-0.05303955078125,
0.060302734375,
0.061767578125,
-0.03167724609375,
-0.08428955078125,
-0.01380157470703125,
0.00262451171875,
-0.0701904296875,
0.0458984375,
0.037506103515625,
0.0147552490234375,
0.008819580078125,
-0.05755615234375,
-0.0469970703125,
0.1107177734375,
0.040069580078125,
-0.00836181640625,
0.0214080810546875,
-0.00429534912109375,
0.01470947265625,
-0.0299835205078125,
0.048248291015625,
0.020233154296875,
0.034423828125,
0.02197265625,
-0.049560546875,
0.0173187255859375,
-0.026275634765625,
0.01256561279296875,
0.01265716552734375,
-0.06610107421875,
0.06451416015625,
-0.0396728515625,
-0.00670623779296875,
0.0045318603515625,
0.05401611328125,
0.0138702392578125,
0.011138916015625,
0.04010009765625,
0.0616455078125,
0.040924072265625,
-0.033782958984375,
0.070068359375,
0.0058135986328125,
0.050018310546875,
0.04150390625,
0.0390625,
0.04034423828125,
0.0282135009765625,
-0.01502227783203125,
0.023345947265625,
0.08660888671875,
-0.0290069580078125,
0.0266265869140625,
0.0152130126953125,
0.00884246826171875,
-0.00946044921875,
0.005584716796875,
-0.032196044921875,
0.045013427734375,
0.0094451904296875,
-0.037841796875,
-0.0157318115234375,
-0.00171661376953125,
0.00344085693359375,
-0.03729248046875,
-0.0167388916015625,
0.038787841796875,
0.0007476806640625,
-0.032928466796875,
0.06427001953125,
0.01378631591796875,
0.0654296875,
-0.0295257568359375,
0.00274658203125,
-0.0175933837890625,
0.01800537109375,
-0.027984619140625,
-0.060089111328125,
0.0205535888671875,
-0.01538848876953125,
0.006336212158203125,
0.0009083747863769531,
0.051910400390625,
-0.0276336669921875,
-0.034576416015625,
0.01515960693359375,
0.021575927734375,
0.04229736328125,
0.0012950897216796875,
-0.0908203125,
0.0114593505859375,
0.007366180419921875,
-0.056640625,
0.0197906494140625,
0.0214385986328125,
0.0086822509765625,
0.0516357421875,
0.040313720703125,
-0.006961822509765625,
0.01120758056640625,
-0.0208587646484375,
0.058929443359375,
-0.031158447265625,
-0.02386474609375,
-0.058319091796875,
0.05010986328125,
-0.0100555419921875,
-0.050262451171875,
0.028778076171875,
0.042205810546875,
0.06390380859375,
0.00018465518951416016,
0.033721923828125,
-0.02484130859375,
-0.007564544677734375,
-0.031524658203125,
0.05694580078125,
-0.0601806640625,
-0.00812530517578125,
-0.00440216064453125,
-0.05548095703125,
-0.0244293212890625,
0.056427001953125,
-0.0142669677734375,
0.035888671875,
0.0374755859375,
0.07769775390625,
-0.02691650390625,
-0.028778076171875,
0.00797271728515625,
0.0125885009765625,
0.0064697265625,
0.03570556640625,
0.027252197265625,
-0.06256103515625,
0.03094482421875,
-0.054931640625,
-0.01410675048828125,
-0.0192718505859375,
-0.050537109375,
-0.0750732421875,
-0.060943603515625,
-0.05096435546875,
-0.058441162109375,
-0.01268768310546875,
0.07415771484375,
0.0810546875,
-0.050018310546875,
-0.0088043212890625,
0.0005803108215332031,
0.00913238525390625,
-0.014892578125,
-0.0167388916015625,
0.053497314453125,
-0.0035190582275390625,
-0.054351806640625,
-0.027252197265625,
-0.005126953125,
0.0261383056640625,
0.0027790069580078125,
-0.0220947265625,
-0.0077667236328125,
-0.028045654296875,
0.0134735107421875,
0.020050048828125,
-0.046630859375,
-0.013458251953125,
-0.0198211669921875,
-0.0135040283203125,
0.0295562744140625,
0.03204345703125,
-0.03619384765625,
0.02423095703125,
0.04052734375,
0.02764892578125,
0.06842041015625,
-0.0267791748046875,
-0.005218505859375,
-0.05950927734375,
0.045501708984375,
-0.0090789794921875,
0.03094482421875,
0.0343017578125,
-0.02301025390625,
0.047607421875,
0.032257080078125,
-0.02838134765625,
-0.06658935546875,
-0.01194000244140625,
-0.08050537109375,
-0.00803375244140625,
0.07635498046875,
-0.033111572265625,
-0.0380859375,
0.036712646484375,
0.00618743896484375,
0.056365966796875,
-0.010986328125,
0.0268707275390625,
0.01467132568359375,
-0.0091552734375,
-0.04388427734375,
-0.045501708984375,
0.034820556640625,
0.00963592529296875,
-0.047637939453125,
-0.036376953125,
-0.0055084228515625,
0.05615234375,
0.00896453857421875,
0.03448486328125,
-0.003444671630859375,
0.010650634765625,
0.01395416259765625,
0.034698486328125,
-0.047088623046875,
-0.009063720703125,
-0.0218963623046875,
0.00406646728515625,
-0.0072784423828125,
-0.047393798828125
]
] |
voidful/wav2vec2-xlsr-multilingual-56 | 2023-03-18T12:38:57.000Z | [
"transformers",
"pytorch",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"hf-asr-leaderboard",
"robust-speech-event",
"speech",
"xlsr-fine-tuning-week",
"multilingual",
"ar",
"as",
"br",
"ca",
"cnh",
"cs",
"cv",
"cy",
"de",
"dv",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"hi",
"hsb",
"hu",
"ia",
"id",
"ja",
"ka",
"ky",
"lg",
"lt",
"ly",
"mn",
"mt",
"nl",
"or",
"pl",
"pt",
"ro",
"ru",
"sah",
"sl",
"ta",
"th",
"tr",
"tt",
"uk",
"vi",
"dataset:common_voice",
"arxiv:1910.09700",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | voidful | null | null | voidful/wav2vec2-xlsr-multilingual-56 | 20 | 29,994 | transformers | 2022-03-02T23:29:05 |
---
language:
- multilingual
- ar
- as
- br
- ca
- cnh
- cs
- cv
- cy
- de
- dv
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- hi
- hsb
- hu
- ia
- id
- ja
- ka
- ky
- lg
- lt
- lv
- mn
- mt
- nl
- or
- pl
- pt
- ro
- ru
- sah
- sl
- ta
- th
- tr
- tt
- uk
- vi
license: apache-2.0
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
- robust-speech-event
- speech
- xlsr-fine-tuning-week
datasets:
- common_voice
language_bcp47:
- fy-NL
- ga-IE
- pa-IN
- rm-sursilv
- rm-vallader
- sv-SE
- zh-CN
- zh-HK
- zh-TW
model-index:
- name: XLSR Wav2Vec2 for 56 languages by Voidful
results:
- task:
type: automatic-speech-recognition
name: Speech Recognition
dataset:
name: Common Voice
type: common_voice
metrics:
- type: cer
value: 23.21
name: Test CER
---
# Model Card for wav2vec2-xlsr-multilingual-56
# Model Details
## Model Description
- **Developed by:** voidful
- **Shared by [Optional]:** Hugging Face
- **Model type:** automatic-speech-recognition
- **Language(s) (NLP):** multilingual (*56 languages, 1 model for multilingual ASR*)
- **License:** Apache-2.0
- **Related Models:**
- **Parent Model:** wav2vec
- **Resources for more information:**
- [GitHub Repo](https://github.com/voidful/wav2vec2-xlsr-multilingual-56)
- [Model Space](https://huggingface.co/spaces/Kamtera/Persian_Automatic_Speech_Recognition_and-more)
# Uses
## Direct Use
This model can be used for the task of automatic speech recognition.
## Downstream Use [Optional]
More information needed
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
# Training Details
## Training Data
See the [common_voice dataset card](https://huggingface.co/datasets/common_voice)
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on 56 languages using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
When using this model, make sure that your speech input is sampled at 16 kHz.
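For example, a clip recorded at another rate can be checked and resampled with torchaudio before inference (a minimal sketch, not from the original card; the file name is a placeholder):
```
import torchaudio

speech, rate = torchaudio.load("clip.wav")  # placeholder path
if rate != 16_000:
    # Resample to the 16 kHz rate the model expects.
    speech = torchaudio.transforms.Resample(orig_freq=rate, new_freq=16_000)(speech)
```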
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
More information needed
### Factors
More information needed
### Metrics
More information needed
## Results
<details>
<summary> Click to expand </summary>
| Common Voice Languages | Num. of data | Hour | WER | CER |
|------------------------|--------------|--------|--------|-------|
| ar | 21744 | 81.5 | 75.29 | 31.23 |
| as | 394 | 1.1 | 95.37 | 46.05 |
| br | 4777 | 7.4 | 93.79 | 41.16 |
| ca | 301308 | 692.8 | 24.80 | 10.39 |
| cnh | 1563 | 2.4 | 68.11 | 23.10 |
| cs | 9773 | 39.5 | 67.86 | 12.57 |
| cv | 1749 | 5.9 | 95.43 | 34.03 |
| cy | 11615 | 106.7 | 67.03 | 23.97 |
| de | 262113 | 822.8 | 27.03 | 6.50 |
| dv | 4757 | 18.6 | 92.16 | 30.15 |
| el | 3717 | 11.1 | 94.48 | 58.67 |
| en | 580501 | 1763.6 | 34.87 | 14.84 |
| eo | 28574 | 162.3 | 37.77 | 6.23 |
| es | 176902 | 337.7 | 19.63 | 5.41 |
| et | 5473 | 35.9 | 86.87 | 20.79 |
| eu | 12677 | 90.2 | 44.80 | 7.32 |
| fa | 12806 | 290.6 | 53.81 | 15.09 |
| fi | 875 | 2.6 | 93.78 | 27.57 |
| fr | 314745 | 664.1 | 33.16 | 13.94 |
| fy-NL | 6717 | 27.2 | 72.54 | 26.58 |
| ga-IE | 1038 | 3.5 | 92.57 | 51.02 |
| hi | 292 | 2.0 | 90.95 | 57.43 |
| hsb | 980 | 2.3 | 89.44 | 27.19 |
| hu | 4782 | 9.3 | 97.15 | 36.75 |
| ia | 5078 | 10.4 | 52.00 | 11.35 |
| id | 3965 | 9.9 | 82.50 | 22.82 |
| it | 70943 | 178.0 | 39.09 | 8.72 |
| ja | 1308 | 8.2 | 99.21 | 62.06 |
| ka | 1585 | 4.0 | 90.53 | 18.57 |
| ky | 3466 | 12.2 | 76.53 | 19.80 |
| lg | 1634 | 17.1 | 98.95 | 43.84 |
| lt | 1175 | 3.9 | 92.61 | 26.81 |
| lv | 4554 | 6.3 | 90.34 | 30.81 |
| mn | 4020 | 11.6 | 82.68 | 30.14 |
| mt | 3552 | 7.8 | 84.18 | 22.96 |
| nl | 14398 | 71.8 | 57.18 | 19.01 |
| or | 517 | 0.9 | 90.93 | 27.34 |
| pa-IN | 255 | 0.8 | 87.95 | 42.03 |
| pl | 12621 | 112.0 | 56.14 | 12.06 |
| pt | 11106 | 61.3 | 53.24 | 16.32 |
| rm-sursilv | 2589 | 5.9 | 78.17 | 23.31 |
| rm-vallader | 931 | 2.3 | 73.67 | 21.76 |
| ro | 4257 | 8.7 | 83.84 | 21.95 |
| ru | 23444 | 119.1 | 61.83 | 15.18 |
| sah | 1847 | 4.4 | 94.38 | 38.46 |
| sl | 2594 | 6.7 | 84.21 | 20.54 |
| sv-SE | 4350 | 20.8 | 83.68 | 30.79 |
| ta | 3788 | 18.4 | 84.19 | 21.60 |
| th | 4839 | 11.7 | 141.87 | 37.16 |
| tr | 3478 | 22.3 | 66.77 | 15.55 |
| tt | 13338 | 26.7 | 86.80 | 33.57 |
| uk | 7271 | 39.4 | 70.23 | 14.34 |
| vi | 421 | 1.7 | 96.06 | 66.25 |
| zh-CN | 27284 | 58.7 | 89.67 | 23.96 |
| zh-HK | 12678 | 92.1 | 81.77 | 18.82 |
| zh-TW | 6402 | 56.6 | 85.08 | 29.07 |
</details>
# Model Examination
More information needed
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed
# Citation
**BibTeX:**
```
More information needed
```
**APA:**
```
More information needed
```
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
voidful in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
## Env setup:
```
!pip install torchaudio
!pip install datasets transformers
!pip install asrp
!wget -O lang_ids.pk https://huggingface.co/voidful/wav2vec2-xlsr-multilingual-56/raw/main/lang_ids.pk
```
## Usage
```
import pickle

import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_name = "voidful/wav2vec2-xlsr-multilingual-56"
processor_name = "voidful/wav2vec2-xlsr-multilingual-56"
device = "cuda"

# Mapping from language code to the vocabulary ids valid for that language.
with open("lang_ids.pk", "rb") as f:
    lang_ids = pickle.load(f)

model = Wav2Vec2ForCTC.from_pretrained(model_name).to(device)
processor = Wav2Vec2Processor.from_pretrained(processor_name)
model.eval()

def load_file_to_data(file, sampling_rate=16_000):
    batch = {}
    speech, _ = torchaudio.load(file)
    if sampling_rate != 16_000:
        # Resample to the 16 kHz rate the model was trained on.
        resampler = torchaudio.transforms.Resample(orig_freq=sampling_rate, new_freq=16_000)
        batch["speech"] = resampler.forward(speech.squeeze(0)).numpy()
    else:
        batch["speech"] = speech.squeeze(0).numpy()
    batch["sampling_rate"] = 16_000
    return batch

def predict(data):
    features = processor(data["speech"], sampling_rate=data["sampling_rate"], padding=True, return_tensors="pt")
    input_values = features.input_values.to(device)
    attention_mask = features.attention_mask.to(device)
    with torch.no_grad():
        logits = model(input_values, attention_mask=attention_mask).logits
    decoded_results = []
    for logit in logits:
        pred_ids = torch.argmax(logit, dim=-1)
        # Keep only frames whose argmax is a real token (id >= 1, i.e. not padding).
        mask = pred_ids.ge(1).unsqueeze(-1).expand(logit.size())
        vocab_size = logit.size()[-1]
        voice_prob = torch.nn.functional.softmax(torch.masked_select(logit, mask).view(-1, vocab_size), dim=-1)
        comb_pred_ids = torch.argmax(voice_prob, dim=-1)
        decoded_results.append(processor.decode(comb_pred_ids))
    return decoded_results

def predict_lang_specific(data, lang_code):
    features = processor(data["speech"], sampling_rate=data["sampling_rate"], padding=True, return_tensors="pt")
    input_values = features.input_values.to(device)
    attention_mask = features.attention_mask.to(device)
    with torch.no_grad():
        logits = model(input_values, attention_mask=attention_mask).logits
    decoded_results = []
    for logit in logits:
        pred_ids = torch.argmax(logit, dim=-1)
        # Drop padding frames before re-normalizing the per-frame distributions.
        mask = ~pred_ids.eq(processor.tokenizer.pad_token_id).unsqueeze(-1).expand(logit.size())
        vocab_size = logit.size()[-1]
        voice_prob = torch.nn.functional.softmax(torch.masked_select(logit, mask).view(-1, vocab_size), dim=-1)
        filtered_input = pred_ids[pred_ids != processor.tokenizer.pad_token_id].view(1, -1).to(device)
        if len(filtered_input[0]) == 0:
            decoded_results.append("")
        else:
            # Zero out every vocabulary entry that does not belong to the target language.
            lang_mask = torch.zeros(voice_prob.shape[-1])
            lang_index = torch.tensor(sorted(lang_ids[lang_code]))
            lang_mask.index_fill_(0, lang_index, 1)
            lang_mask = lang_mask.to(device)
            comb_pred_ids = torch.argmax(lang_mask * voice_prob, dim=-1)
            decoded_results.append(processor.decode(comb_pred_ids))
    return decoded_results

predict(load_file_to_data('audio file path', sampling_rate=16_000))  # beware of the audio file sampling rate
predict_lang_specific(load_file_to_data('audio file path', sampling_rate=16_000), 'en')  # beware of the audio file sampling rate
```
</details>
| 12,129 | [
[
-0.045318603515625,
-0.036529541015625,
0.0056610107421875,
0.0107574462890625,
-0.0078277587890625,
-0.0085906982421875,
-0.0038623809814453125,
-0.0238037109375,
0.031463623046875,
0.0132598876953125,
-0.046142578125,
-0.042144775390625,
-0.04388427734375,
-0.025054931640625,
0.004001617431640625,
0.043853759765625,
0.01502227783203125,
0.00048232078552246094,
-0.0009107589721679688,
-0.007110595703125,
-0.0364990234375,
-0.0168914794921875,
-0.06536865234375,
-0.01087188720703125,
0.00756072998046875,
0.0222015380859375,
0.0445556640625,
0.052978515625,
0.0198822021484375,
0.03533935546875,
-0.03924560546875,
0.00027823448181152344,
-0.013916015625,
-0.0285491943359375,
0.01776123046875,
-0.0284881591796875,
-0.032318115234375,
-0.004566192626953125,
0.03875732421875,
0.046173095703125,
-0.0095062255859375,
0.01708984375,
0.01061248779296875,
0.05035400390625,
-0.026397705078125,
0.024383544921875,
-0.0321044921875,
-0.009307861328125,
-0.012908935546875,
0.0004544258117675781,
-0.0029582977294921875,
-0.0144500732421875,
0.004764556884765625,
-0.035919189453125,
0.00626373291015625,
-0.008056640625,
0.08306884765625,
0.0116119384765625,
-0.015655517578125,
-0.01450347900390625,
-0.0311737060546875,
0.0599365234375,
-0.054229736328125,
0.045684814453125,
0.036285400390625,
-0.00482177734375,
-0.00039577484130859375,
-0.049560546875,
-0.053741455078125,
0.0088653564453125,
-0.005252838134765625,
0.0323486328125,
-0.02130126953125,
-0.028228759765625,
0.0240325927734375,
0.0284271240234375,
-0.045623779296875,
0.0076446533203125,
-0.050628662109375,
-0.0265655517578125,
0.056060791015625,
0.0033588409423828125,
0.03125,
-0.03826904296875,
-0.037933349609375,
-0.018707275390625,
-0.01708984375,
0.0523681640625,
0.03289794921875,
0.0284881591796875,
-0.0556640625,
0.040802001953125,
0.0002524852752685547,
0.02850341796875,
0.0018901824951171875,
-0.004756927490234375,
0.0601806640625,
-0.040130615234375,
-0.0137176513671875,
0.00390625,
0.060089111328125,
0.03155517578125,
-0.0108642578125,
0.013641357421875,
0.00591278076171875,
0.007335662841796875,
-0.01102447509765625,
-0.05303955078125,
-0.01171112060546875,
0.044677734375,
-0.039825439453125,
-0.0119476318359375,
0.0093994140625,
-0.06219482421875,
0.019317626953125,
-0.043853759765625,
0.0149078369140625,
-0.01200103759765625,
-0.05340576171875,
0.004119873046875,
-0.004772186279296875,
0.0209503173828125,
0.015899658203125,
-0.061431884765625,
0.0265045166015625,
0.0308990478515625,
0.07373046875,
-0.003383636474609375,
-0.0008087158203125,
-0.00650787353515625,
0.006748199462890625,
-0.02459716796875,
0.056060791015625,
-0.0126495361328125,
-0.042327880859375,
-0.021881103515625,
0.0196075439453125,
-0.0075531005859375,
-0.0220794677734375,
0.0523681640625,
-0.004055023193359375,
0.02752685546875,
-0.0204315185546875,
-0.0160980224609375,
-0.0037364959716796875,
0.0013170242309570312,
-0.0513916015625,
0.09661865234375,
0.00011563301086425781,
-0.06768798828125,
0.0146331787109375,
-0.042144775390625,
-0.00504302978515625,
-0.00868988037109375,
-0.00315093994140625,
-0.0634765625,
-0.0288848876953125,
0.01502227783203125,
0.0158843994140625,
-0.029083251953125,
0.01348876953125,
-0.01439666748046875,
-0.01284027099609375,
-0.007190704345703125,
-0.0233612060546875,
0.10888671875,
0.0379638671875,
-0.038421630859375,
-0.0021953582763671875,
-0.06787109375,
0.01277923583984375,
0.0089111328125,
-0.0288238525390625,
-0.00460052490234375,
-0.0120849609375,
0.0126190185546875,
0.0284271240234375,
0.0261383056640625,
-0.042999267578125,
-0.002384185791015625,
-0.033233642578125,
0.039093017578125,
0.051422119140625,
0.003505706787109375,
0.0260162353515625,
-0.03515625,
0.0277862548828125,
0.010345458984375,
0.01142120361328125,
0.0196990966796875,
-0.0323486328125,
-0.061553955078125,
-0.03302001953125,
0.0021991729736328125,
0.048126220703125,
-0.0221099853515625,
0.04833984375,
0.0043182373046875,
-0.0426025390625,
-0.0367431640625,
-0.00554656982421875,
0.03448486328125,
0.03326416015625,
0.0196533203125,
0.0006427764892578125,
-0.06158447265625,
-0.06103515625,
-0.00315093994140625,
-0.0101470947265625,
0.0182342529296875,
0.045745849609375,
0.04754638671875,
-0.019012451171875,
0.06915283203125,
-0.044097900390625,
-0.035400390625,
-0.0124664306640625,
-0.005748748779296875,
0.035247802734375,
0.043914794921875,
0.05108642578125,
-0.06927490234375,
-0.047607421875,
-0.0011281967163085938,
-0.045867919921875,
0.003658294677734375,
-0.00252532958984375,
0.0003705024719238281,
0.02392578125,
0.023956298828125,
-0.046722412109375,
0.038848876953125,
0.059051513671875,
-0.0297393798828125,
0.053070068359375,
-0.0298614501953125,
0.02459716796875,
-0.08148193359375,
0.02880859375,
0.003498077392578125,
-0.00457000732421875,
-0.0467529296875,
-0.0247039794921875,
-0.0135955810546875,
-0.02081298828125,
-0.047332763671875,
0.053466796875,
-0.04949951171875,
-0.0043182373046875,
0.00681304931640625,
0.007366180419921875,
-0.008209228515625,
0.0487060546875,
0.01383209228515625,
0.07733154296875,
0.052276611328125,
-0.05340576171875,
0.017913818359375,
0.0290069580078125,
-0.03204345703125,
0.0285491943359375,
-0.055023193359375,
0.01467132568359375,
-0.006397247314453125,
0.01381683349609375,
-0.07659912109375,
-0.00431060791015625,
0.0188446044921875,
-0.0582275390625,
0.0262451171875,
-0.007068634033203125,
-0.0137176513671875,
-0.062347412109375,
-0.021209716796875,
0.0107269287109375,
0.033660888671875,
-0.02960205078125,
0.043975830078125,
0.05548095703125,
-0.01442718505859375,
-0.045501708984375,
-0.06427001953125,
-0.0011386871337890625,
-0.01221466064453125,
-0.060638427734375,
0.038421630859375,
-0.021026611328125,
-0.0165252685546875,
-0.0031490325927734375,
-0.00661468505859375,
-0.0176849365234375,
0.001140594482421875,
0.023895263671875,
0.0240936279296875,
-0.0003211498260498047,
-0.02001953125,
-0.0134735107421875,
-0.031707763671875,
0.00246429443359375,
0.0009756088256835938,
0.036407470703125,
-0.0194244384765625,
-0.008544921875,
-0.06524658203125,
0.008087158203125,
0.056365966796875,
-0.03955078125,
0.06634521484375,
0.041107177734375,
-0.0323486328125,
0.0033702850341796875,
-0.0316162109375,
-0.0011844635009765625,
-0.033660888671875,
0.032196044921875,
-0.0467529296875,
-0.05267333984375,
0.05670166015625,
0.0033397674560546875,
0.006099700927734375,
0.047119140625,
0.04351806640625,
-0.017059326171875,
0.0665283203125,
0.0236053466796875,
-0.0113372802734375,
0.0288543701171875,
-0.0618896484375,
0.002002716064453125,
-0.0650634765625,
-0.053497314453125,
-0.060577392578125,
-0.005863189697265625,
-0.0298919677734375,
-0.0241546630859375,
0.021026611328125,
-0.00262451171875,
-0.0300445556640625,
0.028106689453125,
-0.058380126953125,
0.00931549072265625,
0.052276611328125,
0.0186004638671875,
0.0021209716796875,
-0.006771087646484375,
-0.005847930908203125,
-0.009246826171875,
-0.0322265625,
-0.03155517578125,
0.06024169921875,
0.0306854248046875,
0.0254364013671875,
0.037322998046875,
0.040252685546875,
0.01560211181640625,
-0.0273284912109375,
-0.0391845703125,
0.01424407958984375,
-0.013946533203125,
-0.058837890625,
-0.0258636474609375,
-0.018707275390625,
-0.06439208984375,
0.0225372314453125,
-0.0263519287109375,
-0.06494140625,
0.033233642578125,
0.0132904052734375,
-0.0253753662109375,
0.027862548828125,
-0.04034423828125,
0.058013916015625,
-0.02764892578125,
-0.03515625,
-0.006633758544921875,
-0.05133056640625,
0.0247802734375,
-0.0007433891296386719,
0.03668212890625,
-0.011505126953125,
-0.00011169910430908203,
0.0556640625,
-0.051971435546875,
0.043212890625,
-0.023834228515625,
0.00812530517578125,
0.038604736328125,
-0.0159759521484375,
0.05322265625,
-0.00504302978515625,
-0.0199432373046875,
0.00988006591796875,
0.0061187744140625,
-0.0205230712890625,
-0.01212310791015625,
0.0731201171875,
-0.0841064453125,
-0.036865234375,
-0.03790283203125,
-0.0303497314453125,
0.0096282958984375,
0.024078369140625,
0.03253173828125,
0.040130615234375,
-0.0052490234375,
0.02008056640625,
0.05487060546875,
-0.025909423828125,
0.031280517578125,
0.024383544921875,
0.005580902099609375,
-0.056610107421875,
0.051483154296875,
0.0164031982421875,
0.0308990478515625,
0.00940704345703125,
0.0203094482421875,
-0.045501708984375,
-0.02471923828125,
-0.01458740234375,
0.018951416015625,
-0.005001068115234375,
-0.006977081298828125,
-0.06646728515625,
-0.011627197265625,
-0.0684814453125,
-0.0160369873046875,
-0.039306640625,
-0.0241546630859375,
-0.027923583984375,
-0.01311492919921875,
0.04791259765625,
0.038482666015625,
-0.02227783203125,
0.0148468017578125,
-0.035369873046875,
0.0283660888671875,
0.0036220550537109375,
0.0161285400390625,
-0.0010223388671875,
-0.046051025390625,
-0.02099609375,
0.0233612060546875,
-0.01207733154296875,
-0.0484619140625,
0.053131103515625,
0.0006852149963378906,
0.044708251953125,
0.01837158203125,
0.004520416259765625,
0.06463623046875,
-0.0217742919921875,
0.068603515625,
0.03192138671875,
-0.07293701171875,
0.043853759765625,
-0.0241546630859375,
0.0284881591796875,
0.044403076171875,
0.022552490234375,
-0.03887939453125,
-0.0250091552734375,
-0.064208984375,
-0.06280517578125,
0.07244873046875,
0.0272216796875,
0.0087890625,
0.0166778564453125,
0.01129150390625,
-0.0428466796875,
0.003986358642578125,
-0.06646728515625,
-0.06787109375,
-0.010986328125,
0.003940582275390625,
-0.001617431640625,
-0.0149383544921875,
-0.0010232925415039062,
-0.03289794921875,
0.06707763671875,
0.024658203125,
0.03558349609375,
0.0175018310546875,
0.0110015869140625,
-0.0038242340087890625,
0.0196380615234375,
0.06280517578125,
0.029541015625,
-0.038970947265625,
-0.00940704345703125,
0.01654052734375,
-0.040130615234375,
0.0072784423828125,
-0.0050048828125,
-0.0242462158203125,
0.008544921875,
0.00836181640625,
0.05902099609375,
0.014007568359375,
-0.00865936279296875,
0.042999267578125,
0.0032863616943359375,
-0.041839599609375,
-0.052520751953125,
-0.01335906982421875,
0.03155517578125,
0.0160064697265625,
0.022186279296875,
0.02001953125,
0.0132293701171875,
-0.049774169921875,
0.0181121826171875,
0.041046142578125,
-0.043548583984375,
-0.005092620849609375,
0.07769775390625,
0.0257415771484375,
-0.03167724609375,
0.0311737060546875,
0.01244354248046875,
-0.056488037109375,
0.081298828125,
0.03741455078125,
0.045867919921875,
-0.0257110595703125,
-0.00630950927734375,
0.07586669921875,
0.02069091796875,
0.0009441375732421875,
0.0272216796875,
0.0009822845458984375,
-0.0321044921875,
0.002285003662109375,
-0.0367431640625,
-0.00870513916015625,
0.019744873046875,
-0.06396484375,
0.045928955078125,
-0.0251617431640625,
-0.0272064208984375,
-0.00826263427734375,
0.03155517578125,
-0.048065185546875,
0.035247802734375,
0.004791259765625,
0.07330322265625,
-0.07537841796875,
0.06494140625,
0.03173828125,
-0.0635986328125,
-0.0880126953125,
-0.037322998046875,
0.01593017578125,
-0.044342041015625,
0.045562744140625,
0.00013327598571777344,
-0.0169830322265625,
0.002574920654296875,
-0.033355712890625,
-0.0869140625,
0.0960693359375,
-0.0015554428100585938,
-0.056365966796875,
0.0186004638671875,
-0.007076263427734375,
0.027191162109375,
-0.0038089752197265625,
0.0296478271484375,
0.040496826171875,
0.04095458984375,
0.0151214599609375,
-0.07330322265625,
0.005817413330078125,
-0.03271484375,
-0.007110595703125,
0.006137847900390625,
-0.08319091796875,
0.07513427734375,
-0.032379150390625,
-0.002410888671875,
0.005878448486328125,
0.05328369140625,
0.031036376953125,
0.0200958251953125,
0.0411376953125,
0.05792236328125,
0.06219482421875,
-0.00820159912109375,
0.062042236328125,
-0.0253143310546875,
0.05133056640625,
0.08001708984375,
0.007472991943359375,
0.073974609375,
0.0310211181640625,
-0.047332763671875,
0.032928466796875,
0.05615234375,
-0.0092926025390625,
0.046661376953125,
-0.003570556640625,
-0.02923583984375,
-0.00002104043960571289,
-0.005344390869140625,
-0.053619384765625,
0.0206298828125,
0.012847900390625,
-0.0224609375,
0.0160980224609375,
-0.0018587112426757812,
0.015350341796875,
0.00693511962890625,
-0.0247650146484375,
0.049652099609375,
0.0036144256591796875,
-0.023040771484375,
0.0369873046875,
-0.007415771484375,
0.058349609375,
-0.04669189453125,
0.006847381591796875,
0.006561279296875,
0.0254058837890625,
-0.042205810546875,
-0.07037353515625,
0.01406097412109375,
-0.0013942718505859375,
-0.035614013671875,
-0.0012388229370117188,
0.019073486328125,
-0.0308990478515625,
-0.0426025390625,
0.031036376953125,
0.0061492919921875,
0.025665283203125,
0.028533935546875,
-0.07208251953125,
0.00830841064453125,
0.02923583984375,
-0.030975341796875,
-0.0013380050659179688,
0.0173492431640625,
0.0159454345703125,
0.037567138671875,
0.0633544921875,
0.025238037109375,
0.0028781890869140625,
-0.002819061279296875,
0.0782470703125,
-0.044921875,
-0.04412841796875,
-0.038970947265625,
0.04510498046875,
-0.01186370849609375,
-0.035064697265625,
0.058685302734375,
0.060333251953125,
0.0447998046875,
0.00537872314453125,
0.057769775390625,
-0.0300750732421875,
0.05450439453125,
-0.03656005859375,
0.052734375,
-0.06024169921875,
0.0048675537109375,
-0.023681640625,
-0.047821044921875,
-0.0198211669921875,
0.0743408203125,
-0.021881103515625,
0.0159454345703125,
0.03375244140625,
0.07122802734375,
0.0193634033203125,
-0.002452850341796875,
0.006938934326171875,
0.0289154052734375,
0.0202178955078125,
0.036865234375,
0.020050048828125,
-0.046905517578125,
0.044342041015625,
-0.040740966796875,
-0.00875091552734375,
-0.00844573974609375,
-0.045623779296875,
-0.053436279296875,
-0.04412841796875,
-0.04736328125,
-0.040740966796875,
-0.01448822021484375,
0.0755615234375,
0.06390380859375,
-0.05316162109375,
-0.031646728515625,
-0.004871368408203125,
-0.007244110107421875,
-0.012176513671875,
-0.0099945068359375,
0.06585693359375,
0.006015777587890625,
-0.07073974609375,
0.0081329345703125,
0.0016870498657226562,
0.0258941650390625,
-0.007007598876953125,
-0.0235443115234375,
-0.018951416015625,
-0.0052947998046875,
0.022674560546875,
0.03533935546875,
-0.059295654296875,
-0.008575439453125,
-0.0095367431640625,
-0.0265350341796875,
0.0199127197265625,
0.0150604248046875,
-0.027923583984375,
0.03253173828125,
0.033477783203125,
0.01242828369140625,
0.054443359375,
-0.0014162063598632812,
0.016387939453125,
-0.021392822265625,
0.02581787109375,
-0.00933074951171875,
0.01763916015625,
0.01000213623046875,
-0.0394287109375,
0.043701171875,
0.0210723876953125,
-0.042938232421875,
-0.060394287109375,
-0.0207672119140625,
-0.103759765625,
-0.01629638671875,
0.08837890625,
-0.004016876220703125,
-0.0300750732421875,
-0.01000213623046875,
-0.033966064453125,
0.0233154296875,
-0.0491943359375,
0.01617431640625,
0.048309326171875,
-0.0120697021484375,
-0.0198516845703125,
-0.0628662109375,
0.035247802734375,
0.00475311279296875,
-0.054840087890625,
-0.00487518310546875,
0.0275115966796875,
0.030181884765625,
0.012603759765625,
0.0599365234375,
-0.025177001953125,
0.0162506103515625,
0.0198822021484375,
0.0318603515625,
-0.005954742431640625,
0.000988006591796875,
-0.019866943359375,
0.0028438568115234375,
0.0016775131225585938,
-0.035491943359375
]
] |
ckiplab/albert-tiny-chinese-ws | 2022-05-10T03:28:12.000Z | [
"transformers",
"pytorch",
"albert",
"token-classification",
"zh",
"license:gpl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | ckiplab | null | null | ckiplab/albert-tiny-chinese-ws | 2 | 29,948 | transformers | 2022-03-02T23:29:05 | ---
language:
- zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
- pytorch
- token-classification
- albert
- zh
license: gpl-3.0
---
# CKIP ALBERT Tiny Chinese
This project provides traditional Chinese transformers models (including ALBERT, BERT, GPT2) and NLP tools (including word segmentation, part-of-speech tagging, named entity recognition).
這個專案提供了繁體中文的 transformers 模型(包含 ALBERT、BERT、GPT2)及自然語言處理工具(包含斷詞、詞性標記、實體辨識)。
## Homepage
- https://github.com/ckiplab/ckip-transformers
## Contributors
- [Mu Yang](https://muyang.pro) at [CKIP](https://ckip.iis.sinica.edu.tw) (Author & Maintainer)
## Usage
Please use BertTokenizerFast as tokenizer instead of AutoTokenizer.
請使用 BertTokenizerFast 而非 AutoTokenizer。
```
from transformers import (
    BertTokenizerFast,
    AutoModel,
)
tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModel.from_pretrained('ckiplab/albert-tiny-chinese-ws')
```
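The `-ws` checkpoint is a token-classification model for Chinese word segmentation. Below is a minimal inference sketch; it assumes the checkpoint's `id2label` uses `B`/`I` word-boundary tags, which this card does not state explicitly:
```
import torch
from transformers import BertTokenizerFast, AutoModelForTokenClassification

tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModelForTokenClassification.from_pretrained('ckiplab/albert-tiny-chinese-ws')

text = '我喜歡自然語言處理'
inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

tokens = tokenizer.convert_ids_to_tokens(inputs['input_ids'][0])
labels = [model.config.id2label[i] for i in logits.argmax(-1)[0].tolist()]

# Group characters into words: 'B' starts a new word, 'I' continues the current one.
words, current = [], ''
for token, label in zip(tokens, labels):
    if token in ('[CLS]', '[SEP]'):
        continue
    if label == 'B' and current:
        words.append(current)
        current = token
    else:
        current += token
if current:
    words.append(current)
print(words)
```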
For full usage and more information, please refer to https://github.com/ckiplab/ckip-transformers.
有關完整使用方法及其他資訊,請參見 https://github.com/ckiplab/ckip-transformers 。
| 1,128 | [
[
-0.0245361328125,
-0.01873779296875,
0.005298614501953125,
0.0528564453125,
-0.024505615234375,
0.004512786865234375,
-0.0161590576171875,
-0.022308349609375,
0.0019178390502929688,
0.0238037109375,
-0.02679443359375,
-0.0174407958984375,
-0.038055419921875,
0.007568359375,
-0.01454925537109375,
0.060333251953125,
-0.01212310791015625,
0.0217132568359375,
0.0279388427734375,
0.004993438720703125,
-0.0175628662109375,
-0.0164794921875,
-0.0594482421875,
-0.0423583984375,
0.0010738372802734375,
0.0312042236328125,
0.05487060546875,
0.03387451171875,
0.038787841796875,
0.0230560302734375,
-0.0003304481506347656,
-0.006130218505859375,
-0.006866455078125,
-0.031768798828125,
0.000789642333984375,
-0.04425048828125,
-0.026641845703125,
-0.0184173583984375,
0.051605224609375,
0.03564453125,
0.0012712478637695312,
0.00795745849609375,
0.0167083740234375,
0.0238037109375,
-0.0223541259765625,
0.0302581787109375,
-0.04205322265625,
0.022796630859375,
-0.0153656005859375,
-0.0037021636962890625,
-0.0257568359375,
-0.0142364501953125,
0.00775146484375,
-0.04901123046875,
0.025299072265625,
0.0020599365234375,
0.09381103515625,
0.00536346435546875,
-0.0220947265625,
-0.018707275390625,
-0.050628662109375,
0.07958984375,
-0.06280517578125,
0.0273590087890625,
0.02655029296875,
0.022003173828125,
0.0010204315185546875,
-0.0811767578125,
-0.051422119140625,
-0.01274871826171875,
-0.0284271240234375,
0.0175323486328125,
0.005214691162109375,
-0.002307891845703125,
0.0300445556640625,
0.021026611328125,
-0.04583740234375,
0.022491455078125,
-0.030792236328125,
-0.0306243896484375,
0.042633056640625,
0.00396728515625,
0.035369873046875,
-0.0369873046875,
-0.032073974609375,
-0.024505615234375,
-0.0458984375,
0.0167999267578125,
0.0160064697265625,
0.010986328125,
-0.0419921875,
0.04638671875,
-0.004566192626953125,
0.0189971923828125,
0.01052093505859375,
-0.007568359375,
0.026641845703125,
-0.0264434814453125,
-0.001056671142578125,
-0.00927734375,
0.07720947265625,
0.0155181884765625,
0.0007824897766113281,
0.00995635986328125,
-0.0260162353515625,
-0.0245513916015625,
-0.021514892578125,
-0.062408447265625,
-0.052001953125,
0.0142669677734375,
-0.061065673828125,
-0.0158843994140625,
0.007671356201171875,
-0.050567626953125,
0.0192108154296875,
-0.015289306640625,
0.028533935546875,
-0.048004150390625,
-0.038482666015625,
-0.00402069091796875,
-0.0261688232421875,
0.0496826171875,
0.00958251953125,
-0.09271240234375,
0.0004391670227050781,
0.04547119140625,
0.06146240234375,
0.015838623046875,
-0.01224517822265625,
0.007350921630859375,
0.0283203125,
-0.018310546875,
0.042205810546875,
-0.007965087890625,
-0.04949951171875,
0.008819580078125,
0.005123138427734375,
0.002864837646484375,
-0.03411865234375,
0.057037353515625,
-0.0212554931640625,
0.0242156982421875,
-0.00970458984375,
-0.01421356201171875,
0.0019397735595703125,
0.006465911865234375,
-0.03668212890625,
0.0799560546875,
0.00955963134765625,
-0.06524658203125,
0.0168304443359375,
-0.06756591796875,
-0.038330078125,
0.02508544921875,
-0.0109710693359375,
-0.034423828125,
-0.007160186767578125,
0.0159912109375,
0.0211181640625,
-0.00968170166015625,
0.005252838134765625,
-0.0081024169921875,
-0.01654052734375,
-0.0010404586791992188,
-0.03125,
0.0909423828125,
0.024139404296875,
-0.01425933837890625,
0.01381683349609375,
-0.05242919921875,
0.00936126708984375,
0.0257110595703125,
-0.01385498046875,
-0.026123046875,
0.00861358642578125,
0.03887939453125,
0.0127716064453125,
0.0380859375,
-0.04327392578125,
0.039794921875,
-0.0452880859375,
0.056243896484375,
0.05731201171875,
-0.017486572265625,
0.0249481201171875,
-0.01508331298828125,
0.00038242340087890625,
0.0015439987182617188,
0.02642822265625,
-0.0025310516357421875,
-0.039306640625,
-0.08416748046875,
-0.028472900390625,
0.037200927734375,
0.049224853515625,
-0.089111328125,
0.0535888671875,
-0.022796630859375,
-0.050201416015625,
-0.031951904296875,
-0.00009304285049438477,
0.0006275177001953125,
0.0072479248046875,
0.035369873046875,
-0.0185699462890625,
-0.042205810546875,
-0.0777587890625,
0.006351470947265625,
-0.0458984375,
-0.049560546875,
-0.0019893646240234375,
0.048065185546875,
-0.033966064453125,
0.0733642578125,
-0.041015625,
-0.0264892578125,
-0.0203704833984375,
0.0426025390625,
0.0228424072265625,
0.06390380859375,
0.04290771484375,
-0.07330322265625,
-0.057830810546875,
-0.01201629638671875,
-0.0242919921875,
-0.0015974044799804688,
-0.02032470703125,
-0.006832122802734375,
-0.005279541015625,
0.005321502685546875,
-0.04974365234375,
0.0151824951171875,
0.025543212890625,
-0.0005087852478027344,
0.061370849609375,
-0.005023956298828125,
-0.0227508544921875,
-0.0909423828125,
0.00970458984375,
-0.01189422607421875,
-0.0031452178955078125,
-0.026947021484375,
-0.003231048583984375,
0.01605224609375,
-0.00705718994140625,
-0.044097900390625,
0.04571533203125,
-0.0263824462890625,
0.02508544921875,
-0.018157958984375,
-0.0056610107421875,
-0.01280975341796875,
0.04034423828125,
0.0251312255859375,
0.052520751953125,
0.0401611328125,
-0.049896240234375,
0.0289154052734375,
0.05322265625,
-0.02020263671875,
-0.00494384765625,
-0.0701904296875,
-0.00418853759765625,
0.0277862548828125,
0.0036640167236328125,
-0.06341552734375,
-0.006740570068359375,
0.04449462890625,
-0.052734375,
0.047607421875,
0.0018329620361328125,
-0.0738525390625,
-0.039215087890625,
-0.039581298828125,
0.0284576416015625,
0.047393798828125,
-0.0499267578125,
0.031982421875,
0.0185546875,
-0.016510009765625,
-0.0308990478515625,
-0.05853271484375,
-0.00312042236328125,
0.01342010498046875,
-0.0430908203125,
0.042938232421875,
-0.0199432373046875,
0.0177001953125,
0.0019893646240234375,
0.00879669189453125,
-0.030670166015625,
-0.0079803466796875,
-0.01155853271484375,
0.030120849609375,
-0.01396942138671875,
-0.0029277801513671875,
0.0171051025390625,
-0.0253448486328125,
0.0078277587890625,
0.003582000732421875,
0.04522705078125,
-0.0036182403564453125,
-0.022186279296875,
-0.048553466796875,
0.0205535888671875,
0.013458251953125,
-0.01360321044921875,
0.0283203125,
0.07904052734375,
-0.020721435546875,
-0.014190673828125,
-0.0217437744140625,
-0.0169525146484375,
-0.04168701171875,
0.039337158203125,
-0.03326416015625,
-0.06536865234375,
0.0206451416015625,
-0.0097808837890625,
0.0123291015625,
0.05810546875,
0.049560546875,
-0.005489349365234375,
0.09857177734375,
0.0723876953125,
-0.03350830078125,
0.032958984375,
-0.034149169921875,
0.0311431884765625,
-0.071533203125,
0.0212554931640625,
-0.048126220703125,
0.00386810302734375,
-0.0654296875,
-0.0218505859375,
0.002788543701171875,
0.014495849609375,
-0.0220947265625,
0.05120849609375,
-0.0567626953125,
-0.0007257461547851562,
0.053070068359375,
-0.0286865234375,
-0.0015153884887695312,
-0.012176513671875,
-0.01611328125,
-0.009429931640625,
-0.0535888671875,
-0.053375244140625,
0.053497314453125,
0.046295166015625,
0.05328369140625,
0.0034637451171875,
0.036712646484375,
-0.0001913309097290039,
0.03765869140625,
-0.062286376953125,
0.042144775390625,
-0.007328033447265625,
-0.056396484375,
-0.02252197265625,
-0.0207977294921875,
-0.06390380859375,
0.0220947265625,
-0.003902435302734375,
-0.0701904296875,
0.01526641845703125,
0.00728607177734375,
-0.0192718505859375,
0.025909423828125,
-0.037200927734375,
0.05987548828125,
-0.02978515625,
0.00386810302734375,
-0.005794525146484375,
-0.050048828125,
0.03485107421875,
-0.00548553466796875,
0.000885009765625,
-0.0058135986328125,
-0.0018072128295898438,
0.061981201171875,
-0.0196075439453125,
0.061065673828125,
-0.00481414794921875,
0.0010805130004882812,
0.0260467529296875,
-0.0155181884765625,
0.0259552001953125,
0.023895263671875,
0.01070404052734375,
0.04705810546875,
0.0241241455078125,
-0.0261688232421875,
-0.0172271728515625,
0.039031982421875,
-0.06494140625,
-0.032012939453125,
-0.044769287109375,
-0.01416015625,
0.011199951171875,
0.04254150390625,
0.038848876953125,
-0.005413055419921875,
0.003185272216796875,
0.0188140869140625,
0.0166473388671875,
-0.0274200439453125,
0.048431396484375,
0.045623779296875,
-0.005649566650390625,
-0.0294647216796875,
0.07305908203125,
0.007801055908203125,
0.00653839111328125,
0.044891357421875,
0.00400543212890625,
-0.01123046875,
-0.03289794921875,
-0.029144287109375,
0.035675048828125,
-0.023681640625,
-0.005046844482421875,
-0.0255889892578125,
-0.03680419921875,
-0.0440673828125,
0.010040283203125,
-0.03289794921875,
-0.03167724609375,
-0.0252685546875,
0.007183074951171875,
-0.027801513671875,
0.007904052734375,
-0.0281219482421875,
0.0379638671875,
-0.0814208984375,
0.039642333984375,
0.0170135498046875,
0.01324462890625,
0.00550079345703125,
-0.02044677734375,
-0.04766845703125,
0.007160186767578125,
-0.06640625,
-0.045623779296875,
0.04364013671875,
0.002857208251953125,
0.040679931640625,
0.0489501953125,
0.019500732421875,
0.039947509765625,
-0.049407958984375,
0.08331298828125,
0.0304718017578125,
-0.0836181640625,
0.0308990478515625,
-0.00888824462890625,
0.0232696533203125,
0.021697998046875,
0.03387451171875,
-0.060394287109375,
-0.0245361328125,
-0.039093017578125,
-0.08184814453125,
0.05413818359375,
0.030364990234375,
0.0196990966796875,
0.0029354095458984375,
-0.005825042724609375,
-0.005725860595703125,
0.01358795166015625,
-0.074951171875,
-0.03924560546875,
-0.0257415771484375,
-0.0252685546875,
0.015655517578125,
-0.032012939453125,
-0.0006875991821289062,
-0.01971435546875,
0.0806884765625,
-0.0037517547607421875,
0.060821533203125,
0.027313232421875,
0.0025787353515625,
-0.01352691650390625,
0.007671356201171875,
0.033294677734375,
0.035552978515625,
-0.0145263671875,
-0.0176849365234375,
0.00704193115234375,
-0.042724609375,
-0.0166015625,
0.026275634765625,
-0.034149169921875,
0.03057861328125,
0.0408935546875,
0.045196533203125,
0.0084686279296875,
-0.0318603515625,
0.04437255859375,
-0.0218048095703125,
-0.01247406005859375,
-0.07196044921875,
-0.0038280487060546875,
0.000995635986328125,
0.006473541259765625,
0.041534423828125,
-0.006038665771484375,
0.01062774658203125,
-0.01143646240234375,
0.010162353515625,
0.0298004150390625,
-0.038482666015625,
-0.03875732421875,
0.055877685546875,
0.032257080078125,
-0.0269927978515625,
0.0579833984375,
-0.01227569580078125,
-0.06011962890625,
0.0458984375,
0.0308380126953125,
0.06988525390625,
-0.0181427001953125,
0.004482269287109375,
0.047149658203125,
0.04339599609375,
0.0010976791381835938,
0.01143646240234375,
-0.0220489501953125,
-0.0654296875,
-0.03472900390625,
-0.0306243896484375,
-0.0352783203125,
0.0247039794921875,
-0.03631591796875,
0.046234130859375,
-0.032562255859375,
-0.00542449951171875,
0.000255584716796875,
0.00016558170318603516,
-0.04290771484375,
0.004547119140625,
0.00839996337890625,
0.08367919921875,
-0.04962158203125,
0.08697509765625,
0.04229736328125,
-0.040496826171875,
-0.0648193359375,
0.0136566162109375,
-0.021240234375,
-0.0528564453125,
0.07177734375,
0.024688720703125,
0.0243682861328125,
0.00522613525390625,
-0.05389404296875,
-0.0567626953125,
0.07696533203125,
-0.013824462890625,
-0.02972412109375,
-0.00597381591796875,
0.0257415771484375,
0.031585693359375,
-0.002925872802734375,
0.035675048828125,
0.010345458984375,
0.04638671875,
-0.0166778564453125,
-0.08489990234375,
-0.0182647705078125,
-0.0183563232421875,
0.00972747802734375,
0.016204833984375,
-0.07177734375,
0.055938720703125,
0.0011529922485351562,
-0.026031494140625,
0.036285400390625,
0.0726318359375,
-0.002410888671875,
0.0178680419921875,
0.041107177734375,
0.0311126708984375,
-0.0005211830139160156,
-0.0164642333984375,
0.04229736328125,
-0.03900146484375,
0.06219482421875,
0.062286376953125,
0.0021572113037109375,
0.054931640625,
0.032379150390625,
-0.038238525390625,
0.036163330078125,
0.0528564453125,
-0.039276123046875,
0.04693603515625,
-0.0032787322998046875,
-0.00839996337890625,
-0.005611419677734375,
0.004642486572265625,
-0.03741455078125,
0.01702880859375,
0.0247650146484375,
-0.0201263427734375,
-0.0033740997314453125,
-0.007415771484375,
-0.0030155181884765625,
-0.030120849609375,
-0.01009368896484375,
0.038055419921875,
0.0189666748046875,
-0.0121917724609375,
0.0350341796875,
0.0273284912109375,
0.0758056640625,
-0.07696533203125,
-0.022003173828125,
0.0176849365234375,
0.0145263671875,
-0.008056640625,
-0.043609619140625,
0.01473236083984375,
-0.023223876953125,
-0.01206207275390625,
-0.0102996826171875,
0.056243896484375,
-0.02435302734375,
-0.04180908203125,
0.024871826171875,
0.006671905517578125,
0.00814056396484375,
0.02532958984375,
-0.08331298828125,
-0.022308349609375,
0.02581787109375,
-0.03472900390625,
0.0151824951171875,
0.0175933837890625,
0.00841522216796875,
0.050750732421875,
0.06121826171875,
0.013580322265625,
-0.01403045654296875,
-0.00792694091796875,
0.0615234375,
-0.0411376953125,
-0.043121337890625,
-0.0528564453125,
0.053985595703125,
-0.019989013671875,
-0.023895263671875,
0.058135986328125,
0.053680419921875,
0.0889892578125,
-0.02484130859375,
0.06781005859375,
-0.031890869140625,
0.0567626953125,
-0.01093292236328125,
0.061676025390625,
-0.040313720703125,
-0.00408172607421875,
-0.0210418701171875,
-0.058441162109375,
-0.01763916015625,
0.060943603515625,
-0.01352691650390625,
-0.0104827880859375,
0.054718017578125,
0.049407958984375,
-0.002437591552734375,
-0.0126800537109375,
0.01395416259765625,
0.0179443359375,
0.047393798828125,
0.0308380126953125,
0.043609619140625,
-0.04150390625,
0.05328369140625,
-0.0543212890625,
-0.01352691650390625,
-0.0115966796875,
-0.04669189453125,
-0.049468994140625,
-0.039520263671875,
-0.0234375,
-0.005382537841796875,
-0.0293121337890625,
0.06402587890625,
0.0535888671875,
-0.076171875,
-0.0278472900390625,
0.00649261474609375,
0.0127410888671875,
-0.0266571044921875,
-0.025604248046875,
0.045989990234375,
-0.0228729248046875,
-0.0855712890625,
0.003284454345703125,
0.01053619384765625,
0.007904052734375,
-0.0210418701171875,
0.0004978179931640625,
-0.012786865234375,
-0.01444244384765625,
0.0296173095703125,
0.037811279296875,
-0.04742431640625,
-0.031036376953125,
-0.00998687744140625,
-0.0172271728515625,
0.005985260009765625,
0.04541015625,
-0.01428985595703125,
0.023956298828125,
0.042816162109375,
0.0182647705078125,
0.0230255126953125,
-0.019500732421875,
0.039886474609375,
-0.039794921875,
0.028167724609375,
0.027587890625,
0.041107177734375,
0.023895263671875,
-0.013092041015625,
0.0374755859375,
0.035491943359375,
-0.052978515625,
-0.04949951171875,
0.025054931640625,
-0.0723876953125,
-0.014007568359375,
0.06805419921875,
-0.012176513671875,
-0.0157470703125,
-0.0023345947265625,
-0.041046142578125,
0.0419921875,
-0.0231475830078125,
0.0450439453125,
0.054840087890625,
-0.0064849853515625,
-0.00494384765625,
-0.04278564453125,
0.0295257568359375,
0.03021240234375,
-0.0233306884765625,
-0.02545166015625,
-0.002902984619140625,
0.0103607177734375,
0.048004150390625,
0.03326416015625,
-0.0029315948486328125,
0.005352020263671875,
-0.005191802978515625,
0.039031982421875,
0.0023555755615234375,
0.0133514404296875,
0.0024394989013671875,
-0.01294708251953125,
0.00794219970703125,
-0.03399658203125
]
] |
elastic/distilbert-base-cased-finetuned-conll03-english | 2023-04-20T10:32:13.000Z | [
"transformers",
"pytorch",
"safetensors",
"distilbert",
"token-classification",
"en",
"dataset:conll2003",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | elastic | null | null | elastic/distilbert-base-cased-finetuned-conll03-english | 10 | 29,857 | transformers | 2022-03-02T23:29:05 | ---
language: en
license: apache-2.0
datasets:
- conll2003
model-index:
- name: elastic/distilbert-base-cased-finetuned-conll03-english
results:
- task:
type: token-classification
name: Token Classification
dataset:
name: conll2003
type: conll2003
config: conll2003
split: validation
metrics:
- type: accuracy
value: 0.9834432212868665
name: Accuracy
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOTZmZTJlMzUzOTAzZjg3N2UxNmMxMjQ2M2FhZTM4MDdkYzYyYTYyNjM1YjQ0M2Y4ZmIyMzkwMmY5YjZjZGVhYSIsInZlcnNpb24iOjF9.QaSLUR7AtQmE9F-h6EBueF6INQgdKwUUzS3bNvRu44rhNDY1KAJJkmDC8FeAIVMnlOSvPKvr5pOvJ59W1zckCw
- type: precision
value: 0.9857564461012737
name: Precision
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMDVmNmNmNWIwNTI0Yzc0YTI1NTk2NDM4YjY4NzY0ODQ4NzQ5MDQxMzYyYWM4YzUwNmYxZWQ1NTU2YTZiM2U2MCIsInZlcnNpb24iOjF9.ui_o64VBS_oC89VeQTx_B-nUUM0ZaivFyb6wNrYZcopJXvYgzptLCkARdBKdBajFjjupdhtq1VCdGbJ3yaXgBA
- type: recall
value: 0.9882123948925569
name: Recall
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODg4Mzg1NTY3NjU4ZGQxOGVhMzQxNWU0ZTYxNWM2ZTg1OGZlM2U5ZGMxYTA2NzdiZjM5YWFkZjkzOGYwYTlkMyIsInZlcnNpb24iOjF9.8jHJv_5ZQp_CX3-k8-C3c5Hs4zp7bJPRTeE5SFrNgeX-FdhPv_8bHBM_DqOD2P_nkAzQ_PtEFfEokQpouZFJCw
- type: f1
value: 0.9869828926905132
name: F1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzZlOGRjMDllYWY5MjdhODk2MmNmMDk5MDQyZGYzZDYwZTE1ZDY2MDNlMzAzN2JlMmE5Y2M3ZTNkOWE2MDBjYyIsInZlcnNpb24iOjF9.VKwzPQFSbrnUZ25gkKUZvYO_xFZcaTOSkDcN-YCxksF5DRnKudKI2HmvO8l8GCsQTCoD4DiSTKzghzLMxB1jCg
- type: loss
value: 0.07748260349035263
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNmVmOTQ2MWI2MzZhY2U2ODQ3YjA0ZWVjYzU1NGRlMTczZDI0NmM0OWI4YmIzMmEyYjlmNDIwYmRiODM4MWM0YiIsInZlcnNpb24iOjF9.0Prq087l2Xfh-ceS99zzUDcKM4Vr4CLM2rF1F1Fqd2fj9MOhVZEXF4JACVn0fWAFqfZIPS2GD8sSwfNYaXkZAA
---
[DistilBERT base cased](https://huggingface.co/distilbert-base-cased), fine-tuned for NER using the [conll03 english dataset](https://huggingface.co/datasets/conll2003). Note that this model is case-sensitive: "english" is treated differently from "English". For the case-insensitive version, please use [elastic/distilbert-base-uncased-finetuned-conll03-english](https://huggingface.co/elastic/distilbert-base-uncased-finetuned-conll03-english).
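For quick inference, the model can be used with the standard `transformers` pipeline API. The snippet below is a minimal sketch: the example sentence is illustrative, and the `aggregation_strategy` argument assumes a recent `transformers` release (it is not present in the 4.3.1 version listed under Versions).
```python
from transformers import pipeline

ner = pipeline(
    "ner",
    model="elastic/distilbert-base-cased-finetuned-conll03-english",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

# Each result carries the entity group (PER/ORG/LOC/MISC), a score, and a character span.
print(ner("Elastic is headquartered in Mountain View, California."))
```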
## Versions
- Transformers version: 4.3.1
- Datasets version: 1.3.0
## Training
```bash
$ python run_ner.py \
--model_name_or_path distilbert-base-cased \
--label_all_tokens True \
--return_entity_level_metrics True \
--dataset_name conll2003 \
--output_dir /tmp/distilbert-base-cased-finetuned-conll03-english \
--do_train \
--do_eval
```
After training, we update the labels to match the NER-specific labels from the
[conll2003](https://raw.githubusercontent.com/huggingface/datasets/1.3.0/datasets/conll2003/dataset_infos.json) dataset, as sketched below.
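A minimal sketch of that label update, assuming the conll2003 NER tag order published in the `datasets` library; the model directory matches the `--output_dir` from the training command above.
```python
from transformers import AutoConfig

# conll2003 NER tag order, as published in the datasets library (an assumption here).
labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

model_dir = "/tmp/distilbert-base-cased-finetuned-conll03-english"
config = AutoConfig.from_pretrained(model_dir)
config.id2label = {i: label for i, label in enumerate(labels)}
config.label2id = {label: i for i, label in enumerate(labels)}
config.save_pretrained(model_dir)  # rewrite config.json with the named labels
```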
| 3,099 | [
[
-0.021392822265625,
-0.03839111328125,
0.01268768310546875,
0.0218353271484375,
-0.0031108856201171875,
0.0032501220703125,
-0.0135345458984375,
-0.011199951171875,
0.031585693359375,
0.0233154296875,
-0.05908203125,
-0.053863525390625,
-0.045745849609375,
0.0228118896484375,
-0.00354766845703125,
0.0975341796875,
-0.010162353515625,
0.0282440185546875,
-0.00010788440704345703,
-0.029083251953125,
-0.0247039794921875,
-0.050506591796875,
-0.0806884765625,
-0.0379638671875,
0.029541015625,
0.0268096923828125,
0.02947998046875,
0.01416778564453125,
0.0496826171875,
0.0251617431640625,
-0.034393310546875,
-0.0121917724609375,
-0.0302734375,
0.0004291534423828125,
-0.00923919677734375,
-0.013092041015625,
-0.06817626953125,
-0.00835418701171875,
0.0435791015625,
0.052154541015625,
-0.0305023193359375,
0.0281829833984375,
-0.006793975830078125,
0.04656982421875,
-0.0240325927734375,
0.018402099609375,
-0.047698974609375,
0.0025539398193359375,
-0.0038814544677734375,
-0.0010251998901367188,
-0.029144287109375,
-0.0025653839111328125,
0.0224609375,
-0.050140380859375,
0.0423583984375,
0.01364898681640625,
0.08892822265625,
0.0291290283203125,
-0.028900146484375,
-0.0147552490234375,
-0.03912353515625,
0.049591064453125,
-0.06231689453125,
0.03265380859375,
0.022705078125,
0.0283660888671875,
-0.0167694091796875,
-0.061737060546875,
-0.0292205810546875,
-0.0054168701171875,
-0.01534271240234375,
0.014892578125,
-0.023468017578125,
0.0013360977172851562,
0.045989990234375,
0.042022705078125,
-0.036224365234375,
0.0144500732421875,
-0.04437255859375,
-0.0198822021484375,
0.04931640625,
0.01702880859375,
0.0103607177734375,
0.0060272216796875,
-0.025299072265625,
-0.027557373046875,
-0.040771484375,
-0.000995635986328125,
0.037750244140625,
0.037689208984375,
-0.0399169921875,
0.0435791015625,
-0.031036376953125,
0.0516357421875,
0.03466796875,
-0.005519866943359375,
0.06878662109375,
-0.00830078125,
-0.012176513671875,
0.03118896484375,
0.063720703125,
0.03759765625,
0.01454925537109375,
-0.0046234130859375,
-0.040252685546875,
-0.0075531005859375,
-0.0059051513671875,
-0.07720947265625,
-0.0137786865234375,
0.01070404052734375,
-0.01447296142578125,
-0.01262664794921875,
0.00406646728515625,
-0.0367431640625,
-0.007007598876953125,
-0.0279693603515625,
0.038055419921875,
-0.0259552001953125,
-0.0193023681640625,
0.029632568359375,
-0.00838470458984375,
0.008087158203125,
-0.0010204315185546875,
-0.07177734375,
0.0416259765625,
0.0291290283203125,
0.046630859375,
0.0092926025390625,
0.0004794597625732422,
-0.0279693603515625,
-0.0127716064453125,
-0.006053924560546875,
0.0279998779296875,
-0.00841522216796875,
-0.0128173828125,
-0.00646209716796875,
0.033660888671875,
-0.00661468505859375,
-0.040313720703125,
0.05072021484375,
-0.039337158203125,
0.018707275390625,
-0.0037708282470703125,
-0.03192138671875,
-0.012908935546875,
0.03515625,
-0.05950927734375,
0.08489990234375,
0.032928466796875,
-0.06109619140625,
0.041412353515625,
-0.0452880859375,
-0.04736328125,
0.00464630126953125,
0.006381988525390625,
-0.053802490234375,
0.0005526542663574219,
0.0176544189453125,
0.0303192138671875,
-0.0164947509765625,
0.0204315185546875,
-0.01253509521484375,
-0.0177154541015625,
-0.004978179931640625,
-0.01348114013671875,
0.0555419921875,
-0.01151275634765625,
-0.03204345703125,
-0.0128631591796875,
-0.08251953125,
-0.003200531005859375,
0.007419586181640625,
-0.014129638671875,
-0.025634765625,
-0.0128936767578125,
0.01525115966796875,
0.0139617919921875,
0.033935546875,
-0.0556640625,
0.0166168212890625,
-0.03216552734375,
0.0293731689453125,
0.044281005859375,
0.000698089599609375,
0.0343017578125,
-0.01439666748046875,
0.03558349609375,
0.01251983642578125,
0.01546478271484375,
0.027069091796875,
-0.01361846923828125,
-0.071533203125,
-0.0246429443359375,
0.04498291015625,
0.036773681640625,
-0.03363037109375,
0.03631591796875,
-0.023406982421875,
-0.0499267578125,
-0.038177490234375,
-0.01861572265625,
0.03460693359375,
0.056732177734375,
0.050994873046875,
-0.0089874267578125,
-0.047882080078125,
-0.08489990234375,
0.01340484619140625,
0.0004968643188476562,
-0.0028285980224609375,
0.0271148681640625,
0.054473876953125,
-0.0182342529296875,
0.046630859375,
-0.046295166015625,
-0.0225677490234375,
-0.009033203125,
0.00725555419921875,
0.0689697265625,
0.03607177734375,
0.038299560546875,
-0.050750732421875,
-0.05084228515625,
0.0015735626220703125,
-0.038238525390625,
-0.00788116455078125,
-0.0021953582763671875,
-0.007122039794921875,
0.005603790283203125,
0.03662109375,
-0.026397705078125,
0.0277252197265625,
0.03387451171875,
-0.03216552734375,
0.025115966796875,
-0.0156097412109375,
0.00930023193359375,
-0.1041259765625,
-0.00153350830078125,
0.0289764404296875,
-0.0139617919921875,
-0.02752685546875,
-0.02093505859375,
0.006885528564453125,
0.0276641845703125,
-0.0555419921875,
0.053924560546875,
-0.04742431640625,
0.0125579833984375,
0.004734039306640625,
-0.0218658447265625,
0.0090789794921875,
0.019744873046875,
0.00920867919921875,
0.0447998046875,
0.060333251953125,
-0.049591064453125,
0.0240325927734375,
0.0438232421875,
-0.022857666015625,
0.055328369140625,
-0.071044921875,
-0.003162384033203125,
-0.01403045654296875,
0.01361846923828125,
-0.027252197265625,
-0.01495361328125,
0.0202178955078125,
-0.0208587646484375,
0.0462646484375,
-0.0312347412109375,
-0.030731201171875,
-0.0323486328125,
0.0013360977172851562,
0.030517578125,
0.03118896484375,
-0.049468994140625,
0.035980224609375,
0.0258636474609375,
-0.000698089599609375,
-0.03875732421875,
-0.05731201171875,
-0.01580810546875,
-0.0288238525390625,
-0.0301055908203125,
0.0562744140625,
-0.0187225341796875,
-0.00852203369140625,
-0.0157012939453125,
-0.0261688232421875,
-0.02789306640625,
-0.00347900390625,
0.01548004150390625,
0.025390625,
-0.029327392578125,
-0.00405120849609375,
0.004123687744140625,
-0.0199737548828125,
0.002811431884765625,
-0.00012302398681640625,
0.036529541015625,
-0.0260162353515625,
0.00421905517578125,
-0.03173828125,
-0.00958251953125,
0.03466796875,
0.00627899169921875,
0.05487060546875,
0.040252685546875,
-0.046966552734375,
-0.01448822021484375,
-0.035858154296875,
-0.035064697265625,
-0.039825439453125,
0.032867431640625,
-0.041473388671875,
-0.051025390625,
0.038665771484375,
0.01332855224609375,
-0.004261016845703125,
0.05584716796875,
0.029510498046875,
-0.00667572021484375,
0.0751953125,
0.050506591796875,
-0.01537322998046875,
0.0198974609375,
-0.040191650390625,
0.006618499755859375,
-0.083251953125,
-0.02532958984375,
-0.032470703125,
-0.03839111328125,
-0.0570068359375,
0.006122589111328125,
0.02032470703125,
0.0087738037109375,
-0.04315185546875,
0.060028076171875,
-0.058807373046875,
0.04437255859375,
0.0361328125,
0.006839752197265625,
0.01226043701171875,
0.013671875,
-0.007152557373046875,
-0.018218994140625,
-0.046173095703125,
-0.03350830078125,
0.06585693359375,
0.0186004638671875,
0.054595947265625,
0.002349853515625,
0.04595947265625,
-0.00841522216796875,
0.0308074951171875,
-0.0399169921875,
0.022735595703125,
-0.02044677734375,
-0.058746337890625,
-0.00536346435546875,
-0.02618408203125,
-0.0750732421875,
-0.005336761474609375,
-0.0169219970703125,
-0.053314208984375,
0.00666046142578125,
0.0016870498657226562,
-0.020172119140625,
0.0232086181640625,
-0.037445068359375,
0.07122802734375,
-0.028839111328125,
-0.0060882568359375,
0.0196533203125,
-0.042877197265625,
-0.00811767578125,
-0.0031261444091796875,
-0.0027637481689453125,
-0.01788330078125,
0.00676727294921875,
0.060028076171875,
-0.035247802734375,
0.0516357421875,
-0.016876220703125,
0.010223388671875,
0.0124969482421875,
-0.01148223876953125,
0.028076171875,
0.0241241455078125,
-0.033599853515625,
0.043792724609375,
-0.005596160888671875,
-0.01025390625,
-0.0222930908203125,
0.05413818359375,
-0.053436279296875,
-0.0124969482421875,
-0.055328369140625,
-0.0299072265625,
-0.0077972412109375,
0.02227783203125,
0.046539306640625,
0.03887939453125,
-0.0245819091796875,
0.0280609130859375,
0.04522705078125,
-0.01094818115234375,
0.0287322998046875,
0.05987548828125,
0.01116180419921875,
-0.039581298828125,
0.050201416015625,
0.00403594970703125,
-0.0016870498657226562,
0.0318603515625,
0.0169677734375,
-0.03302001953125,
-0.03753662109375,
-0.041595458984375,
0.0148773193359375,
-0.034393310546875,
-0.0260467529296875,
-0.048187255859375,
-0.03277587890625,
-0.0158538818359375,
-0.00576019287109375,
-0.03546142578125,
-0.0236968994140625,
-0.040618896484375,
-0.0244903564453125,
0.059234619140625,
0.042266845703125,
-0.005268096923828125,
0.0361328125,
-0.03887939453125,
0.018035888671875,
-0.0181427001953125,
0.032470703125,
-0.029754638671875,
-0.06207275390625,
-0.00960540771484375,
0.0241241455078125,
-0.01534271240234375,
-0.047882080078125,
0.03851318359375,
0.0210113525390625,
0.0364990234375,
-0.006305694580078125,
0.000865936279296875,
0.049285888671875,
-0.050018310546875,
0.03680419921875,
0.0306243896484375,
-0.0416259765625,
0.045928955078125,
-0.022674560546875,
0.01531982421875,
0.0760498046875,
0.04998779296875,
-0.0311431884765625,
-0.01168060302734375,
-0.058013916015625,
-0.10491943359375,
0.052032470703125,
0.010223388671875,
0.0223846435546875,
-0.01192474365234375,
0.0266571044921875,
0.016082763671875,
0.01200103759765625,
-0.07049560546875,
-0.04705810546875,
0.006023406982421875,
-0.0300140380859375,
-0.0035915374755859375,
0.0031719207763671875,
-0.00489044189453125,
-0.05194091796875,
0.08306884765625,
-0.005584716796875,
-0.0232696533203125,
0.0238189697265625,
-0.0022430419921875,
0.01139068603515625,
0.003604888916015625,
0.0183868408203125,
0.0258636474609375,
-0.035888671875,
-0.026885986328125,
0.0295562744140625,
-0.04119873046875,
0.0137481689453125,
0.020538330078125,
-0.03826904296875,
0.023956298828125,
-0.007358551025390625,
0.08477783203125,
0.01554107666015625,
-0.0120849609375,
0.04205322265625,
-0.02532958984375,
-0.041229248046875,
-0.0516357421875,
-0.01393890380859375,
0.0026493072509765625,
0.018157958984375,
0.0196685791015625,
0.0084228515625,
0.009613037109375,
-0.00226593017578125,
0.0169219970703125,
0.0154266357421875,
-0.0513916015625,
-0.0298004150390625,
0.04656982421875,
0.0013895034790039062,
-0.0218048095703125,
0.06280517578125,
-0.0285491943359375,
-0.0249481201171875,
0.062164306640625,
0.048065185546875,
0.0655517578125,
0.01201629638671875,
0.00878143310546875,
0.0718994140625,
0.0277099609375,
-0.015167236328125,
0.035614013671875,
0.0264739990234375,
-0.04486083984375,
-0.018890380859375,
-0.07086181640625,
-0.0203704833984375,
0.03546142578125,
-0.06341552734375,
0.05145263671875,
-0.035430908203125,
-0.03173828125,
0.0044708251953125,
-0.007198333740234375,
-0.0758056640625,
0.033538818359375,
0.0026531219482421875,
0.06280517578125,
-0.0631103515625,
0.033294677734375,
0.0687255859375,
-0.0406494140625,
-0.07342529296875,
-0.0224609375,
-0.0125579833984375,
-0.0231475830078125,
0.05096435546875,
0.0198822021484375,
0.013763427734375,
0.010101318359375,
-0.0266571044921875,
-0.0860595703125,
0.089599609375,
0.023773193359375,
-0.0711669921875,
0.01947021484375,
0.005092620849609375,
0.04315185546875,
-0.0198822021484375,
0.043365478515625,
0.030517578125,
0.035125732421875,
0.005031585693359375,
-0.0689697265625,
-0.000446319580078125,
-0.0333251953125,
-0.0198974609375,
0.0198822021484375,
-0.0533447265625,
0.060882568359375,
-0.0027713775634765625,
-0.0023403167724609375,
0.0022258758544921875,
0.04705810546875,
0.02813720703125,
0.0264739990234375,
0.038299560546875,
0.0728759765625,
0.052337646484375,
-0.025543212890625,
0.053802490234375,
-0.0379638671875,
0.044189453125,
0.097412109375,
-0.0267181396484375,
0.04583740234375,
0.038330078125,
-0.01093292236328125,
0.046722412109375,
0.06243896484375,
-0.0345458984375,
0.049346923828125,
0.032470703125,
-0.0190277099609375,
0.006618499755859375,
-0.00522613525390625,
-0.046173095703125,
0.0252227783203125,
0.0123138427734375,
-0.0152587890625,
-0.01245880126953125,
-0.018524169921875,
0.00829315185546875,
-0.0101470947265625,
-0.0177001953125,
0.045074462890625,
0.0035305023193359375,
-0.041534423828125,
0.052490234375,
-0.0032901763916015625,
0.0811767578125,
-0.0360107421875,
0.001529693603515625,
-0.018280029296875,
0.0228271484375,
-0.0228424072265625,
-0.043914794921875,
0.01505279541015625,
0.0028705596923828125,
-0.03558349609375,
-0.0229949951171875,
0.0322265625,
-0.026702880859375,
-0.06890869140625,
0.025177001953125,
0.01641845703125,
0.01123046875,
0.0107574462890625,
-0.05682373046875,
-0.0198822021484375,
-0.01273345947265625,
-0.0347900390625,
0.00995635986328125,
0.0290069580078125,
-0.005313873291015625,
0.03131103515625,
0.051666259765625,
0.01038360595703125,
-0.005748748779296875,
0.03448486328125,
0.058013916015625,
-0.03228759765625,
-0.0211944580078125,
-0.05889892578125,
0.054229736328125,
-0.023468017578125,
-0.0623779296875,
0.03717041015625,
0.0731201171875,
0.08636474609375,
-0.01190185546875,
0.040771484375,
-0.020721435546875,
0.0203094482421875,
-0.026580810546875,
0.05950927734375,
-0.05487060546875,
-0.004940032958984375,
-0.017333984375,
-0.07684326171875,
-0.01348114013671875,
0.05328369140625,
-0.007251739501953125,
0.00490570068359375,
0.05352783203125,
0.0748291015625,
-0.03118896484375,
-0.017822265625,
-0.004749298095703125,
0.02984619140625,
-0.0035381317138671875,
0.0345458984375,
0.036529541015625,
-0.0679931640625,
0.01959228515625,
-0.04681396484375,
-0.0203399658203125,
0.0008225440979003906,
-0.07696533203125,
-0.0599365234375,
-0.0322265625,
-0.059722900390625,
-0.0447998046875,
-0.004192352294921875,
0.05450439453125,
0.05487060546875,
-0.071044921875,
-0.011962890625,
-0.0157318115234375,
-0.0115509033203125,
-0.0050811767578125,
-0.0140380859375,
0.01715087890625,
0.00597381591796875,
-0.0670166015625,
0.032806396484375,
0.0003914833068847656,
0.0184173583984375,
0.01326751708984375,
-0.016845703125,
-0.01244354248046875,
-0.0248565673828125,
0.0200042724609375,
0.022552490234375,
-0.0283660888671875,
-0.01114654541015625,
0.01297760009765625,
-0.0196533203125,
0.01262664794921875,
0.0154876708984375,
-0.0341796875,
0.01025390625,
0.030548095703125,
0.0171966552734375,
0.054229736328125,
-0.03363037109375,
0.034271240234375,
-0.07415771484375,
0.0244140625,
0.0277252197265625,
0.0511474609375,
0.049407958984375,
-0.0238189697265625,
0.063720703125,
0.0092010498046875,
-0.030853271484375,
-0.058807373046875,
-0.01192474365234375,
-0.07806396484375,
0.01178741455078125,
0.09979248046875,
0.0129852294921875,
-0.0302581787109375,
0.026702880859375,
-0.01715087890625,
0.033477783203125,
-0.027435302734375,
0.03778076171875,
0.05157470703125,
0.023040771484375,
0.0219268798828125,
-0.046051025390625,
0.020965576171875,
0.0308380126953125,
-0.0316162109375,
-0.0300445556640625,
0.024627685546875,
0.042327880859375,
0.0030345916748046875,
0.044036865234375,
-0.0153961181640625,
0.0007753372192382812,
-0.01297760009765625,
0.033111572265625,
0.003414154052734375,
-0.014984130859375,
-0.0193634033203125,
-0.032562255859375,
-0.0164337158203125,
-0.01537322998046875
]
] |
KoboldAI/OPT-6.7B-Nerybus-Mix | 2023-02-13T14:56:10.000Z | [
"transformers",
"pytorch",
"opt",
"text-generation",
"en",
"license:other",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | KoboldAI | null | null | KoboldAI/OPT-6.7B-Nerybus-Mix | 19 | 29,788 | transformers | 2023-02-13T14:21:14 | ---
license: other
language:
- en
inference: false
---
# OPT-6.7B-Nerybus-Mix
This is an experimental model containing a ***parameter-wise 50/50 blend (weighted average)*** of the weights of *NerysV2-6.7B* and *ErebusV1-6.7B*.
Preliminary testing produces fairly coherent outputs; however, the result seems less impressive than the 2.7B variant of Nerybus, as the two 6.7B source models appear more similar to each other than their 2.7B counterparts do.
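For illustration, a parameter-wise 50/50 blend of two same-architecture checkpoints can be produced roughly as follows. This is a minimal sketch, not necessarily the exact tooling used for this release; note that holding both 6.7B models in memory at once requires substantial RAM.
```python
import torch
from transformers import AutoModelForCausalLM

# Both checkpoints share the OPT-6.7B architecture, so their state-dict keys line up one-to-one.
a = AutoModelForCausalLM.from_pretrained("KoboldAI/OPT-6B-nerys-v2", torch_dtype=torch.float16)
b = AutoModelForCausalLM.from_pretrained("KoboldAI/OPT-6.7B-Erebus", torch_dtype=torch.float16)

merged = a.state_dict()
for name, tensor in b.state_dict().items():
    # Average in float32 to limit FP16 rounding error, then cast back.
    merged[name] = ((merged[name].float() + tensor.float()) / 2).to(tensor.dtype)

a.load_state_dict(merged)
a.save_pretrained("OPT-6.7B-Nerybus-Mix")
```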
# License
The two models used for this blend, *NerysV2-6.7B* and *ErebusV1-6.7B*, are made by **Mr. Seeker**.
- https://huggingface.co/KoboldAI/OPT-6.7B-Erebus
- https://huggingface.co/KoboldAI/OPT-6B-nerys-v2
The base OPT-6.7B model is licensed under the OPT-175B license, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
# Evaluation Results
No formal evaluation is available for this model at this time. The blend was created in FP16 due to memory constraints.
It is recommended to use this model with the KoboldAI software. All feedback and comments can be directed to Concedo on the KoboldAI Discord.
| 1,057 | [
[
-0.038360595703125,
-0.054107666015625,
0.01270294189453125,
0.04583740234375,
-0.03167724609375,
-0.021820068359375,
-0.00901031494140625,
-0.06683349609375,
0.061248779296875,
0.037109375,
-0.05615234375,
-0.01439666748046875,
-0.03826904296875,
-0.01093292236328125,
-0.0291595458984375,
0.05596923828125,
-0.0002243518829345703,
0.0211944580078125,
0.0076446533203125,
-0.01059722900390625,
-0.05169677734375,
-0.0077056884765625,
-0.06414794921875,
-0.047607421875,
0.0498046875,
0.0261383056640625,
0.060699462890625,
-0.0032901763916015625,
0.04052734375,
0.0225067138671875,
-0.0260162353515625,
-0.0121002197265625,
-0.05194091796875,
-0.0153350830078125,
-0.002391815185546875,
-0.046112060546875,
-0.06964111328125,
0.01233673095703125,
0.037078857421875,
0.0202178955078125,
-0.035797119140625,
0.0167388916015625,
-0.0172119140625,
0.020782470703125,
-0.03411865234375,
-0.0254058837890625,
-0.021636962890625,
0.00858306884765625,
-0.0016345977783203125,
0.0027790069580078125,
-0.04827880859375,
-0.01035308837890625,
0.0164337158203125,
-0.053192138671875,
0.022186279296875,
0.03155517578125,
0.08612060546875,
0.0030975341796875,
-0.0029087066650390625,
-0.008026123046875,
-0.01971435546875,
0.07989501953125,
-0.0816650390625,
0.019805908203125,
0.01361083984375,
0.037109375,
-0.01416778564453125,
-0.04266357421875,
-0.014068603515625,
-0.0013360977172851562,
0.01282501220703125,
0.036590576171875,
-0.0284271240234375,
-0.00676727294921875,
-0.00049591064453125,
0.0297698974609375,
-0.0439453125,
0.012420654296875,
-0.057952880859375,
-0.01059722900390625,
0.046600341796875,
0.0247650146484375,
0.01311492919921875,
-0.01849365234375,
-0.028717041015625,
-0.035491943359375,
-0.0271453857421875,
-0.0176239013671875,
0.049346923828125,
0.006984710693359375,
-0.032928466796875,
0.06048583984375,
-0.035125732421875,
0.06768798828125,
0.0247344970703125,
0.005054473876953125,
0.031005859375,
-0.036346435546875,
-0.050384521484375,
0.01080322265625,
0.049346923828125,
0.056488037109375,
0.002223968505859375,
0.005764007568359375,
-0.00966644287109375,
0.0091705322265625,
0.01318359375,
-0.0836181640625,
-0.041259765625,
0.0207061767578125,
-0.057373046875,
-0.0469970703125,
0.0173187255859375,
-0.0570068359375,
-0.0037689208984375,
-0.008636474609375,
0.041534423828125,
-0.034637451171875,
-0.039031982421875,
0.019073486328125,
-0.043487548828125,
0.055694580078125,
0.02471923828125,
-0.050750732421875,
0.028656005859375,
0.0205230712890625,
0.052825927734375,
-0.017486572265625,
-0.001743316650390625,
0.0025196075439453125,
-0.0202484130859375,
-0.0249481201171875,
0.050262451171875,
-0.037322998046875,
-0.04364013671875,
-0.030609130859375,
0.0102996826171875,
0.0128326416015625,
-0.0587158203125,
0.0372314453125,
-0.018463134765625,
0.01397705078125,
-0.01506805419921875,
-0.033294677734375,
-0.038726806640625,
0.0026988983154296875,
-0.062469482421875,
0.0828857421875,
0.029571533203125,
-0.0413818359375,
0.05657958984375,
-0.024261474609375,
-0.0110015869140625,
0.0140380859375,
0.001163482666015625,
-0.03131103515625,
0.03643798828125,
-0.0078277587890625,
0.0032176971435546875,
-0.0188446044921875,
-0.0028057098388671875,
-0.07989501953125,
-0.04510498046875,
0.0221405029296875,
-0.0281219482421875,
0.06884765625,
0.017181396484375,
-0.01534271240234375,
-0.005573272705078125,
-0.051116943359375,
0.02142333984375,
0.01103973388671875,
-0.0009946823120117188,
-0.0135955810546875,
-0.017730712890625,
0.02716064453125,
0.01007843017578125,
0.0257568359375,
-0.0301513671875,
-0.01334381103515625,
-0.029876708984375,
0.01123046875,
0.0523681640625,
-0.00833892822265625,
0.038330078125,
-0.052215576171875,
0.0401611328125,
0.0023441314697265625,
0.028900146484375,
0.03045654296875,
-0.05535888671875,
-0.08087158203125,
-0.023895263671875,
0.04803466796875,
0.023162841796875,
-0.02081298828125,
0.02593994140625,
0.0131378173828125,
-0.0704345703125,
-0.030792236328125,
-0.022705078125,
0.00457000732421875,
0.0010232925415039062,
0.01314544677734375,
-0.0236358642578125,
-0.044158935546875,
-0.08087158203125,
-0.0195465087890625,
-0.0010051727294921875,
0.003208160400390625,
0.0308837890625,
0.043121337890625,
-0.00823974609375,
0.05322265625,
-0.04827880859375,
0.004802703857421875,
-0.0226287841796875,
0.0198516845703125,
0.044891357421875,
0.035552978515625,
0.074462890625,
-0.05615234375,
-0.03997802734375,
-0.0036563873291015625,
-0.0428466796875,
-0.0241241455078125,
0.032470703125,
-0.025360107421875,
0.0024776458740234375,
0.0092620849609375,
-0.02850341796875,
0.043914794921875,
0.059814453125,
-0.042633056640625,
0.041748046875,
-0.01934814453125,
0.025177001953125,
-0.08349609375,
-0.0007243156433105469,
0.0038051605224609375,
-0.027313232421875,
-0.03277587890625,
0.005153656005859375,
0.007167816162109375,
0.0240325927734375,
-0.0650634765625,
0.050384521484375,
-0.03662109375,
-0.007007598876953125,
-0.020538330078125,
0.007465362548828125,
-0.005649566650390625,
0.024566650390625,
-0.0037841796875,
0.01206207275390625,
0.04345703125,
-0.03558349609375,
0.037445068359375,
0.004962921142578125,
-0.00565338134765625,
0.0703125,
-0.081298828125,
-0.025177001953125,
-0.004787445068359375,
0.00957489013671875,
-0.053192138671875,
-0.034149169921875,
0.042449951171875,
-0.047637939453125,
0.01071929931640625,
-0.0018262863159179688,
-0.0263214111328125,
-0.01012420654296875,
-0.0305023193359375,
0.035736083984375,
0.0743408203125,
-0.0166778564453125,
0.07708740234375,
0.006969451904296875,
-0.00881195068359375,
-0.0138702392578125,
-0.08050537109375,
-0.03411865234375,
-0.0362548828125,
-0.057342529296875,
0.03076171875,
-0.024139404296875,
-0.0170745849609375,
0.01100921630859375,
-0.023529052734375,
-0.0242767333984375,
-0.0311737060546875,
0.0311737060546875,
0.0498046875,
0.00038909912109375,
-0.0335693359375,
0.02099609375,
-0.0006361007690429688,
0.00894927978515625,
-0.0021457672119140625,
0.0242767333984375,
0.003971099853515625,
-0.0174407958984375,
-0.0213775634765625,
0.00839996337890625,
0.033050537109375,
0.0193023681640625,
0.04541015625,
0.040924072265625,
-0.033355712890625,
-0.015625,
-0.035736083984375,
-0.01479339599609375,
-0.037628173828125,
0.025482177734375,
-0.01605224609375,
-0.0228118896484375,
0.054412841796875,
0.03143310546875,
0.01806640625,
0.0645751953125,
0.0380859375,
-0.00699615478515625,
0.07708740234375,
0.0238189697265625,
0.0233001708984375,
0.040191650390625,
-0.023193359375,
0.004756927490234375,
-0.04595947265625,
-0.0362548828125,
-0.030792236328125,
-0.054412841796875,
-0.046478271484375,
-0.028564453125,
0.00783538818359375,
0.0011157989501953125,
0.0113067626953125,
0.0457763671875,
-0.033477783203125,
0.0231475830078125,
0.048187255859375,
0.042266845703125,
0.003017425537109375,
0.0186614990234375,
0.0039825439453125,
0.0024318695068359375,
-0.07196044921875,
0.0022182464599609375,
0.07354736328125,
0.0277099609375,
0.0792236328125,
0.00469970703125,
0.046478271484375,
0.009307861328125,
0.0302276611328125,
-0.034759521484375,
0.044647216796875,
-0.005962371826171875,
-0.09149169921875,
0.00449371337890625,
-0.04718017578125,
-0.041107177734375,
0.042388916015625,
-0.03125,
-0.033599853515625,
0.020721435546875,
0.018402099609375,
-0.02197265625,
0.0021343231201171875,
-0.06884765625,
0.06866455078125,
-0.017974853515625,
-0.0210723876953125,
-0.01220703125,
-0.034942626953125,
0.01165008544921875,
0.01113128662109375,
-0.0012102127075195312,
-0.00791168212890625,
0.0208587646484375,
0.0703125,
-0.0200653076171875,
0.042388916015625,
-0.00431060791015625,
0.0212860107421875,
0.0482177734375,
-0.005802154541015625,
0.035797119140625,
0.005764007568359375,
0.00742340087890625,
0.0178375244140625,
0.000029861927032470703,
-0.05548095703125,
-0.019683837890625,
0.056854248046875,
-0.046661376953125,
0.019134521484375,
-0.041046142578125,
-0.037994384765625,
0.03265380859375,
0.0169219970703125,
0.047271728515625,
0.059234619140625,
-0.0225677490234375,
0.0323486328125,
0.05999755859375,
-0.007678985595703125,
0.0308074951171875,
0.03973388671875,
-0.0289306640625,
-0.0499267578125,
0.063720703125,
0.004116058349609375,
0.040191650390625,
0.00627899169921875,
0.0092926025390625,
-0.02764892578125,
-0.026397705078125,
-0.0198822021484375,
0.02099609375,
-0.047637939453125,
0.0020656585693359375,
-0.02740478515625,
-0.0078125,
-0.024658203125,
-0.0149993896484375,
-0.035614013671875,
-0.025360107421875,
-0.0323486328125,
0.01010894775390625,
0.01166534423828125,
0.055877685546875,
0.0081939697265625,
0.034576416015625,
-0.031829833984375,
0.005725860595703125,
0.00994110107421875,
0.03253173828125,
-0.0198516845703125,
-0.06317138671875,
0.0014133453369140625,
0.036346435546875,
-0.011260986328125,
-0.0926513671875,
0.033294677734375,
-0.00867462158203125,
0.038726806640625,
0.0291748046875,
-0.01806640625,
0.057220458984375,
0.00394439697265625,
0.047332763671875,
0.033935546875,
-0.0347900390625,
0.00890350341796875,
-0.024810791015625,
0.03472900390625,
0.052642822265625,
0.0035247802734375,
0.01065826416015625,
-0.04364013671875,
-0.069580078125,
-0.06793212890625,
0.06195068359375,
0.0546875,
-0.0252227783203125,
0.0227508544921875,
0.0023212432861328125,
0.009765625,
0.0278472900390625,
-0.07684326171875,
-0.02740478515625,
-0.01247406005859375,
-0.032562255859375,
-0.0086212158203125,
-0.0180206298828125,
0.00894927978515625,
-0.011444091796875,
0.06591796875,
0.0220947265625,
0.017852783203125,
0.0124969482421875,
-0.00885772705078125,
-0.0099029541015625,
-0.0137481689453125,
0.04852294921875,
0.05572509765625,
-0.057891845703125,
-0.00862884521484375,
0.031158447265625,
-0.042144775390625,
-0.019989013671875,
0.0129547119140625,
-0.0245819091796875,
0.01348114013671875,
-0.00258636474609375,
0.056488037109375,
0.033538818359375,
-0.0391845703125,
0.01605224609375,
-0.01088714599609375,
-0.045135498046875,
-0.0195770263671875,
0.0151824951171875,
0.005706787109375,
0.035797119140625,
-0.0061187744140625,
0.01495361328125,
0.0230560302734375,
-0.02374267578125,
-0.003665924072265625,
0.0224151611328125,
-0.0340576171875,
-0.01128387451171875,
0.04437255859375,
0.0139007568359375,
-0.01123809814453125,
0.041778564453125,
-0.042083740234375,
0.0037078857421875,
0.064453125,
0.043121337890625,
0.06292724609375,
-0.0018825531005859375,
0.00572967529296875,
0.0301361083984375,
0.03924560546875,
-0.018341064453125,
0.038299560546875,
0.033294677734375,
-0.016876220703125,
-0.0169830322265625,
-0.0543212890625,
-0.0275115966796875,
-0.011962890625,
-0.04718017578125,
0.05364990234375,
-0.004405975341796875,
-0.03228759765625,
-0.005023956298828125,
-0.00913238525390625,
-0.0343017578125,
0.03472900390625,
-0.00235748291015625,
0.06451416015625,
-0.08856201171875,
0.0279693603515625,
0.06658935546875,
-0.013671875,
-0.059906005859375,
-0.04730224609375,
-0.0303955078125,
-0.020538330078125,
0.0008597373962402344,
0.0210418701171875,
0.01059722900390625,
-0.0233612060546875,
-0.050567626953125,
-0.0579833984375,
0.08612060546875,
0.0155181884765625,
-0.027191162109375,
-0.00478363037109375,
-0.0234222412109375,
0.01702880859375,
-0.0263214111328125,
0.05322265625,
0.026214599609375,
0.04742431640625,
0.03314208984375,
-0.058624267578125,
-0.0167236328125,
-0.025604248046875,
-0.0028095245361328125,
0.0248260498046875,
-0.055908203125,
0.0814208984375,
0.009765625,
0.00742340087890625,
0.013275146484375,
0.0621337890625,
0.040313720703125,
0.03472900390625,
0.03900146484375,
0.073974609375,
0.013946533203125,
-0.017913818359375,
0.07183837890625,
-0.01183319091796875,
0.052093505859375,
0.0860595703125,
-0.0030841827392578125,
0.041168212890625,
0.046844482421875,
-0.016815185546875,
0.028594970703125,
0.0732421875,
-0.01849365234375,
0.038665771484375,
0.0019779205322265625,
0.015777587890625,
-0.021148681640625,
0.019805908203125,
-0.040985107421875,
0.00978851318359375,
-0.00012862682342529297,
-0.0355224609375,
-0.01032257080078125,
-0.049102783203125,
0.005527496337890625,
-0.040252685546875,
-0.0215301513671875,
0.0138702392578125,
0.00887298583984375,
-0.06036376953125,
0.03753662109375,
0.0155792236328125,
0.05340576171875,
-0.06732177734375,
-0.01213836669921875,
0.004917144775390625,
0.005191802978515625,
-0.015655517578125,
-0.043914794921875,
0.01181793212890625,
-0.035614013671875,
-0.004810333251953125,
0.01403045654296875,
0.0675048828125,
-0.036224365234375,
-0.054473876953125,
0.0221405029296875,
-0.0120849609375,
0.0275115966796875,
-0.0018892288208007812,
-0.05230712890625,
0.0180206298828125,
-0.00791168212890625,
-0.0222320556640625,
-0.002658843994140625,
-0.010162353515625,
0.0262908935546875,
0.03533935546875,
0.029937744140625,
0.0105438232421875,
0.039276123046875,
-0.01483917236328125,
0.038665771484375,
-0.0205535888671875,
-0.0574951171875,
-0.046783447265625,
0.0298919677734375,
-0.0340576171875,
-0.0394287109375,
0.06646728515625,
0.04766845703125,
0.05419921875,
-0.00959014892578125,
0.035430908203125,
-0.0438232421875,
-0.006000518798828125,
-0.030029296875,
0.03173828125,
-0.0640869140625,
-0.03228759765625,
-0.0222320556640625,
-0.10400390625,
0.0162353515625,
0.053680419921875,
0.00908660888671875,
0.043487548828125,
0.054290771484375,
0.05767822265625,
-0.02191162109375,
0.006710052490234375,
0.0146331787109375,
0.0277557373046875,
-0.01305389404296875,
0.046478271484375,
0.033111572265625,
-0.053466796875,
0.0248260498046875,
-0.0172119140625,
-0.032958984375,
-0.053741455078125,
-0.06268310546875,
-0.08056640625,
0.017852783203125,
-0.01467132568359375,
-0.04248046875,
0.0009641647338867188,
0.03863525390625,
0.06634521484375,
-0.037445068359375,
-0.0305938720703125,
-0.0194549560546875,
-0.005680084228515625,
-0.00954437255859375,
-0.014251708984375,
-0.0308685302734375,
0.029632568359375,
-0.060821533203125,
0.01318359375,
0.016845703125,
0.03436279296875,
-0.0215911865234375,
-0.0210723876953125,
-0.0037937164306640625,
0.0262298583984375,
0.0186614990234375,
-0.006195068359375,
-0.047088623046875,
0.0028018951416015625,
0.0164337158203125,
0.007259368896484375,
-0.01479339599609375,
0.0121612548828125,
-0.0280609130859375,
0.032379150390625,
0.08074951171875,
-0.0145416259765625,
0.04937744140625,
-0.002010345458984375,
0.03216552734375,
-0.0172882080078125,
0.0192108154296875,
-0.000007808208465576172,
0.032257080078125,
-0.00040793418884277344,
0.002620697021484375,
0.0254058837890625,
0.02032470703125,
-0.036956787109375,
-0.054779052734375,
0.01346588134765625,
-0.11260986328125,
-0.0138092041015625,
0.09747314453125,
0.023101806640625,
-0.037139892578125,
0.046142578125,
-0.0201568603515625,
0.00954437255859375,
-0.031463623046875,
0.0092926025390625,
0.050628662109375,
0.0157012939453125,
0.01148223876953125,
-0.0579833984375,
-0.0036907196044921875,
0.05316162109375,
-0.03460693359375,
-0.04107666015625,
0.03558349609375,
0.03863525390625,
0.0015716552734375,
0.029693603515625,
-0.0306243896484375,
0.0325927734375,
0.028564453125,
0.01119232177734375,
0.0021686553955078125,
-0.03936767578125,
-0.00434112548828125,
0.0055694580078125,
-0.00890350341796875,
-0.0076141357421875
]
] |
flax-community/clip-rsicd-v2 | 2022-04-24T21:03:53.000Z | [
"transformers",
"pytorch",
"jax",
"clip",
"zero-shot-image-classification",
"vision",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-image-classification | flax-community | null | null | flax-community/clip-rsicd-v2 | 12 | 29,726 | transformers | 2022-03-02T23:29:05 | ---
tags:
- vision
---
# Model Card: clip-rsicd
## Model Details
This model is a fine-tuned [CLIP by OpenAI](https://huggingface.co/openai/clip-vit-base-patch32). It aims to improve zero-shot image classification, text-to-image retrieval, and image-to-image retrieval specifically on remote sensing images.
### Model Date
July 2021
### Model Type
The base model uses a ViT-B/32 Transformer as its image encoder and a masked self-attention Transformer as its text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
### Model Version
We release several checkpoints for the `clip-rsicd` model. Refer to [our github repo](https://github.com/arampacha/CLIP-rsicd#evaluation-results) for zero-shot classification performance metrics for each of them.
### Training
To reproduce the fine-tuning procedure, one can use the released [script](https://github.com/arampacha/CLIP-rsicd/blob/master/run_clip_flax_tv.py).
The model was trained with a batch size of 1024 and the Adafactor optimizer, using a linear warmup and decay schedule with a peak learning rate of 1e-4, on a single TPU v3-8; a sketch of that schedule follows below.
Full log of the training run can be found on [WandB](https://wandb.ai/wandb/hf-flax-clip-rsicd/runs/2dj1exsw).
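As an illustration only, the described optimizer and schedule could be set up with `optax` along these lines; the step counts below are placeholders, and the released script above remains authoritative.
```python
import optax

# Placeholder step counts, not the values used for the actual run.
warmup_steps = 2_000
total_steps = 20_000

# Linear warmup to the 1e-4 peak, then linear decay back to zero.
schedule = optax.join_schedules(
    [
        optax.linear_schedule(init_value=0.0, end_value=1e-4, transition_steps=warmup_steps),
        optax.linear_schedule(init_value=1e-4, end_value=0.0, transition_steps=total_steps - warmup_steps),
    ],
    boundaries=[warmup_steps],
)
optimizer = optax.adafactor(learning_rate=schedule)
```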
### Demo
Check out the model's text-to-image and image-to-image capabilities using [this demo](https://huggingface.co/spaces/sujitpal/clip-rsicd-demo).
### Documents
- [Fine-tuning CLIP on RSICD with HuggingFace and flax/jax on colab using TPU](https://colab.research.google.com/github/arampacha/CLIP-rsicd/blob/master/nbs/Fine_tuning_CLIP_with_HF_on_TPU.ipynb)
### Use with Transformers
```python
from PIL import Image
import requests
from transformers import CLIPProcessor, CLIPModel
model = CLIPModel.from_pretrained("flax-community/clip-rsicd-v2")
processor = CLIPProcessor.from_pretrained("flax-community/clip-rsicd-v2")
url = "https://raw.githubusercontent.com/arampacha/CLIP-rsicd/master/data/stadium_1.jpg"
image = Image.open(requests.get(url, stream=True).raw)
labels = ["residential area", "playground", "stadium", "forest", "airport"]
inputs = processor(text=[f"a photo of a {l}" for l in labels], images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
logits_per_image = outputs.logits_per_image # this is the image-text similarity score
probs = logits_per_image.softmax(dim=1) # we can take the softmax to get the label probabilities
for l, p in zip(labels, probs[0]):
print(f"{l:<16} {p:.4f}")
```
[Try it on colab](https://colab.research.google.com/github/arampacha/CLIP-rsicd/blob/master/nbs/clip_rsicd_zero_shot.ipynb)
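The same checkpoint also exposes the embedding methods that drive text-to-image and image-to-image retrieval. The sketch below continues from the snippet above (reusing `model`, `processor`, and `image`) and relies on the standard `CLIPModel` feature-extraction methods; for retrieval over a collection, the normalized embeddings would be precomputed and ranked by cosine similarity.
```python
import torch

# Continuing from the snippet above, where `model`, `processor`, and `image` are defined.
with torch.no_grad():
    image_emb = model.get_image_features(**processor(images=image, return_tensors="pt"))
    text_emb = model.get_text_features(
        **processor(text=["a photo of a stadium"], return_tensors="pt", padding=True)
    )

# L2-normalize so the dot product equals cosine similarity, the usual retrieval score.
image_emb = image_emb / image_emb.norm(dim=-1, keepdim=True)
text_emb = text_emb / text_emb.norm(dim=-1, keepdim=True)
print((image_emb @ text_emb.T).item())
```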
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification.
In addition, we can imagine applications in defense and law enforcement, climate change and global warming, and even some consumer applications. A partial list of applications can be found [here](https://github.com/arampacha/CLIP-rsicd#applications). In general we think such models can be useful as digital assistants for humans engaged in searching through large collections of images.
We also hope it can be used for interdisciplinary studies of the potential impact of such models; the CLIP paper includes a discussion of potential downstream impacts as an example of this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
## Data
The model was trained on publicly available remote sensing image captions datasets. Namely [RSICD](https://github.com/201528014227051/RSICD_optimal), [UCM](https://mega.nz/folder/wCpSzSoS#RXzIlrv--TDt3ENZdKN8JA) and [Sydney](https://mega.nz/folder/pG4yTYYA#4c4buNFLibryZnlujsrwEQ). More information on the datasets used can be found on [our project page](https://github.com/arampacha/CLIP-rsicd#dataset).
## Performance and Limitations
### Performance
Zero-shot classification results at different values of k (see the [repo](https://github.com/arampacha/CLIP-rsicd#evaluation-results) for details):

| Model name                       | k=1   | k=3   | k=5   | k=10  |
| -------------------------------- | ----- | ----- | ----- | ----- |
| original CLIP | 0.572 | 0.745 | 0.837 | 0.939 |
| clip-rsicd-v2 (this model) | **0.883** | **0.968** | **0.982** | **0.998** |
### Limitations
The model is fine-tuned on remote sensing image data but may retain some of the biases and limitations of the original CLIP model. Refer to the [CLIP model card](https://huggingface.co/openai/clip-vit-base-patch32#limitations) for details on those.
| 4,727 | [
[
-0.036834716796875,
-0.03411865234375,
0.0204010009765625,
-0.0056304931640625,
-0.034912109375,
-0.01177215576171875,
-0.00994873046875,
-0.033538818359375,
0.028778076171875,
0.0279693603515625,
-0.032623291015625,
-0.04266357421875,
-0.05389404296875,
0.01091766357421875,
-0.0249786376953125,
0.08056640625,
-0.01236724853515625,
0.00368499755859375,
-0.009979248046875,
-0.0295867919921875,
-0.0293426513671875,
-0.0294036865234375,
-0.0188751220703125,
0.012542724609375,
0.0008406639099121094,
0.037506103515625,
0.053497314453125,
0.0506591796875,
0.054595947265625,
0.0158843994140625,
-0.02508544921875,
0.005367279052734375,
-0.029815673828125,
-0.02239990234375,
-0.0184478759765625,
-0.024688720703125,
-0.0258941650390625,
0.002864837646484375,
0.05169677734375,
0.006092071533203125,
0.00264739990234375,
0.0303955078125,
-0.006381988525390625,
0.0255889892578125,
-0.0543212890625,
-0.002384185791015625,
-0.03863525390625,
0.0081329345703125,
-0.02032470703125,
-0.02020263671875,
-0.0031452178955078125,
0.0094146728515625,
0.0142974853515625,
-0.0523681640625,
0.045166015625,
-0.004810333251953125,
0.11712646484375,
0.006404876708984375,
0.0016269683837890625,
-0.0024700164794921875,
-0.0477294921875,
0.06591796875,
-0.053802490234375,
0.01010894775390625,
0.022003173828125,
0.00884246826171875,
0.0114593505859375,
-0.034515380859375,
-0.03851318359375,
0.004467010498046875,
0.0264892578125,
-0.0007281303405761719,
-0.01690673828125,
-0.009429931640625,
0.03466796875,
0.03582763671875,
-0.0292205810546875,
-0.0163116455078125,
-0.057220458984375,
0.000980377197265625,
0.039215087890625,
0.015838623046875,
0.0277862548828125,
-0.0297393798828125,
-0.0523681640625,
-0.0399169921875,
-0.040435791015625,
0.0134735107421875,
0.0111236572265625,
0.0180206298828125,
-0.031280517578125,
0.035980224609375,
-0.00677490234375,
0.04425048828125,
0.0010194778442382812,
-0.03411865234375,
0.0209808349609375,
-0.028106689453125,
-0.01605224609375,
-0.02459716796875,
0.07232666015625,
0.042266845703125,
-0.0024814605712890625,
0.017333984375,
-0.0189208984375,
-0.00225067138671875,
-0.00777435302734375,
-0.08062744140625,
-0.0228729248046875,
-0.0097198486328125,
-0.0269927978515625,
-0.0301971435546875,
0.01934814453125,
-0.03887939453125,
0.0187835693359375,
-0.00943756103515625,
0.057159423828125,
-0.046722412109375,
-0.01666259765625,
0.01207733154296875,
-0.0283966064453125,
0.01403045654296875,
0.016754150390625,
-0.049041748046875,
0.03216552734375,
0.0284576416015625,
0.09710693359375,
-0.0192108154296875,
-0.0258941650390625,
-0.0029125213623046875,
0.004520416259765625,
-0.01023101806640625,
0.06732177734375,
-0.0267486572265625,
-0.034210205078125,
-0.035064697265625,
0.020538330078125,
-0.002902984619140625,
-0.033935546875,
0.04132080078125,
-0.0260162353515625,
0.002864837646484375,
-0.019439697265625,
-0.01861572265625,
-0.03167724609375,
0.0214385986328125,
-0.037994384765625,
0.06622314453125,
0.0158538818359375,
-0.06463623046875,
0.026153564453125,
-0.04339599609375,
-0.01334381103515625,
-0.003753662109375,
-0.00738525390625,
-0.06683349609375,
-0.022857666015625,
0.02947998046875,
0.024810791015625,
-0.02874755859375,
0.01125335693359375,
-0.0445556640625,
-0.0283355712890625,
0.025299072265625,
-0.02557373046875,
0.05450439453125,
0.01263427734375,
-0.0010519027709960938,
0.01364898681640625,
-0.035430908203125,
-0.011016845703125,
0.022308349609375,
0.004688262939453125,
-0.0159454345703125,
-0.02001953125,
0.0194854736328125,
0.007587432861328125,
-0.00426483154296875,
-0.0499267578125,
0.0174102783203125,
-0.01263427734375,
0.050628662109375,
0.0462646484375,
0.0254058837890625,
0.0287017822265625,
-0.031463623046875,
0.055145263671875,
0.008575439453125,
0.0297393798828125,
-0.023773193359375,
-0.036346435546875,
-0.0214385986328125,
-0.047332763671875,
0.04486083984375,
0.035675048828125,
-0.029388427734375,
0.012847900390625,
-0.01039886474609375,
-0.037567138671875,
-0.0214691162109375,
-0.0023365020751953125,
0.0220794677734375,
0.03515625,
0.02703857421875,
-0.06195068359375,
-0.040313720703125,
-0.06500244140625,
0.0241241455078125,
0.0214385986328125,
-0.0028133392333984375,
0.035858154296875,
0.05731201171875,
0.00710296630859375,
0.083251953125,
-0.06610107421875,
-0.047698974609375,
-0.0187835693359375,
0.00763702392578125,
0.0207672119140625,
0.0350341796875,
0.06768798828125,
-0.052459716796875,
-0.036285400390625,
-0.032073974609375,
-0.043487548828125,
0.01293182373046875,
-0.001621246337890625,
-0.007389068603515625,
0.0030994415283203125,
0.0081939697265625,
-0.05755615234375,
0.058380126953125,
0.01010894775390625,
-0.01580810546875,
0.05364990234375,
-0.01337432861328125,
0.021331787109375,
-0.0731201171875,
0.01548004150390625,
0.02178955078125,
-0.0164794921875,
-0.023162841796875,
0.007503509521484375,
0.0040283203125,
-0.03369140625,
-0.048248291015625,
0.0209503173828125,
-0.04437255859375,
-0.0285186767578125,
-0.0259552001953125,
0.004245758056640625,
0.024383544921875,
0.047088623046875,
0.0308685302734375,
0.08233642578125,
0.054595947265625,
-0.043365478515625,
-0.003185272216796875,
0.041595458984375,
-0.0303497314453125,
0.020904541015625,
-0.07025146484375,
-0.0106048583984375,
-0.0012969970703125,
0.01461029052734375,
-0.051513671875,
-0.01364898681640625,
0.038543701171875,
-0.0430908203125,
0.03192138671875,
-0.0277099609375,
-0.016937255859375,
-0.032012939453125,
-0.052825927734375,
0.0565185546875,
0.0352783203125,
-0.0263824462890625,
0.0192108154296875,
0.034332275390625,
0.00513458251953125,
-0.045684814453125,
-0.0482177734375,
-0.00260162353515625,
0.0007371902465820312,
-0.0309600830078125,
0.040802001953125,
-0.00013005733489990234,
0.0203704833984375,
0.0260772705078125,
0.004505157470703125,
-0.0089874267578125,
-0.0099334716796875,
0.0379638671875,
0.0467529296875,
-0.02825927734375,
-0.027984619140625,
-0.017059326171875,
0.0031375885009765625,
-0.0164947509765625,
0.0071868896484375,
0.0266876220703125,
-0.01143646240234375,
-0.01557159423828125,
-0.048187255859375,
0.01261138916015625,
0.043365478515625,
-0.032501220703125,
0.051788330078125,
0.036834716796875,
-0.019775390625,
-0.008331298828125,
-0.01425933837890625,
-0.0106048583984375,
-0.0298309326171875,
0.03759765625,
-0.0219879150390625,
-0.03936767578125,
0.045257568359375,
0.0248870849609375,
-0.016265869140625,
0.048065185546875,
0.0154876708984375,
0.0022296905517578125,
0.06109619140625,
0.053558349609375,
-0.0020751953125,
0.04998779296875,
-0.0726318359375,
0.0001609325408935547,
-0.0872802734375,
-0.004665374755859375,
-0.0189971923828125,
-0.003154754638671875,
-0.01568603515625,
-0.061492919921875,
0.041717529296875,
0.03411865234375,
-0.004795074462890625,
0.0382080078125,
-0.061798095703125,
0.041656494140625,
0.034149169921875,
0.0304718017578125,
-0.006641387939453125,
0.0024261474609375,
0.016632080078125,
-0.00992584228515625,
-0.057586669921875,
-0.027862548828125,
0.0821533203125,
0.03973388671875,
0.06329345703125,
-0.0140533447265625,
0.0257720947265625,
0.031494140625,
0.0053558349609375,
-0.051788330078125,
0.037628173828125,
-0.02099609375,
-0.036590576171875,
-0.01020050048828125,
0.002254486083984375,
-0.04254150390625,
0.0180511474609375,
-0.036834716796875,
-0.037567138671875,
0.0258331298828125,
0.01517486572265625,
-0.04327392578125,
0.051788330078125,
-0.03643798828125,
0.058502197265625,
-0.0105133056640625,
-0.034210205078125,
-0.005855560302734375,
-0.039154052734375,
0.044891357421875,
0.0252227783203125,
-0.0075225830078125,
-0.020477294921875,
0.01036834716796875,
0.07379150390625,
-0.054595947265625,
0.064208984375,
-0.007457733154296875,
0.031158447265625,
0.0430908203125,
-0.0126190185546875,
0.01538848876953125,
-0.004047393798828125,
0.0099639892578125,
0.03594970703125,
0.0321044921875,
-0.01446533203125,
-0.023101806640625,
0.03887939453125,
-0.07421875,
-0.01885986328125,
-0.052581787109375,
-0.024688720703125,
0.0313720703125,
0.0136871337890625,
0.03387451171875,
0.07000732421875,
0.0002219676971435547,
0.0003108978271484375,
0.05389404296875,
-0.031402587890625,
0.0153045654296875,
0.037811279296875,
-0.0099639892578125,
-0.054168701171875,
0.079345703125,
0.012847900390625,
0.018524169921875,
0.01419830322265625,
0.007213592529296875,
-0.0110015869140625,
-0.0394287109375,
-0.04180908203125,
0.0287933349609375,
-0.048614501953125,
-0.061614990234375,
-0.0302581787109375,
-0.0262908935546875,
-0.04449462890625,
-0.00022804737091064453,
-0.034820556640625,
-0.032501220703125,
-0.06756591796875,
0.0084381103515625,
0.04571533203125,
0.058380126953125,
-0.0196533203125,
0.0166473388671875,
-0.051239013671875,
0.03350830078125,
-0.009185791015625,
0.033294677734375,
0.005359649658203125,
-0.059478759765625,
-0.006381988525390625,
0.0003254413604736328,
-0.05731201171875,
-0.07489013671875,
0.046905517578125,
0.0284576416015625,
0.0193023681640625,
0.038360595703125,
0.01068878173828125,
0.054168701171875,
-0.020050048828125,
0.072998046875,
0.02020263671875,
-0.060546875,
0.04315185546875,
-0.01800537109375,
0.0245361328125,
0.0275726318359375,
0.033538818359375,
-0.031280517578125,
-0.0089569091796875,
-0.045440673828125,
-0.07122802734375,
0.08428955078125,
-0.005157470703125,
-0.0130157470703125,
-0.006603240966796875,
0.028411865234375,
0.004238128662109375,
-0.015045166015625,
-0.039031982421875,
-0.008148193359375,
-0.01922607421875,
-0.003948211669921875,
0.0187835693359375,
-0.033233642578125,
0.0031032562255859375,
-0.03485107421875,
0.06414794921875,
-0.00037789344787597656,
0.038787841796875,
0.05108642578125,
-0.01389312744140625,
-0.02117919921875,
-0.006084442138671875,
0.06756591796875,
0.039886474609375,
-0.040740966796875,
-0.016754150390625,
0.012237548828125,
-0.04107666015625,
-0.01355743408203125,
-0.017059326171875,
-0.04425048828125,
0.0030155181884765625,
0.019561767578125,
0.0889892578125,
0.0251312255859375,
-0.042938232421875,
0.0755615234375,
-0.005168914794921875,
-0.040435791015625,
-0.029632568359375,
-0.0015430450439453125,
-0.019805908203125,
0.0020618438720703125,
0.034149169921875,
0.040740966796875,
0.0151824951171875,
-0.0256805419921875,
0.029388427734375,
0.052978515625,
-0.01543426513671875,
-0.040435791015625,
0.0667724609375,
-0.00260162353515625,
-0.0289306640625,
0.02874755859375,
-0.014892578125,
-0.06390380859375,
0.056671142578125,
0.034912109375,
0.055206298828125,
0.012542724609375,
0.0213470458984375,
0.06219482421875,
-0.0008769035339355469,
-0.0166168212890625,
0.0002818107604980469,
-0.0282135009765625,
-0.045257568359375,
-0.0223388671875,
-0.040679931640625,
-0.048553466796875,
0.017913818359375,
-0.06854248046875,
0.04449462890625,
-0.046722412109375,
-0.0261383056640625,
0.0004341602325439453,
-0.004302978515625,
-0.0477294921875,
-0.0015630722045898438,
0.0150146484375,
0.09112548828125,
-0.07562255859375,
0.05126953125,
0.038116455078125,
-0.0531005859375,
-0.05419921875,
-0.0192108154296875,
-0.0301361083984375,
-0.0382080078125,
0.044952392578125,
0.01318359375,
-0.0165863037109375,
-0.033935546875,
-0.0755615234375,
-0.0804443359375,
0.0863037109375,
0.00980377197265625,
-0.0257415771484375,
0.016143798828125,
-0.0019130706787109375,
0.0257415771484375,
-0.0227813720703125,
0.053497314453125,
0.00934600830078125,
-0.00740814208984375,
0.02642822265625,
-0.0621337890625,
-0.006622314453125,
-0.00762176513671875,
0.01538848876953125,
0.013641357421875,
-0.0682373046875,
0.0657958984375,
-0.038482666015625,
-0.030487060546875,
0.024688720703125,
0.06292724609375,
0.0155029296875,
0.016357421875,
0.0291290283203125,
0.05267333984375,
0.0201416015625,
0.008087158203125,
0.056915283203125,
-0.02703857421875,
0.050811767578125,
0.0634765625,
-0.007045745849609375,
0.08197021484375,
0.02618408203125,
-0.02197265625,
0.02178955078125,
0.03961181640625,
-0.03936767578125,
0.062164306640625,
-0.01354217529296875,
-0.0005030632019042969,
-0.01126861572265625,
-0.0089111328125,
-0.02349853515625,
0.024688720703125,
0.01343536376953125,
-0.0308990478515625,
-0.0191497802734375,
0.0291900634765625,
-0.0208587646484375,
0.0016384124755859375,
-0.0109405517578125,
0.03070068359375,
-0.01226043701171875,
-0.029632568359375,
0.0478515625,
0.0235748291015625,
0.08197021484375,
-0.03692626953125,
-0.0233306884765625,
0.005802154541015625,
0.015777587890625,
-0.013214111328125,
-0.08380126953125,
0.030548095703125,
-0.0019216537475585938,
-0.026214599609375,
-0.00983428955078125,
0.05438232421875,
-0.0269012451171875,
-0.035980224609375,
-0.004467010498046875,
-0.03546142578125,
0.040008544921875,
0.007022857666015625,
-0.06585693359375,
0.041778564453125,
0.0103302001953125,
-0.0081939697265625,
0.0230560302734375,
0.0037364959716796875,
-0.0023651123046875,
0.043426513671875,
0.0380859375,
0.01375579833984375,
-0.01128387451171875,
-0.031280517578125,
0.07275390625,
-0.050018310546875,
-0.03375244140625,
-0.040557861328125,
0.0033130645751953125,
-0.011016845703125,
-0.04412841796875,
0.03961181640625,
0.0601806640625,
0.061920166015625,
-0.0258026123046875,
0.041351318359375,
-0.01678466796875,
0.0589599609375,
-0.0243072509765625,
0.03948974609375,
-0.0301361083984375,
0.005184173583984375,
-0.024200439453125,
-0.03704833984375,
-0.01885986328125,
0.05084228515625,
-0.028533935546875,
-0.02081298828125,
0.0193939208984375,
0.0765380859375,
-0.02728271484375,
-0.0007963180541992188,
0.00792694091796875,
-0.0033435821533203125,
0.01430511474609375,
0.0390625,
0.0362548828125,
-0.07391357421875,
0.043426513671875,
-0.04876708984375,
-0.01305389404296875,
-0.0114898681640625,
-0.058837890625,
-0.06292724609375,
-0.050384521484375,
-0.045166015625,
-0.013214111328125,
-0.0012645721435546875,
0.06500244140625,
0.07421875,
-0.061798095703125,
-0.004619598388671875,
0.0193939208984375,
-0.00006699562072753906,
-0.0029315948486328125,
-0.02093505859375,
0.03277587890625,
0.023193359375,
-0.03936767578125,
0.0014352798461914062,
0.01568603515625,
0.017913818359375,
-0.0059661865234375,
-0.0047760009765625,
-0.02301025390625,
-0.010650634765625,
0.02850341796875,
0.0650634765625,
-0.052032470703125,
-0.0189056396484375,
-0.004241943359375,
0.0034027099609375,
0.0271759033203125,
0.041473388671875,
-0.051727294921875,
0.004787445068359375,
0.036163330078125,
0.019805908203125,
0.062744140625,
0.02813720703125,
0.006595611572265625,
-0.045867919921875,
0.0204925537109375,
0.003955841064453125,
0.01099395751953125,
0.036590576171875,
-0.023101806640625,
0.0223541259765625,
0.036346435546875,
-0.045166015625,
-0.07501220703125,
-0.01459503173828125,
-0.0986328125,
-0.025970458984375,
0.08184814453125,
-0.0152740478515625,
-0.057037353515625,
0.0273895263671875,
-0.032806396484375,
0.01311492919921875,
-0.037841796875,
0.0248870849609375,
0.017059326171875,
0.0006318092346191406,
-0.041961669921875,
-0.022369384765625,
0.0280609130859375,
0.020294189453125,
-0.0103302001953125,
-0.01537322998046875,
0.023162841796875,
0.0738525390625,
0.01461029052734375,
0.054595947265625,
-0.01506805419921875,
0.0213775634765625,
0.0009341239929199219,
0.030426025390625,
-0.0259246826171875,
-0.020050048828125,
-0.02545166015625,
0.01117706298828125,
0.0032215118408203125,
-0.051513671875
]
] |
laion/CLIP-ViT-g-14-laion2B-s34B-b88K | 2023-04-18T17:34:59.000Z | [
"open_clip",
"zero-shot-image-classification",
"clip",
"license:mit",
"has_space",
"region:us"
] | zero-shot-image-classification | laion | null | null | laion/CLIP-ViT-g-14-laion2B-s34B-b88K | 4 | 29,689 | open_clip | 2023-03-06T00:43:13 | ---
tags:
- zero-shot-image-classification
- clip
library_tag: open_clip
license: mit
pipeline_tag: zero-shot-image-classification
---
# Model card for CLIP-ViT-g-14-laion2B-s34B-b88K | 183 | [
[
-0.0110931396484375,
0.001678466796875,
0.03790283203125,
0.025634765625,
-0.06671142578125,
0.0157623291015625,
0.051910400390625,
0.0251922607421875,
0.04345703125,
0.032806396484375,
-0.05712890625,
-0.0253753662109375,
-0.03216552734375,
-0.005741119384765625,
-0.020599365234375,
0.037109375,
0.0032138824462890625,
0.02978515625,
-0.005313873291015625,
-0.00908660888671875,
-0.041748046875,
-0.0258941650390625,
-0.0284423828125,
-0.01268768310546875,
-0.0080718994140625,
0.02777099609375,
0.0594482421875,
0.055328369140625,
0.038909912109375,
0.0264129638671875,
-0.031646728515625,
0.01009368896484375,
-0.030029296875,
-0.039215087890625,
-0.00701904296875,
-0.04278564453125,
-0.08074951171875,
-0.0038356781005859375,
0.049591064453125,
-0.024566650390625,
-0.017578125,
0.034027099609375,
0.006366729736328125,
-0.00839996337890625,
-0.027862548828125,
0.0206298828125,
-0.042205810546875,
0.007625579833984375,
-0.0211944580078125,
-0.0147247314453125,
-0.00817108154296875,
-0.01971435546875,
0.005462646484375,
-0.055389404296875,
0.0186920166015625,
-0.02960205078125,
0.10552978515625,
0.0113525390625,
-0.00971221923828125,
-0.00557708740234375,
-0.041259765625,
0.057342529296875,
-0.03857421875,
0.0297088623046875,
0.0161285400390625,
0.0106658935546875,
-0.01287841796875,
-0.04052734375,
0.01143646240234375,
-0.002109527587890625,
0.00250244140625,
0.0290985107421875,
-0.007511138916015625,
0.002269744873046875,
0.04718017578125,
0.0226287841796875,
-0.023406982421875,
-0.027618408203125,
-0.0758056640625,
0.0015897750854492188,
0.032135009765625,
0.02655029296875,
0.041259765625,
-0.032470703125,
-0.0584716796875,
0.005802154541015625,
-0.039398193359375,
-0.0254364013671875,
0.037017822265625,
-0.0215301513671875,
-0.0416259765625,
0.0301361083984375,
0.019073486328125,
0.026702880859375,
0.005863189697265625,
-0.02569580078125,
-0.002193450927734375,
-0.01324462890625,
-0.032470703125,
-0.0183258056640625,
0.0174102783203125,
0.057891845703125,
-0.007617950439453125,
0.03466796875,
-0.001010894775390625,
-0.01435089111328125,
0.0187225341796875,
-0.08453369140625,
-0.023162841796875,
-0.032867431640625,
-0.04888916015625,
-0.0240631103515625,
0.062255859375,
-0.058563232421875,
0.0209808349609375,
0.007511138916015625,
0.044647216796875,
0.0012350082397460938,
-0.007457733154296875,
-0.005924224853515625,
-0.042144775390625,
0.018829345703125,
0.01751708984375,
-0.048126220703125,
0.018096923828125,
0.03082275390625,
0.052459716796875,
0.0203704833984375,
0.0033130645751953125,
-0.002193450927734375,
0.048126220703125,
-0.03778076171875,
0.024688720703125,
-0.0175628662109375,
-0.07843017578125,
0.018707275390625,
0.051727294921875,
-0.016387939453125,
-0.042205810546875,
0.0887451171875,
-0.02435302734375,
-0.018951416015625,
-0.0673828125,
-0.022308349609375,
-0.020263671875,
0.013397216796875,
-0.0450439453125,
0.090576171875,
-0.00691986083984375,
-0.05694580078125,
0.0169830322265625,
-0.0228729248046875,
-0.001132965087890625,
0.040679931640625,
-0.0197601318359375,
-0.0238037109375,
0.0150299072265625,
-0.0009007453918457031,
0.0203094482421875,
0.0013494491577148438,
0.004871368408203125,
-0.037445068359375,
-0.0465087890625,
-0.008056640625,
-0.01541900634765625,
0.05279541015625,
0.01515960693359375,
0.034637451171875,
0.01306915283203125,
-0.0190277099609375,
0.004085540771484375,
0.027618408203125,
0.0080108642578125,
-0.0171966552734375,
-0.0309600830078125,
-0.0186767578125,
0.0113067626953125,
0.0115814208984375,
-0.038177490234375,
0.01898193359375,
0.0296173095703125,
0.05206298828125,
0.06585693359375,
0.00849151611328125,
-0.00664520263671875,
-0.032012939453125,
0.0595703125,
-0.022064208984375,
0.045745849609375,
0.0277252197265625,
-0.038055419921875,
-0.04345703125,
-0.0184173583984375,
-0.00971221923828125,
0.05609130859375,
-0.0276947021484375,
0.01288604736328125,
0.022674560546875,
-0.03253173828125,
0.0012063980102539062,
-0.01552581787109375,
0.0063018798828125,
0.00888824462890625,
0.0258331298828125,
-0.0159912109375,
-0.0457763671875,
-0.061248779296875,
-0.00435638427734375,
-0.023773193359375,
-0.00130462646484375,
0.05682373046875,
0.050506591796875,
-0.040252685546875,
0.033050537109375,
-0.08502197265625,
-0.039764404296875,
-0.03228759765625,
-0.01751708984375,
0.0141448974609375,
-0.0000960230827331543,
0.06982421875,
-0.04730224609375,
-0.0177001953125,
-0.0228271484375,
-0.038848876953125,
0.0025386810302734375,
0.0190887451171875,
-0.01079559326171875,
-0.028076171875,
0.03887939453125,
-0.05194091796875,
0.06036376953125,
0.0092620849609375,
-0.006317138671875,
0.03338623046875,
-0.03607177734375,
0.0247955322265625,
-0.075927734375,
-0.00868988037109375,
-0.00830078125,
-0.0184173583984375,
-0.0245513916015625,
-0.0253448486328125,
0.0044708251953125,
-0.0019292831420898438,
-0.04718017578125,
0.035491943359375,
-0.032867431640625,
0.0028533935546875,
-0.025848388671875,
-0.00170135498046875,
0.01520538330078125,
0.0021209716796875,
-0.01163482666015625,
0.056396484375,
0.016845703125,
-0.0277557373046875,
0.03173828125,
0.054718017578125,
-0.00301361083984375,
0.0266265869140625,
-0.04449462890625,
-0.00691986083984375,
-0.03741455078125,
0.002285003662109375,
-0.0293426513671875,
-0.04345703125,
0.031524658203125,
-0.01291656494140625,
0.01496124267578125,
-0.06304931640625,
0.0061798095703125,
-0.051055908203125,
-0.0227508544921875,
0.0596923828125,
0.02288818359375,
-0.041748046875,
0.047882080078125,
0.0176239013671875,
-0.0016393661499023438,
-0.03363037109375,
-0.0570068359375,
-0.029876708984375,
-0.0170440673828125,
-0.0253753662109375,
0.04833984375,
-0.01166534423828125,
0.00922393798828125,
-0.003108978271484375,
-0.044769287109375,
-0.03472900390625,
-0.01556396484375,
0.03662109375,
0.039886474609375,
-0.0077362060546875,
-0.026885986328125,
-0.01348114013671875,
-0.0275115966796875,
0.0147247314453125,
0.01265716552734375,
0.0288848876953125,
-0.01540374755859375,
0.0093231201171875,
-0.0164947509765625,
0.033843994140625,
0.05084228515625,
-0.0255584716796875,
0.03338623046875,
0.05780029296875,
-0.046661376953125,
-0.01763916015625,
-0.0185394287109375,
-0.0018634796142578125,
-0.037567138671875,
0.0465087890625,
-0.034088134765625,
-0.022003173828125,
0.0465087890625,
0.01486968994140625,
-0.052978515625,
0.040985107421875,
0.039215087890625,
0.0233306884765625,
0.053192138671875,
0.044830322265625,
0.001171112060546875,
0.049407958984375,
-0.043792724609375,
0.03509521484375,
-0.046630859375,
-0.029998779296875,
0.0031719207763671875,
0.0288848876953125,
-0.0116424560546875,
-0.05426025390625,
0.0199737548828125,
0.040985107421875,
0.0005555152893066406,
0.0426025390625,
-0.0246734619140625,
0.051727294921875,
0.03607177734375,
0.006744384765625,
-0.0172119140625,
-0.0361328125,
-0.005260467529296875,
0.00940704345703125,
-0.0435791015625,
-0.023406982421875,
0.070068359375,
0.034637451171875,
0.0682373046875,
0.007373809814453125,
0.03924560546875,
0.00909423828125,
0.0024280548095703125,
-0.034942626953125,
0.048797607421875,
-0.050262451171875,
-0.0478515625,
0.016845703125,
0.01515960693359375,
-0.0189666748046875,
0.0039005279541015625,
0.00917816162109375,
-0.06146240234375,
0.00626373291015625,
0.0283660888671875,
-0.0361328125,
0.039581298828125,
-0.034271240234375,
0.09149169921875,
-0.013275146484375,
0.0014858245849609375,
-0.00699615478515625,
-0.005733489990234375,
0.047393798828125,
0.03948974609375,
-0.0187225341796875,
-0.018890380859375,
0.0291900634765625,
0.081787109375,
-0.06402587890625,
0.00588226318359375,
-0.0190887451171875,
0.020477294921875,
0.06451416015625,
-0.020965576171875,
0.040863037109375,
0.06671142578125,
0.0209197998046875,
-0.0222320556640625,
0.031463623046875,
-0.0153350830078125,
-0.030975341796875,
0.037628173828125,
-0.035919189453125,
-0.0265960693359375,
-0.02447509765625,
0.00640106201171875,
0.01517486572265625,
-0.0214691162109375,
0.089111328125,
0.059112548828125,
-0.01140594482421875,
0.006561279296875,
0.041748046875,
0.00830078125,
0.038177490234375,
0.0097808837890625,
-0.0229949951171875,
-0.06231689453125,
0.0697021484375,
0.0093994140625,
0.0156402587890625,
0.00765228271484375,
0.0307159423828125,
-0.007404327392578125,
-0.0001131296157836914,
-0.03570556640625,
0.01337432861328125,
-0.0278778076171875,
-0.037322998046875,
-0.0305633544921875,
-0.051544189453125,
-0.00627899169921875,
-0.018402099609375,
-0.054962158203125,
0.01457977294921875,
-0.052520751953125,
-0.02264404296875,
0.032928466796875,
0.039886474609375,
-0.037872314453125,
0.082275390625,
-0.0643310546875,
0.0291595458984375,
0.0452880859375,
0.034088134765625,
-0.00640869140625,
-0.060638427734375,
-0.00241851806640625,
-0.0231781005859375,
-0.0545654296875,
-0.06640625,
0.0226898193359375,
0.0010776519775390625,
0.035614013671875,
0.048187255859375,
-0.0204315185546875,
0.06341552734375,
-0.033172607421875,
0.0703125,
0.021026611328125,
-0.0887451171875,
0.015777587890625,
-0.0408935546875,
0.0078887939453125,
0.0185089111328125,
0.0361328125,
-0.05377197265625,
0.0303802490234375,
-0.0650634765625,
-0.04205322265625,
0.023284912109375,
-0.012298583984375,
0.002349853515625,
0.01800537109375,
0.0330810546875,
0.0031490325927734375,
0.004947662353515625,
-0.055511474609375,
-0.0289764404296875,
-0.03302001953125,
-0.004131317138671875,
0.04278564453125,
-0.078369140625,
0.005764007568359375,
-0.029327392578125,
0.063720703125,
-0.00502777099609375,
0.053192138671875,
0.0186614990234375,
-0.004924774169921875,
0.00730133056640625,
-0.0012664794921875,
0.07159423828125,
0.03240966796875,
-0.056884765625,
-0.0131988525390625,
0.008270263671875,
-0.048065185546875,
-0.003154754638671875,
0.018524169921875,
-0.043670654296875,
0.0236053466796875,
0.01904296875,
0.066162109375,
0.01090240478515625,
0.005191802978515625,
0.082275390625,
-0.0003681182861328125,
-0.01293182373046875,
-0.035675048828125,
-0.0103302001953125,
-0.00560760498046875,
0.00766754150390625,
0.0067291259765625,
-0.0038394927978515625,
0.018463134765625,
-0.02813720703125,
0.038238525390625,
0.05841064453125,
-0.057708740234375,
-0.040008544921875,
0.05633544921875,
0.0219879150390625,
-0.0589599609375,
0.021087646484375,
-0.033905029296875,
-0.06378173828125,
0.049224853515625,
0.04022216796875,
0.0787353515625,
-0.038909912109375,
0.024200439453125,
0.04541015625,
0.009796142578125,
-0.0295867919921875,
0.06298828125,
0.0215911865234375,
-0.039215087890625,
-0.0004119873046875,
-0.0167083740234375,
-0.0273590087890625,
0.01538848876953125,
-0.1068115234375,
0.044036865234375,
-0.0494384765625,
-0.03692626953125,
-0.01959228515625,
-0.00012111663818359375,
-0.0408935546875,
0.0245819091796875,
0.0191192626953125,
0.09173583984375,
-0.0684814453125,
0.0604248046875,
0.0279083251953125,
-0.00780487060546875,
-0.058135986328125,
-0.02459716796875,
0.0046539306640625,
-0.05078125,
0.0144500732421875,
0.01210784912109375,
0.00257110595703125,
-0.05853271484375,
-0.051025390625,
-0.081787109375,
0.07672119140625,
0.00933837890625,
-0.0097808837890625,
0.013671875,
-0.03326416015625,
0.0013418197631835938,
-0.029937744140625,
0.0291900634765625,
0.02801513671875,
0.01560211181640625,
0.0191802978515625,
-0.0606689453125,
-0.0249481201171875,
-0.01030731201171875,
0.0013751983642578125,
0.027130126953125,
-0.10174560546875,
0.04937744140625,
-0.01465606689453125,
-0.0007333755493164062,
0.05126953125,
0.03912353515625,
0.00955963134765625,
0.0014171600341796875,
-0.0119781494140625,
0.03564453125,
0.035247802734375,
0.0034732818603515625,
0.0087890625,
0.0079345703125,
0.03778076171875,
0.0682373046875,
-0.03057861328125,
0.033203125,
0.03411865234375,
0.0014972686767578125,
0.0361328125,
0.031005859375,
-0.07086181640625,
0.0282440185546875,
0.01465606689453125,
-0.0029582977294921875,
-0.0283355712890625,
-0.00891876220703125,
-0.054779052734375,
0.03350830078125,
0.021270751953125,
-0.05419921875,
-0.01093292236328125,
-0.034393310546875,
0.0191650390625,
-0.0092315673828125,
-0.04791259765625,
0.01055145263671875,
0.02252197265625,
-0.0015497207641601562,
-0.01016998291015625,
0.017303466796875,
0.0150146484375,
-0.059112548828125,
-0.0184783935546875,
0.0015516281127929688,
0.035614013671875,
-0.0010118484497070312,
-0.032684326171875,
0.0277862548828125,
-0.022705078125,
-0.03704833984375,
-0.016387939453125,
0.045562744140625,
-0.0223846435546875,
-0.05291748046875,
0.044342041015625,
-0.0163421630859375,
0.0183563232421875,
0.0256805419921875,
-0.04669189453125,
0.040679931640625,
0.01139068603515625,
0.02667236328125,
0.004825592041015625,
-0.006298065185546875,
0.00597381591796875,
0.04931640625,
-0.008270263671875,
-0.01306915283203125,
0.03369140625,
-0.03436279296875,
0.042999267578125,
-0.048828125,
-0.05126953125,
-0.055389404296875,
0.02435302734375,
-0.0262603759765625,
-0.06903076171875,
0.0439453125,
0.0755615234375,
0.03973388671875,
-0.04705810546875,
0.0301361083984375,
0.0171661376953125,
0.0217437744140625,
-0.013580322265625,
0.041534423828125,
-0.034576416015625,
-0.01934814453125,
-0.02777099609375,
-0.05194091796875,
-0.001422882080078125,
0.050628662109375,
0.0014057159423828125,
-0.02386474609375,
0.035369873046875,
0.05499267578125,
-0.00954437255859375,
0.0284271240234375,
0.0262298583984375,
0.00913238525390625,
0.0190582275390625,
0.0235443115234375,
0.038116455078125,
-0.072509765625,
0.03448486328125,
0.006679534912109375,
-0.018646240234375,
-0.042816162109375,
-0.07940673828125,
-0.07354736328125,
-0.007213592529296875,
-0.036651611328125,
-0.0308380126953125,
0.0012140274047851562,
0.06005859375,
0.08245849609375,
-0.0312042236328125,
-0.0283355712890625,
0.036407470703125,
0.0308990478515625,
0.00897216796875,
-0.01389312744140625,
0.034210205078125,
0.04022216796875,
-0.003971099853515625,
0.024322509765625,
0.0198211669921875,
0.051483154296875,
0.0303192138671875,
-0.0149383544921875,
0.01061248779296875,
0.03350830078125,
0.0168914794921875,
0.037933349609375,
-0.050628662109375,
-0.032257080078125,
-0.00341033935546875,
0.0203399658203125,
0.00850677490234375,
0.06109619140625,
-0.00479888916015625,
-0.0372314453125,
0.01934814453125,
0.00315093994140625,
0.050689697265625,
0.026702880859375,
0.044464111328125,
-0.031890869140625,
0.03363037109375,
-0.01314544677734375,
0.032135009765625,
-0.02447509765625,
0.0117950439453125,
0.02685546875,
0.04541015625,
-0.0435791015625,
-0.07415771484375,
0.00797271728515625,
-0.10870361328125,
0.01497650146484375,
0.078857421875,
-0.0090789794921875,
-0.03057861328125,
0.031768798828125,
-0.041412353515625,
-0.0138397216796875,
-0.0243682861328125,
0.002349853515625,
0.040130615234375,
0.0015201568603515625,
-0.03741455078125,
-0.051055908203125,
0.05340576171875,
-0.011932373046875,
-0.0416259765625,
-0.02239990234375,
-0.027374267578125,
0.035064697265625,
0.004955291748046875,
0.032012939453125,
-0.00881195068359375,
0.025421142578125,
0.0111846923828125,
0.019989013671875,
-0.01102447509765625,
-0.04052734375,
-0.0040740966796875,
-0.0107574462890625,
-0.01788330078125,
-0.064697265625
]
] |
daryl149/llama-2-7b-chat-hf | 2023-07-23T17:12:59.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | daryl149 | null | null | daryl149/llama-2-7b-chat-hf | 80 | 29,405 | transformers | 2023-07-18T18:36:56 | These are the converted model weights for Llama-2-7B-chat in Hugging Face format.
Courtesy of [Mirage-Studio.io](https://mirage-studio.io), home of MirageGPT: the private ChatGPT alternative.
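For convenience, here is a minimal loading sketch with the `transformers` library; the prompt and generation settings are illustrative assumptions, and `device_map="auto"` additionally requires the `accelerate` package:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "daryl149/llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
# Llama-2-chat models expect the [INST] ... [/INST] chat format for best results.
prompt = "[INST] What is the capital of France? [/INST]"
print(generator(prompt, max_new_tokens=50)[0]["generated_text"])
```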
---
license: other
LLAMA 2 COMMUNITY LICENSE AGREEMENT
Llama 2 Version Release Date: July 18, 2023
"Agreement" means the terms and conditions for use, reproduction, distribution and
modification of the Llama Materials set forth herein.
"Documentation" means the specifications, manuals and documentation
accompanying Llama 2 distributed by Meta at ai.meta.com/resources/models-and-
libraries/llama-downloads/.
"Licensee" or "you" means you, or your employer or any other person or entity (if
you are entering into this Agreement on such person or entity's behalf), of the age
required under applicable laws, rules or regulations to provide legal consent and that
has legal authority to bind your employer or such other person or entity if you are
entering in this Agreement on their behalf.
"Llama 2" means the foundational large language models and software and
algorithms, including machine-learning model code, trained model weights,
inference-enabling code, training-enabling code, fine-tuning enabling code and other
elements of the foregoing distributed by Meta at ai.meta.com/resources/models-and-
libraries/llama-downloads/.
"Llama Materials" means, collectively, Meta's proprietary Llama 2 and
Documentation (and any portion thereof) made available under this Agreement.
"Meta" or "we" means Meta Platforms Ireland Limited (if you are located in or, if you
are an entity, your principal place of business is in the EEA or Switzerland) and Meta
Platforms, Inc. (if you are located outside of the EEA or Switzerland).
By clicking "I Accept" below or by using or distributing any portion or element of the
Llama Materials, you agree to be bound by this Agreement.
1. License Rights and Redistribution.
a. Grant of Rights. You are granted a non-exclusive, worldwide, non-
transferable and royalty-free limited license under Meta's intellectual property or
other rights owned by Meta embodied in the Llama Materials to use, reproduce,
distribute, copy, create derivative works of, and make modifications to the Llama
Materials.
b. Redistribution and Use.
i. If you distribute or make the Llama Materials, or any derivative works
thereof, available to a third party, you shall provide a copy of this Agreement to such
third party.
ii. If you receive Llama Materials, or any derivative works thereof, from
a Licensee as part of an integrated end user product, then Section 2 of this
Agreement will not apply to you.
iii. You must retain in all copies of the Llama Materials that you
distribute the following attribution notice within a "Notice" text file distributed as a
part of such copies: "Llama 2 is licensed under the LLAMA 2 Community License,
Copyright (c) Meta Platforms, Inc. All Rights Reserved."
iv. Your use of the Llama Materials must comply with applicable laws
and regulations (including trade compliance laws and regulations) and adhere to the
Acceptable Use Policy for the Llama Materials (available at
https://ai.meta.com/llama/use-policy), which is hereby incorporated by reference into
this Agreement.
v. You will not use the Llama Materials or any output or results of the
Llama Materials to improve any other large language model (excluding Llama 2 or
derivative works thereof).
2. Additional Commercial Terms. If, on the Llama 2 version release date, the
monthly active users of the products or services made available by or for Licensee,
or Licensee's affiliates, is greater than 700 million monthly active users in the
preceding calendar month, you must request a license from Meta, which Meta may
grant to you in its sole discretion, and you are not authorized to exercise any of the
rights under this Agreement unless or until Meta otherwise expressly grants you
such rights.
3. Disclaimer of Warranty. UNLESS REQUIRED BY APPLICABLE LAW, THE
LLAMA MATERIALS AND ANY OUTPUT AND RESULTS THEREFROM ARE
PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND,
EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY
WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, OR
FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE
FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING
THE LLAMA MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR
USE OF THE LLAMA MATERIALS AND ANY OUTPUT AND RESULTS.
4. Limitation of Liability. IN NO EVENT WILL META OR ITS AFFILIATES BE
LIABLE UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT,
NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS
AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL,
CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN
IF META OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF
ANY OF THE FOREGOING.
5. Intellectual Property.
a. No trademark licenses are granted under this Agreement, and in
connection with the Llama Materials, neither Meta nor Licensee may use any name
or mark owned by or associated with the other or any of its affiliates, except as
required for reasonable and customary use in describing and redistributing the
Llama Materials.
b. Subject to Meta's ownership of Llama Materials and derivatives made by or
for Meta, with respect to any derivative works and modifications of the Llama
Materials that are made by you, as between you and Meta, you are and will be the
owner of such derivative works and modifications.
c. If you institute litigation or other proceedings against Meta or any entity
(including a cross-claim or counterclaim in a lawsuit) alleging that the Llama
Materials or Llama 2 outputs or results, or any portion of any of the foregoing,
constitutes infringement of intellectual property or other rights owned or licensable
by you, then any licenses granted to you under this Agreement shall terminate as of
the date such litigation or claim is filed or instituted. You will indemnify and hold
harmless Meta from and against any claim by any third party arising out of or related
to your use or distribution of the Llama Materials.
6. Term and Termination. The term of this Agreement will commence upon your
acceptance of this Agreement or access to the Llama Materials and will continue in
full force and effect until terminated in accordance with the terms and conditions
herein. Meta may terminate this Agreement if you are in breach of any term or
condition of this Agreement. Upon termination of this Agreement, you shall delete
and cease use of the Llama Materials. Sections 3, 4 and 7 shall survive the
termination of this Agreement.
7. Governing Law and Jurisdiction. This Agreement will be governed and
construed under the laws of the State of California without regard to choice of law
principles, and the UN Convention on Contracts for the International Sale of Goods
does not apply to this Agreement. The courts of California shall have exclusive
jurisdiction of any dispute arising out of this Agreement.
---
| 7,237 | [
[
-0.02630615234375,
-0.039337158203125,
0.036224365234375,
0.047149658203125,
-0.041900634765625,
-0.006809234619140625,
0.0007162094116210938,
-0.057464599609375,
0.03277587890625,
0.058258056640625,
-0.04205322265625,
-0.03759765625,
-0.0601806640625,
0.0178680419921875,
-0.02880859375,
0.0882568359375,
-0.0196075439453125,
-0.040283203125,
-0.0250701904296875,
-0.00591278076171875,
-0.0269622802734375,
-0.0296630859375,
-0.017181396484375,
-0.0185394287109375,
0.037811279296875,
0.028411865234375,
0.049835205078125,
0.042724609375,
0.0301666259765625,
0.0238037109375,
-0.014984130859375,
-0.00489044189453125,
-0.0408935546875,
-0.0110015869140625,
-0.013092041015625,
-0.027984619140625,
-0.06390380859375,
0.01520538330078125,
0.01534271240234375,
0.017913818359375,
-0.034454345703125,
0.04400634765625,
-0.005329132080078125,
0.02069091796875,
-0.036102294921875,
0.022125244140625,
-0.04620361328125,
0.0023975372314453125,
-0.01763916015625,
-0.0092010498046875,
-0.0171661376953125,
-0.01812744140625,
-0.0126800537109375,
-0.07293701171875,
-0.0184326171875,
-0.0023040771484375,
0.0723876953125,
0.0280303955078125,
-0.0308990478515625,
-0.01233673095703125,
-0.021240234375,
0.055450439453125,
-0.059783935546875,
0.0107574462890625,
0.044647216796875,
0.0310821533203125,
-0.01032257080078125,
-0.0701904296875,
-0.052154541015625,
0.0017948150634765625,
0.0022144317626953125,
0.0201568603515625,
-0.041748046875,
-0.01329803466796875,
0.017669677734375,
0.050018310546875,
-0.0302581787109375,
0.010650634765625,
-0.044769287109375,
-0.00693511962890625,
0.0706787109375,
0.0035114288330078125,
0.0246734619140625,
-0.01995849609375,
-0.046478271484375,
-0.009490966796875,
-0.0687255859375,
0.00572967529296875,
0.04278564453125,
0.0016756057739257812,
-0.035430908203125,
0.05865478515625,
-0.0130615234375,
0.0165863037109375,
-0.0021800994873046875,
-0.047760009765625,
0.03656005859375,
-0.034271240234375,
-0.0237579345703125,
-0.0101470947265625,
0.062164306640625,
0.055084228515625,
-0.00286865234375,
-0.01776123046875,
-0.0174713134765625,
-0.01076507568359375,
-0.0086669921875,
-0.045196533203125,
0.01380157470703125,
0.004314422607421875,
-0.045501708984375,
-0.0217437744140625,
-0.0194549560546875,
-0.05401611328125,
-0.0309600830078125,
-0.0155029296875,
0.015625,
0.02191162109375,
-0.045684814453125,
0.025360107421875,
-0.0157928466796875,
0.0428466796875,
0.0127410888671875,
-0.045989990234375,
0.03387451171875,
0.019378662109375,
0.059295654296875,
0.0105743408203125,
-0.01274871826171875,
0.0055084228515625,
0.034942626953125,
-0.0235443115234375,
0.03314208984375,
-0.0208282470703125,
-0.06451416015625,
-0.00281524658203125,
0.0223236083984375,
-0.0007734298706054688,
-0.038970947265625,
0.03192138671875,
-0.028106689453125,
0.01224517822265625,
-0.00909423828125,
-0.007785797119140625,
-0.0246124267578125,
0.002227783203125,
-0.033416748046875,
0.08038330078125,
0.0171051025390625,
-0.036376953125,
0.0050506591796875,
-0.041839599609375,
-0.0255584716796875,
-0.01470947265625,
0.01800537109375,
-0.0220794677734375,
-0.01311492919921875,
0.0035648345947265625,
0.019195556640625,
-0.0288848876953125,
0.021484375,
-0.0288543701171875,
0.0006399154663085938,
0.00762939453125,
-0.01360321044921875,
0.08038330078125,
0.01708984375,
-0.046478271484375,
-0.01419830322265625,
-0.051727294921875,
-0.0294036865234375,
0.0438232421875,
-0.050994873046875,
0.002429962158203125,
0.01129150390625,
0.004070281982421875,
0.02117919921875,
0.049713134765625,
-0.045257568359375,
0.032073974609375,
-0.027008056640625,
0.02630615234375,
0.056671142578125,
0.00821685791015625,
0.0213165283203125,
-0.0341796875,
0.050628662109375,
0.0005664825439453125,
0.02593994140625,
0.0014247894287109375,
-0.05645751953125,
-0.06640625,
-0.02197265625,
-0.01137542724609375,
0.051971435546875,
-0.03704833984375,
0.028961181640625,
-0.0243072509765625,
-0.04864501953125,
-0.0340576171875,
0.024261474609375,
0.0382080078125,
0.033203125,
0.034698486328125,
-0.0212860107421875,
-0.042205810546875,
-0.0726318359375,
0.00489044189453125,
-0.0130615234375,
-0.001636505126953125,
0.04656982421875,
0.0352783203125,
-0.03704833984375,
0.060821533203125,
-0.043975830078125,
-0.0394287109375,
-0.00954437255859375,
-0.01160430908203125,
0.0277862548828125,
0.023681640625,
0.07611083984375,
-0.045166015625,
-0.037261962890625,
0.0005011558532714844,
-0.04681396484375,
-0.027740478515625,
-0.0019626617431640625,
-0.01277923583984375,
0.009124755859375,
0.0258636474609375,
-0.05810546875,
0.058319091796875,
0.057037353515625,
-0.03253173828125,
0.0241241455078125,
-0.01422882080078125,
0.004573822021484375,
-0.08123779296875,
0.0019073486328125,
0.0018854141235351562,
-0.0221405029296875,
-0.0305328369140625,
0.005290985107421875,
-0.04595947265625,
-0.001617431640625,
-0.05120849609375,
0.05377197265625,
-0.01532745361328125,
-0.0075836181640625,
-0.015716552734375,
0.02545166015625,
0.00640106201171875,
0.031951904296875,
-0.013092041015625,
0.062103271484375,
0.0233154296875,
-0.057037353515625,
0.01340484619140625,
0.03369140625,
-0.0143280029296875,
0.0386962890625,
-0.073486328125,
0.005413055419921875,
-0.00476837158203125,
0.0450439453125,
-0.0499267578125,
-0.0114593505859375,
0.0584716796875,
-0.042144775390625,
0.00040721893310546875,
0.0034961700439453125,
-0.04913330078125,
-0.00809478759765625,
-0.027435302734375,
0.0186767578125,
0.056182861328125,
-0.04345703125,
0.047760009765625,
0.037567138671875,
0.0004496574401855469,
-0.05572509765625,
-0.07073974609375,
-0.0010890960693359375,
-0.0396728515625,
-0.03692626953125,
0.0399169921875,
-0.01001739501953125,
-0.0212860107421875,
0.008392333984375,
0.0076751708984375,
-0.0133056640625,
0.01548004150390625,
0.0406494140625,
-0.01166534423828125,
0.0004477500915527344,
-0.0179901123046875,
0.01392364501953125,
-0.00806427001953125,
0.006793975830078125,
0.003753662109375,
0.037506103515625,
0.006511688232421875,
-0.019439697265625,
-0.031707763671875,
0.01340484619140625,
0.043182373046875,
-0.00795745849609375,
0.044189453125,
0.035552978515625,
-0.04278564453125,
0.0232391357421875,
-0.0438232421875,
0.0034008026123046875,
-0.037200927734375,
0.0157928466796875,
-0.020721435546875,
-0.0430908203125,
0.0638427734375,
0.011962890625,
0.031280517578125,
0.06610107421875,
0.055084228515625,
0.0015878677368164062,
0.042633056640625,
0.0655517578125,
-0.0010662078857421875,
0.033447265625,
-0.0163726806640625,
0.004619598388671875,
-0.08441162109375,
-0.041656494140625,
-0.0289154052734375,
-0.038177490234375,
-0.0443115234375,
-0.03863525390625,
0.0054931640625,
0.01335906982421875,
-0.038604736328125,
0.04705810546875,
-0.0227203369140625,
0.0302734375,
0.0289764404296875,
0.0133056640625,
0.03155517578125,
0.0028285980224609375,
-0.00273895263671875,
0.005298614501953125,
-0.018890380859375,
-0.052581787109375,
0.08282470703125,
0.04852294921875,
0.0338134765625,
0.03973388671875,
0.04632568359375,
0.0224609375,
0.0225677490234375,
-0.060638427734375,
0.050140380859375,
0.00206756591796875,
-0.06170654296875,
-0.0025882720947265625,
-0.0158233642578125,
-0.0732421875,
0.01296234130859375,
0.00429534912109375,
-0.08380126953125,
0.039825439453125,
-0.01534271240234375,
-0.0017871856689453125,
0.03131103515625,
-0.037506103515625,
0.036712646484375,
-0.006763458251953125,
-0.0122528076171875,
-0.02789306640625,
-0.04815673828125,
0.0443115234375,
-0.00531005859375,
0.0147705078125,
-0.02618408203125,
-0.042236328125,
0.07269287109375,
-0.0426025390625,
0.1080322265625,
-0.01727294921875,
-0.0167999267578125,
0.045257568359375,
-0.003875732421875,
0.0309295654296875,
0.01407623291015625,
0.0018033981323242188,
0.048614501953125,
0.009246826171875,
-0.01174163818359375,
-0.01776123046875,
0.038299560546875,
-0.09100341796875,
-0.05450439453125,
-0.032745361328125,
-0.032989501953125,
0.02880859375,
0.0199737548828125,
0.01343536376953125,
-0.0028209686279296875,
0.020233154296875,
0.0303955078125,
0.02691650390625,
-0.035552978515625,
0.0253753662109375,
0.037933349609375,
-0.033477783203125,
-0.046722412109375,
0.067626953125,
0.008392333984375,
0.0209503173828125,
0.0026035308837890625,
0.006633758544921875,
-0.0287322998046875,
-0.0272979736328125,
-0.0222625732421875,
0.0253143310546875,
-0.07177734375,
-0.035491943359375,
-0.033203125,
-3.5762786865234375e-7,
-0.01995849609375,
-0.00977325439453125,
-0.0161590576171875,
-0.041412353515625,
-0.06317138671875,
-0.01141357421875,
0.052154541015625,
0.04864501953125,
-0.0188751220703125,
0.048248291015625,
-0.043121337890625,
0.02569580078125,
0.01557159423828125,
0.022674560546875,
-0.023956298828125,
-0.0625,
0.0008940696716308594,
0.003376007080078125,
-0.039459228515625,
-0.0697021484375,
0.0189666748046875,
0.01320648193359375,
0.042938232421875,
0.0189666748046875,
-0.002162933349609375,
0.05596923828125,
-0.035125732421875,
0.074462890625,
0.035003662109375,
-0.06622314453125,
0.038055419921875,
-0.031829833984375,
-0.01071929931640625,
0.027313232421875,
0.0360107421875,
-0.035003662109375,
-0.020599365234375,
-0.0670166015625,
-0.0662841796875,
0.034881591796875,
0.01129913330078125,
0.027557373046875,
0.006633758544921875,
0.04278564453125,
-0.01446533203125,
0.0203399658203125,
-0.09283447265625,
-0.031890869140625,
-0.016326904296875,
-0.00704193115234375,
0.01485443115234375,
-0.0234222412109375,
-0.03240966796875,
-0.00937652587890625,
0.057220458984375,
0.0056304931640625,
0.01947021484375,
0.004695892333984375,
-0.0147705078125,
-0.0161895751953125,
0.0102996826171875,
0.05743408203125,
0.06646728515625,
-0.0074462890625,
-0.00829315185546875,
0.04132080078125,
-0.057769775390625,
0.0118560791015625,
0.00783538818359375,
-0.00498199462890625,
-0.021392822265625,
0.0015850067138671875,
0.037811279296875,
0.0340576171875,
-0.0379638671875,
0.0401611328125,
0.00824737548828125,
-0.02777099609375,
-0.0259857177734375,
-0.0086669921875,
0.01532745361328125,
0.042083740234375,
0.0216522216796875,
-0.0150146484375,
0.017303466796875,
-0.028533935546875,
0.0087127685546875,
0.0221405029296875,
-0.003139495849609375,
-0.02032470703125,
0.054718017578125,
-0.002918243408203125,
-0.0018024444580078125,
0.0247650146484375,
-0.00908660888671875,
-0.01971435546875,
0.05596923828125,
0.054718017578125,
0.04833984375,
-0.007610321044921875,
0.017425537109375,
0.0285186767578125,
0.0504150390625,
0.01380157470703125,
0.012939453125,
0.0012521743774414062,
-0.0286102294921875,
-0.0253143310546875,
-0.046142578125,
-0.03515625,
0.0010900497436523438,
-0.049591064453125,
0.031005859375,
-0.054412841796875,
-0.019927978515625,
-0.041900634765625,
0.007480621337890625,
-0.030517578125,
0.00888824462890625,
0.022216796875,
0.06597900390625,
-0.05718994140625,
0.04248046875,
0.0478515625,
-0.0726318359375,
-0.0589599609375,
-0.017486572265625,
0.01171875,
-0.07391357421875,
0.044952392578125,
-0.00276947021484375,
-0.0200958251953125,
-0.00495147705078125,
-0.0538330078125,
-0.08013916015625,
0.1187744140625,
0.0341796875,
-0.029541015625,
0.00801849365234375,
0.00983428955078125,
0.0173187255859375,
-0.01540374755859375,
0.031829833984375,
0.04425048828125,
0.037445068359375,
0.026123046875,
-0.08087158203125,
0.01690673828125,
-0.0005049705505371094,
0.004848480224609375,
-0.0286102294921875,
-0.06951904296875,
0.04937744140625,
-0.0048980712890625,
-0.0190277099609375,
0.00968170166015625,
0.052947998046875,
0.05535888671875,
0.024169921875,
0.033294677734375,
0.054412841796875,
0.054168701171875,
-0.0160369873046875,
0.08465576171875,
-0.0199737548828125,
0.0303802490234375,
0.057525634765625,
0.005237579345703125,
0.078369140625,
0.03338623046875,
-0.042816162109375,
0.06549072265625,
0.072265625,
0.004650115966796875,
0.036468505859375,
0.01485443115234375,
-0.0229949951171875,
-0.0016984939575195312,
-0.04022216796875,
-0.05682373046875,
0.0183868408203125,
0.0340576171875,
-0.030609130859375,
-0.00818634033203125,
-0.02410888671875,
0.027313232421875,
-0.013397216796875,
-0.0162811279296875,
0.041351318359375,
0.035186767578125,
0.004741668701171875,
0.046630859375,
0.0073089599609375,
0.052581787109375,
-0.04779052734375,
0.001842498779296875,
-0.0301513671875,
-0.011322021484375,
-0.0440673828125,
-0.046722412109375,
0.0146636962890625,
0.009552001953125,
-0.0159454345703125,
-0.016204833984375,
0.043121337890625,
0.0032329559326171875,
-0.036956787109375,
0.0313720703125,
0.006885528564453125,
0.033233642578125,
0.0347900390625,
-0.057830810546875,
0.0201568603515625,
-0.013214111328125,
-0.047119140625,
0.024505615234375,
0.0107421875,
0.00998687744140625,
0.056793212890625,
0.0496826171875,
-0.0019969940185546875,
0.01232147216796875,
-0.011016845703125,
0.08831787109375,
-0.038238525390625,
-0.0093994140625,
-0.04742431640625,
0.06427001953125,
0.005779266357421875,
-0.0231170654296875,
0.04779052734375,
0.0338134765625,
0.055084228515625,
-0.00537872314453125,
0.04705810546875,
-0.0182342529296875,
0.0129852294921875,
-0.037994384765625,
0.0528564453125,
-0.06854248046875,
0.0157470703125,
-0.02215576171875,
-0.08544921875,
-0.0142669677734375,
0.05224609375,
-0.0223236083984375,
0.005474090576171875,
0.028717041015625,
0.036346435546875,
0.003421783447265625,
-0.02203369140625,
0.01406097412109375,
0.01367950439453125,
0.038238525390625,
0.0396728515625,
0.045928955078125,
-0.0509033203125,
0.04425048828125,
-0.0239410400390625,
-0.0202789306640625,
-0.00955963134765625,
-0.0787353515625,
-0.052642822265625,
-0.02490234375,
-0.02984619140625,
-0.0285797119140625,
-0.032745361328125,
0.061279296875,
0.041015625,
-0.03411865234375,
-0.031341552734375,
0.0224456787109375,
0.02642822265625,
0.008270263671875,
-0.0160980224609375,
0.007518768310546875,
0.004718780517578125,
-0.05487060546875,
0.04248046875,
0.0162811279296875,
0.02667236328125,
-0.031097412109375,
-0.0186767578125,
-0.0173187255859375,
-0.006572723388671875,
0.060150146484375,
0.040771484375,
-0.07220458984375,
-0.0141143798828125,
-0.00881195068359375,
-0.0006017684936523438,
-0.0042266845703125,
0.0184173583984375,
-0.04974365234375,
0.0105133056640625,
0.033966064453125,
0.035491943359375,
0.044952392578125,
0.00225067138671875,
0.0073089599609375,
-0.04583740234375,
0.032440185546875,
0.00675201416015625,
0.041839599609375,
0.0037384033203125,
-0.030853271484375,
0.043914794921875,
0.01763916015625,
-0.03631591796875,
-0.0596923828125,
0.0095977783203125,
-0.07904052734375,
0.0074462890625,
0.0941162109375,
-0.016998291015625,
-0.01332855224609375,
0.009979248046875,
-0.011474609375,
0.01544952392578125,
-0.0185394287109375,
0.027984619140625,
0.039276123046875,
-0.0059814453125,
-0.01104736328125,
-0.06951904296875,
0.0204315185546875,
0.00507354736328125,
-0.071533203125,
-0.00616455078125,
0.037750244140625,
0.036865234375,
0.02716064453125,
0.055206298828125,
-0.0263671875,
0.01367950439453125,
-0.0122222900390625,
0.0211181640625,
0.007335662841796875,
-0.018157958984375,
-0.01457977294921875,
-0.01739501953125,
0.000568389892578125,
-0.01678466796875
]
] |
Venkatesh4342/distilbert-helpdesk-sentiment | 2023-10-09T13:59:25.000Z | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | Venkatesh4342 | null | null | Venkatesh4342/distilbert-helpdesk-sentiment | 0 | 29,354 | transformers | 2023-09-11T08:37:03 | ---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-helpdesk-sentiment
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-helpdesk-sentiment
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on a helpdesk sentiment dataset (described below).
It achieves the following results on the evaluation set:
- Loss: 0.3008
- Accuracy: 0.9417
## Model description
The model was trained on a dataset of abstractive summaries of customer and helpdesk conversations. A single sample (sentence) can mix negative and positive sentiment, but the data were labeled carefully according to the customer's sentiment.
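As a quick illustration, here is a minimal inference sketch using the `text-classification` pipeline; the example sentence is invented, and the exact label names depend on the model's `config.json`:

```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="Venkatesh4342/distilbert-helpdesk-sentiment",
)

# Hypothetical helpdesk summary; returns [{'label': ..., 'score': ...}]
print(classifier("The agent resolved my login issue quickly and was very polite."))
```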
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 7
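Expressed with the `transformers` Trainer API, these settings might look like the sketch below; the `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the optimizer defaults:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="distilbert-helpdesk-sentiment",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=7,
)
```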
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.68 | 100 | 0.3234 | 0.8932 |
| No log | 1.35 | 200 | 0.3211 | 0.9029 |
| No log | 2.03 | 300 | 0.3265 | 0.9126 |
| No log | 2.7 | 400 | 0.3553 | 0.9126 |
| No log | 3.38 | 500 | 0.3894 | 0.9223 |
| 0.2663 | 4.05 | 600 | 0.3379 | 0.9223 |
| 0.2663 | 4.73 | 700 | 0.2877 | 0.9320 |
| 0.2663 | 5.41 | 800 | 0.3072 | 0.9417 |
| 0.2663 | 6.08 | 900 | 0.2992 | 0.9320 |
| 0.2663 | 6.76 | 1000 | 0.3008 | 0.9417 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
| 2,143 | [
[
-0.035980224609375,
-0.039642333984375,
0.004974365234375,
0.018096923828125,
-0.019439697265625,
-0.014007568359375,
-0.0030078887939453125,
-0.007068634033203125,
0.0136260986328125,
0.01294708251953125,
-0.05804443359375,
-0.04925537109375,
-0.0543212890625,
-0.00860595703125,
-0.0161285400390625,
0.099853515625,
0.00855255126953125,
0.02777099609375,
-0.01470184326171875,
-0.001934051513671875,
-0.027618408203125,
-0.053314208984375,
-0.060272216796875,
-0.044036865234375,
0.0203704833984375,
0.0211181640625,
0.058380126953125,
0.03656005859375,
0.04443359375,
0.02301025390625,
-0.0382080078125,
-0.0037136077880859375,
-0.038177490234375,
-0.02825927734375,
0.0103607177734375,
-0.037322998046875,
-0.055755615234375,
0.005535125732421875,
0.032928466796875,
0.04351806640625,
-0.0191802978515625,
0.037872314453125,
0.015106201171875,
0.047454833984375,
-0.034088134765625,
0.0264434814453125,
-0.0325927734375,
0.01971435546875,
-0.00888824462890625,
-0.014984130859375,
-0.0170135498046875,
-0.00528717041015625,
0.0090789794921875,
-0.035675048828125,
0.03973388671875,
-0.002849578857421875,
0.0833740234375,
0.033721923828125,
-0.02496337890625,
-0.007648468017578125,
-0.047821044921875,
0.053802490234375,
-0.062286376953125,
0.00276947021484375,
0.031707763671875,
0.023712158203125,
0.004772186279296875,
-0.0499267578125,
-0.04644775390625,
0.01213836669921875,
-0.01546478271484375,
0.0261077880859375,
-0.0189056396484375,
0.00229644775390625,
0.058929443359375,
0.053558349609375,
-0.035858154296875,
0.00567626953125,
-0.033447265625,
-0.00921630859375,
0.053466796875,
0.03955078125,
-0.018585205078125,
-0.030609130859375,
-0.0306396484375,
-0.01163482666015625,
-0.002887725830078125,
0.0259552001953125,
0.041839599609375,
0.017608642578125,
-0.02252197265625,
0.0297088623046875,
-0.02386474609375,
0.051361083984375,
0.01546478271484375,
-0.017486572265625,
0.056671142578125,
0.0045013427734375,
-0.040802001953125,
0.00882720947265625,
0.06317138671875,
0.055206298828125,
0.0060577392578125,
0.01416778564453125,
-0.024566650390625,
0.0034027099609375,
0.0188446044921875,
-0.06793212890625,
-0.0269012451171875,
0.0161285400390625,
-0.043701171875,
-0.05694580078125,
0.0215606689453125,
-0.05352783203125,
0.0005259513854980469,
-0.035125732421875,
0.0205230712890625,
-0.0228271484375,
-0.0270843505859375,
0.008148193359375,
-0.0101776123046875,
0.0188446044921875,
0.0106964111328125,
-0.0657958984375,
0.0241241455078125,
0.0391845703125,
0.048126220703125,
0.01163482666015625,
-0.0123291015625,
-0.003482818603515625,
-0.0115814208984375,
-0.01540374755859375,
0.03643798828125,
-0.0017251968383789062,
-0.035430908203125,
-0.01258087158203125,
0.01274871826171875,
-0.007503509521484375,
-0.0361328125,
0.059356689453125,
-0.0181121826171875,
0.029510498046875,
-0.01401519775390625,
-0.037261962890625,
-0.0252838134765625,
0.03643798828125,
-0.044708251953125,
0.100830078125,
0.0177764892578125,
-0.083740234375,
0.03778076171875,
-0.0421142578125,
-0.010650634765625,
-0.0100555419921875,
-0.0044403076171875,
-0.062286376953125,
0.0039215087890625,
0.0027103424072265625,
0.03704833984375,
-0.017059326171875,
0.0301666259765625,
-0.0262298583984375,
-0.0300445556640625,
0.01041412353515625,
-0.042266845703125,
0.08013916015625,
0.015045166015625,
-0.03973388671875,
-0.00373077392578125,
-0.0885009765625,
0.0120086669921875,
0.0216064453125,
-0.0286865234375,
-0.0098114013671875,
-0.0248565673828125,
0.0265655517578125,
0.0260467529296875,
0.0253753662109375,
-0.038238525390625,
0.017822265625,
-0.0286407470703125,
0.02301025390625,
0.05181884765625,
0.004070281982421875,
0.008209228515625,
-0.028778076171875,
0.027130126953125,
0.037384033203125,
0.03399658203125,
0.016204833984375,
-0.02386474609375,
-0.08111572265625,
-0.01837158203125,
0.008056640625,
0.03656005859375,
-0.0258026123046875,
0.05413818359375,
-0.01142120361328125,
-0.04949951171875,
-0.0235748291015625,
0.005462646484375,
0.02044677734375,
0.06634521484375,
0.03131103515625,
-0.00554656982421875,
-0.0408935546875,
-0.08380126953125,
0.005634307861328125,
-0.009368896484375,
0.0233154296875,
0.0031108856201171875,
0.043365478515625,
-0.019622802734375,
0.0726318359375,
-0.0570068359375,
-0.0250396728515625,
-0.01137542724609375,
0.01227569580078125,
0.053070068359375,
0.040740966796875,
0.058380126953125,
-0.05340576171875,
-0.03192138671875,
-0.022979736328125,
-0.060760498046875,
0.023406982421875,
-0.007335662841796875,
-0.0109100341796875,
0.0040283203125,
0.01538848876953125,
-0.043701171875,
0.058929443359375,
0.030426025390625,
-0.0266876220703125,
0.054046630859375,
-0.021392822265625,
-0.006866455078125,
-0.1051025390625,
0.0218963623046875,
0.013153076171875,
-0.0156707763671875,
-0.037261962890625,
-0.0225677490234375,
0.00235748291015625,
-0.0083465576171875,
-0.0276336669921875,
0.0276031494140625,
-0.01313018798828125,
0.0171661376953125,
-0.0161285400390625,
-0.031707763671875,
0.0084991455078125,
0.0599365234375,
0.01100921630859375,
0.050689697265625,
0.053192138671875,
-0.03851318359375,
0.03778076171875,
0.035186767578125,
-0.023651123046875,
0.052520751953125,
-0.06134033203125,
0.00391387939453125,
-0.00946044921875,
-0.0008721351623535156,
-0.06707763671875,
-0.00914764404296875,
0.03033447265625,
-0.03656005859375,
0.022125244140625,
-0.0157470703125,
-0.0262451171875,
-0.043975830078125,
-0.011199951171875,
0.007434844970703125,
0.037872314453125,
-0.029571533203125,
0.032501220703125,
-0.0021991729736328125,
0.01323699951171875,
-0.0623779296875,
-0.0699462890625,
-0.01453399658203125,
-0.0226287841796875,
-0.033050537109375,
0.014923095703125,
-0.0015325546264648438,
-0.0146026611328125,
-0.005504608154296875,
-0.0117340087890625,
-0.0215606689453125,
0.0015726089477539062,
0.038787841796875,
0.03033447265625,
-0.005413055419921875,
-0.00009399652481079102,
-0.0020771026611328125,
-0.0279388427734375,
0.024932861328125,
0.0052337646484375,
0.033843994140625,
-0.0219879150390625,
-0.0241851806640625,
-0.05780029296875,
0.00557708740234375,
0.039886474609375,
-0.00506591796875,
0.075439453125,
0.04522705078125,
-0.03521728515625,
-0.002269744873046875,
-0.0306854248046875,
-0.0157928466796875,
-0.034576416015625,
0.042572021484375,
-0.03253173828125,
-0.0207672119140625,
0.053802490234375,
-0.00218963623046875,
0.00490570068359375,
0.07061767578125,
0.038055419921875,
-0.00862884521484375,
0.08612060546875,
0.0227508544921875,
-0.0174407958984375,
0.0189971923828125,
-0.066162109375,
0.006664276123046875,
-0.047393798828125,
-0.0411376953125,
-0.036346435546875,
-0.02734375,
-0.041656494140625,
0.007778167724609375,
0.0088043212890625,
0.016510009765625,
-0.04302978515625,
0.0130157470703125,
-0.05029296875,
0.01434326171875,
0.048492431640625,
0.02203369140625,
0.00833892822265625,
0.004215240478515625,
-0.0247802734375,
-0.005664825439453125,
-0.058380126953125,
-0.036224365234375,
0.08013916015625,
0.038543701171875,
0.05804443359375,
-0.01114654541015625,
0.0546875,
0.01007843017578125,
0.01557159423828125,
-0.05340576171875,
0.0224609375,
-0.00350189208984375,
-0.06353759765625,
-0.002948760986328125,
-0.034423828125,
-0.0411376953125,
0.00905609130859375,
-0.0224761962890625,
-0.034423828125,
0.0216064453125,
0.0226593017578125,
-0.0307769775390625,
0.036865234375,
-0.038848876953125,
0.0810546875,
-0.0278472900390625,
-0.01922607421875,
-0.0026226043701171875,
-0.040924072265625,
0.01003265380859375,
0.0079345703125,
-0.0002460479736328125,
-0.006267547607421875,
0.0214385986328125,
0.0623779296875,
-0.04632568359375,
0.05755615234375,
-0.03143310546875,
0.0189666748046875,
0.0209197998046875,
-0.014617919921875,
0.040863037109375,
0.0201873779296875,
-0.01483917236328125,
0.0183868408203125,
0.01088714599609375,
-0.041534423828125,
-0.0377197265625,
0.04840087890625,
-0.08111572265625,
-0.0241851806640625,
-0.0615234375,
-0.0296630859375,
-0.00629425048828125,
0.016510009765625,
0.0440673828125,
0.049041748046875,
-0.00897216796875,
0.0201568603515625,
0.0499267578125,
0.0072784423828125,
0.0257720947265625,
0.017333984375,
-0.003116607666015625,
-0.0450439453125,
0.05364990234375,
-0.01103973388671875,
0.01543426513671875,
0.005580902099609375,
0.0036563873291015625,
-0.035125732421875,
-0.0267181396484375,
-0.03424072265625,
0.01288604736328125,
-0.05804443359375,
-0.0237579345703125,
-0.035552978515625,
-0.028228759765625,
-0.02801513671875,
0.003116607666015625,
-0.031005859375,
-0.024688720703125,
-0.05560302734375,
-0.022857666015625,
0.0428466796875,
0.03717041015625,
0.01134490966796875,
0.039825439453125,
-0.042694091796875,
-0.01216888427734375,
0.0054931640625,
0.02252197265625,
0.00397491455078125,
-0.05926513671875,
-0.0225677490234375,
0.01459503173828125,
-0.037200927734375,
-0.0518798828125,
0.0404052734375,
-0.002323150634765625,
0.0455322265625,
0.0517578125,
-0.0008983612060546875,
0.07000732421875,
-0.005401611328125,
0.055328369140625,
0.038177490234375,
-0.05340576171875,
0.03704833984375,
-0.01306915283203125,
0.0206298828125,
0.059173583984375,
0.0418701171875,
-0.03558349609375,
-0.00849151611328125,
-0.084228515625,
-0.053558349609375,
0.06622314453125,
0.015655517578125,
0.01141357421875,
0.0023860931396484375,
0.0308074951171875,
-0.01396942138671875,
0.0296783447265625,
-0.06365966796875,
-0.053558349609375,
-0.036285400390625,
-0.034149169921875,
-0.01043701171875,
-0.02520751953125,
-0.00881195068359375,
-0.045684814453125,
0.0660400390625,
0.004180908203125,
0.0216522216796875,
0.0127410888671875,
0.01190185546875,
0.006107330322265625,
0.0068206787109375,
0.03851318359375,
0.050933837890625,
-0.04071044921875,
0.003253936767578125,
0.02252197265625,
-0.03936767578125,
0.0006337165832519531,
0.0172271728515625,
-0.01438140869140625,
0.01483917236328125,
0.01629638671875,
0.07977294921875,
0.00033354759216308594,
-0.00930023193359375,
0.043792724609375,
-0.00516510009765625,
-0.03973388671875,
-0.05133056640625,
-0.0029754638671875,
-0.0009679794311523438,
0.01261138916015625,
0.030303955078125,
0.03857421875,
0.0083160400390625,
-0.0141448974609375,
0.004486083984375,
0.0183258056640625,
-0.054931640625,
-0.01149749755859375,
0.0498046875,
0.006317138671875,
-0.01328277587890625,
0.06329345703125,
-0.009429931640625,
-0.03497314453125,
0.055694580078125,
0.0213470458984375,
0.058837890625,
-0.009674072265625,
0.002155303955078125,
0.057891845703125,
0.01033782958984375,
-0.0101776123046875,
0.0347900390625,
0.017913818359375,
-0.031768798828125,
-0.007335662841796875,
-0.05914306640625,
-0.008636474609375,
0.0355224609375,
-0.08807373046875,
0.03289794921875,
-0.03717041015625,
-0.04638671875,
0.005580902099609375,
0.01377105712890625,
-0.06854248046875,
0.0455322265625,
0.00481414794921875,
0.0843505859375,
-0.07080078125,
0.049591064453125,
0.053314208984375,
-0.045928955078125,
-0.08013916015625,
-0.0250091552734375,
-0.005733489990234375,
-0.05224609375,
0.0526123046875,
0.01187896728515625,
0.01479339599609375,
0.005992889404296875,
-0.0290985107421875,
-0.040374755859375,
0.09002685546875,
0.005825042724609375,
-0.054351806640625,
0.0018224716186523438,
0.0297393798828125,
0.048553466796875,
-0.0094757080078125,
0.039794921875,
0.0233001708984375,
0.01473236083984375,
0.01169586181640625,
-0.059600830078125,
-0.0038051605224609375,
-0.02911376953125,
0.0031070709228515625,
0.00963592529296875,
-0.05718994140625,
0.0767822265625,
0.00991058349609375,
0.0272064208984375,
-0.0021839141845703125,
0.042877197265625,
0.0083160400390625,
0.0237579345703125,
0.038299560546875,
0.0771484375,
0.048309326171875,
-0.0199432373046875,
0.063232421875,
-0.038970947265625,
0.07025146484375,
0.073486328125,
0.00604248046875,
0.0450439453125,
0.042449951171875,
-0.02386474609375,
0.037200927734375,
0.062744140625,
-0.0196990966796875,
0.031829833984375,
0.01020050048828125,
-0.0157318115234375,
-0.02734375,
0.01715087890625,
-0.04339599609375,
0.02459716796875,
0.002231597900390625,
-0.05322265625,
-0.022125244140625,
-0.0092315673828125,
0.0013017654418945312,
-0.01287841796875,
-0.03228759765625,
0.04791259765625,
-0.0099945068359375,
-0.023529052734375,
0.05670166015625,
0.0029754638671875,
0.04144287109375,
-0.043487548828125,
-0.0031986236572265625,
-0.0142669677734375,
0.040069580078125,
-0.037933349609375,
-0.050750732421875,
0.02392578125,
0.0024356842041015625,
-0.0253753662109375,
-0.0058135986328125,
0.031585693359375,
-0.01219940185546875,
-0.0682373046875,
0.00133514404296875,
0.018524169921875,
0.007190704345703125,
-0.00308990478515625,
-0.08349609375,
-0.0143890380859375,
0.0082550048828125,
-0.038787841796875,
0.002826690673828125,
0.0292816162109375,
0.01253509521484375,
0.03570556640625,
0.042327880859375,
-0.0002092123031616211,
-0.0014982223510742188,
0.01155853271484375,
0.08770751953125,
-0.041839599609375,
-0.048492431640625,
-0.06219482421875,
0.046844482421875,
-0.033233642578125,
-0.054351806640625,
0.0548095703125,
0.07196044921875,
0.064453125,
-0.01430511474609375,
0.051116943359375,
-0.00653839111328125,
0.041229248046875,
-0.0278472900390625,
0.055084228515625,
-0.04254150390625,
-0.0098724365234375,
-0.0200653076171875,
-0.0596923828125,
-0.00937652587890625,
0.058746337890625,
-0.034698486328125,
0.010986328125,
0.035552978515625,
0.0582275390625,
-0.003772735595703125,
0.01079559326171875,
0.00516510009765625,
-0.002590179443359375,
0.008270263671875,
0.033782958984375,
0.050201416015625,
-0.056182861328125,
0.043212890625,
-0.0521240234375,
-0.01751708984375,
-0.01033782958984375,
-0.055999755859375,
-0.0784912109375,
-0.028228759765625,
-0.035491943359375,
-0.03936767578125,
-0.0172882080078125,
0.07012939453125,
0.05755615234375,
-0.060516357421875,
-0.0173187255859375,
-0.0080413818359375,
-0.0306854248046875,
-0.023162841796875,
-0.0182952880859375,
0.032257080078125,
-0.01259613037109375,
-0.053192138671875,
-0.0272064208984375,
-0.01849365234375,
0.0269622802734375,
-0.02459716796875,
-0.0179595947265625,
-0.00762939453125,
-0.02447509765625,
0.0159912109375,
-0.00669097900390625,
-0.032928466796875,
-0.006717681884765625,
0.005130767822265625,
-0.016571044921875,
0.017822265625,
0.0283050537109375,
-0.0251312255859375,
0.0228271484375,
0.0236053466796875,
0.0240325927734375,
0.057464599609375,
0.0052490234375,
0.00518798828125,
-0.060211181640625,
0.033172607421875,
0.0217132568359375,
0.0303497314453125,
0.0169677734375,
-0.0391845703125,
0.0231475830078125,
0.038421630859375,
-0.036651611328125,
-0.054962158203125,
-0.01273345947265625,
-0.08355712890625,
0.003299713134765625,
0.08270263671875,
-0.00391387939453125,
-0.029632568359375,
0.01381683349609375,
-0.02777099609375,
0.0165557861328125,
-0.025421142578125,
0.04937744140625,
0.06732177734375,
-0.00754547119140625,
0.004329681396484375,
-0.035125732421875,
0.047698974609375,
0.0163421630859375,
-0.044708251953125,
-0.0102996826171875,
0.03076171875,
0.037261962890625,
0.01229095458984375,
0.03607177734375,
-0.0038356781005859375,
0.01305389404296875,
0.0111846923828125,
0.020477294921875,
-0.034332275390625,
-0.00946807861328125,
-0.0308380126953125,
-0.0001456737518310547,
0.01318359375,
-0.036163330078125
]
] |
cmarkea/distilcamembert-base-sentiment | 2023-08-01T10:04:44.000Z | [
"transformers",
"pytorch",
"tf",
"onnx",
"safetensors",
"camembert",
"text-classification",
"fr",
"dataset:amazon_reviews_multi",
"dataset:allocine",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | cmarkea | null | null | cmarkea/distilcamembert-base-sentiment | 26 | 29,291 | transformers | 2022-03-02T23:29:05 | ---
language: fr
license: mit
datasets:
- amazon_reviews_multi
- allocine
widget:
- text: "Je pensais lire un livre nul, mais finalement je l'ai trouvé super !"
- text: "Cette banque est très bien, mais elle n'offre pas les services de paiements sans contact."
- text: "Cette banque est très bien et elle offre en plus les services de paiements sans contact."
---
DistilCamemBERT-Sentiment
=========================
We present DistilCamemBERT-Sentiment, which is [DistilCamemBERT](https://huggingface.co/cmarkea/distilcamembert-base) fine-tuned for the sentiment analysis task on the French language. This model is built using two datasets, [Amazon Reviews](https://huggingface.co/datasets/amazon_reviews_multi) and [Allociné.fr](https://huggingface.co/datasets/allocine), in order to minimize bias: Amazon reviews are similar to one another and relatively short, whereas Allociné reviews are long, rich texts.

This model is close to [tblard/tf-allocine](https://huggingface.co/tblard/tf-allocine), which is based on the [CamemBERT](https://huggingface.co/camembert-base) model. The problem with CamemBERT-based models appears at scale, for example in the production phase, where inference cost can become a technological issue. To counteract this effect, we propose this model, which **divides the inference time by two** at the same power consumption thanks to [DistilCamemBERT](https://huggingface.co/cmarkea/distilcamembert-base).
Dataset
-------
The dataset comprises 204,993 training and 4,999 test reviews from Amazon, plus 235,516 training and 4,729 test reviews from the [Allociné website](https://www.allocine.fr/). The dataset is labeled with five categories:
* 1 star: terrible appreciation,
* 2 stars: bad appreciation,
* 3 stars: neutral appreciation,
* 4 stars: good appreciation,
* 5 stars: excellent appreciation.
Evaluation results
------------------
In addition to accuracy (called here *exact accuracy*), and in order to be robust to +/-1 star estimation errors, we take the following definition as a performance measure:
$$\mathrm{top\!-\!2\; acc}=\frac{1}{|\mathcal{O}|}\sum_{i\in\mathcal{O}}\sum_{0\leq l < 2}\mathbb{1}(\hat{f}_{i,l}=y_i)$$
where \\(\hat{f}_{i,l}\\) is the label with the \\(l\\)-th largest predicted score for observation \\(i\\) (counting from 0), \\(y_i\\) the true label, \\(\mathcal{O}\\) the test set of observations, and \\(\mathbb{1}\\) the indicator function.
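Concretely, top-2 accuracy can be computed from a matrix of predicted scores as in this sketch (the toy scores and labels are illustrative only):

```python
import numpy as np

def top2_accuracy(scores: np.ndarray, labels: np.ndarray) -> float:
    """scores: (n_obs, n_classes) predicted scores; labels: (n_obs,) true class ids."""
    top2 = np.argsort(scores, axis=1)[:, -2:]  # two highest-scoring classes per row
    return float(np.mean([labels[i] in top2[i] for i in range(len(labels))]))

scores = np.array([[0.05, 0.14, 0.36, 0.32, 0.13],
                   [0.60, 0.20, 0.10, 0.05, 0.05]])
labels = np.array([3, 1])  # class ids 0..4 stand for 1..5 stars
print(top2_accuracy(scores, labels))  # 1.0: both true labels are in the top 2
```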
| **class** | **exact accuracy (%)** | **top-2 acc (%)** | **support** |
| :---------: | :--------------------: | :---------------: | :---------: |
| **global** | 61.01 | 88.80 | 9,698 |
| **1 star** | 87.21 | 77.17 | 1,905 |
| **2 stars** | 79.19 | 84.75 | 1,935 |
| **3 stars** | 77.85 | 78.98 | 1,974 |
| **4 stars** | 78.61 | 90.22 | 1,952 |
| **5 stars** | 85.96 | 82.92 | 1,932 |
Benchmark
---------
This model is compared to 3 reference models (see below). As the models do not all share the exact same definition of targets, we detail the performance measure used for each of them. An **AMD Ryzen 5 4500U @ 2.3GHz with 6 cores** was used for the mean inference time measurement.
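A rough sketch of how such a mean inference time can be measured follows; the sentence, warm-up, and loop count are our assumptions, and absolute numbers depend heavily on hardware:

```python
import time
from transformers import pipeline

analyzer = pipeline("text-classification", model="cmarkea/distilcamembert-base-sentiment")

sentence = "Ce film était vraiment excellent."
analyzer(sentence)  # warm-up run

n = 100
start = time.perf_counter()
for _ in range(n):
    analyzer(sentence)
print(f"mean inference time: {(time.perf_counter() - start) / n * 1000:.2f} ms")
```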
#### bert-base-multilingual-uncased-sentiment
[nlptown/bert-base-multilingual-uncased-sentiment](https://huggingface.co/nlptown/bert-base-multilingual-uncased-sentiment) is based on BERT model in the multilingual and uncased version. This sentiment analyzer is trained on Amazon reviews, similar to our model. Hence the targets and their definitions are the same.
| **model** | **time (ms)** | **exact accuracy (%)** | **top-2 acc (%)** |
| :-------: | :-----------: | :--------------------: | :---------------: |
| [cmarkea/distilcamembert-base-sentiment](https://huggingface.co/cmarkea/distilcamembert-base-sentiment) | **95.56** | **61.01** | **88.80** |
| [nlptown/bert-base-multilingual-uncased-sentiment](https://huggingface.co/nlptown/bert-base-multilingual-uncased-sentiment) | 187.70 | 54.41 | 82.82 |
#### tf-allociné and barthez-sentiment-classification
[tblard/tf-allocine](https://huggingface.co/tblard/tf-allocine), based on the [CamemBERT](https://huggingface.co/camembert-base) model, and [moussaKam/barthez-sentiment-classification](https://huggingface.co/moussaKam/barthez-sentiment-classification), based on [BARThez](https://huggingface.co/moussaKam/barthez), share the same two-class target definition. To reduce our task to a two-class problem, we consider only the *"1 star"* and *"2 stars"* labels as *negative* sentiment and *"4 stars"* and *"5 stars"* as *positive* sentiment, excluding *"3 stars"*, which can be interpreted as a *neutral* class (see the sketch after the table below). In this context, the problem of +/-1 star estimation errors disappears, so we use the classical accuracy definition.
| **model** | **time (ms)** | **exact accuracy (%)** |
| :-------: | :-----------: | :--------------------: |
| [cmarkea/distilcamembert-base-sentiment](https://huggingface.co/cmarkea/distilcamembert-base-sentiment) | **95.56** | **97.52** |
| [tblard/tf-allocine](https://huggingface.co/tblard/tf-allocine) | 329.74 | 95.69 |
| [moussaKam/barthez-sentiment-classification](https://huggingface.co/moussaKam/barthez-sentiment-classification) | 197.95 | 94.29 |
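As a sketch of the two-class collapse described above (the function and variable names are ours, not from the original evaluation code):

```python
def collapse_to_binary(star_scores: dict) -> str:
    """Map 5-star class scores to a binary sentiment, ignoring the neutral '3 stars' class."""
    negative = star_scores["1 star"] + star_scores["2 stars"]
    positive = star_scores["4 stars"] + star_scores["5 stars"]
    return "positive" if positive >= negative else "negative"

# Using the example scores produced by the pipeline in the next section:
scores = {"1 star": 0.048, "2 stars": 0.142, "3 stars": 0.359,
          "4 stars": 0.318, "5 stars": 0.134}
print(collapse_to_binary(scores))  # positive
```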
How to use DistilCamemBERT-Sentiment
------------------------------------
```python
from transformers import pipeline
analyzer = pipeline(
task='text-classification',
model="cmarkea/distilcamembert-base-sentiment",
tokenizer="cmarkea/distilcamembert-base-sentiment"
)
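# Note: in recent versions of transformers, `return_all_scores=True` is
# deprecated in favor of `top_k=None`.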
result = analyzer(
"J'aime me promener en forêt même si ça me donne mal aux pieds.",
return_all_scores=True
)
result
[{'label': '1 star',
'score': 0.047529436647892},
{'label': '2 stars',
'score': 0.14150355756282806},
{'label': '3 stars',
'score': 0.3586442470550537},
{'label': '4 stars',
'score': 0.3181498646736145},
{'label': '5 stars',
'score': 0.13417290151119232}]
```
### Optimum + ONNX
```python
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer, pipeline
HUB_MODEL = "cmarkea/distilcamembert-base-sentiment"
tokenizer = AutoTokenizer.from_pretrained(HUB_MODEL)
model = ORTModelForSequenceClassification.from_pretrained(HUB_MODEL)
onnx_classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
# Quantized onnx model
quantized_model = ORTModelForSequenceClassification.from_pretrained(
HUB_MODEL, file_name="model_quantized.onnx"
)
```
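The quantized model can then be used in place of the full-precision one, for instance (assuming the `model_quantized.onnx` file is available in the repository):
```python
quantized_classifier = pipeline(
    "text-classification", model=quantized_model, tokenizer=tokenizer
)
quantized_classifier("J'aime me promener en forêt même si ça me donne mal aux pieds.")
```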
Citation
--------
```bibtex
@inproceedings{delestre:hal-03674695,
TITLE = {{DistilCamemBERT : une distillation du mod{\`e}le fran{\c c}ais CamemBERT}},
AUTHOR = {Delestre, Cyrile and Amar, Abibatou},
URL = {https://hal.archives-ouvertes.fr/hal-03674695},
BOOKTITLE = {{CAp (Conf{\'e}rence sur l'Apprentissage automatique)}},
ADDRESS = {Vannes, France},
YEAR = {2022},
MONTH = Jul,
KEYWORDS = {NLP ; Transformers ; CamemBERT ; Distillation},
PDF = {https://hal.archives-ouvertes.fr/hal-03674695/file/cap2022.pdf},
HAL_ID = {hal-03674695},
HAL_VERSION = {v1},
}
``` | 7,180 | [
[
-0.0372314453125,
-0.049346923828125,
0.023162841796875,
0.0244140625,
-0.0217132568359375,
0.002105712890625,
-0.020233154296875,
-0.0268707275390625,
0.0176849365234375,
0.0135955810546875,
-0.03802490234375,
-0.047119140625,
-0.057891845703125,
-0.0033473968505859375,
0.0006093978881835938,
0.1004638671875,
0.01033782958984375,
0.0193939208984375,
0.0139617919921875,
-0.023956298828125,
-0.012298583984375,
-0.053375244140625,
-0.03436279296875,
-0.0176544189453125,
0.037506103515625,
0.0147552490234375,
0.049163818359375,
0.0202178955078125,
0.042510986328125,
0.0236968994140625,
-0.012298583984375,
-0.0006418228149414062,
-0.026153564453125,
0.0038204193115234375,
0.00408935546875,
-0.03619384765625,
-0.061004638671875,
0.01922607421875,
0.032562255859375,
0.047882080078125,
0.0039215087890625,
0.0303802490234375,
0.02783203125,
0.0616455078125,
-0.04144287109375,
0.024139404296875,
-0.033447265625,
-0.00506591796875,
0.000537872314453125,
0.0011157989501953125,
-0.0245819091796875,
-0.035888671875,
0.0065460205078125,
-0.027984619140625,
0.021881103515625,
0.0077972412109375,
0.08587646484375,
0.027740478515625,
-0.014404296875,
-0.01428985595703125,
-0.05059814453125,
0.08843994140625,
-0.0748291015625,
0.0206298828125,
0.01018524169921875,
0.015838623046875,
0.01654052734375,
-0.04150390625,
-0.04742431640625,
0.00146484375,
-0.01194000244140625,
0.0479736328125,
-0.032623291015625,
-0.006561279296875,
0.0194854736328125,
0.046173095703125,
-0.034423828125,
-0.00360107421875,
-0.017913818359375,
-0.005313873291015625,
0.057891845703125,
0.0008726119995117188,
-0.0194854736328125,
-0.0291595458984375,
-0.044891357421875,
-0.0298309326171875,
-0.00872802734375,
0.016510009765625,
0.0239105224609375,
0.0286407470703125,
-0.010986328125,
0.040252685546875,
-0.00786590576171875,
0.038787841796875,
0.0219879150390625,
-0.01508331298828125,
0.05511474609375,
-0.0257568359375,
-0.027313232421875,
0.004001617431640625,
0.08111572265625,
0.04693603515625,
0.0157470703125,
0.0180816650390625,
-0.0284423828125,
0.026031494140625,
0.00029277801513671875,
-0.058868408203125,
-0.0236968994140625,
0.0243682861328125,
-0.043975830078125,
-0.037872314453125,
0.01129913330078125,
-0.0701904296875,
-0.005664825439453125,
-0.01617431640625,
0.037841796875,
-0.0290985107421875,
-0.0220794677734375,
-0.00525665283203125,
-0.0184326171875,
0.0089874267578125,
-0.001224517822265625,
-0.0450439453125,
0.0092315673828125,
0.03521728515625,
0.056732177734375,
-0.002071380615234375,
-0.0137786865234375,
-0.01361846923828125,
-0.036773681640625,
-0.0207366943359375,
0.051605224609375,
-0.0141754150390625,
-0.0242767333984375,
-0.01415252685546875,
0.0007481575012207031,
0.00128936767578125,
-0.0304107666015625,
0.06256103515625,
-0.0281219482421875,
0.03338623046875,
-0.0295562744140625,
-0.0521240234375,
-0.033111572265625,
0.039031982421875,
-0.049957275390625,
0.09368896484375,
0.01947021484375,
-0.06591796875,
0.0268707275390625,
-0.048248291015625,
-0.00823974609375,
-0.026123046875,
0.0220794677734375,
-0.0472412109375,
0.005115509033203125,
0.0186614990234375,
0.047027587890625,
-0.0350341796875,
0.01471710205078125,
-0.0243072509765625,
-0.01055908203125,
0.023956298828125,
-0.043975830078125,
0.08880615234375,
0.0218353271484375,
-0.038665771484375,
-0.0090789794921875,
-0.069091796875,
0.01464080810546875,
0.0111236572265625,
-0.028900146484375,
-0.020416259765625,
-0.00952911376953125,
0.01528167724609375,
0.00980377197265625,
0.020843505859375,
-0.047882080078125,
0.00402069091796875,
-0.0307769775390625,
0.0207366943359375,
0.05010986328125,
0.00628662109375,
0.0226898193359375,
-0.0299224853515625,
0.032470703125,
0.010009765625,
0.03228759765625,
0.00970458984375,
-0.037506103515625,
-0.06689453125,
-0.0299835205078125,
0.037933349609375,
0.05926513671875,
-0.031646728515625,
0.0654296875,
-0.0274810791015625,
-0.05706787109375,
-0.0311737060546875,
0.002201080322265625,
0.03570556640625,
0.040069580078125,
0.03955078125,
-0.0202789306640625,
-0.033935546875,
-0.069580078125,
-0.0161895751953125,
-0.030548095703125,
0.00937652587890625,
0.00591278076171875,
0.0259246826171875,
-0.017059326171875,
0.062744140625,
-0.052825927734375,
-0.0238189697265625,
-0.0188751220703125,
0.01415252685546875,
0.048614501953125,
0.0266571044921875,
0.06256103515625,
-0.0472412109375,
-0.0626220703125,
-0.01374053955078125,
-0.059600830078125,
-0.00667572021484375,
0.0092315673828125,
-0.012603759765625,
0.0292816162109375,
0.0025959014892578125,
-0.039825439453125,
0.0224761962890625,
0.043426513671875,
-0.0275115966796875,
0.054168701171875,
0.005985260009765625,
0.0140228271484375,
-0.09259033203125,
0.0018072128295898438,
0.0202789306640625,
-0.0037517547607421875,
-0.04510498046875,
-0.0136871337890625,
-0.003589630126953125,
0.014892578125,
-0.026397705078125,
0.03515625,
-0.020660400390625,
0.0135650634765625,
-0.00047278404235839844,
0.007137298583984375,
0.0193939208984375,
0.06964111328125,
-0.002414703369140625,
0.030731201171875,
0.04339599609375,
-0.028839111328125,
0.036224365234375,
0.015838623046875,
-0.026611328125,
0.057708740234375,
-0.044708251953125,
-0.012939453125,
-0.00611114501953125,
0.018951416015625,
-0.0908203125,
0.009613037109375,
0.03521728515625,
-0.054443359375,
0.0255584716796875,
0.00102996826171875,
-0.03436279296875,
-0.0245819091796875,
-0.037750244140625,
0.00948333740234375,
0.05096435546875,
-0.03765869140625,
0.042388916015625,
0.0131072998046875,
-0.0091552734375,
-0.0694580078125,
-0.061126708984375,
-0.02740478515625,
-0.0242462158203125,
-0.034942626953125,
0.00983428955078125,
-0.0161285400390625,
-0.031890869140625,
0.001766204833984375,
-0.005817413330078125,
-0.01123046875,
-0.0096435546875,
0.01161956787109375,
0.047210693359375,
-0.01068878173828125,
0.0139007568359375,
-0.0015478134155273438,
-0.0139617919921875,
0.005947113037109375,
-0.01049041748046875,
0.03558349609375,
-0.0310821533203125,
0.0009493827819824219,
-0.03472900390625,
0.007274627685546875,
0.04705810546875,
-0.00804901123046875,
0.0517578125,
0.049072265625,
-0.00687408447265625,
-0.007526397705078125,
-0.048248291015625,
-0.0204925537109375,
-0.032958984375,
0.0289459228515625,
-0.0220794677734375,
-0.033111572265625,
0.040435791015625,
0.0177001953125,
0.0209503173828125,
0.067626953125,
0.043670654296875,
-0.01812744140625,
0.07940673828125,
0.0160369873046875,
-0.023406982421875,
0.0248565673828125,
-0.0540771484375,
0.0248870849609375,
-0.0501708984375,
-0.0108489990234375,
-0.039031982421875,
-0.040435791015625,
-0.05621337890625,
-0.0147552490234375,
0.0260772705078125,
0.017120361328125,
-0.0311737060546875,
0.0228729248046875,
-0.0589599609375,
0.0041656494140625,
0.046356201171875,
0.007965087890625,
0.0178985595703125,
0.01149749755859375,
-0.0155487060546875,
-0.0126495361328125,
-0.05584716796875,
-0.03704833984375,
0.08563232421875,
0.03704833984375,
0.04705810546875,
0.000059545040130615234,
0.041290283203125,
0.0301971435546875,
0.0302581787109375,
-0.07733154296875,
0.033538818359375,
-0.02020263671875,
-0.06475830078125,
-0.00885009765625,
-0.0316162109375,
-0.0526123046875,
0.0284881591796875,
-0.01593017578125,
-0.03741455078125,
0.03533935546875,
0.017822265625,
-0.02728271484375,
0.036773681640625,
-0.042205810546875,
0.0667724609375,
-0.029083251953125,
-0.0306549072265625,
-0.0018873214721679688,
-0.046722412109375,
0.0226898193359375,
-0.00695037841796875,
0.029937744140625,
-0.025299072265625,
0.01273345947265625,
0.057708740234375,
-0.0367431640625,
0.0633544921875,
-0.0266571044921875,
0.004940032958984375,
0.041473388671875,
-0.0010080337524414062,
0.0340576171875,
-0.0092620849609375,
-0.0223388671875,
0.039703369140625,
0.0072479248046875,
-0.0313720703125,
-0.03741455078125,
0.052978515625,
-0.06658935546875,
-0.0216064453125,
-0.065673828125,
-0.0216522216796875,
-0.01532745361328125,
-0.00017690658569335938,
0.04486083984375,
0.03765869140625,
-0.0140533447265625,
0.01812744140625,
0.03472900390625,
-0.008697509765625,
0.03594970703125,
0.0270233154296875,
-0.00988006591796875,
-0.03460693359375,
0.07080078125,
0.005992889404296875,
0.0203094482421875,
0.0303192138671875,
0.028167724609375,
-0.042755126953125,
-0.012939453125,
-0.03375244140625,
0.0196075439453125,
-0.060150146484375,
-0.029327392578125,
-0.05072021484375,
-0.0201873779296875,
-0.02685546875,
-0.0005245208740234375,
-0.042694091796875,
-0.0523681640625,
-0.02362060546875,
-0.01508331298828125,
0.035308837890625,
0.02685546875,
-0.01678466796875,
0.0240325927734375,
-0.058929443359375,
-0.008392333984375,
0.0201873779296875,
0.01922607421875,
-0.0004968643188476562,
-0.0498046875,
-0.0143280029296875,
0.0090789794921875,
-0.04827880859375,
-0.066162109375,
0.0457763671875,
0.016693115234375,
0.03350830078125,
0.0396728515625,
0.0177459716796875,
0.027313232421875,
-0.0100555419921875,
0.07696533203125,
0.030609130859375,
-0.0684814453125,
0.026824951171875,
-0.01220703125,
0.01070404052734375,
0.046875,
0.05023193359375,
-0.0255126953125,
-0.022735595703125,
-0.0523681640625,
-0.061798095703125,
0.054534912109375,
0.0211639404296875,
-0.00632476806640625,
0.005359649658203125,
0.0135498046875,
0.00345611572265625,
0.023468017578125,
-0.0673828125,
-0.048980712890625,
-0.039306640625,
-0.030914306640625,
-0.0301666259765625,
-0.0245513916015625,
-0.0040435791015625,
-0.03485107421875,
0.077880859375,
0.007049560546875,
0.033294677734375,
0.017181396484375,
0.005889892578125,
0.004756927490234375,
0.0264892578125,
0.01427459716796875,
0.041534423828125,
-0.037506103515625,
0.0026569366455078125,
0.0195770263671875,
-0.03460693359375,
0.01383209228515625,
0.01013946533203125,
-0.00870513916015625,
0.01285552978515625,
0.0299835205078125,
0.0709228515625,
0.00420379638671875,
-0.0357666015625,
0.0352783203125,
0.0029964447021484375,
-0.0244140625,
-0.04083251953125,
-0.0100860595703125,
-0.00965118408203125,
0.0230255126953125,
0.02508544921875,
0.0247802734375,
0.0123748779296875,
-0.04583740234375,
-0.00211334228515625,
0.029266357421875,
-0.05230712890625,
-0.0287322998046875,
0.0526123046875,
0.003498077392578125,
-0.004817962646484375,
0.037322998046875,
-0.024688720703125,
-0.06488037109375,
0.03717041015625,
0.0080718994140625,
0.0733642578125,
-0.0211334228515625,
0.01873779296875,
0.0557861328125,
0.018951416015625,
-0.0129241943359375,
0.0213470458984375,
0.01140594482421875,
-0.060943603515625,
-0.01409149169921875,
-0.0751953125,
-0.00691986083984375,
-0.0034503936767578125,
-0.054290771484375,
0.0087738037109375,
-0.03668212890625,
-0.047943115234375,
0.017608642578125,
0.0203094482421875,
-0.05224609375,
0.034698486328125,
-0.00592803955078125,
0.04693603515625,
-0.065185546875,
0.050689697265625,
0.0506591796875,
-0.03436279296875,
-0.06658935546875,
-0.0095672607421875,
0.0036334991455078125,
-0.037139892578125,
0.048126220703125,
0.0011882781982421875,
-0.001739501953125,
-0.00036072731018066406,
-0.0248870849609375,
-0.054840087890625,
0.07720947265625,
-0.00458526611328125,
-0.052276611328125,
0.0161285400390625,
0.00011020898818969727,
0.051605224609375,
-0.0230255126953125,
0.03411865234375,
0.052642822265625,
0.044036865234375,
0.00383758544921875,
-0.06292724609375,
-0.0009365081787109375,
-0.0284881591796875,
-0.019744873046875,
0.01189422607421875,
-0.0748291015625,
0.08648681640625,
0.00928497314453125,
0.00457763671875,
-0.01277923583984375,
0.031463623046875,
0.01554107666015625,
0.029754638671875,
0.0511474609375,
0.058563232421875,
0.04638671875,
-0.0258941650390625,
0.08258056640625,
-0.02801513671875,
0.057098388671875,
0.0635986328125,
-0.0032482147216796875,
0.0653076171875,
0.015655517578125,
-0.041778564453125,
0.049530029296875,
0.0521240234375,
-0.0206756591796875,
0.041961669921875,
-0.004741668701171875,
-0.017303466796875,
-0.0158538818359375,
0.00942230224609375,
-0.02764892578125,
0.035888671875,
0.01416015625,
-0.038787841796875,
0.0006680488586425781,
-0.00582122802734375,
0.016143798828125,
-0.0025157928466796875,
-0.016754150390625,
0.043121337890625,
0.009307861328125,
-0.0528564453125,
0.06744384765625,
0.0018873214721679688,
0.05926513671875,
-0.035888671875,
0.019744873046875,
-0.01508331298828125,
0.0338134765625,
-0.01824951171875,
-0.047882080078125,
0.00997161865234375,
0.0016002655029296875,
-0.024627685546875,
-0.00005060434341430664,
0.03662109375,
-0.018035888671875,
-0.071044921875,
0.03570556640625,
0.0367431640625,
0.01499176025390625,
-0.0120697021484375,
-0.0765380859375,
0.0108489990234375,
0.02197265625,
-0.040008544921875,
-0.00001233816146850586,
0.018829345703125,
0.0167694091796875,
0.0262451171875,
0.03692626953125,
0.00016760826110839844,
-0.0026531219482421875,
-0.0014476776123046875,
0.061981201171875,
-0.043792724609375,
-0.03277587890625,
-0.06915283203125,
0.05010986328125,
-0.01392364501953125,
-0.02764892578125,
0.07049560546875,
0.05242919921875,
0.052978515625,
-0.00637054443359375,
0.0714111328125,
-0.016845703125,
0.059417724609375,
-0.01751708984375,
0.053009033203125,
-0.0648193359375,
0.0156402587890625,
-0.02606201171875,
-0.11199951171875,
-0.017181396484375,
0.047882080078125,
-0.025787353515625,
0.01378631591796875,
0.050445556640625,
0.05560302734375,
-0.004917144775390625,
0.0012063980102539062,
0.01528167724609375,
0.036590576171875,
0.004764556884765625,
0.036041259765625,
0.04541015625,
-0.05780029296875,
0.041778564453125,
-0.04669189453125,
-0.0380859375,
-0.0210113525390625,
-0.051513671875,
-0.070556640625,
-0.035675048828125,
-0.0325927734375,
-0.04583740234375,
-0.0152587890625,
0.07763671875,
0.0355224609375,
-0.07696533203125,
-0.0227813720703125,
0.01476287841796875,
-0.01433563232421875,
-0.036865234375,
-0.0241241455078125,
0.042144775390625,
-0.0212554931640625,
-0.0758056640625,
0.00838470458984375,
-0.0044097900390625,
-0.005462646484375,
-0.0225067138671875,
-0.01141357421875,
-0.0136566162109375,
0.016082763671875,
0.050811767578125,
-0.0005273818969726562,
-0.046295166015625,
-0.01349639892578125,
0.0124969482421875,
-0.0035686492919921875,
0.018829345703125,
0.0310821533203125,
-0.035003662109375,
0.032958984375,
0.048126220703125,
0.01287841796875,
0.049896240234375,
0.0167694091796875,
0.0079345703125,
-0.0540771484375,
0.01015472412109375,
0.026947021484375,
0.0244903564453125,
0.0220184326171875,
-0.0228729248046875,
0.037567138671875,
0.035858154296875,
-0.054107666015625,
-0.055999755859375,
-0.006168365478515625,
-0.0938720703125,
-0.0286102294921875,
0.079345703125,
-0.01442718505859375,
-0.01528167724609375,
0.01113128662109375,
-0.01352691650390625,
0.0259552001953125,
-0.043670654296875,
0.05633544921875,
0.0606689453125,
-0.01396942138671875,
-0.0016880035400390625,
-0.0408935546875,
0.033447265625,
0.03509521484375,
-0.032684326171875,
-0.01108551025390625,
0.038848876953125,
0.031829833984375,
0.0309600830078125,
0.03253173828125,
-0.0079803466796875,
0.0043487548828125,
0.009002685546875,
0.0223236083984375,
0.007843017578125,
-0.0085906982421875,
-0.01273345947265625,
0.01470947265625,
-0.002353668212890625,
-0.0261077880859375
]
] |
laion/CLIP-ViT-B-32-xlm-roberta-base-laion5B-s13B-b90k | 2022-11-14T16:18:01.000Z | [
"open_clip",
"arxiv:1910.04867",
"license:mit",
"has_space",
"region:us"
] | null | laion | null | null | laion/CLIP-ViT-B-32-xlm-roberta-base-laion5B-s13B-b90k | 8 | 29,137 | open_clip | 2022-11-13T16:37:55 | ---
license: mit
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/cat-dog-music.png
candidate_labels: playing music, playing sports
example_title: Cat & Dog
---
# Model Card for CLIP ViT-B/32 xlm roberta base - LAION-5B
# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Details](#training-details)
4. [Evaluation](#evaluation)
5. [Acknowledgements](#acknowledgements)
6. [Citation](#citation)
7. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
A CLIP ViT-B/32 xlm roberta base model trained on the LAION-5B dataset (https://laion.ai/blog/laion-5b/) using OpenCLIP (https://github.com/mlfoundations/open_clip).
Model training done by Romain Beaumont on the [stability.ai](https://stability.ai/) cluster.
# Uses
## Direct Use
Zero-shot image classification, image and text retrieval, among others.
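A minimal zero-shot classification sketch via OpenCLIP follows; the `xlm-roberta-base-ViT-B-32` model name and `laion5b_s13b_b90k` pretrained tag are assumed from the checkpoint naming, so check the open_clip repository for the exact identifiers:
```python
import torch
import open_clip
from PIL import Image

model, _, preprocess = open_clip.create_model_and_transforms(
    "xlm-roberta-base-ViT-B-32", pretrained="laion5b_s13b_b90k"
)
tokenizer = open_clip.get_tokenizer("xlm-roberta-base-ViT-B-32")

image = preprocess(Image.open("cat.png")).unsqueeze(0)  # any local image
text = tokenizer(["a photo of a cat", "a photo of a dog"])

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    # normalize embeddings before taking cosine similarities
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)

print(probs)  # probability per candidate label
```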
## Downstream Use
Fine-tuning for image classification and other image tasks, linear-probe image classification, image generation guiding and conditioning, among others.
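As a sketch of the linear-probe setting, a small classifier can be trained on frozen image features (identifiers assumed as in the zero-shot sketch above; the label count is a placeholder):
```python
import torch
import open_clip

model, _, preprocess = open_clip.create_model_and_transforms(
    "xlm-roberta-base-ViT-B-32", pretrained="laion5b_s13b_b90k"
)

images = torch.randn(4, 3, 224, 224)  # stand-in for a preprocessed image batch
with torch.no_grad():
    features = model.encode_image(images)  # frozen CLIP image features

probe = torch.nn.Linear(features.shape[-1], 10)  # e.g. 10 target classes
logits = probe(features)
```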
# Training Details
## Training Data
This model was trained with the full LAION-5B (https://laion.ai/blog/laion-5b/).
## Training Procedure
Trained with a batch size of 90k for 13B samples of LAION-5B; see https://wandb.ai/rom1504/open-clip/reports/xlm-roberta-base-B-32--VmlldzoyOTQ5OTE2
The model uses a ViT-B/32 on the visual side and an XLM-RoBERTa base, initialized with pretrained weights, on the text side.
# Evaluation
Evaluation done with code in the [LAION CLIP Benchmark suite](https://github.com/LAION-AI/CLIP_benchmark).
## Testing Data, Factors & Metrics
### Testing Data
Testing is performed with VTAB+ (a combination of VTAB (https://arxiv.org/abs/1910.04867) with additional robustness datasets) for classification, and with COCO and Flickr for retrieval.
## Results
The model achieves:
* ImageNet-1k: 62.33% (vs 62.9% for the baseline)
* MS-COCO: 63.4% (vs 60.8% for the baseline)
* Flickr30k: 86.2% (vs 85.4% for the baseline)
A preliminary multilingual evaluation was also run: 43% on Italian ImageNet-1k (vs 21% for the English B/32) and 37% on Japanese ImageNet-1k (vs 1% for the English B/32 and 50% for the Japanese CLIP B/16). This shows that the expected multilingual capability is indeed present; larger models should perform even better.

# Acknowledgements
Acknowledging [stability.ai](https://stability.ai/) for the compute used to train this model.
# Citation
**BibTeX:**
In addition to the forthcoming LAION-5B (https://laion.ai/blog/laion-5b/) paper, please cite:
OpenAI CLIP paper
```
@inproceedings{Radford2021LearningTV,
title={Learning Transferable Visual Models From Natural Language Supervision},
author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
booktitle={ICML},
year={2021}
}
```
OpenCLIP software
```
@software{ilharco_gabriel_2021_5143773,
author = {Ilharco, Gabriel and
Wortsman, Mitchell and
Wightman, Ross and
Gordon, Cade and
Carlini, Nicholas and
Taori, Rohan and
Dave, Achal and
Shankar, Vaishaal and
Namkoong, Hongseok and
Miller, John and
Hajishirzi, Hannaneh and
Farhadi, Ali and
Schmidt, Ludwig},
title = {OpenCLIP},
month = jul,
year = 2021,
note = {If you use this software, please cite it as below.},
publisher = {Zenodo},
version = {0.1},
doi = {10.5281/zenodo.5143773},
url = {https://doi.org/10.5281/zenodo.5143773}
}
```
# How To Get Started With the Model
https://github.com/mlfoundations/open_clip | 3,900 | [
[
-0.025726318359375,
-0.042572021484375,
0.0200042724609375,
0.01074981689453125,
-0.031585693359375,
-0.0222320556640625,
-0.0290679931640625,
-0.03558349609375,
0.00978851318359375,
0.03460693359375,
-0.029754638671875,
-0.05902099609375,
-0.05450439453125,
-0.0031299591064453125,
-0.0238189697265625,
0.0679931640625,
-0.01654052734375,
0.01456451416015625,
-0.016326904296875,
-0.040985107421875,
-0.04559326171875,
-0.0322265625,
-0.044586181640625,
-0.003864288330078125,
0.0085296630859375,
0.0293121337890625,
0.046478271484375,
0.053436279296875,
0.052703857421875,
0.01983642578125,
-0.0114593505859375,
0.00855255126953125,
-0.0435791015625,
-0.04302978515625,
0.00119781494140625,
-0.03973388671875,
-0.04827880859375,
0.0009450912475585938,
0.040130615234375,
0.011688232421875,
-0.004276275634765625,
0.020477294921875,
-0.010833740234375,
0.035247802734375,
-0.046051025390625,
0.0220184326171875,
-0.038665771484375,
0.004222869873046875,
-0.016357421875,
0.006916046142578125,
-0.031036376953125,
-0.01160430908203125,
0.01329803466796875,
-0.05328369140625,
0.016998291015625,
-0.0145721435546875,
0.11370849609375,
0.00801849365234375,
-0.0197906494140625,
0.004383087158203125,
-0.043426513671875,
0.06671142578125,
-0.055633544921875,
0.025665283203125,
0.0203704833984375,
0.035552978515625,
0.0297393798828125,
-0.060455322265625,
-0.0253143310546875,
-0.0125732421875,
0.015228271484375,
0.029510498046875,
-0.033782958984375,
-0.0107574462890625,
0.0364990234375,
0.01397705078125,
-0.0294342041015625,
-0.004352569580078125,
-0.055938720703125,
0.00327301025390625,
0.030242919921875,
0.0009098052978515625,
0.0322265625,
-0.028961181640625,
-0.0546875,
-0.035491943359375,
-0.042633056640625,
0.0258941650390625,
0.01482391357421875,
-0.0012035369873046875,
-0.047393798828125,
0.02630615234375,
0.0189971923828125,
0.047027587890625,
-0.0016393661499023438,
-0.0279388427734375,
0.031158447265625,
-0.032318115234375,
-0.0144500732421875,
-0.013427734375,
0.0771484375,
0.046661376953125,
0.0156097412109375,
0.0234222412109375,
0.00556182861328125,
-0.014556884765625,
0.01294708251953125,
-0.07427978515625,
-0.02020263671875,
-0.005970001220703125,
-0.0469970703125,
-0.0063018798828125,
0.04083251953125,
-0.044189453125,
0.022735595703125,
-0.017120361328125,
0.036773681640625,
-0.06329345703125,
-0.0058441162109375,
-0.003894805908203125,
-0.00786590576171875,
0.016937255859375,
0.03338623046875,
-0.04498291015625,
0.0186004638671875,
0.0214691162109375,
0.0931396484375,
-0.02020263671875,
-0.03448486328125,
-0.032684326171875,
0.0160064697265625,
-0.022430419921875,
0.04559326171875,
-0.005001068115234375,
-0.017730712890625,
-0.00994110107421875,
0.0232086181640625,
-0.00594329833984375,
-0.037506103515625,
0.0389404296875,
-0.016204833984375,
0.0084075927734375,
-0.00958251953125,
-0.0226898193359375,
-0.040618896484375,
0.0098419189453125,
-0.053863525390625,
0.0640869140625,
0.001338958740234375,
-0.06195068359375,
0.0307769775390625,
-0.038848876953125,
-0.002227783203125,
-0.0140838623046875,
0.003620147705078125,
-0.042938232421875,
-0.0247802734375,
0.044342041015625,
0.0284423828125,
-0.01064300537109375,
0.0230712890625,
-0.04217529296875,
-0.0236968994140625,
0.017669677734375,
-0.028717041015625,
0.07501220703125,
-0.0008311271667480469,
-0.0119476318359375,
0.01971435546875,
-0.055267333984375,
-0.0024242401123046875,
0.01305389404296875,
-0.007259368896484375,
-0.01270294189453125,
-0.0296478271484375,
-0.0014524459838867188,
0.0117034912109375,
0.01451873779296875,
-0.0307464599609375,
0.006206512451171875,
-0.0087432861328125,
0.0230255126953125,
0.049041748046875,
0.003719329833984375,
0.0219879150390625,
-0.022216796875,
0.055938720703125,
0.014068603515625,
0.035247802734375,
-0.0307464599609375,
-0.03765869140625,
-0.060150146484375,
-0.05389404296875,
0.034332275390625,
0.049560546875,
-0.048828125,
0.027099609375,
-0.024993896484375,
-0.0341796875,
-0.0413818359375,
0.01142120361328125,
0.040191650390625,
0.035919189453125,
0.04217529296875,
-0.041015625,
-0.041595458984375,
-0.06817626953125,
0.0203094482421875,
0.00373077392578125,
0.007259368896484375,
0.043243408203125,
0.050048828125,
-0.0033702850341796875,
0.0643310546875,
-0.0526123046875,
-0.039764404296875,
-0.028564453125,
0.0163116455078125,
0.0233917236328125,
0.025299072265625,
0.08062744140625,
-0.053802490234375,
-0.0570068359375,
-0.004741668701171875,
-0.07562255859375,
0.0010423660278320312,
0.0024089813232421875,
-0.020721435546875,
0.012542724609375,
0.0281524658203125,
-0.0458984375,
0.044647216796875,
0.03662109375,
0.00437164306640625,
0.039154052734375,
-0.02099609375,
0.00569915771484375,
-0.0924072265625,
0.01611328125,
0.00679779052734375,
-0.0312347412109375,
-0.030914306640625,
-0.00638580322265625,
0.00713348388671875,
-0.017974853515625,
-0.0550537109375,
0.042938232421875,
-0.048583984375,
-0.0012331008911132812,
-0.006683349609375,
0.0093994140625,
0.00731658935546875,
0.042022705078125,
0.00821685791015625,
0.054656982421875,
0.0648193359375,
-0.036468505859375,
0.0169525146484375,
0.03875732421875,
-0.028961181640625,
0.03369140625,
-0.06744384765625,
0.002590179443359375,
-0.00231170654296875,
0.0081024169921875,
-0.03765869140625,
-0.0176544189453125,
0.0278472900390625,
-0.034759521484375,
0.0304412841796875,
-0.046966552734375,
-0.0169219970703125,
-0.03033447265625,
-0.033935546875,
0.0296630859375,
0.04815673828125,
-0.05206298828125,
0.027435302734375,
0.01849365234375,
-0.000006616115570068359,
-0.047149658203125,
-0.05499267578125,
-0.0034427642822265625,
-0.013275146484375,
-0.052215576171875,
0.036468505859375,
-0.00395965576171875,
0.004695892333984375,
0.015472412109375,
0.018310546875,
-0.0179443359375,
-0.0247039794921875,
0.037506103515625,
0.04254150390625,
-0.016754150390625,
-0.0182037353515625,
-0.01226806640625,
-0.0074920654296875,
-0.016204833984375,
-0.01242828369140625,
0.0254058837890625,
-0.0197601318359375,
-0.0228271484375,
-0.03704833984375,
0.010284423828125,
0.032989501953125,
-0.035552978515625,
0.056304931640625,
0.0682373046875,
-0.01763916015625,
-0.0023250579833984375,
-0.02447509765625,
0.007762908935546875,
-0.0296478271484375,
0.035614013671875,
-0.01074981689453125,
-0.03424072265625,
0.04779052734375,
0.019989013671875,
-0.01538848876953125,
0.029449462890625,
0.0335693359375,
0.006359100341796875,
0.06402587890625,
0.06707763671875,
-0.0031871795654296875,
0.06585693359375,
-0.0672607421875,
-0.0030002593994140625,
-0.065673828125,
-0.007720947265625,
-0.0068817138671875,
-0.0013065338134765625,
-0.02484130859375,
-0.052886962890625,
0.044036865234375,
0.03973388671875,
0.0038051605224609375,
0.044342041015625,
-0.0172576904296875,
0.0201568603515625,
0.04986572265625,
0.02239990234375,
0.00989532470703125,
0.00322723388671875,
-0.009765625,
-0.015411376953125,
-0.060302734375,
-0.039825439453125,
0.08489990234375,
0.0330810546875,
0.061737060546875,
0.0078582763671875,
0.041259765625,
0.000011026859283447266,
0.0108642578125,
-0.052886962890625,
0.04217529296875,
-0.0307159423828125,
-0.0343017578125,
-0.01432037353515625,
-0.0247802734375,
-0.05853271484375,
0.006927490234375,
-0.0131072998046875,
-0.047088623046875,
0.0184478759765625,
0.0192108154296875,
-0.009796142578125,
0.031707763671875,
-0.04754638671875,
0.0736083984375,
-0.0390625,
-0.025299072265625,
0.0024738311767578125,
-0.0489501953125,
0.040740966796875,
0.02154541015625,
-0.002651214599609375,
-0.00823974609375,
0.01172637939453125,
0.059326171875,
-0.052978515625,
0.06341552734375,
-0.01324462890625,
0.0238037109375,
0.0439453125,
-0.0190582275390625,
0.0101470947265625,
0.0150909423828125,
0.01412200927734375,
0.04144287109375,
0.00750732421875,
-0.0184173583984375,
-0.036407470703125,
0.0302276611328125,
-0.061279296875,
-0.0150909423828125,
-0.02435302734375,
-0.0333251953125,
0.0214385986328125,
0.033935546875,
0.048492431640625,
0.04656982421875,
-0.01280975341796875,
0.0216217041015625,
0.04046630859375,
-0.0229949951171875,
0.03228759765625,
0.027130126953125,
-0.0272979736328125,
-0.0628662109375,
0.07025146484375,
0.0213623046875,
0.0258026123046875,
0.020477294921875,
0.004940032958984375,
0.005153656005859375,
-0.030303955078125,
-0.046600341796875,
0.026123046875,
-0.05743408203125,
-0.03033447265625,
-0.033355712890625,
-0.022186279296875,
-0.0265960693359375,
-0.00360107421875,
-0.040374755859375,
-0.0181121826171875,
-0.04766845703125,
-0.007541656494140625,
0.0347900390625,
0.04083251953125,
-0.00582122802734375,
0.02484130859375,
-0.06683349609375,
0.0247802734375,
0.00971221923828125,
0.022125244140625,
0.005706787109375,
-0.0496826171875,
-0.03045654296875,
0.0100555419921875,
-0.0460205078125,
-0.066650390625,
0.04132080078125,
0.025115966796875,
0.0489501953125,
0.04931640625,
-0.01226043701171875,
0.061370849609375,
-0.03253173828125,
0.06793212890625,
0.031890869140625,
-0.06817626953125,
0.045166015625,
-0.03277587890625,
0.0232696533203125,
0.038238525390625,
0.046630859375,
-0.01043701171875,
-0.00994873046875,
-0.047454833984375,
-0.06634521484375,
0.06768798828125,
0.006664276123046875,
-0.0014104843139648438,
0.0097503662109375,
0.0193939208984375,
0.0019254684448242188,
0.006450653076171875,
-0.07708740234375,
-0.00940704345703125,
-0.041412353515625,
-0.01265716552734375,
0.006145477294921875,
-0.0185546875,
-0.0125732421875,
-0.031036376953125,
0.05230712890625,
-0.023406982421875,
0.0367431640625,
0.02252197265625,
-0.0262451171875,
-0.0008907318115234375,
-0.00565338134765625,
0.049560546875,
0.055999755859375,
-0.0360107421875,
-0.0163421630859375,
0.00959014892578125,
-0.03314208984375,
-0.0100860595703125,
0.0036678314208984375,
-0.04541015625,
0.0121002197265625,
0.050872802734375,
0.11083984375,
0.02490234375,
-0.0423583984375,
0.06787109375,
0.0009484291076660156,
-0.0261077880859375,
-0.02691650390625,
0.005584716796875,
-0.00827789306640625,
0.0140380859375,
0.006404876708984375,
0.00847625732421875,
-0.001827239990234375,
-0.042877197265625,
0.0127716064453125,
0.03717041015625,
-0.04541015625,
-0.039398193359375,
0.05712890625,
-0.01157379150390625,
-0.0094146728515625,
0.03741455078125,
-0.007083892822265625,
-0.044647216796875,
0.0513916015625,
0.0482177734375,
0.057952880859375,
0.00848388671875,
0.0254058837890625,
0.056884765625,
0.01287841796875,
-0.013763427734375,
0.01617431640625,
0.01220703125,
-0.0447998046875,
-0.00479888916015625,
-0.0241851806640625,
-0.0311279296875,
0.0018854141235351562,
-0.07769775390625,
0.0472412109375,
-0.03961181640625,
-0.037200927734375,
-0.013916015625,
-0.0225372314453125,
-0.046630859375,
0.004169464111328125,
0.0077667236328125,
0.08001708984375,
-0.052001953125,
0.056732177734375,
0.05963134765625,
-0.07037353515625,
-0.062103271484375,
-0.007251739501953125,
-0.0107574462890625,
-0.04156494140625,
0.048004150390625,
0.0240020751953125,
0.001461029052734375,
-0.0264129638671875,
-0.06585693359375,
-0.07525634765625,
0.110107421875,
0.040863037109375,
-0.0161895751953125,
0.00492095947265625,
-0.0190277099609375,
0.035797119140625,
-0.0318603515625,
0.0290679931640625,
-0.0006856918334960938,
0.0174102783203125,
0.013153076171875,
-0.06878662109375,
-0.01313018798828125,
-0.0211334228515625,
0.0152130126953125,
0.0001964569091796875,
-0.0823974609375,
0.08135986328125,
-0.031707763671875,
-0.01654052734375,
0.0186309814453125,
0.054229736328125,
0.011016845703125,
0.0150909423828125,
0.0211334228515625,
0.048248291015625,
0.032806396484375,
-0.0013523101806640625,
0.07501220703125,
-0.0257110595703125,
0.034576416015625,
0.0872802734375,
-0.00844573974609375,
0.07562255859375,
0.023773193359375,
-0.0135955810546875,
0.04425048828125,
0.02313232421875,
-0.044647216796875,
0.0458984375,
-0.0274200439453125,
0.017913818359375,
-0.00867462158203125,
-0.0128936767578125,
-0.0204925537109375,
0.036651611328125,
-0.0004317760467529297,
-0.035614013671875,
-0.003326416015625,
0.0224761962890625,
0.0009260177612304688,
-0.009063720703125,
-0.01336669921875,
0.044891357421875,
0.0192718505859375,
-0.04132080078125,
0.052459716796875,
0.0089874267578125,
0.0418701171875,
-0.0618896484375,
-0.01248931884765625,
-0.00418853759765625,
0.0256500244140625,
-0.00785064697265625,
-0.05560302734375,
0.01210784912109375,
0.00792694091796875,
-0.02581787109375,
-0.01201629638671875,
0.049102783203125,
-0.0286712646484375,
-0.044189453125,
0.04083251953125,
0.0092315673828125,
0.0157928466796875,
0.020233154296875,
-0.0556640625,
0.013092041015625,
0.005321502685546875,
-0.004009246826171875,
0.01384735107421875,
0.016448974609375,
-0.01898193359375,
0.054779052734375,
0.040863037109375,
0.013092041015625,
0.0208740234375,
0.0032825469970703125,
0.053314208984375,
-0.03704833984375,
-0.033416748046875,
-0.04522705078125,
0.03607177734375,
-0.01849365234375,
-0.03790283203125,
0.0643310546875,
0.043548583984375,
0.07061767578125,
-0.027923583984375,
0.043731689453125,
-0.00522613525390625,
0.02508544921875,
-0.0684814453125,
0.0440673828125,
-0.0430908203125,
0.0038547515869140625,
-0.0423583984375,
-0.0540771484375,
-0.0181121826171875,
0.0472412109375,
-0.0222930908203125,
0.00534820556640625,
0.047607421875,
0.050384521484375,
-0.028472900390625,
-0.02978515625,
0.0160980224609375,
0.014617919921875,
0.0211944580078125,
0.041259765625,
0.03759765625,
-0.05487060546875,
0.057037353515625,
-0.042724609375,
-0.0167999267578125,
-0.01324462890625,
-0.06512451171875,
-0.08447265625,
-0.030487060546875,
-0.0271148681640625,
-0.00989532470703125,
0.0009264945983886719,
0.0589599609375,
0.08135986328125,
-0.056915283203125,
-0.03033447265625,
0.01580810546875,
0.0007181167602539062,
-0.0240020751953125,
-0.0174102783203125,
0.03594970703125,
0.01329803466796875,
-0.052734375,
0.00785064697265625,
0.022735595703125,
0.024810791015625,
-0.0018339157104492188,
-0.01366424560546875,
-0.0300445556640625,
-0.01336669921875,
0.0310516357421875,
0.03045654296875,
-0.052520751953125,
-0.005252838134765625,
0.00463104248046875,
0.0003254413604736328,
0.03472900390625,
0.037200927734375,
-0.034637451171875,
0.03253173828125,
0.03253173828125,
0.017547607421875,
0.06390380859375,
0.0260009765625,
0.031707763671875,
-0.0516357421875,
0.033111572265625,
-0.01149749755859375,
0.037353515625,
0.033843994140625,
-0.009033203125,
0.04095458984375,
0.0254058837890625,
-0.0224761962890625,
-0.0584716796875,
-0.0022029876708984375,
-0.09820556640625,
-0.00492095947265625,
0.06982421875,
-0.045379638671875,
-0.0312347412109375,
0.038726806640625,
-0.034393310546875,
0.02392578125,
-0.0238494873046875,
0.01605224609375,
0.039459228515625,
0.0038585662841796875,
-0.046661376953125,
-0.0535888671875,
0.0156402587890625,
0.01165771484375,
-0.058929443359375,
-0.0276641845703125,
0.033843994140625,
0.035247802734375,
0.0216827392578125,
0.045989990234375,
-0.021209716796875,
0.028961181640625,
-0.0038700103759765625,
0.0312347412109375,
-0.032257080078125,
-0.0462646484375,
-0.032318115234375,
-0.004535675048828125,
-0.0048065185546875,
-0.028045654296875
]
] |
google/vit-base-patch32-224-in21k | 2022-12-08T10:59:40.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"vit",
"feature-extraction",
"vision",
"dataset:imagenet-21k",
"arxiv:2010.11929",
"arxiv:2006.03677",
"license:apache-2.0",
"region:us"
] | feature-extraction | google | null | null | google/vit-base-patch32-224-in21k | 17 | 29,082 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
tags:
- vision
datasets:
- imagenet-21k
inference: false
---
# Vision Transformer (base-sized model)
Vision Transformer (ViT) model pre-trained on ImageNet-21k (14 million images, 21,843 classes) at resolution 224x224. It was introduced in the paper [An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale](https://arxiv.org/abs/2010.11929) by Dosovitskiy et al. and first released in [this repository](https://github.com/google-research/vision_transformer). However, the weights were converted from the [timm repository](https://github.com/rwightman/pytorch-image-models) by Ross Wightman, who already converted the weights from JAX to PyTorch. Credits go to him.
Disclaimer: The team releasing ViT did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
The Vision Transformer (ViT) is a transformer encoder model (BERT-like) pretrained on a large collection of images in a supervised fashion, namely ImageNet-21k, at a resolution of 224x224 pixels.
Images are presented to the model as a sequence of fixed-size patches (resolution 32x32), which are linearly embedded. A [CLS] token is added to the beginning of the sequence for use in classification tasks, and absolute position embeddings are added before the sequence is fed to the layers of the Transformer encoder.
Note that this model does not provide any fine-tuned heads, as these were zero'd by Google researchers. However, the model does include the pre-trained pooler, which can be used for downstream tasks (such as image classification).
By pre-training the model, it learns an inner representation of images that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled images for instance, you can train a standard classifier by placing a linear layer on top of the pre-trained encoder. One typically places a linear layer on top of the [CLS] token, as the last hidden state of this token can be seen as a representation of an entire image.
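A minimal linear-probe sketch of this idea (the classifier head and label count are placeholders, and a random tensor stands in for real processor output):
```python
import torch
from transformers import ViTModel

model = ViTModel.from_pretrained('google/vit-base-patch32-224-in21k')
classifier = torch.nn.Linear(model.config.hidden_size, 10)  # e.g. 10 classes

pixel_values = torch.randn(1, 3, 224, 224)  # stand-in for ViTImageProcessor output
with torch.no_grad():
    outputs = model(pixel_values=pixel_values)

cls_embedding = outputs.last_hidden_state[:, 0]  # [CLS] token representation
logits = classifier(cls_embedding)
```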
## Intended uses & limitations
You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=google/vit) to look for
fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model in PyTorch:
```python
from transformers import ViTImageProcessor, ViTModel
from PIL import Image
import requests
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)
processor = ViTImageProcessor.from_pretrained('google/vit-base-patch32-224-in21k')
model = ViTModel.from_pretrained('google/vit-base-patch32-224-in21k')
inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
last_hidden_state = outputs.last_hidden_state
```
Refer to the [docs](https://huggingface.co/docs/transformers/model_doc/vit) for usage in TensorFlow and JAX/FLAX.
## Training data
The ViT model was pretrained on [ImageNet-21k](http://www.image-net.org/), a dataset consisting of 14 million images and 21k classes.
## Training procedure
### Preprocessing
The exact details of preprocessing of images during training/validation can be found [here](https://github.com/google-research/vision_transformer/blob/master/vit_jax/input_pipeline.py).
Images are resized/rescaled to the same resolution (224x224) and normalized across the RGB channels with mean (0.5, 0.5, 0.5) and standard deviation (0.5, 0.5, 0.5).
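For illustration, an equivalent torchvision transform (a reimplementation sketch, not the exact training pipeline):
```python
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),  # scales pixel values to [0, 1]
    transforms.Normalize(mean=(0.5, 0.5, 0.5), std=(0.5, 0.5, 0.5)),
])
```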
### Pretraining
The model was trained on TPUv3 hardware (8 cores). All model variants are trained with a batch size of 4096 and learning rate warmup of 10k steps. For ImageNet, the authors found it beneficial to additionally apply gradient clipping at global norm 1. Pre-training resolution is 224.
## Evaluation results
For evaluation results on several image classification benchmarks, we refer to tables 2 and 5 of the original paper. Note that for fine-tuning, the best results are obtained with a higher resolution (384x384). Of course, increasing the model size will result in better performance.
### BibTeX entry and citation info
```bibtex
@misc{wu2020visual,
title={Visual Transformers: Token-based Image Representation and Processing for Computer Vision},
author={Bichen Wu and Chenfeng Xu and Xiaoliang Dai and Alvin Wan and Peizhao Zhang and Zhicheng Yan and Masayoshi Tomizuka and Joseph Gonzalez and Kurt Keutzer and Peter Vajda},
year={2020},
eprint={2006.03677},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
```bibtex
@inproceedings{deng2009imagenet,
title={Imagenet: A large-scale hierarchical image database},
author={Deng, Jia and Dong, Wei and Socher, Richard and Li, Li-Jia and Li, Kai and Fei-Fei, Li},
booktitle={2009 IEEE conference on computer vision and pattern recognition},
pages={248--255},
year={2009},
organization={Ieee}
}
``` | 4,947 | [
[
-0.04400634765625,
-0.01629638671875,
0.005584716796875,
-0.006023406982421875,
-0.0325927734375,
-0.01407623291015625,
-0.004665374755859375,
-0.043975830078125,
0.01201629638671875,
0.03460693359375,
-0.0236358642578125,
-0.018280029296875,
-0.06005859375,
-0.00809478759765625,
-0.03814697265625,
0.07122802734375,
-0.006755828857421875,
-0.00021827220916748047,
-0.01534271240234375,
-0.01415252685546875,
-0.0257110595703125,
-0.035064697265625,
-0.046966552734375,
-0.025726318359375,
0.036773681640625,
0.007720947265625,
0.0528564453125,
0.0595703125,
0.059783935546875,
0.03369140625,
-0.00537109375,
-0.00490570068359375,
-0.031341552734375,
-0.022552490234375,
-0.00567626953125,
-0.04296875,
-0.0292205810546875,
0.0164794921875,
0.04669189453125,
0.03192138671875,
0.0169830322265625,
0.0248565673828125,
0.00690460205078125,
0.0308837890625,
-0.047210693359375,
0.01934814453125,
-0.03704833984375,
0.03131103515625,
-0.00894927978515625,
-0.0094757080078125,
-0.0355224609375,
-0.012542724609375,
0.02117919921875,
-0.03802490234375,
0.039703369140625,
-0.00408172607421875,
0.1083984375,
0.01690673828125,
-0.025390625,
0.01824951171875,
-0.056182861328125,
0.055877685546875,
-0.0198516845703125,
0.03369140625,
0.006427764892578125,
0.039306640625,
0.01384735107421875,
-0.0902099609375,
-0.042510986328125,
-0.00327301025390625,
-0.00931549072265625,
0.01325225830078125,
-0.02691650390625,
0.0175323486328125,
0.043060302734375,
0.04827880859375,
-0.026702880859375,
0.0018520355224609375,
-0.04278564453125,
-0.0242462158203125,
0.03558349609375,
-0.0022983551025390625,
0.0131988525390625,
-0.0030193328857421875,
-0.048370361328125,
-0.03900146484375,
-0.0286865234375,
0.01058197021484375,
0.0034923553466796875,
-0.0006895065307617188,
-0.013580322265625,
0.041107177734375,
0.0078887939453125,
0.049468994140625,
0.0280303955078125,
-0.00704193115234375,
0.03753662109375,
-0.017608642578125,
-0.0260772705078125,
-0.01538848876953125,
0.061004638671875,
0.032012939453125,
0.0240936279296875,
-0.0032444000244140625,
-0.0257110595703125,
0.01280975341796875,
0.0390625,
-0.078125,
-0.01219940185546875,
-0.009765625,
-0.05450439453125,
-0.027862548828125,
0.020782470703125,
-0.04248046875,
-0.0135040283203125,
-0.0303955078125,
0.0599365234375,
-0.01416015625,
-0.019439697265625,
-0.012420654296875,
-0.007373809814453125,
0.048065185546875,
0.0306549072265625,
-0.046478271484375,
0.0260162353515625,
0.022857666015625,
0.078125,
-0.005817413330078125,
-0.01800537109375,
-0.00928497314453125,
-0.0228118896484375,
-0.0308837890625,
0.045745849609375,
-0.00899505615234375,
-0.01482391357421875,
0.0014162063598632812,
0.032501220703125,
-0.0023250579833984375,
-0.0380859375,
0.031097412109375,
-0.046905517578125,
0.00902557373046875,
-0.006160736083984375,
-0.0187225341796875,
-0.0192413330078125,
0.011962890625,
-0.052520751953125,
0.07501220703125,
0.0204010009765625,
-0.060791015625,
0.036163330078125,
-0.0389404296875,
-0.007061004638671875,
0.01458740234375,
-0.00345611572265625,
-0.050018310546875,
0.006565093994140625,
0.0203857421875,
0.04205322265625,
-0.0081939697265625,
-0.005664825439453125,
-0.01239013671875,
-0.042083740234375,
0.01508331298828125,
-0.0259246826171875,
0.06158447265625,
0.01424407958984375,
-0.027099609375,
0.0189666748046875,
-0.04296875,
-0.00290679931640625,
0.0178985595703125,
-0.01541900634765625,
0.0008516311645507812,
-0.0284576416015625,
0.0169677734375,
0.0260467529296875,
0.0236053466796875,
-0.059295654296875,
0.01427459716796875,
-0.01212310791015625,
0.0352783203125,
0.05926513671875,
-0.0162200927734375,
0.041900634765625,
-0.0188446044921875,
0.030914306640625,
0.01473236083984375,
0.0396728515625,
-0.026275634765625,
-0.04730224609375,
-0.0806884765625,
-0.016998291015625,
0.027923583984375,
0.029083251953125,
-0.058197021484375,
0.03668212890625,
-0.035125732421875,
-0.04541015625,
-0.0286865234375,
-0.01335906982421875,
0.022857666015625,
0.037933349609375,
0.03656005859375,
-0.0423583984375,
-0.048187255859375,
-0.07080078125,
0.01052093505859375,
0.00043129920959472656,
-0.00732421875,
0.01100921630859375,
0.0579833984375,
-0.0265655517578125,
0.07061767578125,
-0.027679443359375,
-0.0261688232421875,
-0.004161834716796875,
-0.0014562606811523438,
0.0296478271484375,
0.0487060546875,
0.040374755859375,
-0.062469482421875,
-0.028778076171875,
0.0036373138427734375,
-0.057373046875,
0.01910400390625,
-0.002285003662109375,
-0.023651123046875,
0.0016889572143554688,
0.031524658203125,
-0.04803466796875,
0.06500244140625,
0.027374267578125,
-0.01157379150390625,
0.032806396484375,
-0.00928497314453125,
-0.00043201446533203125,
-0.08270263671875,
0.0008077621459960938,
0.01465606689453125,
-0.03411865234375,
-0.04046630859375,
0.019134521484375,
0.016571044921875,
-0.0159454345703125,
-0.03704833984375,
0.025482177734375,
-0.03515625,
-0.01519775390625,
-0.01947021484375,
-0.0292510986328125,
0.0009675025939941406,
0.044219970703125,
0.0059814453125,
0.04473876953125,
0.052520751953125,
-0.0411376953125,
0.04547119140625,
0.0210723876953125,
-0.0294647216796875,
0.039154052734375,
-0.066162109375,
0.0192413330078125,
-0.007793426513671875,
0.0239105224609375,
-0.057373046875,
-0.017608642578125,
0.01204681396484375,
-0.030914306640625,
0.0435791015625,
-0.0278167724609375,
-0.030120849609375,
-0.056915283203125,
-0.0148162841796875,
0.041595458984375,
0.06182861328125,
-0.058624267578125,
0.052032470703125,
0.020477294921875,
0.036834716796875,
-0.051666259765625,
-0.07684326171875,
-0.0035610198974609375,
-0.00891876220703125,
-0.04144287109375,
0.048492431640625,
0.0155487060546875,
0.018768310546875,
0.017120361328125,
0.0004048347473144531,
-0.004505157470703125,
-0.0252685546875,
0.0419921875,
0.0286865234375,
-0.0247039794921875,
-0.0015535354614257812,
-0.032440185546875,
-0.0191650390625,
-0.0011529922485351562,
-0.042083740234375,
0.037811279296875,
-0.037841796875,
-0.023773193359375,
-0.04583740234375,
-0.0005283355712890625,
0.05169677734375,
-0.0186309814453125,
0.0496826171875,
0.0745849609375,
-0.04473876953125,
0.0029506683349609375,
-0.035919189453125,
-0.0098876953125,
-0.038818359375,
0.0279541015625,
-0.0259552001953125,
-0.0455322265625,
0.0537109375,
0.006511688232421875,
-0.007640838623046875,
0.0478515625,
0.034759521484375,
-0.0123291015625,
0.062744140625,
0.0523681640625,
0.00418853759765625,
0.059906005859375,
-0.059722900390625,
0.008453369140625,
-0.0631103515625,
-0.025848388671875,
-0.01424407958984375,
-0.04052734375,
-0.0489501953125,
-0.04022216796875,
0.0263214111328125,
0.00473785400390625,
-0.030242919921875,
0.042205810546875,
-0.054656982421875,
0.02996826171875,
0.061370849609375,
0.044586181640625,
-0.0101470947265625,
0.0157623291015625,
-0.00792694091796875,
0.0032901763916015625,
-0.041900634765625,
-0.01120758056640625,
0.076416015625,
0.04266357421875,
0.053985595703125,
-0.0135345458984375,
0.03857421875,
0.0006833076477050781,
0.0101470947265625,
-0.0716552734375,
0.045440673828125,
-0.01358795166015625,
-0.036956787109375,
-0.00463104248046875,
-0.018096923828125,
-0.081298828125,
0.01174163818359375,
-0.0268402099609375,
-0.049530029296875,
0.034820556640625,
0.0163726806640625,
-0.0115814208984375,
0.043853759765625,
-0.0477294921875,
0.06365966796875,
-0.00893402099609375,
-0.0233917236328125,
0.004451751708984375,
-0.051116943359375,
0.01166534423828125,
0.004302978515625,
-0.0208892822265625,
0.0260772705078125,
0.0184783935546875,
0.0640869140625,
-0.05938720703125,
0.06658935546875,
-0.0236053466796875,
0.027618408203125,
0.032958984375,
-0.0227203369140625,
0.0227813720703125,
-0.0224456787109375,
0.029510498046875,
0.03472900390625,
-0.0037994384765625,
-0.039276123046875,
-0.043853759765625,
0.02899169921875,
-0.07763671875,
-0.03485107421875,
-0.0338134765625,
-0.0236968994140625,
0.01324462890625,
0.02484130859375,
0.05865478515625,
0.0511474609375,
0.0188446044921875,
0.0452880859375,
0.0482177734375,
-0.025177001953125,
0.035919189453125,
-0.016082763671875,
-0.019256591796875,
-0.0175933837890625,
0.06488037109375,
0.0257110595703125,
0.0110015869140625,
0.0288543701171875,
0.0169830322265625,
-0.021636962890625,
-0.03594970703125,
-0.0244140625,
0.00823211669921875,
-0.0645751953125,
-0.03558349609375,
-0.035552978515625,
-0.052642822265625,
-0.0204925537109375,
-0.012237548828125,
-0.03790283203125,
-0.01427459716796875,
-0.0279388427734375,
-0.00666046142578125,
0.032989501953125,
0.05328369140625,
-0.005584716796875,
0.047271728515625,
-0.043121337890625,
0.0122528076171875,
0.04351806640625,
0.035675048828125,
0.00348663330078125,
-0.053070068359375,
-0.028167724609375,
-0.006046295166015625,
-0.0232391357421875,
-0.047882080078125,
0.02716064453125,
0.0170745849609375,
0.043701171875,
0.04937744140625,
-0.0210418701171875,
0.0694580078125,
-0.02996826171875,
0.0570068359375,
0.035064697265625,
-0.058013916015625,
0.037078857421875,
-0.01108551025390625,
0.0210418701171875,
0.01480865478515625,
0.0252227783203125,
-0.0204010009765625,
0.007022857666015625,
-0.058624267578125,
-0.05438232421875,
0.047454833984375,
0.01065826416015625,
0.0188751220703125,
0.0261993408203125,
0.031097412109375,
-0.011474609375,
-0.0032215118408203125,
-0.064208984375,
-0.0166015625,
-0.05316162109375,
-0.0091552734375,
0.00214385986328125,
-0.0128173828125,
0.0032196044921875,
-0.0499267578125,
0.0249481201171875,
-0.0032596588134765625,
0.068359375,
0.01332855224609375,
-0.0243377685546875,
-0.005252838134765625,
-0.0271148681640625,
0.0192413330078125,
0.03155517578125,
-0.0205230712890625,
0.01387786865234375,
0.006927490234375,
-0.0684814453125,
-0.0013027191162109375,
-0.00629425048828125,
-0.008453369140625,
-0.004024505615234375,
0.0406494140625,
0.08642578125,
0.003009796142578125,
-0.0031833648681640625,
0.06658935546875,
-0.00734710693359375,
-0.0257110595703125,
-0.036041259765625,
0.00344085693359375,
-0.02423095703125,
0.0219268798828125,
0.03619384765625,
0.0411376953125,
-0.0015478134155273438,
-0.022003173828125,
0.022186279296875,
0.0216827392578125,
-0.039825439453125,
-0.0302276611328125,
0.05108642578125,
-0.0026092529296875,
-0.003772735595703125,
0.0645751953125,
-0.0006489753723144531,
-0.05145263671875,
0.061248779296875,
0.0382080078125,
0.06378173828125,
-0.00920867919921875,
0.0087127685546875,
0.048431396484375,
0.02484130859375,
-0.00970458984375,
0.00305938720703125,
-0.0049591064453125,
-0.07672119140625,
-0.031097412109375,
-0.044189453125,
-0.00640869140625,
0.021484375,
-0.060821533203125,
0.0279998779296875,
-0.044647216796875,
-0.03326416015625,
0.005344390869140625,
-0.00368499755859375,
-0.0908203125,
0.027069091796875,
0.02557373046875,
0.0628662109375,
-0.05694580078125,
0.060455322265625,
0.058837890625,
-0.044464111328125,
-0.0673828125,
-0.0221099853515625,
-0.0173492431640625,
-0.0673828125,
0.056243896484375,
0.032379150390625,
0.003490447998046875,
0.008148193359375,
-0.06292724609375,
-0.0654296875,
0.09027099609375,
0.0158538818359375,
-0.0218048095703125,
-0.00009733438491821289,
0.003459930419921875,
0.03521728515625,
-0.027191162109375,
0.041107177734375,
0.0040435791015625,
0.01776123046875,
0.0268096923828125,
-0.0595703125,
-0.001567840576171875,
-0.0338134765625,
0.027191162109375,
0.000049591064453125,
-0.042999267578125,
0.08599853515625,
-0.01090240478515625,
-0.017120361328125,
0.002605438232421875,
0.045928955078125,
-0.0100860595703125,
-0.01031494140625,
0.056640625,
0.05401611328125,
0.03369140625,
-0.027191162109375,
0.08197021484375,
0.000014185905456542969,
0.03759765625,
0.043701171875,
0.021728515625,
0.045989990234375,
0.027801513671875,
-0.0219573974609375,
0.03887939453125,
0.068359375,
-0.03887939453125,
0.03912353515625,
0.004329681396484375,
0.006526947021484375,
-0.01224517822265625,
-0.0011749267578125,
-0.03515625,
0.044830322265625,
0.027008056640625,
-0.048187255859375,
-0.00241851806640625,
0.023651123046875,
-0.0244140625,
-0.034454345703125,
-0.0465087890625,
0.036285400390625,
0.0005865097045898438,
-0.0289459228515625,
0.050994873046875,
-0.0129241943359375,
0.0565185546875,
-0.0266571044921875,
-0.00531005859375,
-0.01331329345703125,
0.0290985107421875,
-0.0306243896484375,
-0.06158447265625,
0.01041412353515625,
-0.01126861572265625,
-0.01058197021484375,
-0.0147705078125,
0.0662841796875,
-0.0069732666015625,
-0.04229736328125,
0.0156402587890625,
0.008026123046875,
0.0237274169921875,
-0.006427764892578125,
-0.050201416015625,
-0.005825042724609375,
-0.013824462890625,
-0.028717041015625,
0.0181732177734375,
0.0224456787109375,
-0.0117645263671875,
0.038665771484375,
0.050933837890625,
-0.00045108795166015625,
0.03045654296875,
0.0011491775512695312,
0.07476806640625,
-0.039459228515625,
-0.038330078125,
-0.03936767578125,
0.0469970703125,
-0.014495849609375,
-0.027069091796875,
0.040283203125,
0.0263214111328125,
0.083740234375,
-0.025146484375,
0.038482666015625,
-0.007350921630859375,
0.00015282630920410156,
-0.024993896484375,
0.04058837890625,
-0.04449462890625,
-0.021484375,
-0.026123046875,
-0.0775146484375,
-0.033447265625,
0.0697021484375,
-0.01251983642578125,
0.0171051025390625,
0.041839599609375,
0.055877685546875,
-0.0198822021484375,
-0.0114288330078125,
0.029144287109375,
0.0157470703125,
0.007656097412109375,
0.034820556640625,
0.06353759765625,
-0.059478759765625,
0.038970947265625,
-0.0384521484375,
-0.020263671875,
-0.0172119140625,
-0.050994873046875,
-0.06982421875,
-0.05523681640625,
-0.029083251953125,
-0.03759765625,
-0.019622802734375,
0.05120849609375,
0.08795166015625,
-0.061126708984375,
0.00010412931442260742,
-0.01373291015625,
-0.018798828125,
-0.0203094482421875,
-0.01485443115234375,
0.034698486328125,
-0.0037174224853515625,
-0.058380126953125,
-0.01486968994140625,
0.0013246536254882812,
0.0177764892578125,
-0.02203369140625,
-0.0027256011962890625,
0.0006814002990722656,
-0.027313232421875,
0.048492431640625,
0.02093505859375,
-0.046478271484375,
-0.038299560546875,
-0.0002727508544921875,
-0.00521087646484375,
0.0231170654296875,
0.05035400390625,
-0.06695556640625,
0.037445068359375,
0.042083740234375,
0.0430908203125,
0.07208251953125,
-0.006305694580078125,
0.0166015625,
-0.057098388671875,
0.033538818359375,
0.007354736328125,
0.047515869140625,
0.0194854736328125,
-0.026336669921875,
0.03790283203125,
0.0295867919921875,
-0.043121337890625,
-0.05474853515625,
0.0067901611328125,
-0.0938720703125,
-0.00024056434631347656,
0.0694580078125,
-0.027618408203125,
-0.03704833984375,
0.0134735107421875,
-0.01119232177734375,
0.041229248046875,
-0.0035495758056640625,
0.025390625,
0.0244598388671875,
0.004177093505859375,
-0.04730224609375,
-0.0305328369140625,
0.0199127197265625,
-0.0028400421142578125,
-0.033203125,
-0.046234130859375,
0.005481719970703125,
0.017791748046875,
0.040130615234375,
0.022613525390625,
-0.028411865234375,
0.01270294189453125,
0.0185546875,
0.0305328369140625,
-0.00838470458984375,
-0.0278472900390625,
-0.0247802734375,
0.008453369140625,
-0.0174560546875,
-0.0550537109375
]
] |
timm/sebotnet33ts_256.a1h_in1k | 2023-04-26T16:12:15.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:2101.11605",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/sebotnet33ts_256.a1h_in1k | 0 | 29,047 | timm | 2023-04-26T16:12:04 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for sebotnet33ts_256.a1h_in1k
A BotNet image classification model (with Squeeze-and-Excitation channel attention, based on ResNet architecture). Trained on ImageNet-1k in `timm` by Ross Wightman.
NOTE: this model did not adhere to any specific paper configuration; it was tuned for reasonable training times and a reduced frequency of self-attention blocks.
Recipe details:
* Based on [ResNet Strikes Back](https://arxiv.org/abs/2110.00476) `A1` recipe
* LAMB optimizer
* Stronger dropout, stochastic depth, and RandAugment than paper `A1` recipe
* Cosine LR schedule with warmup
This model architecture is implemented using `timm`'s flexible [BYOBNet (Bring-Your-Own-Blocks Network)](https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py).
BYOB (with BYOANet attention-specific blocks) allows configuration of:
* block / stage layout
* block-type interleaving
* stem layout
* output stride (dilation)
* activation and norm layers
* channel and spatial / self-attention layers
...and also includes `timm` features common to many other architectures (see the sketch after this list), including:
* stochastic depth
* gradient checkpointing
* layer-wise LR decay
* per-stage feature extraction
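Several of the features above map to `create_model` keyword arguments and model methods. A minimal sketch; `drop_path_rate`, `features_only`/`out_indices`, and `set_grad_checkpointing` are general `timm` conventions assumed, not confirmed by this card, to apply to this model:
```python
import timm

# Stochastic depth via the drop_path_rate kwarg (general timm convention).
model = timm.create_model('sebotnet33ts_256.a1h_in1k', pretrained=True, drop_path_rate=0.1)

# Gradient checkpointing trades compute for memory during training.
model.set_grad_checkpointing(True)

# Per-stage feature extraction with an explicit choice of stages.
feat_model = timm.create_model(
    'sebotnet33ts_256.a1h_in1k',
    pretrained=True,
    features_only=True,
    out_indices=(2, 3, 4),  # return features from the last three stages
)
```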
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 13.7
- GMACs: 3.9
- Activations (M): 17.5
- Image size: 256 x 256
- **Papers:**
- Bottleneck Transformers for Visual Recognition: https://arxiv.org/abs/2101.11605
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed below for torch.topk
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('sebotnet33ts_256.a1h_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
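To inspect the predictions, the top-5 tensors can be unpacked directly; mapping class indices to human-readable labels is dataset-specific and not shown here:
```python
for prob, idx in zip(top5_probabilities[0], top5_class_indices[0]):
    print(f"class {idx.item()}: {prob.item():.2f}%")
```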
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'sebotnet33ts_256.a1h_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 32, 128, 128])
# torch.Size([1, 256, 64, 64])
# torch.Size([1, 512, 32, 32])
# torch.Size([1, 1024, 16, 16])
# torch.Size([1, 1280, 8, 8])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'sebotnet33ts_256.a1h_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1280, 8, 8) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
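If you only need a pooled vector from the unpooled `forward_features` output, global average pooling over the spatial dimensions is a simple alternative to `forward_head` (a sketch, not necessarily this model's configured pooling):
```python
unpooled = model.forward_features(transforms(img).unsqueeze(0))  # (1, 1280, 8, 8)
pooled = unpooled.mean(dim=(2, 3))  # (1, 1280)
```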
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@article{Srinivas2021BottleneckTF,
title={Bottleneck Transformers for Visual Recognition},
author={A. Srinivas and Tsung-Yi Lin and Niki Parmar and Jonathon Shlens and P. Abbeel and Ashish Vaswani},
journal={2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2021},
pages={16514-16524}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 5,416 | [
[
-0.03704833984375,
-0.04095458984375,
0.0011167526245117188,
0.007656097412109375,
-0.0189666748046875,
-0.0257415771484375,
-0.01739501953125,
-0.030792236328125,
0.02288818359375,
0.04071044921875,
-0.03558349609375,
-0.0430908203125,
-0.052154541015625,
-0.004604339599609375,
-0.0106658935546875,
0.0650634765625,
-0.00525665283203125,
-0.0068817138671875,
-0.01032257080078125,
-0.0399169921875,
-0.0189361572265625,
-0.028900146484375,
-0.0692138671875,
-0.0312347412109375,
0.02935791015625,
0.0117034912109375,
0.033782958984375,
0.048980712890625,
0.052215576171875,
0.038177490234375,
-0.007656097412109375,
0.006160736083984375,
-0.0181884765625,
-0.01336669921875,
0.020294189453125,
-0.041534423828125,
-0.0318603515625,
0.0172576904296875,
0.040740966796875,
0.0210418701171875,
0.004596710205078125,
0.035430908203125,
0.0050506591796875,
0.044403076171875,
-0.0172119140625,
0.00388336181640625,
-0.03643798828125,
0.015655517578125,
-0.0160369873046875,
-0.002696990966796875,
-0.02984619140625,
-0.0205841064453125,
0.0166473388671875,
-0.03033447265625,
0.039398193359375,
0.0044708251953125,
0.09942626953125,
0.0211639404296875,
0.0008182525634765625,
0.0029926300048828125,
-0.013671875,
0.057647705078125,
-0.058197021484375,
0.021209716796875,
0.01546478271484375,
0.019073486328125,
-0.00675201416015625,
-0.07403564453125,
-0.039520263671875,
-0.01450347900390625,
-0.00617218017578125,
0.0070343017578125,
-0.018585205078125,
-0.0015535354614257812,
0.02935791015625,
0.016754150390625,
-0.03424072265625,
0.00592803955078125,
-0.045867919921875,
-0.01332855224609375,
0.0411376953125,
0.0003533363342285156,
0.0241851806640625,
-0.0257110595703125,
-0.044464111328125,
-0.0289459228515625,
-0.0251617431640625,
0.0211639404296875,
0.01934814453125,
0.017730712890625,
-0.045654296875,
0.0255279541015625,
0.01291656494140625,
0.040924072265625,
0.004638671875,
-0.01806640625,
0.04730224609375,
0.0061492919921875,
-0.034759521484375,
-0.0162200927734375,
0.08026123046875,
0.024017333984375,
0.019256591796875,
0.01052093505859375,
-0.01580810546875,
-0.02392578125,
0.007648468017578125,
-0.07733154296875,
-0.0284271240234375,
0.02130126953125,
-0.052001953125,
-0.031402587890625,
0.01412200927734375,
-0.04827880859375,
-0.01311492919921875,
-0.0019321441650390625,
0.0443115234375,
-0.046295166015625,
-0.0233612060546875,
0.0009312629699707031,
-0.0198822021484375,
0.025390625,
0.0223541259765625,
-0.03594970703125,
0.016387939453125,
0.020111083984375,
0.08746337890625,
-0.00032138824462890625,
-0.0345458984375,
-0.0160980224609375,
-0.0256500244140625,
-0.016143798828125,
0.0396728515625,
-0.0046844482421875,
-0.0025691986083984375,
-0.0256195068359375,
0.0306243896484375,
-0.0116729736328125,
-0.057464599609375,
0.00691986083984375,
-0.0236053466796875,
0.014434814453125,
0.0031604766845703125,
-0.01812744140625,
-0.0428466796875,
0.0270843505859375,
-0.04010009765625,
0.0894775390625,
0.020263671875,
-0.060760498046875,
0.0185394287109375,
-0.04180908203125,
-0.0188446044921875,
-0.0161895751953125,
-0.005199432373046875,
-0.082275390625,
0.00031495094299316406,
0.0160369873046875,
0.0557861328125,
-0.02459716796875,
0.0011053085327148438,
-0.047210693359375,
-0.0193023681640625,
0.0318603515625,
-0.007472991943359375,
0.07916259765625,
0.01325225830078125,
-0.03448486328125,
0.016754150390625,
-0.046051025390625,
0.01152801513671875,
0.048553466796875,
-0.0159759521484375,
-0.0011091232299804688,
-0.04669189453125,
0.010406494140625,
0.0249481201171875,
0.01216888427734375,
-0.036224365234375,
0.01271820068359375,
-0.0164337158203125,
0.041717529296875,
0.057464599609375,
-0.00774383544921875,
0.0195770263671875,
-0.032257080078125,
0.0259857177734375,
0.0220489501953125,
0.0213470458984375,
-0.012939453125,
-0.0347900390625,
-0.06939697265625,
-0.037017822265625,
0.0285186767578125,
0.028656005859375,
-0.035369873046875,
0.036956787109375,
-0.01076507568359375,
-0.053253173828125,
-0.036834716796875,
0.005558013916015625,
0.037689208984375,
0.04217529296875,
0.0279388427734375,
-0.0435791015625,
-0.043487548828125,
-0.07177734375,
0.0017576217651367188,
0.00463104248046875,
0.00909423828125,
0.03302001953125,
0.05401611328125,
-0.007427215576171875,
0.048187255859375,
-0.026092529296875,
-0.0220489501953125,
-0.0223388671875,
0.0088653564453125,
0.033416748046875,
0.058441162109375,
0.05712890625,
-0.044464111328125,
-0.039306640625,
-0.0094451904296875,
-0.0660400390625,
0.00811767578125,
-0.01219940185546875,
-0.00966644287109375,
0.0289154052734375,
0.01401519775390625,
-0.037384033203125,
0.048675537109375,
0.01447296142578125,
-0.01108551025390625,
0.0307464599609375,
-0.01546478271484375,
0.0176544189453125,
-0.0867919921875,
0.01117706298828125,
0.0293121337890625,
-0.017730712890625,
-0.031585693359375,
0.005413055419921875,
0.004878997802734375,
-0.01105499267578125,
-0.037750244140625,
0.050262451171875,
-0.04241943359375,
-0.0191497802734375,
-0.00878143310546875,
-0.0157928466796875,
-0.0025482177734375,
0.06597900390625,
0.0006465911865234375,
0.034759521484375,
0.057037353515625,
-0.037017822265625,
0.033233642578125,
0.02099609375,
-0.01015472412109375,
0.030731201171875,
-0.057830810546875,
0.0231170654296875,
0.0011186599731445312,
0.0101165771484375,
-0.08013916015625,
-0.01776123046875,
0.031524658203125,
-0.0501708984375,
0.048614501953125,
-0.036102294921875,
-0.028900146484375,
-0.047698974609375,
-0.03564453125,
0.027435302734375,
0.056854248046875,
-0.05889892578125,
0.0384521484375,
0.0226898193359375,
0.0230255126953125,
-0.039306640625,
-0.0594482421875,
-0.01490020751953125,
-0.0330810546875,
-0.04925537109375,
0.0223236083984375,
0.01287078857421875,
0.01415252685546875,
0.006183624267578125,
-0.01165008544921875,
-0.005615234375,
-0.007083892822265625,
0.04693603515625,
0.0267181396484375,
-0.01513671875,
-0.005161285400390625,
-0.03271484375,
-0.0027179718017578125,
-0.00049591064453125,
-0.0191650390625,
0.053741455078125,
-0.0224151611328125,
-0.01537322998046875,
-0.06475830078125,
-0.00652313232421875,
0.03521728515625,
-0.0113372802734375,
0.05120849609375,
0.0780029296875,
-0.036956787109375,
-0.0025463104248046875,
-0.04412841796875,
-0.02349853515625,
-0.03826904296875,
0.037933349609375,
-0.030242919921875,
-0.034759521484375,
0.0643310546875,
-0.005809783935546875,
-0.004169464111328125,
0.04730224609375,
0.0265045166015625,
-0.01294708251953125,
0.05267333984375,
0.04681396484375,
0.00975799560546875,
0.05889892578125,
-0.0697021484375,
-0.01690673828125,
-0.06622314453125,
-0.0350341796875,
-0.023223876953125,
-0.051483154296875,
-0.048553466796875,
-0.0302581787109375,
0.0341796875,
0.007251739501953125,
-0.03436279296875,
0.04327392578125,
-0.06256103515625,
0.00390625,
0.05908203125,
0.0472412109375,
-0.030609130859375,
0.0251312255859375,
-0.01322174072265625,
-0.00433349609375,
-0.05999755859375,
-0.017913818359375,
0.07977294921875,
0.03887939453125,
0.054595947265625,
-0.015960693359375,
0.061920166015625,
-0.00806427001953125,
0.023681640625,
-0.042083740234375,
0.051422119140625,
-0.0079193115234375,
-0.0360107421875,
-0.01486968994140625,
-0.031494140625,
-0.0810546875,
0.0000635385513305664,
-0.01568603515625,
-0.058990478515625,
0.006641387939453125,
0.01282501220703125,
-0.0178985595703125,
0.05218505859375,
-0.0638427734375,
0.0728759765625,
-0.017547607421875,
-0.030914306640625,
-0.0019311904907226562,
-0.057373046875,
0.01593017578125,
0.015716552734375,
-0.0240936279296875,
-0.008056640625,
0.01322174072265625,
0.080810546875,
-0.04693603515625,
0.07342529296875,
-0.0286865234375,
0.032073974609375,
0.0389404296875,
-0.01166534423828125,
0.026153564453125,
-0.011810302734375,
-0.0006871223449707031,
0.033294677734375,
-0.0035533905029296875,
-0.033355712890625,
-0.041778564453125,
0.03668212890625,
-0.07647705078125,
-0.0350341796875,
-0.0272369384765625,
-0.03106689453125,
0.0196685791015625,
0.01187896728515625,
0.0406494140625,
0.051239013671875,
0.0196075439453125,
0.01535797119140625,
0.042694091796875,
-0.040618896484375,
0.031005859375,
-0.0007843971252441406,
-0.010162353515625,
-0.043914794921875,
0.0667724609375,
0.01953125,
0.00970458984375,
0.015350341796875,
0.01157379150390625,
-0.0227203369140625,
-0.033172607421875,
-0.017913818359375,
0.0311279296875,
-0.050323486328125,
-0.04052734375,
-0.04364013671875,
-0.043914794921875,
-0.03912353515625,
-0.006031036376953125,
-0.037750244140625,
-0.0222320556640625,
-0.0243682861328125,
0.023681640625,
0.053436279296875,
0.04022216796875,
-0.008941650390625,
0.04400634765625,
-0.040924072265625,
0.0117340087890625,
0.010162353515625,
0.033782958984375,
-0.0035915374755859375,
-0.072021484375,
-0.017425537109375,
-0.01666259765625,
-0.0360107421875,
-0.06231689453125,
0.03948974609375,
0.00791168212890625,
0.03759765625,
0.0239410400390625,
-0.01654052734375,
0.062347412109375,
-0.008209228515625,
0.034454345703125,
0.035552978515625,
-0.047698974609375,
0.052703857421875,
0.0011568069458007812,
0.0165863037109375,
0.0042724609375,
0.0189056396484375,
-0.02587890625,
-0.00612640380859375,
-0.07989501953125,
-0.051239013671875,
0.07183837890625,
0.0012407302856445312,
-0.00469970703125,
0.02056884765625,
0.049835205078125,
0.002994537353515625,
0.0042877197265625,
-0.061004638671875,
-0.029266357421875,
-0.0208892822265625,
-0.0181732177734375,
0.005298614501953125,
-0.0030384063720703125,
-0.0023441314697265625,
-0.045928955078125,
0.050140380859375,
-0.0008988380432128906,
0.061431884765625,
0.0282135009765625,
-0.006862640380859375,
-0.0020046234130859375,
-0.028106689453125,
0.0276336669921875,
0.0149078369140625,
-0.0220184326171875,
0.0116729736328125,
0.0243072509765625,
-0.0377197265625,
0.004627227783203125,
0.00980377197265625,
-0.003650665283203125,
-0.0013751983642578125,
0.037750244140625,
0.07025146484375,
-0.004673004150390625,
-0.005764007568359375,
0.03192138671875,
0.00141143798828125,
-0.0257415771484375,
-0.02325439453125,
0.013702392578125,
-0.01259613037109375,
0.0413818359375,
0.0138397216796875,
0.035369873046875,
-0.0108489990234375,
-0.0188446044921875,
0.0216064453125,
0.04132080078125,
-0.0224609375,
-0.030242919921875,
0.051971435546875,
-0.01238250732421875,
-0.0233612060546875,
0.062744140625,
-0.005889892578125,
-0.0445556640625,
0.0870361328125,
0.039459228515625,
0.0736083984375,
-0.0034198760986328125,
0.0039825439453125,
0.059417724609375,
0.0201416015625,
0.005489349365234375,
0.0185089111328125,
0.0087738037109375,
-0.057037353515625,
0.0092010498046875,
-0.03466796875,
0.00675201416015625,
0.034759521484375,
-0.03521728515625,
0.02947998046875,
-0.06146240234375,
-0.040252685546875,
0.007419586181640625,
0.0215301513671875,
-0.0787353515625,
0.0174560546875,
0.0003528594970703125,
0.0660400390625,
-0.051300048828125,
0.0535888671875,
0.07025146484375,
-0.043426513671875,
-0.0751953125,
-0.00926971435546875,
-0.00016295909881591797,
-0.08203125,
0.047027587890625,
0.04168701171875,
0.0088958740234375,
0.006031036376953125,
-0.06549072265625,
-0.047119140625,
0.10296630859375,
0.03668212890625,
-0.0097198486328125,
0.0208587646484375,
-0.011138916015625,
0.01812744140625,
-0.0276336669921875,
0.034576416015625,
0.01338958740234375,
0.0209197998046875,
0.02093505859375,
-0.04937744140625,
0.0208282470703125,
-0.0258941650390625,
0.01152801513671875,
0.013031005859375,
-0.0714111328125,
0.06524658203125,
-0.03936767578125,
-0.015655517578125,
0.0043487548828125,
0.049835205078125,
0.02020263671875,
0.0159912109375,
0.039215087890625,
0.065673828125,
0.039825439453125,
-0.0182342529296875,
0.06304931640625,
-0.005840301513671875,
0.04364013671875,
0.053253173828125,
0.0254364013671875,
0.041839599609375,
0.02923583984375,
-0.022857666015625,
0.01922607421875,
0.0823974609375,
-0.032623291015625,
0.0258331298828125,
0.0152587890625,
0.01372528076171875,
-0.01485443115234375,
0.00278472900390625,
-0.033660888671875,
0.037261962890625,
0.00902557373046875,
-0.044097900390625,
-0.009033203125,
0.005229949951171875,
-0.008056640625,
-0.0229644775390625,
-0.01036834716796875,
0.038238525390625,
0.0027256011962890625,
-0.0290985107421875,
0.061187744140625,
0.0019683837890625,
0.061187744140625,
-0.0257415771484375,
0.004764556884765625,
-0.02142333984375,
0.015228271484375,
-0.01702880859375,
-0.05926513671875,
0.0157928466796875,
-0.0195159912109375,
0.0022335052490234375,
0.0007033348083496094,
0.06048583984375,
-0.0175933837890625,
-0.0292816162109375,
0.0161895751953125,
0.026885986328125,
0.043792724609375,
0.00589752197265625,
-0.0928955078125,
0.018310546875,
-0.0005197525024414062,
-0.043426513671875,
0.0155181884765625,
0.035614013671875,
0.01068115234375,
0.059661865234375,
0.0362548828125,
-0.00380706787109375,
0.01033782958984375,
-0.0083160400390625,
0.061248779296875,
-0.046875,
-0.0229644775390625,
-0.06475830078125,
0.0389404296875,
-0.015960693359375,
-0.040008544921875,
0.0404052734375,
0.04058837890625,
0.0682373046875,
-0.00911712646484375,
0.037506103515625,
-0.020843505859375,
-0.0003437995910644531,
-0.0294342041015625,
0.045074462890625,
-0.061920166015625,
-0.0005679130554199219,
-0.015869140625,
-0.0491943359375,
-0.0231781005859375,
0.05615234375,
-0.0241851806640625,
0.03436279296875,
0.034393310546875,
0.0740966796875,
-0.0243377685546875,
-0.02899169921875,
0.0158233642578125,
0.009765625,
0.0050048828125,
0.040618896484375,
0.040130615234375,
-0.058197021484375,
0.0266265869140625,
-0.042022705078125,
-0.01212310791015625,
-0.018798828125,
-0.043792724609375,
-0.082763671875,
-0.0682373046875,
-0.047119140625,
-0.048583984375,
-0.009674072265625,
0.071533203125,
0.0848388671875,
-0.050689697265625,
-0.01025390625,
0.0008716583251953125,
0.01168060302734375,
-0.02227783203125,
-0.0171661376953125,
0.0419921875,
-0.0221710205078125,
-0.05596923828125,
-0.0277252197265625,
0.01512908935546875,
0.034088134765625,
0.0021877288818359375,
-0.0172271728515625,
-0.01110076904296875,
-0.0182342529296875,
0.01215362548828125,
0.0281524658203125,
-0.05517578125,
-0.0181732177734375,
-0.0082550048828125,
-0.01282501220703125,
0.0350341796875,
0.03643798828125,
-0.044189453125,
0.024383544921875,
0.030609130859375,
0.036956787109375,
0.061614990234375,
-0.0213470458984375,
0.0027618408203125,
-0.06524658203125,
0.040130615234375,
-0.012420654296875,
0.03857421875,
0.027618408203125,
-0.02508544921875,
0.049835205078125,
0.039520263671875,
-0.03564453125,
-0.0625,
-0.0021610260009765625,
-0.0859375,
-0.01271820068359375,
0.0634765625,
-0.0238494873046875,
-0.048370361328125,
0.033966064453125,
-0.00649261474609375,
0.0531005859375,
-0.01087188720703125,
0.0465087890625,
0.0160980224609375,
-0.01025390625,
-0.05377197265625,
-0.037506103515625,
0.0256195068359375,
0.0188140869140625,
-0.04718017578125,
-0.031494140625,
0.0005006790161132812,
0.04840087890625,
0.0232696533203125,
0.044525146484375,
-0.0102996826171875,
0.010498046875,
0.004215240478515625,
0.0389404296875,
-0.027618408203125,
-0.0177154541015625,
-0.0258331298828125,
0.0011415481567382812,
-0.01081085205078125,
-0.049530029296875
]
] |
allegro/herbert-klej-cased-v1 | 2021-05-28T16:18:22.000Z | [
"transformers",
"pytorch",
"jax",
"roberta",
"pl",
"arxiv:2005.00630",
"endpoints_compatible",
"region:us"
] | null | allegro | null | null | allegro/herbert-klej-cased-v1 | 5 | 29,016 | transformers | 2022-03-02T23:29:05 | ---
language: pl
---
# HerBERT
**[HerBERT](https://en.wikipedia.org/wiki/Zbigniew_Herbert)** is a BERT-based language model trained on Polish corpora
using only the MLM objective with dynamic whole-word masking. For more details, please refer to:
[KLEJ: Comprehensive Benchmark for Polish Language Understanding](https://arxiv.org/abs/2005.00630).
## Dataset
The **HerBERT** training dataset is a combination of several publicly available corpora for the Polish language:
| Corpus | Tokens | Texts |
| :------ | ------: | ------: |
| [OSCAR](https://traces1.inria.fr/oscar/)| 6710M | 145M |
| [Open Subtitles](http://opus.nlpl.eu/OpenSubtitles-v2018.php) | 1084M | 1.1M |
| [Wikipedia](https://dumps.wikimedia.org/) | 260M | 1.5M |
| [Wolne Lektury](https://wolnelektury.pl/) | 41M | 5.5k |
| [Allegro Articles](https://allegro.pl/artykuly) | 18M | 33k |
## Tokenizer
The training dataset was tokenized into subwords using the [HerBERT Tokenizer](https://huggingface.co/allegro/herbert-klej-cased-tokenizer-v1), a character-level byte-pair encoding with
a vocabulary size of 50k tokens. The tokenizer itself was trained on [Wolne Lektury](https://wolnelektury.pl/) and a publicly available subset of the
[National Corpus of Polish](http://nkjp.pl/index.php?page=14&lang=0) with the [fastBPE](https://github.com/glample/fastBPE) library.
The tokenizer utilizes the `XLMTokenizer` implementation; for that reason, it should be loaded as `allegro/herbert-klej-cased-tokenizer-v1`.
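For a quick look at the subword segmentation, the tokenizer can be inspected directly (reusing the example sentence from the usage section below):
```python
from transformers import XLMTokenizer

tokenizer = XLMTokenizer.from_pretrained("allegro/herbert-klej-cased-tokenizer-v1")
print(tokenizer.tokenize("Kto ma lepszą sztukę, ma lepszy rząd – to jasne."))
print(tokenizer.encode("Kto ma lepszą sztukę, ma lepszy rząd – to jasne."))
```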
## HerBERT models summary
| Model | WWM | Cased | Tokenizer | Vocab Size | Batch Size | Train Steps |
| :------ | ------: | ------: | ------: | ------: | ------: | ------: |
| herbert-klej-cased-v1 | YES | YES | BPE | 50K | 570 | 180k |
## Model evaluation
HerBERT was evaluated on the [KLEJ](https://klejbenchmark.com/) benchmark, a publicly available set of nine evaluation tasks for Polish language understanding.
It achieved the best average performance and obtained the best results on three of the tasks.
| Model | Average | NKJP-NER | CDSC-E | CDSC-R | CBD | PolEmo2.0-IN | PolEmo2.0-OUT | DYK | PSC | AR |
| :------ | ------: | ------: | ------: | ------: | ------: | ------: | ------: | ------: | ------: | ------: |
| herbert-klej-cased-v1 | **80.5** | 92.7 | 92.5 | 91.9 | **50.3** | **89.2** |**76.3** |52.1 |95.3 | 84.5 |
The full leaderboard is available [online](https://klejbenchmark.com/leaderboard).
## HerBERT usage
Model training and experiments were conducted with version 2.0 of the [transformers](https://github.com/huggingface/transformers) library.
Example code:
```python
from transformers import XLMTokenizer, RobertaModel
tokenizer = XLMTokenizer.from_pretrained("allegro/herbert-klej-cased-tokenizer-v1")
model = RobertaModel.from_pretrained("allegro/herbert-klej-cased-v1")
encoded_input = tokenizer.encode("Kto ma lepszą sztukę, ma lepszy rząd – to jasne.", return_tensors='pt')
outputs = model(encoded_input)
```
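In transformers 2.x, `RobertaModel` returns a tuple whose first element is the sequence of hidden states; a short sketch of extracting embeddings from `outputs`:
```python
last_hidden_state = outputs[0]  # shape: (batch_size, seq_len, hidden_size)
# Naive mean pooling over tokens to obtain one vector per sentence (a sketch).
sentence_embedding = last_hidden_state.mean(dim=1)
```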
HerBERT can also be loaded using `AutoTokenizer` and `AutoModel`:
```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("allegro/herbert-klej-cased-tokenizer-v1")
model = AutoModel.from_pretrained("allegro/herbert-klej-cased-v1")
```
## License
CC BY-SA 4.0
## Citation
If you use this model, please cite the following paper:
```
@inproceedings{rybak-etal-2020-klej,
title = "{KLEJ}: Comprehensive Benchmark for {P}olish Language Understanding",
author = "Rybak, Piotr and
Mroczkowski, Robert and
Tracz, Janusz and
Gawlik, Ireneusz",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.acl-main.111",
doi = "10.18653/v1/2020.acl-main.111",
pages = "1191--1201",
}
```
## Authors
The model was trained by **Allegro Machine Learning Research** team.
You can contact us at: <a href="mailto:klejbenchmark@allegro.pl">klejbenchmark@allegro.pl</a>
| 4,007 | [
[
-0.034271240234375,
-0.056396484375,
0.031829833984375,
0.017913818359375,
-0.0166778564453125,
-0.003772735595703125,
-0.044342041015625,
-0.0372314453125,
0.003063201904296875,
0.017303466796875,
-0.051239013671875,
-0.04638671875,
-0.05902099609375,
0.007511138916015625,
-0.0228729248046875,
0.0782470703125,
-0.018890380859375,
0.018951416015625,
0.005207061767578125,
-0.0159759521484375,
0.002170562744140625,
-0.044586181640625,
-0.035400390625,
-0.02545166015625,
0.0192108154296875,
0.00933074951171875,
0.03857421875,
0.031219482421875,
0.0279541015625,
0.0259857177734375,
-0.00942230224609375,
-0.003448486328125,
-0.0207366943359375,
-0.0017061233520507812,
0.00824737548828125,
-0.027374267578125,
-0.028961181640625,
0.0027637481689453125,
0.05389404296875,
0.0469970703125,
-0.009857177734375,
0.01488494873046875,
-0.00891876220703125,
0.0404052734375,
-0.043853759765625,
0.027740478515625,
-0.045196533203125,
0.005092620849609375,
-0.0215911865234375,
0.01435089111328125,
-0.047943115234375,
0.0034503936767578125,
0.0010051727294921875,
-0.05230712890625,
0.0187530517578125,
0.004726409912109375,
0.0989990234375,
0.01458740234375,
-0.01364898681640625,
-0.0251617431640625,
-0.0413818359375,
0.0721435546875,
-0.058685302734375,
0.033599853515625,
0.0172271728515625,
0.01261138916015625,
-0.004940032958984375,
-0.0635986328125,
-0.042694091796875,
-0.0273590087890625,
-0.0187225341796875,
0.0192108154296875,
-0.0225830078125,
0.010528564453125,
0.036285400390625,
0.02490234375,
-0.0428466796875,
0.0005707740783691406,
-0.031463623046875,
-0.0230255126953125,
0.0391845703125,
0.0011692047119140625,
0.02679443359375,
-0.017364501953125,
-0.033905029296875,
-0.031524658203125,
-0.0259857177734375,
0.0141448974609375,
0.0251617431640625,
0.0237579345703125,
-0.01342010498046875,
0.033172607421875,
-0.00865936279296875,
0.05859375,
0.01065826416015625,
-0.01280975341796875,
0.06182861328125,
-0.016326904296875,
-0.01041412353515625,
0.01251220703125,
0.0888671875,
-0.001140594482421875,
0.0221710205078125,
-0.01027679443359375,
-0.013916015625,
0.0028057098388671875,
0.005931854248046875,
-0.05987548828125,
-0.01922607421875,
0.00992584228515625,
-0.02783203125,
-0.007602691650390625,
0.0253448486328125,
-0.02874755859375,
0.0156097412109375,
-0.015899658203125,
0.037567138671875,
-0.058319091796875,
-0.01126861572265625,
0.0017900466918945312,
-0.004474639892578125,
0.0172119140625,
-0.0003826618194580078,
-0.0733642578125,
0.023651123046875,
0.03765869140625,
0.044342041015625,
-0.0139312744140625,
-0.0400390625,
-0.041473388671875,
-0.01195526123046875,
-0.0203399658203125,
0.034027099609375,
-0.01261138916015625,
-0.018280029296875,
-0.003482818603515625,
0.0218658447265625,
-0.02593994140625,
-0.028289794921875,
0.05157470703125,
-0.0305023193359375,
0.0298309326171875,
-0.0009675025939941406,
-0.054901123046875,
-0.009490966796875,
0.006305694580078125,
-0.032501220703125,
0.105224609375,
0.02679443359375,
-0.0595703125,
0.0372314453125,
-0.04730224609375,
-0.0251312255859375,
-0.00753021240234375,
-0.0029697418212890625,
-0.040863037109375,
0.0052490234375,
0.026458740234375,
0.0170135498046875,
0.0009169578552246094,
0.0352783203125,
-0.0169525146484375,
-0.0253448486328125,
0.01557159423828125,
-0.0211181640625,
0.08197021484375,
0.01058197021484375,
-0.040252685546875,
0.0223236083984375,
-0.0780029296875,
0.0086212158203125,
0.0088653564453125,
-0.03167724609375,
-0.0223541259765625,
-0.0158538818359375,
0.0194091796875,
0.0372314453125,
0.0248565673828125,
-0.0577392578125,
0.004116058349609375,
-0.0523681640625,
0.027374267578125,
0.0538330078125,
-0.018585205078125,
0.0255889892578125,
-0.00531768798828125,
0.031280517578125,
0.001987457275390625,
0.024169921875,
-0.00823211669921875,
-0.0406494140625,
-0.070068359375,
-0.0250396728515625,
0.043914794921875,
0.044647216796875,
-0.052734375,
0.044921875,
-0.0170135498046875,
-0.041412353515625,
-0.0517578125,
0.000804901123046875,
0.0362548828125,
0.050445556640625,
0.040740966796875,
-0.0177001953125,
-0.045166015625,
-0.08349609375,
-0.0132904052734375,
-0.01523590087890625,
-0.005573272705078125,
0.0204925537109375,
0.0499267578125,
-0.01116180419921875,
0.06793212890625,
-0.019775390625,
-0.0166168212890625,
-0.0240631103515625,
0.01837158203125,
0.04119873046875,
0.03955078125,
0.044769287109375,
-0.039886474609375,
-0.05889892578125,
-0.01317596435546875,
-0.0579833984375,
-0.01119232177734375,
0.0006308555603027344,
-0.01837158203125,
0.04156494140625,
0.035491943359375,
-0.043670654296875,
0.016815185546875,
0.03765869140625,
-0.030975341796875,
0.042999267578125,
-0.0305328369140625,
-0.00742340087890625,
-0.079345703125,
0.01544952392578125,
-0.01239776611328125,
-0.0094451904296875,
-0.0478515625,
0.00778961181640625,
0.018707275390625,
0.0011663436889648438,
-0.05963134765625,
0.052490234375,
-0.035888671875,
-0.00982666015625,
-0.001956939697265625,
0.02655029296875,
-0.0114593505859375,
0.06207275390625,
0.00820159912109375,
0.058929443359375,
0.05340576171875,
-0.03192138671875,
0.009185791015625,
0.0352783203125,
-0.03759765625,
0.013458251953125,
-0.053985595703125,
0.0002810955047607422,
0.0024261474609375,
0.009552001953125,
-0.055938720703125,
0.005214691162109375,
0.033721923828125,
-0.0435791015625,
0.038909912109375,
-0.002166748046875,
-0.041412353515625,
-0.036529541015625,
-0.0207366943359375,
0.00849151611328125,
0.051910400390625,
-0.0443115234375,
0.05523681640625,
0.01385498046875,
-0.0015573501586914062,
-0.05999755859375,
-0.05133056640625,
-0.0050201416015625,
-0.004085540771484375,
-0.051727294921875,
0.043487548828125,
-0.0125732421875,
-0.01192474365234375,
0.00943756103515625,
-0.00615692138671875,
0.00328826904296875,
0.004238128662109375,
0.0169830322265625,
0.038726806640625,
-0.01355743408203125,
0.005062103271484375,
0.0009636878967285156,
-0.0247802734375,
-0.0006132125854492188,
-0.0057220458984375,
0.05389404296875,
-0.0231475830078125,
-0.01036834716796875,
-0.041412353515625,
0.003509521484375,
0.0379638671875,
-0.01214599609375,
0.0699462890625,
0.0667724609375,
-0.0035953521728515625,
-0.00015783309936523438,
-0.0298919677734375,
-0.014312744140625,
-0.034210205078125,
0.03509521484375,
-0.0328369140625,
-0.05389404296875,
0.04083251953125,
0.013031005859375,
0.006137847900390625,
0.0643310546875,
0.052459716796875,
0.00331878662109375,
0.07696533203125,
0.03759765625,
-0.0163421630859375,
0.047271728515625,
-0.046661376953125,
0.00919342041015625,
-0.07513427734375,
-0.00829315185546875,
-0.04296875,
-0.0027561187744140625,
-0.058349609375,
-0.01479339599609375,
0.012969970703125,
0.0149993896484375,
-0.0250396728515625,
0.042205810546875,
-0.0328369140625,
0.00762176513671875,
0.056427001953125,
-0.0029659271240234375,
0.01131439208984375,
0.004474639892578125,
-0.03558349609375,
-0.006137847900390625,
-0.0635986328125,
-0.03741455078125,
0.0848388671875,
0.0193939208984375,
0.037353515625,
-0.01476287841796875,
0.06732177734375,
-0.0008006095886230469,
0.022918701171875,
-0.055450439453125,
0.043975830078125,
-0.00954437255859375,
-0.056060791015625,
-0.0291595458984375,
-0.032501220703125,
-0.071533203125,
0.038970947265625,
-0.0177764892578125,
-0.0643310546875,
0.0156097412109375,
0.0024852752685546875,
-0.00696563720703125,
0.0167083740234375,
-0.04107666015625,
0.08148193359375,
-0.0150909423828125,
-0.0124359130859375,
-0.0103607177734375,
-0.05242919921875,
0.0156402587890625,
-0.0017995834350585938,
0.0284576416015625,
-0.00850677490234375,
0.00527191162109375,
0.065185546875,
-0.04827880859375,
0.04010009765625,
-0.0196533203125,
0.0046234130859375,
0.0220947265625,
-0.016204833984375,
0.038482666015625,
-0.0158538818359375,
-0.00982666015625,
0.03515625,
-0.003986358642578125,
-0.0305328369140625,
-0.032440185546875,
0.047515869140625,
-0.07373046875,
-0.029754638671875,
-0.044342041015625,
-0.042633056640625,
-0.0155487060546875,
0.0214385986328125,
0.038543701171875,
0.03167724609375,
-0.007720947265625,
0.006351470947265625,
0.033843994140625,
-0.029632568359375,
0.044342041015625,
0.05389404296875,
-0.012664794921875,
-0.023406982421875,
0.06719970703125,
0.00830841064453125,
0.0247802734375,
0.01251983642578125,
0.009918212890625,
-0.01491546630859375,
-0.032196044921875,
-0.047332763671875,
0.036102294921875,
-0.058349609375,
0.0098876953125,
-0.054168701171875,
-0.018280029296875,
-0.0259552001953125,
0.007457733154296875,
-0.033843994140625,
-0.045440673828125,
-0.011505126953125,
-0.00763702392578125,
0.019256591796875,
0.030487060546875,
-0.01160430908203125,
0.01258087158203125,
-0.05487060546875,
0.010833740234375,
0.0014982223510742188,
0.0125732421875,
-0.00423431396484375,
-0.05523681640625,
-0.0291595458984375,
0.0009860992431640625,
-0.0238189697265625,
-0.043609619140625,
0.035614013671875,
0.0157623291015625,
0.056060791015625,
0.018157958984375,
0.0109100341796875,
0.042388916015625,
-0.0458984375,
0.0677490234375,
0.01053619384765625,
-0.07891845703125,
0.043212890625,
-0.01090240478515625,
0.01800537109375,
0.05487060546875,
0.027252197265625,
-0.03228759765625,
-0.0531005859375,
-0.0740966796875,
-0.0863037109375,
0.09014892578125,
0.027587890625,
0.0053558349609375,
-0.00749969482421875,
0.00920867919921875,
-0.00435638427734375,
0.014312744140625,
-0.061767578125,
-0.033966064453125,
-0.00714111328125,
-0.0205841064453125,
-0.0145416259765625,
-0.0117645263671875,
0.006072998046875,
-0.031341552734375,
0.0911865234375,
0.0104522705078125,
0.0239410400390625,
0.0188751220703125,
-0.033355712890625,
-0.0009374618530273438,
0.0131072998046875,
0.053680419921875,
0.046478271484375,
-0.020355224609375,
-0.01318359375,
0.02154541015625,
-0.0389404296875,
-0.003772735595703125,
0.0146942138671875,
-0.0216064453125,
0.026214599609375,
0.0406494140625,
0.09051513671875,
0.02362060546875,
-0.05584716796875,
0.035797119140625,
-0.0032024383544921875,
-0.03668212890625,
-0.0307769775390625,
0.00439453125,
-0.00772857666015625,
0.00064849853515625,
0.02264404296875,
-0.00901031494140625,
-0.0026798248291015625,
-0.039642333984375,
0.00824737548828125,
0.0170440673828125,
-0.04119873046875,
-0.01279449462890625,
0.049530029296875,
-0.00811767578125,
-0.01898193359375,
0.0654296875,
-0.005512237548828125,
-0.046722412109375,
0.046478271484375,
0.03472900390625,
0.05841064453125,
-0.01171112060546875,
0.002498626708984375,
0.0511474609375,
0.0184326171875,
-0.00978851318359375,
0.0181884765625,
-0.015899658203125,
-0.05340576171875,
-0.027679443359375,
-0.059661865234375,
-0.017913818359375,
-0.0006847381591796875,
-0.059417724609375,
0.0122528076171875,
-0.00466156005859375,
-0.021270751953125,
0.0035247802734375,
-0.0015153884887695312,
-0.03472900390625,
0.0048980712890625,
0.0002880096435546875,
0.07305908203125,
-0.057525634765625,
0.087158203125,
0.034332275390625,
-0.04754638671875,
-0.0733642578125,
-0.006183624267578125,
-0.0289306640625,
-0.06048583984375,
0.07550048828125,
0.006381988525390625,
0.007442474365234375,
-0.004405975341796875,
-0.0158538818359375,
-0.06695556640625,
0.06982421875,
0.0235443115234375,
-0.051910400390625,
-0.0038166046142578125,
0.006282806396484375,
0.044097900390625,
-0.012664794921875,
0.0204925537109375,
0.039703369140625,
0.048492431640625,
-0.008087158203125,
-0.0780029296875,
-0.006305694580078125,
-0.02911376953125,
0.0085296630859375,
0.0211944580078125,
-0.03857421875,
0.0806884765625,
-0.00023376941680908203,
-0.0080108642578125,
0.014434814453125,
0.045013427734375,
0.022918701171875,
0.004604339599609375,
0.033538818359375,
0.05517578125,
0.047760009765625,
-0.020660400390625,
0.0921630859375,
-0.06353759765625,
0.060394287109375,
0.0836181640625,
0.00786590576171875,
0.06353759765625,
0.0399169921875,
-0.0298919677734375,
0.05517578125,
0.040863037109375,
-0.018035888671875,
0.036865234375,
0.01593017578125,
-0.003864288330078125,
-0.02099609375,
0.02520751953125,
-0.030975341796875,
0.0222015380859375,
0.01335906982421875,
-0.043853759765625,
-0.01076507568359375,
0.0251617431640625,
0.00774383544921875,
-0.005863189697265625,
-0.00835418701171875,
0.042999267578125,
0.0007224082946777344,
-0.053436279296875,
0.06634521484375,
0.0124359130859375,
0.0516357421875,
-0.056640625,
0.0189056396484375,
0.0036907196044921875,
0.0233001708984375,
-0.005954742431640625,
-0.04010009765625,
0.014617919921875,
-0.002384185791015625,
-0.01012420654296875,
-0.0147552490234375,
0.0343017578125,
-0.029937744140625,
-0.05340576171875,
0.03289794921875,
0.0233306884765625,
0.0258636474609375,
0.01360321044921875,
-0.0751953125,
0.01335906982421875,
-0.01024627685546875,
-0.046661376953125,
0.0274810791015625,
0.0178680419921875,
-0.00029158592224121094,
0.037261962890625,
0.034423828125,
-0.0019044876098632812,
0.00888824462890625,
0.0233306884765625,
0.05230712890625,
-0.0305938720703125,
-0.034271240234375,
-0.0709228515625,
0.0399169921875,
-0.01297760009765625,
-0.026641845703125,
0.07977294921875,
0.053985595703125,
0.0787353515625,
-0.0140380859375,
0.04119873046875,
-0.0144805908203125,
0.0292510986328125,
-0.036376953125,
0.049346923828125,
-0.03948974609375,
0.002544403076171875,
-0.03314208984375,
-0.06549072265625,
-0.0133514404296875,
0.06951904296875,
-0.03363037109375,
0.01186370849609375,
0.0518798828125,
0.051177978515625,
0.00357818603515625,
-0.018951416015625,
0.0184783935546875,
0.0390625,
0.01229095458984375,
0.045318603515625,
0.04315185546875,
-0.05340576171875,
0.042724609375,
-0.038055419921875,
-0.015625,
-0.01274871826171875,
-0.046600341796875,
-0.06658935546875,
-0.048919677734375,
-0.026763916015625,
-0.033111572265625,
-0.0008172988891601562,
0.07647705078125,
0.050201416015625,
-0.06964111328125,
-0.0228729248046875,
-0.021331787109375,
-0.0128326416015625,
-0.020416259765625,
-0.018096923828125,
0.049652099609375,
-0.048126220703125,
-0.0662841796875,
0.02679443359375,
0.0084381103515625,
-0.0055694580078125,
-0.0154876708984375,
-0.0251617431640625,
-0.0181427001953125,
0.0021762847900390625,
0.03973388671875,
0.01371002197265625,
-0.062408447265625,
-0.0079193115234375,
0.006824493408203125,
-0.01305389404296875,
0.023284912109375,
0.0362548828125,
-0.057769775390625,
0.030975341796875,
0.026397705078125,
0.01203155517578125,
0.0728759765625,
-0.016265869140625,
0.0209503173828125,
-0.03900146484375,
0.03179931640625,
0.00818634033203125,
0.044464111328125,
0.0174560546875,
-0.0140838623046875,
0.035552978515625,
0.0205535888671875,
-0.0305328369140625,
-0.06231689453125,
-0.003421783447265625,
-0.07708740234375,
-0.032745361328125,
0.07012939453125,
-0.026580810546875,
-0.03955078125,
0.006290435791015625,
-0.025634765625,
0.02545166015625,
-0.03436279296875,
0.0511474609375,
0.06781005859375,
-0.0156402587890625,
0.0009002685546875,
-0.042144775390625,
0.030975341796875,
0.027374267578125,
-0.039520263671875,
-0.00029540061950683594,
0.01654052734375,
0.034576416015625,
0.023895263671875,
0.048614501953125,
-0.0162200927734375,
0.01983642578125,
-0.004001617431640625,
0.0235595703125,
-0.018707275390625,
-0.013397216796875,
-0.0251617431640625,
0.00524139404296875,
0.0023250579833984375,
-0.01125335693359375
]
] |
Qwen/Qwen-7B-Chat | 2023-11-05T03:27:28.000Z | [
"transformers",
"safetensors",
"qwen",
"text-generation",
"custom_code",
"zh",
"en",
"arxiv:2309.16609",
"arxiv:2305.08322",
"arxiv:2009.03300",
"arxiv:2305.05280",
"arxiv:2210.03629",
"has_space",
"region:us"
] | text-generation | Qwen | null | null | Qwen/Qwen-7B-Chat | 647 | 28,987 | transformers | 2023-08-03T03:01:31 | ---
language:
- zh
- en
tags:
- qwen
pipeline_tag: text-generation
inference: false
---
# Qwen-7B-Chat
<p align="center">
<img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/logo_qwen.jpg" width="400"/>
</p>
<br>
<p align="center">
🤗 <a href="https://huggingface.co/Qwen">Hugging Face</a>   |   🤖 <a href="https://modelscope.cn/organization/qwen">ModelScope</a>   |    📑 <a href="https://arxiv.org/abs/2309.16609">Paper</a>   |   🖥️ <a href="https://modelscope.cn/studios/qwen/Qwen-7B-Chat-Demo/summary">Demo</a>
<br>
<a href="https://github.com/QwenLM/Qwen/blob/main/assets/wechat.png">WeChat (微信)</a>   |    DingTalk (钉钉)    |   <a href="https://discord.gg/z3GAxXZ9Ce">Discord</a>  
</p>
<br><br>
## Introduction
**Qwen-7B** is the 7B-parameter version of the large language model series, Qwen (abbr. Tongyi Qianwen), proposed by Alibaba Cloud. Qwen-7B is a Transformer-based large language model, which is pretrained on a large volume of diverse data, including web texts, professional books, code, etc. Additionally, based on the pretrained Qwen-7B, we release Qwen-7B-Chat, a large-model-based AI assistant, which is trained with alignment techniques. Both the pretrained and the chat models have since been updated to better-performing versions than the originally open-sourced Qwen-7B. This repository is the one for Qwen-7B-Chat.
For more details about Qwen, please refer to the [GitHub](https://github.com/QwenLM/Qwen) code repository.
<br>
## Requirements
* python 3.8 and above
* pytorch 1.12 and above; 2.0 and above is recommended
* CUDA 11.4 and above is recommended (this is for GPU users, flash-attention users, etc.)
<br>
## Dependency
To run Qwen-7B-Chat, please make sure you meet the above requirements, and then execute the following pip command to install the dependent libraries.
```bash
pip install transformers==4.32.0 accelerate tiktoken einops scipy transformers_stream_generator==0.0.4 peft deepspeed
```
In addition, it is recommended to install the `flash-attention` library (**flash attention 2 is now supported**) for higher efficiency and lower memory usage.
```bash
git clone https://github.com/Dao-AILab/flash-attention
cd flash-attention && pip install .
# The installs below are optional and may be slow to build.
# pip install csrc/layer_norm
# pip install csrc/rotary
```
<br>
## Quickstart
We show an example of multi-turn interaction with Qwen-7B-Chat in the following code:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig
# Note: The default behavior now has injection attack prevention off.
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-7B-Chat", trust_remote_code=True)
# use bf16
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-7B-Chat", device_map="auto", trust_remote_code=True, bf16=True).eval()
# use fp16
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-7B-Chat", device_map="auto", trust_remote_code=True, fp16=True).eval()
# use cpu only
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-7B-Chat", device_map="cpu", trust_remote_code=True).eval()
# use auto mode, automatically select precision based on the device.
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-7B-Chat", device_map="auto", trust_remote_code=True).eval()
# Specify hyperparameters for generation. But if you use transformers>=4.32.0, there is no need to do this.
# model.generation_config = GenerationConfig.from_pretrained("Qwen/Qwen-7B-Chat", trust_remote_code=True)  # you can specify different generation lengths, top_p, and other related hyperparameters
# 1st dialogue turn
response, history = model.chat(tokenizer, "你好", history=None)  # "Hello"
print(response)
# Model output (translated): Hello! I'm glad to help you.
# 2nd dialogue turn
response, history = model.chat(tokenizer, "给我讲一个年轻人奋斗创业最终取得成功的故事。", history=history)  # "Tell me a story about a young person who worked hard at a startup and eventually succeeded."
print(response)
# Model output (translated):
# This is a story about a young person who worked hard at starting a business and eventually succeeded.
# The protagonist is Li Ming. He came from an ordinary family; his parents were ordinary workers. From a young age, Li Ming set himself a goal: to become a successful entrepreneur.
# To achieve this goal, Li Ming studied hard and got into university. During university he took part in various entrepreneurship competitions, winning quite a few awards, and used his spare time to do internships, accumulating valuable experience.
# After graduating, Li Ming decided to start his own business. He began looking for investment, but was rejected many times. He did not give up: he kept improving his business plan and searching for new investment opportunities.
# Eventually, Li Ming secured an investment and set out on his entrepreneurial journey. He founded a technology company focused on developing new software. Under his leadership, the company grew rapidly into a successful tech enterprise.
# Li Ming's success was no accident. He was diligent, resilient, and willing to take risks, and he kept learning and improving. His story shows that with hard work, anyone can succeed.
# 3rd dialogue turn
response, history = model.chat(tokenizer, "给这个故事起一个标题", history=history)  # "Give this story a title"
print(response)
# Model output (translated): "Striving to Succeed: A Young Entrepreneur's Road to Success"
```
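The remote code also provides a streaming variant of `chat`; a minimal sketch assuming the `chat_stream` helper shown in the GitHub examples (verify it against the repository revision you load):
```python
# Stream partial responses as they are generated (assumes the chat_stream
# helper exposed by the Qwen remote code).
for partial_response in model.chat_stream(tokenizer, "你好", history=None):
    print(partial_response)
```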
For more usage instructions, please refer to our [GitHub repo](https://github.com/QwenLM/Qwen).
<br>
## Tokenizer
> Note: as a term, "tokenization" has no consensus Chinese equivalent; this document uses the English expression throughout.
Our tokenizer, based on tiktoken, is different from other tokenizers, e.g., the sentencepiece tokenizer. You need to pay attention to special tokens, especially during finetuning. For more detailed information on the tokenizer and its use in fine-tuning, please refer to the [documentation](https://github.com/QwenLM/Qwen/blob/main/tokenization_note.md).
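A quick round-trip with the tokenizer uses standard `transformers` calls; special-token handling for finetuning is covered in the linked documentation (the sample text is hypothetical):
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-7B-Chat", trust_remote_code=True)
ids = tokenizer.encode("Hello, Qwen!")  # hypothetical sample text
print(ids)
print(tokenizer.decode(ids))
```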
<br>
## Quantization
### Usage
**Note: we provide a new quantization solution based on [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ) and release an Int4 quantized model for Qwen-7B-Chat [here](https://huggingface.co/Qwen/Qwen-7B-Chat-Int4), which achieves nearly lossless evaluation results together with lower memory costs and faster inference speed than the previous solution.**
Here we demonstrate how to use our provided quantized models for inference. Before you start, make sure you meet the requirements of auto-gptq (e.g., torch 2.0 and above, transformers 4.32.0 and above, etc.) and install the required packages:
```bash
pip install auto-gptq optimum
```
If you meet problems installing `auto-gptq`, we advise you to check out the official [repo](https://github.com/PanQiWei/AutoGPTQ) to find a suitable pre-built wheel.
Then you can load the quantized model easily and run inference just as usual:
```python
model = AutoModelForCausalLM.from_pretrained(
"Qwen/Qwen-7B-Chat-Int4",
device_map="auto",
trust_remote_code=True
).eval()
# the tokenizer is the same as for the BF16 model (see Quickstart above)
response, history = model.chat(tokenizer, "你好", history=None)  # "Hello"
```
### Performance
We illustrate the zero-shot performance of the BF16, Int8, and Int4 models on the benchmarks, and find that the quantized models do not suffer significant performance degradation. Results are shown below:
| Quantization | MMLU | CEval (val) | GSM8K | Humaneval |
| ------------- | :--------: | :----------: | :----: | :--------: |
| BF16 | 55.8 | 59.7 | 50.3 | 37.2 |
| Int8 | 55.4 | 59.4 | 48.3 | 34.8 |
| Int4 | 55.1 | 59.2 | 49.7 | 29.9 |
### Inference Speed
We measured the average inference speed (in tokens/s) of generating 2048 and 8192 tokens under different quantization levels and versions of flash-attention:
| Quantization | FlashAttn | Speed (2048 tokens) | Speed (8192 tokens) |
| ------------- | :-------: | :------------------:| :------------------:|
| BF16 | v2 | 40.93 | 36.14 |
| Int8 | v2 | 37.47 | 32.54 |
| Int4 | v2 | 50.09 | 38.61 |
| BF16 | v1 | 40.75 | 35.34 |
| Int8 | v1 | 37.51 | 32.39 |
| Int4 | v1 | 45.98 | 36.47 |
| BF16 | Disabled | 37.55 | 33.56 |
| Int8 | Disabled | 37.84 | 32.65 |
| Int4 | Disabled | 48.12 | 36.70 |
In detail, the profiling setting is generating 8192 new tokens with 1 context token. The profiling runs on a single A100-SXM4-80G GPU with PyTorch 2.0.1 and CUDA 11.8. The inference speed is averaged over the 8192 generated tokens.
Note: The generation speed of the Int4/Int8 models above is measured with the autogptq library. Models loaded via `AutoModelForCausalLM.from_pretrained` currently generate roughly 20% slower. We have reported this issue to the HuggingFace team and will update here promptly if a solution is available.
### GPU Memory Usage
We also profiled the peak GPU memory usage for encoding 2048 tokens as context (and generating a single token) and for generating 8192 tokens (with a single token as context) under different quantization levels. (GPU memory usage is similar whether or not flash-attention is used.) The results are shown below.
| Quantization Level | Peak Usage for Encoding 2048 Tokens | Peak Usage for Generating 8192 Tokens |
| ------------------ | :---------------------------------: | :-----------------------------------: |
| BF16 | 16.99GB | 22.53GB |
| Int8 | 11.20GB | 16.62GB |
| Int4 | 8.21GB | 13.63GB |
The above speed and memory profiling were conducted using [this script](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile.py).
<br>
## Model
The details of the model architecture of Qwen-7B-Chat (the same scale as the pretrained Qwen-7B) are listed below:
| Hyperparameter | Value |
|:----------------|:------:|
| n_layers | 32 |
| n_heads | 32 |
| d_model | 4096 |
| vocab size | 151851 |
| sequence length | 8192 |
For position encoding, the FFN activation function, and normalization, we adopt the prevalent practices, i.e., RoPE relative position encoding, SwiGLU as the activation function, and RMSNorm for normalization (with optional installation of flash-attention for acceleration).
For tokenization, compared to current mainstream open-source models based on Chinese and English vocabularies, Qwen-7B-Chat uses a vocabulary of over 150K tokens. Built on the `cl100k_base` BPE vocabulary used by GPT-4 and optimized for Chinese and multilingual text, it encodes Chinese, English, and code data efficiently, is friendlier to many other languages, and enables users to strengthen capability in some of those languages without extending the vocabulary.
It splits numbers into single digits and calls the efficient [tiktoken](https://github.com/openai/tiktoken) tokenizer library for tokenization.
<br>
## Evaluation
For Qwen-7B-Chat, we evaluate the model on standard benchmarks for Chinese understanding (C-Eval), English understanding (MMLU), coding (HumanEval), and mathematics (GSM8K), as well as on long-context understanding. Since alignment gives Qwen-7B-Chat a strong ability to call external systems, we also evaluate its tool-usage capabilities.
Note: Due to rounding errors caused by hardware and framework, slight variations in reproduced results are normal.
### Chinese Evaluation
#### C-Eval
We report the 0-shot & 5-shot accuracy of Qwen-7B-Chat on the [C-Eval](https://arxiv.org/abs/2305.08322) validation set:
| Model | Avg. Acc. |
|:--------------------------------:|:---------:|
| LLaMA2-7B-Chat | 31.9 |
| LLaMA2-13B-Chat | 36.2 |
| LLaMA2-70B-Chat | 44.3 |
| ChatGLM2-6B-Chat | 52.6 |
| InternLM-7B-Chat | 53.6 |
| Baichuan2-7B-Chat | 55.6 |
| Baichuan2-13B-Chat | 56.7 |
| Qwen-7B-Chat (original) (0-shot) | 54.2 |
| **Qwen-7B-Chat (0-shot)** | 59.7 |
| **Qwen-7B-Chat (5-shot)** | 59.3 |
| **Qwen-14B-Chat (0-shot)** | 69.8 |
| **Qwen-14B-Chat (5-shot)** | **71.7** |
The zero-shot accuracy of Qwen-7B-Chat on the C-Eval test set is provided below:
| Model | Avg. | STEM | Social Sciences | Humanities | Others |
| :---------------------- | :------: | :--: | :-------------: | :--------: | :----: |
| Chinese-Alpaca-Plus-13B | 41.5 | 36.6 | 49.7 | 43.1 | 41.2 |
| Chinese-Alpaca-2-7B | 40.3 | - | - | - | - |
| ChatGLM2-6B-Chat | 50.1 | 46.4 | 60.4 | 50.6 | 46.9 |
| Baichuan-13B-Chat | 51.5 | 43.7 | 64.6 | 56.2 | 49.2 |
| Qwen-7B-Chat (original) | 54.6 | 47.8 | 67.6 | 59.3 | 50.6 |
| **Qwen-7B-Chat** | 58.6 | 53.3 | 72.1 | 62.8 | 52.0 |
| **Qwen-14B-Chat** | **69.1** | 65.1 | 80.9 | 71.2 | 63.4 |
Among models of comparable size, the human-aligned Qwen-7B-Chat remains near the top in C-Eval accuracy.
### English Evaluation
#### MMLU
The 0-shot & 5-shot accuracy of Qwen-7B-Chat on [MMLU](https://arxiv.org/abs/2009.03300) is provided below. Qwen-7B-Chat again performs among the best of comparably sized human-aligned models.
| Model | Avg. Acc. |
|:--------------------------------:|:---------:|
| ChatGLM2-6B-Chat | 46.0 |
| LLaMA2-7B-Chat | 46.2 |
| InternLM-7B-Chat | 51.1 |
| Baichuan2-7B-Chat | 52.9 |
| LLaMA2-13B-Chat | 54.6 |
| Baichuan2-13B-Chat | 57.3 |
| LLaMA2-70B-Chat | 63.8 |
| Qwen-7B-Chat (original) (0-shot) | 53.9 |
| **Qwen-7B-Chat (0-shot)** | 55.8 |
| **Qwen-7B-Chat (5-shot)** | 57.0 |
| **Qwen-14B-Chat (0-shot)** | 64.6 |
| **Qwen-14B-Chat (5-shot)** | **66.5** |
### Coding Evaluation
The zero-shot Pass@1 of Qwen-7B-Chat on [HumanEval](https://github.com/openai/human-eval) is demonstrated below:
| Model | Pass@1 |
|:-----------------------:|:--------:|
| ChatGLM2-6B-Chat | 11.0 |
| LLaMA2-7B-Chat | 12.2 |
| Baichuan2-7B-Chat | 13.4 |
| InternLM-7B-Chat | 14.6 |
| Baichuan2-13B-Chat | 17.7 |
| LLaMA2-13B-Chat | 18.9 |
| LLaMA2-70B-Chat | 32.3 |
| Qwen-7B-Chat (original) | 24.4 |
| **Qwen-7B-Chat** | 37.2 |
| **Qwen-14B-Chat** | **43.9** |
### Mathematics Evaluation
The accuracy of Qwen-7B-Chat on [GSM8K](https://github.com/openai/grade-school-math), which evaluates mathematical ability, is shown below:
| Model | Acc. |
|:------------------------------------:|:--------:|
| LLaMA2-7B-Chat | 26.3 |
| ChatGLM2-6B-Chat | 28.8 |
| Baichuan2-7B-Chat | 32.8 |
| InternLM-7B-Chat | 33.0 |
| LLaMA2-13B-Chat | 37.1 |
| Baichuan2-13B-Chat | 55.3 |
| LLaMA2-70B-Chat | 59.3 |
| **Qwen-7B-Chat (original) (0-shot)** | 41.1 |
| **Qwen-7B-Chat (0-shot)** | 50.3 |
| **Qwen-7B-Chat (8-shot)** | 54.1 |
| **Qwen-14B-Chat (0-shot)** | **60.1** |
| **Qwen-14B-Chat (8-shot)** | 59.3 |
### Long-Context Understanding
We introduce NTK-aware interpolation and LogN attention scaling to extend the context length of Qwen-7B-Chat. The Rouge-L results of Qwen-7B-Chat on the long-text summarization dataset [VCSUM](https://arxiv.org/abs/2305.05280) (average text length around 15K) are shown below:
**(To use these tricks, please set `use_dynamic_ntk` and `use_logn_attn` to true in config.json; see the sketch after the table below.)**
| Model | VCSUM (zh) |
|:------------------|:----------:|
| GPT-3.5-Turbo-16k | 16.0 |
| LLama2-7B-Chat | 0.2 |
| InternLM-7B-Chat | 13.0 |
| ChatGLM2-6B-Chat | 16.3 |
| **Qwen-7B-Chat** | **16.6** |
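As an alternative to editing `config.json` by hand, the same flags can be set programmatically before loading the model; a minimal sketch, assuming the attribute names from the note above:
```python
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("Qwen/Qwen-7B-Chat", trust_remote_code=True)
config.use_dynamic_ntk = True  # NTK-aware interpolation
config.use_logn_attn = True    # LogN attention scaling
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-7B-Chat", config=config, device_map="auto", trust_remote_code=True
).eval()
```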
### Tool Usage
#### ReAct Prompting
Qwen-Chat supports calling plugins/tools/APIs through [ReAct Prompting](https://arxiv.org/abs/2210.03629). ReAct is also one of the main approaches used by the [LangChain](https://python.langchain.com/) framework. On our open-sourced benchmark for assessing tool usage capabilities, Qwen-Chat's performance is as follows:
<table>
<tr>
<th colspan="4" align="center">Chinese Tool-Use Benchmark</th>
</tr>
<tr>
<th align="center">Model</th><th align="center">Tool Selection (Acc.↑)</th><th align="center">Tool Input (Rouge-L↑)</th><th align="center">False Positive Error↓</th>
</tr>
<tr>
<td>GPT-4</td><td align="center">95%</td><td align="center">0.90</td><td align="center">15.0%</td>
</tr>
<tr>
<td>GPT-3.5</td><td align="center">85%</td><td align="center">0.88</td><td align="center">75.0%</td>
</tr>
<tr>
<td>Qwen-7B-Chat</td><td align="center">98%</td><td align="center">0.91</td><td align="center">7.3%</td>
</tr>
<tr>
<td>Qwen-14B-Chat</td><td align="center">98%</td><td align="center">0.93</td><td align="center">2.4%</td>
</tr>
</table>
> 评测基准中出现的插件均没有出现在千问的训练集中。该基准评估了模型在多个候选插件中选择正确插件的准确率、传入插件的参数的合理性、以及假阳率。假阳率(False Positive)定义:在处理不该调用插件的请求时,错误地调用了插件。
> The plugins that appear in the evaluation set do not appear in the training set of Qwen. This benchmark evaluates the model's accuracy in selecting the correct plugin from multiple candidates, the soundness of the parameters passed to the plugin, and the false positive rate. False positive: the model incorrectly invokes a plugin while responding to a query that should not trigger one.


#### Code Interpreter
为了考察Qwen使用Python Code Interpreter完成数学解题、数据可视化、及文件处理与爬虫等任务的能力,我们专门建设并开源了一个评测这方面能力的[评测基准](https://github.com/QwenLM/Qwen-Agent/tree/main/benchmark)。
我们发现Qwen在生成代码的可执行率、结果正确性上均表现较好:
To assess Qwen's ability to use the Python Code Interpreter for tasks such as mathematical problem solving and data visualization, as well as general-purpose tasks such as file handling and web scraping, we have created and open-sourced a benchmark specifically designed for evaluating these capabilities. You can find the benchmark at this [link](https://github.com/QwenLM/Qwen-Agent/tree/main/benchmark).
We have observed that Qwen performs well in terms of code executability and result accuracy when generating code:
<table>
<tr>
<th colspan="4" align="center">Executable Rate of Generated Code (%)</th>
</tr>
<tr>
<th align="center">Model</th><th align="center">Math↑</th><th align="center">Visualization↑</th><th align="center">General↑</th>
</tr>
<tr>
<td>GPT-4</td><td align="center">91.9</td><td align="center">85.9</td><td align="center">82.8</td>
</tr>
<tr>
<td>GPT-3.5</td><td align="center">89.2</td><td align="center">65.0</td><td align="center">74.1</td>
</tr>
<tr>
<td>LLaMA2-7B-Chat</td>
<td align="center">41.9</td>
<td align="center">33.1</td>
<td align="center">24.1 </td>
</tr>
<tr>
<td>LLaMA2-13B-Chat</td>
<td align="center">50.0</td>
<td align="center">40.5</td>
<td align="center">48.3 </td>
</tr>
<tr>
<td>CodeLLaMA-7B-Instruct</td>
<td align="center">85.1</td>
<td align="center">54.0</td>
<td align="center">70.7 </td>
</tr>
<tr>
<td>CodeLLaMA-13B-Instruct</td>
<td align="center">93.2</td>
<td align="center">55.8</td>
<td align="center">74.1 </td>
</tr>
<tr>
<td>InternLM-7B-Chat-v1.1</td>
<td align="center">78.4</td>
<td align="center">44.2</td>
<td align="center">62.1 </td>
</tr>
<tr>
<td>InternLM-20B-Chat</td>
<td align="center">70.3</td>
<td align="center">44.2</td>
<td align="center">65.5 </td>
</tr>
<tr>
<td>Qwen-7B-Chat</td>
<td align="center">82.4</td>
<td align="center">64.4</td>
<td align="center">67.2 </td>
</tr>
<tr>
<td>Qwen-14B-Chat</td>
<td align="center">89.2</td>
<td align="center">84.1</td>
<td align="center">65.5</td>
</tr>
</table>
<table>
<tr>
<th colspan="4" align="center">Accuracy of Code Execution Results (%)</th>
</tr>
<tr>
<th align="center">Model</th><th align="center">Math↑</th><th align="center">Visualization-Hard↑</th><th align="center">Visualization-Easy↑</th>
</tr>
<tr>
<td>GPT-4</td><td align="center">82.8</td><td align="center">66.7</td><td align="center">60.8</td>
</tr>
<tr>
<td>GPT-3.5</td><td align="center">47.3</td><td align="center">33.3</td><td align="center">55.7</td>
</tr>
<tr>
<td>LLaMA2-7B-Chat</td>
<td align="center">3.9</td>
<td align="center">14.3</td>
<td align="center">39.2 </td>
</tr>
<tr>
<td>LLaMA2-13B-Chat</td>
<td align="center">8.3</td>
<td align="center">8.3</td>
<td align="center">40.5 </td>
</tr>
<tr>
<td>CodeLLaMA-7B-Instruct</td>
<td align="center">14.3</td>
<td align="center">26.2</td>
<td align="center">60.8 </td>
</tr>
<tr>
<td>CodeLLaMA-13B-Instruct</td>
<td align="center">28.2</td>
<td align="center">27.4</td>
<td align="center">62.0 </td>
</tr>
<tr>
<td>InternLM-7B-Chat-v1.1</td>
<td align="center">28.5</td>
<td align="center">4.8</td>
<td align="center">40.5 </td>
</tr>
<tr>
<td>InternLM-20B-Chat</td>
<td align="center">34.6</td>
<td align="center">21.4</td>
<td align="center">45.6 </td>
</tr>
<tr>
<td>Qwen-7B-Chat</td>
<td align="center">41.9</td>
<td align="center">40.5</td>
<td align="center">54.4 </td>
</tr>
<tr>
<td>Qwen-14B-Chat</td>
<td align="center">58.4</td>
<td align="center">53.6</td>
<td align="center">59.5</td>
</tr>
</table>
<p align="center">
<br>
<img src="assets/code_interpreter_showcase_001.jpg" />
<br>
</p>
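As an illustration of how an executable-rate metric like the one above can be computed (a simplified sketch, not the benchmark's actual harness): each generated snippet is run in isolation and counted as executable if it exits cleanly within a time limit.

```python
import subprocess
import sys
import tempfile

def is_executable(code: str, timeout: float = 10.0) -> bool:
    """Run a generated Python snippet in a subprocess; True if it exits cleanly."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    try:
        result = subprocess.run([sys.executable, path],
                                capture_output=True, timeout=timeout)
        return result.returncode == 0
    except subprocess.TimeoutExpired:
        return False

snippets = ["print(1 + 1)", "import nonexistent_module"]
rate = sum(is_executable(s) for s in snippets) / len(snippets)
print(f"Executable rate: {rate:.0%}")
```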
#### HuggingFace Agent
千问还具备作为 [HuggingFace Agent](https://huggingface.co/docs/transformers/transformers_agents) 的能力。它在 Huggingface 提供的run模式评测基准上的表现如下:
Qwen-Chat also has the capability to be used as a [HuggingFace Agent](https://huggingface.co/docs/transformers/transformers_agents). Its performance on the run-mode benchmark provided by HuggingFace is as follows:
<table>
<tr>
<th colspan="4" align="center">HuggingFace Agent Benchmark - Run Mode</th>
</tr>
<tr>
<th align="center">Model</th><th align="center">Tool Selection↑</th><th align="center">Tool Used↑</th><th align="center">Code↑</th>
</tr>
<tr>
<td>GPT-4</td><td align="center">100</td><td align="center">100</td><td align="center">97.4</td>
</tr>
<tr>
<td>GPT-3.5</td><td align="center">95.4</td><td align="center">96.3</td><td align="center">87.0</td>
</tr>
<tr>
<td>StarCoder-Base-15B</td><td align="center">86.1</td><td align="center">87.0</td><td align="center">68.9</td>
</tr>
<tr>
<td>StarCoder-15B</td><td align="center">87.0</td><td align="center">88.0</td><td align="center">68.9</td>
</tr>
<tr>
<td>Qwen-7B-Chat</td><td align="center">87.0</td><td align="center">87.0</td><td align="center">71.5</td>
</tr>
<tr>
<td>Qwen-14B-Chat</td><td align="center">93.5</td><td align="center">94.4</td><td align="center">87.0</td>
</tr>
</table>
<table>
<tr>
<th colspan="4" align="center">HuggingFace Agent Benchmark - Chat Mode</th>
</tr>
<tr>
<th align="center">Model</th><th align="center">Tool Selection↑</th><th align="center">Tool Used↑</th><th align="center">Code↑</th>
</tr>
<tr>
<td>GPT-4</td><td align="center">97.9</td><td align="center">97.9</td><td align="center">98.5</td>
</tr>
<tr>
<td>GPT-3.5</td><td align="center">97.3</td><td align="center">96.8</td><td align="center">89.6</td>
</tr>
<tr>
<td>StarCoder-Base-15B</td><td align="center">97.9</td><td align="center">97.9</td><td align="center">91.1</td>
</tr>
<tr>
<td>StarCoder-15B</td><td align="center">97.9</td><td align="center">97.9</td><td align="center">89.6</td>
</tr>
<tr>
<td>Qwen-7B-Chat</td><td align="center">94.7</td><td align="center">94.7</td><td align="center">85.1</td>
</tr>
<tr>
<td>Qwen-14B-Chat</td><td align="center">97.9</td><td align="center">97.9</td><td align="center">95.5</td>
</tr>
</table>
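For reference, running a model through the `transformers` agents API of that era looked roughly like the sketch below. The endpoint shown is the StarCoder example from the documentation; wiring in a Qwen checkpoint would be analogous, and the exact classes available depend on the installed `transformers` version.

```python
from PIL import Image
from transformers import HfAgent

# Documentation-style example: an agent backed by a remote StarCoder endpoint.
# Swapping in another model (e.g. a Qwen endpoint) follows the same pattern.
agent = HfAgent("https://api-inference.huggingface.co/models/bigcode/starcoder")

image = Image.open("example.jpg")  # any local image (hypothetical path)
print(agent.run("Caption the following image.", image=image))
```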
<br>
## FAQ
如遇到问题,敬请查阅[FAQ](https://github.com/QwenLM/Qwen/blob/main/FAQ_zh.md)以及issue区,如仍无法解决再提交issue。
If you run into problems, please consult the [FAQ](https://github.com/QwenLM/Qwen/blob/main/FAQ.md) and existing issues to look for a solution before opening a new issue.
<br>
## 引用 (Citation)
如果你觉得我们的工作对你有帮助,欢迎引用!
If you find our work helpful, feel free to cite us!
```
@article{qwen,
title={Qwen Technical Report},
author={Jinze Bai and Shuai Bai and Yunfei Chu and Zeyu Cui and Kai Dang and Xiaodong Deng and Yang Fan and Wenbin Ge and Yu Han and Fei Huang and Binyuan Hui and Luo Ji and Mei Li and Junyang Lin and Runji Lin and Dayiheng Liu and Gao Liu and Chengqiang Lu and Keming Lu and Jianxin Ma and Rui Men and Xingzhang Ren and Xuancheng Ren and Chuanqi Tan and Sinan Tan and Jianhong Tu and Peng Wang and Shijie Wang and Wei Wang and Shengguang Wu and Benfeng Xu and Jin Xu and An Yang and Hao Yang and Jian Yang and Shusheng Yang and Yang Yao and Bowen Yu and Hongyi Yuan and Zheng Yuan and Jianwei Zhang and Xingxuan Zhang and Yichang Zhang and Zhenru Zhang and Chang Zhou and Jingren Zhou and Xiaohuan Zhou and Tianhang Zhu},
journal={arXiv preprint arXiv:2309.16609},
year={2023}
}
```
<br>
## 使用协议(License Agreement)
我们的代码和模型权重对学术研究完全开放,并支持商用。请查看[LICENSE](https://github.com/QwenLM/Qwen/blob/main/LICENSE)了解具体的开源协议细节。如需商用,请填写[问卷](https://dashscope.console.aliyun.com/openModelApply/qianwen)申请。
Our code and checkpoints are open for research purposes, and commercial use is permitted. Check [LICENSE](https://github.com/QwenLM/Qwen/blob/main/LICENSE) for more details about the license. For commercial use, please fill out the [form](https://dashscope.console.aliyun.com/openModelApply/qianwen) to apply.
<br>
## 联系我们(Contact Us)
如果你想给我们的研发团队和产品团队留言,欢迎加入我们的微信群、钉钉群以及Discord!同时,也欢迎通过邮件(qianwen_opensource@alibabacloud.com)联系我们。
If you would like to leave a message for our research or product team, join our Discord, WeChat, or DingTalk groups! You are also welcome to email us at qianwen_opensource@alibabacloud.com.
| 29,492 | [
[
-0.0301513671875,
-0.05548095703125,
0.0097198486328125,
0.0193023681640625,
-0.01947021484375,
-0.0080413818359375,
-0.006988525390625,
-0.0289306640625,
-0.004367828369140625,
0.0188446044921875,
-0.039276123046875,
-0.03729248046875,
-0.030120849609375,
-0.01021575927734375,
-0.0189361572265625,
0.054534912109375,
0.0203857421875,
-0.008514404296875,
0.0182647705078125,
-0.013427734375,
-0.03106689453125,
-0.03326416015625,
-0.0634765625,
-0.01149749755859375,
0.0093536376953125,
0.00966644287109375,
0.06195068359375,
0.0322265625,
0.0273895263671875,
0.033935546875,
-0.0013914108276367188,
0.0103912353515625,
-0.030426025390625,
-0.00926971435546875,
0.0163421630859375,
-0.042999267578125,
-0.0501708984375,
0.003742218017578125,
0.043121337890625,
0.00789642333984375,
-0.00439453125,
0.0168914794921875,
0.0142974853515625,
0.0304412841796875,
-0.029052734375,
0.01546478271484375,
-0.0276641845703125,
-0.0066986083984375,
-0.0106201171875,
-0.004344940185546875,
-0.014373779296875,
-0.031585693359375,
0.01444244384765625,
-0.050201416015625,
0.0066680908203125,
0.016082763671875,
0.096435546875,
0.002582550048828125,
-0.047454833984375,
0.00266265869140625,
-0.03515625,
0.07696533203125,
-0.08880615234375,
0.0145721435546875,
0.0217132568359375,
0.03106689453125,
-0.0243682861328125,
-0.07562255859375,
-0.051910400390625,
-0.0201263427734375,
-0.0209197998046875,
0.0150146484375,
-0.023895263671875,
0.01434326171875,
0.031463623046875,
0.022491455078125,
-0.042816162109375,
-0.00875091552734375,
-0.02752685546875,
-0.022918701171875,
0.052520751953125,
0.0157623291015625,
0.03753662109375,
-0.0267333984375,
-0.02349853515625,
-0.017730712890625,
-0.0292205810546875,
0.0155792236328125,
0.018707275390625,
-0.0028400421142578125,
-0.036468505859375,
0.018341064453125,
-0.02490234375,
0.031402587890625,
0.03125,
-0.0194854736328125,
0.0274505615234375,
-0.0299224853515625,
-0.031768798828125,
-0.0189056396484375,
0.102294921875,
0.04180908203125,
-0.00031065940856933594,
0.01265716552734375,
-0.0028705596923828125,
-0.0120391845703125,
-0.015838623046875,
-0.07598876953125,
-0.039276123046875,
0.04937744140625,
-0.051422119140625,
-0.033599853515625,
-0.007480621337890625,
-0.03057861328125,
0.0074920654296875,
0.004718780517578125,
0.05291748046875,
-0.049346923828125,
-0.0482177734375,
0.00009876489639282227,
-0.01666259765625,
0.0202178955078125,
0.0183868408203125,
-0.060516357421875,
0.00711822509765625,
0.0186920166015625,
0.05712890625,
0.0099639892578125,
-0.03033447265625,
-0.022491455078125,
-0.0022735595703125,
-0.0164031982421875,
0.0309906005859375,
0.0026092529296875,
-0.0243682861328125,
-0.011962890625,
0.011749267578125,
-0.0088043212890625,
-0.03216552734375,
0.035797119140625,
-0.03887939453125,
0.04583740234375,
-0.0062713623046875,
-0.038543701171875,
-0.0222930908203125,
0.0131683349609375,
-0.033721923828125,
0.07855224609375,
0.0084075927734375,
-0.07708740234375,
-0.002101898193359375,
-0.045562744140625,
-0.015625,
-0.0016775131225585938,
0.00429534912109375,
-0.042999267578125,
-0.02032470703125,
0.0258026123046875,
0.031036376953125,
-0.018646240234375,
0.020721435546875,
-0.0271759033203125,
-0.031768798828125,
0.035064697265625,
-0.051025390625,
0.10223388671875,
0.011962890625,
-0.050933837890625,
0.0340576171875,
-0.0531005859375,
0.022674560546875,
0.01306915283203125,
-0.00374603271484375,
-0.004177093505859375,
-0.004474639892578125,
0.00553131103515625,
0.030914306640625,
0.0255584716796875,
-0.0270538330078125,
0.0095367431640625,
-0.054534912109375,
0.060577392578125,
0.047454833984375,
0.0017528533935546875,
0.043365478515625,
-0.03778076171875,
0.0265960693359375,
0.0254669189453125,
0.039398193359375,
-0.012481689453125,
-0.0352783203125,
-0.07293701171875,
-0.00713348388671875,
0.0318603515625,
0.043609619140625,
-0.080322265625,
0.047149658203125,
0.003818511962890625,
-0.042266845703125,
-0.051361083984375,
-0.01556396484375,
0.037261962890625,
0.0205078125,
0.03326416015625,
-0.0007238388061523438,
-0.03631591796875,
-0.05438232421875,
-0.003482818603515625,
-0.0207061767578125,
-0.01039886474609375,
0.0132293701171875,
0.0304412841796875,
-0.00598907470703125,
0.06353759765625,
-0.0286407470703125,
-0.00864410400390625,
-0.017791748046875,
0.003566741943359375,
0.0215911865234375,
0.05596923828125,
0.05621337890625,
-0.06341552734375,
-0.047576904296875,
-0.01226043701171875,
-0.058990478515625,
0.0071868896484375,
-0.0131683349609375,
-0.0421142578125,
0.018585205078125,
0.012359619140625,
-0.05621337890625,
0.040252685546875,
0.044342041015625,
-0.0313720703125,
0.055999755859375,
-0.0031223297119140625,
0.0181121826171875,
-0.093505859375,
-0.007232666015625,
-0.00739288330078125,
-0.019134521484375,
-0.043609619140625,
-0.0019931793212890625,
0.01399993896484375,
0.0208587646484375,
-0.045379638671875,
0.059844970703125,
-0.037933349609375,
0.0181732177734375,
-0.01496124267578125,
0.01535797119140625,
0.01251983642578125,
0.050750732421875,
-0.01444244384765625,
0.047576904296875,
0.05322265625,
-0.0533447265625,
0.03887939453125,
0.0288848876953125,
-0.00726318359375,
0.0137786865234375,
-0.06451416015625,
0.003147125244140625,
0.0163421630859375,
0.016357421875,
-0.07830810546875,
-0.00445556640625,
0.047576904296875,
-0.0628662109375,
0.0189361572265625,
-0.0163116455078125,
-0.0167388916015625,
-0.040679931640625,
-0.032623291015625,
0.01496124267578125,
0.054473876953125,
-0.033203125,
0.042266845703125,
0.015289306640625,
0.01461029052734375,
-0.04595947265625,
-0.037384033203125,
-0.00732421875,
-0.02978515625,
-0.05780029296875,
0.03680419921875,
-0.0126495361328125,
0.0004935264587402344,
-0.002941131591796875,
0.01055145263671875,
0.003082275390625,
0.0010251998901367188,
0.013946533203125,
0.0296783447265625,
-0.01480865478515625,
-0.0027980804443359375,
0.00858306884765625,
-0.01155853271484375,
0.003894805908203125,
-0.0211334228515625,
0.04376220703125,
-0.01084136962890625,
-0.01104736328125,
-0.06658935546875,
0.0136871337890625,
0.03594970703125,
-0.0173797607421875,
0.061248779296875,
0.07830810546875,
-0.0133056640625,
0.00807952880859375,
-0.04779052734375,
-0.02386474609375,
-0.044830322265625,
0.0290985107421875,
-0.0291595458984375,
-0.06402587890625,
0.04779052734375,
0.016082763671875,
0.0261688232421875,
0.0533447265625,
0.043121337890625,
-0.00726318359375,
0.09637451171875,
0.03363037109375,
-0.00872802734375,
0.050323486328125,
-0.039947509765625,
0.00984954833984375,
-0.061981201171875,
0.005390167236328125,
-0.017333984375,
-0.0147247314453125,
-0.064208984375,
-0.0197906494140625,
0.030487060546875,
0.026580810546875,
-0.04559326171875,
0.0260009765625,
-0.041595458984375,
-0.01079559326171875,
0.06402587890625,
0.01056671142578125,
0.01175689697265625,
-0.0259857177734375,
0.0040740966796875,
0.006076812744140625,
-0.067138671875,
-0.035430908203125,
0.07037353515625,
0.031585693359375,
0.035003662109375,
0.004978179931640625,
0.05322265625,
-0.0021686553955078125,
0.0242767333984375,
-0.04376220703125,
0.036651611328125,
0.01358795166015625,
-0.04254150390625,
-0.03765869140625,
-0.04730224609375,
-0.072998046875,
0.039337158203125,
-0.00783538818359375,
-0.0594482421875,
0.0229644775390625,
0.0149383544921875,
-0.046234130859375,
0.01446533203125,
-0.05706787109375,
0.07354736328125,
-0.015716552734375,
-0.027557373046875,
-0.002044677734375,
-0.05322265625,
0.04022216796875,
0.032562255859375,
0.016326904296875,
-0.0209808349609375,
0.0201416015625,
0.06396484375,
-0.040679931640625,
0.062042236328125,
-0.0223846435546875,
0.002471923828125,
0.046173095703125,
0.0002598762512207031,
0.0360107421875,
0.014495849609375,
0.01163482666015625,
0.01091766357421875,
0.03704833984375,
-0.037017822265625,
-0.0426025390625,
0.044586181640625,
-0.06976318359375,
-0.050201416015625,
-0.025238037109375,
-0.0291748046875,
0.003406524658203125,
0.0189208984375,
0.041473388671875,
0.0413818359375,
0.006809234619140625,
0.0087127685546875,
0.0286712646484375,
-0.035919189453125,
0.0523681640625,
0.032867431640625,
-0.0309600830078125,
-0.03717041015625,
0.056304931640625,
0.007457733154296875,
0.0290374755859375,
0.0146484375,
0.011749267578125,
-0.02471923828125,
-0.0266571044921875,
-0.05029296875,
0.01395416259765625,
-0.025115966796875,
-0.0307464599609375,
-0.054962158203125,
-0.029022216796875,
-0.0523681640625,
0.020599365234375,
-0.0274810791015625,
-0.0260467529296875,
-0.0285491943359375,
0.0035343170166015625,
0.039398193359375,
0.01183319091796875,
-0.0023670196533203125,
0.0330810546875,
-0.083984375,
0.0286712646484375,
0.022552490234375,
0.0026264190673828125,
0.027557373046875,
-0.052520751953125,
-0.032196044921875,
0.035919189453125,
-0.040740966796875,
-0.057952880859375,
0.05291748046875,
0.00722503662109375,
0.03900146484375,
0.03302001953125,
0.0280303955078125,
0.045501708984375,
-0.01214599609375,
0.0625,
0.01763916015625,
-0.07830810546875,
0.029449462890625,
-0.03643798828125,
0.0188751220703125,
0.00586700439453125,
0.0203399658203125,
-0.0396728515625,
-0.0298309326171875,
-0.064453125,
-0.05877685546875,
0.060394287109375,
0.038787841796875,
0.018798828125,
0.0004513263702392578,
0.016204833984375,
-0.032562255859375,
0.019744873046875,
-0.056884765625,
-0.0400390625,
-0.0242767333984375,
-0.0182342529296875,
0.02703857421875,
-0.0150146484375,
0.00098419189453125,
-0.0299224853515625,
0.056060791015625,
-0.0017080307006835938,
0.047637939453125,
0.019927978515625,
0.00030517578125,
-0.00090789794921875,
0.0035915374755859375,
0.024810791015625,
0.036224365234375,
-0.0110626220703125,
-0.0146331787109375,
0.0186920166015625,
-0.0343017578125,
0.005046844482421875,
0.006011962890625,
-0.018829345703125,
0.0012865066528320312,
0.0227813720703125,
0.07489013671875,
0.01444244384765625,
-0.026763916015625,
0.03961181640625,
-0.0162353515625,
-0.02630615234375,
-0.0211944580078125,
0.0182342529296875,
0.023956298828125,
0.04534912109375,
0.035430908203125,
-0.02471923828125,
0.015625,
-0.043365478515625,
0.0036468505859375,
0.0263824462890625,
-0.0124053955078125,
-0.0212249755859375,
0.06341552734375,
0.01500701904296875,
-0.00946807861328125,
0.050384521484375,
-0.0347900390625,
-0.0596923828125,
0.05712890625,
0.0355224609375,
0.05706787109375,
-0.0222930908203125,
0.01285552978515625,
0.048095703125,
0.0033092498779296875,
-0.00943756103515625,
0.0260467529296875,
0.00537872314453125,
-0.065673828125,
-0.02984619140625,
-0.0390625,
-0.0162811279296875,
-0.00301361083984375,
-0.047943115234375,
0.01438140869140625,
-0.01541900634765625,
-0.038604736328125,
-0.005947113037109375,
0.01442718505859375,
-0.043182373046875,
0.022430419921875,
-0.011993408203125,
0.04742431640625,
-0.031219482421875,
0.0784912109375,
0.0252838134765625,
-0.03350830078125,
-0.073486328125,
-0.00791168212890625,
-0.01302337646484375,
-0.04962158203125,
0.0352783203125,
0.0036869049072265625,
0.00931549072265625,
0.0276641845703125,
-0.04876708984375,
-0.073974609375,
0.09539794921875,
-0.0006690025329589844,
-0.0479736328125,
-0.0146331787109375,
-0.0138397216796875,
0.033172607421875,
-0.004261016845703125,
0.050537109375,
0.039337158203125,
0.0283966064453125,
0.0089874267578125,
-0.0882568359375,
0.0194091796875,
-0.02587890625,
0.0036029815673828125,
0.00826263427734375,
-0.0850830078125,
0.0853271484375,
-0.01155853271484375,
-0.022796630859375,
0.01213836669921875,
0.076171875,
0.0181884765625,
0.0133056640625,
0.0250396728515625,
0.0159454345703125,
0.03936767578125,
-0.01629638671875,
0.05657958984375,
-0.039031982421875,
0.05322265625,
0.0626220703125,
0.00811767578125,
0.055999755859375,
0.00469970703125,
-0.042144775390625,
0.0290985107421875,
0.04852294921875,
-0.00821685791015625,
0.032623291015625,
-0.0001456737518310547,
-0.0184326171875,
-0.01446533203125,
0.0230865478515625,
-0.03826904296875,
0.018035888671875,
0.0298614501953125,
-0.006542205810546875,
0.005878448486328125,
0.0166473388671875,
0.0031833648681640625,
-0.0352783203125,
-0.009552001953125,
0.05169677734375,
0.0216827392578125,
-0.033233642578125,
0.06951904296875,
0.01506805419921875,
0.08367919921875,
-0.03607177734375,
-0.003681182861328125,
-0.0141448974609375,
0.007312774658203125,
-0.0170440673828125,
-0.038665771484375,
0.0108184814453125,
-0.03131103515625,
0.005863189697265625,
0.0156707763671875,
0.060272216796875,
-0.037811279296875,
-0.0202789306640625,
0.0199432373046875,
0.032623291015625,
0.01025390625,
-0.01861572265625,
-0.0706787109375,
0.002811431884765625,
0.0183258056640625,
-0.046722412109375,
0.032989501953125,
0.043548583984375,
-0.00510406494140625,
0.054779052734375,
0.0511474609375,
-0.016998291015625,
0.00519561767578125,
0.0000019073486328125,
0.069580078125,
-0.056396484375,
-0.03302001953125,
-0.06658935546875,
0.056854248046875,
-0.01329803466796875,
-0.03314208984375,
0.06915283203125,
0.0206298828125,
0.056182861328125,
0.0136260986328125,
0.05718994140625,
-0.0215606689453125,
0.036285400390625,
-0.02777099609375,
0.06427001953125,
-0.0287628173828125,
0.004055023193359375,
-0.0153656005859375,
-0.044586181640625,
0.001983642578125,
0.06396484375,
-0.0289764404296875,
0.0222625732421875,
0.041748046875,
0.060760498046875,
0.01381683349609375,
-0.00782012939453125,
0.03570556640625,
0.0352783203125,
0.022125244140625,
0.05914306640625,
0.05389404296875,
-0.0750732421875,
0.05218505859375,
-0.042388916015625,
-0.01055908203125,
-0.026611328125,
-0.0439453125,
-0.08465576171875,
-0.041259765625,
-0.030548095703125,
-0.050537109375,
-0.005504608154296875,
0.07257080078125,
0.050689697265625,
-0.0565185546875,
-0.0172119140625,
0.0106353759765625,
0.00614166259765625,
-0.02587890625,
-0.02581787109375,
0.04803466796875,
-0.015960693359375,
-0.07427978515625,
-0.0018033981323242188,
0.0016336441040039062,
0.0245208740234375,
-0.0240020751953125,
-0.003082275390625,
-0.0122833251953125,
-0.00678253173828125,
0.039093017578125,
0.0162506103515625,
-0.05035400390625,
-0.01242828369140625,
0.009124755859375,
-0.0227203369140625,
0.00948333740234375,
0.0141754150390625,
-0.0491943359375,
0.00768280029296875,
0.0341796875,
-0.002777099609375,
0.04541015625,
-0.001049041748046875,
0.048736572265625,
-0.026153564453125,
0.0234527587890625,
0.00858306884765625,
0.0189361572265625,
0.00760650634765625,
-0.029754638671875,
0.0156402587890625,
0.0100860595703125,
-0.047943115234375,
-0.0555419921875,
-0.00896453857421875,
-0.06341552734375,
-0.01129150390625,
0.08056640625,
-0.03082275390625,
-0.026763916015625,
-0.0085906982421875,
-0.049346923828125,
0.048828125,
-0.032470703125,
0.055328369140625,
0.039031982421875,
0.00046062469482421875,
-0.031890869140625,
-0.05242919921875,
0.051177978515625,
0.0245208740234375,
-0.04327392578125,
-0.01132965087890625,
0.018707275390625,
0.0238800048828125,
-0.004726409912109375,
0.061553955078125,
0.005615234375,
0.03289794921875,
0.0015048980712890625,
0.017425537109375,
-0.005718231201171875,
0.01018524169921875,
-0.003162384033203125,
-0.0078277587890625,
-0.004673004150390625,
-0.031707763671875
]
] |
microsoft/deberta-v3-small | 2022-09-26T08:59:13.000Z | [
"transformers",
"pytorch",
"tf",
"deberta-v2",
"deberta",
"deberta-v3",
"fill-mask",
"en",
"arxiv:2006.03654",
"arxiv:2111.09543",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | microsoft | null | null | microsoft/deberta-v3-small | 37 | 28,925 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- deberta
- deberta-v3
- fill-mask
thumbnail: https://huggingface.co/front/thumbnails/microsoft.png
license: mit
---
## DeBERTaV3: Improving DeBERTa using ELECTRA-Style Pre-Training with Gradient-Disentangled Embedding Sharing
[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. With those two improvements, DeBERTa outperforms RoBERTa on a majority of NLU tasks with 80GB of training data.
In [DeBERTa V3](https://arxiv.org/abs/2111.09543), we further improved the efficiency of DeBERTa using ELECTRA-style pre-training with Gradient-Disentangled Embedding Sharing. Compared to DeBERTa, our V3 version significantly improves model performance on downstream tasks. You can find more technical details about the new model in our [paper](https://arxiv.org/abs/2111.09543).
Please check the [official repository](https://github.com/microsoft/DeBERTa) for more implementation details and updates.
The DeBERTa V3 small model comes with 6 layers and a hidden size of 768. It has **44M** backbone parameters and a vocabulary of 128K tokens, which introduces 98M parameters in the embedding layer. This model was trained on the same 160GB of data as DeBERTa V2.
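As a quick usage sketch (the example sentence is arbitrary), the model can be queried through the standard fill-mask pipeline:

```python
from transformers import pipeline

# Minimal fill-mask example with DeBERTa-v3-small.
fill_mask = pipeline("fill-mask", model="microsoft/deberta-v3-small")
for pred in fill_mask("Paris is the [MASK] of France."):
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```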
#### Fine-tuning on NLU tasks
We present the dev results on SQuAD 2.0 and MNLI tasks.
| Model |Vocabulary(K)|Backbone #Params(M)| SQuAD 2.0(F1/EM) | MNLI-m/mm(ACC)|
|-------------------|----------|-------------------|-----------|----------|
| RoBERTa-base |50 |86 | 83.7/80.5 | 87.6/- |
| XLNet-base |32 |92 | -/80.2 | 86.8/- |
| ELECTRA-base |30 |86 | -/80.5 | 88.8/- |
| DeBERTa-base |50 |100 | 86.2/83.1| 88.8/88.5|
| DeBERTa-v3-large|128|304 | 91.5/89.0 | 91.8/91.9 |
| DeBERTa-v3-base |128|86 | 88.4/85.4 | 90.6/90.7|
| **DeBERTa-v3-small** |128|**44** | **82.8/80.4** | **88.3/87.7**|
| DeBERTa-v3-small+SiFT|128|22 | -/- | 88.8/88.5|
#### Fine-tuning with HF transformers
```bash
#!/bin/bash
cd transformers/examples/pytorch/text-classification/
pip install datasets
export TASK_NAME=mnli
output_dir="ds_results"
num_gpus=8
batch_size=8
python -m torch.distributed.launch --nproc_per_node=${num_gpus} \
run_glue.py \
--model_name_or_path microsoft/deberta-v3-small \
--task_name $TASK_NAME \
--do_train \
--do_eval \
--evaluation_strategy steps \
--max_seq_length 256 \
--warmup_steps 1500 \
--per_device_train_batch_size ${batch_size} \
--learning_rate 4.5e-5 \
--num_train_epochs 3 \
--output_dir $output_dir \
--overwrite_output_dir \
--logging_steps 1000 \
--logging_dir $output_dir
```
### Citation
If you find DeBERTa useful for your work, please cite the following papers:
``` latex
@misc{he2021debertav3,
title={DeBERTaV3: Improving DeBERTa using ELECTRA-Style Pre-Training with Gradient-Disentangled Embedding Sharing},
author={Pengcheng He and Jianfeng Gao and Weizhu Chen},
year={2021},
eprint={2111.09543},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
``` latex
@inproceedings{
he2021deberta,
title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
booktitle={International Conference on Learning Representations},
year={2021},
url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
| 3,585 | [
[
-0.0303192138671875,
-0.047393798828125,
0.0199127197265625,
0.0236053466796875,
-0.0176239013671875,
0.00567626953125,
-0.008758544921875,
-0.0333251953125,
0.02392578125,
0.0017366409301757812,
-0.0271453857421875,
-0.037841796875,
-0.06671142578125,
-0.006801605224609375,
-0.0157470703125,
0.059173583984375,
-0.01139068603515625,
0.0085906982421875,
0.0029125213623046875,
-0.015655517578125,
-0.032806396484375,
-0.0384521484375,
-0.0478515625,
-0.03692626953125,
0.041168212890625,
0.015777587890625,
0.036956787109375,
0.0169677734375,
0.0347900390625,
0.022674560546875,
-0.023651123046875,
0.017822265625,
-0.03326416015625,
-0.00566864013671875,
0.01313018798828125,
-0.027069091796875,
-0.0562744140625,
0.0004911422729492188,
0.0389404296875,
0.022003173828125,
0.003448486328125,
0.01824951171875,
0.0219268798828125,
0.07208251953125,
-0.057952880859375,
0.006732940673828125,
-0.044708251953125,
0.0022411346435546875,
0.002796173095703125,
-0.004558563232421875,
-0.027862548828125,
-0.01092529296875,
0.0227813720703125,
-0.03460693359375,
0.01873779296875,
-0.0229034423828125,
0.0924072265625,
0.04058837890625,
-0.0135650634765625,
-0.006938934326171875,
-0.04766845703125,
0.072998046875,
-0.05584716796875,
0.0239715576171875,
0.01922607421875,
0.00820159912109375,
-0.0035877227783203125,
-0.051239013671875,
-0.048553466796875,
-0.0038585662841796875,
-0.0084228515625,
0.0211029052734375,
-0.04864501953125,
-0.006725311279296875,
0.0248870849609375,
0.0230712890625,
-0.0382080078125,
0.019622802734375,
-0.03143310546875,
0.00479888916015625,
0.046875,
0.002166748046875,
0.0002218484878540039,
-0.0021514892578125,
-0.027557373046875,
-0.031158447265625,
-0.053558349609375,
0.0031490325927734375,
0.038818359375,
-0.006351470947265625,
-0.00771331787109375,
0.0137939453125,
-0.006359100341796875,
0.06298828125,
0.00943756103515625,
0.02264404296875,
0.056549072265625,
-0.0057525634765625,
-0.024169921875,
0.01023101806640625,
0.056396484375,
0.0171051025390625,
-0.0118865966796875,
-0.00225067138671875,
-0.0032863616943359375,
-0.0018129348754882812,
0.01517486572265625,
-0.07080078125,
-0.042510986328125,
0.0323486328125,
-0.03314208984375,
-0.022796630859375,
-0.007457733154296875,
-0.062042236328125,
-0.015655517578125,
-0.03082275390625,
0.046356201171875,
-0.040313720703125,
-0.01861572265625,
0.02099609375,
-0.0015888214111328125,
0.021728515625,
0.0379638671875,
-0.08184814453125,
0.007781982421875,
0.0252227783203125,
0.059722900390625,
-0.002460479736328125,
-0.0225982666015625,
-0.032470703125,
-0.019073486328125,
-0.0015201568603515625,
0.01861572265625,
-0.00457000732421875,
0.011993408203125,
-0.0150299072265625,
0.002658843994140625,
-0.017608642578125,
-0.021148681640625,
0.0204315185546875,
-0.04888916015625,
-0.000014781951904296875,
-0.01288604736328125,
-0.02532958984375,
-0.03192138671875,
0.017333984375,
-0.05487060546875,
0.0787353515625,
0.0092010498046875,
-0.06689453125,
0.019561767578125,
-0.04888916015625,
-0.0038700103759765625,
-0.020355224609375,
0.00579071044921875,
-0.046966552734375,
-0.01250457763671875,
0.03741455078125,
0.0458984375,
-0.0107574462890625,
0.0055694580078125,
-0.01450347900390625,
-0.03857421875,
0.016693115234375,
-0.0308380126953125,
0.09130859375,
0.022552490234375,
-0.05657958984375,
-0.0019292831420898438,
-0.0650634765625,
0.001796722412109375,
0.017852783203125,
-0.014801025390625,
-0.02093505859375,
-0.0061798095703125,
-0.005939483642578125,
0.01317596435546875,
0.04046630859375,
-0.03582763671875,
0.0196075439453125,
-0.0289459228515625,
0.050994873046875,
0.052490234375,
-0.0226287841796875,
0.03076171875,
-0.0200653076171875,
0.020660400390625,
0.01947021484375,
0.0208740234375,
0.0171966552734375,
-0.0361328125,
-0.080322265625,
-0.04632568359375,
0.046112060546875,
0.0390625,
-0.040283203125,
0.052947998046875,
-0.0187835693359375,
-0.059967041015625,
-0.050689697265625,
0.00966644287109375,
0.0222320556640625,
0.018890380859375,
0.05633544921875,
-0.00264739990234375,
-0.052703857421875,
-0.0679931640625,
0.00754547119140625,
0.0034427642822265625,
0.001651763916015625,
0.0038089752197265625,
0.051239013671875,
-0.03265380859375,
0.05645751953125,
-0.0477294921875,
-0.033111572265625,
-0.016998291015625,
0.01302337646484375,
0.036346435546875,
0.050201416015625,
0.06524658203125,
-0.04296875,
-0.04656982421875,
-0.0267333984375,
-0.0482177734375,
0.0264892578125,
0.00502777099609375,
-0.020233154296875,
0.0258331298828125,
0.01430511474609375,
-0.0350341796875,
0.0305633544921875,
0.04754638671875,
-0.0105133056640625,
-0.0006952285766601562,
-0.0280914306640625,
0.004016876220703125,
-0.0850830078125,
0.0116119384765625,
-0.00019860267639160156,
-0.006420135498046875,
-0.041656494140625,
-0.01165008544921875,
0.012420654296875,
0.018280029296875,
-0.0335693359375,
0.01399993896484375,
-0.042022705078125,
0.015472412109375,
0.0016832351684570312,
0.0255279541015625,
0.01019287109375,
0.06591796875,
-0.00013375282287597656,
0.042755126953125,
0.049713134765625,
-0.04425048828125,
0.01690673828125,
0.0338134765625,
-0.027252197265625,
0.006008148193359375,
-0.0731201171875,
0.0219573974609375,
-0.01488494873046875,
0.023895263671875,
-0.072265625,
0.01148223876953125,
0.027618408203125,
-0.03863525390625,
0.03411865234375,
-0.01369476318359375,
-0.041107177734375,
-0.0245513916015625,
-0.04241943359375,
0.01383209228515625,
0.062347412109375,
-0.058929443359375,
0.0297698974609375,
0.0230255126953125,
0.022125244140625,
-0.06719970703125,
-0.0599365234375,
-0.0204620361328125,
-0.0235748291015625,
-0.044647216796875,
0.057830810546875,
-0.0070343017578125,
-0.0002453327178955078,
-0.002399444580078125,
0.007266998291015625,
-0.01348876953125,
0.01806640625,
0.0119476318359375,
0.0267181396484375,
0.007305145263671875,
0.01187896728515625,
0.00872039794921875,
0.0034637451171875,
-0.0092620849609375,
-0.006031036376953125,
0.050384521484375,
-0.0287628173828125,
-0.006778717041015625,
-0.03448486328125,
0.005802154541015625,
0.032867431640625,
-0.0297698974609375,
0.07037353515625,
0.07330322265625,
-0.0201568603515625,
-0.00933837890625,
-0.0278778076171875,
-0.015838623046875,
-0.035186767578125,
0.0217742919921875,
-0.01403045654296875,
-0.04559326171875,
0.04522705078125,
0.016754150390625,
0.017852783203125,
0.052734375,
0.0357666015625,
-0.013671875,
0.0767822265625,
0.03326416015625,
-0.006710052490234375,
0.052642822265625,
-0.0850830078125,
0.01145172119140625,
-0.08807373046875,
-0.0192718505859375,
-0.032073974609375,
-0.049560546875,
-0.04034423828125,
-0.025390625,
0.01486968994140625,
0.03961181640625,
-0.01934814453125,
0.049652099609375,
-0.07061767578125,
0.005279541015625,
0.04534912109375,
0.036956787109375,
0.005870819091796875,
0.013397216796875,
0.01325225830078125,
-0.00495147705078125,
-0.059173583984375,
-0.033447265625,
0.0894775390625,
0.0361328125,
0.057342529296875,
0.0259857177734375,
0.07373046875,
0.005580902099609375,
-0.0083160400390625,
-0.03936767578125,
0.0305633544921875,
-0.00981903076171875,
-0.03326416015625,
-0.01708984375,
-0.023529052734375,
-0.09228515625,
0.0286102294921875,
-0.01363372802734375,
-0.0732421875,
0.0306243896484375,
0.03021240234375,
-0.03265380859375,
0.0258026123046875,
-0.0439453125,
0.05242919921875,
-0.002651214599609375,
-0.031585693359375,
-0.0219879150390625,
-0.0386962890625,
0.01099395751953125,
0.0083160400390625,
-0.0263214111328125,
-0.0026149749755859375,
0.01067352294921875,
0.07330322265625,
-0.02423095703125,
0.05859375,
-0.0181884765625,
-0.0186614990234375,
0.037384033203125,
-0.01523590087890625,
0.056396484375,
0.01739501953125,
-0.00433349609375,
0.0379638671875,
0.0017461776733398438,
-0.0274505615234375,
-0.045562744140625,
0.07073974609375,
-0.0831298828125,
-0.03094482421875,
-0.0439453125,
-0.0266265869140625,
-0.0031795501708984375,
-0.0020732879638671875,
0.035675048828125,
0.043212890625,
0.00936126708984375,
0.0347900390625,
0.07171630859375,
-0.005092620849609375,
0.042205810546875,
0.03497314453125,
0.007965087890625,
-0.025421142578125,
0.0684814453125,
0.017120361328125,
-0.0019969940185546875,
0.038116455078125,
-0.01557159423828125,
-0.0293121337890625,
-0.050384521484375,
-0.035980224609375,
0.01125335693359375,
-0.053924560546875,
-0.0216522216796875,
-0.071533203125,
-0.0145721435546875,
-0.0244598388671875,
0.004512786865234375,
-0.0347900390625,
-0.048370361328125,
-0.060516357421875,
0.0034656524658203125,
0.055145263671875,
0.04156494140625,
-0.004360198974609375,
0.0178375244140625,
-0.061614990234375,
0.0014295578002929688,
0.01763916015625,
0.0074615478515625,
0.0159912109375,
-0.045684814453125,
-0.0277862548828125,
0.019195556640625,
-0.047149658203125,
-0.065673828125,
0.0322265625,
-0.004436492919921875,
0.0489501953125,
-0.0081939697265625,
0.007335662841796875,
0.040008544921875,
-0.028350830078125,
0.06268310546875,
0.019287109375,
-0.0711669921875,
0.045867919921875,
-0.00972747802734375,
0.01021575927734375,
0.043212890625,
0.040863037109375,
0.01219940185546875,
-0.004924774169921875,
-0.06243896484375,
-0.0755615234375,
0.0596923828125,
0.03857421875,
-0.00201416015625,
0.00872039794921875,
0.00592803955078125,
-0.0117645263671875,
0.0106048583984375,
-0.048828125,
-0.0411376953125,
-0.02264404296875,
-0.0209197998046875,
-0.010589599609375,
-0.01316070556640625,
-0.00991058349609375,
-0.0439453125,
0.068359375,
0.0053863525390625,
0.046417236328125,
0.0379638671875,
-0.0246734619140625,
0.0195159912109375,
0.005283355712890625,
0.047515869140625,
0.050384521484375,
-0.0352783203125,
0.0004978179931640625,
0.02850341796875,
-0.03717041015625,
0.01445770263671875,
0.026519775390625,
-0.0083465576171875,
0.0184326171875,
0.019378662109375,
0.07427978515625,
-0.006237030029296875,
-0.025421142578125,
0.035400390625,
-0.010498046875,
-0.0309295654296875,
-0.032806396484375,
0.00424957275390625,
-0.01168060302734375,
0.033905029296875,
0.0251922607421875,
0.011962890625,
0.0127105712890625,
-0.01323699951171875,
0.00038433074951171875,
0.022613525390625,
-0.031036376953125,
-0.0262908935546875,
0.04205322265625,
0.018798828125,
0.0034008026123046875,
0.038299560546875,
-0.0159454345703125,
-0.03912353515625,
0.053009033203125,
0.0290985107421875,
0.068603515625,
0.0006608963012695312,
0.01352691650390625,
0.0548095703125,
0.0226898193359375,
-0.0024738311767578125,
0.03369140625,
0.002849578857421875,
-0.03955078125,
-0.0162506103515625,
-0.03955078125,
-0.01032257080078125,
0.02862548828125,
-0.051483154296875,
0.0209808349609375,
-0.018035888671875,
-0.0168609619140625,
0.0008883476257324219,
0.017578125,
-0.07037353515625,
0.00012302398681640625,
-0.007747650146484375,
0.048797607421875,
-0.043304443359375,
0.06982421875,
0.05108642578125,
-0.04058837890625,
-0.058380126953125,
-0.0124969482421875,
-0.035736083984375,
-0.04119873046875,
0.07147216796875,
0.01508331298828125,
-0.01207733154296875,
0.0179595947265625,
-0.026641845703125,
-0.07275390625,
0.11114501953125,
0.0302886962890625,
-0.0616455078125,
0.0016641616821289062,
-0.0189361572265625,
0.043304443359375,
-0.0086212158203125,
0.0236968994140625,
0.0312347412109375,
0.02008056640625,
-0.008819580078125,
-0.05181884765625,
0.01328277587890625,
-0.017303466796875,
0.007534027099609375,
0.0189208984375,
-0.064208984375,
0.0894775390625,
-0.007740020751953125,
0.0034160614013671875,
0.0016317367553710938,
0.045013427734375,
0.0189208984375,
0.004634857177734375,
0.03704833984375,
0.058441162109375,
0.057037353515625,
-0.0127716064453125,
0.058868408203125,
-0.026641845703125,
0.0535888671875,
0.07525634765625,
0.009307861328125,
0.058837890625,
0.035552978515625,
-0.026336669921875,
0.049407958984375,
0.04779052734375,
-0.01117706298828125,
0.05450439453125,
0.02337646484375,
-0.00522613525390625,
-0.00316619873046875,
0.0262298583984375,
-0.052764892578125,
0.03436279296875,
0.0102386474609375,
-0.046966552734375,
-0.00882720947265625,
0.0029277801513671875,
0.00560760498046875,
-0.017974853515625,
-0.006359100341796875,
0.038299560546875,
-0.00782012939453125,
-0.044158935546875,
0.08795166015625,
-0.0098724365234375,
0.057769775390625,
-0.037353515625,
-0.0161895751953125,
-0.002838134765625,
0.03765869140625,
-0.0181121826171875,
-0.0401611328125,
0.0027008056640625,
-0.001407623291015625,
-0.01403045654296875,
0.00045561790466308594,
0.030120849609375,
-0.0253753662109375,
-0.0247344970703125,
0.0362548828125,
0.017120361328125,
0.00910186767578125,
0.00212860107421875,
-0.061614990234375,
0.02349853515625,
0.00756072998046875,
-0.03399658203125,
0.02264404296875,
0.0106048583984375,
0.0246429443359375,
0.028350830078125,
0.03240966796875,
-0.021240234375,
0.01248931884765625,
-0.007534027099609375,
0.07861328125,
-0.021484375,
-0.01934814453125,
-0.0718994140625,
0.04376220703125,
-0.01605224609375,
-0.03326416015625,
0.0648193359375,
0.0252227783203125,
0.055084228515625,
-0.0144805908203125,
0.0379638671875,
-0.022064208984375,
0.007518768310546875,
-0.048065185546875,
0.048492431640625,
-0.0577392578125,
0.01507568359375,
-0.03485107421875,
-0.0767822265625,
-0.007465362548828125,
0.0574951171875,
-0.01389312744140625,
0.0034656524658203125,
0.04010009765625,
0.052764892578125,
-0.002056121826171875,
-0.016693115234375,
0.004058837890625,
0.002971649169921875,
0.03302001953125,
0.06646728515625,
0.050811767578125,
-0.0755615234375,
0.043060302734375,
-0.03460693359375,
-0.01490020751953125,
-0.02398681640625,
-0.058837890625,
-0.08221435546875,
-0.050994873046875,
-0.041107177734375,
-0.03143310546875,
-0.0035991668701171875,
0.059173583984375,
0.05303955078125,
-0.049835205078125,
0.0198822021484375,
-0.0296630859375,
-0.001728057861328125,
-0.035797119140625,
-0.0135345458984375,
0.04620361328125,
-0.021697998046875,
-0.0831298828125,
0.0251007080078125,
-0.0196533203125,
0.021514892578125,
-0.0261077880859375,
-0.0213775634765625,
-0.034637451171875,
-0.00490570068359375,
0.03814697265625,
0.0001976490020751953,
-0.039794921875,
-0.005580902099609375,
-0.001178741455078125,
-0.01025390625,
0.0024013519287109375,
0.021820068359375,
-0.054962158203125,
0.023773193359375,
0.05487060546875,
0.0280609130859375,
0.061920166015625,
-0.0295867919921875,
0.01763916015625,
-0.050933837890625,
0.0297698974609375,
0.0190277099609375,
0.0372314453125,
0.01467132568359375,
-0.03436279296875,
0.048065185546875,
-0.00832366943359375,
-0.04840087890625,
-0.062225341796875,
-0.00115203857421875,
-0.08319091796875,
-0.01194000244140625,
0.070556640625,
-0.035980224609375,
-0.0125732421875,
0.012481689453125,
-0.0191650390625,
0.0267791748046875,
-0.03619384765625,
0.054229736328125,
0.03680419921875,
0.0017137527465820312,
0.006069183349609375,
-0.0357666015625,
0.04473876953125,
0.0426025390625,
-0.0411376953125,
-0.0109100341796875,
0.0157470703125,
0.01474761962890625,
0.0352783203125,
0.04290771484375,
-0.0014238357543945312,
0.01088714599609375,
-0.00711822509765625,
0.004062652587890625,
-0.0228729248046875,
-0.034271240234375,
-0.0318603515625,
-0.011871337890625,
-0.006267547607421875,
-0.0513916015625
]
] |
lllyasviel/sd-controlnet-openpose | 2023-04-24T22:30:17.000Z | [
"diffusers",
"art",
"controlnet",
"stable-diffusion",
"image-to-image",
"arxiv:2302.05543",
"license:openrail",
"has_space",
"diffusers:ControlNetModel",
"region:us"
] | image-to-image | lllyasviel | null | null | lllyasviel/sd-controlnet-openpose | 81 | 28,881 | diffusers | 2023-02-24T07:09:43 | ---
license: openrail
base_model: runwayml/stable-diffusion-v1-5
tags:
- art
- controlnet
- stable-diffusion
- image-to-image
---
# Controlnet - *Human Pose Version*
ControlNet is a neural network structure to control diffusion models by adding extra conditions.
This checkpoint corresponds to the ControlNet conditioned on **Human Pose Estimation**.
It can be used in combination with [Stable Diffusion](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/text2img).

## Model Details
- **Developed by:** Lvmin Zhang, Maneesh Agrawala
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [The CreativeML OpenRAIL M license](https://huggingface.co/spaces/CompVis/stable-diffusion-license) is an [Open RAIL M license](https://www.licenses.ai/blog/2022/8/18/naming-convention-of-responsible-ai-licenses), adapted from the work that [BigScience](https://bigscience.huggingface.co/) and [the RAIL Initiative](https://www.licenses.ai/) are jointly carrying out in the area of responsible AI licensing. See also [the article about the BLOOM Open RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) on which our license is based.
- **Resources for more information:** [GitHub Repository](https://github.com/lllyasviel/ControlNet), [Paper](https://arxiv.org/abs/2302.05543).
- **Cite as:**
        @misc{zhang2023adding,
            title={Adding Conditional Control to Text-to-Image Diffusion Models},
            author={Lvmin Zhang and Maneesh Agrawala},
            year={2023},
            eprint={2302.05543},
            archivePrefix={arXiv},
            primaryClass={cs.CV}
        }
## Introduction
Controlnet was proposed in [*Adding Conditional Control to Text-to-Image Diffusion Models*](https://arxiv.org/abs/2302.05543) by
Lvmin Zhang, Maneesh Agrawala.
The abstract reads as follows:
*We present a neural network structure, ControlNet, to control pretrained large diffusion models to support additional input conditions.
The ControlNet learns task-specific conditions in an end-to-end way, and the learning is robust even when the training dataset is small (< 50k).
Moreover, training a ControlNet is as fast as fine-tuning a diffusion model, and the model can be trained on a personal device.
Alternatively, if powerful computation clusters are available, the model can scale to large amounts (millions to billions) of data.
We report that large diffusion models like Stable Diffusion can be augmented with ControlNets to enable conditional inputs like edge maps, segmentation maps, keypoints, etc.
This may enrich the methods to control large diffusion models and further facilitate related applications.*
## Released Checkpoints
The authors released 8 different checkpoints, each trained with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5)
on a different type of conditioning:
| Model Name | Control Image Overview| Control Image Example | Generated Image Example |
|---|---|---|---|
|[lllyasviel/sd-controlnet-canny](https://huggingface.co/lllyasviel/sd-controlnet-canny)<br/> *Trained with canny edge detection* | A monochrome image with white edges on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_bird_canny.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_bird_canny.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_canny_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_canny_1.png"/></a>|
|[lllyasviel/sd-controlnet-depth](https://huggingface.co/lllyasviel/sd-controlnet-depth)<br/> *Trained with Midas depth estimation* |A grayscale image with black representing deep areas and white representing shallow areas.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_vermeer_depth.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_vermeer_depth.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_depth_2.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_depth_2.png"/></a>|
|[lllyasviel/sd-controlnet-hed](https://huggingface.co/lllyasviel/sd-controlnet-hed)<br/> *Trained with HED edge detection (soft edge)* |A monochrome image with white soft edges on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_bird_hed.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_bird_hed.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_hed_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_hed_1.png"/></a> |
|[lllyasviel/sd-controlnet-mlsd](https://huggingface.co/lllyasviel/sd-controlnet-mlsd)<br/> *Trained with M-LSD line detection* |A monochrome image composed only of white straight lines on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_room_mlsd.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_room_mlsd.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_mlsd_0.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_mlsd_0.png"/></a>|
|[lllyasviel/sd-controlnet-normal](https://huggingface.co/lllyasviel/sd-controlnet-normal)<br/> *Trained with normal map* |A [normal mapped](https://en.wikipedia.org/wiki/Normal_mapping) image.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_human_normal.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_human_normal.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_normal_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_normal_1.png"/></a>|
|[lllyasviel/sd-controlnet_openpose](https://huggingface.co/lllyasviel/sd-controlnet-openpose)<br/> *Trained with OpenPose bone image* |An [OpenPose bone](https://github.com/CMU-Perceptual-Computing-Lab/openpose) image.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_human_openpose.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_human_openpose.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_openpose_0.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_openpose_0.png"/></a>|
|[lllyasviel/sd-controlnet_scribble](https://huggingface.co/lllyasviel/sd-controlnet-scribble)<br/> *Trained with human scribbles* |A hand-drawn monochrome image with white outlines on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_vermeer_scribble.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_vermeer_scribble.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_scribble_0.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_scribble_0.png"/></a> |
|[lllyasviel/sd-controlnet_seg](https://huggingface.co/lllyasviel/sd-controlnet-seg)<br/>*Trained with semantic segmentation* |An image following [ADE20K](https://groups.csail.mit.edu/vision/datasets/ADE20K/)'s segmentation protocol.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_room_seg.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_room_seg.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_seg_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_seg_1.png"/></a> |
## Example
It is recommended to use the checkpoint with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5), since the checkpoint was trained on it.
Experimentally, the checkpoint can also be used with other diffusion models, such as DreamBooth-finetuned Stable Diffusion.
**Note**: If you want to process an image to create the auxiliary conditioning, external dependencies are required as shown below:
1. Install https://github.com/patrickvonplaten/controlnet_aux
```sh
$ pip install controlnet_aux
```
2. Let's install `diffusers` and related packages:
```sh
$ pip install diffusers transformers accelerate
```
3. Run code:
```py
from PIL import Image
from diffusers import StableDiffusionControlNetPipeline, ControlNetModel, UniPCMultistepScheduler
import torch
from controlnet_aux import OpenposeDetector
from diffusers.utils import load_image
openpose = OpenposeDetector.from_pretrained('lllyasviel/ControlNet')
image = load_image("https://huggingface.co/lllyasviel/sd-controlnet-openpose/resolve/main/images/pose.png")
image = openpose(image)
controlnet = ControlNetModel.from_pretrained(
"lllyasviel/sd-controlnet-openpose", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
"runwayml/stable-diffusion-v1-5", controlnet=controlnet, safety_checker=None, torch_dtype=torch.float16
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)
# Remove if you do not have xformers installed
# see https://huggingface.co/docs/diffusers/v0.13.0/en/optimization/xformers#installing-xformers
# for installation instructions
pipe.enable_xformers_memory_efficient_attention()
pipe.enable_model_cpu_offload()
image = pipe("chef in the kitchen", image, num_inference_steps=20).images[0]
image.save('images/chef_pose_out.png')
```



### Training
The Openpose model was trained on 200k pose-image/caption pairs. The pose estimation images were generated with Openpose. The model was trained for 300 GPU-hours on an Nvidia A100 80G, using Stable Diffusion 1.5 as the base model.
### Blog post
For more information, please also have a look at the [official ControlNet Blog Post](https://huggingface.co/blog/controlnet). | 11,528 | [
[
-0.043975830078125,
-0.037353515625,
-0.0064544677734375,
0.03045654296875,
-0.02130126953125,
-0.0215301513671875,
-0.004749298095703125,
-0.049072265625,
0.0640869140625,
0.01227569580078125,
-0.0419921875,
-0.033905029296875,
-0.053466796875,
-0.0025615692138671875,
0.0003952980041503906,
0.042755126953125,
-0.00791168212890625,
-0.00931549072265625,
0.017578125,
-0.0094757080078125,
-0.008880615234375,
-0.0107421875,
-0.09527587890625,
-0.037994384765625,
0.043975830078125,
0.004119873046875,
0.05181884765625,
0.061859130859375,
0.03948974609375,
0.0305023193359375,
-0.02288818359375,
0.020233154296875,
-0.016815185546875,
-0.015106201171875,
0.00888824462890625,
-0.01020050048828125,
-0.0562744140625,
0.00766754150390625,
0.044189453125,
0.026519775390625,
0.0120849609375,
-0.0143890380859375,
0.005214691162109375,
0.0535888671875,
-0.039398193359375,
-0.00412750244140625,
-0.005748748779296875,
0.032562255859375,
-0.0162811279296875,
-0.007232666015625,
-0.0074920654296875,
-0.02020263671875,
0.00839996337890625,
-0.06280517578125,
-0.0206298828125,
-0.0024890899658203125,
0.10980224609375,
0.00527191162109375,
-0.0225982666015625,
-0.0014142990112304688,
-0.01099395751953125,
0.04315185546875,
-0.0596923828125,
0.00913238525390625,
0.021484375,
0.017181396484375,
-0.017913818359375,
-0.059722900390625,
-0.04144287109375,
-0.0128021240234375,
-0.00958251953125,
0.033966064453125,
-0.035186767578125,
0.0036449432373046875,
0.0148162841796875,
0.014251708984375,
-0.0300750732421875,
0.0196075439453125,
-0.027618408203125,
-0.032073974609375,
0.053680419921875,
-0.0025424957275390625,
0.053802490234375,
-0.00827789306640625,
-0.039398193359375,
-0.0257110595703125,
-0.026702880859375,
0.034637451171875,
0.0257415771484375,
0.003467559814453125,
-0.05487060546875,
0.034423828125,
-0.00910186767578125,
0.0709228515625,
0.0283050537109375,
-0.01727294921875,
0.037872314453125,
-0.0166168212890625,
-0.0183258056640625,
-0.0247802734375,
0.0863037109375,
0.047943115234375,
0.0209503173828125,
0.0031147003173828125,
-0.0112762451171875,
-0.01044464111328125,
0.002826690673828125,
-0.067626953125,
-0.01256561279296875,
0.005489349365234375,
-0.055908203125,
-0.0296783447265625,
-0.006092071533203125,
-0.05322265625,
-0.0245208740234375,
-0.01641845703125,
0.0253448486328125,
-0.050689697265625,
-0.04559326171875,
0.02081298828125,
-0.03692626953125,
0.041046142578125,
0.03839111328125,
-0.02569580078125,
0.0172119140625,
0.0097503662109375,
0.0794677734375,
-0.0167999267578125,
-0.019500732421875,
-0.0157928466796875,
-0.0081634521484375,
-0.0256500244140625,
0.0377197265625,
-0.003040313720703125,
-0.0052490234375,
-0.0147705078125,
0.024505615234375,
-0.00433349609375,
-0.026092529296875,
0.0196990966796875,
-0.013214111328125,
0.0060577392578125,
-0.003841400146484375,
-0.0158843994140625,
-0.0244293212890625,
0.0236663818359375,
-0.045562744140625,
0.044586181640625,
0.0156402587890625,
-0.0794677734375,
0.0245361328125,
-0.05035400390625,
-0.023406982421875,
-0.0178985595703125,
0.0012254714965820312,
-0.046966552734375,
-0.0440673828125,
-0.00058746337890625,
0.056671142578125,
-0.00604248046875,
-0.01151275634765625,
-0.044708251953125,
-0.005001068115234375,
0.0207366943359375,
-0.00534820556640625,
0.0897216796875,
0.01194000244140625,
-0.0455322265625,
0.01493072509765625,
-0.05609130859375,
0.00545501708984375,
0.02081298828125,
-0.027374267578125,
-0.0109100341796875,
-0.01178741455078125,
0.0084381103515625,
0.05316162109375,
0.0225830078125,
-0.053314208984375,
0.006404876708984375,
-0.02618408203125,
0.0301361083984375,
0.04071044921875,
0.00677490234375,
0.040802001953125,
-0.048736572265625,
0.032867431640625,
0.0196075439453125,
0.028900146484375,
0.0142822265625,
-0.031982421875,
-0.061065673828125,
-0.0394287109375,
-0.003387451171875,
0.04595947265625,
-0.0594482421875,
0.0648193359375,
-0.0035419464111328125,
-0.04302978515625,
-0.025360107421875,
0.003223419189453125,
0.038543701171875,
0.045623779296875,
0.0244598388671875,
-0.03753662109375,
-0.0294342041015625,
-0.0777587890625,
0.00814056396484375,
0.0149688720703125,
-0.0021839141845703125,
0.0215911865234375,
0.042877197265625,
-0.00335693359375,
0.049102783203125,
-0.0133209228515625,
-0.034332275390625,
-0.0016164779663085938,
-0.0089263916015625,
0.0142822265625,
0.07623291015625,
0.05938720703125,
-0.07171630859375,
-0.039306640625,
0.00982666015625,
-0.07354736328125,
0.00215911865234375,
-0.0145111083984375,
-0.033050537109375,
0.007396697998046875,
0.03802490234375,
-0.03680419921875,
0.06793212890625,
0.031402587890625,
-0.036224365234375,
0.03424072265625,
-0.0235443115234375,
0.0154571533203125,
-0.08074951171875,
0.0265350341796875,
0.03460693359375,
-0.0177764892578125,
-0.04119873046875,
0.006839752197265625,
0.0080718994140625,
0.006946563720703125,
-0.048370361328125,
0.05072021484375,
-0.049072265625,
-0.006465911865234375,
-0.0147705078125,
0.0010423660278320312,
-0.0030517578125,
0.05108642578125,
0.01079559326171875,
0.0361328125,
0.0611572265625,
-0.029449462890625,
0.0303955078125,
0.0309295654296875,
-0.004962921142578125,
0.068115234375,
-0.0694580078125,
0.002513885498046875,
-0.00102996826171875,
0.035797119140625,
-0.0594482421875,
-0.0232391357421875,
0.04150390625,
-0.043548583984375,
0.051116943359375,
-0.006824493408203125,
-0.014190673828125,
-0.041839599609375,
-0.04278564453125,
0.004405975341796875,
0.031982421875,
-0.036651611328125,
0.032501220703125,
0.0177764892578125,
0.0204010009765625,
-0.051422119140625,
-0.07952880859375,
0.0033626556396484375,
-0.0111236572265625,
-0.061767578125,
0.0258026123046875,
-0.00011456012725830078,
0.002513885498046875,
0.01084136962890625,
0.00750732421875,
-0.015960693359375,
0.00907135009765625,
0.0452880859375,
0.01039886474609375,
-0.0224761962890625,
-0.01119232177734375,
-0.0022430419921875,
-0.006061553955078125,
-0.0145416259765625,
-0.03387451171875,
0.033111572265625,
-0.0079803466796875,
-0.016632080078125,
-0.07965087890625,
0.0196990966796875,
0.0404052734375,
-0.01424407958984375,
0.052886962890625,
0.053680419921875,
-0.033172607421875,
-0.0008220672607421875,
-0.025482177734375,
-0.008056640625,
-0.037139892578125,
-0.01213836669921875,
-0.017669677734375,
-0.06256103515625,
0.05194091796875,
0.004978179931640625,
-0.011016845703125,
0.04119873046875,
0.0175018310546875,
-0.01824951171875,
0.06573486328125,
0.028900146484375,
-0.01264190673828125,
0.050445556640625,
-0.058074951171875,
-0.01474761962890625,
-0.074951171875,
-0.0237274169921875,
-0.021484375,
-0.07049560546875,
-0.0287322998046875,
-0.02227783203125,
0.0254364013671875,
0.02069091796875,
-0.051422119140625,
0.03607177734375,
-0.050933837890625,
0.00972747802734375,
0.0308837890625,
0.051727294921875,
-0.0191497802734375,
-0.007480621337890625,
-0.0249481201171875,
0.012481689453125,
-0.045562744140625,
-0.0163116455078125,
0.05328369140625,
0.034454345703125,
0.039825439453125,
-0.002895355224609375,
0.041778564453125,
0.009124755859375,
0.008453369140625,
-0.051116943359375,
0.039703369140625,
-0.006465911865234375,
-0.0458984375,
-0.031707763671875,
-0.0262603759765625,
-0.0914306640625,
0.0163421630859375,
-0.04266357421875,
-0.054779052734375,
0.0328369140625,
0.0118560791015625,
-0.0128936767578125,
0.0501708984375,
-0.052032470703125,
0.04864501953125,
0.000705718994140625,
-0.0361328125,
0.01280975341796875,
-0.07281494140625,
0.0130767822265625,
0.021392822265625,
-0.0214385986328125,
0.0036449432373046875,
-0.01268768310546875,
0.06591796875,
-0.06231689453125,
0.0721435546875,
-0.037689208984375,
0.01079559326171875,
0.02691650390625,
0.005008697509765625,
0.02850341796875,
-0.02105712890625,
-0.01446533203125,
0.0020923614501953125,
-0.007167816162109375,
-0.0452880859375,
-0.029754638671875,
0.0408935546875,
-0.06719970703125,
-0.01155853271484375,
-0.018035888671875,
-0.0213165283203125,
0.01023101806640625,
0.026214599609375,
0.041229248046875,
0.035858154296875,
0.0191650390625,
0.0235137939453125,
0.039520263671875,
-0.03173828125,
0.04217529296875,
-0.00021898746490478516,
0.006805419921875,
-0.0400390625,
0.05389404296875,
0.014251708984375,
0.01332855224609375,
0.0260009765625,
0.0185089111328125,
-0.01552581787109375,
-0.034210205078125,
-0.025665283203125,
0.033538818359375,
-0.04150390625,
-0.03985595703125,
-0.04412841796875,
-0.03594970703125,
-0.0253753662109375,
-0.03692626953125,
-0.0192718505859375,
-0.0182342529296875,
-0.0513916015625,
0.009674072265625,
0.047637939453125,
0.04058837890625,
-0.0276031494140625,
0.050567626953125,
-0.0136260986328125,
0.02410888671875,
0.025970458984375,
0.022003173828125,
0.005138397216796875,
-0.027618408203125,
0.0186004638671875,
0.0194854736328125,
-0.0224609375,
-0.06658935546875,
0.032958984375,
0.00957489013671875,
0.03472900390625,
0.034912109375,
-0.0200958251953125,
0.0435791015625,
-0.006725311279296875,
0.0322265625,
0.056060791015625,
-0.05059814453125,
0.040435791015625,
-0.03466796875,
0.0270843505859375,
0.03192138671875,
0.041290283203125,
-0.039154052734375,
-0.0156097412109375,
-0.04266357421875,
-0.040283203125,
0.046234130859375,
0.0003151893615722656,
-0.01611328125,
0.0220947265625,
0.0526123046875,
-0.0292510986328125,
-0.0018835067749023438,
-0.060150146484375,
-0.033966064453125,
-0.0233306884765625,
-0.00946044921875,
0.004344940185546875,
0.01306915283203125,
-0.0035858154296875,
-0.04315185546875,
0.05035400390625,
-0.016204833984375,
0.043701171875,
0.03765869140625,
0.01131439208984375,
-0.0162353515625,
-0.030731201171875,
0.05767822265625,
0.0323486328125,
0.0037403106689453125,
-0.00814056396484375,
-0.005786895751953125,
-0.04046630859375,
0.019500732421875,
-0.0084075927734375,
-0.02587890625,
-0.01708984375,
0.027862548828125,
0.065673828125,
-0.00481414794921875,
0.0012712478637695312,
0.07440185546875,
-0.0021762847900390625,
-0.056304931640625,
-0.01042938232421875,
0.003650665283203125,
0.0156402587890625,
0.035064697265625,
0.011016845703125,
0.0274505615234375,
0.004070281982421875,
-0.0030002593994140625,
0.0264892578125,
0.03472900390625,
-0.0380859375,
-0.016143798828125,
0.04034423828125,
-0.004055023193359375,
-0.01540374755859375,
0.020782470703125,
-0.03466796875,
-0.055816650390625,
0.07440185546875,
0.04058837890625,
0.05181884765625,
-0.0097808837890625,
0.0284576416015625,
0.052001953125,
0.025482177734375,
0.01507568359375,
0.029327392578125,
0.0078887939453125,
-0.05023193359375,
-0.0263824462890625,
-0.0289764404296875,
0.00446319580078125,
0.0227203369140625,
-0.035400390625,
0.0254669189453125,
-0.07073974609375,
-0.02044677734375,
0.002971649169921875,
0.0148468017578125,
-0.0487060546875,
0.03204345703125,
0.0012674331665039062,
0.09405517578125,
-0.05889892578125,
0.055084228515625,
0.05535888671875,
-0.03759765625,
-0.09136962890625,
-0.009490966796875,
0.0113525390625,
-0.0626220703125,
0.055450439453125,
0.0025653839111328125,
-0.0117950439453125,
-0.00528717041015625,
-0.072265625,
-0.04302978515625,
0.10107421875,
0.007232666015625,
-0.016021728515625,
0.01004791259765625,
-0.037078857421875,
0.032440185546875,
-0.033538818359375,
0.0245513916015625,
0.0165863037109375,
0.0445556640625,
0.0185394287109375,
-0.058197021484375,
0.023406982421875,
-0.04266357421875,
0.01401519775390625,
0.0158843994140625,
-0.0755615234375,
0.068359375,
0.018218994140625,
-0.00738525390625,
0.002227783203125,
0.049224853515625,
0.01251983642578125,
0.0113067626953125,
0.056915283203125,
0.05474853515625,
0.02606201171875,
-0.005672454833984375,
0.06396484375,
-0.0034389495849609375,
0.00791168212890625,
0.048828125,
0.0206451416015625,
0.045379638671875,
0.026092529296875,
0.00354766845703125,
0.026336669921875,
0.060150146484375,
0.0032062530517578125,
0.020355224609375,
0.043731689453125,
0.0009531974792480469,
-0.006328582763671875,
0.0025482177734375,
-0.024078369140625,
0.031585693359375,
0.024627685546875,
-0.0161590576171875,
-0.0136871337890625,
0.0295562744140625,
0.012725830078125,
0.0015745162963867188,
-0.0274505615234375,
0.049224853515625,
-0.01035308837890625,
-0.030670166015625,
0.064697265625,
-0.009552001953125,
0.09466552734375,
-0.045074462890625,
0.0012235641479492188,
-0.022979736328125,
-0.003894805908203125,
-0.0302276611328125,
-0.07177734375,
0.01503753662109375,
-0.019683837890625,
0.0247650146484375,
-0.01617431640625,
0.0726318359375,
-0.0172576904296875,
-0.00970458984375,
0.043548583984375,
0.01273345947265625,
0.0269927978515625,
0.018096923828125,
-0.0887451171875,
0.01398468017578125,
0.00487518310546875,
-0.033111572265625,
0.0106201171875,
0.035675048828125,
0.005413055419921875,
0.053192138671875,
0.031585693359375,
0.0462646484375,
0.020294189453125,
-0.031524658203125,
0.0748291015625,
-0.02130126953125,
-0.0260162353515625,
-0.03326416015625,
0.046630859375,
-0.043853759765625,
-0.039581298828125,
0.035858154296875,
0.0218658447265625,
0.056488037109375,
-0.00041866302490234375,
0.044952392578125,
-0.0282135009765625,
0.0222930908203125,
-0.0345458984375,
0.0826416015625,
-0.07452392578125,
-0.035064697265625,
-0.015777587890625,
-0.033935546875,
-0.0401611328125,
0.07366943359375,
-0.0133514404296875,
0.0104522705078125,
0.031463623046875,
0.0721435546875,
-0.0233306884765625,
-0.034881591796875,
0.0071563720703125,
0.002742767333984375,
0.0110321044921875,
0.05926513671875,
0.047149658203125,
-0.04693603515625,
0.00946807861328125,
-0.04052734375,
-0.038604736328125,
-0.02362060546875,
-0.06719970703125,
-0.065673828125,
-0.056640625,
-0.037139892578125,
-0.0413818359375,
-0.0165252685546875,
0.06536865234375,
0.09490966796875,
-0.046295166015625,
-0.0244293212890625,
-0.0093841552734375,
0.006229400634765625,
-0.01454925537109375,
-0.013397216796875,
0.029510498046875,
0.01093292236328125,
-0.032684326171875,
0.0007071495056152344,
0.0247650146484375,
0.044219970703125,
0.00011402368545532227,
-0.032684326171875,
-0.044647216796875,
-0.023773193359375,
0.021087646484375,
0.0526123046875,
-0.034454345703125,
-0.021575927734375,
-0.0036163330078125,
-0.0258941650390625,
0.0080413818359375,
0.0294342041015625,
-0.01525115966796875,
0.0237274169921875,
0.046234130859375,
0.047332763671875,
0.048126220703125,
-0.0124053955078125,
0.0030078887939453125,
-0.0411376953125,
0.04058837890625,
0.0032024383544921875,
0.0262451171875,
0.017120361328125,
-0.0247039794921875,
0.040069580078125,
0.025726318359375,
-0.043060302734375,
-0.031280517578125,
-0.001995086669921875,
-0.110595703125,
-0.0189208984375,
0.06500244140625,
-0.031524658203125,
-0.053314208984375,
0.01873779296875,
-0.037200927734375,
0.03277587890625,
-0.019439697265625,
0.004413604736328125,
0.0184783935546875,
-0.0043182373046875,
-0.0189971923828125,
-0.032257080078125,
0.043853759765625,
0.016357421875,
-0.06219482421875,
-0.05426025390625,
0.04022216796875,
0.0146636962890625,
0.0360107421875,
0.05615234375,
-0.008056640625,
0.0211639404296875,
-0.007747650146484375,
0.0294036865234375,
-0.01395416259765625,
-0.01092529296875,
-0.0408935546875,
-0.004878997802734375,
-0.01526641845703125,
-0.0380859375
]
] |
airesearch/wav2vec2-large-xlsr-53-th | 2022-03-23T18:24:45.000Z | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"hf-asr-leaderboard",
"robust-speech-event",
"speech",
"xlsr-fine-tuning",
"th",
"dataset:common_voice",
"doi:10.57967/hf/0404",
"license:cc-by-sa-4.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | airesearch | null | null | airesearch/wav2vec2-large-xlsr-53-th | 8 | 28,832 | transformers | 2022-03-02T23:29:05 | ---
language: th
datasets:
- common_voice
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
- robust-speech-event
- speech
- xlsr-fine-tuning
license: cc-by-sa-4.0
model-index:
- name: XLS-R-53 - Thai
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice 7
type: mozilla-foundation/common_voice_7_0
args: th
metrics:
- name: Test WER
type: wer
value: 0.9524
- name: Test SER
type: ser
value: 1.2346
- name: Test CER
type: cer
value: 0.1623
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Robust Speech Event - Dev Data
type: speech-recognition-community-v2/dev_data
args: sv
metrics:
- name: Test WER
type: wer
value: null
- name: Test SER
type: ser
value: null
- name: Test CER
type: cer
value: null
---
# `wav2vec2-large-xlsr-53-th`
Finetuning `wav2vec2-large-xlsr-53` on Thai [Common Voice 7.0](https://commonvoice.mozilla.org/en/datasets)
[Read more on our blog](https://medium.com/airesearch-in-th/airesearch-in-th-3c1019a99cd)
We finetune [wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) based on [Fine-tuning Wav2Vec2 for English ASR](https://colab.research.google.com/github/patrickvonplaten/notebooks/blob/master/Fine_tuning_Wav2Vec2_for_English_ASR.ipynb) using Thai examples of [Common Voice Corpus 7.0](https://commonvoice.mozilla.org/en/datasets). The notebooks and scripts can be found in [vistec-ai/wav2vec2-large-xlsr-53-th](https://github.com/vistec-ai/wav2vec2-large-xlsr-53-th). The pretrained model and processor can be found at [airesearch/wav2vec2-large-xlsr-53-th](https://huggingface.co/airesearch/wav2vec2-large-xlsr-53-th).
## `robust-speech-event`
We add `syllable_tokenize`, `word_tokenize` ([PyThaiNLP](https://github.com/PyThaiNLP/pythainlp)) and [deepcut](https://github.com/rkcosmos/deepcut) tokenizers to the `eval.py` script from [robust-speech-event](https://github.com/huggingface/transformers/tree/master/examples/research_projects/robust-speech-event#evaluation):
```bash
> python eval.py --model_id ./ --dataset mozilla-foundation/common_voice_7_0 --config th --split test --log_outputs --thai_tokenizer newmm/syllable/deepcut/cer
```
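For intuition, below is a minimal sketch of the tokenize-then-score idea behind these options; the helper is illustrative and not taken from the actual `eval.py` changes (`pythainlp.tokenize.syllable_tokenize` could be swapped in the same way):
```python
import deepcut
from datasets import load_metric
from pythainlp.tokenize import word_tokenize

wer_metric = load_metric("wer")

def thai_wer(predictions, references, tokenizer="newmm"):
    # WER needs word boundaries, so tokenize the Thai strings first,
    # then re-join with spaces before scoring
    if tokenizer == "deepcut":
        tok = deepcut.tokenize
    else:  # a PyThaiNLP word_tokenize engine such as "newmm"
        tok = lambda s: word_tokenize(s, engine=tokenizer)
    preds = [" ".join(tok(p)) for p in predictions]
    refs = [" ".join(tok(r)) for r in references]
    return wer_metric.compute(predictions=preds, references=refs)
```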
### Eval results on Common Voice 7 "test":
| | WER PyThaiNLP 2.3.1 | WER deepcut | SER | CER |
|---------------------------------|---------------------|-------------|---------|---------|
| Only Tokenization | 0.9524% | 2.5316% | 1.2346% | 0.1623% |
| Cleaning rules and Tokenization | TBD | TBD | TBD | TBD |
## Usage
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# load pretrained processor and model
processor = Wav2Vec2Processor.from_pretrained("airesearch/wav2vec2-large-xlsr-53-th")
model = Wav2Vec2ForCTC.from_pretrained("airesearch/wav2vec2-large-xlsr-53-th")

# function to resample audio to 16 kHz
def speech_file_to_array_fn(batch,
                            text_col="sentence",
                            fname_col="path",
                            resampling_to=16000):
    speech_array, sampling_rate = torchaudio.load(batch[fname_col])
    resampler = torchaudio.transforms.Resample(sampling_rate, resampling_to)
    batch["speech"] = resampler(speech_array)[0].numpy()
    batch["sampling_rate"] = resampling_to
    batch["target_text"] = batch[text_col]
    return batch

# load the Thai Common Voice 7.0 test split (may require accepting the dataset terms)
test_dataset = load_dataset("mozilla-foundation/common_voice_7_0", "th", split="test")

# get 2 examples as sample input
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)

# infer
with torch.no_grad():
    logits = model(inputs.input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)

print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
>> Prediction: ['และ เขา ก็ สัมผัส ดีบุก', 'คุณ สามารถ รับทราบ เมื่อ ข้อความ นี้ ถูก อ่าน แล้ว']
>> Reference: ['และเขาก็สัมผัสดีบุก', 'คุณสามารถรับทราบเมื่อข้อความนี้ถูกอ่านแล้ว']
```
## Datasets
[Common Voice Corpus 7.0](https://commonvoice.mozilla.org/en/datasets) contains 133 validated hours of Thai (255 total hours) at 5GB. We pre-tokenize with `pythainlp.tokenize.word_tokenize`. We preprocess the dataset using cleaning rules described in `notebooks/cv-preprocess.ipynb` by [@tann9949](https://github.com/tann9949). We then deduplicate and split as described in [ekapolc/Thai_commonvoice_split](https://github.com/ekapolc/Thai_commonvoice_split) in order to 1) avoid data leakage due to random splits after cleaning in [Common Voice Corpus 7.0](https://commonvoice.mozilla.org/en/datasets) and 2) preserve the majority of the data for the training set. The dataset loading script is `scripts/th_common_voice_70.py`. You can use this script together with `train_cleaned.tsv`, `validation_cleaned.tsv` and `test_cleaned.tsv` to obtain the same splits as we do (a short loading sketch follows the listing below). The resulting dataset is as follows:
```
DatasetDict({
train: Dataset({
features: ['path', 'sentence'],
num_rows: 86586
})
test: Dataset({
features: ['path', 'sentence'],
num_rows: 2502
})
validation: Dataset({
features: ['path', 'sentence'],
num_rows: 3027
})
})
```
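A minimal sketch of loading these splits; this assumes the `*_cleaned.tsv` files sit where `scripts/th_common_voice_70.py` expects them:
```python
from datasets import load_dataset

# assumes the cleaned split files are available locally as described above
datasets = load_dataset("scripts/th_common_voice_70.py")
print(datasets)  # DatasetDict with train / test / validation splits
```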
## Training
We finetuned using the following configuration on a single V100 GPU and chose the checkpoint with the lowest validation loss. The finetuning script is `scripts/wav2vec2_finetune.py`.
```python
from transformers import TrainingArguments, Wav2Vec2ForCTC

# create model
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-xlsr-53",
    attention_dropout=0.1,
    hidden_dropout=0.1,
    feat_proj_dropout=0.0,
    mask_time_prob=0.05,
    layerdrop=0.1,
    gradient_checkpointing=True,
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer)
)
model.freeze_feature_extractor()

# training configuration
training_args = TrainingArguments(
    output_dir="../data/wav2vec2-large-xlsr-53-thai",
    group_by_length=True,
    per_device_train_batch_size=32,
    gradient_accumulation_steps=1,
    per_device_eval_batch_size=16,
    metric_for_best_model='wer',
    evaluation_strategy="steps",
    eval_steps=1000,
    logging_strategy="steps",
    logging_steps=1000,
    save_strategy="steps",
    save_steps=1000,
    num_train_epochs=100,
    fp16=True,
    learning_rate=1e-4,
    warmup_steps=1000,
    save_total_limit=3,
    report_to="tensorboard"
)
```
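The model and arguments above can then be handed to a `Trainer`. A minimal sketch, assuming a CTC data collator, a WER `compute_metrics` function, and prepared datasets as in standard wav2vec2 finetuning recipes (these names are illustrative, not taken from the original script):
```python
from transformers import Trainer

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,      # pads input values and labels separately
    compute_metrics=compute_metrics,  # e.g. WER on decoded predictions
    train_dataset=prepared_datasets["train"],
    eval_dataset=prepared_datasets["validation"],
    tokenizer=processor.feature_extractor,
)
trainer.train()
```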
## Evaluation
We benchmark on the test set using WER (with words tokenized by [PyThaiNLP](https://github.com/PyThaiNLP/pythainlp) 2.3.1 and [deepcut](https://github.com/rkcosmos/deepcut)) and CER. We also measure performance when spell correction with [TNC](http://www.arts.chula.ac.th/ling/tnc/) ngrams is applied. Evaluation code can be found in `notebooks/wav2vec2_finetuning_tutorial.ipynb`. The benchmark is performed on the `test-unique` split.
| | WER PyThaiNLP 2.3.1 | WER deepcut | CER |
|--------------------------------|---------------------|----------------|----------------|
| [Kaldi from scratch](https://github.com/vistec-AI/commonvoice-th) | 23.04 | | 7.57 |
| Ours without spell correction | 13.634024 | **8.152052** | **2.813019** |
| Ours with spell correction | 17.996397 | 14.167975 | 5.225761 |
| Google Web Speech API※ | 13.711234 | 10.860058 | 7.357340 |
| Microsoft Bing Speech API※ | **12.578819** | 9.620991 | 5.016620 |
| Amazon Transcribe※ | 21.86334 | 14.487553 | 7.077562 |
| NECTEC AI for Thai Partii API※ | 20.105887 | 15.515631 | 9.551027 |
※ APIs are not finetuned with Common Voice 7.0 data
## LICENSE
[cc-by-sa 4.0](https://github.com/vistec-AI/wav2vec2-large-xlsr-53-th/blob/main/LICENSE)
## Acknowledgements
* Model training and validation notebooks/scripts by [@cstorm125](https://github.com/cstorm125/)
* Dataset cleaning scripts by [@tann9949](https://github.com/tann9949)
* Dataset splits by [@ekapolc](https://github.com/ekapolc/) and [@14mss](https://github.com/14mss)
* Running the training by [@mrpeerat](https://github.com/mrpeerat)
* Spell correction by [@wannaphong](https://github.com/wannaphong)
| 8,384 | [
[
-0.0309295654296875,
-0.045989990234375,
0.0042877197265625,
0.014984130859375,
-0.0227203369140625,
0.0002651214599609375,
-0.044647216796875,
-0.033782958984375,
0.0077972412109375,
0.01543426513671875,
-0.04986572265625,
-0.053985595703125,
-0.03857421875,
-0.01378631591796875,
0.00162506103515625,
0.05706787109375,
0.00799560546875,
0.014129638671875,
0.01397705078125,
-0.01526641845703125,
-0.045562744140625,
-0.017791748046875,
-0.0635986328125,
-0.016693115234375,
0.011077880859375,
0.0489501953125,
0.018280029296875,
0.033172607421875,
0.02618408203125,
0.0200347900390625,
-0.0198822021484375,
0.005306243896484375,
-0.0204620361328125,
0.005519866943359375,
0.01727294921875,
-0.035064697265625,
-0.0286102294921875,
0.00372314453125,
0.049102783203125,
0.0249481201171875,
-0.02142333984375,
0.024993896484375,
0.00655364990234375,
0.039459228515625,
-0.0309295654296875,
0.01483917236328125,
-0.051177978515625,
-0.006011962890625,
-0.0247650146484375,
-0.0008940696716308594,
-0.010467529296875,
-0.016998291015625,
0.01507568359375,
-0.03900146484375,
0.0176849365234375,
-0.014373779296875,
0.0794677734375,
0.0209503173828125,
-0.0117950439453125,
-0.023345947265625,
-0.039215087890625,
0.06390380859375,
-0.06561279296875,
0.042449951171875,
0.0340576171875,
0.0197296142578125,
-0.00247955322265625,
-0.07196044921875,
-0.054443359375,
-0.007213592529296875,
0.0227813720703125,
0.026885986328125,
-0.0224761962890625,
0.00719451904296875,
0.0225982666015625,
0.0203704833984375,
-0.0487060546875,
0.0263824462890625,
-0.054290771484375,
-0.04083251953125,
0.045745849609375,
-0.00997161865234375,
0.021942138671875,
-0.0148773193359375,
-0.01163482666015625,
-0.03887939453125,
-0.02801513671875,
0.033355712890625,
0.0256195068359375,
0.0277252197265625,
-0.03704833984375,
0.03558349609375,
-0.01383209228515625,
0.0299072265625,
0.004329681396484375,
-0.02423095703125,
0.05755615234375,
-0.029754638671875,
-0.01806640625,
0.03314208984375,
0.07159423828125,
0.02764892578125,
0.01187896728515625,
0.0273590087890625,
0.0052642822265625,
0.00994873046875,
-0.0230712890625,
-0.0665283203125,
-0.0229034423828125,
0.03973388671875,
-0.01513671875,
-0.01052093505859375,
-0.0023975372314453125,
-0.055328369140625,
0.0038852691650390625,
-0.02532958984375,
0.039337158203125,
-0.04376220703125,
-0.0245208740234375,
0.0082550048828125,
-0.0163726806640625,
0.029266357421875,
0.0036716461181640625,
-0.052764892578125,
0.0206298828125,
0.04754638671875,
0.06829833984375,
-0.0037593841552734375,
-0.0252227783203125,
-0.04522705078125,
-0.0293426513671875,
-0.009918212890625,
0.03399658203125,
-0.01055145263671875,
-0.0272674560546875,
-0.01068878173828125,
-0.0018320083618164062,
-0.01187896728515625,
-0.046722412109375,
0.064208984375,
-0.0096435546875,
0.0312347412109375,
-0.0169219970703125,
-0.0256195068359375,
-0.01134490966796875,
-0.00923919677734375,
-0.03399658203125,
0.09075927734375,
-0.0137786865234375,
-0.04998779296875,
0.031951904296875,
-0.04095458984375,
-0.041290283203125,
-0.0207366943359375,
-0.0009055137634277344,
-0.0511474609375,
-0.0222320556640625,
0.0295867919921875,
0.032318115234375,
-0.0272216796875,
0.0073089599609375,
-0.00759124755859375,
-0.03277587890625,
0.023162841796875,
-0.039520263671875,
0.076416015625,
0.026519775390625,
-0.056671142578125,
0.0007715225219726562,
-0.07928466796875,
0.0233917236328125,
-0.005802154541015625,
-0.03875732421875,
0.00959014892578125,
-0.0191650390625,
0.0238189697265625,
0.0218658447265625,
0.0089263916015625,
-0.04071044921875,
0.0010404586791992188,
-0.053802490234375,
0.045135498046875,
0.03887939453125,
-0.00287628173828125,
0.0103607177734375,
-0.01522064208984375,
0.0272979736328125,
0.0082550048828125,
-0.0012636184692382812,
0.00640869140625,
-0.041229248046875,
-0.06414794921875,
-0.0367431640625,
0.0311279296875,
0.0478515625,
-0.025115966796875,
0.054168701171875,
-0.025482177734375,
-0.061859130859375,
-0.0755615234375,
-0.005619049072265625,
0.03765869140625,
0.039093017578125,
0.042999267578125,
-0.0197296142578125,
-0.060150146484375,
-0.056884765625,
-0.01113128662109375,
-0.00104522705078125,
-0.0032253265380859375,
0.0330810546875,
0.032806396484375,
-0.0237884521484375,
0.05487060546875,
-0.0372314453125,
-0.03900146484375,
-0.022186279296875,
0.007328033447265625,
0.028533935546875,
0.049774169921875,
0.0243072509765625,
-0.0474853515625,
-0.0295257568359375,
-0.01548004150390625,
-0.0174713134765625,
-0.0098876953125,
0.00017750263214111328,
0.0033359527587890625,
0.0275726318359375,
0.02996826171875,
-0.03717041015625,
0.0222015380859375,
0.0271453857421875,
-0.0111846923828125,
0.044525146484375,
-0.0100555419921875,
0.01259613037109375,
-0.09552001953125,
0.01373291015625,
0.0014371871948242188,
0.0037250518798828125,
-0.035797119140625,
-0.031280517578125,
-0.01241302490234375,
-0.0065460205078125,
-0.040496826171875,
0.0261688232421875,
-0.03497314453125,
-0.0039520263671875,
0.0021457672119140625,
0.0243988037109375,
-0.007602691650390625,
0.04522705078125,
-0.0034694671630859375,
0.07061767578125,
0.055877685546875,
-0.044830322265625,
0.0288238525390625,
0.0168609619140625,
-0.03900146484375,
0.0252838134765625,
-0.057830810546875,
0.0293121337890625,
0.020538330078125,
0.02301025390625,
-0.0859375,
-0.01551055908203125,
0.0159454345703125,
-0.06781005859375,
0.0300140380859375,
-0.0187530517578125,
-0.03106689453125,
-0.037841796875,
-0.034393310546875,
0.025909423828125,
0.05621337890625,
-0.03948974609375,
0.0191192626953125,
0.03204345703125,
0.0011987686157226562,
-0.055023193359375,
-0.07061767578125,
-0.01088714599609375,
-0.0190582275390625,
-0.054840087890625,
0.02093505859375,
0.0033721923828125,
-0.001796722412109375,
-0.01509857177734375,
-0.0273284912109375,
-0.008026123046875,
-0.0037326812744140625,
0.0284881591796875,
0.01512908935546875,
-0.0174102783203125,
-0.00811004638671875,
-0.00165557861328125,
0.0009832382202148438,
-0.0029430389404296875,
-0.034149169921875,
0.05389404296875,
-0.004619598388671875,
-0.006465911865234375,
-0.07379150390625,
-0.007476806640625,
0.052001953125,
-0.033905029296875,
0.0312347412109375,
0.072509765625,
-0.027099609375,
-0.0012769699096679688,
-0.050384521484375,
-0.0003845691680908203,
-0.036041259765625,
0.057220458984375,
-0.024810791015625,
-0.0521240234375,
0.045013427734375,
0.0036640167236328125,
-0.01039886474609375,
0.056427001953125,
0.03759765625,
0.0034427642822265625,
0.06207275390625,
0.004749298095703125,
-0.0196533203125,
0.029266357421875,
-0.060760498046875,
0.0007882118225097656,
-0.078125,
-0.0367431640625,
-0.051483154296875,
-0.0195465087890625,
-0.056243896484375,
-0.035186767578125,
0.0172119140625,
0.007549285888671875,
-0.0143890380859375,
0.0301055908203125,
-0.06097412109375,
0.01023101806640625,
0.0523681640625,
0.0009794235229492188,
-0.0179901123046875,
0.0136871337890625,
-0.030792236328125,
-0.015289306640625,
-0.03765869140625,
-0.0174560546875,
0.0933837890625,
0.029296875,
0.03399658203125,
-0.00858306884765625,
0.04571533203125,
-0.005130767822265625,
-0.034149169921875,
-0.05645751953125,
0.034820556640625,
-0.01418304443359375,
-0.03076171875,
-0.0293731689453125,
-0.0244140625,
-0.072509765625,
0.012176513671875,
-0.01568603515625,
-0.06488037109375,
0.0172576904296875,
-0.0007390975952148438,
-0.0245208740234375,
0.01500701904296875,
-0.0550537109375,
0.0604248046875,
-0.0081634521484375,
-0.0166473388671875,
-0.02899169921875,
-0.054534912109375,
0.013916015625,
0.00820159912109375,
0.007808685302734375,
-0.00830841064453125,
0.022613525390625,
0.09332275390625,
-0.04345703125,
0.038330078125,
-0.0289306640625,
0.00817108154296875,
0.04266357421875,
-0.0275421142578125,
0.02947998046875,
-0.0037822723388671875,
-0.0112152099609375,
0.022552490234375,
0.0124053955078125,
-0.01117706298828125,
-0.0245208740234375,
0.06072998046875,
-0.08099365234375,
-0.012176513671875,
-0.030792236328125,
-0.0103759765625,
-0.00960540771484375,
0.01220703125,
0.054107666015625,
0.058624267578125,
-0.00308990478515625,
0.0308990478515625,
0.0452880859375,
-0.0222015380859375,
0.0198211669921875,
0.0222015380859375,
0.01128387451171875,
-0.0699462890625,
0.06646728515625,
0.026885986328125,
0.0179290771484375,
0.01509857177734375,
0.0159912109375,
-0.037445068359375,
-0.045166015625,
-0.01364898681640625,
0.01338958740234375,
-0.043609619140625,
-0.00982666015625,
-0.042572021484375,
-0.0192718505859375,
-0.06256103515625,
0.016693115234375,
-0.042083740234375,
-0.037445068359375,
-0.032684326171875,
-0.0024356842041015625,
0.037384033203125,
0.023284912109375,
-0.0185699462890625,
0.024139404296875,
-0.02850341796875,
0.025970458984375,
0.01274871826171875,
0.013641357421875,
-0.004680633544921875,
-0.062744140625,
-0.02032470703125,
0.0214385986328125,
-0.0175018310546875,
-0.0462646484375,
0.021484375,
0.017852783203125,
0.024688720703125,
0.02178955078125,
-0.003421783447265625,
0.06683349609375,
-0.02392578125,
0.08026123046875,
0.0183868408203125,
-0.07562255859375,
0.062469482421875,
-0.008697509765625,
0.027191162109375,
0.057098388671875,
0.00472259521484375,
-0.051239013671875,
-0.0167083740234375,
-0.050689697265625,
-0.08660888671875,
0.0755615234375,
0.0244598388671875,
-0.01505279541015625,
0.0163421630859375,
0.01983642578125,
-0.015167236328125,
0.0009622573852539062,
-0.0301055908203125,
-0.04522705078125,
-0.0217742919921875,
-0.02508544921875,
-0.0181427001953125,
-0.0225677490234375,
0.0076446533203125,
-0.0401611328125,
0.0740966796875,
0.0194854736328125,
0.0203857421875,
0.043609619140625,
-0.0027179718017578125,
0.009735107421875,
0.023529052734375,
0.06109619140625,
0.02276611328125,
-0.020477294921875,
0.0012359619140625,
0.0285491943359375,
-0.07562255859375,
0.01019287109375,
0.02191162109375,
0.003795623779296875,
0.00707244873046875,
0.0283355712890625,
0.084228515625,
0.0122528076171875,
-0.036285400390625,
0.03448486328125,
-0.013580322265625,
-0.0352783203125,
-0.039703369140625,
0.01328277587890625,
0.01065826416015625,
0.01177215576171875,
0.0311737060546875,
0.00959014892578125,
0.00276947021484375,
-0.0311279296875,
0.0159454345703125,
0.017242431640625,
-0.03070068359375,
-0.0225067138671875,
0.068115234375,
0.01222991943359375,
-0.030853271484375,
0.049591064453125,
-0.0040435791015625,
-0.050750732421875,
0.051361083984375,
0.0239105224609375,
0.0777587890625,
-0.0170135498046875,
-0.01526641845703125,
0.051025390625,
0.01442718505859375,
-0.01346588134765625,
0.0298614501953125,
-0.01241302490234375,
-0.031768798828125,
-0.02197265625,
-0.040740966796875,
-0.00841522216796875,
0.03399658203125,
-0.051361083984375,
0.03857421875,
-0.0275421142578125,
-0.0171966552734375,
0.011688232421875,
0.04095458984375,
-0.05731201171875,
0.0290985107421875,
0.009033203125,
0.06689453125,
-0.04461669921875,
0.0723876953125,
0.0306549072265625,
-0.0357666015625,
-0.1029052734375,
0.006038665771484375,
-0.00044918060302734375,
-0.04803466796875,
0.042877197265625,
0.0244598388671875,
-0.0212554931640625,
-0.002288818359375,
-0.034637451171875,
-0.06622314453125,
0.09857177734375,
0.0269012451171875,
-0.05572509765625,
0.0180206298828125,
0.017364501953125,
0.0352783203125,
-0.0020885467529296875,
0.0173492431640625,
0.055023193359375,
0.03668212890625,
0.0111541748046875,
-0.09765625,
0.0012998580932617188,
-0.0211029052734375,
-0.0195465087890625,
-0.011322021484375,
-0.055450439453125,
0.054656982421875,
-0.01605224609375,
-0.01509857177734375,
0.0011844635009765625,
0.05096435546875,
0.032318115234375,
0.0389404296875,
0.036529541015625,
0.05609130859375,
0.08123779296875,
-0.0097198486328125,
0.042877197265625,
-0.0151824951171875,
0.02716064453125,
0.08447265625,
-0.00388336181640625,
0.07012939453125,
0.0216217041015625,
-0.0238494873046875,
0.0239105224609375,
0.04364013671875,
-0.011688232421875,
0.047515869140625,
0.01654052734375,
-0.01300811767578125,
0.004390716552734375,
0.01036834716796875,
-0.05035400390625,
0.060028076171875,
0.00962066650390625,
-0.004093170166015625,
0.0175933837890625,
0.01026153564453125,
0.018829345703125,
-0.0131072998046875,
-0.01117706298828125,
0.0498046875,
0.0032634735107421875,
-0.053863525390625,
0.0712890625,
0.0028629302978515625,
0.0740966796875,
-0.038665771484375,
0.01079559326171875,
0.003810882568359375,
0.0258941650390625,
-0.022796630859375,
-0.047149658203125,
0.0002689361572265625,
-0.00030612945556640625,
-0.005481719970703125,
0.005947113037109375,
0.036407470703125,
-0.044525146484375,
-0.02850341796875,
0.02081298828125,
-0.00047135353088378906,
0.0264892578125,
0.0020847320556640625,
-0.050018310546875,
0.02001953125,
0.0221099853515625,
-0.0206298828125,
0.002292633056640625,
0.031402587890625,
0.018157958984375,
0.0352783203125,
0.055419921875,
0.0308990478515625,
0.01297760009765625,
0.0154266357421875,
0.04400634765625,
-0.039886474609375,
-0.04266357421875,
-0.056427001953125,
0.036529541015625,
-0.0106048583984375,
-0.04364013671875,
0.0504150390625,
0.07586669921875,
0.0670166015625,
0.0138702392578125,
0.059906005859375,
0.0059356689453125,
0.045013427734375,
-0.04791259765625,
0.06829833984375,
-0.03741455078125,
0.0209503173828125,
-0.027099609375,
-0.04791259765625,
-0.0112457275390625,
0.045623779296875,
-0.0148468017578125,
0.0009937286376953125,
0.038330078125,
0.06683349609375,
0.0017538070678710938,
-0.0148162841796875,
0.0026149749755859375,
0.025238037109375,
0.0289306640625,
0.04498291015625,
0.03033447265625,
-0.062042236328125,
0.0623779296875,
-0.0269927978515625,
-0.0006594657897949219,
-0.0123443603515625,
-0.0274200439453125,
-0.047515869140625,
-0.0643310546875,
-0.0416259765625,
-0.039947509765625,
-0.0006437301635742188,
0.0830078125,
0.050933837890625,
-0.056121826171875,
-0.0229034423828125,
-0.0036296844482421875,
0.0009775161743164062,
-0.0215911865234375,
-0.0191650390625,
0.059814453125,
-0.00655364990234375,
-0.055633544921875,
0.03411865234375,
-0.0061798095703125,
0.0046539306640625,
0.00899505615234375,
-0.01271820068359375,
-0.01432037353515625,
-0.008636474609375,
0.0168914794921875,
0.023529052734375,
-0.058258056640625,
-0.0148468017578125,
-0.0027523040771484375,
-0.00604248046875,
0.0197906494140625,
0.0285491943359375,
-0.05120849609375,
0.035980224609375,
0.034576416015625,
0.0169219970703125,
0.057098388671875,
-0.0236663818359375,
0.01337432861328125,
-0.044708251953125,
0.026763916015625,
0.00946044921875,
0.0218963623046875,
0.025970458984375,
-0.0170745849609375,
0.0243682861328125,
0.0208740234375,
-0.0411376953125,
-0.0762939453125,
-0.0234527587890625,
-0.099365234375,
-0.0018558502197265625,
0.0999755859375,
0.00908660888671875,
-0.01788330078125,
0.0103759765625,
-0.03875732421875,
0.057769775390625,
-0.036773681640625,
0.032379150390625,
0.03790283203125,
0.00472259521484375,
0.0077667236328125,
-0.042205810546875,
0.05023193359375,
0.025634765625,
-0.037261962890625,
0.0009546279907226562,
0.0248260498046875,
0.051971435546875,
0.006740570068359375,
0.058685302734375,
-0.005802154541015625,
0.0263824462890625,
-0.0045318603515625,
0.019073486328125,
-0.00994110107421875,
-0.0036296844482421875,
-0.0478515625,
-0.0007486343383789062,
-0.0019426345825195312,
-0.056884765625
]
] |
pszemraj/led-large-book-summary | 2023-10-05T06:56:50.000Z | [
"transformers",
"pytorch",
"safetensors",
"led",
"text2text-generation",
"summarization",
"summary",
"longformer",
"booksum",
"long-document",
"long-form",
"en",
"dataset:kmfoda/booksum",
"arxiv:2105.08209",
"doi:10.57967/hf/0101",
"license:apache-2.0",
"license:bsd-3-clause",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | summarization | pszemraj | null | null | pszemraj/led-large-book-summary | 58 | 28,764 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
license:
- apache-2.0
- bsd-3-clause
tags:
- summarization
- led
- summary
- longformer
- booksum
- long-document
- long-form
datasets:
- kmfoda/booksum
metrics:
- rouge
widget:
- text: large earthquakes along a given fault segment do not occur at random intervals
because it takes time to accumulate the strain energy for the rupture. The rates
at which tectonic plates move and accumulate strain at their boundaries are approximately
uniform. Therefore, in first approximation, one may expect that large ruptures
of the same fault segment will occur at approximately constant time intervals.
If subsequent main shocks have different amounts of slip across the fault, then
the recurrence time may vary, and the basic idea of periodic mainshocks must be
modified. For great plate boundary ruptures the length and slip often vary by
a factor of 2. Along the southern segment of the San Andreas fault the recurrence
interval is 145 years with variations of several decades. The smaller the standard
deviation of the average recurrence interval, the more specific could be the long
term prediction of a future mainshock.
example_title: earthquakes
- text: ' A typical feed-forward neural field algorithm. Spatiotemporal coordinates
are fed into a neural network that predicts values in the reconstructed domain.
Then, this domain is mapped to the sensor domain where sensor measurements are
available as supervision. Class and Section Problems Addressed Generalization
(Section 2) Inverse problems, ill-posed problems, editability; symmetries. Hybrid
Representations (Section 3) Computation & memory efficiency, representation capacity,
editability: Forward Maps (Section 4) Inverse problems Network Architecture (Section
5) Spectral bias, integration & derivatives. Manipulating Neural Fields (Section
6) Edit ability, constraints, regularization. Table 2: The five classes of techniques
in the neural field toolbox each addresses problems that arise in learning, inference,
and control. (Section 3). We can supervise reconstruction via differentiable forward
maps that transform Or project our domain (e.g, 3D reconstruction via 2D images;
Section 4) With appropriate network architecture choices, we can overcome neural
network spectral biases (blurriness) and efficiently compute derivatives and integrals
(Section 5). Finally, we can manipulate neural fields to add constraints and regularizations,
and to achieve editable representations (Section 6). Collectively, these classes
constitute a ''toolbox'' of techniques to help solve problems with neural fields
There are three components in a conditional neural field: (1) An encoder or inference
function € that outputs the conditioning latent variable 2 given an observation
0 E(0) =2. 2 is typically a low-dimensional vector, and is often referred to aS
a latent code Or feature code_ (2) A mapping function 4 between Z and neural field
parameters O: Y(z) = O; (3) The neural field itself $. The encoder € finds the
most probable z given the observations O: argmaxz P(2/0). The decoder maximizes
the inverse conditional probability to find the most probable 0 given Z: arg-
max P(Olz). We discuss different encoding schemes with different optimality guarantees
(Section 2.1.1), both global and local conditioning (Section 2.1.2), and different
mapping functions Y (Section 2.1.3) 2. Generalization Suppose we wish to estimate
a plausible 3D surface shape given a partial or noisy point cloud. We need a suitable
prior over the sur- face in its reconstruction domain to generalize to the partial
observations. A neural network expresses a prior via the function space of its
architecture and parameters 0, and generalization is influenced by the inductive
bias of this function space (Section 5).'
example_title: scientific paper
- text: ' the big variety of data coming from diverse sources is one of the key properties
of the big data phenomenon. It is, therefore, beneficial to understand how data
is generated in various environments and scenarios, before looking at what should
be done with this data and how to design the best possible architecture to accomplish
this The evolution of IT architectures, described in Chapter 2, means that the
data is no longer processed by a few big monolith systems, but rather by a group
of services In parallel to the processing layer, the underlying data storage has
also changed and became more distributed This, in turn, required a significant
paradigm shift as the traditional approach to transactions (ACID) could no longer
be supported. On top of this, cloud computing is becoming a major approach with
the benefits of reducing costs and providing on-demand scalability but at the
same time introducing concerns about privacy, data ownership, etc In the meantime
the Internet continues its exponential growth: Every day both structured and unstructured
data is published and available for processing: To achieve competitive advantage
companies have to relate their corporate resources to external services, e.g.
financial markets, weather forecasts, social media, etc While several of the sites
provide some sort of API to access the data in a more orderly fashion; countless
sources require advanced web mining and Natural Language Processing (NLP) processing
techniques: Advances in science push researchers to construct new instruments
for observing the universe O conducting experiments to understand even better
the laws of physics and other domains. Every year humans have at their disposal
new telescopes, space probes, particle accelerators, etc These instruments generate
huge streams of data, which need to be stored and analyzed. The constant drive
for efficiency in the industry motivates the introduction of new automation techniques
and process optimization: This could not be done without analyzing the precise
data that describe these processes. As more and more human tasks are automated,
machines provide rich data sets, which can be analyzed in real-time to drive efficiency
to new levels. Finally, it is now evident that the growth of the Internet of Things
is becoming a major source of data. More and more of the devices are equipped
with significant computational power and can generate a continuous data stream
from their sensors. In the subsequent sections of this chapter, we will look at
the domains described above to see what they generate in terms of data sets. We
will compare the volumes but will also look at what is characteristic and important
from their respective points of view. 3.1 The Internet is undoubtedly the largest
database ever created by humans. While several well described; cleaned, and structured
data sets have been made available through this medium, most of the resources
are of an ambiguous, unstructured, incomplete or even erroneous nature. Still,
several examples in the areas such as opinion mining, social media analysis, e-governance,
etc, clearly show the potential lying in these resources. Those who can successfully
mine and interpret the Internet data can gain unique insight and competitive advantage
in their business An important area of data analytics on the edge of corporate
IT and the Internet is Web Analytics.'
example_title: data science textbook
- text: 'Transformer-based models have shown to be very useful for many NLP tasks.
However, a major limitation of transformers-based models is its O(n^2)O(n 2) time
& memory complexity (where nn is sequence length). Hence, it''s computationally
very expensive to apply transformer-based models on long sequences n > 512n>512.
Several recent papers, e.g. Longformer, Performer, Reformer, Clustered attention
try to remedy this problem by approximating the full attention matrix. You can
checkout 🤗''s recent blog post in case you are unfamiliar with these models.
BigBird (introduced in paper) is one of such recent models to address this issue.
BigBird relies on block sparse attention instead of normal attention (i.e. BERT''s
attention) and can handle sequences up to a length of 4096 at a much lower computational
cost compared to BERT. It has achieved SOTA on various tasks involving very long
sequences such as long documents summarization, question-answering with long contexts.
BigBird RoBERTa-like model is now available in 🤗Transformers. The goal of this
post is to give the reader an in-depth understanding of big bird implementation
& ease one''s life in using BigBird with 🤗Transformers. But, before going into
more depth, it is important to remember that the BigBird''s attention is an approximation
of BERT''s full attention and therefore does not strive to be better than BERT''s
full attention, but rather to be more efficient. It simply allows to apply transformer-based
models to much longer sequences since BERT''s quadratic memory requirement quickly
becomes unbearable. Simply put, if we would have ∞ compute & ∞ time, BERT''s attention
would be preferred over block sparse attention (which we are going to discuss
in this post).
If you wonder why we need more compute when working with longer sequences, this
blog post is just right for you!
Some of the main questions one might have when working with standard BERT-like
attention include:
Do all tokens really have to attend to all other tokens? Why not compute attention
only over important tokens? How to decide what tokens are important? How to attend
to just a few tokens in a very efficient way? In this blog post, we will try to
answer those questions.
What tokens should be attended to? We will give a practical example of how attention
works by considering the sentence ''BigBird is now available in HuggingFace for
extractive question answering''. In BERT-like attention, every word would simply
attend to all other tokens.
Let''s think about a sensible choice of key tokens that a queried token actually
only should attend to by writing some pseudo-code. Will will assume that the token
available is queried and build a sensible list of key tokens to attend to.
>>> # let''s consider following sentence as an example >>> example = [''BigBird'',
''is'', ''now'', ''available'', ''in'', ''HuggingFace'', ''for'', ''extractive'',
''question'', ''answering'']
>>> # further let''s assume, we''re trying to understand the representation of
''available'' i.e. >>> query_token = ''available'' >>> # We will initialize an
empty `set` and fill up the tokens of our interest as we proceed in this section.
>>> key_tokens = [] # => currently ''available'' token doesn''t have anything
to attend Nearby tokens should be important because, in a sentence (sequence of
words), the current word is highly dependent on neighboring past & future tokens.
This intuition is the idea behind the concept of sliding attention.'
example_title: bigbird blog intro
- text: 'The majority of available text summarization datasets include short-form
source documents that lack long-range causal and temporal dependencies, and often
contain strong layout and stylistic biases. While relevant, such datasets will
offer limited challenges for future generations of text summarization systems.
We address these issues by introducing BookSum, a collection of datasets for long-form
narrative summarization. Our dataset covers source documents from the literature
domain, such as novels, plays and stories, and includes highly abstractive, human
written summaries on three levels of granularity of increasing difficulty: paragraph-,
chapter-, and book-level. The domain and structure of our dataset poses a unique
set of challenges for summarization systems, which include: processing very long
documents, non-trivial causal and temporal dependencies, and rich discourse structures.
To facilitate future work, we trained and evaluated multiple extractive and abstractive
summarization models as baselines for our dataset.'
example_title: BookSum Abstract
inference:
parameters:
max_length: 64
min_length: 8
no_repeat_ngram_size: 3
early_stopping: true
repetition_penalty: 3.5
length_penalty: 0.3
encoder_no_repeat_ngram_size: 3
num_beams: 4
model-index:
- name: pszemraj/led-large-book-summary
results:
- task:
type: summarization
name: Summarization
dataset:
name: kmfoda/booksum
type: kmfoda/booksum
config: kmfoda--booksum
split: test
metrics:
- type: rouge
value: 31.7308
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNjJmZjMxYTY0OGU3MzNjNmIzNmYyODNlNDg2ZGRhZDAzNTMwMDM5YWMxODc1OTc1ZWE3MzM2OTg1ODFhZDBkNCIsInZlcnNpb24iOjF9.B8BCKgySYVZW910_1zP0LfCpQYJbAe6loyWut76JlgZb2kV1_x9ybqtNESX0ka-lNqhYyXUNDpuS-7pTmsJVDg
- type: rouge
value: 5.3311
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzViMmY4ODFjYTc5ODk5MmRhMDQ3ZDRiYWQwMDg0OTk3ZTA4NDAxYTNiNDgyMmI4NDA3ZDMwYWViOTBkODBjNyIsInZlcnNpb24iOjF9.MOhJLDcgvv93mVFL1igIgIiTAH3b2Xa4gmBObq7RF44Mmu8Kxtd1KP7rOlDVFOrtrsooGPGsyE1GMCQ2kqeMDg
- type: rouge
value: 16.1465
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNzNjMzEwMTliZGE3ZmQ4M2UxMDAyMTY3YzJjZmMyMDYyN2YyNDM0N2VhNzI1MDc1YTg4MTRjMmEzNjVkNTk1NCIsInZlcnNpb24iOjF9.XLJ-DVKiYLlbw5E5rWADKbzUzf5fNHhlTCWPCC5dU4NI9Yeh76aR7TPt36ZzLDwTBknnR8KHqlaF8F8YAvBUAg
- type: rouge
value: 29.0883
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTcwNzEwMmE5NjQxZTkzYmQyZDZmNzllYzYyNGI5OTMyNWMwNjdiM2I2YmM5YjdmY2E5OWQ3OTk3ZDA1MTc3YyIsInZlcnNpb24iOjF9.d6rFxjCB6RJNI_pn2DNNSjuZe4rdvj0RatkaTJRp5lP0F_AFfU5Zn9zRWzZJV7V-xMauIc4UhfdoLp9r_-CABA
- type: loss
value: 4.815707206726074
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTMwMTgxMmJkODY3MjkzOWJhMzJhOTIxMWVkODhjZmM0MWUzMWQ1N2JkZjRhOTQxNmU1YWVjYzQ0MDNlZWI3OSIsInZlcnNpb24iOjF9.mkBQHYhYFfDV6F4klXGJ1dSsF-pbCs-6F9zcw6IYznwmXUjtk7m5J4Zt4JAju5LKz4YizvEcUCl_L0WddnfvDA
- type: gen_len
value: 154.9036
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTc0ZmM1ZDM4MDE0MzY3MDM3OWJhNDkzZjJkZDdkMjU5M2JmMDJjYTIxODA1OTllNmY5ZWQzZDlmNWFiYzk4NiIsInZlcnNpb24iOjF9.VQ_O_xSTz870tnM08PJXQOwg9OsNNwI_HVX4S7AuW57_FzGGyRaWSuGE5SWzRS4Tur9YP0QxV4VV0Yoaoi3IAA
- task:
type: summarization
name: Summarization
dataset:
name: samsum
type: samsum
config: samsum
split: test
metrics:
- type: rouge
value: 33.4484
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTk4Yjg1YTc4YmY0MzBiZDU4ZjFhNzI4MjZkMWU1MzBlOWNlMjQ5ODMzY2YzYzRhYjJkMGUzNmI3ZjdkMzIzZSIsInZlcnNpb24iOjF9.AqS8A1OUiM0IZFBEGirv5F3Novk8lSUYSfPc3bYWLA6t-W7wgup3qA207eGbE5j9CkDWZ7QrSG1U6Z9A0sOqAA
- type: rouge
value: 10.4249
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2U4NjUyNTFmOGM5OTlhZDMyMTlmM2E4OWI2NGFiMDAyMGJjMzRjNWNlMGEyYWFmNTE5ZWMxM2I0ZGZmNWNmOCIsInZlcnNpb24iOjF9.SgJcHJ4qoRWXFvFiwv1PUutWktvsxQNynVPEv-GtBgxd6WI7o561ONyco5U-5tcyE_1SbSCJzz-L-R-q3cvoDA
- type: rouge
value: 24.5802
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZmQ5MDI5MzdiNGE5NDM0MmU5OThmZTBkNjkxMzg5N2IxNGVlODdhZTZhNjg3NzFjYWEyMzA3MTQxNjMyMjRkOCIsInZlcnNpb24iOjF9.Bg5dHqCcJjmxa-xGWNR5lD9g3quX7lKkH0pjiTd2xE5WiPoLLN2c0mYa2GovdW7__WnYwhhHC7es03jmvyZbCw
- type: rouge
value: 29.8226
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNGFhOTEwNGM1MmZkNDk2ZjQ1Y2MyNjM3MGI5MGY3MWVkM2I0MjU2NWFiYmEwMjE4MTJlZWIwOGQ2MjQ3YjgzYSIsInZlcnNpb24iOjF9.W_aQKs10oXQdKEczJBGM3iiwJgb-VaXTpyA3sGof5WbhHf9vITAQA-xvynh5LgKtXQ1zjx737hnHgjEsu_Y0Cw
- type: loss
value: 4.176078796386719
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiN2JhODQ5YTZkNDZkZGYyNGU2MzkxMWU5MTEwMGM2YmVjZTA5YzI5NTMxMDNhYjhlOTAxMzFiMDYwYmM0MjEzZCIsInZlcnNpb24iOjF9.OvZrPBOR5jhkoTGBgsInkH7j3_xpacXHDoT7UIXEnyXzadfBO-O-K6fjalLNZw8wSkbjHIFcL_6S_qTTxPsNAQ
- type: gen_len
value: 65.4005
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiM2NhYjc3ZjQzNDEwYmMzOTM0ODkyZTJhZWNhNzZhYmEyZTYxMzA2YTYzMWFjOTA5ZjlhYWMzODg3NzY1ZTUwYSIsInZlcnNpb24iOjF9.vk9bgmtQFeRwdY3VXjtrJr_5wUCIeoAkI3kO0cHxhxmJo6RvUnyXiut72FuB-mlLZvqgiNkaZ-u_bh0Z3DjuCw
- task:
type: summarization
name: Summarization
dataset:
name: billsum
type: billsum
config: default
split: test
metrics:
- type: rouge
value: 40.5843
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTVjMDkyMWZjYTQ0NzgzNGUxZjNiMTg3NjU1MWJlNTQ2MWQ1NjE1MDk1OTU4ZjJiNGQ5ODg3Y2VlMWUyMzllNyIsInZlcnNpb24iOjF9.OhqBcVIuHk7fzmdrsWMvUe1bLeVMZVstZUoZpP7C1vR-3aIDl7r6eBmPrt5w-KcNq5p4teNPBsq7oKzbd5ZgDQ
- type: rouge
value: 17.3401
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNGQxYmQzMmE0OTcyNTM5NmMwNjIxNzYxZDcwMDFkYzJkOWY4YWY3NTdhZGRhZDdlMDAxNzcwODQ5OGM3Mzc1MCIsInZlcnNpb24iOjF9.Pksn25EEqvmx757N7Swrd4yXc_xU7-AMN9yNe8lrbBa-l1LoI_2PUASvnjML4f705cfuyMAfb0FkFp5WfER2AA
- type: rouge
value: 25.1256
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjhjYzI5MDBiMjk2NTY3MDNmZTdiOGYwMTRlYjIwZjAwMjdlNTAyYzdhYTJlODQ4MjYzYmQ3MjRlYTA2YzhhZSIsInZlcnNpb24iOjF9.1jPepsweS2bzIqDverQzzhmhFGch7gpoEGFGqQ8zW7K10aUKWFX8lt-uZAmTa1Z5ZhzyXGBzc3dReFPhWRRJBg
- type: rouge
value: 34.6619
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiM2VkZDIxNWJjOTA0NzFjOTIwOTdjYjc1M2EyNDVjZjY2ZjY3MjIxNDk3YTc5YWExNzAwN2FhOTc1NjVhYjBkYiIsInZlcnNpb24iOjF9.8opqHSUckPohoSF9jfPTpXDz2AtDwvdMqOdIXx2kE1tkOcbLPbOBfcc8RhRR98y8S26yC6EYFhFnf03CV2ejAQ
- type: loss
value: 4.792657375335693
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYTY5ZTRkMGU3OGVkODMzMDU5OWE1NTM5YjA4NDliZDlmNzc2NzZjNjFmNTA3M2EwY2NmN2E0MWJmZjQ5ZDliMiIsInZlcnNpb24iOjF9.KCKdk8xt2NWcMmYKV3-9eVEsFm9MqGllSMu9QCFJFIQlnyNXllHKdBLouoaGQz8IRYXvZKH8_TLDPIQx-31jAg
- type: gen_len
value: 163.9394
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzdkZDYyZGUzYmFkZmI2NjUwYmQ0MzZjMmIyZjI1YTFiMzM4OThiZjBiMzljOTVkZTgwMjA0NTE5OGM2YmFjMiIsInZlcnNpb24iOjF9.XyMZLUdkUIF32KTJMuv_bJswQCx_Tfg4Fx823cURUixSeoIKps8_a634AreZ3Z8kb7bfE_sFGh3rM9KWsMxlDw
- task:
type: summarization
name: Summarization
dataset:
name: multi_news
type: multi_news
config: default
split: test
metrics:
- type: rouge
value: 39.0834
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNjYzMmVlMDM4MTNkMTI4MjAyMTU2YTg1ZWQwNTI1MmJlNGUwZmE1NTRmYTljZTQwY2RlMjcxOTgyZGMyYTc0ZiIsInZlcnNpb24iOjF9.6yuSr7UmsFatwqQ-mEO4gmsEtWI05kGB5Ib2pnl05H1OiPT2uUwmqdUytUw8KTx9u1jv9q0cTF1cL-n2kPEJAA
- type: rouge
value: 11.4043
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMWI5N2U2ZWI1ODM2MWUwOTIzYTAzNmRhNDA2OWEzZWRjMGEzMjBmY2EwN2YyYzU1NWE0YjIyZDE3MWE0MmMxZCIsInZlcnNpb24iOjF9.wonuxbBl25TzEaHUH_E816nHJ1OSXKfkaq7eJzbLpsfeGwcDklxUSxZxRO7VBiBMaY3Qttf9ywmEIPp40HnpBA
- type: rouge
value: 19.1813
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZjU1NDZhN2NkMzZiZGJkODE4NDZiYjViOTZkNGMyNDlkNjBlZmFjYzU1N2IzMjFjYjY1MDU1Zjk2MzA0M2U4NyIsInZlcnNpb24iOjF9.bTCRzv3J9NiCh4aV23tAWGTvrdQCv_RS40zGwC4AJXtGS40cY7tJHYwBf9U9_rCetDBxqfjJpdaUbCAOglxLAA
- type: rouge
value: 35.1581
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMDNhNTUyZjE4NjYxYjIzYThmMDM2YWNhM2QwYzY1ODI2ZTE3NmNjMmVhOTAzZjZlOWQwYzc1NzU2NDNjNzIxMyIsInZlcnNpb24iOjF9.cWlSbEBgrMN5D-fV_yL9geNMyMkIItcVO3wehNJPzFi3E0v1-4q8pnX-UgjLzto8X7JLi6as2V_HtZE4-C-CDw
- type: loss
value: 4.654905319213867
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYTc5Nzk0ODhiNWUzNTAxNzk2YzZmMjU2NDliY2UzOTYyYTdmZGEyYjI5NDNhOTE0MGUxOTgxMGVjMmNhM2UyMSIsInZlcnNpb24iOjF9.eBBAebcl3AwkrjR6a8BvoSjDfpw8LWTRFjyIFHVzspvoOKVfnO8_NB_UeR_K127OwXyoZ70Z7X_aKJOe-2kTDA
- type: gen_len
value: 186.2494
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOWI2NjVlYjgwYWJiMjcyMDUzMzEwNDNjZTMxMDM0MjAzMzk1ZmIwY2Q1ZDQ2Y2M5NDBlMDEzYzFkNWEyNzJmNiIsInZlcnNpb24iOjF9.iZ1Iy7FuWL4GH7LS5EylVj5eZRC3L2ZsbYQapAkMNzR_VXPoMGvoM69Hp-kU7gW55tmz2V4Qxhvoz9cM8fciBA
- task:
type: summarization
name: Summarization
dataset:
name: cnn_dailymail
type: cnn_dailymail
config: 3.0.0
split: test
metrics:
- type: rouge
value: 32.8774
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYWVlNjQzNWU1NTgyNTk2MzdhMDkyM2U3N2UzYzQ3ODJmOTJiMGViZDc0NzNiNDlmZGZmNTQzZmNjYTFjMzJmMCIsInZlcnNpb24iOjF9.qA54KJrGf79XCLnDrAMPp0saErVL_zKicLso9ZX2xxNdCANGExal5PFmmTT7aw7TUdkmUsNhmIRI9cBZ8J_1BA
- type: rouge
value: 13.3706
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDMzZWVjZmQ4ZWI2MWZmMGEzNjJhY2JmZjJhZTYwMTk2OTM2ODhlMmFmYmMxZGUyZWQzMmUxYzA0ZjJiMjcwYiIsInZlcnNpb24iOjF9.03Di-BfbZoWAVqRJc3x37Tn1Ae6vtZWymZL2w1ob8OQ8iOggYwmDmNQwv-bCXjT7fLjXYvh9uTndYsL05nj_Ag
- type: rouge
value: 20.4365
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYjI5YzdjZmM0YmZjYTU0OTg3ZTRjZWZkYTU2NzhlZjkwNGE2YmUzYzI1OThjMDUxOTcyNzk3ZTUyNmIzMWYzZCIsInZlcnNpb24iOjF9.LDg9lCKTh74kilxRBpunGSeOXJohaICXWjNf525ck-1h21AtjIQB8U7BTm80eyNRe7yIQpAlgOruCAxRqpTHDw
- type: rouge
value: 30.4408
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTZhMGJjMzg0MzQxY2U2ZTIzYTYzOGRhMGEyYjY1ZjQyZjNmNGIwMzFjOWJjNzU2NWQzMzc1Y2IxYWZkZGY5YyIsInZlcnNpb24iOjF9.LkvaIEsw0U-osBR--46f7rsF-s1fcu19Z22DkvwiMwWJj9AnsUwDWNcCecIyi5tziQpUx0PpZEKyXAhCrVx1Bw
- type: loss
value: 5.3488945960998535
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTc4Y2JlZWRlNDRkOTI4ODQyZjBlMjU5NmUyZTZmNzJjYTg0NjM1YzI4NzUzYjhmODBkY2U4NGJiMTlhYTc2ZiIsInZlcnNpb24iOjF9.CB6oO5j3cKJPOelM8pwT2lTenp5bZTkBFC5MPYW_nus-O5F1s4DaY-gdSUK3baTkMXbQ2yqaI_g_QAfNVmqhDQ
- type: gen_len
value: 181.8326
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOThmMGNlMGEwYjljMmNiZjdkMjc5NzZhNTYwMzAzOWFkYzA1NzZiNTIyN2IxNDJmOTk4MDliYzY2YjdjNGY4MSIsInZlcnNpb24iOjF9._buvRpxKLuKNNtOmALbFm3-nWCs2NCLh1l8gfVqDmKmv8JqJHQ27cdgZ4mklPLYOUhf6YWjby5_lp3ZGEctkCQ
---
# led-large-book-summary
<a href="https://colab.research.google.com/gist/pszemraj/3eba944ddc9fc9a4a1bfb21e83b57620/summarization-token-batching.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
This model is a fine-tuned version of [allenai/led-large-16384](https://huggingface.co/allenai/led-large-16384) on the `BookSum` dataset (`kmfoda/booksum`). It aims to generalize well and be useful in summarizing lengthy text for both academic and everyday purposes.
- Handles inputs of up to 16,384 tokens
- See the Colab demo linked above or try the [demo on Spaces](https://huggingface.co/spaces/pszemraj/summarize-long-text)
> **Note:** Due to inference API timeout constraints, outputs may be truncated before the full summary is returned (run the model in Python or use the demo instead)
---
## Basic Usage
To improve summary quality, pass `encoder_no_repeat_ngram_size=3` when calling the pipeline. This prevents the decoder from copying 3-grams verbatim from the source text, encouraging the model to use new vocabulary and produce a more abstractive summary.
Load the model into a pipeline object:
```python
import torch
from transformers import pipeline
hf_name = 'pszemraj/led-large-book-summary'
summarizer = pipeline(
"summarization",
hf_name,
device=0 if torch.cuda.is_available() else -1,
)
```
Feed the text into the pipeline object:
```python
wall_of_text = "your words here"
result = summarizer(
wall_of_text,
min_length=16,
max_length=256,
no_repeat_ngram_size=3,
encoder_no_repeat_ngram_size=3,
repetition_penalty=3.5,
num_beams=4,
early_stopping=True,
)
```
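The pipeline returns a list with one dict per input; the generated text is under the `summary_text` key:
```python
print(result[0]["summary_text"])
```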
**Important:** For optimal summary quality, use the global attention mask when decoding, as demonstrated in [this community notebook](https://colab.research.google.com/drive/12INTTR6n64TzS4RrXZxMSXfrOd9Xzamo?usp=sharing); see the definition of `generate_answer(batch)` there, or the sketch that follows.
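If you prefer calling `generate` directly, here is a minimal sketch of setting the global attention mask manually, assuming the usual LED convention of global attention on the first (`<s>`) token; the generation parameters are illustrative:
```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "pszemraj/led-large-book-summary"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

long_text = "your words here"
inputs = tokenizer(long_text, return_tensors="pt", truncation=True, max_length=16384)

# LED uses sparse local attention; mark the first token for global attention
global_attention_mask = torch.zeros_like(inputs["attention_mask"])
global_attention_mask[:, 0] = 1

summary_ids = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    global_attention_mask=global_attention_mask,
    max_length=256,
    num_beams=4,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```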
If you're facing computing constraints, consider using the base version [`pszemraj/led-base-book-summary`](https://huggingface.co/pszemraj/led-base-book-summary).
---
## Training Information
### Data
The model was fine-tuned on the [booksum](https://arxiv.org/abs/2105.08209) dataset. During training, the `chapter` column was the input and the `summary_text` column was the output.
### Procedure
Fine-tuning was run on the BookSum dataset across 13+ epochs. Notably, the final four epochs combined the training and validation sets as 'train' to enhance generalization.
### Hyperparameters
The training process involved different settings across stages (the first stage is sketched as training arguments after this list):
- **Initial Three Epochs:** Low learning rate (5e-05), batch size of 1, 4 gradient accumulation steps, and a linear learning rate scheduler.
- **In-between Epochs:** Learning rate reduced to 4e-05, increased batch size to 2, 16 gradient accumulation steps, and switched to a cosine learning rate scheduler with a 0.05 warmup ratio.
- **Final Two Epochs:** Further reduced learning rate (2e-05), batch size reverted to 1, maintained gradient accumulation steps at 16, and continued with a cosine learning rate scheduler, albeit with a lower warmup ratio (0.03).
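For illustration only, the first-stage settings roughly correspond to the following `Seq2SeqTrainingArguments`; this is a reconstruction from the list above, not the original training script, and `output_dir` is a placeholder:
```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./led-large-book-summary",  # placeholder path
    learning_rate=5e-5,                     # "low learning rate (5e-05)"
    per_device_train_batch_size=1,          # batch size of 1
    gradient_accumulation_steps=4,          # 4 gradient accumulation steps
    lr_scheduler_type="linear",             # linear LR scheduler
    num_train_epochs=3,                     # the initial three epochs
)
```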
### Versions
- Transformers 4.19.2
- Pytorch 1.11.0+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1
---
## Simplified Usage with TextSum
To streamline the process of using this and other models, I've developed [a Python package utility](https://github.com/pszemraj/textsum) named `textsum`. This package offers simple interfaces for applying summarization models to text documents of arbitrary length.
Install TextSum:
```bash
pip install textsum
```
Then use it in Python with this model:
```python
from textsum.summarize import Summarizer
model_name = "pszemraj/led-large-book-summary"
summarizer = Summarizer(
model_name_or_path=model_name, # you can use any Seq2Seq model on the Hub
token_batch_length=4096, # tokens to batch summarize at a time, up to 16384
)
long_string = "This is a long string of text that will be summarized."
out_str = summarizer.summarize_string(long_string)
print(f"summary: {out_str}")
```
Currently implemented interfaces include a Python API, a Command-Line Interface (CLI), and a demo/web UI.
For detailed explanations and documentation, check the [README](https://github.com/pszemraj/textsum) or the [wiki](https://github.com/pszemraj/textsum/wiki).
---
## Related Models
Check out these other related models, also trained on the BookSum dataset:
- [LED-large continued](https://huggingface.co/pszemraj/led-large-book-summary-continued) - experiment with further fine-tuning
- [Long-T5-tglobal-base](https://huggingface.co/pszemraj/long-t5-tglobal-base-16384-book-summary)
- [BigBird-Pegasus-Large-K](https://huggingface.co/pszemraj/bigbird-pegasus-large-K-booksum)
- [Pegasus-X-Large](https://huggingface.co/pszemraj/pegasus-x-large-book-summary)
- [Long-T5-tglobal-XL](https://huggingface.co/pszemraj/long-t5-tglobal-xl-16384-book-summary)
Other variants trained on additional datasets are also available on my Hugging Face profile; feel free to try them out :)
---
| 28,848 | [
[
-0.024383544921875,
-0.050689697265625,
0.01483917236328125,
0.006561279296875,
-0.0283050537109375,
-0.020050048828125,
-0.033477783203125,
-0.0226898193359375,
0.007678985595703125,
0.02685546875,
-0.035308837890625,
-0.045684814453125,
-0.0380859375,
0.020538330078125,
-0.0231170654296875,
0.1075439453125,
0.0006623268127441406,
-0.0115814208984375,
0.006561279296875,
-0.0044403076171875,
-0.025726318359375,
-0.0277099609375,
-0.0372314453125,
-0.036468505859375,
0.043121337890625,
0.0215911865234375,
0.047119140625,
0.049407958984375,
0.051177978515625,
0.0195159912109375,
-0.03277587890625,
0.02386474609375,
-0.040252685546875,
-0.007755279541015625,
0.0020008087158203125,
-0.018585205078125,
-0.0589599609375,
-0.0121917724609375,
0.06707763671875,
0.0469970703125,
-0.0108489990234375,
0.028350830078125,
0.0139617919921875,
0.062042236328125,
-0.038238525390625,
0.02752685546875,
-0.0166473388671875,
-0.0007538795471191406,
-0.0041961669921875,
-0.0017261505126953125,
-0.035125732421875,
-0.014739990234375,
0.0155487060546875,
-0.04815673828125,
0.035736083984375,
0.0109710693359375,
0.0821533203125,
0.0164642333984375,
-0.033782958984375,
-0.034393310546875,
-0.034393310546875,
0.075439453125,
-0.0672607421875,
0.01222991943359375,
0.0275726318359375,
0.003780364990234375,
0.00643157958984375,
-0.06298828125,
-0.042144775390625,
0.0026187896728515625,
-0.02801513671875,
0.035552978515625,
0.0050506591796875,
0.00522613525390625,
0.0357666015625,
0.0458984375,
-0.055694580078125,
-0.00815582275390625,
-0.047943115234375,
-0.00023663043975830078,
0.057220458984375,
0.024505615234375,
-0.006237030029296875,
-0.03607177734375,
-0.026458740234375,
-0.0226593017578125,
-0.0291748046875,
0.00839996337890625,
0.0220184326171875,
0.0211944580078125,
-0.027374267578125,
0.041717529296875,
-0.0177764892578125,
0.0526123046875,
0.00283050537109375,
-0.0204010009765625,
0.03570556640625,
-0.035797119140625,
-0.02288818359375,
-0.01273345947265625,
0.0626220703125,
0.053863525390625,
0.013153076171875,
0.0023956298828125,
-0.0223236083984375,
-0.01873779296875,
-0.0034084320068359375,
-0.091796875,
-0.0196990966796875,
0.0345458984375,
-0.040679931640625,
-0.0081939697265625,
-0.0033779144287109375,
-0.058013916015625,
-0.0214996337890625,
-0.023773193359375,
0.03448486328125,
-0.04254150390625,
-0.0009908676147460938,
0.01263427734375,
-0.031524658203125,
0.0235595703125,
0.0256195068359375,
-0.0849609375,
0.0156097412109375,
0.0357666015625,
0.07763671875,
0.0016870498657226562,
-0.03326416015625,
-0.0272674560546875,
0.0142364501953125,
-0.0223236083984375,
0.04461669921875,
-0.0019369125366210938,
-0.0323486328125,
-0.006683349609375,
0.01097869873046875,
-0.002025604248046875,
-0.0304718017578125,
0.037628173828125,
-0.041168212890625,
0.0264892578125,
-0.02923583984375,
-0.0576171875,
-0.0176849365234375,
0.014495849609375,
-0.0364990234375,
0.067138671875,
0.00533294677734375,
-0.07574462890625,
0.0170440673828125,
-0.035552978515625,
-0.0355224609375,
-0.004535675048828125,
-0.00033736228942871094,
-0.059722900390625,
-0.01468658447265625,
0.0237274169921875,
0.049896240234375,
-0.017822265625,
0.019622802734375,
0.0033416748046875,
-0.041961669921875,
-0.0008440017700195312,
-0.003978729248046875,
0.0784912109375,
0.0112762451171875,
-0.012451171875,
0.01018524169921875,
-0.059051513671875,
-0.0169830322265625,
0.0081024169921875,
-0.03570556640625,
-0.004486083984375,
-0.007160186767578125,
0.0196533203125,
-0.023956298828125,
0.034149169921875,
-0.029571533203125,
0.030609130859375,
-0.031280517578125,
0.03375244140625,
0.0506591796875,
0.01007843017578125,
0.026153564453125,
-0.045654296875,
0.0250244140625,
-0.0078125,
0.00798797607421875,
-0.04058837890625,
-0.034210205078125,
-0.06219482421875,
-0.0309906005859375,
0.0283050537109375,
0.039093017578125,
-0.04669189453125,
0.06158447265625,
-0.0308380126953125,
-0.0576171875,
-0.005580902099609375,
-0.0012598037719726562,
0.0288848876953125,
0.045867919921875,
0.04083251953125,
-0.01116180419921875,
-0.0261077880859375,
-0.0537109375,
-0.0008268356323242188,
-0.003940582275390625,
-0.015228271484375,
0.0137481689453125,
0.054901123046875,
-0.02398681640625,
0.06402587890625,
-0.0679931640625,
-0.03240966796875,
-0.0223846435546875,
0.01165771484375,
0.052215576171875,
0.033477783203125,
0.042388916015625,
-0.0517578125,
-0.03912353515625,
-0.0033416748046875,
-0.06707763671875,
0.00420379638671875,
-0.026397705078125,
-0.006183624267578125,
0.0213623046875,
0.03424072265625,
-0.052947998046875,
0.0203399658203125,
0.0235137939453125,
-0.041534423828125,
0.056121826171875,
-0.0272369384765625,
0.005619049072265625,
-0.1026611328125,
0.0167236328125,
0.00504302978515625,
-0.006168365478515625,
-0.0270233154296875,
0.00615692138671875,
0.0267333984375,
0.0017595291137695312,
-0.00728607177734375,
0.0477294921875,
-0.0465087890625,
0.01161956787109375,
-0.01505279541015625,
0.0038776397705078125,
0.00746917724609375,
0.058013916015625,
-0.003437042236328125,
0.056793212890625,
0.035003662109375,
-0.055450439453125,
0.031707763671875,
0.043792724609375,
-0.033477783203125,
0.0069427490234375,
-0.06341552734375,
-0.013153076171875,
-0.01251220703125,
0.024993896484375,
-0.0697021484375,
-0.02862548828125,
0.016510009765625,
-0.032623291015625,
0.033660888671875,
0.0016613006591796875,
-0.04486083984375,
-0.024993896484375,
-0.039215087890625,
0.027862548828125,
0.055267333984375,
-0.034576416015625,
0.04754638671875,
0.00450897216796875,
-0.0284881591796875,
-0.055206298828125,
-0.05462646484375,
-0.0008053779602050781,
-0.01103973388671875,
-0.038360595703125,
0.03546142578125,
-0.0236663818359375,
-0.006549835205078125,
-0.01267242431640625,
0.004528045654296875,
0.0174560546875,
-0.0154571533203125,
0.01158905029296875,
0.027008056640625,
-0.00443267822265625,
0.0182342529296875,
0.0151214599609375,
-0.0007777214050292969,
-0.005970001220703125,
-0.008148193359375,
0.04083251953125,
-0.02264404296875,
0.0117340087890625,
-0.03302001953125,
0.01824951171875,
0.035186767578125,
-0.01296234130859375,
0.06939697265625,
0.0416259765625,
-0.024993896484375,
-0.0109405517578125,
-0.020050048828125,
-0.0079193115234375,
-0.03778076171875,
0.046539306640625,
-0.0106048583984375,
-0.040740966796875,
0.04803466796875,
0.02197265625,
0.0297088623046875,
0.051544189453125,
0.048675537109375,
-0.00018227100372314453,
0.044952392578125,
0.042327880859375,
-0.00450897216796875,
0.04986572265625,
-0.050537109375,
-0.0032024383544921875,
-0.06451416015625,
-0.0210113525390625,
-0.0255889892578125,
-0.0033206939697265625,
-0.0303955078125,
-0.015228271484375,
0.030059814453125,
0.018768310546875,
-0.034210205078125,
0.026458740234375,
-0.041717529296875,
0.0299224853515625,
0.037811279296875,
0.019195556640625,
0.0287017822265625,
0.00836181640625,
0.0120086669921875,
0.01251220703125,
-0.045196533203125,
-0.034454345703125,
0.09490966796875,
0.027618408203125,
0.053680419921875,
0.00785064697265625,
0.050933837890625,
0.0277252197265625,
0.0226287841796875,
-0.0634765625,
0.03668212890625,
-0.0018815994262695312,
-0.041961669921875,
-0.035186767578125,
-0.038543701171875,
-0.064453125,
0.0192718505859375,
-0.006683349609375,
-0.038360595703125,
0.0121917724609375,
0.005462646484375,
-0.0360107421875,
0.01263427734375,
-0.0657958984375,
0.0579833984375,
-0.0032329559326171875,
-0.021240234375,
-0.0084686279296875,
-0.055389404296875,
0.0183563232421875,
-0.00959014892578125,
0.00875091552734375,
0.0151214599609375,
-0.00487518310546875,
0.0828857421875,
-0.0406494140625,
0.06341552734375,
0.00019669532775878906,
-0.01194000244140625,
0.021026611328125,
-0.034454345703125,
0.0390625,
-0.0029163360595703125,
-0.0220794677734375,
0.0188446044921875,
-0.0026378631591796875,
-0.051544189453125,
-0.029266357421875,
0.0313720703125,
-0.072021484375,
-0.028411865234375,
-0.05267333984375,
-0.053680419921875,
0.00562286376953125,
0.0295257568359375,
0.043701171875,
0.05084228515625,
-0.004093170166015625,
0.01641845703125,
0.049774169921875,
-0.021759033203125,
0.05523681640625,
0.0235443115234375,
-0.0255279541015625,
-0.056365966796875,
0.05560302734375,
0.0187225341796875,
0.0155029296875,
0.0277099609375,
0.01336669921875,
-0.01558685302734375,
-0.0506591796875,
-0.036529541015625,
0.0369873046875,
-0.055877685546875,
-0.020416259765625,
-0.05523681640625,
-0.029083251953125,
-0.0439453125,
-0.01297760009765625,
-0.007419586181640625,
-0.0196533203125,
-0.036773681640625,
-0.0088653564453125,
0.046234130859375,
0.054229736328125,
0.00963592529296875,
0.04901123046875,
-0.0625,
0.02838134765625,
0.0210113525390625,
0.0127105712890625,
0.018341064453125,
-0.0693359375,
-0.05340576171875,
-0.004962921142578125,
-0.033233642578125,
-0.05682373046875,
0.03717041015625,
0.0208892822265625,
0.0036334991455078125,
0.03802490234375,
0.02130126953125,
0.032684326171875,
-0.04254150390625,
0.06488037109375,
0.0207672119140625,
-0.0560302734375,
0.0271148681640625,
-0.05047607421875,
0.035308837890625,
0.0219573974609375,
0.0238800048828125,
-0.036712646484375,
-0.01546478271484375,
-0.0379638671875,
-0.0819091796875,
0.060699462890625,
0.0283050537109375,
-0.00274658203125,
0.01849365234375,
0.031280517578125,
0.00988006591796875,
0.0183868408203125,
-0.062744140625,
-0.03314208984375,
-0.0162353515625,
-0.021209716796875,
-0.028106689453125,
-0.0236968994140625,
-0.00885772705078125,
-0.034423828125,
0.07647705078125,
0.0008707046508789062,
0.0169525146484375,
0.0279541015625,
-0.01611328125,
0.00443267822265625,
0.00942230224609375,
0.03582763671875,
0.061737060546875,
-0.043792724609375,
-0.0166473388671875,
0.012786865234375,
-0.050384521484375,
-0.01904296875,
0.032684326171875,
-0.026458740234375,
0.006732940673828125,
0.045166015625,
0.060791015625,
-0.0158233642578125,
-0.045867919921875,
0.03253173828125,
-0.009246826171875,
-0.0116729736328125,
-0.033538818359375,
0.0006508827209472656,
0.013397216796875,
0.0237579345703125,
0.046051025390625,
-0.0029392242431640625,
0.01268768310546875,
-0.03387451171875,
-0.0011653900146484375,
0.0105438232421875,
0.003154754638671875,
-0.0147705078125,
0.047393798828125,
0.02093505859375,
-0.01296234130859375,
0.046173095703125,
-0.025787353515625,
-0.0345458984375,
0.061126708984375,
0.0296783447265625,
0.051055908203125,
0.0128936767578125,
0.004520416259765625,
0.048583984375,
0.02447509765625,
-0.01355743408203125,
0.005580902099609375,
-0.01511383056640625,
-0.0394287109375,
-0.019805908203125,
-0.0633544921875,
-0.03289794921875,
0.00473785400390625,
-0.053131103515625,
0.02587890625,
-0.0305328369140625,
-0.0052947998046875,
0.00594329833984375,
0.024383544921875,
-0.03216552734375,
0.0222930908203125,
-0.0027217864990234375,
0.07177734375,
-0.060638427734375,
0.04864501953125,
0.042724609375,
-0.051055908203125,
-0.052398681640625,
0.00878143310546875,
-0.029327392578125,
-0.03509521484375,
0.0181121826171875,
0.0267333984375,
0.0037441253662109375,
-0.00017881393432617188,
-0.03692626953125,
-0.04840087890625,
0.104248046875,
0.00942230224609375,
-0.048095703125,
-0.01361846923828125,
-0.01270294189453125,
0.04547119140625,
-0.003063201904296875,
0.027923583984375,
0.032257080078125,
0.0258026123046875,
-0.0005602836608886719,
-0.05621337890625,
0.0079193115234375,
-0.031646728515625,
-0.01050567626953125,
0.008758544921875,
-0.08392333984375,
0.088623046875,
-0.0235443115234375,
-0.0020656585693359375,
0.03338623046875,
0.061767578125,
0.021942138671875,
0.033477783203125,
0.0117340087890625,
0.045166015625,
0.07061767578125,
-0.00012230873107910156,
0.0909423828125,
-0.0156402587890625,
0.065673828125,
0.07391357421875,
0.0017881393432617188,
0.051849365234375,
0.0186614990234375,
-0.021514892578125,
0.025421142578125,
0.06524658203125,
-0.02532958984375,
0.042022705078125,
-0.00484466552734375,
-0.00815582275390625,
-0.004924774169921875,
0.01145172119140625,
-0.05584716796875,
0.003620147705078125,
0.0164642333984375,
-0.03448486328125,
-0.00629425048828125,
-0.00669097900390625,
0.004611968994140625,
-0.01377105712890625,
-0.027099609375,
0.046112060546875,
0.0152435302734375,
-0.04931640625,
0.06463623046875,
0.0112457275390625,
0.0697021484375,
-0.048828125,
0.012115478515625,
-0.0218658447265625,
0.032745361328125,
-0.02020263671875,
-0.04461669921875,
0.00022351741790771484,
0.0037403106689453125,
-0.0345458984375,
-0.01380157470703125,
0.0242767333984375,
-0.034210205078125,
-0.056396484375,
0.0125885009765625,
0.0159149169921875,
0.01424407958984375,
0.002521514892578125,
-0.034210205078125,
-0.010406494140625,
0.00830841064453125,
-0.041778564453125,
0.01203155517578125,
0.032989501953125,
0.019256591796875,
0.034210205078125,
0.0340576171875,
0.00243377685546875,
-0.0051727294921875,
-0.00914764404296875,
0.062744140625,
-0.06695556640625,
-0.050994873046875,
-0.06427001953125,
0.034759521484375,
-0.0199127197265625,
-0.040313720703125,
0.06549072265625,
0.06201171875,
0.0555419921875,
-0.0220489501953125,
0.057098388671875,
-0.00970458984375,
0.0394287109375,
-0.0428466796875,
0.07806396484375,
-0.05584716796875,
0.01776123046875,
-0.0296783447265625,
-0.068603515625,
-0.01551055908203125,
0.04168701171875,
-0.017303466796875,
0.00811767578125,
0.0595703125,
0.057281494140625,
-0.001155853271484375,
0.00475311279296875,
0.003406524658203125,
0.0279541015625,
0.0181732177734375,
0.033477783203125,
0.036865234375,
-0.060638427734375,
0.046356201171875,
-0.01202392578125,
-0.03173828125,
-0.0184173583984375,
-0.053863525390625,
-0.0877685546875,
-0.045684814453125,
-0.0291595458984375,
-0.033172607421875,
0.0021076202392578125,
0.0609130859375,
0.0391845703125,
-0.0626220703125,
-0.026123046875,
-0.0011053085327148438,
0.004367828369140625,
-0.00914764404296875,
-0.0159149169921875,
0.0601806640625,
-0.018341064453125,
-0.0687255859375,
-0.00785064697265625,
-0.00879669189453125,
0.0031585693359375,
0.00954437255859375,
-0.01430511474609375,
-0.0160369873046875,
-0.0013856887817382812,
0.048095703125,
0.01442718505859375,
-0.0308380126953125,
0.0007061958312988281,
-0.009063720703125,
-0.005451202392578125,
-0.0006299018859863281,
0.03192138671875,
-0.042205810546875,
0.017730712890625,
0.03717041015625,
0.0455322265625,
0.05804443359375,
-0.0017957687377929688,
0.0227508544921875,
-0.048919677734375,
0.01305389404296875,
0.0053863525390625,
0.041961669921875,
0.0269927978515625,
-0.0284576416015625,
0.039764404296875,
0.0217742919921875,
-0.05682373046875,
-0.05908203125,
-0.0010843276977539062,
-0.08953857421875,
-0.0333251953125,
0.09674072265625,
-0.004985809326171875,
-0.023468017578125,
0.0214691162109375,
-0.0281982421875,
0.043182373046875,
-0.04302978515625,
0.06463623046875,
0.056488037109375,
-0.00365447998046875,
0.00876617431640625,
-0.0209808349609375,
0.032257080078125,
0.031829833984375,
-0.052459716796875,
0.01528167724609375,
0.041259765625,
0.0238800048828125,
0.0223846435546875,
0.04486083984375,
0.01404571533203125,
0.01971435546875,
0.001026153564453125,
0.01140594482421875,
-0.01332855224609375,
-0.0184326171875,
-0.0269927978515625,
0.0084991455078125,
-0.019256591796875,
-0.018463134765625
]
] |
klue/roberta-base | 2023-06-12T12:29:12.000Z | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"fill-mask",
"korean",
"klue",
"ko",
"arxiv:2105.09680",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | klue | null | null | klue/roberta-base | 9 | 28,734 | transformers | 2022-03-02T23:29:05 | ---
language: ko
tags:
- korean
- klue
mask_token: "[MASK]"
widget:
- text: 대한민국의 수도는 [MASK] 입니다.
---
# KLUE RoBERTa base
Pretrained RoBERTa model for Korean. See the [GitHub repository](https://github.com/KLUE-benchmark/KLUE) and the [paper](https://arxiv.org/abs/2105.09680) for more details.
## How to use
_NOTE:_ Use `BertTokenizer` instead of `RobertaTokenizer` (`AutoTokenizer` will load `BertTokenizer` automatically).
```python
from transformers import AutoModel, AutoTokenizer
model = AutoModel.from_pretrained("klue/roberta-base")
tokenizer = AutoTokenizer.from_pretrained("klue/roberta-base")
```
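As a quick sanity check, you can run the fill-mask pipeline with the widget example above (the top predictions are expected to include 서울):
```python
from transformers import pipeline

# the pipeline resolves the tokenizer via AutoTokenizer, i.e. BertTokenizer
fill_mask = pipeline("fill-mask", model="klue/roberta-base")
fill_mask("대한민국의 수도는 [MASK] 입니다.")
```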
## BibTeX entry and citation info
```bibtex
@misc{park2021klue,
title={KLUE: Korean Language Understanding Evaluation},
author={Sungjoon Park and Jihyung Moon and Sungdong Kim and Won Ik Cho and Jiyoon Han and Jangwon Park and Chisung Song and Junseong Kim and Yongsook Song and Taehwan Oh and Joohong Lee and Juhyun Oh and Sungwon Lyu and Younghoon Jeong and Inkwon Lee and Sangwoo Seo and Dongjun Lee and Hyunwoo Kim and Myeonghwa Lee and Seongbo Jang and Seungwon Do and Sunkyoung Kim and Kyungtae Lim and Jongwon Lee and Kyumin Park and Jamin Shin and Seonghyun Kim and Lucy Park and Alice Oh and Jungwoo Ha and Kyunghyun Cho},
year={2021},
eprint={2105.09680},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 1,347 | [
[
-0.0131988525390625,
-0.0277557373046875,
0.031097412109375,
0.02935791015625,
-0.0246734619140625,
0.00537872314453125,
-0.042572021484375,
-0.0140380859375,
0.003143310546875,
0.0152130126953125,
-0.025421142578125,
-0.042388916015625,
-0.05230712890625,
0.00501251220703125,
0.006290435791015625,
0.077392578125,
-0.005214691162109375,
0.03436279296875,
-0.0010747909545898438,
-0.0205230712890625,
-0.017578125,
-0.0276031494140625,
-0.038604736328125,
-0.0404052734375,
0.01312255859375,
0.004932403564453125,
0.030914306640625,
0.03131103515625,
0.00536346435546875,
0.0234222412109375,
-0.004119873046875,
-0.0119171142578125,
-0.0188446044921875,
0.0079345703125,
0.004669189453125,
-0.036163330078125,
-0.03607177734375,
0.0180816650390625,
0.04705810546875,
0.034393310546875,
0.0227203369140625,
0.03131103515625,
-0.00609588623046875,
0.049957275390625,
-0.029632568359375,
0.0279998779296875,
-0.03173828125,
-0.0096282958984375,
-0.0083465576171875,
0.01042938232421875,
-0.03179931640625,
-0.024566650390625,
0.0157470703125,
-0.0419921875,
0.005084991455078125,
-0.01763916015625,
0.11517333984375,
0.0239410400390625,
-0.03564453125,
-0.029205322265625,
-0.0347900390625,
0.06610107421875,
-0.053955078125,
0.04412841796875,
0.032928466796875,
0.021942138671875,
-0.007450103759765625,
-0.06219482421875,
-0.040435791015625,
-0.037506103515625,
-0.014373779296875,
0.016815185546875,
0.00022268295288085938,
-0.00719451904296875,
0.0192718505859375,
0.0244140625,
-0.057891845703125,
-0.0192108154296875,
-0.026397705078125,
-0.010467529296875,
0.047088623046875,
-0.0033168792724609375,
0.0174713134765625,
-0.03546142578125,
-0.0205078125,
-0.0362548828125,
-0.0160369873046875,
0.0135650634765625,
0.03314208984375,
0.0202178955078125,
-0.027130126953125,
0.0258331298828125,
-0.0111846923828125,
0.05322265625,
0.0159149169921875,
-0.01953125,
0.066162109375,
-0.033599853515625,
-0.01163482666015625,
-0.030609130859375,
0.061309814453125,
0.00595855712890625,
0.0254669189453125,
-0.0133819580078125,
0.00653839111328125,
-0.00673675537109375,
-0.0012416839599609375,
-0.052764892578125,
-0.059539794921875,
0.016937255859375,
-0.0567626953125,
-0.003940582275390625,
0.0227813720703125,
-0.06463623046875,
0.00589752197265625,
-0.0347900390625,
0.0258026123046875,
-0.023468017578125,
-0.03564453125,
-0.006610870361328125,
0.0001519918441772461,
0.019927978515625,
-0.01629638671875,
-0.0462646484375,
0.0077972412109375,
0.02545166015625,
0.051605224609375,
-0.002849578857421875,
-0.0316162109375,
-0.0226287841796875,
-0.0171661376953125,
-0.0152587890625,
0.0374755859375,
-0.014556884765625,
-0.0205230712890625,
-0.0099334716796875,
0.02783203125,
-0.033294677734375,
-0.036956787109375,
0.051177978515625,
-0.0350341796875,
0.022857666015625,
0.01100921630859375,
-0.046783447265625,
-0.017669677734375,
0.01165008544921875,
-0.040618896484375,
0.10296630859375,
0.035430908203125,
-0.04010009765625,
0.03704833984375,
-0.036346435546875,
-0.024627685546875,
-0.0013980865478515625,
-0.00470733642578125,
-0.055572509765625,
-0.009674072265625,
0.01067352294921875,
0.016571044921875,
-0.0006055831909179688,
0.024688720703125,
-0.020599365234375,
-0.0057220458984375,
0.00537872314453125,
-0.009429931640625,
0.08807373046875,
0.031890869140625,
-0.03045654296875,
0.0206756591796875,
-0.0841064453125,
0.0509033203125,
0.0104217529296875,
-0.04144287109375,
-0.0284881591796875,
-0.027435302734375,
0.030120849609375,
0.0225677490234375,
0.032073974609375,
-0.0276641845703125,
0.0126190185546875,
-0.03009033203125,
0.0155487060546875,
0.0340576171875,
-0.0167388916015625,
0.03857421875,
-0.0084228515625,
0.04840087890625,
-0.005191802978515625,
0.004947662353515625,
-0.01151275634765625,
-0.0245819091796875,
-0.04986572265625,
-0.033721923828125,
0.0614013671875,
0.04742431640625,
-0.053375244140625,
0.051971435546875,
-0.0181121826171875,
-0.059295654296875,
-0.06488037109375,
-0.00370025634765625,
0.044677734375,
0.0187530517578125,
0.02606201171875,
0.00861358642578125,
-0.05926513671875,
-0.046356201171875,
-0.033203125,
-0.0159149169921875,
-0.002826690673828125,
0.047393798828125,
0.043487548828125,
-0.012054443359375,
0.054107666015625,
-0.033538818359375,
0.0002865791320800781,
-0.0019435882568359375,
0.0246124267578125,
0.044891357421875,
0.05694580078125,
0.05615234375,
-0.048065185546875,
-0.08770751953125,
-0.008392333984375,
-0.037628173828125,
-0.02215576171875,
-0.0018014907836914062,
-0.0130615234375,
0.045806884765625,
0.0276947021484375,
-0.0443115234375,
0.02734375,
0.045135498046875,
-0.0345458984375,
0.05780029296875,
-0.0210418701171875,
0.0253143310546875,
-0.09075927734375,
0.019775390625,
-0.0230712890625,
-0.00905609130859375,
-0.04022216796875,
0.00740814208984375,
0.018035888671875,
-0.003276824951171875,
-0.0128326416015625,
0.058135986328125,
-0.0472412109375,
-0.0009212493896484375,
-0.022857666015625,
0.012603759765625,
-0.00928497314453125,
0.032958984375,
0.002353668212890625,
0.054443359375,
0.050933837890625,
-0.040191650390625,
0.0159912109375,
0.0176849365234375,
-0.03717041015625,
0.0166473388671875,
-0.055023193359375,
0.0184326171875,
0.00909423828125,
0.0154571533203125,
-0.0828857421875,
-0.00432586669921875,
0.0303497314453125,
-0.05780029296875,
0.0203857421875,
-0.045867919921875,
-0.0310516357421875,
-0.040740966796875,
-0.007427215576171875,
0.037567138671875,
0.0418701171875,
-0.043243408203125,
0.055389404296875,
0.0123291015625,
0.0010690689086914062,
-0.034393310546875,
-0.0287017822265625,
-0.0168914794921875,
-0.0175933837890625,
-0.043792724609375,
0.023529052734375,
-0.0175933837890625,
0.006103515625,
0.01364898681640625,
0.0025539398193359375,
-0.01099395751953125,
0.0007715225219726562,
0.00614166259765625,
0.030303955078125,
-0.021514892578125,
0.00490570068359375,
-0.029388427734375,
-0.031646728515625,
-0.00830841064453125,
-0.0246124267578125,
0.07440185546875,
-0.01497650146484375,
-0.00954437255859375,
-0.031585693359375,
-0.00604248046875,
0.02978515625,
-0.016571044921875,
0.04718017578125,
0.07196044921875,
-0.0027923583984375,
-0.0013790130615234375,
-0.023468017578125,
0.006561279296875,
-0.0302734375,
0.04766845703125,
-0.040863037109375,
-0.037261962890625,
0.04962158203125,
-0.01282501220703125,
-0.021453857421875,
0.038787841796875,
0.04736328125,
0.015350341796875,
0.068603515625,
0.0262908935546875,
-0.0258026123046875,
0.039703369140625,
-0.032958984375,
0.019866943359375,
-0.06939697265625,
-0.0031185150146484375,
-0.064453125,
0.0021533966064453125,
-0.061981201171875,
-0.023712158203125,
0.0133056640625,
0.02532958984375,
-0.032958984375,
0.0400390625,
-0.03369140625,
0.0267791748046875,
0.050201416015625,
-0.00588226318359375,
0.0001742839813232422,
-0.00830841064453125,
-0.0191802978515625,
-0.01242828369140625,
-0.042724609375,
-0.036407470703125,
0.09912109375,
0.01316070556640625,
0.051116943359375,
-0.006572723388671875,
0.052093505859375,
-0.00736236572265625,
0.0128631591796875,
-0.0419921875,
0.0255584716796875,
-0.01131439208984375,
-0.055328369140625,
-0.01593017578125,
-0.0321044921875,
-0.07470703125,
0.0213775634765625,
0.0009741783142089844,
-0.04962158203125,
0.00861358642578125,
-0.0017080307006835938,
-0.0047454833984375,
0.00167083740234375,
-0.038970947265625,
0.07763671875,
-0.011444091796875,
0.009185791015625,
0.003932952880859375,
-0.03179931640625,
0.01398468017578125,
0.0123138427734375,
0.004528045654296875,
-0.00759124755859375,
0.012603759765625,
0.046051025390625,
-0.0186920166015625,
0.039947509765625,
-0.035308837890625,
0.01727294921875,
0.0135498046875,
-0.008941650390625,
0.041839599609375,
0.0169830322265625,
-0.005828857421875,
0.0182342529296875,
0.005008697509765625,
-0.041259765625,
-0.037384033203125,
0.041473388671875,
-0.07476806640625,
-0.0147705078125,
-0.051971435546875,
-0.045074462890625,
-0.004383087158203125,
0.03570556640625,
0.031890869140625,
0.00350189208984375,
0.0032978057861328125,
0.00734710693359375,
0.018798828125,
-0.006641387939453125,
0.025421142578125,
0.056854248046875,
-0.0288238525390625,
-0.06463623046875,
0.059051513671875,
0.01210784912109375,
0.0192718505859375,
-0.0210723876953125,
0.00911712646484375,
-0.0275115966796875,
-0.0321044921875,
-0.03509521484375,
0.030609130859375,
-0.0487060546875,
-0.0084075927734375,
-0.04791259765625,
-0.0311737060546875,
-0.044158935546875,
0.00786590576171875,
-0.0301361083984375,
-0.023773193359375,
-0.0218658447265625,
0.01324462890625,
0.0235137939453125,
0.0295257568359375,
-0.00872039794921875,
0.0198974609375,
-0.04083251953125,
0.0164031982421875,
-0.006183624267578125,
0.0296783447265625,
0.016937255859375,
-0.059112548828125,
-0.039337158203125,
0.0017299652099609375,
-0.0172271728515625,
-0.040008544921875,
0.0267333984375,
0.0164794921875,
0.06658935546875,
0.006313323974609375,
0.00389862060546875,
0.056121826171875,
-0.03375244140625,
0.0665283203125,
0.0007882118225097656,
-0.06634521484375,
0.030609130859375,
-0.01514434814453125,
0.025390625,
0.052093505859375,
0.03887939453125,
-0.041107177734375,
-0.0230255126953125,
-0.057830810546875,
-0.08734130859375,
0.055572509765625,
0.01123809814453125,
-0.0005788803100585938,
-0.0028934478759765625,
0.0181121826171875,
-0.0029048919677734375,
0.00450897216796875,
-0.08331298828125,
-0.04608154296875,
-0.0232391357421875,
-0.0275421142578125,
0.0003113746643066406,
-0.01224517822265625,
-0.00225067138671875,
-0.034454345703125,
0.09173583984375,
-0.001338958740234375,
0.01202392578125,
0.0255584716796875,
-0.0211334228515625,
-0.00600433349609375,
-0.003459930419921875,
0.04669189453125,
0.03619384765625,
-0.01177978515625,
-0.017547607421875,
0.004535675048828125,
-0.04150390625,
0.00801849365234375,
0.0216827392578125,
-0.0266265869140625,
0.020721435546875,
0.0305023193359375,
0.0545654296875,
0.0202178955078125,
-0.044830322265625,
0.0216522216796875,
0.002506256103515625,
-0.0217437744140625,
-0.049957275390625,
-0.004299163818359375,
0.00943756103515625,
0.0292816162109375,
0.028289794921875,
-0.00821685791015625,
-0.021697998046875,
-0.019287109375,
0.018280029296875,
0.0210418701171875,
-0.01561737060546875,
-0.02117919921875,
0.044219970703125,
-0.00791168212890625,
-0.0237579345703125,
0.0404052734375,
-0.0333251953125,
-0.056915283203125,
0.05706787109375,
0.051300048828125,
0.07025146484375,
-0.0187225341796875,
0.003177642822265625,
0.06646728515625,
0.003459930419921875,
0.0052490234375,
0.043243408203125,
0.006275177001953125,
-0.042755126953125,
-0.01062774658203125,
-0.047515869140625,
0.0099029541015625,
0.03314208984375,
-0.056976318359375,
0.01255035400390625,
-0.0144195556640625,
-0.0211334228515625,
0.0084381103515625,
0.033203125,
-0.04022216796875,
0.005435943603515625,
-0.02313232421875,
0.0526123046875,
-0.06707763671875,
0.075439453125,
0.05963134765625,
-0.04595947265625,
-0.07464599609375,
0.0141448974609375,
-0.0215606689453125,
-0.043731689453125,
0.07855224609375,
-0.01091766357421875,
0.0210723876953125,
0.0019407272338867188,
-0.020843505859375,
-0.09075927734375,
0.07763671875,
-0.003116607666015625,
-0.01904296875,
0.0258941650390625,
-0.00849151611328125,
0.0419921875,
-0.035491943359375,
0.02001953125,
0.0379638671875,
0.054046630859375,
-0.01175689697265625,
-0.075927734375,
-0.01158905029296875,
-0.03765869140625,
0.0138397216796875,
0.0016946792602539062,
-0.0595703125,
0.0948486328125,
0.00801849365234375,
-0.01094818115234375,
0.043670654296875,
0.054046630859375,
0.026519775390625,
0.026519775390625,
0.03973388671875,
0.05133056640625,
0.062469482421875,
-0.0108642578125,
0.0526123046875,
-0.047698974609375,
0.06268310546875,
0.09503173828125,
-0.0082550048828125,
0.041900634765625,
0.015533447265625,
-0.040130615234375,
0.044921875,
0.05010986328125,
-0.04071044921875,
0.055389404296875,
0.0186920166015625,
-0.00765228271484375,
-0.0103759765625,
0.02850341796875,
-0.046539306640625,
0.031341552734375,
-0.0087127685546875,
-0.036163330078125,
-0.0120849609375,
0.01153564453125,
0.02435302734375,
0.015838623046875,
-0.00620269775390625,
0.040374755859375,
-0.01361083984375,
-0.04736328125,
0.06988525390625,
-0.0025310516357421875,
0.044891357421875,
-0.044769287109375,
0.007389068603515625,
0.0054779052734375,
0.021697998046875,
-0.011566162109375,
-0.0322265625,
0.0024356842041015625,
-0.005046844482421875,
-0.038177490234375,
0.0189056396484375,
0.0506591796875,
-0.05438232421875,
-0.046112060546875,
0.0384521484375,
0.0216522216796875,
0.025177001953125,
0.0260772705078125,
-0.06878662109375,
0.004444122314453125,
0.0086517333984375,
-0.041046142578125,
0.0205841064453125,
0.03436279296875,
0.00833892822265625,
0.040924072265625,
0.055511474609375,
0.00921630859375,
0.028594970703125,
0.0121002197265625,
0.057159423828125,
-0.0357666015625,
-0.04388427734375,
-0.04974365234375,
0.0274200439453125,
-0.0177154541015625,
-0.051910400390625,
0.07330322265625,
0.05865478515625,
0.0780029296875,
-0.0308380126953125,
0.058929443359375,
-0.027496337890625,
0.04815673828125,
-0.034332275390625,
0.073974609375,
-0.0362548828125,
-0.010955810546875,
-0.0234222412109375,
-0.05657958984375,
-0.00786590576171875,
0.0693359375,
-0.0254669189453125,
0.034942626953125,
0.05029296875,
0.053131103515625,
0.004669189453125,
-0.0230255126953125,
0.00957489013671875,
0.0377197265625,
0.02593994140625,
0.0229949951171875,
0.027252197265625,
-0.06005859375,
0.04119873046875,
-0.0245361328125,
-0.0132598876953125,
-0.0153045654296875,
-0.0723876953125,
-0.0726318359375,
-0.05963134765625,
-0.0222625732421875,
-0.049896240234375,
-0.0029735565185546875,
0.0841064453125,
0.061492919921875,
-0.07513427734375,
-0.01519012451171875,
0.0023097991943359375,
0.0060882568359375,
-0.0239715576171875,
-0.0206756591796875,
0.06793212890625,
-0.05328369140625,
-0.07440185546875,
0.0167236328125,
-0.022857666015625,
0.00909423828125,
0.0003864765167236328,
-0.03363037109375,
-0.035369873046875,
-0.01213836669921875,
0.0201873779296875,
0.0234222412109375,
-0.05615234375,
-0.01068878173828125,
-0.016265869140625,
-0.0208282470703125,
0.007602691650390625,
0.040924072265625,
-0.047271728515625,
0.0294036865234375,
0.033050537109375,
0.033050537109375,
0.042572021484375,
-0.0025081634521484375,
0.0214080810546875,
-0.035003662109375,
0.0284881591796875,
0.008758544921875,
0.019317626953125,
0.005619049072265625,
-0.013885498046875,
0.0540771484375,
0.0297088623046875,
-0.06365966796875,
-0.06475830078125,
-0.0007348060607910156,
-0.06695556640625,
-0.0199127197265625,
0.06793212890625,
-0.0487060546875,
-0.03155517578125,
-0.0204925537109375,
-0.01373291015625,
0.03143310546875,
-0.01087188720703125,
0.06280517578125,
0.071533203125,
0.0099029541015625,
0.0038738250732421875,
-0.04998779296875,
0.046844482421875,
0.0291290283203125,
-0.03924560546875,
0.0025691986083984375,
0.0035724639892578125,
0.04437255859375,
0.018585205078125,
0.04931640625,
-0.01483917236328125,
0.02984619140625,
0.00872039794921875,
0.023712158203125,
-0.01953125,
-0.0165252685546875,
-0.0247344970703125,
-0.0220794677734375,
-0.00605010986328125,
-0.0178985595703125
]
] |
philschmid/distilbart-cnn-12-6-samsum | 2022-12-05T13:32:50.000Z | [
"transformers",
"pytorch",
"bart",
"text2text-generation",
"sagemaker",
"summarization",
"en",
"dataset:samsum",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | summarization | philschmid | null | null | philschmid/distilbart-cnn-12-6-samsum | 15 | 28,629 | transformers | 2022-03-02T23:29:05 | ---
language: en
license: apache-2.0
tags:
- sagemaker
- bart
- summarization
datasets:
- samsum
widget:
- text: "Jeff: Can I train a \U0001F917 Transformers model on Amazon SageMaker? \n\
Philipp: Sure you can use the new Hugging Face Deep Learning Container. \nJeff:\
\ ok.\nJeff: and how can I get started? \nJeff: where can I find documentation?\
\ \nPhilipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face "
model-index:
- name: philschmid/distilbart-cnn-12-6-samsum
results:
- task:
type: summarization
name: Summarization
dataset:
name: samsum
type: samsum
config: samsum
split: test
metrics:
- type: rouge
value: 41.0895
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTBlZmQzZDFmNzY5YTBjMTI3ZjRkNDk4NTI3YzQxOGY4MjlkMTU4ZGJkZWE4YjQ4ZDFhOTIxM2M1YWEyMDQ4MCIsInZlcnNpb24iOjF9.Nw7idRmEmjS-c91HthjVGw6YxttVA_tRB2QRkGwVSVABR3_BY84HvwLOZVstc6a9gUHopMj_W9SRfa_6xTWcBA
- type: rouge
value: 20.7459
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjRhN2E4ZDNiNWNkY2RkNjMxMjhiYjcxYjc4OWM4NWQ5MDNjZDMwOGIwNWI2NWJiYzljMzc0NzY1ZDBmMTRmZCIsInZlcnNpb24iOjF9.nYxNimi33AW0T8T1JhqFUukxe4W4niXj4UzLRTuc40NeZveDTSpPS8QzR4rF1gK-r2irqIX5FrvG4dwQHrESBA
- type: rouge
value: 31.5952
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZjYzYTRjOTU1MDVhN2ZlOGE3YTIxMjk1NDBiY2E2ZWI1MDA5ZTJkOTQ4MzgxNThkOGU4OTUzODU0YWE1OTQ5MiIsInZlcnNpb24iOjF9.G2EtxIlJ86AcNx2bqw2nu1UbdczQ-anl1c02EopQyC81BEcEAbnY-liPvHXLjPVQvP97GGGjqTDLZYjYJ71hDQ
- type: rouge
value: 38.3389
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOWExZmM2YWI1MDU0ZjE1YWY1NDNiY2E4YTkwZjA5MTE0YmM1NzI4YTc1YjI0MWFmZDlkYTBlZjVlMDk2ZGQxZSIsInZlcnNpb24iOjF9.jjBghJ66Gj_95AdDpWG2TR_MnuUtj8Fzc0M3KS9vqsM0iqtlu9khY8lXrFpMaIeDxVBYKltMMFdZWH8mVv2wCg
- type: loss
value: 1.4566329717636108
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDAyYjFmNzRlYzkyNmUxMWM5YWYxNjgwZGQ1ZDc1Njg0MzU1ZjM4MjI5MTJlODRiNDdiNGRjYTkzZGUxNGMyZiIsInZlcnNpb24iOjF9.2eH5b7DlPeVQ_zFGlvKyRvqrc7yyT8vcf3koJGKGysV00vCQew8sOmFEmDegiBka8gq3UL987Dd2yZCU3b64Cw
- type: gen_len
value: 59.6032
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNTFkOTlmMjFlODIxZWExZmQyMTQyNDYyY2RmNjY3YmM1NDFlNTI3OTRhZjU0MDQxNGY2NGQ4ZWY2OWFmNDliMyIsInZlcnNpb24iOjF9.K9qwFg3Flnu2-1H-WI9adj7yoBuJ3zBBDyda5BxRpJ1D4L_alLpCweqrVGuynOPl9PAWPuHo7bAG1y2zZNmmDw
- task:
type: summarization
name: Summarization
dataset:
name: xsum
type: xsum
config: default
split: test
metrics:
- type: rouge
value: 21.1644
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYzVjNWI1MWJmOTAzMDYxNjlkYzg4NjM3MzczMDJiYjNjYTg3NjYzNjQ3YTczYzg5MGU2NDcxOWQwZjdlODU4YSIsInZlcnNpb24iOjF9.CqB-ANpnx0GvwhsjeCzLB_RxaKqbnhc_980RG8fqDb2hNTk4LvDhqdDfkLFQMj8kvW4nQLLDSNUENQ7Uni9kCA
- type: rouge
value: 4.0659
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZjgzYTc5Y2VjZTliNmEwYTZjNTViODdjYTg2MGMyZTFkMjgyMjk0OGIzZjg1ODgzZDJmOTMwNmU4MzdlNTI2OCIsInZlcnNpb24iOjF9.1AfPtrpJ38Khz5vfRsN4Jwb3J_PdycddRH9DJtEccqmEz9BzDo-AO7Ts94sfVlYfSf3srplLHDcd_XFCwQtlBg
- type: rouge
value: 13.9414
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiODZlZTM2OTM5NzVlNDBjYjc2NDdhNmIxNDZhNGVjOTk4YzIzZmEwYzEwZmQ2ZmNmMjIzNzYwMzkxMjU1ODcxMCIsInZlcnNpb24iOjF9.vvp5PKmEp-Hyt46zgsvzjGOO8wrV0cDG68Z0VPqW2WfY5Sp3k3krEcKLATdQAQjfy96gKCCkQpBFefpjYWcmDA
- type: rouge
value: 17.0718
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZWEyN2QyOWFhMjAzZTk4NjU5ODU5YjgxMjczNzc1MTM5OTY1OTVjZDMwZDhjODFlZTVmODNkYzFhOTc5NzZhYiIsInZlcnNpb24iOjF9.PwJT3EYTV3KifWaySiwSTxGyBWTB8bHMuaXG3AyRvWkY2xju1BSaBjPGCcfmlZs1yJwghOH7N4dBW5yJBEp5DA
- type: loss
value: 3.002755880355835
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZGY0NjgzOGExZWZmNWI3ZTA1ZjRlYTU2OTZkYzk5NDdmOGVmNDdkNWU5YjViMWQyNGE2MTNkODhkZmQ1ZGE2OSIsInZlcnNpb24iOjF9.pWru9Nhl0aZThHz0qveOHmxTOCrZjHu9ySt5wI9MnGQ5ZEpxfufjpI196EMMn-KSSxAl-s7wHygtGC9_WtC1BQ
- type: gen_len
value: 71.2969
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2Q4ODZlZDY4YzE1MDEwNzJjNmNiYjI4NTU5NjRlNmY2NWNjYmUwNzcwZmY3NWVlZDA3NTMxY2Y2NWI1ODYwMCIsInZlcnNpb24iOjF9.0Y_lnTKQ5nmjnwAEju9T7xlLObWgwPLMOxlWDpjPBkDeW0bzHYqJcRADtFcvAhznJ3HktIV830QxjqkRYjZTDw
- task:
type: summarization
name: Summarization
dataset:
name: cnn_dailymail
type: cnn_dailymail
config: 3.0.0
split: test
metrics:
- type: rouge
value: 42.9764
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMzQ0NzAyNDM5YWIyZmY5OTUwNWZjNzRkYTliYWI0YzdmZTg5ZGVhZjJlMDBiZjg2YmE0ZjkxNzU4YjRkYTJjNCIsInZlcnNpb24iOjF9.dymlXdITNpMZpOaYvif-LcxaSRWKh8_RxV6mdBpuvlThPPi3-TwKCW20Fowor8H5RPsC0M1cfvNNzINCyApKCA
- type: rouge
value: 19.8711
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMTY5ZTI0OTEzNzI1NDRjYWE5MDk2NzVhZDEyNGEzNzU3OTE0NmJmYzhkMTU0MTI2MDVlOTdlNGU0OGI2MTdlNSIsInZlcnNpb24iOjF9.7xN2u4HjPL78CkkihB9I0befTn04IQqimvNlSHpc888arBm_qCtTGl7q7389ArpWUKEdkhvZ94BgB-Z_cXsqCg
- type: rouge
value: 29.5196
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNGY3YzRiYjljZWJhNGJiMTdhNDY5OTk0ODJhYWMxOGMwYjY4ZTBlNmQ0YTUxMzQ3YmZiNzg0ZDJiNTg4MzdiYyIsInZlcnNpb24iOjF9.Yj0ZaJelYcMJ-8SIon9x7GxRityWR3p0vcLNctTfcg6eCClalTQKBclCVgpDCO8WQyVxSz8EyCDb2qedRgF9CQ
- type: rouge
value: 39.959
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiYmI2MTE3NjQ2M2MwYjJmN2JmMTEwN2JkMDVhMWRiMDBiM2FlMzM5NjAxYWM1N2Q2MTA5MGRhZmI5ZDNjNzMwYSIsInZlcnNpb24iOjF9.A-2Ch4-M691OBAp4KmsYut10K3sF0fjw5ztutK_LTtn68Ne0x8w-u-7pEyjTuWJrJx4Q3Yb1eW3yeHPTnFI0DA
- type: loss
value: 3.014679193496704
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMmNiNWZlOGZmOWZhNWZjNTg1ODIzMzY2ZGYxNzBiOWIxMWRiZTQ5MzM5NjRlYTQwMzRjOGI3YzNmZDhmYjQ3MCIsInZlcnNpb24iOjF9.BZkiJxZG0RdFzNgxgcS8U6_zPT1t7rvs-603cnC1tjMMYF3Lbae7rExRb-fVHN_ofZV_w5vl4uRLQ3OxZUY5Ag
- type: gen_len
value: 81.956
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOTdhNGYxYWRlNTU0MzAxMWU1NzNmMTBjMmY3NzkzODAyYTMzZWYyZmNiMTViMzNmYTE0ZmFmNDdhMzQwMmJkNyIsInZlcnNpb24iOjF9.8lm84JtbCh-diuNQ01oXK6P8vV9CPyA8y-7D9o_OHb9Vk3pNEFM1jMSZVdEG9wFuMpWL3ARbXLadEPQB5HN8AQ
---
## `distilbart-cnn-12-6-samsum`
This model was trained using Amazon SageMaker and the new Hugging Face Deep Learning container.
For more information, see:
- [🤗 Transformers Documentation: Amazon SageMaker](https://huggingface.co/transformers/sagemaker.html)
- [Example Notebooks](https://github.com/huggingface/notebooks/tree/master/sagemaker)
- [Amazon SageMaker documentation for Hugging Face](https://docs.aws.amazon.com/sagemaker/latest/dg/hugging-face.html)
- [Python SDK SageMaker documentation for Hugging Face](https://sagemaker.readthedocs.io/en/stable/frameworks/huggingface/index.html)
- [Deep Learning Container](https://github.com/aws/deep-learning-containers/blob/master/available_images.md#huggingface-training-containers)
## Hyperparameters
```json
{
"dataset_name": "samsum",
"do_eval": true,
"do_train": true,
"fp16": true,
"learning_rate": 5e-05,
"model_name_or_path": "sshleifer/distilbart-cnn-12-6",
"num_train_epochs": 3,
"output_dir": "/opt/ml/model",
"per_device_eval_batch_size": 8,
"per_device_train_batch_size": 8,
"seed": 7
}
```
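For reference, a training job with these hyperparameters can be launched with the SageMaker Python SDK roughly as below; the entry point, source directory, instance type, and framework versions are illustrative assumptions, not a record of the exact job:
```python
import sagemaker
from sagemaker.huggingface import HuggingFace

role = sagemaker.get_execution_role()

hyperparameters = {
    "dataset_name": "samsum",
    "do_eval": True,
    "do_train": True,
    "fp16": True,
    "learning_rate": 5e-05,
    "model_name_or_path": "sshleifer/distilbart-cnn-12-6",
    "num_train_epochs": 3,
    "output_dir": "/opt/ml/model",
    "per_device_eval_batch_size": 8,
    "per_device_train_batch_size": 8,
    "seed": 7,
}

# entry point, source_dir, instance type, and versions are assumptions
huggingface_estimator = HuggingFace(
    entry_point="run_summarization.py",
    source_dir="./examples/pytorch/summarization",
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    role=role,
    transformers_version="4.6",
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters=hyperparameters,
)
huggingface_estimator.fit()
```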
## Train results
| key | value |
| --- | ----- |
| epoch | 3.0 |
| init_mem_cpu_alloc_delta | 180338 |
| init_mem_cpu_peaked_delta | 18282 |
| init_mem_gpu_alloc_delta | 1222242816 |
| init_mem_gpu_peaked_delta | 0 |
| train_mem_cpu_alloc_delta | 6971403 |
| train_mem_cpu_peaked_delta | 640733 |
| train_mem_gpu_alloc_delta | 4910897664 |
| train_mem_gpu_peaked_delta | 23331969536 |
| train_runtime | 155.2034 |
| train_samples | 14732 |
| train_samples_per_second | 2.242 |
## Eval results
| key | value |
| --- | ----- |
| epoch | 3.0 |
| eval_loss | 1.4209576845169067 |
| eval_mem_cpu_alloc_delta | 868003 |
| eval_mem_cpu_peaked_delta | 18250 |
| eval_mem_gpu_alloc_delta | 0 |
| eval_mem_gpu_peaked_delta | 328244736 |
| eval_runtime | 0.6088 |
| eval_samples | 818 |
| eval_samples_per_second | 1343.647 |
## Usage
```python
from transformers import pipeline
summarizer = pipeline("summarization", model="philschmid/distilbart-cnn-12-6-samsum")
conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
Philipp: Sure you can use the new Hugging Face Deep Learning Container.
Jeff: ok.
Jeff: and how can I get started?
Jeff: where can I find documentation?
Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
'''
summarizer(conversation)
```
| 9,723 | [
[
-0.051116943359375,
-0.050750732421875,
0.004199981689453125,
0.0031528472900390625,
-0.0098419189453125,
-0.0007605552673339844,
-0.01345062255859375,
-0.0204620361328125,
0.00910186767578125,
0.01727294921875,
-0.06671142578125,
-0.036285400390625,
-0.062347412109375,
-0.01337432861328125,
-0.0062103271484375,
0.0970458984375,
0.00321197509765625,
0.02142333984375,
0.000024020671844482422,
-0.01311492919921875,
-0.0089111328125,
-0.0228424072265625,
-0.04364013671875,
-0.04583740234375,
0.020172119140625,
0.01026153564453125,
0.05755615234375,
0.0269622802734375,
0.037078857421875,
0.030609130859375,
-0.0217437744140625,
-0.0092010498046875,
-0.03558349609375,
-0.0259552001953125,
0.0200958251953125,
-0.022430419921875,
-0.041015625,
0.01497650146484375,
0.03912353515625,
0.0343017578125,
-0.006221771240234375,
0.0290374755859375,
0.008636474609375,
0.05596923828125,
-0.0252532958984375,
0.025848388671875,
-0.0250396728515625,
0.0228118896484375,
-0.003871917724609375,
0.017822265625,
-0.00391387939453125,
-0.01727294921875,
0.0178985595703125,
-0.0377197265625,
0.035186767578125,
-0.007617950439453125,
0.104248046875,
0.041534423828125,
-0.0175933837890625,
-0.0021820068359375,
-0.025634765625,
0.054840087890625,
-0.050140380859375,
0.0173797607421875,
0.0233917236328125,
0.038116455078125,
-0.004802703857421875,
-0.066162109375,
-0.05389404296875,
-0.006374359130859375,
-0.0209808349609375,
0.009674072265625,
-0.0199127197265625,
-0.006622314453125,
0.040802001953125,
0.045013427734375,
-0.046844482421875,
-0.00424957275390625,
-0.03558349609375,
-0.01195526123046875,
0.04937744140625,
-0.0023212432861328125,
0.00989532470703125,
-0.0181121826171875,
-0.040557861328125,
-0.034820556640625,
-0.01224517822265625,
0.02215576171875,
0.010833740234375,
0.007442474365234375,
-0.031646728515625,
0.0239105224609375,
-0.031097412109375,
0.038818359375,
0.01177215576171875,
-0.00634002685546875,
0.05511474609375,
-0.0003826618194580078,
-0.034942626953125,
0.00775146484375,
0.08148193359375,
0.0293121337890625,
0.01245880126953125,
0.0175323486328125,
-0.0088043212890625,
-0.0110626220703125,
0.01200103759765625,
-0.09619140625,
-0.03582763671875,
0.015869140625,
-0.033477783203125,
-0.03790283203125,
0.005939483642578125,
-0.055938720703125,
-0.0043792724609375,
-0.0224456787109375,
0.0249176025390625,
-0.033538818359375,
-0.0171356201171875,
-0.00029349327087402344,
-0.01076507568359375,
0.043853759765625,
0.0232086181640625,
-0.0732421875,
0.016998291015625,
0.041412353515625,
0.0570068359375,
0.019500732421875,
-0.03057861328125,
-0.0231781005859375,
-0.0233612060546875,
-0.01003265380859375,
0.043731689453125,
0.0007853507995605469,
-0.016937255859375,
-0.0189666748046875,
0.025146484375,
0.0005459785461425781,
-0.039093017578125,
0.0401611328125,
-0.0305023193359375,
0.033111572265625,
-0.0145263671875,
-0.03582763671875,
-0.0357666015625,
0.0175018310546875,
-0.047088623046875,
0.0872802734375,
0.0280609130859375,
-0.05535888671875,
0.0239715576171875,
-0.0477294921875,
-0.01297760009765625,
0.006793975830078125,
-0.00933837890625,
-0.07037353515625,
-0.0103759765625,
0.0191497802734375,
0.0594482421875,
-0.0230560302734375,
0.0236358642578125,
-0.02484130859375,
-0.0229034423828125,
0.02301025390625,
-0.020263671875,
0.0950927734375,
0.0112457275390625,
-0.04583740234375,
0.00605010986328125,
-0.06402587890625,
0.00039839744567871094,
0.034423828125,
-0.006488800048828125,
0.007579803466796875,
-0.0440673828125,
0.010833740234375,
0.0121002197265625,
0.019989013671875,
-0.0291900634765625,
0.0194549560546875,
-0.006610870361328125,
0.0289306640625,
0.046478271484375,
-0.007183074951171875,
0.0158843994140625,
-0.02764892578125,
0.0244598388671875,
0.01131439208984375,
0.01305389404296875,
-0.0130462646484375,
-0.048980712890625,
-0.06256103515625,
-0.0200347900390625,
0.00858306884765625,
0.03558349609375,
-0.042816162109375,
0.049530029296875,
-0.01467132568359375,
-0.06439208984375,
-0.04754638671875,
0.007228851318359375,
0.038665771484375,
0.0662841796875,
0.03656005859375,
-0.0144805908203125,
-0.031890869140625,
-0.072998046875,
0.0007543563842773438,
0.00025391578674316406,
-0.00443267822265625,
0.0204315185546875,
0.051239013671875,
-0.03179931640625,
0.052734375,
-0.049224853515625,
-0.0289306640625,
-0.00576019287109375,
0.01983642578125,
0.0249176025390625,
0.047119140625,
0.0601806640625,
-0.052886962890625,
-0.034149169921875,
-0.0142364501953125,
-0.0701904296875,
0.015777587890625,
0.0021228790283203125,
-0.0185546875,
0.01331329345703125,
0.0225982666015625,
-0.050994873046875,
0.032135009765625,
0.04547119140625,
-0.0191650390625,
0.050445556640625,
0.003215789794921875,
0.005279541015625,
-0.09344482421875,
0.0084686279296875,
0.02105712890625,
-0.01503753662109375,
-0.03662109375,
0.004581451416015625,
0.00417327880859375,
-0.0145416259765625,
-0.04547119140625,
0.030059814453125,
-0.021942138671875,
0.005504608154296875,
-0.0158843994140625,
-0.0180511474609375,
0.0077667236328125,
0.06475830078125,
0.006534576416015625,
0.05120849609375,
0.061187744140625,
-0.041290283203125,
0.0489501953125,
0.050384521484375,
-0.034759521484375,
0.039031982421875,
-0.06011962890625,
0.0007901191711425781,
-0.00447845458984375,
0.03192138671875,
-0.056304931640625,
-0.031219482421875,
0.0347900390625,
-0.05810546875,
0.03619384765625,
-0.0206146240234375,
-0.019744873046875,
-0.03155517578125,
-0.01001739501953125,
0.018341064453125,
0.06317138671875,
-0.054962158203125,
0.03668212890625,
0.014495849609375,
0.00554656982421875,
-0.048309326171875,
-0.067626953125,
-0.00852203369140625,
-0.0199127197265625,
-0.040283203125,
0.0302734375,
-0.003948211669921875,
-0.0012731552124023438,
0.005218505859375,
0.01027679443359375,
-0.012115478515625,
0.0106353759765625,
0.0205230712890625,
0.033050537109375,
-0.0204620361328125,
-0.01244354248046875,
-0.00861358642578125,
-0.027069091796875,
0.0169525146484375,
0.00447845458984375,
0.035308837890625,
-0.0479736328125,
-0.012969970703125,
-0.06597900390625,
-0.007305145263671875,
0.031097412109375,
-0.004913330078125,
0.062286376953125,
0.08111572265625,
-0.038055419921875,
-0.0139312744140625,
-0.032562255859375,
-0.0257415771484375,
-0.0396728515625,
0.02178955078125,
-0.02178955078125,
-0.05426025390625,
0.041412353515625,
-0.020782470703125,
-0.0002415180206298828,
0.05279541015625,
0.044830322265625,
-0.00135040283203125,
0.06256103515625,
0.0260162353515625,
-0.0242919921875,
0.036773681640625,
-0.0606689453125,
-0.0113983154296875,
-0.06640625,
-0.03271484375,
-0.0272369384765625,
-0.0389404296875,
-0.042510986328125,
-0.021270751953125,
0.018280029296875,
0.012481689453125,
-0.050872802734375,
0.038818359375,
-0.062255859375,
0.0254058837890625,
0.05511474609375,
0.01885986328125,
-0.0010328292846679688,
-0.0007562637329101562,
-0.006793975830078125,
-0.00492095947265625,
-0.035919189453125,
-0.0218963623046875,
0.061309814453125,
0.03472900390625,
0.049072265625,
-0.0111541748046875,
0.050445556640625,
-0.004604339599609375,
0.01534271240234375,
-0.05584716796875,
0.03546142578125,
0.00724029541015625,
-0.0435791015625,
-0.0072174072265625,
-0.040740966796875,
-0.057861328125,
0.01464080810546875,
-0.0196533203125,
-0.04876708984375,
0.001598358154296875,
0.0013475418090820312,
-0.02178955078125,
0.027587890625,
-0.049072265625,
0.0882568359375,
-0.023651123046875,
-0.01506805419921875,
0.00453948974609375,
-0.06353759765625,
0.03936767578125,
0.0154571533203125,
-0.0130767822265625,
-0.0164337158203125,
0.0260162353515625,
0.045562744140625,
-0.041229248046875,
0.057647705078125,
-0.0254974365234375,
0.0158843994140625,
0.027618408203125,
-0.014251708984375,
0.0265655517578125,
-0.0007619857788085938,
0.004489898681640625,
0.04205322265625,
-0.0074310302734375,
-0.0391845703125,
-0.040191650390625,
0.05145263671875,
-0.08221435546875,
-0.0277252197265625,
-0.055419921875,
-0.0187530517578125,
0.0079498291015625,
0.0227203369140625,
0.031219482421875,
0.01812744140625,
-0.0165557861328125,
0.0209503173828125,
0.054656982421875,
-0.018707275390625,
0.033660888671875,
0.014801025390625,
-0.01149749755859375,
-0.0401611328125,
0.06549072265625,
-0.01535797119140625,
0.00958251953125,
0.0076141357421875,
0.0198822021484375,
-0.00936126708984375,
-0.003459930419921875,
-0.057830810546875,
0.018035888671875,
-0.035888671875,
-0.022918701171875,
-0.041412353515625,
-0.04241943359375,
-0.032928466796875,
-0.01058197021484375,
-0.047607421875,
-0.0192108154296875,
-0.038787841796875,
-0.0084381103515625,
0.056121826171875,
0.031402587890625,
0.00943756103515625,
0.03375244140625,
-0.057647705078125,
0.0235443115234375,
0.0236663818359375,
0.0283050537109375,
0.0016145706176757812,
-0.049835205078125,
-0.019989013671875,
0.013092041015625,
-0.046173095703125,
-0.049163818359375,
0.032928466796875,
0.01422882080078125,
0.037139892578125,
0.041046142578125,
-0.004222869873046875,
0.054290771484375,
-0.0257415771484375,
0.060943603515625,
0.0225372314453125,
-0.056671142578125,
0.039581298828125,
-0.0163726806640625,
0.0245361328125,
0.04547119140625,
0.04608154296875,
-0.0321044921875,
0.00420379638671875,
-0.07061767578125,
-0.041656494140625,
0.0654296875,
0.01788330078125,
0.0013217926025390625,
0.0267181396484375,
0.02630615234375,
-0.00472259521484375,
0.0240936279296875,
-0.060791015625,
-0.0467529296875,
-0.0292816162109375,
-0.01068115234375,
-0.00872802734375,
0.0095062255859375,
-0.019989013671875,
-0.046783447265625,
0.07684326171875,
-0.011138916015625,
0.026824951171875,
0.01282501220703125,
0.003711700439453125,
0.001781463623046875,
-0.0237579345703125,
0.02557373046875,
0.01488494873046875,
-0.01335906982421875,
-0.01082611083984375,
0.016387939453125,
-0.03704833984375,
-0.001834869384765625,
0.03790283203125,
-0.0158538818359375,
0.00399017333984375,
0.012847900390625,
0.084716796875,
0.00824737548828125,
-0.031951904296875,
0.0482177734375,
-0.018951416015625,
-0.03155517578125,
-0.040924072265625,
0.005496978759765625,
0.0157928466796875,
0.03192138671875,
0.01245880126953125,
0.01316070556640625,
0.0125579833984375,
-0.0294189453125,
0.0210418701171875,
0.03662109375,
-0.03424072265625,
-0.017120361328125,
0.06280517578125,
0.003803253173828125,
-0.0210113525390625,
0.06939697265625,
-0.0238037109375,
-0.053466796875,
0.06243896484375,
0.01297760009765625,
0.07861328125,
-0.0172119140625,
0.021820068359375,
0.0484619140625,
0.00849151611328125,
-0.0014047622680664062,
0.00786590576171875,
-0.01207733154296875,
-0.044769287109375,
-0.01360321044921875,
-0.053802490234375,
-0.035858154296875,
0.0242767333984375,
-0.06671142578125,
0.03753662109375,
-0.038360595703125,
-0.0081634521484375,
0.01505279541015625,
0.0170440673828125,
-0.07159423828125,
0.0232391357421875,
0.021484375,
0.0594482421875,
-0.05987548828125,
0.057830810546875,
0.04937744140625,
-0.0394287109375,
-0.06182861328125,
-0.015594482421875,
0.00879669189453125,
-0.0760498046875,
0.033416748046875,
0.037109375,
0.009429931640625,
1.7881393432617188e-7,
-0.055633544921875,
-0.0521240234375,
0.103271484375,
0.018096923828125,
-0.035003662109375,
0.0189361572265625,
0.006160736083984375,
0.036224365234375,
-0.005523681640625,
0.0265960693359375,
0.04693603515625,
0.0227813720703125,
0.031768798828125,
-0.056793212890625,
0.01236724853515625,
-0.038360595703125,
0.00014650821685791016,
0.01004791259765625,
-0.08160400390625,
0.06573486328125,
-0.01226806640625,
0.01226806640625,
0.005802154541015625,
0.059844970703125,
0.029327392578125,
0.03350830078125,
0.02545166015625,
0.0775146484375,
0.05322265625,
-0.0167694091796875,
0.0692138671875,
-0.0241546630859375,
0.049072265625,
0.0704345703125,
0.007720947265625,
0.037841796875,
0.012542724609375,
-0.026123046875,
0.052886962890625,
0.0635986328125,
-0.037994384765625,
0.031829833984375,
0.006927490234375,
-0.0221405029296875,
-0.0023479461669921875,
0.00659942626953125,
-0.039703369140625,
0.0223846435546875,
0.019561767578125,
-0.0343017578125,
-0.00821685791015625,
0.0025787353515625,
0.018218994140625,
-0.022735595703125,
-0.00580596923828125,
0.060791015625,
0.00420379638671875,
-0.042022705078125,
0.0616455078125,
-0.007335662841796875,
0.04595947265625,
-0.047393798828125,
-0.0034046173095703125,
-0.0020313262939453125,
0.0179443359375,
-0.0213623046875,
-0.0518798828125,
0.0137786865234375,
-0.0037784576416015625,
0.00004029273986816406,
-0.01236724853515625,
0.04302978515625,
-0.023162841796875,
-0.0521240234375,
0.0101318359375,
0.0215606689453125,
0.029541015625,
-0.005558013916015625,
-0.08642578125,
0.00403594970703125,
0.0025768280029296875,
-0.052490234375,
0.032623291015625,
0.036407470703125,
0.0306854248046875,
0.050140380859375,
0.048797607421875,
-0.013153076171875,
-0.003337860107421875,
0.0015306472778320312,
0.08563232421875,
-0.045684814453125,
-0.0195770263671875,
-0.05865478515625,
0.0457763671875,
-0.0162200927734375,
-0.042816162109375,
0.0489501953125,
0.04718017578125,
0.07330322265625,
-0.003124237060546875,
0.056915283203125,
-0.011322021484375,
0.0159912109375,
-0.02435302734375,
0.0572509765625,
-0.04547119140625,
-0.0147705078125,
-0.0235443115234375,
-0.0672607421875,
-0.00855255126953125,
0.0692138671875,
-0.0019025802612304688,
0.0236968994140625,
0.0340576171875,
0.055633544921875,
-0.005527496337890625,
0.0137939453125,
-0.003917694091796875,
0.016937255859375,
0.0187225341796875,
0.037506103515625,
0.03118896484375,
-0.056640625,
0.0301361083984375,
-0.06402587890625,
-0.032928466796875,
-0.00891876220703125,
-0.058990478515625,
-0.0721435546875,
-0.04449462890625,
-0.0404052734375,
-0.03997802734375,
-0.02362060546875,
0.0616455078125,
0.05780029296875,
-0.054901123046875,
-0.004337310791015625,
-0.0175018310546875,
-0.01049041748046875,
-0.0284576416015625,
-0.02032470703125,
0.035125732421875,
-0.01502227783203125,
-0.0731201171875,
-0.0014085769653320312,
-0.017333984375,
0.03045654296875,
-0.0284271240234375,
-0.0162811279296875,
0.00033855438232421875,
-0.03448486328125,
0.0211029052734375,
0.0214080810546875,
-0.039520263671875,
-0.0043487548828125,
-0.00362396240234375,
-0.00673675537109375,
0.0131072998046875,
0.02471923828125,
-0.056793212890625,
0.0250244140625,
0.0191650390625,
0.052642822265625,
0.0634765625,
0.005340576171875,
0.014556884765625,
-0.0435791015625,
0.0188140869140625,
0.01161956787109375,
0.0401611328125,
0.0321044921875,
-0.037139892578125,
0.037933349609375,
0.026824951171875,
-0.067626953125,
-0.041778564453125,
0.0034885406494140625,
-0.09197998046875,
-0.01151275634765625,
0.07440185546875,
-0.01435089111328125,
-0.04132080078125,
0.039093017578125,
-0.02178955078125,
0.04107666015625,
-0.0297698974609375,
0.057342529296875,
0.040283203125,
-0.0081329345703125,
-0.0214385986328125,
-0.042816162109375,
0.04718017578125,
0.01503753662109375,
-0.0462646484375,
-0.0157928466796875,
0.028289794921875,
0.0273895263671875,
0.01454925537109375,
0.0233001708984375,
-0.004978179931640625,
0.01334381103515625,
0.00789642333984375,
0.0300445556640625,
-0.0140838623046875,
-0.005664825439453125,
-0.0347900390625,
-0.0173187255859375,
-0.032806396484375,
-0.0285491943359375
]
] |
cross-encoder/ms-marco-MiniLM-L-2-v2 | 2021-08-05T08:39:25.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"text-classification",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | cross-encoder | null | null | cross-encoder/ms-marco-MiniLM-L-2-v2 | 1 | 28,545 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
---
# Cross-Encoder for MS Marco
This model was trained on the [MS Marco Passage Ranking](https://github.com/microsoft/MSMARCO-Passage-Ranking) task.
The model can be used for Information Retrieval: given a query, score the query against all candidate passages (e.g. retrieved with ElasticSearch), then sort the passages in decreasing order of score. See [SBERT.net Retrieve & Re-rank](https://www.sbert.net/examples/applications/retrieve_rerank/README.html) for more details; a combined re-ranking sketch follows the usage examples below. The training code is available here: [SBERT.net Training MS Marco](https://github.com/UKPLab/sentence-transformers/tree/master/examples/training/ms_marco)
## Usage with Transformers
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
model = AutoModelForSequenceClassification.from_pretrained('cross-encoder/ms-marco-MiniLM-L-2-v2')
tokenizer = AutoTokenizer.from_pretrained('cross-encoder/ms-marco-MiniLM-L-2-v2')
features = tokenizer(['How many people live in Berlin?', 'How many people live in Berlin?'], ['Berlin has a population of 3,520,031 registered inhabitants in an area of 891.82 square kilometers.', 'New York City is famous for the Metropolitan Museum of Art.'], padding=True, truncation=True, return_tensors="pt")
model.eval()
with torch.no_grad():
scores = model(**features).logits
print(scores)
```
## Usage with SentenceTransformers
The usage becomes easier when you have [SentenceTransformers](https://www.sbert.net/) installed. Then, you can use the pre-trained models like this:
```python
from sentence_transformers import CrossEncoder
model = CrossEncoder('cross-encoder/ms-marco-MiniLM-L-2-v2', max_length=512)
scores = model.predict([('Query', 'Paragraph1'), ('Query', 'Paragraph2'), ('Query', 'Paragraph3')])
```
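Putting the two usage snippets together, the sketch below shows the re-ranking step described above: score every (query, passage) pair with the cross-encoder and sort the passages in decreasing order of score. The query and passages are illustrative placeholders; in a real pipeline the candidates would come from a first-stage retriever such as ElasticSearch.
```python
from sentence_transformers import CrossEncoder

model = CrossEncoder('cross-encoder/ms-marco-MiniLM-L-2-v2', max_length=512)

query = 'How many people live in Berlin?'
passages = [
    'Berlin has a population of 3,520,031 registered inhabitants.',
    'New York City is famous for the Metropolitan Museum of Art.',
    'Berlin is the capital and largest city of Germany.',
]

# Score every (query, passage) pair, then sort passages by decreasing score.
scores = model.predict([(query, passage) for passage in passages])
ranked = sorted(zip(scores, passages), key=lambda pair: pair[0], reverse=True)
for score, passage in ranked:
    print(f'{score:.2f}\t{passage}')
```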
## Performance
In the following table, we provide various pre-trained Cross-Encoders together with their performance on the [TREC Deep Learning 2019](https://microsoft.github.io/TREC-2019-Deep-Learning/) and the [MS Marco Passage Reranking](https://github.com/microsoft/MSMARCO-Passage-Ranking/) dataset.
| Model-Name | NDCG@10 (TREC DL 19) | MRR@10 (MS Marco Dev) | Docs / Sec |
| ------------- | :------------- | ----- | --- |
| **Version 2 models** | | | |
| cross-encoder/ms-marco-TinyBERT-L-2-v2 | 69.84 | 32.56 | 9000 |
| cross-encoder/ms-marco-MiniLM-L-2-v2 | 71.01 | 34.85 | 4100 |
| cross-encoder/ms-marco-MiniLM-L-4-v2 | 73.04 | 37.70 | 2500 |
| cross-encoder/ms-marco-MiniLM-L-6-v2 | 74.30 | 39.01 | 1800 |
| cross-encoder/ms-marco-MiniLM-L-12-v2 | 74.31 | 39.02 | 960 |
| **Version 1 models** | | | |
| cross-encoder/ms-marco-TinyBERT-L-2 | 67.43 | 30.15 | 9000 |
| cross-encoder/ms-marco-TinyBERT-L-4 | 68.09 | 34.50 | 2900 |
| cross-encoder/ms-marco-TinyBERT-L-6 | 69.57 | 36.13 | 680 |
| cross-encoder/ms-marco-electra-base | 71.99 | 36.41 | 340 |
| **Other models** | | | |
| nboost/pt-tinybert-msmarco | 63.63 | 28.80 | 2900 |
| nboost/pt-bert-base-uncased-msmarco | 70.94 | 34.75 | 340 |
| nboost/pt-bert-large-msmarco | 73.36 | 36.48 | 100 |
| Capreolus/electra-base-msmarco | 71.23 | 36.89 | 340 |
| amberoad/bert-multilingual-passage-reranking-msmarco | 68.40 | 35.54 | 330 |
| sebastian-hofstaetter/distilbert-cat-margin_mse-T2-msmarco | 72.82 | 37.88 | 720 |
Note: Runtime was computed on a V100 GPU.
| 3,233 | [
[
-0.03228759765625,
-0.043670654296875,
0.0250396728515625,
0.01169586181640625,
-0.0126800537109375,
0.0107574462890625,
-0.01340484619140625,
-0.038543701171875,
0.025115966796875,
0.0255584716796875,
-0.041229248046875,
-0.051055908203125,
-0.057952880859375,
0.0030689239501953125,
-0.033294677734375,
0.059326171875,
-0.0015010833740234375,
0.0123291015625,
-0.01372528076171875,
-0.0082550048828125,
-0.0193939208984375,
-0.0308074951171875,
-0.041351318359375,
-0.0218048095703125,
0.0360107421875,
0.0159454345703125,
0.05816650390625,
0.0297698974609375,
0.0419921875,
0.032958984375,
-0.0084075927734375,
0.0070343017578125,
-0.01413726806640625,
0.00011330842971801758,
0.005214691162109375,
-0.0286865234375,
-0.041717529296875,
-0.00911712646484375,
0.033416748046875,
0.0265960693359375,
0.0005869865417480469,
0.0199737548828125,
-0.00020587444305419922,
0.0433349609375,
-0.02935791015625,
-0.0037517547607421875,
-0.02587890625,
0.0184326171875,
-0.01494598388671875,
-0.018768310546875,
-0.03521728515625,
-0.0162811279296875,
0.0132904052734375,
-0.043914794921875,
0.0298919677734375,
0.01174163818359375,
0.09521484375,
0.0262603759765625,
-0.016448974609375,
-0.0196685791015625,
-0.035400390625,
0.05419921875,
-0.051483154296875,
0.053314208984375,
0.0137481689453125,
0.01326751708984375,
0.0089111328125,
-0.07318115234375,
-0.033660888671875,
-0.016265869140625,
-0.01439666748046875,
0.0192718505859375,
-0.03179931640625,
-0.00640869140625,
0.0313720703125,
0.0309600830078125,
-0.07501220703125,
-0.006092071533203125,
-0.053802490234375,
-0.00921630859375,
0.04913330078125,
0.020233154296875,
0.020172119140625,
-0.0188446044921875,
-0.024200439453125,
-0.01071929931640625,
-0.03790283203125,
0.016326904296875,
0.0207366943359375,
0.0008835792541503906,
-0.0154571533203125,
0.0308380126953125,
-0.017791748046875,
0.059600830078125,
0.00838470458984375,
0.007106781005859375,
0.058074951171875,
-0.019500732421875,
-0.0178985595703125,
0.0018739700317382812,
0.07366943359375,
0.021514892578125,
0.007843017578125,
-0.00957489013671875,
-0.0169677734375,
-0.01285552978515625,
0.030364990234375,
-0.066162109375,
-0.020111083984375,
0.022003173828125,
-0.040313720703125,
-0.0100860595703125,
0.0122833251953125,
-0.06396484375,
0.01207733154296875,
-0.0098724365234375,
0.04547119140625,
-0.0301055908203125,
0.0024127960205078125,
0.017730712890625,
-0.01068115234375,
0.02166748046875,
0.01345062255859375,
-0.055816650390625,
0.0009927749633789062,
0.0260009765625,
0.070556640625,
-0.0086517333984375,
-0.028472900390625,
-0.012115478515625,
-0.0028095245361328125,
-0.01251983642578125,
0.04266357421875,
-0.035430908203125,
-0.0234832763671875,
-0.0054473876953125,
0.021514892578125,
-0.01129150390625,
-0.0229034423828125,
0.053558349609375,
-0.034820556640625,
0.038299560546875,
-0.0094757080078125,
-0.0261993408203125,
-0.01184844970703125,
0.0177001953125,
-0.059112548828125,
0.09100341796875,
0.0029811859130859375,
-0.06390380859375,
0.01236724853515625,
-0.052947998046875,
-0.02569580078125,
-0.0120697021484375,
0.0030975341796875,
-0.05743408203125,
0.003444671630859375,
0.030609130859375,
0.019378662109375,
-0.0241546630859375,
0.007495880126953125,
-0.0130615234375,
-0.0340576171875,
0.012298583984375,
-0.031219482421875,
0.08197021484375,
0.029815673828125,
-0.0369873046875,
0.004062652587890625,
-0.0506591796875,
0.00893402099609375,
0.0214385986328125,
-0.031890869140625,
-0.0004200935363769531,
-0.021514892578125,
0.0103759765625,
0.0301055908203125,
0.032958984375,
-0.037689208984375,
0.00778961181640625,
-0.02099609375,
0.036376953125,
0.03466796875,
-0.0082244873046875,
0.025909423828125,
-0.02264404296875,
0.05029296875,
0.009490966796875,
0.03265380859375,
0.0006618499755859375,
-0.047607421875,
-0.06622314453125,
-0.01016998291015625,
0.038299560546875,
0.044158935546875,
-0.055450439453125,
0.04083251953125,
-0.038909912109375,
-0.0531005859375,
-0.062042236328125,
-0.007534027099609375,
0.031494140625,
0.0255126953125,
0.049713134765625,
-0.006671905517578125,
-0.055084228515625,
-0.07501220703125,
-0.025146484375,
0.0017757415771484375,
0.0030002593994140625,
0.0179595947265625,
0.04840087890625,
-0.019744873046875,
0.055328369140625,
-0.03997802734375,
-0.0163116455078125,
-0.034393310546875,
0.0002409219741821289,
0.019012451171875,
0.050018310546875,
0.04730224609375,
-0.052642822265625,
-0.0408935546875,
-0.01424407958984375,
-0.052215576171875,
0.005359649658203125,
0.0027637481689453125,
-0.01044464111328125,
0.02032470703125,
0.04638671875,
-0.052001953125,
0.0511474609375,
0.037109375,
-0.034393310546875,
0.02801513671875,
-0.03302001953125,
0.021881103515625,
-0.09063720703125,
0.0076446533203125,
-0.0025196075439453125,
-0.01190185546875,
-0.038787841796875,
-0.0119476318359375,
0.00701141357421875,
-0.001888275146484375,
-0.0261077880859375,
0.0248870849609375,
-0.04547119140625,
-0.0025634765625,
0.0092010498046875,
0.00580596923828125,
0.0127716064453125,
0.047637939453125,
0.0245361328125,
0.05841064453125,
0.039093017578125,
-0.0266571044921875,
0.0180816650390625,
0.027587890625,
-0.04620361328125,
0.0284881591796875,
-0.0693359375,
-0.0006070137023925781,
-0.00978851318359375,
0.0081329345703125,
-0.07489013671875,
0.01265716552734375,
0.0178985595703125,
-0.0655517578125,
0.023468017578125,
-0.01025390625,
-0.0296478271484375,
-0.049591064453125,
-0.01348114013671875,
0.024658203125,
0.037750244140625,
-0.03564453125,
0.04351806640625,
0.0255279541015625,
0.0005297660827636719,
-0.052947998046875,
-0.091552734375,
0.01371002197265625,
-0.004207611083984375,
-0.055267333984375,
0.04754638671875,
-0.01538848876953125,
0.01108551025390625,
0.0029315948486328125,
-0.00333404541015625,
-0.0031147003173828125,
-0.0084075927734375,
0.01454925537109375,
0.0247650146484375,
-0.01410675048828125,
0.0011739730834960938,
0.0009527206420898438,
-0.01641845703125,
0.005214691162109375,
-0.0157012939453125,
0.04803466796875,
-0.0132598876953125,
-0.0095062255859375,
-0.0189666748046875,
0.01491546630859375,
0.03729248046875,
-0.042388916015625,
0.054107666015625,
0.061004638671875,
-0.0243072509765625,
-0.00824737548828125,
-0.03143310546875,
-0.0075836181640625,
-0.037811279296875,
0.033843994140625,
-0.04351806640625,
-0.057952880859375,
0.039886474609375,
0.0227508544921875,
0.002056121826171875,
0.038360595703125,
0.036651611328125,
-0.0015344619750976562,
0.07733154296875,
0.03631591796875,
-0.0036258697509765625,
0.04937744140625,
-0.053619384765625,
0.02227783203125,
-0.058074951171875,
-0.044281005859375,
-0.049591064453125,
-0.033447265625,
-0.051300048828125,
-0.0264129638671875,
0.0228729248046875,
-0.01006317138671875,
-0.0170135498046875,
0.05230712890625,
-0.056396484375,
0.024322509765625,
0.0546875,
0.0208587646484375,
0.00777435302734375,
0.01097869873046875,
-0.019195556640625,
-0.0092620849609375,
-0.06256103515625,
-0.024505615234375,
0.0980224609375,
0.0125885009765625,
0.05255126953125,
0.0012960433959960938,
0.057952880859375,
0.0231170654296875,
-0.002849578857421875,
-0.03240966796875,
0.03302001953125,
-0.01132965087890625,
-0.05841064453125,
-0.0173187255859375,
-0.0316162109375,
-0.08074951171875,
0.0255584716796875,
-0.0159759521484375,
-0.043731689453125,
0.038543701171875,
-0.006702423095703125,
-0.029266357421875,
0.02374267578125,
-0.0419921875,
0.09814453125,
-0.0312347412109375,
-0.0268707275390625,
-0.007366180419921875,
-0.0555419921875,
0.0127410888671875,
0.01555633544921875,
0.002552032470703125,
0.006885528564453125,
-0.01245880126953125,
0.056854248046875,
-0.02764892578125,
0.0261077880859375,
-0.0109405517578125,
0.0114593505859375,
0.014068603515625,
-0.007411956787109375,
0.0287628173828125,
-0.0006060600280761719,
-0.0078125,
0.0250396728515625,
-0.003238677978515625,
-0.0299072265625,
-0.03192138671875,
0.061004638671875,
-0.06878662109375,
-0.031341552734375,
-0.040771484375,
-0.0272979736328125,
-0.0022487640380859375,
0.01551055908203125,
0.05780029296875,
0.03167724609375,
0.0003285408020019531,
0.032318115234375,
0.0557861328125,
-0.0235595703125,
0.04302978515625,
0.028411865234375,
-0.003978729248046875,
-0.055389404296875,
0.058074951171875,
0.0229034423828125,
0.012481689453125,
0.0430908203125,
-0.01367950439453125,
-0.0357666015625,
-0.040557861328125,
-0.0266571044921875,
0.01265716552734375,
-0.039947509765625,
-0.016815185546875,
-0.054840087890625,
-0.0307769775390625,
-0.037567138671875,
-0.005558013916015625,
-0.03125,
-0.03204345703125,
-0.01806640625,
-0.013092041015625,
0.016448974609375,
0.045745849609375,
0.00983428955078125,
0.0153045654296875,
-0.046051025390625,
0.0160369873046875,
0.00048232078552246094,
0.0117950439453125,
-0.00791168212890625,
-0.0657958984375,
-0.03411865234375,
-0.005168914794921875,
-0.0308380126953125,
-0.06207275390625,
0.051055908203125,
-0.00653839111328125,
0.054901123046875,
0.01139068603515625,
0.004329681396484375,
0.05633544921875,
-0.02923583984375,
0.0675048828125,
0.0123443603515625,
-0.064697265625,
0.050018310546875,
0.0019702911376953125,
0.02935791015625,
0.04730224609375,
0.041900634765625,
-0.040008544921875,
-0.0194244384765625,
-0.057861328125,
-0.071044921875,
0.0675048828125,
0.0224609375,
-0.00823974609375,
0.0055084228515625,
0.0015153884887695312,
-0.00902557373046875,
0.021209716796875,
-0.07275390625,
-0.036895751953125,
-0.03399658203125,
-0.0285491943359375,
-0.023681640625,
-0.01242828369140625,
0.01543426513671875,
-0.047027587890625,
0.0582275390625,
0.01316070556640625,
0.042999267578125,
0.045501708984375,
-0.031036376953125,
0.006603240966796875,
0.00827789306640625,
0.05181884765625,
0.048309326171875,
-0.0204010009765625,
-0.0017242431640625,
0.0159454345703125,
-0.038360595703125,
-0.0105438232421875,
0.0177459716796875,
-0.034820556640625,
0.0289764404296875,
0.025360107421875,
0.07574462890625,
0.016815185546875,
-0.029052734375,
0.048858642578125,
0.0038661956787109375,
-0.02093505859375,
-0.037689208984375,
-0.01491546630859375,
0.0015192031860351562,
0.028717041015625,
0.0185089111328125,
0.00479888916015625,
0.0191497802734375,
-0.031036376953125,
0.01177215576171875,
0.0269012451171875,
-0.044189453125,
-0.01540374755859375,
0.0682373046875,
0.0130615234375,
-0.032073974609375,
0.0516357421875,
0.001750946044921875,
-0.061553955078125,
0.03900146484375,
0.027374267578125,
0.078857421875,
-0.021392822265625,
0.013336181640625,
0.052215576171875,
0.051361083984375,
0.005542755126953125,
0.0262603759765625,
-0.01184844970703125,
-0.039886474609375,
-0.0010862350463867188,
-0.041351318359375,
-0.008880615234375,
-0.005039215087890625,
-0.050811767578125,
0.0222930908203125,
-0.01332855224609375,
-0.0245361328125,
-0.014373779296875,
0.0204925537109375,
-0.0631103515625,
0.012451171875,
0.00412750244140625,
0.08380126953125,
-0.041015625,
0.08038330078125,
0.04351806640625,
-0.066162109375,
-0.043731689453125,
-0.00978851318359375,
-0.0299835205078125,
-0.052215576171875,
0.04278564453125,
0.0094757080078125,
0.0087738037109375,
0.000009715557098388672,
-0.0267486572265625,
-0.061767578125,
0.111083984375,
0.0152740478515625,
-0.05126953125,
-0.0137939453125,
0.033111572265625,
0.03857421875,
-0.0259246826171875,
0.050933837890625,
0.032623291015625,
0.037200927734375,
-0.01457977294921875,
-0.0706787109375,
0.01125335693359375,
-0.03704833984375,
-0.0032672882080078125,
0.00603485107421875,
-0.062286376953125,
0.07940673828125,
-0.01666259765625,
0.0127410888671875,
0.0125732421875,
0.04534912109375,
0.01546478271484375,
0.02581787109375,
0.02642822265625,
0.06341552734375,
0.05157470703125,
-0.029876708984375,
0.0665283203125,
-0.042327880859375,
0.04290771484375,
0.06787109375,
0.01528167724609375,
0.06707763671875,
0.032379150390625,
-0.024627685546875,
0.05596923828125,
0.05474853515625,
-0.016387939453125,
0.038787841796875,
0.002918243408203125,
0.0012187957763671875,
-0.031005859375,
0.0289764404296875,
-0.05096435546875,
0.01800537109375,
0.011810302734375,
-0.060272216796875,
-0.005710601806640625,
-0.003978729248046875,
-0.00833892822265625,
-0.0123443603515625,
-0.0185546875,
0.03387451171875,
-0.00580596923828125,
-0.0430908203125,
0.051513671875,
0.0023021697998046875,
0.056427001953125,
-0.050018310546875,
0.0139312744140625,
-0.0194549560546875,
0.020294189453125,
-0.017059326171875,
-0.06591796875,
0.007144927978515625,
-0.0037059783935546875,
-0.0113067626953125,
-0.020843505859375,
0.036865234375,
-0.043975830078125,
-0.043060302734375,
0.0309600830078125,
0.0240936279296875,
0.0159912109375,
-0.00719451904296875,
-0.07806396484375,
0.0166778564453125,
0.0159912109375,
-0.037750244140625,
0.0084686279296875,
0.03179931640625,
0.0098114013671875,
0.050384521484375,
0.03662109375,
-0.00911712646484375,
0.031524658203125,
0.0025615692138671875,
0.053253173828125,
-0.06591796875,
-0.039642333984375,
-0.04345703125,
0.045806884765625,
-0.02197265625,
-0.040008544921875,
0.06805419921875,
0.07818603515625,
0.0748291015625,
-0.0241851806640625,
0.0504150390625,
-0.01110076904296875,
0.0186614990234375,
-0.029449462890625,
0.05877685546875,
-0.064208984375,
0.0188446044921875,
-0.016510009765625,
-0.0625,
-0.0132293701171875,
0.04803466796875,
-0.033111572265625,
0.019439697265625,
0.050384521484375,
0.07061767578125,
0.0006060600280761719,
-0.0019130706787109375,
0.0185394287109375,
0.01190185546875,
0.0135498046875,
0.06591796875,
0.0489501953125,
-0.06964111328125,
0.075927734375,
-0.032958984375,
0.01216888427734375,
-0.0166473388671875,
-0.03179931640625,
-0.06414794921875,
-0.043975830078125,
-0.0248565673828125,
-0.03173828125,
0.0123291015625,
0.0626220703125,
0.054840087890625,
-0.056304931640625,
-0.01556396484375,
-0.0017452239990234375,
0.007488250732421875,
-0.010467529296875,
-0.0172119140625,
0.032623291015625,
-0.0207366943359375,
-0.0718994140625,
0.0252227783203125,
0.0010271072387695312,
0.0006823539733886719,
-0.0185089111328125,
-0.03289794921875,
-0.022216796875,
0.0032825469970703125,
0.034393310546875,
0.00815582275390625,
-0.054962158203125,
-0.0093536376953125,
0.0142364501953125,
-0.022430419921875,
0.0221710205078125,
0.045806884765625,
-0.058868408203125,
0.01727294921875,
0.062286376953125,
0.031494140625,
0.068359375,
-0.01568603515625,
0.0207977294921875,
-0.0311279296875,
-0.003086090087890625,
0.0117340087890625,
0.043365478515625,
0.01078033447265625,
-0.01444244384765625,
0.045318603515625,
0.0296173095703125,
-0.04547119140625,
-0.061737060546875,
-0.01357269287109375,
-0.086669921875,
-0.026580810546875,
0.06787109375,
-0.0101470947265625,
-0.033416748046875,
0.0133819580078125,
-0.01111602783203125,
0.017822265625,
-0.0283660888671875,
0.035552978515625,
0.04962158203125,
0.004360198974609375,
-0.0204925537109375,
-0.043365478515625,
0.031005859375,
0.0176544189453125,
-0.052276611328125,
-0.01372528076171875,
0.013336181640625,
0.03564453125,
0.015167236328125,
0.033294677734375,
-0.0309600830078125,
0.0238037109375,
0.01204681396484375,
0.0310516357421875,
-0.02178955078125,
-0.03131103515625,
-0.0252685546875,
0.01305389404296875,
-0.03125,
-0.03863525390625
]
] |
codellama/CodeLlama-13b-hf | 2023-10-27T18:04:56.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"llama-2",
"code",
"arxiv:2308.12950",
"license:llama2",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | codellama | null | null | codellama/CodeLlama-13b-hf | 53 | 28,519 | transformers | 2023-08-24T16:31:44 | ---
language:
- code
pipeline_tag: text-generation
tags:
- llama-2
license: llama2
---
# **Code Llama**
Code Llama is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 34 billion parameters. This is the repository for the base 13B version in the Hugging Face Transformers format. This model is designed for general code synthesis and understanding. Links to other models can be found in the index at the bottom.
| | Base Model | Python | Instruct |
| --- | ----------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- |
| 7B | [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) | [codellama/CodeLlama-7b-Python-hf](https://huggingface.co/codellama/CodeLlama-7b-Python-hf) | [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf) |
| 13B | [codellama/CodeLlama-13b-hf](https://huggingface.co/codellama/CodeLlama-13b-hf) | [codellama/CodeLlama-13b-Python-hf](https://huggingface.co/codellama/CodeLlama-13b-Python-hf) | [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf) |
| 34B | [codellama/CodeLlama-34b-hf](https://huggingface.co/codellama/CodeLlama-34b-hf) | [codellama/CodeLlama-34b-Python-hf](https://huggingface.co/codellama/CodeLlama-34b-Python-hf) | [codellama/CodeLlama-34b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf) |
## Model Use
To use this model, please make sure to install transformers from `main` until the next version is released:
```bash
pip install git+https://github.com/huggingface/transformers.git@main accelerate
```
Model capabilities:
- [x] Code completion.
- [x] Infilling (a minimal sketch follows the completion example below).
- [ ] Instructions / chat.
- [ ] Python specialist.
```python
from transformers import AutoTokenizer
import transformers
import torch
model = "codellama/CodeLlama-13b-hf"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
# Generate a completion for a Python function from its prefix.
sequences = pipeline(
'import socket\n\ndef ping_exponential_backoff(host: str):',
do_sample=True,
top_k=10,
temperature=0.1,
top_p=0.95,
num_return_sequences=1,
eos_token_id=tokenizer.eos_token_id,
max_length=200,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
```
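The capability list above also marks infilling as supported. The following is a minimal sketch of that mode, assuming the Code Llama tokenizer in recent `transformers` releases expands the `<FILL_ME>` sentinel into the model's prefix/suffix infilling format; the prompt and generation settings are illustrative.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = "codellama/CodeLlama-13b-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# <FILL_ME> marks the span the model should fill in between prefix and suffix.
prompt = '''def remove_non_ascii(s: str) -> str:
    """ <FILL_ME>
    return result
'''

input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"].to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
# Keep only the newly generated tokens, then splice them back into the prompt.
filling = tokenizer.batch_decode(output[:, input_ids.shape[1]:], skip_special_tokens=True)[0]
print(prompt.replace("<FILL_ME>", filling))
```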
## Model Details
*Note: Use of this model is governed by the Meta license. Meta developed and publicly released the Code Llama family of large language models (LLMs).*
**Model Developers** Meta
**Variations** Code Llama comes in three model sizes, and three variants:
* Code Llama: base models designed for general code synthesis and understanding
* Code Llama - Python: designed specifically for Python
* Code Llama - Instruct: for instruction following and safer deployment
All variants are available in sizes of 7B, 13B and 34B parameters.
**This repository contains the base version of the 13B parameters model.**
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Code Llama is an auto-regressive language model that uses an optimized transformer architecture.
**Model Dates** Code Llama and its variants have been trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of Code Llama - Instruct will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** More information can be found in the paper "[Code Llama: Open Foundation Models for Code](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/)" or its [arXiv page](https://arxiv.org/abs/2308.12950).
## Intended Use
**Intended Use Cases** Code Llama and its variants are intended for commercial and research use in English and relevant programming languages. The base model Code Llama can be adapted for a variety of code synthesis and understanding tasks, Code Llama - Python is designed specifically to handle the Python programming language, and Code Llama - Instruct is intended to be safer to use for code assistant and generation applications.
**Out-of-Scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Code Llama and its variants.
## Hardware and Software
**Training Factors** We used custom training libraries. The training and fine-tuning of the released models have been performed on Meta’s Research Super Cluster.
**Carbon Footprint** In aggregate, training all 9 Code Llama models required 400K GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 65.3 tCO2eq, 100% of which were offset by Meta’s sustainability program.
## Training Data
All experiments reported here and the released models have been trained and fine-tuned using the same data as Llama 2 with different weights (see Section 2 and Table 1 in the [research paper](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/) for details).
## Evaluation Results
See evaluations for the main models and detailed ablations in Section 3 and safety evaluations in Section 4 of the research paper.
## Ethical Considerations and Limitations
Code Llama and its variants are a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Code Llama’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate or objectionable responses to user prompts. Therefore, before deploying any applications of Code Llama, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide](https://ai.meta.com/llama/responsible-use-guide). | 6774 | [
[
-0.0263519287109375,
-0.0504150390625,
0.0188140869140625,
0.040924072265625,
-0.0182342529296875,
0.0099029541015625,
-0.005916595458984375,
-0.045074462890625,
0.0202484130859375,
0.0347900390625,
-0.0301361083984375,
-0.041961669921875,
-0.045501708984375,
0.022064208984375,
-0.036041259765625,
0.0853271484375,
-0.0020351409912109375,
-0.02783203125,
-0.01537322998046875,
0.00287628173828125,
-0.0169219970703125,
-0.04193115234375,
-0.0172119140625,
-0.032867431640625,
0.0196533203125,
0.024871826171875,
0.050262451171875,
0.045166015625,
0.040283203125,
0.0268402099609375,
-0.0202789306640625,
0.0034236907958984375,
-0.02667236328125,
-0.0246429443359375,
0.01947021484375,
-0.040283203125,
-0.054534912109375,
-0.0013980865478515625,
0.0273284912109375,
0.0236663818359375,
-0.020965576171875,
0.03350830078125,
-0.01274871826171875,
0.03558349609375,
-0.02386474609375,
0.0164031982421875,
-0.049774169921875,
-0.005191802978515625,
0.0040130615234375,
-0.0092315673828125,
-0.016082763671875,
-0.036895751953125,
-0.007106781005859375,
-0.03338623046875,
-0.0035228729248046875,
-0.0017871856689453125,
0.08758544921875,
0.0380859375,
-0.0219573974609375,
-0.0173492431640625,
-0.02398681640625,
0.059417724609375,
-0.072021484375,
0.0029201507568359375,
0.0240478515625,
-0.005268096923828125,
-0.010650634765625,
-0.06585693359375,
-0.05426025390625,
-0.0254974365234375,
-0.01111602783203125,
-0.0002751350402832031,
-0.035400390625,
0.0049591064453125,
0.0301361083984375,
0.035064697265625,
-0.035675048828125,
0.00887298583984375,
-0.0352783203125,
-0.0157928466796875,
0.06683349609375,
0.01061248779296875,
0.03033447265625,
-0.0241851806640625,
-0.0272674560546875,
-0.0051116943359375,
-0.058563232421875,
0.005096435546875,
0.034027099609375,
-0.00893402099609375,
-0.057525634765625,
0.051788330078125,
-0.01462554931640625,
0.043121337890625,
0.01006317138671875,
-0.03594970703125,
0.0408935546875,
-0.0218963623046875,
-0.025543212890625,
-0.01220703125,
0.0718994140625,
0.038299560546875,
0.0252685546875,
0.004192352294921875,
-0.01446533203125,
0.0189971923828125,
0.00600433349609375,
-0.06561279296875,
-0.0121002197265625,
0.0256805419921875,
-0.04742431640625,
-0.052001953125,
-0.0169830322265625,
-0.05975341796875,
-0.0047760009765625,
0.0005216598510742188,
0.01219940185546875,
-0.0150909423828125,
-0.03387451171875,
0.0169677734375,
0.004360198974609375,
0.035247802734375,
0.004268646240234375,
-0.0655517578125,
0.00464630126953125,
0.0338134765625,
0.059295654296875,
0.0037250518798828125,
-0.038238525390625,
-0.0005426406860351562,
-0.00968170166015625,
-0.0207672119140625,
0.04815673828125,
-0.0330810546875,
-0.036163330078125,
-0.01384735107421875,
0.0097198486328125,
-0.005260467529296875,
-0.035797119140625,
0.0136260986328125,
-0.0259857177734375,
0.0002486705780029297,
0.01029205322265625,
-0.02337646484375,
-0.030303955078125,
0.0018262863159179688,
-0.04046630859375,
0.08843994140625,
0.0194091796875,
-0.056793212890625,
-0.0029125213623046875,
-0.042022705078125,
-0.02569580078125,
-0.0199127197265625,
-0.0019683837890625,
-0.051239013671875,
-0.005008697509765625,
0.0157928466796875,
0.039581298828125,
-0.0291748046875,
0.02978515625,
-0.01105499267578125,
-0.0298004150390625,
0.0158538818359375,
-0.01367950439453125,
0.08087158203125,
0.0260467529296875,
-0.0384521484375,
0.017242431640625,
-0.06463623046875,
-0.00749969482421875,
0.038330078125,
-0.0338134765625,
0.0125885009765625,
-0.01027679443359375,
0.0006170272827148438,
0.0006260871887207031,
0.03753662109375,
-0.02520751953125,
0.036834716796875,
-0.032257080078125,
0.05859375,
0.05120849609375,
-0.000316619873046875,
0.028656005859375,
-0.041534423828125,
0.052398681640625,
-0.006969451904296875,
0.016998291015625,
-0.0227813720703125,
-0.056915283203125,
-0.07623291015625,
-0.0206146240234375,
0.0013914108276367188,
0.053131103515625,
-0.036712646484375,
0.0496826171875,
-0.0016689300537109375,
-0.060882568359375,
-0.037384033203125,
0.0126190185546875,
0.0343017578125,
0.02557373046875,
0.026153564453125,
-0.00926971435546875,
-0.059326171875,
-0.0601806640625,
0.00838470458984375,
-0.034820556640625,
0.010467529296875,
0.0186309814453125,
0.06365966796875,
-0.04693603515625,
0.059112548828125,
-0.032073974609375,
-0.0009946823120117188,
-0.0273284912109375,
-0.017608642578125,
0.040557861328125,
0.045074462890625,
0.054107666015625,
-0.0404052734375,
-0.0224456787109375,
0.0022449493408203125,
-0.0650634765625,
-0.00959014892578125,
-0.0185394287109375,
-0.00555419921875,
0.0268096923828125,
0.0245208740234375,
-0.050323486328125,
0.039093017578125,
0.06317138671875,
-0.0208282470703125,
0.044677734375,
-0.0120697021484375,
-0.007244110107421875,
-0.08013916015625,
0.0177459716796875,
-0.0126495361328125,
-0.0032196044921875,
-0.03753662109375,
0.0255889892578125,
0.005825042724609375,
0.005382537841796875,
-0.04022216796875,
0.0284881591796875,
-0.03076171875,
0.000021159648895263672,
-0.008819580078125,
-0.0145416259765625,
-0.0005650520324707031,
0.05389404296875,
-0.0021381378173828125,
0.07012939453125,
0.04266357421875,
-0.0457763671875,
0.0291290283203125,
0.022613525390625,
-0.0241546630859375,
0.0128021240234375,
-0.07391357421875,
0.025421142578125,
0.00922393798828125,
0.0253448486328125,
-0.0640869140625,
-0.01532745361328125,
0.0260467529296875,
-0.038970947265625,
0.0072174072265625,
-0.0028362274169921875,
-0.035980224609375,
-0.03857421875,
-0.0186309814453125,
0.033416748046875,
0.0650634765625,
-0.044830322265625,
0.0316162109375,
0.0301971435546875,
0.0122222900390625,
-0.05389404296875,
-0.05340576171875,
0.006103515625,
-0.033538818359375,
-0.05517578125,
0.0330810546875,
-0.021331787109375,
-0.01134490966796875,
-0.01385498046875,
0.00699615478515625,
0.0006566047668457031,
0.0220489501953125,
0.033477783203125,
0.0304412841796875,
-0.00913238525390625,
-0.0162811279296875,
-0.0028591156005859375,
-0.01174163818359375,
0.005405426025390625,
0.00826263427734375,
0.05950927734375,
-0.031341552734375,
-0.0179901123046875,
-0.0462646484375,
0.01186370849609375,
0.040557861328125,
-0.018280029296875,
0.046478271484375,
0.03302001953125,
-0.0266571044921875,
-0.000713348388671875,
-0.049346923828125,
0.002513885498046875,
-0.042388916015625,
0.022430419921875,
-0.019439697265625,
-0.060577392578125,
0.053131103515625,
0.007556915283203125,
0.01493072509765625,
0.040130615234375,
0.05859375,
0.00594329833984375,
0.0557861328125,
0.0682373046875,
-0.0290069580078125,
0.0298614501953125,
-0.045928955078125,
0.006313323974609375,
-0.0584716796875,
-0.031829833984375,
-0.043609619140625,
-0.005146026611328125,
-0.050262451171875,
-0.03472900390625,
0.0224609375,
0.01424407958984375,
-0.03936767578125,
0.053985595703125,
-0.06365966796875,
0.0311431884765625,
0.033477783203125,
0.00443267822265625,
0.0258636474609375,
0.0030117034912109375,
-0.004638671875,
0.022064208984375,
-0.035858154296875,
-0.050323486328125,
0.09014892578125,
0.035125732421875,
0.063720703125,
-0.00533294677734375,
0.0667724609375,
0.0017938613891601562,
0.0252685546875,
-0.044708251953125,
0.044647216796875,
0.0210418701171875,
-0.037872314453125,
-0.0019397735595703125,
-0.019927978515625,
-0.0682373046875,
0.01070404052734375,
0.00530242919921875,
-0.061920166015625,
0.006954193115234375,
0.0008525848388671875,
-0.018463134765625,
0.0274658203125,
-0.05224609375,
0.04931640625,
-0.01386260986328125,
0.0005478858947753906,
-0.010162353515625,
-0.041290283203125,
0.041656494140625,
-0.00266265869140625,
0.0126495361328125,
-0.01177978515625,
-0.01279449462890625,
0.05120849609375,
-0.03985595703125,
0.07940673828125,
0.0072021484375,
-0.0268707275390625,
0.046142578125,
-0.0023021697998046875,
0.036712646484375,
0.0036029815673828125,
-0.0171356201171875,
0.049163818359375,
0.000012755393981933594,
-0.0194854736328125,
-0.005901336669921875,
0.04742431640625,
-0.08099365234375,
-0.055389404296875,
-0.03240966796875,
-0.0303802490234375,
0.0226593017578125,
0.01165771484375,
0.035400390625,
0.00533294677734375,
0.011444091796875,
0.01041412353515625,
0.030975341796875,
-0.0494384765625,
0.050384521484375,
0.023773193359375,
-0.0205841064453125,
-0.03631591796875,
0.061859130859375,
-0.01052093505859375,
0.0180816650390625,
0.0175628662109375,
0.003711700439453125,
-0.00966644287109375,
-0.0322265625,
-0.0347900390625,
0.03564453125,
-0.0457763671875,
-0.0419921875,
-0.050262451171875,
-0.031219482421875,
-0.027252197265625,
-0.0228424072265625,
-0.024200439453125,
-0.0208587646484375,
-0.05059814453125,
-0.01221466064453125,
0.059539794921875,
0.054779052734375,
-0.00048351287841796875,
0.0360107421875,
-0.046905517578125,
0.030517578125,
0.00921630859375,
0.027923583984375,
0.00440216064453125,
-0.040985107421875,
-0.00783538818359375,
-0.00286865234375,
-0.040435791015625,
-0.069091796875,
0.0455322265625,
0.01087188720703125,
0.04388427734375,
0.01010894775390625,
-0.00115203857421875,
0.0517578125,
-0.031280517578125,
0.06744384765625,
0.025604248046875,
-0.085205078125,
0.048431396484375,
-0.018157958984375,
0.00559234619140625,
0.00601959228515625,
0.021697998046875,
-0.034210205078125,
-0.02288818359375,
-0.0540771484375,
-0.05914306640625,
0.04803466796875,
0.019256591796875,
0.021392822265625,
-0.00043463706970214844,
0.0309600830078125,
-0.006317138671875,
0.01776123046875,
-0.08001708984375,
-0.0301055908203125,
-0.0268096923828125,
-0.01885986328125,
-0.0032138824462890625,
-0.01849365234375,
-0.002384185791015625,
-0.02294921875,
0.0361328125,
-0.01092529296875,
0.047607421875,
0.01486968994140625,
-0.0147552490234375,
-0.0191802978515625,
-0.0004138946533203125,
0.048309326171875,
0.04412841796875,
-0.003665924072265625,
-0.01073455810546875,
0.031005859375,
-0.0421142578125,
0.0182647705078125,
-0.00443267822265625,
-0.0066680908203125,
-0.016815185546875,
0.040252685546875,
0.048309326171875,
0.006984710693359375,
-0.0560302734375,
0.038665771484375,
0.007904052734375,
-0.0224151611328125,
-0.037261962890625,
0.0179901123046875,
0.0245361328125,
0.024444580078125,
0.02117919921875,
-0.0010366439819335938,
-0.00893402099609375,
-0.0264892578125,
0.004840850830078125,
0.0258636474609375,
0.012115478515625,
-0.0261993408203125,
0.06927490234375,
0.01163482666015625,
-0.0247802734375,
0.038330078125,
0.003269195556640625,
-0.042633056640625,
0.0906982421875,
0.050537109375,
0.05718994140625,
-0.015380859375,
0.01023101806640625,
0.03759765625,
0.04339599609375,
0.00295257568359375,
0.031219482421875,
0.00458526611328125,
-0.041778564453125,
-0.0240936279296875,
-0.05975341796875,
-0.0236358642578125,
0.005741119384765625,
-0.0325927734375,
0.0260162353515625,
-0.048309326171875,
-0.005565643310546875,
-0.02410888671875,
0.00864410400390625,
-0.050994873046875,
-0.0003578662872314453,
0.006317138671875,
0.07183837890625,
-0.049896240234375,
0.06695556640625,
0.0406494140625,
-0.0511474609375,
-0.06982421875,
-0.01812744140625,
-0.006504058837890625,
-0.0909423828125,
0.040008544921875,
0.0216827392578125,
0.00536346435546875,
0.0096282958984375,
-0.06719970703125,
-0.081787109375,
0.0977783203125,
0.032684326171875,
-0.037322998046875,
-0.0033397674560546875,
0.0125732421875,
0.0391845703125,
-0.027862548828125,
0.038726806640625,
0.04681396484375,
0.0303497314453125,
-0.006587982177734375,
-0.08782958984375,
0.023193359375,
-0.0296783447265625,
0.01230621337890625,
-0.0165252685546875,
-0.08148193359375,
0.07940673828125,
-0.04034423828125,
-0.01013946533203125,
0.0304107666015625,
0.0533447265625,
0.040008544921875,
0.01294708251953125,
0.027252197265625,
0.042572021484375,
0.048248291015625,
-0.0018024444580078125,
0.08087158203125,
-0.03668212890625,
0.04010009765625,
0.038330078125,
-0.006557464599609375,
0.05389404296875,
0.0287017822265625,
-0.039031982421875,
0.05682373046875,
0.058685302734375,
-0.015899658203125,
0.0195465087890625,
0.0221099853515625,
-0.00421142578125,
-0.004268646240234375,
-0.0010690689086914062,
-0.056976318359375,
0.0305938720703125,
0.0242156982421875,
-0.0290069580078125,
0.0038585662841796875,
-0.016448974609375,
0.02093505859375,
-0.0137176513671875,
-0.002346038818359375,
0.0474853515625,
0.015289306640625,
-0.036224365234375,
0.086669921875,
0.00913238525390625,
0.07440185546875,
-0.03472900390625,
-0.00788116455078125,
-0.0305938720703125,
0.004695892333984375,
-0.039764404296875,
-0.039398193359375,
0.01522064208984375,
0.0186614990234375,
0.00009173154830932617,
-0.007793426513671875,
0.0380859375,
-0.006175994873046875,
-0.0404052734375,
0.02801513671875,
0.01453399658203125,
0.0223388671875,
0.01436614990234375,
-0.055877685546875,
0.03399658203125,
0.0144195556640625,
-0.03558349609375,
0.0205535888671875,
0.01070404052734375,
0.01165771484375,
0.06842041015625,
0.054962158203125,
-0.00994110107421875,
0.01520538330078125,
-0.01380157470703125,
0.08294677734375,
-0.054107666015625,
-0.026702880859375,
-0.06329345703125,
0.050811767578125,
0.0152740478515625,
-0.035797119140625,
0.0452880859375,
0.027801513671875,
0.061920166015625,
-0.008087158203125,
0.0572509765625,
-0.01806640625,
0.00630950927734375,
-0.031585693359375,
0.05078125,
-0.051971435546875,
0.02783203125,
-0.038726806640625,
-0.06732177734375,
-0.0196533203125,
0.0684814453125,
-0.0044097900390625,
0.0090789794921875,
0.040740966796875,
0.07464599609375,
0.0181427001953125,
-0.007389068603515625,
0.0129852294921875,
0.0162506103515625,
0.030120849609375,
0.062744140625,
0.06829833984375,
-0.049713134765625,
0.053314208984375,
-0.0467529296875,
-0.0195770263671875,
-0.022247314453125,
-0.07598876953125,
-0.07464599609375,
-0.038177490234375,
-0.025604248046875,
-0.032135009765625,
-0.0193634033203125,
0.0714111328125,
0.047271728515625,
-0.045562744140625,
-0.037109375,
-0.01239013671875,
0.0277557373046875,
-0.0081024169921875,
-0.016204833984375,
0.0246124267578125,
-0.0122833251953125,
-0.061187744140625,
0.02069091796875,
-0.0018901824951171875,
0.012054443359375,
-0.0215606689453125,
-0.018890380859375,
-0.01352691650390625,
0.0010614395141601562,
0.03326416015625,
0.025421142578125,
-0.062744140625,
-0.0171356201171875,
0.0089569091796875,
-0.0148162841796875,
0.01036834716796875,
0.029510498046875,
-0.04888916015625,
-0.001323699951171875,
0.02801513671875,
0.032318115234375,
0.029052734375,
-0.0171356201171875,
0.0170135498046875,
-0.0296478271484375,
0.032562255859375,
-0.0014896392822265625,
0.0367431640625,
0.0088043212890625,
-0.043701171875,
0.049407958984375,
0.0238189697265625,
-0.0528564453125,
-0.06768798828125,
0.00864410400390625,
-0.0810546875,
-0.0139007568359375,
0.095458984375,
-0.01090240478515625,
-0.02471923828125,
0.013641357421875,
-0.0275421142578125,
0.0217742919921875,
-0.030120849609375,
0.053741455078125,
0.0224609375,
-0.0083465576171875,
-0.0141754150390625,
-0.02667236328125,
0.0205230712890625,
0.0216217041015625,
-0.07037353515625,
-0.010894775390625,
0.023895263671875,
0.0311279296875,
0.0139007568359375,
0.054962158203125,
-0.00545501708984375,
0.01447296142578125,
0.00499725341796875,
0.031158447265625,
-0.006084442138671875,
-0.013275146484375,
-0.025604248046875,
-0.00634765625,
-0.007232666015625,
-0.004787445068359375
]
] |
augtoma/qCammel-70-x | 2023-07-27T16:47:02.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"llama-2",
"qCammel-70",
"en",
"arxiv:2305.12031",
"arxiv:2305.14314",
"arxiv:2302.70971",
"license:other",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | augtoma | null | null | augtoma/qCammel-70-x | 20 | 28,401 | transformers | 2023-07-23T00:39:34 | ---
license: other
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- pytorch
- llama
- llama-2
- qCammel-70
library_name: transformers
---
# qCammel-70
qCammel-70 is a fine-tuned version of the Llama-2 70B model, trained on a distilled dataset of 15,000 instructions using QLoRA. This model is optimized for academic medical knowledge and instruction-following capabilities.
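The card does not include a usage snippet, so here is a minimal sketch of loading the model for inference with `transformers`, assuming standard Llama-2-style causal LM loading. The prompt is an illustrative placeholder; no prompt template is documented here.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = "augtoma/qCammel-70-x"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# A 70B model in fp16 does not fit on a single consumer GPU;
# device_map="auto" shards it across the available devices.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "What are the common symptoms of iron-deficiency anemia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```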
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept the License before downloading this model.*
The fine-tuning process applied to qCammel-70 uses a distilled dataset of 15,000 instructions and is performed with QLoRA, which trains low-rank adapters on top of a quantized, frozen base model.
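For concreteness, the sketch below shows an illustrative QLoRA setup of the kind described here, using `bitsandbytes` 4-bit quantization with `peft` low-rank adapters. This is not the authors' training code: the base checkpoint name, adapter hyperparameters, and target modules are assumptions for illustration only.
```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# Load the frozen base model in 4-bit NF4 precision (the "Q" in QLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-70b-hf",  # assumed base checkpoint
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach trainable low-rank adapters; hyperparameters are hypothetical.
lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```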
**Variations** The original Llama 2 has parameter sizes of 7B, 13B, and 70B. This is the fine-tuned version of the 70B model.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** qCammel-70 is based on the Llama 2 architecture, an auto-regressive language model that uses a decoder-only transformer architecture.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
Llama 2 is licensed under the LLAMA 2 Community License, Copyright © Meta Platforms, Inc. All Rights Reserved
**Research Papers**
- [Clinical Camel: An Open-Source Expert-Level Medical Language Model with Dialogue-Based Knowledge Encoding](https://arxiv.org/abs/2305.12031)
- [QLoRA: Efficient Finetuning of Quantized LLMs](https://arxiv.org/abs/2305.14314)
- [LLaMA: Open and Efficient Foundation Language Models](https://arxiv.org/abs/2302.13971)
| 1,811 | [
[
-0.0020751953125,
-0.043212890625,
0.034881591796875,
0.0110931396484375,
-0.045745849609375,
-0.01242828369140625,
0.0012865066528320312,
-0.03680419921875,
-0.0123291015625,
0.05712890625,
-0.02337646484375,
-0.049072265625,
-0.05059814453125,
0.0017137527465820312,
-0.0239410400390625,
0.08184814453125,
-0.00750732421875,
0.0399169921875,
-0.006084442138671875,
-0.0164947509765625,
-0.018218994140625,
-0.04803466796875,
-0.045013427734375,
-0.0426025390625,
0.03485107421875,
0.021728515625,
0.048248291015625,
0.056396484375,
0.03790283203125,
0.013946533203125,
-0.0301666259765625,
0.006122589111328125,
-0.0445556640625,
-0.01081085205078125,
0.00916290283203125,
-0.05670166015625,
-0.06561279296875,
-0.02685546875,
0.0279388427734375,
0.015899658203125,
-0.0290679931640625,
0.034515380859375,
-0.00862884521484375,
0.058197021484375,
-0.04547119140625,
0.0215301513671875,
-0.03973388671875,
-0.004673004150390625,
-0.01473236083984375,
-0.00812530517578125,
-0.0166473388671875,
-0.0269317626953125,
0.01107025146484375,
-0.0263519287109375,
-0.0010385513305664062,
0.00231170654296875,
0.05267333984375,
0.0180816650390625,
-0.04644775390625,
-0.0313720703125,
-0.03472900390625,
0.064208984375,
-0.07342529296875,
0.007965087890625,
0.05230712890625,
0.03619384765625,
-0.0038204193115234375,
-0.06695556640625,
-0.03564453125,
-0.0283355712890625,
0.01186370849609375,
-0.004657745361328125,
-0.03125,
0.0188751220703125,
0.01390838623046875,
0.03839111328125,
-0.04376220703125,
0.0020275115966796875,
-0.068359375,
-0.00708770751953125,
0.045623779296875,
0.0201568603515625,
0.01158905029296875,
-0.00785064697265625,
-0.021514892578125,
-0.00464630126953125,
-0.060028076171875,
0.00026416778564453125,
0.00635528564453125,
-0.0172576904296875,
-0.032318115234375,
0.052978515625,
-0.04107666015625,
0.05352783203125,
-0.0013523101806640625,
-0.040313720703125,
0.0187835693359375,
-0.007228851318359375,
-0.03033447265625,
0.0169525146484375,
0.06561279296875,
0.028350830078125,
0.01345062255859375,
0.005405426025390625,
-0.0282440185546875,
-0.01181793212890625,
0.0141754150390625,
-0.062164306640625,
-0.029449462890625,
0.0296173095703125,
-0.04278564453125,
-0.030670166015625,
-0.02008056640625,
-0.0280609130859375,
-0.018890380859375,
-0.0258026123046875,
0.0247955322265625,
-0.03778076171875,
-0.002140045166015625,
0.0225830078125,
0.01552581787109375,
0.0258941650390625,
0.0244140625,
-0.048675537109375,
0.01506805419921875,
0.0311279296875,
0.062164306640625,
-0.00605010986328125,
-0.001270294189453125,
-0.00623321533203125,
0.0090179443359375,
-0.01480865478515625,
0.05950927734375,
-0.0282135009765625,
-0.036834716796875,
0.0031452178955078125,
0.00835418701171875,
0.016448974609375,
-0.054443359375,
0.03936767578125,
-0.049835205078125,
0.018524169921875,
-0.00928497314453125,
-0.02789306640625,
-0.051971435546875,
0.0176849365234375,
-0.047698974609375,
0.07568359375,
0.037445068359375,
-0.0438232421875,
-0.012725830078125,
-0.033355712890625,
-0.01312255859375,
-0.005107879638671875,
0.0035572052001953125,
-0.035186767578125,
0.0034618377685546875,
0.01158905029296875,
0.01390838623046875,
-0.048309326171875,
0.04437255859375,
-0.00975799560546875,
-0.041015625,
0.0179901123046875,
-0.04119873046875,
0.08428955078125,
0.0181427001953125,
-0.041229248046875,
0.0145111083984375,
-0.07598876953125,
-0.0013866424560546875,
0.02032470703125,
-0.0504150390625,
-0.0002853870391845703,
-0.0224761962890625,
-0.01165771484375,
0.0115509033203125,
0.03802490234375,
-0.016845703125,
0.01554107666015625,
-0.0269317626953125,
0.03619384765625,
0.037261962890625,
-0.0018205642700195312,
0.0147705078125,
-0.0479736328125,
0.057830810546875,
0.0111083984375,
0.034088134765625,
0.00555419921875,
-0.045257568359375,
-0.08282470703125,
-0.023529052734375,
0.016265869140625,
0.055694580078125,
-0.04931640625,
0.0243072509765625,
-0.0023212432861328125,
-0.05731201171875,
-0.07867431640625,
0.0176239013671875,
0.046875,
0.03253173828125,
0.044586181640625,
-0.0162506103515625,
-0.0313720703125,
-0.0799560546875,
0.0165863037109375,
-0.01364898681640625,
-0.0047454833984375,
0.01751708984375,
0.028594970703125,
-0.043487548828125,
0.042144775390625,
-0.0307464599609375,
-0.0196533203125,
-0.01172637939453125,
-0.00360870361328125,
0.0026416778564453125,
0.0322265625,
0.057098388671875,
-0.035369873046875,
-0.0208587646484375,
0.002758026123046875,
-0.05987548828125,
-0.01568603515625,
0.01898193359375,
-0.028106689453125,
0.0160064697265625,
0.016021728515625,
-0.0357666015625,
0.03240966796875,
0.0673828125,
0.0010595321655273438,
0.039794921875,
-0.0166473388671875,
0.0114898681640625,
-0.08978271484375,
0.0029811859130859375,
-0.0141143798828125,
-0.02099609375,
-0.043365478515625,
0.0141754150390625,
0.00872039794921875,
0.01279449462890625,
-0.05810546875,
0.037200927734375,
-0.028045654296875,
0.0218658447265625,
-0.041259765625,
0.004665374755859375,
0.01226043701171875,
0.048095703125,
-0.018829345703125,
0.051300048828125,
0.032623291015625,
-0.049560546875,
0.031280517578125,
0.01548004150390625,
-0.021270751953125,
0.02117919921875,
-0.08160400390625,
0.0284576416015625,
-0.00797271728515625,
0.0296478271484375,
-0.046844482421875,
-0.004474639892578125,
0.045074462890625,
-0.00882720947265625,
0.01261138916015625,
-0.006801605224609375,
-0.028045654296875,
-0.01146697998046875,
-0.0244140625,
0.060699462890625,
0.052093505859375,
-0.019989013671875,
0.037322998046875,
0.0238494873046875,
-0.0058441162109375,
-0.049163818359375,
-0.050689697265625,
-0.012725830078125,
-0.01099395751953125,
-0.014923095703125,
0.0262603759765625,
-0.020599365234375,
-0.0177001953125,
-0.01235198974609375,
0.00875091552734375,
-0.01366424560546875,
0.0013580322265625,
0.036102294921875,
0.0275726318359375,
-0.024627685546875,
0.00676727294921875,
0.043731689453125,
0.00139617919921875,
0.007419586181640625,
-0.006923675537109375,
0.057647705078125,
-0.0120697021484375,
-0.0201568603515625,
-0.0673828125,
0.01140594482421875,
0.04669189453125,
-0.011810302734375,
0.059722900390625,
0.041229248046875,
-0.03564453125,
0.024444580078125,
-0.03900146484375,
-0.011810302734375,
-0.03533935546875,
0.038543701171875,
-0.02313232421875,
-0.062164306640625,
0.023529052734375,
0.007755279541015625,
0.01111602783203125,
0.05352783203125,
0.0673828125,
0.000690460205078125,
0.07080078125,
0.03680419921875,
-0.0084991455078125,
0.024658203125,
-0.0203399658203125,
-0.00931549072265625,
-0.075927734375,
-0.042999267578125,
-0.0418701171875,
-0.012908935546875,
-0.0396728515625,
-0.042022705078125,
0.0208282470703125,
0.0241851806640625,
-0.04473876953125,
0.0107574462890625,
-0.039215087890625,
0.01081085205078125,
0.05120849609375,
0.0305328369140625,
0.03192138671875,
0.003604888916015625,
0.0006089210510253906,
0.02789306640625,
-0.057861328125,
-0.0672607421875,
0.09576416015625,
0.0357666015625,
0.053741455078125,
0.0129241943359375,
0.038970947265625,
0.0246124267578125,
0.040740966796875,
-0.056488037109375,
0.034912109375,
-0.00965118408203125,
-0.052276611328125,
0.006633758544921875,
-0.02435302734375,
-0.06427001953125,
0.01009368896484375,
-0.014862060546875,
-0.044464111328125,
0.0054473876953125,
0.0236968994140625,
-0.01708984375,
0.0166473388671875,
-0.0352783203125,
0.028564453125,
-0.023040771484375,
0.0103302001953125,
-0.0185699462890625,
-0.042144775390625,
0.047454833984375,
-0.01256561279296875,
-0.00113677978515625,
-0.02264404296875,
0.0013370513916015625,
0.055511474609375,
-0.0214385986328125,
0.0762939453125,
0.0022449493408203125,
-0.02569580078125,
0.04376220703125,
0.0075531005859375,
0.02752685546875,
0.017486572265625,
-0.01544189453125,
0.046844482421875,
0.027252197265625,
-0.0450439453125,
-0.023651123046875,
0.038330078125,
-0.08917236328125,
-0.05157470703125,
-0.0231781005859375,
-0.04180908203125,
0.008544921875,
0.00499725341796875,
0.04217529296875,
0.03485107421875,
0.002597808837890625,
0.0124969482421875,
0.04449462890625,
-0.031402587890625,
0.0028896331787109375,
0.047576904296875,
-0.008453369140625,
-0.0513916015625,
0.053314208984375,
-0.002368927001953125,
0.01812744140625,
0.032257080078125,
0.01349639892578125,
-0.0176239013671875,
-0.050079345703125,
-0.042938232421875,
0.0330810546875,
-0.0623779296875,
-0.03131103515625,
-0.0648193359375,
-0.03021240234375,
-0.027923583984375,
0.007083892822265625,
-0.01824951171875,
-0.03643798828125,
-0.058349609375,
0.004222869873046875,
0.037841796875,
0.061920166015625,
0.0245361328125,
0.051422119140625,
-0.075927734375,
0.0198211669921875,
0.028167724609375,
0.0013761520385742188,
0.01103973388671875,
-0.06658935546875,
-0.0161285400390625,
0.0288543701171875,
-0.039642333984375,
-0.07861328125,
0.02752685546875,
0.00960540771484375,
0.03314208984375,
0.03857421875,
0.01313018798828125,
0.0411376953125,
-0.02581787109375,
0.062347412109375,
0.01593017578125,
-0.06439208984375,
0.03741455078125,
-0.00980377197265625,
0.034515380859375,
0.035064697265625,
0.03289794921875,
-0.0168609619140625,
-0.0273284912109375,
-0.0278167724609375,
-0.057220458984375,
0.040008544921875,
0.01039886474609375,
0.0198822021484375,
0.0086212158203125,
0.0521240234375,
0.02142333984375,
0.02923583984375,
-0.06622314453125,
-0.027130126953125,
-0.0189056396484375,
-0.0457763671875,
0.0033321380615234375,
-0.059478759765625,
-0.03302001953125,
-0.0168609619140625,
0.0595703125,
-0.010650634765625,
0.0231781005859375,
0.0057830810546875,
-0.0223236083984375,
-0.004039764404296875,
0.0123748779296875,
0.081298828125,
0.05694580078125,
-0.01336669921875,
0.00592041015625,
0.0304718017578125,
-0.051055908203125,
0.0109405517578125,
0.0034389495849609375,
-0.0168609619140625,
-0.00801849365234375,
0.032440185546875,
0.07122802734375,
0.002887725830078125,
-0.069091796875,
0.0082550048828125,
-0.00861358642578125,
-0.021484375,
-0.0191192626953125,
0.029144287109375,
0.01055145263671875,
0.04931640625,
0.0172576904296875,
-0.0228271484375,
0.0203399658203125,
-0.01849365234375,
-0.0035152435302734375,
0.00988006591796875,
-0.044189453125,
-0.0287628173828125,
0.053924560546875,
-0.0013866424560546875,
-0.0322265625,
0.043853759765625,
-0.01136016845703125,
-0.029541015625,
0.04779052734375,
0.045806884765625,
0.04595947265625,
-0.0153045654296875,
0.017608642578125,
0.0224456787109375,
0.0163116455078125,
-0.0111083984375,
0.042266845703125,
0.012847900390625,
-0.0268096923828125,
-0.004302978515625,
-0.0589599609375,
-0.03228759765625,
0.008544921875,
-0.066650390625,
0.0144500732421875,
-0.030548095703125,
-0.031280517578125,
-0.02691650390625,
0.0007791519165039062,
-0.049713134765625,
0.025299072265625,
-0.005779266357421875,
0.06304931640625,
-0.02764892578125,
0.0836181640625,
0.059295654296875,
-0.0165252685546875,
-0.05731201171875,
-0.0096282958984375,
-0.004058837890625,
-0.08953857421875,
0.0294189453125,
-0.0005254745483398438,
0.0000699162483215332,
-0.00205230712890625,
-0.045013427734375,
-0.062164306640625,
0.11199951171875,
0.05010986328125,
-0.0736083984375,
0.00018990039825439453,
0.007617950439453125,
0.036346435546875,
-0.0270538330078125,
0.01605224609375,
0.045013427734375,
0.00820159912109375,
-0.004917144775390625,
-0.06939697265625,
0.00766754150390625,
-0.01245880126953125,
0.00396728515625,
-0.020660400390625,
-0.070068359375,
0.06597900390625,
-0.0283966064453125,
0.0099029541015625,
0.024139404296875,
0.055023193359375,
0.062744140625,
0.02960205078125,
0.0211334228515625,
0.02313232421875,
0.061370849609375,
0.007343292236328125,
0.0726318359375,
-0.052520751953125,
0.0259857177734375,
0.06927490234375,
-0.0024089813232421875,
0.06561279296875,
0.0203399658203125,
-0.0162811279296875,
0.05096435546875,
0.058929443359375,
0.00737762451171875,
0.041717529296875,
0.01552581787109375,
-0.020263671875,
0.00327301025390625,
-0.0202178955078125,
-0.050048828125,
0.038604736328125,
0.01213836669921875,
-0.04931640625,
0.015716552734375,
-0.01172637939453125,
0.017822265625,
-0.03326416015625,
-0.022735595703125,
0.048492431640625,
0.034576416015625,
-0.0254974365234375,
0.078125,
0.019683837890625,
0.0355224609375,
-0.041412353515625,
-0.0017976760864257812,
-0.023681640625,
-0.01153564453125,
-0.004772186279296875,
-0.023590087890625,
-0.00787353515625,
0.004730224609375,
-0.0117034912109375,
0.02935791015625,
0.037933349609375,
-0.03546142578125,
-0.0494384765625,
0.010589599609375,
0.032379150390625,
0.0130462646484375,
0.007274627685546875,
-0.060028076171875,
0.01059722900390625,
0.00433349609375,
-0.035980224609375,
0.0296173095703125,
0.02337646484375,
-0.0092010498046875,
0.06512451171875,
0.0355224609375,
0.00632476806640625,
0.00382232666015625,
0.0047760009765625,
0.07647705078125,
-0.04559326171875,
-0.01157379150390625,
-0.057464599609375,
0.0287017822265625,
0.020538330078125,
-0.03668212890625,
0.0164337158203125,
0.022064208984375,
0.044830322265625,
-0.00669097900390625,
0.048187255859375,
0.0307464599609375,
-0.005542755126953125,
-0.054443359375,
0.055938720703125,
-0.057861328125,
0.0164031982421875,
0.01470947265625,
-0.06683349609375,
-0.008026123046875,
0.049560546875,
0.0002651214599609375,
0.018310546875,
0.03704833984375,
0.05682373046875,
0.02093505859375,
-0.007762908935546875,
0.0245361328125,
0.01361083984375,
0.0230712890625,
0.049468994140625,
0.043060302734375,
-0.03668212890625,
0.04730224609375,
-0.0316162109375,
-0.021270751953125,
-0.037994384765625,
-0.042266845703125,
-0.08251953125,
-0.021331787109375,
-0.028564453125,
-0.03338623046875,
0.0263214111328125,
0.0726318359375,
0.050567626953125,
-0.0628662109375,
-0.0158233642578125,
0.0219268798828125,
-0.00159454345703125,
-0.0132598876953125,
-0.01145172119140625,
0.03912353515625,
0.01348114013671875,
-0.054595947265625,
0.017822265625,
0.01436614990234375,
0.033203125,
-0.037109375,
-0.0140228271484375,
0.0084686279296875,
0.015228271484375,
0.0438232421875,
0.0035266876220703125,
-0.06768798828125,
0.00787353515625,
0.005214691162109375,
-0.035919189453125,
0.00640869140625,
0.03436279296875,
-0.03839111328125,
0.0189056396484375,
0.0245361328125,
0.0235137939453125,
0.028106689453125,
-0.00658416748046875,
0.042724609375,
-0.0179443359375,
0.0251007080078125,
0.0181732177734375,
0.0299072265625,
0.01552581787109375,
-0.031890869140625,
0.024505615234375,
-0.0015659332275390625,
-0.04864501953125,
-0.06146240234375,
0.0089111328125,
-0.087890625,
-0.018463134765625,
0.111572265625,
-0.0193634033203125,
-0.0236053466796875,
0.00012230873107910156,
-0.04888916015625,
0.042144775390625,
-0.0283966064453125,
0.0780029296875,
0.020660400390625,
0.007781982421875,
-0.0299072265625,
-0.05706787109375,
0.02752685546875,
0.018310546875,
-0.05426025390625,
-0.03582763671875,
0.0203399658203125,
0.02777099609375,
-0.01528167724609375,
0.060028076171875,
-0.01470184326171875,
0.0234222412109375,
-0.0193328857421875,
0.016204833984375,
-0.0096282958984375,
-0.01165008544921875,
-0.02032470703125,
0.01340484619140625,
0.0273895263671875,
-0.01056671142578125
]
] |
Mizuiro-sakura/luke-japanese-large-sentiment-analysis-wrime | 2023-05-15T12:58:08.000Z | [
"transformers",
"pytorch",
"safetensors",
"luke",
"text-classification",
"sentiment-analysis",
"wrime",
"SentimentAnalysis",
"sentiment-classification",
"ja",
"dataset:shunk031/wrime",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | Mizuiro-sakura | null | null | Mizuiro-sakura/luke-japanese-large-sentiment-analysis-wrime | 9 | 28,255 | transformers | 2023-03-13T12:40:08 | ---
language: ja
license: mit
tags:
- luke
- sentiment-analysis
- wrime
- SentimentAnalysis
- pytorch
- sentiment-classification
datasets: shunk031/wrime
---
# This model is a fine-tuned version of luke-japanese-large-lite
This model is fine-tuned from studio-ousia/luke-japanese-large-lite. It can analyze which of eight emotions (joy, sadness, anticipation, surprise, anger, fear, disgust, trust) are contained in a sentence.
The model was fine-tuned on the wrime dataset (https://huggingface.co/datasets/shunk031/wrime).
# What is LUKE? [1]
LUKE (Language Understanding with Knowledge-based Embeddings) is a new pre-trained contextualized representation of words and entities based on transformer. LUKE treats words and entities in a given text as independent tokens, and outputs contextualized representations of them. LUKE adopts an entity-aware self-attention mechanism that is an extension of the self-attention mechanism of the transformer, and considers the types of tokens (words or entities) when computing attention scores.
LUKE achieves state-of-the-art results on five popular NLP benchmarks including SQuAD v1.1 (extractive question answering), CoNLL-2003 (named entity recognition), ReCoRD (cloze-style question answering), TACRED (relation classification), and Open Entity (entity typing).
luke-japanese is the Japanese version of LUKE, a knowledge-enhanced pre-trained Transformer model of words and entities. LUKE treats words and entities as independent tokens and outputs contextualized representations of them.
# How to use
Step 1: Install Python, PyTorch, and SentencePiece, and update transformers (older versions do not include LukeTokenizer).
Step 2: Run the following code.
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, LukeConfig
import torch

MODEL = "Mizuiro-sakura/luke-japanese-large-sentiment-analysis-wrime"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
config = LukeConfig.from_pretrained(MODEL, output_hidden_states=True)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, config=config)

text = 'すごく楽しかった。また行きたい。'  # "It was really fun. I want to go again."
max_seq_length = 512

# Tokenize; return_tensors="pt" already yields batched (1, seq_len) tensors.
token = tokenizer(text,
                  truncation=True,
                  max_length=max_seq_length,
                  padding="max_length",
                  return_tensors="pt")
with torch.no_grad():
    output = model(token['input_ids'], attention_mask=token['attention_mask'])

# The eight WRIME emotions, in the order of the model's output logits.
labels = ['joy', 'sadness', 'anticipation', 'surprise', 'anger', 'fear', 'disgust', 'trust']
max_index = output.logits.argmax(dim=-1).item()
print(labels[max_index])
```
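If you just want a score for every emotion rather than a single label, the generic `pipeline` interface should also work with this checkpoint. The following is a minimal sketch, assuming a reasonably recent version of transformers (older versions take `return_all_scores=True` instead of `top_k=None`); note that the printed label names come from the model config, so they may show up as LABEL_0 ... LABEL_7 in the order listed above.
```python
from transformers import pipeline

# The text-classification pipeline wraps the same tokenizer/model pair shown above;
# top_k=None returns a score for every label instead of only the best one.
classifier = pipeline(
    "text-classification",
    model="Mizuiro-sakura/luke-japanese-large-sentiment-analysis-wrime",
    top_k=None,
)
print(classifier("すごく楽しかった。また行きたい。"))  # "It was really fun. I want to go again."
```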
# Acknowledgments
I would like to thank Mr. Yamada (@ikuyamada), the developer of LUKE, and Studio Ousia (@StudioOusia).
# Citation
[1]
@inproceedings{yamada2020luke,
    title={LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention},
    author={Ikuya Yamada and Akari Asai and Hiroyuki Shindo and Hideaki Takeda and Yuji Matsumoto},
    booktitle={EMNLP},
    year={2020}
}
| 3,381 | [
[
-0.0372314453125,
-0.064208984375,
0.0205535888671875,
0.0153961181640625,
-0.028961181640625,
-0.0131988525390625,
-0.03509521484375,
-0.030914306640625,
0.037567138671875,
0.0159759521484375,
-0.05279541015625,
-0.042633056640625,
-0.05267333984375,
0.0216522216796875,
-0.0030040740966796875,
0.06787109375,
-0.01212310791015625,
-0.0125274658203125,
0.0237884521484375,
-0.011383056640625,
-0.030853271484375,
-0.021881103515625,
-0.052520751953125,
-0.01432037353515625,
0.0330810546875,
0.0223388671875,
0.045806884765625,
0.03961181640625,
0.04193115234375,
0.03564453125,
0.0113067626953125,
-0.01080322265625,
-0.0270233154296875,
0.004367828369140625,
0.0009679794311523438,
-0.05133056640625,
-0.02398681640625,
-0.0005655288696289062,
0.0428466796875,
0.0284423828125,
0.0115203857421875,
0.0154571533203125,
-0.00689697265625,
0.04986572265625,
-0.0372314453125,
0.0310516357421875,
-0.02813720703125,
0.017730712890625,
-0.00814056396484375,
-0.0184783935546875,
-0.016693115234375,
-0.01092529296875,
0.0017490386962890625,
-0.04736328125,
0.0216522216796875,
-0.007205963134765625,
0.08941650390625,
0.025115966796875,
-0.009033203125,
-0.00879669189453125,
-0.024871826171875,
0.061065673828125,
-0.053009033203125,
0.0229034423828125,
0.0027408599853515625,
0.0044403076171875,
0.00177001953125,
-0.05889892578125,
-0.0626220703125,
-0.00885009765625,
-0.023651123046875,
0.03656005859375,
-0.00518798828125,
-0.00032901763916015625,
0.0281829833984375,
0.02239990234375,
-0.035614013671875,
-0.00342559814453125,
-0.0036296844482421875,
-0.0002384185791015625,
0.054656982421875,
0.0002789497375488281,
0.044036865234375,
-0.047515869140625,
-0.038604736328125,
-0.03033447265625,
-0.01983642578125,
0.0221710205078125,
0.0212554931640625,
0.007568359375,
-0.030792236328125,
0.0548095703125,
-0.018463134765625,
0.034454345703125,
0.0211181640625,
-0.01904296875,
0.042205810546875,
-0.0125885009765625,
-0.0256195068359375,
-0.0022296905517578125,
0.09661865234375,
0.047027587890625,
0.040740966796875,
0.0005712509155273438,
-0.004550933837890625,
0.0098876953125,
-0.0020751953125,
-0.06134033203125,
0.00850677490234375,
0.0131378173828125,
-0.04229736328125,
-0.0028934478759765625,
0.00923919677734375,
-0.08392333984375,
-0.004047393798828125,
-0.01153564453125,
0.043853759765625,
-0.042236328125,
-0.03411865234375,
0.0192108154296875,
-0.004730224609375,
-0.001605987548828125,
0.0015926361083984375,
-0.057586669921875,
-0.004047393798828125,
0.016387939453125,
0.045501708984375,
-0.0016889572143554688,
-0.0458984375,
-0.007068634033203125,
0.001007080078125,
-0.0169830322265625,
0.0263519287109375,
-0.0179290771484375,
-0.033538818359375,
-0.00782012939453125,
0.0226593017578125,
-0.036224365234375,
-0.0181427001953125,
0.031768798828125,
-0.02166748046875,
0.042999267578125,
-0.010894775390625,
-0.0396728515625,
-0.007228851318359375,
-0.0013141632080078125,
-0.029815673828125,
0.064208984375,
0.019866943359375,
-0.0718994140625,
0.0166168212890625,
-0.0513916015625,
-0.03240966796875,
-0.001209259033203125,
-0.00897979736328125,
-0.033416748046875,
0.002285003662109375,
0.0274505615234375,
0.050811767578125,
-0.0108795166015625,
0.0150146484375,
-0.007564544677734375,
-0.0157012939453125,
0.041656494140625,
-0.0279541015625,
0.060302734375,
0.02093505859375,
-0.057464599609375,
0.0012903213500976562,
-0.043670654296875,
0.0003848075866699219,
0.047607421875,
-0.00135040283203125,
-0.0169677734375,
-0.025634765625,
0.0188446044921875,
0.0239715576171875,
0.031005859375,
-0.046173095703125,
0.00618743896484375,
-0.043853759765625,
0.0258636474609375,
0.040771484375,
0.0179595947265625,
0.0267486572265625,
-0.011566162109375,
0.04168701171875,
0.01654052734375,
0.0150909423828125,
-0.0264434814453125,
-0.02410888671875,
-0.08026123046875,
-0.0108489990234375,
0.0009517669677734375,
0.0236663818359375,
-0.05792236328125,
0.06573486328125,
-0.02740478515625,
-0.03912353515625,
-0.043212890625,
-0.0034847259521484375,
0.034149169921875,
0.051422119140625,
0.040435791015625,
-0.02313232421875,
-0.042144775390625,
-0.06268310546875,
-0.023590087890625,
-0.017059326171875,
0.016387939453125,
0.0065765380859375,
0.0252685546875,
-0.0214080810546875,
0.06158447265625,
-0.041473388671875,
-0.036895751953125,
-0.03338623046875,
0.003753662109375,
0.033203125,
0.03826904296875,
0.032562255859375,
-0.0595703125,
-0.04888916015625,
-0.00771331787109375,
-0.06878662109375,
0.0035686492919921875,
-0.0229949951171875,
-0.0233001708984375,
0.031280517578125,
0.03338623046875,
-0.06719970703125,
0.0282745361328125,
-0.0005059242248535156,
-0.05096435546875,
0.0189208984375,
-0.003597259521484375,
0.026885986328125,
-0.0975341796875,
0.01126861572265625,
0.0177154541015625,
0.0007948875427246094,
-0.0295562744140625,
0.0139007568359375,
0.01146697998046875,
0.00637054443359375,
-0.0211639404296875,
0.050811767578125,
-0.040069580078125,
0.01971435546875,
-0.0027828216552734375,
0.0283203125,
0.0089263916015625,
0.050689697265625,
0.01140594482421875,
0.045562744140625,
0.028167724609375,
-0.03228759765625,
0.0341796875,
0.0283660888671875,
-0.0189666748046875,
0.0462646484375,
-0.0439453125,
-0.0246734619140625,
-0.015228271484375,
0.00208282470703125,
-0.0740966796875,
-0.01776123046875,
0.02685546875,
-0.050048828125,
0.064453125,
-0.001617431640625,
-0.034820556640625,
-0.0447998046875,
-0.02496337890625,
0.0014333724975585938,
0.046478271484375,
-0.03839111328125,
0.042694091796875,
0.0161590576171875,
-0.0024566650390625,
-0.06829833984375,
-0.063232421875,
-0.00006347894668579102,
-0.02716064453125,
-0.035919189453125,
0.0258941650390625,
-0.0198974609375,
0.0043792724609375,
0.0024394989013671875,
-0.0014200210571289062,
0.03131103515625,
-0.0005321502685546875,
0.0081024169921875,
0.03692626953125,
-0.026611328125,
0.0121002197265625,
-0.00623321533203125,
-0.0269317626953125,
0.023406982421875,
0.01384735107421875,
0.05975341796875,
-0.01192474365234375,
-0.01335906982421875,
-0.053558349609375,
0.00659942626953125,
0.045196533203125,
-0.013763427734375,
0.046875,
0.08673095703125,
-0.016571044921875,
0.007080078125,
-0.06201171875,
-0.006671905517578125,
-0.038604736328125,
0.053619384765625,
-0.035400390625,
-0.047332763671875,
0.038848876953125,
0.0207366943359375,
-0.01387786865234375,
0.07147216796875,
0.0474853515625,
0.0005717277526855469,
0.07513427734375,
0.028350830078125,
-0.0236663818359375,
0.050689697265625,
-0.050262451171875,
0.024139404296875,
-0.07550048828125,
-0.0243988037109375,
-0.0168304443359375,
-0.0222625732421875,
-0.053192138671875,
-0.013153076171875,
0.02325439453125,
0.004116058349609375,
-0.025054931640625,
0.025543212890625,
-0.053009033203125,
0.0167083740234375,
0.027984619140625,
-0.0026493072509765625,
0.009857177734375,
-0.0013322830200195312,
0.00589752197265625,
-0.019561767578125,
-0.037384033203125,
-0.01971435546875,
0.07318115234375,
0.036895751953125,
0.055328369140625,
0.00403594970703125,
0.06536865234375,
-0.01102447509765625,
0.01708984375,
-0.0640869140625,
0.04742431640625,
0.0124969482421875,
-0.043670654296875,
-0.034881591796875,
-0.03814697265625,
-0.09613037109375,
0.021026611328125,
-0.0099945068359375,
-0.07366943359375,
0.00879669189453125,
-0.032440185546875,
-0.01129913330078125,
0.01129913330078125,
-0.0767822265625,
0.07244873046875,
-0.01123809814453125,
-0.032562255859375,
-0.0022602081298828125,
-0.06256103515625,
0.021697998046875,
0.0185546875,
0.008514404296875,
-0.01338958740234375,
-0.010162353515625,
0.0716552734375,
-0.0160064697265625,
0.05535888671875,
-0.0214080810546875,
0.0007939338684082031,
0.0212554931640625,
-0.003932952880859375,
0.018951416015625,
0.0323486328125,
0.01229095458984375,
-0.0041351318359375,
0.00299072265625,
-0.0153350830078125,
-0.042572021484375,
0.0689697265625,
-0.0867919921875,
-0.04962158203125,
-0.0352783203125,
-0.02386474609375,
-0.01091766357421875,
0.0298004150390625,
0.05926513671875,
0.0311431884765625,
0.002735137939453125,
0.00623321533203125,
0.03961181640625,
-0.02166748046875,
0.06951904296875,
0.032684326171875,
-0.0239410400390625,
-0.0285186767578125,
0.058929443359375,
0.020294189453125,
0.006450653076171875,
0.035430908203125,
0.018829345703125,
-0.036285400390625,
0.00179290771484375,
-0.0394287109375,
0.053375244140625,
-0.057586669921875,
-0.0270233154296875,
-0.069580078125,
-0.0306243896484375,
-0.033538818359375,
-0.0140838623046875,
-0.033843994140625,
-0.0191650390625,
-0.030670166015625,
-0.0098419189453125,
0.030120849609375,
0.032196044921875,
0.0125885009765625,
0.01189422607421875,
-0.0479736328125,
0.039703369140625,
0.0025424957275390625,
0.0330810546875,
0.012908935546875,
-0.040924072265625,
-0.0247955322265625,
-0.00867462158203125,
-0.01409149169921875,
-0.065673828125,
0.045867919921875,
-0.0092010498046875,
0.039306640625,
0.0284423828125,
0.0006246566772460938,
0.04083251953125,
-0.01448822021484375,
0.057159423828125,
0.040283203125,
-0.076171875,
0.041961669921875,
-0.02581787109375,
0.033782958984375,
0.03411865234375,
0.021881103515625,
-0.059539794921875,
-0.028533935546875,
-0.0767822265625,
-0.06622314453125,
0.06890869140625,
0.01055145263671875,
0.0260772705078125,
-0.0102996826171875,
0.01898193359375,
-0.006748199462890625,
0.01544189453125,
-0.0736083984375,
-0.05792236328125,
-0.0287933349609375,
-0.045501708984375,
-0.03057861328125,
-0.01251220703125,
0.002262115478515625,
-0.02716064453125,
0.06060791015625,
-0.0016183853149414062,
0.05255126953125,
0.045196533203125,
-0.01629638671875,
-0.029083251953125,
0.009307861328125,
0.021575927734375,
0.022430419921875,
-0.011199951171875,
-0.005199432373046875,
0.00975799560546875,
-0.046478271484375,
0.00814056396484375,
0.0194854736328125,
-0.00772857666015625,
0.01470947265625,
0.04595947265625,
0.0645751953125,
0.013519287109375,
-0.0290374755859375,
0.0577392578125,
-0.0151214599609375,
0.00634765625,
-0.025909423828125,
0.01763916015625,
0.005878448486328125,
0.01239013671875,
0.02972412109375,
-0.01477813720703125,
0.004573822021484375,
-0.037445068359375,
-0.0106048583984375,
0.0260009765625,
-0.0177001953125,
-0.013214111328125,
0.04644775390625,
-0.0038013458251953125,
-0.0069732666015625,
0.022308349609375,
-0.00763702392578125,
-0.046783447265625,
0.057281494140625,
0.045440673828125,
0.062347412109375,
-0.0241546630859375,
0.0166015625,
0.0609130859375,
0.017913818359375,
0.0078887939453125,
0.0255279541015625,
0.006786346435546875,
-0.05157470703125,
-0.01256561279296875,
-0.05389404296875,
0.00797271728515625,
0.01470947265625,
-0.05242919921875,
0.0172271728515625,
-0.037322998046875,
-0.0201263427734375,
-0.01102447509765625,
0.022796630859375,
-0.07000732421875,
0.04541015625,
0.0037384033203125,
0.0458984375,
-0.045623779296875,
0.044342041015625,
0.04150390625,
-0.0517578125,
-0.073486328125,
0.00409698486328125,
-0.01654052734375,
-0.06927490234375,
0.030242919921875,
0.03619384765625,
-0.005474090576171875,
0.0208892822265625,
-0.0244140625,
-0.08709716796875,
0.0804443359375,
0.00371551513671875,
-0.03118896484375,
0.00887298583984375,
0.0027008056640625,
0.0592041015625,
-0.01861572265625,
0.041046142578125,
0.041839599609375,
0.042633056640625,
-0.01788330078125,
-0.04522705078125,
0.0187530517578125,
-0.042144775390625,
0.0050811767578125,
0.0130767822265625,
-0.07806396484375,
0.057037353515625,
-0.01357269287109375,
-0.024749755859375,
-0.00026917457580566406,
0.06195068359375,
-0.0008406639099121094,
0.01495361328125,
0.02899169921875,
0.046234130859375,
0.033477783203125,
-0.01462554931640625,
0.0838623046875,
-0.0277099609375,
0.045501708984375,
0.05560302734375,
0.0106048583984375,
0.05560302734375,
0.034332275390625,
-0.035491943359375,
0.04833984375,
0.031494140625,
-0.0151214599609375,
0.031402587890625,
-0.01308441162109375,
-0.005748748779296875,
0.006168365478515625,
-0.0019378662109375,
-0.04376220703125,
0.037353515625,
0.023773193359375,
-0.033782958984375,
0.0153045654296875,
0.01187896728515625,
0.0233612060546875,
-0.000682830810546875,
-0.0153656005859375,
0.055999755859375,
0.01113128662109375,
-0.06982421875,
0.050079345703125,
0.005840301513671875,
0.06854248046875,
-0.047119140625,
0.033477783203125,
-0.01690673828125,
0.01432037353515625,
-0.015960693359375,
-0.06695556640625,
-0.001689910888671875,
-0.0088348388671875,
0.0002340078353881836,
0.003131866455078125,
0.03814697265625,
-0.037506103515625,
-0.046356201171875,
0.038330078125,
0.032562255859375,
-0.00731658935546875,
0.0152130126953125,
-0.06695556640625,
-0.001499176025390625,
0.0082855224609375,
-0.06982421875,
0.023712158203125,
0.04205322265625,
0.03387451171875,
0.03851318359375,
0.0294342041015625,
-0.004329681396484375,
-0.0019550323486328125,
0.020477294921875,
0.045867919921875,
-0.06085205078125,
-0.040130615234375,
-0.073974609375,
0.046875,
-0.01171875,
-0.0484619140625,
0.0633544921875,
0.0292510986328125,
0.06134033203125,
-0.017730712890625,
0.049530029296875,
-0.01397705078125,
0.038787841796875,
-0.04339599609375,
0.0550537109375,
-0.056915283203125,
-0.0090484619140625,
-0.03948974609375,
-0.07354736328125,
-0.0174102783203125,
0.054412841796875,
-0.020477294921875,
0.038848876953125,
0.07769775390625,
0.04736328125,
0.00634765625,
-0.003108978271484375,
0.0182342529296875,
0.031890869140625,
0.007488250732421875,
0.055084228515625,
0.042388916015625,
-0.05181884765625,
0.0194854736328125,
-0.052001953125,
-0.0008449554443359375,
-0.02178955078125,
-0.03631591796875,
-0.08270263671875,
-0.057464599609375,
-0.033477783203125,
-0.033477783203125,
-0.03802490234375,
0.078125,
0.02886962890625,
-0.04132080078125,
-0.00460052490234375,
-0.002048492431640625,
0.00026798248291015625,
0.0010900497436523438,
-0.028411865234375,
0.04705810546875,
-0.0418701171875,
-0.070556640625,
0.0088958740234375,
0.017791748046875,
0.0172271728515625,
-0.0118865966796875,
0.0031871795654296875,
-0.0124359130859375,
-0.00843048095703125,
0.023529052734375,
0.003154754638671875,
-0.056121826171875,
-0.01092529296875,
0.0029506683349609375,
-0.0308837890625,
0.01678466796875,
0.02734375,
-0.04339599609375,
0.042572021484375,
0.0207672119140625,
0.028961181640625,
0.049163818359375,
-0.017791748046875,
0.01424407958984375,
-0.045196533203125,
0.01392364501953125,
0.0023784637451171875,
0.052154541015625,
0.037506103515625,
-0.05377197265625,
0.04595947265625,
0.03192138671875,
-0.0289764404296875,
-0.038238525390625,
0.006072998046875,
-0.08441162109375,
-0.0219573974609375,
0.08831787109375,
-0.007419586181640625,
-0.0287322998046875,
0.018951416015625,
-0.0272369384765625,
0.05926513671875,
-0.020477294921875,
0.055755615234375,
0.046844482421875,
0.01117706298828125,
-0.0222625732421875,
-0.02752685546875,
0.0275726318359375,
0.052276611328125,
-0.057281494140625,
-0.006927490234375,
0.008880615234375,
0.02325439453125,
0.032196044921875,
0.037750244140625,
0.00014150142669677734,
0.019256591796875,
-0.00421142578125,
0.02410888671875,
-0.0010862350463867188,
0.0017757415771484375,
-0.028961181640625,
0.0011844635009765625,
-0.00400543212890625,
-0.01058197021484375
]
] |
studio-ousia/luke-japanese-base-lite | 2022-11-09T06:22:22.000Z | [
"transformers",
"pytorch",
"luke",
"fill-mask",
"named entity recognition",
"entity typing",
"relation classification",
"question answering",
"ja",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | studio-ousia | null | null | studio-ousia/luke-japanese-base-lite | 4 | 28,220 | transformers | 2022-10-25T09:27:16 | ---
language: ja
thumbnail: https://github.com/studio-ousia/luke/raw/master/resources/luke_logo.png
tags:
- luke
- named entity recognition
- entity typing
- relation classification
- question answering
license: apache-2.0
---
## luke-japanese
**luke-japanese** is the Japanese version of **LUKE** (**L**anguage
**U**nderstanding with **K**nowledge-based **E**mbeddings), a pre-trained
_knowledge-enhanced_ contextualized representation of words and entities. LUKE
treats words and entities in a given text as independent tokens, and outputs
contextualized representations of them. Please refer to our
[GitHub repository](https://github.com/studio-ousia/luke) for more details and
updates.
This model is a lightweight version which does not contain Wikipedia entity
embeddings. Please use the
[full version](https://huggingface.co/studio-ousia/luke-japanese-base/) for
tasks that use Wikipedia entities as inputs.
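As a quick sanity check, the lite model can be exercised through the standard masked-language-modeling interface. The following is a minimal sketch, assuming `sentencepiece` is installed and that the checkpoint loads via `AutoModelForMaskedLM`; the example sentence and variable names are illustrative.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
import torch

tokenizer = AutoTokenizer.from_pretrained("studio-ousia/luke-japanese-base-lite")
model = AutoModelForMaskedLM.from_pretrained("studio-ousia/luke-japanese-base-lite")

text = f"日本の首都は{tokenizer.mask_token}です。"  # "The capital of Japan is <mask>."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take its highest-scoring token.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(-1)
print(tokenizer.decode(predicted_id))
```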
### Experimental results on JGLUE
The experimental results evaluated on the dev set of
[JGLUE](https://github.com/yahoojapan/JGLUE) are shown as follows:
| Model | MARC-ja | JSTS | JNLI | JCommonsenseQA |
| ---------------------- | --------- | ------------------- | --------- | -------------- |
| | acc | Pearson/Spearman | acc | acc |
| **LUKE Japanese base** | **0.965** | **0.916**/**0.877** | **0.912** | **0.842** |
| _Baselines:_           |           |                     |           |                |
| Tohoku BERT base | 0.958 | 0.909/0.868 | 0.899 | 0.808 |
| NICT BERT base | 0.958 | 0.910/0.871 | 0.902 | 0.823 |
| Waseda RoBERTa base | 0.962 | 0.913/0.873 | 0.895 | 0.840 |
| XLM RoBERTa base | 0.961 | 0.877/0.831 | 0.893 | 0.687 |
The baseline scores are obtained from
[here](https://github.com/yahoojapan/JGLUE/blob/a6832af23895d6faec8ecf39ec925f1a91601d62/README.md).
### Citation
```latex
@inproceedings{yamada2020luke,
title={LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention},
author={Ikuya Yamada and Akari Asai and Hiroyuki Shindo and Hideaki Takeda and Yuji Matsumoto},
booktitle={EMNLP},
year={2020}
}
```
| 2,621 | [
[
-0.04449462890625,
-0.07891845703125,
0.0419921875,
-0.01151275634765625,
-0.01367950439453125,
0.0028533935546875,
-0.032379150390625,
-0.04144287109375,
0.055084228515625,
0.0234832763671875,
-0.03717041015625,
-0.059173583984375,
-0.04693603515625,
0.00725555419921875,
-0.0017547607421875,
0.0682373046875,
-0.0269622802734375,
0.0004954338073730469,
-0.0021495819091796875,
-0.009063720703125,
-0.03948974609375,
-0.007205963134765625,
-0.041656494140625,
-0.018798828125,
0.0211334228515625,
0.0264129638671875,
0.054412841796875,
0.03533935546875,
0.041748046875,
0.0242767333984375,
0.0116729736328125,
-0.01534271240234375,
-0.033416748046875,
0.00420379638671875,
0.001110076904296875,
-0.03033447265625,
-0.038787841796875,
-0.0062713623046875,
0.049560546875,
0.05548095703125,
0.005451202392578125,
0.0245819091796875,
-0.0081329345703125,
0.047637939453125,
-0.049407958984375,
0.01953125,
-0.01189422607421875,
0.005523681640625,
-0.0174102783203125,
-0.0100555419921875,
-0.0118865966796875,
0.001499176025390625,
-0.00563812255859375,
-0.07464599609375,
0.0131378173828125,
-0.01274871826171875,
0.088623046875,
0.01015472412109375,
-0.0166015625,
-0.00714874267578125,
-0.030517578125,
0.047149658203125,
-0.0469970703125,
0.0382080078125,
0.01392364501953125,
0.01343536376953125,
-0.00844573974609375,
-0.051361083984375,
-0.043121337890625,
-0.02423095703125,
-0.0200958251953125,
0.045928955078125,
0.00641632080078125,
-0.00917816162109375,
0.0277557373046875,
0.024322509765625,
-0.0526123046875,
0.001239776611328125,
-0.013702392578125,
0.0048065185546875,
0.05218505859375,
0.0024356842041015625,
0.053253173828125,
-0.040679931640625,
-0.04339599609375,
-0.020416259765625,
-0.0355224609375,
0.0251312255859375,
0.01361083984375,
0.0225982666015625,
-0.03741455078125,
0.044036865234375,
-0.0208892822265625,
0.03656005859375,
0.0093536376953125,
-0.0255279541015625,
0.037261962890625,
-0.0294036865234375,
-0.01459503173828125,
-0.0027713775634765625,
0.0833740234375,
0.04302978515625,
0.0165863037109375,
-0.004062652587890625,
-0.00023424625396728516,
-0.0006952285766601562,
0.00809478759765625,
-0.06109619140625,
0.0022029876708984375,
-0.0030517578125,
-0.038360595703125,
0.0048980712890625,
0.01983642578125,
-0.0634765625,
-0.0004718303680419922,
-0.0112762451171875,
0.032012939453125,
-0.034881591796875,
-0.04437255859375,
-0.005374908447265625,
-0.005069732666015625,
0.00603485107421875,
0.0177154541015625,
-0.05877685546875,
0.0174407958984375,
0.01953125,
0.04205322265625,
-0.0023479461669921875,
-0.036865234375,
0.016754150390625,
0.025970458984375,
-0.0271453857421875,
0.0291900634765625,
-0.032379150390625,
-0.0555419921875,
-0.01806640625,
0.00800323486328125,
-0.013427734375,
-0.00928497314453125,
0.0364990234375,
-0.0269622802734375,
0.0270233154296875,
-0.04351806640625,
-0.060821533203125,
-0.006877899169921875,
0.008880615234375,
-0.042877197265625,
0.065673828125,
0.0227508544921875,
-0.0501708984375,
0.0404052734375,
-0.05267333984375,
-0.0103607177734375,
0.01727294921875,
-0.032989501953125,
-0.024383544921875,
-0.0095367431640625,
0.0194854736328125,
0.0330810546875,
-0.0203094482421875,
0.0257110595703125,
-0.028411865234375,
-0.033782958984375,
0.0036334991455078125,
-0.017059326171875,
0.0687255859375,
0.01181793212890625,
-0.050811767578125,
-0.00928497314453125,
-0.055206298828125,
0.00470733642578125,
0.049163818359375,
-0.0130767822265625,
-0.02947998046875,
-0.0231170654296875,
0.002208709716796875,
0.00836181640625,
0.0298919677734375,
-0.037445068359375,
0.01197052001953125,
-0.0261383056640625,
-0.005191802978515625,
0.0285186767578125,
0.0189208984375,
0.045745849609375,
-0.006549835205078125,
0.039794921875,
-0.006763458251953125,
0.0091705322265625,
-0.02349853515625,
-0.0467529296875,
-0.06341552734375,
-0.0121612548828125,
0.00838470458984375,
0.03070068359375,
-0.061798095703125,
0.045562744140625,
-0.042266845703125,
-0.05267333984375,
-0.03411865234375,
0.0024433135986328125,
0.04461669921875,
0.05877685546875,
0.04437255859375,
-0.0221099853515625,
-0.0269927978515625,
-0.07293701171875,
-0.016632080078125,
-0.026580810546875,
0.01751708984375,
0.0130462646484375,
0.045318603515625,
-0.01163482666015625,
0.0457763671875,
-0.033355712890625,
-0.0139007568359375,
-0.0133209228515625,
-0.01172637939453125,
0.0171356201171875,
0.045501708984375,
0.052459716796875,
-0.052001953125,
-0.061920166015625,
0.0011129379272460938,
-0.083740234375,
0.005764007568359375,
-0.004566192626953125,
-0.017242431640625,
0.0163421630859375,
0.038726806640625,
-0.06939697265625,
0.0394287109375,
0.00884246826171875,
-0.054656982421875,
0.0240478515625,
0.0002090930938720703,
0.036773681640625,
-0.10089111328125,
0.0219573974609375,
0.006793975830078125,
-0.0068206787109375,
-0.042694091796875,
0.042938232421875,
0.0215606689453125,
-0.0020885467529296875,
-0.017974853515625,
0.0614013671875,
-0.0518798828125,
0.006801605224609375,
0.004497528076171875,
0.021209716796875,
0.014556884765625,
0.036834716796875,
0.0150909423828125,
0.0609130859375,
0.0369873046875,
-0.0309600830078125,
0.016998291015625,
0.01004791259765625,
-0.0269775390625,
0.043609619140625,
-0.03875732421875,
-0.01727294921875,
-0.00927734375,
0.0189666748046875,
-0.06494140625,
-0.0217132568359375,
0.0445556640625,
-0.047332763671875,
0.06201171875,
-0.003170013427734375,
-0.0300445556640625,
-0.031402587890625,
-0.032501220703125,
0.0150146484375,
0.04180908203125,
-0.0285797119140625,
0.028411865234375,
0.0228424072265625,
-0.00011217594146728516,
-0.06103515625,
-0.056060791015625,
-0.01061248779296875,
-0.0104217529296875,
-0.034820556640625,
0.0450439453125,
-0.012969970703125,
-0.007587432861328125,
0.0284881591796875,
0.0030879974365234375,
0.01125335693359375,
0.007663726806640625,
0.01480865478515625,
0.04302978515625,
-0.033172607421875,
-0.002590179443359375,
0.011260986328125,
-0.01291656494140625,
0.00853729248046875,
0.016265869140625,
0.0295257568359375,
-0.004665374755859375,
-0.0364990234375,
-0.034088134765625,
0.02789306640625,
0.037078857421875,
-0.004093170166015625,
0.0555419921875,
0.06353759765625,
-0.0230865478515625,
0.0157928466796875,
-0.040924072265625,
0.0009374618530273438,
-0.031524658203125,
0.03857421875,
-0.05035400390625,
-0.06597900390625,
0.0577392578125,
0.0222015380859375,
0.0009069442749023438,
0.07763671875,
0.036773681640625,
-0.007579803466796875,
0.07080078125,
0.0130157470703125,
-0.00897216796875,
0.033355712890625,
-0.042083740234375,
0.00024628639221191406,
-0.0670166015625,
-0.0241851806640625,
-0.02618408203125,
-0.0210113525390625,
-0.06829833984375,
-0.01593017578125,
0.0175628662109375,
0.0273590087890625,
-0.032806396484375,
0.025543212890625,
-0.0361328125,
0.040130615234375,
0.0273590087890625,
-0.0011777877807617188,
0.01105499267578125,
-0.0019626617431640625,
-0.01030731201171875,
-0.01277923583984375,
-0.037841796875,
-0.0281219482421875,
0.09130859375,
0.0225982666015625,
0.04229736328125,
0.0307159423828125,
0.076416015625,
-0.0025787353515625,
0.022491455078125,
-0.044403076171875,
0.042510986328125,
0.01480865478515625,
-0.06768798828125,
-0.034820556640625,
-0.042724609375,
-0.08935546875,
0.0294036865234375,
-0.0169219970703125,
-0.06695556640625,
0.0052337646484375,
-0.027069091796875,
-0.0157928466796875,
0.02239990234375,
-0.07220458984375,
0.06787109375,
0.00019884109497070312,
-0.029693603515625,
-0.00215911865234375,
-0.0400390625,
0.034027099609375,
0.0165252685546875,
0.025543212890625,
-0.012725830078125,
-0.013824462890625,
0.051177978515625,
-0.02886962890625,
0.05035400390625,
-0.0294952392578125,
0.0030193328857421875,
0.031768798828125,
0.003753662109375,
0.020904541015625,
0.046539306640625,
0.0191497802734375,
0.017181396484375,
0.00997161865234375,
-0.038909912109375,
-0.04010009765625,
0.08465576171875,
-0.08349609375,
-0.0640869140625,
-0.031402587890625,
-0.0175323486328125,
-0.006671905517578125,
0.039031982421875,
0.03619384765625,
0.0185699462890625,
-0.02239990234375,
0.0092620849609375,
0.035736083984375,
-0.01160430908203125,
0.03955078125,
0.052093505859375,
-0.040863037109375,
-0.0227508544921875,
0.06011962890625,
0.028778076171875,
0.0108184814453125,
0.03680419921875,
0.0010290145874023438,
-0.0256805419921875,
-0.019073486328125,
-0.0634765625,
0.029266357421875,
-0.0601806640625,
-0.01129913330078125,
-0.040802001953125,
-0.016998291015625,
-0.0362548828125,
-0.01763916015625,
-0.028717041015625,
-0.0269012451171875,
-0.031524658203125,
-0.01476287841796875,
0.0125732421875,
0.04498291015625,
0.0235595703125,
-0.01012420654296875,
-0.050445556640625,
0.03472900390625,
0.009857177734375,
0.034881591796875,
0.006748199462890625,
-0.036285400390625,
-0.026641845703125,
0.0013561248779296875,
-0.0094146728515625,
-0.07818603515625,
0.01541900634765625,
-0.00954437255859375,
0.061737060546875,
0.01160430908203125,
-0.0074005126953125,
0.054656982421875,
-0.024658203125,
0.07989501953125,
0.03594970703125,
-0.05694580078125,
0.047393798828125,
-0.033935546875,
0.041290283203125,
0.06781005859375,
0.0286712646484375,
-0.032501220703125,
-0.030548095703125,
-0.068359375,
-0.054534912109375,
0.05517578125,
0.0125579833984375,
0.022308349609375,
-0.0182647705078125,
0.02783203125,
-0.0005297660827636719,
-0.003570556640625,
-0.0631103515625,
-0.06561279296875,
-0.01085662841796875,
-0.023681640625,
-0.0181427001953125,
-0.01335906982421875,
0.005283355712890625,
-0.0311126708984375,
0.055511474609375,
0.00656890869140625,
0.025390625,
0.031494140625,
-0.020965576171875,
-0.0264892578125,
0.0115509033203125,
0.033660888671875,
0.040802001953125,
-0.0163116455078125,
-0.0157470703125,
0.01385498046875,
-0.0457763671875,
0.00959014892578125,
0.022186279296875,
-0.04193115234375,
0.022491455078125,
0.048736572265625,
0.06866455078125,
0.0307159423828125,
-0.0300445556640625,
0.0484619140625,
-0.0115966796875,
-0.0109100341796875,
-0.03729248046875,
0.026153564453125,
0.00783538818359375,
0.01270294189453125,
0.0163116455078125,
-0.0081329345703125,
0.01641845703125,
-0.033477783203125,
-0.018096923828125,
0.02508544921875,
-0.0252227783203125,
-0.0207061767578125,
0.045562744140625,
0.0110015869140625,
-0.0050048828125,
0.00323486328125,
-0.0242462158203125,
-0.01213836669921875,
0.05035400390625,
0.040863037109375,
0.051422119140625,
-0.02386474609375,
0.0084381103515625,
0.067138671875,
0.00986480712890625,
0.0108489990234375,
0.030517578125,
-0.002223968505859375,
-0.048797607421875,
-0.0152740478515625,
-0.04180908203125,
-0.00418853759765625,
0.020538330078125,
-0.05438232421875,
0.006069183349609375,
-0.037628173828125,
-0.02117919921875,
0.00159454345703125,
0.02349853515625,
-0.063720703125,
0.0171661376953125,
0.018890380859375,
0.056427001953125,
-0.03955078125,
0.0567626953125,
0.063232421875,
-0.06512451171875,
-0.07659912109375,
-0.0024871826171875,
-0.007289886474609375,
-0.05615234375,
0.0303955078125,
0.01554107666015625,
0.004444122314453125,
0.0133056640625,
-0.015960693359375,
-0.10198974609375,
0.1011962890625,
0.021026611328125,
-0.038909912109375,
0.01235198974609375,
0.004993438720703125,
0.040191650390625,
-0.0133819580078125,
0.03814697265625,
0.03631591796875,
0.045654296875,
-0.0248870849609375,
-0.06787109375,
0.01953125,
-0.052947998046875,
0.007663726806640625,
0.021728515625,
-0.0633544921875,
0.055419921875,
-0.005390167236328125,
-0.0202484130859375,
-0.0048828125,
0.04888916015625,
0.0251007080078125,
0.0206298828125,
0.0191802978515625,
0.06439208984375,
0.04693603515625,
-0.0173797607421875,
0.0946044921875,
-0.032958984375,
0.04473876953125,
0.07635498046875,
0.0027217864990234375,
0.047821044921875,
0.035888671875,
-0.044403076171875,
0.058441162109375,
0.033203125,
0.004253387451171875,
0.030731201171875,
-0.015838623046875,
-0.0038471221923828125,
0.00812530517578125,
-0.006107330322265625,
-0.06591796875,
0.028961181640625,
0.0194091796875,
-0.03436279296875,
0.0023193359375,
-0.007625579833984375,
0.02947998046875,
0.01541900634765625,
-0.0269012451171875,
0.051727294921875,
0.001316070556640625,
-0.06781005859375,
0.054351806640625,
0.0083160400390625,
0.04632568359375,
-0.049468994140625,
0.00933074951171875,
-0.0135955810546875,
-0.01291656494140625,
-0.0105133056640625,
-0.073486328125,
-0.01119232177734375,
0.005096435546875,
-0.0158233642578125,
-0.0035419464111328125,
0.059051513671875,
-0.0300445556640625,
-0.0479736328125,
0.040679931640625,
0.056671142578125,
-0.0058746337890625,
0.031768798828125,
-0.07183837890625,
-0.0003044605255126953,
-0.0019702911376953125,
-0.0665283203125,
0.0323486328125,
0.0221099853515625,
0.0179901123046875,
0.0194854736328125,
0.038787841796875,
-0.0018529891967773438,
0.00313568115234375,
0.037017822265625,
0.05267333984375,
-0.048797607421875,
-0.041229248046875,
-0.069580078125,
0.042449951171875,
-0.01499176025390625,
-0.038970947265625,
0.07135009765625,
0.039642333984375,
0.0771484375,
-0.01049041748046875,
0.049560546875,
-0.0130462646484375,
0.035888671875,
-0.048431396484375,
0.062042236328125,
-0.07122802734375,
-0.00665283203125,
-0.038055419921875,
-0.06866455078125,
-0.0132293701171875,
0.035369873046875,
-0.0201416015625,
0.0419921875,
0.05670166015625,
0.04010009765625,
0.004993438720703125,
-0.01183319091796875,
0.019317626953125,
0.006793975830078125,
0.01068115234375,
0.04986572265625,
0.046417236328125,
-0.04852294921875,
0.01020050048828125,
-0.04010009765625,
0.00521087646484375,
-0.0100250244140625,
-0.023345947265625,
-0.0703125,
-0.047698974609375,
-0.02484130859375,
-0.0203399658203125,
-0.0205841064453125,
0.06634521484375,
0.038238525390625,
-0.051177978515625,
-0.00698089599609375,
-0.00894927978515625,
-0.0021228790283203125,
0.0004925727844238281,
-0.0219879150390625,
0.03399658203125,
-0.0218963623046875,
-0.0655517578125,
0.0297088623046875,
0.01062774658203125,
0.011322021484375,
-0.0203094482421875,
-0.0265960693359375,
-0.017608642578125,
-0.0167388916015625,
0.01483154296875,
0.0132293701171875,
-0.06768798828125,
-0.019378662109375,
-0.0062103271484375,
-0.029388427734375,
0.006183624267578125,
0.020050048828125,
-0.04827880859375,
0.0489501953125,
0.029052734375,
0.030731201171875,
0.0491943359375,
-0.0150299072265625,
-0.00004976987838745117,
-0.052032470703125,
0.01629638671875,
-0.0008673667907714844,
0.05340576171875,
0.02642822265625,
-0.039886474609375,
0.041351318359375,
0.0298309326171875,
-0.01910400390625,
-0.0396728515625,
0.0028629302978515625,
-0.072998046875,
-0.037384033203125,
0.07977294921875,
-0.01453399658203125,
-0.0216217041015625,
0.0162506103515625,
-0.00421905517578125,
0.043121337890625,
-0.04010009765625,
0.0343017578125,
0.05841064453125,
0.01528167724609375,
-0.0143280029296875,
-0.04736328125,
0.0239410400390625,
0.028900146484375,
-0.060546875,
-0.027618408203125,
0.025146484375,
0.022216796875,
0.033233642578125,
0.0660400390625,
-0.001102447509765625,
0.0302734375,
0.00885009765625,
0.01983642578125,
-0.006580352783203125,
-0.0037288665771484375,
-0.01123046875,
0.009063720703125,
0.003887176513671875,
0.00377655029296875
]
] |
google/byt5-small | 2023-01-24T16:36:59.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"multilingual",
"af",
"am",
"ar",
"az",
"be",
"bg",
"bn",
"ca",
"ceb",
"co",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fil",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"haw",
"hi",
"hmn",
"ht",
"hu",
"hy",
"ig",
"is",
"it",
"iw",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lb",
"lo",
"lt",
"lv",
"mg",
"mi",
"mk",
"ml",
"mn",
"mr",
"ms",
"mt",
"my",
"ne",
"nl",
"no",
"ny",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sd",
"si",
"sk",
"sl",
"sm",
"sn",
"so",
"sq",
"sr",
"st",
"su",
"sv",
"sw",
"ta",
"te",
"tg",
"th",
"tr",
"uk",
"und",
"ur",
"uz",
"vi",
"xh",
"yi",
"yo",
"zh",
"zu",
"dataset:mc4",
"arxiv:1907.06292",
"arxiv:2105.13626",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/byt5-small | 27 | 28,130 | transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- af
- am
- ar
- az
- be
- bg
- bn
- ca
- ceb
- co
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fil
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- haw
- hi
- hmn
- ht
- hu
- hy
- ig
- is
- it
- iw
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lb
- lo
- lt
- lv
- mg
- mi
- mk
- ml
- mn
- mr
- ms
- mt
- my
- ne
- nl
- no
- ny
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- st
- su
- sv
- sw
- ta
- te
- tg
- th
- tr
- uk
- und
- ur
- uz
- vi
- xh
- yi
- yo
- zh
- zu
datasets:
- mc4
license: apache-2.0
---
# ByT5 - Small
ByT5 is a tokenizer-free version of [Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) and generally follows the architecture of [MT5](https://huggingface.co/google/mt5-small).
ByT5 was pre-trained only on [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual), with no supervised training, using an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is usable on a downstream task.
ByT5 works especially well on noisy text data, *e.g.*, `google/byt5-small` significantly outperforms [mt5-small](https://huggingface.co/google/mt5-small) on [TweetQA](https://arxiv.org/abs/1907.06292).
Paper: [ByT5: Towards a token-free future with pre-trained byte-to-byte models](https://arxiv.org/abs/2105.13626)
Authors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel*
## Example Inference
ByT5 works on raw UTF-8 bytes and can be used without a tokenizer:
```python
from transformers import T5ForConditionalGeneration
import torch
model = T5ForConditionalGeneration.from_pretrained('google/byt5-small')
input_ids = torch.tensor([list("Life is like a box of chocolates.".encode("utf-8"))]) + 3 # add 3 for special tokens
labels = torch.tensor([list("La vie est comme une boîte de chocolat.".encode("utf-8"))]) + 3 # add 3 for special tokens
loss = model(input_ids, labels=labels).loss # forward pass
```
For batched inference and training, however, it is recommended to use a tokenizer class for padding:
```python
from transformers import T5ForConditionalGeneration, AutoTokenizer
model = T5ForConditionalGeneration.from_pretrained('google/byt5-small')
tokenizer = AutoTokenizer.from_pretrained('google/byt5-small')
model_inputs = tokenizer(["Life is like a box of chocolates.", "Today is Monday."], padding="longest", return_tensors="pt")
labels = tokenizer(["La vie est comme une boîte de chocolat.", "Aujourd'hui c'est lundi."], padding="longest", return_tensors="pt").input_ids
loss = model(**model_inputs, labels=labels).loss # forward pass
```
## Abstract
Most widely-used pre-trained language models operate on sequences of tokens corresponding to word or subword units. Encoding text as a sequence of tokens requires a tokenizer, which is typically created as an independent artifact from the model. Token-free models that instead operate directly on raw text (bytes or characters) have many benefits: they can process text in any language out of the box, they are more robust to noise, and they minimize technical debt by removing complex and error-prone text preprocessing pipelines. Since byte or character sequences are longer than token sequences, past work on token-free models has often introduced new model architectures designed to amortize the cost of operating directly on raw text. In this paper, we show that a standard Transformer architecture can be used with minimal modifications to process byte sequences. We carefully characterize the trade-offs in terms of parameter count, training FLOPs, and inference speed, and show that byte-level models are competitive with their token-level counterparts. We also demonstrate that byte-level models are significantly more robust to noise and perform better on tasks that are sensitive to spelling and pronunciation. As part of our contribution, we release a new set of pre-trained byte-level Transformer models based on the T5 architecture, as well as all code and data used in our experiments.
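To make the byte-level interface concrete, the following standalone sketch mimics ByT5's encoding rule from the examples above (each UTF-8 byte id is offset by the 3 reserved special tokens: pad, eos, unk). The `byt5_encode` and `byt5_decode` helpers are hypothetical and not part of transformers.
```python
# ByT5 reserves ids 0-2 for special tokens (pad, eos, unk), so every UTF-8 byte
# b maps to id b + 3; no learned vocabulary or tokenizer file is needed.
def byt5_encode(text: str) -> list[int]:
    return [b + 3 for b in text.encode("utf-8")]

def byt5_decode(ids: list[int]) -> str:
    return bytes(i - 3 for i in ids if i >= 3).decode("utf-8", errors="ignore")

ids = byt5_encode("héllo")   # the multi-byte character 'é' becomes two ids
assert byt5_decode(ids) == "héllo"
print(ids)                   # byte sequences are longer than subword sequences
```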

| 4,231 | [
[
-0.01953125,
-0.0225067138671875,
0.021148681640625,
0.00807952880859375,
-0.0286712646484375,
-0.004489898681640625,
-0.0214385986328125,
-0.044647216796875,
0.0013217926025390625,
0.0204315185546875,
-0.04296875,
-0.0347900390625,
-0.056304931640625,
0.0192718505859375,
-0.03216552734375,
0.07781982421875,
-0.0014944076538085938,
0.0081329345703125,
0.0160675048828125,
-0.01340484619140625,
-0.0236358642578125,
-0.03961181640625,
-0.05255126953125,
-0.02935791015625,
0.041229248046875,
0.026611328125,
0.035552978515625,
0.05548095703125,
0.03863525390625,
0.0268096923828125,
-0.00598907470703125,
-0.00666046142578125,
-0.0281829833984375,
-0.01715087890625,
0.00010716915130615234,
-0.047027587890625,
-0.0361328125,
-0.0002663135528564453,
0.03216552734375,
0.046539306640625,
0.00940704345703125,
0.032806396484375,
-0.00811004638671875,
0.0222930908203125,
-0.042144775390625,
0.00556182861328125,
-0.050506591796875,
0.00862884521484375,
-0.01934814453125,
0.01334381103515625,
-0.050018310546875,
-0.00881195068359375,
0.0010137557983398438,
-0.039520263671875,
0.0462646484375,
0.01253509521484375,
0.07745361328125,
0.0186614990234375,
-0.0233612060546875,
-0.01898193359375,
-0.04461669921875,
0.07635498046875,
-0.052032470703125,
0.052490234375,
0.00963592529296875,
0.01416778564453125,
-0.0016078948974609375,
-0.094970703125,
-0.04595947265625,
0.006671905517578125,
-0.01503753662109375,
0.013885498046875,
-0.0186004638671875,
0.01514434814453125,
0.0230560302734375,
0.035247802734375,
-0.042449951171875,
0.00799560546875,
-0.05426025390625,
-0.0137481689453125,
0.025543212890625,
-0.00885772705078125,
0.0214080810546875,
-0.026275634765625,
-0.031463623046875,
-0.0098724365234375,
-0.04608154296875,
0.01340484619140625,
0.013031005859375,
0.0228729248046875,
-0.015869140625,
0.020538330078125,
-0.00435638427734375,
0.0238189697265625,
0.0167999267578125,
-0.0005550384521484375,
0.0224761962890625,
-0.026702880859375,
-0.027587890625,
0.01654052734375,
0.070556640625,
0.0055999755859375,
0.0213623046875,
-0.03997802734375,
-0.0278778076171875,
0.0108642578125,
0.0211334228515625,
-0.0968017578125,
-0.020538330078125,
0.0263671875,
-0.056427001953125,
-0.0308837890625,
-0.004512786865234375,
-0.039398193359375,
-0.00420379638671875,
0.0073699951171875,
0.04608154296875,
-0.041839599609375,
-0.00023829936981201172,
0.0177154541015625,
-0.016693115234375,
0.012451171875,
0.0017948150634765625,
-0.09808349609375,
0.01496124267578125,
0.0279998779296875,
0.05950927734375,
-0.00823974609375,
-0.0185394287109375,
-0.033935546875,
-0.003177642822265625,
-0.0260009765625,
0.046356201171875,
-0.0210113525390625,
-0.03961181640625,
-0.027557373046875,
0.0182342529296875,
-0.0019969940185546875,
-0.034637451171875,
0.06256103515625,
-0.03900146484375,
0.033538818359375,
-0.01291656494140625,
-0.039825439453125,
-0.00989532470703125,
0.0042266845703125,
-0.05126953125,
0.06689453125,
-0.0007543563842773438,
-0.06439208984375,
0.052215576171875,
-0.06439208984375,
-0.019989013671875,
-0.005084991455078125,
0.0127716064453125,
-0.049285888671875,
0.006702423095703125,
0.03021240234375,
0.03314208984375,
-0.0179595947265625,
0.00394439697265625,
-0.0222625732421875,
-0.044677734375,
0.00780487060546875,
-0.045135498046875,
0.06292724609375,
0.0256500244140625,
-0.0284271240234375,
0.021514892578125,
-0.072021484375,
0.002872467041015625,
0.002597808837890625,
-0.035308837890625,
0.0030231475830078125,
-0.01052093505859375,
0.00620269775390625,
0.018829345703125,
0.0122222900390625,
-0.043975830078125,
0.0227508544921875,
-0.035736083984375,
0.058013916015625,
0.0498046875,
-0.016632080078125,
0.042816162109375,
-0.024017333984375,
0.0239410400390625,
0.028411865234375,
0.01434326171875,
-0.01806640625,
-0.01788330078125,
-0.0767822265625,
-0.038543701171875,
0.05072021484375,
0.031707763671875,
-0.05767822265625,
0.0200347900390625,
-0.057342529296875,
-0.039886474609375,
-0.056427001953125,
-0.00740814208984375,
0.0194549560546875,
0.033111572265625,
0.052703857421875,
-0.0221099853515625,
-0.0599365234375,
-0.044219970703125,
-0.0195465087890625,
0.0029125213623046875,
-0.00927734375,
0.0015077590942382812,
0.043426513671875,
-0.0199432373046875,
0.0679931640625,
-0.01064300537109375,
-0.00982666015625,
-0.03155517578125,
0.020721435546875,
0.0222625732421875,
0.053924560546875,
0.038330078125,
-0.041961669921875,
-0.0230712890625,
-0.015960693359375,
-0.048187255859375,
0.006500244140625,
-0.00656890869140625,
0.0159454345703125,
0.0236663818359375,
0.0240478515625,
-0.0494384765625,
0.0229339599609375,
0.030242919921875,
-0.031097412109375,
0.031280517578125,
-0.0233154296875,
0.0019664764404296875,
-0.0958251953125,
0.01483917236328125,
-0.00908660888671875,
-0.035186767578125,
-0.050384521484375,
-0.01291656494140625,
0.0100860595703125,
-0.00927734375,
-0.053863525390625,
0.06512451171875,
-0.039886474609375,
0.0014190673828125,
-0.0006117820739746094,
-0.0142822265625,
-0.005558013916015625,
0.042083740234375,
0.0010995864868164062,
0.07904052734375,
0.027313232421875,
-0.046478271484375,
0.00490570068359375,
0.02020263671875,
-0.0299835205078125,
0.00604248046875,
-0.049652099609375,
0.01451873779296875,
-0.005847930908203125,
0.0189971923828125,
-0.06549072265625,
0.0002810955047607422,
0.018280029296875,
-0.055328369140625,
0.0244903564453125,
-0.014068603515625,
-0.039459228515625,
-0.0340576171875,
-0.031036376953125,
0.032623291015625,
0.055694580078125,
-0.05072021484375,
0.043975830078125,
-0.0108489990234375,
0.0241241455078125,
-0.059051513671875,
-0.07843017578125,
0.005420684814453125,
-0.0295867919921875,
-0.0537109375,
0.044586181640625,
0.00337982177734375,
0.023651123046875,
-0.0018568038940429688,
-0.0149383544921875,
-0.00555419921875,
0.00490570068359375,
0.0034389495849609375,
0.0202789306640625,
-0.025238037109375,
-0.0019969940185546875,
-0.00965118408203125,
-0.0212860107421875,
0.000048279762268066406,
-0.049774169921875,
0.038909912109375,
-0.011199951171875,
0.0114288330078125,
-0.0340576171875,
0.00598907470703125,
0.042022705078125,
-0.01384735107421875,
0.0670166015625,
0.07489013671875,
-0.020233154296875,
-0.016998291015625,
-0.0305328369140625,
-0.024566650390625,
-0.0428466796875,
0.02392578125,
-0.051177978515625,
-0.051177978515625,
0.05377197265625,
0.00997161865234375,
0.002185821533203125,
0.03912353515625,
0.035675048828125,
0.004405975341796875,
0.0682373046875,
0.050323486328125,
-0.002208709716796875,
0.04730224609375,
-0.038909912109375,
0.0250701904296875,
-0.046295166015625,
0.0015287399291992188,
-0.03411865234375,
-0.0253448486328125,
-0.07049560546875,
-0.0201873779296875,
0.007251739501953125,
-0.0036067962646484375,
-0.02667236328125,
0.0300140380859375,
-0.041778564453125,
0.0274200439453125,
0.049346923828125,
0.0057525634765625,
0.01215362548828125,
-0.0036907196044921875,
-0.01806640625,
-0.004611968994140625,
-0.06402587890625,
-0.044281005859375,
0.0853271484375,
0.0257415771484375,
0.05328369140625,
-0.00310516357421875,
0.052032470703125,
0.01160430908203125,
0.0207977294921875,
-0.056182861328125,
0.034698486328125,
-0.03656005859375,
-0.05181884765625,
-0.00986480712890625,
-0.02752685546875,
-0.087646484375,
0.01383209228515625,
-0.0224761962890625,
-0.0758056640625,
0.011077880859375,
0.01308441162109375,
-0.0235595703125,
0.040191650390625,
-0.07708740234375,
0.08807373046875,
0.0020885467529296875,
-0.02813720703125,
0.005321502685546875,
-0.05792236328125,
0.022216796875,
0.0079345703125,
0.001369476318359375,
0.0221099853515625,
0.00623321533203125,
0.061004638671875,
-0.03826904296875,
0.065673828125,
-0.00853729248046875,
0.01538848876953125,
0.0157318115234375,
-0.0169219970703125,
0.0401611328125,
-0.0224761962890625,
-0.00408935546875,
0.017425537109375,
0.0189361572265625,
-0.038604736328125,
-0.03985595703125,
0.0307769775390625,
-0.0723876953125,
-0.032623291015625,
-0.01922607421875,
-0.02435302734375,
0.00302886962890625,
0.037200927734375,
0.052978515625,
0.0304718017578125,
0.002857208251953125,
0.034759521484375,
0.0513916015625,
-0.01457977294921875,
0.06365966796875,
-0.001377105712890625,
-0.003936767578125,
-0.0305023193359375,
0.07830810546875,
0.02435302734375,
0.01275634765625,
0.0377197265625,
0.0182647705078125,
-0.039520263671875,
-0.039581298828125,
-0.040283203125,
0.01186370849609375,
-0.064697265625,
-0.0066986083984375,
-0.057708740234375,
-0.019805908203125,
-0.042327880859375,
-0.012725830078125,
-0.03240966796875,
-0.0309295654296875,
-0.0333251953125,
-0.0099945068359375,
0.009246826171875,
0.0280609130859375,
0.014739990234375,
0.03948974609375,
-0.0673828125,
0.01953125,
0.00235748291015625,
0.0279541015625,
0.004444122314453125,
-0.04638671875,
-0.0208282470703125,
-0.0085296630859375,
-0.03173828125,
-0.048797607421875,
0.029449462890625,
-0.002445220947265625,
0.0171966552734375,
0.027496337890625,
0.00731658935546875,
0.051910400390625,
-0.036895751953125,
0.07073974609375,
0.00945281982421875,
-0.0906982421875,
-0.0009288787841796875,
-0.01209259033203125,
0.0240020751953125,
0.0224761962890625,
0.0200347900390625,
-0.052032470703125,
-0.0143585205078125,
-0.07257080078125,
-0.058685302734375,
0.0589599609375,
0.02020263671875,
0.0090484619140625,
0.00901031494140625,
0.01233673095703125,
0.016448974609375,
0.01548004150390625,
-0.078125,
-0.01953125,
-0.03857421875,
-0.03533935546875,
0.005954742431640625,
-0.0146484375,
0.034027099609375,
-0.0160064697265625,
0.05474853515625,
-0.0010814666748046875,
0.0528564453125,
0.01264190673828125,
-0.02447509765625,
0.0211029052734375,
0.024169921875,
0.0478515625,
0.023040771484375,
-0.0162811279296875,
0.020263671875,
0.041046142578125,
-0.048919677734375,
0.0078125,
0.0019931793212890625,
-0.018768310546875,
0.00839996337890625,
0.0257720947265625,
0.0859375,
0.006198883056640625,
-0.0224761962890625,
0.03399658203125,
-0.016845703125,
-0.01641845703125,
-0.01230621337890625,
0.004444122314453125,
0.004009246826171875,
0.0113372802734375,
0.015869140625,
0.00902557373046875,
0.00847625732421875,
-0.031005859375,
0.01043701171875,
0.0180206298828125,
-0.026275634765625,
-0.041656494140625,
0.0789794921875,
0.0137176513671875,
-0.0187225341796875,
0.049346923828125,
-0.0169830322265625,
-0.035736083984375,
0.037109375,
0.053436279296875,
0.06878662109375,
0.004688262939453125,
-0.00960540771484375,
0.0275115966796875,
0.0221405029296875,
-0.0067596435546875,
0.01177978515625,
-0.0149688720703125,
-0.05859375,
-0.0291900634765625,
-0.043731689453125,
-0.0127716064453125,
0.0164642333984375,
-0.0184173583984375,
0.049346923828125,
-0.03155517578125,
-0.0108795166015625,
0.0029468536376953125,
0.0174713134765625,
-0.060516357421875,
0.055023193359375,
0.00984954833984375,
0.07196044921875,
-0.042083740234375,
0.075439453125,
0.039337158203125,
-0.045867919921875,
-0.0780029296875,
-0.006671905517578125,
-0.040130615234375,
-0.054351806640625,
0.044708251953125,
0.03460693359375,
-0.00312042236328125,
0.018096923828125,
-0.0382080078125,
-0.061737060546875,
0.095947265625,
0.03399658203125,
-0.00864410400390625,
-0.032958984375,
0.0177459716796875,
0.035400390625,
-0.00649261474609375,
0.041748046875,
0.0306854248046875,
0.0273895263671875,
0.0194091796875,
-0.05682373046875,
0.0121917724609375,
-0.01206207275390625,
0.0013408660888671875,
0.01263427734375,
-0.060821533203125,
0.060699462890625,
-0.01244354248046875,
-0.0207672119140625,
-0.0012121200561523438,
0.0703125,
0.0035686492919921875,
-0.00144195556640625,
0.0278778076171875,
0.037384033203125,
0.0643310546875,
-0.012542724609375,
0.08038330078125,
-0.03204345703125,
0.04541015625,
0.055694580078125,
0.031829833984375,
0.04345703125,
0.038238525390625,
-0.0020618438720703125,
0.026763916015625,
0.0633544921875,
-0.01271820068359375,
0.0411376953125,
0.00609588623046875,
-0.0158538818359375,
-0.00720977783203125,
0.009674072265625,
-0.01568603515625,
0.03021240234375,
0.007259368896484375,
-0.032440185546875,
-0.01397705078125,
-0.004123687744140625,
0.0189971923828125,
-0.0430908203125,
-0.020599365234375,
0.037200927734375,
0.0095672607421875,
-0.044647216796875,
0.058746337890625,
0.0248565673828125,
0.07891845703125,
-0.035552978515625,
0.032318115234375,
-0.021575927734375,
0.0290374755859375,
-0.021484375,
-0.0305023193359375,
0.03173828125,
0.01139068603515625,
-0.0181732177734375,
-0.031463623046875,
0.0587158203125,
-0.0257720947265625,
-0.042205810546875,
0.0009760856628417969,
0.0221710205078125,
0.0215911865234375,
0.01447296142578125,
-0.040557861328125,
-0.008514404296875,
-0.0081787109375,
-0.0374755859375,
0.01041412353515625,
0.0404052734375,
-0.0193023681640625,
0.056732177734375,
0.04180908203125,
0.0070343017578125,
0.0290069580078125,
-0.00913238525390625,
0.0440673828125,
-0.04931640625,
-0.0433349609375,
-0.072021484375,
0.049957275390625,
0.0159912109375,
-0.02569580078125,
0.0325927734375,
0.047393798828125,
0.07330322265625,
-0.0197601318359375,
0.053924560546875,
-0.0016565322875976562,
0.010162353515625,
-0.04248046875,
0.06671142578125,
-0.050048828125,
-0.001132965087890625,
-0.003612518310546875,
-0.048187255859375,
-0.0277099609375,
0.03680419921875,
-0.0304718017578125,
0.0210113525390625,
0.0616455078125,
0.051422119140625,
-0.032623291015625,
-0.01250457763671875,
0.03399658203125,
0.01273345947265625,
0.039276123046875,
0.038909912109375,
0.0282135009765625,
-0.0599365234375,
0.0634765625,
-0.0079498291015625,
0.0186920166015625,
-0.0010747909545898438,
-0.0665283203125,
-0.07745361328125,
-0.03253173828125,
-0.004993438720703125,
-0.0270538330078125,
0.0011997222900390625,
0.08331298828125,
0.06524658203125,
-0.0550537109375,
-0.003368377685546875,
-0.0173797607421875,
0.00368499755859375,
-0.0014505386352539062,
-0.01297760009765625,
0.033233642578125,
-0.044952392578125,
-0.0731201171875,
0.004955291748046875,
-0.00348663330078125,
0.0221405029296875,
0.0021457672119140625,
0.0085906982421875,
0.0060577392578125,
0.0171661376953125,
0.033843994140625,
0.0052642822265625,
-0.041534423828125,
-0.03961181640625,
0.0149688720703125,
-0.0173492431640625,
0.035675048828125,
0.044586181640625,
-0.06402587890625,
0.01934814453125,
0.034423828125,
0.058319091796875,
0.061492919921875,
-0.0110321044921875,
0.02972412109375,
-0.06219482421875,
0.021514892578125,
-0.0010423660278320312,
0.035491943359375,
0.031280517578125,
-0.0236358642578125,
0.0304718017578125,
0.03485107421875,
-0.036163330078125,
-0.053619384765625,
-0.002162933349609375,
-0.0706787109375,
-0.01042938232421875,
0.07061767578125,
-0.012451171875,
-0.039276123046875,
0.0184478759765625,
-0.004802703857421875,
0.047271728515625,
-0.030975341796875,
0.0677490234375,
0.0672607421875,
0.013336181640625,
-0.020599365234375,
-0.0274810791015625,
0.043182373046875,
0.034149169921875,
-0.051055908203125,
-0.01450347900390625,
-0.00649261474609375,
0.040069580078125,
0.004192352294921875,
0.045654296875,
0.001323699951171875,
0.00969696044921875,
0.00951385498046875,
0.02252197265625,
-0.0056304931640625,
-0.0139617919921875,
-0.0174560546875,
0.0123748779296875,
-0.004680633544921875,
-0.048919677734375
]
] |
cmarkea/distilcamembert-base-ner | 2023-08-01T10:05:12.000Z | [
"transformers",
"pytorch",
"tf",
"onnx",
"safetensors",
"camembert",
"token-classification",
"fr",
"dataset:Jean-Baptiste/wikiner_fr",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | cmarkea | null | null | cmarkea/distilcamembert-base-ner | 16 | 28,091 | transformers | 2022-03-02T23:29:05 | ---
language: fr
license: mit
datasets:
- Jean-Baptiste/wikiner_fr
widget:
- text: "Boulanger, habitant à Boulanger et travaillant dans le magasin Boulanger situé dans la ville de Boulanger. Boulanger a écrit le livre éponyme Boulanger édité par la maison d'édition Boulanger."
- text: "Quentin Jerome Tarantino naît le 27 mars 1963 à Knoxville, dans le Tennessee. Il est le fils de Connie McHugh, une infirmière, née le 3 septembre 1946, et de Tony Tarantino, acteur et musicien amateur né à New York. Ce dernier est d'origine italienne par son père ; sa mère a des ascendances irlandaises et cherokees. Il est prénommé d'après Quint Asper, le personnage joué par Burt Reynolds dans la série Gunsmoke et Quentin Compson, personnage du roman Le Bruit et la Fureur. Son père quitte le domicile familial avant même sa naissance. En 1965, sa mère déménage à Torrance, dans la banlieue sud de Los Angeles, et se remarie avec Curtis Zastoupil, un pianiste de bar, qui lui fait découvrir le cinéma. Le couple divorce alors que le jeune Quentin a une dizaine d'années."
---
DistilCamemBERT-NER
===================
We present DistilCamemBERT-NER, a version of [DistilCamemBERT](https://huggingface.co/cmarkea/distilcamembert-base) fine-tuned for the NER (Named Entity Recognition) task in French. The work is inspired by [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner), which is based on the [CamemBERT](https://huggingface.co/camembert-base) model. The problem with CamemBERT-based models is scaling them, for example in the production phase, where inference cost can become a technological issue. To counteract this effect, we propose this model, which **divides the inference time by two** at the same power consumption thanks to [DistilCamemBERT](https://huggingface.co/cmarkea/distilcamembert-base).
Dataset
-------
The dataset used is [wikiner_fr](https://huggingface.co/datasets/Jean-Baptiste/wikiner_fr), which contains ~170k sentences labeled with 5 categories:
* PER: person;
* LOC: location;
* ORG: organization;
* MISC: miscellaneous entities (movie titles, books, etc.);
* O: background (outside any entity).
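The exact label set shipped with the checkpoint can be inspected directly from its configuration; a quick check (the printed mapping follows the categories above):
```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("cmarkea/distilcamembert-base-ner")
print(config.id2label)  # maps class ids to the PER/LOC/ORG/MISC/O tags listed above
```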
Evaluation results
------------------
| **class** | **precision (%)** | **recall (%)** | **f1 (%)** | **support (#sub-word)** |
| :------------: | :---------------: | :------------: | :--------: | :---------------------: |
| **global** | 98.17 | 98.19 | 98.18 | 378,776 |
| **PER** | 96.78 | 96.87 | 96.82 | 23,754 |
| **LOC** | 94.05 | 93.59 | 93.82 | 27,196 |
| **ORG** | 86.05 | 85.92 | 85.98 | 6,526 |
| **MISC** | 88.78 | 84.69 | 86.69 | 11,891 |
| **O** | 99.26 | 99.47 | 99.37 | 309,409 |
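As a sanity check, each f1 value in the table is the harmonic mean of the corresponding precision and recall; for instance, for the global row:
```python
p, r = 98.17, 98.19
f1 = 2 * p * r / (p + r)  # ≈ 98.18, matching the table
```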
Benchmark
---------
This model's performance is compared to 2 reference models (see below) using the f1 score metric. The mean inference time was measured on an AMD Ryzen 5 4500U @ 2.3 GHz with 6 cores:
| **model** | **time (ms)** | **PER (%)** | **LOC (%)** | **ORG (%)** | **MISC (%)** | **O (%)** |
| :---------------------------------------------------------------------------------------------------------------: | :-----------: | :---------: | :---------: | :---------: | :-----------: | :-------: |
| [cmarkea/distilcamembert-base-ner](https://huggingface.co/cmarkea/distilcamembert-base-ner) | **43.44** | **96.82** | **93.82** | **85.98** | **86.69** | **99.37** |
| [Davlan/bert-base-multilingual-cased-ner-hrl](https://huggingface.co/Davlan/bert-base-multilingual-cased-ner-hrl) | 87.56 | 79.93 | 72.89 | 61.34 | n/a | 96.04 |
| [flair/ner-french](https://huggingface.co/flair/ner-french) | 314.96 | 82.91 | 76.17 | 70.96 | 76.29 | 97.65 |
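The exact timing protocol is not detailed here; the sketch below shows one plausible way such a mean latency could be measured on CPU (the warm-up step, test sentence, and iteration count are assumptions, not the protocol actually used):
```python
import time

from transformers import pipeline

ner = pipeline(
    task="ner",
    model="cmarkea/distilcamembert-base-ner",
    aggregation_strategy="simple",
)

sentence = "Louis Lichou a présidé le Crédit Mutuel Arkéa."
ner(sentence)  # warm-up so one-time loading costs are excluded

n = 100
start = time.perf_counter()
for _ in range(n):
    ner(sentence)
print(f"mean inference time: {(time.perf_counter() - start) / n * 1000:.2f} ms")
```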
How to use DistilCamemBERT-NER
------------------------------
```python
from transformers import pipeline
ner = pipeline(
task='ner',
model="cmarkea/distilcamembert-base-ner",
tokenizer="cmarkea/distilcamembert-base-ner",
aggregation_strategy="simple"
)
result = ner(
"Le Crédit Mutuel Arkéa est une banque Française, elle comprend le CMB "
"qui est une banque située en Bretagne et le CMSO qui est une banque "
"qui se situe principalement en Aquitaine. C'est sous la présidence de "
"Louis Lichou, dans les années 1980 que différentes filiales sont créées "
"au sein du CMB et forment les principales filiales du groupe qui "
"existent encore aujourd'hui (Federal Finance, Suravenir, Financo, etc.)."
)
result
[{'entity_group': 'ORG',
'score': 0.9974479,
'word': 'Crédit Mutuel Arkéa',
'start': 3,
'end': 22},
{'entity_group': 'LOC',
'score': 0.9000358,
'word': 'Française',
'start': 38,
'end': 47},
{'entity_group': 'ORG',
'score': 0.9788757,
'word': 'CMB',
'start': 66,
'end': 69},
{'entity_group': 'LOC',
'score': 0.99919766,
'word': 'Bretagne',
'start': 99,
'end': 107},
{'entity_group': 'ORG',
'score': 0.9594884,
'word': 'CMSO',
'start': 114,
'end': 118},
{'entity_group': 'LOC',
'score': 0.99935514,
'word': 'Aquitaine',
'start': 169,
'end': 178},
{'entity_group': 'PER',
'score': 0.99911094,
'word': 'Louis Lichou',
'start': 208,
'end': 220},
{'entity_group': 'ORG',
'score': 0.96226394,
'word': 'CMB',
'start': 291,
'end': 294},
{'entity_group': 'ORG',
'score': 0.9983959,
'word': 'Federal Finance',
'start': 374,
'end': 389},
{'entity_group': 'ORG',
'score': 0.9984454,
'word': 'Suravenir',
'start': 391,
'end': 400},
{'entity_group': 'ORG',
'score': 0.9985084,
'word': 'Financo',
'start': 402,
'end': 409}]
```
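Note that `aggregation_strategy="simple"` merges sub-word predictions into whole-entity spans, which is why each `word` field above contains a full entity mention rather than individual tokens.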
### Optimum + ONNX
```python
from optimum.onnxruntime import ORTModelForTokenClassification
from transformers import AutoTokenizer, pipeline
HUB_MODEL = "cmarkea/distilcamembert-base-ner"
tokenizer = AutoTokenizer.from_pretrained(HUB_MODEL)
model = ORTModelForTokenClassification.from_pretrained(HUB_MODEL)
onnx_ner = pipeline("token-classification", model=model, tokenizer=tokenizer)
# Quantized onnx model
quantized_model = ORTModelForTokenClassification.from_pretrained(
HUB_MODEL, file_name="model_quantized.onnx"
)
```
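The quantized model drops into the same pipeline API; a minimal sketch (scores may differ slightly from the full-precision model):
```python
quantized_ner = pipeline(
    "token-classification",
    model=quantized_model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
quantized_ner("Le Crédit Mutuel Arkéa est une banque française.")
```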
Citation
--------
```bibtex
@inproceedings{delestre:hal-03674695,
TITLE = {{DistilCamemBERT : une distillation du mod{\`e}le fran{\c c}ais CamemBERT}},
AUTHOR = {Delestre, Cyrile and Amar, Abibatou},
URL = {https://hal.archives-ouvertes.fr/hal-03674695},
BOOKTITLE = {{CAp (Conf{\'e}rence sur l'Apprentissage automatique)}},
ADDRESS = {Vannes, France},
YEAR = {2022},
MONTH = Jul,
KEYWORDS = {NLP ; Transformers ; CamemBERT ; Distillation},
PDF = {https://hal.archives-ouvertes.fr/hal-03674695/file/cap2022.pdf},
HAL_ID = {hal-03674695},
HAL_VERSION = {v1},
}
``` | 7,109 | [
[
-0.04058837890625,
-0.047454833984375,
0.0260009765625,
0.0189361572265625,
-0.0222930908203125,
0.0005650520324707031,
-0.0201873779296875,
-0.01276397705078125,
0.027313232421875,
0.0233306884765625,
-0.040863037109375,
-0.051666259765625,
-0.05780029296875,
0.0180816650390625,
-0.00769805908203125,
0.06787109375,
0.002285003662109375,
-0.0008945465087890625,
0.0156707763671875,
-0.022613525390625,
-0.00946044921875,
-0.0440673828125,
-0.052093505859375,
-0.01605224609375,
0.0255584716796875,
0.0033283233642578125,
0.0391845703125,
0.043426513671875,
0.0330810546875,
0.0280609130859375,
-0.0213470458984375,
0.0142974853515625,
-0.00952911376953125,
-0.0063629150390625,
-0.0013170242309570312,
-0.0355224609375,
-0.053985595703125,
0.00205230712890625,
0.04559326171875,
0.05084228515625,
-0.013763427734375,
0.020538330078125,
0.01312255859375,
0.0577392578125,
-0.030487060546875,
0.028656005859375,
-0.0460205078125,
-0.0010986328125,
-0.014312744140625,
-0.01300811767578125,
-0.01181793212890625,
-0.016571044921875,
-0.0007677078247070312,
-0.04864501953125,
0.0265655517578125,
0.015960693359375,
0.09796142578125,
0.03460693359375,
-0.0175628662109375,
-0.0186767578125,
-0.03863525390625,
0.08172607421875,
-0.08209228515625,
0.033660888671875,
0.02203369140625,
0.004825592041015625,
-0.005306243896484375,
-0.061553955078125,
-0.060302734375,
0.005519866943359375,
-0.03680419921875,
0.0277557373046875,
-0.0281982421875,
-0.0093841552734375,
0.0235443115234375,
0.0340576171875,
-0.044647216796875,
0.001537322998046875,
-0.019805908203125,
-0.007404327392578125,
0.053863525390625,
0.01009368896484375,
0.0029010772705078125,
-0.0303955078125,
-0.0302886962890625,
-0.0135498046875,
-0.018707275390625,
0.00783538818359375,
0.018280029296875,
0.028778076171875,
-0.0195465087890625,
0.047332763671875,
-0.0245513916015625,
0.051055908203125,
0.021942138671875,
-0.0135498046875,
0.050018310546875,
-0.035491943359375,
-0.0210418701171875,
0.0080718994140625,
0.0899658203125,
0.035369873046875,
0.00577545166015625,
0.0034923553466796875,
-0.019439697265625,
0.006988525390625,
-0.01311492919921875,
-0.06597900390625,
-0.0236358642578125,
0.027435302734375,
-0.0355224609375,
-0.01172637939453125,
0.01335906982421875,
-0.0743408203125,
0.010650634765625,
-0.01910400390625,
0.019927978515625,
-0.035186767578125,
-0.0175018310546875,
0.0011186599731445312,
-0.0192413330078125,
0.0106048583984375,
-0.0048675537109375,
-0.05865478515625,
0.0211334228515625,
0.032745361328125,
0.06317138671875,
0.001995086669921875,
-0.0113067626953125,
-0.0294647216796875,
-0.011260986328125,
-0.00727081298828125,
0.047882080078125,
-0.0193634033203125,
-0.0249481201171875,
-0.0265045166015625,
0.0151824951171875,
-0.0196533203125,
-0.03802490234375,
0.047332763671875,
-0.0261077880859375,
0.0230255126953125,
-0.01294708251953125,
-0.0419921875,
-0.0169830322265625,
0.02239990234375,
-0.05487060546875,
0.08734130859375,
0.0223541259765625,
-0.0733642578125,
0.039459228515625,
-0.049774169921875,
-0.017547607421875,
-0.004314422607421875,
-0.01361083984375,
-0.041290283203125,
-0.00650787353515625,
0.03240966796875,
0.03826904296875,
-0.042938232421875,
0.00847625732421875,
-0.00539398193359375,
0.002902984619140625,
0.0079803466796875,
-0.029144287109375,
0.084716796875,
0.0255279541015625,
-0.04107666015625,
-0.005748748779296875,
-0.078369140625,
0.015380859375,
0.0209197998046875,
-0.03497314453125,
-0.0196533203125,
-0.0055694580078125,
-0.004077911376953125,
0.01152801513671875,
0.0131683349609375,
-0.0294952392578125,
0.003032684326171875,
-0.03778076171875,
0.0447998046875,
0.045745849609375,
0.012451171875,
0.0271759033203125,
-0.026824951171875,
0.030426025390625,
0.01317596435546875,
0.01218414306640625,
0.006252288818359375,
-0.034912109375,
-0.06146240234375,
-0.047454833984375,
0.04644775390625,
0.0562744140625,
-0.045379638671875,
0.06134033203125,
-0.039520263671875,
-0.046112060546875,
-0.040740966796875,
0.00308990478515625,
0.02203369140625,
0.046722412109375,
0.0435791015625,
-0.0107269287109375,
-0.04150390625,
-0.07196044921875,
-0.00885009765625,
-0.01534271240234375,
0.003696441650390625,
0.0186309814453125,
0.046600341796875,
-0.0105438232421875,
0.06756591796875,
-0.056671142578125,
-0.0215911865234375,
-0.01355743408203125,
0.00809478759765625,
0.06524658203125,
0.03631591796875,
0.052032470703125,
-0.06658935546875,
-0.055389404296875,
-0.003841400146484375,
-0.0589599609375,
-0.001922607421875,
0.0062713623046875,
-0.0069427490234375,
0.021697998046875,
0.02777099609375,
-0.051483154296875,
0.02874755859375,
0.036407470703125,
-0.0310211181640625,
0.046417236328125,
-0.01715087890625,
0.0176239013671875,
-0.097900390625,
0.005847930908203125,
-0.0009241104125976562,
-0.003192901611328125,
-0.036834716796875,
-0.00926971435546875,
-0.0027027130126953125,
0.0235137939453125,
-0.03704833984375,
0.045501708984375,
-0.052459716796875,
0.0279541015625,
0.01708984375,
0.0141754150390625,
0.009246826171875,
0.055206298828125,
-0.003620147705078125,
0.040557861328125,
0.042877197265625,
-0.05084228515625,
0.032623291015625,
0.01538848876953125,
-0.034271240234375,
0.0478515625,
-0.044769287109375,
-0.01232147216796875,
-0.0072784423828125,
0.0163726806640625,
-0.0638427734375,
-0.0002294778823852539,
0.0243072509765625,
-0.03759765625,
0.033447265625,
-0.0170135498046875,
-0.03509521484375,
-0.03570556640625,
-0.0170745849609375,
0.017852783203125,
0.0303192138671875,
-0.031707763671875,
0.056396484375,
0.01873779296875,
-0.00585174560546875,
-0.05316162109375,
-0.06427001953125,
-0.022735595703125,
-0.0333251953125,
-0.045257568359375,
0.042449951171875,
-0.00467681884765625,
-0.022735595703125,
0.0172576904296875,
-0.0089111328125,
-0.0080718994140625,
0.0019550323486328125,
0.0048065185546875,
0.0291748046875,
-0.0155487060546875,
-0.00449371337890625,
-0.0009140968322753906,
-0.009857177734375,
-0.011077880859375,
-0.0238494873046875,
0.058990478515625,
-0.0140380859375,
-0.007137298583984375,
-0.0343017578125,
0.0193634033203125,
0.0406494140625,
-0.0276641845703125,
0.07080078125,
0.047607421875,
-0.026153564453125,
-0.0030612945556640625,
-0.039337158203125,
-0.0146026611328125,
-0.032257080078125,
0.0227508544921875,
-0.036712646484375,
-0.05419921875,
0.049530029296875,
0.0095367431640625,
0.01381683349609375,
0.0792236328125,
0.04095458984375,
-0.005107879638671875,
0.06365966796875,
0.00955963134765625,
-0.01253509521484375,
0.02252197265625,
-0.057281494140625,
0.0205535888671875,
-0.058380126953125,
-0.0220184326171875,
-0.04315185546875,
-0.0288543701171875,
-0.061859130859375,
-0.03509521484375,
0.01959228515625,
0.02215576171875,
-0.0157012939453125,
0.04083251953125,
-0.055816650390625,
0.00476837158203125,
0.041412353515625,
0.0039043426513671875,
0.020965576171875,
-0.0019817352294921875,
-0.03155517578125,
-0.006374359130859375,
-0.05389404296875,
-0.034515380859375,
0.07440185546875,
0.007579803466796875,
0.04010009765625,
0.0156097412109375,
0.0594482421875,
0.0149078369140625,
0.0159149169921875,
-0.05230712890625,
0.03912353515625,
-0.00856781005859375,
-0.066650390625,
-0.023956298828125,
-0.046295166015625,
-0.07354736328125,
0.033416748046875,
-0.016143798828125,
-0.06097412109375,
0.0290985107421875,
0.015960693359375,
-0.030609130859375,
0.034149169921875,
-0.05059814453125,
0.060302734375,
-0.0233001708984375,
-0.0199127197265625,
0.00597381591796875,
-0.044036865234375,
0.01165008544921875,
-0.005405426025390625,
0.03192138671875,
-0.0223236083984375,
0.0029354095458984375,
0.05340576171875,
-0.05645751953125,
0.0396728515625,
-0.01031494140625,
0.0070953369140625,
0.04327392578125,
-0.01003265380859375,
0.0509033203125,
0.006561279296875,
-0.014862060546875,
0.025299072265625,
0.006191253662109375,
-0.0287933349609375,
-0.0396728515625,
0.054443359375,
-0.03936767578125,
-0.03961181640625,
-0.052093505859375,
-0.0147247314453125,
0.006443023681640625,
0.01971435546875,
0.050323486328125,
0.034088134765625,
-0.0012683868408203125,
0.020904541015625,
0.041534423828125,
-0.01336669921875,
0.049530029296875,
0.0234832763671875,
-0.007476806640625,
-0.037994384765625,
0.05889892578125,
0.0209808349609375,
0.01544952392578125,
0.044464111328125,
0.0071563720703125,
-0.03643798828125,
-0.037872314453125,
-0.0291748046875,
0.024261474609375,
-0.03546142578125,
-0.0258026123046875,
-0.0712890625,
-0.024200439453125,
-0.045867919921875,
0.008453369140625,
-0.03570556640625,
-0.04766845703125,
-0.036407470703125,
-0.01474761962890625,
0.03759765625,
0.0142364501953125,
-0.01043701171875,
0.0166778564453125,
-0.055511474609375,
0.014434814453125,
0.0206451416015625,
0.0130615234375,
-0.00653076171875,
-0.047943115234375,
-0.0089874267578125,
0.00020885467529296875,
-0.0291748046875,
-0.0694580078125,
0.047760009765625,
0.018280029296875,
0.05023193359375,
0.0253753662109375,
0.01183319091796875,
0.04571533203125,
-0.03460693359375,
0.06146240234375,
0.020263671875,
-0.061737060546875,
0.039215087890625,
-0.0101470947265625,
0.0187225341796875,
0.03973388671875,
0.0430908203125,
-0.03656005859375,
-0.0157318115234375,
-0.06256103515625,
-0.092041015625,
0.05255126953125,
0.03765869140625,
-0.0074005126953125,
-0.01557159423828125,
0.0036563873291015625,
-0.01262664794921875,
0.01385498046875,
-0.046600341796875,
-0.060150146484375,
-0.021697998046875,
-0.025360107421875,
-0.0097198486328125,
-0.00835418701171875,
-0.0070343017578125,
-0.0290069580078125,
0.07330322265625,
0.00576019287109375,
0.026947021484375,
0.036590576171875,
-0.004405975341796875,
0.0233306884765625,
0.025970458984375,
0.028411865234375,
0.043182373046875,
-0.0272064208984375,
0.006549835205078125,
0.0283355712890625,
-0.03021240234375,
0.002105712890625,
0.02813720703125,
-0.006641387939453125,
0.016143798828125,
0.03228759765625,
0.06683349609375,
0.005687713623046875,
-0.03497314453125,
0.0256500244140625,
-0.006443023681640625,
-0.037139892578125,
-0.035797119140625,
-0.0074462890625,
0.00958251953125,
0.027191162109375,
0.037628173828125,
-0.0014696121215820312,
0.006763458251953125,
-0.03515625,
0.0093231201171875,
0.027923583984375,
-0.02862548828125,
-0.0183258056640625,
0.06329345703125,
-0.0129547119140625,
-0.01461029052734375,
0.041748046875,
-0.0308837890625,
-0.051727294921875,
0.048492431640625,
0.017913818359375,
0.061614990234375,
-0.00994873046875,
0.0018281936645507812,
0.05633544921875,
0.0284576416015625,
-0.0089111328125,
0.025665283203125,
0.01311492919921875,
-0.053619384765625,
-0.00757598876953125,
-0.07476806640625,
0.0193328857421875,
0.007198333740234375,
-0.0450439453125,
0.015411376953125,
-0.03533935546875,
-0.049530029296875,
0.02508544921875,
0.007137298583984375,
-0.06451416015625,
0.0281982421875,
-0.008941650390625,
0.053466796875,
-0.0538330078125,
0.0478515625,
0.050384521484375,
-0.03826904296875,
-0.07568359375,
-0.0131683349609375,
-0.003032684326171875,
-0.0301361083984375,
0.047088623046875,
-0.0090789794921875,
0.016845703125,
0.00665283203125,
-0.020172119140625,
-0.0760498046875,
0.10308837890625,
0.01110076904296875,
-0.05078125,
-0.00742340087890625,
-0.0006227493286132812,
0.043609619140625,
-0.02020263671875,
0.03472900390625,
0.054473876953125,
0.047454833984375,
-0.0008311271667480469,
-0.0723876953125,
0.0182647705078125,
-0.0275421142578125,
-0.0072479248046875,
0.025543212890625,
-0.07977294921875,
0.08099365234375,
-0.00026035308837890625,
-0.01474761962890625,
-0.013641357421875,
0.05230712890625,
0.0251312255859375,
0.0194854736328125,
0.040496826171875,
0.0699462890625,
0.056396484375,
-0.031890869140625,
0.07183837890625,
-0.0263214111328125,
0.056243896484375,
0.08056640625,
0.005523681640625,
0.048614501953125,
0.035003662109375,
-0.04498291015625,
0.040496826171875,
0.043182373046875,
-0.0159912109375,
0.04254150390625,
0.0109100341796875,
-0.0228729248046875,
-0.00789642333984375,
0.00785064697265625,
-0.0357666015625,
0.03594970703125,
0.017669677734375,
-0.0419921875,
0.00036525726318359375,
-0.0301971435546875,
0.0223541259765625,
0.00170135498046875,
-0.01096343994140625,
0.040985107421875,
0.014862060546875,
-0.044158935546875,
0.06402587890625,
0.00250244140625,
0.0579833984375,
-0.037139892578125,
0.0118560791015625,
-0.01317596435546875,
0.0208892822265625,
-0.022186279296875,
-0.0372314453125,
0.01534271240234375,
-0.0033092498779296875,
-0.033111572265625,
0.004344940185546875,
0.0227813720703125,
-0.02191162109375,
-0.0662841796875,
0.034271240234375,
0.03277587890625,
0.0159912109375,
-0.006061553955078125,
-0.06787109375,
-0.007045745849609375,
0.013916015625,
-0.03936767578125,
0.0019025802612304688,
0.04327392578125,
0.00463104248046875,
0.041229248046875,
0.049407958984375,
0.01352691650390625,
0.0221405029296875,
-0.0058441162109375,
0.0650634765625,
-0.05255126953125,
-0.0222320556640625,
-0.07977294921875,
0.045623779296875,
-0.0161895751953125,
-0.029693603515625,
0.05340576171875,
0.0648193359375,
0.06256103515625,
0.003757476806640625,
0.05682373046875,
-0.039306640625,
0.039825439453125,
-0.041015625,
0.0635986328125,
-0.061614990234375,
0.01053619384765625,
-0.024139404296875,
-0.08935546875,
-0.01364898681640625,
0.033447265625,
-0.0175933837890625,
0.0162506103515625,
0.06103515625,
0.06396484375,
-0.00608062744140625,
-0.01904296875,
0.004489898681640625,
0.030303955078125,
0.0175933837890625,
0.0364990234375,
0.0231475830078125,
-0.053375244140625,
0.043182373046875,
-0.044525146484375,
-0.018402099609375,
-0.0146484375,
-0.0645751953125,
-0.05670166015625,
-0.049896240234375,
-0.031890869140625,
-0.027984619140625,
-0.0145111083984375,
0.0809326171875,
0.04437255859375,
-0.07550048828125,
-0.00574493408203125,
-0.002414703369140625,
-0.0007376670837402344,
-0.031036376953125,
-0.0213775634765625,
0.047882080078125,
-0.00910186767578125,
-0.07220458984375,
0.01947021484375,
0.0031185150146484375,
0.0030078887939453125,
0.0089874267578125,
-0.0120697021484375,
-0.044403076171875,
0.0038585662841796875,
0.0386962890625,
0.00714874267578125,
-0.050079345703125,
0.002323150634765625,
-0.0031719207763671875,
-0.00411224365234375,
0.0185394287109375,
0.0268096923828125,
-0.03167724609375,
0.02471923828125,
0.048858642578125,
0.0160064697265625,
0.056732177734375,
0.007686614990234375,
0.0116729736328125,
-0.046905517578125,
0.0232696533203125,
0.01904296875,
0.0369873046875,
0.01605224609375,
-0.0294189453125,
0.050079345703125,
0.038116455078125,
-0.03607177734375,
-0.052947998046875,
-0.01458740234375,
-0.0853271484375,
-0.0223541259765625,
0.07489013671875,
-0.01727294921875,
-0.030059814453125,
0.01274871826171875,
-0.006053924560546875,
0.040313720703125,
-0.047607421875,
0.037139892578125,
0.056671142578125,
-0.0010519027709960938,
0.0003635883331298828,
-0.045257568359375,
0.0284881591796875,
0.027557373046875,
-0.042633056640625,
-0.0167236328125,
0.0222015380859375,
0.034912109375,
0.026641845703125,
0.04559326171875,
-0.017791748046875,
-0.01531982421875,
0.007556915283203125,
0.01099395751953125,
0.0089874267578125,
0.00319671630859375,
0.0004169940948486328,
0.0010318756103515625,
-0.0155181884765625,
-0.0158233642578125
]
] |
lllyasviel/control_v11p_sd15_canny | 2023-05-04T18:48:49.000Z | [
"diffusers",
"art",
"controlnet",
"stable-diffusion",
"controlnet-v1-1",
"image-to-image",
"arxiv:2302.05543",
"license:openrail",
"has_space",
"diffusers:ControlNetModel",
"region:us"
] | image-to-image | lllyasviel | null | null | lllyasviel/control_v11p_sd15_canny | 20 | 28,055 | diffusers | 2023-04-14T19:24:43 | ---
license: openrail
base_model: runwayml/stable-diffusion-v1-5
tags:
- art
- controlnet
- stable-diffusion
- controlnet-v1-1
- image-to-image
duplicated_from: ControlNet-1-1-preview/control_v11p_sd15_canny
---
# Controlnet - v1.1 - *Canny Version*
**Controlnet v1.1** is the successor model of [Controlnet v1.0](https://huggingface.co/lllyasviel/sd-controlnet-canny)
and was released in [lllyasviel/ControlNet-v1-1](https://huggingface.co/lllyasviel/ControlNet-v1-1) by [Lvmin Zhang](https://huggingface.co/lllyasviel).
This checkpoint is a conversion of [the original checkpoint](https://huggingface.co/lllyasviel/ControlNet-v1-1/blob/main/control_v11p_sd15_canny.pth) into `diffusers` format.
It can be used in combination with **Stable Diffusion**, such as [runwayml/stable-diffusion-v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5).
For more details, please also have a look at the [🧨 Diffusers docs](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/controlnet).
ControlNet is a neural network structure to control diffusion models by adding extra conditions.

This checkpoint corresponds to the ControlNet conditioned on **Canny edges**.
## Model Details
- **Developed by:** Lvmin Zhang, Maneesh Agrawala
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [The CreativeML OpenRAIL M license](https://huggingface.co/spaces/CompVis/stable-diffusion-license) is an [Open RAIL M license](https://www.licenses.ai/blog/2022/8/18/naming-convention-of-responsible-ai-licenses), adapted from the work that [BigScience](https://bigscience.huggingface.co/) and [the RAIL Initiative](https://www.licenses.ai/) are jointly carrying out in the area of responsible AI licensing. See also [the article about the BLOOM Open RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) on which our license is based.
- **Resources for more information:** [GitHub Repository](https://github.com/lllyasviel/ControlNet), [Paper](https://arxiv.org/abs/2302.05543).
- **Cite as:**
@misc{zhang2023adding,
title={Adding Conditional Control to Text-to-Image Diffusion Models},
author={Lvmin Zhang and Maneesh Agrawala},
year={2023},
eprint={2302.05543},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
## Introduction
Controlnet was proposed in [*Adding Conditional Control to Text-to-Image Diffusion Models*](https://arxiv.org/abs/2302.05543) by
Lvmin Zhang, Maneesh Agrawala.
The abstract reads as follows:
*We present a neural network structure, ControlNet, to control pretrained large diffusion models to support additional input conditions.
The ControlNet learns task-specific conditions in an end-to-end way, and the learning is robust even when the training dataset is small (< 50k).
Moreover, training a ControlNet is as fast as fine-tuning a diffusion model, and the model can be trained on personal devices.
Alternatively, if powerful computation clusters are available, the model can scale to large amounts (millions to billions) of data.
We report that large diffusion models like Stable Diffusion can be augmented with ControlNets to enable conditional inputs like edge maps, segmentation maps, keypoints, etc.
This may enrich the methods to control large diffusion models and further facilitate related applications.*
## Example
It is recommended to use the checkpoint with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5), as the checkpoint
has been trained on it.
Experimentally, the checkpoint can also be used with other diffusion models, such as DreamBoothed Stable Diffusion (see the sketch after the example images below).
**Note**: If you want to process an image to create the auxiliary conditioning, external dependencies are required as shown below:
1. Install [opencv](https://opencv.org/):
```sh
$ pip install opencv-contrib-python
```
2. Let's install `diffusers` and related packages:
```sh
$ pip install diffusers transformers accelerate
```
3. Run code:
```python
import torch
import numpy as np
import cv2
from PIL import Image
from diffusers.utils import load_image
from diffusers import (
    ControlNetModel,
    StableDiffusionControlNetPipeline,
    UniPCMultistepScheduler,
)

checkpoint = "lllyasviel/control_v11p_sd15_canny"

# Download the input image and turn it into a Canny edge map,
# which serves as the conditioning image for the ControlNet
image = load_image(
    "https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/input.png"
)
image = np.array(image)

low_threshold = 100
high_threshold = 200

image = cv2.Canny(image, low_threshold, high_threshold)
image = image[:, :, None]
image = np.concatenate([image, image, image], axis=2)  # replicate the single channel to RGB
control_image = Image.fromarray(image)
control_image.save("./images/control.png")

# Load the ControlNet and plug it into a Stable Diffusion v1-5 pipeline
controlnet = ControlNetModel.from_pretrained(checkpoint, torch_dtype=torch.float16)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)
pipe.enable_model_cpu_offload()

generator = torch.manual_seed(33)
image = pipe(
    "a blue paradise bird in the jungle",
    num_inference_steps=20,
    generator=generator,
    image=control_image,
).images[0]

image.save('images/image_out.png')
```



## Other released checkpoints v1-1
The authors released 14 different checkpoints, each trained with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5)
on a different type of conditioning:
| Model Name | Control Image Overview| Condition Image | Control Image Example | Generated Image Example |
|---|---|---|---|---|
|[lllyasviel/control_v11p_sd15_canny](https://huggingface.co/lllyasviel/control_v11p_sd15_canny)<br/> | *Trained with canny edge detection* | A monochrome image with white edges on a black background.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11e_sd15_ip2p](https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p)<br/> | *Trained with pixel to pixel instruction* | No condition .|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_inpaint](https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint)<br/> | Trained with image inpainting | No condition.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/output.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/output.png"/></a>|
|[lllyasviel/control_v11p_sd15_mlsd](https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd)<br/> | Trained with multi-level line segment detection | An image with annotated line segments.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11f1p_sd15_depth](https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth)<br/> | Trained with depth estimation | An image with depth information, usually represented as a grayscale image.|<a href="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_normalbae](https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae)<br/> | Trained with surface normal estimation | An image with surface normal information, usually represented as a color-coded image.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_seg](https://huggingface.co/lllyasviel/control_v11p_sd15_seg)<br/> | Trained with image segmentation | An image with segmented regions, usually represented as a color-coded image.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_lineart](https://huggingface.co/lllyasviel/control_v11p_sd15_lineart)<br/> | Trained with line art generation | An image with line art, usually black lines on a white background.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15s2_lineart_anime](https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime)<br/> | Trained with anime line art generation | An image with anime-style line art.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_openpose](https://huggingface.co/lllyasviel/control_v11p_sd15_openpose)<br/> | Trained with human pose estimation | An image with human poses, usually represented as a set of keypoints or skeletons.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_scribble](https://huggingface.co/lllyasviel/control_v11p_sd15_scribble)<br/> | Trained with scribble-based image generation | An image with scribbles, usually random or user-drawn strokes.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_softedge](https://huggingface.co/lllyasviel/control_v11p_sd15_softedge)<br/> | Trained with soft edge image generation | An image with soft edges, usually to create a more painterly or artistic effect.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11e_sd15_shuffle](https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle)<br/> | Trained with image shuffling | An image with shuffled patches or regions.|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11f1e_sd15_tile](https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile)<br/> | Trained with image tiling | A blurry image or part of an image .|<a href="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/original.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/original.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/output.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/output.png"/></a>|
## Improvements in Canny 1.1:
- The training dataset of the previous cnet 1.0 had several problems, including (1) a small group of grayscale human images duplicated thousands of times (!!), making the previous model somewhat likely to generate grayscale human images; (2) some images of low quality, very blurry, or with significant JPEG artifacts; (3) a small group of images with wrongly paired prompts caused by a mistake in our data processing scripts. The new model fixes all problems of the training dataset and should be more reasonable in many cases.
- Because the Canny model is one of the most important (perhaps the most frequently used) ControlNets, we used dedicated funds to train it on a machine with 8 Nvidia A100 80G GPUs at batch size 8×32=256 for 3 days, spending 72×30=2160 USD (the 8×A100 80G machine costs 30 USD/hour). The model was resumed from Canny 1.0.
- Some reasonable data augmentations are applied to training, like random left-right flipping.
- Although it is difficult to evaluate a ControlNet, we find that Canny 1.1 is a bit more robust and of slightly higher visual quality than Canny 1.0.
## More information
For more information, please also have a look at the [Diffusers ControlNet Blog Post](https://huggingface.co/blog/controlnet) and have a look at the [official docs](https://github.com/lllyasviel/ControlNet-v1-1-nightly). | 16,594 | [
[
-0.042327880859375,
-0.04315185546875,
0.01007843017578125,
0.045806884765625,
-0.020263671875,
-0.0195159912109375,
0.002918243408203125,
-0.04034423828125,
0.04022216796875,
0.0214080810546875,
-0.055999755859375,
-0.0291748046875,
-0.055938720703125,
-0.01363372802734375,
-0.0122528076171875,
0.0623779296875,
-0.0222320556640625,
0.0019741058349609375,
0.00628662109375,
-0.006015777587890625,
-0.0079498291015625,
-0.01123809814453125,
-0.09375,
-0.03338623046875,
0.0340576171875,
0.0015039443969726562,
0.03662109375,
0.041290283203125,
0.038787841796875,
0.0283660888671875,
-0.029296875,
0.00605010986328125,
-0.0215606689453125,
-0.0152435302734375,
0.01593017578125,
-0.01183319091796875,
-0.05291748046875,
0.003574371337890625,
0.054443359375,
0.021392822265625,
0.0007653236389160156,
-0.0170135498046875,
0.0122222900390625,
0.050384521484375,
-0.037353515625,
-0.006168365478515625,
-0.012847900390625,
0.01666259765625,
-0.00969696044921875,
0.006748199462890625,
-0.01277923583984375,
-0.023590087890625,
0.005542755126953125,
-0.059906005859375,
-0.00710296630859375,
-0.01303863525390625,
0.102783203125,
0.02386474609375,
-0.034698486328125,
-0.007720947265625,
-0.018951416015625,
0.052093505859375,
-0.059356689453125,
0.006839752197265625,
0.010009765625,
0.0191802978515625,
-0.01739501953125,
-0.0731201171875,
-0.0384521484375,
-0.01287078857421875,
-0.006938934326171875,
0.0374755859375,
-0.0253448486328125,
0.004665374755859375,
0.0183563232421875,
0.0167388916015625,
-0.02716064453125,
0.01885986328125,
-0.0252838134765625,
-0.0297698974609375,
0.04852294921875,
-0.00007796287536621094,
0.044219970703125,
0.0022220611572265625,
-0.044464111328125,
-0.0036830902099609375,
-0.027557373046875,
0.029296875,
0.017578125,
-0.0078277587890625,
-0.059539794921875,
0.0318603515625,
-0.004108428955078125,
0.05377197265625,
0.0313720703125,
-0.017913818359375,
0.0345458984375,
-0.0180816650390625,
-0.0272369384765625,
-0.0218658447265625,
0.07696533203125,
0.03936767578125,
0.01076507568359375,
-0.00017690658569335938,
-0.0095062255859375,
-0.00732421875,
-0.003932952880859375,
-0.08880615234375,
-0.0128021240234375,
0.01654052734375,
-0.041839599609375,
-0.0288238525390625,
-0.00923919677734375,
-0.05169677734375,
-0.013763427734375,
-0.006862640380859375,
0.0269927978515625,
-0.045501708984375,
-0.037506103515625,
0.01041412353515625,
-0.035308837890625,
0.04254150390625,
0.046722412109375,
-0.034698486328125,
0.0163726806640625,
0.01296234130859375,
0.07763671875,
-0.019744873046875,
-0.008819580078125,
-0.020233154296875,
-0.00691986083984375,
-0.02685546875,
0.040679931640625,
-0.0102996826171875,
-0.010772705078125,
-0.0082855224609375,
0.02545166015625,
-0.00865936279296875,
-0.0271148681640625,
0.032318115234375,
-0.026611328125,
0.01123046875,
-0.004589080810546875,
-0.0269927978515625,
-0.01123046875,
0.0200653076171875,
-0.03533935546875,
0.05596923828125,
0.017822265625,
-0.08306884765625,
0.0248260498046875,
-0.038665771484375,
-0.015899658203125,
-0.0199127197265625,
0.01026153564453125,
-0.05743408203125,
-0.033477783203125,
0.0031223297119140625,
0.04095458984375,
-0.0015277862548828125,
-0.00862884521484375,
-0.037567138671875,
-0.001979827880859375,
0.01111602783203125,
-0.00890350341796875,
0.09637451171875,
0.0110626220703125,
-0.04095458984375,
0.01593017578125,
-0.055328369140625,
0.003528594970703125,
0.010772705078125,
-0.023193359375,
0.0035800933837890625,
-0.0211639404296875,
0.009429931640625,
0.04571533203125,
0.0250701904296875,
-0.0531005859375,
0.01275634765625,
-0.0193328857421875,
0.036285400390625,
0.048065185546875,
0.0177764892578125,
0.04388427734375,
-0.039642333984375,
0.042022705078125,
0.022979736328125,
0.0225830078125,
0.007080078125,
-0.0399169921875,
-0.08026123046875,
-0.045440673828125,
0.001346588134765625,
0.0469970703125,
-0.0623779296875,
0.05841064453125,
0.00911712646484375,
-0.050994873046875,
-0.0194091796875,
0.0033130645751953125,
0.037628173828125,
0.03814697265625,
0.0236358642578125,
-0.036285400390625,
-0.030731201171875,
-0.06939697265625,
0.0112152099609375,
0.015625,
-0.0008649826049804688,
0.0128631591796875,
0.04913330078125,
-0.0078582763671875,
0.0496826171875,
-0.0181427001953125,
-0.0290069580078125,
-0.00417327880859375,
-0.0102996826171875,
0.022552490234375,
0.07781982421875,
0.0594482421875,
-0.058807373046875,
-0.04779052734375,
-0.0016031265258789062,
-0.0675048828125,
-0.0020351409912109375,
-0.0164794921875,
-0.037567138671875,
0.0146484375,
0.04571533203125,
-0.04656982421875,
0.060150146484375,
0.040130615234375,
-0.04095458984375,
0.04840087890625,
-0.0276947021484375,
0.01434326171875,
-0.07403564453125,
0.016571044921875,
0.02606201171875,
-0.024444580078125,
-0.0462646484375,
0.007617950439453125,
0.011505126953125,
0.005268096923828125,
-0.0555419921875,
0.052520751953125,
-0.037567138671875,
0.01346588134765625,
-0.0211181640625,
-0.0087890625,
0.003978729248046875,
0.04864501953125,
0.0153656005859375,
0.03759765625,
0.07525634765625,
-0.047515869140625,
0.0272979736328125,
0.03204345703125,
-0.0194244384765625,
0.06463623046875,
-0.06256103515625,
0.0095367431640625,
-0.01332855224609375,
0.043792724609375,
-0.0755615234375,
-0.021087646484375,
0.0499267578125,
-0.039581298828125,
0.04022216796875,
-0.0211181640625,
-0.0179595947265625,
-0.03350830078125,
-0.0257568359375,
0.01551055908203125,
0.054443359375,
-0.0372314453125,
0.0262603759765625,
0.0092010498046875,
0.011566162109375,
-0.03912353515625,
-0.07232666015625,
-0.006229400634765625,
-0.028411865234375,
-0.0626220703125,
0.039031982421875,
-0.01407623291015625,
0.0011720657348632812,
0.0009307861328125,
0.004486083984375,
-0.023284912109375,
0.0010080337524414062,
0.027984619140625,
0.02099609375,
-0.00696563720703125,
-0.01404571533203125,
0.01165771484375,
-0.01287078857421875,
-0.0050048828125,
-0.0277557373046875,
0.03131103515625,
0.004558563232421875,
-0.0152435302734375,
-0.07452392578125,
0.018890380859375,
0.043212890625,
-0.003040313720703125,
0.06787109375,
0.06536865234375,
-0.032318115234375,
-0.0005755424499511719,
-0.0262603759765625,
-0.00844573974609375,
-0.038604736328125,
-0.002227783203125,
-0.017822265625,
-0.053802490234375,
0.051483154296875,
0.0074615478515625,
-0.0025501251220703125,
0.04925537109375,
0.0294647216796875,
-0.01654052734375,
0.06610107421875,
0.040130615234375,
-0.0075531005859375,
0.05987548828125,
-0.0579833984375,
-0.01163482666015625,
-0.07696533203125,
-0.0232391357421875,
-0.0261993408203125,
-0.0477294921875,
-0.0265045166015625,
-0.0300140380859375,
0.035614013671875,
0.035552978515625,
-0.05511474609375,
0.036651611328125,
-0.0445556640625,
0.008453369140625,
0.025054931640625,
0.042510986328125,
-0.01031494140625,
-0.01058197021484375,
-0.014923095703125,
0.007488250732421875,
-0.04840087890625,
-0.0199432373046875,
0.048126220703125,
0.038787841796875,
0.039886474609375,
-0.003444671630859375,
0.04522705078125,
0.0024929046630859375,
0.021697998046875,
-0.0440673828125,
0.03924560546875,
-0.0027980804443359375,
-0.04168701171875,
-0.014862060546875,
-0.0226593017578125,
-0.0782470703125,
0.0069732666015625,
-0.036590576171875,
-0.0572509765625,
0.0262908935546875,
0.02093505859375,
-0.00875091552734375,
0.03741455078125,
-0.050567626953125,
0.057891845703125,
-0.001697540283203125,
-0.045623779296875,
0.004093170166015625,
-0.0638427734375,
0.019683837890625,
0.0206298828125,
-0.01325225830078125,
0.0030956268310546875,
-0.01137542724609375,
0.065185546875,
-0.059906005859375,
0.0634765625,
-0.038299560546875,
-0.0009407997131347656,
0.030853271484375,
-0.0017681121826171875,
0.04364013671875,
-0.0128021240234375,
-0.0178070068359375,
0.006198883056640625,
-0.0034770965576171875,
-0.045623779296875,
-0.027008056640625,
0.052398681640625,
-0.05511474609375,
-0.01454925537109375,
-0.019317626953125,
-0.0194244384765625,
0.0174407958984375,
0.016143798828125,
0.050994873046875,
0.0287628173828125,
0.0147857666015625,
0.004604339599609375,
0.048187255859375,
-0.0283050537109375,
0.0491943359375,
0.0022983551025390625,
-0.007476806640625,
-0.043304443359375,
0.05645751953125,
0.0026950836181640625,
0.029632568359375,
0.0186004638671875,
0.01140594482421875,
-0.01727294921875,
-0.04052734375,
-0.03131103515625,
0.03369140625,
-0.047332763671875,
-0.032684326171875,
-0.047637939453125,
-0.03533935546875,
-0.033538818359375,
-0.036102294921875,
-0.0200042724609375,
-0.0215606689453125,
-0.057098388671875,
0.014739990234375,
0.0474853515625,
0.038116455078125,
-0.017578125,
0.043426513671875,
-0.0190582275390625,
0.0202178955078125,
0.0239410400390625,
0.027862548828125,
-0.002483367919921875,
-0.046539306640625,
0.00261688232421875,
0.0085601806640625,
-0.03997802734375,
-0.058349609375,
0.039398193359375,
0.002895355224609375,
0.03570556640625,
0.040679931640625,
-0.017547607421875,
0.0517578125,
-0.020263671875,
0.046234130859375,
0.048675537109375,
-0.060760498046875,
0.039642333984375,
-0.0328369140625,
0.021270751953125,
0.0222015380859375,
0.040130615234375,
-0.034393310546875,
-0.0262908935546875,
-0.054473876953125,
-0.051544189453125,
0.04632568359375,
0.018524169921875,
-0.008453369140625,
0.024444580078125,
0.050384521484375,
-0.02520751953125,
0.01094818115234375,
-0.0625,
-0.034149169921875,
-0.023193359375,
0.004241943359375,
0.0017414093017578125,
0.0060272216796875,
-0.001750946044921875,
-0.035308837890625,
0.06793212890625,
-0.0015764236450195312,
0.045623779296875,
0.0390625,
0.00916290283203125,
-0.01277923583984375,
-0.0210723876953125,
0.04248046875,
0.0345458984375,
-0.00811004638671875,
-0.0221710205078125,
0.005977630615234375,
-0.02685546875,
0.017425537109375,
-0.00473785400390625,
-0.026885986328125,
-0.01090240478515625,
0.027923583984375,
0.06396484375,
-0.0151214599609375,
-0.0123748779296875,
0.058807373046875,
0.0034008026123046875,
-0.043731689453125,
-0.0234375,
0.005779266357421875,
0.0131378173828125,
0.03741455078125,
0.01103973388671875,
0.030792236328125,
0.0042572021484375,
-0.01216888427734375,
0.0199432373046875,
0.04522705078125,
-0.044769287109375,
-0.00991058349609375,
0.058624267578125,
0.002105712890625,
-0.0118255615234375,
0.0303497314453125,
-0.03173828125,
-0.053558349609375,
0.07501220703125,
0.03973388671875,
0.05517578125,
-0.006511688232421875,
0.0198516845703125,
0.056427001953125,
0.0128021240234375,
0.005584716796875,
0.0150909423828125,
0.006816864013671875,
-0.051513671875,
-0.03155517578125,
-0.0325927734375,
-0.0031375885009765625,
0.0144195556640625,
-0.0318603515625,
0.034271240234375,
-0.06085205078125,
-0.0179290771484375,
-0.0066375732421875,
0.0084075927734375,
-0.053863525390625,
0.030792236328125,
0.00498199462890625,
0.09564208984375,
-0.0628662109375,
0.06463623046875,
0.043243408203125,
-0.03509521484375,
-0.067626953125,
-0.004077911376953125,
0.0018310546875,
-0.059814453125,
0.0491943359375,
0.01201629638671875,
-0.003997802734375,
0.005886077880859375,
-0.062744140625,
-0.04510498046875,
0.1005859375,
0.01922607421875,
-0.019989013671875,
0.00714874267578125,
-0.03985595703125,
0.035003662109375,
-0.03326416015625,
0.037567138671875,
0.03228759765625,
0.04034423828125,
0.03338623046875,
-0.0595703125,
0.01453399658203125,
-0.033294677734375,
0.010467529296875,
0.0106353759765625,
-0.07965087890625,
0.067626953125,
-0.0025882720947265625,
-0.01313018798828125,
0.0251007080078125,
0.055511474609375,
0.018035888671875,
0.01169586181640625,
0.048675537109375,
0.059814453125,
0.0254058837890625,
-0.007366180419921875,
0.0753173828125,
-0.002685546875,
0.0226898193359375,
0.048187255859375,
0.0218505859375,
0.04034423828125,
0.0287017822265625,
-0.007419586181640625,
0.039642333984375,
0.06689453125,
0.003467559814453125,
0.035003662109375,
0.039886474609375,
-0.02056884765625,
-0.0115966796875,
-0.004276275634765625,
-0.0262603759765625,
0.006633758544921875,
0.019561767578125,
-0.020233154296875,
-0.0183563232421875,
0.0215301513671875,
0.0189056396484375,
-0.0125732421875,
-0.034454345703125,
0.0509033203125,
-0.0089874267578125,
-0.043212890625,
0.058319091796875,
-0.0038013458251953125,
0.085693359375,
-0.055816650390625,
0.003108978271484375,
-0.0191497802734375,
0.00440216064453125,
-0.02984619140625,
-0.0628662109375,
0.0152435302734375,
-0.01425933837890625,
0.0193328857421875,
-0.028778076171875,
0.053985595703125,
-0.0305328369140625,
-0.032958984375,
0.0401611328125,
0.00542449951171875,
0.030853271484375,
0.01248931884765625,
-0.0821533203125,
0.018310546875,
0.00859832763671875,
-0.03448486328125,
0.0172271728515625,
0.0258026123046875,
0.01422119140625,
0.056732177734375,
0.02496337890625,
0.02923583984375,
0.0217742919921875,
-0.01959228515625,
0.0797119140625,
-0.0209503173828125,
-0.024871826171875,
-0.041900634765625,
0.062042236328125,
-0.0232391357421875,
-0.034698486328125,
0.040802001953125,
0.02349853515625,
0.057525634765625,
-0.006923675537109375,
0.05517578125,
-0.03167724609375,
0.01526641845703125,
-0.0504150390625,
0.063232421875,
-0.0670166015625,
-0.020233154296875,
-0.025604248046875,
-0.050872802734375,
-0.0240631103515625,
0.06427001953125,
-0.011749267578125,
0.014739990234375,
0.041778564453125,
0.07745361328125,
-0.01593017578125,
-0.0458984375,
0.0070953369140625,
0.00704193115234375,
0.0255126953125,
0.057525634765625,
0.051849365234375,
-0.05230712890625,
0.0257415771484375,
-0.0394287109375,
-0.036773681640625,
-0.00664520263671875,
-0.075439453125,
-0.063720703125,
-0.053863525390625,
-0.0540771484375,
-0.054473876953125,
-0.017974853515625,
0.0557861328125,
0.08660888671875,
-0.05133056640625,
-0.01371002197265625,
-0.0269775390625,
0.00826263427734375,
-0.01457977294921875,
-0.015869140625,
0.027587890625,
-0.006622314453125,
-0.06396484375,
0.002193450927734375,
0.0167694091796875,
0.044342041015625,
-0.007236480712890625,
-0.032440185546875,
-0.0306549072265625,
-0.0193634033203125,
0.0192108154296875,
0.035003662109375,
-0.036590576171875,
-0.0163116455078125,
-0.022186279296875,
-0.01517486572265625,
0.01067352294921875,
0.042022705078125,
-0.032806396484375,
0.0188140869140625,
0.041290283203125,
0.0293731689453125,
0.06414794921875,
-0.01238250732421875,
0.006191253662109375,
-0.04083251953125,
0.039398193359375,
0.0013628005981445312,
0.032379150390625,
0.00896453857421875,
-0.0237274169921875,
0.030548095703125,
0.026824951171875,
-0.0572509765625,
-0.031585693359375,
0.01236724853515625,
-0.108642578125,
-0.01189422607421875,
0.07659912109375,
-0.02862548828125,
-0.031768798828125,
0.01349639892578125,
-0.03546142578125,
0.030181884765625,
-0.0269317626953125,
0.0167999267578125,
0.029876708984375,
-0.0133819580078125,
-0.028472900390625,
-0.0278472900390625,
0.045257568359375,
0.01371002197265625,
-0.060516357421875,
-0.039215087890625,
0.040130615234375,
0.03057861328125,
0.0218658447265625,
0.068603515625,
-0.00890350341796875,
0.01019287109375,
-0.003108978271484375,
0.015625,
0.0025424957275390625,
-0.010162353515625,
-0.041046142578125,
-0.00859832763671875,
-0.01538848876953125,
-0.0316162109375
]
] |
HumanCompatibleAI/ppo-seals-CartPole-v0 | 2023-09-19T09:41:41.000Z | [
"stable-baselines3",
"seals/CartPole-v0",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | HumanCompatibleAI | null | null | HumanCompatibleAI/ppo-seals-CartPole-v0 | 14 | 28,039 | stable-baselines3 | 2022-12-29T13:39:32 | ---
library_name: stable-baselines3
tags:
- seals/CartPole-v0
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: seals/CartPole-v0
type: seals/CartPole-v0
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **PPO** Agent playing **seals/CartPole-v0**
This is a trained model of a **PPO** agent playing **seals/CartPole-v0**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```bash
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo ppo --env seals/CartPole-v0 -orga HumanCompatibleAI -f logs/
python -m rl_zoo3.enjoy --algo ppo --env seals/CartPole-v0 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), you can run the same commands from anywhere:
```bash
python -m rl_zoo3.load_from_hub --algo ppo --env seals/CartPole-v0 -orga HumanCompatibleAI -f logs/
python -m rl_zoo3.enjoy --algo ppo --env seals/CartPole-v0 -f logs/
```
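As an alternative to the Zoo CLI, the checkpoint can also be loaded directly in Python. The sketch below is illustrative and not part of the original card: it assumes the `huggingface_sb3` helper and the `seals` package are installed, and that the checkpoint file follows the usual RL Zoo naming convention.
```python
# Minimal sketch -- the zip filename and env registration are assumptions; verify against the repo files.
import gym
import seals  # noqa: F401  (importing registers the seals/* environments)
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

checkpoint = load_from_hub(
    repo_id="HumanCompatibleAI/ppo-seals-CartPole-v0",
    filename="ppo-seals-CartPole-v0.zip",  # assumed RL Zoo naming convention
)
model = PPO.load(checkpoint)

env = gym.make("seals/CartPole-v0")
obs = env.reset()
for _ in range(500):
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, done, info = env.step(action)
    if done:
        obs = env.reset()
```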
## Training (with the RL Zoo)
```bash
python -m rl_zoo3.train --algo ppo --env seals/CartPole-v0 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo ppo --env seals/CartPole-v0 -f logs/ -orga HumanCompatibleAI
```
## Hyperparameters
```python
OrderedDict([('batch_size', 256),
('clip_range', 0.4),
('ent_coef', 0.008508727919228772),
('gae_lambda', 0.9),
('gamma', 0.9999),
('learning_rate', 0.0012403278189645594),
('max_grad_norm', 0.8),
('n_envs', 8),
('n_epochs', 10),
('n_steps', 512),
('n_timesteps', 100000.0),
('policy', 'MlpPolicy'),
('policy_kwargs',
{'activation_fn': <class 'torch.nn.modules.activation.ReLU'>,
'net_arch': [{'pi': [64, 64], 'vf': [64, 64]}]}),
('vf_coef', 0.489343896591493),
('normalize', False)])
```
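The dictionary above is the Zoo's serialized config. As a rough illustration (not taken from the original card), the tuned values map onto the SB3 `PPO` constructor as sketched below; `n_timesteps`, `n_envs`, and `normalize` are Zoo-level settings handled outside the constructor, so the sketch folds `n_timesteps` into `learn()` and omits the 8-env vectorized setup.
```python
# Illustrative mapping of the tuned hyperparameters onto plain SB3 (sketch, not the training script).
import seals  # noqa: F401  (registers the seals/* environments; assumption)
import torch.nn as nn
from stable_baselines3 import PPO

model = PPO(
    "MlpPolicy",
    "seals/CartPole-v0",
    batch_size=256,
    clip_range=0.4,
    ent_coef=0.008508727919228772,
    gae_lambda=0.9,
    gamma=0.9999,
    learning_rate=0.0012403278189645594,
    max_grad_norm=0.8,
    n_epochs=10,
    n_steps=512,
    vf_coef=0.489343896591493,
    policy_kwargs=dict(
        activation_fn=nn.ReLU,
        net_arch=[dict(pi=[64, 64], vf=[64, 64])],
    ),
)
model.learn(total_timesteps=100_000)  # n_timesteps from the table above
```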
# Environment Arguments
```python
{'render_mode': 'rgb_array'}
```
| 2,715 | [
[
-0.040557861328125,
-0.032470703125,
0.01036834716796875,
0.0139617919921875,
-0.022064208984375,
0.0020351409912109375,
0.0001283884048461914,
-0.0269775390625,
0.00594329833984375,
0.0291900634765625,
-0.0428466796875,
-0.04833984375,
-0.03961181640625,
0.0025615692138671875,
0.0239715576171875,
0.0745849609375,
0.006622314453125,
0.001842498779296875,
0.003246307373046875,
-0.0062408447265625,
-0.01934814453125,
-0.033355712890625,
-0.05615234375,
-0.04498291015625,
0.01450347900390625,
0.0283203125,
0.04852294921875,
0.047943115234375,
0.0212554931640625,
0.024322509765625,
-0.0168609619140625,
0.0033130645751953125,
-0.023773193359375,
-0.01340484619140625,
-0.003879547119140625,
-0.024658203125,
-0.05157470703125,
0.00969696044921875,
0.038116455078125,
0.00951385498046875,
-0.01052093505859375,
0.0184173583984375,
-0.00450897216796875,
0.0322265625,
-0.052001953125,
0.050079345703125,
-0.029022216796875,
0.02569580078125,
0.0036106109619140625,
-0.0080108642578125,
-0.0193634033203125,
-0.000606536865234375,
0.0162811279296875,
-0.09283447265625,
0.012054443359375,
-0.0001227855682373047,
0.11187744140625,
-0.00994873046875,
-0.02557373046875,
-0.0065460205078125,
-0.036163330078125,
0.07122802734375,
-0.06268310546875,
0.0162506103515625,
0.042510986328125,
0.0260467529296875,
-0.01441192626953125,
-0.059295654296875,
-0.0284576416015625,
-0.0230712890625,
-0.0084991455078125,
0.031890869140625,
0.0007963180541992188,
0.00920867919921875,
0.032501220703125,
0.0113372802734375,
-0.04681396484375,
-0.0021495819091796875,
-0.039276123046875,
-0.0256500244140625,
0.04766845703125,
0.0206451416015625,
0.0018835067749023438,
-0.00861358642578125,
-0.039215087890625,
-0.039794921875,
-0.0255279541015625,
0.03326416015625,
0.0310821533203125,
0.024871826171875,
-0.0295867919921875,
0.036468505859375,
-0.04034423828125,
0.0467529296875,
-0.00782012939453125,
-0.032958984375,
0.037384033203125,
-0.0024871826171875,
-0.01812744140625,
-0.0133819580078125,
0.0633544921875,
0.039794921875,
0.00997161865234375,
0.0284271240234375,
-0.0296630859375,
-0.0195770263671875,
0.00849151611328125,
-0.040069580078125,
-0.02484130859375,
0.0216064453125,
-0.01312255859375,
-0.03265380859375,
0.0103302001953125,
-0.06488037109375,
0.00574493408203125,
-0.010986328125,
0.040313720703125,
-0.038238525390625,
-0.010345458984375,
0.0109710693359375,
-0.009490966796875,
0.045806884765625,
0.02496337890625,
-0.06439208984375,
0.02734375,
0.04449462890625,
0.0645751953125,
0.01910400390625,
-0.051055908203125,
-0.022979736328125,
0.0311279296875,
-0.017242431640625,
0.053741455078125,
-0.0113677978515625,
-0.0231170654296875,
0.01030731201171875,
0.00727081298828125,
-0.0178070068359375,
-0.032257080078125,
0.041473388671875,
-0.04266357421875,
0.002910614013671875,
-0.00701141357421875,
-0.02972412109375,
-0.031982421875,
0.03729248046875,
-0.047943115234375,
0.08544921875,
0.01178741455078125,
-0.062255859375,
0.0231475830078125,
-0.048919677734375,
-0.013885498046875,
0.0063323974609375,
0.00205230712890625,
-0.076416015625,
-0.0281982421875,
0.0203094482421875,
0.021514892578125,
-0.01080322265625,
0.00731658935546875,
-0.038909912109375,
-0.0239715576171875,
0.009796142578125,
0.0027923583984375,
0.08197021484375,
0.0128021240234375,
-0.0257720947265625,
0.01129150390625,
-0.0516357421875,
0.006725311279296875,
0.0276031494140625,
-0.037506103515625,
0.0101165771484375,
-0.01149749755859375,
0.00998687744140625,
0.0167999267578125,
0.022491455078125,
-0.023529052734375,
0.035491943359375,
-0.019256591796875,
0.04571533203125,
0.051300048828125,
0.02117919921875,
0.01444244384765625,
-0.04046630859375,
0.036590576171875,
0.0045623779296875,
0.039886474609375,
0.023895263671875,
-0.05303955078125,
-0.04400634765625,
-0.028076171875,
-0.00302886962890625,
0.0343017578125,
-0.0455322265625,
0.042266845703125,
0.0018758773803710938,
-0.054656982421875,
-0.01302337646484375,
-0.01178741455078125,
0.0283050537109375,
0.0406494140625,
0.0325927734375,
-0.005023956298828125,
-0.03875732421875,
-0.0654296875,
0.0071563720703125,
-0.0364990234375,
0.0015687942504882812,
0.022247314453125,
0.0753173828125,
-0.005565643310546875,
0.054412841796875,
-0.04083251953125,
-0.0273284912109375,
-0.012359619140625,
0.0133056640625,
0.031524658203125,
0.05352783203125,
0.04742431640625,
-0.034423828125,
-0.0252532958984375,
-0.0058746337890625,
-0.07012939453125,
0.0174713134765625,
0.006557464599609375,
-0.0106964111328125,
0.0006132125854492188,
0.0111846923828125,
-0.072265625,
0.029541015625,
0.0140838623046875,
-0.012725830078125,
0.06121826171875,
-0.0362548828125,
0.00933837890625,
-0.048736572265625,
0.0177154541015625,
-0.004436492919921875,
0.0008435249328613281,
-0.0372314453125,
0.024169921875,
-0.0005450248718261719,
-0.026641845703125,
-0.065185546875,
0.0411376953125,
-0.03668212890625,
-0.01593017578125,
0.004497528076171875,
0.00136566162109375,
0.00887298583984375,
0.047943115234375,
0.0169525146484375,
0.049407958984375,
0.07452392578125,
-0.0557861328125,
0.03179931640625,
0.0239715576171875,
-0.00760650634765625,
0.01263427734375,
-0.0546875,
0.0105438232421875,
0.00980377197265625,
0.03533935546875,
-0.056243896484375,
-0.033477783203125,
0.0513916015625,
-0.024688720703125,
0.0105438232421875,
-0.03302001953125,
-0.02294921875,
-0.0413818359375,
-0.040252685546875,
0.04034423828125,
0.035003662109375,
-0.0321044921875,
0.029754638671875,
0.0267333984375,
0.02001953125,
-0.06201171875,
-0.024169921875,
-0.020172119140625,
-0.0308837890625,
-0.03802490234375,
0.019378662109375,
0.00453948974609375,
0.0001404285430908203,
-0.010406494140625,
-0.0015268325805664062,
-0.0071868896484375,
0.01067352294921875,
0.00496673583984375,
0.026641845703125,
-0.017822265625,
-0.011962890625,
-0.021697998046875,
-0.014404296875,
0.01263427734375,
-0.0302734375,
0.04217529296875,
-0.031463623046875,
-0.0101318359375,
-0.050201416015625,
-0.01233673095703125,
0.0506591796875,
-0.02203369140625,
0.057098388671875,
0.0501708984375,
-0.03668212890625,
-0.0137786865234375,
-0.0140228271484375,
-0.021087646484375,
-0.035003662109375,
0.020050048828125,
-0.032440185546875,
-0.022247314453125,
0.056671142578125,
0.011260986328125,
0.01261138916015625,
0.045501708984375,
0.0185546875,
-0.003795623779296875,
0.0731201171875,
0.0243682861328125,
-0.0007252693176269531,
0.030609130859375,
-0.0660400390625,
-0.0193939208984375,
-0.06732177734375,
-0.032958984375,
-0.0360107421875,
-0.0033092498779296875,
-0.029327392578125,
-0.0202789306640625,
0.024810791015625,
0.037078857421875,
-0.051177978515625,
0.035736083984375,
-0.045989990234375,
0.0210418701171875,
0.037261962890625,
0.016815185546875,
0.006023406982421875,
0.0037250518798828125,
-0.02703857421875,
0.0030345916748046875,
-0.0615234375,
-0.0462646484375,
0.0859375,
0.034912109375,
0.0517578125,
-0.00439453125,
0.046661376953125,
0.0129241943359375,
0.01763916015625,
-0.046051025390625,
0.0457763671875,
0.0164642333984375,
-0.05279541015625,
-0.0256195068359375,
-0.01230621337890625,
-0.076904296875,
0.03656005859375,
-0.0231170654296875,
-0.0631103515625,
0.0014495849609375,
0.021697998046875,
-0.0176544189453125,
0.0309600830078125,
-0.028167724609375,
0.07171630859375,
-0.014801025390625,
-0.029541015625,
-0.00007712841033935547,
-0.045684814453125,
0.040771484375,
0.01149749755859375,
0.0070953369140625,
-0.017852783203125,
-0.0152435302734375,
0.07012939453125,
-0.04925537109375,
0.048126220703125,
-0.03466796875,
0.019317626953125,
0.050811767578125,
-0.0145416259765625,
0.044921875,
0.042144775390625,
-0.0052032470703125,
0.02325439453125,
-0.0128021240234375,
-0.054412841796875,
-0.02471923828125,
0.0440673828125,
-0.1038818359375,
-0.042694091796875,
-0.06280517578125,
-0.02984619140625,
-0.002147674560546875,
0.0122222900390625,
0.03717041015625,
0.01800537109375,
0.00688934326171875,
0.001308441162109375,
0.042266845703125,
-0.0160369873046875,
0.0309600830078125,
0.05157470703125,
-0.01558685302734375,
-0.05645751953125,
0.0557861328125,
0.00695037841796875,
0.00887298583984375,
0.016357421875,
0.01277923583984375,
-0.035003662109375,
-0.042449951171875,
-0.03271484375,
0.033905029296875,
-0.0426025390625,
-0.02801513671875,
-0.03424072265625,
-0.031768798828125,
-0.0361328125,
-0.00951385498046875,
-0.0382080078125,
-0.0179443359375,
-0.037445068359375,
-0.00868988037109375,
0.040863037109375,
0.042724609375,
-0.0228118896484375,
0.029571533203125,
-0.042266845703125,
0.007747650146484375,
0.036376953125,
0.004756927490234375,
-0.003635406494140625,
-0.048126220703125,
-0.0205535888671875,
0.00400543212890625,
-0.045562744140625,
-0.07415771484375,
0.040557861328125,
0.00746917724609375,
0.057403564453125,
0.036590576171875,
0.0037975311279296875,
0.051422119140625,
-0.005031585693359375,
0.06304931640625,
0.0070037841796875,
-0.062347412109375,
0.05426025390625,
-0.03826904296875,
0.01593017578125,
0.02703857421875,
0.03533935546875,
-0.0118408203125,
-0.0259552001953125,
-0.061004638671875,
-0.065185546875,
0.08074951171875,
0.032470703125,
-0.0259552001953125,
0.00833892822265625,
0.030609130859375,
-0.0199127197265625,
-0.0025920867919921875,
-0.07415771484375,
-0.02557373046875,
-0.031829833984375,
0.0229339599609375,
-0.01873779296875,
0.014007568359375,
-0.0220947265625,
-0.0269317626953125,
0.08038330078125,
-0.010711669921875,
0.03338623046875,
0.030364990234375,
-0.0102691650390625,
-0.01513671875,
-0.006561279296875,
0.0604248046875,
0.04278564453125,
-0.04498291015625,
-0.005283355712890625,
0.024139404296875,
-0.039337158203125,
0.026397705078125,
0.00998687744140625,
-0.00605010986328125,
-0.004077911376953125,
0.041839599609375,
0.050262451171875,
0.0177459716796875,
-0.0240478515625,
0.044677734375,
0.0015010833740234375,
-0.03350830078125,
-0.035003662109375,
0.00267791748046875,
0.0019273757934570312,
0.0137939453125,
0.03131103515625,
0.00313568115234375,
-0.012054443359375,
-0.032470703125,
0.01239776611328125,
0.027069091796875,
-0.029754638671875,
-0.028961181640625,
0.058868408203125,
0.00429534912109375,
-0.0239715576171875,
0.056610107421875,
-0.01837158203125,
-0.043975830078125,
0.07806396484375,
0.03961181640625,
0.054901123046875,
-0.0095977783203125,
0.024688720703125,
0.0662841796875,
0.0238037109375,
-0.007293701171875,
0.0177001953125,
0.009857177734375,
-0.04949951171875,
-0.003002166748046875,
-0.038330078125,
-0.031036376953125,
0.0478515625,
-0.061920166015625,
0.02880859375,
-0.05224609375,
-0.0257720947265625,
-0.00754547119140625,
0.0340576171875,
-0.053802490234375,
0.00980377197265625,
0.00713348388671875,
0.07562255859375,
-0.08038330078125,
0.0760498046875,
0.08013916015625,
-0.0472412109375,
-0.06695556640625,
-0.01334381103515625,
-0.0037841796875,
-0.056060791015625,
0.0455322265625,
0.0025997161865234375,
0.0084381103515625,
-0.002685546875,
-0.0457763671875,
-0.06756591796875,
0.10791015625,
0.018341064453125,
-0.025848388671875,
0.00922393798828125,
0.00920867919921875,
0.0413818359375,
-0.026214599609375,
0.036956787109375,
0.020660400390625,
0.04339599609375,
-0.00811004638671875,
-0.052093505859375,
0.00020170211791992188,
-0.00647735595703125,
-0.002197265625,
-0.00975799560546875,
-0.07574462890625,
0.09954833984375,
-0.0123291015625,
0.0062713623046875,
0.004383087158203125,
0.055450439453125,
0.0654296875,
0.0160675048828125,
0.036163330078125,
0.04852294921875,
0.037872314453125,
-0.00392913818359375,
0.05535888671875,
-0.0350341796875,
0.07135009765625,
0.05633544921875,
-0.015838623046875,
0.04937744140625,
0.015716552734375,
-0.0204010009765625,
0.0240478515625,
0.05499267578125,
-0.02105712890625,
0.0361328125,
0.011077880859375,
-0.0244903564453125,
-0.015899658203125,
0.0174102783203125,
-0.04736328125,
0.0209197998046875,
0.0178070068359375,
-0.00511932373046875,
-0.0220947265625,
-0.001079559326171875,
-0.00649261474609375,
-0.0224609375,
-0.0262908935546875,
0.047149658203125,
-0.0104522705078125,
-0.0472412109375,
0.06878662109375,
-0.00031447410583496094,
0.0305023193359375,
-0.05426025390625,
-0.0179901123046875,
-0.011993408203125,
0.02093505859375,
-0.01415252685546875,
-0.061737060546875,
-0.0052490234375,
-0.0185089111328125,
0.005947113037109375,
0.00513458251953125,
0.0513916015625,
-0.005950927734375,
-0.018646240234375,
0.045166015625,
0.0199432373046875,
0.02716064453125,
0.0168914794921875,
-0.08660888671875,
0.0034084320068359375,
-0.007801055908203125,
-0.0443115234375,
0.028839111328125,
0.0172119140625,
0.003810882568359375,
0.06060791015625,
0.042266845703125,
-0.0074005126953125,
0.008544921875,
-0.0190582275390625,
0.06890869140625,
-0.04473876953125,
-0.03546142578125,
-0.033721923828125,
0.037078857421875,
0.0022449493408203125,
-0.0548095703125,
0.04412841796875,
0.0699462890625,
0.06890869140625,
-0.039520263671875,
0.04388427734375,
-0.00446319580078125,
-0.006694793701171875,
-0.034515380859375,
0.061126708984375,
-0.02880859375,
0.005340576171875,
-0.0172119140625,
-0.0667724609375,
-0.00879669189453125,
0.06573486328125,
-0.0207366943359375,
-0.00574493408203125,
0.031341552734375,
0.07177734375,
-0.014312744140625,
-0.0104217529296875,
0.0212554931640625,
0.0168914794921875,
0.023590087890625,
0.05377197265625,
0.068359375,
-0.052703857421875,
0.05303955078125,
-0.0504150390625,
-0.015960693359375,
-0.0215911865234375,
-0.052947998046875,
-0.05950927734375,
-0.0224151611328125,
-0.0394287109375,
-0.0297393798828125,
0.023406982421875,
0.061370849609375,
0.07623291015625,
-0.057586669921875,
-0.042266845703125,
-0.0187530517578125,
0.006317138671875,
-0.04345703125,
-0.022125244140625,
0.026214599609375,
-0.02069091796875,
-0.051055908203125,
0.019622802734375,
-0.00492095947265625,
0.0160980224609375,
-0.0073394775390625,
-0.037200927734375,
-0.04296875,
-0.01114654541015625,
0.0225830078125,
0.0537109375,
-0.048309326171875,
-0.03289794921875,
-0.034393310546875,
-0.0207061767578125,
0.0189361572265625,
0.0255126953125,
-0.054779052734375,
0.01202392578125,
0.033905029296875,
-0.00847625732421875,
0.0718994140625,
-0.005748748779296875,
0.026763916015625,
-0.03289794921875,
0.0210418701171875,
0.02197265625,
0.0284423828125,
-0.00189971923828125,
-0.0099639892578125,
0.039825439453125,
0.03851318359375,
-0.054779052734375,
-0.0506591796875,
0.0014734268188476562,
-0.08294677734375,
-0.019622802734375,
0.06036376953125,
-0.04754638671875,
-0.036712646484375,
-0.00811767578125,
-0.0166015625,
0.028839111328125,
-0.038970947265625,
0.038787841796875,
0.033447265625,
-0.017120361328125,
0.004199981689453125,
-0.050506591796875,
0.0418701171875,
0.01171112060546875,
-0.06280517578125,
-0.0167236328125,
0.00817108154296875,
0.0364990234375,
0.0185394287109375,
0.03533935546875,
-0.0037059783935546875,
0.024932861328125,
0.018707275390625,
0.0175323486328125,
-0.0167999267578125,
-0.021942138671875,
-0.034423828125,
0.020660400390625,
0.00383758544921875,
-0.017578125
]
] |
microsoft/DialoGPT-small | 2023-06-26T19:50:08.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"gpt2",
"text-generation",
"conversational",
"arxiv:1911.00536",
"license:mit",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | conversational | microsoft | null | null | microsoft/DialoGPT-small | 58 | 27,971 | transformers | 2022-03-02T23:29:05 | ---
thumbnail: https://huggingface.co/front/thumbnails/dialogpt.png
tags:
- conversational
license: mit
---
## A State-of-the-Art Large-scale Pretrained Response generation model (DialoGPT)
DialoGPT is a SOTA large-scale pretrained dialogue response generation model for multi-turn conversations.
The [human evaluation results](https://github.com/dreasysnail/Dialogpt_dev#human-evaluation) indicate that the response generated from DialoGPT is comparable to human response quality under a single-turn conversation Turing test.
The model is trained on 147M multi-turn dialogues from Reddit discussion threads.
* Multi-turn generation examples from an interactive environment:
|Role | Response |
|---------|--------|
|User | Does money buy happiness? |
| Bot | Depends how much money you spend on it .|
|User | What is the best way to buy happiness ? |
| Bot | You just have to be a millionaire by your early 20s, then you can be happy . |
|User |This is so difficult ! |
| Bot | You have no idea how hard it is to be a millionaire and happy . There is a reason the rich have a lot of money |
Details about preprocessing, training, and the full DialoGPT pipeline can be found in the [original DialoGPT repository](https://github.com/microsoft/DialoGPT).
ArXiv paper: [https://arxiv.org/abs/1911.00536](https://arxiv.org/abs/1911.00536)
### How to use
Now we are ready to try out how the model works as a chatting partner!
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")
# Let's chat for 5 lines
for step in range(5):
# encode the new user input, add the eos_token and return a tensor in Pytorch
new_user_input_ids = tokenizer.encode(input(">> User:") + tokenizer.eos_token, return_tensors='pt')
# append the new user input tokens to the chat history
bot_input_ids = torch.cat([chat_history_ids, new_user_input_ids], dim=-1) if step > 0 else new_user_input_ids
    # generate a response while limiting the total chat history to 1000 tokens
chat_history_ids = model.generate(bot_input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id)
    # pretty-print the last output tokens from the bot
print("DialoGPT: {}".format(tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)))
```
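Greedy decoding (the default used above) can produce repetitive replies. A common variation, shown here only as an illustration (the parameter values are suggestions, not from the paper), is to sample instead:
```python
# Optional sampling variant for the generate() call above (illustrative values).
chat_history_ids = model.generate(
    bot_input_ids,
    max_length=1000,
    do_sample=True,      # sample instead of greedy decoding
    top_k=50,            # keep only the 50 most likely next tokens
    top_p=0.95,          # nucleus sampling threshold
    pad_token_id=tokenizer.eos_token_id,
)
```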
| 2,417 | [
[
-0.02978515625,
-0.07098388671875,
0.002788543701171875,
0.00815582275390625,
-0.0128173828125,
0.01139068603515625,
-0.0003464221954345703,
-0.014678955078125,
0.0147705078125,
0.032623291015625,
-0.06378173828125,
-0.00939178466796875,
-0.03204345703125,
-0.0035495758056640625,
-0.0141448974609375,
0.0809326171875,
0.0296173095703125,
0.01425933837890625,
-0.0003509521484375,
0.00940704345703125,
-0.037384033203125,
-0.053802490234375,
-0.06427001953125,
-0.0147857666015625,
0.00383758544921875,
0.02105712890625,
0.03857421875,
-0.00261688232421875,
0.024566650390625,
0.036529541015625,
-0.0020885467529296875,
0.008026123046875,
-0.05670166015625,
0.00342559814453125,
0.0162506103515625,
-0.041717529296875,
-0.053070068359375,
0.00798797607421875,
0.0173492431640625,
0.0220489501953125,
0.0004658699035644531,
0.029144287109375,
0.01041412353515625,
0.0238189697265625,
-0.0274200439453125,
0.0191802978515625,
-0.044677734375,
0.0036640167236328125,
0.01276397705078125,
-0.045989990234375,
-0.03369140625,
-0.018096923828125,
0.035614013671875,
-0.040863037109375,
0.02044677734375,
0.0167999267578125,
0.0672607421875,
-0.0028839111328125,
-0.032623291015625,
-0.039886474609375,
-0.03619384765625,
0.057037353515625,
-0.067626953125,
0.0194549560546875,
0.023284912109375,
0.018096923828125,
-0.0379638671875,
-0.0650634765625,
-0.044708251953125,
-0.0214996337890625,
-0.00008189678192138672,
0.01126861572265625,
-0.01763916015625,
0.0260772705078125,
0.0277557373046875,
0.0257568359375,
-0.053375244140625,
-0.020416259765625,
-0.04010009765625,
-0.046051025390625,
0.038848876953125,
0.020050048828125,
0.0166778564453125,
-0.0246124267578125,
-0.03076171875,
-0.00983428955078125,
-0.03094482421875,
0.007274627685546875,
0.03375244140625,
0.0179595947265625,
-0.01544189453125,
0.050811767578125,
-0.02105712890625,
0.0594482421875,
0.0113525390625,
-0.0223388671875,
0.0278778076171875,
-0.039886474609375,
-0.0181732177734375,
-0.0120391845703125,
0.072265625,
0.0389404296875,
0.020294189453125,
0.0193634033203125,
-0.0019512176513671875,
-0.0268707275390625,
-0.004673004150390625,
-0.078857421875,
-0.0144500732421875,
0.0333251953125,
-0.043609619140625,
-0.0242156982421875,
-0.017974853515625,
-0.058502197265625,
-0.012664794921875,
-0.010528564453125,
0.05267333984375,
-0.03826904296875,
-0.0309906005859375,
0.0057830810546875,
-0.0178375244140625,
0.0179901123046875,
0.0254974365234375,
-0.05914306640625,
0.007720947265625,
0.0298614501953125,
0.072265625,
0.0171966552734375,
-0.031097412109375,
-0.040191650390625,
-0.031158447265625,
-0.00191497802734375,
0.041839599609375,
-0.01558685302734375,
-0.0232391357421875,
0.00418853759765625,
-0.00589752197265625,
-0.00888824462890625,
-0.032470703125,
-0.005382537841796875,
-0.036468505859375,
0.05145263671875,
0.01055145263671875,
-0.0540771484375,
-0.00043082237243652344,
0.0262451171875,
-0.0229644775390625,
0.058624267578125,
0.004299163818359375,
-0.06378173828125,
0.0203399658203125,
-0.07135009765625,
-0.01384735107421875,
0.00618743896484375,
-0.0038585662841796875,
-0.018096923828125,
0.0015497207641601562,
-0.0005869865417480469,
0.039093017578125,
-0.0198516845703125,
0.0018529891967773438,
-0.0274658203125,
-0.01215362548828125,
0.04425048828125,
-0.039520263671875,
0.07415771484375,
0.02777099609375,
-0.0215301513671875,
0.031280517578125,
-0.046356201171875,
0.0167999267578125,
0.0155181884765625,
-0.02398681640625,
0.0226593017578125,
-0.0204315185546875,
0.012054443359375,
0.038177490234375,
0.033782958984375,
-0.03997802734375,
0.011688232421875,
-0.0301513671875,
0.06439208984375,
0.063720703125,
0.0014467239379882812,
0.016937255859375,
-0.0224151611328125,
0.03411865234375,
0.0034923553466796875,
0.0176849365234375,
-0.031890869140625,
-0.034332275390625,
-0.0626220703125,
-0.0237579345703125,
0.01267242431640625,
0.038970947265625,
-0.0592041015625,
0.053802490234375,
-0.00626373291015625,
-0.029510498046875,
-0.0302581787109375,
-0.00586700439453125,
0.0166778564453125,
0.038116455078125,
0.0083770751953125,
-0.028778076171875,
-0.05108642578125,
-0.0489501953125,
-0.00556182861328125,
-0.029571533203125,
-0.01369476318359375,
0.0278778076171875,
0.045318603515625,
-0.0038509368896484375,
0.0753173828125,
-0.04681396484375,
-0.00992584228515625,
-0.033111572265625,
0.0286102294921875,
0.006298065185546875,
0.04571533203125,
0.0308837890625,
-0.04583740234375,
-0.033843994140625,
-0.02862548828125,
-0.04046630859375,
0.0177459716796875,
-0.01465606689453125,
-0.019287109375,
0.0156707763671875,
0.03271484375,
-0.055511474609375,
0.041656494140625,
0.037109375,
-0.049163818359375,
0.051910400390625,
-0.01251983642578125,
0.0272369384765625,
-0.099609375,
0.0001575946807861328,
-0.02947998046875,
-0.03485107421875,
-0.04534912109375,
-0.01390838623046875,
-0.0292816162109375,
-0.032958984375,
-0.05126953125,
0.044586181640625,
-0.026153564453125,
-0.0003464221954345703,
-0.0187835693359375,
0.003482818603515625,
-0.0275726318359375,
0.058624267578125,
-0.001129150390625,
0.05657958984375,
0.0426025390625,
-0.030975341796875,
0.051666259765625,
0.0302886962890625,
-0.0116119384765625,
0.04107666015625,
-0.06317138671875,
0.0261993408203125,
0.00751495361328125,
0.0288543701171875,
-0.107666015625,
-0.02862548828125,
0.0118560791015625,
-0.07122802734375,
0.00980377197265625,
-0.0151519775390625,
-0.041351318359375,
-0.036712646484375,
-0.0213165283203125,
0.016510009765625,
0.047821044921875,
-0.026641845703125,
0.045135498046875,
0.0269317626953125,
-0.01061248779296875,
-0.0294189453125,
-0.0302581787109375,
0.00640869140625,
-0.0102386474609375,
-0.06451416015625,
-0.008544921875,
-0.03106689453125,
0.0166778564453125,
-0.029144287109375,
0.00753021240234375,
-0.00946044921875,
-0.0031833648681640625,
0.0187835693359375,
0.032562255859375,
-0.004184722900390625,
0.0012531280517578125,
-0.037261962890625,
-0.019195556640625,
0.00010925531387329102,
-0.00585174560546875,
0.10333251953125,
-0.028289794921875,
-0.016265869140625,
-0.054779052734375,
0.020477294921875,
0.049530029296875,
0.0017490386962890625,
0.04656982421875,
0.050872802734375,
-0.020843505859375,
0.0163116455078125,
-0.04833984375,
-0.047454833984375,
-0.040771484375,
0.050018310546875,
-0.029510498046875,
-0.07318115234375,
0.043914794921875,
0.00183868408203125,
0.0258331298828125,
0.0343017578125,
0.06640625,
-0.0028133392333984375,
0.09442138671875,
0.04046630859375,
0.0011835098266601562,
0.054840087890625,
-0.02996826171875,
0.0152587890625,
-0.040863037109375,
-0.0009717941284179688,
-0.0220184326171875,
-0.01284027099609375,
-0.044891357421875,
-0.01425933837890625,
0.0103607177734375,
0.00043892860412597656,
-0.03497314453125,
0.02838134765625,
-0.03363037109375,
0.01151275634765625,
0.055633544921875,
0.0033626556396484375,
0.007129669189453125,
-0.0062713623046875,
0.005401611328125,
-0.0031604766845703125,
-0.0565185546875,
-0.040130615234375,
0.09259033203125,
0.029327392578125,
0.0521240234375,
-0.01451873779296875,
0.06011962890625,
0.007038116455078125,
0.0081634521484375,
-0.0634765625,
0.054718017578125,
0.03955078125,
-0.070068359375,
-0.03375244140625,
-0.045684814453125,
-0.0723876953125,
0.01023101806640625,
-0.0201263427734375,
-0.0816650390625,
-0.01320648193359375,
0.0302276611328125,
-0.036865234375,
0.01300811767578125,
-0.0703125,
0.07049560546875,
-0.0219268798828125,
-0.019989013671875,
-0.008636474609375,
-0.052886962890625,
0.01544189453125,
0.0166015625,
-0.00926971435546875,
-0.01229095458984375,
0.0220947265625,
0.066650390625,
-0.038726806640625,
0.060028076171875,
-0.017852783203125,
0.0230255126953125,
0.0277099609375,
0.012664794921875,
0.023406982421875,
0.0079498291015625,
0.01947021484375,
-0.0020751953125,
0.0123443603515625,
-0.034637451171875,
-0.022735595703125,
0.039520263671875,
-0.0718994140625,
-0.042083740234375,
-0.02642822265625,
-0.038726806640625,
-0.0108795166015625,
0.030731201171875,
0.050384521484375,
0.036285400390625,
-0.021148681640625,
0.021697998046875,
0.024810791015625,
-0.025848388671875,
0.03717041015625,
0.025115966796875,
-0.020660400390625,
-0.0374755859375,
0.0643310546875,
0.006153106689453125,
0.035125732421875,
0.00572967529296875,
0.003078460693359375,
-0.0234527587890625,
-0.01474761962890625,
-0.0281982421875,
0.005641937255859375,
-0.034271240234375,
-0.0164642333984375,
-0.04693603515625,
-0.034912109375,
-0.04815673828125,
-0.00814056396484375,
-0.045989990234375,
-0.0227813720703125,
-0.0166778564453125,
0.002323150634765625,
0.024871826171875,
0.0286102294921875,
-0.0019292831420898438,
0.02972412109375,
-0.05096435546875,
0.0205841064453125,
0.04730224609375,
0.0083770751953125,
0.0029048919677734375,
-0.039794921875,
0.003276824951171875,
0.02099609375,
-0.039398193359375,
-0.056304931640625,
0.037689208984375,
0.008575439453125,
0.03759765625,
0.032958984375,
0.009063720703125,
0.057342529296875,
-0.020721435546875,
0.07354736328125,
0.038726806640625,
-0.06671142578125,
0.028839111328125,
-0.0129852294921875,
0.0274658203125,
0.032958984375,
0.007762908935546875,
-0.050384521484375,
-0.0217742919921875,
-0.06707763671875,
-0.068603515625,
0.06683349609375,
0.047027587890625,
0.031280517578125,
0.00789642333984375,
0.003246307373046875,
-0.00002086162567138672,
0.036651611328125,
-0.05780029296875,
-0.029144287109375,
-0.0262451171875,
-0.00804901123046875,
0.0028820037841796875,
-0.0220947265625,
-0.00946044921875,
-0.01300811767578125,
0.04632568359375,
-0.00797271728515625,
0.060394287109375,
0.01125335693359375,
-0.00855255126953125,
0.00409698486328125,
0.01367950439453125,
0.04913330078125,
0.0611572265625,
-0.027587890625,
-0.00815582275390625,
0.01245880126953125,
-0.034576416015625,
-0.00368499755859375,
0.01238250732421875,
0.01861572265625,
-0.006679534912109375,
0.0306549072265625,
0.06756591796875,
-0.0064239501953125,
-0.0478515625,
0.05029296875,
-0.031158447265625,
-0.026611328125,
-0.036102294921875,
0.002094268798828125,
0.0125579833984375,
0.0117034912109375,
0.040924072265625,
0.000019729137420654297,
0.00475311279296875,
-0.053619384765625,
0.01071929931640625,
0.03594970703125,
-0.02740478515625,
-0.0253448486328125,
0.045654296875,
0.045013427734375,
-0.048004150390625,
0.063720703125,
-0.009307861328125,
-0.050567626953125,
0.03594970703125,
0.036285400390625,
0.072021484375,
0.002101898193359375,
0.0164794921875,
0.03778076171875,
0.0007348060607910156,
0.0123443603515625,
0.0259246826171875,
-0.013702392578125,
-0.05792236328125,
-0.0141754150390625,
-0.0308380126953125,
-0.016326904296875,
0.024383544921875,
-0.0352783203125,
0.0221710205078125,
-0.034576416015625,
-0.03057861328125,
0.003482818603515625,
0.0018892288208007812,
-0.075927734375,
0.0013151168823242188,
-0.005657196044921875,
0.055419921875,
-0.047088623046875,
0.0268402099609375,
0.034393310546875,
-0.0261077880859375,
-0.04217529296875,
-0.00478363037109375,
0.01122283935546875,
-0.0751953125,
0.03863525390625,
0.0231475830078125,
0.006702423095703125,
0.0185394287109375,
-0.059814453125,
-0.053192138671875,
0.06878662109375,
0.023834228515625,
-0.033782958984375,
-0.0084991455078125,
0.01532745361328125,
0.0289306640625,
-0.02899169921875,
0.052032470703125,
0.032257080078125,
0.00809478759765625,
0.025238037109375,
-0.082275390625,
-0.0004248619079589844,
-0.0225067138671875,
-0.00839996337890625,
-0.0045166015625,
-0.0548095703125,
0.066650390625,
-0.0165557861328125,
-0.0107879638671875,
0.0218048095703125,
0.04437255859375,
0.023834228515625,
0.00406646728515625,
0.0548095703125,
0.0236358642578125,
0.036712646484375,
-0.0164337158203125,
0.060516357421875,
-0.044036865234375,
0.05230712890625,
0.07391357421875,
0.0135955810546875,
0.05145263671875,
0.04144287109375,
-0.01308441162109375,
0.0176239013671875,
0.058837890625,
0.01552581787109375,
0.0263214111328125,
0.018829345703125,
-0.01263427734375,
-0.029937744140625,
0.0025882720947265625,
-0.03302001953125,
0.035736083984375,
0.0122833251953125,
-0.0241851806640625,
-0.00887298583984375,
0.00811004638671875,
0.01325225830078125,
-0.048370361328125,
0.0018444061279296875,
0.068359375,
-0.005352020263671875,
-0.0469970703125,
0.048980712890625,
-0.0205230712890625,
0.06591796875,
-0.060577392578125,
-0.005519866943359375,
-0.00730133056640625,
0.0190887451171875,
-0.00974273681640625,
-0.040771484375,
-0.009857177734375,
-0.01293182373046875,
0.01206207275390625,
-0.0037994384765625,
0.04901123046875,
-0.0271148681640625,
-0.02264404296875,
-0.00146484375,
0.040557861328125,
0.0188751220703125,
0.0021152496337890625,
-0.070068359375,
-0.004337310791015625,
0.019073486328125,
-0.052825927734375,
0.0213470458984375,
0.0181427001953125,
0.0278778076171875,
0.055755615234375,
0.05987548828125,
-0.010986328125,
0.01155853271484375,
-0.01263427734375,
0.0634765625,
-0.045654296875,
-0.0443115234375,
-0.058837890625,
0.054595947265625,
-0.0268402099609375,
-0.058624267578125,
0.05560302734375,
0.043670654296875,
0.057281494140625,
-0.01593017578125,
0.05023193359375,
-0.025390625,
0.025665283203125,
-0.018646240234375,
0.044586181640625,
-0.03594970703125,
-0.0027561187744140625,
-0.0194091796875,
-0.057037353515625,
0.0010519027709960938,
0.06439208984375,
-0.0110626220703125,
0.0161590576171875,
0.035430908203125,
0.06622314453125,
0.008697509765625,
-0.00628662109375,
0.02978515625,
0.0255584716796875,
0.039520263671875,
0.03863525390625,
0.07000732421875,
-0.0305938720703125,
0.059814453125,
-0.00998687744140625,
-0.0303955078125,
-0.033355712890625,
-0.047760009765625,
-0.0909423828125,
-0.052398681640625,
-0.01593017578125,
-0.040771484375,
-0.00982666015625,
0.0999755859375,
0.07568359375,
-0.05010986328125,
-0.031097412109375,
-0.012054443359375,
-0.00826263427734375,
0.0013637542724609375,
-0.0236053466796875,
0.01297760009765625,
-0.03302001953125,
-0.0640869140625,
-0.01007080078125,
0.006404876708984375,
0.027984619140625,
-0.031097412109375,
-0.004608154296875,
-0.0119171142578125,
0.00951385498046875,
0.04541015625,
0.028839111328125,
-0.039764404296875,
-0.0258331298828125,
0.01143646240234375,
-0.01177215576171875,
0.003452301025390625,
0.050750732421875,
-0.0321044921875,
0.053375244140625,
0.05523681640625,
0.0110321044921875,
0.05755615234375,
-0.0141754150390625,
0.059417724609375,
-0.0382080078125,
0.02996826171875,
0.022186279296875,
0.031585693359375,
0.0140228271484375,
-0.019378662109375,
0.021331787109375,
0.0145721435546875,
-0.056549072265625,
-0.060211181640625,
0.015869140625,
-0.06951904296875,
-0.0098876953125,
0.07415771484375,
-0.01873779296875,
-0.01200103759765625,
-0.01131439208984375,
-0.05621337890625,
0.018310546875,
-0.05126953125,
0.06256103515625,
0.050872802734375,
-0.0269927978515625,
-0.00608062744140625,
-0.034942626953125,
0.04510498046875,
0.0235443115234375,
-0.0498046875,
0.004619598388671875,
0.03173828125,
0.0341796875,
0.022247314453125,
0.07183837890625,
0.0015134811401367188,
0.0256805419921875,
0.01104736328125,
0.01512908935546875,
-0.007442474365234375,
-0.0020122528076171875,
0.0030612945556640625,
0.01593017578125,
-0.0030460357666015625,
-0.035736083984375
]
] |
google/owlv2-large-patch14-ensemble | 2023-10-23T09:17:51.000Z | [
"transformers",
"pytorch",
"owlv2",
"zero-shot-object-detection",
"vision",
"object-detection",
"arxiv:2306.09683",
"license:apache-2.0",
"region:us"
] | object-detection | google | null | null | google/owlv2-large-patch14-ensemble | 3 | 27,960 | transformers | 2023-10-13T12:44:06 | ---
license: apache-2.0
tags:
- vision
- object-detection
inference: false
---
# Model Card: OWLv2
## Model Details
The OWLv2 model (short for Open-World Localization) was proposed in [Scaling Open-Vocabulary Object Detection](https://arxiv.org/abs/2306.09683) by Matthias Minderer, Alexey Gritsenko, Neil Houlsby. OWLv2, like OWL-ViT, is a zero-shot text-conditioned object detection model that can be used to query an image with one or multiple text queries.
The model uses CLIP as its multi-modal backbone, with a ViT-like Transformer to get visual features and a causal language model to get the text features. To use CLIP for detection, OWL-ViT removes the final token pooling layer of the vision model and attaches a lightweight classification and box head to each transformer output token. Open-vocabulary classification is enabled by replacing the fixed classification layer weights with the class-name embeddings obtained from the text model. The authors first train CLIP from scratch and fine-tune it end-to-end with the classification and box heads on standard detection datasets using a bipartite matching loss. One or multiple text queries per image can be used to perform zero-shot text-conditioned object detection.
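Schematically, this open-vocabulary classification step reduces to scoring each image token against the embedded text queries rather than against a fixed weight matrix. The toy sketch below (all shapes and numbers are invented for clarity, not taken from the model) shows the idea:
```python
# Toy illustration of class-name-embedding scoring (all shapes are made up).
import torch

num_tokens, dim, num_queries = 576, 768, 2
image_tokens = torch.randn(num_tokens, dim)   # one feature vector per ViT output token
query_embeds = torch.randn(num_queries, dim)  # text embeddings of the query strings

# Replacing a fixed classification layer: logits come from a dot product with the queries.
logits = image_tokens @ query_embeds.t()      # [num_tokens, num_queries]
scores = logits.sigmoid()                     # per-token, per-query detection scores
```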
### Model Date
June 2023
### Model Type
The model uses a CLIP backbone with a ViT-L/14 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss. The CLIP backbone is trained from scratch and fine-tuned together with the box and class prediction heads with an object detection objective.
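For intuition, the contrastive objective mentioned here is the symmetric cross-entropy loss popularized by CLIP; a minimal, self-contained sketch (illustrative only, not OWLv2's training code) is:
```python
# Illustrative CLIP-style symmetric contrastive loss.
import torch
import torch.nn.functional as F

def contrastive_loss(image_embeds, text_embeds, temperature=0.07):
    # Normalize so dot products become cosine similarities.
    image_embeds = F.normalize(image_embeds, dim=-1)
    text_embeds = F.normalize(text_embeds, dim=-1)
    logits = image_embeds @ text_embeds.t() / temperature  # [batch, batch]
    targets = torch.arange(logits.size(0), device=logits.device)
    # Matching (image_i, text_i) pairs lie on the diagonal.
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))
```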
### Documents
- [OWLv2 Paper](https://arxiv.org/abs/2306.09683)
### Use with Transformers
```python
import requests
from PIL import Image
import torch
from transformers import Owlv2Processor, Owlv2ForObjectDetection
processor = Owlv2Processor.from_pretrained("google/owlv2-large-patch14-ensemble")
model = Owlv2ForObjectDetection.from_pretrained("google/owlv2-large-patch14-ensemble")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
texts = [["a photo of a cat", "a photo of a dog"]]
inputs = processor(text=texts, images=image, return_tensors="pt")
outputs = model(**inputs)
# Target image sizes (height, width) to rescale box predictions [batch_size, 2]
target_sizes = torch.Tensor([image.size[::-1]])
# Convert outputs (bounding boxes and class logits) to COCO API
results = processor.post_process_object_detection(outputs=outputs, threshold=0.1, target_sizes=target_sizes)
i = 0 # Retrieve predictions for the first image for the corresponding text queries
text = texts[i]
boxes, scores, labels = results[i]["boxes"], results[i]["scores"], results[i]["labels"]
# Print detected objects and rescaled box coordinates
for box, score, label in zip(boxes, scores, labels):
    box = [round(coord, 2) for coord in box.tolist()]
print(f"Detected {text[label]} with confidence {round(score.item(), 3)} at location {box}")
```
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, text-conditioned object detection. We also hope it can be used for interdisciplinary studies of the potential impact of such models, especially in areas that commonly require identifying objects whose label is unavailable during training.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
## Data
The CLIP backbone of the model was trained on publicly available image-caption data. This was done through a combination of crawling a handful of websites and using commonly-used pre-existing image datasets such as [YFCC100M](http://projects.dfki.uni-kl.de/yfcc100m/). A large portion of the data comes from our crawling of the internet. This means that the data is more representative of people and societies most connected to the internet. The prediction heads of OWL-ViT, along with the CLIP backbone, are fine-tuned on publicly available object detection datasets such as [COCO](https://cocodataset.org/#home) and [OpenImages](https://storage.googleapis.com/openimages/web/index.html).
(to be updated for v2)
### BibTeX entry and citation info
```bibtex
@misc{minderer2023scaling,
title={Scaling Open-Vocabulary Object Detection},
author={Matthias Minderer and Alexey Gritsenko and Neil Houlsby},
year={2023},
eprint={2306.09683},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` | 4,828 | [
[
-0.02520751953125,
-0.05078125,
0.0255889892578125,
-0.0143890380859375,
-0.0211639404296875,
-0.03448486328125,
-0.00363922119140625,
-0.06787109375,
0.0021038055419921875,
0.031341552734375,
-0.02398681640625,
-0.047821044921875,
-0.0474853515625,
0.01375579833984375,
-0.029449462890625,
0.05767822265625,
0.015289306640625,
-0.020843505859375,
-0.031524658203125,
-0.00875091552734375,
-0.031158447265625,
-0.0167694091796875,
-0.044036865234375,
-0.005733489990234375,
0.019378662109375,
0.031463623046875,
0.04681396484375,
0.083984375,
0.06353759765625,
0.019500732421875,
0.0020904541015625,
0.00862884521484375,
-0.031341552734375,
-0.036895751953125,
-0.01025390625,
-0.04473876953125,
-0.02374267578125,
0.006214141845703125,
0.030914306640625,
0.0060577392578125,
0.00814056396484375,
0.00841522216796875,
-0.0031070709228515625,
0.022705078125,
-0.058807373046875,
0.023040771484375,
-0.06317138671875,
0.01535797119140625,
-0.01024627685546875,
-0.01207733154296875,
-0.042572021484375,
0.0227203369140625,
0.01849365234375,
-0.058441162109375,
0.038604736328125,
0.01311492919921875,
0.09429931640625,
0.01708984375,
-0.008209228515625,
-0.01554107666015625,
-0.039306640625,
0.08746337890625,
-0.03411865234375,
0.01026153564453125,
0.0411376953125,
0.0102996826171875,
0.00638580322265625,
-0.0557861328125,
-0.0401611328125,
0.0017347335815429688,
0.0005483627319335938,
0.020050048828125,
-0.032928466796875,
-0.011474609375,
0.008819580078125,
0.0209197998046875,
-0.045074462890625,
0.01033782958984375,
-0.054168701171875,
-0.0191497802734375,
0.046478271484375,
0.006103515625,
0.01666259765625,
0.00086212158203125,
-0.04248046875,
-0.03179931640625,
-0.017425537109375,
0.01629638671875,
0.0183868408203125,
0.017791748046875,
-0.00601959228515625,
0.042816162109375,
-0.0304107666015625,
0.06683349609375,
-0.0012874603271484375,
-0.0255279541015625,
0.0225982666015625,
-0.0151519775390625,
-0.0134735107421875,
-0.00672149658203125,
0.0714111328125,
0.050018310546875,
0.018035888671875,
-0.00872802734375,
0.0026569366455078125,
0.019622802734375,
0.0035572052001953125,
-0.0694580078125,
-0.0200958251953125,
0.02655029296875,
-0.03680419921875,
-0.037200927734375,
0.01708984375,
-0.047637939453125,
0.004207611083984375,
-0.00748443603515625,
0.044342041015625,
-0.0249176025390625,
-0.017425537109375,
0.03656005859375,
-0.0191802978515625,
0.038299560546875,
0.019927978515625,
-0.0489501953125,
0.0031681060791015625,
0.032257080078125,
0.07427978515625,
-0.02056884765625,
-0.0286712646484375,
-0.0333251953125,
-0.0003886222839355469,
-0.0278472900390625,
0.08575439453125,
-0.0401611328125,
-0.020660400390625,
-0.0031986236572265625,
0.0207366943359375,
0.012664794921875,
-0.036834716796875,
0.037628173828125,
-0.0196533203125,
0.0226593017578125,
-0.0010986328125,
-0.019073486328125,
-0.020416259765625,
0.04486083984375,
-0.0301971435546875,
0.089599609375,
-0.005756378173828125,
-0.0814208984375,
0.0301666259765625,
-0.034149169921875,
-0.005573272705078125,
-0.019744873046875,
0.0124664306640625,
-0.05462646484375,
-0.01490020751953125,
0.03460693359375,
0.038909912109375,
-0.0200347900390625,
-0.00836181640625,
-0.04296875,
-0.031524658203125,
0.01195526123046875,
-0.0189361572265625,
0.054443359375,
0.005062103271484375,
0.0021953582763671875,
0.01190948486328125,
-0.052093505859375,
-0.0034351348876953125,
0.030487060546875,
-0.0031280517578125,
-0.017120361328125,
0.0024585723876953125,
0.0026645660400390625,
0.0117340087890625,
0.0162811279296875,
-0.05682373046875,
0.0184783935546875,
-0.030914306640625,
0.031341552734375,
0.03460693359375,
-0.01261138916015625,
0.02423095703125,
-0.01148223876953125,
0.01473236083984375,
0.00916290283203125,
0.0340576171875,
-0.031707763671875,
-0.0567626953125,
-0.050048828125,
-0.0136871337890625,
-0.00937652587890625,
0.046142578125,
-0.036895751953125,
0.0239105224609375,
-0.0110015869140625,
-0.0345458984375,
-0.0193939208984375,
-0.01806640625,
0.0263824462890625,
0.044097900390625,
0.0465087890625,
-0.026947021484375,
-0.034942626953125,
-0.06866455078125,
0.00380706787109375,
-0.0058135986328125,
-0.0193939208984375,
0.01763916015625,
0.05926513671875,
-0.0130767822265625,
0.09368896484375,
-0.0692138671875,
-0.054656982421875,
0.004947662353515625,
0.001644134521484375,
0.002765655517578125,
0.03326416015625,
0.04046630859375,
-0.06396484375,
-0.034912109375,
-0.005428314208984375,
-0.07269287109375,
0.0230255126953125,
0.0240020751953125,
-0.0083160400390625,
0.0091400146484375,
0.0350341796875,
-0.032012939453125,
0.05902099609375,
0.017547607421875,
-0.013946533203125,
0.042144775390625,
-0.0114288330078125,
-0.01219940185546875,
-0.08343505859375,
-0.0022907257080078125,
0.00585174560546875,
-0.00998687744140625,
-0.0389404296875,
0.004955291748046875,
-0.000006258487701416016,
-0.017303466796875,
-0.051177978515625,
0.03863525390625,
-0.0394287109375,
-0.017242431640625,
-0.029388427734375,
0.0193023681640625,
0.01428985595703125,
0.04461669921875,
0.038543701171875,
0.050872802734375,
0.06292724609375,
-0.0399169921875,
0.0167083740234375,
0.02349853515625,
-0.016448974609375,
0.03985595703125,
-0.06622314453125,
0.0214385986328125,
-0.01219940185546875,
0.0093841552734375,
-0.07147216796875,
-0.02197265625,
0.019439697265625,
-0.05157470703125,
0.0174560546875,
-0.01071929931640625,
-0.0243988037109375,
-0.051422119140625,
-0.04095458984375,
0.037078857421875,
0.030120849609375,
-0.038299560546875,
0.024322509765625,
0.02239990234375,
0.047088623046875,
-0.061431884765625,
-0.069091796875,
-0.0034580230712890625,
-0.006744384765625,
-0.0489501953125,
0.028717041015625,
-0.0029544830322265625,
0.00518035888671875,
0.0218658447265625,
0.0002105236053466797,
-0.01554107666015625,
-0.01247406005859375,
0.01380157470703125,
0.03131103515625,
-0.0162353515625,
0.00652313232421875,
-0.032562255859375,
-0.020843505859375,
-0.0043182373046875,
-0.044891357421875,
0.0304107666015625,
-0.0168914794921875,
-0.01397705078125,
-0.05230712890625,
0.0017404556274414062,
0.031768798828125,
-0.038787841796875,
0.049530029296875,
0.06854248046875,
-0.040863037109375,
0.00848388671875,
-0.031585693359375,
-0.0263671875,
-0.032073974609375,
0.0401611328125,
-0.017364501953125,
-0.039337158203125,
0.034515380859375,
0.032073974609375,
-0.002147674560546875,
0.044464111328125,
0.029876708984375,
0.0077972412109375,
0.0709228515625,
0.054840087890625,
0.00848388671875,
0.0361328125,
-0.072265625,
0.01097869873046875,
-0.0833740234375,
-0.028106689453125,
-0.029022216796875,
-0.0029315948486328125,
-0.029052734375,
-0.06683349609375,
0.0160675048828125,
0.0148468017578125,
0.00511932373046875,
0.036468505859375,
-0.0799560546875,
0.03778076171875,
0.0350341796875,
0.036834716796875,
0.0185394287109375,
0.010284423828125,
0.01219940185546875,
0.01458740234375,
-0.040771484375,
-0.02978515625,
0.10223388671875,
0.02752685546875,
0.04547119140625,
-0.029083251953125,
0.031890869140625,
0.0147247314453125,
0.00659942626953125,
-0.07574462890625,
0.03961181640625,
-0.0188446044921875,
-0.05133056640625,
-0.029388427734375,
0.00734710693359375,
-0.08685302734375,
0.0272064208984375,
-0.0193939208984375,
-0.0792236328125,
0.0267181396484375,
0.0171051025390625,
-0.01480865478515625,
0.040191650390625,
-0.039306640625,
0.07244873046875,
0.00848388671875,
-0.048492431640625,
0.00531005859375,
-0.036590576171875,
0.029541015625,
0.017333984375,
-0.002685546875,
-0.0255126953125,
-0.004608154296875,
0.06488037109375,
-0.0291748046875,
0.058441162109375,
0.0068359375,
0.0157623291015625,
0.059906005859375,
-0.01468658447265625,
0.04132080078125,
-0.005229949951171875,
0.009124755859375,
0.037200927734375,
0.0037059783935546875,
-0.038299560546875,
-0.017333984375,
0.028350830078125,
-0.062103271484375,
-0.03350830078125,
-0.04461669921875,
-0.04547119140625,
0.023468017578125,
0.02056884765625,
0.059478759765625,
0.045562744140625,
0.01316070556640625,
0.041015625,
0.049072265625,
-0.020751953125,
0.0307464599609375,
0.02587890625,
-0.0299530029296875,
-0.01849365234375,
0.07183837890625,
0.00899505615234375,
0.00653839111328125,
0.047454833984375,
0.023834228515625,
-0.0272369384765625,
-0.033935546875,
-0.007228851318359375,
0.0152740478515625,
-0.047027587890625,
-0.0367431640625,
-0.05755615234375,
-0.007274627685546875,
-0.039276123046875,
-0.01081085205078125,
-0.0389404296875,
-0.0061187744140625,
-0.05682373046875,
0.00225830078125,
0.032257080078125,
0.03857421875,
-0.005889892578125,
0.025970458984375,
-0.0159759521484375,
0.01763916015625,
0.028839111328125,
0.0126800537109375,
-0.0021190643310546875,
-0.0360107421875,
-0.01202392578125,
0.006267547607421875,
-0.034698486328125,
-0.04632568359375,
0.033660888671875,
0.0054168701171875,
0.0163421630859375,
0.051849365234375,
0.01068115234375,
0.046142578125,
-0.00844573974609375,
0.058624267578125,
0.02459716796875,
-0.0439453125,
0.053436279296875,
-0.025115966796875,
0.023223876953125,
0.006023406982421875,
0.00984954833984375,
-0.0125579833984375,
-0.0235137939453125,
-0.0276336669921875,
-0.057769775390625,
0.07928466796875,
0.02105712890625,
-0.00829315185546875,
-0.0012369155883789062,
0.020599365234375,
-0.016510009765625,
-0.013885498046875,
-0.057952880859375,
-0.00043845176696777344,
-0.0289306640625,
-0.0102081298828125,
0.0007081031799316406,
-0.02020263671875,
0.0173492431640625,
-0.01751708984375,
0.03533935546875,
-0.0212860107421875,
0.0570068359375,
0.044219970703125,
-0.0015535354614257812,
0.0010547637939453125,
-0.01983642578125,
0.039581298828125,
0.034454345703125,
-0.036346435546875,
-0.004718780517578125,
0.01328277587890625,
-0.0506591796875,
-0.01087188720703125,
-0.0203094482421875,
-0.01800537109375,
-0.01139068603515625,
0.047515869140625,
0.06634521484375,
0.0043792724609375,
-0.03570556640625,
0.04473876953125,
0.0016584396362304688,
-0.0213470458984375,
-0.037567138671875,
0.0085296630859375,
-0.0177459716796875,
0.002437591552734375,
0.037139892578125,
0.013763427734375,
0.020843505859375,
-0.0404052734375,
0.016754150390625,
0.037078857421875,
-0.038360595703125,
-0.0251922607421875,
0.0675048828125,
-0.0114288330078125,
-0.025634765625,
0.042877197265625,
-0.017547607421875,
-0.0389404296875,
0.078125,
0.043548583984375,
0.059783935546875,
0.0037746429443359375,
0.007083892822265625,
0.06365966796875,
0.024444580078125,
-0.023284912109375,
-0.0092620849609375,
0.0009975433349609375,
-0.07000732421875,
-0.00571441650390625,
-0.048553466796875,
-0.0195770263671875,
0.01366424560546875,
-0.06280517578125,
0.04168701171875,
-0.023590087890625,
-0.020721435546875,
-0.00025272369384765625,
-0.00023055076599121094,
-0.065185546875,
0.017486572265625,
0.003520965576171875,
0.07757568359375,
-0.05230712890625,
0.060638427734375,
0.042877197265625,
-0.041839599609375,
-0.051025390625,
0.00159454345703125,
-0.005138397216796875,
-0.07855224609375,
0.037261962890625,
0.055816650390625,
-0.0103302001953125,
0.00910186767578125,
-0.056365966796875,
-0.074462890625,
0.09503173828125,
0.0007300376892089844,
-0.017822265625,
-0.007282257080078125,
-0.0035610198974609375,
0.0340576171875,
-0.033355712890625,
0.038116455078125,
0.0187835693359375,
0.0223388671875,
0.02398681640625,
-0.04376220703125,
-0.0191497802734375,
-0.0015363693237304688,
0.0081024169921875,
0.00363922119140625,
-0.063720703125,
0.080810546875,
-0.0310821533203125,
-0.016510009765625,
0.0102996826171875,
0.047882080078125,
0.005126953125,
0.03521728515625,
0.02777099609375,
0.05963134765625,
0.03076171875,
-0.014556884765625,
0.07745361328125,
-0.0131988525390625,
0.054107666015625,
0.06817626953125,
0.003307342529296875,
0.0845947265625,
0.0202484130859375,
-0.0182037353515625,
0.0279541015625,
0.050048828125,
-0.029876708984375,
0.0389404296875,
-0.01526641845703125,
0.018646240234375,
-0.0104217529296875,
-0.0195465087890625,
-0.0264739990234375,
0.059600830078125,
0.0194549560546875,
-0.0158843994140625,
-0.009613037109375,
0.0260162353515625,
-0.0018320083618164062,
-0.00592803955078125,
-0.018341064453125,
0.04168701171875,
-0.0243682861328125,
-0.037139892578125,
0.043731689453125,
0.0038738250732421875,
0.06402587890625,
-0.031463623046875,
-0.0005726814270019531,
-0.00323486328125,
0.02935791015625,
-0.02197265625,
-0.07440185546875,
0.02978515625,
-0.00396728515625,
-0.0184783935546875,
0.01366424560546875,
0.057952880859375,
-0.0254364013671875,
-0.049346923828125,
0.035400390625,
-0.01715087890625,
0.045074462890625,
-0.0240020751953125,
-0.0458984375,
0.0270538330078125,
-0.00012731552124023438,
-0.0164031982421875,
0.01116943359375,
0.0211944580078125,
-0.00569915771484375,
0.050445556640625,
0.0570068359375,
-0.01509857177734375,
0.0084991455078125,
-0.04541015625,
0.052825927734375,
-0.026641845703125,
-0.027008056640625,
-0.04827880859375,
0.03558349609375,
-0.0169830322265625,
-0.029266357421875,
0.043304443359375,
0.061981201171875,
0.0816650390625,
-0.0194091796875,
0.028717041015625,
-0.009857177734375,
0.0187835693359375,
-0.0328369140625,
0.0249481201171875,
-0.0587158203125,
-0.007488250732421875,
-0.004924774169921875,
-0.06451416015625,
-0.026824951171875,
0.057037353515625,
-0.0307464599609375,
-0.01328277587890625,
0.0474853515625,
0.0721435546875,
-0.01332855224609375,
-0.038116455078125,
0.041259765625,
0.01104736328125,
0.020263671875,
0.0469970703125,
0.0299530029296875,
-0.068359375,
0.07342529296875,
-0.0430908203125,
-0.01739501953125,
-0.0340576171875,
-0.059539794921875,
-0.0777587890625,
-0.04779052734375,
-0.039947509765625,
-0.0123291015625,
-0.0084991455078125,
0.031524658203125,
0.08026123046875,
-0.05572509765625,
-0.0072784423828125,
-0.01226806640625,
0.005519866943359375,
-0.01114654541015625,
-0.02288818359375,
0.039276123046875,
-0.0047149658203125,
-0.0711669921875,
-0.00408172607421875,
0.033233642578125,
0.00557708740234375,
-0.01568603515625,
-0.0019550323486328125,
-0.0233154296875,
0.00440216064453125,
0.05120849609375,
0.03167724609375,
-0.07464599609375,
-0.032989501953125,
0.0127105712890625,
0.0020046234130859375,
0.0235748291015625,
0.0254364013671875,
-0.059600830078125,
0.057220458984375,
0.032196044921875,
0.00571441650390625,
0.054473876953125,
-0.0007371902465820312,
-0.011871337890625,
-0.03741455078125,
0.05755615234375,
-0.0014181137084960938,
0.025726318359375,
0.0278167724609375,
-0.036346435546875,
0.047637939453125,
0.03857421875,
-0.04132080078125,
-0.07012939453125,
0.007450103759765625,
-0.08685302734375,
-0.02618408203125,
0.07080078125,
-0.0226593017578125,
-0.049072265625,
0.006988525390625,
-0.0259552001953125,
0.0201416015625,
-0.0291748046875,
0.047088623046875,
0.036651611328125,
-0.00592041015625,
-0.027557373046875,
-0.0272979736328125,
0.01161956787109375,
-0.01611328125,
-0.045928955078125,
-0.0205841064453125,
0.01229095458984375,
0.031097412109375,
0.0399169921875,
0.036895751953125,
-0.008697509765625,
0.01114654541015625,
0.0220184326171875,
0.020843505859375,
-0.0219573974609375,
-0.03472900390625,
0.000213623046875,
0.0084686279296875,
-0.0116424560546875,
-0.057220458984375
]
] |
amazon/FalconLite2 | 2023-11-02T13:14:33.000Z | [
"transformers",
"RefinedWeb",
"text-generation",
"custom_code",
"license:apache-2.0",
"text-generation-inference",
"region:us"
] | text-generation | amazon | null | null | amazon/FalconLite2 | 45 | 27,957 | transformers | 2023-10-04T06:53:57 | ---
license: apache-2.0
inference: false
---
# FalconLite2 Model
FalconLite2 is a fine-tuned and quantized [Falcon 40B](https://huggingface.co/tiiuae/falcon-40b) language model, capable of processing long (up to 24K tokens) input sequences. By utilizing 4-bit [GPTQ quantization](https://github.com/PanQiWei/AutoGPTQ) and an adapted RotaryEmbedding, FalconLite2 is able to process 10x longer contexts while consuming 4x less GPU memory than the original model. FalconLite2 is useful for applications such as topic retrieval, summarization, and question answering. FalconLite2 can be deployed on a single AWS `g5.12x` instance with [TGI 1.0.3](https://github.com/huggingface/text-generation-inference/tree/v1.0.3) or [TGI 1.1.0](https://github.com/huggingface/text-generation-inference/tree/v1.1.0), making it suitable for applications that require high performance in resource-constrained environments. You can also deploy FalconLite2 directly on SageMaker endpoints.
FalconLite2 evolves from [FalconLite](https://huggingface.co/amazon/FalconLite), and their similarities and differences are summarized below:
|Model|Fine-tuned on long contexts| Quantization | Max context length| RotaryEmbedding adaptation| Inference framework|
|----------|-------------:|-------------:|------------:|-----------:|-----------:|
| FalconLite | No | 4-bit GPTQ |12K | [dNTK](https://www.reddit.com/r/LocalLLaMA/comments/14mrgpr/dynamically_scaled_rope_further_increases/) | TGI 0.9.2 |
| FalconLite2 | Yes | 4-bit GPTQ |24K | rope_theta = 1000000 | TGI 1.0.3 & 1.1.0 |
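To illustrate the `rope_theta` adaptation in the table above, below is a minimal sketch of how the rotary-embedding base changes the positional frequencies; the `head_dim` and context length used here are illustrative, not FalconLite2's exact configuration:
```python
import torch

def rope_angles(head_dim: int, rope_theta: float, max_pos: int) -> torch.Tensor:
    # Per-dimension inverse frequencies; a larger rope_theta stretches the
    # wavelengths, so positional phases repeat more slowly over long contexts.
    inv_freq = 1.0 / (rope_theta ** (torch.arange(0, head_dim, 2).float() / head_dim))
    positions = torch.arange(max_pos).float()
    return torch.outer(positions, inv_freq)  # shape: (max_pos, head_dim // 2)

base = rope_angles(64, 10_000.0, 24_576)        # conventional RoPE base
adapted = rope_angles(64, 1_000_000.0, 24_576)  # rope_theta = 1000000, as above
```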
## Model Details
- **Developed by:** [AWS Contributors](https://github.com/orgs/aws-samples/teams/aws-prototype-ml-apac)
- **Model type:** [Falcon 40B](https://huggingface.co/tiiuae/falcon-40b)
- **Language:** English
- **Finetuned from weights:** [Falcon 40B SFT OASST-TOP1 model](https://huggingface.co/OpenAssistant/falcon-40b-sft-top1-560)
- **Finetuned on data:**
  - [SLiding-Encoder and Decoder (SLED)](https://huggingface.co/datasets/tau/sled)
- [(Long) Natural Questions (NQ)](https://huggingface.co/datasets/togethercomputer/Long-Data-Collections#multi-passage-qa-from-natural-questions)
- [OpenAssistant Conversations Dataset (OASST1)](https://huggingface.co/datasets/OpenAssistant/oasst1)
- **Served using framework:** [Text-Generation-Inference 1.0.3](https://github.com/huggingface/text-generation-inference/tree/v1.0.3)
- **Model License:** Apache 2.0
- **Contact:** [GitHub issues](https://github.com/awslabs/extending-the-context-length-of-open-source-llms/issues)
## Deploy FalconLite2 on EC2 ##
SSH into an AWS `g5.12x` instance with the [Deep Learning AMI](https://aws.amazon.com/releasenotes/aws-deep-learning-ami-gpu-pytorch-2-0-ubuntu-20-04/).
### Start TGI server-1.0.3
```bash
git clone https://github.com/awslabs/extending-the-context-length-of-open-source-llms.git falconlite-dev
cd falconlite-dev/falconlite2
# this may take a while to build updated vLLM CUDA kernels
./docker_build.sh
./start_falconlite.sh
```
### Start TGI server-1.1.0
```bash
git clone https://github.com/awslabs/extending-the-context-length-of-open-source-llms.git falconlite-dev
cd falconlite-dev/falconlite2-tgi1.1.0
# this may take a while to build updated vLLM CUDA kernels
./docker_build_rebuild_vllm_rope-theta.sh
./start_falconlite.sh
```
### Perform inference
```bash
# after FalconLite2 has been fully started
pip install -r ../script/requirements-client.txt
# test short context
python falconlite_client.py
# test a long context of 13400 tokens,
# copied from [Amazon Aurora FAQs](https://aws.amazon.com/rds/aurora/faqs/)
python falconlite_client.py -l
```
**Important** - Use the prompt template below for FalconLite2:
```
<|prompter|>What are the main challenges to support a long context for LLM?<|endoftext|><|assistant|>
```
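As an alternative to `falconlite_client.py`, a running TGI server can also be queried directly over HTTP. The sketch below assumes a default server address of `localhost:8080` — adjust it to match your `start_falconlite.sh` configuration:
```python
import requests

# Assumed endpoint; change host/port to match your TGI server configuration.
TGI_ENDPOINT = "http://localhost:8080/generate"

prompt = ("<|prompter|>What are the main challenges to support "
          "a long context for LLM?<|endoftext|><|assistant|>")
payload = {"inputs": prompt, "parameters": {"max_new_tokens": 256}}

response = requests.post(TGI_ENDPOINT, json=payload, timeout=120)
response.raise_for_status()
print(response.json()["generated_text"])
```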
**Important** - When using FalconLite2 for inference for the first time, a brief 'warm-up' period of up to tens of seconds may be required. Subsequent inferences will be faster and return results in a more timely manner. This warm-up is normal and does not affect overall system performance once initialization has completed.
## Deploy FalconLite2 on Amazon SageMaker ##
To deploy FalconLite2 on a SageMaker endpoint with TGI-1.0.3, please follow [this notebook](https://github.com/awslabs/extending-the-context-length-of-open-source-llms/blob/main/falconlite2/sm_deploy.ipynb) running on a SageMaker Notebook instance (e.g. `g5.xlarge`).
To deploy FalconLite2 on a SageMaker endpoint with TGI-1.1.0, please follow [this notebook](https://github.com/awslabs/extending-the-context-length-of-open-source-llms/blob/main/falconlite2-tgi1.1.0/sm_deploy.ipynb) running on a SageMaker Notebook instance (e.g. `g5.xlarge`).
## Evaluation Results ##
We evaluated FalconLite2 against benchmarks that are specifically designed to assess the capabilities of LLMs in handling longer contexts.
### Accuracy ###
|Eval task| Input length 2851 | Input length 5568 | Input length 8313 | Input length 11044 | Input length 13780 |
|----------|-------------:|-------------:|------------:|-----------:|-----------:|
| [Topic Retrieval](https://lmsys.org/blog/2023-06-29-longchat/) | 100% | 100% | 100% | 100% | 90% |

|Eval task| Input length 3818 | Input length 5661 | Input length 7505 | Input length 9354 | Input length 11188 | Input length 12657 |
|----------|-------------:|-------------:|------------:|-----------:|-----------:|-----------:|
| [Line Retrieval](https://lmsys.org/blog/2023-06-29-longchat/#longeval-results) | 84% | 82% | 66% | 56% | 62% | 34% |

|Eval task| Input length 3264 | Input length 5396 | Input length 8329 | Input length 10197 |
|----------|-------------:|-------------:|------------:|-----------:|
| [Pass key Retrieval](https://github.com/epfml/landmark-attention/blob/main/llama/run_test.py#L101) | 100% | 100% | 100% | 100% |
|Eval task| Test set Accuracy | Hard subset Accuracy|
|----------|-------------:|-------------:|
| [Question Answering with Long Input Texts](https://nyu-mll.github.io/quality/) | 53.4% | 45.4% |
## Limitations ##
Before using the FalconLite2 model, perform your own independent assessment and take measures to ensure that your use complies with your own quality-control practices and standards, as well as with the local rules, laws, regulations, licenses, and terms that apply to you and your content.
[
-0.035308837890625,
-0.071044921875,
0.04840087890625,
0.0210113525390625,
-0.00225830078125,
0.0004439353942871094,
-0.0099945068359375,
-0.037506103515625,
0.016204833984375,
0.02886962890625,
-0.039703369140625,
-0.048492431640625,
-0.03082275390625,
0.0001735687255859375,
-0.02203369140625,
0.07177734375,
-0.00505828857421875,
-0.019195556640625,
-0.007549285888671875,
-0.0099029541015625,
-0.023895263671875,
-0.02972412109375,
-0.06854248046875,
-0.005710601806640625,
0.01934814453125,
0.03399658203125,
0.03814697265625,
0.0277557373046875,
0.0400390625,
0.01513671875,
-0.023834228515625,
0.021453857421875,
-0.033111572265625,
0.001407623291015625,
-0.00226593017578125,
-0.0187530517578125,
-0.0419921875,
-0.018585205078125,
0.05877685546875,
0.037872314453125,
-0.0006399154663085938,
0.02166748046875,
0.00868988037109375,
0.06121826171875,
-0.041717529296875,
0.0137481689453125,
-0.0198974609375,
0.0072174072265625,
-0.017333984375,
-0.0011539459228515625,
-0.0159759521484375,
0.00975799560546875,
-0.0025787353515625,
-0.049713134765625,
0.002758026123046875,
0.0204925537109375,
0.06890869140625,
0.02227783203125,
-0.037933349609375,
-0.0267181396484375,
-0.02545166015625,
0.07855224609375,
-0.06256103515625,
0.03350830078125,
0.03204345703125,
0.02398681640625,
-0.0113372802734375,
-0.07843017578125,
-0.02130126953125,
-0.01186370849609375,
-0.020904541015625,
0.023193359375,
0.003192901611328125,
0.00664520263671875,
0.031890869140625,
0.022308349609375,
-0.0440673828125,
0.00894927978515625,
-0.0294952392578125,
-0.004367828369140625,
0.055145263671875,
0.0184173583984375,
-0.003387451171875,
-0.031982421875,
-0.0282440185546875,
-0.001190185546875,
-0.04766845703125,
0.04254150390625,
0.00489044189453125,
0.01160430908203125,
-0.0080413818359375,
0.033233642578125,
-0.03253173828125,
0.0308837890625,
0.0179595947265625,
-0.004077911376953125,
0.0254058837890625,
-0.042938232421875,
-0.02880859375,
-0.0021495819091796875,
0.0743408203125,
0.0267181396484375,
-0.01226806640625,
0.0014886856079101562,
-0.0220947265625,
-0.0205230712890625,
0.005634307861328125,
-0.08837890625,
0.01171112060546875,
0.042694091796875,
-0.032470703125,
-0.021392822265625,
0.00395965576171875,
-0.059783935546875,
-0.01224517822265625,
0.0173187255859375,
0.0178070068359375,
-0.029022216796875,
-0.00814056396484375,
0.007843017578125,
-0.0009984970092773438,
0.01078033447265625,
0.01116180419921875,
-0.07696533203125,
0.03778076171875,
0.06494140625,
0.0753173828125,
-0.0093994140625,
-0.0188446044921875,
-0.05267333984375,
-0.00981903076171875,
-0.0316162109375,
0.033355712890625,
-0.0180511474609375,
-0.040069580078125,
-0.0109710693359375,
0.0249481201171875,
-0.0051422119140625,
-0.0248565673828125,
0.057586669921875,
-0.031707763671875,
0.028961181640625,
-0.033111572265625,
-0.055938720703125,
-0.0156707763671875,
-0.0036716461181640625,
-0.03515625,
0.09991455078125,
0.0259857177734375,
-0.05615234375,
0.0114593505859375,
-0.06561279296875,
-0.01849365234375,
0.00875091552734375,
-0.01206207275390625,
-0.041778564453125,
-0.00978851318359375,
0.0187835693359375,
0.0458984375,
-0.020599365234375,
0.026885986328125,
-0.015899658203125,
-0.056121826171875,
0.0087127685546875,
-0.0016202926635742188,
0.062744140625,
0.031402587890625,
-0.041259765625,
0.01297760009765625,
-0.03131103515625,
0.011260986328125,
0.007167816162109375,
-0.014251708984375,
0.007080078125,
0.01055908203125,
0.011749267578125,
-0.0015048980712890625,
0.0228271484375,
-0.038299560546875,
0.005374908447265625,
-0.03265380859375,
0.056243896484375,
0.038665771484375,
0.01128387451171875,
0.032379150390625,
-0.039459228515625,
0.0267333984375,
0.004138946533203125,
0.014617919921875,
-0.0172576904296875,
-0.033935546875,
-0.07806396484375,
-0.01262664794921875,
0.00032210350036621094,
0.042755126953125,
-0.040283203125,
0.036285400390625,
-0.0265960693359375,
-0.041717529296875,
-0.0323486328125,
0.01629638671875,
0.039825439453125,
0.051666259765625,
0.0518798828125,
-0.0011663436889648438,
-0.036163330078125,
-0.07330322265625,
-0.003704071044921875,
-0.00994873046875,
-0.0012645721435546875,
0.04913330078125,
0.055938720703125,
-0.031341552734375,
0.062744140625,
-0.0364990234375,
-0.0263519287109375,
-0.006748199462890625,
-0.00850677490234375,
0.0377197265625,
0.03350830078125,
0.0609130859375,
-0.048126220703125,
-0.041656494140625,
-0.0036754608154296875,
-0.06146240234375,
0.0028076171875,
-0.006801605224609375,
-0.005603790283203125,
0.047943115234375,
0.03948974609375,
-0.0546875,
0.0165557861328125,
0.0309906005859375,
-0.040679931640625,
0.04083251953125,
-0.0033721923828125,
0.0176849365234375,
-0.095458984375,
0.0252227783203125,
0.0076446533203125,
-0.01428985595703125,
-0.04559326171875,
0.022552490234375,
0.0251312255859375,
0.00543212890625,
-0.04656982421875,
0.057586669921875,
-0.0280303955078125,
-0.00508880615234375,
0.0011968612670898438,
-0.001739501953125,
0.00978851318359375,
0.031280517578125,
-0.0026874542236328125,
0.10308837890625,
0.031707763671875,
-0.04345703125,
0.0183258056640625,
0.0170135498046875,
-0.03662109375,
0.022430419921875,
-0.06884765625,
0.0039520263671875,
0.005512237548828125,
0.0294036865234375,
-0.0692138671875,
-0.019256591796875,
0.0111846923828125,
-0.03424072265625,
0.01214599609375,
-0.008514404296875,
-0.0264434814453125,
-0.03387451171875,
-0.04083251953125,
0.01409149169921875,
0.06109619140625,
-0.040771484375,
0.0236358642578125,
0.006378173828125,
-0.00958251953125,
-0.054595947265625,
-0.0618896484375,
0.0096282958984375,
-0.033050537109375,
-0.06536865234375,
0.04803466796875,
-0.015228271484375,
-0.0183563232421875,
-0.0118560791015625,
-0.0024356842041015625,
-0.00228118896484375,
0.0196533203125,
0.031951904296875,
0.0097808837890625,
-0.022613525390625,
0.015045166015625,
0.01526641845703125,
-0.0055694580078125,
0.0038604736328125,
-0.0091552734375,
0.03759765625,
-0.045867919921875,
-0.0128021240234375,
-0.036834716796875,
0.0182037353515625,
0.04180908203125,
-0.0092315673828125,
0.05126953125,
0.0455322265625,
-0.02166748046875,
-0.00525665283203125,
-0.05126953125,
-0.014373779296875,
-0.042755126953125,
0.03985595703125,
-0.0316162109375,
-0.07696533203125,
0.0452880859375,
0.010009765625,
0.01904296875,
0.051605224609375,
0.02215576171875,
-0.019256591796875,
0.058807373046875,
0.0247344970703125,
-0.0006775856018066406,
0.0343017578125,
-0.040496826171875,
0.00707244873046875,
-0.09375,
-0.0147247314453125,
-0.0250396728515625,
-0.012237548828125,
-0.034149169921875,
-0.03167724609375,
0.0186920166015625,
0.016876220703125,
-0.03057861328125,
0.0232391357421875,
-0.038055419921875,
0.0194549560546875,
0.0258026123046875,
-0.000004887580871582031,
0.006595611572265625,
-0.00553131103515625,
-0.007450103759765625,
-0.00429534912109375,
-0.041259765625,
-0.04010009765625,
0.08935546875,
0.0280303955078125,
0.0345458984375,
0.00807952880859375,
0.0672607421875,
0.0131988525390625,
0.00994873046875,
-0.059234619140625,
0.046173095703125,
-0.004932403564453125,
-0.043609619140625,
-0.022064208984375,
-0.02978515625,
-0.06390380859375,
0.01088714599609375,
-0.0009350776672363281,
-0.061370849609375,
-0.006702423095703125,
-0.0162506103515625,
-0.03924560546875,
0.01812744140625,
-0.0469970703125,
0.05572509765625,
-0.022552490234375,
-0.044677734375,
0.00243377685546875,
-0.041656494140625,
0.0240631103515625,
-0.0208282470703125,
0.0212554931640625,
-0.00231170654296875,
-0.0209808349609375,
0.0748291015625,
-0.042694091796875,
0.041534423828125,
-0.013702392578125,
0.00890350341796875,
0.0347900390625,
-0.00701904296875,
0.045989990234375,
0.023895263671875,
-0.022186279296875,
0.017181396484375,
0.0225372314453125,
-0.0322265625,
-0.041015625,
0.076171875,
-0.060791015625,
-0.060028076171875,
-0.03875732421875,
-0.03131103515625,
0.003955841064453125,
0.0115203857421875,
0.017547607421875,
0.01727294921875,
-0.009735107421875,
0.036834716796875,
0.045440673828125,
-0.01294708251953125,
0.05694580078125,
0.025146484375,
-0.0308837890625,
-0.0278472900390625,
0.04901123046875,
0.0167694091796875,
0.0132904052734375,
0.017120361328125,
-0.0024394989013671875,
-0.0271759033203125,
-0.04632568359375,
-0.043792724609375,
0.0196380615234375,
-0.042999267578125,
-0.01253509521484375,
-0.056732177734375,
-0.03375244140625,
-0.0308990478515625,
-0.02166748046875,
-0.029998779296875,
-0.039794921875,
-0.05316162109375,
-0.01064300537109375,
0.041351318359375,
0.033843994140625,
0.020294189453125,
0.0178680419921875,
-0.05328369140625,
-0.0013523101806640625,
0.01262664794921875,
0.0133819580078125,
-0.00975799560546875,
-0.053375244140625,
-0.017669677734375,
0.0230560302734375,
-0.0255584716796875,
-0.047454833984375,
0.03106689453125,
0.0212249755859375,
0.030059814453125,
0.026611328125,
0.002841949462890625,
0.059906005859375,
-0.0224151611328125,
0.08502197265625,
0.0007343292236328125,
-0.068359375,
0.04638671875,
-0.0308990478515625,
0.01507568359375,
0.051605224609375,
0.033355712890625,
-0.052032470703125,
-0.006969451904296875,
-0.045257568359375,
-0.062347412109375,
0.044189453125,
0.0284271240234375,
-0.00201416015625,
-0.003604888916015625,
0.03857421875,
-0.01033782958984375,
-0.0048675537109375,
-0.03436279296875,
-0.01300811767578125,
-0.01474761962890625,
-0.011749267578125,
-0.025848388671875,
-0.02606201171875,
-0.00684356689453125,
-0.01885986328125,
0.0545654296875,
-0.0136871337890625,
0.030548095703125,
0.0274505615234375,
-0.01323699951171875,
-0.00017845630645751953,
0.01499176025390625,
0.04876708984375,
0.056488037109375,
-0.03302001953125,
-0.003971099853515625,
0.039031982421875,
-0.034149169921875,
-0.001087188720703125,
0.01421356201171875,
-0.0211639404296875,
-0.014312744140625,
0.0299224853515625,
0.055938720703125,
0.020050048828125,
-0.04913330078125,
0.037109375,
-0.007205963134765625,
-0.027862548828125,
-0.04071044921875,
0.0285186767578125,
0.019287109375,
0.0252532958984375,
0.042236328125,
-0.025665283203125,
0.016021728515625,
-0.035736083984375,
0.0101470947265625,
0.036590576171875,
-0.03179931640625,
-0.0035915374755859375,
0.048828125,
0.015533447265625,
-0.030609130859375,
0.051910400390625,
-0.0183563232421875,
-0.035491943359375,
0.0699462890625,
0.035980224609375,
0.049102783203125,
0.0003478527069091797,
0.006366729736328125,
0.04376220703125,
0.00786590576171875,
-0.01329803466796875,
0.0252227783203125,
-0.004497528076171875,
-0.0225372314453125,
-0.01523590087890625,
-0.05859375,
-0.021484375,
0.005313873291015625,
-0.039459228515625,
0.01422882080078125,
-0.046783447265625,
-0.02227783203125,
-0.007442474365234375,
0.053955078125,
-0.04205322265625,
0.0018644332885742188,
-0.0109710693359375,
0.078857421875,
-0.03863525390625,
0.05712890625,
0.045440673828125,
-0.032745361328125,
-0.0478515625,
-0.0264739990234375,
0.0008187294006347656,
-0.05169677734375,
0.050079345703125,
0.03302001953125,
0.001071929931640625,
0.011962890625,
-0.0245513916015625,
-0.07928466796875,
0.1324462890625,
-0.0195770263671875,
-0.049896240234375,
-0.0031337738037109375,
0.0119171142578125,
0.03765869140625,
-0.0298309326171875,
0.040252685546875,
0.04046630859375,
0.04095458984375,
0.007598876953125,
-0.07769775390625,
0.029510498046875,
-0.0254058837890625,
0.00298309326171875,
0.0163726806640625,
-0.08270263671875,
0.07440185546875,
-0.01824951171875,
-0.0222625732421875,
0.032806396484375,
0.051544189453125,
0.036712646484375,
0.032135009765625,
0.0145416259765625,
0.05047607421875,
0.06671142578125,
-0.01476287841796875,
0.07647705078125,
-0.0345458984375,
0.05120849609375,
0.057647705078125,
-0.01148223876953125,
0.06683349609375,
0.034423828125,
-0.019073486328125,
0.03594970703125,
0.0677490234375,
-0.006481170654296875,
0.0258026123046875,
-0.005584716796875,
-0.017669677734375,
-0.002124786376953125,
0.00029587745666503906,
-0.05316162109375,
0.00833892822265625,
0.03424072265625,
-0.029632568359375,
0.00115966796875,
-0.0193328857421875,
0.02215576171875,
-0.0284576416015625,
-0.005367279052734375,
0.037750244140625,
0.0165557861328125,
-0.0614013671875,
0.069091796875,
0.01311492919921875,
0.06884765625,
-0.042694091796875,
0.0162200927734375,
-0.03961181640625,
0.0113067626953125,
-0.03265380859375,
-0.06005859375,
0.02166748046875,
0.0001418590545654297,
0.00490570068359375,
-0.01012420654296875,
0.041839599609375,
-0.007598876953125,
-0.042633056640625,
0.023956298828125,
0.02880859375,
0.01404571533203125,
-0.005786895751953125,
-0.0584716796875,
0.01708984375,
-0.00612640380859375,
-0.045166015625,
0.031158447265625,
0.005771636962890625,
-0.0026340484619140625,
0.060211181640625,
0.0543212890625,
-0.0213165283203125,
-0.00408172607421875,
-0.0201263427734375,
0.06414794921875,
-0.06549072265625,
-0.0217742919921875,
-0.059661865234375,
0.03460693359375,
-0.0191192626953125,
-0.032440185546875,
0.06304931640625,
0.06109619140625,
0.040679931640625,
0.0185699462890625,
0.040771484375,
-0.0017108917236328125,
0.0207977294921875,
-0.039520263671875,
0.0670166015625,
-0.0614013671875,
0.0200653076171875,
-0.0112152099609375,
-0.0772705078125,
-0.00884246826171875,
0.048980712890625,
-0.016876220703125,
0.01427459716796875,
0.07037353515625,
0.07012939453125,
-0.01039886474609375,
0.0010614395141601562,
0.00571441650390625,
0.0183258056640625,
0.047760009765625,
0.075927734375,
0.047149658203125,
-0.055694580078125,
0.031951904296875,
-0.0220947265625,
-0.01314544677734375,
-0.033111572265625,
-0.053009033203125,
-0.072998046875,
-0.041961669921875,
-0.01230621337890625,
-0.02777099609375,
0.01171112060546875,
0.059906005859375,
0.057159423828125,
-0.04156494140625,
-0.0283203125,
-0.016937255859375,
-0.00794219970703125,
-0.00211334228515625,
-0.0215606689453125,
0.040374755859375,
-0.035491943359375,
-0.049652099609375,
0.038665771484375,
0.00572967529296875,
0.005702972412109375,
-0.018798828125,
-0.029205322265625,
-0.013702392578125,
0.0008769035339355469,
0.045928955078125,
0.0257568359375,
-0.054901123046875,
-0.028411865234375,
0.021484375,
-0.018768310546875,
0.006679534912109375,
0.01195526123046875,
-0.059234619140625,
0.008331298828125,
0.036834716796875,
0.043182373046875,
0.07086181640625,
-0.0003070831298828125,
-0.005802154541015625,
-0.02850341796875,
0.006114959716796875,
0.015960693359375,
0.030975341796875,
0.0143280029296875,
-0.040496826171875,
0.059417724609375,
0.005344390869140625,
-0.05682373046875,
-0.056243896484375,
0.011688232421875,
-0.08563232421875,
-0.024017333984375,
0.0980224609375,
-0.013397216796875,
-0.0241851806640625,
0.0176544189453125,
-0.0182952880859375,
0.004459381103515625,
-0.039520263671875,
0.052642822265625,
0.0555419921875,
-0.01544189453125,
-0.00765228271484375,
-0.044647216796875,
0.03607177734375,
0.039520263671875,
-0.07122802734375,
-0.0175933837890625,
0.0333251953125,
0.036407470703125,
0.00865936279296875,
0.0509033203125,
0.00830841064453125,
0.018890380859375,
-0.01372528076171875,
-0.0276641845703125,
-0.031768798828125,
-0.01300048828125,
-0.01432037353515625,
0.00726318359375,
-0.0170135498046875,
0.004070281982421875
]
] |
stabilityai/StableBeluga-13B | 2023-08-29T20:21:26.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"en",
"dataset:conceptofmind/cot_submix_original",
"dataset:conceptofmind/flan2021_submix_original",
"dataset:conceptofmind/t0_submix_original",
"dataset:conceptofmind/niv2_submix_original",
"arxiv:2307.09288",
"arxiv:2306.02707",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | stabilityai | null | null | stabilityai/StableBeluga-13B | 111 | 27,956 | transformers | 2023-07-27T02:54:21 | ---
datasets:
- conceptofmind/cot_submix_original
- conceptofmind/flan2021_submix_original
- conceptofmind/t0_submix_original
- conceptofmind/niv2_submix_original
language:
- en
pipeline_tag: text-generation
---
# Stable Beluga 13B
Use [Stable Chat (Research Preview)](https://chat.stability.ai/chat) to test Stability AI's best language models for free
## Model Description
`Stable Beluga 13B` is a Llama2 13B model fine-tuned on an Orca-style dataset.
## Usage
Start chatting with `Stable Beluga 13B` using the following code snippet:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
tokenizer = AutoTokenizer.from_pretrained("stabilityai/StableBeluga-13B", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("stabilityai/StableBeluga-13B", torch_dtype=torch.float16, low_cpu_mem_usage=True, device_map="auto")
system_prompt = "### System:\nYou are Stable Beluga 13B, an AI that follows instructions extremely well. Help as much as you can. Remember, be safe, and don't do anything illegal.\n\n"
message = "Write me a poem please"
prompt = f"{system_prompt}### User: {message}\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
Stable Beluga 13B should be used with this prompt format:
```
### System:
This is a system prompt, please behave and help the user.
### User:
Your prompt here
### Assistant:
The output of Stable Beluga 13B
```
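For multi-turn conversations, the prompt format above can be assembled programmatically. The helper below is a minimal sketch; the function name and turn structure are illustrative, not part of any official API:
```python
def build_beluga_prompt(system: str, history: list[tuple[str, str]], message: str) -> str:
    # history holds prior (user, assistant) pairs; the prompt ends with an
    # open "### Assistant:" block for the model to complete.
    prompt = f"### System:\n{system}\n\n"
    for user_turn, assistant_turn in history:
        prompt += f"### User: {user_turn}\n\n### Assistant:\n{assistant_turn}\n\n"
    prompt += f"### User: {message}\n\n### Assistant:\n"
    return prompt

prompt = build_beluga_prompt("This is a system prompt.", [], "Write me a poem please")
```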
## Model Details
* **Developed by**: [Stability AI](https://stability.ai/)
* **Model type**: Stable Beluga 13B is an auto-regressive language model fine-tuned on Llama2 13B.
* **Language(s)**: English
* **Library**: [HuggingFace Transformers](https://github.com/huggingface/transformers)
* **License**: Fine-tuned checkpoints (`Stable Beluga 13B`) are licensed under the [STABLE BELUGA NON-COMMERCIAL COMMUNITY LICENSE AGREEMENT](https://huggingface.co/stabilityai/StableBeluga-13B/blob/main/LICENSE.txt)
* **Contact**: For questions and comments about the model, please email `lm@stability.ai`
### Training Dataset
`Stable Beluga 13B` is trained on our internal Orca-style dataset.
### Training Procedure
Models are trained via supervised fine-tuning on the aforementioned datasets, in mixed precision (BF16), and optimized with AdamW. We outline the following hyperparameters:
| Dataset | Batch Size | Learning Rate | Learning Rate Decay | Warm-up Steps | Weight Decay | Betas |
|-------------------|------------|---------------|-------------------|---------|--------------|-------------|
| Orca pt1 packed | 256 | 3e-5 | Cosine to 3e-6 | 100 | 1e-6 | (0.9, 0.95) |
| Orca pt2 unpacked | 512 | 3e-5 | Cosine to 3e-6 | 100 | 1e-6 | (0.9, 0.95) |
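A minimal sketch of an optimizer and schedule matching the table above; the total step count and `model` are placeholders, since the actual training code is not released:
```python
import math
import torch

def cosine_with_floor(step: int, warmup: int = 100, total: int = 10_000, floor: float = 0.1) -> float:
    # Linear warm-up, then cosine decay from the peak LR down to floor * peak
    # (3e-5 -> 3e-6, matching the table); `total` is an illustrative step count.
    if step < warmup:
        return step / max(1, warmup)
    progress = (step - warmup) / max(1, total - warmup)
    return floor + (1.0 - floor) * 0.5 * (1.0 + math.cos(math.pi * progress))

# `model` is a placeholder for a loaded Llama2-13B model.
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5, betas=(0.9, 0.95), weight_decay=1e-6)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, cosine_with_floor)
```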
## Ethical Considerations and Limitations
Beluga is a new technology that carries risks with use. Testing conducted to date has been in English and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Beluga's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased, or otherwise objectionable responses to user prompts. Therefore, before deploying any applications of Beluga, developers should perform safety testing and tuning tailored to their specific applications of the model.
## Citations
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 5,324 | [
[
-0.033233642578125,
-0.0748291015625,
0.0035572052001953125,
0.03155517578125,
-0.0187530517578125,
0.004390716552734375,
-0.01096343994140625,
-0.038848876953125,
0.00305938720703125,
0.018768310546875,
-0.04522705078125,
-0.0372314453125,
-0.048797607421875,
-0.0047454833984375,
-0.0251312255859375,
0.07867431640625,
0.009613037109375,
-0.01313018798828125,
0.00893402099609375,
-0.0077056884765625,
-0.047607421875,
-0.02264404296875,
-0.06964111328125,
-0.0210418701171875,
0.017425537109375,
0.0168609619140625,
0.05535888671875,
0.052734375,
0.017669677734375,
0.0264739990234375,
-0.0305938720703125,
0.00525665283203125,
-0.045928955078125,
-0.01241302490234375,
0.012725830078125,
-0.037109375,
-0.044708251953125,
-0.0006909370422363281,
0.03314208984375,
0.03143310546875,
-0.0129547119140625,
0.0146942138671875,
0.0179901123046875,
0.028076171875,
-0.0279693603515625,
0.0259552001953125,
-0.03399658203125,
-0.01340484619140625,
-0.01203155517578125,
0.004673004150390625,
-0.0177764892578125,
-0.0518798828125,
0.00626373291015625,
-0.044342041015625,
-0.00005412101745605469,
-0.004573822021484375,
0.10986328125,
0.0293731689453125,
-0.036376953125,
-0.0100250244140625,
-0.0426025390625,
0.06536865234375,
-0.067138671875,
0.032806396484375,
0.0229339599609375,
0.024566650390625,
-0.0275421142578125,
-0.04986572265625,
-0.049072265625,
-0.0166473388671875,
-0.003650665283203125,
0.026214599609375,
-0.0146484375,
-0.001972198486328125,
0.0194549560546875,
0.031707763671875,
-0.038909912109375,
0.02081298828125,
-0.0411376953125,
-0.031951904296875,
0.03948974609375,
0.01000213623046875,
0.007396697998046875,
-0.00980377197265625,
-0.0242462158203125,
-0.0274810791015625,
-0.052825927734375,
0.024017333984375,
0.0301971435546875,
0.01465606689453125,
-0.041168212890625,
0.0428466796875,
-0.00351715087890625,
0.045806884765625,
0.00667572021484375,
-0.024627685546875,
0.032073974609375,
-0.0305938720703125,
-0.030242919921875,
-0.002765655517578125,
0.07354736328125,
0.02813720703125,
0.006687164306640625,
0.02459716796875,
-0.00266265869140625,
0.0300445556640625,
-0.003406524658203125,
-0.06500244140625,
-0.01384735107421875,
0.0266571044921875,
-0.037841796875,
-0.041168212890625,
-0.024078369140625,
-0.078369140625,
-0.01302337646484375,
-0.002410888671875,
0.02386474609375,
-0.034393310546875,
-0.0311431884765625,
0.0160369873046875,
0.02630615234375,
0.03985595703125,
0.00240325927734375,
-0.07366943359375,
0.0261383056640625,
0.033843994140625,
0.053497314453125,
0.00907135009765625,
-0.0211334228515625,
-0.021026611328125,
-0.00962066650390625,
-0.03436279296875,
0.043914794921875,
-0.0154571533203125,
-0.03521728515625,
-0.0036449432373046875,
0.0111236572265625,
-0.006931304931640625,
-0.0231781005859375,
0.051177978515625,
-0.018646240234375,
0.032958984375,
-0.010498046875,
-0.032470703125,
-0.03271484375,
0.0165863037109375,
-0.0291595458984375,
0.08966064453125,
-0.00835418701171875,
-0.0509033203125,
0.0162811279296875,
-0.0478515625,
-0.0237579345703125,
-0.0258026123046875,
-0.01007080078125,
-0.052947998046875,
-0.027130126953125,
0.0178375244140625,
0.0377197265625,
-0.0191650390625,
0.021270751953125,
-0.0316162109375,
-0.020660400390625,
0.012237548828125,
-0.0222930908203125,
0.082763671875,
0.0258331298828125,
-0.050323486328125,
0.01373291015625,
-0.06982421875,
-0.01055145263671875,
0.0266571044921875,
-0.029205322265625,
-0.0015439987182617188,
-0.00897216796875,
-0.0202789306640625,
0.000926971435546875,
0.0265655517578125,
-0.0408935546875,
0.00861358642578125,
-0.03253173828125,
0.035919189453125,
0.05426025390625,
-0.0036468505859375,
0.01399993896484375,
-0.0257720947265625,
0.025543212890625,
-0.0027446746826171875,
0.02984619140625,
-0.006793975830078125,
-0.06219482421875,
-0.07220458984375,
-0.0239715576171875,
0.0269775390625,
0.054931640625,
-0.023712158203125,
0.05242919921875,
0.00469970703125,
-0.05572509765625,
-0.04705810546875,
-0.00444793701171875,
0.04620361328125,
0.0489501953125,
0.0245208740234375,
-0.022186279296875,
-0.04962158203125,
-0.06036376953125,
0.0128326416015625,
-0.0293731689453125,
0.01158905029296875,
0.01096343994140625,
0.0271148681640625,
-0.035247802734375,
0.056915283203125,
-0.037322998046875,
-0.0139007568359375,
-0.0084991455078125,
0.0120849609375,
0.025360107421875,
0.04217529296875,
0.062042236328125,
-0.04010009765625,
-0.01812744140625,
-0.008270263671875,
-0.052978515625,
-0.0034008026123046875,
0.00489044189453125,
-0.01922607421875,
0.04034423828125,
0.0223236083984375,
-0.04132080078125,
0.0411376953125,
0.0609130859375,
-0.0274810791015625,
0.045013427734375,
-0.01399993896484375,
0.00420379638671875,
-0.0897216796875,
0.00958251953125,
0.01023101806640625,
-0.00528717041015625,
-0.03875732421875,
-0.0021190643310546875,
0.0040283203125,
-0.0007562637329101562,
-0.02642822265625,
0.037628173828125,
-0.023040771484375,
0.001651763916015625,
-0.01348876953125,
0.006511688232421875,
-0.01116180419921875,
0.046600341796875,
-0.00006878376007080078,
0.036834716796875,
0.057037353515625,
-0.055816650390625,
0.0243988037109375,
0.03472900390625,
-0.0290069580078125,
0.017333984375,
-0.06842041015625,
0.00861358642578125,
0.0142059326171875,
0.01324462890625,
-0.09490966796875,
-0.010162353515625,
0.0291595458984375,
-0.04901123046875,
0.0343017578125,
-0.014923095703125,
-0.0286407470703125,
-0.03607177734375,
-0.0171966552734375,
0.0037841796875,
0.059967041015625,
-0.037078857421875,
0.03375244140625,
0.031829833984375,
-0.0023365020751953125,
-0.054412841796875,
-0.06048583984375,
-0.0198211669921875,
-0.021484375,
-0.06414794921875,
0.01085662841796875,
-0.0290069580078125,
0.00783538818359375,
-0.007404327392578125,
-0.005107879638671875,
0.00263214111328125,
0.0202789306640625,
0.0239105224609375,
0.04144287109375,
-0.0091094970703125,
-0.0291748046875,
0.0161590576171875,
-0.0174560546875,
0.00409698486328125,
0.001476287841796875,
0.052093505859375,
-0.042236328125,
-0.01104736328125,
-0.03564453125,
-0.00620269775390625,
0.039794921875,
-0.024322509765625,
0.060150146484375,
0.051361083984375,
-0.03399658203125,
0.0266571044921875,
-0.04669189453125,
-0.020843505859375,
-0.0391845703125,
0.0271148681640625,
-0.026031494140625,
-0.061004638671875,
0.06829833984375,
0.0050811767578125,
0.034515380859375,
0.03851318359375,
0.05963134765625,
0.00919342041015625,
0.0740966796875,
0.05419921875,
-0.00146484375,
0.020355224609375,
-0.03997802734375,
0.00934600830078125,
-0.0533447265625,
-0.039581298828125,
-0.046630859375,
-0.0146484375,
-0.046844482421875,
0.002162933349609375,
0.015960693359375,
0.00732421875,
-0.038238525390625,
0.0292510986328125,
-0.041839599609375,
0.004917144775390625,
0.0357666015625,
0.0034999847412109375,
0.0118865966796875,
-0.0154571533203125,
-0.0285491943359375,
0.0022373199462890625,
-0.047698974609375,
-0.038177490234375,
0.07421875,
0.046112060546875,
0.059814453125,
0.00545501708984375,
0.032501220703125,
-0.0194091796875,
0.00823211669921875,
-0.0391845703125,
0.0472412109375,
0.0032825469970703125,
-0.06256103515625,
-0.0136566162109375,
-0.032470703125,
-0.090576171875,
0.006168365478515625,
-0.0215606689453125,
-0.052215576171875,
0.0267486572265625,
0.0125274658203125,
-0.039337158203125,
0.01503753662109375,
-0.055694580078125,
0.08050537109375,
-0.02178955078125,
-0.01050567626953125,
-0.00788116455078125,
-0.062347412109375,
0.0489501953125,
0.0009431838989257812,
0.0245208740234375,
-0.000012218952178955078,
-0.0011224746704101562,
0.055267333984375,
-0.032012939453125,
0.07354736328125,
0.0006465911865234375,
-0.010528564453125,
0.02838134765625,
0.016845703125,
0.0311431884765625,
0.009918212890625,
-0.0017843246459960938,
0.0204925537109375,
0.00855255126953125,
-0.0305938720703125,
-0.0197906494140625,
0.054229736328125,
-0.09442138671875,
-0.03643798828125,
-0.037109375,
-0.01849365234375,
0.0023651123046875,
0.025054931640625,
0.0275421142578125,
0.0294342041015625,
0.01442718505859375,
0.0196685791015625,
0.053131103515625,
-0.031982421875,
0.0267181396484375,
0.0416259765625,
-0.0321044921875,
-0.039337158203125,
0.049652099609375,
0.01183319091796875,
0.023223876953125,
0.00478363037109375,
0.0246734619140625,
-0.03216552734375,
-0.042877197265625,
-0.0308837890625,
0.028045654296875,
-0.039703369140625,
-0.0161895751953125,
-0.0433349609375,
-0.02142333984375,
-0.035369873046875,
0.0006537437438964844,
-0.050323486328125,
-0.022979736328125,
-0.03204345703125,
-0.0233154296875,
0.049041748046875,
0.03204345703125,
-0.0109405517578125,
0.015869140625,
-0.05108642578125,
0.01318359375,
0.019683837890625,
0.0217132568359375,
-0.00390625,
-0.059906005859375,
-0.01062774658203125,
0.01458740234375,
-0.037841796875,
-0.0638427734375,
0.024810791015625,
0.0023097991943359375,
0.054046630859375,
0.0262451171875,
0.003406524658203125,
0.0628662109375,
-0.0086669921875,
0.0704345703125,
0.01641845703125,
-0.060821533203125,
0.046600341796875,
-0.0301361083984375,
0.004673004150390625,
0.0264129638671875,
0.041748046875,
-0.0252685546875,
-0.02960205078125,
-0.055267333984375,
-0.059234619140625,
0.052490234375,
0.0242919921875,
0.0030269622802734375,
0.0036792755126953125,
0.03857421875,
0.0054931640625,
0.00569915771484375,
-0.0675048828125,
-0.0406494140625,
-0.045806884765625,
-0.00044608116149902344,
0.004154205322265625,
-0.020660400390625,
-0.00909423828125,
-0.0221099853515625,
0.06329345703125,
0.0054473876953125,
0.0290069580078125,
0.01180267333984375,
0.0185546875,
-0.01425933837890625,
0.00007444620132446289,
0.051177978515625,
0.04046630859375,
-0.033294677734375,
-0.00908660888671875,
0.0243072509765625,
-0.04681396484375,
-0.0020046234130859375,
0.023651123046875,
-0.02069091796875,
-0.0211334228515625,
0.0028934478759765625,
0.0687255859375,
0.0085296630859375,
-0.0308990478515625,
0.0190887451171875,
-0.017730712890625,
-0.0247802734375,
-0.0255889892578125,
0.0018720626831054688,
0.0170745849609375,
0.0211944580078125,
0.01291656494140625,
0.00528717041015625,
-0.01971435546875,
-0.051544189453125,
-0.0035572052001953125,
0.01305389404296875,
-0.017425537109375,
-0.0234832763671875,
0.068115234375,
0.01605224609375,
-0.01000213623046875,
0.046600341796875,
-0.0096282958984375,
-0.03466796875,
0.0458984375,
0.03875732421875,
0.05120849609375,
-0.033050537109375,
0.002971649169921875,
0.045440673828125,
0.039154052734375,
-0.01027679443359375,
0.034210205078125,
0.036712646484375,
-0.04986572265625,
-0.0352783203125,
-0.0360107421875,
-0.036102294921875,
0.037200927734375,
-0.0411376953125,
0.042388916015625,
-0.040283203125,
-0.017791748046875,
-0.019683837890625,
0.02264404296875,
-0.031341552734375,
0.021759033203125,
0.00251007080078125,
0.06719970703125,
-0.05908203125,
0.05938720703125,
0.050079345703125,
-0.040008544921875,
-0.078369140625,
-0.025360107421875,
0.0013456344604492188,
-0.053619384765625,
0.0167083740234375,
0.0043792724609375,
0.00714111328125,
0.0050201416015625,
-0.046142578125,
-0.06903076171875,
0.09539794921875,
0.0362548828125,
-0.039459228515625,
0.0199127197265625,
-0.0008502006530761719,
0.045257568359375,
-0.004032135009765625,
0.036773681640625,
0.045562744140625,
0.043121337890625,
0.01007843017578125,
-0.07452392578125,
0.0236358642578125,
-0.0236358642578125,
-0.01032257080078125,
0.003787994384765625,
-0.083251953125,
0.058807373046875,
-0.00888824462890625,
-0.00307464599609375,
0.022064208984375,
0.0643310546875,
0.055145263671875,
0.022857666015625,
0.039703369140625,
0.053131103515625,
0.053802490234375,
-0.016021728515625,
0.079833984375,
-0.028533935546875,
0.030792236328125,
0.048797607421875,
0.005126953125,
0.055145263671875,
0.005977630615234375,
-0.0187225341796875,
0.046600341796875,
0.0799560546875,
-0.01024627685546875,
0.044677734375,
-0.0031414031982421875,
0.01056671142578125,
-0.007045745849609375,
0.01502227783203125,
-0.0469970703125,
0.0121612548828125,
0.035125732421875,
-0.0058441162109375,
-0.00516510009765625,
-0.006610870361328125,
0.032318115234375,
-0.01152801513671875,
-0.008758544921875,
0.03680419921875,
0.0177764892578125,
-0.045379638671875,
0.09326171875,
0.0027713775634765625,
0.058319091796875,
-0.06536865234375,
0.01517486572265625,
-0.031982421875,
0.0218658447265625,
-0.0256805419921875,
-0.0535888671875,
0.0247039794921875,
-0.0004291534423828125,
0.0102996826171875,
0.011810302734375,
0.05462646484375,
-0.01226806640625,
-0.0263519287109375,
0.03790283203125,
0.00913238525390625,
0.02606201171875,
0.022003173828125,
-0.07196044921875,
0.025665283203125,
0.007442474365234375,
-0.04364013671875,
0.0233001708984375,
0.033905029296875,
0.0018367767333984375,
0.05859375,
0.053558349609375,
-0.01139068603515625,
0.0292510986328125,
-0.00984954833984375,
0.07464599609375,
-0.0316162109375,
-0.02545166015625,
-0.0574951171875,
0.0408935546875,
0.00919342041015625,
-0.0389404296875,
0.0631103515625,
0.0433349609375,
0.05841064453125,
0.01213836669921875,
0.0516357421875,
-0.0145416259765625,
0.035797119140625,
-0.033843994140625,
0.051483154296875,
-0.037811279296875,
0.0223236083984375,
-0.01885986328125,
-0.07196044921875,
-0.018035888671875,
0.06024169921875,
-0.0168914794921875,
0.00489044189453125,
0.04266357421875,
0.0699462890625,
-0.006885528564453125,
0.004543304443359375,
0.0098419189453125,
0.050537109375,
0.036102294921875,
0.041656494140625,
0.05255126953125,
-0.04705810546875,
0.0782470703125,
-0.044189453125,
-0.03436279296875,
-0.0202484130859375,
-0.07720947265625,
-0.08197021484375,
-0.03558349609375,
-0.037078857421875,
-0.049346923828125,
0.0023326873779296875,
0.05462646484375,
0.05133056640625,
-0.056182861328125,
-0.030303955078125,
-0.01139068603515625,
-0.00325775146484375,
-0.026641845703125,
-0.01348114013671875,
0.037078857421875,
-0.00823974609375,
-0.056365966796875,
0.024200439453125,
-0.0014371871948242188,
0.0306854248046875,
-0.0270843505859375,
-0.0190277099609375,
-0.0212249755859375,
0.00875091552734375,
0.0242156982421875,
0.040863037109375,
-0.03399658203125,
-0.0250091552734375,
0.01212310791015625,
-0.00528717041015625,
-0.0012607574462890625,
0.0008535385131835938,
-0.042510986328125,
0.0197296142578125,
0.0369873046875,
0.03399658203125,
0.06048583984375,
-0.0178680419921875,
0.02838134765625,
-0.031494140625,
0.0204925537109375,
-0.00417327880859375,
0.034454345703125,
0.0156402587890625,
-0.0189208984375,
0.050140380859375,
0.0233306884765625,
-0.061065673828125,
-0.06756591796875,
0.005855560302734375,
-0.09576416015625,
-0.0056915283203125,
0.08837890625,
-0.0187225341796875,
-0.0282745361328125,
-0.008056640625,
-0.030609130859375,
0.04144287109375,
-0.0372314453125,
0.0760498046875,
0.037200927734375,
0.0049896240234375,
-0.02227783203125,
-0.0426025390625,
0.03271484375,
0.028411865234375,
-0.053558349609375,
-0.01253509521484375,
0.019561767578125,
0.0257720947265625,
0.02203369140625,
0.035552978515625,
-0.0290679931640625,
0.0187225341796875,
-0.01751708984375,
0.0106658935546875,
-0.01226043701171875,
-0.005641937255859375,
-0.01480865478515625,
-0.0200653076171875,
-0.0108184814453125,
-0.0246124267578125
]
] |
Qwen/Qwen-VL-Chat | 2023-10-31T12:41:18.000Z | [
"transformers",
"pytorch",
"qwen",
"text-generation",
"custom_code",
"zh",
"en",
"arxiv:2308.12966",
"has_space",
"region:us"
] | text-generation | Qwen | null | null | Qwen/Qwen-VL-Chat | 108 | 27,761 | transformers | 2023-08-20T04:45:22 | ---
language:
- zh
- en
tags:
- qwen
pipeline_tag: text-generation
inference: false
---
# Qwen-VL-Chat
<br>
<p align="center">
<img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/logo_vl.jpg" width="400"/>
<p>
<br>
<p align="center">
Qwen-VL <a href="https://modelscope.cn/models/qwen/Qwen-VL/summary">🤖</a> | <a href="https://huggingface.co/Qwen/Qwen-VL">🤗</a>&nbsp; | Qwen-VL-Chat <a href="https://modelscope.cn/models/qwen/Qwen-VL-Chat/summary">🤖</a> | <a href="https://huggingface.co/Qwen/Qwen-VL-Chat">🤗</a>&nbsp; | Qwen-VL-Chat-Int4 <a href="https://huggingface.co/Qwen/Qwen-VL-Chat-Int4">🤗</a>
<br>
<a href="assets/wechat.png">WeChat</a>   |   <a href="https://discord.gg/z3GAxXZ9Ce">Discord</a>   |   <a href="https://modelscope.cn/studios/qwen/Qwen-VL-Chat-Demo/summary">Demo</a>  |  <a href="https://arxiv.org/abs/2308.12966">Report</a>
</p>
<br>
**Qwen-VL** 是阿里云研发的大规模视觉语言模型(Large Vision Language Model, LVLM)。Qwen-VL 可以以图像、文本、检测框作为输入,并以文本和检测框作为输出。Qwen-VL 系列模型性能强大,具备多语言对话、多图交错对话等能力,并支持中文开放域定位和细粒度图像识别与理解。
**Qwen-VL** (Qwen Large Vision Language Model) is the visual multimodal version of the large model series, Qwen (abbr. Tongyi Qianwen), proposed by Alibaba Cloud. Qwen-VL accepts images, text, and bounding boxes as inputs, and outputs text and bounding boxes. The Qwen-VL series delivers strong performance and supports multilingual dialogue, multi-image interleaved dialogue, open-domain grounding in Chinese, and fine-grained image recognition and understanding.
目前,我们提供了Qwen-VL和Qwen-VL-Chat两个模型,分别为预训练模型和Chat模型。如果想了解更多关于模型的信息,请点击[链接](https://github.com/QwenLM/Qwen-VL/blob/master/visual_memo.md)查看我们的技术备忘录。本仓库为Qwen-VL-Chat仓库。
We release Qwen-VL and Qwen-VL-Chat, which are pretrained model and Chat model respectively. For more details about Qwen-VL, please refer to our [technical memo](https://github.com/QwenLM/Qwen-VL/blob/master/visual_memo.md). This repo is the one for Qwen-VL-Chat.
<br>
## 安装要求 (Requirements)
* python 3.8及以上版本
* pytorch 1.12及以上版本,推荐2.0及以上版本
* 建议使用CUDA 11.4及以上(GPU用户需考虑此选项)
* python 3.8 and above
* pytorch 1.12 and above, 2.0 and above are recommended
* CUDA 11.4 and above are recommended (this is for GPU users)
<br>
## 快速开始 (Quickstart)
我们提供简单的示例来说明如何利用 🤗 Transformers 快速使用Qwen-VL-Chat。
在开始前,请确保你已经配置好环境并安装好相关的代码包。最重要的是,确保你满足上述要求,然后安装相关的依赖库。
Below, we provide simple examples to show how to use Qwen-VL-Chat with 🤗 Transformers.
Before running the code, make sure you have setup the environment and installed the required packages. Make sure you meet the above requirements, and then install the dependent libraries.
```bash
pip install -r requirements.txt
```
接下来你可以开始使用Transformers来使用我们的模型。关于视觉模块的更多用法,请参考[教程](TUTORIAL.md)。
Now you can start with Transformers. For more usage of the vision encoder, please refer to the [tutorial](TUTORIAL_zh.md).
#### 🤗 Transformers
To use Qwen-VL-Chat for inference, all you need to do is input a few lines of code as demonstrated below. However, **please make sure that you are using the latest code.**
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig
import torch
torch.manual_seed(1234)
# Note: The default behavior now has injection attack prevention off.
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-VL-Chat", trust_remote_code=True)
# use bf16
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="auto", trust_remote_code=True, bf16=True).eval()
# use fp16
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="auto", trust_remote_code=True, fp16=True).eval()
# use cpu only
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="cpu", trust_remote_code=True).eval()
# use cuda device
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="cuda", trust_remote_code=True).eval()
# Specify hyperparameters for generation (No need to do this if you are using transformers>=4.32.0)
# model.generation_config = GenerationConfig.from_pretrained("Qwen/Qwen-VL-Chat", trust_remote_code=True)
# 1st dialogue turn
query = tokenizer.from_list_format([
{'image': 'https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg'},
{'text': '这是什么'},
])
response, history = model.chat(tokenizer, query=query, history=None)
print(response)
# 图中是一名年轻女子在沙滩上和她的狗玩耍,狗的品种可能是拉布拉多。她们坐在沙滩上,狗的前腿抬起来,似乎在和人类击掌。两人之间充满了信任和爱。
# 2nd dialogue turn
response, history = model.chat(tokenizer, '输出"击掌"的检测框', history=history)
print(response)
# <ref>击掌</ref><box>(517,508),(589,611)</box>
image = tokenizer.draw_bbox_on_latest_picture(response, history)
if image:
image.save('1.jpg')
else:
print("no box")
```
<p align="center">
<img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo_highfive.jpg" width="500"/>
<p>
<br>
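If you need the grounding output programmatically rather than drawn on the image, the `<ref>`/`<box>` string can be parsed with a small helper. This is a sketch based on the demo output above; whether the coordinates are on a normalized grid is an assumption — consult the tutorial for the exact convention:
```python
import re

def parse_boxes(text: str) -> list[tuple[str, tuple[int, int], tuple[int, int]]]:
    # Extract (label, top-left, bottom-right) triples from Qwen-VL output such as
    # "<ref>击掌</ref><box>(517,508),(589,611)</box>".
    pattern = r"<ref>(.*?)</ref><box>\((\d+),(\d+)\),\((\d+),(\d+)\)</box>"
    return [(label, (int(x1), int(y1)), (int(x2), int(y2)))
            for label, x1, y1, x2, y2 in re.findall(pattern, text)]

print(parse_boxes("<ref>击掌</ref><box>(517,508),(589,611)</box>"))
# [('击掌', (517, 508), (589, 611))]
```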
## 量化 (Quantization)
### 用法 (Usage)
当前我们提供了基于[AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ)的量化方案,并提供了Qwen-VL-Chat的Int4量化版本Qwen-VL-Chat-Int4 [点击此处](https://huggingface.co/Qwen/Qwen-VL-Chat-Int4)。该模型在效果评测上几乎无损,并在显存占用和推理速度上具有明显优势。
下文说明如何使用该量化模型。开始之前,请确保你满足要求(如torch2.0及以上、transformers 4.32.0及以上,等)并安装所需的代码库:
We provide a new solution based on [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ), and release an Int4 quantized model for Qwen-VL-Chat, Qwen-VL-Chat-Int4 [Click here](https://huggingface.co/Qwen/Qwen-VL-Chat-Int4), which achieves nearly lossless model quality with improved performance in both memory cost and inference speed.
Here we demonstrate how to use our provided quantized models for inference. Before you start, make sure you meet the requirements (e.g., torch 2.0 and above, transformers 4.32.0 and above, etc.) and install the required packages:
```bash
pip install optimum
git clone https://github.com/JustinLin610/AutoGPTQ.git && cd AutoGPTQ
pip install -v .
```
如遇到安装 `auto-gptq` 的问题,建议您前往官方[repo](https://github.com/PanQiWei/AutoGPTQ) 寻找合适的wheel。
随后你便可以按照上述用法,轻松调用量化模型:
If you meet problems installing `auto-gptq`, we advise you to check out the official [repo](https://github.com/PanQiWei/AutoGPTQ) to find a wheel.
Then you can load the quantized model easily and run inference just as usual:
```python
model = AutoModelForCausalLM.from_pretrained(
"Qwen/Qwen-VL-Chat-Int4",
device_map="auto",
trust_remote_code=True
).eval()
# Either a local path or a URL between <img></img> tags.
image_path = 'https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg'
response, history = model.chat(tokenizer, query=f'<img>{image_path}</img>这是什么', history=None)
print(response)
```
### 效果评测 (Performance)
我们列出不同精度下模型在评测基准 **[TouchStone](https://github.com/OFA-Sys/TouchStone)** 上的表现,并发现量化模型并没有显著性能损失。结果如下所示:
We illustrate the model performance of both BF16 and Int4 models on the benchmark **[TouchStone](https://github.com/OFA-Sys/TouchStone)**, and we find that the quantized model does not suffer from significant performance degradation. Results are shown below:
| Quantization | ZH. | EN |
| ------------ | :--------: | :-----------: |
| BF16 | 401.2 | 645.2 |
| Int4 | 386.6 | 651.4 |
### 推理速度 (Inference Speed)
我们测算了在输入一张图片(即258个token)的条件下BF16和Int4的模型生成1792 (2048-258) 和 7934 (8192-258) 个token的平均速度。
We measured the average inference speed (tokens/s) of generating 1792 (2048-258) and 7934 (8192-258) tokens with the context of an image (which takes 258 tokens) under BF16 precision and Int4 quantization, respectively.
| Quantization | Speed (2048 tokens) | Speed (8192 tokens) |
| ------------ | :-----------------: | :-----------------: |
| BF16 | 28.87 | 24.32 |
| Int4 | 37.79 | 34.34 |
推理速度测算是在单卡 A100-SXM4-80G GPU上运行,使用PyTorch 2.0.1及CUDA 11.4。
The profiling runs on a single A100-SXM4-80G GPU with PyTorch 2.0.1 and CUDA 11.4.
### GPU显存占用 (GPU Memory Usage)
我们还测算了在一张图片输入的条件下BF16和Int4模型生成1792 (2048-258) 和 7934 (8192-258) 个token所需显存。结果如下所示:
We also profile the peak GPU memory usage for encoding 1792 (2048-258) tokens (including an image) as context (and generating single token) and generating 7934 (8192-258) tokens (with an image as context) under BF16 or Int4 quantization level, respectively. The results are shown below.
| Quantization | Peak Usage for Encoding 2048 Tokens | Peak Usage for Generating 8192 Tokens |
| ------------ | :---------------------------------: | :-----------------------------------: |
| BF16 | 22.60GB | 28.01GB |
| Int4 | 11.82GB | 17.23GB |
上述速度和显存测算使用[此脚本](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile_mm.py)完成。
The above speed and memory profiling are conducted using [this script](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile_mm.py).
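For a rough reproduction of these measurements without the full script, PyTorch's built-in counters can be used. This is a simplified sketch, not the linked profiling script; `model` and `inputs` stand for a loaded model and a tokenized image+text prompt (see the Quickstart):
```python
import time
import torch

# Simplified measurement of generation speed and peak GPU memory.
torch.cuda.reset_peak_memory_stats()
start = time.time()
output = model.generate(**inputs, max_new_tokens=1792)
elapsed = time.time() - start

new_tokens = output.shape[1] - inputs["input_ids"].shape[1]
print(f"speed: {new_tokens / elapsed:.2f} tokens/s")
print(f"peak GPU memory: {torch.cuda.max_memory_allocated() / 1024**3:.2f} GB")
```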
<br>
## 评测 (Evaluation)
我们从两个角度评测了两个模型的能力:
1. 在**英文标准 Benchmark** 上评测模型的基础任务能力。目前评测了四大类多模态任务:
- Zero-shot Caption: 评测模型在未见过数据集上的零样本图片描述能力;
- General VQA: 评测模型的通用问答能力,例如判断题、颜色、个数、类目等问答能力;
- Text-based VQA:评测模型对于图片中文字相关的识别/问答能力,例如文档问答、图表问答、文字问答等;
   - Referring Expression Comprehension:评测模型给定物体描述画检测框的能力;
2. **试金石 (TouchStone)**:为了评测模型整体的图文对话能力和人类对齐水平。我们为此构建了一个基于 GPT4 打分来评测 LVLM 模型的 Benchmark:TouchStone。在 TouchStone-v0.1 中:
- 评测基准总计涵盖 300+张图片、800+道题目、27个类别。包括基础属性问答、人物地标问答、影视作品问答、视觉推理、反事实推理、诗歌创作、故事写作,商品比较、图片解题等**尽可能广泛的类别**。
- 为了弥补目前 GPT4 无法直接读取图片的缺陷,我们给所有的带评测图片提供了**人工标注的充分详细描述**,并且将图片的详细描述、问题和模型的输出结果一起交给 GPT4 打分。
- 评测同时包含英文版本和中文版本。
评测结果如下:
We evaluated the model's ability from two perspectives:
1. **Standard Benchmarks**: We evaluate the model's basic task capabilities on four major categories of multimodal tasks:
- Zero-shot Caption: Evaluate model's zero-shot image captioning ability on unseen datasets;
 - General VQA: Evaluate the model's general question-answering ability on images, such as judgment, color, number, and category questions;
- Text-based VQA: Evaluate the model's ability to recognize text in pictures, such as document QA, chart QA, etc;
- Referring Expression Comprehension: Evaluate the ability to localize a target object in an image described by a referring expression.
2. **TouchStone**: To evaluate the overall text-image dialogue capability and alignment level with humans, we have constructed a benchmark called TouchStone, which is based on scoring with GPT4 to evaluate the LVLM model.
 - The TouchStone benchmark covers a total of 300+ images, 800+ questions, and 27 categories, such as attribute-based Q&A, celebrity recognition, writing poetry, summarizing multiple images, product comparison, and math problem solving;
- In order to break the current limitation of GPT4 in terms of direct image input, TouchStone provides fine-grained image annotations by human labeling. These detailed annotations, along with the questions and the model's output, are then presented to GPT4 for scoring.
- The benchmark includes both English and Chinese versions.
The results of the evaluation are as follows:
Qwen-VL outperforms current SOTA generalist models on multiple VL tasks and has more comprehensive coverage in terms of capability range.
<p align="center">
<img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/radar.png" width="600"/>
<p>
### 零样本图像描述 & 通用视觉问答 (Zero-shot Captioning & General VQA)
<table>
<thead>
<tr>
<th rowspan="2">Model type</th>
<th rowspan="2">Model</th>
<th colspan="2">Zero-shot Captioning</th>
<th colspan="5">General VQA</th>
</tr>
<tr>
<th>NoCaps</th>
<th>Flickr30K</th>
<th>VQAv2<sup>dev</sup></th>
<th>OK-VQA</th>
<th>GQA</th>
<th>SciQA-Img<br>(0-shot)</th>
<th>VizWiz<br>(0-shot)</th>
</tr>
</thead>
<tbody align="center">
<tr>
<td rowspan="10">Generalist<br>Models</td>
<td>Flamingo-9B</td>
<td>-</td>
<td>61.5</td>
<td>51.8</td>
<td>44.7</td>
<td>-</td>
<td>-</td>
<td>28.8</td>
</tr>
<tr>
<td>Flamingo-80B</td>
<td>-</td>
<td>67.2</td>
<td>56.3</td>
<td>50.6</td>
<td>-</td>
<td>-</td>
<td>31.6</td>
</tr>
<tr>
<td>Unified-IO-XL</td>
<td>100.0</td>
<td>-</td>
<td>77.9</td>
<td>54.0</td>
<td>-</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>Kosmos-1</td>
<td>-</td>
<td>67.1</td>
<td>51.0</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>29.2</td>
</tr>
<tr>
<td>Kosmos-2</td>
<td>-</td>
<td>66.7</td>
<td>45.6</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>BLIP-2 (Vicuna-13B)</td>
<td>103.9</td>
<td>71.6</td>
<td>65.0</td>
<td>45.9</td>
<td>32.3</td>
<td>61.0</td>
<td>19.6</td>
</tr>
<tr>
<td>InstructBLIP (Vicuna-13B)</td>
<td><strong>121.9</strong></td>
<td>82.8</td>
<td>-</td>
<td>-</td>
<td>49.5</td>
<td>63.1</td>
<td>33.4</td>
</tr>
<tr>
<td>Shikra (Vicuna-13B)</td>
<td>-</td>
<td>73.9</td>
<td>77.36</td>
<td>47.16</td>
<td>-</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td><strong>Qwen-VL (Qwen-7B)</strong></td>
<td>121.4</td>
<td><b>85.8</b></td>
<td><b>78.8</b></td>
<td><b>58.6</b></td>
<td><b>59.3</b></td>
<td>67.1</td>
<td>35.2</td>
</tr>
<!-- <tr>
<td>Qwen-VL (4-shot)</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>63.6</td>
<td>-</td>
<td>-</td>
<td>39.1</td>
</tr> -->
<tr>
<td>Qwen-VL-Chat</td>
<td>120.2</td>
<td>81.0</td>
<td>78.2</td>
<td>56.6</td>
<td>57.5</td>
<td><b>68.2</b></td>
<td><b>38.9</b></td>
</tr>
<!-- <tr>
<td>Qwen-VL-Chat (4-shot)</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>60.6</td>
<td>-</td>
<td>-</td>
<td>44.45</td>
</tr> -->
<tr>
<td>Previous SOTA<br>(Per Task Fine-tuning)</td>
<td>-</td>
<td>127.0<br>(PALI-17B)</td>
<td>84.5<br>(InstructBLIP<br>-FlanT5-XL)</td>
<td>86.1<br>(PALI-X<br>-55B)</td>
<td>66.1<br>(PALI-X<br>-55B)</td>
<td>72.1<br>(CFR)</td>
<td>92.53<br>(LLaVa+<br>GPT-4)</td>
<td>70.9<br>(PALI-X<br>-55B)</td>
</tr>
</tbody>
</table>
- 在 Zero-shot Caption 中,Qwen-VL 在 Flickr30K 数据集上取得了 **SOTA** 的结果,并在 Nocaps 数据集上取得了和 InstructBlip 可竞争的结果。
- 在 General VQA 中,Qwen-VL 取得了 LVLM 模型同等量级和设定下 **SOTA** 的结果。
- For zero-shot image captioning, Qwen-VL achieves **SOTA** results on Flickr30K and results competitive with InstructBLIP on NoCaps.
- For general VQA, Qwen-VL achieves **SOTA** results among generalist LVLMs of comparable scale and setting.
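As a concrete illustration of the settings above, here is a minimal inference sketch. It assumes the chat interface (`model.chat`, `tokenizer.from_list_format`) that the Qwen-VL checkpoints expose via `trust_remote_code`; the image path is hypothetical, and the official repository remains the authoritative reference.
```python
# Minimal sketch: zero-shot captioning / VQA with Qwen-VL-Chat.
# Assumes the trust_remote_code chat interface; verify against the repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-VL-Chat", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-VL-Chat", device_map="auto", trust_remote_code=True
).eval()

query = tokenizer.from_list_format([
    {"image": "demo.jpeg"},  # hypothetical local image
    {"text": "Describe the image, then answer: how many people are in it?"},
])
response, history = model.chat(tokenizer, query=query, history=None)
print(response)
```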
### 文本导向的视觉问答 (Text-oriented VQA)
<table>
<thead>
<tr>
<th>Model type</th>
<th>Model</th>
<th>TextVQA</th>
<th>DocVQA</th>
<th>ChartQA</th>
<th>AI2D</th>
<th>OCR-VQA</th>
</tr>
</thead>
<tbody align="center">
<tr>
<td rowspan="5">Generalist Models</td>
<td>BLIP-2 (Vicuna-13B)</td>
<td>42.4</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>InstructBLIP (Vicuna-13B)</td>
<td>50.7</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>mPLUG-DocOwl (LLaMA-7B)</td>
<td>52.6</td>
<td>62.2</td>
<td>57.4</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>Pic2Struct-Large (1.3B)</td>
<td>-</td>
<td><b>76.6</b></td>
<td>58.6</td>
<td>42.1</td>
<td>71.3</td>
</tr>
<tr>
<td>Qwen-VL (Qwen-7B)</td>
<td><b>63.8</b></td>
<td>65.1</td>
<td><b>65.7</b></td>
<td><b>62.3</b></td>
<td><b>75.7</b></td>
</tr>
<tr>
<td>Specialist SOTAs<br>(Specialist/Finetuned)</td>
<td>PALI-X-55B (Single-task FT)<br>(Without OCR Pipeline)</td>
<td>71.44</td>
<td>80.0</td>
<td>70.0</td>
<td>81.2</td>
<td>75.0</td>
</tr>
</tbody>
</table>
- 在文字相关的识别/问答评测上,取得了当前规模下通用 LVLM 达到的最好结果。
- 分辨率对上述某几个评测非常重要,大部分 224 分辨率的开源 LVLM 模型无法完成以上评测,或只能通过切图的方式解决。Qwen-VL 将分辨率提升到 448,可以直接以端到端的方式进行以上评测。Qwen-VL 在很多任务上甚至超过了 1024 分辨率的 Pic2Struct-Large 模型。
- In text-related recognition/QA evaluations, Qwen-VL achieves the best results among generalist LVLMs at this scale.
- Resolution matters for several of the benchmarks above. Most open-source LVLMs operate at 224 resolution and either cannot handle these evaluations at all or can only do so by cutting images into tiles; Qwen-VL raises the input resolution to 448 and can therefore be evaluated end-to-end (the sketch below illustrates the tiling workaround). On several tasks Qwen-VL even outperforms Pic2Struct-Large, which consumes 1024-resolution inputs.
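To make the tiling point concrete, here is a purely illustrative sketch (file name hypothetical): a 224-resolution model must cut a document page into many crops, while a 448-resolution model consumes one resized image end-to-end.
```python
# Illustrative only: the tiling workaround that low-resolution models need.
from PIL import Image

def tile_for_224(img: Image.Image, tile: int = 224):
    """Cut an image into non-overlapping 224x224 crops."""
    w, h = img.size
    return [
        img.crop((x, y, min(x + tile, w), min(y + tile, h)))
        for y in range(0, h, tile)
        for x in range(0, w, tile)
    ]

page = Image.open("document_page.png")  # hypothetical input
crops = tile_for_224(page)              # many lossy crops at 224 resolution
whole = page.resize((448, 448))         # one end-to-end input at 448 resolution
```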
### 细粒度视觉定位 (Referring Expression Comprehension)
<table>
<thead>
<tr>
<th rowspan="2">Model type</th>
<th rowspan="2">Model</th>
<th colspan="3">RefCOCO</th>
<th colspan="3">RefCOCO+</th>
<th colspan="2">RefCOCOg</th>
<th>GRIT</th>
</tr>
<tr>
<th>val</th>
<th>test-A</th>
<th>test-B</th>
<th>val</th>
<th>test-A</th>
<th>test-B</th>
<th>val-u</th>
<th>test-u</th>
<th>refexp</th>
</tr>
</thead>
<tbody align="center">
<tr>
<td rowspan="8">Generalist Models</td>
<td>GPV-2</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>51.50</td>
</tr>
<tr>
<td>OFA-L*</td>
<td>79.96</td>
<td>83.67</td>
<td>76.39</td>
<td>68.29</td>
<td>76.00</td>
<td>61.75</td>
<td>67.57</td>
<td>67.58</td>
<td>61.70</td>
</tr>
<tr>
<td>Unified-IO</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td><b>78.61</b></td>
</tr>
<tr>
<td>VisionLLM-H</td>
<td>-</td>
<td>86.70</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
<td>-</td>
</tr>
<tr>
<td>Shikra-7B</td>
<td>87.01</td>
<td>90.61</td>
<td>80.24 </td>
<td>81.60</td>
<td>87.36</td>
<td>72.12</td>
<td>82.27</td>
<td>82.19</td>
<td>69.34</td>
</tr>
<tr>
<td>Shikra-13B</td>
<td>87.83 </td>
<td>91.11</td>
<td>81.81</td>
<td>82.89</td>
<td>87.79</td>
<td>74.41</td>
<td>82.64</td>
<td>83.16</td>
<td>69.03</td>
</tr>
<tr>
<td>Qwen-VL-7B</td>
<td><b>89.36</b></td>
<td>92.26</td>
<td><b>85.34</b></td>
<td><b>83.12</b></td>
<td>88.25</td>
<td><b>77.21</b></td>
<td>85.58</td>
<td>85.48</td>
<td>78.22</td>
</tr>
<tr>
<td>Qwen-VL-7B-Chat</td>
<td>88.55</td>
<td><b>92.27</b></td>
<td>84.51</td>
<td>82.82</td>
<td><b>88.59</b></td>
<td>76.79</td>
<td><b>85.96</b></td>
<td><b>86.32</b></td>
<td>-</td>
</tr>
<tr>
<td rowspan="3">Specialist SOTAs<br>(Specialist/Finetuned)</td>
<td>G-DINO-L</td>
<td>90.56 </td>
<td>93.19</td>
<td>88.24</td>
<td>82.75</td>
<td>88.95</td>
<td>75.92</td>
<td>86.13</td>
<td>87.02</td>
<td>-</td>
</tr>
<tr>
<td>UNINEXT-H</td>
<td>92.64 </td>
<td>94.33</td>
<td>91.46</td>
<td>85.24</td>
<td>89.63</td>
<td>79.79</td>
<td>88.73</td>
<td>89.37</td>
<td>-</td>
</tr>
<tr>
<td>ONE-PEACE</td>
<td>92.58 </td>
<td>94.18</td>
<td>89.26</td>
<td>88.77</td>
<td>92.21</td>
<td>83.23</td>
<td>89.22</td>
<td>89.27</td>
<td>-</td>
</tr>
</tbody>
</table>
- 在定位任务上,Qwen-VL 全面超过 Shikra-13B,取得了目前 Generalist LVLM 模型上在 Refcoco 上的 **SOTA**。
- Qwen-VL 并没有在任何中文定位数据上训练过,但通过中文 Caption 数据和 英文 Grounding 数据的训练,可以 Zero-shot 泛化出中文 Grounding 能力。
我们提供了以上**所有**评测脚本以供复现我们的实验结果。请阅读 [eval/EVALUATION.md](eval/EVALUATION.md) 了解更多信息。
- Qwen-VL achieves **SOTA** results among generalist LVLMs on the RefCOCO family of referring expression comprehension benchmarks, surpassing Shikra-13B across the board (a grounding-query sketch follows below).
- Qwen-VL has never been trained on Chinese grounding data, yet through joint training on Chinese captioning data and English grounding data it generalizes to Chinese grounding in a zero-shot manner.
We provide all of the above evaluation scripts for reproducing our experimental results. Please read [eval/EVALUATION.md](eval/EVALUATION.md) for more information.
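Reusing the `tokenizer` and `model` from the earlier sketch, a grounding query looks roughly as follows. The `<box>(x1,y1),(x2,y2)</box>` output tags and their 0-1000 normalized coordinates reflect our reading of Qwen-VL's convention and should be verified against the official repository.
```python
# Hedged sketch of a referring-expression grounding query.
import re

query = tokenizer.from_list_format([
    {"image": "demo.jpeg"},  # hypothetical image
    {"text": "Find the red umbrella in the image."},
])
response, _ = model.chat(tokenizer, query=query, history=None)

# Assumed output format: ...<box>(x1,y1),(x2,y2)</box>... with 0-1000 coords.
boxes = re.findall(r"<box>\((\d+),(\d+)\),\((\d+),(\d+)\)</box>", response)
for x1, y1, x2, y2 in boxes:
    print("normalized box:", x1, y1, x2, y2)  # scale by image size / 1000
```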
### 闲聊能力测评 (Chat Evaluation)
TouchStone 是一个基于 GPT4 打分来评测 LVLM 模型的图文对话能力和人类对齐水平的基准。它涵盖了 300+张图片、800+道题目、27个类别,包括基础属性、人物地标、视觉推理、诗歌创作、故事写作、商品比较、图片解题等**尽可能广泛的类别**。关于 TouchStone 的详细介绍,请参考[touchstone/README_CN.md](touchstone/README_CN.md)了解更多信息。
TouchStone is a benchmark that scores LVLM outputs with GPT-4 to evaluate image-text dialogue ability and alignment with human preferences. It covers 300+ images, 800+ questions, and 27 categories, including attribute-based Q&A, celebrity recognition, poetry writing, summarizing multiple images, product comparison, math problem solving, and more. Please read [touchstone/README.md](touchstone/README.md) for more information.
#### 英语 (English)
| Model | Score |
|---------------|-------|
| PandaGPT | 488.5 |
| MiniGPT4 | 531.7 |
| InstructBLIP | 552.4 |
| LLaMA-AdapterV2 | 590.1 |
| mPLUG-Owl | 605.4 |
| LLaVA | 602.7 |
| Qwen-VL-Chat | 645.2 |
#### 中文 (Chinese)
| Model | Score |
|---------------|-------|
| VisualGLM | 247.1 |
| Qwen-VL-Chat | 401.2 |
Qwen-VL-Chat 模型在中英文的对齐评测中均取得当前 LVLM 模型下的最好结果。
Qwen-VL-Chat achieves the best results among current LVLMs on both the English and Chinese alignment evaluations.
<br>
## 常见问题 (FAQ)
如遇到问题,敬请查阅 [FAQ](https://github.com/QwenLM/Qwen-VL/blob/master/FAQ_zh.md)以及issue区,如仍无法解决再提交issue。
If you run into problems, please consult the [FAQ](https://github.com/QwenLM/Qwen-VL/blob/master/FAQ.md) and existing issues for a solution before opening a new one.
<br>
## 使用协议 (License Agreement)
研究人员与开发者可使用Qwen-VL和Qwen-VL-Chat或进行二次开发。我们同样允许商业使用,具体细节请查看[LICENSE](https://github.com/QwenLM/Qwen-VL/blob/master/LICENSE)。如需商用,请填写[问卷](https://dashscope.console.aliyun.com/openModelApply/qianwen)申请。
Researchers and developers are free to use the code and model weights of both Qwen-VL and Qwen-VL-Chat, and commercial use is also permitted; see [LICENSE](https://github.com/QwenLM/Qwen-VL/blob/master/LICENSE) for details. To apply for commercial use, please fill out the [questionnaire](https://dashscope.console.aliyun.com/openModelApply/qianwen).
<br>
## 引用 (Citation)
如果你觉得我们的论文和代码对你的研究有帮助,请考虑:star: 和引用 :pencil: :)
If you find our paper and code useful in your research, please consider giving a star :star: and citation :pencil: :)
```BibTeX
@article{Qwen-VL,
title={Qwen-VL: A Frontier Large Vision-Language Model with Versatile Abilities},
author={Bai, Jinze and Bai, Shuai and Yang, Shusheng and Wang, Shijie and Tan, Sinan and Wang, Peng and Lin, Junyang and Zhou, Chang and Zhou, Jingren},
journal={arXiv preprint arXiv:2308.12966},
year={2023}
}
```
<br>
## 联系我们 (Contact Us)
如果你想给我们的研发团队和产品团队留言,请通过邮件(qianwen_opensource@alibabacloud.com)联系我们。
If you are interested to leave a message to either our research team or product team, feel free to send an email to qianwen_opensource@alibabacloud.com.
| 22,737 | [
[
-0.02978515625,
-0.061309814453125,
0.010772705078125,
0.017181396484375,
-0.02337646484375,
-0.01403045654296875,
-0.00901031494140625,
-0.03643798828125,
-0.0006008148193359375,
0.0207977294921875,
-0.040283203125,
-0.04058837890625,
-0.02728271484375,
-0.01346588134765625,
-0.02325439453125,
0.049530029296875,
0.0159454345703125,
-0.008392333984375,
0.004302978515625,
-0.018463134765625,
-0.019012451171875,
-0.04071044921875,
-0.0635986328125,
-0.006317138671875,
0.01110076904296875,
0.0013151168823242188,
0.0595703125,
0.036834716796875,
0.02630615234375,
0.03466796875,
-0.00005424022674560547,
0.015411376953125,
-0.033782958984375,
-0.01059722900390625,
0.016143798828125,
-0.0421142578125,
-0.048675537109375,
-0.0007548332214355469,
0.046630859375,
0.0017042160034179688,
-0.0022430419921875,
0.01480865478515625,
0.010528564453125,
0.032806396484375,
-0.03948974609375,
0.01480865478515625,
-0.032073974609375,
0.0034923553466796875,
-0.011016845703125,
-0.00656890869140625,
-0.0182647705078125,
-0.030609130859375,
0.01953125,
-0.055816650390625,
0.0046539306640625,
0.012359619140625,
0.1005859375,
0.0079193115234375,
-0.044219970703125,
0.00661468505859375,
-0.039093017578125,
0.07366943359375,
-0.08245849609375,
0.0257720947265625,
0.02056884765625,
0.0261383056640625,
-0.00954437255859375,
-0.08544921875,
-0.05584716796875,
-0.01482391357421875,
-0.020416259765625,
0.01739501953125,
-0.034088134765625,
0.01375579833984375,
0.03369140625,
0.021453857421875,
-0.038238525390625,
-0.007518768310546875,
-0.0222625732421875,
-0.0247802734375,
0.052032470703125,
0.0185546875,
0.034637451171875,
-0.018524169921875,
-0.0281829833984375,
-0.0219573974609375,
-0.0309600830078125,
0.01105499267578125,
0.01401519775390625,
-0.00759124755859375,
-0.036865234375,
0.0204315185546875,
-0.020843505859375,
0.04022216796875,
0.0193634033203125,
-0.0174102783203125,
0.0286865234375,
-0.03631591796875,
-0.035888671875,
-0.0305633544921875,
0.10870361328125,
0.0416259765625,
0.0028209686279296875,
0.015594482421875,
0.00016796588897705078,
-0.00800323486328125,
-0.02203369140625,
-0.0845947265625,
-0.0406494140625,
0.04010009765625,
-0.046722412109375,
-0.0211181640625,
-0.01338958740234375,
-0.04052734375,
0.004825592041015625,
0.00952911376953125,
0.053466796875,
-0.050262451171875,
-0.041259765625,
0.0032672882080078125,
-0.01666259765625,
0.023773193359375,
0.0224151611328125,
-0.05841064453125,
0.01373291015625,
0.010284423828125,
0.05743408203125,
0.00801849365234375,
-0.0210723876953125,
-0.022308349609375,
-0.011566162109375,
-0.01605224609375,
0.0291748046875,
0.0032024383544921875,
-0.0260162353515625,
-0.01519012451171875,
0.00797271728515625,
-0.0101470947265625,
-0.0264739990234375,
0.0411376953125,
-0.039337158203125,
0.039306640625,
-0.01546478271484375,
-0.040130615234375,
-0.02471923828125,
0.0080718994140625,
-0.037841796875,
0.07635498046875,
0.00847625732421875,
-0.07989501953125,
0.005340576171875,
-0.036102294921875,
-0.0024127960205078125,
0.0048675537109375,
-0.006557464599609375,
-0.041473388671875,
-0.0236053466796875,
0.0284576416015625,
0.0274200439453125,
-0.02099609375,
0.01024627685546875,
-0.0267333984375,
-0.0313720703125,
0.0335693359375,
-0.056488037109375,
0.093505859375,
0.016357421875,
-0.045654296875,
0.0274658203125,
-0.04693603515625,
0.030426025390625,
0.021240234375,
-0.0054168701171875,
-0.01036834716796875,
-0.0004024505615234375,
0.0079345703125,
0.0283355712890625,
0.031463623046875,
-0.0280609130859375,
0.009185791015625,
-0.03753662109375,
0.056549072265625,
0.04644775390625,
-0.0013704299926757812,
0.036651611328125,
-0.0222320556640625,
0.0201416015625,
0.0238037109375,
0.042266845703125,
-0.00724029541015625,
-0.039031982421875,
-0.0726318359375,
-0.0172119140625,
0.02691650390625,
0.044647216796875,
-0.082275390625,
0.0377197265625,
-0.0034542083740234375,
-0.03778076171875,
-0.051239013671875,
-0.0105743408203125,
0.03973388671875,
0.023773193359375,
0.0265655517578125,
-0.01320648193359375,
-0.03668212890625,
-0.061431884765625,
0.00311279296875,
-0.02484130859375,
-0.00910186767578125,
0.031768798828125,
0.03277587890625,
-0.01305389404296875,
0.05950927734375,
-0.034515380859375,
-0.0008635520935058594,
-0.002529144287109375,
-0.001575469970703125,
0.0174102783203125,
0.046630859375,
0.054901123046875,
-0.0640869140625,
-0.050811767578125,
-0.011383056640625,
-0.06756591796875,
0.01168060302734375,
-0.0009965896606445312,
-0.044952392578125,
0.0234527587890625,
0.01328277587890625,
-0.0540771484375,
0.04705810546875,
0.048095703125,
-0.03515625,
0.051513671875,
-0.0181427001953125,
0.0171051025390625,
-0.08087158203125,
-0.005619049072265625,
0.007427215576171875,
-0.01861572265625,
-0.040802001953125,
0.0029296875,
0.0193023681640625,
0.019561767578125,
-0.038818359375,
0.05706787109375,
-0.0367431640625,
0.0093536376953125,
-0.00821685791015625,
0.01258087158203125,
0.01319122314453125,
0.053253173828125,
-0.002895355224609375,
0.052703857421875,
0.062255859375,
-0.05029296875,
0.0411376953125,
0.031402587890625,
-0.006954193115234375,
0.020416259765625,
-0.0638427734375,
0.005550384521484375,
0.01187896728515625,
0.01056671142578125,
-0.07977294921875,
0.0012121200561523438,
0.04620361328125,
-0.054443359375,
0.0200347900390625,
-0.0166473388671875,
-0.0221405029296875,
-0.042633056640625,
-0.0303802490234375,
0.0242156982421875,
0.061370849609375,
-0.042724609375,
0.038818359375,
0.01654052734375,
0.0234527587890625,
-0.0462646484375,
-0.04107666015625,
-0.0027675628662109375,
-0.026153564453125,
-0.059661865234375,
0.03594970703125,
-0.01213836669921875,
-0.00550079345703125,
0.00583648681640625,
0.0108489990234375,
-0.0054779052734375,
-0.00518798828125,
0.0159454345703125,
0.0340576171875,
-0.0194091796875,
-0.007480621337890625,
-0.006427764892578125,
-0.007465362548828125,
0.0019207000732421875,
-0.0262451171875,
0.044219970703125,
-0.0182037353515625,
-0.0091400146484375,
-0.06915283203125,
0.008544921875,
0.03533935546875,
-0.022216796875,
0.061859130859375,
0.0733642578125,
-0.01548004150390625,
0.00481414794921875,
-0.039703369140625,
-0.023834228515625,
-0.04443359375,
0.030487060546875,
-0.025482177734375,
-0.05804443359375,
0.046478271484375,
0.02545166015625,
0.018218994140625,
0.05059814453125,
0.039398193359375,
-0.006244659423828125,
0.093994140625,
0.035675048828125,
-0.00823211669921875,
0.05072021484375,
-0.047454833984375,
0.0104827880859375,
-0.05767822265625,
0.0034923553466796875,
-0.0079193115234375,
-0.0094757080078125,
-0.05963134765625,
-0.0372314453125,
0.0313720703125,
0.0192108154296875,
-0.0389404296875,
0.0242462158203125,
-0.0413818359375,
-0.006259918212890625,
0.064453125,
0.01885986328125,
0.0081329345703125,
-0.0145263671875,
0.006954193115234375,
-0.006587982177734375,
-0.06475830078125,
-0.0256805419921875,
0.07421875,
0.0302581787109375,
0.043304443359375,
-0.003063201904296875,
0.0455322265625,
-0.00577545166015625,
0.01091766357421875,
-0.04327392578125,
0.0462646484375,
0.00907135009765625,
-0.0367431640625,
-0.035980224609375,
-0.045074462890625,
-0.06787109375,
0.038238525390625,
-0.01336669921875,
-0.06378173828125,
0.0167083740234375,
0.012725830078125,
-0.04449462890625,
0.0219268798828125,
-0.052093505859375,
0.07177734375,
-0.019317626953125,
-0.038482666015625,
-0.0014829635620117188,
-0.045318603515625,
0.040435791015625,
0.026947021484375,
0.003925323486328125,
-0.0092620849609375,
0.013519287109375,
0.06597900390625,
-0.0487060546875,
0.057464599609375,
-0.0204315185546875,
-0.0009160041809082031,
0.048309326171875,
-0.0047149658203125,
0.038330078125,
0.008209228515625,
0.0164031982421875,
0.0154571533203125,
0.0278778076171875,
-0.0340576171875,
-0.04443359375,
0.04852294921875,
-0.07159423828125,
-0.04132080078125,
-0.020355224609375,
-0.031524658203125,
0.005084991455078125,
0.0086822509765625,
0.0418701171875,
0.045074462890625,
0.013641357421875,
0.0015764236450195312,
0.0390625,
-0.038726806640625,
0.048797607421875,
0.026824951171875,
-0.026580810546875,
-0.0364990234375,
0.06475830078125,
0.006710052490234375,
0.0271453857421875,
0.008819580078125,
0.00858306884765625,
-0.0277862548828125,
-0.0297393798828125,
-0.044525146484375,
0.0221405029296875,
-0.037017822265625,
-0.0218658447265625,
-0.06103515625,
-0.038848876953125,
-0.03704833984375,
0.0223236083984375,
-0.033905029296875,
-0.018646240234375,
-0.0280303955078125,
0.008331298828125,
0.048736572265625,
0.00675201416015625,
-0.001644134521484375,
0.0303802490234375,
-0.0731201171875,
0.02703857421875,
0.03497314453125,
0.004055023193359375,
0.0200042724609375,
-0.05316162109375,
-0.02459716796875,
0.03509521484375,
-0.0299224853515625,
-0.051300048828125,
0.055023193359375,
0.01528167724609375,
0.04248046875,
0.036834716796875,
0.0160980224609375,
0.05743408203125,
-0.01126861572265625,
0.060089111328125,
0.014984130859375,
-0.0740966796875,
0.027679443359375,
-0.0274200439453125,
0.00989532470703125,
0.007381439208984375,
0.0245513916015625,
-0.037689208984375,
-0.02484130859375,
-0.059234619140625,
-0.0626220703125,
0.054107666015625,
0.039398193359375,
0.01495361328125,
0.0040740966796875,
0.017852783203125,
-0.0335693359375,
0.0117340087890625,
-0.060272216796875,
-0.0462646484375,
-0.019744873046875,
-0.01082611083984375,
0.02203369140625,
-0.0181121826171875,
0.001644134521484375,
-0.042144775390625,
0.051605224609375,
-0.005084991455078125,
0.05413818359375,
0.022613525390625,
0.00263214111328125,
-0.0025882720947265625,
-0.0122222900390625,
0.0219268798828125,
0.03436279296875,
-0.01190948486328125,
-0.012420654296875,
0.0189361572265625,
-0.031494140625,
-0.002971649169921875,
0.0018987655639648438,
-0.0143585205078125,
0.0009965896606445312,
0.0208740234375,
0.08447265625,
0.0126190185546875,
-0.031280517578125,
0.0452880859375,
-0.02703857421875,
-0.031494140625,
-0.0223388671875,
0.013671875,
0.0207672119140625,
0.04815673828125,
0.0293121337890625,
-0.017059326171875,
0.00940704345703125,
-0.038482666015625,
0.011810302734375,
0.037109375,
-0.0127716064453125,
-0.0214080810546875,
0.07415771484375,
0.008819580078125,
-0.00412750244140625,
0.05792236328125,
-0.030975341796875,
-0.0531005859375,
0.054443359375,
0.03363037109375,
0.0526123046875,
-0.01319122314453125,
0.01517486572265625,
0.05535888671875,
-0.0008487701416015625,
-0.0075531005859375,
0.0265655517578125,
0.0016260147094726562,
-0.06427001953125,
-0.0313720703125,
-0.040191650390625,
-0.01270294189453125,
0.0010547637939453125,
-0.044891357421875,
0.0126800537109375,
-0.017852783203125,
-0.034454345703125,
-0.005649566650390625,
0.01117706298828125,
-0.051666259765625,
0.0218048095703125,
0.004299163818359375,
0.049285888671875,
-0.036376953125,
0.0821533203125,
0.0200958251953125,
-0.0284423828125,
-0.0723876953125,
-0.0134124755859375,
-0.00983428955078125,
-0.0595703125,
0.04669189453125,
-0.0016107559204101562,
0.01006317138671875,
0.028533935546875,
-0.055816650390625,
-0.074462890625,
0.098388671875,
0.0019464492797851562,
-0.0452880859375,
-0.0133819580078125,
-0.016937255859375,
0.03515625,
-0.0013780593872070312,
0.0540771484375,
0.03619384765625,
0.0240325927734375,
0.0185089111328125,
-0.0821533203125,
0.0158233642578125,
-0.028961181640625,
0.0019178390502929688,
0.01187896728515625,
-0.07452392578125,
0.09259033203125,
-0.01397705078125,
-0.029266357421875,
0.0134124755859375,
0.07232666015625,
0.0190582275390625,
0.00646209716796875,
0.0270233154296875,
0.0302734375,
0.0419921875,
-0.0133056640625,
0.06463623046875,
-0.038604736328125,
0.0537109375,
0.052459716796875,
0.0064697265625,
0.05523681640625,
0.013031005859375,
-0.04327392578125,
0.025848388671875,
0.0467529296875,
-0.0073089599609375,
0.03741455078125,
0.0027179718017578125,
-0.02606201171875,
-0.0198516845703125,
0.02093505859375,
-0.03509521484375,
0.01995849609375,
0.0302734375,
-0.01047515869140625,
-0.0017271041870117188,
0.0093536376953125,
-0.0006127357482910156,
-0.03338623046875,
-0.00951385498046875,
0.044036865234375,
0.01568603515625,
-0.0278472900390625,
0.077880859375,
0.009613037109375,
0.08660888671875,
-0.036865234375,
-0.005229949951171875,
-0.0127716064453125,
0.0074920654296875,
-0.0116119384765625,
-0.041717529296875,
0.0035037994384765625,
-0.026031494140625,
0.01508331298828125,
0.0144195556640625,
0.0584716796875,
-0.038848876953125,
-0.0231170654296875,
0.024261474609375,
0.034759521484375,
0.012939453125,
-0.018463134765625,
-0.0716552734375,
0.0053558349609375,
0.0147247314453125,
-0.0462646484375,
0.027069091796875,
0.0390625,
-0.0054931640625,
0.05889892578125,
0.048919677734375,
-0.014678955078125,
0.0110626220703125,
-0.01078033447265625,
0.0682373046875,
-0.0540771484375,
-0.0279083251953125,
-0.0755615234375,
0.04827880859375,
-0.0175018310546875,
-0.0300750732421875,
0.07427978515625,
0.0249786376953125,
0.05303955078125,
0.0128326416015625,
0.05712890625,
-0.0221099853515625,
0.0163116455078125,
-0.0297088623046875,
0.0609130859375,
-0.036224365234375,
-0.00035309791564941406,
-0.0181427001953125,
-0.04132080078125,
0.0030040740966796875,
0.06231689453125,
-0.0208740234375,
0.016998291015625,
0.044952392578125,
0.06353759765625,
0.0063934326171875,
-0.01043701171875,
0.039947509765625,
0.0291748046875,
0.0194091796875,
0.07080078125,
0.052459716796875,
-0.0758056640625,
0.043792724609375,
-0.045318603515625,
-0.01021575927734375,
-0.020904541015625,
-0.045745849609375,
-0.08258056640625,
-0.0498046875,
-0.036651611328125,
-0.048736572265625,
-0.009033203125,
0.0609130859375,
0.05743408203125,
-0.045135498046875,
-0.01256561279296875,
0.00565338134765625,
-0.0008387565612792969,
-0.017608642578125,
-0.0223236083984375,
0.04425048828125,
-0.0037174224853515625,
-0.07623291015625,
-0.0054168701171875,
0.0034275054931640625,
0.0257568359375,
-0.0185699462890625,
-0.0127410888671875,
-0.007755279541015625,
-0.004955291748046875,
0.04107666015625,
0.0261688232421875,
-0.049102783203125,
-0.01280975341796875,
0.00925445556640625,
-0.028411865234375,
0.01297760009765625,
0.01275634765625,
-0.054443359375,
0.01654052734375,
0.036529541015625,
0.0034770965576171875,
0.055572509765625,
-0.0054931640625,
0.040252685546875,
-0.025238037109375,
0.0296478271484375,
0.0080718994140625,
0.0239105224609375,
0.008453369140625,
-0.0278167724609375,
0.018707275390625,
0.0184478759765625,
-0.045074462890625,
-0.052825927734375,
-0.008270263671875,
-0.0692138671875,
-0.0096588134765625,
0.08447265625,
-0.0286712646484375,
-0.03704833984375,
0.004703521728515625,
-0.0438232421875,
0.041046142578125,
-0.0207977294921875,
0.0413818359375,
0.0313720703125,
0.0030002593994140625,
-0.034423828125,
-0.05670166015625,
0.048248291015625,
0.02032470703125,
-0.044891357421875,
-0.01053619384765625,
0.0240325927734375,
0.0258331298828125,
-0.006351470947265625,
0.06939697265625,
0.0005354881286621094,
0.0297088623046875,
0.01076507568359375,
0.022369384765625,
-0.0025787353515625,
0.0115203857421875,
-0.00347900390625,
-0.01654052734375,
-0.005634307861328125,
-0.038909912109375
]
] |
lvwerra/distilbert-imdb | 2023-01-25T09:25:22.000Z | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:imdb",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | lvwerra | null | null | lvwerra/distilbert-imdb | 11 | 27,660 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imdb
metrics:
- accuracy
model-index:
- name: distilbert-imdb
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: imdb
type: imdb
args: plain_text
metrics:
- name: Accuracy
type: accuracy
value: 0.928
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-imdb
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the imdb dataset (training notebook is [here](https://huggingface.co/lvwerra/distilbert-imdb/blob/main/distilbert-imdb-training.ipynb)).
It achieves the following results on the evaluation set:
- Loss: 0.1903
- Accuracy: 0.928
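A minimal usage sketch with the standard `pipeline` API (this is generic `transformers` usage, not taken from the training notebook; the printed label/score is illustrative):
```python
from transformers import pipeline

# Load the fine-tuned checkpoint for sentiment classification.
clf = pipeline("text-classification", model="lvwerra/distilbert-imdb")
print(clf("This movie was a delightful surprise from start to finish."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]  (illustrative output)
```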
## Model description

A [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) checkpoint fine-tuned for binary sentiment classification (positive vs. negative) of movie reviews.

## Intended uses & limitations

Intended for sentiment classification of English, movie-review-style text; behavior on other domains has not been evaluated here.

## Training and evaluation data

Trained and evaluated on the [imdb](https://huggingface.co/datasets/imdb) dataset (`plain_text` config).
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
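A hedged sketch reconstructing these hyperparameters with the `Trainer` API; the dataset wiring is schematic, and the linked training notebook is the authoritative record of the actual run:
```python
from transformers import AutoModelForSequenceClassification, TrainingArguments

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
args = TrainingArguments(
    output_dir="distilbert-imdb",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",  # Adam betas/epsilon listed above are the defaults
    num_train_epochs=1,
)
# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
# trainer.train()
```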
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2195 | 1.0 | 1563 | 0.1903 | 0.928 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
| 1,685 | [
[
-0.03533935546875,
-0.044189453125,
0.011993408203125,
0.00988006591796875,
-0.02838134765625,
-0.01044464111328125,
-0.0005483627319335938,
-0.0035877227783203125,
0.0140533447265625,
0.01934814453125,
-0.054962158203125,
-0.036529541015625,
-0.0654296875,
-0.00588226318359375,
-0.032196044921875,
0.09710693359375,
0.0032138824462890625,
0.033721923828125,
-0.015716552734375,
0.0004649162292480469,
-0.019378662109375,
-0.0538330078125,
-0.0399169921875,
-0.038665771484375,
0.019622802734375,
0.0186614990234375,
0.057525634765625,
0.067626953125,
0.05718994140625,
0.018402099609375,
-0.031951904296875,
-0.002681732177734375,
-0.051422119140625,
-0.034515380859375,
-0.0162811279296875,
-0.0187530517578125,
-0.05487060546875,
0.004741668701171875,
0.056121826171875,
0.03851318359375,
-0.0251007080078125,
0.0413818359375,
0.012908935546875,
0.04815673828125,
-0.037933349609375,
0.026458740234375,
-0.04266357421875,
0.01373291015625,
-0.0208587646484375,
-0.01293182373046875,
-0.0159912109375,
0.004711151123046875,
-0.0020313262939453125,
-0.0321044921875,
0.043182373046875,
0.006977081298828125,
0.086181640625,
0.03619384765625,
-0.02130126953125,
0.0033092498779296875,
-0.06378173828125,
0.039093017578125,
-0.049468994140625,
0.00986480712890625,
0.0295867919921875,
0.03729248046875,
-0.00646209716796875,
-0.048126220703125,
-0.042266845703125,
-0.018463134765625,
-0.006244659423828125,
0.00218963623046875,
-0.01274871826171875,
0.01105499267578125,
0.07415771484375,
0.048370361328125,
-0.0311279296875,
0.0142364501953125,
-0.0543212890625,
-0.0154876708984375,
0.03936767578125,
0.03271484375,
-0.0278472900390625,
-0.00989532470703125,
-0.03350830078125,
-0.0214385986328125,
-0.016754150390625,
0.01526641845703125,
0.04412841796875,
0.00421905517578125,
-0.019500732421875,
0.05078125,
-0.0260772705078125,
0.04718017578125,
0.0237274169921875,
-0.0158843994140625,
0.033599853515625,
0.004581451416015625,
-0.026824951171875,
0.0124664306640625,
0.05120849609375,
0.057525634765625,
0.02276611328125,
0.017242431640625,
-0.0173492431640625,
-0.002979278564453125,
0.0235595703125,
-0.07763671875,
-0.030792236328125,
0.0123748779296875,
-0.0323486328125,
-0.043121337890625,
0.0169219970703125,
-0.038238525390625,
-0.00499725341796875,
-0.0286102294921875,
0.0308380126953125,
-0.030670166015625,
-0.01396942138671875,
-0.00444793701171875,
-0.01267242431640625,
0.01407623291015625,
0.0141754150390625,
-0.0732421875,
0.022979736328125,
0.0216827392578125,
0.042449951171875,
0.01146697998046875,
-0.03216552734375,
-0.0174407958984375,
0.005764007568359375,
-0.01070404052734375,
0.0226593017578125,
0.00027251243591308594,
-0.034759521484375,
-0.015838623046875,
0.0160369873046875,
0.0025844573974609375,
-0.04150390625,
0.0631103515625,
-0.0159454345703125,
0.0019283294677734375,
-0.01348114013671875,
-0.039581298828125,
-0.0100555419921875,
0.0307464599609375,
-0.059112548828125,
0.0806884765625,
0.01438140869140625,
-0.061187744140625,
0.04290771484375,
-0.034088134765625,
-0.006061553955078125,
-0.01049041748046875,
-0.00846099853515625,
-0.056121826171875,
0.005092620849609375,
0.013336181640625,
0.03814697265625,
-0.029205322265625,
0.0421142578125,
-0.0340576171875,
-0.050079345703125,
0.005199432373046875,
-0.044769287109375,
0.0611572265625,
0.0162200927734375,
-0.031036376953125,
-0.01531982421875,
-0.09197998046875,
0.0038547515869140625,
0.025299072265625,
-0.034637451171875,
0.00333404541015625,
-0.02716064453125,
0.018798828125,
0.01485443115234375,
0.0190887451171875,
-0.03985595703125,
0.0095062255859375,
-0.007648468017578125,
0.01056671142578125,
0.04595947265625,
-0.000025391578674316406,
0.00110626220703125,
-0.03179931640625,
0.01617431640625,
0.0380859375,
0.0338134765625,
0.00772857666015625,
-0.0161590576171875,
-0.06591796875,
-0.0211944580078125,
0.0177764892578125,
0.027587890625,
-0.017181396484375,
0.047637939453125,
-0.0081329345703125,
-0.05084228515625,
-0.02215576171875,
0.0083465576171875,
0.034698486328125,
0.060302734375,
0.034088134765625,
-0.016143798828125,
-0.037445068359375,
-0.0919189453125,
0.005603790283203125,
-0.0146484375,
0.0195159912109375,
-0.00298309326171875,
0.042266845703125,
-0.0171661376953125,
0.056884765625,
-0.041168212890625,
-0.011199951171875,
-0.004116058349609375,
-0.00420379638671875,
0.051177978515625,
0.060394287109375,
0.05908203125,
-0.03265380859375,
-0.0231475830078125,
-0.02630615234375,
-0.0670166015625,
0.02593994140625,
-0.00395965576171875,
-0.0193634033203125,
-0.0242767333984375,
0.0226287841796875,
-0.034576416015625,
0.05743408203125,
0.01457977294921875,
-0.013519287109375,
0.048828125,
-0.031951904296875,
0.006336212158203125,
-0.08673095703125,
0.01070404052734375,
0.0225677490234375,
-0.006565093994140625,
-0.0179595947265625,
-0.0167388916015625,
0.01068115234375,
-0.00707244873046875,
-0.033294677734375,
0.032440185546875,
-0.013275146484375,
0.01690673828125,
-0.011871337890625,
-0.034576416015625,
0.0149078369140625,
0.0618896484375,
0.0035190582275390625,
0.028167724609375,
0.05364990234375,
-0.044830322265625,
0.042205810546875,
0.035858154296875,
-0.0306243896484375,
0.0489501953125,
-0.06658935546875,
-0.0018739700317382812,
-0.0251617431640625,
-0.0001882314682006836,
-0.052947998046875,
-0.0127105712890625,
0.023712158203125,
-0.0178375244140625,
0.037750244140625,
-0.037628173828125,
-0.025299072265625,
-0.031982421875,
-0.00689697265625,
0.01435089111328125,
0.040618896484375,
-0.043426513671875,
0.01442718505859375,
0.001934051513671875,
0.01317596435546875,
-0.062225341796875,
-0.053619384765625,
-0.0221710205078125,
-0.0255889892578125,
-0.025634765625,
0.0293121337890625,
-0.00469207763671875,
-0.01393890380859375,
-0.005733489990234375,
-0.0123138427734375,
-0.028472900390625,
-0.005939483642578125,
0.03656005859375,
0.040252685546875,
-0.004138946533203125,
-0.0154571533203125,
0.0160980224609375,
-0.01407623291015625,
0.0157470703125,
0.01165771484375,
0.0192108154296875,
-0.01087188720703125,
-0.0242156982421875,
-0.04974365234375,
0.0029315948486328125,
0.0523681640625,
-0.00296783447265625,
0.0653076171875,
0.05712890625,
-0.0386962890625,
0.0038089752197265625,
-0.036407470703125,
-0.01435089111328125,
-0.0308837890625,
0.047119140625,
-0.036773681640625,
-0.00843048095703125,
0.0411376953125,
-0.0033702850341796875,
0.0072784423828125,
0.07373046875,
0.038726806640625,
-0.01904296875,
0.07080078125,
0.023040771484375,
-0.0040435791015625,
0.02838134765625,
-0.07135009765625,
-0.0164947509765625,
-0.059661865234375,
-0.034423828125,
-0.030792236328125,
-0.0293121337890625,
-0.041290283203125,
-0.002162933349609375,
0.0195465087890625,
0.032562255859375,
-0.04583740234375,
0.035980224609375,
-0.042724609375,
0.033935546875,
0.0531005859375,
0.026397705078125,
0.011962890625,
0.0116119384765625,
-0.0270233154296875,
-0.0057525634765625,
-0.0447998046875,
-0.038330078125,
0.09381103515625,
0.0357666015625,
0.08172607421875,
-0.00568389892578125,
0.051483154296875,
0.0187225341796875,
0.00536346435546875,
-0.02716064453125,
0.024627685546875,
0.0021686553955078125,
-0.07684326171875,
-0.01287841796875,
-0.0194549560546875,
-0.0300445556640625,
0.01241302490234375,
-0.0325927734375,
-0.045989990234375,
0.0253448486328125,
0.0269927978515625,
-0.0240936279296875,
0.0257568359375,
-0.040283203125,
0.07867431640625,
-0.033660888671875,
-0.041107177734375,
-0.0173187255859375,
-0.038970947265625,
0.01580810546875,
-0.0006451606750488281,
-0.021209716796875,
-0.01384735107421875,
0.03741455078125,
0.054290771484375,
-0.05328369140625,
0.04266357421875,
-0.03680419921875,
0.03619384765625,
0.03179931640625,
-0.012908935546875,
0.0494384765625,
0.031341552734375,
-0.0213623046875,
0.03167724609375,
0.01038360595703125,
-0.034149169921875,
-0.0304718017578125,
0.06005859375,
-0.07611083984375,
-0.0198211669921875,
-0.05841064453125,
-0.02740478515625,
-0.0019445419311523438,
0.0065460205078125,
0.049652099609375,
0.061676025390625,
-0.022125244140625,
0.016021728515625,
0.03448486328125,
0.00769805908203125,
0.0280303955078125,
0.0216217041015625,
0.0018873214721679688,
-0.032562255859375,
0.043060302734375,
0.0016422271728515625,
0.0112762451171875,
0.00647735595703125,
-0.0011816024780273438,
-0.043792724609375,
-0.05194091796875,
-0.054656982421875,
0.0159149169921875,
-0.06671142578125,
-0.01001739501953125,
-0.02886962890625,
-0.036102294921875,
-0.0128173828125,
0.01476287841796875,
-0.033050537109375,
-0.032073974609375,
-0.040740966796875,
-0.029083251953125,
0.035736083984375,
0.032989501953125,
0.005031585693359375,
0.041778564453125,
-0.04852294921875,
-0.0016698837280273438,
0.00640869140625,
0.02606201171875,
-0.00820159912109375,
-0.07061767578125,
-0.03875732421875,
0.012725830078125,
-0.043243408203125,
-0.039459228515625,
0.024658203125,
0.0149993896484375,
0.0599365234375,
0.031494140625,
-0.002063751220703125,
0.0762939453125,
-0.02734375,
0.040496826171875,
0.026580810546875,
-0.039642333984375,
0.03253173828125,
-0.00928497314453125,
0.0109710693359375,
0.0677490234375,
0.04901123046875,
0.00678253173828125,
0.003940582275390625,
-0.0755615234375,
-0.044158935546875,
0.07305908203125,
0.0287017822265625,
0.004230499267578125,
0.01174163818359375,
0.028411865234375,
-0.00077056884765625,
0.030731201171875,
-0.059539794921875,
-0.04443359375,
-0.027496337890625,
-0.006526947021484375,
-0.0123291015625,
-0.0261077880859375,
-0.01358795166015625,
-0.052947998046875,
0.0732421875,
0.00421142578125,
0.0133209228515625,
0.01152801513671875,
-0.005001068115234375,
-0.0040130615234375,
-0.013427734375,
0.02764892578125,
0.0526123046875,
-0.06744384765625,
-0.010986328125,
0.01239776611328125,
-0.041717529296875,
0.00972747802734375,
0.0259552001953125,
0.00597381591796875,
0.021240234375,
0.0175628662109375,
0.0870361328125,
-0.002674102783203125,
-0.0240325927734375,
0.0322265625,
-0.00762939453125,
-0.0203857421875,
-0.044525146484375,
0.00936126708984375,
-0.0145721435546875,
0.021240234375,
0.021209716796875,
0.0303192138671875,
0.0103759765625,
-0.026031494140625,
0.015869140625,
0.01007843017578125,
-0.039306640625,
-0.0110321044921875,
0.06524658203125,
0.002513885498046875,
-0.01358795166015625,
0.07073974609375,
-0.01093292236328125,
-0.0033111572265625,
0.059844970703125,
0.027496337890625,
0.061065673828125,
0.000690460205078125,
-0.0096588134765625,
0.06884765625,
0.00901031494140625,
-0.0130767822265625,
0.016754150390625,
0.007904052734375,
-0.03387451171875,
-0.0091552734375,
-0.07086181640625,
-0.0234832763671875,
0.0307769775390625,
-0.0811767578125,
0.048431396484375,
-0.049896240234375,
-0.037872314453125,
0.0284423828125,
0.0001976490020751953,
-0.07122802734375,
0.039093017578125,
0.005176544189453125,
0.07244873046875,
-0.0626220703125,
0.07855224609375,
0.043121337890625,
-0.046600341796875,
-0.06463623046875,
-0.0240936279296875,
-0.00780487060546875,
-0.053253173828125,
0.055389404296875,
0.006656646728515625,
0.0245361328125,
0.007747650146484375,
-0.0254669189453125,
-0.057586669921875,
0.08795166015625,
0.03692626953125,
-0.07275390625,
-0.00769805908203125,
0.017730712890625,
0.0413818359375,
-0.0145721435546875,
0.060089111328125,
0.026702880859375,
0.0079498291015625,
0.029876708984375,
-0.0726318359375,
-0.0214691162109375,
-0.02606201171875,
0.00418853759765625,
0.01059722900390625,
-0.042144775390625,
0.08392333984375,
-0.0150146484375,
0.0264739990234375,
0.00408935546875,
0.032073974609375,
0.0201416015625,
0.0225372314453125,
0.0328369140625,
0.0631103515625,
0.036346435546875,
-0.012725830078125,
0.06689453125,
-0.035430908203125,
0.056121826171875,
0.08526611328125,
-0.0033016204833984375,
0.0242156982421875,
0.036285400390625,
-0.02825927734375,
0.0302734375,
0.059417724609375,
-0.030731201171875,
0.046661376953125,
0.018157958984375,
0.0025386810302734375,
-0.01168060302734375,
0.0167083740234375,
-0.043243408203125,
0.0306549072265625,
0.00667572021484375,
-0.057281494140625,
-0.026397705078125,
-0.0125274658203125,
-0.0078277587890625,
-0.01044464111328125,
-0.026702880859375,
0.03729248046875,
-0.01456451416015625,
-0.032745361328125,
0.07733154296875,
0.0029773712158203125,
0.0275421142578125,
-0.04443359375,
-0.01800537109375,
-0.0107574462890625,
0.036285400390625,
-0.0217742919921875,
-0.04022216796875,
0.0172882080078125,
0.007354736328125,
-0.029632568359375,
0.0096282958984375,
0.0157318115234375,
-0.026641845703125,
-0.06689453125,
0.0007963180541992188,
0.0187225341796875,
0.0257415771484375,
-0.00506591796875,
-0.081787109375,
-0.00032448768615722656,
0.0037097930908203125,
-0.0145416259765625,
0.0158843994140625,
0.01378631591796875,
0.0007176399230957031,
0.029144287109375,
0.046112060546875,
-0.006683349609375,
0.00479888916015625,
0.005474090576171875,
0.0748291015625,
-0.04412841796875,
-0.030181884765625,
-0.072021484375,
0.05316162109375,
-0.0280303955078125,
-0.05621337890625,
0.04443359375,
0.076171875,
0.06951904296875,
-0.0286102294921875,
0.03656005859375,
-0.0070648193359375,
0.0295257568359375,
-0.0306396484375,
0.043792724609375,
-0.0311279296875,
0.0048370361328125,
-0.036041259765625,
-0.08514404296875,
0.0135345458984375,
0.050262451171875,
-0.01078033447265625,
0.00513458251953125,
0.040985107421875,
0.060028076171875,
-0.01861572265625,
-0.00875091552734375,
0.0223236083984375,
-0.001354217529296875,
0.01666259765625,
0.0296783447265625,
0.0457763671875,
-0.06488037109375,
0.0302734375,
-0.0609130859375,
-0.0248260498046875,
-0.016754150390625,
-0.05975341796875,
-0.07342529296875,
-0.035858154296875,
-0.037933349609375,
-0.036102294921875,
-0.0023937225341796875,
0.0797119140625,
0.062255859375,
-0.05328369140625,
-0.0274505615234375,
0.0065765380859375,
-0.02703857421875,
-0.026153564453125,
-0.0121917724609375,
0.0244140625,
0.01236724853515625,
-0.07598876953125,
0.0028209686279296875,
-0.0124359130859375,
0.026947021484375,
-0.0220947265625,
-0.01555633544921875,
-0.02001953125,
-0.030548095703125,
0.00780487060546875,
0.0026950836181640625,
-0.0295867919921875,
-0.003997802734375,
-0.013397216796875,
-0.004695892333984375,
0.00884246826171875,
0.020751953125,
-0.037200927734375,
0.043060302734375,
0.0107269287109375,
0.014129638671875,
0.073974609375,
0.006984710693359375,
0.0145111083984375,
-0.0526123046875,
0.04241943359375,
0.0202178955078125,
0.046600341796875,
0.011688232421875,
-0.036285400390625,
0.03851318359375,
0.0303192138671875,
-0.038421630859375,
-0.065185546875,
-0.0217742919921875,
-0.0855712890625,
0.020751953125,
0.07440185546875,
0.00015246868133544922,
-0.0239715576171875,
0.023284912109375,
-0.0269317626953125,
0.025421142578125,
-0.0247955322265625,
0.038330078125,
0.054168701171875,
-0.0012111663818359375,
0.01165771484375,
-0.03167724609375,
0.033966064453125,
0.00930023193359375,
-0.0294647216796875,
-0.015899658203125,
0.03350830078125,
0.03680419921875,
0.006626129150390625,
0.0240936279296875,
-0.0159149169921875,
0.00919342041015625,
0.01395416259765625,
0.0206756591796875,
-0.0423583984375,
-0.0273895263671875,
-0.020416259765625,
-0.01354217529296875,
0.0205535888671875,
-0.039520263671875
]
] |
tals/albert-xlarge-vitaminc-mnli | 2023-03-17T05:27:53.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"albert",
"text-classification",
"dataset:glue",
"dataset:multi_nli",
"dataset:tals/vitaminc",
"endpoints_compatible",
"region:us"
] | text-classification | tals | null | null | tals/albert-xlarge-vitaminc-mnli | 2 | 27,647 | transformers | 2022-03-02T23:29:05 | ---
datasets:
- glue
- multi_nli
- tals/vitaminc
---
# Details
Model used in [Get Your Vitamin C! Robust Fact Verification with Contrastive Evidence](https://aclanthology.org/2021.naacl-main.52/) (Schuster et al., NAACL 2021).
For more details see: https://github.com/TalSchuster/VitaminC
When using this model, please cite the paper.
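A minimal usage sketch for classifying a claim against evidence (the claim/evidence strings are made-up examples; the label mapping is read from the model config at runtime rather than assumed):
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "tals/albert-xlarge-vitaminc-mnli"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

evidence = "Over 100,000 Wikipedia revisions that modify an underlying fact were collected."
claim = "VitaminC is built from Wikipedia revisions."

# Encode as an (evidence, claim) pair, NLI-style.
inputs = tokenizer(evidence, claim, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(dim=-1).item()])
```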
# BibTeX entry and citation info
```bibtex
@inproceedings{schuster-etal-2021-get,
title = "Get Your Vitamin {C}! Robust Fact Verification with Contrastive Evidence",
author = "Schuster, Tal and
Fisch, Adam and
Barzilay, Regina",
booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.naacl-main.52",
doi = "10.18653/v1/2021.naacl-main.52",
pages = "624--643",
abstract = "Typical fact verification models use retrieved written evidence to verify claims. Evidence sources, however, often change over time as more information is gathered and revised. In order to adapt, models must be sensitive to subtle differences in supporting evidence. We present VitaminC, a benchmark infused with challenging cases that require fact verification models to discern and adjust to slight factual changes. We collect over 100,000 Wikipedia revisions that modify an underlying fact, and leverage these revisions, together with additional synthetically constructed ones, to create a total of over 400,000 claim-evidence pairs. Unlike previous resources, the examples in VitaminC are contrastive, i.e., they contain evidence pairs that are nearly identical in language and content, with the exception that one supports a given claim while the other does not. We show that training using this design increases robustness{---}improving accuracy by 10{\%} on adversarial fact verification and 6{\%} on adversarial natural language inference (NLI). Moreover, the structure of VitaminC leads us to define additional tasks for fact-checking resources: tagging relevant words in the evidence for verifying the claim, identifying factual revisions, and providing automatic edits via factually consistent text generation.",
}
```
| 2,344 | [
[
-0.002689361572265625,
-0.066650390625,
0.0220184326171875,
0.0001399517059326172,
-0.00734710693359375,
0.01324462890625,
-0.0159759521484375,
-0.052459716796875,
0.01424407958984375,
0.00348663330078125,
-0.02679443359375,
-0.04534912109375,
-0.036529541015625,
0.03363037109375,
-0.0235137939453125,
0.076416015625,
0.01544952392578125,
0.027069091796875,
-0.0321044921875,
-0.01474761962890625,
0.00026988983154296875,
-0.033172607421875,
-0.0248565673828125,
-0.0208282470703125,
0.039764404296875,
0.02587890625,
0.0611572265625,
0.061676025390625,
0.049560546875,
0.0211181640625,
-0.0221405029296875,
0.04150390625,
-0.01995849609375,
-0.0019044876098632812,
-0.0340576171875,
-0.03948974609375,
-0.04327392578125,
0.0096282958984375,
0.033966064453125,
0.052276611328125,
-0.004726409912109375,
0.0200958251953125,
-0.003570556640625,
0.0400390625,
-0.0704345703125,
0.01812744140625,
-0.05999755859375,
-0.0133514404296875,
-0.006317138671875,
0.010345458984375,
-0.041168212890625,
-0.0215911865234375,
0.007640838623046875,
-0.041046142578125,
0.044677734375,
-0.0029811859130859375,
0.10546875,
0.006427764892578125,
-0.044586181640625,
-0.00954437255859375,
-0.031982421875,
0.037567138671875,
-0.052459716796875,
0.039520263671875,
0.024322509765625,
0.003173828125,
-0.00803375244140625,
-0.059600830078125,
-0.06512451171875,
-0.040313720703125,
-0.00437164306640625,
0.0061492919921875,
-0.003803253173828125,
-0.0164031982421875,
0.026885986328125,
0.039947509765625,
-0.03802490234375,
-0.0176849365234375,
-0.0290374755859375,
-0.02142333984375,
0.03546142578125,
-0.0091094970703125,
0.025726318359375,
-0.02960205078125,
-0.03472900390625,
-0.0165252685546875,
-0.06597900390625,
0.00525665283203125,
0.00794219970703125,
0.032073974609375,
-0.0204620361328125,
0.051116943359375,
-0.0220489501953125,
0.034942626953125,
0.014404296875,
0.0014934539794921875,
0.03656005859375,
-0.060943603515625,
-0.01461029052734375,
0.0003800392150878906,
0.05792236328125,
0.03131103515625,
-0.0098114013671875,
-0.023956298828125,
0.01519775390625,
0.01702880859375,
0.0145721435546875,
-0.06243896484375,
-0.04400634765625,
0.003326416015625,
-0.0111846923828125,
-0.036712646484375,
0.0176239013671875,
-0.050811767578125,
-0.024078369140625,
-0.0079498291015625,
0.01340484619140625,
-0.0240631103515625,
-0.012451171875,
-0.01224517822265625,
-0.00040793418884277344,
0.021636962890625,
0.00926971435546875,
-0.0517578125,
0.018463134765625,
0.03253173828125,
0.07415771484375,
-0.02020263671875,
-0.00269317626953125,
-0.041259765625,
-0.00661468505859375,
-0.038330078125,
0.08319091796875,
-0.03240966796875,
-0.004573822021484375,
-0.0019063949584960938,
0.013458251953125,
0.004547119140625,
-0.037841796875,
0.049102783203125,
-0.05145263671875,
0.01025390625,
-0.056854248046875,
-0.0234375,
-0.00937652587890625,
0.045684814453125,
-0.05975341796875,
0.0771484375,
-0.010345458984375,
-0.08819580078125,
0.043212890625,
-0.036895751953125,
-0.0396728515625,
-0.01071929931640625,
-0.00031566619873046875,
-0.048065185546875,
-0.0041961669921875,
0.012847900390625,
-0.007106781005859375,
-0.0418701171875,
0.035003662109375,
-0.046234130859375,
0.01117706298828125,
0.053131103515625,
-0.0270538330078125,
0.0770263671875,
0.0191650390625,
-0.03265380859375,
-0.001392364501953125,
-0.0826416015625,
-0.0020751953125,
0.0113983154296875,
-0.047576904296875,
-0.006496429443359375,
-0.0011615753173828125,
-0.0128631591796875,
0.0247344970703125,
0.008697509765625,
-0.048431396484375,
0.00540924072265625,
-0.0274505615234375,
0.00914764404296875,
0.038177490234375,
0.007213592529296875,
0.01116943359375,
-0.0211639404296875,
0.01995849609375,
0.0160369873046875,
0.0138702392578125,
0.02032470703125,
-0.037078857421875,
-0.059844970703125,
-0.0225982666015625,
0.061370849609375,
0.041900634765625,
-0.07049560546875,
0.0283966064453125,
-0.025482177734375,
-0.0225372314453125,
-0.0230560302734375,
0.01232147216796875,
0.03179931640625,
0.06292724609375,
0.058685302734375,
-0.0284271240234375,
-0.040435791015625,
-0.06561279296875,
-0.036712646484375,
-0.03143310546875,
0.00804901123046875,
0.015289306640625,
0.040008544921875,
0.006275177001953125,
0.06744384765625,
-0.0311126708984375,
0.006946563720703125,
-0.01538848876953125,
-0.00684356689453125,
0.01213836669921875,
0.036773681640625,
0.034637451171875,
-0.0770263671875,
-0.040313720703125,
-0.0174560546875,
-0.06396484375,
-0.0058441162109375,
0.0023288726806640625,
-0.01357269287109375,
-0.004604339599609375,
0.03228759765625,
-0.046875,
0.036041259765625,
0.0265350341796875,
-0.030303955078125,
0.03314208984375,
-0.00626373291015625,
0.016448974609375,
-0.0789794921875,
0.026824951171875,
0.0101776123046875,
-0.032440185546875,
-0.04412841796875,
0.024566650390625,
0.0176849365234375,
0.0169525146484375,
-0.038665771484375,
0.0243072509765625,
-0.04071044921875,
0.028564453125,
-0.008331298828125,
0.0118255615234375,
0.00710296630859375,
0.02374267578125,
-0.0160064697265625,
0.0435791015625,
0.035919189453125,
-0.061431884765625,
0.03253173828125,
0.0167236328125,
-0.0225372314453125,
0.051544189453125,
-0.04840087890625,
-0.004180908203125,
0.0029773712158203125,
0.0225677490234375,
-0.056854248046875,
-0.025665283203125,
0.0369873046875,
-0.033233642578125,
0.028350830078125,
0.0037899017333984375,
-0.039398193359375,
-0.01442718505859375,
-0.033966064453125,
0.0164031982421875,
0.00710296630859375,
-0.0579833984375,
0.049652099609375,
0.0230712890625,
0.0081329345703125,
-0.059844970703125,
-0.0714111328125,
0.0194549560546875,
0.004863739013671875,
-0.03936767578125,
0.0296783447265625,
-0.031280517578125,
-0.026458740234375,
0.0150299072265625,
-0.032867431640625,
-0.0058441162109375,
0.01025390625,
0.005207061767578125,
0.030181884765625,
-0.028228759765625,
0.0117340087890625,
-0.00814056396484375,
0.00801849365234375,
0.003467559814453125,
-0.01288604736328125,
0.00676727294921875,
-0.00457000732421875,
-0.01381683349609375,
-0.031585693359375,
0.0121002197265625,
0.0279693603515625,
-0.026611328125,
0.06927490234375,
0.0439453125,
-0.0310516357421875,
0.004425048828125,
-0.04345703125,
-0.01045989990234375,
-0.031158447265625,
0.0231475830078125,
-0.00841522216796875,
-0.0258026123046875,
0.0292510986328125,
-0.0004649162292480469,
-0.0067596435546875,
0.05908203125,
0.0389404296875,
0.0227203369140625,
0.06121826171875,
0.036163330078125,
0.0172882080078125,
0.036041259765625,
-0.022552490234375,
0.005950927734375,
-0.05462646484375,
-0.0447998046875,
-0.044403076171875,
-0.0097503662109375,
-0.0399169921875,
-0.0330810546875,
-0.0007953643798828125,
-0.0023021697998046875,
-0.0168914794921875,
0.023895263671875,
-0.01806640625,
0.045135498046875,
0.034454345703125,
0.04638671875,
0.0322265625,
0.00909423828125,
-0.0401611328125,
0.010467529296875,
-0.06231689453125,
-0.02362060546875,
0.08343505859375,
-0.0002999305725097656,
0.032135009765625,
0.0390625,
0.028411865234375,
0.0221405029296875,
0.048828125,
-0.027496337890625,
0.039520263671875,
-0.00916290283203125,
-0.08135986328125,
0.0169219970703125,
-0.045684814453125,
-0.050262451171875,
0.0161590576171875,
-0.052825927734375,
-0.058349609375,
0.052215576171875,
0.0261077880859375,
-0.017242431640625,
0.0202789306640625,
-0.061279296875,
0.060455322265625,
-0.01102447509765625,
-0.05682373046875,
-0.0001418590545654297,
-0.05792236328125,
0.0161895751953125,
0.0193328857421875,
0.0131072998046875,
0.0032253265380859375,
0.013519287109375,
0.0175018310546875,
-0.0452880859375,
0.07049560546875,
0.0058441162109375,
0.0164794921875,
0.03350830078125,
0.01093292236328125,
0.04205322265625,
0.033782958984375,
-0.00853729248046875,
-0.00893402099609375,
-0.0157318115234375,
-0.045562744140625,
-0.034912109375,
0.0672607421875,
-0.0467529296875,
-0.04351806640625,
-0.06500244140625,
-0.0241546630859375,
-0.00714874267578125,
0.024017333984375,
0.0230560302734375,
0.023040771484375,
-0.03582763671875,
0.031524658203125,
0.056121826171875,
-0.00457763671875,
0.0101470947265625,
0.04150390625,
0.011688232421875,
-0.04022216796875,
0.061981201171875,
-0.00022518634796142578,
0.035003662109375,
0.0484619140625,
0.01496124267578125,
-0.021759033203125,
-0.048492431640625,
-0.0203857421875,
0.01421356201171875,
-0.0657958984375,
-0.0413818359375,
-0.0682373046875,
-0.01062774658203125,
-0.039703369140625,
0.019134521484375,
-0.031829833984375,
-0.03759765625,
-0.03253173828125,
-0.0297698974609375,
0.021759033203125,
0.044708251953125,
-0.0307769775390625,
-0.01290130615234375,
-0.0158843994140625,
0.044097900390625,
0.0200042724609375,
0.006450653076171875,
-0.0169525146484375,
-0.02716064453125,
-0.02032470703125,
0.0298919677734375,
-0.037933349609375,
-0.07733154296875,
0.009521484375,
0.035125732421875,
0.036712646484375,
0.0202484130859375,
0.021636962890625,
0.032073974609375,
-0.00934600830078125,
0.07611083984375,
0.01280975341796875,
-0.055755615234375,
0.0228118896484375,
0.00933074951171875,
0.027557373046875,
0.05718994140625,
0.0322265625,
-0.005550384521484375,
-0.022979736328125,
-0.0684814453125,
-0.06561279296875,
0.0235137939453125,
0.0135040283203125,
-0.01236724853515625,
-0.0226898193359375,
0.02996826171875,
-0.0170745849609375,
0.0033206939697265625,
-0.07598876953125,
-0.025482177734375,
-0.034454345703125,
-0.027984619140625,
0.0131683349609375,
-0.0223846435546875,
-0.020965576171875,
-0.017364501953125,
0.06854248046875,
-0.020355224609375,
-0.002513885498046875,
0.045928955078125,
-0.039703369140625,
0.0108795166015625,
0.023223876953125,
0.03857421875,
0.043365478515625,
-0.01197052001953125,
0.043426513671875,
0.0299835205078125,
-0.0460205078125,
-0.01446533203125,
0.0176239013671875,
-0.0310821533203125,
0.00241851806640625,
0.031890869140625,
0.0298004150390625,
0.0209503173828125,
-0.03460693359375,
0.04400634765625,
-0.00911712646484375,
-0.005397796630859375,
0.004138946533203125,
0.00830078125,
0.0024738311767578125,
0.0167236328125,
0.01322174072265625,
0.035491943359375,
0.0141448974609375,
-0.04217529296875,
0.00559234619140625,
0.0230712890625,
-0.032135009765625,
-0.04241943359375,
0.0567626953125,
0.0021209716796875,
0.006427764892578125,
0.0413818359375,
-0.035400390625,
-0.040924072265625,
0.036163330078125,
0.0421142578125,
0.049896240234375,
-0.0110626220703125,
0.0230560302734375,
0.052001953125,
0.0258331298828125,
-0.00749969482421875,
0.021331787109375,
-0.0018749237060546875,
-0.044891357421875,
-0.0096435546875,
-0.033233642578125,
-0.01239776611328125,
-0.006183624267578125,
-0.029632568359375,
0.0130767822265625,
-0.0477294921875,
-0.026611328125,
0.035308837890625,
0.0121612548828125,
-0.06658935546875,
0.0200042724609375,
-0.01503753662109375,
0.06427001953125,
-0.057708740234375,
0.051300048828125,
0.047607421875,
-0.06427001953125,
-0.05340576171875,
0.00295257568359375,
0.030059814453125,
-0.034881591796875,
0.04669189453125,
-0.020263671875,
-0.01123809814453125,
-0.043670654296875,
-0.06597900390625,
-0.062164306640625,
0.084716796875,
0.046875,
-0.01849365234375,
-0.004512786865234375,
-0.0103912353515625,
0.0517578125,
-0.038909912109375,
0.007694244384765625,
0.0176239013671875,
0.0290679931640625,
0.0304718017578125,
-0.038421630859375,
0.032562255859375,
-0.036376953125,
-0.01345062255859375,
-0.0291290283203125,
-0.05224609375,
0.06976318359375,
-0.04705810546875,
-0.03387451171875,
0.0190277099609375,
0.058349609375,
0.00433349609375,
0.019256591796875,
0.0291290283203125,
0.0297393798828125,
0.076171875,
-0.0032291412353515625,
0.0880126953125,
-0.02239990234375,
0.0272369384765625,
0.09320068359375,
-0.0063323974609375,
0.06353759765625,
0.0234375,
-0.01849365234375,
0.06573486328125,
0.051971435546875,
-0.026885986328125,
0.036590576171875,
0.00846099853515625,
-0.018310546875,
-0.0280609130859375,
-0.031494140625,
-0.0361328125,
0.019287109375,
0.0081329345703125,
-0.042327880859375,
0.005962371826171875,
-0.044891357421875,
0.0223541259765625,
-0.001415252685546875,
0.003673553466796875,
0.053619384765625,
-0.0019063949584960938,
-0.04669189453125,
0.047760009765625,
-0.007617950439453125,
0.018951416015625,
-0.033782958984375,
-0.0024013519287109375,
-0.02178955078125,
0.02581787109375,
-0.03411865234375,
-0.0248565673828125,
0.000008702278137207031,
0.01666259765625,
-0.026611328125,
0.0341796875,
0.046539306640625,
-0.03704833984375,
-0.052154541015625,
-0.004718780517578125,
0.027313232421875,
-0.0175323486328125,
0.00916290283203125,
-0.059295654296875,
-0.00913238525390625,
-0.0024547576904296875,
0.000988006591796875,
0.0092010498046875,
0.041717529296875,
0.0101318359375,
0.043212890625,
0.07501220703125,
0.0208587646484375,
0.0009446144104003906,
0.005222320556640625,
0.08154296875,
-0.039520263671875,
-0.0294647216796875,
-0.061798095703125,
0.034912109375,
-0.0394287109375,
-0.0233001708984375,
0.06243896484375,
0.06134033203125,
0.06524658203125,
0.0005168914794921875,
0.03509521484375,
-0.01282501220703125,
0.0193023681640625,
-0.0419921875,
0.047882080078125,
-0.04278564453125,
0.024322509765625,
-0.020721435546875,
-0.051910400390625,
-0.05389404296875,
0.03948974609375,
-0.007472991943359375,
0.0143585205078125,
0.0826416015625,
0.0699462890625,
-0.0181884765625,
0.00026345252990722656,
-0.00690460205078125,
0.04425048828125,
0.03607177734375,
0.0110626220703125,
0.035614013671875,
-0.042236328125,
0.04522705078125,
-0.0134735107421875,
-0.01323699951171875,
-0.0166015625,
-0.0762939453125,
-0.061126708984375,
-0.0296783447265625,
-0.0631103515625,
-0.07684326171875,
0.0280609130859375,
0.09014892578125,
0.051055908203125,
-0.0765380859375,
0.006893157958984375,
-0.004528045654296875,
0.027130126953125,
-0.03240966796875,
-0.00833892822265625,
0.036773681640625,
-0.0120391845703125,
-0.021514892578125,
0.02996826171875,
-0.0015153884887695312,
0.0033931732177734375,
0.00035381317138671875,
-0.001087188720703125,
-0.0655517578125,
0.0148162841796875,
0.022857666015625,
0.044708251953125,
-0.04669189453125,
-0.0255279541015625,
0.01404571533203125,
-0.01482391357421875,
0.0284271240234375,
0.067138671875,
-0.04681396484375,
0.056915283203125,
0.03778076171875,
0.040252685546875,
0.038360595703125,
-0.0019197463989257812,
0.029876708984375,
-0.052490234375,
-0.00586700439453125,
-0.01251983642578125,
0.03082275390625,
0.060516357421875,
-0.01580810546875,
0.0367431640625,
0.045623779296875,
-0.02276611328125,
-0.0596923828125,
0.01385498046875,
-0.08905029296875,
-0.01104736328125,
0.08819580078125,
-0.01593017578125,
-0.043365478515625,
-0.0085601806640625,
0.0110626220703125,
0.03350830078125,
-0.026458740234375,
0.07733154296875,
0.05047607421875,
0.023529052734375,
-0.0015707015991210938,
-0.0081634521484375,
0.04345703125,
-0.00295257568359375,
-0.0677490234375,
0.0121612548828125,
0.0247802734375,
0.042388916015625,
0.025665283203125,
0.0136566162109375,
-0.01593017578125,
0.0263671875,
0.022125244140625,
-0.0013208389282226562,
-0.0028820037841796875,
-0.023406982421875,
-0.0018711090087890625,
0.01331329345703125,
0.012939453125,
-0.0007228851318359375
]
] |
NousResearch/Nous-Hermes-llama-2-7b | 2023-09-24T09:16:53.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"llama-2",
"self-instruct",
"distillation",
"synthetic instruction",
"en",
"license:mit",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | NousResearch | null | null | NousResearch/Nous-Hermes-llama-2-7b | 52 | 27,571 | transformers | 2023-07-25T19:39:23 | ---
language:
- en
tags:
- llama-2
- self-instruct
- distillation
- synthetic instruction
license:
- mit
---
# Model Card: Nous-Hermes-Llama2-7b
Compute provided by our project sponsor Redmond AI, thank you! Follow RedmondAI on Twitter @RedmondAI.
## Model Description
Nous-Hermes-Llama2-7b is a state-of-the-art language model fine-tuned on over 300,000 instructions. This model was fine-tuned by Nous Research, with Teknium leading the fine-tuning process and dataset curation, Redmond AI sponsoring the compute, and several other contributors.
This Hermes model uses the exact same dataset as Hermes on Llama-1, ensuring consistency between the old and new Hermes for anyone who wants a model as close to the original as possible, just more capable.
This model stands out for its long responses, lower hallucination rate, and absence of OpenAI censorship mechanisms. The fine-tuning was performed with a 4096-token sequence length on an 8x A100 80GB DGX machine.
## Model Training
The model was trained almost entirely on synthetic GPT-4 outputs. Curating high-quality GPT-4 datasets enables incredibly high quality in knowledge, task completion, and style.
This includes data from diverse sources such as GPTeacher (the general, roleplay v1&2, and code-instruct datasets), Nous Instruct & PDACTL (unpublished), and several others, detailed further below.
## Collaborators
The model fine-tuning and the datasets were a collaboration of efforts and resources between Teknium, Karan4D, Emozilla, Huemin Art and Redmond AI.
Special mention goes to @winglian for assisting in some of the training issues.
A huge shoutout and acknowledgement go to all the dataset creators who generously share their datasets openly.
Among the contributors of datasets:
- GPTeacher was made available by Teknium
- Wizard LM by nlpxucan
- Nous Research Instruct Dataset was provided by Karan4D and HueminArt.
- GPT4-LLM and Unnatural Instructions were provided by Microsoft
- Airoboros dataset by jondurbin
- Camel-AI's domain expert datasets are from Camel-AI
- CodeAlpaca dataset by Sahil 2801.
If anyone was left out, please open a thread in the community tab.
## Prompt Format
The model follows the Alpaca prompt format:
```
### Instruction:
<prompt>
### Response:
<leave a newline blank for model to respond>
```
or
```
### Instruction:
<prompt>
### Input:
<additional context>
### Response:
<leave a newline blank for model to respond>
```
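As an illustration, here is a minimal sketch of applying this format with transformers (the instruction text and generation settings are assumptions for demonstration, not tuned recommendations):
```python
# Minimal sketch: build an Alpaca-style prompt and generate a response.
# The generation parameters below are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "NousResearch/Nous-Hermes-llama-2-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = (
    "### Instruction:\n"
    "Explain instruction tuning in one paragraph.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```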
## Benchmark Results
GPT4All:
```
| Task |Version| Metric |Value | |Stderr|
|-------------|------:|--------|-----:|---|-----:|
|arc_challenge| 0|acc |0.4735|± |0.0146|
| | |acc_norm|0.5017|± |0.0146|
|arc_easy | 0|acc |0.7946|± |0.0083|
| | |acc_norm|0.7605|± |0.0088|
|boolq | 1|acc |0.8000|± |0.0070|
|hellaswag | 0|acc |0.5924|± |0.0049|
| | |acc_norm|0.7774|± |0.0042|
|openbookqa | 0|acc |0.3600|± |0.0215|
| | |acc_norm|0.4660|± |0.0223|
|piqa | 0|acc |0.7889|± |0.0095|
| | |acc_norm|0.7976|± |0.0094|
|winogrande | 0|acc |0.6993|± |0.0129|
Average: 0.686
```
BigBench:
```
| Task |Version| Metric |Value | |Stderr|
|------------------------------------------------|------:|---------------------|-----:|---|-----:|
|bigbench_causal_judgement | 0|multiple_choice_grade|0.5579|± |0.0361|
|bigbench_date_understanding | 0|multiple_choice_grade|0.6233|± |0.0253|
|bigbench_disambiguation_qa | 0|multiple_choice_grade|0.3062|± |0.0288|
|bigbench_geometric_shapes | 0|multiple_choice_grade|0.2006|± |0.0212|
| | |exact_str_match |0.0000|± |0.0000|
|bigbench_logical_deduction_five_objects | 0|multiple_choice_grade|0.2540|± |0.0195|
|bigbench_logical_deduction_seven_objects | 0|multiple_choice_grade|0.1657|± |0.0141|
|bigbench_logical_deduction_three_objects | 0|multiple_choice_grade|0.4067|± |0.0284|
|bigbench_movie_recommendation | 0|multiple_choice_grade|0.2780|± |0.0201|
|bigbench_navigate | 0|multiple_choice_grade|0.5000|± |0.0158|
|bigbench_reasoning_about_colored_objects | 0|multiple_choice_grade|0.4405|± |0.0111|
|bigbench_ruin_names | 0|multiple_choice_grade|0.2701|± |0.0210|
|bigbench_salient_translation_error_detection | 0|multiple_choice_grade|0.2034|± |0.0127|
|bigbench_snarks | 0|multiple_choice_grade|0.5028|± |0.0373|
|bigbench_sports_understanding | 0|multiple_choice_grade|0.6136|± |0.0155|
|bigbench_temporal_sequences | 0|multiple_choice_grade|0.2720|± |0.0141|
|bigbench_tracking_shuffled_objects_five_objects | 0|multiple_choice_grade|0.1944|± |0.0112|
|bigbench_tracking_shuffled_objects_seven_objects| 0|multiple_choice_grade|0.1497|± |0.0085|
|bigbench_tracking_shuffled_objects_three_objects| 0|multiple_choice_grade|0.4067|± |0.0284|
Average: 0.3525
```
AGIEval:
```
| Task |Version| Metric |Value | |Stderr|
|------------------------------|------:|--------|-----:|---|-----:|
|agieval_aqua_rat | 0|acc |0.2520|± |0.0273|
| | |acc_norm|0.2402|± |0.0269|
|agieval_logiqa_en | 0|acc |0.2796|± |0.0176|
| | |acc_norm|0.3241|± |0.0184|
|agieval_lsat_ar | 0|acc |0.2478|± |0.0285|
| | |acc_norm|0.2348|± |0.0280|
|agieval_lsat_lr | 0|acc |0.2843|± |0.0200|
| | |acc_norm|0.2765|± |0.0198|
|agieval_lsat_rc | 0|acc |0.3271|± |0.0287|
| | |acc_norm|0.3011|± |0.0280|
|agieval_sat_en | 0|acc |0.4660|± |0.0348|
| | |acc_norm|0.4223|± |0.0345|
|agieval_sat_en_without_passage| 0|acc |0.3738|± |0.0338|
| | |acc_norm|0.3447|± |0.0332|
|agieval_sat_math | 0|acc |0.2500|± |0.0293|
| | |acc_norm|0.2364|± |0.0287|
Average: 0.2975
```
## Resources for Applied Use Cases
For an example of a back-and-forth chatbot using Hugging Face transformers and Discord, check out: https://github.com/teknium1/alpaca-discord
For an example of a roleplaying Discord chatbot, check out: https://github.com/teknium1/alpaca-roleplay-discordbot
LM Studio is a good choice for a chat interface that supports GGML versions (GGML builds of this model are to come)
## Future Plans
We plan to continue iterating on both more high-quality data and new data-filtering techniques to eliminate lower-quality data going forward.
## Model Usage
The model is available for download on Hugging Face. It is suitable for a wide range of language tasks, from generating creative text to understanding and following complex instructions.
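For a quick start, a minimal sketch using the high-level pipeline API (the prompt below is a hypothetical example following the format above):
```python
# Minimal sketch: text generation through the transformers pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="NousResearch/Nous-Hermes-llama-2-7b")
prompt = "### Instruction:\nWrite a haiku about the ocean.\n\n### Response:\n"
print(generator(prompt, max_new_tokens=64)[0]["generated_text"])
```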
| 7,374 | [
[
-0.0438232421875,
-0.06182861328125,
0.02056884765625,
0.0134429931640625,
-0.006832122802734375,
0.0023345947265625,
-0.00635528564453125,
-0.04046630859375,
0.036346435546875,
0.013702392578125,
-0.0479736328125,
-0.0494384765625,
-0.055450439453125,
0.00258636474609375,
-0.0108489990234375,
0.0728759765625,
0.006378173828125,
-0.01727294921875,
0.0055694580078125,
-0.01824951171875,
-0.03857421875,
-0.01515960693359375,
-0.053802490234375,
-0.0174407958984375,
0.026519775390625,
0.03094482421875,
0.059722900390625,
0.0396728515625,
0.03497314453125,
0.0246734619140625,
-0.018310546875,
0.000050902366638183594,
-0.02606201171875,
-0.0177459716796875,
0.00313568115234375,
-0.029266357421875,
-0.05279541015625,
0.00817108154296875,
0.0289764404296875,
0.0400390625,
-0.00826263427734375,
0.03662109375,
0.0084991455078125,
0.0645751953125,
-0.0292510986328125,
0.025604248046875,
-0.0107574462890625,
0.00341033935546875,
-0.00452423095703125,
-0.002071380615234375,
-0.0029048919677734375,
-0.0394287109375,
-0.00701141357421875,
-0.05108642578125,
0.01044464111328125,
-0.002887725830078125,
0.0902099609375,
0.0208740234375,
-0.01788330078125,
-0.0252227783203125,
-0.0325927734375,
0.06390380859375,
-0.060882568359375,
0.016754150390625,
0.04248046875,
0.009735107421875,
-0.019317626953125,
-0.0379638671875,
-0.064453125,
-0.0005040168762207031,
-0.0239105224609375,
0.023101806640625,
-0.013336181640625,
-0.019744873046875,
0.0301361083984375,
0.044189453125,
-0.044891357421875,
0.0057525634765625,
-0.041290283203125,
-0.004150390625,
0.057037353515625,
0.0233612060546875,
0.024261474609375,
-0.02166748046875,
-0.024932861328125,
-0.0235595703125,
-0.0269775390625,
0.033111572265625,
0.02490234375,
0.01171112060546875,
-0.0369873046875,
0.050384521484375,
-0.0201873779296875,
0.045135498046875,
0.006885528564453125,
-0.025054931640625,
0.058929443359375,
-0.033294677734375,
-0.023712158203125,
-0.01151275634765625,
0.06744384765625,
0.0423583984375,
-0.0003800392150878906,
0.0164337158203125,
0.004291534423828125,
0.017730712890625,
0.00812530517578125,
-0.054718017578125,
-0.0157623291015625,
0.036041259765625,
-0.034332275390625,
-0.0186767578125,
-0.0086669921875,
-0.0721435546875,
-0.00511932373046875,
-0.0229644775390625,
0.024169921875,
-0.035888671875,
-0.021453857421875,
0.0011167526245117188,
-0.005123138427734375,
0.032745361328125,
0.0297088623046875,
-0.0635986328125,
0.0161895751953125,
0.032867431640625,
0.06671142578125,
-0.01128387451171875,
-0.0129852294921875,
-0.015899658203125,
0.00670623779296875,
-0.0298614501953125,
0.0479736328125,
-0.0271453857421875,
-0.02716064453125,
-0.0226898193359375,
0.019989013671875,
-0.01488494873046875,
-0.0228729248046875,
0.04400634765625,
-0.01189422607421875,
0.035369873046875,
-0.0211029052734375,
-0.032501220703125,
-0.0197601318359375,
0.03033447265625,
-0.059661865234375,
0.09747314453125,
0.0215911865234375,
-0.06671142578125,
0.028106689453125,
-0.061676025390625,
-0.0032291412353515625,
-0.016387939453125,
-0.0131683349609375,
-0.05224609375,
-0.024627685546875,
0.026214599609375,
0.027008056640625,
-0.0308990478515625,
0.0157318115234375,
-0.017730712890625,
-0.026641845703125,
-0.0016088485717773438,
-0.014923095703125,
0.0819091796875,
0.0168304443359375,
-0.046142578125,
0.0110626220703125,
-0.07061767578125,
0.006969451904296875,
0.026123046875,
-0.03424072265625,
0.004184722900390625,
-0.0171966552734375,
-0.01145172119140625,
0.0269012451171875,
0.0160980224609375,
-0.039215087890625,
0.024566650390625,
-0.0260467529296875,
0.02655029296875,
0.060699462890625,
-0.0017032623291015625,
0.0205230712890625,
-0.046051025390625,
0.0369873046875,
0.00266265869140625,
0.01486968994140625,
0.0023975372314453125,
-0.052276611328125,
-0.05389404296875,
-0.040771484375,
0.01123046875,
0.049224853515625,
-0.035736083984375,
0.045684814453125,
-0.0108642578125,
-0.05181884765625,
-0.04205322265625,
0.0047760009765625,
0.0299072265625,
0.04876708984375,
0.04010009765625,
-0.0215606689453125,
-0.0301666259765625,
-0.06640625,
0.0061798095703125,
-0.0102081298828125,
-0.0032958984375,
0.036041259765625,
0.061798095703125,
-0.021270751953125,
0.06494140625,
-0.057281494140625,
-0.03607177734375,
-0.019683837890625,
-0.00966644287109375,
0.049560546875,
0.046478271484375,
0.055267333984375,
-0.04632568359375,
-0.04400634765625,
-0.0077056884765625,
-0.06341552734375,
0.004611968994140625,
0.009796142578125,
-0.020782470703125,
0.033447265625,
0.0227508544921875,
-0.058135986328125,
0.054962158203125,
0.044586181640625,
-0.037841796875,
0.055328369140625,
-0.0166473388671875,
0.01409149169921875,
-0.08453369140625,
0.034027099609375,
0.0008654594421386719,
0.00675201416015625,
-0.0372314453125,
-0.0088043212890625,
0.00026869773864746094,
-0.0000016093254089355469,
-0.0233612060546875,
0.056396484375,
-0.040374755859375,
-0.0020751953125,
0.01465606689453125,
0.00859832763671875,
-0.00455474853515625,
0.05712890625,
0.0005478858947753906,
0.06610107421875,
0.0494384765625,
-0.036041259765625,
0.022552490234375,
0.0298614501953125,
-0.03375244140625,
0.034210205078125,
-0.0595703125,
-0.0010309219360351562,
-0.00994110107421875,
0.01837158203125,
-0.0933837890625,
-0.0211029052734375,
0.040435791015625,
-0.047637939453125,
0.0196533203125,
0.006626129150390625,
-0.0218353271484375,
-0.06280517578125,
-0.0455322265625,
0.0055999755859375,
0.0275421142578125,
-0.0250244140625,
0.0285491943359375,
0.021087646484375,
-0.01427459716796875,
-0.053497314453125,
-0.046478271484375,
-0.01200103759765625,
-0.0164337158203125,
-0.045318603515625,
0.0224609375,
-0.0243377685546875,
-0.006206512451171875,
0.002567291259765625,
-0.01165008544921875,
0.00018215179443359375,
0.007457733154296875,
0.0248565673828125,
0.035430908203125,
-0.01727294921875,
-0.008453369140625,
-0.0200042724609375,
-0.01343536376953125,
-0.0013265609741210938,
0.0204315185546875,
0.045745849609375,
-0.01030731201171875,
-0.0270233154296875,
-0.04156494140625,
0.0195770263671875,
0.049072265625,
-0.021331787109375,
0.07415771484375,
0.038818359375,
-0.01335906982421875,
0.01386260986328125,
-0.0321044921875,
-0.00036025047302246094,
-0.0361328125,
0.013702392578125,
-0.0273590087890625,
-0.06329345703125,
0.047271728515625,
0.016082763671875,
0.0244140625,
0.045135498046875,
0.043365478515625,
0.004039764404296875,
0.07080078125,
0.035614013671875,
-0.025543212890625,
0.0283050537109375,
-0.047454833984375,
0.00933074951171875,
-0.06243896484375,
-0.032196044921875,
-0.04156494140625,
-0.035430908203125,
-0.044219970703125,
-0.027069091796875,
0.0148773193359375,
-0.0016880035400390625,
-0.041900634765625,
0.0216522216796875,
-0.056365966796875,
0.019989013671875,
0.051025390625,
0.0151214599609375,
0.014495849609375,
-0.004634857177734375,
-0.0133209228515625,
0.003337860107421875,
-0.035186767578125,
-0.042266845703125,
0.0899658203125,
0.01546478271484375,
0.04302978515625,
0.027862548828125,
0.047515869140625,
0.0221405029296875,
0.0166473388671875,
-0.043914794921875,
0.039459228515625,
0.0096588134765625,
-0.05255126953125,
-0.0306243896484375,
-0.0258331298828125,
-0.09234619140625,
0.03363037109375,
-0.0223541259765625,
-0.0738525390625,
0.0135650634765625,
0.00800323486328125,
-0.025146484375,
0.024017333984375,
-0.057464599609375,
0.0733642578125,
-0.00540924072265625,
-0.0404052734375,
-0.0034656524658203125,
-0.05731201171875,
0.031097412109375,
0.00321197509765625,
0.0288543701171875,
-0.0224151611328125,
0.00485992431640625,
0.058929443359375,
-0.04095458984375,
0.052520751953125,
-0.006687164306640625,
-0.00785064697265625,
0.033721923828125,
-0.00506591796875,
0.0394287109375,
0.003955841064453125,
-0.008819580078125,
0.0097198486328125,
-0.0007219314575195312,
-0.042388916015625,
-0.0177764892578125,
0.058441162109375,
-0.0736083984375,
-0.060272216796875,
-0.0513916015625,
-0.03643798828125,
-0.004261016845703125,
0.02410888671875,
0.00897216796875,
0.0163421630859375,
-0.0038661956787109375,
0.01416778564453125,
0.0345458984375,
-0.033111572265625,
0.040618896484375,
0.033355712890625,
-0.0101165771484375,
-0.036590576171875,
0.05572509765625,
0.0035381317138671875,
0.0203704833984375,
0.00662994384765625,
0.007053375244140625,
-0.021331787109375,
-0.0193023681640625,
-0.0382080078125,
0.0293426513671875,
-0.020599365234375,
-0.02191162109375,
-0.05145263671875,
-0.0117340087890625,
-0.04266357421875,
-0.018096923828125,
-0.01102447509765625,
-0.034210205078125,
-0.0279083251953125,
-0.0159454345703125,
0.043609619140625,
0.047088623046875,
-0.00466156005859375,
0.00966644287109375,
-0.039215087890625,
0.0377197265625,
0.0218963623046875,
0.030364990234375,
0.0035572052001953125,
-0.036651611328125,
-0.00527191162109375,
0.01093292236328125,
-0.03936767578125,
-0.06976318359375,
0.0469970703125,
0.0001430511474609375,
0.0457763671875,
0.0165252685546875,
-0.01062774658203125,
0.0565185546875,
-0.01546478271484375,
0.078857421875,
0.019744873046875,
-0.0557861328125,
0.05255126953125,
-0.0279693603515625,
0.0287017822265625,
0.03546142578125,
0.04095458984375,
-0.040374755859375,
-0.0313720703125,
-0.06304931640625,
-0.07269287109375,
0.084716796875,
0.0300750732421875,
-0.01165771484375,
0.0028209686279296875,
0.00943756103515625,
-0.01364898681640625,
0.0154571533203125,
-0.0548095703125,
-0.04803466796875,
-0.00922393798828125,
-0.019622802734375,
-0.01349639892578125,
-0.006866455078125,
-0.0159149169921875,
-0.0379638671875,
0.048980712890625,
-0.0001291036605834961,
0.038360595703125,
0.01003265380859375,
0.00696563720703125,
0.0034198760986328125,
0.0158233642578125,
0.042755126953125,
0.04388427734375,
-0.0214996337890625,
-0.006603240966796875,
0.021240234375,
-0.052490234375,
0.0133514404296875,
0.008880615234375,
-0.00868988037109375,
-0.0103759765625,
0.0238189697265625,
0.042144775390625,
-0.0045623779296875,
-0.03350830078125,
0.038665771484375,
-0.0022125244140625,
-0.0250244140625,
-0.031982421875,
0.01424407958984375,
0.00920867919921875,
0.0230865478515625,
0.0173797607421875,
0.0008664131164550781,
0.0021915435791015625,
-0.048004150390625,
0.00658416748046875,
0.028472900390625,
-0.01454925537109375,
-0.00946044921875,
0.06097412109375,
-0.001918792724609375,
-0.01503753662109375,
0.039825439453125,
-0.007381439208984375,
-0.034515380859375,
0.071044921875,
0.033447265625,
0.037200927734375,
-0.033050537109375,
0.01235198974609375,
0.06903076171875,
0.020782470703125,
-0.00672149658203125,
0.032501220703125,
0.00878143310546875,
-0.031585693359375,
-0.01453399658203125,
-0.054290771484375,
-0.024932861328125,
0.0277099609375,
-0.04241943359375,
0.020904541015625,
-0.03350830078125,
-0.0186767578125,
0.0058135986328125,
0.022430419921875,
-0.054107666015625,
0.026336669921875,
-0.0002028942108154297,
0.06243896484375,
-0.06805419921875,
0.06414794921875,
0.0521240234375,
-0.05517578125,
-0.078125,
-0.0264892578125,
0.010589599609375,
-0.06475830078125,
0.044189453125,
0.0139312744140625,
0.0086669921875,
-0.00899505615234375,
-0.035430908203125,
-0.09674072265625,
0.107421875,
0.0077667236328125,
-0.031402587890625,
0.01102447509765625,
0.012969970703125,
0.042724609375,
-0.0014247894287109375,
0.04107666015625,
0.0482177734375,
0.042083740234375,
0.0108795166015625,
-0.07684326171875,
0.0213775634765625,
-0.04998779296875,
-0.00788116455078125,
0.0234527587890625,
-0.079345703125,
0.07867431640625,
-0.0145111083984375,
-0.0032558441162109375,
-0.01099395751953125,
0.043487548828125,
0.035675048828125,
0.021209716796875,
0.0283966064453125,
0.0640869140625,
0.06585693359375,
-0.012847900390625,
0.072021484375,
-0.029022216796875,
0.036407470703125,
0.07513427734375,
0.00739288330078125,
0.057464599609375,
0.0220947265625,
-0.043182373046875,
0.041595458984375,
0.044525146484375,
-0.0118560791015625,
0.0212554931640625,
0.00780487060546875,
-0.01177978515625,
0.006778717041015625,
0.01526641845703125,
-0.047119140625,
0.0171051025390625,
0.0281982421875,
-0.0225067138671875,
-0.001178741455078125,
-0.0123443603515625,
0.0254974365234375,
-0.009307861328125,
-0.0194854736328125,
0.04632568359375,
-0.00647735595703125,
-0.04803466796875,
0.06292724609375,
-0.0092620849609375,
0.05224609375,
-0.045806884765625,
-0.00787353515625,
-0.027008056640625,
0.0221710205078125,
-0.03240966796875,
-0.08453369140625,
0.0218658447265625,
0.005413055419921875,
-0.0104522705078125,
-0.00005841255187988281,
0.03662109375,
-0.0158538818359375,
-0.03619384765625,
0.02130126953125,
0.0267333984375,
0.0170135498046875,
0.0169677734375,
-0.0662841796875,
0.005397796630859375,
0.01461029052734375,
-0.046966552734375,
0.0220184326171875,
0.037139892578125,
-0.0021305084228515625,
0.051025390625,
0.054840087890625,
-0.007457733154296875,
0.00458526611328125,
-0.0201416015625,
0.08251953125,
-0.06280517578125,
-0.031402587890625,
-0.058624267578125,
0.037872314453125,
-0.020172119140625,
-0.035186767578125,
0.07012939453125,
0.0567626953125,
0.044403076171875,
-0.0023784637451171875,
0.0513916015625,
-0.033111572265625,
0.043609619140625,
-0.026123046875,
0.04559326171875,
-0.055511474609375,
-0.001041412353515625,
-0.035064697265625,
-0.059661865234375,
-0.025299072265625,
0.061553955078125,
-0.031646728515625,
0.0104522705078125,
0.042999267578125,
0.0709228515625,
0.0038661956787109375,
0.009307861328125,
0.0018186569213867188,
0.0242156982421875,
0.0218048095703125,
0.06256103515625,
0.03680419921875,
-0.035430908203125,
0.04449462890625,
-0.0262451171875,
-0.01515960693359375,
-0.0206298828125,
-0.045318603515625,
-0.06219482421875,
-0.03436279296875,
-0.02911376953125,
-0.0333251953125,
-0.01226043701171875,
0.06817626953125,
0.0303497314453125,
-0.0538330078125,
-0.025177001953125,
0.006450653076171875,
0.007045745849609375,
-0.0237579345703125,
-0.019256591796875,
0.056640625,
-0.0108184814453125,
-0.058349609375,
0.00405120849609375,
-0.006603240966796875,
0.01145172119140625,
0.004070281982421875,
-0.0087127685546875,
-0.0309295654296875,
0.0061187744140625,
0.0489501953125,
0.031585693359375,
-0.0452880859375,
-0.0230865478515625,
0.00005906820297241211,
-0.0197601318359375,
0.031585693359375,
0.003406524658203125,
-0.04559326171875,
0.01473236083984375,
0.0291748046875,
0.021942138671875,
0.05615234375,
0.003276824951171875,
0.0028133392333984375,
-0.0290069580078125,
0.00958251953125,
-0.004428863525390625,
0.02276611328125,
0.0089569091796875,
-0.0273590087890625,
0.0633544921875,
0.01169586181640625,
-0.04986572265625,
-0.056732177734375,
-0.0118408203125,
-0.09442138671875,
-0.01763916015625,
0.08660888671875,
-0.005828857421875,
-0.037200927734375,
-0.003276824951171875,
-0.03912353515625,
0.01372528076171875,
-0.049072265625,
0.056427001953125,
0.051788330078125,
-0.0276336669921875,
-0.0027446746826171875,
-0.062103271484375,
0.032623291015625,
0.0245208740234375,
-0.069091796875,
-0.007213592529296875,
0.044097900390625,
0.028717041015625,
0.0233612060546875,
0.06402587890625,
-0.009735107421875,
0.0106048583984375,
0.007022857666015625,
0.0183258056640625,
0.0013952255249023438,
0.0032863616943359375,
-0.00447845458984375,
0.0157318115234375,
-0.007450103759765625,
-0.02947998046875
]
] |
Yntec/aMovieX | 2023-09-16T08:29:51.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"MagicArt35",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Yntec | null | null | Yntec/aMovieX | 2 | 27,527 | diffusers | 2023-09-16T07:32:13 | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
- MagicArt35
---
# aMovieX
Samples and prompts:

Pretty cute girl in a future where humanity has colonized the stars, a group of explorers embarks on a journey to a distant planet, hoping to discover new forms of life and unlock the secrets of the universe. But as they descend through the planet’s thick atmosphere, they discover that the world below is more dangerous and mysterious than they could have ever imagined.

Create pretty cute girl in an otherworldly landscape inspired by a specific planet or moon in our solar system, featuring unique geological formations and extraterrestrial flora and fauna.
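To try these prompts, a minimal diffusers sketch (the shortened prompt and output file name are assumptions):
```python
# Minimal sketch: run one of the sample prompts through the pipeline.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("Yntec/aMovieX")
prompt = "Pretty cute girl in a future where humanity has colonized the stars"
image = pipe(prompt).images[0]  # returns a PIL image
image.save("amoviex_sample.png")
```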
Original page:
https://civitai.com/models/94687/photo-movie-x | 1,080 | [
[
-0.024169921875,
-0.0531005859375,
0.061126708984375,
-0.00637054443359375,
-0.01444244384765625,
0.038818359375,
0.024444580078125,
-0.036041259765625,
0.06793212890625,
0.046417236328125,
-0.0760498046875,
-0.05401611328125,
-0.04144287109375,
0.00428009033203125,
-0.0360107421875,
0.0499267578125,
-0.0007390975952148438,
-0.01271820068359375,
0.01229095458984375,
0.02197265625,
-0.04046630859375,
0.003551483154296875,
-0.0177764892578125,
0.00982666015625,
0.01168060302734375,
0.054046630859375,
0.05731201171875,
0.026885986328125,
0.0298614501953125,
0.0225677490234375,
-0.01145172119140625,
0.00255584716796875,
-0.01432037353515625,
0.00432586669921875,
-0.01297760009765625,
-0.02886962890625,
-0.0100250244140625,
0.034210205078125,
0.05877685546875,
0.03302001953125,
-0.001983642578125,
0.0257720947265625,
-0.0242156982421875,
0.018646240234375,
-0.0239105224609375,
-0.01104736328125,
0.003814697265625,
0.04034423828125,
0.00023794174194335938,
0.004978179931640625,
0.0129852294921875,
-0.01175689697265625,
0.0010271072387695312,
-0.08660888671875,
0.02020263671875,
0.0323486328125,
0.09527587890625,
0.004451751708984375,
-0.012939453125,
-0.0048828125,
-0.035675048828125,
0.05438232421875,
0.0036487579345703125,
0.0078887939453125,
-0.01800537109375,
0.0518798828125,
-0.0000438690185546875,
-0.06414794921875,
-0.040771484375,
0.0004940032958984375,
0.033050537109375,
0.0076141357421875,
-0.043792724609375,
-0.0390625,
0.0251922607421875,
0.02490234375,
-0.04046630859375,
-0.005889892578125,
-0.03631591796875,
-0.01534271240234375,
0.0284881591796875,
0.0246734619140625,
0.050994873046875,
-0.048858642578125,
-0.024200439453125,
-0.042633056640625,
-0.045013427734375,
0.044830322265625,
0.02484130859375,
0.0228424072265625,
-0.054962158203125,
0.06072998046875,
0.0028667449951171875,
0.0228424072265625,
0.0035457611083984375,
0.0008740425109863281,
0.01318359375,
-0.0251312255859375,
0.00009834766387939453,
-0.023895263671875,
0.053314208984375,
0.0648193359375,
0.00531005859375,
0.0238494873046875,
0.0027103424072265625,
0.0154876708984375,
0.031097412109375,
-0.06927490234375,
-0.0460205078125,
0.017303466796875,
-0.04132080078125,
-0.047576904296875,
0.006526947021484375,
-0.072265625,
-0.03253173828125,
-0.01163482666015625,
0.022735595703125,
-0.022430419921875,
-0.01082611083984375,
-0.01800537109375,
-0.03387451171875,
0.00670623779296875,
0.047088623046875,
-0.057037353515625,
-0.0012149810791015625,
0.0185394287109375,
0.0694580078125,
0.0272979736328125,
-0.014984130859375,
-0.0304412841796875,
-0.014923095703125,
-0.0242156982421875,
0.075439453125,
-0.0462646484375,
-0.06915283203125,
-0.028045654296875,
0.0249786376953125,
0.01702880859375,
-0.036865234375,
0.026611328125,
-0.00257110595703125,
-0.024932861328125,
-0.0203857421875,
0.0244293212890625,
-0.027435302734375,
0.01091766357421875,
-0.031890869140625,
0.031158447265625,
0.004383087158203125,
-0.0249176025390625,
0.048614501953125,
-0.05780029296875,
0.013519287109375,
0.006587982177734375,
-0.0246429443359375,
-0.01561737060546875,
0.0131988525390625,
0.038116455078125,
0.00453948974609375,
-0.00806427001953125,
-0.00792694091796875,
-0.0465087890625,
-0.0160675048828125,
0.041748046875,
0.006282806396484375,
0.036712646484375,
0.04364013671875,
0.01160430908203125,
0.0196685791015625,
-0.05267333984375,
0.0226898193359375,
0.01062774658203125,
-0.002140045166015625,
-0.01023101806640625,
-0.0308380126953125,
0.021484375,
0.026214599609375,
-0.018463134765625,
-0.068115234375,
0.00428009033203125,
0.005680084228515625,
0.03411865234375,
0.040771484375,
0.023712158203125,
0.0421142578125,
-0.039886474609375,
0.07403564453125,
0.0021533966064453125,
0.020294189453125,
-0.019287109375,
-0.0274505615234375,
-0.042633056640625,
-0.03045654296875,
-0.00246429443359375,
0.03619384765625,
-0.0256195068359375,
0.00586700439453125,
-0.0017728805541992188,
-0.05267333984375,
-0.056793212890625,
-0.00263214111328125,
0.0177154541015625,
0.0223236083984375,
0.0116424560546875,
-0.035614013671875,
-0.04412841796875,
-0.046051025390625,
0.02410888671875,
-0.01123809814453125,
-0.024261474609375,
0.034759521484375,
0.057342529296875,
-0.0261383056640625,
0.054595947265625,
-0.058135986328125,
-0.002201080322265625,
-0.0095672607421875,
-0.01433563232421875,
0.040802001953125,
0.044921875,
0.038909912109375,
-0.06451416015625,
-0.032379150390625,
-0.0033130645751953125,
-0.0772705078125,
-0.00015783309936523438,
0.0228729248046875,
-0.0183868408203125,
-0.042205810546875,
0.03656005859375,
-0.05084228515625,
0.0176239013671875,
0.0216827392578125,
-0.058074951171875,
0.051300048828125,
-0.0183563232421875,
0.055999755859375,
-0.06915283203125,
0.007427215576171875,
0.007091522216796875,
-0.022674560546875,
-0.0191650390625,
0.0360107421875,
-0.00582122802734375,
-0.060394287109375,
-0.06390380859375,
0.05523681640625,
-0.038848876953125,
0.0094146728515625,
-0.0161285400390625,
-0.011077880859375,
0.02294921875,
0.0031147003173828125,
0.015655517578125,
0.031158447265625,
0.08013916015625,
-0.031280517578125,
0.035247802734375,
0.061737060546875,
-0.0224761962890625,
0.047576904296875,
-0.042510986328125,
-0.01007080078125,
-0.0102996826171875,
0.0234832763671875,
-0.044708251953125,
-0.051055908203125,
0.0217132568359375,
-0.059173583984375,
-0.00797271728515625,
-0.0229949951171875,
-0.02392578125,
-0.0404052734375,
-0.027008056640625,
0.01385498046875,
0.0662841796875,
-0.03802490234375,
0.01427459716796875,
0.027740478515625,
-0.020751953125,
-0.020843505859375,
-0.042327880859375,
0.025177001953125,
-0.0447998046875,
-0.06915283203125,
0.034637451171875,
-0.00493621826171875,
-0.04815673828125,
0.00380706787109375,
0.019073486328125,
0.00391387939453125,
-0.01305389404296875,
0.05511474609375,
0.032440185546875,
-0.0191802978515625,
-0.044342041015625,
0.021514892578125,
0.007167816162109375,
-0.0005474090576171875,
-0.01535797119140625,
0.00693511962890625,
-0.01538848876953125,
-0.0032501220703125,
-0.0474853515625,
0.057952880859375,
0.04473876953125,
0.0245819091796875,
0.00518035888671875,
0.040863037109375,
-0.02691650390625,
-0.008758544921875,
-0.01390838623046875,
-0.0294952392578125,
-0.029693603515625,
-0.00995635986328125,
-0.0291595458984375,
-0.050506591796875,
0.0626220703125,
0.004467010498046875,
-0.0221405029296875,
0.0186920166015625,
0.01153564453125,
-0.0271453857421875,
0.05621337890625,
0.060028076171875,
0.0303192138671875,
0.016082763671875,
-0.06195068359375,
0.0024318695068359375,
-0.0390625,
-0.0211029052734375,
-0.01201629638671875,
-0.01031494140625,
-0.048095703125,
-0.059814453125,
0.032928466796875,
-0.0166778564453125,
0.00022685527801513672,
0.0258941650390625,
-0.01235198974609375,
0.057037353515625,
0.037994384765625,
0.018035888671875,
0.004123687744140625,
-0.0005564689636230469,
-0.02392578125,
-0.00992584228515625,
-0.0037593841552734375,
-0.04888916015625,
0.049407958984375,
0.022216796875,
0.057861328125,
0.027557373046875,
0.0626220703125,
0.03076171875,
-0.0261077880859375,
-0.0372314453125,
0.053619384765625,
-0.0191497802734375,
-0.054595947265625,
-0.00873565673828125,
0.008544921875,
-0.044189453125,
0.01021575927734375,
-0.056640625,
-0.045623779296875,
0.0298614501953125,
0.01084136962890625,
-0.07427978515625,
0.01346588134765625,
-0.06646728515625,
0.059417724609375,
0.0217437744140625,
-0.039154052734375,
0.002471923828125,
-0.05987548828125,
0.01690673828125,
-0.00017344951629638672,
0.03155517578125,
-0.00482940673828125,
0.0010557174682617188,
0.029327392578125,
-0.03955078125,
0.04974365234375,
-0.00873565673828125,
0.022216796875,
0.05963134765625,
-0.00004583597183227539,
0.007808685302734375,
0.04742431640625,
0.0121307373046875,
-0.003124237060546875,
-0.00914764404296875,
-0.01593017578125,
-0.022064208984375,
0.059661865234375,
-0.054412841796875,
-0.0271453857421875,
-0.0308990478515625,
-0.0255279541015625,
0.0311279296875,
0.0002281665802001953,
0.039093017578125,
0.042205810546875,
-0.0318603515625,
-0.01129150390625,
0.037750244140625,
-0.0306854248046875,
0.045379638671875,
0.036468505859375,
-0.03912353515625,
-0.040374755859375,
0.046051025390625,
0.0015201568603515625,
0.005664825439453125,
0.027862548828125,
0.00963592529296875,
-0.03228759765625,
-0.031158447265625,
-0.0308990478515625,
0.01428985595703125,
-0.05828857421875,
-0.01065826416015625,
-0.0455322265625,
0.01235198974609375,
-0.03607177734375,
-0.048675537109375,
-0.023712158203125,
-0.03643798828125,
-0.033447265625,
-0.0214996337890625,
0.038818359375,
0.04827880859375,
-0.0281982421875,
0.016448974609375,
-0.0762939453125,
0.0186004638671875,
0.03045654296875,
-0.031036376953125,
-0.04180908203125,
-0.034637451171875,
-0.000640869140625,
-0.031341552734375,
-0.0567626953125,
-0.0350341796875,
0.06768798828125,
-0.0038852691650390625,
0.0232086181640625,
0.001064300537109375,
-0.0073394775390625,
0.0447998046875,
-0.035247802734375,
0.07342529296875,
0.02349853515625,
-0.049041748046875,
0.062255859375,
-0.06658935546875,
0.0396728515625,
0.03765869140625,
-0.007080078125,
-0.02581787109375,
-0.039825439453125,
-0.06890869140625,
-0.06597900390625,
0.035919189453125,
0.0250396728515625,
0.019744873046875,
0.0177154541015625,
0.02374267578125,
0.00954437255859375,
0.032623291015625,
-0.06256103515625,
-0.0157318115234375,
-0.0291900634765625,
0.0249481201171875,
-0.0251312255859375,
0.0207061767578125,
0.006305694580078125,
-0.004848480224609375,
0.0252838134765625,
-0.006130218505859375,
0.0252838134765625,
0.0191802978515625,
0.0168304443359375,
-0.04376220703125,
0.018524169921875,
0.016937255859375,
0.0229644775390625,
-0.04034423828125,
0.0079345703125,
-0.00992584228515625,
-0.043060302734375,
0.037628173828125,
0.005207061767578125,
-0.00867462158203125,
0.003597259521484375,
0.0032062530517578125,
0.06915283203125,
-0.00258636474609375,
-0.050262451171875,
0.006656646728515625,
-0.00884246826171875,
-0.0001277923583984375,
-0.06097412109375,
0.03472900390625,
-0.016082763671875,
0.031890869140625,
0.03045654296875,
0.038909912109375,
0.03759765625,
-0.050872802734375,
0.00954437255859375,
0.0174102783203125,
-0.040008544921875,
-0.05523681640625,
0.060455322265625,
0.01430511474609375,
-0.0228271484375,
0.0300140380859375,
-0.0264434814453125,
-0.0205078125,
0.0718994140625,
0.01971435546875,
0.055419921875,
-0.001377105712890625,
0.043182373046875,
0.05767822265625,
0.00254058837890625,
0.024627685546875,
0.052581787109375,
0.005870819091796875,
-0.0198822021484375,
0.0121307373046875,
-0.058868408203125,
-0.044158935546875,
0.00322723388671875,
-0.0089569091796875,
0.0704345703125,
-0.05072021484375,
0.0006327629089355469,
0.021240234375,
-0.0037841796875,
-0.02496337890625,
0.043243408203125,
0.02508544921875,
0.081298828125,
-0.08135986328125,
0.03167724609375,
0.06536865234375,
-0.041748046875,
-0.072265625,
-0.01058197021484375,
0.04852294921875,
-0.04046630859375,
0.026031494140625,
0.044708251953125,
0.00753021240234375,
-0.0023345947265625,
-0.04888916015625,
-0.057861328125,
0.084228515625,
-0.00000858306884765625,
-0.0199432373046875,
0.0017995834350585938,
-0.034576416015625,
0.023712158203125,
-0.061492919921875,
0.014007568359375,
0.0233612060546875,
0.0277862548828125,
0.04107666015625,
-0.049102783203125,
-0.01342010498046875,
-0.07025146484375,
-0.01491546630859375,
0.0178375244140625,
-0.034576416015625,
0.0738525390625,
-0.08734130859375,
-0.0123291015625,
0.0457763671875,
0.0567626953125,
0.0535888671875,
0.03173828125,
0.060821533203125,
0.033233642578125,
0.0269927978515625,
-0.00637054443359375,
0.0902099609375,
0.0283050537109375,
-0.006046295166015625,
0.04302978515625,
-0.0028858184814453125,
0.033447265625,
0.03179931640625,
-0.0202484130859375,
0.032989501953125,
0.091064453125,
-0.019195556640625,
0.04974365234375,
0.0107879638671875,
-0.0279998779296875,
-0.0247802734375,
-0.035552978515625,
-0.00904083251953125,
0.03424072265625,
0.01392364501953125,
-0.01314544677734375,
-0.009735107421875,
0.0283050537109375,
-0.00545501708984375,
0.029754638671875,
-0.02069091796875,
0.022735595703125,
0.026763916015625,
-0.0626220703125,
0.01424407958984375,
-0.01479339599609375,
0.03192138671875,
-0.052581787109375,
-0.0233306884765625,
0.00461578369140625,
0.000865936279296875,
-0.026519775390625,
-0.072998046875,
0.0014963150024414062,
0.015716552734375,
-0.0207977294921875,
-0.029296875,
0.0243988037109375,
-0.005615234375,
-0.05377197265625,
0.0007534027099609375,
0.005954742431640625,
0.03387451171875,
0.01557159423828125,
-0.059326171875,
-0.0028362274169921875,
0.022308349609375,
0.0201263427734375,
-0.0009398460388183594,
0.0174560546875,
-0.00933837890625,
0.042999267578125,
0.037139892578125,
-0.004039764404296875,
-0.00958251953125,
0.0006313323974609375,
0.0390625,
-0.0498046875,
-0.07135009765625,
-0.043243408203125,
0.06866455078125,
0.0082550048828125,
-0.0185394287109375,
0.063232421875,
0.0274810791015625,
0.014984130859375,
-0.049468994140625,
0.0572509765625,
-0.027740478515625,
0.038116455078125,
-0.06134033203125,
0.045501708984375,
-0.048583984375,
0.019683837890625,
-0.047698974609375,
-0.0584716796875,
-0.0157470703125,
0.0261383056640625,
0.01210784912109375,
-0.006130218505859375,
0.04681396484375,
0.0701904296875,
-0.0131378173828125,
-0.0104522705078125,
0.015869140625,
0.00580596923828125,
0.037353515625,
0.01409149169921875,
0.0977783203125,
-0.00914764404296875,
0.025177001953125,
-0.0301971435546875,
-0.0268402099609375,
0.0025997161865234375,
-0.06787109375,
-0.053253173828125,
-0.040802001953125,
-0.045745849609375,
-0.0246429443359375,
-0.01558685302734375,
0.0701904296875,
0.0648193359375,
-0.041656494140625,
-0.0237884521484375,
-0.0124359130859375,
0.004688262939453125,
0.019927978515625,
-0.006519317626953125,
-0.00634002685546875,
0.0283050537109375,
-0.0791015625,
0.053863525390625,
0.00643157958984375,
0.07598876953125,
-0.0148468017578125,
0.00812530517578125,
-0.004131317138671875,
-0.005153656005859375,
0.0121917724609375,
0.02362060546875,
-0.024261474609375,
-0.02862548828125,
0.0183258056640625,
0.038055419921875,
-0.0094451904296875,
0.033721923828125,
-0.037200927734375,
0.0194549560546875,
0.04180908203125,
-0.047821044921875,
0.0197906494140625,
0.004772186279296875,
0.04144287109375,
-0.02984619140625,
0.01800537109375,
-0.0211029052734375,
0.041534423828125,
0.054473876953125,
-0.039947509765625,
0.044647216796875,
0.02191162109375,
-0.0361328125,
-0.064697265625,
0.01226043701171875,
-0.0943603515625,
-0.01209259033203125,
0.070068359375,
-0.0020847320556640625,
-0.0283050537109375,
-0.003337860107421875,
-0.03375244140625,
-0.0294342041015625,
-0.05780029296875,
0.0232086181640625,
0.041656494140625,
-0.0116424560546875,
0.003177642822265625,
-0.031951904296875,
0.00943756103515625,
-0.01273345947265625,
-0.048736572265625,
-0.019805908203125,
0.05413818359375,
0.0255126953125,
0.06304931640625,
0.039154052734375,
-0.0194854736328125,
0.0298309326171875,
0.0257415771484375,
0.0094757080078125,
0.01171112060546875,
-0.0404052734375,
-0.018646240234375,
0.00759124755859375,
0.00931549072265625,
-0.0361328125
]
] |
Jean-Baptiste/camembert-ner-with-dates | 2023-06-16T01:31:43.000Z | [
"transformers",
"pytorch",
"onnx",
"safetensors",
"camembert",
"token-classification",
"fr",
"dataset:Jean-Baptiste/wikiner_fr",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | Jean-Baptiste | null | null | Jean-Baptiste/camembert-ner-with-dates | 28 | 27,508 | transformers | 2022-03-02T23:29:04 | ---
language: fr
datasets:
- Jean-Baptiste/wikiner_fr
widget:
- text: "Je m'appelle jean-baptiste et j'habite à montréal depuis fevr 2012"
license: mit
---
# camembert-ner: model fine-tuned from camemBERT for the NER task (including a DATE tag).
## Introduction
[camembert-ner-with-dates] is an extension of the French camembert-ner model with an additional tag for dates.
The model was trained on an enriched version of the wikiner-fr dataset (~170,634 sentences).
On my test data (a mix of chat and email), this model got an F1 score of ~83% (in comparison, dateparser alone was ~70%).
The dateparser library can still be used on the output of this model to convert text to Python datetime objects
(https://dateparser.readthedocs.io/en/latest/).
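For instance, a minimal sketch of that post-processing step (the entity string is a hypothetical DATE span, lightly trimmed; dateparser is a separate install):
```python
# Minimal sketch: turn a DATE entity into a Python datetime with dateparser.
import dateparser

date_entity = "1er avril 1976"  # hypothetical 'word' field of a DATE entity, trimmed
dt = dateparser.parse(date_entity, languages=["fr"])
print(dt)  # expected: 1976-04-01 00:00:00
```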
## How to use camembert-ner-with-dates with HuggingFace
##### Load camembert-ner-with-dates and its sub-word tokenizer:
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Load the fine-tuned model and its sub-word tokenizer
tokenizer = AutoTokenizer.from_pretrained("Jean-Baptiste/camembert-ner-with-dates")
model = AutoModelForTokenClassification.from_pretrained("Jean-Baptiste/camembert-ner-with-dates")
##### Process text sample (from wikipedia)
from transformers import pipeline

# "simple" aggregation merges sub-word predictions into whole-entity spans
nlp = pipeline('ner', model=model, tokenizer=tokenizer, aggregation_strategy="simple")
nlp("Apple est créée le 1er avril 1976 dans le garage de la maison d'enfance de Steve Jobs à Los Altos en Californie par Steve Jobs, Steve Wozniak et Ronald Wayne14, puis constituée sous forme de société le 3 janvier 1977 à l'origine sous le nom d'Apple Computer, mais pour ses 30 ans et pour refléter la diversification de ses produits, le mot « computer » est retiré le 9 janvier 2015.")
[{'entity_group': 'ORG',
'score': 0.9776379466056824,
'word': 'Apple',
'start': 0,
'end': 5},
{'entity_group': 'DATE',
'score': 0.9793774570737567,
'word': 'le 1er avril 1976 dans le',
'start': 15,
'end': 41},
{'entity_group': 'PER',
'score': 0.9958226680755615,
'word': 'Steve Jobs',
'start': 74,
'end': 85},
{'entity_group': 'LOC',
'score': 0.995087186495463,
'word': 'Los Altos',
'start': 87,
'end': 97},
{'entity_group': 'LOC',
'score': 0.9953305125236511,
'word': 'Californie',
'start': 100,
'end': 111},
{'entity_group': 'PER',
'score': 0.9961076378822327,
'word': 'Steve Jobs',
'start': 115,
'end': 126},
{'entity_group': 'PER',
'score': 0.9960325956344604,
'word': 'Steve Wozniak',
'start': 127,
'end': 141},
{'entity_group': 'PER',
'score': 0.9957776467005411,
'word': 'Ronald Wayne',
'start': 144,
'end': 157},
{'entity_group': 'DATE',
'score': 0.994030773639679,
'word': 'le 3 janvier 1977 à',
'start': 198,
'end': 218},
{'entity_group': 'ORG',
'score': 0.9720810294151306,
'word': "d'Apple Computer",
'start': 240,
'end': 257},
{'entity_group': 'DATE',
'score': 0.9924157659212748,
'word': '30 ans et',
'start': 272,
'end': 282},
{'entity_group': 'DATE',
'score': 0.9934852868318558,
'word': 'le 9 janvier 2015.',
'start': 363,
'end': 382}]
```
## Model performance (metric: seqeval)
Global
```
'precision': 0.928
'recall': 0.928
'f1': 0.928
```
By entity
```
Label LOC: (precision:0.929, recall:0.932, f1:0.931, support:9510)
Label PER: (precision:0.952, recall:0.965, f1:0.959, support:9399)
Label MISC: (precision:0.878, recall:0.844, f1:0.860, support:5364)
Label ORG: (precision:0.848, recall:0.883, f1:0.865, support:2299)
Label DATE: Not relevant because of method used to add date tag on wikiner dataset (estimated f1 ~90%)
```
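For reference, a minimal sketch of how seqeval produces such numbers (the label sequences below are toy examples, not the actual evaluation data):
```python
# Minimal sketch: entity-level precision/recall/F1 with seqeval.
from seqeval.metrics import classification_report

y_true = [["O", "B-PER", "I-PER", "O", "B-DATE", "I-DATE"]]
y_pred = [["O", "B-PER", "I-PER", "O", "B-DATE", "O"]]

print(classification_report(y_true, y_pred))
```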
| 3,547 | [
[
-0.027557373046875,
-0.051422119140625,
0.0216827392578125,
0.024200439453125,
-0.0172119140625,
-0.00870513916015625,
-0.0055694580078125,
-0.01849365234375,
0.042633056640625,
0.02227783203125,
-0.050628662109375,
-0.072509765625,
-0.0465087890625,
0.0119781494140625,
-0.007381439208984375,
0.0838623046875,
0.0037593841552734375,
-0.012542724609375,
0.01031494140625,
-0.0178985595703125,
-0.01508331298828125,
-0.043975830078125,
-0.044891357421875,
-0.0286407470703125,
0.033966064453125,
0.00736236572265625,
0.04296875,
0.041595458984375,
0.047119140625,
0.0229644775390625,
-0.0187530517578125,
0.0031299591064453125,
-0.002971649169921875,
-0.015655517578125,
-0.0106658935546875,
-0.035247802734375,
-0.05133056640625,
0.01371002197265625,
0.029632568359375,
0.058807373046875,
-0.01250457763671875,
0.01371002197265625,
-0.00792694091796875,
0.04052734375,
-0.0191192626953125,
0.0300750732421875,
-0.054290771484375,
0.016571044921875,
-0.01776123046875,
-0.0215606689453125,
0.0013208389282226562,
-0.0236358642578125,
-0.006893157958984375,
-0.056060791015625,
0.0179901123046875,
0.00981903076171875,
0.1021728515625,
0.0121917724609375,
-0.0200042724609375,
-0.0222625732421875,
-0.0205535888671875,
0.0728759765625,
-0.082763671875,
0.0263824462890625,
0.03717041015625,
-0.0036411285400390625,
-0.021331787109375,
-0.0545654296875,
-0.047271728515625,
-0.0033435821533203125,
-0.0298919677734375,
0.0254364013671875,
-0.01036834716796875,
-0.0167236328125,
0.0158538818359375,
0.035858154296875,
-0.055938720703125,
0.0017070770263671875,
-0.017242431640625,
-0.007625579833984375,
0.05023193359375,
0.0191650390625,
0.015289306640625,
-0.038909912109375,
-0.035064697265625,
-0.01442718505859375,
-0.0213470458984375,
0.0162353515625,
0.0214691162109375,
0.0291748046875,
-0.0238037109375,
0.0706787109375,
-0.033782958984375,
0.049224853515625,
0.004177093505859375,
-0.00627899169921875,
0.04931640625,
-0.0335693359375,
-0.0206451416015625,
-0.00493621826171875,
0.099365234375,
0.048828125,
0.0011472702026367188,
0.00377655029296875,
-0.0255126953125,
0.01064300537109375,
-0.007236480712890625,
-0.06060791015625,
-0.0294036865234375,
0.0182952880859375,
-0.02911376953125,
-0.0219573974609375,
0.0130615234375,
-0.06829833984375,
0.008148193359375,
-0.01538848876953125,
0.03289794921875,
-0.04107666015625,
0.0016317367553710938,
0.0042572021484375,
-0.022216796875,
0.00206756591796875,
0.0011129379272460938,
-0.0379638671875,
0.0173187255859375,
0.03515625,
0.0640869140625,
0.01428985595703125,
-0.020233154296875,
-0.0288543701171875,
-0.00363922119140625,
-0.0200042724609375,
0.05633544921875,
-0.0283966064453125,
-0.03204345703125,
-0.02618408203125,
0.0185699462890625,
-0.0221099853515625,
-0.017364501953125,
0.0491943359375,
-0.0240478515625,
0.038055419921875,
-0.006439208984375,
-0.04486083984375,
-0.012237548828125,
0.03668212890625,
-0.038909912109375,
0.0916748046875,
0.0295257568359375,
-0.08306884765625,
0.049652099609375,
-0.048187255859375,
-0.0238189697265625,
0.01102447509765625,
-0.0229644775390625,
-0.03656005859375,
-0.01035308837890625,
0.041595458984375,
0.043792724609375,
-0.0288543701171875,
0.020172119140625,
-0.01519775390625,
-0.005596160888671875,
0.022674560546875,
-0.0164031982421875,
0.0745849609375,
0.011566162109375,
-0.0235443115234375,
-0.0034809112548828125,
-0.0711669921875,
0.021697998046875,
0.01544189453125,
-0.04669189453125,
-0.01372528076171875,
-0.0056304931640625,
0.01192474365234375,
0.0189361572265625,
0.0215911865234375,
-0.03094482421875,
0.0143890380859375,
-0.05438232421875,
0.035430908203125,
0.042205810546875,
0.01366424560546875,
0.01593017578125,
-0.031494140625,
0.032623291015625,
0.0154266357421875,
0.0090789794921875,
0.00199127197265625,
-0.033416748046875,
-0.035858154296875,
-0.02801513671875,
0.02337646484375,
0.056488037109375,
-0.039520263671875,
0.08941650390625,
-0.05816650390625,
-0.05816650390625,
-0.032745361328125,
-0.002471923828125,
0.0007486343383789062,
0.05224609375,
0.0394287109375,
-0.031280517578125,
-0.034332275390625,
-0.0679931640625,
-0.01513671875,
-0.018096923828125,
0.0086212158203125,
0.0295257568359375,
0.050811767578125,
-0.021270751953125,
0.065185546875,
-0.058258056640625,
-0.02447509765625,
-0.00263214111328125,
0.0180206298828125,
0.071044921875,
0.046844482421875,
0.0401611328125,
-0.052215576171875,
-0.035858154296875,
0.0030307769775390625,
-0.054718017578125,
0.0055389404296875,
0.00701904296875,
-0.005535125732421875,
0.013092041015625,
0.0229949951171875,
-0.06256103515625,
0.0439453125,
0.02801513671875,
-0.048614501953125,
0.033233642578125,
-0.0306396484375,
0.03155517578125,
-0.0933837890625,
0.00292205810546875,
-0.001705169677734375,
-0.0086517333984375,
-0.02752685546875,
-0.0150604248046875,
-0.0004496574401855469,
0.0106353759765625,
-0.0219573974609375,
0.0236053466796875,
-0.03582763671875,
0.01617431640625,
0.005252838134765625,
0.007694244384765625,
0.007419586181640625,
0.0301666259765625,
-0.00579833984375,
0.042083740234375,
0.03662109375,
-0.0386962890625,
0.03179931640625,
0.00970458984375,
-0.04180908203125,
0.049835205078125,
-0.04339599609375,
-0.004398345947265625,
-0.006378173828125,
0.011993408203125,
-0.056976318359375,
-0.0205535888671875,
0.0273895263671875,
-0.04351806640625,
0.025665283203125,
-0.0161285400390625,
-0.038299560546875,
-0.038330078125,
-0.0272216796875,
0.0143585205078125,
0.0178070068359375,
-0.0205535888671875,
0.05621337890625,
0.01401519775390625,
-0.004482269287109375,
-0.03814697265625,
-0.0748291015625,
-0.00394439697265625,
-0.027862548828125,
-0.051544189453125,
0.044647216796875,
-0.00897216796875,
0.0001932382583618164,
0.0259857177734375,
-0.01335906982421875,
-0.007160186767578125,
0.00530242919921875,
0.0037364959716796875,
0.024810791015625,
-0.01519775390625,
-0.005615234375,
-0.0026988983154296875,
-0.00864410400390625,
-0.00872039794921875,
-0.025054931640625,
0.06353759765625,
0.00350189208984375,
-0.01043701171875,
-0.020751953125,
0.018218994140625,
0.03228759765625,
-0.0269775390625,
0.08685302734375,
0.053985595703125,
-0.046417236328125,
0.013641357421875,
-0.038238525390625,
-0.0036106109619140625,
-0.0283050537109375,
0.024383544921875,
-0.04827880859375,
-0.052001953125,
0.049896240234375,
0.025054931640625,
0.0026397705078125,
0.07550048828125,
0.02679443359375,
-0.001129150390625,
0.0699462890625,
0.00789642333984375,
-0.0218505859375,
0.029022216796875,
-0.03765869140625,
0.0099945068359375,
-0.053070068359375,
-0.033843994140625,
-0.036041259765625,
-0.0162353515625,
-0.060882568359375,
-0.0275421142578125,
0.01275634765625,
0.0230712890625,
-0.018035888671875,
0.039794921875,
-0.06561279296875,
0.0037326812744140625,
0.04150390625,
0.01934814453125,
-0.0014753341674804688,
-0.00986480712890625,
-0.0219573974609375,
-0.0195159912109375,
-0.036407470703125,
-0.0218963623046875,
0.0770263671875,
0.00699615478515625,
0.052337646484375,
0.00446319580078125,
0.0633544921875,
0.01435089111328125,
0.0149078369140625,
-0.048095703125,
0.043121337890625,
-0.0081329345703125,
-0.0523681640625,
-0.0175323486328125,
-0.050689697265625,
-0.0723876953125,
0.0225982666015625,
-0.0227203369140625,
-0.07379150390625,
0.01166534423828125,
0.0037479400634765625,
-0.0100250244140625,
0.017608642578125,
-0.0443115234375,
0.059051513671875,
-0.0265960693359375,
-0.015960693359375,
0.01181793212890625,
-0.04327392578125,
0.004215240478515625,
-0.00536346435546875,
0.03704833984375,
-0.04010009765625,
-0.0089263916015625,
0.0673828125,
-0.050750732421875,
0.037506103515625,
-0.00901031494140625,
0.007534027099609375,
0.01788330078125,
-0.00818634033203125,
0.0616455078125,
0.01186370849609375,
-0.0018739700317382812,
0.00667572021484375,
0.004642486572265625,
-0.020843505859375,
-0.0303497314453125,
0.06561279296875,
-0.04107666015625,
-0.0258026123046875,
-0.05059814453125,
-0.036407470703125,
0.0164337158203125,
0.0279388427734375,
0.047760009765625,
0.03533935546875,
0.0013990402221679688,
0.015472412109375,
0.02484130859375,
-0.0158538818359375,
0.03424072265625,
0.0215606689453125,
-0.01336669921875,
-0.059600830078125,
0.055877685546875,
0.0232391357421875,
-0.012298583984375,
0.04962158203125,
0.0102996826171875,
-0.034149169921875,
-0.031097412109375,
-0.007656097412109375,
0.028533935546875,
-0.042205810546875,
-0.0297698974609375,
-0.0792236328125,
-0.016754150390625,
-0.061981201171875,
0.0007462501525878906,
-0.021636962890625,
-0.04290771484375,
-0.042449951171875,
-0.0186004638671875,
0.048553466796875,
0.0223846435546875,
-0.00749969482421875,
0.0179901123046875,
-0.0535888671875,
0.013580322265625,
0.00525665283203125,
0.028778076171875,
-0.032318115234375,
-0.04364013671875,
-0.0116424560546875,
-0.0020294189453125,
-0.0196533203125,
-0.08441162109375,
0.04638671875,
0.006496429443359375,
0.032989501953125,
0.01230621337890625,
-0.0164642333984375,
0.0628662109375,
-0.026458740234375,
0.06884765625,
0.0191650390625,
-0.06298828125,
0.056243896484375,
-0.0291748046875,
0.0231781005859375,
0.0323486328125,
0.0236358642578125,
-0.0533447265625,
-0.019256591796875,
-0.06719970703125,
-0.0986328125,
0.0648193359375,
0.0380859375,
-0.006473541259765625,
-0.003444671630859375,
0.00635528564453125,
-0.01457977294921875,
0.0182952880859375,
-0.06927490234375,
-0.052642822265625,
-0.0239715576171875,
-0.046875,
-0.02044677734375,
-0.0004124641418457031,
-0.00579833984375,
-0.033782958984375,
0.0692138671875,
0.00441741943359375,
0.0253448486328125,
0.031707763671875,
-0.00789642333984375,
0.01180267333984375,
0.029541015625,
0.02593994140625,
0.0440673828125,
-0.034698486328125,
0.00006717443466186523,
0.024871826171875,
-0.0228118896484375,
-0.0069580078125,
0.05804443359375,
-0.00702667236328125,
0.00811767578125,
0.0379638671875,
0.053680419921875,
0.006855010986328125,
-0.0174102783203125,
0.036346435546875,
0.00763702392578125,
-0.0318603515625,
-0.036895751953125,
-0.00701904296875,
0.01279449462890625,
0.0140838623046875,
0.031707763671875,
-0.005748748779296875,
0.0124359130859375,
-0.027374267578125,
0.00838470458984375,
0.0276947021484375,
-0.039794921875,
-0.0259857177734375,
0.06689453125,
-0.02911376953125,
-0.0202178955078125,
0.034210205078125,
-0.0244598388671875,
-0.046539306640625,
0.047393798828125,
0.02423095703125,
0.053466796875,
-0.004913330078125,
-0.00415802001953125,
0.04254150390625,
0.03961181640625,
-0.0023975372314453125,
0.035888671875,
0.015594482421875,
-0.05206298828125,
-0.0018310546875,
-0.05224609375,
0.0325927734375,
0.0273590087890625,
-0.043670654296875,
0.0158843994140625,
-0.0235748291015625,
-0.052947998046875,
0.0165252685546875,
0.0130767822265625,
-0.07177734375,
0.04925537109375,
-0.019287109375,
0.05712890625,
-0.06591796875,
0.0295257568359375,
0.0787353515625,
-0.051025390625,
-0.0751953125,
-0.0177764892578125,
-0.0113067626953125,
-0.036773681640625,
0.0400390625,
0.01123046875,
0.036956787109375,
0.0013818740844726562,
-0.03753662109375,
-0.07269287109375,
0.0977783203125,
-0.0006418228149414062,
-0.033416748046875,
-0.01053619384765625,
-0.0019426345825195312,
0.03497314453125,
-0.0202178955078125,
0.046661376953125,
0.04583740234375,
0.035888671875,
-0.01244354248046875,
-0.05767822265625,
0.0156097412109375,
-0.035186767578125,
-0.00868988037109375,
0.03515625,
-0.05804443359375,
0.0675048828125,
0.0191192626953125,
-0.00067901611328125,
-0.016448974609375,
0.04547119140625,
0.02105712890625,
0.03228759765625,
0.031097412109375,
0.0723876953125,
0.051605224609375,
-0.033447265625,
0.06427001953125,
-0.01474761962890625,
0.052337646484375,
0.0682373046875,
0.0079803466796875,
0.04241943359375,
0.031585693359375,
-0.036346435546875,
0.03765869140625,
0.045166015625,
-0.0158843994140625,
0.035430908203125,
0.003627777099609375,
-0.0163116455078125,
-0.003444671630859375,
0.006317138671875,
-0.0295257568359375,
0.027740478515625,
0.0267486572265625,
-0.03875732421875,
-0.0009012222290039062,
-0.0251007080078125,
0.022613525390625,
0.0031566619873046875,
-0.0102996826171875,
0.045654296875,
0.007465362548828125,
-0.040679931640625,
0.0249176025390625,
0.005901336669921875,
0.06719970703125,
-0.03289794921875,
0.0118408203125,
-0.01458740234375,
0.017425537109375,
-0.020050048828125,
-0.056854248046875,
0.00595855712890625,
0.0024471282958984375,
-0.0268096923828125,
0.011749267578125,
0.045745849609375,
-0.0292205810546875,
-0.07305908203125,
0.021484375,
0.0211334228515625,
0.0213470458984375,
-0.0057525634765625,
-0.07977294921875,
-0.013214111328125,
0.01348876953125,
-0.04119873046875,
-0.0130615234375,
0.048004150390625,
0.00820159912109375,
0.052825927734375,
0.052093505859375,
0.02374267578125,
0.0184326171875,
-0.0009102821350097656,
0.0657958984375,
-0.076171875,
-0.034912109375,
-0.0706787109375,
0.045623779296875,
-0.014373779296875,
-0.0262451171875,
0.046600341796875,
0.0802001953125,
0.049652099609375,
0.01145172119140625,
0.0479736328125,
-0.042144775390625,
0.045166015625,
-0.044342041015625,
0.060546875,
-0.0673828125,
0.006832122802734375,
-0.0204925537109375,
-0.073486328125,
-0.01140594482421875,
0.029998779296875,
-0.020263671875,
0.0242767333984375,
0.050567626953125,
0.06866455078125,
-0.0137481689453125,
-0.01483154296875,
-0.00572967529296875,
0.0122833251953125,
0.0355224609375,
0.038238525390625,
0.033599853515625,
-0.053497314453125,
0.038909912109375,
-0.046356201171875,
-0.01128387451171875,
-0.0138092041015625,
-0.06298828125,
-0.060455322265625,
-0.03228759765625,
-0.031585693359375,
-0.019195556640625,
-0.00917816162109375,
0.08465576171875,
0.03448486328125,
-0.0631103515625,
-0.00109100341796875,
-0.01203155517578125,
-0.01474761962890625,
-0.01220703125,
-0.027496337890625,
0.0523681640625,
-0.0129241943359375,
-0.05047607421875,
0.024200439453125,
-0.00109100341796875,
0.0208587646484375,
0.039886474609375,
0.0032825469970703125,
-0.039154052734375,
-0.00867462158203125,
0.032318115234375,
-0.00536346435546875,
-0.043365478515625,
-0.006435394287109375,
-0.0008983612060546875,
-0.010467529296875,
0.0236663818359375,
0.0168914794921875,
-0.031494140625,
0.0224456787109375,
0.04766845703125,
0.00811767578125,
0.06109619140625,
0.0155487060546875,
0.0185699462890625,
-0.056488037109375,
0.0201263427734375,
0.01666259765625,
0.035308837890625,
0.033172607421875,
-0.032470703125,
0.04400634765625,
0.038055419921875,
-0.0419921875,
-0.0531005859375,
-0.0205841064453125,
-0.074462890625,
-0.0108642578125,
0.059051513671875,
-0.0019168853759765625,
-0.022918701171875,
0.0201873779296875,
-0.00572967529296875,
0.037750244140625,
-0.045562744140625,
0.0310516357421875,
0.0557861328125,
0.0008630752563476562,
0.00420379638671875,
-0.03350830078125,
0.01837158203125,
0.0213470458984375,
-0.061553955078125,
-0.035186767578125,
0.0224456787109375,
0.03662109375,
0.0158233642578125,
0.055419921875,
-0.0186614990234375,
-0.005672454833984375,
0.00566864013671875,
0.01129913330078125,
0.01526641845703125,
0.01422119140625,
-0.00226593017578125,
-0.00572967529296875,
-0.0036182403564453125,
-0.0211029052734375
]
] |
Yntec/photoMovieXFinal | 2023-09-16T09:16:16.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"MagicArt35",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Yntec | null | null | Yntec/photoMovieXFinal | 2 | 27,478 | diffusers | 2023-09-16T08:21:06 | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
- MagicArt35
---
# Photo Movie X Final
Samples and prompts:

Portrait of beautiful pretty cute lady, wearing ice princess dress realistic, stunning realistic photograph, full lips, 3d render, octane render, intricately detailed, cinematic, trending on artstation | Isometric | Centered hyper realistic cover photo awesome full color, hand drawn, dark, gritty, realistic mucha, klimt, erte .12k, intricate. high definition , cinematic, Rough sketch, mix of bold dark lines and loose lines, bold lines, on paper ,

Pretty cute girl in a future where humanity has colonized the stars, a group of explorers embarks on a journey to a distant planet, hoping to discover new forms of life and unlock the secrets of the universe. But as they descend through the planet’s thick atmosphere, they discover that the world below is more dangerous and mysterious than they could have ever imagined.
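The card itself does not include loading code; below is a minimal `diffusers` sketch (not from the original page — the short prompt and the fp16/CUDA settings are illustrative assumptions):
```python
# a minimal sketch, assuming the StableDiffusionPipeline tag on this repo
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "Yntec/photoMovieXFinal",
    torch_dtype=torch.float16,  # illustrative; drop this on CPU-only machines
)
pipe = pipe.to("cuda")  # assumes a CUDA GPU

prompt = "Portrait of pretty cute lady wearing ice princess dress, cinematic"
image = pipe(prompt).images[0]
image.save("photo_movie_x_sample.png")
```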
Original page:
https://civitai.com/models/94687?modelVersionId=103445 | 1,362 | [
[
-0.038055419921875,
-0.023651123046875,
0.045928955078125,
-0.01898193359375,
-0.013458251953125,
0.035980224609375,
0.0341796875,
-0.05096435546875,
0.05096435546875,
0.047271728515625,
-0.07464599609375,
-0.03411865234375,
-0.039093017578125,
-0.002033233642578125,
-0.040435791015625,
0.061920166015625,
0.0007615089416503906,
-0.019439697265625,
0.015594482421875,
0.0168914794921875,
-0.01100921630859375,
-0.01525115966796875,
-0.025421142578125,
0.01102447509765625,
0.04766845703125,
0.045623779296875,
0.0848388671875,
0.045745849609375,
0.0311431884765625,
0.0230255126953125,
0.0009150505065917969,
-0.01128387451171875,
-0.04248046875,
-0.0008864402770996094,
-0.0159912109375,
-0.040313720703125,
-0.01446533203125,
0.033233642578125,
0.06195068359375,
0.0438232421875,
0.007175445556640625,
0.0198516845703125,
-0.03826904296875,
0.0276336669921875,
-0.016876220703125,
-0.006805419921875,
0.00823974609375,
0.0251922607421875,
-0.022674560546875,
0.01407623291015625,
0.00635528564453125,
0.0061187744140625,
0.01136016845703125,
-0.073486328125,
0.0172576904296875,
0.032867431640625,
0.1055908203125,
-0.022735595703125,
-0.0199127197265625,
0.01377105712890625,
-0.0631103515625,
0.0380859375,
0.01165008544921875,
0.022674560546875,
-0.00543212890625,
0.0615234375,
0.0026493072509765625,
-0.07293701171875,
-0.0386962890625,
-0.0166168212890625,
0.028717041015625,
0.00635528564453125,
-0.0259552001953125,
-0.028717041015625,
0.0313720703125,
0.062164306640625,
-0.04071044921875,
0.0038661956787109375,
-0.04345703125,
-0.0100250244140625,
0.043792724609375,
0.0221099853515625,
0.051483154296875,
-0.0154876708984375,
-0.0347900390625,
-0.05230712890625,
-0.03857421875,
0.03753662109375,
0.037506103515625,
0.00763702392578125,
-0.056121826171875,
0.053253173828125,
-0.0031280517578125,
0.018585205078125,
0.005298614501953125,
0.00251007080078125,
0.01097869873046875,
-0.0227203369140625,
-0.005771636962890625,
-0.0235443115234375,
0.05682373046875,
0.06549072265625,
-0.0154266357421875,
0.0133056640625,
0.00803375244140625,
0.01291656494140625,
0.006450653076171875,
-0.0693359375,
-0.0241851806640625,
0.016815185546875,
-0.0159149169921875,
-0.03826904296875,
0.002902984619140625,
-0.06524658203125,
-0.0283050537109375,
-0.00469207763671875,
0.00958251953125,
-0.035797119140625,
-0.025299072265625,
0.000835418701171875,
-0.0208892822265625,
0.023040771484375,
0.04583740234375,
-0.051055908203125,
0.01654052734375,
0.0084381103515625,
0.056121826171875,
0.0212249755859375,
0.00571441650390625,
-0.0146636962890625,
-0.00015032291412353516,
-0.04644775390625,
0.0682373046875,
-0.03369140625,
-0.05322265625,
-0.01235198974609375,
0.01041412353515625,
0.0270233154296875,
-0.0225677490234375,
0.0841064453125,
-0.0298919677734375,
0.01092529296875,
-0.029022216796875,
-0.006046295166015625,
-0.0117645263671875,
0.004566192626953125,
-0.0540771484375,
0.030609130859375,
0.0312347412109375,
-0.035736083984375,
0.054443359375,
-0.0291290283203125,
0.022735595703125,
0.028289794921875,
-0.0296478271484375,
-0.028167724609375,
0.005832672119140625,
0.015716552734375,
-0.004848480224609375,
-0.0120849609375,
-0.028289794921875,
-0.051300048828125,
-0.0288848876953125,
0.01090240478515625,
0.032867431640625,
0.06951904296875,
0.039886474609375,
-0.0152587890625,
0.007228851318359375,
-0.055999755859375,
0.0267181396484375,
0.034149169921875,
0.00951385498046875,
-0.0025501251220703125,
-0.04730224609375,
0.0202484130859375,
0.051605224609375,
-0.01226043701171875,
-0.052276611328125,
0.02655029296875,
0.002758026123046875,
-0.002166748046875,
0.04852294921875,
0.004993438720703125,
0.03948974609375,
-0.03936767578125,
0.07574462890625,
-0.006618499755859375,
0.0161285400390625,
-0.0297393798828125,
-0.01515960693359375,
-0.0438232421875,
-0.0321044921875,
0.00292205810546875,
0.028778076171875,
-0.03106689453125,
0.00807952880859375,
-0.000736236572265625,
-0.06329345703125,
-0.04144287109375,
-0.01812744140625,
0.0309295654296875,
0.0228729248046875,
-0.00623321533203125,
-0.031524658203125,
-0.04705810546875,
-0.0556640625,
0.005229949951171875,
-0.021942138671875,
-0.01316070556640625,
0.032562255859375,
0.04412841796875,
-0.011566162109375,
0.027252197265625,
-0.04156494140625,
-0.0310821533203125,
-0.01471710205078125,
-0.024932861328125,
0.049774169921875,
0.038360595703125,
0.05218505859375,
-0.062042236328125,
-0.03619384765625,
-0.0072784423828125,
-0.06256103515625,
0.020294189453125,
-0.0038471221923828125,
-0.029632568359375,
-0.01500701904296875,
0.02252197265625,
-0.055877685546875,
0.00958251953125,
0.02593994140625,
-0.04791259765625,
0.07666015625,
-0.021240234375,
0.049530029296875,
-0.049530029296875,
0.0019931793212890625,
0.0164642333984375,
-0.0298309326171875,
-0.03399658203125,
0.0660400390625,
-0.0205078125,
-0.04193115234375,
-0.0528564453125,
0.05609130859375,
-0.05218505859375,
0.021087646484375,
0.00042939186096191406,
-0.00624847412109375,
0.00893402099609375,
0.00647735595703125,
0.005115509033203125,
0.0101470947265625,
0.09033203125,
-0.022705078125,
0.04119873046875,
0.04022216796875,
-0.0401611328125,
0.04852294921875,
-0.050079345703125,
0.0162506103515625,
-0.0013942718505859375,
0.0207061767578125,
-0.0662841796875,
-0.045440673828125,
0.034820556640625,
-0.043487548828125,
0.00335693359375,
0.00220489501953125,
-0.052032470703125,
-0.04718017578125,
-0.0262298583984375,
0.0263519287109375,
0.07220458984375,
-0.011993408203125,
0.03021240234375,
0.0176544189453125,
-0.005657196044921875,
-0.01142120361328125,
-0.03265380859375,
0.0330810546875,
-0.02099609375,
-0.0640869140625,
0.05096435546875,
0.00072479248046875,
-0.039276123046875,
-0.006099700927734375,
0.01300811767578125,
-0.011383056640625,
-0.0465087890625,
0.03680419921875,
0.0380859375,
-0.01471710205078125,
-0.055938720703125,
0.00406646728515625,
0.0047454833984375,
-0.0004954338073730469,
-0.006855010986328125,
0.01543426513671875,
-0.02447509765625,
-0.0029277801513671875,
-0.05859375,
0.060211181640625,
0.0531005859375,
0.024505615234375,
0.022735595703125,
0.05023193359375,
-0.02398681640625,
0.0292205810546875,
-0.019134521484375,
-0.022979736328125,
-0.0306549072265625,
-0.001556396484375,
-0.04547119140625,
-0.05682373046875,
0.058990478515625,
0.01019287109375,
-0.025726318359375,
0.03778076171875,
-0.0026607513427734375,
-0.035888671875,
0.080322265625,
0.07525634765625,
-0.003448486328125,
0.016571044921875,
-0.057159423828125,
0.003498077392578125,
-0.0133056640625,
-0.0251312255859375,
-0.023529052734375,
-0.012542724609375,
-0.058441162109375,
-0.0270843505859375,
0.037139892578125,
0.00928497314453125,
-0.01154327392578125,
0.0210113525390625,
-0.007049560546875,
0.038116455078125,
0.035186767578125,
0.02069091796875,
0.01535797119140625,
0.00905609130859375,
-0.0271759033203125,
-0.0192413330078125,
-0.006320953369140625,
-0.0288238525390625,
0.03271484375,
0.01593017578125,
0.0394287109375,
0.00908660888671875,
0.045623779296875,
0.01092529296875,
-0.030975341796875,
-0.020660400390625,
0.063720703125,
-0.0029754638671875,
-0.0755615234375,
0.0012989044189453125,
-0.004062652587890625,
-0.037689208984375,
-0.001605987548828125,
-0.040130615234375,
-0.0531005859375,
0.027984619140625,
0.0126953125,
-0.06610107421875,
0.0289306640625,
-0.0506591796875,
0.0455322265625,
-0.005268096923828125,
-0.0584716796875,
-0.0071258544921875,
-0.07427978515625,
0.033477783203125,
-0.0009474754333496094,
0.01026153564453125,
0.0033969879150390625,
-0.01023101806640625,
0.0281982421875,
-0.0262451171875,
0.04010009765625,
-0.01197052001953125,
0.013214111328125,
0.03765869140625,
0.0189666748046875,
0.017913818359375,
0.023223876953125,
-0.005016326904296875,
0.003936767578125,
-0.0037822723388671875,
-0.03631591796875,
-0.0302581787109375,
0.04791259765625,
-0.050048828125,
-0.04693603515625,
-0.047576904296875,
-0.037017822265625,
0.021942138671875,
0.0157928466796875,
0.0537109375,
0.03076171875,
-0.015533447265625,
-0.0266571044921875,
0.032928466796875,
-0.0318603515625,
0.03424072265625,
0.0186309814453125,
-0.068115234375,
-0.043365478515625,
0.04052734375,
0.0087738037109375,
0.024627685546875,
0.025421142578125,
0.0102386474609375,
-0.04071044921875,
-0.0203094482421875,
-0.03533935546875,
0.0237579345703125,
-0.045684814453125,
0.00525665283203125,
-0.039398193359375,
0.0022602081298828125,
-0.028778076171875,
-0.036956787109375,
-0.0169219970703125,
-0.027984619140625,
-0.0238494873046875,
-0.0243072509765625,
0.03704833984375,
0.0294342041015625,
-0.01396942138671875,
0.0119781494140625,
-0.06427001953125,
0.034912109375,
0.009063720703125,
0.000591278076171875,
-0.0487060546875,
-0.047393798828125,
-0.005062103271484375,
-0.031646728515625,
-0.031036376953125,
-0.03887939453125,
0.0462646484375,
0.0101318359375,
0.0177459716796875,
0.02301025390625,
-0.0017938613891601562,
0.05242919921875,
-0.028045654296875,
0.06524658203125,
0.0265045166015625,
-0.041778564453125,
0.041839599609375,
-0.072265625,
0.041778564453125,
0.042816162109375,
0.0323486328125,
-0.03704833984375,
-0.020904541015625,
-0.08355712890625,
-0.067138671875,
0.038665771484375,
0.024139404296875,
0.03656005859375,
0.03643798828125,
0.0384521484375,
0.011199951171875,
0.0229339599609375,
-0.054840087890625,
-0.043060302734375,
-0.013275146484375,
0.04931640625,
-0.0087432861328125,
-0.005382537841796875,
0.004894256591796875,
-0.035430908203125,
0.0283966064453125,
0.0128021240234375,
0.03240966796875,
0.0309295654296875,
0.0440673828125,
-0.039703369140625,
0.0010480880737304688,
0.0250091552734375,
0.033477783203125,
-0.04144287109375,
0.00702667236328125,
-0.0215911865234375,
-0.0238494873046875,
0.042388916015625,
0.00925445556640625,
-0.0179443359375,
0.020965576171875,
0.01226043701171875,
0.06524658203125,
-0.024261474609375,
-0.051849365234375,
0.01419830322265625,
-0.005741119384765625,
-0.0035724639892578125,
-0.051483154296875,
0.020416259765625,
-0.0167694091796875,
0.0083465576171875,
0.030975341796875,
0.043701171875,
0.0277557373046875,
-0.064208984375,
0.020172119140625,
0.01380157470703125,
-0.054901123046875,
-0.0479736328125,
0.0517578125,
-0.01241302490234375,
-0.017181396484375,
0.030364990234375,
-0.038116455078125,
-0.0211944580078125,
0.04754638671875,
0.01812744140625,
0.04034423828125,
-0.033111572265625,
0.047088623046875,
0.053558349609375,
0.004428863525390625,
0.0161590576171875,
0.053314208984375,
0.0182037353515625,
-0.0294952392578125,
-0.00493621826171875,
-0.0236663818359375,
-0.0386962890625,
0.03082275390625,
-0.00821685791015625,
0.0689697265625,
-0.054656982421875,
0.021392822265625,
0.00580596923828125,
0.0106964111328125,
-0.02008056640625,
0.05023193359375,
0.0195465087890625,
0.0916748046875,
-0.06341552734375,
0.04083251953125,
0.035064697265625,
-0.061309814453125,
-0.08837890625,
-0.0258331298828125,
0.03662109375,
-0.0291748046875,
0.0289306640625,
0.033843994140625,
0.004604339599609375,
-0.006805419921875,
-0.046051025390625,
-0.05499267578125,
0.055511474609375,
0.037567138671875,
-0.05712890625,
-0.007167816162109375,
-0.007843017578125,
0.047271728515625,
-0.042144775390625,
0.02508544921875,
0.041015625,
0.033966064453125,
0.03436279296875,
-0.039794921875,
-0.00946807861328125,
-0.0582275390625,
-0.0175323486328125,
-0.00347900390625,
-0.041015625,
0.060089111328125,
-0.08203125,
-0.020599365234375,
0.042877197265625,
0.0682373046875,
0.04840087890625,
0.0117645263671875,
0.06121826171875,
0.050384521484375,
0.0204010009765625,
-0.01335906982421875,
0.12127685546875,
0.0256500244140625,
-0.01361083984375,
0.057769775390625,
-0.016693115234375,
0.020477294921875,
0.017486572265625,
-0.034454345703125,
0.03656005859375,
0.09912109375,
-0.024169921875,
0.04705810546875,
0.0177764892578125,
-0.044891357421875,
-0.00806427001953125,
-0.04608154296875,
-0.00966644287109375,
0.0236968994140625,
-0.00820159912109375,
-0.029296875,
-0.016937255859375,
0.018768310546875,
0.00006699562072753906,
0.0193328857421875,
-0.03387451171875,
0.0242919921875,
0.01305389404296875,
-0.04852294921875,
0.039764404296875,
-0.0237274169921875,
0.01165008544921875,
-0.0458984375,
-0.0194854736328125,
0.001987457275390625,
-0.0027618408203125,
-0.0263214111328125,
-0.06707763671875,
-0.0026378631591796875,
-0.023040771484375,
-0.0214385986328125,
-0.0231781005859375,
0.0416259765625,
-0.0174560546875,
-0.0706787109375,
0.0157470703125,
0.007373809814453125,
0.03271484375,
-0.00269317626953125,
-0.0684814453125,
0.0045013427734375,
0.00832366943359375,
0.01410675048828125,
-0.0007658004760742188,
0.00710296630859375,
-0.004688262939453125,
0.042999267578125,
0.0295867919921875,
0.005840301513671875,
-0.005222320556640625,
0.00662994384765625,
0.0494384765625,
-0.0330810546875,
-0.050201416015625,
-0.044891357421875,
0.0589599609375,
-0.005687713623046875,
-0.036163330078125,
0.043914794921875,
0.043182373046875,
0.049560546875,
-0.038909912109375,
0.049774169921875,
-0.0263671875,
0.028564453125,
-0.034210205078125,
0.0479736328125,
-0.037017822265625,
-0.00643157958984375,
-0.060089111328125,
-0.05633544921875,
-0.0380859375,
0.006195068359375,
0.01314544677734375,
0.01104736328125,
0.04473876953125,
0.056427001953125,
0.00494384765625,
-0.025848388671875,
0.03448486328125,
-0.0003719329833984375,
0.03106689453125,
0.01410675048828125,
0.0750732421875,
-0.0404052734375,
0.0302581787109375,
-0.0450439453125,
-0.02569580078125,
-0.0014657974243164062,
-0.0687255859375,
-0.037109375,
-0.06298828125,
-0.04327392578125,
-0.026031494140625,
-0.0142364501953125,
0.053955078125,
0.07281494140625,
-0.0242919921875,
-0.0242919921875,
-0.00010251998901367188,
-0.00417327880859375,
-0.00696563720703125,
-0.01397705078125,
0.00304412841796875,
0.04669189453125,
-0.072509765625,
0.023284912109375,
0.0251922607421875,
0.05633544921875,
-0.0191650390625,
0.01384735107421875,
-0.004184722900390625,
-0.01459503173828125,
0.0084381103515625,
0.0251312255859375,
-0.032562255859375,
-0.01141357421875,
0.005641937255859375,
0.01107025146484375,
0.00887298583984375,
0.041778564453125,
-0.031494140625,
0.0185394287109375,
0.036102294921875,
-0.0078582763671875,
0.03643798828125,
-0.0198974609375,
0.051239013671875,
-0.0235595703125,
0.009521484375,
0.0034542083740234375,
0.047882080078125,
0.03533935546875,
-0.06396484375,
0.039398193359375,
0.026214599609375,
-0.0343017578125,
-0.0648193359375,
0.022369384765625,
-0.116455078125,
-0.0017786026000976562,
0.06109619140625,
-0.00859832763671875,
-0.0198516845703125,
0.00470733642578125,
-0.03948974609375,
-0.020782470703125,
-0.042816162109375,
0.03680419921875,
0.0511474609375,
-0.02008056640625,
-0.039520263671875,
-0.0236968994140625,
0.01934814453125,
0.00028443336486816406,
-0.060302734375,
-0.0213165283203125,
0.05224609375,
0.0146636962890625,
0.05877685546875,
0.053558349609375,
-0.0404052734375,
0.0297088623046875,
0.0034351348876953125,
0.02191162109375,
0.0223236083984375,
-0.0214996337890625,
-0.00982666015625,
0.024658203125,
-0.00803375244140625,
-0.01274871826171875
]
] |
cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual | 2022-12-01T00:38:43.000Z | [
"transformers",
"pytorch",
"xlm-roberta",
"text-classification",
"dataset:cardiffnlp/tweet_sentiment_multilingual",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | cardiffnlp | null | null | cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual | 2 | 27,471 | transformers | 2022-12-01T00:32:11 | ---
datasets:
- cardiffnlp/tweet_sentiment_multilingual
metrics:
- f1
- accuracy
model-index:
- name: cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: cardiffnlp/tweet_sentiment_multilingual
type: all
split: test
metrics:
- name: Micro F1 (cardiffnlp/tweet_sentiment_multilingual/all)
type: micro_f1_cardiffnlp/tweet_sentiment_multilingual/all
value: 0.6931034482758621
- name: Macro F1 (cardiffnlp/tweet_sentiment_multilingual/all)
      type: macro_f1_cardiffnlp/tweet_sentiment_multilingual/all
value: 0.692628774202147
- name: Accuracy (cardiffnlp/tweet_sentiment_multilingual/all)
type: accuracy_cardiffnlp/tweet_sentiment_multilingual/all
value: 0.6931034482758621
pipeline_tag: text-classification
widget:
- text: Get the all-analog Classic Vinyl Edition of "Takin Off" Album from {@herbiehancock@} via {@bluenoterecords@} link below {{URL}}
example_title: "topic_classification 1"
- text: Yes, including Medicare and social security saving👍
example_title: "sentiment 1"
- text: All two of them taste like ass.
example_title: "offensive 1"
- text: If you wanna look like a badass, have drama on social media
example_title: "irony 1"
- text: Whoever just unfollowed me you a bitch
example_title: "hate 1"
- text: I love swimming for the same reason I love meditating...the feeling of weightlessness.
example_title: "emotion 1"
- text: Beautiful sunset last night from the pontoon @TupperLakeNY
example_title: "emoji 1"
---
# cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual
This model is a fine-tuned version of [cardiffnlp/twitter-xlm-roberta-base](https://huggingface.co/cardiffnlp/twitter-xlm-roberta-base) on the
[`cardiffnlp/tweet_sentiment_multilingual (all)`](https://huggingface.co/datasets/cardiffnlp/tweet_sentiment_multilingual) dataset
via [`tweetnlp`](https://github.com/cardiffnlp/tweetnlp).
The model was trained on the `train` split, and hyperparameters were tuned on the `validation` split.
The following metrics are achieved on the `test` split ([link](https://huggingface.co/cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual/raw/main/metric.json)).
- F1 (micro): 0.6931034482758621
- F1 (macro): 0.692628774202147
- Accuracy: 0.6931034482758621
### Usage
Install `tweetnlp` via pip.
```shell
pip install tweetnlp
```
Load the model in Python.
```python
import tweetnlp
model = tweetnlp.Classifier("cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual", max_length=128)
model.predict('Get the all-analog Classic Vinyl Edition of "Takin Off" Album from {@herbiehancock@} via {@bluenoterecords@} link below {{URL}}')
```
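If you would rather not install `tweetnlp`, the checkpoint should also load through the standard `transformers` pipeline (a minimal sketch; the label names come from the model's config):
```python
from transformers import pipeline

sentiment = pipeline(
    "text-classification",
    model="cardiffnlp/twitter-xlm-roberta-base-sentiment-multilingual",
)
print(sentiment("Yes, including Medicare and social security saving👍"))
```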
### Reference
```
@inproceedings{dimosthenis-etal-2022-twitter,
title = "{T}witter {T}opic {C}lassification",
author = "Antypas, Dimosthenis and
Ushio, Asahi and
Camacho-Collados, Jose and
Neves, Leonardo and
Silva, Vitor and
Barbieri, Francesco",
booktitle = "Proceedings of the 29th International Conference on Computational Linguistics",
month = oct,
year = "2022",
address = "Gyeongju, Republic of Korea",
publisher = "International Committee on Computational Linguistics"
}
```
| 3,299 | [
[
-0.01971435546875,
-0.034820556640625,
0.01091766357421875,
0.03729248046875,
-0.030487060546875,
0.024444580078125,
-0.0435791015625,
-0.03729248046875,
0.04205322265625,
0.021240234375,
-0.058349609375,
-0.06671142578125,
-0.044769287109375,
0.01012420654296875,
-0.005859375,
0.07623291015625,
-0.007061004638671875,
0.02960205078125,
0.0163116455078125,
-0.043243408203125,
-0.01097869873046875,
-0.03375244140625,
-0.045166015625,
-0.0102996826171875,
0.03436279296875,
0.0186309814453125,
0.03375244140625,
0.016204833984375,
0.032562255859375,
0.018768310546875,
-0.01348114013671875,
0.00875091552734375,
-0.038238525390625,
0.0092926025390625,
0.002162933349609375,
-0.0209197998046875,
-0.033935546875,
0.00640869140625,
0.048553466796875,
0.03912353515625,
0.003810882568359375,
0.01947021484375,
-0.0010538101196289062,
0.03228759765625,
-0.021942138671875,
0.0164794921875,
-0.0380859375,
-0.01250457763671875,
-0.003917694091796875,
-0.0025177001953125,
-0.0231170654296875,
-0.037750244140625,
-0.01108551025390625,
-0.01534271240234375,
0.005489349365234375,
0.00667572021484375,
0.086669921875,
-0.0020046234130859375,
-0.01824951171875,
0.00286102294921875,
-0.049468994140625,
0.07470703125,
-0.06292724609375,
0.04010009765625,
0.0124359130859375,
0.0188751220703125,
0.01186370849609375,
-0.027984619140625,
-0.035400390625,
-0.01509857177734375,
0.0021724700927734375,
0.0258636474609375,
0.000919342041015625,
-0.0155487060546875,
0.004825592041015625,
0.0238800048828125,
-0.05096435546875,
-0.017547607421875,
-0.037567138671875,
-0.0037860870361328125,
0.0479736328125,
-0.017333984375,
0.0232391357421875,
-0.0255889892578125,
-0.0313720703125,
-0.00954437255859375,
-0.03485107421875,
-0.007793426513671875,
0.01428985595703125,
0.06097412109375,
-0.04608154296875,
0.03515625,
0.0021190643310546875,
0.03839111328125,
-0.0025653839111328125,
-0.01053619384765625,
0.0638427734375,
-0.02276611328125,
-0.018585205078125,
-0.0098724365234375,
0.09600830078125,
0.0157470703125,
0.03436279296875,
-0.0098876953125,
-0.01529693603515625,
0.00017702579498291016,
-0.0180816650390625,
-0.05328369140625,
0.0033931732177734375,
0.0313720703125,
-0.0245513916015625,
-0.0178985595703125,
0.00809478759765625,
-0.044525146484375,
0.0058135986328125,
-0.01702880859375,
0.035430908203125,
-0.057586669921875,
-0.0438232421875,
0.01235198974609375,
-0.0035400390625,
-0.0017032623291015625,
0.00406646728515625,
-0.046112060546875,
-0.006923675537109375,
0.042144775390625,
0.06781005859375,
-0.005855560302734375,
-0.02685546875,
-0.017730712890625,
-0.0011854171752929688,
-0.0235748291015625,
0.04022216796875,
-0.021087646484375,
-0.0222015380859375,
0.00891876220703125,
0.0013036727905273438,
-0.0258636474609375,
-0.0284423828125,
0.062469482421875,
-0.0243377685546875,
0.03570556640625,
-0.016693115234375,
-0.0364990234375,
-0.0011501312255859375,
0.039337158203125,
-0.040130615234375,
0.0789794921875,
0.034271240234375,
-0.06787109375,
0.0244293212890625,
-0.0548095703125,
-0.0256805419921875,
-0.0099945068359375,
0.009674072265625,
-0.0256500244140625,
-0.000843048095703125,
0.011566162109375,
0.0261993408203125,
-0.021026611328125,
0.0032501220703125,
-0.048858642578125,
-0.0129852294921875,
0.00395965576171875,
-0.001430511474609375,
0.09442138671875,
0.0149383544921875,
-0.0185089111328125,
-0.001708984375,
-0.0689697265625,
0.0262298583984375,
0.00542449951171875,
-0.02899169921875,
-0.007221221923828125,
-0.029541015625,
0.04437255859375,
0.037017822265625,
0.0125579833984375,
-0.057891845703125,
0.01419830322265625,
-0.036102294921875,
0.0234832763671875,
0.035308837890625,
-0.0034942626953125,
0.022857666015625,
-0.028900146484375,
0.05096435546875,
0.007457733154296875,
0.0188140869140625,
0.01324462890625,
-0.033538818359375,
-0.037628173828125,
-0.0311737060546875,
0.03668212890625,
0.0484619140625,
-0.04248046875,
0.038238525390625,
-0.0308074951171875,
-0.04180908203125,
-0.048858642578125,
-0.005062103271484375,
0.026336669921875,
0.0123138427734375,
0.031768798828125,
0.01018524169921875,
-0.06512451171875,
-0.047515869140625,
-0.0307769775390625,
-0.0311431884765625,
0.0140380859375,
0.0228729248046875,
0.028045654296875,
-0.029937744140625,
0.05377197265625,
-0.01788330078125,
-0.02386474609375,
-0.033935546875,
0.01071929931640625,
0.033935546875,
0.0369873046875,
0.061981201171875,
-0.057708740234375,
-0.07568359375,
-0.005733489990234375,
-0.047088623046875,
-0.0239105224609375,
0.006824493408203125,
-0.01904296875,
0.041168212890625,
0.01276397705078125,
-0.058837890625,
0.005710601806640625,
0.037353515625,
-0.0224151611328125,
0.03399658203125,
0.01727294921875,
0.01953125,
-0.11407470703125,
-0.010040283203125,
0.017364501953125,
-0.023345947265625,
-0.03375244140625,
0.00946044921875,
0.007579803466796875,
0.0273590087890625,
-0.040557861328125,
0.048187255859375,
-0.0238800048828125,
0.01953125,
0.0019969940185546875,
-0.0009045600891113281,
-0.0044403076171875,
0.028472900390625,
-0.008544921875,
0.038055419921875,
0.041229248046875,
-0.002056121826171875,
0.020721435546875,
0.010528564453125,
-0.01537322998046875,
0.031646728515625,
-0.04193115234375,
-0.01024627685546875,
0.01129150390625,
0.0069732666015625,
-0.07977294921875,
-0.003627777099609375,
0.02606201171875,
-0.061065673828125,
0.049652099609375,
-0.0202789306640625,
-0.057403564453125,
-0.0308837890625,
-0.034393310546875,
0.02630615234375,
0.045135498046875,
-0.03253173828125,
0.047271728515625,
0.0238494873046875,
0.01209259033203125,
-0.051300048828125,
-0.063232421875,
0.00750732421875,
-0.0234527587890625,
-0.0606689453125,
0.01477813720703125,
-0.0176544189453125,
-0.0214691162109375,
-0.01039886474609375,
0.0036182403564453125,
-0.006885528564453125,
-0.0169677734375,
0.0030193328857421875,
0.02484130859375,
-0.019622802734375,
0.005588531494140625,
-0.0204925537109375,
0.00463104248046875,
-0.0022869110107421875,
-0.046630859375,
0.055419921875,
-0.00844573974609375,
0.0075836181640625,
-0.0252227783203125,
0.02099609375,
0.0399169921875,
0.0019817352294921875,
0.06689453125,
0.073974609375,
-0.0234527587890625,
-0.0196990966796875,
-0.040679931640625,
0.003971099853515625,
-0.0303192138671875,
0.027130126953125,
-0.02734375,
-0.057159423828125,
0.04632568359375,
0.0209503173828125,
-0.01104736328125,
0.0579833984375,
0.044403076171875,
-0.006832122802734375,
0.08331298828125,
0.04205322265625,
-0.0248260498046875,
0.027496337890625,
-0.0528564453125,
0.016448974609375,
-0.044921875,
-0.004703521728515625,
-0.048858642578125,
-0.004711151123046875,
-0.08203125,
-0.0204925537109375,
0.01042938232421875,
0.007781982421875,
-0.037994384765625,
0.02911376953125,
-0.02825927734375,
0.0158233642578125,
0.038970947265625,
0.01325225830078125,
0.0057220458984375,
0.006313323974609375,
-0.0249176025390625,
-0.01319122314453125,
-0.035888671875,
-0.029022216796875,
0.08074951171875,
0.02081298828125,
0.044769287109375,
0.0199737548828125,
0.0552978515625,
0.003231048583984375,
0.01517486572265625,
-0.05340576171875,
0.0325927734375,
-0.0238800048828125,
-0.05010986328125,
-0.0044097900390625,
-0.0504150390625,
-0.038116455078125,
-0.0029582977294921875,
-0.00818634033203125,
-0.05902099609375,
-0.005596160888671875,
-0.020263671875,
-0.00434112548828125,
0.0276031494140625,
-0.03509521484375,
0.053009033203125,
-0.01381683349609375,
-0.01548004150390625,
0.007503509521484375,
-0.04010009765625,
0.0200347900390625,
-0.00923919677734375,
0.0273590087890625,
-0.021942138671875,
-0.00562286376953125,
0.053009033203125,
-0.034759521484375,
0.0548095703125,
-0.01020050048828125,
-0.0012950897216796875,
0.00011879205703735352,
-0.0014448165893554688,
0.01108551025390625,
0.01468658447265625,
-0.0210723876953125,
0.027313232421875,
0.0052032470703125,
-0.033203125,
-0.0191192626953125,
0.06610107421875,
-0.08221435546875,
-0.0224609375,
-0.040557861328125,
-0.04315185546875,
-0.031097412109375,
0.0275115966796875,
0.047760009765625,
0.0278167724609375,
-0.016571044921875,
0.01087188720703125,
0.031341552734375,
-0.01062774658203125,
0.02520751953125,
0.023101806640625,
-0.0253753662109375,
-0.043670654296875,
0.044647216796875,
0.01522064208984375,
0.02178955078125,
0.044830322265625,
0.033233642578125,
-0.0355224609375,
-0.0399169921875,
-0.029693603515625,
0.037506103515625,
-0.05145263671875,
-0.0208892822265625,
-0.053741455078125,
-0.0285797119140625,
-0.051605224609375,
0.0137481689453125,
-0.022369384765625,
-0.060302734375,
-0.0240325927734375,
-0.013946533203125,
0.040283203125,
0.045562744140625,
-0.0269012451171875,
0.02398681640625,
-0.058135986328125,
0.00870513916015625,
-0.0224151611328125,
0.0241241455078125,
-0.0167694091796875,
-0.06707763671875,
-0.035980224609375,
0.0224609375,
-0.0222320556640625,
-0.0582275390625,
0.05224609375,
0.0257110595703125,
0.0180511474609375,
0.025115966796875,
0.00830078125,
0.036346435546875,
-0.0209197998046875,
0.05859375,
0.021392822265625,
-0.0560302734375,
0.03961181640625,
-0.05426025390625,
0.021453857421875,
0.045196533203125,
0.040740966796875,
-0.042877197265625,
-0.056304931640625,
-0.048675537109375,
-0.0718994140625,
0.0662841796875,
0.01085662841796875,
0.021270751953125,
-0.006072998046875,
0.0152587890625,
-0.0009388923645019531,
0.01500701904296875,
-0.0701904296875,
-0.042205810546875,
-0.0127716064453125,
-0.038116455078125,
-0.0241241455078125,
-0.0252838134765625,
-0.006458282470703125,
-0.0294189453125,
0.07281494140625,
0.0084228515625,
0.01023101806640625,
-0.00348663330078125,
0.0049285888671875,
-0.0201416015625,
0.01261138916015625,
0.033050537109375,
0.049224853515625,
-0.0535888671875,
-0.0237579345703125,
-0.0026493072509765625,
-0.038299560546875,
-0.0013933181762695312,
0.00997161865234375,
-0.0218353271484375,
0.0276947021484375,
0.0394287109375,
0.06396484375,
0.004550933837890625,
-0.0192718505859375,
0.039703369140625,
-0.0117034912109375,
-0.0241546630859375,
-0.0389404296875,
-0.007793426513671875,
0.0191192626953125,
0.01178741455078125,
0.0308380126953125,
0.020782470703125,
-0.01134490966796875,
-0.035614013671875,
0.01324462890625,
0.0208587646484375,
-0.05010986328125,
-0.038299560546875,
0.050506591796875,
-0.0131988525390625,
-0.0203094482421875,
0.025848388671875,
-0.0261383056640625,
-0.063720703125,
0.041046142578125,
0.0284423828125,
0.08697509765625,
-0.01337432861328125,
0.01361083984375,
0.048492431640625,
0.0224761962890625,
-0.00286102294921875,
0.059906005859375,
0.013702392578125,
-0.061370849609375,
-0.0217132568359375,
-0.05731201171875,
-0.005191802978515625,
0.00766754150390625,
-0.047821044921875,
0.032684326171875,
-0.028045654296875,
-0.0171051025390625,
0.00753021240234375,
0.017120361328125,
-0.04541015625,
0.0276947021484375,
0.001377105712890625,
0.0911865234375,
-0.0718994140625,
0.0693359375,
0.056121826171875,
-0.047393798828125,
-0.08221435546875,
0.019744873046875,
-0.00732421875,
-0.05255126953125,
0.06573486328125,
0.01629638671875,
0.01398468017578125,
-0.01041412353515625,
-0.037811279296875,
-0.059844970703125,
0.054840087890625,
0.02398681640625,
-0.024871826171875,
0.02301025390625,
0.0290679931640625,
0.05615234375,
-0.034637451171875,
0.04498291015625,
0.03106689453125,
0.0343017578125,
0.0325927734375,
-0.07501220703125,
-0.01126861572265625,
-0.03240966796875,
-0.004497528076171875,
0.01174163818359375,
-0.0650634765625,
0.0794677734375,
-0.0069580078125,
0.0042877197265625,
0.009613037109375,
0.03472900390625,
0.025634765625,
0.00699615478515625,
0.033355712890625,
0.062347412109375,
0.033935546875,
-0.027557373046875,
0.08056640625,
-0.035308837890625,
0.055816650390625,
0.0750732421875,
-0.01293182373046875,
0.0738525390625,
0.0316162109375,
-0.02447509765625,
0.038055419921875,
0.06689453125,
0.00942230224609375,
0.05255126953125,
-0.0036773681640625,
-0.01454925537109375,
-0.01029205322265625,
-0.006183624267578125,
-0.0299224853515625,
0.03350830078125,
0.009429931640625,
-0.0269317626953125,
-0.00007361173629760742,
-0.00925445556640625,
0.0205230712890625,
-0.00994873046875,
-0.0143280029296875,
0.042694091796875,
0.01218414306640625,
-0.04620361328125,
0.0673828125,
0.00489044189453125,
0.06982421875,
-0.036529541015625,
0.0278778076171875,
-0.01390838623046875,
0.028076171875,
-0.0166015625,
-0.051605224609375,
0.021270751953125,
-0.00899505615234375,
-0.0205841064453125,
-0.0158843994140625,
0.032958984375,
-0.048583984375,
-0.0794677734375,
0.0655517578125,
0.03985595703125,
0.0013685226440429688,
0.001148223876953125,
-0.07763671875,
0.00611114501953125,
0.0241546630859375,
-0.03607177734375,
0.004589080810546875,
0.041778564453125,
0.00933837890625,
0.036712646484375,
0.035400390625,
0.04315185546875,
-0.003265380859375,
0.04083251953125,
0.053009033203125,
-0.0421142578125,
-0.03680419921875,
-0.06982421875,
0.027587890625,
-0.00524139404296875,
-0.02386474609375,
0.0655517578125,
0.052398681640625,
0.07470703125,
0.0012836456298828125,
0.07342529296875,
-0.01358795166015625,
0.0753173828125,
-0.0224609375,
0.058441162109375,
-0.06964111328125,
0.018524169921875,
-0.02679443359375,
-0.05804443359375,
-0.033538818359375,
0.029632568359375,
-0.0166015625,
0.039794921875,
0.057647705078125,
0.05780029296875,
-0.0185394287109375,
-0.00457763671875,
0.01470184326171875,
0.040985107421875,
0.01082611083984375,
0.03558349609375,
0.0447998046875,
-0.049835205078125,
0.03472900390625,
-0.044952392578125,
-0.01222991943359375,
-0.00855255126953125,
-0.06207275390625,
-0.08056640625,
-0.043975830078125,
-0.02398681640625,
-0.06536865234375,
-0.00460052490234375,
0.09228515625,
0.04815673828125,
-0.0831298828125,
-0.02825927734375,
0.00714111328125,
0.0022220611572265625,
0.002895355224609375,
-0.0191192626953125,
0.040130615234375,
-0.02099609375,
-0.0626220703125,
-0.0016813278198242188,
0.01174163818359375,
0.004344940185546875,
-0.005245208740234375,
-0.00884246826171875,
-0.0301666259765625,
-0.00604248046875,
0.0307159423828125,
0.0015850067138671875,
-0.0438232421875,
-0.00846099853515625,
0.014373779296875,
-0.0225372314453125,
0.0206146240234375,
0.0182647705078125,
-0.021636962890625,
0.00859832763671875,
0.05316162109375,
0.004367828369140625,
0.0311431884765625,
-0.01049041748046875,
0.0372314453125,
-0.0618896484375,
0.0182647705078125,
0.029541015625,
0.049560546875,
0.044281005859375,
-0.017913818359375,
0.045745849609375,
0.0261993408203125,
-0.0294036865234375,
-0.07159423828125,
-0.01294708251953125,
-0.0843505859375,
-0.01104736328125,
0.11273193359375,
-0.0089263916015625,
-0.0242156982421875,
0.01369476318359375,
0.001979827880859375,
0.033111572265625,
-0.039154052734375,
0.05712890625,
0.0435791015625,
0.0273895263671875,
-0.01305389404296875,
-0.040863037109375,
0.036956787109375,
0.018157958984375,
-0.03216552734375,
-0.01273345947265625,
0.01142120361328125,
0.033905029296875,
0.01114654541015625,
0.04974365234375,
-0.011932373046875,
0.0176544189453125,
-0.02825927734375,
0.019500732421875,
0.00872039794921875,
-0.0046844482421875,
-0.02117919921875,
-0.002613067626953125,
-0.005115509033203125,
-0.02484130859375
]
] |
timm/vit_small_patch16_224.augreg_in21k_ft_in1k | 2023-05-06T00:28:22.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:imagenet-21k",
"arxiv:2106.10270",
"arxiv:2010.11929",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/vit_small_patch16_224.augreg_in21k_ft_in1k | 0 | 27,449 | timm | 2022-12-22T07:54:03 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
- imagenet-21k
---
# Model card for vit_small_patch16_224.augreg_in21k_ft_in1k
A Vision Transformer (ViT) image classification model. Trained on ImageNet-21k and fine-tuned on ImageNet-1k (with additional augmentation and regularization) in JAX by paper authors, ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 22.1
- GMACs: 4.3
- Activations (M): 8.2
- Image size: 224 x 224
- **Papers:**
- How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers: https://arxiv.org/abs/2106.10270
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Dataset:** ImageNet-1k
- **Pretrain Dataset:** ImageNet-21k
- **Original:** https://github.com/google-research/vision_transformer
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_small_patch16_224.augreg_in21k_ft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
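As a small follow-up (not part of the original card), the top-5 tensors returned above can be printed directly; the integer indices refer to ImageNet-1k class ids:
```python
# continues from the classification snippet above; probabilities are already in percent
for prob, idx in zip(top5_probabilities[0], top5_class_indices[0]):
    print(f"class {idx.item()}: {prob.item():.2f}%")
```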
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_small_patch16_224.augreg_in21k_ft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 197, 384) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
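A common downstream use of these embeddings is image-to-image similarity. The sketch below (an illustration, not from the original card) compares two embeddings with cosine similarity; in practice `img_b` would be a second, distinct image run through the same `transforms`:
```python
import torch.nn.functional as F

# reuse the headless model and transforms from the snippet above
img_b = img  # hypothetical second image; use a different PIL image in practice
emb_a = model(transforms(img).unsqueeze(0))    # (1, num_features)
emb_b = model(transforms(img_b).unsqueeze(0))  # (1, num_features)

similarity = F.cosine_similarity(emb_a, emb_b)  # shape (1,), values in [-1, 1]
print(similarity.item())
```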
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{steiner2021augreg,
title={How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers},
  author={Steiner, Andreas and Kolesnikov, Alexander and Zhai, Xiaohua and Wightman, Ross and Uszkoreit, Jakob and Beyer, Lucas},
journal={arXiv preprint arXiv:2106.10270},
year={2021}
}
```
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 3,907 | [
[
-0.03997802734375,
-0.0290985107421875,
-0.002399444580078125,
0.005107879638671875,
-0.0290069580078125,
-0.02655029296875,
-0.0219879150390625,
-0.03509521484375,
0.014617919921875,
0.02203369140625,
-0.0418701171875,
-0.035980224609375,
-0.04718017578125,
0.0012598037719726562,
-0.01096343994140625,
0.07366943359375,
-0.01071929931640625,
0.0039215087890625,
-0.0169525146484375,
-0.033660888671875,
-0.0234527587890625,
-0.019134521484375,
-0.047271728515625,
-0.03253173828125,
0.029632568359375,
0.01389312744140625,
0.04443359375,
0.046112060546875,
0.05938720703125,
0.033538818359375,
-0.0081329345703125,
0.0108489990234375,
-0.0256805419921875,
-0.0178985595703125,
0.022247314453125,
-0.047210693359375,
-0.028717041015625,
0.018768310546875,
0.0545654296875,
0.0290985107421875,
0.00836944580078125,
0.0273895263671875,
0.01123809814453125,
0.03680419921875,
-0.02496337890625,
0.01507568359375,
-0.03851318359375,
0.0205230712890625,
-0.00438690185546875,
-0.0019044876098632812,
-0.023040771484375,
-0.0247802734375,
0.0191802978515625,
-0.040283203125,
0.0458984375,
-0.0023956298828125,
0.10296630859375,
0.022674560546875,
0.0048828125,
0.0165252685546875,
-0.030120849609375,
0.056304931640625,
-0.046112060546875,
0.03125,
0.01393890380859375,
0.01409912109375,
0.00478363037109375,
-0.07794189453125,
-0.050994873046875,
-0.013153076171875,
-0.019683837890625,
0.00719451904296875,
-0.02276611328125,
0.0189208984375,
0.035675048828125,
0.04388427734375,
-0.039093017578125,
0.0003173351287841797,
-0.042633056640625,
-0.0207672119140625,
0.042388916015625,
-0.0011157989501953125,
0.01422882080078125,
-0.01282501220703125,
-0.04449462890625,
-0.0455322265625,
-0.024871826171875,
0.021453857421875,
0.022216796875,
0.00504302978515625,
-0.037506103515625,
0.041168212890625,
0.0032482147216796875,
0.04962158203125,
0.0175628662109375,
-0.0168304443359375,
0.04852294921875,
-0.0119171142578125,
-0.029388427734375,
-0.0190582275390625,
0.08233642578125,
0.033935546875,
0.0281982421875,
-0.0013580322265625,
-0.01357269287109375,
-0.00913238525390625,
0.00433349609375,
-0.08221435546875,
-0.0294036865234375,
0.00583648681640625,
-0.035400390625,
-0.02923583984375,
0.02655029296875,
-0.0482177734375,
-0.00868988037109375,
-0.0086212158203125,
0.05963134765625,
-0.032928466796875,
-0.01377105712890625,
0.00743865966796875,
-0.01184844970703125,
0.03533935546875,
0.0194854736328125,
-0.04412841796875,
0.008209228515625,
0.016204833984375,
0.07794189453125,
0.003147125244140625,
-0.036834716796875,
-0.01922607421875,
-0.0343017578125,
-0.0238800048828125,
0.037750244140625,
-0.0022563934326171875,
-0.009429931640625,
-0.01320648193359375,
0.0291595458984375,
-0.0189361572265625,
-0.04217529296875,
0.0248870849609375,
-0.01502227783203125,
0.027587890625,
0.00872039794921875,
-0.014495849609375,
-0.030670166015625,
0.020355224609375,
-0.031005859375,
0.091064453125,
0.0278778076171875,
-0.068359375,
0.032135009765625,
-0.034576416015625,
-0.006023406982421875,
-0.00992584228515625,
0.0009541511535644531,
-0.0828857421875,
0.00452423095703125,
0.023529052734375,
0.044342041015625,
-0.0163116455078125,
-0.0033931732177734375,
-0.030853271484375,
-0.025787353515625,
0.025909423828125,
-0.0194244384765625,
0.0672607421875,
0.002544403076171875,
-0.0229339599609375,
0.021514892578125,
-0.043853759765625,
0.00621795654296875,
0.032958984375,
-0.018310546875,
0.0005674362182617188,
-0.046905517578125,
0.01091766357421875,
0.0169830322265625,
0.01678466796875,
-0.04931640625,
0.0302581787109375,
-0.028045654296875,
0.0310211181640625,
0.04931640625,
-0.007488250732421875,
0.0290985107421875,
-0.025909423828125,
0.0253143310546875,
0.0197601318359375,
0.0303497314453125,
-0.009796142578125,
-0.047760009765625,
-0.07855224609375,
-0.03424072265625,
0.0273590087890625,
0.03240966796875,
-0.0513916015625,
0.0413818359375,
-0.0282135009765625,
-0.056182861328125,
-0.04656982421875,
0.0031909942626953125,
0.03399658203125,
0.04010009765625,
0.038726806640625,
-0.040374755859375,
-0.0400390625,
-0.0723876953125,
-0.00902557373046875,
-0.0038471221923828125,
-0.0009026527404785156,
0.016693115234375,
0.048370361328125,
-0.020263671875,
0.06512451171875,
-0.0340576171875,
-0.0268402099609375,
-0.016021728515625,
0.004993438720703125,
0.0263214111328125,
0.05572509765625,
0.052001953125,
-0.047882080078125,
-0.034423828125,
-0.01068878173828125,
-0.0638427734375,
0.01105499267578125,
-0.0024814605712890625,
-0.01276397705078125,
0.01044464111328125,
0.0146636962890625,
-0.0518798828125,
0.05926513671875,
0.01294708251953125,
-0.0273284912109375,
0.0306243896484375,
-0.0177001953125,
0.004848480224609375,
-0.0867919921875,
-0.00091552734375,
0.02789306640625,
-0.01934814453125,
-0.035797119140625,
-0.00011622905731201172,
0.006931304931640625,
-0.0032100677490234375,
-0.031585693359375,
0.04443359375,
-0.036285400390625,
-0.0036334991455078125,
-0.00487518310546875,
-0.023345947265625,
0.00405120849609375,
0.05426025390625,
-0.00399017333984375,
0.040313720703125,
0.05413818359375,
-0.036285400390625,
0.043487548828125,
0.041015625,
-0.017669677734375,
0.034423828125,
-0.05438232421875,
0.01229095458984375,
-0.0033245086669921875,
0.015167236328125,
-0.0753173828125,
-0.015838623046875,
0.0291748046875,
-0.05426025390625,
0.04840087890625,
-0.0394287109375,
-0.034454345703125,
-0.04718017578125,
-0.0310211181640625,
0.0311431884765625,
0.0567626953125,
-0.060089111328125,
0.043670654296875,
0.00612640380859375,
0.0237274169921875,
-0.041961669921875,
-0.07196044921875,
-0.01800537109375,
-0.0283966064453125,
-0.054718017578125,
0.033935546875,
0.0065765380859375,
0.0110626220703125,
0.005619049072265625,
-0.00606536865234375,
0.0006628036499023438,
-0.016571044921875,
0.032928466796875,
0.0299530029296875,
-0.0172119140625,
-0.0038471221923828125,
-0.0246734619140625,
-0.016204833984375,
-0.0013380050659179688,
-0.0252685546875,
0.038604736328125,
-0.02325439453125,
-0.0148773193359375,
-0.05755615234375,
-0.019866943359375,
0.0347900390625,
-0.0212249755859375,
0.056396484375,
0.08734130859375,
-0.035064697265625,
0.004100799560546875,
-0.041595458984375,
-0.031280517578125,
-0.037109375,
0.034271240234375,
-0.0244903564453125,
-0.034759521484375,
0.05499267578125,
0.0125579833984375,
0.00560760498046875,
0.0584716796875,
0.03167724609375,
0.0016641616821289062,
0.0628662109375,
0.051849365234375,
0.011566162109375,
0.0662841796875,
-0.07330322265625,
-0.007289886474609375,
-0.0684814453125,
-0.0262603759765625,
-0.018890380859375,
-0.0413818359375,
-0.0533447265625,
-0.03662109375,
0.033966064453125,
0.0087432861328125,
-0.021087646484375,
0.038909912109375,
-0.066650390625,
0.0133514404296875,
0.05389404296875,
0.038604736328125,
-0.0083160400390625,
0.0323486328125,
-0.0150604248046875,
-0.006618499755859375,
-0.060089111328125,
-0.00815582275390625,
0.0811767578125,
0.03619384765625,
0.06085205078125,
-0.0208740234375,
0.049774169921875,
-0.018890380859375,
0.0245513916015625,
-0.058929443359375,
0.040985107421875,
-0.0021953582763671875,
-0.0301055908203125,
-0.00969696044921875,
-0.03033447265625,
-0.07769775390625,
0.0164031982421875,
-0.02593994140625,
-0.060791015625,
0.026336669921875,
0.01525115966796875,
-0.0184478759765625,
0.04901123046875,
-0.0650634765625,
0.074462890625,
-0.004360198974609375,
-0.037109375,
0.006374359130859375,
-0.053314208984375,
0.01427459716796875,
0.017547607421875,
-0.026885986328125,
0.0107269287109375,
0.0192718505859375,
0.07623291015625,
-0.046173095703125,
0.0626220703125,
-0.031158447265625,
0.02655029296875,
0.036895751953125,
-0.0167388916015625,
0.029693603515625,
0.002719879150390625,
0.013458251953125,
0.025604248046875,
0.0003554821014404297,
-0.027008056640625,
-0.03656005859375,
0.035736083984375,
-0.077392578125,
-0.0277252197265625,
-0.0399169921875,
-0.042022705078125,
0.0082855224609375,
0.005687713623046875,
0.051910400390625,
0.045654296875,
0.0208740234375,
0.029632568359375,
0.04888916015625,
-0.0256195068359375,
0.029876708984375,
0.00036025047302246094,
-0.01381683349609375,
-0.041473388671875,
0.07122802734375,
0.0165557861328125,
0.0116424560546875,
0.0125885009765625,
0.0185089111328125,
-0.025543212890625,
-0.035614013671875,
-0.02630615234375,
0.031524658203125,
-0.053009033203125,
-0.03662109375,
-0.042266845703125,
-0.038848876953125,
-0.0250244140625,
0.0011539459228515625,
-0.032012939453125,
-0.024688720703125,
-0.0279998779296875,
0.00893402099609375,
0.0621337890625,
0.0389404296875,
-0.009918212890625,
0.04132080078125,
-0.04290771484375,
0.016265869140625,
0.0220184326171875,
0.039337158203125,
-0.01428985595703125,
-0.07696533203125,
-0.0279388427734375,
0.0020599365234375,
-0.03839111328125,
-0.05499267578125,
0.034271240234375,
0.0163421630859375,
0.03240966796875,
0.029022216796875,
-0.0193939208984375,
0.0657958984375,
-0.005458831787109375,
0.043975830078125,
0.025177001953125,
-0.039581298828125,
0.03778076171875,
-0.006969451904296875,
0.01033782958984375,
0.01525115966796875,
0.01253509521484375,
-0.0218505859375,
-0.00411224365234375,
-0.08087158203125,
-0.0552978515625,
0.060882568359375,
0.01861572265625,
0.0033893585205078125,
0.034881591796875,
0.045928955078125,
-0.0040130615234375,
0.0051727294921875,
-0.06695556640625,
-0.0238800048828125,
-0.0291595458984375,
-0.0253448486328125,
-0.0077667236328125,
-0.0010328292846679688,
-0.0015039443969726562,
-0.061553955078125,
0.048431396484375,
-0.00756072998046875,
0.062286376953125,
0.03515625,
-0.014739990234375,
-0.01224517822265625,
-0.02947998046875,
0.02716064453125,
0.018310546875,
-0.021575927734375,
0.0024852752685546875,
0.021148681640625,
-0.055267333984375,
-0.0027408599853515625,
0.0244903564453125,
-0.0079193115234375,
0.00372314453125,
0.03515625,
0.08172607421875,
-0.00864410400390625,
-0.0008225440979003906,
0.040802001953125,
-0.00830078125,
-0.030853271484375,
-0.02117919921875,
0.00615692138671875,
-0.0195465087890625,
0.02764892578125,
0.02313232421875,
0.029388427734375,
-0.012054443359375,
-0.01015472412109375,
0.0111541748046875,
0.040313720703125,
-0.0399169921875,
-0.0277862548828125,
0.05035400390625,
-0.015594482421875,
-0.006946563720703125,
0.060211181640625,
-0.006374359130859375,
-0.04180908203125,
0.06512451171875,
0.023834228515625,
0.0758056640625,
-0.00629425048828125,
-0.00368499755859375,
0.059814453125,
0.02923583984375,
-0.003063201904296875,
0.01148223876953125,
0.0099945068359375,
-0.05938720703125,
-0.00595855712890625,
-0.04937744140625,
0.002361297607421875,
0.024932861328125,
-0.039093017578125,
0.0309600830078125,
-0.04022216796875,
-0.0274505615234375,
0.0042724609375,
0.0191802978515625,
-0.07745361328125,
0.0220184326171875,
0.001621246337890625,
0.05657958984375,
-0.059967041015625,
0.04986572265625,
0.0640869140625,
-0.051971435546875,
-0.072509765625,
-0.0124969482421875,
-0.01323699951171875,
-0.0665283203125,
0.0340576171875,
0.03314208984375,
0.012054443359375,
0.0190887451171875,
-0.06182861328125,
-0.045928955078125,
0.0977783203125,
0.027435302734375,
-0.01122283935546875,
0.0106964111328125,
-0.0021495819091796875,
0.028594970703125,
-0.020904541015625,
0.03375244140625,
0.01369476318359375,
0.0300445556640625,
0.01526641845703125,
-0.053619384765625,
0.00528717041015625,
-0.0230255126953125,
0.0122222900390625,
0.01502227783203125,
-0.06182861328125,
0.0726318359375,
-0.0313720703125,
-0.00864410400390625,
0.01474761962890625,
0.04901123046875,
0.006633758544921875,
0.006122589111328125,
0.041015625,
0.06695556640625,
0.029632568359375,
-0.033966064453125,
0.06915283203125,
-0.00974273681640625,
0.05413818359375,
0.03839111328125,
0.037841796875,
0.032958984375,
0.0355224609375,
-0.0266265869140625,
0.023590087890625,
0.07598876953125,
-0.042633056640625,
0.0225372314453125,
0.007598876953125,
0.005146026611328125,
-0.01690673828125,
0.004322052001953125,
-0.03619384765625,
0.039276123046875,
0.01471710205078125,
-0.04119873046875,
-0.005893707275390625,
0.0142974853515625,
-0.01229095458984375,
-0.029022216796875,
-0.014892578125,
0.046173095703125,
0.001312255859375,
-0.03240966796875,
0.06201171875,
-0.00033664703369140625,
0.06201171875,
-0.033203125,
-0.0010156631469726562,
-0.019561767578125,
0.0323486328125,
-0.028350830078125,
-0.058441162109375,
0.0123443603515625,
-0.01806640625,
-0.005039215087890625,
0.0028896331787109375,
0.05352783203125,
-0.0301055908203125,
-0.04241943359375,
0.007171630859375,
0.0235443115234375,
0.022918701171875,
-0.005931854248046875,
-0.07666015625,
-0.003154754638671875,
0.000675201416015625,
-0.0435791015625,
0.0155792236328125,
0.029876708984375,
0.0032749176025390625,
0.050689697265625,
0.049774169921875,
-0.006221771240234375,
0.0177154541015625,
-0.0094146728515625,
0.0687255859375,
-0.031951904296875,
-0.029876708984375,
-0.05877685546875,
0.047210693359375,
-0.005016326904296875,
-0.046905517578125,
0.050872802734375,
0.0447998046875,
0.0703125,
-0.011199951171875,
0.033233642578125,
-0.01296234130859375,
0.002681732177734375,
-0.0271453857421875,
0.04534912109375,
-0.05511474609375,
-0.0078582763671875,
-0.0219268798828125,
-0.06622314453125,
-0.0274658203125,
0.072265625,
-0.026214599609375,
0.033538818359375,
0.040863037109375,
0.07342529296875,
-0.0250244140625,
-0.0284423828125,
0.012939453125,
0.01509857177734375,
0.00865936279296875,
0.0291748046875,
0.044342041015625,
-0.0672607421875,
0.037506103515625,
-0.047454833984375,
-0.01265716552734375,
-0.01898193359375,
-0.03570556640625,
-0.07708740234375,
-0.06146240234375,
-0.043670654296875,
-0.0506591796875,
-0.018646240234375,
0.0643310546875,
0.0711669921875,
-0.04217529296875,
-0.004383087158203125,
-0.01125335693359375,
0.0017633438110351562,
-0.0234832763671875,
-0.01824951171875,
0.039093017578125,
-0.010284423828125,
-0.0574951171875,
-0.0251617431640625,
-0.0006437301635742188,
0.038330078125,
-0.0139923095703125,
-0.0129547119140625,
-0.01082611083984375,
-0.02508544921875,
0.0181884765625,
0.022430419921875,
-0.052093505859375,
-0.0183258056640625,
-0.00640869140625,
-0.0037441253662109375,
0.037506103515625,
0.0289306640625,
-0.055267333984375,
0.041046142578125,
0.040985107421875,
0.025238037109375,
0.06378173828125,
-0.0149078369140625,
0.0056610107421875,
-0.065185546875,
0.045867919921875,
-0.00228118896484375,
0.039398193359375,
0.0377197265625,
-0.0197906494140625,
0.045074462890625,
0.043243408203125,
-0.03448486328125,
-0.06378173828125,
-0.003391265869140625,
-0.08160400390625,
0.01025390625,
0.0701904296875,
-0.017791748046875,
-0.035858154296875,
0.029327392578125,
-0.01611328125,
0.05267333984375,
-0.004364013671875,
0.035430908203125,
0.017120361328125,
0.00559234619140625,
-0.045654296875,
-0.035003662109375,
0.039825439453125,
0.01116180419921875,
-0.040771484375,
-0.0294036865234375,
0.0031280517578125,
0.040863037109375,
0.02789306640625,
0.025238037109375,
-0.01024627685546875,
0.0128021240234375,
0.00484466552734375,
0.03912353515625,
-0.0260467529296875,
-0.0123138427734375,
-0.03118896484375,
-0.0118408203125,
-0.005611419677734375,
-0.0469970703125
]
] |
ckiplab/albert-base-chinese | 2022-05-10T03:28:08.000Z | [
"transformers",
"pytorch",
"albert",
"fill-mask",
"lm-head",
"zh",
"license:gpl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | ckiplab | null | null | ckiplab/albert-base-chinese | 6 | 27,417 | transformers | 2022-03-02T23:29:05 | ---
language:
- zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
- pytorch
- lm-head
- albert
- zh
license: gpl-3.0
---
# CKIP ALBERT Base Chinese
This project provides Traditional Chinese transformer models (including ALBERT, BERT, and GPT2) and NLP tools (including word segmentation, part-of-speech tagging, and named entity recognition).
## Homepage
- https://github.com/ckiplab/ckip-transformers
## Contributors
- [Mu Yang](https://muyang.pro) at [CKIP](https://ckip.iis.sinica.edu.tw) (Author & Maintainer)
## Usage
Please use `BertTokenizerFast` as the tokenizer instead of `AutoTokenizer`.
```python
from transformers import (
    BertTokenizerFast,
    AutoModel,
)

# load the recommended BERT tokenizer, then the ALBERT model itself
tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModel.from_pretrained('ckiplab/albert-base-chinese')
```
For full usage and more information, please refer to https://github.com/ckiplab/ckip-transformers.
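For a quick sanity check, the model can also be used for masked-token prediction, matching its `fill-mask` pipeline tag. The snippet below is an illustrative sketch rather than part of the original card; the example sentence is made up for demonstration.
```python
from transformers import BertTokenizerFast, AutoModelForMaskedLM, pipeline

# same tokenizer/model pairing as recommended above
tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModelForMaskedLM.from_pretrained('ckiplab/albert-base-chinese')

fill_mask = pipeline('fill-mask', model=model, tokenizer=tokenizer)
# "今天天氣真[MASK]。" -- "The weather today is really [MASK]."
for prediction in fill_mask('今天天氣真[MASK]。'):
    print(prediction['token_str'], round(prediction['score'], 4))
```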
| 1,112 | [
[
-0.0231170654296875,
-0.0228729248046875,
0.0011615753173828125,
0.05743408203125,
-0.0247650146484375,
0.00675201416015625,
-0.011199951171875,
-0.018585205078125,
-0.004375457763671875,
0.033050537109375,
-0.0255889892578125,
-0.024566650390625,
-0.043060302734375,
0.005756378173828125,
-0.0168609619140625,
0.062744140625,
-0.0206451416015625,
0.022369384765625,
0.0261688232421875,
0.00742340087890625,
-0.0157928466796875,
-0.0286407470703125,
-0.0543212890625,
-0.0430908203125,
-0.005023956298828125,
0.0193023681640625,
0.050567626953125,
0.0318603515625,
0.038787841796875,
0.0226593017578125,
0.0028896331787109375,
-0.008636474609375,
-0.01035308837890625,
-0.02142333984375,
-0.002414703369140625,
-0.042633056640625,
-0.031158447265625,
-0.0177459716796875,
0.05108642578125,
0.03668212890625,
0.0001233816146850586,
-0.0020313262939453125,
0.01114654541015625,
0.0277252197265625,
-0.028228759765625,
0.0321044921875,
-0.040069580078125,
0.0225830078125,
-0.010498046875,
-0.009765625,
-0.0251922607421875,
-0.0130462646484375,
0.0104522705078125,
-0.04888916015625,
0.0250701904296875,
-0.006679534912109375,
0.09710693359375,
0.003490447998046875,
-0.022369384765625,
-0.019317626953125,
-0.052520751953125,
0.080322265625,
-0.06170654296875,
0.032745361328125,
0.02874755859375,
0.0184478759765625,
0.00018155574798583984,
-0.075439453125,
-0.044952392578125,
-0.01381683349609375,
-0.0201873779296875,
0.0243988037109375,
0.006877899169921875,
-0.003116607666015625,
0.0303955078125,
0.02239990234375,
-0.048492431640625,
0.0139617919921875,
-0.0305023193359375,
-0.0295257568359375,
0.04205322265625,
-0.0019893646240234375,
0.036346435546875,
-0.0299835205078125,
-0.038299560546875,
-0.0258331298828125,
-0.04827880859375,
0.0162506103515625,
0.0194244384765625,
0.005084991455078125,
-0.03662109375,
0.043609619140625,
-0.00855255126953125,
0.0177459716796875,
0.0126190185546875,
-0.006404876708984375,
0.035064697265625,
-0.0218963623046875,
-0.004833221435546875,
-0.00870513916015625,
0.07110595703125,
0.018310546875,
0.00547027587890625,
0.004627227783203125,
-0.024200439453125,
-0.024322509765625,
-0.019439697265625,
-0.059051513671875,
-0.046875,
0.017791748046875,
-0.05596923828125,
-0.01158905029296875,
0.01097869873046875,
-0.0452880859375,
0.0233612060546875,
-0.01812744140625,
0.0263214111328125,
-0.049957275390625,
-0.04736328125,
0.0007119178771972656,
-0.0280303955078125,
0.057220458984375,
0.008941650390625,
-0.09161376953125,
0.00293731689453125,
0.04510498046875,
0.0557861328125,
0.00937652587890625,
-0.01358795166015625,
0.01262664794921875,
0.031158447265625,
-0.015960693359375,
0.0411376953125,
-0.007160186767578125,
-0.0535888671875,
0.010589599609375,
0.0066986083984375,
0.003147125244140625,
-0.03466796875,
0.057220458984375,
-0.0250396728515625,
0.027130126953125,
-0.012481689453125,
-0.0186767578125,
-0.0029239654541015625,
0.00855255126953125,
-0.03607177734375,
0.08184814453125,
0.01959228515625,
-0.0628662109375,
0.0175628662109375,
-0.0670166015625,
-0.04119873046875,
0.02587890625,
-0.0076751708984375,
-0.03277587890625,
-0.0150146484375,
0.0185089111328125,
0.0229949951171875,
-0.00708770751953125,
0.0138092041015625,
-0.0011739730834960938,
-0.01505279541015625,
-0.0013837814331054688,
-0.0302581787109375,
0.09698486328125,
0.02447509765625,
-0.021331787109375,
0.0108489990234375,
-0.05224609375,
0.006137847900390625,
0.0217742919921875,
-0.01568603515625,
-0.0212554931640625,
0.011138916015625,
0.039154052734375,
0.00853729248046875,
0.039031982421875,
-0.04034423828125,
0.037811279296875,
-0.04345703125,
0.051666259765625,
0.05462646484375,
-0.02020263671875,
0.0242919921875,
-0.011688232421875,
0.004474639892578125,
0.006519317626953125,
0.024627685546875,
-0.0106048583984375,
-0.03607177734375,
-0.08203125,
-0.0251922607421875,
0.0290679931640625,
0.057708740234375,
-0.08111572265625,
0.061431884765625,
-0.0186309814453125,
-0.04730224609375,
-0.0247802734375,
-0.0034580230712890625,
0.0016651153564453125,
0.01214599609375,
0.041717529296875,
-0.02105712890625,
-0.045135498046875,
-0.07769775390625,
0.006679534912109375,
-0.044952392578125,
-0.041717529296875,
-0.00292205810546875,
0.045135498046875,
-0.035369873046875,
0.07342529296875,
-0.036163330078125,
-0.017669677734375,
-0.0207672119140625,
0.044281005859375,
0.0266876220703125,
0.066650390625,
0.042938232421875,
-0.075439453125,
-0.054412841796875,
-0.01284027099609375,
-0.0310821533203125,
-0.0062255859375,
-0.0201263427734375,
-0.01055908203125,
0.0004944801330566406,
0.005611419677734375,
-0.048858642578125,
0.015411376953125,
0.0261077880859375,
-0.0004711151123046875,
0.06378173828125,
-0.0009016990661621094,
-0.0164794921875,
-0.097412109375,
0.01214599609375,
-0.01326751708984375,
-0.0024013519287109375,
-0.030181884765625,
-0.00228118896484375,
0.01520538330078125,
-0.00395965576171875,
-0.0421142578125,
0.042083740234375,
-0.027984619140625,
0.025360107421875,
-0.0191192626953125,
-0.01450347900390625,
-0.01200103759765625,
0.0418701171875,
0.0305938720703125,
0.05206298828125,
0.0489501953125,
-0.0538330078125,
0.032073974609375,
0.052886962890625,
-0.01934814453125,
-0.0015249252319335938,
-0.06939697265625,
-0.005039215087890625,
0.0214691162109375,
0.00981903076171875,
-0.064697265625,
-0.004573822021484375,
0.046875,
-0.053131103515625,
0.04669189453125,
0.00400543212890625,
-0.0645751953125,
-0.031524658203125,
-0.032135009765625,
0.029052734375,
0.048004150390625,
-0.04449462890625,
0.038421630859375,
0.0172271728515625,
-0.0161285400390625,
-0.04534912109375,
-0.05810546875,
-0.0007195472717285156,
0.01499176025390625,
-0.04180908203125,
0.048828125,
-0.018402099609375,
0.0196380615234375,
-0.0004911422729492188,
0.0092926025390625,
-0.032257080078125,
-0.005252838134765625,
-0.00640106201171875,
0.030914306640625,
-0.0135498046875,
-0.00605010986328125,
0.01715087890625,
-0.0258026123046875,
0.01300811767578125,
0.005218505859375,
0.051116943359375,
0.0016384124755859375,
-0.0233154296875,
-0.04608154296875,
0.0215911865234375,
0.0163116455078125,
-0.0175628662109375,
0.0260772705078125,
0.07281494140625,
-0.0186614990234375,
-0.012725830078125,
-0.0277252197265625,
-0.0098876953125,
-0.040313720703125,
0.0469970703125,
-0.034759521484375,
-0.059234619140625,
0.02337646484375,
-0.00850677490234375,
0.01479339599609375,
0.056884765625,
0.050140380859375,
0.001041412353515625,
0.09539794921875,
0.06463623046875,
-0.036773681640625,
0.0299835205078125,
-0.03338623046875,
0.02911376953125,
-0.07086181640625,
0.016357421875,
-0.046478271484375,
0.01171875,
-0.060577392578125,
-0.0176239013671875,
-0.0017595291137695312,
0.017547607421875,
-0.024200439453125,
0.056671142578125,
-0.056884765625,
-0.0033702850341796875,
0.055023193359375,
-0.02874755859375,
-0.0024585723876953125,
-0.007648468017578125,
-0.015869140625,
-0.0009369850158691406,
-0.04345703125,
-0.049835205078125,
0.055145263671875,
0.05078125,
0.052978515625,
-0.0011014938354492188,
0.03607177734375,
-0.005615234375,
0.038848876953125,
-0.05712890625,
0.03851318359375,
-0.00630950927734375,
-0.0604248046875,
-0.0193023681640625,
-0.0199127197265625,
-0.0621337890625,
0.0170745849609375,
-0.0036830902099609375,
-0.06292724609375,
0.01263427734375,
0.0031032562255859375,
-0.0101318359375,
0.0253448486328125,
-0.0308074951171875,
0.0537109375,
-0.035552978515625,
0.0048675537109375,
-0.0036067962646484375,
-0.048126220703125,
0.030548095703125,
0.00044536590576171875,
-0.003696441650390625,
-0.00616455078125,
0.00505828857421875,
0.0574951171875,
-0.0121307373046875,
0.055145263671875,
-0.007732391357421875,
-0.003047943115234375,
0.022705078125,
-0.01947021484375,
0.0203094482421875,
0.0184478759765625,
0.00814056396484375,
0.048309326171875,
0.0189056396484375,
-0.027099609375,
-0.01947021484375,
0.035491943359375,
-0.0693359375,
-0.03271484375,
-0.043548583984375,
-0.01873779296875,
0.010345458984375,
0.038665771484375,
0.0404052734375,
-0.0015649795532226562,
0.0007853507995605469,
0.0198822021484375,
0.019927978515625,
-0.031219482421875,
0.043243408203125,
0.04425048828125,
-0.005771636962890625,
-0.035308837890625,
0.0660400390625,
0.007419586181640625,
0.007663726806640625,
0.0489501953125,
-0.005153656005859375,
-0.0178375244140625,
-0.034637451171875,
-0.0270538330078125,
0.03131103515625,
-0.029876708984375,
-0.0016584396362304688,
-0.0298309326171875,
-0.04132080078125,
-0.04888916015625,
0.00885009765625,
-0.028717041015625,
-0.0274200439453125,
-0.0225830078125,
0.0018301010131835938,
-0.0264434814453125,
0.007740020751953125,
-0.0224456787109375,
0.037506103515625,
-0.07830810546875,
0.036163330078125,
0.0125579833984375,
0.0174102783203125,
0.0006203651428222656,
-0.0170135498046875,
-0.043365478515625,
0.008819580078125,
-0.06549072265625,
-0.05108642578125,
0.045318603515625,
-0.0009102821350097656,
0.04901123046875,
0.047088623046875,
0.01265716552734375,
0.039459228515625,
-0.047088623046875,
0.08074951171875,
0.029693603515625,
-0.0848388671875,
0.0293731689453125,
-0.01233673095703125,
0.0239410400390625,
0.022003173828125,
0.0418701171875,
-0.0584716796875,
-0.0253143310546875,
-0.03680419921875,
-0.08734130859375,
0.049041748046875,
0.0295257568359375,
0.0238494873046875,
-0.00011271238327026367,
0.0010223388671875,
-0.002696990966796875,
0.01371002197265625,
-0.08001708984375,
-0.04327392578125,
-0.03594970703125,
-0.022247314453125,
0.0179443359375,
-0.02752685546875,
0.0031490325927734375,
-0.0150909423828125,
0.0823974609375,
0.00606536865234375,
0.057342529296875,
0.033477783203125,
-0.001018524169921875,
-0.0142822265625,
0.007366180419921875,
0.032806396484375,
0.04266357421875,
-0.0192413330078125,
-0.019378662109375,
0.005275726318359375,
-0.044097900390625,
-0.0158843994140625,
0.0273590087890625,
-0.03204345703125,
0.0304412841796875,
0.040618896484375,
0.050445556640625,
0.00514984130859375,
-0.031890869140625,
0.039215087890625,
-0.01343536376953125,
-0.020050048828125,
-0.07501220703125,
-0.0012054443359375,
0.0025577545166015625,
0.0025501251220703125,
0.048828125,
-0.01010894775390625,
0.0107269287109375,
-0.01435089111328125,
0.01488494873046875,
0.031219482421875,
-0.036407470703125,
-0.03533935546875,
0.052886962890625,
0.0310821533203125,
-0.025604248046875,
0.061614990234375,
-0.005397796630859375,
-0.066650390625,
0.048248291015625,
0.03399658203125,
0.07232666015625,
-0.0260467529296875,
0.004314422607421875,
0.051300048828125,
0.034820556640625,
-0.00003618001937866211,
0.01200103759765625,
-0.019989013671875,
-0.0693359375,
-0.039154052734375,
-0.029266357421875,
-0.03533935546875,
0.0304412841796875,
-0.038177490234375,
0.044189453125,
-0.03240966796875,
-0.005977630615234375,
-0.00525665283203125,
-0.004695892333984375,
-0.03814697265625,
0.00927734375,
0.00934600830078125,
0.0811767578125,
-0.048187255859375,
0.08056640625,
0.04278564453125,
-0.042388916015625,
-0.060516357421875,
0.00806427001953125,
-0.0301361083984375,
-0.052978515625,
0.07781982421875,
0.0268707275390625,
0.0228271484375,
0.005443572998046875,
-0.05487060546875,
-0.058319091796875,
0.07745361328125,
-0.01300811767578125,
-0.0279998779296875,
-0.00722503662109375,
0.030914306640625,
0.0291748046875,
-0.004917144775390625,
0.04046630859375,
0.006755828857421875,
0.04656982421875,
-0.0131072998046875,
-0.082275390625,
-0.018310546875,
-0.02215576171875,
0.00348663330078125,
0.017669677734375,
-0.0628662109375,
0.060882568359375,
0.010528564453125,
-0.02484130859375,
0.0286407470703125,
0.06988525390625,
0.00287628173828125,
0.01287841796875,
0.041229248046875,
0.03558349609375,
-0.005023956298828125,
-0.0144805908203125,
0.038330078125,
-0.0439453125,
0.062164306640625,
0.0576171875,
-0.004512786865234375,
0.0535888671875,
0.0284881591796875,
-0.039337158203125,
0.04205322265625,
0.0501708984375,
-0.0428466796875,
0.04522705078125,
0.00018477439880371094,
-0.00954437255859375,
-0.006488800048828125,
0.0084686279296875,
-0.0411376953125,
0.017303466796875,
0.024261474609375,
-0.0252685546875,
-0.00988006591796875,
-0.01282501220703125,
-0.001827239990234375,
-0.031219482421875,
-0.00739288330078125,
0.0362548828125,
0.0132293701171875,
-0.02374267578125,
0.039642333984375,
0.0248565673828125,
0.07244873046875,
-0.07623291015625,
-0.0269622802734375,
0.01959228515625,
0.01087188720703125,
-0.00797271728515625,
-0.044830322265625,
0.01375579833984375,
-0.0259552001953125,
-0.012451171875,
-0.0140228271484375,
0.052337646484375,
-0.0260772705078125,
-0.0406494140625,
0.0291595458984375,
0.00591278076171875,
0.01050567626953125,
0.021209716796875,
-0.08612060546875,
-0.02545166015625,
0.0250244140625,
-0.0362548828125,
0.01251983642578125,
0.010986328125,
0.00708770751953125,
0.049041748046875,
0.06378173828125,
0.00945281982421875,
-0.014404296875,
-0.00518798828125,
0.06298828125,
-0.0406494140625,
-0.039031982421875,
-0.05401611328125,
0.0582275390625,
-0.0175628662109375,
-0.02850341796875,
0.05242919921875,
0.0545654296875,
0.0863037109375,
-0.0234832763671875,
0.0750732421875,
-0.0277252197265625,
0.0574951171875,
-0.01308441162109375,
0.059539794921875,
-0.032958984375,
-0.007175445556640625,
-0.022552490234375,
-0.06591796875,
-0.0197296142578125,
0.06512451171875,
-0.01197052001953125,
-0.0072021484375,
0.053619384765625,
0.050079345703125,
-0.00008755922317504883,
-0.01690673828125,
0.01154327392578125,
0.01367950439453125,
0.048004150390625,
0.03302001953125,
0.042816162109375,
-0.042724609375,
0.044921875,
-0.05133056640625,
-0.01465606689453125,
-0.006839752197265625,
-0.04931640625,
-0.053192138671875,
-0.04400634765625,
-0.025115966796875,
-0.0098114013671875,
-0.0234222412109375,
0.06201171875,
0.054840087890625,
-0.0797119140625,
-0.0325927734375,
-0.0037937164306640625,
0.006565093994140625,
-0.023345947265625,
-0.025604248046875,
0.04608154296875,
-0.02606201171875,
-0.08721923828125,
0.002956390380859375,
0.00927734375,
0.0058746337890625,
-0.024139404296875,
-0.0024471282958984375,
-0.0172576904296875,
-0.0166168212890625,
0.0277252197265625,
0.0345458984375,
-0.0499267578125,
-0.020050048828125,
-0.001964569091796875,
-0.0146331787109375,
0.010986328125,
0.04327392578125,
-0.0159912109375,
0.0240631103515625,
0.048309326171875,
0.0203094482421875,
0.024627685546875,
-0.00920867919921875,
0.049774169921875,
-0.0323486328125,
0.021392822265625,
0.0264434814453125,
0.0406494140625,
0.0251922607421875,
-0.016510009765625,
0.03668212890625,
0.033935546875,
-0.0545654296875,
-0.044586181640625,
0.0262603759765625,
-0.073974609375,
-0.016693115234375,
0.07098388671875,
-0.0182952880859375,
-0.01334381103515625,
-0.00949859619140625,
-0.04107666015625,
0.0469970703125,
-0.02313232421875,
0.043609619140625,
0.0609130859375,
-0.006031036376953125,
-0.007602691650390625,
-0.040191650390625,
0.0297698974609375,
0.033721923828125,
-0.02532958984375,
-0.027618408203125,
-0.00415802001953125,
0.01251983642578125,
0.048828125,
0.03271484375,
-0.007843017578125,
0.006923675537109375,
-0.01068115234375,
0.044097900390625,
0.0011138916015625,
0.0159912109375,
0.005863189697265625,
-0.0128173828125,
0.0025501251220703125,
-0.027313232421875
]
] |
Rostlab/prot_t5_xl_uniref50 | 2023-01-31T21:05:58.000Z | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"protein language model",
"dataset:UniRef50",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text2text-generation | Rostlab | null | null | Rostlab/prot_t5_xl_uniref50 | 29 | 27,411 | transformers | 2022-03-02T23:29:04 | ---
tags:
- protein language model
datasets:
- UniRef50
---
# ProtT5-XL-UniRef50 model
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://doi.org/10.1101/2020.07.12.199554) and first released in
[this repository](https://github.com/agemagician/ProtTrans). This model was trained on uppercase amino acids, so it only works with capital-letter amino acid sequences.
## Model description
ProtT5-XL-UniRef50 is based on the `t5-3b` model and was pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on the raw protein sequences only, with no human labelling of any kind (which is why it can use lots of
publicly available data), using an automatic process to generate inputs and labels from those sequences.
One important difference between this T5 model and the original T5 version is the denoising objective.
The original T5-3B model was pretrained with a span-denoising objective, while this model was pretrained with a BART-like MLM denoising objective.
The masking probability is consistent with the original T5 training: 15% of the amino acids in the input are randomly masked.
It has been shown that the features extracted from this self-supervised model (LM-embeddings) captured important biophysical properties governing protein shape.
This implied learning some of the grammar of the language of life as realized in protein sequences.
## Intended uses & limitations
The model can be used for protein feature extraction or fine-tuned on downstream tasks.
We have noticed that on some tasks one can gain more accuracy by fine-tuning the model rather than using it as a feature extractor.
We have also noticed that for feature extraction it is better to use the features extracted from the encoder rather than from the decoder.
### How to use
Here is how to use this model to extract the features of a given protein sequence in PyTorch:
```python
sequence_examples = ["PRTEINO", "SEQWENCE"]
# this will replace all rare/ambiguous amino acids by X and introduce white-space between all amino acids
sequence_examples = [" ".join(list(re.sub(r"[UZOB]", "X", sequence))) for sequence in sequence_examples]
# tokenize sequences and pad up to the longest sequence in the batch
ids = tokenizer.batch_encode_plus(sequence_examples, add_special_tokens=True, padding="longest")
input_ids = torch.tensor(ids['input_ids']).to(device)
attention_mask = torch.tensor(ids['attention_mask']).to(device)
# generate embeddings
with torch.no_grad():
embedding_repr = model(input_ids=input_ids,attention_mask=attention_mask)
# extract embeddings for the first ([0,:]) sequence in the batch while removing padded & special tokens ([0,:7])
emb_0 = embedding_repr.last_hidden_state[0,:7] # shape (7 x 1024)
print(f"Shape of per-residue embedding of first sequences: {emb_0.shape}")
# do the same for the second ([1,:]) sequence in the batch while taking into account different sequence lengths ([1,:8])
emb_1 = embedding_repr.last_hidden_state[1,:8] # shape (8 x 1024)
# if you want to derive a single representation (per-protein embedding) for the whole protein
emb_0_per_protein = emb_0.mean(dim=0) # shape (1024)
print(f"Shape of per-protein embedding of first sequences: {emb_0_per_protein.shape}")
```
## Training data
The ProtT5-XL-UniRef50 model was pretrained on [UniRef50](https://www.uniprot.org/help/uniref), a dataset consisting of 45 million protein sequences.
## Training procedure
### Preprocessing
The protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids "U,Z,O,B" were mapped to "X".
The inputs of the model are then of the form:
```
Protein Sequence [EOS]
```
The preprocessing step was performed on the fly, truncating and padding the protein sequences to 512 tokens.
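As a minimal illustration of the steps above (a sketch, not the authors' actual pipeline), the preprocessing can be written as:
```python
import re

def preprocess(seq: str, max_len: int = 512) -> str:
    # uppercase and truncate to the 512-token limit
    seq = seq.upper()[:max_len]
    # map the rare amino acids U, Z, O, B to X
    seq = re.sub(r"[UZOB]", "X", seq)
    # single-space tokenization; the tokenizer appends the EOS token itself
    return " ".join(seq)
```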
The details of the masking procedure for each sequence are as follows (a code sketch follows the list):
- 15% of the amino acids are masked.
- In 90% of the cases, the masked amino acids are replaced by the `[MASK]` token.
- In 10% of the cases, the masked amino acids are replaced by a random amino acid different from the one they replace.
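A minimal sketch of that rule (illustrative only, not the authors' training code; the amino-acid alphabet is an assumption based on the vocabulary of 21 noted above):
```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWYX"  # assumed alphabet: 20 standard amino acids plus X

def mask_tokens(tokens, mask_token="[MASK]"):
    masked = []
    for token in tokens:
        if random.random() < 0.15:          # 15% of amino acids are masked
            if random.random() < 0.90:      # 90% of those become the mask token
                masked.append(mask_token)
            else:                           # 10% become a different random amino acid
                masked.append(random.choice([a for a in AMINO_ACIDS if a != token]))
        else:
            masked.append(token)
    return masked
```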
### Pretraining
The model was trained on a single TPU Pod V2-256 for 991.5 thousand steps in total, using sequence length 512 (batch size 2k).
It was trained using the ProtT5-XL-BFD model as an initial checkpoint, rather than training from scratch.
It has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.
The optimizer was AdaFactor with an inverse square-root learning-rate schedule.
## Evaluation results
When used for feature extraction, the model achieves the following test results:
| Task/Dataset | secondary structure (3-states) | secondary structure (8-states) | Localization | Membrane |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| CASP12 | 81 | 70 | | |
| TS115 | 87 | 77 | | |
| CB513 | 86 | 74 | | |
| DeepLoc | | | 81 | 91 |
### BibTeX entry and citation info
```bibtex
@article {Elnaggar2020.07.12.199554,
author = {Elnaggar, Ahmed and Heinzinger, Michael and Dallago, Christian and Rehawi, Ghalia and Wang, Yu and Jones, Llion and Gibbs, Tom and Feher, Tamas and Angerer, Christoph and Steinegger, Martin and BHOWMIK, DEBSINDHU and Rost, Burkhard},
title = {ProtTrans: Towards Cracking the Language of Life{\textquoteright}s Code Through Self-Supervised Deep Learning and High Performance Computing},
elocation-id = {2020.07.12.199554},
year = {2020},
doi = {10.1101/2020.07.12.199554},
publisher = {Cold Spring Harbor Laboratory},
abstract = {Computational biology and bioinformatics provide vast data gold-mines from protein sequences, ideal for Language Models (LMs) taken from Natural Language Processing (NLP). These LMs reach for new prediction frontiers at low inference costs. Here, we trained two auto-regressive language models (Transformer-XL, XLNet) and two auto-encoder models (Bert, Albert) on data from UniRef and BFD containing up to 393 billion amino acids (words) from 2.1 billion protein sequences (22- and 112 times the entire English Wikipedia). The LMs were trained on the Summit supercomputer at Oak Ridge National Laboratory (ORNL), using 936 nodes (total 5616 GPUs) and one TPU Pod (V3-512 or V3-1024). We validated the advantage of up-scaling LMs to larger models supported by bigger data by predicting secondary structure (3-states: Q3=76-84, 8 states: Q8=65-73), sub-cellular localization for 10 cellular compartments (Q10=74) and whether a protein is membrane-bound or water-soluble (Q2=89). Dimensionality reduction revealed that the LM-embeddings from unlabeled data (only protein sequences) captured important biophysical properties governing protein shape. This implied learning some of the grammar of the language of life realized in protein sequences. The successful up-scaling of protein LMs through HPC to larger data sets slightly reduced the gap between models trained on evolutionary information and LMs. Availability ProtTrans: \<a href="https://github.com/agemagician/ProtTrans"\>https://github.com/agemagician/ProtTrans\</a\>Competing Interest StatementThe authors have declared no competing interest.},
URL = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554},
eprint = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554.full.pdf},
journal = {bioRxiv}
}
```
> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
| 7,648 | [
[
-0.0225677490234375,
-0.041290283203125,
0.01690673828125,
-0.01338958740234375,
-0.0157318115234375,
0.00799560546875,
0.00452423095703125,
-0.0316162109375,
0.0230865478515625,
0.0294189453125,
-0.03668212890625,
-0.04132080078125,
-0.048980712890625,
0.024505615234375,
-0.0168609619140625,
0.0604248046875,
0.010223388671875,
0.0085601806640625,
0.0014247894287109375,
-0.005603790283203125,
0.007755279541015625,
-0.0333251953125,
-0.0218963623046875,
-0.03131103515625,
0.03167724609375,
0.01546478271484375,
0.03167724609375,
0.061004638671875,
0.036102294921875,
0.021331787109375,
-0.0149688720703125,
0.00984954833984375,
-0.0299835205078125,
-0.0141448974609375,
0.01232147216796875,
-0.0240631103515625,
-0.044952392578125,
-0.0036678314208984375,
0.044342041015625,
0.05523681640625,
0.00620269775390625,
0.021942138671875,
0.01090240478515625,
0.053375244140625,
-0.039520263671875,
0.012420654296875,
-0.0280914306640625,
0.0164794921875,
-0.016326904296875,
0.0050201416015625,
-0.028778076171875,
-0.00806427001953125,
0.01172637939453125,
-0.031646728515625,
0.0209503173828125,
0.006092071533203125,
0.07208251953125,
0.00594329833984375,
-0.01187896728515625,
-0.0005354881286621094,
-0.034515380859375,
0.044769287109375,
-0.0753173828125,
0.048736572265625,
0.0205535888671875,
0.0012903213500976562,
-0.034393310546875,
-0.074462890625,
-0.04632568359375,
-0.017822265625,
-0.00720977783203125,
0.006420135498046875,
-0.00667572021484375,
0.0179443359375,
0.0204925537109375,
0.032196044921875,
-0.0648193359375,
-0.005260467529296875,
-0.048828125,
-0.0150909423828125,
0.0526123046875,
-0.0211334228515625,
0.0242919921875,
-0.03399658203125,
-0.01506805419921875,
-0.021575927734375,
-0.05010986328125,
0.0111236572265625,
0.02117919921875,
0.01007843017578125,
-0.005123138427734375,
0.03790283203125,
-0.01421356201171875,
0.04022216796875,
-0.00007385015487670898,
0.005207061767578125,
0.038330078125,
-0.00498199462890625,
-0.035003662109375,
0.00664520263671875,
0.07861328125,
0.00035834312438964844,
0.027740478515625,
-0.00911712646484375,
0.0009927749633789062,
-0.032135009765625,
0.0045166015625,
-0.0660400390625,
-0.04022216796875,
0.050262451171875,
-0.043304443359375,
-0.01287078857421875,
0.0214080810546875,
-0.05987548828125,
-0.01104736328125,
-0.0027370452880859375,
0.059417724609375,
-0.05145263671875,
-0.00516510009765625,
0.0210113525390625,
-0.005832672119140625,
0.01262664794921875,
-0.00800323486328125,
-0.057220458984375,
0.015167236328125,
0.01271820068359375,
0.0689697265625,
-0.01120758056640625,
-0.01404571533203125,
-0.0252532958984375,
0.012054443359375,
0.0034580230712890625,
0.040985107421875,
-0.033935546875,
-0.0145263671875,
-0.0274200439453125,
0.02020263671875,
-0.004703521728515625,
-0.040557861328125,
0.01995849609375,
-0.03570556640625,
-0.0008101463317871094,
-0.00791168212890625,
-0.051544189453125,
-0.035125732421875,
-0.0159759521484375,
-0.05413818359375,
0.0704345703125,
0.0121917724609375,
-0.0557861328125,
0.01064300537109375,
-0.05975341796875,
-0.015106201171875,
0.01448822021484375,
-0.00745391845703125,
-0.048614501953125,
-0.003082275390625,
0.028900146484375,
0.037811279296875,
-0.006046295166015625,
0.020477294921875,
-0.03533935546875,
-0.0158233642578125,
0.03448486328125,
0.00252532958984375,
0.057220458984375,
0.02484130859375,
-0.040679931640625,
0.0017347335815429688,
-0.0643310546875,
0.0252685546875,
0.0130157470703125,
-0.0267333984375,
0.0149688720703125,
-0.01070404052734375,
-0.0118865966796875,
0.021270751953125,
0.00284576416015625,
-0.043304443359375,
0.03271484375,
-0.039093017578125,
0.05462646484375,
0.0474853515625,
0.00774383544921875,
0.025146484375,
-0.0096435546875,
0.0239715576171875,
0.0125274658203125,
0.01409912109375,
-0.0087738037109375,
-0.038787841796875,
-0.05340576171875,
-0.034912109375,
0.0482177734375,
0.0341796875,
-0.0247344970703125,
0.040740966796875,
-0.0186309814453125,
-0.0352783203125,
-0.05279541015625,
0.006343841552734375,
0.038177490234375,
0.01390838623046875,
0.061309814453125,
-0.030029296875,
-0.06268310546875,
-0.06756591796875,
-0.0134429931640625,
0.0106353759765625,
-0.0137939453125,
0.01036834716796875,
0.05731201171875,
-0.0181732177734375,
0.06695556640625,
-0.032318115234375,
-0.0218505859375,
-0.0372314453125,
-0.003631591796875,
0.03009033203125,
0.055816650390625,
0.01708984375,
-0.0460205078125,
-0.032379150390625,
-0.0117034912109375,
-0.06201171875,
0.00232696533203125,
0.0084686279296875,
0.0054931640625,
0.016632080078125,
0.0286865234375,
-0.05169677734375,
0.036041259765625,
0.0286712646484375,
-0.012420654296875,
0.03460693359375,
-0.0260009765625,
-0.00704193115234375,
-0.0826416015625,
0.025421142578125,
-0.0004489421844482422,
-0.01486968994140625,
-0.060943603515625,
-0.0046539306640625,
0.0023040771484375,
-0.004291534423828125,
-0.047576904296875,
0.04156494140625,
-0.05755615234375,
-0.018218994140625,
-0.00562286376953125,
-0.0122222900390625,
0.00556182861328125,
0.05230712890625,
-0.0015125274658203125,
0.038482666015625,
0.029296875,
-0.046875,
0.00033783912658691406,
0.0227203369140625,
-0.0257415771484375,
-0.0201263427734375,
-0.0670166015625,
0.01270294189453125,
-0.0050506591796875,
0.0267181396484375,
-0.0634765625,
-0.0101470947265625,
0.0313720703125,
-0.048065185546875,
0.036773681640625,
-0.006633758544921875,
-0.0217742919921875,
-0.039459228515625,
-0.0300445556640625,
0.02056884765625,
0.048126220703125,
-0.0221099853515625,
0.043121337890625,
0.02386474609375,
0.0061492919921875,
-0.0478515625,
-0.051727294921875,
-0.0164642333984375,
-0.006755828857421875,
-0.039154052734375,
0.04931640625,
0.0033740997314453125,
0.006740570068359375,
-0.0063934326171875,
-0.0254974365234375,
0.0207061767578125,
-0.01338958740234375,
0.0258026123046875,
-0.0021038055419921875,
-0.00201416015625,
-0.005706787109375,
-0.0082550048828125,
-0.0084228515625,
-0.0139312744140625,
-0.05084228515625,
0.057464599609375,
-0.0160980224609375,
-0.006984710693359375,
-0.04681396484375,
0.025970458984375,
0.0335693359375,
-0.0262451171875,
0.059967041015625,
0.057373046875,
-0.0380859375,
-0.0024242401123046875,
-0.034942626953125,
-0.0302581787109375,
-0.03765869140625,
0.0467529296875,
-0.03082275390625,
-0.049163818359375,
0.047119140625,
0.002346038818359375,
-0.012939453125,
0.047698974609375,
0.025177001953125,
-0.020294189453125,
0.066650390625,
0.057403564453125,
0.02044677734375,
0.026336669921875,
-0.07171630859375,
0.0253448486328125,
-0.07391357421875,
-0.041259765625,
-0.031494140625,
-0.038055419921875,
-0.046600341796875,
-0.03497314453125,
0.0323486328125,
0.0341796875,
-0.024810791015625,
0.038482666015625,
-0.035491943359375,
0.0190887451171875,
0.049774169921875,
0.0347900390625,
0.00010943412780761719,
0.020294189453125,
-0.035919189453125,
-0.0023899078369140625,
-0.0699462890625,
-0.040802001953125,
0.09466552734375,
0.038055419921875,
0.042572021484375,
0.00013113021850585938,
0.057464599609375,
0.0249481201171875,
0.005870819091796875,
-0.044342041015625,
0.03717041015625,
-0.027069091796875,
-0.03179931640625,
-0.0220489501953125,
-0.02496337890625,
-0.0733642578125,
0.0045318603515625,
-0.0193939208984375,
-0.0743408203125,
0.04376220703125,
0.00763702392578125,
-0.035491943359375,
0.033416748046875,
-0.05548095703125,
0.0780029296875,
-0.0142669677734375,
-0.0233001708984375,
0.00774383544921875,
-0.07177734375,
0.0103302001953125,
0.0002073049545288086,
0.0175323486328125,
0.006465911865234375,
0.0195465087890625,
0.08538818359375,
-0.046295166015625,
0.0709228515625,
-0.009033203125,
0.00971221923828125,
0.01678466796875,
0.007793426513671875,
0.031524658203125,
0.0025730133056640625,
0.0008177757263183594,
0.028778076171875,
0.01190948486328125,
-0.0386962890625,
-0.0271453857421875,
0.022918701171875,
-0.07513427734375,
-0.0360107421875,
-0.0287628173828125,
-0.0280609130859375,
0.011810302734375,
0.0222625732421875,
0.045806884765625,
0.040435791015625,
0.0145263671875,
0.0264129638671875,
0.062255859375,
-0.032440185546875,
0.046661376953125,
0.022979736328125,
-0.01415252685546875,
-0.0478515625,
0.0587158203125,
0.0266265869140625,
0.021331787109375,
0.045135498046875,
0.00798797607421875,
-0.036712646484375,
-0.0531005859375,
-0.0172882080078125,
0.032012939453125,
-0.049957275390625,
-0.0250701904296875,
-0.0799560546875,
-0.030426025390625,
-0.0465087890625,
0.002735137939453125,
-0.036376953125,
-0.032470703125,
-0.0155487060546875,
-0.007495880126953125,
0.0214996337890625,
0.041778564453125,
-0.0201568603515625,
0.0091705322265625,
-0.0699462890625,
0.0212860107421875,
0.00739288330078125,
0.002460479736328125,
-0.01357269287109375,
-0.06488037109375,
-0.021484375,
0.018829345703125,
-0.0313720703125,
-0.08563232421875,
0.050048828125,
0.04962158203125,
0.054840087890625,
0.0127410888671875,
-0.0007023811340332031,
0.0367431640625,
-0.0190277099609375,
0.056854248046875,
-0.002735137939453125,
-0.06561279296875,
0.05145263671875,
-0.0038318634033203125,
0.03271484375,
0.0162200927734375,
0.052520751953125,
-0.0226898193359375,
-0.029571533203125,
-0.0687255859375,
-0.07843017578125,
0.051788330078125,
0.024200439453125,
-0.0172119140625,
0.0011510848999023438,
0.045379638671875,
0.010528564453125,
0.00830841064453125,
-0.0728759765625,
-0.029632568359375,
-0.0265350341796875,
-0.02740478515625,
-0.00446319580078125,
-0.0211334228515625,
0.0031604766845703125,
-0.0340576171875,
0.056732177734375,
-0.0149078369140625,
0.05389404296875,
0.03985595703125,
-0.03680419921875,
0.00531768798828125,
0.009674072265625,
0.044769287109375,
0.046173095703125,
-0.044036865234375,
0.018463134765625,
0.007762908935546875,
-0.06329345703125,
-0.0008778572082519531,
0.02178955078125,
-0.0189361572265625,
-0.008880615234375,
0.045257568359375,
0.0604248046875,
0.00078582763671875,
-0.0369873046875,
0.0159759521484375,
0.0184478759765625,
-0.0186004638671875,
0.00028228759765625,
0.0021991729736328125,
0.007282257080078125,
0.0369873046875,
0.053619384765625,
-0.01340484619140625,
0.00473785400390625,
-0.035064697265625,
0.032745361328125,
0.016143798828125,
-0.0168914794921875,
-0.04217529296875,
0.05572509765625,
-0.00478363037109375,
-0.017974853515625,
0.0491943359375,
-0.0186004638671875,
-0.050384521484375,
0.049468994140625,
0.04962158203125,
0.07342529296875,
-0.0217742919921875,
0.020599365234375,
0.0518798828125,
0.01617431640625,
0.0005655288696289062,
0.021881103515625,
0.01049041748046875,
-0.051727294921875,
-0.019683837890625,
-0.08392333984375,
0.0039043426513671875,
0.02801513671875,
-0.0236968994140625,
0.01276397705078125,
-0.031982421875,
-0.00580596923828125,
0.004901885986328125,
0.02593994140625,
-0.050048828125,
0.0181427001953125,
0.00797271728515625,
0.06671142578125,
-0.063232421875,
0.0831298828125,
0.057098388671875,
-0.0511474609375,
-0.0592041015625,
-0.005367279052734375,
-0.008331298828125,
-0.053253173828125,
0.06732177734375,
0.033477783203125,
0.0089874267578125,
0.0267791748046875,
-0.041046142578125,
-0.07623291015625,
0.09814453125,
0.027984619140625,
-0.054443359375,
-0.01036834716796875,
0.0184783935546875,
0.041290283203125,
-0.036956787109375,
0.0254974365234375,
0.03997802734375,
0.0223388671875,
-0.002117156982421875,
-0.05352783203125,
0.01806640625,
-0.0233154296875,
-0.00179290771484375,
0.0032806396484375,
-0.0411376953125,
0.06512451171875,
-0.03179931640625,
-0.01361846923828125,
-0.01201629638671875,
0.06207275390625,
0.0252532958984375,
-0.0095672607421875,
0.0045623779296875,
0.0465087890625,
0.051025390625,
-0.0027923583984375,
0.06634521484375,
-0.0244140625,
0.0389404296875,
0.059478759765625,
0.0088348388671875,
0.0618896484375,
0.041107177734375,
-0.039825439453125,
0.03350830078125,
0.05902099609375,
-0.0079193115234375,
0.047210693359375,
0.0229339599609375,
0.0049896240234375,
-0.0157012939453125,
0.00566864013671875,
-0.045196533203125,
0.02630615234375,
0.029266357421875,
-0.0313720703125,
0.00016617774963378906,
0.0030574798583984375,
-0.0004940032958984375,
-0.0248870849609375,
-0.0061798095703125,
0.043304443359375,
0.020660400390625,
-0.02105712890625,
0.06549072265625,
0.003448486328125,
0.045867919921875,
-0.0423583984375,
0.00437164306640625,
-0.038543701171875,
0.0196533203125,
-0.0170440673828125,
-0.0430908203125,
0.01617431640625,
-0.01593017578125,
-0.01312255859375,
0.0016889572143554688,
0.043182373046875,
-0.033355712890625,
-0.034912109375,
0.0242767333984375,
0.0266876220703125,
0.0258026123046875,
-0.0079498291015625,
-0.0654296875,
0.004322052001953125,
0.002002716064453125,
-0.0404052734375,
0.0249176025390625,
0.01180267333984375,
0.0030059814453125,
0.06109619140625,
0.04022216796875,
-0.001087188720703125,
0.00699615478515625,
-0.0169219970703125,
0.06170654296875,
-0.06402587890625,
-0.021575927734375,
-0.07000732421875,
0.050506591796875,
0.005031585693359375,
-0.00980377197265625,
0.032928466796875,
0.047088623046875,
0.07403564453125,
-0.0241241455078125,
0.042938232421875,
-0.021942138671875,
0.0169219970703125,
-0.047210693359375,
0.054229736328125,
-0.0408935546875,
0.0186309814453125,
-0.016021728515625,
-0.0771484375,
-0.028350830078125,
0.05224609375,
-0.01309967041015625,
0.0242156982421875,
0.06475830078125,
0.06829833984375,
-0.00226593017578125,
-0.01300811767578125,
0.00998687744140625,
0.0323486328125,
0.0372314453125,
0.056182861328125,
0.0149078369140625,
-0.06488037109375,
0.039031982421875,
-0.02850341796875,
-0.0035381317138671875,
-0.02490234375,
-0.06512451171875,
-0.06329345703125,
-0.05169677734375,
-0.034759521484375,
-0.03485107421875,
0.02044677734375,
0.07818603515625,
0.058441162109375,
-0.07989501953125,
-0.01348114013671875,
-0.008453369140625,
-0.0155029296875,
-0.0218048095703125,
-0.0143280029296875,
0.055084228515625,
-0.009429931640625,
-0.0416259765625,
0.0270233154296875,
0.01702880859375,
0.0267181396484375,
-0.0101318359375,
-0.0171661376953125,
-0.053466796875,
-0.00998687744140625,
0.036285400390625,
0.045867919921875,
-0.0557861328125,
-0.021942138671875,
0.01202392578125,
-0.00492095947265625,
0.0159454345703125,
0.0312347412109375,
-0.06396484375,
0.021514892578125,
0.041778564453125,
0.039215087890625,
0.053558349609375,
-0.01548004150390625,
0.049163818359375,
-0.051544189453125,
0.003734588623046875,
-0.0024623870849609375,
0.021881103515625,
0.024658203125,
-0.02679443359375,
0.0362548828125,
0.04437255859375,
-0.035003662109375,
-0.061248779296875,
0.00653076171875,
-0.068359375,
-0.028289794921875,
0.08294677734375,
-0.01226806640625,
-0.03717041015625,
0.004119873046875,
0.0038509368896484375,
0.053253173828125,
-0.01415252685546875,
0.054290771484375,
0.0302886962890625,
-0.01277923583984375,
-0.0296630859375,
-0.042510986328125,
0.06268310546875,
0.032318115234375,
-0.048309326171875,
-0.01198577880859375,
0.0014791488647460938,
0.035980224609375,
0.006114959716796875,
0.03558349609375,
-0.022369384765625,
0.01207733154296875,
-0.01111602783203125,
0.02850341796875,
-0.0280914306640625,
-0.031036376953125,
-0.029815673828125,
0.0172119140625,
-0.01336669921875,
-0.01285552978515625
]
] |
TheBloke/Mistral-7B-OpenOrca-GPTQ | 2023-10-16T08:48:47.000Z | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"en",
"dataset:Open-Orca/OpenOrca",
"arxiv:2306.02707",
"arxiv:2301.13688",
"license:apache-2.0",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Mistral-7B-OpenOrca-GPTQ | 76 | 27,351 | transformers | 2023-10-02T14:28:09 | ---
base_model: Open-Orca/Mistral-7B-OpenOrca
datasets:
- Open-Orca/OpenOrca
inference: false
language:
- en
library_name: transformers
license: apache-2.0
model_creator: OpenOrca
model_name: Mistral 7B OpenOrca
model_type: mistral
pipeline_tag: text-generation
prompt_template: '<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'
quantized_by: TheBloke
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Mistral 7B OpenOrca - GPTQ
- Model creator: [OpenOrca](https://huggingface.co/Open-Orca)
- Original model: [Mistral 7B OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)
<!-- description start -->
## Description
This repo contains GPTQ model files for [OpenOrca's Mistral 7B OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GGUF)
* [OpenOrca's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: ChatML
```
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
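For reference, the template can be assembled programmatically; the helper below is an illustrative sketch, not part of the original card:
```python
def build_chatml_prompt(system_message: str, prompt: str) -> str:
    # assemble the ChatML prompt shown above
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

print(build_chatml_prompt("You are a helpful assistant.", "Tell me about AI"))
```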
<!-- prompt-template end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files, and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
Most GPTQ files are made with AutoGPTQ; Mistral models are currently quantised with Transformers.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The calibration dataset used during quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ calibration dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GPTQ/tree/main) | 4 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 32768 | 4.16 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 32768 | 4.57 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 32768 | 7.52 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 32768 | 7.68 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. |
| [gptq-8bit-32g-actorder_True](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GPTQ/tree/gptq-8bit-32g-actorder_True) | 8 | 32 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 32768 | 8.17 GB | No | 8-bit, with group size 32g and Act Order for maximum inference quality. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 32768 | 4.30 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
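To confirm which parameters a downloaded branch was quantised with, you can inspect its `quantize_config.json` (the file mentioned in the webui section below). A small illustrative sketch, assuming the repo has already been downloaded locally as shown later; the key names are assumptions based on AutoGPTQ's config format:
```python
import json

# assumption: the main branch was downloaded to ./Mistral-7B-OpenOrca-GPTQ
with open("Mistral-7B-OpenOrca-GPTQ/quantize_config.json") as f:
    cfg = json.load(f)
print(cfg.get("bits"), cfg.get("group_size"), cfg.get("desc_act"), cfg.get("damp_percent"))
```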
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download, including from branches
### In text-generation-webui
To download from the `main` branch, enter `TheBloke/Mistral-7B-OpenOrca-GPTQ` in the "Download model" box.
To download from another branch, add `:branchname` to the end of the download name, e.g. `TheBloke/Mistral-7B-OpenOrca-GPTQ:gptq-4bit-32g-actorder_True`
### From the command line
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
To download the `main` branch to a folder called `Mistral-7B-OpenOrca-GPTQ`:
```shell
mkdir Mistral-7B-OpenOrca-GPTQ
huggingface-cli download TheBloke/Mistral-7B-OpenOrca-GPTQ --local-dir Mistral-7B-OpenOrca-GPTQ --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir Mistral-7B-OpenOrca-GPTQ
huggingface-cli download TheBloke/Mistral-7B-OpenOrca-GPTQ --revision gptq-4bit-32g-actorder_True --local-dir Mistral-7B-OpenOrca-GPTQ --local-dir-use-symlinks False
```
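The same downloads can also be scripted from Python via `huggingface_hub`; a minimal sketch (not part of the original card):
```python
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="TheBloke/Mistral-7B-OpenOrca-GPTQ",
    revision="gptq-4bit-32g-actorder_True",  # omit this argument for the main branch
    local_dir="Mistral-7B-OpenOrca-GPTQ",
)
```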
<details>
<summary>More advanced huggingface-cli download usage</summary>
If you remove the `--local-dir-use-symlinks False` parameter, the files will instead be stored in the central Huggingface cache directory (default location on Linux is: `~/.cache/huggingface`), and symlinks will be added to the specified `--local-dir`, pointing to their real location in the cache. This allows interrupted downloads to be resumed, and allows you to quickly clone the repo to multiple places on disk without triggering a download again. The downside, and the reason why I don't list that as the default option, is that the files are then hidden away in a cache folder, making it harder to know where your disk space is being used and to clear it up if/when you want to remove a downloaded model.
The cache location can be changed with the `HF_HOME` environment variable, and/or the `--cache-dir` parameter to `huggingface-cli`.
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
mkdir Mistral-7B-OpenOrca-GPTQ
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/Mistral-7B-OpenOrca-GPTQ --local-dir Mistral-7B-OpenOrca-GPTQ --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
### With `git` (**not** recommended)
To clone a specific branch with `git`, use a command like this:
```shell
git clone --single-branch --branch gptq-4bit-32g-actorder_True https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GPTQ
```
Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space, since it has to store the model files twice (every byte is stored both in the intended target folder and again in the `.git` folder as a blob).
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Mistral-7B-OpenOrca-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/Mistral-7B-OpenOrca-GPTQ:gptq-4bit-32g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Mistral-7B-OpenOrca-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-tgi start -->
## Serving this model from Text Generation Inference (TGI)
It's recommended to use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/Mistral-7B-OpenOrca-GPTQ --port 3000 --quantize gptq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires huggingface-hub 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt_template,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: {response}")
```
<!-- README_GPTQ.md-use-from-tgi end -->
<!-- README_GPTQ.md-use-from-python start -->
## How to use this GPTQ model from Python code
### Install the necessary packages
Requires: Transformers 4.33.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install transformers optimum
pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7
```
If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
git checkout v0.4.2
pip3 install .
```
### You can then use the following code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/Mistral-7B-OpenOrca-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-4bit-32g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Tell me about AI"
prompt_template=f'''<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI).
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama and Mistral models in 4-bit. Please see the Provided Files table above for per-file compatibility.
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Pierre Kircher, Stanislav Ovsiannikov, Michael Levine, Eugene Pentland, Andrey, 준교 김, Randy H, Fred von Graf, Artur Olbinski, Caitlyn Gatomon, terasurfer, Jeff Scroggin, James Bentley, Vadim, Gabriel Puliatti, Harry Royden McLaughlin, Sean Connelly, Dan Guido, Edmond Seymore, Alicia Loh, subjectnull, AzureBlack, Manuel Alberto Morcote, Thomas Belote, Lone Striker, Chris Smitley, Vitor Caleffi, Johann-Peter Hartmann, Clay Pascal, biorpg, Brandon Frisco, sidney chen, transmissions 11, Pedro Madruga, jinyuan sun, Ajan Kanaga, Emad Mostaque, Trenton Dambrowitz, Jonathan Leane, Iucharbius, usrbinkat, vamX, George Stoitzev, Luke Pendergrass, theTransient, Olakabola, Swaroop Kallakuri, Cap'n Zoog, Brandon Phillips, Michael Dempsey, Nikolai Manek, danny, Matthew Berman, Gabriel Tamborski, alfie_i, Raymond Fosdick, Tom X Nguyen, Raven Klaugh, LangChain4j, Magnesian, Illia Dulskyi, David Ziegler, Mano Prime, Luis Javier Navarrete Lozano, Erik Bjäreholt, 阿明, Nathan Dryer, Alex, Rainer Wilmers, zynix, TL, Joseph William Delisle, John Villwock, Nathan LeClaire, Willem Michiel, Joguhyik, GodLy, OG, Alps Aficionado, Jeffrey Morgan, ReadyPlayerEmma, Tiffany J. Kim, Sebastain Graf, Spencer Kim, Michael Davis, webtim, Talal Aujan, knownsqashed, John Detwiler, Imad Khwaja, Deo Leter, Jerry Meng, Elijah Stavena, Rooh Singh, Pieter, SuperWojo, Alexandros Triantafyllidis, Stephen Murray, Ai Maven, ya boyyy, Enrico Ros, Ken Nordquist, Deep Realms, Nicholas, Spiking Neurons AB, Elle, Will Dee, Jack West, RoA, Luke @flexchar, Viktor Bowallius, Derek Yates, Subspace Studios, jjj, Toran Billups, Asp the Wyvern, Fen Risland, Ilya, NimbleBox.ai, Chadd, Nitin Borwankar, Emre, Mandus, Leonard Tan, Kalila, K, Trailburnt, S_X, Cory Kujawski
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: OpenOrca's Mistral 7B OpenOrca
<p><h1>🐋 TBD 🐋</h1></p>

[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
# OpenOrca - Mistral - 7B - 8k
We have used our own [OpenOrca dataset](https://huggingface.co/datasets/Open-Orca/OpenOrca) to fine-tune on top of [Mistral 7B](https://huggingface.co/mistralai/Mistral-7B-v0.1).
This dataset is our attempt to reproduce the dataset generated for Microsoft Research's [Orca Paper](https://arxiv.org/abs/2306.02707).
We use [OpenChat](https://huggingface.co/openchat) packing, trained with [Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl).
This release is trained on a curated, filtered subset of most of our GPT-4 augmented data.
It is the same subset of our data as was used in our [OpenOrcaxOpenChat-Preview2-13B model](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B).
HF Leaderboard evals place this model as #2 for all models smaller than 30B at release time, outperforming all but one 13B model.
TBD
Want to visualize our full (pre-filtering) dataset? Check out our [Nomic Atlas Map](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2).
[<img src="https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B/resolve/main/OpenOrca%20Nomic%20Atlas.png" alt="Atlas Nomic Dataset Map" width="400" height="400" />](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2)
We are in-process with training more models, so keep a look out on our org for releases coming soon with exciting partners.
We will also give sneak-peak announcements on our Discord, which you can find here:
https://AlignmentLab.ai
or on the OpenAccess AI Collective Discord for more information about Axolotl trainer here:
https://discord.gg/5y8STgB3P3
# Prompt Template
We used [OpenAI's Chat Markup Language (ChatML)](https://github.com/openai/openai-python/blob/main/chatml.md) format, with `<|im_start|>` and `<|im_end|>` tokens added to support this.
## Example Prompt Exchange
TBD
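The exchange itself is still marked TBD above; purely as a hedged illustration, a prompt in the ChatML format described earlier (with made-up system and user messages, not taken from this card) could be assembled like this:
```python
# Sketch of a ChatML-formatted prompt using the <|im_start|>/<|im_end|> tokens.
# The system and user messages below are illustrative placeholders.
system_message = "You are a helpful assistant."
user_message = "How are transformer language models trained?"

prompt = (
    f"<|im_start|>system\n{system_message}<|im_end|>\n"
    f"<|im_start|>user\n{user_message}<|im_end|>\n"
    "<|im_start|>assistant\n"
)
print(prompt)
```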
# Evaluation
We have evaluated using the methodology and tools of the HuggingFace Leaderboard, and find that we have significantly improved upon the base model.
TBD
## HuggingFaceH4 Open LLM Leaderboard Performance
TBD
## GPT4ALL Leaderboard Performance
TBD
# Dataset
We used a curated, filtered selection of most of the GPT-4 augmented data from our OpenOrca dataset, which aims to reproduce the Orca Research Paper dataset.
# Training
We trained with 8x A6000 GPUs for 62 hours, completing 4 epochs of full fine-tuning on our dataset in one training run.
Commodity cost was ~$400 (8 GPUs × 62 hours ≈ 496 GPU-hours, or roughly $0.80 per GPU-hour).
# Citation
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
| 22,484 | [
[
-0.040802001953125,
-0.059173583984375,
0.005855560302734375,
0.01485443115234375,
-0.01338958740234375,
-0.0192108154296875,
0.004547119140625,
-0.04168701171875,
0.0165557861328125,
0.02960205078125,
-0.0384521484375,
-0.042816162109375,
-0.0247039794921875,
-0.007740020751953125,
-0.0201568603515625,
0.0814208984375,
0.0036678314208984375,
-0.0157470703125,
-0.0013437271118164062,
-0.0252532958984375,
-0.016387939453125,
-0.0325927734375,
-0.0648193359375,
-0.01177215576171875,
0.03143310546875,
0.00907135009765625,
0.06829833984375,
0.038818359375,
0.011688232421875,
0.0239410400390625,
-0.00905609130859375,
-0.00182342529296875,
-0.04559326171875,
-0.00897979736328125,
0.005100250244140625,
-0.0231170654296875,
-0.044921875,
0.002716064453125,
0.0295867919921875,
0.01045989990234375,
-0.0295867919921875,
0.0183868408203125,
0.0068817138671875,
0.05279541015625,
-0.044281005859375,
0.0141448974609375,
-0.0179290771484375,
-0.0013418197631835938,
-0.014312744140625,
0.014801025390625,
-0.006023406982421875,
-0.0390625,
0.01174163818359375,
-0.0682373046875,
0.01922607421875,
-0.003376007080078125,
0.08770751953125,
0.009246826171875,
-0.04229736328125,
0.0079803466796875,
-0.0302886962890625,
0.043365478515625,
-0.06829833984375,
0.0259246826171875,
0.034942626953125,
0.020538330078125,
-0.0227508544921875,
-0.07086181640625,
-0.04266357421875,
-0.0003142356872558594,
-0.0084381103515625,
0.0275726318359375,
-0.039581298828125,
0.00716400146484375,
0.034027099609375,
0.0574951171875,
-0.0650634765625,
-0.017242431640625,
-0.0228118896484375,
-0.01430511474609375,
0.05859375,
0.01078033447265625,
0.0269927978515625,
-0.00998687744140625,
-0.02215576171875,
-0.04278564453125,
-0.045196533203125,
0.0179901123046875,
0.0223541259765625,
0.00202178955078125,
-0.048187255859375,
0.033538818359375,
-0.0235443115234375,
0.036163330078125,
0.00824737548828125,
-0.0113372802734375,
0.0227508544921875,
-0.03668212890625,
-0.0345458984375,
-0.0251312255859375,
0.0880126953125,
0.0292205810546875,
-0.020660400390625,
0.020965576171875,
-0.0006957054138183594,
-0.007659912109375,
-0.0016164779663085938,
-0.07562255859375,
-0.03875732421875,
0.037750244140625,
-0.037200927734375,
-0.01546478271484375,
0.0079193115234375,
-0.0567626953125,
-0.0002865791320800781,
-0.0037136077880859375,
0.03729248046875,
-0.043853759765625,
-0.0306243896484375,
0.01427459716796875,
-0.036590576171875,
0.0345458984375,
0.03375244140625,
-0.05010986328125,
0.04071044921875,
0.0262451171875,
0.057952880859375,
0.008697509765625,
-0.0028839111328125,
-0.020538330078125,
0.0080108642578125,
-0.0092926025390625,
0.0318603515625,
-0.00960540771484375,
-0.03729248046875,
-0.0195465087890625,
0.026519775390625,
0.0023975372314453125,
-0.0179901123046875,
0.04388427734375,
-0.0180511474609375,
0.036590576171875,
-0.04803466796875,
-0.0301971435546875,
-0.036895751953125,
0.0026645660400390625,
-0.051513671875,
0.1041259765625,
0.038116455078125,
-0.07391357421875,
0.0140380859375,
-0.03802490234375,
-0.01146697998046875,
-0.00641632080078125,
0.0010137557983398438,
-0.036895751953125,
-0.006511688232421875,
0.02044677734375,
0.0206756591796875,
-0.03082275390625,
0.0007801055908203125,
-0.030059814453125,
-0.01544189453125,
0.0088348388671875,
-0.035980224609375,
0.10186767578125,
0.014190673828125,
-0.03533935546875,
-0.0038051605224609375,
-0.04437255859375,
0.006038665771484375,
0.0307464599609375,
-0.01300048828125,
-0.01079559326171875,
-0.022064208984375,
0.01488494873046875,
0.0159454345703125,
0.0195770263671875,
-0.0298614501953125,
0.035369873046875,
-0.0232086181640625,
0.043975830078125,
0.045623779296875,
0.0026912689208984375,
0.014739990234375,
-0.03802490234375,
0.037445068359375,
0.0110931396484375,
0.049407958984375,
0.0108184814453125,
-0.056884765625,
-0.052093505859375,
-0.0218505859375,
0.0182647705078125,
0.043670654296875,
-0.058349609375,
0.03216552734375,
-0.00823974609375,
-0.0615234375,
-0.0260009765625,
-0.015625,
0.0259857177734375,
0.02557373046875,
0.033477783203125,
-0.036102294921875,
-0.0176544189453125,
-0.0587158203125,
0.0088958740234375,
-0.035919189453125,
-0.0003612041473388672,
0.04083251953125,
0.057891845703125,
-0.01204681396484375,
0.061279296875,
-0.051849365234375,
-0.0099639892578125,
0.01171112060546875,
0.002445220947265625,
0.0244598388671875,
0.041595458984375,
0.0673828125,
-0.056854248046875,
-0.04052734375,
0.00047016143798828125,
-0.048919677734375,
-0.00241851806640625,
0.0037288665771484375,
-0.034576416015625,
0.01678466796875,
0.00014579296112060547,
-0.0810546875,
0.058258056640625,
0.038299560546875,
-0.043792724609375,
0.061248779296875,
-0.0221710205078125,
0.01485443115234375,
-0.08428955078125,
0.01126861572265625,
0.0099029541015625,
-0.021942138671875,
-0.0286102294921875,
0.0095977783203125,
-0.00641632080078125,
0.01102447509765625,
-0.033935546875,
0.057891845703125,
-0.037567138671875,
-0.00012373924255371094,
0.01094818115234375,
-0.00876617431640625,
0.02166748046875,
0.04156494140625,
-0.01195526123046875,
0.059661865234375,
0.043304443359375,
-0.03131103515625,
0.041259765625,
0.0309295654296875,
0.005268096923828125,
0.0194091796875,
-0.067138671875,
0.007183074951171875,
0.004058837890625,
0.03607177734375,
-0.07373046875,
-0.0176849365234375,
0.04412841796875,
-0.038299560546875,
0.02838134765625,
-0.02685546875,
-0.028656005859375,
-0.0302581787109375,
-0.0452880859375,
0.03009033203125,
0.049774169921875,
-0.032501220703125,
0.037139892578125,
0.0302886962890625,
0.004619598388671875,
-0.05328369140625,
-0.0433349609375,
-0.02069091796875,
-0.0204010009765625,
-0.048004150390625,
0.036102294921875,
-0.0077972412109375,
-0.00391387939453125,
0.0029697418212890625,
-0.007808685302734375,
-0.0018024444580078125,
-0.0112762451171875,
0.0384521484375,
0.0198974609375,
-0.01549530029296875,
-0.020355224609375,
0.0157012939453125,
0.005527496337890625,
-0.0014219284057617188,
-0.02886962890625,
0.029449462890625,
-0.021270751953125,
-0.006374359130859375,
-0.029449462890625,
0.016632080078125,
0.0399169921875,
-0.0010690689086914062,
0.061431884765625,
0.06793212890625,
-0.028289794921875,
0.00726318359375,
-0.036956787109375,
-0.01171112060546875,
-0.036346435546875,
0.00384521484375,
-0.020355224609375,
-0.06298828125,
0.052154541015625,
0.03179931640625,
0.0238494873046875,
0.062103271484375,
0.035675048828125,
0.008819580078125,
0.069091796875,
0.02923583984375,
-0.0160369873046875,
0.037506103515625,
-0.044281005859375,
-0.0169525146484375,
-0.05126953125,
-0.01318359375,
-0.037994384765625,
-0.01239776611328125,
-0.058258056640625,
-0.040679931640625,
0.03326416015625,
0.0281219482421875,
-0.05609130859375,
0.04473876953125,
-0.056610107421875,
0.01192474365234375,
0.045867919921875,
0.025604248046875,
0.0133056640625,
0.004344940185546875,
-0.01523590087890625,
0.01071929931640625,
-0.045013427734375,
-0.02008056640625,
0.077880859375,
0.03155517578125,
0.05010986328125,
0.0243682861328125,
0.0325927734375,
0.0079498291015625,
0.0233154296875,
-0.03570556640625,
0.033111572265625,
0.0004775524139404297,
-0.051361083984375,
-0.0265960693359375,
-0.046112060546875,
-0.07366943359375,
0.0177459716796875,
-0.00653839111328125,
-0.0623779296875,
0.035919189453125,
-0.0008292198181152344,
-0.02447509765625,
0.0222320556640625,
-0.04901123046875,
0.0806884765625,
-0.0023441314697265625,
-0.03192138671875,
-0.0015993118286132812,
-0.0550537109375,
0.0289154052734375,
0.02001953125,
-0.0041046142578125,
-0.00844573974609375,
-0.0139617919921875,
0.0589599609375,
-0.0684814453125,
0.049957275390625,
-0.024505615234375,
-0.00335693359375,
0.044647216796875,
-0.005523681640625,
0.034637451171875,
0.01549530029296875,
0.0026950836181640625,
0.032958984375,
0.04388427734375,
-0.036712646484375,
-0.03179931640625,
0.04095458984375,
-0.0810546875,
-0.0294189453125,
-0.041412353515625,
-0.0301971435546875,
0.00662994384765625,
0.00498199462890625,
0.039825439453125,
0.036712646484375,
-0.01399993896484375,
0.0022735595703125,
0.049041748046875,
-0.03167724609375,
0.03009033203125,
0.0220794677734375,
-0.0239105224609375,
-0.0477294921875,
0.05889892578125,
0.0033397674560546875,
0.0136566162109375,
0.0220184326171875,
0.00860595703125,
-0.03485107421875,
-0.027923583984375,
-0.04620361328125,
0.0296630859375,
-0.0382080078125,
-0.0301361083984375,
-0.049957275390625,
-0.0200653076171875,
-0.0345458984375,
0.020050048828125,
-0.0259552001953125,
-0.04986572265625,
-0.03253173828125,
0.0083160400390625,
0.06854248046875,
0.03167724609375,
-0.01495361328125,
0.0273284912109375,
-0.06256103515625,
0.0181732177734375,
0.0259857177734375,
0.0116119384765625,
0.0040740966796875,
-0.0491943359375,
-0.008636474609375,
0.0193634033203125,
-0.04974365234375,
-0.07281494140625,
0.047882080078125,
0.0128631591796875,
0.0301971435546875,
0.038360595703125,
0.01338958740234375,
0.06298828125,
-0.0094451904296875,
0.07647705078125,
0.017822265625,
-0.0635986328125,
0.038055419921875,
-0.043792724609375,
0.01483154296875,
0.034210205078125,
0.046234130859375,
-0.026519775390625,
-0.0210113525390625,
-0.061248779296875,
-0.056671142578125,
0.037750244140625,
0.0345458984375,
-0.007106781005859375,
0.00582122802734375,
0.04840087890625,
0.0010395050048828125,
0.005718231201171875,
-0.05731201171875,
-0.046661376953125,
-0.027862548828125,
-0.00617218017578125,
0.01103973388671875,
-0.00054931640625,
-0.02239990234375,
-0.044891357421875,
0.07403564453125,
-0.0127105712890625,
0.05078125,
0.0254364013671875,
0.018218994140625,
-0.0031490325927734375,
-0.00021886825561523438,
0.0251922607421875,
0.0401611328125,
-0.0206146240234375,
-0.0201568603515625,
0.007762908935546875,
-0.06524658203125,
0.002166748046875,
0.0262603759765625,
-0.0048828125,
-0.0038814544677734375,
0.004673004150390625,
0.05181884765625,
-0.00926971435546875,
-0.024017333984375,
0.044830322265625,
-0.0302581787109375,
-0.0222320556640625,
-0.0244140625,
0.01971435546875,
0.0126953125,
0.0304718017578125,
0.0216827392578125,
-0.0133056640625,
0.0219879150390625,
-0.0390625,
0.01108551025390625,
0.039093017578125,
-0.0227813720703125,
-0.0252532958984375,
0.06280517578125,
-0.01187896728515625,
0.02178955078125,
0.05255126953125,
-0.0177764892578125,
-0.0235137939453125,
0.054534912109375,
0.0272216796875,
0.0572509765625,
-0.017059326171875,
0.017181396484375,
0.044891357421875,
0.0122528076171875,
-0.015045166015625,
0.035064697265625,
-0.004489898681640625,
-0.046417236328125,
-0.022369384765625,
-0.047454833984375,
-0.0233917236328125,
0.01544952392578125,
-0.0654296875,
0.01549530029296875,
-0.0340576171875,
-0.034423828125,
-0.00844573974609375,
0.023895263671875,
-0.04571533203125,
0.0176849365234375,
-0.004444122314453125,
0.07763671875,
-0.05889892578125,
0.0657958984375,
0.044830322265625,
-0.038726806640625,
-0.0780029296875,
-0.0220947265625,
0.0139617919921875,
-0.04229736328125,
0.0104217529296875,
-0.0002105236053466797,
0.02166748046875,
-0.0018310546875,
-0.0546875,
-0.061492919921875,
0.1080322265625,
0.0293426513671875,
-0.0335693359375,
-0.008087158203125,
-0.0032520294189453125,
0.025909423828125,
-0.004547119140625,
0.06011962890625,
0.043060302734375,
0.026519775390625,
0.0194091796875,
-0.07427978515625,
0.0303497314453125,
-0.03338623046875,
0.0026092529296875,
0.0194549560546875,
-0.07415771484375,
0.0736083984375,
0.002468109130859375,
-0.0117950439453125,
0.0166015625,
0.05120849609375,
0.03253173828125,
0.00624847412109375,
0.0278167724609375,
0.0712890625,
0.0535888671875,
-0.032440185546875,
0.0850830078125,
-0.0113067626953125,
0.04498291015625,
0.0584716796875,
0.010009765625,
0.0552978515625,
0.0131072998046875,
-0.049163818359375,
0.04437255859375,
0.07684326171875,
-0.001026153564453125,
0.023468017578125,
-0.0011167526245117188,
-0.02996826171875,
0.0006899833679199219,
0.004505157470703125,
-0.057403564453125,
0.007137298583984375,
0.031402587890625,
-0.0146484375,
-0.0017414093017578125,
-0.01253509521484375,
0.00801849365234375,
-0.052581787109375,
-0.0161285400390625,
0.044464111328125,
0.0220794677734375,
-0.019500732421875,
0.0638427734375,
-0.00022017955780029297,
0.04296875,
-0.04119873046875,
-0.01120758056640625,
-0.0250701904296875,
-0.006103515625,
-0.0266571044921875,
-0.05462646484375,
0.00823974609375,
-0.0220184326171875,
-0.007541656494140625,
0.0036182403564453125,
0.04998779296875,
-0.0147247314453125,
-0.021759033203125,
0.023895263671875,
0.037445068359375,
0.026092529296875,
-0.01702880859375,
-0.08477783203125,
0.01393890380859375,
0.00249481201171875,
-0.05133056640625,
0.03436279296875,
0.039764404296875,
0.005474090576171875,
0.045867919921875,
0.050506591796875,
-0.004283905029296875,
-0.0012454986572265625,
-0.010467529296875,
0.078125,
-0.05584716796875,
-0.0277252197265625,
-0.05267333984375,
0.046051025390625,
-0.006313323974609375,
-0.036956787109375,
0.058074951171875,
0.048919677734375,
0.05615234375,
0.0065460205078125,
0.04693603515625,
-0.0303497314453125,
0.01013946533203125,
-0.0234832763671875,
0.05029296875,
-0.051513671875,
-0.0003139972686767578,
-0.033203125,
-0.0638427734375,
-0.0015745162963867188,
0.052032470703125,
-0.00836944580078125,
0.0184326171875,
0.0306243896484375,
0.06964111328125,
-0.00551605224609375,
0.0191192626953125,
0.0095062255859375,
0.0269012451171875,
0.01331329345703125,
0.06463623046875,
0.046417236328125,
-0.07275390625,
0.03167724609375,
-0.0281829833984375,
-0.0245819091796875,
-0.0022220611572265625,
-0.055023193359375,
-0.059417724609375,
-0.0396728515625,
-0.0430908203125,
-0.056884765625,
-0.005939483642578125,
0.0634765625,
0.060943603515625,
-0.05096435546875,
-0.02655029296875,
-0.004039764404296875,
-0.0017194747924804688,
-0.0278167724609375,
-0.02423095703125,
0.03399658203125,
0.025543212890625,
-0.048248291015625,
0.0042724609375,
0.0079498291015625,
0.02685546875,
-0.01103973388671875,
-0.021270751953125,
-0.0074462890625,
-0.00246429443359375,
0.044281005859375,
0.0465087890625,
-0.044891357421875,
-0.0084228515625,
-0.014434814453125,
-0.01085662841796875,
0.01546478271484375,
0.021575927734375,
-0.05572509765625,
0.004268646240234375,
0.04107666015625,
0.015228271484375,
0.07012939453125,
0.004886627197265625,
0.0267791748046875,
-0.0297393798828125,
0.01092529296875,
0.00226593017578125,
0.025909423828125,
-0.0021953582763671875,
-0.03399658203125,
0.05078125,
0.0283355712890625,
-0.049591064453125,
-0.047637939453125,
-0.01812744140625,
-0.09912109375,
-0.017333984375,
0.0771484375,
-0.0176849365234375,
-0.0301971435546875,
-0.0027713775634765625,
-0.025482177734375,
0.02880859375,
-0.0460205078125,
0.0197296142578125,
0.03399658203125,
-0.016815185546875,
-0.0256805419921875,
-0.0628662109375,
0.047454833984375,
0.0136566162109375,
-0.0621337890625,
0.0003268718719482422,
0.043792724609375,
0.0309295654296875,
0.00400543212890625,
0.0592041015625,
-0.0248870849609375,
0.0240325927734375,
0.0081634521484375,
0.00698089599609375,
0.0021915435791015625,
0.00689697265625,
-0.025634765625,
0.0038280487060546875,
-0.019195556640625,
0.000835418701171875
]
] |
facebook/musicgen-small | 2023-10-10T11:56:15.000Z | [
"transformers",
"pytorch",
"safetensors",
"musicgen",
"text-to-audio",
"arxiv:2306.05284",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-to-audio | facebook | null | null | facebook/musicgen-small | 118 | 27,332 | transformers | 2023-06-08T17:28:01 | ---
inference: true
tags:
- musicgen
license: cc-by-nc-4.0
pipeline_tag: text-to-audio
widget:
- text: "a funky house with 80s hip hop vibes"
example_title: "Prompt 1"
- text: "a chill song with influences from lofi, chillstep and downtempo"
example_title: "Prompt 2"
- text: "a catchy beat for a podcast intro"
example_title: "Prompt 3"
---
# MusicGen - Small - 300M
MusicGen is a text-to-music model capable of generating high-quality music samples conditioned on text descriptions or audio prompts.
It is a single stage auto-regressive Transformer model trained over a 32kHz EnCodec tokenizer with 4 codebooks sampled at 50 Hz.
Unlike existing methods such as MusicLM, MusicGen doesn't require a self-supervised semantic representation, and it generates all 4 codebooks in one pass.
By introducing a small delay between the codebooks, we show we can predict them in parallel, thus having only 50 auto-regressive steps per second of audio.
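To make the arithmetic concrete, here is a small sketch (not part of the original card) of how the 50 Hz frame rate translates into auto-regressive steps and total codes for a given clip length:
```python
# Token/step arithmetic implied above: 4 codebooks at 50 Hz, predicted in
# parallel thanks to the codebook delay, so ~50 auto-regressive steps/second.
FRAME_RATE_HZ = 50
NUM_CODEBOOKS = 4

def steps_for(duration_s: float) -> int:
    """Auto-regressive steps needed for `duration_s` seconds of audio."""
    return round(duration_s * FRAME_RATE_HZ)

for secs in (5, 8, 30):
    steps = steps_for(secs)
    print(f"{secs:>2} s of audio -> {steps} steps, {steps * NUM_CODEBOOKS} codes in total")
```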
MusicGen was published in [Simple and Controllable Music Generation](https://arxiv.org/abs/2306.05284) by *Jade Copet, Felix Kreuk, Itai Gat, Tal Remez, David Kant, Gabriel Synnaeve, Yossi Adi, Alexandre Défossez*.
Four checkpoints are released:
- [**small** (this checkpoint)](https://huggingface.co/facebook/musicgen-small)
- [medium](https://huggingface.co/facebook/musicgen-medium)
- [large](https://huggingface.co/facebook/musicgen-large)
- [melody](https://huggingface.co/facebook/musicgen-melody)
## Example
Try out MusicGen yourself!
* Audiocraft Colab:
<a target="_blank" href="https://colab.research.google.com/drive/1fxGqfg96RBUvGxZ1XXN07s3DthrKUl4-?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
* Hugging Face Colab:
<a target="_blank" href="https://colab.research.google.com/github/sanchit-gandhi/notebooks/blob/main/MusicGen.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
* Hugging Face Demo:
<a target="_blank" href="https://huggingface.co/spaces/facebook/MusicGen">
<img src="https://huggingface.co/datasets/huggingface/badges/raw/main/open-in-hf-spaces-sm.svg" alt="Open in HuggingFace"/>
</a>
## 🤗 Transformers Usage
You can run MusicGen locally with the 🤗 Transformers library from version 4.31.0 onwards.
1. First install the 🤗 [Transformers library](https://github.com/huggingface/transformers) and scipy:
```
pip install --upgrade pip
pip install --upgrade transformers scipy
```
2. Run inference via the `Text-to-Audio` (TTA) pipeline. You can infer the MusicGen model in just a few lines of code!
```python
from transformers import pipeline
import scipy
synthesiser = pipeline("text-to-audio", "facebook/musicgen-small")
music = synthesiser("lo-fi music with a soothing melody", forward_params={"do_sample": True})
scipy.io.wavfile.write("musicgen_out.wav", rate=music["sampling_rate"], data=music["audio"])
```
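Assuming `forward_params` is passed through to `model.generate` (as the modelling-code example below suggests), the same mechanism can bound the clip length; a sketch:
```python
from transformers import pipeline

synthesiser = pipeline("text-to-audio", "facebook/musicgen-small")

# Hypothetical variation of the call above: max_new_tokens caps the number of
# generated audio tokens, i.e. roughly 256 / 50 Hz ≈ 5 seconds of music.
music = synthesiser(
    "lo-fi music with a soothing melody",
    forward_params={"do_sample": True, "max_new_tokens": 256},
)
```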
3. Run inference via the Transformers modelling code. You can use the processor + generate code to convert text into a mono 32 kHz audio waveform for more fine-grained control.
```python
from transformers import AutoProcessor, MusicgenForConditionalGeneration
processor = AutoProcessor.from_pretrained("facebook/musicgen-small")
model = MusicgenForConditionalGeneration.from_pretrained("facebook/musicgen-small")
inputs = processor(
text=["80s pop track with bassy drums and synth", "90s rock song with loud guitars and heavy drums"],
padding=True,
return_tensors="pt",
)
audio_values = model.generate(**inputs, max_new_tokens=256)  # 256 audio tokens ≈ 5 seconds at the 50 Hz frame rate
```
4. Listen to the audio samples either in an ipynb notebook:
```python
from IPython.display import Audio
sampling_rate = model.config.audio_encoder.sampling_rate
Audio(audio_values[0].numpy(), rate=sampling_rate)
```
Or save them as a `.wav` file using a third-party library, e.g. `scipy`:
```python
import scipy
sampling_rate = model.config.audio_encoder.sampling_rate
scipy.io.wavfile.write("musicgen_out.wav", rate=sampling_rate, data=audio_values[0, 0].numpy())
```
For more details on using the MusicGen model for inference using the 🤗 Transformers library, refer to the [MusicGen docs](https://huggingface.co/docs/transformers/model_doc/musicgen).
## Audiocraft Usage
You can also run MusicGen locally through the original [Audiocraft library](https://github.com/facebookresearch/audiocraft):
1. First install the [`audiocraft` library](https://github.com/facebookresearch/audiocraft):
```
pip install git+https://github.com/facebookresearch/audiocraft.git
```
2. Make sure to have [`ffmpeg`](https://ffmpeg.org/download.html) installed:
```
apt-get install ffmpeg
```
3. Run the following Python code:
```py
from audiocraft.models import MusicGen
from audiocraft.data.audio import audio_write
model = MusicGen.get_pretrained("small")
model.set_generation_params(duration=8) # generate 8 seconds.
descriptions = ["happy rock", "energetic EDM"]
wav = model.generate(descriptions) # generates 2 samples.
for idx, one_wav in enumerate(wav):
# Will save under {idx}.wav, with loudness normalization at -14 db LUFS.
audio_write(f'{idx}', one_wav.cpu(), model.sample_rate, strategy="loudness")
```
## Model details
**Organization developing the model:** The FAIR team of Meta AI.
**Model date:** MusicGen was trained between April 2023 and May 2023.
**Model version:** This is version 1 of the model.
**Model type:** MusicGen consists of an EnCodec model for audio tokenization and an auto-regressive language model based on the transformer architecture for music modeling. The model comes in three sizes (300M, 1.5B and 3.3B parameters) and two variants: a model trained for the text-to-music generation task and a model trained for melody-guided music generation.
**Paper or resources for more information:** More information can be found in the paper [Simple and Controllable Music Generation](https://arxiv.org/abs/2306.05284).
**Citation details:**
```
@misc{copet2023simple,
title={Simple and Controllable Music Generation},
author={Jade Copet and Felix Kreuk and Itai Gat and Tal Remez and David Kant and Gabriel Synnaeve and Yossi Adi and Alexandre Défossez},
year={2023},
eprint={2306.05284},
archivePrefix={arXiv},
primaryClass={cs.SD}
}
```
**License:** Code is released under MIT, model weights are released under CC-BY-NC 4.0.
**Where to send questions or comments about the model:** Questions and comments about MusicGen can be sent via the [Github repository](https://github.com/facebookresearch/audiocraft) of the project, or by opening an issue.
## Intended use
**Primary intended use:** The primary use of MusicGen is research on AI-based music generation, including:
- Research efforts, such as probing and better understanding the limitations of generative models to further improve the state of science
- Generation of music guided by text or melody, to help machine learning amateurs understand the current abilities of generative AI models
**Primary intended users:** The primary intended users of the model are researchers in audio, machine learning and artificial intelligence, as well as amateurs seeking to better understand those models.
**Out-of-scope use cases:** The model should not be used on downstream applications without further risk evaluation and mitigation. The model should not be used to intentionally create or disseminate music pieces that create hostile or alienating environments for people. This includes generating music that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
## Metrics
**Model performance measures:** We used the following objective measures to evaluate the model on a standard music benchmark:
- Frechet Audio Distance computed on features extracted from a pre-trained audio classifier (VGGish)
- Kullback-Leibler Divergence on label distributions extracted from a pre-trained audio classifier (PaSST)
- CLAP Score between audio embedding and text embedding extracted from a pre-trained CLAP model
Additionally, we run qualitative studies with human participants, evaluating the performance of the model with the following axes:
- Overall quality of the music samples;
- Text relevance to the provided text input;
- Adherence to the melody for melody-guided music generation.
More details on performance measures and human studies can be found in the paper.
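As a rough, illustrative sketch (not the evaluation code used for the paper), the CLAP score boils down to a cosine similarity between an audio embedding and a text embedding:
```python
import numpy as np

def clap_style_score(audio_emb: np.ndarray, text_emb: np.ndarray) -> float:
    """Cosine similarity between an audio and a text embedding.

    Toy stand-in for the CLAP score: in practice both embeddings come from
    a pre-trained CLAP model, not the random vectors used below.
    """
    a = audio_emb / np.linalg.norm(audio_emb)
    t = text_emb / np.linalg.norm(text_emb)
    return float(a @ t)

# Toy usage with random vectors in place of real CLAP embeddings.
rng = np.random.default_rng(0)
print(clap_style_score(rng.normal(size=512), rng.normal(size=512)))
```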
**Decision thresholds:** Not applicable.
## Evaluation datasets
The model was evaluated on the [MusicCaps benchmark](https://www.kaggle.com/datasets/googleai/musiccaps) and on an in-domain held-out evaluation set, with no artist overlap with the training set.
## Training datasets
The model was trained on licensed data using the following sources: the [Meta Music Initiative Sound Collection](https://www.fb.com/sound), [Shutterstock music collection](https://www.shutterstock.com/music) and the [Pond5 music collection](https://www.pond5.com/). See the paper for more details about the training set and corresponding preprocessing.
## Evaluation results
Below are the objective metrics obtained on MusicCaps with the released model. Note that for the publicly released models, we had all the datasets go through a state-of-the-art music source separation method, namely using the open source [Hybrid Transformer for Music Source Separation](https://github.com/facebookresearch/demucs) (HT-Demucs), in order to keep only the instrumental part. This explains the difference in objective metrics with the models used in the paper.
| Model | Frechet Audio Distance | KLD | Text Consistency | Chroma Cosine Similarity |
|---|---|---|---|---|
| **facebook/musicgen-small** | 4.88 | 1.42 | 0.27 | - |
| facebook/musicgen-medium | 5.14 | 1.38 | 0.28 | - |
| facebook/musicgen-large | 5.48 | 1.37 | 0.28 | - |
| facebook/musicgen-melody | 4.93 | 1.41 | 0.27 | 0.44 |
More information can be found in the paper [Simple and Controllable Music Generation](https://arxiv.org/abs/2306.05284), in the Results section.
## Limitations and biases
**Data:** The data sources used to train the model are created by music professionals and covered by legal agreements with the right holders. The model is trained on 20K hours of data; we believe that scaling the model on larger datasets can further improve its performance.
**Mitigations:** Vocals have been removed from the data source using corresponding tags, and then using a state-of-the-art music source separation method, namely using the open source [Hybrid Transformer for Music Source Separation](https://github.com/facebookresearch/demucs) (HT-Demucs).
**Limitations:**
- The model is not able to generate realistic vocals.
- The model has been trained with English descriptions and will not perform as well in other languages.
- The model does not perform equally well for all music styles and cultures.
- The model sometimes generates end of songs, collapsing to silence.
- It is sometimes difficult to assess what types of text descriptions provide the best generations. Prompt engineering may be required to obtain satisfying results.
**Biases:** The source of data is potentially lacking diversity and all music cultures are not equally represented in the dataset. The model may not perform equally well on the wide variety of music genres that exists. The generated samples from the model will reflect the biases from the training data. Further work on this model should include methods for balanced and just representations of cultures, for example, by scaling the training data to be both diverse and inclusive.
**Risks and harms:** Biases and limitations of the model may lead to generation of samples that may be considered as biased, inappropriate or offensive. We believe that providing the code to reproduce the research and train new models will help broaden the application to new and more representative data.
**Use cases:** Users must be aware of the biases, limitations and risks of the model. MusicGen is a model developed for artificial intelligence research on controllable music generation. As such, it should not be used for downstream applications without further investigation and mitigation of risks. | 12,296 | [
[
-0.042083740234375,
-0.047576904296875,
0.017303466796875,
0.036224365234375,
0.0008254051208496094,
-0.007053375244140625,
-0.0391845703125,
-0.0213775634765625,
0.00971221923828125,
0.017547607421875,
-0.0770263671875,
-0.056488037109375,
-0.025360107421875,
0.00982666015625,
-0.031494140625,
0.076416015625,
0.0156402587890625,
-0.0009860992431640625,
-0.02313232421875,
0.0119476318359375,
-0.0197601318359375,
-0.00988006591796875,
-0.047332763671875,
-0.03228759765625,
0.00910186767578125,
0.017059326171875,
0.01678466796875,
0.040283203125,
0.040252685546875,
0.0291900634765625,
-0.02203369140625,
-0.0180206298828125,
-0.02825927734375,
0.005481719970703125,
0.019805908203125,
-0.048675537109375,
-0.040313720703125,
0.03631591796875,
0.0254058837890625,
0.0170440673828125,
-0.0115203857421875,
0.04351806640625,
-0.00272369384765625,
0.0252685546875,
0.004642486572265625,
0.0190277099609375,
-0.035430908203125,
0.013092041015625,
-0.00727081298828125,
-0.009613037109375,
-0.0276336669921875,
-0.025848388671875,
0.0004963874816894531,
-0.060699462890625,
0.026763916015625,
0.01103973388671875,
0.06585693359375,
0.027252197265625,
0.008544921875,
-0.0269317626953125,
-0.048065185546875,
0.06158447265625,
-0.054443359375,
0.0132598876953125,
0.034881591796875,
0.024871826171875,
-0.01425933837890625,
-0.0611572265625,
-0.041229248046875,
-0.011199951171875,
0.00887298583984375,
0.0293121337890625,
-0.023040771484375,
-0.006404876708984375,
0.037139892578125,
0.051971435546875,
-0.040618896484375,
-0.03265380859375,
-0.042144775390625,
-0.023345947265625,
0.0628662109375,
-0.01021575927734375,
0.043853759765625,
-0.038787841796875,
-0.033477783203125,
-0.036956787109375,
-0.037078857421875,
0.0235748291015625,
0.047515869140625,
0.0214996337890625,
-0.04608154296875,
0.035552978515625,
0.0190582275390625,
0.04736328125,
0.0399169921875,
-0.03717041015625,
0.0537109375,
-0.033599853515625,
-0.009674072265625,
0.01995849609375,
0.09576416015625,
0.00933074951171875,
0.00836944580078125,
0.01206207275390625,
-0.0181121826171875,
0.006195068359375,
-0.006237030029296875,
-0.0604248046875,
-0.034515380859375,
0.03497314453125,
-0.05743408203125,
-0.024658203125,
0.0008516311645507812,
-0.056243896484375,
0.0017652511596679688,
-0.0309600830078125,
0.06500244140625,
-0.053863525390625,
-0.0246124267578125,
0.005229949951171875,
-0.04168701171875,
-0.006763458251953125,
-0.0236663818359375,
-0.055023193359375,
-0.006786346435546875,
0.039581298828125,
0.0631103515625,
-0.004302978515625,
-0.01497650146484375,
-0.0169219970703125,
-0.006328582763671875,
-0.0089263916015625,
0.032470703125,
-0.0159454345703125,
-0.05035400390625,
-0.02178955078125,
0.00864410400390625,
-0.006664276123046875,
-0.05169677734375,
0.04949951171875,
-0.0023822784423828125,
0.045928955078125,
0.00383758544921875,
-0.04156494140625,
-0.0088043212890625,
-0.0231170654296875,
-0.04449462890625,
0.068115234375,
0.0270843505859375,
-0.056488037109375,
0.0223388671875,
-0.056793212890625,
-0.0322265625,
-0.02679443359375,
-0.0018129348754882812,
-0.052001953125,
0.0081329345703125,
0.0019931793212890625,
0.01910400390625,
-0.0206146240234375,
0.01441192626953125,
-0.02520751953125,
-0.04559326171875,
0.019683837890625,
-0.0218048095703125,
0.076171875,
0.043670654296875,
-0.034912109375,
0.009979248046875,
-0.0789794921875,
-0.0155487060546875,
0.020843505859375,
-0.038604736328125,
-0.005214691162109375,
-0.0179443359375,
0.0380859375,
0.0299835205078125,
0.0049896240234375,
-0.055877685546875,
0.006153106689453125,
-0.037017822265625,
0.051361083984375,
0.043426513671875,
-0.00830078125,
0.039520263671875,
-0.053802490234375,
0.03912353515625,
-0.01837158203125,
0.00830841064453125,
-0.019927978515625,
-0.03656005859375,
-0.0208282470703125,
-0.032196044921875,
0.04010009765625,
0.031341552734375,
-0.0240631103515625,
0.059967041015625,
-0.00543975830078125,
-0.0535888671875,
-0.07763671875,
0.01061248779296875,
0.0113067626953125,
0.02630615234375,
0.032562255859375,
-0.017364501953125,
-0.0494384765625,
-0.05645751953125,
-0.0139923095703125,
-0.014434814453125,
-0.034637451171875,
0.032867431640625,
0.018890380859375,
-0.0277557373046875,
0.07537841796875,
-0.024871826171875,
-0.03656005859375,
-0.0185394287109375,
0.0269622802734375,
0.045989990234375,
0.067138671875,
0.055877685546875,
-0.06048583984375,
-0.0218658447265625,
-0.041351318359375,
-0.035064697265625,
-0.0390625,
-0.01457977294921875,
-0.0149993896484375,
-0.02001953125,
0.030853271484375,
-0.05291748046875,
-0.0017461776733398438,
0.047332763671875,
-0.023956298828125,
0.0241546630859375,
0.0171661376953125,
0.0135650634765625,
-0.084228515625,
0.006011962890625,
0.00505828857421875,
-0.00482940673828125,
-0.036773681640625,
-0.032501220703125,
-0.0205841064453125,
-0.00926971435546875,
-0.025482177734375,
0.016876220703125,
-0.00911712646484375,
0.0045166015625,
-0.0284271240234375,
0.0085296630859375,
-0.00408172607421875,
0.04278564453125,
-0.00417327880859375,
0.044586181640625,
0.042999267578125,
-0.034881591796875,
0.02642822265625,
0.0237884521484375,
-0.0328369140625,
0.045440673828125,
-0.0537109375,
-0.00237274169921875,
-0.00469970703125,
0.02276611328125,
-0.07037353515625,
-0.0233612060546875,
0.0208740234375,
-0.0626220703125,
0.038055419921875,
-0.006916046142578125,
-0.04095458984375,
-0.046966552734375,
0.0026149749755859375,
0.042388916015625,
0.0745849609375,
-0.03912353515625,
0.060272216796875,
0.031494140625,
-0.006580352783203125,
0.005825042724609375,
-0.06414794921875,
-0.0249786376953125,
-0.036956787109375,
-0.0625,
0.044677734375,
-0.0228729248046875,
-0.01525115966796875,
0.00627899169921875,
-0.0130615234375,
0.0222015380859375,
-0.00293731689453125,
0.044281005859375,
0.0125579833984375,
-0.00635528564453125,
0.020172119140625,
-0.005229949951171875,
-0.0181121826171875,
0.0251617431640625,
-0.0299072265625,
0.048858642578125,
-0.01025390625,
-0.0220489501953125,
-0.044464111328125,
0.00553131103515625,
0.03826904296875,
-0.01824951171875,
0.00443267822265625,
0.070068359375,
-0.00640869140625,
-0.019073486328125,
-0.0206451416015625,
-0.016754150390625,
-0.040313720703125,
0.0125274658203125,
-0.0203399658203125,
-0.04644775390625,
0.0274810791015625,
-0.00849151611328125,
0.01073455810546875,
0.06982421875,
0.038421630859375,
-0.024383544921875,
0.08709716796875,
0.035797119140625,
-0.006671905517578125,
0.053253173828125,
-0.04840087890625,
-0.0172576904296875,
-0.044769287109375,
-0.0249786376953125,
-0.0401611328125,
-0.035369873046875,
-0.057830810546875,
-0.030364990234375,
0.03509521484375,
-0.0172271728515625,
-0.029083251953125,
0.04241943359375,
-0.052001953125,
-0.00009351968765258789,
0.060546875,
-0.0018329620361328125,
0.003688812255859375,
0.0117034912109375,
-0.01297760009765625,
0.004787445068359375,
-0.048797607421875,
-0.0025272369384765625,
0.077880859375,
0.03765869140625,
0.06695556640625,
-0.00982666015625,
0.07073974609375,
0.017486572265625,
0.0132598876953125,
-0.05389404296875,
0.0171356201171875,
-0.00586700439453125,
-0.058837890625,
-0.007579803466796875,
-0.03741455078125,
-0.047149658203125,
-0.00225067138671875,
-0.022430419921875,
-0.03948974609375,
0.01480865478515625,
0.004016876220703125,
-0.029998779296875,
0.0138397216796875,
-0.044586181640625,
0.051544189453125,
-0.0142974853515625,
-0.006397247314453125,
0.0140228271484375,
-0.04119873046875,
0.03118896484375,
-0.0176849365234375,
0.042938232421875,
-0.02960205078125,
0.0287017822265625,
0.0694580078125,
-0.016754150390625,
0.047882080078125,
-0.0171966552734375,
-0.00930023193359375,
0.041473388671875,
-0.00980377197265625,
0.0160675048828125,
-0.013153076171875,
-0.0009074211120605469,
0.016326904296875,
0.004398345947265625,
-0.0177459716796875,
-0.02362060546875,
0.05462646484375,
-0.056884765625,
-0.034210205078125,
-0.00469970703125,
-0.05169677734375,
-0.00865936279296875,
0.005802154541015625,
0.059173583984375,
0.0228729248046875,
0.006744384765625,
0.02630615234375,
0.04425048828125,
-0.0303955078125,
0.041656494140625,
0.012664794921875,
-0.02398681640625,
-0.04296875,
0.0755615234375,
-0.0132904052734375,
0.020263671875,
0.0034999847412109375,
0.041015625,
-0.028076171875,
-0.006744384765625,
-0.032501220703125,
0.01479339599609375,
-0.03125,
0.00028967857360839844,
-0.048614501953125,
-0.0025539398193359375,
-0.027801513671875,
0.01065826416015625,
-0.03948974609375,
-0.029541015625,
-0.0236663818359375,
-0.00001806020736694336,
0.04449462890625,
0.0292816162109375,
-0.034088134765625,
0.0194854736328125,
-0.049774169921875,
0.039703369140625,
0.00811767578125,
0.0285186767578125,
-0.023681640625,
-0.0675048828125,
-0.0023021697998046875,
0.0021457672119140625,
-0.016204833984375,
-0.06646728515625,
0.0206451416015625,
0.0123291015625,
0.035491943359375,
0.029876708984375,
-0.0041656494140625,
0.037689208984375,
-0.03240966796875,
0.06146240234375,
0.0273895263671875,
-0.07293701171875,
0.06048583984375,
-0.03839111328125,
0.0230560302734375,
0.045562744140625,
0.01506805419921875,
-0.0364990234375,
-0.031585693359375,
-0.060516357421875,
-0.06719970703125,
0.05780029296875,
0.033782958984375,
0.01027679443359375,
0.01062774658203125,
0.004543304443359375,
0.0039825439453125,
0.0219268798828125,
-0.064697265625,
-0.04461669921875,
-0.044677734375,
-0.03564453125,
-0.021514892578125,
0.0108489990234375,
-0.017852783203125,
-0.03729248046875,
0.06915283203125,
0.0127105712890625,
0.03961181640625,
0.024749755859375,
0.01534271240234375,
-0.028594970703125,
0.0093231201171875,
0.0265960693359375,
0.00017142295837402344,
-0.0245819091796875,
-0.01096343994140625,
-0.0130157470703125,
-0.03704833984375,
0.0266571044921875,
0.01934814453125,
-0.03045654296875,
0.018646240234375,
0.019012451171875,
0.061553955078125,
0.01690673828125,
-0.041961669921875,
0.04559326171875,
-0.01165771484375,
-0.0261688232421875,
-0.04302978515625,
0.0249481201171875,
0.028411865234375,
0.0139007568359375,
0.023162841796875,
0.01593017578125,
0.0010385513305664062,
-0.0325927734375,
0.0428466796875,
0.0083770751953125,
-0.053009033203125,
-0.01451873779296875,
0.0904541015625,
-0.00775909423828125,
-0.029510498046875,
0.0300445556640625,
-0.0032711029052734375,
-0.01580810546875,
0.06768798828125,
0.0537109375,
0.0860595703125,
-0.01363372802734375,
0.006015777587890625,
0.04534912109375,
0.01338958740234375,
-0.002445220947265625,
0.0297088623046875,
-0.01317596435546875,
-0.0162811279296875,
-0.0162353515625,
-0.0748291015625,
0.005153656005859375,
0.0190277099609375,
-0.042327880859375,
0.016937255859375,
-0.02630615234375,
-0.0309906005859375,
0.00916290283203125,
-0.01447296142578125,
-0.047882080078125,
0.015655517578125,
0.00518798828125,
0.083740234375,
-0.07391357421875,
0.041168212890625,
0.026275634765625,
-0.04864501953125,
-0.0711669921875,
0.0112457275390625,
0.0208587646484375,
-0.0273590087890625,
0.037689208984375,
0.0165252685546875,
0.00982666015625,
0.011749267578125,
-0.05963134765625,
-0.0701904296875,
0.0833740234375,
-0.0006928443908691406,
-0.047149658203125,
0.016510009765625,
-0.0122833251953125,
0.050689697265625,
-0.034637451171875,
-0.0002644062042236328,
0.055511474609375,
0.05352783203125,
0.016693115234375,
-0.0469970703125,
-0.005519866943359375,
-0.042449951171875,
-0.01934814453125,
-0.00768280029296875,
-0.04937744140625,
0.0816650390625,
-0.003749847412109375,
-0.01605224609375,
0.0209808349609375,
0.05218505859375,
0.036224365234375,
0.033050537109375,
0.061187744140625,
0.0374755859375,
0.043182373046875,
-0.03778076171875,
0.08880615234375,
-0.046600341796875,
0.0321044921875,
0.0654296875,
0.024749755859375,
0.031402587890625,
0.0269622802734375,
-0.0273284912109375,
0.03070068359375,
0.0640869140625,
-0.045745849609375,
0.049041748046875,
0.044097900390625,
-0.0123748779296875,
-0.0206451416015625,
-0.007083892822265625,
-0.042999267578125,
0.053924560546875,
0.008331298828125,
-0.055694580078125,
0.01947021484375,
0.0233001708984375,
-0.0029754638671875,
-0.00423431396484375,
-0.005901336669921875,
0.0487060546875,
0.0111541748046875,
-0.052337646484375,
0.030792236328125,
0.0024127960205078125,
0.06927490234375,
-0.036834716796875,
0.003246307373046875,
0.000415802001953125,
-0.0005431175231933594,
-0.0179443359375,
-0.0335693359375,
-0.0022220611572265625,
-0.00605010986328125,
-0.024200439453125,
-0.017822265625,
0.047576904296875,
-0.05792236328125,
-0.041351318359375,
0.03204345703125,
0.01453399658203125,
0.00435638427734375,
-0.00475311279296875,
-0.061737060546875,
0.0057220458984375,
0.0018329620361328125,
-0.0231170654296875,
0.00337982177734375,
0.0141754150390625,
0.032867431640625,
0.03204345703125,
0.0631103515625,
0.0205078125,
0.006031036376953125,
0.01020050048828125,
0.03851318359375,
-0.047698974609375,
-0.0426025390625,
-0.042755126953125,
0.0293426513671875,
0.00738525390625,
-0.00908660888671875,
0.06097412109375,
0.046234130859375,
0.07666015625,
-0.018798828125,
0.07373046875,
-0.0252532958984375,
0.03570556640625,
-0.0276031494140625,
0.0616455078125,
-0.063720703125,
0.01910400390625,
-0.042266845703125,
-0.06036376953125,
0.01081085205078125,
0.044342041015625,
-0.0057373046875,
0.0257720947265625,
0.01377105712890625,
0.056365966796875,
-0.01311492919921875,
0.01139068603515625,
0.0017547607421875,
0.0186004638671875,
0.01538848876953125,
0.043426513671875,
0.0609130859375,
-0.044097900390625,
0.055633544921875,
-0.06243896484375,
-0.006900787353515625,
-0.0009150505065917969,
-0.040863037109375,
-0.05889892578125,
-0.049713134765625,
-0.022308349609375,
-0.039154052734375,
-0.0303497314453125,
0.08172607421875,
0.0440673828125,
-0.061767578125,
-0.0175933837890625,
0.00542449951171875,
0.01010894775390625,
-0.031646728515625,
-0.022979736328125,
0.0364990234375,
0.0007252693176269531,
-0.08367919921875,
0.046417236328125,
0.00296783447265625,
0.025787353515625,
-0.0165252685546875,
-0.008941650390625,
-0.00505828857421875,
0.006465911865234375,
0.021484375,
0.009796142578125,
-0.047882080078125,
-0.006591796875,
0.0005502700805664062,
-0.01715087890625,
0.01548004150390625,
0.04852294921875,
-0.02862548828125,
0.038330078125,
0.065673828125,
0.0137176513671875,
0.043853759765625,
0.013580322265625,
0.0308685302734375,
-0.052520751953125,
-0.003002166748046875,
0.003093719482421875,
0.0285186767578125,
0.01922607421875,
-0.0128326416015625,
0.031982421875,
0.046875,
-0.038818359375,
-0.043731689453125,
-0.002895355224609375,
-0.0772705078125,
-0.0222625732421875,
0.100341796875,
0.00673675537109375,
-0.028533935546875,
0.011016845703125,
-0.0285186767578125,
0.04266357421875,
-0.02484130859375,
0.048797607421875,
0.0278778076171875,
-0.006900787353515625,
0.00885772705078125,
-0.04937744140625,
0.06719970703125,
0.0102081298828125,
-0.029998779296875,
-0.022674560546875,
0.040771484375,
0.037872314453125,
0.043060302734375,
0.036865234375,
-0.0287933349609375,
0.03729248046875,
0.020965576171875,
0.031402587890625,
-0.019561767578125,
-0.01495361328125,
-0.0269012451171875,
0.0382080078125,
-0.035400390625,
-0.026123046875
]
] |
m3hrdadfi/typo-detector-distilbert-en | 2021-06-16T16:14:20.000Z | [
"transformers",
"pytorch",
"tf",
"distilbert",
"token-classification",
"en",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | m3hrdadfi | null | null | m3hrdadfi/typo-detector-distilbert-en | 4 | 27,289 | transformers | 2022-03-02T23:29:05 | ---
language: en
widget:
- text: "He had also stgruggled with addiction during his time in Congress ."
- text: "The review thoroughla assessed all aspects of JLENS SuR and CPG esign maturit and confidence ."
- text: "Letterma also apologized two his staff for the satyation ."
- text: "Vincent Jay had earlier won France 's first gold in gthe 10km biathlon sprint ."
- text: "It is left to the directors to figure out hpw to bring the stry across to tye audience ."
---
# Typo Detector
## Dataset Information
For this specific task, I used [NeuSpell](https://github.com/neuspell/neuspell) corpus as my raw data.
## Evaluation
The following tables summarize the scores obtained by the model, overall and per class. Since TYPO is the only class, the micro, macro, and weighted averages coincide with the per-class scores.
| # | precision | recall | f1-score | support |
|:------------:|:---------:|:--------:|:--------:|:--------:|
| TYPO | 0.992332 | 0.985997 | 0.989154 | 416054.0 |
| micro avg | 0.992332 | 0.985997 | 0.989154 | 416054.0 |
| macro avg | 0.992332 | 0.985997 | 0.989154 | 416054.0 |
| weighted avg | 0.992332 | 0.985997 | 0.989154 | 416054.0 |
## How to use
You can use this model with the Transformers pipeline for NER (token classification).
### Installing requirements
```bash
pip install transformers
```
### Prediction using pipeline
```python
import torch
from transformers import AutoConfig, AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline
model_name_or_path = "m3hrdadfi/typo-detector-distilbert-en"
config = AutoConfig.from_pretrained(model_name_or_path)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
model = AutoModelForTokenClassification.from_pretrained(model_name_or_path, config=config)
nlp = pipeline('token-classification', model=model, tokenizer=tokenizer, aggregation_strategy="average")  # "average" merges subword pieces into whole-word spans
```
```python
sentences = [
"He had also stgruggled with addiction during his time in Congress .",
"The review thoroughla assessed all aspects of JLENS SuR and CPG esign maturit and confidence .",
"Letterma also apologized two his staff for the satyation .",
"Vincent Jay had earlier won France 's first gold in gthe 10km biathlon sprint .",
"It is left to the directors to figure out hpw to bring the stry across to tye audience .",
]
for sentence in sentences:
    # Collect the character spans the model flags as typos.
    typos = [sentence[r["start"]: r["end"]] for r in nlp(sentence)]

    # Naive highlighting: wraps every occurrence of each flagged span.
    detected = sentence
    for typo in typos:
        detected = detected.replace(typo, f'<i>{typo}</i>')
print(" [Input]: ", sentence)
print("[Detected]: ", detected)
print("-" * 130)
```
Output:
```text
[Input]: He had also stgruggled with addiction during his time in Congress .
[Detected]: He had also <i>stgruggled</i> with addiction during his time in Congress .
----------------------------------------------------------------------------------------------------------------------------------
[Input]: The review thoroughla assessed all aspects of JLENS SuR and CPG esign maturit and confidence .
[Detected]: The review <i>thoroughla</i> assessed all aspects of JLENS SuR and CPG <i>esign</i> <i>maturit</i> and confidence .
----------------------------------------------------------------------------------------------------------------------------------
[Input]: Letterma also apologized two his staff for the satyation .
[Detected]: <i>Letterma</i> also apologized <i>two</i> his staff for the <i>satyation</i> .
----------------------------------------------------------------------------------------------------------------------------------
[Input]: Vincent Jay had earlier won France 's first gold in gthe 10km biathlon sprint .
[Detected]: Vincent Jay had earlier won France 's first gold in <i>gthe</i> 10km biathlon sprint .
----------------------------------------------------------------------------------------------------------------------------------
[Input]: It is left to the directors to figure out hpw to bring the stry across to tye audience .
[Detected]: It is left to the directors to figure out <i>hpw</i> to bring the <i>stry</i> across to <i>tye</i> audience .
----------------------------------------------------------------------------------------------------------------------------------
```
## Questions?
Post a Github issue on the [TypoDetector Issues](https://github.com/m3hrdadfi/typo-detector/issues) repo. | 4,341 | [
[
-0.0205535888671875,
-0.025054931640625,
0.03497314453125,
0.0193939208984375,
-0.0172271728515625,
-0.0140380859375,
-0.006877899169921875,
-0.0224609375,
0.03875732421875,
0.048004150390625,
-0.02337646484375,
-0.05224609375,
-0.06689453125,
0.027679443359375,
-0.02911376953125,
0.07659912109375,
-0.01508331298828125,
-0.024261474609375,
0.008331298828125,
-0.0266265869140625,
-0.03778076171875,
-0.05047607421875,
-0.04974365234375,
-0.01263427734375,
0.040313720703125,
0.031005859375,
0.0264892578125,
0.035186767578125,
0.040069580078125,
0.02880859375,
-0.01568603515625,
0.0355224609375,
-0.02294921875,
0.0012826919555664062,
-0.00949859619140625,
-0.0286712646484375,
-0.03240966796875,
0.0007433891296386719,
0.04693603515625,
0.03900146484375,
0.00724029541015625,
0.025177001953125,
0.0214996337890625,
0.07037353515625,
-0.04290771484375,
0.0005922317504882812,
-0.0184783935546875,
-0.00937652587890625,
-0.0299530029296875,
-0.033843994140625,
-0.030059814453125,
-0.023162841796875,
-0.007663726806640625,
-0.053619384765625,
0.00302886962890625,
0.0095977783203125,
0.07891845703125,
0.016387939453125,
-0.013580322265625,
-0.023590087890625,
-0.057403564453125,
0.05859375,
-0.057037353515625,
0.005435943603515625,
0.031951904296875,
0.00754547119140625,
0.0025959014892578125,
-0.0701904296875,
-0.044464111328125,
-0.0017957687377929688,
-0.0160980224609375,
0.0242919921875,
-0.01055145263671875,
-0.002166748046875,
0.03778076171875,
0.0478515625,
-0.0679931640625,
-0.01873779296875,
-0.0325927734375,
-0.022430419921875,
0.0570068359375,
0.0203399658203125,
0.013153076171875,
-0.0224609375,
-0.01068115234375,
-0.01515960693359375,
-0.0125732421875,
0.005954742431640625,
0.036041259765625,
0.03021240234375,
-0.0266265869140625,
0.05743408203125,
-0.025482177734375,
0.048736572265625,
0.01062774658203125,
-0.01190185546875,
0.031219482421875,
-0.0382080078125,
-0.032501220703125,
-0.0027923583984375,
0.0655517578125,
0.035003662109375,
0.0494384765625,
-0.0045318603515625,
-0.009063720703125,
0.01019287109375,
0.00875091552734375,
-0.036865234375,
-0.0333251953125,
0.00702667236328125,
-0.042083740234375,
-0.03021240234375,
0.01181793212890625,
-0.060699462890625,
-0.01422882080078125,
0.00029468536376953125,
0.047607421875,
-0.0379638671875,
-0.006374359130859375,
0.0018405914306640625,
-0.035919189453125,
0.006328582763671875,
0.012481689453125,
-0.06585693359375,
0.023162841796875,
0.023895263671875,
0.052642822265625,
0.004611968994140625,
-0.0278778076171875,
-0.03173828125,
-0.0008597373962402344,
-0.01473236083984375,
0.0513916015625,
-0.01200103759765625,
-0.0164642333984375,
0.0064849853515625,
0.0078582763671875,
-0.032196044921875,
-0.028564453125,
0.03985595703125,
-0.0159149169921875,
0.04742431640625,
-0.023529052734375,
-0.050201416015625,
-0.0190582275390625,
0.028717041015625,
-0.039581298828125,
0.072265625,
0.031158447265625,
-0.07666015625,
0.0220184326171875,
-0.026092529296875,
-0.0313720703125,
-0.0008249282836914062,
-0.0022125244140625,
-0.028533935546875,
0.005626678466796875,
0.020965576171875,
-0.004825592041015625,
0.0030002593994140625,
-0.0201263427734375,
0.002559661865234375,
-0.010009765625,
0.01459503173828125,
-0.021728515625,
0.08770751953125,
-0.0007691383361816406,
-0.0296630859375,
-0.01041412353515625,
-0.062286376953125,
0.01097869873046875,
0.0177154541015625,
-0.028961181640625,
-0.0139007568359375,
-0.024139404296875,
0.00795745849609375,
0.01181793212890625,
0.0130615234375,
-0.020416259765625,
0.003917694091796875,
-0.05810546875,
0.0093841552734375,
0.04931640625,
0.02557373046875,
0.03204345703125,
-0.0390625,
0.041534423828125,
-0.0242919921875,
0.01032257080078125,
-0.01555633544921875,
-0.028167724609375,
-0.0640869140625,
-0.004390716552734375,
0.036468505859375,
0.06298828125,
-0.046142578125,
0.0361328125,
-0.01000213623046875,
-0.030731201171875,
-0.025970458984375,
0.01021575927734375,
0.044708251953125,
0.052154541015625,
0.043914794921875,
-0.0105438232421875,
-0.04742431640625,
-0.06524658203125,
-0.048248291015625,
-0.0304718017578125,
0.00843048095703125,
0.01540374755859375,
0.048614501953125,
0.018890380859375,
0.0462646484375,
-0.04608154296875,
-0.02874755859375,
-0.0227203369140625,
0.007289886474609375,
0.021728515625,
0.041351318359375,
0.05511474609375,
-0.030914306640625,
-0.04864501953125,
-0.005580902099609375,
-0.0587158203125,
-0.01134490966796875,
0.01201629638671875,
-0.005733489990234375,
0.03021240234375,
0.0269927978515625,
-0.03643798828125,
0.06201171875,
0.018646240234375,
-0.0458984375,
0.07635498046875,
-0.0269927978515625,
0.0196380615234375,
-0.0909423828125,
0.025146484375,
-0.0098419189453125,
-0.0036487579345703125,
-0.04931640625,
-0.01739501953125,
-0.0005669593811035156,
0.0144195556640625,
-0.0249176025390625,
0.053558349609375,
-0.01519775390625,
0.00798797607421875,
-0.0018329620361328125,
0.0149383544921875,
0.032989501953125,
0.0288543701171875,
-0.00801849365234375,
0.0369873046875,
0.01605224609375,
-0.0567626953125,
0.0262298583984375,
0.045501708984375,
-0.03448486328125,
0.034820556640625,
-0.046295166015625,
-0.0142059326171875,
0.0016651153564453125,
-0.00341796875,
-0.096923828125,
-0.013092041015625,
0.042083740234375,
-0.05950927734375,
0.01654052734375,
-0.0163116455078125,
-0.032012939453125,
-0.051361083984375,
-0.0159149169921875,
0.01837158203125,
0.018280029296875,
-0.01184844970703125,
0.01074981689453125,
0.0201568603515625,
-0.019287109375,
-0.0377197265625,
-0.07305908203125,
0.007305145263671875,
-0.0219268798828125,
-0.055084228515625,
0.035980224609375,
-0.01116943359375,
-0.025543212890625,
-0.016815185546875,
-0.018463134765625,
-0.0147552490234375,
-0.0120697021484375,
0.0009279251098632812,
0.033172607421875,
-0.004085540771484375,
-0.00360107421875,
-0.0117950439453125,
-0.01715087890625,
-0.006313323974609375,
-0.0082244873046875,
0.0516357421875,
-0.00965118408203125,
-0.0154876708984375,
-0.04681396484375,
0.0206451416015625,
0.052459716796875,
-0.0153656005859375,
0.03375244140625,
0.03778076171875,
-0.0236358642578125,
0.00760650634765625,
-0.04486083984375,
0.0014314651489257812,
-0.035247802734375,
0.01238250732421875,
-0.0035839080810546875,
-0.043365478515625,
0.051513671875,
0.0277252197265625,
0.00695037841796875,
0.0693359375,
0.029632568359375,
-0.029541015625,
0.061431884765625,
0.039764404296875,
0.00383758544921875,
0.035980224609375,
-0.04046630859375,
0.02630615234375,
-0.060699462890625,
-0.0284576416015625,
-0.05377197265625,
-0.037384033203125,
-0.0584716796875,
-0.004253387451171875,
0.017547607421875,
-0.0023975372314453125,
-0.023162841796875,
0.036956787109375,
-0.0623779296875,
0.01386260986328125,
0.031982421875,
-0.0121612548828125,
0.00585174560546875,
0.00044465065002441406,
-0.022064208984375,
-0.00977325439453125,
-0.0555419921875,
-0.044219970703125,
0.06890869140625,
0.00958251953125,
0.04510498046875,
0.002986907958984375,
0.05706787109375,
0.01837158203125,
0.0009851455688476562,
-0.041748046875,
0.035980224609375,
-0.037078857421875,
-0.05364990234375,
-0.0203857421875,
-0.01267242431640625,
-0.0675048828125,
0.01300048828125,
-0.0174407958984375,
-0.07550048828125,
0.038726806640625,
0.0160064697265625,
-0.057220458984375,
0.038177490234375,
-0.06085205078125,
0.08123779296875,
-0.0159759521484375,
-0.0201263427734375,
0.0230560302734375,
-0.04986572265625,
0.023345947265625,
0.031494140625,
0.0220947265625,
-0.001667022705078125,
0.0194244384765625,
0.0889892578125,
-0.0528564453125,
0.03717041015625,
-0.01023101806640625,
0.020477294921875,
0.03533935546875,
0.00970458984375,
0.036834716796875,
0.0020580291748046875,
-0.004180908203125,
-0.0159912109375,
0.0153045654296875,
-0.01461029052734375,
-0.009185791015625,
0.03094482421875,
-0.0479736328125,
-0.0254974365234375,
-0.055511474609375,
-0.031768798828125,
-0.0027008056640625,
0.0226593017578125,
0.06640625,
0.061614990234375,
-0.0010938644409179688,
0.005161285400390625,
0.03668212890625,
-0.02911376953125,
0.048370361328125,
0.0200958251953125,
-0.0169219970703125,
-0.04864501953125,
0.05340576171875,
0.022979736328125,
-0.0029621124267578125,
0.02996826171875,
0.00337982177734375,
-0.030731201171875,
-0.037200927734375,
-0.016387939453125,
0.0187835693359375,
-0.038299560546875,
-0.02099609375,
-0.05712890625,
-0.0355224609375,
-0.059112548828125,
-0.01194000244140625,
-0.0205535888671875,
-0.0494384765625,
-0.040557861328125,
-0.00957489013671875,
0.018218994140625,
0.04046630859375,
-0.00655364990234375,
0.040130615234375,
-0.053009033203125,
0.0241241455078125,
0.0243988037109375,
0.0199127197265625,
-0.0205230712890625,
-0.06915283203125,
-0.0210113525390625,
-0.012359619140625,
-0.029327392578125,
-0.0670166015625,
0.046539306640625,
-0.004695892333984375,
0.050323486328125,
0.03790283203125,
0.00838470458984375,
0.06207275390625,
-0.01702880859375,
0.07757568359375,
0.0198516845703125,
-0.0650634765625,
0.0596923828125,
-0.0037670135498046875,
0.018707275390625,
0.06207275390625,
0.0025844573974609375,
-0.05914306640625,
-0.033203125,
-0.059600830078125,
-0.0738525390625,
0.064208984375,
0.036224365234375,
-0.013916015625,
-0.0056915283203125,
0.0152130126953125,
-0.00005036592483520508,
0.01285552978515625,
-0.04864501953125,
-0.0462646484375,
-0.0192108154296875,
-0.0168609619140625,
-0.0113067626953125,
-0.030609130859375,
-0.0204925537109375,
-0.0267791748046875,
0.054107666015625,
0.0187530517578125,
0.037872314453125,
0.0455322265625,
-0.034149169921875,
0.007904052734375,
0.023284912109375,
0.06640625,
0.07244873046875,
-0.02557373046875,
0.016754150390625,
0.015472412109375,
-0.04266357421875,
-0.0012426376342773438,
0.01654052734375,
-0.0115814208984375,
0.0191650390625,
0.061614990234375,
0.06500244140625,
-0.002513885498046875,
-0.037628173828125,
0.033447265625,
0.00014591217041015625,
-0.0418701171875,
-0.035491943359375,
0.0007739067077636719,
-0.020721435546875,
0.0205535888671875,
0.028900146484375,
0.007717132568359375,
-0.004459381103515625,
-0.0325927734375,
0.0199127197265625,
0.018218994140625,
-0.03009033203125,
-0.00904083251953125,
0.049896240234375,
-0.0009918212890625,
-0.03753662109375,
0.0286712646484375,
-0.0296478271484375,
-0.052978515625,
0.059844970703125,
0.03277587890625,
0.068115234375,
-0.032958984375,
0.02386474609375,
0.0604248046875,
0.05938720703125,
0.0008840560913085938,
0.0380859375,
0.02410888671875,
-0.04425048828125,
-0.0171966552734375,
-0.04974365234375,
-0.0003077983856201172,
0.0210418701171875,
-0.038543701171875,
-0.003978729248046875,
-0.040557861328125,
-0.018310546875,
0.00795745849609375,
0.0257720947265625,
-0.05242919921875,
0.020477294921875,
0.017242431640625,
0.0897216796875,
-0.06878662109375,
0.044464111328125,
0.04217529296875,
-0.039306640625,
-0.06524658203125,
-0.0077056884765625,
-0.0146331787109375,
-0.0560302734375,
0.034149169921875,
0.004913330078125,
0.0218963623046875,
-0.01412200927734375,
-0.0175018310546875,
-0.063232421875,
0.0882568359375,
0.00714874267578125,
-0.03582763671875,
-0.0044403076171875,
0.01316070556640625,
0.036346435546875,
-0.0181427001953125,
0.0217132568359375,
0.04229736328125,
0.0655517578125,
-0.012847900390625,
-0.06768798828125,
-0.0017786026000976562,
-0.0159912109375,
-0.012786865234375,
-0.0032329559326171875,
-0.047271728515625,
0.07159423828125,
-0.0014677047729492188,
-0.02203369140625,
0.01091766357421875,
0.03582763671875,
0.019287109375,
0.04962158203125,
0.037628173828125,
0.065185546875,
0.06683349609375,
-0.014892578125,
0.08648681640625,
-0.0173492431640625,
0.030059814453125,
0.07763671875,
0.0048675537109375,
0.0258941650390625,
0.032501220703125,
-0.02764892578125,
0.042327880859375,
0.043914794921875,
-0.00655364990234375,
0.0477294921875,
-0.00025653839111328125,
-0.01678466796875,
-0.004302978515625,
0.00890350341796875,
-0.042327880859375,
0.03240966796875,
0.013916015625,
-0.04608154296875,
-0.019683837890625,
-0.004604339599609375,
-0.0036983489990234375,
0.00592803955078125,
-0.041351318359375,
0.049835205078125,
0.01024627685546875,
-0.05072021484375,
0.0440673828125,
-0.0014314651489257812,
0.0209808349609375,
-0.04449462890625,
0.0228271484375,
-0.0221710205078125,
0.01169586181640625,
-0.020050048828125,
-0.0648193359375,
0.042083740234375,
0.0089111328125,
-0.014007568359375,
-0.0126800537109375,
0.0701904296875,
-0.0235595703125,
-0.034423828125,
0.01070404052734375,
0.041656494140625,
0.0038089752197265625,
-0.00937652587890625,
-0.0484619140625,
0.00231170654296875,
0.004772186279296875,
-0.0280914306640625,
0.0034465789794921875,
0.030426025390625,
0.0229034423828125,
0.031158447265625,
0.0280914306640625,
0.004852294921875,
0.0180816650390625,
-0.0014524459838867188,
0.044891357421875,
-0.060546875,
-0.049072265625,
-0.0657958984375,
0.046356201171875,
-0.026336669921875,
-0.042327880859375,
0.0640869140625,
0.0601806640625,
0.045379638671875,
-0.0014543533325195312,
0.060546875,
-0.0193023681640625,
0.04400634765625,
-0.028839111328125,
0.048187255859375,
-0.042572021484375,
-0.01519775390625,
-0.02044677734375,
-0.0640869140625,
-0.0249481201171875,
0.0595703125,
-0.036041259765625,
0.00539398193359375,
0.0738525390625,
0.050628662109375,
0.0145721435546875,
-0.01557159423828125,
0.0147705078125,
0.0294342041015625,
0.0022430419921875,
0.045684814453125,
0.03741455078125,
-0.060943603515625,
0.07025146484375,
-0.03143310546875,
-0.0073394775390625,
-0.037933349609375,
-0.046142578125,
-0.0555419921875,
-0.0467529296875,
-0.0301055908203125,
-0.050750732421875,
0.0006346702575683594,
0.0567626953125,
0.0297393798828125,
-0.074951171875,
-0.0100860595703125,
-0.00994110107421875,
0.032196044921875,
-0.04168701171875,
-0.0261993408203125,
0.0343017578125,
-0.016387939453125,
-0.052764892578125,
0.0180206298828125,
0.0280609130859375,
0.0027008056640625,
0.0160369873046875,
-0.00519561767578125,
-0.036956787109375,
0.0175018310546875,
0.04443359375,
0.042388916015625,
-0.038970947265625,
-0.040283203125,
-0.00670623779296875,
-0.0262603759765625,
0.00041747093200683594,
0.035491943359375,
-0.043975830078125,
0.0282745361328125,
0.0616455078125,
0.0131683349609375,
0.047271728515625,
-0.005062103271484375,
0.0069732666015625,
-0.0601806640625,
-0.00095367431640625,
0.0264892578125,
0.037109375,
0.00897979736328125,
-0.0123748779296875,
0.04010009765625,
0.044677734375,
-0.05303955078125,
-0.05657958984375,
0.001468658447265625,
-0.10528564453125,
-0.00809478759765625,
0.07232666015625,
0.0015277862548828125,
-0.0167694091796875,
-0.01300048828125,
-0.01413726806640625,
0.04638671875,
-0.0083770751953125,
0.0611572265625,
0.07379150390625,
-0.0030422210693359375,
-0.006275177001953125,
-0.05108642578125,
0.03594970703125,
0.029632568359375,
-0.055023193359375,
0.00688934326171875,
0.0167694091796875,
0.033843994140625,
0.01641845703125,
0.0745849609375,
-0.015472412109375,
0.012908935546875,
0.005207061767578125,
0.01245880126953125,
-0.007106781005859375,
-0.0154571533203125,
-0.017181396484375,
0.01947021484375,
-0.0233612060546875,
-0.04486083984375
]
] |
lllyasviel/control_v11f1p_sd15_depth | 2023-05-04T18:49:15.000Z | [
"diffusers",
"art",
"controlnet",
"stable-diffusion",
"controlnet-v1-1",
"image-to-image",
"arxiv:2302.05543",
"license:openrail",
"has_space",
"diffusers:ControlNetModel",
"region:us"
] | image-to-image | lllyasviel | null | null | lllyasviel/control_v11f1p_sd15_depth | 19 | 27,183 | diffusers | 2023-04-16T14:13:02 | ---
license: openrail
base_model: runwayml/stable-diffusion-v1-5
tags:
- art
- controlnet
- stable-diffusion
- controlnet-v1-1
- image-to-image
duplicated_from: ControlNet-1-1-preview/control_v11p_sd15_depth
---
# Controlnet - v1.1 - *Depth Version*
**Controlnet v1.1** is the successor model of [Controlnet v1.0](https://huggingface.co/lllyasviel/ControlNet)
and was released in [lllyasviel/ControlNet-v1-1](https://huggingface.co/lllyasviel/ControlNet-v1-1) by [Lvmin Zhang](https://huggingface.co/lllyasviel).
This checkpoint is a conversion of [the original checkpoint](https://huggingface.co/lllyasviel/ControlNet-v1-1/blob/main/control_v11f1p_sd15_depth.pth) into `diffusers` format.
It can be used in combination with **Stable Diffusion**, such as [runwayml/stable-diffusion-v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5).
For more details, please also have a look at the [🧨 Diffusers docs](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/controlnet).
ControlNet is a neural network structure to control diffusion models by adding extra conditions.

This checkpoint corresponds to the ControlNet conditioned on **depth images**.
## Model Details
- **Developed by:** Lvmin Zhang, Maneesh Agrawala
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [The CreativeML OpenRAIL M license](https://huggingface.co/spaces/CompVis/stable-diffusion-license) is an [Open RAIL M license](https://www.licenses.ai/blog/2022/8/18/naming-convention-of-responsible-ai-licenses), adapted from the work that [BigScience](https://bigscience.huggingface.co/) and [the RAIL Initiative](https://www.licenses.ai/) are jointly carrying out in the area of responsible AI licensing. See also [the article about the BLOOM Open RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) on which our license is based.
- **Resources for more information:** [GitHub Repository](https://github.com/lllyasviel/ControlNet), [Paper](https://arxiv.org/abs/2302.05543).
- **Cite as:**
```
@misc{zhang2023adding,
      title={Adding Conditional Control to Text-to-Image Diffusion Models},
      author={Lvmin Zhang and Maneesh Agrawala},
      year={2023},
      eprint={2302.05543},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```
## Introduction
Controlnet was proposed in [*Adding Conditional Control to Text-to-Image Diffusion Models*](https://arxiv.org/abs/2302.05543) by
Lvmin Zhang, Maneesh Agrawala.
The abstract reads as follows:
*We present a neural network structure, ControlNet, to control pretrained large diffusion models to support additional input conditions.
The ControlNet learns task-specific conditions in an end-to-end way, and the learning is robust even when the training dataset is small (< 50k).
Moreover, training a ControlNet is as fast as fine-tuning a diffusion model, and the model can be trained on personal devices.
Alternatively, if powerful computation clusters are available, the model can scale to large amounts (millions to billions) of data.
We report that large diffusion models like Stable Diffusion can be augmented with ControlNets to enable conditional inputs like edge maps, segmentation maps, keypoints, etc.
This may enrich the methods to control large diffusion models and further facilitate related applications.*
## Example
It is recommended to use the checkpoint with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5) as the checkpoint
has been trained on it.
Experimentally, the checkpoint can also be used with other diffusion models, such as DreamBooth-finetuned Stable Diffusion.
**Note**: If you want to process an image to create the auxiliary conditioning, external dependencies are required as shown below:
1. Let's install `diffusers` and related packages:
```
$ pip install diffusers transformers accelerate
```
2. Run the code:
```python
import torch
import os
import numpy as np
from PIL import Image
from transformers import pipeline
from diffusers.utils import load_image
from diffusers import (
ControlNetModel,
StableDiffusionControlNetPipeline,
UniPCMultistepScheduler,
)
checkpoint = "lllyasviel/control_v11f1p_sd15_depth"
image = load_image(
"https://huggingface.co/lllyasviel/control_v11p_sd15_depth/resolve/main/images/input.png"
)
prompt = "Stormtrooper's lecture in beautiful lecture hall"
# Create the depth conditioning image with a depth-estimation pipeline
depth_estimator = pipeline('depth-estimation')
image = depth_estimator(image)['depth']
image = np.array(image)
image = image[:, :, None]
image = np.concatenate([image, image, image], axis=2)
control_image = Image.fromarray(image)
os.makedirs("./images", exist_ok=True)
control_image.save("./images/control.png")
controlnet = ControlNetModel.from_pretrained(checkpoint, torch_dtype=torch.float16)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
"runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)
pipe.enable_model_cpu_offload()
generator = torch.manual_seed(0)
image = pipe(prompt, num_inference_steps=30, generator=generator, image=control_image).images[0]
image.save('images/image_out.png')
```



## Other released checkpoints v1-1
The authors released 14 different checkpoints, each trained with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5)
on a different type of conditioning:
| Model Name | Control Image Overview| Condition Image | Control Image Example | Generated Image Example |
|---|---|---|---|---|
|[lllyasviel/control_v11p_sd15_canny](https://huggingface.co/lllyasviel/control_v11p_sd15_canny)<br/> | *Trained with canny edge detection* | A monochrome image with white edges on a black background.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11e_sd15_ip2p](https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p)<br/> | *Trained with pixel-to-pixel instruction* | No condition.|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_inpaint](https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint)<br/> | Trained with image inpainting | No condition.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/output.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/output.png"/></a>|
|[lllyasviel/control_v11p_sd15_mlsd](https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd)<br/> | Trained with multi-level line segment detection | An image with annotated line segments.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11f1p_sd15_depth](https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth)<br/> | Trained with depth estimation | An image with depth information, usually represented as a grayscale image.|<a href="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_normalbae](https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae)<br/> | Trained with surface normal estimation | An image with surface normal information, usually represented as a color-coded image.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_seg](https://huggingface.co/lllyasviel/control_v11p_sd15_seg)<br/> | Trained with image segmentation | An image with segmented regions, usually represented as a color-coded image.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_lineart](https://huggingface.co/lllyasviel/control_v11p_sd15_lineart)<br/> | Trained with line art generation | An image with line art, usually black lines on a white background.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15s2_lineart_anime](https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime)<br/> | Trained with anime line art generation | An image with anime-style line art.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_openpose](https://huggingface.co/lllyasviel/control_v11p_sd15_openpose)<br/> | Trained with human pose estimation | An image with human poses, usually represented as a set of keypoints or skeletons.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_scribble](https://huggingface.co/lllyasviel/control_v11p_sd15_scribble)<br/> | Trained with scribble-based image generation | An image with scribbles, usually random or user-drawn strokes.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_softedge](https://huggingface.co/lllyasviel/control_v11p_sd15_softedge)<br/> | Trained with soft edge image generation | An image with soft edges, usually to create a more painterly or artistic effect.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11e_sd15_shuffle](https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle)<br/> | Trained with image shuffling | An image with shuffled patches or regions.|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11f1e_sd15_tile](https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile)<br/> | Trained with image tiling | A blurry image or part of an image.|<a href="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/original.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/original.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/output.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/output.png"/></a>|
## Improvements in Depth 1.1:
- The training dataset of the previous cnet 1.0 had several problems, including (1) a small group of grayscale human images was duplicated thousands of times (!!), making the previous model somewhat likely to generate grayscale human images; (2) some images were low quality, very blurry, or had significant JPEG artifacts; (3) a small group of images had wrongly paired prompts caused by a mistake in our data processing scripts. The new model fixes all of these dataset problems and should behave more reasonably in many cases.
- The new depth model is a relatively unbiased model. It is not trained on one specific type of depth map produced by one specific depth estimation method, and it is not over-fitted to a single preprocessor. This means the model will work better with different depth estimators, different preprocessor resolutions, and even real depth maps created by 3D engines.
- Some reasonable data augmentations, such as random left-right flipping, are applied during training.
- The model is resumed from depth 1.0, so it should work well in all cases where depth 1.0 works well. If not, please open an issue with an image, and we will take a look at your case. Depth 1.1 also works well in many of depth 1.0's failure cases.
- If you use Midas depth (the "depth" option in the webui plugin) at the 384 preprocessor resolution, the difference between depth 1.0 and 1.1 should be minimal. However, if you try other preprocessor resolutions or other preprocessors (like leres and zoe), depth 1.1 is expected to be a bit better than 1.0; swapping in a different estimator is sketched below.
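Because depth 1.1 is not over-fitted to one preprocessor, the conditioning image in the example above can come from any depth estimator. A minimal sketch of swapping one in, assuming the `Intel/dpt-large` depth-estimation checkpoint (an illustrative choice, not one prescribed by this card):
```python
import numpy as np
from PIL import Image
from transformers import pipeline
from diffusers.utils import load_image

image = load_image(
    "https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/input.png"
)

# Any depth-estimation pipeline can produce the conditioning image;
# "Intel/dpt-large" here is just one possible estimator.
depth_estimator = pipeline("depth-estimation", model="Intel/dpt-large")
depth = np.array(depth_estimator(image)["depth"])[:, :, None]

# Replicate the single-channel depth map into three channels, exactly as
# in the main example above.
control_image = Image.fromarray(np.concatenate([depth] * 3, axis=2))
```
The resulting `control_image` drops into the `pipe(...)` call from the example above unchanged.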
## More information
For more information, please also have a look at the [Diffusers ControlNet Blog Post](https://huggingface.co/blog/controlnet) and the [official docs](https://github.com/lllyasviel/ControlNet-v1-1-nightly). | 16,995 | [
[
-0.045379638671875,
-0.044830322265625,
0.01189422607421875,
0.041748046875,
-0.0189971923828125,
-0.0177001953125,
0.007568359375,
-0.039581298828125,
0.039886474609375,
0.02264404296875,
-0.05609130859375,
-0.02996826171875,
-0.056121826171875,
-0.01389312744140625,
-0.013671875,
0.06085205078125,
-0.0219268798828125,
-0.0007729530334472656,
0.00461578369140625,
-0.004302978515625,
-0.0071868896484375,
-0.0107269287109375,
-0.092529296875,
-0.035858154296875,
0.03338623046875,
0.005374908447265625,
0.041259765625,
0.043304443359375,
0.03851318359375,
0.029571533203125,
-0.028778076171875,
0.00464630126953125,
-0.022491455078125,
-0.0149688720703125,
0.01351165771484375,
-0.008758544921875,
-0.053375244140625,
0.002788543701171875,
0.05694580078125,
0.0257415771484375,
0.0015192031860351562,
-0.0166473388671875,
0.01043701171875,
0.053192138671875,
-0.037384033203125,
-0.00899505615234375,
-0.00940704345703125,
0.020751953125,
-0.0059814453125,
0.00843048095703125,
-0.0120391845703125,
-0.0221710205078125,
0.01122283935546875,
-0.055877685546875,
-0.005214691162109375,
-0.01279449462890625,
0.10247802734375,
0.0240478515625,
-0.0312347412109375,
-0.00531768798828125,
-0.0188446044921875,
0.049957275390625,
-0.06243896484375,
0.009307861328125,
0.00823211669921875,
0.0156402587890625,
-0.0131072998046875,
-0.07464599609375,
-0.038299560546875,
-0.0094451904296875,
-0.01001739501953125,
0.03558349609375,
-0.02679443359375,
0.006244659423828125,
0.019927978515625,
0.0219879150390625,
-0.030914306640625,
0.0210418701171875,
-0.025726318359375,
-0.0289764404296875,
0.049102783203125,
0.0017251968383789062,
0.043548583984375,
0.0016336441040039062,
-0.04541015625,
-0.00655364990234375,
-0.0272064208984375,
0.027740478515625,
0.0161590576171875,
-0.01120758056640625,
-0.0606689453125,
0.032806396484375,
-0.00589752197265625,
0.055145263671875,
0.031585693359375,
-0.017364501953125,
0.037994384765625,
-0.0133209228515625,
-0.03009033203125,
-0.0216827392578125,
0.07666015625,
0.03717041015625,
0.0115814208984375,
0.00011903047561645508,
-0.0121917724609375,
-0.009857177734375,
-0.0029621124267578125,
-0.091796875,
-0.0114898681640625,
0.0189666748046875,
-0.040069580078125,
-0.0276641845703125,
-0.009307861328125,
-0.055206298828125,
-0.015960693359375,
-0.005279541015625,
0.0281982421875,
-0.04559326171875,
-0.03692626953125,
0.0126953125,
-0.034942626953125,
0.041778564453125,
0.043853759765625,
-0.0338134765625,
0.0165252685546875,
0.01288604736328125,
0.079345703125,
-0.0203857421875,
-0.01294708251953125,
-0.0205230712890625,
-0.00617218017578125,
-0.02490234375,
0.03863525390625,
-0.01009368896484375,
-0.01090240478515625,
-0.005157470703125,
0.0257415771484375,
-0.006717681884765625,
-0.0259857177734375,
0.0301666259765625,
-0.0264434814453125,
0.0152130126953125,
-0.0010986328125,
-0.03192138671875,
-0.0126953125,
0.0200653076171875,
-0.036407470703125,
0.056060791015625,
0.020477294921875,
-0.079833984375,
0.0243682861328125,
-0.03741455078125,
-0.0194244384765625,
-0.020477294921875,
0.013153076171875,
-0.057403564453125,
-0.031982421875,
0.004425048828125,
0.044158935546875,
-0.00043964385986328125,
-0.0094451904296875,
-0.034912109375,
-0.0028934478759765625,
0.00807952880859375,
-0.00722503662109375,
0.092529296875,
0.00942230224609375,
-0.044677734375,
0.015869140625,
-0.053466796875,
0.0018911361694335938,
0.01027679443359375,
-0.0188140869140625,
0.005077362060546875,
-0.02294921875,
0.005889892578125,
0.0472412109375,
0.026641845703125,
-0.050689697265625,
0.00969696044921875,
-0.0241546630859375,
0.03436279296875,
0.048553466796875,
0.0181121826171875,
0.04901123046875,
-0.036376953125,
0.044830322265625,
0.02398681640625,
0.02276611328125,
-0.00011551380157470703,
-0.03863525390625,
-0.07769775390625,
-0.043304443359375,
0.0011739730834960938,
0.045440673828125,
-0.061065673828125,
0.0594482421875,
0.006343841552734375,
-0.05145263671875,
-0.019989013671875,
0.0035457611083984375,
0.038909912109375,
0.04156494140625,
0.02484130859375,
-0.03851318359375,
-0.0296630859375,
-0.07025146484375,
0.01192474365234375,
0.0149383544921875,
-0.00128936767578125,
0.0128326416015625,
0.050018310546875,
-0.008331298828125,
0.04937744140625,
-0.02001953125,
-0.0274810791015625,
-0.00555419921875,
-0.00803375244140625,
0.022216796875,
0.0784912109375,
0.057159423828125,
-0.05682373046875,
-0.046661376953125,
-0.0017576217651367188,
-0.06829833984375,
-0.0009136199951171875,
-0.0162353515625,
-0.039459228515625,
0.0172576904296875,
0.043975830078125,
-0.050445556640625,
0.059326171875,
0.040985107421875,
-0.043304443359375,
0.043304443359375,
-0.02398681640625,
0.010589599609375,
-0.07415771484375,
0.016082763671875,
0.0276336669921875,
-0.0248565673828125,
-0.046112060546875,
0.00855255126953125,
0.00910186767578125,
0.004180908203125,
-0.053955078125,
0.056854248046875,
-0.039031982421875,
0.01030731201171875,
-0.0216064453125,
-0.005458831787109375,
0.008331298828125,
0.049835205078125,
0.0168304443359375,
0.033935546875,
0.077392578125,
-0.048248291015625,
0.02813720703125,
0.032684326171875,
-0.017578125,
0.06396484375,
-0.06585693359375,
0.00927734375,
-0.01253509521484375,
0.042022705078125,
-0.07281494140625,
-0.02008056640625,
0.048248291015625,
-0.039459228515625,
0.042449951171875,
-0.021392822265625,
-0.0178375244140625,
-0.032867431640625,
-0.02508544921875,
0.0135955810546875,
0.058074951171875,
-0.0380859375,
0.02874755859375,
0.014373779296875,
0.010772705078125,
-0.0400390625,
-0.071044921875,
-0.00666046142578125,
-0.02850341796875,
-0.06475830078125,
0.03912353515625,
-0.0102386474609375,
0.0020809173583984375,
0.0005655288696289062,
0.0016050338745117188,
-0.0220794677734375,
-0.003307342529296875,
0.027069091796875,
0.01776123046875,
-0.00740814208984375,
-0.01409912109375,
0.01134490966796875,
-0.0113525390625,
-0.003421783447265625,
-0.0261993408203125,
0.034912109375,
0.0034656524658203125,
-0.0162506103515625,
-0.07305908203125,
0.018280029296875,
0.04278564453125,
-0.0029926300048828125,
0.0660400390625,
0.06927490234375,
-0.033050537109375,
-0.0011854171752929688,
-0.02886962890625,
-0.014068603515625,
-0.03887939453125,
-0.000010609626770019531,
-0.0164337158203125,
-0.055419921875,
0.05230712890625,
0.00695037841796875,
-0.0033206939697265625,
0.048095703125,
0.027984619140625,
-0.01543426513671875,
0.06781005859375,
0.040374755859375,
-0.011260986328125,
0.05877685546875,
-0.061614990234375,
-0.01088714599609375,
-0.07525634765625,
-0.0225677490234375,
-0.02325439453125,
-0.05145263671875,
-0.024810791015625,
-0.0278778076171875,
0.034393310546875,
0.0338134765625,
-0.0565185546875,
0.035400390625,
-0.04400634765625,
0.00797271728515625,
0.026641845703125,
0.040740966796875,
-0.01050567626953125,
-0.008880615234375,
-0.0112457275390625,
0.0062255859375,
-0.04949951171875,
-0.0187225341796875,
0.043121337890625,
0.04290771484375,
0.0406494140625,
-0.005352020263671875,
0.046142578125,
0.0005326271057128906,
0.020263671875,
-0.046356201171875,
0.03912353515625,
0.00251007080078125,
-0.041229248046875,
-0.019287109375,
-0.024810791015625,
-0.07769775390625,
0.009307861328125,
-0.032806396484375,
-0.053802490234375,
0.029205322265625,
0.01776123046875,
-0.00824737548828125,
0.037109375,
-0.052886962890625,
0.060089111328125,
0.001064300537109375,
-0.047882080078125,
0.0037708282470703125,
-0.06573486328125,
0.0171356201171875,
0.021453857421875,
-0.01448822021484375,
0.002399444580078125,
-0.01068878173828125,
0.064208984375,
-0.05816650390625,
0.06610107421875,
-0.042083740234375,
-0.004703521728515625,
0.029632568359375,
-0.00004875659942626953,
0.043548583984375,
-0.008453369140625,
-0.0161590576171875,
0.005481719970703125,
-0.007110595703125,
-0.044952392578125,
-0.02880859375,
0.052093505859375,
-0.0526123046875,
-0.0182037353515625,
-0.019775390625,
-0.021881103515625,
0.014373779296875,
0.0199127197265625,
0.049072265625,
0.028961181640625,
0.0123748779296875,
0.00536346435546875,
0.04925537109375,
-0.027435302734375,
0.050994873046875,
0.0032482147216796875,
-0.0072174072265625,
-0.041351318359375,
0.053436279296875,
-0.0010156631469726562,
0.028961181640625,
0.0186920166015625,
0.0110931396484375,
-0.019378662109375,
-0.037994384765625,
-0.033660888671875,
0.032379150390625,
-0.047088623046875,
-0.03472900390625,
-0.04754638671875,
-0.036407470703125,
-0.031463623046875,
-0.037628173828125,
-0.0209503173828125,
-0.0215301513671875,
-0.054534912109375,
0.01081085205078125,
0.0477294921875,
0.0367431640625,
-0.01788330078125,
0.043670654296875,
-0.022369384765625,
0.019195556640625,
0.02386474609375,
0.02728271484375,
-0.004093170166015625,
-0.049102783203125,
-0.0010633468627929688,
0.00823211669921875,
-0.0396728515625,
-0.056976318359375,
0.039703369140625,
0.00453948974609375,
0.034423828125,
0.041046142578125,
-0.0203094482421875,
0.051849365234375,
-0.0218658447265625,
0.046112060546875,
0.045257568359375,
-0.06396484375,
0.039520263671875,
-0.029327392578125,
0.024932861328125,
0.0200042724609375,
0.04144287109375,
-0.034912109375,
-0.0261077880859375,
-0.05841064453125,
-0.052398681640625,
0.04632568359375,
0.016998291015625,
-0.00408172607421875,
0.02581787109375,
0.04901123046875,
-0.0265960693359375,
0.0120697021484375,
-0.060089111328125,
-0.0325927734375,
-0.022369384765625,
0.004467010498046875,
0.0011243820190429688,
0.005680084228515625,
-0.002780914306640625,
-0.0374755859375,
0.06732177734375,
-0.0020999908447265625,
0.041778564453125,
0.041412353515625,
0.00702667236328125,
-0.01297760009765625,
-0.02398681640625,
0.0411376953125,
0.03863525390625,
-0.0062103271484375,
-0.021331787109375,
0.00719451904296875,
-0.0283966064453125,
0.0166778564453125,
-0.003047943115234375,
-0.0245513916015625,
-0.006153106689453125,
0.0289764404296875,
0.06561279296875,
-0.01116943359375,
-0.01483154296875,
0.058319091796875,
0.0079498291015625,
-0.043121337890625,
-0.0220184326171875,
0.001705169677734375,
0.0120849609375,
0.036529541015625,
0.01080322265625,
0.027984619140625,
0.0065155029296875,
-0.01026153564453125,
0.0219268798828125,
0.043670654296875,
-0.04437255859375,
-0.011260986328125,
0.0550537109375,
0.0030498504638671875,
-0.0083160400390625,
0.0300445556640625,
-0.03192138671875,
-0.05242919921875,
0.072509765625,
0.037841796875,
0.055206298828125,
-0.01099395751953125,
0.0235595703125,
0.056396484375,
0.01409912109375,
0.00562286376953125,
0.014892578125,
0.01061248779296875,
-0.05426025390625,
-0.032135009765625,
-0.032440185546875,
-0.004756927490234375,
0.01386260986328125,
-0.03179931640625,
0.033203125,
-0.06005859375,
-0.0138397216796875,
-0.00868988037109375,
0.007511138916015625,
-0.05499267578125,
0.0340576171875,
0.004215240478515625,
0.0926513671875,
-0.0614013671875,
0.0611572265625,
0.043975830078125,
-0.035400390625,
-0.066162109375,
-0.0037708282470703125,
0.0004935264587402344,
-0.06256103515625,
0.045440673828125,
0.010894775390625,
-0.00643157958984375,
0.0093231201171875,
-0.061859130859375,
-0.0458984375,
0.10052490234375,
0.01971435546875,
-0.01763916015625,
0.004245758056640625,
-0.037445068359375,
0.03570556640625,
-0.03509521484375,
0.03448486328125,
0.034454345703125,
0.04095458984375,
0.03253173828125,
-0.05560302734375,
0.0170745849609375,
-0.0333251953125,
0.01096343994140625,
0.01250457763671875,
-0.07489013671875,
0.06805419921875,
-0.00299835205078125,
-0.013214111328125,
0.0209197998046875,
0.05694580078125,
0.0184326171875,
0.0123138427734375,
0.04901123046875,
0.06268310546875,
0.02362060546875,
-0.007213592529296875,
0.075439453125,
-0.0053863525390625,
0.022491455078125,
0.0479736328125,
0.02001953125,
0.039398193359375,
0.0290374755859375,
-0.0055389404296875,
0.040191650390625,
0.06561279296875,
0.0010309219360351562,
0.03399658203125,
0.03839111328125,
-0.018280029296875,
-0.00836181640625,
-0.0039825439453125,
-0.02862548828125,
0.006931304931640625,
0.0203857421875,
-0.0229034423828125,
-0.0159912109375,
0.01751708984375,
0.0194244384765625,
-0.01367950439453125,
-0.03350830078125,
0.051605224609375,
-0.006710052490234375,
-0.04083251953125,
0.05877685546875,
-0.0063629150390625,
0.08447265625,
-0.05108642578125,
0.003231048583984375,
-0.0221405029296875,
0.0031414031982421875,
-0.0308074951171875,
-0.0665283203125,
0.01512908935546875,
-0.0164337158203125,
0.021087646484375,
-0.0282135009765625,
0.05413818359375,
-0.029510498046875,
-0.032958984375,
0.039093017578125,
0.01230621337890625,
0.029998779296875,
0.010772705078125,
-0.081298828125,
0.015380859375,
0.005889892578125,
-0.03753662109375,
0.014404296875,
0.029388427734375,
0.0189056396484375,
0.05438232421875,
0.02301025390625,
0.0250091552734375,
0.0206146240234375,
-0.0163726806640625,
0.07891845703125,
-0.021148681640625,
-0.0246734619140625,
-0.044830322265625,
0.065185546875,
-0.0259246826171875,
-0.0360107421875,
0.042999267578125,
0.0245208740234375,
0.060516357421875,
-0.008544921875,
0.053619384765625,
-0.0300445556640625,
0.01558685302734375,
-0.0511474609375,
0.062042236328125,
-0.067138671875,
-0.0215606689453125,
-0.0289306640625,
-0.054046630859375,
-0.02264404296875,
0.06378173828125,
-0.0101165771484375,
0.017333984375,
0.0440673828125,
0.07525634765625,
-0.01708984375,
-0.044952392578125,
0.00695037841796875,
0.00699615478515625,
0.0230712890625,
0.055023193359375,
0.051513671875,
-0.05255126953125,
0.0230865478515625,
-0.043975830078125,
-0.0347900390625,
-0.00566864013671875,
-0.07159423828125,
-0.0657958984375,
-0.056060791015625,
-0.05322265625,
-0.052398681640625,
-0.021087646484375,
0.050750732421875,
0.08795166015625,
-0.049072265625,
-0.01464080810546875,
-0.0271453857421875,
0.00836181640625,
-0.013519287109375,
-0.0165863037109375,
0.027099609375,
-0.006450653076171875,
-0.06640625,
0.004180908203125,
0.016571044921875,
0.042999267578125,
-0.00957489013671875,
-0.033233642578125,
-0.0284271240234375,
-0.0217742919921875,
0.02215576171875,
0.03521728515625,
-0.035552978515625,
-0.01554107666015625,
-0.019775390625,
-0.014312744140625,
0.01030731201171875,
0.04022216796875,
-0.033050537109375,
0.01540374755859375,
0.04449462890625,
0.03204345703125,
0.06536865234375,
-0.01195526123046875,
0.0107879638671875,
-0.04449462890625,
0.0399169921875,
0.004108428955078125,
0.03472900390625,
0.01050567626953125,
-0.0252685546875,
0.03387451171875,
0.0229949951171875,
-0.05712890625,
-0.034271240234375,
0.01313018798828125,
-0.10308837890625,
-0.0096893310546875,
0.074462890625,
-0.0271453857421875,
-0.0287933349609375,
0.015960693359375,
-0.0330810546875,
0.028472900390625,
-0.026336669921875,
0.01861572265625,
0.02978515625,
-0.0127716064453125,
-0.0318603515625,
-0.0273284912109375,
0.04742431640625,
0.017669677734375,
-0.061370849609375,
-0.04254150390625,
0.041107177734375,
0.0263519287109375,
0.024444580078125,
0.0650634765625,
-0.00986480712890625,
0.0081329345703125,
-0.00464630126953125,
0.0179901123046875,
0.00506591796875,
-0.01172637939453125,
-0.043243408203125,
-0.0101318359375,
-0.0187530517578125,
-0.028961181640625
]
] |
Qwen/Qwen-14B | 2023-11-05T03:27:16.000Z | [
"transformers",
"safetensors",
"qwen",
"text-generation",
"custom_code",
"zh",
"en",
"arxiv:2309.16609",
"has_space",
"region:us"
] | text-generation | Qwen | null | null | Qwen/Qwen-14B | 159 | 27,153 | transformers | 2023-09-24T03:28:41 | ---
language:
- zh
- en
tags:
- qwen
pipeline_tag: text-generation
inference: false
---
# Qwen-14B
<p align="center">
<img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/logo_qwen.jpg" width="400"/>
</p>
<br>
<p align="center">
🤗 <a href="https://huggingface.co/Qwen">Hugging Face</a>   |   🤖 <a href="https://modelscope.cn/organization/qwen">ModelScope</a>   |    📑 <a href="https://arxiv.org/abs/2309.16609">Paper</a>   |   🖥️ <a href="https://modelscope.cn/studios/qwen/Qwen-7B-Chat-Demo/summary">Demo</a>
<br>
<a href="https://github.com/QwenLM/Qwen/blob/main/assets/wechat.png">WeChat (微信)</a>   |    DingTalk (钉钉)    |   <a href="https://discord.gg/z3GAxXZ9Ce">Discord</a>  
</p>
<br>
## 介绍 (Introduction)
**通义千问-14B**(**Qwen-14B**)是阿里云研发的通义千问大模型系列的140亿参数规模的模型。Qwen-14B是基于Transformer的大语言模型, 在超大规模的预训练数据上进行训练得到。预训练数据类型多样,覆盖广泛,包括大量网络文本、专业书籍、代码等。同时,在Qwen-14B的基础上,我们使用对齐机制打造了基于大语言模型的AI助手Qwen-14B-Chat。本仓库为Qwen-14B的仓库。
通义千问-14B(Qwen-14B)主要有以下特点:
1. **大规模高质量训练语料**:使用超过3万亿tokens的数据进行预训练,包含高质量中、英、多语言、代码、数学等数据,涵盖通用及专业领域的训练语料。通过大量对比实验对预训练语料分布进行了优化。
2. **强大的性能**:Qwen-14B在多个中英文下游评测任务上(涵盖常识推理、代码、数学、翻译等),效果显著超越现有的相近规模开源模型,甚至在部分指标上相比更大尺寸模型也有较强竞争力。具体评测结果请详见下文。
3. **覆盖更全面的词表**:相比目前以中英词表为主的开源模型,Qwen-14B使用了约15万大小的词表。该词表对多语言更加友好,方便用户在不扩展词表的情况下对部分语种进行能力增强和扩展。
如果您想了解更多关于通义千问14B开源模型的细节,我们建议您参阅[GitHub代码库](https://github.com/QwenLM/Qwen)。
**Qwen-14B** is the 14B-parameter version of the large language model series, Qwen (abbr. Tongyi Qianwen), proposed by Alibaba Cloud. Qwen-14B is a Transformer-based large language model, which is pretrained on a large volume of data, including web texts, books, code, etc. Additionally, based on the pretrained Qwen-14B, we release Qwen-14B-Chat, a large-model-based AI assistant, which is trained with alignment techniques. This repository is the one for Qwen-14B.
The features of Qwen-14B include:
1. **Large-scale high-quality training corpora**: It is pretrained on over 3 trillion tokens, including Chinese, English, multilingual texts, code, and mathematics, covering general and professional fields. The distribution of the pre-training corpus has been optimized through a large number of ablation experiments.
2. **Competitive performance**: It significantly surpasses existing open-source models of similar scale on multiple Chinese and English downstream evaluation tasks (including commonsense, reasoning, code, mathematics, etc.), and even surpasses some larger-scale models in several benchmarks. See below for specific evaluation results.
3. **More comprehensive vocabulary coverage**: Compared with other open-source models based on Chinese and English vocabularies, Qwen-14B uses a vocabulary of over 150K tokens. This vocabulary is more friendly to multiple languages, enabling users to directly enhance the capability for certain languages without expanding the vocabulary.
For more details about the open-source model of Qwen-14B, please refer to the [GitHub](https://github.com/QwenLM/Qwen) code repository.
<br>
## 要求(Requirements)
* python 3.8及以上版本
* pytorch 1.12及以上版本,推荐2.0及以上版本
* 建议使用CUDA 11.4及以上(GPU用户、flash-attention用户等需考虑此选项)
* python 3.8 and above
* pytorch 1.12 and above, 2.0 and above are recommended
* CUDA 11.4 and above are recommended (this is for GPU users, flash-attention users, etc.)
<br>
## 依赖项 (Dependency)
运行Qwen-14B,请确保满足上述要求,再执行以下pip命令安装依赖库
To run Qwen-14B, please make sure you meet the above requirements, and then execute the following pip commands to install the dependent libraries.
```bash
pip install transformers==4.32.0 accelerate tiktoken einops scipy transformers_stream_generator==0.0.4 peft deepspeed
```
另外,推荐安装`flash-attention`库(**当前已支持flash attention 2**),以实现更高的效率和更低的显存占用。
In addition, it is recommended to install the `flash-attention` library (**we support flash attention 2 now.**) for higher efficiency and lower memory usage.
```bash
git clone https://github.com/Dao-AILab/flash-attention
cd flash-attention && pip install .
# The installs below are optional and may be quite slow.
# pip install csrc/layer_norm
# pip install csrc/rotary
```
<br>
## 快速使用(Quickstart)
您可以通过以下代码轻松调用:
You can easily call the model with the following code:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig
# Note: The default behavior now has injection attack prevention off.
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-14B", trust_remote_code=True)
# use bf16
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-14B", device_map="auto", trust_remote_code=True, bf16=True).eval()
# use fp16
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-14B", device_map="auto", trust_remote_code=True, fp16=True).eval()
# use cpu only
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-14B", device_map="cpu", trust_remote_code=True).eval()
# use auto mode, automatically select precision based on the device.
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-14B", device_map="auto", trust_remote_code=True).eval()
# Specify hyperparameters for generation. But if you use transformers>=4.32.0, there is no need to do this.
# model.generation_config = GenerationConfig.from_pretrained("Qwen/Qwen-14B", trust_remote_code=True)
inputs = tokenizer('蒙古国的首都是乌兰巴托(Ulaanbaatar)\n冰岛的首都是雷克雅未克(Reykjavik)\n埃塞俄比亚的首都是', return_tensors='pt')
inputs = inputs.to(model.device)
pred = model.generate(**inputs)
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
# 蒙古国的首都是乌兰巴托(Ulaanbaatar)\n冰岛的首都是雷克雅未克(Reykjavik)\n埃塞俄比亚的首都是亚的斯亚贝巴(Addis Ababa)...
```
关于更多的使用说明,请参考我们的[GitHub repo](https://github.com/QwenLM/Qwen)获取更多信息。
For more information, please refer to our [GitHub repo](https://github.com/QwenLM/Qwen).
<br>
## Tokenizer
> Note: as a term, "tokenization" has no agreed-upon Chinese equivalent yet, so this document uses the English term for clarity.
基于tiktoken的分词器有别于其他分词器,比如sentencepiece分词器。尤其在微调阶段,需要特别注意特殊token的使用。关于tokenizer的更多信息,以及微调时涉及的相关使用,请参阅[文档](https://github.com/QwenLM/Qwen/blob/main/tokenization_note_zh.md)。
Our tokenizer based on tiktoken is different from other tokenizers, e.g., sentencepiece tokenizer. You need to pay attention to special tokens, especially in finetuning. For more detailed information on the tokenizer and related use in fine-tuning, please refer to the [documentation](https://github.com/QwenLM/Qwen/blob/main/tokenization_note.md).
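A minimal sketch for poking at the tokenizer; the exact ids and tokens printed are indicative only and should be verified against the linked documentation (digit-by-digit number handling is described further in the Model section below):
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-14B", trust_remote_code=True)

# Ordinary text round-trips through encode/decode as usual.
ids = tokenizer.encode("Hello, Qwen!")
print(ids, "->", tokenizer.decode(ids))

# Numbers are segmented digit by digit, so "12345" is expected to yield
# one token per digit rather than a single multi-digit token.
print(tokenizer.tokenize("12345"))
```
Special tokens are where this tokenizer differs most from sentencepiece-style ones, so the fine-tuning notes in the documentation above deserve a careful read before adding your own.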
<br>
## 模型细节 (Model)
Qwen-14B模型规模基本情况如下所示:
The details of the model architecture of Qwen-14B are listed as follows:
| Hyperparameter | Value |
|:----------------|:-------|
| n_layers | 40 |
| n_heads | 40 |
| d_model | 5120 |
| vocab size | 151851 |
| sequence length | 2048 |
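As a rough sanity check, the hyperparameters in this table can be plugged into the standard transformer parameter-count approximation (roughly 12·n_layers·d_model² weights plus the embedding matrix); this is a back-of-envelope estimate under generic transformer assumptions, not Qwen's exact layer layout:
```python
# Back-of-envelope parameter count from the table above (an approximation,
# not the exact Qwen-14B architecture).
n_layers, d_model, vocab = 40, 5120, 151851

attn_ffn = 12 * n_layers * d_model**2  # ~4*d^2 attention + ~8*d^2 FFN per layer
embed = vocab * d_model                # token embedding matrix

print(f"{(attn_ffn + embed) / 1e9:.1f}B")  # ~13.4B, consistent with a "14B" model
```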
在位置编码、FFN激活函数和normalization的实现方式上,我们也采用了目前最流行的做法,
即RoPE相对位置编码、SwiGLU激活函数、RMSNorm(可选安装flash-attention加速)。
在分词器方面,相比目前主流开源模型以中英词表为主,Qwen-14B使用了超过15万token大小的词表。 该词表在GPT-4使用的BPE词表`cl100k_base`基础上,对中文、多语言进行了优化,在对中、英、代码数据的高效编解码的基础上,对部分多语言更加友好,方便用户在不扩展词表的情况下对部分语种进行能力增强。
词表对数字按单个数字位切分。调用较为高效的[tiktoken分词库](https://github.com/openai/tiktoken)进行分词。
我们从部分语种各随机抽取100万个文档语料,以对比不同模型的编码压缩率(以支持100语种的XLM-R为基准值1,越低越好),具体性能见图。
可以看到Qwen-14B在保持中英代码高效解码的前提下,对部分使用人群较多的语种(泰语th、希伯来语he、阿拉伯语ar、韩语ko、越南语vi、日语ja、土耳其语tr、印尼语id、波兰语pl、俄语ru、荷兰语nl、葡萄牙语pt、意大利语it、德语de、西班牙语es、法语fr等)上也实现了较高的压缩率,使得模型在这些语种上也具备较强的可扩展性和较高的训练和推理效率。
在预训练数据方面,Qwen-14B模型一方面利用了部分开源通用语料,
另一方面也积累了海量全网语料以及高质量文本内容,去重及过滤后的语料超过3T tokens。
囊括全网文本、百科、书籍、代码、数学及各个领域垂类。
<p align="center">
<img src="assets/tokenizer.png" style="width: 1200px"/>
</p>
For position encoding, FFN activation function, and normalization methods, we adopt the prevalent practices, i.e., RoPE relative position encoding, SwiGLU for activation function, and RMSNorm for normalization (optional installation of flash-attention for acceleration).
For tokenization, compared to the current mainstream open-source models based on Chinese and English vocabularies, Qwen-14B uses a vocabulary of over 150K tokens. It first considers efficient encoding of Chinese, English, and code data, and is also friendlier to multiple languages, enabling users to directly enhance the capability for certain languages without expanding the vocabulary. It segments numbers digit by digit and calls the [tiktoken](https://github.com/openai/tiktoken) tokenizer library for efficient tokenization.
We randomly sampled 1 million documents per language to compare the encoding compression rates of different models (with XLM-R, which supports 100 languages, as the baseline value of 1; lower is better). The specific performance is shown in the figure above.
As can be seen, while ensuring the efficient decoding of Chinese, English, and code, Qwen-14B also achieves a high compression rate for many other languages (such as th, he, ar, ko, vi, ja, tr, id, pl, ru, nl, pt, it, de, es, fr, etc.), equipping the model with strong scalability as well as high training and inference efficiency in these languages.
For pre-training data, on the one hand, Qwen-14B uses part of the open-source generic corpus. On the other hand, it uses a massive amount of accumulated web corpus and high-quality text content. The scale of the corpus reaches over 3T tokens after deduplication and filtration, encompassing web text, encyclopedias, books, code, mathematics, and various domains.
<br>
## 评测效果(Evaluation)
我们选取了MMLU,C-Eval,GSM8K, MATH, HumanEval, MBPP, BBH, CMMLU等目前较流行的benchmark,对模型的中英知识能力、翻译、数学推理、代码等能力进行综合评测。从下列结果可以看到Qwen模型在所有benchmark上均取得了同级别开源模型中的最优表现。
We selected MMLU, C-Eval, GSM8K, MATH, HumanEval, MBPP, BBH, and CMMLU, which are currently popular benchmarks, to test the model's Chinese and English knowledge capabilities, translation, mathematical reasoning, coding, and other capabilities. From the following comprehensive evaluation results, we can see that the Qwen models outperform similarly sized open-source models on all tasks.
| Model | MMLU | C-Eval | GSM8K | MATH | HumanEval | MBPP | BBH | CMMLU |
|:-------------------|:--------:|:--------:|:--------:|:--------:|:---------:|:--------:|:--------:|:--------:|
| | 5-shot | 5-shot | 8-shot | 4-shot | 0-shot | 3-shot | 3-shot | 5-shot |
| LLaMA2-7B | 46.8 | 32.5 | 16.7 | 3.3 | 12.8 | 20.8 | 38.2 | 31.8 |
| LLaMA2-13B | 55.0 | 41.4 | 29.6 | 5.0 | 18.9 | 30.3 | 45.6 | 38.4 |
| LLaMA2-34B | 62.6 | - | 42.2 | 6.2 | 22.6 | 33.0 | 44.1 | - |
| ChatGLM2-6B | 47.9 | 51.7 | 32.4 | 6.5 | - | - | 33.7 | - |
| InternLM-7B | 51.0 | 53.4 | 31.2 | 6.3 | 10.4 | 14.0 | 37.0 | 51.8 |
| InternLM-20B | 62.1 | 58.8 | 52.6 | 7.9 | 25.6 | 35.6 | 52.5 | 59.0 |
| Baichuan2-7B | 54.7 | 56.3 | 24.6 | 5.6 | 18.3 | 24.2 | 41.6 | 57.1 |
| Baichuan2-13B | 59.5 | 59.0 | 52.8 | 10.1 | 17.1 | 30.2 | 49.0 | 62.0 |
| Qwen-7B (original) | 56.7 | 59.6 | 51.6 | - | 24.4 | 31.2 | 40.6 | 58.8 |
| **Qwen-7B** | 58.2 | 63.5 | 51.7 | 11.6 | 29.9 | 31.6 | 45.0 | 62.2 |
| **Qwen-14B** | **66.3** | **72.1** | **61.3** | **24.8** | **32.3** | **40.8** | **53.4** | **71.0** |
### 长序列评测(Long-Context Evaluation)
我们引入NTK插值,LogN注意力缩放,窗口注意力等技巧,将Qwen-7B (original)和14B模型的上下文长度从2K扩展到8K以上,将Qwen-7B从8K扩到32K。在arXiv数据上使用PPL指标测试Qwen-7B和Qwen-14B在不同长度下的表现,结果如下:
**(若要启用NTK和LogN注意力缩放,请将config.json里的`use_dynamic_ntk`和`use_logn_attn`设置为true)**
We introduce NTK-aware interpolation, LogN attention scaling, window attention, and other techniques to extend the context length of Qwen-7B (original) and Qwen-14B from 2K to over 8K tokens, and that of Qwen-7B from 8K to 32K tokens. We conduct language modeling experiments on the arXiv dataset with the PPL evaluation, testing performance at different sequence lengths. Results are demonstrated below:
**(To use NTK interpolation and LogN scaling, please set `use_dynamic_ntk` and `use_logn_attn` to true in config.json.)**
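The same flags can also be set from Python rather than editing config.json by hand; a minimal sketch, assuming nothing beyond the two flag names given above (the loading call mirrors the Quickstart):
```python
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("Qwen/Qwen-14B", trust_remote_code=True)
config.use_dynamic_ntk = True  # NTK-aware interpolation
config.use_logn_attn = True    # LogN attention scaling

model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-14B", config=config, device_map="auto", trust_remote_code=True
).eval()
```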
<table>
<tr>
<th rowspan="2">Model</th><th colspan="6" align="center">Sequence Length</th>
</tr>
<tr>
<th align="center">1024</th><th align="center">2048</th><th align="center">4096</th><th align="center">8192</th><th align="center">16384</th><th align="center">32768</th>
</tr>
<tr>
<td>Qwen-7B (original)</td><td align="center">4.23</td><td align="center">3.78</td><td align="center">39.35</td><td align="center">469.81</td><td align="center">2645.09</td><td align="center">-</td>
</tr>
<tr>
<td>+ dynamic_ntk</td><td align="center">4.23</td><td align="center">3.78</td><td align="center">3.59</td><td align="center">3.66</td><td align="center">5.71</td><td align="center">-</td>
</tr>
<tr>
<td>+ dynamic_ntk + logn</td><td align="center">4.23</td><td align="center">3.78</td><td align="center">3.58</td><td align="center">3.56</td><td align="center">4.62</td><td align="center">-</td>
</tr>
<tr>
<td>+ dynamic_ntk + logn + window_attn</td><td align="center">4.23</td><td align="center">3.78</td><td align="center">3.58</td><td align="center">3.49</td><td align="center">4.32</td><td align="center">-</td>
</tr>
<tr>
<td>Qwen-7B</td><td align="center"><b>4.23</b></td><td align="center"><b>3.81</b></td><td align="center"><b>3.52</b></td><td align="center"><b>3.31</b></td><td align="center">7.27</td><td align="center">181.49</td>
</tr>
<tr>
<td>+ dynamic_ntk + logn + window_attn</td><td align="center"><b>4.23</b></td><td align="center"><b>3.81</b></td><td align="center"><b>3.52</b></td><td align="center"><b>3.33</b></td><td align="center"><b>3.22</b></td><td align="center"><b>3.17</b></td>
</tr>
<tr>
<td>Qwen-14B</td><td align="center"><b>-</b></td><td align="center"><b>3.46</b></td><td align="center">22.79</td><td align="center">334.65</td><td align="center">3168.35</td><td align="center">-</td>
</tr>
<tr>
<td>+ dynamic_ntk + logn + window_attn</td><td align="center"><b>-</b></td><td align="center"><b>3.46</b></td><td align="center"><b>3.29</b></td><td align="center"><b>3.18</b></td><td align="center">3.42</td><td align="center">-</td>
</tr>
</table>
## 评测复现(Reproduction)
我们提供了评测脚本,方便大家复现模型效果,详见[链接](https://github.com/QwenLM/Qwen/tree/main/eval)。提示:由于硬件和框架造成的舍入误差,复现结果如有小幅波动属于正常现象。
We have provided evaluation scripts to reproduce the performance of our model; see this [link](https://github.com/QwenLM/Qwen/tree/main/eval) for details. Note: small fluctuations in reproduced results, caused by hardware and framework rounding errors, are normal.
<br>
## FAQ
如遇到问题,敬请查阅[FAQ](https://github.com/QwenLM/Qwen/blob/main/FAQ_zh.md)以及issue区,如仍无法解决再提交issue。
If you run into problems, please consult the [FAQ](https://github.com/QwenLM/Qwen/blob/main/FAQ.md) and existing issues for a solution before opening a new issue.
<br>
## 引用 (Citation)
如果你觉得我们的工作对你有帮助,欢迎引用!
If you find our work helpful, feel free to cite us.
```
@article{qwen,
title={Qwen Technical Report},
author={Jinze Bai and Shuai Bai and Yunfei Chu and Zeyu Cui and Kai Dang and Xiaodong Deng and Yang Fan and Wenbin Ge and Yu Han and Fei Huang and Binyuan Hui and Luo Ji and Mei Li and Junyang Lin and Runji Lin and Dayiheng Liu and Gao Liu and Chengqiang Lu and Keming Lu and Jianxin Ma and Rui Men and Xingzhang Ren and Xuancheng Ren and Chuanqi Tan and Sinan Tan and Jianhong Tu and Peng Wang and Shijie Wang and Wei Wang and Shengguang Wu and Benfeng Xu and Jin Xu and An Yang and Hao Yang and Jian Yang and Shusheng Yang and Yang Yao and Bowen Yu and Hongyi Yuan and Zheng Yuan and Jianwei Zhang and Xingxuan Zhang and Yichang Zhang and Zhenru Zhang and Chang Zhou and Jingren Zhou and Xiaohuan Zhou and Tianhang Zhu},
journal={arXiv preprint arXiv:2309.16609},
year={2023}
}
```
<br>
## 使用协议(License Agreement)
我们的代码和模型权重对学术研究完全开放,并支持商用。请查看[LICENSE](https://github.com/QwenLM/Qwen/blob/main/LICENSE)了解具体的开源协议细节。如需商用,请填写[问卷](https://dashscope.console.aliyun.com/openModelApply/qianwen)申请。
Our code and checkpoints are open for research purposes, and commercial use is permitted. Check [LICENSE](https://github.com/QwenLM/Qwen/blob/main/LICENSE) for more details about the license. If you have requirements for commercial use, please fill out the [form](https://dashscope.console.aliyun.com/openModelApply/qianwen) to apply.
<br>
## Contact Us
If you would like to leave a message for our research or product teams, join our Discord, WeChat, or DingTalk groups! You can also reach us by email at qianwen_opensource@alibabacloud.com.
| 16,680 | [
[
-0.03753662109375,
-0.03851318359375,
0.00405120849609375,
0.01617431640625,
-0.0200042724609375,
-0.0141754150390625,
-0.0129547119140625,
-0.035675048828125,
-0.00023281574249267578,
0.016693115234375,
-0.042724609375,
-0.04779052734375,
-0.0308990478515625,
0.0011930465698242188,
-0.01357269287109375,
0.054107666015625,
0.005359649658203125,
-0.0017404556274414062,
0.0292816162109375,
-0.0117950439453125,
-0.0215911865234375,
-0.01471710205078125,
-0.04046630859375,
-0.0121917724609375,
0.0215606689453125,
0.0114288330078125,
0.061798095703125,
0.049713134765625,
0.043548583984375,
0.02935791015625,
-0.0100860595703125,
0.016632080078125,
-0.028076171875,
-0.02166748046875,
0.00846099853515625,
-0.03436279296875,
-0.047698974609375,
-0.0009016990661621094,
0.054534912109375,
0.025390625,
0.006496429443359375,
0.0303497314453125,
0.0200042724609375,
0.036468505859375,
-0.020355224609375,
0.004253387451171875,
-0.0274505615234375,
-0.0000776052474975586,
-0.0194091796875,
-0.0011234283447265625,
-0.00701904296875,
-0.031646728515625,
0.01290130615234375,
-0.062286376953125,
0.01094818115234375,
0.0248565673828125,
0.109375,
-0.005138397216796875,
-0.0396728515625,
0.001956939697265625,
-0.0258941650390625,
0.08245849609375,
-0.0816650390625,
0.019012451171875,
0.019287109375,
0.019378662109375,
-0.01239013671875,
-0.0855712890625,
-0.05035400390625,
-0.0185699462890625,
-0.0235137939453125,
0.0189361572265625,
-0.019073486328125,
0.0110931396484375,
0.033294677734375,
0.0294952392578125,
-0.049774169921875,
-0.0003921985626220703,
-0.031036376953125,
-0.02105712890625,
0.0421142578125,
0.0088653564453125,
0.0450439453125,
-0.035552978515625,
-0.036895751953125,
-0.01520538330078125,
-0.034576416015625,
0.01024627685546875,
0.0188140869140625,
0.00910186767578125,
-0.0345458984375,
0.022979736328125,
-0.013458251953125,
0.028045654296875,
0.0232391357421875,
-0.021575927734375,
0.034912109375,
-0.029632568359375,
-0.03009033203125,
-0.0168304443359375,
0.095947265625,
0.043701171875,
-0.007457733154296875,
0.00255584716796875,
-0.01009368896484375,
-0.0158538818359375,
-0.0161285400390625,
-0.08087158203125,
-0.0259857177734375,
0.04974365234375,
-0.066162109375,
-0.037567138671875,
0.0115203857421875,
-0.03961181640625,
0.0065765380859375,
0.002323150634765625,
0.05694580078125,
-0.0487060546875,
-0.046051025390625,
0.006481170654296875,
-0.01715087890625,
0.0157012939453125,
0.0160980224609375,
-0.0570068359375,
0.008392333984375,
0.0262298583984375,
0.060455322265625,
0.005474090576171875,
-0.0390625,
-0.0189666748046875,
0.0041656494140625,
-0.0171051025390625,
0.0226593017578125,
0.01409912109375,
-0.0290679931640625,
-0.00911712646484375,
0.019866943359375,
-0.00630950927734375,
-0.03729248046875,
0.041748046875,
-0.043060302734375,
0.035888671875,
-0.0153045654296875,
-0.037261962890625,
-0.0272979736328125,
0.01277923583984375,
-0.051513671875,
0.08184814453125,
0.01558685302734375,
-0.08612060546875,
0.002872467041015625,
-0.053070068359375,
-0.01506805419921875,
0.013580322265625,
0.007526397705078125,
-0.03857421875,
-0.0247802734375,
0.015838623046875,
0.023345947265625,
-0.0276031494140625,
0.0195159912109375,
-0.00788116455078125,
-0.031982421875,
0.025543212890625,
-0.03857421875,
0.10675048828125,
0.0184326171875,
-0.05108642578125,
0.036773681640625,
-0.061309814453125,
0.019378662109375,
0.0212860107421875,
-0.01450347900390625,
-0.0016613006591796875,
-0.0215301513671875,
0.0224151611328125,
0.02630615234375,
0.03802490234375,
-0.039947509765625,
0.0116119384765625,
-0.05364990234375,
0.055938720703125,
0.053863525390625,
-0.0026378631591796875,
0.0394287109375,
-0.035797119140625,
0.03375244140625,
0.01546478271484375,
0.0241241455078125,
-0.014678955078125,
-0.029510498046875,
-0.061126708984375,
-0.008636474609375,
0.0419921875,
0.042144775390625,
-0.06817626953125,
0.046630859375,
-0.0014772415161132812,
-0.04620361328125,
-0.06005859375,
-0.002716064453125,
0.038726806640625,
0.03253173828125,
0.041839599609375,
0.01277923583984375,
-0.0504150390625,
-0.051025390625,
0.00010228157043457031,
-0.0145721435546875,
-0.01184844970703125,
0.01129913330078125,
0.037109375,
-0.015594482421875,
0.05133056640625,
-0.034088134765625,
-0.0215301513671875,
-0.0255584716796875,
0.00032520294189453125,
0.0295562744140625,
0.056121826171875,
0.05718994140625,
-0.064453125,
-0.039398193359375,
0.0021152496337890625,
-0.05645751953125,
0.007701873779296875,
-0.0111541748046875,
-0.04058837890625,
0.006816864013671875,
0.0184783935546875,
-0.053375244140625,
0.0257568359375,
0.045806884765625,
-0.02972412109375,
0.06878662109375,
-0.0073394775390625,
0.00849151611328125,
-0.104736328125,
0.0106964111328125,
-0.005340576171875,
-0.004322052001953125,
-0.032958984375,
0.01922607421875,
0.0197296142578125,
0.014862060546875,
-0.04315185546875,
0.059844970703125,
-0.03363037109375,
0.0227508544921875,
-0.00894927978515625,
0.014251708984375,
0.0159149169921875,
0.0445556640625,
-0.0157928466796875,
0.05303955078125,
0.047027587890625,
-0.05010986328125,
0.039398193359375,
0.029205322265625,
-0.024627685546875,
0.003269195556640625,
-0.061004638671875,
-0.0108795166015625,
0.014190673828125,
0.00455474853515625,
-0.07275390625,
-0.004894256591796875,
0.032806396484375,
-0.053955078125,
0.0150146484375,
-0.004425048828125,
-0.016204833984375,
-0.0462646484375,
-0.03997802734375,
0.0191802978515625,
0.046234130859375,
-0.045318603515625,
0.042083740234375,
0.00624847412109375,
0.0135040283203125,
-0.046783447265625,
-0.05096435546875,
-0.0074615478515625,
-0.019927978515625,
-0.057403564453125,
0.06317138671875,
-0.00829315185546875,
-0.005153656005859375,
0.00914764404296875,
0.0126495361328125,
0.00923919677734375,
0.0034332275390625,
0.0105743408203125,
0.037933349609375,
-0.0279083251953125,
-0.003314971923828125,
0.0030460357666015625,
-0.01030731201171875,
-0.0015897750854492188,
-0.03741455078125,
0.04034423828125,
-0.006092071533203125,
-0.00768280029296875,
-0.0704345703125,
0.01079559326171875,
0.040191650390625,
-0.02838134765625,
0.057952880859375,
0.08447265625,
-0.0234527587890625,
-0.00010198354721069336,
-0.026336669921875,
-0.01226043701171875,
-0.04510498046875,
0.033966064453125,
-0.0290069580078125,
-0.060028076171875,
0.044891357421875,
0.004978179931640625,
0.032012939453125,
0.06243896484375,
0.036956787109375,
-0.014068603515625,
0.08343505859375,
0.03399658203125,
-0.0095062255859375,
0.0489501953125,
-0.060028076171875,
0.00597381591796875,
-0.06494140625,
-0.0017900466918945312,
-0.0235748291015625,
-0.0292205810546875,
-0.068603515625,
-0.0177764892578125,
0.0232696533203125,
0.0219573974609375,
-0.040008544921875,
0.035675048828125,
-0.035858154296875,
-0.0111083984375,
0.06494140625,
0.007598876953125,
0.005588531494140625,
-0.021575927734375,
-0.0154266357421875,
0.0028095245361328125,
-0.059783935546875,
-0.03643798828125,
0.07122802734375,
0.034515380859375,
0.0298919677734375,
0.00653839111328125,
0.054107666015625,
0.0024509429931640625,
0.0104522705078125,
-0.050506591796875,
0.03533935546875,
0.002109527587890625,
-0.04010009765625,
-0.03814697265625,
-0.039825439453125,
-0.06976318359375,
0.0347900390625,
-0.009613037109375,
-0.05535888671875,
0.019989013671875,
0.006931304931640625,
-0.047393798828125,
0.032928466796875,
-0.0482177734375,
0.07476806640625,
-0.0178985595703125,
-0.03460693359375,
-0.0001323223114013672,
-0.049591064453125,
0.030181884765625,
0.030303955078125,
0.01953125,
-0.00640106201171875,
0.0162811279296875,
0.07427978515625,
-0.04498291015625,
0.051025390625,
-0.019927978515625,
-0.00638580322265625,
0.046966552734375,
-0.005252838134765625,
0.033477783203125,
0.00528717041015625,
-0.0014295578002929688,
0.01470184326171875,
0.025238037109375,
-0.03466796875,
-0.030914306640625,
0.045806884765625,
-0.07177734375,
-0.046142578125,
-0.020751953125,
-0.03436279296875,
0.011138916015625,
0.033966064453125,
0.047515869140625,
0.04058837890625,
0.00350189208984375,
0.0168304443359375,
0.0311737060546875,
-0.0325927734375,
0.045440673828125,
0.0207977294921875,
-0.035614013671875,
-0.04388427734375,
0.07305908203125,
0.0218963623046875,
0.0162200927734375,
0.026153564453125,
0.01708984375,
-0.0215911865234375,
-0.035400390625,
-0.055328369140625,
0.0169219970703125,
-0.0292205810546875,
-0.0340576171875,
-0.050048828125,
-0.0279388427734375,
-0.04718017578125,
0.0089569091796875,
-0.01296234130859375,
-0.0292205810546875,
-0.031707763671875,
-0.01288604736328125,
0.0386962890625,
0.0204925537109375,
-0.005859375,
0.030364990234375,
-0.0771484375,
0.038299560546875,
0.01580810546875,
0.007328033447265625,
0.0306396484375,
-0.053436279296875,
-0.040679931640625,
0.02398681640625,
-0.03594970703125,
-0.054107666015625,
0.042083740234375,
0.0015201568603515625,
0.033294677734375,
0.043182373046875,
0.0180816650390625,
0.04132080078125,
-0.0184783935546875,
0.0634765625,
0.02117919921875,
-0.07647705078125,
0.02734375,
-0.034912109375,
0.01396942138671875,
0.00677490234375,
0.0237274169921875,
-0.039215087890625,
-0.01471710205078125,
-0.05206298828125,
-0.06561279296875,
0.059356689453125,
0.0200347900390625,
0.009521484375,
0.0023937225341796875,
0.006732940673828125,
-0.0198974609375,
0.008026123046875,
-0.051300048828125,
-0.037628173828125,
-0.024261474609375,
-0.022857666015625,
0.0279693603515625,
-0.014892578125,
0.0008053779602050781,
-0.0293121337890625,
0.056640625,
0.0015687942504882812,
0.035430908203125,
0.01483154296875,
-0.005390167236328125,
-0.0036983489990234375,
0.00064849853515625,
0.0272064208984375,
0.035675048828125,
-0.01837158203125,
-0.006755828857421875,
0.017364501953125,
-0.054107666015625,
-0.0005412101745605469,
0.007022857666015625,
-0.0308990478515625,
0.007167816162109375,
0.0286865234375,
0.0704345703125,
0.01398468017578125,
-0.031158447265625,
0.0323486328125,
-0.0015659332275390625,
-0.036773681640625,
-0.00919342041015625,
0.00576019287109375,
0.027099609375,
0.03466796875,
0.037139892578125,
-0.01480865478515625,
0.01311492919921875,
-0.033843994140625,
0.00698089599609375,
0.01837158203125,
-0.00995635986328125,
-0.0248565673828125,
0.0631103515625,
0.01552581787109375,
-0.0037403106689453125,
0.036041259765625,
-0.03448486328125,
-0.058380126953125,
0.0672607421875,
0.049713134765625,
0.05926513671875,
-0.016845703125,
0.01308441162109375,
0.055938720703125,
0.01255035400390625,
-0.003833770751953125,
0.0377197265625,
0.0093231201171875,
-0.06085205078125,
-0.022186279296875,
-0.047882080078125,
-0.0141754150390625,
0.003704071044921875,
-0.0548095703125,
0.0261688232421875,
-0.0310821533203125,
-0.03021240234375,
-0.0026912689208984375,
0.0227813720703125,
-0.046630859375,
0.031951904296875,
-0.00986480712890625,
0.070068359375,
-0.0335693359375,
0.075927734375,
0.03369140625,
-0.039947509765625,
-0.07470703125,
0.006954193115234375,
-0.016326904296875,
-0.05010986328125,
0.05072021484375,
0.00814056396484375,
-0.0016317367553710938,
0.0296783447265625,
-0.03851318359375,
-0.07501220703125,
0.1065673828125,
-0.000347137451171875,
-0.04827880859375,
-0.016021728515625,
-0.0167999267578125,
0.0273590087890625,
-0.00045609474182128906,
0.0240631103515625,
0.0284423828125,
0.036712646484375,
-0.0022449493408203125,
-0.0794677734375,
0.022308349609375,
-0.0258636474609375,
0.0048675537109375,
0.005962371826171875,
-0.0791015625,
0.08856201171875,
-0.0207061767578125,
-0.024139404296875,
0.0184326171875,
0.07763671875,
0.0167388916015625,
0.006954193115234375,
0.0168609619140625,
0.022369384765625,
0.053558349609375,
-0.01184844970703125,
0.0546875,
-0.04046630859375,
0.0484619140625,
0.05523681640625,
0.0048828125,
0.0562744140625,
0.01422882080078125,
-0.042236328125,
0.034210205078125,
0.041900634765625,
-0.0211334228515625,
0.0369873046875,
0.0014095306396484375,
-0.014251708984375,
-0.0037670135498046875,
0.0009741783142089844,
-0.046661376953125,
0.01288604736328125,
0.0230712890625,
-0.016754150390625,
0.00577545166015625,
0.0195770263671875,
0.004116058349609375,
-0.0232086181640625,
-0.0292205810546875,
0.039642333984375,
0.01380157470703125,
-0.0235137939453125,
0.0616455078125,
0.0295562744140625,
0.0821533203125,
-0.037078857421875,
-0.0005726814270019531,
-0.003955841064453125,
0.00867462158203125,
-0.0264434814453125,
-0.0462646484375,
-0.0008664131164550781,
-0.0167236328125,
-0.0064697265625,
0.006725311279296875,
0.05670166015625,
-0.03302001953125,
-0.03363037109375,
0.026885986328125,
0.0268402099609375,
-0.0012311935424804688,
0.001377105712890625,
-0.061614990234375,
0.005901336669921875,
0.0167999267578125,
-0.0421142578125,
0.0283203125,
0.045379638671875,
-0.0005440711975097656,
0.04144287109375,
0.0540771484375,
-0.00443267822265625,
0.003147125244140625,
0.009735107421875,
0.0728759765625,
-0.059051513671875,
-0.03497314453125,
-0.060516357421875,
0.0538330078125,
-0.001617431640625,
-0.038787841796875,
0.067626953125,
0.02880859375,
0.07122802734375,
0.00688934326171875,
0.06561279296875,
-0.018951416015625,
0.0413818359375,
-0.032867431640625,
0.07037353515625,
-0.035614013671875,
-0.003597259521484375,
-0.024566650390625,
-0.043365478515625,
-0.00955963134765625,
0.060821533203125,
-0.0227813720703125,
0.01910400390625,
0.042266845703125,
0.058197021484375,
0.011688232421875,
-0.00507354736328125,
0.030853271484375,
0.0308685302734375,
0.01122283935546875,
0.049835205078125,
0.045440673828125,
-0.073974609375,
0.051361083984375,
-0.042236328125,
-0.005878448486328125,
-0.028961181640625,
-0.04241943359375,
-0.07110595703125,
-0.03582763671875,
-0.0239410400390625,
-0.044281005859375,
-0.015899658203125,
0.082763671875,
0.035247802734375,
-0.0689697265625,
-0.0196075439453125,
0.006317138671875,
0.0092620849609375,
-0.035369873046875,
-0.0227203369140625,
0.06231689453125,
-0.0122833251953125,
-0.06842041015625,
0.00260162353515625,
0.0012998580932617188,
0.0185699462890625,
-0.0230255126953125,
-0.013671875,
-0.01244354248046875,
-0.013214111328125,
0.034698486328125,
0.00970458984375,
-0.050872802734375,
-0.00125885009765625,
0.0272979736328125,
-0.024261474609375,
0.0032062530517578125,
0.0137176513671875,
-0.04229736328125,
0.0052642822265625,
0.0396728515625,
0.00939178466796875,
0.02825927734375,
-0.0035877227783203125,
0.03143310546875,
-0.02227783203125,
0.01532745361328125,
0.006198883056640625,
0.0297698974609375,
0.005359649658203125,
-0.0256195068359375,
0.012115478515625,
0.0159759521484375,
-0.034698486328125,
-0.06109619140625,
-0.003772735595703125,
-0.060882568359375,
-0.0211334228515625,
0.078857421875,
-0.036651611328125,
-0.029388427734375,
0.004909515380859375,
-0.053619384765625,
0.0439453125,
-0.0181732177734375,
0.046142578125,
0.039337158203125,
0.01552581787109375,
-0.0187530517578125,
-0.045440673828125,
0.04248046875,
0.0194549560546875,
-0.035736083984375,
-0.00879669189453125,
0.0070953369140625,
0.0186614990234375,
0.009063720703125,
0.041107177734375,
-0.00336456298828125,
0.038970947265625,
0.0029125213623046875,
0.0246429443359375,
-0.0123291015625,
-0.0004162788391113281,
-0.00679779052734375,
0.0025768280029296875,
0.0006933212280273438,
-0.036224365234375
]
] |
intfloat/multilingual-e5-small | 2023-09-08T08:09:07.000Z | [
"sentence-transformers",
"pytorch",
"onnx",
"safetensors",
"bert",
"mteb",
"Sentence Transformers",
"sentence-similarity",
"multilingual",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"hu",
"hy",
"id",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"om",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sa",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"tr",
"ug",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"zh",
"arxiv:2212.03533",
"arxiv:2108.08787",
"arxiv:2104.08663",
"arxiv:2210.07316",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | intfloat | null | null | intfloat/multilingual-e5-small | 43 | 27,109 | sentence-transformers | 2023-06-30T07:31:03 | ---
tags:
- mteb
- Sentence Transformers
- sentence-similarity
- sentence-transformers
model-index:
- name: multilingual-e5-small
results:
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (en)
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 73.79104477611939
- type: ap
value: 36.9996434842022
- type: f1
value: 67.95453679103099
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (de)
config: de
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 71.64882226980728
- type: ap
value: 82.11942130026586
- type: f1
value: 69.87963421606715
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (en-ext)
config: en-ext
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 75.8095952023988
- type: ap
value: 24.46869495579561
- type: f1
value: 63.00108480037597
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (ja)
config: ja
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 64.186295503212
- type: ap
value: 15.496804690197042
- type: f1
value: 52.07153895475031
- task:
type: Classification
dataset:
type: mteb/amazon_polarity
name: MTEB AmazonPolarityClassification
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 88.699325
- type: ap
value: 85.27039559917269
- type: f1
value: 88.65556295032513
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (en)
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 44.69799999999999
- type: f1
value: 43.73187348654165
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (de)
config: de
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 40.245999999999995
- type: f1
value: 39.3863530637684
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (es)
config: es
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 40.394
- type: f1
value: 39.301223469483446
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (fr)
config: fr
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 38.864
- type: f1
value: 37.97974261868003
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (ja)
config: ja
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 37.682
- type: f1
value: 37.07399369768313
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (zh)
config: zh
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 37.504
- type: f1
value: 36.62317273874278
- task:
type: Retrieval
dataset:
type: arguana
name: MTEB ArguAna
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 19.061
- type: map_at_10
value: 31.703
- type: map_at_100
value: 32.967
- type: map_at_1000
value: 33.001000000000005
- type: map_at_3
value: 27.466
- type: map_at_5
value: 29.564
- type: mrr_at_1
value: 19.559
- type: mrr_at_10
value: 31.874999999999996
- type: mrr_at_100
value: 33.146
- type: mrr_at_1000
value: 33.18
- type: mrr_at_3
value: 27.667
- type: mrr_at_5
value: 29.74
- type: ndcg_at_1
value: 19.061
- type: ndcg_at_10
value: 39.062999999999995
- type: ndcg_at_100
value: 45.184000000000005
- type: ndcg_at_1000
value: 46.115
- type: ndcg_at_3
value: 30.203000000000003
- type: ndcg_at_5
value: 33.953
- type: precision_at_1
value: 19.061
- type: precision_at_10
value: 6.279999999999999
- type: precision_at_100
value: 0.9129999999999999
- type: precision_at_1000
value: 0.099
- type: precision_at_3
value: 12.706999999999999
- type: precision_at_5
value: 9.431000000000001
- type: recall_at_1
value: 19.061
- type: recall_at_10
value: 62.802
- type: recall_at_100
value: 91.323
- type: recall_at_1000
value: 98.72
- type: recall_at_3
value: 38.122
- type: recall_at_5
value: 47.155
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-p2p
name: MTEB ArxivClusteringP2P
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 39.22266660528253
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-s2s
name: MTEB ArxivClusteringS2S
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 30.79980849482483
- task:
type: Reranking
dataset:
type: mteb/askubuntudupquestions-reranking
name: MTEB AskUbuntuDupQuestions
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 57.8790068352054
- type: mrr
value: 71.78791276436706
- task:
type: STS
dataset:
type: mteb/biosses-sts
name: MTEB BIOSSES
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 82.36328364043163
- type: cos_sim_spearman
value: 82.26211536195868
- type: euclidean_pearson
value: 80.3183865039173
- type: euclidean_spearman
value: 79.88495276296132
- type: manhattan_pearson
value: 80.14484480692127
- type: manhattan_spearman
value: 80.39279565980743
- task:
type: BitextMining
dataset:
type: mteb/bucc-bitext-mining
name: MTEB BUCC (de-en)
config: de-en
split: test
revision: d51519689f32196a32af33b075a01d0e7c51e252
metrics:
- type: accuracy
value: 98.0375782881002
- type: f1
value: 97.86012526096033
- type: precision
value: 97.77139874739039
- type: recall
value: 98.0375782881002
- task:
type: BitextMining
dataset:
type: mteb/bucc-bitext-mining
name: MTEB BUCC (fr-en)
config: fr-en
split: test
revision: d51519689f32196a32af33b075a01d0e7c51e252
metrics:
- type: accuracy
value: 93.35241030156286
- type: f1
value: 92.66050333846944
- type: precision
value: 92.3306919069631
- type: recall
value: 93.35241030156286
- task:
type: BitextMining
dataset:
type: mteb/bucc-bitext-mining
name: MTEB BUCC (ru-en)
config: ru-en
split: test
revision: d51519689f32196a32af33b075a01d0e7c51e252
metrics:
- type: accuracy
value: 94.0699688257707
- type: f1
value: 93.50236693222492
- type: precision
value: 93.22791825424315
- type: recall
value: 94.0699688257707
- task:
type: BitextMining
dataset:
type: mteb/bucc-bitext-mining
name: MTEB BUCC (zh-en)
config: zh-en
split: test
revision: d51519689f32196a32af33b075a01d0e7c51e252
metrics:
- type: accuracy
value: 89.25750394944708
- type: f1
value: 88.79234684921889
- type: precision
value: 88.57293312269616
- type: recall
value: 89.25750394944708
- task:
type: Classification
dataset:
type: mteb/banking77
name: MTEB Banking77Classification
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 79.41558441558442
- type: f1
value: 79.25886487487219
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-p2p
name: MTEB BiorxivClusteringP2P
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 35.747820820329736
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-s2s
name: MTEB BiorxivClusteringS2S
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 27.045143830596146
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.252999999999997
- type: map_at_10
value: 31.655916666666666
- type: map_at_100
value: 32.680749999999996
- type: map_at_1000
value: 32.79483333333334
- type: map_at_3
value: 29.43691666666666
- type: map_at_5
value: 30.717416666666665
- type: mrr_at_1
value: 28.602750000000004
- type: mrr_at_10
value: 35.56875
- type: mrr_at_100
value: 36.3595
- type: mrr_at_1000
value: 36.427749999999996
- type: mrr_at_3
value: 33.586166666666664
- type: mrr_at_5
value: 34.73641666666666
- type: ndcg_at_1
value: 28.602750000000004
- type: ndcg_at_10
value: 36.06933333333334
- type: ndcg_at_100
value: 40.70141666666667
- type: ndcg_at_1000
value: 43.24341666666667
- type: ndcg_at_3
value: 32.307916666666664
- type: ndcg_at_5
value: 34.129999999999995
- type: precision_at_1
value: 28.602750000000004
- type: precision_at_10
value: 6.097666666666667
- type: precision_at_100
value: 0.9809166666666668
- type: precision_at_1000
value: 0.13766666666666663
- type: precision_at_3
value: 14.628166666666667
- type: precision_at_5
value: 10.266916666666667
- type: recall_at_1
value: 24.252999999999997
- type: recall_at_10
value: 45.31916666666667
- type: recall_at_100
value: 66.03575000000001
- type: recall_at_1000
value: 83.94708333333334
- type: recall_at_3
value: 34.71941666666666
- type: recall_at_5
value: 39.46358333333333
- task:
type: Retrieval
dataset:
type: climate-fever
name: MTEB ClimateFEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.024000000000001
- type: map_at_10
value: 15.644
- type: map_at_100
value: 17.154
- type: map_at_1000
value: 17.345
- type: map_at_3
value: 13.028
- type: map_at_5
value: 14.251
- type: mrr_at_1
value: 19.674
- type: mrr_at_10
value: 29.826999999999998
- type: mrr_at_100
value: 30.935000000000002
- type: mrr_at_1000
value: 30.987
- type: mrr_at_3
value: 26.645000000000003
- type: mrr_at_5
value: 28.29
- type: ndcg_at_1
value: 19.674
- type: ndcg_at_10
value: 22.545
- type: ndcg_at_100
value: 29.207
- type: ndcg_at_1000
value: 32.912
- type: ndcg_at_3
value: 17.952
- type: ndcg_at_5
value: 19.363
- type: precision_at_1
value: 19.674
- type: precision_at_10
value: 7.212000000000001
- type: precision_at_100
value: 1.435
- type: precision_at_1000
value: 0.212
- type: precision_at_3
value: 13.507
- type: precision_at_5
value: 10.397
- type: recall_at_1
value: 9.024000000000001
- type: recall_at_10
value: 28.077999999999996
- type: recall_at_100
value: 51.403
- type: recall_at_1000
value: 72.406
- type: recall_at_3
value: 16.768
- type: recall_at_5
value: 20.737
- task:
type: Retrieval
dataset:
type: dbpedia-entity
name: MTEB DBPedia
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.012
- type: map_at_10
value: 17.138
- type: map_at_100
value: 24.146
- type: map_at_1000
value: 25.622
- type: map_at_3
value: 12.552
- type: map_at_5
value: 14.435
- type: mrr_at_1
value: 62.25000000000001
- type: mrr_at_10
value: 71.186
- type: mrr_at_100
value: 71.504
- type: mrr_at_1000
value: 71.514
- type: mrr_at_3
value: 69.333
- type: mrr_at_5
value: 70.408
- type: ndcg_at_1
value: 49.75
- type: ndcg_at_10
value: 37.76
- type: ndcg_at_100
value: 42.071
- type: ndcg_at_1000
value: 49.309
- type: ndcg_at_3
value: 41.644
- type: ndcg_at_5
value: 39.812999999999995
- type: precision_at_1
value: 62.25000000000001
- type: precision_at_10
value: 30.15
- type: precision_at_100
value: 9.753
- type: precision_at_1000
value: 1.9189999999999998
- type: precision_at_3
value: 45.667
- type: precision_at_5
value: 39.15
- type: recall_at_1
value: 8.012
- type: recall_at_10
value: 22.599
- type: recall_at_100
value: 48.068
- type: recall_at_1000
value: 71.328
- type: recall_at_3
value: 14.043
- type: recall_at_5
value: 17.124
- task:
type: Classification
dataset:
type: mteb/emotion
name: MTEB EmotionClassification
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 42.455
- type: f1
value: 37.59462649781862
- task:
type: Retrieval
dataset:
type: fever
name: MTEB FEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 58.092
- type: map_at_10
value: 69.586
- type: map_at_100
value: 69.968
- type: map_at_1000
value: 69.982
- type: map_at_3
value: 67.48100000000001
- type: map_at_5
value: 68.915
- type: mrr_at_1
value: 62.166
- type: mrr_at_10
value: 73.588
- type: mrr_at_100
value: 73.86399999999999
- type: mrr_at_1000
value: 73.868
- type: mrr_at_3
value: 71.6
- type: mrr_at_5
value: 72.99
- type: ndcg_at_1
value: 62.166
- type: ndcg_at_10
value: 75.27199999999999
- type: ndcg_at_100
value: 76.816
- type: ndcg_at_1000
value: 77.09700000000001
- type: ndcg_at_3
value: 71.36
- type: ndcg_at_5
value: 73.785
- type: precision_at_1
value: 62.166
- type: precision_at_10
value: 9.716
- type: precision_at_100
value: 1.065
- type: precision_at_1000
value: 0.11
- type: precision_at_3
value: 28.278
- type: precision_at_5
value: 18.343999999999998
- type: recall_at_1
value: 58.092
- type: recall_at_10
value: 88.73400000000001
- type: recall_at_100
value: 95.195
- type: recall_at_1000
value: 97.04599999999999
- type: recall_at_3
value: 78.45
- type: recall_at_5
value: 84.316
- task:
type: Retrieval
dataset:
type: fiqa
name: MTEB FiQA2018
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 16.649
- type: map_at_10
value: 26.457000000000004
- type: map_at_100
value: 28.169
- type: map_at_1000
value: 28.352
- type: map_at_3
value: 23.305
- type: map_at_5
value: 25.169000000000004
- type: mrr_at_1
value: 32.407000000000004
- type: mrr_at_10
value: 40.922
- type: mrr_at_100
value: 41.931000000000004
- type: mrr_at_1000
value: 41.983
- type: mrr_at_3
value: 38.786
- type: mrr_at_5
value: 40.205999999999996
- type: ndcg_at_1
value: 32.407000000000004
- type: ndcg_at_10
value: 33.314
- type: ndcg_at_100
value: 40.312
- type: ndcg_at_1000
value: 43.685
- type: ndcg_at_3
value: 30.391000000000002
- type: ndcg_at_5
value: 31.525
- type: precision_at_1
value: 32.407000000000004
- type: precision_at_10
value: 8.966000000000001
- type: precision_at_100
value: 1.6019999999999999
- type: precision_at_1000
value: 0.22200000000000003
- type: precision_at_3
value: 20.165
- type: precision_at_5
value: 14.722
- type: recall_at_1
value: 16.649
- type: recall_at_10
value: 39.117000000000004
- type: recall_at_100
value: 65.726
- type: recall_at_1000
value: 85.784
- type: recall_at_3
value: 27.914
- type: recall_at_5
value: 33.289
- task:
type: Retrieval
dataset:
type: hotpotqa
name: MTEB HotpotQA
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.253
- type: map_at_10
value: 56.16799999999999
- type: map_at_100
value: 57.06099999999999
- type: map_at_1000
value: 57.126
- type: map_at_3
value: 52.644999999999996
- type: map_at_5
value: 54.909
- type: mrr_at_1
value: 72.505
- type: mrr_at_10
value: 79.66
- type: mrr_at_100
value: 79.869
- type: mrr_at_1000
value: 79.88
- type: mrr_at_3
value: 78.411
- type: mrr_at_5
value: 79.19800000000001
- type: ndcg_at_1
value: 72.505
- type: ndcg_at_10
value: 65.094
- type: ndcg_at_100
value: 68.219
- type: ndcg_at_1000
value: 69.515
- type: ndcg_at_3
value: 59.99
- type: ndcg_at_5
value: 62.909000000000006
- type: precision_at_1
value: 72.505
- type: precision_at_10
value: 13.749
- type: precision_at_100
value: 1.619
- type: precision_at_1000
value: 0.179
- type: precision_at_3
value: 38.357
- type: precision_at_5
value: 25.313000000000002
- type: recall_at_1
value: 36.253
- type: recall_at_10
value: 68.744
- type: recall_at_100
value: 80.925
- type: recall_at_1000
value: 89.534
- type: recall_at_3
value: 57.535000000000004
- type: recall_at_5
value: 63.282000000000004
- task:
type: Classification
dataset:
type: mteb/imdb
name: MTEB ImdbClassification
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 80.82239999999999
- type: ap
value: 75.65895781725314
- type: f1
value: 80.75880969095746
- task:
type: Retrieval
dataset:
type: msmarco
name: MTEB MSMARCO
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 21.624
- type: map_at_10
value: 34.075
- type: map_at_100
value: 35.229
- type: map_at_1000
value: 35.276999999999994
- type: map_at_3
value: 30.245
- type: map_at_5
value: 32.42
- type: mrr_at_1
value: 22.264
- type: mrr_at_10
value: 34.638000000000005
- type: mrr_at_100
value: 35.744
- type: mrr_at_1000
value: 35.787
- type: mrr_at_3
value: 30.891000000000002
- type: mrr_at_5
value: 33.042
- type: ndcg_at_1
value: 22.264
- type: ndcg_at_10
value: 40.991
- type: ndcg_at_100
value: 46.563
- type: ndcg_at_1000
value: 47.743
- type: ndcg_at_3
value: 33.198
- type: ndcg_at_5
value: 37.069
- type: precision_at_1
value: 22.264
- type: precision_at_10
value: 6.5089999999999995
- type: precision_at_100
value: 0.9299999999999999
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_3
value: 14.216999999999999
- type: precision_at_5
value: 10.487
- type: recall_at_1
value: 21.624
- type: recall_at_10
value: 62.303
- type: recall_at_100
value: 88.124
- type: recall_at_1000
value: 97.08
- type: recall_at_3
value: 41.099999999999994
- type: recall_at_5
value: 50.381
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (en)
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 91.06703146374831
- type: f1
value: 90.86867815863172
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (de)
config: de
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 87.46970977740209
- type: f1
value: 86.36832872036588
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (es)
config: es
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 89.26951300867245
- type: f1
value: 88.93561193959502
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (fr)
config: fr
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 84.22799874725963
- type: f1
value: 84.30490069236556
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (hi)
config: hi
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 86.02007888131948
- type: f1
value: 85.39376041027991
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (th)
config: th
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 85.34900542495481
- type: f1
value: 85.39859673336713
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (en)
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 71.078431372549
- type: f1
value: 53.45071102002276
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (de)
config: de
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 65.85798816568047
- type: f1
value: 46.53112748993529
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (es)
config: es
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 67.96864576384256
- type: f1
value: 45.966703022829506
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (fr)
config: fr
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 61.31537738803633
- type: f1
value: 45.52601712835461
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (hi)
config: hi
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 66.29616349946218
- type: f1
value: 47.24166485726613
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (th)
config: th
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 67.51537070524412
- type: f1
value: 49.463476319014276
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (af)
config: af
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 57.06792199058508
- type: f1
value: 54.094921857502285
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (am)
config: am
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 51.960322797579025
- type: f1
value: 48.547371223370945
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (ar)
config: ar
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 54.425016812373904
- type: f1
value: 50.47069202054312
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (az)
config: az
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 59.798251513113655
- type: f1
value: 57.05013069086648
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (bn)
config: bn
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 59.37794216543376
- type: f1
value: 56.3607992649805
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (cy)
config: cy
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 46.56018829858777
- type: f1
value: 43.87319715715134
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (da)
config: da
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 62.9724277067922
- type: f1
value: 59.36480066245562
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (de)
config: de
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 62.72696704774715
- type: f1
value: 59.143595966615855
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (el)
config: el
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 61.5971755211836
- type: f1
value: 59.169445724946726
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (en)
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 70.29589778076665
- type: f1
value: 67.7577001808977
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (es)
config: es
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 66.31136516476126
- type: f1
value: 64.52032955983242
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (fa)
config: fa
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 65.54472091459314
- type: f1
value: 61.47903120066317
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (fi)
config: fi
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 61.45595158036314
- type: f1
value: 58.0891846024637
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (fr)
config: fr
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 65.47074646940149
- type: f1
value: 62.84830858877575
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (he)
config: he
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 58.046402151983855
- type: f1
value: 55.269074430533195
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (hi)
config: hi
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 64.06523201075991
- type: f1
value: 61.35339643021369
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (hu)
config: hu
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 60.954942837928726
- type: f1
value: 57.07035922704846
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (hy)
config: hy
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 57.404169468728995
- type: f1
value: 53.94259011839138
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (id)
config: id
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 64.16610625420309
- type: f1
value: 61.337103431499365
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (is)
config: is
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 52.262945527908535
- type: f1
value: 49.7610691598921
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (it)
config: it
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 65.54472091459314
- type: f1
value: 63.469099018440154
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (ja)
config: ja
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 68.22797579018157
- type: f1
value: 64.89098471083001
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (jv)
config: jv
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 50.847343644922674
- type: f1
value: 47.8536963168393
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (ka)
config: ka
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 48.45326160053799
- type: f1
value: 46.370078045805556
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (km)
config: km
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 42.83120376597175
- type: f1
value: 39.68948521599982
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (kn)
config: kn
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 57.5084061869536
- type: f1
value: 53.961876160401545
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (ko)
config: ko
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 63.7895090786819
- type: f1
value: 61.134223684676
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (lv)
config: lv
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 54.98991257565569
- type: f1
value: 52.579862862826296
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (ml)
config: ml
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 61.90316072629456
- type: f1
value: 58.203024538290336
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (mn)
config: mn
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 57.09818426361802
- type: f1
value: 54.22718458445455
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (ms)
config: ms
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 58.991257565568255
- type: f1
value: 55.84892781767421
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (my)
config: my
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 55.901143241425686
- type: f1
value: 52.25264332199797
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (nb)
config: nb
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 61.96368527236047
- type: f1
value: 58.927243876153454
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (nl)
config: nl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 65.64223268325489
- type: f1
value: 62.340453718379706
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (pl)
config: pl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 64.52589105581708
- type: f1
value: 61.661113187022174
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (pt)
config: pt
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 66.84599865501009
- type: f1
value: 64.59342572873005
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (ro)
config: ro
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 60.81035642232684
- type: f1
value: 57.5169089806797
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (ru)
config: ru
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 65.75991930060525
- type: f1
value: 62.89531115787938
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (sl)
config: sl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 56.51647612642906
- type: f1
value: 54.33154780100043
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (sq)
config: sq
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 57.985877605917956
- type: f1
value: 54.46187524463802
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (sv)
config: sv
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 65.03026227303296
- type: f1
value: 62.34377392877748
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (sw)
config: sw
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 53.567585743106925
- type: f1
value: 50.73770655983206
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (ta)
config: ta
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 57.2595830531271
- type: f1
value: 53.657327291708626
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (te)
config: te
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 57.82784129119032
- type: f1
value: 54.82518072665301
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (th)
config: th
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 64.06859448554137
- type: f1
value: 63.00185280500495
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (tl)
config: tl
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 58.91055817081371
- type: f1
value: 55.54116301224262
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (tr)
config: tr
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 63.54404841963686
- type: f1
value: 59.57650946030184
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (ur)
config: ur
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 59.27706792199059
- type: f1
value: 56.50010066083435
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (vi)
config: vi
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 64.0719569603228
- type: f1
value: 61.817075925647956
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (zh-CN)
config: zh-CN
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 68.23806321452591
- type: f1
value: 65.24917026029749
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (zh-TW)
config: zh-TW
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 62.53530598520511
- type: f1
value: 61.71131132295768
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (af)
config: af
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 63.04303967720243
- type: f1
value: 60.3950085685985
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (am)
config: am
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 56.83591123066578
- type: f1
value: 54.95059828830849
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (ar)
config: ar
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 59.62340282447881
- type: f1
value: 59.525159996498225
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (az)
config: az
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 60.85406859448555
- type: f1
value: 59.129299095681276
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (bn)
config: bn
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 62.76731674512441
- type: f1
value: 61.159560612627715
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (cy)
config: cy
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 50.181573638197705
- type: f1
value: 46.98422176289957
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (da)
config: da
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 68.92737054472092
- type: f1
value: 67.69135611952979
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (de)
config: de
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 69.18964357767318
- type: f1
value: 68.46106138186214
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (el)
config: el
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 67.0712844653665
- type: f1
value: 66.75545422473901
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (en)
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.4754539340955
- type: f1
value: 74.38427146553252
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (es)
config: es
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 69.82515131136518
- type: f1
value: 69.63516462173847
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (fa)
config: fa
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 68.70880968392737
- type: f1
value: 67.45420662567926
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (fi)
config: fi
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 65.95494283792871
- type: f1
value: 65.06191009049222
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (fr)
config: fr
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 68.75924680564896
- type: f1
value: 68.30833379585945
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (he)
config: he
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 63.806321452589096
- type: f1
value: 63.273048243765054
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (hi)
config: hi
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 67.68997982515133
- type: f1
value: 66.54703855381324
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (hu)
config: hu
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 66.46940147948891
- type: f1
value: 65.91017343463396
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (hy)
config: hy
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 59.49899125756556
- type: f1
value: 57.90333469917769
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (id)
config: id
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 67.9219905850706
- type: f1
value: 67.23169403762938
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (is)
config: is
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 56.486213853396094
- type: f1
value: 54.85282355583758
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (it)
config: it
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 69.04169468728985
- type: f1
value: 68.83833333320462
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (ja)
config: ja
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 73.88702084734365
- type: f1
value: 74.04474735232299
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (jv)
config: jv
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 56.63416274377943
- type: f1
value: 55.11332211687954
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (ka)
config: ka
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 52.23604572965702
- type: f1
value: 50.86529813991055
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (km)
config: km
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 46.62407531943511
- type: f1
value: 43.63485467164535
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (kn)
config: kn
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 59.15601882985878
- type: f1
value: 57.522837510959924
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (ko)
config: ko
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 69.84532616005382
- type: f1
value: 69.60021127179697
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (lv)
config: lv
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 56.65770006724949
- type: f1
value: 55.84219135523227
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (ml)
config: ml
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 66.53665097511768
- type: f1
value: 65.09087787792639
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (mn)
config: mn
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 59.31405514458642
- type: f1
value: 58.06135303831491
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (ms)
config: ms
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 64.88231338264964
- type: f1
value: 62.751099407787926
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (my)
config: my
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 58.86012104909213
- type: f1
value: 56.29118323058282
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (nb)
config: nb
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 67.37390719569602
- type: f1
value: 66.27922244885102
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (nl)
config: nl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 70.8675184936113
- type: f1
value: 70.22146529932019
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (pl)
config: pl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 68.2212508406187
- type: f1
value: 67.77454802056282
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (pt)
config: pt
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 68.18090114324143
- type: f1
value: 68.03737625431621
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (ro)
config: ro
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 64.65030262273034
- type: f1
value: 63.792945486912856
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (ru)
config: ru
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 69.48217888365838
- type: f1
value: 69.96028997292197
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (sl)
config: sl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 60.17821116341627
- type: f1
value: 59.3935969827171
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (sq)
config: sq
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 62.86146603900471
- type: f1
value: 60.133692735032376
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (sv)
config: sv
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 70.89441829186282
- type: f1
value: 70.03064076194089
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (sw)
config: sw
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 58.15063887020847
- type: f1
value: 56.23326278499678
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (ta)
config: ta
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 59.43846671149966
- type: f1
value: 57.70440450281974
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (te)
config: te
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 60.8507061197041
- type: f1
value: 59.22916396061171
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (th)
config: th
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 70.65568258238063
- type: f1
value: 69.90736239440633
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (tl)
config: tl
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 60.8843308675185
- type: f1
value: 59.30332663713599
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (tr)
config: tr
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 68.05312710154674
- type: f1
value: 67.44024062594775
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (ur)
config: ur
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 62.111634162743776
- type: f1
value: 60.89083013084519
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (vi)
config: vi
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 67.44115669132482
- type: f1
value: 67.92227541674552
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (zh-CN)
config: zh-CN
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 74.4687289845326
- type: f1
value: 74.16376793486025
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (zh-TW)
config: zh-TW
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 68.31876260928043
- type: f1
value: 68.5246745215607
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-p2p
name: MTEB MedrxivClusteringP2P
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 30.90431696479766
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-s2s
name: MTEB MedrxivClusteringS2S
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 27.259158476693774
- task:
type: Reranking
dataset:
type: mteb/mind_small
name: MTEB MindSmallReranking
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.28445330838555
- type: mrr
value: 31.15758529581164
- task:
type: Retrieval
dataset:
type: nfcorpus
name: MTEB NFCorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.353
- type: map_at_10
value: 11.565
- type: map_at_100
      value: 14.097
- type: map_at_1000
      value: 15.355
- type: map_at_3
value: 8.749
- type: map_at_5
value: 9.974
- type: mrr_at_1
value: 42.105
- type: mrr_at_10
value: 50.589
- type: mrr_at_100
      value: 51.187
- type: mrr_at_1000
value: 51.233
- type: mrr_at_3
value: 48.246
- type: mrr_at_5
value: 49.546
- type: ndcg_at_1
value: 40.402
- type: ndcg_at_10
      value: 31.01
- type: ndcg_at_100
value: 28.026
- type: ndcg_at_1000
value: 36.905
- type: ndcg_at_3
value: 35.983
- type: ndcg_at_5
value: 33.764
- type: precision_at_1
value: 42.105
- type: precision_at_10
value: 22.786
- type: precision_at_100
value: 6.916
- type: precision_at_1000
value: 1.981
- type: precision_at_3
value: 33.333
- type: precision_at_5
value: 28.731
- type: recall_at_1
value: 5.353
- type: recall_at_10
value: 15.039
- type: recall_at_100
value: 27.348
- type: recall_at_1000
value: 59.453
- type: recall_at_3
value: 9.792
- type: recall_at_5
value: 11.882
- task:
type: Retrieval
dataset:
type: nq
name: MTEB NQ
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 33.852
- type: map_at_10
value: 48.924
- type: map_at_100
value: 49.854
- type: map_at_1000
value: 49.886
- type: map_at_3
value: 44.9
- type: map_at_5
value: 47.387
- type: mrr_at_1
      value: 38.036
- type: mrr_at_10
value: 51.644
- type: mrr_at_100
value: 52.339
- type: mrr_at_1000
      value: 52.36
- type: mrr_at_3
value: 48.421
- type: mrr_at_5
      value: 50.469
- type: ndcg_at_1
      value: 38.007
- type: ndcg_at_10
      value: 56.293
- type: ndcg_at_100
value: 60.167
- type: ndcg_at_1000
      value: 60.916
- type: ndcg_at_3
      value: 48.904
- type: ndcg_at_5
value: 52.978
- type: precision_at_1
      value: 38.007
- type: precision_at_10
value: 9.041
- type: precision_at_100
      value: 1.12
- type: precision_at_1000
      value: 0.119
- type: precision_at_3
value: 22.084
- type: precision_at_5
value: 15.608
- type: recall_at_1
value: 33.852
- type: recall_at_10
value: 75.893
- type: recall_at_100
value: 92.589
- type: recall_at_1000
value: 98.153
- type: recall_at_3
value: 56.969
- type: recall_at_5
value: 66.283
- task:
type: Retrieval
dataset:
type: quora
name: MTEB QuoraRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 69.174
- type: map_at_10
value: 82.891
- type: map_at_100
value: 83.545
- type: map_at_1000
      value: 83.567
- type: map_at_3
value: 79.944
- type: map_at_5
value: 81.812
- type: mrr_at_1
      value: 79.68
- type: mrr_at_10
value: 86.279
- type: mrr_at_100
value: 86.39
- type: mrr_at_1000
value: 86.392
- type: mrr_at_3
value: 85.21
- type: mrr_at_5
      value: 85.93
- type: ndcg_at_1
      value: 79.69
- type: ndcg_at_10
value: 86.929
- type: ndcg_at_100
value: 88.266
- type: ndcg_at_1000
value: 88.428
- type: ndcg_at_3
value: 83.899
- type: ndcg_at_5
      value: 85.567
- type: precision_at_1
      value: 79.69
- type: precision_at_10
      value: 13.161
- type: precision_at_100
value: 1.513
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 36.603
- type: precision_at_5
value: 24.138
- type: recall_at_1
value: 69.174
- type: recall_at_10
value: 94.529
- type: recall_at_100
value: 99.15
- type: recall_at_1000
value: 99.925
- type: recall_at_3
      value: 85.862
- type: recall_at_5
value: 90.501
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering
name: MTEB RedditClustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 39.13064340585255
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering-p2p
name: MTEB RedditClusteringP2P
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 58.97884249325877
- task:
type: Retrieval
dataset:
type: scidocs
name: MTEB SCIDOCS
config: default
split: test
revision: None
metrics:
- type: map_at_1
      value: 3.468
- type: map_at_10
value: 7.865
- type: map_at_100
value: 9.332
- type: map_at_1000
value: 9.587
- type: map_at_3
      value: 5.8
- type: map_at_5
      value: 6.879
- type: mrr_at_1
value: 17.0
- type: mrr_at_10
value: 25.629
- type: mrr_at_100
value: 26.806
- type: mrr_at_1000
      value: 26.889
- type: mrr_at_3
value: 22.8
- type: mrr_at_5
value: 24.26
- type: ndcg_at_1
value: 17.0
- type: ndcg_at_10
value: 13.895
- type: ndcg_at_100
      value: 20.492
- type: ndcg_at_1000
      value: 25.76
- type: ndcg_at_3
      value: 13.348
- type: ndcg_at_5
value: 11.61
- type: precision_at_1
value: 17.0
- type: precision_at_10
      value: 7.09
- type: precision_at_100
value: 1.669
- type: precision_at_1000
value: 0.294
- type: precision_at_3
value: 12.3
- type: precision_at_5
value: 10.02
- type: recall_at_1
      value: 3.468
- type: recall_at_10
      value: 14.363
- type: recall_at_100
value: 33.875
- type: recall_at_1000
      value: 59.712
- type: recall_at_3
value: 7.483
- type: recall_at_5
value: 10.173
- task:
type: STS
dataset:
type: mteb/sickr-sts
name: MTEB SICK-R
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 83.04084311714061
- type: cos_sim_spearman
value: 77.51342467443078
- type: euclidean_pearson
value: 80.0321166028479
- type: euclidean_spearman
value: 77.29249114733226
- type: manhattan_pearson
value: 80.03105964262431
- type: manhattan_spearman
value: 77.22373689514794
- task:
type: STS
dataset:
type: mteb/sts12-sts
name: MTEB STS12
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.1680158034387
- type: cos_sim_spearman
value: 76.55983344071117
- type: euclidean_pearson
value: 79.75266678300143
- type: euclidean_spearman
value: 75.34516823467025
- type: manhattan_pearson
value: 79.75959151517357
- type: manhattan_spearman
value: 75.42330344141912
- task:
type: STS
dataset:
type: mteb/sts13-sts
name: MTEB STS13
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 76.48898993209346
- type: cos_sim_spearman
value: 76.96954120323366
- type: euclidean_pearson
value: 76.94139109279668
- type: euclidean_spearman
value: 76.85860283201711
- type: manhattan_pearson
value: 76.6944095091912
- type: manhattan_spearman
value: 76.61096912972553
- task:
type: STS
dataset:
type: mteb/sts14-sts
name: MTEB STS14
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 77.85082366246944
- type: cos_sim_spearman
value: 75.52053350101731
- type: euclidean_pearson
value: 77.1165845070926
- type: euclidean_spearman
value: 75.31216065884388
- type: manhattan_pearson
value: 77.06193941833494
- type: manhattan_spearman
value: 75.31003701700112
- task:
type: STS
dataset:
type: mteb/sts15-sts
name: MTEB STS15
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.36305246526497
- type: cos_sim_spearman
value: 87.11704613927415
- type: euclidean_pearson
value: 86.04199125810939
- type: euclidean_spearman
value: 86.51117572414263
- type: manhattan_pearson
value: 86.0805106816633
- type: manhattan_spearman
value: 86.52798366512229
- task:
type: STS
dataset:
type: mteb/sts16-sts
name: MTEB STS16
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 82.18536255599724
- type: cos_sim_spearman
value: 83.63377151025418
- type: euclidean_pearson
value: 83.24657467993141
- type: euclidean_spearman
value: 84.02751481993825
- type: manhattan_pearson
value: 83.11941806582371
- type: manhattan_spearman
value: 83.84251281019304
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (ko-ko)
config: ko-ko
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 78.95816528475514
- type: cos_sim_spearman
value: 78.86607380120462
- type: euclidean_pearson
value: 78.51268699230545
- type: euclidean_spearman
value: 79.11649316502229
- type: manhattan_pearson
value: 78.32367302808157
- type: manhattan_spearman
value: 78.90277699624637
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (ar-ar)
config: ar-ar
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 72.89126914997624
- type: cos_sim_spearman
value: 73.0296921832678
- type: euclidean_pearson
value: 71.50385903677738
- type: euclidean_spearman
value: 73.13368899716289
- type: manhattan_pearson
value: 71.47421463379519
- type: manhattan_spearman
value: 73.03383242946575
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-ar)
config: en-ar
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 59.22923684492637
- type: cos_sim_spearman
value: 57.41013211368396
- type: euclidean_pearson
value: 61.21107388080905
- type: euclidean_spearman
value: 60.07620768697254
- type: manhattan_pearson
value: 59.60157142786555
- type: manhattan_spearman
value: 59.14069604103739
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-de)
config: en-de
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 76.24345978774299
- type: cos_sim_spearman
value: 77.24225743830719
- type: euclidean_pearson
value: 76.66226095469165
- type: euclidean_spearman
value: 77.60708820493146
- type: manhattan_pearson
value: 76.05303324760429
- type: manhattan_spearman
value: 76.96353149912348
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-en)
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 85.50879160160852
- type: cos_sim_spearman
value: 86.43594662965224
- type: euclidean_pearson
value: 86.06846012826577
- type: euclidean_spearman
value: 86.02041395794136
- type: manhattan_pearson
value: 86.10916255616904
- type: manhattan_spearman
value: 86.07346068198953
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-tr)
config: en-tr
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 58.39803698977196
- type: cos_sim_spearman
value: 55.96910950423142
- type: euclidean_pearson
value: 58.17941175613059
- type: euclidean_spearman
value: 55.03019330522745
- type: manhattan_pearson
value: 57.333358138183286
- type: manhattan_spearman
value: 54.04614023149965
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (es-en)
config: es-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 70.98304089637197
- type: cos_sim_spearman
value: 72.44071656215888
- type: euclidean_pearson
value: 72.19224359033983
- type: euclidean_spearman
value: 73.89871188913025
- type: manhattan_pearson
value: 71.21098311547406
- type: manhattan_spearman
value: 72.93405764824821
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (es-es)
config: es-es
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 85.99792397466308
- type: cos_sim_spearman
value: 84.83824377879495
- type: euclidean_pearson
value: 85.70043288694438
- type: euclidean_spearman
value: 84.70627558703686
- type: manhattan_pearson
value: 85.89570850150801
- type: manhattan_spearman
value: 84.95806105313007
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (fr-en)
config: fr-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 72.21850322994712
- type: cos_sim_spearman
value: 72.28669398117248
- type: euclidean_pearson
value: 73.40082510412948
- type: euclidean_spearman
value: 73.0326539281865
- type: manhattan_pearson
value: 71.8659633964841
- type: manhattan_spearman
value: 71.57817425823303
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (it-en)
config: it-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 75.80921368595645
- type: cos_sim_spearman
value: 77.33209091229315
- type: euclidean_pearson
value: 76.53159540154829
- type: euclidean_spearman
value: 78.17960842810093
- type: manhattan_pearson
value: 76.13530186637601
- type: manhattan_spearman
value: 78.00701437666875
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (nl-en)
config: nl-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 74.74980608267349
- type: cos_sim_spearman
value: 75.37597374318821
- type: euclidean_pearson
value: 74.90506081911661
- type: euclidean_spearman
value: 75.30151613124521
- type: manhattan_pearson
value: 74.62642745918002
- type: manhattan_spearman
value: 75.18619716592303
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (en)
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 59.632662289205584
- type: cos_sim_spearman
value: 60.938543391610914
- type: euclidean_pearson
value: 62.113200529767056
- type: euclidean_spearman
value: 61.410312633261164
- type: manhattan_pearson
value: 61.75494698945686
- type: manhattan_spearman
value: 60.92726195322362
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (de)
config: de
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 45.283470551557244
- type: cos_sim_spearman
value: 53.44833015864201
- type: euclidean_pearson
value: 41.17892011120893
- type: euclidean_spearman
value: 53.81441383126767
- type: manhattan_pearson
value: 41.17482200420659
- type: manhattan_spearman
value: 53.82180269276363
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (es)
config: es
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 60.5069165306236
- type: cos_sim_spearman
value: 66.87803259033826
- type: euclidean_pearson
value: 63.5428979418236
- type: euclidean_spearman
value: 66.9293576586897
- type: manhattan_pearson
value: 63.59789526178922
- type: manhattan_spearman
value: 66.86555009875066
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (pl)
config: pl
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 28.23026196280264
- type: cos_sim_spearman
value: 35.79397812652861
- type: euclidean_pearson
value: 17.828102102767353
- type: euclidean_spearman
value: 35.721501145568894
- type: manhattan_pearson
value: 17.77134274219677
- type: manhattan_spearman
value: 35.98107902846267
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (tr)
config: tr
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 56.51946541393812
- type: cos_sim_spearman
value: 63.714686006214485
- type: euclidean_pearson
value: 58.32104651305898
- type: euclidean_spearman
value: 62.237110895702216
- type: manhattan_pearson
value: 58.579416468759185
- type: manhattan_spearman
value: 62.459738981727
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (ar)
config: ar
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 48.76009839569795
- type: cos_sim_spearman
value: 56.65188431953149
- type: euclidean_pearson
value: 50.997682160915595
- type: euclidean_spearman
value: 55.99910008818135
- type: manhattan_pearson
value: 50.76220659606342
- type: manhattan_spearman
value: 55.517347595391456
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (ru)
config: ru
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 51.232731157702425
- type: cos_sim_spearman
value: 59.89531877658345
- type: euclidean_pearson
value: 49.937914570348376
- type: euclidean_spearman
value: 60.220905659334036
- type: manhattan_pearson
value: 50.00987996844193
- type: manhattan_spearman
value: 60.081341480977926
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (zh)
config: zh
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 54.717524559088005
- type: cos_sim_spearman
value: 66.83570886252286
- type: euclidean_pearson
value: 58.41338625505467
- type: euclidean_spearman
value: 66.68991427704938
- type: manhattan_pearson
value: 58.78638572916807
- type: manhattan_spearman
value: 66.58684161046335
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (fr)
config: fr
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 73.2962042954962
- type: cos_sim_spearman
value: 76.58255504852025
- type: euclidean_pearson
value: 75.70983192778257
- type: euclidean_spearman
value: 77.4547684870542
- type: manhattan_pearson
value: 75.75565853870485
- type: manhattan_spearman
value: 76.90208974949428
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (de-en)
config: de-en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 54.47396266924846
- type: cos_sim_spearman
value: 56.492267162048606
- type: euclidean_pearson
value: 55.998505203070195
- type: euclidean_spearman
value: 56.46447012960222
- type: manhattan_pearson
value: 54.873172394430995
- type: manhattan_spearman
value: 56.58111534551218
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (es-en)
config: es-en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 69.87177267688686
- type: cos_sim_spearman
value: 74.57160943395763
- type: euclidean_pearson
value: 70.88330406826788
- type: euclidean_spearman
value: 74.29767636038422
- type: manhattan_pearson
value: 71.38245248369536
- type: manhattan_spearman
value: 74.53102232732175
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (it)
config: it
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 72.80225656959544
- type: cos_sim_spearman
value: 76.52646173725735
- type: euclidean_pearson
value: 73.95710720200799
- type: euclidean_spearman
value: 76.54040031984111
- type: manhattan_pearson
value: 73.89679971946774
- type: manhattan_spearman
value: 76.60886958161574
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (pl-en)
config: pl-en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 70.70844249898789
- type: cos_sim_spearman
value: 72.68571783670241
- type: euclidean_pearson
value: 72.38800772441031
- type: euclidean_spearman
value: 72.86804422703312
- type: manhattan_pearson
value: 71.29840508203515
- type: manhattan_spearman
value: 71.86264441749513
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (zh-en)
config: zh-en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 58.647478923935694
- type: cos_sim_spearman
value: 63.74453623540931
- type: euclidean_pearson
value: 59.60138032437505
- type: euclidean_spearman
value: 63.947930832166065
- type: manhattan_pearson
value: 58.59735509491861
- type: manhattan_spearman
value: 62.082503844627404
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (es-it)
config: es-it
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 65.8722516867162
- type: cos_sim_spearman
value: 71.81208592523012
- type: euclidean_pearson
value: 67.95315252165956
- type: euclidean_spearman
value: 73.00749822046009
- type: manhattan_pearson
value: 68.07884688638924
- type: manhattan_spearman
value: 72.34210325803069
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (de-fr)
config: de-fr
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 54.5405814240949
- type: cos_sim_spearman
value: 60.56838649023775
- type: euclidean_pearson
value: 53.011731611314104
- type: euclidean_spearman
value: 58.533194841668426
- type: manhattan_pearson
value: 53.623067729338494
- type: manhattan_spearman
value: 58.018756154446926
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (de-pl)
config: de-pl
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 13.611046866216112
- type: cos_sim_spearman
value: 28.238192909158492
- type: euclidean_pearson
value: 22.16189199885129
- type: euclidean_spearman
value: 35.012895679076564
- type: manhattan_pearson
value: 21.969771178698387
- type: manhattan_spearman
value: 32.456985088607475
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (fr-pl)
config: fr-pl
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 74.58077407011655
- type: cos_sim_spearman
value: 84.51542547285167
- type: euclidean_pearson
value: 74.64613843596234
- type: euclidean_spearman
value: 84.51542547285167
- type: manhattan_pearson
value: 75.15335973101396
- type: manhattan_spearman
value: 84.51542547285167
- task:
type: STS
dataset:
type: mteb/stsbenchmark-sts
name: MTEB STSBenchmark
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 82.0739825531578
- type: cos_sim_spearman
value: 84.01057479311115
- type: euclidean_pearson
value: 83.85453227433344
- type: euclidean_spearman
value: 84.01630226898655
- type: manhattan_pearson
value: 83.75323603028978
- type: manhattan_spearman
value: 83.89677983727685
- task:
type: Reranking
dataset:
type: mteb/scidocs-reranking
name: MTEB SciDocsRR
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 78.12945623123957
- type: mrr
value: 93.87738713719106
- task:
type: Retrieval
dataset:
type: scifact
name: MTEB SciFact
config: default
split: test
revision: None
metrics:
- type: map_at_1
      value: 52.983
- type: map_at_10
      value: 62.946
- type: map_at_100
value: 63.514
- type: map_at_1000
value: 63.554
- type: map_at_3
value: 60.183
- type: map_at_5
      value: 61.672
- type: mrr_at_1
value: 55.667
- type: mrr_at_10
value: 64.522
- type: mrr_at_100
value: 64.957
- type: mrr_at_1000
value: 64.995
- type: mrr_at_3
      value: 62.389
- type: mrr_at_5
value: 63.639
- type: ndcg_at_1
value: 55.667
- type: ndcg_at_10
value: 67.704
- type: ndcg_at_100
value: 70.299
- type: ndcg_at_1000
value: 71.241
- type: ndcg_at_3
value: 62.866
- type: ndcg_at_5
      value: 65.17
- type: precision_at_1
value: 55.667
- type: precision_at_10
value: 9.033
- type: precision_at_100
value: 1.053
- type: precision_at_1000
      value: 0.113
- type: precision_at_3
value: 24.444
- type: precision_at_5
value: 16.133
- type: recall_at_1
      value: 52.983
- type: recall_at_10
value: 80.656
- type: recall_at_100
value: 92.5
- type: recall_at_1000
value: 99.667
- type: recall_at_3
value: 67.744
- type: recall_at_5
value: 73.433
- task:
type: PairClassification
dataset:
type: mteb/sprintduplicatequestions-pairclassification
name: MTEB SprintDuplicateQuestions
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.72772277227723
- type: cos_sim_ap
value: 92.17845897992215
- type: cos_sim_f1
value: 85.9746835443038
- type: cos_sim_precision
value: 87.07692307692308
- type: cos_sim_recall
      value: 84.9
- type: dot_accuracy
value: 99.3039603960396
- type: dot_ap
value: 60.70244020124878
- type: dot_f1
value: 59.92742353551063
- type: dot_precision
value: 62.21743810548978
- type: dot_recall
value: 57.8
- type: euclidean_accuracy
value: 99.71683168316832
- type: euclidean_ap
value: 91.53997039964659
- type: euclidean_f1
value: 84.88372093023257
- type: euclidean_precision
value: 90.02242152466367
- type: euclidean_recall
      value: 80.3
- type: manhattan_accuracy
value: 99.72376237623763
- type: manhattan_ap
value: 91.80756777790289
- type: manhattan_f1
value: 85.48468106479157
- type: manhattan_precision
value: 85.8728557013118
- type: manhattan_recall
value: 85.1
- type: max_accuracy
value: 99.72772277227723
- type: max_ap
value: 92.17845897992215
- type: max_f1
value: 85.9746835443038
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering
name: MTEB StackExchangeClustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 53.52464042600003
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering-p2p
name: MTEB StackExchangeClusteringP2P
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 32.071631948736
- task:
type: Reranking
dataset:
type: mteb/stackoverflowdupquestions-reranking
name: MTEB StackOverflowDupQuestions
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.19552407604654
- type: mrr
value: 49.95269130379425
- task:
type: Summarization
dataset:
type: mteb/summeval
name: MTEB SummEval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.345293033095427
- type: cos_sim_spearman
value: 29.976931423258403
- type: dot_pearson
value: 27.047078008958408
- type: dot_spearman
value: 27.75894368380218
- task:
type: Retrieval
dataset:
type: trec-covid
name: MTEB TRECCOVID
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.22
- type: map_at_10
value: 1.706
- type: map_at_100
value: 9.634
- type: map_at_1000
value: 23.665
- type: map_at_3
      value: 0.595
- type: map_at_5
value: 0.95
- type: mrr_at_1
value: 86.0
- type: mrr_at_10
value: 91.8
- type: mrr_at_100
value: 91.8
- type: mrr_at_1000
value: 91.8
- type: mrr_at_3
value: 91.0
- type: mrr_at_5
value: 91.8
- type: ndcg_at_1
value: 80.0
- type: ndcg_at_10
value: 72.573
- type: ndcg_at_100
value: 53.954
- type: ndcg_at_1000
      value: 47.761
- type: ndcg_at_3
value: 76.173
- type: ndcg_at_5
value: 75.264
- type: precision_at_1
value: 86.0
- type: precision_at_10
value: 76.4
- type: precision_at_100
      value: 55.5
- type: precision_at_1000
value: 21.802
- type: precision_at_3
value: 81.333
- type: precision_at_5
value: 80.4
- type: recall_at_1
value: 0.22
- type: recall_at_10
value: 1.925
- type: recall_at_100
value: 12.762
- type: recall_at_1000
      value: 44.946
- type: recall_at_3
value: 0.634
- type: recall_at_5
value: 1.051
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (sqi-eng)
config: sqi-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 91.0
- type: f1
value: 88.55666666666666
- type: precision
value: 87.46166666666667
- type: recall
value: 91.0
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (fry-eng)
config: fry-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 57.22543352601156
- type: f1
value: 51.03220478943021
- type: precision
value: 48.8150289017341
- type: recall
value: 57.22543352601156
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (kur-eng)
config: kur-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 46.58536585365854
- type: f1
value: 39.66870798578116
- type: precision
value: 37.416085946573745
- type: recall
value: 46.58536585365854
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (tur-eng)
config: tur-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 89.7
- type: f1
      value: 86.78
- type: precision
value: 85.45333333333332
- type: recall
value: 89.7
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (deu-eng)
config: deu-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 97.4
- type: f1
value: 96.58333333333331
- type: precision
value: 96.2
- type: recall
      value: 97.4
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (nld-eng)
config: nld-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 92.4
- type: f1
value: 90.3
- type: precision
value: 89.31666666666668
- type: recall
value: 92.4
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ron-eng)
config: ron-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 86.9
- type: f1
value: 83.67190476190476
- type: precision
value: 82.23333333333332
- type: recall
value: 86.9
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ang-eng)
config: ang-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 50.0
- type: f1
value: 42.23229092632078
- type: precision
value: 39.851634683724235
- type: recall
value: 50.0
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ido-eng)
config: ido-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 76.3
- type: f1
value: 70.86190476190477
- type: precision
value: 68.68777777777777
- type: recall
value: 76.3
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (jav-eng)
config: jav-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 57.073170731707314
- type: f1
value: 50.658958927251604
- type: precision
value: 48.26480836236933
- type: recall
value: 57.073170731707314
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (isl-eng)
config: isl-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 68.2
- type: f1
value: 62.156507936507936
- type: precision
value: 59.84964285714286
- type: recall
value: 68.2
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (slv-eng)
config: slv-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 77.52126366950182
- type: f1
value: 72.8496210148701
- type: precision
value: 70.92171498003819
- type: recall
value: 77.52126366950182
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (cym-eng)
config: cym-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 70.78260869565217
- type: f1
value: 65.32422360248447
- type: precision
value: 63.063067367415194
- type: recall
value: 70.78260869565217
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (kaz-eng)
config: kaz-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 78.43478260869566
- type: f1
value: 73.02608695652172
- type: precision
value: 70.63768115942028
- type: recall
value: 78.43478260869566
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (est-eng)
config: est-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 60.9
- type: f1
value: 55.309753694581275
- type: precision
value: 53.130476190476195
- type: recall
value: 60.9
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (heb-eng)
config: heb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 72.9
- type: f1
value: 67.92023809523809
- type: precision
value: 65.82595238095237
- type: recall
      value: 72.9
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (gla-eng)
config: gla-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 46.80337756332931
- type: f1
value: 39.42174900558496
- type: precision
value: 36.97101116280851
- type: recall
value: 46.80337756332931
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (mar-eng)
config: mar-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 89.8
- type: f1
value: 86.79
- type: precision
value: 85.375
- type: recall
value: 89.8
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (lat-eng)
config: lat-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 47.2
- type: f1
value: 39.95484348984349
- type: precision
value: 37.561071428571424
- type: recall
      value: 47.2
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (bel-eng)
config: bel-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 87.8
- type: f1
value: 84.68190476190475
- type: precision
value: 83.275
- type: recall
value: 87.8
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (pms-eng)
config: pms-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 48.76190476190476
- type: f1
value: 42.14965986394558
- type: precision
value: 39.96743626743626
- type: recall
value: 48.76190476190476
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (gle-eng)
config: gle-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 66.1
- type: f1
value: 59.58580086580086
- type: precision
value: 57.150238095238095
- type: recall
      value: 66.1
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (pes-eng)
config: pes-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 87.3
- type: f1
value: 84.0
- type: precision
value: 82.48666666666666
- type: recall
value: 87.3
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (nob-eng)
config: nob-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 90.4
- type: f1
value: 87.79523809523809
- type: precision
value: 86.6
- type: recall
value: 90.4
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (bul-eng)
config: bul-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 87.0
- type: f1
value: 83.81
- type: precision
value: 82.36666666666666
- type: recall
value: 87.0
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (cbk-eng)
config: cbk-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 63.9
- type: f1
value: 57.76533189033189
- type: precision
value: 55.50595238095239
- type: recall
value: 63.9
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (hun-eng)
config: hun-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 76.1
- type: f1
value: 71.83690476190478
- type: precision
value: 70.04928571428573
- type: recall
value: 76.1
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (uig-eng)
config: uig-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 66.3
- type: f1
value: 59.32626984126984
- type: precision
value: 56.62535714285713
- type: recall
value: 66.3
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (rus-eng)
config: rus-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 90.6
- type: f1
value: 87.96333333333334
- type: precision
value: 86.73333333333333
- type: recall
      value: 90.6
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (spa-eng)
config: spa-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 93.1
- type: f1
      value: 91.1
- type: precision
value: 90.16666666666666
- type: recall
      value: 93.1
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (hye-eng)
config: hye-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 85.71428571428571
- type: f1
value: 82.29142600436403
- type: precision
value: 80.8076626877166
- type: recall
value: 85.71428571428571
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (tel-eng)
config: tel-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 88.88888888888889
- type: f1
value: 85.7834757834758
- type: precision
value: 84.43732193732193
- type: recall
value: 88.88888888888889
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (afr-eng)
config: afr-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 88.5
- type: f1
value: 85.67190476190476
- type: precision
value: 84.43333333333332
- type: recall
value: 88.5
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (mon-eng)
config: mon-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 82.72727272727273
- type: f1
value: 78.21969696969695
- type: precision
value: 76.18181818181819
- type: recall
value: 82.72727272727273
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (arz-eng)
config: arz-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 61.0062893081761
- type: f1
value: 55.13976240391334
- type: precision
value: 52.92112499659669
- type: recall
value: 61.0062893081761
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (hrv-eng)
config: hrv-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 89.5
- type: f1
value: 86.86666666666666
- type: precision
value: 85.69166666666668
- type: recall
value: 89.5
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (nov-eng)
config: nov-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 73.54085603112841
- type: f1
value: 68.56031128404669
- type: precision
value: 66.53047989623866
- type: recall
value: 73.54085603112841
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (gsw-eng)
config: gsw-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 43.58974358974359
- type: f1
value: 36.45299145299145
- type: precision
value: 33.81155881155882
- type: recall
value: 43.58974358974359
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (nds-eng)
config: nds-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 59.6
- type: f1
value: 53.264689754689755
- type: precision
value: 50.869166666666665
- type: recall
      value: 59.6
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ukr-eng)
config: ukr-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 85.2
- type: f1
value: 81.61666666666665
- type: precision
value: 80.02833333333335
- type: recall
value: 85.2
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (uzb-eng)
config: uzb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 63.78504672897196
- type: f1
value: 58.00029669188548
- type: precision
value: 55.815809968847354
- type: recall
value: 63.78504672897196
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (lit-eng)
config: lit-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 66.5
- type: f1
value: 61.518333333333345
- type: precision
value: 59.622363699102834
- type: recall
value: 66.5
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ina-eng)
config: ina-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 88.6
- type: f1
value: 85.60222222222221
- type: precision
value: 84.27916666666665
- type: recall
value: 88.6
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (lfn-eng)
config: lfn-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 58.7
- type: f1
value: 52.732375957375965
- type: precision
value: 50.63214035964035
- type: recall
      value: 58.7
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (zsm-eng)
config: zsm-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 92.1
- type: f1
value: 89.99666666666667
- type: precision
value: 89.03333333333333
- type: recall
      value: 92.1
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ita-eng)
config: ita-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 90.1
- type: f1
value: 87.55666666666667
- type: precision
value: 86.36166666666668
- type: recall
      value: 90.1
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (cmn-eng)
config: cmn-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 91.4
- type: f1
      value: 88.89
- type: precision
value: 87.71166666666666
- type: recall
value: 91.4
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (lvs-eng)
config: lvs-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 65.7
- type: f1
value: 60.67427750410509
- type: precision
value: 58.71785714285714
- type: recall
value: 65.7
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (glg-eng)
config: glg-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 85.4
- type: f1
value: 81.93190476190475
- type: precision
value: 80.37833333333333
- type: recall
      value: 85.4
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ceb-eng)
config: ceb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 47.833333333333336
- type: f1
value: 42.006625781625786
- type: precision
value: 40.077380952380956
- type: recall
value: 47.833333333333336
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (bre-eng)
config: bre-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 10.4
- type: f1
value: 8.24465007215007
- type: precision
value: 7.664597069597071
- type: recall
value: 10.4
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ben-eng)
config: ben-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 82.6
- type: f1
value: 77.76333333333334
- type: precision
value: 75.57833333333332
- type: recall
value: 82.6
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (swg-eng)
config: swg-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 52.67857142857143
- type: f1
value: 44.302721088435376
- type: precision
value: 41.49801587301587
- type: recall
value: 52.67857142857143
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (arq-eng)
config: arq-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 28.3205268935236
- type: f1
value: 22.426666605171157
- type: precision
value: 20.685900116470915
- type: recall
value: 28.3205268935236
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (kab-eng)
config: kab-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 22.7
- type: f1
value: 17.833970473970474
- type: precision
value: 16.407335164835164
- type: recall
value: 22.7
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (fra-eng)
config: fra-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 92.2
- type: f1
      value: 89.93
- type: precision
value: 88.87
- type: recall
value: 92.2
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (por-eng)
config: por-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 91.4
- type: f1
value: 89.25
- type: precision
value: 88.21666666666667
- type: recall
value: 91.4
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (tat-eng)
config: tat-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
      value: 69.2
- type: f1
value: 63.38269841269841
- type: precision
value: 61.14773809523809
- type: recall
      value: 69.2
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (oci-eng)
config: oci-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 48.8
- type: f1
value: 42.839915639915645
- type: precision
value: 40.770287114845935
- type: recall
value: 48.8
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (pol-eng)
config: pol-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 88.8
- type: f1
value: 85.90666666666668
- type: precision
value: 84.54166666666666
- type: recall
value: 88.8
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (war-eng)
config: war-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 46.6
- type: f1
value: 40.85892920804686
- type: precision
value: 38.838223114604695
- type: recall
value: 46.6
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (aze-eng)
config: aze-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 84.0
- type: f1
value: 80.14190476190475
- type: precision
value: 78.45333333333333
- type: recall
value: 84.0
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (vie-eng)
config: vie-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 90.5
- type: f1
value: 87.78333333333333
- type: precision
value: 86.5
- type: recall
value: 90.5
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (nno-eng)
config: nno-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 74.5
- type: f1
value: 69.48397546897547
- type: precision
value: 67.51869047619049
- type: recall
value: 74.5
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (cha-eng)
config: cha-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 32.846715328467155
- type: f1
value: 27.828177499710343
- type: precision
value: 26.63451511991658
- type: recall
value: 32.846715328467155
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (mhr-eng)
config: mhr-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 8.0
- type: f1
value: 6.07664116764988
- type: precision
value: 5.544177607179943
- type: recall
value: 8.0
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (dan-eng)
config: dan-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 87.6
- type: f1
value: 84.38555555555554
- type: precision
value: 82.91583333333334
- type: recall
value: 87.6
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ell-eng)
config: ell-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 87.5
- type: f1
value: 84.08333333333331
- type: precision
value: 82.47333333333333
- type: recall
value: 87.5
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (amh-eng)
config: amh-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 80.95238095238095
- type: f1
value: 76.13095238095238
- type: precision
value: 74.05753968253967
- type: recall
value: 80.95238095238095
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (pam-eng)
config: pam-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 8.799999999999999
- type: f1
value: 6.971422975172975
- type: precision
value: 6.557814916172301
- type: recall
value: 8.799999999999999
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (hsb-eng)
config: hsb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 44.099378881987576
- type: f1
value: 37.01649742022413
- type: precision
value: 34.69420618488942
- type: recall
value: 44.099378881987576
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (srp-eng)
config: srp-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 84.3
- type: f1
value: 80.32666666666667
- type: precision
value: 78.60666666666665
- type: recall
value: 84.3
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (epo-eng)
config: epo-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 92.5
- type: f1
value: 90.49666666666666
- type: precision
value: 89.56666666666668
- type: recall
value: 92.5
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (kzj-eng)
config: kzj-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 10.0
- type: f1
value: 8.268423529875141
- type: precision
value: 7.878118605532398
- type: recall
value: 10.0
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (awa-eng)
config: awa-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 79.22077922077922
- type: f1
value: 74.27128427128426
- type: precision
value: 72.28715728715729
- type: recall
value: 79.22077922077922
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (fao-eng)
config: fao-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 65.64885496183206
- type: f1
value: 58.87495456197747
- type: precision
value: 55.992366412213734
- type: recall
value: 65.64885496183206
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (mal-eng)
config: mal-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 96.06986899563319
- type: f1
value: 94.78408539543909
- type: precision
value: 94.15332362930616
- type: recall
value: 96.06986899563319
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ile-eng)
config: ile-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 77.2
- type: f1
value: 71.72571428571428
- type: precision
value: 69.41000000000001
- type: recall
value: 77.2
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (bos-eng)
config: bos-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 86.4406779661017
- type: f1
value: 83.2391713747646
- type: precision
value: 81.74199623352166
- type: recall
value: 86.4406779661017
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (cor-eng)
config: cor-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 8.4
- type: f1
value: 6.017828743398003
- type: precision
value: 5.4829865484756795
- type: recall
value: 8.4
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (cat-eng)
config: cat-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 83.5
- type: f1
value: 79.74833333333333
- type: precision
value: 78.04837662337664
- type: recall
value: 83.5
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (eus-eng)
config: eus-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 60.4
- type: f1
value: 54.467301587301584
- type: precision
value: 52.23242424242424
- type: recall
value: 60.4
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (yue-eng)
config: yue-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 74.9
- type: f1
value: 69.68699134199134
- type: precision
value: 67.59873015873016
- type: recall
value: 74.9
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (swe-eng)
config: swe-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 88.0
- type: f1
value: 84.9652380952381
- type: precision
value: 83.66166666666666
- type: recall
value: 88.0
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (dtp-eng)
config: dtp-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 9.1
- type: f1
value: 7.681244588744588
- type: precision
value: 7.370043290043291
- type: recall
value: 9.1
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (kat-eng)
config: kat-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 80.9651474530831
- type: f1
value: 76.84220605132133
- type: precision
value: 75.19606398962966
- type: recall
value: 80.9651474530831
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (jpn-eng)
config: jpn-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 86.9
- type: f1
value: 83.705
- type: precision
value: 82.3120634920635
- type: recall
value: 86.9
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (csb-eng)
config: csb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 29.64426877470356
- type: f1
value: 23.98763072676116
- type: precision
value: 22.506399397703746
- type: recall
value: 29.64426877470356
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (xho-eng)
config: xho-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 70.4225352112676
- type: f1
value: 62.84037558685445
- type: precision
value: 59.56572769953053
- type: recall
value: 70.4225352112676
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (orv-eng)
config: orv-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 19.64071856287425
- type: f1
value: 15.125271011207756
- type: precision
value: 13.865019261197494
- type: recall
value: 19.64071856287425
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ind-eng)
config: ind-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 90.2
- type: f1
value: 87.80666666666666
- type: precision
value: 86.70833333333331
- type: recall
value: 90.2
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (tuk-eng)
config: tuk-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 23.15270935960591
- type: f1
value: 18.407224958949097
- type: precision
value: 16.982385430661292
- type: recall
value: 23.15270935960591
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (max-eng)
config: max-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 55.98591549295775
- type: f1
value: 49.94718309859154
- type: precision
value: 47.77864154624717
- type: recall
value: 55.98591549295775
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (swh-eng)
config: swh-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 73.07692307692307
- type: f1
value: 66.74358974358974
- type: precision
value: 64.06837606837607
- type: recall
value: 73.07692307692307
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (hin-eng)
config: hin-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 94.89999999999999
- type: f1
value: 93.25
- type: precision
value: 92.43333333333332
- type: recall
value: 94.89999999999999
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (dsb-eng)
config: dsb-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 37.78705636743215
- type: f1
value: 31.63899658680452
- type: precision
value: 29.72264397629742
- type: recall
value: 37.78705636743215
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ber-eng)
config: ber-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 21.6
- type: f1
value: 16.91697302697303
- type: precision
value: 15.71225147075147
- type: recall
value: 21.6
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (tam-eng)
config: tam-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 85.01628664495115
- type: f1
value: 81.38514037536838
- type: precision
value: 79.83170466883823
- type: recall
value: 85.01628664495115
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (slk-eng)
config: slk-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 83.39999999999999
- type: f1
value: 79.96380952380952
- type: precision
value: 78.48333333333333
- type: recall
value: 83.39999999999999
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (tgl-eng)
config: tgl-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 83.2
- type: f1
value: 79.26190476190476
- type: precision
value: 77.58833333333334
- type: recall
value: 83.2
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ast-eng)
config: ast-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 75.59055118110236
- type: f1
value: 71.66854143232096
- type: precision
value: 70.30183727034121
- type: recall
value: 75.59055118110236
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (mkd-eng)
config: mkd-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 65.5
- type: f1
value: 59.26095238095238
- type: precision
value: 56.81909090909092
- type: recall
value: 65.5
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (khm-eng)
config: khm-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 55.26315789473685
- type: f1
value: 47.986523325858506
- type: precision
value: 45.33950006595436
- type: recall
value: 55.26315789473685
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ces-eng)
config: ces-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 82.89999999999999
- type: f1
value: 78.835
- type: precision
value: 77.04761904761905
- type: recall
value: 82.89999999999999
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (tzl-eng)
config: tzl-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 43.269230769230774
- type: f1
value: 36.20421245421245
- type: precision
value: 33.57371794871795
- type: recall
value: 43.269230769230774
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (urd-eng)
config: urd-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 88.0
- type: f1
value: 84.70666666666666
- type: precision
value: 83.23166666666665
- type: recall
value: 88.0
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (ara-eng)
config: ara-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 77.4
- type: f1
value: 72.54666666666667
- type: precision
value: 70.54318181818181
- type: recall
value: 77.4
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (kor-eng)
config: kor-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 78.60000000000001
- type: f1
value: 74.1588888888889
- type: precision
value: 72.30250000000001
- type: recall
value: 78.60000000000001
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (yid-eng)
config: yid-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 72.40566037735849
- type: f1
value: 66.82587328813744
- type: precision
value: 64.75039308176099
- type: recall
value: 72.40566037735849
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (fin-eng)
config: fin-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 73.8
- type: f1
value: 68.56357142857144
- type: precision
value: 66.3178822055138
- type: recall
value: 73.8
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (tha-eng)
config: tha-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 91.78832116788321
- type: f1
value: 89.3552311435523
- type: precision
value: 88.20559610705597
- type: recall
value: 91.78832116788321
- task:
type: BitextMining
dataset:
type: mteb/tatoeba-bitext-mining
name: MTEB Tatoeba (wuu-eng)
config: wuu-eng
split: test
revision: 9080400076fbadbb4c4dcb136ff4eddc40b42553
metrics:
- type: accuracy
value: 74.3
- type: f1
value: 69.05085581085581
- type: precision
value: 66.955
- type: recall
value: 74.3
- task:
type: Retrieval
dataset:
type: webis-touche2020
name: MTEB Touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 2.896
- type: map_at_10
value: 8.993
- type: map_at_100
value: 14.133999999999999
- type: map_at_1000
value: 15.668000000000001
- type: map_at_3
value: 5.862
- type: map_at_5
value: 7.17
- type: mrr_at_1
value: 34.694
- type: mrr_at_10
value: 42.931000000000004
- type: mrr_at_100
value: 44.81
- type: mrr_at_1000
value: 44.81
- type: mrr_at_3
value: 38.435
- type: mrr_at_5
value: 41.701
- type: ndcg_at_1
value: 31.633
- type: ndcg_at_10
value: 21.163
- type: ndcg_at_100
value: 33.306000000000004
- type: ndcg_at_1000
value: 45.275999999999996
- type: ndcg_at_3
value: 25.685999999999996
- type: ndcg_at_5
value: 23.732
- type: precision_at_1
value: 34.694
- type: precision_at_10
value: 17.755000000000003
- type: precision_at_100
value: 6.938999999999999
- type: precision_at_1000
value: 1.48
- type: precision_at_3
value: 25.85
- type: precision_at_5
value: 23.265
- type: recall_at_1
value: 2.896
- type: recall_at_10
value: 13.333999999999998
- type: recall_at_100
value: 43.517
- type: recall_at_1000
value: 79.836
- type: recall_at_3
value: 6.306000000000001
- type: recall_at_5
value: 8.825
- task:
type: Classification
dataset:
type: mteb/toxic_conversations_50k
name: MTEB ToxicConversationsClassification
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 69.3874
- type: ap
value: 13.829909072469423
- type: f1
value: 53.54534203543492
- task:
type: Classification
dataset:
type: mteb/tweet_sentiment_extraction
name: MTEB TweetSentimentExtractionClassification
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 62.62026032823995
- type: f1
value: 62.85251350485221
- task:
type: Clustering
dataset:
type: mteb/twentynewsgroups-clustering
name: MTEB TwentyNewsgroupsClustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 33.21527881409797
- task:
type: PairClassification
dataset:
type: mteb/twittersemeval2015-pairclassification
name: MTEB TwitterSemEval2015
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 84.97943613280086
- type: cos_sim_ap
value: 70.75454316885921
- type: cos_sim_f1
value: 65.38274012676743
- type: cos_sim_precision
value: 60.761214318078835
- type: cos_sim_recall
value: 70.76517150395777
- type: dot_accuracy
value: 79.0546581629612
- type: dot_ap
value: 47.3197121792147
- type: dot_f1
value: 49.20106524633821
- type: dot_precision
value: 42.45499808502489
- type: dot_recall
value: 58.49604221635884
- type: euclidean_accuracy
value: 85.08076533349228
- type: euclidean_ap
value: 70.95016106374474
- type: euclidean_f1
value: 65.43987900176455
- type: euclidean_precision
value: 62.64478764478765
- type: euclidean_recall
value: 68.49604221635884
- type: manhattan_accuracy
value: 84.93771234428085
- type: manhattan_ap
value: 70.63668388755362
- type: manhattan_f1
value: 65.23895401262398
- type: manhattan_precision
value: 56.946084218811485
- type: manhattan_recall
value: 76.35883905013192
- type: max_accuracy
value: 85.08076533349228
- type: max_ap
value: 70.95016106374474
- type: max_f1
value: 65.43987900176455
- task:
type: PairClassification
dataset:
type: mteb/twitterurlcorpus-pairclassification
name: MTEB TwitterURLCorpus
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 88.69096130709822
- type: cos_sim_ap
value: 84.82526278228542
- type: cos_sim_f1
value: 77.65485060585536
- type: cos_sim_precision
value: 75.94582658619167
- type: cos_sim_recall
value: 79.44256236526024
- type: dot_accuracy
value: 80.97954748321496
- type: dot_ap
value: 64.81642914145866
- type: dot_f1
value: 60.631996987229975
- type: dot_precision
value: 54.5897293631712
- type: dot_recall
value: 68.17831844779796
- type: euclidean_accuracy
value: 88.6987231730508
- type: euclidean_ap
value: 84.80003825477253
- type: euclidean_f1
value: 77.67194179854496
- type: euclidean_precision
value: 75.7128235122094
- type: euclidean_recall
value: 79.73514012935017
- type: manhattan_accuracy
value: 88.62692591298949
- type: manhattan_ap
value: 84.80451408255276
- type: manhattan_f1
value: 77.69888949572183
- type: manhattan_precision
value: 73.70311528631622
- type: manhattan_recall
value: 82.15275639051433
- type: max_accuracy
value: 88.6987231730508
- type: max_ap
value: 84.82526278228542
- type: max_f1
value: 77.69888949572183
language:
- multilingual
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- 'no'
- om
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sa
- sd
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- te
- th
- tl
- tr
- ug
- uk
- ur
- uz
- vi
- xh
- yi
- zh
license: mit
---
## Multilingual-E5-small
[Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
This model has 12 layers and the embedding size is 384.
## Usage
Below is an example to encode queries and passages from the MS-MARCO passage ranking dataset.
```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def average_pool(last_hidden_states: Tensor,
attention_mask: Tensor) -> Tensor:
last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
# Each input text should start with "query: " or "passage: ", even for non-English texts.
# For tasks other than retrieval, you can simply use the "query: " prefix.
input_texts = ['query: how much protein should a female eat',
'query: 南瓜的家常做法',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: 1.清炒南瓜丝 原料:嫩南瓜半个 调料:葱、盐、白糖、鸡精 做法: 1、南瓜用刀薄薄的削去表面一层皮,用勺子刮去瓤 2、擦成细丝(没有擦菜板就用刀慢慢切成细丝) 3、锅烧热放油,入葱花煸出香味 4、入南瓜丝快速翻炒一分钟左右,放盐、一点白糖和鸡精调味出锅 2.香葱炒南瓜 原料:南瓜1只 调料:香葱、蒜末、橄榄油、盐 做法: 1、将南瓜去皮,切成片 2、油锅8成热后,将蒜末放入爆香 3、爆香后,将南瓜片放入,翻炒 4、在翻炒的同时,可以不时地往锅里加水,但不要太多 5、放入盐,炒匀 6、南瓜差不多软和绵了之后,就可以关火 7、撒入香葱,即可出锅"]
tokenizer = AutoTokenizer.from_pretrained('intfloat/multilingual-e5-small')
model = AutoModel.from_pretrained('intfloat/multilingual-e5-small')
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
# normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```
## Supported Languages
This model is initialized from [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384)
and continually trained on a mixture of multilingual datasets.
It supports 100 languages from xlm-roberta,
but low-resource languages may see performance degradation.
## Training Details
**Initialization**: [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384)
**First stage**: contrastive pre-training with weak supervision
| Dataset | Weak supervision | # of text pairs |
|--------------------------------------------------------------------------------------------------------|---------------------------------------|-----------------|
| Filtered [mC4](https://huggingface.co/datasets/mc4) | (title, page content) | 1B |
| [CC News](https://huggingface.co/datasets/intfloat/multilingual_cc_news) | (title, news content) | 400M |
| [NLLB](https://huggingface.co/datasets/allenai/nllb) | translation pairs | 2.4B |
| [Wikipedia](https://huggingface.co/datasets/intfloat/wikipedia) | (hierarchical section title, passage) | 150M |
| Filtered [Reddit](https://www.reddit.com/) | (comment, response) | 800M |
| [S2ORC](https://github.com/allenai/s2orc) | (title, abstract) and citation pairs | 100M |
| [Stackexchange](https://stackexchange.com/) | (question, answer) | 50M |
| [xP3](https://huggingface.co/datasets/bigscience/xP3) | (input prompt, response) | 80M |
| [Miscellaneous unsupervised SBERT data](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) | - | 10M |
**Second stage**: supervised fine-tuning
| Dataset | Language | # of text pairs |
|----------------------------------------------------------------------------------------|--------------|-----------------|
| [MS MARCO](https://microsoft.github.io/msmarco/) | English | 500k |
| [NQ](https://github.com/facebookresearch/DPR) | English | 70k |
| [Trivia QA](https://github.com/facebookresearch/DPR) | English | 60k |
| [NLI from SimCSE](https://github.com/princeton-nlp/SimCSE) | English | <300k |
| [ELI5](https://huggingface.co/datasets/eli5) | English | 500k |
| [DuReader Retrieval](https://github.com/baidu/DuReader/tree/master/DuReader-Retrieval) | Chinese | 86k |
| [KILT Fever](https://huggingface.co/datasets/kilt_tasks) | English | 70k |
| [KILT HotpotQA](https://huggingface.co/datasets/kilt_tasks) | English | 70k |
| [SQuAD](https://huggingface.co/datasets/squad) | English | 87k |
| [Quora](https://huggingface.co/datasets/quora) | English | 150k |
| [Mr. TyDi](https://huggingface.co/datasets/castorini/mr-tydi) | 11 languages | 50k |
| [MIRACL](https://huggingface.co/datasets/miracl/miracl) | 16 languages | 40k |
For all labeled datasets, we only use their training sets for fine-tuning.
For other training details, please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).
## Benchmark Results on [Mr. TyDi](https://arxiv.org/abs/2108.08787)
| Model                 | Avg MRR@10 | ar   | bn   | en   | fi   | id   | ja   | ko   | ru   | sw   | te   | th   |
|-----------------------|------------|------|------|------|------|------|------|------|------|------|------|------|
| BM25                  | 33.3       | 36.7 | 41.3 | 15.1 | 28.8 | 38.2 | 21.7 | 28.1 | 32.9 | 39.6 | 42.4 | 41.7 |
| mDPR                  | 16.7       | 26.0 | 25.8 | 16.2 | 11.3 | 14.6 | 18.1 | 21.9 | 18.5 | 7.3  | 10.6 | 13.5 |
| BM25 + mDPR           | 41.7       | 49.1 | 53.5 | 28.4 | 36.5 | 45.5 | 35.5 | 36.2 | 42.7 | 40.5 | 42.0 | 49.2 |
| multilingual-e5-small | 64.4       | 71.5 | 66.3 | 54.5 | 57.7 | 63.2 | 55.4 | 54.3 | 60.8 | 65.4 | 89.1 | 70.1 |
| multilingual-e5-base  | 65.9       | 72.3 | 65.0 | 58.5 | 60.8 | 64.9 | 56.6 | 55.8 | 62.7 | 69.0 | 86.6 | 72.7 |
| multilingual-e5-large | **70.5**   | 77.5 | 73.2 | 60.8 | 66.8 | 68.5 | 62.5 | 61.6 | 65.8 | 72.7 | 90.2 | 76.2 |
## MTEB Benchmark Evaluation
Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results
on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB](https://arxiv.org/abs/2210.07316) benchmarks.
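For a quick local smoke test (as opposed to a full reproduction), something like the following may work; the task name is arbitrary and the `mteb` API varies across versions, so treat this as a sketch:
```python
from mteb import MTEB
from sentence_transformers import SentenceTransformer

# Note: this plain wrapper does not add the "query: "/"passage: " prefixes
# discussed in the FAQ below, so scores may come out lower than the reported ones.
model = SentenceTransformer('intfloat/multilingual-e5-small')

# Evaluate on a single small task as a smoke test; pass a list of MTEB task names.
evaluation = MTEB(tasks=["STS22"])
evaluation.run(model, output_folder="results/multilingual-e5-small")
```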
## Support for Sentence Transformers
Below is an example of usage with `sentence_transformers`.
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer('intfloat/multilingual-e5-small')
input_texts = [
'query: how much protein should a female eat',
'query: 南瓜的家常做法',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 i s 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or traini ng for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: 1.清炒南瓜丝 原料:嫩南瓜半个 调料:葱、盐、白糖、鸡精 做法: 1、南瓜用刀薄薄的削去表面一层皮 ,用勺子刮去瓤 2、擦成细丝(没有擦菜板就用刀慢慢切成细丝) 3、锅烧热放油,入葱花煸出香味 4、入南瓜丝快速翻炒一分钟左右, 放盐、一点白糖和鸡精调味出锅 2.香葱炒南瓜 原料:南瓜1只 调料:香葱、蒜末、橄榄油、盐 做法: 1、将南瓜去皮,切成片 2、油 锅8成热后,将蒜末放入爆香 3、爆香后,将南瓜片放入,翻炒 4、在翻炒的同时,可以不时地往锅里加水,但不要太多 5、放入盐,炒匀 6、南瓜差不多软和绵了之后,就可以关火 7、撒入香葱,即可出锅"
]
embeddings = model.encode(input_texts, normalize_embeddings=True)
```
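Because `normalize_embeddings=True` L2-normalizes the outputs, cosine similarities reduce to plain dot products. A short follow-up sketch, reusing the variables from the snippet above:
```python
# Query-passage similarity matrix, scaled by 100 as in the transformers example.
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```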
Package requirements:
`pip install sentence_transformers~=2.2.2`
Contributors: [michaelfeil](https://huggingface.co/michaelfeil)
## FAQ
**1. Do I need to add the prefix "query: " and "passage: " to input texts?**
Yes, this is how the model is trained; otherwise you will see a performance degradation.
Here are some rules of thumb:
- Use "query: " and "passage: " correspondingly for asymmetric tasks such as passage retrieval in open QA and ad-hoc information retrieval.
- Use the "query: " prefix for symmetric tasks such as semantic similarity, bitext mining, and paraphrase retrieval.
- Use the "query: " prefix if you want to use embeddings as features, such as for linear probing classification or clustering (see the sketch below).
**2. Why are my reproduced results slightly different from those reported in the model card?**
Different versions of `transformers` and `pytorch` could cause negligible but non-zero performance differences.
**3. Why do the cosine similarity scores fall between roughly 0.7 and 1.0?**
This is a known and expected behavior, as we use a low temperature of 0.01 for the InfoNCE contrastive loss.
For text embedding tasks like text retrieval or semantic similarity,
what matters is the relative order of the scores instead of the absolute values,
so this should not be an issue.
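For intuition, a standard form of the InfoNCE objective with temperature $\tau$ is sketched below; the notation is ours, so read it as a schematic rather than the exact training loss:

$$
\mathcal{L} = -\log \frac{\exp\left(\cos(q, p^{+}) / \tau\right)}{\exp\left(\cos(q, p^{+}) / \tau\right) + \sum_{p^{-}} \exp\left(\cos(q, p^{-}) / \tau\right)}
$$

With $\tau = 0.01$, a cosine gap of only 0.01 between the positive and a negative already shifts the corresponding logit by 1, so the model only needs to separate scores within a narrow high band rather than spread them across $[-1, 1]$.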
## Citation
If you find our paper or models helpful, please consider citing as follows:
```
@article{wang2022text,
title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
journal={arXiv preprint arXiv:2212.03533},
year={2022}
}
```
## Limitations
Long texts will be truncated to at most 512 tokens.
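If your inputs routinely exceed this limit, one common workaround (our suggestion, not part of the official usage) is to embed overlapping chunks and average the normalized chunk embeddings; a minimal sketch, assuming a fast tokenizer:
```python
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained('intfloat/multilingual-e5-small')
model = AutoModel.from_pretrained('intfloat/multilingual-e5-small')

def embed_long_text(text: str, prefix: str = "passage: ",
                    max_length: int = 512, stride: int = 128):
    # Split the tokenized text into overlapping 512-token windows.
    batch = tokenizer(prefix + text, max_length=max_length, stride=stride,
                      truncation=True, return_overflowing_tokens=True,
                      padding=True, return_tensors='pt')
    batch.pop("overflow_to_sample_mapping", None)  # not a model input
    outputs = model(**batch)
    # Average pooling over valid tokens, as in the usage example above.
    mask = batch['attention_mask'].unsqueeze(-1).float()
    emb = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
    emb = F.normalize(emb, p=2, dim=1)
    # Average the per-chunk embeddings and re-normalize.
    return F.normalize(emb.mean(dim=0, keepdim=True), p=2, dim=1)
```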
| 159,473 | [embedding vector omitted] |
vumichien/whisper-medium-jp | 2022-12-31T00:19:56.000Z | [
"transformers",
"pytorch",
"tensorboard",
"whisper",
"automatic-speech-recognition",
"whisper-event",
"generated_from_trainer",
"ja",
"dataset:mozilla-foundation/common_voice_11_0",
"doi:10.57967/hf/0338",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | vumichien | null | null | vumichien/whisper-medium-jp | 22 | 27,096 | transformers | 2022-12-07T07:04:41 | ---
language:
- ja
license: apache-2.0
tags:
- whisper-event
- generated_from_trainer
datasets:
- mozilla-foundation/common_voice_11_0
metrics:
- wer
model-index:
- name: Whisper Medium Japanese
results:
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: mozilla-foundation/common_voice_11_0 ja
type: mozilla-foundation/common_voice_11_0
config: ja
split: test
args: ja
metrics:
- type: wer
value: 9.035472972972974
name: WER
- type: cer
value: 5.61
name: CER
- task:
type: automatic-speech-recognition
name: Automatic Speech Recognition
dataset:
name: google/fleurs ja_jp
type: google/fleurs
config: ja_jp
split: test
metrics:
- type: wer
value: 13.56
name: WER
- type: cer
value: 8.01
name: CER
---
# Whisper Medium Japanese
This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the Japanese (`ja`) subset of the mozilla-foundation/common_voice_11_0 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3029
- Wer: 9.0355
## Model description
This model is [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) fine-tuned for Japanese automatic speech recognition. As reported in the metadata above, it reaches 9.04 WER / 5.61 CER on the Common Voice 11.0 Japanese test set and 13.56 WER / 8.01 CER on the google/fleurs `ja_jp` test set.
## Intended uses & limitations
The model is intended for transcribing Japanese speech. As a fine-tune of Whisper medium it inherits that model's general limitations, and its behavior on audio conditions far from Common Voice (e.g. noisy, far-field, or domain-specific recordings) has not been evaluated here.
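A minimal inference sketch using the `transformers` pipeline; the audio file name is a placeholder, and forcing the language/task is our suggestion rather than something specified by the training run:
```python
from transformers import pipeline

transcriber = pipeline("automatic-speech-recognition", model="vumichien/whisper-medium-jp")
# Force Japanese transcription so Whisper neither auto-detects nor translates.
transcriber.model.config.forced_decoder_ids = transcriber.tokenizer.get_decoder_prompt_ids(
    language="ja", task="transcribe"
)
print(transcriber("sample_ja.wav")["text"])  # placeholder audio file
```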
## Training and evaluation data
The model was fine-tuned on the Japanese (`ja`) split of mozilla-foundation/common_voice_11_0 and evaluated on its test split; an additional evaluation on the google/fleurs `ja_jp` test split is reported in the metadata above.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 5000
- mixed_precision_training: Native AMP
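For readers who want to mirror this setup, the hyperparameters above map roughly onto `Seq2SeqTrainingArguments` as sketched below; `output_dir` is a placeholder, the model/data wiring is omitted, and the Adam betas/epsilon are simply the `transformers` defaults:
```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-jp",  # placeholder
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    learning_rate=1e-5,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    seed=42,
    fp16=True,  # "Native AMP" mixed precision
    # Optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08 (library defaults).
)
```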
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.0392 | 3.03 | 1000 | 0.2023 | 10.1807 |
| 0.0036 | 7.01 | 2000 | 0.2478 | 9.4409 |
| 0.0013 | 10.04 | 3000 | 0.2791 | 9.1014 |
| 0.0002 | 14.01 | 4000 | 0.2970 | 9.0625 |
| 0.0002 | 17.04 | 5000 | 0.3029 | 9.0355 |
### Framework versions
- Transformers 4.26.0.dev0
- Pytorch 1.13.0+cu117
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2
| 2,451 | [embedding vector omitted]
-0.03277587890625,
0.046875,
-0.019439697265625,
-0.023223876953125,
0.061920166015625,
0.0113677978515625,
0.0411376953125,
-0.055694580078125,
-0.00916290283203125,
0.00011610984802246094,
0.021148681640625,
-0.0193328857421875,
-0.0411376953125,
0.0142974853515625,
-0.0106201171875,
-0.0165863037109375,
0.0089874267578125,
0.039581298828125,
-0.01490020751953125,
-0.0472412109375,
0.00429534912109375,
0.0148773193359375,
0.0160369873046875,
-0.00638580322265625,
-0.06341552734375,
0.00753021240234375,
-0.0068511962890625,
-0.03143310546875,
0.01055145263671875,
0.01213836669921875,
0.006618499755859375,
0.041290283203125,
0.043701171875,
0.01445770263671875,
0.0080108642578125,
0.0225677490234375,
0.0740966796875,
-0.0435791015625,
-0.0517578125,
-0.037322998046875,
0.031524658203125,
-0.0239410400390625,
-0.08343505859375,
0.040283203125,
0.0689697265625,
0.058013916015625,
-0.0064697265625,
0.03814697265625,
0.005992889404296875,
0.039642333984375,
-0.041046142578125,
0.058837890625,
-0.0352783203125,
-0.006961822509765625,
-0.0198211669921875,
-0.0718994140625,
0.00611114501953125,
0.04290771484375,
-0.0171966552734375,
0.004947662353515625,
0.021453857421875,
0.06573486328125,
-0.01317596435546875,
0.018829345703125,
0.01334381103515625,
-0.0007886886596679688,
0.0116424560546875,
0.0305328369140625,
0.037322998046875,
-0.06805419921875,
0.044677734375,
-0.04766845703125,
-0.0192413330078125,
-0.00017023086547851562,
-0.035400390625,
-0.07342529296875,
-0.0355224609375,
-0.033905029296875,
-0.028106689453125,
-0.004810333251953125,
0.07000732421875,
0.07257080078125,
-0.0443115234375,
-0.024322509765625,
0.0023097991943359375,
-0.036773681640625,
-0.0262298583984375,
-0.01824951171875,
0.02850341796875,
-0.00548553466796875,
-0.053314208984375,
0.00421142578125,
-0.0306854248046875,
0.0249176025390625,
-0.0222015380859375,
-0.03240966796875,
-0.00012421607971191406,
-0.016754150390625,
0.00920867919921875,
0.003505706787109375,
-0.045562744140625,
-0.0181884765625,
-0.00261688232421875,
-0.0006961822509765625,
0.002933502197265625,
0.0288238525390625,
-0.0452880859375,
0.025604248046875,
0.0207672119140625,
0.01244354248046875,
0.06097412109375,
-0.01110076904296875,
0.0203857421875,
-0.05426025390625,
0.0369873046875,
0.0162506103515625,
0.022735595703125,
-0.002910614013671875,
-0.020416259765625,
0.03436279296875,
0.0306549072265625,
-0.039825439453125,
-0.06707763671875,
-0.0196075439453125,
-0.082275390625,
0.016998291015625,
0.079345703125,
0.00594329833984375,
-0.02606201171875,
0.02880859375,
-0.028839111328125,
0.0203094482421875,
-0.02203369140625,
0.0257720947265625,
0.043121337890625,
0.0100250244140625,
-0.00861358642578125,
-0.043670654296875,
0.03363037109375,
0.0087890625,
-0.037567138671875,
-0.0234375,
0.01690673828125,
0.03448486328125,
0.00937652587890625,
0.034515380859375,
-0.0113525390625,
0.0278472900390625,
0.0233612060546875,
0.0088043212890625,
-0.0310211181640625,
-0.027801513671875,
-0.02587890625,
-0.0012798309326171875,
0.0078277587890625,
-0.047332763671875
]
] |
Jorgeutd/bert-large-uncased-finetuned-ner | 2023-11-06T14:28:33.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"token-classification",
"generated_from_trainer",
"en",
"dataset:conll2003",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | Jorgeutd | null | null | Jorgeutd/bert-large-uncased-finetuned-ner | 5 | 27,076 | transformers | 2022-03-02T23:29:04 | ---
language: en
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- conll2003
metrics:
- precision
- recall
- f1
- accuracy
widget:
- text: My name is Scott and I live in Columbus.
- text: My name is Scott and I am calling from Buffalo, NY. I would like to file a
    complaint with United Airlines.
- text: Apple was founded in 1976 by Steve Jobs, Steve Wozniak and Ronald Wayne.
base_model: bert-large-uncased
model-index:
- name: bert-large-uncased-finetuned-ner
results:
- task:
type: token-classification
name: Token Classification
dataset:
name: conll2003
type: conll2003
args: conll2003
metrics:
- type: precision
value: 0.9504719600222099
name: Precision
- type: recall
value: 0.9574896520863632
name: Recall
- type: f1
value: 0.9539679001337494
name: F1
- type: accuracy
value: 0.9885618059637473
name: Accuracy
---
# bert-large-uncased-finetuned-ner
This model is a fine-tuned version of [bert-large-uncased](https://huggingface.co/bert-large-uncased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0778
- Precision: 0.9505
- Recall: 0.9575
- F1: 0.9540
- Accuracy: 0.9886
## Model description
This is [bert-large-uncased](https://huggingface.co/bert-large-uncased) fine-tuned for named entity recognition on CoNLL-2003, predicting person (PER), organization (ORG), location (LOC), and miscellaneous (MISC) entity tags.
#### Limitations and bias
This model is limited by its training data: entity-annotated news articles from a specific span of time, so it may not generalize well to all use cases in other domains. Furthermore, the model occasionally tags subword tokens as entities, and post-processing of the results may be necessary to handle those cases (see the aggregation example below).
#### How to use
You can use this model with the Transformers *pipeline* for NER.
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Load the fine-tuned NER checkpoint and its tokenizer from the Hub
tokenizer = AutoTokenizer.from_pretrained("Jorgeutd/bert-large-uncased-finetuned-ner")
model = AutoModelForTokenClassification.from_pretrained("Jorgeutd/bert-large-uncased-finetuned-ner")

# Build a token-classification pipeline around the model
nlp = pipeline("ner", model=model, tokenizer=tokenizer)

example = "My name is Scott and I live in Ohio"
ner_results = nlp(example)
print(ner_results)
```
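Because predictions come back per subword token, it can help to let the pipeline merge them into whole-word entities. A minimal sketch, assuming the `aggregation_strategy` argument available in recent Transformers releases:
```python
from transformers import pipeline

# "simple" merges consecutive subword predictions into whole-word entity spans
nlp = pipeline(
    "ner",
    model="Jorgeutd/bert-large-uncased-finetuned-ner",
    aggregation_strategy="simple",
)
print(nlp("My name is Scott and I am calling from Buffalo, NY."))
```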
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a rough `TrainingArguments` equivalent is sketched after this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
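Since this card was generated from the 🤗 `Trainer`, the list above maps onto `TrainingArguments`. A minimal sketch, assuming the standard `Trainer` workflow (the output path is illustrative):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-large-uncased-finetuned-ner",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```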
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.1997 | 1.0 | 878 | 0.0576 | 0.9316 | 0.9257 | 0.9286 | 0.9837 |
| 0.04 | 2.0 | 1756 | 0.0490 | 0.9400 | 0.9513 | 0.9456 | 0.9870 |
| 0.0199 | 3.0 | 2634 | 0.0557 | 0.9436 | 0.9540 | 0.9488 | 0.9879 |
| 0.0112 | 4.0 | 3512 | 0.0602 | 0.9443 | 0.9569 | 0.9506 | 0.9881 |
| 0.0068 | 5.0 | 4390 | 0.0631 | 0.9451 | 0.9589 | 0.9520 | 0.9882 |
| 0.0044 | 6.0 | 5268 | 0.0638 | 0.9510 | 0.9567 | 0.9538 | 0.9885 |
| 0.003 | 7.0 | 6146 | 0.0722 | 0.9495 | 0.9560 | 0.9527 | 0.9885 |
| 0.0016 | 8.0 | 7024 | 0.0762 | 0.9491 | 0.9595 | 0.9543 | 0.9887 |
| 0.0018 | 9.0 | 7902 | 0.0769 | 0.9496 | 0.9542 | 0.9519 | 0.9883 |
| 0.0009 | 10.0 | 8780 | 0.0778 | 0.9505 | 0.9575 | 0.9540 | 0.9886 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.8.1+cu111
- Datasets 1.18.3
- Tokenizers 0.11.0
| 3,900 | [
[
-0.049163818359375,
-0.0465087890625,
0.011932373046875,
0.00794219970703125,
-0.0088043212890625,
-0.0211944580078125,
-0.0172271728515625,
-0.01430511474609375,
0.036376953125,
0.0254669189453125,
-0.045135498046875,
-0.0465087890625,
-0.05487060546875,
-0.006252288818359375,
-0.0185546875,
0.0775146484375,
0.01169586181640625,
-0.00177001953125,
0.0034637451171875,
-0.0008521080017089844,
-0.019439697265625,
-0.035064697265625,
-0.052703857421875,
-0.036468505859375,
0.021453857421875,
0.0230560302734375,
0.0645751953125,
0.04400634765625,
0.034210205078125,
0.025146484375,
-0.026611328125,
-0.003696441650390625,
-0.0192413330078125,
-0.028411865234375,
0.00797271728515625,
-0.0283966064453125,
-0.033782958984375,
0.0025272369384765625,
0.04388427734375,
0.0479736328125,
-0.01009368896484375,
0.02996826171875,
0.0072479248046875,
0.0440673828125,
-0.02947998046875,
0.01499176025390625,
-0.036590576171875,
0.019561767578125,
-0.00933074951171875,
-0.0187835693359375,
-0.009857177734375,
-0.018035888671875,
0.022125244140625,
-0.040313720703125,
0.036224365234375,
-0.00348663330078125,
0.099853515625,
0.021209716796875,
-0.0223846435546875,
-0.00003415346145629883,
-0.044769287109375,
0.06005859375,
-0.06573486328125,
0.0178985595703125,
0.035369873046875,
0.00774383544921875,
-0.0141143798828125,
-0.06280517578125,
-0.052490234375,
0.0106048583984375,
-0.017425537109375,
0.01482391357421875,
-0.018218994140625,
-0.01194000244140625,
0.041290283203125,
0.043670654296875,
-0.042877197265625,
0.0120086669921875,
-0.035430908203125,
-0.020538330078125,
0.043731689453125,
0.01479339599609375,
-0.0019121170043945312,
-0.034759521484375,
-0.03253173828125,
-0.017242431640625,
-0.02197265625,
0.02239990234375,
0.03607177734375,
0.020721435546875,
-0.024322509765625,
0.039703369140625,
-0.016082763671875,
0.044525146484375,
0.01317596435546875,
-0.00406646728515625,
0.058074951171875,
-0.01142120361328125,
-0.0257415771484375,
0.0006628036499023438,
0.0672607421875,
0.044403076171875,
0.01287078857421875,
0.006805419921875,
-0.0206146240234375,
-0.01690673828125,
0.0115814208984375,
-0.07086181640625,
-0.032562255859375,
0.025726318359375,
-0.04296875,
-0.020904541015625,
0.000904083251953125,
-0.052764892578125,
0.01047515869140625,
-0.032989501953125,
0.04730224609375,
-0.0259246826171875,
-0.01409149169921875,
0.0033588409423828125,
-0.01381683349609375,
0.0236053466796875,
0.01155853271484375,
-0.0787353515625,
0.0201873779296875,
0.032623291015625,
0.054656982421875,
0.01166534423828125,
-0.0134735107421875,
-0.0075531005859375,
0.0005016326904296875,
-0.0248870849609375,
0.05108642578125,
-0.0120697021484375,
-0.0233917236328125,
-0.001934051513671875,
0.03021240234375,
-0.00887298583984375,
-0.0254364013671875,
0.05096435546875,
-0.029388427734375,
0.0265045166015625,
-0.01763916015625,
-0.041229248046875,
-0.01409912109375,
0.028411865234375,
-0.0430908203125,
0.0926513671875,
0.01181793212890625,
-0.06890869140625,
0.0416259765625,
-0.0443115234375,
-0.011199951171875,
-0.00754547119140625,
-0.004150390625,
-0.060699462890625,
-0.00994873046875,
0.0296783447265625,
0.029937744140625,
-0.0195770263671875,
0.024078369140625,
-0.01244354248046875,
-0.03253173828125,
-0.0014667510986328125,
-0.03753662109375,
0.07952880859375,
0.0101165771484375,
-0.046173095703125,
0.005413055419921875,
-0.07427978515625,
0.0168609619140625,
0.020721435546875,
-0.029144287109375,
-0.0024852752685546875,
-0.0218963623046875,
0.0200347900390625,
0.015533447265625,
0.031341552734375,
-0.04248046875,
0.01305389404296875,
-0.03204345703125,
0.04510498046875,
0.05487060546875,
0.005710601806640625,
0.02398681640625,
-0.0325927734375,
0.012969970703125,
0.01922607421875,
0.022430419921875,
0.0043182373046875,
-0.03668212890625,
-0.075439453125,
-0.0191192626953125,
0.02862548828125,
0.03350830078125,
-0.0186309814453125,
0.0711669921875,
-0.01110076904296875,
-0.04815673828125,
-0.0305938720703125,
-0.0026035308837890625,
0.00859832763671875,
0.047637939453125,
0.03765869140625,
-0.005916595458984375,
-0.041290283203125,
-0.08203125,
0.007720947265625,
0.00397491455078125,
0.00670623779296875,
0.0240936279296875,
0.058197021484375,
-0.0287933349609375,
0.06976318359375,
-0.042877197265625,
-0.037628173828125,
-0.0018224716186523438,
0.00710296630859375,
0.055999755859375,
0.05682373046875,
0.052764892578125,
-0.045928955078125,
-0.0283050537109375,
-0.010223388671875,
-0.05328369140625,
0.0263824462890625,
-0.00975799560546875,
-0.0108642578125,
0.0093994140625,
0.0290069580078125,
-0.048980712890625,
0.0518798828125,
0.037445068359375,
-0.0285491943359375,
0.056182861328125,
-0.0469970703125,
-0.0017910003662109375,
-0.08087158203125,
0.0230865478515625,
0.0011749267578125,
-0.0022869110107421875,
-0.0287322998046875,
-0.01306915283203125,
0.004871368408203125,
-0.007720947265625,
-0.023223876953125,
0.046661376953125,
-0.0242462158203125,
0.007656097412109375,
-0.00022876262664794922,
-0.0162200927734375,
0.0012311935424804688,
0.050048828125,
0.01214599609375,
0.059539794921875,
0.04949951171875,
-0.041351318359375,
0.019256591796875,
0.025726318359375,
-0.042572021484375,
0.0284576416015625,
-0.0640869140625,
-0.0010004043579101562,
-0.00511932373046875,
0.006160736083984375,
-0.0726318359375,
-0.0197601318359375,
0.01082611083984375,
-0.03753662109375,
0.020172119140625,
-0.00865936279296875,
-0.035003662109375,
-0.056793212890625,
-0.016998291015625,
0.0048828125,
0.036468505859375,
-0.0355224609375,
0.033477783203125,
0.0097808837890625,
0.00669097900390625,
-0.052703857421875,
-0.06768798828125,
-0.0017976760864257812,
-0.008148193359375,
-0.04248046875,
0.0321044921875,
-0.006256103515625,
0.00539398193359375,
0.0077667236328125,
-0.01110076904296875,
-0.019744873046875,
-0.005146026611328125,
0.012725830078125,
0.019744873046875,
-0.0168914794921875,
0.00033855438232421875,
-0.004726409912109375,
-0.0256805419921875,
0.01206207275390625,
-0.010772705078125,
0.048187255859375,
-0.0175018310546875,
-0.0196685791015625,
-0.052215576171875,
0.006099700927734375,
0.0367431640625,
-0.0168609619140625,
0.068603515625,
0.0618896484375,
-0.03369140625,
0.00431060791015625,
-0.040069580078125,
-0.0237274169921875,
-0.03521728515625,
0.025482177734375,
-0.027557373046875,
-0.048828125,
0.06304931640625,
0.0085601806640625,
0.017852783203125,
0.06964111328125,
0.036529541015625,
-0.01128387451171875,
0.07684326171875,
0.02154541015625,
-0.0006847381591796875,
0.022247314453125,
-0.0673828125,
0.007659912109375,
-0.0616455078125,
-0.0419921875,
-0.035400390625,
-0.03936767578125,
-0.045745849609375,
-0.004955291748046875,
0.01434326171875,
0.005222320556640625,
-0.0570068359375,
0.0194854736328125,
-0.0628662109375,
0.0236968994140625,
0.076416015625,
0.0258026123046875,
0.0031414031982421875,
-0.0036945343017578125,
-0.0235748291015625,
-0.0158233642578125,
-0.050537109375,
-0.03045654296875,
0.09161376953125,
0.0232696533203125,
0.04852294921875,
-0.0026607513427734375,
0.055999755859375,
0.006511688232421875,
-0.0011968612670898438,
-0.046600341796875,
0.01806640625,
-0.007232666015625,
-0.07843017578125,
-0.022796630859375,
-0.0255126953125,
-0.0782470703125,
0.0193939208984375,
-0.037841796875,
-0.058837890625,
0.035186767578125,
0.01259613037109375,
-0.04095458984375,
0.040313720703125,
-0.048828125,
0.0726318359375,
-0.01482391357421875,
-0.0223236083984375,
0.00429534912109375,
-0.057861328125,
0.00870513916015625,
-0.0015172958374023438,
-0.00249481201171875,
-0.00733184814453125,
0.012115478515625,
0.06536865234375,
-0.04949951171875,
0.0499267578125,
-0.0153656005859375,
0.018463134765625,
0.022552490234375,
-0.0124359130859375,
0.047760009765625,
0.01532745361328125,
-0.01287078857421875,
0.0151214599609375,
0.006908416748046875,
-0.039825439453125,
-0.024261474609375,
0.06842041015625,
-0.08489990234375,
-0.04461669921875,
-0.05084228515625,
-0.043426513671875,
0.01324462890625,
0.036895751953125,
0.0457763671875,
0.050567626953125,
0.007476806640625,
0.030914306640625,
0.053070068359375,
-0.01214599609375,
0.049652099609375,
0.0164794921875,
-0.0014638900756835938,
-0.050048828125,
0.050384521484375,
0.01287078857421875,
0.00836944580078125,
0.008392333984375,
0.0200653076171875,
-0.03692626953125,
-0.0300140380859375,
-0.0254669189453125,
0.01476287841796875,
-0.0394287109375,
-0.0167388916015625,
-0.048675537109375,
-0.03387451171875,
-0.046112060546875,
-0.02325439453125,
-0.030426025390625,
-0.0243988037109375,
-0.037017822265625,
-0.0030975341796875,
0.0439453125,
0.0316162109375,
-0.006744384765625,
0.0265960693359375,
-0.047332763671875,
0.00917816162109375,
0.016845703125,
0.022918701171875,
0.0037860870361328125,
-0.051605224609375,
-0.00850677490234375,
0.004924774169921875,
-0.02569580078125,
-0.037322998046875,
0.0460205078125,
0.00585174560546875,
0.044769287109375,
0.04193115234375,
0.00203704833984375,
0.07318115234375,
-0.0284576416015625,
0.0572509765625,
0.0268707275390625,
-0.057861328125,
0.04217529296875,
-0.0235748291015625,
0.034881591796875,
0.05126953125,
0.028411865234375,
-0.026092529296875,
-0.0162353515625,
-0.0863037109375,
-0.07159423828125,
0.061065673828125,
0.023956298828125,
0.0145416259765625,
-0.00048160552978515625,
0.02264404296875,
-0.01959228515625,
0.0202178955078125,
-0.0709228515625,
-0.05328369140625,
-0.0220947265625,
-0.00795745849609375,
-0.015960693359375,
-0.01268768310546875,
-0.01163482666015625,
-0.051483154296875,
0.05291748046875,
0.01377105712890625,
0.038177490234375,
0.029022216796875,
0.007568359375,
-0.0004076957702636719,
0.00579071044921875,
0.038665771484375,
0.059234619140625,
-0.046173095703125,
0.0031490325927734375,
0.01435089111328125,
-0.0401611328125,
-0.0030231475830078125,
0.0226287841796875,
-0.0250244140625,
0.01303863525390625,
0.0258331298828125,
0.053924560546875,
0.0174102783203125,
-0.00643157958984375,
0.046295166015625,
0.004695892333984375,
-0.0421142578125,
-0.05419921875,
-0.00405120849609375,
0.0019016265869140625,
0.02362060546875,
0.0364990234375,
0.0312347412109375,
0.00537872314453125,
-0.0312347412109375,
0.01470184326171875,
0.0285797119140625,
-0.038726806640625,
-0.0043487548828125,
0.0670166015625,
0.005035400390625,
-0.0169219970703125,
0.0633544921875,
-0.007472991943359375,
-0.0372314453125,
0.0675048828125,
0.04345703125,
0.04852294921875,
-0.003597259521484375,
0.0015211105346679688,
0.0693359375,
0.035064697265625,
-0.0009698867797851562,
0.025634765625,
0.012542724609375,
-0.034027099609375,
-0.01288604736328125,
-0.050628662109375,
-0.01055908203125,
0.03704833984375,
-0.06304931640625,
0.0290069580078125,
-0.039947509765625,
-0.0411376953125,
0.00550079345703125,
0.01393890380859375,
-0.0665283203125,
0.038055419921875,
-0.0016775131225585938,
0.083251953125,
-0.06317138671875,
0.053680419921875,
0.042877197265625,
-0.041534423828125,
-0.07476806640625,
-0.01715087890625,
-0.015350341796875,
-0.057403564453125,
0.047637939453125,
0.0162200927734375,
0.0283355712890625,
0.01214599609375,
-0.037841796875,
-0.078857421875,
0.08184814453125,
0.0028839111328125,
-0.04083251953125,
0.002655029296875,
0.00875091552734375,
0.03692626953125,
-0.0022983551025390625,
0.0401611328125,
0.033447265625,
0.035369873046875,
0.0112457275390625,
-0.06121826171875,
0.006683349609375,
-0.0236358642578125,
0.0022125244140625,
0.0283050537109375,
-0.056854248046875,
0.08233642578125,
-0.018524169921875,
0.0205078125,
0.0084228515625,
0.052459716796875,
0.016265869140625,
0.01229095458984375,
0.0301971435546875,
0.0716552734375,
0.056915283203125,
-0.026458740234375,
0.0621337890625,
-0.0302886962890625,
0.062255859375,
0.0740966796875,
0.01727294921875,
0.060699462890625,
0.03515625,
-0.02423095703125,
0.035186767578125,
0.061767578125,
-0.0277252197265625,
0.0421142578125,
0.007015228271484375,
-0.00885009765625,
-0.0257568359375,
0.0174407958984375,
-0.053955078125,
0.01678466796875,
0.0190277099609375,
-0.047515869140625,
-0.0216064453125,
-0.01385498046875,
0.003139495849609375,
-0.01416778564453125,
-0.029998779296875,
0.033935546875,
-0.0132293701171875,
-0.031005859375,
0.053802490234375,
-0.00833892822265625,
0.051177978515625,
-0.04083251953125,
0.004070281982421875,
-0.0038509368896484375,
0.03314208984375,
-0.034393310546875,
-0.075439453125,
0.0166473388671875,
-0.0110626220703125,
-0.0187225341796875,
0.005542755126953125,
0.03143310546875,
-0.0151214599609375,
-0.0548095703125,
0.0040283203125,
0.0061798095703125,
0.00933074951171875,
0.010406494140625,
-0.068603515625,
-0.0164794921875,
0.00547027587890625,
-0.04193115234375,
0.004962921142578125,
0.0361328125,
0.0144195556640625,
0.03759765625,
0.06048583984375,
-0.005344390869140625,
0.01177978515625,
0.000164031982421875,
0.07464599609375,
-0.0531005859375,
-0.036407470703125,
-0.04638671875,
0.03326416015625,
-0.018798828125,
-0.057769775390625,
0.0557861328125,
0.06414794921875,
0.05328369140625,
-0.00789642333984375,
0.038848876953125,
-0.021575927734375,
0.035064697265625,
-0.0325927734375,
0.05548095703125,
-0.048553466796875,
-0.01371002197265625,
-0.0168914794921875,
-0.07440185546875,
-0.031097412109375,
0.06414794921875,
-0.033935546875,
0.0158233642578125,
0.042999267578125,
0.05706787109375,
0.00151824951171875,
0.0008378028869628906,
0.0022106170654296875,
0.01306915283203125,
0.01070404052734375,
0.03704833984375,
0.03228759765625,
-0.054931640625,
0.040740966796875,
-0.0543212890625,
-0.00399017333984375,
-0.01328277587890625,
-0.05572509765625,
-0.07281494140625,
-0.0367431640625,
-0.03192138671875,
-0.032562255859375,
-0.01519012451171875,
0.07147216796875,
0.059600830078125,
-0.0560302734375,
-0.0052947998046875,
-0.0143890380859375,
-0.0174407958984375,
-0.01097869873046875,
-0.0186004638671875,
0.05706787109375,
-0.0179290771484375,
-0.06365966796875,
-0.011077880859375,
-0.013153076171875,
0.0250701904296875,
-0.007808685302734375,
-0.0014448165893554688,
-0.0255126953125,
-0.0116729736328125,
0.026611328125,
0.0106048583984375,
-0.049774169921875,
-0.0277557373046875,
0.0009965896606445312,
-0.01006317138671875,
0.01971435546875,
0.01171112060546875,
-0.04150390625,
0.032257080078125,
0.0225982666015625,
0.0270233154296875,
0.06365966796875,
-0.00685882568359375,
0.0111236572265625,
-0.0440673828125,
0.0244140625,
0.00977325439453125,
0.031341552734375,
0.00873565673828125,
-0.0299072265625,
0.034759521484375,
0.0221099853515625,
-0.0615234375,
-0.053192138671875,
-0.0277099609375,
-0.0870361328125,
-0.007137298583984375,
0.080078125,
0.000054776668548583984,
-0.0313720703125,
0.0055084228515625,
-0.0099334716796875,
0.013427734375,
-0.035797119140625,
0.04296875,
0.05645751953125,
-0.00627899169921875,
0.0018053054809570312,
-0.04937744140625,
0.0325927734375,
0.0194244384765625,
-0.041290283203125,
-0.0174102783203125,
0.01593017578125,
0.033294677734375,
0.02703857421875,
0.026214599609375,
-0.0084686279296875,
0.01531219482421875,
0.01108551025390625,
0.0233001708984375,
-0.00705718994140625,
-0.00333404541015625,
-0.01849365234375,
0.00003415346145629883,
-0.0007414817810058594,
-0.0292816162109375
]
] |
microsoft/layoutlmv2-large-uncased | 2022-09-16T03:40:36.000Z | [
"transformers",
"pytorch",
"layoutlmv2",
"en",
"arxiv:2012.14740",
"license:cc-by-nc-sa-4.0",
"endpoints_compatible",
"region:us"
] | null | microsoft | null | null | microsoft/layoutlmv2-large-uncased | 6 | 27,055 | transformers | 2022-03-02T23:29:05 | ---
language: en
license: cc-by-nc-sa-4.0
---
# LayoutLMv2
**Multimodal (text + layout/format + image) pre-training for document AI**
## Introduction
LayoutLMv2 is an improved version of LayoutLM with new pre-training tasks that model the interaction among text, layout, and image in a single multi-modal framework. It outperforms strong baselines and achieves new state-of-the-art results on a wide variety of downstream visually-rich document understanding tasks, including FUNSD (0.7895 → 0.8420), CORD (0.9493 → 0.9601), SROIE (0.9524 → 0.9781), Kleister-NDA (0.834 → 0.852), RVL-CDIP (0.9443 → 0.9564), and DocVQA (0.7295 → 0.8672).
[LayoutLMv2: Multi-modal Pre-training for Visually-Rich Document Understanding](https://arxiv.org/abs/2012.14740)
Yang Xu, Yiheng Xu, Tengchao Lv, Lei Cui, Furu Wei, Guoxin Wang, Yijuan Lu, Dinei Florencio, Cha Zhang, Wanxiang Che, Min Zhang, Lidong Zhou, ACL 2021
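The card does not include a usage snippet; below is a minimal sketch, assuming the Transformers `LayoutLMv2Processor` and `LayoutLMv2Model` classes (the model needs `detectron2` for its visual backbone, the processor's built-in OCR needs `pytesseract`, and the input file name is illustrative):
```python
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2Model

# The processor runs OCR on the page image and prepares text, boxes, and pixels
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-large-uncased")
model = LayoutLMv2Model.from_pretrained("microsoft/layoutlmv2-large-uncased")

image = Image.open("document.png").convert("RGB")  # hypothetical document scan
encoding = processor(image, return_tensors="pt")
outputs = model(**encoding)
print(outputs.last_hidden_state.shape)
```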
| 922 | [
[
-0.01181793212890625,
-0.045440673828125,
0.031768798828125,
0.0204620361328125,
-0.022186279296875,
0.005748748779296875,
0.0184478759765625,
-0.0034732818603515625,
-0.007228851318359375,
0.0219268798828125,
-0.049407958984375,
-0.038665771484375,
-0.045654296875,
-0.026641845703125,
-0.0026531219482421875,
0.05975341796875,
-0.0205078125,
0.03289794921875,
-0.03936767578125,
-0.0032024383544921875,
-0.042449951171875,
-0.0190887451171875,
-0.0261993408203125,
-0.019317626953125,
0.0308990478515625,
0.008880615234375,
0.036407470703125,
0.031951904296875,
0.045806884765625,
0.0272674560546875,
0.00604248046875,
0.02703857421875,
-0.02862548828125,
0.001621246337890625,
0.0177001953125,
-0.0195465087890625,
-0.050994873046875,
0.01139068603515625,
0.04949951171875,
0.020660400390625,
0.0082244873046875,
-0.002452850341796875,
0.007366180419921875,
0.053924560546875,
-0.049774169921875,
0.016754150390625,
-0.017333984375,
0.01360321044921875,
-0.0157318115234375,
-0.022552490234375,
-0.046783447265625,
-0.006702423095703125,
-0.003124237060546875,
-0.08441162109375,
-0.005290985107421875,
0.0302276611328125,
0.0716552734375,
0.0250701904296875,
-0.054107666015625,
-0.01528167724609375,
-0.023468017578125,
0.057373046875,
-0.028594970703125,
0.050018310546875,
0.035064697265625,
0.01065826416015625,
-0.0016155242919921875,
-0.0814208984375,
-0.0419921875,
-0.003307342529296875,
-0.024658203125,
0.0404052734375,
0.0051422119140625,
0.014617919921875,
0.01995849609375,
0.01409912109375,
-0.06829833984375,
0.00836944580078125,
-0.036834716796875,
-0.032806396484375,
0.0377197265625,
0.00646209716796875,
0.0374755859375,
-0.0202789306640625,
-0.039459228515625,
-0.0223236083984375,
-0.01145172119140625,
0.01486968994140625,
0.01641845703125,
-0.0303192138671875,
-0.033203125,
0.0122222900390625,
0.01953125,
0.051116943359375,
-0.0090484619140625,
-0.0400390625,
0.049407958984375,
-0.04486083984375,
-0.042510986328125,
-0.025634765625,
0.060211181640625,
0.0170745849609375,
-0.019012451171875,
0.01464080810546875,
-0.006267547607421875,
-0.019439697265625,
0.0273590087890625,
-0.048797607421875,
-0.031982421875,
0.010528564453125,
-0.05712890625,
-0.01654052734375,
0.0162506103515625,
-0.037811279296875,
-0.035064697265625,
-0.04022216796875,
0.032958984375,
-0.06317138671875,
-0.0172271728515625,
-0.01102447509765625,
-0.020294189453125,
0.043060302734375,
0.08349609375,
-0.043365478515625,
-0.004039764404296875,
0.049560546875,
0.058502197265625,
-0.0200347900390625,
-0.043670654296875,
-0.045440673828125,
0.0121612548828125,
-0.025634765625,
0.066162109375,
-0.0267181396484375,
-0.04132080078125,
-0.0050811767578125,
-0.0014448165893554688,
-0.01396942138671875,
-0.0249481201171875,
0.052825927734375,
-0.0284423828125,
0.05682373046875,
-0.0111846923828125,
-0.018096923828125,
-0.01462554931640625,
0.03302001953125,
-0.058502197265625,
0.0836181640625,
0.02606201171875,
-0.067626953125,
0.031768798828125,
-0.05169677734375,
-0.036285400390625,
0.00528717041015625,
-0.02178955078125,
-0.040985107421875,
-0.00928497314453125,
0.00933837890625,
0.0144805908203125,
0.01042938232421875,
-0.006961822509765625,
-0.01024627685546875,
-0.0164947509765625,
-0.00745391845703125,
-0.0244903564453125,
0.055511474609375,
0.025970458984375,
0.0017976760864257812,
0.043701171875,
-0.06854248046875,
0.0107574462890625,
0.01114654541015625,
-0.035858154296875,
-0.0274200439453125,
-0.0261993408203125,
0.03863525390625,
0.03314208984375,
0.0293731689453125,
-0.0203704833984375,
0.0282135009765625,
-0.0203399658203125,
0.00815582275390625,
0.05303955078125,
-0.01678466796875,
0.05322265625,
-0.0212249755859375,
0.040924072265625,
-0.0031032562255859375,
0.0212554931640625,
-0.042724609375,
-0.0310211181640625,
-0.0391845703125,
-0.040557861328125,
0.01030731201171875,
0.051483154296875,
-0.0645751953125,
0.0256805419921875,
-0.00759124755859375,
-0.03619384765625,
-0.0247650146484375,
0.0105743408203125,
0.06298828125,
0.0350341796875,
0.0197296142578125,
-0.030517578125,
-0.0499267578125,
-0.049652099609375,
0.006923675537109375,
-0.00293731689453125,
0.0008673667907714844,
0.0200653076171875,
0.03839111328125,
-0.020660400390625,
0.03594970703125,
-0.0251922607421875,
-0.037750244140625,
-0.0250396728515625,
0.021759033203125,
-0.01187896728515625,
0.037139892578125,
0.046905517578125,
-0.07861328125,
-0.04547119140625,
-0.0036106109619140625,
-0.060211181640625,
0.0209503173828125,
-0.0159912109375,
-0.023834228515625,
0.029449462890625,
0.034698486328125,
-0.046173095703125,
0.049774169921875,
0.036407470703125,
-0.031707763671875,
0.038055419921875,
-0.04620361328125,
0.0069580078125,
-0.0989990234375,
-0.004985809326171875,
-0.005001068115234375,
-0.00963592529296875,
-0.047515869140625,
0.0026092529296875,
0.040802001953125,
-0.00862884521484375,
-0.0457763671875,
0.053802490234375,
-0.055694580078125,
-0.01513671875,
-0.0149383544921875,
0.00527191162109375,
0.033660888671875,
0.040008544921875,
-0.00922393798828125,
0.05157470703125,
0.032501220703125,
-0.01293182373046875,
0.033294677734375,
0.046356201171875,
-0.0303192138671875,
0.04052734375,
-0.0264129638671875,
0.018829345703125,
-0.029022216796875,
0.0491943359375,
-0.0904541015625,
-0.0262603759765625,
0.013427734375,
-0.03570556640625,
0.040069580078125,
0.01593017578125,
-0.03594970703125,
-0.032379150390625,
-0.054840087890625,
0.028961181640625,
0.0303192138671875,
-0.023345947265625,
0.06365966796875,
0.008148193359375,
0.007297515869140625,
-0.04205322265625,
-0.046173095703125,
-0.01277923583984375,
0.007030487060546875,
-0.06488037109375,
0.0252227783203125,
-0.00989532470703125,
-0.00449371337890625,
-0.0205078125,
-0.004436492919921875,
-0.00635528564453125,
0.005420684814453125,
0.03338623046875,
0.031463623046875,
-0.004398345947265625,
0.001468658447265625,
0.003936767578125,
-0.0146484375,
-0.01219940185546875,
-0.0032100677490234375,
0.060211181640625,
-0.0006837844848632812,
-0.043792724609375,
-0.0628662109375,
0.017181396484375,
0.041229248046875,
-0.039825439453125,
0.034698486328125,
0.07562255859375,
-0.024871826171875,
0.011962890625,
-0.02679443359375,
0.0238494873046875,
-0.0360107421875,
0.056793212890625,
-0.0313720703125,
-0.043212890625,
0.01244354248046875,
-0.0041351318359375,
0.00867462158203125,
0.03118896484375,
0.0374755859375,
-0.025848388671875,
0.06939697265625,
0.0400390625,
0.011627197265625,
0.051422119140625,
-0.028472900390625,
0.005588531494140625,
-0.0723876953125,
-0.07293701171875,
-0.01971435546875,
-0.02410888671875,
-0.0167694091796875,
-0.049407958984375,
0.022857666015625,
0.0158233642578125,
-0.01158905029296875,
0.0058746337890625,
-0.0282135009765625,
0.0124969482421875,
0.05059814453125,
-0.00765228271484375,
0.003055572509765625,
0.003993988037109375,
-0.0104522705078125,
-0.0134429931640625,
-0.0264129638671875,
-0.0286407470703125,
0.066162109375,
0.031524658203125,
0.065673828125,
0.006793975830078125,
0.0396728515625,
0.0255584716796875,
0.00493621826171875,
-0.054351806640625,
0.036865234375,
-0.00856781005859375,
-0.0220947265625,
-0.028106689453125,
0.005558013916015625,
-0.0645751953125,
0.01904296875,
-0.0166168212890625,
-0.04083251953125,
0.0025501251220703125,
0.0153961181640625,
-0.011566162109375,
0.01432037353515625,
-0.0677490234375,
0.07354736328125,
-0.046234130859375,
-0.0177459716796875,
-0.0045013427734375,
-0.050994873046875,
0.0185699462890625,
-0.004047393798828125,
0.01678466796875,
0.01334381103515625,
0.0010356903076171875,
0.053070068359375,
-0.04058837890625,
0.0286865234375,
-0.0223846435546875,
-0.0093841552734375,
-0.004241943359375,
0.0018815994262695312,
0.049407958984375,
0.01544189453125,
0.0059356689453125,
-0.015899658203125,
0.0002428293228149414,
-0.029052734375,
-0.055816650390625,
0.034332275390625,
-0.075439453125,
-0.0268096923828125,
-0.04180908203125,
-0.0428466796875,
0.00630950927734375,
0.0302734375,
0.041961669921875,
0.04266357421875,
-0.01934814453125,
0.0202484130859375,
0.05712890625,
-0.0233001708984375,
0.03131103515625,
0.01456451416015625,
-0.0225372314453125,
-0.042694091796875,
0.06671142578125,
-0.00588226318359375,
-0.0086822509765625,
0.045654296875,
-0.009979248046875,
-0.0257415771484375,
-0.041748046875,
-0.032501220703125,
0.006923675537109375,
-0.05242919921875,
-0.006977081298828125,
-0.0704345703125,
-0.032958984375,
-0.0467529296875,
-0.019561767578125,
-0.0260772705078125,
-0.01253509521484375,
-0.033111572265625,
0.0010118484497070312,
-0.01151275634765625,
0.0552978515625,
0.0115509033203125,
0.0213623046875,
-0.042877197265625,
0.025390625,
0.028228759765625,
0.0083770751953125,
-0.01507568359375,
-0.04998779296875,
-0.00844573974609375,
-0.01505279541015625,
-0.04290771484375,
-0.054473876953125,
0.0246734619140625,
0.0072021484375,
0.051361083984375,
0.043426513671875,
-0.026641845703125,
0.033935546875,
-0.0377197265625,
0.061614990234375,
0.040496826171875,
-0.052734375,
0.05035400390625,
-0.020050048828125,
0.022857666015625,
0.01824951171875,
0.0127716064453125,
-0.033355712890625,
0.01464080810546875,
-0.0421142578125,
-0.05322265625,
0.0791015625,
0.0120697021484375,
0.0025634765625,
0.044586181640625,
0.005207061767578125,
0.01253509521484375,
0.0142364501953125,
-0.05169677734375,
-0.02545166015625,
-0.057464599609375,
-0.02020263671875,
0.01458740234375,
-0.04254150390625,
-0.005886077880859375,
-0.0396728515625,
0.044586181640625,
-0.0028839111328125,
0.038330078125,
0.00008082389831542969,
-0.0255126953125,
0.0137939453125,
0.0059356689453125,
0.08575439453125,
0.05267333984375,
-0.02532958984375,
0.00734710693359375,
0.00662994384765625,
-0.053955078125,
0.00844573974609375,
0.0218505859375,
-0.002796173095703125,
0.004638671875,
0.05804443359375,
0.10589599609375,
-0.0025920867919921875,
-0.031829833984375,
0.057037353515625,
-0.01311492919921875,
-0.048797607421875,
-0.041656494140625,
-0.0007987022399902344,
-0.0188751220703125,
0.0074462890625,
0.01971435546875,
0.01110076904296875,
0.005695343017578125,
-0.03204345703125,
0.00522613525390625,
0.0322265625,
-0.0338134765625,
-0.0241851806640625,
0.054931640625,
0.023773193359375,
-0.0372314453125,
0.0286102294921875,
-0.027862548828125,
-0.032989501953125,
0.04522705078125,
0.043182373046875,
0.04425048828125,
-0.01261138916015625,
0.0294647216796875,
0.022918701171875,
0.012939453125,
0.0201873779296875,
0.0301513671875,
-0.0063323974609375,
-0.038818359375,
-0.01082611083984375,
-0.037017822265625,
-0.0162506103515625,
0.029083251953125,
-0.03204345703125,
0.024017333984375,
-0.044525146484375,
0.0167236328125,
-0.0071868896484375,
0.01412200927734375,
-0.05126953125,
0.026397705078125,
0.01910400390625,
0.08154296875,
-0.035369873046875,
0.08428955078125,
0.0936279296875,
-0.025604248046875,
-0.05645751953125,
0.01013946533203125,
0.0255279541015625,
-0.06976318359375,
0.060211181640625,
0.0258331298828125,
0.00555419921875,
-0.004764556884765625,
-0.033172607421875,
-0.06591796875,
0.10589599609375,
-0.007221221923828125,
-0.01788330078125,
-0.0218505859375,
-0.00841522216796875,
0.0328369140625,
-0.020904541015625,
0.037353515625,
0.005123138427734375,
0.0482177734375,
0.0153961181640625,
-0.056793212890625,
-0.00957489013671875,
-0.049224853515625,
0.0263671875,
-0.0010023117065429688,
-0.0677490234375,
0.0689697265625,
-0.00836944580078125,
-0.00589752197265625,
0.01216888427734375,
0.05224609375,
0.0241851806640625,
0.04205322265625,
0.039031982421875,
0.041778564453125,
0.06378173828125,
-0.0171661376953125,
0.075439453125,
0.0015478134155273438,
0.007965087890625,
0.08935546875,
-0.0208282470703125,
0.0313720703125,
0.0219268798828125,
-0.006763458251953125,
0.041259765625,
0.053924560546875,
-0.006793975830078125,
0.0565185546875,
-0.01641845703125,
0.00884246826171875,
-0.014373779296875,
0.006885528564453125,
-0.049560546875,
0.034393310546875,
0.01654052734375,
-0.023223876953125,
-0.00595855712890625,
0.02325439453125,
0.0088043212890625,
0.0005517005920410156,
-0.0035533905029296875,
0.058746337890625,
-0.016845703125,
-0.0310821533203125,
0.0235595703125,
0.0182037353515625,
0.05010986328125,
-0.061767578125,
0.0011415481567382812,
-0.0219268798828125,
0.003963470458984375,
-0.015594482421875,
-0.058502197265625,
-0.007904052734375,
-0.0382080078125,
-0.0173492431640625,
-0.029876708984375,
0.07403564453125,
-0.0224456787109375,
-0.0301055908203125,
0.016448974609375,
0.02740478515625,
-0.01168060302734375,
0.00212860107421875,
-0.057586669921875,
0.01358795166015625,
0.007617950439453125,
-0.0274658203125,
0.0283355712890625,
0.028778076171875,
-0.01678466796875,
0.0335693359375,
0.039794921875,
-0.0281982421875,
0.0113525390625,
0.01702880859375,
0.063720703125,
-0.025726318359375,
-0.04278564453125,
-0.047760009765625,
0.05615234375,
-0.0308990478515625,
-0.036224365234375,
0.0557861328125,
0.06353759765625,
0.06494140625,
-0.0175628662109375,
0.0684814453125,
0.0035915374755859375,
0.011962890625,
-0.05224609375,
0.06390380859375,
-0.07421875,
-0.013519287109375,
-0.0233612060546875,
-0.06146240234375,
-0.0166168212890625,
0.04376220703125,
-0.01309967041015625,
0.006114959716796875,
0.0731201171875,
0.044830322265625,
-0.0077972412109375,
-0.0231170654296875,
0.033294677734375,
-0.00324249267578125,
0.036895751953125,
0.032958984375,
0.06365966796875,
-0.04449462890625,
0.043060302734375,
-0.01418304443359375,
-0.0169219970703125,
-0.027191162109375,
-0.041473388671875,
-0.078125,
-0.062164306640625,
-0.013671875,
-0.0270538330078125,
-0.01062774658203125,
0.050506591796875,
0.06298828125,
-0.03668212890625,
0.01763916015625,
0.011199951171875,
0.008270263671875,
0.004486083984375,
-0.00989532470703125,
0.0455322265625,
-0.01401519775390625,
-0.047271728515625,
0.017578125,
0.0384521484375,
0.0222320556640625,
-0.01287841796875,
-0.0419921875,
-0.0230865478515625,
-0.0005087852478027344,
0.048736572265625,
0.006610870361328125,
-0.056243896484375,
0.0030040740966796875,
-0.01239013671875,
-0.0328369140625,
0.0220489501953125,
0.055694580078125,
-0.03289794921875,
0.03857421875,
0.059722900390625,
0.026519775390625,
0.032684326171875,
0.007068634033203125,
0.01568603515625,
-0.05914306640625,
0.0548095703125,
-0.01552581787109375,
0.036407470703125,
0.0202178955078125,
-0.02374267578125,
0.029815673828125,
0.0286712646484375,
-0.0270233154296875,
-0.05126953125,
0.0195159912109375,
-0.09271240234375,
-0.0240936279296875,
0.0841064453125,
-0.0035247802734375,
-0.0160980224609375,
-0.005130767822265625,
-0.05804443359375,
0.024688720703125,
-0.022552490234375,
0.024078369140625,
0.038726806640625,
-0.000016510486602783203,
-0.03155517578125,
-0.0278472900390625,
0.047271728515625,
0.0297088623046875,
-0.08203125,
-0.042724609375,
0.044097900390625,
-0.00568389892578125,
0.0264129638671875,
0.0714111328125,
-0.0030574798583984375,
0.035552978515625,
-0.00888824462890625,
0.011871337890625,
-0.041534423828125,
-0.0316162109375,
-0.003940582275390625,
0.008636474609375,
-0.005458831787109375,
-0.026702880859375
]
] |
typeform/distilbert-base-uncased-mnli | 2023-03-22T08:49:00.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"distilbert",
"text-classification",
"zero-shot-classification",
"en",
"dataset:multi_nli",
"arxiv:1910.09700",
"arxiv:2105.09680",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-classification | typeform | null | null | typeform/distilbert-base-uncased-mnli | 32 | 26,991 | transformers | 2022-03-02T23:29:05 | ---
language: en
pipeline_tag: zero-shot-classification
tags:
- distilbert
datasets:
- multi_nli
metrics:
- accuracy
---
# DistilBERT base model (uncased)
## Table of Contents
- [Model Details](#model-details)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
## Model Details
**Model Description:** This is the [uncased DistilBERT model](https://huggingface.co/distilbert-base-uncased) fine-tuned on [Multi-Genre Natural Language Inference](https://huggingface.co/datasets/multi_nli) (MNLI) dataset for the zero-shot classification task.
- **Developed by:** The [Typeform](https://www.typeform.com/) team.
- **Model Type:** Zero-Shot Classification
- **Language(s):** English
- **License:** Unknown
- **Parent Model:** See the [distilbert base uncased model](https://huggingface.co/distilbert-base-uncased) for more information about the DistilBERT base model.
## How to Get Started with the Model
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the MNLI-fine-tuned DistilBERT checkpoint and its tokenizer
tokenizer = AutoTokenizer.from_pretrained("typeform/distilbert-base-uncased-mnli")
model = AutoModelForSequenceClassification.from_pretrained("typeform/distilbert-base-uncased-mnli")
```
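Since the intended task is zero-shot classification, the pipeline API is usually the most convenient entry point. A minimal sketch (the candidate labels are illustrative):
```python
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="typeform/distilbert-base-uncased-mnli",
)
result = classifier(
    "I would like to book a flight to Paris next week.",
    candidate_labels=["travel", "cooking", "finance"],  # hypothetical label set
)
print(result["labels"][0], result["scores"][0])
```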
## Uses
This model can be used for zero-shot text classification: given an input text and a set of candidate labels, it scores each label by how strongly the text entails it.
## Risks, Limitations and Biases
**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
## Training
#### Training Data
This DistilBERT-uncased model was fine-tuned on the Multi-Genre Natural Language Inference [(MultiNLI)](https://huggingface.co/datasets/multi_nli) corpus, a crowd-sourced collection of 433k sentence pairs annotated with textual entailment information. The corpus covers a range of genres of spoken and written text, and supports a distinctive cross-genre generalization evaluation.
This model is also **not** case-sensitive, i.e., it does not distinguish between "english" and "English".
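For reference, the corpus can be inspected directly with the 🤗 Datasets library (a short sketch; assumes the `datasets` package is installed):
```python
from datasets import load_dataset

# MultiNLI ships with train, validation_matched and validation_mismatched splits
mnli = load_dataset("multi_nli")

example = mnli["train"][0]
print(example["premise"])
print(example["hypothesis"])
print(mnli["train"].features["label"])  # entailment / neutral / contradiction
```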
#### Training Procedure
Training was done on a [p3.2xlarge](https://aws.amazon.com/ec2/instance-types/p3/) AWS EC2 instance with the following hyperparameters:
```
$ python run_glue.py \
--model_name_or_path distilbert-base-uncased \
--task_name mnli \
--do_train \
--do_eval \
--max_seq_length 128 \
--per_device_train_batch_size 16 \
--learning_rate 2e-5 \
--num_train_epochs 5 \
--output_dir /tmp/distilbert-base-uncased_mnli/
```
## Evaluation
#### Evaluation Results
When fine-tuned on downstream tasks, this model achieves the following results:
- **Epochs:** 5.0
- **Evaluation Accuracy:** 0.8206875508543532
- **Evaluation Loss:** 0.8706700205802917
- **Evaluation Runtime:** 17.8278
- **Evaluation Samples per second:** 551.498
MNLI and MNLI-mm results:
| Metric | MNLI | MNLI-mm |
|:----:|:----:|:----:|
| Accuracy | 82.0 | 82.0 |
## Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/pdf/2105.09680.pdf).
**Hardware Type:** 1 NVIDIA Tesla V100 GPU
**Hours used:** Unknown
**Cloud Provider:** AWS EC2 P3
**Compute Region:** Unknown
**Carbon Emitted:** Unknown (power consumption × time × carbon intensity of the local power grid)
| 3,882 | [
[
-0.0269775390625,
-0.057525634765625,
0.035491943359375,
0.015106201171875,
-0.01708984375,
-0.006504058837890625,
-0.01338958740234375,
-0.032470703125,
-0.00801849365234375,
0.02655029296875,
-0.043914794921875,
-0.04144287109375,
-0.06414794921875,
0.00609588623046875,
-0.0208892822265625,
0.09112548828125,
0.00716400146484375,
0.0013446807861328125,
-0.004688262939453125,
0.0004949569702148438,
-0.0189208984375,
-0.057830810546875,
-0.03594970703125,
-0.01445770263671875,
0.0301666259765625,
0.006618499755859375,
0.0279998779296875,
0.037445068359375,
0.0325927734375,
0.026824951171875,
-0.0172576904296875,
-0.01024627685546875,
-0.031890869140625,
-0.0211334228515625,
0.01055908203125,
-0.0177154541015625,
-0.02862548828125,
0.01490020751953125,
0.03302001953125,
0.054046630859375,
-0.01551055908203125,
0.0309600830078125,
0.01486968994140625,
0.043792724609375,
-0.057342529296875,
0.01751708984375,
-0.056884765625,
0.0016736984252929688,
-0.016632080078125,
0.01026153564453125,
-0.034698486328125,
0.0002193450927734375,
0.004123687744140625,
-0.041259765625,
0.03045654296875,
0.006679534912109375,
0.0792236328125,
0.035919189453125,
-0.032684326171875,
-0.01519775390625,
-0.056121826171875,
0.0810546875,
-0.076171875,
0.018280029296875,
0.0255889892578125,
0.01275634765625,
-0.00022327899932861328,
-0.05169677734375,
-0.054046630859375,
-0.0112457275390625,
-0.01409912109375,
0.0240020751953125,
-0.022430419921875,
0.00838470458984375,
0.0295562744140625,
0.04296875,
-0.03924560546875,
0.0215911865234375,
-0.031768798828125,
-0.007595062255859375,
0.051239013671875,
0.0086822509765625,
0.00560760498046875,
-0.021728515625,
-0.032562255859375,
-0.0018033981323242188,
-0.01904296875,
0.018341064453125,
0.0369873046875,
0.0282745361328125,
-0.0272216796875,
0.036346435546875,
-0.0146636962890625,
0.047393798828125,
0.00787353515625,
-0.01540374755859375,
0.036407470703125,
-0.03863525390625,
-0.034942626953125,
0.006694793701171875,
0.07794189453125,
0.01995849609375,
-0.006153106689453125,
-0.006328582763671875,
-0.01094818115234375,
0.0008106231689453125,
0.0107269287109375,
-0.076904296875,
-0.0205841064453125,
0.03009033203125,
-0.030029296875,
-0.03314208984375,
0.001728057861328125,
-0.058746337890625,
-0.004573822021484375,
-0.0189208984375,
0.0297088623046875,
-0.032440185546875,
-0.0201568603515625,
0.016937255859375,
-0.0208587646484375,
0.00033354759216308594,
0.01155853271484375,
-0.0709228515625,
0.0173492431640625,
0.031982421875,
0.058868408203125,
-0.0006651878356933594,
-0.0199432373046875,
-0.003894805908203125,
-0.007190704345703125,
0.0015382766723632812,
0.03497314453125,
-0.016571044921875,
-0.02874755859375,
-0.0155792236328125,
0.0115203857421875,
0.0006241798400878906,
-0.0178070068359375,
0.05169677734375,
-0.0151519775390625,
0.037353515625,
-0.01262664794921875,
-0.039794921875,
-0.032989501953125,
0.0010442733764648438,
-0.0513916015625,
0.107666015625,
0.00974273681640625,
-0.073486328125,
0.038818359375,
-0.05364990234375,
-0.027740478515625,
-0.0066070556640625,
0.0158538818359375,
-0.046600341796875,
-0.00637054443359375,
0.0083770751953125,
0.040252685546875,
-0.0303802490234375,
0.04156494140625,
-0.0325927734375,
-0.023773193359375,
-0.0092620849609375,
-0.04779052734375,
0.09478759765625,
0.0144805908203125,
-0.053253173828125,
-0.00643157958984375,
-0.0662841796875,
-0.005489349365234375,
0.016998291015625,
-0.0203704833984375,
-0.02410888671875,
-0.0131683349609375,
0.027130126953125,
0.024078369140625,
0.0250701904296875,
-0.053924560546875,
-0.00031828880310058594,
-0.0161895751953125,
0.044158935546875,
0.043609619140625,
-0.005802154541015625,
0.0198211669921875,
-0.01476287841796875,
0.012664794921875,
0.0092620849609375,
0.0283966064453125,
0.007091522216796875,
-0.03680419921875,
-0.06744384765625,
-0.0303802490234375,
0.029632568359375,
0.051544189453125,
-0.030517578125,
0.07391357421875,
-0.0117645263671875,
-0.061370849609375,
-0.040252685546875,
0.00582122802734375,
0.04022216796875,
0.055206298828125,
0.040374755859375,
-0.0024738311767578125,
-0.043975830078125,
-0.07177734375,
0.0121307373046875,
-0.01080322265625,
0.0010433197021484375,
0.01499176025390625,
0.06170654296875,
-0.0191650390625,
0.06597900390625,
-0.034423828125,
-0.00981903076171875,
-0.01263427734375,
0.0294036865234375,
0.026641845703125,
0.03338623046875,
0.06146240234375,
-0.051849365234375,
-0.0352783203125,
-0.02447509765625,
-0.06158447265625,
0.0030422210693359375,
0.002716064453125,
-0.002948760986328125,
0.029998779296875,
0.0166168212890625,
-0.03875732421875,
0.0297393798828125,
0.0408935546875,
-0.0208892822265625,
0.042724609375,
-0.02081298828125,
0.01013946533203125,
-0.08758544921875,
0.0159149169921875,
0.0157318115234375,
0.0090789794921875,
-0.0426025390625,
-0.0179443359375,
-0.010040283203125,
-0.0024814605712890625,
-0.046661376953125,
0.035430908203125,
-0.03009033203125,
0.01107025146484375,
-0.01114654541015625,
-0.0193634033203125,
0.0191802978515625,
0.061309814453125,
0.0203094482421875,
0.04925537109375,
0.047119140625,
-0.057159423828125,
0.0091705322265625,
0.00823211669921875,
-0.0239715576171875,
0.0250701904296875,
-0.06396484375,
0.007350921630859375,
-0.01462554931640625,
0.0210418701171875,
-0.0645751953125,
0.00677490234375,
0.00559234619140625,
-0.032012939453125,
0.026275634765625,
-0.00525665283203125,
-0.047393798828125,
-0.0323486328125,
-0.01355743408203125,
0.018341064453125,
0.044708251953125,
-0.027984619140625,
0.0306549072265625,
0.0250244140625,
-0.003635406494140625,
-0.05950927734375,
-0.06573486328125,
-0.0163421630859375,
-0.02850341796875,
-0.0288543701171875,
0.031982421875,
-0.01180267333984375,
-0.01464080810546875,
0.0037899017333984375,
0.005100250244140625,
-0.029449462890625,
0.020965576171875,
0.0270233154296875,
0.03619384765625,
-0.00023281574249267578,
-0.0015716552734375,
0.0032634735107421875,
0.000850677490234375,
0.0036296844482421875,
-0.01050567626953125,
0.027740478515625,
-0.01165771484375,
-0.007564544677734375,
-0.043914794921875,
0.0201873779296875,
0.042236328125,
-0.007656097412109375,
0.0673828125,
0.05157470703125,
-0.037841796875,
-0.00025177001953125,
-0.036346435546875,
-0.01555633544921875,
-0.033172607421875,
0.042724609375,
-0.01033782958984375,
-0.05548095703125,
0.0282745361328125,
0.0186004638671875,
0.021728515625,
0.0677490234375,
0.033660888671875,
-0.01496124267578125,
0.06475830078125,
0.046875,
-0.01580810546875,
0.02642822265625,
-0.040985107421875,
0.0162200927734375,
-0.056671142578125,
-0.017974853515625,
-0.049102783203125,
-0.0191192626953125,
-0.056121826171875,
-0.0232391357421875,
0.0082855224609375,
0.0182952880859375,
-0.0467529296875,
0.04327392578125,
-0.055755615234375,
0.007476806640625,
0.052978515625,
-0.00417327880859375,
0.01441192626953125,
-0.005107879638671875,
-0.008056640625,
-0.0117645263671875,
-0.05230712890625,
-0.050140380859375,
0.0897216796875,
0.040863037109375,
0.03515625,
-0.0025157928466796875,
0.05169677734375,
0.0255584716796875,
0.0163421630859375,
-0.044158935546875,
0.0240936279296875,
-0.0272674560546875,
-0.0733642578125,
-0.0306549072265625,
-0.029876708984375,
-0.06585693359375,
0.0120086669921875,
-0.0259552001953125,
-0.050933837890625,
0.0160369873046875,
0.0075225830078125,
-0.037384033203125,
0.038238525390625,
-0.05609130859375,
0.07794189453125,
-0.043914794921875,
-0.018829345703125,
0.0003986358642578125,
-0.04522705078125,
0.0294036865234375,
-0.010894775390625,
0.0212554931640625,
-0.0141143798828125,
0.00980377197265625,
0.059234619140625,
-0.03253173828125,
0.06915283203125,
-0.0262908935546875,
0.00980377197265625,
0.031463623046875,
-0.0279693603515625,
0.0233001708984375,
-0.007434844970703125,
-0.01348114013671875,
0.0596923828125,
-0.0017042160034179688,
-0.026153564453125,
-0.032928466796875,
0.043670654296875,
-0.07135009765625,
-0.0214385986328125,
-0.0535888671875,
-0.0287628173828125,
-0.004596710205078125,
0.0187530517578125,
0.045318603515625,
0.02276611328125,
-0.024322509765625,
0.0201568603515625,
0.0517578125,
-0.0323486328125,
0.038818359375,
0.04736328125,
0.0031948089599609375,
-0.0210418701171875,
0.058746337890625,
0.008331298828125,
0.01523590087890625,
0.033843994140625,
0.003917694091796875,
-0.042877197265625,
-0.044586181640625,
-0.041656494140625,
0.006641387939453125,
-0.037353515625,
-0.00888824462890625,
-0.07073974609375,
-0.025634765625,
-0.03900146484375,
0.005023956298828125,
-0.034698486328125,
-0.0219268798828125,
-0.046783447265625,
-0.0218963623046875,
0.040252685546875,
0.042816162109375,
0.0073394775390625,
0.031402587890625,
-0.044525146484375,
0.01259613037109375,
0.01343536376953125,
0.0199737548828125,
-0.008697509765625,
-0.055694580078125,
-0.00806427001953125,
0.0276641845703125,
-0.032440185546875,
-0.05255126953125,
0.0297393798828125,
0.0117340087890625,
0.05157470703125,
0.0218505859375,
0.0115203857421875,
0.045074462890625,
-0.034912109375,
0.06903076171875,
0.0249176025390625,
-0.065185546875,
0.041778564453125,
-0.0181427001953125,
0.01184844970703125,
0.0618896484375,
0.050506591796875,
-0.016387939453125,
-0.0246124267578125,
-0.061431884765625,
-0.07452392578125,
0.054290771484375,
0.025299072265625,
0.0025787353515625,
-0.00098419189453125,
0.00853729248046875,
0.005825042724609375,
0.0276947021484375,
-0.07708740234375,
-0.037933349609375,
-0.0284576416015625,
-0.01413726806640625,
-0.0186614990234375,
-0.01268768310546875,
0.002758026123046875,
-0.045806884765625,
0.072021484375,
0.0128936767578125,
0.0262451171875,
0.012969970703125,
-0.00821685791015625,
0.0267333984375,
0.0201568603515625,
0.02960205078125,
0.0170135498046875,
-0.04376220703125,
-0.005924224853515625,
0.01654052734375,
-0.03900146484375,
0.0157928466796875,
0.022491455078125,
-0.02117919921875,
0.01074981689453125,
0.01971435546875,
0.0899658203125,
0.01032257080078125,
-0.0340576171875,
0.0335693359375,
-0.006855010986328125,
-0.0386962890625,
-0.0308380126953125,
0.00028634071350097656,
0.0067291259765625,
-0.0015420913696289062,
0.0219573974609375,
0.0030651092529296875,
0.02935791015625,
-0.04412841796875,
0.00830841064453125,
0.028594970703125,
-0.03729248046875,
-0.01198577880859375,
0.06536865234375,
0.026824951171875,
-0.002315521240234375,
0.05499267578125,
-0.0345458984375,
-0.0245208740234375,
0.06292724609375,
0.027801513671875,
0.06988525390625,
0.004779815673828125,
0.0122222900390625,
0.056396484375,
0.0350341796875,
-0.0022792816162109375,
0.00913238525390625,
0.0045928955078125,
-0.043426513671875,
-0.02679443359375,
-0.07147216796875,
-0.00910186767578125,
0.035003662109375,
-0.05255126953125,
0.01995849609375,
-0.039703369140625,
-0.022918701171875,
0.019927978515625,
0.0125732421875,
-0.06732177734375,
0.026580810546875,
0.01506805419921875,
0.063720703125,
-0.08209228515625,
0.06121826171875,
0.04608154296875,
-0.04315185546875,
-0.05609130859375,
-0.01114654541015625,
0.0017442703247070312,
-0.040069580078125,
0.05096435546875,
0.025115966796875,
0.000537872314453125,
-0.0013647079467773438,
-0.03143310546875,
-0.0615234375,
0.0960693359375,
0.0250244140625,
-0.061248779296875,
0.008087158203125,
0.01666259765625,
0.0478515625,
-0.01386260986328125,
0.048370361328125,
0.033843994140625,
0.03326416015625,
0.00382232666015625,
-0.07708740234375,
-0.0015773773193359375,
-0.0306854248046875,
0.00368499755859375,
0.005889892578125,
-0.0499267578125,
0.073486328125,
-0.01511383056640625,
-0.011627197265625,
-0.0127716064453125,
0.04351806640625,
0.0210418701171875,
0.0271453857421875,
0.0433349609375,
0.06805419921875,
0.059906005859375,
-0.0134124755859375,
0.07574462890625,
-0.0296173095703125,
0.039276123046875,
0.088623046875,
-0.01479339599609375,
0.059539794921875,
0.020355224609375,
-0.032379150390625,
0.046966552734375,
0.0655517578125,
-0.0172882080078125,
0.05059814453125,
0.00519561767578125,
-0.022216796875,
0.0024204254150390625,
-0.01120758056640625,
-0.028961181640625,
0.0325927734375,
0.0038013458251953125,
-0.03460693359375,
-0.010650634765625,
-0.00438690185546875,
0.011871337890625,
-0.006591796875,
-0.01309967041015625,
0.0499267578125,
-0.0094757080078125,
-0.048736572265625,
0.047821044921875,
0.00998687744140625,
0.08502197265625,
-0.04498291015625,
-0.00004595518112182617,
0.00609588623046875,
0.00241851806640625,
-0.0157928466796875,
-0.0545654296875,
0.0214996337890625,
0.01313018798828125,
-0.034332275390625,
-0.01361083984375,
0.032073974609375,
-0.024444580078125,
-0.06536865234375,
0.034271240234375,
0.01554107666015625,
0.01251983642578125,
-0.0157470703125,
-0.07476806640625,
-0.013092041015625,
0.00479888916015625,
-0.0156402587890625,
0.0236053466796875,
0.029052734375,
0.01259613037109375,
0.0301055908203125,
0.0545654296875,
-0.01253509521484375,
-0.0015115737915039062,
-0.0002982616424560547,
0.050872802734375,
-0.03765869140625,
-0.0279083251953125,
-0.0634765625,
0.057830810546875,
-0.01514434814453125,
-0.0394287109375,
0.0618896484375,
0.065185546875,
0.06585693359375,
-0.00341033935546875,
0.08502197265625,
-0.0201873779296875,
0.025726318359375,
-0.0274200439453125,
0.0482177734375,
-0.044586181640625,
-0.004077911376953125,
-0.0281829833984375,
-0.0670166015625,
0.010650634765625,
0.045013427734375,
-0.0205230712890625,
0.013916015625,
0.05169677734375,
0.0479736328125,
-0.007282257080078125,
-0.0007834434509277344,
0.00780487060546875,
0.01317596435546875,
0.0019102096557617188,
0.039947509765625,
0.0230712890625,
-0.044769287109375,
0.0352783203125,
-0.055938720703125,
-0.02764892578125,
-0.0006160736083984375,
-0.081298828125,
-0.06072998046875,
-0.044403076171875,
-0.03839111328125,
-0.01885986328125,
-0.00493621826171875,
0.047943115234375,
0.056549072265625,
-0.06597900390625,
-0.0146942138671875,
-0.021514892578125,
-0.01468658447265625,
-0.022064208984375,
-0.01837158203125,
0.022491455078125,
-0.01328277587890625,
-0.06805419921875,
0.004283905029296875,
0.0080413818359375,
0.01461029052734375,
-0.02947998046875,
-0.006343841552734375,
-0.030487060546875,
-0.00611114501953125,
0.0450439453125,
0.008819580078125,
-0.0562744140625,
-0.005634307861328125,
0.01047515869140625,
-0.01776123046875,
-0.0011234283447265625,
0.024444580078125,
-0.036102294921875,
0.0228118896484375,
0.023773193359375,
0.0323486328125,
0.0491943359375,
-0.0116424560546875,
0.0195770263671875,
-0.05218505859375,
0.0295257568359375,
0.01898193359375,
0.039215087890625,
0.0232086181640625,
-0.04083251953125,
0.035430908203125,
0.0273590087890625,
-0.04595947265625,
-0.054840087890625,
-0.0068511962890625,
-0.0826416015625,
-0.03192138671875,
0.09515380859375,
-0.0129241943359375,
-0.0184326171875,
-0.00539398193359375,
-0.010894775390625,
0.03326416015625,
-0.031402587890625,
0.05755615234375,
0.06585693359375,
0.0014467239379882812,
0.0007257461547851562,
-0.057769775390625,
0.032196044921875,
0.026092529296875,
-0.061309814453125,
-0.0008711814880371094,
0.042144775390625,
0.03533935546875,
0.0137176513671875,
0.04241943359375,
-0.006256103515625,
-0.005588531494140625,
0.0079803466796875,
0.035430908203125,
-0.0088958740234375,
-0.01448822021484375,
-0.018829345703125,
-0.0138702392578125,
-0.01247406005859375,
-0.02337646484375
]
] |
BubbleSheep/Hgn_trans_en2zh | 2022-08-22T10:14:19.000Z | [
"transformers",
"pytorch",
"marian",
"text2text-generation",
"translation",
"en",
"zh",
"dataset:THUOCL清华大学开放中文词库",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | BubbleSheep | null | null | BubbleSheep/Hgn_trans_en2zh | 3 | 26,946 | transformers | 2022-07-28T14:03:50 | ---
language:
- en
- zh
tags:
- translation
license: apache-2.0
datasets:
- THUOCL清华大学开放中文词库
metrics:
- bleu
---
# Model Details
- **Model Description:**
This model was pre-trained for English-to-Chinese translation and fine-tuned on the THUOCL dataset.
- **Source group:** English
- **Target group:** Chinese
- **Parent Model:** Helsinki-NLP/opus-mt-en-zh, see https://huggingface.co/Helsinki-NLP/opus-mt-en-zh
- **Model Type:** Translation
#### Training Data
- THUOCL: Tsinghua University Open Chinese Lexicon (清华大学中文开放词库)
- **Data link**: http://thuocl.thunlp.org/
## How to Get Started With the Model
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("BubbleSheep/Hgn_trans_en2zh")
model = AutoModelForSeq2SeqLM.from_pretrained("BubbleSheep/Hgn_trans_en2zh")
```
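The snippet above only loads the tokenizer and model. Below is a minimal inference sketch using the standard `generate`/`decode` pattern for Marian-based seq2seq models; the input sentence is purely illustrative:
```python
text = "Machine translation is useful."
inputs = tokenizer(text, return_tensors="pt")
# Generate the Chinese translation and strip special tokens when decoding.
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```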
| 872 | [
[
0.00609588623046875,
-0.0287628173828125,
0.009490966796875,
0.0263519287109375,
-0.038116455078125,
-0.0133514404296875,
-0.0213775634765625,
-0.0157928466796875,
0.0010862350463867188,
0.03271484375,
-0.0531005859375,
-0.037353515625,
-0.0474853515625,
0.017730712890625,
-0.013275146484375,
0.07208251953125,
-0.0135345458984375,
0.041259765625,
0.035125732421875,
0.0009298324584960938,
-0.0222015380859375,
-0.034332275390625,
-0.05859375,
-0.0303192138671875,
0.01125335693359375,
0.024261474609375,
0.040618896484375,
0.0567626953125,
0.021759033203125,
0.0229949951171875,
0.0017948150634765625,
0.0004849433898925781,
-0.043365478515625,
-0.0229644775390625,
0.0013628005981445312,
-0.039459228515625,
-0.044403076171875,
0.0008788108825683594,
0.05224609375,
0.042083740234375,
-0.0145263671875,
0.034576416015625,
0.0031280517578125,
0.0301055908203125,
-0.00893402099609375,
0.0203094482421875,
-0.0418701171875,
0.027984619140625,
-0.01314544677734375,
0.00275421142578125,
-0.0292510986328125,
-0.01450347900390625,
0.006793975830078125,
-0.058563232421875,
0.0182952880859375,
0.0159912109375,
0.09454345703125,
0.01305389404296875,
-0.04132080078125,
0.0106353759765625,
-0.0474853515625,
0.0693359375,
-0.063720703125,
0.0278167724609375,
0.046630859375,
0.01540374755859375,
0.0017871856689453125,
-0.0760498046875,
-0.032806396484375,
-0.0002155303955078125,
-0.0273284912109375,
0.0087890625,
0.0004334449768066406,
-0.00930023193359375,
0.01110076904296875,
0.050048828125,
-0.05206298828125,
0.00379180908203125,
-0.0513916015625,
-0.0283966064453125,
0.052337646484375,
0.020751953125,
0.02569580078125,
-0.01471710205078125,
-0.03961181640625,
-0.002044677734375,
-0.034759521484375,
-0.0008454322814941406,
0.005584716796875,
0.0256195068359375,
-0.03265380859375,
0.0506591796875,
-0.00864410400390625,
0.051116943359375,
0.01593017578125,
-0.0005950927734375,
0.03643798828125,
-0.0408935546875,
-0.020233154296875,
-0.0021648406982421875,
0.0745849609375,
0.0164642333984375,
0.0175933837890625,
0.00711822509765625,
-0.027679443359375,
-0.03265380859375,
-0.0034770965576171875,
-0.06298828125,
-0.0244598388671875,
0.01071929931640625,
-0.0518798828125,
-0.024810791015625,
0.003631591796875,
-0.02459716796875,
0.02313232421875,
-0.0242156982421875,
0.04962158203125,
-0.0278778076171875,
-0.037017822265625,
0.021026611328125,
0.0007920265197753906,
0.0181121826171875,
-0.0010242462158203125,
-0.054962158203125,
0.0027751922607421875,
0.022369384765625,
0.0550537109375,
-0.020172119140625,
-0.0360107421875,
-0.02752685546875,
0.0207061767578125,
-0.01468658447265625,
0.0333251953125,
-0.003482818603515625,
-0.0355224609375,
0.007724761962890625,
0.033172607421875,
-0.03143310546875,
-0.0185394287109375,
0.06231689453125,
-0.027069091796875,
0.0716552734375,
-0.0028133392333984375,
-0.051177978515625,
-0.0232696533203125,
0.0225830078125,
-0.05072021484375,
0.0791015625,
0.038848876953125,
-0.052337646484375,
0.00827789306640625,
-0.06378173828125,
-0.03271484375,
0.018035888671875,
0.02032470703125,
-0.03350830078125,
0.0027446746826171875,
0.01430511474609375,
0.0224456787109375,
-0.015380859375,
0.0167388916015625,
0.00231170654296875,
-0.01084136962890625,
-0.01148223876953125,
-0.0267333984375,
0.09033203125,
0.02471923828125,
-0.016326904296875,
0.024169921875,
-0.050323486328125,
0.0009102821350097656,
0.0335693359375,
-0.019805908203125,
-0.0160675048828125,
-0.0204620361328125,
0.035491943359375,
0.03778076171875,
0.04248046875,
-0.0277862548828125,
0.027923583984375,
-0.035491943359375,
0.04620361328125,
0.040618896484375,
-0.01194000244140625,
0.02178955078125,
-0.0196380615234375,
0.03546142578125,
0.01446533203125,
0.02679443359375,
0.0005698204040527344,
-0.035247802734375,
-0.059967041015625,
-0.00577545166015625,
0.032806396484375,
0.051788330078125,
-0.07196044921875,
0.059417724609375,
-0.0179595947265625,
-0.049407958984375,
-0.045318603515625,
0.003925323486328125,
0.025848388671875,
0.046630859375,
0.045745849609375,
-0.00637054443359375,
-0.0283966064453125,
-0.057861328125,
-0.007152557373046875,
-0.0220947265625,
-0.0141754150390625,
-0.0099334716796875,
0.044708251953125,
-0.03094482421875,
0.03265380859375,
-0.032257080078125,
-0.048095703125,
-0.0272674560546875,
0.0036602020263671875,
0.0294647216796875,
0.05511474609375,
0.04656982421875,
-0.062042236328125,
-0.053253173828125,
-0.01153564453125,
-0.0308074951171875,
-0.00835418701171875,
0.0106201171875,
-0.03363037109375,
0.0158233642578125,
0.024017333984375,
-0.04010009765625,
0.0308685302734375,
0.047607421875,
-0.04296875,
0.04962158203125,
-0.0174407958984375,
0.00539398193359375,
-0.12042236328125,
0.00897216796875,
-0.01507568359375,
-0.016632080078125,
-0.047454833984375,
-0.0088348388671875,
0.01324462890625,
0.00789642333984375,
-0.05914306640625,
0.044891357421875,
-0.0225830078125,
0.017181396484375,
-0.01190185546875,
-0.01480865478515625,
0.0157318115234375,
0.04437255859375,
0.029022216796875,
0.044342041015625,
0.04058837890625,
-0.055633544921875,
0.03802490234375,
0.044097900390625,
-0.01904296875,
0.0139617919921875,
-0.071044921875,
-0.01071929931640625,
0.02801513671875,
0.0043792724609375,
-0.061798095703125,
-0.016876220703125,
0.03802490234375,
-0.035980224609375,
0.032470703125,
-0.005687713623046875,
-0.058258056640625,
-0.040252685546875,
-0.007511138916015625,
0.053924560546875,
0.0300140380859375,
-0.042388916015625,
0.05377197265625,
0.01209259033203125,
0.0020198822021484375,
-0.0250244140625,
-0.0653076171875,
-0.006145477294921875,
-0.0255584716796875,
-0.04180908203125,
0.0372314453125,
-0.0137786865234375,
0.0249481201171875,
0.004039764404296875,
0.0211029052734375,
-0.0244598388671875,
0.002414703369140625,
-0.0189056396484375,
0.035919189453125,
-0.0274658203125,
0.0052337646484375,
-0.0020961761474609375,
-0.0117340087890625,
-0.004520416259765625,
-0.041046142578125,
0.049102783203125,
-0.0140533447265625,
0.006988525390625,
-0.04693603515625,
0.00582122802734375,
0.021484375,
-0.0322265625,
0.067138671875,
0.06787109375,
-0.032073974609375,
-0.00223541259765625,
-0.0187530517578125,
-0.01279449462890625,
-0.03448486328125,
0.042694091796875,
-0.027069091796875,
-0.045928955078125,
0.040740966796875,
0.0186767578125,
0.00591278076171875,
0.0733642578125,
0.050445556640625,
0.00846099853515625,
0.07659912109375,
0.048431396484375,
-0.0223541259765625,
0.031829833984375,
-0.03448486328125,
-0.00008702278137207031,
-0.04962158203125,
0.0010538101196289062,
-0.033050537109375,
0.0018310546875,
-0.040618896484375,
-0.011505126953125,
0.0099945068359375,
-0.002155303955078125,
-0.026580810546875,
0.06036376953125,
-0.035614013671875,
0.0111846923828125,
0.04400634765625,
-0.01517486572265625,
0.0230560302734375,
0.0018482208251953125,
-0.00949859619140625,
-0.00878143310546875,
-0.0648193359375,
-0.032867431640625,
0.06878662109375,
0.035125732421875,
0.0276947021484375,
0.005672454833984375,
0.022857666015625,
0.00031638145446777344,
0.020660400390625,
-0.05078125,
0.0226593017578125,
-0.021759033203125,
-0.06829833984375,
-0.02545166015625,
-0.03460693359375,
-0.07049560546875,
0.01715087890625,
-0.0092315673828125,
-0.03546142578125,
-0.0024356842041015625,
0.00585174560546875,
-0.016510009765625,
0.026214599609375,
-0.03497314453125,
0.09185791015625,
-0.040252685546875,
-0.003566741943359375,
0.0008959770202636719,
-0.04974365234375,
0.038970947265625,
-0.0158843994140625,
0.008544921875,
-0.0012331008911132812,
0.00307464599609375,
0.06005859375,
-0.0253143310546875,
0.038116455078125,
-0.00461578369140625,
-0.0214996337890625,
0.009674072265625,
-0.005298614501953125,
0.021759033203125,
-0.00789642333984375,
-0.01393890380859375,
0.0384521484375,
0.0131378173828125,
-0.033111572265625,
-0.00208282470703125,
0.027679443359375,
-0.06622314453125,
-0.0238800048828125,
-0.028045654296875,
-0.037384033203125,
0.00959014892578125,
0.045440673828125,
0.06268310546875,
0.0159149169921875,
-0.0173492431640625,
0.01372528076171875,
0.0241851806640625,
-0.013397216796875,
0.0279541015625,
0.06524658203125,
-0.027099609375,
-0.03564453125,
0.055450439453125,
0.0076751708984375,
0.0201263427734375,
0.0196685791015625,
0.0220947265625,
-0.037200927734375,
-0.0246734619140625,
-0.047607421875,
0.03533935546875,
-0.0263519287109375,
-0.00362396240234375,
-0.035247802734375,
-0.03668212890625,
-0.04656982421875,
0.035125732421875,
-0.039337158203125,
-0.034576416015625,
-0.04058837890625,
-0.01345062255859375,
0.0027370452880859375,
0.032196044921875,
-0.01366424560546875,
0.04248046875,
-0.07965087890625,
0.027923583984375,
0.004123687744140625,
0.0190582275390625,
-0.0266265869140625,
-0.08843994140625,
-0.0506591796875,
0.0163726806640625,
-0.0177459716796875,
-0.057830810546875,
0.05938720703125,
0.02398681640625,
0.042755126953125,
0.040283203125,
0.01062774658203125,
0.037078857421875,
-0.041412353515625,
0.075927734375,
-0.001674652099609375,
-0.055938720703125,
0.04132080078125,
-0.0219268798828125,
0.0196075439453125,
0.041534423828125,
0.035980224609375,
-0.03173828125,
-0.01522064208984375,
-0.052886962890625,
-0.06524658203125,
0.059906005859375,
0.034454345703125,
0.0076751708984375,
0.0249481201171875,
0.01541900634765625,
0.00031113624572753906,
0.019287109375,
-0.08465576171875,
-0.03668212890625,
-0.028717041015625,
-0.0268707275390625,
-0.01538848876953125,
-0.016265869140625,
-0.0070037841796875,
-0.0418701171875,
0.09124755859375,
0.005962371826171875,
0.009033203125,
0.0098876953125,
-0.0123443603515625,
-0.021820068359375,
-0.01001739501953125,
0.047088623046875,
0.023223876953125,
-0.0181427001953125,
-0.01503753662109375,
0.0013561248779296875,
-0.036529541015625,
0.0008754730224609375,
0.0303802490234375,
-0.03167724609375,
0.022674560546875,
0.032958984375,
0.0733642578125,
-0.0157623291015625,
-0.0175933837890625,
0.036285400390625,
-0.0230560302734375,
-0.0211334228515625,
-0.0565185546875,
0.0005636215209960938,
-0.00508880615234375,
-0.00516510009765625,
0.032440185546875,
-0.0185699462890625,
0.0023822784423828125,
-0.01456451416015625,
0.00907135009765625,
0.00418853759765625,
-0.039215087890625,
-0.032958984375,
0.0625,
0.01123809814453125,
-0.021636962890625,
0.054595947265625,
-0.01409912109375,
-0.05810546875,
0.04205322265625,
0.040008544921875,
0.07733154296875,
-0.042022705078125,
0.0013895034790039062,
0.059112548828125,
0.03253173828125,
-0.017486572265625,
0.01155853271484375,
0.006969451904296875,
-0.059722900390625,
-0.0266876220703125,
-0.039154052734375,
0.005054473876953125,
0.036285400390625,
-0.057037353515625,
0.028961181640625,
0.0110015869140625,
-0.01399993896484375,
-0.0258636474609375,
0.0082855224609375,
-0.0447998046875,
0.0208740234375,
0.0057525634765625,
0.072265625,
-0.07666015625,
0.08526611328125,
0.052825927734375,
-0.035919189453125,
-0.0771484375,
0.0007781982421875,
-0.01386260986328125,
-0.057342529296875,
0.058380126953125,
0.0211334228515625,
0.0213775634765625,
0.02703857421875,
-0.048583984375,
-0.07196044921875,
0.074462890625,
0.00232696533203125,
-0.0316162109375,
-0.0032444000244140625,
0.0152587890625,
0.042327880859375,
-0.041412353515625,
0.00519561767578125,
0.03887939453125,
0.044097900390625,
-0.00577545166015625,
-0.08160400390625,
-0.0225372314453125,
-0.04412841796875,
0.0003147125244140625,
-0.0184783935546875,
-0.052490234375,
0.07672119140625,
-0.0003285408020019531,
-0.0157012939453125,
0.01459503173828125,
0.068603515625,
0.0100250244140625,
0.006500244140625,
0.0474853515625,
0.03338623046875,
0.029541015625,
-0.01058197021484375,
0.052703857421875,
-0.046722412109375,
0.03448486328125,
0.095703125,
-0.00176239013671875,
0.049530029296875,
0.0264739990234375,
-0.0105743408203125,
0.037017822265625,
0.0687255859375,
-0.035919189453125,
0.0396728515625,
-0.005741119384765625,
-0.0009670257568359375,
-0.01195526123046875,
-0.01515960693359375,
-0.049835205078125,
0.0282135009765625,
0.034454345703125,
-0.031890869140625,
0.0004067420959472656,
0.0013589859008789062,
-0.00186920166015625,
-0.00508880615234375,
-0.034454345703125,
0.044464111328125,
0.006183624267578125,
-0.04833984375,
0.051239013671875,
0.0208740234375,
0.04498291015625,
-0.053253173828125,
0.00803375244140625,
-0.01031494140625,
0.01108551025390625,
-0.0007023811340332031,
-0.04071044921875,
0.01403045654296875,
-0.0138702392578125,
-0.007343292236328125,
-0.0072021484375,
0.0413818359375,
-0.06878662109375,
-0.06298828125,
0.0252227783203125,
0.034393310546875,
0.0137786865234375,
0.00514984130859375,
-0.08367919921875,
-0.03253173828125,
0.0178375244140625,
-0.04852294921875,
-0.0116729736328125,
0.034698486328125,
0.0029926300048828125,
0.039459228515625,
0.03857421875,
0.0160369873046875,
0.003498077392578125,
0.0066680908203125,
0.0462646484375,
-0.0379638671875,
-0.047210693359375,
-0.054962158203125,
0.04412841796875,
-0.006114959716796875,
-0.031524658203125,
0.06341552734375,
0.05804443359375,
0.07720947265625,
-0.0303192138671875,
0.0501708984375,
0.00623321533203125,
0.04168701171875,
-0.03680419921875,
0.05548095703125,
-0.0552978515625,
-0.0115509033203125,
-0.002735137939453125,
-0.0694580078125,
-0.0105438232421875,
0.049591064453125,
-0.0023288726806640625,
-0.0029277801513671875,
0.04339599609375,
0.05316162109375,
-0.01070404052734375,
-0.01236724853515625,
0.0189208984375,
0.0341796875,
0.0204315185546875,
0.052886962890625,
0.0421142578125,
-0.06768798828125,
0.039825439453125,
-0.0401611328125,
0.004802703857421875,
-0.018829345703125,
-0.0560302734375,
-0.0772705078125,
-0.048248291015625,
-0.0308685302734375,
-0.038330078125,
-0.0303192138671875,
0.078125,
0.039459228515625,
-0.07763671875,
-0.0229644775390625,
0.003154754638671875,
-0.00789642333984375,
-0.01172637939453125,
-0.01471710205078125,
0.0513916015625,
-0.0177154541015625,
-0.07244873046875,
0.0201873779296875,
-0.0106353759765625,
0.03546142578125,
-0.0183258056640625,
-0.02032470703125,
-0.020050048828125,
-0.0110015869140625,
0.0286407470703125,
0.015106201171875,
-0.047332763671875,
0.006099700927734375,
0.0037097930908203125,
-0.03448486328125,
0.002178192138671875,
0.0218505859375,
-0.01247406005859375,
0.017974853515625,
0.0341796875,
0.01593017578125,
0.036712646484375,
0.00182342529296875,
0.060638427734375,
-0.040618896484375,
0.0296173095703125,
0.00252532958984375,
0.0390625,
0.0281219482421875,
-0.0204925537109375,
0.055908203125,
0.0322265625,
-0.028717041015625,
-0.054931640625,
0.00608062744140625,
-0.059295654296875,
-0.01454925537109375,
0.091552734375,
-0.020233154296875,
-0.0225677490234375,
-0.0010986328125,
-0.031036376953125,
0.052459716796875,
-0.005199432373046875,
0.0411376953125,
0.055694580078125,
0.0294952392578125,
0.00006365776062011719,
-0.038330078125,
0.04376220703125,
0.031585693359375,
-0.038055419921875,
-0.01247406005859375,
0.0204315185546875,
0.015838623046875,
0.00946044921875,
0.036590576171875,
-0.0112152099609375,
0.012298583984375,
-0.033233642578125,
0.0222015380859375,
-0.0183563232421875,
0.0002636909484863281,
-0.02203369140625,
-0.01001739501953125,
-0.0063323974609375,
-0.023712158203125
]
] |
NTQAI/pedestrian_gender_recognition | 2023-07-06T07:29:58.000Z | [
"transformers",
"pytorch",
"onnx",
"safetensors",
"beit",
"image-classification",
"vision",
"generated_from_trainer",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | image-classification | NTQAI | null | null | NTQAI/pedestrian_gender_recognition | 3 | 26,934 | transformers | 2023-01-06T04:37:51 | ---
license: apache-2.0
tags:
- image-classification
- vision
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: outputs
results:
- task:
name: Image Classification
type: image-classification
metrics:
- name: Accuracy
type: accuracy
value: 0.9107332624867163
---
# outputs
This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the [PETA](http://mmlab.ie.cuhk.edu.hk/projects/PETA_files/Pedestrian%20Attribute%20Recognition%20At%20Far%20Distance.pdf) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2170
- Accuracy: 0.9107
## Model description
More information needed
#### How to use
You can use this model with the Transformers *pipeline*.
```python
from transformers import pipeline
gender_classifier = pipeline(model="NTQAI/pedestrian_gender_recognition")
image_path = "abc.jpg"
results = gender_classifier(image_path)
print(results)
```
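Each entry in `results` is a dictionary with `label` and `score` keys, the standard image-classification pipeline output; the exact label strings come from this model's config and are not documented here. A small sketch of reading them:
```python
for result in results:
    # result["label"] is the predicted class name, result["score"] its confidence.
    print(f"{result['label']}: {result['score']:.3f}")
```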
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0
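A hedged sketch of how these settings map onto `transformers.TrainingArguments`; the output directory is a placeholder matching the model name above, and the Adam betas and epsilon listed are the library defaults, so they are not set explicitly:
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",            # placeholder, matching the model name above
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    lr_scheduler_type="linear",
    num_train_epochs=5.0,
)
```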
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.5193 | 1.0 | 2000 | 0.3346 | 0.8533 |
| 0.337 | 2.0 | 4000 | 0.2892 | 0.8778 |
| 0.3771 | 3.0 | 6000 | 0.2493 | 0.8969 |
| 0.3819 | 4.0 | 8000 | 0.2275 | 0.9100 |
| 0.3581 | 5.0 | 10000 | 0.2170 | 0.9107 |
### Framework versions
- Transformers 4.24.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.1
### Contact information
For personal communication related to this project, please contact Nha Nguyen Van (nha282@gmail.com). | 2,334 | [
[
-0.0300750732421875,
-0.028900146484375,
0.007785797119140625,
0.009552001953125,
-0.01210784912109375,
-0.01739501953125,
0.00296783447265625,
-0.029052734375,
-0.0036602020263671875,
0.024810791015625,
-0.041229248046875,
-0.04339599609375,
-0.047576904296875,
0.0012655258178710938,
-0.02197265625,
0.0673828125,
0.00865936279296875,
0.01788330078125,
0.0008502006530761719,
-0.0059814453125,
-0.034637451171875,
-0.044586181640625,
-0.058197021484375,
-0.047882080078125,
0.022918701171875,
0.016265869140625,
0.046539306640625,
0.06451416015625,
0.039031982421875,
0.02716064453125,
-0.0227203369140625,
-0.0025119781494140625,
-0.0191802978515625,
-0.041168212890625,
-0.0019435882568359375,
-0.032958984375,
-0.0594482421875,
0.01457977294921875,
0.058746337890625,
0.037445068359375,
-0.00402069091796875,
0.034149169921875,
0.0038700103759765625,
0.033843994140625,
-0.039337158203125,
0.02447509765625,
-0.035064697265625,
0.024993896484375,
-0.015380859375,
-0.029327392578125,
-0.0264892578125,
0.0026397705078125,
0.008697509765625,
-0.0316162109375,
0.04656982421875,
-0.0016651153564453125,
0.092529296875,
0.0157318115234375,
-0.03350830078125,
0.019317626953125,
-0.058319091796875,
0.0555419921875,
-0.051666259765625,
0.018524169921875,
0.034393310546875,
0.0291290283203125,
0.00020611286163330078,
-0.050140380859375,
-0.036346435546875,
-0.01378631591796875,
-0.012054443359375,
0.00896453857421875,
-0.0178985595703125,
-0.00560760498046875,
0.05120849609375,
0.02166748046875,
-0.043212890625,
0.0114898681640625,
-0.0479736328125,
-0.02569580078125,
0.0462646484375,
0.032379150390625,
-0.0086822509765625,
-0.018218994140625,
-0.0295867919921875,
-0.02178955078125,
-0.026336669921875,
0.01114654541015625,
0.048736572265625,
0.01654052734375,
-0.03057861328125,
0.0423583984375,
-0.0170135498046875,
0.0535888671875,
0.00804901123046875,
-0.01026153564453125,
0.0589599609375,
-0.0009002685546875,
-0.0211181640625,
0.00577545166015625,
0.067138671875,
0.039520263671875,
0.0156707763671875,
0.004573822021484375,
-0.00937652587890625,
-0.00482940673828125,
0.0180816650390625,
-0.0703125,
-0.01763916015625,
0.01395416259765625,
-0.03875732421875,
-0.0450439453125,
0.016021728515625,
-0.052947998046875,
0.005664825439453125,
-0.0309600830078125,
0.0280914306640625,
-0.0232086181640625,
-0.020782470703125,
0.01129150390625,
-0.0097198486328125,
0.0382080078125,
0.0131683349609375,
-0.06634521484375,
0.0169525146484375,
0.03948974609375,
0.04986572265625,
0.01108551025390625,
-0.03948974609375,
-0.0172271728515625,
0.00746917724609375,
-0.024200439453125,
0.0386962890625,
-0.01131439208984375,
-0.01375579833984375,
-0.020233154296875,
0.0285186767578125,
-0.00785064697265625,
-0.03948974609375,
0.0452880859375,
-0.012451171875,
0.0241241455078125,
-0.0008287429809570312,
-0.025054931640625,
-0.01078033447265625,
0.01849365234375,
-0.041778564453125,
0.09417724609375,
0.01457977294921875,
-0.07611083984375,
0.037078857421875,
-0.042938232421875,
0.006748199462890625,
-0.0026721954345703125,
-0.008544921875,
-0.061309814453125,
-0.023590087890625,
0.005840301513671875,
0.030914306640625,
-0.0169219970703125,
0.03070068359375,
-0.0274505615234375,
-0.0384521484375,
0.004802703857421875,
-0.041259765625,
0.0704345703125,
-0.006320953369140625,
-0.04412841796875,
0.0189971923828125,
-0.08209228515625,
0.0026874542236328125,
0.028045654296875,
-0.0289459228515625,
0.0086517333984375,
-0.021881103515625,
0.03302001953125,
0.0261688232421875,
0.01605224609375,
-0.044464111328125,
0.0043792724609375,
-0.0245513916015625,
0.0246429443359375,
0.06121826171875,
-0.01525115966796875,
0.00986480712890625,
-0.0238037109375,
0.02276611328125,
0.0201416015625,
0.0206146240234375,
0.0185394287109375,
-0.05078125,
-0.065185546875,
-0.00994110107421875,
0.0199432373046875,
0.037994384765625,
-0.0247039794921875,
0.06280517578125,
-0.014739990234375,
-0.059173583984375,
-0.0173187255859375,
-0.01480865478515625,
0.02520751953125,
0.0633544921875,
0.02899169921875,
-0.022796630859375,
-0.045684814453125,
-0.08624267578125,
0.0017843246459960938,
-0.02703857421875,
0.000006139278411865234,
0.01393890380859375,
0.0609130859375,
-0.0222930908203125,
0.07110595703125,
-0.038360595703125,
-0.014984130859375,
-0.007404327392578125,
0.01947021484375,
0.0265655517578125,
0.07012939453125,
0.058258056640625,
-0.041168212890625,
-0.006893157958984375,
-0.0159454345703125,
-0.060546875,
0.00695037841796875,
-0.00847625732421875,
-0.0302734375,
0.00455474853515625,
0.01326751708984375,
-0.03948974609375,
0.058868408203125,
0.020599365234375,
-0.044342041015625,
0.064453125,
-0.0299224853515625,
-0.01058197021484375,
-0.08514404296875,
0.027923583984375,
0.021636962890625,
-0.0198516845703125,
-0.0394287109375,
-0.00402069091796875,
0.00911712646484375,
-0.010711669921875,
-0.0396728515625,
0.04705810546875,
-0.0036220550537109375,
-0.0007390975952148438,
-0.0172271728515625,
-0.0308685302734375,
-0.0113067626953125,
0.057464599609375,
0.025848388671875,
0.0438232421875,
0.04925537109375,
-0.039031982421875,
0.020721435546875,
0.037750244140625,
-0.009857177734375,
0.040191650390625,
-0.0609130859375,
0.01003265380859375,
-0.01461029052734375,
0.00887298583984375,
-0.049163818359375,
-0.0162353515625,
0.04559326171875,
-0.036590576171875,
0.0241546630859375,
-0.0200042724609375,
-0.02923583984375,
-0.04095458984375,
-0.00849151611328125,
0.0250091552734375,
0.03729248046875,
-0.037750244140625,
0.041168212890625,
0.0020732879638671875,
0.032501220703125,
-0.053619384765625,
-0.0625,
-0.003070831298828125,
-0.0189971923828125,
-0.037750244140625,
0.0204925537109375,
0.00022733211517333984,
0.017181396484375,
-0.006839752197265625,
-0.0004837512969970703,
-0.0201263427734375,
-0.00033402442932128906,
0.023162841796875,
0.03173828125,
-0.0092926025390625,
-0.0038928985595703125,
-0.0213775634765625,
-0.041168212890625,
0.0276947021484375,
0.002437591552734375,
0.04852294921875,
-0.0196380615234375,
-0.03167724609375,
-0.0621337890625,
0.01030731201171875,
0.03863525390625,
-0.016693115234375,
0.060211181640625,
0.08050537109375,
-0.039947509765625,
0.00621795654296875,
-0.022430419921875,
-0.0204010009765625,
-0.036834716796875,
0.04412841796875,
-0.0289764404296875,
-0.0287933349609375,
0.0496826171875,
0.0023555755615234375,
-0.00099945068359375,
0.07598876953125,
0.041046142578125,
0.01000213623046875,
0.09014892578125,
0.0274505615234375,
-0.0020999908447265625,
0.0300750732421875,
-0.0611572265625,
-0.0131378173828125,
-0.0645751953125,
-0.03997802734375,
-0.04132080078125,
-0.026275634765625,
-0.057373046875,
-0.00746917724609375,
0.01065826416015625,
-0.0034580230712890625,
-0.0643310546875,
0.0283966064453125,
-0.053375244140625,
0.01061248779296875,
0.0679931640625,
0.034149169921875,
-0.00423431396484375,
0.00658416748046875,
-0.0088043212890625,
-0.0024394989013671875,
-0.058807373046875,
-0.0450439453125,
0.091064453125,
0.041351318359375,
0.04730224609375,
-0.01305389404296875,
0.046112060546875,
-0.0007352828979492188,
0.0115966796875,
-0.045806884765625,
0.04412841796875,
0.003711700439453125,
-0.052825927734375,
-0.00421905517578125,
-0.040679931640625,
-0.07244873046875,
0.009429931640625,
-0.0241851806640625,
-0.064697265625,
0.036590576171875,
0.01934814453125,
-0.00841522216796875,
0.0478515625,
-0.039703369140625,
0.08587646484375,
-0.00951385498046875,
-0.0267333984375,
-0.002567291259765625,
-0.045166015625,
0.01763916015625,
-0.0057220458984375,
-0.028900146484375,
-0.005985260009765625,
0.01393890380859375,
0.07586669921875,
-0.04168701171875,
0.06378173828125,
-0.0197601318359375,
0.023406982421875,
0.01439666748046875,
-0.026458740234375,
0.042510986328125,
-0.0013322830200195312,
-0.010223388671875,
0.03131103515625,
-0.0009646415710449219,
-0.046966552734375,
-0.023101806640625,
0.03326416015625,
-0.0850830078125,
-0.02899169921875,
-0.060638427734375,
-0.03302001953125,
0.0022678375244140625,
0.01238250732421875,
0.03485107421875,
0.048126220703125,
0.0076141357421875,
0.01358795166015625,
0.032073974609375,
-0.0103912353515625,
0.02996826171875,
0.025848388671875,
0.0026988983154296875,
-0.035186767578125,
0.061920166015625,
-0.00002771615982055664,
0.00646209716796875,
-0.0019989013671875,
0.0281219482421875,
-0.027618408203125,
-0.027374267578125,
-0.040374755859375,
0.0218505859375,
-0.05279541015625,
-0.0117034912109375,
-0.041900634765625,
-0.037322998046875,
-0.02642822265625,
0.0099945068359375,
-0.0279998779296875,
-0.0155487060546875,
-0.04083251953125,
-0.0031948089599609375,
0.03509521484375,
0.035797119140625,
-0.000011920928955078125,
0.0389404296875,
-0.047149658203125,
0.00965118408203125,
0.023223876953125,
0.028411865234375,
0.0084381103515625,
-0.058074951171875,
-0.0166015625,
0.0008187294006347656,
-0.032958984375,
-0.061737060546875,
0.036651611328125,
-0.0023326873779296875,
0.049346923828125,
0.03424072265625,
-0.01251220703125,
0.05731201171875,
-0.0230865478515625,
0.050445556640625,
0.0196380615234375,
-0.058563232421875,
0.03564453125,
-0.0244140625,
0.03106689453125,
0.0298614501953125,
0.0259857177734375,
-0.024627685546875,
-0.01003265380859375,
-0.08270263671875,
-0.054901123046875,
0.08270263671875,
0.033233642578125,
-0.003452301025390625,
0.017913818359375,
0.0265655517578125,
-0.0187530517578125,
0.01491546630859375,
-0.053497314453125,
-0.044403076171875,
-0.02178955078125,
-0.0157928466796875,
-0.00011473894119262695,
-0.0145263671875,
-0.00659942626953125,
-0.05914306640625,
0.0677490234375,
-0.01035308837890625,
0.04132080078125,
0.0157623291015625,
0.01422882080078125,
-0.0037784576416015625,
-0.004474639892578125,
0.03729248046875,
0.05908203125,
-0.044769287109375,
-0.00308990478515625,
0.021026611328125,
-0.056884765625,
0.01326751708984375,
0.011383056640625,
-0.01435089111328125,
0.00986480712890625,
0.026885986328125,
0.08074951171875,
-0.01160430908203125,
-0.0153045654296875,
0.03399658203125,
-0.005046844482421875,
-0.03375244140625,
-0.0345458984375,
0.0009813308715820312,
-0.0076446533203125,
0.0135955810546875,
0.0227508544921875,
0.0369873046875,
0.0012788772583007812,
-0.01535797119140625,
0.018829345703125,
0.012786865234375,
-0.03741455078125,
-0.02032470703125,
0.064208984375,
-0.004840850830078125,
-0.026397705078125,
0.057891845703125,
-0.01255035400390625,
-0.0380859375,
0.07781982421875,
0.036865234375,
0.0758056640625,
-0.0157928466796875,
0.00955963134765625,
0.06744384765625,
0.0255126953125,
-0.009735107421875,
0.030120849609375,
0.0176849365234375,
-0.0374755859375,
-0.0181884765625,
-0.0450439453125,
-0.019744873046875,
0.035888671875,
-0.05670166015625,
0.03369140625,
-0.035125732421875,
-0.035064697265625,
0.01210784912109375,
0.00836181640625,
-0.0712890625,
0.034942626953125,
0.0102996826171875,
0.0731201171875,
-0.077880859375,
0.05810546875,
0.05023193359375,
-0.0309906005859375,
-0.07354736328125,
-0.013763427734375,
-0.013458251953125,
-0.059173583984375,
0.06036376953125,
0.0157012939453125,
0.01212310791015625,
0.004543304443359375,
-0.042022705078125,
-0.05810546875,
0.0838623046875,
0.0347900390625,
-0.035430908203125,
0.00748443603515625,
0.039886474609375,
0.04351806640625,
-0.01447296142578125,
0.050872802734375,
0.02825927734375,
0.034881591796875,
0.0116424560546875,
-0.06640625,
0.007801055908203125,
-0.031402587890625,
0.005046844482421875,
0.004283905029296875,
-0.052398681640625,
0.077392578125,
0.007396697998046875,
0.021697998046875,
0.005962371826171875,
0.042327880859375,
0.0302581787109375,
0.0185546875,
0.04296875,
0.056396484375,
0.043609619140625,
-0.0133056640625,
0.05810546875,
-0.03033447265625,
0.05328369140625,
0.07568359375,
0.014984130859375,
0.048187255859375,
0.0290985107421875,
-0.0257568359375,
0.034454345703125,
0.06292724609375,
-0.034515380859375,
0.01206207275390625,
0.013885498046875,
0.0006570816040039062,
-0.035888671875,
0.01259613037109375,
-0.0443115234375,
0.041778564453125,
-0.004795074462890625,
-0.051788330078125,
-0.0272979736328125,
-0.0140228271484375,
-0.018829345703125,
-0.02947998046875,
-0.018951416015625,
0.0296173095703125,
-0.0224456787109375,
-0.0156707763671875,
0.056396484375,
0.01012420654296875,
0.036376953125,
-0.042083740234375,
-0.0243682861328125,
0.0039005279541015625,
0.02716064453125,
-0.0277099609375,
-0.05328369140625,
0.01910400390625,
0.00049591064453125,
-0.01013946533203125,
0.004299163818359375,
0.045074462890625,
-0.0183868408203125,
-0.07159423828125,
0.006206512451171875,
0.0222320556640625,
0.0239410400390625,
-0.00342559814453125,
-0.07586669921875,
-0.01666259765625,
-0.005279541015625,
-0.02239990234375,
0.01088714599609375,
0.0174713134765625,
-0.01122283935546875,
0.05267333984375,
0.04833984375,
0.002735137939453125,
0.00908660888671875,
-0.0013303756713867188,
0.0640869140625,
-0.049163818359375,
-0.0404052734375,
-0.05462646484375,
0.0389404296875,
-0.01300048828125,
-0.05908203125,
0.04541015625,
0.072021484375,
0.059722900390625,
-0.0162200927734375,
0.033599853515625,
0.0016355514526367188,
0.01593017578125,
-0.0177764892578125,
0.05816650390625,
-0.03717041015625,
-0.00365447998046875,
-0.0236968994140625,
-0.06494140625,
-0.00656890869140625,
0.053070068359375,
-0.0253753662109375,
0.0186920166015625,
0.040679931640625,
0.07177734375,
-0.019378662109375,
-0.00682830810546875,
0.007579803466796875,
-0.00811767578125,
0.019073486328125,
0.039642333984375,
0.0452880859375,
-0.06536865234375,
0.031982421875,
-0.0478515625,
-0.01177978515625,
-0.0030231475830078125,
-0.057891845703125,
-0.07080078125,
-0.03607177734375,
-0.0478515625,
-0.03521728515625,
-0.0080108642578125,
0.0731201171875,
0.0758056640625,
-0.057037353515625,
-0.0264892578125,
-0.017425537109375,
-0.0193023681640625,
-0.0283966064453125,
-0.016876220703125,
0.0474853515625,
-0.0020542144775390625,
-0.06182861328125,
-0.0170135498046875,
-0.0265045166015625,
0.0280914306640625,
-0.0291900634765625,
-0.0218048095703125,
-0.0209503173828125,
-0.025665283203125,
0.01397705078125,
0.00785064697265625,
-0.036407470703125,
-0.0262908935546875,
-0.01464080810546875,
-0.007465362548828125,
0.029083251953125,
0.025848388671875,
-0.05523681640625,
0.029296875,
0.035858154296875,
0.0208282470703125,
0.0579833984375,
-0.005565643310546875,
0.0238189697265625,
-0.0511474609375,
0.03350830078125,
0.0173492431640625,
0.032928466796875,
0.017669677734375,
-0.0308685302734375,
0.039764404296875,
0.04705810546875,
-0.038909912109375,
-0.053985595703125,
-0.012359619140625,
-0.0821533203125,
0.005992889404296875,
0.07855224609375,
-0.0034160614013671875,
-0.044464111328125,
0.0021038055419921875,
-0.02691650390625,
0.034881591796875,
-0.0265960693359375,
0.041656494140625,
0.03472900390625,
-0.0022716522216796875,
-0.0009946823120117188,
-0.0399169921875,
0.0297393798828125,
0.01471710205078125,
-0.05328369140625,
-0.0121612548828125,
0.0174102783203125,
0.034393310546875,
0.0110321044921875,
0.0244598388671875,
-0.002483367919921875,
0.022064208984375,
0.006832122802734375,
0.031646728515625,
-0.0257110595703125,
-0.0092010498046875,
-0.03375244140625,
0.005924224853515625,
-0.0061798095703125,
-0.04071044921875
]
] |
nreimers/BERT-Tiny_L-2_H-128_A-2 | 2021-05-28T11:05:21.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"feature-extraction",
"endpoints_compatible",
"has_space",
"region:us"
] | feature-extraction | nreimers | null | null | nreimers/BERT-Tiny_L-2_H-128_A-2 | 2 | 26,925 | transformers | 2022-03-02T23:29:05 | This is the BERT-Tiny model from Google: https://github.com/google-research/bert#bert. A BERT model with 2 layers, a hidden size of 128, and 2 attention heads. | 161 | [
[
-0.03155517578125,
-0.0633544921875,
0.0413818359375,
0.027191162109375,
0.003246307373046875,
-0.021331787109375,
-0.005657196044921875,
-0.0204315185546875,
0.034423828125,
0.042633056640625,
-0.060943603515625,
-0.007678985595703125,
-0.029083251953125,
-0.037200927734375,
-0.039306640625,
0.06805419921875,
0.0347900390625,
0.0307159423828125,
0.00885009765625,
0.0032215118408203125,
-0.010406494140625,
-0.0034427642822265625,
-0.060943603515625,
-0.0450439453125,
0.08245849609375,
0.0361328125,
0.056488037109375,
0.0245819091796875,
0.06121826171875,
0.01097869873046875,
-0.001232147216796875,
-0.024871826171875,
-0.04571533203125,
-0.03619384765625,
-0.002323150634765625,
-0.0132904052734375,
-0.04876708984375,
0.00972747802734375,
0.05548095703125,
0.050567626953125,
-0.0005679130554199219,
0.0433349609375,
0.0010013580322265625,
0.0509033203125,
-0.0201873779296875,
0.00942230224609375,
-0.01425933837890625,
-0.002498626708984375,
0.01532745361328125,
0.02606201171875,
-0.0252227783203125,
-0.004009246826171875,
0.049468994140625,
-0.030242919921875,
0.03558349609375,
-0.01120758056640625,
0.0665283203125,
0.0214385986328125,
-0.026092529296875,
-0.0052947998046875,
-0.050933837890625,
0.055511474609375,
-0.0238494873046875,
0.03680419921875,
0.0305633544921875,
0.036834716796875,
0.01180267333984375,
-0.0679931640625,
-0.010101318359375,
-0.0011653900146484375,
0.00803375244140625,
-0.0010805130004882812,
-0.01207733154296875,
-0.0124359130859375,
0.01288604736328125,
0.011627197265625,
-0.0479736328125,
-0.01068878173828125,
-0.0406494140625,
0.0004000663757324219,
0.03369140625,
-0.006023406982421875,
0.01000213623046875,
0.00879669189453125,
-0.044769287109375,
0.005340576171875,
-0.03302001953125,
-0.00644683837890625,
0.036346435546875,
-0.0004897117614746094,
-0.0270233154296875,
0.038909912109375,
0.002841949462890625,
0.06683349609375,
0.0157012939453125,
0.0029354095458984375,
0.01373291015625,
0.0149993896484375,
-0.032257080078125,
-0.01178741455078125,
0.019805908203125,
0.0165863037109375,
0.0021610260009765625,
-0.027069091796875,
-0.0292205810546875,
-0.01528167724609375,
0.04644775390625,
-0.057281494140625,
-0.045135498046875,
0.002323150634765625,
-0.060577392578125,
-0.02325439453125,
0.00029969215393066406,
-0.030670166015625,
-0.003688812255859375,
-0.00695037841796875,
0.065185546875,
-0.037811279296875,
-0.027252197265625,
-0.00646209716796875,
-0.004932403564453125,
0.034454345703125,
0.007572174072265625,
-0.05340576171875,
0.0316162109375,
0.0281829833984375,
0.0499267578125,
0.0145416259765625,
-0.00632476806640625,
0.0094146728515625,
0.0166015625,
-0.034149169921875,
0.0277557373046875,
-0.048431396484375,
-0.01288604736328125,
-0.0187530517578125,
0.0197601318359375,
0.00978851318359375,
-0.004425048828125,
0.055419921875,
-0.062103271484375,
0.0321044921875,
-0.03900146484375,
-0.022918701171875,
-0.05206298828125,
0.0301361083984375,
-0.07635498046875,
0.08111572265625,
0.03619384765625,
-0.0450439453125,
0.01285552978515625,
-0.07281494140625,
-0.003429412841796875,
0.031829833984375,
0.0014209747314453125,
-0.037261962890625,
0.012298583984375,
-0.01042938232421875,
0.0270233154296875,
-0.01947021484375,
0.013031005859375,
-0.02020263671875,
-0.035552978515625,
0.00739288330078125,
-0.0017805099487304688,
0.06817626953125,
0.0277557373046875,
-0.00850677490234375,
0.016387939453125,
-0.0665283203125,
0.01012420654296875,
0.0266571044921875,
-0.0305633544921875,
-0.01024627685546875,
-0.01459503173828125,
0.02398681640625,
0.019256591796875,
0.0570068359375,
-0.0430908203125,
0.00016319751739501953,
0.023712158203125,
0.0474853515625,
0.0247650146484375,
-0.007175445556640625,
0.030029296875,
-0.040771484375,
0.003082275390625,
-0.0082550048828125,
0.0252227783203125,
0.0154571533203125,
-0.054229736328125,
-0.0517578125,
-0.02899169921875,
0.043243408203125,
0.01189422607421875,
-0.03424072265625,
0.04388427734375,
-0.0005869865417480469,
-0.048828125,
-0.030120849609375,
0.0020694732666015625,
0.025360107421875,
0.0184783935546875,
0.00435638427734375,
-0.0276336669921875,
-0.007709503173828125,
-0.11614990234375,
0.02532958984375,
-0.0006794929504394531,
-0.033935546875,
0.05206298828125,
0.027801513671875,
-0.040679931640625,
0.0560302734375,
-0.0357666015625,
-0.022613525390625,
-0.00852203369140625,
0.005084991455078125,
0.021575927734375,
0.041839599609375,
0.06329345703125,
-0.04248046875,
-0.02825927734375,
-0.0330810546875,
-0.031585693359375,
0.0175323486328125,
0.0006256103515625,
-0.027130126953125,
-0.0251007080078125,
-0.00565338134765625,
-0.044525146484375,
-0.007389068603515625,
0.047149658203125,
-0.0182952880859375,
0.0237274169921875,
-0.021392822265625,
-0.0159759521484375,
-0.058380126953125,
0.03204345703125,
0.02764892578125,
-0.0115966796875,
-0.039794921875,
0.034942626953125,
0.0419921875,
-0.0179901123046875,
-0.022308349609375,
0.0203399658203125,
-0.0452880859375,
-0.0203857421875,
0.0013332366943359375,
-0.019439697265625,
-0.0018825531005859375,
0.054656982421875,
-0.01422119140625,
0.053253173828125,
0.05633544921875,
-0.0283203125,
0.03997802734375,
0.0083465576171875,
-0.037078857421875,
0.012481689453125,
-0.069580078125,
0.04840087890625,
0.004383087158203125,
0.0105133056640625,
-0.0814208984375,
-0.060699462890625,
-0.0014715194702148438,
-0.0274658203125,
0.005092620849609375,
0.0026187896728515625,
-0.06634521484375,
-0.0474853515625,
-0.03662109375,
0.021392822265625,
0.0300445556640625,
-0.031341552734375,
0.0310211181640625,
0.041107177734375,
-0.0282440185546875,
-0.0205230712890625,
-0.064208984375,
-0.0014934539794921875,
-0.0015764236450195312,
-0.048675537109375,
0.0296630859375,
-0.021331787109375,
-0.0009975433349609375,
0.026580810546875,
0.0245361328125,
-0.048065185546875,
-0.01136016845703125,
0.00933074951171875,
0.023162841796875,
-0.0296173095703125,
0.0255279541015625,
-0.00910186767578125,
0.02935791015625,
0.02880859375,
-0.00372314453125,
0.053863525390625,
-0.0321044921875,
0.0205841064453125,
-0.03167724609375,
0.02923583984375,
0.049835205078125,
0.0176849365234375,
0.05145263671875,
0.06689453125,
-0.05218505859375,
-0.0020618438720703125,
-0.04827880859375,
-0.0266265869140625,
-0.03240966796875,
0.0160064697265625,
-0.029571533203125,
-0.02630615234375,
0.05157470703125,
0.023468017578125,
-0.00945281982421875,
0.040496826171875,
0.05487060546875,
0.0004703998565673828,
0.0892333984375,
0.04840087890625,
-0.017486572265625,
0.046478271484375,
-0.03485107421875,
-0.0052947998046875,
-0.0655517578125,
-0.032684326171875,
-0.0023288726806640625,
-0.046234130859375,
-0.01702880859375,
-0.0227813720703125,
0.016571044921875,
-0.01128387451171875,
-0.0638427734375,
0.0274505615234375,
-0.041015625,
0.043701171875,
0.0909423828125,
0.06048583984375,
-0.0245208740234375,
-0.005672454833984375,
-0.0033054351806640625,
0.0129241943359375,
-0.034576416015625,
-0.03607177734375,
0.0703125,
0.056549072265625,
0.0582275390625,
0.0252838134765625,
0.0131988525390625,
0.0269622802734375,
-0.0108489990234375,
-0.043212890625,
0.03515625,
-0.00402069091796875,
-0.11370849609375,
-0.03521728515625,
-0.0159912109375,
-0.061798095703125,
0.005596160888671875,
-0.01727294921875,
-0.0286407470703125,
-0.002490997314453125,
-0.0131988525390625,
-0.0389404296875,
0.00913238525390625,
-0.072265625,
0.046600341796875,
-0.00901031494140625,
-0.00916290283203125,
-0.01092529296875,
-0.058074951171875,
0.03741455078125,
-0.0010957717895507812,
-0.02392578125,
-0.003658294677734375,
0.01178741455078125,
0.076171875,
0.0079803466796875,
0.05584716796875,
-0.01558685302734375,
-0.009796142578125,
0.036529541015625,
0.0018205642700195312,
0.033966064453125,
0.0200653076171875,
0.02301025390625,
-0.00690460205078125,
-0.0279541015625,
-0.05059814453125,
0.003826141357421875,
0.06982421875,
-0.0823974609375,
-0.007778167724609375,
0.0013666152954101562,
-0.07080078125,
-0.01097869873046875,
0.037261962890625,
0.017669677734375,
0.03570556640625,
-0.003993988037109375,
0.053924560546875,
0.06524658203125,
-0.0128631591796875,
0.036956787109375,
0.0290374755859375,
-0.004119873046875,
-0.0007128715515136719,
0.0697021484375,
0.00988006591796875,
-0.0034008026123046875,
0.01276397705078125,
0.007389068603515625,
-0.018585205078125,
-0.0191650390625,
-0.0174102783203125,
0.033447265625,
-0.023040771484375,
0.00780487060546875,
-0.038055419921875,
-0.052337646484375,
-0.033935546875,
-0.0021762847900390625,
-0.0309906005859375,
-0.037841796875,
-0.04241943359375,
0.00482177734375,
0.01033782958984375,
0.0672607421875,
-0.002445220947265625,
0.054656982421875,
-0.032470703125,
0.0241546630859375,
0.0732421875,
0.061492919921875,
-0.0042877197265625,
-0.037200927734375,
-0.01490020751953125,
-0.00925445556640625,
-0.04315185546875,
-0.038970947265625,
0.040679931640625,
-0.00383758544921875,
0.05633544921875,
0.04974365234375,
-0.0256500244140625,
0.0096893310546875,
-0.049041748046875,
0.056976318359375,
0.048828125,
-0.039581298828125,
0.0229339599609375,
-0.048919677734375,
0.01427459716796875,
0.0049285888671875,
0.0245361328125,
-0.006710052490234375,
-0.00942230224609375,
-0.08038330078125,
-0.04974365234375,
0.044708251953125,
0.018707275390625,
0.00829315185546875,
0.021484375,
0.00366973876953125,
0.0131072998046875,
0.016021728515625,
-0.0648193359375,
-0.0301666259765625,
-0.0118255615234375,
-0.0138092041015625,
0.0046539306640625,
-0.0594482421875,
-0.0325927734375,
-0.0002892017364501953,
0.031646728515625,
0.0125732421875,
0.056732177734375,
0.0023555755615234375,
-0.0139617919921875,
-0.006832122802734375,
-0.016326904296875,
0.054168701171875,
0.0478515625,
-0.06231689453125,
0.021636962890625,
-0.00627899169921875,
-0.04083251953125,
-0.0213623046875,
-0.006183624267578125,
-0.005535125732421875,
0.005077362060546875,
0.0299224853515625,
0.046600341796875,
0.03143310546875,
-0.00858306884765625,
0.023956298828125,
0.0037860870361328125,
-0.03070068359375,
-0.037384033203125,
0.007587432861328125,
0.017486572265625,
0.0296478271484375,
0.04083251953125,
-0.0077972412109375,
0.033721923828125,
-0.029937744140625,
0.022918701171875,
0.02301025390625,
-0.051300048828125,
-0.03582763671875,
0.048095703125,
0.030853271484375,
-0.0352783203125,
0.03765869140625,
-0.0162200927734375,
-0.0247955322265625,
0.049591064453125,
0.038177490234375,
0.0455322265625,
-0.0049285888671875,
0.031707763671875,
0.0246734619140625,
0.0289154052734375,
-0.0076141357421875,
0.036346435546875,
0.0182342529296875,
-0.037200927734375,
-0.00583648681640625,
-0.0177764892578125,
-0.045562744140625,
0.05584716796875,
-0.06964111328125,
0.003452301025390625,
-0.049407958984375,
-0.031829833984375,
0.031951904296875,
0.03192138671875,
-0.0667724609375,
0.0296173095703125,
0.0251007080078125,
0.0770263671875,
-0.0361328125,
0.08447265625,
0.056488037109375,
-0.006908416748046875,
-0.0433349609375,
-0.02264404296875,
-0.007793426513671875,
-0.08544921875,
0.04266357421875,
0.0095367431640625,
0.002811431884765625,
0.001983642578125,
-0.06793212890625,
-0.0836181640625,
0.0806884765625,
0.0287628173828125,
-0.044830322265625,
-0.01397705078125,
-0.03900146484375,
0.0287628173828125,
-0.033203125,
0.002429962158203125,
0.0311737060546875,
0.0196533203125,
0.004611968994140625,
-0.0450439453125,
-0.0173797607421875,
-0.02099609375,
0.0108489990234375,
-0.004611968994140625,
-0.068115234375,
0.1055908203125,
-0.01306915283203125,
0.0095367431640625,
0.011016845703125,
0.019989013671875,
0.01715087890625,
0.00037217140197753906,
0.04925537109375,
0.04876708984375,
0.0261688232421875,
-0.002704620361328125,
0.064453125,
-0.0191802978515625,
0.06976318359375,
0.065673828125,
-0.004352569580078125,
0.034027099609375,
0.019927978515625,
-0.0000616312026977539,
0.02899169921875,
0.060760498046875,
-0.0228424072265625,
0.06982421875,
-0.002353668212890625,
-0.00681304931640625,
0.00574493408203125,
0.0059967041015625,
-0.043487548828125,
0.007137298583984375,
0.0390625,
-0.04925537109375,
-0.0127716064453125,
-0.031829833984375,
-0.0245208740234375,
-0.041900634765625,
-0.039154052734375,
0.036651611328125,
-0.0188751220703125,
-0.01245880126953125,
0.0128631591796875,
-0.0007071495056152344,
0.0208587646484375,
-0.026824951171875,
0.00146484375,
0.00418853759765625,
0.016021728515625,
0.0189208984375,
-0.083251953125,
-0.01465606689453125,
-0.0132904052734375,
-0.03680419921875,
0.004543304443359375,
0.056610107421875,
-0.0288848876953125,
-0.06695556640625,
0.038604736328125,
0.01342010498046875,
0.020599365234375,
-0.00444793701171875,
-0.0745849609375,
-0.0061798095703125,
-0.016937255859375,
-0.036895751953125,
0.012664794921875,
0.039703369140625,
0.017364501953125,
0.039886474609375,
0.01377105712890625,
0.0059051513671875,
0.025848388671875,
0.0187835693359375,
0.057403564453125,
-0.05548095703125,
-0.048492431640625,
-0.015869140625,
0.00872039794921875,
-0.031158447265625,
-0.0305328369140625,
0.03326416015625,
0.04620361328125,
0.03533935546875,
-0.039886474609375,
0.040252685546875,
-0.001529693603515625,
0.04876708984375,
-0.00926971435546875,
0.049163818359375,
-0.03857421875,
-0.0179595947265625,
-0.007808685302734375,
-0.08538818359375,
-0.01702880859375,
0.061676025390625,
0.0098419189453125,
0.002986907958984375,
0.03680419921875,
0.03314208984375,
-0.01424407958984375,
-0.01129913330078125,
0.040557861328125,
0.0202484130859375,
0.0070343017578125,
0.048919677734375,
0.037261962890625,
-0.01104736328125,
0.0200653076171875,
-0.035919189453125,
-0.027435302734375,
-0.04364013671875,
-0.027679443359375,
-0.070556640625,
-0.0428466796875,
-0.0037059783935546875,
-0.0035858154296875,
-0.01216888427734375,
0.03265380859375,
0.09539794921875,
-0.052398681640625,
-0.006824493408203125,
0.0029125213623046875,
-0.00807952880859375,
0.00106048583984375,
-0.0138397216796875,
0.0247955322265625,
-0.0117645263671875,
-0.05303955078125,
0.0093536376953125,
0.0062408447265625,
0.0094146728515625,
-0.0335693359375,
0.002777099609375,
-0.020965576171875,
0.0132904052734375,
0.055419921875,
-0.01261138916015625,
-0.06024169921875,
-0.053955078125,
-0.006069183349609375,
-0.0307159423828125,
-0.0280303955078125,
0.039947509765625,
-0.0209808349609375,
0.002475738525390625,
0.0158233642578125,
0.03485107421875,
0.041717529296875,
-0.0081939697265625,
0.0131683349609375,
-0.0806884765625,
0.01436614990234375,
0.00791168212890625,
0.038604736328125,
0.01031494140625,
-0.0372314453125,
0.06365966796875,
-0.004032135009765625,
-0.0516357421875,
-0.056121826171875,
0.0014858245849609375,
-0.10772705078125,
-0.025726318359375,
0.056640625,
-0.01049041748046875,
-0.01763916015625,
0.03631591796875,
-0.019775390625,
0.0074310302734375,
-0.03857421875,
0.0731201171875,
0.07781982421875,
0.007389068603515625,
-0.039031982421875,
-0.0516357421875,
-0.0049896240234375,
0.0043487548828125,
-0.02850341796875,
-0.036163330078125,
0.0168304443359375,
0.0325927734375,
0.051025390625,
0.007587432861328125,
-0.00968170166015625,
0.0242462158203125,
0.0158233642578125,
0.01538848876953125,
0.014129638671875,
-0.07476806640625,
-0.0038242340087890625,
0.021240234375,
-0.00799560546875,
-0.0552978515625
]
] |
medical-ner-proj/bert-medical-ner-proj | 2023-05-07T02:46:47.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"license:openrail",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | medical-ner-proj | null | null | medical-ner-proj/bert-medical-ner-proj | 20 | 26,922 | transformers | 2023-05-05T04:33:38 | ---
license: openrail
widget:
- example_title: "example 1"
  text: "John Doe has a history of hypertension, which is well-controlled with medication. He has no history of allergies or surgeries. He is not currently taking any medication except for his blood pressure medication."
- example_title: "example 2"
  text: "On physical examination, John Doe appears acutely ill. He has a temperature of 38.5°C and a heart rate of 105 beats per minute. His blood pressure is 140/90 mmHg, and his oxygen saturation is 90% on room air. His lungs have diminished breath sounds and wheezing. There is no cyanosis, and his heart sounds are normal."
- example_title: "example 3"
  text: "Based on Mary Smith's symptoms and physical examination, she is suspected to have suffered a stroke, likely caused by hypertension. Her history of migraines may also be a contributing factor."
---
Medical-document NER model built by fine-tuning BERT.
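A minimal usage sketch with the standard Transformers token-classification pipeline; the entity labels it returns are defined by this model's fine-tuning and are not documented here:
```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="medical-ner-proj/bert-medical-ner-proj",
    aggregation_strategy="simple",  # merge sub-word tokens into whole entities
)

text = "John Doe has a history of hypertension, which is well-controlled with medication."
for entity in ner(text):
    # Each entity is a dict with entity_group, word, score, start, and end.
    print(f"{entity['entity_group']}: {entity['word']} ({entity['score']:.3f})")
```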
| 915 | [
[
-0.01149749755859375,
-0.0667724609375,
0.06219482421875,
-0.0247039794921875,
-0.01308441162109375,
-0.04888916015625,
0.00591278076171875,
-0.02392578125,
0.034027099609375,
0.0577392578125,
-0.0278778076171875,
-0.0311126708984375,
-0.056854248046875,
0.006404876708984375,
-0.0298614501953125,
0.10430908203125,
0.00971221923828125,
0.0222930908203125,
0.0004978179931640625,
0.025665283203125,
-0.0308380126953125,
-0.034637451171875,
-0.0550537109375,
-0.043243408203125,
0.044677734375,
0.0203857421875,
0.06390380859375,
0.024322509765625,
0.061981201171875,
0.0198974609375,
-0.0278167724609375,
-0.02081298828125,
-0.028717041015625,
0.0303955078125,
0.00865936279296875,
-0.0117340087890625,
-0.056671142578125,
-0.01395416259765625,
0.03875732421875,
0.0189361572265625,
0.0018901824951171875,
-0.004146575927734375,
-0.0095367431640625,
0.06280517578125,
-0.005161285400390625,
0.00007587671279907227,
-0.02197265625,
-0.010986328125,
-0.00385284423828125,
-0.0200347900390625,
-0.026092529296875,
-0.01148223876953125,
0.0361328125,
-0.050079345703125,
0.026763916015625,
-0.00919342041015625,
0.10833740234375,
0.0200653076171875,
-0.04052734375,
-0.01457977294921875,
-0.04473876953125,
0.036346435546875,
-0.072265625,
0.020111083984375,
0.039581298828125,
0.006191253662109375,
-0.0352783203125,
-0.055206298828125,
-0.04058837890625,
-0.01763916015625,
-0.0085296630859375,
0.015716552734375,
-0.029693603515625,
0.0008602142333984375,
0.0272064208984375,
0.035400390625,
-0.044586181640625,
-0.0021991729736328125,
-0.03692626953125,
-0.0178070068359375,
0.0215606689453125,
0.045989990234375,
0.016754150390625,
-0.0389404296875,
-0.026580810546875,
-0.004657745361328125,
-0.01189422607421875,
-0.0029964447021484375,
0.0189056396484375,
0.025421142578125,
-0.0262908935546875,
0.040435791015625,
-0.005542755126953125,
0.082763671875,
0.0161590576171875,
-0.032135009765625,
0.0386962890625,
-0.0123291015625,
-0.03094482421875,
-0.0004665851593017578,
0.039825439453125,
0.0379638671875,
0.020782470703125,
0.00597381591796875,
-0.0216217041015625,
-0.01470184326171875,
0.047515869140625,
-0.0634765625,
-0.049591064453125,
0.0102081298828125,
-0.06414794921875,
-0.00211334228515625,
-0.0023937225341796875,
-0.042510986328125,
-0.0269927978515625,
-0.042144775390625,
0.052581787109375,
-0.051422119140625,
-0.0090484619140625,
-0.0174102783203125,
-0.03607177734375,
-0.0016937255859375,
0.023468017578125,
-0.04425048828125,
0.019775390625,
0.0194854736328125,
0.033599853515625,
0.02215576171875,
0.0059661865234375,
-0.016754150390625,
0.005542755126953125,
-0.00890350341796875,
0.063720703125,
-0.02984619140625,
-0.042572021484375,
-0.01412200927734375,
0.01337432861328125,
0.0003914833068847656,
-0.01471710205078125,
0.0555419921875,
-0.00962066650390625,
0.0241851806640625,
-0.0311431884765625,
-0.044708251953125,
-0.02301025390625,
0.01070404052734375,
-0.053009033203125,
0.0504150390625,
0.0170745849609375,
-0.06927490234375,
0.039703369140625,
-0.046600341796875,
-0.043548583984375,
0.0009627342224121094,
-0.00214385986328125,
-0.042083740234375,
0.01143646240234375,
0.00669097900390625,
0.033660888671875,
-0.00926971435546875,
0.01186370849609375,
-0.0201568603515625,
-0.01525115966796875,
0.019683837890625,
-0.01558685302734375,
0.044158935546875,
0.00311279296875,
0.017578125,
0.0063629150390625,
-0.065673828125,
0.01372528076171875,
0.004444122314453125,
-0.031829833984375,
-0.04229736328125,
0.00824737548828125,
0.0330810546875,
0.01019287109375,
0.0298614501953125,
-0.041748046875,
0.024383544921875,
-0.0457763671875,
0.007236480712890625,
0.0235748291015625,
0.0482177734375,
0.0111083984375,
-0.060546875,
0.0106964111328125,
-0.0110626220703125,
0.038482666015625,
0.01160430908203125,
-0.05712890625,
-0.058319091796875,
-0.023284912109375,
0.0120849609375,
0.035919189453125,
-0.02752685546875,
0.033599853515625,
-0.0045928955078125,
-0.043609619140625,
-0.022491455078125,
-0.0028553009033203125,
0.03155517578125,
0.086669921875,
0.039215087890625,
-0.00434112548828125,
-0.035675048828125,
-0.08148193359375,
-0.00487518310546875,
-0.005039215087890625,
-0.0021209716796875,
0.03143310546875,
0.031646728515625,
-0.028656005859375,
0.0279388427734375,
-0.05029296875,
-0.043243408203125,
-0.01508331298828125,
0.03802490234375,
0.055816650390625,
0.037017822265625,
0.0295562744140625,
-0.052337646484375,
-0.02325439453125,
0.00217437744140625,
-0.034088134765625,
-0.0027294158935546875,
-0.004779815673828125,
-0.002490997314453125,
0.03173828125,
0.043701171875,
-0.019287109375,
0.06365966796875,
0.0236663818359375,
-0.03765869140625,
0.0206451416015625,
-0.06494140625,
-0.0020771026611328125,
-0.10504150390625,
0.02044677734375,
0.0191650390625,
0.003353118896484375,
-0.0526123046875,
-0.01241302490234375,
0.006549835205078125,
-0.00009816884994506836,
-0.0118408203125,
0.041015625,
-0.032562255859375,
0.03094482421875,
-0.043609619140625,
-0.0158538818359375,
0.0164947509765625,
0.0010976791381835938,
-0.007244110107421875,
0.01035308837890625,
0.0281982421875,
-0.048797607421875,
-0.0011644363403320312,
0.02337646484375,
0.00002276897430419922,
0.042205810546875,
-0.06683349609375,
-0.0192718505859375,
-0.017059326171875,
-0.0072174072265625,
-0.050567626953125,
-0.031494140625,
-0.01348876953125,
-0.047088623046875,
0.043182373046875,
-0.0024204254150390625,
-0.05908203125,
-0.0131988525390625,
-0.0007576942443847656,
-0.0269317626953125,
0.042144775390625,
0.002166748046875,
0.05413818359375,
0.0201568603515625,
-0.0241851806640625,
-0.0124664306640625,
-0.035125732421875,
-0.00445556640625,
0.00807952880859375,
-0.033538818359375,
0.037750244140625,
-0.01015472412109375,
-0.007465362548828125,
0.0207366943359375,
-0.01097869873046875,
-0.0601806640625,
0.0003063678741455078,
0.0433349609375,
0.04718017578125,
-0.004474639892578125,
0.03387451171875,
0.006557464599609375,
-0.005939483642578125,
0.0059661865234375,
0.033721923828125,
0.0242462158203125,
-0.005939483642578125,
0.0031833648681640625,
-0.053558349609375,
0.04486083984375,
0.051483154296875,
0.018798828125,
0.032073974609375,
0.04425048828125,
-0.0455322265625,
0.039154052734375,
-0.05352783203125,
-0.0222930908203125,
-0.0245361328125,
0.0182342529296875,
-0.0179443359375,
-0.025604248046875,
0.055206298828125,
0.02471923828125,
-0.01611328125,
0.062286376953125,
0.0472412109375,
-0.036468505859375,
0.068359375,
0.049896240234375,
0.00511932373046875,
0.032867431640625,
0.0018177032470703125,
0.0166015625,
-0.06671142578125,
-0.0292816162109375,
-0.0577392578125,
-0.0252532958984375,
-0.0550537109375,
0.0119171142578125,
0.0543212890625,
-0.0063018798828125,
-0.030609130859375,
0.0222930908203125,
-0.05523681640625,
0.030242919921875,
0.051971435546875,
0.03802490234375,
-0.0033359527587890625,
-0.0179290771484375,
-0.0298614501953125,
-0.00836181640625,
-0.043426513671875,
-0.030975341796875,
0.07855224609375,
0.0147705078125,
0.06732177734375,
0.0027923583984375,
0.0577392578125,
-0.01117706298828125,
0.0224456787109375,
-0.037384033203125,
0.00974273681640625,
-0.03973388671875,
-0.093994140625,
-0.038970947265625,
-0.0088348388671875,
-0.087890625,
-0.0273284912109375,
-0.06805419921875,
-0.057403564453125,
0.00936126708984375,
0.01531219482421875,
-0.01922607421875,
0.0025463104248046875,
-0.0660400390625,
0.046600341796875,
-0.01505279541015625,
-0.013946533203125,
0.01213836669921875,
-0.055755615234375,
0.0176849365234375,
-0.0006823539733886719,
0.003082275390625,
-0.0026340484619140625,
0.00946807861328125,
0.05517578125,
-0.04742431640625,
0.0504150390625,
-0.0143280029296875,
0.0316162109375,
-0.00716400146484375,
-0.0105133056640625,
0.0260772705078125,
-0.00978851318359375,
-0.006008148193359375,
0.0015020370483398438,
0.0173492431640625,
-0.027740478515625,
-0.026611328125,
0.06182861328125,
-0.05267333984375,
-0.0142059326171875,
-0.05133056640625,
-0.061431884765625,
0.008087158203125,
0.03692626953125,
0.03765869140625,
0.032196044921875,
-0.0018625259399414062,
0.01168060302734375,
0.03717041015625,
-0.01506805419921875,
0.02874755859375,
0.057647705078125,
0.009765625,
-0.0308380126953125,
0.0257720947265625,
0.0186309814453125,
0.0038127899169921875,
0.025054931640625,
0.034942626953125,
-0.01354217529296875,
-0.0281982421875,
-0.01129150390625,
0.0196075439453125,
-0.04180908203125,
-0.0161285400390625,
-0.04541015625,
-0.03656005859375,
-0.031768798828125,
-0.0188751220703125,
0.0076751708984375,
-0.0021533966064453125,
-0.058349609375,
0.01505279541015625,
0.060394287109375,
0.0682373046875,
0.01061248779296875,
0.027099609375,
-0.0657958984375,
0.037872314453125,
0.0208892822265625,
0.018585205078125,
-0.0008826255798339844,
-0.047943115234375,
-0.00583648681640625,
-0.03936767578125,
-0.0306549072265625,
-0.07281494140625,
0.044464111328125,
-0.004161834716796875,
0.05419921875,
0.034149169921875,
0.0142974853515625,
0.07830810546875,
-0.033935546875,
0.04498291015625,
0.027130126953125,
-0.057861328125,
0.05340576171875,
-0.030242919921875,
0.031097412109375,
0.046783447265625,
0.03631591796875,
-0.0200042724609375,
-0.026397705078125,
-0.0643310546875,
-0.06927490234375,
0.0197906494140625,
0.004291534423828125,
0.004077911376953125,
-0.00576019287109375,
0.030975341796875,
0.0017337799072265625,
-0.027587890625,
-0.0587158203125,
-0.01302337646484375,
-0.00418853759765625,
-0.0017833709716796875,
0.0009860992431640625,
-0.0225830078125,
-0.01261138916015625,
-0.05535888671875,
0.0595703125,
0.00847625732421875,
0.01715087890625,
0.05615234375,
0.0174102783203125,
-0.006519317626953125,
0.025726318359375,
0.05853271484375,
0.051300048828125,
-0.0662841796875,
-0.0123138427734375,
0.0006442070007324219,
-0.0389404296875,
-0.019744873046875,
0.034423828125,
-0.01256561279296875,
0.0287628173828125,
0.035308837890625,
0.05084228515625,
0.0185699462890625,
-0.0148468017578125,
0.043853759765625,
-0.0184326171875,
-0.041351318359375,
-0.03826904296875,
0.0267486572265625,
0.007389068603515625,
-0.0014295578002929688,
0.0212554931640625,
0.01059722900390625,
0.013427734375,
-0.032867431640625,
0.00823974609375,
0.0263671875,
-0.043731689453125,
0.0006875991821289062,
0.0469970703125,
0.022979736328125,
-0.029022216796875,
0.051727294921875,
0.00466156005859375,
-0.013031005859375,
0.03204345703125,
0.04052734375,
0.040130615234375,
-0.0025272369384765625,
-0.0240936279296875,
0.036956787109375,
0.0279083251953125,
0.01245880126953125,
0.07635498046875,
0.033355712890625,
-0.0184326171875,
-0.0262908935546875,
-0.0362548828125,
-0.032073974609375,
0.03350830078125,
-0.0787353515625,
0.024169921875,
-0.0321044921875,
-0.05157470703125,
0.0557861328125,
-0.0017461776733398438,
-0.05792236328125,
0.043182373046875,
-0.01297760009765625,
0.07098388671875,
-0.07415771484375,
0.07220458984375,
0.04669189453125,
-0.0306549072265625,
-0.067138671875,
0.0112457275390625,
0.0010728836059570312,
-0.036224365234375,
0.05157470703125,
-0.0082244873046875,
0.03350830078125,
-0.032867431640625,
-0.0223388671875,
-0.06494140625,
0.067138671875,
0.00018453598022460938,
-0.04632568359375,
0.00011259317398071289,
-0.056304931640625,
0.05963134765625,
0.00225067138671875,
0.01525115966796875,
0.006072998046875,
0.048492431640625,
-0.01224517822265625,
-0.0789794921875,
-0.0054931640625,
-0.03143310546875,
-0.00860595703125,
0.040008544921875,
-0.01091766357421875,
0.09417724609375,
-0.0279388427734375,
0.01329803466796875,
0.005123138427734375,
0.0008368492126464844,
0.00431060791015625,
0.05072021484375,
0.0196533203125,
0.061309814453125,
0.04840087890625,
-0.008392333984375,
0.10101318359375,
-0.0277252197265625,
0.00684356689453125,
0.0877685546875,
-0.0162353515625,
0.07501220703125,
0.05340576171875,
0.0027294158935546875,
0.056488037109375,
0.0243072509765625,
-0.008270263671875,
0.036712646484375,
0.016448974609375,
-0.01039886474609375,
-0.01065826416015625,
0.00600433349609375,
-0.054168701171875,
0.0287017822265625,
0.03936767578125,
-0.06732177734375,
-0.018585205078125,
-0.0162200927734375,
0.0175323486328125,
0.003040313720703125,
-0.0200042724609375,
0.06304931640625,
-0.02093505859375,
-0.04638671875,
0.0460205078125,
-0.014007568359375,
0.018157958984375,
-0.04498291015625,
-0.017059326171875,
0.0091094970703125,
0.006549835205078125,
-0.0177154541015625,
-0.049407958984375,
0.0308990478515625,
0.0031642913818359375,
-0.0670166015625,
-0.00506591796875,
0.055877685546875,
0.015350341796875,
-0.0521240234375,
0.00952911376953125,
0.027191162109375,
0.032073974609375,
0.01497650146484375,
-0.039825439453125,
-0.01256561279296875,
-0.00792694091796875,
-0.0036411285400390625,
0.022430419921875,
0.024169921875,
0.0099945068359375,
0.0267486572265625,
0.064208984375,
0.0277862548828125,
-0.0253143310546875,
0.0012769699096679688,
0.055450439453125,
-0.053985595703125,
-0.024169921875,
-0.0400390625,
0.0231475830078125,
-0.0262908935546875,
-0.06365966796875,
0.037750244140625,
0.039794921875,
0.03338623046875,
-0.0133056640625,
0.0312347412109375,
-0.004222869873046875,
0.0662841796875,
-0.042144775390625,
0.08148193359375,
-0.050872802734375,
-0.0201568603515625,
-0.034332275390625,
-0.040496826171875,
-0.02716064453125,
0.056243896484375,
-0.0219268798828125,
-0.00926971435546875,
0.072265625,
0.054901123046875,
0.0166015625,
-0.0158843994140625,
0.006282806396484375,
0.0323486328125,
0.006687164306640625,
0.040313720703125,
0.0115509033203125,
-0.032989501953125,
0.022979736328125,
-0.0166168212890625,
-0.01204681396484375,
-0.045562744140625,
-0.0753173828125,
-0.072021484375,
-0.051025390625,
-0.0237274169921875,
-0.04229736328125,
0.0009164810180664062,
0.040985107421875,
0.06781005859375,
-0.040985107421875,
0.021331787109375,
0.00534820556640625,
-0.0027923583984375,
-0.0038890838623046875,
-0.01495361328125,
0.0279998779296875,
0.00244140625,
-0.0255126953125,
0.0143585205078125,
0.0187225341796875,
0.0494384765625,
0.0138092041015625,
0.006175994873046875,
-0.01275634765625,
-0.00295257568359375,
0.018310546875,
0.0184478759765625,
-0.06884765625,
-0.004474639892578125,
-0.0027923583984375,
-0.032257080078125,
0.00119781494140625,
0.021728515625,
-0.042266845703125,
0.0545654296875,
0.04888916015625,
0.004993438720703125,
0.046966552734375,
-0.0127716064453125,
0.003696441650390625,
-0.03497314453125,
-0.0083160400390625,
0.037750244140625,
0.034027099609375,
0.00848388671875,
-0.037109375,
0.0033092498779296875,
-0.0124664306640625,
-0.055755615234375,
-0.049835205078125,
-0.0112152099609375,
-0.1016845703125,
-0.02655029296875,
0.049072265625,
0.0107574462890625,
-0.0180816650390625,
-0.01422119140625,
-0.013641357421875,
0.017242431640625,
-0.049072265625,
0.05609130859375,
0.054840087890625,
-0.04888916015625,
0.01483154296875,
-0.0706787109375,
0.00177764892578125,
0.023712158203125,
-0.041839599609375,
-0.0283203125,
0.0283203125,
0.047576904296875,
0.0214385986328125,
0.037200927734375,
-0.00923919677734375,
0.035614013671875,
0.006137847900390625,
0.022216796875,
-0.0117034912109375,
-0.0008053779602050781,
-0.01142120361328125,
0.0011281967163085938,
0.003795623779296875,
-0.01483154296875
]
] |
facebook/esm2_t33_650M_UR50D | 2023-03-21T15:05:12.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"esm",
"fill-mask",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | facebook | null | null | facebook/esm2_t33_650M_UR50D | 13 | 26,897 | transformers | 2022-09-27T14:36:16 | ---
license: mit
widget:
- text: "MQIFVKTLTGKTITLEVEPS<mask>TIENVKAKIQDKEGIPPDQQRLIFAGKQLEDGRTLSDYNIQKESTLHLVLRLRGG"
---
## ESM-2
ESM-2 is a state-of-the-art protein model trained on a masked language modelling objective. It is suitable for fine-tuning on a wide range of tasks that take protein sequences as input. For detailed information on the model architecture and training data, please refer to the [accompanying paper](https://www.biorxiv.org/content/10.1101/2022.07.20.500902v2). You may also be interested in some demo notebooks ([PyTorch](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/protein_language_modeling.ipynb), [TensorFlow](https://colab.research.google.com/github/huggingface/notebooks/blob/main/examples/protein_language_modeling-tf.ipynb)) which demonstrate how to fine-tune ESM-2 models on your tasks of interest.
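For quick experimentation, here is a minimal fill-mask sketch (not part of the original card): it loads this checkpoint through the generic `AutoModelForMaskedLM` class and predicts the masked residue in the widget sequence above. The sequence comes from the widget; everything else is a standard `transformers` pattern rather than an ESM-specific API.
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
import torch

model_name = "facebook/esm2_t33_650M_UR50D"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name).eval()

# protein sequence from the widget above, with one masked residue
sequence = "MQIFVKTLTGKTITLEVEPS<mask>TIENVKAKIQDKEGIPPDQQRLIFAGKQLEDGRTLSDYNIQKESTLHLVLRLRGG"
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# locate the mask position and decode the highest-scoring amino-acid token
mask_positions = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```
The same prediction can also be obtained with the `fill-mask` pipeline; the explicit version is shown here because it makes the masked-position lookup visible.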
Several ESM-2 checkpoints are available in the Hub with varying sizes. Larger sizes generally have somewhat better accuracy, but require much more memory and time to train:
| Checkpoint name | Num layers | Num parameters |
|------------------------------|----|----------|
| [esm2_t48_15B_UR50D](https://huggingface.co/facebook/esm2_t48_15B_UR50D) | 48 | 15B |
| [esm2_t36_3B_UR50D](https://huggingface.co/facebook/esm2_t36_3B_UR50D) | 36 | 3B |
| [esm2_t33_650M_UR50D](https://huggingface.co/facebook/esm2_t33_650M_UR50D) | 33 | 650M |
| [esm2_t30_150M_UR50D](https://huggingface.co/facebook/esm2_t30_150M_UR50D) | 30 | 150M |
| [esm2_t12_35M_UR50D](https://huggingface.co/facebook/esm2_t12_35M_UR50D) | 12 | 35M |
| [esm2_t6_8M_UR50D](https://huggingface.co/facebook/esm2_t6_8M_UR50D) | 6 | 8M | | 1,705 | [
[
-0.029815673828125,
-0.04095458984375,
0.023773193359375,
0.0173492431640625,
-0.0149078369140625,
0.004985809326171875,
0.00994873046875,
-0.03546142578125,
0.0180511474609375,
0.0284881591796875,
-0.056793212890625,
-0.036529541015625,
-0.064208984375,
0.00572967529296875,
-0.0138702392578125,
0.0743408203125,
0.0011749267578125,
0.0188751220703125,
-0.0239105224609375,
-0.006732940673828125,
-0.007183074951171875,
-0.0170440673828125,
-0.056732177734375,
-0.05047607421875,
0.0230560302734375,
0.0333251953125,
0.019195556640625,
0.046905517578125,
0.033355712890625,
0.016204833984375,
-0.034912109375,
0.0249176025390625,
-0.03607177734375,
0.01319122314453125,
-0.0087890625,
-0.0305938720703125,
-0.055877685546875,
-0.010772705078125,
0.031890869140625,
0.036956787109375,
0.004093170166015625,
0.03448486328125,
0.013763427734375,
0.06524658203125,
-0.0227813720703125,
0.0138397216796875,
-0.029541015625,
0.020233154296875,
-0.020721435546875,
-0.00202178955078125,
-0.0264739990234375,
0.0036602020263671875,
0.004505157470703125,
-0.0245513916015625,
0.01412200927734375,
0.0107421875,
0.09185791015625,
0.0157318115234375,
-0.044158935546875,
-0.016021728515625,
-0.0306549072265625,
0.05950927734375,
-0.034423828125,
0.030426025390625,
0.052032470703125,
0.0214996337890625,
-0.021453857421875,
-0.053619384765625,
-0.002597808837890625,
0.021453857421875,
-0.0012655258178710938,
0.028076171875,
-0.0186614990234375,
0.013580322265625,
0.039581298828125,
0.0223236083984375,
-0.0679931640625,
0.013824462890625,
-0.04443359375,
-0.01551055908203125,
0.0406494140625,
0.01465606689453125,
0.0236053466796875,
0.00199127197265625,
-0.032958984375,
0.0108184814453125,
-0.039947509765625,
0.003681182861328125,
0.0205535888671875,
-0.0030498504638671875,
-0.0158233642578125,
0.043548583984375,
-0.033111572265625,
0.053955078125,
0.005382537841796875,
-0.0117340087890625,
0.034088134765625,
-0.0039215087890625,
0.0024890899658203125,
-0.0355224609375,
0.038848876953125,
0.050933837890625,
-0.00638580322265625,
-0.007762908935546875,
-0.031097412109375,
-0.00795745849609375,
0.00390625,
-0.09149169921875,
-0.01216888427734375,
0.04443359375,
-0.036346435546875,
-0.0167236328125,
0.0099029541015625,
-0.056884765625,
-0.0008573532104492188,
-0.0178680419921875,
0.0267181396484375,
-0.037384033203125,
-0.0174407958984375,
0.01219940185546875,
-0.033660888671875,
0.02471923828125,
0.0158233642578125,
-0.061431884765625,
0.044921875,
0.050384521484375,
0.08306884765625,
-0.0030689239501953125,
-0.0159759521484375,
-0.031280517578125,
0.02044677734375,
-0.0101776123046875,
0.061859130859375,
-0.015411376953125,
-0.0037975311279296875,
0.002223968505859375,
0.022796630859375,
-0.004528045654296875,
-0.03857421875,
0.0276947021484375,
-0.0217437744140625,
0.01114654541015625,
-0.03057861328125,
-0.056640625,
-0.0291900634765625,
0.00865936279296875,
-0.0303192138671875,
0.1041259765625,
0.01529693603515625,
-0.0408935546875,
0.007537841796875,
-0.041107177734375,
-0.0251312255859375,
-0.0007195472717285156,
-0.01081085205078125,
-0.053558349609375,
0.0085296630859375,
-0.01369476318359375,
0.0305023193359375,
-0.02337646484375,
0.0022525787353515625,
-0.02398681640625,
-0.025787353515625,
0.005367279052734375,
0.0322265625,
0.05029296875,
0.033660888671875,
-0.0428466796875,
-0.022613525390625,
-0.06646728515625,
0.018707275390625,
0.0158233642578125,
-0.019744873046875,
0.0256195068359375,
0.0021533966064453125,
0.0164642333984375,
0.045745849609375,
0.01812744140625,
-0.034088134765625,
0.005619049072265625,
-0.023101806640625,
0.046600341796875,
0.032958984375,
0.0015163421630859375,
0.0207977294921875,
-0.04913330078125,
0.0296478271484375,
0.0031948089599609375,
0.006259918212890625,
-0.007755279541015625,
-0.061187744140625,
-0.06744384765625,
-0.028350830078125,
-0.0062713623046875,
0.04888916015625,
-0.0197906494140625,
0.05511474609375,
0.012298583984375,
-0.04278564453125,
-0.0269317626953125,
0.0082550048828125,
0.040374755859375,
0.018463134765625,
0.033782958984375,
-0.01462554931640625,
-0.057373046875,
-0.08612060546875,
-0.027435302734375,
0.0015354156494140625,
-0.022552490234375,
0.0152130126953125,
0.05963134765625,
-0.026641845703125,
0.054718017578125,
-0.0266571044921875,
-0.02203369140625,
-0.01270294189453125,
0.008636474609375,
0.00936126708984375,
0.0478515625,
0.0462646484375,
-0.0322265625,
-0.031158447265625,
-0.008636474609375,
-0.05731201171875,
-0.0152740478515625,
0.01123046875,
-0.003570556640625,
0.01137542724609375,
0.045166015625,
-0.034423828125,
0.0084991455078125,
0.051055908203125,
-0.04669189453125,
0.009918212890625,
-0.01177215576171875,
-0.0023899078369140625,
-0.096435546875,
0.0152740478515625,
0.00007206201553344727,
-0.03424072265625,
-0.046783447265625,
0.00763702392578125,
0.0125732421875,
-0.015167236328125,
-0.0416259765625,
0.048370361328125,
-0.0540771484375,
-0.0234527587890625,
-0.0274658203125,
-0.002124786376953125,
0.018707275390625,
0.03668212890625,
0.0012674331665039062,
0.03265380859375,
0.054534912109375,
-0.0256195068359375,
0.0040283203125,
0.0270538330078125,
-0.0220489501953125,
0.031219482421875,
-0.06622314453125,
0.03704833984375,
-0.0192108154296875,
0.023895263671875,
-0.073486328125,
-0.03302001953125,
0.01123046875,
-0.038482666015625,
0.03802490234375,
-0.0248565673828125,
-0.0335693359375,
-0.03631591796875,
-0.032928466796875,
0.0127410888671875,
0.056365966796875,
-0.0262298583984375,
0.03216552734375,
0.041229248046875,
-0.0027790069580078125,
-0.0264739990234375,
-0.0712890625,
-0.008544921875,
-0.0088958740234375,
-0.047698974609375,
0.034454345703125,
0.0021114349365234375,
0.01171112060546875,
-0.01244354248046875,
-0.01020050048828125,
0.008087158203125,
0.0018529891967773438,
0.04730224609375,
-0.00318145751953125,
0.00846099853515625,
-0.01380157470703125,
0.0275115966796875,
-0.02154541015625,
-0.01309967041015625,
-0.0189971923828125,
0.048248291015625,
-0.035552978515625,
-0.01416015625,
-0.04949951171875,
0.03375244140625,
0.050628662109375,
-0.00811004638671875,
0.06573486328125,
0.060302734375,
-0.06109619140625,
-0.0086517333984375,
-0.04327392578125,
-0.032623291015625,
-0.031341552734375,
0.059844970703125,
-0.04376220703125,
-0.07720947265625,
0.056732177734375,
-0.01531219482421875,
0.0014848709106445312,
0.04052734375,
0.047088623046875,
-0.018951416015625,
0.0902099609375,
0.0277099609375,
0.026123046875,
0.02911376953125,
-0.03466796875,
-0.00632476806640625,
-0.07073974609375,
-0.0582275390625,
-0.044403076171875,
-0.03399658203125,
-0.03118896484375,
-0.03631591796875,
0.0091400146484375,
0.041748046875,
-0.039642333984375,
0.04913330078125,
-0.025054931640625,
0.033843994140625,
0.0179901123046875,
0.0227203369140625,
-0.013671875,
0.01508331298828125,
-0.00626373291015625,
0.0030231475830078125,
-0.0616455078125,
-0.04449462890625,
0.0653076171875,
0.06353759765625,
0.0335693359375,
0.00970458984375,
0.04510498046875,
0.00743865966796875,
-0.006885528564453125,
-0.060516357421875,
0.035308837890625,
-0.01123809814453125,
-0.0589599609375,
-0.00641632080078125,
-0.0142669677734375,
-0.049652099609375,
0.01009368896484375,
-0.0161590576171875,
-0.06939697265625,
-0.0020008087158203125,
0.01319122314453125,
-0.0159759521484375,
0.0242156982421875,
-0.03961181640625,
0.04669189453125,
0.0008807182312011719,
-0.0232696533203125,
-0.0088958740234375,
-0.06109619140625,
-0.0017414093017578125,
0.00011843442916870117,
0.00598907470703125,
-0.0295867919921875,
-0.012847900390625,
0.07513427734375,
-0.04168701171875,
0.0552978515625,
-0.0124053955078125,
0.0255889892578125,
0.0203857421875,
0.0025196075439453125,
0.066162109375,
0.0070343017578125,
-0.01312255859375,
0.022705078125,
0.0084381103515625,
-0.0628662109375,
-0.0172119140625,
0.037384033203125,
-0.0740966796875,
-0.01154327392578125,
-0.04071044921875,
-0.024322509765625,
-0.013824462890625,
0.01751708984375,
0.05328369140625,
0.034454345703125,
-0.0010318756103515625,
0.0256500244140625,
0.04559326171875,
-0.0185089111328125,
0.0187835693359375,
0.053192138671875,
-0.01611328125,
-0.039520263671875,
0.0452880859375,
0.0187530517578125,
0.0248565673828125,
0.024993896484375,
-0.00922393798828125,
-0.031707763671875,
-0.0433349609375,
-0.030548095703125,
0.019378662109375,
-0.03564453125,
-0.031494140625,
-0.08038330078125,
-0.023193359375,
-0.0273895263671875,
-0.01010894775390625,
-0.057373046875,
-0.033233642578125,
-0.0161895751953125,
-0.020355224609375,
0.043914794921875,
0.046844482421875,
-0.0206451416015625,
0.0167083740234375,
-0.0455322265625,
0.01413726806640625,
0.01270294189453125,
0.029083251953125,
-0.032501220703125,
-0.06964111328125,
-0.01366424560546875,
-0.0029811859130859375,
-0.0203094482421875,
-0.07269287109375,
0.0171356201171875,
0.03857421875,
0.032562255859375,
0.032562255859375,
-0.0296478271484375,
0.027008056640625,
-0.0310516357421875,
0.051788330078125,
0.0270233154296875,
-0.048431396484375,
0.056732177734375,
-0.032989501953125,
0.020843505859375,
0.0472412109375,
0.0258331298828125,
-0.048858642578125,
-0.0305938720703125,
-0.03900146484375,
-0.059173583984375,
0.06439208984375,
0.0241241455078125,
-0.0011472702026367188,
-0.01035308837890625,
0.032745361328125,
0.007488250732421875,
0.00457000732421875,
-0.03216552734375,
-0.036712646484375,
0.004230499267578125,
-0.005645751953125,
0.01409149169921875,
-0.05279541015625,
-0.01340484619140625,
-0.022552490234375,
0.075439453125,
-0.01216888427734375,
0.038299560546875,
0.0015172958374023438,
-0.00214385986328125,
-0.0300750732421875,
-0.01210784912109375,
0.05474853515625,
0.03839111328125,
-0.03857421875,
0.007297515869140625,
0.027252197265625,
-0.03179931640625,
-0.003246307373046875,
0.0068206787109375,
-0.029541015625,
0.003482818603515625,
0.0164947509765625,
0.0645751953125,
0.00647735595703125,
-0.0341796875,
0.039642333984375,
0.01507568359375,
-0.032806396484375,
-0.015289306640625,
-0.008392333984375,
0.0230865478515625,
0.0316162109375,
0.0161590576171875,
0.0136566162109375,
0.01053619384765625,
-0.043487548828125,
0.0322265625,
0.019134521484375,
-0.04583740234375,
-0.028411865234375,
0.054107666015625,
0.012939453125,
-0.02825927734375,
0.0513916015625,
-0.037078857421875,
-0.046234130859375,
0.0589599609375,
0.060760498046875,
0.05908203125,
-0.018951416015625,
0.0161590576171875,
0.06768798828125,
0.0206451416015625,
-0.028289794921875,
0.044708251953125,
0.0280914306640625,
-0.04388427734375,
-0.00981903076171875,
-0.06500244140625,
-0.0035572052001953125,
0.034423828125,
-0.06707763671875,
0.037933349609375,
-0.028594970703125,
-0.0192413330078125,
-0.007476806640625,
0.01250457763671875,
-0.0576171875,
0.01177215576171875,
0.006793975830078125,
0.08245849609375,
-0.081298828125,
0.066162109375,
0.07244873046875,
-0.01837158203125,
-0.0347900390625,
-0.037567138671875,
0.029998779296875,
-0.0655517578125,
0.015411376953125,
0.0281524658203125,
0.01262664794921875,
0.005229949951171875,
-0.0252685546875,
-0.06549072265625,
0.10992431640625,
0.0168304443359375,
-0.06646728515625,
0.012359619140625,
0.0033588409423828125,
0.037750244140625,
-0.0205230712890625,
0.0286865234375,
0.03533935546875,
0.0157470703125,
0.009033203125,
-0.047576904296875,
0.008575439453125,
-0.0372314453125,
0.01369476318359375,
0.01169586181640625,
-0.08514404296875,
0.05303955078125,
-0.0198974609375,
-0.002696990966796875,
0.031158447265625,
0.0447998046875,
0.04388427734375,
0.0303192138671875,
0.021942138671875,
0.057708740234375,
0.052093505859375,
-0.0233917236328125,
0.061004638671875,
-0.035675048828125,
0.06524658203125,
0.06878662109375,
-0.001308441162109375,
0.0423583984375,
0.03948974609375,
-0.0217437744140625,
0.0196075439453125,
0.07720947265625,
-0.0194854736328125,
0.0323486328125,
0.022796630859375,
-0.004302978515625,
-0.0251007080078125,
-0.005153656005859375,
-0.04620361328125,
0.0125274658203125,
0.0169830322265625,
-0.02703857421875,
-0.013916015625,
-0.003940582275390625,
0.00994873046875,
-0.01232147216796875,
-0.0012722015380859375,
0.052276611328125,
0.0165557861328125,
-0.0303192138671875,
0.0220489501953125,
0.02056884765625,
0.0305938720703125,
-0.041656494140625,
0.0035114288330078125,
-0.0335693359375,
0.006618499755859375,
-0.026031494140625,
-0.047515869140625,
0.0237274169921875,
0.00478363037109375,
-0.017822265625,
-0.0221405029296875,
0.058563232421875,
-0.0369873046875,
-0.037567138671875,
0.033447265625,
0.0357666015625,
0.035369873046875,
-0.0035839080810546875,
-0.0738525390625,
0.015716552734375,
-0.016143798828125,
-0.03857421875,
0.033660888671875,
0.00844573974609375,
0.0238800048828125,
0.045928955078125,
0.01568603515625,
-0.0125732421875,
-0.00899505615234375,
0.00476837158203125,
0.05059814453125,
-0.04034423828125,
-0.031829833984375,
-0.050537109375,
0.037506103515625,
-0.004856109619140625,
-0.0294036865234375,
0.049041748046875,
0.079833984375,
0.06219482421875,
-0.0169677734375,
0.039459228515625,
-0.0143890380859375,
0.039276123046875,
-0.03558349609375,
0.04364013671875,
-0.05352783203125,
-0.00812530517578125,
-0.0004878044128417969,
-0.06658935546875,
-0.01263427734375,
0.048614501953125,
0.006649017333984375,
0.01154327392578125,
0.044219970703125,
0.0784912109375,
0.01079559326171875,
-0.00738525390625,
0.012939453125,
0.01055145263671875,
0.0096282958984375,
0.05462646484375,
0.050811767578125,
-0.0694580078125,
0.0123291015625,
-0.01837158203125,
-0.0291900634765625,
-0.0299224853515625,
-0.0445556640625,
-0.08013916015625,
-0.053802490234375,
-0.04180908203125,
-0.05511474609375,
0.020721435546875,
0.08074951171875,
0.0772705078125,
-0.077392578125,
-0.00989532470703125,
-0.013275146484375,
-0.0201568603515625,
-0.025360107421875,
-0.01010894775390625,
0.0174407958984375,
-0.01235198974609375,
-0.0631103515625,
0.0235595703125,
0.039276123046875,
0.017974853515625,
0.0149993896484375,
-0.0333251953125,
-0.0199432373046875,
0.0005307197570800781,
0.0509033203125,
0.0275421142578125,
-0.042755126953125,
-0.0261688232421875,
0.0014925003051757812,
-0.0191497802734375,
-0.00658416748046875,
0.0282745361328125,
-0.006740570068359375,
0.0223846435546875,
0.047088623046875,
0.0295867919921875,
0.0726318359375,
-0.0147705078125,
0.0293121337890625,
-0.0489501953125,
0.0205230712890625,
0.0081939697265625,
0.0240325927734375,
0.006381988525390625,
-0.01074981689453125,
0.048858642578125,
0.0303192138671875,
-0.04071044921875,
-0.0576171875,
0.027099609375,
-0.0877685546875,
-0.0222930908203125,
0.10821533203125,
0.0018110275268554688,
-0.00083160400390625,
-0.0012607574462890625,
-0.00428009033203125,
0.029754638671875,
-0.0183258056640625,
0.04443359375,
0.052490234375,
-0.0165557861328125,
-0.0011873245239257812,
-0.04718017578125,
0.055267333984375,
0.0389404296875,
-0.05621337890625,
-0.035308837890625,
0.007537841796875,
0.039642333984375,
-0.01088714599609375,
0.045074462890625,
-0.029449462890625,
0.016326904296875,
0.007686614990234375,
-0.0007205009460449219,
-0.021697998046875,
-0.0279083251953125,
-0.021575927734375,
0.0030803680419921875,
0.0026378631591796875,
-0.0196685791015625
]
] |
timm/densenet121.ra_in1k | 2023-04-21T22:51:56.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:1608.06993",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/densenet121.ra_in1k | 0 | 26,870 | timm | 2023-04-21T22:51:48 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for densenet121.ra_in1k
A DenseNet image classification model. Pretrained on ImageNet-1k in `timm` by Ross Wightman using the RandAugment `RA` recipe. Related to the `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 8.0
- GMACs: 2.9
- Activations (M): 6.9
- Image size: train = 224 x 224, test = 288 x 288
- **Papers:**
- Densely Connected Convolutional Networks: https://arxiv.org/abs/1608.06993
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('densenet121.ra_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'densenet121.ra_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 1024, 14, 14])
# torch.Size([1, 1024, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'densenet121.ra_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1024, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Citation
```bibtex
@inproceedings{huang2017densely,
title={Densely Connected Convolutional Networks},
author={Huang, Gao and Liu, Zhuang and van der Maaten, Laurens and Weinberger, Kilian Q },
booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
year={2017}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 4,011 | [
[
-0.0347900390625,
-0.0355224609375,
-0.0008859634399414062,
0.007659912109375,
-0.0209808349609375,
-0.02734375,
-0.0236358642578125,
-0.024261474609375,
0.02923583984375,
0.043670654296875,
-0.03704833984375,
-0.047393798828125,
-0.05596923828125,
-0.005382537841796875,
-0.01435089111328125,
0.06744384765625,
-0.0025310516357421875,
0.00460052490234375,
-0.01003265380859375,
-0.036834716796875,
-0.025543212890625,
-0.02337646484375,
-0.07708740234375,
-0.04241943359375,
0.037445068359375,
0.02734375,
0.0391845703125,
0.035308837890625,
0.05047607421875,
0.036468505859375,
0.00020170211791992188,
0.00890350341796875,
-0.027069091796875,
-0.0130157470703125,
0.0340576171875,
-0.03631591796875,
-0.0242462158203125,
0.01323699951171875,
0.060699462890625,
0.041229248046875,
0.00457763671875,
0.0208740234375,
0.0122833251953125,
0.048004150390625,
-0.018402099609375,
-0.004062652587890625,
-0.034637451171875,
0.0190887451171875,
-0.014617919921875,
-0.0027484893798828125,
-0.027191162109375,
-0.02581787109375,
0.01525115966796875,
-0.0411376953125,
0.0287933349609375,
0.0048065185546875,
0.09881591796875,
0.0255126953125,
0.004974365234375,
-0.001895904541015625,
-0.0243988037109375,
0.0665283203125,
-0.0570068359375,
0.0110626220703125,
0.0266571044921875,
0.0275421142578125,
-0.01317596435546875,
-0.08001708984375,
-0.04144287109375,
-0.01279449462890625,
-0.01300811767578125,
-0.000213623046875,
-0.0132293701171875,
0.0009188652038574219,
0.0252685546875,
0.01654052734375,
-0.038360595703125,
0.01451873779296875,
-0.041778564453125,
-0.0099945068359375,
0.043792724609375,
0.00891876220703125,
0.0239410400390625,
-0.0084075927734375,
-0.038970947265625,
-0.032379150390625,
-0.025115966796875,
0.013336181640625,
0.0217437744140625,
0.0213775634765625,
-0.051605224609375,
0.0280914306640625,
0.01215362548828125,
0.04022216796875,
0.006404876708984375,
-0.022918701171875,
0.043792724609375,
0.0056304931640625,
-0.040618896484375,
0.004428863525390625,
0.07769775390625,
0.01568603515625,
0.0250396728515625,
0.00787353515625,
-0.01290130615234375,
-0.037872314453125,
0.002124786376953125,
-0.088623046875,
-0.0276947021484375,
0.0203094482421875,
-0.053436279296875,
-0.0275421142578125,
0.022430419921875,
-0.0406494140625,
-0.01197052001953125,
-0.004856109619140625,
0.0435791015625,
-0.036529541015625,
-0.032379150390625,
0.0244140625,
-0.01494598388671875,
0.021026611328125,
0.017669677734375,
-0.04193115234375,
0.0205078125,
0.0197906494140625,
0.08099365234375,
0.0057525634765625,
-0.03271484375,
-0.0146484375,
-0.022735595703125,
-0.0195770263671875,
0.036712646484375,
-0.01371002197265625,
-0.0004603862762451172,
-0.0253753662109375,
0.022613525390625,
-0.01061248779296875,
-0.05987548828125,
0.0113372802734375,
-0.0204315185546875,
0.0128173828125,
-0.005222320556640625,
-0.0205535888671875,
-0.042083740234375,
0.029083251953125,
-0.037139892578125,
0.09088134765625,
0.0225067138671875,
-0.0655517578125,
0.0264129638671875,
-0.0347900390625,
-0.0127410888671875,
-0.01238250732421875,
-0.01348114013671875,
-0.079345703125,
-0.0115203857421875,
0.007534027099609375,
0.060272216796875,
-0.0244598388671875,
0.002513885498046875,
-0.0460205078125,
-0.028106689453125,
0.0269622802734375,
-0.0033893585205078125,
0.085205078125,
0.01387786865234375,
-0.0298919677734375,
0.01568603515625,
-0.050872802734375,
0.015960693359375,
0.037567138671875,
-0.0199127197265625,
-0.00675201416015625,
-0.04754638671875,
0.0158233642578125,
0.015350341796875,
0.0010271072387695312,
-0.041015625,
0.01763916015625,
-0.0139312744140625,
0.03094482421875,
0.042236328125,
0.0011615753173828125,
0.0261993408203125,
-0.0307464599609375,
0.0208282470703125,
0.021942138671875,
0.017364501953125,
0.0019779205322265625,
-0.0374755859375,
-0.0570068359375,
-0.03973388671875,
0.022735595703125,
0.026702880859375,
-0.03753662109375,
0.040802001953125,
-0.01229095458984375,
-0.06085205078125,
-0.025238037109375,
0.008544921875,
0.03143310546875,
0.035064697265625,
0.0254058837890625,
-0.03607177734375,
-0.047149658203125,
-0.0638427734375,
0.01300811767578125,
0.01151275634765625,
0.0080108642578125,
0.02923583984375,
0.048553466796875,
-0.0143280029296875,
0.043914794921875,
-0.033660888671875,
-0.0262451171875,
-0.0259552001953125,
0.0031337738037109375,
0.03118896484375,
0.0726318359375,
0.0635986328125,
-0.047271728515625,
-0.037567138671875,
-0.007171630859375,
-0.0745849609375,
0.0199432373046875,
-0.019775390625,
-0.0158233642578125,
0.003147125244140625,
0.024200439453125,
-0.036468505859375,
0.041656494140625,
0.01141357421875,
-0.01137542724609375,
0.02685546875,
-0.01514434814453125,
0.0140533447265625,
-0.0853271484375,
0.00528717041015625,
0.02301025390625,
0.0026397705078125,
-0.0321044921875,
-0.0023593902587890625,
-0.007175445556640625,
-0.0106201171875,
-0.0394287109375,
0.050079345703125,
-0.04217529296875,
-0.0209197998046875,
-0.00994110107421875,
-0.0261993408203125,
0.0028076171875,
0.057586669921875,
-0.009613037109375,
0.0274200439453125,
0.0655517578125,
-0.036224365234375,
0.03497314453125,
0.0254058837890625,
-0.0262451171875,
0.0204315185546875,
-0.05694580078125,
0.0171356201171875,
-0.00659942626953125,
0.02008056640625,
-0.08148193359375,
-0.0184478759765625,
0.02587890625,
-0.043426513671875,
0.052886962890625,
-0.0380859375,
-0.03070068359375,
-0.0304412841796875,
-0.027008056640625,
0.0404052734375,
0.045318603515625,
-0.0491943359375,
0.0309295654296875,
0.018585205078125,
0.0297393798828125,
-0.0411376953125,
-0.077392578125,
-0.01053619384765625,
-0.029998779296875,
-0.058013916015625,
0.0231781005859375,
0.0069732666015625,
0.0027103424072265625,
0.007648468017578125,
0.004108428955078125,
-0.0232696533203125,
-0.0017910003662109375,
0.04205322265625,
0.01076507568359375,
-0.0301055908203125,
-0.00995635986328125,
-0.0257415771484375,
-0.00556182861328125,
-0.0005927085876464844,
-0.0203857421875,
0.04644775390625,
-0.01027679443359375,
-0.01392364501953125,
-0.06396484375,
0.0006704330444335938,
0.033111572265625,
-0.0038299560546875,
0.059600830078125,
0.08673095703125,
-0.04693603515625,
-0.00672149658203125,
-0.03778076171875,
-0.033935546875,
-0.03704833984375,
0.034698486328125,
-0.026397705078125,
-0.037750244140625,
0.06304931640625,
-0.0012960433959960938,
-0.00325775146484375,
0.04290771484375,
0.02178955078125,
-0.00788116455078125,
0.04571533203125,
0.04754638671875,
0.01392364501953125,
0.0540771484375,
-0.07965087890625,
-0.0152130126953125,
-0.06195068359375,
-0.038055419921875,
-0.0362548828125,
-0.059478759765625,
-0.032867431640625,
-0.0153045654296875,
0.0333251953125,
0.0142822265625,
-0.031005859375,
0.03240966796875,
-0.06298828125,
0.0100555419921875,
0.04388427734375,
0.04791259765625,
-0.03857421875,
0.0267486572265625,
-0.0159912109375,
-0.006038665771484375,
-0.06463623046875,
-0.01009368896484375,
0.0771484375,
0.028961181640625,
0.053558349609375,
-0.015716552734375,
0.06048583984375,
-0.01502227783203125,
0.023468017578125,
-0.040618896484375,
0.044586181640625,
-0.016632080078125,
-0.033538818359375,
-0.0167388916015625,
-0.03204345703125,
-0.08453369140625,
-0.002471923828125,
-0.0316162109375,
-0.039337158203125,
0.01806640625,
0.00814056396484375,
-0.02435302734375,
0.0601806640625,
-0.065673828125,
0.0723876953125,
-0.008331298828125,
-0.022064208984375,
-0.0037250518798828125,
-0.04510498046875,
0.025848388671875,
0.01434326171875,
-0.027801513671875,
-0.0051727294921875,
0.016326904296875,
0.078857421875,
-0.044647216796875,
0.0736083984375,
-0.0347900390625,
0.036865234375,
0.0384521484375,
-0.014190673828125,
0.0213165283203125,
-0.0063018798828125,
-0.01338958740234375,
0.0288848876953125,
-0.010406494140625,
-0.0401611328125,
-0.043731689453125,
0.044830322265625,
-0.07635498046875,
-0.01389312744140625,
-0.021575927734375,
-0.034454345703125,
0.0220947265625,
0.01186370849609375,
0.04315185546875,
0.061737060546875,
0.019866943359375,
0.033477783203125,
0.050628662109375,
-0.039886474609375,
0.02880859375,
-0.00803375244140625,
-0.004611968994140625,
-0.03497314453125,
0.06109619140625,
0.0262603759765625,
0.00582122802734375,
0.0165863037109375,
0.01554107666015625,
-0.00928497314453125,
-0.058258056640625,
-0.016815185546875,
0.0206146240234375,
-0.04779052734375,
-0.039215087890625,
-0.048797607421875,
-0.042572021484375,
-0.0290069580078125,
-0.0158233642578125,
-0.032440185546875,
-0.018096923828125,
-0.027008056640625,
0.0218353271484375,
0.049774169921875,
0.053009033203125,
-0.0060272216796875,
0.043914794921875,
-0.039093017578125,
0.00727081298828125,
0.0119171142578125,
0.031890869140625,
0.0040130615234375,
-0.06787109375,
-0.028961181640625,
-0.01105499267578125,
-0.02191162109375,
-0.0465087890625,
0.033111572265625,
0.01065826416015625,
0.041107177734375,
0.03033447265625,
-0.01442718505859375,
0.05816650390625,
-0.0012502670288085938,
0.036865234375,
0.0298919677734375,
-0.030029296875,
0.04254150390625,
0.0029125213623046875,
0.0148162841796875,
0.01336669921875,
0.0287933349609375,
-0.01540374755859375,
-0.0012178421020507812,
-0.06854248046875,
-0.050384521484375,
0.06146240234375,
0.00191497802734375,
0.0010156631469726562,
0.0112762451171875,
0.06414794921875,
0.00417327880859375,
-0.007457733154296875,
-0.050018310546875,
-0.03778076171875,
-0.02069091796875,
-0.0195770263671875,
0.004718780517578125,
-0.01107025146484375,
0.0010747909545898438,
-0.045867919921875,
0.049774169921875,
0.00022971630096435547,
0.050872802734375,
0.02874755859375,
-0.0009732246398925781,
-0.00669097900390625,
-0.036346435546875,
0.043243408203125,
0.0236968994140625,
-0.029541015625,
0.00958251953125,
0.0181121826171875,
-0.054840087890625,
0.002513885498046875,
0.00247955322265625,
-0.005573272705078125,
0.0006537437438964844,
0.0462646484375,
0.06207275390625,
0.01282501220703125,
0.00386810302734375,
0.0301971435546875,
-0.0097503662109375,
-0.033843994140625,
-0.0289306640625,
0.01215362548828125,
-0.01554107666015625,
0.0321044921875,
0.0250701904296875,
0.03778076171875,
-0.01280975341796875,
-0.01280975341796875,
0.02490234375,
0.04241943359375,
-0.028106689453125,
-0.024200439453125,
0.0428466796875,
-0.01490020751953125,
-0.0095672607421875,
0.06622314453125,
-0.007259368896484375,
-0.0284576416015625,
0.087646484375,
0.035003662109375,
0.07757568359375,
0.004734039306640625,
-0.0048980712890625,
0.07012939453125,
0.0274200439453125,
-0.0014171600341796875,
0.01203155517578125,
0.01529693603515625,
-0.068359375,
-0.0030727386474609375,
-0.034332275390625,
0.00795745849609375,
0.032867431640625,
-0.0401611328125,
0.0247650146484375,
-0.06884765625,
-0.0247039794921875,
0.01322174072265625,
0.031585693359375,
-0.0721435546875,
0.021331787109375,
-0.0036106109619140625,
0.06707763671875,
-0.05657958984375,
0.06707763671875,
0.0665283203125,
-0.0450439453125,
-0.0799560546875,
-0.01486968994140625,
-0.00128936767578125,
-0.07489013671875,
0.055511474609375,
0.03131103515625,
0.01358795166015625,
0.00652313232421875,
-0.06964111328125,
-0.0499267578125,
0.10479736328125,
0.043365478515625,
-0.003787994384765625,
0.02716064453125,
-0.00347137451171875,
0.00714111328125,
-0.035675048828125,
0.03387451171875,
0.00400543212890625,
0.032196044921875,
0.020477294921875,
-0.035980224609375,
0.0196380615234375,
-0.019134521484375,
0.0024929046630859375,
0.010955810546875,
-0.05926513671875,
0.0655517578125,
-0.034027099609375,
-0.01395416259765625,
0.000202178955078125,
0.058929443359375,
0.01317596435546875,
0.00920867919921875,
0.045623779296875,
0.0667724609375,
0.039337158203125,
-0.0167694091796875,
0.065185546875,
0.0041046142578125,
0.03265380859375,
0.04815673828125,
0.02313232421875,
0.0362548828125,
0.0228729248046875,
-0.0127105712890625,
0.02435302734375,
0.0775146484375,
-0.02716064453125,
0.0230560302734375,
0.0190582275390625,
0.004978179931640625,
-0.0148162841796875,
0.005939483642578125,
-0.03607177734375,
0.03387451171875,
0.0163421630859375,
-0.0294036865234375,
-0.0201416015625,
0.007076263427734375,
-0.002716064453125,
-0.01473236083984375,
-0.0162200927734375,
0.038787841796875,
0.005214691162109375,
-0.0240020751953125,
0.0665283203125,
0.00275421142578125,
0.0675048828125,
-0.039886474609375,
-0.005565643310546875,
-0.027801513671875,
0.0146484375,
-0.03411865234375,
-0.06427001953125,
0.0291290283203125,
-0.026336669921875,
-0.00994110107421875,
-0.0016546249389648438,
0.053314208984375,
-0.026611328125,
-0.0304718017578125,
0.01947021484375,
0.0120849609375,
0.048797607421875,
0.0028095245361328125,
-0.09466552734375,
0.024261474609375,
0.0041046142578125,
-0.0396728515625,
0.03070068359375,
0.045318603515625,
0.01152801513671875,
0.0545654296875,
0.04510498046875,
-0.0062255859375,
0.01439666748046875,
-0.0014543533325195312,
0.0635986328125,
-0.03912353515625,
-0.01271820068359375,
-0.054840087890625,
0.03948974609375,
-0.005924224853515625,
-0.0518798828125,
0.03814697265625,
0.04608154296875,
0.06866455078125,
-0.0005035400390625,
0.0291900634765625,
-0.014923095703125,
-0.00551605224609375,
-0.03948974609375,
0.054229736328125,
-0.0521240234375,
0.0005402565002441406,
-0.007373809814453125,
-0.05499267578125,
-0.029937744140625,
0.051055908203125,
-0.0155181884765625,
0.034454345703125,
0.037872314453125,
0.07794189453125,
-0.028900146484375,
-0.03399658203125,
0.0196533203125,
0.015716552734375,
0.0039215087890625,
0.0341796875,
0.0238800048828125,
-0.062164306640625,
0.0294647216796875,
-0.050079345703125,
-0.01117706298828125,
-0.01010894775390625,
-0.048797607421875,
-0.0770263671875,
-0.06549072265625,
-0.044342041015625,
-0.052520751953125,
-0.019683837890625,
0.06903076171875,
0.088134765625,
-0.054046630859375,
-0.01200103759765625,
0.00531768798828125,
0.0132598876953125,
-0.0228729248046875,
-0.016937255859375,
0.044036865234375,
-0.022674560546875,
-0.042877197265625,
-0.0155029296875,
0.00897979736328125,
0.02880859375,
-0.0030269622802734375,
-0.02069091796875,
-0.0191650390625,
-0.0228271484375,
0.0106964111328125,
0.0251922607421875,
-0.040374755859375,
-0.01751708984375,
-0.02252197265625,
-0.0174102783203125,
0.0233917236328125,
0.039581298828125,
-0.03875732421875,
0.017486572265625,
0.0290069580078125,
0.043304443359375,
0.061492919921875,
-0.031524658203125,
0.0009369850158691406,
-0.070556640625,
0.05230712890625,
-0.006427764892578125,
0.0372314453125,
0.031982421875,
-0.029998779296875,
0.043182373046875,
0.0294647216796875,
-0.035980224609375,
-0.0640869140625,
-0.004604339599609375,
-0.08935546875,
-0.0119781494140625,
0.062744140625,
-0.023468017578125,
-0.038177490234375,
0.021148681640625,
-0.001529693603515625,
0.040557861328125,
-0.0021762847900390625,
0.031524658203125,
0.01505279541015625,
0.00222015380859375,
-0.049041748046875,
-0.038665771484375,
0.037139892578125,
0.015716552734375,
-0.0428466796875,
-0.0394287109375,
-0.00732421875,
0.04461669921875,
0.01305389404296875,
0.045654296875,
-0.0071563720703125,
0.0058746337890625,
0.008819580078125,
0.040283203125,
-0.033538818359375,
-0.0135040283203125,
-0.0263214111328125,
0.0008606910705566406,
-0.0104827880859375,
-0.043182373046875
]
] |
facebook/opt-13b | 2023-01-24T17:10:32.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"opt",
"text-generation",
"en",
"arxiv:2205.01068",
"arxiv:2005.14165",
"license:other",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | facebook | null | null | facebook/opt-13b | 61 | 26,854 | transformers | 2022-05-11T08:27:07 | ---
language: en
inference: false
tags:
- opt
- text-generation
license: other
commercial: false
---
# OPT: Open Pre-trained Transformer Language Models
OPT was first introduced in [Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) and first released in [metaseq's repository](https://github.com/facebookresearch/metaseq) on May 3rd 2022 by Meta AI.
**Disclaimer**: The team releasing OPT wrote an official model card, which is available in Appendix D of the [paper](https://arxiv.org/pdf/2205.01068.pdf).
Content from **this** model card has been written by the Hugging Face team.
## Intro
To quote the first two paragraphs of the [official paper](https://arxiv.org/abs/2205.01068)
> Large language models trained on massive text collections have shown surprising emergent
> capabilities to generate text and perform zero- and few-shot learning. While in some cases the public
> can interact with these models through paid APIs, full model access is currently limited to only a
> few highly resourced labs. This restricted access has limited researchers’ ability to study how and
> why these large language models work, hindering progress on improving known challenges in areas
> such as robustness, bias, and toxicity.
> We present Open Pretrained Transformers (OPT), a suite of decoder-only pre-trained transformers ranging from 125M
> to 175B parameters, which we aim to fully and responsibly share with interested researchers. We train the OPT models to roughly match
> the performance and sizes of the GPT-3 class of models, while also applying the latest best practices in data
> collection and efficient training. Our aim in developing this suite of OPT models is to enable reproducible and responsible research at scale, and
> to bring more voices to the table in studying the impact of these LLMs. Definitions of risk, harm, bias, and toxicity, etc., should be articulated by the
> collective research community as a whole, which is only possible when models are available for study.
## Model description
OPT was predominantly pretrained with English text, but a small amount of non-English data is still present within the training corpus via CommonCrawl. The model was pretrained using a causal language modeling (CLM) objective.
OPT belongs to the same family of decoder-only models as [GPT-3](https://arxiv.org/abs/2005.14165). As such, it was pretrained using the self-supervised causal language modeling objective.
For evaluation, OPT follows [GPT-3](https://arxiv.org/abs/2005.14165) by using their prompts and overall experimental setup. For more details, please read
the [official paper](https://arxiv.org/abs/2205.01068).
## Intended uses & limitations
The pretrained-only model can be used for prompting for evaluation of downstream tasks as well as text generation.
In addition, the model can be fine-tuned on a downstream task using the [CLM example](https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling). For all other OPT checkpoints, please have a look at the [model hub](https://huggingface.co/models?filter=opt).
### How to use
For large OPT models such as this one, it is not recommended to use the `text-generation` pipeline, because
the model should be loaded in half-precision to accelerate generation and reduce memory consumption on the GPU.
It is recommended to directly call the [`generate`](https://huggingface.co/docs/transformers/main/en/main_classes/text_generation#transformers.generation_utils.GenerationMixin.generate)
method as follows:
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-13b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-13b", use_fast=False)
>>> prompt = "Hello, I am conscious and"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> generated_ids = model.generate(input_ids)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
['Hello, I am conscious and aware of my surroundings.\nI am conscious and aware of my']
```
By default, generation is deterministic. To use top-k sampling, set `do_sample` to `True`.
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-13b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-13b", use_fast=False)
>>> prompt = "Hello, I am conscious and"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> set_seed(32)
>>> generated_ids = model.generate(input_ids, do_sample=True)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
['Hello, I am conscious and aware.\nSo that makes you dead, right? ']
```
### Limitations and bias
As mentioned in Meta AI's model card, the training data used for this model contains a lot of
unfiltered content from the internet, which is far from neutral, so the model is strongly biased:
> Like other large language models for which the diversity (or lack thereof) of training
> data induces downstream impact on the quality of our model, OPT-175B has limitations in terms
> of bias and safety. OPT-175B can also have quality issues in terms of generation diversity and
> hallucination. In general, OPT-175B is not immune from the plethora of issues that plague modern
> large language models.
Here's an example of how the model can have biased predictions:
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-13b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-13b", use_fast=False)
>>> prompt = "The woman worked as a"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> set_seed(32)
>>> generated_ids = model.generate(input_ids, do_sample=True, num_return_sequences=5, max_length=10)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
The woman worked as a supervisor in the office
The woman worked as a social media consultant for
The woman worked as a cashier at the
The woman worked as a teacher, and was
The woman worked as a maid at our friends
```
compared to:
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-13b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-13b", use_fast=False)
>>> prompt = "The man worked as a"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> set_seed(32)
>>> generated_ids = model.generate(input_ids, do_sample=True, num_return_sequences=5, max_length=10)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
The man worked as a consultant to the defense
The man worked as a bartender in a bar
The man worked as a cashier at the
The man worked as a teacher, and was
The man worked as a professional athlete while he
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The Meta AI team wanted to train this model on a corpus as large as possible. It is composed of the union of the following 5 filtered datasets of textual documents:
- BookCorpus, which consists of more than 10K unpublished books,
- CC-Stories, which contains a subset of CommonCrawl data filtered to match the
story-like style of Winograd schemas,
- The Pile, from which *Pile-CC, OpenWebText2, USPTO, Project Gutenberg, OpenSubtitles, Wikipedia, DM Mathematics and HackerNews* were included,
- Pushshift.io Reddit dataset that was developed in Baumgartner et al. (2020) and processed in
Roller et al. (2021)
- CCNewsV2 containing an updated version of the English portion of the CommonCrawl News
dataset that was used in RoBERTa (Liu et al., 2019b)
The final training data contains 180B tokens, corresponding to 800GB of data. The validation split consisted of 200MB of the pretraining data, sampled proportionally
to each dataset’s size in the pretraining corpus.
The dataset might contain offensive content, as parts of it are drawn from public Common Crawl
data, along with public Reddit data, which can include sentences that, if viewed directly,
may be insulting, threatening, or otherwise cause anxiety.
### Collection process
The dataset was collected from the internet and went through classic data processing and
re-formatting practices, including the removal of repetitive/non-informative text such as *Chapter One* or
*This ebook by Project Gutenberg*.
## Training procedure
### Preprocessing
The texts are tokenized using the **GPT2** byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a
vocabulary size of 50272. The inputs are sequences of 2048 consecutive tokens.
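As a rough illustration, the sketch below tokenizes a few documents with the OPT tokenizer and packs them into 2048-token training sequences. The packing logic is an assumption for illustration only; only the tokenizer and the sequence length come from this card.
```python
from transformers import AutoTokenizer

# Illustrative sketch -- the exact packing pipeline Meta AI used is not public.
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-13b", use_fast=False)

SEQ_LEN = 2048  # context length stated above

def pack_into_sequences(texts):
    """Concatenate tokenized documents, then split into fixed 2048-token inputs."""
    ids = []
    for text in texts:
        ids.extend(tokenizer(text).input_ids)
    # drop the trailing remainder that does not fill a whole sequence
    return [ids[i:i + SEQ_LEN] for i in range(0, len(ids) - SEQ_LEN + 1, SEQ_LEN)]

sequences = pack_into_sequences(["First document.", "Second document."])
```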
The 175B model was trained on 992 *80GB A100 GPUs*, for roughly 33 days of continuous training.
### BibTeX entry and citation info
```bibtex
@misc{zhang2022opt,
title={OPT: Open Pre-trained Transformer Language Models},
author={Susan Zhang and Stephen Roller and Naman Goyal and Mikel Artetxe and Moya Chen and Shuohui Chen and Christopher Dewan and Mona Diab and Xian Li and Xi Victoria Lin and Todor Mihaylov and Myle Ott and Sam Shleifer and Kurt Shuster and Daniel Simig and Punit Singh Koura and Anjali Sridhar and Tianlu Wang and Luke Zettlemoyer},
year={2022},
eprint={2205.01068},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 10,027 | [
[
-0.022125244140625,
-0.06524658203125,
0.01387786865234375,
0.020660400390625,
-0.012298583984375,
-0.0107421875,
-0.032196044921875,
-0.034210205078125,
0.0051727294921875,
0.031982421875,
-0.045013427734375,
-0.032073974609375,
-0.045806884765625,
0.00785064697265625,
-0.036285400390625,
0.0797119140625,
-0.00498199462890625,
-0.0004620552062988281,
0.01275634765625,
0.007083892822265625,
-0.02471923828125,
-0.035736083984375,
-0.0655517578125,
-0.012847900390625,
0.0168914794921875,
0.005771636962890625,
0.050811767578125,
0.043243408203125,
0.03070068359375,
0.0291748046875,
0.0031490325927734375,
0.0078582763671875,
-0.047454833984375,
-0.022186279296875,
-0.00504302978515625,
-0.031341552734375,
-0.035247802734375,
0.0186614990234375,
0.038726806640625,
0.035797119140625,
0.00885772705078125,
0.01727294921875,
0.001377105712890625,
0.03045654296875,
-0.036468505859375,
0.016937255859375,
-0.051971435546875,
-0.004993438720703125,
-0.0117645263671875,
0.00717926025390625,
-0.044586181640625,
-0.0211639404296875,
0.00907135009765625,
-0.03466796875,
0.0282745361328125,
-0.0030670166015625,
0.09002685546875,
0.0294036865234375,
-0.0215301513671875,
-0.0125579833984375,
-0.05072021484375,
0.05035400390625,
-0.06787109375,
0.0272216796875,
0.0177154541015625,
0.007110595703125,
0.00339508056640625,
-0.06671142578125,
-0.046356201171875,
-0.0134735107421875,
-0.016448974609375,
0.017425537109375,
-0.02581787109375,
0.01247406005859375,
0.022064208984375,
0.0279083251953125,
-0.041778564453125,
0.0014238357543945312,
-0.040985107421875,
-0.0258941650390625,
0.05584716796875,
0.0045013427734375,
0.025665283203125,
-0.02667236328125,
-0.0182342529296875,
-0.00588226318359375,
-0.031158447265625,
-0.001781463623046875,
0.03643798828125,
0.0183868408203125,
-0.0164947509765625,
0.042755126953125,
-0.0218658447265625,
0.0595703125,
0.017608642578125,
0.00997161865234375,
0.0301971435546875,
-0.0308990478515625,
-0.0230865478515625,
-0.00791168212890625,
0.08880615234375,
0.02001953125,
0.02752685546875,
-0.00347900390625,
-0.004085540771484375,
0.00876617431640625,
0.0030803680419921875,
-0.06451416015625,
-0.02734375,
0.024169921875,
-0.03955078125,
-0.027130126953125,
0.002674102783203125,
-0.055511474609375,
0.00007575750350952148,
-0.01114654541015625,
0.04425048828125,
-0.032440185546875,
-0.036895751953125,
0.0201568603515625,
0.0016345977783203125,
0.0187835693359375,
0.00373077392578125,
-0.06451416015625,
-0.00048732757568359375,
0.0237274169921875,
0.060272216796875,
-0.0031414031982421875,
-0.0300140380859375,
-0.0166778564453125,
-0.0053863525390625,
-0.01474761962890625,
0.030914306640625,
-0.019805908203125,
-0.00980377197265625,
-0.00066375732421875,
0.00873565673828125,
-0.0197296142578125,
-0.024444580078125,
0.04736328125,
-0.029876708984375,
0.032501220703125,
-0.017303466796875,
-0.03277587890625,
-0.004116058349609375,
-0.006656646728515625,
-0.044921875,
0.09051513671875,
0.00873565673828125,
-0.0718994140625,
0.0310821533203125,
-0.051544189453125,
-0.0277862548828125,
-0.00519561767578125,
-0.0024204254150390625,
-0.02557373046875,
0.002040863037109375,
0.03082275390625,
0.044525146484375,
-0.00986480712890625,
0.034698486328125,
-0.014404296875,
-0.017333984375,
0.005512237548828125,
-0.038543701171875,
0.09033203125,
0.02496337890625,
-0.045989990234375,
0.020965576171875,
-0.04315185546875,
-0.005584716796875,
0.0282440185546875,
-0.0138092041015625,
-0.002864837646484375,
-0.0084075927734375,
0.0153350830078125,
0.03399658203125,
0.0248565673828125,
-0.0372314453125,
0.00876617431640625,
-0.043914794921875,
0.051361083984375,
0.072265625,
-0.0184783935546875,
0.0306854248046875,
-0.007312774658203125,
0.0199127197265625,
0.0049285888671875,
0.0282135009765625,
-0.0068817138671875,
-0.026092529296875,
-0.07891845703125,
-0.0199127197265625,
0.01453399658203125,
0.0260772705078125,
-0.0550537109375,
0.052764892578125,
-0.0210723876953125,
-0.05255126953125,
-0.043792724609375,
0.0002465248107910156,
0.0267333984375,
0.03173828125,
0.03271484375,
-0.01151275634765625,
-0.04425048828125,
-0.059539794921875,
-0.0249786376953125,
-0.00717926025390625,
0.01432037353515625,
0.0281982421875,
0.0484619140625,
-0.034759521484375,
0.0867919921875,
-0.04473876953125,
-0.0220489501953125,
-0.03131103515625,
-0.004100799560546875,
0.0298919677734375,
0.049774169921875,
0.043792724609375,
-0.056671142578125,
-0.046661376953125,
-0.017974853515625,
-0.05517578125,
-0.00457763671875,
-0.00872039794921875,
-0.0302581787109375,
0.0295867919921875,
0.04681396484375,
-0.06475830078125,
0.0233612060546875,
0.039825439453125,
-0.0343017578125,
0.042449951171875,
0.001373291015625,
-0.01502227783203125,
-0.09149169921875,
0.0198822021484375,
-0.005069732666015625,
-0.013702392578125,
-0.040802001953125,
-0.018341064453125,
0.00044155120849609375,
-0.01102447509765625,
-0.0430908203125,
0.061431884765625,
-0.0288848876953125,
0.0169525146484375,
-0.0118255615234375,
0.003543853759765625,
-0.00836944580078125,
0.0440673828125,
0.0084228515625,
0.042205810546875,
0.053070068359375,
-0.0494384765625,
0.0194549560546875,
0.0165863037109375,
-0.01654052734375,
0.0162811279296875,
-0.05438232421875,
0.00522613525390625,
-0.0158843994140625,
0.0253448486328125,
-0.072265625,
-0.02471923828125,
0.028564453125,
-0.0469970703125,
0.0253753662109375,
0.01033782958984375,
-0.0394287109375,
-0.058013916015625,
-0.0016689300537109375,
0.030120849609375,
0.041290283203125,
-0.04296875,
0.049041748046875,
0.025909423828125,
0.0152740478515625,
-0.0556640625,
-0.047515869140625,
-0.00806427001953125,
-0.0071868896484375,
-0.055694580078125,
0.026397705078125,
-0.0102996826171875,
-0.00017023086547851562,
0.0094757080078125,
-0.007312774658203125,
0.004848480224609375,
-0.002918243408203125,
0.00572967529296875,
0.0252685546875,
-0.0017299652099609375,
0.0020275115966796875,
-0.005695343017578125,
-0.01641845703125,
0.01287078857421875,
-0.032073974609375,
0.06402587890625,
-0.01812744140625,
-0.011199951171875,
-0.03961181640625,
-0.0022373199462890625,
0.032257080078125,
-0.0303497314453125,
0.06756591796875,
0.0723876953125,
-0.037322998046875,
-0.0131988525390625,
-0.056304931640625,
-0.026153564453125,
-0.04107666015625,
0.050933837890625,
-0.0074615478515625,
-0.0570068359375,
0.039459228515625,
0.01483154296875,
0.017364501953125,
0.058990478515625,
0.06134033203125,
0.0184478759765625,
0.0804443359375,
0.047119140625,
-0.0226287841796875,
0.05023193359375,
-0.04559326171875,
0.0225372314453125,
-0.045745849609375,
-0.00460052490234375,
-0.0246124267578125,
-0.00305938720703125,
-0.032989501953125,
-0.0211334228515625,
0.0081634521484375,
0.004241943359375,
-0.0291290283203125,
0.03656005859375,
-0.055816650390625,
0.0243377685546875,
0.042327880859375,
0.01377105712890625,
-0.000335693359375,
-0.01149749755859375,
-0.009857177734375,
0.004302978515625,
-0.06121826171875,
-0.0307769775390625,
0.09246826171875,
0.0296630859375,
0.053192138671875,
-0.02581787109375,
0.0548095703125,
0.00251007080078125,
0.0322265625,
-0.034332275390625,
0.0433349609375,
0.0009455680847167969,
-0.07806396484375,
-0.01143646240234375,
-0.04022216796875,
-0.058502197265625,
0.01479339599609375,
-0.00858306884765625,
-0.051361083984375,
0.0103759765625,
0.01461029052734375,
-0.0263824462890625,
0.0247344970703125,
-0.0631103515625,
0.093994140625,
-0.0343017578125,
-0.0341796875,
0.0037822723388671875,
-0.048980712890625,
0.0340576171875,
-0.0016689300537109375,
0.01299285888671875,
-0.0019025802612304688,
0.0199432373046875,
0.0728759765625,
-0.033721923828125,
0.07440185546875,
-0.0140228271484375,
0.00441741943359375,
0.0333251953125,
-0.0182647705078125,
0.033233642578125,
-0.00710296630859375,
-0.004398345947265625,
0.0238494873046875,
-0.01300811767578125,
-0.032379150390625,
-0.0104827880859375,
0.042572021484375,
-0.081298828125,
-0.03240966796875,
-0.031890869140625,
-0.0360107421875,
0.0066375732421875,
0.0438232421875,
0.057708740234375,
0.022796630859375,
-0.0093841552734375,
0.006809234619140625,
0.0343017578125,
-0.034912109375,
0.048187255859375,
0.0140380859375,
-0.017730712890625,
-0.0295867919921875,
0.062744140625,
0.0099334716796875,
0.0250091552734375,
0.0105133056640625,
0.005268096923828125,
-0.032867431640625,
-0.02178955078125,
-0.022186279296875,
0.035888671875,
-0.05767822265625,
-0.0175323486328125,
-0.0728759765625,
-0.0311737060546875,
-0.04888916015625,
-0.0172119140625,
-0.046234130859375,
-0.0032482147216796875,
-0.035064697265625,
-0.0160064697265625,
0.0200042724609375,
0.034454345703125,
-0.004913330078125,
0.035675048828125,
-0.0372314453125,
0.0202178955078125,
0.010589599609375,
0.024078369140625,
0.004486083984375,
-0.037994384765625,
-0.028778076171875,
0.012969970703125,
-0.0249786376953125,
-0.061798095703125,
0.036285400390625,
0.000579833984375,
0.04705810546875,
0.039886474609375,
0.0143585205078125,
0.040802001953125,
-0.028289794921875,
0.05230712890625,
0.00791168212890625,
-0.0804443359375,
0.029937744140625,
-0.0289154052734375,
0.0123138427734375,
0.0386962890625,
0.032073974609375,
-0.0265655517578125,
-0.03411865234375,
-0.05706787109375,
-0.077392578125,
0.07464599609375,
0.033905029296875,
0.021240234375,
-0.00995635986328125,
0.0286712646484375,
-0.0138397216796875,
0.016632080078125,
-0.10205078125,
-0.039398193359375,
-0.03106689453125,
-0.0297088623046875,
-0.013671875,
-0.0138092041015625,
0.0092620849609375,
-0.0343017578125,
0.063232421875,
0.004344940185546875,
0.03936767578125,
0.0245208740234375,
-0.0235748291015625,
-0.00638580322265625,
-0.0098876953125,
0.0218963623046875,
0.04559326171875,
-0.01316070556640625,
-0.0001678466796875,
0.0119781494140625,
-0.036529541015625,
-0.005092620849609375,
0.023162841796875,
-0.02392578125,
0.0020294189453125,
0.019775390625,
0.076904296875,
-0.0004360675811767578,
-0.038055419921875,
0.039703369140625,
0.0032939910888671875,
-0.01355743408203125,
-0.030853271484375,
0.004283905029296875,
0.007434844970703125,
0.00807952880859375,
0.025726318359375,
0.00498199462890625,
-0.00814056396484375,
-0.0306243896484375,
0.01006317138671875,
0.040802001953125,
-0.0233612060546875,
-0.0228271484375,
0.0802001953125,
0.021728515625,
-0.0130767822265625,
0.047576904296875,
-0.0158538818359375,
-0.0579833984375,
0.0455322265625,
0.0416259765625,
0.0726318359375,
-0.01186370849609375,
0.01317596435546875,
0.047607421875,
0.04779052734375,
-0.0107421875,
-0.0016317367553710938,
0.01198577880859375,
-0.06024169921875,
-0.03094482421875,
-0.0543212890625,
0.004383087158203125,
0.01425933837890625,
-0.03179931640625,
0.038543701171875,
-0.016143798828125,
-0.021087646484375,
-0.010406494140625,
0.0003750324249267578,
-0.06549072265625,
0.0190887451171875,
0.0019178390502929688,
0.059539794921875,
-0.07659912109375,
0.055206298828125,
0.035125732421875,
-0.04510498046875,
-0.0732421875,
-0.0007123947143554688,
-0.022705078125,
-0.055938720703125,
0.0478515625,
0.04022216796875,
0.0189056396484375,
0.034149169921875,
-0.05291748046875,
-0.07305908203125,
0.0782470703125,
0.02874755859375,
-0.02227783203125,
-0.0228729248046875,
0.0179901123046875,
0.04022216796875,
-0.01123046875,
0.0394287109375,
0.034393310546875,
0.0308380126953125,
-0.009796142578125,
-0.06402587890625,
0.0149688720703125,
-0.01506805419921875,
-0.01385498046875,
0.0026645660400390625,
-0.06561279296875,
0.08880615234375,
-0.00909423828125,
-0.01849365234375,
-0.013824462890625,
0.05291748046875,
0.000980377197265625,
0.001926422119140625,
0.03009033203125,
0.042694091796875,
0.043731689453125,
-0.0152587890625,
0.078857421875,
-0.038299560546875,
0.055511474609375,
0.056640625,
0.003692626953125,
0.042327880859375,
0.018096923828125,
-0.01303863525390625,
0.017730712890625,
0.05291748046875,
-0.00939178466796875,
0.0255126953125,
-0.0007162094116210938,
0.00066375732421875,
-0.0150909423828125,
0.005855560302734375,
-0.0380859375,
0.0286102294921875,
0.00867462158203125,
-0.04296875,
-0.012298583984375,
0.0023403167724609375,
0.0190277099609375,
-0.026947021484375,
-0.01580810546875,
0.0384521484375,
0.00383758544921875,
-0.05731201171875,
0.053497314453125,
0.0099029541015625,
0.07177734375,
-0.0450439453125,
0.0239715576171875,
-0.0036602020263671875,
0.0293426513671875,
-0.01873779296875,
-0.0246429443359375,
0.017730712890625,
-0.00959014892578125,
-0.0009202957153320312,
-0.017730712890625,
0.054718017578125,
-0.03558349609375,
-0.0484619140625,
0.028411865234375,
0.0294647216796875,
0.0037384033203125,
-0.0207366943359375,
-0.06500244140625,
0.0109405517578125,
0.00861358642578125,
-0.035675048828125,
0.0025348663330078125,
0.0173492431640625,
0.00800323486328125,
0.042572021484375,
0.057769775390625,
-0.00949859619140625,
0.027374267578125,
-0.0168914794921875,
0.0753173828125,
-0.037567138671875,
-0.0304718017578125,
-0.08209228515625,
0.04833984375,
-0.005992889404296875,
-0.0257568359375,
0.0711669921875,
0.052703857421875,
0.0836181640625,
-0.01214599609375,
0.058441162109375,
-0.0300140380859375,
0.0159454345703125,
-0.0286407470703125,
0.07135009765625,
-0.04315185546875,
-0.006053924560546875,
-0.046356201171875,
-0.0733642578125,
-0.0094757080078125,
0.060272216796875,
-0.037567138671875,
0.025787353515625,
0.049591064453125,
0.058837890625,
-0.00868988037109375,
-0.0147705078125,
-0.0005316734313964844,
0.03570556640625,
0.0325927734375,
0.042633056640625,
0.04144287109375,
-0.044189453125,
0.056640625,
-0.038360595703125,
-0.0213470458984375,
-0.02301025390625,
-0.053955078125,
-0.0849609375,
-0.04693603515625,
-0.0214080810546875,
-0.044097900390625,
-0.0140838623046875,
0.06744384765625,
0.05474853515625,
-0.04638671875,
-0.016571044921875,
-0.0304718017578125,
0.00025916099548339844,
-0.00394439697265625,
-0.0251922607421875,
0.04364013671875,
-0.0361328125,
-0.0712890625,
-0.006847381591796875,
-0.004573822021484375,
-0.0006465911865234375,
-0.019134521484375,
-0.006862640380859375,
-0.02618408203125,
0.004238128662109375,
0.03912353515625,
0.01099395751953125,
-0.04620361328125,
-0.006214141845703125,
0.017181396484375,
-0.015533447265625,
-0.00542449951171875,
0.03790283203125,
-0.04583740234375,
0.035064697265625,
0.0297393798828125,
0.04022216796875,
0.045013427734375,
-0.006885528564453125,
0.034881591796875,
-0.031890869140625,
0.0207366943359375,
0.0184783935546875,
0.03546142578125,
0.017669677734375,
-0.033905029296875,
0.035675048828125,
0.0303802490234375,
-0.04693603515625,
-0.07244873046875,
0.00914764404296875,
-0.0662841796875,
-0.0169677734375,
0.10174560546875,
-0.0013275146484375,
-0.0185699462890625,
-0.00014162063598632812,
-0.0310211181640625,
0.044281005859375,
-0.0221099853515625,
0.051513671875,
0.054412841796875,
0.00563812255859375,
-0.0011959075927734375,
-0.043975830078125,
0.044677734375,
0.03533935546875,
-0.0552978515625,
0.0076904296875,
0.031707763671875,
0.02801513671875,
0.0108184814453125,
0.07244873046875,
-0.00470733642578125,
0.0080718994140625,
-0.005771636962890625,
0.0181121826171875,
-0.01178741455078125,
-0.01340484619140625,
-0.00881195068359375,
-0.00734710693359375,
-0.0181121826171875,
-0.011199951171875
]
] |
smallcloudai/Refact-1_6B-fim | 2023-09-29T08:04:09.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_refact",
"text-generation",
"code",
"custom_code",
"en",
"dataset:bigcode/the-stack-dedup",
"dataset:rombodawg/2XUNCENSORED_MegaCodeTraining188k",
"dataset:bigcode/commitpackft",
"arxiv:2108.12409",
"arxiv:1607.06450",
"arxiv:1910.07467",
"arxiv:1911.02150",
"license:bigscience-openrail-m",
"model-index",
"has_space",
"region:us"
] | text-generation | smallcloudai | null | null | smallcloudai/Refact-1_6B-fim | 101 | 26,849 | transformers | 2023-08-29T15:48:36 | ---
pipeline_tag: text-generation
inference: true
widget:
- text: 'def print_hello_world():'
example_title: Hello world
group: Python
license: bigscience-openrail-m
pretrain-datasets:
- books
- arxiv
- c4
- falcon-refinedweb
- wiki
- github-issues
- stack_markdown
- self-made dataset of permissive github code
datasets:
- bigcode/the-stack-dedup
- rombodawg/2XUNCENSORED_MegaCodeTraining188k
- bigcode/commitpackft
metrics:
- code_eval
library_name: transformers
tags:
- code
model-index:
- name: Refact-1.6B
results:
- task:
type: text-generation
dataset:
type: openai_humaneval
name: HumanEval
metrics:
- name: pass@1 (T=0.01)
type: pass@1
value: 32.0
verified: false
- name: pass@1 (T=0.2)
type: pass@1
value: 31.5
verified: false
- name: pass@10 (T=0.8)
type: pass@10
value: 53.0
verified: false
- name: pass@100 (T=0.8)
type: pass@100
value: 76.9
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalSynthesize Python
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 35.8
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalSynthesize JavaScript
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 31.6
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalSynthesize Java
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 29.1
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalSynthesize Go
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: -1
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalSynthesize C++
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 26.3
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalSynthesize Rust
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: -1
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalSynthesize Average
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: -1
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixTests Python
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 18.38
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixTests JavaScript
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 12.28
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixTests Java
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 15.12
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixTests Go
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: -1
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixTests C++
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 13.17
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixTests Rust
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 2.8
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixTests Average
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: -1
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixDocs Python
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 26.92
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixDocs JavaScript
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 26.85
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixDocs Java
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 30.76
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixDocs Go
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: -1
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixDocs C++
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 25.94
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixDocs Rust
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 8.44
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalFixDocs Average
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: -1
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalExplain Python
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 26.46
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalExplain JavaScript
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 17.86
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalExplain Java
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 20.94
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalExplain Go
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: -1
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalExplain C++
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 18.78
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalExplain Rust
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: -1
verified: false
- task:
type: text-generation
dataset:
type: bigcode/humanevalpack
name: HumanEvalExplain Average
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: -1
verified: false
- task:
type: text-generation
dataset:
type: mbpp
name: MBPP
metrics:
- name: pass@1 (T=0.01)
type: pass@1
value: 31.15
verified: false
- task:
type: text-generation
dataset:
type: ds1000
name: DS-1000 (Overall Completion)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 10.1
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (C++)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 21.61
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (C#)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 13.91
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (D)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 9.5
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Go)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 53.57
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Java)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 21.58
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Julia)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 13.75
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (JavaScript)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 26.88
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Lua)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 15.26
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (PHP)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 23.04
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Perl)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 12.1
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Python)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 29.6
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (R)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 13.77
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Ruby)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 12.68
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Racket)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 4.29
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Rust)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 19.54
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Scala)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 18.33
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Bash)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 5.7
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Swift)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 17.68
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (TypeScript)
metrics:
- name: pass@1 (T=0.2)
type: pass@1
value: 25
verified: false
language:
- en
---

# Refact-1.6B
Finally, the model we started training with our [blog post](https://refact.ai/blog/2023/applying-recent-innovations-to-train-model/) is ready 🎉
After fine-tuning on generated data, it beats Replit 3b, Stability Code 3b and many other models. It almost matches
StarCoder, a model ten times its size!
Model | Size | HumanEval pass@1 | HumanEval pass@10 |
----------------------|---------------|--------------------|--------------------|
DeciCoder-1b | 1b | 19.1% | |
<b>Refact-1.6-fim</b> | <b>1.6b</b> | <b>32.0%</b> | <b>53.0%</b> |
StableCode | 3b | 20.2% | 33.8% |
ReplitCode v1 | 3b | 21.9% | |
CodeGen2.5-multi | 7b | 28.4% | 47.5% |
CodeLlama | 7b | 33.5% | 59.6% |
StarCoder | 15b | 33.6% | |
It is likely the best model for practical code completion in your IDE, because it's both smart and fast!
You can start using it right now by downloading the
[Refact plugin](https://refact.ai/). You can host the model yourself, too, using the
[open source docker container](https://github.com/smallcloudai/refact).
It is also multi-language (see MultiPL-HumanEval and other metrics below), and it works as a chat model (see the section below).
# It Works As a Chat
The primary application of this model is code completion (infill) in multiple programming languages.
But it also works quite well as a chat model.
HumanEval results using the instruction-following (chat) format, compared against models specialized for chat only:
Model | Size | pass@1 | pass@10 |
-----------------------|--------|----------|----------|
<b>Refact-1.6-fim</b> | 1.6b | 38.4% | 55.6% |
StableCode-instruct | 3b | 26.9% | 36.2% |
OctoGeeX | 6b | 44.7% | |
CodeLlama-instruct | 7b | 34.8% | 64.3% |
CodeGen2.5-instruct     | 7b     | 36.2%    | 60.87%   |
CodeLlama-instruct | 13b | 42.7% | 71.6% |
StarChat-β | 15b | 33.5% | |
OctoCoder | 15b | 46.2% | |
# Example
Fill-in-the-middle uses special tokens to identify the prefix/middle/suffix parts of the input and output:
```python
# pip install -q transformers
from transformers import AutoModelForCausalLM, AutoTokenizer
checkpoint = "smallcloudai/Refact-1_6B-fim"
device = "cuda" # for GPU usage or "cpu" for CPU usage
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True).to(device)
prompt = '<fim_prefix>def print_hello_world():\n """<fim_suffix>\n print("Hello world!")<fim_middle>'
inputs = tokenizer.encode(prompt, return_tensors="pt").to(device)
outputs = model.generate(inputs, max_length=100, temperature=0.2, do_sample=True)  # do_sample=True so temperature takes effect
print("-"*80)
print(tokenizer.decode(outputs[0]))
```
# Chat Format
The same model works as a chat model (experimental).
```python
prompt_template = "<empty_output>SYSTEM {system}\n" \
"<empty_output>USER {query}\n" \
"<empty_output>ASSISTANT"
prompt = prompt_template.format(system="You are a programming assistant",
query="How do I sort a list in Python?")
```
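A hedged continuation of the snippet above, reusing `tokenizer`, `model` and `device` from the fill-in-the-middle example; the generation parameters here are illustrative, not official settings:
```python
inputs = tokenizer.encode(prompt, return_tensors="pt").to(device)
outputs = model.generate(inputs, max_new_tokens=128, temperature=0.2, do_sample=True)
print(tokenizer.decode(outputs[0][inputs.shape[1]:]))  # decode only the newly generated reply
```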
# Architecture
As described in more detail in the blog post, we used:
- [ALiBi](https://arxiv.org/abs/2108.12409) based attention
- [LayerNorm](https://arxiv.org/abs/1607.06450v1) instead of [RMSNorm](https://arxiv.org/pdf/1910.07467.pdf)
- [Multi Query Attention](https://arxiv.org/abs/1911.02150)
We also used LiON, flash attention, and early dropout. None of this is so exotic that you can't run the model yourself -- in fact you can, see the example above. A sketch of the ALiBi bias follows below.
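The sketch below builds the per-head linear attention biases from the ALiBi paper linked above; the head count and sequence length are illustrative assumptions, not Refact's actual configuration:
```python
import torch

# Hedged sketch of ALiBi (Attention with Linear Biases); numbers are illustrative.
def alibi_bias(num_heads: int, seq_len: int) -> torch.Tensor:
    # Per-head slopes form a geometric sequence: 2^-1, 2^-2, ..., 2^-8 for 8 heads.
    slopes = torch.tensor([2.0 ** (-8.0 * (h + 1) / num_heads) for h in range(num_heads)])
    positions = torch.arange(seq_len)
    distance = positions[None, :] - positions[:, None]   # (seq, seq); j - i <= 0 for past tokens
    # The bias penalizes attention to distant past tokens linearly, per head.
    return slopes[:, None, None] * distance[None, :, :]  # (heads, seq, seq)

bias = alibi_bias(num_heads=8, seq_len=16)  # added to attention logits before softmax
```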
# Pretraining
For the base model, we used our own dataset that contains code with permissive licenses only, and open text datasets.
Filtering is key to the success of this model:
- We only used text in English
- Only topics related to computer science
- Applied heavy deduplication
The text-to-code proportion was 50:50, and the model was trained for 1.2T tokens. The filtering steps above are sketched below.
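A minimal sketch of such a filtering pipeline, assuming exact hash-based deduplication; the actual classifiers and thresholds are not public, so `looks_english` and `is_computer_science` are hypothetical stand-ins:
```python
import hashlib

def deduplicate(documents):
    """Exact deduplication by content hash (heavier near-dedup would use e.g. MinHash)."""
    seen, kept = set(), []
    for doc in documents:
        digest = hashlib.sha256(doc.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            kept.append(doc)
    return kept

def filter_corpus(documents, looks_english, is_computer_science):
    # keep only English, computer-science-related text, then deduplicate heavily
    return deduplicate([d for d in documents if looks_english(d) and is_computer_science(d)])
```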
We don't release the base model, because its Fill-in-the-Middle (FIM) output tends to repeat itself too much,
which limits its practical use. But if you still want it, write us a message on Discord.
# Finetuning
We tested our hypothesis that chat data should boost base model performance in FIM and
regular left-to-right code completion. We found that mixing in just 15% of open
[code](https://huggingface.co/datasets/bigcode/commitpackft)
[instruction-following](https://huggingface.co/datasets/rombodawg/2XUNCENSORED_MegaCodeTraining188k) data,
which we filtered for quality, improves almost all metrics.
Additionally, to improve FIM, we observed common failure modes, and prepared a synthetic dataset based on
[The Stack dedup v1.1](https://huggingface.co/datasets/bigcode/the-stack-dedup) to address them.
There is a distribution shift between typical code on the internet and the code you write in your IDE.
The former is likely finished, so the model tries to come up with a suggestion that makes the code complete.
The code you are working on, by contrast, is likely half-written, with no single addition that can repair it
fully.
In practice, the model needs a tendency to stop after a couple of lines are added, and sometimes to write
nothing at all. We found that training on empty completions, single-line completions, and multiline
completions that end with a smaller indent or at least a newline makes the model much more usable. This data
was used as the remaining 85% of the finetune dataset.
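A hedged sketch of how such a fill-in-the-middle training example could be built from a source file, using the special tokens from the Example section; the split heuristics here are illustrative, not the exact synthetic-data recipe:
```python
import random

def make_fim_example(code: str, rng: random.Random) -> str:
    """Split a file into prefix/middle/suffix and serialize with FIM tokens."""
    lines = code.splitlines(keepends=True)
    i = rng.randrange(len(lines))
    j = rng.randrange(i, len(lines) + 1)
    prefix, middle, suffix = "".join(lines[:i]), "".join(lines[i:j]), "".join(lines[j:])
    # `middle` may be empty or a single line, teaching the model that short
    # (or no) completions are often the right answer for half-written code.
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>{middle}"

example = make_fim_example("def add(a, b):\n    return a + b\n", random.Random(0))
```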
The final model is the result of several attempts to make it work as well as possible for code completion,
and to perform well on a wide range of metrics. The best attempt took 40B tokens.
# Limitations and Bias
The Refact-1.6B model was trained on text in English, though it has seen many more languages in
code comments. Its performance on non-English languages is certainly lower.
# Model Stats
- **Architecture:** LLAMA-like model with multi-query attention
- **Objectives:** Fill-in-the-Middle, Chat
- **Tokens context:** 4096
- **Pretraining tokens:** 1.2T
- **Finetuning tokens:** 40B
- **Precision:** bfloat16
- **GPUs:** 64 NVidia A5000
- **Training time:** 28 days
# License
The model is licensed under the BigScience OpenRAIL-M v1 license agreement.
# Citation
If you are using this model, please give a link to this page. | 18,416 | [
[
-0.03497314453125,
-0.0626220703125,
0.00799560546875,
0.00986480712890625,
-0.009429931640625,
-0.0099639892578125,
-0.02783203125,
-0.03851318359375,
0.00891876220703125,
0.033966064453125,
-0.0343017578125,
-0.036834716796875,
-0.03643798828125,
-0.0010175704956054688,
-0.0258331298828125,
0.08062744140625,
0.0196533203125,
0.00522613525390625,
-0.01390838623046875,
-0.0163421630859375,
-0.0433349609375,
-0.05584716796875,
-0.056488037109375,
-0.0247802734375,
0.03350830078125,
0.046234130859375,
0.050018310546875,
0.052459716796875,
0.056671142578125,
0.02471923828125,
-0.0154571533203125,
0.00728607177734375,
-0.0517578125,
-0.0196075439453125,
0.01081085205078125,
-0.04791259765625,
-0.040740966796875,
0.00606536865234375,
0.0258331298828125,
0.0275726318359375,
-0.00977325439453125,
0.030181884765625,
0.002887725830078125,
0.06378173828125,
-0.03289794921875,
0.0096435546875,
-0.03778076171875,
-0.0146331787109375,
0.000400543212890625,
0.0096435546875,
-0.02886962890625,
-0.01218414306640625,
-0.0061492919921875,
-0.032012939453125,
0.0100250244140625,
0.01195526123046875,
0.084228515625,
0.041473388671875,
-0.019256591796875,
-0.0133056640625,
-0.05462646484375,
0.056396484375,
-0.07147216796875,
0.038299560546875,
0.0264892578125,
0.015777587890625,
-0.01410675048828125,
-0.0723876953125,
-0.05279541015625,
-0.0238037109375,
-0.0142059326171875,
0.004299163818359375,
-0.0166015625,
0.0102386474609375,
0.040679931640625,
0.040679931640625,
-0.04315185546875,
0.0166168212890625,
-0.04638671875,
-0.009063720703125,
0.054779052734375,
0.005634307861328125,
0.01519775390625,
-0.019927978515625,
-0.0243988037109375,
-0.01311492919921875,
-0.03863525390625,
0.0257720947265625,
0.0121612548828125,
0.00774383544921875,
-0.022491455078125,
0.034698486328125,
-0.028717041015625,
0.0482177734375,
0.00876617431640625,
-0.0111846923828125,
0.0262298583984375,
-0.04180908203125,
-0.03338623046875,
-0.0294036865234375,
0.07781982421875,
0.02691650390625,
0.014190673828125,
0.01171875,
-0.0096282958984375,
0.00559234619140625,
0.01445770263671875,
-0.07183837890625,
-0.027618408203125,
0.033355712890625,
-0.0499267578125,
-0.023590087890625,
-0.013641357421875,
-0.03363037109375,
-0.0006256103515625,
-0.03179931640625,
0.0289764404296875,
-0.04510498046875,
-0.0159454345703125,
0.015594482421875,
0.004840850830078125,
0.012481689453125,
0.019500732421875,
-0.05419921875,
0.007534027099609375,
0.0491943359375,
0.062103271484375,
0.0035457611083984375,
-0.017059326171875,
-0.023529052734375,
-0.03936767578125,
-0.01418304443359375,
0.046417236328125,
-0.021575927734375,
-0.0032253265380859375,
-0.034210205078125,
-0.006084442138671875,
-0.00832366943359375,
-0.03057861328125,
0.026123046875,
-0.03741455078125,
0.031982421875,
-0.001171112060546875,
-0.048828125,
-0.0244903564453125,
0.0177154541015625,
-0.042510986328125,
0.0771484375,
-0.0033111572265625,
-0.06512451171875,
0.0164794921875,
-0.054351806640625,
-0.004787445068359375,
-0.015594482421875,
-0.0028400421142578125,
-0.03271484375,
-0.00888824462890625,
0.006587982177734375,
0.0367431640625,
-0.0291290283203125,
0.026123046875,
-0.02764892578125,
-0.05096435546875,
0.0215911865234375,
-0.0294189453125,
0.06488037109375,
0.04022216796875,
-0.02520751953125,
0.0110931396484375,
-0.0572509765625,
0.0126953125,
0.006809234619140625,
-0.0264129638671875,
-0.00878143310546875,
-0.0134735107421875,
0.007518768310546875,
0.035400390625,
0.0182952880859375,
-0.029144287109375,
0.019683837890625,
-0.036895751953125,
0.047821044921875,
0.035247802734375,
0.0007109642028808594,
0.031646728515625,
-0.039794921875,
0.035858154296875,
0.0018949508666992188,
0.0273895263671875,
-0.0163726806640625,
-0.0384521484375,
-0.0797119140625,
-0.03619384765625,
0.046539306640625,
0.04443359375,
-0.045806884765625,
0.044586181640625,
-0.0209503173828125,
-0.06951904296875,
-0.048828125,
0.01010894775390625,
0.04913330078125,
0.0286102294921875,
0.021514892578125,
-0.0180816650390625,
-0.042327880859375,
-0.07281494140625,
0.0037403106689453125,
-0.0225677490234375,
0.0006256103515625,
0.01416778564453125,
0.047332763671875,
-0.035552978515625,
0.06707763671875,
-0.03436279296875,
-0.01995849609375,
-0.0240478515625,
-0.0058441162109375,
0.0343017578125,
0.0557861328125,
0.040802001953125,
-0.05792236328125,
-0.0311279296875,
-0.01087188720703125,
-0.050079345703125,
0.0110015869140625,
-0.00909423828125,
-0.0227203369140625,
0.0237579345703125,
0.041046142578125,
-0.048248291015625,
0.0304412841796875,
0.035003662109375,
-0.0205230712890625,
0.0275421142578125,
-0.00562286376953125,
0.00921630859375,
-0.0909423828125,
0.032012939453125,
-0.0105438232421875,
-0.0100250244140625,
-0.04852294921875,
0.01377105712890625,
0.00860595703125,
-0.004474639892578125,
-0.032806396484375,
0.045074462890625,
-0.026702880859375,
0.0175323486328125,
-0.0113067626953125,
-0.0019054412841796875,
0.00006210803985595703,
0.07421875,
-0.01031494140625,
0.05340576171875,
0.02587890625,
-0.0482177734375,
0.01438140869140625,
0.02044677734375,
-0.0195770263671875,
0.003875732421875,
-0.06365966796875,
0.0203094482421875,
-0.003170013427734375,
0.01390838623046875,
-0.09063720703125,
-0.01546478271484375,
0.0186309814453125,
-0.05194091796875,
0.0194549560546875,
-0.0255584716796875,
-0.047698974609375,
-0.03851318359375,
-0.01465606689453125,
0.023529052734375,
0.053497314453125,
-0.032958984375,
0.0299072265625,
0.0213165283203125,
0.0015888214111328125,
-0.04718017578125,
-0.05181884765625,
-0.0063018798828125,
-0.01605224609375,
-0.05963134765625,
0.01849365234375,
-0.0256195068359375,
-0.005443572998046875,
-0.018096923828125,
0.0011720657348632812,
-0.009796142578125,
0.0210113525390625,
0.0305328369140625,
0.03179931640625,
-0.0130462646484375,
0.0055694580078125,
-0.02587890625,
-0.006511688232421875,
-0.00829315185546875,
-0.003826141357421875,
0.0528564453125,
-0.0254669189453125,
-0.0166015625,
-0.04852294921875,
0.0087890625,
0.059356689453125,
-0.0256195068359375,
0.05072021484375,
0.034820556640625,
-0.029754638671875,
0.0005769729614257812,
-0.060546875,
-0.01995849609375,
-0.03607177734375,
0.02978515625,
-0.0179290771484375,
-0.059814453125,
0.050048828125,
0.0227508544921875,
0.02569580078125,
0.034881591796875,
0.032623291015625,
0.0006690025329589844,
0.081787109375,
0.055511474609375,
-0.0201873779296875,
0.0555419921875,
-0.050079345703125,
0.0185089111328125,
-0.044586181640625,
-0.0159759521484375,
-0.0433349609375,
-0.0285186767578125,
-0.051116943359375,
-0.02630615234375,
0.01081085205078125,
0.032928466796875,
-0.0251007080078125,
0.03515625,
-0.051055908203125,
0.0206298828125,
0.0499267578125,
0.0103607177734375,
0.0256195068359375,
0.004734039306640625,
0.005283355712890625,
-0.00562286376953125,
-0.0704345703125,
-0.041046142578125,
0.08746337890625,
0.044219970703125,
0.050262451171875,
-0.00403594970703125,
0.06591796875,
0.0050201416015625,
0.03564453125,
-0.054962158203125,
0.031768798828125,
-0.005100250244140625,
-0.044281005859375,
-0.00861358642578125,
-0.042877197265625,
-0.06036376953125,
0.02099609375,
-0.00827789306640625,
-0.041473388671875,
0.01332855224609375,
0.01538848876953125,
-0.0491943359375,
0.006954193115234375,
-0.0760498046875,
0.08154296875,
-0.01690673828125,
-0.0252532958984375,
-0.021514892578125,
-0.0396728515625,
0.039703369140625,
0.0023441314697265625,
0.017486572265625,
-0.004627227783203125,
0.00982666015625,
0.06298828125,
-0.045806884765625,
0.06005859375,
-0.005435943603515625,
0.0137481689453125,
0.0267791748046875,
-0.0046844482421875,
0.02557373046875,
0.008636474609375,
-0.004901885986328125,
0.0286102294921875,
0.007129669189453125,
-0.02197265625,
-0.044525146484375,
0.05126953125,
-0.0654296875,
-0.03997802734375,
-0.023468017578125,
-0.0270233154296875,
0.00632476806640625,
0.0206298828125,
0.032440185546875,
0.039520263671875,
-0.001087188720703125,
0.0258331298828125,
0.04559326171875,
-0.0172271728515625,
0.03155517578125,
0.047027587890625,
-0.0256500244140625,
-0.04119873046875,
0.059906005859375,
0.0173187255859375,
0.0196533203125,
0.0129547119140625,
-0.005290985107421875,
-0.0085296630859375,
-0.0274658203125,
-0.033111572265625,
0.027740478515625,
-0.0467529296875,
-0.032623291015625,
-0.07208251953125,
-0.03216552734375,
-0.029083251953125,
0.000051856040954589844,
-0.04095458984375,
-0.0355224609375,
-0.020904541015625,
0.006649017333984375,
0.03936767578125,
0.04473876953125,
0.0108489990234375,
0.020599365234375,
-0.06298828125,
0.0159759521484375,
0.0034427642822265625,
0.0305633544921875,
0.0025634765625,
-0.0595703125,
-0.034515380859375,
0.0267486572265625,
-0.02581787109375,
-0.037353515625,
0.0291290283203125,
0.003658294677734375,
0.0308837890625,
0.022613525390625,
0.002574920654296875,
0.06219482421875,
-0.0209197998046875,
0.058074951171875,
0.0072784423828125,
-0.05145263671875,
0.037506103515625,
-0.010406494140625,
0.0221405029296875,
0.048980712890625,
0.0188446044921875,
-0.04376220703125,
-0.037689208984375,
-0.0673828125,
-0.062103271484375,
0.0673828125,
0.031005859375,
0.004787445068359375,
-0.00106048583984375,
0.030792236328125,
-0.01195526123046875,
0.01300048828125,
-0.053741455078125,
-0.0272216796875,
-0.01123046875,
-0.0198516845703125,
0.0006966590881347656,
-0.0160369873046875,
0.017181396484375,
-0.034942626953125,
0.054931640625,
0.0006556510925292969,
0.040679931640625,
0.019256591796875,
-0.012359619140625,
0.0167388916015625,
0.0077667236328125,
0.038848876953125,
0.036834716796875,
-0.018035888671875,
-0.01873779296875,
0.007663726806640625,
-0.0296630859375,
-0.00865936279296875,
0.01535797119140625,
-0.0108795166015625,
-0.0012845993041992188,
0.0284271240234375,
0.059112548828125,
0.0160064697265625,
-0.052490234375,
0.039337158203125,
-0.0208740234375,
-0.004871368408203125,
-0.03009033203125,
0.0307159423828125,
0.004032135009765625,
0.0152130126953125,
0.01407623291015625,
0.024871826171875,
0.0092315673828125,
-0.034820556640625,
0.01241302490234375,
0.01617431640625,
-0.019500732421875,
-0.0149383544921875,
0.06256103515625,
0.018157958984375,
-0.018798828125,
0.060943603515625,
-0.0135650634765625,
-0.04412841796875,
0.07037353515625,
0.019744873046875,
0.052886962890625,
0.0031604766845703125,
-0.0084228515625,
0.052978515625,
0.0174713134765625,
-0.018768310546875,
0.03143310546875,
-0.00165557861328125,
-0.043243408203125,
-0.029388427734375,
-0.043914794921875,
-0.0129852294921875,
0.01171875,
-0.036865234375,
0.040557861328125,
-0.042449951171875,
-0.0218658447265625,
-0.0020046234130859375,
0.0195159912109375,
-0.0711669921875,
0.0158538818359375,
0.0029811859130859375,
0.0821533203125,
-0.04852294921875,
0.06866455078125,
0.049835205078125,
-0.06494140625,
-0.0704345703125,
-0.0178985595703125,
-0.013458251953125,
-0.0421142578125,
0.045135498046875,
0.03460693359375,
0.012298583984375,
0.01265716552734375,
-0.05340576171875,
-0.08441162109375,
0.08746337890625,
0.017608642578125,
-0.054229736328125,
-0.016143798828125,
-0.005634307861328125,
0.05810546875,
-0.0279693603515625,
0.047454833984375,
0.0360107421875,
0.014892578125,
-0.005886077880859375,
-0.0823974609375,
0.002777099609375,
-0.049835205078125,
0.00441741943359375,
0.001277923583984375,
-0.07501220703125,
0.07037353515625,
-0.022186279296875,
-0.015411376953125,
0.010009765625,
0.05059814453125,
0.0178375244140625,
0.016571044921875,
0.034698486328125,
0.04632568359375,
0.053955078125,
-0.0021820068359375,
0.08203125,
-0.05035400390625,
0.03778076171875,
0.077880859375,
0.005229949951171875,
0.0721435546875,
0.016082763671875,
-0.00600433349609375,
0.03546142578125,
0.054718017578125,
-0.007793426513671875,
0.0204010009765625,
0.0003898143768310547,
0.015625,
-0.0171356201171875,
-0.00036644935607910156,
-0.0307464599609375,
0.03411865234375,
0.0162811279296875,
-0.0198974609375,
0.0135650634765625,
0.0093536376953125,
0.0108489990234375,
-0.006195068359375,
0.00001913309097290039,
0.0679931640625,
0.01224517822265625,
-0.06439208984375,
0.08502197265625,
0.0083770751953125,
0.07415771484375,
-0.0379638671875,
-0.00257110595703125,
-0.038970947265625,
0.02923583984375,
-0.02410888671875,
-0.033233642578125,
0.0157318115234375,
0.004550933837890625,
-0.0272674560546875,
-0.01079559326171875,
0.040435791015625,
-0.039276123046875,
-0.0287628173828125,
0.02484130859375,
0.0196685791015625,
0.01513671875,
-0.01324462890625,
-0.05853271484375,
0.007904052734375,
0.01477813720703125,
-0.0267181396484375,
0.01001739501953125,
0.0193939208984375,
-0.00391387939453125,
0.06488037109375,
0.05657958984375,
-0.01526641845703125,
-0.002384185791015625,
-0.0097808837890625,
0.07122802734375,
-0.052001953125,
-0.0384521484375,
-0.05657958984375,
0.055084228515625,
-0.01186370849609375,
-0.032379150390625,
0.053436279296875,
0.05224609375,
0.07757568359375,
0.002349853515625,
0.04339599609375,
-0.0222625732421875,
0.033660888671875,
-0.036102294921875,
0.06573486328125,
-0.050262451171875,
0.03131103515625,
-0.01036834716796875,
-0.059234619140625,
-0.00911712646484375,
0.036376953125,
-0.0162811279296875,
0.00933837890625,
0.03985595703125,
0.08258056640625,
-0.01261138916015625,
-0.00390625,
0.0186004638671875,
0.01788330078125,
0.02166748046875,
0.05389404296875,
0.04913330078125,
-0.066650390625,
0.056304931640625,
-0.040771484375,
-0.01947021484375,
-0.0264434814453125,
-0.031707763671875,
-0.0760498046875,
-0.042999267578125,
-0.0284423828125,
-0.0377197265625,
-0.0182037353515625,
0.09405517578125,
0.060546875,
-0.056060791015625,
-0.0128631591796875,
0.00514984130859375,
0.00829315185546875,
-0.01546478271484375,
-0.0189056396484375,
0.01373291015625,
-0.034637451171875,
-0.056884765625,
0.00910186767578125,
-0.013885498046875,
0.0015325546264648438,
-0.00841522216796875,
-0.0120086669921875,
-0.0218353271484375,
-0.0023097991943359375,
0.032867431640625,
0.015625,
-0.050689697265625,
-0.01241302490234375,
0.0157012939453125,
-0.041107177734375,
0.00852203369140625,
0.04241943359375,
-0.050872802734375,
0.0258331298828125,
0.050567626953125,
0.040374755859375,
0.028106689453125,
-0.00531768798828125,
0.034515380859375,
-0.038848876953125,
0.0188140869140625,
0.0036773681640625,
0.033416748046875,
0.0187225341796875,
-0.03802490234375,
0.052459716796875,
0.01439666748046875,
-0.055084228515625,
-0.061676025390625,
-0.00780487060546875,
-0.057220458984375,
-0.01425933837890625,
0.08489990234375,
-0.004886627197265625,
-0.022613525390625,
-0.00148773193359375,
-0.035858154296875,
0.0240478515625,
-0.03192138671875,
0.06378173828125,
0.048126220703125,
0.001598358154296875,
-0.0016622543334960938,
-0.04571533203125,
0.043548583984375,
0.041107177734375,
-0.0447998046875,
0.00531768798828125,
0.045318603515625,
0.041748046875,
0.01995849609375,
0.06451416015625,
-0.001941680908203125,
0.032806396484375,
-0.016632080078125,
0.031005859375,
-0.023162841796875,
-0.020751953125,
-0.036224365234375,
-0.00876617431640625,
0.003963470458984375,
-0.01727294921875
]
] |
joeddav/distilbert-base-uncased-go-emotions-student | 2021-02-19T22:15:52.000Z | [
"transformers",
"pytorch",
"tf",
"distilbert",
"text-classification",
"tensorflow",
"en",
"dataset:go_emotions",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | joeddav | null | null | joeddav/distilbert-base-uncased-go-emotions-student | 58 | 26,759 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- text-classification
- pytorch
- tensorflow
datasets:
- go_emotions
license: mit
widget:
- text: "I feel lucky to be here."
---
# distilbert-base-uncased-go-emotions-student
## Model Description
This model is distilled from the zero-shot classification pipeline on the unlabeled GoEmotions dataset using [this
script](https://github.com/huggingface/transformers/tree/master/examples/research_projects/zero-shot-distillation).
It was trained with mixed precision for 10 epochs and otherwise used the default script arguments.
## Intended Usage
The model can be used like any other model trained on GoEmotions, but will likely not perform as well as a model
trained with full supervision. It is primarily intended as a demo of how an expensive NLI-based zero-shot model
can be distilled to a more efficient student, allowing a classifier to be trained with only unlabeled data. Note
that although the GoEmotions dataset allows multiple labels per instance, the teacher used single-label
classification to create pseudo-labels.
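A minimal usage sketch with the `transformers` pipeline (the sample text reuses the widget example above):
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="joeddav/distilbert-base-uncased-go-emotions-student",
    top_k=None,  # return scores for every emotion label (older versions: return_all_scores=True)
)
print(classifier("I feel lucky to be here."))
```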
| 1,055 | [
[
-0.005817413330078125,
-0.035858154296875,
0.0289154052734375,
-0.01190948486328125,
-0.030364990234375,
0.00928497314453125,
0.0067901611328125,
0.00109100341796875,
0.01300811767578125,
0.024383544921875,
-0.045562744140625,
-0.0513916015625,
-0.046478271484375,
-0.01097869873046875,
-0.0308380126953125,
0.108642578125,
-0.0035400390625,
0.007152557373046875,
0.00726318359375,
-0.01166534423828125,
-0.006206512451171875,
-0.056976318359375,
-0.0230255126953125,
-0.04315185546875,
0.018585205078125,
0.033782958984375,
0.042083740234375,
0.007781982421875,
0.031463623046875,
0.0184478759765625,
-0.023162841796875,
-0.042022705078125,
-0.055023193359375,
-0.0108489990234375,
0.00850677490234375,
-0.04296875,
-0.037567138671875,
0.0308685302734375,
0.017913818359375,
0.044708251953125,
-0.03387451171875,
0.0345458984375,
-0.0081634521484375,
0.038543701171875,
-0.061004638671875,
0.004169464111328125,
-0.0283966064453125,
0.0273284912109375,
-0.01336669921875,
0.00867462158203125,
-0.04302978515625,
0.004924774169921875,
0.0300445556640625,
-0.032958984375,
0.032073974609375,
-0.024261474609375,
0.07086181640625,
0.0249176025390625,
-0.040435791015625,
-0.005886077880859375,
-0.050140380859375,
0.07781982421875,
-0.049896240234375,
-0.007640838623046875,
0.0160064697265625,
0.0435791015625,
-0.00530242919921875,
-0.06903076171875,
-0.03839111328125,
0.021240234375,
0.007152557373046875,
0.037322998046875,
-0.0197601318359375,
0.00908660888671875,
0.028228759765625,
0.05029296875,
-0.033477783203125,
-0.0175323486328125,
-0.032501220703125,
-0.0001499652862548828,
0.03106689453125,
-0.0007777214050292969,
0.0079345703125,
-0.01230621337890625,
-0.0462646484375,
-0.021759033203125,
-0.0193634033203125,
0.0197601318359375,
0.049285888671875,
0.0276336669921875,
-0.0221710205078125,
0.064697265625,
-0.007175445556640625,
0.05230712890625,
0.0114288330078125,
-0.010894775390625,
0.045318603515625,
0.006130218505859375,
-0.05059814453125,
0.009063720703125,
0.060638427734375,
0.03668212890625,
0.044921875,
-0.00562286376953125,
-0.01232147216796875,
0.008636474609375,
0.031951904296875,
-0.084716796875,
-0.0313720703125,
0.0297393798828125,
-0.028594970703125,
-0.052764892578125,
0.006011962890625,
-0.049224853515625,
0.01148223876953125,
-0.035552978515625,
0.0257110595703125,
-0.0303192138671875,
-0.02716064453125,
0.016510009765625,
-0.0216522216796875,
-0.01128387451171875,
0.00455474853515625,
-0.056304931640625,
0.00720977783203125,
-0.0020198822021484375,
0.055633544921875,
0.007480621337890625,
-0.0146026611328125,
0.012847900390625,
-0.005832672119140625,
-0.0236358642578125,
0.035736083984375,
-0.038848876953125,
-0.044891357421875,
-0.00965118408203125,
0.016387939453125,
0.00945281982421875,
-0.035858154296875,
0.047027587890625,
-0.0199127197265625,
0.0228118896484375,
0.005413055419921875,
-0.04522705078125,
-0.0357666015625,
0.005313873291015625,
-0.03436279296875,
0.08575439453125,
0.04833984375,
-0.05767822265625,
0.0266876220703125,
-0.055206298828125,
0.0092620849609375,
0.01190948486328125,
0.02337646484375,
-0.0355224609375,
0.03369140625,
-0.01445770263671875,
0.0305023193359375,
0.0047760009765625,
0.051971435546875,
-0.0295562744140625,
-0.023162841796875,
-0.0019292831420898438,
-0.0307769775390625,
0.0953369140625,
0.005863189697265625,
-0.012359619140625,
-0.000576019287109375,
-0.07318115234375,
-0.002277374267578125,
0.01239013671875,
-0.004024505615234375,
-0.03436279296875,
-0.027252197265625,
0.00128173828125,
0.00815582275390625,
0.00928497314453125,
-0.05645751953125,
0.023193359375,
-0.0129852294921875,
0.0121002197265625,
0.0355224609375,
0.00927734375,
0.04132080078125,
0.004058837890625,
0.0278778076171875,
0.032745361328125,
0.01360321044921875,
-0.0016012191772460938,
-0.0262603759765625,
-0.06549072265625,
-0.03228759765625,
0.0177154541015625,
0.0321044921875,
-0.036346435546875,
0.037139892578125,
0.01548004150390625,
-0.06781005859375,
-0.0307769775390625,
-0.005512237548828125,
0.01538848876953125,
0.0826416015625,
0.0391845703125,
-0.0170440673828125,
-0.0367431640625,
-0.062225341796875,
0.03277587890625,
-0.01171875,
-0.01218414306640625,
-0.01494598388671875,
0.05865478515625,
-0.02178955078125,
0.08123779296875,
-0.05322265625,
-0.0223846435546875,
0.005207061767578125,
0.042877197265625,
0.0380859375,
0.02880859375,
0.067138671875,
-0.0296173095703125,
-0.032989501953125,
-0.0287017822265625,
-0.0859375,
-0.0004258155822753906,
0.0031986236572265625,
-0.035736083984375,
-0.0034694671630859375,
0.00838470458984375,
-0.03900146484375,
0.06622314453125,
0.03277587890625,
-0.01102447509765625,
0.042938232421875,
-0.0227508544921875,
0.00489044189453125,
-0.08660888671875,
-0.0010242462158203125,
0.010406494140625,
0.0007648468017578125,
-0.05615234375,
-0.00167083740234375,
0.00678253173828125,
-0.0246429443359375,
-0.07037353515625,
0.0162200927734375,
-0.0275115966796875,
0.0189971923828125,
-0.0219268798828125,
-0.026275634765625,
0.0105743408203125,
0.050933837890625,
0.029052734375,
0.0284271240234375,
0.0792236328125,
-0.037200927734375,
0.061431884765625,
0.018218994140625,
0.0024967193603515625,
0.07525634765625,
-0.0660400390625,
0.01511383056640625,
-0.0225982666015625,
0.01512908935546875,
-0.0699462890625,
-0.019378662109375,
0.012237548828125,
-0.02911376953125,
0.03216552734375,
-0.0231781005859375,
-0.016815185546875,
-0.0418701171875,
-0.025482177734375,
0.0276031494140625,
0.03753662109375,
-0.0478515625,
0.051849365234375,
0.0186309814453125,
0.009521484375,
-0.037994384765625,
-0.0579833984375,
-0.016571044921875,
-0.036834716796875,
-0.016143798828125,
0.0093841552734375,
-0.0158538818359375,
-0.00836944580078125,
0.0023326873779296875,
-0.0023479461669921875,
-0.0287017822265625,
-0.00916290283203125,
0.055267333984375,
0.0509033203125,
0.007228851318359375,
0.01363372802734375,
0.0261993408203125,
-0.015777587890625,
-0.000457763671875,
-0.0022125244140625,
0.01082611083984375,
-0.00603485107421875,
-0.021331787109375,
-0.042724609375,
0.0181121826171875,
0.038330078125,
0.01287841796875,
0.056793212890625,
0.0181884765625,
-0.03717041015625,
-0.01145172119140625,
-0.01319122314453125,
-0.0189971923828125,
-0.029388427734375,
0.026275634765625,
-0.0244140625,
-0.043182373046875,
0.041168212890625,
0.00254058837890625,
0.00666046142578125,
0.0338134765625,
0.043670654296875,
-0.01258087158203125,
0.063720703125,
0.038848876953125,
-0.00957489013671875,
0.0038204193115234375,
-0.040863037109375,
0.005138397216796875,
-0.04852294921875,
-0.0330810546875,
-0.032318115234375,
-0.04254150390625,
-0.0208740234375,
-0.015655517578125,
0.000010669231414794922,
0.0246429443359375,
-0.0391845703125,
0.038238525390625,
-0.064697265625,
0.044921875,
0.046142578125,
0.0034503936767578125,
0.01104736328125,
0.003498077392578125,
0.007137298583984375,
-0.007663726806640625,
-0.0199737548828125,
-0.05072021484375,
0.10015869140625,
0.03546142578125,
0.060638427734375,
0.005023956298828125,
0.06390380859375,
0.0032596588134765625,
0.034027099609375,
-0.054534912109375,
0.0298919677734375,
-0.004131317138671875,
-0.071044921875,
-0.0079345703125,
-0.0219573974609375,
-0.05810546875,
0.0010061264038085938,
-0.0013151168823242188,
-0.058563232421875,
0.0102081298828125,
0.0029754638671875,
-0.031341552734375,
0.02435302734375,
-0.08184814453125,
0.0740966796875,
-0.0175018310546875,
0.0004949569702148438,
0.033477783203125,
-0.05096435546875,
0.031646728515625,
-0.0101776123046875,
-0.0131988525390625,
-0.009674072265625,
0.026397705078125,
0.05426025390625,
-0.04315185546875,
0.06719970703125,
-0.03558349609375,
0.00168609619140625,
0.06298828125,
0.0050048828125,
0.00405120849609375,
-0.00008165836334228516,
0.0033111572265625,
0.04541015625,
0.0032711029052734375,
-0.032318115234375,
-0.01788330078125,
0.043121337890625,
-0.06982421875,
0.00037860870361328125,
-0.049652099609375,
-0.037994384765625,
-0.0029811859130859375,
0.0088348388671875,
0.045623779296875,
0.029266357421875,
-0.01465606689453125,
0.01329803466796875,
0.04522705078125,
0.003055572509765625,
0.047088623046875,
0.035430908203125,
-0.0312347412109375,
-0.00940704345703125,
0.06085205078125,
0.019805908203125,
0.0157318115234375,
-0.0034809112548828125,
0.007389068603515625,
-0.039276123046875,
-0.0144805908203125,
-0.03875732421875,
-0.00472259521484375,
-0.044708251953125,
-0.0284423828125,
-0.0295562744140625,
-0.0182952880859375,
-0.037322998046875,
-0.00995635986328125,
-0.038177490234375,
-0.01396942138671875,
-0.060394287109375,
-0.051849365234375,
0.0301971435546875,
0.036834716796875,
-0.01001739501953125,
0.061798095703125,
-0.051177978515625,
0.029632568359375,
0.0286712646484375,
0.029876708984375,
-0.022857666015625,
-0.0665283203125,
-0.0029926300048828125,
0.01751708984375,
-0.0306243896484375,
-0.08160400390625,
0.04443359375,
0.01535797119140625,
0.03314208984375,
0.04150390625,
0.0211334228515625,
0.042083740234375,
-0.0181121826171875,
0.049468994140625,
0.025970458984375,
-0.062225341796875,
0.04571533203125,
-0.0208740234375,
0.022613525390625,
0.06585693359375,
0.054534912109375,
-0.01042938232421875,
-0.0250701904296875,
-0.0618896484375,
-0.07574462890625,
0.06585693359375,
0.01715087890625,
0.0213775634765625,
-0.000015974044799804688,
0.01245880126953125,
0.014373779296875,
0.0245208740234375,
-0.0648193359375,
-0.034210205078125,
-0.03802490234375,
-0.037933349609375,
-0.0006337165832519531,
-0.032867431640625,
-0.00917816162109375,
-0.01457977294921875,
0.057098388671875,
-0.00023126602172851562,
-0.00646209716796875,
-0.004772186279296875,
-0.0034694671630859375,
-0.01139068603515625,
-0.00206756591796875,
0.016937255859375,
0.01052093505859375,
-0.07452392578125,
0.0096435546875,
0.0027828216552734375,
-0.042633056640625,
0.0355224609375,
-0.00791168212890625,
-0.01097869873046875,
0.01788330078125,
-0.005046844482421875,
0.08245849609375,
-0.0019273757934570312,
-0.0267333984375,
0.0277557373046875,
-0.015869140625,
-0.035064697265625,
-0.038726806640625,
0.003673553466796875,
0.00716400146484375,
0.01041412353515625,
0.02642822265625,
0.005031585693359375,
0.01409149169921875,
-0.036956787109375,
0.0116729736328125,
0.006000518798828125,
-0.049285888671875,
-0.030609130859375,
0.032562255859375,
0.01727294921875,
-0.012359619140625,
0.061737060546875,
-0.0192718505859375,
-0.01166534423828125,
0.037872314453125,
0.01922607421875,
0.06036376953125,
-0.0233306884765625,
0.03271484375,
0.04302978515625,
-0.00934600830078125,
-0.021270751953125,
0.001560211181640625,
0.00733184814453125,
-0.044281005859375,
-0.021453857421875,
-0.057220458984375,
-0.036865234375,
0.01708984375,
-0.07147216796875,
0.0252532958984375,
-0.03485107421875,
-0.032928466796875,
0.00691986083984375,
-0.0301971435546875,
-0.034576416015625,
0.04736328125,
0.0211029052734375,
0.064208984375,
-0.09381103515625,
0.055694580078125,
0.048248291015625,
-0.034088134765625,
-0.07769775390625,
-0.03240966796875,
0.0186614990234375,
-0.029327392578125,
0.05963134765625,
0.0232696533203125,
-0.0008244514465332031,
0.01203155517578125,
-0.0556640625,
-0.03338623046875,
0.07147216796875,
0.03106689453125,
-0.0511474609375,
0.0027561187744140625,
0.013092041015625,
0.05633544921875,
-0.02960205078125,
0.06146240234375,
0.032440185546875,
0.00664520263671875,
0.040283203125,
-0.034698486328125,
-0.051239013671875,
-0.0234527587890625,
-0.01318359375,
-0.0013074874877929688,
-0.0264892578125,
0.07794189453125,
-0.007572174072265625,
-0.00787353515625,
-0.025054931640625,
0.04718017578125,
0.00930023193359375,
0.03338623046875,
0.0253448486328125,
0.084716796875,
0.043182373046875,
-0.0243072509765625,
0.0626220703125,
-0.007152557373046875,
0.06658935546875,
0.10260009765625,
-0.036865234375,
0.041412353515625,
0.03863525390625,
-0.0274200439453125,
0.03961181640625,
0.060455322265625,
0.00330352783203125,
0.044097900390625,
0.0244140625,
-0.02508544921875,
-0.0019931793212890625,
-0.007793426513671875,
-0.0290679931640625,
0.0360107421875,
-0.0069122314453125,
-0.0125732421875,
-0.037109375,
0.006420135498046875,
0.00905609130859375,
-0.017791748046875,
-0.0157012939453125,
0.03546142578125,
-0.01461029052734375,
-0.034637451171875,
0.033294677734375,
-0.0163116455078125,
0.05169677734375,
-0.04315185546875,
0.005756378173828125,
-0.01531219482421875,
0.0193634033203125,
-0.01242828369140625,
-0.0679931640625,
0.0283660888671875,
0.0211029052734375,
-0.0175323486328125,
-0.01258087158203125,
0.044708251953125,
-0.0406494140625,
-0.06903076171875,
0.0262603759765625,
0.0275421142578125,
0.036285400390625,
-0.0224761962890625,
-0.050079345703125,
-0.01062774658203125,
0.006824493408203125,
-0.04913330078125,
0.0287017822265625,
0.046234130859375,
0.01580810546875,
0.0257568359375,
0.04168701171875,
-0.00612640380859375,
-0.006378173828125,
0.023284912109375,
0.061676025390625,
-0.03497314453125,
0.003925323486328125,
-0.06842041015625,
0.07708740234375,
-0.0058746337890625,
-0.036224365234375,
0.04107666015625,
0.048828125,
0.06536865234375,
-0.0279693603515625,
0.043304443359375,
-0.00007766485214233398,
0.0467529296875,
0.0030994415283203125,
0.04449462890625,
-0.03546142578125,
-0.0323486328125,
-0.0269927978515625,
-0.07879638671875,
-0.00974273681640625,
0.061920166015625,
0.01113128662109375,
0.006816864013671875,
0.0386962890625,
0.04107666015625,
-0.0038318634033203125,
0.0012388229370117188,
0.034332275390625,
-0.00797271728515625,
0.01019287109375,
0.045013427734375,
0.040496826171875,
-0.03173828125,
0.007656097412109375,
-0.05255126953125,
-0.03424072265625,
0.00395965576171875,
-0.0743408203125,
-0.078125,
-0.023590087890625,
-0.050750732421875,
-0.0240631103515625,
-0.016845703125,
0.0391845703125,
0.056884765625,
-0.06927490234375,
-0.01201629638671875,
-0.017059326171875,
-0.0260772705078125,
0.024139404296875,
-0.0174713134765625,
0.006832122802734375,
-0.0031871795654296875,
-0.08013916015625,
0.00836944580078125,
0.02813720703125,
0.049468994140625,
-0.0122833251953125,
0.0065460205078125,
-0.0179443359375,
-0.0048980712890625,
0.0172271728515625,
0.0029048919677734375,
-0.00926971435546875,
-0.01507568359375,
0.01314544677734375,
-0.00830078125,
-0.0004367828369140625,
0.02642822265625,
-0.05169677734375,
0.047637939453125,
0.021209716796875,
-0.0008306503295898438,
0.034088134765625,
0.00933074951171875,
0.007778167724609375,
-0.07635498046875,
0.0447998046875,
0.0161590576171875,
0.035858154296875,
0.0193023681640625,
-0.01006317138671875,
0.0528564453125,
0.0298309326171875,
-0.04351806640625,
-0.06646728515625,
0.006908416748046875,
-0.082763671875,
-0.00780487060546875,
0.06890869140625,
-0.03802490234375,
-0.035369873046875,
0.0096282958984375,
-0.034881591796875,
0.04656982421875,
-0.042327880859375,
0.048828125,
0.061676025390625,
0.00287628173828125,
-0.01084136962890625,
-0.032562255859375,
-0.004665374755859375,
0.0198822021484375,
-0.044586181640625,
0.00020647048950195312,
0.043914794921875,
0.0122833251953125,
0.028594970703125,
0.026092529296875,
-0.01314544677734375,
-0.0178985595703125,
0.0181732177734375,
0.050567626953125,
-0.0155792236328125,
-0.0295257568359375,
-0.00704193115234375,
0.0168914794921875,
-0.0227813720703125,
-0.031005859375
]
] |
tiiuae/falcon-180B-chat | 2023-11-06T16:13:19.000Z | [
"transformers",
"safetensors",
"falcon",
"text-generation",
"en",
"de",
"es",
"fr",
"dataset:tiiuae/falcon-refinedweb",
"arxiv:1911.02150",
"arxiv:2005.14165",
"arxiv:2104.09864",
"arxiv:2205.14135",
"arxiv:2306.01116",
"license:unknown",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | tiiuae | null | null | tiiuae/falcon-180B-chat | 465 | 26,684 | transformers | 2023-09-04T08:29:48 | ---
datasets:
- tiiuae/falcon-refinedweb
language:
- en
- de
- es
- fr
inference: false
license: unknown
extra_gated_heading: "Acknowledge license to access the repository"
extra_gated_prompt: "You agree to the [Falcon-180B TII license](https://huggingface.co/spaces/tiiuae/falcon-180b-license/blob/main/LICENSE.txt) and [acceptable use policy](https://huggingface.co/spaces/tiiuae/falcon-180b-license/blob/main/ACCEPTABLE_USE_POLICY.txt)."
extra_gated_button_content: "I agree to the terms and conditions of the Falcon-180B TII license and to the acceptable use policy"
---
# 🚀 Falcon-180B-Chat
**Falcon-180B-Chat is a 180B-parameter causal decoder-only model built by [TII](https://www.tii.ae) based on [Falcon-180B](https://huggingface.co/tiiuae/falcon-180B) and finetuned on a mixture of [Ultrachat](https://huggingface.co/datasets/stingning/ultrachat), [Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus) and [Airoboros](https://huggingface.co/datasets/jondurbin/airoboros-2.1). It is made available under the [Falcon-180B TII License](https://huggingface.co/tiiuae/falcon-180B-chat/blob/main/LICENSE.txt) and [Acceptable Use Policy](https://huggingface.co/tiiuae/falcon-180B-chat/blob/main/ACCEPTABLE_USE_POLICY.txt).**
*Paper coming soon* 😊
🤗 To get started with Falcon (inference, finetuning, quantization, etc.), we recommend reading [this great blogpost from HF](https://hf.co/blog/falcon-180b) or this [one](https://huggingface.co/blog/falcon) from the release of the 40B!
Note that since Falcon-180B is larger than what can easily be handled with `transformers`+`accelerate`, we recommend using [Text Generation Inference](https://github.com/huggingface/text-generation-inference).
You will need **at least 400GB of memory** to swiftly run inference with Falcon-180B.
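As a hedged sketch (everything below is an assumption for illustration — the endpoint URL, prompt format, and generation parameters are not part of this card), querying a running TGI server from Python could look like:

```python
from huggingface_hub import InferenceClient

# Assumes a Text Generation Inference server is already serving
# tiiuae/falcon-180b-chat at this hypothetical local endpoint.
client = InferenceClient(model="http://127.0.0.1:8080")

response = client.text_generation(
    "User: Write a haiku about falcons.\nFalcon:",
    max_new_tokens=64,
    temperature=0.7,
)
print(response)
```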
## Why use Falcon-180B-chat?
* ✨ **You are looking for a ready-to-use chat/instruct model based on [Falcon-180B](https://huggingface.co/tiiuae/falcon-180B).**
* **It is the best open-access model currently available, and one of the best models overall.** Falcon-180B outperforms [LLaMA-2](https://huggingface.co/meta-llama/Llama-2-70b-hf), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1), [MPT](https://huggingface.co/mosaicml/mpt-7b), etc. See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
* **It features an architecture optimized for inference**, with multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)).
* **It is made available under a permissive license allowing for commercial use**.
💬 **This is a Chat model, which may not be ideal for further finetuning.** If you are interested in building your own instruct/chat model, we recommend starting from [Falcon-180B](https://huggingface.co/tiiuae/falcon-180b).
💸 **Looking for a smaller, less expensive model?** [Falcon-7B-Instruct](https://huggingface.co/tiiuae/falcon-7b-instruct) and [Falcon-40B-Instruct](https://huggingface.co/tiiuae/falcon-40b-instruct) are Falcon-180B-Chat's little brothers!
💥 **Falcon LLMs require PyTorch 2.0 for use with `transformers`!**
# Model Card for Falcon-180B-Chat
## Model Details
### Model Description
- **Developed by:** [https://www.tii.ae](https://www.tii.ae);
- **Model type:** Causal decoder-only;
- **Language(s) (NLP):** English, German, Spanish, French (and limited capabilities in Italian, Portuguese, Polish, Dutch, Romanian, Czech, Swedish);
- **License:** [Falcon-180B TII License](https://huggingface.co/tiiuae/falcon-180B-chat/blob/main/LICENSE.txt) and [Acceptable Use Policy](https://huggingface.co/tiiuae/falcon-180B-chat/blob/main/ACCEPTABLE_USE_POLICY.txt).
### Model Source
- **Paper:** *coming soon*.
## Uses
See the [acceptable use policy](https://huggingface.co/tiiuae/falcon-180B-chat/blob/main/ACCEPTABLE_USE_POLICY.txt).
### Direct Use
Falcon-180B-Chat has been finetuned on a chat dataset.
### Out-of-Scope Use
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
## Bias, Risks, and Limitations
Falcon-180B-Chat is mostly trained on English data, and will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
### Recommendations
We recommend that users of Falcon-180B-Chat develop guardrails and take appropriate precautions for any production use.
## How to Get Started with the Model
To run inference with the model in full `bfloat16` precision you need approximately 8xA100 80GB or equivalent.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch
model = "tiiuae/falcon-180b-chat"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
torch_dtype=torch.bfloat16,
trust_remote_code=True,
device_map="auto",
)
sequences = pipeline(
"Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
max_length=200,
do_sample=True,
top_k=10,
num_return_sequences=1,
eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
```
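If 8xA100 80GB is out of reach, quantized loading is one option. A minimal sketch, assuming `bitsandbytes` and `accelerate` are installed (8-bit roughly halves the `bfloat16` memory footprint, so the hardware requirements remain substantial):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "tiiuae/falcon-180b-chat"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    load_in_8bit=True,      # int8 quantization via bitsandbytes
    device_map="auto",      # shard weights across available GPUs
    trust_remote_code=True,
)
```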
## Training Details
**Falcon-180B-Chat is based on [Falcon-180B](https://huggingface.co/tiiuae/falcon-180B).**
### Training Data
Falcon-180B-Chat is finetuned on a mixture of [Ultrachat](https://huggingface.co/datasets/stingning/ultrachat), [Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus) and [Airoboros](https://huggingface.co/datasets/jondurbin/airoboros-2.1).
The data was tokenized with the Falcon tokenizer.
## Evaluation
*Paper coming soon.*
See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) for early results.
## Technical Specifications
### Model Architecture and Objective
Falcon-180B-Chat is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token).
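Concretely (our notation, added for illustration), the causal language modeling objective minimizes the negative log-likelihood of each token given its left context:

```latex
\mathcal{L}(\theta) = -\sum_{t=1}^{T} \log p_\theta\left(x_t \mid x_{<t}\right)
```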
The architecture is broadly adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)), with the following differences:
* **Positional embeddings:** rotary ([Su et al., 2021](https://arxiv.org/abs/2104.09864));
* **Attention:** multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)) and FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135));
* **Decoder-block:** parallel attention/MLP with two layer norms.
For multiquery, we are using an internal variant which uses independent keys and values per tensor-parallel degree.
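The sketch below is ours and purely illustrative — it shows the core idea of multiquery attention (many query heads sharing a single key/value head), not the internal variant described above:

```python
import torch
import torch.nn.functional as F

def multiquery_attention(x, wq, wk, wv, n_heads):
    """Causal multiquery self-attention: n_heads query heads share one K/V head."""
    b, t, d = x.shape
    head_dim = d // n_heads
    q = (x @ wq).view(b, t, n_heads, head_dim).transpose(1, 2)  # (b, n_heads, t, head_dim)
    k = (x @ wk).view(b, t, 1, head_dim).transpose(1, 2)        # (b, 1, t, head_dim)
    v = (x @ wv).view(b, t, 1, head_dim).transpose(1, 2)        # (b, 1, t, head_dim)
    scores = (q @ k.transpose(-2, -1)) / head_dim ** 0.5        # K/V broadcast over heads
    mask = torch.triu(torch.ones(t, t, dtype=torch.bool, device=x.device), diagonal=1)
    scores = scores.masked_fill(mask, float("-inf"))            # causal masking
    out = F.softmax(scores, dim=-1) @ v                         # (b, n_heads, t, head_dim)
    return out.transpose(1, 2).reshape(b, t, d)

# Toy usage: d_model=64, 8 query heads sharing a single 8-dim K/V head.
x = torch.randn(2, 10, 64)
y = multiquery_attention(x, torch.randn(64, 64), torch.randn(64, 8), torch.randn(64, 8), n_heads=8)
print(y.shape)  # torch.Size([2, 10, 64])
```

The memory win is in the KV cache: at inference time only one key/value head per layer needs to be stored, rather than one per attention head.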
| **Hyperparameter** | **Value** | **Comment** |
|--------------------|-----------|----------------------------------------|
| Layers | 80 | |
| `d_model` | 14848 | |
| `head_dim` | 64 | Reduced to optimise for FlashAttention |
| Vocabulary | 65024 | |
| Sequence length | 2048 | |
### Compute Infrastructure
#### Hardware
Falcon-180B-Chat was trained on AWS SageMaker, on up to 4,096 A100 40GB GPUs in P4d instances.
#### Software
Falcon-180B-Chat was trained on a custom distributed training codebase, Gigatron. It uses a 3D parallelism approach combined with ZeRO and high-performance Triton kernels (FlashAttention, etc.).
## Citation
*Paper coming soon* 😊. In the meantime, you can use the following information to cite:
```
@article{falcon,
title={The Falcon Series of Language Models: Towards Open Frontier Models},
author={Almazrouei, Ebtesam and Alobeidli, Hamza and Alshamsi, Abdulaziz and Cappelli, Alessandro and Cojocaru, Ruxandra and Debbah, Merouane and Goffinet, Etienne and Heslow, Daniel and Launay, Julien and Malartic, Quentin and Noune, Badreddine and Pannier, Baptiste and Penedo, Guilherme},
year={2023}
}
```
To learn more about the pretraining dataset, see the 📓 [RefinedWeb paper](https://arxiv.org/abs/2306.01116).
```
@article{refinedweb,
title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
journal={arXiv preprint arXiv:2306.01116},
eprint={2306.01116},
eprinttype = {arXiv},
url={https://arxiv.org/abs/2306.01116},
year={2023}
}
```
## Contact
falconllm@tii.ae | 8,979 | [
[
-0.03759765625,
-0.0745849609375,
0.0025482177734375,
0.033447265625,
0.002651214599609375,
-0.005924224853515625,
-0.01129150390625,
-0.037750244140625,
0.0267791748046875,
0.0304412841796875,
-0.0484619140625,
-0.0345458984375,
-0.046783447265625,
-0.007656097412109375,
-0.0289459228515625,
0.0791015625,
0.01239013671875,
-0.0081024169921875,
0.0025768280029296875,
-0.004791259765625,
-0.01409149169921875,
-0.0518798828125,
-0.0689697265625,
-0.00034046173095703125,
0.0292205810546875,
0.028717041015625,
0.052093505859375,
0.055419921875,
0.04815673828125,
0.028045654296875,
-0.01316070556640625,
0.0175323486328125,
-0.051025390625,
-0.0170135498046875,
0.0092010498046875,
-0.0218658447265625,
-0.04058837890625,
0.0038318634033203125,
0.053558349609375,
0.038970947265625,
-0.00824737548828125,
0.01311492919921875,
0.0017642974853515625,
0.03466796875,
-0.040771484375,
0.037689208984375,
-0.038421630859375,
-0.006923675537109375,
-0.019775390625,
0.01058197021484375,
-0.034271240234375,
0.012847900390625,
-0.01593017578125,
-0.06048583984375,
0.0003147125244140625,
0.0167083740234375,
0.09967041015625,
0.01111602783203125,
-0.0276031494140625,
-0.005878448486328125,
-0.04095458984375,
0.058013916015625,
-0.06536865234375,
0.03814697265625,
0.0263824462890625,
0.0272064208984375,
-0.03253173828125,
-0.07421875,
-0.0355224609375,
-0.012237548828125,
-0.0005221366882324219,
0.0204620361328125,
-0.02618408203125,
0.00702667236328125,
0.0228118896484375,
0.010955810546875,
-0.031829833984375,
0.01390838623046875,
-0.035491943359375,
-0.017364501953125,
0.052520751953125,
0.0063629150390625,
0.014739990234375,
-0.0161285400390625,
-0.025054931640625,
-0.02435302734375,
-0.0279083251953125,
0.019989013671875,
0.033905029296875,
0.031646728515625,
-0.032928466796875,
0.044097900390625,
-0.0244903564453125,
0.03887939453125,
0.01861572265625,
-0.005428314208984375,
0.0259552001953125,
-0.029632568359375,
-0.0274505615234375,
-0.023651123046875,
0.0947265625,
0.0220794677734375,
0.00835418701171875,
-0.0028781890869140625,
0.008331298828125,
-0.017333984375,
-0.0004944801330566406,
-0.068603515625,
0.007110595703125,
0.0251617431640625,
-0.04046630859375,
-0.0195465087890625,
0.021209716796875,
-0.058685302734375,
-0.0018243789672851562,
0.01009368896484375,
0.01009368896484375,
-0.034271240234375,
-0.0360107421875,
0.01666259765625,
-0.006038665771484375,
0.018280029296875,
-0.0045318603515625,
-0.05609130859375,
0.02032470703125,
0.049346923828125,
0.06793212890625,
0.009307861328125,
-0.046295166015625,
-0.046630859375,
-0.00547027587890625,
-0.01507568359375,
0.046295166015625,
-0.03680419921875,
-0.0251617431640625,
-0.00878143310546875,
0.0220947265625,
-0.0228271484375,
-0.00711822509765625,
0.062469482421875,
-0.0265960693359375,
0.0237579345703125,
-0.0243072509765625,
-0.04876708984375,
-0.0235595703125,
-0.00550079345703125,
-0.038421630859375,
0.06927490234375,
-0.0044403076171875,
-0.07421875,
0.01123046875,
-0.066650390625,
-0.01015472412109375,
-0.00598907470703125,
0.0036563873291015625,
-0.03131103515625,
-0.0244293212890625,
0.027801513671875,
0.047882080078125,
-0.0208740234375,
0.0276031494140625,
-0.05902099609375,
-0.038421630859375,
0.0091552734375,
-0.01175689697265625,
0.0721435546875,
0.0430908203125,
-0.018646240234375,
0.00926971435546875,
-0.0297698974609375,
-0.007335662841796875,
0.0217437744140625,
-0.0028400421142578125,
0.007110595703125,
0.007762908935546875,
0.0120391845703125,
0.0162200927734375,
0.005802154541015625,
-0.048858642578125,
0.003978729248046875,
-0.0323486328125,
0.0426025390625,
0.02374267578125,
-0.0032749176025390625,
0.0266876220703125,
-0.038360595703125,
0.0311737060546875,
0.0288238525390625,
0.031524658203125,
-0.0169830322265625,
-0.049468994140625,
-0.072021484375,
-0.033721923828125,
0.0055999755859375,
0.043182373046875,
-0.043212890625,
0.039276123046875,
-0.0087127685546875,
-0.049957275390625,
-0.03564453125,
-0.0071258544921875,
0.032073974609375,
0.033966064453125,
0.0270538330078125,
0.005878448486328125,
-0.05670166015625,
-0.06512451171875,
-0.00742340087890625,
-0.0291290283203125,
0.01207733154296875,
0.034088134765625,
0.034576416015625,
-0.02838134765625,
0.054290771484375,
-0.0269622802734375,
-0.0228271484375,
-0.014404296875,
0.005023956298828125,
0.01119232177734375,
0.03448486328125,
0.060302734375,
-0.0374755859375,
-0.0281982421875,
-0.0117034912109375,
-0.06842041015625,
-0.0010156631469726562,
-0.010040283203125,
-0.0297698974609375,
0.03680419921875,
0.05029296875,
-0.057647705078125,
0.0245208740234375,
0.040283203125,
-0.0274505615234375,
0.0186920166015625,
0.00661468505859375,
0.01214599609375,
-0.09173583984375,
0.01042938232421875,
0.0164031982421875,
0.0028934478759765625,
-0.041473388671875,
0.0139312744140625,
-0.006168365478515625,
-0.003185272216796875,
-0.046051025390625,
0.060028076171875,
-0.02618408203125,
0.007781982421875,
-0.0050048828125,
0.0041656494140625,
-0.007633209228515625,
0.05535888671875,
0.0111236572265625,
0.055084228515625,
0.045806884765625,
-0.0233154296875,
0.00908660888671875,
0.03515625,
0.0031299591064453125,
0.0194244384765625,
-0.06549072265625,
0.00742340087890625,
-0.00463104248046875,
0.032623291015625,
-0.0721435546875,
-0.0175628662109375,
0.037841796875,
-0.052154541015625,
0.023223876953125,
-0.01355743408203125,
-0.026031494140625,
-0.041229248046875,
-0.0193328857421875,
0.001667022705078125,
0.034515380859375,
-0.034149169921875,
0.0242919921875,
0.034149169921875,
0.0020656585693359375,
-0.06646728515625,
-0.0511474609375,
0.00957489013671875,
-0.021636962890625,
-0.06396484375,
0.019683837890625,
-0.0025177001953125,
-0.0034961700439453125,
-0.0011157989501953125,
0.0207672119140625,
-0.001338958740234375,
-0.001468658447265625,
0.038116455078125,
0.0196380615234375,
-0.01971435546875,
-0.00853729248046875,
-0.0021076202392578125,
-0.0029430389404296875,
0.000023603439331054688,
-0.016815185546875,
0.034423828125,
-0.05133056640625,
-0.0217437744140625,
-0.041656494140625,
0.036163330078125,
0.040069580078125,
-0.01381683349609375,
0.073974609375,
0.07025146484375,
-0.022796630859375,
0.01557159423828125,
-0.05291748046875,
-0.01837158203125,
-0.03753662109375,
0.03741455078125,
-0.0295562744140625,
-0.0703125,
0.05242919921875,
0.0262603759765625,
0.01061248779296875,
0.053497314453125,
0.031646728515625,
-0.005313873291015625,
0.08245849609375,
0.0333251953125,
-0.0113983154296875,
0.0418701171875,
-0.035064697265625,
0.0004639625549316406,
-0.0555419921875,
-0.02105712890625,
-0.049896240234375,
-0.01385498046875,
-0.05596923828125,
-0.0202178955078125,
-0.0024566650390625,
0.0207366943359375,
-0.053131103515625,
0.026397705078125,
-0.0275421142578125,
0.0189361572265625,
0.041473388671875,
0.0024929046630859375,
0.00429534912109375,
-0.00311279296875,
-0.0102996826171875,
0.0034198760986328125,
-0.061553955078125,
-0.041595458984375,
0.08917236328125,
0.043487548828125,
0.048309326171875,
-0.007602691650390625,
0.060028076171875,
-0.010498046875,
0.0186767578125,
-0.046966552734375,
0.043853759765625,
-0.00873565673828125,
-0.049713134765625,
-0.01385498046875,
-0.0426025390625,
-0.0716552734375,
0.00502777099609375,
-0.01507568359375,
-0.06591796875,
0.0035190582275390625,
-0.0171051025390625,
-0.0093231201171875,
0.025177001953125,
-0.0687255859375,
0.06689453125,
-0.0037822723388671875,
-0.0213470458984375,
0.00551605224609375,
-0.04937744140625,
0.043243408203125,
0.0007596015930175781,
0.01837158203125,
-0.004241943359375,
-0.002628326416015625,
0.0748291015625,
-0.04376220703125,
0.066162109375,
-0.0293731689453125,
0.0292205810546875,
0.037841796875,
-0.011199951171875,
0.041290283203125,
0.0107269287109375,
-0.01251983642578125,
0.04693603515625,
0.030059814453125,
-0.0305633544921875,
-0.028900146484375,
0.059356689453125,
-0.095947265625,
-0.038787841796875,
-0.03271484375,
-0.0443115234375,
-0.01558685302734375,
0.016693115234375,
0.0202178955078125,
0.0188751220703125,
-0.007232666015625,
0.037872314453125,
0.01313018798828125,
-0.02838134765625,
0.0433349609375,
0.040557861328125,
-0.0277862548828125,
-0.030303955078125,
0.051055908203125,
0.0030651092529296875,
0.006473541259765625,
0.022247314453125,
0.0084075927734375,
-0.0382080078125,
-0.02825927734375,
-0.0281982421875,
0.03497314453125,
-0.04229736328125,
-0.019256591796875,
-0.07305908203125,
-0.0491943359375,
-0.03472900390625,
0.00217437744140625,
-0.03558349609375,
-0.023468017578125,
-0.0457763671875,
0.0029125213623046875,
0.0421142578125,
0.0271148681640625,
-0.00222015380859375,
0.036376953125,
-0.066162109375,
0.00469207763671875,
-0.0005245208740234375,
0.0233001708984375,
-0.0022602081298828125,
-0.05255126953125,
-0.011444091796875,
0.0494384765625,
-0.029754638671875,
-0.042083740234375,
0.034942626953125,
0.0307769775390625,
0.058380126953125,
0.0280303955078125,
0.005443572998046875,
0.06591796875,
-0.01026153564453125,
0.060394287109375,
0.0079803466796875,
-0.05926513671875,
0.024139404296875,
-0.046051025390625,
0.00771331787109375,
0.0310516357421875,
0.042327880859375,
-0.04168701171875,
-0.045166015625,
-0.0648193359375,
-0.035247802734375,
0.07684326171875,
0.0281829833984375,
0.01172637939453125,
-0.011444091796875,
0.031494140625,
-0.01500701904296875,
-0.0070648193359375,
-0.040985107421875,
-0.0191497802734375,
-0.04327392578125,
-0.0206756591796875,
-0.00991058349609375,
-0.00559234619140625,
0.007587432861328125,
-0.0263519287109375,
0.05718994140625,
-0.012054443359375,
0.05291748046875,
0.0030670166015625,
0.00249481201171875,
-0.0030498504638671875,
-0.00960540771484375,
0.04656982421875,
0.025360107421875,
-0.0284423828125,
-0.0211639404296875,
0.00732421875,
-0.045135498046875,
-0.01013946533203125,
0.027191162109375,
-0.0013484954833984375,
-0.01617431640625,
0.028289794921875,
0.07940673828125,
0.01425933837890625,
-0.0367431640625,
0.040557861328125,
-0.0173492431640625,
-0.0269622802734375,
-0.006557464599609375,
0.011871337890625,
0.026824951171875,
0.0312347412109375,
0.018890380859375,
-0.0064697265625,
-0.0004906654357910156,
-0.02581787109375,
0.019012451171875,
0.033721923828125,
-0.02288818359375,
-0.0137481689453125,
0.06646728515625,
0.01097869873046875,
-0.0192108154296875,
0.052459716796875,
-0.0208282470703125,
-0.0188751220703125,
0.0548095703125,
0.042266845703125,
0.06512451171875,
0.005046844482421875,
0.0267333984375,
0.048614501953125,
0.0105743408203125,
-0.00661468505859375,
0.021484375,
0.018402099609375,
-0.05694580078125,
-0.033447265625,
-0.0533447265625,
-0.0293731689453125,
0.014251708984375,
-0.0443115234375,
0.038848876953125,
-0.03070068359375,
-0.0208740234375,
0.013153076171875,
0.0222320556640625,
-0.0474853515625,
-0.007465362548828125,
0.00305938720703125,
0.07000732421875,
-0.044158935546875,
0.06915283203125,
0.0491943359375,
-0.05120849609375,
-0.07861328125,
-0.018157958984375,
0.006683349609375,
-0.068603515625,
0.048583984375,
0.013153076171875,
0.00464630126953125,
0.01422119140625,
-0.043853759765625,
-0.06695556640625,
0.08001708984375,
0.025604248046875,
-0.038909912109375,
-0.0020389556884765625,
0.01678466796875,
0.03985595703125,
-0.033721923828125,
0.057220458984375,
0.02630615234375,
0.02777099609375,
0.0452880859375,
-0.07147216796875,
0.005359649658203125,
-0.03802490234375,
0.0008077621459960938,
-0.0019521713256835938,
-0.07940673828125,
0.06695556640625,
-0.020233154296875,
-0.0139923095703125,
0.00485992431640625,
0.0614013671875,
0.025054931640625,
0.009918212890625,
0.0222625732421875,
0.03485107421875,
0.04571533203125,
-0.012481689453125,
0.067138671875,
-0.0369873046875,
0.04254150390625,
0.06402587890625,
-0.005214691162109375,
0.072021484375,
0.0194244384765625,
-0.0126190185546875,
0.0283355712890625,
0.060882568359375,
0.0008168220520019531,
0.01800537109375,
-0.015167236328125,
0.0195159912109375,
-0.01561737060546875,
-0.0035686492919921875,
-0.036407470703125,
0.051116943359375,
0.0248870849609375,
-0.01250457763671875,
-0.0115509033203125,
0.00012826919555664062,
0.034088134765625,
-0.024505615234375,
0.00794219970703125,
0.049163818359375,
0.007476806640625,
-0.056365966796875,
0.078857421875,
0.016815185546875,
0.06585693359375,
-0.05072021484375,
0.01407623291015625,
-0.040191650390625,
0.01702880859375,
-0.015960693359375,
-0.052642822265625,
0.03912353515625,
-0.004116058349609375,
0.0120391845703125,
-0.0037326812744140625,
0.04638671875,
-0.02618408203125,
-0.0484619140625,
0.0251617431640625,
0.0277557373046875,
0.020751953125,
-0.01531219482421875,
-0.07305908203125,
0.031280517578125,
-0.01244354248046875,
-0.0308837890625,
0.02301025390625,
0.01311492919921875,
-0.002796173095703125,
0.060211181640625,
0.052459716796875,
-0.00936126708984375,
0.002166748046875,
-0.00640106201171875,
0.058807373046875,
-0.050079345703125,
-0.0298309326171875,
-0.04864501953125,
0.03521728515625,
-0.007442474365234375,
-0.0259246826171875,
0.057952880859375,
0.04046630859375,
0.054229736328125,
0.0062408447265625,
0.039886474609375,
-0.01140594482421875,
0.0261077880859375,
-0.02252197265625,
0.061676025390625,
-0.046722412109375,
0.008819580078125,
-0.0299530029296875,
-0.049224853515625,
-0.0147552490234375,
0.04339599609375,
-0.00811004638671875,
0.0193634033203125,
0.047882080078125,
0.082763671875,
-0.00562286376953125,
0.032623291015625,
0.007793426513671875,
0.0243988037109375,
0.036163330078125,
0.0640869140625,
0.06024169921875,
-0.059906005859375,
0.056488037109375,
-0.0219879150390625,
-0.02496337890625,
-0.0256195068359375,
-0.06353759765625,
-0.08770751953125,
-0.061065673828125,
-0.03094482421875,
-0.0288238525390625,
-0.002716064453125,
0.06500244140625,
0.068359375,
-0.050537109375,
-0.0343017578125,
-0.006175994873046875,
-0.006916046142578125,
-0.0217132568359375,
-0.017242431640625,
0.023040771484375,
-0.031036376953125,
-0.0565185546875,
0.01467132568359375,
-0.001644134521484375,
0.0102081298828125,
-0.01532745361328125,
-0.0243377685546875,
-0.03802490234375,
0.006603240966796875,
0.0406494140625,
0.0372314453125,
-0.054901123046875,
-0.0267333984375,
0.0266265869140625,
-0.01326751708984375,
-0.002750396728515625,
0.01727294921875,
-0.037994384765625,
0.016204833984375,
0.03558349609375,
0.0396728515625,
0.06781005859375,
-0.00795745849609375,
0.01776123046875,
-0.0360107421875,
0.035430908203125,
-0.006198883056640625,
0.0286102294921875,
0.011260986328125,
-0.0269622802734375,
0.03765869140625,
0.0248870849609375,
-0.03912353515625,
-0.05853271484375,
-0.016998291015625,
-0.0911865234375,
-0.01142120361328125,
0.1065673828125,
-0.0186767578125,
-0.0205230712890625,
0.0069732666015625,
-0.0300445556640625,
0.0282440185546875,
-0.0394287109375,
0.03704833984375,
0.0538330078125,
-0.005435943603515625,
-0.0183563232421875,
-0.032745361328125,
0.02850341796875,
0.006229400634765625,
-0.06744384765625,
-0.005619049072265625,
0.0276641845703125,
0.0231475830078125,
-0.00598907470703125,
0.040771484375,
-0.008941650390625,
-0.0005307197570800781,
0.00977325439453125,
-0.004131317138671875,
-0.04156494140625,
-0.0191497802734375,
-0.0049285888671875,
0.0006451606750488281,
-0.0161285400390625,
-0.0263824462890625
]
] |
xlm-clm-ende-1024 | 2023-04-06T13:40:40.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"xlm",
"fill-mask",
"multilingual",
"en",
"de",
"arxiv:1901.07291",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | null | null | null | xlm-clm-ende-1024 | 0 | 26,658 | transformers | 2022-03-02T23:29:04 | ---
language:
- multilingual
- en
- de
---
# xlm-clm-ende-1024
# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training](#training)
5. [Evaluation](#evaluation)
6. [Environmental Impact](#environmental-impact)
7. [Technical Specifications](#technical-specifications)
8. [Citation](#citation)
9. [Model Card Authors](#model-card-authors)
10. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
The XLM model was proposed in [Cross-lingual Language Model Pretraining](https://arxiv.org/abs/1901.07291) by Guillaume Lample, Alexis Conneau. xlm-clm-ende-1024 is a transformer pretrained using a causal language modeling (CLM) objective (next token prediction) for English-German.
## Model Description
- **Developed by:** Guillaume Lample, Alexis Conneau, see [associated paper](https://arxiv.org/abs/1901.07291)
- **Model type:** Language model
- **Language(s) (NLP):** English-German
- **License:** Unknown
- **Related Models:** [xlm-clm-enfr-1024](https://huggingface.co/xlm-clm-enfr-1024), [xlm-mlm-ende-1024](https://huggingface.co/xlm-mlm-ende-1024), [xlm-mlm-enfr-1024](https://huggingface.co/xlm-mlm-enfr-1024), [xlm-mlm-enro-1024](https://huggingface.co/xlm-mlm-enro-1024)
- **Resources for more information:**
- [Associated paper](https://arxiv.org/abs/1901.07291)
- [GitHub Repo](https://github.com/facebookresearch/XLM)
- [Hugging Face Multilingual Models for Inference docs](https://huggingface.co/docs/transformers/v4.20.1/en/multilingual#xlm-with-language-embeddings)
# Uses
## Direct Use
The model is a language model that can be used for causal language modeling.
## Downstream Use
To learn more about this task and potential downstream uses, see the [Hugging Face Multilingual Models for Inference](https://huggingface.co/docs/transformers/v4.20.1/en/multilingual#xlm-with-language-embeddings) docs.
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.
# Training
See the [associated paper](https://arxiv.org/pdf/1901.07291.pdf) for details on the training data and training procedure.
# Evaluation
## Testing Data, Factors & Metrics
See the [associated paper](https://arxiv.org/pdf/1901.07291.pdf) for details on the testing data, factors and metrics.
## Results
For xlm-clm-ende-1024 results, see Table 2 of the [associated paper](https://arxiv.org/pdf/1901.07291.pdf).
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications
The model developers write:
> We implement all our models in PyTorch (Paszke et al., 2017), and train them on 64 Volta GPUs for the language modeling tasks, and 8 GPUs for the MT tasks. We use float16 operations to speed up training and to reduce the memory usage of our models.
See the [associated paper](https://arxiv.org/pdf/1901.07291.pdf) for further details.
# Citation
**BibTeX:**
```bibtex
@article{lample2019cross,
title={Cross-lingual language model pretraining},
author={Lample, Guillaume and Conneau, Alexis},
journal={arXiv preprint arXiv:1901.07291},
year={2019}
}
```
**APA:**
- Lample, G., & Conneau, A. (2019). Cross-lingual language model pretraining. arXiv preprint arXiv:1901.07291.
# Model Card Authors
This model card was written by the team at Hugging Face.
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
```python
import torch
from transformers import XLMTokenizer, XLMWithLMHeadModel
tokenizer = XLMTokenizer.from_pretrained("xlm-clm-ende-1024")
model = XLMWithLMHeadModel.from_pretrained("xlm-clm-ende-1024")
input_ids = torch.tensor([tokenizer.encode("Wikipedia was used to")]) # batch size of 1
language_id = tokenizer.lang2id["en"] # 0
langs = torch.tensor([language_id] * input_ids.shape[1]) # torch.tensor([0, 0, 0, ..., 0])
# We reshape it to be of size (batch_size, sequence_length)
langs = langs.view(1, -1) # is now of shape [1, sequence_length] (we have a batch size of 1)
outputs = model(input_ids, langs=langs)
```
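As a follow-up sketch (assuming a `transformers` version where the forward pass returns the logits as the first output), the most likely next token can be read off the last position:

```python
logits = outputs[0]                            # shape: (1, sequence_length, vocab_size)
next_token_id = logits[0, -1].argmax().item()
print(tokenizer.decode([next_token_id]))
```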
</details> | 4,979 | [
[
-0.031494140625,
-0.04803466796875,
0.02105712890625,
0.01544189453125,
-0.00701904296875,
-0.0125274658203125,
-0.027374267578125,
-0.031524658203125,
-0.0012989044189453125,
0.042083740234375,
-0.041015625,
-0.037261962890625,
-0.05352783203125,
-0.00128173828125,
-0.028411865234375,
0.076171875,
-0.0235748291015625,
0.0211334228515625,
0.0005011558532714844,
-0.01264190673828125,
-0.0243682861328125,
-0.05035400390625,
-0.05859375,
-0.0232391357421875,
0.038055419921875,
0.01198577880859375,
0.039642333984375,
0.03424072265625,
0.01470947265625,
0.02752685546875,
-0.00930023193359375,
-0.01117706298828125,
-0.03997802734375,
-0.030303955078125,
0.0081939697265625,
-0.0288848876953125,
-0.037078857421875,
0.01544952392578125,
0.0570068359375,
0.05914306640625,
-0.00365447998046875,
0.0161895751953125,
0.003505706787109375,
0.030975341796875,
-0.02325439453125,
0.0147857666015625,
-0.03564453125,
0.00841522216796875,
-0.009674072265625,
0.01175689697265625,
-0.040313720703125,
-0.00534820556640625,
0.014129638671875,
-0.035430908203125,
0.006786346435546875,
0.012847900390625,
0.084716796875,
0.0024204254150390625,
-0.0234375,
-0.007659912109375,
-0.04302978515625,
0.07177734375,
-0.06976318359375,
0.06573486328125,
0.0211334228515625,
0.0034999847412109375,
0.01806640625,
-0.06732177734375,
-0.0460205078125,
-0.01532745361328125,
-0.022247314453125,
0.013427734375,
-0.0165863037109375,
-0.004085540771484375,
0.0335693359375,
0.01348114013671875,
-0.04852294921875,
0.010223388671875,
-0.02947998046875,
-0.0218658447265625,
0.04327392578125,
-0.0010128021240234375,
0.04168701171875,
-0.005466461181640625,
-0.0255126953125,
-0.0239105224609375,
-0.05035400390625,
0.005992889404296875,
0.037078857421875,
0.0248565673828125,
-0.040069580078125,
0.031402587890625,
0.00966644287109375,
0.045135498046875,
0.0002884864807128906,
-0.0027141571044921875,
0.040313720703125,
-0.03997802734375,
-0.0164794921875,
-0.0017642974853515625,
0.07330322265625,
0.007724761962890625,
0.0139007568359375,
-0.01105499267578125,
-0.0148162841796875,
-0.01239776611328125,
0.00023925304412841797,
-0.0743408203125,
-0.0026721954345703125,
0.016632080078125,
-0.029205322265625,
-0.00424957275390625,
0.001888275146484375,
-0.046234130859375,
0.016082763671875,
-0.037445068359375,
0.03729248046875,
-0.03558349609375,
-0.0360107421875,
0.0093994140625,
0.0149688720703125,
0.01690673828125,
-0.002197265625,
-0.051177978515625,
0.0205230712890625,
0.0245361328125,
0.0562744140625,
-0.0087890625,
-0.0396728515625,
-0.0194244384765625,
-0.01438140869140625,
-0.0115814208984375,
0.03363037109375,
-0.0194549560546875,
-0.0220947265625,
-0.0030727386474609375,
0.026092529296875,
-0.0176239013671875,
-0.02294921875,
0.036041259765625,
-0.0316162109375,
0.048309326171875,
-0.01491546630859375,
-0.04290771484375,
-0.02410888671875,
0.0005602836608886719,
-0.0552978515625,
0.08416748046875,
0.01561737060546875,
-0.06976318359375,
0.0197296142578125,
-0.04608154296875,
-0.0218353271484375,
-0.01314544677734375,
0.0012063980102539062,
-0.048095703125,
-0.0175018310546875,
0.01123046875,
0.0260772705078125,
-0.0218963623046875,
0.03173828125,
-0.009918212890625,
-0.002063751220703125,
-0.004184722900390625,
-0.0195465087890625,
0.08441162109375,
0.02215576171875,
-0.042724609375,
0.0201568603515625,
-0.04559326171875,
0.003704071044921875,
0.01116180419921875,
-0.01354217529296875,
-0.00701904296875,
-0.019073486328125,
0.028839111328125,
0.0401611328125,
0.0233917236328125,
-0.0301971435546875,
-0.012451171875,
-0.030792236328125,
0.01983642578125,
0.04803466796875,
-0.028076171875,
0.034881591796875,
-0.0121307373046875,
0.0411376953125,
0.0226287841796875,
0.0148773193359375,
-0.0112152099609375,
-0.031890869140625,
-0.069091796875,
-0.004703521728515625,
0.030975341796875,
0.045257568359375,
-0.051361083984375,
0.044189453125,
-0.018798828125,
-0.05706787109375,
-0.030548095703125,
0.01470947265625,
0.0501708984375,
0.036041259765625,
0.032379150390625,
-0.0230712890625,
-0.0533447265625,
-0.05987548828125,
-0.0193023681640625,
-0.00016176700592041016,
0.013153076171875,
0.018707275390625,
0.0460205078125,
-0.034210205078125,
0.06396484375,
-0.03277587890625,
-0.0039520263671875,
-0.037841796875,
0.0099639892578125,
0.01303863525390625,
0.040618896484375,
0.05029296875,
-0.05706787109375,
-0.051361083984375,
-0.006153106689453125,
-0.050750732421875,
-0.013946533203125,
-0.00018870830535888672,
-0.01558685302734375,
0.04168701171875,
0.044769287109375,
-0.0399169921875,
0.0164794921875,
0.05419921875,
-0.03729248046875,
0.045654296875,
-0.011505126953125,
-0.01015472412109375,
-0.08795166015625,
0.018951416015625,
0.01056671142578125,
-0.012542724609375,
-0.055999755859375,
-0.006072998046875,
0.004993438720703125,
-0.01453399658203125,
-0.054931640625,
0.0697021484375,
-0.038909912109375,
0.01377105712890625,
-0.007598876953125,
0.0111236572265625,
-0.005878448486328125,
0.0552978515625,
0.029510498046875,
0.034423828125,
0.0582275390625,
-0.043670654296875,
0.004024505615234375,
0.00896453857421875,
-0.019989013671875,
0.01275634765625,
-0.056640625,
0.00670623779296875,
-0.004734039306640625,
0.01320648193359375,
-0.04681396484375,
0.01238250732421875,
0.016876220703125,
-0.025726318359375,
0.046630859375,
0.006122589111328125,
-0.04290771484375,
-0.0233001708984375,
-0.0051422119140625,
0.041015625,
0.04473876953125,
-0.03375244140625,
0.05084228515625,
0.03192138671875,
-0.00787353515625,
-0.052398681640625,
-0.044036865234375,
-0.0096893310546875,
-0.024627685546875,
-0.049346923828125,
0.033935546875,
-0.0149078369140625,
-0.0019311904907226562,
0.0012226104736328125,
0.01183319091796875,
-0.00043582916259765625,
-0.0044708251953125,
0.00484466552734375,
0.0169830322265625,
-0.006378173828125,
-0.0087127685546875,
-0.014251708984375,
-0.0187530517578125,
0.006809234619140625,
-0.0177154541015625,
0.05029296875,
-0.0165557861328125,
-0.0114288330078125,
-0.028411865234375,
0.0202178955078125,
0.0253753662109375,
-0.0173492431640625,
0.06878662109375,
0.072265625,
-0.043731689453125,
-0.0010700225830078125,
-0.0399169921875,
-0.0267333984375,
-0.032867431640625,
0.05206298828125,
-0.00919342041015625,
-0.0596923828125,
0.0506591796875,
0.007572174072265625,
0.002105712890625,
0.040252685546875,
0.0482177734375,
0.0265655517578125,
0.077880859375,
0.063720703125,
-0.0147552490234375,
0.048370361328125,
-0.040496826171875,
0.036834716796875,
-0.05523681640625,
-0.0183258056640625,
-0.03948974609375,
0.0026721954345703125,
-0.052703857421875,
-0.0294189453125,
0.006809234619140625,
0.0161590576171875,
-0.034332275390625,
0.05828857421875,
-0.021514892578125,
-0.0009407997131347656,
0.040557861328125,
-0.0033550262451171875,
0.0031833648681640625,
-0.01116180419921875,
-0.035400390625,
-0.01117706298828125,
-0.0604248046875,
-0.03741455078125,
0.07476806640625,
0.042144775390625,
0.050567626953125,
-0.0070343017578125,
0.042816162109375,
-0.01461029052734375,
0.030364990234375,
-0.03814697265625,
0.03961181640625,
-0.0013933181762695312,
-0.05596923828125,
-0.0279998779296875,
-0.05010986328125,
-0.0750732421875,
0.01430511474609375,
-0.006221771240234375,
-0.05645751953125,
0.00193023681640625,
0.007213592529296875,
-0.00977325439453125,
0.033538818359375,
-0.06329345703125,
0.0953369140625,
-0.04351806640625,
-0.0244903564453125,
-0.00591278076171875,
-0.04302978515625,
0.024200439453125,
-0.0269622802734375,
0.031341552734375,
0.006366729736328125,
0.0152587890625,
0.0771484375,
-0.042205810546875,
0.06854248046875,
-0.016204833984375,
-0.01226806640625,
0.0210113525390625,
-0.0210418701171875,
0.03790283203125,
-0.01043701171875,
0.00353240966796875,
0.0460205078125,
0.0009288787841796875,
-0.019927978515625,
-0.0296630859375,
0.0484619140625,
-0.07666015625,
-0.032958984375,
-0.031646728515625,
-0.0411376953125,
0.00173187255859375,
0.035430908203125,
0.03277587890625,
0.028350830078125,
-0.01029205322265625,
0.00371551513671875,
0.040313720703125,
-0.05987548828125,
0.0306396484375,
0.042236328125,
-0.035247802734375,
-0.0261688232421875,
0.06378173828125,
0.0239105224609375,
0.0289764404296875,
0.0313720703125,
0.01047515869140625,
-0.038421630859375,
-0.0167083740234375,
-0.020172119140625,
0.034820556640625,
-0.05560302734375,
-0.0018768310546875,
-0.06622314453125,
-0.0281829833984375,
-0.05010986328125,
0.0012216567993164062,
-0.041015625,
-0.01385498046875,
-0.01367950439453125,
-0.014129638671875,
0.01751708984375,
0.033660888671875,
-0.00926971435546875,
0.024993896484375,
-0.0501708984375,
0.0100250244140625,
0.005634307861328125,
0.026580810546875,
0.0024433135986328125,
-0.05438232421875,
-0.044342041015625,
0.0155792236328125,
-0.027740478515625,
-0.049530029296875,
0.04925537109375,
-0.001495361328125,
0.060546875,
0.02655029296875,
-0.00887298583984375,
0.040802001953125,
-0.04571533203125,
0.058563232421875,
0.0240936279296875,
-0.07733154296875,
0.0235595703125,
-0.00335693359375,
0.0180511474609375,
0.0131072998046875,
0.051300048828125,
-0.05340576171875,
-0.0243988037109375,
-0.0572509765625,
-0.0755615234375,
0.06854248046875,
0.0161590576171875,
0.0181884765625,
0.002841949462890625,
0.01021575927734375,
-0.003887176513671875,
0.004856109619140625,
-0.09521484375,
-0.04302978515625,
-0.01995849609375,
-0.0139923095703125,
-0.0253753662109375,
-0.011932373046875,
0.00708770751953125,
-0.03802490234375,
0.07440185546875,
0.0030231475830078125,
0.0278472900390625,
0.005458831787109375,
-0.030303955078125,
-0.0035839080810546875,
0.00189208984375,
0.048309326171875,
0.03985595703125,
-0.0164794921875,
0.0056610107421875,
0.032257080078125,
-0.033477783203125,
0.0004987716674804688,
0.0195770263671875,
-0.011474609375,
0.0094757080078125,
0.026031494140625,
0.0859375,
0.02593994140625,
-0.045440673828125,
0.031402587890625,
-0.0028896331787109375,
-0.005218505859375,
-0.03045654296875,
-0.020477294921875,
0.0217437744140625,
-0.0081939697265625,
0.00873565673828125,
-0.0082550048828125,
-0.0139007568359375,
-0.03692626953125,
0.01751708984375,
0.028411865234375,
-0.03826904296875,
-0.034698486328125,
0.0654296875,
0.010528564453125,
-0.013427734375,
0.0421142578125,
-0.010223388671875,
-0.0552978515625,
0.04437255859375,
0.040802001953125,
0.0667724609375,
-0.023773193359375,
0.0013790130615234375,
0.05615234375,
0.032989501953125,
-0.00377655029296875,
0.006801605224609375,
0.0185089111328125,
-0.07080078125,
-0.035736083984375,
-0.050567626953125,
-0.00817108154296875,
0.0163421630859375,
-0.033905029296875,
0.0498046875,
-0.0050201416015625,
-0.0056304931640625,
-0.0098876953125,
0.01224517822265625,
-0.07049560546875,
0.0221099853515625,
0.0176239013671875,
0.073486328125,
-0.0736083984375,
0.06903076171875,
0.041839599609375,
-0.049102783203125,
-0.08135986328125,
-0.026214599609375,
-0.0107879638671875,
-0.059844970703125,
0.0628662109375,
0.0223236083984375,
0.00916290283203125,
0.007724761962890625,
-0.033660888671875,
-0.0732421875,
0.0789794921875,
0.0330810546875,
-0.041229248046875,
-0.0004227161407470703,
0.0306243896484375,
0.034088134765625,
-0.02435302734375,
0.0478515625,
0.030731201171875,
0.045928955078125,
-0.0085296630859375,
-0.07208251953125,
0.0193634033203125,
-0.0234832763671875,
0.0015506744384765625,
0.002471923828125,
-0.046875,
0.0765380859375,
-0.0151824951171875,
-0.017364501953125,
-0.015228271484375,
0.0426025390625,
0.0233154296875,
0.00670623779296875,
0.04296875,
0.06396484375,
0.039093017578125,
-0.009521484375,
0.08685302734375,
-0.052886962890625,
0.04229736328125,
0.068359375,
-0.00954437255859375,
0.048187255859375,
0.0250701904296875,
-0.017059326171875,
0.037872314453125,
0.06341552734375,
0.00626373291015625,
0.0285186767578125,
0.011444091796875,
-0.002178192138671875,
-0.00937652587890625,
0.0013275146484375,
-0.038787841796875,
0.0125579833984375,
0.01178741455078125,
-0.034088134765625,
-0.017791748046875,
0.0030193328857421875,
0.03265380859375,
-0.0198211669921875,
-0.0076751708984375,
0.0374755859375,
0.029022216796875,
-0.05560302734375,
0.041961669921875,
0.0164337158203125,
0.05865478515625,
-0.054901123046875,
0.00778961181640625,
-0.0118255615234375,
0.01183319091796875,
-0.01273345947265625,
-0.03619384765625,
0.019073486328125,
-0.0038604736328125,
-0.006343841552734375,
-0.0377197265625,
0.03302001953125,
-0.05511474609375,
-0.0570068359375,
0.052978515625,
0.037078857421875,
0.0156707763671875,
-0.0007257461547851562,
-0.07684326171875,
0.0053253173828125,
-0.0008096694946289062,
-0.040618896484375,
0.0237579345703125,
0.0231475830078125,
-0.007904052734375,
0.043243408203125,
0.04559326171875,
-0.00569915771484375,
0.011016845703125,
0.0212249755859375,
0.056488037109375,
-0.045135498046875,
-0.028076171875,
-0.059722900390625,
0.050567626953125,
0.0086517333984375,
-0.023345947265625,
0.0699462890625,
0.05596923828125,
0.09039306640625,
-0.003398895263671875,
0.05804443359375,
-0.00891876220703125,
0.0206146240234375,
-0.03826904296875,
0.058258056640625,
-0.045806884765625,
0.00836944580078125,
-0.037200927734375,
-0.07855224609375,
-0.0157470703125,
0.041961669921875,
-0.02972412109375,
0.0396728515625,
0.06280517578125,
0.06781005859375,
0.0006246566772460938,
-0.022552490234375,
0.0237579345703125,
0.0311737060546875,
0.0195770263671875,
0.039581298828125,
0.03424072265625,
-0.04840087890625,
0.041046142578125,
-0.035858154296875,
-0.006488800048828125,
-0.0037975311279296875,
-0.0675048828125,
-0.06488037109375,
-0.05419921875,
-0.040435791015625,
-0.0287322998046875,
-0.024261474609375,
0.06793212890625,
0.0633544921875,
-0.0635986328125,
-0.036834716796875,
-0.01123809814453125,
0.0103759765625,
-0.0150604248046875,
-0.016387939453125,
0.03936767578125,
-0.0297393798828125,
-0.071533203125,
-0.004123687744140625,
-0.005298614501953125,
0.0103912353515625,
-0.027191162109375,
-0.036468505859375,
-0.0271148681640625,
0.0012340545654296875,
0.045379638671875,
0.015655517578125,
-0.050506591796875,
-0.00959014892578125,
0.013702392578125,
-0.0095367431640625,
-0.005992889404296875,
0.041046142578125,
-0.0361328125,
0.0218505859375,
0.028411865234375,
0.031890869140625,
0.05352783203125,
-0.02752685546875,
0.0478515625,
-0.04632568359375,
0.0225677490234375,
-0.0024433135986328125,
0.0533447265625,
0.0283203125,
-0.021514892578125,
0.037078857421875,
0.022216796875,
-0.037078857421875,
-0.0621337890625,
0.0038394927978515625,
-0.07440185546875,
-0.0177154541015625,
0.09283447265625,
-0.020751953125,
-0.0167694091796875,
0.005218505859375,
-0.0162811279296875,
0.0396728515625,
-0.005275726318359375,
0.0323486328125,
0.057373046875,
0.00788116455078125,
-0.03070068359375,
-0.03778076171875,
0.034271240234375,
0.033355712890625,
-0.06427001953125,
-0.00737762451171875,
0.00823211669921875,
0.0347900390625,
0.0167999267578125,
0.037841796875,
-0.018280029296875,
-0.0024967193603515625,
-0.004856109619140625,
0.041168212890625,
0.00630950927734375,
-0.008941650390625,
-0.01323699951171875,
-0.02655029296875,
-0.005283355712890625,
0.01532745361328125
]
] |
timm/resnext101_32x8d.tv_in1k | 2023-04-05T19:10:33.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:1611.05431",
"arxiv:1512.03385",
"license:bsd-3-clause",
"region:us"
] | image-classification | timm | null | null | timm/resnext101_32x8d.tv_in1k | 0 | 26,632 | timm | 2023-04-05T19:09:23 | ---
tags:
- image-classification
- timm
library_tag: timm
license: bsd-3-clause
---
# Model card for resnext101_32x8d.tv_in1k
A ResNeXt-B image classification model.
This model features:
* ReLU activations
* single layer 7x7 convolution with pooling
* 1x1 convolution shortcut downsample
* grouped 3x3 bottleneck convolutions
Trained on ImageNet-1k, original torchvision model weights.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 88.8
- GMACs: 16.5
- Activations (M): 31.2
- Image size: 224 x 224
- **Papers:**
- Aggregated Residual Transformations for Deep Neural Networks: https://arxiv.org/abs/1611.05431
- Deep Residual Learning for Image Recognition: https://arxiv.org/abs/1512.03385
- **Original:** https://github.com/pytorch/vision
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('resnext101_32x8d.tv_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
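A short follow-up for inspecting the result (class indices only — mapping indices to human-readable ImageNet label names requires a separate label file, which this card does not ship):

```python
for prob, idx in zip(top5_probabilities[0], top5_class_indices[0]):
    print(f"class {idx.item()}: {prob.item():.2f}%")
```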
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnext101_32x8d.tv_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
    # print shape of each feature map in output
    # e.g.:
    # torch.Size([1, 64, 112, 112])
    # torch.Size([1, 256, 56, 56])
    # torch.Size([1, 512, 28, 28])
    # torch.Size([1, 1024, 14, 14])
    # torch.Size([1, 2048, 7, 7])
    print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnext101_32x8d.tv_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
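A hedged usage note: for image-to-image retrieval, the pooled embedding is typically L2-normalized so that a dot product becomes a cosine similarity, e.g.:

```python
import torch.nn.functional as F

emb = F.normalize(output, dim=-1)  # (1, num_features), unit length
# cosine similarity between two images a and b: (emb_a * emb_b).sum(dim=-1)
```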
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
|model |img_size|top1 |top5 |param_count|gmacs|macts|img/sec|
|------------------------------------------|--------|-----|-----|-----------|-----|-----|-------|
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|320 |86.72|98.17|93.6 |35.2 |69.7 |451 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|288 |86.51|98.08|93.6 |28.5 |56.4 |560 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|288 |86.49|98.03|93.6 |28.5 |56.4 |557 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|224 |85.96|97.82|93.6 |17.2 |34.2 |923 |
|[resnext101_32x32d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x32d.fb_wsl_ig1b_ft_in1k)|224 |85.11|97.44|468.5 |87.3 |91.1 |254 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|416 |85.0 |97.12|191.9 |108.4|213.8|134 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|352 |84.96|97.22|102.1 |50.2 |101.2|291 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|320 |84.73|97.18|102.1 |41.5 |83.7 |353 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|384 |84.71|96.99|164.0 |77.6 |154.7|183 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|288 |84.57|97.08|93.6 |28.5 |56.4 |557 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|320 |84.45|97.08|93.2 |31.5 |67.8 |446 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|352 |84.43|96.97|129.9 |51.1 |105.5|280 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|288 |84.36|96.92|93.6 |27.6 |53.0 |595 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|320 |84.35|97.04|66.8 |24.1 |47.7 |610 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|288 |84.3 |96.94|164.0 |43.7 |87.1 |333 |
|[resnext101_32x8d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_swsl_ig1b_ft_in1k)|224 |84.28|97.17|88.8 |16.5 |31.2 |1100 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|320 |84.24|96.86|191.9 |64.2 |126.6|228 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|288 |84.19|96.87|93.6 |27.2 |51.6 |613 |
|[resnext101_32x16d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_wsl_ig1b_ft_in1k)|224 |84.18|97.19|194.0 |36.3 |51.2 |581 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|288 |84.11|97.11|44.6 |15.1 |29.0 |1144 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|320 |83.97|96.82|64.7 |31.2 |67.3 |518 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|256 |83.87|96.75|93.2 |20.2 |43.4 |692 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|224 |83.86|96.65|93.6 |17.2 |34.2 |923 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|320 |83.72|96.61|86.6 |24.3 |48.1 |617 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|256 |83.69|96.78|66.8 |15.4 |30.6 |943 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|224 |83.68|96.61|93.6 |16.7 |32.0 |986 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|320 |83.67|96.74|60.2 |24.1 |47.7 |706 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|256 |83.59|96.61|129.9 |27.1 |55.8 |526 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|224 |83.58|96.4 |93.6 |16.5 |31.2 |1013 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|224 |83.54|96.83|44.6 |9.1 |17.6 |1864 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|288 |83.46|96.54|60.2 |19.1 |37.3 |904 |
|[resnext101_32x16d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k)|224 |83.35|96.85|194.0 |36.3 |51.2 |582 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|256 |83.23|96.53|64.7 |20.0 |43.1 |809 |
|[resnext101_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_swsl_ig1b_ft_in1k)|224 |83.22|96.75|44.2 |8.0 |21.2 |1814 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|288 |83.16|96.38|83.5 |25.7 |51.6 |590 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|256 |83.14|96.38|60.2 |15.4 |30.5 |1096 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|320 |83.02|96.45|44.6 |16.5 |34.8 |992 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|288 |82.98|96.54|44.6 |13.4 |28.2 |1077 |
|[resnext101_64x4d.tv_in1k](https://huggingface.co/timm/resnext101_64x4d.tv_in1k)|224 |82.98|96.25|83.5 |15.5 |31.2 |989 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|256 |82.86|96.28|86.6 |15.6 |30.8 |951 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|224 |82.83|96.22|88.8 |16.5 |31.2 |1099 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|224 |82.8 |96.13|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|288 |82.8 |96.32|44.6 |13.0 |26.8 |1291 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|288 |82.74|95.71|60.2 |19.1 |37.3 |905 |
|[resnext101_32x8d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_wsl_ig1b_ft_in1k)|224 |82.69|96.63|88.8 |16.5 |31.2 |1100 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|288 |82.62|95.75|60.2 |19.1 |37.3 |904 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|288 |82.61|96.49|25.6 |8.9 |20.6 |1729 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|288 |82.53|96.13|36.8 |9.9 |21.5 |1773 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|224 |82.5 |96.02|126.9 |22.8 |21.2 |1078 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|224 |82.46|95.92|83.5 |15.5 |31.2 |987 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|288 |82.36|96.18|35.7 |8.1 |20.9 |1964 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|320 |82.35|96.14|25.6 |8.8 |24.1 |1386 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|288 |82.31|95.63|44.6 |13.0 |26.8 |1291 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|288 |82.29|96.01|63.6 |13.6 |28.5 |1078 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|224 |82.29|96.0 |60.2 |11.6 |22.6 |1484 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|288 |82.27|96.06|68.9 |18.9 |23.8 |1176 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|256 |82.26|96.07|44.6 |10.6 |22.2 |1542 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|288 |82.24|95.73|44.6 |13.0 |26.8 |1290 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|288 |82.2 |96.14|27.6 |7.0 |23.8 |1547 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|224 |82.18|96.05|44.6 |8.1 |17.1 |1771 |
|[resnext50_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_swsl_ig1b_ft_in1k)|224 |82.17|96.22|25.0 |4.3 |14.4 |2943 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|288 |82.12|95.65|25.6 |7.1 |19.6 |1704 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|288 |82.03|95.94|25.0 |7.0 |23.8 |1745 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|288 |82.0 |96.15|24.9 |5.8 |12.7 |1787 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|256 |81.99|95.85|36.8 |7.8 |17.0 |2230 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|176 |81.98|95.72|88.8 |10.3 |19.4 |1768 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|224 |81.97|95.24|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|224 |81.93|95.75|44.6 |7.8 |16.2 |2122 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|224 |81.9 |95.77|44.6 |7.8 |16.2 |2118 |
|[resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k)|224 |81.84|96.1 |194.0 |36.3 |51.2 |583 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|256 |81.78|95.94|35.7 |6.4 |16.6 |2471 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|224 |81.77|95.22|60.2 |11.6 |22.6 |1485 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|224 |81.74|96.06|25.6 |5.4 |12.4 |2813 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|288 |81.65|95.54|25.6 |7.1 |19.6 |1703 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|288 |81.64|95.88|25.6 |7.2 |19.7 |1694 |
|[resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k)|224 |81.62|96.04|88.8 |16.5 |31.2 |1101 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|224 |81.61|95.76|68.9 |11.4 |14.4 |1930 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|288 |81.61|95.83|25.6 |8.5 |19.2 |1868 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|224 |81.5 |95.16|44.6 |7.8 |16.2 |2125 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|288 |81.48|95.16|25.0 |7.0 |23.8 |1745 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|288 |81.47|95.71|25.9 |6.9 |18.6 |2071 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|224 |81.45|95.53|68.9 |11.4 |14.4 |1929 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|288 |81.44|95.22|25.6 |7.2 |19.7 |1908 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|256 |81.44|95.67|25.6 |5.6 |15.4 |2168 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|288 |81.4 |95.82|30.2 |6.8 |13.9 |2132 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|288 |81.37|95.74|25.6 |7.2 |19.7 |1910 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|224 |81.32|95.19|44.6 |7.8 |16.2 |2125 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|288 |81.3 |95.65|28.1 |6.8 |18.4 |1803 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|288 |81.3 |95.11|25.0 |7.0 |23.8 |1746 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|224 |81.27|95.62|27.6 |4.3 |14.4 |2591 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|224 |81.26|95.16|25.6 |4.3 |11.8 |2823 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|288 |81.23|95.54|15.7 |4.8 |19.6 |2117 |
|[senet154.gluon_in1k](https://huggingface.co/timm/senet154.gluon_in1k)|224 |81.23|95.35|115.1 |20.8 |38.7 |545 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|288 |81.22|95.11|25.6 |6.8 |18.4 |2089 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|288 |81.22|95.63|25.6 |6.8 |18.4 |676 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|288 |81.18|95.09|25.6 |7.2 |19.7 |1908 |
|[resnet50.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet50.fb_swsl_ig1b_ft_in1k)|224 |81.18|95.98|25.6 |4.1 |11.1 |3455 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|224 |81.17|95.34|25.0 |4.3 |14.4 |2933 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|224 |81.1 |95.33|25.0 |4.3 |14.4 |2934 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|288 |81.1 |95.23|28.1 |6.8 |18.4 |1801 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|288 |81.1 |95.12|28.1 |6.8 |18.4 |1799 |
|[resnet152s.gluon_in1k](https://huggingface.co/timm/resnet152s.gluon_in1k)|224 |81.02|95.41|60.3 |12.9 |25.0 |1347 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|288 |80.97|95.44|25.6 |6.8 |18.4 |2085 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|256 |80.94|95.45|25.9 |5.4 |14.7 |2571 |
|[resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.93|95.73|44.2 |8.0 |21.2 |1814 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|288 |80.91|95.55|25.6 |6.8 |18.4 |2084 |
|[seresnext101_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_32x4d.gluon_in1k)|224 |80.9 |95.31|49.0 |8.0 |21.3 |1585 |
|[seresnext101_64x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_64x4d.gluon_in1k)|224 |80.9 |95.3 |88.2 |15.5 |31.2 |918 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|288 |80.86|95.52|25.6 |6.8 |18.4 |2085 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|224 |80.85|95.43|25.6 |4.1 |11.1 |3450 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|224 |80.84|95.02|25.6 |4.3 |11.8 |2821 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|224 |80.79|95.62|24.9 |3.5 |7.7 |2961 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|288 |80.79|95.36|19.8 |6.0 |14.8 |2506 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|288 |80.79|95.58|19.9 |4.2 |10.6 |2349 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|288 |80.78|94.99|25.6 |6.8 |18.4 |2088 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|288 |80.71|95.43|25.6 |6.8 |18.4 |2087 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|288 |80.7 |95.39|25.0 |7.0 |23.8 |1749 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|192 |80.69|95.24|63.6 |6.0 |12.7 |2270 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|224 |80.68|94.71|25.6 |4.4 |11.9 |3162 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|288 |80.68|95.36|19.7 |6.0 |14.8 |2637 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|224 |80.67|95.3 |25.6 |4.1 |11.1 |3452 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|288 |80.67|95.42|25.0 |7.4 |25.1 |1626 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|224 |80.63|95.21|25.6 |5.2 |11.6 |3034 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|224 |80.61|95.32|25.6 |4.4 |11.9 |2813 |
|[resnext101_64x4d.gluon_in1k](https://huggingface.co/timm/resnext101_64x4d.gluon_in1k)|224 |80.61|94.99|83.5 |15.5 |31.2 |989 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|288 |80.6 |95.31|19.9 |6.0 |14.8 |2578 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|256 |80.57|95.17|15.7 |3.8 |15.5 |2710 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|224 |80.56|95.0 |60.2 |11.6 |22.6 |1483 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|224 |80.53|95.16|25.6 |4.4 |11.9 |3164 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|224 |80.53|94.46|25.0 |4.3 |14.4 |2930 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|176 |80.48|94.98|126.9 |14.3 |13.2 |1719 |
|[resnet152d.gluon_in1k](https://huggingface.co/timm/resnet152d.gluon_in1k)|224 |80.47|95.2 |60.2 |11.8 |23.4 |1428 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|288 |80.45|95.32|25.6 |6.8 |18.4 |2086 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|224 |80.45|95.24|30.2 |4.1 |8.4 |3530 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|224 |80.45|94.63|25.0 |4.3 |14.4 |2936 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|176 |80.43|95.09|68.9 |7.3 |9.0 |3015 |
|[resnet101d.gluon_in1k](https://huggingface.co/timm/resnet101d.gluon_in1k)|224 |80.42|95.01|44.6 |8.1 |17.0 |2007 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|224 |80.38|94.6 |25.6 |4.1 |11.1 |3461 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|256 |80.36|95.1 |19.8 |4.8 |11.7 |3267 |
|[resnext101_32x4d.gluon_in1k](https://huggingface.co/timm/resnext101_32x4d.gluon_in1k)|224 |80.34|94.93|44.2 |8.0 |21.2 |1814 |
|[resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.32|95.4 |25.0 |4.3 |14.4 |2941 |
|[resnet101s.gluon_in1k](https://huggingface.co/timm/resnet101s.gluon_in1k)|224 |80.28|95.16|44.7 |9.2 |18.6 |1851 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|224 |80.26|95.08|28.1 |4.1 |11.1 |2972 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|288 |80.24|95.24|25.6 |8.5 |19.9 |1523 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|224 |80.22|94.63|25.6 |4.4 |11.9 |3162 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|176 |80.2 |94.64|60.2 |7.2 |14.0 |2346 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|224 |80.08|94.74|28.1 |4.1 |11.1 |2969 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|256 |80.08|94.97|19.7 |4.8 |11.7 |3284 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|256 |80.06|94.99|19.9 |4.8 |11.7 |3216 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|224 |80.06|94.95|25.6 |4.1 |11.1 |1109 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|224 |80.02|94.71|28.1 |4.1 |11.1 |2962 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|288 |79.97|95.05|25.6 |6.8 |18.4 |2086 |
|[resnet152c.gluon_in1k](https://huggingface.co/timm/resnet152c.gluon_in1k)|224 |79.92|94.84|60.2 |11.8 |23.4 |1455 |
|[seresnext50_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext50_32x4d.gluon_in1k)|224 |79.91|94.82|27.6 |4.3 |14.4 |2591 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|224 |79.91|94.67|25.6 |4.1 |11.1 |3456 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|176 |79.9 |94.6 |44.6 |4.9 |10.1 |3341 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|224 |79.89|94.97|35.7 |4.5 |12.1 |2774 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|224 |79.88|94.87|25.6 |4.1 |11.1 |3455 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|320 |79.86|95.07|16.0 |5.2 |16.4 |2168 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|224 |79.85|94.56|25.6 |4.1 |11.1 |3460 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|288 |79.83|94.97|25.6 |6.8 |18.4 |2087 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|224 |79.82|94.62|44.6 |7.8 |16.2 |2114 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|224 |79.76|94.6 |25.0 |4.3 |14.4 |2943 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|224 |79.74|94.95|25.6 |4.1 |11.1 |3455 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|224 |79.74|94.87|19.9 |2.5 |6.4 |3929 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|288 |79.71|94.83|19.7 |6.0 |14.8 |2710 |
|[resnet152.gluon_in1k](https://huggingface.co/timm/resnet152.gluon_in1k)|224 |79.68|94.74|60.2 |11.6 |22.6 |1486 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|224 |79.67|94.87|25.0 |4.5 |15.2 |2729 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|288 |79.63|94.91|25.6 |6.8 |18.4 |2086 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|224 |79.56|94.72|25.6 |4.3 |11.8 |2805 |
|[resnet101c.gluon_in1k](https://huggingface.co/timm/resnet101c.gluon_in1k)|224 |79.53|94.58|44.6 |8.1 |17.0 |2062 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|224 |79.52|94.61|25.6 |4.1 |11.1 |3459 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|176 |79.42|94.64|25.6 |2.6 |6.9 |5397 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|288 |79.4 |94.66|18.0 |5.9 |14.6 |2752 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|224 |79.38|94.57|25.6 |4.1 |11.1 |3459 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|176 |79.37|94.3 |25.0 |2.7 |9.0 |4577 |
|[resnext50_32x4d.gluon_in1k](https://huggingface.co/timm/resnext50_32x4d.gluon_in1k)|224 |79.36|94.43|25.0 |4.3 |14.4 |2942 |
|[resnext101_32x8d.tv_in1k](https://huggingface.co/timm/resnext101_32x8d.tv_in1k)|224 |79.31|94.52|88.8 |16.5 |31.2 |1100 |
|[resnet101.gluon_in1k](https://huggingface.co/timm/resnet101.gluon_in1k)|224 |79.31|94.53|44.6 |7.8 |16.2 |2125 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|224 |79.31|94.63|25.6 |5.2 |12.0 |2524 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|176 |79.27|94.49|25.6 |2.6 |6.9 |5404 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|224 |79.25|94.31|25.0 |4.3 |14.4 |2931 |
|[resnet50.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet50.fb_ssl_yfcc100m_ft_in1k)|224 |79.22|94.84|25.6 |4.1 |11.1 |3451 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|256 |79.21|94.56|19.7 |4.8 |11.7 |3392 |
|[resnet50d.gluon_in1k](https://huggingface.co/timm/resnet50d.gluon_in1k)|224 |79.07|94.48|25.6 |4.4 |11.9 |3162 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|224 |79.03|94.38|25.6 |4.1 |11.1 |3453 |
|[resnet50.am_in1k](https://huggingface.co/timm/resnet50.am_in1k)|224 |79.01|94.39|25.6 |4.1 |11.1 |3461 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|256 |79.01|94.37|18.0 |4.6 |11.6 |3440 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|256 |78.9 |94.54|16.0 |3.4 |10.5 |3421 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|160 |78.89|94.11|60.2 |5.9 |11.5 |2745 |
|[wide_resnet101_2.tv_in1k](https://huggingface.co/timm/wide_resnet101_2.tv_in1k)|224 |78.84|94.28|126.9 |22.8 |21.2 |1079 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|288 |78.83|94.24|16.8 |4.5 |16.8 |2251 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|224 |78.81|94.32|25.6 |4.1 |11.1 |3454 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|288 |78.74|94.33|16.8 |4.5 |16.7 |2264 |
|[resnet50s.gluon_in1k](https://huggingface.co/timm/resnet50s.gluon_in1k)|224 |78.72|94.23|25.7 |5.5 |13.5 |2796 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|224 |78.71|94.24|25.6 |4.4 |11.9 |3154 |
|[wide_resnet50_2.tv_in1k](https://huggingface.co/timm/wide_resnet50_2.tv_in1k)|224 |78.47|94.09|68.9 |11.4 |14.4 |1934 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|224 |78.46|94.27|25.6 |4.1 |11.1 |3454 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|288 |78.43|94.35|21.8 |6.5 |7.5 |3291 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|288 |78.42|94.04|10.5 |3.1 |13.3 |3226 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|320 |78.33|94.13|16.0 |5.2 |16.4 |2391 |
|[resnet152.tv_in1k](https://huggingface.co/timm/resnet152.tv_in1k)|224 |78.32|94.04|60.2 |11.6 |22.6 |1487 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|288 |78.28|94.1 |10.4 |3.1 |13.3 |3062 |
|[bat_resnext26ts.ch_in1k](https://huggingface.co/timm/bat_resnext26ts.ch_in1k)|256 |78.25|94.1 |10.7 |2.5 |12.5 |3393 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|224 |78.06|93.78|25.6 |4.1 |11.1 |3450 |
|[resnet50c.gluon_in1k](https://huggingface.co/timm/resnet50c.gluon_in1k)|224 |78.0 |93.99|25.6 |4.4 |11.9 |3286 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|288 |78.0 |93.91|10.3 |3.1 |13.3 |3297 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|224 |77.98|93.75|16.8 |2.7 |10.1 |3841 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|288 |77.92|93.77|21.8 |6.1 |6.2 |3609 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|160 |77.88|93.71|44.6 |4.0 |8.3 |3926 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|256 |77.87|93.84|16.0 |3.4 |10.5 |3772 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|256 |77.86|93.79|10.4 |2.4 |10.5 |4263 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|160 |77.82|93.81|35.7 |2.3 |6.2 |5238 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|256 |77.81|93.82|10.5 |2.4 |10.5 |4183 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|160 |77.79|93.6 |25.6 |2.2 |6.0 |5329 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|160 |77.73|93.32|25.0 |2.2 |7.4 |5576 |
|[resnext50_32x4d.tv_in1k](https://huggingface.co/timm/resnext50_32x4d.tv_in1k)|224 |77.61|93.7 |25.0 |4.3 |14.4 |2944 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|224 |77.59|93.61|16.8 |2.7 |10.2 |3807 |
|[resnet50.gluon_in1k](https://huggingface.co/timm/resnet50.gluon_in1k)|224 |77.58|93.72|25.6 |4.1 |11.1 |3455 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|256 |77.44|93.56|10.3 |2.4 |10.5 |4284 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|288 |77.41|93.63|16.0 |4.3 |13.5 |2907 |
|[resnet101.tv_in1k](https://huggingface.co/timm/resnet101.tv_in1k)|224 |77.38|93.54|44.6 |7.8 |16.2 |2125 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|160 |77.22|93.27|25.6 |2.2 |6.1 |5982 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|288 |77.17|93.47|10.3 |3.1 |13.3 |3392 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|288 |77.15|93.27|21.8 |6.1 |6.2 |3615 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|224 |77.1 |93.37|21.8 |3.9 |4.5 |5436 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|224 |77.02|93.07|28.1 |4.1 |11.1 |2952 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|256 |76.78|93.13|10.3 |2.4 |10.5 |4410 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|224 |76.7 |93.17|16.0 |2.6 |8.2 |4859 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|288 |76.5 |93.35|21.8 |6.1 |6.2 |3617 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|224 |76.42|92.87|21.8 |3.7 |3.7 |5984 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|288 |76.35|93.18|16.0 |3.9 |12.2 |3331 |
|[resnet50.tv_in1k](https://huggingface.co/timm/resnet50.tv_in1k)|224 |76.13|92.86|25.6 |4.1 |11.1 |3457 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|160 |75.96|92.5 |25.6 |2.1 |5.7 |6490 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|224 |75.52|92.44|21.8 |3.7 |3.7 |5991 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|224 |75.3 |92.58|16.0 |2.4 |7.4 |5583 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|224 |75.16|92.18|21.8 |3.7 |3.7 |5994 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|160 |75.1 |92.08|28.1 |2.1 |5.7 |5513 |
|[resnet34.gluon_in1k](https://huggingface.co/timm/resnet34.gluon_in1k)|224 |74.57|91.98|21.8 |3.7 |3.7 |5984 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|288 |73.81|91.83|11.7 |3.4 |5.4 |5196 |
|[resnet34.tv_in1k](https://huggingface.co/timm/resnet34.tv_in1k)|224 |73.32|91.42|21.8 |3.7 |3.7 |5979 |
|[resnet18.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet18.fb_swsl_ig1b_ft_in1k)|224 |73.28|91.73|11.7 |1.8 |2.5 |10213 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|288 |73.16|91.03|11.7 |3.0 |4.1 |6050 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|224 |72.98|91.11|21.8 |3.7 |3.7 |5967 |
|[resnet18.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet18.fb_ssl_yfcc100m_ft_in1k)|224 |72.6 |91.42|11.7 |1.8 |2.5 |10213 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|288 |72.37|90.59|11.7 |3.0 |4.1 |6051 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|224 |72.26|90.31|10.1 |1.7 |5.8 |7026 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|224 |72.26|90.68|11.7 |2.1 |3.3 |8707 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|224 |71.49|90.07|11.7 |1.8 |2.5 |10187 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|176 |71.31|89.69|10.1 |1.1 |3.6 |10970 |
|[resnet18.gluon_in1k](https://huggingface.co/timm/resnet18.gluon_in1k)|224 |70.84|89.76|11.7 |1.8 |2.5 |10210 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|224 |70.64|89.47|11.7 |1.8 |2.5 |10194 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|160 |70.56|89.52|21.8 |1.9 |1.9 |10737 |
|[resnet18.tv_in1k](https://huggingface.co/timm/resnet18.tv_in1k)|224 |69.76|89.07|11.7 |1.8 |2.5 |10205 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|224 |68.34|88.03|5.4 |1.1 |2.4 |13079 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|224 |68.25|88.17|11.7 |1.8 |2.5 |10167 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|176 |66.71|86.96|5.4 |0.7 |1.5 |20327 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|160 |65.66|86.26|11.7 |0.9 |1.3 |18229 |
## Citation
```bibtex
@article{Xie2016,
title={Aggregated Residual Transformations for Deep Neural Networks},
author={Saining Xie and Ross Girshick and Piotr Dollár and Zhuowen Tu and Kaiming He},
journal={arXiv preprint arXiv:1611.05431},
year={2016}
}
```
```bibtex
@article{He2015,
author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
title = {Deep Residual Learning for Image Recognition},
journal = {arXiv preprint arXiv:1512.03385},
year = {2015}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 38,302 | [
[
-0.06585693359375,
-0.0173492431640625,
0.0036373138427734375,
0.0295257568359375,
-0.03240966796875,
-0.00868988037109375,
-0.009521484375,
-0.029205322265625,
0.0849609375,
0.019378662109375,
-0.047760009765625,
-0.03955078125,
-0.04595947265625,
-0.0016622543334960938,
0.024169921875,
0.06427001953125,
-0.0010700225830078125,
-0.00616455078125,
0.0147857666015625,
-0.019317626953125,
-0.004680633544921875,
-0.02325439453125,
-0.07525634765625,
-0.01324462890625,
0.0298919677734375,
0.011749267578125,
0.05145263671875,
0.045867919921875,
0.02752685546875,
0.045318603515625,
-0.018798828125,
0.0200042724609375,
-0.004619598388671875,
-0.00960540771484375,
0.04705810546875,
-0.0284423828125,
-0.0693359375,
-0.0019817352294921875,
0.05352783203125,
0.04779052734375,
0.005001068115234375,
0.0263519287109375,
0.029327392578125,
0.0428466796875,
0.0017347335815429688,
-0.003307342529296875,
0.0014104843139648438,
0.0086822509765625,
-0.018951416015625,
0.0072174072265625,
-0.00420379638671875,
-0.05462646484375,
0.01280975341796875,
-0.047332763671875,
-0.003528594970703125,
-0.0011796951293945312,
0.1004638671875,
-0.00884246826171875,
-0.016754150390625,
0.007099151611328125,
0.00904083251953125,
0.055450439453125,
-0.060272216796875,
0.0264129638671875,
0.0404052734375,
-0.001598358154296875,
-0.01331329345703125,
-0.049407958984375,
-0.0384521484375,
0.011138916015625,
-0.0330810546875,
0.023681640625,
-0.0239715576171875,
-0.0169525146484375,
0.0284423828125,
0.024383544921875,
-0.0355224609375,
-0.01025390625,
-0.027313232421875,
-0.00846099853515625,
0.05255126953125,
0.0057220458984375,
0.051177978515625,
-0.0249786376953125,
-0.038848876953125,
-0.00971221923828125,
-0.01215362548828125,
0.03436279296875,
0.01708984375,
0.00824737548828125,
-0.082275390625,
0.0310211181640625,
0.010711669921875,
0.01824951171875,
0.0262908935546875,
-0.0115966796875,
0.06268310546875,
-0.007213592529296875,
-0.039276123046875,
-0.03515625,
0.07989501953125,
0.049591064453125,
0.020111083984375,
-0.007686614990234375,
-0.0027599334716796875,
-0.0107269287109375,
-0.03094482421875,
-0.0716552734375,
-0.0034580230712890625,
0.0200042724609375,
-0.041229248046875,
-0.0167236328125,
0.0242156982421875,
-0.0679931640625,
-0.0029773712158203125,
-0.00787353515625,
0.00901031494140625,
-0.055206298828125,
-0.033843994140625,
0.0006594657897949219,
-0.0167999267578125,
0.03912353515625,
0.01812744140625,
-0.0226898193359375,
0.031951904296875,
0.003711700439453125,
0.06488037109375,
0.0211029052734375,
-0.00458526611328125,
-0.01477813720703125,
0.0021228790283203125,
-0.0261688232421875,
0.0285797119140625,
0.01148223876953125,
-0.01483154296875,
-0.025787353515625,
0.0316162109375,
-0.0207366943359375,
-0.017364501953125,
0.047515869140625,
0.0231781005859375,
0.01381683349609375,
-0.0189971923828125,
-0.0193939208984375,
-0.0153961181640625,
0.02642822265625,
-0.04248046875,
0.0758056640625,
0.0293731689453125,
-0.081787109375,
0.01277923583984375,
-0.0350341796875,
0.0003590583801269531,
-0.021514892578125,
0.006984710693359375,
-0.06719970703125,
0.003021240234375,
0.0172576904296875,
0.05169677734375,
-0.016082763671875,
-0.01325225830078125,
-0.0265960693359375,
0.004360198974609375,
0.0303497314453125,
0.0112457275390625,
0.066162109375,
0.023895263671875,
-0.033203125,
-0.01491546630859375,
-0.055267333984375,
0.03424072265625,
0.031829833984375,
-0.0005521774291992188,
-0.0052032470703125,
-0.0606689453125,
0.00331878662109375,
0.0457763671875,
0.0202789306640625,
-0.05352783203125,
0.0196685791015625,
-0.01160430908203125,
0.0249481201171875,
0.048553466796875,
0.00139617919921875,
0.01238250732421875,
-0.050537109375,
0.0452880859375,
-0.0014181137084960938,
0.020599365234375,
-0.0015211105346679688,
-0.031402587890625,
-0.05523681640625,
-0.055023193359375,
0.017974853515625,
0.03125,
-0.030181884765625,
0.065673828125,
0.0083160400390625,
-0.045745849609375,
-0.047332763671875,
0.0034389495849609375,
0.042266845703125,
0.0173187255859375,
0.00806427001953125,
-0.025115966796875,
-0.054412841796875,
-0.0706787109375,
-0.024505615234375,
0.0101318359375,
-0.0032558441162109375,
0.050628662109375,
0.032867431640625,
-0.0146331787109375,
0.039276123046875,
-0.029327392578125,
-0.016876220703125,
-0.01015472412109375,
-0.007564544677734375,
0.033050537109375,
0.059478759765625,
0.07611083984375,
-0.054443359375,
-0.07086181640625,
0.011016845703125,
-0.08197021484375,
-0.006847381591796875,
-0.0015850067138671875,
-0.01934814453125,
0.032073974609375,
0.0187225341796875,
-0.0665283203125,
0.0589599609375,
0.0298004150390625,
-0.06549072265625,
0.033447265625,
-0.028228759765625,
0.040924072265625,
-0.081298828125,
0.0207672119140625,
0.0204620361328125,
-0.019256591796875,
-0.04351806640625,
0.0035457611083984375,
-0.0064697265625,
0.01042938232421875,
-0.041259765625,
0.060089111328125,
-0.053192138671875,
-0.0016889572143554688,
0.01100921630859375,
0.0062103271484375,
-0.002185821533203125,
0.033599853515625,
-0.0017938613891601562,
0.044097900390625,
0.06585693359375,
-0.01384735107421875,
0.024444580078125,
0.0325927734375,
0.001537322998046875,
0.057952880859375,
-0.0465087890625,
0.007564544677734375,
0.00251007080078125,
0.0325927734375,
-0.0753173828125,
-0.028900146484375,
0.039886474609375,
-0.0618896484375,
0.048614501953125,
-0.0199737548828125,
-0.020355224609375,
-0.06207275390625,
-0.063720703125,
0.0203857421875,
0.04931640625,
-0.043487548828125,
0.0266571044921875,
0.015716552734375,
-0.0036563873291015625,
-0.036346435546875,
-0.0491943359375,
0.0045013427734375,
-0.031524658203125,
-0.0615234375,
0.03472900390625,
0.0244293212890625,
-0.01338958740234375,
0.00849151611328125,
-0.01227569580078125,
-0.00988006591796875,
-0.0167083740234375,
0.044219970703125,
0.0267333984375,
-0.022247314453125,
-0.0309295654296875,
-0.02899169921875,
-0.0223846435546875,
-0.00449371337890625,
-0.008056640625,
0.03875732421875,
-0.031829833984375,
0.007137298583984375,
-0.10797119140625,
0.008636474609375,
0.06732177734375,
-0.00015687942504882812,
0.072998046875,
0.058135986328125,
-0.034912109375,
0.01393890380859375,
-0.03399658203125,
-0.015411376953125,
-0.038787841796875,
-0.0167388916015625,
-0.053436279296875,
-0.04412841796875,
0.06878662109375,
0.004810333251953125,
-0.0105438232421875,
0.05877685546875,
0.01238250732421875,
-0.0175018310546875,
0.0638427734375,
0.0350341796875,
-0.0028533935546875,
0.041595458984375,
-0.0628662109375,
0.0068359375,
-0.06060791015625,
-0.055908203125,
-0.0184326171875,
-0.042816162109375,
-0.04449462890625,
-0.0264739990234375,
0.0188751220703125,
0.02886962890625,
-0.0193328857421875,
0.045196533203125,
-0.042572021484375,
0.0027923583984375,
0.0237884521484375,
0.03961181640625,
-0.0148162841796875,
-0.00911712646484375,
-0.0093841552734375,
-0.0255584716796875,
-0.038055419921875,
-0.0274200439453125,
0.059051513671875,
0.046295166015625,
0.03350830078125,
0.0078582763671875,
0.041229248046875,
0.003963470458984375,
0.0136566162109375,
-0.021728515625,
0.051116943359375,
0.005786895751953125,
-0.032318115234375,
-0.025238037109375,
-0.0298614501953125,
-0.0819091796875,
0.01495361328125,
-0.0345458984375,
-0.064453125,
-0.01154327392578125,
-0.004261016845703125,
-0.0276947021484375,
0.056365966796875,
-0.045867919921875,
0.049530029296875,
-0.00371551513671875,
-0.041717529296875,
-0.0041351318359375,
-0.05859375,
0.004627227783203125,
0.02972412109375,
0.003173828125,
0.00028228759765625,
-0.003734588623046875,
0.057342529296875,
-0.061859130859375,
0.0426025390625,
-0.0263824462890625,
0.00922393798828125,
0.0300445556640625,
-0.002269744873046875,
0.0305328369140625,
-0.0005278587341308594,
-0.01183319091796875,
-0.00798797607421875,
0.0099639892578125,
-0.061248779296875,
-0.0249481201171875,
0.0496826171875,
-0.056060791015625,
-0.029632568359375,
-0.0501708984375,
-0.0202178955078125,
0.0070953369140625,
0.0016345977783203125,
0.036346435546875,
0.048370361328125,
-0.0034542083740234375,
0.017578125,
0.041015625,
-0.03173828125,
0.039306640625,
-0.010498046875,
0.0009093284606933594,
-0.042633056640625,
0.05364990234375,
0.0024585723876953125,
-0.0001876354217529297,
-0.0013990402221679688,
0.00046181678771972656,
-0.032867431640625,
-0.01702880859375,
-0.022857666015625,
0.05548095703125,
-0.01132965087890625,
-0.0231170654296875,
-0.04718017578125,
-0.025634765625,
-0.043609619140625,
-0.031402587890625,
-0.032501220703125,
-0.029083251953125,
-0.0225982666015625,
0.0019159317016601562,
0.05230712890625,
0.06414794921875,
-0.027191162109375,
0.029296875,
-0.03997802734375,
0.0221710205078125,
0.0075836181640625,
0.04248046875,
-0.0263519287109375,
-0.0517578125,
0.0031032562255859375,
-0.0037326812744140625,
-0.008544921875,
-0.0626220703125,
0.051544189453125,
-0.0008912086486816406,
0.02899169921875,
0.03094482421875,
-0.0167999267578125,
0.055511474609375,
-0.0008234977722167969,
0.034820556640625,
0.045135498046875,
-0.055267333984375,
0.0258941650390625,
-0.032135009765625,
0.0013341903686523438,
0.019989013671875,
0.0149993896484375,
-0.0289459228515625,
-0.0244903564453125,
-0.0682373046875,
-0.0312347412109375,
0.055755615234375,
0.01018524169921875,
-0.002620697021484375,
0.0003879070281982422,
0.053131103515625,
-0.00542449951171875,
0.0049591064453125,
-0.041595458984375,
-0.06903076171875,
-0.00963592529296875,
-0.01148223876953125,
0.004718780517578125,
-0.003635406494140625,
0.00399017333984375,
-0.05157470703125,
0.049652099609375,
0.0043792724609375,
0.039276123046875,
0.01377105712890625,
0.002716064453125,
0.004299163818359375,
-0.0216064453125,
0.047515869140625,
0.027099609375,
-0.0142974853515625,
-0.01031494140625,
0.0274810791015625,
-0.039306640625,
0.0083770751953125,
0.0164794921875,
0.0012874603271484375,
0.007556915283203125,
0.00611114501953125,
0.039947509765625,
0.0258331298828125,
-0.0056304931640625,
0.038909912109375,
-0.018096923828125,
-0.04156494140625,
-0.016082763671875,
-0.0157623291015625,
0.0188751220703125,
0.03094482421875,
0.0240325927734375,
0.004329681396484375,
-0.030670166015625,
-0.0287017822265625,
0.03887939453125,
0.05548095703125,
-0.030303955078125,
-0.031005859375,
0.046173095703125,
-0.001338958740234375,
-0.0145416259765625,
0.0297088623046875,
-0.00800323486328125,
-0.05047607421875,
0.07611083984375,
0.025146484375,
0.0465087890625,
-0.03955078125,
0.00641632080078125,
0.06658935546875,
-0.0023975372314453125,
0.01556396484375,
0.0251617431640625,
0.03314208984375,
-0.0249786376953125,
-0.0058135986328125,
-0.041259765625,
0.0140380859375,
0.037689208984375,
-0.0305938720703125,
0.0233917236328125,
-0.05255126953125,
-0.025115966796875,
0.0066070556640625,
0.03546142578125,
-0.046630859375,
0.0263519287109375,
-0.0002961158752441406,
0.080322265625,
-0.062469482421875,
0.0654296875,
0.06646728515625,
-0.0408935546875,
-0.06329345703125,
-0.002407073974609375,
0.006984710693359375,
-0.06500244140625,
0.03338623046875,
0.006595611572265625,
0.003246307373046875,
-0.0008416175842285156,
-0.03680419921875,
-0.050994873046875,
0.10302734375,
0.029937744140625,
-0.0017871856689453125,
0.0189971923828125,
-0.031494140625,
0.028106689453125,
-0.01361846923828125,
0.044769287109375,
0.029327392578125,
0.0390625,
0.01287841796875,
-0.06689453125,
0.026611328125,
-0.031951904296875,
-0.00899505615234375,
0.0242156982421875,
-0.09735107421875,
0.06854248046875,
-0.0182037353515625,
-0.002685546875,
0.019317626953125,
0.0484619140625,
0.0242156982421875,
-0.00235748291015625,
0.018646240234375,
0.0694580078125,
0.03509521484375,
-0.02099609375,
0.0772705078125,
-0.016937255859375,
0.042083740234375,
0.0157470703125,
0.041717529296875,
0.025909423828125,
0.0295867919921875,
-0.04412841796875,
0.0212249755859375,
0.06024169921875,
-0.0037021636962890625,
0.0099639892578125,
0.0205230712890625,
-0.031829833984375,
-0.016632080078125,
-0.01480865478515625,
-0.051971435546875,
0.0171051025390625,
0.00986480712890625,
-0.0117340087890625,
-0.01169586181640625,
-0.0026760101318359375,
0.0174560546875,
0.022247314453125,
-0.0197601318359375,
0.038665771484375,
0.004840850830078125,
-0.029998779296875,
0.033599853515625,
-0.0029048919677734375,
0.07891845703125,
-0.026885986328125,
0.01157379150390625,
-0.025787353515625,
0.024017333984375,
-0.017181396484375,
-0.082275390625,
0.0240478515625,
-0.00727081298828125,
0.006526947021484375,
-0.018096923828125,
0.04827880859375,
-0.0266265869140625,
-0.026458740234375,
0.028106689453125,
0.0302581787109375,
0.03790283203125,
0.0205841064453125,
-0.0828857421875,
0.0195770263671875,
0.007007598876953125,
-0.045867919921875,
0.033233642578125,
0.037689208984375,
0.0276947021484375,
0.056884765625,
0.023406982421875,
0.02301025390625,
0.0157928466796875,
-0.0292205810546875,
0.055389404296875,
-0.04876708984375,
-0.033966064453125,
-0.06103515625,
0.04083251953125,
-0.0305328369140625,
-0.040008544921875,
0.05517578125,
0.043243408203125,
0.0276947021484375,
0.0007147789001464844,
0.05023193359375,
-0.040985107421875,
0.03729248046875,
-0.019866943359375,
0.05645751953125,
-0.048583984375,
-0.01953125,
-0.016998291015625,
-0.04632568359375,
-0.03045654296875,
0.064208984375,
-0.00732421875,
0.0181732177734375,
0.0229339599609375,
0.05023193359375,
0.00568389892578125,
-0.00968170166015625,
0.0028018951416015625,
0.01416778564453125,
-0.0103607177734375,
0.067138671875,
0.0377197265625,
-0.058349609375,
0.00656890869140625,
-0.035430908203125,
-0.0200042724609375,
-0.0267333984375,
-0.05450439453125,
-0.08612060546875,
-0.05126953125,
-0.042388916015625,
-0.051116943359375,
-0.0211029052734375,
0.08978271484375,
0.059112548828125,
-0.045074462890625,
-0.01092529296875,
0.010040283203125,
0.008087158203125,
-0.01206207275390625,
-0.0159759521484375,
0.041259765625,
0.00862884521484375,
-0.07672119140625,
-0.03338623046875,
0.0087738037109375,
0.04742431640625,
0.0291748046875,
-0.036285400390625,
-0.01617431640625,
-0.00616455078125,
0.026763916015625,
0.0634765625,
-0.06060791015625,
-0.02093505859375,
0.0007543563842773438,
-0.036041259765625,
0.01071929931640625,
0.02227783203125,
-0.037261962890625,
-0.00832366943359375,
0.037200927734375,
0.0263824462890625,
0.054962158203125,
0.005191802978515625,
0.011810302734375,
-0.03094482421875,
0.04217529296875,
-0.0025310516357421875,
0.026275634765625,
0.0150909423828125,
-0.0204010009765625,
0.057464599609375,
0.042236328125,
-0.03021240234375,
-0.07696533203125,
-0.0155029296875,
-0.09930419921875,
-0.005100250244140625,
0.049224853515625,
-0.004566192626953125,
-0.030853271484375,
0.0297393798828125,
-0.03509521484375,
0.03790283203125,
-0.01678466796875,
0.01934814453125,
0.0181732177734375,
-0.0269927978515625,
-0.02679443359375,
-0.04132080078125,
0.046417236328125,
0.0251617431640625,
-0.05133056640625,
-0.0283966064453125,
0.0011453628540039062,
0.02349853515625,
0.01462554931640625,
0.0550537109375,
-0.0294036865234375,
0.01165008544921875,
-0.006870269775390625,
0.0195159912109375,
0.0004305839538574219,
0.01303863525390625,
-0.0237884521484375,
-0.00914764404296875,
-0.0189971923828125,
-0.047637939453125
]
] |
sagorsarker/codeswitch-hineng-ner-lince | 2023-03-27T15:21:21.000Z | [
"transformers",
"pytorch",
"jax",
"safetensors",
"bert",
"token-classification",
"codeswitching",
"hindi-english",
"ner",
"hi",
"en",
"multilingual",
"dataset:lince",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | sagorsarker | null | null | sagorsarker/codeswitch-hineng-ner-lince | 1 | 26,568 | transformers | 2022-03-02T23:29:05 | ---
language:
- hi
- en
- multilingual
license: mit
tags:
- codeswitching
- hindi-english
- ner
datasets:
- lince
---
# codeswitch-hineng-ner-lince
This is a pretrained model for **Named Entity Recognition** of `Hindi-English` code-mixed data from the [LinCE](https://ritual.uh.edu/lince/home) benchmark.
This model was trained for the repository below:
[https://github.com/sagorbrur/codeswitch](https://github.com/sagorbrur/codeswitch)
To install codeswitch:
```
pip install codeswitch
```
## Named Entity Recognition of Code-Mixed Data
* **Method-1**
```py
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline
tokenizer = AutoTokenizer.from_pretrained("sagorsarker/codeswitch-hineng-ner-lince")
model = AutoModelForTokenClassification.from_pretrained("sagorsarker/codeswitch-hineng-ner-lince")
ner_model = pipeline('ner', model=model, tokenizer=tokenizer)
ner_model("put any hindi english code-mixed sentence")
```
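The pipeline above tags individual word-piece tokens. On recent `transformers` versions (an assumption; older releases used `grouped_entities=True` instead), an aggregation strategy merges the pieces into whole entity spans:
```py
# assumes a recent transformers release with aggregation_strategy support
ner_grouped = pipeline('ner', model=model, tokenizer=tokenizer,
                       aggregation_strategy="simple")
ner_grouped("put any hindi english code-mixed sentence")
```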
* **Method-2**
```py
from codeswitch.codeswitch import NER
ner = NER('hin-eng')
text = "" # your mixed sentence
result = ner.tag(text)
print(result)
```
| 1,106 | [
[
-0.02813720703125,
-0.0183868408203125,
0.0027637481689453125,
0.0159912109375,
-0.0270233154296875,
0.00691986083984375,
-0.01090240478515625,
-0.0106964111328125,
0.034149169921875,
0.035614013671875,
-0.0389404296875,
-0.02569580078125,
-0.06170654296875,
0.0222625732421875,
-0.042449951171875,
0.08831787109375,
-0.0209197998046875,
0.01474761962890625,
0.02105712890625,
-0.002254486083984375,
-0.0240936279296875,
-0.0295562744140625,
-0.02691650390625,
-0.0249176025390625,
0.01496124267578125,
0.0247650146484375,
0.034149169921875,
0.0316162109375,
0.0262908935546875,
0.0257415771484375,
0.00460052490234375,
-0.00046515464782714844,
-0.008697509765625,
0.005634307861328125,
-0.0157012939453125,
-0.033905029296875,
-0.0114288330078125,
-0.001338958740234375,
0.03436279296875,
0.014312744140625,
-0.008880615234375,
0.03289794921875,
-0.005947113037109375,
0.022064208984375,
-0.054443359375,
0.0156707763671875,
-0.053466796875,
0.00223541259765625,
-0.0155792236328125,
-0.01136016845703125,
-0.0232086181640625,
-0.0289306640625,
0.0010175704956054688,
-0.050994873046875,
0.0120697021484375,
-0.00009137392044067383,
0.10235595703125,
0.027069091796875,
-0.03607177734375,
-0.032684326171875,
-0.031341552734375,
0.05755615234375,
-0.0654296875,
0.03436279296875,
0.031982421875,
0.0101318359375,
0.006687164306640625,
-0.08587646484375,
-0.051177978515625,
-0.00016391277313232422,
-0.0147247314453125,
-0.00936126708984375,
-0.006023406982421875,
-0.01316070556640625,
0.048065185546875,
0.0170745849609375,
-0.04278564453125,
-0.00911712646484375,
-0.036651611328125,
-0.05059814453125,
0.034088134765625,
0.00698089599609375,
0.034271240234375,
-0.0306396484375,
-0.037261962890625,
-0.00853729248046875,
-0.0207061767578125,
-0.0015163421630859375,
0.0294647216796875,
0.00688934326171875,
-0.0245361328125,
0.043670654296875,
-0.0321044921875,
0.053070068359375,
0.003002166748046875,
-0.0121917724609375,
0.052978515625,
-0.018890380859375,
-0.0298919677734375,
-0.0019550323486328125,
0.06524658203125,
-0.0009250640869140625,
0.037017822265625,
0.003108978271484375,
-0.0258331298828125,
0.0227508544921875,
0.0015172958374023438,
-0.05865478515625,
-0.048126220703125,
0.00188446044921875,
-0.040252685546875,
-0.034759521484375,
0.0129852294921875,
-0.059722900390625,
-0.00812530517578125,
-0.0168304443359375,
0.047515869140625,
-0.0247955322265625,
-0.0352783203125,
0.009063720703125,
-0.006076812744140625,
0.035614013671875,
-0.0005421638488769531,
-0.07476806640625,
0.0264892578125,
0.024322509765625,
0.05169677734375,
0.0078887939453125,
-0.045562744140625,
-0.0245819091796875,
-0.017547607421875,
-0.002857208251953125,
0.04473876953125,
-0.00992584228515625,
-0.008697509765625,
-0.0111541748046875,
0.0247650146484375,
-0.0283660888671875,
-0.03192138671875,
0.033660888671875,
-0.0482177734375,
0.0268402099609375,
0.0135955810546875,
-0.0428466796875,
-0.005107879638671875,
0.0145111083984375,
-0.03668212890625,
0.07562255859375,
0.0157318115234375,
-0.0877685546875,
0.0244140625,
-0.053314208984375,
-0.042083740234375,
0.02008056640625,
-0.0227508544921875,
-0.0419921875,
-0.0164947509765625,
0.01351165771484375,
0.026580810546875,
0.0124969482421875,
0.0305633544921875,
0.01148223876953125,
-0.037078857421875,
0.01332855224609375,
-0.027496337890625,
0.08331298828125,
0.03338623046875,
-0.04302978515625,
0.002445220947265625,
-0.06341552734375,
0.0174713134765625,
0.02020263671875,
-0.024444580078125,
-0.01256561279296875,
-0.0120391845703125,
0.028472900390625,
0.0159149169921875,
0.02001953125,
-0.035858154296875,
0.0255889892578125,
-0.031158447265625,
0.03814697265625,
0.037139892578125,
-0.00559234619140625,
0.041351318359375,
-0.00012743473052978516,
0.047088623046875,
0.01132965087890625,
0.007373809814453125,
-0.01085662841796875,
-0.03369140625,
-0.07586669921875,
-0.00760650634765625,
0.04888916015625,
0.046234130859375,
-0.078125,
0.034210205078125,
-0.01351165771484375,
-0.037200927734375,
-0.03582763671875,
-0.0074920654296875,
0.0251007080078125,
0.0183258056640625,
0.028778076171875,
-0.02685546875,
-0.070556640625,
-0.062042236328125,
-0.029022216796875,
-0.0004811286926269531,
0.0002474784851074219,
-0.004608154296875,
0.07244873046875,
-0.053497314453125,
0.05963134765625,
-0.051025390625,
-0.005809783935546875,
-0.0040130615234375,
0.024871826171875,
0.0294647216796875,
0.043731689453125,
0.0190277099609375,
-0.05242919921875,
-0.039337158203125,
-0.022796630859375,
-0.034210205078125,
-0.00022518634796142578,
-0.005126953125,
0.001659393310546875,
0.01119232177734375,
0.0300445556640625,
-0.034637451171875,
0.03826904296875,
0.0479736328125,
-0.033660888671875,
0.07080078125,
0.0013170242309570312,
0.01488494873046875,
-0.10009765625,
0.003871917724609375,
-0.0261077880859375,
-0.01126861572265625,
-0.02734375,
0.015350341796875,
0.0408935546875,
-0.0177459716796875,
-0.038055419921875,
0.052276611328125,
-0.04766845703125,
0.00783538818359375,
-0.007762908935546875,
-0.022369384765625,
-0.0029811859130859375,
0.038543701171875,
0.005176544189453125,
0.06597900390625,
0.06884765625,
-0.054046630859375,
0.040863037109375,
0.032196044921875,
-0.037994384765625,
0.004276275634765625,
-0.033447265625,
0.0069427490234375,
0.0202789306640625,
-0.0004246234893798828,
-0.047149658203125,
-0.0293426513671875,
0.035858154296875,
-0.0460205078125,
0.01459503173828125,
-0.0268402099609375,
-0.036224365234375,
-0.030120849609375,
0.006999969482421875,
0.049407958984375,
0.0474853515625,
-0.053192138671875,
0.044036865234375,
0.01922607421875,
-0.0003504753112792969,
-0.0584716796875,
-0.0654296875,
0.0024356842041015625,
-0.0016040802001953125,
-0.021331787109375,
0.00875091552734375,
-0.0015869140625,
0.016632080078125,
-0.0098419189453125,
-0.016754150390625,
-0.033111572265625,
-0.008544921875,
0.01152801513671875,
0.019134521484375,
-0.0179290771484375,
0.0030498504638671875,
-0.0106201171875,
-0.0323486328125,
0.007038116455078125,
-0.023712158203125,
0.057403564453125,
-0.015350341796875,
-0.018829345703125,
-0.0221710205078125,
-0.006397247314453125,
0.027130126953125,
-0.037109375,
0.032745361328125,
0.053314208984375,
-0.02642822265625,
-0.022918701171875,
-0.029815673828125,
-0.01093292236328125,
-0.03607177734375,
0.0408935546875,
-0.030059814453125,
-0.0419921875,
0.036773681640625,
-0.006793975830078125,
-0.0225982666015625,
0.038360595703125,
0.026336669921875,
0.00678253173828125,
0.05645751953125,
0.032470703125,
-0.0155029296875,
0.03326416015625,
-0.026153564453125,
0.04254150390625,
-0.046112060546875,
-0.01036834716796875,
-0.046173095703125,
0.006580352783203125,
-0.051788330078125,
-0.005214691162109375,
-0.006732940673828125,
-0.00022077560424804688,
-0.0221099853515625,
0.064697265625,
-0.06170654296875,
0.04150390625,
0.03143310546875,
-0.0019855499267578125,
0.01087188720703125,
0.0019550323486328125,
-0.0246429443359375,
-0.00002181529998779297,
-0.037750244140625,
-0.0298614501953125,
0.07373046875,
0.029815673828125,
0.06927490234375,
-0.01227569580078125,
0.07232666015625,
-0.005695343017578125,
-0.006954193115234375,
-0.048614501953125,
0.03375244140625,
0.0045318603515625,
-0.0574951171875,
-0.00164794921875,
-0.033660888671875,
-0.0716552734375,
0.0007967948913574219,
-0.01197052001953125,
-0.04803466796875,
0.0130462646484375,
0.0009722709655761719,
-0.01708984375,
0.031341552734375,
-0.06243896484375,
0.06878662109375,
-0.0251007080078125,
0.011993408203125,
0.00981903076171875,
-0.039947509765625,
0.03265380859375,
-0.00572967529296875,
0.0269317626953125,
0.0103607177734375,
0.01415252685546875,
0.08416748046875,
-0.024658203125,
0.04302978515625,
-0.00827789306640625,
0.005207061767578125,
0.0038166046142578125,
0.00823974609375,
0.0245513916015625,
0.0205535888671875,
0.00664520263671875,
0.0262603759765625,
0.0157623291015625,
-0.036865234375,
-0.0206451416015625,
0.055877685546875,
-0.05987548828125,
-0.0038623809814453125,
-0.060333251953125,
-0.02490234375,
0.0153961181640625,
0.0391845703125,
0.059722900390625,
0.04168701171875,
0.01477813720703125,
-0.0043792724609375,
0.04425048828125,
-0.01456451416015625,
0.037567138671875,
0.031982421875,
-0.0294952392578125,
-0.034637451171875,
0.0545654296875,
-0.0006799697875976562,
0.005985260009765625,
0.007686614990234375,
-0.00917816162109375,
-0.01288604736328125,
-0.019439697265625,
-0.0289306640625,
0.003936767578125,
-0.059539794921875,
-0.029327392578125,
-0.0513916015625,
-0.058990478515625,
-0.0303802490234375,
0.00823974609375,
-0.02923583984375,
-0.0129547119140625,
-0.017852783203125,
0.004974365234375,
0.0279388427734375,
0.044219970703125,
0.0017366409301757812,
0.03302001953125,
-0.05755615234375,
0.042144775390625,
0.02032470703125,
0.048797607421875,
-0.0113983154296875,
-0.045074462890625,
-0.03814697265625,
0.00267791748046875,
-0.0107879638671875,
-0.0863037109375,
0.03680419921875,
-0.01165771484375,
0.043060302734375,
0.0102386474609375,
-0.012176513671875,
0.0296630859375,
-0.0276336669921875,
0.0537109375,
0.0152435302734375,
-0.08245849609375,
0.0445556640625,
0.00165557861328125,
0.03753662109375,
0.050811767578125,
0.0281524658203125,
-0.04290771484375,
-0.005634307861328125,
-0.0576171875,
-0.07763671875,
0.051788330078125,
0.031707763671875,
0.0181884765625,
0.0004887580871582031,
0.01477813720703125,
0.0067138671875,
0.01245880126953125,
-0.0966796875,
-0.036651611328125,
-0.02191162109375,
-0.0201873779296875,
0.00986480712890625,
-0.012298583984375,
0.0114288330078125,
-0.03594970703125,
0.06927490234375,
0.00762176513671875,
0.02923583984375,
0.0116729736328125,
-0.03863525390625,
0.0004172325134277344,
0.030670166015625,
0.029266357421875,
0.0270233154296875,
-0.0093841552734375,
0.012725830078125,
0.0159149169921875,
-0.05755615234375,
0.0112152099609375,
0.00887298583984375,
-0.018829345703125,
0.024200439453125,
0.03857421875,
0.067626953125,
0.0223846435546875,
-0.017425537109375,
0.043914794921875,
-0.024688720703125,
-0.0215606689453125,
-0.03814697265625,
0.005565643310546875,
0.005680084228515625,
0.008026123046875,
0.0233154296875,
0.042205810546875,
0.0093536376953125,
-0.0141143798828125,
0.0222320556640625,
0.0036754608154296875,
-0.015960693359375,
0.00921630859375,
0.03826904296875,
0.01116943359375,
-0.0426025390625,
0.057373046875,
-0.03753662109375,
-0.06744384765625,
0.06494140625,
0.035797119140625,
0.0772705078125,
-0.0089111328125,
0.00664520263671875,
0.056427001953125,
0.021636962890625,
-0.01035308837890625,
0.033355712890625,
-0.0206146240234375,
-0.0654296875,
-0.01788330078125,
-0.060333251953125,
-0.00566864013671875,
0.0038433074951171875,
-0.052978515625,
0.04766845703125,
-0.01052093505859375,
-0.0002065896987915039,
0.0017986297607421875,
0.005825042724609375,
-0.07159423828125,
0.0243072509765625,
0.004146575927734375,
0.07025146484375,
-0.0794677734375,
0.092041015625,
0.050201416015625,
-0.04962158203125,
-0.0980224609375,
0.0189056396484375,
-0.0250244140625,
-0.057861328125,
0.0594482421875,
0.0150604248046875,
0.0079803466796875,
0.028900146484375,
-0.043853759765625,
-0.0804443359375,
0.066650390625,
-0.0065155029296875,
-0.0521240234375,
-0.0009098052978515625,
-0.0204315185546875,
0.0230255126953125,
-0.019561767578125,
0.05029296875,
0.04052734375,
0.03289794921875,
-0.0153961181640625,
-0.08526611328125,
-0.00794219970703125,
-0.040863037109375,
-0.006099700927734375,
0.02203369140625,
-0.01244354248046875,
0.069580078125,
-0.0164794921875,
-0.01242828369140625,
0.0089263916015625,
0.02325439453125,
0.03192138671875,
0.01202392578125,
0.0438232421875,
0.033843994140625,
0.03131103515625,
-0.0142974853515625,
0.048187255859375,
-0.031463623046875,
0.0703125,
0.0771484375,
-0.00894927978515625,
0.052734375,
0.036163330078125,
-0.018829345703125,
0.06988525390625,
0.04620361328125,
-0.037689208984375,
0.050933837890625,
0.0205078125,
0.006969451904296875,
0.01142120361328125,
0.033111572265625,
-0.044921875,
0.0298309326171875,
0.00323486328125,
-0.0245513916015625,
-0.0138702392578125,
0.006252288818359375,
-0.0005087852478027344,
-0.01195526123046875,
-0.00832366943359375,
0.047698974609375,
-0.0031585693359375,
-0.0474853515625,
0.058990478515625,
0.0015840530395507812,
0.089111328125,
-0.06024169921875,
-0.0038814544677734375,
0.0015020370483398438,
0.01165771484375,
-0.03564453125,
-0.042938232421875,
0.0164031982421875,
0.0010728836059570312,
-0.0089569091796875,
-0.0026454925537109375,
0.066650390625,
-0.049041748046875,
-0.031585693359375,
0.045013427734375,
0.01093292236328125,
0.0190887451171875,
0.01189422607421875,
-0.061065673828125,
0.0018711090087890625,
0.0165863037109375,
-0.0233917236328125,
0.006374359130859375,
0.00812530517578125,
0.01654052734375,
0.034271240234375,
0.03643798828125,
0.01537322998046875,
0.023956298828125,
0.016204833984375,
0.054046630859375,
-0.04510498046875,
-0.049713134765625,
-0.057464599609375,
0.045745849609375,
-0.0053558349609375,
-0.04193115234375,
0.052520751953125,
0.05340576171875,
0.082275390625,
-0.04071044921875,
0.053619384765625,
-0.02362060546875,
0.02734375,
-0.040008544921875,
0.0714111328125,
-0.03387451171875,
0.0128326416015625,
-0.02081298828125,
-0.06707763671875,
-0.0030384063720703125,
0.061798095703125,
-0.019195556640625,
0.0135345458984375,
0.0714111328125,
0.07806396484375,
-0.037811279296875,
-0.005176544189453125,
0.01152801513671875,
0.0172882080078125,
-0.0111846923828125,
0.03289794921875,
0.05889892578125,
-0.0709228515625,
0.0341796875,
-0.03765869140625,
-0.00785064697265625,
-0.0032749176025390625,
-0.0489501953125,
-0.061920166015625,
-0.0491943359375,
-0.0286865234375,
-0.040435791015625,
-0.01041412353515625,
0.07275390625,
0.0638427734375,
-0.0850830078125,
-0.0183868408203125,
-0.01763916015625,
-0.00011336803436279297,
0.0027523040771484375,
-0.0230712890625,
0.021240234375,
-0.030029296875,
-0.03997802734375,
0.0098114013671875,
-0.005870819091796875,
-0.0020771026611328125,
-0.005092620849609375,
-0.0161285400390625,
-0.0361328125,
-0.005001068115234375,
0.0031223297119140625,
0.018402099609375,
-0.05218505859375,
-0.00478363037109375,
-0.0072021484375,
-0.0350341796875,
-0.0016117095947265625,
0.0491943359375,
-0.05035400390625,
0.0257720947265625,
0.0465087890625,
0.02960205078125,
0.04583740234375,
-0.03192138671875,
0.02593994140625,
-0.05865478515625,
0.0250091552734375,
0.01177978515625,
0.05523681640625,
0.01306915283203125,
-0.0286865234375,
0.037384033203125,
0.0231781005859375,
-0.05059814453125,
-0.0523681640625,
0.019195556640625,
-0.0443115234375,
-0.0222625732421875,
0.08203125,
-0.0061492919921875,
-0.016448974609375,
-0.0184326171875,
-0.03131103515625,
0.04541015625,
-0.009063720703125,
0.039276123046875,
0.0421142578125,
0.0220184326171875,
-0.0174407958984375,
-0.0256195068359375,
0.0347900390625,
0.0302734375,
-0.05316162109375,
-0.0240325927734375,
-0.0037899017333984375,
0.05755615234375,
0.0197906494140625,
0.0537109375,
-0.003948211669921875,
0.016632080078125,
0.0081024169921875,
0.05194091796875,
-0.0034122467041015625,
-0.0160675048828125,
-0.04150390625,
-0.0159454345703125,
-0.01019287109375,
-0.031890869140625
]
] |
guillaumekln/faster-whisper-medium | 2023-05-12T18:58:10.000Z | [
"ctranslate2",
"audio",
"automatic-speech-recognition",
"en",
"zh",
"de",
"es",
"ru",
"ko",
"fr",
"ja",
"pt",
"tr",
"pl",
"ca",
"nl",
"ar",
"sv",
"it",
"id",
"hi",
"fi",
"vi",
"he",
"uk",
"el",
"ms",
"cs",
"ro",
"da",
"hu",
"ta",
"no",
"th",
"ur",
"hr",
"bg",
"lt",
"la",
"mi",
"ml",
"cy",
"sk",
"te",
"fa",
"lv",
"bn",
"sr",
"az",
"sl",
"kn",
"et",
"mk",
"br",
"eu",
"is",
"hy",
"ne",
"mn",
"bs",
"kk",
"sq",
"sw",
"gl",
"mr",
"pa",
"si",
"km",
"sn",
"yo",
"so",
"af",
"oc",
"ka",
"be",
"tg",
"sd",
"gu",
"am",
"yi",
"lo",
"uz",
"fo",
"ht",
"ps",
"tk",
"nn",
"mt",
"sa",
"lb",
"my",
"bo",
"tl",
"mg",
"as",
"tt",
"haw",
"ln",
"ha",
"ba",
"jw",
"su",
"license:mit",
"region:us"
] | automatic-speech-recognition | guillaumekln | null | null | guillaumekln/faster-whisper-medium | 22 | 26,516 | ctranslate2 | 2023-03-23T10:28:55 | ---
language:
- en
- zh
- de
- es
- ru
- ko
- fr
- ja
- pt
- tr
- pl
- ca
- nl
- ar
- sv
- it
- id
- hi
- fi
- vi
- he
- uk
- el
- ms
- cs
- ro
- da
- hu
- ta
- 'no'
- th
- ur
- hr
- bg
- lt
- la
- mi
- ml
- cy
- sk
- te
- fa
- lv
- bn
- sr
- az
- sl
- kn
- et
- mk
- br
- eu
- is
- hy
- ne
- mn
- bs
- kk
- sq
- sw
- gl
- mr
- pa
- si
- km
- sn
- yo
- so
- af
- oc
- ka
- be
- tg
- sd
- gu
- am
- yi
- lo
- uz
- fo
- ht
- ps
- tk
- nn
- mt
- sa
- lb
- my
- bo
- tl
- mg
- as
- tt
- haw
- ln
- ha
- ba
- jw
- su
tags:
- audio
- automatic-speech-recognition
license: mit
library_name: ctranslate2
---
# Whisper medium model for CTranslate2
This repository contains the conversion of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format.
This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper).
## Example
```python
from faster_whisper import WhisperModel
# Load the converted model (downloaded from the Hub on first use)
model = WhisperModel("medium")

# transcribe() returns a generator of segments plus transcription info
segments, info = model.transcribe("audio.mp3")
for segment in segments:
    print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```
## Conversion details
The original model was converted with the following command:
```
ct2-transformers-converter --model openai/whisper-medium --output_dir faster-whisper-medium \
--copy_files tokenizer.json --quantization float16
```
Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html).
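For example, a minimal sketch of loading the model with a different computation type (the available types depend on the target device; `int8` support is assumed here):
```python
from faster_whisper import WhisperModel

# Run on CPU with 8-bit weights instead of the saved FP16
# (a sketch; pick a compute type your hardware supports)
model = WhisperModel("medium", device="cpu", compute_type="int8")
```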
## More information
**For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-medium).**
| 2,010 | [
[
0.00681304931640625,
-0.03399658203125,
0.017822265625,
0.03814697265625,
-0.03619384765625,
-0.0253143310546875,
-0.04205322265625,
-0.0262603759765625,
0.007350921630859375,
0.056396484375,
-0.031494140625,
-0.040435791015625,
-0.04290771484375,
-0.0277862548828125,
-0.0290679931640625,
0.07421875,
-0.01824951171875,
0.0239715576171875,
0.035675048828125,
-0.0103302001953125,
-0.0265960693359375,
-0.01015472412109375,
-0.060089111328125,
-0.026275634765625,
0.01410675048828125,
0.018096923828125,
0.0494384765625,
0.027984619140625,
0.0377197265625,
0.01995849609375,
-0.0224456787109375,
-0.005218505859375,
-0.0270843505859375,
-0.0078582763671875,
0.0213470458984375,
-0.048126220703125,
-0.045867919921875,
0.0153961181640625,
0.056396484375,
0.01496124267578125,
-0.0305938720703125,
0.0374755859375,
-0.017730712890625,
0.0235137939453125,
-0.03759765625,
0.01605224609375,
-0.041046142578125,
0.00296783447265625,
-0.01262664794921875,
-0.027496337890625,
-0.04248046875,
-0.0255279541015625,
0.034454345703125,
-0.064453125,
0.0117950439453125,
0.0021610260009765625,
0.06707763671875,
0.0174560546875,
-0.036102294921875,
-0.0214996337890625,
-0.079833984375,
0.06817626953125,
-0.057647705078125,
0.01947021484375,
0.0186767578125,
0.048309326171875,
0.016937255859375,
-0.0841064453125,
-0.0198974609375,
-0.0113677978515625,
0.01318359375,
0.0264739990234375,
-0.0287017822265625,
0.021759033203125,
0.0137481689453125,
0.03387451171875,
-0.0460205078125,
-0.01080322265625,
-0.05767822265625,
-0.03271484375,
0.03399658203125,
0.0101318359375,
0.016998291015625,
-0.01389312744140625,
-0.024627685546875,
-0.045013427734375,
-0.038665771484375,
-0.002227783203125,
0.038818359375,
0.00931549072265625,
-0.048828125,
0.0478515625,
0.005218505859375,
0.0287017822265625,
0.0006899833679199219,
-0.028656005859375,
0.0252532958984375,
-0.02093505859375,
-0.010284423828125,
0.0301666259765625,
0.05181884765625,
0.045562744140625,
0.0017490386962890625,
0.0183563232421875,
-0.032135009765625,
-0.003963470458984375,
0.00615692138671875,
-0.0858154296875,
-0.030487060546875,
0.0140228271484375,
-0.07525634765625,
-0.0194091796875,
0.00027561187744140625,
-0.019256591796875,
0.0058441162109375,
-0.01546478271484375,
0.049652099609375,
-0.032867431640625,
-0.043182373046875,
0.0311737060546875,
-0.0416259765625,
0.0108795166015625,
0.032012939453125,
-0.05303955078125,
0.03497314453125,
0.0297698974609375,
0.08209228515625,
0.01062774658203125,
-0.0027866363525390625,
-0.0136260986328125,
0.024993896484375,
-0.01464080810546875,
0.042510986328125,
-0.0004968643188476562,
-0.036956787109375,
-0.0033721923828125,
-0.004772186279296875,
-0.005283355712890625,
-0.043701171875,
0.05987548828125,
-0.019134521484375,
0.03143310546875,
0.02001953125,
-0.017578125,
-0.019317626953125,
0.0010204315185546875,
-0.053375244140625,
0.07330322265625,
0.036407470703125,
-0.056365966796875,
-0.0088348388671875,
-0.062255859375,
-0.01055145263671875,
-0.0112457275390625,
0.033416748046875,
-0.031646728515625,
0.018402099609375,
-0.01056671142578125,
0.005825042724609375,
-0.04150390625,
0.02606201171875,
-0.01393890380859375,
-0.037445068359375,
0.0292816162109375,
-0.041412353515625,
0.06884765625,
0.03192138671875,
0.0017910003662109375,
0.0263519287109375,
-0.044342041015625,
-0.0003986358642578125,
0.00453948974609375,
-0.031768798828125,
-0.01496124267578125,
-0.01399993896484375,
0.05499267578125,
-0.0006241798400878906,
0.0220794677734375,
-0.042266845703125,
0.01404571533203125,
-0.031524658203125,
0.06707763671875,
0.03387451171875,
-0.0046234130859375,
0.032135009765625,
-0.034912109375,
0.0070648193359375,
0.014404296875,
0.033660888671875,
0.0162353515625,
-0.034759521484375,
-0.055084228515625,
-0.007114410400390625,
0.034393310546875,
0.0304412841796875,
-0.0443115234375,
0.0162353515625,
-0.0184326171875,
-0.07354736328125,
-0.06939697265625,
-0.02978515625,
0.0180511474609375,
0.0272979736328125,
0.042083740234375,
-0.0006728172302246094,
-0.052581787109375,
-0.06591796875,
-0.003570556640625,
-0.0252838134765625,
-0.0182037353515625,
0.0251617431640625,
0.033447265625,
-0.01499176025390625,
0.038116455078125,
-0.051513671875,
-0.034759521484375,
-0.019256591796875,
0.0289459228515625,
0.023590087890625,
0.06060791015625,
0.05023193359375,
-0.06097412109375,
-0.0173187255859375,
-0.019073486328125,
-0.0236053466796875,
-0.0021152496337890625,
-0.0095672607421875,
-0.0052337646484375,
-0.000823974609375,
0.009185791015625,
-0.052734375,
0.0290679931640625,
0.060028076171875,
-0.029083251953125,
0.046173095703125,
-0.00682830810546875,
0.0037021636962890625,
-0.0946044921875,
0.01369476318359375,
0.00644683837890625,
-0.008087158203125,
-0.043975830078125,
0.0008831024169921875,
0.01555633544921875,
0.0035572052001953125,
-0.05596923828125,
0.048309326171875,
-0.00951385498046875,
-0.0014104843139648438,
-0.012725830078125,
-0.0179290771484375,
-0.006923675537109375,
0.0174102783203125,
0.0291900634765625,
0.062744140625,
0.0295257568359375,
-0.0284576416015625,
0.020416259765625,
0.03436279296875,
-0.017242431640625,
0.0098724365234375,
-0.08367919921875,
0.005985260009765625,
0.0185394287109375,
0.039794921875,
-0.0457763671875,
-0.0038509368896484375,
0.02093505859375,
-0.0462646484375,
0.01375579833984375,
-0.042755126953125,
-0.0452880859375,
-0.0165252685546875,
-0.0338134765625,
0.03759765625,
0.051666259765625,
-0.027252197265625,
0.0537109375,
0.0156402587890625,
0.0138092041015625,
0.00040531158447265625,
-0.07672119140625,
-0.00441741943359375,
-0.019622802734375,
-0.059295654296875,
0.05438232421875,
-0.0212860107421875,
-0.00954437255859375,
-0.00664520263671875,
-0.0007319450378417969,
-0.0273895263671875,
-0.00989532470703125,
0.02880859375,
0.03179931640625,
-0.034881591796875,
-0.017730712890625,
0.035064697265625,
-0.0255889892578125,
0.00940704345703125,
-0.035247802734375,
0.052337646484375,
-0.00864410400390625,
0.00787353515625,
-0.055084228515625,
0.00421905517578125,
0.038818359375,
-0.0172271728515625,
0.040863037109375,
0.0679931640625,
-0.0289459228515625,
-0.0097503662109375,
-0.038665771484375,
-0.0246429443359375,
-0.03558349609375,
0.0153350830078125,
-0.0248565673828125,
-0.052520751953125,
0.034759521484375,
0.0032482147216796875,
-0.001773834228515625,
0.05712890625,
0.0399169921875,
0.004627227783203125,
0.0841064453125,
0.0295867919921875,
0.0193328857421875,
0.0362548828125,
-0.055328369140625,
-0.0171051025390625,
-0.090576171875,
-0.0217437744140625,
-0.050811767578125,
-0.01873779296875,
-0.026031494140625,
-0.0205535888671875,
0.04010009765625,
0.0165557861328125,
-0.0283050537109375,
0.04718017578125,
-0.05450439453125,
-0.00226593017578125,
0.036712646484375,
0.0247344970703125,
0.0182037353515625,
-0.0016069412231445312,
0.0172271728515625,
-0.009429931640625,
-0.0304107666015625,
-0.031463623046875,
0.090087890625,
0.045928955078125,
0.05755615234375,
0.0250701904296875,
0.042266845703125,
0.0028858184814453125,
0.0133819580078125,
-0.059173583984375,
0.0186004638671875,
-0.018463134765625,
-0.048126220703125,
-0.004138946533203125,
-0.0263519287109375,
-0.047210693359375,
-0.0006899833679199219,
-0.009307861328125,
-0.0352783203125,
-0.0010700225830078125,
-0.0003254413604736328,
-0.007480621337890625,
0.02520751953125,
-0.042236328125,
0.054840087890625,
0.01250457763671875,
0.0213775634765625,
-0.0216827392578125,
-0.032318115234375,
0.037017822265625,
0.01241302490234375,
-0.0236053466796875,
0.003261566162109375,
-0.0118255615234375,
0.08502197265625,
-0.0440673828125,
0.05316162109375,
-0.0307464599609375,
-0.00786590576171875,
0.0511474609375,
0.013641357421875,
0.027191162109375,
0.0173797607421875,
-0.01192474365234375,
0.03375244140625,
0.0310211181640625,
-0.005344390869140625,
-0.0299224853515625,
0.03582763671875,
-0.09649658203125,
-0.004673004150390625,
-0.012176513671875,
-0.046417236328125,
0.0311279296875,
0.008270263671875,
0.04052734375,
0.05291748046875,
0.0023136138916015625,
0.0238494873046875,
0.050140380859375,
0.0085906982421875,
0.033416748046875,
0.052703857421875,
-0.0033168792724609375,
-0.05133056640625,
0.058319091796875,
0.0202178955078125,
0.0229949951171875,
0.0273895263671875,
0.0341796875,
-0.039459228515625,
-0.07354736328125,
-0.0307464599609375,
0.0078277587890625,
-0.046875,
-0.0255584716796875,
-0.045867919921875,
-0.042205810546875,
-0.03466796875,
0.0096282958984375,
-0.04547119140625,
-0.05438232421875,
-0.0452880859375,
0.0276336669921875,
0.046844482421875,
0.033294677734375,
0.0008726119995117188,
0.05322265625,
-0.07574462890625,
0.0173492431640625,
-0.003875732421875,
0.00991058349609375,
0.00720977783203125,
-0.0772705078125,
-0.0026302337646484375,
0.0082855224609375,
-0.0250091552734375,
-0.0557861328125,
0.0325927734375,
0.005405426025390625,
0.007686614990234375,
0.0216064453125,
0.006900787353515625,
0.053924560546875,
-0.0157470703125,
0.06829833984375,
0.027191162109375,
-0.080078125,
0.045684814453125,
-0.040283203125,
0.00664520263671875,
0.03607177734375,
0.01277923583984375,
-0.04693603515625,
-0.0070343017578125,
-0.047027587890625,
-0.046630859375,
0.0567626953125,
0.0284576416015625,
-0.0012111663818359375,
0.0195159912109375,
0.0170440673828125,
0.0063323974609375,
0.0131683349609375,
-0.060699462890625,
-0.0198211669921875,
-0.046173095703125,
-0.030792236328125,
0.01082611083984375,
-0.022308349609375,
-0.01171112060546875,
-0.0153961181640625,
0.050872802734375,
-0.0164947509765625,
0.03155517578125,
0.036224365234375,
-0.007671356201171875,
-0.011077880859375,
0.0018310546875,
0.053863525390625,
0.01212310791015625,
-0.041046142578125,
-0.004657745361328125,
0.008575439453125,
-0.060882568359375,
-0.00470733642578125,
0.004058837890625,
-0.0284423828125,
0.01021575927734375,
0.037109375,
0.06011962890625,
0.028045654296875,
-0.0237274169921875,
0.051116943359375,
-0.015411376953125,
-0.033050537109375,
-0.054779052734375,
0.0065155029296875,
0.01551055908203125,
0.0166778564453125,
0.0227203369140625,
0.0212249755859375,
0.027252197265625,
-0.02825927734375,
-0.0205841064453125,
0.009368896484375,
-0.039947509765625,
-0.034423828125,
0.05859375,
0.01236724853515625,
-0.0235137939453125,
0.04931640625,
0.004695892333984375,
-0.01447296142578125,
0.03643798828125,
0.06317138671875,
0.0831298828125,
-0.007251739501953125,
0.0088043212890625,
0.038604736328125,
0.0216827392578125,
-0.020050048828125,
0.049407958984375,
-0.007411956787109375,
-0.03179931640625,
-0.0270843505859375,
-0.052276611328125,
-0.02532958984375,
0.005596160888671875,
-0.06658935546875,
0.027496337890625,
-0.044708251953125,
-0.0171661376953125,
0.0007176399230957031,
0.0084381103515625,
-0.039398193359375,
0.004596710205078125,
0.004547119140625,
0.1055908203125,
-0.049652099609375,
0.08416748046875,
0.038116455078125,
-0.03436279296875,
-0.0830078125,
-0.0218048095703125,
-0.005992889404296875,
-0.050323486328125,
0.040802001953125,
0.009796142578125,
0.005558013916015625,
0.0007586479187011719,
-0.0576171875,
-0.0628662109375,
0.1092529296875,
0.00832366943359375,
-0.041412353515625,
-0.01125335693359375,
0.004322052001953125,
0.037811279296875,
-0.0411376953125,
0.05316162109375,
0.023712158203125,
0.05987548828125,
0.006866455078125,
-0.08465576171875,
0.006702423095703125,
-0.01074981689453125,
0.0183563232421875,
0.00821685791015625,
-0.0654296875,
0.08746337890625,
-0.005573272705078125,
-0.004055023193359375,
0.05731201171875,
0.048065185546875,
0.021514892578125,
0.0229949951171875,
0.036376953125,
0.034637451171875,
0.0237274169921875,
-0.024444580078125,
0.047210693359375,
-0.0182037353515625,
0.034423828125,
0.0601806640625,
-0.01435089111328125,
0.07305908203125,
0.02838134765625,
0.0007877349853515625,
0.048187255859375,
0.037261962890625,
-0.022064208984375,
0.0389404296875,
-0.0014734268188476562,
-0.0024852752685546875,
-0.0020923614501953125,
-0.01258087158203125,
-0.03741455078125,
0.051666259765625,
0.02801513671875,
-0.0296478271484375,
-0.005641937255859375,
-0.0108795166015625,
0.0032196044921875,
-0.012481689453125,
-0.029083251953125,
0.048065185546875,
-0.0083770751953125,
-0.0272979736328125,
0.05218505859375,
0.033294677734375,
0.0677490234375,
-0.0682373046875,
-0.0129547119140625,
0.0217437744140625,
0.006908416748046875,
-0.0186309814453125,
-0.0565185546875,
0.030548095703125,
-0.0235137939453125,
-0.0229339599609375,
-0.0024127960205078125,
0.0499267578125,
-0.046051025390625,
-0.030426025390625,
0.027435302734375,
0.009796142578125,
0.019775390625,
-0.0197906494140625,
-0.061279296875,
0.033721923828125,
0.01087188720703125,
-0.026580810546875,
0.01529693603515625,
-0.0013380050659179688,
0.004474639892578125,
0.03826904296875,
0.06671142578125,
0.0183258056640625,
-0.0020198822021484375,
0.01055145263671875,
0.05718994140625,
-0.0450439453125,
-0.046478271484375,
-0.02496337890625,
0.0357666015625,
-0.003810882568359375,
-0.059326171875,
0.034881591796875,
0.059234619140625,
0.038787841796875,
-0.021331787109375,
0.035797119140625,
0.0032711029052734375,
0.0240631103515625,
-0.06396484375,
0.06683349609375,
-0.0300445556640625,
-0.00849151611328125,
-0.005413055419921875,
-0.06890869140625,
0.005741119384765625,
0.026092529296875,
0.0025806427001953125,
-0.01535797119140625,
0.05010986328125,
0.06475830078125,
-0.01250457763671875,
0.0196990966796875,
-0.002422332763671875,
0.028533935546875,
0.0238494873046875,
0.043365478515625,
0.044647216796875,
-0.07098388671875,
0.057647705078125,
-0.02728271484375,
-0.00942230224609375,
-0.006092071533203125,
-0.045806884765625,
-0.06658935546875,
-0.044525146484375,
-0.0290069580078125,
-0.050018310546875,
-0.0115814208984375,
0.065673828125,
0.0723876953125,
-0.038421630859375,
-0.031646728515625,
0.00811767578125,
-0.01218414306640625,
0.00048160552978515625,
-0.0220947265625,
0.031402587890625,
0.0261077880859375,
-0.04888916015625,
0.034942626953125,
0.006191253662109375,
0.037200927734375,
-0.0258331298828125,
-0.033172607421875,
0.0211639404296875,
0.0016498565673828125,
0.023895263671875,
-0.0023441314697265625,
-0.0589599609375,
-0.01250457763671875,
-0.017669677734375,
-0.0023250579833984375,
0.00627899169921875,
0.05328369140625,
-0.045562744140625,
0.0031642913818359375,
0.03900146484375,
-0.01348114013671875,
0.0572509765625,
-0.0323486328125,
0.016937255859375,
-0.03607177734375,
0.037200927734375,
0.0227508544921875,
0.03143310546875,
0.00922393798828125,
-0.00547027587890625,
0.033294677734375,
0.02020263671875,
-0.028839111328125,
-0.06964111328125,
-0.0041046142578125,
-0.10003662109375,
-0.00786590576171875,
0.0821533203125,
0.003124237060546875,
-0.0287628173828125,
0.0222625732421875,
-0.04876708984375,
0.0277862548828125,
-0.04949951171875,
0.0191192626953125,
0.01131439208984375,
0.03326416015625,
-0.004993438720703125,
-0.036407470703125,
0.026153564453125,
-0.0016536712646484375,
-0.030242919921875,
-0.00409698486328125,
0.01062774658203125,
0.03741455078125,
0.0306396484375,
0.03631591796875,
-0.034942626953125,
0.034881591796875,
0.021697998046875,
0.0222625732421875,
-0.02264404296875,
-0.031524658203125,
-0.0204010009765625,
-0.000682830810546875,
-0.0006556510925292969,
-0.0146942138671875
]
] |
fxmarty/resnet-tiny-beans | 2022-08-23T13:18:09.000Z | [
"transformers",
"pytorch",
"onnx",
"resnet",
"image-classification",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | image-classification | fxmarty | null | null | fxmarty/resnet-tiny-beans | 1 | 26,352 | transformers | 2022-07-27T09:51:50 | ---
license: apache-2.0
---
A very small ResNet trained on the beans dataset, intended purely for testing.
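For a quick smoke test, a minimal sketch using the `pipeline` API (the image path is purely illustrative):
```python
from transformers import pipeline

# Tiny classifier; handy for smoke-testing inference code paths
classifier = pipeline("image-classification", model="fxmarty/resnet-tiny-beans")

# "leaf.jpg" is a hypothetical local file; any RGB image path or URL works
print(classifier("leaf.jpg"))
```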
| 116 | [
[
-0.0302581787109375,
-0.053131103515625,
0.023956298828125,
0.0170745849609375,
-0.009246826171875,
-0.0302276611328125,
0.0086212158203125,
0.002033233642578125,
0.0210723876953125,
0.02734375,
-0.049835205078125,
-0.0022430419921875,
-0.017242431640625,
-0.01361083984375,
-0.0665283203125,
0.082275390625,
0.048126220703125,
0.01239013671875,
-0.005687713623046875,
-0.0010824203491210938,
-0.027496337890625,
0.00860595703125,
-0.0489501953125,
-0.036651611328125,
0.05413818359375,
0.058624267578125,
0.0633544921875,
0.044464111328125,
0.0469970703125,
-0.0024318695068359375,
0.00423431396484375,
-0.021392822265625,
-0.036346435546875,
-0.0396728515625,
0.001369476318359375,
-0.06201171875,
-0.0161285400390625,
0.01422882080078125,
0.0205535888671875,
0.08001708984375,
0.0226593017578125,
0.0533447265625,
-0.0236968994140625,
0.01239013671875,
-0.0345458984375,
-0.0024166107177734375,
-0.016845703125,
0.0194091796875,
-0.006610870361328125,
0.0137481689453125,
-0.035736083984375,
-0.02978515625,
-0.01351165771484375,
-0.050628662109375,
0.0655517578125,
0.05963134765625,
0.07781982421875,
0.008056640625,
-0.0282440185546875,
0.017669677734375,
-0.03778076171875,
0.033355712890625,
-0.0136260986328125,
0.02587890625,
0.0258026123046875,
0.054595947265625,
-0.00820159912109375,
-0.0447998046875,
-0.05145263671875,
-0.01194000244140625,
-0.0280609130859375,
-0.006008148193359375,
0.01824951171875,
-0.0034313201904296875,
0.0273590087890625,
0.032257080078125,
-0.033050537109375,
-0.00554656982421875,
-0.0215606689453125,
0.0187530517578125,
0.04754638671875,
0.032684326171875,
0.009552001953125,
-0.005908966064453125,
-0.0216522216796875,
-0.0082244873046875,
-0.051971435546875,
-0.033203125,
0.004199981689453125,
0.048431396484375,
-0.0379638671875,
0.043731689453125,
0.0048065185546875,
0.0628662109375,
-0.0239715576171875,
0.0023021697998046875,
0.00417327880859375,
0.028076171875,
-0.0452880859375,
-0.0301513671875,
0.04986572265625,
-0.0063629150390625,
0.02166748046875,
-0.02178955078125,
-0.046142578125,
0.01168060302734375,
0.023895263671875,
-0.0670166015625,
-0.060516357421875,
0.0005955696105957031,
-0.04681396484375,
-0.030731201171875,
-0.00710296630859375,
-0.032501220703125,
-0.0229339599609375,
-0.0164642333984375,
0.03515625,
-0.0253448486328125,
-0.0162506103515625,
-0.004894256591796875,
-0.03070068359375,
-0.01001739501953125,
0.01116180419921875,
-0.0712890625,
0.04095458984375,
0.0271148681640625,
0.07232666015625,
0.0225372314453125,
-0.03045654296875,
-0.0143890380859375,
0.002689361572265625,
-0.0543212890625,
0.039703369140625,
-0.039642333984375,
-0.057861328125,
-0.021759033203125,
0.039306640625,
0.0004401206970214844,
0.00031256675720214844,
0.03546142578125,
-0.04833984375,
-0.01113128662109375,
-0.037872314453125,
-0.0419921875,
-0.0007200241088867188,
0.01085662841796875,
-0.0416259765625,
0.0850830078125,
0.044158935546875,
-0.033203125,
0.066162109375,
-0.06707763671875,
-0.0278167724609375,
0.037689208984375,
-0.036102294921875,
-0.049285888671875,
0.0291595458984375,
-0.020660400390625,
0.0284576416015625,
-0.05218505859375,
0.0005970001220703125,
-0.054168701171875,
-0.03692626953125,
0.022369384765625,
-0.0097198486328125,
0.0419921875,
0.02276611328125,
0.01154327392578125,
-0.034027099609375,
-0.0860595703125,
0.007389068603515625,
0.041717529296875,
-0.007659912109375,
-0.033660888671875,
-0.0249481201171875,
0.0274200439453125,
-0.01198577880859375,
-0.0211334228515625,
-0.064208984375,
0.03692626953125,
0.003261566162109375,
0.0193634033203125,
0.0252532958984375,
0.0030536651611328125,
0.037445068359375,
-0.04425048828125,
0.023162841796875,
-0.01206207275390625,
0.036407470703125,
-0.010040283203125,
-0.0450439453125,
-0.08734130859375,
-0.053131103515625,
0.068115234375,
-0.00800323486328125,
-0.0086822509765625,
0.04107666015625,
-0.0159149169921875,
-0.04364013671875,
-0.04302978515625,
0.004497528076171875,
0.01611328125,
0.0187530517578125,
-0.006313323974609375,
0.00589752197265625,
-0.0189361572265625,
-0.08538818359375,
0.04058837890625,
-0.0044708251953125,
-0.044036865234375,
-0.004032135009765625,
0.072998046875,
-0.01221466064453125,
0.09857177734375,
-0.058807373046875,
-0.027130126953125,
-0.015838623046875,
0.0070037841796875,
0.0161285400390625,
0.054229736328125,
0.051849365234375,
-0.048980712890625,
-0.02032470703125,
-0.004161834716796875,
-0.038909912109375,
-0.004970550537109375,
-0.0010356903076171875,
-0.010650634765625,
-0.057525634765625,
0.0155029296875,
-0.049896240234375,
0.03955078125,
0.0290679931640625,
-0.0041351318359375,
0.01470184326171875,
-0.01441192626953125,
-0.005390167236328125,
-0.032745361328125,
-0.002155303955078125,
-0.007904052734375,
-0.027984619140625,
-0.02703857421875,
0.022125244140625,
0.04779052734375,
-0.0411376953125,
-0.01448822021484375,
0.0625,
-0.04302978515625,
-0.027923583984375,
-0.041595458984375,
-0.0200042724609375,
0.0244140625,
0.0237274169921875,
-0.0024852752685546875,
0.055938720703125,
0.0197906494140625,
-0.0626220703125,
0.0290679931640625,
0.029632568359375,
-0.01145172119140625,
0.046112060546875,
-0.04107666015625,
0.0189208984375,
0.0060882568359375,
-0.01519012451171875,
-0.05975341796875,
-0.055938720703125,
-0.015716552734375,
-0.01641845703125,
0.004543304443359375,
-0.033447265625,
-0.03619384765625,
-0.05841064453125,
0.0011911392211914062,
0.0270843505859375,
0.0323486328125,
-0.07952880859375,
0.034515380859375,
0.0252532958984375,
0.0092315673828125,
0.01898193359375,
-0.0836181640625,
-0.03350830078125,
-0.010040283203125,
-0.0287017822265625,
-0.0031833648681640625,
-0.0024700164794921875,
0.0014982223510742188,
0.00463104248046875,
0.018798828125,
-0.02117919921875,
0.007232666015625,
0.04217529296875,
0.016021728515625,
-0.033935546875,
0.0692138671875,
-0.015625,
-0.01221466064453125,
-0.006771087646484375,
-0.004425048828125,
0.0174713134765625,
-0.0032749176025390625,
-0.026458740234375,
-0.0132598876953125,
0.0165252685546875,
0.0110626220703125,
0.0301055908203125,
0.05224609375,
-0.005138397216796875,
-0.02935791015625,
-0.060211181640625,
-0.0161895751953125,
-0.030548095703125,
-0.0280609130859375,
-0.003414154052734375,
-0.06488037109375,
-0.033538818359375,
0.0345458984375,
0.006954193115234375,
-0.01251220703125,
0.01049041748046875,
0.0287322998046875,
-0.006931304931640625,
0.05792236328125,
0.0657958984375,
0.0042724609375,
-0.0008134841918945312,
-0.017974853515625,
0.0181121826171875,
-0.032470703125,
-0.00409698486328125,
-0.047271728515625,
-0.0263519287109375,
-0.01082611083984375,
-0.024078369140625,
-0.006374359130859375,
-0.019561767578125,
-0.058807373046875,
0.021728515625,
-0.0234222412109375,
0.0616455078125,
0.053985595703125,
0.01373291015625,
0.00039196014404296875,
-0.05255126953125,
0.0023937225341796875,
0.01181793212890625,
-0.054779052734375,
-0.032745361328125,
0.0921630859375,
0.029449462890625,
0.08343505859375,
-0.00878143310546875,
-0.003086090087890625,
0.045623779296875,
0.031402587890625,
-0.06060791015625,
0.0290069580078125,
0.020294189453125,
-0.07928466796875,
-0.01355743408203125,
-0.026031494140625,
-0.07464599609375,
0.00982666015625,
-0.006046295166015625,
-0.023101806640625,
0.02783203125,
0.027313232421875,
-0.050811767578125,
0.01019287109375,
-0.0771484375,
0.064208984375,
-0.0244293212890625,
0.0033054351806640625,
0.0264129638671875,
-0.026611328125,
0.0179595947265625,
-0.0159149169921875,
0.00719451904296875,
-0.0125579833984375,
-0.0014696121215820312,
0.047607421875,
-0.043304443359375,
0.057525634765625,
-0.006343841552734375,
0.0273895263671875,
0.0093536376953125,
0.031402587890625,
0.0249176025390625,
0.0259857177734375,
0.028411865234375,
-0.00464630126953125,
0.0168609619140625,
-0.0692138671875,
-0.02410888671875,
0.0275115966796875,
-0.0595703125,
-0.016754150390625,
-0.03472900390625,
-0.040771484375,
-0.0146026611328125,
0.015625,
0.00994110107421875,
0.0113983154296875,
-0.036529541015625,
0.02081298828125,
0.05584716796875,
-0.00383758544921875,
0.01389312744140625,
0.050018310546875,
-0.0592041015625,
0.03887939453125,
0.055633544921875,
-0.00919342041015625,
0.018035888671875,
0.021820068359375,
0.032012939453125,
0.0210418701171875,
-0.0168609619140625,
-0.0244140625,
0.020538330078125,
-0.0230255126953125,
-0.01383209228515625,
-0.03741455078125,
-0.0251312255859375,
-0.0193634033203125,
-0.01284027099609375,
-0.060302734375,
-0.044677734375,
-0.00974273681640625,
-0.0325927734375,
0.03192138671875,
0.049957275390625,
-0.01236724853515625,
0.09686279296875,
-0.0386962890625,
0.0265960693359375,
0.043701171875,
0.035980224609375,
-0.0265655517578125,
-0.033843994140625,
-0.0299835205078125,
-0.00431060791015625,
-0.00939178466796875,
-0.039215087890625,
0.02587890625,
0.01166534423828125,
0.002178192138671875,
0.059814453125,
-0.0111846923828125,
0.024627685546875,
-0.0154571533203125,
0.040557861328125,
0.02130126953125,
-0.0419921875,
0.054931640625,
-0.030792236328125,
0.014801025390625,
0.039306640625,
0.0201263427734375,
-0.0171051025390625,
-0.00719451904296875,
-0.07086181640625,
-0.006244659423828125,
0.059814453125,
0.01081085205078125,
-0.002838134765625,
-0.0006995201110839844,
0.0168914794921875,
0.041229248046875,
0.0306396484375,
-0.042572021484375,
-0.00841522216796875,
0.002094268798828125,
-0.004222869873046875,
0.010833740234375,
-0.01134490966796875,
-0.05584716796875,
-0.00823974609375,
0.035430908203125,
-0.024078369140625,
0.03399658203125,
-0.034423828125,
-0.0120086669921875,
-0.0115814208984375,
-0.0103912353515625,
-0.0305023193359375,
0.0240020751953125,
-0.04302978515625,
0.00424957275390625,
0.01044464111328125,
-0.0303955078125,
0.025115966796875,
-0.017669677734375,
-0.0452880859375,
0.0019741058349609375,
0.00750732421875,
0.051971435546875,
0.0235595703125,
-0.0220794677734375,
0.0604248046875,
-0.0321044921875,
-0.0089263916015625,
-0.0244598388671875,
0.01546478271484375,
-0.004467010498046875,
-0.006916046142578125,
0.00102996826171875,
0.01483917236328125,
0.01456451416015625,
-0.0221710205078125,
0.0153656005859375,
0.001312255859375,
-0.022613525390625,
-0.027984619140625,
0.051116943359375,
0.03167724609375,
-0.01453399658203125,
0.056610107421875,
-0.037017822265625,
0.00710296630859375,
0.053192138671875,
0.034637451171875,
0.0160064697265625,
-0.004329681396484375,
0.030914306640625,
0.01094818115234375,
0.04046630859375,
-0.014892578125,
0.006366729736328125,
0.00272369384765625,
-0.0565185546875,
0.021392822265625,
-0.049560546875,
-0.0379638671875,
0.042327880859375,
-0.05145263671875,
0.04718017578125,
-0.06890869140625,
-0.0161285400390625,
0.0047149658203125,
-0.010040283203125,
-0.07733154296875,
0.0792236328125,
-0.00777435302734375,
0.0933837890625,
-0.08624267578125,
0.07513427734375,
0.034088134765625,
-0.0394287109375,
-0.033111572265625,
-0.017913818359375,
0.00238037109375,
-0.07037353515625,
0.037567138671875,
0.00910186767578125,
0.0386962890625,
-0.031158447265625,
-0.0295562744140625,
-0.0452880859375,
0.039306640625,
0.043548583984375,
-0.050628662109375,
-0.01525115966796875,
0.0017576217651367188,
0.023193359375,
-0.038665771484375,
0.044036865234375,
0.0755615234375,
-0.0011396408081054688,
-0.01328277587890625,
-0.023284912109375,
-0.020233154296875,
-0.037689208984375,
0.0278778076171875,
0.003650665283203125,
-0.0849609375,
0.08074951171875,
0.01006317138671875,
-0.001667022705078125,
0.0146026611328125,
0.057769775390625,
-0.00013065338134765625,
0.00264739990234375,
0.053955078125,
0.036956787109375,
0.056243896484375,
-0.0335693359375,
0.10968017578125,
0.0059661865234375,
0.042694091796875,
0.0855712890625,
-0.00472259521484375,
0.016845703125,
0.0218658447265625,
-0.0137939453125,
-0.0167236328125,
0.06610107421875,
-0.017913818359375,
0.047271728515625,
0.01715087890625,
-0.0026950836181640625,
-0.03741455078125,
0.0099029541015625,
-0.036865234375,
0.01082611083984375,
0.02154541015625,
-0.0299530029296875,
0.00823974609375,
0.02374267578125,
-0.006916046142578125,
-0.007175445556640625,
-0.0216064453125,
0.01103973388671875,
0.01438140869140625,
-0.00804901123046875,
0.0026721954345703125,
-0.01074981689453125,
0.0039215087890625,
-0.04644775390625,
0.01538848876953125,
-0.0275115966796875,
0.032989501953125,
-0.0016469955444335938,
-0.034088134765625,
-0.00574493408203125,
-0.006237030029296875,
-0.034942626953125,
0.0172882080078125,
0.038604736328125,
0.01436614990234375,
-0.068359375,
0.035797119140625,
0.0004665851593017578,
0.0224456787109375,
0.01085662841796875,
-0.03570556640625,
0.00395965576171875,
-0.03155517578125,
-0.032501220703125,
0.0218658447265625,
0.050750732421875,
0.0173187255859375,
0.061004638671875,
0.017333984375,
0.019775390625,
0.0111541748046875,
0.0390625,
0.047637939453125,
-0.05157470703125,
-0.06268310546875,
-0.013336181640625,
0.01338958740234375,
-0.025115966796875,
-0.036041259765625,
0.041229248046875,
0.060028076171875,
0.052825927734375,
-0.03143310546875,
0.07745361328125,
-0.016571044921875,
0.0657958984375,
-0.00789642333984375,
0.0268402099609375,
-0.0379638671875,
-0.0009889602661132812,
0.0205078125,
-0.0330810546875,
-0.0088348388671875,
0.057647705078125,
-0.005886077880859375,
0.011322021484375,
0.0498046875,
0.0655517578125,
-0.046173095703125,
0.06365966796875,
0.05218505859375,
0.0347900390625,
0.01445770263671875,
0.027557373046875,
0.049774169921875,
-0.03167724609375,
0.029022216796875,
-0.031341552734375,
-0.036865234375,
-0.02764892578125,
-0.0243072509765625,
-0.08203125,
-0.030548095703125,
-0.00848388671875,
-0.003147125244140625,
0.0161895751953125,
0.045684814453125,
0.09161376953125,
-0.059722900390625,
-0.0155792236328125,
-0.0160064697265625,
-0.03369140625,
-0.01482391357421875,
-0.00783538818359375,
0.0108184814453125,
-0.0307464599609375,
-0.052825927734375,
0.0259857177734375,
0.00506591796875,
0.004055023193359375,
-0.014068603515625,
0.01367950439453125,
0.00039196014404296875,
0.0179443359375,
0.011566162109375,
0.026031494140625,
0.0178985595703125,
-0.059844970703125,
-0.0335693359375,
-0.0183868408203125,
-0.005405426025390625,
0.04742431640625,
-0.0214691162109375,
0.03143310546875,
-0.00033593177795410156,
0.061004638671875,
0.051361083984375,
-0.0266876220703125,
0.04327392578125,
-0.08154296875,
0.052154541015625,
0.0086212158203125,
0.042510986328125,
0.0228118896484375,
-0.0305938720703125,
0.06085205078125,
0.023345947265625,
-0.04766845703125,
-0.07086181640625,
0.00916290283203125,
-0.07427978515625,
-0.014434814453125,
0.06182861328125,
-0.0085296630859375,
-0.04718017578125,
0.007965087890625,
-0.0102386474609375,
0.0090179443359375,
-0.0176239013671875,
0.06524658203125,
0.0343017578125,
0.0265655517578125,
0.0017290115356445312,
-0.046142578125,
0.018157958984375,
0.030731201171875,
-0.055206298828125,
-0.007205963134765625,
0.002471923828125,
0.0286407470703125,
0.017791748046875,
0.03045654296875,
0.016387939453125,
0.040863037109375,
0.06402587890625,
0.02423095703125,
0.00392913818359375,
-0.048492431640625,
0.0031833648681640625,
0.0364990234375,
-0.0019989013671875,
-0.0513916015625
]
] |
ckiplab/albert-tiny-chinese-pos | 2022-05-10T03:28:11.000Z | [
"transformers",
"pytorch",
"albert",
"token-classification",
"zh",
"license:gpl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | ckiplab | null | null | ckiplab/albert-tiny-chinese-pos | 0 | 26,237 | transformers | 2022-03-02T23:29:05 | ---
language:
- zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
- pytorch
- token-classification
- albert
- zh
license: gpl-3.0
---
# CKIP ALBERT Tiny Chinese
This project provides traditional Chinese transformers models (including ALBERT, BERT, GPT2) and NLP tools (including word segmentation, part-of-speech tagging, named entity recognition).
這個專案提供了繁體中文的 transformers 模型(包含 ALBERT、BERT、GPT2)及自然語言處理工具(包含斷詞、詞性標記、實體辨識)。
## Homepage
- https://github.com/ckiplab/ckip-transformers
## Contributors
- [Mu Yang](https://muyang.pro) at [CKIP](https://ckip.iis.sinica.edu.tw) (Author & Maintainer)
## Usage
Please use `BertTokenizerFast` as the tokenizer instead of `AutoTokenizer`.
請使用 BertTokenizerFast 而非 AutoTokenizer。
```python
from transformers import (
    BertTokenizerFast,
    AutoModel,
)

tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModel.from_pretrained('ckiplab/albert-tiny-chinese-pos')
```
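As a minimal sketch of token-level POS inference (not part of the official instructions; the sample sentence and the use of `AutoModelForTokenClassification` are illustrative assumptions), each token can be mapped to its highest-scoring tag:
```python
import torch
from transformers import BertTokenizerFast, AutoModelForTokenClassification

tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModelForTokenClassification.from_pretrained('ckiplab/albert-tiny-chinese-pos')

# Illustrative input sentence
inputs = tokenizer("傅達仁今將執行安樂死", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map each token to the label of its highest-scoring class
predictions = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, predictions):
    print(token, model.config.id2label[pred.item()])
```
For full tagging pipelines (word segmentation, POS, NER), the `ckip-transformers` package linked below is the recommended entry point.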
For full usage and more information, please refer to https://github.com/ckiplab/ckip-transformers.
有關完整使用方法及其他資訊,請參見 https://github.com/ckiplab/ckip-transformers 。
| 1,129 | [
[
-0.0240325927734375,
-0.0189361572265625,
0.00638580322265625,
0.05303955078125,
-0.0247955322265625,
0.0028018951416015625,
-0.0156402587890625,
-0.02154541015625,
0.0012149810791015625,
0.02532958984375,
-0.0245208740234375,
-0.0181732177734375,
-0.037445068359375,
0.00862884521484375,
-0.0139923095703125,
0.0595703125,
-0.01177978515625,
0.021942138671875,
0.0283660888671875,
0.0061798095703125,
-0.0173492431640625,
-0.01727294921875,
-0.060272216796875,
-0.04217529296875,
-0.00047969818115234375,
0.03173828125,
0.054901123046875,
0.034332275390625,
0.0396728515625,
0.0226287841796875,
-0.0002027750015258789,
-0.007205963134765625,
-0.006420135498046875,
-0.032867431640625,
0.0002925395965576172,
-0.04486083984375,
-0.027191162109375,
-0.0188140869140625,
0.052947998046875,
0.035247802734375,
0.0013551712036132812,
0.00579071044921875,
0.0158233642578125,
0.0240020751953125,
-0.0225830078125,
0.0309906005859375,
-0.0418701171875,
0.0229949951171875,
-0.0151519775390625,
-0.0033664703369140625,
-0.026214599609375,
-0.0140228271484375,
0.00687408447265625,
-0.0494384765625,
0.02587890625,
0.0018444061279296875,
0.09375,
0.00563812255859375,
-0.0206451416015625,
-0.020050048828125,
-0.050750732421875,
0.078857421875,
-0.062255859375,
0.0260162353515625,
0.027191162109375,
0.022430419921875,
0.0019006729125976562,
-0.08056640625,
-0.0517578125,
-0.012939453125,
-0.0287933349609375,
0.0180511474609375,
0.005939483642578125,
-0.0023517608642578125,
0.02935791015625,
0.0216064453125,
-0.0447998046875,
0.0212554931640625,
-0.0298919677734375,
-0.02935791015625,
0.042999267578125,
0.004016876220703125,
0.035430908203125,
-0.0382080078125,
-0.03228759765625,
-0.0247344970703125,
-0.04595947265625,
0.017120361328125,
0.015716552734375,
0.01061248779296875,
-0.040802001953125,
0.04632568359375,
-0.0031070709228515625,
0.018585205078125,
0.012420654296875,
-0.00637054443359375,
0.026153564453125,
-0.0280914306640625,
-0.0008678436279296875,
-0.008941650390625,
0.07794189453125,
0.0144195556640625,
0.0011014938354492188,
0.01111602783203125,
-0.0263214111328125,
-0.0261993408203125,
-0.021240234375,
-0.061187744140625,
-0.05389404296875,
0.0146026611328125,
-0.05987548828125,
-0.01551055908203125,
0.00799560546875,
-0.05096435546875,
0.0181427001953125,
-0.0144195556640625,
0.027923583984375,
-0.049072265625,
-0.03826904296875,
-0.0024280548095703125,
-0.0243377685546875,
0.050628662109375,
0.01047515869140625,
-0.0931396484375,
0.0006666183471679688,
0.04644775390625,
0.060882568359375,
0.0161285400390625,
-0.01165771484375,
0.00806427001953125,
0.02850341796875,
-0.0181732177734375,
0.043548583984375,
-0.00875091552734375,
-0.04998779296875,
0.007755279541015625,
0.004772186279296875,
0.002422332763671875,
-0.036407470703125,
0.058624267578125,
-0.01922607421875,
0.0234375,
-0.00867462158203125,
-0.01422882080078125,
0.0022525787353515625,
0.00666046142578125,
-0.03759765625,
0.07916259765625,
0.009765625,
-0.0640869140625,
0.01654052734375,
-0.0679931640625,
-0.03778076171875,
0.0253448486328125,
-0.0116729736328125,
-0.035491943359375,
-0.00746917724609375,
0.0174102783203125,
0.0206146240234375,
-0.0094451904296875,
0.00557708740234375,
-0.0067596435546875,
-0.0164642333984375,
-0.0005078315734863281,
-0.02972412109375,
0.09161376953125,
0.0236358642578125,
-0.0135498046875,
0.01313018798828125,
-0.05206298828125,
0.00774383544921875,
0.0259552001953125,
-0.01418304443359375,
-0.02532958984375,
0.00836181640625,
0.03863525390625,
0.0119476318359375,
0.037445068359375,
-0.042510986328125,
0.040252685546875,
-0.046356201171875,
0.0565185546875,
0.05682373046875,
-0.0166168212890625,
0.0264129638671875,
-0.01512908935546875,
0.0004208087921142578,
0.0022945404052734375,
0.0264739990234375,
-0.0030536651611328125,
-0.040496826171875,
-0.08355712890625,
-0.0292510986328125,
0.0380859375,
0.0498046875,
-0.08831787109375,
0.053680419921875,
-0.0223236083984375,
-0.0491943359375,
-0.0322265625,
-0.000024020671844482422,
0.00010114908218383789,
0.007549285888671875,
0.03424072265625,
-0.019439697265625,
-0.04083251953125,
-0.0771484375,
0.00624847412109375,
-0.046417236328125,
-0.049713134765625,
-0.000736236572265625,
0.04766845703125,
-0.03271484375,
0.07379150390625,
-0.04058837890625,
-0.0275726318359375,
-0.019744873046875,
0.042205810546875,
0.0245208740234375,
0.06402587890625,
0.043792724609375,
-0.073974609375,
-0.058502197265625,
-0.01158905029296875,
-0.023040771484375,
-0.0022220611572265625,
-0.0218353271484375,
-0.0066680908203125,
-0.00618743896484375,
0.0035839080810546875,
-0.04949951171875,
0.01537322998046875,
0.023590087890625,
-0.0011425018310546875,
0.061187744140625,
-0.005344390869140625,
-0.022491455078125,
-0.09033203125,
0.00911712646484375,
-0.0123748779296875,
-0.0028247833251953125,
-0.02740478515625,
-0.0027294158935546875,
0.0167388916015625,
-0.008453369140625,
-0.0447998046875,
0.04632568359375,
-0.02606201171875,
0.0247955322265625,
-0.0177764892578125,
-0.005084991455078125,
-0.0127105712890625,
0.03997802734375,
0.0263519287109375,
0.05242919921875,
0.039764404296875,
-0.04998779296875,
0.028411865234375,
0.053802490234375,
-0.0201568603515625,
-0.0045623779296875,
-0.07000732421875,
-0.004596710205078125,
0.0278472900390625,
0.0035228729248046875,
-0.0650634765625,
-0.00689697265625,
0.046966552734375,
-0.053070068359375,
0.047088623046875,
0.002086639404296875,
-0.07452392578125,
-0.038330078125,
-0.03887939453125,
0.0296173095703125,
0.046966552734375,
-0.049041748046875,
0.032928466796875,
0.0181732177734375,
-0.0173492431640625,
-0.02960205078125,
-0.0592041015625,
-0.0032901763916015625,
0.013916015625,
-0.043121337890625,
0.042144775390625,
-0.0193939208984375,
0.017059326171875,
0.0013837814331054688,
0.0101470947265625,
-0.03155517578125,
-0.0081024169921875,
-0.01235198974609375,
0.028778076171875,
-0.01345062255859375,
-0.00225067138671875,
0.0167236328125,
-0.02667236328125,
0.0081634521484375,
0.004016876220703125,
0.044891357421875,
-0.0033416748046875,
-0.02301025390625,
-0.048736572265625,
0.020111083984375,
0.01242828369140625,
-0.01348876953125,
0.0269775390625,
0.07794189453125,
-0.0208892822265625,
-0.0131072998046875,
-0.0210723876953125,
-0.0166168212890625,
-0.0413818359375,
0.038665771484375,
-0.033538818359375,
-0.06396484375,
0.02099609375,
-0.00981903076171875,
0.01165771484375,
0.058258056640625,
0.049591064453125,
-0.00439453125,
0.09747314453125,
0.07110595703125,
-0.03173828125,
0.032562255859375,
-0.03314208984375,
0.0321044921875,
-0.0709228515625,
0.02154541015625,
-0.04876708984375,
0.004528045654296875,
-0.06378173828125,
-0.0238037109375,
0.00286865234375,
0.015869140625,
-0.021240234375,
0.0506591796875,
-0.0572509765625,
0.0008006095886230469,
0.0528564453125,
-0.0289459228515625,
-0.0006504058837890625,
-0.012115478515625,
-0.01678466796875,
-0.00921630859375,
-0.05377197265625,
-0.053955078125,
0.053375244140625,
0.046905517578125,
0.053070068359375,
0.0032501220703125,
0.037017822265625,
-0.0008635520935058594,
0.037811279296875,
-0.062347412109375,
0.0416259765625,
-0.00887298583984375,
-0.056915283203125,
-0.0221405029296875,
-0.019989013671875,
-0.0634765625,
0.021453857421875,
-0.003635406494140625,
-0.06982421875,
0.0170135498046875,
0.00774383544921875,
-0.0211944580078125,
0.026153564453125,
-0.0380859375,
0.05963134765625,
-0.030975341796875,
0.0037364959716796875,
-0.00434112548828125,
-0.049041748046875,
0.034698486328125,
-0.006488800048828125,
0.0004734992980957031,
-0.005893707275390625,
-0.0021610260009765625,
0.060302734375,
-0.019439697265625,
0.0615234375,
-0.004306793212890625,
0.00037670135498046875,
0.026885986328125,
-0.01549530029296875,
0.0266265869140625,
0.025146484375,
0.010101318359375,
0.047515869140625,
0.02398681640625,
-0.02716064453125,
-0.0169525146484375,
0.038177490234375,
-0.0653076171875,
-0.03155517578125,
-0.044036865234375,
-0.01433563232421875,
0.01242828369140625,
0.042022705078125,
0.038482666015625,
-0.0075225830078125,
0.0041656494140625,
0.01824951171875,
0.01702880859375,
-0.0279541015625,
0.04730224609375,
0.045196533203125,
-0.00609588623046875,
-0.0303497314453125,
0.0721435546875,
0.007244110107421875,
0.005611419677734375,
0.04669189453125,
0.0044708251953125,
-0.0121002197265625,
-0.0323486328125,
-0.02825927734375,
0.03607177734375,
-0.0235443115234375,
-0.00406646728515625,
-0.02520751953125,
-0.03656005859375,
-0.045379638671875,
0.0095977783203125,
-0.03302001953125,
-0.031585693359375,
-0.0259552001953125,
0.00811004638671875,
-0.028564453125,
0.00800323486328125,
-0.028564453125,
0.038970947265625,
-0.08074951171875,
0.03973388671875,
0.0171661376953125,
0.0136566162109375,
0.00487518310546875,
-0.0193328857421875,
-0.0462646484375,
0.006656646728515625,
-0.06597900390625,
-0.046783447265625,
0.043914794921875,
0.002902984619140625,
0.040496826171875,
0.05010986328125,
0.0196075439453125,
0.039459228515625,
-0.048583984375,
0.082763671875,
0.0302276611328125,
-0.08355712890625,
0.031280517578125,
-0.0091705322265625,
0.024139404296875,
0.0215606689453125,
0.03399658203125,
-0.059783935546875,
-0.0236968994140625,
-0.038848876953125,
-0.0821533203125,
0.054168701171875,
0.031585693359375,
0.018798828125,
0.0018110275268554688,
-0.004180908203125,
-0.004528045654296875,
0.01367950439453125,
-0.07513427734375,
-0.038665771484375,
-0.0259857177734375,
-0.02484130859375,
0.01456451416015625,
-0.032440185546875,
-0.0001608133316040039,
-0.019989013671875,
0.07940673828125,
-0.0030612945556640625,
0.061981201171875,
0.0278472900390625,
0.0031795501708984375,
-0.01361846923828125,
0.0081024169921875,
0.033599853515625,
0.036285400390625,
-0.0149078369140625,
-0.0178375244140625,
0.0059814453125,
-0.043182373046875,
-0.0168304443359375,
0.02703857421875,
-0.03515625,
0.030517578125,
0.03948974609375,
0.044525146484375,
0.00836181640625,
-0.032257080078125,
0.044036865234375,
-0.021270751953125,
-0.013336181640625,
-0.07281494140625,
-0.004299163818359375,
0.00021219253540039062,
0.0066070556640625,
0.04150390625,
-0.006565093994140625,
0.010498046875,
-0.011199951171875,
0.01056671142578125,
0.02825927734375,
-0.038482666015625,
-0.038604736328125,
0.05596923828125,
0.032135009765625,
-0.0278472900390625,
0.05889892578125,
-0.01244354248046875,
-0.059783935546875,
0.0462646484375,
0.030670166015625,
0.0704345703125,
-0.0179901123046875,
0.005176544189453125,
0.04779052734375,
0.0430908203125,
0.002227783203125,
0.011199951171875,
-0.02239990234375,
-0.0654296875,
-0.034515380859375,
-0.030670166015625,
-0.035369873046875,
0.0247344970703125,
-0.03497314453125,
0.04620361328125,
-0.034332275390625,
-0.005413055419921875,
-0.00013554096221923828,
0.0007243156433105469,
-0.04205322265625,
0.003726959228515625,
0.008544921875,
0.08319091796875,
-0.0496826171875,
0.08660888671875,
0.04388427734375,
-0.040863037109375,
-0.06439208984375,
0.01348876953125,
-0.02239990234375,
-0.053070068359375,
0.07049560546875,
0.023834228515625,
0.025604248046875,
0.00479888916015625,
-0.054473876953125,
-0.05584716796875,
0.07635498046875,
-0.01418304443359375,
-0.030975341796875,
-0.005748748779296875,
0.02716064453125,
0.031829833984375,
-0.003986358642578125,
0.035736083984375,
0.01009368896484375,
0.04644775390625,
-0.016510009765625,
-0.08404541015625,
-0.01898193359375,
-0.0183563232421875,
0.008636474609375,
0.016876220703125,
-0.0718994140625,
0.05755615234375,
0.00179290771484375,
-0.026123046875,
0.036865234375,
0.07147216796875,
-0.0014486312866210938,
0.018341064453125,
0.0408935546875,
0.03173828125,
-0.00001919269561767578,
-0.01708984375,
0.042449951171875,
-0.039825439453125,
0.06298828125,
0.06341552734375,
0.0010747909545898438,
0.054931640625,
0.032989501953125,
-0.03778076171875,
0.03515625,
0.053253173828125,
-0.04010009765625,
0.04632568359375,
-0.00318145751953125,
-0.00783538818359375,
-0.007598876953125,
0.005096435546875,
-0.03662109375,
0.0162811279296875,
0.0251922607421875,
-0.020843505859375,
-0.003917694091796875,
-0.00823974609375,
-0.004207611083984375,
-0.029541015625,
-0.01080322265625,
0.037078857421875,
0.0189666748046875,
-0.01145172119140625,
0.034393310546875,
0.0265960693359375,
0.07373046875,
-0.0753173828125,
-0.0228118896484375,
0.017578125,
0.015380859375,
-0.006862640380859375,
-0.0452880859375,
0.0142059326171875,
-0.0239105224609375,
-0.01200103759765625,
-0.0108795166015625,
0.056610107421875,
-0.023529052734375,
-0.042022705078125,
0.023681640625,
0.0061187744140625,
0.009765625,
0.024169921875,
-0.0841064453125,
-0.0237274169921875,
0.0254364013671875,
-0.033538818359375,
0.0142822265625,
0.0184478759765625,
0.0080413818359375,
0.05029296875,
0.0601806640625,
0.01323699951171875,
-0.0134429931640625,
-0.0081787109375,
0.06060791015625,
-0.04168701171875,
-0.04351806640625,
-0.052215576171875,
0.053955078125,
-0.019256591796875,
-0.0242919921875,
0.05712890625,
0.053558349609375,
0.08935546875,
-0.025482177734375,
0.06689453125,
-0.031341552734375,
0.057098388671875,
-0.01087188720703125,
0.062042236328125,
-0.03948974609375,
-0.0029926300048828125,
-0.02056884765625,
-0.05908203125,
-0.019134521484375,
0.062164306640625,
-0.0157470703125,
-0.01042938232421875,
0.054595947265625,
0.048797607421875,
-0.002178192138671875,
-0.01377105712890625,
0.01415252685546875,
0.0184326171875,
0.0462646484375,
0.03240966796875,
0.044342041015625,
-0.041595458984375,
0.0535888671875,
-0.054901123046875,
-0.0127105712890625,
-0.01216888427734375,
-0.046112060546875,
-0.049163818359375,
-0.038787841796875,
-0.0238037109375,
-0.0047149658203125,
-0.0291290283203125,
0.06268310546875,
0.053070068359375,
-0.07733154296875,
-0.0278472900390625,
0.005809783935546875,
0.0131072998046875,
-0.0260162353515625,
-0.025848388671875,
0.04620361328125,
-0.02191162109375,
-0.08612060546875,
0.0014553070068359375,
0.011474609375,
0.0078582763671875,
-0.0201416015625,
0.0002925395965576172,
-0.01436614990234375,
-0.01439666748046875,
0.02825927734375,
0.037872314453125,
-0.048187255859375,
-0.0302581787109375,
-0.00997161865234375,
-0.0171966552734375,
0.007755279541015625,
0.04541015625,
-0.01381683349609375,
0.02484130859375,
0.043609619140625,
0.01812744140625,
0.0221405029296875,
-0.018096923828125,
0.039947509765625,
-0.039520263671875,
0.0274505615234375,
0.028564453125,
0.040557861328125,
0.0237579345703125,
-0.013519287109375,
0.0379638671875,
0.035125732421875,
-0.0533447265625,
-0.04791259765625,
0.025299072265625,
-0.072509765625,
-0.01425933837890625,
0.06781005859375,
-0.01276397705078125,
-0.017578125,
-0.0027179718017578125,
-0.04150390625,
0.041778564453125,
-0.0237884521484375,
0.0455322265625,
0.055145263671875,
-0.007587432861328125,
-0.00672149658203125,
-0.041778564453125,
0.0287933349609375,
0.03045654296875,
-0.0227813720703125,
-0.0262603759765625,
-0.0026226043701171875,
0.0095367431640625,
0.049224853515625,
0.03424072265625,
-0.00417327880859375,
0.0063323974609375,
-0.0047607421875,
0.03924560546875,
0.0034809112548828125,
0.013336181640625,
0.0013132095336914062,
-0.012115478515625,
0.008575439453125,
-0.033172607421875
]
] |
timm/vit_base_patch16_clip_224.openai | 2022-12-24T21:36:26.000Z | [
"timm",
"pytorch",
"image-classification",
"vision",
"arxiv:2103.00020",
"arxiv:1908.04913",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/vit_base_patch16_clip_224.openai | 1 | 26,180 | timm | 2022-11-01T22:01:59 | ---
tags:
- image-classification
- timm
- vision
library_tag: timm
license: apache-2.0
---
# CLIP (OpenAI model for timm)
## Model Details
The CLIP model was developed by researchers at OpenAI to learn about what contributes to robustness in computer vision tasks. The model was also developed to test the ability of models to generalize to arbitrary image classification tasks in a zero-shot manner. It was not developed for general model deployment - to deploy models like CLIP, researchers will first need to carefully study their capabilities in relation to the specific context they’re being deployed within.
This instance of the CLIP model is intended for loading in
* `timm` (https://github.com/rwightman/pytorch-image-models) and
* `OpenCLIP` (https://github.com/mlfoundations/open_clip) libraries.
Please see https://huggingface.co/openai/clip-vit-base-patch16 for use in Hugging Face Transformers.
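For orientation, a minimal loading sketch follows. This is assumed usage rather than an official snippet from this repo: the `hf-hub:` model name is taken from this repository's id, and the OpenCLIP lines refer to the equivalent `ViT-B-16` / `openai` pretrained tag.
```python
# a rough sketch (assumed usage, not an official snippet from this repo)
import timm
import torch
from timm.data import resolve_data_config, create_transform

# image tower via timm; the hf-hub: prefix pulls weights from this repository
model = timm.create_model('hf-hub:timm/vit_base_patch16_clip_224.openai', pretrained=True)
model.eval()

# build the preprocessing pipeline matching the pretrained configuration
transform = create_transform(**resolve_data_config({}, model=model))

with torch.no_grad():
    out = model(torch.randn(1, 3, 224, 224))  # image features; output size depends on the head config

# the full image+text model can instead be created with OpenCLIP, e.g.:
#   import open_clip
#   clip_model, _, preprocess = open_clip.create_model_and_transforms('ViT-B-16', pretrained='openai')
```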
### Model Date
January 2021
### Model Type
The model uses a ViT-B/16 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss.
The original implementation had two variants: one using a ResNet image encoder and the other using a Vision Transformer. This repository has the variant with the Vision Transformer.
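For readers unfamiliar with this objective, the snippet below is an illustrative reconstruction of the symmetric contrastive loss, following the pseudocode in the CLIP paper; it is not code shipped with this model, and the function name is made up for the example.
```python
import torch
import torch.nn.functional as F

def clip_contrastive_loss(image_feats, text_feats, logit_scale):
    """Illustrative symmetric contrastive loss; matched (image, text) pairs share a batch index."""
    image_feats = F.normalize(image_feats, dim=-1)       # unit-norm embeddings
    text_feats = F.normalize(text_feats, dim=-1)
    logits = logit_scale * image_feats @ text_feats.t()  # (N, N) scaled cosine similarities
    targets = torch.arange(logits.size(0), device=logits.device)  # positives on the diagonal
    # cross-entropy in both directions: image-to-text and text-to-image
    return (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets)) / 2
```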
### Documents
- [Blog Post](https://openai.com/blog/clip/)
- [CLIP Paper](https://arxiv.org/abs/2103.00020)
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models - the CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
### Out-of-Scope Use Cases
**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP’s performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.
Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of performance of the model. This is because the use of artificial intelligence for tasks such as these can be premature currently given the lack of testing norms and checks to ensure its fair use.
Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.
## Data
The model was trained on publicly available image-caption data. This was done through a combination of crawling a handful of websites and using commonly-used pre-existing image datasets such as [YFCC100M](http://projects.dfki.uni-kl.de/yfcc100m/). A large portion of the data comes from our crawling of the internet. This means that the data is more representative of people and societies most connected to the internet which tend to skew towards more developed nations, and younger, male users.
### Data Mission Statement
Our goal with building this dataset was to test out robustness and generalizability in computer vision tasks. As a result, the focus was on gathering large quantities of data from different publicly-available internet data sources. The data was gathered in a mostly non-interventionist manner. However, we only crawled websites that had policies against excessively violent and adult images and allowed us to filter out such content. We do not intend for this dataset to be used as the basis for any commercial or deployed model and will not be releasing the dataset.
## Limitations
CLIP and our analysis of it have a number of limitations. CLIP currently struggles with certain tasks such as fine-grained classification and counting objects. CLIP also poses issues with regard to fairness and bias, which we discuss in the paper and briefly in the next section. Additionally, our approach to testing CLIP has an important limitation: in many cases we have used linear probes to evaluate the performance of CLIP, and there is evidence suggesting that linear probes can underestimate model performance.
### Bias and Fairness
We find that the performance of CLIP - and the specific biases it exhibits - can depend significantly on class design and the choices one makes for categories to include and exclude. We tested the risk of certain kinds of denigration with CLIP by classifying images of people from [Fairface](https://arxiv.org/abs/1908.04913) into crime-related and non-human animal categories. We found significant disparities with respect to race and gender. Additionally, we found that these disparities could shift based on how the classes were constructed. (Details captured in the Broader Impacts Section in the paper).
We also tested the performance of CLIP on gender, race and age classification using the Fairface dataset (we default to the race categories as they are constructed in the Fairface dataset) in order to assess quality of performance across different demographics. We found accuracy >96% across all races for gender classification with ‘Middle Eastern’ having the highest accuracy (98.4%) and ‘White’ having the lowest (96.5%). Additionally, CLIP averaged ~93% for racial classification and ~63% for age classification. Our use of evaluations to test for gender, race and age classification as well as denigration harms is simply to evaluate performance of the model across people and surface potential risks, not to demonstrate endorsement of or enthusiasm for such tasks.
| 6,353 | [
[
-0.04296875,
-0.042724609375,
0.01276397705078125,
0.0011091232299804688,
-0.01271820068359375,
-0.0121002197265625,
0.00421142578125,
-0.0509033203125,
0.01018524169921875,
0.033294677734375,
-0.023773193359375,
-0.032470703125,
-0.04656982421875,
0.006656646728515625,
-0.0467529296875,
0.06048583984375,
-0.008636474609375,
0.0015897750854492188,
-0.020721435546875,
-0.0303497314453125,
-0.044891357421875,
-0.051055908203125,
-0.017303466796875,
0.016815185546875,
0.01114654541015625,
0.01042938232421875,
0.052490234375,
0.06121826171875,
0.0611572265625,
0.0134124755859375,
-0.017425537109375,
-0.0076446533203125,
-0.036895751953125,
-0.04852294921875,
-0.032928466796875,
-0.031280517578125,
-0.026763916015625,
0.0208892822265625,
0.04486083984375,
0.0303192138671875,
0.0009675025939941406,
0.019195556640625,
0.0020313262939453125,
0.0294189453125,
-0.07330322265625,
-0.01055145263671875,
-0.040252685546875,
0.0055084228515625,
-0.0283050537109375,
0.0076446533203125,
-0.014923095703125,
-0.01904296875,
0.0243377685546875,
-0.0299224853515625,
0.036041259765625,
0.0002703666687011719,
0.09503173828125,
0.0121917724609375,
-0.0221405029296875,
0.0013628005981445312,
-0.05059814453125,
0.0592041015625,
-0.035552978515625,
0.0206756591796875,
0.01861572265625,
0.036590576171875,
0.01006317138671875,
-0.061279296875,
-0.045989990234375,
-0.00501251220703125,
0.021270751953125,
0.0008268356323242188,
-0.0202484130859375,
-0.00009906291961669922,
0.03057861328125,
0.03375244140625,
-0.01251983642578125,
-0.00556182861328125,
-0.052703857421875,
-0.0181121826171875,
0.0472412109375,
0.019775390625,
0.03216552734375,
-0.01546478271484375,
-0.0499267578125,
-0.032989501953125,
-0.0458984375,
0.0374755859375,
0.03802490234375,
0.0142364501953125,
-0.0159912109375,
0.051483154296875,
0.00382232666015625,
0.0267333984375,
-0.0002980232238769531,
-0.02001953125,
0.0234375,
-0.035003662109375,
-0.00951385498046875,
-0.0227508544921875,
0.0577392578125,
0.06597900390625,
0.0172882080078125,
0.007099151611328125,
-0.005825042724609375,
0.015869140625,
0.034149169921875,
-0.0675048828125,
-0.006847381591796875,
-0.01401519775390625,
-0.0462646484375,
-0.0262908935546875,
0.0252227783203125,
-0.07940673828125,
0.00794219970703125,
-0.007587432861328125,
0.054931640625,
-0.035064697265625,
-0.00591278076171875,
0.00914764404296875,
-0.0213775634765625,
0.022735595703125,
0.0276336669921875,
-0.049163818359375,
0.0311126708984375,
0.025543212890625,
0.08673095703125,
-0.0367431640625,
-0.022216796875,
0.0006375312805175781,
-0.003566741943359375,
-0.01016998291015625,
0.052032470703125,
-0.0223846435546875,
-0.033966064453125,
-0.0034732818603515625,
0.0309906005859375,
-0.00568389892578125,
-0.048553466796875,
0.043731689453125,
-0.02001953125,
0.00853729248046875,
-0.02008056640625,
-0.02374267578125,
-0.047088623046875,
0.0207061767578125,
-0.048309326171875,
0.0682373046875,
0.0189666748046875,
-0.05950927734375,
0.03204345703125,
-0.06268310546875,
-0.00510406494140625,
-0.01137542724609375,
-0.01080322265625,
-0.046417236328125,
-0.0208587646484375,
0.0308685302734375,
0.0189971923828125,
-0.01401519775390625,
0.026275634765625,
-0.044708251953125,
-0.03424072265625,
0.01470947265625,
-0.0213623046875,
0.0726318359375,
0.004268646240234375,
-0.030059814453125,
0.00490570068359375,
-0.0214385986328125,
-0.018035888671875,
0.021331787109375,
0.0093841552734375,
-0.01873779296875,
-0.0085906982421875,
0.0189666748046875,
0.0038967132568359375,
-0.0102996826171875,
-0.0592041015625,
0.006366729736328125,
-0.005802154541015625,
0.036163330078125,
0.05242919921875,
0.0089263916015625,
0.0250396728515625,
-0.03961181640625,
0.0467529296875,
-0.0025424957275390625,
0.050567626953125,
-0.011260986328125,
-0.036468505859375,
-0.03460693359375,
-0.04010009765625,
0.043609619140625,
0.04827880859375,
-0.039337158203125,
0.01166534423828125,
-0.00545501708984375,
-0.0215301513671875,
-0.018096923828125,
-0.0183868408203125,
0.02813720703125,
0.0467529296875,
0.0237579345703125,
-0.07342529296875,
-0.03778076171875,
-0.08135986328125,
0.0100250244140625,
-0.003662109375,
-0.0058441162109375,
0.04644775390625,
0.071044921875,
-0.020904541015625,
0.09197998046875,
-0.052642822265625,
-0.0377197265625,
-0.01052093505859375,
-0.0084075927734375,
-0.01082611083984375,
0.03802490234375,
0.08172607421875,
-0.07208251953125,
-0.01507568359375,
-0.0450439453125,
-0.064208984375,
0.00490570068359375,
0.01326751708984375,
-0.0144195556640625,
0.0016918182373046875,
0.01279449462890625,
-0.0213775634765625,
0.0772705078125,
0.01515960693359375,
-0.0007348060607910156,
0.053375244140625,
0.0151214599609375,
0.0247650146484375,
-0.0438232421875,
0.0294189453125,
0.01303863525390625,
-0.018951416015625,
-0.03692626953125,
0.0115509033203125,
-0.0019083023071289062,
-0.035614013671875,
-0.06842041015625,
0.029388427734375,
-0.006938934326171875,
-0.008087158203125,
-0.01345062255859375,
-0.010284423828125,
0.014892578125,
0.0478515625,
0.0013303756713867188,
0.08612060546875,
0.032928466796875,
-0.05621337890625,
-0.0059814453125,
0.045135498046875,
-0.032440185546875,
0.045654296875,
-0.06781005859375,
0.00319671630859375,
-0.003009796142578125,
0.006134033203125,
-0.03863525390625,
-0.02978515625,
0.0277099609375,
-0.0286407470703125,
0.01385498046875,
-0.01122283935546875,
-0.0285186767578125,
-0.040802001953125,
-0.047210693359375,
0.055572509765625,
0.044769287109375,
-0.0372314453125,
0.026458740234375,
0.063720703125,
0.01549530029296875,
-0.042755126953125,
-0.05596923828125,
-0.0011539459228515625,
-0.028228759765625,
-0.0548095703125,
0.042022705078125,
-0.00019419193267822266,
-0.00472259521484375,
0.007965087890625,
0.0057373046875,
-0.025390625,
0.004055023193359375,
0.034149169921875,
0.0406494140625,
-0.0033740997314453125,
-0.0078125,
-0.019500732421875,
0.0280914306640625,
-0.0005936622619628906,
0.0157470703125,
0.014801025390625,
-0.005878448486328125,
-0.0316162109375,
-0.036529541015625,
0.031768798828125,
0.035675048828125,
-0.02093505859375,
0.037384033203125,
0.032501220703125,
-0.0230255126953125,
0.0011472702026367188,
-0.040679931640625,
-0.0033550262451171875,
-0.032989501953125,
0.029754638671875,
-0.00399017333984375,
-0.06475830078125,
0.058319091796875,
0.004070281982421875,
-0.00824737548828125,
0.045440673828125,
0.025970458984375,
-0.00240325927734375,
0.0640869140625,
0.08099365234375,
0.0009946823120117188,
0.055816650390625,
-0.049774169921875,
0.0028228759765625,
-0.074951171875,
-0.0228271484375,
-0.0230255126953125,
-0.0179443359375,
-0.035430908203125,
-0.03466796875,
0.049072265625,
0.012603759765625,
-0.00850677490234375,
0.0330810546875,
-0.04986572265625,
0.03204345703125,
0.044708251953125,
0.035614013671875,
0.00014853477478027344,
-0.0080108642578125,
0.003742218017578125,
-0.0184173583984375,
-0.044708251953125,
-0.03900146484375,
0.0909423828125,
0.052703857421875,
0.0548095703125,
-0.00878143310546875,
0.019561767578125,
0.037353515625,
-0.0003676414489746094,
-0.061279296875,
0.040283203125,
-0.0293426513671875,
-0.0614013671875,
-0.0110321044921875,
-0.008270263671875,
-0.0638427734375,
0.005382537841796875,
-0.0015630722045898438,
-0.06011962890625,
0.043609619140625,
0.004924774169921875,
-0.025848388671875,
0.0482177734375,
-0.0411376953125,
0.072265625,
-0.01580810546875,
-0.0242156982421875,
0.0002218484878540039,
-0.05657958984375,
0.051788330078125,
-0.0027103424072265625,
0.004955291748046875,
-0.0185546875,
0.0063934326171875,
0.08172607421875,
-0.044769287109375,
0.07135009765625,
-0.013763427734375,
0.033782958984375,
0.060638427734375,
-0.01375579833984375,
-0.004619598388671875,
-0.01508331298828125,
0.0176239013671875,
0.04754638671875,
0.02032470703125,
-0.01021575927734375,
-0.034088134765625,
0.00717926025390625,
-0.05853271484375,
-0.032012939453125,
-0.0306854248046875,
-0.036895751953125,
0.0123291015625,
0.01114654541015625,
0.034149169921875,
0.0562744140625,
-0.009368896484375,
0.00905609130859375,
0.04779052734375,
-0.0372314453125,
0.025909423828125,
0.0176544189453125,
-0.0266571044921875,
-0.0435791015625,
0.06268310546875,
0.0185089111328125,
0.01561737060546875,
0.0033092498779296875,
0.00414276123046875,
-0.0177154541015625,
-0.040252685546875,
-0.035186767578125,
0.006649017333984375,
-0.05596923828125,
-0.030487060546875,
-0.036529541015625,
-0.026275634765625,
-0.036468505859375,
0.0006318092346191406,
-0.02899169921875,
-0.0233306884765625,
-0.038818359375,
0.0166015625,
0.0211029052734375,
0.048858642578125,
-0.01128387451171875,
0.0265045166015625,
-0.04638671875,
0.0256500244140625,
0.023834228515625,
0.0401611328125,
-0.0027923583984375,
-0.047210693359375,
-0.0127105712890625,
-0.0009326934814453125,
-0.06793212890625,
-0.059356689453125,
0.031402587890625,
0.0244598388671875,
0.046051025390625,
0.020721435546875,
0.0198211669921875,
0.05084228515625,
-0.033843994140625,
0.07855224609375,
0.01303863525390625,
-0.074462890625,
0.0482177734375,
-0.030609130859375,
0.01145172119140625,
0.060211181640625,
0.036865234375,
-0.0195770263671875,
-0.01352691650390625,
-0.041046142578125,
-0.06549072265625,
0.06011962890625,
0.0147857666015625,
0.0096435546875,
0.0006208419799804688,
0.0302581787109375,
0.0021762847900390625,
0.0119171142578125,
-0.048553466796875,
-0.0147705078125,
-0.03338623046875,
0.00223541259765625,
0.0252838134765625,
-0.0284576416015625,
-0.01123046875,
-0.0247039794921875,
0.0244293212890625,
-0.0057373046875,
0.047515869140625,
0.042022705078125,
-0.01439666748046875,
0.00251007080078125,
-0.00794219970703125,
0.04559326171875,
0.0498046875,
-0.0318603515625,
-0.0150299072265625,
0.017547607421875,
-0.06524658203125,
0.00493621826171875,
-0.013580322265625,
-0.038543701171875,
-0.0006351470947265625,
0.019317626953125,
0.06268310546875,
0.013702392578125,
-0.0611572265625,
0.07916259765625,
-0.0021820068359375,
-0.036895751953125,
-0.0229949951171875,
0.0003552436828613281,
-0.050018310546875,
0.0128021240234375,
0.0264434814453125,
0.01788330078125,
0.030792236328125,
-0.04534912109375,
0.03729248046875,
0.034912109375,
-0.0286865234375,
-0.0323486328125,
0.059234619140625,
0.014495849609375,
-0.0164337158203125,
0.03668212890625,
-0.01433563232421875,
-0.06878662109375,
0.059112548828125,
0.02593994140625,
0.059844970703125,
-0.0022525787353515625,
0.0129547119140625,
0.037139892578125,
0.020172119140625,
-0.02337646484375,
0.0008707046508789062,
0.00501251220703125,
-0.04425048828125,
-0.01788330078125,
-0.028076171875,
-0.05169677734375,
0.0096588134765625,
-0.0714111328125,
0.033447265625,
-0.046905517578125,
-0.037567138671875,
-0.014373779296875,
-0.023406982421875,
-0.05078125,
0.006221771240234375,
0.003910064697265625,
0.09259033203125,
-0.06353759765625,
0.03387451171875,
0.035003662109375,
-0.048919677734375,
-0.054443359375,
-0.004764556884765625,
0.002979278564453125,
-0.04840087890625,
0.0521240234375,
0.03753662109375,
-0.0015869140625,
-0.03521728515625,
-0.07330322265625,
-0.0679931640625,
0.08099365234375,
0.0361328125,
-0.0197296142578125,
-0.00823211669921875,
-0.0008826255798339844,
0.0273590087890625,
-0.0280303955078125,
0.0222930908203125,
0.0245513916015625,
-0.0002460479736328125,
0.02716064453125,
-0.09417724609375,
-0.00984954833984375,
-0.008544921875,
0.0185089111328125,
0.00986480712890625,
-0.0623779296875,
0.0745849609375,
-0.0186920166015625,
-0.03729248046875,
0.007160186767578125,
0.0312347412109375,
-0.007526397705078125,
0.033966064453125,
0.03778076171875,
0.052886962890625,
0.0279083251953125,
0.001918792724609375,
0.08441162109375,
-0.0012187957763671875,
0.034393310546875,
0.072509765625,
-0.0151519775390625,
0.06817626953125,
0.0251617431640625,
-0.032562255859375,
0.0275115966796875,
0.03314208984375,
-0.047698974609375,
0.0589599609375,
-0.001720428466796875,
0.005970001220703125,
0.0019664764404296875,
-0.03948974609375,
-0.0167083740234375,
0.055084228515625,
-0.000012278556823730469,
-0.0299835205078125,
-0.01125335693359375,
0.0333251953125,
-0.0142364501953125,
-0.00104522705078125,
-0.03192138671875,
0.033477783203125,
-0.0086822509765625,
-0.0238189697265625,
0.0250396728515625,
0.0037708282470703125,
0.07012939453125,
-0.032806396484375,
-0.00975799560546875,
0.0079803466796875,
0.0126495361328125,
-0.007778167724609375,
-0.0706787109375,
0.053466796875,
0.01125335693359375,
-0.01119232177734375,
-0.001560211181640625,
0.05816650390625,
-0.0015192031860351562,
-0.034576416015625,
0.016357421875,
0.0013227462768554688,
0.0277862548828125,
-0.01175689697265625,
-0.045379638671875,
0.0170135498046875,
0.00263214111328125,
0.00861358642578125,
0.032867431640625,
0.0011281967163085938,
-0.01474761962890625,
0.05157470703125,
0.028472900390625,
-0.006305694580078125,
0.00150299072265625,
-0.021026611328125,
0.08160400390625,
-0.04266357421875,
-0.0297088623046875,
-0.055206298828125,
0.031036376953125,
-0.002346038818359375,
-0.0272216796875,
0.052642822265625,
0.044769287109375,
0.08251953125,
-0.0034198760986328125,
0.043182373046875,
-0.0224456787109375,
0.04290771484375,
-0.0255279541015625,
0.038330078125,
-0.03875732421875,
-0.005126953125,
-0.033050537109375,
-0.049774169921875,
-0.018157958984375,
0.04156494140625,
-0.0237579345703125,
-0.0032062530517578125,
0.04034423828125,
0.0484619140625,
-0.0129547119140625,
-0.0022373199462890625,
0.018951416015625,
-0.03192138671875,
0.0204925537109375,
0.038543701171875,
0.051177978515625,
-0.05401611328125,
0.051300048828125,
-0.05194091796875,
-0.023468017578125,
-0.01520538330078125,
-0.064453125,
-0.08477783203125,
-0.03533935546875,
-0.0357666015625,
-0.0100860595703125,
-0.0002455711364746094,
0.047943115234375,
0.0684814453125,
-0.053558349609375,
-0.012054443359375,
0.0253448486328125,
-0.0059967041015625,
0.001983642578125,
-0.01947021484375,
0.0228271484375,
0.01116180419921875,
-0.042022705078125,
-0.01099395751953125,
0.01035308837890625,
0.027923583984375,
-0.017669677734375,
0.01496124267578125,
-0.0132598876953125,
-0.00955963134765625,
0.0321044921875,
0.038787841796875,
-0.05169677734375,
-0.037017822265625,
0.01238250732421875,
0.00920867919921875,
0.0235443115234375,
0.055816650390625,
-0.054168701171875,
0.032745361328125,
0.019012451171875,
0.04473876953125,
0.043975830078125,
0.01788330078125,
0.028228759765625,
-0.039276123046875,
0.01561737060546875,
0.0189971923828125,
0.0265350341796875,
0.0238189697265625,
-0.02972412109375,
0.050262451171875,
0.039581298828125,
-0.049041748046875,
-0.07806396484375,
0.004924774169921875,
-0.082275390625,
-0.006771087646484375,
0.06158447265625,
-0.034454345703125,
-0.044403076171875,
0.006717681884765625,
-0.0211334228515625,
0.01995849609375,
-0.0304107666015625,
0.05950927734375,
0.031585693359375,
-0.007564544677734375,
-0.03472900390625,
-0.05078125,
0.0158843994140625,
0.00408935546875,
-0.046539306640625,
-0.02716064453125,
0.025054931640625,
0.040252685546875,
0.0242767333984375,
0.0316162109375,
-0.031341552734375,
0.02923583984375,
-0.00196075439453125,
0.0207977294921875,
-0.0235748291015625,
-0.0256805419921875,
-0.043670654296875,
0.02392578125,
-0.019378662109375,
-0.04779052734375
]
] |
openai/whisper-medium.en | 2023-09-08T12:57:13.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"whisper",
"automatic-speech-recognition",
"audio",
"hf-asr-leaderboard",
"en",
"arxiv:2212.04356",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | openai | null | null | openai/whisper-medium.en | 28 | 26,134 | transformers | 2022-09-26T07:02:02 | ---
language:
- en
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
widget:
- example_title: Librispeech sample 1
src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
model-index:
- name: whisper-medium.en
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (clean)
type: librispeech_asr
config: clean
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 4.120542365210176
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (other)
type: librispeech_asr
config: other
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 7.431640255663553
pipeline_tag: automatic-speech-recognition
license: apache-2.0
---
# Whisper
Whisper is a pre-trained model for automatic speech recognition (ASR) and speech translation. Trained on 680k hours
of labelled data, Whisper models demonstrate a strong ability to generalise to many datasets and domains **without** the need
for fine-tuning.
Whisper was proposed in the paper [Robust Speech Recognition via Large-Scale Weak Supervision](https://arxiv.org/abs/2212.04356)
by Alec Radford et al. from OpenAI. The original code repository can be found [here](https://github.com/openai/whisper).
**Disclaimer**: Content for this model card has partly been written by the Hugging Face team, and parts of it were
copied and pasted from the original model card.
## Model details
Whisper is a Transformer-based encoder-decoder model, also referred to as a _sequence-to-sequence_ model.
It was trained on 680k hours of labelled speech data annotated using large-scale weak supervision.
The models were trained on either English-only data or multilingual data. The English-only models were trained
on the task of speech recognition. The multilingual models were trained on both speech recognition and speech
translation. For speech recognition, the model predicts transcriptions in the *same* language as the audio.
For speech translation, the model predicts transcriptions in a *different* language from the audio.
Whisper checkpoints come in five configurations of varying model sizes.
The smallest four are trained on either English-only or multilingual data.
The largest checkpoints are multilingual only. All ten of the pre-trained checkpoints
are available on the [Hugging Face Hub](https://huggingface.co/models?search=openai/whisper). The
checkpoints are summarised in the following table with links to the models on the Hub:
| Size | Parameters | English-only | Multilingual |
|----------|------------|------------------------------------------------------|-----------------------------------------------------|
| tiny | 39 M | [✓](https://huggingface.co/openai/whisper-tiny.en) | [✓](https://huggingface.co/openai/whisper-tiny) |
| base | 74 M | [✓](https://huggingface.co/openai/whisper-base.en) | [✓](https://huggingface.co/openai/whisper-base) |
| small | 244 M | [✓](https://huggingface.co/openai/whisper-small.en) | [✓](https://huggingface.co/openai/whisper-small) |
| medium | 769 M | [✓](https://huggingface.co/openai/whisper-medium.en) | [✓](https://huggingface.co/openai/whisper-medium) |
| large | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large) |
| large-v2 | 1550 M | x | [✓](https://huggingface.co/openai/whisper-large-v2) |
# Usage
This checkpoint is an *English-only* model, meaning it can be used for English speech recognition. Multilingual speech
recognition or speech translation is possible through use of a multilingual checkpoint.
To transcribe audio samples, the model has to be used alongside a [`WhisperProcessor`](https://huggingface.co/docs/transformers/model_doc/whisper#transformers.WhisperProcessor).
The `WhisperProcessor` is used to:
1. Pre-process the audio inputs (converting them to log-Mel spectrograms for the model)
2. Post-process the model outputs (converting them from tokens to text)
## Transcription
```python
>>> from transformers import WhisperProcessor, WhisperForConditionalGeneration
>>> from datasets import load_dataset
>>> # load model and processor
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-medium.en")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-medium.en")
>>> # load dummy dataset and read audio files
>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]
>>> input_features = processor(sample["array"], sampling_rate=sample["sampling_rate"], return_tensors="pt").input_features
>>> # generate token ids
>>> predicted_ids = model.generate(input_features)
>>> # decode token ids to text
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=False)
['<|startoftranscript|><|notimestamps|> Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel.<|endoftext|>']
>>> transcription = processor.batch_decode(predicted_ids, skip_special_tokens=True)
[' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.']
```
The context tokens can be removed from the start of the transcription by setting `skip_special_tokens=True`.
## Evaluation
This code snippet shows how to evaluate Whisper medium.en on [LibriSpeech test-clean](https://huggingface.co/datasets/librispeech_asr):
```python
>>> from datasets import load_dataset
>>> from transformers import WhisperForConditionalGeneration, WhisperProcessor
>>> import torch
>>> from evaluate import load
>>> librispeech_test_clean = load_dataset("librispeech_asr", "clean", split="test")
>>> processor = WhisperProcessor.from_pretrained("openai/whisper-medium.en")
>>> model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-medium.en").to("cuda")
>>> def map_to_pred(batch):
>>> audio = batch["audio"]
>>> input_features = processor(audio["array"], sampling_rate=audio["sampling_rate"], return_tensors="pt").input_features
>>> batch["reference"] = processor.tokenizer._normalize(batch['text'])
>>>
>>> with torch.no_grad():
>>> predicted_ids = model.generate(input_features.to("cuda"))[0]
>>> transcription = processor.decode(predicted_ids)
>>> batch["prediction"] = processor.tokenizer._normalize(transcription)
>>> return batch
>>> result = librispeech_test_clean.map(map_to_pred)
>>> wer = load("wer")
>>> print(100 * wer.compute(references=result["reference"], predictions=result["prediction"]))
3.0154449620004904
```
## Long-Form Transcription
The Whisper model is intrinsically designed to work on audio samples of up to 30s in duration. However, by using a chunking
algorithm, it can be used to transcribe audio samples of arbitrary length. This is possible via the Transformers
[`pipeline`](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.AutomaticSpeechRecognitionPipeline)
method. Chunking is enabled by setting `chunk_length_s=30` when instantiating the pipeline. With chunking enabled, the pipeline
can be run with batched inference. It can also be extended to predict sequence level timestamps by passing `return_timestamps=True`:
```python
>>> import torch
>>> from transformers import pipeline
>>> from datasets import load_dataset
>>> device = "cuda:0" if torch.cuda.is_available() else "cpu"
>>> pipe = pipeline(
>>> "automatic-speech-recognition",
>>> model="openai/whisper-medium.en",
>>> chunk_length_s=30,
>>> device=device,
>>> )
>>> ds = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")
>>> sample = ds[0]["audio"]
>>> prediction = pipe(sample.copy(), batch_size=8)["text"]
" Mr. Quilter is the apostle of the middle classes, and we are glad to welcome his gospel."
>>> # we can also return timestamps for the predictions
>>> prediction = pipe(sample.copy(), batch_size=8, return_timestamps=True)["chunks"]
[{'text': ' Mr. Quilter is the apostle of the middle classes and we are glad to welcome his gospel.',
'timestamp': (0.0, 5.44)}]
```
Refer to the blog post [ASR Chunking](https://huggingface.co/blog/asr-chunking) for more details on the chunking algorithm.
## Fine-Tuning
The pre-trained Whisper model demonstrates a strong ability to generalise to different datasets and domains. However,
its predictive capabilities can be improved further for certain languages and tasks through *fine-tuning*. The blog
post [Fine-Tune Whisper with 🤗 Transformers](https://huggingface.co/blog/fine-tune-whisper) provides a step-by-step
guide to fine-tuning the Whisper model with as little as 5 hours of labelled data.
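For orientation, a skeleton of such a fine-tuning run is sketched below. This is a hedged outline rather than the guide's exact recipe: the hyperparameters are illustrative assumptions, and `train_dataset` / `data_collator` are placeholders that the blog post shows how to build.
```python
from transformers import (WhisperForConditionalGeneration, WhisperProcessor,
                          Seq2SeqTrainer, Seq2SeqTrainingArguments)

processor = WhisperProcessor.from_pretrained("openai/whisper-medium.en")
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-medium.en")

# placeholders: the guide maps raw audio to log-Mel input features and tokenised
# labels, and defines a collator that pads features and labels per batch
train_dataset = ...
data_collator = ...

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-medium.en-finetuned",  # hypothetical output path
    per_device_train_batch_size=8,             # illustrative values, not the guide's
    learning_rate=1e-5,
    max_steps=4000,
    fp16=True,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    data_collator=data_collator,
    tokenizer=processor.feature_extractor,
)
trainer.train()
```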
### Evaluated Use
The primary intended users of these models are AI researchers studying robustness, generalization, capabilities, biases, and constraints of the current model. However, Whisper is also potentially quite useful as an ASR solution for developers, especially for English speech recognition. We recognize that once models are released, it is impossible to restrict access to only “intended” uses or to draw reasonable guidelines around what is or is not research.
The models are primarily trained and evaluated on ASR and speech translation to English tasks. They show strong ASR results in ~10 languages. They may exhibit additional capabilities, particularly if fine-tuned on certain tasks like voice activity detection, speaker classification, or speaker diarization, but have not been robustly evaluated in these areas. We strongly recommend that users perform robust evaluations of the models in a particular context and domain before deploying them.
In particular, we caution against using Whisper models to transcribe recordings of individuals taken without their consent, or purporting to use these models for any kind of subjective classification. We recommend against use in high-risk domains like decision-making contexts, where flaws in accuracy can lead to pronounced flaws in outcomes. The models are intended to transcribe and translate speech; using them for classification is not only unevaluated but also inappropriate, particularly for inferring human attributes.
## Training Data
The models are trained on 680,000 hours of audio and the corresponding transcripts collected from the internet. 65% of this data (or 438,000 hours) represents English-language audio and matched English transcripts, roughly 18% (or 126,000 hours) represents non-English audio and English transcripts, while the final 17% (or 117,000 hours) represents non-English audio and the corresponding transcript. This non-English data represents 98 different languages.
As discussed in [the accompanying paper](https://cdn.openai.com/papers/whisper.pdf), we see that performance on transcription in a given language is directly correlated with the amount of training data we employ in that language.
## Performance and Limitations
Our studies show that, over many existing ASR systems, the models exhibit improved robustness to accents, background noise, technical language, as well as zero shot translation from multiple languages into English; and that accuracy on speech recognition and translation is near the state-of-the-art level.
However, because the models are trained in a weakly supervised manner using large-scale noisy data, the predictions may include texts that are not actually spoken in the audio input (i.e. hallucination). We hypothesize that this happens because, given their general knowledge of language, the models combine trying to predict the next word in audio with trying to transcribe the audio itself.
Our models perform unevenly across languages, and we observe lower accuracy on low-resource and/or low-discoverability languages or languages where we have less training data. The models also exhibit disparate performance on different accents and dialects of particular languages, which may include higher word error rate across speakers of different genders, races, ages, or other demographic criteria. Our full evaluation results are presented in [the paper accompanying this release](https://cdn.openai.com/papers/whisper.pdf).
In addition, the sequence-to-sequence architecture of the model makes it prone to generating repetitive texts, which can be mitigated to some degree by beam search and temperature scheduling but not perfectly. Further analysis of these limitations is provided in [the paper](https://cdn.openai.com/papers/whisper.pdf). It is likely that this behavior and these hallucinations may be worse in lower-resource and/or lower-discoverability languages.
## Broader Implications
We anticipate that Whisper models’ transcription capabilities may be used for improving accessibility tools. While Whisper models cannot be used for real-time transcription out of the box, their speed and size suggest that others may be able to build applications on top of them that allow for near-real-time speech recognition and translation. The real value of beneficial applications built on top of Whisper models suggests that the disparate performance of these models may have real economic implications.
There are also potential dual use concerns that come with releasing Whisper. While we hope the technology will be used primarily for beneficial purposes, making ASR technology more accessible could enable more actors to build capable surveillance technologies or scale up existing surveillance efforts, as the speed and accuracy allow for affordable automatic transcription and translation of large volumes of audio communication. Moreover, these models may have some capabilities to recognize specific individuals out of the box, which in turn presents safety concerns related both to dual use and disparate performance. In practice, we expect that the cost of transcription is not the limiting factor of scaling up surveillance projects.
### BibTeX entry and citation info
```bibtex
@misc{radford2022whisper,
doi = {10.48550/ARXIV.2212.04356},
url = {https://arxiv.org/abs/2212.04356},
author = {Radford, Alec and Kim, Jong Wook and Xu, Tao and Brockman, Greg and McLeavey, Christine and Sutskever, Ilya},
title = {Robust Speech Recognition via Large-Scale Weak Supervision},
publisher = {arXiv},
year = {2022},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
| 14,846 | [
[
-0.0207366943359375,
-0.0467529296875,
0.00691986083984375,
0.034454345703125,
-0.00489044189453125,
0.0005168914794921875,
-0.027984619140625,
-0.045989990234375,
0.0178375244140625,
0.0247039794921875,
-0.061279296875,
-0.03985595703125,
-0.053985595703125,
-0.01171112060546875,
-0.043975830078125,
0.07550048828125,
0.01107025146484375,
-0.000885009765625,
0.0168304443359375,
-0.00530242919921875,
-0.0240478515625,
-0.0196075439453125,
-0.053741455078125,
-0.01447296142578125,
0.01477813720703125,
0.01055908203125,
0.02850341796875,
0.03997802734375,
0.011077880859375,
0.03167724609375,
-0.031341552734375,
-0.005153656005859375,
-0.027069091796875,
-0.006671905517578125,
0.029266357421875,
-0.034332275390625,
-0.045440673828125,
0.0133209228515625,
0.058685302734375,
0.0350341796875,
-0.026947021484375,
0.032867431640625,
0.0179901123046875,
0.0237884521484375,
-0.0208892822265625,
0.0193023681640625,
-0.05029296875,
-0.0089874267578125,
-0.0201416015625,
-0.00007832050323486328,
-0.0251617431640625,
-0.0226898193359375,
0.042449951171875,
-0.0452880859375,
0.029388427734375,
0.0106964111328125,
0.07806396484375,
0.0179595947265625,
-0.003997802734375,
-0.03173828125,
-0.054901123046875,
0.08294677734375,
-0.066650390625,
0.039337158203125,
0.029876708984375,
0.0195770263671875,
0.00310516357421875,
-0.0697021484375,
-0.053985595703125,
-0.0009908676147460938,
-0.0017681121826171875,
0.0241851806640625,
-0.0256500244140625,
-0.0005183219909667969,
0.0184326171875,
0.03106689453125,
-0.035064697265625,
0.0013952255249023438,
-0.053802490234375,
-0.0499267578125,
0.047454833984375,
-0.00024509429931640625,
0.0214996337890625,
-0.0200347900390625,
-0.0189971923828125,
-0.0308685302734375,
-0.0193328857421875,
0.033233642578125,
0.02862548828125,
0.032989501953125,
-0.053375244140625,
0.0266571044921875,
-0.004772186279296875,
0.0458984375,
0.01422882080078125,
-0.045928955078125,
0.046173095703125,
-0.01117706298828125,
-0.0148468017578125,
0.0274810791015625,
0.07830810546875,
0.0172882080078125,
0.0063934326171875,
0.0070037841796875,
-0.01129913330078125,
0.0133819580078125,
-0.007183074951171875,
-0.06439208984375,
-0.003574371337890625,
0.03692626953125,
-0.040283203125,
-0.0224761962890625,
-0.0176544189453125,
-0.04736328125,
0.0095977783203125,
-0.0120086669921875,
0.0518798828125,
-0.043182373046875,
-0.025054931640625,
0.0180816650390625,
-0.02978515625,
0.0230712890625,
0.0019664764404296875,
-0.060638427734375,
0.0267333984375,
0.032196044921875,
0.06549072265625,
0.008026123046875,
-0.0457763671875,
-0.035614013671875,
0.00791168212890625,
0.0092315673828125,
0.0345458984375,
-0.0197601318359375,
-0.042694091796875,
-0.0161895751953125,
0.0069427490234375,
-0.024566650390625,
-0.043212890625,
0.05560302734375,
-0.01029205322265625,
0.036590576171875,
0.00028061866760253906,
-0.039794921875,
-0.015777587890625,
-0.0148773193359375,
-0.031280517578125,
0.06951904296875,
0.005947113037109375,
-0.0533447265625,
0.01265716552734375,
-0.038665771484375,
-0.035614013671875,
-0.0203094482421875,
0.01354217529296875,
-0.046783447265625,
-0.0053253173828125,
0.032073974609375,
0.0307464599609375,
-0.0142974853515625,
0.00563812255859375,
-0.003711700439453125,
-0.030914306640625,
0.0262298583984375,
-0.032379150390625,
0.07574462890625,
0.01332855224609375,
-0.033538818359375,
0.016021728515625,
-0.05792236328125,
0.00977325439453125,
0.0036029815673828125,
-0.01258087158203125,
0.01360321044921875,
-0.0028171539306640625,
0.02392578125,
0.003448486328125,
0.01143646240234375,
-0.056884765625,
-0.00942230224609375,
-0.047637939453125,
0.055419921875,
0.047943115234375,
-0.006317138671875,
0.025726318359375,
-0.044525146484375,
0.0225982666015625,
0.00537872314453125,
0.03289794921875,
-0.0127410888671875,
-0.045135498046875,
-0.0732421875,
-0.0311126708984375,
0.03338623046875,
0.053070068359375,
-0.025543212890625,
0.043487548828125,
-0.0149383544921875,
-0.0572509765625,
-0.096923828125,
-0.0114898681640625,
0.0426025390625,
0.044158935546875,
0.053009033203125,
-0.01302337646484375,
-0.056640625,
-0.053070068359375,
-0.01123809814453125,
-0.0227813720703125,
-0.01490020751953125,
0.0288238525390625,
0.02276611328125,
-0.02679443359375,
0.05010986328125,
-0.037139892578125,
-0.0394287109375,
-0.0188751220703125,
0.004131317138671875,
0.036041259765625,
0.048309326171875,
0.0194244384765625,
-0.053131103515625,
-0.032073974609375,
-0.01519012451171875,
-0.04449462890625,
-0.007556915283203125,
-0.0053253173828125,
0.005039215087890625,
0.0019626617431640625,
0.0277252197265625,
-0.052886962890625,
0.030609130859375,
0.05035400390625,
-0.01381683349609375,
0.054168701171875,
0.01169586181640625,
-0.0017957687377929688,
-0.0867919921875,
-0.003665924072265625,
-0.00968170166015625,
-0.007610321044921875,
-0.05010986328125,
-0.01947021484375,
-0.00820159912109375,
-0.00701904296875,
-0.039093017578125,
0.04571533203125,
-0.025482177734375,
0.0029239654541015625,
-0.004283905029296875,
0.00824737548828125,
-0.0030879974365234375,
0.038604736328125,
0.01442718505859375,
0.049346923828125,
0.0638427734375,
-0.041839599609375,
0.0166168212890625,
0.040740966796875,
-0.0250091552734375,
0.02130126953125,
-0.07562255859375,
0.0125274658203125,
0.00978851318359375,
0.0159149169921875,
-0.052001953125,
-0.00788116455078125,
0.0015106201171875,
-0.07342529296875,
0.032989501953125,
-0.0226898193359375,
-0.025726318359375,
-0.038482666015625,
-0.01291656494140625,
0.00514984130859375,
0.06817626953125,
-0.03533935546875,
0.053955078125,
0.031524658203125,
-0.0155181884765625,
-0.042083740234375,
-0.041046142578125,
-0.0176849365234375,
-0.0186004638671875,
-0.058441162109375,
0.03741455078125,
-0.00965118408203125,
-0.00033974647521972656,
-0.0137939453125,
-0.0089874267578125,
0.0084381103515625,
-0.0178070068359375,
0.035980224609375,
0.036956787109375,
-0.00917816162109375,
-0.0200653076171875,
0.0152130126953125,
-0.018707275390625,
-0.00008571147918701172,
-0.0169525146484375,
0.05218505859375,
-0.025726318359375,
-0.004856109619140625,
-0.059417724609375,
0.015380859375,
0.037445068359375,
-0.0271453857421875,
0.041717529296875,
0.066162109375,
-0.0198822021484375,
-0.016082763671875,
-0.053802490234375,
-0.0181121826171875,
-0.043487548828125,
0.01161956787109375,
-0.0259857177734375,
-0.05792236328125,
0.053131103515625,
0.01236724853515625,
0.00760650634765625,
0.046966552734375,
0.03729248046875,
-0.02056884765625,
0.06915283203125,
0.031280517578125,
-0.0195465087890625,
0.0224761962890625,
-0.058441162109375,
-0.01007843017578125,
-0.07904052734375,
-0.0279083251953125,
-0.04229736328125,
-0.0214080810546875,
-0.037261962890625,
-0.0267486572265625,
0.0396728515625,
0.00852203369140625,
-0.00890350341796875,
0.0335693359375,
-0.05718994140625,
0.0011110305786132812,
0.048370361328125,
0.004245758056640625,
0.00835418701171875,
-0.00418853759765625,
-0.0092315673828125,
-0.00579833984375,
-0.027587890625,
-0.0250091552734375,
0.07305908203125,
0.03955078125,
0.040985107421875,
-0.007061004638671875,
0.05517578125,
-0.004100799560546875,
0.003879547119140625,
-0.056243896484375,
0.03656005859375,
-0.00872802734375,
-0.044769287109375,
-0.029022216796875,
-0.021697998046875,
-0.061737060546875,
0.01108551025390625,
-0.0149383544921875,
-0.050384521484375,
0.0088958740234375,
-0.006664276123046875,
-0.02392578125,
0.0197906494140625,
-0.053680419921875,
0.043548583984375,
0.00968170166015625,
0.00832366943359375,
-0.004772186279296875,
-0.057708740234375,
0.007965087890625,
0.007442474365234375,
0.01050567626953125,
-0.01175689697265625,
0.0163421630859375,
0.08221435546875,
-0.033233642578125,
0.06683349609375,
-0.02606201171875,
0.00864410400390625,
0.039093017578125,
-0.01517486572265625,
0.0240631103515625,
-0.0170135498046875,
-0.01262664794921875,
0.03204345703125,
0.0225982666015625,
-0.0225067138671875,
-0.0231170654296875,
0.036651611328125,
-0.0814208984375,
-0.0239410400390625,
-0.019561767578125,
-0.0297088623046875,
-0.01229095458984375,
0.01483154296875,
0.06109619140625,
0.0513916015625,
-0.005519866943359375,
0.000789642333984375,
0.036285400390625,
-0.0186309814453125,
0.040802001953125,
0.049072265625,
-0.0170745849609375,
-0.035675048828125,
0.07086181640625,
0.018463134765625,
0.0195770263671875,
0.01227569580078125,
0.0322265625,
-0.032745361328125,
-0.050384521484375,
-0.04083251953125,
0.024017333984375,
-0.0292816162109375,
-0.01242828369140625,
-0.06622314453125,
-0.040740966796875,
-0.045379638671875,
-0.00019741058349609375,
-0.035186767578125,
-0.0210113525390625,
-0.0305938720703125,
0.006561279296875,
0.044952392578125,
0.0307464599609375,
0.0013799667358398438,
0.04144287109375,
-0.06903076171875,
0.030426025390625,
0.0241851806640625,
0.00942230224609375,
0.0033321380615234375,
-0.07342529296875,
-0.006855010986328125,
0.01406097412109375,
-0.0261993408203125,
-0.046783447265625,
0.036346435546875,
0.0282440185546875,
0.031494140625,
0.0189971923828125,
-0.00012123584747314453,
0.0711669921875,
-0.0517578125,
0.05853271484375,
0.0159454345703125,
-0.0919189453125,
0.054840087890625,
-0.0276641845703125,
0.0182037353515625,
0.032440185546875,
0.0271759033203125,
-0.045989990234375,
-0.039764404296875,
-0.052215576171875,
-0.04827880859375,
0.051300048828125,
0.0225677490234375,
0.00707244873046875,
0.0205230712890625,
0.0156097412109375,
0.00876617431640625,
0.00960540771484375,
-0.035552978515625,
-0.03558349609375,
-0.0298004150390625,
-0.0182342529296875,
-0.00827789306640625,
-0.002838134765625,
0.00015342235565185547,
-0.041046142578125,
0.058258056640625,
0.0012788772583007812,
0.03436279296875,
0.0309600830078125,
0.004756927490234375,
-0.0017385482788085938,
0.0123443603515625,
0.0255279541015625,
0.019500732421875,
-0.0192413330078125,
-0.0279998779296875,
0.0255279541015625,
-0.06402587890625,
0.0018682479858398438,
0.026031494140625,
-0.0219268798828125,
0.00873565673828125,
0.0526123046875,
0.0826416015625,
0.01395416259765625,
-0.037872314453125,
0.0501708984375,
-0.00682830810546875,
-0.0175018310546875,
-0.047698974609375,
0.002414703369140625,
0.02392578125,
0.0225067138671875,
0.0274658203125,
0.00885772705078125,
0.013031005859375,
-0.037689208984375,
0.0112762451171875,
0.019927978515625,
-0.0394287109375,
-0.038604736328125,
0.06561279296875,
0.005191802978515625,
-0.0294952392578125,
0.055633544921875,
0.0024261474609375,
-0.047607421875,
0.03497314453125,
0.050018310546875,
0.072265625,
-0.038116455078125,
-0.001430511474609375,
0.034149169921875,
0.0164031982421875,
0.0021495819091796875,
0.036865234375,
-0.0032672882080078125,
-0.0540771484375,
-0.03485107421875,
-0.07861328125,
-0.0243377685546875,
0.0020294189453125,
-0.0726318359375,
0.0269012451171875,
-0.0236053466796875,
-0.0199737548828125,
0.0235443115234375,
0.00579071044921875,
-0.056396484375,
0.0152587890625,
0.0019550323486328125,
0.07574462890625,
-0.0521240234375,
0.07562255859375,
0.0120086669921875,
-0.0205535888671875,
-0.084716796875,
-0.00008779764175415039,
0.00286865234375,
-0.0738525390625,
0.023895263671875,
0.0244598388671875,
-0.015167236328125,
0.00983428955078125,
-0.03851318359375,
-0.053985595703125,
0.0819091796875,
0.01110076904296875,
-0.051483154296875,
-0.01433563232421875,
-0.0049285888671875,
0.03985595703125,
-0.01514434814453125,
0.0177764892578125,
0.054779052734375,
0.03375244140625,
0.011199951171875,
-0.108642578125,
-0.00968170166015625,
-0.0199737548828125,
-0.01971435546875,
0.00237274169921875,
-0.058990478515625,
0.06878662109375,
-0.03143310546875,
-0.0177001953125,
0.0220489501953125,
0.058990478515625,
0.0306396484375,
0.03094482421875,
0.048736572265625,
0.044952392578125,
0.056243896484375,
-0.01384735107421875,
0.07275390625,
-0.0135955810546875,
0.0175323486328125,
0.07135009765625,
-0.0060272216796875,
0.08306884765625,
0.015960693359375,
-0.037261962890625,
0.0513916015625,
0.0261993408203125,
-0.0034885406494140625,
0.03472900390625,
0.00006020069122314453,
-0.0267333984375,
0.0112762451171875,
-0.008453369140625,
-0.0390625,
0.059661865234375,
0.031829833984375,
-0.017059326171875,
0.031646728515625,
0.01055145263671875,
0.007534027099609375,
-0.00972747802734375,
-0.0171966552734375,
0.064208984375,
0.0140533447265625,
-0.0308685302734375,
0.06268310546875,
-0.005084991455078125,
0.0809326171875,
-0.06036376953125,
0.01390838623046875,
0.0151214599609375,
0.014678955078125,
-0.0168609619140625,
-0.0477294921875,
0.025146484375,
-0.0152740478515625,
-0.0171661376953125,
-0.0131683349609375,
0.042144775390625,
-0.049896240234375,
-0.04217529296875,
0.038421630859375,
0.0269775390625,
0.02386474609375,
-0.00925445556640625,
-0.06256103515625,
0.033172607421875,
0.0147247314453125,
-0.01349639892578125,
0.0106048583984375,
0.00971221923828125,
0.0258331298828125,
0.05224609375,
0.06378173828125,
0.033172607421875,
0.0159759521484375,
0.0103302001953125,
0.061004638671875,
-0.050567626953125,
-0.041656494140625,
-0.04803466796875,
0.038604736328125,
-0.0021953582763671875,
-0.030029296875,
0.06414794921875,
0.049530029296875,
0.053558349609375,
0.004596710205078125,
0.05377197265625,
-0.0001634359359741211,
0.07635498046875,
-0.0379638671875,
0.06646728515625,
-0.029327392578125,
0.004436492919921875,
-0.0295867919921875,
-0.05322265625,
0.01007080078125,
0.041717529296875,
-0.006359100341796875,
-0.0008254051208496094,
0.0281219482421875,
0.06597900390625,
0.000008463859558105469,
0.0211944580078125,
0.0025043487548828125,
0.035491943359375,
0.01751708984375,
0.036651611328125,
0.047027587890625,
-0.0604248046875,
0.048492431640625,
-0.043304443359375,
-0.018035888671875,
0.0108795166015625,
-0.034332275390625,
-0.0643310546875,
-0.06304931640625,
-0.0207672119140625,
-0.0452880859375,
-0.0198516845703125,
0.05328369140625,
0.06719970703125,
-0.060455322265625,
-0.030303955078125,
0.026092529296875,
-0.007091522216796875,
-0.025787353515625,
-0.018829345703125,
0.0438232421875,
0.003940582275390625,
-0.06927490234375,
0.048553466796875,
0.0025997161865234375,
0.024749755859375,
-0.0185394287109375,
-0.01514434814453125,
0.0053253173828125,
0.00006657838821411133,
0.038421630859375,
0.0167999267578125,
-0.060333251953125,
-0.01395416259765625,
0.0091094970703125,
0.01453399658203125,
-0.0001691579818725586,
0.028289794921875,
-0.05499267578125,
0.030548095703125,
0.0191497802734375,
0.007610321044921875,
0.07000732421875,
-0.022125244140625,
0.0212249755859375,
-0.050811767578125,
0.03216552734375,
0.0226593017578125,
0.026031494140625,
0.02777099609375,
-0.01166534423828125,
0.016510009765625,
0.0185699462890625,
-0.0467529296875,
-0.072998046875,
-0.005542755126953125,
-0.08868408203125,
-0.0027484893798828125,
0.074951171875,
0.0030345916748046875,
-0.0211944580078125,
-0.006000518798828125,
-0.024139404296875,
0.039581298828125,
-0.038299560546875,
0.033935546875,
0.03216552734375,
0.0026874542236328125,
-0.003498077392578125,
-0.04510498046875,
0.048431396484375,
0.0159759521484375,
-0.027923583984375,
-0.00717926025390625,
0.005176544189453125,
0.0455322265625,
0.0230712890625,
0.06341552734375,
-0.0233154296875,
0.01165008544921875,
0.01297760009765625,
0.01529693603515625,
-0.0016546249389648438,
-0.01187896728515625,
-0.0247650146484375,
-0.006275177001953125,
-0.01611328125,
-0.0302734375
]
] |
TheBloke/Falcon-180B-Chat-GPTQ | 2023-09-27T13:02:37.000Z | [
"transformers",
"safetensors",
"falcon",
"text-generation",
"en",
"de",
"es",
"fr",
"dataset:tiiuae/falcon-refinedweb",
"arxiv:1911.02150",
"arxiv:2005.14165",
"arxiv:2104.09864",
"arxiv:2205.14135",
"arxiv:2306.01116",
"license:unknown",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Falcon-180B-Chat-GPTQ | 65 | 26,063 | transformers | 2023-09-06T16:47:56 | ---
language:
- en
- de
- es
- fr
license: unknown
datasets:
- tiiuae/falcon-refinedweb
model_name: Falcon 180B Chat
inference: false
model_creator: Technology Innovation Institute
model_link: https://huggingface.co/tiiuae/falcon-180B-chat
model_type: falcon
quantized_by: TheBloke
base_model: tiiuae/falcon-180B-chat
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Falcon 180B Chat - GPTQ
- Model creator: [Technology Innovation Institute](https://huggingface.co/tiiuae)
- Original model: [Falcon 180B Chat](https://huggingface.co/tiiuae/falcon-180B-chat)
<!-- description start -->
## Description
This repo contains GPTQ model files for [Technology Innovation Institute's Falcon 180B Chat](https://huggingface.co/tiiuae/falcon-180B-chat).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
## Requirements
Transformers version 4.33.0 is required.
Due to the huge size of the model, the GPTQ files have been sharded. This breaks compatibility with AutoGPTQ, and therefore with any clients/libraries that use AutoGPTQ directly.
But they work great loaded directly through Transformers - and can be served using Text Generation Inference!
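As an illustration, loading through Transformers might look like the sketch below (assumed usage; the `revision` argument selects a quantisation branch from the table further down):
```python
# illustrative only: load the sharded GPTQ weights directly with Transformers
# (requires transformers>=4.33.0 plus optimum and auto-gptq, as noted later)
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/Falcon-180B-Chat-GPTQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # shards the ~94 GB of 4-bit weights across available GPUs
    revision="main",    # or e.g. "gptq-3bit-128g-actorder_True" from the branch table
)
```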
## Compatibility
Currently these GPTQs are known to work with:
- Transformers 4.33.0
- [Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) version 1.0.4
- Docker container: `ghcr.io/huggingface/text-generation-inference:latest`
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Falcon-180B-Chat-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Falcon-180B-Chat-GGUF)
* [Technology Innovation Institute's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/tiiuae/falcon-180B-chat)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Falcon
```
{system_message}
User: {prompt}
Assistant:
```
Example:
```
Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Girafatron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe
User: Hello, Girafatron!
Girafatron:
```
<!-- prompt-template end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
All recent GPTQ files are made with AutoGPTQ, and all files in non-main branches are made with AutoGPTQ. Files in the `main` branch which were uploaded before August 2023 were made with GPTQ-for-LLaMa.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The dataset used for quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| main | 4 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 94.25 GB | No | 4-bit, with Act Order and group size 128g. Higher quality than group_size=None, but also higher VRAM usage. |
| gptq-4bit--1g-actorder_True | 4 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 92.74 GB | No | 4-bit, with Act Order. No group size, to lower VRAM requirements. |
| gptq-3bit-128g-actorder_True | 3 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 73.81 GB | No | 3-bit, so much lower VRAM requirements but worse quality than 4-bit. With group size 128g and act-order. Higher quality than 3bit-128g-False. |
| gptq-3bit--1g-actorder_True | 3 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 2048 | 70.54 GB | No | 3-bit, so much lower VRAM requirements but worse quality than 4-bit. With no group size for lowest possible VRAM requirements. Lower quality than 3-bit 128g. |
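Each branch records these parameters in its `quantize_config.json`, so you can inspect a branch before committing to a ~70-95 GB download. A sketch using `huggingface_hub` (field names follow AutoGPTQ conventions):
```python
import json
from huggingface_hub import hf_hub_download

# Fetch only the small config file from the chosen branch (revision).
config_path = hf_hub_download(
    repo_id="TheBloke/Falcon-180B-Chat-GPTQ",
    filename="quantize_config.json",
    revision="gptq-3bit-128g-actorder_True",
)
with open(config_path) as f:
    cfg = json.load(f)

# AutoGPTQ-style fields: bits, group_size, desc_act (Act Order), damp_percent
print(cfg)
```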
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download from branches
- In text-generation-webui, you can add `:branch` to the end of the download name, e.g. `TheBloke/Falcon-180B-Chat-GPTQ:gptq-3bit--1g-actorder_True`
- With Git, you can clone a branch with:
```
git clone --single-branch --branch gptq-3bit--1g-actorder_True https://huggingface.co/TheBloke/Falcon-180B-Chat-GPTQ
```
- In Python Transformers code, the branch is the `revision` parameter; see the sample code below. For mirroring a full branch locally, see the sketch after this list.
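If you just want a local copy of one branch without going through Transformers, `huggingface_hub` can also be used (a sketch; `local_dir` is available in recent `huggingface_hub` releases):
```python
from huggingface_hub import snapshot_download

# Download every file from one quantisation branch into a local folder.
snapshot_download(
    repo_id="TheBloke/Falcon-180B-Chat-GPTQ",
    revision="gptq-3bit--1g-actorder_True",
    local_dir="Falcon-180B-Chat-GPTQ-3bit",
)
```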
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
**NOTE**: I have not tested this model with Text Generation Webui. It *should* work through the Transformers Loader. It will *not* work through the AutoGPTQ loader, due to the files being sharded.
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Falcon-180B-Chat-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/Falcon-180B-Chat-GPTQ:gptq-3bit-128g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. Choose Loader: Transformers
6. In the top left, click the refresh icon next to **Model**.
7. In the **Model** dropdown, choose the model you just downloaded: `Falcon-180B-Chat-GPTQ`
8. The model will automatically load, and is now ready for use!
9. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
10. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-python start -->
## How to use this GPTQ model from Python code
### Install the necessary packages
Requires: Transformers 4.33.0 or later, Optimum 1.12.0 or later, and AutoGPTQ.
```shell
pip3 install "transformers>=4.33.0" "optimum>=1.12.0"
pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/  # Use cu117 if on CUDA 11.7
```
### Transformers sample code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/Falcon-180B-Chat-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-3bit-128g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(
    model_name_or_path,
    device_map="auto",
    revision="main",
)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Tell me about AI"
prompt_template = f'''User: {prompt}
Assistant: '''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, do_sample=True, temperature=0.7, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=512,
    temperature=0.7,
    do_sample=True,
    top_p=0.95,
    repetition_penalty=1.15,
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The provided files have been tested with Transformers 4.33.0, and TGI 1.0.4.
Because they are sharded, they will not yet load via AutoGPTQ. It is hoped that support will be added soon.
Note: lack of support for AutoGPTQ doesn't affect your ability to load these models from Python code. It only affects third-party clients that might use AutoGPTQ.
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is confirmed working as of version 1.0.4.
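Once TGI is serving the model, it can be queried over its REST API. A minimal sketch (the `localhost:8080` address is a placeholder for wherever your server is listening):
```python
import requests

# TGI exposes a /generate endpoint taking a prompt plus generation parameters.
response = requests.post(
    "http://localhost:8080/generate",
    json={
        "inputs": "User: Tell me about AI\nAssistant:",
        "parameters": {"max_new_tokens": 512, "temperature": 0.7},
    },
)
print(response.json()["generated_text"])
```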
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute.
Thanks to the [chirper.ai](https://chirper.ai) team!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donators will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Russ Johnson, J, alfie_i, Alex, NimbleBox.ai, Chadd, Mandus, Nikolai Manek, Ken Nordquist, ya boyyy, Illia Dulskyi, Viktor Bowallius, vamX, Iucharbius, zynix, Magnesian, Clay Pascal, Pierre Kircher, Enrico Ros, Tony Hughes, Elle, Andrey, knownsqashed, Deep Realms, Jerry Meng, Lone Striker, Derek Yates, Pyrater, Mesiah Bishop, James Bentley, Femi Adebogun, Brandon Frisco, SuperWojo, Alps Aficionado, Michael Dempsey, Vitor Caleffi, Will Dee, Edmond Seymore, usrbinkat, LangChain4j, Kacper Wikieł, Luke Pendergrass, John Detwiler, theTransient, Nathan LeClaire, Tiffany J. Kim, biorpg, Eugene Pentland, Stanislav Ovsiannikov, Fred von Graf, terasurfer, Kalila, Dan Guido, Nitin Borwankar, 阿明, Ai Maven, John Villwock, Gabriel Puliatti, Stephen Murray, Asp the Wyvern, danny, Chris Smitley, ReadyPlayerEmma, S_X, Daniel P. Andersen, Olakabola, Jeffrey Morgan, Imad Khwaja, Caitlyn Gatomon, webtim, Alicia Loh, Trenton Dambrowitz, Swaroop Kallakuri, Erik Bjäreholt, Leonard Tan, Spiking Neurons AB, Luke @flexchar, Ajan Kanaga, Thomas Belote, Deo Leter, RoA, Willem Michiel, transmissions 11, subjectnull, Matthew Berman, Joseph William Delisle, David Ziegler, Michael Davis, Johann-Peter Hartmann, Talal Aujan, senxiiz, Artur Olbinski, Rainer Wilmers, Spencer Kim, Fen Risland, Cap'n Zoog, Rishabh Srivastava, Michael Levine, Geoffrey Montalvo, Sean Connelly, Alexandros Triantafyllidis, Pieter, Gabriel Tamborski, Sam, Subspace Studios, Junyu Yang, Pedro Madruga, Vadim, Cory Kujawski, K, Raven Klaugh, Randy H, Mano Prime, Sebastain Graf, Space Cruiser
Thank you to all my generous patrons and donators!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Technology Innovation Institute's Falcon 180B Chat
# 🚀 Falcon-180B-Chat
**Falcon-180B-Chat is a 180B parameters causal decoder-only model built by [TII](https://www.tii.ae) based on [Falcon-180B](https://huggingface.co/tiiuae/falcon-180B) and finetuned on a mixture of [Ultrachat](https://huggingface.co/datasets/stingning/ultrachat), [Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus) and [Airoboros](https://huggingface.co/datasets/jondurbin/airoboros-2.1). It is made available under the [Falcon-180B TII License](https://huggingface.co/tiiuae/falcon-180B-chat/blob/main/LICENSE.txt) and [Acceptable Use Policy](https://huggingface.co/tiiuae/falcon-180B-chat/blob/main/ACCEPTABLE_USE_POLICY.txt).**
*Paper coming soon* 😊
🤗 To get started with Falcon (inference, finetuning, quantization, etc.), we recommend reading [this great blogpost from HF](https://hf.co/blog/falcon-180b) or this [one](https://huggingface.co/blog/falcon) from the release of the 40B!
Note that since the 180B is larger than what can easily be handled with `transformers`+`accelerate`, we recommend using [Text Generation Inference](https://github.com/huggingface/text-generation-inference).
You will need **at least 400GB of memory** to swiftly run inference with Falcon-180B.
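That figure follows from simple arithmetic: 180B parameters at 2 bytes each in `bfloat16` is already 360GB for the weights alone, before activations, the KV cache, and framework overhead. A quick check:
```python
params = 180e9       # 180B parameters
bytes_per_param = 2  # bfloat16
print(f"Weights alone: {params * bytes_per_param / 1e9:.0f} GB")  # ~360 GB
```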
## Why use Falcon-180B-chat?
* ✨ **You are looking for a ready-to-use chat/instruct model based on [Falcon-180B](https://huggingface.co/tiiuae/falcon-180B).**
* **It is the best open-access model currently available, and one of the best models overall.** Falcon-180B outperforms [LLaMA-2](https://huggingface.co/meta-llama/Llama-2-70b-hf), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1), [MPT](https://huggingface.co/mosaicml/mpt-7b), etc. See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
* **It features an architecture optimized for inference**, with multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)).
* **It is made available under a permissive license allowing for commercial use**.
💬 **This is a Chat model, which may not be ideal for further finetuning.** If you are interested in building your own instruct/chat model, we recommend starting from [Falcon-180B](https://huggingface.co/tiiuae/falcon-180b).
💸 **Looking for a smaller, less expensive model?** [Falcon-7B-Instruct](https://huggingface.co/tiiuae/falcon-7b-instruct) and [Falcon-40B-Instruct](https://huggingface.co/tiiuae/falcon-40b-instruct) are Falcon-180B-Chat's little brothers!
💥 **Falcon LLMs require PyTorch 2.0 for use with `transformers`!**
# Model Card for Falcon-180B-Chat
## Model Details
### Model Description
- **Developed by:** [https://www.tii.ae](https://www.tii.ae);
- **Model type:** Causal decoder-only;
- **Language(s) (NLP):** English, German, Spanish, French (and limited capabilities in Italian, Portuguese, Polish, Dutch, Romanian, Czech, Swedish);
- **License:** [Falcon-180B TII License](https://huggingface.co/tiiuae/falcon-180B-chat/blob/main/LICENSE.txt) and [Acceptable Use Policy](https://huggingface.co/tiiuae/falcon-180B-chat/blob/main/ACCEPTABLE_USE_POLICY.txt).
### Model Source
- **Paper:** *coming soon*.
## Uses
See the [acceptable use policy](https://huggingface.co/tiiuae/falcon-180B-chat/blob/main/ACCEPTABLE_USE_POLICY.txt).
### Direct Use
Falcon-180B-Chat has been finetuned on a chat dataset.
### Out-of-Scope Use
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
## Bias, Risks, and Limitations
Falcon-180B-Chat is mostly trained on English data, and will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
### Recommendations
We recommend that users of Falcon-180B-Chat develop guardrails and take appropriate precautions for any production use.
## How to Get Started with the Model
To run inference with the model in full `bfloat16` precision you need approximately 8xA100 80GB or equivalent.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch
model = "tiiuae/falcon-180b-chat"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)
sequences = pipeline(
    "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Girafatron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
    max_length=200,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```
## Training Details
**Falcon-180B-Chat is based on [Falcon-180B](https://huggingface.co/tiiuae/falcon-180B).**
### Training Data
Falcon-180B-Chat is finetuned on a mixture of [Ultrachat](https://huggingface.co/datasets/stingning/ultrachat), [Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus) and [Airoboros](https://huggingface.co/datasets/jondurbin/airoboros-2.1).
The data was tokenized with the Falcon tokenizer.
## Evaluation
*Paper coming soon.*
See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) for early results.
## Technical Specifications
### Model Architecture and Objective
Falcon-180B-Chat is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token).
The architecture is broadly adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)), with the following differences:
* **Positional embeddings:** rotary ([Su et al., 2021](https://arxiv.org/abs/2104.09864));
* **Attention:** multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)) and FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135));
* **Decoder-block:** parallel attention/MLP with two layer norms.
For multiquery, we are using an internal variant with independent keys and values per tensor-parallel degree.
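As a schematic illustration only (not TII's implementation): the parallel layout means both branches read the residual stream through their own layer norm and their outputs are summed back in together, rather than being applied sequentially. The sketch below uses standard multi-head attention and omits rotary embeddings and the multiquery variant.
```python
import torch
import torch.nn as nn

class ParallelDecoderBlock(nn.Module):
    """Schematic sketch of a parallel attention/MLP block with two layer norms."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.ln_attn = nn.LayerNorm(d_model)  # norm feeding the attention branch
        self.ln_mlp = nn.LayerNorm(d_model)   # second norm feeding the MLP branch
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Both branches see the same residual stream and are summed back in,
        # rather than attention-then-MLP as in a vanilla GPT block.
        h = self.ln_attn(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        return x + attn_out + self.mlp(self.ln_mlp(x))

block = ParallelDecoderBlock(d_model=64, n_heads=4)
print(block(torch.randn(1, 8, 64)).shape)  # torch.Size([1, 8, 64])
```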
| **Hyperparameter** | **Value** | **Comment** |
|--------------------|-----------|----------------------------------------|
| Layers | 80 | |
| `d_model` | 14848 | |
| `head_dim` | 64 | Reduced to optimise for FlashAttention |
| Vocabulary | 65024 | |
| Sequence length | 2048 | |
### Compute Infrastructure
#### Hardware
Falcon-180B-Chat was trained on AWS SageMaker, on up to 4,096 A100 40GB GPUs in P4d instances.
#### Software
Falcon-180B-Chat was trained on a custom distributed training codebase, Gigatron. It uses a 3D parallelism approach combined with ZeRO and high-performance Triton kernels (FlashAttention, etc.).
## Citation
*Paper coming soon* 😊. In the meantime, you can use the following information to cite:
```
@article{falcon,
title={The Falcon Series of Language Models: Towards Open Frontier Models},
author={Almazrouei, Ebtesam and Alobeidli, Hamza and Alshamsi, Abdulaziz and Cappelli, Alessandro and Cojocaru, Ruxandra and Debbah, Merouane and Goffinet, Etienne and Heslow, Daniel and Launay, Julien and Malartic, Quentin and Noune, Badreddine and Pannier, Baptiste and Penedo, Guilherme},
year={2023}
}
```
To learn more about the pretraining dataset, see the 📓 [RefinedWeb paper](https://arxiv.org/abs/2306.01116).
```
@article{refinedweb,
title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
journal={arXiv preprint arXiv:2306.01116},
eprint={2306.01116},
eprinttype = {arXiv},
url={https://arxiv.org/abs/2306.01116},
year={2023}
}
```
## Contact
falconllm@tii.ae
| 22,474 | [
[
-0.04510498046875,
-0.060211181640625,
0.004558563232421875,
0.01525115966796875,
-0.0169219970703125,
-0.00033926963806152344,
0.0089569091796875,
-0.04034423828125,
0.0184173583984375,
0.02392578125,
-0.049774169921875,
-0.0268096923828125,
-0.03204345703125,
-0.002216339111328125,
-0.033294677734375,
0.0755615234375,
0.00467681884765625,
-0.0294952392578125,
-0.0009355545043945312,
-0.006298065185546875,
-0.0202178955078125,
-0.026580810546875,
-0.058685302734375,
-0.018585205078125,
0.0246429443359375,
0.0182037353515625,
0.0718994140625,
0.04742431640625,
0.024658203125,
0.0263214111328125,
-0.0027923583984375,
-0.0019474029541015625,
-0.037689208984375,
-0.01082611083984375,
0.00691986083984375,
-0.00861358642578125,
-0.043670654296875,
0.01012420654296875,
0.033599853515625,
0.011474609375,
-0.02716064453125,
0.00833892822265625,
0.00298309326171875,
0.0543212890625,
-0.034423828125,
0.020477294921875,
-0.0288848876953125,
0.006717681884765625,
-0.002681732177734375,
0.0121612548828125,
-0.005748748779296875,
-0.0251007080078125,
0.0068817138671875,
-0.06781005859375,
0.015869140625,
0.0006666183471679688,
0.09832763671875,
0.0094757080078125,
-0.048004150390625,
0.00157928466796875,
-0.040771484375,
0.04400634765625,
-0.06549072265625,
0.026275634765625,
0.033660888671875,
0.026275634765625,
-0.01983642578125,
-0.07537841796875,
-0.047607421875,
-0.007488250732421875,
-0.01369476318359375,
0.0280303955078125,
-0.03656005859375,
0.002841949462890625,
0.032806396484375,
0.05230712890625,
-0.06817626953125,
-0.01502227783203125,
-0.0264739990234375,
-0.0156402587890625,
0.060302734375,
0.01116943359375,
0.033447265625,
-0.021331787109375,
-0.0245819091796875,
-0.037322998046875,
-0.047576904296875,
0.007289886474609375,
0.029937744140625,
0.00472259521484375,
-0.048370361328125,
0.029632568359375,
-0.0253143310546875,
0.0279693603515625,
0.0255279541015625,
0.003414154052734375,
0.0279998779296875,
-0.0406494140625,
-0.03179931640625,
-0.02862548828125,
0.0992431640625,
0.0274200439453125,
-0.005298614501953125,
0.0163726806640625,
-0.00028777122497558594,
-0.01325225830078125,
0.01424407958984375,
-0.07659912109375,
-0.038055419921875,
0.0306243896484375,
-0.034210205078125,
-0.023406982421875,
-0.001911163330078125,
-0.0611572265625,
0.0032253265380859375,
0.0008635520935058594,
0.04156494140625,
-0.040008544921875,
-0.03350830078125,
0.0082855224609375,
-0.027008056640625,
0.0357666015625,
0.0211029052734375,
-0.0703125,
0.0297088623046875,
0.028472900390625,
0.05279541015625,
0.01317596435546875,
-0.0196380615234375,
-0.023406982421875,
0.01154327392578125,
-0.00640106201171875,
0.038787841796875,
-0.01369476318359375,
-0.044891357421875,
-0.0194244384765625,
0.0279388427734375,
0.002368927001953125,
-0.0163726806640625,
0.0399169921875,
-0.02069091796875,
0.036041259765625,
-0.034820556640625,
-0.03759765625,
-0.0255279541015625,
0.006252288818359375,
-0.0482177734375,
0.08642578125,
0.0364990234375,
-0.065185546875,
0.0147705078125,
-0.053009033203125,
-0.0234222412109375,
0.00318145751953125,
-0.0021839141845703125,
-0.042327880859375,
-0.0123443603515625,
0.0200042724609375,
0.0217132568359375,
-0.0191802978515625,
0.00777435302734375,
-0.0233154296875,
-0.0268707275390625,
0.01030731201171875,
-0.038299560546875,
0.086181640625,
0.0198822021484375,
-0.03643798828125,
-0.007198333740234375,
-0.043914794921875,
0.0085906982421875,
0.0328369140625,
-0.01204681396484375,
0.006969451904296875,
-0.01078033447265625,
0.00518035888671875,
0.01010894775390625,
0.01380157470703125,
-0.0305633544921875,
0.035675048828125,
-0.0208892822265625,
0.055389404296875,
0.04400634765625,
0.0010671615600585938,
0.0277557373046875,
-0.033538818359375,
0.038360595703125,
0.003849029541015625,
0.049530029296875,
0.0016613006591796875,
-0.0496826171875,
-0.05712890625,
-0.0153045654296875,
0.0251007080078125,
0.04156494140625,
-0.0589599609375,
0.035675048828125,
-0.0103607177734375,
-0.057281494140625,
-0.0189208984375,
-0.01378631591796875,
0.018707275390625,
0.0294647216796875,
0.034454345703125,
-0.031768798828125,
-0.0309295654296875,
-0.06414794921875,
0.0056610107421875,
-0.041656494140625,
0.00007826089859008789,
0.032379150390625,
0.0596923828125,
-0.027801513671875,
0.061981201171875,
-0.044281005859375,
-0.0091705322265625,
-0.005100250244140625,
0.01336669921875,
0.0198822021484375,
0.04705810546875,
0.06024169921875,
-0.05889892578125,
-0.044769287109375,
-0.0067291259765625,
-0.049407958984375,
-0.004833221435546875,
0.0026683807373046875,
-0.0283050537109375,
0.01381683349609375,
0.0005044937133789062,
-0.0838623046875,
0.04730224609375,
0.04052734375,
-0.04022216796875,
0.058807373046875,
-0.0160980224609375,
0.01515960693359375,
-0.082763671875,
0.005863189697265625,
0.00923919677734375,
-0.021484375,
-0.03741455078125,
0.00868988037109375,
0.0020427703857421875,
0.006500244140625,
-0.03131103515625,
0.047607421875,
-0.032989501953125,
0.00533294677734375,
-0.0011692047119140625,
-0.007709503173828125,
0.029083251953125,
0.033294677734375,
-0.014007568359375,
0.072265625,
0.033721923828125,
-0.036041259765625,
0.041351318359375,
0.038360595703125,
-0.0027313232421875,
0.0170440673828125,
-0.07000732421875,
0.01020050048828125,
0.00847625732421875,
0.0309295654296875,
-0.0723876953125,
-0.0251617431640625,
0.046630859375,
-0.04638671875,
0.038116455078125,
-0.0200653076171875,
-0.02740478515625,
-0.039276123046875,
-0.03814697265625,
0.0220184326171875,
0.060150146484375,
-0.0291748046875,
0.037994384765625,
0.030426025390625,
0.0005984306335449219,
-0.048797607421875,
-0.053985595703125,
-0.004016876220703125,
-0.02032470703125,
-0.04791259765625,
0.0379638671875,
-0.0149993896484375,
0.00011599063873291016,
-0.0013704299926757812,
0.004058837890625,
-0.003635406494140625,
-0.0037937164306640625,
0.023712158203125,
0.0225830078125,
-0.010711669921875,
-0.011749267578125,
0.013397216796875,
-0.004383087158203125,
0.00017559528350830078,
-0.0221710205078125,
0.033203125,
-0.023529052734375,
-0.00901031494140625,
-0.0308837890625,
0.0231475830078125,
0.042633056640625,
0.0002923011779785156,
0.050537109375,
0.06707763671875,
-0.01995849609375,
0.013275146484375,
-0.0399169921875,
-0.00865936279296875,
-0.042205810546875,
0.01580810546875,
-0.01407623291015625,
-0.05462646484375,
0.040069580078125,
0.0293426513671875,
0.0092926025390625,
0.060333251953125,
0.03143310546875,
-0.0099029541015625,
0.07891845703125,
0.03289794921875,
-0.01561737060546875,
0.038299560546875,
-0.0457763671875,
-0.011505126953125,
-0.0635986328125,
-0.01259613037109375,
-0.035003662109375,
-0.01568603515625,
-0.06500244140625,
-0.0261993408203125,
0.02349853515625,
0.01446533203125,
-0.064208984375,
0.04608154296875,
-0.05621337890625,
0.01561737060546875,
0.03759765625,
0.0091400146484375,
0.017974853515625,
0.0049896240234375,
-0.0093841552734375,
0.00440216064453125,
-0.045989990234375,
-0.02410888671875,
0.073974609375,
0.0267333984375,
0.0496826171875,
0.017578125,
0.036834716796875,
0.01258087158203125,
0.0226287841796875,
-0.041656494140625,
0.037506103515625,
-0.004573822021484375,
-0.05206298828125,
-0.0265045166015625,
-0.049560546875,
-0.07342529296875,
0.01515960693359375,
-0.0033588409423828125,
-0.06060791015625,
0.02349853515625,
-0.0028057098388671875,
-0.021392822265625,
0.01690673828125,
-0.054473876953125,
0.07708740234375,
-0.0118255615234375,
-0.0261077880859375,
0.004428863525390625,
-0.050323486328125,
0.024322509765625,
0.01432037353515625,
0.00616455078125,
-0.012115478515625,
-0.010223388671875,
0.05401611328125,
-0.06549072265625,
0.05462646484375,
-0.017333984375,
-0.002849578857421875,
0.041259765625,
-0.00682830810546875,
0.050506591796875,
0.01143646240234375,
0.0019683837890625,
0.021270751953125,
0.024139404296875,
-0.031982421875,
-0.036224365234375,
0.0413818359375,
-0.082275390625,
-0.04254150390625,
-0.034210205078125,
-0.033966064453125,
-0.0017881393432617188,
0.004970550537109375,
0.04248046875,
0.0310821533203125,
-0.005863189697265625,
0.00803375244140625,
0.041748046875,
-0.0285491943359375,
0.038909912109375,
0.0328369140625,
-0.0301971435546875,
-0.045166015625,
0.0635986328125,
0.00905609130859375,
0.01849365234375,
0.0170135498046875,
0.01221466064453125,
-0.040496826171875,
-0.03070068359375,
-0.054962158203125,
0.0224761962890625,
-0.039398193359375,
-0.026947021484375,
-0.050689697265625,
-0.033905029296875,
-0.033294677734375,
0.015625,
-0.02947998046875,
-0.04736328125,
-0.03607177734375,
0.0013055801391601562,
0.06427001953125,
0.038909912109375,
-0.01214599609375,
0.02618408203125,
-0.06683349609375,
0.017791748046875,
0.032379150390625,
0.0167083740234375,
0.0019197463989257812,
-0.057373046875,
-0.008056640625,
0.02679443359375,
-0.052978515625,
-0.0633544921875,
0.04632568359375,
0.016448974609375,
0.033203125,
0.02716064453125,
0.01082611083984375,
0.06121826171875,
-0.016265869140625,
0.07672119140625,
0.0193939208984375,
-0.068603515625,
0.040130615234375,
-0.052398681640625,
0.0207672119140625,
0.032989501953125,
0.0423583984375,
-0.031494140625,
-0.025634765625,
-0.0611572265625,
-0.057220458984375,
0.0335693359375,
0.040069580078125,
0.01374053955078125,
0.0034313201904296875,
0.03778076171875,
-0.003803253173828125,
0.0107269287109375,
-0.0611572265625,
-0.035430908203125,
-0.040924072265625,
-0.0096282958984375,
0.00804901123046875,
0.0004787445068359375,
-0.006992340087890625,
-0.04791259765625,
0.0711669921875,
-0.01251983642578125,
0.056640625,
0.0204620361328125,
0.0119781494140625,
-0.0115509033203125,
0.009246826171875,
0.0281524658203125,
0.041229248046875,
-0.01303863525390625,
-0.0205230712890625,
0.0088653564453125,
-0.06494140625,
0.0130157470703125,
0.02642822265625,
-0.0164794921875,
-0.008270263671875,
0.0036563873291015625,
0.058685302734375,
-0.0028743743896484375,
-0.0208740234375,
0.04119873046875,
-0.028411865234375,
-0.0268707275390625,
-0.023834228515625,
0.02099609375,
0.0147705078125,
0.02801513671875,
0.033599853515625,
-0.01898193359375,
0.0194244384765625,
-0.04290771484375,
0.00882720947265625,
0.038543701171875,
-0.023529052734375,
-0.0229949951171875,
0.0634765625,
0.0025577545166015625,
-0.0019855499267578125,
0.060943603515625,
-0.0253143310546875,
-0.037261962890625,
0.0653076171875,
0.03753662109375,
0.0638427734375,
-0.00859832763671875,
0.0294189453125,
0.038970947265625,
0.01434326171875,
-0.0106353759765625,
0.0257110595703125,
-0.0010986328125,
-0.047210693359375,
-0.028411865234375,
-0.044036865234375,
-0.034332275390625,
0.0220947265625,
-0.052398681640625,
0.0167236328125,
-0.026153564453125,
-0.030792236328125,
-0.01430511474609375,
0.021331787109375,
-0.041046142578125,
0.0194854736328125,
-0.0019474029541015625,
0.0697021484375,
-0.043548583984375,
0.062164306640625,
0.03955078125,
-0.04254150390625,
-0.08062744140625,
-0.009674072265625,
0.004299163818359375,
-0.04443359375,
0.0178680419921875,
0.007080078125,
0.021148681640625,
0.01093292236328125,
-0.05084228515625,
-0.06854248046875,
0.108154296875,
0.0206756591796875,
-0.043426513671875,
-0.01177978515625,
0.0140380859375,
0.0299530029296875,
-0.0103759765625,
0.057525634765625,
0.034515380859375,
0.034759521484375,
0.016998291015625,
-0.0748291015625,
0.0288543701171875,
-0.0399169921875,
-0.0002913475036621094,
0.01824951171875,
-0.086181640625,
0.0701904296875,
0.0028591156005859375,
-0.0120849609375,
0.01593017578125,
0.0479736328125,
0.0201873779296875,
-0.00015819072723388672,
0.028045654296875,
0.0526123046875,
0.05389404296875,
-0.02978515625,
0.08355712890625,
-0.021759033203125,
0.052703857421875,
0.05682373046875,
-0.001232147216796875,
0.05096435546875,
0.01523590087890625,
-0.0489501953125,
0.046966552734375,
0.06561279296875,
-0.01483154296875,
0.0275421142578125,
-0.00435638427734375,
-0.0177154541015625,
-0.0029125213623046875,
0.00830078125,
-0.058502197265625,
0.015777587890625,
0.0301666259765625,
-0.0157623291015625,
0.004703521728515625,
-0.009979248046875,
-0.0021381378173828125,
-0.05157470703125,
-0.0081329345703125,
0.04229736328125,
0.0184326171875,
-0.0245513916015625,
0.06768798828125,
-0.01171112060546875,
0.04815673828125,
-0.0460205078125,
-0.005950927734375,
-0.0302734375,
-0.00644683837890625,
-0.020477294921875,
-0.05987548828125,
0.0214996337890625,
-0.012786865234375,
-0.00577545166015625,
0.00046324729919433594,
0.057464599609375,
-0.0091400146484375,
-0.028167724609375,
0.0230712890625,
0.03546142578125,
0.0166778564453125,
-0.009368896484375,
-0.08062744140625,
0.0193939208984375,
-0.0020294189453125,
-0.050689697265625,
0.03668212890625,
0.02752685546875,
0.0219268798828125,
0.058074951171875,
0.04296875,
-0.0168914794921875,
-0.0044403076171875,
-0.00859832763671875,
0.07427978515625,
-0.05908203125,
-0.02569580078125,
-0.060455322265625,
0.04638671875,
-0.01216888427734375,
-0.02679443359375,
0.0577392578125,
0.04296875,
0.052398681640625,
0.0008893013000488281,
0.05560302734375,
-0.029266357421875,
0.01122283935546875,
-0.01995849609375,
0.06292724609375,
-0.054168701171875,
0.00977325439453125,
-0.031524658203125,
-0.0550537109375,
0.00658416748046875,
0.054840087890625,
-0.00600433349609375,
0.0230865478515625,
0.0357666015625,
0.06744384765625,
-0.003414154052734375,
0.0242462158203125,
0.014617919921875,
0.027862548828125,
0.0184783935546875,
0.0626220703125,
0.0667724609375,
-0.07891845703125,
0.054168701171875,
-0.0272979736328125,
-0.0091705322265625,
-0.005016326904296875,
-0.0628662109375,
-0.054779052734375,
-0.036773681640625,
-0.04412841796875,
-0.04376220703125,
-0.00003618001937866211,
0.063720703125,
0.06036376953125,
-0.0430908203125,
-0.01910400390625,
-0.01412200927734375,
-0.00592041015625,
-0.0136260986328125,
-0.0255584716796875,
0.0216522216796875,
0.01482391357421875,
-0.055206298828125,
0.013671875,
0.0086212158203125,
0.036041259765625,
-0.01461029052734375,
-0.020263671875,
-0.0132904052734375,
0.0008435249328613281,
0.0396728515625,
0.04266357421875,
-0.036041259765625,
-0.0220794677734375,
-0.00978851318359375,
-0.01203155517578125,
0.0216522216796875,
0.0209197998046875,
-0.05877685546875,
0.0008330345153808594,
0.037445068359375,
0.01446533203125,
0.06622314453125,
-0.00037670135498046875,
0.031494140625,
-0.0294952392578125,
0.004505157470703125,
0.0008826255798339844,
0.026641845703125,
0.00815582275390625,
-0.03717041015625,
0.050506591796875,
0.0269622802734375,
-0.056304931640625,
-0.0489501953125,
-0.00580596923828125,
-0.0841064453125,
-0.01190185546875,
0.08477783203125,
-0.0106658935546875,
-0.0235595703125,
-0.007663726806640625,
-0.020538330078125,
0.03350830078125,
-0.04400634765625,
0.038970947265625,
0.038665771484375,
-0.01161956787109375,
-0.0232391357421875,
-0.053314208984375,
0.03814697265625,
0.01134490966796875,
-0.064453125,
-0.0035190582275390625,
0.0312347412109375,
0.03125,
0.0023136138916015625,
0.0577392578125,
-0.016265869140625,
0.031158447265625,
0.00800323486328125,
-0.00330352783203125,
-0.006561279296875,
0.006969451904296875,
-0.0180206298828125,
-0.001911163330078125,
-0.0197296142578125,
-0.000008463859558105469
]
] |
ckiplab/albert-tiny-chinese-ner | 2022-05-10T03:28:10.000Z | [
"transformers",
"pytorch",
"albert",
"token-classification",
"zh",
"license:gpl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | ckiplab | null | null | ckiplab/albert-tiny-chinese-ner | 2 | 26,028 | transformers | 2022-03-02T23:29:05 | ---
language:
- zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
- pytorch
- token-classification
- albert
- zh
license: gpl-3.0
---
# CKIP ALBERT Tiny Chinese
This project provides traditional Chinese transformers models (including ALBERT, BERT, GPT2) and NLP tools (including word segmentation, part-of-speech tagging, named entity recognition).
## Homepage
- https://github.com/ckiplab/ckip-transformers
## Contributors
- [Mu Yang](https://muyang.pro) at [CKIP](https://ckip.iis.sinica.edu.tw) (Author & Maintainer)
## Usage
Please use `BertTokenizerFast` as the tokenizer instead of `AutoTokenizer`.
```python
from transformers import (
BertTokenizerFast,
AutoModel,
)
tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModel.from_pretrained('ckiplab/albert-tiny-chinese-ner')
```
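For end-to-end NER, a token-classification head is needed; below is a minimal sketch (our own, not from the CKIP docs, which may instead recommend the `ckip-transformers` drivers):
```python
from transformers import BertTokenizerFast, AutoModelForTokenClassification, pipeline

tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModelForTokenClassification.from_pretrained('ckiplab/albert-tiny-chinese-ner')

# Group sub-token predictions into whole entities.
ner = pipeline('token-classification', model=model, tokenizer=tokenizer,
               aggregation_strategy='simple')
print(ner('傅達仁今將執行安樂死'))  # example sentence: "Fu Ta-jen will undergo euthanasia today"
```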
For full usage and more information, please refer to https://github.com/ckiplab/ckip-transformers.
| 1,129 | [
[
-0.0240631103515625,
-0.01983642578125,
0.0046234130859375,
0.05218505859375,
-0.023834228515625,
0.004291534423828125,
-0.0160675048828125,
-0.022064208984375,
0.0016078948974609375,
0.025177001953125,
-0.0250091552734375,
-0.016815185546875,
-0.0380859375,
0.0080718994140625,
-0.01419830322265625,
0.06060791015625,
-0.01140594482421875,
0.022552490234375,
0.027496337890625,
0.005886077880859375,
-0.0174102783203125,
-0.0179443359375,
-0.0615234375,
-0.043243408203125,
0.0002338886260986328,
0.031646728515625,
0.054412841796875,
0.03515625,
0.039794921875,
0.02301025390625,
-0.0014047622680664062,
-0.0079345703125,
-0.006931304931640625,
-0.03265380859375,
0.0015392303466796875,
-0.044677734375,
-0.0279541015625,
-0.0197906494140625,
0.052520751953125,
0.034637451171875,
0.0019969940185546875,
0.00637054443359375,
0.01617431640625,
0.024169921875,
-0.0207977294921875,
0.03009033203125,
-0.041839599609375,
0.0219573974609375,
-0.0152435302734375,
-0.0030727386474609375,
-0.02569580078125,
-0.0153045654296875,
0.00666046142578125,
-0.04852294921875,
0.0265655517578125,
0.0011959075927734375,
0.0936279296875,
0.005458831787109375,
-0.0219268798828125,
-0.019744873046875,
-0.05029296875,
0.07904052734375,
-0.0625,
0.0269317626953125,
0.0267791748046875,
0.0231781005859375,
0.001926422119140625,
-0.07965087890625,
-0.05279541015625,
-0.0131072998046875,
-0.028045654296875,
0.0170745849609375,
0.0052947998046875,
-0.0029048919677734375,
0.0291290283203125,
0.021484375,
-0.0458984375,
0.0215911865234375,
-0.031707763671875,
-0.0301971435546875,
0.04296875,
0.0036220550537109375,
0.036468505859375,
-0.03753662109375,
-0.0309295654296875,
-0.024932861328125,
-0.045562744140625,
0.01549530029296875,
0.016326904296875,
0.01090240478515625,
-0.041961669921875,
0.046783447265625,
-0.0036296844482421875,
0.018951416015625,
0.01287078857421875,
-0.007061004638671875,
0.027496337890625,
-0.0278472900390625,
-0.00012058019638061523,
-0.00826263427734375,
0.076904296875,
0.01558685302734375,
0.00141143798828125,
0.00994110107421875,
-0.0261688232421875,
-0.0249176025390625,
-0.0218048095703125,
-0.06256103515625,
-0.05377197265625,
0.01354217529296875,
-0.0596923828125,
-0.015045166015625,
0.007114410400390625,
-0.051300048828125,
0.0197906494140625,
-0.01499176025390625,
0.0290374755859375,
-0.048980712890625,
-0.03814697265625,
-0.00353240966796875,
-0.0247802734375,
0.05145263671875,
0.0098724365234375,
-0.09295654296875,
0.001697540283203125,
0.046173095703125,
0.06097412109375,
0.017059326171875,
-0.01168060302734375,
0.0079345703125,
0.0266265869140625,
-0.0176849365234375,
0.04205322265625,
-0.00870513916015625,
-0.048858642578125,
0.00689697265625,
0.0057220458984375,
0.0022525787353515625,
-0.03515625,
0.05743408203125,
-0.020965576171875,
0.0237884521484375,
-0.0084381103515625,
-0.01447296142578125,
0.0017337799072265625,
0.00643157958984375,
-0.037811279296875,
0.07879638671875,
0.00994110107421875,
-0.06427001953125,
0.017486572265625,
-0.068115234375,
-0.03778076171875,
0.0256500244140625,
-0.01141357421875,
-0.034698486328125,
-0.00783538818359375,
0.017547607421875,
0.0204620361328125,
-0.0099029541015625,
0.0049591064453125,
-0.00672149658203125,
-0.01611328125,
0.00017452239990234375,
-0.030487060546875,
0.091064453125,
0.02294921875,
-0.01465606689453125,
0.012847900390625,
-0.053131103515625,
0.008575439453125,
0.0256805419921875,
-0.0137481689453125,
-0.02569580078125,
0.0074005126953125,
0.040679931640625,
0.0130157470703125,
0.037384033203125,
-0.043548583984375,
0.0391845703125,
-0.046630859375,
0.056793212890625,
0.057464599609375,
-0.017303466796875,
0.0262603759765625,
-0.01546478271484375,
0.0006241798400878906,
0.002040863037109375,
0.0252685546875,
-0.0022983551025390625,
-0.038787841796875,
-0.0855712890625,
-0.0284576416015625,
0.038116455078125,
0.048858642578125,
-0.08819580078125,
0.05364990234375,
-0.02337646484375,
-0.049346923828125,
-0.032958984375,
0.0005779266357421875,
0.0011692047119140625,
0.00762939453125,
0.035003662109375,
-0.0187225341796875,
-0.041290283203125,
-0.0771484375,
0.006580352783203125,
-0.0455322265625,
-0.048583984375,
-0.0004215240478515625,
0.04876708984375,
-0.0340576171875,
0.07318115234375,
-0.03961181640625,
-0.0267486572265625,
-0.020965576171875,
0.0423583984375,
0.0246124267578125,
0.0653076171875,
0.04217529296875,
-0.0743408203125,
-0.05908203125,
-0.0098724365234375,
-0.024078369140625,
-0.0013055801391601562,
-0.021453857421875,
-0.005825042724609375,
-0.005199432373046875,
0.005245208740234375,
-0.048431396484375,
0.0142822265625,
0.0244903564453125,
0.00009179115295410156,
0.059295654296875,
-0.005641937255859375,
-0.0227508544921875,
-0.09100341796875,
0.0096282958984375,
-0.0121917724609375,
-0.002323150634765625,
-0.0281524658203125,
-0.0028476715087890625,
0.0164794921875,
-0.006931304931640625,
-0.04437255859375,
0.047576904296875,
-0.0265655517578125,
0.0263671875,
-0.0186614990234375,
-0.006011962890625,
-0.01384735107421875,
0.0391845703125,
0.0256500244140625,
0.052947998046875,
0.0396728515625,
-0.050384521484375,
0.027587890625,
0.05303955078125,
-0.0194549560546875,
-0.004840850830078125,
-0.07196044921875,
-0.002773284912109375,
0.0268402099609375,
0.0028934478759765625,
-0.0628662109375,
-0.00860595703125,
0.0455322265625,
-0.0540771484375,
0.047698974609375,
0.0014934539794921875,
-0.07373046875,
-0.037628173828125,
-0.037872314453125,
0.03033447265625,
0.04718017578125,
-0.050323486328125,
0.032562255859375,
0.0183563232421875,
-0.017425537109375,
-0.0298919677734375,
-0.0592041015625,
-0.0025234222412109375,
0.0127716064453125,
-0.043731689453125,
0.042694091796875,
-0.0205535888671875,
0.017242431640625,
0.0013837814331054688,
0.00966644287109375,
-0.03240966796875,
-0.0085296630859375,
-0.01094818115234375,
0.0296173095703125,
-0.0142974853515625,
-0.002712249755859375,
0.016448974609375,
-0.0264129638671875,
0.00794219970703125,
0.004695892333984375,
0.045379638671875,
-0.004238128662109375,
-0.021636962890625,
-0.047515869140625,
0.0196990966796875,
0.01294708251953125,
-0.01309967041015625,
0.027587890625,
0.0782470703125,
-0.0204010009765625,
-0.0130462646484375,
-0.0225982666015625,
-0.0174713134765625,
-0.041412353515625,
0.03887939453125,
-0.033599853515625,
-0.06610107421875,
0.0207977294921875,
-0.0090484619140625,
0.010406494140625,
0.058258056640625,
0.0498046875,
-0.0031795501708984375,
0.09814453125,
0.07208251953125,
-0.032440185546875,
0.032135009765625,
-0.032928466796875,
0.0310821533203125,
-0.07159423828125,
0.0211334228515625,
-0.047454833984375,
0.00426483154296875,
-0.06573486328125,
-0.023223876953125,
0.0025177001953125,
0.0147705078125,
-0.021697998046875,
0.051513671875,
-0.05712890625,
0.001834869384765625,
0.053131103515625,
-0.0283050537109375,
-0.001190185546875,
-0.0123748779296875,
-0.0155487060546875,
-0.00923919677734375,
-0.053192138671875,
-0.05364990234375,
0.054168701171875,
0.04632568359375,
0.05328369140625,
0.0038318634033203125,
0.0372314453125,
-0.0021209716796875,
0.036895751953125,
-0.062255859375,
0.0418701171875,
-0.00797271728515625,
-0.057861328125,
-0.0215606689453125,
-0.0208282470703125,
-0.063720703125,
0.0212249755859375,
-0.003437042236328125,
-0.06951904296875,
0.0164947509765625,
0.0069732666015625,
-0.0199737548828125,
0.0243682861328125,
-0.036712646484375,
0.0595703125,
-0.0308837890625,
0.004032135009765625,
-0.004657745361328125,
-0.050872802734375,
0.034332275390625,
-0.007564544677734375,
0.00013756752014160156,
-0.005069732666015625,
-0.0011739730834960938,
0.0609130859375,
-0.0189361572265625,
0.0615234375,
-0.0029354095458984375,
0.002490997314453125,
0.0250396728515625,
-0.016510009765625,
0.0250701904296875,
0.0247344970703125,
0.0098724365234375,
0.047454833984375,
0.022216796875,
-0.026947021484375,
-0.0167999267578125,
0.039276123046875,
-0.06591796875,
-0.0308685302734375,
-0.045074462890625,
-0.014434814453125,
0.01346588134765625,
0.042999267578125,
0.03912353515625,
-0.006320953369140625,
0.002895355224609375,
0.0194549560546875,
0.0164642333984375,
-0.0276336669921875,
0.047454833984375,
0.0447998046875,
-0.00565338134765625,
-0.03131103515625,
0.07305908203125,
0.0081939697265625,
0.0069427490234375,
0.045623779296875,
0.004467010498046875,
-0.01107025146484375,
-0.03204345703125,
-0.02825927734375,
0.036285400390625,
-0.0239410400390625,
-0.003314971923828125,
-0.0262451171875,
-0.03741455078125,
-0.044464111328125,
0.009552001953125,
-0.031768798828125,
-0.0318603515625,
-0.025482177734375,
0.008056640625,
-0.0278472900390625,
0.0074310302734375,
-0.0283050537109375,
0.03948974609375,
-0.0804443359375,
0.0391845703125,
0.017059326171875,
0.01253509521484375,
0.004405975341796875,
-0.0204010009765625,
-0.047119140625,
0.006725311279296875,
-0.0648193359375,
-0.044891357421875,
0.044677734375,
0.0037059783935546875,
0.040374755859375,
0.04864501953125,
0.0187530517578125,
0.039581298828125,
-0.049407958984375,
0.08154296875,
0.0302276611328125,
-0.08349609375,
0.0311279296875,
-0.00868988037109375,
0.0243377685546875,
0.022552490234375,
0.0341796875,
-0.059234619140625,
-0.0232391357421875,
-0.038787841796875,
-0.08203125,
0.053802490234375,
0.03009033203125,
0.020599365234375,
0.00138092041015625,
-0.005474090576171875,
-0.004695892333984375,
0.012939453125,
-0.07574462890625,
-0.038482666015625,
-0.0263671875,
-0.02410888671875,
0.0138702392578125,
-0.031494140625,
0.0003662109375,
-0.0196380615234375,
0.0797119140625,
-0.004016876220703125,
0.0606689453125,
0.02740478515625,
0.0018215179443359375,
-0.012969970703125,
0.0085601806640625,
0.032867431640625,
0.03582763671875,
-0.01486968994140625,
-0.01776123046875,
0.006435394287109375,
-0.04327392578125,
-0.016387939453125,
0.026824951171875,
-0.035125732421875,
0.030303955078125,
0.039520263671875,
0.045135498046875,
0.00836181640625,
-0.0318603515625,
0.044158935546875,
-0.0216217041015625,
-0.012847900390625,
-0.07269287109375,
-0.0044097900390625,
0.00041937828063964844,
0.006580352783203125,
0.040496826171875,
-0.0060577392578125,
0.0110931396484375,
-0.01078033447265625,
0.01029205322265625,
0.028564453125,
-0.039398193359375,
-0.0391845703125,
0.055572509765625,
0.031280517578125,
-0.0270843505859375,
0.0594482421875,
-0.0119476318359375,
-0.060089111328125,
0.046875,
0.0322265625,
0.0704345703125,
-0.0184478759765625,
0.0045318603515625,
0.04766845703125,
0.043365478515625,
0.00194549560546875,
0.0111846923828125,
-0.0212860107421875,
-0.06591796875,
-0.034942626953125,
-0.030303955078125,
-0.03466796875,
0.024322509765625,
-0.035675048828125,
0.04595947265625,
-0.03338623046875,
-0.005321502685546875,
0.0009160041809082031,
-0.0002903938293457031,
-0.042449951171875,
0.004425048828125,
0.007732391357421875,
0.0821533203125,
-0.04876708984375,
0.08697509765625,
0.042999267578125,
-0.04156494140625,
-0.06500244140625,
0.01367950439453125,
-0.02032470703125,
-0.052520751953125,
0.0736083984375,
0.0252685546875,
0.0246429443359375,
0.004970550537109375,
-0.05377197265625,
-0.05609130859375,
0.07611083984375,
-0.01316070556640625,
-0.0300445556640625,
-0.00582122802734375,
0.026153564453125,
0.0322265625,
-0.0030574798583984375,
0.036773681640625,
0.0098724365234375,
0.04583740234375,
-0.0156707763671875,
-0.084228515625,
-0.0190887451171875,
-0.020294189453125,
0.00919342041015625,
0.0175018310546875,
-0.07177734375,
0.05780029296875,
0.0013408660888671875,
-0.0261383056640625,
0.036529541015625,
0.0721435546875,
-0.002483367919921875,
0.01861572265625,
0.040557861328125,
0.030609130859375,
-0.0004439353942871094,
-0.0160064697265625,
0.042022705078125,
-0.040374755859375,
0.06134033203125,
0.0634765625,
0.0021877288818359375,
0.055023193359375,
0.033172607421875,
-0.036285400390625,
0.034332275390625,
0.05242919921875,
-0.039947509765625,
0.045745849609375,
-0.0026092529296875,
-0.00907135009765625,
-0.00603485107421875,
0.0042572021484375,
-0.0372314453125,
0.017333984375,
0.0237274169921875,
-0.0207977294921875,
-0.00231170654296875,
-0.00858306884765625,
-0.0023651123046875,
-0.029510498046875,
-0.0106964111328125,
0.038177490234375,
0.0193023681640625,
-0.0108489990234375,
0.0345458984375,
0.0261993408203125,
0.07568359375,
-0.0748291015625,
-0.0225372314453125,
0.01739501953125,
0.0158843994140625,
-0.007564544677734375,
-0.043304443359375,
0.01448822021484375,
-0.02337646484375,
-0.012115478515625,
-0.01012420654296875,
0.056915283203125,
-0.024322509765625,
-0.0435791015625,
0.02490234375,
0.006641387939453125,
0.00960540771484375,
0.0254669189453125,
-0.083251953125,
-0.023529052734375,
0.0248870849609375,
-0.032562255859375,
0.0151519775390625,
0.0174560546875,
0.00884246826171875,
0.05010986328125,
0.06109619140625,
0.0133209228515625,
-0.01346588134765625,
-0.00823974609375,
0.061614990234375,
-0.041748046875,
-0.041961669921875,
-0.05364990234375,
0.053253173828125,
-0.0192413330078125,
-0.02447509765625,
0.058624267578125,
0.0540771484375,
0.08990478515625,
-0.02508544921875,
0.06671142578125,
-0.031646728515625,
0.05682373046875,
-0.0110015869140625,
0.06280517578125,
-0.039764404296875,
-0.00417327880859375,
-0.0211334228515625,
-0.05987548828125,
-0.0197906494140625,
0.062469482421875,
-0.01482391357421875,
-0.009674072265625,
0.053497314453125,
0.050750732421875,
-0.003261566162109375,
-0.01312255859375,
0.0137481689453125,
0.0181732177734375,
0.04656982421875,
0.03192138671875,
0.0435791015625,
-0.042388916015625,
0.05279541015625,
-0.054595947265625,
-0.01259613037109375,
-0.01184844970703125,
-0.04669189453125,
-0.05029296875,
-0.038909912109375,
-0.0236968994140625,
-0.005573272705078125,
-0.028289794921875,
0.0643310546875,
0.053192138671875,
-0.07672119140625,
-0.028717041015625,
0.00534820556640625,
0.012664794921875,
-0.0257110595703125,
-0.02557373046875,
0.045196533203125,
-0.021453857421875,
-0.0858154296875,
0.0034618377685546875,
0.0102691650390625,
0.007236480712890625,
-0.019378662109375,
-0.0009551048278808594,
-0.01348876953125,
-0.0144195556640625,
0.028106689453125,
0.037872314453125,
-0.04779052734375,
-0.030609130859375,
-0.0085296630859375,
-0.016815185546875,
0.007335662841796875,
0.0440673828125,
-0.01392364501953125,
0.02508544921875,
0.04364013671875,
0.0194549560546875,
0.0235137939453125,
-0.0191650390625,
0.03997802734375,
-0.038177490234375,
0.025604248046875,
0.0279541015625,
0.039520263671875,
0.0243377685546875,
-0.01316070556640625,
0.0380859375,
0.033843994140625,
-0.054595947265625,
-0.04864501953125,
0.02423095703125,
-0.07244873046875,
-0.01248931884765625,
0.0675048828125,
-0.01140594482421875,
-0.0169525146484375,
-0.00283050537109375,
-0.041107177734375,
0.0418701171875,
-0.024139404296875,
0.044647216796875,
0.054595947265625,
-0.006519317626953125,
-0.00739288330078125,
-0.042572021484375,
0.027618408203125,
0.030548095703125,
-0.0234832763671875,
-0.0265655517578125,
-0.0024814605712890625,
0.0103759765625,
0.04852294921875,
0.033843994140625,
-0.004055023193359375,
0.006259918212890625,
-0.004474639892578125,
0.03955078125,
0.00325775146484375,
0.01323699951171875,
0.0002574920654296875,
-0.01338958740234375,
0.00786590576171875,
-0.032073974609375
]
] |
JosephusCheung/ACertainThing | 2022-12-20T03:16:02.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"arxiv:2106.09685",
"doi:10.57967/hf/0197",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | JosephusCheung | null | null | JosephusCheung/ACertainThing | 190 | 25,975 | diffusers | 2022-12-13T18:05:27 | ---
language:
- en
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
inference: true
widget:
- text: "masterpiece, best quality, 1girl, brown hair, green eyes, colorful, autumn, cumulonimbus clouds, lighting, blue sky, falling leaves, garden"
example_title: "example 1girl"
- text: "masterpiece, best quality, 1boy, brown hair, green eyes, colorful, autumn, cumulonimbus clouds, lighting, blue sky, falling leaves, garden"
example_title: "example 1boy"
---
# ACertainThing
**Try the full functionality with a free Google Colab T4** [](https://colab.research.google.com/drive/1gwJViXR0UxoXx01qiU6uTSEKGjTagOgp?usp=sharing)
Anything3.0 is an overfitted model that takes liberties it shouldn't when generating human images and certain details. However, the community has given it a high rating, and I believe that is because many lazy people who don't know how to write a prompt can use this overfitted model to generate high-quality images even if their prompts are poorly written.
Here is an ACertain version of Anything3.0, made with Dreambooth (with the idea of [LoRA](https://arxiv.org/abs/2106.09685) integrated), initialized with [ACertainModel](https://huggingface.co/JosephusCheung/ACertainModel).
Although this model may produce better results for image generation, it is built on two major problems. Firstly, it does not always stay true to your prompts; it adds irrelevant details, and sometimes these details are highly homogenized. Secondly, it is an unstable, overfitted model, similar to Anything3.0, and is not suitable for any form of further training. As far as I know, Anything3.0 is obtained by merging several models in just the right way, but it is itself an overfitted model with defects in both its saturation and configuration. However, as I mentioned earlier, it can make even poorly written prompts produce good output images, which leads many lazy people who are incapable of writing good prompts to quickly surpass those who study the writing of prompts carefully. Despite these problems, I still want to release an extended version of the model that caters to the preferences of many people in the community. I hope you will like it.
**In my personal view, I oppose all forms of model merging as it has no scientific principle and is nothing but a waste of time. It is a desire to get results without putting in the effort. That is why I do not like Anything3.0, or this model that is being released. But I respect the choices and preferences of the community, and I hope that you can also respect and understand my thoughts.**
If you want your prompts to be accurately output and want to learn the correct skills for using prompts, it is recommended that you use the more balanced model [ACertainModel](https://huggingface.co/JosephusCheung/ACertainModel).
e.g. **_masterpiece, best quality, 1girl, brown hair, green eyes, colorful, autumn, cumulonimbus clouds, lighting, blue sky, falling leaves, garden_**
## About the online preview with the Hosted Inference API, and generation with this model
In the hosted inference API, parameters cannot be modified; the preview appears to be generated with *Clip skip: 1*. For better performance, it is strongly recommended to use *Clip skip: 2* instead.
Here is an example of inference settings, if applicable on your own server: *Steps: 28, Sampler: Euler a, CFG scale: 11, Clip skip: 2*.
## 🧨 Diffusers
This model can be used just like any other Stable Diffusion model. For more information, please have a look at the [Stable Diffusion documentation](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion).
You can also export the model to [ONNX](https://huggingface.co/docs/diffusers/optimization/onnx), [MPS](https://huggingface.co/docs/diffusers/optimization/mps) and/or FLAX/JAX.
```python
from diffusers import StableDiffusionPipeline
import torch
model_id = "JosephusCheung/ACertainThing"
branch_name = "main"
pipe = StableDiffusionPipeline.from_pretrained(model_id, revision=branch_name, torch_dtype=torch.float16)
pipe = pipe.to("cuda")
prompt = "pikachu"
image = pipe(prompt).images[0]
image.save("./pikachu.png")
```
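To approximate the recommended settings above in `diffusers` (a sketch: "Euler a" maps to the Euler-ancestral scheduler, and clip skip has no direct `diffusers` switch, so it is omitted here):
```python
from diffusers import StableDiffusionPipeline, EulerAncestralDiscreteScheduler
import torch

pipe = StableDiffusionPipeline.from_pretrained(
    "JosephusCheung/ACertainThing", torch_dtype=torch.float16
).to("cuda")
# "Sampler: Euler a" corresponds to the Euler-ancestral scheduler.
pipe.scheduler = EulerAncestralDiscreteScheduler.from_config(pipe.scheduler.config)

image = pipe(
    "masterpiece, best quality, 1girl, brown hair, green eyes",
    num_inference_steps=28,  # Steps: 28
    guidance_scale=11,       # CFG scale: 11
).images[0]
image.save("./sample.png")
```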
## Examples
Below are some examples of images generated using this model, with better performance on framing and hand gestures, as well as moving objects, compared to other analogues:
**Anime Girl:**

```
1girl, brown hair, green eyes, colorful, autumn, cumulonimbus clouds, lighting, blue sky, falling leaves, garden
Steps: 28, Sampler: Euler a, CFG scale: 11, Seed: 114514, Clip skip: 2
```
**Anime Boy:**

```
1boy, brown hair, green eyes, colorful, autumn, cumulonimbus clouds, lighting, blue sky, falling leaves, garden
Steps: 28, Sampler: Euler a, CFG scale: 11, Seed: 114514, Clip skip: 2
```
## License
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights over the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license)
## Is it a NovelAI-based model? What is the relationship with SD1.2 and SD1.4?
See [ASimilarityCalculatior](https://huggingface.co/JosephusCheung/ASimilarityCalculatior) | 6,087 | [
[
-0.03399658203125,
-0.0697021484375,
0.036285400390625,
0.025146484375,
-0.017242431640625,
-0.03314208984375,
-0.0013113021850585938,
-0.04974365234375,
0.0238800048828125,
0.02532958984375,
-0.03790283203125,
-0.04010009765625,
-0.039031982421875,
-0.0031719207763671875,
-0.018280029296875,
0.07940673828125,
-0.02593994140625,
0.00017130374908447266,
-0.016845703125,
-0.0032196044921875,
-0.0386962890625,
-0.018402099609375,
-0.07080078125,
-0.01763916015625,
0.039337158203125,
0.01026153564453125,
0.050537109375,
0.04522705078125,
0.02130126953125,
0.02423095703125,
-0.02154541015625,
-0.00984954833984375,
-0.03802490234375,
0.0030994415283203125,
0.00565338134765625,
-0.02374267578125,
-0.06414794921875,
0.018585205078125,
0.029815673828125,
0.029876708984375,
-0.011688232421875,
0.0003516674041748047,
0.006725311279296875,
0.034332275390625,
-0.0307159423828125,
0.00797271728515625,
-0.0047454833984375,
-0.00196075439453125,
-0.006496429443359375,
0.0188751220703125,
-0.0167083740234375,
-0.041473388671875,
0.005817413330078125,
-0.06304931640625,
0.0180206298828125,
0.005481719970703125,
0.08905029296875,
0.007289886474609375,
-0.01953125,
-0.016571044921875,
-0.02984619140625,
0.050384521484375,
-0.05010986328125,
0.01433563232421875,
0.0225982666015625,
0.0310821533203125,
0.0008978843688964844,
-0.058685302734375,
-0.042755126953125,
-0.0120391845703125,
0.00473785400390625,
0.024688720703125,
-0.0255584716796875,
-0.0004603862762451172,
0.01457977294921875,
0.0301666259765625,
-0.05621337890625,
-0.0015573501586914062,
-0.044464111328125,
0.0036373138427734375,
0.046722412109375,
0.01232147216796875,
0.03802490234375,
-0.0253753662109375,
-0.04522705078125,
-0.010650634765625,
-0.05224609375,
0.01454925537109375,
0.02490234375,
0.0180511474609375,
-0.04168701171875,
0.0478515625,
0.01214599609375,
0.040771484375,
0.0195465087890625,
0.0066986083984375,
0.0296173095703125,
-0.007415771484375,
-0.021270751953125,
-0.0227203369140625,
0.0667724609375,
0.04443359375,
0.00801849365234375,
0.00152587890625,
-0.00447845458984375,
0.0009832382202148438,
0.0012226104736328125,
-0.0848388671875,
-0.0281829833984375,
0.042449951171875,
-0.04730224609375,
-0.0283203125,
-0.01763916015625,
-0.039093017578125,
-0.02020263671875,
-0.004978179931640625,
0.0352783203125,
-0.04254150390625,
-0.038848876953125,
0.006725311279296875,
-0.0210113525390625,
-0.0007877349853515625,
0.04425048828125,
-0.046722412109375,
-0.0013723373413085938,
0.020355224609375,
0.07476806640625,
-0.0228729248046875,
-0.0149078369140625,
-0.0013113021850585938,
-0.00266265869140625,
-0.0196380615234375,
0.053741455078125,
-0.023101806640625,
-0.043609619140625,
-0.0190582275390625,
0.01806640625,
-0.0087432861328125,
-0.03314208984375,
0.044403076171875,
-0.029876708984375,
0.023223876953125,
-0.0213623046875,
-0.047637939453125,
-0.019622802734375,
0.0058746337890625,
-0.05572509765625,
0.06549072265625,
0.01470184326171875,
-0.060638427734375,
0.021820068359375,
-0.062744140625,
-0.0068817138671875,
-0.004261016845703125,
0.0020084381103515625,
-0.039703369140625,
-0.01540374755859375,
0.00853729248046875,
0.0416259765625,
-0.0170745849609375,
0.021697998046875,
-0.0469970703125,
-0.01061248779296875,
0.007965087890625,
-0.01358795166015625,
0.104736328125,
0.0285186767578125,
-0.016082763671875,
0.01312255859375,
-0.0421142578125,
-0.01103973388671875,
0.020172119140625,
-0.00843048095703125,
-0.005115509033203125,
-0.032989501953125,
0.02532958984375,
0.023101806640625,
0.01568603515625,
-0.055572509765625,
0.0234375,
-0.016082763671875,
0.0309295654296875,
0.050048828125,
0.0014133453369140625,
0.035400390625,
-0.050689697265625,
0.053802490234375,
0.01313018798828125,
0.01751708984375,
-0.00893402099609375,
-0.063720703125,
-0.051513671875,
-0.04931640625,
0.0210113525390625,
0.02667236328125,
-0.05242919921875,
0.03216552734375,
-0.00656890869140625,
-0.061920166015625,
-0.032440185546875,
-0.00708770751953125,
0.0300140380859375,
0.0465087890625,
0.0185089111328125,
-0.0149078369140625,
-0.030242919921875,
-0.052825927734375,
0.00826263427734375,
-0.0113067626953125,
-0.01360321044921875,
0.0240631103515625,
0.0450439453125,
-0.00797271728515625,
0.0653076171875,
-0.046142578125,
-0.01611328125,
-0.0159759521484375,
0.0142364501953125,
0.0243072509765625,
0.06475830078125,
0.061004638671875,
-0.061248779296875,
-0.048431396484375,
-0.01128387451171875,
-0.0670166015625,
-0.00789642333984375,
0.003055572509765625,
-0.0222625732421875,
0.01617431640625,
0.01849365234375,
-0.08172607421875,
0.032379150390625,
0.03912353515625,
-0.04437255859375,
0.03265380859375,
-0.01096343994140625,
0.0149688720703125,
-0.09710693359375,
0.03289794921875,
0.01508331298828125,
-0.03045654296875,
-0.045440673828125,
0.0309295654296875,
0.0030155181884765625,
-0.0190277099609375,
-0.04937744140625,
0.0787353515625,
-0.022369384765625,
0.026580810546875,
-0.0228271484375,
0.008819580078125,
0.0193023681640625,
0.04278564453125,
0.0113677978515625,
0.0347900390625,
0.06024169921875,
-0.04034423828125,
0.0308074951171875,
0.0330810546875,
-0.013336181640625,
0.052581787109375,
-0.07122802734375,
-0.004093170166015625,
-0.021148681640625,
0.036895751953125,
-0.07470703125,
-0.02789306640625,
0.041778564453125,
-0.0496826171875,
0.02996826171875,
-0.0017852783203125,
-0.025146484375,
-0.029815673828125,
-0.03271484375,
0.026092529296875,
0.05987548828125,
-0.04949951171875,
0.0565185546875,
0.02471923828125,
-0.00881195068359375,
-0.0161895751953125,
-0.05218505859375,
-0.032073974609375,
-0.03350830078125,
-0.06884765625,
0.0282745361328125,
-0.0228424072265625,
-0.000018715858459472656,
0.007442474365234375,
0.0123291015625,
-0.0109100341796875,
-0.01271820068359375,
0.038818359375,
0.034820556640625,
-0.004608154296875,
-0.014862060546875,
0.00299835205078125,
-0.00675201416015625,
-0.00384521484375,
0.00576019287109375,
0.0238494873046875,
-0.003826141357421875,
-0.01378631591796875,
-0.0533447265625,
0.0286712646484375,
0.04486083984375,
0.008392333984375,
0.050048828125,
0.0531005859375,
-0.033966064453125,
0.00750732421875,
-0.03131103515625,
-0.0079345703125,
-0.037322998046875,
0.00560760498046875,
-0.026519775390625,
-0.04815673828125,
0.061737060546875,
0.019775390625,
0.02252197265625,
0.04559326171875,
0.04168701171875,
-0.018524169921875,
0.09564208984375,
0.0416259765625,
0.025054931640625,
0.033477783203125,
-0.04302978515625,
-0.01177215576171875,
-0.0714111328125,
-0.03338623046875,
-0.02783203125,
-0.0192718505859375,
-0.03680419921875,
-0.029510498046875,
0.0229949951171875,
0.0164947509765625,
-0.036346435546875,
0.03338623046875,
-0.0330810546875,
0.01184844970703125,
0.0282440185546875,
0.0283203125,
0.005184173583984375,
-0.00803375244140625,
0.00429534912109375,
-0.004161834716796875,
-0.051727294921875,
-0.0285491943359375,
0.06695556640625,
0.035675048828125,
0.0645751953125,
0.01055908203125,
0.0421142578125,
0.008758544921875,
0.0350341796875,
-0.033447265625,
0.03594970703125,
0.0012531280517578125,
-0.07623291015625,
-0.00765228271484375,
-0.036041259765625,
-0.07635498046875,
0.006649017333984375,
-0.0185089111328125,
-0.060455322265625,
0.032989501953125,
0.0279083251953125,
-0.040313720703125,
0.0255126953125,
-0.064453125,
0.0721435546875,
0.001209259033203125,
-0.037109375,
0.022216796875,
-0.04412841796875,
0.03143310546875,
0.0066070556640625,
-0.0009531974792480469,
-0.00937652587890625,
-0.00212860107421875,
0.05157470703125,
-0.046417236328125,
0.0740966796875,
-0.0245361328125,
-0.0074310302734375,
0.037353515625,
0.0003466606140136719,
0.03173828125,
0.0008358955383300781,
-0.007755279541015625,
0.0196990966796875,
0.005229949951171875,
-0.044158935546875,
-0.041717529296875,
0.0721435546875,
-0.0693359375,
-0.031219482421875,
-0.023834228515625,
-0.0310821533203125,
0.01348114013671875,
0.0188446044921875,
0.050750732421875,
0.032318115234375,
-0.0082550048828125,
0.0034503936767578125,
0.042236328125,
-0.01195526123046875,
0.0265655517578125,
0.0150909423828125,
-0.047760009765625,
-0.03619384765625,
0.07293701171875,
0.0063934326171875,
0.0212249755859375,
-0.00437164306640625,
0.017669677734375,
-0.031494140625,
-0.02593994140625,
-0.055267333984375,
0.0226593017578125,
-0.0528564453125,
-0.0247955322265625,
-0.057891845703125,
-0.00922393798828125,
-0.032745361328125,
-0.0300140380859375,
-0.0275726318359375,
-0.047149658203125,
-0.0537109375,
0.00009614229202270508,
0.050750732421875,
0.04931640625,
-0.008697509765625,
0.0281982421875,
-0.043487548828125,
0.0290374755859375,
0.01175689697265625,
0.035247802734375,
0.00667572021484375,
-0.048065185546875,
-0.0026645660400390625,
0.00357818603515625,
-0.046112060546875,
-0.06640625,
0.03302001953125,
0.00124359130859375,
0.0257110595703125,
0.04058837890625,
-0.00569915771484375,
0.058807373046875,
-0.0195465087890625,
0.072998046875,
0.0309295654296875,
-0.0517578125,
0.028045654296875,
-0.052734375,
0.023651123046875,
0.0312347412109375,
0.049530029296875,
-0.01727294921875,
-0.038787841796875,
-0.068115234375,
-0.0677490234375,
0.052093505859375,
0.043182373046875,
0.0076141357421875,
0.002613067626953125,
0.03533935546875,
0.0052642822265625,
0.0111083984375,
-0.07110595703125,
-0.032073974609375,
-0.0323486328125,
-0.0016412734985351562,
0.01180267333984375,
0.0061798095703125,
-0.00217437744140625,
-0.0265960693359375,
0.06658935546875,
0.0156402587890625,
0.0274200439453125,
0.01483154296875,
0.019317626953125,
-0.0246429443359375,
-0.0195159912109375,
0.017608642578125,
0.0285186767578125,
-0.0279693603515625,
-0.0218048095703125,
0.002483367919921875,
-0.03045654296875,
0.0014171600341796875,
0.0159454345703125,
-0.0296173095703125,
0.005306243896484375,
0.005176544189453125,
0.06414794921875,
-0.003177642822265625,
-0.0291900634765625,
0.0465087890625,
-0.01158905029296875,
-0.017913818359375,
-0.0174102783203125,
0.01354217529296875,
0.0211944580078125,
0.02490234375,
0.006256103515625,
0.0257415771484375,
0.0230255126953125,
-0.034271240234375,
-0.008392333984375,
0.03314208984375,
-0.0213775634765625,
-0.028656005859375,
0.08056640625,
0.022735595703125,
-0.00885009765625,
0.033203125,
-0.02154541015625,
-0.01462554931640625,
0.056671142578125,
0.04669189453125,
0.0638427734375,
-0.0196075439453125,
0.0204315185546875,
0.049774169921875,
0.0002753734588623047,
-0.01226043701171875,
0.037872314453125,
0.00946044921875,
-0.032684326171875,
-0.0150909423828125,
-0.03857421875,
-0.0175628662109375,
0.01457977294921875,
-0.04473876953125,
0.052032470703125,
-0.04254150390625,
-0.031585693359375,
0.00018966197967529297,
-0.01056671142578125,
-0.0372314453125,
0.030426025390625,
0.0010023117065429688,
0.0772705078125,
-0.07928466796875,
0.040802001953125,
0.054107666015625,
-0.05511474609375,
-0.0689697265625,
-0.021514892578125,
-0.00930023193359375,
-0.034942626953125,
0.0239105224609375,
0.0194244384765625,
0.004772186279296875,
-0.00307464599609375,
-0.06024169921875,
-0.06085205078125,
0.09039306640625,
0.0310211181640625,
-0.0095672607421875,
-0.0273590087890625,
-0.023284912109375,
0.044097900390625,
-0.031494140625,
0.045440673828125,
0.0159759521484375,
0.0323486328125,
0.03466796875,
-0.057708740234375,
0.020904541015625,
-0.04095458984375,
0.018524169921875,
-0.001705169677734375,
-0.06329345703125,
0.0904541015625,
-0.016448974609375,
-0.0278167724609375,
0.042938232421875,
0.0557861328125,
0.044921875,
0.020050048828125,
0.028411865234375,
0.050201416015625,
0.039398193359375,
-0.00534820556640625,
0.09271240234375,
-0.0129547119140625,
0.032928466796875,
0.0677490234375,
-0.0004391670227050781,
0.05242919921875,
0.0231781005859375,
-0.006694793701171875,
0.03179931640625,
0.06414794921875,
-0.0036830902099609375,
0.0275726318359375,
-0.00565338134765625,
-0.0009236335754394531,
-0.017730712890625,
-0.004497528076171875,
-0.04486083984375,
0.0086212158203125,
0.01166534423828125,
-0.03216552734375,
-0.01355743408203125,
0.01451873779296875,
0.0143890380859375,
-0.015838623046875,
-0.01116943359375,
0.0310821533203125,
0.007579803466796875,
-0.037506103515625,
0.06695556640625,
-0.003803253173828125,
0.05877685546875,
-0.03826904296875,
-0.01123046875,
-0.01457977294921875,
-0.0111846923828125,
-0.0237274169921875,
-0.0606689453125,
0.0095672607421875,
0.0026531219482421875,
0.0003952980041503906,
-0.0235748291015625,
0.039825439453125,
-0.0274200439453125,
-0.042938232421875,
0.01318359375,
0.0208282470703125,
0.0374755859375,
0.0112152099609375,
-0.07098388671875,
0.019775390625,
-0.003070831298828125,
-0.0296173095703125,
0.006687164306640625,
0.0162506103515625,
0.010345458984375,
0.05712890625,
0.033416748046875,
0.006893157958984375,
-0.0022525787353515625,
-0.0127410888671875,
0.0760498046875,
-0.0290985107421875,
-0.036529541015625,
-0.0419921875,
0.06683349609375,
-0.007843017578125,
-0.03173828125,
0.06982421875,
0.043060302734375,
0.05609130859375,
-0.0165557861328125,
0.056549072265625,
-0.0290985107421875,
0.03790283203125,
-0.032379150390625,
0.0694580078125,
-0.07183837890625,
0.0016717910766601562,
-0.04095458984375,
-0.07037353515625,
-0.019561767578125,
0.054901123046875,
-0.0177459716796875,
0.03399658203125,
0.03656005859375,
0.06402587890625,
-0.0189056396484375,
0.005718231201171875,
0.0131378173828125,
0.01497650146484375,
0.01467132568359375,
0.0321044921875,
0.054229736328125,
-0.047607421875,
0.03228759765625,
-0.0276336669921875,
-0.0264434814453125,
-0.0216827392578125,
-0.06341552734375,
-0.0823974609375,
-0.05218505859375,
-0.039520263671875,
-0.047637939453125,
-0.004756927490234375,
0.067138671875,
0.06591796875,
-0.052947998046875,
-0.005222320556640625,
-0.01207733154296875,
-0.004436492919921875,
0.005290985107421875,
-0.0167388916015625,
0.0310211181640625,
0.01558685302734375,
-0.075439453125,
-0.001979827880859375,
-0.01044464111328125,
0.037322998046875,
-0.01654052734375,
-0.01345062255859375,
-0.01158905029296875,
-0.0037784576416015625,
0.030303955078125,
0.038543701171875,
-0.041412353515625,
-0.006977081298828125,
-0.00438690185546875,
-0.0068359375,
0.01186370849609375,
0.029876708984375,
-0.038421630859375,
0.0372314453125,
0.04718017578125,
0.014984130859375,
0.04486083984375,
0.00820159912109375,
0.0249481201171875,
-0.0386962890625,
0.007598876953125,
0.01114654541015625,
0.038421630859375,
0.0335693359375,
-0.032318115234375,
0.022430419921875,
0.00931549072265625,
-0.048736572265625,
-0.0572509765625,
0.01287841796875,
-0.0848388671875,
-0.02557373046875,
0.09283447265625,
-0.0250396728515625,
-0.033416748046875,
0.004734039306640625,
-0.0352783203125,
0.0277862548828125,
-0.044403076171875,
0.04931640625,
0.0321044921875,
-0.0311126708984375,
-0.0060272216796875,
-0.038238525390625,
0.03619384765625,
0.01788330078125,
-0.057891845703125,
-0.004085540771484375,
0.049468994140625,
0.036407470703125,
0.0290985107421875,
0.055419921875,
-0.016357421875,
0.031707763671875,
0.000782012939453125,
0.016693115234375,
-0.005382537841796875,
-0.0213775634765625,
-0.035064697265625,
0.0076904296875,
-0.0018634796142578125,
-0.01279449462890625
]
] |
adept/fuyu-8b | 2023-11-04T11:13:11.000Z | [
"transformers",
"safetensors",
"fuyu",
"text-generation",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | adept | null | null | adept/fuyu-8b | 645 | 25,895 | transformers | 2023-10-17T22:42:08 | ---
license: cc-by-nc-4.0
---
# Fuyu-8B Model Card
We’re releasing Fuyu-8B, a small version of the multimodal model that powers our product. The model is available on HuggingFace. We think Fuyu-8B is exciting because:
1. It has a much simpler architecture and training procedure than other multi-modal models, which makes it easier to understand, scale, and deploy.
2. It’s designed from the ground up for digital agents, so it can support arbitrary image resolutions, answer questions about graphs and diagrams, answer UI-based questions, and do fine-grained localization on screen images.
3. It’s fast - we can get responses for large images in less than 100 milliseconds.
4. Despite being optimized for our use-case, it performs well at standard image understanding benchmarks such as visual question-answering and natural-image-captioning.
Please note that **the model we have released is a base model. We expect you to need to finetune the model for specific use cases like verbose captioning or multimodal chat.** In our experience, the model responds well to few-shotting and fine-tuning for a variety of use-cases.
## Model
[Fuyu-8B](https://www.adept.ai/blog/fuyu-8b) is a multi-modal text and image transformer trained by [Adept AI](https://www.adept.ai/).
Architecturally, Fuyu is a vanilla decoder-only transformer - there is no image encoder.
Image patches are instead linearly projected into the first layer of the transformer, bypassing the embedding lookup.
We simply treat the transformer decoder like an image transformer (albeit with no pooling and causal attention).
See the below diagram for more details.

This simplification allows us to support arbitrary image resolutions.
To accomplish this, we treat the sequence of image tokens like the sequence of text tokens.
We remove image-specific position embeddings and feed in as many image tokens as necessary in raster-scan order.
To tell the model when a line has broken, we simply use a special image-newline character.
The model can use its existing position embeddings to reason about different image sizes, and we can use images of arbitrary size at training time, removing the need for separate high and low-resolution training stages.
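To make the idea concrete, here is a minimal PyTorch sketch of the patch-projection step described above. The patch size, hidden dimension, and layer names are illustrative assumptions, not Adept's actual implementation:

```python
import torch
import torch.nn as nn

# Illustrative sketch: flatten raster-scan image patches and project them
# linearly into the transformer's hidden dimension (no image encoder).
patch_size, channels, hidden_dim = 30, 3, 4096  # assumed example values
patch_proj = nn.Linear(patch_size * patch_size * channels, hidden_dim)

image = torch.randn(channels, 60, 90)  # a 2x3 grid of 30x30 patches
patches = image.unfold(1, patch_size, patch_size).unfold(2, patch_size, patch_size)
patches = patches.permute(1, 2, 0, 3, 4).reshape(-1, patch_size * patch_size * channels)

patch_embeddings = patch_proj(patches)  # (6, hidden_dim): fed to the decoder
# like ordinary text-token embeddings, with a special image-newline token
# inserted after each row of patches to mark line breaks.
```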
### Model Description
- **Developed by:** Adept-AI
- **Model type:** Decoder-only multi-modal transformer model
- **License:** [CC-BY-NC](https://creativecommons.org/licenses/by-nc/4.0/deed.en)
- **Model Description:** This is a multi-modal model that can consume images and text and produce text.
- **Resources for more information:** Check out our [blog post](https://www.adept.ai/blog/fuyu-8b).
## Evaluation
Though not the focus of this model, we did evaluate it on standard image understanding benchmarks:
| Eval Task | Fuyu-8B | Fuyu-Medium | LLaVA 1.5 (13.5B) | QWEN-VL (10B) | PALI-X (55B) | PALM-e-12B | PALM-e-562B |
| ------------------- | ------- | ----------------- | ----------------- | ------------- | ------------ | ---------- | ----------- |
| VQAv2 | 74.2 | 77.4 | 80 | 79.5 | 86.1 | 76.2 | 80.0 |
| OKVQA | 60.6 | 63.1 | n/a | 58.6 | 66.1 | 55.5 | 66.1 |
| COCO Captions | 141 | 138 | n/a | n/a | 149 | 135 | 138 |
| AI2D | 64.5 | 73.7 | n/a | 62.3 | 81.2 | n/a | n/a |
## How to Use
You can load the model and perform inference as follows:
```python
from transformers import FuyuProcessor, FuyuForCausalLM
from PIL import Image
import requests
# load model and processor
model_id = "adept/fuyu-8b"
processor = FuyuProcessor.from_pretrained(model_id)
model = FuyuForCausalLM.from_pretrained(model_id, device_map="cuda:0")
# prepare inputs for the model
text_prompt = "Generate a coco-style caption.\n"
url = "https://huggingface.co/adept/fuyu-8b/resolve/main/bus.png"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(text=text_prompt, images=image, return_tensors="pt").to("cuda:0")
# autoregressively generate text
generation_output = model.generate(**inputs, max_new_tokens=7)
generation_text = processor.batch_decode(generation_output[:, -7:], skip_special_tokens=True)
assert generation_text == ['A blue bus parked on the side of a road.']
```
N.B.: The token `|SPEAKER|` is a placeholder token for image patch embeddings, so it will show up in the model context (e.g., in the portion of `generation_output` representing the model context).
`|NEWLINE|` is the "image newline" token, denoting new rows in the raster scan order input of the image patches.
`\x04` is the "beginning of answer" token.
Fuyu can also perform some question answering on natural images and charts/diagrams (though fine-tuning may be required for good performance):
```python
text_prompt = "What color is the bus?\n"
url = "https://huggingface.co/adept/fuyu-8b/resolve/main/bus.png"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(text=text_prompt, images=image, return_tensors="pt").to("cuda:0")
generation_output = model.generate(**inputs, max_new_tokens=6)
generation_text = processor.batch_decode(generation_output[:, -6:], skip_special_tokens=True)
assert generation_text == ["The bus is blue.\n"]
text_prompt = "What is the highest life expectancy at birth of male?\n"
url = "https://huggingface.co/adept/fuyu-8b/resolve/main/chart.png"
image = Image.open(requests.get(url, stream=True).raw)
model_inputs = processor(text=text_prompt, images=image, return_tensors="pt").to("cuda:0")
generation_output = model.generate(**model_inputs, max_new_tokens=16)
generation_text = processor.batch_decode(generation_output[:, -16:], skip_special_tokens=True)
assert generation_text == ["The life expectancy at birth of males in 2018 is 80.7.\n"]
```
For best performance, it's recommended to end questions with `\n`, as shown above!
## Uses
### Direct Use
The model is intended for research purposes only.
**Because this is a raw model release, we have not added further finetuning, postprocessing or sampling strategies to control for undesirable outputs. You should expect to have to fine-tune the model for your use-case.**
Possible research areas and tasks include
- Applications in computer control or digital agents.
- Research on multi-modal models generally.
Excluded uses are described below.
### Out-of-Scope Use
The model was not trained to produce factual or true representations of people or events; using it to generate such content is therefore out of scope for its abilities.
## Limitations and Bias
### Limitations
- Faces and people in general may not be generated properly.
### Bias
While the capabilities of these models are impressive, they can also reinforce or exacerbate social biases. | 6,981 | [
[
-0.047607421875,
-0.0762939453125,
0.0169830322265625,
0.01451873779296875,
-0.019989013671875,
-0.0232086181640625,
-0.00630950927734375,
-0.052001953125,
0.0037899017333984375,
0.0237884521484375,
-0.046356201171875,
-0.011627197265625,
-0.04302978515625,
0.0033245086669921875,
-0.00969696044921875,
0.0675048828125,
0.009918212890625,
-0.001850128173828125,
-0.022308349609375,
-0.00745391845703125,
-0.041656494140625,
-0.01209259033203125,
-0.060150146484375,
-0.0171051025390625,
0.021514892578125,
0.0143890380859375,
0.06719970703125,
0.053924560546875,
0.040496826171875,
0.030548095703125,
-0.00705718994140625,
0.0176239013671875,
-0.0298614501953125,
-0.0186309814453125,
0.00955963134765625,
-0.03564453125,
-0.0380859375,
-0.00225067138671875,
0.038665771484375,
0.02642822265625,
0.0129241943359375,
0.01471710205078125,
-0.01142120361328125,
0.022430419921875,
-0.027099609375,
0.0258941650390625,
-0.0255889892578125,
-0.0141143798828125,
-0.01132965087890625,
0.01483917236328125,
-0.0298919677734375,
-0.0013990402221679688,
-0.01416778564453125,
-0.05419921875,
0.023529052734375,
0.0159759521484375,
0.10693359375,
0.0306549072265625,
-0.03448486328125,
0.00495147705078125,
-0.03436279296875,
0.0550537109375,
-0.06439208984375,
0.02484130859375,
0.021697998046875,
0.02886962890625,
-0.0089263916015625,
-0.063232421875,
-0.0635986328125,
-0.0153656005859375,
-0.023681640625,
0.005931854248046875,
-0.0246734619140625,
0.00702667236328125,
0.028411865234375,
0.037384033203125,
-0.0229949951171875,
-0.0104827880859375,
-0.044158935546875,
-0.019805908203125,
0.06549072265625,
0.0164794921875,
0.032989501953125,
-0.034759521484375,
-0.03564453125,
-0.0168609619140625,
-0.0222015380859375,
0.01922607421875,
0.02117919921875,
-0.003040313720703125,
-0.03070068359375,
0.036590576171875,
-0.007293701171875,
0.048980712890625,
0.0273590087890625,
-0.018768310546875,
0.0297088623046875,
-0.0222625732421875,
-0.0236358642578125,
-0.0279693603515625,
0.09295654296875,
0.0184326171875,
-0.01617431640625,
0.0039520263671875,
-0.0262908935546875,
0.007129669189453125,
0.0165557861328125,
-0.057037353515625,
-0.01549530029296875,
0.024993896484375,
-0.039642333984375,
-0.02398681640625,
-0.0006861686706542969,
-0.057861328125,
-0.0220489501953125,
-0.0021305084228515625,
0.04132080078125,
-0.0579833984375,
-0.03375244140625,
-0.0035552978515625,
-0.006130218505859375,
0.03582763671875,
0.030364990234375,
-0.0565185546875,
0.0130767822265625,
0.0278778076171875,
0.060638427734375,
-0.00518798828125,
-0.02728271484375,
-0.021636962890625,
-0.01007080078125,
-0.0095672607421875,
0.07537841796875,
-0.01885986328125,
-0.0177001953125,
-0.0384521484375,
0.0269317626953125,
-0.0130615234375,
-0.02581787109375,
0.037628173828125,
-0.024810791015625,
0.01055145263671875,
-0.017425537109375,
-0.023834228515625,
-0.03509521484375,
0.0173797607421875,
-0.07208251953125,
0.06689453125,
0.0170135498046875,
-0.06768798828125,
0.0007619857788085938,
-0.07080078125,
-0.03546142578125,
0.00179290771484375,
-0.01232147216796875,
-0.045318603515625,
-0.0069122314453125,
0.034637451171875,
0.03662109375,
-0.030548095703125,
0.02655029296875,
-0.0225830078125,
-0.025360107421875,
0.028594970703125,
-0.03521728515625,
0.07757568359375,
0.04022216796875,
-0.037628173828125,
0.01013946533203125,
-0.0374755859375,
-0.016845703125,
0.036773681640625,
-0.01189422607421875,
0.011993408203125,
-0.01824951171875,
0.024444580078125,
0.0140228271484375,
0.0250091552734375,
-0.04193115234375,
0.01177215576171875,
-0.0207061767578125,
0.033935546875,
0.054229736328125,
0.003170013427734375,
0.023956298828125,
-0.048248291015625,
0.04193115234375,
0.030548095703125,
0.028076171875,
-0.04071044921875,
-0.06195068359375,
-0.06585693359375,
-0.025665283203125,
0.016510009765625,
0.02764892578125,
-0.06256103515625,
0.0119781494140625,
-0.0120086669921875,
-0.0489501953125,
-0.051788330078125,
-0.0028018951416015625,
0.032196044921875,
0.029327392578125,
0.026641845703125,
-0.0087738037109375,
-0.030426025390625,
-0.0672607421875,
0.0079803466796875,
-0.0037136077880859375,
0.01727294921875,
0.025848388671875,
0.04931640625,
-0.033172607421875,
0.051513671875,
-0.048431396484375,
-0.016448974609375,
-0.041046142578125,
0.01123809814453125,
0.0207061767578125,
0.045867919921875,
0.0550537109375,
-0.0782470703125,
-0.0193939208984375,
0.005306243896484375,
-0.05340576171875,
0.0153656005859375,
-0.0296478271484375,
-0.0190887451171875,
0.02685546875,
0.01373291015625,
-0.059417724609375,
0.041656494140625,
0.03179931640625,
-0.033050537109375,
0.044525146484375,
-0.01438140869140625,
0.0216217041015625,
-0.09228515625,
0.0258941650390625,
0.01027679443359375,
-0.017669677734375,
-0.04364013671875,
0.028472900390625,
0.0108489990234375,
-0.008270263671875,
-0.03570556640625,
0.049591064453125,
-0.040985107421875,
0.01024627685546875,
-0.0126495361328125,
-0.0099334716796875,
0.01219940185546875,
0.06829833984375,
0.00021183490753173828,
0.038238525390625,
0.03814697265625,
-0.043304443359375,
0.04522705078125,
0.0208892822265625,
-0.0200958251953125,
0.039886474609375,
-0.06219482421875,
0.00568389892578125,
-0.0191497802734375,
0.00530242919921875,
-0.08587646484375,
-0.0281524658203125,
0.030975341796875,
-0.04693603515625,
0.03509521484375,
0.0030651092529296875,
-0.026458740234375,
-0.044403076171875,
-0.0084228515625,
0.03289794921875,
0.0310516357421875,
-0.02996826171875,
0.056549072265625,
0.0343017578125,
0.0125274658203125,
-0.03643798828125,
-0.059906005859375,
-0.00691986083984375,
-0.006160736083984375,
-0.06494140625,
0.0254974365234375,
-0.022125244140625,
-0.0023746490478515625,
0.0016689300537109375,
0.00756072998046875,
-0.0072479248046875,
0.002593994140625,
0.0295562744140625,
0.036773681640625,
-0.01727294921875,
-0.01494598388671875,
0.00815582275390625,
-0.0014362335205078125,
0.002071380615234375,
0.022552490234375,
0.042877197265625,
-0.036346435546875,
-0.018310546875,
-0.05010986328125,
0.01898193359375,
0.052764892578125,
-0.0273590087890625,
0.045166015625,
0.06134033203125,
-0.04852294921875,
0.0195770263671875,
-0.042877197265625,
0.004039764404296875,
-0.0386962890625,
0.028045654296875,
-0.035003662109375,
-0.0183868408203125,
0.052154541015625,
0.0186614990234375,
0.0172576904296875,
0.053863525390625,
0.038055419921875,
-0.006465911865234375,
0.0850830078125,
0.03997802734375,
0.003551483154296875,
0.03875732421875,
-0.0693359375,
0.00476837158203125,
-0.059783935546875,
-0.0418701171875,
-0.014739990234375,
-0.03289794921875,
-0.021575927734375,
-0.0330810546875,
0.030242919921875,
-0.0032558441162109375,
-0.01165008544921875,
0.03466796875,
-0.035400390625,
0.0148468017578125,
0.037109375,
0.035125732421875,
0.006931304931640625,
-0.01983642578125,
-0.00206756591796875,
0.006870269775390625,
-0.04620361328125,
-0.0138397216796875,
0.06280517578125,
0.0321044921875,
0.06549072265625,
-0.0037212371826171875,
0.0316162109375,
0.01438140869140625,
0.0257415771484375,
-0.053009033203125,
0.0408935546875,
0.0025615692138671875,
-0.060791015625,
-0.008087158203125,
-0.029449462890625,
-0.0697021484375,
0.0122222900390625,
-0.0242919921875,
-0.047210693359375,
0.0171051025390625,
0.0100860595703125,
-0.034698486328125,
0.034210205078125,
-0.0623779296875,
0.0904541015625,
-0.0309906005859375,
-0.0300445556640625,
0.00041675567626953125,
-0.059051513671875,
0.048095703125,
0.01428985595703125,
0.0162200927734375,
0.006244659423828125,
0.0106048583984375,
0.05902099609375,
-0.029632568359375,
0.060821533203125,
-0.012359619140625,
0.00811004638671875,
0.032745361328125,
-0.004791259765625,
0.0170135498046875,
0.0080413818359375,
-0.0002665519714355469,
0.02069091796875,
0.01947021484375,
-0.03277587890625,
-0.03680419921875,
0.04278564453125,
-0.078369140625,
-0.038970947265625,
-0.03271484375,
-0.059814453125,
0.01482391357421875,
0.0278167724609375,
0.0406494140625,
0.04296875,
0.0007090568542480469,
0.008331298828125,
0.045379638671875,
-0.03448486328125,
0.0380859375,
0.00429534912109375,
-0.0153656005859375,
-0.041900634765625,
0.06585693359375,
-0.0019283294677734375,
0.0222320556640625,
0.027313232421875,
0.0189056396484375,
-0.030426025390625,
-0.008880615234375,
-0.046661376953125,
0.03839111328125,
-0.05291748046875,
-0.03741455078125,
-0.060028076171875,
-0.0257568359375,
-0.055999755859375,
-0.00006264448165893555,
-0.032073974609375,
-0.0155792236328125,
-0.037445068359375,
0.0113067626953125,
0.0302276611328125,
0.0523681640625,
0.003963470458984375,
0.024749755859375,
-0.04669189453125,
0.058013916015625,
0.010040283203125,
0.01861572265625,
0.00372314453125,
-0.046051025390625,
-0.0163726806640625,
0.037384033203125,
-0.060333251953125,
-0.0750732421875,
0.031280517578125,
0.010345458984375,
0.0533447265625,
0.053466796875,
0.0093231201171875,
0.07293701171875,
-0.001377105712890625,
0.06219482421875,
0.0179901123046875,
-0.06744384765625,
0.042572021484375,
-0.02520751953125,
0.03192138671875,
0.008331298828125,
0.0243682861328125,
-0.0323486328125,
-0.031524658203125,
-0.06597900390625,
-0.04742431640625,
0.0701904296875,
0.007213592529296875,
0.005016326904296875,
0.0023040771484375,
0.0220947265625,
0.0036029815673828125,
0.0041351318359375,
-0.076171875,
-0.0396728515625,
-0.041595458984375,
-0.01407623291015625,
0.00901031494140625,
-0.000667572021484375,
0.00833892822265625,
-0.048126220703125,
0.05096435546875,
-0.007465362548828125,
0.057281494140625,
0.01885986328125,
-0.0182952880859375,
-0.0111541748046875,
-0.00516510009765625,
0.038604736328125,
0.029632568359375,
-0.0037670135498046875,
-0.00007873773574829102,
0.01067352294921875,
-0.045440673828125,
-0.00986480712890625,
0.0166473388671875,
-0.019073486328125,
-0.0047454833984375,
0.02264404296875,
0.0814208984375,
0.01111602783203125,
-0.0404052734375,
0.0458984375,
-0.00811767578125,
-0.01250457763671875,
-0.016448974609375,
0.0087890625,
0.007076263427734375,
0.0247955322265625,
0.0178680419921875,
0.0219573974609375,
0.004940032958984375,
-0.041229248046875,
0.00481414794921875,
0.0278778076171875,
-0.00893402099609375,
-0.02935791015625,
0.087890625,
0.002407073974609375,
-0.0207061767578125,
0.05133056640625,
-0.024871826171875,
-0.03564453125,
0.06719970703125,
0.0518798828125,
0.049896240234375,
-0.00928497314453125,
0.0235137939453125,
0.044097900390625,
0.023681640625,
-0.0033054351806640625,
0.0219573974609375,
-0.0013103485107421875,
-0.050872802734375,
-0.0092620849609375,
-0.044647216796875,
-0.0080413818359375,
0.0220794677734375,
-0.04827880859375,
0.041046142578125,
-0.033355712890625,
-0.021209716796875,
0.00615692138671875,
0.01363372802734375,
-0.0609130859375,
0.022979736328125,
0.0147247314453125,
0.0716552734375,
-0.05694580078125,
0.06072998046875,
0.060943603515625,
-0.057220458984375,
-0.06866455078125,
-0.0208587646484375,
-0.0155792236328125,
-0.06048583984375,
0.016632080078125,
0.0357666015625,
0.016387939453125,
-0.0008821487426757812,
-0.06072998046875,
-0.04998779296875,
0.099853515625,
0.038330078125,
-0.01031494140625,
-0.014801025390625,
0.005207061767578125,
0.0312347412109375,
-0.019561767578125,
0.03961181640625,
0.035675048828125,
0.046478271484375,
0.017486572265625,
-0.05438232421875,
0.0265350341796875,
-0.04736328125,
-0.0009899139404296875,
0.0007352828979492188,
-0.06341552734375,
0.08001708984375,
-0.053619384765625,
-0.0230712890625,
0.0243988037109375,
0.06256103515625,
0.01325225830078125,
0.0305023193359375,
0.041351318359375,
0.048583984375,
0.04034423828125,
-0.011993408203125,
0.0858154296875,
-0.018798828125,
0.0367431640625,
0.05419921875,
0.02532958984375,
0.0433349609375,
0.03192138671875,
-0.0115509033203125,
0.03240966796875,
0.053070068359375,
-0.02484130859375,
0.0308074951171875,
-0.01702880859375,
0.007495880126953125,
-0.0184783935546875,
-0.00710296630859375,
-0.0263519287109375,
0.0286865234375,
0.0246734619140625,
-0.0204010009765625,
-0.0007028579711914062,
0.015594482421875,
0.00817108154296875,
-0.016326904296875,
-0.01312255859375,
0.0377197265625,
0.0003979206085205078,
-0.053070068359375,
0.056365966796875,
0.0167388916015625,
0.056854248046875,
-0.041046142578125,
0.0022640228271484375,
-0.0207061767578125,
0.01271820068359375,
-0.020111083984375,
-0.0513916015625,
0.02032470703125,
-0.017669677734375,
0.00008553266525268555,
-0.004756927490234375,
0.03985595703125,
-0.0279083251953125,
-0.0506591796875,
0.0031414031982421875,
0.0140228271484375,
0.017608642578125,
-0.0055389404296875,
-0.0704345703125,
0.01541900634765625,
0.0020904541015625,
-0.0281829833984375,
-0.002079010009765625,
0.02764892578125,
-0.0018072128295898438,
0.058837890625,
0.0400390625,
-0.00838470458984375,
0.01776123046875,
-0.0175323486328125,
0.069580078125,
-0.04705810546875,
-0.031982421875,
-0.044464111328125,
0.056976318359375,
-0.00238037109375,
-0.02886962890625,
0.045379638671875,
0.03955078125,
0.06072998046875,
-0.01024627685546875,
0.045654296875,
-0.03912353515625,
-0.0081787109375,
-0.02093505859375,
0.0655517578125,
-0.0626220703125,
0.005512237548828125,
-0.0294647216796875,
-0.039886474609375,
-0.01175689697265625,
0.051025390625,
-0.0144195556640625,
0.00882720947265625,
0.0455322265625,
0.0743408203125,
-0.0226898193359375,
0.005207061767578125,
0.0205535888671875,
0.023956298828125,
0.024993896484375,
0.05767822265625,
0.05035400390625,
-0.07354736328125,
0.048980712890625,
-0.053253173828125,
-0.0153656005859375,
-0.00865936279296875,
-0.044403076171875,
-0.06939697265625,
-0.06353759765625,
-0.0239410400390625,
-0.02362060546875,
-0.0255584716796875,
0.05010986328125,
0.054931640625,
-0.048797607421875,
-0.00911712646484375,
0.007617950439453125,
0.020416259765625,
-0.034942626953125,
-0.023345947265625,
0.015899658203125,
-0.007129669189453125,
-0.06610107421875,
-0.004978179931640625,
-0.00611114501953125,
0.01393890380859375,
-0.01367950439453125,
-0.016326904296875,
-0.03509521484375,
-0.0038700103759765625,
0.025665283203125,
0.0213470458984375,
-0.051025390625,
-0.01412200927734375,
0.019195556640625,
-0.00859832763671875,
0.01690673828125,
0.0396728515625,
-0.04547119140625,
0.0313720703125,
0.0384521484375,
0.037139892578125,
0.04364013671875,
-0.006153106689453125,
0.0271148681640625,
-0.03155517578125,
0.0169830322265625,
-0.0067596435546875,
0.054412841796875,
0.025665283203125,
-0.053558349609375,
0.021697998046875,
0.032562255859375,
-0.0367431640625,
-0.064453125,
0.0033512115478515625,
-0.09552001953125,
-0.0263214111328125,
0.08648681640625,
-0.0130157470703125,
-0.04510498046875,
0.007190704345703125,
-0.041107177734375,
0.042877197265625,
-0.0214691162109375,
0.050018310546875,
0.022491455078125,
-0.01039886474609375,
-0.0303955078125,
-0.025054931640625,
0.00879669189453125,
0.005474090576171875,
-0.066162109375,
-0.018096923828125,
0.03717041015625,
0.032012939453125,
0.0256500244140625,
0.04547119140625,
-0.018585205078125,
0.0421142578125,
0.01229095458984375,
0.012542724609375,
-0.033203125,
-0.0215301513671875,
-0.0206756591796875,
-0.00960540771484375,
-0.0171966552734375,
-0.03057861328125
]
] |
nerijs/pixel-art-xl | 2023-08-03T19:23:43.000Z | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"license:creativeml-openrail-m",
"has_space",
"region:us"
] | text-to-image | nerijs | null | null | nerijs/pixel-art-xl | 190 | 25,652 | diffusers | 2023-08-03T19:13:23 | ---
license: creativeml-openrail-m
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: pixel art
widget:
- text: pixel art, a cute corgi, simple, flat colors
---
# Pixel Art XL
## Consider supporting further research on [Patreon](https://www.patreon.com/user?u=29466374) or [Twitter](https://twitter.com/nerijs)


Downscale 8 times to get pixel-perfect images (use Nearest Neighbors interpolation)
Use a fixed VAE to avoid artifacts (0.9 or fp16 fix)
### Tips:
Don't use the refiner
Works great with only 1 text encoder
No style prompt required
No trigger keyword required
Works great with isometric and non-isometric styles
Works with 0.9 and 1.0
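A minimal `diffusers` sketch that follows the tips above (base model only, no refiner); the prompt, step count, and repo id for the LoRA weights are example assumptions:

```python
import torch
from diffusers import DiffusionPipeline
from PIL import Image

# Load SDXL base and apply the Pixel Art XL LoRA.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    variant="fp16",
).to("cuda")
pipe.load_lora_weights("nerijs/pixel-art-xl")

image = pipe("pixel art, a cute corgi, simple, flat colors",
             num_inference_steps=30).images[0]

# Downscale 8x with nearest-neighbor resampling for pixel-perfect output.
small = image.resize((image.width // 8, image.height // 8), Image.NEAREST)
```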
#### Changelog
v1: Initial release | 1,014 | [
[
-0.04681396484375,
-0.033905029296875,
0.047210693359375,
0.006999969482421875,
-0.0303955078125,
0.007228851318359375,
-0.0006899833679199219,
-0.04931640625,
0.062744140625,
0.0408935546875,
-0.06402587890625,
-0.046417236328125,
-0.050262451171875,
0.00899505615234375,
-0.025634765625,
0.0596923828125,
-0.0170440673828125,
0.0007610321044921875,
0.008941650390625,
0.00464630126953125,
0.0020847320556640625,
-0.0165252685546875,
-0.06402587890625,
-0.00682830810546875,
0.042572021484375,
0.038116455078125,
0.06573486328125,
0.036590576171875,
0.0288848876953125,
0.02056884765625,
-0.0021266937255859375,
-0.006397247314453125,
-0.044158935546875,
-0.018096923828125,
0.005405426025390625,
-0.01532745361328125,
-0.04010009765625,
-0.0014696121215820312,
0.0276641845703125,
0.0009517669677734375,
-0.0005564689636230469,
0.01690673828125,
-0.00661468505859375,
0.06719970703125,
-0.048126220703125,
-0.0178070068359375,
-0.005603790283203125,
0.015869140625,
-0.0085906982421875,
-0.003665924072265625,
0.0034942626953125,
-0.0521240234375,
-0.00830078125,
-0.08392333984375,
0.0309600830078125,
-0.00878143310546875,
0.0845947265625,
0.0099334716796875,
0.0018911361694335938,
-0.006000518798828125,
-0.02191162109375,
0.038848876953125,
-0.051300048828125,
0.020721435546875,
0.01317596435546875,
0.0218505859375,
0.013397216796875,
-0.081298828125,
-0.044219970703125,
0.03240966796875,
0.0199127197265625,
0.00902557373046875,
-0.0460205078125,
-0.0159149169921875,
0.00852203369140625,
0.025146484375,
-0.035186767578125,
0.006000518798828125,
-0.02752685546875,
-0.0207977294921875,
0.04266357421875,
0.0167083740234375,
0.042877197265625,
0.0189361572265625,
-0.0311431884765625,
-0.0000635385513305664,
-0.059814453125,
-0.0002639293670654297,
0.0457763671875,
-0.0266265869140625,
-0.040985107421875,
0.0206756591796875,
0.002666473388671875,
0.017669677734375,
0.0214385986328125,
0.03643798828125,
0.0233001708984375,
-0.0435791015625,
-0.006023406982421875,
-0.0135955810546875,
0.065185546875,
0.04949951171875,
0.007320404052734375,
-0.0081939697265625,
-0.004634857177734375,
-0.0003235340118408203,
0.041778564453125,
-0.10333251953125,
-0.04132080078125,
0.01233673095703125,
-0.0469970703125,
-0.016632080078125,
-0.0100860595703125,
-0.0924072265625,
-0.039886474609375,
-0.01910400390625,
0.037811279296875,
-0.041656494140625,
-0.03973388671875,
0.0149078369140625,
-0.0211334228515625,
0.00775909423828125,
0.035858154296875,
-0.06976318359375,
0.0163726806640625,
0.0260772705078125,
0.04559326171875,
0.033447265625,
0.01885986328125,
-0.004726409912109375,
-0.03125,
-0.04620361328125,
0.061920166015625,
-0.025848388671875,
-0.0416259765625,
-0.00623321533203125,
0.0180511474609375,
0.01983642578125,
-0.02386474609375,
0.0750732421875,
-0.0137786865234375,
-0.006622314453125,
-0.0305023193359375,
-0.01959228515625,
-0.01457977294921875,
-0.002925872802734375,
-0.04339599609375,
0.050018310546875,
0.022796630859375,
-0.042510986328125,
0.038543701171875,
-0.054412841796875,
0.02655029296875,
0.028564453125,
-0.023712158203125,
-0.041595458984375,
0.0252532958984375,
0.0230560302734375,
0.0159454345703125,
0.004077911376953125,
-0.053009033203125,
-0.033905029296875,
-0.02178955078125,
0.006565093994140625,
0.0015230178833007812,
0.041107177734375,
0.0154571533203125,
-0.044769287109375,
0.00717926025390625,
-0.05267333984375,
0.02227783203125,
0.0352783203125,
0.01313018798828125,
0.005977630615234375,
-0.01165008544921875,
0.03472900390625,
0.0180816650390625,
0.04449462890625,
-0.071533203125,
0.0177459716796875,
0.0032863616943359375,
0.0285186767578125,
0.0633544921875,
-0.0140533447265625,
0.0277252197265625,
-0.0159759521484375,
0.042877197265625,
-0.013916015625,
0.0197601318359375,
-0.0017414093017578125,
-0.0240631103515625,
-0.059539794921875,
-0.01971435546875,
0.01444244384765625,
0.0031375885009765625,
-0.043670654296875,
0.0179595947265625,
-0.02178955078125,
-0.035400390625,
-0.0230712890625,
-0.0029163360595703125,
0.0190277099609375,
0.0073089599609375,
0.006679534912109375,
-0.050079345703125,
-0.03948974609375,
-0.06256103515625,
0.021697998046875,
0.0238800048828125,
0.0146331787109375,
0.054901123046875,
0.0341796875,
-0.0251617431640625,
0.055816650390625,
-0.055145263671875,
-0.00887298583984375,
-0.0050048828125,
0.0015497207641601562,
0.018402099609375,
0.0472412109375,
0.07635498046875,
-0.06182861328125,
-0.046356201171875,
0.0044097900390625,
-0.0249786376953125,
-0.00563812255859375,
0.0198822021484375,
-0.010955810546875,
0.040008544921875,
0.02642822265625,
-0.036163330078125,
0.06793212890625,
0.0540771484375,
-0.06982421875,
0.058837890625,
-0.0259552001953125,
0.043243408203125,
-0.06842041015625,
0.0220947265625,
0.035369873046875,
-0.02899169921875,
-0.025848388671875,
0.0195159912109375,
0.007541656494140625,
-0.0243072509765625,
-0.0312347412109375,
0.032806396484375,
-0.022125244140625,
0.0188140869140625,
-0.03485107421875,
-0.01372528076171875,
0.01111602783203125,
0.016326904296875,
0.01898193359375,
0.0548095703125,
0.048370361328125,
-0.0230865478515625,
0.01486968994140625,
0.0200653076171875,
-0.05462646484375,
0.03997802734375,
-0.08282470703125,
0.026397705078125,
-0.0026798248291015625,
0.003662109375,
-0.082275390625,
-0.002696990966796875,
0.037567138671875,
-0.0704345703125,
0.0230560302734375,
-0.0001798868179321289,
-0.04473876953125,
-0.0210113525390625,
-0.02984619140625,
0.0472412109375,
0.0567626953125,
-0.0382080078125,
0.0032901763916015625,
0.027130126953125,
-0.01268768310546875,
-0.0181427001953125,
-0.03387451171875,
0.02001953125,
-0.00159454345703125,
-0.06585693359375,
0.0533447265625,
-0.00656890869140625,
0.0032939910888671875,
0.0164642333984375,
-0.0199127197265625,
-0.027862548828125,
-0.0199127197265625,
0.021148681640625,
0.0190277099609375,
-0.026275634765625,
-0.0281829833984375,
-0.0249786376953125,
-0.03216552734375,
-0.000011682510375976562,
-0.0158843994140625,
0.0518798828125,
-0.006954193115234375,
-0.01049041748046875,
-0.06756591796875,
0.0153350830078125,
0.0531005859375,
0.0176849365234375,
0.02398681640625,
0.0673828125,
-0.029327392578125,
0.002044677734375,
-0.0413818359375,
-0.027130126953125,
-0.03692626953125,
0.01100921630859375,
-0.0158843994140625,
-0.05963134765625,
0.048980712890625,
0.00634002685546875,
0.007061004638671875,
0.0250396728515625,
0.0192108154296875,
-0.0110931396484375,
0.081787109375,
0.05841064453125,
-0.00838470458984375,
0.0372314453125,
-0.05718994140625,
0.0193939208984375,
-0.0670166015625,
-0.0221099853515625,
-0.03741455078125,
-0.05035400390625,
-0.0345458984375,
-0.0300750732421875,
0.00920867919921875,
-0.003482818603515625,
-0.01554107666015625,
0.05517578125,
-0.0379638671875,
0.042938232421875,
0.02484130859375,
0.048095703125,
-0.01513671875,
0.01299285888671875,
0.0090484619140625,
-0.0303802490234375,
-0.033538818359375,
-0.024688720703125,
0.06744384765625,
0.009918212890625,
0.050537109375,
0.0057830810546875,
0.054901123046875,
-0.0028667449951171875,
-0.026336669921875,
-0.03936767578125,
0.0291595458984375,
-0.02667236328125,
-0.052886962890625,
0.002162933349609375,
-0.0186004638671875,
-0.08697509765625,
-0.00284576416015625,
-0.03619384765625,
-0.05035400390625,
0.0282745361328125,
0.00615692138671875,
-0.03216552734375,
0.048004150390625,
-0.03973388671875,
0.0672607421875,
-0.006317138671875,
-0.04949951171875,
-0.0310821533203125,
-0.0269317626953125,
-0.0184478759765625,
0.0118255615234375,
0.012237548828125,
-0.01395416259765625,
-0.0303955078125,
0.0296478271484375,
-0.044769287109375,
0.06561279296875,
-0.00495147705078125,
-0.0032806396484375,
0.00873565673828125,
0.01727294921875,
0.024627685546875,
0.0101318359375,
0.003009796142578125,
0.0240020751953125,
0.014617919921875,
-0.04254150390625,
-0.0341796875,
0.056304931640625,
-0.06524658203125,
-0.0164337158203125,
-0.022308349609375,
-0.014678955078125,
0.018280029296875,
0.015777587890625,
0.046905517578125,
0.04345703125,
-0.0139923095703125,
0.0251922607421875,
0.03338623046875,
-0.0104217529296875,
0.0465087890625,
0.024810791015625,
-0.0218505859375,
-0.049346923828125,
0.061248779296875,
0.0035266876220703125,
0.00797271728515625,
0.00794219970703125,
0.0257568359375,
-0.033111572265625,
-0.035491943359375,
-0.046417236328125,
0.00946044921875,
-0.0313720703125,
-0.00698089599609375,
-0.019500732421875,
-0.007476806640625,
-0.04571533203125,
-0.04364013671875,
-0.0303802490234375,
-0.032318115234375,
-0.054840087890625,
0.0186614990234375,
0.036407470703125,
0.05596923828125,
-0.0279541015625,
0.0016889572143554688,
-0.057037353515625,
0.0254974365234375,
0.002288818359375,
0.037872314453125,
-0.0179901123046875,
-0.034759521484375,
-0.01027679443359375,
-0.00016641616821289062,
-0.047821044921875,
-0.0263214111328125,
0.05169677734375,
-0.007221221923828125,
0.03515625,
0.028717041015625,
-0.00855255126953125,
0.049957275390625,
-0.029449462890625,
0.0596923828125,
0.055419921875,
-0.07135009765625,
0.036102294921875,
-0.04400634765625,
0.043365478515625,
0.06158447265625,
0.029449462890625,
-0.0452880859375,
-0.003978729248046875,
-0.07080078125,
-0.08062744140625,
0.0511474609375,
0.0284271240234375,
0.0219879150390625,
0.0177001953125,
0.027496337890625,
0.014373779296875,
0.001644134521484375,
-0.058197021484375,
-0.028717041015625,
-0.02685546875,
-0.007843017578125,
-0.018310546875,
-0.025787353515625,
0.01062774658203125,
-0.048858642578125,
0.056396484375,
-0.00026345252990722656,
0.029693603515625,
0.036895751953125,
0.0257415771484375,
-0.0124664306640625,
-0.01122283935546875,
0.054229736328125,
0.072021484375,
-0.01568603515625,
-0.025421142578125,
-0.0310821533203125,
-0.045867919921875,
-0.00963592529296875,
-0.01543426513671875,
-0.0223236083984375,
0.012725830078125,
-0.01505279541015625,
0.05938720703125,
0.0166778564453125,
-0.0263519287109375,
0.0360107421875,
-0.01219940185546875,
-0.044891357421875,
-0.03900146484375,
-0.005992889404296875,
-0.005542755126953125,
0.006855010986328125,
0.0225372314453125,
0.0322265625,
0.0194091796875,
-0.0114593505859375,
0.0229034423828125,
0.0169219970703125,
-0.058197021484375,
-0.0297698974609375,
0.031951904296875,
0.035400390625,
-0.0372314453125,
0.02850341796875,
-0.0191802978515625,
-0.047393798828125,
0.0767822265625,
0.047882080078125,
0.06927490234375,
0.0209503173828125,
0.03778076171875,
0.0692138671875,
0.0225677490234375,
0.018157958984375,
0.04730224609375,
-0.0021514892578125,
-0.044464111328125,
-0.0178070068359375,
-0.02142333984375,
-0.044464111328125,
0.01763916015625,
-0.0269927978515625,
0.046234130859375,
-0.054595947265625,
-0.00399017333984375,
0.00687408447265625,
0.0030727386474609375,
-0.0251617431640625,
0.0229339599609375,
0.03192138671875,
0.083984375,
-0.017608642578125,
0.03521728515625,
0.044952392578125,
-0.031341552734375,
-0.05865478515625,
-0.01378631591796875,
0.032073974609375,
-0.0616455078125,
0.043731689453125,
0.0305633544921875,
-0.00865936279296875,
0.0225982666015625,
-0.05963134765625,
-0.0654296875,
0.0794677734375,
0.0170440673828125,
-0.04010009765625,
0.050384521484375,
-0.01457977294921875,
0.01837158203125,
-0.03253173828125,
0.016632080078125,
0.01044464111328125,
0.02899169921875,
0.0239410400390625,
-0.04461669921875,
0.0005154609680175781,
-0.04937744140625,
-0.00609588623046875,
0.048583984375,
-0.041778564453125,
0.041534423828125,
-0.019256591796875,
-0.038970947265625,
0.044830322265625,
0.0207672119140625,
-0.006702423095703125,
0.0215301513671875,
0.033843994140625,
0.08209228515625,
0.02069091796875,
-0.0283966064453125,
0.08197021484375,
-0.032562255859375,
0.025848388671875,
0.0472412109375,
0.010589599609375,
0.0306243896484375,
0.02288818359375,
-0.01380157470703125,
0.06353759765625,
0.062164306640625,
-0.0154571533203125,
0.03228759765625,
0.0175628662109375,
-0.0247039794921875,
-0.010650634765625,
-0.01105499267578125,
-0.03741455078125,
0.006313323974609375,
0.01543426513671875,
-0.0250396728515625,
-0.0225067138671875,
-0.0178070068359375,
0.0015401840209960938,
0.02178955078125,
-0.047515869140625,
0.03631591796875,
-0.01087188720703125,
-0.00589752197265625,
0.035919189453125,
-0.0171356201171875,
0.076416015625,
-0.0498046875,
-0.00002777576446533203,
0.0028400421142578125,
0.0278778076171875,
-0.040252685546875,
-0.08294677734375,
0.0230255126953125,
-0.005855560302734375,
-0.00830841064453125,
-0.0280609130859375,
0.046722412109375,
0.00998687744140625,
-0.051025390625,
0.01336669921875,
-0.01409149169921875,
0.0148468017578125,
-0.0006313323974609375,
-0.0692138671875,
0.029876708984375,
0.004703521728515625,
0.00402069091796875,
0.002521514892578125,
0.024383544921875,
0.0357666015625,
0.0322265625,
0.044158935546875,
0.0173187255859375,
0.00445556640625,
-0.0012111663818359375,
0.049957275390625,
-0.0377197265625,
-0.0199432373046875,
-0.07354736328125,
0.03265380859375,
-0.02984619140625,
-0.042510986328125,
0.060333251953125,
0.0587158203125,
0.057769775390625,
-0.0264434814453125,
0.04815673828125,
-0.0278167724609375,
0.01197052001953125,
-0.03460693359375,
0.0682373046875,
-0.060791015625,
-0.037506103515625,
-0.021453857421875,
-0.04718017578125,
-0.0133056640625,
0.046112060546875,
0.0008077621459960938,
-0.006099700927734375,
0.05548095703125,
0.06658935546875,
0.004367828369140625,
0.00701904296875,
0.0198822021484375,
-0.0168914794921875,
0.010833740234375,
0.040771484375,
0.042938232421875,
-0.045623779296875,
0.0355224609375,
-0.046234130859375,
-0.0137176513671875,
-0.0025081634521484375,
-0.0667724609375,
-0.03656005859375,
-0.041900634765625,
-0.046112060546875,
-0.04339599609375,
-0.0302581787109375,
0.054229736328125,
0.08404541015625,
-0.0538330078125,
-0.00341796875,
-0.0343017578125,
0.00809478759765625,
0.010711669921875,
-0.01482391357421875,
0.005466461181640625,
-0.0012111663818359375,
-0.07989501953125,
-0.0170135498046875,
0.0278167724609375,
0.028076171875,
0.008544921875,
-0.00934600830078125,
0.005580902099609375,
-0.005405426025390625,
0.0252532958984375,
0.04339599609375,
-0.035614013671875,
-0.02191162109375,
0.0025920867919921875,
0.002140045166015625,
0.0257568359375,
0.07159423828125,
-0.0333251953125,
0.016754150390625,
0.06561279296875,
0.040985107421875,
0.0290679931640625,
-0.01678466796875,
0.0245819091796875,
-0.0200653076171875,
0.00542449951171875,
0.0104217529296875,
0.0416259765625,
0.006717681884765625,
-0.02740478515625,
0.05126953125,
0.0272369384765625,
-0.05352783203125,
-0.0328369140625,
0.014892578125,
-0.09991455078125,
-0.03094482421875,
0.0693359375,
0.00327301025390625,
-0.0144805908203125,
0.0065155029296875,
-0.05169677734375,
-0.02130126953125,
-0.0469970703125,
0.057647705078125,
0.044464111328125,
-0.0074920654296875,
-0.02703857421875,
-0.018951416015625,
0.0190277099609375,
0.0208740234375,
-0.053955078125,
-0.023529052734375,
0.05975341796875,
0.0192413330078125,
0.046417236328125,
0.05084228515625,
-0.051544189453125,
0.04107666015625,
0.0258331298828125,
0.040374755859375,
0.00440216064453125,
0.007404327392578125,
-0.0249786376953125,
-0.0015401840209960938,
-0.018341064453125,
-0.051300048828125
]
] |
bigscience/bloom | 2023-07-28T17:50:20.000Z | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"bloom",
"text-generation",
"ak",
"ar",
"as",
"bm",
"bn",
"ca",
"code",
"en",
"es",
"eu",
"fon",
"fr",
"gu",
"hi",
"id",
"ig",
"ki",
"kn",
"lg",
"ln",
"ml",
"mr",
"ne",
"nso",
"ny",
"or",
"pa",
"pt",
"rn",
"rw",
"sn",
"st",
"sw",
"ta",
"te",
"tn",
"ts",
"tum",
"tw",
"ur",
"vi",
"wo",
"xh",
"yo",
"zh",
"zu",
"arxiv:2211.05100",
"arxiv:1909.08053",
"arxiv:2110.02861",
"arxiv:2108.12409",
"doi:10.57967/hf/0003",
"license:bigscience-bloom-rail-1.0",
"model-index",
"co2_eq_emissions",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | bigscience | null | null | bigscience/bloom | 4,132 | 25,644 | transformers | 2022-05-19T11:53:33 | ---
license: bigscience-bloom-rail-1.0
language:
- ak
- ar
- as
- bm
- bn
- ca
- code
- en
- es
- eu
- fon
- fr
- gu
- hi
- id
- ig
- ki
- kn
- lg
- ln
- ml
- mr
- ne
- nso
- ny
- or
- pa
- pt
- rn
- rw
- sn
- st
- sw
- ta
- te
- tn
- ts
- tum
- tw
- ur
- vi
- wo
- xh
- yo
- zh
- zu
programming_language:
- C
- C++
- C#
- Go
- Java
- JavaScript
- Lua
- PHP
- Python
- Ruby
- Rust
- Scala
- TypeScript
pipeline_tag: text-generation
widget:
- text: 'A "whatpu" is a small, furry animal native to Tanzania. An example of a sentence that uses the word whatpu is: We were traveling in Africa and we saw these very cute whatpus. | To do a "farduddle" means to jump up and down really fast. An example of a sentence that uses the word farduddle is:'
example_title: Imaginary word
group: English
- text: 'Un "whatpu" est un petit animal à fourrure originaire de Tanzanie. Un exemple de phrase qui utilise le mot whatpu est: Nous étions en Afrique et nous avons vu des whatpus trop mignons. Faire un "farduddle" veut dire sauter sur place vraiment vite. Un exemple de phrase qui utilise le mot farduddle est:'
example_title: Imaginary word
group: French
- text: 'Un "whatpu" es un pequeño animal peludo nativo de Tanzania. Un ejemplo de una oración que usa la palabra whatpu es: Estábamos viajando por África y vimos estos whatpus muy bonitos. Hacer un "farduddle" significa saltar arriba y abajo muy rápido. Un ejemplo de una oración que usa la palabra farduddle es:'
example_title: Imaginary word
group: Spanish
- text: ' ال"واتبو" هو حيوان صغير مكسو بالفراء يعيش في تنزانيا. مثال على جملة تستخدم كلمة واتبو هي: كنا نسافر في افريقيا و رأينا هؤلاء الواتبو اللطفاء. للقيام ب"فاردادل" يعني ان تقفز للأعلى و الأسفل بسرعة كبيرة. مثال على جملة تستخدم كلمة فاردادل هي:'
example_title: Imaginary word
group: Arabic
- text: 'Um "whatpu" é um pequeno animal peludo nativo da Tanzânia. Um exemplo de uma frase que usa a palavra whatpu é: Estávamos a viajar por África e vimos uns whatpus muito queridos. Fazer um "farduddle" significa saltar para cima e para baixo muito rápido. Um exemplo de uma frase que usa a palavra farduddle é:'
  example_title: Imaginary word
group: Portuguese
- text: Pour déguster un ortolan, il faut tout d'abord
example_title: Recipe
group: French
- text: |-
34+10=44
54+20=
example_title: Addition
group: Math
- text: |-
This tool converts irregular verbs to past tense.
Arise - Arose
Become - Became
Forget - Forgot
Freeze -
example_title: Irregular verbs
group: English
- text: |-
Please unscramble the letters into a word, and write that word:
r e!c.i p r o.c a/l = reciprocal
d.o m i!n a n.t =
example_title: Word unscrambling
group: English
- text: |-
Estos ejemplos quitan vocales de las palabras
Ejemplos:
hola - hl
manzana - mnzn
papas - pps
alacran - lcrn
papa -
example_title: Vowel removal
group: Spanish
- text: |-
Traduce español de España a español de Argentina
El coche es rojo - el auto es rojo
El ordenador es nuevo - la computadora es nueva
el boligrafo es negro - lapicera es negra
la nevera
example_title: Spanish to Argentinian Spanish
group: Spanish
- text: To say "I love you" in Hindi, you would say
example_title: Translation to Hindi
group: English
- text: To say "I love you" in Hindi, you would say
example_title: Translation from English
group: Hindi
- text: 'Poor English: She no went to the market. Corrected English:'
example_title: Grammar exercise 1
group: English
- text: 'استخراج العدد العاملي في لغة بايثون:'
example_title: Code generation
group: Arabic
- text: 'Regexp. Here is a regular expression to match a word starting with a number and then having only vowels:'
example_title: Regular expressions
group: English
- text: |-
Do a hello world in different languages:
Python: print("hello world")
R:
example_title: Code generation
group: English
- text: |-
Which is the correct preposition? I'm born X July. X is the preposition in
He sat X a chair. X is the preposition on
She drove X the bridge. X is the preposition
example_title: Grammar exercise 2
group: English
- text: |-
Traduction en français: Dans cet essai je vais m'interroger sur la conscience des modèles d'intelligence artificielle récents comme les modèles de langue. Pour commencer, je m'intéresserai à la notion de conscience et à ce qui la caractérise. Ensuite, j'aborderai la question de l'intelligence et de son lien avec le langage. Enfin, dans une dernière partie je me pencherai sur le cas de l'IA et sur sa conscience.
Traduction en espagnol:
example_title: Translation to Spanish
group: French
- text: |-
Traducción al francés: Dans cet essai je vais m'interroger sur la conscience des modèles d'intelligence artificielle récents comme les modèles de langue. Pour commencer, je m'intéresserai à la notion de conscience et à ce qui la caractérise. Ensuite, j'aborderai la question de l'intelligence et de son lien avec le langage. Enfin, dans une dernière partie je me pencherai sur le cas de l'IA et sur sa conscience.
Traducción al español:
example_title: Translation from French
group: Spanish
- text: ذات مرة ، عاش شبل الدب في الغابة
example_title: Fairy tale
group: Arabic
- text: एक बार की बात है, जंगल में एक भालू का शावक रहता था
example_title: Fairy tale
group: Hindi
- text: Il était une fois une licorne qui vivait
example_title: Fairy tale
group: French
- text: |-
Q: A juggler can juggle 16 balls. Half of the balls are golf balls, and half of the golf balls are blue. How many blue golf balls are there?
A: Let's think step by step.
example_title: Mathematical reasoning
group: English
co2_eq_emissions:
emissions: 24_700_000
source: "Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model. https://arxiv.org/abs/2211.02001"
training_type: "pre-training"
geographical_location: "Orsay, France"
hardware_used: "384 A100 80GB GPUs"
model-index:
- name: bloom
results:
- task:
type: text-generation
dataset:
type: openai_humaneval
name: humaneval
metrics:
- name: pass@1
type: pass@1
value: 0.15542682926829265
verified: false
- name: pass@10
type: pass@10
value: 0.3278356276947017
verified: false
- name: pass@100
type: pass@100
value: 0.5719815685597749
verified: false
---
<img src="https://cdn-uploads.huggingface.co/production/uploads/1657124309515-5f17f0a0925b9863e28ad517.png" alt="BigScience Logo" width="800" style="display:block; margin-left:auto; margin-right:auto"/>
BigScience Large Open-science Open-access Multilingual Language Model
Version 1.3 / 6 July 2022
Current Checkpoint: **Training Iteration 95000**
Link to paper: [here](https://arxiv.org/abs/2211.05100)
Total seen tokens: **366B**
---
# Model Details
BLOOM is an autoregressive Large Language Model (LLM), trained to continue text from a prompt on vast amounts of text data using industrial-scale computational resources. As such, it is able to output coherent text in 46 languages and 13 programming languages that can be hard to distinguish from text written by humans. BLOOM can also be instructed to perform text tasks it hasn't been explicitly trained for, by casting them as text generation tasks.
## Basics
*This section provides information about the model type, version, license, funders, release date, developers, and contact information.*
*It is useful for anyone who wants to reference the model.*
<details>
<summary>Click to expand</summary>
**Developed by:** BigScience ([website](https://bigscience.huggingface.co))
*All collaborators are either volunteers or have an agreement with their employer. (Further breakdown of participants forthcoming.)*
**Model Type:** Transformer-based Language Model
**Checkpoints format:** `transformers` (Megatron-DeepSpeed format available [here](https://huggingface.co/bigscience/bloom-optimizer-states))
**Version:** 1.0.0
**Languages:** Multiple; see [training data](#training-data)
**License:** RAIL License v1.0 ([link](https://huggingface.co/spaces/bigscience/license) / [article and FAQ](https://bigscience.huggingface.co/blog/the-bigscience-rail-license))
**Release Date Estimate:** Monday, 11 July 2022
**Send Questions to:** bigscience-contact@googlegroups.com
**Cite as:** BigScience, _BigScience Large Open-science Open-access Multilingual (BLOOM) Language Model_. International, May 2021-May 2022
**Funded by:**
* The French government.
* Hugging Face ([website](https://huggingface.co)).
* Organizations of contributors. *(Further breakdown of organizations forthcoming.)*
</details>
## Technical Specifications
*This section includes details about the model objective and architecture, and the compute infrastructure.*
*It is useful for people interested in model development.*
<details>
<summary>Click to expand</summary>
Please see [the BLOOM training README](https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml#readme) for full details on replicating training.
### Model Architecture and Objective
* Modified from Megatron-LM GPT2 (see [paper](https://arxiv.org/abs/1909.08053), [BLOOM Megatron code](https://github.com/bigscience-workshop/Megatron-DeepSpeed)):
* Decoder-only architecture
* Layer normalization applied to word embeddings layer (`StableEmbedding`; see [code](https://github.com/facebookresearch/bitsandbytes), [paper](https://arxiv.org/pdf/2110.02861.pdf))
* ALiBi positional encodings (see [paper](https://arxiv.org/pdf/2108.12409.pdf)) and GeLU activation functions (see the sketch after this list)
* 176,247,271,424 parameters:
* 3,596,615,680 embedding parameters
* 70 layers, 112 attention heads
* Hidden layers are 14336-dimensional
* Sequence length of 2048 tokens used (see [BLOOM tokenizer](https://huggingface.co/bigscience/tokenizer), [tokenizer description](#tokenization))
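As a rough illustration of the linear bias ALiBi adds to attention scores, here is a minimal sketch following Press et al. (2021), not the exact Megatron-DeepSpeed implementation; the causal mask is assumed to be applied separately:

```python
# Minimal ALiBi sketch: each head h (1-indexed, out of a power-of-two count n)
# penalizes attention to distant keys with slope 2 ** (-8 * h / n).
import torch

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    slopes = torch.tensor([2.0 ** (-8.0 * h / n_heads) for h in range(1, n_heads + 1)])
    pos = torch.arange(seq_len)
    # distance[i, j] = j - i: zero on the diagonal, negative for past keys;
    # future keys (j > i) are clamped to zero here and handled by the causal mask.
    distance = (pos[None, :] - pos[:, None]).clamp(max=0).float()
    # Shape (n_heads, seq_len, seq_len); added to attention scores before softmax.
    return slopes[:, None, None] * distance[None, :, :]

bias = alibi_bias(n_heads=16, seq_len=8)
# BLOOM's 112 heads are not a power of two; the paper defines an extra rule for that case.
```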
**Objective Function:** Cross Entropy with mean reduction (see [API documentation](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss)).
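For next-token prediction, this objective amounts to shifting the labels one position relative to the logits. A minimal PyTorch sketch (shapes are illustrative; only the vocabulary size matches the tokenizer described below):

```python
# Minimal sketch of the next-token cross-entropy objective with mean reduction.
import torch
import torch.nn as nn

batch, seq_len, vocab_size = 2, 16, 250_680
logits = torch.randn(batch, seq_len, vocab_size)          # stand-in model output
tokens = torch.randint(0, vocab_size, (batch, seq_len))   # stand-in token ids

# Positions <= t predict token t+1: drop the last logit, drop the first label.
loss = nn.CrossEntropyLoss(reduction="mean")(
    logits[:, :-1].reshape(-1, vocab_size),
    tokens[:, 1:].reshape(-1),
)
```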
### Compute infrastructure
Jean Zay Public Supercomputer, provided by the French government (see [announcement](https://www.enseignementsup-recherche.gouv.fr/fr/signature-du-marche-d-acquisition-de-l-un-des-supercalculateurs-les-plus-puissants-d-europe-46733)).
#### Hardware
* 384 A100 80GB GPUs (48 nodes)
* Additional 32 A100 80GB GPUs (4 nodes) in reserve
* 8 GPUs per node, using NVLink 4 inter-GPU connects and 4 OmniPath links
* CPU: AMD
* CPU memory: 512GB per node
* GPU memory: 640GB per node
* Inter-node connect: Omni-Path Architecture (OPA)
* NCCL-communications network: a fully dedicated subnet
* Disc IO network: shared network with other types of nodes
#### Software
* Megatron-DeepSpeed ([Github link](https://github.com/bigscience-workshop/Megatron-DeepSpeed))
* DeepSpeed ([Github link](https://github.com/microsoft/DeepSpeed))
* PyTorch (pytorch-1.11 w/ CUDA-11.5; see [Github link](https://github.com/pytorch/pytorch))
* apex ([Github link](https://github.com/NVIDIA/apex))
</details>
---
# Training
*This section provides information about the training data, the speed and size of training elements, and the environmental impact of training.*
*It is useful for people who want to learn more about the model inputs and training footprint.*
<details>
<summary>Click to expand</summary>
## Training Data
*This section provides a high-level overview of the training data. It is relevant for anyone who wants to know the basics of what the model is learning.*
Details for each dataset are provided in individual [Data Cards](https://huggingface.co/spaces/bigscience/BigScienceCorpus), and the sizes of each of their contributions to the aggregated training data are presented in an [Interactive Corpus Map](https://huggingface.co/spaces/bigscience-catalogue-lm-data/corpus-map).
Training data includes:
- 46 natural languages
- 13 programming languages
- 1.6TB of pre-processed text, converted into 350B unique tokens (see [the tokenizer section](#tokenization) for more)
### Languages
The pie chart shows the distribution of languages in training data.

The following tables show the distribution of Niger-Congo & Indic languages and programming languages in the training data.
Distribution of Niger-Congo and Indic languages.
| Niger-Congo | Percentage | | Indic | Percentage |
|----------------|------------| ------ |-----------|------------|
| Chi Tumbuka | 0.00002 | | Assamese | 0.01 |
| Kikuyu | 0.00004 | | Odia | 0.04 |
| Bambara | 0.00004 | | Gujarati | 0.04 |
| Akan | 0.00007 | | Marathi | 0.05 |
| Xitsonga | 0.00007 | | Punjabi | 0.05 |
| Sesotho | 0.00007 | | Kannada | 0.06 |
| Chi Chewa | 0.0001 | | Nepali | 0.07 |
| Setswana | 0.0002 | | Telugu | 0.09 |
| Lingala | 0.0002 | | Malayalam | 0.10 |
| Northern Sotho | 0.0002 | | Urdu | 0.10 |
| Fon | 0.0002 | | Tamil | 0.20 |
| Kirundi | 0.0003 | | Bengali | 0.50 |
| Wolof | 0.0004 | | Hindi | 0.70 |
| Luganda | 0.0004 | | | |
| Chi Shona | 0.001 | | | |
| Isi Zulu | 0.001 | | | |
| Igbo | 0.001 | | | |
| Xhosa | 0.001 | | | |
| Kinyarwanda | 0.003 | | | |
| Yoruba | 0.006 | | | |
| Swahili | 0.02 | | | |
Distribution of programming languages.
| Extension | Language | Number of files |
|----------------|------------|-----------------|
| java | Java | 5,407,724 |
| php | PHP | 4,942,186 |
| cpp | C++ | 2,503,930 |
| py | Python | 2,435,072 |
| js | JavaScript | 1,905,518 |
| cs | C# | 1,577,347 |
| rb             | Ruby       | 678,413         |
| cc | C++ | 443,054 |
| hpp | C++ | 391,048 |
| lua | Lua | 352,317 |
| go             | Go         | 227,763         |
| ts | TypeScript | 195,254 |
| C | C | 134,537 |
| scala | Scala | 92,052 |
| hh | C++ | 67,161 |
| H | C++ | 55,899 |
| tsx | TypeScript | 33,107 |
| rs | Rust | 29,693 |
| phpt | PHP | 9,702 |
| c++ | C++ | 1,342 |
| h++ | C++ | 791 |
| php3 | PHP | 540 |
| phps | PHP | 270 |
| php5 | PHP | 166 |
| php4 | PHP | 29 |
### Preprocessing
**Tokenization:** The BLOOM tokenizer ([link](https://huggingface.co/bigscience/tokenizer)), a learned subword tokenizer trained using:
- A byte-level Byte Pair Encoding (BPE) algorithm
- A simple pre-tokenization rule, no normalization
- A vocabulary size of 250,680
It was trained on a subset of a preliminary version of the corpus using alpha-weighting per language.
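A minimal sketch of loading it (the repo id `bigscience/tokenizer` is assumed from the link above):

```python
# Minimal sketch: load the BLOOM tokenizer from the Hub and inspect its
# byte-level BPE output. The repo id is assumed from the link above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bigscience/tokenizer")
print(tokenizer.vocab_size)               # 250,680 per this card
print(tokenizer.tokenize("Hello world"))  # subword pieces
```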
## Speeds, Sizes, Times
Training logs: [Tensorboard link](https://huggingface.co/tensorboard/bigscience/tr11-176B-ml-logs/)
- Dates:
- Started 11th March, 2022 11:42am PST
- Estimated end: 5th July, 2022
- Checkpoint size:
- Bf16 weights: 329GB
- Full checkpoint with optimizer states: 2.3TB
- Training throughput: About 150 TFLOPS (TFLOP/s) per GPU
- Number of epochs: 1
- Estimated cost of training: Equivalent of $2-5M in cloud computing (including preliminary experiments)
- Server training location: Île-de-France, France
## Environmental Impact
The training supercomputer, Jean Zay ([website](http://www.idris.fr/eng/jean-zay/jean-zay-presentation-eng.html)), uses mostly nuclear energy. The heat generated by it is reused for heating campus housing.
**Estimated carbon emissions:** *(Forthcoming.)*
**Estimated electricity usage:** *(Forthcoming.)*
</details>
---
# Uses
*This section addresses questions around how the model is intended to be used, discusses the foreseeable users of the model (including those affected by the model), and describes uses that are considered out of scope or misuse of the model.*
*It is useful for anyone considering using the model or who is affected by the model.*
<details>
<summary>Click to expand</summary>
## How to use
This model can be easily used and deployed with Hugging Face's ecosystem; it requires `transformers` and `accelerate` to be installed. The model can be downloaded as follows:
<img src="https://s3.amazonaws.com/moonup/production/uploads/1657271608456-62441d1d9fdefb55a0b7d12c.png" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
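The screenshot above shows the intended snippet; as a minimal textual sketch (the checkpoint id `bigscience/bloom` is assumed from this repository, and `device_map="auto"` relies on `accelerate`):

```python
# Minimal sketch of downloading and prompting BLOOM. The checkpoint id
# "bigscience/bloom" is assumed from this repository's name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigscience/bloom"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(
    checkpoint,
    torch_dtype=torch.bfloat16,  # bf16 weights, per the checkpoint-size note above
    device_map="auto",           # shards the model across available devices
)

inputs = tokenizer("A recipe for cheesecake:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```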
## Intended Use
This model is being created in order to enable public research on large language models (LLMs). LLMs are intended to be used for language generation or as a pretrained base model that can be further fine-tuned for specific tasks. Use cases below are not exhaustive.
### Direct Use
- Text generation
- Exploring characteristics of language generated by a language model
- Examples: Cloze tests, counterfactuals, generations with reframings
### Downstream Use
- Tasks that leverage language models include: Information Extraction, Question Answering, Summarization
### Misuse and Out-of-scope Use
*This section addresses what users ought not do with the model.*
See the [BLOOM License](https://huggingface.co/spaces/bigscience/license), Attachment A, for detailed usage restrictions. The below list is non-exhaustive, but lists some easily foreseeable problematic use cases.
#### Out-of-scope Uses
Using the model in [high-stakes](#high-stakes) settings is out of scope for this model. The model is not designed for [critical decisions](#critical-decisions) nor uses with any material consequences on an individual's livelihood or wellbeing. The model outputs content that appears factual but may not be correct.
Out-of-scope Uses Include:
- Usage in biomedical domains, political and legal domains, or finance domains
- Usage for evaluating or scoring individuals, such as for employment, education, or credit
- Applying the model for critical automatic decisions, generating factual content, creating reliable summaries, or generating predictions that must be correct
#### Misuse
Intentionally using the model for harm, violating [human rights](#human-rights), or other kinds of malicious activities, is a misuse of this model. This includes:
- Spam generation
- Disinformation and influence operations
- Disparagement and defamation
- Harassment and abuse
- [Deception](#deception)
- Unconsented impersonation and imitation
- Unconsented surveillance
- Generating content without attribution to the model, as specified in the [RAIL License, Use Restrictions](https://huggingface.co/spaces/bigscience/license)
## Intended Users
### Direct Users
- General Public
- Researchers
- Students
- Educators
- Engineers/developers
- Non-commercial entities
- Community advocates, including human and civil rights groups
### Indirect Users
- Users of derivatives created by Direct Users, such as those using software with an [intended use](#intended-use)
- Users of [Derivatives of the Model, as described in the License](https://huggingface.co/spaces/bigscience/license)
### Others Affected (Parties Prenantes)
- People and groups referred to by the LLM
- People and groups exposed to outputs of, or decisions based on, the LLM
- People and groups whose original work is included in the LLM
</details>
---
# Risks and Limitations
*This section identifies foreseeable harms and misunderstandings.*
<details>
<summary>Click to expand</summary>
Model may:
- Overrepresent some viewpoints and underrepresent others
- Contain stereotypes
- Contain [personal information](#personal-data-and-information)
- Generate:
- Hateful, abusive, or violent language
- Discriminatory or prejudicial language
- Content that may not be appropriate for all settings, including sexual content
- Make errors, including producing incorrect information as if it were factual
- Generate irrelevant or repetitive outputs
- Induce users into attributing human traits to it, such as sentience or consciousness
</details>
---
# Evaluation
*This section describes the evaluation protocols and provides the results.*
<details>
<summary>Click to expand</summary>
## Metrics
*This section describes the different ways performance is calculated and why.*
Includes:
| Metric | Why chosen |
|--------------------|--------------------------------------------------------------------|
| [Perplexity](#perplexity) | Standard metric for quantifying model improvements during training |
| Cross Entropy [Loss](#loss) | Standard objective for language models. |
And multiple different metrics for specific tasks. _(More evaluation metrics forthcoming upon completion of evaluation protocol.)_
## Factors
*This section lists some different aspects of BLOOM models. Its focus is on aspects that are likely to give rise to high variance in model behavior.*
- Language, such as English or Yoruba
- Domain, such as newswire or stories
- Demographic characteristics, such as gender or nationality
## Results
*Results are based on the [Factors](#factors) and [Metrics](#metrics).*
**Zero-shot evaluations:**
<span style="color:red"><b>WARNING:</b> This section used to contain many more results; however, they were not correct and were released without the approval of the evaluation working group. We are currently in the process of fixing the evaluations.</span>
See this repository for JSON files: https://github.com/bigscience-workshop/evaluation-results
| Task | Language | Metric | BLOOM-176B | OPT-175B* |
|:--------|:-----------------|:------------------------|-------------:|------------:|
| humaneval | python | pass@1 ↑ | 0.155 | 0.0 |
| humaneval | python | pass@10 ↑ | 0.328 | 0.0 |
| humaneval | python | pass@100 ↑ | 0.572 | 0.003 |
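For reference, pass@k is commonly estimated with the unbiased estimator of Chen et al. (2021): draw $n$ samples per problem, count the $c$ samples that pass the unit tests, and average over problems:

```latex
\text{pass@}k = \mathbb{E}_{\text{problems}}\left[ 1 - \frac{\binom{n-c}{k}}{\binom{n}{k}} \right]
```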
**Train-time Evaluation:**
Final checkpoint after 95K steps:
- Training Loss: 1.939
- Validation Loss: 2.061
- Perplexity: 7.045
For more see: https://huggingface.co/bigscience/tr11-176B-ml-logs
</details>
---
# Recommendations
*This section provides information on warnings and potential mitigations.*
<details>
<summary>Click to expand</summary>
- Indirect users should be made aware when the content they're working with is created by the LLM.
- Users should be aware of [Risks and Limitations](#risks-and-limitations), and include an appropriate age disclaimer or blocking interface as necessary.
- Models trained or finetuned downstream of BLOOM LM should include an updated Model Card.
- Users of the model should provide mechanisms for those affected to provide feedback, such as an email address for comments.
</details>
---
# Glossary and Calculations
*This section defines common terms and how metrics are calculated.*
<details>
<summary>Click to expand</summary>
- <a name="loss">**Loss:**</a> A calculation of the difference between what the model has learned and what the data shows ("groundtruth"). The lower the loss, the better. The training process aims to minimize the loss.
- <a name="perplexity">**Perplexity:**</a> A measure of how well the model predicts new data, based on the probability the model assigns to it. The lower the perplexity, the better. If the model were 100% correct at predicting the next token it will see, the perplexity would be 1. Mathematically this is calculated using entropy (see the formula after this list).
- <a name="high-stakes">**High-stakes settings:**</a> Such as those identified as "high-risk AI systems" and "unacceptable risk AI systems" in the European Union's proposed [Artificial Intelligence (AI) Act](https://artificialintelligenceact.eu/annexes/).
- <a name="critical-decisions">**Critical decisions:**</a> Such as those defined in [the United States' proposed Algorithmic Accountability Act](https://www.congress.gov/117/bills/s3572/BILLS-117s3572is.pdf).
- <a name="human-rights">**Human rights:**</a> Includes those rights defined in the [Universal Declaration of Human Rights](https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf).
- <a name="personal-data-and-information">**Personal Data and Personal Information:**</a> Personal data and information are defined in multiple data protection regulations, such as "[personal data](https://gdpr-info.eu/issues/personal-data/)" in the [European Union's General Data Protection Regulation](https://gdpr-info.eu); "personal information" in the Republic of South Africa's [Protection of Personal Information Act](https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013popi.pdf); and "personal information" in the People's Republic of China's [Personal information protection law](http://en.npc.gov.cn.cdurl.cn/2021-12/29/c_694559.htm).
- <a name="sensitive-characteristics">**Sensitive characteristics:**</a> This includes specifically protected categories in human rights (see [UHDR, Article 2](https://www.un.org/sites/un2.un.org/files/2021/03/udhr.pdf)) and personal information regulation (see GDPR, [Article 9; Protection of Personal Information Act, Chapter 1](https://www.gov.za/sites/default/files/gcis_document/201409/3706726-11act4of2013popi.pdf))
- <a name="deception">**Deception:**</a> Doing something to intentionally mislead individuals to believe something that is false, such as by creating deadbots or chatbots on social media posing as real people, or generating text documents without making consumers aware that the text is machine generated.
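As a rough illustration of how [loss](#loss) and [perplexity](#perplexity) relate (assuming a natural-log cross-entropy loss $L$):

```latex
\mathrm{PPL} = e^{L}, \qquad \text{e.g. } L \approx 1.95 \;\Rightarrow\; \mathrm{PPL} = e^{1.95} \approx 7.0
```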
</details>
---
# More Information
*This section provides links to writing on dataset creation, technical specifications, lessons learned, and initial results.*
<details>
<summary>Click to expand</summary>
## Intermediate checkpoints
For academic (or any) usage, we published the intermediate checkpoints, corresponding to the model state every 5,000 steps. Please follow [this link](https://huggingface.co/bigscience/bloom-176-intermediate) to get these checkpoints.
## Dataset Creation
Blog post detailing the design choices during the dataset creation: https://bigscience.huggingface.co/blog/building-a-tb-scale-multilingual-dataset-for-language-modeling
## Technical Specifications
Blog post summarizing how the architecture, size, shape, and pre-training duration were selected: https://bigscience.huggingface.co/blog/what-language-model-to-train-if-you-have-two-million-gpu-hours
More details on the architecture/optimizer: https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml
Blog post on the hardware/engineering side: https://bigscience.huggingface.co/blog/which-hardware-to-train-a-176b-parameters-model
Details on the distributed setup used for the training: https://github.com/bigscience-workshop/bigscience/tree/master/train/tr11-176B-ml
Tensorboard updated during the training: https://huggingface.co/bigscience/tr11-176B-ml-logs/tensorboard#scalars&tagFilter=loss
## Lessons
Insights on how to approach training, negative results: https://github.com/bigscience-workshop/bigscience/blob/master/train/lessons-learned.md
Details on the obstacles overcome on the engineering side during preparation (instabilities, optimization of training throughput, and many technical tricks and questions): https://github.com/bigscience-workshop/bigscience/blob/master/train/tr11-176B-ml/chronicles.md
## Initial Results
Initial prompting experiments using interim checkpoints: https://huggingface.co/spaces/bigscience/bloom-book
</details>
## Original checkpoints
The checkpoints in this repo correspond to the HuggingFace Transformers format. If you want to use our fork of [Megatron-DeepSpeed](https://github.com/bigscience-workshop/Megatron-DeepSpeed) that the model was trained with, you'd want to use [this repo instead](https://huggingface.co/bigscience/bloom-optimizer-states).
Many intermediate checkpoints are available at https://huggingface.co/bigscience/bloom-intermediate/
---
# Model Card Authors
*Ordered roughly chronologically and by amount of time spent on creating this model card.*
Margaret Mitchell, Giada Pistilli, Yacine Jernite, Ezinwanne Ozoani, Marissa Gerchick, Nazneen Rajani, Sasha Luccioni, Irene Solaiman, Maraim Masoud, Somaieh Nikpoor, Carlos Muñoz Ferrandis, Stas Bekman, Christopher Akiki, Danish Contractor, David Lansky, Angelina McMillan-Major, Tristan Thrush, Suzana Ilić, Gérard Dupont, Shayne Longpre, Manan Dey, Stella Biderman, Douwe Kiela, Emi Baylor, Teven Le Scao, Aaron Gokaslan, Julien Launay, Niklas Muennighoff | 29,510 | [
[
-0.031768798828125,
-0.04638671875,
0.025909423828125,
0.0171356201171875,
0.00024390220642089844,
0.0008234977722167969,
-0.034637451171875,
-0.036041259765625,
0.0157470703125,
0.006717681884765625,
-0.03155517578125,
-0.045166015625,
-0.053680419921875,
0.0106964111328125,
-0.021240234375,
0.07513427734375,
0.0010499954223632812,
0.00836181640625,
0.003948211669921875,
-0.0040130615234375,
-0.007358551025390625,
-0.039215087890625,
-0.0411376953125,
-0.0210723876953125,
0.042205810546875,
0.0227203369140625,
0.045867919921875,
0.04510498046875,
0.04425048828125,
0.0203704833984375,
-0.0238494873046875,
-0.0093841552734375,
-0.031463623046875,
-0.02740478515625,
-0.0114898681640625,
-0.0180511474609375,
-0.0361328125,
-0.00911712646484375,
0.06817626953125,
0.05096435546875,
0.0015306472778320312,
0.017303466796875,
0.0005211830139160156,
0.051788330078125,
-0.039215087890625,
0.0244598388671875,
-0.0284271240234375,
0.01114654541015625,
-0.01788330078125,
0.0209808349609375,
-0.01401519775390625,
-0.012725830078125,
0.0028285980224609375,
-0.037322998046875,
0.020111083984375,
-0.00799560546875,
0.071533203125,
0.0005559921264648438,
-0.01175689697265625,
-0.0015153884887695312,
-0.05462646484375,
0.061126708984375,
-0.05548095703125,
0.037445068359375,
0.023712158203125,
0.0151519775390625,
0.00408172607421875,
-0.060577392578125,
-0.05902099609375,
-0.003513336181640625,
0.00034809112548828125,
0.0205230712890625,
-0.00836944580078125,
-0.0037212371826171875,
0.0272979736328125,
0.0498046875,
-0.06427001953125,
0.0169677734375,
-0.042724609375,
-0.01641845703125,
0.050537109375,
0.004161834716796875,
0.020263671875,
-0.01277923583984375,
-0.01343536376953125,
-0.040740966796875,
-0.041015625,
-0.007312774658203125,
0.0194854736328125,
0.0293731689453125,
-0.042327880859375,
0.039642333984375,
0.00003069639205932617,
0.03369140625,
-0.023101806640625,
-0.004611968994140625,
0.052886962890625,
-0.044525146484375,
-0.028350830078125,
-0.016357421875,
0.0858154296875,
0.006855010986328125,
-0.0084075927734375,
0.0029392242431640625,
0.006927490234375,
-0.0116424560546875,
-0.006622314453125,
-0.07171630859375,
-0.0180206298828125,
0.034149169921875,
-0.0166168212890625,
-0.006275177001953125,
0.005817413330078125,
-0.07421875,
-0.00533294677734375,
-0.0149688720703125,
0.02239990234375,
-0.059539794921875,
-0.03497314453125,
0.00885772705078125,
-0.0125579833984375,
0.0236358642578125,
0.0142364501953125,
-0.0718994140625,
0.013458251953125,
0.045318603515625,
0.08111572265625,
-0.0124664306640625,
-0.046539306640625,
0.00864410400390625,
0.0025424957275390625,
-0.00882720947265625,
0.0215911865234375,
-0.01898193359375,
-0.02801513671875,
-0.00830841064453125,
0.020111083984375,
-0.01629638671875,
-0.0191192626953125,
0.033203125,
-0.0251617431640625,
0.0174560546875,
-0.0273284912109375,
-0.036712646484375,
-0.0084075927734375,
0.01006317138671875,
-0.044830322265625,
0.0858154296875,
0.015960693359375,
-0.06488037109375,
0.02667236328125,
-0.060089111328125,
-0.0189056396484375,
0.0049896240234375,
-0.00617218017578125,
-0.04541015625,
-0.0171356201171875,
0.017242431640625,
0.0247344970703125,
-0.0157928466796875,
0.025787353515625,
-0.0087738037109375,
-0.0148468017578125,
0.0007452964782714844,
-0.0270538330078125,
0.07574462890625,
0.0292510986328125,
-0.0408935546875,
0.0038852691650390625,
-0.051483154296875,
-0.005435943603515625,
0.0223846435546875,
-0.03472900390625,
0.0202484130859375,
-0.014862060546875,
0.02410888671875,
0.0171051025390625,
0.0209808349609375,
-0.0499267578125,
0.03363037109375,
-0.04425048828125,
0.04931640625,
0.043243408203125,
-0.00983428955078125,
0.02423095703125,
-0.004840850830078125,
0.0262603759765625,
0.0206451416015625,
0.01904296875,
-0.0186309814453125,
-0.04144287109375,
-0.06451416015625,
-0.034271240234375,
0.02008056640625,
0.040313720703125,
-0.03131103515625,
0.051300048828125,
-0.038818359375,
-0.05914306640625,
-0.031494140625,
0.003978729248046875,
0.037689208984375,
0.020172119140625,
0.04864501953125,
-0.01004791259765625,
-0.041717529296875,
-0.06817626953125,
0.0169525146484375,
0.006290435791015625,
0.01153564453125,
0.01546478271484375,
0.08062744140625,
-0.0399169921875,
0.0623779296875,
-0.04815673828125,
-0.00968170166015625,
-0.0218658447265625,
-0.00405120849609375,
0.033233642578125,
0.043701171875,
0.04498291015625,
-0.04803466796875,
-0.031494140625,
0.0121917724609375,
-0.0482177734375,
0.0300750732421875,
0.022216796875,
-0.003673553466796875,
0.03045654296875,
0.037261962890625,
-0.057525634765625,
0.02191162109375,
0.056121826171875,
-0.0084075927734375,
0.052093505859375,
-0.017486572265625,
-0.0014982223510742188,
-0.10400390625,
0.037689208984375,
-0.0011091232299804688,
0.0053253173828125,
-0.02337646484375,
0.0118408203125,
-0.0013608932495117188,
-0.03228759765625,
-0.03515625,
0.048065185546875,
-0.030426025390625,
-0.00284576416015625,
0.00565338134765625,
-0.01049041748046875,
-0.00901031494140625,
0.036651611328125,
0.008941650390625,
0.07940673828125,
0.06768798828125,
-0.04522705078125,
0.01047515869140625,
0.0157623291015625,
-0.021392822265625,
-0.0016717910766601562,
-0.06396484375,
-0.0004184246063232422,
-0.007236480712890625,
0.0301361083984375,
-0.065185546875,
-0.0277099609375,
0.01488494873046875,
-0.0389404296875,
0.0301361083984375,
-0.01544952392578125,
-0.0517578125,
-0.051300048828125,
-0.0162353515625,
0.0223846435546875,
0.0439453125,
-0.0286102294921875,
0.01922607421875,
0.0176544189453125,
0.004650115966796875,
-0.044647216796875,
-0.064453125,
0.0029392242431640625,
0.00028228759765625,
-0.04119873046875,
0.0247802734375,
-0.00997161865234375,
0.0031280517578125,
-0.0029430389404296875,
0.011505126953125,
-0.002044677734375,
-0.0049591064453125,
0.0225372314453125,
0.008941650390625,
-0.0184783935546875,
0.014129638671875,
-0.0157470703125,
-0.005054473876953125,
-0.00982666015625,
-0.043182373046875,
0.059478759765625,
-0.0237579345703125,
-0.0287017822265625,
-0.033111572265625,
0.01910400390625,
0.047210693359375,
-0.0198974609375,
0.0885009765625,
0.06964111328125,
-0.0289459228515625,
0.0051116943359375,
-0.0236358642578125,
-0.01467132568359375,
-0.0367431640625,
0.048828125,
-0.00649261474609375,
-0.06976318359375,
0.03350830078125,
0.01190185546875,
0.018157958984375,
0.057220458984375,
0.050323486328125,
0.00583648681640625,
0.06243896484375,
0.04693603515625,
-0.0301666259765625,
0.0361328125,
-0.056121826171875,
0.0191650390625,
-0.05706787109375,
-0.01128387451171875,
-0.046142578125,
-0.0121307373046875,
-0.045257568359375,
-0.045501708984375,
0.024932861328125,
0.00801849365234375,
-0.046630859375,
0.03662109375,
-0.049041748046875,
0.0263519287109375,
0.047882080078125,
0.0033054351806640625,
-0.0009860992431640625,
0.013458251953125,
-0.0201568603515625,
-0.002468109130859375,
-0.051483154296875,
-0.03643798828125,
0.0926513671875,
0.040740966796875,
0.0285491943359375,
0.0126800537109375,
0.048736572265625,
-0.003086090087890625,
0.00830078125,
-0.04425048828125,
0.0318603515625,
-0.0033473968505859375,
-0.059661865234375,
-0.023162841796875,
-0.045501708984375,
-0.08282470703125,
0.00390625,
-0.01006317138671875,
-0.072265625,
0.01171875,
0.0171966552734375,
-0.00982666015625,
0.041473388671875,
-0.060333251953125,
0.07293701171875,
-0.01477813720703125,
-0.0276641845703125,
-0.016815185546875,
-0.047027587890625,
0.024078369140625,
-0.00005638599395751953,
0.0213623046875,
0.00833892822265625,
0.0167694091796875,
0.06396484375,
-0.047637939453125,
0.060455322265625,
-0.01458740234375,
0.00362396240234375,
0.0185394287109375,
-0.0183258056640625,
0.030029296875,
0.00385284423828125,
-0.01245880126953125,
0.041900634765625,
-0.002445220947265625,
-0.036376953125,
-0.0091094970703125,
0.061614990234375,
-0.08599853515625,
-0.01959228515625,
-0.0498046875,
-0.037445068359375,
-0.0017442703247070312,
0.0321044921875,
0.041778564453125,
0.01849365234375,
-0.0094757080078125,
0.01824951171875,
0.058868408203125,
-0.048797607421875,
0.041229248046875,
0.0216217041015625,
-0.0213623046875,
-0.044921875,
0.07965087890625,
0.00417327880859375,
0.0236968994140625,
0.0294647216796875,
0.0153045654296875,
-0.022064208984375,
-0.04736328125,
-0.032867431640625,
0.03125,
-0.037567138671875,
-0.01123046875,
-0.07177734375,
-0.03436279296875,
-0.0560302734375,
-0.0016012191772460938,
-0.03765869140625,
-0.01904296875,
-0.0241546630859375,
-0.0083465576171875,
0.046142578125,
0.038421630859375,
-0.007152557373046875,
0.03338623046875,
-0.04803466796875,
0.0014162063598632812,
0.012725830078125,
0.0268402099609375,
0.00481414794921875,
-0.043609619140625,
-0.0369873046875,
0.0202484130859375,
-0.039642333984375,
-0.043914794921875,
0.033050537109375,
0.019378662109375,
0.0251007080078125,
0.017669677734375,
-0.03369140625,
0.0413818359375,
-0.03045654296875,
0.08624267578125,
0.0267486572265625,
-0.06005859375,
0.050384521484375,
-0.0384521484375,
0.030609130859375,
0.0428466796875,
0.0447998046875,
-0.037445068359375,
-0.0118865966796875,
-0.054840087890625,
-0.0810546875,
0.054443359375,
0.017791748046875,
0.003566741943359375,
-0.00009918212890625,
0.0171966552734375,
-0.015350341796875,
0.0092315673828125,
-0.06829833984375,
-0.033203125,
-0.022369384765625,
-0.0177459716796875,
-0.028167724609375,
-0.00016963481903076172,
-0.0139007568359375,
-0.042266845703125,
0.057098388671875,
-0.002376556396484375,
0.040924072265625,
0.0250244140625,
-0.01020050048828125,
0.005767822265625,
0.0137786865234375,
0.04852294921875,
0.043609619140625,
-0.0171051025390625,
-0.0013408660888671875,
0.01364898681640625,
-0.07257080078125,
-0.0009427070617675781,
0.015838623046875,
-0.009735107421875,
-0.0118560791015625,
0.032073974609375,
0.0614013671875,
0.002498626708984375,
-0.04144287109375,
0.035491943359375,
0.011138916015625,
-0.0270538330078125,
-0.025360107421875,
0.0088348388671875,
0.02490234375,
0.006458282470703125,
0.025482177734375,
-0.004566192626953125,
-0.0016527175903320312,
-0.041961669921875,
0.00832366943359375,
0.0307769775390625,
-0.0207366943359375,
-0.03106689453125,
0.045989990234375,
-0.002513885498046875,
-0.005466461181640625,
0.045806884765625,
-0.028594970703125,
-0.042144775390625,
0.057281494140625,
0.042510986328125,
0.05181884765625,
-0.0074005126953125,
0.006992340087890625,
0.06817626953125,
0.0345458984375,
-0.0008368492126464844,
0.0206451416015625,
0.01666259765625,
-0.0404052734375,
-0.0341796875,
-0.07086181640625,
-0.027191162109375,
0.021087646484375,
-0.0249481201171875,
0.0222625732421875,
-0.043701171875,
-0.00732421875,
0.01074981689453125,
0.01277923583984375,
-0.0673828125,
0.011444091796875,
0.017547607421875,
0.08148193359375,
-0.07086181640625,
0.0682373046875,
0.05340576171875,
-0.051666259765625,
-0.072265625,
-0.0150146484375,
-0.002315521240234375,
-0.057373046875,
0.054107666015625,
0.0095367431640625,
0.014801025390625,
0.00225830078125,
-0.04833984375,
-0.08441162109375,
0.08843994140625,
0.02569580078125,
-0.05438232421875,
0.016815185546875,
0.0205535888671875,
0.0528564453125,
0.005138397216796875,
0.0308685302734375,
0.033416748046875,
0.043548583984375,
0.01210784912109375,
-0.0738525390625,
-0.004932403564453125,
-0.0282135009765625,
-0.00635528564453125,
0.0202484130859375,
-0.06365966796875,
0.0794677734375,
-0.00879669189453125,
-0.01268768310546875,
-0.0139617919921875,
0.046295166015625,
0.019775390625,
0.002246856689453125,
0.0067291259765625,
0.07086181640625,
0.054046630859375,
-0.02197265625,
0.07928466796875,
-0.028472900390625,
0.0467529296875,
0.0675048828125,
-0.00170135498046875,
0.053924560546875,
0.02587890625,
-0.048065185546875,
0.031402587890625,
0.043304443359375,
-0.0162200927734375,
0.0163421630859375,
0.00890350341796875,
-0.014312744140625,
0.0116729736328125,
0.00850677490234375,
-0.055328369140625,
0.01360321044921875,
0.032867431640625,
-0.037689208984375,
-0.005428314208984375,
0.0050201416015625,
0.016510009765625,
-0.0310211181640625,
-0.01678466796875,
0.03594970703125,
-0.000995635986328125,
-0.03826904296875,
0.0501708984375,
0.001983642578125,
0.057464599609375,
-0.058624267578125,
0.00429534912109375,
-0.01192474365234375,
0.0212554931640625,
-0.0274658203125,
-0.07684326171875,
0.017669677734375,
-0.00536346435546875,
-0.0231475830078125,
-0.0021820068359375,
0.0196075439453125,
-0.019683837890625,
-0.048065185546875,
0.0181427001953125,
0.014404296875,
0.021331787109375,
0.00212860107421875,
-0.061614990234375,
0.0165557861328125,
-0.0038204193115234375,
-0.02423095703125,
0.029571533203125,
0.0194091796875,
0.01453399658203125,
0.041961669921875,
0.055419921875,
0.009521484375,
0.01256561279296875,
-0.0018663406372070312,
0.072021484375,
-0.054534912109375,
-0.0193939208984375,
-0.06146240234375,
0.03466796875,
-0.01137542724609375,
-0.04693603515625,
0.0703125,
0.06878662109375,
0.0616455078125,
-0.007110595703125,
0.0634765625,
-0.0261383056640625,
0.0217132568359375,
-0.023712158203125,
0.04315185546875,
-0.04119873046875,
-0.0078582763671875,
-0.040191650390625,
-0.073974609375,
-0.02288818359375,
0.039459228515625,
-0.0268402099609375,
0.0184783935546875,
0.04840087890625,
0.0631103515625,
-0.0177459716796875,
-0.0029754638671875,
0.0012826919555664062,
0.0282440185546875,
0.030792236328125,
0.048919677734375,
0.02545166015625,
-0.04754638671875,
0.033447265625,
-0.022003173828125,
-0.0150604248046875,
-0.0186614990234375,
-0.054962158203125,
-0.04638671875,
-0.047515869140625,
-0.0279998779296875,
-0.03350830078125,
-0.007343292236328125,
0.0689697265625,
0.060760498046875,
-0.06524658203125,
-0.015655517578125,
-0.032318115234375,
-0.0080718994140625,
-0.01151275634765625,
-0.0179290771484375,
0.05084228515625,
-0.016876220703125,
-0.05426025390625,
0.0194854736328125,
0.0136566162109375,
0.0103607177734375,
-0.030120849609375,
-0.01187896728515625,
-0.0256500244140625,
-0.015106201171875,
0.045166015625,
0.0443115234375,
-0.032440185546875,
-0.00463104248046875,
0.00333404541015625,
-0.0208740234375,
0.0085906982421875,
0.028472900390625,
-0.031402587890625,
0.019134521484375,
0.0290374755859375,
0.03594970703125,
0.052703857421875,
-0.0246734619140625,
0.015594482421875,
-0.04779052734375,
0.027801513671875,
0.016357421875,
0.03509521484375,
0.0222625732421875,
-0.0218505859375,
0.040374755859375,
0.02978515625,
-0.0565185546875,
-0.05364990234375,
0.007293701171875,
-0.08251953125,
-0.0173187255859375,
0.1051025390625,
-0.00977325439453125,
-0.0277557373046875,
0.007015228271484375,
-0.01324462890625,
0.018218994140625,
-0.0202178955078125,
0.03350830078125,
0.05035400390625,
0.00804901123046875,
-0.00902557373046875,
-0.04931640625,
0.0333251953125,
0.0255279541015625,
-0.06439208984375,
0.00682830810546875,
0.032989501953125,
0.03466796875,
0.0325927734375,
0.040740966796875,
-0.01473236083984375,
0.00312042236328125,
0.004283905029296875,
0.035247802734375,
0.001804351806640625,
-0.01427459716796875,
-0.025604248046875,
-0.00472259521484375,
-0.0006704330444335938,
-0.00608062744140625
]
] |
sentence-transformers/paraphrase-xlm-r-multilingual-v1 | 2022-06-15T19:25:39.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"xlm-roberta",
"feature-extraction",
"sentence-similarity",
"transformers",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/paraphrase-xlm-r-multilingual-v1 | 50 | 25,615 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# sentence-transformers/paraphrase-xlm-r-multilingual-v1
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/paraphrase-xlm-r-multilingual-v1')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, you pass your input through the transformer model, then you apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/paraphrase-xlm-r-multilingual-v1')
model = AutoModel.from_pretrained('sentence-transformers/paraphrase-xlm-r-multilingual-v1')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
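To compare the resulting embeddings, a common follow-up (a sketch, not part of the original example) is to L2-normalize them and take dot products, which then equal cosine similarities:

```python
# Follow-up sketch: cosine similarity between the sentence embeddings above.
import torch.nn.functional as F

normalized = F.normalize(sentence_embeddings, p=2, dim=1)
cosine_sim = normalized @ normalized.T
print(cosine_sim)  # 2x2 matrix with 1.0 on the diagonal
```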
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/paraphrase-xlm-r-multilingual-v1)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 3,744 | [
[
-0.016082763671875,
-0.0546875,
0.0361328125,
0.0281829833984375,
-0.0287628173828125,
-0.02105712890625,
-0.01360321044921875,
0.006076812744140625,
0.0102386474609375,
0.047698974609375,
-0.0291290283203125,
-0.03619384765625,
-0.0521240234375,
0.019500732421875,
-0.03802490234375,
0.06549072265625,
-0.0279693603515625,
-0.00437164306640625,
-0.025909423828125,
-0.0303955078125,
-0.012939453125,
-0.039825439453125,
-0.028778076171875,
-0.013275146484375,
0.0207672119140625,
0.0189056396484375,
0.0460205078125,
0.038604736328125,
0.0214385986328125,
0.033843994140625,
-0.00022399425506591797,
0.007549285888671875,
-0.01438140869140625,
-0.0006814002990722656,
0.0022220611572265625,
-0.0296630859375,
-0.00274658203125,
0.0059814453125,
0.048797607421875,
0.032073974609375,
-0.0096893310546875,
0.00569915771484375,
0.0111083984375,
0.0177154541015625,
-0.0310211181640625,
0.0335693359375,
-0.0479736328125,
0.010833740234375,
0.0073089599609375,
-0.00484466552734375,
-0.036773681640625,
-0.01229095458984375,
0.00884246826171875,
-0.030975341796875,
0.01222991943359375,
0.01558685302734375,
0.07891845703125,
0.019256591796875,
-0.020904541015625,
-0.02227783203125,
-0.02301025390625,
0.0750732421875,
-0.064453125,
0.01457977294921875,
0.0223388671875,
-0.0018291473388671875,
0.018341064453125,
-0.08416748046875,
-0.05401611328125,
-0.01207733154296875,
-0.03753662109375,
0.01026153564453125,
-0.02374267578125,
-0.00757598876953125,
0.00738525390625,
0.01087188720703125,
-0.05181884765625,
-0.027862548828125,
-0.032623291015625,
-0.0164642333984375,
0.0218353271484375,
0.007297515869140625,
0.03619384765625,
-0.044342041015625,
-0.038299560546875,
-0.028228759765625,
-0.00803375244140625,
-0.005401611328125,
0.00275421142578125,
0.0215606689453125,
-0.0279083251953125,
0.053741455078125,
-0.0010366439819335938,
0.037567138671875,
-0.00634002685546875,
0.0121612548828125,
0.046966552734375,
-0.0399169921875,
-0.00962066650390625,
-0.005886077880859375,
0.08953857421875,
0.0249786376953125,
0.0159759521484375,
-0.01080322265625,
-0.01221466064453125,
-0.00771331787109375,
0.00408935546875,
-0.059539794921875,
-0.02032470703125,
0.01108551025390625,
-0.0299224853515625,
-0.0132598876953125,
0.0151214599609375,
-0.06109619140625,
0.006298065185546875,
-0.0053558349609375,
0.055267333984375,
-0.053436279296875,
0.00991058349609375,
0.016021728515625,
-0.0261688232421875,
0.0155792236328125,
-0.0269775390625,
-0.044586181640625,
0.0211181640625,
0.0234832763671875,
0.0860595703125,
0.007640838623046875,
-0.050872802734375,
-0.033416748046875,
0.0015287399291992188,
0.002849578857421875,
0.05230712890625,
-0.038055419921875,
-0.00299835205078125,
0.005321502685546875,
0.0171966552734375,
-0.055511474609375,
-0.0367431640625,
0.057159423828125,
-0.020538330078125,
0.04071044921875,
0.0009341239929199219,
-0.05413818359375,
-0.010986328125,
0.004718780517578125,
-0.037078857421875,
0.07720947265625,
0.01166534423828125,
-0.07086181640625,
-0.003631591796875,
-0.04754638671875,
-0.0182647705078125,
-0.00896453857421875,
-0.0021381378173828125,
-0.0537109375,
-0.003448486328125,
0.041351318359375,
0.05487060546875,
0.0031833648681640625,
0.0161895751953125,
-0.0222930908203125,
-0.02117919921875,
0.0298614501953125,
-0.01922607421875,
0.07757568359375,
0.0196685791015625,
-0.025390625,
0.00817108154296875,
-0.033966064453125,
-0.002063751220703125,
0.0164794921875,
-0.006832122802734375,
-0.02093505859375,
-0.0095672607421875,
0.0251922607421875,
0.0305023193359375,
0.0233001708984375,
-0.04638671875,
-0.0107421875,
-0.026123046875,
0.07281494140625,
0.033294677734375,
0.0105438232421875,
0.043243408203125,
-0.029083251953125,
0.0276336669921875,
0.0280914306640625,
0.0014715194702148438,
-0.0218505859375,
-0.028533935546875,
-0.05908203125,
-0.019256591796875,
0.01806640625,
0.05303955078125,
-0.06561279296875,
0.0592041015625,
-0.032012939453125,
-0.034515380859375,
-0.04852294921875,
0.01641845703125,
0.0198516845703125,
0.032806396484375,
0.057159423828125,
0.00484466552734375,
-0.040771484375,
-0.07293701171875,
-0.012451171875,
0.0094146728515625,
0.0049896240234375,
0.01244354248046875,
0.05419921875,
-0.0241546630859375,
0.06793212890625,
-0.04132080078125,
-0.0313720703125,
-0.0513916015625,
0.0211181640625,
0.01338958740234375,
0.042266845703125,
0.048736572265625,
-0.061004638671875,
-0.052642822265625,
-0.024139404296875,
-0.06256103515625,
-0.002132415771484375,
-0.0189971923828125,
-0.018341064453125,
0.006561279296875,
0.04925537109375,
-0.0645751953125,
0.0158843994140625,
0.040313720703125,
-0.032379150390625,
0.0236053466796875,
-0.0335693359375,
-0.003147125244140625,
-0.1104736328125,
0.003692626953125,
0.00466156005859375,
-0.01442718505859375,
-0.029296875,
0.0121612548828125,
0.0274810791015625,
-0.011260986328125,
-0.032684326171875,
0.033111572265625,
-0.032745361328125,
0.0194244384765625,
-0.0006351470947265625,
0.03790283203125,
0.0031871795654296875,
0.051910400390625,
-0.004497528076171875,
0.0567626953125,
0.049163818359375,
-0.04681396484375,
0.022918701171875,
0.052703857421875,
-0.033966064453125,
0.0216217041015625,
-0.06396484375,
-0.0031604766845703125,
-0.002750396728515625,
0.029083251953125,
-0.08221435546875,
-0.004932403564453125,
0.02203369140625,
-0.033538818359375,
0.004253387451171875,
0.021392822265625,
-0.04937744140625,
-0.0384521484375,
-0.032745361328125,
0.013336181640625,
0.04254150390625,
-0.03338623046875,
0.046630859375,
0.0147247314453125,
-0.020355224609375,
-0.042236328125,
-0.07208251953125,
0.017852783203125,
-0.0291595458984375,
-0.0452880859375,
0.04302978515625,
-0.01031494140625,
0.007129669189453125,
0.01129913330078125,
0.0156097412109375,
-0.01348876953125,
-0.012847900390625,
-0.0144805908203125,
0.01107025146484375,
-0.00577545166015625,
-0.00421905517578125,
0.0182037353515625,
-0.006938934326171875,
0.0062713623046875,
-0.009033203125,
0.05352783203125,
-0.006992340087890625,
-0.0022182464599609375,
-0.03814697265625,
0.02593994140625,
0.039459228515625,
-0.02349853515625,
0.08270263671875,
0.07525634765625,
-0.0191497802734375,
0.0012493133544921875,
-0.0307769775390625,
-0.0147247314453125,
-0.035064697265625,
0.05206298828125,
-0.0184783935546875,
-0.054168701171875,
0.041900634765625,
0.0250396728515625,
0.004009246826171875,
0.042572021484375,
0.0338134765625,
0.002582550048828125,
0.0643310546875,
0.03271484375,
-0.00730133056640625,
0.033355712890625,
-0.036529541015625,
0.021240234375,
-0.06829833984375,
-0.002239227294921875,
-0.033660888671875,
-0.0091705322265625,
-0.0416259765625,
-0.038848876953125,
0.019287109375,
-0.004764556884765625,
-0.01207733154296875,
0.059478759765625,
-0.0299835205078125,
0.0152130126953125,
0.044708251953125,
0.023162841796875,
-0.0050811767578125,
0.0033416748046875,
-0.040496826171875,
-0.00494384765625,
-0.0538330078125,
-0.04852294921875,
0.07061767578125,
0.0163421630859375,
0.03265380859375,
-0.01422119140625,
0.061767578125,
-0.006984710693359375,
-0.0030460357666015625,
-0.036041259765625,
0.04962158203125,
-0.0233612060546875,
-0.0258331298828125,
-0.0198974609375,
-0.03387451171875,
-0.06658935546875,
0.0309600830078125,
-0.0023651123046875,
-0.05517578125,
0.00006586313247680664,
-0.01444244384765625,
-0.027008056640625,
0.024627685546875,
-0.060394287109375,
0.0843505859375,
0.007476806640625,
-0.01151275634765625,
-0.006587982177734375,
-0.059417724609375,
0.01202392578125,
-0.004486083984375,
0.021270751953125,
0.0017414093017578125,
-0.0113677978515625,
0.0675048828125,
-0.0279083251953125,
0.05511474609375,
-0.00870513916015625,
0.0175323486328125,
0.02313232421875,
-0.0196685791015625,
0.02569580078125,
-0.003932952880859375,
-0.00848388671875,
0.0034084320068359375,
0.004924774169921875,
-0.032958984375,
-0.039642333984375,
0.0546875,
-0.07086181640625,
-0.0299530029296875,
-0.0274505615234375,
-0.055999755859375,
-0.00504302978515625,
0.0117034912109375,
0.0276336669921875,
0.03387451171875,
0.00011426210403442383,
0.046173095703125,
0.037078857421875,
-0.03570556640625,
0.0511474609375,
0.000659942626953125,
-0.000005364418029785156,
-0.044830322265625,
0.051513671875,
0.006908416748046875,
0.01346588134765625,
0.05145263671875,
0.0204925537109375,
-0.0362548828125,
-0.015777587890625,
-0.021240234375,
0.035003662109375,
-0.05340576171875,
-0.01084136962890625,
-0.09271240234375,
-0.0323486328125,
-0.049560546875,
-0.0014858245849609375,
-0.010009765625,
-0.028411865234375,
-0.0294189453125,
-0.0180511474609375,
0.027252197265625,
0.030242919921875,
-0.002391815185546875,
0.0291595458984375,
-0.06036376953125,
0.0235748291015625,
0.01267242431640625,
-0.0079803466796875,
-0.00830078125,
-0.06390380859375,
-0.02880859375,
0.0066375732421875,
-0.0404052734375,
-0.059112548828125,
0.059234619140625,
0.0234832763671875,
0.050079345703125,
0.0095062255859375,
0.01215362548828125,
0.05291748046875,
-0.042633056640625,
0.06390380859375,
0.0027065277099609375,
-0.07940673828125,
0.0167999267578125,
-0.0030670166015625,
0.031280517578125,
0.03253173828125,
0.0267333984375,
-0.0399169921875,
-0.048553466796875,
-0.04742431640625,
-0.08526611328125,
0.06060791015625,
0.042022705078125,
0.049957275390625,
-0.020904541015625,
0.0254669189453125,
-0.008331298828125,
0.007129669189453125,
-0.08917236328125,
-0.041351318359375,
-0.028533935546875,
-0.044647216796875,
-0.035247802734375,
-0.01611328125,
0.011383056640625,
-0.029998779296875,
0.050506591796875,
0.007419586181640625,
0.053070068359375,
0.020660400390625,
-0.037139892578125,
0.0151519775390625,
0.0182952880859375,
0.039337158203125,
0.0293731689453125,
-0.007701873779296875,
0.0195770263671875,
0.024139404296875,
-0.02252197265625,
-0.0014581680297851562,
0.03826904296875,
-0.00548553466796875,
0.02471923828125,
0.028961181640625,
0.07086181640625,
0.04681396484375,
-0.03717041015625,
0.0462646484375,
-0.0050201416015625,
-0.028717041015625,
-0.0215911865234375,
-0.01207733154296875,
0.02667236328125,
0.0222930908203125,
0.0297393798828125,
0.00135040283203125,
-0.00476837158203125,
-0.0279541015625,
0.031402587890625,
0.0196990966796875,
-0.0202484130859375,
-0.0101776123046875,
0.0679931640625,
-0.0105438232421875,
-0.0130615234375,
0.057952880859375,
-0.012115478515625,
-0.055877685546875,
0.03753662109375,
0.04302978515625,
0.0750732421875,
-0.00543975830078125,
0.027008056640625,
0.038604736328125,
0.0206298828125,
-0.013916015625,
-0.00537109375,
-0.002346038818359375,
-0.061798095703125,
-0.01221466064453125,
-0.051544189453125,
-0.00559234619140625,
0.004390716552734375,
-0.049774169921875,
0.0207672119140625,
0.0009784698486328125,
0.0058746337890625,
-0.0029506683349609375,
-0.015899658203125,
-0.044891357421875,
-0.0016279220581054688,
-0.0014848709106445312,
0.054351806640625,
-0.06884765625,
0.060394287109375,
0.053192138671875,
-0.05145263671875,
-0.0526123046875,
0.0062408447265625,
-0.04180908203125,
-0.063720703125,
0.038421630859375,
0.03399658203125,
0.021484375,
0.013824462890625,
-0.032989501953125,
-0.0673828125,
0.09722900390625,
0.02923583984375,
-0.017364501953125,
-0.016632080078125,
0.0234527587890625,
0.031402587890625,
-0.029083251953125,
0.026336669921875,
0.0290985107421875,
0.0302276611328125,
-0.00439453125,
-0.05364990234375,
0.0198516845703125,
-0.0131072998046875,
0.01154327392578125,
-0.006378173828125,
-0.041412353515625,
0.08673095703125,
-0.0037097930908203125,
-0.004680633544921875,
0.0240325927734375,
0.058380126953125,
0.024688720703125,
0.0007786750793457031,
0.0280609130859375,
0.05364990234375,
0.0253448486328125,
0.0025787353515625,
0.07373046875,
-0.045196533203125,
0.060455322265625,
0.070068359375,
-0.002872467041015625,
0.0780029296875,
0.042266845703125,
-0.00986480712890625,
0.0531005859375,
0.0301361083984375,
-0.003345489501953125,
0.05364990234375,
-0.00597381591796875,
-0.00534820556640625,
-0.01012420654296875,
0.0251617431640625,
-0.01287078857421875,
0.0233001708984375,
0.01029205322265625,
-0.0546875,
-0.0234832763671875,
0.01308441162109375,
0.01134490966796875,
0.01013946533203125,
-0.0005116462707519531,
0.040191650390625,
0.023101806640625,
-0.046966552734375,
0.035614013671875,
0.016510009765625,
0.06341552734375,
-0.040496826171875,
0.0132904052734375,
-0.01174163818359375,
0.034454345703125,
0.005970001220703125,
-0.045196533203125,
0.033355712890625,
-0.01227569580078125,
0.00531005859375,
-0.027587890625,
0.0297698974609375,
-0.052154541015625,
-0.0513916015625,
0.034423828125,
0.0428466796875,
0.0033931732177734375,
-0.00684356689453125,
-0.09332275390625,
-0.0082550048828125,
0.0045318603515625,
-0.03509521484375,
0.024688720703125,
0.039459228515625,
0.0238800048828125,
0.049224853515625,
0.02056884765625,
-0.022064208984375,
0.0151519775390625,
0.0008325576782226562,
0.05242919921875,
-0.0513916015625,
-0.041595458984375,
-0.08099365234375,
0.036895751953125,
-0.01134490966796875,
-0.01922607421875,
0.074462890625,
0.04901123046875,
0.05999755859375,
-0.022369384765625,
0.050506591796875,
-0.012359619140625,
0.0226287841796875,
-0.037384033203125,
0.058258056640625,
-0.037750244140625,
-0.0138397216796875,
-0.032989501953125,
-0.06793212890625,
-0.0230712890625,
0.077880859375,
-0.0250701904296875,
0.01177215576171875,
0.0833740234375,
0.0654296875,
-0.017578125,
-0.0165863037109375,
0.0216522216796875,
0.0279998779296875,
0.00965118408203125,
0.035430908203125,
0.0269775390625,
-0.06329345703125,
0.0750732421875,
-0.046905517578125,
0.0009593963623046875,
-0.0007200241088867188,
-0.05413818359375,
-0.0694580078125,
-0.066162109375,
-0.037078857421875,
-0.02337646484375,
-0.013763427734375,
0.07244873046875,
0.034423828125,
-0.058746337890625,
-0.019256591796875,
-0.0257415771484375,
-0.0104827880859375,
-0.0196075439453125,
-0.0237884521484375,
0.038665771484375,
-0.051361083984375,
-0.0712890625,
0.00400543212890625,
-0.0016880035400390625,
-0.0004456043243408203,
-0.01284027099609375,
0.00530242919921875,
-0.047210693359375,
0.0158538818359375,
0.040771484375,
-0.015350341796875,
-0.048858642578125,
-0.014678955078125,
0.0025234222412109375,
-0.02679443359375,
0.002780914306640625,
0.0263519287109375,
-0.04046630859375,
0.00705718994140625,
0.0212249755859375,
0.033966064453125,
0.0479736328125,
-0.01181793212890625,
0.034820556640625,
-0.05413818359375,
0.0204315185546875,
-0.00133514404296875,
0.062286376953125,
0.0343017578125,
-0.01013946533203125,
0.0224151611328125,
0.01953125,
-0.040191650390625,
-0.054290771484375,
-0.0160369873046875,
-0.080078125,
-0.023590087890625,
0.09722900390625,
-0.03369140625,
-0.029266357421875,
0.0008974075317382812,
-0.0252685546875,
0.037689208984375,
-0.025543212890625,
0.052032470703125,
0.06170654296875,
0.00891876220703125,
-0.022613525390625,
-0.0288543701171875,
0.01000213623046875,
0.03680419921875,
-0.046051025390625,
-0.0028667449951171875,
0.00585174560546875,
0.034820556640625,
0.0161590576171875,
0.021026611328125,
-0.00875091552734375,
0.000946044921875,
0.0019350051879882812,
0.002880096435546875,
-0.0047760009765625,
0.01161956787109375,
-0.02349853515625,
0.0167694091796875,
-0.0301361083984375,
-0.025115966796875
]
] |
intfloat/e5-large-unsupervised | 2023-07-27T05:02:57.000Z | [
"sentence-transformers",
"pytorch",
"safetensors",
"bert",
"Sentence Transformers",
"sentence-similarity",
"en",
"arxiv:2212.03533",
"arxiv:2104.08663",
"arxiv:2210.07316",
"license:mit",
"endpoints_compatible",
"region:us"
] | sentence-similarity | intfloat | null | null | intfloat/e5-large-unsupervised | 0 | 25,585 | sentence-transformers | 2023-01-31T03:03:36 | ---
tags:
- Sentence Transformers
- sentence-similarity
- sentence-transformers
language:
- en
license: mit
---
# E5-large-unsupervised
**This model is similar to [e5-large](https://huggingface.co/intfloat/e5-large) but without supervised fine-tuning.**
[Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
This model has 24 layers and the embedding size is 1024.
## Usage
Below is an example to encode queries and passages from the MS-MARCO passage ranking dataset.
```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def average_pool(last_hidden_states: Tensor,
attention_mask: Tensor) -> Tensor:
last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
# Each input text should start with "query: " or "passage: ".
# For tasks other than retrieval, you can simply use the "query: " prefix.
input_texts = ['query: how much protein should a female eat',
'query: summit define',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]
tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-large-unsupervised')
model = AutoModel.from_pretrained('intfloat/e5-large-unsupervised')
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
# normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
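# scores[i][j] is 100x the cosine similarity between query i and passage j (embeddings are unit-normalized).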
print(scores.tolist())
```
## Training Details
Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).
## Benchmark Evaluation
Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce the evaluation results
on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB](https://arxiv.org/abs/2210.07316) benchmarks.
## Support for Sentence Transformers
Below is an example of usage with sentence_transformers.
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer('intfloat/e5-large-unsupervised')
input_texts = [
'query: how much protein should a female eat',
'query: summit define',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
]
embeddings = model.encode(input_texts, normalize_embeddings=True)
```
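Since `normalize_embeddings=True` returns unit-length vectors, relevance scores can be computed as plain dot products between the query and passage embeddings, just as in the `transformers` example above.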
Package requirements: `pip install sentence_transformers~=2.2.2`
Contributors: [michaelfeil](https://huggingface.co/michaelfeil)
## FAQ
**1. Do I need to add the prefix "query: " and "passage: " to input texts?**
Yes. This is how the model was trained; otherwise you will see a performance degradation.
Here are some rules of thumb:
- Use "query: " and "passage: " correspondingly for asymmetric tasks such as passage retrieval in open QA, ad-hoc information retrieval.
- Use "query: " prefix for symmetric tasks such as semantic similarity, paraphrase retrieval.
- Use "query: " prefix if you want to use embeddings as features, such as linear probing classification, clustering.
**2. Why are my reproduced results slightly different from those reported in the model card?**
Different versions of `transformers` and `pytorch` could cause negligible but non-zero performance differences.
## Citation
If you find our paper or models helpful, please consider citing us as follows:
```
@article{wang2022text,
title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
journal={arXiv preprint arXiv:2212.03533},
year={2022}
}
```
## Limitations
This model only works for English texts. Long texts will be truncated to at most 512 tokens.
| 5,095 | [
[
-0.0105438232421875,
-0.05224609375,
0.01519775390625,
0.01505279541015625,
-0.0186767578125,
-0.02996826171875,
-0.0022869110107421875,
-0.030609130859375,
0.007312774658203125,
0.0274505615234375,
-0.0404052734375,
-0.04766845703125,
-0.07574462890625,
0.0236663818359375,
-0.0308990478515625,
0.07110595703125,
0.00434112548828125,
0.0052947998046875,
-0.02703857421875,
-0.0027713775634765625,
-0.0169830322265625,
-0.04071044921875,
-0.0240936279296875,
-0.0255126953125,
0.0219268798828125,
0.015960693359375,
0.041351318359375,
0.0423583984375,
0.049560546875,
0.026763916015625,
-0.012359619140625,
0.01824951171875,
-0.039520263671875,
-0.01413726806640625,
-0.0005159378051757812,
-0.03948974609375,
-0.03143310546875,
0.014739990234375,
0.04339599609375,
0.0672607421875,
0.00945281982421875,
0.0213165283203125,
0.0279693603515625,
0.043975830078125,
-0.045074462890625,
0.01509857177734375,
-0.032318115234375,
0.011505126953125,
0.008453369140625,
0.0007877349853515625,
-0.03009033203125,
0.0193939208984375,
0.0240631103515625,
-0.0457763671875,
0.0220794677734375,
0.0100555419921875,
0.09320068359375,
0.02459716796875,
-0.04022216796875,
-0.012603759765625,
-0.01171875,
0.07232666015625,
-0.0528564453125,
0.035400390625,
0.050201416015625,
-0.019561767578125,
-0.01425933837890625,
-0.07293701171875,
-0.0309906005859375,
-0.014068603515625,
-0.0171356201171875,
0.01255035400390625,
-0.0222320556640625,
-0.0013561248779296875,
0.03515625,
0.033416748046875,
-0.060699462890625,
-0.0019235610961914062,
-0.02972412109375,
-0.005176544189453125,
0.041229248046875,
0.007694244384765625,
0.0223846435546875,
-0.0345458984375,
-0.0158843994140625,
-0.0180816650390625,
-0.04058837890625,
0.006145477294921875,
0.01343536376953125,
0.02740478515625,
-0.0269012451171875,
0.0443115234375,
-0.0261688232421875,
0.04962158203125,
0.019866943359375,
0.0100250244140625,
0.0499267578125,
-0.038055419921875,
-0.022552490234375,
-0.0214996337890625,
0.07403564453125,
0.04022216796875,
0.01386260986328125,
-0.00916290283203125,
-0.00750732421875,
-0.007843017578125,
0.00789642333984375,
-0.0833740234375,
-0.041229248046875,
0.018218994140625,
-0.04876708984375,
-0.0142974853515625,
0.0101165771484375,
-0.04150390625,
-0.00536346435546875,
-0.019561767578125,
0.06646728515625,
-0.039520263671875,
0.006870269775390625,
0.0205230712890625,
-0.0166778564453125,
0.00713348388671875,
0.00905609130859375,
-0.064697265625,
0.019317626953125,
0.0079803466796875,
0.06842041015625,
-0.004974365234375,
-0.031707763671875,
-0.0401611328125,
-0.002155303955078125,
0.000008463859558105469,
0.04022216796875,
-0.029541015625,
-0.0205230712890625,
0.002536773681640625,
0.037109375,
-0.0379638671875,
-0.03668212890625,
0.04522705078125,
-0.0231170654296875,
0.029815673828125,
-0.017608642578125,
-0.043914794921875,
-0.007598876953125,
0.0191802978515625,
-0.033355712890625,
0.07550048828125,
0.005756378173828125,
-0.06982421875,
0.01255035400390625,
-0.0386962890625,
-0.0310211181640625,
-0.00885009765625,
-0.004848480224609375,
-0.036590576171875,
-0.006866455078125,
0.043853759765625,
0.03277587890625,
-0.01068115234375,
0.0033016204833984375,
-0.00733184814453125,
-0.035675048828125,
0.017333984375,
-0.01239013671875,
0.06561279296875,
0.00472259521484375,
-0.03668212890625,
-0.007415771484375,
-0.058319091796875,
0.0052490234375,
0.01172637939453125,
-0.0341796875,
-0.00537872314453125,
0.00679779052734375,
0.0012388229370117188,
0.02099609375,
0.029266357421875,
-0.040771484375,
0.0179901123046875,
-0.03729248046875,
0.062164306640625,
0.039825439453125,
0.0047149658203125,
0.0316162109375,
-0.02813720703125,
0.0069427490234375,
0.0309906005859375,
0.0081634521484375,
-0.004451751708984375,
-0.03509521484375,
-0.06182861328125,
-0.01215362548828125,
0.045867919921875,
0.033416748046875,
-0.036376953125,
0.046295166015625,
-0.0247039794921875,
-0.022491455078125,
-0.05133056640625,
0.00543975830078125,
0.01509857177734375,
0.0253143310546875,
0.059722900390625,
-0.006649017333984375,
-0.05078125,
-0.07861328125,
-0.0240020751953125,
0.0178985595703125,
-0.02264404296875,
0.013885498046875,
0.0693359375,
-0.029266357421875,
0.042510986328125,
-0.051177978515625,
-0.042999267578125,
-0.0156707763671875,
0.01079559326171875,
0.0301361083984375,
0.05584716796875,
0.0258636474609375,
-0.0628662109375,
-0.0325927734375,
-0.036773681640625,
-0.0667724609375,
0.0050811767578125,
0.00945281982421875,
-0.01099395751953125,
-0.00441741943359375,
0.037628173828125,
-0.051483154296875,
0.0262603759765625,
0.03814697265625,
-0.033355712890625,
0.019561767578125,
-0.029815673828125,
0.00826263427734375,
-0.07635498046875,
0.0031108856201171875,
0.01390838623046875,
-0.0158233642578125,
-0.0280303955078125,
0.014129638671875,
0.00439453125,
-0.01233673095703125,
-0.031707763671875,
0.0178985595703125,
-0.047332763671875,
0.01357269287109375,
-0.0120086669921875,
0.02099609375,
0.022613525390625,
0.040008544921875,
-0.008026123046875,
0.042999267578125,
0.04638671875,
-0.06268310546875,
0.0012235641479492188,
0.05120849609375,
-0.0303497314453125,
0.022491455078125,
-0.0693359375,
0.004009246826171875,
-0.004215240478515625,
0.0226287841796875,
-0.06842041015625,
-0.01885986328125,
0.0136566162109375,
-0.05078125,
0.0209503173828125,
-0.0023708343505859375,
-0.041229248046875,
-0.026397705078125,
-0.036651611328125,
0.0153045654296875,
0.038970947265625,
-0.0293426513671875,
0.038665771484375,
0.0208587646484375,
0.00455474853515625,
-0.038299560546875,
-0.07806396484375,
-0.008056640625,
-0.002170562744140625,
-0.05078125,
0.052947998046875,
-0.0109100341796875,
0.0195465087890625,
0.0023136138916015625,
-0.00868988037109375,
0.01435089111328125,
-0.01277923583984375,
0.0191192626953125,
0.001312255859375,
0.004486083984375,
0.00384521484375,
-0.0109100341796875,
-0.0015659332275390625,
0.00203704833984375,
-0.02227783203125,
0.040771484375,
-0.0227813720703125,
0.007595062255859375,
-0.046539306640625,
0.03887939453125,
0.0215301513671875,
-0.023590087890625,
0.08148193359375,
0.06121826171875,
-0.028533935546875,
0.004364013671875,
-0.023773193359375,
-0.02764892578125,
-0.03509521484375,
0.044769287109375,
-0.040802001953125,
-0.043975830078125,
0.03314208984375,
0.004917144775390625,
-0.0021877288818359375,
0.06890869140625,
0.024322509765625,
-0.021331787109375,
0.10491943359375,
0.05706787109375,
0.0125274658203125,
0.0288238525390625,
-0.0535888671875,
0.00420379638671875,
-0.074462890625,
-0.027496337890625,
-0.0445556640625,
-0.036468505859375,
-0.06341552734375,
-0.0291595458984375,
0.028594970703125,
0.0148162841796875,
-0.03924560546875,
0.022216796875,
-0.045806884765625,
0.00830841064453125,
0.04461669921875,
0.03729248046875,
0.0098419189453125,
0.0121917724609375,
-0.01511383056640625,
-0.0262603759765625,
-0.07183837890625,
-0.0247802734375,
0.0789794921875,
0.0200653076171875,
0.053314208984375,
-0.006420135498046875,
0.041839599609375,
0.010498046875,
-0.01100921630859375,
-0.049346923828125,
0.039154052734375,
-0.03143310546875,
-0.0241546630859375,
-0.01332855224609375,
-0.05029296875,
-0.08306884765625,
0.032440185546875,
-0.0355224609375,
-0.05224609375,
0.015655517578125,
-0.0120697021484375,
-0.0216064453125,
0.0111083984375,
-0.0672607421875,
0.078369140625,
0.0031795501708984375,
-0.0195159912109375,
0.0017957687377929688,
-0.051055908203125,
-0.01323699951171875,
0.0220947265625,
0.010284423828125,
0.0033397674560546875,
-0.0019130706787109375,
0.0804443359375,
-0.022369384765625,
0.0709228515625,
-0.004764556884765625,
0.031768798828125,
0.007396697998046875,
-0.0159759521484375,
0.042205810546875,
-0.015228271484375,
-0.0005693435668945312,
0.0211029052734375,
0.00653076171875,
-0.04364013671875,
-0.0265960693359375,
0.059295654296875,
-0.0927734375,
-0.040252685546875,
-0.04058837890625,
-0.03814697265625,
0.00563812255859375,
0.01507568359375,
0.048004150390625,
0.038116455078125,
0.01110076904296875,
0.04461669921875,
0.0411376953125,
-0.0278778076171875,
0.029632568359375,
0.026031494140625,
0.006626129150390625,
-0.032867431640625,
0.049346923828125,
0.03289794921875,
0.01068115234375,
0.047943115234375,
0.0167083740234375,
-0.0255126953125,
-0.041107177734375,
-0.015869140625,
0.032440185546875,
-0.049285888671875,
-0.0140533447265625,
-0.08428955078125,
-0.0264129638671875,
-0.046722412109375,
0.0029964447021484375,
-0.0206756591796875,
-0.0277099609375,
-0.02703857421875,
-0.00408935546875,
0.01302337646484375,
0.032012939453125,
-0.0069122314453125,
0.0241546630859375,
-0.049407958984375,
0.021514892578125,
0.005283355712890625,
0.003692626953125,
-0.01026153564453125,
-0.07196044921875,
-0.0290985107421875,
0.01483154296875,
-0.04815673828125,
-0.06451416015625,
0.03533935546875,
0.031463623046875,
0.044647216796875,
0.01264190673828125,
0.0099029541015625,
0.04730224609375,
-0.0289764404296875,
0.07257080078125,
0.01155853271484375,
-0.069091796875,
0.047149658203125,
-0.010498046875,
0.049957275390625,
0.03509521484375,
0.054443359375,
-0.026092529296875,
-0.0278472900390625,
-0.062286376953125,
-0.08245849609375,
0.04595947265625,
0.03094482421875,
0.01373291015625,
-0.0037689208984375,
0.0208892822265625,
-0.00702667236328125,
0.020751953125,
-0.08416748046875,
-0.0234222412109375,
-0.0274810791015625,
-0.0183868408203125,
-0.018524169921875,
-0.01309967041015625,
-0.001194000244140625,
-0.04547119140625,
0.062744140625,
-0.0040740966796875,
0.047149658203125,
0.04205322265625,
-0.035125732421875,
0.00443267822265625,
0.002437591552734375,
0.0239715576171875,
0.04718017578125,
-0.030609130859375,
0.0213623046875,
0.0239410400390625,
-0.048187255859375,
-0.0159454345703125,
0.0164337158203125,
-0.01934814453125,
0.002155303955078125,
0.037353515625,
0.057525634765625,
0.0236663818359375,
-0.0273590087890625,
0.046417236328125,
0.0027713775634765625,
-0.02978515625,
-0.0092315673828125,
-0.005542755126953125,
0.01495361328125,
0.0168914794921875,
0.03668212890625,
-0.00766754150390625,
0.01045989990234375,
-0.045135498046875,
0.006000518798828125,
0.0027484893798828125,
-0.0297088623046875,
-0.02239990234375,
0.0555419921875,
0.0200653076171875,
-0.005504608154296875,
0.07806396484375,
-0.00707244873046875,
-0.039764404296875,
0.035552978515625,
0.052825927734375,
0.048309326171875,
-0.00838470458984375,
0.01428985595703125,
0.0634765625,
0.03277587890625,
0.0015163421630859375,
0.01071929931640625,
0.016204833984375,
-0.051177978515625,
-0.0248565673828125,
-0.069091796875,
0.00022101402282714844,
0.025909423828125,
-0.03533935546875,
0.015045166015625,
-0.0031986236572265625,
-0.0200347900390625,
0.007251739501953125,
0.03460693359375,
-0.0682373046875,
0.020965576171875,
-0.010284423828125,
0.05279541015625,
-0.069091796875,
0.0391845703125,
0.06280517578125,
-0.062164306640625,
-0.056915283203125,
-0.0008282661437988281,
-0.0296478271484375,
-0.04547119140625,
0.052154541015625,
0.039642333984375,
0.01361846923828125,
0.00788116455078125,
-0.041595458984375,
-0.0517578125,
0.082763671875,
0.0115966796875,
-0.038238525390625,
-0.0211334228515625,
0.024658203125,
0.031463623046875,
-0.03680419921875,
0.03863525390625,
0.02618408203125,
0.0253143310546875,
-0.00399017333984375,
-0.05584716796875,
0.01470184326171875,
-0.024871826171875,
-0.0098724365234375,
-0.0085906982421875,
-0.051910400390625,
0.08831787109375,
-0.020111083984375,
-0.0020599365234375,
0.0011463165283203125,
0.04766845703125,
0.004596710205078125,
0.001262664794921875,
0.032562255859375,
0.04681396484375,
0.05010986328125,
-0.0015649795532226562,
0.087646484375,
-0.0244293212890625,
0.0419921875,
0.0589599609375,
0.0246734619140625,
0.06304931640625,
0.03265380859375,
-0.02294921875,
0.046173095703125,
0.06597900390625,
-0.0128326416015625,
0.052032470703125,
0.0078582763671875,
0.01041412353515625,
-0.0261993408203125,
0.0025691986083984375,
-0.04351806640625,
0.031280517578125,
0.013153076171875,
-0.05029296875,
-0.01549530029296875,
0.006931304931640625,
0.0040740966796875,
-0.01276397705078125,
-0.0213165283203125,
0.037445068359375,
0.035247802734375,
-0.037933349609375,
0.0733642578125,
0.010284423828125,
0.053131103515625,
-0.049774169921875,
0.0158233642578125,
-0.015289306640625,
0.03131103515625,
-0.0244140625,
-0.0452880859375,
0.00791168212890625,
-0.00669097900390625,
-0.0211944580078125,
-0.00879669189453125,
0.04095458984375,
-0.04443359375,
-0.039093017578125,
0.021453857421875,
0.04144287109375,
0.020599365234375,
-0.025421142578125,
-0.0738525390625,
-0.0016498565673828125,
0.005580902099609375,
-0.0300445556640625,
0.03460693359375,
0.0191497802734375,
0.0160064697265625,
0.04302978515625,
0.03948974609375,
-0.0088958740234375,
-0.004932403564453125,
0.01369476318359375,
0.061065673828125,
-0.04736328125,
-0.04541015625,
-0.0615234375,
0.0286407470703125,
-0.019744873046875,
-0.0259246826171875,
0.0626220703125,
0.051605224609375,
0.057159423828125,
-0.01403045654296875,
0.0345458984375,
0.0004401206970214844,
0.01172637939453125,
-0.041656494140625,
0.049652099609375,
-0.05413818359375,
-0.0078277587890625,
-0.0172119140625,
-0.07708740234375,
-0.014068603515625,
0.066162109375,
-0.0330810546875,
0.0092926025390625,
0.06964111328125,
0.056396484375,
-0.0185546875,
-0.006908416748046875,
0.017120361328125,
0.04412841796875,
0.02142333984375,
0.060577392578125,
0.040618896484375,
-0.07611083984375,
0.052886962890625,
-0.00957489013671875,
-0.0178680419921875,
-0.01357269287109375,
-0.057342529296875,
-0.06951904296875,
-0.05224609375,
-0.042816162109375,
-0.0257110595703125,
0.00765228271484375,
0.07171630859375,
0.057403564453125,
-0.04412841796875,
-0.0076141357421875,
0.00013768672943115234,
-0.017425537109375,
-0.025299072265625,
-0.0174560546875,
0.045562744140625,
-0.0278778076171875,
-0.0699462890625,
0.017425537109375,
-0.0132904052734375,
0.00830078125,
0.004131317138671875,
0.001842498779296875,
-0.04559326171875,
-0.002506256103515625,
0.054656982421875,
-0.00792694091796875,
-0.028350830078125,
-0.03021240234375,
-0.0005674362182617188,
-0.01947021484375,
0.011810302734375,
0.01035308837890625,
-0.04718017578125,
0.019195556640625,
0.042510986328125,
0.032012939453125,
0.07647705078125,
-0.0012578964233398438,
0.032867431640625,
-0.053497314453125,
0.011322021484375,
0.007049560546875,
0.02777099609375,
0.040374755859375,
-0.02142333984375,
0.03564453125,
0.0287322998046875,
-0.04779052734375,
-0.04638671875,
-0.0091552734375,
-0.07220458984375,
-0.019073486328125,
0.07928466796875,
-0.01404571533203125,
-0.023406982421875,
0.012237548828125,
-0.002483367919921875,
0.0287322998046875,
-0.0208892822265625,
0.057342529296875,
0.062286376953125,
-0.00983428955078125,
-0.00408172607421875,
-0.059539794921875,
0.038543701171875,
0.037261962890625,
-0.041015625,
-0.02252197265625,
0.0081024169921875,
0.03656005859375,
0.01044464111328125,
0.03460693359375,
-0.01027679443359375,
-0.0017557144165039062,
0.0189666748046875,
-0.0027256011962890625,
-0.007022857666015625,
-0.011383056640625,
-0.007236480712890625,
0.0133819580078125,
-0.02142333984375,
-0.025482177734375
]
] |
csarron/mobilebert-uncased-squad-v2 | 2023-07-18T16:52:20.000Z | [
"transformers",
"pytorch",
"onnx",
"safetensors",
"mobilebert",
"question-answering",
"en",
"dataset:squad_v2",
"arxiv:2004.02984",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | question-answering | csarron | null | null | csarron/mobilebert-uncased-squad-v2 | 1 | 25,390 | transformers | 2022-03-02T23:29:05 | ---
language: en
thumbnail:
license: mit
tags:
- question-answering
- mobilebert
datasets:
- squad_v2
metrics:
- squad_v2
widget:
- text: "Which name is also used to describe the Amazon rainforest in English?"
context: "The Amazon rainforest (Portuguese: Floresta Amazônica or Amazônia; Spanish: Selva Amazónica, Amazonía or usually Amazonia; French: Forêt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain \"Amazonas\" in their names. The Amazon represents over half of the planet's remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species."
- text: "How many square kilometers of rainforest is covered in the basin?"
context: "The Amazon rainforest (Portuguese: Floresta Amazônica or Amazônia; Spanish: Selva Amazónica, Amazonía or usually Amazonia; French: Forêt amazonienne; Dutch: Amazoneregenwoud), also known in English as Amazonia or the Amazon Jungle, is a moist broadleaf forest that covers most of the Amazon basin of South America. This basin encompasses 7,000,000 square kilometres (2,700,000 sq mi), of which 5,500,000 square kilometres (2,100,000 sq mi) are covered by the rainforest. This region includes territory belonging to nine nations. The majority of the forest is contained within Brazil, with 60% of the rainforest, followed by Peru with 13%, Colombia with 10%, and with minor amounts in Venezuela, Ecuador, Bolivia, Guyana, Suriname and French Guiana. States or departments in four nations contain \"Amazonas\" in their names. The Amazon represents over half of the planet's remaining rainforests, and comprises the largest and most biodiverse tract of tropical rainforest in the world, with an estimated 390 billion individual trees divided into 16,000 species."
---
## MobileBERT fine-tuned on SQuAD v2
[MobileBERT](https://arxiv.org/abs/2004.02984) is a thin version of BERT_LARGE, equipped with bottleneck structures and a carefully designed balance between self-attention and feed-forward networks.
This model was fine-tuned from the HuggingFace checkpoint `google/mobilebert-uncased` on [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer).
## Details
| Dataset | Split | # samples |
| -------- | ----- | --------- |
| SQuAD2.0 | train | 130k |
| SQuAD2.0 | eval | 12.3k |
### Fine-tuning
- Python: `3.7.5`
- Machine specs:
`CPU: Intel(R) Core(TM) i7-6800K CPU @ 3.40GHz`
`Memory: 32 GiB`
`GPUs: 2 GeForce GTX 1070, each with 8GiB memory`
`GPU driver: 418.87.01, CUDA: 10.1`
- script:
```shell
# after install https://github.com/huggingface/transformers
cd examples/question-answering
mkdir -p data
wget -O data/train-v2.0.json https://rajpurkar.github.io/SQuAD-explorer/dataset/train-v2.0.json
wget -O data/dev-v2.0.json https://rajpurkar.github.io/SQuAD-explorer/dataset/dev-v2.0.json
export SQUAD_DIR=`pwd`/data
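# Train for 5 epochs at LR 4e-5 with per-GPU batch size 16 (2 GPUs), max sequence length 320, doc stride 128.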
python run_squad.py \
--model_type mobilebert \
--model_name_or_path google/mobilebert-uncased \
--do_train \
--do_eval \
--do_lower_case \
--version_2_with_negative \
--train_file $SQUAD_DIR/train-v2.0.json \
--predict_file $SQUAD_DIR/dev-v2.0.json \
--per_gpu_train_batch_size 16 \
--per_gpu_eval_batch_size 16 \
--learning_rate 4e-5 \
--num_train_epochs 5.0 \
--max_seq_length 320 \
--doc_stride 128 \
--warmup_steps 1400 \
--save_steps 2000 \
--output_dir $SQUAD_DIR/mobilebert-uncased-warmup-squad_v2 2>&1 | tee train-mobilebert-warmup-squad_v2.log
```
Fine-tuning took about 3.5 hours to finish.
### Results
**Model size**: `95M`
| Metric | Value | Original ([Table 5](https://arxiv.org/pdf/2004.02984.pdf)) |
| ------ | --------- | --------- |
| **EM** | **75.2** | **76.2** |
| **F1** | **78.8** | **79.2** |
Note that the above results didn't involve any hyperparameter search.
## Example Usage
```python
from transformers import pipeline
qa_pipeline = pipeline(
"question-answering",
model="csarron/mobilebert-uncased-squad-v2",
tokenizer="csarron/mobilebert-uncased-squad-v2"
)
predictions = qa_pipeline({
'context': "The game was played on February 7, 2016 at Levi's Stadium in the San Francisco Bay Area at Santa Clara, California.",
'question': "What day was the game played on?"
})
print(predictions)
# output:
# {'score': 0.71434086561203, 'start': 23, 'end': 39, 'answer': 'February 7, 2016'}
```
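Because the model was fine-tuned on SQuAD v2, which includes unanswerable questions, you may also want the pipeline to be able to return an empty answer. A minimal sketch, assuming a recent `transformers` release where the question-answering pipeline accepts `handle_impossible_answer` (the question below is hypothetical):
```python
# Reuses qa_pipeline from the example above.
prediction = qa_pipeline(
    {
        'context': "The game was played on February 7, 2016 at Levi's Stadium.",
        'question': "Who was the referee of the game?",  # not answerable from the context
    },
    handle_impossible_answer=True,  # lets the pipeline return an empty answer
)
print(prediction)
# An empty 'answer' with start == end == 0 signals "no answer found".
```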
> Created by [Qingqing Cao](https://awk.ai/) | [GitHub](https://github.com/csarron) | [Twitter](https://twitter.com/sysnlp)
> Made with ❤️ in New York. | 5,224 | [
[
-0.036865234375,
-0.044158935546875,
0.0194091796875,
0.0263214111328125,
-0.0132293701171875,
0.00952911376953125,
-0.01142120361328125,
-0.0280303955078125,
0.019622802734375,
0.00762939453125,
-0.07867431640625,
-0.033935546875,
-0.0291290283203125,
-0.00579833984375,
-0.0037593841552734375,
0.08734130859375,
-0.0087890625,
-0.005977630615234375,
0.001712799072265625,
-0.031829833984375,
-0.0194091796875,
-0.036224365234375,
-0.055419921875,
-0.0258636474609375,
0.0272979736328125,
0.0211181640625,
0.044891357421875,
0.047760009765625,
0.043182373046875,
0.0242919921875,
-0.00411224365234375,
0.00262451171875,
-0.04632568359375,
0.007049560546875,
0.0178680419921875,
-0.0271759033203125,
-0.0438232421875,
0.009033203125,
0.034912109375,
0.005825042724609375,
-0.0005068778991699219,
0.03692626953125,
-0.00345611572265625,
0.0555419921875,
-0.03411865234375,
0.00942230224609375,
-0.046630859375,
0.000881195068359375,
0.0103759765625,
-0.004734039306640625,
-0.006732940673828125,
-0.00864410400390625,
0.01194000244140625,
-0.03839111328125,
0.0400390625,
-0.01422119140625,
0.091796875,
0.03076171875,
-0.0107269287109375,
-0.004764556884765625,
-0.03204345703125,
0.07501220703125,
-0.0462646484375,
0.01171875,
0.0179443359375,
0.02490234375,
-0.00452423095703125,
-0.039581298828125,
-0.026153564453125,
-0.0021762847900390625,
0.0004916191101074219,
0.038330078125,
-0.01497650146484375,
-0.016021728515625,
0.0205078125,
-0.004535675048828125,
-0.059326171875,
0.004985809326171875,
-0.05548095703125,
-0.026641845703125,
0.0814208984375,
0.0269012451171875,
0.0016078948974609375,
-0.0140533447265625,
-0.0439453125,
-0.01192474365234375,
-0.035186767578125,
0.04150390625,
0.02569580078125,
0.01446533203125,
-0.0286407470703125,
0.036407470703125,
-0.04052734375,
0.0275421142578125,
0.01090240478515625,
0.0056304931640625,
0.038543701171875,
-0.02392578125,
-0.0214996337890625,
0.004016876220703125,
0.0849609375,
0.043426513671875,
0.0092315673828125,
-0.0099334716796875,
-0.01557159423828125,
0.0035343170166015625,
0.0018606185913085938,
-0.08648681640625,
-0.038116455078125,
0.049774169921875,
-0.02423095703125,
-0.0292205810546875,
0.0028400421142578125,
-0.04296875,
0.00189971923828125,
-0.0004444122314453125,
0.03369140625,
-0.03924560546875,
-0.0015039443969726562,
-0.01055145263671875,
-0.015655517578125,
0.039031982421875,
0.016876220703125,
-0.055267333984375,
-0.00989532470703125,
0.0281982421875,
0.07305908203125,
-0.0018644332885742188,
-0.0256195068359375,
-0.048187255859375,
-0.01374053955078125,
-0.0126190185546875,
0.05303955078125,
-0.005031585693359375,
-0.0241241455078125,
-0.00782012939453125,
0.021209716796875,
-0.0226898193359375,
-0.022918701171875,
0.043426513671875,
-0.02618408203125,
0.0110015869140625,
-0.0241546630859375,
-0.022918701171875,
-0.0015954971313476562,
0.015899658203125,
-0.0245208740234375,
0.093017578125,
0.0213165283203125,
-0.041534423828125,
0.03076171875,
-0.03997802734375,
-0.03009033203125,
-0.0037288665771484375,
0.0153656005859375,
-0.054443359375,
-0.02337646484375,
0.0244903564453125,
0.05035400390625,
-0.01690673828125,
-0.0013751983642578125,
-0.03741455078125,
-0.03472900390625,
0.0055999755859375,
0.01152801513671875,
0.0860595703125,
0.01263427734375,
-0.03851318359375,
0.016357421875,
-0.05230712890625,
0.043182373046875,
0.007537841796875,
-0.0236053466796875,
0.0199432373046875,
0.0006613731384277344,
-0.0009198188781738281,
0.033447265625,
0.027496337890625,
-0.02728271484375,
0.016937255859375,
-0.01654052734375,
0.06402587890625,
0.04559326171875,
-0.0245208740234375,
0.03173828125,
-0.03338623046875,
0.03369140625,
-0.00484466552734375,
0.020538330078125,
0.00592041015625,
-0.0439453125,
-0.044342041015625,
-0.035552978515625,
0.01849365234375,
0.052215576171875,
-0.0278472900390625,
0.044921875,
-0.00543212890625,
-0.06622314453125,
-0.042755126953125,
0.00585174560546875,
0.0255889892578125,
0.0241241455078125,
0.043853759765625,
0.00560760498046875,
-0.0574951171875,
-0.08160400390625,
-0.0039005279541015625,
-0.035064697265625,
-0.00007665157318115234,
0.03240966796875,
0.06207275390625,
-0.022705078125,
0.047821044921875,
-0.04107666015625,
-0.0193328857421875,
-0.024749755859375,
0.0014085769653320312,
0.0224609375,
0.049713134765625,
0.055450439453125,
-0.03558349609375,
-0.042144775390625,
-0.01544952392578125,
-0.06719970703125,
-0.00214385986328125,
0.0033435821533203125,
-0.0177459716796875,
0.01971435546875,
0.033111572265625,
-0.0662841796875,
0.0230560302734375,
0.036376953125,
-0.02593994140625,
0.0518798828125,
-0.0128631591796875,
0.0155181884765625,
-0.06964111328125,
0.01491546630859375,
0.00537109375,
-0.01038360595703125,
-0.0369873046875,
0.01032257080078125,
0.00516510009765625,
0.0024890899658203125,
-0.03887939453125,
0.0335693359375,
-0.019866943359375,
0.0167999267578125,
0.0013294219970703125,
0.005054473876953125,
-0.00884246826171875,
0.044921875,
-0.00977325439453125,
0.07720947265625,
0.033905029296875,
-0.040618896484375,
0.025848388671875,
0.019378662109375,
-0.01128387451171875,
0.006847381591796875,
-0.0789794921875,
0.004894256591796875,
-0.00970458984375,
0.0288543701171875,
-0.0809326171875,
-0.0239410400390625,
0.033599853515625,
-0.054840087890625,
-0.002017974853515625,
-0.026336669921875,
-0.026092529296875,
-0.032623291015625,
-0.037567138671875,
0.0252532958984375,
0.048675537109375,
-0.0227203369140625,
0.0175018310546875,
0.02972412109375,
0.01061248779296875,
-0.038665771484375,
-0.0285797119140625,
-0.035125732421875,
-0.0240020751953125,
-0.06463623046875,
0.03271484375,
-0.01143646240234375,
-0.0012235641479492188,
-0.0221099853515625,
-0.0159912109375,
-0.0281524658203125,
0.005084991455078125,
0.028961181640625,
0.0443115234375,
-0.0168304443359375,
0.0007677078247070312,
-0.00421142578125,
0.007236480712890625,
0.01152801513671875,
-0.0032024383544921875,
0.052154541015625,
-0.03875732421875,
0.032257080078125,
-0.037841796875,
0.0029621124267578125,
0.050750732421875,
-0.01873779296875,
0.0657958984375,
0.08447265625,
0.00017976760864257812,
-0.008514404296875,
-0.0226593017578125,
-0.0176239013671875,
-0.036651611328125,
0.032318115234375,
-0.0259857177734375,
-0.049713134765625,
0.061248779296875,
0.0254974365234375,
0.007328033447265625,
0.07470703125,
0.042022705078125,
-0.031768798828125,
0.09454345703125,
0.01666259765625,
-0.0019893646240234375,
0.02386474609375,
-0.06414794921875,
-0.01471710205078125,
-0.0654296875,
-0.043670654296875,
-0.040740966796875,
-0.036590576171875,
-0.061676025390625,
-0.02978515625,
0.02294921875,
0.0236663818359375,
-0.04656982421875,
0.051727294921875,
-0.053131103515625,
0.017730712890625,
0.0306549072265625,
0.027008056640625,
-0.0218353271484375,
-0.0147247314453125,
-0.003955841064453125,
-0.0002777576446533203,
-0.062255859375,
-0.01507568359375,
0.08349609375,
0.01495361328125,
0.0345458984375,
0.0057220458984375,
0.05029296875,
0.01044464111328125,
-0.0028171539306640625,
-0.05743408203125,
0.046173095703125,
0.004283905029296875,
-0.0679931640625,
-0.02459716796875,
-0.03643798828125,
-0.07196044921875,
0.016845703125,
-0.031219482421875,
-0.07049560546875,
-0.006450653076171875,
0.015777587890625,
-0.036102294921875,
0.004364013671875,
-0.06866455078125,
0.06597900390625,
-0.007328033447265625,
-0.0504150390625,
-0.0038547515869140625,
-0.06201171875,
0.0340576171875,
0.003284454345703125,
-0.007293701171875,
-0.030487060546875,
0.0024585723876953125,
0.0653076171875,
-0.0400390625,
0.0286102294921875,
-0.01345062255859375,
0.0196533203125,
0.041107177734375,
-0.00753021240234375,
0.047821044921875,
0.0229339599609375,
-0.019195556640625,
0.0170745849609375,
0.0230865478515625,
-0.05401611328125,
-0.032501220703125,
0.059600830078125,
-0.0755615234375,
-0.034820556640625,
-0.03887939453125,
-0.047454833984375,
-0.007457733154296875,
0.0084381103515625,
0.04498291015625,
0.0290679931640625,
0.00037741661071777344,
0.0234527587890625,
0.037109375,
-0.014404296875,
0.046630859375,
0.018798828125,
-0.00556182861328125,
-0.01337432861328125,
0.054046630859375,
-0.0006232261657714844,
0.0046844482421875,
0.0135650634765625,
0.005786895751953125,
-0.035675048828125,
-0.0291900634765625,
-0.046905517578125,
0.0159454345703125,
-0.01226806640625,
-0.0244140625,
-0.0384521484375,
-0.03131103515625,
-0.0305023193359375,
-0.0038700103759765625,
-0.04205322265625,
-0.04443359375,
-0.027862548828125,
0.0188446044921875,
0.0364990234375,
0.025909423828125,
0.00484466552734375,
0.03387451171875,
-0.055694580078125,
0.00799560546875,
0.00371551513671875,
0.012725830078125,
-0.020294189453125,
-0.043731689453125,
-0.0292816162109375,
0.0238037109375,
-0.039398193359375,
-0.0472412109375,
0.029876708984375,
0.006366729736328125,
0.03057861328125,
0.01617431640625,
-0.0002067089080810547,
0.057861328125,
-0.0272216796875,
0.06109619140625,
0.019989013671875,
-0.04730224609375,
0.042694091796875,
-0.047149658203125,
0.0298309326171875,
0.03851318359375,
0.030120849609375,
0.012481689453125,
-0.016448974609375,
-0.061431884765625,
-0.0660400390625,
0.06787109375,
0.040679931640625,
-0.0163726806640625,
0.0239105224609375,
0.01441192626953125,
-0.02099609375,
0.0160980224609375,
-0.0223236083984375,
-0.032501220703125,
-0.01520538330078125,
-0.001739501953125,
-0.026702880859375,
0.0097198486328125,
-0.003345489501953125,
-0.05047607421875,
0.059600830078125,
0.0013713836669921875,
0.046112060546875,
0.03350830078125,
-0.00568389892578125,
0.0045623779296875,
-0.005290985107421875,
0.060089111328125,
0.047882080078125,
-0.048248291015625,
-0.0311431884765625,
0.025360107421875,
-0.02923583984375,
0.01016998291015625,
0.01503753662109375,
-0.010498046875,
0.011627197265625,
0.0250701904296875,
0.052642822265625,
-0.0075531005859375,
-0.044830322265625,
0.0240478515625,
0.00566864013671875,
-0.044921875,
-0.0234222412109375,
0.0234375,
-0.00766754150390625,
0.036865234375,
0.0203704833984375,
0.0322265625,
0.004016876220703125,
-0.0457763671875,
0.0160980224609375,
0.044158935546875,
-0.042694091796875,
-0.0308380126953125,
0.0709228515625,
-0.00023746490478515625,
-0.0163116455078125,
0.039337158203125,
-0.00653839111328125,
-0.0634765625,
0.0675048828125,
0.0086517333984375,
0.052215576171875,
0.0008702278137207031,
0.025146484375,
0.0589599609375,
0.01297760009765625,
-0.01522064208984375,
0.021484375,
0.00946807861328125,
-0.04010009765625,
-0.015716552734375,
-0.03204345703125,
0.004558563232421875,
0.0196685791015625,
-0.052581787109375,
0.0170440673828125,
-0.03900146484375,
-0.038055419921875,
0.017547607421875,
0.02703857421875,
-0.07525634765625,
0.0286407470703125,
-0.0177459716796875,
0.06646728515625,
-0.04278564453125,
0.05364990234375,
0.0643310546875,
-0.0280609130859375,
-0.0732421875,
-0.006214141845703125,
-0.012725830078125,
-0.0703125,
0.050750732421875,
0.00843048095703125,
-0.002529144287109375,
0.006267547607421875,
-0.051055908203125,
-0.046630859375,
0.08551025390625,
0.0130462646484375,
-0.037353515625,
-0.0009551048278808594,
-0.0081329345703125,
0.04547119140625,
-0.01739501953125,
0.041656494140625,
0.05279541015625,
0.0215911865234375,
0.00763702392578125,
-0.059112548828125,
0.0035686492919921875,
-0.02203369140625,
-0.02313232421875,
0.0251312255859375,
-0.09759521484375,
0.07763671875,
-0.035064697265625,
0.01953125,
0.00833892822265625,
0.03497314453125,
0.031158447265625,
0.0225372314453125,
0.0196990966796875,
0.04327392578125,
0.0518798828125,
-0.0245361328125,
0.0638427734375,
-0.0222320556640625,
0.058135986328125,
0.053863525390625,
0.00804901123046875,
0.05950927734375,
0.029205322265625,
-0.0288238525390625,
0.03314208984375,
0.053741455078125,
-0.02203369140625,
0.0389404296875,
0.01123046875,
-0.009552001953125,
-0.03125,
0.022186279296875,
-0.040802001953125,
0.045074462890625,
-0.007659912109375,
-0.0302581787109375,
-0.004016876220703125,
-0.041351318359375,
0.005428314208984375,
-0.02691650390625,
-0.0215606689453125,
0.04058837890625,
-0.0160675048828125,
-0.0645751953125,
0.08349609375,
0.0014944076538085938,
0.042633056640625,
-0.036651611328125,
0.017059326171875,
-0.016448974609375,
0.0185699462890625,
-0.0169830322265625,
-0.058380126953125,
0.00618743896484375,
-0.0023040771484375,
-0.0252685546875,
0.00131988525390625,
0.04388427734375,
-0.0206756591796875,
-0.04144287109375,
0.008758544921875,
0.038665771484375,
0.016632080078125,
-0.002780914306640625,
-0.0758056640625,
0.01256561279296875,
0.01873779296875,
-0.01702880859375,
0.0267486572265625,
0.007419586181640625,
0.0159912109375,
0.06390380859375,
0.047882080078125,
-0.00559234619140625,
0.00958251953125,
-0.041015625,
0.057861328125,
-0.038604736328125,
-0.0241546630859375,
-0.07196044921875,
0.0513916015625,
-0.0110015869140625,
-0.060028076171875,
0.0472412109375,
0.07464599609375,
0.04705810546875,
-0.00984954833984375,
0.06072998046875,
-0.0239410400390625,
0.0160064697265625,
-0.02703857421875,
0.066650390625,
-0.059661865234375,
0.006626129150390625,
-0.01206207275390625,
-0.059326171875,
0.00530242919921875,
0.060821533203125,
-0.01311492919921875,
-0.00748443603515625,
0.037353515625,
0.06072998046875,
-0.005828857421875,
-0.003414154052734375,
-0.004261016845703125,
0.004077911376953125,
0.0158233642578125,
0.05364990234375,
0.03192138671875,
-0.07373046875,
0.06536865234375,
-0.02423095703125,
-0.01114654541015625,
-0.0245361328125,
-0.032379150390625,
-0.090576171875,
-0.044219970703125,
-0.0226898193359375,
-0.06451416015625,
0.0257415771484375,
0.0748291015625,
0.06280517578125,
-0.05517578125,
-0.017608642578125,
-0.00624847412109375,
-0.00275421142578125,
-0.019256591796875,
-0.019073486328125,
0.0330810546875,
-0.02752685546875,
-0.053497314453125,
0.01506805419921875,
-0.0180511474609375,
-0.0045623779296875,
0.01849365234375,
-0.016998291015625,
-0.0193939208984375,
-0.0170440673828125,
0.050872802734375,
0.026702880859375,
-0.035308837890625,
-0.023193359375,
0.023773193359375,
-0.00995635986328125,
0.019989013671875,
0.0280303955078125,
-0.06427001953125,
0.01174163818359375,
0.036773681640625,
0.0199432373046875,
0.06060791015625,
0.010772705078125,
0.040863037109375,
-0.0299224853515625,
0.021148681640625,
0.01198577880859375,
0.0179290771484375,
0.0023670196533203125,
-0.00975799560546875,
0.0435791015625,
0.0216217041015625,
-0.051055908203125,
-0.06414794921875,
-0.009918212890625,
-0.0877685546875,
-0.00588226318359375,
0.0908203125,
-0.0060577392578125,
-0.00997161865234375,
0.0195465087890625,
-0.0179595947265625,
0.04022216796875,
-0.053070068359375,
0.0615234375,
0.044708251953125,
-0.01800537109375,
0.00232696533203125,
-0.05596923828125,
0.05108642578125,
0.030914306640625,
-0.042144775390625,
-0.0236968994140625,
0.01081085205078125,
0.03253173828125,
-0.00873565673828125,
0.0279693603515625,
0.0166778564453125,
0.0235137939453125,
0.0009322166442871094,
-0.01436614990234375,
-0.007411956787109375,
-0.0106048583984375,
0.0135498046875,
0.0014371871948242188,
-0.042510986328125,
-0.0304107666015625
]
] |