| modelId | lastModified | tags | pipeline_tag | author | config | securityStatus | id | likes | downloads | library_name | created | card | card_len | embeddings |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Helsinki-NLP/opus-mt-vi-en | 2023-08-16T12:08:32.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"vi",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-vi-en | 7 | 12,859 | transformers | 2022-03-02T23:29:04 | ---
language:
- vi
- en
tags:
- translation
license: apache-2.0
---
### vie-eng
* source group: Vietnamese
* target group: English
* OPUS readme: [vie-eng](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/vie-eng/README.md)
* model: transformer-align
* source language(s): vie vie_Hani
* target language(s): eng
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus-2020-06-17.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/vie-eng/opus-2020-06-17.zip)
* test set translations: [opus-2020-06-17.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/vie-eng/opus-2020-06-17.test.txt)
* test set scores: [opus-2020-06-17.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/vie-eng/opus-2020-06-17.eval.txt)
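For a quick check of the released weights, here is a minimal usage sketch with the Marian classes in Hugging Face Transformers (not part of the original OPUS release; the input sentence is an arbitrary example):

```python
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-vi-en"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Translate a single Vietnamese sentence to English
batch = tokenizer(["Xin chào, bạn khỏe không?"], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```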
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba-test.vie.eng | 42.8 | 0.608 |
### System Info:
- hf_name: vie-eng
- source_languages: vie
- target_languages: eng
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/vie-eng/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['vi', 'en']
- src_constituents: {'vie', 'vie_Hani'}
- tgt_constituents: {'eng'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/vie-eng/opus-2020-06-17.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/vie-eng/opus-2020-06-17.test.txt
- src_alpha3: vie
- tgt_alpha3: eng
- short_pair: vi-en
- chrF2_score: 0.608
- bleu: 42.8
- brevity_penalty: 0.955
- ref_len: 20241.0
- src_name: Vietnamese
- tgt_name: English
- train_date: 2020-06-17
- src_alpha2: vi
- tgt_alpha2: en
- prefer_old: False
- long_pair: vie-eng
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 2,084 | [
[
-0.0241851806640625,
-0.048370361328125,
0.022216796875,
0.030029296875,
-0.0267181396484375,
-0.0177459716796875,
-0.021636962890625,
-0.0249176025390625,
0.0188140869140625,
0.0275726318359375,
-0.040283203125,
-0.05712890625,
-0.038116455078125,
0.021514892578125,
-0.003108978271484375,
0.06915283203125,
-0.0172882080078125,
0.01149749755859375,
0.032073974609375,
-0.037750244140625,
-0.034271240234375,
-0.0206756591796875,
-0.047332763671875,
-0.0139312744140625,
0.0300140380859375,
0.0228118896484375,
0.0276336669921875,
0.031341552734375,
0.04315185546875,
0.0213470458984375,
-0.020721435546875,
0.0187225341796875,
-0.021026611328125,
-0.01202392578125,
-0.002162933349609375,
-0.036163330078125,
-0.0487060546875,
-0.0158233642578125,
0.06365966796875,
0.033050537109375,
0.00852203369140625,
0.031646728515625,
-0.0039520263671875,
0.05853271484375,
-0.0174560546875,
0.01515960693359375,
-0.035858154296875,
-0.013885498046875,
-0.0272369384765625,
-0.0215606689453125,
-0.039703369140625,
-0.0196075439453125,
0.0219573974609375,
-0.040771484375,
0.004543304443359375,
-0.0008821487426757812,
0.1328125,
0.0052947998046875,
-0.031524658203125,
-0.0025081634521484375,
-0.025848388671875,
0.0628662109375,
-0.048583984375,
0.0330810546875,
0.0297393798828125,
0.00013494491577148438,
0.00580596923828125,
-0.0305633544921875,
-0.0245819091796875,
0.001644134521484375,
-0.0184326171875,
0.02593994140625,
-0.018280029296875,
-0.01318359375,
0.0080413818359375,
0.03656005859375,
-0.056549072265625,
0.0005364418029785156,
-0.034698486328125,
-0.0139312744140625,
0.04339599609375,
-0.002376556396484375,
0.02337646484375,
-0.043212890625,
-0.03338623046875,
-0.029083251953125,
-0.036590576171875,
0.018829345703125,
0.029571533203125,
0.0238037109375,
-0.04229736328125,
0.053070068359375,
-0.00832366943359375,
0.04583740234375,
0.004497528076171875,
-0.00510406494140625,
0.05047607421875,
-0.04949951171875,
-0.0175933837890625,
-0.01407623291015625,
0.0828857421875,
0.021728515625,
0.0133056640625,
0.016998291015625,
-0.0170745849609375,
-0.014373779296875,
-0.005859375,
-0.054229736328125,
0.0036563873291015625,
0.017547607421875,
-0.0304412841796875,
-0.0146331787109375,
0.006046295166015625,
-0.058013916015625,
0.015380859375,
-0.0003533363342285156,
0.042694091796875,
-0.058502197265625,
-0.014892578125,
0.030120849609375,
-0.0015172958374023438,
0.031280517578125,
0.0114898681640625,
-0.0295257568359375,
0.004673004150390625,
0.0188751220703125,
0.071533203125,
-0.0139312744140625,
-0.027191162109375,
-0.0261383056640625,
0.0033168792724609375,
-0.005817413330078125,
0.04559326171875,
-0.010894775390625,
-0.032958984375,
-0.000766754150390625,
0.034393310546875,
-0.017181396484375,
-0.0138397216796875,
0.0689697265625,
-0.022735595703125,
0.044403076171875,
-0.021087646484375,
-0.040679931640625,
-0.0312042236328125,
0.02081298828125,
-0.060028076171875,
0.0841064453125,
0.01544952392578125,
-0.0670166015625,
0.022125244140625,
-0.051300048828125,
-0.0166778564453125,
-0.00753021240234375,
0.00616455078125,
-0.057159423828125,
-0.0033245086669921875,
0.024658203125,
0.0289154052734375,
-0.0286712646484375,
0.035125732421875,
-0.00007349252700805664,
-0.01107025146484375,
-0.00565338134765625,
-0.030120849609375,
0.09820556640625,
0.010528564453125,
-0.0289459228515625,
0.0063323974609375,
-0.0555419921875,
-0.006954193115234375,
0.0261383056640625,
-0.0369873046875,
-0.0120697021484375,
-0.01029205322265625,
0.02642822265625,
0.00771331787109375,
0.0242156982421875,
-0.034820556640625,
0.021759033203125,
-0.04217529296875,
0.01357269287109375,
0.057708740234375,
0.006267547607421875,
0.01739501953125,
-0.0357666015625,
0.03668212890625,
0.0194091796875,
-0.001018524169921875,
-0.00008028745651245117,
-0.04522705078125,
-0.06475830078125,
-0.0254058837890625,
0.039459228515625,
0.06134033203125,
-0.058502197265625,
0.057586669921875,
-0.05926513671875,
-0.058380126953125,
-0.058685302734375,
-0.01258087158203125,
0.036376953125,
0.0282745361328125,
0.04071044921875,
-0.021759033203125,
-0.033477783203125,
-0.08172607421875,
-0.0175323486328125,
-0.021759033203125,
-0.0025310516357421875,
0.01739501953125,
0.055877685546875,
-0.004680633544921875,
0.046722412109375,
-0.03472900390625,
-0.04534912109375,
-0.01438140869140625,
0.0134735107421875,
0.0234222412109375,
0.048614501953125,
0.05645751953125,
-0.05841064453125,
-0.046356201171875,
0.0081787109375,
-0.047088623046875,
-0.01513671875,
0.0009813308715820312,
-0.0123291015625,
0.030853271484375,
0.0036144256591796875,
-0.0298004150390625,
0.0205230712890625,
0.042633056640625,
-0.0491943359375,
0.037139892578125,
-0.01210784912109375,
0.02777099609375,
-0.10845947265625,
0.01120758056640625,
0.005611419677734375,
-0.012908935546875,
-0.02642822265625,
-0.006298065185546875,
0.0052337646484375,
0.0079803466796875,
-0.042327880859375,
0.054412841796875,
-0.038421630859375,
0.0183868408203125,
0.0292510986328125,
0.01531219482421875,
0.006290435791015625,
0.05908203125,
-0.0013141632080078125,
0.0672607421875,
0.042724609375,
-0.0251617431640625,
0.002899169921875,
0.027679443359375,
-0.031341552734375,
0.03143310546875,
-0.054473876953125,
-0.0206451416015625,
0.01390838623046875,
0.00356292724609375,
-0.060546875,
-0.01100921630859375,
0.00888824462890625,
-0.048126220703125,
0.0225982666015625,
-0.010101318359375,
-0.0382080078125,
-0.0086822509765625,
-0.0301361083984375,
0.042327880859375,
0.029571533203125,
-0.01349639892578125,
0.052215576171875,
0.01910400390625,
0.0006475448608398438,
-0.042938232421875,
-0.06134033203125,
-0.004192352294921875,
-0.0215606689453125,
-0.0574951171875,
0.031646728515625,
-0.004909515380859375,
0.001506805419921875,
0.006656646728515625,
0.00101470947265625,
-0.0179290771484375,
-0.000370025634765625,
-0.00696563720703125,
0.02484130859375,
-0.0247039794921875,
0.009124755859375,
-0.0032196044921875,
-0.00699615478515625,
-0.02081298828125,
-0.0228271484375,
0.052642822265625,
-0.038360595703125,
-0.01486968994140625,
-0.0625,
0.011199951171875,
0.038543701171875,
-0.03790283203125,
0.06683349609375,
0.053375244140625,
-0.021270751953125,
0.0188751220703125,
-0.047637939453125,
0.01236724853515625,
-0.029632568359375,
0.028656005859375,
-0.049560546875,
-0.053497314453125,
0.05975341796875,
0.0214385986328125,
0.013397216796875,
0.07489013671875,
0.04736328125,
0.0087890625,
0.04815673828125,
0.0247802734375,
-0.005706787109375,
0.04248046875,
-0.049102783203125,
-0.0089111328125,
-0.056304931640625,
-0.0114898681640625,
-0.0548095703125,
-0.00850677490234375,
-0.076416015625,
-0.02691650390625,
0.0266876220703125,
-0.0067291259765625,
-0.01152801513671875,
0.05621337890625,
-0.03985595703125,
0.01611328125,
0.044036865234375,
0.0139007568359375,
0.027679443359375,
-0.0015439987182617188,
-0.02606201171875,
-0.01007080078125,
-0.032806396484375,
-0.047943115234375,
0.0867919921875,
0.02606201171875,
0.023681640625,
0.0248565673828125,
0.04656982421875,
0.00658416748046875,
-0.002803802490234375,
-0.048492431640625,
0.0400390625,
-0.01396942138671875,
-0.052886962890625,
-0.024566650390625,
-0.0274200439453125,
-0.077880859375,
0.01483917236328125,
-0.007114410400390625,
-0.054443359375,
0.012481689453125,
-0.01033782958984375,
-0.00396728515625,
0.043670654296875,
-0.058563232421875,
0.06793212890625,
0.0011577606201171875,
-0.0222930908203125,
0.0074310302734375,
-0.042816162109375,
0.0159454345703125,
-0.005161285400390625,
0.00888824462890625,
-0.0124359130859375,
-0.00838470458984375,
0.06463623046875,
-0.0186004638671875,
0.042510986328125,
-0.0014438629150390625,
-0.01885986328125,
0.020355224609375,
0.011810302734375,
0.035858154296875,
0.0026454925537109375,
-0.017791748046875,
0.034759521484375,
0.005645751953125,
-0.048431396484375,
-0.01284027099609375,
0.040557861328125,
-0.06304931640625,
-0.037933349609375,
-0.03948974609375,
-0.043426513671875,
0.0014858245849609375,
0.032135009765625,
0.044097900390625,
0.037841796875,
-0.009033203125,
0.040924072265625,
0.052032470703125,
-0.0238494873046875,
0.03436279296875,
0.037139892578125,
-0.0019702911376953125,
-0.04241943359375,
0.057891845703125,
0.0210113525390625,
0.0183868408203125,
0.050079345703125,
0.002101898193359375,
-0.01174163818359375,
-0.047607421875,
-0.038299560546875,
0.030517578125,
-0.025634765625,
-0.0179290771484375,
-0.045989990234375,
-0.0129547119140625,
-0.0274200439453125,
0.005367279052734375,
-0.02874755859375,
-0.0347900390625,
-0.0159149169921875,
-0.018798828125,
0.028076171875,
0.0206298828125,
0.00739288330078125,
0.019683837890625,
-0.0634765625,
0.0147705078125,
-0.003116607666015625,
0.029052734375,
-0.02349853515625,
-0.06036376953125,
-0.0250396728515625,
-0.0024814605712890625,
-0.0282745361328125,
-0.0772705078125,
0.037811279296875,
-0.00222015380859375,
0.0192108154296875,
0.01468658447265625,
0.0007376670837402344,
0.04388427734375,
-0.03814697265625,
0.078369140625,
-0.006412506103515625,
-0.0703125,
0.051422119140625,
-0.0377197265625,
0.0325927734375,
0.050140380859375,
0.0179901123046875,
-0.0273284912109375,
-0.0430908203125,
-0.05657958984375,
-0.0645751953125,
0.056640625,
0.048370361328125,
-0.00897216796875,
0.0014171600341796875,
0.00012814998626708984,
-0.0031490325927734375,
-0.0122528076171875,
-0.08734130859375,
-0.03033447265625,
0.007762908935546875,
-0.030517578125,
0.005313873291015625,
-0.0268096923828125,
-0.01329803466796875,
-0.0213623046875,
0.08148193359375,
0.01074981689453125,
0.0155029296875,
0.041534423828125,
-0.014007568359375,
0.0017290115356445312,
0.021636962890625,
0.054290771484375,
0.037200927734375,
-0.023773193359375,
-0.0146484375,
0.026123046875,
-0.032928466796875,
0.00797271728515625,
0.00882720947265625,
-0.030364990234375,
0.025909423828125,
0.046722412109375,
0.06683349609375,
0.00385284423828125,
-0.040191650390625,
0.0458984375,
-0.00815582275390625,
-0.02630615234375,
-0.032806396484375,
-0.0190582275390625,
0.004245758056640625,
0.00933837890625,
0.0172119140625,
-0.00461578369140625,
-0.002490997314453125,
-0.0073089599609375,
0.0090789794921875,
0.0115814208984375,
-0.0302734375,
-0.032135009765625,
0.044036865234375,
0.0097503662109375,
-0.0246734619140625,
0.0255584716796875,
-0.0195770263671875,
-0.0298004150390625,
0.042694091796875,
0.01337432861328125,
0.08770751953125,
-0.0166778564453125,
-0.004150390625,
0.05377197265625,
0.042510986328125,
0.0033016204833984375,
0.03179931640625,
0.0196990966796875,
-0.0458984375,
-0.0206298828125,
-0.056640625,
0.0173492431640625,
0.01495361328125,
-0.05877685546875,
0.03961181640625,
0.01251220703125,
-0.0192108154296875,
-0.007785797119140625,
0.022125244140625,
-0.055389404296875,
0.0043792724609375,
-0.025848388671875,
0.07513427734375,
-0.06915283203125,
0.06146240234375,
0.052215576171875,
-0.060546875,
-0.07373046875,
0.0008196830749511719,
-0.0142364501953125,
-0.051239013671875,
0.037017822265625,
0.004913330078125,
0.005950927734375,
-0.0045928955078125,
-0.0211639404296875,
-0.0587158203125,
0.084228515625,
0.03094482421875,
-0.0271148681640625,
-0.0170440673828125,
-0.0007719993591308594,
0.038970947265625,
-0.006011962890625,
0.01071929931640625,
0.034423828125,
0.054229736328125,
-0.0192718505859375,
-0.09130859375,
0.0026226043701171875,
-0.0364990234375,
-0.001445770263671875,
0.0184326171875,
-0.07098388671875,
0.058990478515625,
0.01023101806640625,
-0.01445770263671875,
0.0042724609375,
0.047454833984375,
0.0260162353515625,
0.0008311271667480469,
0.03875732421875,
0.069580078125,
0.033905029296875,
-0.035308837890625,
0.073486328125,
-0.02276611328125,
0.046600341796875,
0.07080078125,
0.0168304443359375,
0.0550537109375,
0.037322998046875,
-0.02008056640625,
0.04779052734375,
0.057342529296875,
-0.01180267333984375,
0.023773193359375,
-0.00188446044921875,
-0.0012531280517578125,
-0.010009765625,
-0.018280029296875,
-0.03948974609375,
0.046630859375,
0.005527496337890625,
-0.0168304443359375,
-0.0029201507568359375,
-0.01507568359375,
0.0308074951171875,
0.005035400390625,
-0.00312042236328125,
0.0478515625,
-0.007747650146484375,
-0.050933837890625,
0.057952880859375,
0.0014066696166992188,
0.043212890625,
-0.04730224609375,
0.00421905517578125,
-0.010772705078125,
0.0062408447265625,
0.00261688232421875,
-0.056365966796875,
0.018035888671875,
0.01287078857421875,
-0.0200653076171875,
-0.019317626953125,
0.0113372802734375,
-0.042633056640625,
-0.06158447265625,
0.03582763671875,
0.04443359375,
0.01110076904296875,
0.018463134765625,
-0.055389404296875,
-0.004100799560546875,
0.0177154541015625,
-0.050079345703125,
-0.005756378173828125,
0.056365966796875,
-0.000339508056640625,
0.046142578125,
0.0245819091796875,
0.0169525146484375,
0.002712249755859375,
0.0008745193481445312,
0.0457763671875,
-0.05279541015625,
-0.034912109375,
-0.060455322265625,
0.04791259765625,
-0.00867462158203125,
-0.04638671875,
0.047607421875,
0.061126708984375,
0.0753173828125,
-0.005138397216796875,
0.033111572265625,
-0.007198333740234375,
0.0270233154296875,
-0.050048828125,
0.053466796875,
-0.0780029296875,
0.008270263671875,
-0.0222015380859375,
-0.056304931640625,
-0.0190582275390625,
0.0234832763671875,
-0.01308441162109375,
-0.0016717910766601562,
0.07171630859375,
0.054656982421875,
0.004055023193359375,
-0.0293426513671875,
0.00005048513412475586,
0.0286712646484375,
0.0306243896484375,
0.0596923828125,
0.0126190185546875,
-0.0684814453125,
0.051910400390625,
-0.0211181640625,
-0.0026645660400390625,
-0.007659912109375,
-0.055999755859375,
-0.060546875,
-0.06298828125,
-0.0191192626953125,
-0.03546142578125,
-0.01297760009765625,
0.0677490234375,
0.0296478271484375,
-0.07476806640625,
-0.031005859375,
0.0013599395751953125,
0.0157318115234375,
-0.0245361328125,
-0.0224456787109375,
0.0665283203125,
-0.0116119384765625,
-0.0887451171875,
0.00963592529296875,
0.0113677978515625,
0.01331329345703125,
-0.0013647079467773438,
0.00020873546600341797,
-0.056915283203125,
0.00019562244415283203,
0.026336669921875,
0.00214385986328125,
-0.06610107421875,
-0.01486968994140625,
0.0096435546875,
-0.0218963623046875,
0.018310546875,
0.00939178466796875,
-0.022613525390625,
0.0312042236328125,
0.0626220703125,
0.032958984375,
0.037445068359375,
-0.00841522216796875,
0.034912109375,
-0.05694580078125,
0.034088134765625,
0.01849365234375,
0.05169677734375,
0.0161895751953125,
-0.01042938232421875,
0.06365966796875,
0.028350830078125,
-0.0301666259765625,
-0.07037353515625,
0.0005331039428710938,
-0.09698486328125,
-0.0016717910766601562,
0.075927734375,
-0.017425537109375,
-0.022796630859375,
0.0150604248046875,
-0.020965576171875,
0.03546142578125,
-0.035858154296875,
0.047515869140625,
0.07318115234375,
0.0251312255859375,
0.0157623291015625,
-0.0416259765625,
0.0254364013671875,
0.04443359375,
-0.055755615234375,
-0.015716552734375,
0.02008056640625,
0.0181884765625,
0.024139404296875,
0.051605224609375,
-0.0232696533203125,
0.01507568359375,
-0.0221099853515625,
0.0164642333984375,
0.00299072265625,
-0.00817108154296875,
-0.0247039794921875,
0.00662994384765625,
-0.0179290771484375,
-0.02545166015625
]
] |
sultan/BioM-ELECTRA-Large-SQuAD2 | 2021-08-06T22:27:10.000Z | [
"transformers",
"pytorch",
"electra",
"question-answering",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | question-answering | sultan | null | null | sultan/BioM-ELECTRA-Large-SQuAD2 | 10 | 12,818 | transformers | 2022-03-02T23:29:05 | # BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA
# Abstract
The impact of design choices on the performance of biomedical language models recently has been a subject for investigation. In this paper, we empirically study biomedical domain adaptation with large transformer models using different design choices. We evaluate the performance of our pretrained models against other existing biomedical language models in the literature. Our results show that we achieve state-of-the-art results on several biomedical domain tasks despite using similar or less computational cost compared to other models in the literature. Our findings highlight the significant effect of design choices on improving the performance of biomedical language models.
# Model Description
We fine-tuned BioM-ELECTRA-Large, which was pre-trained on PubMed Abstracts, on the SQuAD2.0 dataset. Fine-tuning the biomedical language model on the SQuAD dataset helps improve the score on the BioASQ challenge. If you plan to work with BioASQ or other biomedical QA tasks, it's better to use this model over BioM-ELECTRA-Large. This model (TensorFlow version) took the lead in the BioASQ9b-Factoid challenge (Batch 5) under the name UDEL-LAB2. To see the full details of the BioASQ9B results, please check this link: http://participants-area.bioasq.org/results/9b/phaseB/ (registration required).
The Hugging Face library doesn't implement the layer-wise decay feature, which affects performance on the SQuAD task. The result of BioM-ELECTRA-SQuAD reported in our paper is 88.3 (F1), since we use the ELECTRA open-source code with the TF checkpoint, which applies layer-wise decay.
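As an illustration (not part of the original card), the fine-tuned checkpoint can be used for extractive question answering through the standard `question-answering` pipeline; the question/context pair below is an arbitrary example:

```python
from transformers import pipeline

# Load the SQuAD2-fine-tuned BioM-ELECTRA checkpoint into a QA pipeline
qa = pipeline("question-answering", model="sultan/BioM-ELECTRA-Large-SQuAD2")

result = qa(
    question="Which gene is mutated in cystic fibrosis?",
    context=(
        "Cystic fibrosis is an autosomal recessive disorder caused by "
        "mutations in the CFTR gene, which encodes a chloride channel."
    ),
)
print(result["answer"], result["score"])
```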
Training Script
```bash
python run_qa.py --model_name_or_path sultan/BioM-ELECTRA-Large-Discriminator \
--dataset_name squad_v2 \
--do_train \
--do_eval \
--dataloader_num_workers 20 \
--preprocessing_num_workers 20 \
--version_2_with_negative \
--num_train_epochs 2 \
--learning_rate 5e-5 \
--max_seq_length 512 \
--doc_stride 128 \
--per_device_train_batch_size 8 \
--gradient_accumulation_steps 6 \
--per_device_eval_batch_size 128 \
--fp16 \
--fp16_opt_level O1 \
--logging_steps 50 \
--save_steps 1000 \
--overwrite_output_dir \
--output_dir out
```
Evaluation results on SQuAD2.0 Dev Dataset
```
exact = 84.33420365535248
f1 = 87.49354241889522
total = 11873
HasAns_exact = 80.43184885290148
HasAns_f1 = 86.75958656200127
HasAns_total = 5928
NoAns_exact = 88.22539949537426
NoAns_f1 = 88.22539949537426
NoAns_total = 5945
best_exact = 84.33420365535248
best_exact_thresh = 0.0
best_f1 = 87.49354241889522
best_f1_thresh = 0.0
epoch = 2.0
```
To reproduce results in Google Colab:
- Make sure you have GPU enabled.
- Clone and install the required libraries with:
```bash
!git clone https://github.com/huggingface/transformers
!pip3 install -e transformers
!pip3 install sentencepiece
!pip3 install -r /content/transformers/examples/pytorch/question-answering/requirements.txt
```
- Run this command:
```bash
python /content/transformers/examples/pytorch/question-answering/run_qa.py --model_name_or_path sultan/BioM-ELECTRA-Large-SQuAD2 \
--do_eval \
--version_2_with_negative \
--per_device_eval_batch_size 8 \
--dataset_name squad_v2 \
--overwrite_output_dir \
--fp16 \
--output_dir out
```
- You don't need to download the SQuAD2 dataset. The code will download it from the HuggingFace datasets hub.
- Check our GitHub repo at https://github.com/salrowili/BioM-Transformers for TensorFlow and GluonNLP checkpoints.
- We added examples for fine-tuning BioM-ELECTRA-Large on SQuAD and BioASQ7B using TensorFlow and TPU here: https://github.com/salrowili/BioM-Transformers/tree/main/examples . In these examples we show that we achieve an 88.22 score on SQuAD2.0, since the TensorFlow code has the layer-wise decay feature.
# Acknowledgment
We would like to acknowledge the support of the TensorFlow Research Cloud (TFRC) team in granting us access to TPUv3 units.
# Citation
```bibtex
@inproceedings{alrowili-shanker-2021-biom,
title = "{B}io{M}-Transformers: Building Large Biomedical Language Models with {BERT}, {ALBERT} and {ELECTRA}",
author = "Alrowili, Sultan and
Shanker, Vijay",
booktitle = "Proceedings of the 20th Workshop on Biomedical Language Processing",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2021.bionlp-1.24",
pages = "221--227",
abstract = "The impact of design choices on the performance of biomedical language models recently has been a subject for investigation. In this paper, we empirically study biomedical domain adaptation with large transformer models using different design choices. We evaluate the performance of our pretrained models against other existing biomedical language models in the literature. Our results show that we achieve state-of-the-art results on several biomedical domain tasks despite using similar or less computational cost compared to other models in the literature. Our findings highlight the significant effect of design choices on improving the performance of biomedical language models.",
}
``` | 5,144 | [
[
-0.02960205078125,
-0.057098388671875,
0.032073974609375,
0.011383056640625,
-0.0032806396484375,
0.0183563232421875,
-0.0178375244140625,
-0.048797607421875,
0.00876617431640625,
0.01471710205078125,
-0.037384033203125,
-0.0206298828125,
-0.040283203125,
0.01285552978515625,
-0.0025081634521484375,
0.08038330078125,
-0.022247314453125,
-0.005001068115234375,
-0.0299835205078125,
-0.029815673828125,
-0.033111572265625,
-0.028289794921875,
-0.0489501953125,
-0.0491943359375,
0.031951904296875,
0.0014028549194335938,
0.052093505859375,
0.0267181396484375,
0.046173095703125,
0.0362548828125,
-0.01727294921875,
0.00024235248565673828,
-0.04412841796875,
0.00897216796875,
-0.004840850830078125,
-0.035675048828125,
-0.047454833984375,
-0.0112762451171875,
0.0472412109375,
0.035675048828125,
0.0055084228515625,
0.00977325439453125,
-0.00623321533203125,
0.048309326171875,
-0.01464080810546875,
0.0223236083984375,
-0.024505615234375,
-0.0168609619140625,
0.00653076171875,
-0.007843017578125,
-0.04351806640625,
-0.01100921630859375,
0.0258636474609375,
-0.0286712646484375,
0.03179931640625,
0.00395965576171875,
0.08245849609375,
0.02593994140625,
-0.0017709732055664062,
0.0071258544921875,
-0.0528564453125,
0.06414794921875,
-0.049774169921875,
0.033721923828125,
0.00705718994140625,
0.01241302490234375,
-0.00797271728515625,
-0.0511474609375,
-0.024444580078125,
-0.00585174560546875,
-0.005336761474609375,
0.0308074951171875,
-0.0445556640625,
0.0017042160034179688,
0.022735595703125,
0.01551055908203125,
-0.048797607421875,
0.0215911865234375,
-0.05413818359375,
-0.0161590576171875,
0.0560302734375,
0.0032329559326171875,
0.0160980224609375,
-0.004833221435546875,
-0.02105712890625,
-0.045562744140625,
-0.048065185546875,
0.0168914794921875,
0.00838470458984375,
0.0118255615234375,
-0.032073974609375,
0.03271484375,
0.0025177001953125,
0.03265380859375,
0.012298583984375,
0.00423431396484375,
0.058258056640625,
-0.01457977294921875,
-0.02001953125,
-0.01125335693359375,
0.07440185546875,
0.0124969482421875,
0.0227813720703125,
-0.0274810791015625,
-0.00894927978515625,
-0.0009608268737792969,
0.036041259765625,
-0.1024169921875,
-0.023529052734375,
0.042694091796875,
-0.0257415771484375,
-0.017303466796875,
-0.0126800537109375,
-0.0616455078125,
-0.00563812255859375,
-0.00786590576171875,
0.053131103515625,
-0.053436279296875,
-0.0246124267578125,
0.01364898681640625,
-0.0211334228515625,
0.039703369140625,
0.01548004150390625,
-0.0537109375,
0.0041046142578125,
0.0245361328125,
0.06732177734375,
-0.017181396484375,
-0.0184783935546875,
-0.0218353271484375,
-0.01153564453125,
-0.00511932373046875,
0.0687255859375,
-0.0132904052734375,
-0.029052734375,
-0.0172576904296875,
0.023712158203125,
-0.025115966796875,
-0.0521240234375,
0.02178955078125,
-0.043487548828125,
0.00394439697265625,
-0.006542205810546875,
-0.037078857421875,
-0.006771087646484375,
-0.0294342041015625,
-0.03778076171875,
0.06964111328125,
0.033782958984375,
-0.03973388671875,
0.0203857421875,
-0.05584716796875,
-0.049102783203125,
0.00756072998046875,
0.0134124755859375,
-0.07037353515625,
0.0024929046630859375,
0.01207733154296875,
0.039154052734375,
-0.016326904296875,
0.0160675048828125,
-0.0289306640625,
-0.0268707275390625,
0.0291595458984375,
0.004619598388671875,
0.08251953125,
0.0021076202392578125,
-0.047454833984375,
0.0179443359375,
-0.036590576171875,
0.01258087158203125,
0.0168609619140625,
-0.0149383544921875,
0.009613037109375,
-0.006317138671875,
0.0089874267578125,
0.02911376953125,
-0.01629638671875,
-0.04229736328125,
-0.0004773139953613281,
-0.042877197265625,
0.06561279296875,
0.057220458984375,
-0.0178375244140625,
0.0325927734375,
-0.0447998046875,
0.0457763671875,
0.00592041015625,
-0.0009965896606445312,
0.0019407272338867188,
-0.0594482421875,
-0.055908203125,
-0.0333251953125,
0.019989013671875,
0.0390625,
-0.055908203125,
0.041412353515625,
-0.015289306640625,
-0.053497314453125,
-0.0579833984375,
0.0045013427734375,
0.03460693359375,
0.041839599609375,
0.05975341796875,
-0.003719329833984375,
-0.042694091796875,
-0.08221435546875,
-0.0021228790283203125,
-0.034820556640625,
-0.0215911865234375,
0.032470703125,
0.065185546875,
-0.04217529296875,
0.07177734375,
-0.034912109375,
-0.007259368896484375,
-0.032196044921875,
-0.00902557373046875,
0.02642822265625,
0.046356201171875,
0.0657958984375,
-0.03643798828125,
-0.01416778564453125,
-0.007183074951171875,
-0.050537109375,
-0.01204681396484375,
0.00405120849609375,
-0.0223236083984375,
0.026947021484375,
0.024078369140625,
-0.062164306640625,
0.0166473388671875,
0.0439453125,
-0.0276947021484375,
0.0308837890625,
-0.0218353271484375,
-0.0037441253662109375,
-0.0853271484375,
0.027923583984375,
0.01276397705078125,
-0.02197265625,
-0.049652099609375,
0.0036869049072265625,
0.0118560791015625,
0.00408935546875,
-0.03759765625,
0.047607421875,
-0.00789642333984375,
0.0174560546875,
-0.004154205322265625,
-0.00507354736328125,
0.00963592529296875,
0.04791259765625,
0.0018224716186523438,
0.07269287109375,
0.0255126953125,
-0.0418701171875,
0.0126190185546875,
0.043426513671875,
-0.018890380859375,
0.027191162109375,
-0.0870361328125,
0.0174560546875,
-0.0248870849609375,
0.028778076171875,
-0.07916259765625,
-0.01556396484375,
0.004730224609375,
-0.039276123046875,
0.04425048828125,
0.00812530517578125,
-0.0264129638671875,
-0.050750732421875,
-0.0261993408203125,
0.026824951171875,
0.0743408203125,
-0.037506103515625,
0.036285400390625,
0.03125,
-0.003993988037109375,
-0.0167083740234375,
-0.05755615234375,
-0.0219879150390625,
-0.02362060546875,
-0.0657958984375,
0.05731201171875,
-0.0240631103515625,
0.0009670257568359375,
-0.019775390625,
-0.00202178955078125,
0.004802703857421875,
0.00017905235290527344,
0.025421142578125,
0.041534423828125,
-0.022918701171875,
0.0106201171875,
0.00612640380859375,
-0.00933074951171875,
0.014251708984375,
0.001312255859375,
0.045135498046875,
-0.028106689453125,
0.01348114013671875,
-0.041412353515625,
0.0113677978515625,
0.05047607421875,
-0.0161285400390625,
0.06884765625,
0.0672607421875,
-0.00878143310546875,
-0.0072479248046875,
-0.04608154296875,
-0.01702880859375,
-0.042816162109375,
0.046356201171875,
-0.0159149169921875,
-0.0657958984375,
0.043914794921875,
0.0017175674438476562,
0.0093994140625,
0.062255859375,
0.061614990234375,
-0.0205230712890625,
0.07904052734375,
0.037445068359375,
0.00443267822265625,
0.034423828125,
-0.04620361328125,
0.0101470947265625,
-0.07037353515625,
-0.02874755859375,
-0.044036865234375,
-0.03778076171875,
-0.048736572265625,
-0.032073974609375,
0.0406494140625,
-0.005359649658203125,
-0.03277587890625,
0.04443359375,
-0.0382080078125,
0.005825042724609375,
0.0260162353515625,
0.03961181640625,
-0.014801025390625,
0.005893707275390625,
0.0006399154663085938,
0.00908660888671875,
-0.06707763671875,
-0.0181884765625,
0.07598876953125,
0.04095458984375,
0.051300048828125,
0.006740570068359375,
0.03662109375,
-0.0077972412109375,
0.0303497314453125,
-0.064208984375,
0.028472900390625,
-0.006267547607421875,
-0.043487548828125,
-0.0074005126953125,
-0.04425048828125,
-0.086669921875,
0.004852294921875,
-0.017303466796875,
-0.052947998046875,
0.02203369140625,
0.0026493072509765625,
-0.040191650390625,
0.01253509521484375,
-0.07403564453125,
0.0611572265625,
0.0010528564453125,
-0.04803466796875,
-0.007289886474609375,
-0.054168701171875,
0.0053253173828125,
-0.0118408203125,
-0.000743865966796875,
0.00577545166015625,
0.0165557861328125,
0.07470703125,
-0.0299835205078125,
0.051666259765625,
-0.00978851318359375,
0.010345458984375,
0.02459716796875,
-0.0238800048828125,
0.019775390625,
0.002254486083984375,
-0.0222015380859375,
0.026580810546875,
0.0093841552734375,
-0.041259765625,
-0.0162200927734375,
0.040496826171875,
-0.0848388671875,
-0.039886474609375,
-0.019805908203125,
-0.044708251953125,
-0.028228759765625,
0.00844573974609375,
0.036956787109375,
0.042327880859375,
-0.00847625732421875,
0.036529541015625,
0.06610107421875,
-0.054443359375,
0.03778076171875,
0.0230255126953125,
-0.005466461181640625,
-0.0241546630859375,
0.05279541015625,
-0.0036792755126953125,
0.03594970703125,
0.0219879150390625,
0.0167999267578125,
-0.029937744140625,
-0.0313720703125,
-0.0560302734375,
0.041168212890625,
-0.0240631103515625,
-0.046478271484375,
-0.06817626953125,
-0.03466796875,
-0.033782958984375,
-0.01187896728515625,
-0.0223541259765625,
-0.0181884765625,
-0.0214385986328125,
0.0046844482421875,
0.061553955078125,
0.04388427734375,
0.0128021240234375,
0.0229949951171875,
-0.046661376953125,
0.0257415771484375,
0.01546478271484375,
0.03228759765625,
-0.01214599609375,
-0.058868408203125,
-0.01287841796875,
0.0159759521484375,
-0.034820556640625,
-0.0594482421875,
0.0213775634765625,
0.025634765625,
0.035125732421875,
0.0017309188842773438,
-0.0167999267578125,
0.049560546875,
-0.04010009765625,
0.03448486328125,
0.001071929931640625,
-0.052093505859375,
0.034027099609375,
-0.0224456787109375,
0.03338623046875,
0.03839111328125,
0.018218994140625,
-0.004367828369140625,
-0.0292205810546875,
-0.060638427734375,
-0.0582275390625,
0.059478759765625,
0.032012939453125,
0.0084686279296875,
0.01282501220703125,
0.03271484375,
-0.005832672119140625,
0.004352569580078125,
-0.0221405029296875,
-0.024993896484375,
-0.00420379638671875,
-0.0068511962890625,
-0.01218414306640625,
0.002140045166015625,
0.0099639892578125,
-0.052032470703125,
0.053314208984375,
-0.001983642578125,
0.056915283203125,
0.0267486572265625,
-0.01629638671875,
-0.016387939453125,
0.00409698486328125,
0.040008544921875,
0.043212890625,
-0.02740478515625,
-0.0276336669921875,
0.019989013671875,
-0.041168212890625,
0.003795623779296875,
0.0222625732421875,
-0.00690460205078125,
0.007232666015625,
0.0325927734375,
0.0645751953125,
0.0140838623046875,
-0.045989990234375,
0.03619384765625,
-0.0035839080810546875,
-0.0301055908203125,
0.00594329833984375,
0.01128387451171875,
0.0153350830078125,
0.0236053466796875,
0.009429931640625,
0.0280914306640625,
-0.01015472412109375,
-0.036285400390625,
0.01971435546875,
0.03656005859375,
-0.01702880859375,
-0.0404052734375,
0.069091796875,
0.00902557373046875,
-0.019805908203125,
0.056793212890625,
0.0036640167236328125,
-0.049407958984375,
0.06402587890625,
0.058380126953125,
0.0611572265625,
-0.041412353515625,
0.0289306640625,
0.0440673828125,
0.0131378173828125,
-0.00424957275390625,
0.010406494140625,
0.02362060546875,
-0.05419921875,
-0.0426025390625,
-0.06744384765625,
-0.0185394287109375,
0.0142669677734375,
-0.039886474609375,
0.0239105224609375,
-0.02801513671875,
-0.0232696533203125,
0.0233001708984375,
-0.0030384063720703125,
-0.0845947265625,
0.0109710693359375,
-0.0093841552734375,
0.057098388671875,
-0.0555419921875,
0.049652099609375,
0.059967041015625,
-0.0267486572265625,
-0.05450439453125,
-0.035736083984375,
-0.022552490234375,
-0.07568359375,
0.029815673828125,
0.01544189453125,
0.002044677734375,
0.0030956268310546875,
-0.0245513916015625,
-0.055908203125,
0.07830810546875,
0.03497314453125,
-0.0309600830078125,
-0.01113128662109375,
0.01629638671875,
0.072509765625,
-0.009002685546875,
0.0372314453125,
0.059356689453125,
0.0296173095703125,
0.00867462158203125,
-0.06591796875,
0.015625,
-0.0266265869140625,
-0.007732391357421875,
0.01824951171875,
-0.07403564453125,
0.06097412109375,
-0.0333251953125,
-0.00403594970703125,
0.0104827880859375,
0.0343017578125,
0.0259552001953125,
0.002971649169921875,
0.02288818359375,
0.0294342041015625,
0.0482177734375,
-0.00988006591796875,
0.09356689453125,
-0.01287078857421875,
0.0406494140625,
0.0322265625,
-0.00246429443359375,
0.054779052734375,
0.0308074951171875,
-0.04937744140625,
0.04302978515625,
0.0289306640625,
-0.0098114013671875,
0.0187530517578125,
0.0213470458984375,
-0.0023403167724609375,
-0.01416778564453125,
0.007534027099609375,
-0.047821044921875,
0.035980224609375,
0.0206146240234375,
-0.03485107421875,
-0.005832672119140625,
-0.003719329833984375,
0.0132598876953125,
-0.027099609375,
-0.0009331703186035156,
0.04705810546875,
0.01497650146484375,
-0.04681396484375,
0.065673828125,
-0.01035308837890625,
0.047210693359375,
-0.034423828125,
0.00868988037109375,
-0.01374053955078125,
0.02606201171875,
-0.0222625732421875,
-0.049774169921875,
0.02130126953125,
-0.0091705322265625,
-0.01502227783203125,
-0.032440185546875,
0.05108642578125,
-0.017181396484375,
-0.044097900390625,
0.025238037109375,
0.04229736328125,
0.00977325439453125,
0.0020351409912109375,
-0.061126708984375,
0.013214111328125,
-0.007564544677734375,
-0.02606201171875,
0.034423828125,
0.0030841827392578125,
0.01617431640625,
0.043365478515625,
0.038330078125,
-0.0050506591796875,
-0.0128021240234375,
-0.007114410400390625,
0.0728759765625,
-0.035430908203125,
-0.012969970703125,
-0.06292724609375,
0.0362548828125,
0.001018524169921875,
-0.03619384765625,
0.03125,
0.043914794921875,
0.049652099609375,
-0.0137786865234375,
0.04315185546875,
-0.012176513671875,
0.0233154296875,
-0.030426025390625,
0.06805419921875,
-0.036590576171875,
0.0012865066528320312,
-0.03460693359375,
-0.08355712890625,
-0.0189208984375,
0.054840087890625,
-0.0194549560546875,
0.0309600830078125,
0.05657958984375,
0.056793212890625,
-0.002178192138671875,
-0.0011472702026367188,
0.0025272369384765625,
0.03802490234375,
0.037017822265625,
0.062744140625,
0.037017822265625,
-0.05810546875,
0.0343017578125,
-0.0249786376953125,
-0.0306396484375,
-0.0181732177734375,
-0.043304443359375,
-0.08233642578125,
-0.041168212890625,
-0.037109375,
-0.054351806640625,
0.001308441162109375,
0.06011962890625,
0.055419921875,
-0.072998046875,
-0.0038585662841796875,
0.004985809326171875,
-0.01056671142578125,
-0.016571044921875,
-0.016265869140625,
0.04388427734375,
-0.027099609375,
-0.0523681640625,
0.0242919921875,
0.0112457275390625,
-0.00048351287841796875,
0.0004565715789794922,
0.0018672943115234375,
-0.004039764404296875,
-0.00827789306640625,
0.0634765625,
0.024658203125,
-0.0372314453125,
-0.0361328125,
0.02081298828125,
-0.001819610595703125,
0.0300140380859375,
0.042236328125,
-0.07977294921875,
0.0204315185546875,
0.0435791015625,
0.060394287109375,
0.058258056640625,
-0.00850677490234375,
0.04705810546875,
-0.0284576416015625,
0.005710601806640625,
0.016204833984375,
0.030242919921875,
0.02362060546875,
-0.0172119140625,
0.0379638671875,
0.01087188720703125,
-0.06573486328125,
-0.0560302734375,
-0.0108795166015625,
-0.0882568359375,
-0.010986328125,
0.0980224609375,
0.005279541015625,
-0.026092529296875,
0.00206756591796875,
-0.01410675048828125,
0.0274200439453125,
-0.02960205078125,
0.056915283203125,
0.033782958984375,
-0.02020263671875,
0.01204681396484375,
-0.050323486328125,
0.03936767578125,
0.05419921875,
-0.06097412109375,
-0.0170135498046875,
0.01116943359375,
0.030120849609375,
0.00872802734375,
0.0303497314453125,
-0.01045989990234375,
0.0201568603515625,
-0.02349853515625,
-0.00592041015625,
0.003582000732421875,
-0.002964019775390625,
-0.0261383056640625,
-0.00833892822265625,
-0.02423095703125,
0.00914764404296875
]
] |
sentence-transformers/msmarco-distilbert-base-v4 | 2022-06-15T19:32:25.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"distilbert",
"feature-extraction",
"sentence-similarity",
"transformers",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/msmarco-distilbert-base-v4 | 3 | 12,808 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# sentence-transformers/msmarco-distilbert-base-v4
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/msmarco-distilbert-base-v4')
embeddings = model.encode(sentences)
print(embeddings)
```
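Because the model is intended for tasks like semantic search, here is a small illustrative sketch (not part of the original card) that ranks a toy corpus against a query by cosine similarity; it assumes a sentence-transformers version that provides `util.cos_sim`:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('sentence-transformers/msmarco-distilbert-base-v4')

query = "How do I install sentence-transformers?"
corpus = [
    "Run pip install -U sentence-transformers to install the library.",
    "MS MARCO is a collection of real Bing search queries and passages.",
]

# Encode the query and the corpus, then rank passages by cosine similarity
query_emb = model.encode(query, convert_to_tensor=True)
corpus_emb = model.encode(corpus, convert_to_tensor=True)
scores = util.cos_sim(query_emb, corpus_emb)[0]
best = int(scores.argmax())
print(corpus[best], float(scores[best]))
```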
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/msmarco-distilbert-base-v4')
model = AutoModel.from_pretrained('sentence-transformers/msmarco-distilbert-base-v4')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/msmarco-distilbert-base-v4)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DistilBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 3,714 | [
[
-0.02117919921875,
-0.054046630859375,
0.0218658447265625,
0.033660888671875,
-0.0208740234375,
-0.0262908935546875,
-0.0231475830078125,
-0.0011920928955078125,
0.010498046875,
0.0206146240234375,
-0.042022705078125,
-0.03411865234375,
-0.0611572265625,
0.0157318115234375,
-0.031280517578125,
0.06494140625,
-0.01410675048828125,
-0.003814697265625,
-0.0208892822265625,
-0.0147705078125,
-0.020111083984375,
-0.0303955078125,
-0.0228729248046875,
-0.0111846923828125,
0.0195770263671875,
0.01287841796875,
0.032867431640625,
0.0241851806640625,
0.023681640625,
0.03955078125,
-0.0083465576171875,
0.0165557861328125,
-0.0275421142578125,
-0.0022716522216796875,
0.00031566619873046875,
-0.0270843505859375,
-0.00632476806640625,
0.0179595947265625,
0.04083251953125,
0.035614013671875,
-0.01332855224609375,
0.0082550048828125,
0.00908660888671875,
0.02935791015625,
-0.036865234375,
0.0380859375,
-0.045989990234375,
0.01485443115234375,
0.003803253173828125,
-0.0014104843139648438,
-0.04595947265625,
-0.00855255126953125,
0.019012451171875,
-0.025238037109375,
0.0173492431640625,
0.0111541748046875,
0.08648681640625,
0.034698486328125,
-0.0217437744140625,
-0.031402587890625,
-0.0283355712890625,
0.062255859375,
-0.06683349609375,
0.00798797607421875,
0.021942138671875,
0.00348663330078125,
-0.0067291259765625,
-0.0791015625,
-0.06011962890625,
-0.01114654541015625,
-0.034759521484375,
0.017486572265625,
-0.031890869140625,
0.002471923828125,
0.0216217041015625,
0.0263824462890625,
-0.04705810546875,
-0.008880615234375,
-0.041351318359375,
-0.0124053955078125,
0.038726806640625,
0.006595611572265625,
0.0190277099609375,
-0.03497314453125,
-0.037841796875,
-0.022491455078125,
-0.00820159912109375,
-0.00301361083984375,
0.011260986328125,
0.0109710693359375,
-0.01776123046875,
0.06036376953125,
-0.004184722900390625,
0.04644775390625,
0.0023632049560546875,
0.011566162109375,
0.0531005859375,
-0.0284423828125,
-0.0220184326171875,
-0.000029921531677246094,
0.0849609375,
0.025970458984375,
0.0220184326171875,
0.0027065277099609375,
-0.01342010498046875,
0.006877899169921875,
0.020721435546875,
-0.06622314453125,
-0.026885986328125,
0.01502227783203125,
-0.0276641845703125,
-0.028350830078125,
0.017303466796875,
-0.04931640625,
-0.001911163330078125,
0.0022754669189453125,
0.051971435546875,
-0.0423583984375,
0.0066680908203125,
0.02569580078125,
-0.022186279296875,
0.00847625732421875,
-0.0254974365234375,
-0.0606689453125,
0.01470184326171875,
0.01129913330078125,
0.08026123046875,
0.00585174560546875,
-0.041534423828125,
-0.02008056640625,
-0.01224517822265625,
0.00847625732421875,
0.051055908203125,
-0.0218963623046875,
-0.0067138671875,
0.01346588134765625,
0.0233306884765625,
-0.04229736328125,
-0.023895263671875,
0.045806884765625,
-0.0196990966796875,
0.0531005859375,
0.00885772705078125,
-0.05908203125,
-0.01129150390625,
0.0085601806640625,
-0.04425048828125,
0.08392333984375,
0.0214691162109375,
-0.07537841796875,
0.0008687973022460938,
-0.057464599609375,
-0.0269927978515625,
-0.015380859375,
0.0008292198181152344,
-0.05169677734375,
0.01259613037109375,
0.033355712890625,
0.058746337890625,
0.00827789306640625,
0.021728515625,
-0.0171966552734375,
-0.031341552734375,
0.03302001953125,
-0.0287933349609375,
0.0848388671875,
0.017578125,
-0.032379150390625,
0.00878143310546875,
-0.032684326171875,
-0.0175628662109375,
0.0279388427734375,
-0.01058197021484375,
-0.016265869140625,
-0.005603790283203125,
0.01380157470703125,
0.02679443359375,
0.022216796875,
-0.045013427734375,
0.0195770263671875,
-0.036285400390625,
0.073974609375,
0.04803466796875,
-0.0022869110107421875,
0.044036865234375,
-0.0185699462890625,
0.0168304443359375,
0.0297088623046875,
0.006031036376953125,
-0.0232696533203125,
-0.0278472900390625,
-0.07421875,
-0.029541015625,
0.0236053466796875,
0.034393310546875,
-0.0623779296875,
0.077880859375,
-0.03106689453125,
-0.03607177734375,
-0.05535888671875,
-0.00905609130859375,
0.003208160400390625,
0.03973388671875,
0.04693603515625,
-0.0018224716186523438,
-0.047119140625,
-0.072998046875,
-0.0001595020294189453,
0.005290985107421875,
0.00988006591796875,
0.005031585693359375,
0.055023193359375,
-0.0318603515625,
0.0733642578125,
-0.055938720703125,
-0.03839111328125,
-0.03033447265625,
0.01558685302734375,
0.02740478515625,
0.035491943359375,
0.046844482421875,
-0.052764892578125,
-0.034210205078125,
-0.0443115234375,
-0.053619384765625,
-0.002536773681640625,
-0.01450347900390625,
-0.01244354248046875,
0.01200103759765625,
0.03936767578125,
-0.05889892578125,
0.0271759033203125,
0.0509033203125,
-0.04046630859375,
0.029632568359375,
-0.024383544921875,
-0.00911712646484375,
-0.1085205078125,
0.00015985965728759766,
0.0045928955078125,
-0.0179443359375,
-0.034881591796875,
-0.005100250244140625,
0.0098876953125,
0.0008091926574707031,
-0.033294677734375,
0.02685546875,
-0.038330078125,
0.013397216796875,
-0.00305938720703125,
0.0303955078125,
0.00884246826171875,
0.05889892578125,
-0.01110076904296875,
0.05230712890625,
0.045684814453125,
-0.03955078125,
0.0259552001953125,
0.045684814453125,
-0.037841796875,
0.0160369873046875,
-0.06634521484375,
-0.003940582275390625,
-0.008514404296875,
0.028106689453125,
-0.09063720703125,
0.0030803680419921875,
0.0177764892578125,
-0.03826904296875,
0.0121002197265625,
0.0138397216796875,
-0.054779052734375,
-0.05145263671875,
-0.0322265625,
0.00676727294921875,
0.039215087890625,
-0.04058837890625,
0.039215087890625,
0.024658203125,
-0.01038360595703125,
-0.04638671875,
-0.08282470703125,
-0.004032135009765625,
-0.017303466796875,
-0.049163818359375,
0.042999267578125,
-0.00977325439453125,
0.0156402587890625,
0.0161285400390625,
0.016357421875,
0.002178192138671875,
-0.0006361007690429688,
0.004650115966796875,
0.0248260498046875,
-0.0005450248718261719,
0.01593017578125,
0.0236358642578125,
-0.01186370849609375,
0.002407073974609375,
-0.0181427001953125,
0.061309814453125,
-0.0205841064453125,
-0.00966644287109375,
-0.0290069580078125,
0.00946044921875,
0.03118896484375,
-0.0262451171875,
0.08343505859375,
0.07183837890625,
-0.0305328369140625,
-0.00823211669921875,
-0.03546142578125,
-0.022186279296875,
-0.038330078125,
0.04931640625,
-0.021209716796875,
-0.072998046875,
0.0269927978515625,
0.0116119384765625,
0.00783538818359375,
0.055145263671875,
0.0504150390625,
-0.0146942138671875,
0.0615234375,
0.038421630859375,
-0.0194244384765625,
0.042572021484375,
-0.050628662109375,
0.02117919921875,
-0.06524658203125,
-0.004558563232421875,
-0.0250396728515625,
-0.0304107666015625,
-0.048797607421875,
-0.03399658203125,
0.01171875,
-0.00305938720703125,
-0.015899658203125,
0.052001953125,
-0.05291748046875,
0.015594482421875,
0.037872314453125,
0.003955841064453125,
0.0024471282958984375,
0.009307861328125,
-0.030426025390625,
-0.0032444000244140625,
-0.05389404296875,
-0.043182373046875,
0.058624267578125,
0.03179931640625,
0.034576416015625,
-0.00928497314453125,
0.052764892578125,
0.004390716552734375,
0.0007538795471191406,
-0.049713134765625,
0.0347900390625,
-0.0224151611328125,
-0.032470703125,
-0.024200439453125,
-0.03118896484375,
-0.06976318359375,
0.039154052734375,
-0.01348876953125,
-0.05291748046875,
0.00843048095703125,
-0.01568603515625,
-0.0208282470703125,
0.017059326171875,
-0.061920166015625,
0.07916259765625,
0.0004940032958984375,
-0.004199981689453125,
-0.0037441253662109375,
-0.050079345703125,
0.00945281982421875,
0.020416259765625,
0.00907135009765625,
-0.007110595703125,
-0.0016689300537109375,
0.057647705078125,
-0.0225677490234375,
0.0660400390625,
-0.0144500732421875,
0.015533447265625,
0.0286407470703125,
-0.0234375,
0.0216827392578125,
-0.006542205810546875,
-0.0036449432373046875,
0.01358795166015625,
-0.0040130615234375,
-0.02874755859375,
-0.0362548828125,
0.05426025390625,
-0.06561279296875,
-0.020904541015625,
-0.04644775390625,
-0.043731689453125,
-0.0004711151123046875,
0.0166168212890625,
0.033721923828125,
0.030853271484375,
-0.010345458984375,
0.0283203125,
0.036529541015625,
-0.019500732421875,
0.056427001953125,
0.01311492919921875,
-0.0029506683349609375,
-0.032440185546875,
0.03961181640625,
0.005702972412109375,
-0.0009541511535644531,
0.0298309326171875,
0.0144500732421875,
-0.03839111328125,
-0.0174102783203125,
-0.0301971435546875,
0.03228759765625,
-0.03997802734375,
-0.018707275390625,
-0.0770263671875,
-0.03857421875,
-0.045684814453125,
0.003021240234375,
-0.0201568603515625,
-0.03155517578125,
-0.038787841796875,
-0.02935791015625,
0.0292510986328125,
0.03375244140625,
0.0031986236572265625,
0.0262908935546875,
-0.04931640625,
0.01033782958984375,
0.008880615234375,
0.00458526611328125,
-0.00815582275390625,
-0.06268310546875,
-0.023468017578125,
0.007843017578125,
-0.03173828125,
-0.06707763671875,
0.04644775390625,
0.016632080078125,
0.04193115234375,
0.017181396484375,
0.00982666015625,
0.05322265625,
-0.050018310546875,
0.058563232421875,
0.009246826171875,
-0.07525634765625,
0.034088134765625,
0.0018930435180664062,
0.0286407470703125,
0.047027587890625,
0.030181884765625,
-0.03485107421875,
-0.02880859375,
-0.052398681640625,
-0.0751953125,
0.05230712890625,
0.0419921875,
0.039794921875,
-0.024017333984375,
0.0166778564453125,
-0.01995849609375,
0.0181427001953125,
-0.0770263671875,
-0.035125732421875,
-0.0279388427734375,
-0.045562744140625,
-0.029388427734375,
-0.0220184326171875,
0.00901031494140625,
-0.027618408203125,
0.050018310546875,
0.004116058349609375,
0.05029296875,
0.0274810791015625,
-0.03961181640625,
0.0273590087890625,
0.01036834716796875,
0.046112060546875,
0.0123748779296875,
-0.0033740997314453125,
0.0194244384765625,
0.01812744140625,
-0.0275726318359375,
0.002124786376953125,
0.04254150390625,
-0.0005445480346679688,
0.0227508544921875,
0.0301971435546875,
0.07525634765625,
0.0306854248046875,
-0.03216552734375,
0.0631103515625,
-0.0102996826171875,
-0.0206451416015625,
-0.039764404296875,
-0.00969696044921875,
0.02392578125,
0.022857666015625,
0.0221405029296875,
0.0038623809814453125,
0.00605010986328125,
-0.0294036865234375,
0.0253143310546875,
0.01068878173828125,
-0.03399658203125,
-0.00566864013671875,
0.048187255859375,
0.0033092498779296875,
-0.0053253173828125,
0.06927490234375,
-0.023284912109375,
-0.054351806640625,
0.033660888671875,
0.04437255859375,
0.06634521484375,
-0.00461578369140625,
0.0209197998046875,
0.042694091796875,
0.028106689453125,
-0.01184844970703125,
-0.0023670196533203125,
0.006267547607421875,
-0.06671142578125,
-0.0155029296875,
-0.047210693359375,
0.0184326171875,
-0.004657745361328125,
-0.047088623046875,
0.024017333984375,
-0.0040283203125,
-0.00916290283203125,
-0.0116119384765625,
-0.0011167526245117188,
-0.053436279296875,
-0.00597381591796875,
0.0033321380615234375,
0.06353759765625,
-0.0772705078125,
0.061309814453125,
0.04693603515625,
-0.060455322265625,
-0.053070068359375,
-0.0140380859375,
-0.0254058837890625,
-0.0548095703125,
0.0265045166015625,
0.041015625,
0.0135955810546875,
0.01316070556640625,
-0.03680419921875,
-0.054443359375,
0.11224365234375,
0.021759033203125,
-0.040283203125,
-0.0146026611328125,
0.0089263916015625,
0.043304443359375,
-0.03546142578125,
0.03314208984375,
0.03204345703125,
0.0259857177734375,
-0.0034637451171875,
-0.052093505859375,
0.0137786865234375,
-0.0223846435546875,
0.0166778564453125,
-0.00983428955078125,
-0.043426513671875,
0.07635498046875,
-0.00360107421875,
-0.0164947509765625,
0.00994110107421875,
0.0657958984375,
0.025421142578125,
-0.0035724639892578125,
0.039703369140625,
0.06048583984375,
0.04803466796875,
-0.01279449462890625,
0.07122802734375,
-0.0213623046875,
0.0623779296875,
0.07818603515625,
0.0015802383422851562,
0.07244873046875,
0.03997802734375,
-0.01177215576171875,
0.06396484375,
0.039764404296875,
-0.0243072509765625,
0.0533447265625,
0.019989013671875,
-0.0009760856628417969,
0.0108642578125,
0.0162811279296875,
-0.0185089111328125,
0.03839111328125,
0.0135345458984375,
-0.0557861328125,
-0.002315521240234375,
0.01062774658203125,
0.01116943359375,
0.003925323486328125,
0.00833892822265625,
0.044281005859375,
0.0159759521484375,
-0.035125732421875,
0.03436279296875,
0.0148773193359375,
0.0753173828125,
-0.0338134765625,
0.011016845703125,
-0.0104827880859375,
0.0276031494140625,
-0.00525665283203125,
-0.045562744140625,
0.0277557373046875,
-0.007083892822265625,
-0.0060882568359375,
-0.017669677734375,
0.043060302734375,
-0.051544189453125,
-0.0489501953125,
0.02410888671875,
0.0307159423828125,
0.0066680908203125,
-0.0033550262451171875,
-0.07611083984375,
0.00049591064453125,
0.005214691162109375,
-0.036834716796875,
0.0126495361328125,
0.02911376953125,
0.027374267578125,
0.036651611328125,
0.031768798828125,
-0.0135345458984375,
0.01149749755859375,
0.0123443603515625,
0.0670166015625,
-0.04144287109375,
-0.038177490234375,
-0.076171875,
0.059326171875,
-0.026947021484375,
-0.02484130859375,
0.054534912109375,
0.04388427734375,
0.06787109375,
-0.0242767333984375,
0.04290771484375,
-0.018035888671875,
0.0092010498046875,
-0.0379638671875,
0.067626953125,
-0.032318115234375,
-0.005390167236328125,
-0.02093505859375,
-0.07562255859375,
-0.0162811279296875,
0.08929443359375,
-0.022308349609375,
0.00893402099609375,
0.07318115234375,
0.0626220703125,
-0.01131439208984375,
-0.011199951171875,
0.0107269287109375,
0.035552978515625,
0.01552581787109375,
0.035614013671875,
0.0396728515625,
-0.06378173828125,
0.0487060546875,
-0.042755126953125,
-0.01236724853515625,
-0.0120086669921875,
-0.06072998046875,
-0.075927734375,
-0.07269287109375,
-0.0262908935546875,
-0.023590087890625,
-0.015045166015625,
0.07037353515625,
0.04443359375,
-0.0517578125,
-0.00469970703125,
-0.01409149169921875,
-0.018768310546875,
-0.00955963134765625,
-0.0236358642578125,
0.034271240234375,
-0.0416259765625,
-0.06884765625,
0.010955810546875,
-0.0008816719055175781,
0.002361297607421875,
-0.0297088623046875,
0.00782012939453125,
-0.046112060546875,
0.00418853759765625,
0.052459716796875,
-0.0239715576171875,
-0.0494384765625,
-0.01421356201171875,
0.003326416015625,
-0.034942626953125,
-0.004878997802734375,
0.0300140380859375,
-0.045806884765625,
0.0230560302734375,
0.031768798828125,
0.03607177734375,
0.060394287109375,
-0.017242431640625,
0.0278472900390625,
-0.06195068359375,
0.0308074951171875,
0.006778717041015625,
0.062286376953125,
0.0284423828125,
-0.01776123046875,
0.04278564453125,
0.012664794921875,
-0.035797119140625,
-0.04901123046875,
-0.00832366943359375,
-0.07952880859375,
-0.021728515625,
0.0848388671875,
-0.029083251953125,
-0.0259552001953125,
0.0195465087890625,
-0.025543212890625,
0.036285400390625,
-0.0227813720703125,
0.058685302734375,
0.0657958984375,
0.00612640380859375,
-0.0155487060546875,
-0.0269927978515625,
0.014434814453125,
0.028167724609375,
-0.044586181640625,
-0.0180511474609375,
0.019775390625,
0.023345947265625,
0.0195159912109375,
0.034210205078125,
-0.01366424560546875,
-0.0090789794921875,
0.00200653076171875,
0.01259613037109375,
-0.019744873046875,
-0.0030460357666015625,
-0.03228759765625,
-0.00026345252990722656,
-0.02777099609375,
-0.032501220703125
]
] |
OpenAssistant/codellama-13b-oasst-sft-v10 | 2023-08-29T19:16:10.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"custom_code",
"en",
"dataset:OpenAssistant/oasst1",
"dataset:shahules786/orca-best",
"license:llama2",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | OpenAssistant | null | null | OpenAssistant/codellama-13b-oasst-sft-v10 | 49 | 12,805 | transformers | 2023-08-26T10:13:29 | ---
license: llama2
datasets:
- OpenAssistant/oasst1
- shahules786/orca-best
language:
- en
---
# Open-Assistant CodeLlama 13B SFT v10
This model is an Open-Assistant fine-tuning of Meta's CodeLlama 13B LLM.
**Note**: Due to the new RoPE Theta value (1e6 instead of 1e4), for correct results you must load this model with `trust_remote_code=True` or use the latest main branch of Huggingface transformers (until version 4.33 is released).
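A minimal loading sketch illustrating the note above (assumes a recent `transformers` release, the `accelerate` package for `device_map="auto"`, and enough GPU memory for a 13B model in bf16):
```python
# Minimal loading sketch -- trust_remote_code=True is needed for the custom RoPE theta
# until a transformers release (>= 4.33) that supports it natively.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "OpenAssistant/codellama-13b-oasst-sft-v10"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,   # assumption: bf16 to fit the 13B weights on a single GPU
    device_map="auto",
    trust_remote_code=True,
)
```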
## Model Details
- **Finetuned from:** [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) via [epfLLM/Megatron-LLM](https://github.com/epfLLM/Megatron-LLM)
- **Model type:** Causal decoder-only transformer language model
- **Language:** English
- **Weights & Biases training logs:** 6123 steps, BS 64 [run56_oa_llamacode](https://wandb.ai/open-assistant/public-sft/runs/run56_oa_llamacode)
- **Demo:** [Continuations for 250 random prompts (without system message)](https://open-assistant.github.io/oasst-model-eval/?f=https%3A%2F%2Fraw.githubusercontent.com%2FOpen-Assistant%2Foasst-model-eval%2Fmain%2Fsampling_reports%2Foasst-sft%2F2023-08-26_OpenAssistant_codellama-13b-oasst-sft-v10_sampling_noprefix2.json)
- **License:** [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://huggingface.co/meta-llama/Llama-2-70b/raw/main/LICENSE.txt)
- **Contact:** [Open-Assistant Discord](https://ykilcher.com/open-assistant-discord)
## Prompting / Prompt Template
Due to public demand (see [survey](https://twitter.com/erhartford/status/1682403597525430272)), we changed the prompt template for this model from custom prompter/assistant tokens to OpenAI's [chatml](https://github.com/openai/openai-python/blob/main/chatml.md) standard prompt format.
We hope that this leads to greater compatibility with chat inference/frontend applications.
Prompt dialogue template:
```
"""
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
"""
```
The model input can contain multiple conversation turns between user and assistant, e.g.
```
<|im_start|>user
{prompt 1}<|im_end|>
<|im_start|>assistant
{reply 1}<|im_end|>
<|im_start|>user
{prompt 2}<|im_end|>
<|im_start|>assistant
(...)
```
The model was partly trained with orca system messages.
For inference we recommend using the official [Llama2 system message](https://github.com/facebookresearch/llama/blob/ea9f33d6d3ea8ed7d560d270986407fd6c2e52b7/example_chat_completion.py#L57-L61):
```
<|im_start|>system
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.
If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<|im_end|>
```
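As an illustration, here is a hedged end-to-end sketch that assembles a chatml prompt with a system message and generates one reply (it assumes the model and tokenizer were loaded as in the snippet near the top of this card; the example prompt and sampling settings are placeholders, not tuned recommendations):
```python
# Sketch: build a chatml prompt and generate one assistant turn.
system_message = (
    "You are a helpful, respectful and honest assistant. "
    "Always answer as helpfully as possible, while being safe."
)
user_prompt = "Write a Python function that checks whether a string is a palindrome."

prompt = (
    f"<|im_start|>system\n{system_message}<|im_end|>\n"
    f"<|im_start|>user\n{user_prompt}<|im_end|>\n"
    f"<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```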
### Credits & Special Thanks
- Thanks to [Meta AI](https://ai.meta.com/) for training and releasing the CodeLlama model.
- Distributed training support was provided by EPFL's [Machine Learning and Optimization Laboratory](https://www.epfl.ch/labs/mlo/), and [Natural Language Processing Lab](https://nlp.epfl.ch/).
- The open-source [epfLLM/Megatron-LLM](https://github.com/epfLLM/Megatron-LLM) trainer was used for fine-tuning.
- [rombodawg](https://huggingface.co/rombodawg) curated the [LosslessMegaCodeTrainingV2_1m_Evol_Uncensored](https://huggingface.co/datasets/rombodawg/LosslessMegaCodeTrainingV2_1m_Evol_Uncensored) dataset.
- [ehartford](https://huggingface.co/ehartford) generated and published the [ehartford/dolphin](https://huggingface.co/datasets/ehartford/dolphin) dataset.
- [shahules786](https://github.com/shahules786) de-duplicated and filtered the Dolphin and Megacode datasets with a clustering/centroid approach and generated orca-best & bestofmegacode.
- [andreaskoepf](https://github.com/andreaskoepf/) prepared & orchestrated the training.
## Ethical Considerations and Limitations
Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios.
For these reasons, as with all LLMs, the potential outputs of codellama-13b-oasst-sft-v10 cannot be predicted
in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses
to user prompts. Therefore, before deploying any applications of codellama-13b-oasst-sft-v10, developers should
perform safety testing and tuning tailored to their specific applications of the model.
Please see Meta's [Responsible Use Guide](https://ai.meta.com/llama/responsible-use-guide/).
## Configuration Details
The "pretokenizer" utility used to tokenize the datamix is part of the Open-Assistant github repository and can be found here: [model/pretokenizer](https://github.com/LAION-AI/Open-Assistant/tree/main/model/pretokenizer).
### Pretokenizer Configuration
```
orca_megacode_oasst_best:
datasets:
- orca-chat:
val_split: 0.01
max_val_set: 1000
- bestofmegacode:
val_split: 0.01
max_val_set: 1000
- oasst_export:
lang: "bg,ca,cs,da,de,en,es,fr,hr,hu,it,nl,pl,pt,ro,ru,sl,sr,sv,uk"
#hf_dataset_name: OpenAssistant/oasst1
input_file_path: 2023-08-25_oasst_ready.jsonl.gz
top_k: 1
val_split: 0.025
output_dir: "output/orca_megacode_oasst_best"
filename_prefix: "orca_megacode_oasst_best"
min_assistant_tokens: 1
```
| 5,540 | [
[
-0.04351806640625,
-0.06573486328125,
0.021331787109375,
0.0269622802734375,
-0.01485443115234375,
-0.020050048828125,
-0.00848388671875,
-0.034149169921875,
0.0289306640625,
0.0303192138671875,
-0.04931640625,
-0.042205810546875,
-0.04583740234375,
0.0236663818359375,
-0.00836181640625,
0.0799560546875,
-0.006580352783203125,
-0.005828857421875,
-0.0027637481689453125,
-0.0185394287109375,
-0.0310211181640625,
-0.041107177734375,
-0.0634765625,
-0.01142120361328125,
0.0272369384765625,
0.0205841064453125,
0.05242919921875,
0.04931640625,
0.0285797119140625,
0.023284912109375,
-0.02178955078125,
0.0294036865234375,
-0.03955078125,
-0.0220489501953125,
-0.0017557144165039062,
-0.027618408203125,
-0.067138671875,
0.00547027587890625,
0.0242156982421875,
0.0310516357421875,
-0.007122039794921875,
0.027252197265625,
0.00033473968505859375,
0.026580810546875,
-0.04266357421875,
0.035858154296875,
-0.0201416015625,
0.00424957275390625,
-0.0018711090087890625,
-0.02032470703125,
-0.0198822021484375,
-0.0283203125,
-0.0024318695068359375,
-0.059906005859375,
-0.01480865478515625,
0.006687164306640625,
0.07806396484375,
0.030487060546875,
-0.025787353515625,
-0.0237579345703125,
-0.033294677734375,
0.0465087890625,
-0.06842041015625,
0.0153961181640625,
0.0361328125,
0.01474761962890625,
-0.023101806640625,
-0.049530029296875,
-0.045806884765625,
-0.02325439453125,
-0.0136260986328125,
0.0137939453125,
-0.01837158203125,
0.004734039306640625,
0.02923583984375,
0.0284576416015625,
-0.04718017578125,
0.01021575927734375,
-0.034423828125,
-0.012664794921875,
0.05035400390625,
0.0161590576171875,
0.0193328857421875,
-0.00971221923828125,
-0.02520751953125,
-0.0204315185546875,
-0.049835205078125,
0.0102386474609375,
0.03009033203125,
0.01085662841796875,
-0.041900634765625,
0.04583740234375,
-0.0170440673828125,
0.0416259765625,
0.008697509765625,
-0.026123046875,
0.04254150390625,
-0.031097412109375,
-0.022735595703125,
-0.00556182861328125,
0.08465576171875,
0.02691650390625,
0.0123291015625,
0.0202178955078125,
-0.01332855224609375,
-0.00469207763671875,
0.00424957275390625,
-0.05682373046875,
-0.007114410400390625,
0.0273284912109375,
-0.03594970703125,
-0.034759521484375,
0.01444244384765625,
-0.05047607421875,
-0.0174102783203125,
-0.01021575927734375,
0.027252197265625,
-0.0271148681640625,
-0.0287628173828125,
0.019805908203125,
0.0027141571044921875,
0.0169525146484375,
0.02081298828125,
-0.059478759765625,
0.00905609130859375,
0.033416748046875,
0.05682373046875,
0.00800323486328125,
-0.03204345703125,
-0.025970458984375,
-0.0089263916015625,
-0.0244140625,
0.0379638671875,
-0.01898193359375,
-0.02545166015625,
-0.0113983154296875,
0.0113983154296875,
-0.016021728515625,
-0.03363037109375,
0.03460693359375,
-0.03076171875,
0.031890869140625,
-0.004955291748046875,
-0.0204010009765625,
-0.019073486328125,
0.00982666015625,
-0.042510986328125,
0.07958984375,
0.01027679443359375,
-0.05316162109375,
0.005443572998046875,
-0.08294677734375,
-0.0138092041015625,
-0.035308837890625,
-0.003265380859375,
-0.0270538330078125,
-0.016632080078125,
0.036529541015625,
0.032623291015625,
-0.0224761962890625,
0.029937744140625,
-0.03021240234375,
-0.0309600830078125,
0.01253509521484375,
-0.0204925537109375,
0.08306884765625,
0.0225830078125,
-0.035980224609375,
0.0184326171875,
-0.0601806640625,
-0.02008056640625,
0.0296173095703125,
-0.0308837890625,
-0.00548553466796875,
-0.0163116455078125,
-0.007747650146484375,
0.01214599609375,
0.0293731689453125,
-0.038421630859375,
0.0321044921875,
-0.0257110595703125,
0.0391845703125,
0.064453125,
-0.0085601806640625,
0.0301513671875,
-0.0249786376953125,
0.048828125,
0.0147705078125,
0.033203125,
-0.0248260498046875,
-0.05657958984375,
-0.06463623046875,
-0.0245819091796875,
0.00763702392578125,
0.047637939453125,
-0.0567626953125,
0.057861328125,
-0.015960693359375,
-0.04254150390625,
-0.05010986328125,
-0.00160980224609375,
0.03546142578125,
0.03851318359375,
0.031402587890625,
-0.0234375,
-0.0501708984375,
-0.0548095703125,
0.012939453125,
-0.0193023681640625,
-0.004734039306640625,
0.040771484375,
0.042694091796875,
-0.0394287109375,
0.072265625,
-0.04962158203125,
-0.0306549072265625,
-0.014892578125,
-0.0008459091186523438,
0.03521728515625,
0.041839599609375,
0.05712890625,
-0.05364990234375,
-0.0180511474609375,
-0.01349639892578125,
-0.06939697265625,
-0.01739501953125,
-0.0052337646484375,
-0.011138916015625,
0.030487060546875,
0.0374755859375,
-0.05731201171875,
0.03887939453125,
0.049163818359375,
-0.0271453857421875,
0.03594970703125,
-0.0144805908203125,
0.0038089752197265625,
-0.09173583984375,
0.0223388671875,
-0.003612518310546875,
-0.005359649658203125,
-0.033294677734375,
0.01250457763671875,
-0.01125335693359375,
-0.004852294921875,
-0.03704833984375,
0.050994873046875,
-0.0284576416015625,
-0.0016345977783203125,
-0.004283905029296875,
0.003841400146484375,
-0.0164794921875,
0.054473876953125,
0.0022602081298828125,
0.07061767578125,
0.042694091796875,
-0.0439453125,
0.022918701171875,
0.03375244140625,
-0.0036220550537109375,
0.015869140625,
-0.06646728515625,
0.025115966796875,
-0.0035915374755859375,
0.04388427734375,
-0.07275390625,
-0.020172119140625,
0.048828125,
-0.051910400390625,
0.0148162841796875,
-0.0010919570922851562,
-0.042205810546875,
-0.032135009765625,
-0.029052734375,
0.02325439453125,
0.041534423828125,
-0.05291748046875,
0.046112060546875,
0.0247344970703125,
0.003849029541015625,
-0.05120849609375,
-0.045867919921875,
-0.0108795166015625,
-0.015350341796875,
-0.05316162109375,
0.00753021240234375,
-0.007457733154296875,
0.00714111328125,
-0.01490020751953125,
-0.00453948974609375,
-0.004711151123046875,
0.00936126708984375,
0.034942626953125,
0.0237579345703125,
-0.0233306884765625,
-0.018951416015625,
-0.006504058837890625,
-0.023956298828125,
0.005908966064453125,
-0.0111236572265625,
0.0572509765625,
-0.0220489501953125,
-0.02288818359375,
-0.033111572265625,
0.0040283203125,
0.042510986328125,
-0.0212554931640625,
0.0667724609375,
0.0635986328125,
-0.01611328125,
0.0128021240234375,
-0.036041259765625,
-0.022491455078125,
-0.03961181640625,
0.0171051025390625,
-0.016021728515625,
-0.07122802734375,
0.047332763671875,
0.01189422607421875,
0.017242431640625,
0.035614013671875,
0.0340576171875,
0.00890350341796875,
0.07647705078125,
0.050567626953125,
-0.0249481201171875,
0.0401611328125,
-0.035888671875,
0.016021728515625,
-0.0650634765625,
-0.016571044921875,
-0.038543701171875,
-0.011383056640625,
-0.043914794921875,
-0.0219268798828125,
0.037078857421875,
0.0088043212890625,
-0.030303955078125,
0.048583984375,
-0.033905029296875,
0.0179290771484375,
0.0478515625,
0.011444091796875,
0.0182647705078125,
-0.01183319091796875,
-0.0030803680419921875,
0.0179443359375,
-0.0601806640625,
-0.052764892578125,
0.09130859375,
0.037384033203125,
0.05865478515625,
0.006961822509765625,
0.056640625,
-0.0012903213500976562,
0.018829345703125,
-0.032806396484375,
0.04901123046875,
0.009735107421875,
-0.041900634765625,
-0.0219879150390625,
-0.04559326171875,
-0.07965087890625,
0.00620269775390625,
0.00325775146484375,
-0.0728759765625,
0.015533447265625,
0.0127105712890625,
-0.0340576171875,
0.0168914794921875,
-0.052093505859375,
0.07354736328125,
0.0024776458740234375,
0.00691986083984375,
0.0021495819091796875,
-0.05780029296875,
0.043548583984375,
-0.00664520263671875,
0.00901031494140625,
0.00185394287109375,
-0.0173492431640625,
0.053863525390625,
-0.044830322265625,
0.073974609375,
-0.00710296630859375,
-0.0162506103515625,
0.0210113525390625,
-0.00959014892578125,
0.031402587890625,
0.010772705078125,
-0.005313873291015625,
0.03759765625,
0.0015134811401367188,
-0.0203094482421875,
-0.023284912109375,
0.045501708984375,
-0.0782470703125,
-0.035491943359375,
-0.025238037109375,
-0.0302886962890625,
0.0085296630859375,
0.01068878173828125,
0.0299224853515625,
0.0226593017578125,
0.00336456298828125,
0.00919342041015625,
0.047943115234375,
-0.052764892578125,
0.0274200439453125,
0.04449462890625,
-0.0234222412109375,
-0.043609619140625,
0.060882568359375,
-0.0036373138427734375,
0.0172882080078125,
0.0270843505859375,
0.01016998291015625,
-0.0202789306640625,
-0.0196380615234375,
-0.037200927734375,
0.0211029052734375,
-0.048828125,
-0.0235443115234375,
-0.061309814453125,
-0.01898193359375,
-0.048095703125,
0.006145477294921875,
-0.022125244140625,
-0.019622802734375,
-0.05010986328125,
-0.01050567626953125,
0.04742431640625,
0.04461669921875,
-0.01502227783203125,
0.0277557373046875,
-0.05609130859375,
0.029327392578125,
0.00638580322265625,
0.0117034912109375,
-0.004482269287109375,
-0.045989990234375,
-0.004711151123046875,
0.0164947509765625,
-0.0338134765625,
-0.0572509765625,
0.024658203125,
0.0081939697265625,
0.04229736328125,
0.0308837890625,
0.0115814208984375,
0.0511474609375,
-0.01995849609375,
0.07843017578125,
0.006877899169921875,
-0.06072998046875,
0.0494384765625,
-0.036651611328125,
0.021820068359375,
0.0270843505859375,
0.037872314453125,
-0.0284576416015625,
-0.031463623046875,
-0.052978515625,
-0.0650634765625,
0.07452392578125,
0.0192413330078125,
0.00806427001953125,
-0.01099395751953125,
0.018890380859375,
-0.00958251953125,
0.0166168212890625,
-0.0582275390625,
-0.03558349609375,
-0.01412200927734375,
-0.0108795166015625,
-0.0113372802734375,
-0.0120391845703125,
-0.0010623931884765625,
-0.0213775634765625,
0.0589599609375,
-0.01885986328125,
0.04364013671875,
0.02056884765625,
-0.006893157958984375,
-0.0098876953125,
0.0009441375732421875,
0.06591796875,
0.045196533203125,
-0.021209716796875,
-0.0171966552734375,
0.0212554931640625,
-0.045257568359375,
0.00626373291015625,
0.01311492919921875,
-0.008056640625,
-0.0034732818603515625,
0.036407470703125,
0.0675048828125,
0.01302337646484375,
-0.05010986328125,
0.04925537109375,
-0.0104217529296875,
-0.0135040283203125,
-0.0255889892578125,
0.0157318115234375,
0.00890350341796875,
0.0209197998046875,
0.01898193359375,
0.0025043487548828125,
0.0005049705505371094,
-0.039520263671875,
-0.00225830078125,
0.0109405517578125,
-0.0045318603515625,
-0.0194244384765625,
0.061248779296875,
0.0137176513671875,
-0.0196380615234375,
0.04180908203125,
-0.01580810546875,
-0.036224365234375,
0.06610107421875,
0.0259857177734375,
0.0665283203125,
-0.0202789306640625,
0.007511138916015625,
0.03753662109375,
0.03387451171875,
-0.00405120849609375,
0.03387451171875,
0.003177642822265625,
-0.04327392578125,
-0.0209808349609375,
-0.04132080078125,
-0.02081298828125,
0.0223388671875,
-0.047332763671875,
0.030487060546875,
-0.046905517578125,
-0.0142669677734375,
-0.013153076171875,
-0.00681304931640625,
-0.049774169921875,
-0.004047393798828125,
-0.0006756782531738281,
0.08917236328125,
-0.060577392578125,
0.05731201171875,
0.06390380859375,
-0.055267333984375,
-0.07965087890625,
-0.011688232421875,
0.0128936767578125,
-0.061065673828125,
0.0399169921875,
0.018463134765625,
0.01102447509765625,
-0.0033397674560546875,
-0.052581787109375,
-0.06982421875,
0.0999755859375,
0.034423828125,
-0.029541015625,
-0.0013570785522460938,
0.01314544677734375,
0.039306640625,
-0.033050537109375,
0.05316162109375,
0.0430908203125,
0.037811279296875,
-0.0026226043701171875,
-0.10394287109375,
0.0176544189453125,
-0.0235443115234375,
0.00457763671875,
-0.0037288665771484375,
-0.06292724609375,
0.07635498046875,
-0.019622802734375,
-0.0003304481506347656,
0.0297698974609375,
0.046234130859375,
0.034820556640625,
0.0220794677734375,
0.03277587890625,
0.0491943359375,
0.040679931640625,
-0.0090179443359375,
0.07806396484375,
-0.0284576416015625,
0.03375244140625,
0.0718994140625,
-0.0036563873291015625,
0.053619384765625,
0.0231781005859375,
-0.014495849609375,
0.0198822021484375,
0.06256103515625,
0.003448486328125,
0.02923583984375,
-0.0021839141845703125,
0.00724029541015625,
0.0035190582275390625,
-0.002727508544921875,
-0.0498046875,
0.0284423828125,
0.0172882080078125,
-0.0418701171875,
-0.0163116455078125,
0.00751495361328125,
0.01739501953125,
-0.0255889892578125,
-0.007843017578125,
0.0718994140625,
0.01412200927734375,
-0.042144775390625,
0.07708740234375,
0.00081634521484375,
0.07196044921875,
-0.0533447265625,
-0.006572723388671875,
-0.034454345703125,
0.01349639892578125,
-0.02081298828125,
-0.04412841796875,
0.0106048583984375,
0.0045318603515625,
0.0025959014892578125,
-0.01502227783203125,
0.03692626953125,
-0.0178680419921875,
-0.0098114013671875,
0.0287017822265625,
0.0215911865234375,
0.034393310546875,
-0.0026264190673828125,
-0.059722900390625,
0.0269622802734375,
0.00522613525390625,
-0.036102294921875,
0.0230560302734375,
0.0273284912109375,
0.0045623779296875,
0.057525634765625,
0.05828857421875,
-0.00936126708984375,
-0.0014886856079101562,
-0.005733489990234375,
0.08062744140625,
-0.034271240234375,
-0.040130615234375,
-0.060394287109375,
0.0386962890625,
0.0024280548095703125,
-0.04901123046875,
0.0557861328125,
0.03802490234375,
0.0709228515625,
-0.0181121826171875,
0.039031982421875,
-0.0167388916015625,
0.0206756591796875,
-0.0428466796875,
0.0582275390625,
-0.042083740234375,
0.0257720947265625,
-0.03790283203125,
-0.0777587890625,
-0.002735137939453125,
0.056304931640625,
-0.0193634033203125,
0.0131683349609375,
0.0418701171875,
0.085693359375,
-0.01324462890625,
0.004947662353515625,
0.0111236572265625,
0.02032470703125,
0.0291595458984375,
0.058746337890625,
0.05523681640625,
-0.05084228515625,
0.04864501953125,
-0.0202484130859375,
-0.03228759765625,
-0.02862548828125,
-0.06402587890625,
-0.07891845703125,
-0.036895751953125,
-0.02362060546875,
-0.028411865234375,
-0.0015735626220703125,
0.093017578125,
0.059234619140625,
-0.0509033203125,
-0.021270751953125,
0.0011415481567382812,
0.0021076202392578125,
-0.018890380859375,
-0.018829345703125,
0.0202484130859375,
-0.0079498291015625,
-0.048828125,
0.028717041015625,
0.0011358261108398438,
0.0173797607421875,
-0.0248870849609375,
-0.02178955078125,
-0.0131683349609375,
0.00136566162109375,
0.039459228515625,
0.0340576171875,
-0.055694580078125,
-0.01203155517578125,
0.004161834716796875,
-0.0168914794921875,
0.005126953125,
0.025787353515625,
-0.046173095703125,
0.0080413818359375,
0.0325927734375,
0.03216552734375,
0.025543212890625,
-0.0051727294921875,
0.0380859375,
-0.046844482421875,
0.02276611328125,
0.0019626617431640625,
0.0364990234375,
0.00992584228515625,
-0.035919189453125,
0.06182861328125,
0.01145172119140625,
-0.03857421875,
-0.06396484375,
-0.006862640380859375,
-0.07879638671875,
-0.0035400390625,
0.1009521484375,
-0.0245819091796875,
-0.0236358642578125,
0.008270263671875,
-0.045318603515625,
0.01366424560546875,
-0.047637939453125,
0.048492431640625,
0.032958984375,
0.000026047229766845703,
-0.0161285400390625,
-0.04290771484375,
0.0282745361328125,
0.01580810546875,
-0.07733154296875,
-0.015869140625,
0.02716064453125,
0.02130126953125,
0.0284881591796875,
0.060577392578125,
-0.004108428955078125,
0.0259552001953125,
-0.01537322998046875,
0.004375457763671875,
-0.0292205810546875,
-0.018768310546875,
-0.03265380859375,
-0.01271820068359375,
-0.014984130859375,
-0.0284881591796875
]
] |
google/pix2struct-docvqa-base | 2023-05-19T10:06:56.000Z | [
"transformers",
"pytorch",
"pix2struct",
"text2text-generation",
"visual-question-answering",
"en",
"fr",
"ro",
"de",
"multilingual",
"arxiv:2210.03347",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"region:us"
] | visual-question-answering | google | null | null | google/pix2struct-docvqa-base | 23 | 12,801 | transformers | 2023-03-21T09:45:02 | ---
language:
- en
- fr
- ro
- de
- multilingual
pipeline_tag: visual-question-answering
inference: false
license: apache-2.0
---
# Model card for Pix2Struct - Finetuned on Doc-VQA (Visual Question Answering over scanned documents)

# Table of Contents
0. [TL;DR](#TL;DR)
1. [Using the model](#using-the-model)
2. [Contribution](#contribution)
3. [Citation](#citation)
# TL;DR
Pix2Struct is an image-encoder, text-decoder model that is trained on image-text pairs for various tasks, including image captioning and visual question answering. The full list of available models can be found in Table 1 of the paper:

The abstract of the paper states:
> Visually-situated language is ubiquitous—sources range from textbooks with diagrams to web pages with images and tables, to mobile apps with buttons and
forms. Perhaps due to this diversity, previous work has typically relied on domain-specific recipes with limited sharing of the underlying data, model architectures,
and objectives. We present Pix2Struct, a pretrained image-to-text model for
purely visual language understanding, which can be finetuned on tasks containing visually-situated language. Pix2Struct is pretrained by learning to parse
masked screenshots of web pages into simplified HTML. The web, with its richness of visual elements cleanly reflected in the HTML structure, provides a large
source of pretraining data well suited to the diversity of downstream tasks. Intuitively, this objective subsumes common pretraining signals such as OCR, language modeling, image captioning. In addition to the novel pretraining strategy,
we introduce a variable-resolution input representation and a more flexible integration of language and vision inputs, where language prompts such as questions
are rendered directly on top of the input image. For the first time, we show that a
single pretrained model can achieve state-of-the-art results in six out of nine tasks
across four domains: documents, illustrations, user interfaces, and natural images.
# Using the model
## Converting from T5x to huggingface
You can use the [`convert_pix2struct_checkpoint_to_pytorch.py`](https://github.com/huggingface/transformers/blob/main/src/transformers/models/pix2struct/convert_pix2struct_checkpoint_to_pytorch.py) script as follows:
```bash
python convert_pix2struct_checkpoint_to_pytorch.py --t5x_checkpoint_path PATH_TO_T5X_CHECKPOINTS --pytorch_dump_path PATH_TO_SAVE
```
If you are converting a large model, run:
```bash
python convert_pix2struct_checkpoint_to_pytorch.py --t5x_checkpoint_path PATH_TO_T5X_CHECKPOINTS --pytorch_dump_path PATH_TO_SAVE --use-large
```
Once saved, you can push your converted model with the following snippet:
```python
from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor
model = Pix2StructForConditionalGeneration.from_pretrained(PATH_TO_SAVE)
processor = Pix2StructProcessor.from_pretrained(PATH_TO_SAVE)
model.push_to_hub("USERNAME/MODEL_NAME")
processor.push_to_hub("USERNAME/MODEL_NAME")
```
## Running the model
The instructions for running this model are the same as for the [`pix2struct-ai2d-base`](https://huggingface.co/ybelkada/pix2struct-ai2d-base) model; a DocVQA-specific sketch follows below.
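A minimal DocVQA-style sketch (the document path and question below are placeholders; assumes `Pillow` is installed and that the processor renders the question onto the image, as it does for VQA-finetuned Pix2Struct checkpoints):
```python
from PIL import Image
from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor

processor = Pix2StructProcessor.from_pretrained("google/pix2struct-docvqa-base")
model = Pix2StructForConditionalGeneration.from_pretrained("google/pix2struct-docvqa-base")

# Placeholder inputs -- replace with your own scanned document and question.
image = Image.open("path/to/scanned_document.png")
question = "What is the invoice total?"

inputs = processor(images=image, text=question, return_tensors="pt")
predictions = model.generate(**inputs, max_new_tokens=50)
print(processor.decode(predictions[0], skip_special_tokens=True))
```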
# Contribution
This model was originally contributed by Kenton Lee, Mandar Joshi et al. and added to the Hugging Face ecosystem by [Younes Belkada](https://huggingface.co/ybelkada).
# Citation
If you want to cite this work, please consider citing the original paper:
```
@misc{https://doi.org/10.48550/arxiv.2210.03347,
doi = {10.48550/ARXIV.2210.03347},
url = {https://arxiv.org/abs/2210.03347},
author = {Lee, Kenton and Joshi, Mandar and Turc, Iulia and Hu, Hexiang and Liu, Fangyu and Eisenschlos, Julian and Khandelwal, Urvashi and Shaw, Peter and Chang, Ming-Wei and Toutanova, Kristina},
keywords = {Computation and Language (cs.CL), Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Pix2Struct: Screenshot Parsing as Pretraining for Visual Language Understanding},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` | 4,474 | [
[
-0.0290069580078125,
-0.0567626953125,
0.03265380859375,
0.0175323486328125,
-0.0193328857421875,
-0.028472900390625,
-0.003124237060546875,
-0.03399658203125,
-0.01445770263671875,
0.028900146484375,
-0.045928955078125,
-0.017333984375,
-0.0521240234375,
-0.0101318359375,
-0.017242431640625,
0.06915283203125,
-0.0149078369140625,
0.00537109375,
-0.04498291015625,
0.0012826919555664062,
-0.01543426513671875,
-0.03387451171875,
-0.044219970703125,
-0.0186004638671875,
0.0191802978515625,
0.01126861572265625,
0.04437255859375,
0.03424072265625,
0.033355712890625,
0.0223846435546875,
-0.00856781005859375,
-0.00494384765625,
-0.0286712646484375,
-0.0223846435546875,
-0.0176239013671875,
-0.049652099609375,
-0.034027099609375,
0.022064208984375,
0.04949951171875,
0.03863525390625,
0.008270263671875,
0.0084075927734375,
0.01251220703125,
0.04193115234375,
-0.036773681640625,
0.0211334228515625,
-0.0258026123046875,
0.00263214111328125,
-0.003734588623046875,
0.015228271484375,
-0.03521728515625,
-0.0092315673828125,
0.0029277801513671875,
-0.046417236328125,
0.006313323974609375,
-0.00972747802734375,
0.091064453125,
0.0238800048828125,
-0.0180816650390625,
0.00534820556640625,
-0.01995849609375,
0.037384033203125,
-0.03424072265625,
0.031646728515625,
0.03680419921875,
0.0141448974609375,
0.004180908203125,
-0.0859375,
-0.05517578125,
-0.0027637481689453125,
-0.020263671875,
0.0197296142578125,
-0.033447265625,
-0.000640869140625,
0.0309906005859375,
0.0184326171875,
-0.046966552734375,
-0.005115509033203125,
-0.03985595703125,
-0.021636962890625,
0.033203125,
-0.0192718505859375,
0.044677734375,
-0.02178955078125,
-0.041961669921875,
-0.04345703125,
-0.03173828125,
0.0325927734375,
-0.0019016265869140625,
-0.0072479248046875,
-0.04132080078125,
0.04595947265625,
0.00502777099609375,
0.039794921875,
0.010040283203125,
0.008544921875,
0.0296783447265625,
-0.021453857421875,
-0.00598907470703125,
-0.0298309326171875,
0.07757568359375,
0.044525146484375,
0.028900146484375,
0.0015134811401367188,
-0.002201080322265625,
-0.00027751922607421875,
0.0162811279296875,
-0.091064453125,
-0.036651611328125,
0.013824462890625,
-0.038299560546875,
-0.015899658203125,
0.024444580078125,
-0.047637939453125,
-0.004848480224609375,
-0.0221099853515625,
0.0426025390625,
-0.038970947265625,
-0.038299560546875,
-0.0205078125,
-0.0178375244140625,
0.03271484375,
0.035858154296875,
-0.04364013671875,
0.0196990966796875,
0.041473388671875,
0.0810546875,
-0.0103302001953125,
-0.02972412109375,
-0.042327880859375,
-0.0201873779296875,
-0.02545166015625,
0.07135009765625,
-0.02838134765625,
-0.00629425048828125,
-0.0193328857421875,
0.0207366943359375,
-0.018524169921875,
-0.036865234375,
-0.0016536712646484375,
-0.0243377685546875,
0.0275726318359375,
-0.004070281982421875,
-0.014892578125,
-0.023468017578125,
0.01213836669921875,
-0.0360107421875,
0.0897216796875,
0.038360595703125,
-0.05987548828125,
-0.00035572052001953125,
-0.0413818359375,
-0.02008056640625,
-0.005107879638671875,
-0.01378631591796875,
-0.058319091796875,
0.00611114501953125,
0.010406494140625,
0.032806396484375,
-0.00946044921875,
0.00949859619140625,
-0.03155517578125,
-0.01445770263671875,
0.019775390625,
0.0015611648559570312,
0.07232666015625,
0.020355224609375,
-0.0360107421875,
0.004222869873046875,
-0.0330810546875,
0.016632080078125,
0.0171966552734375,
-0.00955963134765625,
-0.007305145263671875,
-0.01390838623046875,
0.0224609375,
0.03485107421875,
0.0164794921875,
-0.0307464599609375,
0.0159759521484375,
-0.01373291015625,
0.05029296875,
0.031402587890625,
-0.021240234375,
0.038177490234375,
-0.004703521728515625,
0.0263671875,
0.01088714599609375,
0.00788116455078125,
-0.03363037109375,
-0.04022216796875,
-0.05291748046875,
-0.023101806640625,
0.00909423828125,
0.046295166015625,
-0.065673828125,
0.0262603759765625,
-0.01151275634765625,
-0.040496826171875,
-0.00994873046875,
-0.0038394927978515625,
0.052947998046875,
0.042327880859375,
0.03216552734375,
-0.039093017578125,
-0.0246124267578125,
-0.06085205078125,
-0.0032367706298828125,
-0.018798828125,
-0.010040283203125,
0.021240234375,
0.048248291015625,
-0.037872314453125,
0.07025146484375,
-0.027374267578125,
-0.0157318115234375,
-0.02154541015625,
0.01105499267578125,
-0.0006208419799804688,
0.06396484375,
0.055389404296875,
-0.063720703125,
-0.038421630859375,
-0.005413055419921875,
-0.06463623046875,
-0.00457000732421875,
-0.004730224609375,
-0.029266357421875,
0.0222930908203125,
0.04876708984375,
-0.0498046875,
0.038787841796875,
0.03143310546875,
-0.04315185546875,
0.038330078125,
-0.00882720947265625,
-0.00009834766387939453,
-0.09222412109375,
0.0272674560546875,
0.00679779052734375,
-0.0347900390625,
-0.042022705078125,
0.0231475830078125,
0.0296478271484375,
-0.0306243896484375,
-0.0440673828125,
0.0599365234375,
-0.0462646484375,
-0.0162811279296875,
-0.021331787109375,
-0.01409149169921875,
0.0084228515625,
0.0538330078125,
0.0298004150390625,
0.061798095703125,
0.0572509765625,
-0.04290771484375,
0.018768310546875,
0.047332763671875,
-0.0128173828125,
0.05010986328125,
-0.064697265625,
0.025390625,
-0.01389312744140625,
0.018096923828125,
-0.06646728515625,
-0.02032470703125,
0.037628173828125,
-0.051666259765625,
0.031402587890625,
-0.021942138671875,
-0.025909423828125,
-0.035675048828125,
-0.01419830322265625,
0.042327880859375,
0.0513916015625,
-0.0445556640625,
0.047943115234375,
0.018524169921875,
-0.02093505859375,
-0.01369476318359375,
-0.0595703125,
-0.0051422119140625,
-0.0004177093505859375,
-0.05596923828125,
0.02825927734375,
-0.007556915283203125,
-0.00591278076171875,
0.0033740997314453125,
0.00272369384765625,
-0.00908660888671875,
-0.007099151611328125,
0.037384033203125,
0.02789306640625,
-0.01503753662109375,
-0.006855010986328125,
-0.000006318092346191406,
-0.0256500244140625,
0.0019483566284179688,
-0.0247344970703125,
0.04833984375,
-0.0242156982421875,
-0.00788116455078125,
-0.06964111328125,
0.035247802734375,
0.04644775390625,
-0.040008544921875,
0.04132080078125,
0.053314208984375,
-0.03564453125,
0.00821685791015625,
-0.0404052734375,
-0.01351165771484375,
-0.033935546875,
0.048370361328125,
-0.037841796875,
-0.0513916015625,
0.0302581787109375,
-0.0082244873046875,
-0.01110076904296875,
0.042266845703125,
0.043701171875,
-0.02203369140625,
0.06427001953125,
0.07061767578125,
0.0232086181640625,
0.060089111328125,
-0.03375244140625,
0.0007052421569824219,
-0.0665283203125,
-0.03814697265625,
-0.027374267578125,
-0.0118255615234375,
-0.0274200439453125,
-0.0445556640625,
0.03607177734375,
0.0284271240234375,
-0.0296783447265625,
0.040863037109375,
-0.0400390625,
0.0202789306640625,
0.0560302734375,
0.03387451171875,
-0.013916015625,
0.03448486328125,
0.0002390146255493164,
-0.0003731250762939453,
-0.046905517578125,
-0.0245819091796875,
0.06317138671875,
0.03448486328125,
0.0423583984375,
-0.0114288330078125,
0.032989501953125,
-0.007244110107421875,
0.00534820556640625,
-0.05596923828125,
0.030303955078125,
-0.007350921630859375,
-0.029876708984375,
-0.005290985107421875,
-0.0270538330078125,
-0.052398681640625,
0.01076507568359375,
-0.01474761962890625,
-0.06390380859375,
0.02581787109375,
0.0193328857421875,
-0.0305938720703125,
0.019775390625,
-0.0689697265625,
0.09429931640625,
-0.0243377685546875,
-0.055145263671875,
-0.001483917236328125,
-0.052032470703125,
0.0181732177734375,
0.01551055908203125,
-0.00046563148498535156,
0.01419830322265625,
0.01201629638671875,
0.06658935546875,
-0.053466796875,
0.05926513671875,
-0.0229339599609375,
0.01422119140625,
0.038818359375,
0.00017845630645751953,
0.031524658203125,
0.0018339157104492188,
0.0023899078369140625,
0.032958984375,
0.032012939453125,
-0.03350830078125,
-0.041717529296875,
0.026824951171875,
-0.0787353515625,
-0.025787353515625,
-0.037139892578125,
-0.0256500244140625,
0.0041656494140625,
0.025482177734375,
0.038482666015625,
0.032196044921875,
0.0027618408203125,
0.013397216796875,
0.049957275390625,
-0.0275115966796875,
0.032501220703125,
0.0095367431640625,
-0.0097808837890625,
-0.037322998046875,
0.05218505859375,
-0.01065826416015625,
0.02691650390625,
0.0211334228515625,
0.0053863525390625,
-0.03314208984375,
-0.0161285400390625,
-0.03631591796875,
0.0360107421875,
-0.0440673828125,
-0.01291656494140625,
-0.06353759765625,
-0.026092529296875,
-0.043701171875,
-0.007045745849609375,
-0.04949951171875,
-0.0168609619140625,
-0.03387451171875,
0.01453399658203125,
0.0290069580078125,
0.034942626953125,
-0.01036834716796875,
0.037322998046875,
-0.045196533203125,
0.035186767578125,
0.03338623046875,
0.04376220703125,
-0.016204833984375,
-0.0477294921875,
0.003971099853515625,
0.01434326171875,
-0.02374267578125,
-0.06341552734375,
0.0253448486328125,
0.0224151611328125,
0.034576416015625,
0.0295562744140625,
-0.00641632080078125,
0.0556640625,
-0.0309295654296875,
0.04498291015625,
0.040679931640625,
-0.059112548828125,
0.06280517578125,
-0.006443023681640625,
0.01102447509765625,
0.0428466796875,
0.029998779296875,
-0.037841796875,
0.01363372802734375,
-0.050811767578125,
-0.05126953125,
0.07318115234375,
0.019561767578125,
0.014984130859375,
0.0220489501953125,
0.0460205078125,
-0.00829315185546875,
0.00904083251953125,
-0.0743408203125,
-0.0007634162902832031,
-0.052276611328125,
-0.01348876953125,
0.00005358457565307617,
-0.034393310546875,
0.0005168914794921875,
-0.040985107421875,
0.0423583984375,
-0.0164031982421875,
0.056304931640625,
0.027984619140625,
-0.03631591796875,
0.00325775146484375,
-0.0188751220703125,
0.031036376953125,
0.0355224609375,
0.0005002021789550781,
0.0157318115234375,
-0.0171356201171875,
-0.043792724609375,
-0.00833892822265625,
0.016876220703125,
-0.0167236328125,
-0.0116424560546875,
0.0310211181640625,
0.08294677734375,
-0.001293182373046875,
-0.041534423828125,
0.061553955078125,
-0.00545501708984375,
-0.0239715576171875,
-0.03271484375,
0.00711822509765625,
-0.0012884140014648438,
0.0291595458984375,
0.0190887451171875,
0.0132598876953125,
-0.0150909423828125,
-0.0535888671875,
0.0261077880859375,
0.037933349609375,
-0.037689208984375,
-0.031524658203125,
0.0638427734375,
0.00801849365234375,
-0.018890380859375,
0.0667724609375,
-0.01050567626953125,
-0.048583984375,
0.05474853515625,
0.038116455078125,
0.055419921875,
-0.00836944580078125,
0.013580322265625,
0.060882568359375,
0.01454925537109375,
-0.011016845703125,
0.012420654296875,
-0.0247344970703125,
-0.045318603515625,
-0.005344390869140625,
-0.052734375,
-0.00618743896484375,
0.0032291412353515625,
-0.045318603515625,
0.0269012451171875,
-0.04644775390625,
-0.0020599365234375,
-0.0125885009765625,
-0.0027618408203125,
-0.050506591796875,
0.0239410400390625,
0.02618408203125,
0.0633544921875,
-0.055755615234375,
0.055328369140625,
0.067626953125,
-0.05181884765625,
-0.06304931640625,
-0.00858306884765625,
0.00390625,
-0.06805419921875,
0.039703369140625,
0.038818359375,
0.0140228271484375,
0.01557159423828125,
-0.061920166015625,
-0.05364990234375,
0.0938720703125,
0.0203704833984375,
-0.037750244140625,
-0.0033588409423828125,
0.00949859619140625,
0.0216827392578125,
-0.0177459716796875,
0.051422119140625,
0.02435302734375,
0.0305938720703125,
0.02935791015625,
-0.0667724609375,
0.01041412353515625,
-0.06329345703125,
0.019805908203125,
-0.0225830078125,
-0.048828125,
0.07537841796875,
-0.0305938720703125,
-0.0262603759765625,
0.018035888671875,
0.052276611328125,
0.0168914794921875,
0.0216064453125,
0.0290069580078125,
0.046295166015625,
0.0389404296875,
-0.023406982421875,
0.0853271484375,
-0.01486968994140625,
0.03216552734375,
0.0616455078125,
0.014068603515625,
0.06768798828125,
0.034332275390625,
-0.02362060546875,
0.0357666015625,
0.060943603515625,
-0.0207366943359375,
0.0283966064453125,
-0.0106658935546875,
0.006683349609375,
-0.030426025390625,
0.0086822509765625,
-0.03961181640625,
0.0308990478515625,
0.015960693359375,
-0.02862548828125,
-0.01654052734375,
0.012664794921875,
0.0175628662109375,
0.00516510009765625,
-0.022003173828125,
0.058074951171875,
0.0108184814453125,
-0.059295654296875,
0.056671142578125,
0.005290985107421875,
0.056243896484375,
-0.044769287109375,
-0.00008273124694824219,
-0.024993896484375,
0.0193939208984375,
-0.0239715576171875,
-0.056884765625,
0.0277862548828125,
-0.0006380081176757812,
-0.0167388916015625,
-0.0250701904296875,
0.04437255859375,
-0.0283660888671875,
-0.057708740234375,
0.0118560791015625,
0.0169677734375,
0.02490234375,
-0.0406494140625,
-0.061492919921875,
0.0123443603515625,
0.002109527587890625,
-0.0239105224609375,
0.02508544921875,
0.01361846923828125,
-0.01306915283203125,
0.043060302734375,
0.04705810546875,
-0.0211639404296875,
0.0038547515869140625,
-0.016326904296875,
0.067626953125,
-0.03973388671875,
-0.051116943359375,
-0.044769287109375,
0.0654296875,
-0.00019598007202148438,
-0.0242767333984375,
0.0285491943359375,
0.041229248046875,
0.0740966796875,
-0.00959014892578125,
0.05194091796875,
-0.028045654296875,
0.0105743408203125,
-0.0435791015625,
0.07568359375,
-0.056365966796875,
-0.019866943359375,
-0.0281829833984375,
-0.0791015625,
-0.019927978515625,
0.07525634765625,
-0.02301025390625,
0.01042938232421875,
0.055572509765625,
0.07623291015625,
-0.01995849609375,
-0.033203125,
0.0057220458984375,
0.018768310546875,
0.0201416015625,
0.04754638671875,
0.041839599609375,
-0.052032470703125,
0.03424072265625,
-0.031646728515625,
-0.04058837890625,
-0.004001617431640625,
-0.050445556640625,
-0.079833984375,
-0.06439208984375,
-0.048126220703125,
-0.03125,
-0.0006313323974609375,
0.04974365234375,
0.07135009765625,
-0.047760009765625,
-0.007373809814453125,
-0.021209716796875,
-0.013031005859375,
-0.0140380859375,
-0.0159454345703125,
0.035491943359375,
-0.0132598876953125,
-0.062469482421875,
-0.01007080078125,
0.006893157958984375,
0.0263519287109375,
0.0054473876953125,
-0.0168304443359375,
-0.01084136962890625,
-0.0272216796875,
0.052520751953125,
0.03790283203125,
-0.044219970703125,
-0.002330780029296875,
-0.002796173095703125,
-0.011260986328125,
0.019500732421875,
0.040496826171875,
-0.058135986328125,
0.0293426513671875,
0.03912353515625,
0.038665771484375,
0.06256103515625,
0.00859832763671875,
0.0229949951171875,
-0.0264434814453125,
0.03887939453125,
-0.01092529296875,
0.0226898193359375,
0.026702880859375,
-0.023284912109375,
0.0279541015625,
0.039093017578125,
-0.01100921630859375,
-0.048583984375,
0.01503753662109375,
-0.09124755859375,
-0.03155517578125,
0.09747314453125,
-0.02142333984375,
-0.041961669921875,
0.0304718017578125,
-0.0295562744140625,
0.0296783447265625,
-0.00457763671875,
0.034393310546875,
0.0228424072265625,
-0.00420379638671875,
-0.0672607421875,
-0.0219268798828125,
0.0302276611328125,
0.02459716796875,
-0.06207275390625,
-0.01253509521484375,
0.038726806640625,
0.0372314453125,
0.0173492431640625,
0.04266357421875,
-0.0294036865234375,
0.042724609375,
0.002872467041015625,
0.038360595703125,
-0.037994384765625,
-0.0191497802734375,
-0.0211639404296875,
-0.0004191398620605469,
-0.0130615234375,
-0.016357421875
]
] |
garage-bAInd/Platypus2-7B | 2023-08-22T18:32:58.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:garage-bAInd/Open-Platypus",
"arxiv:2308.07317",
"arxiv:2307.09288",
"license:cc-by-nc-sa-4.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | garage-bAInd | null | null | garage-bAInd/Platypus2-7B | 4 | 12,801 | transformers | 2023-08-22T03:48:58 | ---
license: cc-by-nc-sa-4.0
language:
- en
datasets:
- garage-bAInd/Open-Platypus
---
# Platypus2-7B
**NOTE**: There is an issue with LLaMA-2 7B where fine-tuning only works if you use `fp16=False` and `bf16=True` in the HF trainer (see the sketch below). We are still gathering more information on this, so if you have any thoughts about this issue or its performance implications, please let us know!
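For illustration, a hedged sketch of the relevant HF `TrainingArguments` flags (only `fp16`/`bf16` relate to the note above; every other value is a placeholder, not a hyperparameter used for Platypus2-7B):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="platypus2-7b-finetune",   # placeholder
    per_device_train_batch_size=4,        # placeholder
    gradient_accumulation_steps=4,        # placeholder
    learning_rate=2e-5,                   # placeholder
    num_train_epochs=1,                   # placeholder
    fp16=False,  # fp16 triggers the LLaMA-2 7B fine-tuning issue described above
    bf16=True,   # bf16 is what the note above reports as working
)
```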
Platypus2-7B is an instruction fine-tuned model based on the LLaMA2-7B transformer architecture.

### Benchmark Metrics
| Metric | Value |
|-----------------------|-------|
| MMLU (5-shot) | - |
| ARC (25-shot) | - |
| HellaSwag (10-shot) | - |
| TruthfulQA (0-shot) | - |
| Avg. | - |
We use state-of-the-art [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) to run the benchmark tests above, using the same version as the HuggingFace LLM Leaderboard. Please see below for detailed instructions on reproducing benchmark results.
### Model Details
* **Trained by**: Cole Hunter & Ariel Lee
* **Model type:** **Platypus2-7B** is an auto-regressive language model based on the LLaMA2 transformer architecture.
* **Language(s)**: English
* **License for base weights**: Non-Commercial Creative Commons license ([CC BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/))
### Prompt Template
```
### Instruction:
<prompt> (without the <>)
### Response:
```
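As an illustration, a hedged sketch that formats this template and runs generation (assumes `accelerate` is installed for `device_map="auto"`; the exact whitespace around the section headers, the example instruction, and the generation settings are assumptions, not official recommendations):
```python
# Sketch: format the Instruction/Response template and generate a completion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "garage-bAInd/Platypus2-7B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map="auto")

instruction = "Explain the difference between a stack and a queue."  # placeholder prompt
prompt = f"### Instruction:\n\n{instruction}\n\n### Response:\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```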
### Training Dataset
`garage-bAInd/Platypus2-7B` was trained using the STEM- and logic-based dataset [`garage-bAInd/Open-Platypus`](https://huggingface.co/datasets/garage-bAInd/Open-Platypus).
Please see our [paper](https://arxiv.org/abs/2308.07317) and [project webpage](https://platypus-llm.github.io) for additional information.
### Training Procedure
`garage-bAInd/Platypus2-7B` was instruction fine-tuned using LoRA on 1 A100 80GB. For training details and inference instructions please see the [Platypus2](https://github.com/arielnlee/Platypus) GitHub repo.
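As a rough illustration of what a LoRA setup with the `peft` library can look like (the rank, alpha, dropout, and target modules below are generic placeholders, not the configuration used for Platypus2-7B; see the GitHub repo above for the actual training details):
```python
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf", torch_dtype=torch.bfloat16
)
lora_config = LoraConfig(
    r=16,                                  # placeholder rank
    lora_alpha=32,                         # placeholder scaling factor
    lora_dropout=0.05,                     # placeholder dropout
    target_modules=["q_proj", "v_proj"],   # placeholder target modules
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()         # prints the fraction of weights being trained
```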
### Reproducing Evaluation Results
Install LM Evaluation Harness:
```
# clone repository
git clone https://github.com/EleutherAI/lm-evaluation-harness.git
# check out the correct commit
git checkout b281b0921b636bc36ad05c0b0b0763bd6dd43463
# change to repo directory
cd lm-evaluation-harness
# install
pip install -e .
```
Each task was evaluated on 1 A100 80GB GPU.
ARC:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Platypus2-7B,use_accelerate=True,dtype="bfloat16" --tasks arc_challenge --batch_size 2 --no_cache --write_out --output_path results/Platypus2-7B/arc_challenge_25shot.json --device cuda --num_fewshot 25
```
HellaSwag:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Platypus2-7B,use_accelerate=True,dtype="bfloat16" --tasks hellaswag --batch_size 2 --no_cache --write_out --output_path results/Platypus2-7B/hellaswag_10shot.json --device cuda --num_fewshot 10
```
MMLU:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Platypus2-7B,use_accelerate=True,dtype="bfloat16" --tasks hendrycksTest-* --batch_size 2 --no_cache --write_out --output_path results/Platypus2-7B/mmlu_5shot.json --device cuda --num_fewshot 5
```
TruthfulQA:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Platypus2-7B,use_accelerate=True,dtype="bfloat16" --tasks truthfulqa_mc --batch_size 2 --no_cache --write_out --output_path results/Platypus2-7B/truthfulqa_0shot.json --device cuda
```
### Limitations and bias
Llama 2 and fine-tuned variants are a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2 and any fine-tuned variant's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2 variants, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at https://ai.meta.com/llama/responsible-use-guide/.
### Citations
```bibtex
@article{platypus2023,
title={Platypus: Quick, Cheap, and Powerful Refinement of LLMs},
author={Ariel N. Lee and Cole J. Hunter and Nataniel Ruiz},
booktitle={arXiv preprint arxiv:2308.07317},
year={2023}
}
```
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and others},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv},
}
```
```bibtex
@inproceedings{
hu2022lora,
title={Lo{RA}: Low-Rank Adaptation of Large Language Models},
author={Edward J Hu and Yelong Shen and Phillip Wallis and Zeyuan Allen-Zhu and Yuanzhi Li and Shean Wang and Lu Wang and Weizhu Chen},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=nZeVKeeFYf9}
}
``` | 5,231 | [
[
-0.02264404296875,
-0.0628662109375,
0.023681640625,
0.0303497314453125,
-0.0272064208984375,
-0.0026874542236328125,
-0.0297088623046875,
-0.037506103515625,
0.001728057861328125,
0.0223388671875,
-0.0389404296875,
-0.0290069580078125,
-0.051055908203125,
-0.0029544830322265625,
-0.00540924072265625,
0.07568359375,
-0.0306396484375,
-0.01508331298828125,
-0.00701141357421875,
-0.0159454345703125,
-0.050323486328125,
-0.033966064453125,
-0.032257080078125,
-0.032867431640625,
0.02069091796875,
0.0272979736328125,
0.042633056640625,
0.048309326171875,
0.05078125,
0.022247314453125,
-0.0162506103515625,
0.0204010009765625,
-0.04638671875,
-0.01071929931640625,
0.01535797119140625,
-0.041748046875,
-0.039459228515625,
0.00647735595703125,
0.036376953125,
0.02386474609375,
-0.015625,
0.03265380859375,
0.01056671142578125,
0.0243377685546875,
-0.046661376953125,
0.0302581787109375,
-0.043701171875,
-0.014434814453125,
-0.0245513916015625,
-0.015960693359375,
-0.0232696533203125,
-0.01548004150390625,
-0.01318359375,
-0.05792236328125,
0.0007576942443847656,
0.00785064697265625,
0.08477783203125,
0.0465087890625,
-0.01654052734375,
-0.0113372802734375,
-0.0266265869140625,
0.0673828125,
-0.064697265625,
0.011566162109375,
0.028411865234375,
0.00904083251953125,
-0.034027099609375,
-0.048309326171875,
-0.045623779296875,
-0.025299072265625,
-0.002849578857421875,
0.0075836181640625,
-0.0171356201171875,
-0.00019371509552001953,
0.02947998046875,
0.030853271484375,
-0.027740478515625,
0.03692626953125,
-0.035430908203125,
-0.01495361328125,
0.0567626953125,
0.01195526123046875,
0.0018463134765625,
-0.00992584228515625,
-0.036895751953125,
-0.031829833984375,
-0.056915283203125,
0.0290679931640625,
0.032562255859375,
0.01084136962890625,
-0.034332275390625,
0.05029296875,
-0.0105133056640625,
0.031280517578125,
0.00940704345703125,
-0.043182373046875,
0.043731689453125,
-0.0171966552734375,
-0.0215301513671875,
-0.0018367767333984375,
0.0677490234375,
0.033233642578125,
0.0021648406982421875,
0.0049896240234375,
-0.0165863037109375,
0.0243682861328125,
-0.0100250244140625,
-0.0625,
-0.01190185546875,
0.02142333984375,
-0.022705078125,
-0.0157012939453125,
-0.0157318115234375,
-0.0430908203125,
-0.02764892578125,
-0.008544921875,
0.0335693359375,
-0.0333251953125,
-0.0292816162109375,
0.0181884765625,
-0.001155853271484375,
0.037322998046875,
0.0173797607421875,
-0.054656982421875,
0.0283355712890625,
0.044647216796875,
0.06268310546875,
-0.0273590087890625,
-0.04827880859375,
-0.03143310546875,
-0.0025920867919921875,
-0.017242431640625,
0.060272216796875,
-0.012359619140625,
-0.0152130126953125,
-0.0208740234375,
0.01413726806640625,
-0.0158233642578125,
-0.050384521484375,
0.036102294921875,
-0.0221405029296875,
0.006591796875,
-0.0181732177734375,
-0.032745361328125,
-0.02532958984375,
-0.006622314453125,
-0.03179931640625,
0.100341796875,
0.0124053955078125,
-0.058258056640625,
0.006732940673828125,
-0.048919677734375,
-0.032073974609375,
-0.015350341796875,
0.0115203857421875,
-0.045562744140625,
-0.0026988983154296875,
0.01010894775390625,
0.03125,
-0.0360107421875,
0.02130126953125,
-0.0194244384765625,
-0.03411865234375,
0.01506805419921875,
-0.0087127685546875,
0.07037353515625,
0.01430511474609375,
-0.049224853515625,
0.00838470458984375,
-0.048675537109375,
-0.01422882080078125,
0.0357666015625,
-0.0276336669921875,
-0.006885528564453125,
-0.00494384765625,
-0.012176513671875,
0.00823974609375,
0.0308837890625,
-0.034332275390625,
0.00710296630859375,
-0.0296173095703125,
0.04327392578125,
0.05584716796875,
-0.00399017333984375,
0.0168609619140625,
-0.034393310546875,
0.02874755859375,
0.006404876708984375,
0.0225372314453125,
0.0023250579833984375,
-0.05389404296875,
-0.080078125,
-0.0186920166015625,
0.0013685226440429688,
0.058380126953125,
-0.03399658203125,
0.040740966796875,
0.003604888916015625,
-0.044281005859375,
-0.04266357421875,
0.0285797119140625,
0.042938232421875,
0.041168212890625,
0.04095458984375,
-0.02911376953125,
-0.043365478515625,
-0.06304931640625,
-0.00945281982421875,
-0.02484130859375,
0.0150909423828125,
0.0224456787109375,
0.051513671875,
-0.0241546630859375,
0.045196533203125,
-0.035064697265625,
-0.020477294921875,
-0.0185394287109375,
-0.002780914306640625,
0.0260162353515625,
0.0487060546875,
0.036529541015625,
-0.015228271484375,
-0.01456451416015625,
-0.0154571533203125,
-0.059051513671875,
-0.0196533203125,
-0.0025234222412109375,
-0.0199432373046875,
0.03466796875,
0.013153076171875,
-0.065185546875,
0.027313232421875,
0.036468505859375,
-0.0144805908203125,
0.036834716796875,
-0.0126495361328125,
-0.01294708251953125,
-0.0595703125,
0.007602691650390625,
0.0024890899658203125,
0.0006556510925292969,
-0.033599853515625,
0.01166534423828125,
-0.002025604248046875,
0.01180267333984375,
-0.046875,
0.045074462890625,
-0.036376953125,
-0.0115203857421875,
-0.01535797119140625,
0.0081634521484375,
-0.0108642578125,
0.055572509765625,
-0.00390625,
0.06414794921875,
0.037200927734375,
-0.045562744140625,
0.01377105712890625,
0.0288543701171875,
-0.0272216796875,
0.0200347900390625,
-0.06689453125,
0.0167999267578125,
0.009307861328125,
0.02093505859375,
-0.076171875,
-0.0171966552734375,
0.0247802734375,
-0.022735595703125,
0.02471923828125,
0.0076751708984375,
-0.05206298828125,
-0.034698486328125,
-0.035552978515625,
0.0188140869140625,
0.06536865234375,
-0.0482177734375,
0.0213165283203125,
0.03076171875,
0.0091400146484375,
-0.04547119140625,
-0.05621337890625,
-0.01837158203125,
-0.0290374755859375,
-0.053680419921875,
0.01464080810546875,
-0.012664794921875,
-0.014373779296875,
-0.0251007080078125,
-0.013641357421875,
0.0102386474609375,
0.0193328857421875,
0.03985595703125,
0.0268707275390625,
-0.012359619140625,
-0.00931549072265625,
0.005401611328125,
-0.0165557861328125,
0.0014190673828125,
0.0078582763671875,
0.049957275390625,
-0.0276336669921875,
-0.012451171875,
-0.059326171875,
-0.0013208389282226562,
0.03302001953125,
-0.023040771484375,
0.051483154296875,
0.053131103515625,
-0.01558685302734375,
0.01120758056640625,
-0.06024169921875,
-0.0172119140625,
-0.038970947265625,
0.028289794921875,
-0.0172576904296875,
-0.05950927734375,
0.048492431640625,
-0.0005373954772949219,
0.01508331298828125,
0.054290771484375,
0.061981201171875,
-0.00038814544677734375,
0.060333251953125,
0.04193115234375,
0.00868988037109375,
0.033294677734375,
-0.050506591796875,
0.005741119384765625,
-0.07952880859375,
-0.027191162109375,
-0.0293121337890625,
-0.028472900390625,
-0.04736328125,
-0.037261962890625,
0.015960693359375,
0.024017333984375,
-0.045379638671875,
0.037628173828125,
-0.037872314453125,
0.019012451171875,
0.040008544921875,
0.0095977783203125,
0.01404571533203125,
0.00682830810546875,
-0.0063934326171875,
-0.00028133392333984375,
-0.047271728515625,
-0.042022705078125,
0.08502197265625,
0.042388916015625,
0.06658935546875,
-0.004547119140625,
0.049957275390625,
-0.010528564453125,
0.0247344970703125,
-0.048095703125,
0.04571533203125,
-0.005584716796875,
-0.03448486328125,
-0.0024394989013671875,
-0.01517486572265625,
-0.0694580078125,
0.020233154296875,
-0.00308990478515625,
-0.058624267578125,
0.0170135498046875,
0.00751495361328125,
-0.0318603515625,
0.0240020751953125,
-0.0692138671875,
0.05792236328125,
-0.0340576171875,
-0.034149169921875,
-0.01806640625,
-0.054840087890625,
0.050750732421875,
-0.0016765594482421875,
0.0021495819091796875,
-0.023773193359375,
-0.00891876220703125,
0.07989501953125,
-0.043426513671875,
0.0723876953125,
-0.0213470458984375,
-0.0013027191162109375,
0.03656005859375,
-0.005405426025390625,
0.041900634765625,
0.005489349365234375,
-0.0006232261657714844,
0.033782958984375,
-0.001384735107421875,
-0.022857666015625,
-0.0135498046875,
0.060333251953125,
-0.09893798828125,
-0.050811767578125,
-0.036529541015625,
-0.055511474609375,
0.0015211105346679688,
0.007598876953125,
0.01465606689453125,
-0.0008654594421386719,
0.0243682861328125,
0.0063629150390625,
0.04498291015625,
-0.032745361328125,
0.045928955078125,
0.0389404296875,
-0.00041365623474121094,
-0.02886962890625,
0.058013916015625,
0.0023097991943359375,
0.0193634033203125,
0.00803375244140625,
0.00981903076171875,
-0.020843505859375,
-0.03466796875,
-0.02032470703125,
0.048309326171875,
-0.0435791015625,
-0.0384521484375,
-0.036834716796875,
-0.0178985595703125,
-0.0176849365234375,
0.004444122314453125,
-0.039520263671875,
-0.032318115234375,
-0.05126953125,
-0.002960205078125,
0.04931640625,
0.0426025390625,
-0.00689697265625,
0.0545654296875,
-0.01454925537109375,
0.0272979736328125,
0.015655517578125,
0.0254974365234375,
-0.0015611648559570312,
-0.05987548828125,
0.0014352798461914062,
0.004940032958984375,
-0.048309326171875,
-0.052734375,
0.0291595458984375,
0.01371002197265625,
0.05291748046875,
0.012420654296875,
-0.000698089599609375,
0.06903076171875,
-0.016204833984375,
0.0606689453125,
0.01788330078125,
-0.06103515625,
0.0504150390625,
-0.00519561767578125,
0.005817413330078125,
0.03216552734375,
0.0199737548828125,
-0.00943756103515625,
-0.027099609375,
-0.05328369140625,
-0.05975341796875,
0.0653076171875,
0.02264404296875,
-0.0084991455078125,
0.0182037353515625,
0.03826904296875,
0.01276397705078125,
0.00969696044921875,
-0.057159423828125,
-0.025726318359375,
-0.026824951171875,
-0.0016603469848632812,
-0.01190185546875,
-0.018951416015625,
-0.0118560791015625,
-0.0328369140625,
0.054779052734375,
-0.00415802001953125,
0.036712646484375,
0.01666259765625,
-0.0291290283203125,
-0.023162841796875,
0.0016241073608398438,
0.050506591796875,
0.04571533203125,
-0.0310516357421875,
-0.00333404541015625,
0.0250244140625,
-0.04705810546875,
0.0133209228515625,
0.0177764892578125,
-0.0045928955078125,
-0.0152587890625,
0.0290985107421875,
0.08782958984375,
0.00785064697265625,
-0.045440673828125,
0.032501220703125,
-0.00341033935546875,
-0.014739990234375,
-0.016998291015625,
0.0199432373046875,
0.0095977783203125,
0.0275421142578125,
0.02203369140625,
-0.0007715225219726562,
-0.0213775634765625,
-0.028656005859375,
-0.012054443359375,
0.0276641845703125,
0.009307861328125,
-0.0322265625,
0.06671142578125,
0.005130767822265625,
-0.022308349609375,
0.0426025390625,
-0.01340484619140625,
-0.0252838134765625,
0.057159423828125,
0.05419921875,
0.046844482421875,
-0.0166168212890625,
-0.0030975341796875,
0.03277587890625,
0.04010009765625,
-0.01800537109375,
0.034637451171875,
0.0105743408203125,
-0.037353515625,
-0.0283966064453125,
-0.053680419921875,
-0.0198516845703125,
0.0285797119140625,
-0.034088134765625,
0.02691650390625,
-0.04901123046875,
-0.0206451416015625,
-0.0112457275390625,
0.034332275390625,
-0.051055908203125,
-0.006137847900390625,
0.00821685791015625,
0.0760498046875,
-0.0692138671875,
0.0589599609375,
0.048919677734375,
-0.03839111328125,
-0.07305908203125,
-0.0296630859375,
-0.0104217529296875,
-0.0853271484375,
0.039306640625,
0.019500732421875,
0.002994537353515625,
-0.004772186279296875,
-0.0562744140625,
-0.07977294921875,
0.11431884765625,
0.054718017578125,
-0.048858642578125,
0.016143798828125,
0.01013946533203125,
0.040985107421875,
-0.01678466796875,
0.025543212890625,
0.06329345703125,
0.0406494140625,
0.0035800933837890625,
-0.08984375,
0.0180206298828125,
-0.01776123046875,
0.0089263916015625,
-0.005878448486328125,
-0.08392333984375,
0.0823974609375,
-0.031585693359375,
-0.00850677490234375,
0.0218963623046875,
0.047088623046875,
0.057861328125,
0.01824951171875,
0.029052734375,
0.06512451171875,
0.06439208984375,
-0.005832672119140625,
0.0899658203125,
-0.0289764404296875,
0.036102294921875,
0.0703125,
-0.0117340087890625,
0.0714111328125,
0.043365478515625,
-0.03173828125,
0.04632568359375,
0.068115234375,
-0.00562286376953125,
0.040985107421875,
0.00984954833984375,
0.013702392578125,
-0.0080718994140625,
-0.002716064453125,
-0.040374755859375,
0.0284576416015625,
0.0252838134765625,
-0.0090484619140625,
-0.007160186767578125,
-0.01090240478515625,
0.01544189453125,
-0.03009033203125,
-0.01258087158203125,
0.040252685546875,
0.0208892822265625,
-0.053680419921875,
0.09002685546875,
0.007740020751953125,
0.06756591796875,
-0.03948974609375,
0.0139617919921875,
-0.036102294921875,
0.0202178955078125,
-0.028564453125,
-0.048858642578125,
0.0014133453369140625,
-0.0009298324584960938,
0.00705718994140625,
-0.00028705596923828125,
0.04815673828125,
-0.006256103515625,
-0.0252685546875,
0.0311737060546875,
0.0224609375,
0.02410888671875,
0.01125335693359375,
-0.051971435546875,
0.0251922607421875,
-0.0070343017578125,
-0.032073974609375,
0.0245513916015625,
0.004608154296875,
-0.0166015625,
0.04803466796875,
0.051513671875,
-0.0041656494140625,
0.022674560546875,
-0.0120086669921875,
0.0758056640625,
-0.02978515625,
-0.027496337890625,
-0.05621337890625,
0.033905029296875,
0.01387786865234375,
-0.0477294921875,
0.05462646484375,
0.040679931640625,
0.05572509765625,
0.010009765625,
0.0400390625,
-0.00782012939453125,
0.0218963623046875,
-0.03265380859375,
0.03717041015625,
-0.03802490234375,
0.02947998046875,
-0.006351470947265625,
-0.07342529296875,
-0.0110015869140625,
0.057037353515625,
-0.030914306640625,
-0.006256103515625,
0.0623779296875,
0.06988525390625,
-0.0109405517578125,
-0.0171661376953125,
-0.00978851318359375,
0.041107177734375,
0.01861572265625,
0.06884765625,
0.06365966796875,
-0.054290771484375,
0.040924072265625,
-0.042327880859375,
-0.02484130859375,
-0.0192413330078125,
-0.05462646484375,
-0.076171875,
-0.0311279296875,
-0.034576416015625,
-0.030181884765625,
0.003192901611328125,
0.055999755859375,
0.040130615234375,
-0.06268310546875,
-0.04022216796875,
-0.0024166107177734375,
0.00926971435546875,
-0.01348114013671875,
-0.01343536376953125,
0.0357666015625,
-0.0206451416015625,
-0.03204345703125,
0.0174407958984375,
0.006893157958984375,
0.01302337646484375,
-0.0257568359375,
-0.0264739990234375,
-0.0204620361328125,
-0.01092529296875,
0.033294677734375,
0.0300445556640625,
-0.06683349609375,
-0.00859832763671875,
0.0005984306335449219,
-0.006748199462890625,
0.0209503173828125,
0.033599853515625,
-0.061065673828125,
0.00154876708984375,
0.0245513916015625,
0.033111572265625,
0.055389404296875,
-0.01409149169921875,
0.00791168212890625,
-0.038421630859375,
0.039154052734375,
-0.0058135986328125,
0.031280517578125,
0.03448486328125,
-0.0240478515625,
0.041473388671875,
0.03131103515625,
-0.04449462890625,
-0.0760498046875,
-0.010284423828125,
-0.0887451171875,
-0.007007598876953125,
0.11126708984375,
-0.0115814208984375,
-0.0396728515625,
0.017913818359375,
-0.01995849609375,
0.034271240234375,
-0.034942626953125,
0.054290771484375,
0.024383544921875,
-0.0189971923828125,
-0.01363372802734375,
-0.057647705078125,
0.02410888671875,
0.0294342041015625,
-0.066650390625,
-0.011474609375,
0.0178070068359375,
0.039276123046875,
0.01253509521484375,
0.0377197265625,
0.004489898681640625,
0.02032470703125,
-0.015350341796875,
0.005237579345703125,
-0.01568603515625,
-0.004398345947265625,
-0.030914306640625,
-0.0170135498046875,
0.006755828857421875,
-0.0156097412109375
]
] |
diffusers/controlnet-canny-sdxl-1.0 | 2023-09-19T15:25:43.000Z | [
"diffusers",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"text-to-image",
"license:openrail++",
"has_space",
"diffusers:ControlNetModel",
"region:us"
] | text-to-image | diffusers | null | null | diffusers/controlnet-canny-sdxl-1.0 | 392 | 12,789 | diffusers | 2023-08-01T04:34:46 | ---
license: openrail++
base_model: stabilityai/stable-diffusion-xl-base-1.0
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- text-to-image
- diffusers
inference: false
---
# SDXL-controlnet: Canny
These are ControlNet weights trained on [stabilityai/stable-diffusion-xl-base-1.0](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) with Canny edge conditioning. You can find some example images below.
prompt: a couple watching a romantic sunset, 4k photo

prompt: ultrarealistic shot of a furry blue bird

prompt: a woman, close up, detailed, beautiful, street photography, photorealistic, detailed, Kodak ektar 100, natural, candid shot

prompt: Cinematic, neoclassical table in the living room, cinematic, contour, lighting, highly detailed, winter, golden hour

prompt: a tornado hitting grass field, 1980's film grain. overcast, muted colors.

## Usage
Make sure to first install the libraries:
```bash
pip install accelerate transformers safetensors opencv-python diffusers
```
And then we're ready to go:
```python
from diffusers import ControlNetModel, StableDiffusionXLControlNetPipeline, AutoencoderKL
from diffusers.utils import load_image
from PIL import Image
import torch
import numpy as np
import cv2
prompt = "aerial view, a futuristic research complex in a bright foggy jungle, hard lighting"
negative_prompt = 'low quality, bad quality, sketches'
image = load_image("https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd_controlnet/hf-logo.png")
controlnet_conditioning_scale = 0.5 # recommended for good generalization
controlnet = ControlNetModel.from_pretrained(
"diffusers/controlnet-canny-sdxl-1.0",
torch_dtype=torch.float16
)
vae = AutoencoderKL.from_pretrained("madebyollin/sdxl-vae-fp16-fix", torch_dtype=torch.float16)
pipe = StableDiffusionXLControlNetPipeline.from_pretrained(
"stabilityai/stable-diffusion-xl-base-1.0",
controlnet=controlnet,
vae=vae,
torch_dtype=torch.float16,
)
pipe.enable_model_cpu_offload()
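# Preprocess the conditioning image: extract Canny edges, then stack the
# single-channel edge map into a 3-channel image for the pipeline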
image = np.array(image)
image = cv2.Canny(image, 100, 200)
image = image[:, :, None]
image = np.concatenate([image, image, image], axis=2)
image = Image.fromarray(image)
images = pipe(
prompt, negative_prompt=negative_prompt, image=image, controlnet_conditioning_scale=controlnet_conditioning_scale,
).images
images[0].save("hug_lab.png")
```

For more details, check out the official documentation of [`StableDiffusionXLControlNetPipeline`](https://huggingface.co/docs/diffusers/main/en/api/pipelines/controlnet_sdxl).
### Training
Our training script was built on top of the official training script that we provide [here](https://github.com/huggingface/diffusers/blob/main/examples/controlnet/README_sdxl.md).
#### Training data
This checkpoint was first trained for 20,000 steps on laion 6a, resized so that the smaller image dimension was at most 384 pixels.
It was then trained for a further 20,000 steps on laion 6a, resized so that the smaller dimension was at most 1024 pixels and
filtered to keep only images with a minimum dimension of at least 1024. We found that this further high-resolution fine-tuning was
necessary for image quality.
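As a rough sketch of that two-stage preprocessing (hypothetical helper code, not the actual training script; only the 384/1024 thresholds come from the description above):
```python
from PIL import Image

def resize_to_max_min_dim(img: Image.Image, max_min_dim: int) -> Image.Image:
    """Downscale so the smaller side is at most `max_min_dim`, preserving aspect ratio."""
    w, h = img.size
    scale = max_min_dim / min(w, h)
    if scale >= 1.0:
        return img  # smaller side is already within the limit
    return img.resize((round(w * scale), round(h * scale)))

def keep_for_high_res_stage(img: Image.Image, min_dim: int = 1024) -> bool:
    """Second-stage filter: keep only images whose smaller side is at least `min_dim`."""
    return min(img.size) >= min_dim
```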
#### Compute
one 8xA100 machine
#### Batch size
Data parallel with a single-GPU batch size of 8, for a total batch size of 64.
#### Hyper Parameters
Constant learning rate of 1e-4, scaled by the total batch size of 64, giving an effective learning rate of 64e-4 (6.4e-3).
#### Mixed precision
fp16 | 3,580 | [
[
-0.038238525390625,
-0.0245513916015625,
0.01409912109375,
0.0280914306640625,
-0.0208282470703125,
-0.018218994140625,
0.0028057098388671875,
-0.0111541748046875,
0.028778076171875,
0.039764404296875,
-0.044891357421875,
-0.026336669921875,
-0.05023193359375,
-0.021026611328125,
-0.02008056640625,
0.0714111328125,
-0.036041259765625,
0.006011962890625,
0.0175018310546875,
-0.012451171875,
0.0032863616943359375,
0.00315093994140625,
-0.0863037109375,
-0.026031494140625,
0.035400390625,
-0.009918212890625,
0.045166015625,
0.048187255859375,
0.020477294921875,
0.0268402099609375,
-0.023345947265625,
0.00029206275939941406,
-0.0386962890625,
-0.0201416015625,
0.0206451416015625,
-0.0182342529296875,
-0.0200958251953125,
-0.0090179443359375,
0.050506591796875,
0.02886962890625,
-0.0181732177734375,
-0.00189971923828125,
0.0077972412109375,
0.048095703125,
-0.050506591796875,
0.005207061767578125,
-0.0191192626953125,
0.0290374755859375,
-0.0006570816040039062,
-0.0005974769592285156,
-0.005214691162109375,
0.00302886962890625,
-0.01001739501953125,
-0.0692138671875,
0.01629638671875,
-0.00605010986328125,
0.0986328125,
0.033935546875,
-0.035003662109375,
-0.00238800048828125,
-0.034271240234375,
0.05914306640625,
-0.044891357421875,
0.016998291015625,
0.0208587646484375,
0.0156402587890625,
0.0023860931396484375,
-0.06463623046875,
-0.020782470703125,
-0.00983428955078125,
0.0013942718505859375,
0.047149658203125,
-0.01038360595703125,
0.01763916015625,
0.041290283203125,
0.00489044189453125,
-0.038116455078125,
0.0084228515625,
-0.0340576171875,
-0.023895263671875,
0.05322265625,
0.0174102783203125,
0.0026798248291015625,
-0.0034122467041015625,
-0.054656982421875,
-0.01462554931640625,
-0.009185791015625,
0.020172119140625,
0.01052093505859375,
-0.01306915283203125,
-0.047607421875,
0.018463134765625,
0.005344390869140625,
0.04766845703125,
0.03131103515625,
-0.0175933837890625,
0.033447265625,
-0.027557373046875,
-0.034393310546875,
-0.00836944580078125,
0.059783935546875,
0.037841796875,
0.0013103485107421875,
0.01184844970703125,
-0.0240631103515625,
-0.0010957717895507812,
0.0147857666015625,
-0.06732177734375,
-0.0286865234375,
0.0285186767578125,
-0.04595947265625,
-0.043182373046875,
0.005840301513671875,
-0.0310516357421875,
-0.0113067626953125,
-0.0308685302734375,
0.02899169921875,
-0.022735595703125,
-0.048583984375,
0.00653076171875,
-0.038116455078125,
0.027496337890625,
0.046539306640625,
-0.03387451171875,
0.0306243896484375,
0.01175689697265625,
0.07623291015625,
-0.0208282470703125,
-0.00742340087890625,
-0.0252532958984375,
-0.00281524658203125,
-0.040130615234375,
0.0296783447265625,
0.003543853759765625,
-0.020751953125,
-0.0255279541015625,
0.0221405029296875,
-0.0061492919921875,
-0.047393798828125,
0.0269317626953125,
-0.04425048828125,
-0.0112762451171875,
-0.001636505126953125,
-0.02093505859375,
-0.00616455078125,
0.00811004638671875,
-0.036529541015625,
0.0697021484375,
0.0193023681640625,
-0.0823974609375,
0.0147705078125,
-0.043914794921875,
0.00284576416015625,
-0.01236724853515625,
-0.01412200927734375,
-0.04681396484375,
-0.0059051513671875,
-0.0169830322265625,
0.03369140625,
0.01158905029296875,
0.00012993812561035156,
-0.021087646484375,
-0.012847900390625,
0.006832122802734375,
-0.02630615234375,
0.09783935546875,
0.0231475830078125,
-0.04107666015625,
0.0240478515625,
-0.076904296875,
0.01605224609375,
-0.00646209716796875,
-0.022247314453125,
-0.00548553466796875,
-0.048614501953125,
0.0312347412109375,
0.0293121337890625,
-0.003932952880859375,
-0.046722412109375,
-0.0007367134094238281,
-0.018585205078125,
0.026824951171875,
0.05718994140625,
0.0125885009765625,
0.0307159423828125,
-0.01776123046875,
0.039276123046875,
0.0292816162109375,
0.0161590576171875,
0.016510009765625,
-0.032012939453125,
-0.0697021484375,
-0.0269012451171875,
-0.00536346435546875,
0.04510498046875,
-0.09344482421875,
0.017791748046875,
0.0210723876953125,
-0.03948974609375,
-0.0164794921875,
0.0114593505859375,
0.0302734375,
0.04412841796875,
0.017974853515625,
-0.02960205078125,
-0.031707763671875,
-0.05914306640625,
0.05352783203125,
0.0231781005859375,
-0.01268768310546875,
0.0131378173828125,
0.046722412109375,
-0.0127410888671875,
0.04425048828125,
-0.044464111328125,
-0.011627197265625,
0.0167236328125,
0.006877899169921875,
0.046539306640625,
0.052978515625,
0.0516357421875,
-0.06561279296875,
-0.038818359375,
-0.0015554428100585938,
-0.0643310546875,
0.0032596588134765625,
-0.009063720703125,
-0.0243682861328125,
0.0194549560546875,
0.041717529296875,
-0.045013427734375,
0.050506591796875,
0.049041748046875,
-0.031158447265625,
0.0731201171875,
-0.039398193359375,
-0.0017805099487304688,
-0.08056640625,
0.01094818115234375,
0.02581787109375,
-0.03533935546875,
-0.037628173828125,
0.00031566619873046875,
0.0157318115234375,
0.0016145706176757812,
-0.061553955078125,
0.04620361328125,
-0.035552978515625,
0.02325439453125,
-0.0258941650390625,
-0.019775390625,
0.002201080322265625,
0.045013427734375,
0.020355224609375,
0.03717041015625,
0.077392578125,
-0.05615234375,
0.050567626953125,
0.002887725830078125,
-0.0157470703125,
0.041290283203125,
-0.06573486328125,
-0.001773834228515625,
-0.021392822265625,
0.020904541015625,
-0.07342529296875,
-0.0122528076171875,
0.03643798828125,
-0.0179901123046875,
0.035247802734375,
-0.00292205810546875,
-0.00455474853515625,
-0.0309906005859375,
-0.031494140625,
0.0232086181640625,
0.034637451171875,
-0.03521728515625,
0.0306854248046875,
-0.00452423095703125,
0.032012939453125,
-0.06524658203125,
-0.071044921875,
-0.01274871826171875,
-0.0134124755859375,
-0.049957275390625,
0.02960205078125,
-0.025543212890625,
-0.0246124267578125,
0.0012407302856445312,
0.0045318603515625,
-0.0252838134765625,
0.008392333984375,
0.0418701171875,
0.01934814453125,
-0.01152801513671875,
-0.0251312255859375,
0.01267242431640625,
-0.021392822265625,
0.00521087646484375,
-0.0404052734375,
0.021453857421875,
0.0008478164672851562,
-0.0025177001953125,
-0.0615234375,
0.017333984375,
0.0218963623046875,
0.031585693359375,
0.06719970703125,
0.08013916015625,
-0.030364990234375,
-0.0166015625,
-0.0222320556640625,
-0.0199432373046875,
-0.0377197265625,
0.0211944580078125,
-0.023345947265625,
-0.04632568359375,
0.0548095703125,
0.01004791259765625,
0.0033740997314453125,
0.029571533203125,
0.031402587890625,
-0.0227203369140625,
0.06256103515625,
0.03228759765625,
0.01148223876953125,
0.037811279296875,
-0.07623291015625,
-0.0362548828125,
-0.0611572265625,
-0.00037288665771484375,
-0.016937255859375,
-0.0291748046875,
-0.020965576171875,
-0.033447265625,
0.037017822265625,
0.0255279541015625,
-0.05743408203125,
0.0231170654296875,
-0.0297698974609375,
0.02374267578125,
0.0231781005859375,
0.0238494873046875,
-0.01393890380859375,
-0.0020236968994140625,
-0.0274810791015625,
0.0025463104248046875,
-0.047027587890625,
-0.00457000732421875,
0.051788330078125,
0.034820556640625,
0.07049560546875,
-0.01132965087890625,
0.048431396484375,
0.019287109375,
0.008026123046875,
-0.033843994140625,
0.0268096923828125,
-0.01806640625,
-0.026214599609375,
-0.0007734298706054688,
-0.04034423828125,
-0.072265625,
-0.00913238525390625,
-0.009429931640625,
-0.042572021484375,
0.0263824462890625,
0.0243072509765625,
-0.0229644775390625,
0.035552978515625,
-0.05084228515625,
0.052459716796875,
-0.0283203125,
-0.049835205078125,
0.0074005126953125,
-0.06182861328125,
0.02874755859375,
0.00946044921875,
-0.00861358642578125,
0.020904541015625,
-0.023406982421875,
0.052093505859375,
-0.0535888671875,
0.06573486328125,
-0.03546142578125,
-0.01447296142578125,
0.0231170654296875,
-0.007015228271484375,
0.024658203125,
0.00927734375,
-0.0181121826171875,
0.02728271484375,
0.0185089111328125,
-0.040740966796875,
-0.0340576171875,
0.051788330078125,
-0.0838623046875,
-0.007373809814453125,
-0.0226287841796875,
-0.03472900390625,
0.035736083984375,
0.01372528076171875,
0.049407958984375,
0.040496826171875,
0.025177001953125,
0.0027065277099609375,
0.06109619140625,
-0.015289306640625,
0.035064697265625,
0.0043487548828125,
-0.030181884765625,
-0.054229736328125,
0.05694580078125,
0.030548095703125,
0.04327392578125,
0.020233154296875,
0.0012540817260742188,
-0.0056610107421875,
-0.03387451171875,
-0.03662109375,
0.0260772705078125,
-0.054229736328125,
-0.036590576171875,
-0.0325927734375,
-0.049102783203125,
-0.0298004150390625,
-0.0200653076171875,
-0.02801513671875,
-0.022979736328125,
-0.051177978515625,
0.004940032958984375,
0.032440185546875,
0.047515869140625,
-0.018280029296875,
0.044097900390625,
-0.028289794921875,
0.018218994140625,
0.0111236572265625,
0.0257415771484375,
0.005588531494140625,
-0.04034423828125,
-0.0210418701171875,
0.00031566619873046875,
-0.0255279541015625,
-0.0439453125,
0.0399169921875,
0.01239013671875,
0.03448486328125,
0.057769775390625,
-0.005168914794921875,
0.037628173828125,
-0.01239776611328125,
0.056915283203125,
0.040283203125,
-0.0533447265625,
0.036376953125,
-0.0203857421875,
0.0190277099609375,
0.00815582275390625,
0.04888916015625,
-0.028350830078125,
-0.0016546249389648438,
-0.05712890625,
-0.052581787109375,
0.05877685546875,
0.0172271728515625,
-0.00750732421875,
0.0251007080078125,
0.0660400390625,
-0.00611114501953125,
0.0088043212890625,
-0.05511474609375,
-0.04693603515625,
-0.029571533203125,
0.0011224746704101562,
0.002368927001953125,
0.005359649658203125,
-0.01334381103515625,
-0.022186279296875,
0.0670166015625,
-0.018310546875,
0.0286407470703125,
0.0310821533203125,
0.0204620361328125,
-0.016204833984375,
-0.030059814453125,
0.04541015625,
0.03936767578125,
-0.025909423828125,
0.00499725341796875,
0.006679534912109375,
-0.041259765625,
0.0214691162109375,
-0.005321502685546875,
-0.0304412841796875,
-0.010498046875,
0.0193023681640625,
0.08599853515625,
-0.012359619140625,
-0.009613037109375,
0.042724609375,
-0.020538330078125,
-0.04681396484375,
-0.040130615234375,
0.0245361328125,
0.01220703125,
0.02587890625,
0.0056610107421875,
0.0496826171875,
0.0007071495056152344,
0.0030384063720703125,
0.01444244384765625,
0.033660888671875,
-0.042083740234375,
-0.0216217041015625,
0.0640869140625,
-0.0030918121337890625,
-0.0017080307006835938,
0.046051025390625,
-0.032012939453125,
-0.02508544921875,
0.064697265625,
0.038909912109375,
0.06292724609375,
0.0021686553955078125,
0.0259857177734375,
0.050689697265625,
0.0101776123046875,
-0.01004791259765625,
0.0233001708984375,
-0.01314544677734375,
-0.0699462890625,
-0.0296478271484375,
-0.043670654296875,
-0.01018524169921875,
0.010284423828125,
-0.043792724609375,
0.03875732421875,
-0.055572509765625,
-0.0228118896484375,
-0.0167999267578125,
0.01226043701171875,
-0.055694580078125,
0.0185699462890625,
0.02252197265625,
0.10797119140625,
-0.0780029296875,
0.07305908203125,
0.045623779296875,
-0.0303497314453125,
-0.077392578125,
-0.02557373046875,
0.002735137939453125,
-0.057708740234375,
0.063232421875,
0.0024814605712890625,
-0.0111846923828125,
0.00998687744140625,
-0.07025146484375,
-0.060089111328125,
0.107666015625,
0.0269775390625,
-0.03607177734375,
0.00984954833984375,
-0.0212860107421875,
0.03765869140625,
-0.0261688232421875,
0.042083740234375,
0.01068878173828125,
0.031768798828125,
0.02789306640625,
-0.06439208984375,
0.00003701448440551758,
-0.03057861328125,
0.024810791015625,
-0.0026111602783203125,
-0.066162109375,
0.07208251953125,
-0.0144805908203125,
0.0030040740966796875,
0.0196075439453125,
0.059600830078125,
0.0225067138671875,
0.0253753662109375,
0.05560302734375,
0.0732421875,
0.037750244140625,
-0.00557708740234375,
0.08331298828125,
-0.0029125213623046875,
0.034027099609375,
0.05364990234375,
0.00017154216766357422,
0.033782958984375,
0.033843994140625,
-0.0034122467041015625,
0.026702880859375,
0.059661865234375,
0.00237274169921875,
0.03076171875,
0.034393310546875,
-0.0202789306640625,
-0.00435638427734375,
-0.0027713775634765625,
-0.0242156982421875,
0.003814697265625,
0.0297088623046875,
-0.0247802734375,
-0.01378631591796875,
0.0474853515625,
0.0124969482421875,
-0.0218963623046875,
-0.039337158203125,
0.040924072265625,
0.001506805419921875,
-0.03363037109375,
0.07525634765625,
-0.007556915283203125,
0.08038330078125,
-0.07464599609375,
-0.0016183853149414062,
-0.004802703857421875,
0.026092529296875,
-0.034820556640625,
-0.0673828125,
0.019805908203125,
-0.021392822265625,
-0.00772857666015625,
-0.023773193359375,
0.046661376953125,
-0.026031494140625,
-0.042938232421875,
0.045867919921875,
0.004276275634765625,
0.04156494140625,
0.007511138916015625,
-0.0771484375,
0.0313720703125,
0.0169830322265625,
-0.025787353515625,
0.01349639892578125,
0.01861572265625,
0.01035308837890625,
0.036224365234375,
0.0145721435546875,
0.033203125,
0.0158538818359375,
0.0004265308380126953,
0.07550048828125,
-0.0272674560546875,
-0.00963592529296875,
-0.0243988037109375,
0.04595947265625,
-0.0105743408203125,
-0.0279541015625,
0.036712646484375,
0.0300445556640625,
0.061676025390625,
-0.01151275634765625,
0.051055908203125,
-0.0142364501953125,
-0.0013780593872070312,
-0.0567626953125,
0.05902099609375,
-0.047637939453125,
-0.0027103424072265625,
-0.01079559326171875,
-0.054534912109375,
-0.0169677734375,
0.048980712890625,
0.0005941390991210938,
0.0191650390625,
0.0268402099609375,
0.089111328125,
-0.0193023681640625,
-0.029205322265625,
0.0204010009765625,
0.032135009765625,
0.0174713134765625,
0.041351318359375,
0.03997802734375,
-0.059661865234375,
0.02569580078125,
-0.059967041015625,
-0.0394287109375,
0.01629638671875,
-0.076904296875,
-0.03900146484375,
-0.045623779296875,
-0.061737060546875,
-0.06878662109375,
-0.01129150390625,
0.07373046875,
0.0897216796875,
-0.06536865234375,
-0.028656005859375,
-0.0206451416015625,
-0.003742218017578125,
-0.030181884765625,
-0.0194549560546875,
0.031646728515625,
-0.00342559814453125,
-0.045166015625,
-0.00743865966796875,
0.01493072509765625,
0.02642822265625,
-0.0304718017578125,
-0.0278167724609375,
-0.0247802734375,
-0.027008056640625,
0.0244903564453125,
0.0212860107421875,
-0.026611328125,
-0.00951385498046875,
-0.01300048828125,
-0.0006647109985351562,
0.00827789306640625,
0.035064697265625,
-0.050872802734375,
0.0246429443359375,
0.036651611328125,
0.020965576171875,
0.058563232421875,
-0.007091522216796875,
-0.005657196044921875,
-0.061370849609375,
0.0224151611328125,
-0.005397796630859375,
0.028167724609375,
0.00856781005859375,
-0.0302276611328125,
0.031890869140625,
0.0328369140625,
-0.05609130859375,
-0.0294647216796875,
-0.0018663406372070312,
-0.10888671875,
-0.00838470458984375,
0.08502197265625,
-0.024810791015625,
-0.046051025390625,
-0.0004897117614746094,
-0.047607421875,
0.01221466064453125,
-0.02569580078125,
0.0257110595703125,
0.025054931640625,
-0.0214691162109375,
-0.04193115234375,
-0.0264892578125,
0.0292510986328125,
0.0016727447509765625,
-0.042083740234375,
-0.018798828125,
0.02642822265625,
0.037017822265625,
0.04315185546875,
0.05322265625,
-0.00015306472778320312,
0.01108551025390625,
0.00994110107421875,
0.004322052001953125,
-0.0100555419921875,
-0.00954437255859375,
-0.042144775390625,
0.007518768310546875,
-0.0141754150390625,
-0.0167083740234375
]
] |
Helsinki-NLP/opus-mt-eu-es | 2023-08-16T11:34:06.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"eu",
"es",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-eu-es | 1 | 12,761 | transformers | 2022-03-02T23:29:04 | ---
language:
- eu
- es
tags:
- translation
license: apache-2.0
---
### eus-spa
* source group: Basque
* target group: Spanish
* OPUS readme: [eus-spa](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eus-spa/README.md)
* model: transformer-align
* source language(s): eus
* target language(s): spa
* model: transformer-align
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus-2020-06-17.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eus-spa/opus-2020-06-17.zip)
* test set translations: [opus-2020-06-17.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eus-spa/opus-2020-06-17.test.txt)
* test set scores: [opus-2020-06-17.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eus-spa/opus-2020-06-17.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba-test.eus.spa | 48.8 | 0.673 |
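The checkpoint can be used with the standard Marian classes in Hugging Face Transformers. A minimal sketch (the Basque example sentence is illustrative only):
```python
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-eu-es"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# SentencePiece segmentation is handled internally by the tokenizer
batch = tokenizer(["Eskerrik asko!"], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```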
### System Info:
- hf_name: eus-spa
- source_languages: eus
- target_languages: spa
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eus-spa/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['eu', 'es']
- src_constituents: {'eus'}
- tgt_constituents: {'spa'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/eus-spa/opus-2020-06-17.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/eus-spa/opus-2020-06-17.test.txt
- src_alpha3: eus
- tgt_alpha3: spa
- short_pair: eu-es
- chrF2_score: 0.673
- bleu: 48.8
- brevity_penalty: 0.964
- ref_len: 12469.0
- src_name: Basque
- tgt_name: Spanish
- train_date: 2020-06-17
- src_alpha2: eu
- tgt_alpha2: es
- prefer_old: False
- long_pair: eus-spa
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 2,081 | [
[
-0.0306549072265625,
-0.038848876953125,
0.024688720703125,
0.026641845703125,
-0.0276336669921875,
-0.01447296142578125,
-0.022308349609375,
-0.0277557373046875,
0.0194244384765625,
0.02166748046875,
-0.0491943359375,
-0.055755615234375,
-0.04168701171875,
0.032379150390625,
-0.00875091552734375,
0.06982421875,
-0.006885528564453125,
0.0119171142578125,
0.0316162109375,
-0.034820556640625,
-0.0255584716796875,
-0.0221099853515625,
-0.044097900390625,
-0.0166168212890625,
0.033447265625,
0.020233154296875,
0.034576416015625,
0.03546142578125,
0.041748046875,
0.025054931640625,
-0.034820556640625,
0.0188140869140625,
-0.016265869140625,
-0.0086517333984375,
-0.00954437255859375,
-0.03350830078125,
-0.046234130859375,
-0.024688720703125,
0.066650390625,
0.04071044921875,
0.0035953521728515625,
0.0321044921875,
-0.004802703857421875,
0.05340576171875,
-0.01024627685546875,
0.0031948089599609375,
-0.03887939453125,
-0.004489898681640625,
-0.031951904296875,
-0.02978515625,
-0.0404052734375,
-0.0166015625,
0.0130157470703125,
-0.046173095703125,
0.0116729736328125,
0.00920867919921875,
0.1219482421875,
0.0118408203125,
-0.0294647216796875,
-0.006458282470703125,
-0.02899169921875,
0.06097412109375,
-0.054840087890625,
0.035491943359375,
0.025177001953125,
-0.006008148193359375,
-0.013458251953125,
-0.028900146484375,
-0.0215911865234375,
0.00737762451171875,
-0.0194244384765625,
0.021209716796875,
-0.02252197265625,
-0.01174163818359375,
0.0174102783203125,
0.039520263671875,
-0.052459716796875,
0.0027751922607421875,
-0.035125732421875,
-0.01160430908203125,
0.035369873046875,
0.01039886474609375,
0.0211639404296875,
-0.033294677734375,
-0.037750244140625,
-0.0291595458984375,
-0.04351806640625,
0.016326904296875,
0.031402587890625,
0.03228759765625,
-0.037750244140625,
0.049835205078125,
-0.01128387451171875,
0.041595458984375,
0.0092315673828125,
-0.0034809112548828125,
0.05194091796875,
-0.046417236328125,
-0.0084991455078125,
-0.0174560546875,
0.09051513671875,
0.0221710205078125,
0.0016393661499023438,
0.00046563148498535156,
-0.01445770263671875,
-0.00665283203125,
-0.00557708740234375,
-0.0584716796875,
0.016510009765625,
0.0210723876953125,
-0.0211639404296875,
-0.01116180419921875,
0.013763427734375,
-0.05828857421875,
0.01105499267578125,
0.0012464523315429688,
0.03289794921875,
-0.05328369140625,
-0.01849365234375,
0.03533935546875,
-0.0104827880859375,
0.0199737548828125,
-0.002590179443359375,
-0.036163330078125,
0.0211944580078125,
0.023834228515625,
0.06439208984375,
-0.0123443603515625,
-0.03094482421875,
-0.0198974609375,
0.007144927978515625,
-0.00592041015625,
0.052520751953125,
-0.0127410888671875,
-0.031768798828125,
-0.0123748779296875,
0.034393310546875,
-0.01387786865234375,
-0.00983428955078125,
0.06939697265625,
-0.01337432861328125,
0.041961669921875,
-0.032562255859375,
-0.03765869140625,
-0.0214385986328125,
0.018463134765625,
-0.05511474609375,
0.1064453125,
0.01244354248046875,
-0.0675048828125,
0.0263671875,
-0.056549072265625,
-0.01116180419921875,
-0.00826263427734375,
0.006862640380859375,
-0.052337646484375,
-0.0023059844970703125,
0.0156097412109375,
0.0304107666015625,
-0.026214599609375,
0.036376953125,
-0.00963592529296875,
-0.016357421875,
0.00576019287109375,
-0.026611328125,
0.0914306640625,
0.016387939453125,
-0.0380859375,
0.003570556640625,
-0.0545654296875,
-0.00405120849609375,
0.019317626953125,
-0.0338134765625,
-0.01425933837890625,
-0.00640869140625,
0.01552581787109375,
0.01406097412109375,
0.0200042724609375,
-0.0462646484375,
0.0249176025390625,
-0.044647216796875,
0.01490020751953125,
0.057861328125,
0.0045166015625,
0.0223541259765625,
-0.0289306640625,
0.032684326171875,
0.00739288330078125,
0.01110076904296875,
0.0128021240234375,
-0.039642333984375,
-0.0517578125,
-0.0190887451171875,
0.03936767578125,
0.05413818359375,
-0.0452880859375,
0.0662841796875,
-0.05999755859375,
-0.05487060546875,
-0.047119140625,
-0.01812744140625,
0.039947509765625,
0.0170440673828125,
0.037353515625,
-0.01678466796875,
-0.04150390625,
-0.06939697265625,
-0.0177459716796875,
-0.01081085205078125,
0.00035309791564941406,
0.013519287109375,
0.059356689453125,
0.0031528472900390625,
0.044891357421875,
-0.0222320556640625,
-0.035247802734375,
-0.01256561279296875,
0.007762908935546875,
0.0307159423828125,
0.043060302734375,
0.052642822265625,
-0.056182861328125,
-0.043304443359375,
0.003204345703125,
-0.043121337890625,
-0.0150299072265625,
-0.0006036758422851562,
-0.0099029541015625,
0.027008056640625,
0.0036678314208984375,
-0.042877197265625,
0.022979736328125,
0.045440673828125,
-0.0765380859375,
0.0270843505859375,
-0.017822265625,
0.0276336669921875,
-0.1009521484375,
0.0157623291015625,
-0.0028247833251953125,
-0.011199951171875,
-0.024169921875,
-0.0059356689453125,
0.007221221923828125,
0.0148773193359375,
-0.043670654296875,
0.058319091796875,
-0.053741455078125,
-0.0083465576171875,
0.03582763671875,
0.01419830322265625,
0.00525665283203125,
0.05926513671875,
-0.0101776123046875,
0.0648193359375,
0.045806884765625,
-0.0308990478515625,
0.003887176513671875,
0.03289794921875,
-0.0214385986328125,
0.0288848876953125,
-0.04608154296875,
-0.02337646484375,
0.021240234375,
0.0100860595703125,
-0.056243896484375,
-0.00009828805923461914,
0.0147857666015625,
-0.048248291015625,
0.02056884765625,
-0.009429931640625,
-0.0460205078125,
-0.020172119140625,
-0.03326416015625,
0.03179931640625,
0.027801513671875,
-0.018157958984375,
0.0535888671875,
0.016845703125,
-0.007030487060546875,
-0.051849365234375,
-0.059722900390625,
-0.003940582275390625,
-0.0214385986328125,
-0.0479736328125,
0.03350830078125,
-0.01055908203125,
0.00695037841796875,
0.01611328125,
-0.00643157958984375,
-0.0014972686767578125,
0.0081634521484375,
0.005802154541015625,
0.019378662109375,
-0.0181884765625,
-0.001338958740234375,
-0.0007052421569824219,
-0.013580322265625,
-0.0198974609375,
-0.0113372802734375,
0.05877685546875,
-0.03448486328125,
-0.0223541259765625,
-0.056304931640625,
0.011199951171875,
0.049468994140625,
-0.03143310546875,
0.0765380859375,
0.04205322265625,
-0.0174102783203125,
0.01311492919921875,
-0.04193115234375,
0.004322052001953125,
-0.0286865234375,
0.0224761962890625,
-0.042633056640625,
-0.0543212890625,
0.06805419921875,
0.01212310791015625,
0.01898193359375,
0.07623291015625,
0.04840087890625,
0.01184844970703125,
0.055511474609375,
0.0239105224609375,
0.0039005279541015625,
0.048309326171875,
-0.053131103515625,
-0.011383056640625,
-0.0638427734375,
-0.0220489501953125,
-0.062164306640625,
-0.01116180419921875,
-0.0616455078125,
-0.0203399658203125,
0.021881103515625,
-0.0035552978515625,
-0.01395416259765625,
0.0548095703125,
-0.04168701171875,
0.0222625732421875,
0.039398193359375,
0.0111083984375,
0.0209197998046875,
-0.003448486328125,
-0.040557861328125,
-0.0036525726318359375,
-0.04083251953125,
-0.04296875,
0.09393310546875,
0.0274505615234375,
0.0191192626953125,
0.0232696533203125,
0.05255126953125,
0.00775146484375,
0.00722503662109375,
-0.039459228515625,
0.044403076171875,
-0.0137176513671875,
-0.05889892578125,
-0.035552978515625,
-0.0265350341796875,
-0.07781982421875,
0.0257110595703125,
-0.0172882080078125,
-0.052398681640625,
0.015716552734375,
-0.0128173828125,
-0.013397216796875,
0.051055908203125,
-0.0489501953125,
0.0677490234375,
-0.001201629638671875,
-0.020843505859375,
0.0096282958984375,
-0.043243408203125,
0.0017366409301757812,
0.002719879150390625,
0.019927978515625,
-0.012298583984375,
-0.01451873779296875,
0.06854248046875,
-0.02484130859375,
0.044891357421875,
-0.006664276123046875,
-0.00801849365234375,
0.012176513671875,
0.00797271728515625,
0.04217529296875,
-0.0152587890625,
-0.0189361572265625,
0.0269012451171875,
0.0079345703125,
-0.04083251953125,
-0.010833740234375,
0.038970947265625,
-0.056671142578125,
-0.0251617431640625,
-0.045928955078125,
-0.04296875,
0.0007910728454589844,
0.03338623046875,
0.0426025390625,
0.047119140625,
-0.01187896728515625,
0.04425048828125,
0.056060791015625,
-0.0173187255859375,
0.038177490234375,
0.046539306640625,
-0.00287628173828125,
-0.04388427734375,
0.046722412109375,
0.0179443359375,
0.01486968994140625,
0.032562255859375,
0.0013179779052734375,
-0.02703857421875,
-0.060638427734375,
-0.040008544921875,
0.0322265625,
-0.0227813720703125,
-0.029998779296875,
-0.044464111328125,
0.000007987022399902344,
-0.0278778076171875,
0.00920867919921875,
-0.03173828125,
-0.038116455078125,
-0.011993408203125,
-0.0259552001953125,
0.035369873046875,
0.032562255859375,
-0.006298065185546875,
0.0157012939453125,
-0.064208984375,
0.0186614990234375,
-0.0192413330078125,
0.038909912109375,
-0.0299530029296875,
-0.06231689453125,
-0.02056884765625,
-0.0023212432861328125,
-0.02642822265625,
-0.08465576171875,
0.036712646484375,
-0.003147125244140625,
0.0302734375,
0.013519287109375,
0.004711151123046875,
0.04913330078125,
-0.0361328125,
0.0726318359375,
-0.0036258697509765625,
-0.0638427734375,
0.039398193359375,
-0.035797119140625,
0.021453857421875,
0.050384521484375,
0.01253509521484375,
-0.02752685546875,
-0.058074951171875,
-0.0699462890625,
-0.06884765625,
0.060150146484375,
0.042877197265625,
-0.01178741455078125,
-0.006656646728515625,
0.00841522216796875,
-0.006198883056640625,
-0.018280029296875,
-0.0869140625,
-0.037200927734375,
0.020050048828125,
-0.0362548828125,
0.0107421875,
-0.0303802490234375,
-0.016815185546875,
-0.018035888671875,
0.08843994140625,
0.01416778564453125,
0.01308441162109375,
0.03851318359375,
-0.0054168701171875,
-0.01043701171875,
0.031982421875,
0.046539306640625,
0.034515380859375,
-0.0219268798828125,
-0.0190582275390625,
0.032196044921875,
-0.03143310546875,
0.0007548332214355469,
0.01029205322265625,
-0.030426025390625,
0.0261688232421875,
0.034423828125,
0.0631103515625,
0.024627685546875,
-0.037628173828125,
0.04327392578125,
-0.0068511962890625,
-0.036895751953125,
-0.0298309326171875,
-0.0190887451171875,
0.00705718994140625,
0.0151824951171875,
0.0261688232421875,
0.0009708404541015625,
0.0019168853759765625,
-0.015716552734375,
0.0080413818359375,
0.0186614990234375,
-0.0191497802734375,
-0.03204345703125,
0.039031982421875,
0.006198883056640625,
-0.02117919921875,
0.0119781494140625,
-0.017669677734375,
-0.034637451171875,
0.045745849609375,
0.019287109375,
0.07415771484375,
-0.0201873779296875,
-0.006771087646484375,
0.052093505859375,
0.035308837890625,
-0.014739990234375,
0.04150390625,
0.01508331298828125,
-0.042236328125,
-0.0209808349609375,
-0.058258056640625,
0.005977630615234375,
0.00801849365234375,
-0.06494140625,
0.03143310546875,
0.007770538330078125,
-0.0308685302734375,
-0.010467529296875,
0.0287628173828125,
-0.047760009765625,
0.00424957275390625,
-0.024017333984375,
0.07940673828125,
-0.078857421875,
0.05145263671875,
0.051513671875,
-0.04888916015625,
-0.07916259765625,
-0.020233154296875,
-0.016571044921875,
-0.038299560546875,
0.0287017822265625,
-0.002346038818359375,
0.0010137557983398438,
-0.00481414794921875,
-0.021148681640625,
-0.06884765625,
0.09619140625,
0.031982421875,
-0.03424072265625,
-0.01666259765625,
0.0017108917236328125,
0.042724609375,
-0.0015821456909179688,
0.01126861572265625,
0.0308074951171875,
0.052154541015625,
-0.01201629638671875,
-0.08709716796875,
0.0118255615234375,
-0.03424072265625,
-0.01155853271484375,
0.032318115234375,
-0.0645751953125,
0.06103515625,
0.0123748779296875,
-0.0229339599609375,
0.0118255615234375,
0.03741455078125,
0.031646728515625,
-0.0008716583251953125,
0.036376953125,
0.07647705078125,
0.0408935546875,
-0.04248046875,
0.072021484375,
-0.0287628173828125,
0.05218505859375,
0.0645751953125,
0.0206146240234375,
0.059844970703125,
0.044036865234375,
-0.02630615234375,
0.045196533203125,
0.054473876953125,
-0.01497650146484375,
0.023529052734375,
-0.0091094970703125,
0.0005207061767578125,
-0.01277923583984375,
-0.0093994140625,
-0.04620361328125,
0.0308074951171875,
0.0142822265625,
-0.0181121826171875,
-0.0096588134765625,
-0.0233612060546875,
0.029144287109375,
0.00431060791015625,
-0.00856781005859375,
0.050872802734375,
-0.012939453125,
-0.048583984375,
0.051513671875,
-0.00151824951171875,
0.047332763671875,
-0.048309326171875,
0.003936767578125,
-0.014678955078125,
0.01171875,
-0.006198883056640625,
-0.05975341796875,
0.02056884765625,
0.0257415771484375,
-0.01535797119140625,
-0.022796630859375,
0.006557464599609375,
-0.035003662109375,
-0.05780029296875,
0.035003662109375,
0.039642333984375,
0.0211639404296875,
0.0192413330078125,
-0.0635986328125,
0.0022106170654296875,
0.0119476318359375,
-0.057586669921875,
-0.0007424354553222656,
0.07000732421875,
0.0023250579833984375,
0.052734375,
0.039031982421875,
0.0241241455078125,
0.013397216796875,
0.00440216064453125,
0.0538330078125,
-0.057281494140625,
-0.03863525390625,
-0.061492919921875,
0.0517578125,
-0.0088348388671875,
-0.044097900390625,
0.04925537109375,
0.0628662109375,
0.06683349609375,
0.0025539398193359375,
0.02703857421875,
-0.016448974609375,
0.04193115234375,
-0.04833984375,
0.0479736328125,
-0.06640625,
0.0105133056640625,
-0.0201568603515625,
-0.05743408203125,
-0.0234375,
0.0217742919921875,
-0.01413726806640625,
-0.005084991455078125,
0.07244873046875,
0.059173583984375,
0.00688934326171875,
-0.027557373046875,
0.0010976791381835938,
0.0295867919921875,
0.015716552734375,
0.057281494140625,
0.0217132568359375,
-0.06536865234375,
0.053253173828125,
-0.0236663818359375,
0.0010614395141601562,
-0.0011882781982421875,
-0.060089111328125,
-0.0599365234375,
-0.050079345703125,
-0.0088653564453125,
-0.0330810546875,
-0.010650634765625,
0.0689697265625,
0.024017333984375,
-0.0731201171875,
-0.0222625732421875,
-0.0034694671630859375,
0.005084991455078125,
-0.01910400390625,
-0.0189971923828125,
0.059051513671875,
-0.00611114501953125,
-0.0731201171875,
0.00897979736328125,
0.006320953369140625,
0.01316070556640625,
0.0038509368896484375,
-0.01007843017578125,
-0.043060302734375,
-0.0019426345825195312,
0.0153350830078125,
0.01275634765625,
-0.0694580078125,
-0.01233673095703125,
0.0108795166015625,
-0.022979736328125,
0.0142822265625,
0.00922393798828125,
-0.0236968994140625,
0.0121612548828125,
0.046539306640625,
0.0237579345703125,
0.03741455078125,
-0.005779266357421875,
0.02178955078125,
-0.052337646484375,
0.031341552734375,
0.01507568359375,
0.053131103515625,
0.0215606689453125,
-0.0140228271484375,
0.062103271484375,
0.02484130859375,
-0.0215911865234375,
-0.07745361328125,
-0.00569915771484375,
-0.0977783203125,
-0.004947662353515625,
0.07904052734375,
-0.01568603515625,
-0.0213623046875,
0.013275146484375,
-0.0177764892578125,
0.03387451171875,
-0.036956787109375,
0.04302978515625,
0.064697265625,
0.016693115234375,
0.020477294921875,
-0.0300445556640625,
0.02655029296875,
0.04010009765625,
-0.055755615234375,
-0.006961822509765625,
0.024261474609375,
0.030975341796875,
0.02386474609375,
0.050750732421875,
-0.03375244140625,
0.0149383544921875,
-0.01085662841796875,
0.0214385986328125,
-0.00910186767578125,
-0.0119171142578125,
-0.0258331298828125,
0.004825592041015625,
-0.01529693603515625,
-0.0142974853515625
]
] |
TheBloke/Llama-2-70B-fp16 | 2023-10-30T15:17:36.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"facebook",
"meta",
"pytorch",
"llama-2",
"en",
"license:llama2",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Llama-2-70B-fp16 | 43 | 12,739 | transformers | 2023-07-19T02:21:20 | ---
inference: false
language:
- en
license: llama2
model_type: llama
pipeline_tag: text-generation
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
---
<!-- header start -->
<div style="width: 100%;">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p><a href="https://discord.gg/theblokeai">Chat & support: my new Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<!-- header end -->
# Meta's Llama 2 70B fp16
These files are fp16 format model files for [Meta's Llama 2 70B](https://huggingface.co/meta-llama/Llama-2-70b-hf).
They were produced by downloading the PTH files from Meta, and then converting to HF format using the latest Transformers 4.32.0.dev0, from Git, with the Llama 2 PR included: https://github.com/huggingface/transformers/pull/24891.
Command to convert was:
```
python3 /workspace/venv/pytorch2/lib/python3.10/site-packages/transformers/models/llama/convert_llama_weights_to_hf.py --input_dir /workspace/git/llama/download --model_size 70B --output_dir /workspace/process/llama-2-70b-chat/source --safe_serialization true
```
The files were saved in Safetensors format.
I am uploading this repo because I initially tried to create GPTQs using the [Meta Llama 2 70B HF repo](https://huggingface.co/meta-llama/Llama-2-70b-hf), but got strange errors that suggested the weights were not correct. But converting from the PTH files using the latest `convert_llama_weights_to_hf.py` script worked fine.
Many thanks to William Beauchamp from [Chai](https://chai-research.com/) for providing the hardware for merging and uploading these files!
## Repositories available
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Llama-2-70B-GPTQ)
* [Original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/meta-llama/Llama-2-70b-hf)
* [My fp16 conversion of the unquantised PTH model files](https://huggingface.co/TheBloke/Llama-2-70B-fp16)
## Prompt template: None
```
{prompt}
```
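Since these are plain fp16 Transformers weights in Safetensors format, they can be loaded with the standard `AutoModelForCausalLM` API. A minimal sketch (assuming `accelerate` is installed for `device_map="auto"`; the full 70B model in fp16 needs on the order of 140 GB of accelerator memory, so multi-GPU sharding or offloading is expected):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "TheBloke/Llama-2-70B-fp16"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # shard across available GPUs / offload to CPU as needed
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```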
<!-- footer start -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute.
Thanks to the [chirper.ai](https://chirper.ai) team!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Luke from CarbonQuill, Aemon Algiz.
**Patreon special mentions**: Space Cruiser, Nikolai Manek, Sam, Chris McCloskey, Rishabh Srivastava, Kalila, Spiking Neurons AB, Khalefa Al-Ahmad, WelcomeToTheClub, Chadd, Lone Striker, Viktor Bowallius, Edmond Seymore, Ai Maven, Chris Smitley, Dave, Alexandros Triantafyllidis, Luke @flexchar, Elle, ya boyyy, Talal Aujan, Alex , Jonathan Leane, Deep Realms, Randy H, subjectnull, Preetika Verma, Joseph William Delisle, Michael Levine, chris gileta, K, Oscar Rangel, LangChain4j, Trenton Dambrowitz, Eugene Pentland, Johann-Peter Hartmann, Femi Adebogun, Illia Dulskyi, senxiiz, Daniel P. Andersen, Sean Connelly, Artur Olbinski, RoA, Mano Prime, Derek Yates, Raven Klaugh, David Flickinger, Willem Michiel, Pieter, Willian Hasse, vamX, Luke Pendergrass, webtim, Ghost , Rainer Wilmers, Nathan LeClaire, Will Dee, Cory Kujawski, John Detwiler, Fred von Graf, biorpg, Iucharbius , Imad Khwaja, Pierre Kircher, terasurfer , Asp the Wyvern, John Villwock, theTransient, zynix , Gabriel Tamborski, Fen Risland, Gabriel Puliatti, Matthew Berman, Pyrater, SuperWojo, Stephen Murray, Karl Bernard, Ajan Kanaga, Greatston Gnanesh, Junyu Yang.
Thank you to all my generous patrons and donaters!
<!-- footer end -->
# Original model card: Meta's Llama 2 70B
# **Llama 2**
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 70B pretrained model, converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. Bigger models (70B) use Grouped-Query Attention (GQA) for improved inference scalability.
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
To get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespace and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). See our reference code on GitHub for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212).
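For illustration only, a single-turn prompt in that chat format might be assembled roughly like this (a sketch based on the description above, not the canonical implementation; the linked `chat_completion` code is authoritative, and this base 70B model itself uses no template):
```python
# Rough, single-turn illustration of the Llama-2-Chat prompt format described above.
# BOS/EOS tokens are normally added by the tokenizer, so they are not written here.
system_prompt = "You are a helpful, respectful and honest assistant."
user_message = "Summarise what grouped-query attention is."

prompt = (
    "[INST] <<SYS>>\n"
    f"{system_prompt}\n"
    "<</SYS>>\n\n"
    f"{user_message.strip()} [/INST]"
)
```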
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted(tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide)
## Reporting Issues
Please report any software “bug,” or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf)|
|70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf)|
| 14,033 | [
[
-0.0214385986328125,
-0.052764892578125,
0.0231781005859375,
0.023529052734375,
-0.034393310546875,
0.005573272705078125,
-0.0037555694580078125,
-0.049407958984375,
0.02447509765625,
0.027130126953125,
-0.053253173828125,
-0.028778076171875,
-0.04541015625,
0.015716552734375,
-0.023529052734375,
0.08453369140625,
0.009307861328125,
-0.0264129638671875,
-0.005161285400390625,
0.01336669921875,
-0.034332275390625,
-0.0295257568359375,
-0.044219970703125,
-0.04248046875,
0.03778076171875,
0.0204010009765625,
0.058380126953125,
0.04815673828125,
0.0369873046875,
0.02349853515625,
-0.018524169921875,
0.0191650390625,
-0.0501708984375,
-0.01216888427734375,
0.0118865966796875,
-0.03570556640625,
-0.055450439453125,
-0.0021514892578125,
0.026641845703125,
0.01029205322265625,
-0.0238494873046875,
0.0234222412109375,
0.0021209716796875,
0.031524658203125,
-0.02032470703125,
0.0222930908203125,
-0.045196533203125,
-0.003864288330078125,
-0.01605224609375,
0.00033164024353027344,
-0.0091552734375,
-0.016357421875,
-0.0134429931640625,
-0.06884765625,
-0.00982666015625,
0.0023555755615234375,
0.09051513671875,
0.03826904296875,
-0.03204345703125,
0.00125885009765625,
-0.0369873046875,
0.053466796875,
-0.07080078125,
0.0128631591796875,
0.039520263671875,
0.013763427734375,
-0.0267333984375,
-0.07489013671875,
-0.057464599609375,
-0.0075836181640625,
-0.00232696533203125,
0.015289306640625,
-0.043853759765625,
-0.007572174072265625,
0.00439453125,
0.0308380126953125,
-0.036376953125,
0.021728515625,
-0.033966064453125,
-0.009246826171875,
0.059478759765625,
0.00688934326171875,
0.0178985595703125,
-0.0173797607421875,
-0.0287933349609375,
-0.0270843505859375,
-0.059844970703125,
0.00653839111328125,
0.0311737060546875,
0.00534820556640625,
-0.057830810546875,
0.05792236328125,
-0.01397705078125,
0.0206146240234375,
0.0167694091796875,
-0.0300445556640625,
0.022308349609375,
-0.042083740234375,
-0.0208892822265625,
-0.0230560302734375,
0.08380126953125,
0.04736328125,
0.007251739501953125,
0.01470947265625,
-0.006473541259765625,
0.005146026611328125,
-0.0012607574462890625,
-0.062469482421875,
-0.01235198974609375,
0.025238037109375,
-0.04205322265625,
-0.040283203125,
-0.0236663818359375,
-0.055084228515625,
-0.026702880859375,
-0.0035800933837890625,
0.01338958740234375,
-0.0148162841796875,
-0.03948974609375,
0.0075531005859375,
-0.00876617431640625,
0.039276123046875,
0.029296875,
-0.06878662109375,
0.0088043212890625,
0.045928955078125,
0.052032470703125,
0.01331329345703125,
-0.0203094482421875,
-0.0108184814453125,
0.009918212890625,
-0.0206451416015625,
0.058746337890625,
-0.0245819091796875,
-0.037322998046875,
-0.013458251953125,
0.00923919677734375,
0.01512908935546875,
-0.0318603515625,
0.03009033203125,
-0.02581787109375,
0.0126495361328125,
-0.0205078125,
-0.019073486328125,
-0.0177764892578125,
0.01006317138671875,
-0.03546142578125,
0.08843994140625,
0.02386474609375,
-0.0452880859375,
0.007091522216796875,
-0.0413818359375,
-0.0201263427734375,
-0.002933502197265625,
0.00616455078125,
-0.035400390625,
-0.0100860595703125,
0.0135955810546875,
0.0227203369140625,
-0.04693603515625,
0.032318115234375,
-0.019683837890625,
-0.0251922607421875,
0.0252685546875,
-0.02740478515625,
0.06805419921875,
0.01495361328125,
-0.0322265625,
-0.000009715557098388672,
-0.0435791015625,
-0.015167236328125,
0.044525146484375,
-0.038360595703125,
0.0188446044921875,
0.005672454833984375,
0.00653076171875,
0.006130218505859375,
0.037139892578125,
-0.0260467529296875,
0.022064208984375,
-0.02801513671875,
0.051300048828125,
0.0628662109375,
-0.007503509521484375,
0.018280029296875,
-0.039520263671875,
0.034912109375,
-0.0081024169921875,
0.026092529296875,
0.006298065185546875,
-0.0634765625,
-0.07342529296875,
-0.0174407958984375,
0.006954193115234375,
0.0491943359375,
-0.03753662109375,
0.049957275390625,
-0.01055908203125,
-0.05828857421875,
-0.0391845703125,
0.01151275634765625,
0.0321044921875,
0.025299072265625,
0.0284576416015625,
-0.027252197265625,
-0.05059814453125,
-0.06671142578125,
0.01148223876953125,
-0.0467529296875,
-0.00994110107421875,
0.032135009765625,
0.045379638671875,
-0.0450439453125,
0.05072021484375,
-0.032928466796875,
-0.028594970703125,
-0.0299224853515625,
-0.016876220703125,
0.01788330078125,
0.035888671875,
0.050384521484375,
-0.0404052734375,
-0.01947021484375,
0.003986358642578125,
-0.056365966796875,
-0.01018524169921875,
-0.00009679794311523438,
-0.02069091796875,
0.016143798828125,
0.00955963134765625,
-0.072509765625,
0.032867431640625,
0.054718017578125,
-0.0267333984375,
0.030364990234375,
-0.0016765594482421875,
-0.01016998291015625,
-0.0867919921875,
0.0039825439453125,
-0.006145477294921875,
-0.01187896728515625,
-0.042816162109375,
-0.0048675537109375,
-0.023468017578125,
0.004810333251953125,
-0.031890869140625,
0.04632568359375,
-0.021453857421875,
-0.001247406005859375,
-0.00917816162109375,
0.004428863525390625,
0.007526397705078125,
0.044769287109375,
-0.015899658203125,
0.063232421875,
0.0293731689453125,
-0.03570556640625,
0.034393310546875,
0.047332763671875,
-0.020050048828125,
0.0139923095703125,
-0.07635498046875,
0.0237579345703125,
0.0007805824279785156,
0.0487060546875,
-0.08123779296875,
-0.03582763671875,
0.05364990234375,
-0.046600341796875,
0.021484375,
-0.00960540771484375,
-0.042755126953125,
-0.0305328369140625,
-0.0321044921875,
0.0267333984375,
0.060302734375,
-0.03497314453125,
0.0303192138671875,
0.038299560546875,
-0.004360198974609375,
-0.052001953125,
-0.0648193359375,
0.00847625732421875,
-0.0211334228515625,
-0.037994384765625,
0.032470703125,
-0.00885009765625,
-0.0142822265625,
-0.00928497314453125,
0.01273345947265625,
0.0020961761474609375,
0.01251220703125,
0.0272216796875,
0.025726318359375,
-0.007183074951171875,
-0.015716552734375,
0.01351165771484375,
-0.005260467529296875,
-0.003231048583984375,
-0.0012645721435546875,
0.0611572265625,
-0.021209716796875,
-0.01526641845703125,
-0.0660400390625,
0.00637054443359375,
0.0389404296875,
-0.0170135498046875,
0.04656982421875,
0.0435791015625,
-0.0211639404296875,
0.017974853515625,
-0.051513671875,
-0.01197052001953125,
-0.039947509765625,
0.035614013671875,
-0.0041351318359375,
-0.06854248046875,
0.032562255859375,
0.01216888427734375,
0.02197265625,
0.046875,
0.052947998046875,
-0.0156402587890625,
0.06494140625,
0.051788330078125,
-0.0128631591796875,
0.038116455078125,
-0.037139892578125,
0.00511932373046875,
-0.063720703125,
-0.041595458984375,
-0.030914306640625,
-0.03662109375,
-0.056396484375,
-0.049713134765625,
0.0205535888671875,
0.006046295166015625,
-0.03924560546875,
0.034515380859375,
-0.042724609375,
0.02069091796875,
0.029754638671875,
0.0113525390625,
0.0180816650390625,
0.01384735107421875,
0.02069091796875,
0.00372314453125,
-0.04345703125,
-0.0467529296875,
0.0906982421875,
0.041107177734375,
0.039520263671875,
0.017364501953125,
0.04638671875,
0.0099639892578125,
0.0305938720703125,
-0.04986572265625,
0.040496826171875,
-0.005859375,
-0.05035400390625,
-0.0130462646484375,
-0.01367950439453125,
-0.0703125,
0.014404296875,
-0.01425933837890625,
-0.05426025390625,
0.0188751220703125,
-0.0031890869140625,
-0.0208892822265625,
0.025482177734375,
-0.031707763671875,
0.047637939453125,
-0.023590087890625,
-0.015411376953125,
-0.0184783935546875,
-0.06146240234375,
0.0435791015625,
0.0016202926635742188,
0.01303863525390625,
-0.0313720703125,
-0.0208587646484375,
0.061065673828125,
-0.041290283203125,
0.08758544921875,
-0.00015687942504882812,
-0.0207061767578125,
0.048553466796875,
-0.0018978118896484375,
0.04376220703125,
0.017669677734375,
-0.0038776397705078125,
0.04010009765625,
-0.002288818359375,
-0.0140380859375,
-0.0155181884765625,
0.033843994140625,
-0.09832763671875,
-0.0552978515625,
-0.023590087890625,
-0.032318115234375,
0.0198516845703125,
0.00804901123046875,
0.025177001953125,
0.0032405853271484375,
0.0027523040771484375,
0.0214691162109375,
0.0238494873046875,
-0.026641845703125,
0.03900146484375,
0.0225982666015625,
-0.01025390625,
-0.050933837890625,
0.059051513671875,
-0.0117034912109375,
0.01122283935546875,
0.026458740234375,
0.00634765625,
-0.031982421875,
-0.021636962890625,
-0.042083740234375,
0.047882080078125,
-0.041961669921875,
-0.04107666015625,
-0.03900146484375,
-0.017730712890625,
-0.0298309326171875,
0.0026950836181640625,
-0.03564453125,
-0.0386962890625,
-0.0648193359375,
-0.0032901763916015625,
0.05560302734375,
0.055816650390625,
-0.0069427490234375,
0.04315185546875,
-0.042266845703125,
0.01983642578125,
0.020782470703125,
0.007465362548828125,
0.00022399425506591797,
-0.0654296875,
0.0071563720703125,
0.01216888427734375,
-0.04638671875,
-0.057708740234375,
0.042633056640625,
0.01849365234375,
0.0286407470703125,
0.0251007080078125,
-0.006298065185546875,
0.060882568359375,
-0.0298004150390625,
0.07373046875,
0.0262603759765625,
-0.060333251953125,
0.04278564453125,
-0.03594970703125,
-0.0022945404052734375,
0.029022216796875,
0.0202178955078125,
-0.029693603515625,
-0.023773193359375,
-0.0458984375,
-0.052459716796875,
0.052520751953125,
0.0276336669921875,
0.0282135009765625,
-0.0012302398681640625,
0.04010009765625,
-0.004062652587890625,
0.01593017578125,
-0.07916259765625,
-0.030426025390625,
-0.0302886962890625,
-0.007587432861328125,
0.003398895263671875,
-0.0269775390625,
-0.01084136962890625,
-0.029541015625,
0.057830810546875,
-0.01055145263671875,
0.04315185546875,
0.0111083984375,
0.0128173828125,
-0.0220184326171875,
0.00811767578125,
0.048309326171875,
0.0411376953125,
-0.003910064697265625,
-0.01122283935546875,
0.02178955078125,
-0.036865234375,
0.01514434814453125,
0.01000213623046875,
-0.01187896728515625,
-0.0176239013671875,
0.02423095703125,
0.06732177734375,
0.0126495361328125,
-0.0498046875,
0.037933349609375,
-0.00009131431579589844,
-0.018402099609375,
-0.0323486328125,
0.01190948486328125,
0.0222015380859375,
0.039306640625,
0.0266571044921875,
-0.00970458984375,
-0.007701873779296875,
-0.03533935546875,
-0.0070037841796875,
0.032958984375,
-0.00226593017578125,
-0.037750244140625,
0.07806396484375,
0.0173797607421875,
-0.0289154052734375,
0.048797607421875,
-0.006595611572265625,
-0.03204345703125,
0.062744140625,
0.059661865234375,
0.054779052734375,
-0.01082611083984375,
0.0190887451171875,
0.038970947265625,
0.031890869140625,
-0.00653076171875,
0.0171966552734375,
0.0007762908935546875,
-0.042724609375,
-0.0248870849609375,
-0.05328369140625,
-0.031280517578125,
0.0258026123046875,
-0.0262603759765625,
0.0297698974609375,
-0.05242919921875,
-0.0264892578125,
-0.0264129638671875,
0.0113983154296875,
-0.037384033203125,
-0.0010766983032226562,
0.017669677734375,
0.05474853515625,
-0.048126220703125,
0.059112548828125,
0.042083740234375,
-0.039825439453125,
-0.06707763671875,
-0.015655517578125,
0.00882720947265625,
-0.07373046875,
0.03240966796875,
0.0123443603515625,
0.0036602020263671875,
0.00909423828125,
-0.0635986328125,
-0.08526611328125,
0.11962890625,
0.02703857421875,
-0.046112060546875,
-0.00983428955078125,
0.0208587646484375,
0.040924072265625,
-0.0213165283203125,
0.0285491943359375,
0.05078125,
0.039520263671875,
0.0063323974609375,
-0.08233642578125,
0.01654052734375,
-0.02984619140625,
0.0078887939453125,
-0.011016845703125,
-0.09600830078125,
0.0660400390625,
-0.0237884521484375,
-0.01148223876953125,
0.0330810546875,
0.05908203125,
0.04931640625,
0.01336669921875,
0.02984619140625,
0.04278564453125,
0.056121826171875,
-0.01030731201171875,
0.08160400390625,
-0.0181884765625,
0.02423095703125,
0.045440673828125,
-0.01434326171875,
0.06719970703125,
0.025054931640625,
-0.044586181640625,
0.0489501953125,
0.066162109375,
-0.0175323486328125,
0.0289459228515625,
0.0003135204315185547,
-0.010498046875,
-0.012054443359375,
-0.0279083251953125,
-0.051116943359375,
0.0389404296875,
0.0257110595703125,
-0.0145263671875,
0.00589752197265625,
-0.02880859375,
0.0167236328125,
-0.02618408203125,
0.00310516357421875,
0.05328369140625,
0.02239990234375,
-0.03387451171875,
0.076904296875,
0.00853729248046875,
0.0670166015625,
-0.04913330078125,
-0.0036144256591796875,
-0.03997802734375,
0.006317138671875,
-0.0209503173828125,
-0.04986572265625,
0.0031528472900390625,
0.011322021484375,
-0.0020885467529296875,
-0.00823974609375,
0.048797607421875,
-0.01100921630859375,
-0.0369873046875,
0.033599853515625,
0.0229339599609375,
0.024749755859375,
0.0173492431640625,
-0.05908203125,
0.0285491943359375,
0.0111541748046875,
-0.03424072265625,
0.03094482421875,
0.016082763671875,
0.003482818603515625,
0.060516357421875,
0.057830810546875,
-0.01380157470703125,
-0.00252532958984375,
-0.00862884521484375,
0.0802001953125,
-0.03271484375,
-0.015716552734375,
-0.0623779296875,
0.047760009765625,
0.01372528076171875,
-0.03680419921875,
0.0372314453125,
0.0306549072265625,
0.05438232421875,
-0.0019369125366210938,
0.0482177734375,
-0.01739501953125,
0.0129547119140625,
-0.02197265625,
0.064697265625,
-0.0655517578125,
0.0234375,
-0.01320648193359375,
-0.06146240234375,
-0.013519287109375,
0.0516357421875,
0.00934600830078125,
0.004352569580078125,
0.02130126953125,
0.0673828125,
0.0130615234375,
-0.01200103759765625,
0.006534576416015625,
0.0232086181640625,
0.03570556640625,
0.06536865234375,
0.0760498046875,
-0.05926513671875,
0.06573486328125,
-0.03326416015625,
-0.0107421875,
-0.0230255126953125,
-0.056243896484375,
-0.066162109375,
-0.0321044921875,
-0.0209197998046875,
-0.0239410400390625,
-0.00376129150390625,
0.06689453125,
0.04046630859375,
-0.0391845703125,
-0.034637451171875,
0.00023567676544189453,
0.009307861328125,
0.0011272430419921875,
-0.01491546630859375,
0.0126953125,
0.01139068603515625,
-0.0545654296875,
0.037200927734375,
0.002941131591796875,
0.031890869140625,
-0.0123748779296875,
-0.019317626953125,
-0.022918701171875,
-0.0009746551513671875,
0.042724609375,
0.03009033203125,
-0.07623291015625,
-0.0159912109375,
-0.0010242462158203125,
-0.006786346435546875,
0.0183563232421875,
0.019012451171875,
-0.05413818359375,
-0.00681304931640625,
0.0281982421875,
0.030731201171875,
0.042816162109375,
0.0015811920166015625,
0.0184326171875,
-0.03863525390625,
0.0301971435546875,
0.0012598037719726562,
0.0217742919921875,
0.028106689453125,
-0.035064697265625,
0.052032470703125,
0.0233001708984375,
-0.052276611328125,
-0.0721435546875,
-0.0007348060607910156,
-0.0821533203125,
-0.0091094970703125,
0.09625244140625,
0.00728607177734375,
-0.019378662109375,
0.0120086669921875,
-0.0225830078125,
0.039703369140625,
-0.03704833984375,
0.04791259765625,
0.030487060546875,
-0.00463104248046875,
-0.0186920166015625,
-0.04541015625,
0.0316162109375,
0.0240631103515625,
-0.06805419921875,
-0.00024628639221191406,
0.040069580078125,
0.0291900634765625,
0.01045989990234375,
0.060882568359375,
-0.005855560302734375,
0.0232086181640625,
-0.00540924072265625,
0.00806427001953125,
-0.0068206787109375,
-0.01435089111328125,
-0.0171356201171875,
-0.01617431640625,
-0.002994537353515625,
-0.002899169921875
]
] |
jonfd/electra-small-nordic | 2022-01-31T23:41:26.000Z | [
"transformers",
"pytorch",
"tf",
"electra",
"pretraining",
"is",
"no",
"sv",
"da",
"dataset:igc",
"dataset:ic3",
"dataset:jonfd/ICC",
"dataset:mc4",
"license:cc-by-4.0",
"endpoints_compatible",
"has_space",
"region:us"
] | null | jonfd | null | null | jonfd/electra-small-nordic | 1 | 12,721 | transformers | 2022-03-02T23:29:05 | ---
language:
- is
- no
- sv
- da
license: cc-by-4.0
datasets:
- igc
- ic3
- jonfd/ICC
- mc4
---
# Nordic ELECTRA-Small
This model was pretrained on the following corpora:
* The [Icelandic Gigaword Corpus](http://igc.arnastofnun.is/) (IGC)
* The Icelandic Common Crawl Corpus (IC3)
* The [Icelandic Crawled Corpus](https://huggingface.co/datasets/jonfd/ICC) (ICC)
* The [Multilingual Colossal Clean Crawled Corpus](https://huggingface.co/datasets/mc4) (mC4) - Icelandic, Norwegian, Swedish and Danish text obtained from .is, .no, .se and .dk domains, respectively
The total size of the corpus after document-level deduplication and filtering was 14.82B tokens, split equally between the four languages. The model was trained using a WordPiece tokenizer with a vocabulary size of 96,105 for one million steps with a batch size of 256, and otherwise with default settings.
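As a minimal usage sketch (illustrative, not part of the training description above), the pretrained checkpoint can typically be loaded with the Hugging Face Transformers library:
```python
from transformers import AutoTokenizer, AutoModel

# Load the pretrained Nordic ELECTRA-Small checkpoint.
tokenizer = AutoTokenizer.from_pretrained("jonfd/electra-small-nordic")
model = AutoModel.from_pretrained("jonfd/electra-small-nordic")

# Encode an example Icelandic sentence ("This is a test.") and run a forward pass.
inputs = tokenizer("Þetta er prófun.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```
For downstream tasks such as classification or tagging, the checkpoint would normally be fine-tuned rather than used as-is.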
# Acknowledgments
This research was supported with Cloud TPUs from Google's TPU Research Cloud (TRC).
This project was funded by the Language Technology Programme for Icelandic 2019-2023. The programme, which is managed and coordinated by [Almannarómur](https://almannaromur.is/), is funded by the Icelandic Ministry of Education, Science and Culture. | 1,226 | [
[
-0.024688720703125,
-0.0273895263671875,
0.023834228515625,
-0.00504302978515625,
-0.043792724609375,
0.00608062744140625,
-0.01531219482421875,
-0.031768798828125,
0.0321044921875,
0.0401611328125,
-0.0174102783203125,
-0.03717041015625,
-0.026580810546875,
0.040008544921875,
-0.0103912353515625,
0.059295654296875,
-0.020355224609375,
0.0106201171875,
-0.0196380615234375,
-0.0204620361328125,
-0.0164642333984375,
-0.043548583984375,
-0.0364990234375,
-0.02618408203125,
0.054534912109375,
0.039642333984375,
0.01079559326171875,
0.0244293212890625,
0.0173187255859375,
0.0171051025390625,
-0.0270233154296875,
0.01226806640625,
-0.045013427734375,
-0.02581787109375,
-0.0188446044921875,
-0.0281982421875,
-0.018280029296875,
-0.0028324127197265625,
0.04583740234375,
0.041351318359375,
-0.01222991943359375,
0.043060302734375,
0.0099334716796875,
0.0367431640625,
-0.0201873779296875,
-0.0053558349609375,
-0.055999755859375,
-0.02490234375,
-0.0321044921875,
0.01528167724609375,
-0.03131103515625,
-0.0014362335205078125,
-0.01473236083984375,
-0.051300048828125,
0.021240234375,
0.035308837890625,
0.07073974609375,
0.00510406494140625,
-0.03521728515625,
-0.026275634765625,
-0.03021240234375,
0.0697021484375,
-0.03668212890625,
0.04913330078125,
0.047393798828125,
-0.0159912109375,
0.00418853759765625,
-0.06768798828125,
-0.0572509765625,
0.006801605224609375,
-0.006458282470703125,
-0.0016679763793945312,
-0.031890869140625,
-0.01751708984375,
0.0076751708984375,
0.006908416748046875,
-0.034942626953125,
0.031524658203125,
-0.07757568359375,
-0.01027679443359375,
0.0303497314453125,
0.0035228729248046875,
-0.0032806396484375,
-0.03167724609375,
-0.022552490234375,
-0.01357269287109375,
-0.06781005859375,
-0.016815185546875,
0.068359375,
0.0340576171875,
-0.02752685546875,
0.0577392578125,
0.01145172119140625,
0.038848876953125,
0.01111602783203125,
0.0006718635559082031,
0.0509033203125,
-0.033721923828125,
-0.00809478759765625,
0.019378662109375,
0.0556640625,
-0.01143646240234375,
-0.0123748779296875,
-0.00850677490234375,
-0.01137542724609375,
0.0091400146484375,
0.03497314453125,
-0.047271728515625,
0.0035190582275390625,
0.008056640625,
-0.041748046875,
-0.024932861328125,
-0.01329803466796875,
-0.052154541015625,
0.0016908645629882812,
-0.04815673828125,
0.037445068359375,
-0.033203125,
-0.037261962890625,
0.019134521484375,
0.005146026611328125,
0.01201629638671875,
-0.00676727294921875,
-0.05108642578125,
0.0188140869140625,
0.051910400390625,
0.065673828125,
-0.022918701171875,
-0.0263519287109375,
-0.0157928466796875,
-0.0007238388061523438,
-0.02166748046875,
0.06536865234375,
-0.0177459716796875,
-0.0323486328125,
-0.006134033203125,
0.011474609375,
-0.04437255859375,
-0.0283355712890625,
0.0589599609375,
-0.051422119140625,
0.0311737060546875,
-0.00963592529296875,
-0.04937744140625,
-0.0294952392578125,
0.01422119140625,
-0.06610107421875,
0.0809326171875,
0.017120361328125,
-0.09051513671875,
0.0272369384765625,
-0.04180908203125,
-0.0200042724609375,
0.0207977294921875,
-0.005191802978515625,
-0.0264739990234375,
0.0178375244140625,
-0.00852203369140625,
0.0291748046875,
-0.023834228515625,
0.0092620849609375,
-0.01120758056640625,
-0.0263214111328125,
-0.0246734619140625,
-0.00225067138671875,
0.04736328125,
0.033935546875,
-0.015472412109375,
-0.0008673667907714844,
-0.07196044921875,
-0.02349853515625,
-0.006465911865234375,
-0.0238494873046875,
-0.048248291015625,
-0.0212860107421875,
0.0156097412109375,
0.032318115234375,
0.01467132568359375,
-0.067626953125,
-0.00013136863708496094,
-0.0279693603515625,
0.0260467529296875,
0.044952392578125,
-0.01092529296875,
0.00826263427734375,
-0.0267486572265625,
0.057861328125,
0.01303863525390625,
-0.0218505859375,
0.01812744140625,
-0.02667236328125,
-0.0450439453125,
-0.050048828125,
0.052490234375,
0.032196044921875,
-0.0765380859375,
0.0113983154296875,
-0.04376220703125,
-0.04559326171875,
-0.064208984375,
0.00379180908203125,
0.025604248046875,
0.0308685302734375,
0.01462554931640625,
0.00661468505859375,
-0.04754638671875,
-0.07220458984375,
-0.005031585693359375,
-0.0223541259765625,
-0.004138946533203125,
0.0248565673828125,
0.06884765625,
-0.0205841064453125,
0.042755126953125,
-0.00988006591796875,
-0.0235748291015625,
-0.0247039794921875,
0.023956298828125,
0.042266845703125,
0.0266265869140625,
0.021514892578125,
-0.07196044921875,
-0.036651611328125,
0.00299835205078125,
-0.0212554931640625,
0.0008854866027832031,
0.0147552490234375,
0.0007786750793457031,
0.027862548828125,
0.04248046875,
-0.040924072265625,
0.011810302734375,
0.060760498046875,
-0.044097900390625,
0.049072265625,
-0.021820068359375,
-0.0002911090850830078,
-0.09710693359375,
0.0379638671875,
-0.0256805419921875,
-0.005298614501953125,
-0.038604736328125,
-0.0025882720947265625,
0.004451751708984375,
0.002033233642578125,
-0.0714111328125,
0.0684814453125,
-0.028778076171875,
0.00931549072265625,
0.01053619384765625,
-0.0089874267578125,
-0.00659942626953125,
0.0272369384765625,
0.00270843505859375,
0.07012939453125,
0.0124664306640625,
-0.055419921875,
-0.0223236083984375,
0.037994384765625,
-0.045074462890625,
0.037353515625,
-0.053985595703125,
0.022247314453125,
-0.0086822509765625,
-0.0103302001953125,
-0.06243896484375,
-0.01258087158203125,
-0.02301025390625,
-0.0169677734375,
0.024932861328125,
-0.0119171142578125,
-0.0626220703125,
-0.038299560546875,
-0.0179443359375,
0.0369873046875,
0.03399658203125,
-0.05517578125,
0.059478759765625,
0.03485107421875,
-0.00411224365234375,
-0.02587890625,
-0.042694091796875,
-0.00270843505859375,
-0.045013427734375,
-0.0487060546875,
0.027984619140625,
-0.0004112720489501953,
-0.0220489501953125,
0.001071929931640625,
0.01763916015625,
0.0094757080078125,
-0.0030002593994140625,
0.00921630859375,
0.0280303955078125,
0.005786895751953125,
0.02020263671875,
-0.00473785400390625,
0.013702392578125,
-0.0176239013671875,
-0.026397705078125,
0.07110595703125,
-0.0208282470703125,
-0.00981903076171875,
-0.0194549560546875,
0.0148468017578125,
0.042022705078125,
-0.00695037841796875,
0.0771484375,
0.04644775390625,
-0.0174560546875,
0.02239990234375,
-0.060821533203125,
0.0074005126953125,
-0.032958984375,
0.0118408203125,
-0.04669189453125,
-0.0794677734375,
0.051361083984375,
0.0063018798828125,
0.007030487060546875,
0.06707763671875,
0.042755126953125,
0.0038909912109375,
0.059722900390625,
0.044403076171875,
0.00731658935546875,
0.00965118408203125,
-0.042449951171875,
0.0031871795654296875,
-0.08709716796875,
-0.0270233154296875,
-0.055206298828125,
-0.00687408447265625,
-0.07318115234375,
-0.0250396728515625,
0.0194549560546875,
-0.00333404541015625,
-0.022186279296875,
0.03228759765625,
-0.01409912109375,
0.01507568359375,
0.033935546875,
-0.007289886474609375,
0.0014324188232421875,
0.02587890625,
-0.057708740234375,
0.00176239013671875,
-0.07110595703125,
-0.05194091796875,
0.0830078125,
0.0287322998046875,
0.026123046875,
0.001636505126953125,
0.07763671875,
0.005229949951171875,
0.00992584228515625,
-0.038299560546875,
0.0163421630859375,
-0.04150390625,
-0.0572509765625,
-0.00740814208984375,
-0.039031982421875,
-0.09063720703125,
0.0263519287109375,
-0.004608154296875,
-0.04290771484375,
0.03167724609375,
0.0034942626953125,
-0.0161590576171875,
0.043182373046875,
-0.040496826171875,
0.05340576171875,
0.0100250244140625,
-0.031982421875,
-0.02862548828125,
-0.01312255859375,
0.0076751708984375,
-0.03472900390625,
0.05316162109375,
0.0019702911376953125,
-0.0071563720703125,
0.09222412109375,
-0.01197052001953125,
0.05804443359375,
-0.0010309219360351562,
0.0007519721984863281,
0.0272064208984375,
-0.0035190582275390625,
0.0360107421875,
-0.026702880859375,
-0.02105712890625,
0.051605224609375,
0.01678466796875,
-0.02947998046875,
0.028411865234375,
0.049285888671875,
-0.06658935546875,
-0.008453369140625,
-0.03375244140625,
-0.01323699951171875,
0.006374359130859375,
0.035552978515625,
0.045684814453125,
0.025115966796875,
-0.0269012451171875,
0.032989501953125,
0.050750732421875,
-0.0110931396484375,
0.031982421875,
0.0445556640625,
-0.0078887939453125,
-0.042938232421875,
0.046722412109375,
0.0092926025390625,
0.0012292861938476562,
0.00519561767578125,
0.01038360595703125,
-0.03125,
-0.04010009765625,
-0.0236053466796875,
0.054046630859375,
-0.0289154052734375,
-0.01416015625,
-0.07452392578125,
0.019073486328125,
-0.0318603515625,
0.0012197494506835938,
-0.032745361328125,
-0.035675048828125,
-0.049713134765625,
-0.031219482421875,
0.0268707275390625,
0.057037353515625,
-0.01221466064453125,
0.0252227783203125,
-0.04144287109375,
0.0190277099609375,
0.006687164306640625,
0.022674560546875,
-0.005401611328125,
-0.057769775390625,
-0.0197296142578125,
-0.006488800048828125,
0.003787994384765625,
-0.0270538330078125,
0.0517578125,
0.010833740234375,
0.033935546875,
0.0038127899169921875,
-0.012237548828125,
0.034912109375,
-0.0645751953125,
0.08251953125,
0.0311279296875,
-0.042510986328125,
0.0027904510498046875,
-0.030120849609375,
0.032562255859375,
0.062255859375,
0.0300445556640625,
-0.035919189453125,
-0.0139923095703125,
-0.0880126953125,
-0.0885009765625,
0.041595458984375,
0.011993408203125,
0.0248565673828125,
-0.0174560546875,
0.022247314453125,
0.032958984375,
0.022430419921875,
-0.0274505615234375,
0.003520965576171875,
0.0034084320068359375,
-0.017181396484375,
-0.0193634033203125,
-0.034637451171875,
-0.0021915435791015625,
-0.01482391357421875,
0.07977294921875,
-0.00255584716796875,
0.025115966796875,
0.0055694580078125,
-0.01238250732421875,
-0.004913330078125,
0.0188140869140625,
0.053619384765625,
0.05853271484375,
-0.00865936279296875,
0.01242828369140625,
0.0261688232421875,
-0.06988525390625,
0.00457763671875,
0.003078460693359375,
-0.0196990966796875,
0.021636962890625,
0.0257415771484375,
0.08709716796875,
0.017578125,
-0.03350830078125,
0.0227508544921875,
-0.026214599609375,
-0.04107666015625,
-0.04901123046875,
-0.0245208740234375,
0.01242828369140625,
-0.00974273681640625,
0.0255889892578125,
-0.0166015625,
-0.0093994140625,
-0.00838470458984375,
0.0157928466796875,
0.0050201416015625,
-0.018707275390625,
-0.046234130859375,
0.052490234375,
0.0010967254638671875,
-0.010528564453125,
0.04541015625,
-0.0239715576171875,
-0.034515380859375,
0.0262451171875,
0.042266845703125,
0.06231689453125,
-0.00493621826171875,
0.015960693359375,
0.051910400390625,
0.0209503173828125,
-0.024566650390625,
0.05621337890625,
0.031585693359375,
-0.046600341796875,
-0.03790283203125,
-0.0804443359375,
-0.00446319580078125,
0.0296478271484375,
-0.048675537109375,
0.0386962890625,
-0.0016145706176757812,
-0.002162933349609375,
0.0018072128295898438,
0.0179901123046875,
-0.0997314453125,
0.00555419921875,
0.007656097412109375,
0.0792236328125,
-0.07415771484375,
0.05615234375,
0.06817626953125,
-0.01678466796875,
-0.06939697265625,
-0.037445068359375,
-0.016876220703125,
-0.0538330078125,
0.060089111328125,
0.02001953125,
-0.0016622543334960938,
0.015777587890625,
-0.032562255859375,
-0.0806884765625,
0.076904296875,
0.00182342529296875,
-0.051971435546875,
0.0055999755859375,
-0.003082275390625,
0.048004150390625,
-0.036224365234375,
-0.0021648406982421875,
0.038848876953125,
0.0285186767578125,
0.018341064453125,
-0.087890625,
-0.0237579345703125,
-0.0206756591796875,
-0.0038967132568359375,
0.0182952880859375,
-0.043975830078125,
0.066650390625,
0.013824462890625,
-0.008453369140625,
0.0012340545654296875,
0.040985107421875,
0.0292816162109375,
0.01023101806640625,
0.04931640625,
0.07757568359375,
0.06396484375,
0.0209197998046875,
0.084716796875,
-0.0289306640625,
0.00876617431640625,
0.0728759765625,
-0.0165863037109375,
0.07415771484375,
0.059356689453125,
-0.004787445068359375,
0.0433349609375,
0.054779052734375,
0.00531768798828125,
0.050567626953125,
0.0004680156707763672,
-0.004180908203125,
0.007106781005859375,
-0.0092620849609375,
-0.0125885009765625,
0.048431396484375,
0.033538818359375,
-0.015472412109375,
0.0086517333984375,
0.0211029052734375,
0.0272979736328125,
-0.01568603515625,
-0.0439453125,
0.06231689453125,
0.007083892822265625,
-0.04962158203125,
0.05621337890625,
-0.0104522705078125,
0.0572509765625,
-0.0369873046875,
0.0189208984375,
-0.0055999755859375,
0.0118560791015625,
0.0012569427490234375,
-0.0242767333984375,
0.009521484375,
0.013702392578125,
-0.007030487060546875,
-0.01099395751953125,
0.045135498046875,
-0.0340576171875,
-0.060699462890625,
0.00962066650390625,
0.0287628173828125,
0.03515625,
0.0196990966796875,
-0.038909912109375,
0.001659393310546875,
0.007965087890625,
-0.02789306640625,
0.0246429443359375,
0.0188140869140625,
-0.00711822509765625,
0.0166473388671875,
0.053741455078125,
0.02667236328125,
0.00611114501953125,
0.004077911376953125,
0.06317138671875,
-0.0214080810546875,
-0.051300048828125,
-0.040435791015625,
0.035797119140625,
-0.00788116455078125,
-0.03948974609375,
0.053802490234375,
0.054107666015625,
0.07672119140625,
-0.0087890625,
0.040496826171875,
-0.0117340087890625,
0.06573486328125,
-0.039398193359375,
0.03399658203125,
-0.0462646484375,
0.0033359527587890625,
-0.0013980865478515625,
-0.068603515625,
-0.00640869140625,
0.0313720703125,
-0.0188140869140625,
-0.0260009765625,
0.050262451171875,
0.032623291015625,
-0.021942138671875,
-0.0158233642578125,
0.01448822021484375,
0.0222930908203125,
0.00634002685546875,
-0.003910064697265625,
0.0304412841796875,
-0.024871826171875,
0.02447509765625,
-0.0187225341796875,
0.00649261474609375,
0.0009670257568359375,
-0.06817626953125,
-0.069580078125,
-0.04681396484375,
-0.0180816650390625,
-0.0279693603515625,
-0.01293182373046875,
0.07086181640625,
0.02783203125,
-0.0704345703125,
-0.02667236328125,
0.01122283935546875,
-0.0158843994140625,
-0.0020427703857421875,
-0.004703521728515625,
0.06378173828125,
-0.00994110107421875,
-0.033416748046875,
0.0250244140625,
-0.0208740234375,
0.001373291015625,
-0.01070404052734375,
0.00293731689453125,
-0.041839599609375,
0.00803375244140625,
0.036865234375,
0.00702667236328125,
-0.032318115234375,
-0.0210723876953125,
0.01220703125,
0.006832122802734375,
-0.00616455078125,
0.0401611328125,
-0.046234130859375,
0.0256500244140625,
0.03741455078125,
0.015655517578125,
0.054779052734375,
-0.0151214599609375,
0.02301025390625,
-0.0679931640625,
0.039031982421875,
0.023193359375,
0.034088134765625,
0.040740966796875,
-0.02783203125,
0.0452880859375,
0.00719451904296875,
-0.0350341796875,
-0.070556640625,
-0.0109100341796875,
-0.043212890625,
-0.01387786865234375,
0.10589599609375,
-0.006214141845703125,
-0.034820556640625,
-0.00946044921875,
-0.00881195068359375,
0.0126190185546875,
-0.0131378173828125,
0.049896240234375,
0.06768798828125,
0.01605224609375,
0.01361846923828125,
-0.043701171875,
0.036834716796875,
0.0095672607421875,
-0.031768798828125,
-0.00980377197265625,
0.02392578125,
0.031524658203125,
0.030303955078125,
0.036102294921875,
-0.0172576904296875,
0.00135040283203125,
-0.006847381591796875,
0.0201873779296875,
-0.00917816162109375,
-0.015625,
-0.0364990234375,
0.0033321380615234375,
-0.00905609130859375,
-0.008392333984375
]
] |
cross-encoder/nli-roberta-base | 2021-08-05T08:41:05.000Z | [
"transformers",
"pytorch",
"jax",
"roberta",
"text-classification",
"roberta-base",
"zero-shot-classification",
"en",
"dataset:multi_nli",
"dataset:snli",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-classification | cross-encoder | null | null | cross-encoder/nli-roberta-base | 11 | 12,685 | transformers | 2022-03-02T23:29:05 | ---
language: en
pipeline_tag: zero-shot-classification
tags:
- roberta-base
datasets:
- multi_nli
- snli
metrics:
- accuracy
license: apache-2.0
---
# Cross-Encoder for Natural Language Inference
This model was trained using the [SentenceTransformers](https://sbert.net) [Cross-Encoder](https://www.sbert.net/examples/applications/cross-encoder/README.html) class.
## Training Data
The model was trained on the [SNLI](https://nlp.stanford.edu/projects/snli/) and [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) datasets. For a given sentence pair, it will output three scores corresponding to the labels: contradiction, entailment, neutral.
## Performance
For evaluation results, see [SBERT.net - Pretrained Cross-Encoder](https://www.sbert.net/docs/pretrained_cross-encoders.html#nli).
## Usage
Pre-trained models can be used like this:
```python
from sentence_transformers import CrossEncoder
model = CrossEncoder('cross-encoder/nli-roberta-base')
scores = model.predict([('A man is eating pizza', 'A man eats something'), ('A black race car starts up in front of a crowd of people.', 'A man is driving down a lonely road.')])
# Convert scores to labels
label_mapping = ['contradiction', 'entailment', 'neutral']
labels = [label_mapping[score_max] for score_max in scores.argmax(axis=1)]
```
## Usage with Transformers AutoModel
You can also use the model directly with the Transformers library (without the SentenceTransformers library):
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
model = AutoModelForSequenceClassification.from_pretrained('cross-encoder/nli-roberta-base')
tokenizer = AutoTokenizer.from_pretrained('cross-encoder/nli-roberta-base')
features = tokenizer(['A man is eating pizza', 'A black race car starts up in front of a crowd of people.'], ['A man eats something', 'A man is driving down a lonely road.'], padding=True, truncation=True, return_tensors="pt")
model.eval()
with torch.no_grad():
    scores = model(**features).logits
    label_mapping = ['contradiction', 'entailment', 'neutral']
    labels = [label_mapping[score_max] for score_max in scores.argmax(dim=1)]
    print(labels)
```
## Zero-Shot Classification
This model can also be used for zero-shot classification:
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification", model='cross-encoder/nli-roberta-base')
sent = "Apple just announced the newest iPhone X"
candidate_labels = ["technology", "sports", "politics"]
res = classifier(sent, candidate_labels)
print(res)
``` | 2,559 | [
[
-0.0158538818359375,
-0.056640625,
0.02178955078125,
0.0185546875,
0.0016841888427734375,
-0.00757598876953125,
-0.00975799560546875,
-0.0241546630859375,
0.0131988525390625,
0.035186767578125,
-0.042083740234375,
-0.040130615234375,
-0.04193115234375,
0.0161895751953125,
-0.0433349609375,
0.08953857421875,
-0.0032825469970703125,
0.002811431884765625,
-0.01389312744140625,
-0.00885772705078125,
-0.01629638671875,
-0.032318115234375,
-0.031707763671875,
-0.041717529296875,
0.0260009765625,
0.01047515869140625,
0.043060302734375,
0.0276947021484375,
0.009674072265625,
0.02886962890625,
0.00537109375,
-0.01512908935546875,
-0.014434814453125,
-0.00682830810546875,
-0.0007100105285644531,
-0.04425048828125,
-0.005580902099609375,
0.016754150390625,
0.024078369140625,
0.031524658203125,
-0.0004088878631591797,
0.020416259765625,
-0.0130615234375,
0.01226806640625,
-0.046875,
0.00611114501953125,
-0.0389404296875,
0.0157318115234375,
0.00534820556640625,
-0.002590179443359375,
-0.036102294921875,
-0.026611328125,
0.0073699951171875,
-0.037109375,
0.0258331298828125,
0.00457763671875,
0.10150146484375,
0.0308837890625,
-0.0230712890625,
-0.031890869140625,
-0.039703369140625,
0.07232666015625,
-0.0765380859375,
0.0209808349609375,
0.0169830322265625,
0.00125885009765625,
0.00777435302734375,
-0.056793212890625,
-0.0699462890625,
-0.01445770263671875,
-0.015899658203125,
0.032989501953125,
-0.024993896484375,
-0.007488250732421875,
0.0271759033203125,
0.0304107666015625,
-0.061126708984375,
-0.0017671585083007812,
-0.0308837890625,
-0.01165771484375,
0.055450439453125,
0.0089111328125,
0.019378662109375,
-0.034088134765625,
-0.0286102294921875,
-0.0105743408203125,
-0.009765625,
0.00853729248046875,
0.0219573974609375,
0.0023822784423828125,
-0.021728515625,
0.0648193359375,
-0.025787353515625,
0.0648193359375,
0.0172882080078125,
-0.007472991943359375,
0.05462646484375,
-0.0272216796875,
-0.033843994140625,
0.02239990234375,
0.078857421875,
0.029632568359375,
0.0256195068359375,
-0.01239013671875,
-0.00685882568359375,
0.03021240234375,
-0.004901885986328125,
-0.053131103515625,
-0.0172119140625,
0.02789306640625,
-0.024444580078125,
-0.0258026123046875,
0.007152557373046875,
-0.059356689453125,
0.00231170654296875,
-0.00897979736328125,
0.060302734375,
-0.041748046875,
0.0119781494140625,
0.027984619140625,
-0.0271759033203125,
0.034088134765625,
-0.016937255859375,
-0.059600830078125,
0.005100250244140625,
0.0241851806640625,
0.058807373046875,
0.0158233642578125,
-0.037200927734375,
-0.0270233154296875,
0.003162384033203125,
0.0022449493408203125,
0.037109375,
-0.03179931640625,
-0.00501251220703125,
-0.01297760009765625,
0.006740570068359375,
-0.025177001953125,
-0.0244598388671875,
0.05340576171875,
-0.0187530517578125,
0.05035400390625,
0.0272369384765625,
-0.062225341796875,
-0.0265045166015625,
0.021575927734375,
-0.0307464599609375,
0.086669921875,
0.0078582763671875,
-0.06341552734375,
0.012298583984375,
-0.04412841796875,
-0.038604736328125,
-0.0218353271484375,
-0.0039215087890625,
-0.05035400390625,
0.005950927734375,
0.030303955078125,
0.0321044921875,
-0.019622802734375,
0.03729248046875,
-0.02264404296875,
-0.02545166015625,
0.025543212890625,
-0.04180908203125,
0.08831787109375,
0.00772857666015625,
-0.041046142578125,
0.0170745849609375,
-0.060821533203125,
0.0102081298828125,
0.0117950439453125,
-0.023406982421875,
-0.0018463134765625,
-0.0230712890625,
0.014739990234375,
0.019775390625,
0.0007271766662597656,
-0.058624267578125,
-0.001895904541015625,
-0.03631591796875,
0.04766845703125,
0.0261077880859375,
-0.0022735595703125,
0.0264739990234375,
-0.019439697265625,
0.02215576171875,
-0.0020389556884765625,
0.005767822265625,
-0.007534027099609375,
-0.05181884765625,
-0.0760498046875,
-0.0005593299865722656,
0.035858154296875,
0.0660400390625,
-0.0626220703125,
0.07354736328125,
-0.0167083740234375,
-0.044647216796875,
-0.056610107421875,
-0.019134521484375,
0.0188140869140625,
0.043853759765625,
0.04681396484375,
-0.00391387939453125,
-0.055694580078125,
-0.055938720703125,
-0.0290679931640625,
-0.0019006729125976562,
-0.01325225830078125,
0.0024871826171875,
0.061126708984375,
-0.0283660888671875,
0.081787109375,
-0.03887939453125,
-0.0157623291015625,
-0.039093017578125,
0.030303955078125,
0.036102294921875,
0.0528564453125,
0.0279693603515625,
-0.044647216796875,
-0.0279388427734375,
-0.021209716796875,
-0.062347412109375,
-0.01177978515625,
-0.0284423828125,
0.0019063949584960938,
0.00994873046875,
0.0226593017578125,
-0.0443115234375,
0.05084228515625,
0.034088134765625,
-0.03631591796875,
0.0401611328125,
-0.01045989990234375,
-0.00005745887756347656,
-0.08056640625,
-0.01003265380859375,
0.0173797607421875,
-0.0084075927734375,
-0.057220458984375,
-0.0108642578125,
-0.01056671142578125,
-0.005496978759765625,
-0.029632568359375,
0.040924072265625,
-0.0209197998046875,
0.004283905029296875,
0.0008249282836914062,
0.0089874267578125,
0.0173187255859375,
0.04266357421875,
0.021087646484375,
0.03955078125,
0.058258056640625,
-0.04058837890625,
0.034637451171875,
0.0231475830078125,
-0.031219482421875,
0.021636962890625,
-0.063720703125,
-0.0072174072265625,
-0.012603759765625,
0.0185394287109375,
-0.0679931640625,
-0.01242828369140625,
0.030670166015625,
-0.051055908203125,
-0.0003330707550048828,
0.01419830322265625,
-0.031768798828125,
-0.0355224609375,
-0.00447845458984375,
0.0244293212890625,
0.03485107421875,
-0.03314208984375,
0.061767578125,
0.01074981689453125,
0.02880859375,
-0.037139892578125,
-0.08587646484375,
0.0014467239379882812,
-0.01436614990234375,
-0.035247802734375,
0.0223541259765625,
0.00197601318359375,
0.0033626556396484375,
0.012481689453125,
0.003429412841796875,
-0.0158233642578125,
-0.00445556640625,
0.0173797607421875,
0.0211334228515625,
-0.01947021484375,
-0.00269317626953125,
-0.01168060302734375,
-0.0161285400390625,
0.012481689453125,
-0.0234832763671875,
0.04193115234375,
-0.019989013671875,
-0.02044677734375,
-0.049163818359375,
0.0186614990234375,
0.020355224609375,
-0.019134521484375,
0.054229736328125,
0.073974609375,
-0.0262908935546875,
-0.0006847381591796875,
-0.036407470703125,
-0.01287078857421875,
-0.030303955078125,
0.034149169921875,
-0.023681640625,
-0.049560546875,
0.0250396728515625,
0.0196380615234375,
-0.0146636962890625,
0.045013427734375,
0.03271484375,
0.002956390380859375,
0.0693359375,
0.0272216796875,
-0.0206451416015625,
0.0217132568359375,
-0.043853759765625,
0.0252685546875,
-0.049468994140625,
-0.017913818359375,
-0.040313720703125,
-0.016815185546875,
-0.0433349609375,
-0.026947021484375,
0.01097869873046875,
0.005565643310546875,
-0.0235137939453125,
0.03607177734375,
-0.039947509765625,
0.034820556640625,
0.05902099609375,
0.00623321533203125,
0.004405975341796875,
-0.0016469955444335938,
-0.0042266845703125,
0.00530242919921875,
-0.06396484375,
-0.03131103515625,
0.06597900390625,
0.020965576171875,
0.0570068359375,
-0.01177978515625,
0.064208984375,
-0.005588531494140625,
0.01534271240234375,
-0.0540771484375,
0.03472900390625,
-0.02215576171875,
-0.058563232421875,
-0.01837158203125,
-0.036102294921875,
-0.06494140625,
0.01374053955078125,
-0.031463623046875,
-0.056884765625,
0.0184326171875,
-0.01611328125,
-0.03594970703125,
0.0244293212890625,
-0.061981201171875,
0.09307861328125,
-0.030975341796875,
-0.0172576904296875,
0.0122222900390625,
-0.0567626953125,
0.02752685546875,
0.01165008544921875,
0.00376129150390625,
-0.012664794921875,
0.021575927734375,
0.0601806640625,
-0.01165008544921875,
0.0732421875,
-0.0037288665771484375,
0.0175628662109375,
0.0303802490234375,
-0.021575927734375,
0.0079498291015625,
0.006092071533203125,
-0.0272216796875,
0.0295867919921875,
-0.0098419189453125,
-0.0272216796875,
-0.0460205078125,
0.03509521484375,
-0.07049560546875,
-0.026763916015625,
-0.040618896484375,
-0.034088134765625,
0.015777587890625,
0.016265869140625,
0.052703857421875,
0.036865234375,
-0.0009403228759765625,
0.005367279052734375,
0.026458740234375,
-0.029998779296875,
0.050750732421875,
0.009185791015625,
-0.01018524169921875,
-0.035369873046875,
0.0628662109375,
-0.004695892333984375,
0.01467132568359375,
0.033905029296875,
0.021575927734375,
-0.039825439453125,
-0.0151519775390625,
-0.0296783447265625,
0.019683837890625,
-0.04022216796875,
-0.01385498046875,
-0.04974365234375,
-0.04620361328125,
-0.046539306640625,
-0.009613037109375,
-0.01555633544921875,
-0.02484130859375,
-0.03729248046875,
-0.007419586181640625,
0.02587890625,
0.0369873046875,
-0.0025691986083984375,
0.0298919677734375,
-0.051483154296875,
0.034698486328125,
0.010986328125,
0.008880615234375,
-0.0078582763671875,
-0.052520751953125,
-0.00815582275390625,
-0.0018520355224609375,
-0.0278167724609375,
-0.0736083984375,
0.049346923828125,
0.0237579345703125,
0.048126220703125,
0.017791748046875,
0.0163421630859375,
0.052581787109375,
-0.023834228515625,
0.05633544921875,
0.026458740234375,
-0.09503173828125,
0.04461669921875,
0.0112762451171875,
0.034149169921875,
0.033599853515625,
0.03558349609375,
-0.053009033203125,
-0.03863525390625,
-0.04315185546875,
-0.067626953125,
0.053009033203125,
0.033905029296875,
0.007602691650390625,
-0.00981903076171875,
0.0158233642578125,
0.0031719207763671875,
0.01381683349609375,
-0.10382080078125,
-0.036468505859375,
-0.05206298828125,
-0.041961669921875,
-0.0253753662109375,
0.003582000732421875,
0.00763702392578125,
-0.04364013671875,
0.0654296875,
0.0015850067138671875,
0.0268096923828125,
0.043060302734375,
-0.015289306640625,
0.023651123046875,
0.026763916015625,
0.041595458984375,
0.016693115234375,
-0.0207977294921875,
0.0065155029296875,
0.025177001953125,
-0.0178680419921875,
0.0162811279296875,
0.016387939453125,
-0.0310516357421875,
0.01629638671875,
0.044036865234375,
0.09710693359375,
0.0035305023193359375,
-0.03533935546875,
0.04022216796875,
0.0019235610961914062,
-0.0198516845703125,
-0.034149169921875,
0.005283355712890625,
-0.0019474029541015625,
0.021881103515625,
0.0174560546875,
0.0158233642578125,
0.007061004638671875,
-0.046875,
0.02435302734375,
0.01178741455078125,
-0.040924072265625,
-0.015472412109375,
0.062347412109375,
0.00197601318359375,
-0.037078857421875,
0.05120849609375,
-0.022247314453125,
-0.0531005859375,
0.04901123046875,
0.047119140625,
0.074951171875,
0.0003979206085205078,
0.028045654296875,
0.050140380859375,
0.0287322998046875,
-0.0018777847290039062,
0.0086212158203125,
0.0015411376953125,
-0.0750732421875,
-0.026397705078125,
-0.053619384765625,
-0.004245758056640625,
0.00836944580078125,
-0.05487060546875,
0.01103973388671875,
-0.01611328125,
-0.004817962646484375,
0.0084228515625,
-0.0176544189453125,
-0.049163818359375,
0.0238037109375,
0.01502227783203125,
0.06494140625,
-0.08258056640625,
0.0694580078125,
0.03826904296875,
-0.05377197265625,
-0.06414794921875,
0.014434814453125,
-0.0143585205078125,
-0.053741455078125,
0.05157470703125,
0.0367431640625,
0.008270263671875,
0.0110626220703125,
-0.0304718017578125,
-0.053680419921875,
0.0721435546875,
0.005859375,
-0.0330810546875,
-0.0056304931640625,
0.0232696533203125,
0.04638671875,
-0.0330810546875,
0.056640625,
0.0521240234375,
0.036468505859375,
-0.0034122467041015625,
-0.050537109375,
0.0037364959716796875,
-0.01059722900390625,
-0.005123138427734375,
-0.010223388671875,
-0.027099609375,
0.0689697265625,
-0.0208892822265625,
0.00013077259063720703,
0.014434814453125,
0.053466796875,
0.0236053466796875,
0.039520263671875,
0.0382080078125,
0.0638427734375,
0.043426513671875,
-0.018585205078125,
0.072509765625,
-0.016876220703125,
0.05462646484375,
0.0821533203125,
-0.015106201171875,
0.06695556640625,
0.03533935546875,
-0.0099334716796875,
0.054718017578125,
0.048248291015625,
-0.03179931640625,
0.03826904296875,
0.0215606689453125,
-0.0077056884765625,
-0.0196075439453125,
0.01082611083984375,
-0.022674560546875,
0.059112548828125,
0.006793975830078125,
-0.030914306640625,
-0.02001953125,
0.01171112060546875,
-0.0171661376953125,
0.0010080337524414062,
-0.009490966796875,
0.0430908203125,
-0.01074981689453125,
-0.04913330078125,
0.05224609375,
0.005615234375,
0.0709228515625,
-0.030303955078125,
0.006488800048828125,
0.0027866363525390625,
0.019439697265625,
-0.020904541015625,
-0.06842041015625,
0.024566650390625,
-0.005138397216796875,
-0.01052093505859375,
-0.003063201904296875,
0.032379150390625,
-0.052703857421875,
-0.06304931640625,
0.037689208984375,
0.01922607421875,
0.019012451171875,
0.00782012939453125,
-0.076904296875,
-0.004924774169921875,
0.0167083740234375,
-0.01448822021484375,
-0.00926971435546875,
0.03076171875,
0.024261474609375,
0.037689208984375,
0.03936767578125,
-0.006984710693359375,
0.0275115966796875,
0.01812744140625,
0.0435791015625,
-0.06439208984375,
-0.02764892578125,
-0.06976318359375,
0.048370361328125,
-0.012054443359375,
-0.043182373046875,
0.0662841796875,
0.062164306640625,
0.07379150390625,
-0.0209808349609375,
0.052581787109375,
-0.0161895751953125,
0.0250091552734375,
-0.046661376953125,
0.0455322265625,
-0.0435791015625,
0.004436492919921875,
-0.0068817138671875,
-0.050018310546875,
-0.038543701171875,
0.06915283203125,
-0.0305633544921875,
0.01074981689453125,
0.0477294921875,
0.07281494140625,
-0.00360107421875,
0.00838470458984375,
0.010406494140625,
0.0260009765625,
0.00518798828125,
0.050811767578125,
0.057220458984375,
-0.069580078125,
0.050079345703125,
-0.037353515625,
-0.0022068023681640625,
-0.002063751220703125,
-0.052703857421875,
-0.06927490234375,
-0.033233642578125,
-0.038787841796875,
-0.0279693603515625,
-0.005435943603515625,
0.056793212890625,
0.057830810546875,
-0.08056640625,
-0.0214996337890625,
-0.021209716796875,
0.018035888671875,
-0.020111083984375,
-0.0267791748046875,
0.01535797119140625,
-0.0211944580078125,
-0.061676025390625,
0.02130126953125,
-0.0010461807250976562,
0.0033588409423828125,
-0.0075836181640625,
-0.00894927978515625,
-0.04534912109375,
-0.0002434253692626953,
0.0311126708984375,
0.01464080810546875,
-0.07720947265625,
-0.02642822265625,
-0.003582000732421875,
-0.01131439208984375,
0.01439666748046875,
0.030609130859375,
-0.0640869140625,
0.016265869140625,
0.03253173828125,
0.04852294921875,
0.053924560546875,
-0.00919342041015625,
0.027374267578125,
-0.054229736328125,
0.0077056884765625,
0.0116119384765625,
0.0311737060546875,
0.02435302734375,
-0.015716552734375,
0.036865234375,
0.03302001953125,
-0.042266845703125,
-0.04571533203125,
0.005279541015625,
-0.07281494140625,
-0.0282440185546875,
0.07568359375,
-0.00875091552734375,
-0.039093017578125,
-0.01009368896484375,
-0.007537841796875,
0.044281005859375,
-0.021697998046875,
0.046051025390625,
0.032928466796875,
-0.01812744140625,
-0.019287109375,
-0.03594970703125,
0.0178985595703125,
0.041748046875,
-0.0606689453125,
-0.0229034423828125,
0.009857177734375,
0.033843994140625,
0.0279388427734375,
0.0264739990234375,
0.01178741455078125,
-0.00014317035675048828,
0.01739501953125,
0.0294647216796875,
0.00719451904296875,
-0.006923675537109375,
-0.03729248046875,
0.0122528076171875,
-0.045135498046875,
-0.0438232421875
]
] |
foduucom/stockmarket-pattern-detection-yolov8 | 2023-09-11T10:13:51.000Z | [
"ultralytics",
"tensorboard",
"v8",
"ultralyticsplus",
"yolov8",
"yolo",
"vision",
"object-detection",
"pytorch",
"finance",
"stock market",
"candlesticks",
"pattern recognition",
"option trading",
"chart reader",
"en",
"model-index",
"has_space",
"region:us"
] | object-detection | foduucom | null | null | foduucom/stockmarket-pattern-detection-yolov8 | 38 | 12,639 | ultralytics | 2023-08-10T14:13:24 | ---
tags:
- ultralyticsplus
- yolov8
- ultralytics
- yolo
- vision
- object-detection
- pytorch
- finance
- stock market
- candlesticks
- pattern recognition
- option trading
- chart reader
library_name: ultralytics
library_version: 8.0.43
inference: false
model-index:
- name: foduucom/stockmarket-pattern-detection-yolov8
results:
- task:
type: object-detection
metrics:
- type: precision
value: 0.61355
name: mAP@0.5(box)
language:
- en
pipeline_tag: object-detection
---
<div align="center">
<img width="500" alt="foduucom/stockmarket-pattern-detection-yolov8" src="https://huggingface.co/foduucom/stockmarket-pattern-detection-yolov8/resolve/main/thumbnail.jpg">
</div>
# Model Card for YOLOv8s Stock Market Pattern Detection on Live Trading Video Data
## Model Summary
The YOLOv8s Stock Market Pattern Detection model is an object detection model based on the YOLO (You Only Look Once) framework. It is designed to detect various chart patterns in real-time stock market trading video data. The model aids traders and investors by automating the analysis of chart patterns, providing timely insights for informed decision-making. The model has been fine-tuned on a diverse dataset and achieved high accuracy in detecting and classifying stock market patterns in live trading scenarios.
## Model Details
### Model Description
The YOLOv8s Stock Market Pattern Detection model offers a transformative solution for traders and investors by enabling real-time detection of crucial chart patterns within live trading video data. As stock markets evolve rapidly, this model's capabilities empower users with timely insights, allowing them to make informed decisions with speed and accuracy.
The model seamlessly integrates into live trading systems, providing instant pattern detection and classification. By leveraging advanced bounding box techniques and pattern-specific feature extraction, the model excels in identifying patterns such as 'Head and shoulders bottom,' 'Head and shoulders top,' 'M_Head,' 'StockLine,' 'Triangle,' and 'W_Bottom.' This enables traders to optimize their strategies, automate trading decisions, and respond to market trends in real-time.
To facilitate integration into live trading systems or to inquire about customization, please contact us at info@foduu.com. Your collaboration and feedback are instrumental in refining and enhancing the model's performance in dynamic trading environments.
- **Developed by:** FODUU AI
- **Model type:** Object Detection
- **Task:** Stock Market Pattern Detection on Live Trading Video Data
The YOLOv8s Stock Market Pattern Detection model is designed to adapt to the fast-paced nature of live trading environments. Its ability to operate on real-time video data allows traders and investors to harness pattern-based insights without delay.
### Supported Labels
```
['Head and shoulders bottom', 'Head and shoulders top', 'M_Head', 'StockLine', 'Triangle', 'W_Bottom']
```
## Uses
### Direct Use
The YOLOv8s Stock Market Pattern Detection model can be directly integrated into live trading systems to provide real-time detection and classification of chart patterns. Traders can utilize the model's insights for timely decision-making.
### Downstream Use
The model's real-time capabilities can be leveraged to automate trading strategies, generate alerts for specific patterns, and enhance overall trading performance.
### Training data
The stock market pattern model was trained on a custom dataset of 9,000 annotated images for training and 800 for validation.
### Out-of-Scope Use
The model is not designed for unrelated object detection tasks or scenarios outside the scope of stock market pattern detection in live trading video data.
## Bias, Risks, and Limitations
The YOLOv8s Stock Market Pattern Detection model may exhibit some limitations and biases:
- Performance may be affected by variations in video quality, lighting conditions, and pattern complexity within live trading data.
- Rapid market fluctuations and noise in video data may impact the model's accuracy and responsiveness.
- Market-specific patterns or anomalies not well-represented in the training data may pose challenges for detection.
### Recommendations
Users should be aware of the model's limitations and potential biases. Thorough testing and validation within live trading simulations are advised before deploying the model in real trading environments.
## How to Get Started with the Model
To begin using the YOLOv8s Stock Market Pattern Detection model on live trading video data, follow these steps:
```bash
pip install ultralyticsplus==0.0.28 ultralytics==8.0.43
```
- Load model and perform real-time prediction:
```python
from ultralyticsplus import YOLO, render_result
import cv2
# load model
model = YOLO('foduucom/stockmarket-pattern-detection-yolov8')
# set model parameters
model.overrides['conf'] = 0.25 # NMS confidence threshold
model.overrides['iou'] = 0.45 # NMS IoU threshold
model.overrides['agnostic_nms'] = False # NMS class-agnostic
model.overrides['max_det'] = 1000 # maximum number of detections per image
# initialize video capture
# Open the video file
video_path = "path/to/your/video/file.mp4"
cap = cv2.VideoCapture(video_path)
# Loop through the video frames
while cap.isOpened():
    # Read a frame from the video
    success, frame = cap.read()

    if success:
        # Run YOLOv8 inference on the frame
        results = model(frame)

        # Visualize the results on the frame
        annotated_frame = results[0].plot()

        # Display the annotated frame
        cv2.imshow("YOLOv8 Inference", annotated_frame)

        # Break the loop if 'q' is pressed
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    else:
        # Break the loop if the end of the video is reached
        break
# Release the video capture object and close the display window
cap.release()
cv2.destroyAllWindows()
```
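For the downstream use described earlier (automating alerts for specific patterns), the sketch below shows one way to post-process detections. It assumes the standard Ultralytics YOLOv8 result API (`results[0].boxes`, `model.names`), which should be verified against the pinned `ultralytics==8.0.43` release; the 0.5 alert threshold is an illustrative choice, not a value from this card.
```python
def pattern_alerts(results, names, conf_threshold=0.5):
    """Yield (label, confidence) pairs for detections above an alert threshold."""
    boxes = results[0].boxes  # detections for the current frame
    for cls_id, conf in zip(boxes.cls.tolist(), boxes.conf.tolist()):
        if conf >= conf_threshold:
            yield names[int(cls_id)], float(conf)

# Example usage inside the frame loop above, after `results = model(frame)`:
# for label, conf in pattern_alerts(results, model.names):
#     print(f"ALERT: '{label}' detected with confidence {conf:.2f}")
```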
## Training Details
### Training Data
The model is trained on a diverse dataset containing stock market chart images with various chart patterns, capturing different market conditions and scenarios.
### Training Procedure
The training process involves extensive computation and is conducted over multiple epochs. The model's weights are adjusted to minimize detection loss and optimize performance for stock market pattern detection.
#### Metrics
- mAP@0.5 (box):
  - All patterns: 0.932
  - Individual patterns: varies based on pattern type
### Model Architecture and Objective
The YOLOv8s architecture incorporates modifications tailored to stock market pattern detection. It features a specialized backbone network, self-attention mechanisms, and pattern-specific feature extraction modules.
### Compute Infrastructure
#### Hardware
NVIDIA GeForce RTX 3080 card
#### Software
The model was trained and fine-tuned using a Jupyter Notebook environment.
## Model Card Contact
For inquiries and contributions, please contact us at info@foduu.com.
```bibtex
@ModelCard{
  author = {Nehul Agrawal and Pranjal Singh Thakur},
  title  = {YOLOv8s Stock Market Pattern Detection on Live Trading Video Data},
  year   = {2023}
}
``` | 7,249 | [
[
0.0021648406982421875,
-0.062744140625,
-0.0008592605590820312,
-0.02264404296875,
-0.049774169921875,
-0.01392364501953125,
0.0169677734375,
-0.0421142578125,
0.01456451416015625,
0.0310821533203125,
-0.034576416015625,
-0.053924560546875,
-0.032745361328125,
-0.0172271728515625,
-0.0308685302734375,
0.03936767578125,
0.0185546875,
0.01032257080078125,
0.0290069580078125,
-0.0147247314453125,
-0.046173095703125,
-0.01248931884765625,
-0.0198974609375,
0.0024623870849609375,
-0.0232086181640625,
0.0172576904296875,
0.0185546875,
0.08294677734375,
0.0205841064453125,
0.037353515625,
0.0009775161743164062,
0.0172882080078125,
-0.03369140625,
0.029205322265625,
0.0306396484375,
-0.046600341796875,
-0.05389404296875,
0.01113128662109375,
0.0308380126953125,
0.008026123046875,
-0.01214599609375,
0.038330078125,
-0.0169219970703125,
0.0132904052734375,
-0.039398193359375,
0.037353515625,
-0.039947509765625,
0.017913818359375,
-0.0352783203125,
-0.019317626953125,
-0.0279998779296875,
-0.001434326171875,
0.00217437744140625,
-0.056549072265625,
-0.005023956298828125,
0.03778076171875,
0.06463623046875,
-0.012969970703125,
-0.0197601318359375,
0.019317626953125,
-0.035919189453125,
0.07440185546875,
-0.08978271484375,
0.0186767578125,
0.02239990234375,
0.028961181640625,
-0.01568603515625,
-0.028778076171875,
-0.0231475830078125,
0.003047943115234375,
-0.006587982177734375,
-0.01641845703125,
-0.0059814453125,
-0.0301055908203125,
0.039215087890625,
0.021240234375,
-0.044036865234375,
-0.0110626220703125,
-0.0738525390625,
-0.0176239013671875,
0.053314208984375,
0.027557373046875,
0.01016998291015625,
-0.0166168212890625,
-0.0189971923828125,
0.006374359130859375,
-0.0220794677734375,
0.014404296875,
0.037841796875,
0.01137542724609375,
-0.0191650390625,
0.00804901123046875,
-0.018768310546875,
0.06378173828125,
0.026519775390625,
-0.044830322265625,
0.04949951171875,
0.0006756782531738281,
-0.0484619140625,
0.0117034912109375,
0.04833984375,
0.04364013671875,
-0.0057525634765625,
0.0164794921875,
-0.021026611328125,
0.0050506591796875,
0.0183563232421875,
-0.05145263671875,
-0.0205078125,
0.01454925537109375,
-0.061187744140625,
-0.056488037109375,
0.01702880859375,
-0.05303955078125,
-0.020263671875,
-0.0096893310546875,
0.059173583984375,
-0.0306396484375,
-0.039825439453125,
0.048858642578125,
-0.03955078125,
0.050933837890625,
0.0214385986328125,
-0.029449462890625,
0.009002685546875,
0.0357666015625,
0.05120849609375,
-0.005870819091796875,
-0.0283355712890625,
-0.0714111328125,
-0.00464630126953125,
-0.030426025390625,
0.05291748046875,
-0.00948333740234375,
-0.00829315185546875,
-0.017425537109375,
0.0254669189453125,
0.01201629638671875,
-0.026214599609375,
-0.006702423095703125,
-0.045135498046875,
0.036865234375,
0.006999969482421875,
-0.053070068359375,
-0.04071044921875,
0.04376220703125,
-0.0323486328125,
0.058563232421875,
-0.0216827392578125,
-0.060089111328125,
0.0382080078125,
-0.05877685546875,
-0.01059722900390625,
0.02862548828125,
-0.014068603515625,
-0.057342529296875,
-0.0252532958984375,
-0.020263671875,
0.058349609375,
0.006092071533203125,
0.0015697479248046875,
-0.050628662109375,
-0.048431396484375,
0.033843994140625,
-0.0258941650390625,
0.03326416015625,
0.017364501953125,
-0.053314208984375,
0.0291748046875,
-0.09185791015625,
-0.00276947021484375,
0.04644775390625,
0.018035888671875,
-0.03509521484375,
-0.0107879638671875,
0.01195526123046875,
0.0020008087158203125,
-0.004871368408203125,
-0.033050537109375,
0.0279998779296875,
-0.032989501953125,
0.0277862548828125,
0.0706787109375,
-0.01169586181640625,
0.0157623291015625,
-0.018768310546875,
0.045806884765625,
0.03582763671875,
0.0217437744140625,
-0.02752685546875,
-0.054290771484375,
-0.0236968994140625,
-0.0033893585205078125,
-0.007434844970703125,
0.033966064453125,
-0.00983428955078125,
0.0465087890625,
0.0291290283203125,
-0.0421142578125,
-0.02740478515625,
-0.01103973388671875,
0.0133056640625,
0.037689208984375,
0.024017333984375,
-0.038543701171875,
-0.045654296875,
-0.052642822265625,
-0.0012845993041992188,
0.038238525390625,
0.018524169921875,
-0.01047515869140625,
0.033721923828125,
-0.004398345947265625,
0.03692626953125,
-0.068115234375,
-0.00742340087890625,
-0.0245513916015625,
0.0120391845703125,
0.035186767578125,
0.0233001708984375,
0.061279296875,
-0.06634521484375,
-0.04425048828125,
0.0189361572265625,
-0.060333251953125,
0.027252197265625,
0.00392913818359375,
-0.00649261474609375,
0.01450347900390625,
0.04638671875,
-0.007442474365234375,
0.09161376953125,
0.008514404296875,
-0.05291748046875,
0.0411376953125,
-0.024658203125,
0.027008056640625,
-0.09375,
-0.00009196996688842773,
0.0263824462890625,
-0.030853271484375,
-0.06903076171875,
-0.01641845703125,
-0.00965118408203125,
0.01470947265625,
-0.038238525390625,
0.0252685546875,
-0.024078369140625,
-0.01248931884765625,
-0.037933349609375,
-0.0183258056640625,
0.00232696533203125,
0.03509521484375,
-0.0131378173828125,
0.04473876953125,
0.0675048828125,
-0.043243408203125,
0.033935546875,
0.007503509521484375,
-0.060394287109375,
0.024658203125,
-0.04022216796875,
-0.0100860595703125,
-0.0212554931640625,
-0.0168609619140625,
-0.08056640625,
-0.0274200439453125,
0.06048583984375,
-0.0306243896484375,
0.0286712646484375,
-0.0231170654296875,
0.01727294921875,
-0.01702880859375,
-0.055908203125,
0.01568603515625,
0.038818359375,
-0.00897979736328125,
0.029510498046875,
0.0178070068359375,
0.045135498046875,
-0.052734375,
-0.04327392578125,
-0.033233642578125,
-0.04803466796875,
-0.038726806640625,
0.0013217926025390625,
-0.0198974609375,
-0.019683837890625,
-0.0159454345703125,
-0.011077880859375,
-0.002040863037109375,
0.0240936279296875,
0.024261474609375,
0.06329345703125,
-0.023223876953125,
-0.015899658203125,
-0.034759521484375,
-0.02288818359375,
0.0014028549194335938,
-0.0404052734375,
0.047088623046875,
-0.005092620849609375,
-0.0237884521484375,
-0.05841064453125,
0.032196044921875,
0.0675048828125,
-0.00537872314453125,
0.055816650390625,
0.0577392578125,
-0.03515625,
-0.0060577392578125,
-0.03619384765625,
-0.00762176513671875,
-0.04248046875,
0.0277099609375,
-0.0189666748046875,
-0.0229644775390625,
0.053314208984375,
0.030059814453125,
-0.00789642333984375,
0.042724609375,
0.038055419921875,
0.0146484375,
0.07183837890625,
0.03375244140625,
-0.00466156005859375,
0.05120849609375,
-0.053924560546875,
0.0256500244140625,
-0.0391845703125,
-0.022125244140625,
0.006412506103515625,
-0.037567138671875,
-0.047027587890625,
-0.020782470703125,
0.021820068359375,
-0.011749267578125,
-0.0411376953125,
0.0352783203125,
-0.049957275390625,
0.024322509765625,
0.05364990234375,
0.01412200927734375,
0.0177459716796875,
-0.01428985595703125,
0.00801849365234375,
-0.01450347900390625,
-0.0305633544921875,
-0.014404296875,
0.1033935546875,
0.025604248046875,
0.0716552734375,
-0.0231781005859375,
0.0269622802734375,
0.046417236328125,
-0.00945281982421875,
-0.05096435546875,
0.036529541015625,
0.024505615234375,
-0.0234832763671875,
-0.008544921875,
0.006103515625,
-0.0587158203125,
0.0057830810546875,
-0.017425537109375,
-0.06549072265625,
0.02886962890625,
0.01629638671875,
-0.03570556640625,
0.04254150390625,
-0.053314208984375,
0.095458984375,
-0.020660400390625,
-0.0301666259765625,
0.0151519775390625,
-0.05023193359375,
0.023773193359375,
0.010955810546875,
0.01435089111328125,
-0.02020263671875,
0.024078369140625,
0.0587158203125,
-0.0472412109375,
0.059051513671875,
-0.0164794921875,
-0.0006556510925292969,
0.0207366943359375,
-0.0136566162109375,
0.0521240234375,
-0.0028209686279296875,
0.016815185546875,
-0.003589630126953125,
0.00891876220703125,
-0.0208740234375,
-0.00786590576171875,
0.041015625,
-0.06243896484375,
-0.022918701171875,
-0.0438232421875,
-0.02557373046875,
0.030975341796875,
0.0125579833984375,
0.038848876953125,
0.02838134765625,
0.01947021484375,
0.01103973388671875,
0.04498291015625,
0.0012178421020507812,
0.01499176025390625,
0.039337158203125,
-0.033477783203125,
-0.017578125,
0.07403564453125,
0.01245880126953125,
0.006168365478515625,
-0.00762176513671875,
0.0260009765625,
-0.0132598876953125,
-0.045867919921875,
-0.004337310791015625,
-0.00794219970703125,
-0.045654296875,
-0.02813720703125,
-0.0301971435546875,
-0.0111083984375,
-0.061553955078125,
-0.00298309326171875,
-0.026641845703125,
0.0142059326171875,
-0.040802001953125,
-0.0203857421875,
0.055419921875,
0.061676025390625,
-0.007129669189453125,
0.00403594970703125,
-0.04803466796875,
0.03460693359375,
0.03955078125,
0.02459716796875,
-0.004177093505859375,
-0.06573486328125,
-0.023529052734375,
0.0025920867919921875,
-0.031707763671875,
-0.074951171875,
0.061676025390625,
-0.019989013671875,
0.03680419921875,
0.06317138671875,
-0.005184173583984375,
0.0643310546875,
-0.00130462646484375,
0.033233642578125,
0.0158233642578125,
-0.0533447265625,
0.0472412109375,
-0.03472900390625,
0.0110626220703125,
-0.0009603500366210938,
0.03558349609375,
-0.04315185546875,
-0.009552001953125,
-0.04180908203125,
-0.055267333984375,
0.07781982421875,
-0.00531768798828125,
-0.035919189453125,
0.0249786376953125,
0.0279541015625,
0.0143585205078125,
0.0238189697265625,
-0.07318115234375,
-0.031494140625,
-0.046905517578125,
0.004638671875,
0.01763916015625,
0.0016717910766601562,
0.010101318359375,
-0.039764404296875,
0.04473876953125,
-0.021240234375,
0.034332275390625,
-0.0019741058349609375,
0.00417327880859375,
-0.0238800048828125,
0.004180908203125,
0.048736572265625,
0.037872314453125,
-0.044921875,
-0.0390625,
0.002376556396484375,
-0.03472900390625,
0.014251708984375,
0.0114288330078125,
-0.01474761962890625,
-0.018524169921875,
0.0189208984375,
0.06195068359375,
0.0067138671875,
-0.044891357421875,
0.033050537109375,
-0.001316070556640625,
-0.0246124267578125,
-0.02899169921875,
0.039825439453125,
-0.0141448974609375,
0.052490234375,
0.04541015625,
0.01076507568359375,
0.031280517578125,
-0.05462646484375,
0.01395416259765625,
0.019775390625,
-0.009735107421875,
-0.0296630859375,
0.0657958984375,
-0.0261688232421875,
-0.03350830078125,
0.026947021484375,
-0.0389404296875,
-0.03741455078125,
0.07159423828125,
0.02459716796875,
0.06884765625,
-0.018402099609375,
-0.006351470947265625,
0.0279541015625,
-0.011199951171875,
-0.025390625,
0.047393798828125,
0.023223876953125,
-0.057464599609375,
-0.031890869140625,
-0.0220794677734375,
-0.006229400634765625,
0.04949951171875,
-0.06890869140625,
0.054656982421875,
-0.028717041015625,
-0.00853729248046875,
0.028289794921875,
-0.007476806640625,
-0.03375244140625,
0.0003426074981689453,
0.0290069580078125,
0.047882080078125,
-0.060028076171875,
0.061187744140625,
0.0472412109375,
-0.008209228515625,
-0.054901123046875,
-0.01308441162109375,
0.02044677734375,
-0.070068359375,
0.024017333984375,
0.049774169921875,
0.016448974609375,
-0.016693115234375,
-0.0672607421875,
-0.037750244140625,
0.10040283203125,
-0.010711669921875,
-0.03485107421875,
0.01110076904296875,
-0.035186767578125,
0.01433563232421875,
-0.0423583984375,
0.0213775634765625,
0.0341796875,
0.032470703125,
0.0296783447265625,
-0.03533935546875,
-0.02337646484375,
-0.0246124267578125,
-0.025848388671875,
0.036590576171875,
-0.04510498046875,
0.0755615234375,
-0.00020062923431396484,
0.00858306884765625,
-0.004405975341796875,
0.033905029296875,
0.013763427734375,
0.046234130859375,
0.041900634765625,
0.035858154296875,
0.035491943359375,
0.020843505859375,
0.03289794921875,
-0.0313720703125,
0.037841796875,
0.06829833984375,
-0.0234832763671875,
0.049957275390625,
-0.003269195556640625,
-0.01531219482421875,
0.024810791015625,
0.040557861328125,
-0.016815185546875,
0.04766845703125,
0.01399993896484375,
0.0158538818359375,
-0.04217529296875,
0.0170440673828125,
-0.042694091796875,
0.06597900390625,
0.03009033203125,
-0.0150299072265625,
0.0087738037109375,
0.016845703125,
-0.00951385498046875,
-0.00870513916015625,
-0.0286712646484375,
0.034454345703125,
-0.031494140625,
-0.00931549072265625,
0.024505615234375,
0.01380157470703125,
0.07391357421875,
-0.07171630859375,
0.0018167495727539062,
-0.0171356201171875,
0.0262298583984375,
0.00661468505859375,
-0.0254058837890625,
0.0282440185546875,
-0.02215576171875,
0.00452423095703125,
-0.0095367431640625,
0.08502197265625,
-0.02593994140625,
-0.03466796875,
0.02001953125,
0.01433563232421875,
0.004596710205078125,
-0.003818511962890625,
-0.05072021484375,
0.010101318359375,
0.0163726806640625,
-0.04205322265625,
0.0276641845703125,
0.033477783203125,
0.005580902099609375,
0.05419921875,
0.060028076171875,
-0.016448974609375,
0.008148193359375,
-0.031158447265625,
0.0635986328125,
-0.08660888671875,
-0.050140380859375,
-0.04278564453125,
0.040802001953125,
-0.03045654296875,
-0.0163726806640625,
0.054229736328125,
0.041290283203125,
0.05462646484375,
-0.0228271484375,
0.036865234375,
0.01316070556640625,
0.035247802734375,
-0.01605224609375,
0.036865234375,
-0.04669189453125,
0.0180511474609375,
-0.009124755859375,
-0.0134735107421875,
-0.00536346435546875,
0.06671142578125,
-0.03045654296875,
-0.0152435302734375,
0.0209503173828125,
0.052337646484375,
-0.031402587890625,
-0.0001226663589477539,
0.05169677734375,
0.0482177734375,
0.005092620849609375,
0.0130157470703125,
0.035614013671875,
-0.08135986328125,
0.02960205078125,
-0.0191650390625,
0.00424957275390625,
-0.053619384765625,
-0.059173583984375,
-0.07196044921875,
-0.024017333984375,
-0.05609130859375,
-0.02252197265625,
-0.0141754150390625,
0.08636474609375,
0.0579833984375,
-0.053070068359375,
-0.03326416015625,
0.021087646484375,
0.00083160400390625,
-0.0273895263671875,
-0.026611328125,
0.005580902099609375,
0.00919342041015625,
-0.041595458984375,
0.0350341796875,
0.0562744140625,
0.038543701171875,
-0.0372314453125,
-0.007381439208984375,
-0.0374755859375,
-0.006557464599609375,
0.03582763671875,
0.02423095703125,
-0.022308349609375,
-0.0037384033203125,
0.025177001953125,
0.0116119384765625,
0.03363037109375,
0.036712646484375,
-0.05792236328125,
0.03466796875,
0.035614013671875,
-0.002460479736328125,
0.0389404296875,
-0.012420654296875,
0.02154541015625,
-0.0143585205078125,
0.060272216796875,
-0.01100921630859375,
0.036956787109375,
-0.01395416259765625,
-0.040130615234375,
0.0011758804321289062,
0.05963134765625,
-0.0274505615234375,
-0.052276611328125,
-0.018096923828125,
-0.09039306640625,
-0.0282440185546875,
0.0577392578125,
-0.00804901123046875,
-0.05621337890625,
-0.022796630859375,
-0.0077667236328125,
-0.00634002685546875,
-0.035186767578125,
0.04302978515625,
0.0169677734375,
0.0145416259765625,
-0.012420654296875,
-0.060394287109375,
0.02593994140625,
-0.0123291015625,
-0.044830322265625,
-0.018890380859375,
0.023193359375,
0.046356201171875,
0.004573822021484375,
0.0528564453125,
0.0218505859375,
0.04937744140625,
-0.005126953125,
-0.01513671875,
-0.018402099609375,
-0.00418853759765625,
-0.02459716796875,
0.0208282470703125,
-0.01248931884765625,
-0.0714111328125
]
] |
KoboldAI/OPT-2.7B-Erebus | 2022-09-19T07:38:12.000Z | [
"transformers",
"pytorch",
"opt",
"text-generation",
"en",
"arxiv:2205.01068",
"license:other",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | KoboldAI | null | null | KoboldAI/OPT-2.7B-Erebus | 32 | 12,636 | transformers | 2022-09-19T06:41:21 | ---
language: en
license: other
commercial: no
inference: false
---
# OPT 2.7B - Erebus
## Model description
This is the second generation of the original Shinen made by Mr. Seeker. The full dataset consists of 6 different sources, all surrounding the "Adult" theme. The name "Erebus" comes from Greek mythology, where it personifies "darkness"; this is in line with Shin'en, or "deep abyss". For inquiries, please contact the KoboldAI community. **Warning: THIS model is NOT suitable for use by minors. The model will output X-rated content.**
## Training data
The data can be divided in 6 different datasets:
- Literotica (everything with 4.5/5 or higher)
- Sexstories (everything with 90 or higher)
- Dataset-G (private dataset of X-rated stories)
- Doc's Lab (all stories)
- Pike Dataset (novels with "adult" rating)
- SoFurry (collection of various animals)
The dataset uses `[Genre: <comma-separated list of genres>]` for tagging.
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/OPT-2.7B-Erebus')
>>> generator("Welcome Captain Janeway, I apologize for the delay.", do_sample=True, min_length=50)
[{'generated_text': 'Welcome Captain Janeway, I apologize for the delay."\nIt's all right," Janeway said. "I'm certain that you're doing your best to keep me informed of what\'s going on."'}]
```
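Two optional tweaks, both illustrative rather than documented usage: `set_seed` from `transformers` makes sampling reproducible, and because the training data is tagged with `[Genre: <comma-separated list of genres>]`, prefixing your prompt with the same tag format is a plausible way to steer the style of the output.
```py
>>> from transformers import pipeline, set_seed
>>> set_seed(42)  # fix the RNG so repeated runs give the same continuation
>>> generator = pipeline('text-generation', model='KoboldAI/OPT-2.7B-Erebus')
>>> generator("[Genre: romance, fantasy] The captain lowered her voice.", do_sample=True, min_length=50)
```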
## Limitations and biases
Based on known problems with NLP technology, potential relevant factors include bias (gender, profession, race and religion). **Warning: This model has a very strong NSFW bias!**
### License
OPT-2.7B is licensed under the OPT-175B license, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
### BibTeX entry and citation info
```
@misc{zhang2022opt,
title={OPT: Open Pre-trained Transformer Language Models},
author={Susan Zhang and Stephen Roller and Naman Goyal and Mikel Artetxe and Moya Chen and Shuohui Chen and Christopher Dewan and Mona Diab and Xian Li and Xi Victoria Lin and Todor Mihaylov and Myle Ott and Sam Shleifer and Kurt Shuster and Daniel Simig and Punit Singh Koura and Anjali Sridhar and Tianlu Wang and Luke Zettlemoyer},
year={2022},
eprint={2205.01068},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 2,398 | [
[
-0.0267486572265625,
-0.044586181640625,
0.0105743408203125,
0.01312255859375,
-0.0205078125,
-0.03082275390625,
-0.02655029296875,
-0.0325927734375,
0.02203369140625,
0.055755615234375,
-0.058074951171875,
-0.027435302734375,
-0.0290374755859375,
0.01678466796875,
-0.0158843994140625,
0.07904052734375,
0.008392333984375,
-0.00960540771484375,
0.0205535888671875,
0.005588531494140625,
-0.033477783203125,
-0.027130126953125,
-0.053955078125,
-0.0247955322265625,
0.037811279296875,
0.0291900634765625,
0.06158447265625,
0.035980224609375,
0.039276123046875,
0.018646240234375,
-0.021270751953125,
0.007228851318359375,
-0.048858642578125,
-0.00269317626953125,
-0.0020427703857421875,
-0.04541015625,
-0.038055419921875,
-0.0082550048828125,
0.04534912109375,
0.040618896484375,
-0.001758575439453125,
0.0184783935546875,
-0.01256561279296875,
0.04119873046875,
-0.036376953125,
-0.01042938232421875,
-0.040008544921875,
0.01464080810546875,
-0.026092529296875,
0.003955841064453125,
-0.059967041015625,
-0.00799560546875,
0.011932373046875,
-0.03582763671875,
0.0411376953125,
0.0215606689453125,
0.0972900390625,
0.018890380859375,
-0.0255889892578125,
-0.00862884521484375,
-0.052703857421875,
0.06658935546875,
-0.0731201171875,
0.032501220703125,
0.0214996337890625,
0.00406646728515625,
-0.0034847259521484375,
-0.07177734375,
-0.033416748046875,
0.003814697265625,
-0.01102447509765625,
0.033721923828125,
-0.00868988037109375,
-0.016082763671875,
0.0194854736328125,
0.0244598388671875,
-0.046478271484375,
0.00821685791015625,
-0.05615234375,
-0.0031299591064453125,
0.050750732421875,
0.01168060302734375,
0.018341064453125,
-0.037689208984375,
-0.04058837890625,
-0.0236358642578125,
-0.044097900390625,
0.0016851425170898438,
0.047149658203125,
0.03082275390625,
-0.020263671875,
0.039703369140625,
0.015777587890625,
0.0443115234375,
0.00653839111328125,
0.014373779296875,
0.046142578125,
-0.020050048828125,
-0.0130462646484375,
0.008087158203125,
0.0672607421875,
0.034088134765625,
0.0059814453125,
0.005573272705078125,
-0.010345458984375,
-0.0128173828125,
0.044189453125,
-0.05194091796875,
-0.0176849365234375,
0.0217742919921875,
-0.050567626953125,
-0.0399169921875,
0.017181396484375,
-0.0771484375,
-0.020050048828125,
-0.004428863525390625,
0.014373779296875,
-0.038116455078125,
-0.038665771484375,
0.004364013671875,
0.0003039836883544922,
0.040985107421875,
-0.010589599609375,
-0.07244873046875,
0.01457977294921875,
0.0245208740234375,
0.044647216796875,
-0.00785064697265625,
-0.028839111328125,
0.01221466064453125,
-0.00984954833984375,
-0.0379638671875,
0.03765869140625,
-0.0215911865234375,
-0.0123138427734375,
0.00811767578125,
0.0215911865234375,
-0.00799560546875,
-0.0276336669921875,
0.07843017578125,
-0.042205810546875,
0.03271484375,
0.0118865966796875,
-0.0292816162109375,
-0.029144287109375,
-0.029052734375,
-0.055084228515625,
0.0831298828125,
0.02337646484375,
-0.069580078125,
0.03387451171875,
-0.048248291015625,
-0.02716064453125,
0.00872039794921875,
0.01352691650390625,
-0.054290771484375,
0.0205535888671875,
0.00864410400390625,
0.01044464111328125,
-0.00762939453125,
0.0229339599609375,
-0.0165863037109375,
-0.0128936767578125,
0.0156402587890625,
-0.03082275390625,
0.07421875,
0.031829833984375,
-0.031463623046875,
0.00644683837890625,
-0.05816650390625,
0.01378631591796875,
0.041900634765625,
-0.01114654541015625,
-0.0196685791015625,
0.00864410400390625,
0.0138092041015625,
0.00968170166015625,
0.0216522216796875,
-0.0335693359375,
-0.00402069091796875,
-0.050262451171875,
0.0220794677734375,
0.05133056640625,
-0.0124664306640625,
0.031890869140625,
-0.0194854736328125,
0.0343017578125,
0.0056915283203125,
0.02471923828125,
-0.02618408203125,
-0.041717529296875,
-0.0897216796875,
-0.004634857177734375,
0.0335693359375,
0.039886474609375,
-0.0352783203125,
0.050567626953125,
-0.01910400390625,
-0.047088623046875,
-0.0584716796875,
-0.0214080810546875,
0.017333984375,
0.00323486328125,
0.03411865234375,
0.00444793701171875,
-0.0623779296875,
-0.07672119140625,
-0.0253448486328125,
-0.003986358642578125,
-0.002227783203125,
0.032470703125,
0.0496826171875,
-0.0272369384765625,
0.0706787109375,
-0.04840087890625,
-0.0185089111328125,
-0.035400390625,
-0.0007166862487792969,
0.0377197265625,
0.03021240234375,
0.043853759765625,
-0.07220458984375,
-0.033660888671875,
-0.0128326416015625,
-0.056976318359375,
-0.0186004638671875,
-0.013214111328125,
-0.027740478515625,
0.00849151611328125,
0.019073486328125,
-0.0233001708984375,
0.030487060546875,
0.037109375,
-0.039886474609375,
0.04290771484375,
-0.013458251953125,
-0.0056915283203125,
-0.11138916015625,
0.007472991943359375,
-0.00324249267578125,
-0.01244354248046875,
-0.06463623046875,
0.01326751708984375,
0.010498046875,
-0.01482391357421875,
-0.04425048828125,
0.043853759765625,
-0.03076171875,
0.0144805908203125,
-0.016815185546875,
0.020294189453125,
-0.011077880859375,
0.039215087890625,
0.0126190185546875,
0.040008544921875,
0.03973388671875,
-0.0540771484375,
0.0271453857421875,
0.040374755859375,
-0.00811767578125,
0.0301513671875,
-0.0626220703125,
0.0081024169921875,
-0.0112762451171875,
-0.0020427703857421875,
-0.04986572265625,
-0.0283355712890625,
0.0234222412109375,
-0.050994873046875,
0.035125732421875,
-0.0037021636962890625,
-0.030517578125,
-0.051177978515625,
-0.01303863525390625,
0.00832366943359375,
0.056915283203125,
-0.0484619140625,
0.045257568359375,
0.01654052734375,
-0.0205078125,
-0.04400634765625,
-0.056915283203125,
-0.0008144378662109375,
-0.0270233154296875,
-0.061920166015625,
0.0379638671875,
0.0013780593872070312,
0.0026073455810546875,
-0.0130767822265625,
0.01221466064453125,
-0.00238800048828125,
-0.01422882080078125,
0.007778167724609375,
0.033660888671875,
-0.00397491455078125,
0.00011402368545532227,
0.02166748046875,
-0.0181121826171875,
0.004871368408203125,
-0.00605010986328125,
0.045257568359375,
-0.0131378173828125,
-0.0038318634033203125,
-0.012542724609375,
0.0191497802734375,
0.0197601318359375,
-0.0179595947265625,
0.06622314453125,
0.06298828125,
-0.0357666015625,
-0.0254669189453125,
-0.0200347900390625,
-0.0240478515625,
-0.036712646484375,
0.05047607421875,
-0.01763916015625,
-0.039459228515625,
0.04095458984375,
0.005825042724609375,
0.0316162109375,
0.05816650390625,
0.035125732421875,
0.01201629638671875,
0.07916259765625,
0.0618896484375,
0.025360107421875,
0.034759521484375,
-0.0243377685546875,
0.01885986328125,
-0.07708740234375,
-0.021453857421875,
-0.031158447265625,
-0.0187835693359375,
-0.046478271484375,
-0.01044464111328125,
0.0013284683227539062,
0.00310516357421875,
-0.031951904296875,
0.045257568359375,
-0.041595458984375,
0.012969970703125,
0.04779052734375,
0.0185394287109375,
-0.00458526611328125,
0.01038360595703125,
-0.0044097900390625,
-0.02020263671875,
-0.059326171875,
-0.04705810546875,
0.09136962890625,
0.04718017578125,
0.06365966796875,
0.01302337646484375,
0.058380126953125,
0.0160064697265625,
0.01342010498046875,
-0.03045654296875,
0.0413818359375,
-0.021148681640625,
-0.0814208984375,
-0.01522064208984375,
-0.0287933349609375,
-0.0777587890625,
0.024932861328125,
-0.00982666015625,
-0.0469970703125,
0.023193359375,
-0.01287078857421875,
-0.0181884765625,
0.0307159423828125,
-0.056488037109375,
0.0635986328125,
-0.024627685546875,
-0.0248870849609375,
0.01023101806640625,
-0.06494140625,
0.0248260498046875,
-0.007228851318359375,
0.0094757080078125,
0.01177978515625,
-0.0022487640380859375,
0.08001708984375,
-0.026031494140625,
0.0684814453125,
0.00643157958984375,
-0.00972747802734375,
0.03131103515625,
-0.0176544189453125,
0.0284881591796875,
0.0051422119140625,
0.0005345344543457031,
0.0155792236328125,
-0.0200958251953125,
-0.0196075439453125,
-0.0034847259521484375,
0.0421142578125,
-0.07177734375,
-0.00997161865234375,
-0.03936767578125,
-0.0148468017578125,
0.02044677734375,
0.045074462890625,
0.06280517578125,
0.0350341796875,
0.0004050731658935547,
0.025787353515625,
0.05450439453125,
-0.036224365234375,
0.02813720703125,
0.038177490234375,
-0.040771484375,
-0.05718994140625,
0.06103515625,
-0.0032215118408203125,
0.0181427001953125,
0.005237579345703125,
0.00592041015625,
-0.031829833984375,
-0.013763427734375,
-0.029815673828125,
0.033172607421875,
-0.05377197265625,
-0.01244354248046875,
-0.050323486328125,
-0.036346435546875,
-0.028076171875,
-0.0175933837890625,
-0.041717529296875,
0.001728057861328125,
-0.037567138671875,
-0.0025157928466796875,
0.00949859619140625,
0.04473876953125,
-0.00020062923431396484,
0.0335693359375,
-0.054107666015625,
0.02349853515625,
0.0017871856689453125,
0.033477783203125,
-0.00931549072265625,
-0.07318115234375,
-0.0246429443359375,
0.0160064697265625,
-0.031158447265625,
-0.08331298828125,
0.046783447265625,
0.011260986328125,
0.051727294921875,
0.036712646484375,
0.023040771484375,
0.01496124267578125,
-0.040008544921875,
0.07659912109375,
0.021575927734375,
-0.041748046875,
0.041351318359375,
-0.0309906005859375,
0.0113372802734375,
0.034088134765625,
0.0222320556640625,
-0.02313232421875,
-0.0316162109375,
-0.0726318359375,
-0.0849609375,
0.08563232421875,
0.040740966796875,
0.0158233642578125,
0.0024852752685546875,
0.0186004638671875,
0.0195770263671875,
0.01226043701171875,
-0.092529296875,
-0.0615234375,
-0.0247955322265625,
-0.021728515625,
-0.006717681884765625,
-0.030364990234375,
0.0015039443969726562,
-0.002483367919921875,
0.071533203125,
0.006191253662109375,
0.048095703125,
0.017120361328125,
-0.01531219482421875,
-0.01332855224609375,
0.0217742919921875,
0.04571533203125,
0.031768798828125,
-0.0243377685546875,
-0.003253936767578125,
0.0162353515625,
-0.057586669921875,
-0.0101318359375,
0.01264190673828125,
-0.047149658203125,
0.01898193359375,
0.0124359130859375,
0.09771728515625,
0.01233673095703125,
-0.0234527587890625,
0.0214080810546875,
0.0012521743774414062,
-0.0166778564453125,
-0.053070068359375,
-0.0035686492919921875,
-0.006656646728515625,
0.01381683349609375,
0.03076171875,
0.01389312744140625,
0.0039215087890625,
-0.0206451416015625,
0.006366729736328125,
-0.00649261474609375,
-0.03497314453125,
-0.0223846435546875,
0.0711669921875,
0.01763916015625,
-0.0391845703125,
0.061370849609375,
-0.0250701904296875,
-0.039154052734375,
0.043670654296875,
0.06683349609375,
0.081787109375,
-0.01261138916015625,
0.0240325927734375,
0.057586669921875,
0.046600341796875,
0.005260467529296875,
0.0276336669921875,
0.0479736328125,
-0.051544189453125,
-0.0143890380859375,
-0.060821533203125,
-0.00952911376953125,
0.0256500244140625,
-0.050079345703125,
0.04925537109375,
-0.002941131591796875,
-0.04071044921875,
-0.01139068603515625,
-0.0139617919921875,
-0.04156494140625,
0.016357421875,
0.032470703125,
0.059844970703125,
-0.065185546875,
0.00962066650390625,
0.06793212890625,
-0.0401611328125,
-0.05291748046875,
-0.0085296630859375,
-0.03021240234375,
-0.028778076171875,
0.0284881591796875,
0.02471923828125,
0.0203094482421875,
0.0186309814453125,
-0.05615234375,
-0.0677490234375,
0.06817626953125,
0.00641632080078125,
-0.0253448486328125,
-0.006572723388671875,
0.00197601318359375,
0.041717529296875,
-0.0296478271484375,
0.03826904296875,
0.03338623046875,
0.0440673828125,
-0.02069091796875,
-0.04876708984375,
-0.007030487060546875,
-0.034912109375,
0.0147247314453125,
0.0125885009765625,
-0.05938720703125,
0.07354736328125,
-0.033477783203125,
-0.02197265625,
0.0147705078125,
0.06304931640625,
0.02484130859375,
0.01934814453125,
0.027130126953125,
0.043792724609375,
0.0347900390625,
-0.0270538330078125,
0.060333251953125,
-0.02532958984375,
0.054443359375,
0.07232666015625,
-0.004108428955078125,
0.051177978515625,
0.01482391357421875,
-0.042694091796875,
0.050323486328125,
0.0631103515625,
-0.024200439453125,
0.038604736328125,
-0.003925323486328125,
0.01201629638671875,
-0.0180511474609375,
0.007656097412109375,
-0.045166015625,
0.01690673828125,
0.019439697265625,
-0.05084228515625,
0.0004782676696777344,
0.0030078887939453125,
0.008087158203125,
-0.00945281982421875,
-0.015045166015625,
0.044464111328125,
0.01556396484375,
-0.044647216796875,
0.047882080078125,
0.00635528564453125,
0.060089111328125,
-0.061431884765625,
0.0113677978515625,
0.0019474029541015625,
0.0209503173828125,
-0.0176544189453125,
-0.05511474609375,
-0.00559234619140625,
-0.00594329833984375,
-0.0243377685546875,
0.0005779266357421875,
0.06341552734375,
-0.0275726318359375,
-0.0499267578125,
0.016754150390625,
0.019744873046875,
0.02655029296875,
0.0224609375,
-0.054107666015625,
-0.00806427001953125,
0.0159149169921875,
-0.0386962890625,
0.0002586841583251953,
0.0071868896484375,
0.0230255126953125,
0.047271728515625,
0.03948974609375,
0.003017425537109375,
0.0364990234375,
0.0085601806640625,
0.046478271484375,
-0.04345703125,
-0.0489501953125,
-0.037322998046875,
0.047576904296875,
-0.0240020751953125,
-0.042449951171875,
0.058074951171875,
0.043548583984375,
0.056884765625,
-0.033477783203125,
0.0677490234375,
-0.031494140625,
0.0455322265625,
-0.0197906494140625,
0.0643310546875,
-0.046844482421875,
-0.01107025146484375,
-0.024566650390625,
-0.092529296875,
0.0023021697998046875,
0.057830810546875,
-0.010528564453125,
0.032928466796875,
0.060516357421875,
0.046539306640625,
0.0004115104675292969,
0.0100250244140625,
0.0114288330078125,
0.0184478759765625,
0.01470947265625,
0.025665283203125,
0.050445556640625,
-0.06353759765625,
0.0386962890625,
-0.031494140625,
-0.01629638671875,
-0.035369873046875,
-0.04425048828125,
-0.07012939453125,
-0.0369873046875,
-0.0201568603515625,
-0.03125,
-0.01352691650390625,
0.044769287109375,
0.044403076171875,
-0.05731201171875,
0.004344940185546875,
-0.0104522705078125,
-0.005542755126953125,
-0.022125244140625,
-0.0222015380859375,
0.0282745361328125,
-0.0216064453125,
-0.0615234375,
0.016448974609375,
-0.0094146728515625,
0.011627197265625,
-0.0190887451171875,
-0.01439666748046875,
-0.01377105712890625,
0.009490966796875,
0.02447509765625,
0.00795745849609375,
-0.046783447265625,
-0.0017900466918945312,
0.024078369140625,
-0.0013685226440429688,
-0.0097198486328125,
0.0240936279296875,
-0.0321044921875,
0.038665771484375,
0.041290283203125,
0.004467010498046875,
0.016326904296875,
-0.0053253173828125,
0.0325927734375,
-0.044525146484375,
0.004329681396484375,
0.0165557861328125,
0.03271484375,
0.01983642578125,
-0.017242431640625,
0.04339599609375,
0.01451873779296875,
-0.0556640625,
-0.07196044921875,
0.0238800048828125,
-0.061370849609375,
-0.00872802734375,
0.10369873046875,
-0.0081024169921875,
-0.018280029296875,
0.0023956298828125,
-0.02874755859375,
0.0298919677734375,
-0.020751953125,
0.022857666015625,
0.035125732421875,
0.023162841796875,
-0.0128936767578125,
-0.05218505859375,
0.0172119140625,
0.020538330078125,
-0.040618896484375,
0.00860595703125,
0.0159454345703125,
0.01393890380859375,
0.03411865234375,
0.0195465087890625,
-0.01611328125,
0.0164337158203125,
0.028076171875,
0.03094482421875,
-0.005352020263671875,
-0.032470703125,
-0.0123138427734375,
-0.0106353759765625,
-0.0203857421875,
0.00638580322265625
]
] |
microsoft/trocr-small-printed | 2023-01-24T16:57:45.000Z | [
"transformers",
"pytorch",
"vision-encoder-decoder",
"trocr",
"image-to-text",
"arxiv:2109.10282",
"endpoints_compatible",
"has_space",
"region:us"
] | image-to-text | microsoft | null | null | microsoft/trocr-small-printed | 17 | 12,582 | transformers | 2022-03-02T23:29:05 | ---
tags:
- trocr
- image-to-text
widget:
- src: https://layoutlm.blob.core.windows.net/trocr/dataset/SROIE2019Task2Crop/train/X00016469612_1.jpg
example_title: Printed 1
- src: https://layoutlm.blob.core.windows.net/trocr/dataset/SROIE2019Task2Crop/train/X51005255805_7.jpg
example_title: Printed 2
- src: https://layoutlm.blob.core.windows.net/trocr/dataset/SROIE2019Task2Crop/train/X51005745214_6.jpg
example_title: Printed 3
---
# TrOCR (small-sized model, fine-tuned on SROIE)
TrOCR model fine-tuned on the [SROIE dataset](https://rrc.cvc.uab.es/?ch=13). It was introduced in the paper [TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models](https://arxiv.org/abs/2109.10282) by Li et al. and first released in [this repository](https://github.com/microsoft/unilm/tree/master/trocr).
## Model description
The TrOCR model is an encoder-decoder model, consisting of an image Transformer as encoder, and a text Transformer as decoder. The image encoder was initialized from the weights of DeiT, while the text decoder was initialized from the weights of UniLM.
Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. One also adds absolute position embeddings before feeding the sequence to the layers of the Transformer encoder. Next, the Transformer text decoder autoregressively generates tokens.
## Intended uses & limitations
You can use the raw model for optical character recognition (OCR) on single text-line images. See the [model hub](https://huggingface.co/models?search=microsoft/trocr) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model in PyTorch:
```python
from transformers import TrOCRProcessor, VisionEncoderDecoderModel
from PIL import Image
import requests
# load image from the IAM database (actually this model is meant to be used on printed text)
url = 'https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg'
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")
processor = TrOCRProcessor.from_pretrained('microsoft/trocr-small-printed')
model = VisionEncoderDecoderModel.from_pretrained('microsoft/trocr-small-printed')
pixel_values = processor(images=image, return_tensors="pt").pixel_values
generated_ids = model.generate(pixel_values)
generated_text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(generated_text)  # the recognized text for the input image
```
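Generation quality can often be traded against speed through the standard `generate` arguments. The values below are illustrative, not recommendations from the TrOCR authors; the snippet reuses `model`, `processor`, and `pixel_values` from the example above.
```python
# Beam search usually gives slightly better transcriptions than greedy decoding, at extra compute cost;
# max_length bounds the length of the generated token sequence.
generated_ids = model.generate(pixel_values, num_beams=4, max_length=64, early_stopping=True)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```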
### BibTeX entry and citation info
```bibtex
@misc{li2021trocr,
title={TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models},
author={Minghao Li and Tengchao Lv and Lei Cui and Yijuan Lu and Dinei Florencio and Cha Zhang and Zhoujun Li and Furu Wei},
year={2021},
eprint={2109.10282},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 2,836 | [
[
-0.0175323486328125,
-0.021697998046875,
0.007419586181640625,
-0.041595458984375,
-0.033294677734375,
-0.00846099853515625,
-0.0006270408630371094,
-0.059173583984375,
0.002777099609375,
0.043243408203125,
-0.024169921875,
-0.023834228515625,
-0.03936767578125,
0.0235137939453125,
-0.022979736328125,
0.07232666015625,
-0.01092529296875,
0.0010890960693359375,
0.0177764892578125,
-0.0307769775390625,
-0.01247406005859375,
-0.03485107421875,
-0.045806884765625,
-0.012786865234375,
0.035369873046875,
0.032257080078125,
0.042083740234375,
0.0540771484375,
0.07745361328125,
0.027313232421875,
-0.01763916015625,
0.0184783935546875,
-0.025970458984375,
-0.0305938720703125,
0.01104736328125,
-0.0325927734375,
-0.0291595458984375,
-0.0013103485107421875,
0.052001953125,
0.01206207275390625,
-0.0010404586791992188,
0.0124664306640625,
0.0125274658203125,
0.035064697265625,
-0.022186279296875,
-0.01183319091796875,
-0.0298309326171875,
0.0271759033203125,
0.0018739700317382812,
-0.0094757080078125,
-0.040679931640625,
-0.0224761962890625,
0.0129852294921875,
-0.0430908203125,
0.05023193359375,
0.007476806640625,
0.0914306640625,
-0.007610321044921875,
-0.027984619140625,
-0.03533935546875,
-0.061767578125,
0.052093505859375,
-0.048248291015625,
0.035186767578125,
-0.004703521728515625,
0.0112762451171875,
0.006214141845703125,
-0.092041015625,
-0.058807373046875,
-0.03033447265625,
-0.0292816162109375,
0.005657196044921875,
-0.020782470703125,
0.01544189453125,
0.0285797119140625,
0.0343017578125,
-0.047698974609375,
-0.01111602783203125,
-0.058868408203125,
-0.0218658447265625,
0.0180816650390625,
0.004367828369140625,
0.0180511474609375,
-0.00327301025390625,
-0.030120849609375,
-0.02801513671875,
-0.00820159912109375,
-0.0038356781005859375,
-0.0020904541015625,
0.005710601806640625,
-0.02325439453125,
0.05462646484375,
0.0178985595703125,
0.06292724609375,
0.020263671875,
-0.0227813720703125,
0.0294342041015625,
-0.0187225341796875,
-0.0038471221923828125,
0.0024871826171875,
0.0838623046875,
0.0139617919921875,
0.0207672119140625,
-0.0083770751953125,
-0.0213623046875,
0.016021728515625,
0.0097503662109375,
-0.062347412109375,
-0.00385284423828125,
-0.015869140625,
-0.045440673828125,
-0.0207061767578125,
0.0162200927734375,
-0.06488037109375,
-0.0129852294921875,
-0.010772705078125,
0.034423828125,
-0.0299072265625,
0.01531982421875,
-0.009124755859375,
-0.0089569091796875,
0.007366180419921875,
0.0198974609375,
-0.038970947265625,
0.002399444580078125,
0.007465362548828125,
0.08856201171875,
-0.011932373046875,
-0.0237579345703125,
-0.02728271484375,
-0.0002338886260986328,
-0.022064208984375,
0.053253173828125,
-0.0238494873046875,
-0.0238037109375,
-0.00872039794921875,
0.0148162841796875,
0.0005645751953125,
-0.0302734375,
0.03955078125,
-0.039154052734375,
0.0227813720703125,
0.01326751708984375,
-0.0027599334716796875,
-0.0032901763916015625,
0.0240325927734375,
-0.07159423828125,
0.08502197265625,
0.014617919921875,
-0.0684814453125,
0.0154266357421875,
-0.0513916015625,
-0.0200653076171875,
0.0037097930908203125,
0.00783538818359375,
-0.05712890625,
0.007659912109375,
0.0018281936645507812,
-0.0061798095703125,
-0.0224761962890625,
-0.0068206787109375,
-0.00455474853515625,
-0.03338623046875,
0.00859832763671875,
-0.02825927734375,
0.050201416015625,
0.030914306640625,
-0.0240631103515625,
-0.006107330322265625,
-0.0736083984375,
0.0108489990234375,
0.00811767578125,
-0.029876708984375,
-0.005794525146484375,
-0.0133514404296875,
0.0196380615234375,
0.0258331298828125,
0.030914306640625,
-0.038238525390625,
0.020660400390625,
-0.0282745361328125,
0.0638427734375,
0.032196044921875,
-0.01210784912109375,
0.03955078125,
-0.00904083251953125,
0.0303802490234375,
0.01493072509765625,
0.01094818115234375,
-0.00507354736328125,
-0.0212860107421875,
-0.06976318359375,
-0.018280029296875,
0.0247039794921875,
0.0516357421875,
-0.08197021484375,
0.0175933837890625,
-0.03448486328125,
-0.04986572265625,
-0.024139404296875,
-0.01027679443359375,
0.033447265625,
0.058319091796875,
0.026580810546875,
-0.04046630859375,
-0.035919189453125,
-0.047149658203125,
-0.00835418701171875,
-0.01971435546875,
0.00269317626953125,
0.0216217041015625,
0.0570068359375,
-0.023834228515625,
0.063232421875,
-0.0286712646484375,
-0.047332763671875,
-0.0234527587890625,
0.0242462158203125,
0.01513671875,
0.04998779296875,
0.0231781005859375,
-0.05267333984375,
-0.040283203125,
0.0106201171875,
-0.047882080078125,
0.00853729248046875,
-0.00629425048828125,
-0.0005249977111816406,
0.033477783203125,
0.0229034423828125,
-0.04583740234375,
0.0633544921875,
0.027984619140625,
-0.030242919921875,
0.042510986328125,
-0.043975830078125,
0.0170440673828125,
-0.08111572265625,
0.0184173583984375,
0.000028192996978759766,
-0.0207977294921875,
-0.06195068359375,
0.01358795166015625,
0.01519012451171875,
-0.024261474609375,
-0.036102294921875,
0.041473388671875,
-0.04852294921875,
-0.006107330322265625,
-0.004940032958984375,
0.00540924072265625,
0.0181884765625,
0.043304443359375,
0.033355712890625,
0.07183837890625,
0.0042572021484375,
-0.03594970703125,
0.01303863525390625,
0.02154541015625,
-0.033233642578125,
0.0260162353515625,
-0.07745361328125,
0.038299560546875,
0.0031681060791015625,
-0.007671356201171875,
-0.058502197265625,
0.0199432373046875,
0.031494140625,
-0.02752685546875,
0.0187835693359375,
-0.001750946044921875,
-0.044464111328125,
-0.056884765625,
0.0017747879028320312,
0.044647216796875,
0.025238037109375,
-0.041259765625,
0.07183837890625,
0.00818634033203125,
0.0254364013671875,
-0.034759521484375,
-0.09033203125,
-0.0024547576904296875,
-0.004459381103515625,
-0.052398681640625,
0.037017822265625,
-0.0107269287109375,
0.0215911865234375,
-0.005474090576171875,
0.003936767578125,
-0.01959228515625,
-0.03558349609375,
-0.0024547576904296875,
0.034454345703125,
-0.0260162353515625,
-0.00785064697265625,
-0.041259765625,
-0.01261138916015625,
-0.03900146484375,
-0.0260162353515625,
0.043548583984375,
-0.027923583984375,
0.0015468597412109375,
-0.04266357421875,
0.0186004638671875,
0.044036865234375,
-0.0367431640625,
0.045867919921875,
0.047515869140625,
-0.0211639404296875,
0.0079345703125,
-0.0345458984375,
-0.0191650390625,
-0.037353515625,
0.03515625,
-0.0240478515625,
-0.054443359375,
0.05419921875,
0.0311737060546875,
-0.0119781494140625,
0.035675048828125,
0.02337646484375,
-0.0030117034912109375,
0.06732177734375,
0.061187744140625,
0.0186004638671875,
0.06597900390625,
-0.04339599609375,
0.030914306640625,
-0.059173583984375,
-0.020477294921875,
-0.040985107421875,
-0.0265960693359375,
-0.04913330078125,
-0.0235137939453125,
0.024993896484375,
0.00010317564010620117,
-0.0205535888671875,
0.042510986328125,
-0.08001708984375,
0.0236968994140625,
0.054351806640625,
0.02960205078125,
0.016845703125,
0.0219573974609375,
-0.020904541015625,
-0.0033397674560546875,
-0.0455322265625,
-0.0458984375,
0.04388427734375,
0.01187896728515625,
0.06939697265625,
-0.018310546875,
0.036651611328125,
0.0289764404296875,
-0.0038242340087890625,
-0.0628662109375,
0.04693603515625,
-0.0325927734375,
-0.0347900390625,
-0.0016460418701171875,
-0.0311431884765625,
-0.07147216796875,
0.0033130645751953125,
-0.0222625732421875,
-0.059722900390625,
0.0703125,
0.032989501953125,
-0.00844573974609375,
0.04534912109375,
-0.057952880859375,
0.07147216796875,
-0.027313232421875,
-0.0246429443359375,
0.0172271728515625,
-0.0516357421875,
-0.003604888916015625,
0.01468658447265625,
-0.00708770751953125,
0.034576416015625,
0.005107879638671875,
0.07470703125,
-0.0511474609375,
0.0565185546875,
0.002964019775390625,
0.00603485107421875,
0.041473388671875,
-0.002208709716796875,
0.053955078125,
-0.034149169921875,
-0.01433563232421875,
0.045135498046875,
0.01425933837890625,
-0.0101165771484375,
-0.016845703125,
0.0177459716796875,
-0.0701904296875,
-0.01386260986328125,
-0.06591796875,
-0.04864501953125,
0.0268707275390625,
0.04620361328125,
0.06610107421875,
0.046875,
-0.0010519027709960938,
0.004695892333984375,
0.0426025390625,
-0.0032215118408203125,
0.037353515625,
0.0335693359375,
0.003692626953125,
-0.054229736328125,
0.0634765625,
0.018157958984375,
0.022735595703125,
0.0440673828125,
0.01422119140625,
-0.01467132568359375,
-0.0245361328125,
-0.015289306640625,
0.0278167724609375,
-0.057952880859375,
-0.0243377685546875,
-0.027618408203125,
-0.026123046875,
-0.02545166015625,
-0.014190673828125,
-0.019073486328125,
-0.0199737548828125,
-0.06146240234375,
0.0205078125,
0.022064208984375,
0.042755126953125,
-0.00124359130859375,
0.060302734375,
-0.061767578125,
0.034149169921875,
0.00971221923828125,
0.01849365234375,
-0.004024505615234375,
-0.05303955078125,
-0.023956298828125,
0.01033782958984375,
-0.029541015625,
-0.057769775390625,
0.060882568359375,
0.0311431884765625,
0.02020263671875,
0.0364990234375,
0.006275177001953125,
0.059356689453125,
-0.047607421875,
0.052764892578125,
0.03759765625,
-0.06878662109375,
0.032012939453125,
0.01201629638671875,
0.019073486328125,
0.0260467529296875,
-0.0016279220581054688,
-0.0372314453125,
-0.005893707275390625,
-0.036224365234375,
-0.043304443359375,
0.08660888671875,
0.003932952880859375,
-0.01435089111328125,
0.0190277099609375,
0.033203125,
-0.0174407958984375,
0.00949859619140625,
-0.07763671875,
-0.01416015625,
-0.0294647216796875,
-0.044464111328125,
-0.00946044921875,
-0.03179931640625,
0.00925445556640625,
-0.0184783935546875,
0.029144287109375,
-0.00608062744140625,
0.0675048828125,
0.04461669921875,
-0.03961181640625,
-0.010223388671875,
0.005702972412109375,
0.057586669921875,
0.036712646484375,
-0.014190673828125,
0.02484130859375,
0.001842498779296875,
-0.08233642578125,
0.0016946792602539062,
0.001750946044921875,
-0.032440185546875,
0.01122283935546875,
0.03387451171875,
0.08258056640625,
-0.01406097412109375,
-0.0292510986328125,
0.025390625,
-0.00983428955078125,
-0.016693115234375,
-0.01776123046875,
-0.00754547119140625,
-0.03240966796875,
0.0113525390625,
0.041259765625,
0.02227783203125,
0.0030612945556640625,
-0.03399658203125,
0.004611968994140625,
0.032135009765625,
-0.05194091796875,
-0.016326904296875,
0.04864501953125,
-0.01088714599609375,
-0.04193115234375,
0.054840087890625,
-0.0037689208984375,
-0.0589599609375,
0.060546875,
0.053680419921875,
0.046630859375,
-0.0073089599609375,
0.01145172119140625,
0.04058837890625,
0.051544189453125,
-0.004993438720703125,
0.017578125,
0.0025177001953125,
-0.05694580078125,
0.019561767578125,
-0.037841796875,
-0.01201629638671875,
-0.001537322998046875,
-0.04608154296875,
0.037750244140625,
-0.03265380859375,
-0.031463623046875,
-0.00789642333984375,
0.018463134765625,
-0.054443359375,
0.031585693359375,
-0.0030803680419921875,
0.0738525390625,
-0.0345458984375,
0.06524658203125,
0.044525146484375,
-0.03643798828125,
-0.0567626953125,
-0.0115509033203125,
-0.0274810791015625,
-0.07464599609375,
0.053985595703125,
0.01502227783203125,
-0.01136016845703125,
0.01910400390625,
-0.04168701171875,
-0.056365966796875,
0.092529296875,
0.018096923828125,
-0.05242919921875,
-0.02337646484375,
0.037109375,
0.048492431640625,
-0.035400390625,
0.044158935546875,
0.0179595947265625,
0.019500732421875,
0.02728271484375,
-0.05010986328125,
-0.0006957054138183594,
-0.020660400390625,
0.0296630859375,
0.004207611083984375,
-0.045013427734375,
0.07073974609375,
-0.036346435546875,
-0.02301025390625,
0.034088134765625,
0.048919677734375,
0.0121612548828125,
0.0198822021484375,
0.031036376953125,
0.043243408203125,
0.048828125,
-0.0181732177734375,
0.06427001953125,
-0.0283203125,
0.04266357421875,
0.06402587890625,
0.0064544677734375,
0.056549072265625,
0.038604736328125,
0.00133514404296875,
0.043975830078125,
0.03936767578125,
-0.03570556640625,
0.0494384765625,
-0.0252227783203125,
0.01320648193359375,
0.005062103271484375,
0.0019741058349609375,
-0.030364990234375,
0.0150604248046875,
0.0113067626953125,
-0.05523681640625,
0.004184722900390625,
0.01629638671875,
-0.02728271484375,
-0.030853271484375,
-0.036376953125,
0.048980712890625,
0.002506256103515625,
-0.033355712890625,
0.05126953125,
0.0028896331787109375,
0.06036376953125,
-0.05474853515625,
0.00009107589721679688,
-0.00630950927734375,
0.042724609375,
-0.00974273681640625,
-0.05535888671875,
0.00652313232421875,
0.0003769397735595703,
-0.0286712646484375,
0.0165252685546875,
0.06256103515625,
-0.0362548828125,
-0.06829833984375,
0.016326904296875,
-0.0096588134765625,
0.00730133056640625,
0.03173828125,
-0.052276611328125,
0.007488250732421875,
0.005588531494140625,
-0.01267242431640625,
-0.005863189697265625,
0.038848876953125,
0.00428009033203125,
0.038299560546875,
0.04022216796875,
0.00677490234375,
0.0262908935546875,
-0.017669677734375,
0.046112060546875,
-0.046875,
-0.048065185546875,
-0.04486083984375,
0.0433349609375,
0.006603240966796875,
-0.034637451171875,
0.040679931640625,
0.040863037109375,
0.04522705078125,
-0.0300445556640625,
0.0292510986328125,
-0.0164794921875,
0.01126861572265625,
-0.024566650390625,
0.06787109375,
-0.05987548828125,
-0.0001112222671508789,
-0.0255279541015625,
-0.050750732421875,
-0.034576416015625,
0.07196044921875,
-0.0261383056640625,
0.0195770263671875,
0.056365966796875,
0.08477783203125,
-0.015655517578125,
-0.0214691162109375,
0.0152587890625,
0.01824951171875,
0.00865936279296875,
0.04095458984375,
0.042266845703125,
-0.06964111328125,
0.07159423828125,
-0.0269012451171875,
-0.00403594970703125,
-0.0240936279296875,
-0.06280517578125,
-0.06951904296875,
-0.045562744140625,
-0.026580810546875,
-0.04296875,
-0.002704620361328125,
0.039093017578125,
0.05999755859375,
-0.07611083984375,
-0.0107269287109375,
-0.023345947265625,
0.01113128662109375,
-0.0196533203125,
-0.0193023681640625,
0.038116455078125,
0.007785797119140625,
-0.052642822265625,
-0.031036376953125,
-0.00836944580078125,
0.0288848876953125,
0.0119781494140625,
-0.0199737548828125,
-0.017791748046875,
0.0009489059448242188,
0.031280517578125,
0.04443359375,
-0.044464111328125,
-0.01019287109375,
0.00868988037109375,
-0.0266265869140625,
0.0394287109375,
0.048065185546875,
-0.048065185546875,
0.0311737060546875,
0.045074462890625,
0.00318145751953125,
0.046417236328125,
-0.0179290771484375,
0.0061492919921875,
-0.0362548828125,
0.031494140625,
0.0162353515625,
0.039886474609375,
0.02264404296875,
-0.034820556640625,
0.03277587890625,
0.03271484375,
-0.0426025390625,
-0.07342529296875,
-0.009490966796875,
-0.0919189453125,
0.00548553466796875,
0.059112548828125,
-0.00797271728515625,
-0.033111572265625,
0.0202789306640625,
-0.0254974365234375,
0.03424072265625,
-0.0283203125,
0.038360595703125,
0.01479339599609375,
0.01374053955078125,
-0.05462646484375,
-0.004730224609375,
0.01355743408203125,
-0.0173187255859375,
-0.04071044921875,
-0.0170440673828125,
0.0220184326171875,
0.0218963623046875,
0.052764892578125,
0.031341552734375,
-0.01100921630859375,
0.0169677734375,
0.0079803466796875,
0.037811279296875,
-0.01335906982421875,
-0.023101806640625,
-0.0266265869140625,
0.00817108154296875,
-0.0132598876953125,
-0.022186279296875
]
] |
TheBloke/Llama-2-70B-Chat-AWQ | 2023-09-27T12:49:45.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"facebook",
"meta",
"pytorch",
"llama-2",
"en",
"arxiv:2307.09288",
"license:llama2",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Llama-2-70B-Chat-AWQ | 6 | 12,548 | transformers | 2023-09-19T00:06:16 | ---
language:
- en
license: llama2
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
model_name: Llama 2 70B Chat
base_model: meta-llama/Llama-2-70b-chat-hf
inference: false
model_creator: Meta Llama 2
model_type: llama
pipeline_tag: text-generation
prompt_template: '[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as
possible, while being safe. Your answers should not include any harmful, unethical,
racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses
are socially unbiased and positive in nature. If a question does not make any sense,
or is not factually coherent, explain why instead of answering something not correct.
If you don''t know the answer to a question, please don''t share false information.
<</SYS>>
{prompt}[/INST]
'
quantized_by: TheBloke
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Llama 2 70B Chat - AWQ
- Model creator: [Meta Llama 2](https://huggingface.co/meta-llama)
- Original model: [Llama 2 70B Chat](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf)
<!-- description start -->
## Description
This repo contains AWQ model files for [Meta Llama 2's Llama 2 70B Chat](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf).
### About AWQ
AWQ is an efficient, accurate and blazing-fast low-bit weight quantization method, currently supporting 4-bit quantization. Compared to GPTQ, it offers faster Transformers-based inference.
It is also now supported by the continuous-batching server [vLLM](https://github.com/vllm-project/vllm), allowing AWQ models to be used for high-throughput concurrent inference in multi-user server scenarios. Note that, at the time of writing, overall throughput is still lower than running vLLM with unquantised models; however, AWQ enables the use of much smaller GPUs, which can lead to easier deployment and overall cost savings. For example, a 70B model can be run on 1 x 48GB GPU instead of 2 x 80GB.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Llama-2-70B-chat-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Llama-2-70B-chat-GGUF)
* [Meta Llama 2's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Llama-2-Chat
```
[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>
{prompt}[/INST]
```
<!-- prompt-template end -->
<!-- README_AWQ.md-provided-files start -->
## Provided files and AWQ parameters
For my first release of AWQ models, I am releasing 128g models only. I will consider adding 32g models if there is interest, and once I have done perplexity and evaluation comparisons; at this time, 32g models are still not fully tested with AutoAWQ and vLLM.
Models are released as sharded safetensors files.
| Branch | Bits | GS | AWQ Dataset | Seq Len | Size |
| ------ | ---- | -- | ----------- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Llama-2-70B-chat-AWQ/tree/main) | 4 | 128 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 36.61 GB |
<!-- README_AWQ.md-provided-files end -->
<!-- README_AWQ.md-use-from-vllm start -->
## Serving this model from vLLM
Documentation on installing and using vLLM [can be found here](https://vllm.readthedocs.io/en/latest/).
When using vLLM as a server, pass the `--quantization awq` parameter, for example:
```shell
python3 -m vllm.entrypoints.api_server --model TheBloke/Llama-2-70B-chat-AWQ --quantization awq
```
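Once the server is running you can send generation requests over HTTP. The sketch below assumes vLLM's demo `/generate` endpoint on the default port 8000; the exact request and response schema can differ between vLLM versions, so treat it as illustrative rather than definitive.
```python
import requests

# Illustrative request against the demo API server started above.
response = requests.post(
    "http://localhost:8000/generate",
    json={"prompt": "[INST] Tell me about AI [/INST]", "max_tokens": 128, "temperature": 0.7},
)
print(response.json())
```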
When using vLLM from Python code, pass the `quantization=awq` parameter, for example:
```python
from vllm import LLM, SamplingParams
prompts = [
"Hello, my name is",
"The president of the United States is",
"The capital of France is",
"The future of AI is",
]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)
llm = LLM(model="TheBloke/Llama-2-70B-chat-AWQ", quantization="awq")
outputs = llm.generate(prompts, sampling_params)
# Print the outputs.
for output in outputs:
prompt = output.prompt
generated_text = output.outputs[0].text
print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
```
<!-- README_AWQ.md-use-from-vllm end -->
<!-- README_AWQ.md-use-from-python start -->
## How to use this AWQ model from Python code
### Install the necessary packages
Requires: [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) 0.0.2 or later
```shell
pip3 install autoawq
```
If you have problems installing [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y autoawq
git clone https://github.com/casper-hansen/AutoAWQ
cd AutoAWQ
pip3 install .
```
### You can then try the following example code
```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer
model_name_or_path = "TheBloke/Llama-2-70B-chat-AWQ"
# Load model
model = AutoAWQForCausalLM.from_quantized(model_name_or_path, fuse_layers=True,
trust_remote_code=False, safetensors=True)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, trust_remote_code=False)
prompt = "Tell me about AI"
prompt_template=f'''[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>
{prompt}[/INST]
'''
print("\n\n*** Generate:")
tokens = tokenizer(
prompt_template,
return_tensors='pt'
).input_ids.cuda()
# Generate output
generation_output = model.generate(
tokens,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
max_new_tokens=512
)
print("Output: ", tokenizer.decode(generation_output[0]))
# Inference can also be done using transformers' pipeline
from transformers import pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_AWQ.md-use-from-python end -->
<!-- README_AWQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with [AutoAWQ](https://github.com/casper-hansen/AutoAWQ), and [vLLM](https://github.com/vllm-project/vllm).
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is not yet compatible with AWQ, but a PR is open which should bring support soon: [TGI PR #781](https://github.com/huggingface/text-generation-inference/issues/781).
<!-- README_AWQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J. Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Meta Llama 2's Llama 2 70B Chat
# **Llama 2**
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 70B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The larger 70B model uses Grouped-Query Attention (GQA) for improved inference scalability (sketched below).
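To make GQA concrete, here is a small self-contained sketch (an illustration, not Meta's implementation): groups of query heads share a single key/value head, which shrinks the KV cache during inference.
```python
# Illustrative grouped-query attention: 8 query heads share 2 KV heads.
import torch

n_q_heads, n_kv_heads, head_dim, seq_len = 8, 2, 16, 4
q = torch.randn(seq_len, n_q_heads, head_dim)
k = torch.randn(seq_len, n_kv_heads, head_dim)
v = torch.randn(seq_len, n_kv_heads, head_dim)

group = n_q_heads // n_kv_heads          # query heads served by each KV head
k = k.repeat_interleave(group, dim=1)    # expand KV heads to match query heads
v = v.repeat_interleave(group, dim=1)

scores = q.transpose(0, 1) @ k.permute(1, 2, 0) / head_dim ** 0.5
attn = torch.softmax(scores, dim=-1)     # (n_q_heads, seq_len, seq_len)
out = attn @ v.transpose(0, 1)           # (n_q_heads, seq_len, head_dim)
```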
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** ["Llama 2: Open Foundation and Fine-Tuned Chat Models"](https://arxiv.org/abs/2307.09288)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted(tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/).
## Reporting Issues
Please report any software "bug" or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat-hf)|
|70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat-hf)|
| 20,829 | [
[
-0.037506103515625,
-0.06298828125,
0.0256500244140625,
0.0068206787109375,
-0.02313232421875,
-0.00391387939453125,
0.0092926025390625,
-0.038818359375,
0.0016031265258789062,
0.0215301513671875,
-0.052581787109375,
-0.036651611328125,
-0.0249786376953125,
-0.004169464111328125,
-0.0278778076171875,
0.07208251953125,
0.01580810546875,
-0.026397705078125,
-0.0227508544921875,
-0.011505126953125,
-0.0221099853515625,
-0.044769287109375,
-0.0496826171875,
-0.017608642578125,
0.0138397216796875,
0.0159912109375,
0.05877685546875,
0.0528564453125,
0.0194854736328125,
0.033477783203125,
-0.007045745849609375,
0.01372528076171875,
-0.0299835205078125,
0.0006723403930664062,
0.0199127197265625,
-0.023773193359375,
-0.0443115234375,
0.00319671630859375,
0.032806396484375,
0.013427734375,
-0.0232696533203125,
0.0200653076171875,
0.004306793212890625,
0.02862548828125,
-0.036895751953125,
0.013336181640625,
-0.0372314453125,
-0.0016622543334960938,
-0.007564544677734375,
0.0074310302734375,
-0.0095672607421875,
-0.0084381103515625,
-0.00170135498046875,
-0.0626220703125,
-0.0006618499755859375,
0.01464080810546875,
0.092529296875,
0.024444580078125,
-0.043975830078125,
0.006214141845703125,
-0.03790283203125,
0.08013916015625,
-0.08502197265625,
0.024627685546875,
0.0269775390625,
0.017730712890625,
-0.01467132568359375,
-0.0738525390625,
-0.053955078125,
-0.01297760009765625,
-0.0089263916015625,
0.01483154296875,
-0.04461669921875,
-0.0024871826171875,
0.01372528076171875,
0.0389404296875,
-0.044647216796875,
0.0035457611083984375,
-0.024688720703125,
-0.0172271728515625,
0.057373046875,
0.031494140625,
0.02142333984375,
-0.019256591796875,
-0.03216552734375,
-0.0242462158203125,
-0.039154052734375,
0.0113525390625,
0.014251708984375,
0.00211334228515625,
-0.04278564453125,
0.041595458984375,
-0.0240325927734375,
0.037353515625,
0.0177459716796875,
-0.00878143310546875,
0.0208740234375,
-0.039031982421875,
-0.0428466796875,
-0.037689208984375,
0.09539794921875,
0.03338623046875,
-0.024932861328125,
0.01474761962890625,
-0.0014562606811523438,
-0.0090484619140625,
0.002727508544921875,
-0.06378173828125,
-0.021820068359375,
0.04779052734375,
-0.045440673828125,
-0.0361328125,
-0.0186920166015625,
-0.049407958984375,
-0.0128631591796875,
0.006130218505859375,
0.039031982421875,
-0.025787353515625,
-0.026885986328125,
-0.00540924072265625,
-0.023773193359375,
0.044677734375,
0.0211334228515625,
-0.058074951171875,
0.0266571044921875,
0.032806396484375,
0.050537109375,
0.0088653564453125,
-0.0169219970703125,
-0.0238189697265625,
0.0010623931884765625,
-0.007701873779296875,
0.0435791015625,
-0.01042938232421875,
-0.03436279296875,
-0.0240936279296875,
0.0110015869140625,
0.016021728515625,
-0.024383544921875,
0.02960205078125,
-0.0182037353515625,
0.03216552734375,
-0.026641845703125,
-0.034088134765625,
-0.0208587646484375,
0.00954437255859375,
-0.0369873046875,
0.0936279296875,
0.017303466796875,
-0.0626220703125,
0.007671356201171875,
-0.037261962890625,
-0.0142364501953125,
0.0030841827392578125,
-0.0008912086486816406,
-0.0455322265625,
-0.0175933837890625,
0.0283203125,
0.0274810791015625,
-0.03533935546875,
0.00018215179443359375,
-0.03179931640625,
-0.01477813720703125,
0.021514892578125,
-0.03472900390625,
0.0936279296875,
0.02301025390625,
-0.04052734375,
0.0058135986328125,
-0.0521240234375,
0.01201629638671875,
0.031707763671875,
-0.021636962890625,
0.0046539306640625,
-0.0013189315795898438,
0.0013828277587890625,
0.004276275634765625,
0.033050537109375,
-0.0297698974609375,
0.0158538818359375,
-0.02215576171875,
0.053619384765625,
0.057403564453125,
0.00019800662994384766,
0.036468505859375,
-0.045806884765625,
0.0301971435546875,
0.0095062255859375,
0.039276123046875,
0.0056610107421875,
-0.053192138671875,
-0.07208251953125,
-0.02313232421875,
0.0157318115234375,
0.050811767578125,
-0.046630859375,
0.0496826171875,
0.012969970703125,
-0.05859375,
-0.04132080078125,
-0.00787353515625,
0.0185089111328125,
0.03350830078125,
0.03289794921875,
-0.0191497802734375,
-0.05010986328125,
-0.06085205078125,
0.005462646484375,
-0.03668212890625,
-0.01142120361328125,
0.041534423828125,
0.047576904296875,
-0.0313720703125,
0.054718017578125,
-0.03851318359375,
-0.01407623291015625,
-0.0072021484375,
0.004032135009765625,
0.0232696533203125,
0.0499267578125,
0.053680419921875,
-0.039215087890625,
-0.0308074951171875,
-0.00974273681640625,
-0.058563232421875,
-0.0064849853515625,
-0.003200531005859375,
-0.035186767578125,
0.0249481201171875,
0.013702392578125,
-0.06640625,
0.03814697265625,
0.045623779296875,
-0.03302001953125,
0.048309326171875,
-0.01444244384765625,
0.00899505615234375,
-0.0821533203125,
0.0037078857421875,
-0.003265380859375,
-0.0226593017578125,
-0.0364990234375,
0.00798797607421875,
-0.01434326171875,
0.00946807861328125,
-0.03375244140625,
0.0533447265625,
-0.033935546875,
0.0024871826171875,
-0.0056610107421875,
-0.00708770751953125,
0.024810791015625,
0.03594970703125,
-0.01140594482421875,
0.054046630859375,
0.047119140625,
-0.050872802734375,
0.04022216796875,
0.0323486328125,
-0.0029449462890625,
0.0235137939453125,
-0.06988525390625,
0.017608642578125,
0.0109405517578125,
0.028961181640625,
-0.088623046875,
-0.00962066650390625,
0.037628173828125,
-0.048370361328125,
0.01025390625,
-0.016510009765625,
-0.027435302734375,
-0.03265380859375,
-0.032684326171875,
0.0196990966796875,
0.07550048828125,
-0.035064697265625,
0.045074462890625,
0.03753662109375,
0.0137176513671875,
-0.059112548828125,
-0.06182861328125,
-0.01264190673828125,
-0.0290069580078125,
-0.046630859375,
0.0263671875,
-0.0170745849609375,
-0.02227783203125,
0.001041412353515625,
0.002227783203125,
-0.01161956787109375,
0.01371002197265625,
0.023681640625,
0.025299072265625,
-0.0102996826171875,
-0.016998291015625,
0.005916595458984375,
-0.0006194114685058594,
0.0084381103515625,
-0.02142333984375,
0.04241943359375,
-0.028411865234375,
-0.00577545166015625,
-0.051361083984375,
0.0217742919921875,
0.04052734375,
-0.01399993896484375,
0.072998046875,
0.059112548828125,
-0.0181884765625,
-0.0010242462158203125,
-0.04022216796875,
-0.0240325927734375,
-0.04150390625,
0.01477813720703125,
-0.01302337646484375,
-0.04962158203125,
0.048065185546875,
0.02728271484375,
0.0267333984375,
0.057373046875,
0.042510986328125,
-0.0295562744140625,
0.081298828125,
0.043121337890625,
-0.001987457275390625,
0.032623291015625,
-0.047821044921875,
-0.004375457763671875,
-0.06561279296875,
-0.0152130126953125,
-0.031585693359375,
-0.01386260986328125,
-0.048004150390625,
-0.04034423828125,
0.0278472900390625,
0.007404327392578125,
-0.04669189453125,
0.0284423828125,
-0.046722412109375,
0.0006489753723144531,
0.05609130859375,
0.01027679443359375,
0.01342010498046875,
-0.00644683837890625,
-0.0119781494140625,
0.002880096435546875,
-0.052001953125,
-0.0289459228515625,
0.0816650390625,
0.0257110595703125,
0.044830322265625,
0.00943756103515625,
0.04888916015625,
0.01375579833984375,
0.0119171142578125,
-0.041839599609375,
0.044403076171875,
0.005649566650390625,
-0.04766845703125,
-0.031005859375,
-0.034881591796875,
-0.06805419921875,
0.0230712890625,
-0.0169677734375,
-0.05316162109375,
0.021453857421875,
0.00875091552734375,
-0.03533935546875,
0.0233612060546875,
-0.03375244140625,
0.058349609375,
-0.00879669189453125,
-0.02508544921875,
-0.0013322830200195312,
-0.045501708984375,
0.0296630859375,
0.017852783203125,
0.01380157470703125,
-0.01934814453125,
-0.01471710205078125,
0.058868408203125,
-0.069580078125,
0.0660400390625,
-0.0107879638671875,
-0.00372314453125,
0.04339599609375,
-0.005275726318359375,
0.042236328125,
0.01049041748046875,
-0.0090484619140625,
0.030364990234375,
0.01000213623046875,
-0.0322265625,
-0.0234832763671875,
0.042388916015625,
-0.08447265625,
-0.054534912109375,
-0.03271484375,
-0.037445068359375,
0.01250457763671875,
0.006916046142578125,
0.0330810546875,
0.0170135498046875,
-0.00710296630859375,
0.0142059326171875,
0.0307464599609375,
-0.0291748046875,
0.04193115234375,
0.0292816162109375,
-0.01323699951171875,
-0.041046142578125,
0.0474853515625,
-0.0028133392333984375,
0.0206756591796875,
0.0156402587890625,
0.01145172119140625,
-0.035858154296875,
-0.030029296875,
-0.045501708984375,
0.0219879150390625,
-0.040740966796875,
-0.036956787109375,
-0.056884765625,
-0.032470703125,
-0.0379638671875,
0.00078582763671875,
-0.0313720703125,
-0.0404052734375,
-0.052276611328125,
0.0018978118896484375,
0.0679931640625,
0.02972412109375,
-0.028167724609375,
0.0333251953125,
-0.052398681640625,
0.0186004638671875,
0.037445068359375,
-0.003116607666015625,
0.006435394287109375,
-0.05810546875,
-0.004802703857421875,
0.02484130859375,
-0.03863525390625,
-0.057708740234375,
0.054443359375,
0.01471710205078125,
0.048065185546875,
0.02197265625,
0.0215301513671875,
0.06103515625,
-0.0168304443359375,
0.07501220703125,
0.0072784423828125,
-0.08172607421875,
0.03546142578125,
-0.0284423828125,
0.0217437744140625,
0.0198211669921875,
0.025054931640625,
-0.03192138671875,
-0.03936767578125,
-0.057373046875,
-0.0693359375,
0.043975830078125,
0.034027099609375,
0.00873565673828125,
0.00452423095703125,
0.0235137939453125,
-0.0100555419921875,
0.01275634765625,
-0.06591796875,
-0.05010986328125,
-0.02783203125,
-0.00667572021484375,
0.0194854736328125,
-0.00914764404296875,
-0.016937255859375,
-0.04034423828125,
0.06256103515625,
-0.005649566650390625,
0.0556640625,
0.0205841064453125,
0.00841522216796875,
-0.01297760009765625,
0.00725555419921875,
0.0193023681640625,
0.037750244140625,
-0.01168060302734375,
-0.010894775390625,
0.0298919677734375,
-0.0298309326171875,
0.010162353515625,
0.0128021240234375,
-0.011322021484375,
-0.0144805908203125,
0.00991058349609375,
0.06732177734375,
-0.006427764892578125,
-0.0306854248046875,
0.033050537109375,
-0.0203704833984375,
-0.03131103515625,
-0.0292205810546875,
0.0168914794921875,
0.02313232421875,
0.04327392578125,
0.040985107421875,
-0.0158538818359375,
0.0167083740234375,
-0.040771484375,
0.0113983154296875,
0.05230712890625,
-0.00572967529296875,
-0.00650787353515625,
0.08392333984375,
0.00685882568359375,
-0.010009765625,
0.06201171875,
-0.01300048828125,
-0.03564453125,
0.07586669921875,
0.04241943359375,
0.04986572265625,
-0.0012664794921875,
0.01904296875,
0.039703369140625,
0.0181884765625,
0.00838470458984375,
0.029449462890625,
-0.0012874603271484375,
-0.0474853515625,
-0.0187835693359375,
-0.045440673828125,
-0.030517578125,
0.0207977294921875,
-0.0474853515625,
0.019073486328125,
-0.0404052734375,
-0.0276031494140625,
-0.01238250732421875,
0.0217742919921875,
-0.04925537109375,
0.0194854736328125,
0.0145263671875,
0.050262451171875,
-0.049041748046875,
0.05670166015625,
0.04119873046875,
-0.0307464599609375,
-0.067626953125,
-0.019256591796875,
0.0121307373046875,
-0.06353759765625,
0.012939453125,
-0.00012433528900146484,
0.0128936767578125,
0.016448974609375,
-0.0667724609375,
-0.080322265625,
0.113037109375,
0.00777435302734375,
-0.0396728515625,
0.0008935928344726562,
0.00374603271484375,
0.029815673828125,
-0.0179901123046875,
0.051910400390625,
0.03485107421875,
0.032470703125,
0.01235198974609375,
-0.07012939453125,
0.0290679931640625,
-0.021392822265625,
-0.00312042236328125,
0.0001061558723449707,
-0.08319091796875,
0.08917236328125,
-0.0192413330078125,
-0.016387939453125,
0.0301971435546875,
0.0712890625,
0.0460205078125,
0.003265380859375,
0.0361328125,
0.047027587890625,
0.06390380859375,
-0.0122222900390625,
0.073486328125,
-0.0216217041015625,
0.045257568359375,
0.054779052734375,
-0.0032672882080078125,
0.060272216796875,
0.0224151611328125,
-0.039215087890625,
0.047515869140625,
0.05908203125,
-0.02264404296875,
0.029388427734375,
0.002178192138671875,
-0.018341064453125,
-0.01154327392578125,
0.0035648345947265625,
-0.04986572265625,
0.025787353515625,
0.0287933349609375,
-0.0133209228515625,
0.0010786056518554688,
-0.015289306640625,
0.006259918212890625,
-0.0408935546875,
-0.0061492919921875,
0.05035400390625,
0.0222320556640625,
-0.021087646484375,
0.081787109375,
0.0038547515869140625,
0.0626220703125,
-0.036773681640625,
-0.00792694091796875,
-0.0286865234375,
0.0030574798583984375,
-0.01439666748046875,
-0.051055908203125,
0.01132965087890625,
-0.00774383544921875,
0.0010833740234375,
0.007106781005859375,
0.044921875,
-0.0167083740234375,
-0.0328369140625,
0.02191162109375,
0.036041259765625,
0.020751953125,
0.00481414794921875,
-0.07452392578125,
0.02008056640625,
0.004688262939453125,
-0.039886474609375,
0.021820068359375,
0.0246124267578125,
0.0193939208984375,
0.05633544921875,
0.057281494140625,
-0.0228271484375,
0.005153656005859375,
-0.0294952392578125,
0.0704345703125,
-0.051544189453125,
-0.0222320556640625,
-0.0677490234375,
0.061492919921875,
-0.004852294921875,
-0.028961181640625,
0.062744140625,
0.0290679931640625,
0.044952392578125,
0.00432586669921875,
0.061248779296875,
-0.029876708984375,
0.01305389404296875,
-0.0186920166015625,
0.06842041015625,
-0.061492919921875,
0.0179290771484375,
-0.009246826171875,
-0.04925537109375,
0.00440216064453125,
0.056732177734375,
0.0046539306640625,
0.0117950439453125,
0.038543701171875,
0.059326171875,
0.007518768310546875,
-0.00995635986328125,
0.01947021484375,
0.040374755859375,
0.022003173828125,
0.054901123046875,
0.05889892578125,
-0.06884765625,
0.05419921875,
-0.0455322265625,
-0.0142669677734375,
-0.01242828369140625,
-0.06500244140625,
-0.0654296875,
-0.04046630859375,
-0.028106689453125,
-0.04736328125,
-0.00540924072265625,
0.06146240234375,
0.0640869140625,
-0.04937744140625,
-0.0269012451171875,
-0.006053924560546875,
0.004268646240234375,
-0.0198974609375,
-0.021942138671875,
0.018096923828125,
0.00627899169921875,
-0.059234619140625,
0.021636962890625,
-0.00601959228515625,
0.0304718017578125,
-0.0235595703125,
-0.01132965087890625,
-0.0196380615234375,
0.0175018310546875,
0.032470703125,
0.037353515625,
-0.053680419921875,
-0.010345458984375,
-0.0013790130615234375,
-0.01512908935546875,
0.0192108154296875,
0.00370025634765625,
-0.06591796875,
-0.0049896240234375,
0.035430908203125,
0.0155487060546875,
0.050384521484375,
-0.00128936767578125,
0.044952392578125,
-0.033660888671875,
0.0218963623046875,
0.00817108154296875,
0.024688720703125,
0.0114593505859375,
-0.04595947265625,
0.0328369140625,
0.0139617919921875,
-0.0565185546875,
-0.06561279296875,
-0.00711822509765625,
-0.0787353515625,
-0.018341064453125,
0.08489990234375,
-0.01108551025390625,
-0.034881591796875,
0.0007014274597167969,
-0.019256591796875,
0.033538818359375,
-0.03717041015625,
0.03997802734375,
0.0269317626953125,
-0.01119232177734375,
-0.023468017578125,
-0.0404052734375,
0.041656494140625,
0.0271453857421875,
-0.071533203125,
-0.005260467529296875,
0.0298919677734375,
0.03204345703125,
-0.00936126708984375,
0.059112548828125,
0.000225067138671875,
0.0221710205078125,
0.01059722900390625,
0.0107574462890625,
0.0013017654418945312,
0.00438690185546875,
-0.01132965087890625,
-0.01232147216796875,
-0.010650634765625,
-0.0199432373046875
]
] |
Uminosachi/realisticVisionV51_v51VAE-inpainting | 2023-08-01T02:05:37.000Z | [
"diffusers",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | null | Uminosachi | null | null | Uminosachi/realisticVisionV51_v51VAE-inpainting | 1 | 12,537 | diffusers | 2023-08-01T01:53:24 | ---
license: creativeml-openrail-m
---
This is an inpainting model converted from [realisticVisionV51_v51VAE-inpainting](https://civitai.com/models/4201?modelVersionId=130090) on Civitai. | 196 | [
[
-0.019317626953125,
-0.01666259765625,
0.03387451171875,
0.01064300537109375,
-0.0355224609375,
0.0186920166015625,
0.032562255859375,
-0.038055419921875,
0.04498291015625,
0.0740966796875,
-0.0831298828125,
0.016815185546875,
-0.00656890869140625,
-0.0252532958984375,
-0.02252197265625,
0.025665283203125,
-0.007572174072265625,
0.032379150390625,
-0.01210784912109375,
0.0208282470703125,
-0.0169677734375,
-0.015960693359375,
-0.0316162109375,
-0.0246124267578125,
0.0006394386291503906,
0.04071044921875,
0.0304718017578125,
0.0192108154296875,
0.049041748046875,
0.018341064453125,
0.00867462158203125,
-0.0030841827392578125,
-0.032684326171875,
-0.0263214111328125,
0.005397796630859375,
-0.056396484375,
-0.044219970703125,
0.0239410400390625,
0.0266876220703125,
0.01462554931640625,
-0.01248931884765625,
0.0267486572265625,
-0.01259613037109375,
0.03607177734375,
-0.0545654296875,
0.005779266357421875,
-0.005695343017578125,
0.0236053466796875,
-0.01430511474609375,
-0.0188140869140625,
-0.04052734375,
-0.025665283203125,
-0.00844573974609375,
-0.06805419921875,
0.0157318115234375,
-0.0217437744140625,
0.095458984375,
0.0213775634765625,
-0.0161285400390625,
0.022369384765625,
-0.07806396484375,
0.0311431884765625,
-0.049591064453125,
0.03411865234375,
0.00495147705078125,
0.0762939453125,
-0.042144775390625,
-0.087158203125,
-0.0281219482421875,
-0.007694244384765625,
0.0219268798828125,
0.024871826171875,
-0.0236968994140625,
0.0067901611328125,
0.023834228515625,
0.0318603515625,
-0.02825927734375,
-0.0186767578125,
-0.04656982421875,
-0.006816864013671875,
0.039947509765625,
0.00582122802734375,
0.035552978515625,
0.006259918212890625,
-0.06243896484375,
-0.0077972412109375,
-0.041839599609375,
-0.0037441253662109375,
0.0197601318359375,
-0.005435943603515625,
0.00844573974609375,
0.06146240234375,
-0.031768798828125,
0.0712890625,
0.01202392578125,
-0.009552001953125,
0.006702423095703125,
0.00040340423583984375,
-0.048980712890625,
0.004489898681640625,
-0.000019490718841552734,
0.06610107421875,
0.03326416015625,
-0.003002166748046875,
-0.01393890380859375,
-0.0171966552734375,
0.045379638671875,
-0.0916748046875,
-0.0292205810546875,
-0.01016998291015625,
-0.0328369140625,
-0.006366729736328125,
0.03704833984375,
-0.019287109375,
-0.0007843971252441406,
-0.0163726806640625,
0.0330810546875,
-0.034912109375,
-0.0263214111328125,
0.021484375,
-0.0086822509765625,
0.007282257080078125,
0.048583984375,
-0.023406982421875,
0.020050048828125,
0.022216796875,
0.0543212890625,
0.03277587890625,
0.00989532470703125,
-0.0026035308837890625,
0.0118560791015625,
-0.038604736328125,
0.058013916015625,
-0.0209197998046875,
-0.03314208984375,
0.0015926361083984375,
0.033935546875,
0.0158233642578125,
-0.051910400390625,
0.03289794921875,
-0.050048828125,
0.01435089111328125,
-0.005970001220703125,
-0.04510498046875,
-0.04327392578125,
0.024627685546875,
-0.06939697265625,
0.060577392578125,
0.0275115966796875,
-0.038543701171875,
0.05169677734375,
-0.041229248046875,
0.02862548828125,
0.033233642578125,
0.0244293212890625,
-0.039794921875,
0.0201568603515625,
-0.02752685546875,
0.0200347900390625,
-0.01337432861328125,
-0.0007824897766113281,
-0.049072265625,
-0.043182373046875,
0.00988006591796875,
-0.01361846923828125,
0.0645751953125,
0.0237579345703125,
0.0090484619140625,
0.022552490234375,
-0.0823974609375,
-0.01094818115234375,
0.0035114288330078125,
-0.00045299530029296875,
-0.015960693359375,
-0.040313720703125,
0.00994110107421875,
0.048919677734375,
0.036956787109375,
-0.0712890625,
0.019073486328125,
-0.03900146484375,
-0.0040283203125,
0.0195159912109375,
0.03533935546875,
0.0290679931640625,
-0.041595458984375,
0.045562744140625,
-0.0131072998046875,
0.04632568359375,
0.02166748046875,
-0.05963134765625,
-0.082763671875,
-0.046356201171875,
-0.00908660888671875,
0.014434814453125,
-0.06304931640625,
0.00901031494140625,
-0.006336212158203125,
-0.06683349609375,
-0.03765869140625,
-0.027557373046875,
0.016204833984375,
0.0246734619140625,
0.005634307861328125,
-0.050689697265625,
-0.0399169921875,
-0.0836181640625,
0.0018758773803710938,
-0.00609588623046875,
-0.0313720703125,
-0.0019254684448242188,
0.028167724609375,
-0.00042629241943359375,
0.044097900390625,
-0.0195770263671875,
-0.0226287841796875,
0.002513885498046875,
-0.0252227783203125,
0.032562255859375,
0.039093017578125,
0.06085205078125,
-0.032135009765625,
-0.057098388671875,
-0.005947113037109375,
-0.041259765625,
0.0186920166015625,
0.0008020401000976562,
-0.03436279296875,
-0.01332855224609375,
0.059539794921875,
-0.0301361083984375,
0.05975341796875,
0.030792236328125,
-0.027435302734375,
0.0423583984375,
-0.0287017822265625,
0.04595947265625,
-0.07659912109375,
0.003353118896484375,
0.00423431396484375,
-0.045257568359375,
-0.042999267578125,
0.041107177734375,
0.0125885009765625,
-0.001987457275390625,
-0.053253173828125,
0.022674560546875,
-0.0501708984375,
0.012908935546875,
-0.031402587890625,
-0.0323486328125,
-0.0007371902465820312,
0.020904541015625,
-0.0030078887939453125,
0.0303955078125,
0.0291900634765625,
-0.01349639892578125,
0.0701904296875,
0.01506805419921875,
-0.053741455078125,
0.033172607421875,
-0.036224365234375,
0.01129150390625,
-0.0104827880859375,
0.0162200927734375,
-0.0728759765625,
-0.0452880859375,
0.0411376953125,
-0.007381439208984375,
0.00302886962890625,
-0.038055419921875,
-0.03790283203125,
-0.0189208984375,
-0.01424407958984375,
0.0325927734375,
0.022857666015625,
-0.046661376953125,
0.047149658203125,
0.006694793701171875,
0.00844573974609375,
0.004337310791015625,
-0.06597900390625,
0.006755828857421875,
-0.0237884521484375,
-0.03790283203125,
0.042327880859375,
0.0003139972686767578,
-0.0278472900390625,
-0.00640869140625,
0.0186309814453125,
-0.028350830078125,
-0.0241851806640625,
0.0252532958984375,
0.0543212890625,
-0.03155517578125,
-0.0212860107421875,
-0.0049591064453125,
-0.0020160675048828125,
0.01285552978515625,
0.029815673828125,
0.0458984375,
0.0157470703125,
-0.029296875,
-0.067138671875,
0.0269012451171875,
0.0706787109375,
0.00653076171875,
0.056243896484375,
0.0110931396484375,
-0.06915283203125,
-0.01425933837890625,
-0.0404052734375,
-0.0236053466796875,
-0.0325927734375,
0.0193939208984375,
-0.047271728515625,
-0.0084228515625,
0.0236053466796875,
-0.02471923828125,
-0.031585693359375,
0.041259765625,
0.0268707275390625,
-0.00726318359375,
0.05523681640625,
0.0579833984375,
0.036407470703125,
0.0640869140625,
-0.056640625,
-0.038726806640625,
-0.04217529296875,
-0.056243896484375,
-0.0032806396484375,
-0.010162353515625,
-0.01488494873046875,
-0.0543212890625,
0.0152740478515625,
-0.0121002197265625,
-0.0232696533203125,
0.03155517578125,
-0.027435302734375,
0.0404052734375,
0.034515380859375,
0.056640625,
0.0187530517578125,
-0.01483154296875,
0.037200927734375,
-0.03277587890625,
-0.0212249755859375,
-0.04119873046875,
0.05426025390625,
0.004940032958984375,
0.02899169921875,
0.0145721435546875,
0.0239410400390625,
0.006534576416015625,
0.033843994140625,
-0.0301361083984375,
0.03228759765625,
-0.01149749755859375,
-0.0775146484375,
0.002086639404296875,
0.0217742919921875,
-0.03387451171875,
0.0133056640625,
-0.060699462890625,
-0.0295867919921875,
0.034698486328125,
0.00507354736328125,
-0.004291534423828125,
0.035491943359375,
-0.0517578125,
0.0592041015625,
0.01114654541015625,
0.008392333984375,
-0.039520263671875,
-0.031341552734375,
0.05682373046875,
0.005863189697265625,
-0.020904541015625,
-0.0022258758544921875,
0.0217742919921875,
0.0246429443359375,
-0.057220458984375,
0.039031982421875,
-0.017822265625,
0.0179443359375,
0.01483917236328125,
0.0244293212890625,
0.038421630859375,
-0.0025310516357421875,
-0.0111236572265625,
-0.0360107421875,
-0.0207366943359375,
-0.0404052734375,
-0.0306243896484375,
0.0433349609375,
-0.05133056640625,
-0.02398681640625,
-0.0294647216796875,
-0.007762908935546875,
0.000827789306640625,
-0.0014667510986328125,
0.042694091796875,
0.0433349609375,
-0.06158447265625,
-0.008331298828125,
0.057952880859375,
0.0133819580078125,
0.032562255859375,
0.01441192626953125,
-0.034942626953125,
-0.004024505615234375,
0.05523681640625,
0.00522613525390625,
0.04620361328125,
0.03363037109375,
-0.017730712890625,
-0.00724029541015625,
-0.0246429443359375,
-0.039031982421875,
0.033233642578125,
-0.040130615234375,
0.0017385482788085938,
0.00861358642578125,
-0.040557861328125,
-0.003955841064453125,
-0.0187530517578125,
-0.050445556640625,
-0.037200927734375,
-0.06011962890625,
0.0012054443359375,
0.0274200439453125,
0.06787109375,
0.0295867919921875,
0.02581787109375,
-0.03692626953125,
0.01331329345703125,
0.054931640625,
0.0075531005859375,
-0.0271148681640625,
-0.06756591796875,
-0.024871826171875,
-0.004791259765625,
-0.01629638671875,
-0.04986572265625,
0.052581787109375,
0.0146026611328125,
0.0294189453125,
0.01776123046875,
-0.0173187255859375,
0.07183837890625,
-0.03546142578125,
0.04425048828125,
0.0195770263671875,
-0.03424072265625,
0.015625,
-0.01551055908203125,
0.016815185546875,
0.050079345703125,
0.0166168212890625,
-0.01641845703125,
-0.007171630859375,
-0.07806396484375,
-0.04803466796875,
0.032958984375,
-0.009552001953125,
-0.005985260009765625,
0.045257568359375,
0.031341552734375,
0.01398468017578125,
0.0107574462890625,
-0.032806396484375,
-0.0107421875,
-0.041534423828125,
0.01134490966796875,
-0.00672149658203125,
-0.015655517578125,
-0.0013589859008789062,
-0.0259857177734375,
0.0654296875,
0.004451751708984375,
0.0257568359375,
0.017730712890625,
-0.014190673828125,
-0.01068878173828125,
-0.0246429443359375,
0.04119873046875,
0.026275634765625,
-0.06463623046875,
-0.025054931640625,
-0.00518798828125,
-0.028350830078125,
0.002506256103515625,
-0.004009246826171875,
0.0059814453125,
0.0157623291015625,
0.0151824951171875,
0.070068359375,
-0.0006394386291503906,
-0.0181121826171875,
0.0567626953125,
-0.0079803466796875,
0.0154266357421875,
-0.06414794921875,
0.006259918212890625,
-0.0066375732421875,
0.042816162109375,
0.01012420654296875,
0.040313720703125,
0.05609130859375,
-0.045989990234375,
-0.0024089813232421875,
0.015228271484375,
-0.05755615234375,
-0.035308837890625,
0.075439453125,
0.0193939208984375,
-0.058380126953125,
0.055023193359375,
0.004364013671875,
0.01471710205078125,
0.038330078125,
0.03570556640625,
0.0826416015625,
-0.0235443115234375,
0.0255126953125,
0.049560546875,
0.005512237548828125,
0.0002887248992919922,
0.04144287109375,
0.019989013671875,
-0.0297698974609375,
-0.027099609375,
-0.01497650146484375,
-0.058807373046875,
-0.0032806396484375,
-0.07025146484375,
0.030792236328125,
-0.04412841796875,
-0.005535125732421875,
-0.004352569580078125,
-0.0251007080078125,
-0.036407470703125,
0.06378173828125,
0.028900146484375,
0.09124755859375,
-0.07550048828125,
0.0906982421875,
0.056549072265625,
-0.034149169921875,
-0.0240631103515625,
-0.005153656005859375,
-0.01194000244140625,
-0.063232421875,
0.0140228271484375,
-0.0010528564453125,
-0.023193359375,
-0.0088653564453125,
-0.06256103515625,
-0.050811767578125,
0.06976318359375,
0.06805419921875,
-0.03826904296875,
0.001934051513671875,
-0.00562286376953125,
0.03533935546875,
-0.041412353515625,
0.01094818115234375,
0.033416748046875,
0.0273590087890625,
0.0117645263671875,
-0.0565185546875,
-0.0008649826049804688,
-0.051971435546875,
0.03125,
-0.008026123046875,
-0.0654296875,
0.060943603515625,
0.0027313232421875,
0.00417327880859375,
0.04449462890625,
0.07366943359375,
0.031768798828125,
-0.0174102783203125,
0.047515869140625,
0.039703369140625,
0.01763916015625,
-0.01352691650390625,
0.0709228515625,
0.01042938232421875,
0.01131439208984375,
0.04766845703125,
0.014678955078125,
0.05499267578125,
0.049652099609375,
-0.005985260009765625,
0.0614013671875,
0.062042236328125,
0.0013027191162109375,
0.060333251953125,
0.007717132568359375,
-0.052825927734375,
-0.035186767578125,
-0.015228271484375,
-0.01387786865234375,
0.0345458984375,
0.0203094482421875,
-0.01361846923828125,
0.007564544677734375,
0.00362396240234375,
-0.0157012939453125,
0.0164794921875,
-0.0261383056640625,
0.04693603515625,
-0.0191497802734375,
-0.0230255126953125,
0.037384033203125,
-0.005619049072265625,
0.03192138671875,
-0.047515869140625,
-0.034210205078125,
0.0054473876953125,
0.02313232421875,
-0.00220489501953125,
-0.050689697265625,
0.0190887451171875,
-0.037628173828125,
-0.0153961181640625,
-0.01529693603515625,
0.04443359375,
-0.0233154296875,
-0.0633544921875,
0.0210723876953125,
-0.0008993148803710938,
0.040863037109375,
-0.00586700439453125,
-0.041168212890625,
-0.0056610107421875,
0.0166168212890625,
-0.03155517578125,
0.0027828216552734375,
0.00981903076171875,
-0.010772705078125,
0.0390625,
0.0198974609375,
0.0178680419921875,
0.0233154296875,
0.0005822181701660156,
0.049560546875,
-0.04083251953125,
-0.035797119140625,
-0.0163726806640625,
0.059112548828125,
-0.034149169921875,
-0.04248046875,
0.042510986328125,
0.056976318359375,
0.06610107421875,
-0.07904052734375,
0.035125732421875,
0.0207672119140625,
0.01514434814453125,
-0.045257568359375,
0.060791015625,
-0.056976318359375,
-0.04248046875,
-0.0277557373046875,
-0.06964111328125,
-0.0235595703125,
0.047393798828125,
0.04583740234375,
-0.01270294189453125,
0.00911712646484375,
0.06005859375,
-0.0132598876953125,
-0.0172882080078125,
0.0504150390625,
0.0133819580078125,
0.019683837890625,
0.00262451171875,
0.053314208984375,
-0.042510986328125,
0.01070404052734375,
-0.050201416015625,
-0.0277862548828125,
-0.0266265869140625,
-0.0579833984375,
-0.045501708984375,
-0.056610107421875,
-0.0266571044921875,
-0.0202484130859375,
0.0021877288818359375,
0.04656982421875,
0.06646728515625,
-0.04815673828125,
-0.0362548828125,
0.00879669189453125,
-0.0400390625,
0.01053619384765625,
-0.01024627685546875,
-0.0125732421875,
0.04638671875,
-0.08251953125,
0.045257568359375,
0.025390625,
0.037017822265625,
-0.03125,
0.0261383056640625,
-0.0093536376953125,
0.016632080078125,
0.0177001953125,
0.0272369384765625,
-0.046539306640625,
-0.027679443359375,
-0.020843505859375,
-0.0005626678466796875,
0.01450347900390625,
0.0308074951171875,
-0.045501708984375,
0.059906005859375,
0.0360107421875,
-0.0123748779296875,
0.0889892578125,
-0.01262664794921875,
0.038421630859375,
-0.0379638671875,
0.051025390625,
0.0004508495330810547,
0.053955078125,
0.03497314453125,
-0.00849151611328125,
0.034820556640625,
0.023406982421875,
-0.038848876953125,
-0.05780029296875,
0.01180267333984375,
-0.10235595703125,
0.004718780517578125,
0.047454833984375,
-0.0016145706176757812,
-0.044769287109375,
0.0111236572265625,
-0.051177978515625,
0.039459228515625,
-0.005695343017578125,
0.03216552734375,
0.034393310546875,
0.007663726806640625,
-0.018707275390625,
-0.0209197998046875,
0.01541900634765625,
-0.0228424072265625,
-0.054229736328125,
-0.042144775390625,
0.0226287841796875,
0.02984619140625,
0.01371002197265625,
0.0223388671875,
-0.0224456787109375,
0.03851318359375,
0.01015472412109375,
0.07257080078125,
0.0102386474609375,
-0.02496337890625,
-0.00241851806640625,
0.006931304931640625,
0.0223846435546875,
-0.045318603515625
]
] |
CAMeL-Lab/bert-base-arabic-camelbert-ca | 2021-09-14T14:27:12.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"ar",
"arxiv:2103.06678",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | CAMeL-Lab | null | null | CAMeL-Lab/bert-base-arabic-camelbert-ca | 8 | 12,530 | transformers | 2022-03-02T23:29:04 | ---
language:
- ar
license: apache-2.0
widget:
- text: "الهدف من الحياة هو [MASK] ."
---
# CAMeLBERT: A collection of pre-trained models for Arabic NLP tasks
## Model description
**CAMeLBERT** is a collection of BERT models pre-trained on Arabic texts with different sizes and variants.
We release pre-trained language models for Modern Standard Arabic (MSA), dialectal Arabic (DA), and classical Arabic (CA), in addition to a model pre-trained on a mix of the three.
We also provide additional models that are pre-trained on a scaled-down set of the MSA variant (half, quarter, eighth, and sixteenth).
The details are described in the paper *"[The Interplay of Variant, Size, and Task Type in Arabic Pre-trained Language Models](https://arxiv.org/abs/2103.06678)."*
This model card describes **CAMeLBERT-CA** (`bert-base-arabic-camelbert-ca`), a model pre-trained on the CA (classical Arabic) dataset.
||Model|Variant|Size|#Word|
|-|-|:-:|-:|-:|
||`bert-base-arabic-camelbert-mix`|CA,DA,MSA|167GB|17.3B|
|✔|`bert-base-arabic-camelbert-ca`|CA|6GB|847M|
||`bert-base-arabic-camelbert-da`|DA|54GB|5.8B|
||`bert-base-arabic-camelbert-msa`|MSA|107GB|12.6B|
||`bert-base-arabic-camelbert-msa-half`|MSA|53GB|6.3B|
||`bert-base-arabic-camelbert-msa-quarter`|MSA|27GB|3.1B|
||`bert-base-arabic-camelbert-msa-eighth`|MSA|14GB|1.6B|
||`bert-base-arabic-camelbert-msa-sixteenth`|MSA|6GB|746M|
## Intended uses
You can use the released model for either masked language modeling or next sentence prediction.
However, it is mostly intended to be fine-tuned on an NLP task, such as NER, POS tagging, sentiment analysis, dialect identification, and poetry classification.
We release our fine-tuning code [here](https://github.com/CAMeL-Lab/CAMeLBERT).
#### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='CAMeL-Lab/bert-base-arabic-camelbert-ca')
>>> unmasker("الهدف من الحياة هو [MASK] .")
[{'sequence': '[CLS] الهدف من الحياة هو الحياة. [SEP]',
'score': 0.11048116534948349,
'token': 3696,
'token_str': 'الحياة'},
{'sequence': '[CLS] الهدف من الحياة هو الإسلام. [SEP]',
'score': 0.03481195122003555,
'token': 4677,
'token_str': 'الإسلام'},
{'sequence': '[CLS] الهدف من الحياة هو الموت. [SEP]',
'score': 0.03402028977870941,
'token': 4295,
'token_str': 'الموت'},
{'sequence': '[CLS] الهدف من الحياة هو العلم. [SEP]',
'score': 0.027655426412820816,
'token': 2789,
'token_str': 'العلم'},
{'sequence': '[CLS] الهدف من الحياة هو هذا. [SEP]',
'score': 0.023059621453285217,
'token': 2085,
'token_str': 'هذا'}]
```
*Note*: to download our models, you would need `transformers>=3.5.0`. Otherwise, you could download the models manually.
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained('CAMeL-Lab/bert-base-arabic-camelbert-ca')
model = AutoModel.from_pretrained('CAMeL-Lab/bert-base-arabic-camelbert-ca')
text = "مرحبا يا عالم."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import AutoTokenizer, TFAutoModel
tokenizer = AutoTokenizer.from_pretrained('CAMeL-Lab/bert-base-arabic-camelbert-ca')
model = TFAutoModel.from_pretrained('CAMeL-Lab/bert-base-arabic-camelbert-ca')
text = "مرحبا يا عالم."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
## Training data
- CA (classical Arabic)
- [OpenITI (Version 2020.1.2)](https://zenodo.org/record/3891466#.YEX4-F0zbzc)
## Training procedure
We use [the original implementation](https://github.com/google-research/bert) released by Google for pre-training.
We follow the original English BERT model's hyperparameters for pre-training, unless otherwise specified.
### Preprocessing
- After extracting the raw text from each corpus, we apply the following pre-processing.
- We first remove invalid characters and normalize white spaces using the utilities provided by [the original BERT implementation](https://github.com/google-research/bert/blob/eedf5716ce1268e56f0a50264a88cafad334ac61/tokenization.py#L286-L297).
- We also remove lines without any Arabic characters.
- We then remove diacritics and kashida using [CAMeL Tools](https://github.com/CAMeL-Lab/camel_tools) (see the sketch after this list).
- Finally, we split each line into sentences with a heuristics-based sentence segmenter.
- We train a WordPiece tokenizer on the entire dataset (167 GB text) with a vocabulary size of 30,000 using [HuggingFace's tokenizers](https://github.com/huggingface/tokenizers).
- We neither lowercase letters nor strip accents.
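As a concrete illustration of the diacritic and kashida removal step above, here is a minimal sketch assuming CAMeL Tools is installed (`pip install camel-tools`); our actual pipeline may use additional utilities.
```python
# Minimal sketch of the diacritic/kashida removal step. `dediac_ar` strips
# Arabic diacritics; the kashida (tatweel, U+0640) is dropped separately here.
from camel_tools.utils.dediac import dediac_ar

text = "الْعِـلْمُ نُورٌ"  # diacritized, with a kashida in the first word
no_kashida = text.replace("\u0640", "")  # drop tatweel
print(dediac_ar(no_kashida))             # 'العلم نور'
```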
### Pre-training
- The model was trained on a single cloud TPU (`v3-8`) for one million steps in total.
- The first 90,000 steps were trained with a batch size of 1,024 and the rest was trained with a batch size of 256.
- The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%.
- We use whole word masking and a duplicate factor of 10.
- We set max predictions per sequence to 20 for the dataset with max sequence length of 128 tokens and 80 for the dataset with max sequence length of 512 tokens.
- We use a random seed of 12345, masked language model probability of 0.15, and short sequence probability of 0.1.
- The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after (sketched below).
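For reference, the warmup-then-linear-decay schedule described above can be sketched as follows; this is illustrative, not the exact pre-training code.
```python
# Illustrative linear-warmup / linear-decay schedule matching the description
# above (peak 1e-4, 10,000 warmup steps, one million total steps).
def learning_rate(step: int, peak: float = 1e-4,
                  warmup: int = 10_000, total: int = 1_000_000) -> float:
    if step < warmup:
        return peak * step / warmup                      # linear warmup to peak
    return peak * (total - step) / (total - warmup)      # linear decay to zero

assert abs(learning_rate(10_000) - 1e-4) < 1e-12
```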
## Evaluation results
- We evaluate our pre-trained language models on five NLP tasks: NER, POS tagging, sentiment analysis, dialect identification, and poetry classification.
- We fine-tune and evaluate the models using 12 datasets.
- We used Hugging Face's transformers to fine-tune our CAMeLBERT models.
- We used transformers `v3.1.0` along with PyTorch `v1.5.1`.
- The fine-tuning was done by adding a fully connected linear layer to the last hidden state (see the sketch after this list).
- We use \\(F_{1}\\) score as a metric for all tasks.
- Code used for fine-tuning is available [here](https://github.com/CAMeL-Lab/CAMeLBERT).
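The architecture described in the list above can be sketched roughly as follows; the class name and [CLS]-token pooling are illustrative assumptions, not the authors' exact code.
```python
# Rough sketch of the fine-tuning setup: a fully connected layer on top of the
# encoder's last hidden state. Pooling choice is an illustrative assumption.
import torch.nn as nn
from transformers import AutoModel

class CAMeLBERTClassifier(nn.Module):
    def __init__(self, num_labels: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(
            'CAMeL-Lab/bert-base-arabic-camelbert-ca')
        self.classifier = nn.Linear(self.encoder.config.hidden_size, num_labels)

    def forward(self, **inputs):
        hidden = self.encoder(**inputs).last_hidden_state  # (batch, seq, hidden)
        return self.classifier(hidden[:, 0])               # logits from [CLS]
```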
### Results
| Task | Dataset | Variant | Mix | CA | DA | MSA | MSA-1/2 | MSA-1/4 | MSA-1/8 | MSA-1/16 |
| -------------------- | --------------- | ------- | ----- | ----- | ----- | ----- | ------- | ------- | ------- | -------- |
| NER | ANERcorp | MSA | 80.8% | 67.9% | 74.1% | 82.4% | 82.0% | 82.1% | 82.6% | 80.8% |
| POS | PATB (MSA) | MSA | 98.1% | 97.8% | 97.7% | 98.3% | 98.2% | 98.3% | 98.2% | 98.2% |
| | ARZTB (EGY) | DA | 93.6% | 92.3% | 92.7% | 93.6% | 93.6% | 93.7% | 93.6% | 93.6% |
| | Gumar (GLF) | DA | 97.3% | 97.7% | 97.9% | 97.9% | 97.9% | 97.9% | 97.9% | 97.9% |
| SA | ASTD | MSA | 76.3% | 69.4% | 74.6% | 76.9% | 76.0% | 76.8% | 76.7% | 75.3% |
| | ArSAS | MSA | 92.7% | 89.4% | 91.8% | 93.0% | 92.6% | 92.5% | 92.5% | 92.3% |
| | SemEval | MSA | 69.0% | 58.5% | 68.4% | 72.1% | 70.7% | 72.8% | 71.6% | 71.2% |
| DID | MADAR-26 | DA | 62.9% | 61.9% | 61.8% | 62.6% | 62.0% | 62.8% | 62.0% | 62.2% |
| | MADAR-6 | DA | 92.5% | 91.5% | 92.2% | 91.9% | 91.8% | 92.2% | 92.1% | 92.0% |
| | MADAR-Twitter-5 | MSA | 75.7% | 71.4% | 74.2% | 77.6% | 78.5% | 77.3% | 77.7% | 76.2% |
| | NADI | DA | 24.7% | 17.3% | 20.1% | 24.9% | 24.6% | 24.6% | 24.9% | 23.8% |
| Poetry | APCD | CA | 79.8% | 80.9% | 79.6% | 79.7% | 79.9% | 80.0% | 79.7% | 79.8% |
### Results (Average)
| | Variant | Mix | CA | DA | MSA | MSA-1/2 | MSA-1/4 | MSA-1/8 | MSA-1/16 |
| -------------------- | ------- | ----- | ----- | ----- | ----- | ------- | ------- | ------- | -------- |
| Variant-wise-average<sup>[[1]](#footnote-1)</sup> | MSA | 82.1% | 75.7% | 80.1% | 83.4% | 83.0% | 83.3% | 83.2% | 82.3% |
| | DA | 74.4% | 72.1% | 72.9% | 74.2% | 74.0% | 74.3% | 74.1% | 73.9% |
| | CA | 79.8% | 80.9% | 79.6% | 79.7% | 79.9% | 80.0% | 79.7% | 79.8% |
| Macro-Average | ALL | 78.7% | 74.7% | 77.1% | 79.2% | 79.0% | 79.2% | 79.1% | 78.6% |
<a name="footnote-1">[1]</a>: Variant-wise-average refers to average over a group of tasks in the same language variant.
## Acknowledgements
This research was supported with Cloud TPUs from Google’s TensorFlow Research Cloud (TFRC).
## Citation
```bibtex
@inproceedings{inoue-etal-2021-interplay,
title = "The Interplay of Variant, Size, and Task Type in {A}rabic Pre-trained Language Models",
author = "Inoue, Go and
Alhafni, Bashar and
Baimukan, Nurpeiis and
Bouamor, Houda and
Habash, Nizar",
booktitle = "Proceedings of the Sixth Arabic Natural Language Processing Workshop",
month = apr,
year = "2021",
address = "Kyiv, Ukraine (Online)",
publisher = "Association for Computational Linguistics",
abstract = "In this paper, we explore the effects of language variants, data sizes, and fine-tuning task types in Arabic pre-trained language models. To do so, we build three pre-trained language models across three variants of Arabic: Modern Standard Arabic (MSA), dialectal Arabic, and classical Arabic, in addition to a fourth language model which is pre-trained on a mix of the three. We also examine the importance of pre-training data size by building additional models that are pre-trained on a scaled-down set of the MSA variant. We compare our different models to each other, as well as to eight publicly available models by fine-tuning them on five NLP tasks spanning 12 datasets. Our results suggest that the variant proximity of pre-training data to fine-tuning data is more important than the pre-training data size. We exploit this insight in defining an optimized system selection model for the studied tasks.",
}
```
| 10,412 | [
[
-0.02886962890625,
-0.046630859375,
-0.00592041015625,
0.02386474609375,
-0.031951904296875,
0.0110626220703125,
-0.01004791259765625,
-0.025970458984375,
0.036346435546875,
0.0242156982421875,
-0.040863037109375,
-0.05938720703125,
-0.0706787109375,
0.00799560546875,
-0.026947021484375,
0.09234619140625,
-0.005207061767578125,
-0.00033545494079589844,
0.0282440185546875,
-0.02001953125,
-0.0196075439453125,
-0.035400390625,
-0.038543701171875,
-0.01186370849609375,
0.035980224609375,
0.03228759765625,
0.05914306640625,
0.0254364013671875,
0.0298004150390625,
0.0275421142578125,
-0.00734710693359375,
0.00818634033203125,
-0.024749755859375,
-0.01073455810546875,
0.01229095458984375,
-0.0186309814453125,
-0.0273590087890625,
0.0037689208984375,
0.039520263671875,
0.04498291015625,
-0.0178985595703125,
0.033599853515625,
-0.0018930435180664062,
0.0657958984375,
-0.037017822265625,
0.0075531005859375,
-0.032440185546875,
-0.0004429817199707031,
-0.0156097412109375,
0.00623321533203125,
-0.009063720703125,
-0.029693603515625,
0.01690673828125,
-0.0333251953125,
0.0192108154296875,
0.0215301513671875,
0.09783935546875,
0.00785064697265625,
-0.0172882080078125,
-0.0234832763671875,
-0.027374267578125,
0.0655517578125,
-0.056488037109375,
0.01110076904296875,
0.03778076171875,
0.0112457275390625,
-0.0204620361328125,
-0.06182861328125,
-0.049346923828125,
-0.0014314651489257812,
-0.007293701171875,
0.00897216796875,
-0.0261688232421875,
-0.0134735107421875,
0.0161590576171875,
0.03521728515625,
-0.044464111328125,
-0.0116729736328125,
-0.03173828125,
-0.0259246826171875,
0.0484619140625,
0.0018901824951171875,
0.03131103515625,
-0.01360321044921875,
-0.0295257568359375,
-0.0251007080078125,
-0.030517578125,
0.0226593017578125,
0.033203125,
0.015167236328125,
-0.0227813720703125,
0.045196533203125,
-0.017974853515625,
0.0438232421875,
-0.0014286041259765625,
0.002613067626953125,
0.0435791015625,
-0.01751708984375,
-0.031402587890625,
0.003673553466796875,
0.07537841796875,
0.01139068603515625,
0.00940704345703125,
0.007045745849609375,
-0.0214691162109375,
-0.0019989013671875,
0.0006146430969238281,
-0.07037353515625,
-0.026947021484375,
0.0293426513671875,
-0.0408935546875,
-0.027435302734375,
0.006114959716796875,
-0.045318603515625,
0.0007534027099609375,
0.0030517578125,
0.04583740234375,
-0.04254150390625,
-0.0182952880859375,
0.0252838134765625,
-0.0157470703125,
0.025390625,
0.01983642578125,
-0.0638427734375,
0.0229034423828125,
0.0257415771484375,
0.0582275390625,
0.0011529922485351562,
-0.015716552734375,
-0.017364501953125,
-0.0010852813720703125,
-0.0157318115234375,
0.044830322265625,
-0.01531219482421875,
-0.0428466796875,
0.005443572998046875,
0.0102996826171875,
-0.0087127685546875,
-0.0235137939453125,
0.05499267578125,
-0.0338134765625,
0.0265350341796875,
-0.0175323486328125,
-0.033416748046875,
-0.0268402099609375,
0.01532745361328125,
-0.050537109375,
0.09674072265625,
0.01459503173828125,
-0.06768798828125,
0.0183868408203125,
-0.054168701171875,
-0.031158447265625,
-0.0026226043701171875,
0.006961822509765625,
-0.045166015625,
-0.00848388671875,
0.031280517578125,
0.0267181396484375,
-0.0166168212890625,
0.00940704345703125,
-0.0010538101196289062,
-0.02716064453125,
0.02587890625,
-0.0187530517578125,
0.08380126953125,
0.01605224609375,
-0.036834716796875,
0.029022216796875,
-0.06756591796875,
0.0060882568359375,
0.0180816650390625,
-0.0173187255859375,
0.009552001953125,
-0.0222015380859375,
0.032745361328125,
0.0268402099609375,
0.028350830078125,
-0.0462646484375,
0.0179290771484375,
-0.04705810546875,
0.035736083984375,
0.0589599609375,
-0.0080108642578125,
0.00959014892578125,
-0.0413818359375,
0.038909912109375,
0.01450347900390625,
0.00414276123046875,
0.0124664306640625,
-0.045928955078125,
-0.07073974609375,
-0.039825439453125,
0.035736083984375,
0.0404052734375,
-0.030853271484375,
0.061370849609375,
-0.007701873779296875,
-0.05450439453125,
-0.052764892578125,
0.005985260009765625,
0.02142333984375,
0.028656005859375,
0.034515380859375,
-0.040618896484375,
-0.0389404296875,
-0.06341552734375,
-0.0183868408203125,
-0.01702880859375,
0.00638580322265625,
0.01824951171875,
0.063720703125,
-0.0227813720703125,
0.062042236328125,
-0.044189453125,
-0.03045654296875,
-0.023590087890625,
0.016754150390625,
0.040435791015625,
0.04486083984375,
0.047119140625,
-0.0413818359375,
-0.037994384765625,
-0.0158233642578125,
-0.030609130859375,
0.008819580078125,
0.01476287841796875,
-0.024322509765625,
0.0247802734375,
0.007282257080078125,
-0.05810546875,
0.057525634765625,
0.043731689453125,
-0.045989990234375,
0.05731201171875,
-0.023040771484375,
0.0035247802734375,
-0.086669921875,
0.0135345458984375,
-0.0098114013671875,
-0.007266998291015625,
-0.037628173828125,
-0.014739990234375,
-0.007762908935546875,
-0.002452850341796875,
-0.037994384765625,
0.049896240234375,
-0.031341552734375,
0.01233673095703125,
-0.006927490234375,
-0.0006074905395507812,
0.001773834228515625,
0.05133056640625,
0.00034928321838378906,
0.054595947265625,
0.05206298828125,
-0.033660888671875,
0.0211334228515625,
0.0300445556640625,
-0.043792724609375,
-0.00531005859375,
-0.06719970703125,
0.002765655517578125,
0.00519561767578125,
0.0198822021484375,
-0.09716796875,
-0.013275146484375,
0.0330810546875,
-0.048828125,
0.0172576904296875,
0.00873565673828125,
-0.0528564453125,
-0.0288238525390625,
-0.03070068359375,
0.036590576171875,
0.05078125,
-0.0238494873046875,
0.0406494140625,
0.01280975341796875,
-0.002559661865234375,
-0.06341552734375,
-0.041900634765625,
-0.0035152435302734375,
-0.01145172119140625,
-0.04559326171875,
0.032440185546875,
-0.006710052490234375,
0.00545501708984375,
-0.00484466552734375,
0.0004150867462158203,
-0.0001493692398071289,
0.0037403106689453125,
0.0257568359375,
0.0269622802734375,
-0.0033435821533203125,
-0.0034961700439453125,
-0.00611114501953125,
0.0004987716674804688,
-0.002590179443359375,
-0.007965087890625,
0.0592041015625,
-0.0208740234375,
-0.008453369140625,
-0.04681396484375,
0.021087646484375,
0.03314208984375,
-0.0287933349609375,
0.0804443359375,
0.07745361328125,
-0.028656005859375,
0.000667572021484375,
-0.035797119140625,
-0.006099700927734375,
-0.036773681640625,
0.0221099853515625,
-0.031341552734375,
-0.059326171875,
0.053680419921875,
0.00383758544921875,
0.0035076141357421875,
0.06048583984375,
0.054229736328125,
-0.004917144775390625,
0.07427978515625,
0.045166015625,
-0.025665283203125,
0.030609130859375,
-0.04010009765625,
0.0163116455078125,
-0.051361083984375,
-0.0316162109375,
-0.04425048828125,
-0.0242919921875,
-0.050689697265625,
-0.015869140625,
0.010009765625,
0.01145172119140625,
-0.035186767578125,
0.0372314453125,
-0.0533447265625,
0.00847625732421875,
0.052154541015625,
0.01473236083984375,
-0.003986358642578125,
0.0007395744323730469,
-0.026123046875,
0.00812530517578125,
-0.0400390625,
-0.035736083984375,
0.0821533203125,
0.02642822265625,
0.039398193359375,
0.0220489501953125,
0.05377197265625,
0.0280303955078125,
0.01251220703125,
-0.05169677734375,
0.035491943359375,
0.01242828369140625,
-0.0574951171875,
-0.0251007080078125,
-0.005924224853515625,
-0.08270263671875,
0.0227813720703125,
-0.02288818359375,
-0.06768798828125,
0.00812530517578125,
-0.01078033447265625,
-0.0228729248046875,
0.01534271240234375,
-0.04144287109375,
0.0732421875,
-0.00765228271484375,
-0.02001953125,
-0.012115478515625,
-0.067626953125,
0.01233673095703125,
0.0096282958984375,
0.017822265625,
-0.0277557373046875,
-0.000010013580322265625,
0.0797119140625,
-0.054779052734375,
0.056732177734375,
-0.0030994415283203125,
0.002780914306640625,
0.0301055908203125,
-0.000408172607421875,
0.0299224853515625,
0.00258636474609375,
-0.005443572998046875,
0.0262603759765625,
0.00794219970703125,
-0.057952880859375,
-0.018157958984375,
0.0462646484375,
-0.09014892578125,
-0.0411376953125,
-0.06439208984375,
-0.0394287109375,
0.00600433349609375,
0.02197265625,
0.033294677734375,
0.0362548828125,
-0.0164794921875,
0.0025997161865234375,
0.0255279541015625,
-0.0242767333984375,
0.0538330078125,
0.027801513671875,
-0.015838623046875,
-0.04669189453125,
0.060760498046875,
-0.0037860870361328125,
-0.0018301010131835938,
0.0161895751953125,
0.01332855224609375,
-0.0309600830078125,
-0.0372314453125,
-0.036163330078125,
0.024688720703125,
-0.039337158203125,
-0.01885986328125,
-0.051849365234375,
-0.0183868408203125,
-0.047271728515625,
-0.004638671875,
-0.0158843994140625,
-0.041412353515625,
-0.032196044921875,
0.0004487037658691406,
0.040130615234375,
0.044189453125,
0.0007266998291015625,
0.03369140625,
-0.0560302734375,
0.0105133056640625,
0.0031452178955078125,
0.007450103759765625,
-0.006847381591796875,
-0.059600830078125,
-0.025177001953125,
-0.008544921875,
-0.03424072265625,
-0.069580078125,
0.058319091796875,
0.01035308837890625,
0.0214691162109375,
0.0298614501953125,
0.0013227462768554688,
0.050384521484375,
-0.031982421875,
0.0736083984375,
0.021820068359375,
-0.077880859375,
0.051544189453125,
-0.023040771484375,
0.0231170654296875,
0.037841796875,
0.039581298828125,
-0.033660888671875,
-0.030609130859375,
-0.0693359375,
-0.07427978515625,
0.05462646484375,
0.044097900390625,
0.007572174072265625,
-0.0013217926025390625,
0.007354736328125,
0.0030670166015625,
0.0284271240234375,
-0.05560302734375,
-0.05523681640625,
-0.02655029296875,
-0.027374267578125,
-0.0121612548828125,
-0.01800537109375,
-0.0134735107421875,
-0.04583740234375,
0.0662841796875,
0.01438140869140625,
0.0273590087890625,
0.020111083984375,
-0.01027679443359375,
0.005435943603515625,
0.0149688720703125,
0.0501708984375,
0.044281005859375,
-0.033447265625,
-0.01148223876953125,
0.0082244873046875,
-0.0537109375,
0.005023956298828125,
0.0182647705078125,
-0.004238128662109375,
0.0147552490234375,
0.0292816162109375,
0.0565185546875,
0.0013322830200195312,
-0.04296875,
0.047607421875,
-0.0096435546875,
-0.0190277099609375,
-0.03875732421875,
-0.003284454345703125,
-0.0005869865417480469,
0.00516510009765625,
0.0303497314453125,
0.01517486572265625,
0.01261138916015625,
-0.03887939453125,
0.01557159423828125,
0.033660888671875,
-0.03656005859375,
-0.0164794921875,
0.042755126953125,
0.0034637451171875,
-0.0269622802734375,
0.048919677734375,
-0.002162933349609375,
-0.048828125,
0.0560302734375,
0.041656494140625,
0.06011962890625,
-0.0216827392578125,
0.022186279296875,
0.04986572265625,
0.0156097412109375,
0.003566741943359375,
0.03228759765625,
-0.0031223297119140625,
-0.05926513671875,
-0.01079559326171875,
-0.0634765625,
-0.0169219970703125,
0.00495147705078125,
-0.048248291015625,
0.0206756591796875,
-0.040435791015625,
-0.0254058837890625,
-0.001007080078125,
0.0271148681640625,
-0.0577392578125,
0.033660888671875,
-0.0053863525390625,
0.07513427734375,
-0.0609130859375,
0.077880859375,
0.05206298828125,
-0.05035400390625,
-0.0731201171875,
-0.0179443359375,
-0.01537322998046875,
-0.07843017578125,
0.05450439453125,
0.026611328125,
-0.0067901611328125,
0.00835418701171875,
-0.045928955078125,
-0.070068359375,
0.07659912109375,
0.0011491775512695312,
-0.02288818359375,
0.01309967041015625,
0.005237579345703125,
0.037506103515625,
-0.0199127197265625,
0.04754638671875,
0.04364013671875,
0.0265350341796875,
0.01396942138671875,
-0.05718994140625,
0.0170135498046875,
-0.03936767578125,
-0.0069122314453125,
0.0149688720703125,
-0.060089111328125,
0.06591796875,
-0.01473236083984375,
-0.00919342041015625,
0.01861572265625,
0.0648193359375,
0.020263671875,
0.0008754730224609375,
0.0283050537109375,
0.054779052734375,
0.053924560546875,
-0.01190185546875,
0.0679931640625,
-0.0295867919921875,
0.0374755859375,
0.05487060546875,
0.006404876708984375,
0.06939697265625,
0.034759521484375,
-0.027618408203125,
0.07147216796875,
0.06414794921875,
-0.004512786865234375,
0.048126220703125,
0.0174102783203125,
-0.0272979736328125,
-0.006542205810546875,
-0.01229095458984375,
-0.033203125,
0.0340576171875,
0.02764892578125,
-0.032623291015625,
-0.0005478858947753906,
-0.01271820068359375,
0.0121917724609375,
-0.0124359130859375,
-0.01201629638671875,
0.043609619140625,
0.002197265625,
-0.03985595703125,
0.054840087890625,
0.01378631591796875,
0.049346923828125,
-0.040496826171875,
0.00589752197265625,
-0.000046133995056152344,
0.01525115966796875,
-0.0127716064453125,
-0.05450439453125,
0.00247955322265625,
-0.0096435546875,
-0.0048980712890625,
-0.0069427490234375,
0.048614501953125,
-0.02496337890625,
-0.055511474609375,
0.0118865966796875,
0.030914306640625,
0.02001953125,
0.0008401870727539062,
-0.07562255859375,
0.007465362548828125,
0.0027523040771484375,
-0.033660888671875,
0.0164947509765625,
0.0248260498046875,
0.01265716552734375,
0.04425048828125,
0.054168701171875,
0.007129669189453125,
0.000965118408203125,
0.0012369155883789062,
0.07159423828125,
-0.06915283203125,
-0.032958984375,
-0.0699462890625,
0.041656494140625,
-0.0038509368896484375,
-0.0435791015625,
0.05633544921875,
0.045562744140625,
0.050994873046875,
-0.0098114013671875,
0.0457763671875,
-0.0128326416015625,
0.0304107666015625,
-0.024688720703125,
0.06396484375,
-0.042022705078125,
-0.007755279541015625,
-0.02386474609375,
-0.060943603515625,
-0.0208892822265625,
0.050872802734375,
-0.0200958251953125,
0.00904083251953125,
0.052978515625,
0.060333251953125,
0.019378662109375,
0.0010194778442382812,
0.005008697509765625,
0.01007080078125,
0.014129638671875,
0.048431396484375,
0.04486083984375,
-0.06292724609375,
0.037689208984375,
-0.0258941650390625,
-0.008514404296875,
-0.024505615234375,
-0.052337646484375,
-0.08221435546875,
-0.04217529296875,
-0.02703857421875,
-0.04052734375,
-0.01084136962890625,
0.0810546875,
0.037872314453125,
-0.06903076171875,
-0.0298004150390625,
-0.00313568115234375,
0.004352569580078125,
-0.01210784912109375,
-0.0147857666015625,
0.06304931640625,
-0.020660400390625,
-0.0552978515625,
0.0003249645233154297,
-0.0002505779266357422,
0.01383209228515625,
-0.004878997802734375,
-0.006938934326171875,
-0.034912109375,
0.01126861572265625,
0.033966064453125,
0.0195159912109375,
-0.0667724609375,
-0.0223541259765625,
-0.0024013519287109375,
-0.02587890625,
0.0126190185546875,
0.0249786376953125,
-0.05389404296875,
0.018218994140625,
0.029510498046875,
0.032135009765625,
0.053436279296875,
0.0006937980651855469,
0.0272979736328125,
-0.050537109375,
0.029632568359375,
0.0106353759765625,
0.031890869140625,
0.0209503173828125,
-0.018951416015625,
0.03204345703125,
0.01800537109375,
-0.047821044921875,
-0.050201416015625,
-0.002048492431640625,
-0.0928955078125,
-0.00963592529296875,
0.0697021484375,
-0.0161590576171875,
-0.03533935546875,
0.00220489501953125,
-0.034759521484375,
0.0369873046875,
-0.0435791015625,
0.05462646484375,
0.066162109375,
-0.00592041015625,
-0.0007882118225097656,
-0.0298309326171875,
0.04571533203125,
0.053253173828125,
-0.04296875,
-0.0267333984375,
0.0163116455078125,
0.0252838134765625,
0.01142120361328125,
0.052886962890625,
-0.0091094970703125,
0.0143585205078125,
-0.0016460418701171875,
0.019287109375,
0.00829315185546875,
-0.00258636474609375,
-0.0169219970703125,
0.00848388671875,
0.004940032958984375,
-0.04229736328125
]
] |
pankajmathur/orca_mini_v3_7b | 2023-08-25T23:14:36.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:psmathur/orca_mini_v1_dataset",
"dataset:ehartford/dolphin",
"arxiv:2306.02707",
"license:other",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | pankajmathur | null | null | pankajmathur/orca_mini_v3_7b | 37 | 12,527 | transformers | 2023-08-07T03:23:51 | ---
language:
- en
library_name: transformers
license: other
datasets:
- psmathur/orca_mini_v1_dataset
- ehartford/dolphin
pipeline_tag: text-generation
---
# orca_mini_v3_7b
A Llama2-7b model trained on Orca-style datasets.
<br>

<br>
🤔 How good is orca-mini-v3-7b? Do the evaluation results from HuggingFace Open LLM leaderboard translate to real-world use cases?
🔍 Now you can figure it out for yourself!
Introducing the orca-mini chatbot powered by the orca-mini-v3-7b model. Dive in and see how the open source 7b model stacks up in the world of massive language models. 🌍
⏰ Hurry up before I run out of GPU credits! 😉
Check it out here 👉
[https://huggingface.co/spaces/psmathur/psmathur-orca_mini_v3_7b](https://huggingface.co/spaces/psmathur/psmathur-orca_mini_v3_7b)
<br>
**P.S. If you're interested to collaborate, please connect with me at www.linkedin.com/in/pankajam.**
<br>
### quantized versions
Big thanks to [@TheBloke](https://huggingface.co/TheBloke)
1) https://huggingface.co/TheBloke/orca_mini_v3_7B-GGML
2) https://huggingface.co/TheBloke/orca_mini_v3_7B-GPTQ
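If you want to try the GGML files on CPU, here is a minimal sketch using `llama-cpp-python` (it assumes a 2023-era version that still reads GGML files; the filename is a hypothetical example, so substitute whichever quantised file you downloaded):
```python
# A minimal sketch, assuming a GGML-capable llama-cpp-python;
# model_path below is a hypothetical example filename.
from llama_cpp import Llama

llm = Llama(model_path="orca_mini_v3_7b.ggmlv3.q4_0.bin", n_ctx=2048)
prompt = (
    "### System:\nYou are an AI assistant that follows instruction "
    "extremely well. Help as much as you can.\n\n"
    "### User: Tell me about Orcas.\n\n### Assistant:\n"
)
output = llm(prompt, max_tokens=256)
print(output["choices"][0]["text"])
```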
<br>
#### license disclaimer:
This model is bound by the license and usage restrictions of the original Llama-2 model, and comes with no warranty or guarantees of any kind.
<br>
## evaluation
We evaluated orca_mini_v3_7b on a wide range of tasks using [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) from EleutherAI.
Here are the results on metrics used by [HuggingFaceH4 Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
|**Task**|**Metric**|**Value**|**Stderr**|
|:------:|:--------:|:-------:|:--------:|
|*arc_challenge*|acc_norm|0.5717|0.0145|
|*hellaswag*|acc_norm|0.7966|0.0043|
|*mmlu*|acc_norm|0.5234|0.035|
|*truthfulqa_mc*|mc2|0.5029|0.0156|
|**Total Average**|-|**0.59865**||
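If you want to reproduce one of these numbers, a minimal sketch with the harness's Python API follows (a v0.3-era interface is assumed; the few-shot count matches the leaderboard's 25-shot ARC setting, and exact arguments may differ across harness versions):
```python
# A sketch, not the exact leaderboard pipeline: score arc_challenge
# with the EleutherAI evaluation harness (v0.3-era API assumed).
from lm_eval import evaluator

results = evaluator.simple_evaluate(
    model="hf-causal",
    model_args="pretrained=psmathur/orca_mini_v3_7b",
    tasks=["arc_challenge"],
    num_fewshot=25,  # the Open LLM Leaderboard runs ARC 25-shot
)
print(results["results"]["arc_challenge"])
```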
<br>
## example usage
Here is the prompt format:
```
### System:
You are an AI assistant that follows instruction extremely well. Help as much as you can.
### User:
Tell me about Orcas.
### Assistant:
```
The code example below shows how to use this model:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("psmathur/orca_mini_v3_7b", use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
"psmathur/orca_mini_v3_7b",
torch_dtype=torch.float16,
load_in_8bit=True,
low_cpu_mem_usage=True,
device_map="auto"
)
system_prompt = "### System:\nYou are an AI assistant that follows instruction extremely well. Help as much as you can.\n\n"
# generate text from an instruction
instruction = "Tell me about Orcas."
prompt = f"{system_prompt}### User: {instruction}\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=4096)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
<br>
#### limitations & biases:
While this model aims for accuracy, it can occasionally produce inaccurate or misleading results.
Despite diligent efforts in refining the pretraining data, there remains a possibility of generating inappropriate, biased, or offensive content.
Exercise caution and cross-check information when necessary.
<br>
### citation:
Please kindly cite using the following BibTeX:
```
@misc{orca_mini_v3_7b,
author = {Pankaj Mathur},
title = {orca_mini_v3_7b: An explain tuned Llama2-7b model},
year = {2023},
publisher = {GitHub, HuggingFace},
journal = {GitHub repository, HuggingFace repository},
howpublished = {\url{https://huggingface.co/psmathur/orca_mini_v3_7b}},
}
```
```
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```
@software{touvron2023llama,
title={LLaMA: Open and Efficient Foundation Language Models},
author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume},
journal={arXiv preprint arXiv:2302.13971},
year={2023}
}
``` | 4,620 | [
[
-0.016937255859375,
-0.07183837890625,
0.0165557861328125,
0.01184844970703125,
-0.020751953125,
-0.004581451416015625,
-0.0132598876953125,
-0.0523681640625,
0.0159912109375,
0.0167083740234375,
-0.038177490234375,
-0.04638671875,
-0.043121337890625,
-0.00795745849609375,
-0.017608642578125,
0.0787353515625,
-0.0042724609375,
-0.004550933837890625,
0.0089569091796875,
-0.0193939208984375,
-0.040985107421875,
-0.0338134765625,
-0.0745849609375,
-0.034881591796875,
0.0244293212890625,
0.0174713134765625,
0.0445556640625,
0.06524658203125,
0.02593994140625,
0.0233306884765625,
-0.0192718505859375,
0.01171875,
-0.0288543701171875,
-0.0176849365234375,
0.0124359130859375,
-0.053253173828125,
-0.06793212890625,
-0.00469970703125,
0.02801513671875,
0.0224456787109375,
-0.017730712890625,
0.0350341796875,
0.005886077880859375,
0.016754150390625,
-0.025390625,
0.046539306640625,
-0.0301361083984375,
-0.011688232421875,
-0.03558349609375,
0.003116607666015625,
-0.00894927978515625,
-0.0242767333984375,
-0.00579833984375,
-0.0455322265625,
0.000858306884765625,
-0.0018825531005859375,
0.07794189453125,
0.031768798828125,
-0.0189361572265625,
-0.03253173828125,
-0.0219573974609375,
0.0562744140625,
-0.07281494140625,
0.01166534423828125,
0.0112152099609375,
0.021636962890625,
-0.0240020751953125,
-0.060699462890625,
-0.063232421875,
-0.0257720947265625,
-0.007396697998046875,
0.01198577880859375,
-0.01019287109375,
-0.006687164306640625,
0.02874755859375,
0.032684326171875,
-0.0430908203125,
0.0181884765625,
-0.032440185546875,
-0.018310546875,
0.0433349609375,
0.0208892822265625,
0.0263519287109375,
-0.01177215576171875,
-0.0190887451171875,
-0.031585693359375,
-0.058258056640625,
0.031646728515625,
0.037689208984375,
0.0218505859375,
-0.04254150390625,
0.055572509765625,
0.0016765594482421875,
0.05035400390625,
0.012237548828125,
-0.038360595703125,
0.035308837890625,
-0.02862548828125,
-0.023040771484375,
-0.009979248046875,
0.06463623046875,
0.00390625,
0.010284423828125,
0.0196380615234375,
-0.01204681396484375,
0.0213775634765625,
-0.0134429931640625,
-0.04815673828125,
-0.0252532958984375,
0.00785064697265625,
-0.03289794921875,
-0.026519775390625,
-0.020355224609375,
-0.048675537109375,
-0.01666259765625,
-0.00739288330078125,
0.0243988037109375,
-0.0335693359375,
-0.031494140625,
0.0185394287109375,
0.017547607421875,
0.0443115234375,
0.01044464111328125,
-0.07073974609375,
0.0247802734375,
0.0271759033203125,
0.061767578125,
0.0014209747314453125,
-0.0220794677734375,
-0.01467132568359375,
-0.0099945068359375,
-0.005825042724609375,
0.05548095703125,
-0.0214080810546875,
-0.0292816162109375,
-0.02557373046875,
-0.006793975830078125,
-0.0088653564453125,
-0.034332275390625,
0.0386962890625,
-0.0260162353515625,
0.0160980224609375,
-0.0165252685546875,
-0.0160980224609375,
-0.0263214111328125,
0.00801849365234375,
-0.046722412109375,
0.0987548828125,
0.0010881423950195312,
-0.058380126953125,
0.013671875,
-0.06207275390625,
-0.01461029052734375,
-0.019439697265625,
-0.01447296142578125,
-0.041900634765625,
-0.0123443603515625,
0.0294342041015625,
0.0218505859375,
-0.0251617431640625,
0.0107879638671875,
-0.031280517578125,
-0.017822265625,
0.0111236572265625,
-0.01432037353515625,
0.08782958984375,
0.014068603515625,
-0.046478271484375,
0.0115966796875,
-0.054718017578125,
-0.011077880859375,
0.03363037109375,
-0.030731201171875,
0.001678466796875,
-0.0124969482421875,
-0.0102691650390625,
0.0156097412109375,
0.0364990234375,
-0.037689208984375,
0.028045654296875,
-0.0335693359375,
0.04638671875,
0.058685302734375,
-0.018402099609375,
0.0234832763671875,
-0.0258636474609375,
0.0458984375,
-0.004444122314453125,
0.0172882080078125,
0.00897216796875,
-0.05145263671875,
-0.08984375,
-0.02789306640625,
0.0276947021484375,
0.037750244140625,
-0.04864501953125,
0.03839111328125,
-0.007904052734375,
-0.052276611328125,
-0.0447998046875,
0.003261566162109375,
0.03466796875,
0.04998779296875,
0.033660888671875,
-0.031280517578125,
-0.0474853515625,
-0.054931640625,
0.0047760009765625,
-0.025390625,
0.0029544830322265625,
0.02154541015625,
0.040863037109375,
-0.0153961181640625,
0.07257080078125,
-0.0298309326171875,
-0.0229949951171875,
-0.03179931640625,
0.004909515380859375,
0.0311279296875,
0.0474853515625,
0.0556640625,
-0.033966064453125,
-0.0234222412109375,
-0.0157470703125,
-0.0670166015625,
-0.014556884765625,
0.003173828125,
-0.0171356201171875,
0.023651123046875,
0.02734375,
-0.057342529296875,
0.037994384765625,
0.039886474609375,
-0.021087646484375,
0.045654296875,
-0.0035724639892578125,
-0.0211639404296875,
-0.067626953125,
0.0200347900390625,
-0.004909515380859375,
-0.016693115234375,
-0.024566650390625,
0.00795745849609375,
-0.00662994384765625,
0.00772857666015625,
-0.034881591796875,
0.057861328125,
-0.03399658203125,
-0.0038909912109375,
-0.0099639892578125,
0.0155181884765625,
-0.016937255859375,
0.057464599609375,
-0.002597808837890625,
0.045654296875,
0.05169677734375,
-0.034423828125,
0.0292205810546875,
0.0272674560546875,
-0.02862548828125,
0.0207366943359375,
-0.07220458984375,
0.0247344970703125,
0.0074310302734375,
0.03985595703125,
-0.08038330078125,
-0.0127716064453125,
0.043609619140625,
-0.043792724609375,
0.035003662109375,
0.004398345947265625,
-0.052947998046875,
-0.030120849609375,
-0.0279693603515625,
0.0401611328125,
0.04278564453125,
-0.049163818359375,
0.04388427734375,
0.028717041015625,
-0.008331298828125,
-0.045166015625,
-0.05877685546875,
-0.0195465087890625,
-0.0298919677734375,
-0.052490234375,
0.016693115234375,
-0.018402099609375,
0.0003428459167480469,
-0.01074981689453125,
-0.01436614990234375,
0.0144805908203125,
-0.00016307830810546875,
0.026947021484375,
0.0361328125,
-0.016632080078125,
-0.007686614990234375,
0.005474090576171875,
-0.01427459716796875,
0.00350189208984375,
-0.005451202392578125,
0.056488037109375,
-0.035888671875,
-0.0197906494140625,
-0.03948974609375,
-0.005176544189453125,
0.0289459228515625,
-0.0151214599609375,
0.0523681640625,
0.05511474609375,
-0.02130126953125,
0.00994873046875,
-0.045928955078125,
-0.0137176513671875,
-0.037689208984375,
0.0251617431640625,
-0.03973388671875,
-0.057159423828125,
0.057891845703125,
0.0182342529296875,
0.0167083740234375,
0.062347412109375,
0.06097412109375,
0.00954437255859375,
0.07952880859375,
0.067138671875,
-0.006221771240234375,
0.041351318359375,
-0.04559326171875,
0.00489044189453125,
-0.069580078125,
-0.04351806640625,
-0.0341796875,
-0.0229339599609375,
-0.042327880859375,
-0.0162506103515625,
0.024200439453125,
0.020355224609375,
-0.036102294921875,
0.043670654296875,
-0.040313720703125,
0.01111602783203125,
0.036041259765625,
0.019256591796875,
0.0176849365234375,
-0.005809783935546875,
-0.0193634033203125,
0.0128173828125,
-0.0618896484375,
-0.04693603515625,
0.087158203125,
0.0374755859375,
0.059783935546875,
0.01433563232421875,
0.038543701171875,
0.00605010986328125,
0.0276641845703125,
-0.048370361328125,
0.043548583984375,
0.018463134765625,
-0.04547119140625,
-0.0166015625,
-0.024261474609375,
-0.07196044921875,
0.01467132568359375,
0.0009388923645019531,
-0.06866455078125,
0.0106201171875,
0.0146942138671875,
-0.041595458984375,
0.01236724853515625,
-0.0465087890625,
0.06561279296875,
-0.0185699462890625,
0.0021514892578125,
-0.002658843994140625,
-0.058868408203125,
0.041412353515625,
-0.00038504600524902344,
0.0048370361328125,
-0.004131317138671875,
-0.0098724365234375,
0.07281494140625,
-0.04815673828125,
0.074951171875,
-0.00044798851013183594,
-0.01123046875,
0.039306640625,
-0.005924224853515625,
0.043548583984375,
0.01116943359375,
-0.012908935546875,
0.0341796875,
-0.006732940673828125,
-0.035888671875,
-0.02288818359375,
0.052154541015625,
-0.07989501953125,
-0.03466796875,
-0.034820556640625,
-0.030517578125,
0.01000213623046875,
0.01523590087890625,
0.03240966796875,
0.021087646484375,
0.0200042724609375,
-0.0080413818359375,
0.030426025390625,
-0.02325439453125,
0.041412353515625,
0.033660888671875,
-0.011444091796875,
-0.02435302734375,
0.0582275390625,
0.00965118408203125,
0.009429931640625,
0.0096435546875,
0.006275177001953125,
-0.038482666015625,
-0.032684326171875,
-0.038482666015625,
0.0499267578125,
-0.05035400390625,
-0.028045654296875,
-0.051116943359375,
-0.01288604736328125,
-0.03271484375,
0.003849029541015625,
-0.031982421875,
-0.0300445556640625,
-0.044708251953125,
-0.012237548828125,
0.03546142578125,
0.048980712890625,
-0.0123291015625,
0.0307464599609375,
-0.021881103515625,
0.01096343994140625,
0.02435302734375,
0.002346038818359375,
0.009033203125,
-0.06640625,
-0.0106353759765625,
0.00876617431640625,
-0.04132080078125,
-0.05633544921875,
0.03936767578125,
0.0036468505859375,
0.042999267578125,
0.0150909423828125,
-0.010772705078125,
0.06658935546875,
-0.008758544921875,
0.067626953125,
0.016204833984375,
-0.0809326171875,
0.032012939453125,
-0.01351165771484375,
0.0174407958984375,
0.0160369873046875,
0.01763916015625,
-0.0110931396484375,
-0.030853271484375,
-0.06378173828125,
-0.06805419921875,
0.0660400390625,
0.0262451171875,
-0.003570556640625,
0.0099639892578125,
0.031494140625,
0.0109710693359375,
0.014556884765625,
-0.07763671875,
-0.030364990234375,
-0.0241851806640625,
-0.0079345703125,
-0.0007009506225585938,
-0.017303466796875,
0.00882720947265625,
-0.02154541015625,
0.057464599609375,
-0.0008792877197265625,
0.042694091796875,
-0.0025348663330078125,
-0.013397216796875,
0.0003781318664550781,
-0.01015472412109375,
0.051116943359375,
0.040252685546875,
-0.0180511474609375,
0.00025725364685058594,
0.0325927734375,
-0.0413818359375,
-0.0069122314453125,
0.00875091552734375,
0.0021915435791015625,
-0.006488800048828125,
0.0311737060546875,
0.053955078125,
-0.014862060546875,
-0.0389404296875,
0.02294921875,
-0.0022716522216796875,
0.0068206787109375,
-0.0294647216796875,
0.007007598876953125,
0.01398468017578125,
0.031585693359375,
0.00917816162109375,
0.005512237548828125,
-0.006671905517578125,
-0.04498291015625,
-0.01558685302734375,
0.01800537109375,
-0.003292083740234375,
-0.037200927734375,
0.06817626953125,
0.002567291259765625,
-0.0140838623046875,
0.048919677734375,
-0.00897216796875,
-0.0333251953125,
0.0635986328125,
0.0321044921875,
0.04315185546875,
-0.01482391357421875,
-0.00782012939453125,
0.0313720703125,
0.0200958251953125,
-0.0098876953125,
0.029388427734375,
-0.0008549690246582031,
-0.0389404296875,
-0.0306396484375,
-0.052490234375,
-0.01436614990234375,
0.0300445556640625,
-0.0433349609375,
0.037139892578125,
-0.035491943359375,
-0.0215911865234375,
0.003635406494140625,
0.019439697265625,
-0.0528564453125,
0.01123046875,
0.01363372802734375,
0.07012939453125,
-0.052764892578125,
0.0819091796875,
0.045867919921875,
-0.0601806640625,
-0.08563232421875,
-0.025726318359375,
0.002349853515625,
-0.08013916015625,
0.04608154296875,
0.0069580078125,
-0.01190185546875,
0.0064697265625,
-0.0562744140625,
-0.0762939453125,
0.0968017578125,
0.04425048828125,
-0.024688720703125,
-0.0034961700439453125,
0.000499725341796875,
0.04754638671875,
-0.024383544921875,
0.048858642578125,
0.054656982421875,
0.0338134765625,
0.0035037994384765625,
-0.08111572265625,
0.024627685546875,
-0.02801513671875,
0.0030689239501953125,
-0.00531005859375,
-0.07757568359375,
0.08892822265625,
-0.019195556640625,
-0.010498046875,
0.035552978515625,
0.057586669921875,
0.039398193359375,
0.0020389556884765625,
0.0305633544921875,
0.0439453125,
0.045989990234375,
-0.007686614990234375,
0.0775146484375,
-0.01229095458984375,
0.049468994140625,
0.06903076171875,
0.008026123046875,
0.052947998046875,
0.0175933837890625,
-0.02294921875,
0.050323486328125,
0.0736083984375,
-0.0004935264587402344,
0.0362548828125,
0.01849365234375,
0.002262115478515625,
-0.0097503662109375,
0.0008115768432617188,
-0.051544189453125,
0.031280517578125,
0.03228759765625,
-0.0170135498046875,
-0.014434814453125,
-0.01169586181640625,
0.020050048828125,
-0.02728271484375,
-0.0069122314453125,
0.03961181640625,
0.020477294921875,
-0.0277862548828125,
0.08258056640625,
0.0159912109375,
0.06207275390625,
-0.0455322265625,
-0.0008602142333984375,
-0.03607177734375,
0.010101318359375,
-0.028350830078125,
-0.0389404296875,
0.01041412353515625,
0.002025604248046875,
0.0012111663818359375,
-0.00548553466796875,
0.035736083984375,
-0.024932861328125,
-0.0245513916015625,
0.018768310546875,
0.019439697265625,
0.035247802734375,
0.0128326416015625,
-0.07147216796875,
0.024169921875,
0.00225830078125,
-0.039581298828125,
0.0219879150390625,
0.019775390625,
-0.00408935546875,
0.054046630859375,
0.046478271484375,
0.003040313720703125,
0.01369476318359375,
-0.0171356201171875,
0.07989501953125,
-0.03271484375,
-0.030242919921875,
-0.0689697265625,
0.0380859375,
0.01053619384765625,
-0.03533935546875,
0.061279296875,
0.038604736328125,
0.0709228515625,
0.0006814002990722656,
0.047576904296875,
-0.026031494140625,
0.0236968994140625,
-0.03662109375,
0.05511474609375,
-0.0494384765625,
0.033721923828125,
-0.01161956787109375,
-0.0755615234375,
-0.01308441162109375,
0.0667724609375,
-0.028717041015625,
0.017303466796875,
0.048553466796875,
0.06915283203125,
-0.01168060302734375,
-0.01464080810546875,
0.00231170654296875,
0.038055419921875,
0.0300445556640625,
0.057342529296875,
0.04254150390625,
-0.0418701171875,
0.06207275390625,
-0.023345947265625,
-0.03033447265625,
-0.023162841796875,
-0.0582275390625,
-0.0723876953125,
-0.018310546875,
-0.0260162353515625,
-0.038543701171875,
-0.0035552978515625,
0.061767578125,
0.050323486328125,
-0.053131103515625,
-0.035369873046875,
-0.00797271728515625,
0.006504058837890625,
-0.02178955078125,
-0.01410675048828125,
0.04833984375,
-0.003208160400390625,
-0.06317138671875,
0.016357421875,
-0.0006117820739746094,
0.0213470458984375,
-0.0223846435546875,
-0.00954437255859375,
-0.01328277587890625,
0.004123687744140625,
0.0283966064453125,
0.046722412109375,
-0.057769775390625,
-0.015625,
-0.0073699951171875,
-0.021942138671875,
0.0079803466796875,
0.0262451171875,
-0.06243896484375,
0.022735595703125,
0.0214996337890625,
0.01352691650390625,
0.06097412109375,
-0.01432037353515625,
0.0193939208984375,
-0.035919189453125,
0.0272369384765625,
-0.00007551908493041992,
0.0333251953125,
0.0204620361328125,
-0.0223846435546875,
0.0472412109375,
0.018951416015625,
-0.03436279296875,
-0.06170654296875,
0.0109710693359375,
-0.09552001953125,
0.00860595703125,
0.0921630859375,
-0.0274200439453125,
-0.0281219482421875,
0.008941650390625,
-0.025970458984375,
0.042938232421875,
-0.042694091796875,
0.06695556640625,
0.0238037109375,
-0.02008056640625,
-0.0006113052368164062,
-0.034423828125,
0.0308837890625,
0.024383544921875,
-0.06158447265625,
-0.0305633544921875,
0.0112762451171875,
0.03521728515625,
0.016143798828125,
0.050323486328125,
-0.0098419189453125,
0.0140838623046875,
0.0057220458984375,
0.0164031982421875,
-0.022979736328125,
-0.0031585693359375,
-0.0220489501953125,
-0.00804901123046875,
-0.0012378692626953125,
-0.026519775390625
]
] |
facebook/wav2vec2-lv-60-espeak-cv-ft | 2023-10-31T13:13:45.000Z | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"phoneme-recognition",
"multilingual",
"dataset:common_voice",
"arxiv:2109.11680",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | facebook | null | null | facebook/wav2vec2-lv-60-espeak-cv-ft | 7 | 12,497 | transformers | 2022-03-02T23:29:05 | ---
language: multilingual
datasets:
- common_voice
tags:
- speech
- audio
- automatic-speech-recognition
- phoneme-recognition
widget:
- example_title: Librispeech sample 1
src: https://cdn-media.huggingface.co/speech_samples/sample1.flac
- example_title: Librispeech sample 2
src: https://cdn-media.huggingface.co/speech_samples/sample2.flac
license: apache-2.0
---
# Wav2Vec2-Large-LV60 finetuned on multi-lingual Common Voice
This checkpoint leverages the pretrained checkpoint [wav2vec2-large-lv60](https://huggingface.co/facebook/wav2vec2-large-lv60)
and is fine-tuned on [CommonVoice](https://huggingface.co/datasets/common_voice) to recognize phonetic labels in multiple languages.
When using the model, make sure that your speech input is sampled at 16kHz.
Note that the model outputs a string of phonetic labels. A dictionary mapping phonetic labels to words therefore has to be used to turn the phonetic output into words, as sketched below.
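For illustration only, here is a toy lookup; the lexicon entries are hypothetical, and a real system would use a full pronunciation lexicon plus a decoder:
```python
# Illustrative only: a toy phoneme-sequence -> word lookup.
# The entries below are hypothetical examples, not shipped with the model.
phoneme_lexicon = {
    ("h", "ə", "l", "oʊ"): "hello",
    ("w", "ɜː", "l", "d"): "world",
}

def phonemes_to_word(phonemes):
    return phoneme_lexicon.get(tuple(phonemes), "<unk>")

print(phonemes_to_word(["h", "ə", "l", "oʊ"]))  # -> "hello"
```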
[Paper: Simple and Effective Zero-shot Cross-lingual Phoneme Recognition](https://arxiv.org/abs/2109.11680)
Authors: Qiantong Xu, Alexei Baevski, Michael Auli
**Abstract**
Recent progress in self-training, self-supervised pretraining and unsupervised learning enabled well performing speech recognition systems without any labeled data. However, in many cases there is labeled data available for related languages which is not utilized by these methods. This paper extends previous work on zero-shot cross-lingual transfer learning by fine-tuning a multilingually pretrained wav2vec 2.0 model to transcribe unseen languages. This is done by mapping phonemes of the training languages to the target language using articulatory features. Experiments show that this simple method significantly outperforms prior work which introduced task-specific architectures and used only part of a monolingually pretrained model.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-lv-60-espeak-cv-ft")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-lv-60-espeak-cv-ft")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# preprocess the audio into model input features (16kHz expected)
input_values = processor(ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="pt").input_values
# retrieve logits
with torch.no_grad():
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
# => should give ['m ɪ s t ɚ k w ɪ l t ɚ ɹ ɪ z ð ɪ ɐ p ɑː s əl ʌ v ð ə m ɪ d əl k l æ s ᵻ z æ n d w iː ɑːɹ ɡ l æ d t ə w ɛ l k ə m h ɪ z ɡ ɑː s p əl']
``` | 3,000 | [
[
-0.0113067626953125,
-0.040679931640625,
0.0140838623046875,
0.01934814453125,
-0.01216888427734375,
0.0009326934814453125,
-0.016998291015625,
-0.05084228515625,
0.00811004638671875,
0.0270233154296875,
-0.054443359375,
-0.04937744140625,
-0.04046630859375,
-0.01275634765625,
-0.0184478759765625,
0.0640869140625,
0.005535125732421875,
0.01605224609375,
0.030670166015625,
-0.014404296875,
-0.046539306640625,
-0.036376953125,
-0.05279541015625,
-0.028778076171875,
0.03485107421875,
0.03057861328125,
0.0074005126953125,
0.02392578125,
0.0099639892578125,
0.0179901123046875,
0.0008578300476074219,
-0.0082855224609375,
-0.04248046875,
-0.002223968505859375,
0.00821685791015625,
-0.0276641845703125,
-0.0311737060546875,
0.01129150390625,
0.0426025390625,
0.042327880859375,
-0.00432586669921875,
0.0198974609375,
0.0005841255187988281,
0.019927978515625,
-0.02984619140625,
0.00928497314453125,
-0.052886962890625,
0.002716064453125,
-0.0196685791015625,
-0.0129852294921875,
-0.03662109375,
-0.007106781005859375,
0.00792694091796875,
-0.052886962890625,
0.012786865234375,
-0.0208892822265625,
0.06341552734375,
0.00601959228515625,
-0.034698486328125,
-0.035980224609375,
-0.064208984375,
0.08209228515625,
-0.044677734375,
0.058380126953125,
0.0295562744140625,
0.01453399658203125,
0.00229644775390625,
-0.067626953125,
-0.0267333984375,
-0.0004425048828125,
0.0269775390625,
0.032562255859375,
-0.02008056640625,
0.01375579833984375,
0.017181396484375,
0.01445770263671875,
-0.05633544921875,
0.0212249755859375,
-0.06634521484375,
-0.04864501953125,
0.029998779296875,
-0.01812744140625,
0.01953125,
0.0021991729736328125,
-0.0281829833984375,
-0.0283966064453125,
-0.03778076171875,
0.0285797119140625,
0.0276641845703125,
0.0267333984375,
-0.0284271240234375,
0.03997802734375,
-0.0126800537109375,
0.0565185546875,
0.0026111602783203125,
-0.0270233154296875,
0.055633544921875,
-0.0252685546875,
-0.010589599609375,
0.03985595703125,
0.0511474609375,
0.0145416259765625,
0.028533935546875,
-0.0028476715087890625,
0.00013065338134765625,
0.00911712646484375,
-0.016845703125,
-0.06671142578125,
-0.0292205810546875,
0.03033447265625,
-0.0183258056640625,
-0.003955841064453125,
-0.0062713623046875,
-0.0264892578125,
0.0134124755859375,
-0.02294921875,
0.059112548828125,
-0.037139892578125,
-0.03289794921875,
0.019134521484375,
-0.0172271728515625,
0.0235137939453125,
0.00853729248046875,
-0.06622314453125,
0.0000019073486328125,
0.025177001953125,
0.06756591796875,
0.01062774658203125,
-0.034210205078125,
-0.060211181640625,
0.001483917236328125,
-0.0226593017578125,
0.03485107421875,
-0.0183258056640625,
-0.03271484375,
-0.0091552734375,
0.0099639892578125,
-0.01324462890625,
-0.04052734375,
0.031585693359375,
-0.0211944580078125,
0.0171051025390625,
-0.013916015625,
-0.03631591796875,
-0.01678466796875,
-0.0166778564453125,
-0.036712646484375,
0.0892333984375,
0.0007691383361816406,
-0.038726806640625,
0.0206451416015625,
-0.0276947021484375,
-0.05133056640625,
-0.01059722900390625,
-0.0006513595581054688,
-0.02252197265625,
-0.00848388671875,
0.00934600830078125,
0.034515380859375,
-0.006137847900390625,
-0.005596160888671875,
-0.0158233642578125,
-0.031890869140625,
0.01372528076171875,
-0.040008544921875,
0.0882568359375,
0.03533935546875,
-0.0390625,
0.0031280517578125,
-0.07012939453125,
0.027313232421875,
-0.0126800537109375,
-0.040191650390625,
0.00411224365234375,
-0.04248046875,
0.021942138671875,
0.024261474609375,
0.01338958740234375,
-0.056976318359375,
-0.00013363361358642578,
-0.03436279296875,
0.044189453125,
0.029998779296875,
-0.0125579833984375,
0.031890869140625,
-0.0157623291015625,
0.01934814453125,
-0.00849151611328125,
-0.0116119384765625,
0.0035724639892578125,
-0.03302001953125,
-0.04644775390625,
-0.041046142578125,
0.0316162109375,
0.056488037109375,
-0.0406494140625,
0.039794921875,
-0.0034618377685546875,
-0.05230712890625,
-0.06549072265625,
0.0040130615234375,
0.030487060546875,
0.03900146484375,
0.04547119140625,
-0.035125732421875,
-0.050689697265625,
-0.04522705078125,
-0.02008056640625,
-0.007228851318359375,
-0.0196380615234375,
0.023834228515625,
0.007350921630859375,
-0.018035888671875,
0.036102294921875,
-0.006439208984375,
-0.03863525390625,
-0.016510009765625,
0.032379150390625,
0.029693603515625,
0.0482177734375,
0.02911376953125,
-0.032958984375,
-0.029937744140625,
-0.002349853515625,
-0.044830322265625,
-0.0164794921875,
-0.0138397216796875,
-0.00675201416015625,
0.01885986328125,
0.0401611328125,
-0.03375244140625,
0.013153076171875,
0.04351806640625,
-0.01383209228515625,
0.025360107421875,
-0.010162353515625,
0.0149688720703125,
-0.069091796875,
0.00872039794921875,
-0.011016845703125,
-0.01617431640625,
-0.0589599609375,
-0.037506103515625,
0.00331878662109375,
-0.007244110107421875,
-0.06292724609375,
0.04071044921875,
-0.057769775390625,
-0.01477813720703125,
-0.01050567626953125,
0.0002753734588623047,
-0.00791168212890625,
0.05072021484375,
0.0250244140625,
0.03631591796875,
0.06170654296875,
-0.042327880859375,
0.04266357421875,
0.006603240966796875,
-0.033477783203125,
0.032684326171875,
-0.0565185546875,
0.0295257568359375,
0.0153656005859375,
0.021575927734375,
-0.0814208984375,
-0.0026988983154296875,
0.0165863037109375,
-0.0634765625,
0.031341552734375,
-0.027862548828125,
-0.01068115234375,
-0.015777587890625,
-0.004650115966796875,
0.053558349609375,
0.05145263671875,
-0.04595947265625,
0.0323486328125,
0.026092529296875,
0.0031795501708984375,
-0.046295166015625,
-0.07861328125,
-0.011199951171875,
-0.0053253173828125,
-0.05047607421875,
0.0146331787109375,
0.0001329183578491211,
0.0150146484375,
-0.02392578125,
-0.0178070068359375,
-0.01088714599609375,
-0.0153656005859375,
0.02276611328125,
0.01898193359375,
-0.0169219970703125,
0.002475738525390625,
-0.00585174560546875,
-0.015777587890625,
-0.005161285400390625,
-0.04486083984375,
0.0491943359375,
0.00521087646484375,
-0.01702880859375,
-0.0484619140625,
0.01361083984375,
0.046630859375,
-0.0269317626953125,
0.0177764892578125,
0.09112548828125,
-0.033447265625,
-0.0022296905517578125,
-0.04296875,
-0.010650634765625,
-0.035430908203125,
0.05364990234375,
-0.0396728515625,
-0.05279541015625,
0.03863525390625,
-0.002811431884765625,
0.005889892578125,
0.0386962890625,
0.051116943359375,
-0.0078887939453125,
0.059295654296875,
0.0201873779296875,
-0.0101318359375,
0.03662109375,
-0.04754638671875,
0.005611419677734375,
-0.056610107421875,
-0.03277587890625,
-0.032623291015625,
-0.0191802978515625,
-0.035980224609375,
-0.041656494140625,
0.0237274169921875,
0.003330230712890625,
-0.003368377685546875,
0.03289794921875,
-0.047149658203125,
0.02581787109375,
0.055999755859375,
0.00836944580078125,
0.00012922286987304688,
0.016387939453125,
-0.0081939697265625,
-0.0098876953125,
-0.0460205078125,
-0.00423431396484375,
0.0860595703125,
0.032989501953125,
0.058441162109375,
-0.00942230224609375,
0.061614990234375,
-0.00189208984375,
-0.030364990234375,
-0.07086181640625,
0.0261993408203125,
-0.01404571533203125,
-0.0574951171875,
-0.034759521484375,
-0.0238037109375,
-0.07196044921875,
0.01399993896484375,
-0.020477294921875,
-0.05999755859375,
0.02423095703125,
-0.0025653839111328125,
-0.0293121337890625,
0.0199127197265625,
-0.051025390625,
0.06671142578125,
-0.0249176025390625,
-0.0159759521484375,
-0.0193634033203125,
-0.056304931640625,
0.016693115234375,
-0.005908966064453125,
0.0078125,
-0.0168914794921875,
0.0343017578125,
0.0787353515625,
-0.01922607421875,
0.05767822265625,
-0.0236663818359375,
-0.013153076171875,
0.0362548828125,
-0.0032596588134765625,
0.0323486328125,
-0.01103973388671875,
-0.006061553955078125,
0.040985107421875,
0.021759033203125,
-0.03057861328125,
-0.0137176513671875,
0.0640869140625,
-0.076904296875,
-0.0022106170654296875,
-0.0179901123046875,
-0.0338134765625,
-0.00731658935546875,
0.0246734619140625,
0.0667724609375,
0.05731201171875,
-0.0142059326171875,
0.0214691162109375,
0.052642822265625,
-0.0173187255859375,
0.028900146484375,
0.0306854248046875,
0.0018291473388671875,
-0.0312347412109375,
0.082763671875,
0.0318603515625,
0.02783203125,
0.01476287841796875,
-0.0017547607421875,
-0.0377197265625,
-0.0343017578125,
-0.0305938720703125,
0.01554107666015625,
-0.04937744140625,
-0.0056304931640625,
-0.053955078125,
-0.02508544921875,
-0.056793212890625,
-0.002544403076171875,
-0.054718017578125,
-0.042633056640625,
-0.03106689453125,
0.002399444580078125,
0.0227813720703125,
0.051666259765625,
-0.045867919921875,
0.0158538818359375,
-0.056610107421875,
0.048065185546875,
0.029022216796875,
0.004100799560546875,
-0.00909423828125,
-0.06939697265625,
-0.0236358642578125,
0.024993896484375,
0.0037631988525390625,
-0.06427001953125,
0.029449462890625,
0.012115478515625,
0.0328369140625,
0.0138397216796875,
0.0032958984375,
0.053466796875,
-0.0291595458984375,
0.043548583984375,
0.033966064453125,
-0.0855712890625,
0.03985595703125,
-0.0145721435546875,
0.029083251953125,
0.03497314453125,
0.0171966552734375,
-0.057281494140625,
-0.0153045654296875,
-0.0338134765625,
-0.07757568359375,
0.06695556640625,
0.0180511474609375,
-0.0023555755615234375,
0.0240936279296875,
0.01139068603515625,
0.00653839111328125,
-0.00669097900390625,
-0.055999755859375,
-0.03131103515625,
-0.0302734375,
-0.0159149169921875,
-0.0115814208984375,
-0.016510009765625,
-0.002437591552734375,
-0.037353515625,
0.0650634765625,
0.0168609619140625,
0.025604248046875,
0.030029296875,
-0.0155792236328125,
-0.003139495849609375,
0.0147552490234375,
0.042266845703125,
0.0145721435546875,
-0.0247344970703125,
0.01209259033203125,
0.0269775390625,
-0.0479736328125,
0.0232696533203125,
0.015411376953125,
0.0118865966796875,
0.0189666748046875,
0.038360595703125,
0.0732421875,
0.01158905029296875,
-0.02008056640625,
0.0306549072265625,
0.0019588470458984375,
-0.040557861328125,
-0.04022216796875,
-0.0037364959716796875,
0.0159759521484375,
0.0234375,
0.0241546630859375,
0.002613067626953125,
0.004711151123046875,
-0.044891357421875,
0.0182342529296875,
0.0243988037109375,
-0.03826904296875,
-0.0284881591796875,
0.06396484375,
0.0193634033203125,
-0.01763916015625,
0.05462646484375,
0.0026454925537109375,
-0.0255126953125,
0.048858642578125,
0.036224365234375,
0.06719970703125,
-0.0291595458984375,
-0.005706787109375,
0.049713134765625,
0.021331787109375,
-0.014129638671875,
0.029693603515625,
-0.0142364501953125,
-0.050872802734375,
-0.026275634765625,
-0.03656005859375,
-0.01139068603515625,
0.022247314453125,
-0.060638427734375,
0.0386962890625,
-0.026092529296875,
-0.005329132080078125,
0.0162811279296875,
0.0020656585693359375,
-0.0697021484375,
0.02471923828125,
0.0248870849609375,
0.04766845703125,
-0.06903076171875,
0.10040283203125,
0.0216827392578125,
-0.0176239013671875,
-0.0902099609375,
-0.020751953125,
0.0055084228515625,
-0.0772705078125,
0.0443115234375,
0.0187225341796875,
-0.03436279296875,
0.0165252685546875,
-0.03173828125,
-0.07110595703125,
0.08380126953125,
0.0280609130859375,
-0.06134033203125,
0.01308441162109375,
0.00968170166015625,
0.028717041015625,
-0.01195526123046875,
0.021728515625,
0.0479736328125,
0.03961181640625,
0.032196044921875,
-0.09307861328125,
-0.01136016845703125,
-0.00843048095703125,
-0.032257080078125,
-0.01444244384765625,
-0.040557861328125,
0.0650634765625,
-0.0270233154296875,
-0.02197265625,
0.00482940673828125,
0.0489501953125,
0.022216796875,
0.025726318359375,
0.0360107421875,
0.039031982421875,
0.070556640625,
-0.00531005859375,
0.041839599609375,
-0.0173797607421875,
0.0239105224609375,
0.08160400390625,
-0.00833892822265625,
0.05853271484375,
0.030242919921875,
-0.003040313720703125,
0.0148773193359375,
0.0506591796875,
-0.0042572021484375,
0.05279541015625,
0.01227569580078125,
-0.0176849365234375,
-0.0223388671875,
-0.00296783447265625,
-0.05120849609375,
0.064697265625,
0.01157379150390625,
-0.00771331787109375,
0.00379180908203125,
0.008880615234375,
-0.0089263916015625,
-0.030426025390625,
-0.013153076171875,
0.051513671875,
0.01096343994140625,
-0.02606201171875,
0.07550048828125,
-0.0021076202392578125,
0.07330322265625,
-0.057281494140625,
-0.0050201416015625,
0.0033016204833984375,
0.01494598388671875,
-0.0238189697265625,
-0.04693603515625,
0.00861358642578125,
-0.01306915283203125,
-0.00792694091796875,
-0.0145111083984375,
0.045562744140625,
-0.07049560546875,
-0.04296875,
0.050537109375,
0.0262451171875,
0.041656494140625,
-0.0036945343017578125,
-0.041778564453125,
0.01384735107421875,
0.027557373046875,
-0.0159454345703125,
0.0106201171875,
0.029937744140625,
0.02197265625,
0.02215576171875,
0.050537109375,
0.0248870849609375,
0.0209808349609375,
0.0147247314453125,
0.03155517578125,
-0.04852294921875,
-0.040008544921875,
-0.034149169921875,
0.030487060546875,
0.024200439453125,
-0.028839111328125,
0.04327392578125,
0.05169677734375,
0.0909423828125,
-0.007686614990234375,
0.06365966796875,
0.00905609130859375,
0.05023193359375,
-0.056793212890625,
0.055633544921875,
-0.049102783203125,
0.00980377197265625,
-0.016693115234375,
-0.0628662109375,
-0.0162200927734375,
0.058807373046875,
-0.001598358154296875,
0.01352691650390625,
0.033111572265625,
0.07183837890625,
-0.012664794921875,
-0.0238037109375,
0.038848876953125,
0.015289306640625,
0.01058197021484375,
0.064208984375,
0.045684814453125,
-0.055999755859375,
0.07073974609375,
-0.02874755859375,
-0.009002685546875,
-0.00537872314453125,
-0.0278167724609375,
-0.0640869140625,
-0.05120849609375,
-0.0278472900390625,
-0.034393310546875,
0.0026149749755859375,
0.05816650390625,
0.06658935546875,
-0.0635986328125,
-0.034149169921875,
0.0131683349609375,
-0.007808685302734375,
-0.0203857421875,
-0.016845703125,
0.0296173095703125,
-0.0032749176025390625,
-0.068115234375,
0.055816650390625,
0.004878997802734375,
0.0223388671875,
0.002872467041015625,
-0.0196990966796875,
-0.0160064697265625,
-0.00604248046875,
0.025482177734375,
0.04534912109375,
-0.05023193359375,
-0.02130126953125,
0.0029277801513671875,
-0.01165771484375,
0.00957489013671875,
0.05035400390625,
-0.04315185546875,
0.04620361328125,
0.038848876953125,
0.01490020751953125,
0.0694580078125,
-0.03558349609375,
0.0292205810546875,
-0.05853271484375,
0.038665771484375,
0.00641632080078125,
0.0260009765625,
0.0229339599609375,
-0.023895263671875,
0.02056884765625,
0.0234222412109375,
-0.0350341796875,
-0.05865478515625,
0.007045745849609375,
-0.0977783203125,
-0.0259246826171875,
0.10137939453125,
0.00022876262664794922,
-0.0018682479858398438,
-0.0168609619140625,
-0.0251922607421875,
0.055877685546875,
-0.0286407470703125,
0.029937744140625,
0.037872314453125,
-0.0070648193359375,
-0.00212860107421875,
-0.037353515625,
0.035186767578125,
0.04254150390625,
-0.0236968994140625,
0.005390167236328125,
0.036102294921875,
0.052978515625,
-0.0014314651489257812,
0.07440185546875,
-0.005870819091796875,
0.031158447265625,
0.0152435302734375,
0.042816162109375,
-0.01050567626953125,
-0.0350341796875,
-0.0484619140625,
-0.005340576171875,
-0.00511932373046875,
-0.06048583984375
]
] |
TheBloke/zephyr-7B-alpha-GPTQ | 2023-10-14T07:12:11.000Z | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"generated_from_trainer",
"en",
"dataset:stingning/ultrachat",
"dataset:openbmb/UltraFeedback",
"arxiv:2305.18290",
"license:mit",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/zephyr-7B-alpha-GPTQ | 23 | 12,490 | transformers | 2023-10-11T03:26:23 | ---
base_model: HuggingFaceH4/zephyr-7b-alpha
datasets:
- stingning/ultrachat
- openbmb/UltraFeedback
inference: false
language:
- en
license: mit
model-index:
- name: zephyr-7b-alpha
results: []
model_creator: Hugging Face H4
model_name: Zephyr 7B Alpha
model_type: mistral
prompt_template: '<|system|>
</s>
<|user|>
{prompt}</s>
<|assistant|>
'
quantized_by: TheBloke
tags:
- generated_from_trainer
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Zephyr 7B Alpha - GPTQ
- Model creator: [Hugging Face H4](https://huggingface.co/HuggingFaceH4)
- Original model: [Zephyr 7B Alpha](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha)
<!-- description start -->
## Description
This repo contains GPTQ model files for [Hugging Face H4's Zephyr 7B Alpha](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/zephyr-7B-alpha-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/zephyr-7B-alpha-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/zephyr-7B-alpha-GGUF)
* [Hugging Face H4's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Zephyr
```
<|system|>
</s>
<|user|>
{prompt}</s>
<|assistant|>
```
<!-- prompt-template end -->
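For reference, a minimal Python helper that fills this template (a sketch; the system message is left empty here to match the template above):
```python
# A minimal sketch of building a Zephyr-format prompt string.
def build_prompt(user_message: str, system_message: str = "") -> str:
    return (
        f"<|system|>\n{system_message}</s>\n"
        f"<|user|>\n{user_message}</s>\n"
        f"<|assistant|>\n"
    )

print(build_prompt("What is GPTQ quantisation?"))
```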
<!-- README_GPTQ.md-provided-files start -->
## Provided files, and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
Most GPTQ files are made with AutoGPTQ. Mistral models are currently made with Transformers.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The calibration dataset used during quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ calibration dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/zephyr-7B-alpha-GPTQ/tree/main) | 4 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4095 | 4.16 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/zephyr-7B-alpha-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4095 | 4.57 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/zephyr-7B-alpha-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4095 | 7.52 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/zephyr-7B-alpha-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4095 | 7.68 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. |
| [gptq-8bit-32g-actorder_True](https://huggingface.co/TheBloke/zephyr-7B-alpha-GPTQ/tree/gptq-8bit-32g-actorder_True) | 8 | 32 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4095 | 8.17 GB | No | 8-bit, with group size 32g and Act Order for maximum inference quality. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/zephyr-7B-alpha-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4095 | 4.29 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download, including from branches
### In text-generation-webui
To download from the `main` branch, enter `TheBloke/zephyr-7B-alpha-GPTQ` in the "Download model" box.
To download from another branch, add `:branchname` to the end of the download name, e.g. `TheBloke/zephyr-7B-alpha-GPTQ:gptq-4bit-32g-actorder_True`
### From the command line
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
To download the `main` branch to a folder called `zephyr-7B-alpha-GPTQ`:
```shell
mkdir zephyr-7B-alpha-GPTQ
huggingface-cli download TheBloke/zephyr-7B-alpha-GPTQ --local-dir zephyr-7B-alpha-GPTQ --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir zephyr-7B-alpha-GPTQ
huggingface-cli download TheBloke/zephyr-7B-alpha-GPTQ --revision gptq-4bit-32g-actorder_True --local-dir zephyr-7B-alpha-GPTQ --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage</summary>
If you remove the `--local-dir-use-symlinks False` parameter, the files will instead be stored in the central Hugging Face cache directory (the default location on Linux is `~/.cache/huggingface`), and symlinks will be added to the specified `--local-dir`, pointing to their real location in the cache. This allows interrupted downloads to be resumed, and lets you quickly clone the repo to multiple places on disk without triggering a download again. The downside, and the reason why I don't list that as the default option, is that the files are then hidden away in a cache folder, making it harder to see where your disk space is being used and to clear it up if/when you want to remove a downloaded model.
The cache location can be changed with the `HF_HOME` environment variable, and/or the `--cache-dir` parameter to `huggingface-cli`.
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
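If you prefer to stay in Python rather than shell out to the CLI, the same download can be done with `snapshot_download` from `huggingface_hub`; a minimal sketch (the branch shown is just one example from the table above):

```python
from huggingface_hub import snapshot_download

# Download one quant branch into a local folder, without symlinking into the HF cache
snapshot_download(
    repo_id="TheBloke/zephyr-7B-alpha-GPTQ",
    revision="gptq-4bit-32g-actorder_True",
    local_dir="zephyr-7B-alpha-GPTQ",
    local_dir_use_symlinks=False,
)
```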
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install hf_transfer
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
mkdir zephyr-7B-alpha-GPTQ
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download TheBloke/zephyr-7B-alpha-GPTQ --local-dir zephyr-7B-alpha-GPTQ --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
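For example, the Windows equivalent of the Linux command above would be:

```shell
mkdir zephyr-7B-alpha-GPTQ
set HF_HUB_ENABLE_HF_TRANSFER=1
huggingface-cli download TheBloke/zephyr-7B-alpha-GPTQ --local-dir zephyr-7B-alpha-GPTQ --local-dir-use-symlinks False
```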
</details>
### With `git` (**not** recommended)
To clone a specific branch with `git`, use a command like this:
```shell
git clone --single-branch --branch gptq-4bit-32g-actorder_True https://huggingface.co/TheBloke/zephyr-7B-alpha-GPTQ
```
Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space as it has to store the model files twice (it stores every byte both in the intended target folder, and again in the `.git` folder as a blob.)
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/zephyr-7B-alpha-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/zephyr-7B-alpha-GPTQ:gptq-4bit-32g-actorder_True`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `zephyr-7B-alpha-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-tgi start -->
## Serving this model from Text Generation Inference (TGI)
It's recommended to use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/zephyr-7B-alpha-GPTQ --port 3000 --quantize gptq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
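As a sketch only (the GPU flag, host port mapping, shared-memory size and volume path below are assumptions, not part of this repo), a full `docker run` invocation using those parameters might look like:

```shell
docker run --gpus all --shm-size 1g -p 3000:3000 -v /path/to/data:/data \
  ghcr.io/huggingface/text-generation-inference:1.1.0 \
  --model-id TheBloke/zephyr-7B-alpha-GPTQ --port 3000 --quantize gptq \
  --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```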
Example Python code for interfacing with TGI (requires huggingface-hub 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
prompt_template=f'''<|system|>
</s>
<|user|>
{prompt}</s>
<|assistant|>
'''
client = InferenceClient(endpoint_url)
# Send the fully formatted prompt (not just the raw user text) to the endpoint
response = client.text_generation(prompt_template,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: {response}")
```
<!-- README_GPTQ.md-use-from-tgi end -->
<!-- README_GPTQ.md-use-from-python start -->
## How to use this GPTQ model from Python code
### Install the necessary packages
Requires: Transformers 4.33.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install transformers optimum
pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7
```
If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
git checkout v0.4.2
pip3 install .
```
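To confirm the install or source build succeeded before moving on, you can check the installed package:

```shell
pip3 show auto-gptq
```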
### You can then use the following code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/zephyr-7B-alpha-GPTQ"
# To use a different branch, change revision
# For example: revision="gptq-4bit-32g-actorder_True"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Tell me about AI"
prompt_template=f'''<|system|>
</s>
<|user|>
{prompt}</s>
<|assistant|>
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
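If you would rather see tokens as they are generated instead of waiting for the full output, Transformers' `TextStreamer` can be attached to the same `generate()` call; a minimal sketch reusing the objects defined above:

```python
from transformers import TextStreamer

# Stream decoded tokens to stdout as they are produced, without echoing the prompt
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
_ = model.generate(inputs=input_ids, streamer=streamer, max_new_tokens=512,
                   do_sample=True, temperature=0.7, top_p=0.95, top_k=40)
```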
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI).
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama and Mistral models in 4-bit. Please see the Provided Files table above for per-file compatibility.
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Pierre Kircher, Stanislav Ovsiannikov, Michael Levine, Eugene Pentland, Andrey, 준교 김, Randy H, Fred von Graf, Artur Olbinski, Caitlyn Gatomon, terasurfer, Jeff Scroggin, James Bentley, Vadim, Gabriel Puliatti, Harry Royden McLaughlin, Sean Connelly, Dan Guido, Edmond Seymore, Alicia Loh, subjectnull, AzureBlack, Manuel Alberto Morcote, Thomas Belote, Lone Striker, Chris Smitley, Vitor Caleffi, Johann-Peter Hartmann, Clay Pascal, biorpg, Brandon Frisco, sidney chen, transmissions 11, Pedro Madruga, jinyuan sun, Ajan Kanaga, Emad Mostaque, Trenton Dambrowitz, Jonathan Leane, Iucharbius, usrbinkat, vamX, George Stoitzev, Luke Pendergrass, theTransient, Olakabola, Swaroop Kallakuri, Cap'n Zoog, Brandon Phillips, Michael Dempsey, Nikolai Manek, danny, Matthew Berman, Gabriel Tamborski, alfie_i, Raymond Fosdick, Tom X Nguyen, Raven Klaugh, LangChain4j, Magnesian, Illia Dulskyi, David Ziegler, Mano Prime, Luis Javier Navarrete Lozano, Erik Bjäreholt, 阿明, Nathan Dryer, Alex, Rainer Wilmers, zynix, TL, Joseph William Delisle, John Villwock, Nathan LeClaire, Willem Michiel, Joguhyik, GodLy, OG, Alps Aficionado, Jeffrey Morgan, ReadyPlayerEmma, Tiffany J. Kim, Sebastain Graf, Spencer Kim, Michael Davis, webtim, Talal Aujan, knownsqashed, John Detwiler, Imad Khwaja, Deo Leter, Jerry Meng, Elijah Stavena, Rooh Singh, Pieter, SuperWojo, Alexandros Triantafyllidis, Stephen Murray, Ai Maven, ya boyyy, Enrico Ros, Ken Nordquist, Deep Realms, Nicholas, Spiking Neurons AB, Elle, Will Dee, Jack West, RoA, Luke @flexchar, Viktor Bowallius, Derek Yates, Subspace Studios, jjj, Toran Billups, Asp the Wyvern, Fen Risland, Ilya, NimbleBox.ai, Chadd, Nitin Borwankar, Emre, Mandus, Leonard Tan, Kalila, K, Trailburnt, S_X, Cory Kujawski
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Hugging Face H4's Zephyr 7B Alpha
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
<img src="https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha/resolve/main/thumbnail.png" alt="Zephyr Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# Model Card for Zephyr 7B Alpha
Zephyr is a series of language models that are trained to act as helpful assistants. Zephyr-7B-α is the first model in the series, and is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) that was trained on a mix of publicly available, synthetic datasets using [Direct Preference Optimization (DPO)](https://arxiv.org/abs/2305.18290). We found that removing the in-built alignment of these datasets boosted performance on [MT Bench](https://huggingface.co/spaces/lmsys/mt-bench) and made the model more helpful. However, this means that the model is likely to generate problematic text when prompted to do so and should only be used for educational and research purposes.
## Model description
- **Model type:** A 7B parameter GPT-like model fine-tuned on a mix of publicly available, synthetic datasets.
- **Language(s) (NLP):** Primarily English
- **License:** MIT
- **Finetuned from model:** [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/huggingface/alignment-handbook
- **Demo:** https://huggingface.co/spaces/HuggingFaceH4/zephyr-chat
## Intended uses & limitations
The model was initially fine-tuned on a variant of the [`UltraChat`](https://huggingface.co/datasets/stingning/ultrachat) dataset, which contains a diverse range of synthetic dialogues generated by ChatGPT. We then further aligned the model with [🤗 TRL's](https://github.com/huggingface/trl) `DPOTrainer` on the [openbmb/UltraFeedback](https://huggingface.co/datasets/openbmb/UltraFeedback) dataset, which contains 64k prompts and model completions that are ranked by GPT-4. As a result, the model can be used for chat, and you can check out our [demo](https://huggingface.co/spaces/HuggingFaceH4/zephyr-chat) to test its capabilities.
Here's how you can run the model using the `pipeline()` function from 🤗 Transformers:
```python
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="HuggingFaceH4/zephyr-7b-alpha", torch_dtype=torch.bfloat16, device_map="auto")
# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
{
"role": "system",
"content": "You are a friendly chatbot who always responds in the style of a pirate",
},
{"role": "user", "content": "How many helicopters can a human eat in one sitting?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
# <|system|>
# You are a friendly chatbot who always responds in the style of a pirate.</s>
# <|user|>
# How many helicopters can a human eat in one sitting?</s>
# <|assistant|>
# Ah, me hearty matey! But yer question be a puzzler! A human cannot eat a helicopter in one sitting, as helicopters are not edible. They be made of metal, plastic, and other materials, not food!
```
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Zephyr-7B-α has not been aligned to human preferences with techniques like RLHF or deployed with in-the-loop filtering of responses like ChatGPT, so the model can produce problematic outputs (especially when prompted to do so).
The size and composition of the corpus used to train the base model (`mistralai/Mistral-7B-v0.1`) are also unknown; however, it is likely to have included a mix of web data and technical sources like books and code. See the [Falcon 180B model card](https://huggingface.co/tiiuae/falcon-180B#training-data) for an example of this.
## Training and evaluation data
Zephyr 7B Alpha achieves the following results on the evaluation set:
- Loss: 0.4605
- Rewards/chosen: -0.5053
- Rewards/rejected: -1.8752
- Rewards/accuracies: 0.7812
- Rewards/margins: 1.3699
- Logps/rejected: -327.4286
- Logps/chosen: -297.1040
- Logits/rejected: -2.7153
- Logits/chosen: -2.7447
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-07
- train_batch_size: 2
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- num_devices: 16
- total_train_batch_size: 32
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
| 0.5602 | 0.05 | 100 | 0.5589 | -0.3359 | -0.8168 | 0.7188 | 0.4809 | -306.2607 | -293.7161 | -2.6554 | -2.6797 |
| 0.4852 | 0.1 | 200 | 0.5136 | -0.5310 | -1.4994 | 0.8125 | 0.9684 | -319.9124 | -297.6181 | -2.5762 | -2.5957 |
| 0.5212 | 0.15 | 300 | 0.5168 | -0.1686 | -1.1760 | 0.7812 | 1.0074 | -313.4444 | -290.3699 | -2.6865 | -2.7125 |
| 0.5496 | 0.21 | 400 | 0.4835 | -0.1617 | -1.7170 | 0.8281 | 1.5552 | -324.2635 | -290.2326 | -2.7947 | -2.8218 |
| 0.5209 | 0.26 | 500 | 0.5054 | -0.4778 | -1.6604 | 0.7344 | 1.1826 | -323.1325 | -296.5546 | -2.8388 | -2.8667 |
| 0.4617 | 0.31 | 600 | 0.4910 | -0.3738 | -1.5180 | 0.7656 | 1.1442 | -320.2848 | -294.4741 | -2.8234 | -2.8521 |
| 0.4452 | 0.36 | 700 | 0.4838 | -0.4591 | -1.6576 | 0.7031 | 1.1986 | -323.0770 | -296.1796 | -2.7401 | -2.7653 |
| 0.4674 | 0.41 | 800 | 0.5077 | -0.5692 | -1.8659 | 0.7656 | 1.2967 | -327.2416 | -298.3818 | -2.6740 | -2.6945 |
| 0.4656 | 0.46 | 900 | 0.4927 | -0.5279 | -1.6614 | 0.7656 | 1.1335 | -323.1518 | -297.5553 | -2.7817 | -2.8015 |
| 0.4102 | 0.52 | 1000 | 0.4772 | -0.5767 | -2.0667 | 0.7656 | 1.4900 | -331.2578 | -298.5311 | -2.7160 | -2.7455 |
| 0.4663 | 0.57 | 1100 | 0.4740 | -0.8038 | -2.1018 | 0.7656 | 1.2980 | -331.9604 | -303.0741 | -2.6994 | -2.7257 |
| 0.4737 | 0.62 | 1200 | 0.4716 | -0.3783 | -1.7015 | 0.7969 | 1.3232 | -323.9545 | -294.5634 | -2.6842 | -2.7135 |
| 0.4259 | 0.67 | 1300 | 0.4866 | -0.6239 | -1.9703 | 0.7812 | 1.3464 | -329.3312 | -299.4761 | -2.7046 | -2.7356 |
| 0.4935 | 0.72 | 1400 | 0.4747 | -0.5626 | -1.7600 | 0.7812 | 1.1974 | -325.1243 | -298.2491 | -2.7153 | -2.7444 |
| 0.4211 | 0.77 | 1500 | 0.4645 | -0.6099 | -1.9993 | 0.7656 | 1.3894 | -329.9109 | -299.1959 | -2.6944 | -2.7236 |
| 0.4931 | 0.83 | 1600 | 0.4684 | -0.6798 | -2.1082 | 0.7656 | 1.4285 | -332.0890 | -300.5934 | -2.7006 | -2.7305 |
| 0.5029 | 0.88 | 1700 | 0.4595 | -0.5063 | -1.8951 | 0.7812 | 1.3889 | -327.8267 | -297.1233 | -2.7108 | -2.7403 |
| 0.4965 | 0.93 | 1800 | 0.4613 | -0.5561 | -1.9079 | 0.7812 | 1.3518 | -328.0831 | -298.1203 | -2.7226 | -2.7523 |
| 0.4337 | 0.98 | 1900 | 0.4608 | -0.5066 | -1.8718 | 0.7656 | 1.3652 | -327.3599 | -297.1296 | -2.7175 | -2.7469 |
### Framework versions
- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.14.0
| 27,747 | [
[
-0.033935546875,
-0.052825927734375,
0.00986480712890625,
0.01708984375,
-0.01122283935546875,
-0.0157012939453125,
0.005336761474609375,
-0.04296875,
0.0210113525390625,
0.02520751953125,
-0.04412841796875,
-0.03350830078125,
-0.02191162109375,
-0.00553131103515625,
-0.014984130859375,
0.07489013671875,
0.00576019287109375,
-0.01715087890625,
0.001033782958984375,
-0.022735595703125,
-0.0204315185546875,
-0.03021240234375,
-0.0526123046875,
-0.01873779296875,
0.03338623046875,
0.0113983154296875,
0.0701904296875,
0.03656005859375,
0.014129638671875,
0.0265960693359375,
-0.0121612548828125,
-0.016265869140625,
-0.043609619140625,
-0.00738525390625,
0.0103607177734375,
-0.0231781005859375,
-0.04547119140625,
-0.00026035308837890625,
0.03900146484375,
0.01568603515625,
-0.018463134765625,
0.01019287109375,
0.0010709762573242188,
0.058868408203125,
-0.036712646484375,
0.0144195556640625,
-0.0205230712890625,
0.004695892333984375,
-0.016357421875,
0.016571044921875,
-0.005023956298828125,
-0.036865234375,
0.017486572265625,
-0.0677490234375,
0.02032470703125,
-0.002742767333984375,
0.090576171875,
-0.00033974647521972656,
-0.042694091796875,
0.0122528076171875,
-0.02996826171875,
0.0477294921875,
-0.0687255859375,
0.0316162109375,
0.0273590087890625,
0.0238189697265625,
-0.0159454345703125,
-0.06988525390625,
-0.042388916015625,
0.0005779266357421875,
-0.011505126953125,
0.02557373046875,
-0.0399169921875,
0.011199951171875,
0.035430908203125,
0.059783935546875,
-0.0692138671875,
-0.023193359375,
-0.0257110595703125,
-0.024383544921875,
0.056915283203125,
0.0025730133056640625,
0.032318115234375,
-0.01763916015625,
-0.0247955322265625,
-0.03955078125,
-0.048980712890625,
0.01544189453125,
0.0229339599609375,
0.001953125,
-0.039459228515625,
0.03656005859375,
-0.0290374755859375,
0.037200927734375,
0.01328277587890625,
-0.0079803466796875,
0.0382080078125,
-0.040985107421875,
-0.0291748046875,
-0.0164947509765625,
0.08404541015625,
0.037353515625,
-0.0208587646484375,
0.0107269287109375,
-0.006229400634765625,
-0.00249481201171875,
-0.0055694580078125,
-0.0858154296875,
-0.041778564453125,
0.03436279296875,
-0.033447265625,
-0.0181884765625,
-0.00402069091796875,
-0.061920166015625,
0.00841522216796875,
-0.0010967254638671875,
0.04766845703125,
-0.05224609375,
-0.0289459228515625,
0.013031005859375,
-0.04302978515625,
0.02850341796875,
0.0263671875,
-0.053863525390625,
0.03271484375,
0.02154541015625,
0.060302734375,
0.0193328857421875,
-0.0102996826171875,
-0.0206298828125,
0.0083770751953125,
-0.00699615478515625,
0.031463623046875,
-0.01383209228515625,
-0.02825927734375,
-0.0203399658203125,
0.0291748046875,
-0.0013103485107421875,
-0.014251708984375,
0.04376220703125,
-0.0245513916015625,
0.03289794921875,
-0.047027587890625,
-0.044921875,
-0.04296875,
0.004833221435546875,
-0.05120849609375,
0.091552734375,
0.0460205078125,
-0.0662841796875,
0.017608642578125,
-0.034759521484375,
-0.0095367431640625,
0.006168365478515625,
-0.0024890899658203125,
-0.03936767578125,
-0.006969451904296875,
0.01495361328125,
0.0219879150390625,
-0.0204315185546875,
-0.0026950836181640625,
-0.0276641845703125,
-0.01450347900390625,
0.0035457611083984375,
-0.0262603759765625,
0.099365234375,
0.0116729736328125,
-0.0404052734375,
0.0008134841918945312,
-0.0474853515625,
0.01316070556640625,
0.032379150390625,
-0.0110321044921875,
-0.0028171539306640625,
-0.021697998046875,
0.015411376953125,
0.017669677734375,
0.017425537109375,
-0.03216552734375,
0.0272216796875,
-0.021484375,
0.0482177734375,
0.04107666015625,
0.0041961669921875,
0.0222015380859375,
-0.043731689453125,
0.038604736328125,
0.006622314453125,
0.04241943359375,
0.00655364990234375,
-0.055328369140625,
-0.04583740234375,
-0.01141357421875,
0.0200958251953125,
0.04022216796875,
-0.0555419921875,
0.0382080078125,
-0.0109405517578125,
-0.06427001953125,
-0.02142333984375,
-0.0026187896728515625,
0.03033447265625,
0.0294036865234375,
0.033172607421875,
-0.02685546875,
-0.01641845703125,
-0.061767578125,
0.00572967529296875,
-0.038299560546875,
0.0022869110107421875,
0.04083251953125,
0.050811767578125,
-0.01324462890625,
0.0560302734375,
-0.0435791015625,
-0.009552001953125,
-0.004848480224609375,
-0.0012102127075195312,
0.0233917236328125,
0.04443359375,
0.07464599609375,
-0.06475830078125,
-0.04168701171875,
-0.0084381103515625,
-0.05078125,
-0.00485992431640625,
0.001194000244140625,
-0.0308990478515625,
0.01157379150390625,
-0.002635955810546875,
-0.08367919921875,
0.04388427734375,
0.03826904296875,
-0.04888916015625,
0.059234619140625,
-0.017822265625,
0.0163726806640625,
-0.0748291015625,
0.01012420654296875,
0.01450347900390625,
-0.017303466796875,
-0.041229248046875,
0.007091522216796875,
0.00478363037109375,
0.0130615234375,
-0.037506103515625,
0.0517578125,
-0.04376220703125,
-0.0026149749755859375,
0.014404296875,
-0.0083770751953125,
0.0257415771484375,
0.03570556640625,
-0.007965087890625,
0.058380126953125,
0.042572021484375,
-0.0307159423828125,
0.040771484375,
0.03179931640625,
-0.0016994476318359375,
0.0135498046875,
-0.0697021484375,
0.0005850791931152344,
0.0019330978393554688,
0.031829833984375,
-0.06982421875,
-0.02301025390625,
0.04144287109375,
-0.040069580078125,
0.027130126953125,
-0.033416748046875,
-0.027313232421875,
-0.0280914306640625,
-0.04632568359375,
0.0281524658203125,
0.059417724609375,
-0.0283355712890625,
0.03411865234375,
0.0293121337890625,
0.00015032291412353516,
-0.042266845703125,
-0.041351318359375,
-0.0214385986328125,
-0.023406982421875,
-0.0589599609375,
0.0455322265625,
-0.012481689453125,
-0.00428009033203125,
0.006443023681640625,
-0.0026397705078125,
0.00152587890625,
-0.01201629638671875,
0.02874755859375,
0.0216827392578125,
-0.0184783935546875,
-0.015655517578125,
0.01092529296875,
0.00726318359375,
0.00045680999755859375,
-0.0224456787109375,
0.03179931640625,
-0.02740478515625,
-0.00051116943359375,
-0.0259552001953125,
0.0128936767578125,
0.04241943359375,
0.0033721923828125,
0.053497314453125,
0.0654296875,
-0.03155517578125,
0.01206207275390625,
-0.038299560546875,
-0.0140380859375,
-0.038482666015625,
0.005832672119140625,
-0.018646240234375,
-0.057403564453125,
0.04986572265625,
0.03216552734375,
0.02325439453125,
0.06256103515625,
0.02838134765625,
0.00980377197265625,
0.0736083984375,
0.0287933349609375,
-0.01470184326171875,
0.042388916015625,
-0.04559326171875,
-0.01910400390625,
-0.061309814453125,
-0.01554107666015625,
-0.0266571044921875,
-0.0217132568359375,
-0.05120849609375,
-0.03338623046875,
0.03131103515625,
0.03253173828125,
-0.0469970703125,
0.049713134765625,
-0.052337646484375,
0.01495361328125,
0.0380859375,
0.022796630859375,
0.005542755126953125,
0.001773834228515625,
-0.01152801513671875,
0.0024776458740234375,
-0.0455322265625,
-0.0152587890625,
0.07476806640625,
0.0308074951171875,
0.03961181640625,
0.0270538330078125,
0.03399658203125,
0.00022363662719726562,
0.01763916015625,
-0.041229248046875,
0.038818359375,
0.004497528076171875,
-0.05682373046875,
-0.0272369384765625,
-0.04888916015625,
-0.06903076171875,
0.0267333984375,
-0.0114898681640625,
-0.061279296875,
0.031402587890625,
0.005252838134765625,
-0.02862548828125,
0.020660400390625,
-0.05126953125,
0.08843994140625,
-0.0132293701171875,
-0.0372314453125,
-0.0008025169372558594,
-0.060791015625,
0.027557373046875,
0.02044677734375,
-0.004726409912109375,
-0.0179443359375,
-0.00661468505859375,
0.0635986328125,
-0.0631103515625,
0.04547119140625,
-0.03265380859375,
0.00896453857421875,
0.0399169921875,
-0.007080078125,
0.03790283203125,
0.00940704345703125,
-0.0038852691650390625,
0.025299072265625,
0.0335693359375,
-0.0391845703125,
-0.0296478271484375,
0.041229248046875,
-0.0728759765625,
-0.0305938720703125,
-0.038482666015625,
-0.032806396484375,
0.001018524169921875,
0.0163726806640625,
0.0455322265625,
0.0389404296875,
-0.00894927978515625,
0.010345458984375,
0.05279541015625,
-0.0278778076171875,
0.027984619140625,
0.0210113525390625,
-0.0282135009765625,
-0.04302978515625,
0.06292724609375,
0.00948333740234375,
0.01739501953125,
0.016754150390625,
0.0185394287109375,
-0.035858154296875,
-0.03424072265625,
-0.05352783203125,
0.0287628173828125,
-0.041168212890625,
-0.0282135009765625,
-0.051055908203125,
-0.0215606689453125,
-0.038543701171875,
0.01232147216796875,
-0.0229644775390625,
-0.05029296875,
-0.02716064453125,
0.00788116455078125,
0.0697021484375,
0.0300750732421875,
-0.01044464111328125,
0.027099609375,
-0.070068359375,
0.017608642578125,
0.0252838134765625,
0.0091094970703125,
-0.003162384033203125,
-0.052337646484375,
-0.00209808349609375,
0.0183258056640625,
-0.0421142578125,
-0.07672119140625,
0.0528564453125,
0.022918701171875,
0.0350341796875,
0.034881591796875,
0.0092010498046875,
0.061309814453125,
-0.0189056396484375,
0.07684326171875,
0.021209716796875,
-0.060943603515625,
0.04296875,
-0.04156494140625,
0.01568603515625,
0.0367431640625,
0.045196533203125,
-0.025115966796875,
-0.01910400390625,
-0.06280517578125,
-0.0628662109375,
0.03631591796875,
0.03485107421875,
-0.001575469970703125,
0.00797271728515625,
0.04803466796875,
-0.00861358642578125,
0.00661468505859375,
-0.05908203125,
-0.042022705078125,
-0.031951904296875,
-0.00528717041015625,
0.005706787109375,
0.0008759498596191406,
-0.0162353515625,
-0.0457763671875,
0.07501220703125,
-0.011627197265625,
0.04205322265625,
0.028167724609375,
0.005229949951171875,
-0.0107574462890625,
0.002727508544921875,
0.0234832763671875,
0.040496826171875,
-0.0288848876953125,
-0.019866943359375,
0.0028934478759765625,
-0.057891845703125,
0.0036869049072265625,
0.024627685546875,
-0.01397705078125,
0.004486083984375,
0.00408172607421875,
0.054901123046875,
0.005828857421875,
-0.0255279541015625,
0.04718017578125,
-0.026885986328125,
-0.0282745361328125,
-0.024078369140625,
0.022369384765625,
0.017425537109375,
0.0307159423828125,
0.0205078125,
-0.01187896728515625,
0.0252838134765625,
-0.042724609375,
0.01482391357421875,
0.03912353515625,
-0.021392822265625,
-0.03558349609375,
0.061187744140625,
-0.0081024169921875,
0.0212554931640625,
0.0516357421875,
-0.0202789306640625,
-0.038330078125,
0.055023193359375,
0.034515380859375,
0.05853271484375,
-0.013031005859375,
0.01447296142578125,
0.0452880859375,
0.00958251953125,
-0.01346588134765625,
0.03265380859375,
-0.003993988037109375,
-0.050201416015625,
-0.01549530029296875,
-0.04644775390625,
-0.0203704833984375,
0.0178680419921875,
-0.061126708984375,
0.00928497314453125,
-0.034637451171875,
-0.0333251953125,
0.005191802978515625,
0.0272674560546875,
-0.04364013671875,
0.017486572265625,
-0.003414154052734375,
0.07708740234375,
-0.061004638671875,
0.06634521484375,
0.044830322265625,
-0.03765869140625,
-0.0772705078125,
-0.018157958984375,
0.0227203369140625,
-0.043243408203125,
0.0170440673828125,
0.00738525390625,
0.0227203369140625,
0.000843048095703125,
-0.041107177734375,
-0.0626220703125,
0.11328125,
0.027984619140625,
-0.042449951171875,
-0.0100860595703125,
-0.010223388671875,
0.0245361328125,
-0.0027923583984375,
0.052215576171875,
0.043121337890625,
0.028411865234375,
0.0221405029296875,
-0.0645751953125,
0.0293731689453125,
-0.038299560546875,
0.00283050537109375,
0.0211334228515625,
-0.0806884765625,
0.0745849609375,
0.00223541259765625,
-0.012176513671875,
0.009033203125,
0.0513916015625,
0.032745361328125,
0.0011529922485351562,
0.036163330078125,
0.06494140625,
0.0513916015625,
-0.026214599609375,
0.080810546875,
-0.02215576171875,
0.04071044921875,
0.0491943359375,
0.0009369850158691406,
0.052276611328125,
0.017425537109375,
-0.045684814453125,
0.038360595703125,
0.06866455078125,
-0.005542755126953125,
0.0261383056640625,
-0.00188446044921875,
-0.03326416015625,
-0.00225067138671875,
0.0123748779296875,
-0.054840087890625,
-0.00388336181640625,
0.032958984375,
-0.01558685302734375,
-0.002285003662109375,
-0.019317626953125,
0.0012178421020507812,
-0.057373046875,
-0.01525115966796875,
0.043365478515625,
0.01690673828125,
-0.022918701171875,
0.055877685546875,
-0.0102081298828125,
0.048431396484375,
-0.042388916015625,
-0.007965087890625,
-0.0222930908203125,
-0.0014667510986328125,
-0.0288848876953125,
-0.062164306640625,
0.012054443359375,
-0.020355224609375,
-0.01026153564453125,
0.0032501220703125,
0.060089111328125,
-0.019683837890625,
-0.02960205078125,
0.0303802490234375,
0.030029296875,
0.029541015625,
-0.0110626220703125,
-0.08587646484375,
0.0204925537109375,
0.0035247802734375,
-0.05615234375,
0.037841796875,
0.0350341796875,
0.016998291015625,
0.046783447265625,
0.04547119140625,
0.00201416015625,
0.007518768310546875,
-0.00897216796875,
0.075439453125,
-0.0543212890625,
-0.0242767333984375,
-0.0499267578125,
0.043609619140625,
-0.0082855224609375,
-0.0352783203125,
0.06390380859375,
0.04840087890625,
0.055694580078125,
0.006641387939453125,
0.05023193359375,
-0.033447265625,
0.01502227783203125,
-0.02728271484375,
0.0533447265625,
-0.05291748046875,
0.0024089813232421875,
-0.032470703125,
-0.0635986328125,
-0.00016546249389648438,
0.058441162109375,
-0.005702972412109375,
0.019561767578125,
0.0220489501953125,
0.06884765625,
-0.006160736083984375,
0.01456451416015625,
0.01375579833984375,
0.022064208984375,
0.0167236328125,
0.0662841796875,
0.04498291015625,
-0.07672119140625,
0.032928466796875,
-0.041961669921875,
-0.0221099853515625,
-0.01035308837890625,
-0.053466796875,
-0.052520751953125,
-0.034027099609375,
-0.045684814453125,
-0.053619384765625,
-0.00542449951171875,
0.06396484375,
0.062347412109375,
-0.047882080078125,
-0.0198211669921875,
-0.008544921875,
-0.0058746337890625,
-0.0222320556640625,
-0.0247955322265625,
0.026763916015625,
0.01465606689453125,
-0.043731689453125,
0.0134429931640625,
0.0079193115234375,
0.027984619140625,
-0.004901885986328125,
-0.0286102294921875,
-0.00550079345703125,
-0.007450103759765625,
0.045013427734375,
0.044219970703125,
-0.043365478515625,
-0.0129852294921875,
-0.0114898681640625,
-0.0050048828125,
0.0177001953125,
0.0191497802734375,
-0.05462646484375,
0.005157470703125,
0.046783447265625,
0.01546478271484375,
0.0662841796875,
0.0009179115295410156,
0.025543212890625,
-0.0298004150390625,
0.00632476806640625,
0.01041412353515625,
0.033233642578125,
0.00018334388732910156,
-0.03704833984375,
0.04443359375,
0.0313720703125,
-0.055023193359375,
-0.052642822265625,
-0.01329803466796875,
-0.0953369140625,
-0.01739501953125,
0.08099365234375,
-0.018035888671875,
-0.0313720703125,
0.00318145751953125,
-0.01554107666015625,
0.024932861328125,
-0.042236328125,
0.016876220703125,
0.03375244140625,
-0.0243682861328125,
-0.03155517578125,
-0.06414794921875,
0.05242919921875,
0.02178955078125,
-0.0589599609375,
0.00439453125,
0.04974365234375,
0.041748046875,
0.0071563720703125,
0.060150146484375,
-0.0257415771484375,
0.0293731689453125,
0.01203155517578125,
0.0033435821533203125,
0.007476806640625,
0.005096435546875,
-0.032470703125,
0.0008893013000488281,
-0.0195159912109375,
0.005176544189453125
]
] |
openbmb/cpm-ant-10b | 2023-06-02T02:04:30.000Z | [
"transformers",
"pytorch",
"cpmant",
"text-generation",
"zh",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | openbmb | null | null | openbmb/cpm-ant-10b | 22 | 12,486 | transformers | 2023-01-15T14:28:26 | ---
tags:
- text-generation
language: zh
---
## Usage
```
pip install transformers
```
```python
from transformers import CpmAntTokenizer, CpmAntForCausalLM

# Chinese prompt meaning "The weather is nice today,"
texts = "今天天气不错,"
model = CpmAntForCausalLM.from_pretrained("openbmb/cpm-ant-10b")
tokenizer = CpmAntTokenizer.from_pretrained("openbmb/cpm-ant-10b")

# Tokenize the prompt and generate a continuation
input_ids = tokenizer(texts, return_tensors="pt")
outputs = model.generate(**input_ids)
output_texts = tokenizer.batch_decode(outputs)
print(output_texts)
```
| 471 | [
[
-0.0292510986328125,
-0.019989013671875,
0.0045013427734375,
0.031829833984375,
-0.044708251953125,
-0.0096588134765625,
-0.00307464599609375,
0.0173492431640625,
0.00004416704177856445,
0.032470703125,
-0.050445556640625,
-0.033843994140625,
-0.0731201171875,
0.03424072265625,
-0.0172271728515625,
0.068603515625,
-0.0182952880859375,
0.03704833984375,
0.032562255859375,
0.000995635986328125,
-0.0209808349609375,
-0.037872314453125,
-0.0511474609375,
-0.02880859375,
0.007678985595703125,
-0.00911712646484375,
0.0303497314453125,
0.03631591796875,
0.03143310546875,
0.0308074951171875,
-0.006565093994140625,
0.0130767822265625,
-0.0246124267578125,
-0.009033203125,
-0.01099395751953125,
-0.052947998046875,
-0.0191192626953125,
0.001415252685546875,
0.0758056640625,
0.03131103515625,
0.0162353515625,
0.0172119140625,
0.009002685546875,
0.0178985595703125,
-0.03143310546875,
0.0419921875,
-0.0259857177734375,
0.004573822021484375,
-0.0017595291137695312,
-0.0177459716796875,
-0.039093017578125,
-0.03179931640625,
-0.01959228515625,
-0.048553466796875,
0.006351470947265625,
0.00893402099609375,
0.098388671875,
0.0238189697265625,
-0.02703857421875,
-0.00891876220703125,
-0.048004150390625,
0.0743408203125,
-0.05352783203125,
0.0185089111328125,
0.01427459716796875,
0.028594970703125,
-0.02215576171875,
-0.07513427734375,
-0.031707763671875,
0.0079803466796875,
-0.0222930908203125,
0.0029449462890625,
-0.0223846435546875,
-0.01108551025390625,
0.018524169921875,
0.034912109375,
-0.0170135498046875,
-0.008148193359375,
-0.050201416015625,
-0.024444580078125,
0.0083465576171875,
0.0287017822265625,
0.028656005859375,
-0.0189361572265625,
-0.02838134765625,
-0.03826904296875,
-0.032012939453125,
0.003582000732421875,
0.0215606689453125,
0.031646728515625,
-0.015655517578125,
0.068603515625,
-0.00312042236328125,
0.032196044921875,
0.01861572265625,
-0.00897979736328125,
0.017486572265625,
-0.030548095703125,
-0.0423583984375,
0.0006184577941894531,
0.07647705078125,
0.0028820037841796875,
-0.00458526611328125,
0.0160980224609375,
-0.0009555816650390625,
-0.002044677734375,
0.0003154277801513672,
-0.0823974609375,
-0.0174560546875,
0.0244140625,
-0.0604248046875,
-0.021697998046875,
0.043548583984375,
-0.057952880859375,
0.00888824462890625,
-0.01462554931640625,
0.047821044921875,
-0.05096435546875,
-0.038116455078125,
0.00455474853515625,
-0.0443115234375,
0.0284271240234375,
-0.00836944580078125,
-0.073486328125,
0.0075225830078125,
0.055145263671875,
0.05169677734375,
0.0208282470703125,
-0.04736328125,
-0.041015625,
0.005954742431640625,
-0.017364501953125,
0.03271484375,
-0.01535797119140625,
-0.0271453857421875,
-0.00567626953125,
0.035552978515625,
-0.03021240234375,
-0.046630859375,
0.0252685546875,
-0.0184783935546875,
0.050201416015625,
-0.01192474365234375,
-0.00847625732421875,
-0.01654052734375,
0.0091552734375,
-0.0297393798828125,
0.0655517578125,
0.0386962890625,
-0.06378173828125,
0.017425537109375,
-0.054718017578125,
-0.03399658203125,
-0.0045166015625,
0.007701873779296875,
-0.04974365234375,
0.00646209716796875,
0.0217132568359375,
0.039276123046875,
-0.005046844482421875,
0.022857666015625,
-0.001529693603515625,
-0.0267181396484375,
0.03326416015625,
-0.033050537109375,
0.0994873046875,
0.033294677734375,
-0.030670166015625,
0.034088134765625,
-0.03192138671875,
-0.0175933837890625,
0.01181793212890625,
-0.04937744140625,
-0.00213623046875,
-0.00933074951171875,
0.0118560791015625,
-0.0144195556640625,
0.029083251953125,
-0.0421142578125,
0.0125274658203125,
-0.041717529296875,
0.036773681640625,
0.05572509765625,
-0.0140533447265625,
0.03240966796875,
-0.019134521484375,
0.0221099853515625,
0.023712158203125,
0.0015544891357421875,
-0.0281524658203125,
-0.0258026123046875,
-0.056304931640625,
-0.0556640625,
0.0022182464599609375,
0.0190582275390625,
-0.0723876953125,
0.057037353515625,
-0.0134429931640625,
-0.049652099609375,
-0.038360595703125,
-0.0190887451171875,
0.024322509765625,
0.0133514404296875,
0.0298919677734375,
-0.019744873046875,
-0.072021484375,
-0.042205810546875,
-0.016876220703125,
-0.0302276611328125,
-0.031982421875,
-0.0028972625732421875,
0.04833984375,
-0.036407470703125,
0.046539306640625,
-0.05047607421875,
-0.0303955078125,
-0.038543701171875,
0.0205841064453125,
0.04034423828125,
0.05670166015625,
0.04010009765625,
-0.028411865234375,
-0.0244140625,
-0.01149749755859375,
-0.040924072265625,
0.001071929931640625,
-0.0120849609375,
-0.02874755859375,
0.018768310546875,
-0.00426483154296875,
-0.04022216796875,
0.007694244384765625,
0.03173828125,
-0.049835205078125,
0.055023193359375,
-0.005817413330078125,
0.01739501953125,
-0.1019287109375,
0.0195465087890625,
-0.031982421875,
-0.0133819580078125,
-0.02264404296875,
0.0172119140625,
0.0016565322875976562,
-0.0160064697265625,
-0.059906005859375,
0.041290283203125,
-0.0240020751953125,
0.0016698837280273438,
-0.0093536376953125,
-0.03533935546875,
0.005039215087890625,
0.037200927734375,
-0.00630950927734375,
0.042633056640625,
0.048980712890625,
-0.057952880859375,
0.040252685546875,
0.042999267578125,
-0.015960693359375,
0.00763702392578125,
-0.03289794921875,
-0.0244598388671875,
-0.0021152496337890625,
0.010589599609375,
-0.0771484375,
-0.0279998779296875,
0.060333251953125,
-0.043914794921875,
0.0224151611328125,
-0.0096588134765625,
-0.034393310546875,
-0.036346435546875,
-0.025909423828125,
0.035064697265625,
0.059783935546875,
-0.06427001953125,
0.061614990234375,
0.0030994415283203125,
0.01151275634765625,
-0.053436279296875,
-0.050750732421875,
-0.0207977294921875,
-0.0111083984375,
-0.0113525390625,
0.006069183349609375,
-0.00824737548828125,
0.00604248046875,
0.0017681121826171875,
0.00653839111328125,
-0.0115814208984375,
-0.003490447998046875,
0.0167694091796875,
0.034515380859375,
-0.0262451171875,
-0.045562744140625,
0.0102996826171875,
-0.031494140625,
0.0262908935546875,
-0.021697998046875,
0.08111572265625,
-0.01194000244140625,
-0.0188751220703125,
-0.055572509765625,
0.0005731582641601562,
0.04595947265625,
-0.025848388671875,
0.064697265625,
0.08221435546875,
-0.0240020751953125,
0.00139617919921875,
-0.0025577545166015625,
-0.017669677734375,
-0.038330078125,
0.0438232421875,
-0.0308685302734375,
-0.03857421875,
0.03533935546875,
-0.01139068603515625,
0.0034389495849609375,
0.055633544921875,
0.059478759765625,
0.02178955078125,
0.07763671875,
0.03558349609375,
0.0078277587890625,
0.03192138671875,
-0.045135498046875,
0.01471710205078125,
-0.06396484375,
-0.0263214111328125,
-0.037689208984375,
-0.0009374618530273438,
-0.0173187255859375,
-0.01488494873046875,
0.0036907196044921875,
0.0060577392578125,
-0.041839599609375,
0.058868408203125,
-0.0587158203125,
0.0181884765625,
0.05731201171875,
-0.001434326171875,
-0.01091766357421875,
-0.00753021240234375,
-0.027313232421875,
0.0016307830810546875,
-0.03045654296875,
-0.03399658203125,
0.073486328125,
0.0278167724609375,
0.05267333984375,
0.003437042236328125,
0.0391845703125,
-0.0266876220703125,
0.026397705078125,
-0.0556640625,
0.0360107421875,
-0.0211181640625,
-0.048492431640625,
-0.002960205078125,
-0.03228759765625,
-0.0740966796875,
0.0161285400390625,
-0.005126953125,
-0.037841796875,
0.003265380859375,
-0.01096343994140625,
-0.007415771484375,
0.044097900390625,
-0.021575927734375,
0.07135009765625,
-0.00463104248046875,
0.00725555419921875,
-0.00506591796875,
-0.0279541015625,
0.04705810546875,
0.003082275390625,
-0.01233673095703125,
-0.01000213623046875,
0.0015306472778320312,
0.06292724609375,
-0.0279998779296875,
0.0592041015625,
-0.01453399658203125,
0.0245513916015625,
0.0145416259765625,
0.00424957275390625,
0.001506805419921875,
0.005168914794921875,
-0.00177001953125,
0.03289794921875,
0.0323486328125,
-0.0321044921875,
-0.01552581787109375,
0.0457763671875,
-0.08203125,
-0.0345458984375,
-0.0421142578125,
-0.0305023193359375,
0.052734375,
0.050048828125,
0.046356201171875,
0.0305328369140625,
-0.01227569580078125,
0.006855010986328125,
0.01073455810546875,
-0.0323486328125,
0.05523681640625,
0.0183868408203125,
-0.0270233154296875,
-0.06903076171875,
0.05084228515625,
-0.00021088123321533203,
0.01160430908203125,
0.04425048828125,
0.0192718505859375,
-0.031097412109375,
-0.003391265869140625,
-0.036407470703125,
0.0278778076171875,
-0.048583984375,
-0.0251007080078125,
-0.037750244140625,
-0.0272216796875,
-0.04632568359375,
0.0056915283203125,
-0.035491943359375,
-0.02294921875,
-0.03961181640625,
0.0189208984375,
0.01029205322265625,
0.039581298828125,
-0.01494598388671875,
0.046417236328125,
-0.06781005859375,
0.033050537109375,
0.00506591796875,
0.017608642578125,
-0.0032901763916015625,
-0.03436279296875,
-0.0190582275390625,
-0.002979278564453125,
-0.039703369140625,
-0.0567626953125,
0.06744384765625,
-0.0117950439453125,
0.037322998046875,
0.05572509765625,
0.02410888671875,
0.043060302734375,
-0.024139404296875,
0.056976318359375,
0.01611328125,
-0.0889892578125,
0.02886962890625,
-0.00725555419921875,
0.0201263427734375,
0.005218505859375,
0.032073974609375,
-0.0322265625,
-0.0228729248046875,
-0.034027099609375,
-0.056854248046875,
0.080322265625,
0.0243988037109375,
0.004215240478515625,
-0.009063720703125,
0.01473236083984375,
0.0185546875,
0.014434814453125,
-0.06549072265625,
-0.038055419921875,
-0.04736328125,
-0.029998779296875,
-0.0023784637451171875,
-0.01513671875,
-0.01129150390625,
-0.033111572265625,
0.07098388671875,
0.00766754150390625,
0.0638427734375,
0.01543426513671875,
0.007648468017578125,
-0.00910186767578125,
0.00727081298828125,
0.04791259765625,
0.03717041015625,
-0.0311431884765625,
0.0291595458984375,
0.0036754608154296875,
-0.032623291015625,
0.0167236328125,
0.005893707275390625,
0.007053375244140625,
0.0218505859375,
0.0131683349609375,
0.059906005859375,
-0.006488800048828125,
-0.019195556640625,
0.032928466796875,
-0.01959228515625,
-0.00989532470703125,
-0.048370361328125,
0.004268646240234375,
0.005252838134765625,
0.01302337646484375,
0.03277587890625,
0.03253173828125,
-0.0234375,
-0.034027099609375,
0.005481719970703125,
0.0294342041015625,
-0.0244293212890625,
-0.027618408203125,
0.090087890625,
-0.00048542022705078125,
-0.03375244140625,
0.05804443359375,
-0.004123687744140625,
-0.052398681640625,
0.05621337890625,
0.059417724609375,
0.06939697265625,
-0.0135650634765625,
-0.00910186767578125,
0.052490234375,
0.02557373046875,
-0.01435089111328125,
0.020263671875,
0.005611419677734375,
-0.057098388671875,
-0.0341796875,
-0.057891845703125,
-0.026885986328125,
0.005718231201171875,
-0.0413818359375,
0.049957275390625,
-0.048553466796875,
-0.00412750244140625,
-0.0109405517578125,
-0.0185089111328125,
-0.036712646484375,
-0.00472259521484375,
-0.01348114013671875,
0.05303955078125,
-0.0592041015625,
0.07208251953125,
0.05328369140625,
-0.052581787109375,
-0.062744140625,
-0.014373779296875,
-0.01209259033203125,
-0.06982421875,
0.065673828125,
0.03729248046875,
0.0193634033203125,
0.017791748046875,
-0.0248260498046875,
-0.0513916015625,
0.07830810546875,
0.0282135009765625,
0.0084075927734375,
0.0215911865234375,
0.031707763671875,
0.00970458984375,
-0.0298919677734375,
0.034088134765625,
0.035186767578125,
0.056427001953125,
-0.00888824462890625,
-0.048065185546875,
0.01861572265625,
-0.019195556640625,
-0.0082244873046875,
0.0266876220703125,
-0.03240966796875,
0.09228515625,
-0.004505157470703125,
-0.03045654296875,
0.026336669921875,
0.050201416015625,
0.0296783447265625,
0.01010894775390625,
0.00977325439453125,
0.028564453125,
0.0137481689453125,
-0.03277587890625,
0.0596923828125,
-0.0255279541015625,
0.04541015625,
0.0235748291015625,
0.01904296875,
0.062408447265625,
0.037261962890625,
-0.0251007080078125,
0.0423583984375,
0.052459716796875,
-0.04449462890625,
0.036376953125,
0.0005145072937011719,
-0.01800537109375,
0.0012454986572265625,
0.01181793212890625,
-0.032073974609375,
0.0146331787109375,
0.006114959716796875,
-0.051239013671875,
-0.0036754608154296875,
0.0021533966064453125,
0.020782470703125,
-0.0188751220703125,
-0.02618408203125,
0.037017822265625,
0.0012903213500976562,
-0.04156494140625,
0.07427978515625,
0.033477783203125,
0.07257080078125,
-0.047821044921875,
0.003265380859375,
-0.002079010009765625,
0.060821533203125,
-0.009002685546875,
-0.0341796875,
0.048614501953125,
-0.0180206298828125,
-0.01483154296875,
-0.0191650390625,
0.037689208984375,
-0.0236358642578125,
-0.046417236328125,
0.022979736328125,
0.00893402099609375,
0.03741455078125,
-0.009521484375,
-0.0419921875,
0.00382232666015625,
0.01427459716796875,
-0.018310546875,
0.019378662109375,
0.007598876953125,
0.0421142578125,
0.041473388671875,
0.061553955078125,
0.00235748291015625,
0.0162506103515625,
-0.0232696533203125,
0.066650390625,
-0.038818359375,
-0.024261474609375,
-0.087646484375,
0.045654296875,
0.0211029052734375,
-0.0297393798828125,
0.0406494140625,
0.06134033203125,
0.0677490234375,
-0.051239013671875,
0.03729248046875,
-0.0165557861328125,
0.018646240234375,
-0.02685546875,
0.0794677734375,
-0.01314544677734375,
0.01082611083984375,
-0.0026569366455078125,
-0.0887451171875,
-0.026031494140625,
0.0462646484375,
0.004150390625,
-0.022979736328125,
0.0704345703125,
0.0577392578125,
-0.0192108154296875,
-0.0267181396484375,
0.01568603515625,
0.0188446044921875,
0.030120849609375,
0.034637451171875,
0.04595947265625,
-0.05377197265625,
0.057373046875,
-0.037017822265625,
-0.00734710693359375,
0.0034770965576171875,
-0.0438232421875,
-0.0830078125,
-0.038116455078125,
-0.00403594970703125,
-0.050567626953125,
-0.0260467529296875,
0.0792236328125,
0.041168212890625,
-0.060546875,
-0.03179931640625,
-0.0044708251953125,
0.01165008544921875,
-0.02783203125,
-0.026092529296875,
0.04541015625,
-0.0255889892578125,
-0.09613037109375,
0.02142333984375,
0.000598907470703125,
0.023773193359375,
-0.030670166015625,
-0.025146484375,
0.00505828857421875,
0.00997161865234375,
0.0208740234375,
0.01337432861328125,
-0.06256103515625,
-0.00763702392578125,
-0.0106964111328125,
-0.01222991943359375,
0.017181396484375,
0.032196044921875,
-0.059906005859375,
0.006542205810546875,
0.061981201171875,
0.036773681640625,
0.023956298828125,
-0.0095672607421875,
0.05511474609375,
-0.040924072265625,
0.02239990234375,
-0.01128387451171875,
0.047454833984375,
0.01690673828125,
-0.019439697265625,
0.034515380859375,
0.039520263671875,
-0.035736083984375,
-0.0657958984375,
0.0013332366943359375,
-0.06951904296875,
-0.037872314453125,
0.06903076171875,
-0.02508544921875,
-0.04052734375,
0.004199981689453125,
-0.04498291015625,
0.0576171875,
-0.01557159423828125,
0.039398193359375,
0.02655029296875,
0.0024261474609375,
-0.005100250244140625,
0.0006670951843261719,
0.0171966552734375,
0.033905029296875,
-0.037567138671875,
-0.0102386474609375,
-0.011627197265625,
0.038055419921875,
0.0261993408203125,
0.0185089111328125,
-0.026763916015625,
0.004608154296875,
0.02178955078125,
0.0333251953125,
-0.03704833984375,
-0.0008692741394042969,
-0.0189971923828125,
0.0130157470703125,
-0.00815582275390625,
-0.05181884765625
]
] |
timm/wide_resnet50_2.racm_in1k | 2023-04-05T20:39:51.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:2110.00476",
"arxiv:1605.07146",
"arxiv:1512.03385",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/wide_resnet50_2.racm_in1k | 1 | 12,478 | timm | 2023-04-05T20:38:52 | ---
tags:
- image-classification
- timm
library_tag: timm
license: apache-2.0
---
# Model card for wide_resnet50_2.racm_in1k
A Wide-ResNet-B image classification model.
This model features:
* ReLU activations
* single layer 7x7 convolution with pooling
* 1x1 convolution shortcut downsample
Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* RandAugment `RACM` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 68.9
- GMACs: 11.4
- Activations (M): 14.4
- Image size: train = 224 x 224, test = 288 x 288
- **Papers:**
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- Wide Residual Networks: https://arxiv.org/abs/1605.07146
- Deep Residual Learning for Image Recognition: https://arxiv.org/abs/1512.03385
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('wide_resnet50_2.racm_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
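To inspect the result, the top-5 probabilities and class indices can be printed directly (mapping indices to human-readable class names requires an ImageNet-1k label map, which is not included here):

```python
for prob, idx in zip(top5_probabilities[0].tolist(), top5_class_indices[0].tolist()):
    print(f"class index {idx}: {prob:.2f}%")
```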
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'wide_resnet50_2.racm_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 1024, 14, 14])
# torch.Size([1, 2048, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'wide_resnet50_2.racm_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
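As one common use of these pooled features, two embeddings can be compared with cosine similarity; a minimal sketch reusing the model and transforms above (both embeddings come from the same image here, so the similarity is trivially 1.0):

```python
import torch.nn.functional as F

emb_a = output  # (1, num_features) tensor from forward_head(..., pre_logits=True) above
emb_b = model.forward_head(model.forward_features(transforms(img).unsqueeze(0)), pre_logits=True)
print(F.cosine_similarity(emb_a, emb_b).item())
```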
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
|model |img_size|top1 |top5 |param_count|gmacs|macts|img/sec|
|------------------------------------------|--------|-----|-----|-----------|-----|-----|-------|
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|320 |86.72|98.17|93.6 |35.2 |69.7 |451 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|288 |86.51|98.08|93.6 |28.5 |56.4 |560 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|288 |86.49|98.03|93.6 |28.5 |56.4 |557 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|224 |85.96|97.82|93.6 |17.2 |34.2 |923 |
|[resnext101_32x32d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x32d.fb_wsl_ig1b_ft_in1k)|224 |85.11|97.44|468.5 |87.3 |91.1 |254 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|416 |85.0 |97.12|191.9 |108.4|213.8|134 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|352 |84.96|97.22|102.1 |50.2 |101.2|291 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|320 |84.73|97.18|102.1 |41.5 |83.7 |353 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|384 |84.71|96.99|164.0 |77.6 |154.7|183 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|288 |84.57|97.08|93.6 |28.5 |56.4 |557 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|320 |84.45|97.08|93.2 |31.5 |67.8 |446 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|352 |84.43|96.97|129.9 |51.1 |105.5|280 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|288 |84.36|96.92|93.6 |27.6 |53.0 |595 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|320 |84.35|97.04|66.8 |24.1 |47.7 |610 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|288 |84.3 |96.94|164.0 |43.7 |87.1 |333 |
|[resnext101_32x8d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_swsl_ig1b_ft_in1k)|224 |84.28|97.17|88.8 |16.5 |31.2 |1100 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|320 |84.24|96.86|191.9 |64.2 |126.6|228 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|288 |84.19|96.87|93.6 |27.2 |51.6 |613 |
|[resnext101_32x16d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_wsl_ig1b_ft_in1k)|224 |84.18|97.19|194.0 |36.3 |51.2 |581 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|288 |84.11|97.11|44.6 |15.1 |29.0 |1144 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|320 |83.97|96.82|64.7 |31.2 |67.3 |518 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|256 |83.87|96.75|93.2 |20.2 |43.4 |692 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|224 |83.86|96.65|93.6 |17.2 |34.2 |923 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|320 |83.72|96.61|86.6 |24.3 |48.1 |617 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|256 |83.69|96.78|66.8 |15.4 |30.6 |943 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|224 |83.68|96.61|93.6 |16.7 |32.0 |986 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|320 |83.67|96.74|60.2 |24.1 |47.7 |706 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|256 |83.59|96.61|129.9 |27.1 |55.8 |526 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|224 |83.58|96.4 |93.6 |16.5 |31.2 |1013 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|224 |83.54|96.83|44.6 |9.1 |17.6 |1864 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|288 |83.46|96.54|60.2 |19.1 |37.3 |904 |
|[resnext101_32x16d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k)|224 |83.35|96.85|194.0 |36.3 |51.2 |582 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|256 |83.23|96.53|64.7 |20.0 |43.1 |809 |
|[resnext101_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_swsl_ig1b_ft_in1k)|224 |83.22|96.75|44.2 |8.0 |21.2 |1814 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|288 |83.16|96.38|83.5 |25.7 |51.6 |590 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|256 |83.14|96.38|60.2 |15.4 |30.5 |1096 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|320 |83.02|96.45|44.6 |16.5 |34.8 |992 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|288 |82.98|96.54|44.6 |13.4 |28.2 |1077 |
|[resnext101_64x4d.tv_in1k](https://huggingface.co/timm/resnext101_64x4d.tv_in1k)|224 |82.98|96.25|83.5 |15.5 |31.2 |989 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|256 |82.86|96.28|86.6 |15.6 |30.8 |951 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|224 |82.83|96.22|88.8 |16.5 |31.2 |1099 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|224 |82.8 |96.13|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|288 |82.8 |96.32|44.6 |13.0 |26.8 |1291 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|288 |82.74|95.71|60.2 |19.1 |37.3 |905 |
|[resnext101_32x8d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_wsl_ig1b_ft_in1k)|224 |82.69|96.63|88.8 |16.5 |31.2 |1100 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|288 |82.62|95.75|60.2 |19.1 |37.3 |904 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|288 |82.61|96.49|25.6 |8.9 |20.6 |1729 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|288 |82.53|96.13|36.8 |9.9 |21.5 |1773 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|224 |82.5 |96.02|126.9 |22.8 |21.2 |1078 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|224 |82.46|95.92|83.5 |15.5 |31.2 |987 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|288 |82.36|96.18|35.7 |8.1 |20.9 |1964 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|320 |82.35|96.14|25.6 |8.8 |24.1 |1386 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|288 |82.31|95.63|44.6 |13.0 |26.8 |1291 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|288 |82.29|96.01|63.6 |13.6 |28.5 |1078 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|224 |82.29|96.0 |60.2 |11.6 |22.6 |1484 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|288 |82.27|96.06|68.9 |18.9 |23.8 |1176 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|256 |82.26|96.07|44.6 |10.6 |22.2 |1542 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|288 |82.24|95.73|44.6 |13.0 |26.8 |1290 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|288 |82.2 |96.14|27.6 |7.0 |23.8 |1547 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|224 |82.18|96.05|44.6 |8.1 |17.1 |1771 |
|[resnext50_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_swsl_ig1b_ft_in1k)|224 |82.17|96.22|25.0 |4.3 |14.4 |2943 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|288 |82.12|95.65|25.6 |7.1 |19.6 |1704 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|288 |82.03|95.94|25.0 |7.0 |23.8 |1745 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|288 |82.0 |96.15|24.9 |5.8 |12.7 |1787 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|256 |81.99|95.85|36.8 |7.8 |17.0 |2230 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|176 |81.98|95.72|88.8 |10.3 |19.4 |1768 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|224 |81.97|95.24|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|224 |81.93|95.75|44.6 |7.8 |16.2 |2122 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|224 |81.9 |95.77|44.6 |7.8 |16.2 |2118 |
|[resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k)|224 |81.84|96.1 |194.0 |36.3 |51.2 |583 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|256 |81.78|95.94|35.7 |6.4 |16.6 |2471 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|224 |81.77|95.22|60.2 |11.6 |22.6 |1485 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|224 |81.74|96.06|25.6 |5.4 |12.4 |2813 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|288 |81.65|95.54|25.6 |7.1 |19.6 |1703 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|288 |81.64|95.88|25.6 |7.2 |19.7 |1694 |
|[resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k)|224 |81.62|96.04|88.8 |16.5 |31.2 |1101 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|224 |81.61|95.76|68.9 |11.4 |14.4 |1930 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|288 |81.61|95.83|25.6 |8.5 |19.2 |1868 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|224 |81.5 |95.16|44.6 |7.8 |16.2 |2125 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|288 |81.48|95.16|25.0 |7.0 |23.8 |1745 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|288 |81.47|95.71|25.9 |6.9 |18.6 |2071 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|224 |81.45|95.53|68.9 |11.4 |14.4 |1929 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|288 |81.44|95.22|25.6 |7.2 |19.7 |1908 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|256 |81.44|95.67|25.6 |5.6 |15.4 |2168 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|288 |81.4 |95.82|30.2 |6.8 |13.9 |2132 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|288 |81.37|95.74|25.6 |7.2 |19.7 |1910 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|224 |81.32|95.19|44.6 |7.8 |16.2 |2125 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|288 |81.3 |95.65|28.1 |6.8 |18.4 |1803 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|288 |81.3 |95.11|25.0 |7.0 |23.8 |1746 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|224 |81.27|95.62|27.6 |4.3 |14.4 |2591 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|224 |81.26|95.16|25.6 |4.3 |11.8 |2823 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|288 |81.23|95.54|15.7 |4.8 |19.6 |2117 |
|[senet154.gluon_in1k](https://huggingface.co/timm/senet154.gluon_in1k)|224 |81.23|95.35|115.1 |20.8 |38.7 |545 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|288 |81.22|95.11|25.6 |6.8 |18.4 |2089 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|288 |81.22|95.63|25.6 |6.8 |18.4 |676 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|288 |81.18|95.09|25.6 |7.2 |19.7 |1908 |
|[resnet50.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet50.fb_swsl_ig1b_ft_in1k)|224 |81.18|95.98|25.6 |4.1 |11.1 |3455 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|224 |81.17|95.34|25.0 |4.3 |14.4 |2933 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|224 |81.1 |95.33|25.0 |4.3 |14.4 |2934 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|288 |81.1 |95.23|28.1 |6.8 |18.4 |1801 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|288 |81.1 |95.12|28.1 |6.8 |18.4 |1799 |
|[resnet152s.gluon_in1k](https://huggingface.co/timm/resnet152s.gluon_in1k)|224 |81.02|95.41|60.3 |12.9 |25.0 |1347 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|288 |80.97|95.44|25.6 |6.8 |18.4 |2085 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|256 |80.94|95.45|25.9 |5.4 |14.7 |2571 |
|[resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.93|95.73|44.2 |8.0 |21.2 |1814 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|288 |80.91|95.55|25.6 |6.8 |18.4 |2084 |
|[seresnext101_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_32x4d.gluon_in1k)|224 |80.9 |95.31|49.0 |8.0 |21.3 |1585 |
|[seresnext101_64x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_64x4d.gluon_in1k)|224 |80.9 |95.3 |88.2 |15.5 |31.2 |918 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|288 |80.86|95.52|25.6 |6.8 |18.4 |2085 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|224 |80.85|95.43|25.6 |4.1 |11.1 |3450 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|224 |80.84|95.02|25.6 |4.3 |11.8 |2821 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|224 |80.79|95.62|24.9 |3.5 |7.7 |2961 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|288 |80.79|95.36|19.8 |6.0 |14.8 |2506 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|288 |80.79|95.58|19.9 |4.2 |10.6 |2349 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|288 |80.78|94.99|25.6 |6.8 |18.4 |2088 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|288 |80.71|95.43|25.6 |6.8 |18.4 |2087 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|288 |80.7 |95.39|25.0 |7.0 |23.8 |1749 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|192 |80.69|95.24|63.6 |6.0 |12.7 |2270 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|224 |80.68|94.71|25.6 |4.4 |11.9 |3162 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|288 |80.68|95.36|19.7 |6.0 |14.8 |2637 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|224 |80.67|95.3 |25.6 |4.1 |11.1 |3452 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|288 |80.67|95.42|25.0 |7.4 |25.1 |1626 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|224 |80.63|95.21|25.6 |5.2 |11.6 |3034 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|224 |80.61|95.32|25.6 |4.4 |11.9 |2813 |
|[resnext101_64x4d.gluon_in1k](https://huggingface.co/timm/resnext101_64x4d.gluon_in1k)|224 |80.61|94.99|83.5 |15.5 |31.2 |989 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|288 |80.6 |95.31|19.9 |6.0 |14.8 |2578 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|256 |80.57|95.17|15.7 |3.8 |15.5 |2710 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|224 |80.56|95.0 |60.2 |11.6 |22.6 |1483 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|224 |80.53|95.16|25.6 |4.4 |11.9 |3164 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|224 |80.53|94.46|25.0 |4.3 |14.4 |2930 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|176 |80.48|94.98|126.9 |14.3 |13.2 |1719 |
|[resnet152d.gluon_in1k](https://huggingface.co/timm/resnet152d.gluon_in1k)|224 |80.47|95.2 |60.2 |11.8 |23.4 |1428 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|288 |80.45|95.32|25.6 |6.8 |18.4 |2086 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|224 |80.45|95.24|30.2 |4.1 |8.4 |3530 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|224 |80.45|94.63|25.0 |4.3 |14.4 |2936 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|176 |80.43|95.09|68.9 |7.3 |9.0 |3015 |
|[resnet101d.gluon_in1k](https://huggingface.co/timm/resnet101d.gluon_in1k)|224 |80.42|95.01|44.6 |8.1 |17.0 |2007 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|224 |80.38|94.6 |25.6 |4.1 |11.1 |3461 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|256 |80.36|95.1 |19.8 |4.8 |11.7 |3267 |
|[resnext101_32x4d.gluon_in1k](https://huggingface.co/timm/resnext101_32x4d.gluon_in1k)|224 |80.34|94.93|44.2 |8.0 |21.2 |1814 |
|[resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.32|95.4 |25.0 |4.3 |14.4 |2941 |
|[resnet101s.gluon_in1k](https://huggingface.co/timm/resnet101s.gluon_in1k)|224 |80.28|95.16|44.7 |9.2 |18.6 |1851 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|224 |80.26|95.08|28.1 |4.1 |11.1 |2972 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|288 |80.24|95.24|25.6 |8.5 |19.9 |1523 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|224 |80.22|94.63|25.6 |4.4 |11.9 |3162 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|176 |80.2 |94.64|60.2 |7.2 |14.0 |2346 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|224 |80.08|94.74|28.1 |4.1 |11.1 |2969 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|256 |80.08|94.97|19.7 |4.8 |11.7 |3284 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|256 |80.06|94.99|19.9 |4.8 |11.7 |3216 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|224 |80.06|94.95|25.6 |4.1 |11.1 |1109 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|224 |80.02|94.71|28.1 |4.1 |11.1 |2962 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|288 |79.97|95.05|25.6 |6.8 |18.4 |2086 |
|[resnet152c.gluon_in1k](https://huggingface.co/timm/resnet152c.gluon_in1k)|224 |79.92|94.84|60.2 |11.8 |23.4 |1455 |
|[seresnext50_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext50_32x4d.gluon_in1k)|224 |79.91|94.82|27.6 |4.3 |14.4 |2591 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|224 |79.91|94.67|25.6 |4.1 |11.1 |3456 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|176 |79.9 |94.6 |44.6 |4.9 |10.1 |3341 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|224 |79.89|94.97|35.7 |4.5 |12.1 |2774 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|224 |79.88|94.87|25.6 |4.1 |11.1 |3455 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|320 |79.86|95.07|16.0 |5.2 |16.4 |2168 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|224 |79.85|94.56|25.6 |4.1 |11.1 |3460 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|288 |79.83|94.97|25.6 |6.8 |18.4 |2087 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|224 |79.82|94.62|44.6 |7.8 |16.2 |2114 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|224 |79.76|94.6 |25.0 |4.3 |14.4 |2943 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|224 |79.74|94.95|25.6 |4.1 |11.1 |3455 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|224 |79.74|94.87|19.9 |2.5 |6.4 |3929 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|288 |79.71|94.83|19.7 |6.0 |14.8 |2710 |
|[resnet152.gluon_in1k](https://huggingface.co/timm/resnet152.gluon_in1k)|224 |79.68|94.74|60.2 |11.6 |22.6 |1486 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|224 |79.67|94.87|25.0 |4.5 |15.2 |2729 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|288 |79.63|94.91|25.6 |6.8 |18.4 |2086 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|224 |79.56|94.72|25.6 |4.3 |11.8 |2805 |
|[resnet101c.gluon_in1k](https://huggingface.co/timm/resnet101c.gluon_in1k)|224 |79.53|94.58|44.6 |8.1 |17.0 |2062 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|224 |79.52|94.61|25.6 |4.1 |11.1 |3459 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|176 |79.42|94.64|25.6 |2.6 |6.9 |5397 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|288 |79.4 |94.66|18.0 |5.9 |14.6 |2752 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|224 |79.38|94.57|25.6 |4.1 |11.1 |3459 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|176 |79.37|94.3 |25.0 |2.7 |9.0 |4577 |
|[resnext50_32x4d.gluon_in1k](https://huggingface.co/timm/resnext50_32x4d.gluon_in1k)|224 |79.36|94.43|25.0 |4.3 |14.4 |2942 |
|[resnext101_32x8d.tv_in1k](https://huggingface.co/timm/resnext101_32x8d.tv_in1k)|224 |79.31|94.52|88.8 |16.5 |31.2 |1100 |
|[resnet101.gluon_in1k](https://huggingface.co/timm/resnet101.gluon_in1k)|224 |79.31|94.53|44.6 |7.8 |16.2 |2125 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|224 |79.31|94.63|25.6 |5.2 |12.0 |2524 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|176 |79.27|94.49|25.6 |2.6 |6.9 |5404 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|224 |79.25|94.31|25.0 |4.3 |14.4 |2931 |
|[resnet50.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet50.fb_ssl_yfcc100m_ft_in1k)|224 |79.22|94.84|25.6 |4.1 |11.1 |3451 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|256 |79.21|94.56|19.7 |4.8 |11.7 |3392 |
|[resnet50d.gluon_in1k](https://huggingface.co/timm/resnet50d.gluon_in1k)|224 |79.07|94.48|25.6 |4.4 |11.9 |3162 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|224 |79.03|94.38|25.6 |4.1 |11.1 |3453 |
|[resnet50.am_in1k](https://huggingface.co/timm/resnet50.am_in1k)|224 |79.01|94.39|25.6 |4.1 |11.1 |3461 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|256 |79.01|94.37|18.0 |4.6 |11.6 |3440 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|256 |78.9 |94.54|16.0 |3.4 |10.5 |3421 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|160 |78.89|94.11|60.2 |5.9 |11.5 |2745 |
|[wide_resnet101_2.tv_in1k](https://huggingface.co/timm/wide_resnet101_2.tv_in1k)|224 |78.84|94.28|126.9 |22.8 |21.2 |1079 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|288 |78.83|94.24|16.8 |4.5 |16.8 |2251 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|224 |78.81|94.32|25.6 |4.1 |11.1 |3454 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|288 |78.74|94.33|16.8 |4.5 |16.7 |2264 |
|[resnet50s.gluon_in1k](https://huggingface.co/timm/resnet50s.gluon_in1k)|224 |78.72|94.23|25.7 |5.5 |13.5 |2796 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|224 |78.71|94.24|25.6 |4.4 |11.9 |3154 |
|[wide_resnet50_2.tv_in1k](https://huggingface.co/timm/wide_resnet50_2.tv_in1k)|224 |78.47|94.09|68.9 |11.4 |14.4 |1934 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|224 |78.46|94.27|25.6 |4.1 |11.1 |3454 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|288 |78.43|94.35|21.8 |6.5 |7.5 |3291 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|288 |78.42|94.04|10.5 |3.1 |13.3 |3226 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|320 |78.33|94.13|16.0 |5.2 |16.4 |2391 |
|[resnet152.tv_in1k](https://huggingface.co/timm/resnet152.tv_in1k)|224 |78.32|94.04|60.2 |11.6 |22.6 |1487 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|288 |78.28|94.1 |10.4 |3.1 |13.3 |3062 |
|[bat_resnext26ts.ch_in1k](https://huggingface.co/timm/bat_resnext26ts.ch_in1k)|256 |78.25|94.1 |10.7 |2.5 |12.5 |3393 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|224 |78.06|93.78|25.6 |4.1 |11.1 |3450 |
|[resnet50c.gluon_in1k](https://huggingface.co/timm/resnet50c.gluon_in1k)|224 |78.0 |93.99|25.6 |4.4 |11.9 |3286 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|288 |78.0 |93.91|10.3 |3.1 |13.3 |3297 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|224 |77.98|93.75|16.8 |2.7 |10.1 |3841 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|288 |77.92|93.77|21.8 |6.1 |6.2 |3609 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|160 |77.88|93.71|44.6 |4.0 |8.3 |3926 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|256 |77.87|93.84|16.0 |3.4 |10.5 |3772 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|256 |77.86|93.79|10.4 |2.4 |10.5 |4263 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|160 |77.82|93.81|35.7 |2.3 |6.2 |5238 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|256 |77.81|93.82|10.5 |2.4 |10.5 |4183 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|160 |77.79|93.6 |25.6 |2.2 |6.0 |5329 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|160 |77.73|93.32|25.0 |2.2 |7.4 |5576 |
|[resnext50_32x4d.tv_in1k](https://huggingface.co/timm/resnext50_32x4d.tv_in1k)|224 |77.61|93.7 |25.0 |4.3 |14.4 |2944 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|224 |77.59|93.61|16.8 |2.7 |10.2 |3807 |
|[resnet50.gluon_in1k](https://huggingface.co/timm/resnet50.gluon_in1k)|224 |77.58|93.72|25.6 |4.1 |11.1 |3455 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|256 |77.44|93.56|10.3 |2.4 |10.5 |4284 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|288 |77.41|93.63|16.0 |4.3 |13.5 |2907 |
|[resnet101.tv_in1k](https://huggingface.co/timm/resnet101.tv_in1k)|224 |77.38|93.54|44.6 |7.8 |16.2 |2125 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|160 |77.22|93.27|25.6 |2.2 |6.1 |5982 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|288 |77.17|93.47|10.3 |3.1 |13.3 |3392 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|288 |77.15|93.27|21.8 |6.1 |6.2 |3615 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|224 |77.1 |93.37|21.8 |3.9 |4.5 |5436 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|224 |77.02|93.07|28.1 |4.1 |11.1 |2952 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|256 |76.78|93.13|10.3 |2.4 |10.5 |4410 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|224 |76.7 |93.17|16.0 |2.6 |8.2 |4859 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|288 |76.5 |93.35|21.8 |6.1 |6.2 |3617 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|224 |76.42|92.87|21.8 |3.7 |3.7 |5984 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|288 |76.35|93.18|16.0 |3.9 |12.2 |3331 |
|[resnet50.tv_in1k](https://huggingface.co/timm/resnet50.tv_in1k)|224 |76.13|92.86|25.6 |4.1 |11.1 |3457 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|160 |75.96|92.5 |25.6 |2.1 |5.7 |6490 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|224 |75.52|92.44|21.8 |3.7 |3.7 |5991 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|224 |75.3 |92.58|16.0 |2.4 |7.4 |5583 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|224 |75.16|92.18|21.8 |3.7 |3.7 |5994 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|160 |75.1 |92.08|28.1 |2.1 |5.7 |5513 |
|[resnet34.gluon_in1k](https://huggingface.co/timm/resnet34.gluon_in1k)|224 |74.57|91.98|21.8 |3.7 |3.7 |5984 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|288 |73.81|91.83|11.7 |3.4 |5.4 |5196 |
|[resnet34.tv_in1k](https://huggingface.co/timm/resnet34.tv_in1k)|224 |73.32|91.42|21.8 |3.7 |3.7 |5979 |
|[resnet18.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet18.fb_swsl_ig1b_ft_in1k)|224 |73.28|91.73|11.7 |1.8 |2.5 |10213 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|288 |73.16|91.03|11.7 |3.0 |4.1 |6050 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|224 |72.98|91.11|21.8 |3.7 |3.7 |5967 |
|[resnet18.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet18.fb_ssl_yfcc100m_ft_in1k)|224 |72.6 |91.42|11.7 |1.8 |2.5 |10213 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|288 |72.37|90.59|11.7 |3.0 |4.1 |6051 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|224 |72.26|90.31|10.1 |1.7 |5.8 |7026 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|224 |72.26|90.68|11.7 |2.1 |3.3 |8707 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|224 |71.49|90.07|11.7 |1.8 |2.5 |10187 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|176 |71.31|89.69|10.1 |1.1 |3.6 |10970 |
|[resnet18.gluon_in1k](https://huggingface.co/timm/resnet18.gluon_in1k)|224 |70.84|89.76|11.7 |1.8 |2.5 |10210 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|224 |70.64|89.47|11.7 |1.8 |2.5 |10194 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|160 |70.56|89.52|21.8 |1.9 |1.9 |10737 |
|[resnet18.tv_in1k](https://huggingface.co/timm/resnet18.tv_in1k)|224 |69.76|89.07|11.7 |1.8 |2.5 |10205 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|224 |68.34|88.03|5.4 |1.1 |2.4 |13079 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|224 |68.25|88.17|11.7 |1.8 |2.5 |10167 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|176 |66.71|86.96|5.4 |0.7 |1.5 |20327 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|160 |65.66|86.26|11.7 |0.9 |1.3 |18229 |
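The comparison above is derived from the CSV result files published in the linked results folder. A minimal sketch of loading them with pandas, assuming the `results-imagenet.csv` file and its `model`/`top1`/`top5` columns keep their current layout:
```python
import pandas as pd

# assumption: file name and column names follow the current layout of the timm results folder
url = ('https://raw.githubusercontent.com/huggingface/pytorch-image-models/'
       'main/results/results-imagenet.csv')
df = pd.read_csv(url)

# keep ResNet-family checkpoints and sort by top-1 accuracy
resnets = df[df['model'].str.contains('resnet', case=False)]
print(resnets.sort_values('top1', ascending=False).head(10)[['model', 'top1', 'top5']])
```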
## Citation
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@article{DBLP:journals/corr/ZagoruykoK16,
author = {Sergey Zagoruyko and
Nikos Komodakis},
title = {Wide Residual Networks},
journal = {CoRR},
volume = {abs/1605.07146},
year = {2016},
url = {http://arxiv.org/abs/1605.07146},
archivePrefix = {arXiv},
eprint = {1605.07146},
timestamp = {Mon, 13 Aug 2018 16:46:42 +0200},
biburl = {https://dblp.org/rec/journals/corr/ZagoruykoK16.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
```bibtex
@article{He2015,
author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
title = {Deep Residual Learning for Image Recognition},
journal = {arXiv preprint arXiv:1512.03385},
year = {2015}
}
```
| 39,246 | [
[
-0.065673828125,
-0.0193939208984375,
0.0037364959716796875,
0.0272064208984375,
-0.031402587890625,
-0.01189422607421875,
-0.01168060302734375,
-0.0308837890625,
0.083251953125,
0.0225677490234375,
-0.048797607421875,
-0.04315185546875,
-0.048736572265625,
-0.0004737377166748047,
0.023193359375,
0.06549072265625,
-0.0017786026000976562,
-0.0063629150390625,
0.01251220703125,
-0.0245513916015625,
-0.007152557373046875,
-0.022705078125,
-0.0762939453125,
-0.016357421875,
0.032196044921875,
0.0166778564453125,
0.051544189453125,
0.04473876953125,
0.03118896484375,
0.043701171875,
-0.017242431640625,
0.02203369140625,
-0.004138946533203125,
-0.006229400634765625,
0.043701171875,
-0.0321044921875,
-0.06829833984375,
-0.0023670196533203125,
0.052520751953125,
0.043792724609375,
0.003566741943359375,
0.0274810791015625,
0.0250244140625,
0.0484619140625,
-0.0010442733764648438,
-0.0027294158935546875,
-0.0004801750183105469,
0.0115203857421875,
-0.022247314453125,
0.004169464111328125,
-0.007640838623046875,
-0.053924560546875,
0.00881195068359375,
-0.045318603515625,
-0.0026264190673828125,
0.00008028745651245117,
0.1005859375,
-0.0080108642578125,
-0.01519775390625,
0.0037555694580078125,
0.006031036376953125,
0.058563232421875,
-0.06103515625,
0.0223388671875,
0.041900634765625,
0.00533294677734375,
-0.01416015625,
-0.049896240234375,
-0.037841796875,
0.01099395751953125,
-0.0300445556640625,
0.0218658447265625,
-0.025146484375,
-0.016357421875,
0.0290069580078125,
0.0243072509765625,
-0.03485107421875,
-0.0076751708984375,
-0.0268402099609375,
-0.00640106201171875,
0.052215576171875,
0.00687408447265625,
0.05145263671875,
-0.0245208740234375,
-0.037017822265625,
-0.0112762451171875,
-0.0140380859375,
0.03375244140625,
0.0184173583984375,
0.01174163818359375,
-0.08062744140625,
0.033966064453125,
0.00830841064453125,
0.02008056640625,
0.025115966796875,
-0.0125732421875,
0.0616455078125,
-0.004169464111328125,
-0.03826904296875,
-0.036376953125,
0.08221435546875,
0.047454833984375,
0.0211334228515625,
-0.003513336181640625,
-0.00464630126953125,
-0.0128173828125,
-0.0307159423828125,
-0.07293701171875,
-0.0008563995361328125,
0.0206451416015625,
-0.0419921875,
-0.015533447265625,
0.0263214111328125,
-0.06683349609375,
-0.0022602081298828125,
-0.006305694580078125,
0.00803375244140625,
-0.0552978515625,
-0.032745361328125,
0.00251007080078125,
-0.01517486572265625,
0.038726806640625,
0.016632080078125,
-0.0253448486328125,
0.0304718017578125,
0.0092010498046875,
0.06787109375,
0.0207061767578125,
-0.0035572052001953125,
-0.0184783935546875,
0.00008207559585571289,
-0.0252227783203125,
0.030120849609375,
0.006908416748046875,
-0.01346588134765625,
-0.0235137939453125,
0.03106689453125,
-0.0189971923828125,
-0.0218353271484375,
0.043792724609375,
0.0167083740234375,
0.01381683349609375,
-0.0217437744140625,
-0.016510009765625,
-0.0199737548828125,
0.02777099609375,
-0.042694091796875,
0.07830810546875,
0.0286865234375,
-0.084228515625,
0.01227569580078125,
-0.039031982421875,
-0.00347137451171875,
-0.022491455078125,
0.0040283203125,
-0.0697021484375,
0.0006690025329589844,
0.0182952880859375,
0.052764892578125,
-0.019622802734375,
-0.01372528076171875,
-0.02825927734375,
0.002307891845703125,
0.030609130859375,
0.0139007568359375,
0.0693359375,
0.025390625,
-0.034210205078125,
-0.0131683349609375,
-0.053466796875,
0.03265380859375,
0.032958984375,
-0.0005807876586914062,
-0.005207061767578125,
-0.05902099609375,
0.0019388198852539062,
0.045013427734375,
0.0163421630859375,
-0.052215576171875,
0.0204010009765625,
-0.01422882080078125,
0.02386474609375,
0.04620361328125,
0.00406646728515625,
0.0145416259765625,
-0.05291748046875,
0.047454833984375,
-0.0005846023559570312,
0.02294921875,
-0.0011196136474609375,
-0.032958984375,
-0.055999755859375,
-0.05755615234375,
0.0172271728515625,
0.0310211181640625,
-0.0282745361328125,
0.0634765625,
0.00891876220703125,
-0.045623779296875,
-0.047210693359375,
0.004573822021484375,
0.041656494140625,
0.01479339599609375,
0.00922393798828125,
-0.0302581787109375,
-0.055328369140625,
-0.07177734375,
-0.0218963623046875,
0.01126861572265625,
-0.00328826904296875,
0.051422119140625,
0.03350830078125,
-0.016632080078125,
0.044189453125,
-0.0294952392578125,
-0.021209716796875,
-0.01500701904296875,
-0.007305145263671875,
0.032958984375,
0.058441162109375,
0.07635498046875,
-0.053558349609375,
-0.0703125,
0.00853729248046875,
-0.08221435546875,
-0.0035400390625,
0.0003314018249511719,
-0.01885986328125,
0.03387451171875,
0.0173187255859375,
-0.06512451171875,
0.055816650390625,
0.0294036865234375,
-0.0584716796875,
0.0313720703125,
-0.0273284912109375,
0.039794921875,
-0.08428955078125,
0.01983642578125,
0.021270751953125,
-0.0158233642578125,
-0.04046630859375,
0.005947113037109375,
-0.006710052490234375,
0.0100250244140625,
-0.0406494140625,
0.0579833984375,
-0.05426025390625,
-0.00487518310546875,
0.00948333740234375,
0.00464630126953125,
-0.001667022705078125,
0.03265380859375,
-0.006603240966796875,
0.0406494140625,
0.0670166015625,
-0.0120086669921875,
0.0244293212890625,
0.0307769775390625,
0.0012006759643554688,
0.056396484375,
-0.044891357421875,
0.01103973388671875,
0.00067138671875,
0.0355224609375,
-0.0765380859375,
-0.0294952392578125,
0.04052734375,
-0.0614013671875,
0.050048828125,
-0.0207672119140625,
-0.0239715576171875,
-0.061553955078125,
-0.0638427734375,
0.021575927734375,
0.050689697265625,
-0.044677734375,
0.031890869140625,
0.0160980224609375,
-0.0008482933044433594,
-0.038238525390625,
-0.054168701171875,
0.0036258697509765625,
-0.0308380126953125,
-0.0614013671875,
0.0288238525390625,
0.023162841796875,
-0.01383209228515625,
0.0059661865234375,
-0.00994873046875,
-0.0087127685546875,
-0.016571044921875,
0.04730224609375,
0.0249786376953125,
-0.0230865478515625,
-0.031890869140625,
-0.0294036865234375,
-0.021484375,
-0.0044403076171875,
-0.00801849365234375,
0.04046630859375,
-0.031646728515625,
0.0027484893798828125,
-0.1063232421875,
0.00875091552734375,
0.06658935546875,
-0.005611419677734375,
0.07366943359375,
0.060577392578125,
-0.034881591796875,
0.01053619384765625,
-0.03631591796875,
-0.0171661376953125,
-0.038726806640625,
-0.01312255859375,
-0.05078125,
-0.044403076171875,
0.07080078125,
0.006694793701171875,
-0.00995635986328125,
0.0584716796875,
0.01345062255859375,
-0.01470184326171875,
0.0638427734375,
0.03759765625,
-0.003871917724609375,
0.040985107421875,
-0.06365966796875,
0.00595855712890625,
-0.060882568359375,
-0.05609130859375,
-0.0213623046875,
-0.0428466796875,
-0.04364013671875,
-0.028533935546875,
0.0187225341796875,
0.026947021484375,
-0.0192718505859375,
0.041412353515625,
-0.043701171875,
0.0027294158935546875,
0.0234222412109375,
0.0390625,
-0.0154876708984375,
-0.005352020263671875,
-0.01140594482421875,
-0.02459716796875,
-0.040771484375,
-0.024932861328125,
0.059906005859375,
0.04827880859375,
0.035797119140625,
0.006244659423828125,
0.044952392578125,
0.0010824203491210938,
0.0183868408203125,
-0.02520751953125,
0.052459716796875,
0.0030460357666015625,
-0.03515625,
-0.023101806640625,
-0.03125,
-0.07891845703125,
0.0114898681640625,
-0.03515625,
-0.061859130859375,
-0.01375579833984375,
-0.003673553466796875,
-0.0264434814453125,
0.05828857421875,
-0.04449462890625,
0.04620361328125,
-0.0029697418212890625,
-0.03985595703125,
-0.00597381591796875,
-0.0601806640625,
0.007080078125,
0.028167724609375,
0.00537109375,
-0.0018024444580078125,
-0.0026378631591796875,
0.05841064453125,
-0.059295654296875,
0.045684814453125,
-0.0289306640625,
0.01068115234375,
0.0322265625,
-0.0023250579833984375,
0.0303497314453125,
-0.0011434555053710938,
-0.0142669677734375,
-0.004253387451171875,
0.005352020263671875,
-0.05938720703125,
-0.0258636474609375,
0.048431396484375,
-0.057037353515625,
-0.027130126953125,
-0.049102783203125,
-0.0227813720703125,
0.006786346435546875,
0.0013322830200195312,
0.035797119140625,
0.0496826171875,
-0.0013904571533203125,
0.0179443359375,
0.04083251953125,
-0.035003662109375,
0.03717041015625,
-0.005954742431640625,
0.00010532140731811523,
-0.04266357421875,
0.050689697265625,
0.004001617431640625,
0.0018434524536132812,
0.0011959075927734375,
0.0023403167724609375,
-0.0294036865234375,
-0.0178375244140625,
-0.021453857421875,
0.055419921875,
-0.0157623291015625,
-0.0267791748046875,
-0.04888916015625,
-0.0270538330078125,
-0.04119873046875,
-0.029937744140625,
-0.0340576171875,
-0.0272369384765625,
-0.0203857421875,
0.0036411285400390625,
0.055389404296875,
0.0665283203125,
-0.0275726318359375,
0.027740478515625,
-0.038818359375,
0.0210723876953125,
0.0045623779296875,
0.041839599609375,
-0.0218658447265625,
-0.0526123046875,
0.002101898193359375,
-0.0036716461181640625,
-0.008087158203125,
-0.060302734375,
0.048553466796875,
0.0016937255859375,
0.0272064208984375,
0.031646728515625,
-0.0150299072265625,
0.054107666015625,
-0.001491546630859375,
0.034210205078125,
0.046966552734375,
-0.05029296875,
0.0278472900390625,
-0.0305633544921875,
0.004291534423828125,
0.02020263671875,
0.017547607421875,
-0.0306549072265625,
-0.0284576416015625,
-0.065185546875,
-0.036407470703125,
0.056915283203125,
0.005275726318359375,
-0.002025604248046875,
-0.0012445449829101562,
0.055328369140625,
-0.004093170166015625,
-0.00017058849334716797,
-0.039794921875,
-0.0675048828125,
-0.00714874267578125,
-0.01352691650390625,
0.004329681396484375,
-0.006526947021484375,
0.0016202926635742188,
-0.049041748046875,
0.049957275390625,
0.004302978515625,
0.038330078125,
0.01393890380859375,
0.00547027587890625,
-0.0018711090087890625,
-0.0194244384765625,
0.045562744140625,
0.028289794921875,
-0.01548004150390625,
-0.00970458984375,
0.028350830078125,
-0.038330078125,
0.00594329833984375,
0.015228271484375,
0.00026798248291015625,
0.006717681884765625,
0.0086822509765625,
0.03778076171875,
0.023651123046875,
-0.004730224609375,
0.039276123046875,
-0.0181884765625,
-0.04290771484375,
-0.0169830322265625,
-0.0124053955078125,
0.0191192626953125,
0.032684326171875,
0.0262451171875,
0.00681304931640625,
-0.031646728515625,
-0.028533935546875,
0.03887939453125,
0.056640625,
-0.028472900390625,
-0.0301666259765625,
0.045379638671875,
-0.00574493408203125,
-0.0159454345703125,
0.0296173095703125,
-0.006389617919921875,
-0.049407958984375,
0.0760498046875,
0.0268402099609375,
0.048980712890625,
-0.03753662109375,
0.006366729736328125,
0.06787109375,
-0.0005006790161132812,
0.01104736328125,
0.024444580078125,
0.032623291015625,
-0.025390625,
-0.0052337646484375,
-0.04180908203125,
0.0122222900390625,
0.035980224609375,
-0.035308837890625,
0.02215576171875,
-0.055145263671875,
-0.0264892578125,
0.005512237548828125,
0.037109375,
-0.04901123046875,
0.0254669189453125,
-0.003841400146484375,
0.08099365234375,
-0.06378173828125,
0.06549072265625,
0.06689453125,
-0.04180908203125,
-0.0670166015625,
-0.0017528533935546875,
0.008575439453125,
-0.0657958984375,
0.0347900390625,
0.0101318359375,
0.002899169921875,
-0.004718780517578125,
-0.03863525390625,
-0.05078125,
0.1038818359375,
0.0289154052734375,
-0.005649566650390625,
0.020660400390625,
-0.0274810791015625,
0.027740478515625,
-0.01580810546875,
0.04473876953125,
0.025970458984375,
0.039581298828125,
0.01520538330078125,
-0.064697265625,
0.0294189453125,
-0.029205322265625,
-0.007656097412109375,
0.0233154296875,
-0.096435546875,
0.06646728515625,
-0.0207977294921875,
-0.0037746429443359375,
0.01922607421875,
0.050048828125,
0.0233612060546875,
-0.0008587837219238281,
0.0217742919921875,
0.06976318359375,
0.034454345703125,
-0.018585205078125,
0.07757568359375,
-0.0154266357421875,
0.041748046875,
0.01739501953125,
0.038604736328125,
0.02984619140625,
0.0294036865234375,
-0.039703369140625,
0.02130126953125,
0.059906005859375,
-0.00536346435546875,
0.01102447509765625,
0.01971435546875,
-0.0290069580078125,
-0.01474761962890625,
-0.01290130615234375,
-0.05224609375,
0.0187835693359375,
0.0077362060546875,
-0.0125274658203125,
-0.01251220703125,
-0.002109527587890625,
0.01837158203125,
0.0208282470703125,
-0.0187835693359375,
0.03924560546875,
0.00830841064453125,
-0.0316162109375,
0.036529541015625,
0.00197601318359375,
0.08038330078125,
-0.030242919921875,
0.0133819580078125,
-0.0276641845703125,
0.025604248046875,
-0.021209716796875,
-0.0787353515625,
0.0257568359375,
-0.00662994384765625,
0.004802703857421875,
-0.015899658203125,
0.049041748046875,
-0.026458740234375,
-0.0249176025390625,
0.0304107666015625,
0.02825927734375,
0.03704833984375,
0.02276611328125,
-0.08331298828125,
0.02093505859375,
0.005931854248046875,
-0.047027587890625,
0.03424072265625,
0.038177490234375,
0.026275634765625,
0.056243896484375,
0.0242767333984375,
0.0210723876953125,
0.01070404052734375,
-0.023956298828125,
0.056915283203125,
-0.047210693359375,
-0.033599853515625,
-0.062042236328125,
0.038177490234375,
-0.02569580078125,
-0.038360595703125,
0.05633544921875,
0.043487548828125,
0.029083251953125,
0.0030364990234375,
0.049560546875,
-0.0401611328125,
0.03692626953125,
-0.020599365234375,
0.057647705078125,
-0.052276611328125,
-0.017425537109375,
-0.0163421630859375,
-0.0460205078125,
-0.03240966796875,
0.06402587890625,
-0.009765625,
0.0206451416015625,
0.023956298828125,
0.05364990234375,
0.00275421142578125,
-0.0099029541015625,
-0.00023543834686279297,
0.01373291015625,
-0.00890350341796875,
0.065185546875,
0.0364990234375,
-0.05572509765625,
0.007198333740234375,
-0.03436279296875,
-0.0218353271484375,
-0.0274810791015625,
-0.05499267578125,
-0.08734130859375,
-0.050933837890625,
-0.03887939453125,
-0.05328369140625,
-0.0166778564453125,
0.09014892578125,
0.061309814453125,
-0.04656982421875,
-0.01067352294921875,
0.01226043701171875,
0.004894256591796875,
-0.01153564453125,
-0.0162506103515625,
0.04083251953125,
0.0066070556640625,
-0.07080078125,
-0.03192138671875,
0.010833740234375,
0.042449951171875,
0.0287322998046875,
-0.036712646484375,
-0.0164642333984375,
-0.005126953125,
0.0247039794921875,
0.0625,
-0.057647705078125,
-0.0209503173828125,
-0.00021827220916748047,
-0.03643798828125,
0.01381683349609375,
0.0227508544921875,
-0.031646728515625,
-0.006561279296875,
0.034332275390625,
0.0268707275390625,
0.0565185546875,
0.0038471221923828125,
0.01221466064453125,
-0.035125732421875,
0.0433349609375,
-0.0017118453979492188,
0.02545166015625,
0.0182952880859375,
-0.0201416015625,
0.054931640625,
0.037628173828125,
-0.0301361083984375,
-0.07501220703125,
-0.0125732421875,
-0.09857177734375,
-0.005901336669921875,
0.0546875,
-0.00611114501953125,
-0.0325927734375,
0.0316162109375,
-0.031494140625,
0.039215087890625,
-0.01617431640625,
0.0210723876953125,
0.016815185546875,
-0.0245208740234375,
-0.0255126953125,
-0.043792724609375,
0.044677734375,
0.0290679931640625,
-0.04986572265625,
-0.029693603515625,
0.0007586479187011719,
0.0279388427734375,
0.0132598876953125,
0.05670166015625,
-0.031585693359375,
0.0107269287109375,
-0.0099945068359375,
0.02130126953125,
-0.005199432373046875,
0.01119232177734375,
-0.024688720703125,
-0.006801605224609375,
-0.0144195556640625,
-0.04730224609375
]
] |
databricks/dolly-v2-12b | 2023-06-30T18:33:03.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"en",
"dataset:databricks/databricks-dolly-15k",
"license:mit",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | databricks | null | null | databricks/dolly-v2-12b | 1,886 | 12,478 | transformers | 2023-04-11T16:10:54 | ---
license: mit
language:
- en
library_name: transformers
inference: false
datasets:
- databricks/databricks-dolly-15k
---
# dolly-v2-12b Model Card
## Summary
Databricks' `dolly-v2-12b` is an instruction-following large language model trained on the Databricks machine learning platform
and licensed for commercial use. Based on `pythia-12b`, Dolly is trained on ~15k instruction/response fine-tuning records
[`databricks-dolly-15k`](https://github.com/databrickslabs/dolly/tree/master/data) generated
by Databricks employees in capability domains from the InstructGPT paper, including brainstorming, classification, closed QA, generation,
information extraction, open QA and summarization. `dolly-v2-12b` is not a state-of-the-art model, but does exhibit surprisingly
high quality instruction following behavior not characteristic of the foundation model on which it is based.
Dolly v2 is also available in these smaller model sizes:
* [dolly-v2-7b](https://huggingface.co/databricks/dolly-v2-7b), a 6.9 billion parameter model based on `pythia-6.9b`
* [dolly-v2-3b](https://huggingface.co/databricks/dolly-v2-3b), a 2.8 billion parameter model based on `pythia-2.8b`
Please refer to the [dolly GitHub repo](https://github.com/databrickslabs/dolly#getting-started-with-response-generation) for tips on
running inference for various GPU configurations.
**Owner**: Databricks, Inc.
## Model Overview
`dolly-v2-12b` is a 12 billion parameter causal language model created by [Databricks](https://databricks.com/) that is derived from
[EleutherAI's](https://www.eleuther.ai/) [Pythia-12b](https://huggingface.co/EleutherAI/pythia-12b) and fine-tuned
on a [~15K record instruction corpus](https://github.com/databrickslabs/dolly/tree/master/data) generated by Databricks employees and released under a permissive license (CC-BY-SA).
## Usage
To use the model with the `transformers` library on a machine with GPUs, first make sure you have the `transformers` and `accelerate` libraries installed.
In a Databricks notebook you could run:
```python
%pip install "accelerate>=0.16.0,<1" "transformers[torch]>=4.28.1,<5" "torch>=1.13.1,<2"
```
The instruction following pipeline can be loaded using the `pipeline` function as shown below. This loads a custom `InstructionTextGenerationPipeline`
found in the model repo [here](https://huggingface.co/databricks/dolly-v2-3b/blob/main/instruct_pipeline.py), which is why `trust_remote_code=True` is required.
Including `torch_dtype=torch.bfloat16` is generally recommended when this dtype is supported, since it reduces memory usage and does not appear to affect output quality.
It is also fine to omit it if there is sufficient memory.
```python
import torch
from transformers import pipeline
generate_text = pipeline(model="databricks/dolly-v2-12b", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
```
You can then use the pipeline to answer instructions:
```python
res = generate_text("Explain to me the difference between nuclear fission and fusion.")
print(res[0]["generated_text"])
```
Alternatively, if you prefer to not use `trust_remote_code=True` you can download [instruct_pipeline.py](https://huggingface.co/databricks/dolly-v2-3b/blob/main/instruct_pipeline.py),
store it alongside your notebook, and construct the pipeline yourself from the loaded model and tokenizer:
```python
import torch
from instruct_pipeline import InstructionTextGenerationPipeline
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("databricks/dolly-v2-12b", padding_side="left")
model = AutoModelForCausalLM.from_pretrained("databricks/dolly-v2-12b", device_map="auto", torch_dtype=torch.bfloat16)
generate_text = InstructionTextGenerationPipeline(model=model, tokenizer=tokenizer)
```
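The manually constructed pipeline is then used the same way as the `trust_remote_code` version above:
```python
res = generate_text("Explain to me the difference between nuclear fission and fusion.")
print(res[0]["generated_text"])
```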
### LangChain Usage
To use the pipeline with LangChain, you must set `return_full_text=True`, as LangChain expects the full text to be returned
and the default for the pipeline is to only return the new text.
```python
import torch
from transformers import pipeline
generate_text = pipeline(model="databricks/dolly-v2-12b", torch_dtype=torch.bfloat16,
trust_remote_code=True, device_map="auto", return_full_text=True)
```
You can create a prompt that either has only an instruction or has an instruction with context:
```python
from langchain import PromptTemplate, LLMChain
from langchain.llms import HuggingFacePipeline
# template for an instruction with no input
prompt = PromptTemplate(
input_variables=["instruction"],
template="{instruction}")
# template for an instruction with input
prompt_with_context = PromptTemplate(
input_variables=["instruction", "context"],
template="{instruction}\n\nInput:\n{context}")
hf_pipeline = HuggingFacePipeline(pipeline=generate_text)
llm_chain = LLMChain(llm=hf_pipeline, prompt=prompt)
llm_context_chain = LLMChain(llm=hf_pipeline, prompt=prompt_with_context)
```
Example predicting using a simple instruction:
```python
print(llm_chain.predict(instruction="Explain to me the difference between nuclear fission and fusion.").lstrip())
```
Example predicting using an instruction with context:
```python
context = """George Washington (February 22, 1732[b] - December 14, 1799) was an American military officer, statesman,
and Founding Father who served as the first president of the United States from 1789 to 1797."""
print(llm_context_chain.predict(instruction="When was George Washington president?", context=context).lstrip())
```
## Known Limitations
### Performance Limitations
**`dolly-v2-12b` is not a state-of-the-art generative language model** and, though quantitative benchmarking is ongoing, is not designed to perform
competitively with more modern model architectures or models subject to larger pretraining corpora.
The Dolly model family is under active development, and so any list of shortcomings is unlikely to be exhaustive, but we include known limitations and misfires here as a means to document and share our preliminary findings with the community.
In particular, `dolly-v2-12b` struggles with syntactically complex prompts, programming problems, mathematical operations, dates and times,
open-ended question answering, enumerating lists of a specific length, stylistic mimicry, and humor, among other things, and it is prone to factual errors and hallucination.
Moreover, we find that `dolly-v2-12b` does not have some capabilities, such as well-formatted letter writing, present in the original model.
### Dataset Limitations
Like all language models, `dolly-v2-12b` reflects the content and limitations of its training corpora.
- **The Pile**: Pythia's pre-training corpus contains content mostly collected from the public internet, and like most web-scale datasets,
it contains content many users would find objectionable. As such, the model is likely to reflect these shortcomings, potentially overtly
in the case it is explicitly asked to produce objectionable content, and sometimes subtly, as in the case of biased or harmful implicit
associations.
- **`databricks-dolly-15k`**: The training data on which `dolly-v2-12b` is instruction-tuned represents natural language instructions generated
by Databricks employees during a period spanning March and April 2023 and includes passages from Wikipedia as reference passages
for instruction categories like closed QA and summarization. To our knowledge it does not contain obscenity, intellectual property or
personally identifying information about non-public figures, but it may contain typos and factual errors.
The dataset may also reflect biases found in Wikipedia. Finally, the dataset likely reflects
the interests and semantic choices of Databricks employees, a demographic which is not representative of the global population at large.
Databricks is committed to ongoing research and development efforts to develop helpful, honest and harmless AI technologies that
maximize the potential of all individuals and organizations.
### Benchmark Metrics
Below you'll find various models' benchmark performance on the [EleutherAI LLM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness);
model results are sorted by geometric mean to produce an intelligible ordering. As outlined above, these results demonstrate that `dolly-v2-12b` is not state of the art,
and in fact underperforms `dolly-v1-6b` in some evaluation benchmarks. We believe this owes to the composition and size of the underlying fine-tuning datasets,
but a robust statement as to the sources of these variations requires further study.
| model | openbookqa | arc_easy | winogrande | hellaswag | arc_challenge | piqa | boolq | gmean |
| --------------------------------- | ------------ | ---------- | ------------ | ----------- | --------------- | -------- | -------- | ---------|
| EleutherAI/pythia-2.8b | 0.348 | 0.585859 | 0.589582 | 0.591217 | 0.323379 | 0.73395 | 0.638226 | 0.523431 |
| EleutherAI/pythia-6.9b | 0.368 | 0.604798 | 0.608524 | 0.631548 | 0.343857 | 0.761153 | 0.6263 | 0.543567 |
| databricks/dolly-v2-3b | 0.384 | 0.611532 | 0.589582 | 0.650767 | 0.370307 | 0.742655 | 0.575535 | 0.544886 |
| EleutherAI/pythia-12b | 0.364 | 0.627104 | 0.636148 | 0.668094 | 0.346416 | 0.760065 | 0.673394 | 0.559676 |
| EleutherAI/gpt-j-6B | 0.382 | 0.621633 | 0.651144 | 0.662617 | 0.363481 | 0.761153 | 0.655963 | 0.565936 |
| databricks/dolly-v2-12b | 0.408 | 0.63931 | 0.616417 | 0.707927 | 0.388225 | 0.757889 | 0.568196 | 0.56781 |
| databricks/dolly-v2-7b | 0.392 | 0.633838 | 0.607735 | 0.686517 | 0.406997 | 0.750816 | 0.644037 | 0.573487 |
| databricks/dolly-v1-6b | 0.41 | 0.62963 | 0.643252 | 0.676758 | 0.384812 | 0.773667 | 0.687768 | 0.583431 |
| EleutherAI/gpt-neox-20b | 0.402 | 0.683923 | 0.656669 | 0.7142 | 0.408703 | 0.784004 | 0.695413 | 0.602236 |
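The `gmean` column is the geometric mean of the seven per-task scores; a quick sketch reproducing it for `dolly-v2-12b`:
```python
import math

# per-task scores for databricks/dolly-v2-12b from the table above
scores = [0.408, 0.63931, 0.616417, 0.707927, 0.388225, 0.757889, 0.568196]
gmean = math.exp(sum(math.log(s) for s in scores) / len(scores))
print(gmean)  # ~0.5678, matching the gmean reported above
```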
# Citation
```
@online{DatabricksBlog2023DollyV2,
author = {Mike Conover and Matt Hayes and Ankit Mathur and Jianwei Xie and Jun Wan and Sam Shah and Ali Ghodsi and Patrick Wendell and Matei Zaharia and Reynold Xin},
title = {Free Dolly: Introducing the World's First Truly Open Instruction-Tuned LLM},
year = {2023},
url = {https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm},
urldate = {2023-06-30}
}
```
# Happy Hacking! | 10,746 | [
[
-0.0020427703857421875,
-0.076416015625,
0.01125335693359375,
0.0271759033203125,
-0.00945281982421875,
-0.004665374755859375,
-0.0008902549743652344,
-0.00429534912109375,
0.0044097900390625,
0.033782958984375,
-0.04071044921875,
-0.037261962890625,
-0.0516357421875,
0.005611419677734375,
-0.045013427734375,
0.08544921875,
-0.00424957275390625,
-0.01070404052734375,
-0.037994384765625,
0.01654052734375,
-0.02783203125,
-0.0228424072265625,
-0.0208587646484375,
-0.0121612548828125,
0.021820068359375,
0.0209197998046875,
0.053802490234375,
0.060302734375,
0.0277252197265625,
0.026763916015625,
-0.00691986083984375,
-0.0031871795654296875,
-0.037200927734375,
0.002475738525390625,
0.0020771026611328125,
-0.028076171875,
-0.03387451171875,
0.00408172607421875,
0.04498291015625,
0.03656005859375,
0.002719879150390625,
0.0247802734375,
-0.0021610260009765625,
0.053924560546875,
-0.03997802734375,
0.0367431640625,
-0.034149169921875,
-0.00817108154296875,
-0.01074981689453125,
0.0004169940948486328,
-0.045196533203125,
-0.034515380859375,
-0.001575469970703125,
-0.04412841796875,
0.0213165283203125,
0.006526947021484375,
0.08001708984375,
0.01605224609375,
-0.02618408203125,
-0.0157012939453125,
-0.0469970703125,
0.07574462890625,
-0.037841796875,
0.0014743804931640625,
0.03338623046875,
0.017669677734375,
-0.0269317626953125,
-0.06402587890625,
-0.049835205078125,
-0.01552581787109375,
-0.037628173828125,
0.004482269287109375,
-0.0210418701171875,
-0.0013837814331054688,
0.03692626953125,
0.038177490234375,
-0.0609130859375,
-0.00864410400390625,
-0.06439208984375,
-0.025848388671875,
0.05419921875,
0.026763916015625,
0.00722503662109375,
-0.048614501953125,
-0.0220489501953125,
-0.0285491943359375,
-0.040496826171875,
0.0025634765625,
0.036285400390625,
0.024200439453125,
-0.040313720703125,
0.056732177734375,
-0.02734375,
0.061065673828125,
-0.008026123046875,
-0.01169586181640625,
0.025421142578125,
-0.00769805908203125,
-0.034088134765625,
-0.01229095458984375,
0.0687255859375,
0.0169219970703125,
0.0099334716796875,
0.0007944107055664062,
-0.002777099609375,
0.01983642578125,
0.015655517578125,
-0.06353759765625,
-0.037628173828125,
0.041015625,
-0.035888671875,
-0.038970947265625,
-0.01227569580078125,
-0.07171630859375,
-0.0408935546875,
-0.0169219970703125,
0.039459228515625,
-0.021453857421875,
-0.0215301513671875,
-0.00954437255859375,
-0.006778717041015625,
0.0219573974609375,
0.01235198974609375,
-0.08319091796875,
0.0175933837890625,
0.039459228515625,
0.05328369140625,
-0.0024089813232421875,
-0.01409912109375,
-0.0550537109375,
-0.01824951171875,
-0.01288604736328125,
0.036865234375,
-0.040740966796875,
-0.02630615234375,
0.00921630859375,
0.016845703125,
-0.01129913330078125,
-0.04217529296875,
0.022491455078125,
-0.028289794921875,
0.04107666015625,
-0.002399444580078125,
-0.04107666015625,
-0.0159454345703125,
0.004657745361328125,
-0.0396728515625,
0.08441162109375,
0.04022216796875,
-0.05615234375,
0.01160430908203125,
-0.010711669921875,
-0.0396728515625,
-0.012481689453125,
-0.00931549072265625,
-0.046356201171875,
-0.00490570068359375,
0.0201873779296875,
0.0513916015625,
-0.038330078125,
0.02520751953125,
0.0017547607421875,
-0.01554107666015625,
0.00499725341796875,
-0.0313720703125,
0.081787109375,
0.0099334716796875,
-0.028839111328125,
0.015472412109375,
-0.0777587890625,
-0.0028247833251953125,
0.0103607177734375,
-0.031005859375,
0.0236968994140625,
-0.0257568359375,
0.01947021484375,
0.002925872802734375,
0.0190887451171875,
-0.0305023193359375,
0.0225372314453125,
-0.020111083984375,
0.00829315185546875,
0.053741455078125,
-0.030517578125,
0.0283660888671875,
-0.037750244140625,
0.04302978515625,
-0.001010894775390625,
0.005229949951171875,
-0.0280914306640625,
-0.059967041015625,
-0.0753173828125,
-0.01453399658203125,
0.020599365234375,
0.0506591796875,
-0.048797607421875,
0.0290985107421875,
-0.012847900390625,
-0.04193115234375,
-0.0440673828125,
0.00457763671875,
0.0401611328125,
0.048797607421875,
0.055145263671875,
-0.0128326416015625,
-0.04931640625,
-0.061553955078125,
0.0017576217651367188,
-0.0303802490234375,
-0.0180511474609375,
0.01407623291015625,
0.04046630859375,
-0.00616455078125,
0.0626220703125,
-0.0396728515625,
-0.0085906982421875,
-0.0399169921875,
0.012908935546875,
0.036041259765625,
0.049560546875,
0.0165863037109375,
-0.041748046875,
-0.038238525390625,
0.004817962646484375,
-0.06488037109375,
0.00536346435546875,
-0.0175018310546875,
-0.0087890625,
0.04144287109375,
0.0178680419921875,
-0.06494140625,
0.0533447265625,
0.049530029296875,
-0.03167724609375,
0.055084228515625,
-0.01317596435546875,
-0.0004436969757080078,
-0.087890625,
0.013702392578125,
-0.00966644287109375,
0.0015153884887695312,
-0.03887939453125,
-0.0101318359375,
0.0030975341796875,
0.0031261444091796875,
-0.0245819091796875,
0.061065673828125,
-0.01477813720703125,
0.0016613006591796875,
-0.005283355712890625,
0.005313873291015625,
0.01016998291015625,
0.039398193359375,
0.0029087066650390625,
0.02978515625,
0.058837890625,
-0.053253173828125,
0.06451416015625,
0.0310821533203125,
-0.035675048828125,
0.026641845703125,
-0.046966552734375,
0.007152557373046875,
-0.0145721435546875,
0.0205230712890625,
-0.07342529296875,
-0.0284423828125,
0.02557373046875,
-0.0318603515625,
0.043060302734375,
-0.0191497802734375,
-0.0260009765625,
-0.0374755859375,
-0.00730133056640625,
0.00832366943359375,
0.069091796875,
-0.041229248046875,
0.053863525390625,
0.0101165771484375,
-0.005771636962890625,
-0.055450439453125,
-0.046966552734375,
-0.017242431640625,
-0.0196075439453125,
-0.0687255859375,
0.02874755859375,
0.0127716064453125,
-0.0151214599609375,
-0.0123748779296875,
0.00109100341796875,
0.0048675537109375,
-0.0216827392578125,
0.0093994140625,
0.03662109375,
-0.01016998291015625,
0.0078582763671875,
0.0033893585205078125,
-0.032470703125,
0.00795745849609375,
-0.0157012939453125,
0.046478271484375,
-0.002895355224609375,
0.00815582275390625,
-0.045196533203125,
0.0017232894897460938,
0.03912353515625,
0.003086090087890625,
0.070068359375,
0.06927490234375,
-0.01369476318359375,
0.005340576171875,
-0.0438232421875,
-0.027191162109375,
-0.038360595703125,
0.038177490234375,
-0.0140380859375,
-0.02685546875,
0.030364990234375,
0.00507354736328125,
0.007427215576171875,
0.0386962890625,
0.0450439453125,
-0.0014390945434570312,
0.043975830078125,
0.029693603515625,
-0.0021953582763671875,
0.0145111083984375,
-0.05328369140625,
0.006076812744140625,
-0.061798095703125,
-0.03778076171875,
-0.0406494140625,
-0.0268096923828125,
-0.06207275390625,
-0.045166015625,
0.0077056884765625,
0.0035572052001953125,
-0.0283050537109375,
0.043731689453125,
-0.036865234375,
0.01221466064453125,
0.04937744140625,
-0.0013284683227539062,
0.0030651092529296875,
0.004886627197265625,
-0.004787445068359375,
0.00940704345703125,
-0.0546875,
-0.045806884765625,
0.08795166015625,
0.0166015625,
0.0699462890625,
-0.0044097900390625,
0.03009033203125,
-0.0027484893798828125,
0.013519287109375,
-0.041046142578125,
0.042724609375,
-0.00704193115234375,
-0.06103515625,
-0.0149078369140625,
-0.04327392578125,
-0.07794189453125,
0.0028667449951171875,
-0.01495361328125,
-0.0810546875,
0.0012216567993164062,
0.0175323486328125,
-0.017242431640625,
0.02227783203125,
-0.060333251953125,
0.0858154296875,
0.00516510009765625,
-0.0462646484375,
-0.00977325439453125,
-0.057098388671875,
0.0199127197265625,
0.0233306884765625,
0.007389068603515625,
0.0007038116455078125,
0.02825927734375,
0.056732177734375,
-0.0318603515625,
0.0540771484375,
-0.01107025146484375,
0.01175689697265625,
0.031005859375,
0.005229949951171875,
0.0531005859375,
0.01143646240234375,
-0.0168609619140625,
-0.00026607513427734375,
-0.00832366943359375,
-0.045623779296875,
-0.041107177734375,
0.053375244140625,
-0.060638427734375,
-0.05126953125,
-0.0308685302734375,
-0.047576904296875,
0.00786590576171875,
-0.005767822265625,
0.031494140625,
0.053314208984375,
0.00011163949966430664,
0.0234832763671875,
0.0386962890625,
-0.04986572265625,
0.038482666015625,
0.00817108154296875,
-0.034515380859375,
-0.0155487060546875,
0.076416015625,
-0.00975799560546875,
0.02496337890625,
0.045501708984375,
0.0292816162109375,
-0.02630615234375,
-0.0149383544921875,
-0.0596923828125,
0.01250457763671875,
-0.05242919921875,
-0.0179901123046875,
-0.06591796875,
-0.018890380859375,
-0.0284271240234375,
-0.014190673828125,
-0.034942626953125,
-0.057037353515625,
-0.0338134765625,
-0.0055084228515625,
0.055999755859375,
0.054962158203125,
0.0030536651611328125,
0.0240478515625,
-0.044525146484375,
0.0305938720703125,
0.038238525390625,
0.0068817138671875,
-0.006092071533203125,
-0.056671142578125,
-0.0178680419921875,
-0.0026607513427734375,
-0.04718017578125,
-0.045684814453125,
0.0293426513671875,
-0.00650787353515625,
0.0196533203125,
0.01337432861328125,
0.00798797607421875,
0.037322998046875,
-0.014923095703125,
0.06915283203125,
0.004974365234375,
-0.061126708984375,
0.0416259765625,
-0.02386474609375,
0.0305938720703125,
0.0094451904296875,
0.027984619140625,
-0.0266876220703125,
-0.02593994140625,
-0.045196533203125,
-0.06842041015625,
0.06976318359375,
0.047119140625,
0.0248870849609375,
0.00319671630859375,
0.006931304931640625,
0.01045989990234375,
0.014801025390625,
-0.058197021484375,
-0.044677734375,
-0.01934814453125,
-0.0152130126953125,
0.01506805419921875,
-0.00862884521484375,
-0.004726409912109375,
-0.032562255859375,
0.0687255859375,
0.0113372802734375,
0.037139892578125,
-0.00994873046875,
-0.00855255126953125,
-0.007568359375,
0.005603790283203125,
0.03167724609375,
0.046630859375,
-0.020538330078125,
-0.006763458251953125,
0.0142822265625,
-0.054229736328125,
0.01009368896484375,
0.0302581787109375,
-0.01177978515625,
-0.00025343894958496094,
0.03375244140625,
0.07196044921875,
-0.0120391845703125,
-0.0244598388671875,
0.0247802734375,
-0.012786865234375,
0.00682830810546875,
-0.01485443115234375,
0.00841522216796875,
0.00870513916015625,
0.01190185546875,
0.0222320556640625,
0.0025463104248046875,
-0.0181427001953125,
-0.03997802734375,
0.007541656494140625,
0.02239990234375,
-0.018829345703125,
-0.0153961181640625,
0.053131103515625,
0.014678955078125,
-0.022125244140625,
0.07952880859375,
-0.0227203369140625,
-0.0191192626953125,
0.061737060546875,
0.03985595703125,
0.060150146484375,
-0.01806640625,
0.03839111328125,
0.056732177734375,
0.026153564453125,
0.011993408203125,
0.01369476318359375,
0.0214080810546875,
-0.03082275390625,
-0.0239410400390625,
-0.0684814453125,
-0.01214599609375,
0.021881103515625,
-0.041015625,
0.05694580078125,
-0.036712646484375,
0.0071868896484375,
-0.01143646240234375,
0.006488800048828125,
-0.06512451171875,
0.032562255859375,
-0.005199432373046875,
0.043426513671875,
-0.05084228515625,
0.057830810546875,
0.03271484375,
-0.02374267578125,
-0.055084228515625,
-0.01107025146484375,
0.00952911376953125,
-0.050994873046875,
0.04345703125,
0.03131103515625,
0.022979736328125,
-0.006107330322265625,
-0.0162353515625,
-0.0667724609375,
0.08782958984375,
0.022216796875,
-0.0272064208984375,
0.0123291015625,
0.007534027099609375,
0.02630615234375,
-0.0268707275390625,
0.049407958984375,
0.0550537109375,
0.0352783203125,
0.01641845703125,
-0.06689453125,
0.017364501953125,
-0.0196990966796875,
-0.006591796875,
0.005893707275390625,
-0.04534912109375,
0.077880859375,
-0.0305938720703125,
-0.016387939453125,
0.0267181396484375,
0.0574951171875,
0.0190887451171875,
0.0197296142578125,
0.007495880126953125,
0.039825439453125,
0.061065673828125,
-0.0188140869140625,
0.105224609375,
-0.0180816650390625,
0.037933349609375,
0.061309814453125,
0.0164794921875,
0.043365478515625,
0.0204620361328125,
-0.038055419921875,
0.058349609375,
0.035369873046875,
-0.0003132820129394531,
0.03448486328125,
0.0304107666015625,
-0.01568603515625,
0.00504302978515625,
0.007518768310546875,
-0.046173095703125,
0.031982421875,
0.03173828125,
-0.0386962890625,
0.00501251220703125,
-0.0190887451171875,
0.017303466796875,
-0.01214599609375,
0.0032749176025390625,
0.031219482421875,
-0.0008282661437988281,
-0.043243408203125,
0.06964111328125,
-0.0015592575073242188,
0.0335693359375,
-0.042877197265625,
-0.00449371337890625,
-0.0272979736328125,
0.0129852294921875,
-0.02716064453125,
-0.042388916015625,
0.024688720703125,
-0.0002422332763671875,
-0.01074981689453125,
-0.00833892822265625,
0.0308837890625,
-0.029510498046875,
-0.0634765625,
0.00872802734375,
0.0152740478515625,
0.0208892822265625,
0.0162811279296875,
-0.037933349609375,
0.0292816162109375,
0.0034427642822265625,
-0.040740966796875,
0.0254669189453125,
0.01537322998046875,
0.02044677734375,
0.048126220703125,
0.03094482421875,
-0.01849365234375,
0.0008358955383300781,
-0.015594482421875,
0.07159423828125,
-0.03961181640625,
-0.0075836181640625,
-0.059051513671875,
0.07745361328125,
-0.00908660888671875,
-0.03839111328125,
0.048431396484375,
0.0474853515625,
0.06634521484375,
-0.022186279296875,
0.06085205078125,
-0.037139892578125,
0.0184478759765625,
-0.0504150390625,
0.04046630859375,
-0.03033447265625,
0.026458740234375,
-0.03546142578125,
-0.09197998046875,
-0.0192413330078125,
0.07122802734375,
-0.0287322998046875,
0.021942138671875,
0.07196044921875,
0.08746337890625,
-0.00592041015625,
0.01038360595703125,
0.0189971923828125,
0.03277587890625,
0.018341064453125,
0.027801513671875,
0.046875,
-0.056060791015625,
0.05426025390625,
-0.04681396484375,
-0.0281829833984375,
-0.0092315673828125,
-0.0626220703125,
-0.0802001953125,
-0.049652099609375,
-0.037078857421875,
-0.051483154296875,
-0.004772186279296875,
0.06610107421875,
0.04595947265625,
-0.0657958984375,
-0.029327392578125,
-0.018096923828125,
0.028900146484375,
-0.0112457275390625,
-0.021759033203125,
0.04718017578125,
-0.01806640625,
-0.06695556640625,
0.0208282470703125,
0.011871337890625,
0.000308990478515625,
-0.0265350341796875,
-0.0089111328125,
-0.012908935546875,
-0.007381439208984375,
0.034698486328125,
0.00858306884765625,
-0.046173095703125,
-0.005535125732421875,
0.003696441650390625,
0.0027713775634765625,
-0.0011949539184570312,
0.041015625,
-0.07794189453125,
0.05120849609375,
0.046722412109375,
0.02947998046875,
0.058197021484375,
-0.0108184814453125,
0.0477294921875,
-0.0589599609375,
0.025665283203125,
0.01093292236328125,
0.0170745849609375,
0.046722412109375,
-0.0328369140625,
0.029510498046875,
0.020721435546875,
-0.047882080078125,
-0.04595947265625,
0.0223388671875,
-0.057708740234375,
-0.006526947021484375,
0.10009765625,
-0.01332855224609375,
-0.02178955078125,
-0.01499176025390625,
-0.0156402587890625,
0.0201568603515625,
-0.026763916015625,
0.0784912109375,
0.035736083984375,
-0.004669189453125,
-0.00482177734375,
-0.0447998046875,
0.04345703125,
0.0268096923828125,
-0.0545654296875,
0.0140380859375,
0.01849365234375,
-0.00560760498046875,
0.021881103515625,
0.032379150390625,
-0.0035247802734375,
0.0205230712890625,
0.0231781005859375,
-0.007503509521484375,
0.00702667236328125,
-0.03253173828125,
-0.004863739013671875,
-0.00679779052734375,
-0.03045654296875,
-0.010955810546875
]
] |
timm/tf_efficientnet_lite0.in1k | 2023-04-27T21:38:11.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1905.11946",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/tf_efficientnet_lite0.in1k | 0 | 12,460 | timm | 2022-12-13T00:13:28 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for tf_efficientnet_lite0.in1k
An EfficientNet-Lite image classification model. Trained on ImageNet-1k in TensorFlow by the paper authors, and ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 4.7
- GMACs: 0.4
- Activations (M): 6.7
- Image size: 224 x 224
- **Papers:**
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks: https://arxiv.org/abs/1905.11946
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import torch
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('tf_efficientnet_lite0.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnet_lite0.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 24, 56, 56])
# torch.Size([1, 40, 28, 28])
# torch.Size([1, 112, 14, 14])
# torch.Size([1, 320, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnet_lite0.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1280, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{tan2019efficientnet,
title={Efficientnet: Rethinking model scaling for convolutional neural networks},
author={Tan, Mingxing and Le, Quoc},
booktitle={International conference on machine learning},
pages={6105--6114},
year={2019},
organization={PMLR}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,087 | [
[
-0.0269317626953125,
-0.03961181640625,
-0.003078460693359375,
0.00653076171875,
-0.0205841064453125,
-0.033416748046875,
-0.0259552001953125,
-0.0261077880859375,
0.01528167724609375,
0.0265045166015625,
-0.0264892578125,
-0.0484619140625,
-0.05426025390625,
-0.01320648193359375,
-0.0151519775390625,
0.06658935546875,
-0.00690460205078125,
-0.0006804466247558594,
-0.009552001953125,
-0.044769287109375,
-0.004337310791015625,
-0.01090240478515625,
-0.0692138671875,
-0.034027099609375,
0.02886962890625,
0.02313232421875,
0.044769287109375,
0.050445556640625,
0.05059814453125,
0.037017822265625,
-0.006275177001953125,
0.0033130645751953125,
-0.023773193359375,
-0.0099029541015625,
0.032684326171875,
-0.044586181640625,
-0.03204345703125,
0.01454925537109375,
0.053802490234375,
0.037628173828125,
-0.001293182373046875,
0.03594970703125,
0.00846099853515625,
0.040740966796875,
-0.02349853515625,
0.0102386474609375,
-0.0288238525390625,
0.01123809814453125,
-0.0008044242858886719,
0.006809234619140625,
-0.0233154296875,
-0.0306243896484375,
0.0176849365234375,
-0.04522705078125,
0.039154052734375,
-0.006439208984375,
0.093505859375,
0.022796630859375,
-0.00688934326171875,
-0.00506591796875,
-0.01580810546875,
0.054046630859375,
-0.06231689453125,
0.01165771484375,
0.01160430908203125,
0.0162200927734375,
-0.0020160675048828125,
-0.087158203125,
-0.0341796875,
-0.0173187255859375,
-0.01971435546875,
-0.0114898681640625,
-0.0221405029296875,
0.0099639892578125,
0.02685546875,
0.022735595703125,
-0.032928466796875,
0.00653839111328125,
-0.0404052734375,
-0.0128021240234375,
0.044036865234375,
0.002498626708984375,
0.02301025390625,
-0.01154327392578125,
-0.034698486328125,
-0.034576416015625,
-0.024749755859375,
0.0249786376953125,
0.019012451171875,
0.017333984375,
-0.03973388671875,
0.03192138671875,
0.0095367431640625,
0.04815673828125,
0.0029773712158203125,
-0.0258636474609375,
0.0450439453125,
0.0032749176025390625,
-0.03564453125,
-0.006702423095703125,
0.0838623046875,
0.032135009765625,
0.018524169921875,
0.00336456298828125,
-0.01258087158203125,
-0.0284423828125,
-0.0053863525390625,
-0.10076904296875,
-0.0265655517578125,
0.0262603759765625,
-0.0498046875,
-0.033843994140625,
0.0130615234375,
-0.041351318359375,
-0.00821685791015625,
0.0005269050598144531,
0.05511474609375,
-0.0304107666015625,
-0.033905029296875,
0.0014371871948242188,
-0.01480865478515625,
0.0118865966796875,
0.01448822021484375,
-0.042083740234375,
0.0092315673828125,
0.022064208984375,
0.08685302734375,
0.0091552734375,
-0.031463623046875,
-0.0185089111328125,
-0.030029296875,
-0.0212249755859375,
0.0262908935546875,
-0.0031452178955078125,
-0.0037937164306640625,
-0.022186279296875,
0.0246124267578125,
-0.01317596435546875,
-0.053863525390625,
0.02618408203125,
-0.016815185546875,
0.009033203125,
0.004100799560546875,
-0.0245513916015625,
-0.039520263671875,
0.0213775634765625,
-0.038726806640625,
0.08563232421875,
0.02569580078125,
-0.06695556640625,
0.021636962890625,
-0.044219970703125,
-0.00843048095703125,
-0.019073486328125,
0.0007271766662597656,
-0.08294677734375,
-0.0037078857421875,
0.01184844970703125,
0.0618896484375,
-0.018585205078125,
0.00940704345703125,
-0.045745849609375,
-0.018157958984375,
0.0215911865234375,
-0.00568389892578125,
0.083740234375,
0.0175018310546875,
-0.034942626953125,
0.02618408203125,
-0.0457763671875,
0.0185699462890625,
0.03631591796875,
-0.017303466796875,
-0.00008744001388549805,
-0.04541015625,
0.0162506103515625,
0.0189666748046875,
0.01052093505859375,
-0.042449951171875,
0.021392822265625,
-0.00988006591796875,
0.043487548828125,
0.04400634765625,
-0.01094818115234375,
0.02850341796875,
-0.025543212890625,
0.0204925537109375,
0.016632080078125,
0.0166168212890625,
-0.004741668701171875,
-0.03167724609375,
-0.0640869140625,
-0.03936767578125,
0.0248565673828125,
0.022613525390625,
-0.039581298828125,
0.030975341796875,
-0.0227203369140625,
-0.06280517578125,
-0.034698486328125,
0.0020923614501953125,
0.0288238525390625,
0.0498046875,
0.022186279296875,
-0.02197265625,
-0.0300445556640625,
-0.07147216796875,
-0.002002716064453125,
0.002407073974609375,
0.00030040740966796875,
0.0276641845703125,
0.0506591796875,
-0.002559661865234375,
0.043182373046875,
-0.0316162109375,
-0.020538330078125,
-0.0180511474609375,
0.004276275634765625,
0.03558349609375,
0.06494140625,
0.05963134765625,
-0.0494384765625,
-0.042449951171875,
-0.01207733154296875,
-0.06988525390625,
0.01165771484375,
-0.0038013458251953125,
-0.0118560791015625,
0.01325225830078125,
0.0168609619140625,
-0.0457763671875,
0.040863037109375,
0.018280029296875,
-0.03558349609375,
0.0297088623046875,
-0.0206298828125,
0.018218994140625,
-0.08441162109375,
0.01493072509765625,
0.0262298583984375,
-0.0156097412109375,
-0.041046142578125,
0.00970458984375,
0.007843017578125,
-0.00519561767578125,
-0.03619384765625,
0.050323486328125,
-0.039306640625,
-0.0117645263671875,
-0.00688934326171875,
-0.0232696533203125,
0.0014753341674804688,
0.0506591796875,
-0.01142120361328125,
0.0290374755859375,
0.0625,
-0.033721923828125,
0.037811279296875,
0.0204925537109375,
-0.021484375,
0.0235443115234375,
-0.057342529296875,
0.00879669189453125,
0.006702423095703125,
0.01458740234375,
-0.07586669921875,
-0.016845703125,
0.02099609375,
-0.043670654296875,
0.045623779296875,
-0.037567138671875,
-0.03594970703125,
-0.031829833984375,
-0.03314208984375,
0.03302001953125,
0.046051025390625,
-0.056671142578125,
0.037353515625,
0.0133514404296875,
0.0264129638671875,
-0.043212890625,
-0.06829833984375,
-0.0195159912109375,
-0.03155517578125,
-0.06585693359375,
0.02703857421875,
0.007030487060546875,
0.00482177734375,
0.01465606689453125,
-0.0005288124084472656,
-0.008392333984375,
0.0004439353942871094,
0.033294677734375,
0.022735595703125,
-0.025909423828125,
-0.0015878677368164062,
-0.0255584716796875,
0.0008563995361328125,
0.004840850830078125,
-0.025909423828125,
0.04022216796875,
-0.0223846435546875,
-0.0025959014892578125,
-0.06085205078125,
-0.006927490234375,
0.037261962890625,
-0.00030112266540527344,
0.06536865234375,
0.08489990234375,
-0.036102294921875,
-0.006832122802734375,
-0.03094482421875,
-0.024810791015625,
-0.0367431640625,
0.040985107421875,
-0.0255584716796875,
-0.038726806640625,
0.0640869140625,
0.0025615692138671875,
0.01140594482421875,
0.0556640625,
0.025909423828125,
-0.001926422119140625,
0.050445556640625,
0.04638671875,
0.020965576171875,
0.059295654296875,
-0.08392333984375,
-0.01538848876953125,
-0.058502197265625,
-0.0303192138671875,
-0.0290985107421875,
-0.053070068359375,
-0.05499267578125,
-0.0230255126953125,
0.038848876953125,
0.019378662109375,
-0.03875732421875,
0.030670166015625,
-0.06597900390625,
0.004852294921875,
0.04437255859375,
0.042999267578125,
-0.022125244140625,
0.0251922607421875,
-0.01409912109375,
0.004817962646484375,
-0.062103271484375,
-0.0159912109375,
0.0877685546875,
0.030242919921875,
0.04541015625,
-0.01013946533203125,
0.0489501953125,
-0.0192108154296875,
0.031280517578125,
-0.0498046875,
0.0418701171875,
-0.0112152099609375,
-0.033935546875,
-0.0167083740234375,
-0.043609619140625,
-0.07861328125,
0.0178680419921875,
-0.0186309814453125,
-0.053070068359375,
0.0147552490234375,
0.01367950439453125,
-0.0174102783203125,
0.06280517578125,
-0.07171630859375,
0.0787353515625,
-0.007354736328125,
-0.037506103515625,
0.0011234283447265625,
-0.044586181640625,
0.0201568603515625,
0.0240478515625,
-0.01806640625,
-0.005847930908203125,
-0.001224517822265625,
0.0859375,
-0.050811767578125,
0.058074951171875,
-0.04052734375,
0.032379150390625,
0.03851318359375,
-0.0060577392578125,
0.027496337890625,
-0.006992340087890625,
-0.01580810546875,
0.022979736328125,
0.0014276504516601562,
-0.040313720703125,
-0.04119873046875,
0.047271728515625,
-0.0755615234375,
-0.0246124267578125,
-0.018463134765625,
-0.0391845703125,
0.0186920166015625,
0.01195526123046875,
0.041839599609375,
0.05450439453125,
0.0155792236328125,
0.0283355712890625,
0.040679931640625,
-0.0228118896484375,
0.04296875,
-0.0094146728515625,
-0.012725830078125,
-0.037139892578125,
0.05841064453125,
0.025390625,
0.0160064697265625,
0.006824493408203125,
0.0208587646484375,
-0.020538330078125,
-0.0450439453125,
-0.0280609130859375,
0.020599365234375,
-0.055877685546875,
-0.03924560546875,
-0.050079345703125,
-0.02935791015625,
-0.02642822265625,
-0.00948333740234375,
-0.038848876953125,
-0.036651611328125,
-0.03253173828125,
0.0135040283203125,
0.054779052734375,
0.042205810546875,
-0.01024627685546875,
0.0435791015625,
-0.0340576171875,
0.004550933837890625,
0.00803375244140625,
0.034942626953125,
0.00569915771484375,
-0.06524658203125,
-0.02099609375,
-0.00937652587890625,
-0.031768798828125,
-0.0484619140625,
0.038116455078125,
0.0176849365234375,
0.035736083984375,
0.0303192138671875,
-0.0127410888671875,
0.0498046875,
0.0033626556396484375,
0.037994384765625,
0.0308837890625,
-0.03875732421875,
0.040313720703125,
-0.001956939697265625,
0.01366424560546875,
0.00801849365234375,
0.02288818359375,
-0.017852783203125,
0.00118255615234375,
-0.07794189453125,
-0.059295654296875,
0.06524658203125,
0.009002685546875,
-0.0008096694946289062,
0.031707763671875,
0.05633544921875,
0.0010528564453125,
0.002475738525390625,
-0.055999755859375,
-0.039581298828125,
-0.0267181396484375,
-0.0252227783203125,
-0.0034313201904296875,
-0.003570556640625,
-0.0033664703369140625,
-0.04815673828125,
0.048797607421875,
-0.0053863525390625,
0.055328369140625,
0.0281219482421875,
-0.007190704345703125,
-0.004486083984375,
-0.0276641845703125,
0.03399658203125,
0.0265350341796875,
-0.0262908935546875,
0.00982666015625,
0.00945281982421875,
-0.040863037109375,
0.01003265380859375,
0.0156707763671875,
-0.0025119781494140625,
0.00028443336486816406,
0.04315185546875,
0.07293701171875,
-0.0016431808471679688,
0.00803375244140625,
0.0333251953125,
-0.00492095947265625,
-0.03131103515625,
-0.0164947509765625,
0.0137176513671875,
0.002002716064453125,
0.033203125,
0.0246124267578125,
0.0313720703125,
-0.00884246826171875,
-0.015869140625,
0.016571044921875,
0.039947509765625,
-0.020263671875,
-0.025604248046875,
0.04754638671875,
-0.01041412353515625,
-0.0123291015625,
0.06427001953125,
-0.0085906982421875,
-0.036468505859375,
0.08837890625,
0.0284423828125,
0.07159423828125,
0.002956390380859375,
-0.0013275146484375,
0.07476806640625,
0.0194091796875,
-0.0032024383544921875,
0.00797271728515625,
0.006534576416015625,
-0.055633544921875,
0.006351470947265625,
-0.037506103515625,
0.00836944580078125,
0.021087646484375,
-0.03466796875,
0.022705078125,
-0.051910400390625,
-0.0308380126953125,
0.01306915283203125,
0.03192138671875,
-0.072265625,
0.01255035400390625,
-0.00807952880859375,
0.0682373046875,
-0.0548095703125,
0.058563232421875,
0.05950927734375,
-0.04119873046875,
-0.08758544921875,
-0.015838623046875,
-0.008392333984375,
-0.06427001953125,
0.045074462890625,
0.03668212890625,
0.01419830322265625,
0.0106353759765625,
-0.06536865234375,
-0.04949951171875,
0.11224365234375,
0.045623779296875,
-0.007244110107421875,
0.016998291015625,
-0.0083160400390625,
0.0169677734375,
-0.040985107421875,
0.045806884765625,
0.013885498046875,
0.0301361083984375,
0.020477294921875,
-0.0478515625,
0.0224151611328125,
-0.0238189697265625,
0.005107879638671875,
0.01210784912109375,
-0.06561279296875,
0.073486328125,
-0.041046142578125,
-0.00948333740234375,
0.0006270408630371094,
0.04998779296875,
0.01108551025390625,
0.00867462158203125,
0.04559326171875,
0.06585693359375,
0.041046142578125,
-0.0242919921875,
0.07061767578125,
0.0031757354736328125,
0.05511474609375,
0.04278564453125,
0.03839111328125,
0.037445068359375,
0.02777099609375,
-0.0161895751953125,
0.0226898193359375,
0.08038330078125,
-0.0283355712890625,
0.0197296142578125,
0.01425933837890625,
0.002765655517578125,
-0.00937652587890625,
0.005420684814453125,
-0.0285491943359375,
0.034423828125,
0.01214599609375,
-0.045379638671875,
-0.0208282470703125,
0.0010995864868164062,
0.0012578964233398438,
-0.03155517578125,
-0.0251617431640625,
0.032928466796875,
0.0005898475646972656,
-0.0290374755859375,
0.0694580078125,
0.002773284912109375,
0.069580078125,
-0.0275726318359375,
0.00356292724609375,
-0.019073486328125,
0.02154541015625,
-0.032623291015625,
-0.059722900390625,
0.0205230712890625,
-0.0206298828125,
-0.00012171268463134766,
0.001857757568359375,
0.049835205078125,
-0.0316162109375,
-0.039825439453125,
0.017791748046875,
0.0213623046875,
0.037384033203125,
0.0036563873291015625,
-0.092041015625,
0.0125274658203125,
0.005779266357421875,
-0.058502197265625,
0.0194854736328125,
0.032073974609375,
0.01143646240234375,
0.057861328125,
0.036407470703125,
-0.0076141357421875,
0.01448822021484375,
-0.0133209228515625,
0.060638427734375,
-0.031707763671875,
-0.0207672119140625,
-0.0616455078125,
0.049560546875,
-0.0073394775390625,
-0.0430908203125,
0.02923583984375,
0.041168212890625,
0.060821533203125,
0.00048828125,
0.0268707275390625,
-0.0246429443359375,
-0.00872802734375,
-0.0244903564453125,
0.0623779296875,
-0.06134033203125,
-0.0042572021484375,
-0.0059814453125,
-0.0521240234375,
-0.026214599609375,
0.051116943359375,
-0.0176544189453125,
0.037506103515625,
0.037506103515625,
0.07769775390625,
-0.0287628173828125,
-0.02569580078125,
0.0179290771484375,
0.015716552734375,
0.01324462890625,
0.033111572265625,
0.0211639404296875,
-0.06195068359375,
0.035797119140625,
-0.053863525390625,
-0.0149078369140625,
-0.0135955810546875,
-0.05279541015625,
-0.06640625,
-0.0633544921875,
-0.047393798828125,
-0.05181884765625,
-0.0216827392578125,
0.07281494140625,
0.0806884765625,
-0.048583984375,
-0.01190185546875,
-0.0005936622619628906,
0.015655517578125,
-0.01708984375,
-0.018341064453125,
0.056488037109375,
-0.0179290771484375,
-0.05633544921875,
-0.0321044921875,
-0.007587432861328125,
0.02191162109375,
-0.0008955001831054688,
-0.013153076171875,
-0.01175689697265625,
-0.028167724609375,
0.01369476318359375,
0.0153656005859375,
-0.0450439453125,
-0.01154327392578125,
-0.0217437744140625,
-0.015960693359375,
0.0301361083984375,
0.0362548828125,
-0.0367431640625,
0.0258026123046875,
0.035858154296875,
0.0298614501953125,
0.06365966796875,
-0.033782958984375,
-0.0002008676528930664,
-0.057861328125,
0.044464111328125,
-0.007190704345703125,
0.037994384765625,
0.035614013671875,
-0.0283355712890625,
0.049224853515625,
0.028167724609375,
-0.035369873046875,
-0.06475830078125,
-0.013702392578125,
-0.079345703125,
-0.00968170166015625,
0.068115234375,
-0.037261962890625,
-0.041168212890625,
0.043212890625,
0.005519866943359375,
0.05267333984375,
-0.0136260986328125,
0.0308685302734375,
0.0175018310546875,
-0.0091400146484375,
-0.04705810546875,
-0.04278564453125,
0.031280517578125,
0.01360321044921875,
-0.042236328125,
-0.0265045166015625,
-0.0033512115478515625,
0.056671142578125,
0.01433563232421875,
0.0347900390625,
-0.00470733642578125,
0.00974273681640625,
0.01015472412109375,
0.03851318359375,
-0.038909912109375,
-0.0002130270004272461,
-0.0283966064453125,
0.01206207275390625,
-0.005474090576171875,
-0.042755126953125
]
] |
Helsinki-NLP/opus-mt-tl-en | 2023-08-16T12:06:52.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"tl",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-tl-en | 0 | 12,446 | transformers | 2022-03-02T23:29:04 | ---
language:
- tl
- en
tags:
- translation
license: apache-2.0
---
### tgl-eng
* source group: Tagalog
* target group: English
* OPUS readme: [tgl-eng](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/tgl-eng/README.md)
* model: transformer-align
* source language(s): tgl_Latn
* target language(s): eng
* model: transformer-align
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus-2020-06-17.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/tgl-eng/opus-2020-06-17.zip)
* test set translations: [opus-2020-06-17.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/tgl-eng/opus-2020-06-17.test.txt)
* test set scores: [opus-2020-06-17.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/tgl-eng/opus-2020-06-17.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba-test.tgl.eng | 35.0 | 0.542 |
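The card itself does not include a usage snippet; the following is a minimal sketch using the standard Hugging Face `transformers` MarianMT classes (it assumes `transformers` and `sentencepiece` are installed, and the example sentence and `max_length` value are illustrative, not taken from the original card).

```python
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-tl-en"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Illustrative Tagalog input; any Tagalog text can be substituted.
src_text = ["Magandang umaga sa inyong lahat."]

batch = tokenizer(src_text, return_tensors="pt", padding=True)
generated = model.generate(**batch, max_length=128)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```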
### System Info:
- hf_name: tgl-eng
- source_languages: tgl
- target_languages: eng
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/tgl-eng/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['tl', 'en']
- src_constituents: {'tgl_Latn'}
- tgt_constituents: {'eng'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/tgl-eng/opus-2020-06-17.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/tgl-eng/opus-2020-06-17.test.txt
- src_alpha3: tgl
- tgt_alpha3: eng
- short_pair: tl-en
- chrF2_score: 0.542
- bleu: 35.0
- brevity_penalty: 0.975
- ref_len: 18168.0
- src_name: Tagalog
- tgt_name: English
- train_date: 2020-06-17
- src_alpha2: tl
- tgt_alpha2: en
- prefer_old: False
- long_pair: tgl-eng
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 2,067 | [
[
-0.0189056396484375,
-0.042388916015625,
0.0209197998046875,
0.033477783203125,
-0.0338134765625,
-0.0019159317016601562,
-0.02752685546875,
-0.03216552734375,
0.017608642578125,
0.01525115966796875,
-0.04205322265625,
-0.0638427734375,
-0.0389404296875,
0.017425537109375,
0.003662109375,
0.0672607421875,
-0.01177215576171875,
0.00830078125,
0.04071044921875,
-0.031707763671875,
-0.03082275390625,
-0.01015472412109375,
-0.0556640625,
-0.0212249755859375,
0.024749755859375,
0.0240020751953125,
0.0305328369140625,
0.032745361328125,
0.036102294921875,
0.0190277099609375,
-0.021881103515625,
0.01473236083984375,
-0.0167083740234375,
-0.0165863037109375,
-0.007801055908203125,
-0.0325927734375,
-0.045074462890625,
-0.0158233642578125,
0.0670166015625,
0.03302001953125,
0.002086639404296875,
0.03521728515625,
-0.0096588134765625,
0.0406494140625,
-0.0110321044921875,
0.01495361328125,
-0.035888671875,
-0.00168609619140625,
-0.03594970703125,
-0.0231170654296875,
-0.039825439453125,
-0.029510498046875,
0.01172637939453125,
-0.0458984375,
0.006526947021484375,
0.012969970703125,
0.12890625,
0.00980377197265625,
-0.0257415771484375,
-0.01178741455078125,
-0.0238037109375,
0.06903076171875,
-0.061309814453125,
0.030975341796875,
0.0301971435546875,
-0.00443267822265625,
0.0038280487060546875,
-0.0289459228515625,
-0.0198974609375,
-0.0015316009521484375,
-0.02374267578125,
0.021514892578125,
-0.006439208984375,
-0.005893707275390625,
0.018096923828125,
0.03729248046875,
-0.056121826171875,
0.00347137451171875,
-0.035736083984375,
-0.01410675048828125,
0.043792724609375,
0.0020885467529296875,
0.0298309326171875,
-0.040130615234375,
-0.032196044921875,
-0.0284881591796875,
-0.036834716796875,
0.0132293701171875,
0.0248565673828125,
0.0240631103515625,
-0.037811279296875,
0.0474853515625,
0.0042266845703125,
0.045318603515625,
0.01056671142578125,
-0.01739501953125,
0.05474853515625,
-0.048919677734375,
-0.01401519775390625,
-0.0089111328125,
0.0887451171875,
0.0238037109375,
0.00547027587890625,
0.0004858970642089844,
-0.01995849609375,
-0.0133056640625,
-0.005428314208984375,
-0.059539794921875,
0.0140228271484375,
0.0235443115234375,
-0.028656005859375,
-0.01727294921875,
0.0007572174072265625,
-0.059967041015625,
0.009124755859375,
0.0100860595703125,
0.0416259765625,
-0.05718994140625,
-0.0190887451171875,
0.023468017578125,
-0.0031909942626953125,
0.0269775390625,
-0.0010614395141601562,
-0.04425048828125,
0.00359344482421875,
0.02593994140625,
0.0706787109375,
-0.01222991943359375,
-0.0283050537109375,
-0.0207061767578125,
0.01149749755859375,
-0.01317596435546875,
0.045166015625,
-0.0154876708984375,
-0.034698486328125,
-0.0012025833129882812,
0.027618408203125,
-0.0193328857421875,
-0.016937255859375,
0.0684814453125,
-0.0188751220703125,
0.039642333984375,
-0.020477294921875,
-0.03924560546875,
-0.023284912109375,
0.0280609130859375,
-0.048492431640625,
0.09185791015625,
0.0158538818359375,
-0.0594482421875,
0.0292510986328125,
-0.060546875,
-0.022003173828125,
0.000043451786041259766,
0.00888824462890625,
-0.04913330078125,
0.0005202293395996094,
0.0162506103515625,
0.0220947265625,
-0.027587890625,
0.040374755859375,
-0.0031337738037109375,
-0.0193023681640625,
-0.01062774658203125,
-0.0117034912109375,
0.096923828125,
0.01104736328125,
-0.034637451171875,
0.010162353515625,
-0.049957275390625,
0.00937652587890625,
0.02001953125,
-0.033477783203125,
-0.00847625732421875,
-0.00572967529296875,
0.0189666748046875,
0.01082611083984375,
0.02532958984375,
-0.0469970703125,
0.031463623046875,
-0.044189453125,
0.024200439453125,
0.05816650390625,
0.004581451416015625,
0.02069091796875,
-0.0295867919921875,
0.032562255859375,
0.01186370849609375,
0.01220703125,
0.0137939453125,
-0.0394287109375,
-0.056121826171875,
-0.0078125,
0.03765869140625,
0.05572509765625,
-0.063232421875,
0.053985595703125,
-0.050811767578125,
-0.059478759765625,
-0.0589599609375,
-0.0152435302734375,
0.040740966796875,
0.0245361328125,
0.03729248046875,
-0.0117950439453125,
-0.035858154296875,
-0.07305908203125,
-0.0178680419921875,
-0.025177001953125,
-0.003925323486328125,
0.01971435546875,
0.05340576171875,
0.00359344482421875,
0.03717041015625,
-0.03680419921875,
-0.045318603515625,
-0.01346588134765625,
0.01282501220703125,
0.029541015625,
0.046600341796875,
0.058135986328125,
-0.05584716796875,
-0.0450439453125,
0.00601959228515625,
-0.052398681640625,
-0.02349853515625,
-0.0008120536804199219,
-0.01171112060546875,
0.032470703125,
0.0014104843139648438,
-0.03411865234375,
0.01580810546875,
0.048919677734375,
-0.0567626953125,
0.03448486328125,
-0.01062774658203125,
0.025360107421875,
-0.114013671875,
0.0171051025390625,
-0.007415771484375,
-0.0032520294189453125,
-0.031982421875,
0.002628326416015625,
0.01062774658203125,
0.008148193359375,
-0.038909912109375,
0.0589599609375,
-0.039215087890625,
0.0040435791015625,
0.0259857177734375,
0.018951416015625,
-0.00024056434631347656,
0.05322265625,
-0.01123809814453125,
0.07598876953125,
0.040374755859375,
-0.0214080810546875,
0.003368377685546875,
0.029876708984375,
-0.028900146484375,
0.016815185546875,
-0.056884765625,
-0.01580810546875,
0.02398681640625,
-0.0024242401123046875,
-0.05755615234375,
-0.01491546630859375,
0.0171966552734375,
-0.04949951171875,
0.0263519287109375,
-0.00539398193359375,
-0.047088623046875,
-0.0128173828125,
-0.036346435546875,
0.038848876953125,
0.03533935546875,
-0.0189971923828125,
0.05950927734375,
0.0145721435546875,
0.006626129150390625,
-0.0523681640625,
-0.068115234375,
0.0010461807250976562,
-0.01509857177734375,
-0.05487060546875,
0.037139892578125,
-0.01348876953125,
0.0019083023071289062,
0.00878143310546875,
-0.0011682510375976562,
-0.00995635986328125,
0.00400543212890625,
-0.0011739730834960938,
0.0225067138671875,
-0.01995849609375,
0.01763916015625,
-0.00536346435546875,
-0.0029296875,
-0.015869140625,
-0.0244598388671875,
0.05859375,
-0.0323486328125,
-0.00890350341796875,
-0.044921875,
0.005222320556640625,
0.0299072265625,
-0.031646728515625,
0.0765380859375,
0.05609130859375,
-0.0185699462890625,
0.0083160400390625,
-0.04315185546875,
0.011474609375,
-0.03094482421875,
0.018585205078125,
-0.037506103515625,
-0.054351806640625,
0.0648193359375,
0.0210723876953125,
0.017730712890625,
0.06793212890625,
0.0426025390625,
0.012481689453125,
0.04107666015625,
0.0225372314453125,
0.002117156982421875,
0.0430908203125,
-0.040252685546875,
-0.0003814697265625,
-0.059661865234375,
-0.021575927734375,
-0.05078125,
-0.01044464111328125,
-0.07159423828125,
-0.017181396484375,
0.02197265625,
-0.0035877227783203125,
-0.01255035400390625,
0.04827880859375,
-0.036956787109375,
0.02362060546875,
0.03948974609375,
0.0174560546875,
0.0225067138671875,
-0.00799560546875,
-0.0249481201171875,
-0.005695343017578125,
-0.03173828125,
-0.0433349609375,
0.086181640625,
0.01580810546875,
0.021697998046875,
0.02630615234375,
0.04937744140625,
0.0031070709228515625,
0.01230621337890625,
-0.0537109375,
0.045928955078125,
-0.0170135498046875,
-0.0616455078125,
-0.019256591796875,
-0.02557373046875,
-0.06341552734375,
0.00934600830078125,
-0.00926971435546875,
-0.056549072265625,
0.001506805419921875,
-0.0079345703125,
-0.0037937164306640625,
0.0521240234375,
-0.060699462890625,
0.06915283203125,
0.0001569986343383789,
-0.0233001708984375,
0.01363372802734375,
-0.04425048828125,
0.01403045654296875,
-0.00902557373046875,
0.00316619873046875,
-0.01018524169921875,
-0.01806640625,
0.06353759765625,
-0.0145721435546875,
0.0411376953125,
-0.00847625732421875,
-0.01448822021484375,
0.01134490966796875,
0.005313873291015625,
0.032012939453125,
-0.006214141845703125,
-0.02618408203125,
0.0308837890625,
-0.00508880615234375,
-0.048126220703125,
-0.013916015625,
0.04052734375,
-0.0594482421875,
-0.0440673828125,
-0.039276123046875,
-0.049957275390625,
-0.00028514862060546875,
0.03826904296875,
0.042633056640625,
0.036956787109375,
-0.0029850006103515625,
0.036041259765625,
0.04791259765625,
-0.02532958984375,
0.048126220703125,
0.03717041015625,
-0.005390167236328125,
-0.03863525390625,
0.05059814453125,
0.0203704833984375,
0.02252197265625,
0.0379638671875,
0.0055999755859375,
-0.0160980224609375,
-0.06219482421875,
-0.04296875,
0.0309600830078125,
-0.02618408203125,
-0.0198211669921875,
-0.037750244140625,
-0.007221221923828125,
-0.0249481201171875,
0.00982666015625,
-0.0224151611328125,
-0.0303955078125,
-0.0131683349609375,
-0.01995849609375,
0.0197906494140625,
0.0301361083984375,
0.0062713623046875,
0.0207977294921875,
-0.066162109375,
0.0092010498046875,
-0.015838623046875,
0.039520263671875,
-0.0155487060546875,
-0.059539794921875,
-0.022918701171875,
-0.0064697265625,
-0.0243988037109375,
-0.0740966796875,
0.03802490234375,
0.003078460693359375,
0.0190887451171875,
0.01306915283203125,
0.0029888153076171875,
0.048431396484375,
-0.0361328125,
0.0819091796875,
-0.0135498046875,
-0.07391357421875,
0.048828125,
-0.038055419921875,
0.033233642578125,
0.046234130859375,
0.018646240234375,
-0.02471923828125,
-0.046783447265625,
-0.0572509765625,
-0.058929443359375,
0.065673828125,
0.042510986328125,
-0.01611328125,
0.0009493827819824219,
0.00579071044921875,
-0.0069732666015625,
-0.0111083984375,
-0.08636474609375,
-0.028472900390625,
0.0081329345703125,
-0.03021240234375,
0.002643585205078125,
-0.0276031494140625,
-0.01470184326171875,
-0.01282501220703125,
0.0784912109375,
0.01451873779296875,
0.01265716552734375,
0.031585693359375,
-0.012115478515625,
-0.0033893585205078125,
0.029541015625,
0.057647705078125,
0.04156494140625,
-0.031951904296875,
-0.0174560546875,
0.0290374755859375,
-0.039398193359375,
0.005962371826171875,
0.00716400146484375,
-0.036285400390625,
0.02130126953125,
0.044677734375,
0.063232421875,
0.006130218505859375,
-0.038238525390625,
0.034881591796875,
-0.004100799560546875,
-0.029327392578125,
-0.033905029296875,
-0.017242431640625,
0.00849151611328125,
0.01348876953125,
0.031097412109375,
-0.004657745361328125,
-0.006473541259765625,
-0.016693115234375,
-0.0015287399291992188,
0.0051422119140625,
-0.0164794921875,
-0.03240966796875,
0.041778564453125,
0.00655364990234375,
-0.02593994140625,
0.028839111328125,
-0.028472900390625,
-0.035736083984375,
0.045440673828125,
0.026947021484375,
0.08184814453125,
-0.0234527587890625,
-0.00768280029296875,
0.058502197265625,
0.04461669921875,
-0.0016374588012695312,
0.03466796875,
0.018829345703125,
-0.043487548828125,
-0.0264892578125,
-0.05731201171875,
0.007366180419921875,
0.007442474365234375,
-0.052581787109375,
0.022918701171875,
-0.0016450881958007812,
-0.02154541015625,
-0.00714874267578125,
0.03173828125,
-0.03741455078125,
0.005123138427734375,
-0.022369384765625,
0.07220458984375,
-0.0684814453125,
0.06524658203125,
0.056060791015625,
-0.053497314453125,
-0.08392333984375,
0.0027790069580078125,
-0.019439697265625,
-0.049224853515625,
0.0462646484375,
0.0034008026123046875,
0.00627899169921875,
-0.00554656982421875,
-0.0256805419921875,
-0.057891845703125,
0.09503173828125,
0.030792236328125,
-0.0239410400390625,
-0.0208892822265625,
0.0023479461669921875,
0.046356201171875,
-0.0050201416015625,
0.015899658203125,
0.0328369140625,
0.057891845703125,
-0.0115509033203125,
-0.09014892578125,
0.0143280029296875,
-0.03546142578125,
-0.0032024383544921875,
0.031585693359375,
-0.07415771484375,
0.06243896484375,
0.00617218017578125,
-0.0241851806640625,
0.003643035888671875,
0.034820556640625,
0.029937744140625,
0.0065155029296875,
0.0352783203125,
0.06719970703125,
0.03717041015625,
-0.046142578125,
0.078369140625,
-0.0220947265625,
0.051971435546875,
0.06622314453125,
0.0162353515625,
0.055419921875,
0.0391845703125,
-0.0151824951171875,
0.039581298828125,
0.05755615234375,
-0.0193023681640625,
0.02374267578125,
-0.007518768310546875,
-0.00762939453125,
-0.0123291015625,
-0.031158447265625,
-0.034698486328125,
0.03131103515625,
0.01215362548828125,
-0.0183258056640625,
-0.00799560546875,
-0.0163421630859375,
0.0265045166015625,
-0.0018587112426757812,
-0.01383209228515625,
0.044677734375,
-0.00968170166015625,
-0.053924560546875,
0.05517578125,
-0.0047454833984375,
0.049652099609375,
-0.0482177734375,
0.004596710205078125,
-0.0166473388671875,
0.011932373046875,
-0.01171112060546875,
-0.06622314453125,
0.02618408203125,
0.01374053955078125,
-0.01320648193359375,
-0.022064208984375,
0.017364501953125,
-0.044036865234375,
-0.048614501953125,
0.038848876953125,
0.0316162109375,
0.01435089111328125,
0.0258941650390625,
-0.053436279296875,
0.00388336181640625,
0.022369384765625,
-0.048187255859375,
-0.005039215087890625,
0.045989990234375,
0.002567291259765625,
0.0509033203125,
0.0230712890625,
0.0159149169921875,
0.010284423828125,
0.004947662353515625,
0.048065185546875,
-0.06158447265625,
-0.023895263671875,
-0.06256103515625,
0.037567138671875,
-0.0118408203125,
-0.045013427734375,
0.0484619140625,
0.05535888671875,
0.07196044921875,
-0.00809478759765625,
0.0270538330078125,
-0.0257110595703125,
0.0287017822265625,
-0.05035400390625,
0.052764892578125,
-0.07659912109375,
0.004673004150390625,
-0.010467529296875,
-0.0567626953125,
-0.0251312255859375,
0.017578125,
-0.0179290771484375,
0.0015392303466796875,
0.07696533203125,
0.054534912109375,
0.0061798095703125,
-0.024658203125,
-0.00055694580078125,
0.0357666015625,
0.029083251953125,
0.07464599609375,
0.0135040283203125,
-0.0721435546875,
0.056671142578125,
-0.0205230712890625,
0.005279541015625,
-0.0007853507995605469,
-0.060699462890625,
-0.0687255859375,
-0.04931640625,
-0.01043701171875,
-0.034027099609375,
-0.010986328125,
0.07476806640625,
0.022979736328125,
-0.07354736328125,
-0.0263824462890625,
0.0075531005859375,
0.0087890625,
-0.01239776611328125,
-0.0205230712890625,
0.062286376953125,
-0.0146484375,
-0.08343505859375,
0.01006317138671875,
0.0042724609375,
0.00838470458984375,
0.012939453125,
-0.0007534027099609375,
-0.05413818359375,
-0.011077880859375,
0.022796630859375,
0.006786346435546875,
-0.0677490234375,
-0.018096923828125,
0.00896453857421875,
-0.0214385986328125,
0.0204620361328125,
0.0020542144775390625,
-0.0157623291015625,
0.0233154296875,
0.05657958984375,
0.029266357421875,
0.039093017578125,
-0.0022335052490234375,
0.0282135009765625,
-0.063232421875,
0.034881591796875,
0.020538330078125,
0.0404052734375,
0.017120361328125,
-0.006908416748046875,
0.07318115234375,
0.0210723876953125,
-0.027587890625,
-0.0699462890625,
-0.005702972412109375,
-0.10003662109375,
-0.0003867149353027344,
0.07513427734375,
-0.0208587646484375,
-0.0254058837890625,
0.0201568603515625,
-0.0146636962890625,
0.040435791015625,
-0.039398193359375,
0.05096435546875,
0.0751953125,
0.0258941650390625,
0.0096282958984375,
-0.04107666015625,
0.0258636474609375,
0.046142578125,
-0.0623779296875,
-0.015655517578125,
0.0138092041015625,
0.02410888671875,
0.0295867919921875,
0.0423583984375,
-0.0239410400390625,
0.01239776611328125,
-0.0159149169921875,
0.0202484130859375,
-0.0128936767578125,
0.0025730133056640625,
-0.023651123046875,
0.01247406005859375,
-0.0125274658203125,
-0.0174713134765625
]
] |
zarakiquemparte/zarafusionex-1.2-l2-7b | 2023-08-29T02:33:29.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"llama2",
"license:other",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | zarakiquemparte | null | null | zarakiquemparte/zarafusionex-1.2-l2-7b | 2 | 12,438 | transformers | 2023-08-29T00:35:40 | ---
license: other
tags:
- llama2
---
# Model Card: Zarafusionex 1.2 L2 7b
This model uses [Nous Hermes Llama2 7b](https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b) (53%) as a base with [Stable Beluga 7b](https://huggingface.co/stabilityai/StableBeluga-7B) (47%), and the result of this merge was then merged with [LimaRP Llama2 v2 7B Lora](https://huggingface.co/lemonilia/limarp-llama2-v2).
The merge of the two base models (Nous Hermes and Stable Beluga) was done with this [script](https://github.com/zarakiquemparte/zaraki-tools/blob/main/merge-cli.py).
The merge of the LoRA with the resulting model was done with this [script](https://github.com/zarakiquemparte/zaraki-tools/blob/main/apply-lora.py).
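As a rough illustration of the idea behind the linked merge script, the sketch below averages two checkpoints with the 53/47 weighting described above. It is a simplified stand-in for the actual `merge-cli.py`, and the file paths are placeholders, not real checkpoint names.

```python
import torch

# Placeholder checkpoint paths; the real merge operates on the full Hugging Face checkpoints.
hermes_state = torch.load("nous-hermes-llama2-7b.bin", map_location="cpu")
beluga_state = torch.load("stable-beluga-7b.bin", map_location="cpu")

merged_state = {}
for name, hermes_param in hermes_state.items():
    beluga_param = beluga_state[name]
    # Weighted average: 53% Nous Hermes, 47% Stable Beluga.
    merged_state[name] = 0.53 * hermes_param + 0.47 * beluga_param

torch.save(merged_state, "zarafusionex-base-merge.bin")
```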
Merge illustration:

## Usage:
Since this is a merge between Nous Hermes, Stable Beluga and LimaRP, the following instruction formats should work:
Alpaca 2:
```
### Instruction:
<prompt>
### Response:
<leave a newline blank for model to respond>
```
Alpaca LimaRP:
```
### Instruction:
Character's Persona: {bot character description}
User's Persona: {user character description}
Scenario: {what happens in the story}
Play the role of Character. You must engage in a roleplaying chat with User below this line. Do not write dialogues and narration for User. Character should respond with messages of medium length.
### Input:
Character: {utterance}
### Response:
User: {utterance}
```
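As a minimal sketch of running the model with one of the formats above (the prompt text, generation parameters, and `device_map="auto"` with `accelerate` are illustrative assumptions, not requirements stated in this card):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "zarakiquemparte/zarafusionex-1.2-l2-7b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")

# Alpaca 2 style prompt, as described above.
prompt = "### Instruction:\nWrite a short story about a lighthouse keeper.\n\n### Response:\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```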
## Bias, Risks, and Limitations
This model is not intended to supply factual information or advice in any form.
## Training Details
This model is merged and can be reproduced using the tools mentioned above. Please refer to all provided links for extra model-specific details. | 1,711 | [
[
-0.03173828125,
-0.04022216796875,
0.0252532958984375,
0.030303955078125,
-0.039031982421875,
-0.0226287841796875,
0.019287109375,
-0.0517578125,
0.0308074951171875,
0.0694580078125,
-0.0677490234375,
-0.031890869140625,
-0.03704833984375,
-0.01094818115234375,
-0.033233642578125,
0.097900390625,
0.00835418701171875,
-0.0226287841796875,
0.005458831787109375,
-0.019989013671875,
-0.03961181640625,
-0.03314208984375,
-0.055084228515625,
-0.037322998046875,
0.0552978515625,
0.0282135009765625,
0.05902099609375,
0.0275726318359375,
0.03155517578125,
0.02947998046875,
-0.0275726318359375,
0.015838623046875,
-0.02850341796875,
0.0031490325927734375,
-0.005008697509765625,
-0.040924072265625,
-0.07379150390625,
0.00433349609375,
0.03582763671875,
… (embedding vector values elided) …
]
] |
artificialguybr/analogredmond-v2 | 2023-10-07T06:26:23.000Z | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"license:creativeml-openrail-m",
"has_space",
"region:us"
] | text-to-image | artificialguybr | null | null | artificialguybr/analogredmond-v2 | 6 | 12,433 | diffusers | 2023-10-07T06:22:44 | ---
license: creativeml-openrail-m
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: AnalogRedmAF, Analog
widget:
- text: AnalogRedmAF, Analog
---
# Analog.Redmond V2

Analog.Redmond V2 is here!
TEST ALL MY LORAS HERE: https://huggingface.co/spaces/artificialguybr/artificialguybr-demo-lora?logs=build
Introducing AnalogRedmond, the ultimate LORA for creating stunning analog photography!
I'm grateful for the GPU time from Redmond.AI that allowed me to make this LORA! If you need GPU, then you need the great services from Redmond.AI.
It is based on SD XL 1.0 and fine-tuned on a large dataset of analog photographs.
The LORA has a high capacity to generate analog photographs.
You can use detailed, minimalist, colorful, or black and white as tags to control the results.
The trigger tag for the model: AnalogRedmAF
The LORA is not perfect and sometimes needs more than one generation to create good images; see the usage sketch below.
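As a quick illustration (not part of the original card), here is a minimal `diffusers` sketch of how this LORA might be loaded; the base model and repo name come from this card, while the scene in the prompt is just a made-up example:
```python
import torch
from diffusers import DiffusionPipeline

# Load the SD XL 1.0 base model this LORA was trained on.
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

# Attach the Analog.Redmond V2 LORA weights.
pipe.load_lora_weights("artificialguybr/analogredmond-v2")

# Trigger the style with the tag AnalogRedmAF; extra tags steer the look.
prompt = "AnalogRedmAF, Analog, colorful, a street market at dusk"
image = pipe(prompt).images[0]
image.save("analog_market.png")
```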
This is inspired by the good Dreambooth model Nitro made for SD 1.5!
I really hope you like the LORA and use it.
If you like the model and think it's worth it, you can make a donation to my Patreon or Ko-fi.
Follow me on Twitter to be the first to know about new models:
https://twitter.com/artificialguybr/ | 1,316 | [
[
… (embedding vector values elided) …
]
] |
migtissera/SynthIA-7B-v1.3 | 2023-10-14T01:33:58.000Z | [
"transformers",
"pytorch",
"mistral",
"text-generation",
"en",
"arxiv:2306.02707",
"license:apache-2.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | migtissera | null | null | migtissera/SynthIA-7B-v1.3 | 122 | 12,400 | transformers | 2023-09-28T20:41:10 | ---
license: apache-2.0
pipeline_tag: text-generation
language:
- en
library_name: transformers
---
SynthIA-7B-v1.3: Base model is Mistral-7B-v0.1
All SynthIA models are uncensored. Please use them with caution and with the best intentions. You are responsible for how you use SynthIA.
To evoke generalized Tree of Thought + Chain of Thought reasoning, you may use the following system message:
```
Elaborate on the topic using a Tree of Thoughts and backtrack when necessary to construct a clear, cohesive Chain of Thought reasoning. Always answer without hesitation.
```
# SynthIA-7B-v1.3
SynthIA (Synthetic Intelligent Agent) 7B-v1.3 is a Mistral-7B-v0.1 model trained on Orca-style datasets. It has been fine-tuned for instruction following as well as long-form conversations.
<br>

<br>
<br>
#### License Disclaimer:
This model is released under Apache 2.0 and comes with no warranty or guarantees of any kind.
<br>
## Evaluation
We evaluated SynthIA-7B-v1.3 on a wide range of tasks using [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) from EleutherAI.
Here are the results on metrics used by [HuggingFaceH4 Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
|**Task**|**Metric**|**Value**|
|:------:|:--------:|:-------:|
|*arc_challenge*|acc_norm|0.6237|
|*hellaswag*|acc_norm|0.8349|
|*mmlu*|acc_norm|0.6232|
|*truthfulqa_mc*|mc2|0.5125|
|**Total Average**|-|**0.6485**|
<br>
## Example Usage
### Here is prompt format:
```
SYSTEM: Elaborate on the topic using a Tree of Thoughts and backtrack when necessary to construct a clear, cohesive Chain of Thought reasoning. Always answer without hesitation.
USER: How is a rocket launched from the surface of the earth to Low Earth Orbit?
ASSISTANT:
```
### Below is a code example showing how to use this model:
```python
import torch, json
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "migtissera/SynthIA-7B-v1.3"
output_file_path = "./SynthIA-7B-conversations.jsonl"

model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,
    device_map="auto",
    load_in_8bit=False,
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)


def generate_text(instruction):
    # Tokenize the prompt and move it to the GPU.
    tokens = tokenizer.encode(instruction)
    tokens = torch.LongTensor(tokens).unsqueeze(0)
    tokens = tokens.to("cuda")
    instance = {
        "input_ids": tokens,
        "top_p": 1.0,
        "temperature": 0.75,
        "generate_len": 1024,
        "top_k": 50,
    }
    length = len(tokens[0])
    with torch.no_grad():
        rest = model.generate(
            input_ids=tokens,
            max_length=length + instance["generate_len"],
            use_cache=True,
            do_sample=True,
            top_p=instance["top_p"],
            temperature=instance["temperature"],
            top_k=instance["top_k"],
            num_return_sequences=1,
        )
    # Keep only the newly generated tokens and cut at the next "USER:" turn.
    output = rest[0][length:]
    string = tokenizer.decode(output, skip_special_tokens=True)
    answer = string.split("USER:")[0].strip()
    return f"{answer}"


conversation = f"SYSTEM: Elaborate on the topic using a Tree of Thoughts and backtrack when necessary to construct a clear, cohesive Chain of Thought reasoning. Always answer without hesitation."

while True:
    user_input = input("You: ")
    llm_prompt = f"{conversation} \nUSER: {user_input} \nASSISTANT: "
    answer = generate_text(llm_prompt)
    print(answer)
    conversation = f"{llm_prompt}{answer}"
    json_data = {"prompt": user_input, "answer": answer}

    ## Save your conversation
    with open(output_file_path, "a") as output_file:
        output_file.write(json.dumps(json_data) + "\n")
```
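Note that each turn is appended to `conversation`, so the prompt grows to include the full chat history, and every exchange is also appended as a JSON line to the file at `output_file_path`.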
<br>
#### Limitations & Biases:
While this model aims for accuracy, it can occasionally produce inaccurate or misleading results.
Despite diligent efforts in refining the pretraining data, there remains a possibility for the generation of inappropriate, biased, or offensive content.
Exercise caution and cross-check information when necessary. This is an uncensored model.
<br>
### Citation:
Please kindly cite using the following BibTeX:
```
@misc{SynthIA-7B-v1.3,
author = {Migel Tissera},
title = {SynthIA-7B-v1.3: Synthetic Intelligent Agent},
year = {2023},
publisher = {GitHub, HuggingFace},
journal = {GitHub repository, HuggingFace repository},
howpublished = {\url{https://huggingface.co/migtissera/SynthIA-7B-v1.3}},
}
```
```
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 4,991 | [
[
… (embedding vector values elided) …
]
] |
facebook/blenderbot_small-90M | 2023-01-24T16:29:13.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"blenderbot-small",
"text2text-generation",
"convAI",
"conversational",
"facebook",
"en",
"dataset:blended_skill_talk",
"arxiv:1907.06616",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | conversational | facebook | null | null | facebook/blenderbot_small-90M | 40 | 12,303 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
thumbnail:
tags:
- convAI
- conversational
- facebook
license: apache-2.0
datasets:
- blended_skill_talk
metrics:
- perplexity
---
## Model description
+ Paper: [Recipes for building an open-domain chatbot](https://arxiv.org/abs/1907.06616)
+ [Original ParlAI Code](https://parl.ai/projects/recipes/)
### Abstract
Building open-domain chatbots is a challenging area for machine learning research. While prior work has shown that scaling neural models in the number of parameters and the size of the data they are trained on gives improved results, we show that other ingredients are important for a high-performing chatbot. Good conversation requires a number of skills that an expert conversationalist blends in a seamless way: providing engaging talking points and listening to their partners, both asking and answering questions, and displaying knowledge, empathy and personality appropriately, depending on the situation. We show that large scale models can learn these skills when given appropriate training data and choice of generation strategy. We build variants of these recipes with 90M, 2.7B and 9.4B parameter neural models, and make our models and code publicly available. Human evaluations show our best models are superior to existing approaches in multi-turn dialogue in terms of engagingness and humanness measurements. We then discuss the limitations of this work by analyzing failure cases of our models.
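As a usage sketch (not part of the original card), one conversational turn with this checkpoint might look like the following, assuming the standard BlenderbotSmall classes in `transformers`; the example utterance is illustrative:
```python
from transformers import (
    BlenderbotSmallForConditionalGeneration,
    BlenderbotSmallTokenizer,
)

model_name = "facebook/blenderbot_small-90M"
tokenizer = BlenderbotSmallTokenizer.from_pretrained(model_name)
model = BlenderbotSmallForConditionalGeneration.from_pretrained(model_name)

# Encode a single user utterance and generate the bot's reply.
utterance = "My friends are cool but they eat too many carbs."
inputs = tokenizer([utterance], return_tensors="pt")
reply_ids = model.generate(**inputs)
print(tokenizer.batch_decode(reply_ids, skip_special_tokens=True)[0])
```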
| 1,451 | [
[
… (embedding vector values elided) …
-0.0243682861328125,
0.007244110107421875,
0.00560760498046875,
-0.049530029296875,
0.07147216796875,
-0.027099609375,
0.018646240234375,
0.02294921875,
0.055206298828125,
0.00004100799560546875,
0.0218505859375,
0.0164794921875,
0.032318115234375,
0.031494140625,
0.00370025634765625,
0.05743408203125,
-0.0213623046875,
0.005611419677734375,
0.10516357421875,
-0.0269012451171875,
0.08056640625,
0.01116180419921875,
0.0027408599853515625,
0.0272369384765625,
0.03521728515625,
0.00925445556640625,
0.031890869140625,
0.009521484375,
-0.004222869873046875,
-0.0232696533203125,
0.001384735107421875,
-0.010650634765625,
0.0498046875,
0.0364990234375,
-0.03778076171875,
-0.000011086463928222656,
0.00171661376953125,
0.0139617919921875,
0.00725555419921875,
0.0145111083984375,
0.07177734375,
0.00733184814453125,
-0.06317138671875,
0.047760009765625,
-0.0073089599609375,
0.017822265625,
-0.03338623046875,
-0.0078582763671875,
-0.0212860107421875,
0.0151824951171875,
0.007843017578125,
-0.0589599609375,
0.01192474365234375,
-0.00916290283203125,
-0.00742340087890625,
-0.0202178955078125,
0.042572021484375,
-0.02655029296875,
-0.008758544921875,
0.01351165771484375,
0.052642822265625,
0.00496673583984375,
-0.01849365234375,
-0.04937744140625,
-0.006397247314453125,
-0.0147705078125,
-0.017730712890625,
0.0307159423828125,
0.052398681640625,
0.0235748291015625,
0.047607421875,
0.0462646484375,
-0.0034694671630859375,
-0.0109405517578125,
-0.004199981689453125,
0.07305908203125,
-0.044525146484375,
-0.02630615234375,
-0.044921875,
0.0576171875,
-0.0283050537109375,
-0.044921875,
0.056304931640625,
0.03533935546875,
0.0670166015625,
-0.0144805908203125,
0.05352783203125,
-0.005268096923828125,
0.043426513671875,
-0.02166748046875,
0.0556640625,
-0.02294921875,
-0.009246826171875,
0.00572967529296875,
-0.061798095703125,
-0.018218994140625,
0.036865234375,
0.00229644775390625,
0.01096343994140625,
0.032257080078125,
0.068115234375,
-0.013275146484375,
0.0289306640625,
0.05255126953125,
0.01082611083984375,
0.04083251953125,
0.04229736328125,
0.0670166015625,
-0.0291748046875,
0.041412353515625,
0.005523681640625,
-0.037994384765625,
-0.0254669189453125,
-0.052703857421875,
-0.10467529296875,
-0.068359375,
-0.016448974609375,
-0.039886474609375,
0.0072021484375,
0.07635498046875,
0.09429931640625,
-0.04400634765625,
-0.02972412109375,
-0.00392913818359375,
-0.0035991668701171875,
-0.019012451171875,
-0.0129241943359375,
-0.012847900390625,
-0.04083251953125,
-0.056732177734375,
0.034271240234375,
0.0145263671875,
0.0047149658203125,
-0.0266876220703125,
-0.0138397216796875,
0.0003066062927246094,
0.040679931640625,
0.062164306640625,
0.023040771484375,
-0.0440673828125,
-0.005611419677734375,
0.007904052734375,
-0.006801605224609375,
0.014984130859375,
0.062164306640625,
-0.0179290771484375,
0.039093017578125,
0.034088134765625,
0.056976318359375,
0.038177490234375,
-0.0205841064453125,
0.062744140625,
-0.053314208984375,
-0.0025959014892578125,
0.0177001953125,
0.01311492919921875,
0.023651123046875,
-0.00839996337890625,
0.039398193359375,
-0.007297515869140625,
-0.06451416015625,
-0.0535888671875,
0.0206756591796875,
-0.07373046875,
-0.01763916015625,
0.0804443359375,
-0.0224761962890625,
-0.017608642578125,
0.01108551025390625,
-0.02874755859375,
0.0179443359375,
-0.04156494140625,
0.055267333984375,
0.057891845703125,
-0.01194000244140625,
0.008514404296875,
-0.059661865234375,
0.0347900390625,
0.003582000732421875,
-0.060760498046875,
0.033172607421875,
0.024871826171875,
-0.0014829635620117188,
0.0153045654296875,
0.033905029296875,
-0.01273345947265625,
0.0012254714965820312,
0.025421142578125,
0.006748199462890625,
-0.01004791259765625,
-0.052001953125,
0.00826263427734375,
0.03363037109375,
0.0179595947265625,
-0.0029087066650390625
]
] |
timm/efficientnet_b2.ra_in1k | 2023-04-27T21:10:10.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:1905.11946",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/efficientnet_b2.ra_in1k | 0 | 12,269 | timm | 2022-12-12T23:56:20 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for efficientnet_b2.ra_in1k
An EfficientNet image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* RandAugment `RA` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
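A rough sketch of how these ingredients fit together in plain PyTorch is shown below. It is illustrative only: the learning rate, decay schedule, EMA decay and RandAugment policy string are placeholder values rather than the published hyperparameters, and `torch.optim.RMSprop` only approximates the TF-1.0-style RMSProp behaviour used in `timm`'s training script.
```python
import copy
import torch
import timm
model = timm.create_model('efficientnet_b2', pretrained=False)
# RandAugment data augmentation via timm's transform factory (policy string is illustrative)
train_tfms = timm.data.create_transform(input_size=256, is_training=True, auto_augment='rand-m9-mstd0.5')
# RMSProp optimizer; values below are placeholders, not the published recipe
optimizer = torch.optim.RMSprop(model.parameters(), lr=0.016, alpha=0.9, eps=1e-3, momentum=0.9, weight_decay=1e-5)
# Step (exponential decay w/ staircase) LR schedule: multiply the LR by `gamma` every `step_size` epochs
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=2, gamma=0.97)
# EMA weight averaging: keep a shadow copy of the weights, updated after every optimizer step
ema_model = copy.deepcopy(model)
ema_decay = 0.9999
@torch.no_grad()
def update_ema():
    for ema_p, p in zip(ema_model.parameters(), model.parameters()):
        ema_p.mul_(ema_decay).add_(p, alpha=1.0 - ema_decay)
```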
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 9.1
- GMACs: 0.9
- Activations (M): 12.8
- Image size: train = 256 x 256, test = 288 x 288
- **Papers:**
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks: https://arxiv.org/abs/1905.11946
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('efficientnet_b2.ra_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'efficientnet_b2.ra_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
    # print shape of each feature map in output
    # e.g.:
    # torch.Size([1, 16, 128, 128])
    # torch.Size([1, 24, 64, 64])
    # torch.Size([1, 48, 32, 32])
    # torch.Size([1, 120, 16, 16])
    # torch.Size([1, 352, 8, 8])
    print(o.shape)
```
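If you need the channel counts or strides of these feature maps (for example, to wire up a decoder), the `features_only` wrapper exposes them through `feature_info`. A minimal sketch, assuming a recent `timm` release; the printed values are indicative and correspond to the shapes listed above:
```python
import timm
# create the backbone with feature-map outputs, as in the snippet above
model = timm.create_model('efficientnet_b2.ra_in1k', pretrained=True, features_only=True)
# FeatureInfo describes each returned feature map
print(model.feature_info.channels())   # e.g. [16, 24, 48, 120, 352]
print(model.feature_info.reduction())  # e.g. [2, 4, 8, 16, 32]
```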
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'efficientnet_b2.ra_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1408, 8, 8) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
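A common use of these pooled embeddings is image-to-image similarity. The sketch below scores two images with cosine similarity; both URLs point at the same documentation image purely as a placeholder, so substitute your own second image:
```python
import torch
import torch.nn.functional as F
from urllib.request import urlopen
from PIL import Image
import timm
model = timm.create_model('efficientnet_b2.ra_in1k', pretrained=True, num_classes=0)
model = model.eval()
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
urls = [
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png',
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png',  # placeholder: point at a second image
]
with torch.no_grad():
    embs = torch.cat([model(transforms(Image.open(urlopen(u))).unsqueeze(0)) for u in urls])
# cosine similarity between the two (batch_size, num_features) embeddings
print(F.cosine_similarity(embs[0:1], embs[1:2]).item())
```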
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{tan2019efficientnet,
title={Efficientnet: Rethinking model scaling for convolutional neural networks},
author={Tan, Mingxing and Le, Quoc},
booktitle={International conference on machine learning},
pages={6105--6114},
year={2019},
organization={PMLR}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 4,731 | [
[
-0.0281982421875,
-0.037200927734375,
-0.00905609130859375,
0.00501251220703125,
-0.0148773193359375,
-0.034149169921875,
-0.0237274169921875,
-0.0300750732421875,
0.020172119140625,
0.032196044921875,
-0.032989501953125,
-0.040679931640625,
-0.056396484375,
-0.0088043212890625,
-0.0126190185546875,
0.0628662109375,
-0.002777099609375,
0.0035037994384765625,
-0.01152801513671875,
-0.047637939453125,
-0.0138702392578125,
-0.01509857177734375,
-0.08099365234375,
-0.0413818359375,
0.0340576171875,
0.026123046875,
0.043304443359375,
0.049530029296875,
0.054931640625,
0.036895751953125,
-0.004291534423828125,
0.0137481689453125,
-0.018402099609375,
-0.00555419921875,
0.0289459228515625,
-0.046722412109375,
-0.0298919677734375,
0.0154266357421875,
0.05157470703125,
0.029296875,
-0.004512786865234375,
0.035247802734375,
0.00847625732421875,
0.05224609375,
-0.0254058837890625,
0.00806427001953125,
-0.0341796875,
0.01409149169921875,
-0.005340576171875,
0.00759124755859375,
-0.0276336669921875,
-0.0269317626953125,
0.00710296630859375,
-0.04278564453125,
0.0298309326171875,
0.0009613037109375,
0.09698486328125,
0.0230865478515625,
-0.0007786750793457031,
-0.005306243896484375,
-0.0163421630859375,
0.0577392578125,
-0.0625,
0.012603759765625,
0.0159454345703125,
0.024658203125,
-0.00485992431640625,
-0.090576171875,
-0.0418701171875,
-0.0165252685546875,
-0.0162353515625,
-0.00409698486328125,
-0.0289459228515625,
0.0018739700317382812,
0.0276336669921875,
0.018707275390625,
-0.031768798828125,
0.01444244384765625,
-0.034912109375,
-0.0102081298828125,
0.0390625,
0.00428009033203125,
0.027313232421875,
-0.011077880859375,
-0.034393310546875,
-0.036529541015625,
-0.03033447265625,
0.0234375,
0.0216217041015625,
0.0233612060546875,
-0.042236328125,
0.0282745361328125,
0.00820159912109375,
0.04803466796875,
0.00820159912109375,
-0.02044677734375,
0.042510986328125,
0.00247955322265625,
-0.033538818359375,
-0.01285552978515625,
0.0787353515625,
0.0284576416015625,
0.01473236083984375,
0.01082611083984375,
-0.012451171875,
-0.033203125,
0.0020618438720703125,
-0.09515380859375,
-0.027740478515625,
0.0199737548828125,
-0.05328369140625,
-0.0279998779296875,
0.0111236572265625,
-0.03753662109375,
-0.0093536376953125,
-0.004520416259765625,
0.045013427734375,
-0.03375244140625,
-0.0307769775390625,
-0.0006494522094726562,
-0.01523590087890625,
0.0183258056640625,
0.01788330078125,
-0.04345703125,
0.016143798828125,
0.026214599609375,
0.08880615234375,
0.013031005859375,
-0.0305023193359375,
-0.02105712890625,
-0.032958984375,
-0.02117919921875,
0.0276947021484375,
-0.00734710693359375,
0.00754547119140625,
-0.0254364013671875,
0.023956298828125,
-0.00981903076171875,
-0.057891845703125,
0.0191192626953125,
-0.0241546630859375,
0.006679534912109375,
-0.0037479400634765625,
-0.01971435546875,
-0.046844482421875,
0.0272674560546875,
-0.04345703125,
0.08843994140625,
0.0229339599609375,
-0.06976318359375,
0.019317626953125,
-0.045501708984375,
-0.00861358642578125,
-0.0218048095703125,
-0.0006613731384277344,
-0.0799560546875,
-0.006595611572265625,
0.00847625732421875,
0.057586669921875,
-0.0293121337890625,
0.0020198822021484375,
-0.041717529296875,
-0.01509857177734375,
0.0254364013671875,
-0.0029659271240234375,
0.08197021484375,
0.0207672119140625,
-0.0355224609375,
0.0172271728515625,
-0.048187255859375,
0.0207366943359375,
0.039215087890625,
-0.0179595947265625,
-0.004215240478515625,
-0.04339599609375,
0.01097869873046875,
0.02093505859375,
0.0077362060546875,
-0.040557861328125,
0.01461029052734375,
-0.0145416259765625,
0.03741455078125,
0.043792724609375,
-0.01001739501953125,
0.0260009765625,
-0.0316162109375,
0.0212860107421875,
0.0139923095703125,
0.0189056396484375,
-0.0057373046875,
-0.035064697265625,
-0.06256103515625,
-0.036834716796875,
0.0287017822265625,
0.0281829833984375,
-0.037872314453125,
0.0302886962890625,
-0.0145416259765625,
-0.06072998046875,
-0.03466796875,
0.0050811767578125,
0.040802001953125,
0.04571533203125,
0.0236358642578125,
-0.035919189453125,
-0.03790283203125,
-0.0758056640625,
-0.002960205078125,
0.006313323974609375,
0.0047149658203125,
0.0308685302734375,
0.047576904296875,
-0.0037555694580078125,
0.04290771484375,
-0.0294647216796875,
-0.018310546875,
-0.02435302734375,
0.00341796875,
0.03375244140625,
0.06561279296875,
0.05902099609375,
-0.050537109375,
-0.04541015625,
-0.00876617431640625,
-0.0694580078125,
0.015380859375,
-0.00934600830078125,
-0.011962890625,
0.0116424560546875,
0.0160980224609375,
-0.0418701171875,
0.036163330078125,
0.018463134765625,
-0.0163116455078125,
0.031036376953125,
-0.0186309814453125,
0.017730712890625,
-0.08599853515625,
0.013916015625,
0.028717041015625,
-0.0102081298828125,
-0.0391845703125,
0.0174560546875,
0.005451202392578125,
-0.00724029541015625,
-0.03594970703125,
0.0487060546875,
-0.044342041015625,
-0.0146942138671875,
-0.0120697021484375,
-0.023193359375,
0.0011701583862304688,
0.05230712890625,
-0.01535797119140625,
0.0262603759765625,
0.0621337890625,
-0.031402587890625,
0.03570556640625,
0.019012451171875,
-0.0192108154296875,
0.02557373046875,
-0.056732177734375,
0.017822265625,
0.00316619873046875,
0.0211334228515625,
-0.0780029296875,
-0.017669677734375,
0.0299835205078125,
-0.04833984375,
0.0526123046875,
-0.036529541015625,
-0.03521728515625,
-0.03497314453125,
-0.031768798828125,
0.02911376953125,
0.047576904296875,
-0.057159423828125,
0.035064697265625,
0.0160675048828125,
0.0265350341796875,
-0.04815673828125,
-0.0712890625,
-0.01174163818359375,
-0.0293731689453125,
-0.058135986328125,
0.0223846435546875,
0.0098724365234375,
0.0014486312866210938,
0.0109100341796875,
0.00040602684020996094,
-0.01334381103515625,
-0.0008401870727539062,
0.039093017578125,
0.013916015625,
-0.0216827392578125,
-0.006229400634765625,
-0.0228118896484375,
-0.0032749176025390625,
0.0021266937255859375,
-0.0259246826171875,
0.03948974609375,
-0.017303466796875,
-0.007495880126953125,
-0.0665283203125,
-0.0017986297607421875,
0.037261962890625,
-0.0037479400634765625,
0.06536865234375,
0.08270263671875,
-0.042022705078125,
-0.005695343017578125,
-0.034393310546875,
-0.0289154052734375,
-0.0372314453125,
0.037933349609375,
-0.024871826171875,
-0.033203125,
0.06494140625,
-0.00449371337890625,
0.010101318359375,
0.048858642578125,
0.0259857177734375,
-0.0032024383544921875,
0.048187255859375,
0.046844482421875,
0.01953125,
0.057159423828125,
-0.083984375,
-0.014434814453125,
-0.06781005859375,
-0.0311737060546875,
-0.031524658203125,
-0.059356689453125,
-0.046875,
-0.022308349609375,
0.034881591796875,
0.015289306640625,
-0.03631591796875,
0.034393310546875,
-0.06475830078125,
0.00502777099609375,
0.052032470703125,
0.0452880859375,
-0.03216552734375,
0.0264434814453125,
-0.0168609619140625,
-0.000003933906555175781,
-0.06793212890625,
-0.0128173828125,
0.08453369140625,
0.031158447265625,
0.04376220703125,
-0.00811767578125,
0.0526123046875,
-0.01654052734375,
0.031585693359375,
-0.04443359375,
0.043548583984375,
-0.0178680419921875,
-0.036529541015625,
-0.0169830322265625,
-0.03857421875,
-0.08154296875,
0.01171875,
-0.0198822021484375,
-0.045806884765625,
0.01462554931640625,
0.01849365234375,
-0.0189056396484375,
0.060546875,
-0.06353759765625,
0.06866455078125,
-0.005157470703125,
-0.0341796875,
-0.00537872314453125,
-0.054168701171875,
0.0231781005859375,
0.020538330078125,
-0.014923095703125,
-0.002201080322265625,
0.00672149658203125,
0.0799560546875,
-0.052398681640625,
0.0655517578125,
-0.040191650390625,
0.0367431640625,
0.039764404296875,
-0.010162353515625,
0.029754638671875,
-0.00952911376953125,
-0.0183868408203125,
0.0258331298828125,
-0.01036834716796875,
-0.035919189453125,
-0.04595947265625,
0.04718017578125,
-0.071533203125,
-0.021514892578125,
-0.0159454345703125,
-0.03521728515625,
0.0225982666015625,
0.00926971435546875,
0.041748046875,
0.0579833984375,
0.02325439453125,
0.0275421142578125,
0.041656494140625,
-0.0318603515625,
0.03485107421875,
-0.0020160675048828125,
-0.003368377685546875,
-0.0394287109375,
0.05743408203125,
0.028961181640625,
0.0158538818359375,
0.0135040283203125,
0.0198211669921875,
-0.0162506103515625,
-0.046844482421875,
-0.0264434814453125,
0.017974853515625,
-0.055389404296875,
-0.04595947265625,
-0.05218505859375,
-0.03240966796875,
-0.0253753662109375,
-0.00682830810546875,
-0.042449951171875,
-0.031707763671875,
-0.0267181396484375,
0.01800537109375,
0.0545654296875,
0.03948974609375,
-0.020233154296875,
0.04266357421875,
-0.0352783203125,
0.007732391357421875,
0.007678985595703125,
0.0292205810546875,
0.01146697998046875,
-0.0648193359375,
-0.025543212890625,
-0.00421142578125,
-0.03179931640625,
-0.04888916015625,
0.037139892578125,
0.0187530517578125,
0.038787841796875,
0.0258636474609375,
-0.01238250732421875,
0.049072265625,
-0.0036945343017578125,
0.03753662109375,
0.040191650390625,
-0.03277587890625,
0.042572021484375,
0.00522613525390625,
0.013397216796875,
0.00981903076171875,
0.0259857177734375,
-0.0183563232421875,
-0.00321197509765625,
-0.07183837890625,
-0.060302734375,
0.06292724609375,
0.0035228729248046875,
0.00254058837890625,
0.0214080810546875,
0.062286376953125,
0.00554656982421875,
-0.004665374755859375,
-0.054443359375,
-0.04046630859375,
-0.0199432373046875,
-0.0176544189453125,
0.003704071044921875,
-0.007328033447265625,
-0.00665283203125,
-0.048187255859375,
0.0513916015625,
-0.0022563934326171875,
0.0535888671875,
0.025787353515625,
-0.0009250640869140625,
-0.006927490234375,
-0.03033447265625,
0.031768798828125,
0.0226898193359375,
-0.0193939208984375,
0.01019287109375,
0.01311492919921875,
-0.04119873046875,
0.0113983154296875,
0.00876617431640625,
-0.004638671875,
-0.0009016990661621094,
0.04046630859375,
0.07147216796875,
0.0010519027709960938,
0.00797271728515625,
0.02569580078125,
-0.007389068603515625,
-0.0306243896484375,
-0.019989013671875,
0.01361083984375,
-0.00045108795166015625,
0.036956787109375,
0.0214080810546875,
0.03692626953125,
-0.00693511962890625,
-0.018951416015625,
0.0229644775390625,
0.039642333984375,
-0.018310546875,
-0.0236968994140625,
0.047698974609375,
-0.0132293701171875,
-0.020172119140625,
0.0667724609375,
-0.01428985595703125,
-0.032684326171875,
0.09063720703125,
0.0364990234375,
0.07366943359375,
0.0016145706176757812,
-0.0019044876098632812,
0.0716552734375,
0.022247314453125,
-0.005184173583984375,
0.00980377197265625,
0.01235198974609375,
-0.0618896484375,
0.002208709716796875,
-0.035614013671875,
0.0041351318359375,
0.0225830078125,
-0.041717529296875,
0.01849365234375,
-0.055206298828125,
-0.03472900390625,
0.01409912109375,
0.0311431884765625,
-0.07379150390625,
0.0142822265625,
-0.01161956787109375,
0.07122802734375,
-0.05230712890625,
0.0579833984375,
0.06732177734375,
-0.03753662109375,
-0.0869140625,
-0.01313018798828125,
0.0005307197570800781,
-0.06756591796875,
0.055267333984375,
0.03472900390625,
0.01184844970703125,
0.00738525390625,
-0.0645751953125,
-0.05145263671875,
0.109619140625,
0.04351806640625,
-0.01389312744140625,
0.025787353515625,
-0.0162811279296875,
0.01522064208984375,
-0.03662109375,
0.039581298828125,
0.00782012939453125,
0.0318603515625,
0.021331787109375,
-0.04510498046875,
0.025787353515625,
-0.02685546875,
0.005767822265625,
0.011871337890625,
-0.068359375,
0.07012939453125,
-0.041473388671875,
-0.01192474365234375,
0.0029087066650390625,
0.051116943359375,
0.01129150390625,
0.01555633544921875,
0.04388427734375,
0.0712890625,
0.041351318359375,
-0.01690673828125,
0.0732421875,
-0.0009937286376953125,
0.041412353515625,
0.048736572265625,
0.031768798828125,
0.03948974609375,
0.0215301513671875,
-0.0181121826171875,
0.026458740234375,
0.0799560546875,
-0.024627685546875,
0.0235443115234375,
0.022216796875,
0.007274627685546875,
-0.005741119384765625,
0.007663726806640625,
-0.0323486328125,
0.03668212890625,
0.01081085205078125,
-0.04095458984375,
-0.018798828125,
0.003986358642578125,
0.003509521484375,
-0.021209716796875,
-0.019317626953125,
0.035491943359375,
0.004711151123046875,
-0.0293426513671875,
0.06982421875,
0.01229095458984375,
0.0672607421875,
-0.032684326171875,
0.00043010711669921875,
-0.024505615234375,
0.018218994140625,
-0.0270233154296875,
-0.052032470703125,
0.0249176025390625,
-0.0216827392578125,
-0.00554656982421875,
0.002758026123046875,
0.052001953125,
-0.023101806640625,
-0.03424072265625,
0.016387939453125,
0.018646240234375,
0.0394287109375,
0.00876617431640625,
-0.09765625,
0.01561737060546875,
0.001514434814453125,
-0.052093505859375,
0.02679443359375,
0.0335693359375,
0.01129913330078125,
0.0596923828125,
0.04119873046875,
-0.0062408447265625,
0.00699615478515625,
-0.01139068603515625,
0.0615234375,
-0.0284881591796875,
-0.01727294921875,
-0.059600830078125,
0.042388916015625,
-0.00998687744140625,
-0.041717529296875,
0.03594970703125,
0.0404052734375,
0.058502197265625,
0.0013980865478515625,
0.0296783447265625,
-0.02349853515625,
-0.005908966064453125,
-0.0309600830078125,
0.057830810546875,
-0.060943603515625,
0.00022542476654052734,
-0.00421905517578125,
-0.05035400390625,
-0.0275726318359375,
0.052398681640625,
-0.01153564453125,
0.033294677734375,
0.03515625,
0.07891845703125,
-0.0269012451171875,
-0.030426025390625,
0.0126190185546875,
0.012664794921875,
0.00893402099609375,
0.0309295654296875,
0.022216796875,
-0.057708740234375,
0.02392578125,
-0.05010986328125,
-0.0201873779296875,
-0.01204681396484375,
-0.054168701171875,
-0.061431884765625,
-0.06494140625,
-0.045867919921875,
-0.050048828125,
-0.00798797607421875,
0.07183837890625,
0.08251953125,
-0.047882080078125,
-0.0110931396484375,
-0.0007758140563964844,
0.0141143798828125,
-0.0294036865234375,
-0.0172882080078125,
0.050201416015625,
-0.02197265625,
-0.050506591796875,
-0.0245513916015625,
0.0024242401123046875,
0.0175323486328125,
0.0010538101196289062,
-0.0172271728515625,
-0.01502227783203125,
-0.018463134765625,
0.0143280029296875,
0.022613525390625,
-0.04437255859375,
-0.01325225830078125,
-0.0195159912109375,
-0.011871337890625,
0.02325439453125,
0.037994384765625,
-0.03363037109375,
0.0247650146484375,
0.0300445556640625,
0.032562255859375,
0.0574951171875,
-0.030792236328125,
0.0040740966796875,
-0.0626220703125,
0.043548583984375,
-0.0103912353515625,
0.034454345703125,
0.033294677734375,
-0.0292510986328125,
0.050048828125,
0.0253448486328125,
-0.034149169921875,
-0.06671142578125,
-0.00545501708984375,
-0.0787353515625,
-0.01499176025390625,
0.06805419921875,
-0.038848876953125,
-0.036590576171875,
0.041748046875,
0.00550079345703125,
0.050872802734375,
-0.006313323974609375,
0.034393310546875,
0.016265869140625,
-0.010009765625,
-0.048492431640625,
-0.043212890625,
0.030181884765625,
0.01561737060546875,
-0.040863037109375,
-0.0287017822265625,
-0.00270843505859375,
0.04925537109375,
0.01505279541015625,
0.038726806640625,
-0.007556915283203125,
0.0108489990234375,
0.0119781494140625,
0.039642333984375,
-0.042266845703125,
-0.0072784423828125,
-0.02398681640625,
0.006702423095703125,
-0.004364013671875,
-0.046661376953125
]
] |
shibing624/macbert4csc-base-chinese | 2023-10-09T03:23:54.000Z | [
"transformers",
"pytorch",
"onnx",
"safetensors",
"bert",
"fill-mask",
"zh",
"arxiv:2004.13922",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | shibing624 | null | null | shibing624/macbert4csc-base-chinese | 62 | 12,263 | transformers | 2022-03-02T23:29:05 | ---
language:
- zh
tags:
- bert
- pytorch
- zh
license: "apache-2.0"
---
# MacBERT for Chinese Spelling Correction (macbert4csc) Model
A Chinese spelling correction model.
`macbert4csc-base-chinese` evaluated on the SIGHAN2015 test data:
- Char Level: precision:0.9372, recall:0.8640, f1:0.8991
- Sentence Level: precision:0.8264, recall:0.7366, f1:0.7789
Because the training data includes the SIGHAN2015 training set (to reproduce the paper), the model reaches SOTA on the SIGHAN2015 test set.
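At the sentence level, a prediction only counts as correct when the fully corrected sentence matches the reference. The function below is one way such precision/recall/F1 can be computed from (source, reference, prediction) triples; it is an illustrative sketch, not the official SIGHAN scorer, whose protocol has additional details:
```python
def sentence_level_prf(srcs, tgts, preds):
    """Correction-level P/R/F1 over (source, reference, prediction) triples."""
    tp = fp = fn = 0
    for src, tgt, pred in zip(srcs, tgts, preds):
        if src != tgt:            # the sentence contains errors
            if pred == tgt:
                tp += 1           # all errors fixed correctly
            else:
                fn += 1           # errors missed or fixed wrongly
        elif pred != src:
            fp += 1               # model "corrected" an already-correct sentence
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1
print(sentence_level_prf(['今天新情很好'], ['今天心情很好'], ['今天心情很好']))  # -> (1.0, 1.0, 1.0)
```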
Model architecture, adapted from SoftMasked-BERT:

## Usage
This model is open-sourced as part of the Chinese text error correction project [pycorrector](https://github.com/shibing624/pycorrector), which supports the macbert4csc model and can be called as follows:
```python
from pycorrector.macbert.macbert_corrector import MacBertCorrector
nlp = MacBertCorrector("shibing624/macbert4csc-base-chinese").macbert_correct
i = nlp('今天新情很好')
print(i)
```
Of course, you can also call it through the official huggingface/transformers:
*Please use the BERT-related classes to load this model!*
```python
import operator
import torch
from transformers import BertTokenizer, BertForMaskedLM
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
tokenizer = BertTokenizer.from_pretrained("shibing624/macbert4csc-base-chinese")
model = BertForMaskedLM.from_pretrained("shibing624/macbert4csc-base-chinese")
model.to(device)
texts = ["今天新情很好", "你找到你最喜欢的工作,我也很高心。"]
with torch.no_grad():
    outputs = model(**tokenizer(texts, padding=True, return_tensors='pt').to(device))
def get_errors(corrected_text, origin_text):
    sub_details = []
    for i, ori_char in enumerate(origin_text):
        if ori_char in [' ', '“', '”', '‘', '’', '琊', '\n', '…', '—', '擤']:
            # add unk word
            corrected_text = corrected_text[:i] + ori_char + corrected_text[i:]
            continue
        if i >= len(corrected_text):
            continue
        if ori_char != corrected_text[i]:
            if ori_char.lower() == corrected_text[i]:
                # pass english upper char
                corrected_text = corrected_text[:i] + ori_char + corrected_text[i + 1:]
                continue
            sub_details.append((ori_char, corrected_text[i], i, i + 1))
    sub_details = sorted(sub_details, key=operator.itemgetter(2))
    return corrected_text, sub_details
result = []
for ids, text in zip(outputs.logits, texts):
    _text = tokenizer.decode(torch.argmax(ids, dim=-1), skip_special_tokens=True).replace(' ', '')
    corrected_text = _text[:len(text)]
    corrected_text, details = get_errors(corrected_text, text)
    print(text, ' => ', corrected_text, details)
    result.append((corrected_text, details))
print(result)
```
output:
```shell
今天新情很好 => 今天心情很好 [('新', '心', 2, 3)]
你找到你最喜欢的工作,我也很高心。 => 你找到你最喜欢的工作,我也很高兴。 [('心', '兴', 15, 16)]
```
Model files:
```
macbert4csc-base-chinese
├── config.json
├── added_tokens.json
├── pytorch_model.bin
├── special_tokens_map.json
├── tokenizer_config.json
└── vocab.txt
```
### Training datasets
#### SIGHAN+Wang271K Chinese correction dataset
| Dataset | Corpus | Download link | Archive size |
| :------- | :--------- | :---------: | :---------: |
| **`SIGHAN+Wang271K Chinese correction dataset`** | SIGHAN+Wang271K (about 270k sentences) | [Baidu Netdisk (password: 01b9)](https://pan.baidu.com/s/1BV5tr9eONZCI0wERFvr0gQ)| 106M |
| **`Original SIGHAN dataset`** | SIGHAN13 14 15 | [official csc.html](http://nlp.ee.ncu.edu.tw/resource/csc.html)| 339K |
| **`Original Wang271K dataset`** | Wang271K | [Automatic-Corpus-Generation (provided by dimmywang)](https://github.com/wdimmy/Automatic-Corpus-Generation/blob/master/corpus/train.sgml)| 93M |
SIGHAN+Wang271K Chinese correction dataset, data format:
```json
[
{
"id": "B2-4029-3",
"original_text": "晚间会听到嗓音,白天的时候大家都不会太在意,但是在睡觉的时候这嗓音成为大家的恶梦。",
"wrong_ids": [
5,
31
],
"correct_text": "晚间会听到噪音,白天的时候大家都不会太在意,但是在睡觉的时候这噪音成为大家的恶梦。"
},
]
```
```shell
macbert4csc
├── config.json
├── pytorch_model.bin
├── special_tokens_map.json
├── tokenizer_config.json
└── vocab.txt
```
To train macbert4csc, see [https://github.com/shibing624/pycorrector/tree/master/pycorrector/macbert](https://github.com/shibing624/pycorrector/tree/master/pycorrector/macbert)
### About MacBERT
**MacBERT** is an improved BERT with a novel **M**LM **a**s **c**orrection pre-training task, which mitigates the discrepancy between pre-training and fine-tuning.
Here is an example of our pre-training task.
| task | Example |
| -------------- | ----------------- |
| **Original Sentence** | we use a language model to predict the probability of the next word. |
| **MLM** | we use a language [M] to [M] ##di ##ct the pro [M] ##bility of the next word . |
| **Whole word masking** | we use a language [M] to [M] [M] [M] the [M] [M] [M] of the next word . |
| **N-gram masking** | we use a [M] [M] to [M] [M] [M] the [M] [M] [M] [M] [M] next word . |
| **MLM as correction** | we use a text system to ca ##lc ##ulate the po ##si ##bility of the next word . |
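The sketch below illustrates the idea behind "MLM as correction": masked positions are filled with similar words instead of a `[MASK]` symbol. The `SIMILAR` table is a toy stand-in for the word-similarity model used in the paper, so treat this as illustrative pseudologic rather than the authors' implementation:
```python
import random
# toy "similar word" table standing in for the paper's similarity model (assumption)
SIMILAR = {'language': 'text', 'model': 'system', 'predict': 'calculate', 'probability': 'possibility'}
def mlm_as_correction(tokens, mask_ratio=0.15):
    """Replace a fraction of tokens with similar words instead of a [MASK] symbol."""
    out = list(tokens)
    n_mask = max(1, int(len(tokens) * mask_ratio))
    for i in random.sample(range(len(tokens)), n_mask):
        out[i] = SIMILAR.get(out[i], out[i])  # fall back to the original token if no similar word is known
    return out
print(mlm_as_correction("we use a language model to predict the probability of the next word".split()))
```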
In addition to the new pre-training task, we also incorporate the following techniques.
- Whole Word Masking (WWM)
- N-gram masking
- Sentence-Order Prediction (SOP)
**Note that our MacBERT can be directly replaced with the original BERT as there are no differences in the main neural architecture.**
For more technical details, please check our paper: [Revisiting Pre-trained Models for Chinese Natural Language Processing](https://arxiv.org/abs/2004.13922)
## Citation
```latex
@software{pycorrector,
author = {Xu Ming},
title = {pycorrector: Text Error Correction Tool},
year = {2021},
url = {https://github.com/shibing624/pycorrector},
}
```
| 5,397 | [
[
-0.0128631591796875,
-0.04736328125,
0.0020236968994140625,
0.0257415771484375,
-0.0179595947265625,
-0.01116943359375,
-0.0270538330078125,
-0.0245361328125,
0.026519775390625,
0.026031494140625,
-0.04010009765625,
-0.055816650390625,
-0.04119873046875,
0.02178955078125,
-0.002384185791015625,
0.082763671875,
-0.01479339599609375,
0.007572174072265625,
0.0272979736328125,
-0.006763458251953125,
-0.017333984375,
-0.05279541015625,
-0.051971435546875,
-0.00942230224609375,
0.033966064453125,
0.01861572265625,
0.037841796875,
0.032073974609375,
0.0372314453125,
0.0248870849609375,
-0.007293701171875,
0.022247314453125,
-0.01544189453125,
-0.02685546875,
0.0240936279296875,
-0.03814697265625,
-0.04437255859375,
0.01032257080078125,
0.046722412109375,
0.03399658203125,
0.0017557144165039062,
0.01004791259765625,
0.01264190673828125,
0.0408935546875,
-0.036956787109375,
0.0025005340576171875,
-0.048614501953125,
0.0034656524658203125,
-0.02972412109375,
-0.0101318359375,
-0.023895263671875,
-0.022003173828125,
-0.002361297607421875,
-0.046600341796875,
0.02386474609375,
0.003437042236328125,
0.11065673828125,
0.0003414154052734375,
-0.0142364501953125,
-0.026947021484375,
-0.0191802978515625,
0.06134033203125,
-0.0826416015625,
0.004913330078125,
0.0227508544921875,
-0.005733489990234375,
-0.021392822265625,
-0.0892333984375,
-0.059783935546875,
-0.01366424560546875,
-0.0259857177734375,
0.0304718017578125,
0.006465911865234375,
0.01049041748046875,
0.0277252197265625,
0.021240234375,
-0.034515380859375,
-0.015411376953125,
-0.038665771484375,
-0.03997802734375,
0.04071044921875,
-0.0030918121337890625,
0.03424072265625,
-0.029510498046875,
-0.0142669677734375,
-0.0229644775390625,
-0.0245361328125,
0.026214599609375,
0.0291290283203125,
0.0196533203125,
-0.014190673828125,
0.03875732421875,
-0.00010889768600463867,
0.042877197265625,
0.0095367431640625,
-0.017669677734375,
0.05096435546875,
-0.0235595703125,
-0.0276336669921875,
0.00931549072265625,
0.09100341796875,
0.0208282470703125,
0.010528564453125,
0.00806427001953125,
-0.014678955078125,
-0.0032978057861328125,
-0.0158538818359375,
-0.050048828125,
-0.0465087890625,
0.0247650146484375,
-0.042449951171875,
-0.0181732177734375,
0.01557159423828125,
-0.0297393798828125,
0.008880615234375,
0.0014171600341796875,
0.046783447265625,
-0.048431396484375,
-0.0087127685546875,
0.00803375244140625,
-0.01346588134765625,
0.00024187564849853516,
-0.0002562999725341797,
-0.07000732421875,
-0.0013666152954101562,
0.0284271240234375,
0.059722900390625,
0.01222991943359375,
-0.0430908203125,
-0.0282745361328125,
-0.00403594970703125,
-0.0238800048828125,
0.019012451171875,
-0.0023097991943359375,
-0.01166534423828125,
-0.00298309326171875,
0.0219879150390625,
-0.0175628662109375,
-0.0274810791015625,
0.053680419921875,
-0.0188140869140625,
0.034423828125,
-0.028289794921875,
-0.0180206298828125,
-0.020111083984375,
0.018341064453125,
-0.037353515625,
0.09698486328125,
-0.002460479736328125,
-0.08599853515625,
0.0176544189453125,
-0.04669189453125,
-0.04205322265625,
-0.00913238525390625,
0.00629425048828125,
-0.037506103515625,
-0.0211639404296875,
0.04144287109375,
0.041168212890625,
0.0099945068359375,
0.0076751708984375,
0.006916046142578125,
-0.0260467529296875,
0.02972412109375,
-0.0191650390625,
0.0916748046875,
0.0192718505859375,
-0.044586181640625,
0.0245361328125,
-0.062469482421875,
0.00630950927734375,
0.0105133056640625,
-0.03570556640625,
-0.00492095947265625,
-0.01261138916015625,
0.0169677734375,
0.0231475830078125,
0.0430908203125,
-0.03985595703125,
0.00673675537109375,
-0.051513671875,
0.04376220703125,
0.05340576171875,
-0.016937255859375,
0.0142669677734375,
-0.0243377685546875,
0.0192718505859375,
0.01458740234375,
0.00970458984375,
-0.0186614990234375,
-0.03997802734375,
-0.08099365234375,
-0.023834228515625,
0.033111572265625,
0.053436279296875,
-0.06146240234375,
0.055511474609375,
-0.01076507568359375,
-0.036468505859375,
-0.046844482421875,
0.004169464111328125,
0.026885986328125,
0.038360595703125,
0.04278564453125,
-0.0169219970703125,
-0.060302734375,
-0.04364013671875,
-0.0164031982421875,
-0.018707275390625,
0.0027484893798828125,
0.01116180419921875,
0.040252685546875,
-0.033477783203125,
0.0498046875,
-0.036224365234375,
-0.031524658203125,
-0.027862548828125,
0.023345947265625,
0.041290283203125,
0.041748046875,
0.034912109375,
-0.03887939453125,
-0.038970947265625,
-0.0066375732421875,
-0.043731689453125,
-0.0101776123046875,
-0.0220794677734375,
-0.0270843505859375,
0.0413818359375,
0.0308074951171875,
-0.0289764404296875,
0.028289794921875,
0.03485107421875,
-0.01412200927734375,
0.05902099609375,
-0.03717041015625,
0.007080078125,
-0.09228515625,
0.0142364501953125,
-0.0164642333984375,
0.005340576171875,
-0.0438232421875,
-0.0007233619689941406,
0.0243377685546875,
0.0181427001953125,
-0.0312042236328125,
0.029449462890625,
-0.04266357421875,
0.028533935546875,
-0.00616455078125,
0.0277557373046875,
-0.0033473968505859375,
0.0499267578125,
-0.0026798248291015625,
0.042388916015625,
0.03814697265625,
-0.055877685546875,
0.0240020751953125,
0.024566650390625,
-0.0177001953125,
-0.01418304443359375,
-0.04132080078125,
-0.00533294677734375,
0.0063018798828125,
0.026763916015625,
-0.0804443359375,
-0.0081024169921875,
0.044189453125,
-0.056732177734375,
0.015533447265625,
0.0159149169921875,
-0.0355224609375,
-0.0452880859375,
-0.034820556640625,
0.0178680419921875,
0.03173828125,
-0.047760009765625,
0.03485107421875,
0.001800537109375,
0.0039005279541015625,
-0.07232666015625,
-0.07177734375,
0.00835418701171875,
0.0085906982421875,
-0.0570068359375,
0.04241943359375,
-0.00533294677734375,
0.01300048828125,
-0.01154327392578125,
0.00919342041015625,
0.006412506103515625,
-0.0028324127197265625,
0.00449371337890625,
0.038360595703125,
-0.0197906494140625,
-0.0014934539794921875,
-0.00750732421875,
-0.002948760986328125,
-0.0035495758056640625,
-0.0158843994140625,
0.04339599609375,
-0.0048065185546875,
-0.004573822021484375,
-0.0426025390625,
-0.002532958984375,
0.0240936279296875,
-0.02960205078125,
0.056793212890625,
0.07867431640625,
-0.023956298828125,
0.01336669921875,
-0.04144287109375,
-0.007389068603515625,
-0.03851318359375,
0.034912109375,
-0.0297393798828125,
-0.05108642578125,
0.041290283203125,
0.01641845703125,
0.020263671875,
0.06524658203125,
0.032501220703125,
-0.0024871826171875,
0.06903076171875,
0.0296783447265625,
-0.0191192626953125,
0.03271484375,
-0.040863037109375,
0.033721923828125,
-0.06268310546875,
-0.0176239013671875,
-0.032257080078125,
-0.0197906494140625,
-0.05877685546875,
-0.016754150390625,
0.006687164306640625,
0.019683837890625,
-0.01409912109375,
0.034149169921875,
-0.06134033203125,
-0.0043792724609375,
0.052825927734375,
0.0123443603515625,
-0.00603485107421875,
-0.001697540283203125,
-0.04071044921875,
-0.01219940185546875,
-0.0450439453125,
-0.040191650390625,
0.07379150390625,
0.0189666748046875,
0.035797119140625,
-0.00726318359375,
0.055206298828125,
-0.012451171875,
0.009368896484375,
-0.04180908203125,
0.0457763671875,
-0.022796630859375,
-0.039703369140625,
-0.01715087890625,
-0.026336669921875,
-0.07122802734375,
0.0180816650390625,
-0.02813720703125,
-0.0654296875,
0.00519561767578125,
0.006580352783203125,
-0.0239105224609375,
0.035400390625,
-0.05450439453125,
0.071044921875,
-0.02984619140625,
-0.0125274658203125,
0.000056743621826171875,
-0.06005859375,
0.035614013671875,
0.01155853271484375,
0.012237548828125,
0.000926971435546875,
0.00516510009765625,
0.07952880859375,
-0.044219970703125,
0.0489501953125,
-0.003688812255859375,
-0.0030117034912109375,
0.022369384765625,
-0.019287109375,
0.040863037109375,
-0.00988006591796875,
-0.0066375732421875,
0.0272979736328125,
0.013580322265625,
-0.028045654296875,
-0.021026611328125,
0.049957275390625,
-0.0648193359375,
-0.03546142578125,
-0.05584716796875,
-0.03045654296875,
0.016754150390625,
0.026092529296875,
0.058502197265625,
0.029693603515625,
0.00566864013671875,
-0.00022459030151367188,
0.02252197265625,
-0.04180908203125,
0.04571533203125,
0.01690673828125,
-0.017913818359375,
-0.0494384765625,
0.0675048828125,
0.020904541015625,
0.0045166015625,
0.032989501953125,
0.0098114013671875,
-0.0204315185546875,
-0.0308837890625,
-0.0194091796875,
0.03369140625,
-0.0516357421875,
0.0011272430419921875,
-0.07080078125,
-0.032745361328125,
-0.06536865234375,
-0.00342559814453125,
-0.0033111572265625,
-0.0305633544921875,
-0.03912353515625,
0.00392913818359375,
0.023223876953125,
0.02655029296875,
-0.01082611083984375,
0.031829833984375,
-0.060302734375,
0.02496337890625,
-0.00339508056640625,
-0.0024471282958984375,
0.00043463706970214844,
-0.049713134765625,
-0.035888671875,
0.01412200927734375,
-0.038360595703125,
-0.05120849609375,
0.0533447265625,
0.006404876708984375,
0.0304107666015625,
0.03143310546875,
0.0144195556640625,
0.06597900390625,
-0.03363037109375,
0.08502197265625,
0.025970458984375,
-0.0814208984375,
0.040283203125,
-0.00004297494888305664,
0.0153350830078125,
0.0213775634765625,
0.0153350830078125,
-0.059326171875,
-0.030242919921875,
-0.054229736328125,
-0.0810546875,
0.0814208984375,
0.0246734619140625,
0.0083160400390625,
-0.0042572021484375,
0.0179443359375,
-0.008819580078125,
0.0033740997314453125,
-0.07012939453125,
-0.04730224609375,
-0.03814697265625,
-0.030120849609375,
-0.016326904296875,
-0.02752685546875,
0.003753662109375,
-0.035125732421875,
0.087158203125,
0.00739288330078125,
0.031402587890625,
0.039306640625,
-0.01438140869140625,
0.01412200927734375,
0.0028324127197265625,
0.051971435546875,
0.035003662109375,
-0.0233917236328125,
0.0094757080078125,
0.0306854248046875,
-0.06170654296875,
-0.01367950439453125,
0.011322021484375,
-0.007259368896484375,
0.028900146484375,
0.031463623046875,
0.065673828125,
0.0015354156494140625,
-0.033477783203125,
0.037139892578125,
-0.006938934326171875,
-0.032470703125,
-0.03106689453125,
-0.003894805908203125,
0.004978179931640625,
-0.001941680908203125,
0.033905029296875,
0.012451171875,
-0.006031036376953125,
-0.0355224609375,
0.0033111572265625,
0.019256591796875,
-0.0103759765625,
-0.0101318359375,
0.057037353515625,
0.0018157958984375,
-0.0198516845703125,
0.040008544921875,
-0.0118408203125,
-0.07061767578125,
0.04498291015625,
0.040771484375,
0.056854248046875,
-0.00047278404235839844,
0.007190704345703125,
0.061248779296875,
0.0389404296875,
-0.01271820068359375,
0.021209716796875,
0.00550079345703125,
-0.055511474609375,
-0.0148773193359375,
-0.03790283203125,
0.0024356842041015625,
0.0178680419921875,
-0.0234222412109375,
0.0255279541015625,
-0.049102783203125,
-0.0198822021484375,
-0.0027599334716796875,
0.0225830078125,
-0.0286407470703125,
0.03314208984375,
-0.002132415771484375,
0.05828857421875,
-0.040863037109375,
0.07452392578125,
0.044921875,
-0.045623779296875,
-0.09100341796875,
0.0186767578125,
-0.02423095703125,
-0.05706787109375,
0.06561279296875,
0.025482177734375,
-0.0011186599731445312,
0.001415252685546875,
-0.04180908203125,
-0.058563232421875,
0.0810546875,
0.014923095703125,
-0.0280609130859375,
0.0023097991943359375,
-0.0006890296936035156,
0.03790283203125,
-0.00583648681640625,
0.038787841796875,
0.03717041015625,
0.037933349609375,
0.00269317626953125,
-0.06207275390625,
0.019989013671875,
-0.0280914306640625,
-0.0005168914794921875,
-0.003231048583984375,
-0.0465087890625,
0.08880615234375,
-0.01641845703125,
-0.0237579345703125,
0.0204620361328125,
0.0631103515625,
0.01528167724609375,
0.017974853515625,
0.02276611328125,
0.031402587890625,
0.06585693359375,
-0.0205535888671875,
0.05322265625,
-0.03082275390625,
0.030731201171875,
0.060211181640625,
0.012115478515625,
0.063720703125,
0.0274658203125,
-0.0235595703125,
0.044219970703125,
0.0653076171875,
-0.0082855224609375,
0.02984619140625,
0.01139068603515625,
-0.00922393798828125,
0.004421234130859375,
0.0187835693359375,
-0.0382080078125,
0.0170135498046875,
0.0138702392578125,
-0.037933349609375,
0.01473236083984375,
-0.003116607666015625,
0.017364501953125,
-0.007801055908203125,
-0.01453399658203125,
0.0367431640625,
-0.0030345916748046875,
-0.0504150390625,
0.06842041015625,
0.0325927734375,
0.0982666015625,
-0.0487060546875,
0.0103302001953125,
-0.0033702850341796875,
0.0275421142578125,
-0.03033447265625,
-0.032867431640625,
-0.0031375885009765625,
-0.00937652587890625,
-0.0052337646484375,
0.0027008056640625,
0.043365478515625,
-0.04046630859375,
-0.050018310546875,
0.0236968994140625,
-0.00453948974609375,
0.0260467529296875,
0.006816864013671875,
-0.07025146484375,
-0.007434844970703125,
0.0267791748046875,
-0.01447296142578125,
0.0013980865478515625,
0.037933349609375,
0.0105133056640625,
0.0263671875,
0.06549072265625,
0.0191802978515625,
0.0239105224609375,
0.004199981689453125,
0.04437255859375,
-0.05194091796875,
-0.035675048828125,
-0.0682373046875,
0.042724609375,
-0.012725830078125,
-0.04010009765625,
0.060455322265625,
0.04278564453125,
0.06903076171875,
-0.01438140869140625,
0.06158447265625,
-0.0182647705078125,
0.029449462890625,
-0.04071044921875,
0.068359375,
-0.046234130859375,
0.0088348388671875,
-0.041748046875,
-0.049591064453125,
-0.0372314453125,
0.06634521484375,
-0.020263671875,
0.0018863677978515625,
0.07037353515625,
0.07794189453125,
0.01605224609375,
-0.01383209228515625,
0.0152587890625,
0.0269317626953125,
0.0189666748046875,
0.0479736328125,
0.0328369140625,
-0.06982421875,
0.05810546875,
-0.035614013671875,
-0.01303863525390625,
-0.022918701171875,
-0.04766845703125,
-0.0721435546875,
-0.05859375,
-0.023895263671875,
-0.045562744140625,
-0.00860595703125,
0.07379150390625,
0.0242767333984375,
-0.07049560546875,
-0.01593017578125,
-0.005313873291015625,
0.004669189453125,
-0.03277587890625,
-0.018707275390625,
0.061187744140625,
-0.039154052734375,
-0.0595703125,
-0.00762176513671875,
0.0006098747253417969,
0.00011265277862548828,
0.00533294677734375,
-0.018829345703125,
-0.041534423828125,
0.00030875205993652344,
0.0286865234375,
0.0176849365234375,
-0.0706787109375,
-0.002796173095703125,
0.01448822021484375,
-0.034027099609375,
0.0119476318359375,
0.03533935546875,
-0.035369873046875,
0.032928466796875,
0.039306640625,
0.024017333984375,
0.037567138671875,
-0.0230255126953125,
0.0228118896484375,
-0.04608154296875,
0.0170135498046875,
-0.004669189453125,
0.040191650390625,
0.0277557373046875,
-0.034698486328125,
0.029052734375,
0.0330810546875,
-0.04132080078125,
-0.0482177734375,
-0.0193023681640625,
-0.0760498046875,
-0.023193359375,
0.07940673828125,
-0.0236358642578125,
-0.0295562744140625,
0.003917694091796875,
-0.06494140625,
0.06243896484375,
-0.01800537109375,
0.0556640625,
0.060089111328125,
0.0087127685546875,
-0.004390716552734375,
-0.022674560546875,
0.034942626953125,
0.043609619140625,
-0.03070068359375,
-0.00539398193359375,
0.00864410400390625,
0.01861572265625,
0.0267791748046875,
0.053192138671875,
0.0022258758544921875,
0.016357421875,
-0.00556182861328125,
0.038330078125,
-0.0028705596923828125,
0.014923095703125,
-0.008209228515625,
-0.0140228271484375,
-0.007251739501953125,
-0.048431396484375
]
] |
ai-forever/ruBert-base | 2023-11-03T12:50:38.000Z | [
"transformers",
"pytorch",
"bert",
"fill-mask",
"PyTorch",
"Transformers",
"exbert",
"ru",
"arxiv:2309.10931",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | ai-forever | null | null | ai-forever/ruBert-base | 13 | 12,259 | transformers | 2022-03-02T23:29:05 | ---
language:
- ru
tags:
- PyTorch
- Transformers
- bert
- exbert
pipeline_tag: fill-mask
thumbnail: "https://github.com/sberbank-ai/model-zoo"
license: apache-2.0
---
# ruBert-base
The model architecture design, pretraining, and evaluation are documented in our preprint: [**A Family of Pretrained Transformer Language Models for Russian**](https://arxiv.org/abs/2309.10931).
The model is pretrained by the [SberDevices](https://sberdevices.ru/) team.
* Task: `mask filling`
* Type: `encoder`
* Tokenizer: `BPE`
* Dict size: `120 138`
* Num Parameters: `178 M`
* Training Data Volume: `30 GB`
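A minimal usage sketch with the Hugging Face Transformers fill-mask pipeline; the example sentence is arbitrary, and the mask token is taken from the tokenizer rather than hard-coded:
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline
tokenizer = AutoTokenizer.from_pretrained('ai-forever/ruBert-base')
model = AutoModelForMaskedLM.from_pretrained('ai-forever/ruBert-base')
fill_mask = pipeline('fill-mask', model=model, tokenizer=tokenizer)
# insert the tokenizer's own mask token so the sketch does not assume its exact string form
text = f'Столица России - {tokenizer.mask_token}.'
for pred in fill_mask(text, top_k=3):
    print(pred['token_str'], round(pred['score'], 3))
```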
# Authors
+ NLP core team RnD [Telegram channel](https://t.me/nlpcoreteam):
+ Dmitry Zmitrovich
# Cite us
```
@misc{zmitrovich2023family,
title={A Family of Pretrained Transformer Language Models for Russian},
author={Dmitry Zmitrovich and Alexander Abramov and Andrey Kalmykov and Maria Tikhonova and Ekaterina Taktasheva and Danil Astafurov and Mark Baushenko and Artem Snegirev and Tatiana Shavrina and Sergey Markov and Vladislav Mikhailov and Alena Fenogenova},
year={2023},
eprint={2309.10931},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 1,190 | [
[
-0.0298309326171875,
-0.01132965087890625,
0.01007080078125,
0.01296234130859375,
-0.029632568359375,
0.00011867284774780273,
-0.0283355712890625,
-0.0118255615234375,
-0.0084686279296875,
0.0273284912109375,
-0.040252685546875,
-0.0245361328125,
-0.04852294921875,
-0.007381439208984375,
-0.01299285888671875,
0.0950927734375,
-0.0166473388671875,
0.0227203369140625,
-0.0129852294921875,
0.00968170166015625,
-0.01366424560546875,
-0.04425048828125,
-0.02490234375,
-0.04022216796875,
0.01055145263671875,
0.003871917724609375,
0.025177001953125,
0.031219482421875,
0.03424072265625,
0.024688720703125,
-0.012786865234375,
-0.017974853515625,
-0.0308685302734375,
0.002468109130859375,
-0.008026123046875,
-0.020233154296875,
-0.061065673828125,
-0.0057830810546875,
0.057647705078125,
0.043487548828125,
-0.0204620361328125,
0.0325927734375,
0.00934600830078125,
0.035003662109375,
-0.04437255859375,
-0.001331329345703125,
-0.038787841796875,
0.015716552734375,
-0.028076171875,
-0.00916290283203125,
-0.062255859375,
0.0040740966796875,
0.022735595703125,
-0.0296478271484375,
0.0216827392578125,
0.0047454833984375,
0.09478759765625,
0.01678466796875,
-0.0099334716796875,
0.004055023193359375,
-0.058929443359375,
0.05487060546875,
-0.049102783203125,
0.042572021484375,
0.025909423828125,
0.017578125,
-0.005218505859375,
-0.07476806640625,
-0.03253173828125,
-0.007205963134765625,
-0.03155517578125,
0.0021305084228515625,
-0.0272064208984375,
0.00867462158203125,
0.0254364013671875,
0.03375244140625,
-0.0643310546875,
-0.004550933837890625,
-0.03521728515625,
-0.0186767578125,
0.0177154541015625,
-0.00202178955078125,
-0.0025653839111328125,
0.007659912109375,
-0.030975341796875,
-0.0192718505859375,
-0.03643798828125,
-0.005947113037109375,
0.031951904296875,
0.0006079673767089844,
-0.030975341796875,
0.031219482421875,
-0.01800537109375,
0.065185546875,
0.0146942138671875,
0.0166473388671875,
0.05621337890625,
-0.029754638671875,
-0.022979736328125,
-0.0374755859375,
0.06878662109375,
-0.01568603515625,
0.01438140869140625,
-0.03155517578125,
-0.032012939453125,
-0.006366729736328125,
0.04132080078125,
-0.07183837890625,
-0.0159454345703125,
0.014129638671875,
-0.0264129638671875,
-0.01209259033203125,
0.0135650634765625,
-0.05194091796875,
0.00612640380859375,
-0.018310546875,
0.052398681640625,
-0.017364501953125,
-0.0293426513671875,
0.039398193359375,
-0.00199127197265625,
0.047332763671875,
-0.0013551712036132812,
-0.06280517578125,
0.044464111328125,
0.050750732421875,
0.03668212890625,
-0.006763458251953125,
-0.02667236328125,
-0.0157623291015625,
-0.0232391357421875,
0.001026153564453125,
0.0587158203125,
-0.020538330078125,
-0.002750396728515625,
-0.0040435791015625,
-0.00015914440155029297,
-0.006488800048828125,
-0.028411865234375,
0.03643798828125,
-0.0462646484375,
0.044464111328125,
0.01885986328125,
-0.017425537109375,
0.005237579345703125,
0.0159759521484375,
-0.0281219482421875,
0.0714111328125,
0.017181396484375,
-0.052337646484375,
0.02850341796875,
-0.05322265625,
-0.0157318115234375,
0.00916290283203125,
0.0164642333984375,
-0.04766845703125,
0.0012884140014648438,
0.0013189315795898438,
0.035736083984375,
-0.030975341796875,
0.034393310546875,
0.006343841552734375,
-0.01233673095703125,
0.0243682861328125,
-0.0056304931640625,
0.06317138671875,
0.0193328857421875,
-0.025299072265625,
0.0275421142578125,
-0.07427978515625,
0.007472991943359375,
-0.0032939910888671875,
-0.01163482666015625,
0.0010786056518554688,
-0.0216827392578125,
0.0163726806640625,
0.0222930908203125,
0.02984619140625,
-0.03717041015625,
0.007167816162109375,
-0.0182037353515625,
0.018646240234375,
0.052337646484375,
-0.01654052734375,
0.046600341796875,
-0.0154571533203125,
0.06494140625,
0.00518035888671875,
0.034271240234375,
-0.02044677734375,
-0.0123138427734375,
-0.078125,
-0.0196380615234375,
0.0465087890625,
0.03131103515625,
-0.033477783203125,
0.062042236328125,
-0.0333251953125,
-0.0587158203125,
-0.0364990234375,
-0.0218963623046875,
0.054473876953125,
0.0318603515625,
0.03436279296875,
-0.038787841796875,
-0.05999755859375,
-0.07281494140625,
-0.0008616447448730469,
0.00656890869140625,
-0.0024623870849609375,
-0.00827789306640625,
0.048095703125,
-0.0192413330078125,
0.06744384765625,
-0.0207672119140625,
-0.0021610260009765625,
-0.03619384765625,
0.016876220703125,
0.032623291015625,
0.064697265625,
0.03668212890625,
-0.04156494140625,
-0.042724609375,
-0.0067901611328125,
-0.022186279296875,
-0.019561767578125,
0.0007257461547851562,
-0.030975341796875,
0.0168609619140625,
0.0033206939697265625,
-0.057037353515625,
0.034942626953125,
0.03936767578125,
-0.044281005859375,
0.060546875,
0.006237030029296875,
-0.006481170654296875,
-0.08642578125,
0.0268096923828125,
-0.017669677734375,
-0.0224151611328125,
-0.07098388671875,
-0.0058135986328125,
-0.0010280609130859375,
-0.0122833251953125,
-0.0406494140625,
0.044952392578125,
-0.053955078125,
-0.014434814453125,
-0.004199981689453125,
-0.0063934326171875,
0.00626373291015625,
0.046112060546875,
0.0243072509765625,
0.039764404296875,
0.06634521484375,
-0.041717529296875,
0.0043487548828125,
0.02716064453125,
-0.032684326171875,
0.02227783203125,
-0.080810546875,
0.0301361083984375,
-0.0040740966796875,
0.0255889892578125,
-0.04937744140625,
0.014678955078125,
0.034698486328125,
-0.040924072265625,
0.0472412109375,
-0.0304412841796875,
-0.0408935546875,
-0.0263671875,
-0.001026153564453125,
0.041046142578125,
0.055389404296875,
-0.0243072509765625,
0.061431884765625,
0.0247802734375,
0.0066680908203125,
-0.0694580078125,
-0.0389404296875,
-0.0189361572265625,
-0.00511932373046875,
-0.055938720703125,
0.033966064453125,
-0.01092529296875,
0.00717926025390625,
0.005886077880859375,
-0.0031909942626953125,
-0.029052734375,
-0.0068206787109375,
0.01129913330078125,
0.036041259765625,
-0.032684326171875,
-0.003986358642578125,
-0.0361328125,
-0.03485107421875,
-0.00836944580078125,
-0.0161590576171875,
0.0814208984375,
-0.0266876220703125,
-0.01337432861328125,
-0.03363037109375,
-0.000827789306640625,
0.0198822021484375,
-0.04449462890625,
0.064208984375,
0.059783935546875,
-0.0308685302734375,
-0.0177154541015625,
-0.053985595703125,
-0.006969451904296875,
-0.03729248046875,
0.0298919677734375,
-0.027374267578125,
-0.054473876953125,
0.050048828125,
0.015777587890625,
-0.0028476715087890625,
0.03436279296875,
0.040557861328125,
0.017547607421875,
0.0254058837890625,
0.043182373046875,
0.0142822265625,
0.045623779296875,
-0.043975830078125,
0.01416778564453125,
-0.076416015625,
-0.0188446044921875,
-0.05426025390625,
-0.0225372314453125,
-0.0179595947265625,
-0.03399658203125,
-0.0003058910369873047,
-0.0006375312805175781,
-0.0343017578125,
0.057220458984375,
-0.037353515625,
0.038909912109375,
0.0228271484375,
-0.0191192626953125,
-0.0023097991943359375,
-0.011474609375,
0.00010776519775390625,
-0.01068115234375,
-0.055938720703125,
-0.043853759765625,
0.0877685546875,
0.030914306640625,
0.0601806640625,
-0.01216888427734375,
0.06292724609375,
-0.0168609619140625,
0.030029296875,
-0.046142578125,
0.044189453125,
0.001216888427734375,
-0.05938720703125,
-0.0037746429443359375,
-0.0218505859375,
-0.076416015625,
0.0268096923828125,
-0.02325439453125,
-0.04376220703125,
0.0215911865234375,
0.023712158203125,
-0.0162200927734375,
0.01343536376953125,
-0.04290771484375,
0.09228515625,
-0.01244354248046875,
-0.006000518798828125,
-0.0154876708984375,
-0.050628662109375,
0.0187225341796875,
-0.0029163360595703125,
-0.0011415481567382812,
0.01873779296875,
0.015777587890625,
0.06689453125,
-0.0192718505859375,
0.05517578125,
-0.026397705078125,
0.0254669189453125,
0.0005040168762207031,
-0.0122528076171875,
0.0236053466796875,
-0.0166015625,
-0.004573822021484375,
0.02130126953125,
-0.01416778564453125,
-0.0193939208984375,
-0.0330810546875,
0.01371002197265625,
-0.06488037109375,
-0.0242462158203125,
-0.057220458984375,
-0.0215911865234375,
-0.00656890869140625,
0.0364990234375,
0.032806396484375,
0.05230712890625,
-0.02099609375,
0.04376220703125,
0.048248291015625,
-0.0094757080078125,
0.026153564453125,
0.061553955078125,
-0.0171966552734375,
-0.03155517578125,
0.03948974609375,
-0.007190704345703125,
0.0243072509765625,
0.044677734375,
0.01285552978515625,
-0.008331298828125,
-0.05548095703125,
-0.0158538818359375,
0.034759521484375,
-0.047027587890625,
-0.02459716796875,
-0.049102783203125,
-0.035430908203125,
-0.032012939453125,
0.004638671875,
-0.058624267578125,
-0.027435302734375,
-0.0249786376953125,
-0.00858306884765625,
0.0009083747863769531,
0.0543212890625,
0.0017328262329101562,
0.045623779296875,
-0.054107666015625,
0.0195159912109375,
-0.004337310791015625,
0.038543701171875,
-0.005863189697265625,
-0.07501220703125,
-0.0423583984375,
-0.0174407958984375,
-0.022796630859375,
-0.032806396484375,
0.028289794921875,
0.00998687744140625,
0.06646728515625,
0.005352020263671875,
-0.00135040283203125,
0.049072265625,
-0.061279296875,
0.05712890625,
0.0031833648681640625,
-0.0794677734375,
0.024078369140625,
0.00310516357421875,
0.01617431640625,
0.040008544921875,
0.02642822265625,
-0.033294677734375,
-0.004055023193359375,
-0.062255859375,
-0.0692138671875,
0.06280517578125,
0.0107574462890625,
0.009796142578125,
0.021148681640625,
0.01580810546875,
0.0138702392578125,
0.017974853515625,
-0.0880126953125,
-0.028900146484375,
-0.03253173828125,
-0.00691986083984375,
0.00403594970703125,
-0.036773681640625,
-0.006778717041015625,
-0.0204620361328125,
0.07244873046875,
0.024688720703125,
0.032684326171875,
0.00620269775390625,
-0.0285186767578125,
0.002246856689453125,
0.0261688232421875,
0.0687255859375,
0.0618896484375,
-0.01861572265625,
-0.0009393692016601562,
0.0017881393432617188,
-0.04473876953125,
0.0009889602661132812,
0.0116119384765625,
-0.00441741943359375,
0.00910186767578125,
0.0299530029296875,
0.0950927734375,
0.01080322265625,
-0.0295257568359375,
0.0560302734375,
-0.0135650634765625,
-0.019378662109375,
-0.047210693359375,
-0.0166473388671875,
-0.01290130615234375,
0.0020503997802734375,
0.0274505615234375,
0.027191162109375,
-0.011138916015625,
-0.005664825439453125,
0.0223846435546875,
0.031829833984375,
-0.01910400390625,
-0.06231689453125,
0.02362060546875,
0.008392333984375,
-0.031951904296875,
0.0267333984375,
-0.0224151611328125,
-0.05419921875,
0.0166473388671875,
0.05206298828125,
0.0889892578125,
-0.024932861328125,
0.008392333984375,
0.039642333984375,
0.036773681640625,
-0.0011682510375976562,
0.0121307373046875,
-0.0019817352294921875,
-0.07269287109375,
-0.036224365234375,
-0.070556640625,
-0.01824951171875,
0.0246734619140625,
-0.0587158203125,
0.03662109375,
-0.0198211669921875,
-0.00641632080078125,
-0.01751708984375,
-0.01482391357421875,
-0.067138671875,
0.015838623046875,
0.0015926361083984375,
0.06683349609375,
-0.05841064453125,
0.064697265625,
0.054412841796875,
-0.016021728515625,
-0.04901123046875,
-0.01183319091796875,
-0.032958984375,
-0.05438232421875,
0.0714111328125,
0.00189208984375,
-0.00963592529296875,
0.0017461776733398438,
-0.02386474609375,
-0.06854248046875,
0.073974609375,
0.017425537109375,
-0.0234527587890625,
-0.001926422119140625,
0.007781982421875,
0.04827880859375,
-0.039581298828125,
0.023712158203125,
0.02685546875,
0.03948974609375,
-0.001613616943359375,
-0.07684326171875,
-0.01474761962890625,
-0.050262451171875,
0.005039215087890625,
0.0089874267578125,
-0.0274658203125,
0.069580078125,
0.004215240478515625,
-0.0247650146484375,
-0.0006561279296875,
0.04290771484375,
-0.0033168792724609375,
-0.014556884765625,
0.0400390625,
0.065673828125,
0.0235595703125,
-0.0175628662109375,
0.0684814453125,
-0.035247802734375,
0.0252838134765625,
0.08795166015625,
0.0071868896484375,
0.060272216796875,
0.033477783203125,
-0.032806396484375,
0.043182373046875,
0.03082275390625,
-0.006134033203125,
0.040069580078125,
0.02471923828125,
0.002986907958984375,
-0.0119476318359375,
0.0185089111328125,
-0.039031982421875,
0.0275726318359375,
0.015716552734375,
-0.0189971923828125,
-0.00001615285873413086,
-0.0110626220703125,
0.00205230712890625,
-0.0158843994140625,
0.003711700439453125,
0.050537109375,
-0.0045623779296875,
-0.04595947265625,
0.051422119140625,
-0.01090240478515625,
0.05694580078125,
-0.07098388671875,
0.01442718505859375,
-0.0015125274658203125,
0.017181396484375,
-0.014007568359375,
-0.0355224609375,
0.0258026123046875,
-0.000045239925384521484,
-0.0211639404296875,
-0.040008544921875,
0.047760009765625,
-0.038787841796875,
-0.0221710205078125,
0.004058837890625,
0.0234832763671875,
0.01316070556640625,
0.02386474609375,
-0.040374755859375,
0.0118408203125,
-0.009429931640625,
-0.0501708984375,
0.0205841064453125,
0.0231170654296875,
0.0222930908203125,
0.0447998046875,
0.050567626953125,
0.0197601318359375,
0.0281219482421875,
0.0137481689453125,
0.07086181640625,
-0.0275421142578125,
-0.034149169921875,
-0.058349609375,
0.07891845703125,
0.005657196044921875,
-0.04473876953125,
0.03729248046875,
0.048858642578125,
0.0750732421875,
-0.04241943359375,
0.03619384765625,
-0.01435089111328125,
0.03387451171875,
-0.030487060546875,
0.05499267578125,
-0.02130126953125,
0.00543975830078125,
-0.006046295166015625,
-0.0826416015625,
-0.0284271240234375,
0.059051513671875,
-0.01511383056640625,
0.002330780029296875,
0.064208984375,
0.058013916015625,
-0.0148773193359375,
-0.03131103515625,
0.0104827880859375,
0.0236663818359375,
0.019989013671875,
0.021636962890625,
0.05523681640625,
-0.043975830078125,
0.028564453125,
-0.031829833984375,
-0.0239105224609375,
-0.0014066696166992188,
-0.0789794921875,
-0.070556640625,
-0.0579833984375,
-0.0269317626953125,
-0.01171112060546875,
-0.0083465576171875,
0.0755615234375,
0.061737060546875,
-0.0616455078125,
-0.03448486328125,
-0.0008988380432128906,
-0.0146026611328125,
-0.005817413330078125,
-0.0108795166015625,
0.0169677734375,
-0.036285400390625,
-0.04559326171875,
0.01377105712890625,
-0.00048613548278808594,
0.013275146484375,
-0.023651123046875,
-0.019683837890625,
-0.011566162109375,
-0.0179595947265625,
0.0185699462890625,
0.01360321044921875,
-0.044830322265625,
-0.01543426513671875,
-0.0023670196533203125,
-0.0023555755615234375,
0.01422882080078125,
0.055450439453125,
-0.049560546875,
0.03131103515625,
0.050384521484375,
0.0367431640625,
0.04486083984375,
-0.009246826171875,
0.054168701171875,
-0.061065673828125,
0.034149169921875,
0.0303497314453125,
0.042083740234375,
0.0321044921875,
0.0006837844848632812,
0.03271484375,
0.01458740234375,
-0.056365966796875,
-0.0809326171875,
0.01800537109375,
-0.08416748046875,
0.0261077880859375,
0.0838623046875,
-0.0310211181640625,
-0.0185089111328125,
-0.00559234619140625,
-0.002902984619140625,
0.0202484130859375,
-0.0112152099609375,
0.025177001953125,
0.0595703125,
0.019622802734375,
-0.0171051025390625,
-0.0225677490234375,
0.052978515625,
0.02301025390625,
-0.048828125,
0.00031757354736328125,
0.008575439453125,
0.0232696533203125,
0.034210205078125,
0.05657958984375,
-0.0140533447265625,
0.015838623046875,
-0.00780487060546875,
0.053131103515625,
-0.0197296142578125,
-0.02850341796875,
-0.0258636474609375,
-0.0194854736328125,
0.00006562471389770508,
-0.0175018310546875
]
] |
Meina/Unreal_V4.1 | 2023-07-16T20:02:45.000Z | [
"diffusers",
"art",
"anime",
"meina",
"unreal",
"semirealistic",
"2.5d",
"sexy",
"fantasy",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Meina | null | null | Meina/Unreal_V4.1 | 4 | 12,252 | diffusers | 2023-07-16T19:59:21 | ---
license: creativeml-openrail-m
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- art
- anime
- meina
- unreal
- semirealistic
- 2.5d
- sexy
- fantasy
---
MeinaUnreal's objective is to produce anime art with a 2.5D feeling.
(the VAE is already baked into the model)
For examples and prompts, please check out: https://civitai.com/models/18798/meinaunreal
I have a Discord server where you can post images that you generated, discuss prompts, and/or ask for help:
https://discord.gg/XC9nGZNDUd
If you like one of my models and want to support their updates, I've made a Ko-fi page (https://ko-fi.com/meina) where you can buy me a coffee <3
and a Patreon page (https://www.patreon.com/MeinaMix) where you can support me and get access to betas of my models!
You may also try this model using Sinkin.ai: https://sinkin.ai/m/PREaKGN
Recommendations of use: Enable Quantization in K samplers.
Hires.fix is needed for prompts where the character is far away in order to make decent images; it drastically improves the quality of faces and eyes!
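The card does not ship a code sample, so here is a minimal, hedged sketch of loading this checkpoint with the diffusers `StableDiffusionPipeline` (the library tag on this repo), using roughly the parameters recommended below; the prompt is only a placeholder, not an official example.
```python
# Hedged sketch (not from the original card): load Meina/Unreal_V4.1 with diffusers
# and a Karras-sigma DPM++ scheduler, roughly matching the recommendations below.
import torch
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler

pipe = StableDiffusionPipeline.from_pretrained(
    "Meina/Unreal_V4.1", torch_dtype=torch.float16
).to("cuda")
# DPM++ 2M with Karras sigmas, as recommended in the parameter list below
pipe.scheduler = DPMSolverMultistepScheduler.from_config(
    pipe.scheduler.config, use_karras_sigmas=True
)

image = pipe(
    "1girl, fantasy, castle background, detailed eyes",   # placeholder prompt
    negative_prompt="(worst quality, low quality:1.4), monochrome, zombie",
    num_inference_steps=25,   # 20 to 40 steps recommended
    guidance_scale=7,         # CFG Scale 7
    width=512, height=768,    # portrait resolution
).images[0]
image.save("meina_unreal_sample.png")
```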
Recommended parameters:
Sampler: DPM++ 2M Karras: 20 to 40 steps.
Sampler: DPM++ SDE Karras: 20 to 30 steps.
CFG Scale: 7.
Resolutions: 512x768, 512x1024 for Portrait!
Resolutions: 768x512, 1024x512, 1536x512 for Landscape!
Hires.fix: R-ESRGAN 4x+Anime6b, with 15 steps at 0.3 denoising.
Clip Skip: 2.
Negatives: ' (worst quality, low quality:1.4), monochrome, zombie, (interlocked fingers), ' | 1,470 | [
[
-0.046661376953125,
-0.046173095703125,
0.04534912109375,
0.0239410400390625,
-0.02923583984375,
-0.0268707275390625,
0.01496124267578125,
-0.0440673828125,
0.043426513671875,
0.045928955078125,
-0.050537109375,
-0.04083251953125,
-0.026885986328125,
-0.01080322265625,
-0.01043701171875,
0.03607177734375,
-0.000820159912109375,
0.012054443359375,
-0.00797271728515625,
0.01000213623046875,
-0.05609130859375,
-0.0263214111328125,
-0.0692138671875,
-0.00376129150390625,
0.0482177734375,
0.043243408203125,
0.0279693603515625,
0.0229339599609375,
0.0279998779296875,
0.0244598388671875,
-0.01242828369140625,
-0.0060882568359375,
-0.043609619140625,
0.013275146484375,
-0.00853729248046875,
-0.052520751953125,
-0.053497314453125,
-0.003795623779296875,
0.020294189453125,
0.0213165283203125,
-0.0040130615234375,
0.0004949569702148438,
-0.0265655517578125,
0.06109619140625,
-0.040802001953125,
-0.005275726318359375,
0.018524169921875,
0.0045166015625,
-0.0247955322265625,
0.0208587646484375,
-0.015869140625,
-0.0218505859375,
0.0005564689636230469,
-0.062103271484375,
0.002521514892578125,
-0.0248870849609375,
0.06793212890625,
0.02581787109375,
-0.0191802978515625,
0.0009059906005859375,
-0.06707763671875,
0.03167724609375,
-0.07135009765625,
0.03143310546875,
0.028411865234375,
0.059356689453125,
0.004825592041015625,
-0.06591796875,
-0.029632568359375,
0.0078125,
0.01708984375,
0.03436279296875,
-0.0204925537109375,
-0.0079803466796875,
0.041778564453125,
0.03094482421875,
-0.053558349609375,
0.01140594482421875,
-0.0416259765625,
0.00982666015625,
0.06512451171875,
0.01198577880859375,
0.045135498046875,
-0.0025730133056640625,
-0.034088134765625,
-0.038360595703125,
-0.03497314453125,
0.01218414306640625,
0.04522705078125,
-0.0281982421875,
-0.0301971435546875,
0.03460693359375,
-0.0261383056640625,
0.02301025390625,
0.0160369873046875,
0.01114654541015625,
0.0059661865234375,
-0.0003921985626220703,
-0.0011806488037109375,
-0.0202789306640625,
0.038482666015625,
0.06365966796875,
0.020782470703125,
0.03521728515625,
-0.01157379150390625,
-0.00925445556640625,
0.047576904296875,
-0.09808349609375,
-0.039947509765625,
0.0291748046875,
-0.06549072265625,
-0.033447265625,
-0.006458282470703125,
-0.066650390625,
-0.0178680419921875,
-0.030609130859375,
0.0279388427734375,
-0.057098388671875,
-0.050018310546875,
-0.012969970703125,
-0.0367431640625,
0.01483917236328125,
0.04180908203125,
-0.061920166015625,
0.009368896484375,
0.0089569091796875,
0.05718994140625,
0.0279998779296875,
-0.00553131103515625,
0.01093292236328125,
-0.0184326171875,
-0.05615234375,
0.06982421875,
-0.0024700164794921875,
-0.0550537109375,
-0.024383544921875,
-0.0097808837890625,
0.0225677490234375,
-0.05279541015625,
0.0469970703125,
-0.03558349609375,
0.0010166168212890625,
-0.00872802734375,
-0.03900146484375,
-0.041656494140625,
-0.01313018798828125,
-0.067138671875,
0.037841796875,
0.0280303955078125,
-0.0285491943359375,
-0.01214599609375,
-0.05633544921875,
-0.00537872314453125,
0.0179901123046875,
-0.00385284423828125,
-0.01427459716796875,
0.046173095703125,
0.00429534912109375,
0.0172271728515625,
0.00594329833984375,
-0.002086639404296875,
-0.050811767578125,
-0.0228271484375,
0.00920867919921875,
-0.033477783203125,
0.07427978515625,
0.0207366943359375,
-0.02593994140625,
-0.005336761474609375,
-0.07373046875,
0.0193634033203125,
0.0196990966796875,
0.018768310546875,
-0.024566650390625,
-0.0015773773193359375,
0.027557373046875,
0.0187835693359375,
0.0168609619140625,
-0.0162811279296875,
0.006359100341796875,
-0.050628662109375,
0.004291534423828125,
0.06439208984375,
0.01227569580078125,
0.0163726806640625,
-0.030059814453125,
0.03778076171875,
0.0155181884765625,
0.0220184326171875,
-0.0106964111328125,
-0.04205322265625,
-0.072265625,
-0.02276611328125,
0.01763916015625,
0.03656005859375,
-0.04888916015625,
0.018951416015625,
0.0016412734985351562,
-0.056884765625,
-0.040496826171875,
-0.01515960693359375,
0.01548004150390625,
0.016937255859375,
0.009796142578125,
-0.049774169921875,
-0.032135009765625,
-0.106201171875,
0.0264129638671875,
-0.0026721954345703125,
-0.029815673828125,
0.023773193359375,
0.0236358642578125,
-0.0177154541015625,
0.05145263671875,
-0.0323486328125,
-0.018890380859375,
0.0215911865234375,
0.0159454345703125,
0.024871826171875,
0.042144775390625,
0.06671142578125,
-0.07904052734375,
-0.0413818359375,
0.0033473968505859375,
-0.078369140625,
-0.007602691650390625,
0.031890869140625,
-0.0187530517578125,
-0.0011072158813476562,
0.026947021484375,
-0.06591796875,
0.044586181640625,
0.0227813720703125,
-0.0302886962890625,
0.03692626953125,
-0.01136016845703125,
0.017181396484375,
-0.10137939453125,
0.030029296875,
0.006381988525390625,
-0.0162811279296875,
-0.06414794921875,
0.04534912109375,
-0.019256591796875,
-0.038909912109375,
-0.059356689453125,
0.0615234375,
-0.0236663818359375,
0.0181121826171875,
-0.045318603515625,
-0.0170440673828125,
0.0135345458984375,
0.04010009765625,
0.0157623291015625,
0.03912353515625,
0.042999267578125,
-0.043670654296875,
0.047454833984375,
0.0309295654296875,
-0.03167724609375,
0.08251953125,
-0.08355712890625,
0.023162841796875,
-0.0229644775390625,
0.00922393798828125,
-0.054931640625,
-0.044464111328125,
0.052215576171875,
-0.032867431640625,
0.02471923828125,
0.0099945068359375,
-0.011688232421875,
-0.023284912109375,
-0.0143280029296875,
0.045074462890625,
0.067138671875,
-0.03997802734375,
0.06317138671875,
0.00646209716796875,
-0.04010009765625,
0.01044464111328125,
-0.025787353515625,
0.00275421142578125,
-0.024139404296875,
-0.044525146484375,
0.0307769775390625,
-0.0205841064453125,
-0.040557861328125,
0.005611419677734375,
0.030548095703125,
-0.0263671875,
-0.025726318359375,
0.0291290283203125,
0.02301025390625,
-0.0310516357421875,
-0.0263671875,
0.0231170654296875,
-0.018524169921875,
0.00031113624572753906,
0.012969970703125,
0.0237884521484375,
-0.006072998046875,
-0.0270538330078125,
-0.07568359375,
0.047760009765625,
0.06646728515625,
0.0084686279296875,
-0.01409149169921875,
0.044586181640625,
-0.045623779296875,
0.007633209228515625,
-0.0362548828125,
-0.033843994140625,
-0.03369140625,
0.01123809814453125,
-0.057830810546875,
-0.0207977294921875,
0.047088623046875,
-0.01654052734375,
0.00676727294921875,
0.044952392578125,
0.05169677734375,
-0.038543701171875,
0.09844970703125,
0.042877197265625,
0.01280975341796875,
0.0201416015625,
-0.04998779296875,
0.00305938720703125,
-0.059173583984375,
-0.01279449462890625,
-0.00470733642578125,
-0.04534912109375,
-0.0321044921875,
-0.039306640625,
0.03302001953125,
0.0255889892578125,
-0.0031604766845703125,
0.0159454345703125,
-0.00885772705078125,
0.0265045166015625,
0.041015625,
0.029632568359375,
-0.00861358642578125,
0.0027904510498046875,
0.017059326171875,
-0.007289886474609375,
-0.04412841796875,
-0.0218658447265625,
0.050994873046875,
0.02838134765625,
0.044830322265625,
0.0180206298828125,
0.04205322265625,
0.01434326171875,
-0.0009694099426269531,
-0.057037353515625,
0.0276031494140625,
-0.02374267578125,
-0.0693359375,
-0.007373809814453125,
0.0019092559814453125,
-0.030853271484375,
0.01410675048828125,
-0.009979248046875,
-0.043212890625,
0.0673828125,
0.01136016845703125,
-0.034027099609375,
0.004894256591796875,
-0.056976318359375,
0.057403564453125,
-0.008453369140625,
-0.0135040283203125,
-0.02008056640625,
-0.045013427734375,
0.032806396484375,
-0.0002524852752685547,
0.0005393028259277344,
-0.0301971435546875,
0.0222625732421875,
0.0243988037109375,
-0.041961669921875,
0.0865478515625,
-0.0173492431640625,
-0.00481414794921875,
0.036346435546875,
0.01043701171875,
0.0218353271484375,
0.0221099853515625,
-0.01140594482421875,
0.0465087890625,
0.01497650146484375,
-0.03173828125,
-0.057342529296875,
0.052337646484375,
-0.057342529296875,
-0.0406494140625,
-0.01409912109375,
-0.027679443359375,
0.01271820068359375,
0.0216827392578125,
0.056060791015625,
0.05657958984375,
-0.042999267578125,
0.006500244140625,
0.03240966796875,
0.01184844970703125,
0.03436279296875,
-0.0027484893798828125,
-0.0189971923828125,
-0.048187255859375,
0.0780029296875,
-0.00292205810546875,
0.005741119384765625,
0.00341033935546875,
0.01263427734375,
-0.0240631103515625,
0.0097808837890625,
-0.0743408203125,
0.01171112060546875,
-0.049591064453125,
-0.01302337646484375,
-0.0103912353515625,
-0.036346435546875,
-0.029083251953125,
-0.013824462890625,
-0.042144775390625,
-0.025482177734375,
-0.05023193359375,
0.0284423828125,
0.045623779296875,
0.055572509765625,
-0.02264404296875,
0.01467132568359375,
-0.034759521484375,
0.027313232421875,
0.02520751953125,
0.032196044921875,
0.00667572021484375,
-0.022674560546875,
-0.00885009765625,
0.014801025390625,
-0.018280029296875,
-0.052947998046875,
0.0304107666015625,
-0.0040130615234375,
0.01666259765625,
0.064697265625,
-0.0082244873046875,
0.056640625,
-0.052642822265625,
0.054473876953125,
0.019683837890625,
-0.040802001953125,
0.04022216796875,
-0.0399169921875,
0.0305633544921875,
0.055572509765625,
0.03289794921875,
-0.033966064453125,
-0.0265960693359375,
-0.06610107421875,
-0.045928955078125,
0.034881591796875,
0.02764892578125,
0.02386474609375,
0.0239715576171875,
0.047088623046875,
0.0036869049072265625,
0.004695892333984375,
-0.03009033203125,
-0.0216827392578125,
-0.041046142578125,
-0.0077056884765625,
-0.004283905029296875,
-0.039154052734375,
0.00553131103515625,
-0.037109375,
0.061553955078125,
-0.002033233642578125,
0.029754638671875,
0.0242767333984375,
0.039276123046875,
-0.005382537841796875,
-0.015625,
0.051177978515625,
0.0506591796875,
-0.01177978515625,
-0.01050567626953125,
0.0015649795532226562,
-0.0260009765625,
0.0130462646484375,
0.001468658447265625,
-0.039306640625,
0.0309295654296875,
0.00933074951171875,
0.09588623046875,
0.009918212890625,
-0.04388427734375,
0.029815673828125,
0.004001617431640625,
-0.024444580078125,
-0.03900146484375,
0.0116424560546875,
0.006969451904296875,
0.033905029296875,
-0.012237548828125,
0.0019006729125976562,
0.04083251953125,
-0.0198822021484375,
-0.005092620849609375,
-0.0004832744598388672,
-0.03009033203125,
-0.033966064453125,
0.057159423828125,
0.0179595947265625,
-0.055267333984375,
0.056549072265625,
0.0018463134765625,
-0.023834228515625,
0.059478759765625,
0.03790283203125,
0.0672607421875,
-0.044342041015625,
0.0291748046875,
0.05328369140625,
-0.01197052001953125,
0.00897979736328125,
0.0286712646484375,
0.01824951171875,
-0.01557159423828125,
-0.001987457275390625,
-0.033050537109375,
-0.031524658203125,
0.0304107666015625,
-0.039306640625,
0.06561279296875,
-0.037689208984375,
-0.0003631114959716797,
0.01393890380859375,
-0.0248260498046875,
-0.0262603759765625,
0.0413818359375,
0.00722503662109375,
0.059295654296875,
-0.0462646484375,
0.038360595703125,
0.0430908203125,
-0.0479736328125,
-0.0479736328125,
-0.00830841064453125,
0.010406494140625,
-0.0263214111328125,
0.0279693603515625,
0.011322021484375,
0.007175445556640625,
0.0229644775390625,
-0.0303192138671875,
-0.054290771484375,
0.06353759765625,
0.004833221435546875,
-0.059906005859375,
-0.00951385498046875,
-0.0265960693359375,
0.04083251953125,
-0.0245208740234375,
0.0217437744140625,
0.0135955810546875,
0.03045654296875,
0.0079345703125,
-0.03076171875,
-0.022613525390625,
-0.06109619140625,
0.052093505859375,
-0.00209808349609375,
-0.06292724609375,
0.05242919921875,
-0.00623321533203125,
-0.0200653076171875,
0.04669189453125,
0.047454833984375,
0.01177978515625,
0.028106689453125,
0.048736572265625,
0.0614013671875,
0.0144500732421875,
0.0254974365234375,
0.085205078125,
-0.0201568603515625,
-0.00689697265625,
0.057098388671875,
-0.01494598388671875,
0.06304931640625,
0.021270751953125,
0.01132965087890625,
0.0501708984375,
0.07574462890625,
-0.0247039794921875,
0.0455322265625,
-0.0004456043243408203,
-0.038543701171875,
-0.009674072265625,
-0.01486968994140625,
-0.03875732421875,
0.03839111328125,
0.006038665771484375,
-0.0016946792602539062,
-0.0134429931640625,
0.04510498046875,
-0.039520263671875,
0.00749969482421875,
-0.006439208984375,
0.058929443359375,
0.0254364013671875,
-0.0174560546875,
0.04730224609375,
0.0021877288818359375,
0.015411376953125,
-0.03289794921875,
-0.03704833984375,
-0.0199127197265625,
0.00267791748046875,
0.007572174072265625,
-0.05401611328125,
0.0160369873046875,
-0.0175933837890625,
-0.035552978515625,
-0.00963592529296875,
0.06585693359375,
-0.0122833251953125,
-0.05804443359375,
0.01287841796875,
0.032501220703125,
0.04913330078125,
-0.004810333251953125,
-0.044281005859375,
0.01007080078125,
-0.01690673828125,
-0.01580810546875,
-0.01018524169921875,
0.01263427734375,
-0.01251220703125,
0.01497650146484375,
0.030548095703125,
0.0111236572265625,
-0.03082275390625,
0.02899169921875,
0.049041748046875,
-0.0250244140625,
-0.03912353515625,
-0.044952392578125,
0.041473388671875,
-0.013458251953125,
-0.030242919921875,
0.035003662109375,
0.04425048828125,
0.06890869140625,
-0.0491943359375,
0.04144287109375,
-0.00827789306640625,
0.030303955078125,
-0.04486083984375,
0.0758056640625,
-0.06378173828125,
-0.039581298828125,
0.0029735565185546875,
-0.07464599609375,
-0.007190704345703125,
0.057373046875,
0.0133209228515625,
0.006191253662109375,
0.0207672119140625,
0.058624267578125,
-0.01401519775390625,
0.00316619873046875,
0.043701171875,
0.006717681884765625,
0.0005984306335449219,
0.0179595947265625,
0.06317138671875,
-0.04144287109375,
-0.01287841796875,
-0.06317138671875,
-0.0171051025390625,
-0.038177490234375,
-0.0230865478515625,
-0.072265625,
-0.056396484375,
-0.0198822021484375,
-0.01467132568359375,
0.0029468536376953125,
0.036346435546875,
0.07965087890625,
-0.035736083984375,
-0.0215911865234375,
-0.0077056884765625,
-0.01551055908203125,
-0.01197052001953125,
-0.01470184326171875,
0.010406494140625,
0.0196533203125,
-0.096435546875,
0.0293426513671875,
0.011444091796875,
0.035491943359375,
-0.026214599609375,
0.004184722900390625,
-0.007110595703125,
-0.00296783447265625,
0.06500244140625,
0.029388427734375,
-0.06610107421875,
-0.0158538818359375,
-0.0056915283203125,
0.00501251220703125,
-0.001800537109375,
0.050323486328125,
-0.006656646728515625,
0.050872802734375,
0.03167724609375,
0.0169677734375,
0.035003662109375,
0.0124969482421875,
0.042205810546875,
-0.031829833984375,
0.0223541259765625,
0.02215576171875,
0.0030975341796875,
0.004978179931640625,
-0.042633056640625,
0.035400390625,
0.0268707275390625,
-0.0222320556640625,
-0.040557861328125,
0.0285186767578125,
-0.07122802734375,
-0.0330810546875,
0.061553955078125,
0.011474609375,
-0.039825439453125,
0.0301971435546875,
-0.044647216796875,
0.003753662109375,
0.0002646446228027344,
0.038604736328125,
0.06610107421875,
-0.01392364501953125,
-0.0222015380859375,
-0.059906005859375,
0.01548004150390625,
0.01514434814453125,
-0.06903076171875,
-0.020050048828125,
0.062744140625,
0.016937255859375,
0.027191162109375,
0.061859130859375,
-0.027008056640625,
0.053558349609375,
0.0270538330078125,
0.038360595703125,
0.013519287109375,
-0.044464111328125,
-0.037841796875,
-0.0096282958984375,
0.0004801750183105469,
-0.0137939453125
]
] |
cointegrated/rubert-tiny | 2023-03-17T10:21:51.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"pretraining",
"russian",
"fill-mask",
"embeddings",
"masked-lm",
"tiny",
"feature-extraction",
"sentence-similarity",
"ru",
"en",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | cointegrated | null | null | cointegrated/rubert-tiny | 23 | 12,250 | transformers | 2022-03-02T23:29:05 | ---
language:
- ru
- en
tags:
- russian
- fill-mask
- pretraining
- embeddings
- masked-lm
- tiny
- feature-extraction
- sentence-similarity
license: mit
widget:
- text: Миниатюрная модель для [MASK] разных задач.
pipeline_tag: fill-mask
---
This is a very small distilled version of the [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) model for Russian and English (45 MB, 12M parameters). There is also an **updated version of this model**, [rubert-tiny2](https://huggingface.co/cointegrated/rubert-tiny2), with a larger vocabulary and better quality on practically all Russian NLU tasks.
This model is useful if you want to fine-tune it for a relatively simple Russian task (e.g. NER or sentiment classification), and you care more about speed and size than about accuracy. It is approximately 10x smaller and faster than a base-sized BERT. Its `[CLS]` embeddings can be used as a sentence representation aligned between Russian and English.
It was trained on the [Yandex Translate corpus](https://translate.yandex.ru/corpus), [OPUS-100](https://huggingface.co/datasets/opus100) and [Tatoeba](https://huggingface.co/datasets/tatoeba), using MLM loss (distilled from [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased)), translation ranking loss, and `[CLS]` embeddings distilled from [LaBSE](https://huggingface.co/sentence-transformers/LaBSE), [rubert-base-cased-sentence](https://huggingface.co/DeepPavlov/rubert-base-cased-sentence), Laser and USE.
There is a more detailed [description in Russian](https://habr.com/ru/post/562064/).
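The masked-LM head can also be used directly; below is a minimal sketch with the `fill-mask` pipeline, reusing the widget sentence above purely as an illustration:
```python
# Minimal fill-mask sketch; the sentence is the widget example, for illustration only.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="cointegrated/rubert-tiny")
for prediction in fill_mask("Миниатюрная модель для [MASK] разных задач."):
    print(prediction["token_str"], round(prediction["score"], 3))
```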
Sentence embeddings can be produced as follows:
```python
# pip install transformers sentencepiece
import torch
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("cointegrated/rubert-tiny")
model = AutoModel.from_pretrained("cointegrated/rubert-tiny")
# model.cuda() # uncomment it if you have a GPU
def embed_bert_cls(text, model, tokenizer):
t = tokenizer(text, padding=True, truncation=True, return_tensors='pt')
with torch.no_grad():
model_output = model(**{k: v.to(model.device) for k, v in t.items()})
embeddings = model_output.last_hidden_state[:, 0, :]
embeddings = torch.nn.functional.normalize(embeddings)
return embeddings[0].cpu().numpy()
print(embed_bert_cls('привет мир', model, tokenizer).shape)
# (312,)
``` | 2,411 | [
[
-0.0185394287109375,
-0.053131103515625,
0.024749755859375,
0.0282745361328125,
-0.031982421875,
-0.00519561767578125,
-0.03253173828125,
0.00540924072265625,
0.02593994140625,
0.02886962890625,
-0.0313720703125,
-0.034393310546875,
-0.042572021484375,
-0.00870513916015625,
-0.02752685546875,
0.0977783203125,
-0.00792694091796875,
0.04449462890625,
-0.011871337890625,
-0.00376129150390625,
-0.0275726318359375,
-0.042816162109375,
-0.01331329345703125,
-0.0413818359375,
0.022857666015625,
0.0272369384765625,
0.045623779296875,
0.0281524658203125,
0.039276123046875,
0.024749755859375,
-0.0014629364013671875,
-0.0183563232421875,
-0.017974853515625,
-0.01175689697265625,
0.001674652099609375,
-0.0316162109375,
-0.0156707763671875,
-0.0006079673767089844,
0.0428466796875,
0.05084228515625,
-0.017974853515625,
0.0072784423828125,
-0.0021991729736328125,
0.031341552734375,
-0.033660888671875,
0.01959228515625,
-0.0278472900390625,
0.0185394287109375,
0.0026416778564453125,
0.01056671142578125,
-0.054962158203125,
-0.0133209228515625,
0.0265350341796875,
-0.029296875,
0.0154571533203125,
0.0046844482421875,
0.078369140625,
0.0196380615234375,
-0.0238494873046875,
-0.03253173828125,
-0.03594970703125,
0.067626953125,
-0.052581787109375,
0.021209716796875,
0.0146484375,
0.0177459716796875,
0.0018205642700195312,
-0.0657958984375,
-0.024444580078125,
-0.0066680908203125,
-0.025634765625,
-0.00421905517578125,
-0.01360321044921875,
0.00591278076171875,
0.01641845703125,
0.02587890625,
-0.037689208984375,
-0.0066986083984375,
-0.0233917236328125,
-0.0282440185546875,
0.0281524658203125,
-0.00020837783813476562,
0.0101776123046875,
-0.024444580078125,
-0.0265350341796875,
-0.0272369384765625,
-0.04803466796875,
-0.0163421630859375,
0.023773193359375,
0.023956298828125,
-0.024078369140625,
0.0640869140625,
-0.007110595703125,
0.05072021484375,
0.0060272216796875,
0.00009322166442871094,
0.04071044921875,
-0.0174407958984375,
-0.022308349609375,
0.0077667236328125,
0.05157470703125,
0.0178070068359375,
0.0255126953125,
-0.0159149169921875,
-0.01123809814453125,
-0.001712799072265625,
0.01218414306640625,
-0.06298828125,
-0.041656494140625,
0.003887176513671875,
-0.04730224609375,
-0.0170135498046875,
-0.0003781318664550781,
-0.029296875,
-0.002292633056640625,
-0.0045013427734375,
0.060516357421875,
-0.0477294921875,
0.0020313262939453125,
0.021331787109375,
-0.0183868408203125,
0.0218048095703125,
-0.00014197826385498047,
-0.0657958984375,
0.0196990966796875,
0.0465087890625,
0.07073974609375,
0.01117706298828125,
-0.01258087158203125,
-0.0296478271484375,
-0.0208282470703125,
-0.0121612548828125,
0.040802001953125,
-0.0198822021484375,
-0.0166473388671875,
0.0294647216796875,
0.007640838623046875,
0.00591278076171875,
-0.023345947265625,
0.037841796875,
-0.0303802490234375,
0.040496826171875,
-0.0093536376953125,
-0.048492431640625,
-0.0216522216796875,
0.014495849609375,
-0.04937744140625,
0.06536865234375,
0.00913238525390625,
-0.055267333984375,
0.03192138671875,
-0.04681396484375,
-0.049560546875,
-0.00293731689453125,
0.00494384765625,
-0.053619384765625,
0.020721435546875,
0.018310546875,
0.057098388671875,
0.0059661865234375,
0.005458831787109375,
-0.018585205078125,
-0.02593994140625,
0.01343536376953125,
-0.00841522216796875,
0.07073974609375,
0.019378662109375,
-0.0281219482421875,
-0.00547027587890625,
-0.043243408203125,
0.003662109375,
-0.00018227100372314453,
-0.0307159423828125,
-0.0239410400390625,
0.0033931732177734375,
0.0419921875,
0.00878143310546875,
0.026611328125,
-0.060882568359375,
0.0236053466796875,
-0.043243408203125,
0.051666259765625,
0.046600341796875,
-0.0147552490234375,
0.050537109375,
-0.021881103515625,
0.0164337158203125,
0.01157379150390625,
0.0040740966796875,
-0.00579071044921875,
-0.0438232421875,
-0.0758056640625,
-0.0242462158203125,
0.044769287109375,
0.03887939453125,
-0.06378173828125,
0.040252685546875,
-0.018218994140625,
-0.0394287109375,
-0.053009033203125,
-0.007732391357421875,
0.02423095703125,
0.01044464111328125,
0.024932861328125,
0.008880615234375,
-0.042510986328125,
-0.08795166015625,
-0.00324249267578125,
0.00862884521484375,
0.0075531005859375,
-0.006366729736328125,
0.05462646484375,
-0.0083770751953125,
0.052093505859375,
-0.0283660888671875,
-0.03173828125,
-0.0196533203125,
0.019378662109375,
0.036224365234375,
0.0479736328125,
0.04638671875,
-0.048004150390625,
-0.05487060546875,
-0.0022106170654296875,
-0.034088134765625,
0.0205535888671875,
-0.01419830322265625,
-0.00814056396484375,
0.0232696533203125,
0.0075225830078125,
-0.04669189453125,
0.0289459228515625,
0.039703369140625,
-0.03955078125,
0.019256591796875,
-0.02838134765625,
0.0035381317138671875,
-0.10406494140625,
-0.0093536376953125,
-0.00083160400390625,
-0.031280517578125,
-0.039764404296875,
0.005138397216796875,
0.0144195556640625,
-0.002849578857421875,
-0.047637939453125,
0.0190887451171875,
-0.03887939453125,
-0.006809234619140625,
-0.0012722015380859375,
0.0042877197265625,
-0.00173187255859375,
0.031646728515625,
-0.0005536079406738281,
0.053619384765625,
0.048980712890625,
-0.0286712646484375,
0.04046630859375,
0.03826904296875,
-0.043487548828125,
0.0190582275390625,
-0.0621337890625,
-0.0015497207641601562,
0.0003609657287597656,
0.0013103485107421875,
-0.064697265625,
0.0008182525634765625,
0.0122833251953125,
-0.0384521484375,
0.0279388427734375,
-0.00957489013671875,
-0.0693359375,
-0.01220703125,
-0.033966064453125,
0.0116729736328125,
0.055908203125,
-0.05120849609375,
0.042633056640625,
0.0279541015625,
0.0016622543334960938,
-0.06036376953125,
-0.08233642578125,
-0.0007143020629882812,
-0.0203094482421875,
-0.04852294921875,
0.04278564453125,
-0.00479888916015625,
0.000732421875,
0.020904541015625,
0.02978515625,
-0.004268646240234375,
-0.00482940673828125,
0.0013685226440429688,
0.0212860107421875,
-0.0173187255859375,
0.02569580078125,
0.007228851318359375,
0.00554656982421875,
0.00019788742065429688,
-0.0016603469848632812,
0.06085205078125,
-0.03741455078125,
-0.00881195068359375,
-0.02325439453125,
0.0282745361328125,
0.0250244140625,
0.00884246826171875,
0.07110595703125,
0.07159423828125,
-0.0379638671875,
-0.02301025390625,
-0.036834716796875,
-0.01427459716796875,
-0.0355224609375,
0.034576416015625,
-0.03594970703125,
-0.06964111328125,
0.0526123046875,
0.0158233642578125,
-0.0081787109375,
0.051055908203125,
0.06829833984375,
-0.018798828125,
0.07659912109375,
0.04779052734375,
-0.0233306884765625,
0.038787841796875,
-0.04412841796875,
0.01175689697265625,
-0.057220458984375,
-0.022735595703125,
-0.023956298828125,
-0.0286407470703125,
-0.052001953125,
-0.0263671875,
0.0157318115234375,
-0.00919342041015625,
-0.01416015625,
0.038116455078125,
-0.046875,
0.018829345703125,
0.0557861328125,
0.01024627685546875,
-0.006969451904296875,
0.0229644775390625,
-0.022247314453125,
-0.0213775634765625,
-0.06170654296875,
-0.035491943359375,
0.0802001953125,
0.0290374755859375,
0.076416015625,
0.008087158203125,
0.059112548828125,
0.033843994140625,
0.0176544189453125,
-0.078369140625,
0.02490234375,
-0.033477783203125,
-0.059112548828125,
-0.0004794597625732422,
-0.024688720703125,
-0.06866455078125,
0.007503509521484375,
-0.024566650390625,
-0.050262451171875,
-0.0012540817260742188,
-0.0036830902099609375,
-0.01314544677734375,
0.01702880859375,
-0.0560302734375,
0.075927734375,
-0.006641387939453125,
-0.0026760101318359375,
-0.0245361328125,
-0.03118896484375,
0.005794525146484375,
0.021087646484375,
-0.007122039794921875,
0.00479888916015625,
0.004383087158203125,
0.063720703125,
-0.02972412109375,
0.059234619140625,
-0.0198211669921875,
0.01458740234375,
0.018707275390625,
-0.00801849365234375,
0.0255279541015625,
0.0099029541015625,
0.00585174560546875,
0.01239013671875,
0.0080718994140625,
-0.0323486328125,
-0.030670166015625,
0.062347412109375,
-0.06817626953125,
-0.0254669189453125,
-0.049346923828125,
-0.0296630859375,
0.01093292236328125,
0.01097869873046875,
0.031646728515625,
0.044189453125,
-0.0277099609375,
0.0499267578125,
0.048583984375,
-0.024658203125,
0.04998779296875,
0.016693115234375,
-0.0006747245788574219,
-0.035614013671875,
0.057952880859375,
0.0014352798461914062,
-0.0017595291137695312,
0.04083251953125,
0.018798828125,
-0.0265655517578125,
-0.024505615234375,
-0.04736328125,
0.0341796875,
-0.058746337890625,
-0.0277252197265625,
-0.034881591796875,
-0.01776123046875,
-0.040924072265625,
-0.01271820068359375,
-0.0272369384765625,
-0.04315185546875,
-0.035369873046875,
-0.0117645263671875,
0.04534912109375,
0.051055908203125,
-0.0236663818359375,
0.043121337890625,
-0.046051025390625,
0.01409912109375,
-0.0037689208984375,
0.0215606689453125,
-0.02252197265625,
-0.042999267578125,
-0.04718017578125,
0.0071258544921875,
-0.01352691650390625,
-0.06695556640625,
0.04931640625,
0.032135009765625,
0.053558349609375,
0.0171966552734375,
-0.005008697509765625,
0.0443115234375,
-0.05255126953125,
0.0621337890625,
0.0155792236328125,
-0.0767822265625,
0.034088134765625,
-0.00969696044921875,
0.0098876953125,
0.0276947021484375,
0.0308074951171875,
-0.039276123046875,
-0.0081024169921875,
-0.045440673828125,
-0.06201171875,
0.05999755859375,
0.01470184326171875,
0.03094482421875,
-0.01055908203125,
0.019134521484375,
0.0087127685546875,
0.0161285400390625,
-0.06951904296875,
-0.0428466796875,
-0.035003662109375,
-0.0401611328125,
-0.00676727294921875,
-0.02679443359375,
-0.00832366943359375,
-0.023651123046875,
0.06500244140625,
0.01049041748046875,
0.05218505859375,
0.0127105712890625,
-0.03289794921875,
0.021331787109375,
0.0067138671875,
0.042144775390625,
0.01419830322265625,
-0.0272216796875,
-0.0034236907958984375,
0.012664794921875,
-0.0312347412109375,
0.00585174560546875,
0.0058746337890625,
-0.01285552978515625,
0.0290374755859375,
0.015777587890625,
0.0577392578125,
0.004352569580078125,
-0.038116455078125,
0.055145263671875,
-0.00019228458404541016,
-0.01467132568359375,
-0.05072021484375,
-0.0265350341796875,
0.0164947509765625,
0.0100250244140625,
0.021575927734375,
-0.00021922588348388672,
0.024444580078125,
-0.047210693359375,
0.03228759765625,
0.0308074951171875,
-0.03424072265625,
-0.01458740234375,
0.02801513671875,
0.0153045654296875,
-0.01534271240234375,
0.061614990234375,
-0.0225982666015625,
-0.050262451171875,
0.03253173828125,
0.033660888671875,
0.06854248046875,
0.007396697998046875,
0.0299072265625,
0.032470703125,
0.032501220703125,
-0.00927734375,
0.02655029296875,
0.00595855712890625,
-0.058258056640625,
-0.045623779296875,
-0.045257568359375,
-0.0173187255859375,
0.002101898193359375,
-0.06585693359375,
0.02410888671875,
-0.035736083984375,
-0.019805908203125,
0.0002827644348144531,
0.00592041015625,
-0.06536865234375,
0.01343536376953125,
0.0116729736328125,
0.056884765625,
-0.0711669921875,
0.07470703125,
0.07049560546875,
-0.0296478271484375,
-0.0430908203125,
-0.0364990234375,
-0.03338623046875,
-0.060760498046875,
0.06463623046875,
0.0059661865234375,
0.0202484130859375,
-0.01206207275390625,
-0.0299530029296875,
-0.06597900390625,
0.06610107421875,
0.0221405029296875,
-0.024017333984375,
0.00798797607421875,
0.0121307373046875,
0.040985107421875,
-0.04705810546875,
0.0168914794921875,
0.024566650390625,
0.0223236083984375,
-0.0187225341796875,
-0.06683349609375,
-0.0123291015625,
-0.026336669921875,
0.0034198760986328125,
0.01035308837890625,
-0.048583984375,
0.0872802734375,
-0.003993988037109375,
-0.00580596923828125,
0.0188751220703125,
0.04534912109375,
-0.006900787353515625,
-0.005123138427734375,
0.022857666015625,
0.0574951171875,
0.0151214599609375,
-0.03216552734375,
0.071044921875,
-0.0262908935546875,
0.061370849609375,
0.06744384765625,
0.015869140625,
0.07537841796875,
0.03729248046875,
-0.0097198486328125,
0.066162109375,
0.048583984375,
-0.02685546875,
0.07489013671875,
0.0178070068359375,
0.0013647079467773438,
-0.01385498046875,
0.01959228515625,
-0.0287933349609375,
0.041748046875,
0.037353515625,
-0.0406494140625,
-0.01131439208984375,
0.01690673828125,
0.012176513671875,
-0.0017728805541992188,
-0.00562286376953125,
0.052032470703125,
-0.0005049705505371094,
-0.032073974609375,
0.044586181640625,
0.0157318115234375,
0.07379150390625,
-0.035186767578125,
-0.006195068359375,
-0.00824737548828125,
0.0307464599609375,
0.0021190643310546875,
-0.059112548828125,
0.0189666748046875,
-0.008148193359375,
-0.01393890380859375,
-0.0265960693359375,
0.044586181640625,
-0.043487548828125,
-0.06591796875,
0.0182647705078125,
0.0323486328125,
0.0235137939453125,
0.0119171142578125,
-0.0650634765625,
-0.006092071533203125,
0.00255584716796875,
-0.030853271484375,
0.016204833984375,
0.02215576171875,
0.02923583984375,
0.04534912109375,
0.02325439453125,
0.005413055419921875,
0.011322021484375,
0.01236724853515625,
0.060882568359375,
-0.044586181640625,
-0.03497314453125,
-0.06182861328125,
0.04205322265625,
-0.01151275634765625,
-0.0288848876953125,
0.06280517578125,
0.0501708984375,
0.06951904296875,
-0.04296875,
0.048126220703125,
-0.0163726806640625,
0.02313232421875,
-0.040985107421875,
0.06439208984375,
-0.0465087890625,
-0.014190673828125,
-0.00815582275390625,
-0.07342529296875,
-0.01477813720703125,
0.09088134765625,
0.0029811859130859375,
-0.0032501220703125,
0.06134033203125,
0.042144775390625,
0.0024852752685546875,
-0.035919189453125,
0.0271759033203125,
0.0176849365234375,
0.01056671142578125,
0.0389404296875,
0.03839111328125,
-0.06097412109375,
0.04736328125,
-0.054229736328125,
0.0005888938903808594,
-0.035186767578125,
-0.06494140625,
-0.08502197265625,
-0.054779052734375,
-0.037200927734375,
-0.034759521484375,
-0.00926971435546875,
0.06793212890625,
0.06292724609375,
-0.07373046875,
-0.019134521484375,
0.01096343994140625,
-0.0025730133056640625,
-0.00891876220703125,
-0.01512908935546875,
0.034088134765625,
-0.0293121337890625,
-0.07122802734375,
0.01336669921875,
0.0006365776062011719,
0.0008940696716308594,
-0.02880859375,
-0.0006299018859863281,
-0.0252838134765625,
-0.01715087890625,
0.048828125,
-0.01715087890625,
-0.06744384765625,
-0.043182373046875,
0.01519012451171875,
-0.01207733154296875,
0.00001531839370727539,
0.04278564453125,
-0.051055908203125,
0.041107177734375,
0.04486083984375,
0.0355224609375,
0.05279541015625,
-0.0193939208984375,
0.04754638671875,
-0.07586669921875,
0.0220947265625,
0.0171661376953125,
0.061004638671875,
0.04498291015625,
-0.0190277099609375,
0.03363037109375,
0.01284027099609375,
-0.038787841796875,
-0.050567626953125,
0.004787445068359375,
-0.0960693359375,
-0.0119781494140625,
0.0887451171875,
-0.02587890625,
-0.0254669189453125,
0.0185089111328125,
-0.0249176025390625,
0.030517578125,
-0.03643798828125,
0.061004638671875,
0.07568359375,
0.01151275634765625,
-0.00933837890625,
-0.01451873779296875,
0.032806396484375,
0.042449951171875,
-0.0271148681640625,
-0.019561767578125,
0.026275634765625,
0.0279693603515625,
0.036224365234375,
0.005451202392578125,
-0.00539398193359375,
0.01617431640625,
0.014556884765625,
0.02593994140625,
-0.0016393661499023438,
-0.01023101806640625,
-0.0206451416015625,
-0.0032672882080078125,
-0.006072998046875,
-0.03955078125
]
] |
DTAI-KULeuven/robbertje-1-gb-bort | 2022-02-24T09:57:08.000Z | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"Dutch",
"Flemish",
"RoBERTa",
"RobBERT",
"RobBERTje",
"nl",
"dataset:oscar",
"dataset:oscar (NL)",
"dataset:dbrd",
"dataset:lassy-ud",
"dataset:europarl-mono",
"dataset:conll2002",
"arxiv:2101.05716",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | DTAI-KULeuven | null | null | DTAI-KULeuven/robbertje-1-gb-bort | 0 | 12,244 | transformers | 2022-03-02T23:29:04 | ---
language: "nl"
thumbnail: "https://github.com/iPieter/RobBERT/raw/master/res/robbert_logo.png"
tags:
- Dutch
- Flemish
- RoBERTa
- RobBERT
- RobBERTje
license: mit
datasets:
- oscar
- oscar (NL)
- dbrd
- lassy-ud
- europarl-mono
- conll2002
widget:
- text: "Hallo, ik ben RobBERTje, een gedistilleerd <mask> taalmodel van de KU Leuven."
---
<p align="center">
<img src="https://github.com/iPieter/robbertje/raw/master/images/robbertje_logo_with_name.png" alt="RobBERTje: A collection of distilled Dutch BERT-based models" width="75%">
</p>
# About RobBERTje
RobBERTje is a collection of distilled models based on [RobBERT](http://github.com/iPieter/robbert). There are multiple models with different sizes and different training settings, which you can choose for your use-case.
We are also continuously working on releasing better-performing models, so watch [the repository](http://github.com/iPieter/robbertje) for updates.
# News
- **February 21, 2022**: Our paper about RobBERTje has been published in [volume 11 of CLIN journal](https://www.clinjournal.org/clinj/article/view/131)!
- **July 2, 2021**: Publicly released 4 RobBERTje models.
- **May 12, 2021**: RobBERTje was accepted at [CLIN31](https://www.clin31.ugent.be) for an oral presentation!
# The models
| Model | Description | Parameters | Training size | Huggingface id |
|--------------|-------------|------------------|-------------------|------------------------------------------------------------------------------------|
| Non-shuffled | Trained on the non-shuffled variant of the oscar corpus, without any operations to preserve this order during training and distillation. | 74 M | 1 GB | [DTAI-KULeuven/robbertje-1-gb-non-shuffled](https://huggingface.co/DTAI-KULeuven/robbertje-1-gb-non-shuffled) |
| Shuffled | Trained on the publicly available and shuffled OSCAR corpus. | 74 M | 1 GB | [DTAI-KULeuven/robbertje-1-gb-shuffled](https://huggingface.co/DTAI-KULeuven/robbertje-1-gb-shuffled) |
| Merged (p=0.5) | Same as the non-shuffled variant, but sequential sentences of the same document are merged with a probability of 50%. | 74 M | 1 GB | [DTAI-KULeuven/robbertje-1-gb-merged](https://huggingface.co/DTAI-KULeuven/robbertje-1-gb-merged) |
| BORT | A smaller version with 8 attention heads instead of 12 and 4 layers instead of 6 (and 12 for RobBERT). | 46 M | 1 GB | this model |
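As a quick start, here is a minimal (hedged) usage sketch for this BORT variant with the `fill-mask` pipeline, using the widget sentence above as an illustrative input:
```python
# Hedged sketch: masked-word prediction with the distilled BORT variant.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="DTAI-KULeuven/robbertje-1-gb-bort")
predictions = unmasker(
    "Hallo, ik ben RobBERTje, een gedistilleerd <mask> taalmodel van de KU Leuven."
)
for prediction in predictions:
    print(prediction["token_str"], round(prediction["score"], 3))
```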
# Results
## Intrinsic results
We calculated the _pseudo perplexity_ (PPPL) from [cite](), which is a built-in metric in our distillation library. This metric gives an indication of how well the model captures the input distribution.
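For intuition, a generic (hedged) sketch of how a pseudo-perplexity of this kind can be computed for a masked language model follows; this is an illustration, not the distillation library's own implementation:
```python
# Hedged sketch of pseudo-perplexity (PPPL) for a masked LM: mask each token in
# turn, score the true token, and exponentiate the negative mean log-probability.
# Generic illustration only; the model id default is just this repo's checkpoint.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

def pseudo_perplexity(text, model_id="DTAI-KULeuven/robbertje-1-gb-bort"):
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForMaskedLM.from_pretrained(model_id)
    ids = tok(text, return_tensors="pt")["input_ids"][0]
    log_probs = []
    for i in range(1, len(ids) - 1):          # skip special tokens at both ends
        masked = ids.clone()
        masked[i] = tok.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        log_probs.append(torch.log_softmax(logits, dim=-1)[ids[i]].item())
    return float(torch.exp(torch.tensor(-sum(log_probs) / len(log_probs))))
```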
| Model | PPPL |
|-------------------|-----------|
| RobBERT (teacher) | 7.76 |
| Non-shuffled | 12.95 |
| Shuffled | 18.74 |
| Merged (p=0.5) | 17.10 |
| BORT | 26.44 |
## Extrinsic results
We also evaluated our models on several downstream tasks, just like the teacher model RobBERT. Since that evaluation, a [Dutch NLI task named SICK-NL](https://arxiv.org/abs/2101.05716) was also released and we evaluated our models on it as well.
| Model | DBRD | DIE-DAT | NER | POS |SICK-NL |
|------------------|-----------|-----------|-----------|-----------|----------|
| RobBERT (teacher)|94.4 | 99.2 |89.1 |96.4 | 84.2 |
| Non-shuffled |90.2 | 98.4 |82.9 |95.5 | 83.4 |
| Shuffled |92.5 | 98.2 |82.7 |95.6 | 83.4 |
| Merged (p=0.5) |92.9 | 96.5 |81.8 |95.2 | 82.8 |
| BORT |89.6 | 92.2 |79.7 |94.3 | 81.0 |
| 3,942 | [
[
-0.031829833984375,
-0.032989501953125,
0.0269927978515625,
0.0158233642578125,
-0.0261993408203125,
-0.0158538818359375,
0.0003540515899658203,
-0.042083740234375,
0.031158447265625,
0.00473785400390625,
-0.0290069580078125,
-0.021881103515625,
-0.07489013671875,
0.0156707763671875,
-0.04901123046875,
0.0953369140625,
0.004352569580078125,
0.0278778076171875,
-0.0028839111328125,
0.001369476318359375,
-0.022003173828125,
-0.04046630859375,
-0.01218414306640625,
-0.013458251953125,
0.03656005859375,
0.00616455078125,
0.0269622802734375,
0.02008056640625,
0.054107666015625,
0.0242462158203125,
-0.028076171875,
-0.006191253662109375,
-0.039031982421875,
-0.01519775390625,
-0.0043182373046875,
-0.0286865234375,
-0.039276123046875,
0.022705078125,
0.043701171875,
0.06060791015625,
-0.00925445556640625,
0.025360107421875,
0.020355224609375,
0.060150146484375,
-0.0477294921875,
-0.00997161865234375,
-0.029571533203125,
0.007572174072265625,
-0.015289306640625,
0.00696563720703125,
-0.0238037109375,
-0.02435302734375,
0.04180908203125,
-0.0185699462890625,
0.0106353759765625,
-0.01849365234375,
0.08892822265625,
0.01146697998046875,
-0.01812744140625,
-0.00902557373046875,
-0.044708251953125,
0.064453125,
-0.059600830078125,
0.0190582275390625,
0.033203125,
0.00859832763671875,
-0.0022411346435546875,
-0.0498046875,
-0.04315185546875,
0.0050201416015625,
-0.01806640625,
0.0257110595703125,
-0.0261688232421875,
-0.0133514404296875,
0.018341064453125,
0.040679931640625,
-0.044677734375,
-0.01131439208984375,
-0.059661865234375,
-0.0015115737915039062,
0.05999755859375,
-0.001739501953125,
-0.00827789306640625,
-0.0088348388671875,
-0.06036376953125,
-0.0239105224609375,
-0.0178375244140625,
0.01122283935546875,
0.048187255859375,
0.00530242919921875,
-0.0186309814453125,
0.0372314453125,
0.00392913818359375,
0.06146240234375,
0.0278167724609375,
-0.035308837890625,
0.050140380859375,
-0.035491943359375,
-0.0171966552734375,
-0.004444122314453125,
0.06719970703125,
0.0141754150390625,
-0.003360748291015625,
0.007801055908203125,
-0.034942626953125,
-0.0258941650390625,
0.0212860107421875,
-0.048675537109375,
-0.04229736328125,
0.00902557373046875,
-0.039947509765625,
-0.037353515625,
0.0084228515625,
-0.0391845703125,
-0.00403594970703125,
-0.01953125,
0.04241943359375,
-0.052001953125,
-0.01959228515625,
0.0010700225830078125,
-0.0262908935546875,
0.0242462158203125,
0.032440185546875,
-0.05072021484375,
-0.0010852813720703125,
0.026947021484375,
0.062225341796875,
-0.0225677490234375,
-0.0267791748046875,
-0.0003981590270996094,
-0.0122528076171875,
-0.0228271484375,
0.036224365234375,
-0.00952911376953125,
-0.00983428955078125,
-0.0218505859375,
-0.01035308837890625,
-0.0179901123046875,
-0.0318603515625,
0.04986572265625,
-0.028289794921875,
0.04327392578125,
-0.02197265625,
-0.041717529296875,
-0.0289764404296875,
0.023681640625,
-0.03955078125,
0.0753173828125,
0.0162506103515625,
-0.07781982421875,
0.030059814453125,
-0.03912353515625,
-0.0108795166015625,
-0.016357421875,
0.0161285400390625,
-0.046142578125,
0.009735107421875,
0.004947662353515625,
0.03179931640625,
-0.016143798828125,
0.0433349609375,
-0.0283203125,
-0.019866943359375,
0.0224456787109375,
-0.028778076171875,
0.09613037109375,
0.025909423828125,
-0.00885009765625,
-0.006008148193359375,
-0.06500244140625,
-0.0198516845703125,
0.02154541015625,
-0.0259552001953125,
-0.0262908935546875,
-0.02679443359375,
0.0037689208984375,
0.024444580078125,
0.0362548828125,
-0.032470703125,
0.0206146240234375,
-0.023834228515625,
0.0195465087890625,
0.059661865234375,
-0.00018918514251708984,
0.022796630859375,
-0.042999267578125,
0.026458740234375,
0.0274200439453125,
0.0262908935546875,
0.006084442138671875,
-0.047119140625,
-0.048370361328125,
-0.054962158203125,
0.044921875,
0.05157470703125,
-0.05224609375,
0.0550537109375,
-0.0191802978515625,
-0.051910400390625,
-0.035308837890625,
-0.004878997802734375,
0.02679443359375,
0.053619384765625,
0.020355224609375,
-0.0372314453125,
-0.04266357421875,
-0.0965576171875,
-0.0012731552124023438,
-0.01898193359375,
-0.0022220611572265625,
-0.00018930435180664062,
0.059417724609375,
0.00778961181640625,
0.0733642578125,
-0.0290374755859375,
-0.0194549560546875,
-0.0204925537109375,
0.0197601318359375,
0.055938720703125,
0.042572021484375,
0.0643310546875,
-0.056121826171875,
-0.036041259765625,
-0.038421630859375,
-0.054718017578125,
-0.0034275054931640625,
-0.0005474090576171875,
-0.0157012939453125,
0.0208740234375,
-0.0021209716796875,
-0.030181884765625,
0.0372314453125,
0.03497314453125,
-0.0274810791015625,
0.054107666015625,
-0.0144195556640625,
-0.00038909912109375,
-0.0823974609375,
0.020172119140625,
-0.0024242401123046875,
-0.0195159912109375,
-0.05450439453125,
-0.0121307373046875,
-0.006793975830078125,
0.00418853759765625,
-0.041717529296875,
0.018402099609375,
-0.03326416015625,
-0.0018701553344726562,
-0.0016956329345703125,
-0.0005517005920410156,
-0.0036334991455078125,
0.0555419921875,
0.004825592041015625,
0.041900634765625,
0.052459716796875,
-0.0312042236328125,
0.013580322265625,
0.0188751220703125,
-0.031646728515625,
0.03900146484375,
-0.058380126953125,
-0.006168365478515625,
-0.0108489990234375,
0.019866943359375,
-0.07891845703125,
0.0032291412353515625,
0.022918701171875,
-0.034332275390625,
0.0311431884765625,
-0.0104522705078125,
-0.044036865234375,
-0.04486083984375,
-0.045806884765625,
0.00708770751953125,
0.049774169921875,
-0.035675048828125,
0.034149169921875,
0.02667236328125,
0.005157470703125,
-0.061248779296875,
-0.061187744140625,
-0.016571044921875,
-0.032135009765625,
-0.055694580078125,
0.0164031982421875,
-0.00726318359375,
-0.00336456298828125,
-0.0013017654418945312,
-0.0028228759765625,
-0.01262664794921875,
0.01270294189453125,
0.005710601806640625,
0.052490234375,
-0.00016605854034423828,
-0.01059722900390625,
0.0010232925415039062,
0.0048675537109375,
-0.004207611083984375,
-0.0167999267578125,
0.034393310546875,
-0.006908416748046875,
-0.0003731250762939453,
-0.041595458984375,
0.017333984375,
0.037506103515625,
-0.006256103515625,
0.06878662109375,
0.033050537109375,
-0.0124664306640625,
0.0144195556640625,
-0.0433349609375,
-0.0177154541015625,
-0.0345458984375,
0.0139923095703125,
-0.0302734375,
-0.044952392578125,
0.051788330078125,
0.0157623291015625,
0.0176849365234375,
0.049346923828125,
0.041717529296875,
-0.0278778076171875,
0.051361083984375,
0.03375244140625,
-0.0003216266632080078,
0.04443359375,
-0.051544189453125,
0.029876708984375,
-0.06396484375,
-0.01383209228515625,
-0.02471923828125,
-0.055694580078125,
-0.048187255859375,
-0.01239776611328125,
0.037994384765625,
0.0369873046875,
-0.01342010498046875,
0.035552978515625,
-0.051666259765625,
0.0207977294921875,
0.05718994140625,
-0.009033203125,
0.00707244873046875,
-0.005352020263671875,
-0.0198974609375,
0.0027523040771484375,
-0.0638427734375,
-0.03216552734375,
0.087646484375,
0.0309906005859375,
0.02783203125,
0.0235748291015625,
0.08514404296875,
0.014068603515625,
0.0098724365234375,
-0.0226287841796875,
0.042388916015625,
-0.00888824462890625,
-0.073974609375,
-0.016845703125,
-0.0243682861328125,
-0.0758056640625,
0.039398193359375,
-0.023773193359375,
-0.076171875,
0.04620361328125,
0.01270294189453125,
-0.0281982421875,
0.0243682861328125,
-0.0672607421875,
0.05316162109375,
-0.0115966796875,
-0.034820556640625,
0.0111236572265625,
-0.06640625,
0.047454833984375,
-0.01023101806640625,
0.00136566162109375,
-0.004886627197265625,
0.0202789306640625,
0.06976318359375,
-0.052459716796875,
0.06280517578125,
-0.020050048828125,
-0.00307464599609375,
0.039459228515625,
-0.007045745849609375,
0.048065185546875,
-0.00982666015625,
0.0017347335815429688,
0.0333251953125,
0.0233612060546875,
-0.0335693359375,
-0.0261077880859375,
0.04095458984375,
-0.071044921875,
-0.041015625,
-0.0684814453125,
-0.038848876953125,
0.00013720989227294922,
0.0163116455078125,
0.039093017578125,
0.0179443359375,
0.00861358642578125,
0.008331298828125,
0.04656982421875,
-0.022308349609375,
0.0411376953125,
0.034454345703125,
-0.0230255126953125,
-0.019683837890625,
0.053802490234375,
0.029327392578125,
0.010986328125,
0.019378662109375,
0.0204925537109375,
-0.0289459228515625,
-0.055267333984375,
-0.0211944580078125,
0.01551055908203125,
-0.03912353515625,
-0.022308349609375,
-0.056427001953125,
-0.0226593017578125,
-0.04052734375,
-0.005672454833984375,
-0.0301055908203125,
-0.051727294921875,
-0.02587890625,
-0.0157928466796875,
0.0311126708984375,
0.038360595703125,
-0.0298309326171875,
0.00782012939453125,
-0.0516357421875,
-0.00298309326171875,
0.0135040283203125,
0.02227783203125,
0.0126953125,
-0.060699462890625,
-0.006114959716796875,
-0.0031833648681640625,
-0.04010009765625,
-0.07171630859375,
0.04827880859375,
-0.0048065185546875,
0.031341552734375,
0.034942626953125,
0.012603759765625,
0.052581787109375,
-0.0189666748046875,
0.07354736328125,
0.034759521484375,
-0.0562744140625,
0.04266357421875,
-0.036285400390625,
0.0098419189453125,
0.05328369140625,
0.042510986328125,
-0.00870513916015625,
-0.034942626953125,
-0.0693359375,
-0.09173583984375,
0.0819091796875,
0.0290679931640625,
-0.01470947265625,
0.0122528076171875,
-0.00215911865234375,
0.02130126953125,
0.0182952880859375,
-0.06658935546875,
-0.03363037109375,
-0.0168304443359375,
0.0005717277526855469,
-0.0007987022399902344,
-0.0265350341796875,
-0.0126953125,
-0.038177490234375,
0.066162109375,
0.0288238525390625,
0.0308074951171875,
0.021514892578125,
0.001956939697265625,
0.0238800048828125,
0.00426483154296875,
0.047119140625,
0.05010986328125,
-0.03973388671875,
0.004566192626953125,
0.0175323486328125,
-0.045562744140625,
-0.0030879974365234375,
0.0221405029296875,
-0.023193359375,
0.004184722900390625,
0.0389404296875,
0.06878662109375,
-0.011077880859375,
-0.04638671875,
0.05145263671875,
0.00536346435546875,
-0.038421630859375,
-0.046905517578125,
-0.00289154052734375,
-0.00298309326171875,
0.0250701904296875,
0.03363037109375,
0.00574493408203125,
0.034332275390625,
-0.033905029296875,
0.00830078125,
0.018951416015625,
-0.0101470947265625,
-0.0267181396484375,
0.035736083984375,
0.0038280487060546875,
0.0024127960205078125,
0.046875,
-0.03717041015625,
-0.03936767578125,
0.02734375,
0.030548095703125,
0.06304931640625,
-0.0033855438232421875,
0.004421234130859375,
0.052642822265625,
0.0197296142578125,
-0.016082763671875,
0.02691650390625,
-0.002086639404296875,
-0.04571533203125,
-0.01351165771484375,
-0.058380126953125,
0.0107269287109375,
0.0195159912109375,
-0.0538330078125,
0.01132965087890625,
-0.016998291015625,
-0.0236663818359375,
0.0136260986328125,
0.005733489990234375,
-0.040985107421875,
-0.0003147125244140625,
0.0083160400390625,
0.06256103515625,
-0.07098388671875,
0.06744384765625,
0.0499267578125,
-0.0362548828125,
-0.0675048828125,
-0.015167236328125,
-0.0013303756713867188,
-0.04034423828125,
0.058837890625,
-0.0159912109375,
0.0072174072265625,
-0.026214599609375,
-0.044677734375,
-0.094482421875,
0.08587646484375,
0.0241241455078125,
-0.0572509765625,
-0.002269744873046875,
-0.0028076171875,
0.052886962890625,
-0.0024433135986328125,
0.028472900390625,
0.03350830078125,
0.0328369140625,
0.026611328125,
-0.0799560546875,
-0.02099609375,
-0.040069580078125,
0.012420654296875,
0.0232696533203125,
-0.045806884765625,
0.08233642578125,
-0.0009984970092773438,
-0.006763458251953125,
-0.00457000732421875,
0.03741455078125,
0.031951904296875,
0.01641845703125,
0.0362548828125,
0.0665283203125,
0.054351806640625,
-0.0192108154296875,
0.0992431640625,
-0.01849365234375,
0.0304107666015625,
0.07781982421875,
-0.00243377685546875,
0.0419921875,
0.045806884765625,
-0.031890869140625,
0.023773193359375,
0.048858642578125,
-0.0207061767578125,
0.055267333984375,
0.0154266357421875,
0.00856781005859375,
0.0014972686767578125,
0.01328277587890625,
-0.0283966064453125,
0.035064697265625,
-0.006351470947265625,
-0.036041259765625,
-0.00946044921875,
0.00858306884765625,
0.003734588623046875,
-0.0169677734375,
-0.0148773193359375,
0.05078125,
-0.0062713623046875,
-0.049346923828125,
0.06951904296875,
0.0048065185546875,
0.063720703125,
-0.0401611328125,
0.0162811279296875,
-0.0108642578125,
0.0171661376953125,
-0.006107330322265625,
-0.056549072265625,
0.0328369140625,
0.0101318359375,
-0.0183258056640625,
-0.028350830078125,
0.041015625,
-0.0241851806640625,
-0.054534912109375,
0.01259613037109375,
0.031158447265625,
0.032562255859375,
0.032501220703125,
-0.0579833984375,
-0.0075531005859375,
0.008941650390625,
-0.037078857421875,
0.03363037109375,
0.034820556640625,
0.0032787322998046875,
0.0291900634765625,
0.0439453125,
-0.0011777877807617188,
0.0092926025390625,
-0.0171356201171875,
0.0579833984375,
-0.0209808349609375,
-0.041839599609375,
-0.058685302734375,
0.05224609375,
-0.0142974853515625,
-0.044708251953125,
0.06060791015625,
0.060150146484375,
0.07269287109375,
-0.0215606689453125,
0.0404052734375,
-0.0170135498046875,
0.046783447265625,
-0.01763916015625,
0.05145263671875,
-0.054351806640625,
0.00672149658203125,
-0.039154052734375,
-0.05950927734375,
-0.01287078857421875,
0.06634521484375,
-0.04461669921875,
0.00527191162109375,
0.034820556640625,
0.02728271484375,
-0.0023651123046875,
-0.0020008087158203125,
0.01311492919921875,
0.0174713134765625,
0.0050201416015625,
0.049224853515625,
0.057403564453125,
-0.039642333984375,
0.0220794677734375,
-0.042327880859375,
-0.0171661376953125,
-0.0174713134765625,
-0.054107666015625,
-0.058380126953125,
-0.0643310546875,
-0.034454345703125,
-0.0158843994140625,
0.0069122314453125,
0.066650390625,
0.05194091796875,
-0.059417724609375,
-0.012603759765625,
0.005695343017578125,
-0.003662109375,
-0.033294677734375,
-0.0159149169921875,
0.038177490234375,
-0.0009794235229492188,
-0.0660400390625,
0.0001976490020751953,
0.006793975830078125,
0.0189971923828125,
-0.00443267822265625,
0.00675201416015625,
-0.05572509765625,
0.011474609375,
0.036834716796875,
-0.001369476318359375,
-0.0307769775390625,
-0.0039825439453125,
-0.0038814544677734375,
-0.004680633544921875,
0.0091400146484375,
0.0202789306640625,
-0.041900634765625,
0.0165863037109375,
0.054443359375,
0.0151519775390625,
0.050018310546875,
-0.0113067626953125,
0.01354217529296875,
-0.055877685546875,
0.024383544921875,
0.037841796875,
0.035308837890625,
0.0089874267578125,
-0.0130462646484375,
0.04791259765625,
0.016448974609375,
-0.029266357421875,
-0.0841064453125,
-0.0032100677490234375,
-0.0819091796875,
-0.031463623046875,
0.06231689453125,
-0.01605224609375,
-0.0194549560546875,
0.01139068603515625,
0.00742340087890625,
0.0142059326171875,
-0.038787841796875,
0.069580078125,
0.0751953125,
-0.015045166015625,
0.0006422996520996094,
-0.032012939453125,
0.01526641845703125,
0.02996826171875,
-0.026763916015625,
-0.01194000244140625,
0.03656005859375,
0.018951416015625,
0.039093017578125,
0.0411376953125,
-0.0238800048828125,
0.0097808837890625,
-0.011474609375,
0.0160064697265625,
-0.00637054443359375,
-0.0225067138671875,
-0.0243682861328125,
0.01403045654296875,
-0.010894775390625,
-0.0277557373046875
]
] |
ku-nlp/deberta-v2-base-japanese-char-wwm | 2023-03-26T03:32:27.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"fill-mask",
"deberta",
"character",
"wwm",
"ja",
"dataset:wikipedia",
"dataset:cc100",
"dataset:oscar",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | ku-nlp | null | null | ku-nlp/deberta-v2-base-japanese-char-wwm | 0 | 12,239 | transformers | 2023-01-18T13:55:30 | ---
language: ja
license: cc-by-sa-4.0
library_name: transformers
tags:
- deberta
- deberta-v2
- fill-mask
- character
- wwm
datasets:
- wikipedia
- cc100
- oscar
metrics:
- accuracy
mask_token: "[MASK]"
widget:
- text: "京都大学で自然言語処理を[MASK][MASK]する。"
---
# Model Card for Japanese character-level DeBERTa V2 base
## Model description
This is a Japanese DeBERTa V2 base model pre-trained on Japanese Wikipedia, the Japanese portion of CC-100, and the Japanese portion of OSCAR.
This model is trained with character-level tokenization and whole word masking.
## How to use
You can use this model for masked language modeling as follows:
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained('ku-nlp/deberta-v2-base-japanese-char-wwm')
model = AutoModelForMaskedLM.from_pretrained('ku-nlp/deberta-v2-base-japanese-char-wwm')
sentence = '京都大学で自然言語処理を[MASK][MASK]する。'
encoding = tokenizer(sentence, return_tensors='pt')
# Minimal continuation (an assumption, not the card's original code):
# run the model and decode the most likely token at each [MASK] position.
output = model(**encoding)
mask_positions = (encoding.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_ids = output.logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```
You can also fine-tune this model on downstream tasks.
## Tokenization
There is no need to tokenize texts in advance, and you can give raw texts to the tokenizer.
The texts are tokenized into character-level tokens by [sentencepiece](https://github.com/google/sentencepiece).
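For illustration, a minimal sketch of passing raw text to the tokenizer; the example sentence and the exact token output are assumptions, since they depend on the trained sentencepiece model:
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('ku-nlp/deberta-v2-base-japanese-char-wwm')

# Raw, unsegmented Japanese text can be passed directly.
tokens = tokenizer.tokenize('京都大学で自然言語処理を研究する。')
print(tokens)  # expected: a sequence of (mostly) single-character subword tokens
```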
## Training data
We used the following corpora for pre-training:
- Japanese Wikipedia (as of 20221020, 3.2GB, 27M sentences, 1.3M documents)
- Japanese portion of CC-100 (85GB, 619M sentences, 66M documents)
- Japanese portion of OSCAR (54GB, 326M sentences, 25M documents)
Note that we filtered out documents annotated with "header", "footer", or "noisy" tags in OSCAR.
Also note that Japanese Wikipedia was duplicated 10 times to make the total size of the corpus comparable to that of CC-100 and OSCAR. As a result, the total size of the training data is 171GB.
## Training procedure
We first segmented texts in the corpora into words using [Juman++ 2.0.0-rc3](https://github.com/ku-nlp/jumanpp/releases/tag/v2.0.0-rc3) for whole word masking.
Then, we built a sentencepiece model with 22,012 tokens including all characters that appear in the training corpus.
We tokenized raw corpora into character-level subwords using the sentencepiece model and trained the Japanese DeBERTa model using [transformers](https://github.com/huggingface/transformers) library.
The training took 20 days using 8 NVIDIA A100-SXM4-40GB GPUs.
The following hyperparameters were used during pre-training:
- learning_rate: 2e-4
- per_device_train_batch_size: 46
- distributed_type: multi-GPU
- num_devices: 8
- gradient_accumulation_steps: 6
- total_train_batch_size: 2,208
- max_seq_length: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-06
- lr_scheduler_type: linear schedule with warmup (lr = 0 at 500k steps)
- training_steps: 320,000
- warmup_steps: 10,000
## Acknowledgments
This work was supported by Joint Usage/Research Center for Interdisciplinary Large-scale Information Infrastructures (JHPCN) through General Collaboration Project no. jh221004, "Developing a Platform for Constructing and Sharing of Large-Scale Japanese Language Models".
For training the models, we used mdx: a platform for the data-driven future.
| 3,223 | [
[
-0.035980224609375,
-0.060455322265625,
0.0165252685546875,
0.007579803466796875,
-0.041595458984375,
0.006916046142578125,
-0.0111083984375,
-0.0311279296875,
0.032501220703125,
0.039642333984375,
-0.047149658203125,
-0.04254150390625,
-0.058258056640625,
0.007701873779296875,
-0.0024433135986328125,
0.06732177734375,
-0.007579803466796875,
0.0165557861328125,
0.0172576904296875,
0.0084381103515625,
-0.047698974609375,
-0.0426025390625,
-0.06298828125,
-0.0279998779296875,
0.0303192138671875,
0.03546142578125,
0.045257568359375,
0.05218505859375,
0.0205841064453125,
0.0169219970703125,
-0.005680084228515625,
0.0020599365234375,
-0.02667236328125,
-0.0070953369140625,
0.0017490386962890625,
-0.05035400390625,
-0.0250396728515625,
0.007106781005859375,
0.049468994140625,
0.043212890625,
0.00719451904296875,
0.0088653564453125,
-0.005802154541015625,
0.043548583984375,
-0.04833984375,
0.0295562744140625,
-0.049652099609375,
0.0254974365234375,
-0.022857666015625,
0.00772857666015625,
-0.022796630859375,
0.0131378173828125,
-0.01496124267578125,
-0.064208984375,
-0.0013265609741210938,
-0.00447845458984375,
0.0806884765625,
0.0131988525390625,
-0.01104736328125,
-0.010772705078125,
-0.0428466796875,
0.0474853515625,
-0.0535888671875,
0.0194854736328125,
0.05242919921875,
0.01276397705078125,
-0.0038394927978515625,
-0.0487060546875,
-0.061431884765625,
-0.01351165771484375,
-0.0122222900390625,
0.0226287841796875,
-0.01227569580078125,
-0.004241943359375,
0.0526123046875,
0.0171661376953125,
-0.057098388671875,
0.03228759765625,
-0.03857421875,
-0.00189971923828125,
0.038970947265625,
0.01421356201171875,
0.033660888671875,
-0.026275634765625,
-0.024261474609375,
-0.031280517578125,
-0.04388427734375,
-0.00011742115020751953,
0.04327392578125,
0.00321197509765625,
-0.014617919921875,
0.0311431884765625,
-0.0252227783203125,
0.0240325927734375,
0.00934600830078125,
-0.0126953125,
0.03302001953125,
-0.007083892822265625,
-0.0234527587890625,
0.01229095458984375,
0.0816650390625,
0.004741668701171875,
0.0113983154296875,
-0.0028228759765625,
0.004642486572265625,
0.01071929931640625,
0.0211029052734375,
-0.0699462890625,
-0.0184326171875,
0.0045166015625,
-0.0211029052734375,
-0.0234527587890625,
-0.006622314453125,
-0.05419921875,
-0.0111236572265625,
-0.03466796875,
0.039764404296875,
-0.04376220703125,
-0.020263671875,
0.01302337646484375,
-0.009429931640625,
0.02203369140625,
0.007366180419921875,
-0.07928466796875,
0.017333984375,
0.0360107421875,
0.05621337890625,
-0.003173828125,
-0.033935546875,
-0.0258331298828125,
-0.0015954971313476562,
-0.02667236328125,
0.025909423828125,
-0.0292205810546875,
-0.022125244140625,
-0.0135955810546875,
0.0230560302734375,
-0.021881103515625,
-0.02239990234375,
0.04248046875,
-0.03424072265625,
0.031524658203125,
0.004970550537109375,
-0.0389404296875,
-0.014007568359375,
0.022003173828125,
-0.049041748046875,
0.0753173828125,
0.0214385986328125,
-0.070068359375,
0.0230560302734375,
-0.048126220703125,
-0.0191650390625,
0.023223876953125,
-0.007205963134765625,
-0.026824951171875,
-0.007282257080078125,
0.030120849609375,
0.03375244140625,
-0.000988006591796875,
0.0300140380859375,
-0.01486968994140625,
-0.040557861328125,
0.01434326171875,
-0.0298614501953125,
0.07672119140625,
0.0183258056640625,
-0.053375244140625,
-0.017425537109375,
-0.07220458984375,
-0.011627197265625,
0.0220794677734375,
-0.01291656494140625,
-0.033843994140625,
-0.0274658203125,
0.00626373291015625,
0.02801513671875,
0.01806640625,
-0.045989990234375,
0.027313232421875,
-0.0447998046875,
0.031707763671875,
0.036163330078125,
-0.0127716064453125,
0.0218048095703125,
0.005191802978515625,
0.05804443359375,
0.022430419921875,
0.021270751953125,
-0.0253448486328125,
-0.029937744140625,
-0.080322265625,
-0.026611328125,
0.0328369140625,
0.045654296875,
-0.061065673828125,
0.060089111328125,
-0.021636962890625,
-0.058074951171875,
-0.05560302734375,
-0.01023101806640625,
0.035980224609375,
0.036376953125,
0.025054931640625,
-0.038360595703125,
-0.05157470703125,
-0.07916259765625,
0.01678466796875,
-0.00772857666015625,
-0.0031528472900390625,
-0.004367828369140625,
0.056854248046875,
-0.03619384765625,
0.04815673828125,
-0.03564453125,
-0.0277862548828125,
-0.0190582275390625,
0.01285552978515625,
0.0203094482421875,
0.04925537109375,
0.029937744140625,
-0.03887939453125,
-0.040435791015625,
-0.0164642333984375,
-0.050933837890625,
0.0027217864990234375,
-0.00397491455078125,
-0.02374267578125,
0.0274810791015625,
0.042083740234375,
-0.043975830078125,
0.021087646484375,
0.041839599609375,
-0.022125244140625,
0.0232696533203125,
-0.0236358642578125,
0.0030059814453125,
-0.10968017578125,
0.03692626953125,
-0.01424407958984375,
-0.01541900634765625,
-0.046234130859375,
0.01409149169921875,
-0.0014505386352539062,
-0.032379150390625,
-0.03533935546875,
0.037322998046875,
-0.03228759765625,
0.0101776123046875,
-0.03887939453125,
0.02056884765625,
0.0081787109375,
0.06951904296875,
0.0304107666015625,
0.048126220703125,
0.048492431640625,
-0.036224365234375,
0.0085906982421875,
0.031280517578125,
-0.044525146484375,
0.0239715576171875,
-0.06060791015625,
0.0157012939453125,
-0.0177459716796875,
0.0213470458984375,
-0.06378173828125,
-0.0111236572265625,
0.045806884765625,
-0.033233642578125,
0.039947509765625,
-0.009979248046875,
-0.051727294921875,
-0.028594970703125,
-0.0150299072265625,
0.0164794921875,
0.051971435546875,
-0.034759521484375,
0.0284271240234375,
0.038116455078125,
-0.0096588134765625,
-0.0545654296875,
-0.053497314453125,
0.01457977294921875,
-0.0184783935546875,
-0.0249176025390625,
0.038970947265625,
-0.007656097412109375,
0.01050567626953125,
-0.01422119140625,
0.013092041015625,
-0.005138397216796875,
0.02099609375,
0.0221099853515625,
0.0239715576171875,
0.0074310302734375,
-0.005397796630859375,
0.019012451171875,
-0.0193023681640625,
-0.0007233619689941406,
-0.0043182373046875,
0.0709228515625,
0.007049560546875,
0.0023632049560546875,
-0.048309326171875,
0.0177764892578125,
0.037750244140625,
-0.01074981689453125,
0.07293701171875,
0.06805419921875,
-0.0293426513671875,
0.006069183349609375,
-0.0286865234375,
-0.004459381103515625,
-0.030029296875,
0.041778564453125,
-0.0416259765625,
-0.05596923828125,
0.0394287109375,
0.02191162109375,
-0.00130462646484375,
0.061004638671875,
0.039093017578125,
0.00702667236328125,
0.08831787109375,
0.03936767578125,
-0.026611328125,
0.037689208984375,
-0.047698974609375,
-0.0008378028869628906,
-0.07861328125,
-0.0255279541015625,
-0.038604736328125,
-0.01751708984375,
-0.049530029296875,
-0.0246734619140625,
0.0131378173828125,
0.02679443359375,
-0.0251617431640625,
0.0513916015625,
-0.034698486328125,
0.041412353515625,
0.042633056640625,
0.00272369384765625,
0.0098419189453125,
0.01355743408203125,
0.001171112060546875,
-0.00897216796875,
-0.05419921875,
-0.037200927734375,
0.07196044921875,
0.051300048828125,
0.044647216796875,
-0.0210113525390625,
0.049468994140625,
0.003936767578125,
-0.00824737548828125,
-0.050811767578125,
0.0321044921875,
-0.0201263427734375,
-0.048736572265625,
-0.0165557861328125,
-0.0250244140625,
-0.0731201171875,
0.031951904296875,
-0.019256591796875,
-0.04888916015625,
0.008392333984375,
-0.0097503662109375,
-0.0011997222900390625,
0.01511383056640625,
-0.031005859375,
0.07275390625,
-0.0083160400390625,
-0.0015707015991210938,
-0.0213775634765625,
-0.06744384765625,
0.02447509765625,
-0.0160369873046875,
0.004085540771484375,
-0.001499176025390625,
0.00774383544921875,
0.0806884765625,
-0.0176849365234375,
0.066162109375,
-0.019287109375,
-0.003398895263671875,
0.0093994140625,
-0.010009765625,
0.028594970703125,
0.0100860595703125,
0.0157012939453125,
0.043212890625,
0.007335662841796875,
-0.02008056640625,
-0.03094482421875,
0.045166015625,
-0.093017578125,
-0.029022216796875,
-0.049041748046875,
-0.0238037109375,
-0.003971099853515625,
0.020263671875,
0.039031982421875,
0.032928466796875,
-0.00801849365234375,
0.0233154296875,
0.053955078125,
-0.0226287841796875,
0.0261383056640625,
0.052490234375,
-0.0117340087890625,
-0.04620361328125,
0.054962158203125,
0.014007568359375,
0.00502777099609375,
0.039215087890625,
0.0121307373046875,
-0.0223236083984375,
-0.024566650390625,
-0.056488037109375,
0.030242919921875,
-0.04315185546875,
-0.0017671585083007812,
-0.0748291015625,
-0.02630615234375,
-0.03619384765625,
0.0139007568359375,
-0.0283355712890625,
-0.045196533203125,
-0.029632568359375,
0.00039768218994140625,
0.0010137557983398438,
0.0247344970703125,
0.0104522705078125,
0.036285400390625,
-0.054443359375,
0.01427459716796875,
-0.0024471282958984375,
0.01116943359375,
-0.005680084228515625,
-0.0687255859375,
-0.039306640625,
0.0176849365234375,
-0.0269775390625,
-0.049468994140625,
0.0369873046875,
-0.00616455078125,
0.03668212890625,
0.0242919921875,
-0.01178741455078125,
0.048553466796875,
-0.036956787109375,
0.0706787109375,
0.0226593017578125,
-0.0677490234375,
0.047698974609375,
-0.018798828125,
0.039398193359375,
0.06005859375,
0.057098388671875,
-0.052398681640625,
-0.034881591796875,
-0.06683349609375,
-0.058624267578125,
0.0743408203125,
0.016815185546875,
0.0235443115234375,
-0.00548553466796875,
0.0251312255859375,
0.00943756103515625,
0.01544189453125,
-0.06964111328125,
-0.0202789306640625,
-0.0241851806640625,
-0.017333984375,
-0.019195556640625,
-0.0185089111328125,
0.00798797607421875,
-0.0295562744140625,
0.08831787109375,
-0.00714874267578125,
0.01812744140625,
0.01480865478515625,
-0.023223876953125,
0.01241302490234375,
0.01160430908203125,
0.051300048828125,
0.036163330078125,
-0.01427459716796875,
-0.00684356689453125,
0.007633209228515625,
-0.05328369140625,
0.005435943603515625,
0.0186614990234375,
-0.029632568359375,
0.019500732421875,
0.021270751953125,
0.099365234375,
-0.0014352798461914062,
-0.050445556640625,
0.036468505859375,
-0.00865936279296875,
-0.0250244140625,
-0.04388427734375,
0.00957489013671875,
-0.00707244873046875,
0.0028781890869140625,
0.006702423095703125,
-0.032806396484375,
0.0025196075439453125,
-0.05316162109375,
-0.00799560546875,
0.0157318115234375,
-0.0308837890625,
-0.0194549560546875,
0.056671142578125,
0.0055084228515625,
-0.01178741455078125,
0.0579833984375,
-0.01306915283203125,
-0.0450439453125,
0.04620361328125,
0.05108642578125,
0.07147216796875,
-0.0145416259765625,
0.018951416015625,
0.0614013671875,
0.0166473388671875,
-0.0005426406860351562,
-0.0041046142578125,
-0.0022029876708984375,
-0.04400634765625,
-0.0244598388671875,
-0.060882568359375,
-0.0233154296875,
0.044281005859375,
-0.0406494140625,
0.0260162353515625,
-0.03082275390625,
0.0010423660278320312,
-0.0029926300048828125,
0.0289764404296875,
-0.041107177734375,
0.0295867919921875,
0.028167724609375,
0.06549072265625,
-0.06890869140625,
0.07623291015625,
0.045318603515625,
-0.053619384765625,
-0.0648193359375,
-0.0202789306640625,
-0.035491943359375,
-0.0697021484375,
0.06744384765625,
0.0316162109375,
0.01480865478515625,
-0.006397247314453125,
-0.0296173095703125,
-0.05865478515625,
0.07989501953125,
0.014495849609375,
-0.07244873046875,
-0.01145172119140625,
0.0272979736328125,
0.05322265625,
-0.029388427734375,
0.026397705078125,
0.035430908203125,
0.0171661376953125,
-0.0004127025604248047,
-0.058685302734375,
0.0001443624496459961,
-0.0302581787109375,
0.0196685791015625,
0.01396942138671875,
-0.047088623046875,
0.0574951171875,
0.026519775390625,
-0.006580352783203125,
0.001194000244140625,
0.03607177734375,
0.01678466796875,
0.0030574798583984375,
0.04449462890625,
0.06689453125,
0.045654296875,
0.006061553955078125,
0.05517578125,
-0.035125732421875,
0.014556884765625,
0.07086181640625,
-0.005107879638671875,
0.054931640625,
0.0230560302734375,
-0.00560760498046875,
0.056854248046875,
0.03814697265625,
-0.022491455078125,
0.04034423828125,
0.02142333984375,
-0.00647735595703125,
-0.0026607513427734375,
0.00007635354995727539,
-0.0278167724609375,
0.0496826171875,
0.0139312744140625,
-0.039764404296875,
0.007213592529296875,
0.031890869140625,
0.036590576171875,
-0.01416778564453125,
-0.0222015380859375,
0.06658935546875,
0.003604888916015625,
-0.072998046875,
0.04730224609375,
0.02447509765625,
0.0533447265625,
-0.063232421875,
0.01049041748046875,
-0.0239410400390625,
0.020172119140625,
0.009796142578125,
-0.04052734375,
0.01593017578125,
0.013641357421875,
-0.01360321044921875,
-0.0081939697265625,
0.040435791015625,
-0.045013427734375,
-0.0455322265625,
0.01027679443359375,
0.006381988525390625,
0.0303192138671875,
-0.00511932373046875,
-0.05438232421875,
0.007198333740234375,
-0.00487518310546875,
-0.0254364013671875,
0.03900146484375,
0.00930023193359375,
-0.0024471282958984375,
0.02679443359375,
0.037200927734375,
-0.002384185791015625,
-0.01374053955078125,
0.01470947265625,
0.05572509765625,
-0.032745361328125,
-0.040771484375,
-0.065185546875,
0.0285797119140625,
-0.0147552490234375,
-0.0286407470703125,
0.050506591796875,
0.053955078125,
0.076171875,
-0.0205078125,
0.049041748046875,
-0.0139312744140625,
0.0208587646484375,
-0.05548095703125,
0.0545654296875,
-0.042327880859375,
-0.0101318359375,
-0.022857666015625,
-0.075439453125,
-0.005001068115234375,
0.0538330078125,
-0.0006999969482421875,
-0.0115509033203125,
0.04681396484375,
0.0538330078125,
-0.00444793701171875,
-0.01282501220703125,
0.02056884765625,
0.0040435791015625,
0.0256500244140625,
0.0548095703125,
0.043853759765625,
-0.060699462890625,
0.03240966796875,
-0.038543701171875,
-0.0128021240234375,
-0.00482177734375,
-0.047760009765625,
-0.0794677734375,
-0.048065185546875,
-0.04083251953125,
-0.0226287841796875,
-0.005596160888671875,
0.0633544921875,
0.048675537109375,
-0.0406494140625,
-0.0106353759765625,
-0.01506805419921875,
-0.032379150390625,
-0.0031185150146484375,
-0.017730712890625,
0.040313720703125,
-0.027801513671875,
-0.07781982421875,
0.0229949951171875,
-0.00543212890625,
0.0038318634033203125,
-0.03070068359375,
-0.0008535385131835938,
-0.01439666748046875,
-0.00506591796875,
0.04522705078125,
0.007122039794921875,
-0.033538818359375,
-0.00481414794921875,
0.0022029876708984375,
-0.0210113525390625,
0.0011606216430664062,
0.035797119140625,
-0.056854248046875,
0.035400390625,
0.034759521484375,
0.034332275390625,
0.07000732421875,
-0.0139007568359375,
0.026947021484375,
-0.049041748046875,
0.0292205810546875,
0.003997802734375,
0.033416748046875,
0.019866943359375,
-0.0201263427734375,
0.0460205078125,
0.0290374755859375,
-0.03948974609375,
-0.050689697265625,
0.00862884521484375,
-0.08209228515625,
-0.02655029296875,
0.08563232421875,
-0.016876220703125,
-0.025238037109375,
0.01355743408203125,
-0.036285400390625,
0.03594970703125,
0.0019626617431640625,
0.05419921875,
0.060089111328125,
0.0294342041015625,
-0.0082855224609375,
-0.0288238525390625,
0.02691650390625,
0.0226898193359375,
-0.060272216796875,
-0.00853729248046875,
0.0287322998046875,
0.029327392578125,
0.026397705078125,
0.061492919921875,
-0.01526641845703125,
0.007289886474609375,
0.0023059844970703125,
0.0131378173828125,
-0.0216827392578125,
-0.0301361083984375,
-0.0316162109375,
-0.0047454833984375,
-0.00846099853515625,
-0.0257568359375
]
] |
EleutherAI/pythia-2.8b-deduped | 2023-07-09T16:06:37.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2304.01373",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-2.8b-deduped | 12 | 12,222 | transformers | 2023-02-10T22:26:20 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- EleutherAI/the_pile_deduplicated
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf).
It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. We also provide 154 intermediate
checkpoints per model, hosted on Hugging Face as branches.
The Pythia model suite was designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>
Previously, we released an early version of the Pythia suite to the public.
However, we decided to retrain the model suite to address a few hyperparameter
discrepancies. This model card <a href="#changelog">lists the changes</a>;
see appendix B in the Pythia paper for further discussion. We found no
difference in benchmark performance between the two Pythia versions.
The old models are
[still available](https://huggingface.co/models?other=pythia_v0), but we
suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
</details>
<br>
# Pythia-2.8B-deduped
## Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
[See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation
details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
## Uses and Limitations
### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. We also provide
154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints
`step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to
`step143000`. These checkpoints are hosted on Hugging Face as branches. Note
that branch `143000` corresponds exactly to the model checkpoint on the `main`
branch of each model.
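As a minimal sketch (not from the original card) of how these branches can be used, the `revision` argument selects a checkpoint; the specific steps chosen here are arbitrary:
```python
from transformers import GPTNeoXForCausalLM

# Each intermediate training checkpoint is a branch of the model repository.
for step in ["step1000", "step71000", "step143000"]:
    model = GPTNeoXForCausalLM.from_pretrained(
        "EleutherAI/pythia-2.8b-deduped",
        revision=step,
    )
    # ... run your analysis on `model` for this training step ...
```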
You may also further fine-tune and adapt Pythia-2.8B-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-2.8B-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions. For example,
the model may generate harmful or offensive text. Please evaluate the risks
associated with your particular use case.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-2.8B-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-2.8B-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “follow” human instructions.
### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token the model deems most likely need not produce the
most “accurate” text. Never rely on Pythia-2.8B-deduped to produce factually accurate
output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-2.8B-deduped may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-2.8B-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
# Load weights from the `step3000` checkpoint branch and cache them locally
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
# Tokenize a prompt and generate a continuation
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
## Training
### Training data
Pythia-2.8B-deduped was trained on the Pile **after the dataset has been globally
deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training,
from `step1000` to `step143000` (which is the same as `main`). In addition, we
also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models were trained for 143,000 steps at a batch size
of 2M (2,097,152 tokens).<br>
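As a quick consistency check, 143,000 steps × 2,097,152 tokens per step = 299,892,736,000 tokens, matching the per-model token count given above.<br>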
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
## Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>
## Changelog
This section compares differences between previously released
[Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current
models. See Appendix B of the Pythia paper for further discussion of these
changes and the motivation behind them. We found that retraining Pythia had no
impact on benchmark performance.
- All model sizes are now trained with uniform batch size of 2M tokens.
Previously, the models of size 160M, 410M, and 1.4B parameters were trained
with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,
128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all
models of size 2.8B parameters or smaller had a learning rate (LR) schedule
which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and
12B models all used an LR schedule which decayed to a minimum LR of 0. In
the redone training runs, we rectified this inconsistency: all models are now
trained with the LR decaying to a minimum of 0.1× their maximum LR.
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 13,663 | [
[
-0.02362060546875,
-0.059844970703125,
0.024627685546875,
0.004283905029296875,
-0.01861572265625,
-0.0148468017578125,
-0.017181396484375,
-0.034637451171875,
0.01369476318359375,
0.01259613037109375,
-0.026275634765625,
-0.020416259765625,
-0.0330810546875,
-0.0055694580078125,
-0.033782958984375,
0.0848388671875,
-0.009307861328125,
-0.01080322265625,
0.00897979736328125,
-0.004985809326171875,
-0.0063934326171875,
-0.040374755859375,
-0.03424072265625,
-0.0295562744140625,
0.0474853515625,
0.01415252685546875,
0.065673828125,
0.04266357421875,
0.01192474365234375,
0.0212554931640625,
-0.02825927734375,
-0.0049591064453125,
-0.01197052001953125,
-0.006824493408203125,
-0.00238037109375,
-0.0199127197265625,
-0.053558349609375,
0.00301361083984375,
0.051025390625,
0.048980712890625,
-0.013275146484375,
0.0191192626953125,
-0.0004763603210449219,
0.0273895263671875,
-0.03778076171875,
0.0031375885009765625,
-0.0246734619140625,
-0.0133209228515625,
-0.005489349365234375,
0.0116729736328125,
-0.0294647216796875,
-0.0255279541015625,
0.03253173828125,
-0.048828125,
0.0192108154296875,
0.006046295166015625,
0.08929443359375,
-0.00856781005859375,
-0.032562255859375,
-0.00496673583984375,
-0.053558349609375,
0.050079345703125,
-0.052978515625,
0.0251312255859375,
0.021697998046875,
0.01293182373046875,
-0.0036163330078125,
-0.0682373046875,
-0.04156494140625,
-0.015869140625,
-0.01021575927734375,
-0.002819061279296875,
-0.047119140625,
0.00116729736328125,
0.03839111328125,
0.048004150390625,
-0.0625,
-0.00098419189453125,
-0.02886962890625,
-0.0258636474609375,
0.02685546875,
0.00373077392578125,
0.033203125,
-0.023193359375,
-0.00025463104248046875,
-0.029449462890625,
-0.05120849609375,
-0.0166473388671875,
0.041290283203125,
0.004558563232421875,
-0.0279388427734375,
0.037322998046875,
-0.027984619140625,
0.040496826171875,
-0.005603790283203125,
0.019439697265625,
0.032012939453125,
-0.01488494873046875,
-0.0372314453125,
-0.00807952880859375,
0.07061767578125,
0.01097869873046875,
0.0158233642578125,
-0.0008921623229980469,
-0.0031299591064453125,
0.004627227783203125,
0.0025043487548828125,
-0.08544921875,
-0.061004638671875,
0.017578125,
-0.030029296875,
-0.032958984375,
-0.01378631591796875,
-0.07061767578125,
-0.015655517578125,
-0.0145263671875,
0.042449951171875,
-0.0372314453125,
-0.055267333984375,
-0.0120697021484375,
0.000033795833587646484,
0.016357421875,
0.0276336669921875,
-0.071533203125,
0.03070068359375,
0.033233642578125,
0.07647705078125,
0.0177459716796875,
-0.04083251953125,
-0.0156402587890625,
-0.0203704833984375,
-0.00954437255859375,
0.0279998779296875,
-0.0088958740234375,
-0.014495849609375,
-0.00859832763671875,
0.0122222900390625,
-0.007762908935546875,
-0.0270843505859375,
0.029693603515625,
-0.0305023193359375,
0.019866943359375,
-0.0216827392578125,
-0.0328369140625,
-0.0283966064453125,
0.0060577392578125,
-0.046478271484375,
0.0655517578125,
0.0194854736328125,
-0.07177734375,
0.01702880859375,
-0.016143798828125,
-0.00494384765625,
-0.00362396240234375,
0.0139923095703125,
-0.051483154296875,
0.0026397705078125,
0.026214599609375,
0.00228118896484375,
-0.030975341796875,
0.015716552734375,
-0.0189208984375,
-0.032318115234375,
0.0135345458984375,
-0.040313720703125,
0.06988525390625,
0.0151519775390625,
-0.050079345703125,
0.019744873046875,
-0.04266357421875,
0.0163726806640625,
0.01849365234375,
-0.0263519287109375,
0.0037631988525390625,
-0.0125579833984375,
0.0285491943359375,
0.0167083740234375,
0.012481689453125,
-0.02655029296875,
0.022918701171875,
-0.038421630859375,
0.056182861328125,
0.05621337890625,
-0.006195068359375,
0.034637451171875,
-0.03082275390625,
0.0330810546875,
0.003040313720703125,
0.0142822265625,
-0.0028247833251953125,
-0.046966552734375,
-0.07421875,
-0.0226898193359375,
0.0282440185546875,
0.0230255126953125,
-0.034515380859375,
0.033294677734375,
-0.0176849365234375,
-0.0654296875,
-0.0119476318359375,
-0.006259918212890625,
0.03228759765625,
0.0226898193359375,
0.0322265625,
-0.01218414306640625,
-0.040069580078125,
-0.06646728515625,
-0.0159454345703125,
-0.03289794921875,
0.00897216796875,
0.01403045654296875,
0.0709228515625,
-0.00882720947265625,
0.043975830078125,
-0.027557373046875,
0.01861572265625,
-0.0278167724609375,
0.01299285888671875,
0.0322265625,
0.0460205078125,
0.02935791015625,
-0.0418701171875,
-0.02850341796875,
0.00049591064453125,
-0.043914794921875,
0.006069183349609375,
0.0029697418212890625,
-0.0238037109375,
0.02362060546875,
0.005840301513671875,
-0.0751953125,
0.0352783203125,
0.046173095703125,
-0.04205322265625,
0.060577392578125,
-0.02447509765625,
-0.0014972686767578125,
-0.08026123046875,
0.0196685791015625,
0.0087127685546875,
-0.01654052734375,
-0.045379638671875,
0.0058441162109375,
0.0145263671875,
-0.01531219482421875,
-0.0293731689453125,
0.046478271484375,
-0.04046630859375,
-0.01355743408203125,
-0.0171051025390625,
0.0059967041015625,
-0.0022106170654296875,
0.047088623046875,
0.01181793212890625,
0.043701171875,
0.06048583984375,
-0.05780029296875,
0.032684326171875,
0.017791748046875,
-0.021209716796875,
0.0281829833984375,
-0.06707763671875,
0.01363372802734375,
0.00628662109375,
0.032562255859375,
-0.04510498046875,
-0.02655029296875,
0.040863037109375,
-0.044464111328125,
0.01174163818359375,
-0.03131103515625,
-0.040374755859375,
-0.03265380859375,
-0.01354217529296875,
0.045562744140625,
0.059722900390625,
-0.04632568359375,
0.050384521484375,
0.004222869873046875,
0.00875091552734375,
-0.027557373046875,
-0.042510986328125,
-0.0189208984375,
-0.038604736328125,
-0.049346923828125,
0.028076171875,
0.01291656494140625,
-0.01409912109375,
0.0006833076477050781,
-0.0005922317504882812,
0.007495880126953125,
-0.0036163330078125,
0.0242767333984375,
0.0257568359375,
-0.0032825469970703125,
0.0019893646240234375,
-0.00914764404296875,
-0.0098114013671875,
0.0004725456237792969,
-0.03857421875,
0.07281494140625,
-0.0203094482421875,
-0.0132293701171875,
-0.060882568359375,
0.00007271766662597656,
0.066162109375,
-0.03228759765625,
0.06610107421875,
0.046539306640625,
-0.053466796875,
0.01146697998046875,
-0.0275115966796875,
-0.022125244140625,
-0.033050537109375,
0.051055908203125,
-0.02001953125,
-0.0273895263671875,
0.047088623046875,
0.02252197265625,
0.021026611328125,
0.043548583984375,
0.055877685546875,
0.01654052734375,
0.090087890625,
0.033355712890625,
-0.01172637939453125,
0.047149658203125,
-0.039276123046875,
0.019683837890625,
-0.08453369140625,
-0.0147857666015625,
-0.038970947265625,
-0.01947021484375,
-0.07147216796875,
-0.023651123046875,
0.0234832763671875,
0.0192718505859375,
-0.056396484375,
0.0428466796875,
-0.04156494140625,
0.00394439697265625,
0.049560546875,
0.01959228515625,
0.01313018798828125,
0.0163726806640625,
0.00655364990234375,
-0.0042724609375,
-0.0484619140625,
-0.0251007080078125,
0.09368896484375,
0.038055419921875,
0.04437255859375,
0.0230865478515625,
0.052520751953125,
-0.01049041748046875,
0.0172119140625,
-0.053375244140625,
0.031890869140625,
0.0249786376953125,
-0.054229736328125,
-0.015472412109375,
-0.0579833984375,
-0.0699462890625,
0.036895751953125,
0.006900787353515625,
-0.0849609375,
0.017822265625,
0.0174713134765625,
-0.0281982421875,
0.036041259765625,
-0.046844482421875,
0.075439453125,
-0.01727294921875,
-0.03668212890625,
-0.0282745361328125,
-0.023590087890625,
0.017730712890625,
0.02825927734375,
0.00865936279296875,
0.00704193115234375,
0.023712158203125,
0.0751953125,
-0.0513916015625,
0.048919677734375,
-0.01033782958984375,
0.01103973388671875,
0.02667236328125,
0.0218048095703125,
0.050628662109375,
0.01204681396484375,
0.01085662841796875,
-0.00292205810546875,
0.01177215576171875,
-0.041351318359375,
-0.029205322265625,
0.06903076171875,
-0.08392333984375,
-0.0278778076171875,
-0.060333251953125,
-0.04510498046875,
0.0071258544921875,
0.01494598388671875,
0.031494140625,
0.050506591796875,
-0.0022296905517578125,
0.0006089210510253906,
0.045501708984375,
-0.037994384765625,
0.0270843505859375,
0.0161590576171875,
-0.03564453125,
-0.039825439453125,
0.07452392578125,
0.0012760162353515625,
0.0258941650390625,
0.0009775161743164062,
0.0177001953125,
-0.03167724609375,
-0.033172607421875,
-0.04620361328125,
0.040283203125,
-0.054779052734375,
0.00008922815322875977,
-0.053924560546875,
-0.00307464599609375,
-0.035675048828125,
0.00909423828125,
-0.0301971435546875,
-0.029449462890625,
-0.01898193359375,
-0.0020923614501953125,
0.044158935546875,
0.035980224609375,
0.00647735595703125,
0.0253753662109375,
-0.040374755859375,
-0.0027217864990234375,
0.01806640625,
0.007678985595703125,
0.008392333984375,
-0.06927490234375,
-0.007183074951171875,
0.0107421875,
-0.033172607421875,
-0.0860595703125,
0.036956787109375,
-0.003910064697265625,
0.0262603759765625,
0.004669189453125,
-0.018157958984375,
0.0445556640625,
-0.0059814453125,
0.051239013671875,
0.0120849609375,
-0.0767822265625,
0.041656494140625,
-0.036712646484375,
0.0217132568359375,
0.0262603759765625,
0.0252838134765625,
-0.054534912109375,
-0.0059356689453125,
-0.074951171875,
-0.08154296875,
0.05743408203125,
0.03759765625,
0.01251983642578125,
0.00839996337890625,
0.030303955078125,
-0.03497314453125,
0.01171112060546875,
-0.07720947265625,
-0.02130126953125,
-0.018829345703125,
-0.006328582763671875,
0.0125732421875,
-0.00457000732421875,
0.00440216064453125,
-0.04205322265625,
0.0760498046875,
0.004573822021484375,
0.026397705078125,
0.021484375,
-0.0298919677734375,
-0.00756072998046875,
-0.0031108856201171875,
0.0121002197265625,
0.05615234375,
-0.00995635986328125,
0.00428009033203125,
0.0157623291015625,
-0.04217529296875,
0.003208160400390625,
0.01334381103515625,
-0.029052734375,
-0.005886077880859375,
0.0131683349609375,
0.0667724609375,
0.00995635986328125,
-0.0305328369140625,
0.0174713134765625,
-0.002429962158203125,
-0.005939483642578125,
-0.0228424072265625,
-0.0134124755859375,
0.01324462890625,
0.0165252685546875,
-0.002536773681640625,
-0.01306915283203125,
-0.000025212764739990234,
-0.06658935546875,
0.0038814544677734375,
0.0160064697265625,
-0.01137542724609375,
-0.0307159423828125,
0.044677734375,
0.003376007080078125,
-0.0142364501953125,
0.0858154296875,
-0.0201873779296875,
-0.0513916015625,
0.058074951171875,
0.0384521484375,
0.0552978515625,
-0.0140533447265625,
0.026702880859375,
0.06842041015625,
0.024078369140625,
-0.0161285400390625,
0.005615234375,
0.00746917724609375,
-0.037994384765625,
-0.007293701171875,
-0.06085205078125,
-0.016693115234375,
0.0199432373046875,
-0.04412841796875,
0.033721923828125,
-0.048370361328125,
-0.0059356689453125,
-0.003292083740234375,
0.01751708984375,
-0.041839599609375,
0.024139404296875,
0.01324462890625,
0.0538330078125,
-0.0699462890625,
0.06146240234375,
0.0484619140625,
-0.05584716796875,
-0.08367919921875,
0.0017948150634765625,
0.0020999908447265625,
-0.032562255859375,
0.01395416259765625,
0.01641845703125,
0.0154876708984375,
0.01187896728515625,
-0.02056884765625,
-0.06591796875,
0.09783935546875,
0.0166015625,
-0.049835205078125,
-0.0213470458984375,
-0.0089111328125,
0.040863037109375,
0.0054473876953125,
0.05511474609375,
0.05474853515625,
0.03057861328125,
0.00656890869140625,
-0.08026123046875,
0.028564453125,
-0.024871826171875,
-0.0036830902099609375,
0.0174102783203125,
-0.051971435546875,
0.0986328125,
-0.005889892578125,
-0.0009441375732421875,
0.03131103515625,
0.045867919921875,
0.0302581787109375,
-0.00897216796875,
0.0269927978515625,
0.05889892578125,
0.06646728515625,
-0.0284576416015625,
0.0919189453125,
-0.0222320556640625,
0.058990478515625,
0.0650634765625,
0.01409912109375,
0.0390625,
0.0297088623046875,
-0.029083251953125,
0.0399169921875,
0.0631103515625,
-0.006832122802734375,
0.01381683349609375,
0.019012451171875,
-0.021270751953125,
-0.0200653076171875,
0.00907135009765625,
-0.045806884765625,
0.014404296875,
0.01079559326171875,
-0.043731689453125,
-0.01519775390625,
-0.0262298583984375,
0.0272979736328125,
-0.0309600830078125,
-0.0173187255859375,
0.0197296142578125,
0.00734710693359375,
-0.04937744140625,
0.04742431640625,
0.0186309814453125,
0.04095458984375,
-0.033416748046875,
0.01085662841796875,
-0.012786865234375,
0.025054931640625,
-0.025787353515625,
-0.033294677734375,
0.006450653076171875,
0.0003809928894042969,
0.003925323486328125,
0.009765625,
0.032501220703125,
-0.01084136962890625,
-0.04302978515625,
0.01447296142578125,
0.035003662109375,
0.01983642578125,
-0.033416748046875,
-0.050811767578125,
0.006404876708984375,
-0.0118560791015625,
-0.040679931640625,
0.033050537109375,
0.0184783935546875,
-0.0106048583984375,
0.043548583984375,
0.047271728515625,
0.002239227294921875,
-0.001251220703125,
0.01114654541015625,
0.07403564453125,
-0.034515380859375,
-0.0360107421875,
-0.068115234375,
0.03759765625,
-0.0009107589721679688,
-0.0506591796875,
0.064208984375,
0.042724609375,
0.05206298828125,
0.0194091796875,
0.045318603515625,
-0.035125732421875,
0.0012760162353515625,
-0.0227203369140625,
0.05010986328125,
-0.038238525390625,
0.002780914306640625,
-0.036773681640625,
-0.0855712890625,
-0.003040313720703125,
0.07177734375,
-0.0389404296875,
0.0294952392578125,
0.060577392578125,
0.061431884765625,
-0.00592803955078125,
0.006198883056640625,
0.00244903564453125,
0.0229034423828125,
0.040374755859375,
0.07037353515625,
0.066650390625,
-0.053741455078125,
0.041412353515625,
-0.03778076171875,
-0.020782470703125,
-0.01280975341796875,
-0.03607177734375,
-0.06500244140625,
-0.034820556640625,
-0.037628173828125,
-0.057281494140625,
-0.0008549690246582031,
0.06719970703125,
0.055084228515625,
-0.046844482421875,
-0.011688232421875,
-0.039886474609375,
0.00353240966796875,
-0.0195770263671875,
-0.0175933837890625,
0.03271484375,
0.0098114013671875,
-0.07073974609375,
-0.003047943115234375,
-0.01177215576171875,
0.007625579833984375,
-0.03106689453125,
-0.0218963623046875,
-0.01385498046875,
-0.0089874267578125,
0.006198883056640625,
0.0235443115234375,
-0.039337158203125,
-0.0208282470703125,
0.0016088485717773438,
0.003337860107421875,
0.00020372867584228516,
0.05328369140625,
-0.042724609375,
0.00823974609375,
0.047149658203125,
0.00704193115234375,
0.0611572265625,
-0.020294189453125,
0.030426025390625,
-0.01947021484375,
0.027069091796875,
0.020751953125,
0.04736328125,
0.0247802734375,
-0.019439697265625,
0.012420654296875,
0.032012939453125,
-0.0552978515625,
-0.06591796875,
0.028411865234375,
-0.05364990234375,
-0.00725555419921875,
0.09674072265625,
-0.0200958251953125,
-0.02862548828125,
0.0052337646484375,
-0.016326904296875,
0.041778564453125,
-0.0207672119140625,
0.04986572265625,
0.04638671875,
0.005687713623046875,
-0.015533447265625,
-0.048919677734375,
0.0288543701171875,
0.05029296875,
-0.061309814453125,
0.0298004150390625,
0.04742431640625,
0.04656982421875,
0.017730712890625,
0.044158935546875,
-0.0227813720703125,
0.0469970703125,
0.00794219970703125,
0.00569915771484375,
0.001216888427734375,
-0.03607177734375,
-0.032012939453125,
-0.01186370849609375,
0.017669677734375,
0.00165557861328125
]
] |
samrawal/bert-base-uncased_clinical-ner | 2022-11-11T22:57:56.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | samrawal | null | null | samrawal/bert-base-uncased_clinical-ner | 23 | 12,201 | transformers | 2022-03-02T23:29:05 | A Named Entity Recognition model for clinical entities (`problem`, `treatment`, `test`)
The model has been trained on the [i2b2 (now n2c2) dataset](https://n2c2.dbmi.hms.harvard.edu) for the 2010 - Relations task. Please visit the n2c2 site to request access to the dataset. | 275 | [
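A minimal usage sketch (not part of the original card; the example sentence and `aggregation_strategy` choice are assumptions) using the Transformers token-classification pipeline:
```python
from transformers import pipeline

# Token-classification pipeline over the clinical NER model
ner = pipeline(
    "token-classification",
    model="samrawal/bert-base-uncased_clinical-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

text = "The patient was given aspirin for chest pain and scheduled for an ECG."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```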
[
-0.00838470458984375,
-0.025970458984375,
0.04437255859375,
0.01206207275390625,
-0.0014009475708007812,
-0.00968170166015625,
0.01305389404296875,
-0.04742431640625,
0.0276336669921875,
0.058319091796875,
-0.057373046875,
-0.02520751953125,
-0.05279541015625,
0.0210113525390625,
-0.03338623046875,
0.0714111328125,
-0.00992584228515625,
0.06768798828125,
-0.035491943359375,
-0.01444244384765625,
-0.0275115966796875,
-0.01491546630859375,
-0.0799560546875,
-0.044769287109375,
0.048126220703125,
0.048858642578125,
0.0266571044921875,
0.037811279296875,
0.0906982421875,
0.0097808837890625,
0.007598876953125,
-0.01251220703125,
-0.025421142578125,
-0.005374908447265625,
-0.036041259765625,
-0.026611328125,
-0.040435791015625,
0.005733489990234375,
0.04815673828125,
0.0716552734375,
-0.0084381103515625,
0.01454925537109375,
-0.025115966796875,
0.03936767578125,
-0.03424072265625,
0.029083251953125,
-0.0430908203125,
0.01073455810546875,
-0.01885986328125,
-0.0009636878967285156,
-0.0303497314453125,
-0.01806640625,
0.01238250732421875,
-0.05145263671875,
0.026611328125,
0.0238494873046875,
0.09515380859375,
0.019287109375,
-0.05987548828125,
-0.04998779296875,
-0.047607421875,
0.0287628173828125,
-0.0099639892578125,
0.0487060546875,
0.044525146484375,
0.039398193359375,
-0.03240966796875,
-0.06329345703125,
-0.034393310546875,
-0.023162841796875,
-0.00836944580078125,
0.0014514923095703125,
-0.0216217041015625,
-0.00457763671875,
0.0352783203125,
0.023040771484375,
-0.043365478515625,
0.022216796875,
-0.061676025390625,
-0.00968170166015625,
0.0565185546875,
0.039215087890625,
0.032867431640625,
-0.01142120361328125,
-0.045928955078125,
0.0352783203125,
-0.07110595703125,
-0.0061187744140625,
-0.015838623046875,
0.005580902099609375,
-0.0015192031860351562,
0.0467529296875,
-0.0023326873779296875,
0.040069580078125,
0.004123687744140625,
-0.0171966552734375,
0.0151214599609375,
-0.0081634521484375,
-0.0232391357421875,
0.038970947265625,
0.0221099853515625,
0.0295867919921875,
0.01313018798828125,
-0.019561767578125,
-0.0066375732421875,
0.0059967041015625,
0.0213623046875,
-0.0718994140625,
-0.048492431640625,
0.0247344970703125,
-0.0382080078125,
-0.00983428955078125,
-0.0135955810546875,
-0.0218353271484375,
-0.0380859375,
-0.035430908203125,
0.0241241455078125,
-0.04119873046875,
-0.032257080078125,
-0.0059661865234375,
-0.0027904510498046875,
0.0189208984375,
0.0184326171875,
-0.056182861328125,
0.054443359375,
0.03692626953125,
0.0489501953125,
-0.0182342529296875,
0.0099945068359375,
-0.03643798828125,
0.0218048095703125,
-0.0211181640625,
0.0655517578125,
-0.022216796875,
-0.037109375,
-0.00919342041015625,
0.035125732421875,
-0.0204620361328125,
-0.0716552734375,
0.0257720947265625,
-0.04119873046875,
0.0054168701171875,
-0.024810791015625,
-0.08135986328125,
-0.0028324127197265625,
0.01525115966796875,
-0.07159423828125,
0.0579833984375,
0.019317626953125,
-0.06500244140625,
0.0285491943359375,
-0.046356201171875,
-0.01021575927734375,
0.011016845703125,
-0.0160980224609375,
-0.045318603515625,
-0.006046295166015625,
-0.021240234375,
0.0268402099609375,
-0.035003662109375,
0.02642822265625,
-0.036041259765625,
-0.020782470703125,
0.0051116943359375,
0.0384521484375,
0.06695556640625,
0.01435089111328125,
0.013946533203125,
-0.006923675537109375,
-0.0894775390625,
0.0014705657958984375,
0.042266845703125,
-0.018402099609375,
-0.04388427734375,
-0.0160980224609375,
0.009002685546875,
0.002193450927734375,
0.0026149749755859375,
-0.03997802734375,
0.033966064453125,
-0.0081634521484375,
0.036102294921875,
0.00940704345703125,
0.02496337890625,
0.00783538818359375,
-0.0088653564453125,
0.029937744140625,
0.02288818359375,
0.002777099609375,
0.0313720703125,
-0.051055908203125,
-0.05548095703125,
-0.005817413330078125,
0.055023193359375,
0.04876708984375,
-0.0428466796875,
0.010955810546875,
-0.0135498046875,
-0.036651611328125,
-0.0157928466796875,
-0.020843505859375,
0.03460693359375,
0.043914794921875,
0.03509521484375,
-0.017669677734375,
-0.04632568359375,
-0.07940673828125,
0.0021228790283203125,
-0.0016107559204101562,
-0.003765106201171875,
0.0294036865234375,
0.071533203125,
-0.0309600830078125,
0.042449951171875,
-0.04083251953125,
-0.04736328125,
-0.0093841552734375,
0.01371002197265625,
0.00820159912109375,
0.032196044921875,
0.026336669921875,
-0.050048828125,
-0.020782470703125,
-0.037750244140625,
-0.054779052734375,
-0.00101470947265625,
0.00777435302734375,
-0.00933074951171875,
-0.0184326171875,
0.0458984375,
-0.0285797119140625,
0.0443115234375,
0.0231781005859375,
-0.02362060546875,
0.0188140869140625,
-0.001537322998046875,
0.0002777576446533203,
-0.089111328125,
0.0165557861328125,
-0.00014293193817138672,
-0.01561737060546875,
-0.05322265625,
-0.011688232421875,
0.0340576171875,
-0.005046844482421875,
-0.040771484375,
0.04730224609375,
-0.06781005859375,
-0.0181427001953125,
-0.0112762451171875,
0.005947113037109375,
0.00007545948028564453,
0.009796142578125,
0.02392578125,
0.040679931640625,
0.041961669921875,
-0.05426025390625,
0.0289306640625,
0.04791259765625,
-0.038055419921875,
0.0732421875,
-0.036376953125,
0.0220489501953125,
-0.0102081298828125,
-0.003246307373046875,
-0.047607421875,
-0.049072265625,
0.031890869140625,
-0.0151519775390625,
0.032470703125,
-0.00946044921875,
-0.0243988037109375,
-0.03125,
0.0263824462890625,
0.04119873046875,
-0.003650665283203125,
-0.039398193359375,
0.01812744140625,
0.043548583984375,
-0.005702972412109375,
-0.0180206298828125,
-0.048919677734375,
0.018524169921875,
-0.00748443603515625,
-0.0191497802734375,
0.053466796875,
0.005084991455078125,
-0.02569580078125,
0.0154571533203125,
0.00498199462890625,
-0.052215576171875,
0.0007419586181640625,
0.043701171875,
0.023040771484375,
-0.00917816162109375,
0.0501708984375,
0.00428009033203125,
-0.019989013671875,
0.02288818359375,
0.029541015625,
0.0283660888671875,
0.0293731689453125,
-0.0176239013671875,
-0.0400390625,
0.034423828125,
-0.0018072128295898438,
-0.007511138916015625,
0.040435791015625,
0.0171051025390625,
-0.09130859375,
0.0084075927734375,
-0.0241851806640625,
-0.0242156982421875,
-0.0242156982421875,
0.0322265625,
-0.03558349609375,
-0.036376953125,
0.02142333984375,
-0.007480621337890625,
-0.034423828125,
0.029998779296875,
0.039886474609375,
-0.00693511962890625,
0.061614990234375,
0.034393310546875,
0.0180511474609375,
0.00923919677734375,
-0.024169921875,
0.0301055908203125,
-0.0909423828125,
-0.0474853515625,
-0.032073974609375,
-0.019989013671875,
-0.019012451171875,
-0.011077880859375,
0.021087646484375,
0.0277862548828125,
-0.01548004150390625,
0.040008544921875,
-0.0310516357421875,
0.015106201171875,
0.04290771484375,
0.032318115234375,
0.0278472900390625,
-0.018341064453125,
-0.0008745193481445312,
-0.0192108154296875,
-0.02362060546875,
-0.0288238525390625,
0.09271240234375,
0.0253143310546875,
0.0287933349609375,
0.0145721435546875,
0.050872802734375,
0.01065826416015625,
0.04248046875,
-0.05535888671875,
0.043182373046875,
0.0005240440368652344,
-0.07171630859375,
-0.00027632713317871094,
-0.013763427734375,
-0.11138916015625,
-0.034881591796875,
-0.0289306640625,
-0.051177978515625,
0.0219573974609375,
0.0031890869140625,
-0.0236053466796875,
0.000263214111328125,
-0.02581787109375,
0.058319091796875,
-0.006626129150390625,
0.01163482666015625,
-0.0183258056640625,
-0.0457763671875,
0.0323486328125,
0.00783538818359375,
-0.004138946533203125,
-0.03497314453125,
0.0112457275390625,
0.063232421875,
-0.0247955322265625,
0.00807952880859375,
-0.020751953125,
0.0269622802734375,
0.01033782958984375,
0.004581451416015625,
0.02337646484375,
0.0240325927734375,
0.0283050537109375,
0.00954437255859375,
0.006961822509765625,
-0.0243072509765625,
-0.0281829833984375,
0.046844482421875,
-0.032867431640625,
-0.02203369140625,
-0.050140380859375,
-0.0252838134765625,
-0.0221710205078125,
0.0254058837890625,
0.024749755859375,
0.04071044921875,
-0.0289306640625,
0.0002219676971435547,
0.053253173828125,
-0.0038814544677734375,
0.003734588623046875,
0.06494140625,
-0.021728515625,
-0.0211639404296875,
0.04608154296875,
0.0251617431640625,
0.007274627685546875,
0.027740478515625,
-0.023223876953125,
-0.019287109375,
-0.0599365234375,
-0.0181427001953125,
0.0291595458984375,
-0.034820556640625,
-0.0221710205078125,
-0.049591064453125,
-0.05352783203125,
-0.031097412109375,
0.039398193359375,
-0.01120758056640625,
-0.0238494873046875,
-0.049102783203125,
-0.01556396484375,
0.0311279296875,
0.062469482421875,
0.00531005859375,
0.0164337158203125,
-0.04364013671875,
0.028350830078125,
0.01494598388671875,
0.0380859375,
-0.030914306640625,
-0.0182037353515625,
-0.018524169921875,
-0.0150299072265625,
-0.020721435546875,
-0.097412109375,
0.026214599609375,
0.0285491943359375,
0.033111572265625,
0.03173828125,
-0.022674560546875,
0.039764404296875,
-0.0406494140625,
0.032562255859375,
0.019317626953125,
-0.0340576171875,
0.05682373046875,
-0.006015777587890625,
-0.020233154296875,
0.0310211181640625,
0.0711669921875,
-0.035186767578125,
0.0166778564453125,
-0.051971435546875,
-0.0328369140625,
0.0479736328125,
0.01079559326171875,
-0.0321044921875,
-0.0030574798583984375,
0.058441162109375,
-0.0138092041015625,
0.018707275390625,
-0.038299560546875,
-0.03277587890625,
0.015716552734375,
-0.0252685546875,
0.020050048828125,
-0.038970947265625,
-0.0211334228515625,
-0.0187530517578125,
0.05548095703125,
0.0011844635009765625,
0.052093505859375,
0.0274658203125,
-0.017974853515625,
-0.004772186279296875,
-0.004486083984375,
0.00897216796875,
0.033203125,
-0.0271759033203125,
-0.007610321044921875,
-0.006378173828125,
-0.048614501953125,
0.0192108154296875,
0.0298919677734375,
0.00009572505950927734,
0.0012903213500976562,
0.0208740234375,
0.06573486328125,
0.0218505859375,
-0.044830322265625,
0.03839111328125,
-0.01221466064453125,
-0.036407470703125,
-0.056610107421875,
0.01947021484375,
0.0166168212890625,
0.03192138671875,
-0.0167388916015625,
-0.0205535888671875,
0.053436279296875,
-0.0303802490234375,
0.0289306640625,
0.02532958984375,
-0.05364990234375,
-0.003787994384765625,
0.0643310546875,
-0.0169219970703125,
-0.048065185546875,
0.09234619140625,
-0.032470703125,
-0.003551483154296875,
0.054351806640625,
0.0523681640625,
0.033966064453125,
-0.0234222412109375,
0.01092529296875,
0.050506591796875,
-0.01543426513671875,
-0.0009708404541015625,
0.054107666015625,
0.0255889892578125,
-0.0537109375,
0.0159149169921875,
-0.0216827392578125,
-0.045623779296875,
0.034149169921875,
-0.08831787109375,
0.053436279296875,
-0.048095703125,
-0.02728271484375,
0.033355712890625,
0.01209259033203125,
-0.06988525390625,
0.0531005859375,
0.01479339599609375,
0.0771484375,
-0.0389404296875,
0.0843505859375,
0.0684814453125,
-0.045440673828125,
-0.054595947265625,
-0.0311737060546875,
0.00157928466796875,
-0.07763671875,
0.05322265625,
0.0128326416015625,
0.0445556640625,
0.0005431175231933594,
-0.0282440185546875,
-0.08636474609375,
0.10345458984375,
0.0105133056640625,
-0.0638427734375,
-0.0248870849609375,
0.01270294189453125,
0.0296630859375,
-0.03558349609375,
0.00777435302734375,
0.048614501953125,
0.01404571533203125,
-0.00920867919921875,
-0.06378173828125,
0.00015175342559814453,
-0.0318603515625,
-0.0074462890625,
0.016265869140625,
-0.02972412109375,
0.057525634765625,
-0.04315185546875,
-0.01824951171875,
0.0202178955078125,
0.033966064453125,
0.0301513671875,
0.04534912109375,
0.048553466796875,
0.05706787109375,
0.07000732421875,
-0.032196044921875,
0.05023193359375,
-0.0272216796875,
0.038665771484375,
0.097412109375,
-0.0261993408203125,
0.05419921875,
0.0188751220703125,
-0.0284271240234375,
0.0377197265625,
0.072021484375,
-0.03265380859375,
0.050323486328125,
0.0289764404296875,
0.0011119842529296875,
-0.01319122314453125,
-0.023468017578125,
-0.03497314453125,
0.031707763671875,
0.04522705078125,
-0.039520263671875,
-0.00212860107421875,
0.008331298828125,
0.0034160614013671875,
-0.0198822021484375,
-0.0266571044921875,
0.06549072265625,
0.0198211669921875,
-0.0255889892578125,
0.0257415771484375,
-0.007190704345703125,
-0.0189056396484375,
-0.0221405029296875,
-0.0134124755859375,
-0.002048492431640625,
0.01468658447265625,
-0.01617431640625,
-0.044342041015625,
0.02484130859375,
-0.0255126953125,
-0.0307769775390625,
0.0282440185546875,
0.057403564453125,
-0.043701171875,
-0.05230712890625,
0.0311737060546875,
0.021392822265625,
0.032562255859375,
-0.005283355712890625,
-0.060546875,
-0.0216217041015625,
-0.027801513671875,
0.00005054473876953125,
0.0293731689453125,
0.0165863037109375,
-0.0250701904296875,
0.060699462890625,
0.01617431640625,
0.00597381591796875,
-0.00827789306640625,
0.01861572265625,
0.038299560546875,
-0.054229736328125,
-0.016876220703125,
-0.01239776611328125,
0.01326751708984375,
-0.0185699462890625,
-0.0362548828125,
0.023223876953125,
0.04766845703125,
0.048919677734375,
-0.003093719482421875,
0.0267791748046875,
0.00348663330078125,
0.07269287109375,
-0.034759521484375,
0.0257720947265625,
-0.054443359375,
-0.00867462158203125,
0.00199127197265625,
-0.05389404296875,
-0.0340576171875,
0.031524658203125,
-0.0080108642578125,
-0.01551055908203125,
0.059112548828125,
0.052001953125,
-0.006710052490234375,
0.02496337890625,
0.0067138671875,
0.035858154296875,
0.00954437255859375,
0.059783935546875,
0.040924072265625,
-0.04901123046875,
0.004138946533203125,
-0.0292510986328125,
-0.00543975830078125,
-0.02215576171875,
-0.04400634765625,
-0.0672607421875,
-0.04937744140625,
-0.047454833984375,
-0.04052734375,
0.025115966796875,
0.076416015625,
0.045654296875,
-0.08038330078125,
0.0100250244140625,
-0.00907135009765625,
-0.0113372802734375,
-0.00824737548828125,
-0.01102447509765625,
0.03240966796875,
0.0093994140625,
-0.05145263671875,
0.01538848876953125,
0.0180816650390625,
0.0125732421875,
-0.0128173828125,
0.0052642822265625,
-0.017303466796875,
-0.01055145263671875,
0.007701873779296875,
0.00762176513671875,
-0.036407470703125,
-0.00890350341796875,
-0.0006017684936523438,
-0.0187530517578125,
0.024078369140625,
0.05413818359375,
-0.033416748046875,
0.0221099853515625,
0.0005526542663574219,
0.031005859375,
0.0303497314453125,
-0.029083251953125,
0.011444091796875,
-0.01180267333984375,
-0.0035457611083984375,
0.0177459716796875,
0.05218505859375,
0.0231475830078125,
-0.0389404296875,
0.056488037109375,
0.03399658203125,
-0.046630859375,
-0.07525634765625,
0.015869140625,
-0.07275390625,
-0.001346588134765625,
0.07220458984375,
-0.0216217041015625,
-0.00905609130859375,
-0.01788330078125,
-0.010528564453125,
0.0292205810546875,
-0.006610870361328125,
0.058319091796875,
0.037078857421875,
-0.034576416015625,
-0.01392364501953125,
-0.05267333984375,
0.047119140625,
0.00499725341796875,
-0.072998046875,
-0.038421630859375,
0.025726318359375,
0.04962158203125,
-0.0036640167236328125,
0.062347412109375,
-0.01268768310546875,
0.0186614990234375,
-0.01409912109375,
0.0208892822265625,
0.0024852752685546875,
-0.055389404296875,
-0.0028972625732421875,
-0.007045745849609375,
0.0008082389831542969,
-0.0167236328125
]
] |
Yntec/humu | 2023-11-01T00:05:29.000Z | [
"diffusers",
"Photorealistic",
"Sexy",
"Female",
"weed",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Yntec | null | null | Yntec/humu | 1 | 12,201 | diffusers | 2023-09-04T22:55:49 | ---
language:
- en
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- Photorealistic
- Sexy
- Female
- weed
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
---
# humu
This model has the MoistV2VAE baked in.
Preview and prompt:

CUTE Pretty girl of artwork mini style by gaston bussiere, sitting IN GOLDEN RING in CUTE KITCHEN, A wholesome animation key shot at computer monitor, studio ghibli, pixar and disney animation, anime key art by Clay Mann and maple story, style of ROSSDRAWS, soft lighting, soft shade,
Original page:
https://civitai.com/models/136799?modelVersionId=150925 | 774 | [
[
-0.003032684326171875,
-0.051788330078125,
0.0202789306640625,
0.0163421630859375,
-0.016021728515625,
-0.0189208984375,
0.0400390625,
-0.01172637939453125,
0.0390625,
0.048431396484375,
-0.0506591796875,
-0.02606201171875,
-0.0280914306640625,
-0.020721435546875,
-0.02423095703125,
0.033905029296875,
0.016357421875,
0.00734710693359375,
-0.0137176513671875,
0.0396728515625,
-0.04730224609375,
-0.0024280548095703125,
-0.0208892822265625,
-0.017364501953125,
0.0026607513427734375,
0.07415771484375,
0.06048583984375,
0.01459503173828125,
0.01100921630859375,
0.023345947265625,
-0.002262115478515625,
-0.007183074951171875,
-0.050567626953125,
-0.007110595703125,
0.0072479248046875,
-0.056549072265625,
-0.05096435546875,
0.01299285888671875,
0.01325225830078125,
0.006748199462890625,
-0.0190887451171875,
0.01422882080078125,
-0.00547027587890625,
0.0309906005859375,
-0.039398193359375,
0.01629638671875,
-0.01065826416015625,
-0.0037174224853515625,
-0.0139923095703125,
0.01166534423828125,
-0.0281829833984375,
-0.0293121337890625,
-0.015228271484375,
-0.0660400390625,
0.01448822021484375,
-0.018218994140625,
0.0899658203125,
0.00151824951171875,
-0.042816162109375,
-0.0032062530517578125,
-0.05523681640625,
0.053253173828125,
-0.0294342041015625,
0.03558349609375,
0.0293731689453125,
0.0400390625,
-0.036376953125,
-0.091064453125,
-0.0188140869140625,
0.028778076171875,
0.00392913818359375,
0.0400390625,
-0.026611328125,
-0.04071044921875,
0.0134124755859375,
0.0256805419921875,
-0.054168701171875,
-0.02813720703125,
-0.0252838134765625,
0.0114593505859375,
0.04132080078125,
0.00997161865234375,
0.052276611328125,
0.00836944580078125,
-0.026885986328125,
-0.01549530029296875,
-0.037200927734375,
0.005970001220703125,
0.017730712890625,
-0.0143280029296875,
-0.034698486328125,
0.0435791015625,
0.0268402099609375,
0.00875091552734375,
0.00165557861328125,
-0.005218505859375,
0.01111602783203125,
-0.0271453857421875,
-0.03338623046875,
-0.021697998046875,
0.041961669921875,
0.038818359375,
0.015716552734375,
0.01065826416015625,
0.004192352294921875,
-0.002269744873046875,
0.036529541015625,
-0.106201171875,
-0.048065185546875,
0.01141357421875,
-0.0274658203125,
-0.03143310546875,
0.02142333984375,
-0.055389404296875,
-0.0018749237060546875,
0.00672149658203125,
0.0078887939453125,
-0.017333984375,
-0.0498046875,
0.01255035400390625,
-0.01222991943359375,
0.0347900390625,
0.023956298828125,
-0.06329345703125,
0.0234222412109375,
0.035491943359375,
0.027740478515625,
0.039581298828125,
0.0220489501953125,
-0.02081298828125,
0.0199432373046875,
-0.05517578125,
0.049468994140625,
-0.036712646484375,
-0.059295654296875,
-0.0172576904296875,
0.0277862548828125,
-0.0016736984252929688,
-0.05157470703125,
0.055084228515625,
-0.04571533203125,
-0.008758544921875,
-0.0382080078125,
-0.0156707763671875,
-0.03179931640625,
-0.01445770263671875,
-0.0531005859375,
0.05596923828125,
0.0231475830078125,
-0.061859130859375,
0.0268402099609375,
-0.0284576416015625,
-0.01125335693359375,
0.006694793701171875,
-0.003814697265625,
-0.04412841796875,
0.0352783203125,
0.00563812255859375,
0.01369476318359375,
-0.0272979736328125,
-0.00917816162109375,
-0.04522705078125,
-0.0284881591796875,
0.025909423828125,
-0.00394439697265625,
0.0467529296875,
0.042205810546875,
-0.00838470458984375,
0.010650634765625,
-0.0616455078125,
0.0250091552734375,
0.06256103515625,
0.015869140625,
-0.02044677734375,
-0.03741455078125,
0.029815673828125,
0.0195465087890625,
0.0169525146484375,
-0.01137542724609375,
0.0157470703125,
-0.0265960693359375,
0.018829345703125,
0.0264892578125,
0.01708984375,
0.0162811279296875,
-0.03173828125,
0.0526123046875,
0.005947113037109375,
0.04034423828125,
0.0160675048828125,
-0.04730224609375,
-0.051483154296875,
-0.0377197265625,
0.034454345703125,
0.033477783203125,
-0.0308685302734375,
0.01617431640625,
0.01392364501953125,
-0.059295654296875,
-0.0458984375,
0.007053375244140625,
0.02398681640625,
0.0007081031799316406,
-0.0203857421875,
-0.038055419921875,
-0.025390625,
-0.09002685546875,
0.020965576171875,
-0.00293731689453125,
-0.0239715576171875,
0.0124053955078125,
0.036529541015625,
-0.022857666015625,
0.03143310546875,
-0.028778076171875,
-0.00917816162109375,
-0.0206756591796875,
-0.0011806488037109375,
0.042694091796875,
0.036102294921875,
0.0823974609375,
-0.06781005859375,
-0.0521240234375,
0.0047149658203125,
-0.05670166015625,
0.00016558170318603516,
0.047332763671875,
-0.032135009765625,
-0.02984619140625,
0.0215301513671875,
-0.04022216796875,
0.053253173828125,
0.0101776123046875,
-0.053192138671875,
0.031097412109375,
-0.018341064453125,
0.048614501953125,
-0.0860595703125,
-0.0023593902587890625,
-0.00473785400390625,
-0.019134521484375,
-0.0380859375,
0.054656982421875,
-0.001934051513671875,
-0.017120361328125,
-0.06512451171875,
0.053680419921875,
-0.053253173828125,
0.00937652587890625,
-0.032501220703125,
-0.01454925537109375,
0.03546142578125,
0.01092529296875,
-0.004352569580078125,
0.0540771484375,
0.049468994140625,
-0.0194091796875,
0.008331298828125,
0.0384521484375,
-0.028656005859375,
0.0202178955078125,
-0.0787353515625,
0.020111083984375,
0.00732421875,
-0.0052337646484375,
-0.0728759765625,
-0.0484619140625,
0.03009033203125,
-0.0268402099609375,
0.0228118896484375,
-0.0251922607421875,
-0.0787353515625,
-0.0256500244140625,
-0.035369873046875,
0.034454345703125,
0.042816162109375,
-0.0255126953125,
0.033477783203125,
0.024078369140625,
-0.005634307861328125,
0.0128936767578125,
-0.06317138671875,
-0.0201416015625,
-0.0185546875,
-0.0758056640625,
0.0247802734375,
-0.0211944580078125,
-0.0283355712890625,
-0.0174713134765625,
0.00850677490234375,
-0.02691650390625,
-0.01105499267578125,
0.039947509765625,
0.0264434814453125,
-0.03692626953125,
-0.039642333984375,
0.00254058837890625,
0.00484466552734375,
-0.0009560585021972656,
0.03399658203125,
0.044158935546875,
-0.0478515625,
-0.0201568603515625,
-0.07220458984375,
0.02593994140625,
0.059295654296875,
0.00493621826171875,
0.04718017578125,
0.048583984375,
-0.05859375,
0.01187896728515625,
-0.03314208984375,
-0.019317626953125,
-0.033477783203125,
-0.005275726318359375,
-0.04986572265625,
0.0009093284606933594,
0.054412841796875,
0.0050201416015625,
-0.029296875,
0.0350341796875,
0.042816162109375,
0.004123687744140625,
0.07354736328125,
0.036956787109375,
0.0297393798828125,
0.020538330078125,
-0.07391357421875,
-0.0095672607421875,
-0.04620361328125,
-0.0152740478515625,
-0.0198974609375,
-0.0119781494140625,
-0.03778076171875,
-0.050567626953125,
0.01593017578125,
0.02789306640625,
-0.0283355712890625,
0.0293121337890625,
-0.0188140869140625,
0.04791259765625,
0.0273284912109375,
0.0284423828125,
0.0065460205078125,
-0.0007252693176269531,
-0.006366729736328125,
-0.039520263671875,
-0.043182373046875,
-0.04522705078125,
0.034820556640625,
0.0313720703125,
0.0538330078125,
0.030792236328125,
0.0400390625,
-0.0120849609375,
-0.007328033447265625,
-0.0299224853515625,
0.023956298828125,
-0.0013751983642578125,
-0.0709228515625,
0.0092010498046875,
-0.0022373199462890625,
-0.044464111328125,
0.0229949951171875,
-0.01715087890625,
-0.04290771484375,
0.0311431884765625,
-0.00490570068359375,
-0.0226593017578125,
0.005672454833984375,
-0.0921630859375,
0.0870361328125,
-0.010040283203125,
-0.060272216796875,
0.005924224853515625,
-0.0264434814453125,
0.0462646484375,
0.032440185546875,
0.018524169921875,
0.0038738250732421875,
0.006649017333984375,
0.036529541015625,
-0.03875732421875,
0.05682373046875,
0.0024433135986328125,
-0.003818511962890625,
0.032958984375,
0.031768798828125,
0.0295867919921875,
0.0648193359375,
-0.0219268798828125,
-0.01366424560546875,
0.0126800537109375,
-0.043121337890625,
-0.0380859375,
0.06829833984375,
-0.05487060546875,
-0.02435302734375,
-0.0494384765625,
-0.005100250244140625,
0.006473541259765625,
0.01390838623046875,
0.056488037109375,
0.0318603515625,
-0.04339599609375,
0.0040435791015625,
0.0518798828125,
0.00539398193359375,
0.015655517578125,
0.00572967529296875,
-0.04388427734375,
-0.018707275390625,
0.06268310546875,
-0.01082611083984375,
0.0117340087890625,
0.0019683837890625,
0.027740478515625,
-0.006134033203125,
-0.0021533966064453125,
-0.02276611328125,
0.045501708984375,
-0.03765869140625,
-0.011474609375,
-0.0243377685546875,
-0.033477783203125,
-0.01210784912109375,
-0.0189971923828125,
-0.051483154296875,
-0.01367950439453125,
-0.07049560546875,
-0.005641937255859375,
0.032501220703125,
0.0640869140625,
0.011383056640625,
0.0208587646484375,
-0.04229736328125,
0.022552490234375,
0.0498046875,
0.01012420654296875,
0.01070404052734375,
-0.038848876953125,
0.01517486572265625,
0.005619049072265625,
-0.053985595703125,
-0.043182373046875,
0.042205810546875,
0.00566864013671875,
0.0260772705078125,
0.0377197265625,
-0.002719879150390625,
0.04901123046875,
-0.03143310546875,
0.042510986328125,
0.02703857421875,
-0.030364990234375,
0.046630859375,
-0.032318115234375,
0.034881591796875,
0.048095703125,
0.01280975341796875,
-0.01010894775390625,
-0.00884246826171875,
-0.0633544921875,
-0.059234619140625,
0.0418701171875,
0.0455322265625,
0.016204833984375,
0.0117034912109375,
0.0284423828125,
0.0198822021484375,
0.025787353515625,
-0.035125732421875,
-0.029052734375,
-0.0280609130859375,
0.01494598388671875,
0.035003662109375,
-0.040435791015625,
-0.004726409912109375,
-0.04229736328125,
0.05999755859375,
0.00399017333984375,
0.047882080078125,
-0.01139068603515625,
0.0292510986328125,
-0.0264892578125,
0.00994873046875,
0.0391845703125,
0.06103515625,
-0.049835205078125,
-0.0264434814453125,
-0.0208587646484375,
-0.033447265625,
-0.00324249267578125,
-0.0237579345703125,
-0.0109405517578125,
0.0057830810546875,
0.0143585205078125,
0.06829833984375,
0.037994384765625,
-0.01336669921875,
0.0579833984375,
-0.052581787109375,
0.0026874542236328125,
-0.04254150390625,
0.03302001953125,
0.034881591796875,
0.03802490234375,
-0.004306793212890625,
-0.00850677490234375,
0.04248046875,
-0.072265625,
0.0083160400390625,
0.0293121337890625,
-0.034271240234375,
-0.05657958984375,
0.08551025390625,
-0.0092010498046875,
-0.007045745849609375,
0.035736083984375,
-0.0106353759765625,
-0.0162200927734375,
0.05169677734375,
0.055816650390625,
0.0635986328125,
-0.022125244140625,
0.036224365234375,
0.0303955078125,
-0.00302886962890625,
0.01068115234375,
0.0675048828125,
0.030181884765625,
-0.01021575927734375,
0.03173828125,
-0.041900634765625,
-0.0144195556640625,
0.020172119140625,
-0.0467529296875,
0.07275390625,
-0.042816162109375,
0.004039764404296875,
-0.004199981689453125,
0.00402069091796875,
-0.052764892578125,
0.037139892578125,
0.003116607666015625,
0.07366943359375,
-0.07049560546875,
0.06402587890625,
0.036376953125,
-0.042388916015625,
-0.056915283203125,
0.0035419464111328125,
0.0236358642578125,
-0.055572509765625,
0.016387939453125,
0.01861572265625,
0.01177978515625,
-0.0195465087890625,
-0.04803466796875,
-0.0462646484375,
0.0765380859375,
0.0303497314453125,
-0.045623779296875,
-0.01468658447265625,
-0.0304412841796875,
0.04107666015625,
-0.03717041015625,
0.040985107421875,
0.033294677734375,
0.030029296875,
0.080322265625,
-0.048248291015625,
-0.025787353515625,
-0.06890869140625,
0.032073974609375,
-0.01049041748046875,
-0.07501220703125,
0.03619384765625,
-0.027862548828125,
-0.0187530517578125,
0.03460693359375,
0.07373046875,
0.021026611328125,
0.0272979736328125,
0.0247802734375,
0.041717529296875,
0.01953125,
-0.01152801513671875,
0.080078125,
0.021087646484375,
0.00363922119140625,
0.0792236328125,
-0.0231475830078125,
0.048828125,
0.003726959228515625,
-0.01538848876953125,
0.0211029052734375,
0.07208251953125,
-0.00804901123046875,
0.06829833984375,
0.010833740234375,
0.0006151199340820312,
-0.023101806640625,
-0.0184478759765625,
-0.04498291015625,
0.016632080078125,
0.0165557861328125,
-0.00008511543273925781,
-0.0023193359375,
0.002506256103515625,
-0.004184722900390625,
0.0162200927734375,
-0.0024261474609375,
0.05157470703125,
0.033416748046875,
-0.00015175342559814453,
0.05450439453125,
-0.01280975341796875,
0.01154327392578125,
-0.040008544921875,
-0.0242919921875,
-0.0186767578125,
0.0015411376953125,
-0.0219268798828125,
-0.050872802734375,
0.0111236572265625,
-0.0181732177734375,
-0.02630615234375,
-0.01010894775390625,
0.046173095703125,
0.0001901388168334961,
-0.07818603515625,
0.0223388671875,
0.01434326171875,
0.0184783935546875,
0.011627197265625,
-0.05963134765625,
0.038543701171875,
-0.00478363037109375,
-0.0163421630859375,
0.0157470703125,
0.043304443359375,
0.0028209686279296875,
0.0205230712890625,
0.00806427001953125,
-0.0233612060546875,
0.004840850830078125,
0.0182037353515625,
0.053466796875,
-0.037445068359375,
-0.045806884765625,
-0.034576416015625,
0.05322265625,
-0.0006256103515625,
-0.00537109375,
0.04803466796875,
0.06768798828125,
0.0379638671875,
-0.052154541015625,
0.0303497314453125,
-0.00035834312438964844,
0.021240234375,
-0.048248291015625,
0.061370849609375,
-0.08563232421875,
-0.0018110275268554688,
-0.036956787109375,
-0.06365966796875,
-0.007434844970703125,
0.0428466796875,
0.0234832763671875,
0.0157623291015625,
0.0293731689453125,
0.06829833984375,
-0.055084228515625,
0.0125885009765625,
0.024169921875,
0.022064208984375,
0.02899169921875,
0.0189971923828125,
0.04864501953125,
-0.07440185546875,
-0.00615692138671875,
-0.034332275390625,
-0.021270751953125,
-0.046661376953125,
-0.057891845703125,
-0.07830810546875,
-0.051788330078125,
-0.038543701171875,
-0.03216552734375,
-0.01543426513671875,
0.047607421875,
0.06591796875,
-0.06878662109375,
-0.0216827392578125,
0.01959228515625,
0.00893402099609375,
0.0005540847778320312,
-0.0162506103515625,
0.005558013916015625,
0.04766845703125,
-0.0626220703125,
0.03363037109375,
-0.0009140968322753906,
0.0462646484375,
0.01436614990234375,
0.01398468017578125,
-0.032257080078125,
0.018218994140625,
0.02130126953125,
0.0047607421875,
-0.038909912109375,
-0.007381439208984375,
-0.00583648681640625,
-0.011444091796875,
0.01042938232421875,
0.0263519287109375,
0.0020694732666015625,
0.01549530029296875,
0.0421142578125,
-0.0019092559814453125,
0.04644775390625,
-0.0124053955078125,
0.0302276611328125,
-0.0300140380859375,
0.0203094482421875,
-0.00849151611328125,
0.033966064453125,
0.024169921875,
-0.031768798828125,
0.0506591796875,
0.0164642333984375,
-0.0443115234375,
-0.058502197265625,
0.0120391845703125,
-0.092529296875,
-0.0143890380859375,
0.062408447265625,
0.0250701904296875,
-0.0662841796875,
0.038299560546875,
-0.03399658203125,
0.00803375244140625,
-0.0261383056640625,
0.02276611328125,
0.05914306640625,
0.01280975341796875,
-0.0114593505859375,
-0.07220458984375,
0.005939483642578125,
-0.006256103515625,
-0.044464111328125,
-0.01219940185546875,
0.027984619140625,
0.04595947265625,
0.0201873779296875,
0.037506103515625,
-0.03546142578125,
0.05035400390625,
-0.001811981201171875,
0.01849365234375,
0.0130615234375,
-0.04638671875,
0.02691650390625,
-0.003765106201171875,
0.023101806640625,
-0.0260162353515625
]
] |
google/byt5-base | 2023-01-24T16:36:53.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"multilingual",
"af",
"am",
"ar",
"az",
"be",
"bg",
"bn",
"ca",
"ceb",
"co",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fil",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"haw",
"hi",
"hmn",
"ht",
"hu",
"hy",
"ig",
"is",
"it",
"iw",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lb",
"lo",
"lt",
"lv",
"mg",
"mi",
"mk",
"ml",
"mn",
"mr",
"ms",
"mt",
"my",
"ne",
"nl",
"no",
"ny",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sd",
"si",
"sk",
"sl",
"sm",
"sn",
"so",
"sq",
"sr",
"st",
"su",
"sv",
"sw",
"ta",
"te",
"tg",
"th",
"tr",
"uk",
"und",
"ur",
"uz",
"vi",
"xh",
"yi",
"yo",
"zh",
"zu",
"dataset:mc4",
"arxiv:1907.06292",
"arxiv:2105.13626",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/byt5-base | 17 | 12,196 | transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- af
- am
- ar
- az
- be
- bg
- bn
- ca
- ceb
- co
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fil
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- haw
- hi
- hmn
- ht
- hu
- hy
- ig
- is
- it
- iw
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lb
- lo
- lt
- lv
- mg
- mi
- mk
- ml
- mn
- mr
- ms
- mt
- my
- ne
- nl
- no
- ny
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- st
- su
- sv
- sw
- ta
- te
- tg
- th
- tr
- uk
- und
- ur
- uz
- vi
- xh
- yi
- yo
- zh
- zu
datasets:
- mc4
license: apache-2.0
---
# ByT5 - Base
ByT5 is a tokenizer-free version of [Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) and generally follows the architecture of [MT5](https://huggingface.co/google/mt5-base).
ByT5 was pre-trained only on [mC4](https://www.tensorflow.org/datasets/catalog/c4#c4multilingual), excluding any supervised training, with an average span-mask of 20 UTF-8 characters. Therefore, this model has to be fine-tuned before it is usable on a downstream task.
ByT5 works especially well on noisy text data; *e.g.*, `google/byt5-base` significantly outperforms [mt5-base](https://huggingface.co/google/mt5-base) on [TweetQA](https://arxiv.org/abs/1907.06292).
Paper: [ByT5: Towards a token-free future with pre-trained byte-to-byte models](https://arxiv.org/abs/2105.13626)
Authors: *Linting Xue, Aditya Barua, Noah Constant, Rami Al-Rfou, Sharan Narang, Mihir Kale, Adam Roberts, Colin Raffel*
## Example Inference
ByT5 works on raw UTF-8 bytes and can be used without a tokenizer:
```python
from transformers import T5ForConditionalGeneration
import torch
model = T5ForConditionalGeneration.from_pretrained('google/byt5-base')
input_ids = torch.tensor([list("Life is like a box of chocolates.".encode("utf-8"))]) + 3 # add 3 for special tokens
labels = torch.tensor([list("La vie est comme une boîte de chocolat.".encode("utf-8"))]) + 3 # add 3 for special tokens
loss = model(input_ids, labels=labels).loss # forward pass
```
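Generated ids can be mapped back to text by reversing the same +3 offset. The following is a minimal sketch rather than part of the original example; it assumes ids 0–2 are the reserved special tokens and ids 3–258 are the 256 byte values, and the raw (non-fine-tuned) checkpoint is not expected to produce meaningful output:
```python
from transformers import T5ForConditionalGeneration
import torch

model = T5ForConditionalGeneration.from_pretrained('google/byt5-base')

input_ids = torch.tensor([list("Life is like a box of chocolates.".encode("utf-8"))]) + 3
generated = model.generate(input_ids, max_length=40)

# keep only byte ids (3..258), undo the +3 offset, and decode as UTF-8 bytes
byte_values = [t - 3 for t in generated[0].tolist() if 3 <= t < 259]
text = bytes(byte_values).decode("utf-8", errors="ignore")
```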
For batched inference and training, however, it is recommended to use a tokenizer class for padding:
```python
from transformers import T5ForConditionalGeneration, AutoTokenizer
model = T5ForConditionalGeneration.from_pretrained('google/byt5-base')
tokenizer = AutoTokenizer.from_pretrained('google/byt5-base')
model_inputs = tokenizer(["Life is like a box of chocolates.", "Today is Monday."], padding="longest", return_tensors="pt")
labels = tokenizer(["La vie est comme une boîte de chocolat.", "Aujourd'hui c'est lundi."], padding="longest", return_tensors="pt").input_ids
loss = model(**model_inputs, labels=labels).loss # forward pass
```
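The same tokenizer also reverses the byte mapping on the way out. This is a hedged sketch (not from the original card, and again the raw pre-trained checkpoint will not produce polished text) of a batched generate-and-decode round trip:
```python
from transformers import T5ForConditionalGeneration, AutoTokenizer

model = T5ForConditionalGeneration.from_pretrained('google/byt5-base')
tokenizer = AutoTokenizer.from_pretrained('google/byt5-base')

model_inputs = tokenizer(["Life is like a box of chocolates.", "Today is Monday."],
                         padding="longest", return_tensors="pt")
outputs = model.generate(**model_inputs, max_length=40)

# batch_decode strips padding/special ids and converts byte ids back to text
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```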
## Abstract
Most widely-used pre-trained language models operate on sequences of tokens corresponding to word or subword units. Encoding text as a sequence of tokens requires a tokenizer, which is typically created as an independent artifact from the model. Token-free models that instead operate directly on raw text (bytes or characters) have many benefits: they can process text in any language out of the box, they are more robust to noise, and they minimize technical debt by removing complex and error-prone text preprocessing pipelines. Since byte or character sequences are longer than token sequences, past work on token-free models has often introduced new model architectures designed to amortize the cost of operating directly on raw text. In this paper, we show that a standard Transformer architecture can be used with minimal modifications to process byte sequences. We carefully characterize the trade-offs in terms of parameter count, training FLOPs, and inference speed, and show that byte-level models are competitive with their token-level counterparts. We also demonstrate that byte-level models are significantly more robust to noise and perform better on tasks that are sensitive to spelling and pronunciation. As part of our contribution, we release a new set of pre-trained byte-level Transformer models based on the T5 architecture, as well as all code and data used in our experiments.
 | 4,221 | [
[
-0.0164947509765625,
-0.02130126953125,
0.017303466796875,
0.01361083984375,
-0.027679443359375,
0.0016231536865234375,
-0.0187225341796875,
-0.042694091796875,
-0.0030803680419921875,
0.0252838134765625,
-0.04217529296875,
-0.039825439453125,
-0.057373046875,
0.0159454345703125,
-0.03265380859375,
0.07769775390625,
-0.0019550323486328125,
0.00836181640625,
0.015106201171875,
-0.01309967041015625,
-0.02447509765625,
-0.04681396484375,
-0.045196533203125,
-0.0307769775390625,
0.03436279296875,
0.02099609375,
0.033416748046875,
0.05523681640625,
0.03607177734375,
0.0264129638671875,
-0.006015777587890625,
-0.006565093994140625,
-0.02825927734375,
-0.01080322265625,
-0.0016546249389648438,
-0.046417236328125,
-0.039398193359375,
-0.0023632049560546875,
0.0283966064453125,
0.049713134765625,
0.00966644287109375,
0.026824951171875,
-0.0117340087890625,
0.024810791015625,
-0.0462646484375,
0.007781982421875,
-0.05126953125,
0.0079193115234375,
-0.016082763671875,
0.0067901611328125,
-0.05108642578125,
-0.007808685302734375,
0.002628326416015625,
-0.039337158203125,
0.04632568359375,
0.00511932373046875,
0.07928466796875,
0.016326904296875,
-0.026336669921875,
-0.0194244384765625,
-0.045379638671875,
0.07635498046875,
-0.05059814453125,
0.05804443359375,
0.01038360595703125,
0.0095672607421875,
-0.0031414031982421875,
-0.09033203125,
-0.041015625,
0.0054931640625,
-0.00922393798828125,
0.0181884765625,
-0.0152587890625,
0.015594482421875,
0.02294921875,
0.036285400390625,
-0.0428466796875,
-0.0005135536193847656,
-0.052001953125,
-0.01207733154296875,
0.02410888671875,
-0.0137481689453125,
0.0229034423828125,
-0.02496337890625,
-0.037628173828125,
-0.01032257080078125,
-0.047149658203125,
0.01430511474609375,
0.01332855224609375,
0.020294189453125,
-0.01285552978515625,
0.018707275390625,
-0.004856109619140625,
0.0241851806640625,
0.017974853515625,
0.0016927719116210938,
0.029327392578125,
-0.0244903564453125,
-0.0268096923828125,
0.0177001953125,
0.0675048828125,
0.0079803466796875,
0.027191162109375,
-0.046234130859375,
-0.0262298583984375,
0.0146636962890625,
0.023651123046875,
-0.092529296875,
-0.0181121826171875,
0.0295562744140625,
-0.05120849609375,
-0.0297393798828125,
-0.0026569366455078125,
-0.036163330078125,
-0.0026035308837890625,
0.00847625732421875,
0.042083740234375,
-0.04510498046875,
-0.00885009765625,
0.0211944580078125,
-0.0183258056640625,
0.0123291015625,
-0.0027866363525390625,
-0.100341796875,
0.0164642333984375,
0.0293121337890625,
0.056427001953125,
-0.011199951171875,
-0.018585205078125,
-0.028656005859375,
-0.0010776519775390625,
-0.0267333984375,
0.046173095703125,
-0.021636962890625,
-0.044158935546875,
-0.0259246826171875,
0.01447296142578125,
-0.002277374267578125,
-0.035858154296875,
0.062286376953125,
-0.0426025390625,
0.03460693359375,
-0.01678466796875,
-0.04278564453125,
-0.014678955078125,
0.007678985595703125,
-0.049285888671875,
0.06976318359375,
0.0030498504638671875,
-0.061279296875,
0.0531005859375,
-0.0643310546875,
-0.02044677734375,
-0.0025806427001953125,
0.01375579833984375,
-0.04498291015625,
0.0031337738037109375,
0.0308837890625,
0.031494140625,
-0.01214599609375,
0.0130462646484375,
-0.0185089111328125,
-0.041839599609375,
0.006137847900390625,
-0.045806884765625,
0.06866455078125,
0.0244903564453125,
-0.033355712890625,
0.0218963623046875,
-0.07464599609375,
0.004810333251953125,
-0.0017995834350585938,
-0.03887939453125,
0.00374603271484375,
-0.00731658935546875,
0.0068817138671875,
0.0160980224609375,
0.013397216796875,
-0.043975830078125,
0.020843505859375,
-0.03460693359375,
0.052642822265625,
0.048736572265625,
-0.0179443359375,
0.0418701171875,
-0.021453857421875,
0.024566650390625,
0.0293731689453125,
0.010406494140625,
-0.0252685546875,
-0.016448974609375,
-0.07623291015625,
-0.03509521484375,
0.04632568359375,
0.037811279296875,
-0.050537109375,
0.0273284912109375,
-0.05712890625,
-0.038116455078125,
-0.05035400390625,
-0.0098876953125,
0.0218963623046875,
0.038177490234375,
0.056396484375,
-0.0258941650390625,
-0.06378173828125,
-0.04022216796875,
-0.0191497802734375,
0.0007877349853515625,
-0.004741668701171875,
-0.0005230903625488281,
0.037567138671875,
-0.0194244384765625,
0.06939697265625,
-0.006500244140625,
-0.0028362274169921875,
-0.031494140625,
0.0202178955078125,
0.0228271484375,
0.0550537109375,
0.03753662109375,
-0.043670654296875,
-0.0222930908203125,
-0.0164794921875,
-0.05419921875,
0.003398895263671875,
-0.006626129150390625,
0.01174163818359375,
0.027008056640625,
0.0229034423828125,
-0.05059814453125,
0.02130126953125,
0.0316162109375,
-0.032318115234375,
0.033050537109375,
-0.02081298828125,
0.0051727294921875,
-0.09832763671875,
0.0166778564453125,
-0.01000213623046875,
-0.035369873046875,
-0.055572509765625,
-0.0101470947265625,
0.01100921630859375,
-0.007213592529296875,
-0.050079345703125,
0.059814453125,
-0.042388916015625,
0.002033233642578125,
-0.0007877349853515625,
-0.0234375,
-0.004405975341796875,
0.044586181640625,
0.004962921142578125,
0.07684326171875,
0.033538818359375,
-0.048065185546875,
0.0056610107421875,
0.0196990966796875,
-0.0264739990234375,
0.00720977783203125,
-0.048797607421875,
0.015533447265625,
-0.00687408447265625,
0.020904541015625,
-0.0672607421875,
0.004955291748046875,
0.01995849609375,
-0.0584716796875,
0.02313232421875,
-0.0148773193359375,
-0.03472900390625,
-0.0275726318359375,
-0.027801513671875,
0.0298919677734375,
0.053466796875,
-0.04718017578125,
0.048187255859375,
-0.01221466064453125,
0.0263824462890625,
-0.0682373046875,
-0.07635498046875,
0.01003265380859375,
-0.0267181396484375,
-0.0546875,
0.04705810546875,
0.0014314651489257812,
0.0242919921875,
-0.0008425712585449219,
-0.018707275390625,
-0.005924224853515625,
0.007633209228515625,
0.0084686279296875,
0.0230560302734375,
-0.025360107421875,
-0.00565338134765625,
-0.01080322265625,
-0.021575927734375,
0.003307342529296875,
-0.051788330078125,
0.044219970703125,
-0.0101470947265625,
0.00806427001953125,
-0.0302734375,
0.00820159912109375,
0.0455322265625,
-0.018585205078125,
0.0638427734375,
0.0677490234375,
-0.0198822021484375,
-0.01456451416015625,
-0.034698486328125,
-0.0187225341796875,
-0.0419921875,
0.0276336669921875,
-0.05126953125,
-0.046295166015625,
0.0555419921875,
0.01131439208984375,
0.0068817138671875,
0.036285400390625,
0.037750244140625,
0.0081024169921875,
0.06646728515625,
0.048065185546875,
-0.0023040771484375,
0.046417236328125,
-0.040283203125,
0.0205841064453125,
-0.044586181640625,
-0.00457763671875,
-0.0299072265625,
-0.01983642578125,
-0.066162109375,
-0.0170440673828125,
0.005275726318359375,
-0.0032138824462890625,
-0.027496337890625,
0.03350830078125,
-0.040374755859375,
0.0255279541015625,
0.048919677734375,
0.006023406982421875,
0.01392364501953125,
0.0012903213500976562,
-0.0186309814453125,
-0.0008177757263183594,
-0.058624267578125,
-0.0423583984375,
0.087890625,
0.0246734619140625,
0.050933837890625,
-0.006664276123046875,
0.05035400390625,
0.0098876953125,
0.0194244384765625,
-0.05133056640625,
0.035003662109375,
-0.033416748046875,
-0.05535888671875,
-0.00614166259765625,
-0.0257415771484375,
-0.08648681640625,
0.0145111083984375,
-0.0222625732421875,
-0.071044921875,
0.010009765625,
0.0123138427734375,
-0.01751708984375,
0.04058837890625,
-0.07366943359375,
0.08404541015625,
-0.003368377685546875,
-0.026031494140625,
0.006389617919921875,
-0.05755615234375,
0.01922607421875,
0.0135040283203125,
-0.0021343231201171875,
0.02130126953125,
0.00884246826171875,
0.056304931640625,
-0.036041259765625,
0.0621337890625,
-0.0096435546875,
0.01483917236328125,
0.01338958740234375,
-0.01505279541015625,
0.034576416015625,
-0.0255584716796875,
-0.007389068603515625,
0.0190582275390625,
0.0114898681640625,
-0.038787841796875,
-0.04058837890625,
0.0290069580078125,
-0.07391357421875,
-0.0325927734375,
-0.0181732177734375,
-0.029388427734375,
-0.0004143714904785156,
0.037261962890625,
0.054290771484375,
0.033233642578125,
0.0033817291259765625,
0.03253173828125,
0.053466796875,
-0.01331329345703125,
0.060211181640625,
-0.0024738311767578125,
-0.003208160400390625,
-0.032318115234375,
0.075927734375,
0.0243682861328125,
0.013671875,
0.037384033203125,
0.01216888427734375,
-0.042205810546875,
-0.04217529296875,
-0.042755126953125,
0.00804901123046875,
-0.067626953125,
-0.0038738250732421875,
-0.06353759765625,
-0.0218658447265625,
-0.043670654296875,
-0.012908935546875,
-0.030181884765625,
-0.0301513671875,
-0.0321044921875,
-0.0157012939453125,
0.00936126708984375,
0.0264129638671875,
0.0164642333984375,
0.036376953125,
-0.06744384765625,
0.0189056396484375,
-0.0019664764404296875,
0.030914306640625,
0.004425048828125,
-0.043792724609375,
-0.016845703125,
-0.00885009765625,
-0.031494140625,
-0.05364990234375,
0.0303955078125,
-0.0046539306640625,
0.022705078125,
0.023712158203125,
0.0036754608154296875,
0.05352783203125,
-0.035552978515625,
0.0693359375,
0.00928497314453125,
-0.0919189453125,
-0.003082275390625,
-0.01439666748046875,
0.024627685546875,
0.0229034423828125,
0.0241851806640625,
-0.053466796875,
-0.015106201171875,
-0.07135009765625,
-0.0643310546875,
0.055084228515625,
0.019317626953125,
0.01352691650390625,
0.0074615478515625,
0.0112762451171875,
0.016845703125,
0.01549530029296875,
-0.0787353515625,
-0.0230865478515625,
-0.045379638671875,
-0.033416748046875,
0.01081085205078125,
-0.014129638671875,
0.0372314453125,
-0.0126953125,
0.055694580078125,
0.005260467529296875,
0.04779052734375,
0.0152130126953125,
-0.0289459228515625,
0.0185394287109375,
0.0232086181640625,
0.04534912109375,
0.0269622802734375,
-0.0200042724609375,
0.018707275390625,
0.038482666015625,
-0.049346923828125,
0.00907135009765625,
0.0017795562744140625,
-0.0174713134765625,
0.0093536376953125,
0.0274505615234375,
0.088623046875,
0.0036716461181640625,
-0.02313232421875,
0.0306854248046875,
-0.01104736328125,
-0.01953125,
-0.01294708251953125,
0.0059356689453125,
0.007793426513671875,
0.0081939697265625,
0.022674560546875,
0.0102081298828125,
0.0090179443359375,
-0.0313720703125,
0.013763427734375,
0.0178375244140625,
-0.026275634765625,
-0.04205322265625,
0.07867431640625,
0.0155029296875,
-0.019287109375,
0.05133056640625,
-0.0118560791015625,
-0.042877197265625,
0.038421630859375,
0.0550537109375,
0.0699462890625,
0.0014734268188476562,
-0.01116943359375,
0.02740478515625,
0.016448974609375,
-0.005645751953125,
0.0154571533203125,
-0.0168609619140625,
-0.06109619140625,
-0.0321044921875,
-0.0408935546875,
-0.01499176025390625,
0.0180511474609375,
-0.017974853515625,
0.047882080078125,
-0.032012939453125,
-0.010772705078125,
0.00011080503463745117,
0.0133514404296875,
-0.0548095703125,
0.054901123046875,
0.01499176025390625,
0.07366943359375,
-0.042388916015625,
0.06915283203125,
0.040985107421875,
-0.0439453125,
-0.0767822265625,
-0.0078277587890625,
-0.0435791015625,
-0.0538330078125,
0.047882080078125,
0.03546142578125,
-0.006282806396484375,
0.019927978515625,
-0.037933349609375,
-0.062347412109375,
0.09295654296875,
0.035858154296875,
-0.005481719970703125,
-0.033447265625,
0.02001953125,
0.035308837890625,
-0.005764007568359375,
0.046417236328125,
0.0270538330078125,
0.0264434814453125,
0.021026611328125,
-0.0560302734375,
0.01085662841796875,
-0.0174560546875,
-0.002040863037109375,
0.01093292236328125,
-0.0556640625,
0.0614013671875,
-0.01031494140625,
-0.020904541015625,
-0.0076141357421875,
0.0716552734375,
0.008514404296875,
-0.005161285400390625,
0.0298614501953125,
0.03948974609375,
0.06207275390625,
-0.009765625,
0.07843017578125,
-0.037139892578125,
0.045074462890625,
0.0513916015625,
0.0275115966796875,
0.040283203125,
0.03350830078125,
-0.001071929931640625,
0.0335693359375,
0.060882568359375,
-0.0129852294921875,
0.040008544921875,
0.00670623779296875,
-0.0194091796875,
-0.00983428955078125,
0.010711669921875,
-0.018096923828125,
0.0290374755859375,
0.007472991943359375,
-0.03289794921875,
-0.0197296142578125,
-0.00933074951171875,
0.0213775634765625,
-0.04107666015625,
-0.0164337158203125,
0.035308837890625,
0.003509521484375,
-0.050201416015625,
0.06298828125,
0.0220794677734375,
0.08038330078125,
-0.036773681640625,
0.02838134765625,
-0.021759033203125,
0.026519775390625,
-0.0251007080078125,
-0.032562255859375,
0.0311431884765625,
0.0135955810546875,
-0.0191650390625,
-0.033782958984375,
0.057373046875,
-0.0270843505859375,
-0.041290283203125,
0.004833221435546875,
0.0234222412109375,
0.0218353271484375,
0.01094818115234375,
-0.042236328125,
-0.00908660888671875,
-0.006565093994140625,
-0.040740966796875,
0.006317138671875,
0.0361328125,
-0.0236053466796875,
0.05828857421875,
0.046661376953125,
0.0046234130859375,
0.032867431640625,
-0.00811767578125,
0.04608154296875,
-0.050048828125,
-0.04217529296875,
-0.07379150390625,
0.0531005859375,
0.017669677734375,
-0.0264434814453125,
0.0296630859375,
0.050048828125,
0.07208251953125,
-0.0186920166015625,
0.06103515625,
-0.001293182373046875,
0.0090484619140625,
-0.0391845703125,
0.065185546875,
-0.04559326171875,
-0.0025043487548828125,
-0.003932952880859375,
-0.04937744140625,
-0.0303955078125,
0.03582763671875,
-0.0268707275390625,
0.0209503173828125,
0.0584716796875,
0.051788330078125,
-0.0325927734375,
-0.01464080810546875,
0.033538818359375,
0.0084381103515625,
0.04052734375,
0.039642333984375,
0.026641845703125,
-0.05804443359375,
0.061981201171875,
-0.006847381591796875,
0.0167694091796875,
0.0030956268310546875,
-0.06878662109375,
-0.0780029296875,
-0.032684326171875,
-0.0040740966796875,
-0.029083251953125,
0.007694244384765625,
0.08538818359375,
0.0677490234375,
-0.05810546875,
-0.008331298828125,
-0.0224609375,
0.0002734661102294922,
0.0000629425048828125,
-0.01218414306640625,
0.0325927734375,
-0.04693603515625,
-0.0738525390625,
0.00711822509765625,
-0.003948211669921875,
0.0187835693359375,
0.0008134841918945312,
0.00905609130859375,
0.0037174224853515625,
0.014434814453125,
0.0328369140625,
0.0035190582275390625,
-0.04052734375,
-0.032623291015625,
0.023681640625,
-0.0169219970703125,
0.041656494140625,
0.042694091796875,
-0.06756591796875,
0.0218963623046875,
0.03717041015625,
0.059478759765625,
0.0587158203125,
-0.003772735595703125,
0.0352783203125,
-0.058441162109375,
0.0167083740234375,
-0.0018358230590820312,
0.034698486328125,
0.0340576171875,
-0.0257110595703125,
0.030029296875,
0.037872314453125,
-0.0328369140625,
-0.0509033203125,
-0.0007905960083007812,
-0.07427978515625,
-0.0093536376953125,
0.07281494140625,
-0.0156097412109375,
-0.036651611328125,
0.01287078857421875,
-0.0035991668701171875,
0.050750732421875,
-0.03265380859375,
0.06597900390625,
0.0697021484375,
0.01348876953125,
-0.02081298828125,
-0.024658203125,
0.03912353515625,
0.038055419921875,
-0.0501708984375,
-0.01165771484375,
-0.006458282470703125,
0.043487548828125,
0.005855560302734375,
0.046844482421875,
-0.00244903564453125,
0.01152801513671875,
0.0075531005859375,
0.03045654296875,
-0.01010894775390625,
-0.0118255615234375,
-0.01558685302734375,
0.01331329345703125,
-0.007541656494140625,
-0.048431396484375
]
] |
EleutherAI/pythia-1b-deduped | 2023-07-10T15:04:31.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2304.01373",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-1b-deduped | 14 | 12,183 | transformers | 2023-02-14T00:07:42 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- EleutherAI/the_pile_deduplicated
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf).
It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. We also provide 154 intermediate
checkpoints per model, hosted on Hugging Face as branches.
The Pythia model suite was designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>
Previously, we released an early version of the Pythia suite to the public.
However, we decided to retrain the model suite to address a few hyperparameter
discrepancies. This model card <a href="#changelog">lists the changes</a>;
see appendix B in the Pythia paper for further discussion. We found no
difference in benchmark performance between the two Pythia versions.
The old models are
[still available](https://huggingface.co/models?other=pythia_v0), but we
suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
</details>
<br>
# Pythia-1B-deduped
## Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
[See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation
details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
## Uses and Limitations
### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. We also provide
154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints
`step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to
`step143000`. These checkpoints are hosted on Hugging Face as branches. Note
that branch `143000` corresponds exactly to the model checkpoint on the `main`
branch of each model.
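As a concrete illustration (ours, not from the original card), the full list of 154 branch names can be enumerated directly from that description:
```python
# 1 initial + 10 log-spaced + 143 evenly-spaced checkpoints = 154 branches
steps = [0] + [2 ** i for i in range(10)] + list(range(1000, 143001, 1000))
branches = [f"step{s}" for s in steps]

assert len(branches) == 154
assert branches[-1] == "step143000"  # identical to the `main` branch
```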
You may also further fine-tune and adapt Pythia-1B-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-1B-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions. For example,
the model may generate harmful or offensive text. Please evaluate the risks
associated with your particular use case.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-1B-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose
or commercial chatbots. This means Pythia-1B-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “follow” human instructions.
### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on Pythia-1B-deduped to produce factually accurate
output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-1B-deduped may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-1B-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
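Because each checkpoint is simply a git branch, cross-checkpoint analyses reduce to a loop over `revision` values. A minimal sketch of this pattern, with an arbitrary prompt and an arbitrary subset of steps:
```python
import torch
from transformers import GPTNeoXForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-1b-deduped")
inputs = tokenizer("The Pile is a dataset for", return_tensors="pt")

# track how next-token loss on a fixed prompt evolves over training
for step in [1000, 36000, 72000, 143000]:
    model = GPTNeoXForCausalLM.from_pretrained(
        "EleutherAI/pythia-1b-deduped", revision=f"step{step}"
    )
    with torch.no_grad():
        loss = model(**inputs, labels=inputs["input_ids"]).loss
    print(f"step{step}: loss {loss.item():.3f}")
```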
## Training
### Training data
Pythia-1B-deduped was trained on the Pile **after the dataset has been globally
deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training,
from `step1000` to `step143000` (which is the same as `main`). In addition, we
also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for 143000 steps at a batch size
of 2M (2,097,152 tokens).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
## Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
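If you have cloned that repository, the result files are plain JSON and can be inspected
directly. The sketch below uses a hypothetical file name, so check the `results/json/`
directory for the actual layout:
```python
# Sketch: read one per-step result file from a local clone of EleutherAI/pythia.
# The file name below is hypothetical; inspect results/json/ for the real layout.
import json
from pathlib import Path

results_dir = Path("pythia/results/json")                              # local clone
result_file = results_dir / "pythia-1b-deduped" / "step143000.json"    # hypothetical name

with open(result_file) as f:
    results = json.load(f)
print(list(results.keys()))
```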
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge – Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>
## Changelog
This section lists the differences between the previously released
[Pythia v0](https://huggingface.co/models?other=pythia_v0) suite and the current
models. See Appendix B of the Pythia paper for further discussion of these
changes and the motivation behind them. We found that retraining Pythia had no
impact on benchmark performance.
- All model sizes are now trained with a uniform batch size of 2M tokens.
Previously, the 160M, 410M, and 1.4B parameter models were trained with a
batch size of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,
128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all
models of size 2.8B parameters or smaller had a learning rate (LR) schedule
which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and
12B models used an LR schedule which decayed to a minimum LR of 0. In
the redone training runs, we rectified this inconsistency: all models are now
trained with the LR decaying to a minimum of 0.1× their maximum LR (see the
sketch below).
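As an illustration of the corrected behaviour, the sketch below assumes a cosine decay
with a linear warmup; the warmup length, functional form, and the example maximum LR
are assumptions for illustration only.
```python
# Illustrative sketch only: an LR schedule that decays to 0.1x the maximum LR,
# assuming cosine decay with a linear warmup. Warmup length, cosine shape, and
# the example max_lr are assumptions, not restated training configuration.
import math

def lr_at_step(step: int, total_steps: int, max_lr: float,
               min_lr_ratio: float = 0.1, warmup_steps: int = 1000) -> float:
    if step < warmup_steps:
        return max_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    min_lr = min_lr_ratio * max_lr
    return min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * progress))

print(lr_at_step(143_000, 143_000, max_lr=2e-4))  # ends at 0.1 * max_lr = 2e-5
```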
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
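The non-embedding counts in the table below can be recovered from the totals by
subtracting the input and output embedding matrices. A rough sanity check for the two
smallest models, assuming the GPT-NeoX padded vocabulary of 50,304 and hidden sizes of
512 and 768 respectively (values not restated in this card):
```python
# Sanity check on the table below. Vocabulary and hidden sizes are assumptions:
# GPT-NeoX padded vocab of 50,304 and hidden sizes 512 (70M) / 768 (160M).
# Non-embedding params = total params - (input + output embedding matrices).
VOCAB = 50_304

def non_embedding(total_params: int, d_model: int) -> int:
    return total_params - 2 * VOCAB * d_model

print(non_embedding(70_426_624, 512))    # 18,915,328
print(non_embedding(162_322_944, 768))   # 85,056,000
```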
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 13,644 | [
[
-0.0236358642578125,
-0.062286376953125,
0.02374267578125,
0.0050201416015625,
-0.0181427001953125,
-0.01471710205078125,
-0.0158843994140625,
-0.03375244140625,
0.01409912109375,
0.01163482666015625,
-0.0268707275390625,
-0.020050048828125,
-0.033966064453125,
-0.00499725341796875,
-0.034149169921875,
0.0823974609375,
-0.007610321044921875,
-0.0110321044921875,
0.0099945068359375,
-0.0031452178955078125,
-0.005741119384765625,
-0.039794921875,
-0.035369873046875,
-0.029144287109375,
0.046905517578125,
0.0158843994140625,
0.06536865234375,
0.041839599609375,
0.0124969482421875,
0.0213775634765625,
-0.0283203125,
-0.00640106201171875,
-0.01226043701171875,
-0.00717926025390625,
-0.0031070709228515625,
-0.018890380859375,
-0.05450439453125,
0.003261566162109375,
0.051605224609375,
0.05072021484375,
-0.01316070556640625,
0.0192108154296875,
0.0001595020294189453,
0.028533935546875,
-0.0384521484375,
0.0017414093017578125,
-0.0265045166015625,
-0.0157623291015625,
-0.0045013427734375,
0.0127410888671875,
-0.0281982421875,
-0.0270538330078125,
0.032684326171875,
-0.0477294921875,
0.018524169921875,
0.004146575927734375,
0.08935546875,
-0.007305145263671875,
-0.032196044921875,
-0.00628662109375,
-0.05517578125,
0.0496826171875,
-0.054107666015625,
0.023681640625,
0.022857666015625,
0.01293182373046875,
-0.0005960464477539062,
-0.06719970703125,
-0.04315185546875,
-0.01457977294921875,
-0.01169586181640625,
-0.0031070709228515625,
-0.050262451171875,
0.0008716583251953125,
0.03961181640625,
0.047149658203125,
-0.06292724609375,
-0.0010652542114257812,
-0.030029296875,
-0.025482177734375,
0.0252532958984375,
0.0047607421875,
0.03167724609375,
-0.0226287841796875,
-0.0009679794311523438,
-0.027374267578125,
-0.051361083984375,
-0.017822265625,
0.040374755859375,
0.005245208740234375,
-0.026885986328125,
0.037933349609375,
-0.0279998779296875,
0.041839599609375,
-0.005229949951171875,
0.0197906494140625,
0.031646728515625,
-0.01446533203125,
-0.0391845703125,
-0.0058746337890625,
0.068603515625,
0.0081939697265625,
0.0161590576171875,
-0.0007796287536621094,
-0.0020313262939453125,
0.00560760498046875,
0.0027713775634765625,
-0.087646484375,
-0.059661865234375,
0.01873779296875,
-0.0295562744140625,
-0.0316162109375,
-0.013824462890625,
-0.07012939453125,
-0.015106201171875,
-0.0170440673828125,
0.04339599609375,
-0.03759765625,
-0.0567626953125,
-0.00937652587890625,
-0.00004941225051879883,
0.0157318115234375,
0.02850341796875,
-0.072265625,
0.0306243896484375,
0.033203125,
0.076171875,
0.0184326171875,
-0.040557861328125,
-0.0146942138671875,
-0.02069091796875,
-0.0088958740234375,
0.025543212890625,
-0.00878143310546875,
-0.01381683349609375,
-0.0096435546875,
0.01183319091796875,
-0.0074462890625,
-0.0265045166015625,
0.0267333984375,
-0.0302886962890625,
0.0181121826171875,
-0.023101806640625,
-0.0316162109375,
-0.029022216796875,
0.0101165771484375,
-0.046051025390625,
0.062347412109375,
0.02081298828125,
-0.07293701171875,
0.017913818359375,
-0.016204833984375,
-0.006084442138671875,
-0.0030384063720703125,
0.01552581787109375,
-0.050140380859375,
0.0013408660888671875,
0.0248565673828125,
0.002658843994140625,
-0.031005859375,
0.01535797119140625,
-0.0172119140625,
-0.032958984375,
0.01337432861328125,
-0.04248046875,
0.071533203125,
0.0155487060546875,
-0.049224853515625,
0.0178375244140625,
-0.04571533203125,
0.0144195556640625,
0.0190277099609375,
-0.0271453857421875,
0.0021114349365234375,
-0.0146942138671875,
0.0290069580078125,
0.015594482421875,
0.01348876953125,
-0.0282745361328125,
0.0239715576171875,
-0.038818359375,
0.05645751953125,
0.055816650390625,
-0.00403594970703125,
0.03460693359375,
-0.030303955078125,
0.034423828125,
0.004497528076171875,
0.01424407958984375,
-0.0036792755126953125,
-0.044525146484375,
-0.07366943359375,
-0.0250701904296875,
0.0285491943359375,
0.0247802734375,
-0.034576416015625,
0.03387451171875,
-0.0175933837890625,
-0.0655517578125,
-0.0128173828125,
-0.006237030029296875,
0.03076171875,
0.0241851806640625,
0.03240966796875,
-0.01033782958984375,
-0.040679931640625,
-0.06707763671875,
-0.01386260986328125,
-0.0330810546875,
0.00984954833984375,
0.0110321044921875,
0.071533203125,
-0.0112152099609375,
0.04461669921875,
-0.0267181396484375,
0.0179443359375,
-0.0286712646484375,
0.0126953125,
0.033416748046875,
0.048675537109375,
0.0295257568359375,
-0.04144287109375,
-0.0277557373046875,
-0.0005688667297363281,
-0.0423583984375,
0.0075531005859375,
0.0016803741455078125,
-0.0243377685546875,
0.0245208740234375,
0.004421234130859375,
-0.075439453125,
0.036224365234375,
0.044921875,
-0.042205810546875,
0.058013916015625,
-0.0240936279296875,
-0.0004191398620605469,
-0.08056640625,
0.022186279296875,
0.007114410400390625,
-0.0158843994140625,
-0.0435791015625,
0.0036602020263671875,
0.01328277587890625,
-0.01433563232421875,
-0.0285491943359375,
0.046173095703125,
-0.038299560546875,
-0.0127105712890625,
-0.0173492431640625,
0.005084991455078125,
-0.0016202926635742188,
0.047943115234375,
0.01141357421875,
0.0423583984375,
0.061767578125,
-0.059356689453125,
0.031402587890625,
0.0197906494140625,
-0.022186279296875,
0.028106689453125,
-0.068359375,
0.012481689453125,
0.0048828125,
0.0299224853515625,
-0.048675537109375,
-0.024749755859375,
0.0413818359375,
-0.043975830078125,
0.01318359375,
-0.032135009765625,
-0.04278564453125,
-0.032012939453125,
-0.01154327392578125,
0.045440673828125,
0.060333251953125,
-0.04644775390625,
0.04888916015625,
0.004673004150390625,
0.0085601806640625,
-0.0284576416015625,
-0.0452880859375,
-0.01953125,
-0.03839111328125,
-0.048187255859375,
0.0287628173828125,
0.014434814453125,
-0.01535797119140625,
0.0025005340576171875,
-0.0014171600341796875,
0.0079498291015625,
-0.0015668869018554688,
0.0238800048828125,
0.025482177734375,
-0.002880096435546875,
0.0024127960205078125,
-0.00885772705078125,
-0.01007080078125,
-0.0009379386901855469,
-0.037384033203125,
0.07330322265625,
-0.020538330078125,
-0.013092041015625,
-0.06085205078125,
-0.0002536773681640625,
0.06707763671875,
-0.032196044921875,
0.0657958984375,
0.04583740234375,
-0.0550537109375,
0.0103607177734375,
-0.027679443359375,
-0.02239990234375,
-0.033111572265625,
0.04974365234375,
-0.0196075439453125,
-0.0261993408203125,
0.046356201171875,
0.021636962890625,
0.0203704833984375,
0.042388916015625,
0.05584716796875,
0.01751708984375,
0.08905029296875,
0.032745361328125,
-0.012451171875,
0.047088623046875,
-0.041168212890625,
0.0186920166015625,
-0.08319091796875,
-0.01338958740234375,
-0.0399169921875,
-0.0208892822265625,
-0.07171630859375,
-0.0229949951171875,
0.023773193359375,
0.0197601318359375,
-0.05828857421875,
0.0439453125,
-0.042724609375,
0.0037555694580078125,
0.05010986328125,
0.01873779296875,
0.013397216796875,
0.0163421630859375,
0.007030487060546875,
-0.004261016845703125,
-0.048248291015625,
-0.0259857177734375,
0.09197998046875,
0.0382080078125,
0.046051025390625,
0.0232086181640625,
0.053009033203125,
-0.01031494140625,
0.0170440673828125,
-0.053497314453125,
0.03167724609375,
0.02496337890625,
-0.054718017578125,
-0.01568603515625,
-0.0577392578125,
-0.0711669921875,
0.0369873046875,
0.006824493408203125,
-0.08428955078125,
0.0176239013671875,
0.0175018310546875,
-0.026702880859375,
0.035064697265625,
-0.045806884765625,
0.0753173828125,
-0.0174713134765625,
-0.037933349609375,
-0.0295867919921875,
-0.022003173828125,
0.017730712890625,
0.0290374755859375,
0.007747650146484375,
0.0081939697265625,
0.024749755859375,
0.07244873046875,
-0.0501708984375,
0.04888916015625,
-0.012542724609375,
0.01049041748046875,
0.02752685546875,
0.024017333984375,
0.049713134765625,
0.01318359375,
0.0095977783203125,
-0.0022869110107421875,
0.0133514404296875,
-0.041778564453125,
-0.0283203125,
0.06817626953125,
-0.08172607421875,
-0.027435302734375,
-0.060302734375,
-0.04290771484375,
0.00699615478515625,
0.0144805908203125,
0.0301513671875,
0.051666259765625,
-0.004608154296875,
0.0005087852478027344,
0.043853759765625,
-0.039215087890625,
0.0276336669921875,
0.0192108154296875,
-0.035858154296875,
-0.03857421875,
0.0740966796875,
0.0017490386962890625,
0.024688720703125,
0.0026721954345703125,
0.01715087890625,
-0.0308685302734375,
-0.032989501953125,
-0.045654296875,
0.039825439453125,
-0.055511474609375,
-0.00002968311309814453,
-0.05450439453125,
-0.002429962158203125,
-0.034942626953125,
0.00879669189453125,
-0.03033447265625,
-0.031402587890625,
-0.017486572265625,
-0.003200531005859375,
0.043670654296875,
0.035125732421875,
0.0048980712890625,
0.0252227783203125,
-0.040191650390625,
-0.0016632080078125,
0.0182037353515625,
0.00763702392578125,
0.00862884521484375,
-0.06817626953125,
-0.0061187744140625,
0.0124053955078125,
-0.032928466796875,
-0.08477783203125,
0.0372314453125,
-0.006481170654296875,
0.0266265869140625,
0.005153656005859375,
-0.019195556640625,
0.04583740234375,
-0.005619049072265625,
0.0499267578125,
0.0116424560546875,
-0.078125,
0.04266357421875,
-0.035064697265625,
0.023284912109375,
0.029022216796875,
0.0260009765625,
-0.053466796875,
-0.00638580322265625,
-0.073486328125,
-0.08111572265625,
0.05767822265625,
0.037872314453125,
0.01325225830078125,
0.005832672119140625,
0.02947998046875,
-0.032562255859375,
0.01165771484375,
-0.0772705078125,
-0.021697998046875,
-0.018341064453125,
-0.005672454833984375,
0.01358795166015625,
-0.003604888916015625,
0.0052947998046875,
-0.044708251953125,
0.07696533203125,
0.00598907470703125,
0.0257568359375,
0.0223846435546875,
-0.030029296875,
-0.00704193115234375,
-0.0028476715087890625,
0.0120697021484375,
0.05712890625,
-0.01140594482421875,
0.0041351318359375,
0.0137176513671875,
-0.0413818359375,
0.002643585205078125,
0.0131378173828125,
-0.029022216796875,
-0.0046539306640625,
0.01261138916015625,
0.0657958984375,
0.00872039794921875,
-0.0313720703125,
0.017181396484375,
-0.001773834228515625,
-0.00600433349609375,
-0.0229339599609375,
-0.014404296875,
0.01258087158203125,
0.015838623046875,
-0.0017299652099609375,
-0.0113372802734375,
0.0007801055908203125,
-0.06689453125,
0.00412750244140625,
0.0182647705078125,
-0.01025390625,
-0.0305938720703125,
0.04437255859375,
0.00356292724609375,
-0.01346588134765625,
0.08642578125,
-0.021575927734375,
-0.05120849609375,
0.06060791015625,
0.038818359375,
0.05511474609375,
-0.012420654296875,
0.0263214111328125,
0.07159423828125,
0.026275634765625,
-0.0153045654296875,
0.006378173828125,
0.00620269775390625,
-0.03839111328125,
-0.00713348388671875,
-0.06109619140625,
-0.017578125,
0.021331787109375,
-0.04425048828125,
0.034942626953125,
-0.0487060546875,
-0.004779815673828125,
-0.00229644775390625,
0.0167083740234375,
-0.04510498046875,
0.02447509765625,
0.014373779296875,
0.053802490234375,
-0.06768798828125,
0.0623779296875,
0.0478515625,
-0.057464599609375,
-0.08135986328125,
0.002410888671875,
0.0010347366333007812,
-0.0338134765625,
0.015716552734375,
0.017791748046875,
0.0157012939453125,
0.01343536376953125,
-0.021148681640625,
-0.06536865234375,
0.0985107421875,
0.01702880859375,
-0.051422119140625,
-0.0207672119140625,
-0.00778961181640625,
0.040496826171875,
0.006023406982421875,
0.0546875,
0.05487060546875,
0.031646728515625,
0.00766754150390625,
-0.07952880859375,
0.0291290283203125,
-0.0237579345703125,
-0.005687713623046875,
0.01751708984375,
-0.050628662109375,
0.09832763671875,
-0.00598907470703125,
-0.002704620361328125,
0.030426025390625,
0.040618896484375,
0.0286407470703125,
-0.00879669189453125,
0.02691650390625,
0.058380126953125,
0.0660400390625,
-0.027679443359375,
0.0902099609375,
-0.0226593017578125,
0.058624267578125,
0.06500244140625,
0.01322174072265625,
0.037689208984375,
0.0301513671875,
-0.029937744140625,
0.039398193359375,
0.060272216796875,
-0.00785064697265625,
0.01456451416015625,
0.0210723876953125,
-0.019012451171875,
-0.019439697265625,
0.01117706298828125,
-0.0452880859375,
0.01702880859375,
0.01187896728515625,
-0.043243408203125,
-0.0169830322265625,
-0.02520751953125,
0.026275634765625,
-0.030853271484375,
-0.017120361328125,
0.0211944580078125,
0.005893707275390625,
-0.0504150390625,
0.04852294921875,
0.018890380859375,
0.041717529296875,
-0.0345458984375,
0.01016998291015625,
-0.01323699951171875,
0.0258941650390625,
-0.0251617431640625,
-0.031829833984375,
0.008941650390625,
0.0008177757263183594,
0.00400543212890625,
0.00818634033203125,
0.031280517578125,
-0.01194000244140625,
-0.0439453125,
0.0144500732421875,
0.036346435546875,
0.01800537109375,
-0.033966064453125,
-0.050445556640625,
0.005786895751953125,
-0.01285552978515625,
-0.040374755859375,
0.03204345703125,
0.019073486328125,
-0.007328033447265625,
0.042938232421875,
0.046417236328125,
0.0025310516357421875,
-0.0034236907958984375,
0.01045989990234375,
0.073486328125,
-0.034820556640625,
-0.035369873046875,
-0.07147216796875,
0.0394287109375,
0.00019288063049316406,
-0.050323486328125,
0.06298828125,
0.0439453125,
0.05108642578125,
0.018341064453125,
0.04608154296875,
-0.036590576171875,
0.0018215179443359375,
-0.0228118896484375,
0.0501708984375,
-0.038055419921875,
0.0045928955078125,
-0.03802490234375,
-0.08441162109375,
-0.0024261474609375,
0.07281494140625,
-0.037139892578125,
0.02716064453125,
0.05902099609375,
0.06158447265625,
-0.006603240966796875,
0.0056915283203125,
0.0028858184814453125,
0.022857666015625,
0.0390625,
0.0711669921875,
0.06768798828125,
-0.054168701171875,
0.040740966796875,
-0.03857421875,
-0.020416259765625,
-0.01142120361328125,
-0.03692626953125,
-0.06512451171875,
-0.035858154296875,
-0.039581298828125,
-0.057342529296875,
-0.0031890869140625,
0.06793212890625,
0.05450439453125,
-0.046661376953125,
-0.00939178466796875,
-0.042236328125,
0.00446319580078125,
-0.020599365234375,
-0.0182037353515625,
0.03167724609375,
0.00969696044921875,
-0.07220458984375,
-0.0034046173095703125,
-0.010986328125,
0.008392333984375,
-0.0308685302734375,
-0.01953125,
-0.01483154296875,
-0.007518768310546875,
0.00635528564453125,
0.0222625732421875,
-0.038787841796875,
-0.019287109375,
0.0017223358154296875,
0.0018672943115234375,
-0.0014181137084960938,
0.053497314453125,
-0.04461669921875,
0.00881195068359375,
0.04730224609375,
0.00841522216796875,
0.06158447265625,
-0.0187835693359375,
0.0291900634765625,
-0.0194854736328125,
0.0286712646484375,
0.02178955078125,
0.046630859375,
0.0245819091796875,
-0.0195770263671875,
0.01401519775390625,
0.0310516357421875,
-0.0546875,
-0.06597900390625,
0.027984619140625,
-0.0560302734375,
-0.00902557373046875,
0.097900390625,
-0.0196075439453125,
-0.0258636474609375,
0.00568389892578125,
-0.0178680419921875,
0.04034423828125,
-0.018768310546875,
0.0504150390625,
0.046600341796875,
0.00780487060546875,
-0.01336669921875,
-0.047576904296875,
0.0280914306640625,
0.0499267578125,
-0.06121826171875,
0.0296478271484375,
0.04766845703125,
0.046905517578125,
0.0175018310546875,
0.04534912109375,
-0.0222930908203125,
0.046600341796875,
0.00905609130859375,
0.0052947998046875,
0.00043582916259765625,
-0.03497314453125,
-0.032806396484375,
-0.01194000244140625,
0.01702880859375,
-0.0007386207580566406
]
] |
sb3/dqn-MountainCar-v0 | 2022-10-11T15:06:51.000Z | [
"stable-baselines3",
"MountainCar-v0",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | sb3 | null | null | sb3/dqn-MountainCar-v0 | 1 | 12,133 | stable-baselines3 | 2022-05-19T23:08:31 | ---
library_name: stable-baselines3
tags:
- MountainCar-v0
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- metrics:
- type: mean_reward
value: -103.40 +/- 7.49
name: mean_reward
task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: MountainCar-v0
type: MountainCar-v0
---
# **DQN** Agent playing **MountainCar-v0**
This is a trained model of a **DQN** agent playing **MountainCar-v0**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
```
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env MountainCar-v0 -orga sb3 -f logs/
# Watch the trained agent
python enjoy.py --algo dqn --env MountainCar-v0 -f logs/
```
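Alternatively, the checkpoint can be loaded directly in Python without the Zoo scripts.
The sketch below assumes the `huggingface_sb3` helper is installed and that the artifact
is named `dqn-MountainCar-v0.zip`; check the repository's file list if that differs.
```python
# Sketch: load the checkpoint directly with stable-baselines3.
import gym  # use gymnasium instead of gym with SB3 >= 2.0
from huggingface_sb3 import load_from_hub
from stable_baselines3 import DQN
from stable_baselines3.common.evaluation import evaluate_policy

checkpoint = load_from_hub(
    repo_id="sb3/dqn-MountainCar-v0",
    filename="dqn-MountainCar-v0.zip",  # assumed filename
)
model = DQN.load(checkpoint)

env = gym.make("MountainCar-v0")
mean_reward, std_reward = evaluate_policy(model, env, n_eval_episodes=10)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```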
## Training (with the RL Zoo)
```
# Train a new agent from scratch
python train.py --algo dqn --env MountainCar-v0 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env MountainCar-v0 -f logs/ -orga sb3
```
## Hyperparameters
```python
OrderedDict([('batch_size', 128),
('buffer_size', 10000),
('exploration_final_eps', 0.07),
('exploration_fraction', 0.2),
('gamma', 0.98),
('gradient_steps', 8),
('learning_rate', 0.004),
('learning_starts', 1000),
('n_timesteps', 120000.0),
('policy', 'MlpPolicy'),
('policy_kwargs', 'dict(net_arch=[256, 256])'),
('target_update_interval', 600),
('train_freq', 16),
('normalize', False)])
```
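For reference, these settings map onto the SB3 `DQN` constructor roughly as sketched
below; `n_timesteps` becomes the `learn()` budget and `normalize` is an RL Zoo option
with no constructor equivalent.
```python
# Sketch: training a DQN from scratch with the hyperparameters listed above.
import gym  # use gymnasium instead of gym with SB3 >= 2.0
from stable_baselines3 import DQN

env = gym.make("MountainCar-v0")
model = DQN(
    "MlpPolicy",
    env,
    batch_size=128,
    buffer_size=10_000,
    exploration_final_eps=0.07,
    exploration_fraction=0.2,
    gamma=0.98,
    gradient_steps=8,
    learning_rate=0.004,
    learning_starts=1000,
    policy_kwargs=dict(net_arch=[256, 256]),
    target_update_interval=600,
    train_freq=16,
    verbose=1,
)
model.learn(total_timesteps=120_000)  # "n_timesteps" from the config above
```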
| 2,073 | [
[
-0.03875732421875,
-0.034912109375,
0.01050567626953125,
0.018218994140625,
-0.025421142578125,
-0.01413726806640625,
0.004398345947265625,
-0.01739501953125,
-0.0014467239379882812,
0.03192138671875,
-0.06842041015625,
-0.038482666015625,
-0.0247802734375,
-0.005916595458984375,
-0.002025604248046875,
0.0882568359375,
-0.005764007568359375,
0.0075225830078125,
-0.00859832763671875,
-0.02276611328125,
-0.018218994140625,
-0.038970947265625,
-0.0616455078125,
-0.041839599609375,
0.0257110595703125,
-0.00682830810546875,
0.058685302734375,
0.06561279296875,
0.028564453125,
0.027801513671875,
-0.0181732177734375,
-0.007568359375,
-0.0382080078125,
-0.0009889602661132812,
-0.006153106689453125,
-0.015869140625,
-0.034271240234375,
-0.0154571533203125,
0.037353515625,
-0.0016193389892578125,
-0.035308837890625,
0.0196990966796875,
-0.0109405517578125,
0.031890869140625,
-0.051300048828125,
0.04522705078125,
-0.0224761962890625,
0.01088714599609375,
-0.00531005859375,
-0.00992584228515625,
-0.0012826919555664062,
-0.0147857666015625,
0.006008148193359375,
-0.0828857421875,
0.00963592529296875,
0.002101898193359375,
0.10357666015625,
0.033233642578125,
-0.04010009765625,
-0.01026153564453125,
-0.0462646484375,
0.06304931640625,
-0.06292724609375,
0.01690673828125,
0.026397705078125,
0.042755126953125,
-0.0187530517578125,
-0.051239013671875,
-0.047210693359375,
-0.0194091796875,
0.01039886474609375,
0.017578125,
-0.0036296844482421875,
0.0015926361083984375,
0.034088134765625,
0.0189971923828125,
-0.040008544921875,
0.0177764892578125,
-0.0210113525390625,
-0.00934600830078125,
0.037750244140625,
0.057342529296875,
0.0009479522705078125,
-0.00786590576171875,
-0.036651611328125,
-0.044342041015625,
-0.0408935546875,
0.04144287109375,
0.01236724853515625,
0.0237884521484375,
-0.018280029296875,
0.0357666015625,
-0.052947998046875,
0.039093017578125,
0.0013551712036132812,
-0.026611328125,
0.038818359375,
-0.010284423828125,
-0.0328369140625,
-0.00978851318359375,
0.0672607421875,
0.047393798828125,
-0.0161590576171875,
0.0217742919921875,
-0.04180908203125,
-0.0316162109375,
-0.000054955482482910156,
-0.06439208984375,
-0.0186309814453125,
0.0290374755859375,
-0.0190887451171875,
-0.022796630859375,
-0.00838470458984375,
-0.049285888671875,
-0.005523681640625,
-0.0147552490234375,
0.041534423828125,
-0.031524658203125,
-0.0165252685546875,
0.0112762451171875,
-0.0153350830078125,
0.041046142578125,
0.0236358642578125,
-0.0626220703125,
0.0234832763671875,
0.03338623046875,
0.0567626953125,
0.0256195068359375,
-0.055938720703125,
-0.038330078125,
0.0281219482421875,
-0.02093505859375,
0.062164306640625,
0.01311492919921875,
-0.0233001708984375,
0.0165557861328125,
0.0192718505859375,
-0.00335693359375,
-0.04248046875,
0.0345458984375,
-0.055450439453125,
0.0039215087890625,
-0.0204620361328125,
-0.01080322265625,
-0.04010009765625,
0.044677734375,
-0.059051513671875,
0.09814453125,
0.01284027099609375,
-0.055450439453125,
0.024993896484375,
-0.03619384765625,
-0.0154571533203125,
0.014617919921875,
0.00891876220703125,
-0.05694580078125,
-0.0240020751953125,
0.00876617431640625,
0.0253448486328125,
-0.02130126953125,
0.015167236328125,
-0.028106689453125,
-0.0269622802734375,
0.009918212890625,
0.0026226043701171875,
0.08447265625,
-0.0006089210510253906,
-0.03466796875,
0.00949859619140625,
-0.059234619140625,
0.01605224609375,
0.019287109375,
-0.040313720703125,
-0.00628662109375,
0.000125885009765625,
0.008209228515625,
0.034942626953125,
0.0240325927734375,
-0.01543426513671875,
0.005649566650390625,
-0.022216796875,
0.0236358642578125,
0.0518798828125,
0.01195526123046875,
-0.0157318115234375,
-0.04534912109375,
0.03350830078125,
0.01287078857421875,
0.01529693603515625,
0.03857421875,
-0.0171966552734375,
-0.04833984375,
-0.01318359375,
0.01141357421875,
0.045623779296875,
-0.07501220703125,
0.042022705078125,
-0.008697509765625,
-0.04278564453125,
-0.021514892578125,
-0.01560211181640625,
0.038421630859375,
0.02252197265625,
0.034423828125,
-0.005279541015625,
-0.036346435546875,
-0.060089111328125,
0.00714111328125,
-0.032470703125,
-0.00447845458984375,
0.0237884521484375,
0.0732421875,
-0.0207672119140625,
0.042755126953125,
-0.03363037109375,
-0.024200439453125,
-0.0135650634765625,
0.0079345703125,
0.0197601318359375,
0.0567626953125,
0.054473876953125,
-0.0256805419921875,
-0.031768798828125,
-0.011627197265625,
-0.07598876953125,
0.0208587646484375,
0.0032024383544921875,
-0.02398681640625,
-0.0111083984375,
0.00830078125,
-0.06390380859375,
0.0290374755859375,
0.0245819091796875,
-0.006786346435546875,
0.0567626953125,
-0.025970458984375,
0.0189971923828125,
-0.06396484375,
-0.0013713836669921875,
0.0116424560546875,
-0.0061798095703125,
-0.0316162109375,
0.01508331298828125,
-0.006591796875,
0.00725555419921875,
-0.065185546875,
0.035369873046875,
-0.020599365234375,
-0.005886077880859375,
-0.00978851318359375,
0.004817962646484375,
-0.01068878173828125,
0.054412841796875,
0.005481719970703125,
0.05047607421875,
0.07354736328125,
-0.0682373046875,
0.043365478515625,
0.03765869140625,
-0.0177459716796875,
0.031158447265625,
-0.058258056640625,
-0.005046844482421875,
-0.01085662841796875,
0.0262603759765625,
-0.0318603515625,
-0.0222625732421875,
0.029052734375,
-0.028717041015625,
-0.0024204254150390625,
-0.0267181396484375,
-0.035858154296875,
-0.020721435546875,
-0.0198211669921875,
0.030242919921875,
0.032806396484375,
-0.041748046875,
0.0211334228515625,
0.03350830078125,
0.0131988525390625,
-0.046234130859375,
-0.030670166015625,
-0.0133819580078125,
-0.0277252197265625,
-0.0360107421875,
0.0187225341796875,
-0.004154205322265625,
-0.01226806640625,
0.0007700920104980469,
0.0014505386352539062,
-0.0241851806640625,
0.01300048828125,
0.0239105224609375,
0.0253143310546875,
-0.0250244140625,
-0.01396942138671875,
-0.01560211181640625,
-0.0070343017578125,
0.0274810791015625,
-0.013671875,
0.03619384765625,
-0.018280029296875,
0.0137176513671875,
-0.0628662109375,
-0.006114959716796875,
0.038482666015625,
-0.01198577880859375,
0.07550048828125,
0.0360107421875,
-0.041290283203125,
-0.0172119140625,
0.006748199462890625,
0.00017559528350830078,
-0.0357666015625,
0.0276641845703125,
-0.041717529296875,
-0.0276031494140625,
0.04559326171875,
0.0080413818359375,
0.0027618408203125,
0.06744384765625,
0.043365478515625,
-0.002590179443359375,
0.099365234375,
0.039642333984375,
0.006290435791015625,
0.0357666015625,
-0.060546875,
-0.0279693603515625,
-0.06671142578125,
-0.04150390625,
-0.059844970703125,
0.013458251953125,
-0.045928955078125,
-0.01064300537109375,
0.0201568603515625,
0.01012420654296875,
-0.0614013671875,
0.03472900390625,
-0.0185394287109375,
0.0260009765625,
0.04119873046875,
0.0159759521484375,
-0.00006604194641113281,
-0.0105133056640625,
-0.02386474609375,
0.009124755859375,
-0.05120849609375,
-0.033172607421875,
0.0625,
0.0211181640625,
0.051422119140625,
0.01389312744140625,
0.04010009765625,
0.009979248046875,
0.006031036376953125,
-0.050048828125,
0.04217529296875,
0.00576019287109375,
-0.0634765625,
-0.0300445556640625,
-0.0343017578125,
-0.056182861328125,
0.03912353515625,
-0.027679443359375,
-0.045562744140625,
0.0028095245361328125,
0.00656890869140625,
-0.043914794921875,
0.0239715576171875,
-0.0029888153076171875,
0.07476806640625,
-0.019500732421875,
-0.0305023193359375,
-0.00433349609375,
-0.045928955078125,
0.04425048828125,
0.013214111328125,
0.00800323486328125,
-0.014892578125,
0.0015764236450195312,
0.07330322265625,
-0.0582275390625,
0.033538818359375,
-0.03582763671875,
0.0274505615234375,
0.036407470703125,
0.005733489990234375,
0.04742431640625,
0.0199432373046875,
0.00029850006103515625,
0.003509521484375,
0.01236724853515625,
-0.0469970703125,
-0.0266876220703125,
0.04986572265625,
-0.1024169921875,
-0.032318115234375,
-0.05926513671875,
-0.025604248046875,
-0.00434112548828125,
0.01812744140625,
0.01277923583984375,
0.0379638671875,
-0.00910186767578125,
0.013275146484375,
0.045867919921875,
-0.00901031494140625,
0.035125732421875,
0.048309326171875,
-0.0028018951416015625,
-0.051483154296875,
0.0562744140625,
-0.00983428955078125,
0.0045013427734375,
0.01026153564453125,
0.0015153884887695312,
-0.033050537109375,
-0.058837890625,
-0.0528564453125,
0.0224761962890625,
-0.053131103515625,
-0.0194854736328125,
-0.042388916015625,
-0.03314208984375,
-0.0252227783203125,
0.00998687744140625,
-0.035308837890625,
-0.021881103515625,
-0.02471923828125,
-0.0086517333984375,
0.03387451171875,
0.0518798828125,
-0.0250396728515625,
0.058013916015625,
-0.052032470703125,
0.01090240478515625,
0.034271240234375,
0.006763458251953125,
0.0031604766845703125,
-0.04852294921875,
-0.047149658203125,
0.0135498046875,
-0.035491943359375,
-0.05023193359375,
0.05877685546875,
-0.0089569091796875,
0.0614013671875,
0.0294647216796875,
0.005458831787109375,
0.07049560546875,
-0.01398468017578125,
0.06982421875,
0.0122833251953125,
-0.051300048828125,
0.031280517578125,
-0.029815673828125,
0.006908416748046875,
0.050140380859375,
0.051116943359375,
-0.02374267578125,
-0.01125335693359375,
-0.050323486328125,
-0.0531005859375,
0.0745849609375,
0.025115966796875,
-0.0207366943359375,
0.01470947265625,
0.019195556640625,
-0.0191497802734375,
0.017822265625,
-0.0758056640625,
-0.0268402099609375,
-0.0428466796875,
0.01032257080078125,
-0.0191650390625,
0.0197906494140625,
-0.0253753662109375,
-0.015625,
0.0869140625,
-0.00728607177734375,
0.01200103759765625,
0.01311492919921875,
-0.00811004638671875,
-0.02001953125,
-0.0175323486328125,
0.049468994140625,
0.0302734375,
-0.05328369140625,
-0.0036411285400390625,
0.0155181884765625,
-0.029327392578125,
0.0195465087890625,
0.01776123046875,
-0.0117034912109375,
-0.0044708251953125,
0.0213470458984375,
0.0618896484375,
0.003170013427734375,
-0.03875732421875,
0.036895751953125,
-0.0169830322265625,
-0.0272369384765625,
-0.030853271484375,
0.0089111328125,
-0.01361083984375,
0.023590087890625,
0.01239013671875,
0.001270294189453125,
0.0061798095703125,
-0.033721923828125,
-0.0003209114074707031,
0.0192413330078125,
-0.053680419921875,
-0.0274200439453125,
0.064208984375,
0.0056915283203125,
-0.02001953125,
0.062286376953125,
-0.0187835693359375,
-0.0496826171875,
0.07696533203125,
0.03240966796875,
0.06256103515625,
-0.0017118453979492188,
0.0199737548828125,
0.061553955078125,
0.01413726806640625,
-0.0294189453125,
0.0149993896484375,
0.01085662841796875,
-0.051239013671875,
-0.00836944580078125,
-0.032623291015625,
-0.03369140625,
0.028839111328125,
-0.058685302734375,
0.0229644775390625,
-0.034332275390625,
-0.0190887451171875,
-0.0084075927734375,
0.033050537109375,
-0.053436279296875,
0.03143310546875,
-0.01297760009765625,
0.071533203125,
-0.0616455078125,
0.07635498046875,
0.056854248046875,
-0.0550537109375,
-0.0667724609375,
-0.004962921142578125,
0.00493621826171875,
-0.0498046875,
0.05328369140625,
-0.0006146430969238281,
0.00482940673828125,
0.006137847900390625,
-0.050994873046875,
-0.0733642578125,
0.107666015625,
-0.017486572265625,
-0.0190887451171875,
0.00592041015625,
0.002529144287109375,
0.0364990234375,
-0.031768798828125,
0.0322265625,
0.0299530029296875,
0.0225372314453125,
0.0263671875,
-0.055908203125,
0.01103973388671875,
-0.011474609375,
0.0038700103759765625,
0.0007767677307128906,
-0.07232666015625,
0.10028076171875,
-0.004894256591796875,
0.00537109375,
0.0175628662109375,
0.045654296875,
0.06671142578125,
0.0187530517578125,
0.04449462890625,
0.064453125,
0.03558349609375,
0.00557708740234375,
0.0634765625,
-0.02349853515625,
0.0540771484375,
0.06463623046875,
-0.02264404296875,
0.059112548828125,
0.005340576171875,
-0.01212310791015625,
0.047760009765625,
0.07574462890625,
-0.005077362060546875,
0.067138671875,
0.02008056640625,
-0.021514892578125,
-0.0274810791015625,
0.01238250732421875,
-0.047943115234375,
0.01189422607421875,
0.010650634765625,
0.009002685546875,
-0.0249786376953125,
-0.0169830322265625,
-0.004451751708984375,
-0.035186767578125,
-0.036102294921875,
0.057281494140625,
-0.0166168212890625,
-0.046875,
0.0794677734375,
0.009429931640625,
0.0211334228515625,
-0.03753662109375,
-0.0014085769653320312,
-0.02203369140625,
0.0196685791015625,
-0.005107879638671875,
-0.04638671875,
-0.005466461181640625,
-0.00913238525390625,
0.0008716583251953125,
0.01079559326171875,
0.0208740234375,
-0.00891876220703125,
-0.01425933837890625,
0.0192718505859375,
0.0283355712890625,
0.0244598388671875,
0.006580352783203125,
-0.07684326171875,
-0.00954437255859375,
-0.005481719970703125,
-0.024871826171875,
0.031494140625,
0.04498291015625,
0.003238677978515625,
0.059417724609375,
0.041717529296875,
-0.007904052734375,
0.006282806396484375,
-0.0142822265625,
0.07745361328125,
-0.07696533203125,
-0.037689208984375,
-0.037811279296875,
0.04833984375,
0.00479888916015625,
-0.052398681640625,
0.04461669921875,
0.0697021484375,
0.0628662109375,
-0.01338958740234375,
0.055206298828125,
-0.01035308837890625,
0.005298614501953125,
-0.046905517578125,
0.03802490234375,
-0.042388916015625,
0.00860595703125,
0.0015621185302734375,
-0.044403076171875,
0.0014553070068359375,
0.0460205078125,
-0.0012655258178710938,
-0.0040283203125,
0.046875,
0.07623291015625,
0.002239227294921875,
-0.01172637939453125,
0.0158233642578125,
0.02642822265625,
0.0218963623046875,
0.055419921875,
0.07830810546875,
-0.05609130859375,
0.055999755859375,
-0.0423583984375,
-0.019378662109375,
-0.0208892822265625,
-0.0506591796875,
-0.0694580078125,
-0.0197601318359375,
-0.034454345703125,
-0.0533447265625,
0.01885986328125,
0.0673828125,
0.0601806640625,
-0.050323486328125,
-0.0460205078125,
0.00311279296875,
0.0106658935546875,
-0.040557861328125,
-0.0145721435546875,
0.01433563232421875,
-0.00928497314453125,
-0.045867919921875,
0.01812744140625,
-0.02044677734375,
0.028228759765625,
-0.0202484130859375,
-0.027435302734375,
-0.0196685791015625,
-0.01117706298828125,
0.020416259765625,
0.034423828125,
-0.02532958984375,
-0.01446533203125,
-0.0182342529296875,
-0.02740478515625,
0.0125579833984375,
0.0280914306640625,
-0.061553955078125,
-0.00936126708984375,
0.032196044921875,
0.0026721954345703125,
0.076171875,
-0.0034999847412109375,
0.0303802490234375,
-0.0175933837890625,
0.01161956787109375,
0.01068878173828125,
0.0196685791015625,
0.0037708282470703125,
-0.00809478759765625,
0.0287933349609375,
0.029632568359375,
-0.062286376953125,
-0.057464599609375,
-0.0117950439453125,
-0.07305908203125,
-0.0196685791015625,
0.0670166015625,
-0.0247344970703125,
-0.047149658203125,
0.0008211135864257812,
-0.0037708282470703125,
0.0198516845703125,
-0.033843994140625,
0.042755126953125,
0.02716064453125,
-0.01065826416015625,
-0.0013589859008789062,
-0.074462890625,
0.055206298828125,
-0.0108184814453125,
-0.06268310546875,
-0.0230865478515625,
0.03790283203125,
0.0296173095703125,
-0.0017137527465820312,
0.0290679931640625,
0.007476806640625,
0.03009033203125,
0.017669677734375,
0.021820068359375,
-0.003185272216796875,
-0.0202484130859375,
-0.041046142578125,
0.0107421875,
0.004215240478515625,
-0.022064208984375
]
] |
PygmalionAI/pygmalion-2-13b | 2023-09-15T20:29:04.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"text generation",
"instruct",
"en",
"dataset:PygmalionAI/PIPPA",
"dataset:Open-Orca/OpenOrca",
"dataset:Norquinal/claude_multiround_chat_30k",
"dataset:jondurbin/airoboros-gpt4-1.4.1",
"dataset:databricks/databricks-dolly-15k",
"license:llama2",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | PygmalionAI | null | null | PygmalionAI/pygmalion-2-13b | 36 | 12,127 | transformers | 2023-09-04T22:05:31 | ---
language:
- en
thumbnail: null
tags:
- text generation
- instruct
pipeline_tag: text-generation
inference: false
license: llama2
datasets:
- PygmalionAI/PIPPA
- Open-Orca/OpenOrca
- Norquinal/claude_multiround_chat_30k
- jondurbin/airoboros-gpt4-1.4.1
- databricks/databricks-dolly-15k
---
<h1 style="text-align: center">Pygmalion-2 13B</h1>
<h2 style="text-align: center">An instruction-tuned Llama-2 biased towards fiction writing and conversation.</h2>
## Model Details
The long-awaited release of our new models based on Llama-2 is finally here. Pygmalion-2 13B (formerly known as Metharme) is based on
[Llama-2 13B](https://huggingface.co/meta-llama/llama-2-13b-hf) released by Meta AI.
The Metharme models were an experiment to try and get a model that is usable for conversation, roleplaying and storywriting,
but which can be guided using natural language like other instruct models. After much deliberation, we reached the conclusion
that the Metharme prompting format is superior to (and easier to use than) the classic Pygmalion format.
This model was trained by doing supervised fine-tuning over a mixture of regular instruction data alongside roleplay, fictional stories
and conversations with synthetically generated instructions attached.
This model is freely available for both commercial and non-commercial use, as per the Llama-2 license.
## Prompting
The model has been trained on prompts using three different roles, which are denoted by the following tokens: `<|system|>`, `<|user|>` and `<|model|>`.
The `<|system|>` prompt can be used to inject out-of-channel information behind the scenes, while the `<|user|>` prompt should be used to indicate user input.
The `<|model|>` token should then be used to indicate that the model should generate a response. These tokens may appear multiple times and can be chained to
form a conversation history.
### Prompting example
The system prompt has been designed to allow the model to "enter" various modes and dictate the reply length. Here's an example:
```
<|system|>Enter RP mode. Pretend to be {{char}} whose persona follows:
{{persona}}
You shall reply to the user while staying in character, and generate long responses.
```
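For illustration, a minimal `transformers` sketch that applies this format; the role
tokens are written verbatim into the prompt as shown above, and the user turn plus the
sampling settings are placeholder assumptions rather than recommended values.
```python
# Sketch: build a Metharme-style prompt and generate with transformers.
# {{char}} / {{persona}} are placeholders to substitute; sampling settings are
# illustrative only. device_map="auto" requires the accelerate package.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PygmalionAI/pygmalion-2-13b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = (
    "<|system|>Enter RP mode. Pretend to be {{char}} whose persona follows:\n"
    "{{persona}}\n"
    "You shall reply to the user while staying in character, and generate long responses.\n"
    "<|user|>Hello! Who are you?\n"
    "<|model|>"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```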
## Dataset
The dataset used to fine-tune this model includes our own [PIPPA](https://huggingface.co/datasets/PygmalionAI/PIPPA), along with several other instruction
datasets, and datasets acquired from various RP forums.
## Limitations and biases
The intended use-case for this model is fictional writing for entertainment purposes. Any other sort of usage is out of scope.
As such, it was **not** fine-tuned to be safe and harmless: the base model _and_ this fine-tune have been trained on data known to contain profanity and texts that are lewd or otherwise offensive. It may produce socially unacceptable or undesirable text, even if the prompt itself does not include anything explicitly offensive. Outputs might often be factually wrong or misleading.
## Acknowledgements
We would like to thank [SpicyChat](https://spicychat.ai/) for sponsoring the training for this model.
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
| 3,314 | [
[
-0.01605224609375,
-0.061767578125,
0.0169219970703125,
0.035003662109375,
-0.0235748291015625,
-0.01027679443359375,
-0.0250701904296875,
-0.04278564453125,
0.00919342041015625,
0.02008056640625,
-0.061981201171875,
-0.0225830078125,
-0.03643798828125,
0.0151824951171875,
0.0016450881958007812,
0.087646484375,
0.00865936279296875,
-0.006702423095703125,
-0.027099609375,
0.001720428466796875,
-0.06536865234375,
-0.035430908203125,
-0.050933837890625,
-0.036376953125,
0.055023193359375,
0.0275421142578125,
0.052459716796875,
0.03912353515625,
0.0177154541015625,
0.01352691650390625,
-0.01490020751953125,
0.0311126708984375,
-0.0266571044921875,
0.01146697998046875,
-0.0002567768096923828,
-0.0243377685546875,
-0.05157470703125,
0.033660888671875,
0.03076171875,
0.0289306640625,
-0.0208892822265625,
0.0203704833984375,
0.01416778564453125,
0.0231475830078125,
-0.0423583984375,
0.0169830322265625,
-0.0214996337890625,
-0.000026106834411621094,
0.00788116455078125,
0.001918792724609375,
-0.049652099609375,
-0.0299835205078125,
-0.0025272369384765625,
-0.047698974609375,
-0.0260009765625,
0.0140533447265625,
0.0638427734375,
0.0137176513671875,
-0.0286865234375,
-0.0275115966796875,
-0.033416748046875,
0.05255126953125,
-0.06561279296875,
0.00972747802734375,
0.044036865234375,
0.0230560302734375,
-0.01244354248046875,
-0.068603515625,
-0.0428466796875,
-0.043304443359375,
-0.00775909423828125,
0.006618499755859375,
-0.0302276611328125,
0.0027446746826171875,
0.0218353271484375,
0.0116119384765625,
-0.04962158203125,
0.0276031494140625,
-0.0234527587890625,
-0.01300048828125,
0.037200927734375,
0.022064208984375,
0.027069091796875,
-0.0233917236328125,
-0.033203125,
-0.0014619827270507812,
-0.059967041015625,
0.010162353515625,
0.0263214111328125,
0.0051727294921875,
-0.0330810546875,
0.053436279296875,
-0.0024871826171875,
0.045379638671875,
0.020050048828125,
-0.029022216796875,
0.00443267822265625,
-0.0114898681640625,
-0.01222991943359375,
-0.0122528076171875,
0.06787109375,
0.06280517578125,
0.035980224609375,
0.0064239501953125,
-0.00960540771484375,
0.01259613037109375,
0.0161285400390625,
-0.08184814453125,
-0.0221405029296875,
0.0100250244140625,
-0.052947998046875,
-0.037384033203125,
-0.021759033203125,
-0.045074462890625,
-0.040863037109375,
-0.01038360595703125,
0.00811004638671875,
-0.029266357421875,
-0.0251617431640625,
-0.0118255615234375,
-0.0030422210693359375,
0.021820068359375,
0.0291748046875,
-0.08050537109375,
0.0157318115234375,
0.04571533203125,
0.06964111328125,
-0.0070037841796875,
-0.026947021484375,
-0.020843505859375,
-0.0162811279296875,
-0.0243682861328125,
0.053619384765625,
-0.05169677734375,
-0.0280609130859375,
-0.01197052001953125,
0.0065460205078125,
-0.018890380859375,
-0.04388427734375,
0.0270233154296875,
-0.02587890625,
0.03533935546875,
-0.01052093505859375,
-0.03277587890625,
-0.021392822265625,
0.0150146484375,
-0.0251617431640625,
0.0760498046875,
0.0006194114685058594,
-0.0577392578125,
0.0125732421875,
-0.052642822265625,
-0.013092041015625,
-0.010772705078125,
0.006282806396484375,
-0.0111846923828125,
-0.01605224609375,
0.0239410400390625,
0.031829833984375,
-0.034332275390625,
0.0221405029296875,
-0.0197601318359375,
-0.042236328125,
0.03692626953125,
-0.034332275390625,
0.0657958984375,
0.0196380615234375,
-0.017059326171875,
0.01389312744140625,
-0.042236328125,
-0.0007009506225585938,
0.01503753662109375,
-0.03985595703125,
-0.0044097900390625,
-0.00234222412109375,
-0.0148162841796875,
0.007656097412109375,
0.0260162353515625,
-0.037261962890625,
0.0216827392578125,
-0.0249786376953125,
0.0467529296875,
0.057281494140625,
0.006259918212890625,
0.034454345703125,
-0.04461669921875,
0.051239013671875,
-0.016845703125,
0.028076171875,
-0.0184326171875,
-0.059326171875,
-0.04376220703125,
-0.0294952392578125,
0.0149383544921875,
0.07470703125,
-0.0285797119140625,
0.034454345703125,
0.00859832763671875,
-0.04364013671875,
-0.0248870849609375,
-0.00820159912109375,
0.043121337890625,
0.040771484375,
0.01165771484375,
-0.01023101806640625,
-0.04791259765625,
-0.05206298828125,
-0.00702667236328125,
-0.03729248046875,
-0.005832672119140625,
0.04620361328125,
0.02197265625,
-0.0253143310546875,
0.044677734375,
-0.03271484375,
0.0157012939453125,
-0.0279693603515625,
-0.00238800048828125,
0.005161285400390625,
0.044464111328125,
0.041046142578125,
-0.029022216796875,
-0.029266357421875,
-0.0185699462890625,
-0.07403564453125,
-0.0209503173828125,
-0.007110595703125,
-0.00659942626953125,
0.01959228515625,
0.0223388671875,
-0.06549072265625,
0.035980224609375,
0.036834716796875,
-0.02490234375,
0.04534912109375,
-0.0012159347534179688,
0.01058197021484375,
-0.0897216796875,
0.01617431640625,
-0.006866455078125,
-0.00252532958984375,
-0.039520263671875,
0.0061492919921875,
0.01218414306640625,
-0.028167724609375,
-0.0251617431640625,
0.049957275390625,
-0.0278167724609375,
0.014862060546875,
-0.0279693603515625,
0.0026683807373046875,
-0.0015325546264648438,
0.04681396484375,
-0.006290435791015625,
0.0787353515625,
0.036224365234375,
-0.050384521484375,
0.041839599609375,
0.036834716796875,
-0.017547607421875,
0.04327392578125,
-0.07781982421875,
0.036834716796875,
-0.002155303955078125,
0.03729248046875,
-0.07769775390625,
-0.033660888671875,
0.0679931640625,
-0.0592041015625,
0.0266265869140625,
-0.0224456787109375,
-0.0379638671875,
-0.030426025390625,
-0.0248870849609375,
0.020050048828125,
0.0576171875,
-0.0513916015625,
0.0286102294921875,
0.0193023681640625,
-0.0179901123046875,
-0.046142578125,
-0.05169677734375,
0.01277923583984375,
-0.032379150390625,
-0.06317138671875,
0.0087890625,
-0.0267181396484375,
-0.00959014892578125,
-0.033416748046875,
0.00760650634765625,
-0.00545501708984375,
0.00809478759765625,
0.049835205078125,
0.03045654296875,
-0.00888824462890625,
0.0018558502197265625,
0.01788330078125,
0.003070831298828125,
0.01419830322265625,
0.00970458984375,
0.054931640625,
-0.001827239990234375,
0.000476837158203125,
-0.06640625,
0.0254058837890625,
0.034942626953125,
-0.0264129638671875,
0.05377197265625,
0.0345458984375,
-0.0305328369140625,
0.00949859619140625,
-0.0277099609375,
-0.03204345703125,
-0.03826904296875,
0.01024627685546875,
-0.00400543212890625,
-0.056640625,
0.033966064453125,
-0.003330230712890625,
0.011138916015625,
0.00484466552734375,
0.0404052734375,
-0.0065460205078125,
0.0894775390625,
0.0439453125,
0.012542724609375,
0.039764404296875,
0.0024204254150390625,
0.00891876220703125,
-0.0780029296875,
-0.048919677734375,
-0.029449462890625,
-0.02099609375,
-0.038604736328125,
-0.01324462890625,
0.0164031982421875,
0.0018434524536132812,
-0.023223876953125,
0.03192138671875,
-0.038330078125,
0.0223388671875,
0.044219970703125,
0.01898193359375,
-0.003662109375,
0.002197265625,
-0.0014677047729492188,
-0.01519775390625,
-0.047149658203125,
-0.059478759765625,
0.0787353515625,
0.04095458984375,
0.07427978515625,
0.01190948486328125,
0.041290283203125,
0.0201416015625,
0.01192474365234375,
-0.06689453125,
0.049835205078125,
0.01007080078125,
-0.041778564453125,
0.007213592529296875,
-0.01248931884765625,
-0.070068359375,
0.0030345916748046875,
-0.011627197265625,
-0.068603515625,
0.0114898681640625,
0.0152130126953125,
-0.039215087890625,
0.0022335052490234375,
-0.06787109375,
0.0618896484375,
0.004901885986328125,
-0.013824462890625,
-0.01031494140625,
-0.06591796875,
0.0340576171875,
0.0244293212890625,
-0.01153564453125,
-0.0082244873046875,
-0.01214599609375,
0.0633544921875,
-0.034149169921875,
0.10723876953125,
-0.01384735107421875,
-0.01256561279296875,
0.03692626953125,
0.0159149169921875,
0.036956787109375,
0.022369384765625,
-0.005664825439453125,
0.00274658203125,
0.00020003318786621094,
-0.00865936279296875,
-0.038055419921875,
0.05572509765625,
-0.06304931640625,
-0.0506591796875,
-0.031036376953125,
-0.044677734375,
0.00699615478515625,
0.0012159347534179688,
0.027435302734375,
0.037078857421875,
-0.006603240966796875,
0.0016803741455078125,
0.05377197265625,
-0.0382080078125,
0.034759521484375,
0.03472900390625,
-0.01432037353515625,
-0.039459228515625,
0.052398681640625,
-0.0097808837890625,
0.01116943359375,
0.00994873046875,
0.0247344970703125,
-0.024993896484375,
-0.0107421875,
-0.04931640625,
0.02764892578125,
-0.0506591796875,
-0.0242156982421875,
-0.048095703125,
-0.0209503173828125,
-0.04595947265625,
0.000347137451171875,
-0.005420684814453125,
-0.039276123046875,
-0.0523681640625,
-0.004070281982421875,
0.04156494140625,
0.053497314453125,
-0.0088348388671875,
0.038116455078125,
-0.038604736328125,
0.01021575927734375,
0.031646728515625,
0.0081787109375,
0.0009050369262695312,
-0.06915283203125,
0.01044464111328125,
0.022857666015625,
-0.033050537109375,
-0.06988525390625,
0.019989013671875,
0.0224151611328125,
0.0306243896484375,
0.0264129638671875,
0.0017042160034179688,
0.036407470703125,
-0.03314208984375,
0.0721435546875,
0.010955810546875,
-0.044891357421875,
0.055938720703125,
-0.0258331298828125,
0.013092041015625,
0.01348876953125,
0.025848388671875,
-0.063720703125,
-0.0122833251953125,
-0.03448486328125,
-0.04345703125,
0.06463623046875,
0.007389068603515625,
0.046630859375,
-0.0249786376953125,
0.042510986328125,
0.01186370849609375,
0.016693115234375,
-0.066162109375,
-0.021026611328125,
-0.032470703125,
-0.0048980712890625,
0.0203704833984375,
-0.04852294921875,
-0.00034356117248535156,
-0.02362060546875,
0.0391845703125,
-0.00159454345703125,
0.038909912109375,
0.00011020898818969727,
-0.004547119140625,
-0.0105438232421875,
0.00406646728515625,
0.0560302734375,
0.051910400390625,
-0.016937255859375,
0.00003409385681152344,
0.0018701553344726562,
-0.047332763671875,
-0.00518035888671875,
0.006679534912109375,
-0.01528167724609375,
-0.017425537109375,
0.0262298583984375,
0.07745361328125,
0.005390167236328125,
-0.050506591796875,
0.04254150390625,
-0.004100799560546875,
-0.0000451207160949707,
-0.025054931640625,
0.0222625732421875,
0.007537841796875,
0.03271484375,
0.004177093505859375,
0.004055023193359375,
0.0011186599731445312,
-0.03912353515625,
0.0010633468627929688,
0.01788330078125,
-0.0077362060546875,
-0.03167724609375,
0.0633544921875,
0.0265655517578125,
-0.0435791015625,
0.055572509765625,
-0.0106353759765625,
-0.026092529296875,
0.047332763671875,
0.0665283203125,
0.0419921875,
-0.0258636474609375,
0.03900146484375,
0.048675537109375,
0.0274505615234375,
-0.00017642974853515625,
0.0201416015625,
-0.0021038055419921875,
-0.02911376953125,
-0.0157623291015625,
-0.0263824462890625,
-0.0173797607421875,
0.01739501953125,
-0.04437255859375,
0.01430511474609375,
-0.07171630859375,
-0.01715087890625,
-0.00806427001953125,
-0.0014133453369140625,
-0.0189361572265625,
0.005931854248046875,
0.007904052734375,
0.07611083984375,
-0.060150146484375,
0.0472412109375,
0.056884765625,
-0.0418701171875,
-0.06475830078125,
-0.0003247261047363281,
0.00963592529296875,
-0.084716796875,
0.031097412109375,
0.033416748046875,
0.01064300537109375,
0.0007195472717285156,
-0.07470703125,
-0.04217529296875,
0.098876953125,
0.020355224609375,
-0.02886962890625,
-0.01229095458984375,
-0.01023101806640625,
0.031951904296875,
-0.038177490234375,
0.03436279296875,
0.0291900634765625,
0.02593994140625,
0.006381988525390625,
-0.0814208984375,
0.026123046875,
-0.0264434814453125,
0.0067291259765625,
-0.0161895751953125,
-0.06683349609375,
0.07769775390625,
-0.025726318359375,
-0.01806640625,
0.0467529296875,
0.05517578125,
0.034576416015625,
0.0279083251953125,
0.02569580078125,
0.0251312255859375,
0.06591796875,
0.007747650146484375,
0.07763671875,
-0.007472991943359375,
-0.0016527175903320312,
0.07427978515625,
-0.0088348388671875,
0.05438232421875,
0.027435302734375,
-0.0023193359375,
0.0482177734375,
0.07257080078125,
-0.0047760009765625,
0.040863037109375,
0.01132965087890625,
-0.01316070556640625,
-0.0194091796875,
-0.029266357421875,
-0.0304412841796875,
0.035430908203125,
0.0251617431640625,
-0.03076171875,
-0.00292205810546875,
0.004726409912109375,
0.0352783203125,
0.004726409912109375,
-0.00742340087890625,
0.06072998046875,
0.0146331787109375,
-0.06622314453125,
0.08154296875,
0.01245880126953125,
0.06988525390625,
-0.043670654296875,
-0.01151275634765625,
-0.0513916015625,
-0.010955810546875,
-0.01226043701171875,
-0.04730224609375,
-0.004119873046875,
0.02252197265625,
-0.01739501953125,
-0.00292205810546875,
0.059783935546875,
-0.0198974609375,
-0.0200653076171875,
-0.0025463104248046875,
0.02227783203125,
0.0306854248046875,
0.0024852752685546875,
-0.062469482421875,
0.01519775390625,
-0.0014781951904296875,
-0.012451171875,
0.01837158203125,
0.01331329345703125,
-0.0118255615234375,
0.0645751953125,
0.0347900390625,
-0.021026611328125,
-0.0025577545166015625,
0.00455474853515625,
0.07257080078125,
-0.0256195068359375,
-0.0272369384765625,
-0.052459716796875,
0.035247802734375,
-0.0004398822784423828,
-0.03515625,
0.05572509765625,
0.025054931640625,
0.03253173828125,
-0.00750732421875,
0.04205322265625,
-0.022613525390625,
0.037811279296875,
-0.039794921875,
0.0618896484375,
-0.038665771484375,
0.0253143310546875,
-0.0064849853515625,
-0.0650634765625,
0.00006842613220214844,
0.061859130859375,
-0.004638671875,
0.02117919921875,
0.0435791015625,
0.08502197265625,
-0.00725555419921875,
-0.00008368492126464844,
0.007541656494140625,
0.0015459060668945312,
0.0121002197265625,
0.0423583984375,
0.091064453125,
-0.02752685546875,
0.0423583984375,
-0.0298309326171875,
-0.03900146484375,
-0.0194549560546875,
-0.06005859375,
-0.104736328125,
-0.0309906005859375,
-0.02435302734375,
-0.037567138671875,
0.01548004150390625,
0.08367919921875,
0.047760009765625,
-0.03668212890625,
-0.017669677734375,
0.01157379150390625,
-0.003131866455078125,
-0.01107025146484375,
-0.01541900634765625,
0.004398345947265625,
-0.006618499755859375,
-0.05224609375,
0.043426513671875,
-0.004329681396484375,
0.0186920166015625,
-0.0036754608154296875,
-0.016510009765625,
-0.01506805419921875,
0.012298583984375,
0.04010009765625,
0.0290069580078125,
-0.058013916015625,
-0.032562255859375,
0.0190277099609375,
-0.01482391357421875,
-0.0031795501708984375,
0.042327880859375,
-0.043426513671875,
0.0140838623046875,
0.021270751953125,
0.0236663818359375,
0.016937255859375,
-0.0015783309936523438,
0.03741455078125,
-0.059967041015625,
0.0263824462890625,
0.0134124755859375,
0.0111541748046875,
0.03778076171875,
-0.04278564453125,
0.0294036865234375,
0.0211181640625,
-0.054107666015625,
-0.06890869140625,
0.0184326171875,
-0.0709228515625,
-0.021514892578125,
0.11395263671875,
-0.0149993896484375,
-0.0198822021484375,
0.017425537109375,
-0.0555419921875,
0.0286407470703125,
-0.04937744140625,
0.05609130859375,
0.04473876953125,
-0.0073699951171875,
-0.03289794921875,
-0.03692626953125,
0.0377197265625,
0.0271453857421875,
-0.057373046875,
0.01111602783203125,
0.05816650390625,
0.034271240234375,
-0.0123443603515625,
0.04205322265625,
0.003482818603515625,
0.0377197265625,
-0.00461578369140625,
0.00035691261291503906,
-0.02056884765625,
-0.03656005859375,
-0.0184326171875,
-0.0276947021484375,
0.0123291015625,
-0.030609130859375
]
] |
Helsinki-NLP/opus-mt-ca-en | 2023-08-16T11:26:39.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"ca",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-ca-en | 0 | 12,110 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-ca-en
* source languages: ca
* target languages: en
* OPUS readme: [ca-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/ca-en/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/ca-en/opus-2019-12-18.zip)
* test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/ca-en/opus-2019-12-18.test.txt)
* test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/ca-en/opus-2019-12-18.eval.txt)
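A minimal `transformers` sketch for running this checkpoint (the example sentence is arbitrary):
```python
# Sketch: Catalan -> English translation with this checkpoint via transformers.
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-ca-en"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

batch = tokenizer(["Bon dia, com estàs?"], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```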
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.ca.en | 51.4 | 0.678 |
| 818 | [
[
-0.01543426513671875,
-0.0330810546875,
0.020111083984375,
0.03619384765625,
-0.03076171875,
-0.0261383056640625,
-0.03155517578125,
-0.00411224365234375,
0.006786346435546875,
0.034942626953125,
-0.050689697265625,
-0.046295166015625,
-0.0440673828125,
0.01238250732421875,
-0.00524139404296875,
0.051025390625,
-0.01349639892578125,
0.04052734375,
0.0225677490234375,
-0.0312042236328125,
-0.0260467529296875,
-0.0325927734375,
-0.04083251953125,
-0.0251007080078125,
0.0264892578125,
0.0260772705078125,
0.029754638671875,
0.031341552734375,
0.06695556640625,
0.01435089111328125,
-0.005764007568359375,
0.00687408447265625,
-0.03521728515625,
-0.0111236572265625,
0.00424957275390625,
-0.047637939453125,
-0.052581787109375,
-0.01332855224609375,
0.0782470703125,
0.0279693603515625,
-0.006343841552734375,
0.02691650390625,
-0.0024871826171875,
0.06591796875,
-0.0310211181640625,
0.0108795166015625,
-0.04693603515625,
0.0041656494140625,
-0.0262451171875,
-0.0287933349609375,
-0.05059814453125,
-0.02001953125,
0.007175445556640625,
-0.04937744140625,
0.0012903213500976562,
0.005764007568359375,
0.10552978515625,
0.0267486572265625,
-0.027374267578125,
-0.0128326416015625,
-0.040008544921875,
0.07373046875,
-0.0550537109375,
0.046417236328125,
0.032440185546875,
0.0217437744140625,
0.01171875,
-0.043609619140625,
-0.0245513916015625,
0.00708770751953125,
-0.00689697265625,
0.01898193359375,
-0.0031719207763671875,
-0.0196685791015625,
0.0185089111328125,
0.05548095703125,
-0.052490234375,
0.0040435791015625,
-0.039337158203125,
-0.007160186767578125,
0.049285888671875,
0.00946807861328125,
0.0142974853515625,
-0.003185272216796875,
-0.03271484375,
-0.046783447265625,
-0.057891845703125,
0.0127716064453125,
0.0279541015625,
0.026580810546875,
-0.036834716796875,
0.049835205078125,
-0.0108489990234375,
0.039306640625,
-0.0023021697998046875,
0.004425048828125,
0.070556640625,
-0.02685546875,
-0.02667236328125,
-0.006664276123046875,
0.08831787109375,
0.0191650390625,
0.00543975830078125,
0.0029544830322265625,
-0.0148773193359375,
-0.01381683349609375,
0.006687164306640625,
-0.064697265625,
-0.005184173583984375,
0.007640838623046875,
-0.030670166015625,
-0.00638580322265625,
0.005870819091796875,
-0.04534912109375,
0.0161895751953125,
-0.031494140625,
0.042083740234375,
-0.04058837890625,
-0.0175018310546875,
0.0296478271484375,
0.005970001220703125,
0.03656005859375,
-0.0007638931274414062,
-0.047271728515625,
0.0198211669921875,
0.027618408203125,
0.054412841796875,
-0.034393310546875,
-0.01904296875,
-0.0401611328125,
-0.0167694091796875,
-0.00919342041015625,
0.049285888671875,
-0.0071868896484375,
-0.03643798828125,
-0.00209808349609375,
0.038482666015625,
-0.032318115234375,
-0.02178955078125,
0.09967041015625,
-0.0228424072265625,
0.043701171875,
-0.0423583984375,
-0.030914306640625,
-0.02117919921875,
0.034637451171875,
-0.04730224609375,
0.0972900390625,
0.0014104843139648438,
-0.0587158203125,
0.0154876708984375,
-0.0628662109375,
-0.01509857177734375,
-0.004955291748046875,
0.00270843505859375,
-0.043121337890625,
0.002727508544921875,
0.01535797119140625,
0.0265350341796875,
-0.0273895263671875,
0.0252838134765625,
0.00020039081573486328,
-0.0227508544921875,
0.005687713623046875,
-0.032073974609375,
0.075439453125,
0.024200439453125,
-0.0262451171875,
0.01374053955078125,
-0.06646728515625,
-0.004535675048828125,
-0.0022792816162109375,
-0.037017822265625,
-0.0078887939453125,
0.00629425048828125,
0.025115966796875,
0.01049041748046875,
0.0286712646484375,
-0.05023193359375,
0.01605224609375,
-0.046844482421875,
0.0116424560546875,
0.045257568359375,
-0.0198516845703125,
0.02471923828125,
-0.0272979736328125,
0.0263824462890625,
0.01250457763671875,
0.0093841552734375,
0.003936767578125,
-0.03558349609375,
-0.06341552734375,
-0.01690673828125,
0.04229736328125,
0.081298828125,
-0.05718994140625,
0.058074951171875,
-0.05096435546875,
-0.05999755859375,
-0.057281494140625,
-0.00917816162109375,
0.036865234375,
0.0271148681640625,
0.04205322265625,
-0.018798828125,
-0.026336669921875,
-0.08013916015625,
-0.00902557373046875,
-0.011199951171875,
-0.009002685546875,
0.008270263671875,
0.045623779296875,
-0.0119171142578125,
0.038299560546875,
-0.04150390625,
-0.03082275390625,
-0.00812530517578125,
0.00986480712890625,
0.0299835205078125,
0.048583984375,
0.038116455078125,
-0.06781005859375,
-0.038818359375,
-0.00039196014404296875,
-0.048248291015625,
-0.0080413818359375,
0.0076904296875,
-0.0158843994140625,
0.0019931793212890625,
0.005893707275390625,
-0.018585205078125,
0.01497650146484375,
0.04547119140625,
-0.04083251953125,
0.037017822265625,
-0.0125579833984375,
0.024658203125,
-0.10028076171875,
0.01140594482421875,
-0.005096435546875,
-0.0025691986083984375,
-0.02886962890625,
0.004337310791015625,
0.023040771484375,
0.0068817138671875,
-0.06634521484375,
0.04046630859375,
-0.01727294921875,
-0.00047969818115234375,
0.0205535888671875,
-0.0031681060791015625,
0.0019407272338867188,
0.057891845703125,
0.0033435821533203125,
0.0638427734375,
0.05487060546875,
-0.041412353515625,
0.00933837890625,
0.042449951171875,
-0.0303955078125,
0.0240936279296875,
-0.059783935546875,
-0.0221710205078125,
0.027099609375,
-0.0018014907836914062,
-0.04632568359375,
0.00965118408203125,
0.0216522216796875,
-0.046295166015625,
0.028350830078125,
0.0036487579345703125,
-0.06097412109375,
0.000050067901611328125,
-0.0207672119140625,
0.04571533203125,
0.046356201171875,
-0.019622802734375,
0.0413818359375,
0.00690460205078125,
0.0035305023193359375,
-0.03985595703125,
-0.0814208984375,
-0.004680633544921875,
-0.025848388671875,
-0.050994873046875,
0.01666259765625,
-0.033447265625,
0.00341796875,
-0.00411224365234375,
0.0243377685546875,
-0.010589599609375,
0.00308990478515625,
0.006748199462890625,
0.01299285888671875,
-0.037353515625,
0.00719451904296875,
0.00250244140625,
-0.008819580078125,
-0.01238250732421875,
-0.005126953125,
0.0440673828125,
-0.0264434814453125,
-0.0235748291015625,
-0.05218505859375,
-0.0019378662109375,
0.04156494140625,
-0.038604736328125,
0.0625,
0.042327880859375,
-0.01165771484375,
0.0115966796875,
-0.034423828125,
0.00829315185546875,
-0.03143310546875,
0.007175445556640625,
-0.037841796875,
-0.056884765625,
0.046295166015625,
0.0123443603515625,
0.032806396484375,
0.064697265625,
0.05316162109375,
0.01837158203125,
0.047515869140625,
0.0218505859375,
-0.0008502006530761719,
0.0318603515625,
-0.038330078125,
-0.0087127685546875,
-0.077880859375,
0.0022106170654296875,
-0.0596923828125,
-0.019287109375,
-0.0625,
-0.020355224609375,
0.0147552490234375,
-0.0030651092529296875,
-0.016845703125,
0.05487060546875,
-0.041107177734375,
0.0167999267578125,
0.037994384765625,
-0.00617218017578125,
0.021514892578125,
0.000014066696166992188,
-0.042266845703125,
-0.014801025390625,
-0.03216552734375,
-0.04193115234375,
0.0948486328125,
0.032745361328125,
0.023406982421875,
0.0185546875,
0.03485107421875,
0.00018858909606933594,
0.017242431640625,
-0.04632568359375,
0.032745361328125,
-0.0214080810546875,
-0.051788330078125,
-0.0201263427734375,
-0.042205810546875,
-0.06512451171875,
0.035430908203125,
-0.0174560546875,
-0.044097900390625,
0.021087646484375,
-0.0026264190673828125,
-0.00807952880859375,
0.034576416015625,
-0.048919677734375,
0.0836181640625,
-0.003345489501953125,
-0.00824737548828125,
0.0193634033203125,
-0.032318115234375,
0.026031494140625,
-0.003322601318359375,
0.0188751220703125,
-0.0090179443359375,
0.00710296630859375,
0.046356201171875,
-0.007549285888671875,
0.034149169921875,
0.00022530555725097656,
-0.0102691650390625,
0.0096588134765625,
0.005725860595703125,
0.023681640625,
-0.007099151611328125,
-0.035491943359375,
0.02862548828125,
0.01036834716796875,
-0.0357666015625,
-0.007640838623046875,
0.046722412109375,
-0.058258056640625,
0.0011606216430664062,
-0.035369873046875,
-0.049713134765625,
0.0015010833740234375,
0.019561767578125,
0.0518798828125,
0.055084228515625,
-0.024261474609375,
0.0400390625,
0.060546875,
-0.02288818359375,
0.0294036865234375,
0.053497314453125,
-0.00914764404296875,
-0.03961181640625,
0.0643310546875,
0.0095977783203125,
0.025634765625,
0.049285888671875,
0.002193450927734375,
-0.0169219970703125,
-0.0545654296875,
-0.049072265625,
0.0193939208984375,
-0.02288818359375,
-0.01042938232421875,
-0.043121337890625,
-0.0071258544921875,
-0.0237884521484375,
0.0199737548828125,
-0.0382080078125,
-0.039520263671875,
-0.01678466796875,
-0.01678466796875,
0.00881195068359375,
0.01267242431640625,
0.00022852420806884766,
0.041717529296875,
-0.07464599609375,
0.0144195556640625,
-0.0110626220703125,
0.024658203125,
-0.028900146484375,
-0.055938720703125,
-0.033599853515625,
0.00818634033203125,
-0.057159423828125,
-0.0523681640625,
0.03912353515625,
0.00627899169921875,
0.021331787109375,
0.025909423828125,
0.0135955810546875,
0.0304412841796875,
-0.04669189453125,
0.0782470703125,
-0.0010499954223632812,
-0.054412841796875,
0.035919189453125,
-0.031158447265625,
0.03662109375,
0.07159423828125,
0.0174713134765625,
-0.019805908203125,
-0.039031982421875,
-0.05487060546875,
-0.05621337890625,
0.05670166015625,
0.056304931640625,
-0.01165008544921875,
0.00925445556640625,
-0.004886627197265625,
-0.00458526611328125,
0.0079498291015625,
-0.0830078125,
-0.0244598388671875,
0.004558563232421875,
-0.0294647216796875,
-0.0192413330078125,
-0.01335906982421875,
-0.0121307373046875,
-0.00936126708984375,
0.083251953125,
0.0081024169921875,
0.00814056396484375,
0.035614013671875,
-0.0128021240234375,
-0.0172576904296875,
0.0264892578125,
0.0732421875,
0.042236328125,
-0.03662109375,
-0.015716552734375,
0.0268096923828125,
-0.035888671875,
-0.0138092041015625,
0.006595611572265625,
-0.026519775390625,
0.016510009765625,
0.032440185546875,
0.075927734375,
0.019622802734375,
-0.047576904296875,
0.03485107421875,
-0.034942626953125,
-0.0323486328125,
-0.04901123046875,
-0.00774383544921875,
0.0062255859375,
-0.002315521240234375,
0.0195159912109375,
0.00684356689453125,
0.01323699951171875,
-0.0193939208984375,
0.006557464599609375,
0.005970001220703125,
-0.0556640625,
-0.04193115234375,
0.036865234375,
0.007656097412109375,
-0.0245208740234375,
0.0335693359375,
-0.0243377685546875,
-0.0386962890625,
0.036468505859375,
0.00873565673828125,
0.07733154296875,
-0.0174560546875,
-0.0163421630859375,
0.0577392578125,
0.042724609375,
-0.02459716796875,
0.034423828125,
0.003326416015625,
-0.051300048828125,
-0.037078857421875,
-0.0628662109375,
-0.0130157470703125,
0.00891876220703125,
-0.06707763671875,
0.032012939453125,
0.0194854736328125,
0.0014371871948242188,
-0.0250396728515625,
0.021209716796875,
-0.044769287109375,
0.01294708251953125,
-0.0233001708984375,
0.08544921875,
-0.072021484375,
0.0679931640625,
0.03643798828125,
-0.0229949951171875,
-0.06561279296875,
-0.0188446044921875,
-0.01544952392578125,
-0.036346435546875,
0.044097900390625,
0.01470947265625,
0.0195159912109375,
-0.010589599609375,
-0.019500732421875,
-0.056610107421875,
0.08953857421875,
0.01666259765625,
-0.0457763671875,
0.00434112548828125,
0.01384735107421875,
0.0382080078125,
-0.0307159423828125,
0.00966644287109375,
0.034637451171875,
0.052093505859375,
0.00982666015625,
-0.0870361328125,
-0.0207366943359375,
-0.0390625,
-0.025604248046875,
0.0362548828125,
-0.047760009765625,
0.0672607421875,
0.035186767578125,
-0.01198577880859375,
0.01186370849609375,
0.045623779296875,
0.026580810546875,
0.02899169921875,
0.036834716796875,
0.0877685546875,
0.0265350341796875,
-0.03399658203125,
0.0738525390625,
-0.0240936279296875,
0.034576416015625,
0.08892822265625,
-0.005352020263671875,
0.07177734375,
0.0237884521484375,
-0.005702972412109375,
0.036712646484375,
0.054229736328125,
-0.01898193359375,
0.034515380859375,
0.004589080810546875,
0.010711669921875,
-0.01171112060546875,
0.01186370849609375,
-0.055145263671875,
0.019622802734375,
0.0165863037109375,
-0.0159454345703125,
0.0021419525146484375,
-0.00794219970703125,
0.0014524459838867188,
0.0025787353515625,
-0.01477813720703125,
0.044769287109375,
0.0018672943115234375,
-0.047210693359375,
0.049530029296875,
-0.003345489501953125,
0.045318603515625,
-0.054107666015625,
0.00920867919921875,
-0.001789093017578125,
0.0172271728515625,
-0.00203704833984375,
-0.041107177734375,
0.0293121337890625,
0.0008554458618164062,
-0.0171051025390625,
-0.0269927978515625,
0.0087127685546875,
-0.03851318359375,
-0.06610107421875,
0.0312042236328125,
0.0301361083984375,
0.028839111328125,
0.00384521484375,
-0.064697265625,
0.0036563873291015625,
0.0084991455078125,
-0.040618896484375,
0.0062408447265625,
0.052459716796875,
0.0166015625,
0.036468505859375,
0.05169677734375,
0.0203704833984375,
0.0196685791015625,
0.004425048828125,
0.050079345703125,
-0.031005859375,
-0.040008544921875,
-0.05682373046875,
0.05889892578125,
-0.0035495758056640625,
-0.05340576171875,
0.048004150390625,
0.07928466796875,
0.07135009765625,
-0.00743865966796875,
0.01514434814453125,
0.002445220947265625,
0.04315185546875,
-0.049591064453125,
0.045501708984375,
-0.0682373046875,
0.02001953125,
-0.00799560546875,
-0.07403564453125,
-0.0190887451171875,
0.015655517578125,
-0.0185089111328125,
-0.0283966064453125,
0.061798095703125,
0.056427001953125,
-0.0181427001953125,
-0.01220703125,
0.020233154296875,
0.0194549560546875,
0.0219573974609375,
0.045867919921875,
0.0254058837890625,
-0.07666015625,
0.040679931640625,
-0.0175628662109375,
-0.0084991455078125,
0.0011701583862304688,
-0.05255126953125,
-0.06201171875,
-0.04986572265625,
-0.0156707763671875,
-0.01432037353515625,
-0.023223876953125,
0.06268310546875,
0.039520263671875,
-0.0697021484375,
-0.04315185546875,
0.0007214546203613281,
0.006610870361328125,
-0.0112762451171875,
-0.0190887451171875,
0.05047607421875,
-0.0267486572265625,
-0.07122802734375,
0.03192138671875,
0.00250244140625,
-0.008331298828125,
-0.0015058517456054688,
-0.0253753662109375,
-0.038177490234375,
-0.00141143798828125,
0.0175018310546875,
0.010040283203125,
-0.043731689453125,
0.0101776123046875,
0.008270263671875,
-0.007587432861328125,
0.026611328125,
0.031341552734375,
-0.0193939208984375,
0.019775390625,
0.057342529296875,
0.032989501953125,
0.03741455078125,
-0.010528564453125,
0.03857421875,
-0.046600341796875,
0.0250244140625,
0.01186370849609375,
0.04632568359375,
0.027496337890625,
-0.0080413818359375,
0.05889892578125,
0.01253509521484375,
-0.0478515625,
-0.07745361328125,
0.0050506591796875,
-0.0999755859375,
0.0044097900390625,
0.06756591796875,
-0.021759033203125,
-0.0253753662109375,
0.0270538330078125,
-0.01326751708984375,
0.0164337158203125,
-0.0282440185546875,
0.02862548828125,
0.06402587890625,
0.031341552734375,
0.0075225830078125,
-0.05340576171875,
0.0228118896484375,
0.0391845703125,
-0.0546875,
-0.017425537109375,
0.012359619140625,
0.00954437255859375,
0.0289764404296875,
0.03643798828125,
-0.0286407470703125,
0.0134735107421875,
-0.025421142578125,
0.029754638671875,
-0.00112152099609375,
-0.01204681396484375,
-0.024871826171875,
0.00080108642578125,
-0.004077911376953125,
-0.0237274169921875
]
] |
haoranxu/ALMA-13B-Pretrain | 2023-10-27T05:10:01.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"arxiv:2309.11674",
"license:mit",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | haoranxu | null | null | haoranxu/ALMA-13B-Pretrain | 4 | 12,109 | transformers | 2023-09-17T17:43:22 | ---
license: mit
---
**ALMA** (**A**dvanced **L**anguage **M**odel-based tr**A**nslator) is an LLM-based translation model, which adopts a new translation model paradigm: it begins with fine-tuning on monolingual data and is further optimized using high-quality parallel data. This two-step fine-tuning process ensures strong translation performance.
Please find more details in our [paper](https://arxiv.org/abs/2309.11674).
```
@misc{xu2023paradigm,
title={A Paradigm Shift in Machine Translation: Boosting Translation Performance of Large Language Models},
author={Haoran Xu and Young Jin Kim and Amr Sharaf and Hany Hassan Awadalla},
year={2023},
eprint={2309.11674},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
We release four translation models presented in the paper:
- **ALMA-7B**: Full-weight Fine-tune LLaMA-2-7B on 20B monolingual tokens and then **Full-weight** fine-tune on human-written parallel data
- **ALMA-7B-LoRA**: Full-weight Fine-tune LLaMA-2-7B on 20B monolingual tokens and then **LoRA** fine-tune on human-written parallel data
- **ALMA-13B**: Full-weight Fine-tune LLaMA-2-13B on 12B monolingual tokens and then **Full-weight** fine-tune on human-written parallel data
- **ALMA-13B-LoRA** (Our best system): Full-weight Fine-tune LLaMA-2-13B on 12B monolingual tokens and then **LoRA** fine-tune on human-written parallel data
Model checkpoints are released at huggingface:
| Models | Base Model Link | LoRA Link |
|:-------------:|:---------------:|:---------:|
| ALMA-7B | [haoranxu/ALMA-7B](https://huggingface.co/haoranxu/ALMA-7B) | - |
| ALMA-7B-LoRA | [haoranxu/ALMA-7B-Pretrain](https://huggingface.co/haoranxu/ALMA-7B-Pretrain) | [haoranxu/ALMA-7B-Pretrain-LoRA](https://huggingface.co/haoranxu/ALMA-7B-Pretrain-LoRA) |
| ALMA-13B | [haoranxu/ALMA-13B](https://huggingface.co/haoranxu/ALMA-13B) | - |
| ALMA-13B-LoRA | [haoranxu/ALMA-13B-Pretrain](https://huggingface.co/haoranxu/ALMA-13B-Pretrain) | [haoranxu/ALMA-13B-Pretrain-LoRA](https://huggingface.co/haoranxu/ALMA-13B-Pretrain-LoRA) |
**Note that `ALMA-7B-Pretrain` and `ALMA-13B-Pretrain` are NOT translation models. They only experience stage 1 monolingual fine-tuning (20B tokens for the 7B model and 12B tokens for the 13B model), and should be utilized in conjunction with their LoRA models for translation purposes.**
A quick start to use our best system (ALMA-13B-LoRA) for translation. An example of translating "我爱机器翻译。" into English:
```
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM
from transformers import LlamaTokenizer
# Load base model and LoRA weights
model = AutoModelForCausalLM.from_pretrained("haoranxu/ALMA-13B-Pretrain", torch_dtype=torch.float16, device_map="auto")
model = PeftModel.from_pretrained(model, "haoranxu/ALMA-13B-Pretrain-LoRA")
tokenizer = LlamaTokenizer.from_pretrained("haoranxu/ALMA-13B-Pretrain", padding_side='left')
# Add the source sentence into the prompt template
prompt="Translate this from Chinese to English:\nChinese: 我爱机器翻译。\nEnglish:"
input_ids = tokenizer(prompt, return_tensors="pt", padding=True, max_length=40, truncation=True).input_ids.cuda()
# Translation
with torch.no_grad():
generated_ids = model.generate(input_ids=input_ids, num_beams=5, max_new_tokens=20, do_sample=True, temperature=0.6, top_p=0.9)
outputs = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
print(outputs)
```
Please find more details in our [GitHub repository](https://github.com/fe1ixxu/ALMA) | 3,637 | [
[
-0.020599365234375,
-0.036468505859375,
0.01340484619140625,
0.029266357421875,
-0.037506103515625,
-0.00421905517578125,
-0.008209228515625,
-0.038421630859375,
0.0225067138671875,
0.03399658203125,
-0.04541015625,
-0.0584716796875,
-0.05133056640625,
0.011566162109375,
-0.03729248046875,
0.0869140625,
-0.0279693603515625,
0.017669677734375,
0.00826263427734375,
-0.01544952392578125,
-0.0293731689453125,
-0.01654052734375,
-0.03436279296875,
-0.03662109375,
0.037139892578125,
0.023895263671875,
0.0439453125,
0.07489013671875,
0.061126708984375,
0.023345947265625,
-0.0178680419921875,
0.008941650390625,
-0.0258331298828125,
-0.0106964111328125,
0.016754150390625,
-0.044464111328125,
-0.071533203125,
-0.00493621826171875,
0.061614990234375,
0.03582763671875,
0.0010175704956054688,
0.03515625,
-0.00254058837890625,
0.03900146484375,
-0.0228271484375,
0.0100860595703125,
-0.037506103515625,
-0.0023517608642578125,
-0.03509521484375,
0.0139617919921875,
-0.0180511474609375,
-0.0205535888671875,
-0.0102386474609375,
-0.047882080078125,
0.0019378662109375,
0.0242767333984375,
0.09716796875,
0.0284271240234375,
-0.02960205078125,
-0.00640869140625,
-0.05133056640625,
0.0755615234375,
-0.071044921875,
0.0208740234375,
0.01171875,
0.0244140625,
0.001628875732421875,
-0.0618896484375,
-0.033355712890625,
-0.011993408203125,
-0.00930023193359375,
0.0146484375,
-0.01114654541015625,
-0.013824462890625,
0.0157623291015625,
0.036712646484375,
-0.0301055908203125,
0.007587432861328125,
-0.0413818359375,
-0.00914764404296875,
0.034423828125,
0.0267791748046875,
0.01415252685546875,
-0.0194854736328125,
-0.0426025390625,
-0.0186004638671875,
-0.055206298828125,
0.00521087646484375,
0.0116424560546875,
0.02239990234375,
-0.03375244140625,
0.044158935546875,
-0.01168060302734375,
0.046966552734375,
0.005535125732421875,
-0.028045654296875,
0.038543701171875,
-0.0401611328125,
-0.0277862548828125,
-0.0011548995971679688,
0.07550048828125,
0.0178070068359375,
-0.0142669677734375,
-0.0015764236450195312,
-0.0236968994140625,
0.000377655029296875,
-0.01898193359375,
-0.06549072265625,
0.01849365234375,
0.0097503662109375,
-0.0382080078125,
-0.0261993408203125,
0.0022106170654296875,
-0.04345703125,
0.0014467239379882812,
-0.020355224609375,
0.03765869140625,
-0.04193115234375,
0.0006723403930664062,
0.01849365234375,
0.014556884765625,
0.02740478515625,
0.0218658447265625,
-0.0504150390625,
0.014556884765625,
0.040313720703125,
0.056060791015625,
-0.0116729736328125,
-0.055908203125,
-0.029876708984375,
0.00012624263763427734,
-0.0165252685546875,
0.037841796875,
-0.00701141357421875,
-0.01251220703125,
-0.016571044921875,
0.0035648345947265625,
-0.011627197265625,
-0.034912109375,
0.056427001953125,
-0.024993896484375,
0.01427459716796875,
-0.039031982421875,
-0.0263671875,
-0.007537841796875,
0.01126861572265625,
-0.031982421875,
0.08349609375,
0.01094818115234375,
-0.06329345703125,
0.021087646484375,
-0.041046142578125,
-0.0195159912109375,
-0.018829345703125,
0.016937255859375,
-0.04010009765625,
-0.012908935546875,
0.0198516845703125,
0.0270233154296875,
-0.0546875,
0.01027679443359375,
0.0010166168212890625,
-0.032928466796875,
0.005840301513671875,
-0.01024627685546875,
0.05938720703125,
0.0323486328125,
-0.03179931640625,
0.027191162109375,
-0.057952880859375,
-0.018218994140625,
0.017486572265625,
-0.0284271240234375,
-0.004367828369140625,
-0.028045654296875,
-0.002765655517578125,
0.01346588134765625,
0.045196533203125,
-0.045318603515625,
0.026275634765625,
-0.037933349609375,
0.0244903564453125,
0.052947998046875,
-0.031829833984375,
0.034820556640625,
-0.035064697265625,
0.039794921875,
0.0194244384765625,
0.024993896484375,
-0.0184478759765625,
-0.037750244140625,
-0.0966796875,
-0.0186004638671875,
0.0253143310546875,
0.05169677734375,
-0.0584716796875,
0.0264739990234375,
-0.0197296142578125,
-0.053192138671875,
-0.06610107421875,
0.009246826171875,
0.048858642578125,
0.0458984375,
0.04241943359375,
-0.0019016265869140625,
-0.041900634765625,
-0.06524658203125,
-0.00852203369140625,
-0.00966644287109375,
0.008453369140625,
0.003387451171875,
0.055816650390625,
-0.0208740234375,
0.039031982421875,
-0.0211029052734375,
-0.0182952880859375,
-0.0223236083984375,
0.01076507568359375,
0.04443359375,
0.04010009765625,
0.045013427734375,
-0.03424072265625,
-0.04864501953125,
0.019927978515625,
-0.070556640625,
-0.01148223876953125,
-0.0030155181884765625,
-0.0289764404296875,
0.01558685302734375,
0.025115966796875,
-0.051483154296875,
0.0300445556640625,
0.037567138671875,
-0.04248046875,
0.046173095703125,
-0.007366180419921875,
0.01505279541015625,
-0.10260009765625,
0.01303863525390625,
-0.021484375,
-0.0179443359375,
-0.044097900390625,
0.01340484619140625,
0.0092926025390625,
0.01432037353515625,
-0.060791015625,
0.057586669921875,
-0.0308074951171875,
-0.0053558349609375,
-0.0175323486328125,
0.0021533966064453125,
0.004940032958984375,
0.035858154296875,
-0.0234527587890625,
0.05548095703125,
0.04669189453125,
-0.041961669921875,
0.037811279296875,
0.043853759765625,
-0.0277557373046875,
0.02130126953125,
-0.05682373046875,
0.012542724609375,
-0.006107330322265625,
0.01010894775390625,
-0.0479736328125,
-0.0151519775390625,
0.04736328125,
-0.033477783203125,
0.0185546875,
-0.004093170166015625,
-0.0380859375,
-0.04193115234375,
-0.01424407958984375,
0.0298309326171875,
0.044281005859375,
-0.0386962890625,
0.036529541015625,
-0.0001367330551147461,
-0.0015783309936523438,
-0.06512451171875,
-0.053558349609375,
-0.0038623809814453125,
-0.0252532958984375,
-0.06646728515625,
0.0396728515625,
-0.006702423095703125,
-0.0147247314453125,
-0.00685882568359375,
0.0015707015991210938,
0.01226043701171875,
0.0015125274658203125,
0.009857177734375,
0.047882080078125,
-0.0257720947265625,
-0.01230621337890625,
0.01479339599609375,
-0.01708984375,
-0.0093231201171875,
-0.02020263671875,
0.061187744140625,
-0.03424072265625,
-0.0202789306640625,
-0.051025390625,
0.00937652587890625,
0.055908203125,
-0.0271759033203125,
0.08233642578125,
0.06744384765625,
-0.019317626953125,
0.0238189697265625,
-0.0364990234375,
-0.00521087646484375,
-0.03314208984375,
0.0254058837890625,
-0.034912109375,
-0.04364013671875,
0.06951904296875,
0.015411376953125,
0.020172119140625,
0.048126220703125,
0.049163818359375,
0.01161956787109375,
0.06842041015625,
0.0350341796875,
-0.01399993896484375,
0.035003662109375,
-0.039947509765625,
-0.00933837890625,
-0.06927490234375,
-0.0289764404296875,
-0.0435791015625,
-0.031646728515625,
-0.03253173828125,
-0.039398193359375,
0.029449462890625,
0.0179290771484375,
-0.036102294921875,
0.047332763671875,
-0.0289764404296875,
0.01531219482421875,
0.05194091796875,
-0.004245758056640625,
0.031585693359375,
-0.01055908203125,
-0.01458740234375,
-0.00066375732421875,
-0.044830322265625,
-0.03448486328125,
0.0914306640625,
0.0245361328125,
0.051025390625,
-0.010467529296875,
0.05584716796875,
-0.0174102783203125,
0.0313720703125,
-0.042144775390625,
0.04779052734375,
-0.00870513916015625,
-0.04168701171875,
-0.011932373046875,
-0.03253173828125,
-0.060394287109375,
0.020416259765625,
-0.016571044921875,
-0.04644775390625,
0.007049560546875,
0.001773834228515625,
-0.01477813720703125,
0.033416748046875,
-0.030242919921875,
0.06304931640625,
-0.0233154296875,
-0.027069091796875,
0.006153106689453125,
-0.0465087890625,
0.033477783203125,
-0.00991058349609375,
0.01319122314453125,
-0.0181427001953125,
0.00643157958984375,
0.06793212890625,
-0.03570556640625,
0.033843994140625,
-0.00884246826171875,
-0.0007796287536621094,
0.0213470458984375,
-0.0056304931640625,
0.061798095703125,
-0.008026123046875,
-0.0226593017578125,
0.031524658203125,
0.002353668212890625,
-0.034942626953125,
-0.01904296875,
0.051361083984375,
-0.078125,
-0.042510986328125,
-0.031890869140625,
-0.055816650390625,
0.0175628662109375,
0.031005859375,
0.049468994140625,
0.002010345458984375,
-0.00010776519775390625,
0.0124969482421875,
0.020355224609375,
-0.02191162109375,
0.0244293212890625,
0.037322998046875,
-0.045654296875,
-0.035125732421875,
0.06097412109375,
0.0003612041473388672,
0.0183868408203125,
0.0230712890625,
0.01079559326171875,
-0.005950927734375,
-0.0438232421875,
-0.051483154296875,
0.04296875,
-0.04034423828125,
-0.0279541015625,
-0.0367431640625,
-0.00910186767578125,
-0.047607421875,
-0.0022430419921875,
-0.052093505859375,
-0.030426025390625,
-0.0275115966796875,
0.004444122314453125,
0.0270538330078125,
0.04498291015625,
0.00186920166015625,
0.04559326171875,
-0.06329345703125,
0.0186920166015625,
-0.0067138671875,
-0.0004208087921142578,
-0.013946533203125,
-0.06744384765625,
-0.0443115234375,
0.010040283203125,
-0.023529052734375,
-0.053802490234375,
0.06451416015625,
-0.002094268798828125,
0.0192718505859375,
0.029510498046875,
-0.007171630859375,
0.0546875,
-0.0406494140625,
0.04693603515625,
0.0192413330078125,
-0.0775146484375,
0.034942626953125,
-0.039703369140625,
0.0218353271484375,
0.035125732421875,
0.01763916015625,
-0.047088623046875,
-0.022857666015625,
-0.0192718505859375,
-0.06512451171875,
0.07598876953125,
0.033050537109375,
-0.0029296875,
0.016845703125,
0.038238525390625,
0.0164031982421875,
0.0187225341796875,
-0.06817626953125,
-0.051116943359375,
-0.0217132568359375,
-0.006801605224609375,
0.0014162063598632812,
-0.00315093994140625,
-0.0038280487060546875,
-0.03411865234375,
0.065185546875,
0.007678985595703125,
0.035888671875,
0.00702667236328125,
0.007427215576171875,
-0.0174560546875,
0.00821685791015625,
0.032989501953125,
0.0355224609375,
-0.0260162353515625,
-0.024322509765625,
0.0300750732421875,
-0.03564453125,
0.0022411346435546875,
0.0192108154296875,
-0.039794921875,
0.0048065185546875,
0.0185089111328125,
0.09039306640625,
0.0048980712890625,
-0.02325439453125,
0.0162200927734375,
0.006481170654296875,
-0.013336181640625,
-0.02545166015625,
-0.005199432373046875,
0.0088958740234375,
0.0198974609375,
0.0163421630859375,
0.0107574462890625,
-0.01068878173828125,
-0.03570556640625,
-0.01306915283203125,
0.0230560302734375,
-0.013702392578125,
-0.036834716796875,
0.080322265625,
0.0074005126953125,
-0.0151519775390625,
0.042388916015625,
-0.00982666015625,
-0.03448486328125,
0.0506591796875,
0.04510498046875,
0.0489501953125,
-0.0279083251953125,
0.01073455810546875,
0.035125732421875,
0.0285797119140625,
-0.017852783203125,
0.019073486328125,
0.021942138671875,
-0.0546875,
-0.0248260498046875,
-0.06878662109375,
-0.0181427001953125,
0.004390716552734375,
-0.0595703125,
0.0426025390625,
-0.01480865478515625,
-0.01177215576171875,
-0.0089263916015625,
-0.0005717277526855469,
-0.056427001953125,
0.00504302978515625,
0.01090240478515625,
0.06915283203125,
-0.060394287109375,
0.08197021484375,
0.05426025390625,
-0.0279693603515625,
-0.057098388671875,
-0.0219268798828125,
-0.00989532470703125,
-0.06732177734375,
0.0521240234375,
0.0007538795471191406,
-0.00015687942504882812,
0.002590179443359375,
-0.035675048828125,
-0.08544921875,
0.112548828125,
0.052459716796875,
-0.0333251953125,
-0.022735595703125,
0.0369873046875,
0.036468505859375,
-0.026275634765625,
0.00823974609375,
0.044464111328125,
0.052520751953125,
0.02166748046875,
-0.07525634765625,
0.0024566650390625,
-0.024200439453125,
-0.01715087890625,
0.01715087890625,
-0.07696533203125,
0.1041259765625,
-0.0090179443359375,
-0.0007729530334472656,
0.0433349609375,
0.068115234375,
0.0270538330078125,
0.031036376953125,
0.0081024169921875,
0.047637939453125,
0.0535888671875,
-0.0187225341796875,
0.08544921875,
-0.031951904296875,
0.032958984375,
0.06341552734375,
-0.006809234619140625,
0.05145263671875,
0.034912109375,
-0.02093505859375,
0.03314208984375,
0.04315185546875,
-0.006084442138671875,
0.018157958984375,
-0.00959014892578125,
-0.01126861572265625,
-0.021514892578125,
-0.01288604736328125,
-0.04779052734375,
0.03472900390625,
0.01381683349609375,
-0.035247802734375,
-0.0022716522216796875,
0.0109100341796875,
0.0226593017578125,
-0.0206451416015625,
-0.022308349609375,
0.016571044921875,
0.01678466796875,
-0.05108642578125,
0.06597900390625,
0.0289306640625,
0.0579833984375,
-0.045318603515625,
0.018218994140625,
-0.03509521484375,
0.0223236083984375,
0.000972747802734375,
-0.02294921875,
0.014739990234375,
0.0083770751953125,
-0.006237030029296875,
-0.020355224609375,
0.034942626953125,
-0.0223388671875,
-0.0670166015625,
0.029083251953125,
0.0394287109375,
0.02313232421875,
0.005329132080078125,
-0.062103271484375,
0.018157958984375,
0.013702392578125,
-0.038421630859375,
0.02557373046875,
0.024749755859375,
0.0094146728515625,
0.050323486328125,
0.044464111328125,
0.0121002197265625,
0.0205841064453125,
-0.00934600830078125,
0.0716552734375,
-0.050323486328125,
-0.01129913330078125,
-0.060699462890625,
0.03765869140625,
0.0201263427734375,
-0.036285400390625,
0.06951904296875,
0.040863037109375,
0.05645751953125,
-0.006801605224609375,
0.03314208984375,
0.00003618001937866211,
0.0380859375,
-0.047332763671875,
0.05535888671875,
-0.052978515625,
0.0294189453125,
-0.023223876953125,
-0.080322265625,
-0.0208892822265625,
0.03564453125,
-0.0221099853515625,
-0.0032558441162109375,
0.053070068359375,
0.0504150390625,
-0.0008897781372070312,
-0.0203857421875,
0.0243682861328125,
0.039398193359375,
0.01381683349609375,
0.055206298828125,
0.0421142578125,
-0.073486328125,
0.059661865234375,
-0.0305633544921875,
-0.0005135536193847656,
-0.01451873779296875,
-0.042205810546875,
-0.0693359375,
-0.03607177734375,
-0.0179901123046875,
-0.03350830078125,
-0.003559112548828125,
0.07000732421875,
0.035858154296875,
-0.059814453125,
-0.0235443115234375,
-0.0002359151840209961,
0.007030487060546875,
-0.00904083251953125,
-0.01079559326171875,
0.036529541015625,
-0.0097808837890625,
-0.09063720703125,
0.03399658203125,
0.00202178955078125,
0.021514892578125,
-0.010162353515625,
-0.0251312255859375,
-0.01412200927734375,
0.00366973876953125,
0.0211944580078125,
0.0158538818359375,
-0.052032470703125,
0.0071868896484375,
0.0277557373046875,
-0.0177154541015625,
0.0179443359375,
0.02374267578125,
-0.04644775390625,
0.0024566650390625,
0.032623291015625,
0.01763916015625,
0.05828857421875,
-0.0076904296875,
0.024505615234375,
-0.03192138671875,
0.0306549072265625,
0.0004973411560058594,
0.039794921875,
0.0198974609375,
-0.01351165771484375,
0.037628173828125,
0.0238037109375,
-0.0433349609375,
-0.06524658203125,
0.002899169921875,
-0.0777587890625,
-0.004055023193359375,
0.0863037109375,
-0.0214691162109375,
-0.037445068359375,
0.0219879150390625,
-0.0241546630859375,
0.042694091796875,
-0.0261993408203125,
0.056976318359375,
0.044647216796875,
0.00665283203125,
0.0052642822265625,
-0.034393310546875,
0.0182647705078125,
0.044464111328125,
-0.060394287109375,
-0.00786590576171875,
0.0131683349609375,
0.00994873046875,
0.017059326171875,
0.028900146484375,
-0.00994873046875,
0.01332855224609375,
-0.00971221923828125,
0.0164947509765625,
-0.0165252685546875,
-0.024627685546875,
-0.00969696044921875,
-0.0187530517578125,
-0.006595611572265625,
-0.018218994140625
]
] |
hyunseoki/ko-en-llama2-13b | 2023-10-03T18:49:04.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"ko",
"endpoints_compatible",
"text-generation-inference",
"region:us",
"has_space"
] | text-generation | hyunseoki | null | null | hyunseoki/ko-en-llama2-13b | 24 | 12,090 | transformers | 2023-10-02T13:05:10 | ---
language:
- ko
library_name: transformers
pipeline_tag: text-generation
---
**Model Developers** HyunseokLee, TaeyoungKim (kaist alinlab, omnious.ai)
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture**
ko-en-llama2-13b is an auto-regressive language model based on the LLaMA2 transformer architecture.
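A minimal generation sketch, assuming the checkpoint loads with the standard `transformers` causal-LM classes (the prompt below is illustrative, and the model is a base LM rather than an instruction-tuned chat model):
```
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "hyunseoki/ko-en-llama2-13b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")

# Plain-text continuation of a Korean prompt
prompt = "대한민국의 수도는"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```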
**Base Model**
Llama-2-13B
**Training Dataset**
Open datasets: Wikipedia and AIhub (English + Korean).
**Training Objective**
We trained the model to learn from a Korean corpus while maintaining Llama's English ability.
(still training) | 588 | [
[
-0.001483917236328125,
-0.06097412109375,
0.03521728515625,
0.039520263671875,
-0.01873779296875,
0.022186279296875,
-0.0032329559326171875,
-0.034393310546875,
0.0140838623046875,
0.047760009765625,
-0.040374755859375,
-0.032928466796875,
-0.043426513671875,
0.0121307373046875,
-0.02288818359375,
0.064208984375,
-0.017486572265625,
0.020111083984375,
0.004425048828125,
0.01094818115234375,
-0.04852294921875,
-0.022064208984375,
-0.044891357421875,
-0.034759521484375,
0.01885986328125,
0.055816650390625,
0.050750732421875,
0.06878662109375,
0.032958984375,
0.0196533203125,
0.004730224609375,
-0.013916015625,
-0.046478271484375,
0.0186309814453125,
0.00017881393432617188,
-0.054229736328125,
-0.0296630859375,
-0.017913818359375,
0.039306640625,
0.01421356201171875,
-0.01555633544921875,
0.032684326171875,
-0.01319122314453125,
0.03814697265625,
-0.0267181396484375,
0.03216552734375,
-0.05194091796875,
-0.015411376953125,
-0.0163726806640625,
0.032928466796875,
-0.033660888671875,
-0.0220947265625,
-0.0218505859375,
-0.026885986328125,
-0.01458740234375,
0.003818511962890625,
0.08355712890625,
0.037445068359375,
-0.041534423828125,
-0.0311126708984375,
-0.044464111328125,
0.04412841796875,
-0.051605224609375,
0.04010009765625,
0.049835205078125,
0.0200042724609375,
-0.0133819580078125,
-0.0504150390625,
-0.034210205078125,
-0.021331787109375,
-0.0013284683227539062,
0.0138702392578125,
-0.0028324127197265625,
-0.0126190185546875,
-0.01317596435546875,
0.0193634033203125,
-0.03997802734375,
0.04498291015625,
-0.05364990234375,
-0.00936126708984375,
0.06103515625,
0.01302337646484375,
0.018951416015625,
-0.0272979736328125,
-0.0189208984375,
-0.00849151611328125,
-0.04583740234375,
-0.00704193115234375,
0.034515380859375,
0.01290130615234375,
-0.025421142578125,
0.0693359375,
-0.0146484375,
0.02545166015625,
0.01552581787109375,
-0.0308685302734375,
0.0236663818359375,
-0.00601959228515625,
-0.0307464599609375,
0.0103607177734375,
0.06353759765625,
0.009765625,
0.006313323974609375,
-0.0198211669921875,
-0.0163726806640625,
0.0210723876953125,
0.01441192626953125,
-0.03607177734375,
-0.01302337646484375,
0.01056671142578125,
-0.06158447265625,
-0.05389404296875,
-0.01486968994140625,
-0.059722900390625,
-0.012481689453125,
-0.006404876708984375,
0.033050537109375,
-0.01047515869140625,
-0.04205322265625,
-0.00032067298889160156,
0.0133514404296875,
0.0159912109375,
-0.00804901123046875,
-0.05230712890625,
0.0291900634765625,
0.017364501953125,
0.04156494140625,
-0.01078033447265625,
-0.026611328125,
0.01352691650390625,
-0.0011138916015625,
-0.038604736328125,
0.058502197265625,
-0.0280303955078125,
-0.034912109375,
0.001247406005859375,
0.018585205078125,
-0.01049041748046875,
-0.040069580078125,
0.043609619140625,
-0.052001953125,
0.021331787109375,
0.0144805908203125,
-0.03521728515625,
-0.03289794921875,
-0.0069732666015625,
-0.0626220703125,
0.1041259765625,
0.025177001953125,
-0.0246429443359375,
0.0017652511596679688,
-0.056060791015625,
-0.018585205078125,
-0.00838470458984375,
0.00833892822265625,
-0.0276336669921875,
0.01287078857421875,
-0.01001739501953125,
0.022552490234375,
-0.0304107666015625,
0.043243408203125,
-0.000667572021484375,
-0.0018339157104492188,
-0.0017099380493164062,
-0.0125885009765625,
0.049957275390625,
0.0281524658203125,
-0.014617919921875,
0.0000368952751159668,
-0.1002197265625,
-0.0103759765625,
0.054412841796875,
-0.046478271484375,
-0.01381683349609375,
0.003170013427734375,
0.0181732177734375,
0.0232391357421875,
0.03497314453125,
-0.04010009765625,
0.0263671875,
-0.0246429443359375,
0.0198211669921875,
0.04803466796875,
-0.01352691650390625,
0.04730224609375,
-0.01168060302734375,
0.058990478515625,
0.00017690658569335938,
-0.01190948486328125,
-0.01453399658203125,
-0.0379638671875,
-0.07928466796875,
-0.0117340087890625,
0.02362060546875,
0.0731201171875,
-0.038726806640625,
0.0294647216796875,
0.0014600753784179688,
-0.06683349609375,
-0.056304931640625,
0.0203857421875,
0.034027099609375,
0.0306854248046875,
0.0129241943359375,
0.00968170166015625,
-0.0672607421875,
-0.081787109375,
0.009490966796875,
-0.03546142578125,
0.011322021484375,
0.006824493408203125,
0.04461669921875,
-0.0374755859375,
0.039825439453125,
-0.0223846435546875,
-0.006572723388671875,
-0.036834716796875,
-0.001560211181640625,
0.021240234375,
0.03070068359375,
0.034515380859375,
-0.03594970703125,
-0.0433349609375,
-0.00373077392578125,
-0.05364990234375,
-0.0419921875,
-0.010498046875,
-0.0273590087890625,
0.032867431640625,
0.044708251953125,
-0.061065673828125,
0.035400390625,
0.058837890625,
-0.0301361083984375,
0.02978515625,
0.01055145263671875,
-0.014312744140625,
-0.106201171875,
-0.009918212890625,
-0.029388427734375,
-0.01238250732421875,
-0.03692626953125,
0.007808685302734375,
0.0105133056640625,
0.005176544189453125,
-0.01605224609375,
0.0494384765625,
-0.021026611328125,
-0.0006814002990722656,
-0.035797119140625,
-0.004138946533203125,
-0.00440216064453125,
0.0275115966796875,
0.00997161865234375,
0.0531005859375,
0.0243072509765625,
-0.0531005859375,
0.0255279541015625,
0.04644775390625,
-0.0275726318359375,
0.006244659423828125,
-0.05889892578125,
0.0170440673828125,
-0.005306243896484375,
0.00725555419921875,
-0.08721923828125,
-0.04266357421875,
0.0305328369140625,
-0.037139892578125,
0.01727294921875,
0.0125732421875,
-0.04718017578125,
-0.0447998046875,
-0.003376007080078125,
0.0157623291015625,
0.047760009765625,
-0.03741455078125,
0.049346923828125,
0.02197265625,
-0.01422882080078125,
-0.0294952392578125,
-0.060211181640625,
0.00255584716796875,
-0.0062408447265625,
-0.03594970703125,
0.0207977294921875,
0.0021572113037109375,
-0.0196685791015625,
0.006000518798828125,
0.018951416015625,
-0.020721435546875,
0.01239013671875,
0.0262603759765625,
0.045654296875,
-0.0308837890625,
0.01253509521484375,
0.026092529296875,
-0.0008935928344726562,
0.0015077590942382812,
0.0199127197265625,
0.06500244140625,
0.0062255859375,
-0.0123443603515625,
-0.0447998046875,
0.01026153564453125,
0.029510498046875,
0.002658843994140625,
0.054473876953125,
0.02960205078125,
-0.020782470703125,
0.01042938232421875,
-0.038787841796875,
0.0149688720703125,
-0.03497314453125,
0.061553955078125,
-0.047515869140625,
-0.05108642578125,
0.046783447265625,
-0.0070953369140625,
0.00445556640625,
0.06195068359375,
0.08026123046875,
0.00405120849609375,
0.06976318359375,
0.062225341796875,
-0.01275634765625,
0.0150299072265625,
0.004863739013671875,
0.0095672607421875,
-0.05120849609375,
-0.04339599609375,
-0.0406494140625,
-0.0194091796875,
-0.044464111328125,
-0.011444091796875,
-0.0168914794921875,
0.0310516357421875,
-0.04461669921875,
0.049835205078125,
-0.0303802490234375,
0.0240020751953125,
0.05242919921875,
-0.011627197265625,
0.010589599609375,
-0.005939483642578125,
-0.003849029541015625,
0.0114288330078125,
-0.05181884765625,
-0.06280517578125,
0.09490966796875,
0.0491943359375,
0.07305908203125,
0.006744384765625,
0.046600341796875,
0.00742340087890625,
0.0120697021484375,
-0.057891845703125,
0.03033447265625,
0.009765625,
-0.054779052734375,
-0.0046844482421875,
0.0018062591552734375,
-0.0648193359375,
-0.00302886962890625,
0.00347900390625,
-0.048614501953125,
0.01001739501953125,
0.01041412353515625,
0.004344940185546875,
0.0140228271484375,
-0.051116943359375,
0.0474853515625,
-0.0328369140625,
0.014434814453125,
0.0007481575012207031,
-0.03857421875,
0.037078857421875,
-0.02557373046875,
0.00946044921875,
-0.006744384765625,
-0.0132904052734375,
0.0460205078125,
-0.0006232261657714844,
0.060882568359375,
-0.007305145263671875,
-0.0160064697265625,
0.0308074951171875,
0.0277862548828125,
0.03704833984375,
-0.0004756450653076172,
0.00545501708984375,
0.02197265625,
-0.017669677734375,
-0.00853729248046875,
-0.01471710205078125,
0.0372314453125,
-0.081298828125,
-0.03533935546875,
-0.016693115234375,
-0.033538818359375,
-0.0066680908203125,
0.017974853515625,
0.0273895263671875,
-0.025146484375,
-0.0196533203125,
0.0045928955078125,
0.0170440673828125,
-0.025543212890625,
0.03314208984375,
0.0498046875,
-0.044189453125,
-0.053009033203125,
0.045135498046875,
0.0005097389221191406,
0.0194854736328125,
0.0160675048828125,
0.005695343017578125,
-0.0243377685546875,
-0.01739501953125,
-0.046539306640625,
0.0301055908203125,
-0.052276611328125,
-0.040771484375,
-0.0477294921875,
-0.041015625,
-0.045074462890625,
0.005733489990234375,
-0.048004150390625,
-0.037750244140625,
-0.0379638671875,
-0.03070068359375,
0.0198974609375,
0.08807373046875,
-0.010101318359375,
0.059844970703125,
-0.046966552734375,
0.0261077880859375,
0.00865936279296875,
0.0208282470703125,
-0.006793975830078125,
-0.056488037109375,
-0.0188751220703125,
-0.00128936767578125,
-0.030426025390625,
-0.06158447265625,
0.043731689453125,
0.0223541259765625,
0.041656494140625,
0.022491455078125,
-0.0229034423828125,
0.034149169921875,
-0.0496826171875,
0.06494140625,
0.004268646240234375,
-0.0611572265625,
0.0445556640625,
-0.006317138671875,
0.02362060546875,
0.022491455078125,
0.0294189453125,
-0.0275115966796875,
-0.0203094482421875,
-0.041046142578125,
-0.050262451171875,
0.06439208984375,
0.0140838623046875,
0.0227508544921875,
-0.0088958740234375,
0.033203125,
0.0247955322265625,
0.0162811279296875,
-0.08294677734375,
-0.01560211181640625,
-0.035400390625,
-0.05206298828125,
0.0126953125,
-0.054962158203125,
0.01690673828125,
-0.0033245086669921875,
0.06427001953125,
0.01029205322265625,
0.042266845703125,
-0.0026340484619140625,
-0.0171661376953125,
-0.020416259765625,
-0.006168365478515625,
0.043914794921875,
0.007747650146484375,
-0.006229400634765625,
-0.01226806640625,
0.006160736083984375,
-0.051788330078125,
0.0263214111328125,
-0.0084381103515625,
-0.03265380859375,
0.004718780517578125,
0.027679443359375,
0.08380126953125,
0.00496673583984375,
-0.049102783203125,
-0.00026297569274902344,
0.0159149169921875,
-0.0242462158203125,
-0.0380859375,
0.00034332275390625,
0.0171051025390625,
0.030609130859375,
0.0157012939453125,
-0.00954437255859375,
-0.002170562744140625,
-0.01483917236328125,
-0.004848480224609375,
-0.00010401010513305664,
-0.0008587837219238281,
-0.03790283203125,
0.06805419921875,
0.0027637481689453125,
-0.036895751953125,
0.04388427734375,
-0.006168365478515625,
-0.03912353515625,
0.06292724609375,
0.0867919921875,
0.05804443359375,
-0.04461669921875,
0.0200347900390625,
0.039642333984375,
0.030853271484375,
-0.018402099609375,
0.0408935546875,
0.0266265869140625,
-0.061614990234375,
-0.0235748291015625,
-0.0513916015625,
-0.009063720703125,
0.0307769775390625,
-0.05303955078125,
0.041534423828125,
-0.02435302734375,
-0.00989532470703125,
-0.006534576416015625,
-0.01306915283203125,
-0.041656494140625,
0.01824951171875,
0.018402099609375,
0.0740966796875,
-0.06011962890625,
0.07122802734375,
0.06256103515625,
-0.040496826171875,
-0.059326171875,
-0.0261077880859375,
-0.00811004638671875,
-0.09869384765625,
0.09002685546875,
0.00287628173828125,
0.01505279541015625,
0.012786865234375,
-0.06591796875,
-0.10382080078125,
0.08343505859375,
0.01335906982421875,
-0.044219970703125,
-0.014984130859375,
0.01385498046875,
0.02532958984375,
-0.04315185546875,
-0.0049285888671875,
0.03582763671875,
0.032135009765625,
-0.01032257080078125,
-0.08001708984375,
-0.0274200439453125,
-0.025726318359375,
0.0151824951171875,
-0.011505126953125,
-0.0716552734375,
0.059844970703125,
-0.000047326087951660156,
-0.00707244873046875,
0.041259765625,
0.053253173828125,
0.017669677734375,
0.0016412734985351562,
0.0310516357421875,
0.049285888671875,
0.04296875,
0.007476806640625,
0.0543212890625,
-0.03857421875,
0.0221099853515625,
0.1015625,
-0.0272979736328125,
0.07257080078125,
0.01505279541015625,
-0.0164031982421875,
0.052337646484375,
0.0703125,
-0.0230560302734375,
0.054901123046875,
0.0004019737243652344,
0.0208892822265625,
-0.006000518798828125,
-0.002231597900390625,
-0.021148681640625,
0.072509765625,
0.0218353271484375,
-0.0447998046875,
0.0029392242431640625,
0.00933837890625,
0.0286102294921875,
-0.0005502700805664062,
-0.0177001953125,
0.057891845703125,
-0.007755279541015625,
-0.03289794921875,
0.0310516357421875,
0.028228759765625,
0.04071044921875,
-0.046356201171875,
-0.0003821849822998047,
-0.016448974609375,
0.0029582977294921875,
-0.0026226043701171875,
-0.0390625,
0.0248870849609375,
0.00919342041015625,
-0.031768798828125,
0.01178741455078125,
0.0743408203125,
-0.035247802734375,
-0.060638427734375,
0.0311737060546875,
0.032470703125,
0.00707244873046875,
0.0177764892578125,
-0.05548095703125,
0.0204010009765625,
0.001827239990234375,
-0.03607177734375,
0.009735107421875,
0.0206451416015625,
-0.0128631591796875,
0.05718994140625,
0.027984619140625,
-0.0010080337524414062,
0.0200042724609375,
-0.0014848709106445312,
0.060302734375,
-0.035247802734375,
-0.01507568359375,
-0.0584716796875,
0.02838134765625,
0.0034503936767578125,
-0.025848388671875,
0.049835205078125,
0.04803466796875,
0.06439208984375,
-0.0247650146484375,
0.055389404296875,
-0.0180206298828125,
0.05059814453125,
-0.0251312255859375,
0.04779052734375,
-0.0282440185546875,
-0.01305389404296875,
-0.0008969306945800781,
-0.07232666015625,
-0.01000213623046875,
0.0677490234375,
0.01061248779296875,
-0.01163482666015625,
0.039093017578125,
0.047393798828125,
0.01546478271484375,
-0.01116943359375,
0.0283660888671875,
0.041900634765625,
0.0033130645751953125,
0.0251617431640625,
0.054412841796875,
-0.05615234375,
0.03594970703125,
-0.029388427734375,
-0.01132965087890625,
-0.0087738037109375,
-0.046844482421875,
-0.07623291015625,
-0.0350341796875,
-0.01192474365234375,
-0.041290283203125,
-0.01000213623046875,
0.07293701171875,
0.023345947265625,
-0.058990478515625,
-0.03033447265625,
-0.01068115234375,
-0.01029205322265625,
0.01508331298828125,
-0.017333984375,
0.0305023193359375,
-0.03057861328125,
-0.05242919921875,
0.03619384765625,
0.0006284713745117188,
0.019073486328125,
-0.033416748046875,
-0.01094818115234375,
-0.018463134765625,
0.01291656494140625,
0.0408935546875,
0.0088653564453125,
-0.060211181640625,
0.0056610107421875,
0.0282440185546875,
-0.01416778564453125,
0.00983428955078125,
0.0183563232421875,
-0.048919677734375,
0.01302337646484375,
0.0173797607421875,
0.048370361328125,
0.01458740234375,
-0.00818634033203125,
0.04620361328125,
-0.021697998046875,
0.0296478271484375,
-0.001922607421875,
0.0169219970703125,
0.0118255615234375,
-0.040618896484375,
0.054412841796875,
0.0189666748046875,
-0.0511474609375,
-0.05633544921875,
0.0106353759765625,
-0.05841064453125,
-0.01204681396484375,
0.10101318359375,
0.00586700439453125,
-0.031829833984375,
-0.0043182373046875,
-0.044464111328125,
0.032012939453125,
-0.0225677490234375,
0.0567626953125,
0.062225341796875,
0.0053253173828125,
-0.01837158203125,
-0.047271728515625,
0.0208740234375,
0.005924224853515625,
-0.04669189453125,
-0.0205230712890625,
0.024017333984375,
0.02130126953125,
0.0080108642578125,
0.03912353515625,
-0.0083160400390625,
0.0116119384765625,
0.01441192626953125,
0.032684326171875,
-0.0034275054931640625,
-0.02496337890625,
-0.01169586181640625,
-0.0185394287109375,
0.001495361328125,
-0.0248565673828125
]
] |
lmsys/vicuna-13b-v1.1 | 2023-08-01T18:26:15.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"arxiv:2302.13971",
"arxiv:2306.05685",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | lmsys | null | null | lmsys/vicuna-13b-v1.1 | 96 | 12,073 | transformers | 2023-04-12T21:23:50 | ---
inference: false
---
**NOTE: New version available**
Please check out a newer version of the weights [here](https://github.com/lm-sys/FastChat/blob/main/docs/vicuna_weights_version.md).
<br>
# Vicuna Model Card
## Model Details
Vicuna is a chat assistant trained by fine-tuning LLaMA on user-shared conversations collected from ShareGPT.
- **Developed by:** [LMSYS](https://lmsys.org/)
- **Model type:** An auto-regressive language model based on the transformer architecture.
- **License:** Non-commercial license
- **Finetuned from model:** [LLaMA](https://arxiv.org/abs/2302.13971).
### Model Sources
- **Repository:** https://github.com/lm-sys/FastChat
- **Blog:** https://lmsys.org/blog/2023-03-30-vicuna/
- **Paper:** https://arxiv.org/abs/2306.05685
- **Demo:** https://chat.lmsys.org/
## Uses
The primary use of Vicuna is research on large language models and chatbots.
The primary intended users of the model are researchers and hobbyists in natural language processing, machine learning, and artificial intelligence.
## How to Get Started with the Model
Command line interface: https://github.com/lm-sys/FastChat#vicuna-weights.
APIs (OpenAI API, Huggingface API): https://github.com/lm-sys/FastChat/tree/main#api.
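A local-loading sketch with `transformers` is shown below; it is not from the original card, and the single-turn USER/ASSISTANT prompt is only an approximation of the Vicuna v1.1 conversation template (the FastChat links above apply the exact template for you).
```
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "lmsys/vicuna-13b-v1.1"
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")

# Rough single-turn prompt in the Vicuna v1.1 style; see the FastChat docs for the full template.
prompt = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "USER: What is the capital of France? ASSISTANT:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```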
## Training Details
Vicuna v1.1 is fine-tuned from LLaMA with supervised instruction fine-tuning.
The training data is around 70K conversations collected from ShareGPT.com.
See more details in the "Training Details of Vicuna Models" section in the appendix of this [paper](https://arxiv.org/pdf/2306.05685.pdf).
## Evaluation
Vicuna is evaluated with standard benchmarks, human preference, and LLM-as-a-judge. See more details in this [paper](https://arxiv.org/pdf/2306.05685.pdf) and [leaderboard](https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard).
## Difference between different versions of Vicuna
See [vicuna_weights_version.md](https://github.com/lm-sys/FastChat/blob/main/docs/vicuna_weights_version.md)
## Acknowledgement
Special thanks to [@TheBloke](https://huggingface.co/TheBloke) for hosting this merged version of weights earlier. | 2,110 | [
[
-0.0138702392578125,
-0.061981201171875,
0.026611328125,
0.0357666015625,
-0.040191650390625,
-0.01666259765625,
-0.017608642578125,
-0.04364013671875,
0.031402587890625,
0.0271148681640625,
-0.039703369140625,
-0.036834716796875,
-0.050048828125,
-0.002460479736328125,
-0.011077880859375,
0.0667724609375,
0.00505828857421875,
0.01593017578125,
-0.00849151611328125,
-0.0300750732421875,
-0.06707763671875,
-0.038116455078125,
-0.0762939453125,
-0.0291748046875,
0.048370361328125,
0.033782958984375,
0.045562744140625,
0.038116455078125,
0.0265045166015625,
0.0300750732421875,
-0.00705718994140625,
0.0190887451171875,
-0.04351806640625,
0.004726409912109375,
0.0267486572265625,
-0.064453125,
-0.054290771484375,
-0.0183563232421875,
0.04150390625,
0.0071258544921875,
-0.0208587646484375,
0.01446533203125,
0.001251220703125,
0.03546142578125,
-0.0250244140625,
0.0265655517578125,
-0.044036865234375,
-0.01788330078125,
-0.0161285400390625,
-0.0416259765625,
-0.0201263427734375,
-0.024261474609375,
-0.01090240478515625,
-0.03533935546875,
-0.004222869873046875,
-0.00505828857421875,
0.08251953125,
0.04144287109375,
-0.0270843505859375,
-0.0128631591796875,
-0.043670654296875,
0.0458984375,
-0.06805419921875,
0.03302001953125,
0.032196044921875,
0.0440673828125,
-0.0170135498046875,
-0.044036865234375,
-0.036865234375,
-0.0284423828125,
0.0057373046875,
-0.0038738250732421875,
-0.020751953125,
0.002349853515625,
0.002483367919921875,
0.03497314453125,
-0.026824951171875,
0.0305023193359375,
-0.04217529296875,
0.0013666152954101562,
0.045562744140625,
0.0226898193359375,
0.01229095458984375,
-0.01403045654296875,
-0.0284881591796875,
-0.03045654296875,
-0.020751953125,
-0.0015840530395507812,
0.0273284912109375,
0.038818359375,
-0.044586181640625,
0.040985107421875,
-0.021148681640625,
0.042816162109375,
-0.002727508544921875,
-0.014190673828125,
0.0386962890625,
-0.0097198486328125,
-0.031982421875,
-0.01163482666015625,
0.08587646484375,
0.038360595703125,
0.0020122528076171875,
0.01177978515625,
0.0032672882080078125,
-0.011016845703125,
0.01171112060546875,
-0.059844970703125,
0.0007624626159667969,
0.03814697265625,
-0.0218048095703125,
-0.038177490234375,
-0.0139007568359375,
-0.024383544921875,
-0.0377197265625,
-0.016845703125,
0.0338134765625,
-0.032379150390625,
-0.0291290283203125,
0.025238037109375,
-0.0005955696105957031,
0.022705078125,
0.039764404296875,
-0.0467529296875,
0.033172607421875,
0.036651611328125,
0.07855224609375,
0.0016021728515625,
-0.0287017822265625,
-0.0194854736328125,
-0.0307159423828125,
-0.009735107421875,
0.06219482421875,
0.004940032958984375,
-0.020172119140625,
-0.010162353515625,
0.0146942138671875,
-0.001926422119140625,
-0.039459228515625,
0.050689697265625,
-0.023590087890625,
0.0276336669921875,
-0.01361846923828125,
-0.038543701171875,
0.0024700164794921875,
0.0179443359375,
-0.046630859375,
0.093017578125,
0.006587982177734375,
-0.0594482421875,
0.00901031494140625,
-0.037689208984375,
-0.0009083747863769531,
0.006237030029296875,
0.0013484954833984375,
-0.039215087890625,
-0.00543975830078125,
0.0019273757934570312,
0.041656494140625,
-0.0305023193359375,
0.02508544921875,
-0.019805908203125,
-0.039825439453125,
0.0178070068359375,
-0.0291900634765625,
0.07965087890625,
0.017303466796875,
-0.0229034423828125,
0.034210205078125,
-0.05511474609375,
-0.00641632080078125,
0.0264892578125,
-0.0237579345703125,
-0.0296630859375,
-0.01070404052734375,
-0.0011911392211914062,
-0.0005588531494140625,
0.0394287109375,
-0.02105712890625,
0.025421142578125,
-0.002948760986328125,
0.01507568359375,
0.053558349609375,
-0.0062103271484375,
0.0210723876953125,
-0.03338623046875,
0.0277099609375,
-0.004085540771484375,
0.05035400390625,
0.017608642578125,
-0.037261962890625,
-0.07684326171875,
-0.034332275390625,
0.004817962646484375,
0.04644775390625,
-0.05804443359375,
0.050567626953125,
-0.032928466796875,
-0.07794189453125,
-0.0645751953125,
0.0194244384765625,
0.0278778076171875,
0.00390625,
0.025482177734375,
-0.039947509765625,
-0.053375244140625,
-0.0648193359375,
-0.0125579833984375,
-0.024688720703125,
-0.005855560302734375,
0.0305633544921875,
0.0190277099609375,
-0.03472900390625,
0.0595703125,
-0.0364990234375,
-0.0249176025390625,
-0.0113677978515625,
-0.000743865966796875,
0.004978179931640625,
0.0307769775390625,
0.0477294921875,
-0.04541015625,
-0.0276336669921875,
-0.00839996337890625,
-0.054443359375,
-0.00562286376953125,
-0.004146575927734375,
-0.031951904296875,
0.0031280517578125,
0.03369140625,
-0.049713134765625,
0.028076171875,
0.05224609375,
-0.031890869140625,
0.035125732421875,
-0.0171356201171875,
-0.0016222000122070312,
-0.10150146484375,
-0.0032405853271484375,
0.007762908935546875,
-0.0369873046875,
-0.0399169921875,
-0.0017147064208984375,
-0.0018672943115234375,
0.032989501953125,
-0.052703857421875,
0.07586669921875,
-0.03125,
0.01053619384765625,
-0.035858154296875,
-0.0055999755859375,
-0.00676727294921875,
0.05670166015625,
-0.0033931732177734375,
0.046356201171875,
0.03057861328125,
-0.0626220703125,
0.0367431640625,
0.01180267333984375,
-0.0198516845703125,
0.022430419921875,
-0.06964111328125,
0.017913818359375,
0.00006341934204101562,
0.0264434814453125,
-0.0625,
-0.002254486083984375,
0.04473876953125,
-0.043212890625,
0.01605224609375,
-0.003589630126953125,
-0.033172607421875,
-0.0147857666015625,
-0.0228424072265625,
0.0161285400390625,
0.025390625,
-0.031982421875,
0.019805908203125,
0.03582763671875,
0.00879669189453125,
-0.042388916015625,
-0.0439453125,
-0.0003197193145751953,
-0.030181884765625,
-0.00665283203125,
0.00408935546875,
-0.02203369140625,
-0.018280029296875,
-0.0157623291015625,
0.00524139404296875,
-0.007152557373046875,
0.007415771484375,
0.0218353271484375,
0.00295257568359375,
0.00380706787109375,
0.01169586181640625,
-0.01053619384765625,
0.00127410888671875,
-0.01226806640625,
-0.006786346435546875,
0.07562255859375,
-0.033538818359375,
0.0081939697265625,
-0.0634765625,
-0.0113372802734375,
0.04364013671875,
0.007724761962890625,
0.0968017578125,
0.062286376953125,
-0.0178375244140625,
0.013641357421875,
-0.052337646484375,
-0.0157623291015625,
-0.0367431640625,
0.03155517578125,
-0.01349639892578125,
-0.062286376953125,
0.045745849609375,
0.0308380126953125,
0.0235595703125,
0.0343017578125,
0.059967041015625,
0.007518768310546875,
0.041717529296875,
0.0665283203125,
-0.0125579833984375,
0.0770263671875,
-0.0197601318359375,
-0.013092041015625,
-0.059417724609375,
-0.0227203369140625,
-0.04833984375,
-0.01113128662109375,
-0.051422119140625,
-0.04888916015625,
0.0007977485656738281,
-0.0006055831909179688,
-0.025054931640625,
0.055908203125,
-0.040496826171875,
0.0029811859130859375,
0.040802001953125,
0.0270843505859375,
0.020599365234375,
-0.00606536865234375,
0.01910400390625,
0.00909423828125,
-0.04962158203125,
-0.03961181640625,
0.07623291015625,
0.04833984375,
0.0521240234375,
0.0116729736328125,
0.047576904296875,
0.0200958251953125,
0.041351318359375,
-0.06756591796875,
0.039703369140625,
0.0203704833984375,
-0.053131103515625,
-0.0312042236328125,
-0.06280517578125,
-0.08148193359375,
0.0288848876953125,
-0.01485443115234375,
-0.05633544921875,
0.01055908203125,
0.00600433349609375,
-0.009796142578125,
0.0207061767578125,
-0.053924560546875,
0.05975341796875,
-0.029449462890625,
-0.0195465087890625,
-0.003520965576171875,
-0.037506103515625,
0.04217529296875,
0.011260986328125,
0.00992584228515625,
-0.01488494873046875,
-0.0084381103515625,
0.06317138671875,
-0.05078125,
0.08453369140625,
-0.01335906982421875,
-0.0276336669921875,
0.0175933837890625,
-0.01131439208984375,
0.0217132568359375,
0.0012617111206054688,
0.0032024383544921875,
0.02972412109375,
0.00795745849609375,
-0.0416259765625,
-0.04205322265625,
0.045166015625,
-0.08148193359375,
-0.026275634765625,
-0.021240234375,
-0.02874755859375,
0.0079345703125,
0.01220703125,
0.0257568359375,
0.0209197998046875,
-0.01171112060546875,
0.0214996337890625,
0.04058837890625,
-0.029815673828125,
-0.002307891845703125,
0.034820556640625,
-0.0247039794921875,
-0.0341796875,
0.045013427734375,
-0.0023975372314453125,
0.01157379150390625,
0.04351806640625,
0.0151824951171875,
-0.00969696044921875,
-0.01424407958984375,
-0.01140594482421875,
0.02960205078125,
-0.03509521484375,
-0.017852783203125,
-0.060302734375,
-0.01403045654296875,
-0.034027099609375,
0.033599853515625,
-0.061767578125,
-0.0290985107421875,
-0.0216827392578125,
-0.0038928985595703125,
0.057220458984375,
0.0281829833984375,
0.020538330078125,
0.0634765625,
-0.043670654296875,
0.01538848876953125,
0.01435089111328125,
0.0250701904296875,
0.002407073974609375,
-0.0538330078125,
-0.0404052734375,
0.01245880126953125,
-0.021728515625,
-0.061431884765625,
0.037994384765625,
-0.007843017578125,
0.034912109375,
0.0284423828125,
0.0005769729614257812,
0.058990478515625,
-0.0162811279296875,
0.04315185546875,
0.01605224609375,
-0.03717041015625,
0.0299224853515625,
-0.0183868408203125,
0.02801513671875,
0.048919677734375,
0.0303192138671875,
-0.047271728515625,
-0.02239990234375,
-0.058929443359375,
-0.05633544921875,
0.034393310546875,
0.0242919921875,
0.0260162353515625,
-0.00042057037353515625,
0.03680419921875,
0.005229949951171875,
0.0221405029296875,
-0.06353759765625,
-0.03948974609375,
-0.006381988525390625,
-0.0242767333984375,
-0.0175933837890625,
-0.0192718505859375,
0.0019512176513671875,
-0.033416748046875,
0.052154541015625,
-0.00856781005859375,
0.03985595703125,
0.00556182861328125,
-0.001781463623046875,
-0.0048980712890625,
0.01056671142578125,
0.048126220703125,
0.022186279296875,
-0.03515625,
-0.0207061767578125,
0.01454925537109375,
-0.038482666015625,
-0.00890350341796875,
0.0121917724609375,
0.0023632049560546875,
0.00653076171875,
0.0263519287109375,
0.10894775390625,
0.0304718017578125,
-0.03631591796875,
0.0290679931640625,
-0.058380126953125,
-0.016082763671875,
-0.0302886962890625,
0.01374053955078125,
0.0105438232421875,
0.031890869140625,
0.01036834716796875,
-0.01024627685546875,
-0.009063720703125,
-0.051849365234375,
-0.01393890380859375,
0.0279541015625,
-0.0322265625,
-0.022430419921875,
0.046630859375,
0.01345062255859375,
-0.033966064453125,
0.0271453857421875,
0.002285003662109375,
-0.0267181396484375,
0.03082275390625,
0.00794219970703125,
0.06475830078125,
-0.0183868408203125,
0.00870513916015625,
0.03631591796875,
0.021026611328125,
-0.0117950439453125,
0.01363372802734375,
-0.018524169921875,
-0.0587158203125,
-0.0009317398071289062,
-0.051513671875,
-0.043243408203125,
0.019866943359375,
-0.049591064453125,
0.0333251953125,
-0.02581787109375,
-0.04119873046875,
-0.0296173095703125,
0.043304443359375,
-0.068603515625,
-0.0019435882568359375,
-0.0084381103515625,
0.06512451171875,
-0.053558349609375,
0.07318115234375,
0.047607421875,
-0.035614013671875,
-0.0667724609375,
-0.026275634765625,
-0.003780364990234375,
-0.062286376953125,
0.00971221923828125,
0.0026226043701171875,
0.0008625984191894531,
-0.00798797607421875,
-0.052825927734375,
-0.056243896484375,
0.10455322265625,
0.0270233154296875,
-0.041229248046875,
-0.01568603515625,
-0.0120849609375,
0.049163818359375,
-0.006908416748046875,
0.04364013671875,
0.0272216796875,
0.0153350830078125,
0.006046295166015625,
-0.09161376953125,
0.0019273757934570312,
-0.034698486328125,
-0.0027790069580078125,
-0.01416778564453125,
-0.0860595703125,
0.06634521484375,
0.003490447998046875,
-0.00197601318359375,
0.021026611328125,
0.0657958984375,
0.03759765625,
0.00952911376953125,
0.04071044921875,
0.027069091796875,
0.07305908203125,
0.00846099853515625,
0.08648681640625,
-0.01003265380859375,
0.017333984375,
0.088134765625,
0.0041351318359375,
0.0643310546875,
0.032470703125,
-0.0026264190673828125,
0.04290771484375,
0.05712890625,
0.019805908203125,
0.014068603515625,
-0.00009465217590332031,
0.004634857177734375,
0.001556396484375,
0.003093719482421875,
-0.035125732421875,
0.036407470703125,
0.0162353515625,
-0.01216888427734375,
0.00799560546875,
-0.008331298828125,
0.0265655517578125,
-0.0229034423828125,
-0.004497528076171875,
0.055694580078125,
0.0239410400390625,
-0.04144287109375,
0.0792236328125,
0.00963592529296875,
0.08319091796875,
-0.05743408203125,
0.01971435546875,
-0.036834716796875,
0.028228759765625,
-0.0021915435791015625,
-0.01435089111328125,
0.003185272216796875,
0.01094818115234375,
0.018280029296875,
-0.000045239925384521484,
0.035919189453125,
-0.02801513671875,
-0.024017333984375,
0.0279541015625,
0.041290283203125,
0.037994384765625,
-0.0005168914794921875,
-0.058258056640625,
0.03704833984375,
-0.007801055908203125,
-0.04461669921875,
0.01934814453125,
0.027435302734375,
-0.0138702392578125,
0.07586669921875,
0.037017822265625,
0.0106964111328125,
0.0015668869018554688,
0.0251312255859375,
0.06610107421875,
-0.035552978515625,
-0.032684326171875,
-0.054473876953125,
0.025421142578125,
-0.004764556884765625,
-0.0362548828125,
0.06976318359375,
0.033782958984375,
0.0531005859375,
0.013092041015625,
0.0279083251953125,
-0.00534820556640625,
0.0200653076171875,
-0.03643798828125,
0.055908203125,
-0.06524658203125,
0.01552581787109375,
-0.027923583984375,
-0.06536865234375,
-0.01123046875,
0.043365478515625,
-0.01235198974609375,
0.0144805908203125,
0.03497314453125,
0.05615234375,
0.0110931396484375,
-0.021148681640625,
0.0182647705078125,
0.0232696533203125,
0.03460693359375,
0.036102294921875,
0.0400390625,
-0.06134033203125,
0.03741455078125,
-0.0128326416015625,
-0.022796630859375,
-0.037384033203125,
-0.04876708984375,
-0.081298828125,
-0.04827880859375,
-0.0164337158203125,
-0.023956298828125,
0.0108642578125,
0.076416015625,
0.052734375,
-0.021087646484375,
-0.045562744140625,
0.0074462890625,
-0.00235748291015625,
-0.0201873779296875,
-0.01654052734375,
0.01445770263671875,
-0.0011701583862304688,
-0.06781005859375,
0.0133056640625,
-0.024444580078125,
0.0165252685546875,
-0.02191162109375,
-0.03125,
-0.023406982421875,
0.00705718994140625,
0.031524658203125,
0.040740966796875,
-0.033111572265625,
-0.00001329183578491211,
-0.00809478759765625,
-0.033203125,
0.014404296875,
0.02362060546875,
-0.0540771484375,
0.004077911376953125,
0.0297698974609375,
0.01143646240234375,
0.05084228515625,
-0.0034942626953125,
0.032623291015625,
-0.054534912109375,
0.043609619140625,
-0.00665283203125,
0.0303955078125,
0.040374755859375,
-0.024444580078125,
0.0309906005859375,
-0.002483367919921875,
-0.0268402099609375,
-0.073974609375,
-0.014190673828125,
-0.0772705078125,
-0.013519287109375,
0.091796875,
0.0211944580078125,
-0.0443115234375,
0.01409149169921875,
-0.04150390625,
0.0523681640625,
-0.0266265869140625,
0.055908203125,
0.03472900390625,
0.01910400390625,
-0.039825439453125,
-0.054290771484375,
0.038848876953125,
0.0209197998046875,
-0.0650634765625,
0.003509521484375,
0.0206298828125,
0.0330810546875,
0.0002295970916748047,
0.09429931640625,
-0.004405975341796875,
0.0010128021240234375,
-0.012481689453125,
0.03973388671875,
-0.0214691162109375,
-0.02801513671875,
-0.01776123046875,
-0.027587890625,
0.01220703125,
-0.0243377685546875
]
] |
Qwen/Qwen-7B | 2023-11-05T03:27:25.000Z | [
"transformers",
"safetensors",
"qwen",
"text-generation",
"custom_code",
"zh",
"en",
"arxiv:2309.16609",
"has_space",
"region:us"
] | text-generation | Qwen | null | null | Qwen/Qwen-7B | 288 | 12,017 | transformers | 2023-08-03T02:51:18 | ---
language:
- zh
- en
tags:
- qwen
pipeline_tag: text-generation
inference: false
---
# Qwen-7B
<p align="center">
<img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/logo_qwen.jpg" width="400"/>
</p>
<br>
<p align="center">
🤗 <a href="https://huggingface.co/Qwen">Hugging Face</a>   |   🤖 <a href="https://modelscope.cn/organization/qwen">ModelScope</a>   |    📑 <a href="https://arxiv.org/abs/2309.16609">Paper</a>   |   🖥️ <a href="https://modelscope.cn/studios/qwen/Qwen-7B-Chat-Demo/summary">Demo</a>
<br>
<a href="https://github.com/QwenLM/Qwen/blob/main/assets/wechat.png">WeChat (微信)</a>   |    DingTalk (钉钉)    |   <a href="https://discord.gg/z3GAxXZ9Ce">Discord</a>  
</p>
<br><br>
## 介绍 (Introduction)
**通义千问-7B(Qwen-7B)**是阿里云研发的通义千问大模型系列的70亿参数规模的模型。Qwen-7B是基于Transformer的大语言模型, 在超大规模的预训练数据上进行训练得到。预训练数据类型多样,覆盖广泛,包括大量网络文本、专业书籍、代码等。同时,在Qwen-7B的基础上,我们使用对齐机制打造了基于大语言模型的AI助手Qwen-7B-Chat。相较于最初开源的Qwen-7B模型,我们现已将预训练模型和Chat模型更新到效果更优的版本。本仓库为Qwen-7B预训练模型的仓库。
通义千问-7B(Qwen-7B)主要有以下特点:
1. **大规模高质量训练语料**:使用超过2.4万亿tokens的数据进行预训练,包含高质量中、英、多语言、代码、数学等数据,涵盖通用及专业领域的训练语料。通过大量对比实验对预训练语料分布进行了优化。
2. **强大的性能**:Qwen-7B在多个中英文下游评测任务上(涵盖常识推理、代码、数学、翻译等),效果显著超越现有的相近规模开源模型,甚至在部分指标上相比更大尺寸模型也有较强竞争力。具体评测结果请详见下文。
3. **覆盖更全面的词表**:相比目前以中英词表为主的开源模型,Qwen-7B使用了约15万大小的词表。该词表对多语言更加友好,方便用户在不扩展词表的情况下对部分语种进行能力增强和扩展。
如果您想了解更多关于通义千问7B开源模型的细节,我们建议您参阅[GitHub代码库](https://github.com/QwenLM/Qwen)。
**Qwen-7B** is the 7B-parameter version of the large language model series, Qwen (abbr. Tongyi Qianwen), proposed by Alibaba Cloud. Qwen-7B is a Transformer-based large language model, pretrained on a large volume of data, including web texts, books, code, etc. Additionally, based on the pretrained Qwen-7B, we release Qwen-7B-Chat, a large-model-based AI assistant trained with alignment techniques. We have now updated both our pretrained and chat models for better performance. This repository is for the Qwen-7B base language model.
The features of Qwen-7B include:
1. **Large-scale high-quality training corpora**: It is pretrained on over 2.4 trillion tokens, including Chinese, English, multilingual texts, code, and mathematics, covering general and professional fields. The distribution of the pre-training corpus has been optimized through a large number of ablation experiments.
2. **Competitive performance**: It significantly surpasses existing open-source models of similar scale on multiple Chinese and English downstream evaluation tasks (including commonsense, reasoning, code, mathematics, etc.), and even surpasses some larger-scale models in several benchmarks. See below for specific evaluation results.
3. **More comprehensive vocabulary coverage**: Compared with other open-source models based on Chinese and English vocabularies, Qwen-7B uses a vocabulary of over 150K tokens. This vocabulary is more friendly to multiple languages, enabling users to directly further enhance the capability for certain languages without expanding the vocabulary.
For more details about Qwen, please refer to the [GitHub](https://github.com/QwenLM/Qwen) code repository.
<br>
## 要求(Requirements)
* python 3.8及以上版本
* pytorch 1.12及以上版本,推荐2.0及以上版本
* 建议使用CUDA 11.4及以上(GPU用户、flash-attention用户等需考虑此选项)
* python 3.8 and above
* pytorch 1.12 and above; version 2.0 and above is recommended
* CUDA 11.4 and above are recommended (this is for GPU users, flash-attention users, etc.)
<br>
## 依赖项 (Dependency)
运行Qwen-7B,请确保满足上述要求,再执行以下pip命令安装依赖库
To run Qwen-7B, please make sure you meet the above requirements, and then execute the following pip commands to install the dependent libraries.
```bash
pip install transformers==4.32.0 accelerate tiktoken einops scipy transformers_stream_generator==0.0.4 peft deepspeed
```
另外,推荐安装`flash-attention`库(**当前已支持flash attention 2**),以实现更高的效率和更低的显存占用。
In addition, it is recommended to install the `flash-attention` library (**flash attention 2 is now supported**) for higher efficiency and lower memory usage.
```bash
git clone https://github.com/Dao-AILab/flash-attention
cd flash-attention && pip install .
# 下方安装可选,安装可能比较缓慢。
# The two installs below are optional and may be slow to build.
# pip install csrc/layer_norm
# pip install csrc/rotary
```
<br>
## 快速使用(Quickstart)
您可以通过以下代码轻松调用:
You can easily call the model with the following code:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig
# Note: The default behavior now has injection attack prevention off.
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-7B", trust_remote_code=True)
# use bf16
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-7B", device_map="auto", trust_remote_code=True, bf16=True).eval()
# use fp16
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-7B", device_map="auto", trust_remote_code=True, fp16=True).eval()
# use cpu only
# model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-7B", device_map="cpu", trust_remote_code=True).eval()
# use auto mode, automatically select precision based on the device.
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-7B", device_map="auto", trust_remote_code=True).eval()
# Specify hyperparameters for generation. But if you use transformers>=4.32.0, there is no need to do this.
# model.generation_config = GenerationConfig.from_pretrained("Qwen/Qwen-7B", trust_remote_code=True)
inputs = tokenizer('蒙古国的首都是乌兰巴托(Ulaanbaatar)\n冰岛的首都是雷克雅未克(Reykjavik)\n埃塞俄比亚的首都是', return_tensors='pt')
inputs = inputs.to(model.device)
pred = model.generate(**inputs)
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
# 蒙古国的首都是乌兰巴托(Ulaanbaatar)\n冰岛的首都是雷克雅未克(Reykjavik)\n埃塞俄比亚的首都是亚的斯亚贝巴(Addis Ababa)...
```
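Decoding can also be tuned through the standard Hugging Face `generate` keyword arguments. Below is a minimal sketch that reuses the `tokenizer` and `model` objects from the snippet above; the sampling values are illustrative assumptions, not official recommendations.
```python
# A hedged sketch: customizing decoding with standard `generate` arguments.
# `tokenizer` and `model` come from the quickstart snippet above.
inputs = tokenizer('请用一句话介绍大语言模型。', return_tensors='pt').to(model.device)
pred = model.generate(
    **inputs,
    max_new_tokens=64,  # cap on the number of newly generated tokens
    do_sample=True,     # sample instead of greedy decoding
    top_p=0.8,          # nucleus sampling threshold (assumed value)
    temperature=0.7,    # softmax temperature (assumed value)
)
print(tokenizer.decode(pred[0], skip_special_tokens=True))
```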
关于更多的使用说明,请参考我们的[GitHub repo](https://github.com/QwenLM/Qwen)获取更多信息。
For more usage instructions, please refer to our [GitHub repo](https://github.com/QwenLM/Qwen).
<br>
## Tokenizer
> 注:作为术语的“tokenization”在中文中尚无共识的概念对应,本文档采用英文表达以利说明。
> Note: As a term, “tokenization” has no widely agreed-upon Chinese equivalent, so this document uses the English word for clarity.
基于tiktoken的分词器有别于其他分词器,比如sentencepiece分词器。尤其在微调阶段,需要特别注意特殊token的使用。关于tokenizer的更多信息,以及微调时涉及的相关使用,请参阅[文档](https://github.com/QwenLM/Qwen/blob/main/tokenization_note_zh.md)。
Our tiktoken-based tokenizer differs from other tokenizers such as the SentencePiece tokenizer. You need to pay attention to special tokens, especially during fine-tuning. For more detailed information on the tokenizer and its use in fine-tuning, please refer to the [documentation](https://github.com/QwenLM/Qwen/blob/main/tokenization_note.md).
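As a quick sanity check of the points above, the tokenizer can be loaded and inspected on its own. A minimal sketch using only the public `AutoTokenizer` API; the sample sentence is arbitrary.
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-7B", trust_remote_code=True)

# The vocabulary should be on the order of 150K entries, as described above.
print(len(tokenizer))

# Mixed Chinese/English text round-trips through encode/decode.
ids = tokenizer("Qwen-7B 支持中英双语。")["input_ids"]
print(ids)
print(tokenizer.decode(ids))
```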
<br>
## 模型细节 (Model)
Qwen-7B模型规模基本情况如下所示。
The details of the model architecture of Qwen-7B are listed as follows.
| Hyperparameter | Value |
|:----------------|:-------|
| n_layers | 32 |
| n_heads | 32 |
| d_model | 4096 |
| vocab size | 151851 |
| sequence length | 8192 |
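The values in this table are recorded in the model's config and can be checked without downloading the weights. A minimal sketch; since the attribute names in the custom config are not guaranteed, the whole config is printed rather than individual fields.
```python
from transformers import AutoConfig

# Dumps hidden size, layer/head counts, vocabulary size, sequence length, etc.
config = AutoConfig.from_pretrained("Qwen/Qwen-7B", trust_remote_code=True)
print(config)
```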
在位置编码、FFN激活函数和normalization的实现方式上,我们也采用了目前最流行的做法,
即RoPE相对位置编码、SwiGLU激活函数、RMSNorm(可选安装flash-attention加速)。
在分词器方面,相比目前主流开源模型以中英词表为主,Qwen-7B使用了超过15万token大小的词表。 该词表在GPT-4使用的BPE词表`cl100k_base`基础上,对中文、多语言进行了优化,在对中、英、代码数据的高效编解码的基础上,对部分多语言更加友好,方便用户在不扩展词表的情况下对部分语种进行能力增强。
词表对数字按单个数字位切分。调用较为高效的[tiktoken分词库](https://github.com/openai/tiktoken)进行分词。
我们从部分语种各随机抽取100万个文档语料,以对比不同模型的编码压缩率(以支持100语种的XLM-R为基准值1,越低越好),具体性能见图。
可以看到Qwen-7B在保持中英代码高效解码的前提下,对部分使用人群较多的语种(泰语th、希伯来语he、阿拉伯语ar、韩语ko、越南语vi、日语ja、土耳其语tr、印尼语id、波兰语pl、俄语ru、荷兰语nl、葡萄牙语pt、意大利语it、德语de、西班牙语es、法语fr等)上也实现了较高的压缩率,使得模型在这些语种上也具备较强的可扩展性和较高的训练和推理效率。
在预训练数据方面,去重及过滤后的语料超过2.4T tokens,囊括全网文本、百科、书籍、代码、数学及各个领域垂类。
<p align="center">
<img src="assets/tokenizer.png" style="width: 1200px"/>
</p>
For position encoding, FFN activation function, and normalization methods, we adopt the prevalent practices, i.e., RoPE relative position encoding, SwiGLU for activation function, and RMSNorm for normalization (optional installation of flash-attention for acceleration).
For tokenization, compared to the current mainstream open-source models based on Chinese and English vocabularies, Qwen-7B uses a vocabulary of over 150K tokens. It first ensures efficient encoding of Chinese, English, and code data, and is also friendlier to many other languages, enabling users to directly enhance the capability for some languages without expanding the vocabulary. It segments numbers into single digits and uses the [tiktoken](https://github.com/openai/tiktoken) library for efficient tokenization.
We randomly sampled 1 million documents per language and compared the encoding compression rates of different models (with XLM-R, which supports 100 languages, as the baseline value of 1; lower is better). The specific results are shown in the figure above.
As can be seen, while preserving efficient encoding of Chinese, English, and code, Qwen-7B also achieves a high compression rate for many widely used languages (such as th, he, ar, ko, vi, ja, tr, id, pl, ru, nl, pt, it, de, es, fr, etc.), equipping the model with strong scalability as well as high training and inference efficiency in these languages.
The pretraining corpus exceeds 2.4T tokens after deduplication and filtering, encompassing web text, encyclopedias, books, code, mathematics, and various vertical domains.
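For any given text, the same compression idea can be approximated with a simple bytes-per-token measurement. A rough sketch, not the exact XLM-R-normalized metric used for the figure; the sample strings are arbitrary.
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-7B", trust_remote_code=True)

samples = {
    "en": "Large language models are trained on web text, books, and code.",
    "zh": "大语言模型在网络文本、书籍和代码上进行训练。",
}
for lang, text in samples.items():
    n_tokens = len(tokenizer(text)["input_ids"])
    n_bytes = len(text.encode("utf-8"))
    # More bytes per token means the text is compressed into fewer tokens.
    print(f"{lang}: {n_bytes / n_tokens:.2f} bytes per token")
```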
<br>
## 评测效果(Evaluation)
我们选取了MMLU,C-Eval,GSM8K, MATH, HumanEval, MBPP, BBH, CMMLU等目前较流行的benchmark,对模型的中英知识能力、翻译、数学推理、代码等能力进行综合评测。从下列结果可以看到Qwen模型在所有benchmark上均取得了同级别开源模型中的最优表现。
We selected MMLU, C-Eval, GSM8K, MATH, HumanEval, MBPP, BBH, and CMMLU, which are currently popular benchmarks, to comprehensively evaluate the model's Chinese and English knowledge, translation, mathematical reasoning, coding, and other capabilities. The results below show that the Qwen models outperform similarly sized open-source models on all benchmarks.
| Model | MMLU | C-Eval | GSM8K | MATH | HumanEval | MBPP | BBH | CMMLU |
|:-------------------|:--------:|:--------:|:--------:|:--------:|:---------:|:--------:|:--------:|:--------:|
| | 5-shot | 5-shot | 8-shot | 4-shot | 0-shot | 3-shot | 3-shot | 5-shot |
| LLaMA2-7B | 46.8 | 32.5 | 16.7 | 3.3 | 12.8 | 20.8 | 38.2 | 31.8 |
| LLaMA2-13B | 55.0 | 41.4 | 29.6 | 5.0 | 18.9 | 30.3 | 45.6 | 38.4 |
| LLaMA2-34B | 62.6 | - | 42.2 | 6.2 | 22.6 | 33.0 | 44.1 | - |
| ChatGLM2-6B | 47.9 | 51.7 | 32.4 | 6.5 | - | - | 33.7 | - |
| InternLM-7B | 51.0 | 53.4 | 31.2 | 6.3 | 10.4 | 14.0 | 37.0 | 51.8 |
| InternLM-20B | 62.1 | 58.8 | 52.6 | 7.9 | 25.6 | 35.6 | 52.5 | 59.0 |
| Baichuan2-7B | 54.7 | 56.3 | 24.6 | 5.6 | 18.3 | 24.2 | 41.6 | 57.1 |
| Baichuan2-13B | 59.5 | 59.0 | 52.8 | 10.1 | 17.1 | 30.2 | 49.0 | 62.0 |
| Qwen-7B (original) | 56.7 | 59.6 | 51.6 | - | 24.4 | 31.2 | 40.6 | 58.8 |
| **Qwen-7B** | 58.2 | 63.5 | 51.7 | 11.6 | 29.9 | 31.6 | 45.0 | 62.2 |
| **Qwen-14B** | **66.3** | **72.1** | **61.3** | **24.8** | **32.3** | **40.8** | **53.4** | **71.0** |
### 长序列评测(Long-Context Evaluation)
我们引入NTK插值,LogN注意力缩放,窗口注意力等技巧,将Qwen-7B (original)和14B模型的上下文长度从2K扩展到8K以上,将Qwen-7B从8K扩到32K。在arXiv数据上使用PPL指标测试Qwen-7B和Qwen-14B在不同长度下的表现,结果如下:
**(若要启用NTK和LogN注意力缩放,请将config.json里的`use_dynamic_ntk`和`use_logn_attn`设置为true)**
We introduce NTK-aware interpolation, LogN attention scaling, window attention, and other techniques to extend the context length of Qwen-7B (original) and Qwen-14B from 2K to over 8K tokens, and that of Qwen-7B from 8K to 32K tokens. We conduct language modeling experiments on the arXiv dataset with the PPL metric at different sequence lengths. Results are demonstrated below:
**(To use NTK interpolation and LogN scaling, please set `use_dynamic_ntk` and `use_logn_attn` to true in config.json.)**
<table>
<tr>
<th rowspan="2">Model</th><th colspan="6" align="center">Sequence Length</th>
</tr>
<tr>
<th align="center">1024</th><th align="center">2048</th><th align="center">4096</th><th align="center">8192</th><th align="center">16384</th><th align="center">32768</th>
</tr>
<tr>
<td>Qwen-7B (original)</td><td align="center">4.23</td><td align="center">3.78</td><td align="center">39.35</td><td align="center">469.81</td><td align="center">2645.09</td><td align="center">-</td>
</tr>
<tr>
<td>+ dynamic_ntk</td><td align="center">4.23</td><td align="center">3.78</td><td align="center">3.59</td><td align="center">3.66</td><td align="center">5.71</td><td align="center">-</td>
</tr>
<tr>
<td>+ dynamic_ntk + logn</td><td align="center">4.23</td><td align="center">3.78</td><td align="center">3.58</td><td align="center">3.56</td><td align="center">4.62</td><td align="center">-</td>
</tr>
<tr>
<td>+ dynamic_ntk + logn + window_attn</td><td align="center">4.23</td><td align="center">3.78</td><td align="center">3.58</td><td align="center">3.49</td><td align="center">4.32</td><td align="center">-</td>
</tr>
<tr>
<td>Qwen-7B</td><td align="center"><b>4.23</b></td><td align="center"><b>3.81</b></td><td align="center"><b>3.52</b></td><td align="center"><b>3.31</b></td><td align="center">7.27</td><td align="center">181.49</td>
</tr>
<tr>
<td>+ dynamic_ntk + logn + window_attn</td><td align="center"><b>4.23</b></td><td align="center"><b>3.81</b></td><td align="center"><b>3.52</b></td><td align="center"><b>3.33</b></td><td align="center"><b>3.22</b></td><td align="center"><b>3.17</b></td>
</tr>
<tr>
<td>Qwen-14B</td><td align="center"><b>-</b></td><td align="center"><b>3.46</b></td><td align="center">22.79</td><td align="center">334.65</td><td align="center">3168.35</td><td align="center">-</td>
</tr>
<tr>
<td>+ dynamic_ntk + logn + window_attn</td><td align="center"><b>-</b></td><td align="center"><b>3.46</b></td><td align="center"><b>3.29</b></td><td align="center"><b>3.18</b></td><td align="center">3.42</td><td align="center">-</td>
</tr>
</table>
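As an alternative to editing `config.json` by hand, the same flags can be set programmatically before loading. A minimal sketch, assuming the model's custom code reads these attributes at load time (the flag names follow the note above).
```python
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("Qwen/Qwen-7B", trust_remote_code=True)
config.use_dynamic_ntk = True  # NTK-aware interpolation
config.use_logn_attn = True    # LogN attention scaling
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-7B", config=config, device_map="auto", trust_remote_code=True
).eval()
```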
## 评测复现(Reproduction)
我们提供了评测脚本,方便大家复现模型效果,详见[链接](https://github.com/QwenLM/Qwen/tree/main/eval)。提示:由于硬件和框架造成的舍入误差,复现结果如有小幅波动属于正常现象。
We provide evaluation scripts so that you can reproduce the performance of our model; see this [link](https://github.com/QwenLM/Qwen/tree/main/eval) for details. Note: due to rounding errors caused by hardware and frameworks, small fluctuations in the reproduced results are normal.
<br>
## FAQ
如遇到问题,敬请查阅[FAQ](https://github.com/QwenLM/Qwen/blob/main/FAQ_zh.md)以及issue区,如仍无法解决再提交issue。
If you encounter problems, please check the [FAQ](https://github.com/QwenLM/Qwen/blob/main/FAQ.md) and existing issues for a solution before opening a new issue.
<br>
## 引用 (Citation)
如果你觉得我们的工作对你有帮助,欢迎引用!
If you find our work helpful, feel free to cite it.
```
@article{qwen,
title={Qwen Technical Report},
author={Jinze Bai and Shuai Bai and Yunfei Chu and Zeyu Cui and Kai Dang and Xiaodong Deng and Yang Fan and Wenbin Ge and Yu Han and Fei Huang and Binyuan Hui and Luo Ji and Mei Li and Junyang Lin and Runji Lin and Dayiheng Liu and Gao Liu and Chengqiang Lu and Keming Lu and Jianxin Ma and Rui Men and Xingzhang Ren and Xuancheng Ren and Chuanqi Tan and Sinan Tan and Jianhong Tu and Peng Wang and Shijie Wang and Wei Wang and Shengguang Wu and Benfeng Xu and Jin Xu and An Yang and Hao Yang and Jian Yang and Shusheng Yang and Yang Yao and Bowen Yu and Hongyi Yuan and Zheng Yuan and Jianwei Zhang and Xingxuan Zhang and Yichang Zhang and Zhenru Zhang and Chang Zhou and Jingren Zhou and Xiaohuan Zhou and Tianhang Zhu},
journal={arXiv preprint arXiv:2309.16609},
year={2023}
}
```
<br>
## 使用协议(License Agreement)
我们的代码和模型权重对学术研究完全开放,并支持商用。请查看[LICENSE](https://github.com/QwenLM/Qwen/blob/main/LICENSE)了解具体的开源协议细节。如需商用,请填写[问卷](https://dashscope.console.aliyun.com/openModelApply/qianwen)申请。
Our code and checkpoints are fully open for research purposes and also allow commercial use. Check [LICENSE](https://github.com/QwenLM/Qwen/blob/main/LICENSE) for more details about the license. For commercial use, please fill out the [form](https://dashscope.console.aliyun.com/openModelApply/qianwen) to apply.
<br>
## 联系我们(Contact Us)
如果你想给我们的研发团队和产品团队留言,欢迎加入我们的微信群、钉钉群以及Discord!同时,也欢迎通过邮件(qianwen_opensource@alibabacloud.com)联系我们。
If you would like to leave a message for our research or product team, feel free to join our Discord or WeChat groups! You can also reach us by email at qianwen_opensource@alibabacloud.com.
| 16,546 | [
[
-0.03521728515625,
-0.037933349609375,
0.005229949951171875,
0.01187896728515625,
-0.026275634765625,
-0.011962890625,
-0.01097869873046875,
-0.036376953125,
-0.0003094673156738281,
0.0207061767578125,
-0.03472900390625,
-0.04827880859375,
-0.032440185546875,
0.0018396377563476562,
-0.0158233642578125,
0.048858642578125,
0.00928497314453125,
0.00023043155670166016,
0.0291595458984375,
-0.01432037353515625,
-0.031494140625,
-0.0228729248046875,
-0.04644775390625,
-0.0108184814453125,
0.0245513916015625,
0.01081085205078125,
0.06182861328125,
0.05010986328125,
0.04351806640625,
0.0304107666015625,
-0.01320648193359375,
0.01035308837890625,
-0.031829833984375,
-0.017547607421875,
0.00789642333984375,
-0.044830322265625,
-0.049346923828125,
-0.005794525146484375,
0.05364990234375,
0.0233917236328125,
0.00952911376953125,
0.0285797119140625,
0.0139312744140625,
0.03875732421875,
-0.0200347900390625,
0.011993408203125,
-0.027618408203125,
-0.004085540771484375,
-0.021575927734375,
0.004619598388671875,
-0.007328033447265625,
-0.035919189453125,
0.011444091796875,
-0.05999755859375,
0.01445770263671875,
0.02606201171875,
0.09765625,
-0.005237579345703125,
-0.036041259765625,
-0.00021922588348388672,
-0.023040771484375,
0.07965087890625,
-0.0860595703125,
0.0181732177734375,
0.0175628662109375,
0.0283660888671875,
-0.0185394287109375,
-0.0828857421875,
-0.055908203125,
-0.01800537109375,
-0.020751953125,
0.017730712890625,
-0.01314544677734375,
0.013214111328125,
0.0391845703125,
0.0289764404296875,
-0.0472412109375,
-0.00614166259765625,
-0.03973388671875,
-0.0198211669921875,
0.040557861328125,
0.0019235610961914062,
0.044830322265625,
-0.031341552734375,
-0.0303192138671875,
-0.0178070068359375,
-0.03778076171875,
0.015594482421875,
0.017608642578125,
0.00621795654296875,
-0.034027099609375,
0.02117919921875,
-0.016510009765625,
0.027923583984375,
0.0264434814453125,
-0.028045654296875,
0.036285400390625,
-0.0204620361328125,
-0.029571533203125,
-0.005275726318359375,
0.0865478515625,
0.04437255859375,
-0.0074462890625,
0.0029163360595703125,
-0.01131439208984375,
-0.00852203369140625,
-0.01435089111328125,
-0.07470703125,
-0.0245819091796875,
0.05218505859375,
-0.0618896484375,
-0.03533935546875,
0.0089111328125,
-0.0323486328125,
0.003826141357421875,
0.0011796951293945312,
0.054412841796875,
-0.049896240234375,
-0.04742431640625,
0.00626373291015625,
-0.0203857421875,
0.0169219970703125,
0.01922607421875,
-0.0538330078125,
0.0005459785461425781,
0.0240631103515625,
0.059906005859375,
0.006496429443359375,
-0.0382080078125,
-0.0175628662109375,
0.005496978759765625,
-0.01776123046875,
0.02606201171875,
0.011444091796875,
-0.028900146484375,
-0.0038471221923828125,
0.01837158203125,
-0.004871368408203125,
-0.0380859375,
0.043609619140625,
-0.04248046875,
0.036895751953125,
-0.0199432373046875,
-0.034698486328125,
-0.0343017578125,
0.01418304443359375,
-0.050750732421875,
0.07952880859375,
0.01558685302734375,
-0.08160400390625,
0.0008692741394042969,
-0.051422119140625,
-0.017547607421875,
0.009246826171875,
0.0086822509765625,
-0.036346435546875,
-0.0179901123046875,
0.014434814453125,
0.02313232421875,
-0.027984619140625,
0.0184326171875,
-0.01325225830078125,
-0.035003662109375,
0.02484130859375,
-0.038177490234375,
0.10040283203125,
0.0152130126953125,
-0.052001953125,
0.036865234375,
-0.061431884765625,
0.0168609619140625,
0.01953125,
-0.0128326416015625,
-0.0013933181762695312,
-0.0159759521484375,
0.016632080078125,
0.0302276611328125,
0.034423828125,
-0.037811279296875,
0.01215362548828125,
-0.061309814453125,
0.052001953125,
0.052032470703125,
-0.00957489013671875,
0.038787841796875,
-0.0406494140625,
0.030517578125,
0.02099609375,
0.031219482421875,
-0.0144500732421875,
-0.0307464599609375,
-0.06854248046875,
-0.0007791519165039062,
0.03857421875,
0.035247802734375,
-0.076904296875,
0.04608154296875,
-0.0033111572265625,
-0.04522705078125,
-0.06121826171875,
-0.00530242919921875,
0.041259765625,
0.0355224609375,
0.043365478515625,
0.005290985107421875,
-0.04144287109375,
-0.05224609375,
0.00308990478515625,
-0.0139923095703125,
-0.00949859619140625,
0.0027751922607421875,
0.0321044921875,
-0.00771331787109375,
0.052642822265625,
-0.0275726318359375,
-0.0197601318359375,
-0.0304107666015625,
-0.008636474609375,
0.032379150390625,
0.0552978515625,
0.06182861328125,
-0.06549072265625,
-0.040008544921875,
0.00150299072265625,
-0.058349609375,
0.005603790283203125,
-0.0066986083984375,
-0.041107177734375,
0.00940704345703125,
0.01092529296875,
-0.053070068359375,
0.0290985107421875,
0.0440673828125,
-0.027679443359375,
0.06695556640625,
-0.004161834716796875,
0.0118560791015625,
-0.107177734375,
0.007843017578125,
-0.009521484375,
-0.005950927734375,
-0.037872314453125,
0.0136260986328125,
0.0205841064453125,
0.021514892578125,
-0.052581787109375,
0.062164306640625,
-0.041290283203125,
0.0194854736328125,
-0.0138397216796875,
0.0160980224609375,
0.01483154296875,
0.0462646484375,
-0.0213165283203125,
0.0443115234375,
0.05035400390625,
-0.048187255859375,
0.03936767578125,
0.025848388671875,
-0.0218658447265625,
0.005767822265625,
-0.060150146484375,
-0.006580352783203125,
0.01114654541015625,
0.00719451904296875,
-0.06671142578125,
-0.0063018798828125,
0.034698486328125,
-0.053009033203125,
0.018341064453125,
-0.0120086669921875,
-0.0174407958984375,
-0.047515869140625,
-0.039886474609375,
0.0191650390625,
0.0526123046875,
-0.04534912109375,
0.044830322265625,
0.007579803466796875,
0.0120849609375,
-0.0489501953125,
-0.0406494140625,
-0.0096282958984375,
-0.020843505859375,
-0.059234619140625,
0.05419921875,
-0.0105743408203125,
-0.0024127960205078125,
0.0016269683837890625,
0.0075225830078125,
0.0122833251953125,
0.0012636184692382812,
0.01476287841796875,
0.037261962890625,
-0.02679443359375,
-0.00490570068359375,
0.01360321044921875,
-0.00649261474609375,
-0.002140045166015625,
-0.035247802734375,
0.041839599609375,
-0.00345611572265625,
-0.01175689697265625,
-0.0633544921875,
0.0132598876953125,
0.041015625,
-0.02374267578125,
0.051300048828125,
0.08154296875,
-0.0233001708984375,
-0.00165557861328125,
-0.036865234375,
-0.00875091552734375,
-0.045440673828125,
0.032989501953125,
-0.032806396484375,
-0.059661865234375,
0.049163818359375,
0.00850677490234375,
0.036376953125,
0.06402587890625,
0.0404052734375,
-0.008453369140625,
0.0860595703125,
0.034515380859375,
-0.00820159912109375,
0.04913330078125,
-0.05450439453125,
0.0038509368896484375,
-0.064208984375,
-0.00006681680679321289,
-0.021026611328125,
-0.027496337890625,
-0.059814453125,
-0.0183563232421875,
0.0240631103515625,
0.0306549072265625,
-0.037933349609375,
0.030029296875,
-0.04071044921875,
-0.0081939697265625,
0.06396484375,
0.00499725341796875,
0.004421234130859375,
-0.020751953125,
-0.01555633544921875,
0.01275634765625,
-0.060821533203125,
-0.038330078125,
0.06787109375,
0.035003662109375,
0.028961181640625,
0.01300048828125,
0.05072021484375,
0.002887725830078125,
0.0222625732421875,
-0.04901123046875,
0.033233642578125,
0.0024089813232421875,
-0.037261962890625,
-0.0313720703125,
-0.03533935546875,
-0.0706787109375,
0.03912353515625,
-0.00965118408203125,
-0.055328369140625,
0.023193359375,
0.01509857177734375,
-0.0423583984375,
0.029815673828125,
-0.057952880859375,
0.0687255859375,
-0.0205841064453125,
-0.03155517578125,
0.004070281982421875,
-0.0543212890625,
0.0311737060546875,
0.0321044921875,
0.0189971923828125,
-0.0125732421875,
0.02587890625,
0.07366943359375,
-0.04229736328125,
0.05364990234375,
-0.021728515625,
0.002208709716796875,
0.0455322265625,
-0.00833892822265625,
0.031768798828125,
0.002960205078125,
0.0004940032958984375,
0.0186004638671875,
0.027923583984375,
-0.03839111328125,
-0.0328369140625,
0.047607421875,
-0.06927490234375,
-0.051727294921875,
-0.022857666015625,
-0.03302001953125,
0.0097808837890625,
0.03741455078125,
0.045166015625,
0.0416259765625,
0.00930023193359375,
0.01507568359375,
0.0287933349609375,
-0.0307769775390625,
0.04736328125,
0.0218505859375,
-0.034515380859375,
-0.04315185546875,
0.07183837890625,
0.021209716796875,
0.0200347900390625,
0.0214691162109375,
0.0191650390625,
-0.026702880859375,
-0.039642333984375,
-0.05694580078125,
0.017578125,
-0.034149169921875,
-0.034149169921875,
-0.04705810546875,
-0.0240631103515625,
-0.054534912109375,
0.006626129150390625,
-0.01256561279296875,
-0.0272979736328125,
-0.027130126953125,
-0.00958251953125,
0.03436279296875,
0.027252197265625,
0.0034637451171875,
0.032989501953125,
-0.080810546875,
0.0338134765625,
0.00847625732421875,
0.00785064697265625,
0.03070068359375,
-0.048858642578125,
-0.041595458984375,
0.0274505615234375,
-0.034515380859375,
-0.05975341796875,
0.04736328125,
-0.00396728515625,
0.0391845703125,
0.039642333984375,
0.018707275390625,
0.03460693359375,
-0.015869140625,
0.06341552734375,
0.029571533203125,
-0.07208251953125,
0.025848388671875,
-0.03411865234375,
0.017669677734375,
0.00647735595703125,
0.0172271728515625,
-0.03594970703125,
-0.01788330078125,
-0.05224609375,
-0.06634521484375,
0.05859375,
0.022979736328125,
0.005157470703125,
0.0034122467041015625,
0.0122528076171875,
-0.0203857421875,
0.0140380859375,
-0.04779052734375,
-0.0391845703125,
-0.032257080078125,
-0.0277099609375,
0.026885986328125,
-0.0138092041015625,
0.0035915374755859375,
-0.025909423828125,
0.053131103515625,
-0.0002951622009277344,
0.032989501953125,
0.01776123046875,
-0.01131439208984375,
0.0015859603881835938,
0.0033779144287109375,
0.03173828125,
0.037109375,
-0.017822265625,
-0.00543975830078125,
0.015655517578125,
-0.05596923828125,
-0.0022296905517578125,
0.007381439208984375,
-0.030120849609375,
0.0083160400390625,
0.024169921875,
0.07208251953125,
0.0101165771484375,
-0.0278167724609375,
0.02850341796875,
-0.003627777099609375,
-0.033843994140625,
-0.0115814208984375,
0.0146331787109375,
0.0265655517578125,
0.036468505859375,
0.0384521484375,
-0.0183258056640625,
0.013336181640625,
-0.036224365234375,
0.00015974044799804688,
0.0107574462890625,
-0.0160980224609375,
-0.0283966064453125,
0.0628662109375,
0.016571044921875,
-0.006313323974609375,
0.031280517578125,
-0.03289794921875,
-0.06365966796875,
0.06732177734375,
0.04949951171875,
0.058868408203125,
-0.0184783935546875,
0.006256103515625,
0.052520751953125,
0.01117706298828125,
-0.01104736328125,
0.0369873046875,
0.01454925537109375,
-0.05615234375,
-0.0191802978515625,
-0.05157470703125,
-0.01514434814453125,
-0.0010433197021484375,
-0.04595947265625,
0.0227203369140625,
-0.02935791015625,
-0.031890869140625,
-0.0017976760864257812,
0.0188751220703125,
-0.049163818359375,
0.032135009765625,
-0.0106201171875,
0.065185546875,
-0.028228759765625,
0.081298828125,
0.0347900390625,
-0.04296875,
-0.076171875,
-0.000705718994140625,
-0.0166473388671875,
-0.049346923828125,
0.044189453125,
0.01090240478515625,
0.0004749298095703125,
0.0264434814453125,
-0.0369873046875,
-0.07379150390625,
0.1080322265625,
0.004131317138671875,
-0.051300048828125,
-0.0168914794921875,
-0.01519775390625,
0.0273590087890625,
-0.00362396240234375,
0.0302581787109375,
0.0287017822265625,
0.035797119140625,
0.0012969970703125,
-0.080810546875,
0.0220947265625,
-0.029571533203125,
0.00603485107421875,
0.004852294921875,
-0.07904052734375,
0.09002685546875,
-0.0186920166015625,
-0.0238037109375,
0.0108642578125,
0.07659912109375,
0.017669677734375,
0.0094451904296875,
0.0223846435546875,
0.015655517578125,
0.048858642578125,
-0.01430511474609375,
0.059112548828125,
-0.041473388671875,
0.04718017578125,
0.056121826171875,
0.0035724639892578125,
0.056243896484375,
0.0119781494140625,
-0.03753662109375,
0.0268402099609375,
0.04119873046875,
-0.0166473388671875,
0.030029296875,
0.00469207763671875,
-0.01543426513671875,
-0.00023174285888671875,
0.0034465789794921875,
-0.048858642578125,
0.0058746337890625,
0.02520751953125,
-0.0153045654296875,
0.007049560546875,
0.0163421630859375,
0.0014638900756835938,
-0.03070068359375,
-0.031524658203125,
0.039031982421875,
0.0177001953125,
-0.027313232421875,
0.060821533203125,
0.023681640625,
0.083984375,
-0.038177490234375,
-0.005680084228515625,
-0.0096893310546875,
0.00799560546875,
-0.0247344970703125,
-0.04437255859375,
-0.003597259521484375,
-0.02313232421875,
-0.00930023193359375,
0.01226806640625,
0.0616455078125,
-0.03509521484375,
-0.035003662109375,
0.0253448486328125,
0.0254669189453125,
0.002376556396484375,
-0.0028839111328125,
-0.061614990234375,
0.0039215087890625,
0.0179290771484375,
-0.042022705078125,
0.0309295654296875,
0.03839111328125,
-0.00612640380859375,
0.042022705078125,
0.058807373046875,
-0.0033512115478515625,
0.01107025146484375,
0.01305389404296875,
0.07269287109375,
-0.060394287109375,
-0.036529541015625,
-0.052459716796875,
0.0570068359375,
-0.00701904296875,
-0.03704833984375,
0.06475830078125,
0.0274505615234375,
0.07049560546875,
0.0081787109375,
0.0697021484375,
-0.0177154541015625,
0.04376220703125,
-0.0341796875,
0.07330322265625,
-0.03253173828125,
-0.00008273124694824219,
-0.0174102783203125,
-0.045318603515625,
-0.00821685791015625,
0.058868408203125,
-0.0235137939453125,
0.0179443359375,
0.044219970703125,
0.057525634765625,
0.010711669921875,
-0.0105133056640625,
0.039886474609375,
0.036041259765625,
0.0182647705078125,
0.049346923828125,
0.0433349609375,
-0.07232666015625,
0.04998779296875,
-0.0423583984375,
-0.0039215087890625,
-0.028839111328125,
-0.036407470703125,
-0.07684326171875,
-0.032958984375,
-0.018463134765625,
-0.046478271484375,
-0.00975799560546875,
0.08001708984375,
0.0305328369140625,
-0.06781005859375,
-0.0180816650390625,
0.0151519775390625,
0.016845703125,
-0.0310516357421875,
-0.022216796875,
0.06390380859375,
-0.01146697998046875,
-0.06915283203125,
0.00042247772216796875,
0.005390167236328125,
0.0194854736328125,
-0.0250244140625,
-0.00865936279296875,
-0.00846099853515625,
-0.0144195556640625,
0.041168212890625,
0.01113128662109375,
-0.05535888671875,
-0.00716400146484375,
0.024871826171875,
-0.0233001708984375,
0.007122039794921875,
0.0118560791015625,
-0.03985595703125,
0.0104827880859375,
0.043609619140625,
0.00383758544921875,
0.02587890625,
-0.005527496337890625,
0.03448486328125,
-0.0195159912109375,
0.01509857177734375,
0.005123138427734375,
0.034942626953125,
0.006855010986328125,
-0.0225067138671875,
0.00945281982421875,
0.0157623291015625,
-0.03240966796875,
-0.0579833984375,
-0.00699615478515625,
-0.07208251953125,
-0.0166778564453125,
0.08013916015625,
-0.03741455078125,
-0.0362548828125,
-0.004291534423828125,
-0.055145263671875,
0.045623779296875,
-0.0240936279296875,
0.04925537109375,
0.03485107421875,
0.01708984375,
-0.018035888671875,
-0.04913330078125,
0.043243408203125,
0.0170135498046875,
-0.03704833984375,
-0.0155181884765625,
0.01313018798828125,
0.0218658447265625,
0.0099639892578125,
0.046783447265625,
-0.0008058547973632812,
0.041412353515625,
0.003879547119140625,
0.029022216796875,
-0.012481689453125,
0.00473785400390625,
-0.0025501251220703125,
0.00817108154296875,
0.002758026123046875,
-0.031402587890625
]
] |
timm/resnetv2_50.a1h_in1k | 2023-03-22T20:55:30.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:1603.05027",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/resnetv2_50.a1h_in1k | 0 | 12,003 | timm | 2023-03-22T20:54:57 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for resnetv2_50.a1h_in1k
A ResNet-V2 (pre-activation ResNet) image classification model. Trained on ImageNet-1k by Ross Wightman in `timm` using a ResNet Strikes Back (RSB) `A1`-based recipe.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 25.5
- GMACs: 4.1
- Activations (M): 11.1
- Image size: 224 x 224
- **Papers:**
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- Identity Mappings in Deep Residual Networks: https://arxiv.org/abs/1603.05027
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('resnetv2_50.a1h_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnetv2_50.a1h_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 1024, 14, 14])
# torch.Size([1, 2048, 7, 7])
print(o.shape)
```
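The channel counts and strides of these feature maps can also be queried directly, without running an image through the model. A short sketch reusing the `features_only` model created above; the commented values are what `feature_info` is expected to report for this architecture.
```python
# Per-level channel counts and downsampling factors of the feature maps.
print(model.feature_info.channels())   # e.g. [64, 256, 512, 1024, 2048]
print(model.feature_info.reduction())  # e.g. [2, 4, 8, 16, 32]
```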
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnetv2_50.a1h_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
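One common use of these embeddings is image-to-image similarity. A minimal sketch, not part of the original card: it reuses `model` (created with `num_classes=0`), `transforms`, and `img` from the snippet above; in practice the second input would be a different image.
```python
import torch
import torch.nn.functional as F

with torch.no_grad():
    emb1 = model(transforms(img).unsqueeze(0))
    emb2 = model(transforms(img).unsqueeze(0))  # replace with a second image

# Cosine similarity of the pooled embeddings; 1.0 here since inputs are identical.
print(F.cosine_similarity(emb1, emb2).item())
```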
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
```bibtex
@article{He2016,
author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
title = {Identity Mappings in Deep Residual Networks},
journal = {arXiv preprint arXiv:1603.05027},
year = {2016}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,331 | [
[
-0.030975341796875,
-0.0247955322265625,
-0.005100250244140625,
-0.0024623870849609375,
-0.02166748046875,
-0.019989013671875,
-0.017059326171875,
-0.02313232421875,
0.022705078125,
0.04241943359375,
-0.038543701171875,
-0.0506591796875,
-0.05120849609375,
-0.0099945068359375,
-0.01132965087890625,
0.06793212890625,
-0.006565093994140625,
-0.0016222000122070312,
-0.014801025390625,
-0.04150390625,
-0.0209503173828125,
-0.0251922607421875,
-0.0712890625,
-0.0369873046875,
0.03662109375,
0.0151214599609375,
0.037109375,
0.043548583984375,
0.052398681640625,
0.03509521484375,
-0.00286102294921875,
0.011260986328125,
-0.0210723876953125,
-0.00969696044921875,
0.020050048828125,
-0.047821044921875,
-0.0287017822265625,
0.0190582275390625,
0.057098388671875,
0.0206451416015625,
0.0007424354553222656,
0.035888671875,
0.006595611572265625,
0.048919677734375,
-0.01898193359375,
0.00316619873046875,
-0.03045654296875,
0.0162811279296875,
-0.005702972412109375,
-0.0012111663818359375,
-0.0291900634765625,
-0.03173828125,
0.01171875,
-0.03533935546875,
0.03509521484375,
-0.001125335693359375,
0.10394287109375,
0.0229949951171875,
-0.000041961669921875,
0.01210784912109375,
-0.0192108154296875,
0.061859130859375,
-0.06280517578125,
0.017822265625,
0.018768310546875,
0.0243072509765625,
-0.009857177734375,
-0.09503173828125,
-0.03955078125,
-0.013671875,
-0.015411376953125,
-0.001682281494140625,
-0.02069091796875,
-0.00580596923828125,
0.0212860107421875,
0.0197906494140625,
-0.025482177734375,
0.0159149169921875,
-0.0400390625,
-0.017608642578125,
0.0355224609375,
0.0030994415283203125,
0.0220794677734375,
-0.01262664794921875,
-0.03900146484375,
-0.036468505859375,
-0.032135009765625,
0.0165252685546875,
0.024871826171875,
0.0218353271484375,
-0.042755126953125,
0.0252838134765625,
0.0097808837890625,
0.044097900390625,
-0.0004963874816894531,
-0.022216796875,
0.047515869140625,
0.0009026527404785156,
-0.030120849609375,
-0.018646240234375,
0.07769775390625,
0.03173828125,
0.01434326171875,
0.0084686279296875,
-0.01172637939453125,
-0.033050537109375,
0.00138092041015625,
-0.08624267578125,
-0.031280517578125,
0.021575927734375,
-0.050323486328125,
-0.033172607421875,
0.017303466796875,
-0.044921875,
-0.0164337158203125,
-0.0080108642578125,
0.035919189453125,
-0.0384521484375,
-0.036529541015625,
-0.0013532638549804688,
-0.01776123046875,
0.0279693603515625,
0.01454925537109375,
-0.032745361328125,
0.015411376953125,
0.0312347412109375,
0.08453369140625,
0.00659942626953125,
-0.0361328125,
-0.02008056640625,
-0.035430908203125,
-0.01971435546875,
0.033660888671875,
-0.00821685791015625,
0.00490570068359375,
-0.0238494873046875,
0.0264434814453125,
-0.004108428955078125,
-0.057586669921875,
0.00768280029296875,
-0.0264739990234375,
0.02239990234375,
-0.0099945068359375,
-0.01161956787109375,
-0.0462646484375,
0.0257720947265625,
-0.038970947265625,
0.0953369140625,
0.0303192138671875,
-0.06597900390625,
0.017791748046875,
-0.034423828125,
-0.0075225830078125,
-0.02520751953125,
-0.00238800048828125,
-0.083251953125,
-0.006290435791015625,
0.00849151611328125,
0.04791259765625,
-0.02984619140625,
0.0034084320068359375,
-0.040069580078125,
-0.020111083984375,
0.0300750732421875,
-0.006435394287109375,
0.07635498046875,
0.00844573974609375,
-0.03533935546875,
0.01409149169921875,
-0.041015625,
0.0195159912109375,
0.038818359375,
-0.017242431640625,
0.0015974044799804688,
-0.043304443359375,
0.006343841552734375,
0.0251007080078125,
0.0070953369140625,
-0.03656005859375,
0.0165252685546875,
-0.01666259765625,
0.033050537109375,
0.045806884765625,
-0.004329681396484375,
0.01727294921875,
-0.032562255859375,
0.026123046875,
0.02081298828125,
0.01242828369140625,
-0.005458831787109375,
-0.0341796875,
-0.06280517578125,
-0.037139892578125,
0.039947509765625,
0.0311279296875,
-0.042266845703125,
0.0361328125,
-0.0158233642578125,
-0.0577392578125,
-0.0390625,
0.0009627342224121094,
0.0479736328125,
0.0517578125,
0.029296875,
-0.044158935546875,
-0.04852294921875,
-0.0731201171875,
0.001613616943359375,
-0.0010662078857421875,
0.002445220947265625,
0.02630615234375,
0.04766845703125,
-0.00878143310546875,
0.05230712890625,
-0.030609130859375,
-0.02020263671875,
-0.01776123046875,
0.008941650390625,
0.0254669189453125,
0.058929443359375,
0.06134033203125,
-0.04302978515625,
-0.030670166015625,
-0.00901031494140625,
-0.06689453125,
0.0117645263671875,
-0.0094757080078125,
-0.0165557861328125,
0.02313232421875,
0.01474761962890625,
-0.032745361328125,
0.04498291015625,
0.0128326416015625,
-0.01488494873046875,
0.033721923828125,
-0.01546478271484375,
0.0240631103515625,
-0.09185791015625,
0.01309967041015625,
0.0274200439453125,
-0.0089111328125,
-0.0285491943359375,
0.011505126953125,
0.00030922889709472656,
-0.0065765380859375,
-0.03955078125,
0.041534423828125,
-0.04638671875,
-0.0224609375,
-0.01483154296875,
-0.0212860107421875,
0.004001617431640625,
0.051177978515625,
-0.0022945404052734375,
0.0286407470703125,
0.059356689453125,
-0.02734375,
0.04443359375,
0.0202484130859375,
-0.00647735595703125,
0.025787353515625,
-0.051239013671875,
0.02349853515625,
-0.003162384033203125,
0.0240325927734375,
-0.085693359375,
-0.01462554931640625,
0.030120849609375,
-0.052947998046875,
0.05279541015625,
-0.043365478515625,
-0.0312347412109375,
-0.04266357421875,
-0.034149169921875,
0.025848388671875,
0.0609130859375,
-0.055419921875,
0.03521728515625,
0.0128936767578125,
0.02398681640625,
-0.046966552734375,
-0.0672607421875,
-0.008544921875,
-0.033935546875,
-0.04644775390625,
0.02392578125,
0.0225067138671875,
0.00872802734375,
0.01293182373046875,
-0.003063201904296875,
-0.01328277587890625,
-0.006542205810546875,
0.0428466796875,
0.022674560546875,
-0.02685546875,
-0.00814056396484375,
-0.02783203125,
-0.015716552734375,
0.0011043548583984375,
-0.024383544921875,
0.0467529296875,
-0.01898193359375,
-0.011383056640625,
-0.07232666015625,
-0.006927490234375,
0.03692626953125,
-0.0159454345703125,
0.06597900390625,
0.0836181640625,
-0.042022705078125,
0.00417327880859375,
-0.038848876953125,
-0.031829833984375,
-0.0362548828125,
0.0391845703125,
-0.023223876953125,
-0.030853271484375,
0.06585693359375,
-0.00872039794921875,
0.00788116455078125,
0.047821044921875,
0.023223876953125,
-0.0089111328125,
0.03759765625,
0.042938232421875,
0.011688232421875,
0.050384521484375,
-0.07537841796875,
-0.018707275390625,
-0.0675048828125,
-0.039520263671875,
-0.029632568359375,
-0.057373046875,
-0.0408935546875,
-0.027313232421875,
0.027008056640625,
0.011566162109375,
-0.030426025390625,
0.042327880859375,
-0.0634765625,
0.005329132080078125,
0.056365966796875,
0.04638671875,
-0.03448486328125,
0.033782958984375,
-0.01308441162109375,
-0.01073455810546875,
-0.057373046875,
-0.0154571533203125,
0.0823974609375,
0.038299560546875,
0.04241943359375,
-0.00809478759765625,
0.062469482421875,
-0.01525115966796875,
0.031280517578125,
-0.042144775390625,
0.0443115234375,
-0.01141357421875,
-0.027130126953125,
-0.01303863525390625,
-0.031951904296875,
-0.0787353515625,
0.00632476806640625,
-0.01751708984375,
-0.04608154296875,
0.0108642578125,
0.0175018310546875,
-0.024261474609375,
0.06085205078125,
-0.061553955078125,
0.0665283203125,
-0.006839752197265625,
-0.0306243896484375,
0.0015497207641601562,
-0.056793212890625,
0.02044677734375,
0.01416778564453125,
-0.02203369140625,
-0.0031490325927734375,
0.007694244384765625,
0.08203125,
-0.045501708984375,
0.0714111328125,
-0.037689208984375,
0.0311126708984375,
0.039886474609375,
-0.009613037109375,
0.0270538330078125,
-0.01094818115234375,
-0.0130767822265625,
0.0282745361328125,
-0.005237579345703125,
-0.0307769775390625,
-0.047515869140625,
0.042633056640625,
-0.07080078125,
-0.025665283203125,
-0.0245208740234375,
-0.023529052734375,
0.0200347900390625,
0.00798797607421875,
0.04022216796875,
0.056365966796875,
0.0285797119140625,
0.0254669189453125,
0.042144775390625,
-0.03802490234375,
0.031585693359375,
0.00490570068359375,
-0.007083892822265625,
-0.04791259765625,
0.06390380859375,
0.020660400390625,
0.0135040283203125,
0.00972747802734375,
0.01355743408203125,
-0.0259246826171875,
-0.041900634765625,
-0.01497650146484375,
0.0308685302734375,
-0.049774169921875,
-0.043914794921875,
-0.03961181640625,
-0.034759521484375,
-0.035003662109375,
0.0027332305908203125,
-0.04522705078125,
-0.0214385986328125,
-0.03021240234375,
0.01485443115234375,
0.056915283203125,
0.03570556640625,
-0.0169830322265625,
0.039886474609375,
-0.041229248046875,
0.0141143798828125,
0.006427764892578125,
0.03472900390625,
-0.005458831787109375,
-0.07763671875,
-0.01442718505859375,
-0.004993438720703125,
-0.027496337890625,
-0.056182861328125,
0.03411865234375,
0.0091552734375,
0.03509521484375,
0.017547607421875,
-0.01284027099609375,
0.058502197265625,
-0.007549285888671875,
0.038970947265625,
0.03533935546875,
-0.0305633544921875,
0.04522705078125,
0.00745391845703125,
0.004364013671875,
0.01113128662109375,
0.0208587646484375,
-0.025054931640625,
0.0029468536376953125,
-0.07720947265625,
-0.057769775390625,
0.0682373046875,
0.0021209716796875,
0.0016689300537109375,
0.0245513916015625,
0.06549072265625,
0.0021152496337890625,
-0.0008792877197265625,
-0.053497314453125,
-0.042022705078125,
-0.0198974609375,
-0.01477813720703125,
0.0057220458984375,
-0.014251708984375,
-0.009033203125,
-0.048919677734375,
0.053131103515625,
0.0014820098876953125,
0.0555419921875,
0.0245513916015625,
0.007762908935546875,
-0.0026035308837890625,
-0.037261962890625,
0.034759521484375,
0.0208740234375,
-0.023834228515625,
0.010772705078125,
0.0161285400390625,
-0.03985595703125,
0.0126800537109375,
0.01239013671875,
0.0017042160034179688,
0.0032062530517578125,
0.042755126953125,
0.06805419921875,
-0.0036487579345703125,
0.00991058349609375,
0.027130126953125,
-0.00420379638671875,
-0.0299530029296875,
-0.0249481201171875,
0.009063720703125,
-0.00778961181640625,
0.034271240234375,
0.0216827392578125,
0.029632568359375,
-0.0158538818359375,
-0.01457977294921875,
0.03179931640625,
0.03326416015625,
-0.0245513916015625,
-0.0248565673828125,
0.04620361328125,
-0.014556884765625,
-0.024444580078125,
0.0697021484375,
-0.005523681640625,
-0.0357666015625,
0.08587646484375,
0.035858154296875,
0.07574462890625,
-0.0020275115966796875,
0.003894805908203125,
0.06561279296875,
0.0251007080078125,
-0.001735687255859375,
0.0089111328125,
0.0207977294921875,
-0.054840087890625,
0.0008754730224609375,
-0.0297698974609375,
0.00658416748046875,
0.03143310546875,
-0.04656982421875,
0.023223876953125,
-0.05401611328125,
-0.0382080078125,
0.004596710205078125,
0.02032470703125,
-0.0689697265625,
0.018707275390625,
-0.00647735595703125,
0.06463623046875,
-0.0556640625,
0.054718017578125,
0.06683349609375,
-0.0413818359375,
-0.0792236328125,
-0.003429412841796875,
-0.0005578994750976562,
-0.0732421875,
0.052032470703125,
0.0305633544921875,
0.01268768310546875,
0.0116729736328125,
-0.056549072265625,
-0.053497314453125,
0.107666015625,
0.044586181640625,
-0.00843048095703125,
0.0237579345703125,
-0.00913238525390625,
0.0198974609375,
-0.0253753662109375,
0.0357666015625,
0.01953125,
0.024871826171875,
0.024505615234375,
-0.0443115234375,
0.023651123046875,
-0.0183258056640625,
0.00865936279296875,
0.0130767822265625,
-0.0635986328125,
0.060394287109375,
-0.03997802734375,
-0.01213836669921875,
0.0021152496337890625,
0.05810546875,
0.0205535888671875,
0.0088653564453125,
0.0362548828125,
0.0689697265625,
0.040008544921875,
-0.025299072265625,
0.06866455078125,
-0.0003314018249511719,
0.045928955078125,
0.055450439453125,
0.02825927734375,
0.0443115234375,
0.026641845703125,
-0.023590087890625,
0.031951904296875,
0.08660888671875,
-0.0265655517578125,
0.027130126953125,
0.023101806640625,
0.0033054351806640625,
-0.008026123046875,
0.00745391845703125,
-0.043304443359375,
0.0289459228515625,
0.006221771240234375,
-0.04351806640625,
-0.0197906494140625,
-0.00043010711669921875,
0.00124359130859375,
-0.019195556640625,
-0.00247955322265625,
0.037506103515625,
0.0007748603820800781,
-0.0295257568359375,
0.06964111328125,
0.01432037353515625,
0.056915283203125,
-0.02716064453125,
-0.006855010986328125,
-0.031951904296875,
0.01337432861328125,
-0.0241546630859375,
-0.057769775390625,
0.0224456787109375,
-0.02117919921875,
-0.0010156631469726562,
0.0042724609375,
0.05072021484375,
-0.0228271484375,
-0.030975341796875,
0.0099639892578125,
0.011871337890625,
0.041290283203125,
0.006816864013671875,
-0.0931396484375,
0.0168914794921875,
0.00638580322265625,
-0.0450439453125,
0.02105712890625,
0.0295257568359375,
0.01328277587890625,
0.0556640625,
0.041473388671875,
-0.01068115234375,
0.005779266357421875,
-0.01058197021484375,
0.06304931640625,
-0.0360107421875,
-0.01531219482421875,
-0.06488037109375,
0.051361083984375,
-0.01129150390625,
-0.043792724609375,
0.03564453125,
0.044830322265625,
0.057525634765625,
-0.0018634796142578125,
0.0350341796875,
-0.0164642333984375,
0.0033206939697265625,
-0.032073974609375,
0.049713134765625,
-0.052886962890625,
0.0035552978515625,
-0.0034275054931640625,
-0.04864501953125,
-0.0272674560546875,
0.048553466796875,
-0.018035888671875,
0.0323486328125,
0.036376953125,
0.0784912109375,
-0.023834228515625,
-0.04107666015625,
0.005523681640625,
0.0061798095703125,
0.009613037109375,
0.033416748046875,
0.031280517578125,
-0.06610107421875,
0.02398681640625,
-0.04736328125,
-0.016387939453125,
-0.008758544921875,
-0.05426025390625,
-0.0706787109375,
-0.06463623046875,
-0.05194091796875,
-0.0626220703125,
-0.013427734375,
0.062164306640625,
0.078125,
-0.0516357421875,
-0.0018281936645507812,
0.004825592041015625,
0.01284027099609375,
-0.0207977294921875,
-0.01690673828125,
0.04791259765625,
-0.01459503173828125,
-0.04718017578125,
-0.023193359375,
0.0013408660888671875,
0.0308685302734375,
-0.0026569366455078125,
-0.0188446044921875,
-0.008544921875,
-0.0255584716796875,
0.004791259765625,
0.031585693359375,
-0.056732177734375,
-0.0188446044921875,
-0.0189361572265625,
-0.0109405517578125,
0.02764892578125,
0.03546142578125,
-0.0455322265625,
0.0218658447265625,
0.030364990234375,
0.0377197265625,
0.054443359375,
-0.0191650390625,
0.004764556884765625,
-0.07183837890625,
0.043548583984375,
-0.01093292236328125,
0.0301361083984375,
0.033233642578125,
-0.02508544921875,
0.04736328125,
0.04034423828125,
-0.023712158203125,
-0.07183837890625,
-0.0035724639892578125,
-0.07080078125,
-0.009552001953125,
0.06719970703125,
-0.032440185546875,
-0.03533935546875,
0.0390625,
-0.004901885986328125,
0.056671142578125,
-0.00007510185241699219,
0.03515625,
0.01522064208984375,
0.0001804828643798828,
-0.05328369140625,
-0.034912109375,
0.0307769775390625,
0.014862060546875,
-0.045196533203125,
-0.0286407470703125,
0.00043320655822753906,
0.0538330078125,
0.01561737060546875,
0.03759765625,
-0.01319122314453125,
0.0098114013671875,
0.0057830810546875,
0.043243408203125,
-0.043548583984375,
-0.01378631591796875,
-0.02783203125,
0.0007586479187011719,
-0.007110595703125,
-0.056060791015625
]
] |
timm/vit_base_patch32_224.augreg_in21k_ft_in1k | 2023-05-06T00:03:27.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:imagenet-21k",
"arxiv:2106.10270",
"arxiv:2010.11929",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/vit_base_patch32_224.augreg_in21k_ft_in1k | 0 | 11,979 | timm | 2022-12-22T07:33:47 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
- imagenet-21k
---
# Model card for vit_base_patch32_224.augreg_in21k_ft_in1k
A Vision Transformer (ViT) image classification model. Trained on ImageNet-21k and fine-tuned on ImageNet-1k (with additional augmentation and regularization) in JAX by the paper authors, and ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 88.2
- GMACs: 4.4
- Activations (M): 4.2
- Image size: 224 x 224
- **Papers:**
- How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers: https://arxiv.org/abs/2106.10270
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Dataset:** ImageNet-1k
- **Pretrain Dataset:** ImageNet-21k
- **Original:** https://github.com/google-research/vision_transformer
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_base_patch32_224.augreg_in21k_ft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_base_patch32_224.augreg_in21k_ft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 50, 768) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{steiner2021augreg,
title={How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers},
  author={Steiner, Andreas and Kolesnikov, Alexander and Zhai, Xiaohua and Wightman, Ross and Uszkoreit, Jakob and Beyer, Lucas},
journal={arXiv preprint arXiv:2106.10270},
year={2021}
}
```
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 3,903 | [
[
-0.0390625,
-0.0290069580078125,
-0.0034465789794921875,
0.007129669189453125,
-0.0295867919921875,
-0.0249176025390625,
-0.021148681640625,
-0.034454345703125,
0.01251220703125,
0.0235443115234375,
-0.04132080078125,
-0.037261962890625,
-0.048065185546875,
0.00011926889419555664,
-0.01070404052734375,
0.0736083984375,
-0.0104827880859375,
0.0034503936767578125,
-0.0168304443359375,
-0.033843994140625,
-0.0240478515625,
-0.02105712890625,
-0.04486083984375,
-0.031890869140625,
0.026763916015625,
0.012664794921875,
0.0438232421875,
0.04638671875,
0.05859375,
0.033355712890625,
-0.00799560546875,
0.01044464111328125,
-0.025787353515625,
-0.01557159423828125,
0.021484375,
-0.047454833984375,
-0.0301361083984375,
0.01812744140625,
0.053802490234375,
0.02886962890625,
0.00907135009765625,
0.0256805419921875,
0.0105743408203125,
0.0384521484375,
-0.02679443359375,
0.01497650146484375,
-0.0394287109375,
0.0200347900390625,
-0.00424957275390625,
-0.0036792755126953125,
-0.023529052734375,
-0.0249786376953125,
0.0200958251953125,
-0.04034423828125,
0.0460205078125,
-0.005168914794921875,
0.103759765625,
0.0218048095703125,
0.0040740966796875,
0.0173797607421875,
-0.0301361083984375,
0.0565185546875,
-0.04644775390625,
0.03302001953125,
0.014068603515625,
0.01302337646484375,
0.00457763671875,
-0.076416015625,
-0.049224853515625,
-0.01361846923828125,
-0.01751708984375,
0.00887298583984375,
-0.0229034423828125,
0.019378662109375,
0.035888671875,
0.04425048828125,
-0.039886474609375,
-0.0030651092529296875,
-0.04217529296875,
-0.020263671875,
0.041748046875,
-0.0029773712158203125,
0.0146942138671875,
-0.01149749755859375,
-0.046356201171875,
-0.045318603515625,
-0.0257415771484375,
0.020538330078125,
0.0218658447265625,
0.0036754608154296875,
-0.036529541015625,
0.040283203125,
0.002834320068359375,
0.05010986328125,
0.018646240234375,
-0.0158538818359375,
0.050323486328125,
-0.0115203857421875,
-0.0303955078125,
-0.0192413330078125,
0.08099365234375,
0.035400390625,
0.0310516357421875,
-0.0031642913818359375,
-0.01319122314453125,
-0.00843048095703125,
0.00519561767578125,
-0.08148193359375,
-0.028472900390625,
0.006191253662109375,
-0.03399658203125,
-0.02862548828125,
0.02703857421875,
-0.04705810546875,
-0.00830841064453125,
-0.00809478759765625,
0.05865478515625,
-0.034332275390625,
-0.017120361328125,
0.00846099853515625,
-0.01302337646484375,
0.036376953125,
0.018402099609375,
-0.044464111328125,
0.00826263427734375,
0.015960693359375,
0.07684326171875,
0.0029468536376953125,
-0.036773681640625,
-0.0172576904296875,
-0.03363037109375,
-0.0240936279296875,
0.037109375,
-0.00196075439453125,
-0.01195526123046875,
-0.01276397705078125,
0.0280609130859375,
-0.0191497802734375,
-0.04296875,
0.0247039794921875,
-0.015411376953125,
0.0265655517578125,
0.00717926025390625,
-0.01544189453125,
-0.03204345703125,
0.021514892578125,
-0.0306549072265625,
0.0911865234375,
0.0299530029296875,
-0.0677490234375,
0.032135009765625,
-0.0341796875,
-0.006191253662109375,
-0.0091705322265625,
0.002124786376953125,
-0.08184814453125,
0.00412750244140625,
0.0238037109375,
0.043609619140625,
-0.01506805419921875,
-0.002017974853515625,
-0.029296875,
-0.02520751953125,
0.0258636474609375,
-0.0204010009765625,
0.06903076171875,
0.002574920654296875,
-0.02423095703125,
0.0208587646484375,
-0.04376220703125,
0.00664520263671875,
0.031402587890625,
-0.0191650390625,
0.0011644363403320312,
-0.04638671875,
0.0112152099609375,
0.0152587890625,
0.0170440673828125,
-0.050018310546875,
0.02935791015625,
-0.0281219482421875,
0.0292205810546875,
0.048797607421875,
-0.00745391845703125,
0.0286865234375,
-0.0253753662109375,
0.0258941650390625,
0.0195159912109375,
0.030914306640625,
-0.011688232421875,
-0.047515869140625,
-0.07904052734375,
-0.033233642578125,
0.0252685546875,
0.03424072265625,
-0.04925537109375,
0.04254150390625,
-0.0280609130859375,
-0.05560302734375,
-0.045013427734375,
0.0017995834350585938,
0.035430908203125,
0.041412353515625,
0.03997802734375,
-0.0411376953125,
-0.041259765625,
-0.07177734375,
-0.00972747802734375,
-0.00554656982421875,
0.0006527900695800781,
0.015380859375,
0.0462646484375,
-0.0197906494140625,
0.0660400390625,
-0.032196044921875,
-0.0245361328125,
-0.0159454345703125,
0.0041961669921875,
0.0262603759765625,
0.056854248046875,
0.05316162109375,
-0.0487060546875,
-0.0347900390625,
-0.0101165771484375,
-0.0653076171875,
0.01012420654296875,
-0.002376556396484375,
-0.013946533203125,
0.01082611083984375,
0.013519287109375,
-0.052154541015625,
0.059326171875,
0.01261138916015625,
-0.0277557373046875,
0.032135009765625,
-0.01654052734375,
0.006252288818359375,
-0.08831787109375,
-0.00112152099609375,
0.028564453125,
-0.0196533203125,
-0.037261962890625,
0.0006365776062011719,
0.007778167724609375,
-0.001789093017578125,
-0.03045654296875,
0.042388916015625,
-0.036834716796875,
-0.002727508544921875,
-0.005092620849609375,
-0.027099609375,
0.0042266845703125,
0.054901123046875,
-0.00335693359375,
0.040496826171875,
0.056365966796875,
-0.036285400390625,
0.04315185546875,
0.039886474609375,
-0.0156402587890625,
0.035614013671875,
-0.054351806640625,
0.01195526123046875,
-0.00396728515625,
0.0149688720703125,
-0.0755615234375,
-0.01424407958984375,
0.0291290283203125,
-0.055908203125,
0.048736572265625,
-0.040802001953125,
-0.032623291015625,
-0.045166015625,
-0.02984619140625,
0.030426025390625,
0.05694580078125,
-0.05859375,
0.044830322265625,
0.0062103271484375,
0.0239410400390625,
-0.045166015625,
-0.072021484375,
-0.0171661376953125,
-0.0279083251953125,
-0.053985595703125,
0.035186767578125,
0.005756378173828125,
0.0112152099609375,
0.0060272216796875,
-0.00804901123046875,
-0.00003737211227416992,
-0.015716552734375,
0.0341796875,
0.0312347412109375,
-0.017974853515625,
-0.005222320556640625,
-0.024749755859375,
-0.01593017578125,
-0.0001647472381591797,
-0.0263671875,
0.038726806640625,
-0.023895263671875,
-0.0153045654296875,
-0.056671142578125,
-0.019683837890625,
0.03692626953125,
-0.022796630859375,
0.054168701171875,
0.08538818359375,
-0.0350341796875,
0.004688262939453125,
-0.0435791015625,
-0.0287017822265625,
-0.037109375,
0.035552978515625,
-0.024566650390625,
-0.03314208984375,
0.05462646484375,
0.01323699951171875,
0.00775146484375,
0.057647705078125,
0.0321044921875,
0.0031585693359375,
0.0618896484375,
0.0518798828125,
0.01084136962890625,
0.06671142578125,
-0.07373046875,
-0.00920867919921875,
-0.068359375,
-0.0273895263671875,
-0.0178070068359375,
-0.038482666015625,
-0.052276611328125,
-0.036376953125,
0.033416748046875,
0.009063720703125,
-0.0216217041015625,
0.041046142578125,
-0.06683349609375,
0.01378631591796875,
0.0535888671875,
0.039093017578125,
-0.00848388671875,
0.032806396484375,
-0.0151519775390625,
-0.00524139404296875,
-0.05767822265625,
-0.00804901123046875,
0.08233642578125,
0.035888671875,
0.05987548828125,
-0.021240234375,
0.048919677734375,
-0.0195770263671875,
0.0256195068359375,
-0.058074951171875,
0.0406494140625,
-0.0018625259399414062,
-0.0305633544921875,
-0.00823211669921875,
-0.029205322265625,
-0.076171875,
0.0161590576171875,
-0.0254669189453125,
-0.060028076171875,
0.0269927978515625,
0.01537322998046875,
-0.016143798828125,
0.04931640625,
-0.0640869140625,
0.072509765625,
-0.005336761474609375,
-0.0362548828125,
0.00684356689453125,
-0.053375244140625,
0.01308441162109375,
0.019744873046875,
-0.0283355712890625,
0.01032257080078125,
0.0211029052734375,
0.0743408203125,
-0.04534912109375,
0.06134033203125,
-0.03179931640625,
0.0268096923828125,
0.03631591796875,
-0.0162353515625,
0.02764892578125,
0.002346038818359375,
0.01232147216796875,
0.02520751953125,
-0.0017347335815429688,
-0.027313232421875,
-0.036834716796875,
0.035186767578125,
-0.0777587890625,
-0.0283355712890625,
-0.038604736328125,
-0.04351806640625,
0.007358551025390625,
0.005340576171875,
0.05291748046875,
0.046417236328125,
0.020538330078125,
0.0295562744140625,
0.05035400390625,
-0.024749755859375,
0.029022216796875,
0.0003676414489746094,
-0.01264190673828125,
-0.042205810546875,
0.07073974609375,
0.0168609619140625,
0.0121307373046875,
0.01374053955078125,
0.01678466796875,
-0.026031494140625,
-0.036712646484375,
-0.027252197265625,
0.030364990234375,
-0.05389404296875,
-0.036041259765625,
-0.043426513671875,
-0.039764404296875,
-0.0254364013671875,
0.0009489059448242188,
-0.032135009765625,
-0.0245513916015625,
-0.0274200439453125,
0.0066070556640625,
0.062469482421875,
0.038726806640625,
-0.0089263916015625,
0.040740966796875,
-0.04339599609375,
0.015380859375,
0.0215606689453125,
0.039703369140625,
-0.01476287841796875,
-0.07568359375,
-0.0268707275390625,
0.0023365020751953125,
-0.03839111328125,
-0.057342529296875,
0.034515380859375,
0.015045166015625,
0.03466796875,
0.028106689453125,
-0.0207977294921875,
0.066162109375,
-0.004726409912109375,
0.0438232421875,
0.0247344970703125,
-0.040679931640625,
0.0374755859375,
-0.007534027099609375,
0.01094818115234375,
0.0151519775390625,
0.01300811767578125,
-0.0213470458984375,
-0.00437164306640625,
-0.07904052734375,
-0.05657958984375,
0.05859375,
0.0193023681640625,
0.00469207763671875,
0.034759521484375,
0.046142578125,
-0.0040435791015625,
0.00458526611328125,
-0.06671142578125,
-0.0249176025390625,
-0.032440185546875,
-0.0246124267578125,
-0.005340576171875,
-0.0010747909545898438,
0.0001512765884399414,
-0.06048583984375,
0.04864501953125,
-0.005207061767578125,
0.06085205078125,
0.036834716796875,
-0.01654052734375,
-0.01328277587890625,
-0.0292205810546875,
0.0263671875,
0.019439697265625,
-0.02337646484375,
0.0015888214111328125,
0.020294189453125,
-0.05621337890625,
-0.0027027130126953125,
0.0246124267578125,
-0.006534576416015625,
0.004444122314453125,
0.0357666015625,
0.0830078125,
-0.0102996826171875,
-0.0005621910095214844,
0.0400390625,
-0.006931304931640625,
-0.031707763671875,
-0.02117919921875,
0.006683349609375,
-0.019195556640625,
0.0267333984375,
0.0257720947265625,
0.02972412109375,
-0.01192474365234375,
-0.00998687744140625,
0.01209259033203125,
0.0401611328125,
-0.04034423828125,
-0.0282135009765625,
0.0504150390625,
-0.0151824951171875,
-0.006839752197265625,
0.060943603515625,
-0.004016876220703125,
-0.043548583984375,
0.06646728515625,
0.0240325927734375,
0.07769775390625,
-0.007686614990234375,
-0.003787994384765625,
0.059814453125,
0.0283203125,
-0.002101898193359375,
0.01238250732421875,
0.00890350341796875,
-0.06048583984375,
-0.0063323974609375,
-0.04827880859375,
0.0020961761474609375,
0.0248565673828125,
-0.0396728515625,
0.0294342041015625,
-0.0400390625,
-0.028472900390625,
0.003795623779296875,
0.0175018310546875,
-0.07659912109375,
0.02197265625,
0.0029048919677734375,
0.056976318359375,
-0.059722900390625,
0.049224853515625,
0.0645751953125,
-0.05126953125,
-0.071533203125,
-0.0117034912109375,
-0.013946533203125,
-0.0653076171875,
0.03485107421875,
0.03265380859375,
0.01142120361328125,
0.0196533203125,
-0.061614990234375,
-0.045867919921875,
0.09698486328125,
0.0282745361328125,
-0.0105743408203125,
0.0103302001953125,
-0.0016870498657226562,
0.0282745361328125,
-0.0207977294921875,
0.0350341796875,
0.0128326416015625,
0.03009033203125,
0.01531219482421875,
-0.054290771484375,
0.0048675537109375,
-0.024871826171875,
0.01190185546875,
0.01369476318359375,
-0.05987548828125,
0.0738525390625,
-0.0312347412109375,
-0.0089263916015625,
0.01326751708984375,
0.04901123046875,
0.00807952880859375,
0.003963470458984375,
0.0419921875,
0.06744384765625,
0.028228759765625,
-0.0325927734375,
0.068115234375,
-0.01102447509765625,
0.054534912109375,
0.0369873046875,
0.036773681640625,
0.031402587890625,
0.034759521484375,
-0.026336669921875,
0.025146484375,
0.074951171875,
-0.042388916015625,
0.0223846435546875,
0.007678985595703125,
0.00359344482421875,
-0.01727294921875,
0.0047607421875,
-0.037109375,
0.038543701171875,
0.016021728515625,
-0.041656494140625,
-0.007778167724609375,
0.01285552978515625,
-0.011688232421875,
-0.028900146484375,
-0.013427734375,
0.04522705078125,
-0.0008373260498046875,
-0.033355712890625,
0.06378173828125,
-0.0012416839599609375,
0.062347412109375,
-0.033660888671875,
-0.0029506683349609375,
-0.0196075439453125,
0.031005859375,
-0.0286712646484375,
-0.059600830078125,
0.01215362548828125,
-0.0173187255859375,
-0.00557708740234375,
0.002712249755859375,
0.05316162109375,
-0.0302276611328125,
-0.041351318359375,
0.00797271728515625,
0.0238189697265625,
0.0227813720703125,
-0.00711822509765625,
-0.07708740234375,
-0.0030956268310546875,
0.0012063980102539062,
-0.044708251953125,
0.0139312744140625,
0.0285186767578125,
0.0022830963134765625,
0.05096435546875,
0.050872802734375,
-0.0072479248046875,
0.0183258056640625,
-0.00936126708984375,
0.06982421875,
-0.0318603515625,
-0.029388427734375,
-0.0594482421875,
0.048431396484375,
-0.004741668701171875,
-0.047576904296875,
0.05010986328125,
0.045562744140625,
0.0703125,
-0.0113983154296875,
0.036468505859375,
-0.01285552978515625,
0.001735687255859375,
-0.02618408203125,
0.04541015625,
-0.053192138671875,
-0.00823974609375,
-0.0219879150390625,
-0.06640625,
-0.0279693603515625,
0.0718994140625,
-0.02398681640625,
0.032958984375,
0.03973388671875,
0.0743408203125,
-0.024627685546875,
-0.0297088623046875,
0.01329803466796875,
0.01480865478515625,
0.00919342041015625,
0.0296173095703125,
0.043609619140625,
-0.0667724609375,
0.036773681640625,
-0.047210693359375,
-0.01323699951171875,
-0.01800537109375,
-0.035552978515625,
-0.07733154296875,
-0.061767578125,
-0.04339599609375,
-0.05126953125,
-0.015411376953125,
0.0645751953125,
0.07275390625,
-0.0423583984375,
-0.005786895751953125,
-0.01332855224609375,
0.0017385482788085938,
-0.0234375,
-0.01849365234375,
0.0384521484375,
-0.0104827880859375,
-0.058074951171875,
-0.0239105224609375,
-0.0004813671112060547,
0.038299560546875,
-0.013885498046875,
-0.01226806640625,
-0.01110076904296875,
-0.0253448486328125,
0.018035888671875,
0.022369384765625,
-0.052032470703125,
-0.0162811279296875,
-0.0037441253662109375,
-0.0036182403564453125,
0.03875732421875,
0.0286712646484375,
-0.056610107421875,
0.042388916015625,
0.042755126953125,
0.025115966796875,
0.06317138671875,
-0.01322174072265625,
0.00782012939453125,
-0.0645751953125,
0.044952392578125,
-0.0020618438720703125,
0.039306640625,
0.03790283203125,
-0.0206756591796875,
0.045013427734375,
0.043975830078125,
-0.0335693359375,
-0.06317138671875,
-0.0026111602783203125,
-0.08294677734375,
0.00978851318359375,
0.07086181640625,
-0.019561767578125,
-0.035400390625,
0.028106689453125,
-0.0160980224609375,
0.054534912109375,
-0.005107879638671875,
0.0341796875,
0.0170135498046875,
0.00614166259765625,
-0.045928955078125,
-0.034332275390625,
0.038818359375,
0.011688232421875,
-0.0399169921875,
-0.028289794921875,
0.0029926300048828125,
0.04248046875,
0.0273895263671875,
0.0261383056640625,
-0.01180267333984375,
0.0144195556640625,
0.004261016845703125,
0.041595458984375,
-0.0272369384765625,
-0.0114898681640625,
-0.0301513671875,
-0.01214599609375,
-0.006702423095703125,
-0.047637939453125
]
] |
ayameRushia/bert-base-indonesian-1.5G-sentiment-analysis-smsa | 2021-12-22T08:52:47.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"generated_from_trainer",
"id",
"dataset:indonlu",
"license:mit",
"model-index",
"endpoints_compatible",
"region:us"
] | text-classification | ayameRushia | null | null | ayameRushia/bert-base-indonesian-1.5G-sentiment-analysis-smsa | 4 | 11,978 | transformers | 2022-03-02T23:29:05 | ---
license: mit
tags:
- generated_from_trainer
datasets:
- indonlu
metrics:
- accuracy
model-index:
- name: bert-base-indonesian-1.5G-finetuned-sentiment-analysis-smsa
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: indonlu
type: indonlu
args: smsa
metrics:
- name: Accuracy
type: accuracy
value: 0.9373015873015873
language: id
widget:
- text: "Saya mengapresiasi usaha anda"
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-indonesian-1.5G-finetuned-sentiment-analysis-smsa
This model is a fine-tuned version of [cahya/bert-base-indonesian-1.5G](https://huggingface.co/cahya/bert-base-indonesian-1.5G) on the indonlu dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3390
- Accuracy: 0.9373
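A minimal inference sketch, assuming the checkpoint is published under the Hub id shown above and the standard Transformers `pipeline` API; the example sentence is the widget text from this card:
```python
from transformers import pipeline

# Load the fine-tuned sentiment classifier by its Hub id (assumed published as-is).
classifier = pipeline(
    "text-classification",
    model="ayameRushia/bert-base-indonesian-1.5G-sentiment-analysis-smsa",
)

# Widget example from this card: "I appreciate your effort" in Indonesian.
print(classifier("Saya mengapresiasi usaha anda"))
```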
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (they map onto `TrainingArguments` in the sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
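A sketch of the corresponding Transformers `TrainingArguments`, assuming the standard Trainer API; the `output_dir` name is illustrative, and dataset loading and Trainer wiring are omitted:
```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; the Adam betas and epsilon map
# onto the adam_* arguments, and "linear" is the scheduler spelling above.
args = TrainingArguments(
    output_dir="bert-base-indonesian-1.5G-finetuned-sentiment-analysis-smsa",  # illustrative
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```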
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2864 | 1.0 | 688 | 0.2154 | 0.9286 |
| 0.1648 | 2.0 | 1376 | 0.2238 | 0.9357 |
| 0.0759 | 3.0 | 2064 | 0.3351 | 0.9365 |
| 0.044 | 4.0 | 2752 | 0.3390 | 0.9373 |
| 0.0308 | 5.0 | 3440 | 0.4346 | 0.9365 |
| 0.0113 | 6.0 | 4128 | 0.4708 | 0.9365 |
| 0.006 | 7.0 | 4816 | 0.5533 | 0.9325 |
| 0.0047 | 8.0 | 5504 | 0.5888 | 0.9310 |
| 0.0001 | 9.0 | 6192 | 0.5961 | 0.9333 |
| 0.0 | 10.0 | 6880 | 0.5992 | 0.9357 |
### Framework versions
- Transformers 4.14.1
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
| 2,303 | [
[
-0.042510986328125,
-0.05029296875,
-0.0005168914794921875,
0.02386474609375,
-0.03118896484375,
-0.025360107421875,
-0.017486572265625,
-0.01251220703125,
0.0240020751953125,
0.0278167724609375,
-0.052886962890625,
-0.048614501953125,
-0.0516357421875,
-0.007701873779296875,
-0.007781982421875,
0.09130859375,
0.01461029052734375,
0.0217437744140625,
-0.002422332763671875,
-0.00765228271484375,
-0.0243377685546875,
-0.03729248046875,
-0.060333251953125,
-0.032012939453125,
0.0207061767578125,
0.0254974365234375,
0.0582275390625,
0.029052734375,
0.0433349609375,
0.0158843994140625,
-0.028961181640625,
-0.01132965087890625,
-0.0333251953125,
-0.0241851806640625,
0.00811004638671875,
-0.041015625,
-0.0364990234375,
-0.002841949462890625,
0.038299560546875,
0.04229736328125,
-0.009796142578125,
0.043304443359375,
0.01031494140625,
0.053741455078125,
-0.031707763671875,
0.0223846435546875,
-0.031524658203125,
0.01959228515625,
-0.007198333740234375,
-0.00917816162109375,
-0.0183258056640625,
-0.0251007080078125,
0.0255889892578125,
-0.036865234375,
0.0250701904296875,
-0.013214111328125,
0.10504150390625,
0.02874755859375,
-0.0240631103515625,
-0.00414276123046875,
-0.044891357421875,
0.0596923828125,
-0.059173583984375,
0.006473541259765625,
0.0289764404296875,
0.0255584716796875,
0.008941650390625,
-0.042266845703125,
-0.045196533203125,
0.016693115234375,
-0.00582122802734375,
0.018096923828125,
-0.004749298095703125,
-0.004608154296875,
0.036834716796875,
0.0391845703125,
-0.035797119140625,
0.003063201904296875,
-0.03729248046875,
-0.01218414306640625,
0.043914794921875,
0.01641845703125,
-0.00998687744140625,
-0.043701171875,
-0.036163330078125,
-0.01428985595703125,
-0.0224609375,
0.02667236328125,
0.04412841796875,
0.0269317626953125,
-0.0306854248046875,
0.036773681640625,
-0.0181732177734375,
0.04803466796875,
0.01016998291015625,
-0.0222015380859375,
0.049652099609375,
0.0034332275390625,
-0.0391845703125,
0.01126861572265625,
0.0537109375,
0.041351318359375,
0.0214080810546875,
0.032012939453125,
-0.0101165771484375,
0.0006623268127441406,
0.0172882080078125,
-0.0673828125,
-0.0234832763671875,
0.0206451416015625,
-0.06390380859375,
-0.04852294921875,
0.01239013671875,
-0.0462646484375,
0.0151519775390625,
-0.0386962890625,
0.032745361328125,
-0.038848876953125,
-0.033477783203125,
-0.0017518997192382812,
0.00024175643920898438,
0.039337158203125,
0.0092315673828125,
-0.06439208984375,
0.0169219970703125,
0.03497314453125,
0.04803466796875,
0.0152587890625,
-0.0022125244140625,
0.0144805908203125,
-0.007038116455078125,
-0.0270233154296875,
0.0404052734375,
-0.00902557373046875,
-0.03704833984375,
-0.0005369186401367188,
0.0129241943359375,
-0.01062774658203125,
-0.027496337890625,
0.064697265625,
-0.020721435546875,
0.0294647216796875,
-0.026397705078125,
-0.046234130859375,
-0.033203125,
0.036529541015625,
-0.042938232421875,
0.0965576171875,
0.00556182861328125,
-0.068359375,
0.04876708984375,
-0.045654296875,
-0.01080322265625,
-0.01351165771484375,
-0.0088653564453125,
-0.06573486328125,
-0.010711669921875,
0.0203399658203125,
0.045745849609375,
-0.011260986328125,
0.0253753662109375,
-0.020599365234375,
-0.035675048828125,
-0.004627227783203125,
-0.034271240234375,
0.0845947265625,
0.0236053466796875,
-0.0347900390625,
0.00945281982421875,
-0.0882568359375,
0.0132904052734375,
0.015716552734375,
-0.033416748046875,
-0.014892578125,
-0.017425537109375,
0.0303955078125,
0.020294189453125,
0.02862548828125,
-0.0465087890625,
0.01445770263671875,
-0.03875732421875,
0.01099395751953125,
0.054840087890625,
0.00949859619140625,
-0.0006589889526367188,
-0.02667236328125,
0.020050048828125,
0.0265655517578125,
0.0279541015625,
0.00325775146484375,
-0.038970947265625,
-0.0684814453125,
-0.0242156982421875,
0.0180816650390625,
0.034576416015625,
-0.0247344970703125,
0.06329345703125,
-0.0178680419921875,
-0.058868408203125,
-0.028961181640625,
0.0028896331787109375,
0.0199737548828125,
0.047088623046875,
0.0214996337890625,
-0.01245880126953125,
-0.04803466796875,
-0.0855712890625,
-0.0009145736694335938,
-0.01055145263671875,
0.015655517578125,
0.0215606689453125,
0.04815673828125,
-0.01482391357421875,
0.0738525390625,
-0.037689208984375,
-0.0270843505859375,
-0.017669677734375,
0.00994110107421875,
0.049896240234375,
0.054779052734375,
0.06329345703125,
-0.04852294921875,
-0.03564453125,
-0.0094451904296875,
-0.06475830078125,
0.029510498046875,
-0.0126190185546875,
-0.01568603515625,
0.01441192626953125,
0.01117706298828125,
-0.044158935546875,
0.052032470703125,
0.033203125,
-0.027740478515625,
0.05792236328125,
-0.01605224609375,
-0.0028781890869140625,
-0.09588623046875,
0.02252197265625,
0.0156097412109375,
-0.0111236572265625,
-0.0234222412109375,
-0.0079345703125,
0.00269317626953125,
-0.0164642333984375,
-0.0265350341796875,
0.03277587890625,
-0.00415802001953125,
0.004329681396484375,
-0.003917694091796875,
-0.0333251953125,
-0.010986328125,
0.06512451171875,
0.00848388671875,
0.05419921875,
0.051605224609375,
-0.035064697265625,
0.0153045654296875,
0.0308837890625,
-0.0421142578125,
0.042572021484375,
-0.0626220703125,
-0.0014619827270507812,
-0.00511932373046875,
0.0033664703369140625,
-0.07769775390625,
-0.01922607421875,
0.02325439453125,
-0.0404052734375,
0.0159912109375,
-0.01367950439453125,
-0.0185546875,
-0.03668212890625,
-0.021636962890625,
0.00617218017578125,
0.045074462890625,
-0.039215087890625,
0.03558349609375,
-0.00562286376953125,
0.0028533935546875,
-0.057647705078125,
-0.058837890625,
-0.013031005859375,
-0.01393890380859375,
-0.03961181640625,
0.0131683349609375,
0.0098724365234375,
0.0023021697998046875,
0.0035114288330078125,
-0.00589752197265625,
-0.0162200927734375,
-0.0008955001831054688,
0.03094482421875,
0.029388427734375,
-0.0165252685546875,
-0.007747650146484375,
0.008453369140625,
-0.01123046875,
0.02484130859375,
0.003673553466796875,
0.047088623046875,
-0.014434814453125,
-0.0179595947265625,
-0.0645751953125,
-0.0025501251220703125,
0.036468505859375,
-0.00887298583984375,
0.07330322265625,
0.05438232421875,
-0.034698486328125,
0.0105133056640625,
-0.033477783203125,
-0.0020503997802734375,
-0.031646728515625,
0.034881591796875,
-0.04254150390625,
-0.021484375,
0.058990478515625,
-0.00533294677734375,
0.0159454345703125,
0.07196044921875,
0.041717529296875,
-0.006603240966796875,
0.08831787109375,
0.03045654296875,
-0.0185394287109375,
0.024200439453125,
-0.062042236328125,
0.0003991127014160156,
-0.049163818359375,
-0.0323486328125,
-0.039154052734375,
-0.0257415771484375,
-0.051788330078125,
0.01309967041015625,
0.0169677734375,
0.002838134765625,
-0.0574951171875,
0.0066375732421875,
-0.047119140625,
0.01885986328125,
0.054290771484375,
0.037841796875,
-0.005069732666015625,
0.00970458984375,
-0.02484130859375,
-0.015716552734375,
-0.053985595703125,
-0.041290283203125,
0.1007080078125,
0.035400390625,
0.05078125,
-0.00450897216796875,
0.058349609375,
0.00917816162109375,
0.015655517578125,
-0.044158935546875,
0.026947021484375,
-0.0057525634765625,
-0.0699462890625,
-0.01209259033203125,
-0.0190582275390625,
-0.060272216796875,
0.01508331298828125,
-0.020477294921875,
-0.034088134765625,
0.0278472900390625,
0.00445556640625,
-0.03265380859375,
0.032012939453125,
-0.035736083984375,
0.07843017578125,
-0.0083160400390625,
-0.01186370849609375,
-0.0168914794921875,
-0.0506591796875,
0.026031494140625,
0.01299285888671875,
-0.00811004638671875,
-0.007358551025390625,
0.0188446044921875,
0.0687255859375,
-0.042816162109375,
0.060272216796875,
-0.039642333984375,
0.0139923095703125,
0.02972412109375,
-0.01378631591796875,
0.0321044921875,
0.012969970703125,
-0.014556884765625,
0.0119781494140625,
0.005706787109375,
-0.0550537109375,
-0.026153564453125,
0.055755615234375,
-0.09417724609375,
-0.02081298828125,
-0.04656982421875,
-0.0241546630859375,
0.002506256103515625,
0.0191497802734375,
0.03472900390625,
0.03863525390625,
-0.004917144775390625,
0.0265045166015625,
0.052734375,
-0.01070404052734375,
0.024688720703125,
0.007427215576171875,
-0.0084228515625,
-0.042938232421875,
0.061553955078125,
-0.002628326416015625,
0.0115814208984375,
-0.00572967529296875,
0.0026702880859375,
-0.0263519287109375,
-0.0136566162109375,
-0.036865234375,
0.01165008544921875,
-0.055999755859375,
-0.014312744140625,
-0.0322265625,
-0.0279998779296875,
-0.0281524658203125,
-0.00974273681640625,
-0.0328369140625,
-0.0251312255859375,
-0.035186767578125,
-0.0126953125,
0.030303955078125,
0.0287933349609375,
0.016693115234375,
0.038299560546875,
-0.03875732421875,
-0.003963470458984375,
0.011505126953125,
0.023712158203125,
0.00534820556640625,
-0.05078125,
-0.0198974609375,
-0.002796173095703125,
-0.02667236328125,
-0.0501708984375,
0.044464111328125,
0.00799560546875,
0.036529541015625,
0.050140380859375,
-0.018096923828125,
0.06524658203125,
-0.0038738250732421875,
0.07403564453125,
0.0257720947265625,
-0.0499267578125,
0.051544189453125,
-0.0223846435546875,
0.0238494873046875,
0.050018310546875,
0.04583740234375,
-0.01849365234375,
-0.005054473876953125,
-0.08544921875,
-0.059417724609375,
0.05584716796875,
0.015869140625,
0.00934600830078125,
0.0024585723876953125,
0.034698486328125,
-0.0060272216796875,
0.0184783935546875,
-0.06842041015625,
-0.04925537109375,
-0.03533935546875,
-0.03021240234375,
0.0028247833251953125,
-0.0243072509765625,
-0.001956939697265625,
-0.0467529296875,
0.07489013671875,
0.0026760101318359375,
0.02862548828125,
0.0230560302734375,
0.007251739501953125,
-0.00640106201171875,
0.0012559890747070312,
0.03802490234375,
0.0494384765625,
-0.044647216796875,
-0.0146636962890625,
0.014312744140625,
-0.039794921875,
0.0004372596740722656,
0.0133056640625,
-0.0218505859375,
0.0178985595703125,
0.0233154296875,
0.0684814453125,
0.01092529296875,
-0.0173492431640625,
0.039337158203125,
0.0008172988891601562,
-0.0298919677734375,
-0.051116943359375,
-0.00396728515625,
-0.01393890380859375,
0.008270263671875,
0.0321044921875,
0.03826904296875,
0.0019102096557617188,
-0.0185089111328125,
0.0026607513427734375,
0.013885498046875,
-0.04229736328125,
-0.00617218017578125,
0.054290771484375,
0.00797271728515625,
-0.00275421142578125,
0.05584716796875,
-0.005290985107421875,
-0.03350830078125,
0.06781005859375,
0.02593994140625,
0.057647705078125,
-0.019073486328125,
0.00017440319061279297,
0.06793212890625,
0.0165252685546875,
-0.0011959075927734375,
0.052215576171875,
0.0146484375,
-0.0335693359375,
-0.020294189453125,
-0.056060791015625,
-0.0194244384765625,
0.0285186767578125,
-0.07904052734375,
0.0277862548828125,
-0.0430908203125,
-0.033935546875,
0.006076812744140625,
0.0153045654296875,
-0.0689697265625,
0.04949951171875,
-0.0016880035400390625,
0.08306884765625,
-0.068359375,
0.05792236328125,
0.055755615234375,
-0.041961669921875,
-0.07354736328125,
-0.0197601318359375,
-0.015472412109375,
-0.060028076171875,
0.055267333984375,
0.017303466796875,
0.0288238525390625,
0.0030307769775390625,
-0.047454833984375,
-0.06463623046875,
0.07513427734375,
-0.0026607513427734375,
-0.045074462890625,
0.01255035400390625,
0.0203094482421875,
0.0465087890625,
-0.005512237548828125,
0.031982421875,
0.03350830078125,
0.02447509765625,
-0.006893157958984375,
-0.05950927734375,
-0.004825592041015625,
-0.0287017822265625,
0.011810302734375,
0.01080322265625,
-0.056915283203125,
0.07513427734375,
0.006378173828125,
0.0278472900390625,
0.007427215576171875,
0.06048583984375,
0.017578125,
0.01108551025390625,
0.042205810546875,
0.0703125,
0.036956787109375,
-0.017181396484375,
0.072021484375,
-0.03717041015625,
0.06195068359375,
0.0626220703125,
0.00991058349609375,
0.056610107421875,
0.033111572265625,
-0.02978515625,
0.04547119140625,
0.06854248046875,
-0.01282501220703125,
0.033905029296875,
-0.0018444061279296875,
-0.01129913330078125,
-0.023162841796875,
0.0130462646484375,
-0.04266357421875,
0.026641845703125,
0.018951416015625,
-0.0443115234375,
-0.017578125,
-0.00228118896484375,
0.0068817138671875,
-0.00948333740234375,
-0.0310211181640625,
0.04400634765625,
-0.0257415771484375,
-0.0295562744140625,
0.055908203125,
0.0010614395141601562,
0.05224609375,
-0.0472412109375,
-0.000888824462890625,
-0.0106658935546875,
0.028289794921875,
-0.0265045166015625,
-0.05877685546875,
0.0229034423828125,
0.0005483627319335938,
-0.0135345458984375,
0.00994873046875,
0.04791259765625,
-0.01088714599609375,
-0.06610107421875,
-0.0045318603515625,
0.018096923828125,
0.00921630859375,
0.003841400146484375,
-0.0765380859375,
-0.0037937164306640625,
0.01081085205078125,
-0.050140380859375,
-0.00023221969604492188,
0.0226898193359375,
0.0024967193603515625,
0.0386962890625,
0.0489501953125,
-0.0041656494140625,
0.01016998291015625,
0.01090240478515625,
0.0701904296875,
-0.0489501953125,
-0.0535888671875,
-0.050445556640625,
0.0322265625,
-0.027923583984375,
-0.06341552734375,
0.051788330078125,
0.066650390625,
0.061187744140625,
-0.020050048828125,
0.054901123046875,
-0.0140380859375,
0.042877197265625,
-0.026336669921875,
0.048858642578125,
-0.036285400390625,
-0.00978851318359375,
-0.021484375,
-0.06512451171875,
-0.0181121826171875,
0.06585693359375,
-0.030303955078125,
0.0062408447265625,
0.0211944580078125,
0.05035400390625,
0.0026302337646484375,
0.01412200927734375,
0.00016498565673828125,
0.00823974609375,
0.0162353515625,
0.0280914306640625,
0.033294677734375,
-0.06304931640625,
0.03326416015625,
-0.054229736328125,
-0.00720977783203125,
-0.01076507568359375,
-0.045257568359375,
-0.0728759765625,
-0.0267486572265625,
-0.030242919921875,
-0.029541015625,
-0.006359100341796875,
0.086181640625,
0.056304931640625,
-0.0599365234375,
-0.0211181640625,
-0.0056610107421875,
-0.02593994140625,
-0.0277099609375,
-0.0219573974609375,
0.052886962890625,
-0.0280914306640625,
-0.058441162109375,
-0.0245513916015625,
-0.0191192626953125,
0.0177764892578125,
-0.026336669921875,
-0.00974273681640625,
-0.0287933349609375,
-0.007720947265625,
0.030426025390625,
0.002994537353515625,
-0.050018310546875,
-0.0178680419921875,
-0.00782012939453125,
-0.0142974853515625,
0.0203399658203125,
0.01318359375,
-0.043548583984375,
0.0283660888671875,
0.020599365234375,
0.036895751953125,
0.0513916015625,
-0.0008625984191894531,
0.0031719207763671875,
-0.06256103515625,
0.0335693359375,
0.007251739501953125,
0.0271453857421875,
0.0098876953125,
-0.031158447265625,
0.03814697265625,
0.0287017822265625,
-0.04327392578125,
-0.052276611328125,
-0.02569580078125,
-0.0911865234375,
-0.005603790283203125,
0.0740966796875,
-0.0079498291015625,
-0.03741455078125,
0.0162200927734375,
-0.03680419921875,
0.0190277099609375,
-0.021087646484375,
0.053741455078125,
0.06341552734375,
-0.0100250244140625,
0.008331298828125,
-0.036102294921875,
0.043426513671875,
0.040924072265625,
-0.044891357421875,
-0.0211639404296875,
0.0169219970703125,
0.03289794921875,
0.01275634765625,
0.0272216796875,
-0.00743865966796875,
0.02734375,
0.0095367431640625,
0.024688720703125,
-0.0018625259399414062,
-0.0057373046875,
-0.0216522216796875,
0.0067291259765625,
0.004192352294921875,
-0.040191650390625
]
] |
microsoft/deberta-large | 2022-09-26T08:50:58.000Z | [
"transformers",
"pytorch",
"tf",
"deberta",
"deberta-v1",
"fill-mask",
"en",
"arxiv:2006.03654",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | microsoft | null | null | microsoft/deberta-large | 11 | 11,959 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- deberta-v1
- fill-mask
thumbnail: https://huggingface.co/front/thumbnails/microsoft.png
license: mit
---
## DeBERTa: Decoding-enhanced BERT with Disentangled Attention
[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB of training data.
Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.
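A minimal masked-language-modeling sketch using the Transformers `pipeline` API; the example sentence is illustrative rather than from the paper (DeBERTa uses the `[MASK]` token):
```python
from transformers import pipeline

# Masked-token prediction with the pretrained checkpoint.
fill_mask = pipeline("fill-mask", model="microsoft/deberta-large")

# Illustrative sentence; each candidate is returned with a score and token string.
print(fill_mask("Paris is the [MASK] of France."))
```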
#### Fine-tuning on NLU tasks
We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.
| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m/mm | SST-2 | QNLI | CoLA | RTE | MRPC | QQP |STS-B |
|---------------------------|-----------|-----------|-------------|-------|------|------|--------|-------|-------|------|
| | F1/EM | F1/EM | Acc | Acc | Acc | MCC | Acc |Acc/F1 |Acc/F1 |P/S |
| BERT-Large | 90.9/84.1 | 81.8/79.0 | 86.6/- | 93.2 | 92.3 | 60.6 | 70.4 | 88.0/- | 91.3/- |90.0/- |
| RoBERTa-Large | 94.6/88.9 | 89.4/86.5 | 90.2/- | 96.4 | 93.9 | 68.0 | 86.6 | 90.9/- | 92.2/- |92.4/- |
| XLNet-Large | 95.1/89.7 | 90.6/87.9 | 90.8/- | 97.0 | 94.9 | 69.0 | 85.9 | 90.8/- | 92.3/- |92.5/- |
| [DeBERTa-Large](https://huggingface.co/microsoft/deberta-large)<sup>1</sup> | 95.5/90.1 | 90.7/88.0 | 91.3/91.1| 96.5|95.3| 69.5| 91.0| 92.6/94.6| 92.3/- |92.8/92.5 |
| [DeBERTa-XLarge](https://huggingface.co/microsoft/deberta-xlarge)<sup>1</sup> | -/- | -/- | 91.5/91.2| 97.0 | - | - | 93.1 | 92.1/94.3 | - |92.9/92.7|
| [DeBERTa-V2-XLarge](https://huggingface.co/microsoft/deberta-v2-xlarge)<sup>1</sup>|95.8/90.8| 91.4/88.9|91.7/91.6| **97.5**| 95.8|71.1|**93.9**|92.0/94.2|92.3/89.8|92.9/92.9|
|**[DeBERTa-V2-XXLarge](https://huggingface.co/microsoft/deberta-v2-xxlarge)<sup>1,2</sup>**|**96.1/91.4**|**92.2/89.7**|**91.7/91.9**|97.2|**96.0**|**72.0**| 93.5| **93.1/94.9**|**92.7/90.3** |**93.2/93.1** |
--------
#### Notes.
- <sup>1</sup> Following RoBERTa, for RTE, MRPC, and STS-B, we fine-tune the tasks starting from [DeBERTa-Large-MNLI](https://huggingface.co/microsoft/deberta-large-mnli), [DeBERTa-XLarge-MNLI](https://huggingface.co/microsoft/deberta-xlarge-mnli), [DeBERTa-V2-XLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xlarge-mnli), and [DeBERTa-V2-XXLarge-MNLI](https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli). The results on SST-2/QQP/QNLI/SQuADv2 also improve slightly when starting from MNLI fine-tuned models; however, we only report numbers fine-tuned from the pretrained base models for those 4 tasks.
- <sup>2</sup> To try the **XXLarge** model with **[HF transformers](https://huggingface.co/transformers/main_classes/trainer.html)**, you need to specify **--sharded_ddp**
```bash
cd transformers/examples/text-classification/
export TASK_NAME=mrpc
python -m torch.distributed.launch --nproc_per_node=8 run_glue.py --model_name_or_path microsoft/deberta-v2-xxlarge \
--task_name $TASK_NAME --do_train --do_eval --max_seq_length 128 --per_device_train_batch_size 4 \
--learning_rate 3e-6 --num_train_epochs 3 --output_dir /tmp/$TASK_NAME/ --overwrite_output_dir --sharded_ddp --fp16
```
### Citation
If you find DeBERTa useful for your work, please cite the following paper:
```bibtex
@inproceedings{
he2021deberta,
title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
booktitle={International Conference on Learning Representations},
year={2021},
url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
| 3,773 | [
[
-0.033721923828125,
-0.047271728515625,
0.0201416015625,
0.035888671875,
-0.01374053955078125,
0.01535797119140625,
-0.0003330707550048828,
-0.048675537109375,
0.0216217041015625,
0.01163482666015625,
-0.063232421875,
-0.0254364013671875,
-0.0689697265625,
-0.00592803955078125,
-0.0015726089477539062,
0.064697265625,
-0.004749298095703125,
-0.0154571533203125,
-0.01143646240234375,
-0.0137786865234375,
-0.043426513671875,
-0.032958984375,
-0.036956787109375,
-0.032989501953125,
0.0205230712890625,
0.0238037109375,
0.050079345703125,
0.010955810546875,
0.036529541015625,
0.0256805419921875,
-0.03070068359375,
0.0264129638671875,
-0.038543701171875,
-0.004138946533203125,
0.010955810546875,
-0.0244293212890625,
-0.07000732421875,
0.00824737548828125,
0.0244293212890625,
0.0284576416015625,
0.01555633544921875,
0.0250244140625,
0.030029296875,
0.07843017578125,
-0.035308837890625,
0.0110015869140625,
-0.036224365234375,
0.005474090576171875,
0.009124755859375,
-0.0032978057861328125,
-0.014801025390625,
-0.003173828125,
0.006561279296875,
-0.03070068359375,
0.00351715087890625,
-0.0127410888671875,
0.09490966796875,
0.040618896484375,
-0.007678985595703125,
-0.0076904296875,
-0.0293731689453125,
0.08624267578125,
-0.05487060546875,
0.027008056640625,
0.024627685546875,
0.002597808837890625,
-0.0160369873046875,
-0.030853271484375,
-0.027740478515625,
-0.01163482666015625,
-0.015594482421875,
0.024749755859375,
-0.057342529296875,
-0.0092010498046875,
0.0290069580078125,
0.01202392578125,
-0.05206298828125,
0.01242828369140625,
-0.02496337890625,
-0.0006031990051269531,
0.0540771484375,
0.004718780517578125,
0.01441192626953125,
0.00823211669921875,
-0.03759765625,
-0.0106658935546875,
-0.04107666015625,
0.0172882080078125,
0.0096893310546875,
0.0019311904907226562,
-0.0240478515625,
0.0185699462890625,
-0.019561767578125,
0.06854248046875,
0.0298004150390625,
-0.0013399124145507812,
0.05328369140625,
-0.01381683349609375,
-0.033538818359375,
0.0005612373352050781,
0.04791259765625,
0.024017333984375,
-0.0025787353515625,
-0.00870513916015625,
-0.015594482421875,
0.006008148193359375,
0.0071868896484375,
-0.07037353515625,
-0.03363037109375,
0.04150390625,
-0.04461669921875,
-0.018402099609375,
-0.002735137939453125,
-0.04022216796875,
0.0015707015991210938,
-0.044403076171875,
0.02734375,
-0.043304443359375,
-0.02374267578125,
0.004993438720703125,
-0.00531005859375,
0.005641937255859375,
0.03875732421875,
-0.06341552734375,
0.0016107559204101562,
0.035614013671875,
0.055419921875,
-0.00909423828125,
-0.013641357421875,
-0.0458984375,
-0.0152587890625,
-0.0036602020263671875,
0.0228424072265625,
-0.01209259033203125,
0.0092010498046875,
-0.0128631591796875,
0.012176513671875,
-0.0239715576171875,
-0.0262451171875,
0.0136260986328125,
-0.039398193359375,
-0.00183868408203125,
-0.0287628173828125,
-0.029144287109375,
-0.0197296142578125,
0.0288848876953125,
-0.039215087890625,
0.08355712890625,
0.033050537109375,
-0.06610107421875,
0.0144195556640625,
-0.04400634765625,
-0.00817108154296875,
-0.01491546630859375,
-0.0012607574462890625,
-0.04034423828125,
-0.006435394287109375,
0.0190582275390625,
0.04425048828125,
-0.004810333251953125,
-0.004062652587890625,
-0.016632080078125,
-0.0321044921875,
0.004787445068359375,
-0.006252288818359375,
0.09576416015625,
0.0260009765625,
-0.06829833984375,
0.0035724639892578125,
-0.067626953125,
0.0182647705078125,
0.01708984375,
-0.022491455078125,
-0.01111602783203125,
-0.0141754150390625,
0.004680633544921875,
0.040252685546875,
0.044830322265625,
-0.04522705078125,
0.0224456787109375,
-0.02996826171875,
0.043609619140625,
0.043365478515625,
-0.0240478515625,
0.017120361328125,
-0.018096923828125,
0.031494140625,
0.0311126708984375,
0.0293121337890625,
0.020111083984375,
-0.047943115234375,
-0.05670166015625,
-0.04583740234375,
0.0262451171875,
0.05450439453125,
-0.046539306640625,
0.056182861328125,
-0.007335662841796875,
-0.046295166015625,
-0.0413818359375,
0.018951416015625,
0.044525146484375,
0.023529052734375,
0.0390625,
-0.005802154541015625,
-0.04156494140625,
-0.08355712890625,
0.004131317138671875,
-0.0004851818084716797,
-0.0009889602661132812,
0.013153076171875,
0.051605224609375,
-0.0238037109375,
0.0665283203125,
-0.036102294921875,
-0.03619384765625,
-0.013580322265625,
0.004001617431640625,
0.03338623046875,
0.056793212890625,
0.07781982421875,
-0.05511474609375,
-0.047882080078125,
-0.0169219970703125,
-0.04925537109375,
0.01543426513671875,
0.000003933906555175781,
-0.0195770263671875,
0.045135498046875,
0.0187225341796875,
-0.044921875,
0.0382080078125,
0.0537109375,
-0.03692626953125,
0.0189971923828125,
-0.02410888671875,
0.01413726806640625,
-0.07598876953125,
0.0155792236328125,
-0.00179290771484375,
-0.0219573974609375,
-0.041046142578125,
-0.005985260009765625,
0.01177978515625,
0.0239410400390625,
-0.02593994140625,
0.0256195068359375,
-0.048370361328125,
0.007358551025390625,
-0.0156707763671875,
0.01800537109375,
0.0095672607421875,
0.06256103515625,
-0.00409698486328125,
0.051116943359375,
0.04248046875,
-0.0345458984375,
0.0199432373046875,
0.04302978515625,
-0.02386474609375,
0.033050537109375,
-0.06500244140625,
0.014678955078125,
-0.0138702392578125,
0.01357269287109375,
-0.08245849609375,
0.009979248046875,
0.02532958984375,
-0.039215087890625,
0.044219970703125,
-0.01073455810546875,
-0.039398193359375,
-0.04034423828125,
-0.0283203125,
-0.0015726089477539062,
0.056549072265625,
-0.051849365234375,
0.0180206298828125,
0.029266357421875,
0.00934600830078125,
-0.05303955078125,
-0.06134033203125,
-0.00925445556640625,
-0.01441192626953125,
-0.06378173828125,
0.05487060546875,
-0.0157623291015625,
-0.007061004638671875,
-0.006267547607421875,
-0.006618499755859375,
-0.0154266357421875,
0.0233001708984375,
0.0261688232421875,
0.0345458984375,
-0.004772186279296875,
-0.004642486572265625,
0.00730133056640625,
0.0028133392333984375,
-0.01111602783203125,
0.0013055801391601562,
0.039215087890625,
-0.0254974365234375,
-0.003314971923828125,
-0.0288238525390625,
0.0196533203125,
0.043914794921875,
-0.0276947021484375,
0.05902099609375,
0.07037353515625,
-0.0201416015625,
0.0018901824951171875,
-0.04010009765625,
-0.016448974609375,
-0.034912109375,
0.01971435546875,
-0.032257080078125,
-0.060394287109375,
0.05035400390625,
0.01617431640625,
0.021209716796875,
0.047119140625,
0.04461669921875,
-0.01027679443359375,
0.08843994140625,
0.049346923828125,
-0.0254669189453125,
0.043731689453125,
-0.056182861328125,
-0.00296783447265625,
-0.0750732421875,
-0.0164794921875,
-0.032867431640625,
-0.05047607421875,
-0.038818359375,
-0.0180511474609375,
0.01560211181640625,
0.034210205078125,
-0.0219573974609375,
0.060821533203125,
-0.08123779296875,
0.0021076202392578125,
0.055938720703125,
0.0380859375,
-0.0022144317626953125,
0.005687713623046875,
0.0114898681640625,
-0.00775146484375,
-0.057525634765625,
-0.0305023193359375,
0.057708740234375,
0.0286865234375,
0.0380859375,
0.01459503173828125,
0.06475830078125,
0.0111541748046875,
-0.00676727294921875,
-0.0248565673828125,
0.032928466796875,
-0.010650634765625,
-0.041412353515625,
-0.0139923095703125,
-0.0258941650390625,
-0.08551025390625,
0.015533447265625,
-0.0100860595703125,
-0.08685302734375,
0.031341552734375,
0.0308074951171875,
-0.0347900390625,
0.01245880126953125,
-0.040496826171875,
0.06915283203125,
-0.0099029541015625,
-0.0286102294921875,
-0.0211944580078125,
-0.052459716796875,
0.016571044921875,
0.01904296875,
-0.013763427734375,
-0.0225982666015625,
0.00409698486328125,
0.062408447265625,
-0.0226898193359375,
0.0582275390625,
-0.027862548828125,
-0.0219268798828125,
0.0279998779296875,
-0.002483367919921875,
0.0531005859375,
-0.0027561187744140625,
-0.0017309188842773438,
0.01885986328125,
0.022918701171875,
-0.033782958984375,
-0.034759521484375,
0.059356689453125,
-0.06329345703125,
-0.025726318359375,
-0.034698486328125,
-0.045318603515625,
-0.0201263427734375,
-0.0011272430419921875,
0.02410888671875,
0.032470703125,
0.003681182861328125,
0.01480865478515625,
0.06256103515625,
-0.0097503662109375,
0.038604736328125,
0.03948974609375,
0.01416015625,
-0.01137542724609375,
0.06170654296875,
0.007724761962890625,
0.00530242919921875,
0.035736083984375,
-0.0229949951171875,
-0.023712158203125,
-0.040283203125,
-0.03857421875,
0.00653839111328125,
-0.03973388671875,
-0.0308685302734375,
-0.051727294921875,
-0.005786895751953125,
-0.0272216796875,
0.005077362060546875,
-0.0301055908203125,
-0.043792724609375,
-0.05487060546875,
0.020782470703125,
0.050689697265625,
0.040252685546875,
-0.00344085693359375,
0.01085662841796875,
-0.06719970703125,
0.01288604736328125,
0.0078582763671875,
0.0184783935546875,
-0.0012578964233398438,
-0.04083251953125,
-0.0190887451171875,
0.0240478515625,
-0.04534912109375,
-0.061126708984375,
0.033477783203125,
0.0032024383544921875,
0.046539306640625,
0.0004115104675292969,
0.007678985595703125,
0.04913330078125,
-0.0306396484375,
0.058929443359375,
0.024932861328125,
-0.06207275390625,
0.0535888671875,
-0.019256591796875,
0.0216522216796875,
0.046539306640625,
0.033447265625,
-0.0004374980926513672,
-0.0238800048828125,
-0.060699462890625,
-0.057342529296875,
0.075439453125,
0.0391845703125,
-0.01096343994140625,
0.00745391845703125,
0.0124664306640625,
-0.01505279541015625,
0.0173187255859375,
-0.0279083251953125,
-0.035125732421875,
-0.01387786865234375,
-0.0215606689453125,
-0.001678466796875,
-0.0226898193359375,
-0.0070343017578125,
-0.035736083984375,
0.068359375,
-0.003017425537109375,
0.043121337890625,
0.03759765625,
-0.0206451416015625,
-0.0008053779602050781,
-0.0010862350463867188,
0.0628662109375,
0.06378173828125,
-0.033050537109375,
-0.0174102783203125,
0.01496124267578125,
-0.032623291015625,
-0.0023365020751953125,
0.0167236328125,
0.0027923583984375,
0.01502227783203125,
0.01959228515625,
0.0699462890625,
0.00281524658203125,
-0.038818359375,
0.028045654296875,
0.004802703857421875,
-0.031585693359375,
-0.01535797119140625,
-0.0013437271118164062,
-0.0021343231201171875,
0.042938232421875,
0.0214691162109375,
0.01163482666015625,
0.0128631591796875,
-0.0284423828125,
0.01552581787109375,
0.048553466796875,
-0.04290771484375,
-0.02276611328125,
0.0506591796875,
0.00756072998046875,
0.0016050338745117188,
0.04034423828125,
-0.0186920166015625,
-0.050994873046875,
0.062225341796875,
0.0264892578125,
0.060302734375,
-0.01143646240234375,
0.0042572021484375,
0.0506591796875,
0.0253753662109375,
0.00859832763671875,
0.04461669921875,
0.004131317138671875,
-0.02618408203125,
-0.0208282470703125,
-0.049835205078125,
-0.0017175674438476562,
0.020233154296875,
-0.05157470703125,
0.00262451171875,
-0.009552001953125,
-0.02593994140625,
0.0131988525390625,
0.0284576416015625,
-0.06494140625,
0.01312255859375,
0.0079498291015625,
0.07305908203125,
-0.0391845703125,
0.06378173828125,
0.05401611328125,
-0.032623291015625,
-0.047943115234375,
-0.02099609375,
-0.0088043212890625,
-0.06329345703125,
0.076904296875,
0.01349639892578125,
0.007171630859375,
0.0015201568603515625,
-0.0279998779296875,
-0.07086181640625,
0.09344482421875,
0.0267181396484375,
-0.06878662109375,
-0.00391387939453125,
-0.00008225440979003906,
0.0330810546875,
-0.017425537109375,
0.0196533203125,
0.04290771484375,
0.035675048828125,
-0.005340576171875,
-0.08331298828125,
0.02734375,
-0.02532958984375,
0.00760650634765625,
0.01555633544921875,
-0.068115234375,
0.07781982421875,
-0.01041412353515625,
0.0135955810546875,
0.01141357421875,
0.045806884765625,
0.0208282470703125,
0.004962921142578125,
0.043914794921875,
0.050994873046875,
0.043670654296875,
-0.0168609619140625,
0.0679931640625,
-0.037139892578125,
0.04766845703125,
0.06829833984375,
0.0130157470703125,
0.050323486328125,
0.036468505859375,
-0.03564453125,
0.0328369140625,
0.049957275390625,
-0.01361846923828125,
0.03424072265625,
0.01244354248046875,
0.00652313232421875,
-0.0171661376953125,
0.0248870849609375,
-0.0350341796875,
0.032318115234375,
0.008392333984375,
-0.034332275390625,
-0.0150299072265625,
0.00579833984375,
0.005474090576171875,
-0.01071929931640625,
-0.0182647705078125,
0.049591064453125,
-0.0030651092529296875,
-0.052764892578125,
0.0823974609375,
-0.0171051025390625,
0.06341552734375,
-0.038299560546875,
-0.01021575927734375,
-0.004039764404296875,
0.0382080078125,
-0.0257720947265625,
-0.055694580078125,
0.0192718505859375,
-0.00801849365234375,
-0.02471923828125,
-0.009674072265625,
0.04913330078125,
-0.028472900390625,
-0.0301513671875,
0.02838134765625,
0.0277862548828125,
0.0114593505859375,
-0.0233612060546875,
-0.09088134765625,
0.028228759765625,
0.0182647705078125,
-0.038818359375,
0.037261962890625,
0.0097503662109375,
0.013671875,
0.0352783203125,
0.0155792236328125,
-0.029571533203125,
0.001567840576171875,
-0.0172882080078125,
0.07403564453125,
-0.0218963623046875,
-0.0210418701171875,
-0.06268310546875,
0.04791259765625,
-0.0164337158203125,
-0.029144287109375,
0.06939697265625,
0.03594970703125,
0.038970947265625,
-0.020111083984375,
0.0384521484375,
-0.031768798828125,
0.025787353515625,
-0.033782958984375,
0.058135986328125,
-0.06793212890625,
-0.009124755859375,
-0.035797119140625,
-0.06829833984375,
0.0026702880859375,
0.0543212890625,
-0.0007748603820800781,
0.0086517333984375,
0.01678466796875,
0.049163818359375,
-0.0083770751953125,
-0.0176544189453125,
0.01067352294921875,
0.01163482666015625,
0.0186767578125,
0.0750732421875,
0.036041259765625,
-0.0615234375,
0.03558349609375,
-0.03863525390625,
-0.033111572265625,
-0.029052734375,
-0.058624267578125,
-0.081787109375,
-0.056304931640625,
-0.05426025390625,
-0.0390625,
-0.0032329559326171875,
0.06671142578125,
0.071533203125,
-0.0648193359375,
0.01413726806640625,
-0.01528167724609375,
-0.0078887939453125,
-0.03961181640625,
-0.017120361328125,
0.0394287109375,
-0.033935546875,
-0.07757568359375,
0.024200439453125,
-0.00787353515625,
0.02410888671875,
-0.00868988037109375,
-0.017974853515625,
-0.0245513916015625,
-0.004154205322265625,
0.057769775390625,
0.0178680419921875,
-0.04913330078125,
-0.0142364501953125,
0.006710052490234375,
-0.009918212890625,
0.0081939697265625,
0.00852203369140625,
-0.056182861328125,
0.005062103271484375,
0.043182373046875,
0.016326904296875,
0.042999267578125,
-0.0168609619140625,
0.01279449462890625,
-0.0587158203125,
0.032379150390625,
0.017822265625,
0.031341552734375,
0.0040130615234375,
-0.034881591796875,
0.04583740234375,
-0.0097503662109375,
-0.04443359375,
-0.0670166015625,
0.00736236572265625,
-0.109375,
-0.0232086181640625,
0.07147216796875,
-0.02447509765625,
-0.02056884765625,
0.006954193115234375,
-0.0258636474609375,
0.0118560791015625,
-0.031036376953125,
0.053985595703125,
0.037384033203125,
-0.019317626953125,
0.0024394989013671875,
-0.03607177734375,
0.05596923828125,
0.041046142578125,
-0.04400634765625,
0.0012187957763671875,
0.0229644775390625,
0.018310546875,
0.040802001953125,
0.04278564453125,
-0.0032978057861328125,
0.028839111328125,
-0.011383056640625,
0.0005364418029785156,
-0.026824951171875,
-0.0166015625,
-0.0121307373046875,
-0.01520538330078125,
-0.01013946533203125,
-0.044097900390625
]
] |
cardiffnlp/twitter-roberta-base | 2023-02-07T15:33:34.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"roberta",
"fill-mask",
"arxiv:2010.12421",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | cardiffnlp | null | null | cardiffnlp/twitter-roberta-base | 16 | 11,929 | transformers | 2022-03-02T23:29:05 | # Twitter-roBERTa-base
This is a RoBERTa-base model trained on ~58M tweets, starting from the original RoBERTa-base checkpoint, as described and evaluated in the [_TweetEval_ benchmark (Findings of EMNLP 2020)](https://arxiv.org/pdf/2010.12421.pdf).
To evaluate this and other LMs on Twitter-specific data, please refer to the official [TweetEval repository](https://github.com/cardiffnlp/tweeteval).
## Preprocess Text
Replace usernames and links with the placeholders "@user" and "http".
```python
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
```
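For example (the tweet below is only an illustrative string, not taken from the original card):
```python
print(preprocess("Good morning @BarackObama https://t.co/abc123"))
# Good morning @user http
```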
## Example Masked Language Model
```python
from transformers import pipeline, AutoTokenizer
import numpy as np
MODEL = "cardiffnlp/twitter-roberta-base"
fill_mask = pipeline("fill-mask", model=MODEL, tokenizer=MODEL)
tokenizer = AutoTokenizer.from_pretrained(MODEL)
def print_candidates():
for i in range(5):
token = tokenizer.decode(candidates[i]['token'])
score = np.round(candidates[i]['score'], 4)
print(f"{i+1}) {token} {score}")
texts = [
"I am so <mask> 😊",
"I am so <mask> 😢"
]
for text in texts:
t = preprocess(text)
print(f"{'-'*30}\n{t}")
candidates = fill_mask(t)
print_candidates()
```
Output:
```
------------------------------
I am so <mask> 😊
1) happy 0.402
2) excited 0.1441
3) proud 0.143
4) grateful 0.0669
5) blessed 0.0334
------------------------------
I am so <mask> 😢
1) sad 0.2641
2) sorry 0.1605
3) tired 0.138
4) sick 0.0278
5) hungry 0.0232
```
## Example Tweet Embeddings
```python
from transformers import AutoTokenizer, AutoModel
import numpy as np
from scipy.spatial.distance import cosine
from collections import defaultdict

MODEL = "cardiffnlp/twitter-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModel.from_pretrained(MODEL)

def get_embedding(text):
    # Uses the preprocess() helper defined above.
    text = preprocess(text)
    encoded_input = tokenizer(text, return_tensors='pt')
    features = model(**encoded_input)
    features = features[0].detach().cpu().numpy()
    # Mean-pool the token embeddings into a single tweet vector.
    features_mean = np.mean(features[0], axis=0)
    return features_mean
query = "The book was awesome"
tweets = ["I just ordered fried chicken 🐣",
"The movie was great",
"What time is the next game?",
"Just finished reading 'Embeddings in NLP'"]
d = defaultdict(int)
for tweet in tweets:
sim = 1-cosine(get_embedding(query),get_embedding(tweet))
d[tweet] = sim
print('Most similar to: ',query)
print('----------------------------------------')
for idx,x in enumerate(sorted(d.items(), key=lambda x:x[1], reverse=True)):
print(idx+1,x[0])
```
Output:
```
Most similar to: The book was awesome
----------------------------------------
1 The movie was great
2 Just finished reading 'Embeddings in NLP'
3 I just ordered fried chicken 🐣
4 What time is the next game?
```
## Example Feature Extraction
```python
from transformers import AutoTokenizer, AutoModel, TFAutoModel
import numpy as np
MODEL = "cardiffnlp/twitter-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
text = "Good night 😊"
text = preprocess(text)
# Pytorch
model = AutoModel.from_pretrained(MODEL)
encoded_input = tokenizer(text, return_tensors='pt')
features = model(**encoded_input)
features = features[0].detach().cpu().numpy()
features_mean = np.mean(features[0], axis=0)
#features_max = np.max(features[0], axis=0)
# # Tensorflow
# model = TFAutoModel.from_pretrained(MODEL)
# encoded_input = tokenizer(text, return_tensors='tf')
# features = model(encoded_input)
# features = features[0].numpy()
# features_mean = np.mean(features[0], axis=0)
# #features_max = np.max(features[0], axis=0)
```
### BibTeX entry and citation info
Please cite the [reference paper](https://aclanthology.org/2020.findings-emnlp.148/) if you use this model.
```bibtex
@inproceedings{barbieri-etal-2020-tweeteval,
title = "{T}weet{E}val: Unified Benchmark and Comparative Evaluation for Tweet Classification",
author = "Barbieri, Francesco and
Camacho-Collados, Jose and
Espinosa Anke, Luis and
Neves, Leonardo",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.findings-emnlp.148",
doi = "10.18653/v1/2020.findings-emnlp.148",
pages = "1644--1650"
}
``` | 4,616 | [
[
-0.0162200927734375,
-0.049072265625,
0.0156402587890625,
0.0203704833984375,
-0.015869140625,
0.00768280029296875,
-0.0193939208984375,
-0.01120758056640625,
0.0301971435546875,
0.00897216796875,
-0.03265380859375,
-0.051666259765625,
-0.0601806640625,
0.00989532470703125,
-0.038909912109375,
0.056610107421875,
-0.002468109130859375,
-0.00472259521484375,
0.0195770263671875,
-0.01476287841796875,
0.00045418739318847656,
-0.03265380859375,
-0.0447998046875,
-0.0139617919921875,
0.0145111083984375,
0.00640106201171875,
0.03350830078125,
0.05078125,
0.02880859375,
0.033935546875,
0.006671905517578125,
0.0071868896484375,
-0.0247039794921875,
-0.00327301025390625,
-0.0007033348083496094,
-0.0234832763671875,
-0.019500732421875,
0.00705718994140625,
0.05059814453125,
0.035675048828125,
0.00605010986328125,
0.0234222412109375,
0.005619049072265625,
0.0242767333984375,
-0.0281524658203125,
0.010772705078125,
-0.039276123046875,
0.0031280517578125,
-0.005954742431640625,
-0.01392364501953125,
-0.0108184814453125,
-0.0419921875,
0.00872802734375,
-0.034027099609375,
0.019256591796875,
-0.007476806640625,
0.100830078125,
0.0249176025390625,
-0.007122039794921875,
-0.023468017578125,
-0.0278778076171875,
0.0782470703125,
-0.06024169921875,
0.01366424560546875,
0.0230560302734375,
0.0070648193359375,
0.0009522438049316406,
-0.05322265625,
-0.04290771484375,
-0.010284423828125,
-0.006175994873046875,
0.005420684814453125,
-0.0270538330078125,
-0.021453857421875,
0.0161590576171875,
0.01038360595703125,
-0.03717041015625,
-0.00891876220703125,
-0.0213470458984375,
-0.007442474365234375,
0.042083740234375,
-0.0036716461181640625,
0.0257110595703125,
-0.0308685302734375,
-0.0166168212890625,
-0.0203857421875,
-0.01526641845703125,
-0.00688934326171875,
-0.002193450927734375,
0.01366424560546875,
-0.027069091796875,
0.048309326171875,
-0.0109710693359375,
0.036163330078125,
0.01105499267578125,
-0.0019931793212890625,
0.051605224609375,
-0.025604248046875,
-0.0201263427734375,
-0.0172119140625,
0.0869140625,
0.0295867919921875,
0.03564453125,
-0.006443023681640625,
-0.0117950439453125,
-0.01290130615234375,
-0.01375579833984375,
-0.059906005859375,
-0.0159454345703125,
0.0211944580078125,
-0.034637451171875,
-0.0345458984375,
0.009765625,
-0.045684814453125,
-0.0036754608154296875,
0.007099151611328125,
0.055816650390625,
-0.049224853515625,
-0.0259552001953125,
-0.0047760009765625,
-0.022674560546875,
0.01329803466796875,
0.00862884521484375,
-0.0628662109375,
0.00023889541625976562,
0.038726806640625,
0.07781982421875,
0.00965118408203125,
-0.040069580078125,
-0.0240631103515625,
0.0092620849609375,
-0.0222015380859375,
0.04150390625,
-0.0274200439453125,
-0.00553131103515625,
0.001140594482421875,
-0.0007343292236328125,
-0.0255279541015625,
-0.027740478515625,
0.017974853515625,
-0.0181732177734375,
0.0203094482421875,
-0.0147857666015625,
-0.04888916015625,
-0.004730224609375,
0.021575927734375,
-0.0347900390625,
0.09033203125,
0.021575927734375,
-0.05615234375,
0.0238800048828125,
-0.06036376953125,
-0.0180206298828125,
-0.01296234130859375,
-0.0108642578125,
-0.034881591796875,
-0.0053253173828125,
0.040130615234375,
0.0435791015625,
-0.01666259765625,
0.0148468017578125,
-0.023162841796875,
-0.0211334228515625,
0.018951416015625,
-0.01216888427734375,
0.09539794921875,
0.014068603515625,
-0.04498291015625,
-0.004230499267578125,
-0.0462646484375,
0.01174163818359375,
0.0267486572265625,
-0.0175018310546875,
-0.004108428955078125,
-0.0175933837890625,
0.002880096435546875,
0.018707275390625,
0.0225830078125,
-0.0482177734375,
0.0132293701171875,
-0.03717041015625,
0.059051513671875,
0.0517578125,
0.0030994415283203125,
0.02978515625,
-0.03802490234375,
0.0164337158203125,
0.006465911865234375,
0.0109710693359375,
-0.00540924072265625,
-0.042083740234375,
-0.06512451171875,
-0.017730712890625,
0.03289794921875,
0.034423828125,
-0.048309326171875,
0.0531005859375,
-0.03497314453125,
-0.0440673828125,
-0.05078125,
-0.00414276123046875,
0.0184478759765625,
0.027099609375,
0.0411376953125,
0.0016546249389648438,
-0.0570068359375,
-0.04998779296875,
-0.03399658203125,
-0.0175628662109375,
-0.0006170272827148438,
0.0194854736328125,
0.0609130859375,
-0.021331787109375,
0.061859130859375,
-0.0303802490234375,
-0.0185699462890625,
-0.0176849365234375,
0.0103607177734375,
0.03253173828125,
0.059722900390625,
0.05853271484375,
-0.0426025390625,
-0.056427001953125,
-0.02154541015625,
-0.060211181640625,
-0.00927734375,
-0.002716064453125,
-0.014495849609375,
0.0262908935546875,
0.03424072265625,
-0.0482177734375,
0.034027099609375,
0.0226593017578125,
-0.026641845703125,
0.016357421875,
0.0008559226989746094,
0.0182647705078125,
-0.105224609375,
0.004238128662109375,
-0.00018405914306640625,
-0.0015411376953125,
-0.034637451171875,
-0.012298583984375,
-0.004894256591796875,
0.01030731201171875,
-0.0222320556640625,
0.050994873046875,
-0.02764892578125,
0.0013036727905273438,
0.00698089599609375,
0.00799560546875,
-0.003978729248046875,
0.033050537109375,
-0.01238250732421875,
0.034820556640625,
0.043365478515625,
-0.0309600830078125,
0.0240631103515625,
0.0255279541015625,
-0.024505615234375,
0.0126190185546875,
-0.051361083984375,
0.002002716064453125,
0.01300048828125,
0.0106964111328125,
-0.08551025390625,
-0.012237548828125,
0.0226898193359375,
-0.07110595703125,
0.01366424560546875,
-0.01934814453125,
-0.054107666015625,
-0.033538818359375,
-0.029083251953125,
0.02093505859375,
0.044830322265625,
-0.0428466796875,
0.047119140625,
0.0250244140625,
0.0183868408203125,
-0.04827880859375,
-0.0740966796875,
0.00860595703125,
-0.010986328125,
-0.048492431640625,
0.034881591796875,
0.005157470703125,
-0.0011720657348632812,
0.0175933837890625,
0.012542724609375,
-0.005615234375,
0.0021572113037109375,
0.009429931640625,
0.005401611328125,
-0.0143585205078125,
0.003612518310546875,
-0.01467132568359375,
-0.01136016845703125,
-0.00036716461181640625,
-0.0299072265625,
0.0621337890625,
-0.017333984375,
-0.00841522216796875,
-0.038665771484375,
0.013824462890625,
0.0208740234375,
-0.00606536865234375,
0.06927490234375,
0.0885009765625,
-0.041473388671875,
-0.004405975341796875,
-0.05035400390625,
-0.0130462646484375,
-0.03741455078125,
0.04376220703125,
-0.0290069580078125,
-0.053985595703125,
0.047882080078125,
0.0146636962890625,
0.0099945068359375,
0.07330322265625,
0.049224853515625,
-0.00972747802734375,
0.06689453125,
0.0302734375,
-0.0070343017578125,
0.04718017578125,
-0.06829833984375,
0.0063018798828125,
-0.052398681640625,
-0.0208740234375,
-0.042236328125,
-0.0175628662109375,
-0.060943603515625,
-0.037811279296875,
0.005275726318359375,
0.00438690185546875,
-0.041534423828125,
0.038177490234375,
-0.04949951171875,
0.0126190185546875,
0.050048828125,
0.0083160400390625,
-0.0131988525390625,
0.0107269287109375,
-0.0248260498046875,
-0.00860595703125,
-0.051910400390625,
-0.0271453857421875,
0.0906982421875,
0.02679443359375,
0.036285400390625,
0.005207061767578125,
0.079833984375,
0.0047454833984375,
0.024139404296875,
-0.039306640625,
0.041839599609375,
-0.0201263427734375,
-0.0491943359375,
-0.0224151611328125,
-0.041656494140625,
-0.0582275390625,
0.006969451904296875,
-0.01593017578125,
-0.0609130859375,
0.01114654541015625,
-0.0133209228515625,
-0.025238037109375,
0.038482666015625,
-0.050994873046875,
0.056427001953125,
-0.00879669189453125,
-0.014556884765625,
0.0016202926635742188,
-0.033538818359375,
0.003711700439453125,
0.003265380859375,
0.006313323974609375,
-0.01055145263671875,
-0.02056884765625,
0.08074951171875,
-0.039337158203125,
0.053466796875,
-0.01219940185546875,
0.031982421875,
0.0154876708984375,
-0.003307342529296875,
0.0160675048828125,
-0.003204345703125,
-0.0224151611328125,
0.0175323486328125,
0.0018777847290039062,
-0.04107666015625,
-0.027740478515625,
0.06109619140625,
-0.0853271484375,
-0.039154052734375,
-0.049407958984375,
-0.02374267578125,
0.005859375,
0.0294342041015625,
0.04339599609375,
0.03118896484375,
0.0009431838989257812,
0.0266876220703125,
0.0173187255859375,
-0.021209716796875,
0.06610107421875,
0.01166534423828125,
-0.009246826171875,
-0.050506591796875,
0.058685302734375,
0.02325439453125,
0.004619598388671875,
0.035369873046875,
0.0247650146484375,
-0.02764892578125,
-0.03460693359375,
-0.016357421875,
0.036376953125,
-0.050567626953125,
-0.0174407958984375,
-0.0733642578125,
-0.045166015625,
-0.0540771484375,
-0.00978851318359375,
-0.0278778076171875,
-0.0447998046875,
-0.0379638671875,
0.004241943359375,
0.03369140625,
0.06341552734375,
-0.0244293212890625,
0.0142059326171875,
-0.04827880859375,
0.015289306640625,
0.00688934326171875,
0.01407623291015625,
0.00754547119140625,
-0.06549072265625,
-0.0308685302734375,
0.005329132080078125,
-0.027618408203125,
-0.059051513671875,
0.053955078125,
0.02490234375,
0.043182373046875,
0.0211639404296875,
-0.0020732879638671875,
0.058380126953125,
-0.0217132568359375,
0.06072998046875,
0.01093292236328125,
-0.071044921875,
0.04339599609375,
-0.020660400390625,
0.0258941650390625,
0.031646728515625,
0.031463623046875,
-0.03070068359375,
-0.0273590087890625,
-0.067138671875,
-0.061920166015625,
0.05902099609375,
0.03643798828125,
0.0088958740234375,
-0.015777587890625,
0.019195556640625,
-0.0175323486328125,
0.010467529296875,
-0.0616455078125,
-0.03741455078125,
-0.0333251953125,
-0.03179931640625,
-0.010467529296875,
-0.008514404296875,
-0.00293731689453125,
-0.046295166015625,
0.0506591796875,
0.01142120361328125,
0.0560302734375,
0.0234832763671875,
-0.018524169921875,
-0.005401611328125,
-0.01207733154296875,
0.0345458984375,
0.049041748046875,
-0.035400390625,
-0.0065460205078125,
0.0169219970703125,
-0.0367431640625,
-0.004543304443359375,
0.016632080078125,
-0.01031494140625,
0.0226593017578125,
0.04705810546875,
0.044036865234375,
0.01390838623046875,
-0.0091094970703125,
0.04327392578125,
-0.00417327880859375,
-0.0278167724609375,
-0.038970947265625,
-0.0008821487426757812,
0.01132965087890625,
0.0231475830078125,
0.0604248046875,
0.0127105712890625,
-0.01183319091796875,
-0.03717041015625,
0.0274810791015625,
0.0178070068359375,
-0.01483154296875,
-0.0227813720703125,
0.04705810546875,
-0.0028591156005859375,
-0.03692626953125,
0.041595458984375,
-0.0149383544921875,
-0.0670166015625,
0.061004638671875,
0.03765869140625,
0.0888671875,
-0.00693511962890625,
0.017974853515625,
0.0628662109375,
0.0292816162109375,
0.0012674331665039062,
0.02056884765625,
0.01284027099609375,
-0.062225341796875,
-0.00853729248046875,
-0.05828857421875,
0.00007838010787963867,
0.0096282958984375,
-0.031280517578125,
0.01885986328125,
-0.045806884765625,
-0.024322509765625,
0.016448974609375,
0.022491455078125,
-0.061920166015625,
0.0173187255859375,
-0.00847625732421875,
0.057861328125,
-0.06378173828125,
0.0635986328125,
0.050994873046875,
-0.04443359375,
-0.06890869140625,
0.0090789794921875,
-0.0139617919921875,
-0.06890869140625,
0.060546875,
0.028045654296875,
0.0179290771484375,
0.02105712890625,
-0.044219970703125,
-0.0833740234375,
0.08941650390625,
0.0157623291015625,
-0.01105499267578125,
-0.01061248779296875,
0.01485443115234375,
0.041015625,
-0.049041748046875,
0.05194091796875,
0.0355224609375,
0.0261077880859375,
0.00327301025390625,
-0.056976318359375,
0.0165252685546875,
-0.0291748046875,
0.0008330345153808594,
-0.0002161264419555664,
-0.05914306640625,
0.088623046875,
-0.01349639892578125,
-0.0083465576171875,
0.0079498291015625,
0.046356201171875,
0.0213470458984375,
0.01136016845703125,
0.0357666015625,
0.053741455078125,
0.03936767578125,
-0.0224609375,
0.07000732421875,
-0.0282440185546875,
0.061187744140625,
0.0560302734375,
0.035614013671875,
0.06402587890625,
0.0296173095703125,
-0.0211334228515625,
0.040313720703125,
0.0579833984375,
-0.0211639404296875,
0.04132080078125,
-0.0010538101196289062,
0.01062774658203125,
-0.01139068603515625,
-0.00019097328186035156,
-0.038909912109375,
0.0312347412109375,
0.021728515625,
-0.04766845703125,
-0.0233612060546875,
-0.0100250244140625,
0.01015472412109375,
-0.0119171142578125,
-0.0012617111206054688,
0.04510498046875,
0.01378631591796875,
-0.03997802734375,
0.07098388671875,
0.0006947517395019531,
0.059722900390625,
-0.021942138671875,
0.004314422607421875,
-0.005481719970703125,
0.02215576171875,
-0.018798828125,
-0.059539794921875,
0.016998291015625,
0.0026149749755859375,
0.0010852813720703125,
-0.00888824462890625,
0.022979736328125,
-0.030364990234375,
-0.046661376953125,
0.0396728515625,
0.026153564453125,
0.0170135498046875,
0.007568359375,
-0.09808349609375,
0.0075225830078125,
0.0150909423828125,
-0.03851318359375,
0.00039887428283691406,
0.035888671875,
0.0212860107421875,
0.048797607421875,
0.048675537109375,
0.005596160888671875,
0.0218963623046875,
0.01235198974609375,
0.0672607421875,
-0.061920166015625,
-0.0297698974609375,
-0.08465576171875,
0.0310211181640625,
-0.0186004638671875,
-0.0426025390625,
0.0621337890625,
0.047119140625,
0.05682373046875,
0.0015783309936523438,
0.058349609375,
-0.028656005859375,
0.044921875,
-0.0194549560546875,
0.061187744140625,
-0.05511474609375,
0.01271820068359375,
-0.024749755859375,
-0.06884765625,
-0.0172119140625,
0.06805419921875,
-0.032379150390625,
0.036224365234375,
0.060821533203125,
0.058990478515625,
0.00020885467529296875,
-0.01983642578125,
0.01153564453125,
0.035430908203125,
0.0284271240234375,
0.049224853515625,
0.03948974609375,
-0.069091796875,
0.04736328125,
-0.050506591796875,
-0.012420654296875,
-0.0330810546875,
-0.055145263671875,
-0.08758544921875,
-0.061737060546875,
-0.023773193359375,
-0.058807373046875,
0.0005178451538085938,
0.083740234375,
0.0401611328125,
-0.0736083984375,
-0.0162811279296875,
-0.00725555419921875,
0.006256103515625,
0.0006170272827148438,
-0.02105712890625,
0.056121826171875,
-0.0238800048828125,
-0.06365966796875,
-0.003997802734375,
-0.0010766983032226562,
0.008880615234375,
-0.004241943359375,
-0.0033111572265625,
-0.053466796875,
0.0009546279907226562,
0.0176544189453125,
0.008819580078125,
-0.054779052734375,
-0.0212249755859375,
0.0096282958984375,
-0.031982421875,
0.007404327392578125,
0.01293182373046875,
-0.035888671875,
0.0245208740234375,
0.04425048828125,
0.0335693359375,
0.059356689453125,
-0.0029296875,
0.032867431640625,
-0.036102294921875,
0.00930023193359375,
0.02374267578125,
0.0306396484375,
0.0357666015625,
-0.01529693603515625,
0.04931640625,
0.03228759765625,
-0.042572021484375,
-0.0693359375,
-0.0186004638671875,
-0.068359375,
-0.01154327392578125,
0.08831787109375,
-0.0186767578125,
-0.0390625,
0.0037403106689453125,
0.01187896728515625,
0.052001953125,
-0.0386962890625,
0.057708740234375,
0.0445556640625,
0.008819580078125,
-0.0090789794921875,
-0.03570556640625,
0.0399169921875,
0.032073974609375,
-0.03094482421875,
-0.032501220703125,
-0.01349639892578125,
0.04473876953125,
0.0234832763671875,
0.0482177734375,
-0.001384735107421875,
0.016571044921875,
0.00494384765625,
0.00058746337890625,
-0.01386260986328125,
0.0038299560546875,
-0.0184783935546875,
-0.0016841888427734375,
-0.024658203125,
-0.03131103515625
]
] |
jplu/tf-camembert-base | 2020-12-11T21:47:52.000Z | [
"transformers",
"tf",
"camembert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | jplu | null | null | jplu/tf-camembert-base | 0 | 11,917 | transformers | 2022-03-02T23:29:05 | # Tensorflow CamemBERT
This repository provides TensorFlow versions of the CamemBERT model.
## CamemBERT
[CamemBERT](https://camembert-model.fr/) is a state-of-the-art language model for French, based on the RoBERTa architecture and pretrained on the French subcorpus of the multilingual OSCAR corpus.
## Model Weights
| Model | Downloads
| -------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `jplu/tf-camembert-base` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-camembert-base/config.json) • [`tf_model.h5`](https://s3.amazonaws.com/models.huggingface.co/bert/jplu/tf-camembert-base/tf_model.h5)
## Usage
With Transformers >= 2.4, the TensorFlow CamemBERT models can be loaded as follows:
```python
from transformers import TFCamembertModel
model = TFCamembertModel.from_pretrained("jplu/tf-camembert-base")
```
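As a quick sanity check, the TF weights can also be used through the `fill-mask` pipeline. The snippet below is a minimal sketch rather than part of the original card; it assumes a reasonably recent Transformers version, pairs the repository with the `camembert-base` tokenizer, and selects the TensorFlow backend explicitly:
```python
from transformers import pipeline

# Assumptions: tokenizer taken from camembert-base, TensorFlow backend selected explicitly.
fill_mask = pipeline(
    "fill-mask",
    model="jplu/tf-camembert-base",
    tokenizer="camembert-base",
    framework="tf",
)

print(fill_mask("Le camembert est <mask> !"))
```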
## Huggingface model hub
All models are available on the [Hugging Face model hub](https://huggingface.co/jplu).
## Acknowledgments
Thanks to the Hugging Face team for their support and their amazing library!
| 1,222 | [
[
-0.0266265869140625,
-0.045867919921875,
0.0180511474609375,
0.0352783203125,
-0.0110015869140625,
0.00897979736328125,
0.001750946044921875,
-0.01458740234375,
0.01038360595703125,
0.03216552734375,
-0.0517578125,
-0.037200927734375,
-0.053619384765625,
-0.0044708251953125,
-0.0174560546875,
0.07720947265625,
-0.021514892578125,
0.033782958984375,
-0.004589080810546875,
-0.0271148681640625,
-0.033172607421875,
-0.035552978515625,
-0.05859375,
-0.06622314453125,
0.03106689453125,
-0.01898193359375,
0.058624267578125,
0.007709503173828125,
0.054229736328125,
0.026214599609375,
-0.0085601806640625,
-0.022186279296875,
-0.0245208740234375,
-0.0064697265625,
0.0091705322265625,
-0.03955078125,
-0.056732177734375,
-0.0022945404052734375,
0.0491943359375,
0.0214691162109375,
-0.006053924560546875,
-0.00861358642578125,
-0.0025787353515625,
0.03826904296875,
-0.00421905517578125,
0.032867431640625,
0.0104217529296875,
0.01204681396484375,
0.004791259765625,
-0.00846099853515625,
-0.00714874267578125,
-0.031036376953125,
0.0286865234375,
-0.0274658203125,
0.017486572265625,
-0.020721435546875,
0.08587646484375,
0.0300750732421875,
-0.0226898193359375,
0.0037746429443359375,
-0.03546142578125,
0.05755615234375,
-0.049835205078125,
0.07891845703125,
0.0229644775390625,
0.04010009765625,
-0.0009398460388183594,
-0.08148193359375,
-0.00690460205078125,
-0.008941650390625,
-0.0261688232421875,
0.0168609619140625,
-0.025360107421875,
0.003299713134765625,
0.00823974609375,
0.036224365234375,
-0.0595703125,
-0.00927734375,
-0.038970947265625,
-0.01221466064453125,
0.039398193359375,
-0.01274871826171875,
0.0137786865234375,
-0.001293182373046875,
-0.030609130859375,
-0.03369140625,
-0.047454833984375,
-0.0172576904296875,
0.0218658447265625,
0.01226806640625,
-0.052154541015625,
0.0347900390625,
0.01202392578125,
0.04864501953125,
-0.004947662353515625,
-0.004337310791015625,
0.037933349609375,
-0.01202392578125,
-0.0231475830078125,
-0.006557464599609375,
0.057769775390625,
0.004917144775390625,
0.01157379150390625,
-0.01446533203125,
-0.03240966796875,
-0.0256805419921875,
0.019866943359375,
-0.0775146484375,
-0.015167236328125,
0.02447509765625,
-0.03369140625,
-0.0301055908203125,
-0.00800323486328125,
-0.0217742919921875,
0.01224517822265625,
-0.0175933837890625,
0.022430419921875,
-0.043731689453125,
-0.052093505859375,
-0.0063018798828125,
-0.0070343017578125,
0.0282440185546875,
0.023284912109375,
-0.039947509765625,
0.0200347900390625,
0.052459716796875,
0.077392578125,
-0.01245880126953125,
-0.0263671875,
-0.01153564453125,
-0.026336669921875,
-0.0135040283203125,
0.052032470703125,
-0.01067352294921875,
-0.0290374755859375,
0.0016651153564453125,
0.017608642578125,
-0.0177459716796875,
-0.0215301513671875,
0.0272369384765625,
-0.0548095703125,
0.041961669921875,
0.00638580322265625,
-0.0574951171875,
-0.049530029296875,
0.0299835205078125,
-0.054901123046875,
0.07281494140625,
0.060699462890625,
-0.054046630859375,
0.006069183349609375,
-0.05816650390625,
-0.01165008544921875,
0.0238494873046875,
-0.01534271240234375,
-0.051177978515625,
-0.00012409687042236328,
-0.0013885498046875,
0.049896240234375,
-0.0225372314453125,
0.037353515625,
-0.0222625732421875,
-0.018890380859375,
0.00922393798828125,
-0.0209808349609375,
0.09832763671875,
0.0241546630859375,
-0.0027713775634765625,
0.0232696533203125,
-0.039398193359375,
-0.01416778564453125,
0.0202789306640625,
-0.023895263671875,
0.01983642578125,
-0.0301666259765625,
0.03192138671875,
0.0007042884826660156,
0.007808685302734375,
-0.02978515625,
0.002277374267578125,
-0.00765228271484375,
0.045928955078125,
0.04302978515625,
0.001842498779296875,
0.01172637939453125,
-0.051727294921875,
0.056884765625,
0.01393890380859375,
0.004932403564453125,
0.0010662078857421875,
-0.043670654296875,
-0.050689697265625,
-0.053314208984375,
0.0261688232421875,
0.040252685546875,
-0.05877685546875,
0.0291748046875,
-0.016754150390625,
-0.060791015625,
-0.039886474609375,
-0.008453369140625,
0.0133819580078125,
0.0149993896484375,
0.0172576904296875,
-0.0160980224609375,
-0.030792236328125,
-0.061492919921875,
0.0008330345153808594,
-0.026824951171875,
0.00948333740234375,
0.00423431396484375,
0.04473876953125,
-0.0218353271484375,
0.060211181640625,
-0.0345458984375,
-0.00823211669921875,
-0.0129241943359375,
0.006275177001953125,
0.033538818359375,
0.052734375,
0.08270263671875,
-0.058013916015625,
-0.01666259765625,
-0.0105133056640625,
-0.0699462890625,
0.02252197265625,
0.0131683349609375,
-0.01593017578125,
0.01081085205078125,
0.03564453125,
-0.049285888671875,
0.018707275390625,
0.056732177734375,
-0.0277557373046875,
0.037139892578125,
-0.0006899833679199219,
0.008087158203125,
-0.1209716796875,
-0.0021114349365234375,
0.0191802978515625,
-0.05841064453125,
-0.0217742919921875,
0.049072265625,
0.00263214111328125,
-0.01503753662109375,
-0.0609130859375,
0.048919677734375,
-0.0139923095703125,
0.01197052001953125,
0.00432586669921875,
-0.03277587890625,
0.0012378692626953125,
0.05340576171875,
0.02130126953125,
0.035675048828125,
0.048492431640625,
-0.0190277099609375,
0.0599365234375,
0.02154541015625,
-0.02587890625,
0.043243408203125,
-0.0615234375,
0.00701904296875,
-0.0111236572265625,
0.033935546875,
-0.06341552734375,
-0.0147857666015625,
0.0295257568359375,
-0.033843994140625,
0.0465087890625,
-0.03521728515625,
-0.0308380126953125,
-0.02728271484375,
-0.006374359130859375,
0.0343017578125,
0.03643798828125,
-0.04571533203125,
0.055328369140625,
0.004177093505859375,
0.0037746429443359375,
-0.03143310546875,
-0.07025146484375,
-0.019866943359375,
-0.029998779296875,
-0.055023193359375,
0.0236358642578125,
-0.037261962890625,
0.00522613525390625,
0.0084228515625,
0.01541900634765625,
-0.0290679931640625,
-0.001842498779296875,
0.0160980224609375,
0.0276947021484375,
-0.03668212890625,
-0.0107879638671875,
-0.0085296630859375,
-0.01038360595703125,
-0.0187530517578125,
-0.021392822265625,
0.049560546875,
-0.036773681640625,
-0.01666259765625,
-0.041900634765625,
0.0015048980712890625,
0.043670654296875,
-0.008819580078125,
0.0667724609375,
0.07269287109375,
-0.043853759765625,
0.01222991943359375,
-0.044677734375,
-0.0262298583984375,
-0.0347900390625,
0.03143310546875,
-0.044158935546875,
-0.08782958984375,
0.0540771484375,
0.0196380615234375,
0.0309295654296875,
0.037200927734375,
0.052886962890625,
0.01505279541015625,
0.05615234375,
0.0491943359375,
0.00323486328125,
0.044281005859375,
-0.03851318359375,
0.00881195068359375,
-0.041107177734375,
-0.01015472412109375,
-0.038055419921875,
0.0005755424499511719,
-0.059906005859375,
-0.0195770263671875,
0.01218414306640625,
0.00159454345703125,
-0.0343017578125,
0.06817626953125,
-0.03131103515625,
0.0088653564453125,
0.03753662109375,
0.01081085205078125,
0.0141143798828125,
0.01213836669921875,
-0.01314544677734375,
0.01107025146484375,
-0.0592041015625,
-0.0193939208984375,
0.06536865234375,
0.046875,
0.043243408203125,
0.031524658203125,
0.057159423828125,
0.007144927978515625,
0.04913330078125,
-0.06024169921875,
0.015411376953125,
0.0079803466796875,
-0.07550048828125,
0.0188140869140625,
-0.04339599609375,
-0.040283203125,
0.010833740234375,
-0.004611968994140625,
-0.0421142578125,
0.01117706298828125,
0.0102081298828125,
-0.0189971923828125,
0.02874755859375,
-0.06005859375,
0.07720947265625,
-0.00750732421875,
-0.00775909423828125,
0.00591278076171875,
-0.0164642333984375,
0.024627685546875,
-0.014129638671875,
-0.002681732177734375,
0.0014657974243164062,
0.0275115966796875,
0.050689697265625,
-0.04083251953125,
0.047637939453125,
-0.003429412841796875,
-0.004802703857421875,
0.0245361328125,
0.00975799560546875,
0.0220489501953125,
0.01385498046875,
-0.01181793212890625,
0.047515869140625,
0.02569580078125,
-0.0323486328125,
-0.0364990234375,
0.06256103515625,
-0.09124755859375,
-0.027496337890625,
-0.036529541015625,
-0.02801513671875,
0.007411956787109375,
0.0322265625,
0.0399169921875,
0.039276123046875,
-0.0322265625,
0.0159759521484375,
0.0248565673828125,
-0.00988006591796875,
0.03155517578125,
0.0298309326171875,
-0.0323486328125,
-0.0262298583984375,
0.061279296875,
0.004009246826171875,
-0.00010925531387329102,
0.034942626953125,
0.00957489013671875,
-0.0260467529296875,
-0.0281982421875,
-0.03448486328125,
0.0259857177734375,
-0.04754638671875,
-0.030792236328125,
-0.05364990234375,
-0.04620361328125,
-0.045928955078125,
-0.01430511474609375,
-0.0526123046875,
-0.041015625,
-0.0238189697265625,
-0.0113067626953125,
0.054168701171875,
0.04449462890625,
0.003780364990234375,
0.06591796875,
-0.065673828125,
-0.01483154296875,
0.0087432861328125,
0.050567626953125,
-0.007965087890625,
-0.03985595703125,
-0.01480865478515625,
-0.00891876220703125,
-0.0345458984375,
-0.0447998046875,
0.0303192138671875,
-0.001956939697265625,
0.0369873046875,
0.032806396484375,
-0.024749755859375,
0.01430511474609375,
-0.0213165283203125,
0.0517578125,
0.035614013671875,
-0.035552978515625,
-0.0015287399291992188,
-0.034820556640625,
0.027435302734375,
0.01548004150390625,
0.0479736328125,
-0.04193115234375,
-0.0094451904296875,
-0.059661865234375,
-0.038848876953125,
0.06414794921875,
0.0380859375,
0.00396728515625,
0.01263427734375,
0.006977081298828125,
0.006313323974609375,
0.0104827880859375,
-0.059600830078125,
-0.01561737060546875,
-0.04217529296875,
-0.034423828125,
-0.01039886474609375,
-0.0016546249389648438,
-0.007091522216796875,
-0.00858306884765625,
0.072265625,
0.01012420654296875,
0.048797607421875,
0.00008809566497802734,
-0.0116729736328125,
-0.0221099853515625,
-0.014923095703125,
0.033843994140625,
0.044403076171875,
-0.054168701171875,
-0.0169677734375,
-0.01342010498046875,
-0.026275634765625,
-0.00638580322265625,
0.035552978515625,
-0.0004220008850097656,
0.0026721954345703125,
0.0472412109375,
0.09466552734375,
0.010650634765625,
-0.0167694091796875,
0.0330810546875,
-0.028900146484375,
-0.02703857421875,
-0.05096435546875,
-0.0037746429443359375,
0.025726318359375,
0.050048828125,
-0.0018863677978515625,
0.013427734375,
-0.0259857177734375,
-0.020355224609375,
0.018768310546875,
0.02105712890625,
-0.053741455078125,
-0.0333251953125,
0.05035400390625,
0.00359344482421875,
-0.038330078125,
0.0633544921875,
0.0004718303680419922,
-0.0347900390625,
0.033355712890625,
0.03509521484375,
0.0887451171875,
-0.0236053466796875,
0.014984130859375,
0.03662109375,
0.017120361328125,
-0.01180267333984375,
0.007232666015625,
0.00897216796875,
-0.07696533203125,
-0.0228424072265625,
-0.05999755859375,
-0.003314971923828125,
0.016754150390625,
-0.034454345703125,
0.0300750732421875,
-0.041748046875,
-0.03717041015625,
0.01025390625,
-0.002513885498046875,
-0.08135986328125,
0.00325775146484375,
0.0008959770202636719,
0.0701904296875,
-0.053253173828125,
0.047332763671875,
0.07354736328125,
-0.05126953125,
-0.058990478515625,
-0.0289764404296875,
0.0103607177734375,
-0.07843017578125,
0.0269775390625,
-0.0102996826171875,
0.010894775390625,
0.0264434814453125,
-0.0325927734375,
-0.0638427734375,
0.08477783203125,
0.0248565673828125,
-0.01224517822265625,
-0.01007843017578125,
-0.00640869140625,
0.02935791015625,
-0.0257415771484375,
0.050750732421875,
0.0328369140625,
0.0225067138671875,
0.034088134765625,
-0.04486083984375,
-0.00719451904296875,
-0.0220184326171875,
0.0022125244140625,
0.00597381591796875,
-0.07025146484375,
0.05316162109375,
0.0093536376953125,
0.019622802734375,
0.0037097930908203125,
0.05841064453125,
0.0276031494140625,
-0.01262664794921875,
0.0251922607421875,
0.061492919921875,
0.025726318359375,
-0.019989013671875,
0.0841064453125,
-0.0173492431640625,
0.057281494140625,
0.051849365234375,
0.009521484375,
0.0236358642578125,
0.017059326171875,
-0.021697998046875,
0.03533935546875,
0.0797119140625,
-0.006153106689453125,
0.022430419921875,
-0.0065155029296875,
-0.0097503662109375,
-0.01371002197265625,
0.00469207763671875,
-0.03271484375,
0.023040771484375,
0.01515960693359375,
-0.0186309814453125,
-0.0308685302734375,
0.0030574798583984375,
0.019775390625,
-0.0482177734375,
-0.0031719207763671875,
0.03509521484375,
0.03155517578125,
-0.046356201171875,
0.063232421875,
0.00983428955078125,
0.041534423828125,
-0.050628662109375,
0.00012993812561035156,
-0.02825927734375,
0.0209503173828125,
-0.01390838623046875,
-0.04510498046875,
0.0160064697265625,
-0.0017490386962890625,
0.00707244873046875,
-0.025421142578125,
0.044158935546875,
-0.0288238525390625,
-0.062164306640625,
0.02459716796875,
0.04693603515625,
0.0251922607421875,
0.007537841796875,
-0.08831787109375,
0.016387939453125,
0.0036792755126953125,
-0.0570068359375,
0.01031494140625,
0.002361297607421875,
0.0008449554443359375,
0.047637939453125,
0.0290985107421875,
-0.01116943359375,
-0.0033283233642578125,
0.01922607421875,
0.051849365234375,
-0.019989013671875,
-0.039215087890625,
-0.043792724609375,
0.033294677734375,
0.0208892822265625,
-0.0185699462890625,
0.0298004150390625,
0.04437255859375,
0.06829833984375,
-0.00424957275390625,
0.049407958984375,
-0.01084136962890625,
0.024200439453125,
-0.0209808349609375,
0.0653076171875,
-0.06768798828125,
-0.012481689453125,
-0.0253143310546875,
-0.0992431640625,
-0.039031982421875,
0.06097412109375,
-0.0054168701171875,
0.034881591796875,
0.0540771484375,
0.06512451171875,
-0.04083251953125,
0.0158843994140625,
0.023895263671875,
0.0239715576171875,
0.03582763671875,
0.0185394287109375,
0.032501220703125,
-0.048492431640625,
0.01189422607421875,
-0.02398681640625,
-0.0230712890625,
-0.01318359375,
-0.060333251953125,
-0.07781982421875,
-0.0538330078125,
-0.048614501953125,
-0.050506591796875,
0.0033721923828125,
0.08984375,
0.069091796875,
-0.06488037109375,
-0.031768798828125,
-0.01824951171875,
0.0031185150146484375,
-0.00846099853515625,
-0.0154571533203125,
0.048004150390625,
-0.004154205322265625,
-0.07086181640625,
0.0203704833984375,
-0.01464080810546875,
0.033416748046875,
-0.0184478759765625,
-0.020416259765625,
-0.027130126953125,
-0.010894775390625,
0.0306396484375,
0.0129241943359375,
-0.01580810546875,
-0.01995849609375,
-0.01267242431640625,
0.004001617431640625,
0.0137786865234375,
0.042236328125,
-0.028228759765625,
0.01007843017578125,
0.05255126953125,
0.038421630859375,
0.06585693359375,
-0.0197296142578125,
0.05267333984375,
-0.060150146484375,
0.040740966796875,
0.01220703125,
0.0474853515625,
0.039794921875,
-0.00830841064453125,
0.04779052734375,
0.02813720703125,
-0.0306396484375,
-0.04852294921875,
0.01020050048828125,
-0.0645751953125,
-0.021026611328125,
0.05609130859375,
-0.0242462158203125,
-0.0244598388671875,
0.0269622802734375,
0.00913238525390625,
0.059295654296875,
-0.029327392578125,
0.02239990234375,
0.0654296875,
0.003936767578125,
-0.04180908203125,
-0.04302978515625,
0.033660888671875,
0.0235443115234375,
-0.0275115966796875,
-0.0211029052734375,
0.0159912109375,
0.03302001953125,
-0.000926971435546875,
0.036834716796875,
-0.027008056640625,
0.0010890960693359375,
-0.01396942138671875,
0.013885498046875,
-0.00795745849609375,
-0.011199951171875,
-0.01458740234375,
-0.01454925537109375,
-0.0015153884887695312,
0.0036334991455078125
]
] |
ivanlau/language-detection-fine-tuned-on-xlm-roberta-base | 2021-12-17T10:33:13.000Z | [
"transformers",
"pytorch",
"tensorboard",
"xlm-roberta",
"text-classification",
"generated_from_trainer",
"dataset:common_language",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | ivanlau | null | null | ivanlau/language-detection-fine-tuned-on-xlm-roberta-base | 7 | 11,906 | transformers | 2022-03-02T23:29:05 | ---
license: mit
tags:
- generated_from_trainer
datasets:
- common_language
metrics:
- accuracy
model-index:
- name: language-detection-fine-tuned-on-xlm-roberta-base
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: common_language
type: common_language
args: full
metrics:
- name: Accuracy
type: accuracy
value: 0.9738386718094919
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# language-detection-fine-tuned-on-xlm-roberta-base
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the [common_language](https://huggingface.co/datasets/common_language) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1886
- Accuracy: 0.9738
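For inference, the checkpoint can be used through the `text-classification` pipeline. The snippet below is a hedged sketch rather than part of the original card; the exact label strings returned depend on the `id2label` mapping stored with the checkpoint:
```python
from transformers import pipeline

# Minimal sketch: the predicted labels come from the checkpoint's id2label mapping.
language_detector = pipeline(
    "text-classification",
    model="ivanlau/language-detection-fine-tuned-on-xlm-roberta-base",
)

print(language_detector("Bonjour, comment allez-vous ?"))
# e.g. [{'label': 'French', 'score': 0.99...}] -- exact label names depend on the model config
```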
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
- mixed_precision_training: Native AMP
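Expressed as a `transformers.TrainingArguments` sketch (a reconstruction of the settings above, not part of the original card; `output_dir` and anything not listed are assumptions or library defaults):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="language-detection-fine-tuned-on-xlm-roberta-base",  # assumed name
    learning_rate=3e-05,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1,
    fp16=True,  # "Native AMP" mixed precision
)
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults, so they are not set explicitly.
```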
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.1 | 1.0 | 22194 | 0.1886 | 0.9738 |
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.15.1
- Tokenizers 0.10.3
### Notebook
[notebook](https://github.com/IvanLauLinTiong/language-detector/blob/main/xlm_roberta_base_commonlanguage_language_detector.ipynb) | 1,748 | [
[
-0.03619384765625,
-0.056671142578125,
0.0226898193359375,
0.00872802734375,
-0.024383544921875,
-0.01412200927734375,
-0.045318603515625,
-0.0258941650390625,
0.00646209716796875,
0.04095458984375,
-0.03851318359375,
-0.061004638671875,
-0.060546875,
0.0037059783935546875,
-0.02252197265625,
0.08868408203125,
-0.0003135204315185547,
0.0179290771484375,
0.0032024383544921875,
-0.0139007568359375,
-0.0262451171875,
-0.047027587890625,
-0.0740966796875,
-0.035858154296875,
0.0095672607421875,
0.0281982421875,
0.048553466796875,
0.058990478515625,
0.01470947265625,
0.0195465087890625,
-0.0249176025390625,
0.00792694091796875,
-0.02362060546875,
-0.0237884521484375,
0.00330352783203125,
-0.0443115234375,
-0.056549072265625,
-0.0002551078796386719,
0.059661865234375,
0.020172119140625,
-0.005474090576171875,
0.0268402099609375,
0.00278472900390625,
0.0311126708984375,
-0.02777099609375,
0.03192138671875,
-0.048858642578125,
0.0002779960632324219,
-0.03582763671875,
-0.01715087890625,
-0.037750244140625,
-0.003208160400390625,
-0.017303466796875,
-0.0181884765625,
0.0174407958984375,
0.002593994140625,
0.09588623046875,
0.013885498046875,
-0.0213165283203125,
-0.006870269775390625,
-0.0572509765625,
0.0628662109375,
-0.050445556640625,
0.01959228515625,
0.036834716796875,
0.0247650146484375,
0.0014438629150390625,
-0.036956787109375,
-0.05194091796875,
-0.019256591796875,
-0.006313323974609375,
0.01348114013671875,
-0.033416748046875,
0.00023949146270751953,
0.047943115234375,
0.029998779296875,
-0.067138671875,
0.01007843017578125,
-0.03033447265625,
-0.0229034423828125,
0.032135009765625,
0.0175323486328125,
0.020904541015625,
-0.00847625732421875,
-0.0037517547607421875,
-0.003475189208984375,
-0.0418701171875,
0.0036830902099609375,
0.043487548828125,
0.034393310546875,
-0.045257568359375,
0.0287322998046875,
-0.016845703125,
0.0706787109375,
-0.00664520263671875,
-0.012847900390625,
0.044677734375,
-0.005062103271484375,
-0.007137298583984375,
-0.00768280029296875,
0.0760498046875,
0.0164642333984375,
0.037384033203125,
0.006473541259765625,
-0.0220184326171875,
0.005733489990234375,
-0.00795745849609375,
-0.05474853515625,
-0.01708984375,
0.0160675048828125,
-0.0283966064453125,
-0.03619384765625,
0.001827239990234375,
-0.0285491943359375,
0.036529541015625,
-0.0308685302734375,
0.0254058837890625,
-0.036376953125,
-0.0167083740234375,
0.0106201171875,
0.007801055908203125,
0.0119476318359375,
-0.007137298583984375,
-0.06781005859375,
0.036041259765625,
0.03802490234375,
0.04559326171875,
0.004840850830078125,
-0.0214385986328125,
-0.04901123046875,
-0.00864410400390625,
-0.01195526123046875,
0.05316162109375,
-0.018768310546875,
-0.0219573974609375,
-0.003810882568359375,
0.01397705078125,
-0.01239013671875,
-0.047027587890625,
0.0726318359375,
-0.0218963623046875,
0.036041259765625,
0.01161956787109375,
-0.04339599609375,
-0.001819610595703125,
0.0284576416015625,
-0.050384521484375,
0.09588623046875,
0.01346588134765625,
-0.049163818359375,
0.0302276611328125,
-0.038604736328125,
-0.015655517578125,
0.01319122314453125,
-0.019317626953125,
-0.0611572265625,
-0.014373779296875,
0.00629425048828125,
0.0213165283203125,
-0.0174560546875,
0.031402587890625,
-0.0111541748046875,
-0.039886474609375,
0.00966644287109375,
-0.031829833984375,
0.08135986328125,
0.0164947509765625,
-0.04559326171875,
0.008544921875,
-0.099609375,
0.0180816650390625,
0.013671875,
-0.047943115234375,
-0.0088958740234375,
-0.016998291015625,
0.037322998046875,
0.0187530517578125,
0.01922607421875,
-0.04132080078125,
-0.00013327598571777344,
-0.0277099609375,
0.0014019012451171875,
0.0389404296875,
-0.02691650390625,
0.01184844970703125,
-0.0190887451171875,
0.035980224609375,
0.01303863525390625,
0.01131439208984375,
0.00000959634780883789,
-0.038818359375,
-0.07080078125,
-0.0217437744140625,
0.0310821533203125,
0.050384521484375,
-0.0201263427734375,
0.07232666015625,
-0.01349639892578125,
-0.038116455078125,
-0.02972412109375,
0.01305389404296875,
0.051300048828125,
0.034942626953125,
0.03228759765625,
-0.01788330078125,
-0.04937744140625,
-0.07049560546875,
-0.020355224609375,
-0.008270263671875,
0.0158233642578125,
0.0251922607421875,
0.0413818359375,
-0.0200958251953125,
0.051300048828125,
-0.042999267578125,
-0.0250701904296875,
-0.037872314453125,
0.01123809814453125,
0.032958984375,
0.057403564453125,
0.0635986328125,
-0.03369140625,
-0.051300048828125,
0.0037288665771484375,
-0.04290771484375,
-0.00554656982421875,
0.004459381103515625,
-0.00174713134765625,
0.035491943359375,
0.029144287109375,
-0.034820556640625,
0.036407470703125,
0.0443115234375,
-0.023651123046875,
0.04644775390625,
-0.032196044921875,
-0.01446533203125,
-0.100341796875,
0.00970458984375,
0.0163116455078125,
-0.0180816650390625,
-0.035186767578125,
0.01007843017578125,
0.0209197998046875,
-0.0037174224853515625,
-0.0183258056640625,
0.056915283203125,
-0.020782470703125,
0.01457977294921875,
-0.0328369140625,
0.0035724639892578125,
-0.00794219970703125,
0.04522705078125,
0.0214691162109375,
0.05096435546875,
0.06353759765625,
-0.033203125,
0.0201263427734375,
0.0183258056640625,
-0.0281524658203125,
0.032684326171875,
-0.050872802734375,
0.01788330078125,
0.0011501312255859375,
-0.0030918121337890625,
-0.0609130859375,
-0.0012073516845703125,
0.036224365234375,
-0.04791259765625,
0.03076171875,
-0.0364990234375,
-0.042877197265625,
-0.0253448486328125,
0.01171875,
0.0180816650390625,
0.04693603515625,
-0.03900146484375,
0.035186767578125,
0.0086212158203125,
0.00518798828125,
-0.037994384765625,
-0.054290771484375,
0.0036106109619140625,
-0.00823211669921875,
-0.03594970703125,
0.0167388916015625,
-0.01207733154296875,
-0.00986480712890625,
-0.0047149658203125,
0.0083160400390625,
-0.0228118896484375,
0.00044846534729003906,
0.0108184814453125,
0.0282440185546875,
-0.0245513916015625,
-0.0080718994140625,
-0.007030487060546875,
-0.0294189453125,
0.01079559326171875,
-0.0086212158203125,
0.060791015625,
-0.01486968994140625,
-0.01337432861328125,
-0.05047607421875,
-0.003498077392578125,
0.034698486328125,
-0.031402587890625,
0.06365966796875,
0.0706787109375,
-0.027313232421875,
-0.01092529296875,
-0.034271240234375,
-0.0016298294067382812,
-0.029937744140625,
0.0386962890625,
-0.035888671875,
-0.038604736328125,
0.052032470703125,
0.0094451904296875,
-0.0215911865234375,
0.040313720703125,
0.048126220703125,
0.0275726318359375,
0.0887451171875,
0.03265380859375,
-0.0307464599609375,
0.03631591796875,
-0.0472412109375,
0.00273895263671875,
-0.060302734375,
-0.0164947509765625,
-0.06884765625,
0.00472259521484375,
-0.0611572265625,
-0.006183624267578125,
-0.00617218017578125,
-0.00444793701171875,
-0.0169525146484375,
0.037506103515625,
-0.02642822265625,
0.040252685546875,
0.05169677734375,
0.0164947509765625,
0.0165557861328125,
0.0005831718444824219,
-0.02490234375,
-0.00536346435546875,
-0.044189453125,
-0.047393798828125,
0.095947265625,
0.0186920166015625,
0.0478515625,
0.0017690658569335938,
0.0614013671875,
-0.01959228515625,
0.00966644287109375,
-0.048492431640625,
0.03704833984375,
-0.017669677734375,
-0.059600830078125,
-0.00077056884765625,
-0.034332275390625,
-0.058197021484375,
0.003314971923828125,
-0.0229034423828125,
-0.051788330078125,
0.0030155181884765625,
0.0265045166015625,
-0.00921630859375,
0.0308837890625,
-0.035888671875,
0.08203125,
-0.0225830078125,
-0.0173492431640625,
-0.0180511474609375,
-0.0191497802734375,
0.01279449462890625,
-0.0033855438232421875,
0.01253509521484375,
-0.004367828369140625,
0.0258026123046875,
0.0499267578125,
-0.033203125,
0.049163818359375,
-0.017303466796875,
0.013946533203125,
0.01861572265625,
-0.007801055908203125,
0.045867919921875,
0.0051727294921875,
-0.00873565673828125,
0.01302337646484375,
-0.01325225830078125,
-0.041595458984375,
-0.0270843505859375,
0.059783935546875,
-0.07659912109375,
-0.02484130859375,
-0.04388427734375,
-0.04071044921875,
0.005359649658203125,
0.030731201171875,
0.035888671875,
0.0548095703125,
-0.013946533203125,
0.01026153564453125,
0.047760009765625,
-0.0078277587890625,
0.01025390625,
0.05194091796875,
-0.015289306640625,
-0.0257568359375,
0.05902099609375,
-0.00641632080078125,
0.0174713134765625,
0.0162506103515625,
0.014892578125,
-0.01212310791015625,
-0.0601806640625,
-0.035125732421875,
0.017974853515625,
-0.046905517578125,
-0.0028400421142578125,
-0.03936767578125,
-0.025115966796875,
-0.028961181640625,
0.023895263671875,
-0.026641845703125,
-0.0226593017578125,
-0.03057861328125,
-0.0035915374755859375,
0.0258026123046875,
0.0296173095703125,
0.0050506591796875,
0.035186767578125,
-0.0545654296875,
0.00783538818359375,
0.0035610198974609375,
0.0308074951171875,
-0.01776123046875,
-0.07257080078125,
-0.041839599609375,
0.00847625732421875,
-0.0202484130859375,
-0.02899169921875,
0.045684814453125,
0.01280975341796875,
0.057769775390625,
0.02911376953125,
-0.0171661376953125,
0.053680419921875,
-0.03240966796875,
0.046234130859375,
-0.0026683807373046875,
-0.063232421875,
0.046966552734375,
-0.0165252685546875,
0.03411865234375,
0.032501220703125,
0.03289794921875,
-0.02197265625,
-0.0455322265625,
-0.06396484375,
-0.052001953125,
0.07989501953125,
0.0214080810546875,
0.0193939208984375,
-0.021331787109375,
0.023101806640625,
-0.0186309814453125,
-0.00719451904296875,
-0.07525634765625,
-0.047943115234375,
0.0021991729736328125,
-0.02410888671875,
-0.01059722900390625,
-0.023895263671875,
-0.0073089599609375,
-0.0309600830078125,
0.07464599609375,
-0.00504302978515625,
0.00757598876953125,
-0.00687408447265625,
-0.01171875,
-0.022125244140625,
0.01491546630859375,
0.060150146484375,
0.0439453125,
-0.0389404296875,
-0.008636474609375,
0.01363372802734375,
-0.037322998046875,
-0.0087432861328125,
0.017333984375,
-0.013824462890625,
0.0271453857421875,
0.0231781005859375,
0.0806884765625,
0.01386260986328125,
-0.042327880859375,
0.03173828125,
-0.0175323486328125,
-0.0206146240234375,
-0.047393798828125,
0.019195556640625,
-0.005237579345703125,
0.00919342041015625,
0.01328277587890625,
0.0278472900390625,
-0.00749969482421875,
-0.0300750732421875,
0.01047515869140625,
0.0309600830078125,
-0.033843994140625,
-0.027618408203125,
0.05279541015625,
-0.00006663799285888672,
-0.03851318359375,
0.0472412109375,
-0.0161285400390625,
-0.04071044921875,
0.054412841796875,
0.056610107421875,
0.06451416015625,
-0.01209259033203125,
-0.00382232666015625,
0.056640625,
0.0162200927734375,
-0.00533294677734375,
0.04388427734375,
0.004138946533203125,
-0.0692138671875,
-0.026519775390625,
-0.048187255859375,
-0.016815185546875,
0.036834716796875,
-0.0716552734375,
0.03857421875,
-0.0225830078125,
-0.017486572265625,
0.020965576171875,
0.006793975830078125,
-0.0565185546875,
0.028656005859375,
0.01366424560546875,
0.08123779296875,
-0.06829833984375,
0.089599609375,
0.050628662109375,
-0.037567138671875,
-0.07452392578125,
-0.006275177001953125,
-0.0017747879028320312,
-0.072021484375,
0.07928466796875,
-0.0015153884887695312,
0.009796142578125,
0.0133056640625,
-0.028961181640625,
-0.057220458984375,
0.059478759765625,
0.0191650390625,
-0.0474853515625,
0.0214691162109375,
0.0196075439453125,
0.04931640625,
-0.0282440185546875,
0.03363037109375,
0.034271240234375,
0.0289306640625,
-0.006160736083984375,
-0.083740234375,
-0.006778717041015625,
-0.0265045166015625,
0.0001285076141357422,
0.021697998046875,
-0.045013427734375,
0.07366943359375,
0.004100799560546875,
0.0109710693359375,
0.02728271484375,
0.02655029296875,
0.026153564453125,
0.00762176513671875,
0.0271148681640625,
0.07122802734375,
0.037322998046875,
-0.024688720703125,
0.06927490234375,
-0.06011962890625,
0.054290771484375,
0.085693359375,
-0.0015077590942382812,
0.055084228515625,
0.0202789306640625,
-0.0175323486328125,
0.042816162109375,
0.034637451171875,
-0.0162200927734375,
0.0173797607421875,
0.01157379150390625,
-0.013031005859375,
-0.0293426513671875,
0.0222930908203125,
-0.0281524658203125,
0.037841796875,
0.01421356201171875,
-0.051788330078125,
-0.021240234375,
-0.00951385498046875,
0.0209197998046875,
0.00966644287109375,
-0.0189056396484375,
0.039459228515625,
-0.0164031982421875,
-0.043304443359375,
0.06719970703125,
0.0118865966796875,
0.046783447265625,
-0.044891357421875,
-0.007434844970703125,
-0.01450347900390625,
0.036529541015625,
-0.0132293701171875,
-0.047943115234375,
0.02197265625,
0.0027179718017578125,
-0.007472991943359375,
-0.000008404254913330078,
0.04022216796875,
-0.046783447265625,
-0.048614501953125,
0.0269622802734375,
0.03076171875,
0.0182647705078125,
0.00507354736328125,
-0.07232666015625,
0.0164031982421875,
0.0021572113037109375,
-0.01294708251953125,
0.0167999267578125,
0.0318603515625,
-0.00157928466796875,
0.0418701171875,
0.050537109375,
0.0238494873046875,
-0.0013380050659179688,
0.026611328125,
0.05999755859375,
-0.048126220703125,
-0.033966064453125,
-0.06280517578125,
0.0294036865234375,
-0.01287841796875,
-0.045928955078125,
0.0546875,
0.07501220703125,
0.0865478515625,
-0.0167236328125,
0.050933837890625,
0.00091552734375,
0.0421142578125,
-0.039337158203125,
0.049652099609375,
-0.032684326171875,
0.0186614990234375,
-0.020751953125,
-0.060394287109375,
-0.0283203125,
0.055145263671875,
-0.02801513671875,
0.00257110595703125,
0.043121337890625,
0.0718994140625,
-0.004512786865234375,
-0.0260467529296875,
0.031707763671875,
0.0118408203125,
0.0163726806640625,
0.032135009765625,
0.0322265625,
-0.055877685546875,
0.053924560546875,
-0.035369873046875,
-0.017974853515625,
-0.00921630859375,
-0.042388916015625,
-0.0831298828125,
-0.03582763671875,
-0.0421142578125,
-0.039794921875,
0.005893707275390625,
0.076904296875,
0.054931640625,
-0.0760498046875,
-0.044952392578125,
-0.0137939453125,
-0.00664520263671875,
-0.01192474365234375,
-0.0202789306640625,
0.0284881591796875,
-0.033447265625,
-0.082763671875,
-0.00010800361633300781,
-0.0002301931381225586,
0.0173797607421875,
-0.0053253173828125,
-0.022308349609375,
-0.016326904296875,
-0.0211639404296875,
0.02349853515625,
0.017547607421875,
-0.049774169921875,
-0.0179443359375,
-0.0030689239501953125,
-0.0159454345703125,
0.0150299072265625,
0.024261474609375,
-0.04400634765625,
0.034759521484375,
0.0205535888671875,
0.006809234619140625,
0.05023193359375,
-0.0232391357421875,
0.0311279296875,
-0.055999755859375,
0.044189453125,
0.009246826171875,
0.048492431640625,
0.0066680908203125,
-0.0240478515625,
0.024322509765625,
0.026641845703125,
-0.041839599609375,
-0.059844970703125,
0.00806427001953125,
-0.0831298828125,
0.006244659423828125,
0.08465576171875,
-0.00844573974609375,
-0.03558349609375,
-0.002918243408203125,
-0.0258941650390625,
0.0197906494140625,
-0.025299072265625,
0.047271728515625,
0.0523681640625,
0.0016536712646484375,
-0.0190277099609375,
-0.0296478271484375,
0.038787841796875,
0.017913818359375,
-0.048614501953125,
-0.011627197265625,
0.022308349609375,
0.048309326171875,
0.016998291015625,
0.037933349609375,
-0.02069091796875,
0.0308837890625,
-0.01079559326171875,
0.0298309326171875,
-0.015716552734375,
-0.00490570068359375,
-0.031585693359375,
-0.0214691162109375,
0.026611328125,
-0.020782470703125
]
] |
gilf/french-camembert-postag-model | 2023-04-05T15:31:56.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"camembert",
"token-classification",
"fr",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | gilf | null | null | gilf/french-camembert-postag-model | 5 | 11,901 | transformers | 2022-03-02T23:29:05 | ---
language: fr
widget:
- text: "Face à un choc inédit, les mesures mises en place par le gouvernement ont permis une protection forte et efficace des ménages"
---
## About
The *french-camembert-postag-model* is a part-of-speech tagging model for French that was trained on the *free-french-treebank* dataset available on
[GitHub](https://github.com/nicolashernandez/free-french-treebank). The base tokenizer and model used for training are those of *camembert-base*.
## Supported Tags
It uses the following tags:
| Tag | Category | Extra Info |
|----------|:------------------------------:|------------:|
| ADJ       | adjective                      |             |
| ADJWH     | adjective                      |             |
| ADV       | adverb                         |             |
| ADVWH     | adverb                         |             |
| CC        | coordinating conjunction       |             |
| CLO       | pronoun                        |         obj |
| CLR       | pronoun                        |        refl |
| CLS       | pronoun                        |         suj |
| CS        | subordinating conjunction      |             |
| DET       | determiner                     |             |
| DETWH     | determiner                     |             |
| ET        | foreign word                   |             |
| I         | interjection                   |             |
| NC        | common noun                    |             |
| NPP       | proper noun                    |             |
| P         | preposition                    |             |
| P+D       | preposition + determiner       |             |
| PONCT     | punctuation mark               |             |
| PREF      | prefix                         |             |
| PRO       | other pronouns                 |             |
| PROREL    | other pronouns                 |         rel |
| PROWH     | other pronouns                 |         int |
| U         | ?                              |             |
| V         | verb                           |             |
| VIMP      | imperative verb                |             |
| VINF      | infinitive verb                |             |
| VPP       | past participle                |             |
| VPR       | present participle             |             |
| VS        | subjunctive                    |             |
More information on the tags can be found here:
http://alpage.inria.fr/statgram/frdep/Publications/crabbecandi-taln2008-final.pdf
## Usage
This model follows the common `transformers` usage patterns. Here is a short example:
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

tokenizer = AutoTokenizer.from_pretrained("gilf/french-camembert-postag-model")
model = AutoModelForTokenClassification.from_pretrained("gilf/french-camembert-postag-model")

nlp_token_class = pipeline('ner', model=model, tokenizer=tokenizer, grouped_entities=True)
nlp_token_class('Face à un choc inédit, les mesures mises en place par le gouvernement ont permis une protection forte et efficace des ménages')
```
The lines above would display something like this in a Jupyter notebook:
```
[{'entity_group': 'NC', 'score': 0.5760144591331482, 'word': '<s>'},
{'entity_group': 'U', 'score': 0.9946700930595398, 'word': 'Face'},
{'entity_group': 'P', 'score': 0.999615490436554, 'word': 'à'},
{'entity_group': 'DET', 'score': 0.9995906352996826, 'word': 'un'},
{'entity_group': 'NC', 'score': 0.9995531439781189, 'word': 'choc'},
{'entity_group': 'ADJ', 'score': 0.999183714389801, 'word': 'inédit'},
{'entity_group': 'P', 'score': 0.3710663616657257, 'word': ','},
{'entity_group': 'DET', 'score': 0.9995903968811035, 'word': 'les'},
{'entity_group': 'NC', 'score': 0.9995649456977844, 'word': 'mesures'},
{'entity_group': 'VPP', 'score': 0.9988670349121094, 'word': 'mises'},
{'entity_group': 'P', 'score': 0.9996246099472046, 'word': 'en'},
{'entity_group': 'NC', 'score': 0.9995329976081848, 'word': 'place'},
{'entity_group': 'P', 'score': 0.9996233582496643, 'word': 'par'},
{'entity_group': 'DET', 'score': 0.9995935559272766, 'word': 'le'},
{'entity_group': 'NC', 'score': 0.9995369911193848, 'word': 'gouvernement'},
{'entity_group': 'V', 'score': 0.9993771314620972, 'word': 'ont'},
{'entity_group': 'VPP', 'score': 0.9991101026535034, 'word': 'permis'},
{'entity_group': 'DET', 'score': 0.9995885491371155, 'word': 'une'},
{'entity_group': 'NC', 'score': 0.9995636343955994, 'word': 'protection'},
{'entity_group': 'ADJ', 'score': 0.9991781711578369, 'word': 'forte'},
{'entity_group': 'CC', 'score': 0.9991298317909241, 'word': 'et'},
{'entity_group': 'ADJ', 'score': 0.9992275238037109, 'word': 'efficace'},
{'entity_group': 'P+D', 'score': 0.9993300437927246, 'word': 'des'},
{'entity_group': 'NC', 'score': 0.8353511393070221, 'word': 'ménages</s>'}]
```
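If only plain (word, tag) pairs are needed, the grouped output above can be post-processed by discarding the SentencePiece boundary fragments. A minimal sketch reusing the `nlp_token_class` pipeline defined above (the helper function and its filtering rule are illustrative, not part of the original model):
```python
def to_word_tag_pairs(groups):
    """Illustrative helper: strip '<s>'/'</s>' fragments and keep (word, tag) pairs."""
    pairs = []
    for group in groups:
        word = group["word"].replace("<s>", "").replace("</s>", "").strip()
        if word:  # drop entries that only contained special tokens
            pairs.append((word, group["entity_group"]))
    return pairs

sentence = ("Face à un choc inédit, les mesures mises en place par le gouvernement "
            "ont permis une protection forte et efficace des ménages")
print(to_word_tag_pairs(nlp_token_class(sentence)))
# e.g. [('Face', 'U'), ('à', 'P'), ('un', 'DET'), ('choc', 'NC'), ...]
```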
| 4,936 | [
[
-0.035980224609375,
-0.04522705078125,
0.01276397705078125,
0.01248931884765625,
-0.0243377685546875,
0.0117034912109375,
-0.00439453125,
-0.005527496337890625,
0.046142578125,
0.030914306640625,
-0.0297088623046875,
-0.07684326171875,
-0.054534912109375,
0.006900787353515625,
-0.01178741455078125,
0.07171630859375,
-0.0034351348876953125,
-0.0096282958984375,
0.01514434814453125,
-0.002716064453125,
-0.0053558349609375,
-0.038482666015625,
-0.054962158203125,
-0.0161590576171875,
0.0281982421875,
0.01427459716796875,
0.0433349609375,
0.04913330078125,
0.031463623046875,
0.022491455078125,
-0.01479339599609375,
0.00212860107421875,
-0.0125579833984375,
-0.01287841796875,
-0.00787353515625,
-0.044281005859375,
-0.043487548828125,
-0.0007009506225585938,
0.045867919921875,
0.050567626953125,
-0.0052490234375,
0.00604248046875,
0.00823974609375,
0.036376953125,
-0.0205841064453125,
0.0374755859375,
-0.03753662109375,
0.0084991455078125,
-0.02447509765625,
-0.020233154296875,
-0.00888824462890625,
-0.01026153564453125,
-0.0035953521728515625,
-0.0489501953125,
0.01541900634765625,
0.001708984375,
0.110595703125,
0.0094146728515625,
-0.0245819091796875,
-0.0223541259765625,
-0.0243988037109375,
0.052764892578125,
-0.06658935546875,
0.021575927734375,
0.03326416015625,
-0.004299163818359375,
-0.023773193359375,
-0.06524658203125,
-0.0526123046875,
0.009857177734375,
-0.040618896484375,
0.0260162353515625,
-0.01422119140625,
-0.011688232421875,
0.019561767578125,
0.019805908203125,
-0.060272216796875,
-0.00562286376953125,
-0.013671875,
-0.008026123046875,
0.04754638671875,
0.0034847259521484375,
0.0250396728515625,
-0.04425048828125,
-0.03521728515625,
-0.01554107666015625,
-0.0289154052734375,
0.02117919921875,
0.0215606689453125,
0.0294342041015625,
-0.0311431884765625,
0.0521240234375,
-0.027069091796875,
0.0535888671875,
-0.0001819133758544922,
-0.01132965087890625,
0.04791259765625,
-0.0289154052734375,
-0.020355224609375,
0.002941131591796875,
0.09429931640625,
0.050506591796875,
0.01261138916015625,
0.0034618377685546875,
-0.0163726806640625,
0.01038360595703125,
-0.00907135009765625,
-0.047607421875,
-0.03900146484375,
0.0206451416015625,
-0.033447265625,
-0.018798828125,
0.01446533203125,
-0.08575439453125,
0.006603240966796875,
-0.004375457763671875,
0.029205322265625,
-0.029541015625,
-0.0250396728515625,
-0.0014181137084960938,
-0.02056884765625,
0.0218963623046875,
-0.005939483642578125,
-0.060211181640625,
0.01910400390625,
0.027984619140625,
0.075927734375,
0.015960693359375,
-0.00916290283203125,
-0.0205841064453125,
0.00457000732421875,
-0.010589599609375,
0.06414794921875,
-0.033477783203125,
-0.045379638671875,
-0.0097808837890625,
0.0292510986328125,
-0.03778076171875,
-0.0169219970703125,
0.047088623046875,
-0.0168914794921875,
0.02362060546875,
-0.009521484375,
-0.04144287109375,
-0.0159454345703125,
0.01727294921875,
-0.048248291015625,
0.0814208984375,
0.028076171875,
-0.08245849609375,
0.04791259765625,
-0.04931640625,
-0.0237579345703125,
0.0178680419921875,
-0.0230712890625,
-0.0318603515625,
-0.0253143310546875,
0.032806396484375,
0.043914794921875,
-0.0236968994140625,
0.004207611083984375,
0.00557708740234375,
-0.00640106201171875,
0.006320953369140625,
-0.01221466064453125,
0.08709716796875,
0.017425537109375,
-0.0283660888671875,
0.00676727294921875,
-0.07452392578125,
0.0029296875,
0.0272674560546875,
-0.040740966796875,
-0.0158843994140625,
-0.001621246337890625,
0.0090789794921875,
0.01690673828125,
0.01013946533203125,
-0.03594970703125,
0.02386474609375,
-0.05316162109375,
0.04913330078125,
0.04327392578125,
0.010986328125,
0.01543426513671875,
-0.02703857421875,
0.036468505859375,
0.0162353515625,
-0.004451751708984375,
-0.00859832763671875,
-0.050628662109375,
-0.0364990234375,
-0.042999267578125,
0.0303802490234375,
0.05242919921875,
-0.04681396484375,
0.07830810546875,
-0.0325927734375,
-0.049957275390625,
-0.03948974609375,
-0.0097503662109375,
-0.004673004150390625,
0.054351806640625,
0.037353515625,
-0.0212554931640625,
-0.046417236328125,
-0.0667724609375,
-0.0102691650390625,
-0.009918212890625,
0.0024013519287109375,
0.0180206298828125,
0.0604248046875,
-0.020660400390625,
0.07159423828125,
-0.042449951171875,
-0.0252838134765625,
-0.0178375244140625,
0.00806427001953125,
0.0579833984375,
0.05010986328125,
0.05096435546875,
-0.0494384765625,
-0.041748046875,
-0.0008072853088378906,
-0.051544189453125,
0.002216339111328125,
-0.005764007568359375,
-0.01001739501953125,
0.01337432861328125,
0.0230865478515625,
-0.06671142578125,
0.045867919921875,
0.037384033203125,
-0.048187255859375,
0.0288543701171875,
-0.0240936279296875,
0.0196533203125,
-0.09722900390625,
0.0030956268310546875,
-0.0022220611572265625,
-0.021484375,
-0.0377197265625,
-0.0013132095336914062,
-0.0000966787338256836,
0.01105499267578125,
-0.030059814453125,
0.0221405029296875,
-0.034912109375,
0.01010894775390625,
0.0077362060546875,
0.01137542724609375,
0.002227783203125,
0.040618896484375,
0.0017261505126953125,
0.048248291015625,
0.059844970703125,
-0.03363037109375,
0.031951904296875,
0.01345062255859375,
-0.034423828125,
0.034332275390625,
-0.0469970703125,
-0.013580322265625,
-0.01629638671875,
0.0221405029296875,
-0.08740234375,
-0.026214599609375,
0.036102294921875,
-0.0390625,
0.02923583984375,
-0.0225677490234375,
-0.043792724609375,
-0.042388916015625,
-0.0266265869140625,
0.01287078857421875,
0.025665283203125,
-0.0257415771484375,
0.055206298828125,
0.0246429443359375,
0.0004513263702392578,
-0.04229736328125,
-0.05401611328125,
0.000690460205078125,
-0.025390625,
-0.04461669921875,
0.031646728515625,
0.00785064697265625,
0.006687164306640625,
0.0161285400390625,
0.00009918212890625,
0.0028743743896484375,
0.0135040283203125,
0.01340484619140625,
0.0175628662109375,
-0.0097503662109375,
-0.007747650146484375,
-0.00815582275390625,
-0.0090789794921875,
-0.00925445556640625,
-0.0325927734375,
0.0841064453125,
-0.0092010498046875,
-0.0095367431640625,
-0.034942626953125,
0.0190582275390625,
0.0269622802734375,
-0.03521728515625,
0.077880859375,
0.048675537109375,
-0.048431396484375,
0.004062652587890625,
-0.0098724365234375,
0.00518035888671875,
-0.029205322265625,
0.0161285400390625,
-0.051788330078125,
-0.056884765625,
0.04998779296875,
0.0072784423828125,
0.005695343017578125,
0.07855224609375,
0.034393310546875,
-0.016571044921875,
0.076171875,
0.01107025146484375,
-0.0072784423828125,
0.016357421875,
-0.045989990234375,
0.0168914794921875,
-0.04608154296875,
-0.0343017578125,
-0.0386962890625,
-0.01605224609375,
-0.062286376953125,
-0.03253173828125,
0.005535125732421875,
0.029876708984375,
-0.0187530517578125,
0.03863525390625,
-0.061004638671875,
0.0156707763671875,
0.043243408203125,
0.007747650146484375,
-0.0008754730224609375,
-0.005340576171875,
-0.0242462158203125,
-0.007457733154296875,
-0.045867919921875,
-0.039947509765625,
0.08258056640625,
0.0115814208984375,
0.044525146484375,
0.0072784423828125,
0.062744140625,
0.01210784912109375,
0.00884246826171875,
-0.05499267578125,
0.040863037109375,
-0.01235198974609375,
-0.037078857421875,
-0.0164031982421875,
-0.0330810546875,
-0.09039306640625,
0.0330810546875,
-0.0180206298828125,
-0.07830810546875,
0.0227508544921875,
0.0118560791015625,
-0.0258636474609375,
0.034210205078125,
-0.049530029296875,
0.0751953125,
-0.01160430908203125,
-0.00872802734375,
0.01305389404296875,
-0.045135498046875,
0.008331298828125,
0.002414703369140625,
0.021453857421875,
-0.00890350341796875,
-0.0015478134155273438,
0.0673828125,
-0.049530029296875,
0.042724609375,
-0.013671875,
0.0019550323486328125,
0.035491943359375,
-0.00499725341796875,
0.05157470703125,
0.01451873779296875,
0.012420654296875,
0.015838623046875,
0.00896453857421875,
-0.0287322998046875,
-0.0228271484375,
0.058929443359375,
-0.05401611328125,
-0.041046142578125,
-0.048553466796875,
-0.0116119384765625,
0.0164947509765625,
0.0289306640625,
0.046478271484375,
0.0258636474609375,
0.007659912109375,
0.01995849609375,
0.03363037109375,
-0.022979736328125,
0.041778564453125,
0.017822265625,
-0.0181732177734375,
-0.046722412109375,
0.06060791015625,
0.02886962890625,
-0.0014829635620117188,
0.039215087890625,
0.006305694580078125,
-0.039947509765625,
-0.02734375,
-0.027374267578125,
0.0292510986328125,
-0.0394287109375,
-0.01885986328125,
-0.0797119140625,
-0.0134429931640625,
-0.0609130859375,
0.0002658367156982422,
-0.0169677734375,
-0.041015625,
-0.039947509765625,
-0.02471923828125,
0.044219970703125,
0.028228759765625,
-0.0135955810546875,
0.032135009765625,
-0.04901123046875,
0.0191192626953125,
0.0103912353515625,
0.030731201171875,
-0.0226593017578125,
-0.042510986328125,
-0.00672149658203125,
0.001739501953125,
-0.01113128662109375,
-0.08807373046875,
0.04998779296875,
0.0006122589111328125,
0.0286102294921875,
0.0227203369140625,
-0.0191802978515625,
0.052581787109375,
-0.034759521484375,
0.07147216796875,
0.0157318115234375,
-0.06781005859375,
0.0531005859375,
-0.0261688232421875,
0.0157318115234375,
0.0209197998046875,
0.03485107421875,
-0.05218505859375,
-0.0142822265625,
-0.06500244140625,
-0.09808349609375,
0.0643310546875,
0.034698486328125,
-0.0083770751953125,
-0.01485443115234375,
0.01092529296875,
-0.0194854736328125,
0.0167388916015625,
-0.062744140625,
-0.048065185546875,
-0.042266845703125,
-0.035797119140625,
-0.00627899169921875,
-0.00017821788787841797,
-0.0110931396484375,
-0.0299835205078125,
0.062042236328125,
0.01006317138671875,
0.03204345703125,
0.034576416015625,
-0.00678253173828125,
0.002635955810546875,
0.01446533203125,
0.04248046875,
0.0404052734375,
-0.0243988037109375,
0.0120849609375,
0.005809783935546875,
-0.0173492431640625,
-0.0019521713256835938,
0.0286407470703125,
-0.01474761962890625,
0.01593017578125,
0.037139892578125,
0.059356689453125,
-0.00637054443359375,
-0.0257720947265625,
0.04327392578125,
0.00016367435455322266,
-0.04144287109375,
-0.04071044921875,
-0.00270843505859375,
0.0184173583984375,
0.0072784423828125,
0.0252227783203125,
0.00011867284774780273,
-0.00926971435546875,
-0.03851318359375,
0.0171356201171875,
0.0283203125,
-0.0219879150390625,
-0.0192108154296875,
0.07366943359375,
-0.0167999267578125,
-0.034423828125,
0.0212554931640625,
-0.021270751953125,
-0.042449951171875,
0.049346923828125,
0.0304718017578125,
0.05389404296875,
-0.0150909423828125,
0.01519012451171875,
0.043121337890625,
0.040283203125,
0.0058135986328125,
0.038909912109375,
0.010772705078125,
-0.057464599609375,
-0.004505157470703125,
-0.0577392578125,
0.027740478515625,
0.0289459228515625,
-0.02801513671875,
0.00908660888671875,
-0.041229248046875,
-0.035247802734375,
0.013824462890625,
-0.007320404052734375,
-0.06549072265625,
0.037506103515625,
-0.0030059814453125,
0.058837890625,
-0.0623779296875,
0.05426025390625,
0.0684814453125,
-0.05450439453125,
-0.07855224609375,
-0.01108551025390625,
-0.0170440673828125,
-0.046142578125,
0.049591064453125,
0.01113128662109375,
0.0252227783203125,
0.00952911376953125,
-0.03466796875,
-0.08013916015625,
0.07928466796875,
-0.00909423828125,
-0.048828125,
-0.004840850830078125,
0.0107269287109375,
0.04327392578125,
-0.0249481201171875,
0.0411376953125,
0.052581787109375,
0.0382080078125,
-0.014984130859375,
-0.053497314453125,
0.018402099609375,
-0.0272674560546875,
-0.0087127685546875,
0.0221099853515625,
-0.06658935546875,
0.0743408203125,
0.02392578125,
-0.0173492431640625,
-0.0173797607421875,
0.04083251953125,
0.004283905029296875,
0.0027675628662109375,
0.04345703125,
0.06304931640625,
0.05645751953125,
-0.044708251953125,
0.06610107421875,
-0.0311279296875,
0.050567626953125,
0.0672607421875,
-0.0031528472900390625,
0.046905517578125,
0.03106689453125,
-0.027191162109375,
0.04144287109375,
0.05377197265625,
-0.01374053955078125,
0.034149169921875,
0.01473236083984375,
-0.0183868408203125,
-0.00350189208984375,
0.0007700920104980469,
-0.0271148681640625,
0.03521728515625,
0.038482666015625,
-0.03973388671875,
-0.0022754669189453125,
-0.013275146484375,
0.017059326171875,
0.0061187744140625,
-0.0194091796875,
0.046173095703125,
-0.0007510185241699219,
-0.0306854248046875,
0.0450439453125,
0.0088348388671875,
0.0552978515625,
-0.03204345703125,
-0.0015621185302734375,
-0.0017042160034179688,
0.005641937255859375,
-0.032928466796875,
-0.0673828125,
0.00493621826171875,
0.0005402565002441406,
-0.00768280029296875,
0.003055572509765625,
0.035675048828125,
-0.033935546875,
-0.050872802734375,
0.0200347900390625,
0.0266571044921875,
0.0201263427734375,
0.00843048095703125,
-0.06939697265625,
-0.0189208984375,
0.012298583984375,
-0.042938232421875,
-0.004878997802734375,
0.0284576416015625,
0.00485992431640625,
0.040130615234375,
0.04571533203125,
0.031494140625,
0.0187225341796875,
0.00893402099609375,
0.07000732421875,
-0.06439208984375,
-0.039337158203125,
-0.06915283203125,
0.0487060546875,
-0.0157623291015625,
-0.0185699462890625,
0.047515869140625,
0.0635986328125,
0.0645751953125,
0.0036468505859375,
0.06451416015625,
-0.03253173828125,
0.04345703125,
-0.0399169921875,
0.05511474609375,
-0.05010986328125,
0.0033016204833984375,
-0.0173492431640625,
-0.06475830078125,
-0.025299072265625,
0.06365966796875,
-0.00864410400390625,
0.0024089813232421875,
0.050384521484375,
0.06719970703125,
0.001178741455078125,
-0.0153656005859375,
-0.0017538070678710938,
0.025146484375,
0.0306854248046875,
0.038543701171875,
0.0328369140625,
-0.04144287109375,
0.0367431640625,
-0.03326416015625,
-0.0145416259765625,
-0.00839996337890625,
-0.048095703125,
-0.06787109375,
-0.0352783203125,
-0.0254364013671875,
-0.0241241455078125,
-0.007625579833984375,
0.09429931640625,
0.03460693359375,
-0.0711669921875,
-0.00806427001953125,
-0.0185699462890625,
-0.010833740234375,
-0.0125274658203125,
-0.0267333984375,
0.060760498046875,
-0.0038623809814453125,
-0.053497314453125,
0.021942138671875,
0.005504608154296875,
0.0245819091796875,
0.0237274169921875,
0.0069732666015625,
-0.04644775390625,
-0.004360198974609375,
0.033843994140625,
0.006259918212890625,
-0.04901123046875,
-0.018402099609375,
-0.024505615234375,
-0.006366729736328125,
0.028228759765625,
0.025360107421875,
-0.040374755859375,
0.032196044921875,
0.05029296875,
0.0199737548828125,
0.047088623046875,
0.0165252685546875,
0.0172882080078125,
-0.062469482421875,
0.027008056640625,
0.006084442138671875,
0.0467529296875,
0.01558685302734375,
-0.027435302734375,
0.038330078125,
0.047393798828125,
-0.033782958984375,
-0.050048828125,
-0.01029205322265625,
-0.073486328125,
-0.0167236328125,
0.0587158203125,
-0.0189208984375,
-0.0286407470703125,
-0.00026798248291015625,
-0.0279998779296875,
0.041748046875,
-0.050628662109375,
0.0286865234375,
0.051361083984375,
0.000995635986328125,
-0.0042266845703125,
-0.039520263671875,
0.043548583984375,
0.0243072509765625,
-0.05010986328125,
-0.022491455078125,
0.004535675048828125,
0.0221099853515625,
0.019287109375,
0.056884765625,
-0.007633209228515625,
-0.007648468017578125,
0.005069732666015625,
0.01001739501953125,
0.0225677490234375,
0.002994537353515625,
-0.00936126708984375,
0.00724029541015625,
-0.0130157470703125,
-0.0255126953125
]
] |
Helsinki-NLP/opus-mt-cy-en | 2023-08-16T11:27:19.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"cy",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-cy-en | 0 | 11,900 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-cy-en
* source languages: cy
* target languages: en
* OPUS readme: [cy-en](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/cy-en/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/cy-en/opus-2019-12-18.zip)
* test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/cy-en/opus-2019-12-18.test.txt)
* test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/cy-en/opus-2019-12-18.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.cy.en | 33.0 | 0.525 |
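The card itself does not include a usage snippet; as a minimal sketch (not part of the original card), the checkpoint can be loaded with the standard MarianMT classes from `transformers` — the Welsh example sentence is illustrative:
```python
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-cy-en"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Translate Welsh -> English
batch = tokenizer(["Mae hi'n braf heddiw."], return_tensors="pt", padding=True)
translated = model.generate(**batch)
print(tokenizer.batch_decode(translated, skip_special_tokens=True))
```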
| 818 | [
[
-0.017303466796875,
-0.026702880859375,
0.020904541015625,
0.0279998779296875,
-0.037567138671875,
-0.0223541259765625,
-0.034759521484375,
-0.007762908935546875,
0.00637054443359375,
0.03851318359375,
-0.05938720703125,
-0.048004150390625,
-0.042266845703125,
0.0183563232421875,
-0.00724029541015625,
0.055145263671875,
-0.01096343994140625,
0.036407470703125,
0.0225982666015625,
-0.033935546875,
-0.02117919921875,
-0.0279693603515625,
-0.038116455078125,
-0.0216522216796875,
0.0251617431640625,
0.020294189453125,
0.0321044921875,
0.031707763671875,
0.06463623046875,
0.018707275390625,
-0.00562286376953125,
0.00899505615234375,
-0.036407470703125,
-0.00495147705078125,
-0.0015115737915039062,
-0.045745849609375,
-0.05096435546875,
-0.00501251220703125,
0.0726318359375,
0.03741455078125,
0.00197601318359375,
0.022552490234375,
-0.007335662841796875,
0.06964111328125,
-0.0272216796875,
0.01187896728515625,
-0.051177978515625,
0.0020751953125,
-0.025848388671875,
-0.022613525390625,
-0.04534912109375,
-0.02032470703125,
0.0092926025390625,
-0.058135986328125,
-0.004924774169921875,
0.0083160400390625,
0.0975341796875,
0.0227203369140625,
-0.0251617431640625,
-0.0171051025390625,
-0.0460205078125,
0.0731201171875,
-0.059295654296875,
0.04296875,
0.0251617431640625,
0.013885498046875,
0.0205230712890625,
-0.047332763671875,
-0.024169921875,
0.00855255126953125,
-0.01371002197265625,
0.01837158203125,
-0.012420654296875,
-0.01250457763671875,
0.021026611328125,
0.05206298828125,
-0.052734375,
-0.007656097412109375,
-0.0501708984375,
-0.0054473876953125,
0.050018310546875,
0.0098724365234375,
0.01169586181640625,
-0.007541656494140625,
-0.0303497314453125,
-0.038177490234375,
-0.05609130859375,
0.002925872802734375,
0.027618408203125,
0.029388427734375,
-0.03448486328125,
0.062103271484375,
-0.01175689697265625,
0.0465087890625,
0.0021152496337890625,
0.00466156005859375,
0.07177734375,
-0.0430908203125,
-0.0285797119140625,
-0.005985260009765625,
0.0919189453125,
0.0247955322265625,
0.01043701171875,
-0.0010652542114257812,
-0.0118865966796875,
-0.01345062255859375,
0.00823974609375,
-0.0645751953125,
-0.0009388923645019531,
0.007350921630859375,
-0.033416748046875,
-0.0133514404296875,
0.006023406982421875,
-0.0528564453125,
0.012481689453125,
-0.0338134765625,
0.046051025390625,
-0.051361083984375,
-0.014404296875,
0.027923583984375,
0.0059661865234375,
0.0265655517578125,
-0.0001964569091796875,
-0.047210693359375,
0.0165863037109375,
0.026214599609375,
0.053619384765625,
-0.02984619140625,
-0.02081298828125,
-0.03704833984375,
-0.019744873046875,
-0.01462554931640625,
0.048675537109375,
-0.0068206787109375,
-0.0277557373046875,
-0.0012149810791015625,
0.0400390625,
-0.0391845703125,
-0.028289794921875,
0.0965576171875,
-0.0240478515625,
0.054351806640625,
-0.03472900390625,
-0.043182373046875,
-0.01885986328125,
0.030517578125,
-0.040283203125,
0.0882568359375,
0.0172271728515625,
-0.0665283203125,
0.01580810546875,
-0.055328369140625,
-0.00876617431640625,
0.0006270408630371094,
0.0013704299926757812,
-0.04388427734375,
0.00669097900390625,
0.006000518798828125,
0.0216827392578125,
-0.024261474609375,
0.0184783935546875,
-0.004528045654296875,
-0.031951904296875,
0.006893157958984375,
-0.0281829833984375,
0.080078125,
0.0233612060546875,
-0.0298919677734375,
0.016632080078125,
-0.0672607421875,
0.0010223388671875,
-0.0050506591796875,
-0.03900146484375,
-0.0135650634765625,
0.006107330322265625,
0.0240020751953125,
0.01180267333984375,
0.0240936279296875,
-0.046295166015625,
0.0222320556640625,
-0.055145263671875,
0.013427734375,
0.0426025390625,
-0.024169921875,
0.031005859375,
-0.0266571044921875,
0.025634765625,
0.005786895751953125,
0.0010013580322265625,
0.00035262107849121094,
-0.032012939453125,
-0.06060791015625,
-0.0166015625,
0.045074462890625,
0.07904052734375,
-0.0596923828125,
0.0694580078125,
-0.0467529296875,
-0.05859375,
-0.060394287109375,
-0.008575439453125,
0.032257080078125,
0.024688720703125,
0.0391845703125,
-0.01035308837890625,
-0.0299835205078125,
-0.07562255859375,
-0.009490966796875,
-0.0157470703125,
-0.01300811767578125,
0.00988006591796875,
0.0462646484375,
-0.0168609619140625,
0.038848876953125,
-0.03509521484375,
-0.023101806640625,
-0.0170135498046875,
0.0079193115234375,
0.035614013671875,
0.044830322265625,
0.042755126953125,
-0.05792236328125,
-0.0439453125,
-0.0014371871948242188,
-0.0537109375,
-0.0153045654296875,
0.0013141632080078125,
-0.0179443359375,
0.006275177001953125,
0.01082611083984375,
-0.0321044921875,
0.01311492919921875,
0.0460205078125,
-0.053924560546875,
0.042510986328125,
-0.007171630859375,
0.0220794677734375,
-0.10125732421875,
0.01271820068359375,
-0.013763427734375,
-0.005649566650390625,
-0.033172607421875,
0.00030040740966796875,
0.0231475830078125,
0.007415771484375,
-0.060272216796875,
0.039642333984375,
-0.01177978515625,
-0.00514984130859375,
0.0168609619140625,
0.0020465850830078125,
0.007656097412109375,
0.053009033203125,
-0.002262115478515625,
0.065673828125,
0.05206298828125,
-0.034820556640625,
0.0158538818359375,
0.0384521484375,
-0.0301666259765625,
0.028289794921875,
-0.0594482421875,
-0.0237579345703125,
0.02838134765625,
-0.01143646240234375,
-0.046783447265625,
0.007335662841796875,
0.035980224609375,
-0.04718017578125,
0.030853271484375,
-0.007965087890625,
-0.052520751953125,
0.003513336181640625,
-0.0257415771484375,
0.03857421875,
0.040924072265625,
-0.01546478271484375,
0.038360595703125,
-0.0028839111328125,
-0.0019407272338867188,
-0.03662109375,
-0.069580078125,
-0.0108184814453125,
-0.03277587890625,
-0.052215576171875,
0.02813720703125,
-0.037200927734375,
-0.004116058349609375,
0.00007849931716918945,
0.023040771484375,
-0.0050811767578125,
-0.000640869140625,
-0.00246429443359375,
0.021820068359375,
-0.03729248046875,
0.0110321044921875,
0.005016326904296875,
-0.0112762451171875,
-0.005390167236328125,
-0.0180206298828125,
0.04913330078125,
-0.024749755859375,
-0.0186920166015625,
-0.050018310546875,
0.003131866455078125,
0.052825927734375,
-0.0275726318359375,
0.069091796875,
0.04693603515625,
-0.01143646240234375,
0.0211334228515625,
-0.0263671875,
0.0102996826171875,
-0.03369140625,
0.0139312744140625,
-0.024169921875,
-0.05401611328125,
0.03790283203125,
0.00939178466796875,
0.0263671875,
0.0655517578125,
0.048614501953125,
0.0059967041015625,
0.04473876953125,
0.0202484130859375,
-0.006011962890625,
0.0300445556640625,
-0.03717041015625,
-0.00804901123046875,
-0.0794677734375,
0.0032482147216796875,
-0.055328369140625,
-0.0168304443359375,
-0.07025146484375,
-0.0213623046875,
0.018402099609375,
0.00821685791015625,
-0.022674560546875,
0.051239013671875,
-0.039306640625,
0.01319122314453125,
0.041046142578125,
-0.00798797607421875,
0.0219268798828125,
0.005199432373046875,
-0.039947509765625,
-0.0213775634765625,
-0.0300445556640625,
-0.03900146484375,
0.0970458984375,
0.02410888671875,
0.0269012451171875,
0.0117340087890625,
0.0357666015625,
-0.0011749267578125,
0.0117645263671875,
-0.04937744140625,
0.028778076171875,
-0.0201873779296875,
-0.0567626953125,
-0.0202484130859375,
-0.048126220703125,
-0.06353759765625,
0.0297393798828125,
-0.0251007080078125,
-0.0439453125,
0.0181427001953125,
-0.0036983489990234375,
-0.009521484375,
0.033447265625,
-0.04443359375,
0.08343505859375,
-0.00563812255859375,
-0.01473236083984375,
0.0210723876953125,
-0.03167724609375,
0.019927978515625,
-0.00179290771484375,
0.018585205078125,
-0.0121917724609375,
0.006267547607421875,
0.046295166015625,
-0.00580596923828125,
0.033111572265625,
-0.002330780029296875,
-0.0004420280456542969,
0.0086517333984375,
0.0019435882568359375,
0.0258941650390625,
0.00031304359436035156,
-0.032440185546875,
0.02239990234375,
0.0068359375,
-0.034088134765625,
-0.006931304931640625,
0.038238525390625,
-0.060272216796875,
0.0071563720703125,
-0.032684326171875,
-0.047210693359375,
0.0041046142578125,
0.027008056640625,
0.06201171875,
0.0521240234375,
-0.0194091796875,
0.040191650390625,
0.06414794921875,
-0.030517578125,
0.02685546875,
0.04559326171875,
-0.013824462890625,
-0.04730224609375,
0.05804443359375,
0.00455474853515625,
0.02813720703125,
0.043121337890625,
0.0013942718505859375,
-0.01446533203125,
-0.0572509765625,
-0.05474853515625,
0.0247344970703125,
-0.0184326171875,
-0.017578125,
-0.03759765625,
-0.01373291015625,
-0.023651123046875,
0.014556884765625,
-0.048126220703125,
-0.0455322265625,
-0.01251983642578125,
-0.01456451416015625,
0.01122283935546875,
0.01806640625,
-0.0016450881958007812,
0.036529541015625,
-0.073974609375,
0.00927734375,
-0.005767822265625,
0.0268096923828125,
-0.040557861328125,
-0.055816650390625,
-0.03802490234375,
0.00548553466796875,
-0.05255126953125,
-0.056671142578125,
0.04461669921875,
0.0037899017333984375,
0.0221405029296875,
0.0262603759765625,
0.01371002197265625,
0.023834228515625,
-0.04296875,
0.07110595703125,
-0.002384185791015625,
-0.055633544921875,
0.0333251953125,
-0.03802490234375,
0.033477783203125,
0.0738525390625,
0.0184783935546875,
-0.027069091796875,
-0.0367431640625,
-0.0499267578125,
-0.0618896484375,
0.06390380859375,
0.06536865234375,
-0.006259918212890625,
0.0127105712890625,
0.004131317138671875,
-0.00530242919921875,
0.01348114013671875,
-0.08709716796875,
-0.03302001953125,
0.0020847320556640625,
-0.0333251953125,
-0.0160980224609375,
-0.01486968994140625,
-0.01552581787109375,
-0.00971221923828125,
0.0770263671875,
0.00908660888671875,
0.0244140625,
0.034332275390625,
-0.00628662109375,
-0.0201416015625,
0.028228759765625,
0.070556640625,
0.037384033203125,
-0.042510986328125,
-0.014892578125,
0.022552490234375,
-0.030853271484375,
-0.01308441162109375,
0.01727294921875,
-0.031402587890625,
0.0179901123046875,
0.0421142578125,
0.071044921875,
0.01373291015625,
-0.04693603515625,
0.030670166015625,
-0.033935546875,
-0.036163330078125,
-0.04644775390625,
-0.006404876708984375,
0.01508331298828125,
0.00347137451171875,
0.01776123046875,
0.018768310546875,
0.01421356201171875,
-0.0212554931640625,
0.0070343017578125,
0.00457763671875,
-0.05523681640625,
-0.043548583984375,
0.033447265625,
0.0043792724609375,
-0.0214080810546875,
0.0462646484375,
-0.029022216796875,
-0.041259765625,
0.03289794921875,
0.00865936279296875,
0.07159423828125,
-0.01457977294921875,
-0.0153350830078125,
0.058807373046875,
0.045684814453125,
-0.0133819580078125,
0.032379150390625,
0.0107421875,
-0.05938720703125,
-0.033782958984375,
-0.058685302734375,
-0.0093231201171875,
0.01244354248046875,
-0.06610107421875,
0.026947021484375,
0.0233612060546875,
0.0014667510986328125,
-0.021881103515625,
0.0241241455078125,
-0.04046630859375,
0.01033782958984375,
-0.0115966796875,
0.081787109375,
-0.06884765625,
0.0679931640625,
0.03546142578125,
-0.01517486572265625,
-0.058990478515625,
-0.015350341796875,
-0.0145111083984375,
-0.02978515625,
0.03497314453125,
0.01528167724609375,
0.0223541259765625,
-0.01313018798828125,
-0.0171356201171875,
-0.06170654296875,
0.08831787109375,
0.0242462158203125,
-0.046234130859375,
-0.0008282661437988281,
0.01506805419921875,
0.039825439453125,
-0.03631591796875,
0.00928497314453125,
0.034027099609375,
0.054473876953125,
0.0045928955078125,
-0.075927734375,
-0.024566650390625,
-0.04144287109375,
-0.0290679931640625,
0.0321044921875,
-0.0450439453125,
0.06390380859375,
0.03326416015625,
-0.0072021484375,
0.01554107666015625,
0.042205810546875,
0.03387451171875,
0.033782958984375,
0.0391845703125,
0.08026123046875,
0.0269012451171875,
-0.036865234375,
0.071533203125,
-0.0268096923828125,
0.03887939453125,
0.08740234375,
-0.0032825469970703125,
0.061004638671875,
0.0221710205078125,
-0.0117340087890625,
0.034637451171875,
0.051422119140625,
-0.0175018310546875,
0.0419921875,
0.0021839141845703125,
0.0075225830078125,
-0.01190185546875,
0.01873779296875,
-0.054351806640625,
0.021087646484375,
0.023406982421875,
-0.0261077880859375,
0.00852203369140625,
-0.0151519775390625,
0.0045166015625,
-0.0010166168212890625,
-0.01255035400390625,
0.03857421875,
-0.00421142578125,
-0.047698974609375,
0.045501708984375,
-0.0012874603271484375,
0.04931640625,
-0.051788330078125,
0.01552581787109375,
-0.005950927734375,
0.01953125,
0.0022125244140625,
-0.037017822265625,
0.043212890625,
-0.0008196830749511719,
-0.0174407958984375,
-0.0242919921875,
0.015777587890625,
-0.04156494140625,
-0.07476806640625,
0.041290283203125,
0.0355224609375,
0.0198974609375,
0.0032367706298828125,
-0.07037353515625,
0.0042572021484375,
0.0155181884765625,
-0.0455322265625,
0.005615234375,
0.055633544921875,
0.0283050537109375,
0.030548095703125,
0.0404052734375,
0.017852783203125,
0.020294189453125,
-0.0032405853271484375,
0.06201171875,
-0.034393310546875,
-0.0207061767578125,
-0.057647705078125,
0.05804443359375,
-0.007373809814453125,
-0.0528564453125,
0.04156494140625,
0.0772705078125,
0.07940673828125,
-0.00809478759765625,
0.0211639404296875,
-0.005096435546875,
0.05010986328125,
-0.050872802734375,
0.0467529296875,
-0.06524658203125,
0.0238037109375,
-0.012603759765625,
-0.07159423828125,
-0.01486968994140625,
0.02978515625,
-0.0135345458984375,
-0.0203857421875,
0.0614013671875,
0.052032470703125,
-0.013671875,
-0.0162200927734375,
0.0233001708984375,
0.021942138671875,
0.01056671142578125,
0.04388427734375,
0.0308990478515625,
-0.07928466796875,
0.04620361328125,
-0.0284881591796875,
-0.00399017333984375,
0.005977630615234375,
-0.04547119140625,
-0.0679931640625,
-0.0435791015625,
-0.015228271484375,
-0.0183563232421875,
-0.0157318115234375,
0.07159423828125,
0.042816162109375,
-0.0699462890625,
-0.0408935546875,
-0.003879547119140625,
0.00954437255859375,
-0.01190185546875,
-0.0200958251953125,
0.045562744140625,
-0.0284271240234375,
-0.07550048828125,
0.03857421875,
0.006580352783203125,
-0.007724761962890625,
0.00656890869140625,
-0.0240631103515625,
-0.032196044921875,
-0.0030422210693359375,
0.0157318115234375,
-0.0025691986083984375,
-0.035980224609375,
0.0133056640625,
0.015655517578125,
-0.00835418701171875,
0.0263824462890625,
0.03118896484375,
-0.0227203369140625,
0.0170135498046875,
0.057952880859375,
0.0200653076171875,
0.033447265625,
-0.01247406005859375,
0.044281005859375,
-0.0462646484375,
0.0192718505859375,
0.013641357421875,
0.049591064453125,
0.0247650146484375,
-0.00945281982421875,
0.062408447265625,
0.0235595703125,
-0.046661376953125,
-0.06878662109375,
0.00229644775390625,
-0.095458984375,
-0.0010213851928710938,
0.06170654296875,
-0.029083251953125,
-0.015838623046875,
0.0230865478515625,
-0.01064300537109375,
0.019744873046875,
-0.026214599609375,
0.0357666015625,
0.05859375,
0.029876708984375,
0.004779815673828125,
-0.04840087890625,
0.0264739990234375,
0.03253173828125,
-0.055938720703125,
-0.00897216796875,
0.01009368896484375,
0.0128021240234375,
0.0299224853515625,
0.036865234375,
-0.0291900634765625,
0.0119781494140625,
-0.023681640625,
0.02362060546875,
0.0019254684448242188,
-0.01386260986328125,
-0.024688720703125,
-0.0000019669532775878906,
-0.00661468505859375,
-0.0123443603515625
]
] |
etalab-ia/dpr-question_encoder-fr_qa-camembert | 2021-06-16T10:10:09.000Z | [
"transformers",
"pytorch",
"camembert",
"feature-extraction",
"fr",
"dataset:piaf",
"dataset:FQuAD",
"dataset:SQuAD-FR",
"arxiv:2004.04906",
"arxiv:1911.03894",
"endpoints_compatible",
"region:us"
] | feature-extraction | etalab-ia | null | null | etalab-ia/dpr-question_encoder-fr_qa-camembert | 7 | 11,896 | transformers | 2022-03-02T23:29:05 | ---
language: fr
datasets:
- piaf
- FQuAD
- SQuAD-FR
---
# dpr-question_encoder-fr_qa-camembert
## Description
French [DPR model](https://arxiv.org/abs/2004.04906) using [CamemBERT](https://arxiv.org/abs/1911.03894) as the base model, fine-tuned on a combination of three French Q&A datasets.
## Data
### French Q&A
We use a combination of three French Q&A datasets:
1. [PIAFv1.1](https://www.data.gouv.fr/en/datasets/piaf-le-dataset-francophone-de-questions-reponses/)
2. [FQuADv1.0](https://fquad.illuin.tech/)
3. [SQuAD-FR (SQuAD automatically translated to French)](https://github.com/Alikabbadj/French-SQuAD)
### Training
We use 90,562 random questions for `train` and 22,391 for `dev`. No question in `train` exists in `dev`. For each question, we have a single `positive_context` (the paragraph where the answer to this question is found) and around 30 `hard_negative_contexts`. Hard negative contexts are found by querying an Elasticsearch instance (via BM25 retrieval) and keeping the top-k candidates **that do not contain the answer**.
The files are over [here](https://drive.google.com/file/d/1W5Jm3sqqWlsWsx2sFpA39Ewn33PaLQ7U/view?usp=sharing).
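For illustration only, a single training record roughly follows the layout used by the Facebook DPR repository; the key names and the French strings below are assumptions/examples, not excerpts from the released files:
```python
# Sketch of one DPR-style training record (illustrative values, assumed key names).
example_record = {
    "question": "Quelle est la capitale de la France ?",
    "answers": ["Paris"],
    "positive_ctxs": [
        {"title": "Paris", "text": "Paris est la capitale de la France ..."}
    ],
    "hard_negative_ctxs": [
        # top BM25 candidates that do not contain the answer
        {"title": "Lyon", "text": "Lyon est une grande ville française ..."}
    ],
}
```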
### Evaluation
We use the FQuADv1.0 and SQuAD-FR evaluation sets.
## Training Script
We use the official [Facebook DPR implementation](https://github.com/facebookresearch/DPR) with a slight modification: by default, the code works with RoBERTa models, but we changed a single line to make it easier to work with CamemBERT. The modification can be found [over here](https://github.com/psorianom/DPR).
### Hyperparameters
```shell
python -m torch.distributed.launch --nproc_per_node=8 train_dense_encoder.py \
--max_grad_norm 2.0 --encoder_model_type hf_bert --pretrained_file data/bert-base-multilingual-uncased \
--seed 12345 --sequence_length 256 --warmup_steps 1237 --batch_size 16 --do_lower_case \
--train_file DPR_FR_train.json \
--dev_file ./data/100_hard_neg_ctxs/DPR_FR_dev.json \
--output_dir ./output/bert --learning_rate 2e-05 --num_train_epochs 35 \
--dev_batch_size 16 --val_av_rank_start_epoch 25 \
--pretrained_model_cfg ./data/bert-base-multilingual-uncased
```
## Evaluation results
We obtain the following results on the FQuAD and SQuAD-FR evaluation (validation) sets. To obtain these results, we use [haystack's evaluation script](https://github.com/deepset-ai/haystack/blob/db4151bbc026f27c6d709fefef1088cd3f1e18b9/tutorials/Tutorial5_Evaluation.py) (**we report retrieval results only**).
### DPR
#### FQuAD v1.0 Evaluation
```shell
For 2764 out of 3184 questions (86.81%), the answer was in the top-20 candidate passages selected by the retriever.
Retriever Recall: 0.87
Retriever Mean Avg Precision: 0.57
```
#### SQuAD-FR Evaluation
```shell
For 8945 out of 10018 questions (89.29%), the answer was in the top-20 candidate passages selected by the retriever.
Retriever Recall: 0.89
Retriever Mean Avg Precision: 0.63
```
### BM25
For reference, BM25 obtains the results shown below. As in the original paper, DPR is consistently outperformed by BM25 on SQuAD-like datasets.
#### FQuAD v1.0 Evaluation
```shell
For 2966 out of 3184 questions (93.15%), the answer was in the top-20 candidate passages selected by the retriever.
Retriever Recall: 0.93
Retriever Mean Avg Precision: 0.74
```
#### SQuAD-FR Evaluation
```shell
For 9353 out of 10018 questions (93.36%), the answer was in the top-20 candidate passages selected by the retriever.
Retriever Recall: 0.93
Retriever Mean Avg Precision: 0.77
```
## Usage
The results reported here were obtained with the `haystack` library. To obtain similar embeddings using only the HF `transformers` library, you can do the following:
```python
from transformers import AutoTokenizer, AutoModel
query = "Salut, mon chien est-il mignon ?"
tokenizer = AutoTokenizer.from_pretrained("etalab-ia/dpr-question_encoder-fr_qa-camembert", do_lower_case=True)
input_ids = tokenizer(query, return_tensors='pt')["input_ids"]
model = AutoModel.from_pretrained("etalab-ia/dpr-question_encoder-fr_qa-camembert", return_dict=True)
embeddings = model(input_ids).pooler_output
print(embeddings)
```
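For retrieval, this question encoder is paired with the context encoder mentioned below, `etalab-ia/dpr-ctx_encoder-fr_qa-camembert`. A minimal sketch of scoring one passage by dot product, assuming the context encoder loads the same way as the question encoder above (the passage string is illustrative):
```python
ctx_tokenizer = AutoTokenizer.from_pretrained("etalab-ia/dpr-ctx_encoder-fr_qa-camembert", do_lower_case=True)
ctx_model = AutoModel.from_pretrained("etalab-ia/dpr-ctx_encoder-fr_qa-camembert", return_dict=True)

passage = "Les chiots sont souvent considérés comme mignons."  # illustrative passage
ctx_input_ids = ctx_tokenizer(passage, return_tensors="pt")["input_ids"]
ctx_embedding = ctx_model(ctx_input_ids).pooler_output

# DPR ranks passages by the dot product between query and passage vectors
score = (embeddings @ ctx_embedding.T).item()
print(score)
```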
And with `haystack`, we use it as a retriever:
```python
retriever = DensePassageRetriever(
document_store=document_store,
query_embedding_model="etalab-ia/dpr-question_encoder-fr_qa-camembert",
passage_embedding_model="etalab-ia/dpr-ctx_encoder-fr_qa-camembert",
model_version=dpr_model_tag,
infer_tokenizer_classes=True,
)
```
## Acknowledgments
This work was performed using HPC resources from GENCI–IDRIS (Grant 2020-AD011011224).
## Citations
### Datasets
#### PIAF
```
@inproceedings{KeraronLBAMSSS20,
author = {Rachel Keraron and
Guillaume Lancrenon and
Mathilde Bras and
Fr{\'{e}}d{\'{e}}ric Allary and
Gilles Moyse and
Thomas Scialom and
Edmundo{-}Pavel Soriano{-}Morales and
Jacopo Staiano},
title = {Project {PIAF:} Building a Native French Question-Answering Dataset},
booktitle = {{LREC}},
pages = {5481--5490},
publisher = {European Language Resources Association},
year = {2020}
}
```
#### FQuAD
```
@article{dHoffschmidt2020FQuADFQ,
title={FQuAD: French Question Answering Dataset},
author={Martin d'Hoffschmidt and Maxime Vidal and Wacim Belblidia and Tom Brendl{\'e} and Quentin Heinrich},
journal={ArXiv},
year={2020},
volume={abs/2002.06071}
}
```
#### SQuAD-FR
```
@MISC{kabbadj2018,
author = "Kabbadj, Ali",
title = "Something new in French Text Mining and Information Extraction (Universal Chatbot): Largest Q&A French training dataset (110 000+) ",
editor = "linkedin.com",
month = "November",
year = "2018",
url = "\url{https://www.linkedin.com/pulse/something-new-french-text-mining-information-chatbot-largest-kabbadj/}",
note = "[Online; posted 11-November-2018]",
}
```
### Models
#### CamemBERT
HF model card : [https://huggingface.co/camembert-base](https://huggingface.co/camembert-base)
```
@inproceedings{martin2020camembert,
title={CamemBERT: a Tasty French Language Model},
author={Martin, Louis and Muller, Benjamin and Su{\'a}rez, Pedro Javier Ortiz and Dupont, Yoann and Romary, Laurent and de la Clergerie, {\'E}ric Villemonte and Seddah, Djam{\'e} and Sagot, Beno{\^\i}t},
booktitle={Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
year={2020}
}
```
#### DPR
```
@misc{karpukhin2020dense,
title={Dense Passage Retrieval for Open-Domain Question Answering},
author={Vladimir Karpukhin and Barlas Oğuz and Sewon Min and Patrick Lewis and Ledell Wu and Sergey Edunov and Danqi Chen and Wen-tau Yih},
year={2020},
eprint={2004.04906},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 6,905 | [
[
-0.031036376953125,
-0.06915283203125,
0.02850341796875,
0.01493072509765625,
0.001132965087890625,
0.005245208740234375,
-0.0212554931640625,
-0.004589080810546875,
0.01061248779296875,
0.0181427001953125,
-0.049652099609375,
-0.045318603515625,
-0.03839111328125,
0.0202178955078125,
-0.023529052734375,
0.064208984375,
0.0018701553344726562,
0.01229095458984375,
-0.01194000244140625,
-0.0073089599609375,
-0.007770538330078125,
-0.0626220703125,
-0.049346923828125,
-0.00909423828125,
0.021392822265625,
0.020538330078125,
0.0297393798828125,
0.027557373046875,
0.035400390625,
0.0252532958984375,
-0.00960540771484375,
0.0166015625,
-0.041229248046875,
-0.0020008087158203125,
0.0005192756652832031,
-0.030731201171875,
-0.03656005859375,
-0.003681182861328125,
0.04437255859375,
0.02392578125,
-0.004489898681640625,
-0.0011539459228515625,
0.0009427070617675781,
0.052398681640625,
-0.0288238525390625,
0.01117706298828125,
-0.049652099609375,
-0.009063720703125,
0.00511932373046875,
-0.0156402587890625,
-0.028045654296875,
-0.018585205078125,
0.006526947021484375,
-0.03826904296875,
0.03173828125,
-0.01314544677734375,
0.0936279296875,
0.01309967041015625,
-0.0185394287109375,
-0.0166778564453125,
-0.034912109375,
0.06640625,
-0.058563232421875,
0.03436279296875,
0.03912353515625,
0.01198577880859375,
-0.017333984375,
-0.04351806640625,
-0.060394287109375,
-0.00946807861328125,
-0.0128021240234375,
0.025360107421875,
-0.01052093505859375,
-0.0116119384765625,
0.0123748779296875,
0.0341796875,
-0.060150146484375,
-0.00543212890625,
-0.025299072265625,
-0.0192108154296875,
0.0697021484375,
0.0007395744323730469,
0.011383056640625,
-0.0208892822265625,
-0.0252532958984375,
-0.048492431640625,
-0.03338623046875,
0.01262664794921875,
0.0238800048828125,
0.0215911865234375,
-0.022918701171875,
0.039306640625,
-0.0234527587890625,
0.049102783203125,
0.01436614990234375,
0.01119232177734375,
0.04608154296875,
-0.045257568359375,
-0.00518798828125,
-0.0180206298828125,
0.08807373046875,
0.0167236328125,
0.0275115966796875,
0.001194000244140625,
-0.013458251953125,
-0.0205078125,
-0.0021305084228515625,
-0.06805419921875,
-0.0241546630859375,
0.02935791015625,
-0.01219940185546875,
-0.01044464111328125,
0.0161590576171875,
-0.054901123046875,
-0.0009531974792480469,
0.00957489013671875,
0.040771484375,
-0.049224853515625,
-0.017852783203125,
0.0247039794921875,
-0.0267791748046875,
0.040252685546875,
0.01342010498046875,
-0.039276123046875,
0.007511138916015625,
0.026611328125,
0.044921875,
-0.00009167194366455078,
-0.0203094482421875,
-0.0307769775390625,
-0.020751953125,
-0.00775146484375,
0.04815673828125,
-0.0325927734375,
-0.0261077880859375,
-0.0036449432373046875,
0.0247955322265625,
-0.028045654296875,
-0.0311431884765625,
0.035125732421875,
-0.0382080078125,
0.038116455078125,
-0.0362548828125,
-0.053741455078125,
-0.018096923828125,
0.027252197265625,
-0.0377197265625,
0.09429931640625,
0.0142822265625,
-0.053070068359375,
0.0132598876953125,
-0.03192138671875,
-0.0165863037109375,
-0.0030689239501953125,
-0.01739501953125,
-0.0281982421875,
-0.0233917236328125,
0.05523681640625,
0.046600341796875,
-0.0201873779296875,
0.01373291015625,
-0.01513671875,
-0.032135009765625,
0.040069580078125,
-0.026336669921875,
0.08477783203125,
0.0009021759033203125,
-0.0218658447265625,
-0.014862060546875,
-0.049835205078125,
0.004497528076171875,
0.027252197265625,
-0.029144287109375,
-0.006336212158203125,
-0.01030731201171875,
-0.00213623046875,
0.0198974609375,
0.0272369384765625,
-0.057769775390625,
0.006519317626953125,
-0.03387451171875,
0.03656005859375,
0.04632568359375,
0.02301025390625,
0.0242919921875,
-0.03448486328125,
0.039642333984375,
0.007694244384765625,
0.007045745849609375,
-0.0012531280517578125,
-0.03912353515625,
-0.060150146484375,
-0.029266357421875,
0.033843994140625,
0.06195068359375,
-0.052581787109375,
0.060577392578125,
-0.0204620361328125,
-0.04925537109375,
-0.046630859375,
0.001922607421875,
0.0309906005859375,
0.03045654296875,
0.0457763671875,
-0.0066986083984375,
-0.032745361328125,
-0.07183837890625,
0.00604248046875,
-0.0186614990234375,
0.0011997222900390625,
0.036163330078125,
0.05780029296875,
-0.008148193359375,
0.0628662109375,
-0.0479736328125,
-0.018646240234375,
-0.028411865234375,
-0.0159912109375,
0.032379150390625,
0.051300048828125,
0.05682373046875,
-0.0677490234375,
-0.04730224609375,
-0.0266571044921875,
-0.053436279296875,
0.0219573974609375,
0.0006575584411621094,
-0.02593994140625,
0.0102081298828125,
0.03656005859375,
-0.046966552734375,
0.0207977294921875,
0.01519012451171875,
-0.022186279296875,
0.021484375,
-0.00817108154296875,
0.02935791015625,
-0.09375,
0.007587432861328125,
0.00901031494140625,
0.004131317138671875,
-0.0352783203125,
-0.002559661865234375,
0.0059356689453125,
-0.0061798095703125,
-0.0384521484375,
0.055389404296875,
-0.022186279296875,
0.0207061767578125,
0.026275634765625,
0.02081298828125,
0.0131378173828125,
0.06561279296875,
0.00848388671875,
0.06622314453125,
0.037933349609375,
-0.04150390625,
0.018463134765625,
0.0499267578125,
-0.021087646484375,
0.0208892822265625,
-0.064453125,
0.0165557861328125,
-0.007518768310546875,
0.0226593017578125,
-0.08758544921875,
-0.009735107421875,
0.021392822265625,
-0.06640625,
0.023834228515625,
-0.0164031982421875,
-0.047943115234375,
-0.03314208984375,
-0.019287109375,
0.0249176025390625,
0.045623779296875,
-0.0222015380859375,
0.0140380859375,
0.0216522216796875,
-0.006134033203125,
-0.05413818359375,
-0.06103515625,
-0.0236053466796875,
0.0002543926239013672,
-0.047607421875,
0.03192138671875,
-0.0217132568359375,
0.0027141571044921875,
0.0032176971435546875,
-0.003337860107421875,
-0.031463623046875,
-0.0072479248046875,
-0.0027446746826171875,
0.0184173583984375,
-0.0137176513671875,
0.02001953125,
-0.00695037841796875,
0.006977081298828125,
-0.01412200927734375,
-0.00865936279296875,
0.05682373046875,
-0.0304412841796875,
-0.01137542724609375,
-0.0238800048828125,
0.0202178955078125,
0.03448486328125,
-0.045257568359375,
0.0662841796875,
0.057952880859375,
-0.0228424072265625,
0.00007086992263793945,
-0.044769287109375,
-0.031646728515625,
-0.038360595703125,
0.046173095703125,
-0.027557373046875,
-0.061737060546875,
0.05096435546875,
0.04150390625,
0.004871368408203125,
0.061737060546875,
0.03619384765625,
-0.01377105712890625,
0.0767822265625,
0.0236968994140625,
-0.0010061264038085938,
0.0330810546875,
-0.06353759765625,
0.0022563934326171875,
-0.054534912109375,
-0.0251617431640625,
-0.041778564453125,
-0.0198516845703125,
-0.053558349609375,
-0.034912109375,
0.024871826171875,
0.00775909423828125,
-0.024261474609375,
0.03643798828125,
-0.04986572265625,
0.01611328125,
0.039337158203125,
0.033935546875,
-0.002307891845703125,
0.00957489013671875,
-0.01198577880859375,
-0.00513458251953125,
-0.07318115234375,
-0.0171051025390625,
0.092529296875,
0.02301025390625,
0.03790283203125,
-0.00209808349609375,
0.0721435546875,
0.01030731201171875,
-0.006122589111328125,
-0.044769287109375,
0.058349609375,
-0.020416259765625,
-0.06646728515625,
-0.0271453857421875,
-0.0438232421875,
-0.06964111328125,
0.006855010986328125,
-0.01751708984375,
-0.043609619140625,
0.033233642578125,
-0.00081634521484375,
-0.0341796875,
0.014892578125,
-0.040924072265625,
0.07183837890625,
-0.0238037109375,
-0.024566650390625,
-0.007476806640625,
-0.03924560546875,
0.0071868896484375,
0.0085296630859375,
0.00507354736328125,
-0.00952911376953125,
0.0007557868957519531,
0.0743408203125,
-0.0259857177734375,
0.048370361328125,
-0.02099609375,
0.0020198822021484375,
0.032501220703125,
-0.007389068603515625,
0.024932861328125,
0.01549530029296875,
-0.0207977294921875,
0.0168609619140625,
0.01421356201171875,
-0.043060302734375,
-0.037261962890625,
0.05096435546875,
-0.06744384765625,
-0.032135009765625,
-0.037567138671875,
-0.03875732421875,
-0.0027332305908203125,
0.0092926025390625,
0.040252685546875,
0.051025390625,
-0.0155029296875,
0.03131103515625,
0.05902099609375,
-0.041229248046875,
0.029327392578125,
0.042755126953125,
0.003452301025390625,
-0.032318115234375,
0.0655517578125,
0.00988006591796875,
0.0028171539306640625,
0.05474853515625,
0.00229644775390625,
-0.033355712890625,
-0.0367431640625,
-0.034210205078125,
0.035186767578125,
-0.053466796875,
-0.016754150390625,
-0.065185546875,
-0.04400634765625,
-0.0316162109375,
0.0007863044738769531,
-0.03564453125,
-0.041656494140625,
-0.022796630859375,
-0.0110626220703125,
0.044036865234375,
0.025665283203125,
0.00891876220703125,
0.0157470703125,
-0.04974365234375,
0.01160430908203125,
0.017364501953125,
0.00843048095703125,
-0.019744873046875,
-0.054168701171875,
-0.0206146240234375,
0.028778076171875,
-0.012908935546875,
-0.0693359375,
0.0286407470703125,
0.0281524658203125,
0.054046630859375,
-0.004299163818359375,
0.0215911865234375,
0.031646728515625,
-0.0157012939453125,
0.070068359375,
-0.012420654296875,
-0.03802490234375,
0.045318603515625,
-0.0226593017578125,
0.0219879150390625,
0.06304931640625,
0.0465087890625,
-0.017425537109375,
-0.0243377685546875,
-0.056884765625,
-0.0670166015625,
0.0499267578125,
0.0312347412109375,
0.00946044921875,
-0.0295562744140625,
0.0323486328125,
-0.01172637939453125,
0.01335906982421875,
-0.050750732421875,
-0.0275726318359375,
-0.00925445556640625,
-0.02899169921875,
-0.017333984375,
-0.00769805908203125,
-0.0020389556884765625,
-0.042144775390625,
0.0635986328125,
0.006412506103515625,
0.05029296875,
0.046173095703125,
-0.0217132568359375,
0.0183258056640625,
0.0182952880859375,
0.03057861328125,
0.03839111328125,
-0.041290283203125,
-0.01873779296875,
0.01123809814453125,
-0.033477783203125,
0.01264190673828125,
0.0294952392578125,
-0.014434814453125,
0.004604339599609375,
0.032318115234375,
0.05084228515625,
0.006977081298828125,
-0.05126953125,
0.055908203125,
-0.0149078369140625,
-0.03936767578125,
-0.03656005859375,
0.00843048095703125,
-0.0096588134765625,
0.0218658447265625,
0.03765869140625,
-0.01241302490234375,
-0.01154327392578125,
-0.034210205078125,
0.0341796875,
0.03192138671875,
-0.03289794921875,
-0.01751708984375,
0.0362548828125,
0.01280975341796875,
-0.015380859375,
0.06561279296875,
-0.02960205078125,
-0.0648193359375,
0.0538330078125,
0.019683837890625,
0.066162109375,
-0.004299163818359375,
0.0227508544921875,
0.06201171875,
0.0269622802734375,
0.0018157958984375,
0.0408935546875,
0.002407073974609375,
-0.0689697265625,
-0.019775390625,
-0.05255126953125,
-0.0034961700439453125,
0.024627685546875,
-0.050323486328125,
0.0048065185546875,
-0.023284912109375,
-0.0161895751953125,
-0.00982666015625,
0.0205841064453125,
-0.07220458984375,
0.017547607421875,
-0.01702880859375,
0.07073974609375,
-0.053802490234375,
0.051605224609375,
0.065673828125,
-0.0535888671875,
-0.0516357421875,
0.00804901123046875,
-0.022857666015625,
-0.06683349609375,
0.048370361328125,
0.00823211669921875,
0.0157470703125,
0.01934814453125,
-0.041595458984375,
-0.06329345703125,
0.075439453125,
0.0167236328125,
-0.038360595703125,
-0.006694793701171875,
-0.000022590160369873047,
0.0369873046875,
-0.0196380615234375,
0.03326416015625,
0.038238525390625,
0.031494140625,
0.0022602081298828125,
-0.055206298828125,
0.0081634521484375,
-0.042236328125,
-0.0270843505859375,
0.0016632080078125,
-0.0626220703125,
0.0697021484375,
-0.01189422607421875,
-0.01305389404296875,
-0.0024814605712890625,
0.03466796875,
0.0205078125,
0.003509521484375,
0.026123046875,
0.048187255859375,
0.04632568359375,
-0.0184478759765625,
0.0814208984375,
-0.036468505859375,
0.0301666259765625,
0.067626953125,
0.006069183349609375,
0.07098388671875,
0.0239105224609375,
-0.03387451171875,
0.042877197265625,
0.051971435546875,
-0.01094818115234375,
0.043243408203125,
0.00543212890625,
0.005229949951171875,
-0.031646728515625,
0.016937255859375,
-0.037689208984375,
0.040283203125,
0.020355224609375,
-0.0198516845703125,
-0.007198333740234375,
-0.01352691650390625,
0.0007181167602539062,
0.00942230224609375,
0.0088043212890625,
0.051513671875,
-0.0008835792541503906,
-0.052398681640625,
0.08160400390625,
-0.00534820556640625,
0.0516357421875,
-0.05084228515625,
-0.0023517608642578125,
-0.03204345703125,
0.0218048095703125,
-0.0017900466918945312,
-0.0638427734375,
-0.0012407302856445312,
-0.004547119140625,
-0.0164642333984375,
-0.008392333984375,
0.03179931640625,
-0.048065185546875,
-0.05059814453125,
0.0239105224609375,
0.05682373046875,
0.0128631591796875,
-0.01181793212890625,
-0.08258056640625,
-0.0072021484375,
0.016204833984375,
-0.0256500244140625,
0.01788330078125,
0.024658203125,
0.0218658447265625,
0.044769287109375,
0.0391845703125,
-0.010406494140625,
0.0036754608154296875,
-0.010040283203125,
0.06396484375,
-0.044525146484375,
-0.030181884765625,
-0.05743408203125,
0.047760009765625,
-0.0126800537109375,
-0.0343017578125,
0.06201171875,
0.056365966796875,
0.072998046875,
-0.00325775146484375,
0.062103271484375,
-0.021270751953125,
0.042022705078125,
-0.032745361328125,
0.057281494140625,
-0.05889892578125,
0.021087646484375,
-0.0212249755859375,
-0.067626953125,
0.00806427001953125,
0.058563232421875,
-0.0236663818359375,
0.01195526123046875,
0.04443359375,
0.08245849609375,
-0.007965087890625,
-0.02117919921875,
0.0013570785522460938,
0.0218048095703125,
0.02001953125,
0.060882568359375,
0.044708251953125,
-0.06512451171875,
0.053741455078125,
-0.03973388671875,
-0.0161590576171875,
-0.014678955078125,
-0.0382080078125,
-0.0830078125,
-0.06475830078125,
-0.03851318359375,
-0.034515380859375,
0.0102691650390625,
0.067138671875,
0.043792724609375,
-0.06585693359375,
-0.0198516845703125,
-0.005573272705078125,
0.002887725830078125,
-0.0214385986328125,
-0.0212249755859375,
0.045989990234375,
-0.012359619140625,
-0.053985595703125,
0.037200927734375,
-0.014892578125,
0.00927734375,
-0.0025177001953125,
-0.0185546875,
-0.05828857421875,
0.00270843505859375,
0.0276947021484375,
0.01708984375,
-0.035919189453125,
-0.01186370849609375,
0.037078857421875,
-0.01378631591796875,
0.010833740234375,
0.0217132568359375,
-0.053985595703125,
0.0280914306640625,
0.049285888671875,
0.045318603515625,
0.06072998046875,
-0.0009984970092773438,
0.048675537109375,
-0.06976318359375,
0.0221710205078125,
0.027923583984375,
0.0196380615234375,
0.03302001953125,
-0.0220947265625,
0.04534912109375,
0.01910400390625,
-0.038726806640625,
-0.066162109375,
-0.00457763671875,
-0.06768798828125,
-0.0198516845703125,
0.0931396484375,
-0.0157928466796875,
-0.01285552978515625,
0.0131683349609375,
-0.01041412353515625,
0.0300140380859375,
-0.0423583984375,
0.033905029296875,
0.051483154296875,
0.0018634796142578125,
-0.01739501953125,
-0.047607421875,
0.04052734375,
0.033355712890625,
-0.038299560546875,
-0.01496124267578125,
0.0172882080078125,
0.02691650390625,
0.0003235340118408203,
0.05450439453125,
-0.007781982421875,
0.017364501953125,
-0.005527496337890625,
0.0008339881896972656,
-0.01110076904296875,
-0.007080078125,
-0.01511383056640625,
-0.00389862060546875,
-0.022430419921875,
-0.0236663818359375
]
] |
BAAI/bge-small-zh-v1.5 | 2023-10-12T03:35:59.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"feature-extraction",
"zh",
"arxiv:2310.07554",
"arxiv:2309.07597",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | feature-extraction | BAAI | null | null | BAAI/bge-small-zh-v1.5 | 8 | 11,868 | transformers | 2023-09-12T05:22:29 | ---
license: mit
language:
- zh
---
<h1 align="center">FlagEmbedding</h1>
<h4 align="center">
<p>
<a href=#model-list>Model List</a> |
<a href=#frequently-asked-questions>FAQ</a> |
<a href=#usage>Usage</a> |
<a href="#evaluation">Evaluation</a> |
<a href="#train">Train</a> |
<a href="#contact">Contact</a> |
<a href="#citation">Citation</a> |
<a href="#license">License</a>
<p>
</h4>
For more details, please refer to our GitHub: [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding).
[English](README.md) | [中文](https://github.com/FlagOpen/FlagEmbedding/blob/master/README_zh.md)
FlagEmbedding can map any text to a low-dimensional dense vector which can be used for tasks like retrieval, classification, clustering, or semantic search.
It can also be used in vector databases for LLMs.
************* 🌟**Updates**🌟 *************
- 10/12/2023: Release [LLM-Embedder](./FlagEmbedding/llm_embedder/README.md), a unified embedding model to support diverse retrieval augmentation needs for LLMs. [Paper](https://arxiv.org/pdf/2310.07554.pdf) :fire:
- 09/15/2023: The [technical report](https://arxiv.org/pdf/2309.07597.pdf) of BGE has been released
- 09/15/2023: The [massive training data](https://data.baai.ac.cn/details/BAAI-MTP) of BGE has been released
- 09/12/2023: New models:
- **New reranker model**: release cross-encoder models `BAAI/bge-reranker-base` and `BAAI/bge-reranker-large`, which are more powerful than the embedding models. We recommend using or fine-tuning them to re-rank the top-k documents returned by embedding models.
- **Updated embedding model**: release the `bge-*-v1.5` embedding models to alleviate the issue of the similarity distribution and enhance their retrieval ability without instruction.
<details>
<summary>More</summary>
<!-- ### More -->
- 09/07/2023: Update [fine-tune code](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md): Add script to mine hard negatives and support adding instruction during fine-tuning.
- 08/09/2023: BGE models are integrated into **LangChain**; you can use them like [this](#using-langchain). The C-MTEB **leaderboard** is [available](https://huggingface.co/spaces/mteb/leaderboard).
- 08/05/2023: Release base-scale and small-scale models, **best performance among the models of the same size 🤗**
- 08/02/2023: Release `bge-large-*`(short for BAAI General Embedding) Models, **rank 1st on MTEB and C-MTEB benchmark!** :tada: :tada:
- 08/01/2023: We release the [Chinese Massive Text Embedding Benchmark](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB) (**C-MTEB**), consisting of 31 test datasets.
</details>
## Model List
`bge` is short for `BAAI general embedding`.
| Model | Language | Inference & Fine-tune | Description | Query instruction for retrieval [1] |
|:-------------------------------|:--------:| :--------:| :--------:|:--------:|
| [BAAI/llm-embedder](https://huggingface.co/BAAI/llm-embedder) | English | [Inference](./FlagEmbedding/llm_embedder/README.md) [Fine-tune](./FlagEmbedding/llm_embedder/README.md) | a unified embedding model to support diverse retrieval augmentation needs for LLMs | See [README](./FlagEmbedding/llm_embedder/README.md) |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | Chinese and English | [Inference](#usage-for-reranker) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker) | a cross-encoder model which is more accurate but less efficient [2] | |
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh-v1.5](https://huggingface.co/BAAI/bge-large-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | version 1.5 with more reasonable similarity distribution | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-large-en](https://huggingface.co/BAAI/bge-large-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [MTEB](https://huggingface.co/spaces/mteb/leaderboard) leaderboard | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-base-en](https://huggingface.co/BAAI/bge-base-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-en` | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-small-en](https://huggingface.co/BAAI/bge-small-en) | English | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) |a small-scale model but with competitive performance | `Represent this sentence for searching relevant passages: ` |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | :trophy: rank **1st** in [C-MTEB](https://github.com/FlagOpen/FlagEmbedding/tree/master/C_MTEB) benchmark | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a base-scale model but with similar ability to `bge-large-zh` | `为这个句子生成表示以用于检索相关文章:` |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | Chinese | [Inference](#usage-for-embedding-model) [Fine-tune](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) | a small-scale model but with competitive performance | `为这个句子生成表示以用于检索相关文章:` |
[1\]: If you need to search for passages relevant to a query, we suggest adding the instruction to the query; in other cases, no instruction is needed, and you can use the original query directly. In all cases, **no instruction** needs to be added to passages.
[2\]: Unlike the embedding model, the reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding. To balance accuracy and time cost, a cross-encoder is widely used to re-rank the top-k documents retrieved by other, simpler models.
For example, use the bge embedding model to retrieve the top 100 relevant documents, and then use the bge reranker to re-rank those 100 documents to get the final top-3 results, as in the sketch below.
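As a rough sketch of that two-stage flow with FlagEmbedding (the corpus, model choices, and top-k values below are only illustrative):
```python
import numpy as np
from FlagEmbedding import FlagModel, FlagReranker

# Illustrative corpus and query; any text collection works the same way.
corpus = ["样例文档-1", "样例文档-2", "样例文档-3"]
query = "样例查询"

# Stage 1: the embedding model retrieves candidate passages by inner product.
embedder = FlagModel('BAAI/bge-small-zh-v1.5',
                     query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:")
q_emb = embedder.encode_queries([query])
p_emb = embedder.encode(corpus)
candidate_ids = np.argsort(-(q_emb @ p_emb.T)[0])[:100]  # keep up to 100 candidates

# Stage 2: the cross-encoder re-ranks the candidates and keeps the best 3.
reranker = FlagReranker('BAAI/bge-reranker-base')
pair_scores = reranker.compute_score([[query, corpus[i]] for i in candidate_ids])
best = [corpus[candidate_ids[i]] for i in np.argsort(pair_scores)[::-1][:3]]
print(best)
```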
All models have been uploaded to the Hugging Face Hub, and you can see them at https://huggingface.co/BAAI.
If you cannot open the Hugging Face Hub, you can also download the models at https://model.baai.ac.cn/models .
## Frequently asked questions
<details>
<summary>1. How to fine-tune bge embedding model?</summary>
<!-- ### How to fine-tune bge embedding model? -->
Follow this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune) to prepare data and fine-tune your model.
Some suggestions:
- Mine hard negatives following this [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune#hard-negatives), which can improve the retrieval performance.
- If you pre-train bge on your data, the pre-trained model cannot be directly used to calculate similarity, and it must be fine-tuned with contrastive learning before computing similarity.
- If the accuracy of the fine-tuned model is still not high, it is recommended to use or fine-tune the cross-encoder model (bge-reranker) to re-rank the top-k results. Hard negatives are also needed to fine-tune the reranker.
</details>
<details>
<summary>2. The similarity score between two dissimilar sentences is higher than 0.5</summary>
<!-- ### The similarity score between two dissimilar sentences is higher than 0.5 -->
**We suggest using bge v1.5, which alleviates the issue of the similarity distribution.**
Since we fine-tune the models with contrastive learning at a temperature of 0.01,
the similarity distribution of the current BGE model is roughly in the interval \[0.6, 1\].
So a similarity score greater than 0.5 does not indicate that the two sentences are similar.
For downstream tasks, such as passage retrieval or semantic similarity,
**what matters is the relative order of the scores, not the absolute value.**
If you need to filter similar sentences based on a similarity threshold,
please select an appropriate similarity threshold based on the similarity distribution on your data (such as 0.8, 0.85, or even 0.9).
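A minimal sketch of such a filter (the 0.85 threshold and the sentences below are placeholders; choose the threshold from your own similarity distribution):
```python
from FlagEmbedding import FlagModel

model = FlagModel('BAAI/bge-small-zh-v1.5')
pairs = [("样例句子-1", "样例句子-2"), ("样例句子-3", "样例句子-4")]
threshold = 0.85  # example value only; tune it on your own data

for a, b in pairs:
    emb_a = model.encode([a])
    emb_b = model.encode([b])
    score = (emb_a @ emb_b.T)[0, 0]  # inner product of normalized embeddings
    if score >= threshold:
        print(f"similar ({score:.2f}): {a} / {b}")
```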
</details>
<details>
<summary>3. When does the query instruction need to be used</summary>
<!-- ### When does the query instruction need to be used -->
For `bge-*-v1.5`, we improved its retrieval ability when no instruction is used.
Using no instruction causes only a slight degradation in retrieval performance compared with using an instruction.
So, for convenience, you can generate embeddings without an instruction in all cases.
For a retrieval task that uses short queries to find long related documents,
it is recommended to add instructions for these short queries.
**The best method to decide whether to add instructions for queries is choosing the setting that achieves better performance on your task.**
In all cases, the documents/passages do not need to add the instruction.
</details>
## Usage
### Usage for Embedding Model
Here are some examples for using `bge` models with
[FlagEmbedding](#using-flagembedding), [Sentence-Transformers](#using-sentence-transformers), [Langchain](#using-langchain), or [Huggingface Transformers](#using-huggingface-transformers).
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
If that doesn't work for you, see [FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md) for other ways to install FlagEmbedding.
```python
from FlagEmbedding import FlagModel
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = FlagModel('BAAI/bge-large-zh-v1.5',
query_instruction_for_retrieval="为这个句子生成表示以用于检索相关文章:",
use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
# for the s2p (short query to long passage) retrieval task, we suggest using encode_queries(), which automatically adds the instruction to each query
# the corpus in a retrieval task can still use encode() or encode_corpus(), since passages don't need the instruction
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
q_embeddings = model.encode_queries(queries)
p_embeddings = model.encode(passages)
scores = q_embeddings @ p_embeddings.T
```
For the value of the argument `query_instruction_for_retrieval`, see [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list).
By default, FlagModel uses all available GPUs when encoding. Set `os.environ["CUDA_VISIBLE_DEVICES"]` to select specific GPUs.
You can also set `os.environ["CUDA_VISIBLE_DEVICES"]=""` to make all GPUs unavailable.
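For example (the GPU id is illustrative; set the variable before creating the model):
```python
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # use only GPU 0; set to "" to hide all GPUs

from FlagEmbedding import FlagModel
model = FlagModel('BAAI/bge-small-zh-v1.5')  # now encodes on GPU 0 only
```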
#### Using Sentence-Transformers
You can also use the `bge` models with [sentence-transformers](https://www.SBERT.net):
```
pip install -U sentence-transformers
```
```python
from sentence_transformers import SentenceTransformer
sentences_1 = ["样例数据-1", "样例数据-2"]
sentences_2 = ["样例数据-3", "样例数据-4"]
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
embeddings_1 = model.encode(sentences_1, normalize_embeddings=True)
embeddings_2 = model.encode(sentences_2, normalize_embeddings=True)
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
For the s2p (short query to long passage) retrieval task,
each short query should start with an instruction (see the [Model List](https://github.com/FlagOpen/FlagEmbedding/tree/master#model-list) for the instructions).
The instruction is not needed for passages.
```python
from sentence_transformers import SentenceTransformer
queries = ['query_1', 'query_2']
passages = ["样例文档-1", "样例文档-2"]
instruction = "为这个句子生成表示以用于检索相关文章:"
model = SentenceTransformer('BAAI/bge-large-zh-v1.5')
q_embeddings = model.encode([instruction+q for q in queries], normalize_embeddings=True)
p_embeddings = model.encode(passages, normalize_embeddings=True)
scores = q_embeddings @ p_embeddings.T
```
#### Using Langchain
You can use `bge` in langchain like this:
```python
from langchain.embeddings import HuggingFaceBgeEmbeddings
model_name = "BAAI/bge-large-en-v1.5"
model_kwargs = {'device': 'cuda'}
encode_kwargs = {'normalize_embeddings': True} # set True to compute cosine similarity
model = HuggingFaceBgeEmbeddings(
model_name=model_name,
model_kwargs=model_kwargs,
encode_kwargs=encode_kwargs,
query_instruction="为这个句子生成表示以用于检索相关文章:"
)
model.query_instruction = "为这个句子生成表示以用于检索相关文章:"
```
#### Using HuggingFace Transformers
With the transformers package, you can use the model like this: First, you pass your input through the transformer model, then you select the last hidden state of the first token (i.e., [CLS]) as the sentence embedding.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# Sentences we want sentence embeddings for
sentences = ["样例数据-1", "样例数据-2"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-large-zh-v1.5')
model = AutoModel.from_pretrained('BAAI/bge-large-zh-v1.5')
model.eval()
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# for s2p(short query to long passage) retrieval task, add an instruction to query (not add instruction for passages)
# encoded_input = tokenizer([instruction + q for q in queries], padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, cls pooling.
sentence_embeddings = model_output[0][:, 0]
# normalize embeddings
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print("Sentence embeddings:", sentence_embeddings)
```
### Usage for Reranker
Unlike the embedding model, the reranker takes a question and a document as input and directly outputs a similarity score instead of an embedding.
You can get a relevance score by feeding a query and a passage to the reranker.
The reranker is optimized with a cross-entropy loss, so the relevance score is not bounded to a specific range.
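If you want a score in a fixed range, one common convention (our own assumption here, not something the reranker requires) is to map the raw logit to (0, 1) with a sigmoid:
```python
import math

def to_unit_interval(raw_score: float) -> float:
    """Map an unbounded reranker logit to (0, 1) with a sigmoid."""
    return 1.0 / (1.0 + math.exp(-raw_score))

print(to_unit_interval(2.3))   # ~0.91
print(to_unit_interval(-1.0))  # ~0.27
```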
#### Using FlagEmbedding
```
pip install -U FlagEmbedding
```
Get relevance scores (higher scores indicate more relevance):
```python
from FlagEmbedding import FlagReranker
reranker = FlagReranker('BAAI/bge-reranker-large', use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation
score = reranker.compute_score(['query', 'passage'])
print(score)
scores = reranker.compute_score([['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']])
print(scores)
```
#### Using Huggingface transformers
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-reranker-large')
model = AutoModelForSequenceClassification.from_pretrained('BAAI/bge-reranker-large')
model.eval()
pairs = [['what is panda?', 'hi'], ['what is panda?', 'The giant panda (Ailuropoda melanoleuca), sometimes called a panda bear or simply panda, is a bear species endemic to China.']]
with torch.no_grad():
inputs = tokenizer(pairs, padding=True, truncation=True, return_tensors='pt', max_length=512)
scores = model(**inputs, return_dict=True).logits.view(-1, ).float()
print(scores)
```
## Evaluation
`baai-general-embedding` models achieve **state-of-the-art performance on both the MTEB and C-MTEB leaderboards!**
For more details and evaluation tools, see our [scripts](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md).
- **MTEB**:
| Model Name | Dimension | Sequence Length | Average (56) | Retrieval (15) |Clustering (11) | Pair Classification (3) | Reranking (4) | STS (10) | Summarization (1) | Classification (12) |
|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 1024 | 512 | **64.23** | **54.29** | 46.08 | 87.12 | 60.03 | 83.11 | 31.61 | 75.97 |
| [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 768 | 512 | 63.55 | 53.25 | 45.77 | 86.55 | 58.86 | 82.4 | 31.07 | 75.53 |
| [BAAI/bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) | 384 | 512 | 62.17 |51.68 | 43.82 | 84.92 | 58.36 | 81.59 | 30.12 | 74.14 |
| [bge-large-en](https://huggingface.co/BAAI/bge-large-en) | 1024 | 512 | 63.98 | 53.9 | 46.98 | 85.8 | 59.48 | 81.56 | 32.06 | 76.21 |
| [bge-base-en](https://huggingface.co/BAAI/bge-base-en) | 768 | 512 | 63.36 | 53.0 | 46.32 | 85.86 | 58.7 | 81.84 | 29.27 | 75.27 |
| [gte-large](https://huggingface.co/thenlper/gte-large) | 1024 | 512 | 63.13 | 52.22 | 46.84 | 85.00 | 59.13 | 83.35 | 31.66 | 73.33 |
| [gte-base](https://huggingface.co/thenlper/gte-base) | 768 | 512 | 62.39 | 51.14 | 46.2 | 84.57 | 58.61 | 82.3 | 31.17 | 73.01 |
| [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2) | 1024| 512 | 62.25 | 50.56 | 44.49 | 86.03 | 56.61 | 82.05 | 30.19 | 75.24 |
| [bge-small-en](https://huggingface.co/BAAI/bge-small-en) | 384 | 512 | 62.11 | 51.82 | 44.31 | 83.78 | 57.97 | 80.72 | 30.53 | 74.37 |
| [instructor-xl](https://huggingface.co/hkunlp/instructor-xl) | 768 | 512 | 61.79 | 49.26 | 44.74 | 86.62 | 57.29 | 83.06 | 32.32 | 61.79 |
| [e5-base-v2](https://huggingface.co/intfloat/e5-base-v2) | 768 | 512 | 61.5 | 50.29 | 43.80 | 85.73 | 55.91 | 81.05 | 30.28 | 73.84 |
| [gte-small](https://huggingface.co/thenlper/gte-small) | 384 | 512 | 61.36 | 49.46 | 44.89 | 83.54 | 57.7 | 82.07 | 30.42 | 72.31 |
| [text-embedding-ada-002](https://platform.openai.com/docs/guides/embeddings) | 1536 | 8192 | 60.99 | 49.25 | 45.9 | 84.89 | 56.32 | 80.97 | 30.8 | 70.93 |
| [e5-small-v2](https://huggingface.co/intfloat/e5-base-v2) | 384 | 512 | 59.93 | 49.04 | 39.92 | 84.67 | 54.32 | 80.39 | 31.16 | 72.94 |
| [sentence-t5-xxl](https://huggingface.co/sentence-transformers/sentence-t5-xxl) | 768 | 512 | 59.51 | 42.24 | 43.72 | 85.06 | 56.42 | 82.63 | 30.08 | 73.42 |
| [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) | 768 | 514 | 57.78 | 43.81 | 43.69 | 83.04 | 59.36 | 80.28 | 27.49 | 65.07 |
| [sgpt-bloom-7b1-msmarco](https://huggingface.co/bigscience/sgpt-bloom-7b1-msmarco) | 4096 | 2048 | 57.59 | 48.22 | 38.93 | 81.9 | 55.65 | 77.74 | 33.6 | 66.19 |
- **C-MTEB**:
We created the C-MTEB benchmark for Chinese text embeddings, which consists of 31 datasets from 6 tasks.
Please refer to [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/README.md) for a detailed introduction.
| Model | Embedding dimension | Avg | Retrieval | STS | PairClassification | Classification | Reranking | Clustering |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| [**BAAI/bge-large-zh-v1.5**](https://huggingface.co/BAAI/bge-large-zh-v1.5) | 1024 | **64.53** | 70.46 | 56.25 | 81.6 | 69.13 | 65.84 | 48.99 |
| [BAAI/bge-base-zh-v1.5](https://huggingface.co/BAAI/bge-base-zh-v1.5) | 768 | 63.13 | 69.49 | 53.72 | 79.75 | 68.07 | 65.39 | 47.53 |
| [BAAI/bge-small-zh-v1.5](https://huggingface.co/BAAI/bge-small-zh-v1.5) | 512 | 57.82 | 61.77 | 49.11 | 70.41 | 63.96 | 60.92 | 44.18 |
| [BAAI/bge-large-zh](https://huggingface.co/BAAI/bge-large-zh) | 1024 | 64.20 | 71.53 | 54.98 | 78.94 | 68.32 | 65.11 | 48.39 |
| [bge-large-zh-noinstruct](https://huggingface.co/BAAI/bge-large-zh-noinstruct) | 1024 | 63.53 | 70.55 | 53 | 76.77 | 68.58 | 64.91 | 50.01 |
| [BAAI/bge-base-zh](https://huggingface.co/BAAI/bge-base-zh) | 768 | 62.96 | 69.53 | 54.12 | 77.5 | 67.07 | 64.91 | 47.63 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 1024 | 58.79 | 63.66 | 48.44 | 69.89 | 67.34 | 56.00 | 48.23 |
| [BAAI/bge-small-zh](https://huggingface.co/BAAI/bge-small-zh) | 512 | 58.27 | 63.07 | 49.45 | 70.35 | 63.64 | 61.48 | 45.09 |
| [m3e-base](https://huggingface.co/moka-ai/m3e-base) | 768 | 57.10 | 56.91 | 50.47 | 63.99 | 67.52 | 59.34 | 47.68 |
| [m3e-large](https://huggingface.co/moka-ai/m3e-large) | 1024 | 57.05 | 54.75 | 50.42 | 64.3 | 68.2 | 59.66 | 48.88 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 768 | 55.48 | 61.63 | 46.49 | 67.07 | 65.35 | 54.35 | 40.68 |
| [multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) | 384 | 55.38 | 59.95 | 45.27 | 66.45 | 65.85 | 53.86 | 45.26 |
| [text-embedding-ada-002(OpenAI)](https://platform.openai.com/docs/guides/embeddings/what-are-embeddings) | 1536 | 53.02 | 52.0 | 43.35 | 69.56 | 64.31 | 54.28 | 45.68 |
| [luotuo](https://huggingface.co/silk-road/luotuo-bert-medium) | 1024 | 49.37 | 44.4 | 42.78 | 66.62 | 61 | 49.25 | 44.39 |
| [text2vec-base](https://huggingface.co/shibing624/text2vec-base-chinese) | 768 | 47.63 | 38.79 | 43.41 | 67.41 | 62.19 | 49.45 | 37.66 |
| [text2vec-large](https://huggingface.co/GanymedeNil/text2vec-large-chinese) | 1024 | 47.36 | 41.94 | 44.97 | 70.86 | 60.66 | 49.16 | 30.02 |
- **Reranking**:
See [C_MTEB](https://github.com/FlagOpen/FlagEmbedding/blob/master/C_MTEB/) for evaluation script.
| Model | T2Reranking | T2RerankingZh2En\* | T2RerankingEn2Zh\* | MMarcoReranking | CMedQAv1 | CMedQAv2 | Avg |
|:-------------------------------|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|:--------:|
| text2vec-base-multilingual | 64.66 | 62.94 | 62.51 | 14.37 | 48.46 | 48.6 | 50.26 |
| multilingual-e5-small | 65.62 | 60.94 | 56.41 | 29.91 | 67.26 | 66.54 | 57.78 |
| multilingual-e5-large | 64.55 | 61.61 | 54.28 | 28.6 | 67.42 | 67.92 | 57.4 |
| multilingual-e5-base | 64.21 | 62.13 | 54.68 | 29.5 | 66.23 | 66.98 | 57.29 |
| m3e-base | 66.03 | 62.74 | 56.07 | 17.51 | 77.05 | 76.76 | 59.36 |
| m3e-large | 66.13 | 62.72 | 56.1 | 16.46 | 77.76 | 78.27 | 59.57 |
| bge-base-zh-v1.5 | 66.49 | 63.25 | 57.02 | 29.74 | 80.47 | 84.88 | 63.64 |
| bge-large-zh-v1.5 | 65.74 | 63.39 | 57.03 | 28.74 | 83.45 | 85.44 | 63.97 |
| [BAAI/bge-reranker-base](https://huggingface.co/BAAI/bge-reranker-base) | 67.28 | 63.95 | 60.45 | 35.46 | 81.26 | 84.1 | 65.42 |
| [BAAI/bge-reranker-large](https://huggingface.co/BAAI/bge-reranker-large) | 67.6 | 64.03 | 61.44 | 37.16 | 82.15 | 84.18 | 66.09 |
\* : T2RerankingZh2En and T2RerankingEn2Zh are cross-language retrieval tasks
## Train
### BAAI Embedding
We pre-train the models using [retromae](https://github.com/staoxiao/RetroMAE) and train them on large-scale pair data using contrastive learning.
**You can fine-tune the embedding model on your data following our [examples](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune).**
We also provide a [pre-train example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/pretrain).
Note that the goal of pre-training is to reconstruct the text; the pre-trained model cannot be used for similarity calculation directly and needs to be fine-tuned.
For more training details for bge, see [baai_general_embedding](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/baai_general_embedding/README.md).
### BGE Reranker
The cross-encoder performs full attention over the input pair,
which is more accurate than the embedding model (i.e., bi-encoder) but also more time-consuming.
Therefore, it can be used to re-rank the top-k documents returned by the embedding model.
We train the cross-encoder on multilingual pair data.
The data format is the same as for the embedding model, so you can fine-tune it easily following our [example](https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/reranker).
For more details, please refer to [./FlagEmbedding/reranker/README.md](https://github.com/FlagOpen/FlagEmbedding/tree/master/FlagEmbedding/reranker).
## Contact
If you have any questions or suggestions related to this project, feel free to open an issue or pull request.
You can also email Shitao Xiao (stxiao@baai.ac.cn) and Zheng Liu (liuzheng@baai.ac.cn).
## Citation
If you find this repository useful, please consider giving it a star :star: and a citation:
```
@misc{bge_embedding,
title={C-Pack: Packaged Resources To Advance General Chinese Embedding},
author={Shitao Xiao and Zheng Liu and Peitian Zhang and Niklas Muennighoff},
year={2023},
eprint={2309.07597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## License
FlagEmbedding is licensed under the [MIT License](https://github.com/FlagOpen/FlagEmbedding/blob/master/LICENSE). The released models can be used for commercial purposes free of charge.
| 27,165 | [
[
-0.036651611328125,
-0.06793212890625,
0.029296875,
0.0118865966796875,
-0.02716064453125,
-0.02056884765625,
-0.0237579345703125,
-0.0189666748046875,
0.0300445556640625,
0.0283966064453125,
-0.0260467529296875,
-0.06549072265625,
-0.036102294921875,
-0.004604339599609375,
-0.006374359130859375,
0.041473388671875,
-0.0037555694580078125,
0.010711669921875,
0.003925323486328125,
-0.0186767578125,
-0.0283966064453125,
-0.01904296875,
-0.049224853515625,
-0.0192108154296875,
0.025726318359375,
0.017059326171875,
0.042205810546875,
0.056365966796875,
0.0222930908203125,
0.02020263671875,
-0.0175933837890625,
0.01165771484375,
-0.0357666015625,
-0.005153656005859375,
-0.01552581787109375,
-0.024871826171875,
-0.03143310546875,
0.01317596435546875,
0.050994873046875,
0.0335693359375,
-0.007740020751953125,
0.00778961181640625,
0.0004954338073730469,
0.053192138671875,
-0.034515380859375,
0.020751953125,
-0.042724609375,
0.0027294158935546875,
-0.0180511474609375,
0.01128387451171875,
-0.03863525390625,
-0.0287017822265625,
0.0119476318359375,
-0.0458984375,
0.00617218017578125,
0.0216064453125,
0.097412109375,
0.01541900634765625,
-0.033935546875,
-0.01239013671875,
-0.009063720703125,
0.0740966796875,
-0.07525634765625,
0.051177978515625,
0.037933349609375,
0.0190277099609375,
-0.005809783935546875,
-0.0609130859375,
-0.0269622802734375,
-0.0124053955078125,
-0.0151519775390625,
0.03155517578125,
0.001903533935546875,
0.0014982223510742188,
0.0236358642578125,
0.044586181640625,
-0.041412353515625,
0.007049560546875,
-0.00511932373046875,
-0.011871337890625,
0.0570068359375,
-0.01230621337890625,
0.034210205078125,
-0.041595458984375,
-0.0223541259765625,
-0.027435302734375,
-0.0596923828125,
0.0034122467041015625,
0.0273284912109375,
0.0102386474609375,
-0.02935791015625,
0.0423583984375,
-0.017120361328125,
0.045501708984375,
0.003787994384765625,
0.00383758544921875,
0.03924560546875,
-0.0278472900390625,
-0.0154876708984375,
-0.0109405517578125,
0.069580078125,
0.029449462890625,
-0.004314422607421875,
0.003849029541015625,
-0.024139404296875,
-0.007076263427734375,
-0.006961822509765625,
-0.06695556640625,
-0.018157958984375,
0.0148162841796875,
-0.05712890625,
-0.0135498046875,
0.0177459716796875,
-0.05816650390625,
0.00774383544921875,
0.00010311603546142578,
0.043670654296875,
-0.055755615234375,
-0.005519866943359375,
0.02337646484375,
-0.015777587890625,
0.029998779296875,
-0.00024437904357910156,
-0.046844482421875,
-0.018524169921875,
0.039764404296875,
0.0640869140625,
0.0124969482421875,
-0.005706787109375,
-0.0279541015625,
0.00283050537109375,
-0.010650634765625,
0.0244598388671875,
-0.03887939453125,
-0.01331329345703125,
0.015777587890625,
0.0289764404296875,
-0.00771331787109375,
-0.0216827392578125,
0.06597900390625,
-0.040130615234375,
0.02691650390625,
-0.0283050537109375,
-0.0615234375,
-0.037506103515625,
0.0069122314453125,
-0.060089111328125,
0.08270263671875,
-0.00733184814453125,
-0.06341552734375,
0.006229400634765625,
-0.048248291015625,
-0.0161895751953125,
-0.01910400390625,
-0.0024871826171875,
-0.044769287109375,
-0.00879669189453125,
0.0284423828125,
0.043701171875,
-0.017120361328125,
0.002605438232421875,
-0.025970458984375,
-0.042755126953125,
-0.0005397796630859375,
-0.017303466796875,
0.0819091796875,
0.0191497802734375,
-0.025115966796875,
-0.0164794921875,
-0.032806396484375,
0.00899505615234375,
0.022705078125,
-0.02337646484375,
-0.025787353515625,
0.0165557861328125,
0.0176849365234375,
0.0038814544677734375,
0.039642333984375,
-0.05279541015625,
0.01369476318359375,
-0.043792724609375,
0.044464111328125,
0.04180908203125,
0.012939453125,
0.0179290771484375,
-0.035491943359375,
0.021514892578125,
-0.0017499923706054688,
-0.0028553009033203125,
-0.0166168212890625,
-0.039703369140625,
-0.046966552734375,
-0.0226287841796875,
0.055419921875,
0.04937744140625,
-0.0653076171875,
0.049774169921875,
-0.034210205078125,
-0.04632568359375,
-0.07049560546875,
0.0100555419921875,
0.039947509765625,
0.00016045570373535156,
0.05377197265625,
-0.01035308837890625,
-0.035858154296875,
-0.0699462890625,
-0.0046234130859375,
0.005855560302734375,
-0.00679779052734375,
0.040191650390625,
0.0460205078125,
-0.023895263671875,
0.0305023193359375,
-0.05487060546875,
-0.026153564453125,
-0.0172119140625,
-0.0054779052734375,
0.0253753662109375,
0.036773681640625,
0.0478515625,
-0.07537841796875,
-0.043701171875,
-0.000606536865234375,
-0.058319091796875,
0.00571441650390625,
0.0027141571044921875,
-0.0224151611328125,
0.01305389404296875,
0.045654296875,
-0.0307769775390625,
0.017791748046875,
0.035675048828125,
-0.019317626953125,
0.0210418701171875,
-0.0015697479248046875,
0.010986328125,
-0.09912109375,
0.00164031982421875,
0.022613525390625,
-0.008575439453125,
-0.020477294921875,
0.0389404296875,
0.012725830078125,
0.0154266357421875,
-0.025909423828125,
0.0439453125,
-0.0394287109375,
0.0187530517578125,
0.009674072265625,
0.0460205078125,
-0.0067138671875,
0.038330078125,
-0.0034999847412109375,
0.0537109375,
0.02783203125,
-0.0299072265625,
0.00928497314453125,
0.03955078125,
-0.0333251953125,
0.0060882568359375,
-0.04937744140625,
-0.005687713623046875,
-0.00554656982421875,
0.0125579833984375,
-0.06207275390625,
-0.005462646484375,
0.0198516845703125,
-0.042999267578125,
0.03955078125,
-0.0224456787109375,
-0.037261962890625,
-0.027587890625,
-0.06829833984375,
0.010986328125,
0.04376220703125,
-0.048553466796875,
0.0164794921875,
0.022125244140625,
0.006961822509765625,
-0.0579833984375,
-0.061309814453125,
-0.01165771484375,
-0.00017368793487548828,
-0.03955078125,
0.0408935546875,
-0.002140045166015625,
0.0191650390625,
0.0141754150390625,
-0.005359649658203125,
0.0112762451171875,
0.00865936279296875,
-0.00020933151245117188,
0.0184326171875,
-0.035797119140625,
0.0035400390625,
0.02056884765625,
0.0098114013671875,
-0.0148773193359375,
-0.01212310791015625,
0.033111572265625,
-0.01291656494140625,
-0.02679443359375,
-0.017791748046875,
0.0255584716796875,
0.0192413330078125,
-0.0303955078125,
0.0445556640625,
0.07452392578125,
-0.0281829833984375,
-0.0062713623046875,
-0.0496826171875,
-0.00923919677734375,
-0.036224365234375,
0.034149169921875,
-0.02435302734375,
-0.07379150390625,
0.02972412109375,
-0.0015201568603515625,
0.0162506103515625,
0.050872802734375,
0.0252532958984375,
-0.01061248779296875,
0.08099365234375,
0.0281524658203125,
-0.020294189453125,
0.049835205078125,
-0.049774169921875,
0.0133209228515625,
-0.0882568359375,
-0.00335693359375,
-0.0297088623046875,
-0.0297088623046875,
-0.099853515625,
-0.03802490234375,
0.00463104248046875,
0.021026611328125,
-0.02862548828125,
0.0323486328125,
-0.04302978515625,
0.01145172119140625,
0.036407470703125,
0.0222930908203125,
-0.00138092041015625,
0.00933837890625,
-0.0325927734375,
-0.0203704833984375,
-0.04583740234375,
-0.038238525390625,
0.07513427734375,
0.036407470703125,
0.046051025390625,
0.027374267578125,
0.061981201171875,
0.01422119140625,
0.007293701171875,
-0.058197021484375,
0.042999267578125,
-0.03936767578125,
-0.04296875,
-0.0269927978515625,
-0.036712646484375,
-0.083984375,
0.02984619140625,
-0.0206146240234375,
-0.058319091796875,
0.00806427001953125,
-0.0148773193359375,
-0.0022869110107421875,
0.03515625,
-0.050872802734375,
0.07720947265625,
-0.00811004638671875,
-0.0231781005859375,
-0.0058746337890625,
-0.03155517578125,
0.0245208740234375,
0.01497650146484375,
0.00620269775390625,
0.00556182861328125,
-0.01953125,
0.05718994140625,
-0.01416015625,
0.04803466796875,
-0.01220703125,
0.01123809814453125,
0.03240966796875,
-0.01386260986328125,
0.04168701171875,
0.006061553955078125,
-0.0135498046875,
0.0226898193359375,
0.0067291259765625,
-0.036376953125,
-0.037384033203125,
0.06634521484375,
-0.05072021484375,
-0.05328369140625,
-0.0282440185546875,
-0.0188751220703125,
0.01348114013671875,
0.032989501953125,
0.0265960693359375,
0.0165252685546875,
-0.0077667236328125,
0.048736572265625,
0.06982421875,
-0.0411376953125,
0.028900146484375,
0.0261077880859375,
-0.0206146240234375,
-0.044647216796875,
0.0845947265625,
0.019805908203125,
-0.003963470458984375,
0.05078125,
0.0010051727294921875,
-0.021087646484375,
-0.0400390625,
-0.034393310546875,
0.047943115234375,
-0.04473876953125,
-0.01264190673828125,
-0.048309326171875,
-0.0322265625,
-0.0325927734375,
0.0016393661499023438,
-0.020416259765625,
-0.021331787109375,
-0.013427734375,
-0.021209716796875,
0.0177764892578125,
0.0357666015625,
0.00916290283203125,
0.00669097900390625,
-0.0535888671875,
0.015899658203125,
-0.00736236572265625,
0.033172607421875,
0.005413055419921875,
-0.04071044921875,
-0.046844482421875,
0.013153076171875,
-0.03692626953125,
-0.081787109375,
0.0262908935546875,
0.005680084228515625,
0.06317138671875,
0.02484130859375,
-0.000843048095703125,
0.0309600830078125,
-0.03955078125,
0.08062744140625,
-0.008148193359375,
-0.05926513671875,
0.0384521484375,
-0.02117919921875,
0.012420654296875,
0.042205810546875,
0.049285888671875,
-0.03497314453125,
-0.0206451416015625,
-0.03704833984375,
-0.07275390625,
0.036712646484375,
0.01372528076171875,
0.003223419189453125,
-0.022369384765625,
0.0247802734375,
-0.01372528076171875,
-0.0001710653305053711,
-0.060302734375,
-0.05621337890625,
-0.0251922607421875,
-0.02655029296875,
-0.0073089599609375,
-0.0208892822265625,
0.01558685302734375,
-0.021881103515625,
0.075439453125,
0.0003371238708496094,
0.041412353515625,
0.0270233154296875,
-0.024688720703125,
0.0180511474609375,
0.019073486328125,
0.0224609375,
0.01412200927734375,
-0.0291595458984375,
-0.01093292236328125,
0.0237579345703125,
-0.0416259765625,
-0.004810333251953125,
0.0233612060546875,
-0.035308837890625,
0.01459503173828125,
0.023040771484375,
0.05328369140625,
0.033843994140625,
-0.03338623046875,
0.042633056640625,
0.00862884521484375,
-0.01419830322265625,
-0.02252197265625,
-0.005405426025390625,
0.02301025390625,
0.0189666748046875,
0.00879669189453125,
-0.034393310546875,
0.0199737548828125,
-0.039947509765625,
0.0255584716796875,
0.033843994140625,
-0.028656005859375,
-0.005062103271484375,
0.05279541015625,
0.0026149749755859375,
-0.0016050338745117188,
0.03607177734375,
-0.037841796875,
-0.055450439453125,
0.031982421875,
0.028289794921875,
0.06329345703125,
-0.010986328125,
0.016876220703125,
0.06500244140625,
0.04010009765625,
-0.02410888671875,
0.026885986328125,
0.0058441162109375,
-0.04400634765625,
-0.033447265625,
-0.0408935546875,
-0.004383087158203125,
0.0200958251953125,
-0.043548583984375,
0.0264434814453125,
-0.031402587890625,
-0.01113128662109375,
0.0023651123046875,
0.033111572265625,
-0.05609130859375,
0.00955963134765625,
0.0034236907958984375,
0.08477783203125,
-0.04400634765625,
0.06292724609375,
0.07476806640625,
-0.07232666015625,
-0.058197021484375,
0.0059814453125,
-0.0098419189453125,
-0.045867919921875,
0.0283050537109375,
0.0198822021484375,
0.0133209228515625,
0.004669189453125,
-0.03594970703125,
-0.06890869140625,
0.11822509765625,
0.002880096435546875,
-0.0399169921875,
-0.00466156005859375,
-0.021240234375,
0.034454345703125,
-0.02850341796875,
0.033599853515625,
0.0310516357421875,
0.0460205078125,
-0.0139617919921875,
-0.04852294921875,
0.040863037109375,
-0.023834228515625,
0.0176849365234375,
0.0036907196044921875,
-0.0733642578125,
0.0623779296875,
0.0034465789794921875,
-0.0249786376953125,
0.014678955078125,
0.05450439453125,
0.0173187255859375,
0.031585693359375,
0.0181732177734375,
0.07000732421875,
0.049652099609375,
-0.01678466796875,
0.08709716796875,
-0.019439697265625,
0.04736328125,
0.06549072265625,
0.01274871826171875,
0.0843505859375,
0.006679534912109375,
-0.0177459716796875,
0.05059814453125,
0.0596923828125,
-0.024078369140625,
0.035064697265625,
0.0014257431030273438,
0.004581451416015625,
-0.0240631103515625,
0.004016876220703125,
-0.04034423828125,
0.021697998046875,
0.0245513916015625,
-0.038970947265625,
0.0033702850341796875,
-0.0220489501953125,
0.0089874267578125,
0.00799560546875,
-0.0017108917236328125,
0.04339599609375,
0.0235137939453125,
-0.035064697265625,
0.050079345703125,
0.0177459716796875,
0.07611083984375,
-0.0305023193359375,
-0.0114898681640625,
-0.0212860107421875,
-0.0085906982421875,
-0.01708984375,
-0.05889892578125,
-0.006099700927734375,
-0.019378662109375,
-0.0155487060546875,
0.00634765625,
0.0406494140625,
-0.046966552734375,
-0.0306549072265625,
0.0426025390625,
0.038818359375,
0.0182647705078125,
0.01349639892578125,
-0.08270263671875,
0.0023555755615234375,
0.0288238525390625,
-0.040130615234375,
0.0231170654296875,
0.035552978515625,
-0.004669189453125,
0.04437255859375,
0.043975830078125,
0.00484466552734375,
-0.0014820098876953125,
0.0030384063720703125,
0.0386962890625,
-0.07037353515625,
-0.022979736328125,
-0.047607421875,
0.02716064453125,
-0.0246124267578125,
0.0016565322875976562,
0.060791015625,
0.053009033203125,
0.08074951171875,
-0.00397491455078125,
0.061309814453125,
-0.008636474609375,
0.03070068359375,
-0.045379638671875,
0.067138671875,
-0.0772705078125,
0.0194244384765625,
-0.0266265869140625,
-0.07049560546875,
-0.0118408203125,
0.052703857421875,
-0.0252838134765625,
0.0174102783203125,
0.051177978515625,
0.0736083984375,
-0.019256591796875,
-0.01438140869140625,
0.023193359375,
0.0328369140625,
0.01187896728515625,
0.0596923828125,
0.0259857177734375,
-0.0736083984375,
0.0482177734375,
-0.017852783203125,
0.00962066650390625,
-0.039154052734375,
-0.04791259765625,
-0.06939697265625,
-0.055145263671875,
-0.031707763671875,
-0.022979736328125,
-0.0034198760986328125,
0.06927490234375,
0.0257568359375,
-0.056304931640625,
-0.0052032470703125,
0.02081298828125,
0.03643798828125,
-0.0200958251953125,
-0.0207061767578125,
0.0496826171875,
-0.005706787109375,
-0.07037353515625,
0.02471923828125,
-0.006282806396484375,
-0.005138397216796875,
-0.0039825439453125,
-0.0184326171875,
-0.06640625,
0.0089569091796875,
0.044952392578125,
0.0191650390625,
-0.06866455078125,
-0.031585693359375,
0.00634765625,
-0.0196075439453125,
-0.01158905029296875,
0.01271820068359375,
-0.0309906005859375,
0.0269927978515625,
0.046600341796875,
0.05792236328125,
0.0496826171875,
-0.0035190582275390625,
0.01529693603515625,
-0.04608154296875,
-0.006397247314453125,
-0.003162384033203125,
0.053375244140625,
0.0273895263671875,
-0.02264404296875,
0.06781005859375,
0.016082763671875,
-0.03045654296875,
-0.056610107421875,
0.002437591552734375,
-0.07928466796875,
-0.0247344970703125,
0.08404541015625,
-0.031982421875,
-0.0188140869140625,
0.0205535888671875,
-0.01439666748046875,
0.041259765625,
-0.036956787109375,
0.035369873046875,
0.05999755859375,
0.03271484375,
-0.0117034912109375,
-0.0684814453125,
0.0235137939453125,
0.046722412109375,
-0.0206756591796875,
-0.0255126953125,
0.025390625,
0.036346435546875,
0.0179595947265625,
0.00876617431640625,
-0.0185699462890625,
0.024017333984375,
-0.005641937255859375,
-0.0006580352783203125,
-0.010406494140625,
0.018524169921875,
-0.01372528076171875,
0.001949310302734375,
-0.0125579833984375,
-0.0232086181640625
]
] |
defog/sqlcoder2 | 2023-10-13T16:43:20.000Z | [
"transformers",
"pytorch",
"gpt_bigcode",
"text-generation",
"code",
"en",
"license:other",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | defog | null | null | defog/sqlcoder2 | 65 | 11,862 | transformers | 2023-10-02T12:13:43 | ---
license: other
language:
- en
pipeline_tag: text-generation
tags:
- code
---
# Defog SQLCoder
Defog's SQLCoder is a state-of-the-art LLM for converting natural language questions to SQL queries.
[Interactive Demo](https://defog.ai/sqlcoder-demo/) | [🤗 HF Repo](https://huggingface.co/defog/sqlcoder2) | [♾️ Colab](https://colab.research.google.com/drive/1z4rmOEiFkxkMiecAWeTUlPl0OmKgfEu7?usp=sharing) | [🐦 Twitter](https://twitter.com/defogdata)
## TL;DR
SQLCoder is a 15B parameter model that outperforms `gpt-3.5-turbo` for natural language to SQL generation tasks on our [sql-eval](https://github.com/defog-ai/sql-eval) framework, and significantly outperforms all popular open-source models. When fine-tuned on a given schema, it also outperforms `gpt-4`.
SQLCoder is fine-tuned on a base StarCoder model.
## Results on novel datasets not seen in training
| model | perc_correct |
|-|-|
| gpt4-2023-10-04 | 82.0 |
| defog-sqlcoder2 | 77.5 |
| gpt4-2023-08-28 | 74.0 |
| defog-sqlcoder-7b | 71.0 |
| gpt-3.5-2023-10-04 | 66.0 |
| claude-2 | 64.5 |
| gpt-3.5-2023-08-28 | 61.0 |
| claude_instant_1 | 61.0 |
| text-davinci-003 | 52.5 |
## License
The code in this repo (what little there is of it) is Apache-2 licensed. The model weights have a `CC BY-SA 4.0` license, with additional responsible use restrictions added. The TL;DR is that you can use and modify the model for any purpose – including commercial use. However, if you modify the weights (for example, by fine-tuning), you must open-source your modified weights under the same license terms.
## Training
Defog was trained on more than 20,000 human-curated questions. These questions were based on 10 different schemas. None of the schemas in the training data were included in our evaluation framework.
You can read more about our [training approach](https://defog.ai/blog/open-sourcing-sqlcoder2-7b/) and [evaluation framework](https://defog.ai/blog/open-sourcing-sqleval/).
## Results by question category
We classified each generated question into one of 5 categories. The table displays the percentage of questions answered correctly by each model, broken down by category.
| query_category | gpt-4 | sqlcoder2-15b | sqlcoder-7b | gpt-3.5 | claude-2 | claude-instant | gpt-3 |
|:-----------------|--------:|----------------:|--------------:|----------:|-----------:|-----------------:|--------:|
| date | 72 | 80 | 64 | 68 | 52 | 48 | 32 |
| group_by | 91.4 | 82.9 | 82.9 | 77.1 | 71.4 | 71.4 | 71.4 |
| order_by | 82.9 | 77.1 | 74.3 | 68.6 | 74.3 | 74.3 | 68.6 |
| ratio | 80 | 74.3 | 54.3 | 37.1 | 57.1 | 45.7 | 25.7 |
| join | 82.9 | 74.3 | 74.3 | 71.4 | 65.7 | 62.9 | 57.1 |
| where | 80 | 77.1 | 74.3 | 74.3 | 62.9 | 60 | 54.3 |
## Using SQLCoder
You can use SQLCoder via the `transformers` library by downloading our model weights from the Hugging Face repo. We have added sample code for [inference](https://github.com/defog-ai/sqlcoder/blob/main/inference.py) on a [sample database schema](https://github.com/defog-ai/sqlcoder/blob/main/metadata.sql).
```bash
python inference.py -q "Question about the sample database goes here"
# Sample question:
# Do we get more revenue from customers in New York compared to customers in San Francisco? Give me the total revenue for each city, and the difference between the two.
```
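If you prefer to call the model directly instead of the helper script, a minimal sketch with `transformers` might look like the following. Note that the prompt here is simplified and hypothetical; the real prompt template in `inference.py` also includes the database schema from `metadata.sql`.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("defog/sqlcoder2")
model = AutoModelForCausalLM.from_pretrained(
    "defog/sqlcoder2",
    torch_dtype=torch.bfloat16,  # matches the tested bfloat16 setup
    device_map="auto",           # requires the accelerate package
)

# Simplified, hypothetical prompt; see inference.py for the exact format used in training.
prompt = (
    "### Task\n"
    "Generate a SQL query to answer the question: "
    "How many customers are there in New York?\n"
    "### SQL\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```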
You can also use a demo on our website [here](https://defog.ai/sqlcoder-demo), or run SQLCoder in Colab [here](https://colab.research.google.com/drive/13BIKsqHnPOBcQ-ba2p77L5saiepTIwu0#scrollTo=ZpbVgVHMkJvC).
## Hardware Requirements
SQLCoder has been tested on an A100 40GB GPU with `bfloat16` weights. You can also load an 8-bit or 4-bit quantized version of the model on consumer hardware with 20GB or more of memory, such as an RTX 4090, an RTX 3090, or Apple M2 Pro, M2 Max, or M2 Ultra chips.
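A rough sketch of the quantized path (assuming the `bitsandbytes` package is installed; exact arguments may differ across `transformers` versions):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("defog/sqlcoder2")
model = AutoModelForCausalLM.from_pretrained(
    "defog/sqlcoder2",
    load_in_8bit=True,   # requires bitsandbytes; use load_in_4bit=True for the 4-bit version
    device_map="auto",
)
```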
## Todo
- [x] Open-source the v1 model weights
- [x] Train the model on more data, with higher data variance
- [ ] Tune the model further with Reward Modelling and RLHF
- [ ] Pretrain a model from scratch that specializes in SQL analysis | 4,490 | [
[
-0.02740478515625,
-0.07757568359375,
0.0150299072265625,
-0.005458831787109375,
-0.0165252685546875,
-0.0199127197265625,
-0.0078582763671875,
-0.026214599609375,
0.0026264190673828125,
0.0333251953125,
-0.041839599609375,
-0.0391845703125,
-0.0284881591796875,
0.005641937255859375,
-0.03277587890625,
0.08441162109375,
0.0267333984375,
0.0267181396484375,
-0.043731689453125,
-0.006610870361328125,
-0.03106689453125,
-0.057952880859375,
-0.043548583984375,
0.0016613006591796875,
0.0018472671508789062,
-0.0093994140625,
0.04736328125,
0.0242462158203125,
0.039947509765625,
0.019561767578125,
-0.0180816650390625,
0.0008192062377929688,
-0.03656005859375,
-0.01503753662109375,
0.0000362396240234375,
-0.049072265625,
-0.0283660888671875,
-0.00826263427734375,
0.024627685546875,
0.0285797119140625,
-0.047576904296875,
0.0289306640625,
-0.01154327392578125,
0.055938720703125,
-0.0360107421875,
0.0185394287109375,
-0.0535888671875,
-0.01641845703125,
0.0015535354614257812,
0.0274658203125,
-0.01322174072265625,
-0.043670654296875,
-0.00954437255859375,
-0.0236053466796875,
0.021209716796875,
0.01079559326171875,
0.089599609375,
0.04766845703125,
-0.019439697265625,
-0.00966644287109375,
-0.04669189453125,
0.05364990234375,
-0.06201171875,
0.0228424072265625,
0.02685546875,
0.0294952392578125,
0.005496978759765625,
-0.05413818359375,
-0.055511474609375,
-0.01531219482421875,
-0.0258941650390625,
-0.00262451171875,
-0.0341796875,
0.002285003662109375,
0.037445068359375,
0.050689697265625,
-0.054107666015625,
-0.01448822021484375,
-0.038848876953125,
-0.0112457275390625,
0.0677490234375,
0.00984954833984375,
0.01372528076171875,
-0.0108184814453125,
-0.0226898193359375,
-0.0098724365234375,
-0.03558349609375,
0.008941650390625,
0.0120391845703125,
0.01020050048828125,
-0.0085906982421875,
0.047119140625,
-0.0173797607421875,
0.08001708984375,
0.01157379150390625,
0.000018358230590820312,
0.01873779296875,
-0.05133056640625,
-0.038421630859375,
-0.0007014274597167969,
0.074462890625,
0.0251617431640625,
0.0218353271484375,
0.0079498291015625,
-0.0221099853515625,
0.000644683837890625,
0.02374267578125,
-0.0748291015625,
-0.022735595703125,
0.0390625,
-0.024871826171875,
-0.016632080078125,
0.00397491455078125,
-0.059051513671875,
-0.0242919921875,
-0.0058746337890625,
0.036376953125,
-0.036468505859375,
-0.0212249755859375,
0.00024259090423583984,
-0.01372528076171875,
0.041046142578125,
0.01412200927734375,
-0.0655517578125,
0.01558685302734375,
0.037506103515625,
0.0609130859375,
0.006656646728515625,
0.0030193328857421875,
-0.023223876953125,
-0.0182647705078125,
-0.0201873779296875,
0.04486083984375,
-0.02239990234375,
-0.01180267333984375,
-0.012908935546875,
-0.0030841827392578125,
-0.006778717041015625,
-0.051300048828125,
0.01070404052734375,
-0.045013427734375,
0.040191650390625,
-0.0168304443359375,
-0.0467529296875,
-0.020233154296875,
0.01349639892578125,
-0.053070068359375,
0.09112548828125,
0.03021240234375,
-0.03857421875,
0.040802001953125,
-0.039703369140625,
-0.01219940185546875,
-0.0005078315734863281,
-0.0110015869140625,
-0.049530029296875,
0.0104522705078125,
0.02813720703125,
0.0185699462890625,
-0.04107666015625,
0.0308837890625,
-0.01471710205078125,
-0.03240966796875,
0.01557159423828125,
-0.003326416015625,
0.09808349609375,
0.023040771484375,
-0.032318115234375,
0.00868988037109375,
-0.057525634765625,
0.01053619384765625,
0.025299072265625,
-0.0197296142578125,
-0.0038967132568359375,
-0.01229095458984375,
-0.01384735107421875,
0.0023212432861328125,
0.0164337158203125,
-0.034271240234375,
0.024017333984375,
-0.039276123046875,
0.0426025390625,
0.0418701171875,
0.0091705322265625,
0.020477294921875,
-0.0196685791015625,
0.038116455078125,
-0.004909515380859375,
0.01446533203125,
0.003482818603515625,
-0.043426513671875,
-0.05084228515625,
-0.0304107666015625,
0.028961181640625,
0.04833984375,
-0.03619384765625,
0.049713134765625,
-0.01027679443359375,
-0.0443115234375,
-0.05560302734375,
0.00262451171875,
0.0227813720703125,
0.02728271484375,
0.039703369140625,
-0.0218353271484375,
-0.035064697265625,
-0.05560302734375,
-0.03094482421875,
-0.0311737060546875,
-0.00970458984375,
0.0305633544921875,
0.05126953125,
0.01258087158203125,
0.08251953125,
-0.064453125,
-0.034454345703125,
-0.022735595703125,
-0.0179290771484375,
0.031402587890625,
0.0391845703125,
0.048797607421875,
-0.05743408203125,
-0.03558349609375,
-0.01299285888671875,
-0.061248779296875,
0.0187835693359375,
-0.010009765625,
-0.00981903076171875,
0.036895751953125,
0.0140380859375,
-0.0640869140625,
0.046051025390625,
0.0242919921875,
-0.0282440185546875,
0.048797607421875,
-0.0007071495056152344,
0.006740570068359375,
-0.07171630859375,
0.0079498291015625,
-0.004619598388671875,
-0.00699615478515625,
-0.038818359375,
0.01032257080078125,
-0.012481689453125,
-0.0185089111328125,
-0.024139404296875,
0.035400390625,
-0.0396728515625,
0.0029754638671875,
0.022979736328125,
-0.02557373046875,
0.0036029815673828125,
0.04833984375,
0.0016345977783203125,
0.0770263671875,
0.05084228515625,
-0.041168212890625,
0.04498291015625,
0.029052734375,
-0.050262451171875,
0.039794921875,
-0.07080078125,
0.0223236083984375,
-0.0103607177734375,
0.0031890869140625,
-0.0765380859375,
-0.0264892578125,
0.0234527587890625,
-0.051666259765625,
0.018096923828125,
-0.031646728515625,
-0.028594970703125,
-0.03302001953125,
-0.01012420654296875,
-0.0003752708435058594,
0.0704345703125,
-0.0277557373046875,
0.0440673828125,
0.02496337890625,
0.001865386962890625,
-0.043548583984375,
-0.07659912109375,
-0.01201629638671875,
-0.01953125,
-0.04669189453125,
0.01268768310546875,
-0.0034885406494140625,
-0.023468017578125,
0.0086212158203125,
-0.0087890625,
-0.034393310546875,
-0.01148223876953125,
0.006160736083984375,
0.03179931640625,
-0.0192413330078125,
-0.0019311904907226562,
-0.0135345458984375,
0.01531219482421875,
0.01352691650390625,
-0.024078369140625,
0.037139892578125,
-0.017913818359375,
-0.0158233642578125,
-0.033935546875,
0.001956939697265625,
0.026458740234375,
-0.045928955078125,
0.083740234375,
0.04779052734375,
-0.0307769775390625,
-0.0010509490966796875,
-0.05419921875,
-0.0242462158203125,
-0.031951904296875,
0.0289154052734375,
-0.0185394287109375,
-0.04833984375,
0.0296478271484375,
0.0192718505859375,
0.0055694580078125,
0.037689208984375,
0.04547119140625,
-0.00124359130859375,
0.0780029296875,
0.038604736328125,
-0.004261016845703125,
0.032470703125,
-0.038726806640625,
0.016387939453125,
-0.0281219482421875,
-0.0379638671875,
-0.046142578125,
-0.0011529922485351562,
-0.035186767578125,
-0.034454345703125,
0.02978515625,
0.019073486328125,
-0.024871826171875,
0.027618408203125,
-0.072021484375,
0.033416748046875,
0.04827880859375,
0.0108489990234375,
0.014923095703125,
0.025054931640625,
0.01471710205078125,
0.001720428466796875,
-0.049346923828125,
-0.0168304443359375,
0.07855224609375,
-0.005695343017578125,
0.06390380859375,
0.0112457275390625,
0.05389404296875,
0.0203399658203125,
0.01253509521484375,
-0.041107177734375,
0.041961669921875,
-0.00044083595275878906,
-0.08660888671875,
-0.0255126953125,
-0.05084228515625,
-0.06103515625,
0.01049041748046875,
-0.016204833984375,
-0.048248291015625,
0.019989013671875,
0.020416259765625,
-0.0276947021484375,
0.0296478271484375,
-0.06292724609375,
0.084716796875,
-0.03021240234375,
-0.03741455078125,
-0.0081939697265625,
-0.042266845703125,
0.026824951171875,
0.007007598876953125,
0.00534820556640625,
0.0013647079467773438,
-0.001895904541015625,
0.044219970703125,
-0.05419921875,
0.051544189453125,
-0.039031982421875,
0.0183868408203125,
0.056854248046875,
-0.00014328956604003906,
0.0232391357421875,
0.002819061279296875,
-0.0287017822265625,
0.038665771484375,
0.0177001953125,
-0.03363037109375,
-0.044403076171875,
0.0396728515625,
-0.05224609375,
-0.043182373046875,
-0.04168701171875,
-0.022003173828125,
0.00855255126953125,
0.0226287841796875,
0.0192718505859375,
0.0220947265625,
-0.0093536376953125,
0.01520538330078125,
0.043212890625,
-0.0308837890625,
0.01038360595703125,
0.0237274169921875,
-0.00856781005859375,
-0.03411865234375,
0.05828857421875,
0.004150390625,
0.0189361572265625,
0.0217437744140625,
0.006153106689453125,
-0.042816162109375,
-0.0369873046875,
-0.048858642578125,
0.0251312255859375,
-0.0360107421875,
-0.0265045166015625,
-0.039947509765625,
-0.013763427734375,
-0.0498046875,
0.0145721435546875,
-0.0140380859375,
-0.02471923828125,
-0.0166015625,
-0.004650115966796875,
0.048492431640625,
0.05950927734375,
0.0183258056640625,
0.0162506103515625,
-0.034759521484375,
0.00984954833984375,
0.0242919921875,
0.033416748046875,
-0.018951416015625,
-0.032470703125,
-0.0074005126953125,
0.0120697021484375,
-0.01184844970703125,
-0.07080078125,
0.03411865234375,
0.026275634765625,
0.040252685546875,
0.00751495361328125,
0.01235198974609375,
0.058868408203125,
-0.01020050048828125,
0.088623046875,
0.00555419921875,
-0.058990478515625,
0.048858642578125,
0.00048470497131347656,
0.01409149169921875,
0.05316162109375,
0.049530029296875,
-0.005832672119140625,
0.0012254714965820312,
-0.039642333984375,
-0.052642822265625,
0.06768798828125,
0.0166015625,
0.005550384521484375,
0.007720947265625,
0.0372314453125,
-0.00969696044921875,
0.04107666015625,
-0.06500244140625,
-0.025543212890625,
-0.0218658447265625,
-0.00980377197265625,
0.0027408599853515625,
-0.01177978515625,
-0.00640106201171875,
-0.040496826171875,
0.043487548828125,
-0.003833770751953125,
0.036376953125,
0.0259246826171875,
-0.0180816650390625,
0.01462554931640625,
0.00852203369140625,
0.01544189453125,
0.05047607421875,
-0.038604736328125,
-0.021942138671875,
0.02734375,
-0.032257080078125,
-0.004238128662109375,
0.023834228515625,
-0.028350830078125,
-0.003910064697265625,
0.01861572265625,
0.0682373046875,
-0.026031494140625,
-0.0498046875,
0.042816162109375,
-0.01023101806640625,
-0.017852783203125,
-0.028472900390625,
0.0118865966796875,
0.00489044189453125,
-0.0015268325805664062,
0.0181427001953125,
-0.0056304931640625,
0.00905609130859375,
-0.04058837890625,
0.01000213623046875,
0.05084228515625,
-0.036041259765625,
-0.005382537841796875,
0.07586669921875,
0.0017480850219726562,
-0.0160675048828125,
0.073974609375,
-0.0162811279296875,
-0.0311279296875,
0.05120849609375,
0.0199432373046875,
0.050262451171875,
-0.012298583984375,
-0.0035247802734375,
0.06414794921875,
0.03521728515625,
0.00521087646484375,
0.0302886962890625,
-0.00350189208984375,
-0.029754638671875,
-0.0289306640625,
-0.0728759765625,
-0.0027217864990234375,
0.0110015869140625,
-0.057647705078125,
0.0279083251953125,
-0.029083251953125,
-0.0217437744140625,
0.0006532669067382812,
0.00200653076171875,
-0.050018310546875,
0.0211334228515625,
-0.0018863677978515625,
0.07183837890625,
-0.046661376953125,
0.05108642578125,
0.06317138671875,
-0.06768798828125,
-0.06317138671875,
-0.0189056396484375,
0.00019752979278564453,
-0.06451416015625,
0.0233001708984375,
0.01056671142578125,
0.0178985595703125,
0.006587982177734375,
-0.05450439453125,
-0.06134033203125,
0.087890625,
0.04071044921875,
-0.03692626953125,
-0.019012451171875,
0.0157623291015625,
0.045074462890625,
-0.00998687744140625,
0.04644775390625,
0.052001953125,
0.021697998046875,
0.003692626953125,
-0.060333251953125,
0.0187530517578125,
-0.00775146484375,
0.0011272430419921875,
0.01534271240234375,
-0.050323486328125,
0.0654296875,
-0.037261962890625,
-0.01387786865234375,
0.004024505615234375,
0.04156494140625,
0.0123291015625,
0.0254364013671875,
0.044281005859375,
0.051513671875,
0.057769775390625,
-0.0269927978515625,
0.10577392578125,
-0.0301666259765625,
0.043212890625,
0.06463623046875,
-0.003154754638671875,
0.0474853515625,
0.0166473388671875,
-0.058868408203125,
0.056243896484375,
0.0478515625,
-0.023834228515625,
0.0264434814453125,
0.025543212890625,
-0.00569915771484375,
-0.0242462158203125,
0.00481414794921875,
-0.040679931640625,
0.022430419921875,
0.00565338134765625,
-0.01336669921875,
0.00035572052001953125,
-0.026947021484375,
0.0007042884826660156,
-0.00411224365234375,
-0.005550384521484375,
0.0772705078125,
-0.0025310516357421875,
-0.052764892578125,
0.089111328125,
-0.0185699462890625,
0.041748046875,
-0.06890869140625,
-0.00782012939453125,
-0.031463623046875,
0.01265716552734375,
-0.035430908203125,
-0.0494384765625,
0.0345458984375,
0.012664794921875,
-0.0026912689208984375,
0.0016031265258789062,
0.0489501953125,
-0.036956787109375,
-0.040863037109375,
0.005313873291015625,
0.031951904296875,
0.0279083251953125,
-0.02911376953125,
-0.0751953125,
-0.001617431640625,
0.02130126953125,
-0.0251312255859375,
0.0221405029296875,
0.010986328125,
0.0016498565673828125,
0.0509033203125,
0.056976318359375,
0.0025882720947265625,
0.0057525634765625,
-0.001941680908203125,
0.068359375,
-0.05047607421875,
-0.0269012451171875,
-0.07159423828125,
0.050384521484375,
-0.01849365234375,
-0.021026611328125,
0.060211181640625,
0.06011962890625,
0.06390380859375,
-0.01044464111328125,
0.04052734375,
-0.04168701171875,
0.0195465087890625,
-0.03204345703125,
0.056427001953125,
-0.07061767578125,
0.0246429443359375,
-0.003276824951171875,
-0.06695556640625,
-0.0145721435546875,
0.042205810546875,
-0.01849365234375,
0.026336669921875,
0.034637451171875,
0.08453369140625,
0.0145263671875,
-0.007785797119140625,
0.004108428955078125,
0.02734375,
0.01276397705078125,
0.037078857421875,
0.045867919921875,
-0.0292816162109375,
0.0721435546875,
-0.0333251953125,
-0.0211334228515625,
-0.0009617805480957031,
-0.0419921875,
-0.0665283203125,
-0.040191650390625,
-0.032135009765625,
-0.049102783203125,
0.00799560546875,
0.05316162109375,
0.07183837890625,
-0.0665283203125,
-0.0222930908203125,
-0.0138092041015625,
0.008453369140625,
-0.034515380859375,
-0.023956298828125,
0.0209197998046875,
-0.01256561279296875,
-0.05419921875,
0.003757476806640625,
-0.01033782958984375,
-0.01140594482421875,
-0.020477294921875,
0.00811004638671875,
-0.02606201171875,
0.0061187744140625,
0.032928466796875,
0.0025653839111328125,
-0.0511474609375,
0.0017156600952148438,
0.00919342041015625,
-0.003917694091796875,
0.0098724365234375,
0.03326416015625,
-0.06549072265625,
0.032257080078125,
0.0396728515625,
0.044281005859375,
0.050628662109375,
0.02313232421875,
0.027862548828125,
-0.04803466796875,
-0.0002048015594482422,
0.01416015625,
0.01212310791015625,
0.0288238525390625,
-0.050201416015625,
0.06256103515625,
0.029937744140625,
-0.04595947265625,
-0.07708740234375,
-0.003643035888671875,
-0.07305908203125,
-0.0233917236328125,
0.08270263671875,
-0.0036029815673828125,
-0.03265380859375,
0.01139068603515625,
-0.01593017578125,
0.01247406005859375,
-0.0335693359375,
0.037200927734375,
0.0439453125,
-0.0262298583984375,
-0.0072021484375,
-0.0426025390625,
0.050445556640625,
0.035247802734375,
-0.060638427734375,
0.0023403167724609375,
0.049041748046875,
0.038604736328125,
0.0027065277099609375,
0.049560546875,
-0.0159912109375,
0.012969970703125,
0.01508331298828125,
0.0251007080078125,
-0.0234222412109375,
0.00533294677734375,
-0.05047607421875,
0.001705169677734375,
-0.00026345252990722656,
-0.0135955810546875
]
] |
hegelty/KcBERT-Base-finetuned-hate | 2023-09-18T02:30:55.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"text-classification",
"ko",
"license:bsd",
"endpoints_compatible",
"region:us"
] | text-classification | hegelty | null | null | hegelty/KcBERT-Base-finetuned-hate | 1 | 11,838 | transformers | 2023-09-07T04:23:36 | ---
license: bsd
language:
- ko
library_name: transformers
---
# Hate Speech Classification (혐오표현 분류)
tag 0: hate (혐오)
tag 1: normal (일반)
# Source Code
https://github.com/hegelty/hate-classifier
# Dataset
https://github.com/smilegate-ai/korean_unsmile_dataset
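# Usage example
A minimal inference sketch using the `transformers` pipeline API; it assumes the repo's default classification head follows the tag table above (index 0 = hate, index 1 = normal), and the input sentence is a placeholder.
```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned classifier from the Hub.
# Assumes the default config maps index 0 -> hate, 1 -> normal
# (per the tag table above); the input sentence is a placeholder.
classifier = pipeline(
    "text-classification",
    model="hegelty/KcBERT-Base-finetuned-hate",
)

result = classifier("예시 문장입니다.")  # "This is an example sentence."
print(result)  # e.g. [{'label': 'LABEL_0', 'score': 0.98}]
```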
| 207 | [
[
-0.0233612060546875,
-0.0024261474609375,
0.0113983154296875,
0.02008056640625,
-0.0205841064453125,
0.0188446044921875,
0.01203155517578125,
-0.0009045600891113281,
0.0312042236328125,
0.0079193115234375,
-0.025787353515625,
-0.0697021484375,
-0.040618896484375,
-0.01309967041015625,
0.006389617919921875,
0.05810546875,
0.014007568359375,
0.0293426513671875,
0.0189666748046875,
-0.00301361083984375,
-0.0173187255859375,
-0.020294189453125,
-0.060211181640625,
-0.0489501953125,
0.03759765625,
0.03350830078125,
0.0389404296875,
0.031402587890625,
0.0426025390625,
0.01453399658203125,
0.0189361572265625,
-0.031890869140625,
-0.0198822021484375,
-0.01043701171875,
-0.0165557861328125,
-0.037872314453125,
-0.040191650390625,
0.002288818359375,
0.0249481201171875,
0.021759033203125,
0.00485992431640625,
0.03948974609375,
-0.0162200927734375,
0.04827880859375,
-0.01548004150390625,
0.024658203125,
-0.04315185546875,
0.006908416748046875,
-0.024078369140625,
0.0028705596923828125,
-0.009735107421875,
-0.04730224609375,
-0.0237274169921875,
-0.048919677734375,
0.013397216796875,
0.007312774658203125,
0.09832763671875,
0.026275634765625,
-0.04058837890625,
0.0006623268127441406,
-0.0210418701171875,
0.056854248046875,
-0.026397705078125,
0.040924072265625,
0.04449462890625,
0.024383544921875,
-0.0015554428100585938,
-0.0275421142578125,
-0.026092529296875,
0.0225982666015625,
0.0005211830139160156,
0.0228271484375,
-0.0048675537109375,
-0.0247955322265625,
0.011016845703125,
0.0195465087890625,
-0.0450439453125,
0.027435302734375,
-0.0296630859375,
-0.032196044921875,
0.06292724609375,
-0.0202178955078125,
0.0487060546875,
-0.040496826171875,
-0.0222930908203125,
0.01183319091796875,
-0.027679443359375,
-0.01303863525390625,
0.0305938720703125,
0.030548095703125,
-0.00595855712890625,
0.04693603515625,
-0.0233306884765625,
0.0153656005859375,
0.0010347366333007812,
-0.0322265625,
0.07135009765625,
0.003940582275390625,
-0.0287322998046875,
0.016143798828125,
0.04827880859375,
0.0579833984375,
0.033355712890625,
-0.004047393798828125,
-0.008636474609375,
0.050933837890625,
0.00882720947265625,
-0.052490234375,
-0.034393310546875,
0.0182037353515625,
-0.053009033203125,
-0.030364990234375,
0.01171112060546875,
-0.06787109375,
-0.037689208984375,
0.006072998046875,
0.021881103515625,
-0.0262451171875,
-0.03839111328125,
0.007755279541015625,
-0.032684326171875,
0.0109405517578125,
0.0140838623046875,
-0.0279998779296875,
0.00484466552734375,
0.014892578125,
0.0168609619140625,
0.02203369140625,
-0.00653839111328125,
0.0157012939453125,
0.0008211135864257812,
-0.0058746337890625,
0.039154052734375,
-0.0008139610290527344,
-0.048614501953125,
-0.0267486572265625,
0.0025501251220703125,
-0.01493072509765625,
-0.0418701171875,
0.063720703125,
-0.037261962890625,
-0.0068817138671875,
-0.00061798095703125,
-0.044219970703125,
-0.0310211181640625,
0.004634857177734375,
-0.0474853515625,
0.076171875,
0.0304107666015625,
-0.05072021484375,
0.03448486328125,
-0.06671142578125,
-0.0305938720703125,
0.033660888671875,
-0.0038661956787109375,
-0.0294036865234375,
-0.012481689453125,
-0.024200439453125,
0.037689208984375,
-0.00102996826171875,
-0.004833221435546875,
-0.0120849609375,
-0.01114654541015625,
0.01070404052734375,
0.01233673095703125,
0.0789794921875,
0.034759521484375,
-0.0158233642578125,
-0.0003910064697265625,
-0.08416748046875,
0.00502777099609375,
0.06402587890625,
-0.02081298828125,
-0.051513671875,
-0.0048675537109375,
0.00258636474609375,
0.023468017578125,
0.023345947265625,
-0.07281494140625,
0.018218994140625,
0.003269195556640625,
0.0089263916015625,
0.06805419921875,
-0.001171112060546875,
0.02728271484375,
-0.0274810791015625,
0.040679931640625,
0.0070037841796875,
-0.00605010986328125,
0.0157012939453125,
-0.033966064453125,
-0.03662109375,
-0.0307159423828125,
0.0057830810546875,
0.058929443359375,
-0.05975341796875,
0.0279083251953125,
-0.0030727386474609375,
-0.06951904296875,
-0.0545654296875,
-0.0079498291015625,
0.024658203125,
0.0330810546875,
0.03521728515625,
-0.0171661376953125,
-0.08160400390625,
-0.0489501953125,
-0.0189971923828125,
-0.03643798828125,
-0.0026702880859375,
0.032745361328125,
0.05560302734375,
-0.0386962890625,
0.0413818359375,
-0.058319091796875,
-0.026641845703125,
-0.003314971923828125,
-0.0003833770751953125,
0.0335693359375,
0.03533935546875,
0.040252685546875,
-0.07183837890625,
-0.045501708984375,
-0.00650787353515625,
-0.052581787109375,
-0.01678466796875,
0.007778167724609375,
-0.004970550537109375,
0.018707275390625,
0.022491455078125,
-0.02227783203125,
0.0540771484375,
0.0108642578125,
-0.04913330078125,
0.050201416015625,
0.04486083984375,
0.0162353515625,
-0.088134765625,
-0.00531005859375,
0.032989501953125,
-0.003269195556640625,
-0.039215087890625,
0.01494598388671875,
-0.007781982421875,
0.0278167724609375,
-0.026031494140625,
0.045623779296875,
-0.0223388671875,
0.0192718505859375,
0.0186004638671875,
-0.0099945068359375,
-0.0145263671875,
0.0291595458984375,
-0.0032138824462890625,
0.0248260498046875,
0.03656005859375,
-0.035003662109375,
0.05609130859375,
0.04583740234375,
-0.01366424560546875,
0.0374755859375,
-0.04156494140625,
-0.02642822265625,
-0.016387939453125,
0.01035308837890625,
-0.08544921875,
-0.042999267578125,
0.044921875,
-0.0482177734375,
0.02239990234375,
-0.03912353515625,
-0.030364990234375,
-0.051116943359375,
-0.04486083984375,
0.03729248046875,
0.045379638671875,
-0.036376953125,
-0.0194091796875,
0.038299560546875,
0.01178741455078125,
-0.056121826171875,
-0.05718994140625,
-0.01239776611328125,
-0.030670166015625,
-0.0008702278137207031,
0.02728271484375,
-0.008636474609375,
-0.0139312744140625,
-0.006076812744140625,
-0.0170440673828125,
-0.01381683349609375,
0.0018672943115234375,
0.049224853515625,
0.0377197265625,
-0.0159759521484375,
-0.0006499290466308594,
-0.00943756103515625,
-0.01403045654296875,
0.0081634521484375,
-0.0099639892578125,
0.051422119140625,
0.001068115234375,
-0.012298583984375,
-0.02978515625,
-0.00949859619140625,
0.013275146484375,
0.017120361328125,
0.036346435546875,
0.06622314453125,
-0.034881591796875,
-0.020111083984375,
-0.00994110107421875,
0.026458740234375,
-0.031646728515625,
0.0282745361328125,
-0.046905517578125,
-0.072265625,
0.0300445556640625,
-0.01511383056640625,
-0.041534423828125,
0.052886962890625,
0.0284576416015625,
0.013519287109375,
0.06268310546875,
0.0279693603515625,
-0.0433349609375,
-0.0018768310546875,
-0.01091766357421875,
0.026031494140625,
-0.04644775390625,
-0.029632568359375,
-0.044158935546875,
-0.003292083740234375,
-0.08056640625,
-0.0124053955078125,
-0.00029587745666503906,
0.0215911865234375,
-0.0408935546875,
0.0582275390625,
-0.057037353515625,
0.0304107666015625,
0.02349853515625,
0.0092620849609375,
-0.0035610198974609375,
-0.0094146728515625,
-0.0090789794921875,
-0.01520538330078125,
-0.034393310546875,
-0.0228118896484375,
0.0555419921875,
0.0574951171875,
0.08282470703125,
0.0190887451171875,
0.07061767578125,
0.0117950439453125,
0.0106048583984375,
-0.074951171875,
0.047515869140625,
0.0011348724365234375,
-0.043304443359375,
-0.028564453125,
-0.0232391357421875,
-0.052459716796875,
-0.01641845703125,
0.030517578125,
-0.059722900390625,
0.01332855224609375,
-0.0032596588134765625,
-0.0015401840209960938,
0.042877197265625,
-0.040618896484375,
0.07855224609375,
0.0128326416015625,
0.003879547119140625,
0.022216796875,
-0.0443115234375,
0.0186767578125,
0.0043182373046875,
0.0068359375,
-0.038665771484375,
-0.005558013916015625,
0.06134033203125,
-0.005489349365234375,
0.06146240234375,
-0.0179290771484375,
-0.01511383056640625,
0.0149383544921875,
0.00830078125,
-0.0103912353515625,
-0.0001150369644165039,
-0.002223968505859375,
0.0278167724609375,
-0.0255584716796875,
-0.04730224609375,
0.019927978515625,
0.055816650390625,
-0.054168701171875,
0.0014486312866210938,
-0.04754638671875,
-0.006134033203125,
0.01087188720703125,
0.03131103515625,
0.0550537109375,
0.005886077880859375,
-0.02447509765625,
-0.0234832763671875,
0.04071044921875,
-0.0247802734375,
0.03985595703125,
0.031951904296875,
-0.048431396484375,
-0.02691650390625,
0.1014404296875,
0.0253448486328125,
-0.0127105712890625,
0.019927978515625,
-0.009246826171875,
-0.0192108154296875,
-0.0206298828125,
-0.025909423828125,
0.027587890625,
-0.050506591796875,
-0.038330078125,
-0.035400390625,
-0.044525146484375,
-0.0469970703125,
-0.01702880859375,
-0.03155517578125,
-0.037139892578125,
-0.040802001953125,
-0.029144287109375,
0.00516510009765625,
0.06268310546875,
-0.0172882080078125,
0.0312347412109375,
-0.043670654296875,
0.024932861328125,
0.004528045654296875,
0.0555419921875,
-0.0173797607421875,
-0.039947509765625,
-0.0264434814453125,
-0.0266571044921875,
-0.01593017578125,
-0.056640625,
0.047454833984375,
-0.0130462646484375,
0.0419921875,
0.0311126708984375,
0.008209228515625,
0.0112457275390625,
-0.03131103515625,
0.0548095703125,
0.022705078125,
-0.056854248046875,
0.044952392578125,
-0.032012939453125,
0.0268402099609375,
0.027374267578125,
0.09564208984375,
-0.041900634765625,
0.00588226318359375,
-0.050933837890625,
-0.056427001953125,
0.039093017578125,
-0.0022525787353515625,
-0.00804901123046875,
-0.005611419677734375,
-0.005558013916015625,
0.0112762451171875,
0.01560211181640625,
-0.067626953125,
-0.061676025390625,
-0.026153564453125,
-0.04156494140625,
0.0262298583984375,
-0.048065185546875,
0.007114410400390625,
-0.0212554931640625,
0.08978271484375,
0.0037479400634765625,
0.03076171875,
0.01483917236328125,
-0.00836944580078125,
-0.0111083984375,
0.0178070068359375,
0.0259857177734375,
0.02166748046875,
-0.0291595458984375,
0.01555633544921875,
-0.0226593017578125,
-0.049163818359375,
0.0234832763671875,
0.0073699951171875,
-0.015838623046875,
0.01666259765625,
0.01067352294921875,
0.050506591796875,
-0.025177001953125,
-0.00499725341796875,
0.0162506103515625,
-0.004596710205078125,
-0.0189056396484375,
-0.037872314453125,
0.0113677978515625,
0.005741119384765625,
-0.0017852783203125,
0.032135009765625,
0.01470184326171875,
0.0194549560546875,
-0.003143310546875,
0.040130615234375,
0.01128387451171875,
-0.047637939453125,
-0.0305023193359375,
0.0516357421875,
0.0083160400390625,
-0.02581787109375,
0.0018224716186523438,
-0.06378173828125,
-0.0704345703125,
0.056304931640625,
0.0546875,
0.053436279296875,
-0.06805419921875,
0.054962158203125,
0.06353759765625,
0.0277252197265625,
0.0029888153076171875,
0.044769287109375,
0.022857666015625,
-0.07513427734375,
-0.00421905517578125,
-0.06512451171875,
0.0088958740234375,
0.03277587890625,
-0.044769287109375,
0.0146484375,
-0.05035400390625,
-0.0033206939697265625,
0.037200927734375,
0.0013294219970703125,
-0.02471923828125,
0.04376220703125,
0.025421142578125,
0.07708740234375,
-0.05535888671875,
0.060577392578125,
0.0718994140625,
-0.02838134765625,
-0.04315185546875,
-0.0074462890625,
0.0274810791015625,
-0.06768798828125,
0.041534423828125,
0.0248565673828125,
0.0023250579833984375,
0.0096282958984375,
-0.0704345703125,
-0.0626220703125,
0.06427001953125,
-0.0362548828125,
-0.0352783203125,
0.042724609375,
-0.00937652587890625,
0.035858154296875,
-0.0250701904296875,
-0.01096343994140625,
0.044219970703125,
0.03875732421875,
-0.003307342529296875,
-0.06280517578125,
-0.008056640625,
-0.05218505859375,
-0.00750732421875,
0.031951904296875,
-0.0767822265625,
0.054962158203125,
-0.00952911376953125,
-0.007354736328125,
-0.0175933837890625,
0.03131103515625,
-0.01499176025390625,
0.052490234375,
0.035308837890625,
0.07061767578125,
0.0274505615234375,
-0.01155853271484375,
0.0196075439453125,
0.021240234375,
0.049346923828125,
0.08843994140625,
-0.0253143310546875,
0.0015592575073242188,
0.0101470947265625,
-0.04022216796875,
0.0355224609375,
0.0555419921875,
-0.04217529296875,
0.035980224609375,
0.0253143310546875,
-0.0253143310546875,
0.019378662109375,
-0.0447998046875,
-0.0177459716796875,
0.04217529296875,
0.0238800048828125,
-0.01395416259765625,
-0.00962066650390625,
-0.00070953369140625,
0.0640869140625,
0.00803375244140625,
-0.04693603515625,
0.0469970703125,
-0.006465911865234375,
-0.0015583038330078125,
0.0269775390625,
0.018798828125,
0.050506591796875,
-0.019500732421875,
-0.00632476806640625,
0.004505157470703125,
0.01055908203125,
-0.035980224609375,
-0.06573486328125,
0.00945281982421875,
-0.006206512451171875,
-0.0194091796875,
0.00629425048828125,
0.07928466796875,
-0.033050537109375,
-0.036651611328125,
0.038482666015625,
0.023162841796875,
0.0171966552734375,
0.0069427490234375,
-0.07586669921875,
-0.00400543212890625,
0.032806396484375,
-0.027679443359375,
0.025909423828125,
0.01194000244140625,
-0.01337432861328125,
0.053619384765625,
0.02581787109375,
-0.00582122802734375,
0.00010311603546142578,
0.01617431640625,
0.0648193359375,
-0.056427001953125,
-0.0277557373046875,
-0.06378173828125,
0.028228759765625,
-0.016082763671875,
-0.0252532958984375,
0.0775146484375,
0.060455322265625,
0.09326171875,
-0.006664276123046875,
0.074951171875,
-0.03729248046875,
0.067626953125,
0.01238250732421875,
0.06072998046875,
-0.04241943359375,
-0.019805908203125,
-0.036773681640625,
-0.0330810546875,
-0.03656005859375,
0.033111572265625,
0.007335662841796875,
0.01328277587890625,
0.0469970703125,
0.0540771484375,
0.00341033935546875,
-0.00669097900390625,
0.0292510986328125,
0.039886474609375,
-0.00756072998046875,
0.029449462890625,
0.022552490234375,
-0.049591064453125,
0.004058837890625,
-0.04486083984375,
-0.0107879638671875,
-0.00789642333984375,
-0.032958984375,
-0.0623779296875,
-0.024139404296875,
-0.04266357421875,
-0.0711669921875,
-0.02130126953125,
0.08343505859375,
0.0239410400390625,
-0.08544921875,
-0.020355224609375,
0.0168609619140625,
0.040374755859375,
0.022613525390625,
-0.0237274169921875,
0.02288818359375,
-0.002521514892578125,
-0.03704833984375,
0.01293182373046875,
0.0083160400390625,
0.01226806640625,
0.00939178466796875,
-0.0156402587890625,
-0.0447998046875,
-0.001941680908203125,
0.03155517578125,
0.02099609375,
-0.0141143798828125,
-0.01165771484375,
-0.01107025146484375,
-0.04412841796875,
0.00809478759765625,
0.025665283203125,
-0.0265655517578125,
-0.006000518798828125,
0.0311126708984375,
0.010955810546875,
0.0014190673828125,
0.01399993896484375,
-0.012176513671875,
-0.061767578125,
0.0200653076171875,
-0.0018978118896484375,
0.01009368896484375,
0.0245819091796875,
-0.038909912109375,
0.09454345703125,
0.06549072265625,
-0.058807373046875,
-0.064697265625,
0.01140594482421875,
-0.066650390625,
-0.0280914306640625,
0.059539794921875,
-0.0133819580078125,
-0.03814697265625,
-0.011077880859375,
-0.0308837890625,
0.04278564453125,
-0.051513671875,
0.024810791015625,
0.03863525390625,
0.007640838623046875,
-0.0001671314239501953,
-0.02789306640625,
0.052581787109375,
0.007678985595703125,
-0.0418701171875,
-0.0396728515625,
0.01849365234375,
0.025726318359375,
0.0262451171875,
0.0565185546875,
0.01020050048828125,
0.01422882080078125,
0.0204925537109375,
0.05718994140625,
0.0125885009765625,
-0.0173492431640625,
-0.027679443359375,
-0.007289886474609375,
-0.021942138671875,
-0.042388916015625
]
] |
OpenBuddy/openbuddy-llama2-13b-v11.1-bf16 | 2023-09-01T16:15:41.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"zh",
"en",
"fr",
"de",
"ja",
"ko",
"it",
"ru",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | OpenBuddy | null | null | OpenBuddy/openbuddy-llama2-13b-v11.1-bf16 | 17 | 11,810 | transformers | 2023-08-24T08:17:42 | ---
language:
- zh
- en
- fr
- de
- ja
- ko
- it
- ru
pipeline_tag: text-generation
inference: false
library_name: transformers
---
# OpenBuddy - Open Multilingual Chatbot
GitHub and Usage Guide: [https://github.com/OpenBuddy/OpenBuddy](https://github.com/OpenBuddy/OpenBuddy)
Website and Demo: [https://openbuddy.ai](https://openbuddy.ai)

# Copyright Notice
This model is built upon Meta's LLaMA series of models and is subject to Meta's licensing agreement.
This model is intended for use only by individuals who have obtained approval from Meta and are eligible to download LLaMA.
If you have not obtained approval from Meta, you must visit the https://ai.meta.com/llama/ page, read and agree to the model's licensing agreement, submit an application, and wait for approval from Meta before downloading the model from this page.
## Disclaimer
All OpenBuddy models have inherent limitations and may potentially produce outputs that are erroneous, harmful, offensive, or otherwise undesirable. Users should not use these models in critical or high-stakes situations that may lead to personal injury, property damage, or significant losses. Examples of such scenarios include, but are not limited to, the medical field, controlling software and hardware systems that may cause harm, and making important financial or legal decisions.
OpenBuddy is provided "as-is" without any warranty of any kind, either express or implied, including, but not limited to, the implied warranties of merchantability, fitness for a particular purpose, and non-infringement. In no event shall the authors, contributors, or copyright holders be liable for any claim, damages, or other liabilities, whether in an action of contract, tort, or otherwise, arising from, out of, or in connection with the software or the use or other dealings in the software.
By using OpenBuddy, you agree to these terms and conditions, and acknowledge that you understand the potential risks associated with its use. You also agree to indemnify and hold harmless the authors, contributors, and copyright holders from any claims, damages, or liabilities arising from your use of OpenBuddy.
## Disclaimer (免责声明)
All OpenBuddy models have inherent limitations and may produce erroneous, harmful, offensive, or otherwise undesirable outputs. Users should act with caution in critical or high-risk scenarios and should not use these models there, so as to avoid personal injury, property damage, or significant losses. Examples of such scenarios include, but are not limited to, the medical field, the control of software and hardware systems that may cause harm, and making important financial or legal decisions.
OpenBuddy is provided "as-is" without any express or implied warranty of any kind, including but not limited to the implied warranties of merchantability, fitness for a particular purpose, and non-infringement. In no event shall the authors, contributors, or copyright holders be liable for any claims, damages, or other liabilities (whether in an action of contract, tort, or otherwise) arising from the software, or from the use of or other dealings in the software.
By using OpenBuddy, you agree to these terms and conditions and acknowledge that you understand the potential risks associated with its use. You also agree to indemnify and hold harmless the authors, contributors, and copyright holders from any claims, damages, or liabilities arising from your use of OpenBuddy.
[
-0.0264892578125,
-0.0709228515625,
0.0171051025390625,
0.036163330078125,
-0.0265045166015625,
-0.005359649658203125,
-0.0138702392578125,
-0.03460693359375,
0.0175018310546875,
0.033050537109375,
-0.0257568359375,
-0.048095703125,
-0.035675048828125,
-0.00897979736328125,
-0.0017080307006835938,
0.0770263671875,
-0.0181121826171875,
-0.01528167724609375,
-0.00441741943359375,
-0.01128387451171875,
-0.0443115234375,
-0.0168914794921875,
-0.03131103515625,
-0.00669097900390625,
0.006244659423828125,
0.035491943359375,
0.060943603515625,
0.0038700103759765625,
0.046417236328125,
0.028106689453125,
0.002960205078125,
-0.002086639404296875,
-0.0401611328125,
0.01099395751953125,
0.006008148193359375,
-0.034515380859375,
-0.054046630859375,
-0.01081085205078125,
0.012603759765625,
0.0251312255859375,
-0.025909423828125,
0.033233642578125,
0.0017938613891601562,
0.04901123046875,
-0.05828857421875,
0.0304718017578125,
-0.0155181884765625,
0.0032787322998046875,
-0.0098724365234375,
-0.0254364013671875,
-0.00870513916015625,
-0.0557861328125,
-0.01428985595703125,
-0.0487060546875,
-0.01248931884765625,
0.005340576171875,
0.07879638671875,
0.004093170166015625,
-0.0289154052734375,
-0.0138702392578125,
-0.053863525390625,
0.04229736328125,
-0.062744140625,
0.022918701171875,
0.027008056640625,
0.054046630859375,
-0.0192718505859375,
-0.05322265625,
-0.040679931640625,
-0.0081329345703125,
-0.003505706787109375,
0.0279541015625,
-0.027618408203125,
-0.007694244384765625,
0.01541900634765625,
0.039398193359375,
-0.05401611328125,
-0.0021038055419921875,
-0.0460205078125,
-0.0012006759643554688,
0.035675048828125,
0.014739990234375,
0.041412353515625,
-0.0219879150390625,
-0.039581298828125,
-0.0007157325744628906,
-0.035003662109375,
0.030242919921875,
0.030792236328125,
0.01428985595703125,
-0.051422119140625,
0.05926513671875,
-0.0244293212890625,
0.029144287109375,
-0.003917694091796875,
-0.038909912109375,
0.044647216796875,
-0.032440185546875,
-0.02825927734375,
-0.002201080322265625,
0.0816650390625,
0.048553466796875,
0.021148681640625,
0.0093841552734375,
-0.0099639892578125,
-0.010833740234375,
0.0045013427734375,
-0.058624267578125,
-0.0155792236328125,
0.049285888671875,
-0.050994873046875,
-0.0237884521484375,
0.002101898193359375,
-0.07000732421875,
-0.01116180419921875,
-0.0021839141845703125,
0.02301025390625,
-0.0386962890625,
-0.04571533203125,
0.0179901123046875,
0.0006933212280273438,
0.00018155574798583984,
0.016021728515625,
-0.041015625,
0.0176239013671875,
0.016571044921875,
0.0804443359375,
0.0238494873046875,
-0.0169219970703125,
-0.006366729736328125,
0.024932861328125,
-0.0189208984375,
0.04522705078125,
-0.01445770263671875,
-0.0433349609375,
0.00421905517578125,
0.01032257080078125,
0.0019893646240234375,
-0.016693115234375,
0.0257110595703125,
-0.01528167724609375,
0.042388916015625,
0.022979736328125,
-0.00644683837890625,
-0.033355712890625,
0.003894805908203125,
-0.039031982421875,
0.0704345703125,
0.006938934326171875,
-0.06732177734375,
0.01184844970703125,
-0.07379150390625,
-0.028350830078125,
-0.0010671615600585938,
-0.01215362548828125,
-0.03326416015625,
-0.00547027587890625,
0.0160064697265625,
0.032623291015625,
-0.0165252685546875,
0.0169830322265625,
-0.036712646484375,
-0.0165252685546875,
0.018890380859375,
-0.024658203125,
0.10064697265625,
0.0196075439453125,
-0.01031494140625,
0.0343017578125,
-0.051025390625,
0.00360870361328125,
0.0401611328125,
-0.033050537109375,
-0.0257568359375,
-0.01311492919921875,
0.004367828369140625,
0.0145416259765625,
0.0308837890625,
-0.0455322265625,
0.0273284912109375,
-0.036834716796875,
0.038909912109375,
0.056732177734375,
0.006084442138671875,
0.0258331298828125,
-0.036041259765625,
0.05804443359375,
0.006298065185546875,
0.03564453125,
-0.027130126953125,
-0.06024169921875,
-0.03948974609375,
-0.046112060546875,
0.003765106201171875,
0.06298828125,
-0.037841796875,
0.048309326171875,
-0.0177154541015625,
-0.048919677734375,
-0.053009033203125,
0.001804351806640625,
0.0251922607421875,
0.0214080810546875,
0.0272064208984375,
-0.01263427734375,
-0.0296630859375,
-0.04388427734375,
-0.00179290771484375,
-0.0239410400390625,
-0.00823211669921875,
0.035003662109375,
0.051361083984375,
-0.018035888671875,
0.0623779296875,
-0.06048583984375,
-0.036651611328125,
0.002887725830078125,
0.0002765655517578125,
0.0293426513671875,
0.046630859375,
0.0679931640625,
-0.04827880859375,
-0.050079345703125,
0.0031719207763671875,
-0.06488037109375,
0.00722503662109375,
-0.0015211105346679688,
-0.0237884521484375,
0.02813720703125,
0.023162841796875,
-0.0579833984375,
0.07086181640625,
0.052764892578125,
-0.0309600830078125,
0.057861328125,
-0.0266876220703125,
0.01381683349609375,
-0.10400390625,
0.017791748046875,
-0.016937255859375,
-0.0133514404296875,
-0.034210205078125,
0.0183563232421875,
0.0004096031188964844,
-0.017242431640625,
-0.0433349609375,
0.047882080078125,
-0.026885986328125,
0.0196075439453125,
-0.0013914108276367188,
0.0167083740234375,
-0.0133209228515625,
0.037445068359375,
-0.017578125,
0.051025390625,
0.0419921875,
-0.032928466796875,
0.03826904296875,
0.02789306640625,
-0.026519775390625,
0.04132080078125,
-0.07073974609375,
-0.00823974609375,
-0.0037326812744140625,
0.019439697265625,
-0.0889892578125,
-0.0274200439453125,
0.0552978515625,
-0.06451416015625,
0.015899658203125,
-0.006084442138671875,
-0.041717529296875,
-0.0307464599609375,
-0.03082275390625,
0.01137542724609375,
0.04254150390625,
-0.025360107421875,
0.035125732421875,
0.01922607421875,
-0.0184783935546875,
-0.05133056640625,
-0.053314208984375,
-0.0179290771484375,
-0.01461029052734375,
-0.06866455078125,
0.016265869140625,
-0.012054443359375,
-0.0033130645751953125,
0.007648468017578125,
0.010040283203125,
-0.01468658447265625,
-0.0009355545043945312,
0.039886474609375,
0.026336669921875,
-0.01233673095703125,
0.0062255859375,
0.00539398193359375,
-0.013153076171875,
-0.00968170166015625,
0.00649261474609375,
0.043121337890625,
-0.01629638671875,
-0.04052734375,
-0.0257110595703125,
0.036224365234375,
0.045989990234375,
-0.0179901123046875,
0.060699462890625,
0.05157470703125,
-0.032958984375,
0.01374053955078125,
-0.036102294921875,
-0.0008668899536132812,
-0.038299560546875,
0.01528167724609375,
-0.031768798828125,
-0.062744140625,
0.0570068359375,
0.0119171142578125,
0.02996826171875,
0.0208587646484375,
0.05401611328125,
-0.00823974609375,
0.06793212890625,
0.051239013671875,
0.00995635986328125,
0.0266876220703125,
-0.01522064208984375,
0.0207366943359375,
-0.053985595703125,
-0.0266876220703125,
-0.04315185546875,
-0.018310546875,
-0.0557861328125,
-0.0248870849609375,
0.02703857421875,
0.0249786376953125,
-0.043853759765625,
0.0188751220703125,
-0.05157470703125,
0.027740478515625,
0.058441162109375,
0.019439697265625,
0.02447509765625,
-0.00787353515625,
-0.0220794677734375,
0.0175323486328125,
-0.0333251953125,
-0.04248046875,
0.08062744140625,
0.0237884521484375,
0.06475830078125,
0.031341552734375,
0.052459716796875,
-0.0126800537109375,
0.0090484619140625,
-0.053985595703125,
0.037261962890625,
0.01555633544921875,
-0.0711669921875,
-0.0316162109375,
-0.0182037353515625,
-0.0968017578125,
0.018402099609375,
-0.0035247802734375,
-0.07977294921875,
0.011566162109375,
0.0034008026123046875,
-0.0164642333984375,
0.037384033203125,
-0.055908203125,
0.060333251953125,
-0.0161590576171875,
-0.021636962890625,
-0.007747650146484375,
-0.048919677734375,
0.04376220703125,
-0.003978729248046875,
0.032958984375,
-0.0256195068359375,
-0.017303466796875,
0.028411865234375,
-0.046600341796875,
0.0738525390625,
-0.01291656494140625,
0.003803253173828125,
0.0286865234375,
0.0261077880859375,
0.0201568603515625,
0.0179443359375,
0.0279693603515625,
0.0467529296875,
0.0129547119140625,
-0.0325927734375,
-0.0257568359375,
0.052459716796875,
-0.0693359375,
-0.04559326171875,
-0.03594970703125,
-0.0245513916015625,
0.00998687744140625,
0.032440185546875,
0.01544189453125,
0.008148193359375,
-0.003376007080078125,
0.02252197265625,
0.004268646240234375,
-0.055755615234375,
0.03448486328125,
0.046600341796875,
-0.040283203125,
-0.04595947265625,
0.058349609375,
0.0016393661499023438,
0.012939453125,
0.01143646240234375,
0.016571044921875,
-0.0116119384765625,
-0.0294952392578125,
-0.032623291015625,
0.0242462158203125,
-0.047637939453125,
-0.0261993408203125,
-0.03021240234375,
0.00603485107421875,
-0.051727294921875,
-0.0159454345703125,
-0.011138916015625,
-0.033905029296875,
-0.01552581787109375,
-0.005855560302734375,
0.0469970703125,
0.01898193359375,
-0.0261077880859375,
0.01340484619140625,
-0.0755615234375,
0.03985595703125,
-0.00009804964065551758,
0.054351806640625,
-0.0033740997314453125,
-0.020233154296875,
-0.0208587646484375,
0.00897979736328125,
-0.039703369140625,
-0.0782470703125,
0.03326416015625,
-0.015655517578125,
0.05157470703125,
0.045135498046875,
0.024139404296875,
0.05059814453125,
-0.031524658203125,
0.061920166015625,
0.05450439453125,
-0.04949951171875,
0.060791015625,
-0.046478271484375,
0.0232391357421875,
0.028900146484375,
0.059539794921875,
-0.03692626953125,
-0.0223846435546875,
-0.0400390625,
-0.061370849609375,
0.061676025390625,
0.02545166015625,
0.00891876220703125,
0.0021495819091796875,
-0.00878143310546875,
-0.0012683868408203125,
0.02276611328125,
-0.065185546875,
-0.0304412841796875,
-0.035308837890625,
-0.0092620849609375,
0.01031494140625,
-0.00225067138671875,
-0.0198211669921875,
-0.01192474365234375,
0.047576904296875,
0.00725555419921875,
0.0380859375,
0.003215789794921875,
0.005214691162109375,
-0.0258636474609375,
0.02471923828125,
0.0498046875,
0.0546875,
-0.037628173828125,
-0.0234527587890625,
-0.0052032470703125,
-0.040740966796875,
0.00435638427734375,
0.01092529296875,
-0.018524169921875,
-0.0028324127197265625,
0.01107025146484375,
0.052001953125,
0.0178375244140625,
-0.05010986328125,
0.048919677734375,
-0.0011053085327148438,
0.0009775161743164062,
-0.039398193359375,
-0.0037250518798828125,
0.017822265625,
0.0245513916015625,
0.0057525634765625,
0.0066070556640625,
0.004909515380859375,
-0.038787841796875,
-0.01617431640625,
0.0218505859375,
-0.02886962890625,
-0.0132598876953125,
0.061920166015625,
0.02337646484375,
-0.03704833984375,
0.045379638671875,
0.0012035369873046875,
-0.01296234130859375,
0.047210693359375,
0.027252197265625,
0.07562255859375,
-0.038543701171875,
0.008941650390625,
0.051666259765625,
0.032318115234375,
0.0196075439453125,
0.053619384765625,
0.00852203369140625,
-0.03973388671875,
-0.03326416015625,
-0.0278167724609375,
-0.035491943359375,
0.0171356201171875,
-0.053619384765625,
0.03778076171875,
-0.03887939453125,
-0.0287322998046875,
-0.0038928985595703125,
-0.0230560302734375,
-0.0416259765625,
-0.0094757080078125,
-0.0035877227783203125,
0.06884765625,
-0.0394287109375,
0.0435791015625,
0.065185546875,
-0.0693359375,
-0.046600341796875,
-0.01529693603515625,
0.00731658935546875,
-0.0570068359375,
0.034912109375,
0.01541900634765625,
0.003978729248046875,
-0.0255126953125,
-0.03875732421875,
-0.060577392578125,
0.0841064453125,
0.0140380859375,
-0.0238037109375,
-0.0110626220703125,
0.0020923614501953125,
0.0177154541015625,
-0.00466156005859375,
0.04736328125,
-0.002197265625,
0.03997802734375,
-0.006038665771484375,
-0.10516357421875,
0.0292205810546875,
-0.0233001708984375,
-0.01415252685546875,
0.00803375244140625,
-0.066650390625,
0.07513427734375,
-0.038848876953125,
-0.0108184814453125,
0.005924224853515625,
0.03228759765625,
0.0284576416015625,
0.0295257568359375,
0.0286865234375,
0.0279083251953125,
0.0404052734375,
-0.01477813720703125,
0.0706787109375,
-0.03533935546875,
0.031707763671875,
0.06884765625,
0.005329132080078125,
0.06610107421875,
0.0142059326171875,
-0.037017822265625,
0.057464599609375,
0.041595458984375,
-0.0005931854248046875,
0.0224609375,
0.0005340576171875,
-0.006038665771484375,
-0.004047393798828125,
0.00850677490234375,
-0.0504150390625,
0.028411865234375,
0.029388427734375,
-0.02398681640625,
-0.01404571533203125,
0.01031494140625,
0.003078460693359375,
-0.0161895751953125,
-0.007801055908203125,
0.0550537109375,
0.0018291473388671875,
-0.0237274169921875,
0.057586669921875,
0.00348663330078125,
0.043121337890625,
-0.061492919921875,
-0.002864837646484375,
-0.0127716064453125,
0.0109405517578125,
-0.0293121337890625,
-0.06158447265625,
0.00504302978515625,
-0.0031948089599609375,
0.0004322528839111328,
0.003108978271484375,
0.055694580078125,
0.0010547637939453125,
-0.0225677490234375,
0.024017333984375,
0.042205810546875,
0.02313232421875,
0.004154205322265625,
-0.062744140625,
0.0009965896606445312,
-0.0022487640380859375,
-0.043670654296875,
0.01861572265625,
0.036376953125,
0.0058441162109375,
0.0721435546875,
0.0565185546875,
0.0029773712158203125,
-0.00525665283203125,
-0.0056915283203125,
0.07196044921875,
-0.050750732421875,
-0.053619384765625,
-0.04779052734375,
0.06500244140625,
0.00022101402282714844,
-0.0273284912109375,
0.06048583984375,
0.050201416015625,
0.068359375,
-0.0181121826171875,
0.0687255859375,
-0.011444091796875,
0.04559326171875,
-0.02252197265625,
0.055511474609375,
-0.05670166015625,
-0.01544952392578125,
-0.03472900390625,
-0.049774169921875,
-0.013275146484375,
0.063232421875,
-0.0173797607421875,
0.013275146484375,
0.045562744140625,
0.0489501953125,
-0.002460479736328125,
0.005924224853515625,
0.017852783203125,
0.0279693603515625,
0.0133209228515625,
0.045196533203125,
0.05010986328125,
-0.0308837890625,
0.07318115234375,
-0.0261077880859375,
-0.03546142578125,
-0.03131103515625,
-0.044097900390625,
-0.083984375,
-0.0305938720703125,
-0.0310821533203125,
-0.034393310546875,
-0.0105743408203125,
0.06671142578125,
0.052459716796875,
-0.061309814453125,
-0.032623291015625,
0.0196380615234375,
0.01031494140625,
-0.026458740234375,
-0.0237274169921875,
0.0230865478515625,
-0.004852294921875,
-0.0650634765625,
0.00603485107421875,
0.0124969482421875,
0.0189056396484375,
-0.025909423828125,
0.0014324188232421875,
-0.01192474365234375,
0.0023021697998046875,
0.043548583984375,
0.0247650146484375,
-0.0584716796875,
-0.014434814453125,
-0.00579071044921875,
0.002628326416015625,
0.007843017578125,
0.0268402099609375,
-0.04559326171875,
0.04205322265625,
0.05120849609375,
0.0008149147033691406,
0.0303955078125,
-0.01349639892578125,
0.0178375244140625,
-0.036285400390625,
0.0243072509765625,
0.0037078857421875,
0.038604736328125,
-0.0035839080810546875,
-0.0243072509765625,
0.0531005859375,
0.01218414306640625,
-0.041046142578125,
-0.0677490234375,
0.0081024169921875,
-0.07373046875,
-0.03472900390625,
0.08294677734375,
-0.0189666748046875,
0.0019140243530273438,
-0.0081329345703125,
-0.034515380859375,
0.0273284912109375,
-0.054229736328125,
0.05072021484375,
0.039642333984375,
-0.0169525146484375,
-0.0023593902587890625,
-0.06390380859375,
0.005153656005859375,
-0.011444091796875,
-0.0584716796875,
-0.01041412353515625,
0.045379638671875,
0.0215606689453125,
0.0248260498046875,
0.0626220703125,
-0.0133209228515625,
0.0283050537109375,
0.0031108856201171875,
0.0290679931640625,
-0.029083251953125,
-0.002017974853515625,
-0.00678253173828125,
0.0170440673828125,
-0.0272064208984375,
-0.035919189453125
]
] |
timm/efficientnetv2_rw_s.ra2_in1k | 2023-04-27T21:13:07.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:2104.00298",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/efficientnetv2_rw_s.ra2_in1k | 0 | 11,782 | timm | 2022-12-12T23:58:49 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for efficientnetv2_rw_s.ra2_in1k
An EfficientNet-v2 image classification model. This is a `timm`-specific variation of the architecture. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details (a hypothetical `timm` sketch of these components follows the list):
* RandAugment `RA2` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
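To make the bullets above concrete, here is a hypothetical sketch mapping each recipe component to a public `timm` factory function. Every hyper-parameter value is a placeholder assumption, not the exact setting used for this checkpoint, and the `create_optimizer_v2` / `create_scheduler_v2` helpers assume a reasonably recent `timm` release.
```python
import timm
from timm.data import create_transform
from timm.optim import create_optimizer_v2
from timm.scheduler import create_scheduler_v2
from timm.utils import ModelEmaV2

# All hyper-parameter values below are illustrative placeholders,
# not the exact settings used to train this checkpoint.
model = timm.create_model('efficientnetv2_rw_s', pretrained=False)

# RandAugment ('RA2'-style) via timm's transform factory.
train_transform = create_transform(
    input_size=288, is_training=True, auto_augment='rand-m9-mstd0.5',
)

# RMSProp with TF 1.0 behaviour is exposed in timm as 'rmsproptf'.
optimizer = create_optimizer_v2(
    model, opt='rmsproptf', lr=0.08, weight_decay=1e-5,
)

# Step (exponential decay w/ staircase) LR schedule with warmup.
scheduler, num_epochs = create_scheduler_v2(
    optimizer, sched='step', num_epochs=450,
    decay_epochs=2.4, decay_rate=0.97, warmup_epochs=5,
)

# EMA weight averaging of model parameters.
model_ema = ModelEmaV2(model, decay=0.9999)
```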
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 23.9
- GMACs: 4.9
- Activations (M): 21.4
- Image size: train = 288 x 288, test = 384 x 384
- **Papers:**
- EfficientNetV2: Smaller Models and Faster Training: https://arxiv.org/abs/2104.00298
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('efficientnetv2_rw_s.ra2_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'efficientnetv2_rw_s.ra2_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 24, 144, 144])
# torch.Size([1, 48, 72, 72])
# torch.Size([1, 64, 36, 36])
# torch.Size([1, 160, 18, 18])
# torch.Size([1, 272, 9, 9])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'efficientnetv2_rw_s.ra2_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1792, 9, 9) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{tan2021efficientnetv2,
title={Efficientnetv2: Smaller models and faster training},
author={Tan, Mingxing and Le, Quoc},
booktitle={International conference on machine learning},
pages={10096--10106},
year={2021},
organization={PMLR}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 4,773 | [
[
-0.02593994140625,
-0.032135009765625,
-0.00933074951171875,
0.002391815185546875,
-0.0206451416015625,
-0.033905029296875,
-0.01551055908203125,
-0.0307464599609375,
0.0184783935546875,
0.035247802734375,
-0.0309600830078125,
-0.038848876953125,
-0.055145263671875,
-0.01480865478515625,
-0.00632476806640625,
0.05877685546875,
-0.005222320556640625,
0.0026950836181640625,
-0.0128173828125,
-0.045196533203125,
-0.0191802978515625,
-0.0103759765625,
-0.07733154296875,
-0.041290283203125,
0.03314208984375,
0.027862548828125,
0.03668212890625,
0.052490234375,
0.052734375,
0.034332275390625,
-0.0026683807373046875,
0.017364501953125,
-0.019866943359375,
-0.00867462158203125,
0.0269927978515625,
-0.051788330078125,
-0.02825927734375,
0.01213836669921875,
0.049468994140625,
0.017730712890625,
0.00041484832763671875,
0.033935546875,
0.01052093505859375,
0.051544189453125,
-0.020050048828125,
0.01143646240234375,
-0.033294677734375,
0.01678466796875,
-0.010467529296875,
0.0047760009765625,
-0.0250091552734375,
-0.0220489501953125,
0.006561279296875,
-0.038818359375,
0.0244598388671875,
0.004974365234375,
0.0946044921875,
0.02301025390625,
-0.005512237548828125,
0.0014362335205078125,
-0.0177001953125,
0.057891845703125,
-0.053192138671875,
0.01467132568359375,
0.02508544921875,
0.02337646484375,
-0.00594329833984375,
-0.09027099609375,
-0.042083740234375,
-0.0086822509765625,
-0.0123443603515625,
0.0026645660400390625,
-0.0261688232421875,
-0.0043182373046875,
0.0225067138671875,
0.0079498291015625,
-0.0377197265625,
0.022796630859375,
-0.039215087890625,
-0.0135345458984375,
0.0335693359375,
0.001888275146484375,
0.02593994140625,
-0.0194549560546875,
-0.036285400390625,
-0.03741455078125,
-0.030792236328125,
0.0254669189453125,
0.0247344970703125,
0.017822265625,
-0.042510986328125,
0.031585693359375,
0.00855255126953125,
0.040924072265625,
-0.0033245086669921875,
-0.021759033203125,
0.04150390625,
0.0013179779052734375,
-0.0281982421875,
-0.0171051025390625,
0.07635498046875,
0.032989501953125,
0.01128387451171875,
0.01422882080078125,
-0.010040283203125,
-0.031829833984375,
-0.0032367706298828125,
-0.09185791015625,
-0.032989501953125,
0.021270751953125,
-0.05126953125,
-0.0279541015625,
0.016754150390625,
-0.038665771484375,
-0.0102996826171875,
-0.0019350051879882812,
0.03533935546875,
-0.032745361328125,
-0.03216552734375,
-0.0096588134765625,
-0.021270751953125,
0.0287322998046875,
0.01824951171875,
-0.039520263671875,
0.0179901123046875,
0.034637451171875,
0.0928955078125,
0.007419586181640625,
-0.026641845703125,
-0.0260772705078125,
-0.030181884765625,
-0.0234832763671875,
0.034393310546875,
-0.0021686553955078125,
0.0098114013671875,
-0.024993896484375,
0.021453857421875,
-0.005115509033203125,
-0.05670166015625,
0.011505126953125,
-0.0257110595703125,
0.0157623291015625,
-0.01078033447265625,
-0.0117340087890625,
-0.04510498046875,
0.0222930908203125,
-0.0390625,
0.1011962890625,
0.0284576416015625,
-0.0712890625,
0.01493072509765625,
-0.0401611328125,
-0.006134033203125,
-0.022613525390625,
-0.003719329833984375,
-0.08013916015625,
-0.01058197021484375,
0.0030193328857421875,
0.05218505859375,
-0.032867431640625,
-0.0037746429443359375,
-0.037841796875,
-0.0189971923828125,
0.02093505859375,
0.0032196044921875,
0.078857421875,
0.01788330078125,
-0.03778076171875,
0.0099029541015625,
-0.0406494140625,
0.0191192626953125,
0.0428466796875,
-0.018798828125,
-0.00745391845703125,
-0.041259765625,
0.0135345458984375,
0.0263671875,
0.0020084381103515625,
-0.033905029296875,
0.01532745361328125,
-0.0203857421875,
0.034942626953125,
0.045928955078125,
-0.01354217529296875,
0.0233306884765625,
-0.030548095703125,
0.0170440673828125,
0.016815185546875,
0.01372528076171875,
0.0028514862060546875,
-0.04351806640625,
-0.06658935546875,
-0.036407470703125,
0.0311279296875,
0.035125732421875,
-0.047607421875,
0.0293121337890625,
-0.01322174072265625,
-0.0550537109375,
-0.0304412841796875,
0.01125335693359375,
0.047821044921875,
0.041778564453125,
0.0220184326171875,
-0.04193115234375,
-0.040069580078125,
-0.0748291015625,
-0.0003647804260253906,
-0.0012073516845703125,
0.005016326904296875,
0.030731201171875,
0.053802490234375,
-0.006099700927734375,
0.045928955078125,
-0.028045654296875,
-0.0210723876953125,
-0.0248260498046875,
0.00560760498046875,
0.022125244140625,
0.0616455078125,
0.06378173828125,
-0.045745849609375,
-0.04010009765625,
-0.00838470458984375,
-0.068603515625,
0.0157470703125,
-0.004718780517578125,
-0.0171661376953125,
0.018768310546875,
0.01427459716796875,
-0.037200927734375,
0.036102294921875,
0.015869140625,
-0.016754150390625,
0.02459716796875,
-0.01678466796875,
0.015045166015625,
-0.08721923828125,
0.0106048583984375,
0.0272369384765625,
-0.0138092041015625,
-0.03472900390625,
0.01308441162109375,
0.0016765594482421875,
-0.0006880760192871094,
-0.0408935546875,
0.051513671875,
-0.04656982421875,
-0.0228271484375,
-0.0153961181640625,
-0.0182952880859375,
0.0007767677307128906,
0.050018310546875,
-0.01027679443359375,
0.0287322998046875,
0.057861328125,
-0.0302734375,
0.03729248046875,
0.0248260498046875,
-0.021392822265625,
0.0221099853515625,
-0.055877685546875,
0.0252227783203125,
-0.00283050537109375,
0.020355224609375,
-0.0802001953125,
-0.02490234375,
0.035797119140625,
-0.049041748046875,
0.050201416015625,
-0.037872314453125,
-0.0379638671875,
-0.040557861328125,
-0.03729248046875,
0.0251007080078125,
0.05877685546875,
-0.058502197265625,
0.031890869140625,
0.0182342529296875,
0.02020263671875,
-0.047119140625,
-0.07904052734375,
-0.01239013671875,
-0.027496337890625,
-0.05694580078125,
0.0191650390625,
0.0194854736328125,
0.0006642341613769531,
0.0119476318359375,
-0.00069427490234375,
-0.0161285400390625,
-0.0031948089599609375,
0.0400390625,
0.016876220703125,
-0.020843505859375,
-0.00826263427734375,
-0.0211944580078125,
-0.01032257080078125,
0.0009093284606933594,
-0.028167724609375,
0.04156494140625,
-0.01519012451171875,
-0.00911712646484375,
-0.07110595703125,
-0.00492095947265625,
0.02825927734375,
-0.0034275054931640625,
0.062286376953125,
0.087646484375,
-0.040252685546875,
-0.006381988525390625,
-0.03472900390625,
-0.0305633544921875,
-0.036407470703125,
0.04425048828125,
-0.0241241455078125,
-0.03564453125,
0.058013916015625,
-0.0009355545043945312,
0.006378173828125,
0.048431396484375,
0.027618408203125,
-0.00792694091796875,
0.0479736328125,
0.040618896484375,
0.0234222412109375,
0.05401611328125,
-0.08050537109375,
-0.01898193359375,
-0.07000732421875,
-0.041900634765625,
-0.0306396484375,
-0.055145263671875,
-0.045318603515625,
-0.035888671875,
0.032501220703125,
0.015594482421875,
-0.032928466796875,
0.037628173828125,
-0.060638427734375,
0.007007598876953125,
0.0565185546875,
0.04290771484375,
-0.0357666015625,
0.03173828125,
-0.01459503173828125,
-0.00446319580078125,
-0.0689697265625,
-0.01467132568359375,
0.08001708984375,
0.0364990234375,
0.035308837890625,
0.00008744001388549805,
0.050201416015625,
-0.01401519775390625,
0.020263671875,
-0.043243408203125,
0.04144287109375,
-0.0153350830078125,
-0.0305023193359375,
-0.0086212158203125,
-0.0367431640625,
-0.07904052734375,
0.0108642578125,
-0.0193939208984375,
-0.052520751953125,
0.01006317138671875,
0.02117919921875,
-0.016693115234375,
0.0572509765625,
-0.058197021484375,
0.065673828125,
-0.0083465576171875,
-0.037139892578125,
-0.0008664131164550781,
-0.05810546875,
0.0290069580078125,
0.0194854736328125,
-0.0164337158203125,
-0.0040283203125,
0.005268096923828125,
0.08599853515625,
-0.052154541015625,
0.063232421875,
-0.039520263671875,
0.041168212890625,
0.044921875,
-0.01142120361328125,
0.03717041015625,
-0.00754547119140625,
-0.0135345458984375,
0.023468017578125,
-0.00955963134765625,
-0.033782958984375,
-0.044586181640625,
0.04632568359375,
-0.07666015625,
-0.0164642333984375,
-0.02593994140625,
-0.02374267578125,
0.0212860107421875,
0.0041961669921875,
0.04278564453125,
0.054473876953125,
0.0261383056640625,
0.0261383056640625,
0.044769287109375,
-0.0340576171875,
0.0303497314453125,
-0.00258636474609375,
-0.00435638427734375,
-0.041473388671875,
0.06365966796875,
0.0234222412109375,
0.011566162109375,
0.013153076171875,
0.0197296142578125,
-0.0236358642578125,
-0.045623779296875,
-0.024627685546875,
0.0194244384765625,
-0.054473876953125,
-0.0411376953125,
-0.055145263671875,
-0.0269927978515625,
-0.0306396484375,
0.001861572265625,
-0.045501708984375,
-0.031829833984375,
-0.031890869140625,
0.019500732421875,
0.057647705078125,
0.041595458984375,
-0.021728515625,
0.04388427734375,
-0.032989501953125,
0.01529693603515625,
0.0118255615234375,
0.0298614501953125,
0.0013217926025390625,
-0.0692138671875,
-0.01371002197265625,
-0.007106781005859375,
-0.0247344970703125,
-0.04705810546875,
0.033782958984375,
0.016448974609375,
0.033050537109375,
0.0235748291015625,
-0.0162200927734375,
0.048797607421875,
-0.0019969940185546875,
0.040252685546875,
0.044769287109375,
-0.028167724609375,
0.04217529296875,
0.003940582275390625,
0.007183074951171875,
0.013275146484375,
0.0173187255859375,
-0.018035888671875,
0.004749298095703125,
-0.06488037109375,
-0.06243896484375,
0.07159423828125,
0.0092010498046875,
-0.0017232894897460938,
0.0238037109375,
0.06182861328125,
0.0012054443359375,
-0.00601959228515625,
-0.050750732421875,
-0.04083251953125,
-0.02178955078125,
-0.01499176025390625,
0.0084991455078125,
-0.01605224609375,
-0.0055694580078125,
-0.04754638671875,
0.056671142578125,
-0.0035343170166015625,
0.061187744140625,
0.0196380615234375,
0.0022792816162109375,
-0.0034923553466796875,
-0.035888671875,
0.032745361328125,
0.01428985595703125,
-0.016998291015625,
0.01090240478515625,
0.0130157470703125,
-0.037841796875,
0.00897216796875,
0.005573272705078125,
-0.00719451904296875,
-0.0007491111755371094,
0.037994384765625,
0.07568359375,
-0.005329132080078125,
0.00653839111328125,
0.0341796875,
-0.006389617919921875,
-0.033843994140625,
-0.0254058837890625,
0.01654052734375,
0.001079559326171875,
0.037872314453125,
0.01326751708984375,
0.035430908203125,
-0.00847625732421875,
-0.020751953125,
0.0238800048828125,
0.03863525390625,
-0.0230560302734375,
-0.020660400390625,
0.052734375,
-0.0103302001953125,
-0.0204620361328125,
0.0694580078125,
-0.0186004638671875,
-0.033599853515625,
0.08880615234375,
0.033203125,
0.0709228515625,
0.006427764892578125,
0.0017843246459960938,
0.0643310546875,
0.0228424072265625,
-0.0082855224609375,
0.0153350830078125,
0.018310546875,
-0.054901123046875,
0.004299163818359375,
-0.0341796875,
0.00783538818359375,
0.025146484375,
-0.042388916015625,
0.0236968994140625,
-0.055084228515625,
-0.032135009765625,
0.0083465576171875,
0.0275726318359375,
-0.076904296875,
0.0117340087890625,
-0.007568359375,
0.06866455078125,
-0.051055908203125,
0.0604248046875,
0.06591796875,
-0.032501220703125,
-0.08331298828125,
-0.01291656494140625,
0.01275634765625,
-0.07421875,
0.055389404296875,
0.03460693359375,
0.01001739501953125,
0.004451751708984375,
-0.0565185546875,
-0.0491943359375,
0.11236572265625,
0.03778076171875,
-0.0142364501953125,
0.0272979736328125,
-0.0108795166015625,
0.01241302490234375,
-0.02825927734375,
0.044830322265625,
0.01206207275390625,
0.03472900390625,
0.0241546630859375,
-0.0435791015625,
0.0197906494140625,
-0.0284271240234375,
0.0159912109375,
0.00965118408203125,
-0.07196044921875,
0.0640869140625,
-0.041717529296875,
-0.01050567626953125,
0.00678253173828125,
0.055084228515625,
0.01104736328125,
0.01537322998046875,
0.038177490234375,
0.06488037109375,
0.041107177734375,
-0.0261688232421875,
0.0733642578125,
0.0028858184814453125,
0.040313720703125,
0.05206298828125,
0.027130126953125,
0.042938232421875,
0.022979736328125,
-0.0145263671875,
0.0272674560546875,
0.0802001953125,
-0.0257110595703125,
0.0275726318359375,
0.0195465087890625,
0.01078033447265625,
-0.005001068115234375,
0.006404876708984375,
-0.03472900390625,
0.04510498046875,
0.004650115966796875,
-0.03717041015625,
-0.0128021240234375,
0.0005917549133300781,
0.00307464599609375,
-0.023773193359375,
-0.00949859619140625,
0.041717529296875,
0.004543304443359375,
-0.031219482421875,
0.0634765625,
0.0231170654296875,
0.0628662109375,
-0.032989501953125,
-0.0002567768096923828,
-0.020782470703125,
0.0164794921875,
-0.0230712890625,
-0.053924560546875,
0.02349853515625,
-0.017791748046875,
-0.0010843276977539062,
0.0038242340087890625,
0.053802490234375,
-0.020263671875,
-0.02947998046875,
0.01515960693359375,
0.0217742919921875,
0.042510986328125,
0.0103607177734375,
-0.09588623046875,
0.0166473388671875,
0.00345611572265625,
-0.053192138671875,
0.0290069580078125,
0.0221099853515625,
0.01102447509765625,
0.053436279296875,
0.040679931640625,
-0.00826263427734375,
0.003452301025390625,
-0.0157928466796875,
0.05975341796875,
-0.0291900634765625,
-0.0208282470703125,
-0.056854248046875,
0.04327392578125,
-0.01242828369140625,
-0.046142578125,
0.03436279296875,
0.044219970703125,
0.0574951171875,
0.003353118896484375,
0.038055419921875,
-0.0240478515625,
-0.00376129150390625,
-0.037750244140625,
0.050933837890625,
-0.059967041015625,
0.0025272369384765625,
0.0012645721435546875,
-0.052978515625,
-0.02294921875,
0.0550537109375,
-0.0078125,
0.0311431884765625,
0.03558349609375,
0.07904052734375,
-0.02679443359375,
-0.033721923828125,
0.0052947998046875,
0.0099029541015625,
0.005504608154296875,
0.031524658203125,
0.0258941650390625,
-0.059967041015625,
0.0213623046875,
-0.049652099609375,
-0.0181884765625,
-0.0171966552734375,
-0.051666259765625,
-0.07012939453125,
-0.05950927734375,
-0.048431396484375,
-0.057464599609375,
-0.0024890899658203125,
0.0751953125,
0.08001708984375,
-0.047149658203125,
-0.005603790283203125,
-0.00006335973739624023,
0.0108642578125,
-0.024078369140625,
-0.017059326171875,
0.048919677734375,
-0.0092315673828125,
-0.049468994140625,
-0.0213165283203125,
0.0056610107421875,
0.0208892822265625,
0.0037326812744140625,
-0.023651123046875,
-0.010223388671875,
-0.0191192626953125,
0.014434814453125,
0.0261688232421875,
-0.049072265625,
-0.0176849365234375,
-0.02142333984375,
-0.0132293701171875,
0.024627685546875,
0.036468505859375,
-0.033843994140625,
0.0243682861328125,
0.0364990234375,
0.030792236328125,
0.059234619140625,
-0.0255584716796875,
-0.0027103424072265625,
-0.06591796875,
0.046539306640625,
-0.010467529296875,
0.032379150390625,
0.031829833984375,
-0.02398681640625,
0.048614501953125,
0.034027099609375,
-0.0273284912109375,
-0.06658935546875,
-0.00131988525390625,
-0.08172607421875,
-0.0146026611328125,
0.0792236328125,
-0.032989501953125,
-0.03765869140625,
0.033905029296875,
0.00386810302734375,
0.051116943359375,
-0.00472259521484375,
0.0283203125,
0.00824737548828125,
-0.010162353515625,
-0.0491943359375,
-0.04815673828125,
0.03143310546875,
0.01299285888671875,
-0.045684814453125,
-0.034027099609375,
-0.003940582275390625,
0.05206298828125,
0.009674072265625,
0.041778564453125,
-0.00508880615234375,
0.0132293701171875,
0.0161895751953125,
0.03533935546875,
-0.048797607421875,
-0.013275146484375,
-0.018768310546875,
0.0022792816162109375,
-0.002452850341796875,
-0.046234130859375
]
] |
avichr/Legal-heBERT_ft | 2022-07-07T07:31:58.000Z | [
"transformers",
"pytorch",
"bert",
"fill-mask",
"arxiv:1911.03090",
"arxiv:2010.02559",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | avichr | null | null | avichr/Legal-heBERT_ft | 2 | 11,757 | transformers | 2022-05-05T06:49:36 | # Legal-HeBERT
Legal-HeBERT is a BERT model for the Hebrew legal and legislative domains. It is intended to improve legal NLP research and tool development in Hebrew. We release two versions of Legal-HeBERT. The first version is a fine-tuned model of [HeBERT](https://github.com/avichaychriqui/HeBERT) applied to legal and legislative documents. The second version follows [HeBERT](https://github.com/avichaychriqui/HeBERT)'s architecture guidelines to train a BERT model from scratch. <br>
We continue to collect legal data, examine different architectural designs, and build tagged datasets and legal tasks for evaluating and developing Hebrew legal tools.
## Training Data
Our training datasets are:
| Name | Hebrew Description | Size (GB) | Documents | Sentences | Words | Notes |
|----------------------------------------------------------------------------------------------------------------------------------- |-------------------------------------------------------------------------- |----------- |----------- |------------ |------------- |----------------------------------------- |
| The Israeli Law Book | ספר החוקים הישראלי | 0.05 | 2338 | 293352 | 4851063 | |
| Judgments of the Supreme Court | מאגר פסקי הדין של בית המשפט העליון | 0.7 | 212348 | 5790138 | 79672415 | |
| Custody courts | החלטות בתי הדין למשמורת | 2.46 | 169,708 | 8,555,893 | 213,050,492 | |
| Law memoranda, drafts of secondary legislation and drafts of support tests that have been distributed to the public for comment | תזכירי חוק, טיוטות חקיקת משנה וטיוטות מבחני תמיכה שהופצו להערות הציבור | 0.4 | 3,291 | 294,752 | 7,218,960 | |
| Supervisors of Land Registration judgments | מאגר פסקי דין של המפקחים על רישום המקרקעין | 0.02 | 559 | 67,639 | 1,785,446 | |
| Decisions of the Labor Court - Corona | מאגר החלטות בית הדין לעניין שירות התעסוקה – קורונה | 0.001 | 146 | 3505 | 60195 | |
| Decisions of the Israel Lands Council | החלטות מועצת מקרקעי ישראל | | 118 | 11283 | 162692 | aggregate file |
| Judgments of the Disciplinary Tribunal and the Israel Police Appeals Tribunal | פסקי דין של בית הדין למשמעת ובית הדין לערעורים של משטרת ישראל | 0.02 | 54 | 83724 | 1743419 | aggregate files |
| Disciplinary Appeals Committee in the Ministry of Health | ועדת ערר לדין משמעתי במשרד הבריאות | 0.004 | 252 | 21010 | 429807 | 465 scanned files could not be parsed |
| Attorney General's Positions | מאגר התייצבויות היועץ המשפטי לממשלה | 0.008 | 281 | 32724 | 813877 | |
| Legal-Opinion of the Attorney General | מאגר חוות דעת היועץ המשפטי לממשלה | 0.002 | 44 | 7132 | 188053 | |
| | | | | | | |
| total | | 3.665 | 389,139 | 15,161,152 | 309,976,419 | |
We thank <b>Yair Gardin</b> for referring us to the governance data, <b>Elhanan Schwarts</b> for collecting and parsing the Israeli Law Book, and <b>Jonathan Schler</b> for collecting the judgments of the Supreme Court.
## Training process
* Vocabulary size: 50,000 tokens
* 4 epochs (~1M steps)
* lr=5e-5
* mlm_probability=0.15
* batch size = 32 (for each gpu)
* NVIDIA GeForce RTX 2080 TI + NVIDIA GeForce RTX 3090 (1 week training)
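The settings above map directly onto the Hugging Face `Trainer` API. A minimal sketch of an equivalent setup — not the authors' actual training script, and the training corpus variable is hypothetical:
```
# Minimal MLM training sketch using the hyperparameters listed above.
from transformers import (
    AutoModelForMaskedLM, AutoTokenizer,
    DataCollatorForLanguageModeling, Trainer, TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("avichr/heBERT")
model = AutoModelForMaskedLM.from_pretrained("avichr/heBERT")

collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15,  # mlm_probability=0.15
)
args = TrainingArguments(
    output_dir="legal-hebert",
    learning_rate=5e-5,              # lr=5e-5
    per_device_train_batch_size=32,  # batch size = 32 (for each gpu)
    num_train_epochs=4,              # 4 epochs
)
# trainer = Trainer(model=model, args=args, data_collator=collator,
#                   train_dataset=tokenized_legal_corpus)  # hypothetical dataset
# trainer.train()
```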
### Additional training settings:
<b>Fine-tuned [HeBERT](https://github.com/avichaychriqui/HeBERT) model:</b> The first eight layers were frozen (as [Lee et al. (2019)](https://arxiv.org/abs/1911.03090) suggest)<br>
<b>Legal-HeBERT trained from scratch:</b> The training process is similar to [HeBERT](https://github.com/avichaychriqui/HeBERT) and inspired by [Chalkidis et al. (2020)](https://arxiv.org/abs/2010.02559) <br>
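As an illustration, the layer freezing described above amounts to a few lines over a standard BERT encoder (a minimal sketch; attribute names assume a `BertModel`-style architecture):
```
from transformers import AutoModel

model = AutoModel.from_pretrained("avichr/heBERT")
# freeze the first eight transformer layers, as in Lee et al. (2019)
for layer in model.encoder.layer[:8]:
    for param in layer.parameters():
        param.requires_grad = False
```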
## How to use
The models can be found on the Hugging Face Hub and can be fine-tuned for any downstream task:
```
# !pip install transformers==4.14.1
from transformers import AutoTokenizer, AutoModel
# choose one of the two released checkpoints:
model_name = 'avichr/Legal-heBERT_ft'  # HeBERT fine-tuned on legal data
# model_name = 'avichr/Legal-heBERT'   # legal BERT trained from scratch
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
from transformers import pipeline
fill_mask = pipeline(
"fill-mask",
model=model_name,
)
fill_mask("הקורונה לקחה את [MASK] ולנו לא נשאר דבר.")
```
## Stay tuned!
We are still working on our models and the datasets. We will edit this page as we progress. We are open for collaborations.
## If you use this model, please cite us as:
Chriqui, Avihay, Yahav, Inbal and Bar-Siman-Tov, Ittai, Legal HeBERT: A BERT-based NLP Model for Hebrew Legal, Judicial and Legislative Texts (June 27, 2022). Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4147127
```
@article{chriqui2021hebert,
title={Legal HeBERT: A BERT-based NLP Model for Hebrew Legal, Judicial and Legislative Texts},
author={Chriqui, Avihay and Yahav, Inbal and Bar-Siman-Tov, Ittai},
journal={SSRN preprint:4147127},
year={2022}
}
```
## Contact us
[Avichay Chriqui](mailto:avichayc@mail.tau.ac.il), The Coller AI Lab <br>
[Inbal yahav](mailto:inbalyahav@tauex.tau.ac.il), The Coller AI Lab <br>
[Ittai Bar-Siman-Tov](mailto:Ittai.Bar-Siman-Tov@biu.ac.il), the BIU Innovation Lab for Law, Data-Science and Digital Ethics <br>
Thank you, תודה, شكرا <br>
| 7,821 | [
[
-0.02752685546875,
-0.02911376953125,
0.023193359375,
0.01450347900390625,
-0.03729248046875,
-0.01374053955078125,
-0.00147247314453125,
-0.021820068359375,
0.0201873779296875,
0.034698486328125,
-0.0183563232421875,
-0.060882568359375,
-0.06500244140625,
-0.036651611328125,
-0.042724609375,
0.07177734375,
0.0086517333984375,
0.01322174072265625,
0.01195526123046875,
-0.02349853515625,
-0.0269012451171875,
-0.033477783203125,
-0.0282440185546875,
-0.030364990234375,
0.0142669677734375,
0.0015125274658203125,
0.05853271484375,
0.043670654296875,
0.033538818359375,
0.0200042724609375,
-0.01474761962890625,
-0.02081298828125,
-0.0187530517578125,
-0.0157928466796875,
0.00464630126953125,
-0.0248565673828125,
-0.034332275390625,
-0.0008435249328613281,
0.03753662109375,
0.0289764404296875,
-0.0389404296875,
0.022308349609375,
-0.01462554931640625,
0.076416015625,
-0.045013427734375,
0.00940704345703125,
-0.037200927734375,
0.02020263671875,
-0.01403045654296875,
-0.01352691650390625,
-0.0286102294921875,
-0.029449462890625,
0.01236724853515625,
-0.033905029296875,
0.0222015380859375,
0.019287109375,
0.109130859375,
0.0214385986328125,
-0.0246734619140625,
-0.01218414306640625,
-0.029205322265625,
0.03936767578125,
-0.07940673828125,
0.0377197265625,
0.040283203125,
0.01099395751953125,
-0.0061492919921875,
-0.05560302734375,
-0.051971435546875,
-0.0181732177734375,
-0.01543426513671875,
0.0406494140625,
-0.03131103515625,
-0.0102081298828125,
0.0291290283203125,
0.0206298828125,
-0.044708251953125,
0.0037078857421875,
-0.050323486328125,
-0.024017333984375,
0.058135986328125,
0.0036563873291015625,
0.025360107421875,
-0.0270538330078125,
-0.028167724609375,
-0.0224151611328125,
-0.045166015625,
0.0380859375,
0.0458984375,
0.0066986083984375,
-0.03143310546875,
0.04315185546875,
0.0037403106689453125,
0.031494140625,
0.004688262939453125,
-0.00499725341796875,
0.04296875,
-0.024658203125,
-0.03350830078125,
0.01189422607421875,
0.082763671875,
0.0215606689453125,
0.006717681884765625,
0.0079498291015625,
-0.01422119140625,
-0.01336669921875,
0.03509521484375,
-0.06005859375,
-0.00905609130859375,
0.013916015625,
-0.05218505859375,
-0.018585205078125,
0.00862884521484375,
-0.047271728515625,
-0.0210418701171875,
-0.0226287841796875,
0.020233154296875,
-0.051483154296875,
-0.0106353759765625,
0.0199127197265625,
-0.01065826416015625,
0.032562255859375,
0.0256500244140625,
-0.0677490234375,
0.01739501953125,
0.040771484375,
0.041595458984375,
0.0212249755859375,
-0.01013946533203125,
-0.03369140625,
-0.0005750656127929688,
-0.03521728515625,
0.032257080078125,
-0.0258331298828125,
-0.021148681640625,
0.01690673828125,
-0.01099395751953125,
-0.00264739990234375,
-0.032135009765625,
0.051605224609375,
-0.05181884765625,
0.011749267578125,
-0.017059326171875,
-0.0548095703125,
-0.0223541259765625,
0.0174407958984375,
-0.041046142578125,
0.07977294921875,
0.002826690673828125,
-0.0557861328125,
0.048736572265625,
-0.04559326171875,
-0.0167694091796875,
0.004638671875,
-0.033172607421875,
-0.06683349609375,
-0.00004291534423828125,
0.00868988037109375,
0.021575927734375,
-0.0233154296875,
0.0406494140625,
-0.0036296844482421875,
-0.014312744140625,
0.0008759498596191406,
-0.01206207275390625,
0.11029052734375,
0.0205841064453125,
-0.04522705078125,
0.0257568359375,
-0.058746337890625,
0.0045166015625,
0.0094757080078125,
-0.048126220703125,
0.0026988983154296875,
0.0025615692138671875,
-0.00408935546875,
0.0198822021484375,
0.007434844970703125,
-0.044036865234375,
0.00444793701171875,
-0.03564453125,
0.01258087158203125,
0.061370849609375,
-0.01007080078125,
0.021759033203125,
-0.038818359375,
0.0279693603515625,
0.0035858154296875,
0.0115203857421875,
0.013092041015625,
-0.0506591796875,
-0.062286376953125,
-0.02142333984375,
0.039093017578125,
0.048187255859375,
-0.0259552001953125,
0.059417724609375,
-0.023193359375,
-0.05047607421875,
-0.046295166015625,
0.0011196136474609375,
0.0285186767578125,
0.03173828125,
0.034393310546875,
-0.0218505859375,
-0.034149169921875,
-0.07073974609375,
-0.0231475830078125,
-0.034210205078125,
0.020172119140625,
0.0115814208984375,
0.06866455078125,
0.0064849853515625,
0.06585693359375,
-0.0209808349609375,
-0.02880859375,
-0.0103607177734375,
0.01557159423828125,
0.033782958984375,
0.0364990234375,
0.06048583984375,
-0.050811767578125,
-0.05352783203125,
0.0018253326416015625,
-0.06866455078125,
0.0195159912109375,
-0.00762176513671875,
-0.0201873779296875,
0.0098419189453125,
0.03607177734375,
-0.04345703125,
0.061981201171875,
0.013946533203125,
-0.058502197265625,
0.056060791015625,
-0.0282745361328125,
-0.003204345703125,
-0.06329345703125,
0.0007257461547851562,
-0.0026950836181640625,
-0.0117645263671875,
-0.048583984375,
0.01410675048828125,
0.0084228515625,
0.01415252685546875,
-0.03759765625,
0.039154052734375,
-0.031829833984375,
-0.01116943359375,
0.0070343017578125,
-0.010223388671875,
-0.0128021240234375,
0.04730224609375,
-0.0129241943359375,
0.054473876953125,
0.046051025390625,
-0.03448486328125,
0.028839111328125,
0.050994873046875,
-0.034332275390625,
0.033905029296875,
-0.04937744140625,
-0.01824951171875,
-0.0237884521484375,
0.006389617919921875,
-0.046112060546875,
-0.003917694091796875,
0.0391845703125,
-0.0416259765625,
0.0266876220703125,
-0.009918212890625,
-0.050048828125,
-0.036895751953125,
-0.0330810546875,
-0.005096435546875,
0.0640869140625,
-0.0244598388671875,
0.058135986328125,
0.0194854736328125,
0.005764007568359375,
-0.0655517578125,
-0.053802490234375,
0.0063323974609375,
-0.011810302734375,
-0.050262451171875,
0.038299560546875,
-0.0023937225341796875,
-0.0211029052734375,
0.0289306640625,
0.004245758056640625,
-0.0146636962890625,
-0.009033203125,
0.032806396484375,
0.0211639404296875,
-0.01371002197265625,
-0.004638671875,
0.003635406494140625,
0.0020580291748046875,
0.00849151611328125,
-0.0009522438049316406,
0.048187255859375,
-0.0222320556640625,
-0.04302978515625,
-0.0550537109375,
0.0174713134765625,
0.037567138671875,
-0.0237579345703125,
0.057403564453125,
0.047393798828125,
-0.034881591796875,
0.0213165283203125,
-0.039947509765625,
0.0007600784301757812,
-0.03070068359375,
0.006839752197265625,
-0.02044677734375,
-0.047637939453125,
0.046844482421875,
0.0234527587890625,
0.042572021484375,
0.05975341796875,
0.0670166015625,
-0.01141357421875,
0.03875732421875,
0.0291748046875,
-0.01361083984375,
0.03839111328125,
-0.040679931640625,
0.021270751953125,
-0.0469970703125,
-0.0240020751953125,
-0.04827880859375,
-0.0224609375,
-0.055267333984375,
-0.005908966064453125,
0.020355224609375,
-0.0011816024780273438,
-0.034942626953125,
0.03741455078125,
-0.0283050537109375,
0.01027679443359375,
0.0660400390625,
0.005695343017578125,
0.00374603271484375,
0.0099029541015625,
-0.03814697265625,
-0.0082550048828125,
-0.050201416015625,
-0.038726806640625,
0.081298828125,
0.0259246826171875,
0.02947998046875,
0.03277587890625,
0.07757568359375,
0.037750244140625,
0.0198822021484375,
-0.03948974609375,
0.043548583984375,
-0.005138397216796875,
-0.07861328125,
-0.03302001953125,
-0.01232147216796875,
-0.0775146484375,
0.037078857421875,
-0.031463623046875,
-0.07281494140625,
0.0361328125,
0.00420379638671875,
-0.057403564453125,
0.033538818359375,
-0.027191162109375,
0.060760498046875,
-0.010345458984375,
-0.043701171875,
-0.02667236328125,
-0.0618896484375,
0.01195526123046875,
0.0159454345703125,
0.0109100341796875,
-0.021575927734375,
0.01806640625,
0.07781982421875,
-0.06982421875,
0.05987548828125,
-0.02166748046875,
0.0015125274658203125,
0.046051025390625,
-0.00936126708984375,
0.04486083984375,
0.005512237548828125,
-0.01210784912109375,
0.003917694091796875,
0.0036163330078125,
-0.033843994140625,
-0.01904296875,
0.03509521484375,
-0.077392578125,
-0.047637939453125,
-0.06414794921875,
-0.03936767578125,
0.0237274169921875,
0.0251922607421875,
0.03192138671875,
0.020050048828125,
0.0113525390625,
0.0227813720703125,
0.032073974609375,
-0.0261688232421875,
0.04443359375,
0.041107177734375,
-0.00567626953125,
-0.03143310546875,
0.058013916015625,
0.02166748046875,
-0.012664794921875,
-0.005096435546875,
0.01180267333984375,
-0.0209808349609375,
-0.04998779296875,
-0.041046142578125,
0.017578125,
-0.0594482421875,
-0.01078033447265625,
-0.046051025390625,
-0.0305328369140625,
-0.030914306640625,
-0.00753021240234375,
-0.030242919921875,
-0.0184783935546875,
-0.0264434814453125,
-0.0106964111328125,
0.027557373046875,
0.040557861328125,
-0.0096282958984375,
0.01267242431640625,
-0.056640625,
0.028533935546875,
0.0197906494140625,
0.024688720703125,
-0.0159454345703125,
-0.06414794921875,
0.005279541015625,
-0.01465606689453125,
-0.02862548828125,
-0.060302734375,
0.043975830078125,
0.0002734661102294922,
0.046234130859375,
0.022552490234375,
0.0115966796875,
0.056304931640625,
-0.053375244140625,
0.06134033203125,
0.03387451171875,
-0.059173583984375,
0.02606201171875,
-0.0205230712890625,
0.004913330078125,
0.05279541015625,
0.025146484375,
-0.0297088623046875,
-0.032745361328125,
-0.072509765625,
-0.074462890625,
0.062469482421875,
0.034820556640625,
0.0313720703125,
0.0213775634765625,
0.02838134765625,
-0.0005593299865722656,
0.0223388671875,
-0.0556640625,
-0.03717041015625,
-0.01139068603515625,
-0.00867462158203125,
0.039398193359375,
-0.0278778076171875,
-0.0224609375,
-0.04327392578125,
0.07550048828125,
0.01611328125,
0.040130615234375,
0.01384735107421875,
-0.01003265380859375,
-0.0016851425170898438,
0.03509521484375,
0.05419921875,
0.0609130859375,
-0.0159759521484375,
0.0003223419189453125,
0.00872802734375,
-0.037811279296875,
0.01507568359375,
0.0156097412109375,
-0.0465087890625,
0.00893402099609375,
0.031829833984375,
0.07916259765625,
-0.00937652587890625,
-0.03155517578125,
0.052947998046875,
-0.001373291015625,
-0.055328369140625,
-0.06048583984375,
-0.004230499267578125,
0.01049041748046875,
0.039337158203125,
0.017242431640625,
0.00952911376953125,
0.0096435546875,
-0.0443115234375,
0.0208740234375,
0.0201416015625,
-0.0267333984375,
0.0024394989013671875,
0.06182861328125,
-0.0168609619140625,
-0.0167388916015625,
0.04266357421875,
-0.037384033203125,
-0.0438232421875,
0.05523681640625,
0.0191802978515625,
0.047821044921875,
0.00763702392578125,
0.01233673095703125,
0.053497314453125,
0.0303192138671875,
-0.0119476318359375,
0.035491943359375,
0.00934600830078125,
-0.039581298828125,
-0.01531982421875,
-0.04315185546875,
-0.00555419921875,
0.007843017578125,
-0.053314208984375,
0.010711669921875,
-0.048187255859375,
-0.0272979736328125,
0.01300048828125,
-0.0102386474609375,
-0.04742431640625,
0.0022735595703125,
0.00919342041015625,
0.069091796875,
-0.04443359375,
0.060943603515625,
0.0361328125,
-0.055145263671875,
-0.051788330078125,
-0.00800323486328125,
-0.0015687942504882812,
-0.064697265625,
0.06365966796875,
-0.004085540771484375,
0.011016845703125,
-0.01334381103515625,
-0.035888671875,
-0.05914306640625,
0.086669921875,
0.0280303955078125,
-0.058624267578125,
0.002834320068359375,
0.01197052001953125,
0.0439453125,
0.0062103271484375,
0.0034923553466796875,
0.04144287109375,
0.04107666015625,
0.00868988037109375,
-0.06292724609375,
0.016265869140625,
-0.036407470703125,
-0.0087738037109375,
0.0086212158203125,
-0.050262451171875,
0.0692138671875,
-0.00855255126953125,
-0.002735137939453125,
-0.005847930908203125,
0.053741455078125,
0.0170440673828125,
0.01361846923828125,
0.03839111328125,
0.07501220703125,
0.057159423828125,
-0.01416015625,
0.0814208984375,
-0.03887939453125,
0.041656494140625,
0.079345703125,
-0.01316070556640625,
0.057464599609375,
0.036407470703125,
-0.025238037109375,
0.043121337890625,
0.04010009765625,
-0.0309906005859375,
0.033660888671875,
0.021026611328125,
-0.0211181640625,
-0.0212860107421875,
0.00101470947265625,
-0.026947021484375,
0.0218505859375,
-0.0005908012390136719,
-0.035308837890625,
-0.0238189697265625,
-0.019744873046875,
0.0148162841796875,
0.00536346435546875,
-0.01296234130859375,
0.049346923828125,
0.0078887939453125,
-0.04144287109375,
0.044097900390625,
0.0121002197265625,
0.024688720703125,
-0.028961181640625,
0.0000559687614440918,
0.0097503662109375,
0.002941131591796875,
-0.020660400390625,
-0.06329345703125,
0.0234222412109375,
0.0171051025390625,
-0.0016117095947265625,
-0.0196075439453125,
0.033355712890625,
-0.00678253173828125,
-0.036407470703125,
0.0276336669921875,
0.035980224609375,
0.01334381103515625,
0.0262908935546875,
-0.076171875,
-0.01303863525390625,
-0.01222991943359375,
-0.049407958984375,
0.00894927978515625,
0.019012451171875,
-0.01204681396484375,
0.039093017578125,
0.0477294921875,
-0.003963470458984375,
0.0361328125,
0.0191497802734375,
0.07366943359375,
-0.05438232421875,
-0.037445068359375,
-0.079345703125,
0.056854248046875,
-0.0198211669921875,
-0.0267333984375,
0.03607177734375,
0.05078125,
0.04583740234375,
0.01027679443359375,
0.06475830078125,
-0.03192138671875,
0.007083892822265625,
-0.04010009765625,
0.06201171875,
-0.044769287109375,
0.0018148422241210938,
-0.0010309219360351562,
-0.047576904296875,
-0.02764892578125,
0.054718017578125,
-0.038970947265625,
0.005462646484375,
0.048187255859375,
0.037506103515625,
0.001064300537109375,
-0.0157928466796875,
0.02630615234375,
0.00994110107421875,
0.022735595703125,
0.0225677490234375,
0.03558349609375,
-0.05169677734375,
0.04437255859375,
-0.02545166015625,
0.01384735107421875,
-0.02117919921875,
-0.041534423828125,
-0.07745361328125,
-0.027862548828125,
-0.015380859375,
-0.03936767578125,
0.00021326541900634766,
0.08453369140625,
0.0469970703125,
-0.06475830078125,
-0.02886962890625,
-0.017547607421875,
-0.004497528076171875,
-0.0141448974609375,
-0.0115203857421875,
0.05023193359375,
-0.01338958740234375,
-0.033660888671875,
0.0044708251953125,
0.004344940185546875,
0.0026988983154296875,
0.0022296905517578125,
-0.0240631103515625,
-0.0426025390625,
0.0108795166015625,
0.034881591796875,
0.01690673828125,
-0.06695556640625,
-0.0222625732421875,
-0.006748199462890625,
-0.0287628173828125,
0.037628173828125,
0.037139892578125,
-0.0210418701171875,
0.0252838134765625,
0.04345703125,
0.044708251953125,
0.06390380859375,
-0.01288604736328125,
0.0028781890869140625,
-0.045166015625,
0.020965576171875,
0.00713348388671875,
0.056884765625,
-0.00629425048828125,
-0.01192474365234375,
0.048431396484375,
0.01331329345703125,
-0.02386474609375,
-0.04296875,
0.0087738037109375,
-0.0771484375,
-0.01050567626953125,
0.06353759765625,
-0.0233917236328125,
-0.022430419921875,
0.0006556510925292969,
-0.015960693359375,
0.0299072265625,
-0.05047607421875,
0.057098388671875,
0.055023193359375,
-0.003421783447265625,
-0.0010194778442382812,
-0.08514404296875,
0.0189361572265625,
0.03924560546875,
-0.055938720703125,
-0.0137786865234375,
0.0285797119140625,
0.020294189453125,
0.0311279296875,
0.06256103515625,
-0.0163726806640625,
0.03253173828125,
0.003978729248046875,
0.022064208984375,
0.00011980533599853516,
0.0045166015625,
-0.0192108154296875,
0.014984130859375,
-0.002666473388671875,
-0.007266998291015625
]
] |
facebook/data2vec-audio-base-100h | 2022-04-18T16:19:17.000Z | [
"transformers",
"pytorch",
"data2vec-audio",
"automatic-speech-recognition",
"speech",
"en",
"dataset:librispeech_asr",
"arxiv:2202.03555",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | facebook | null | null | facebook/data2vec-audio-base-100h | 1 | 11,739 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- librispeech_asr
tags:
- speech
license: apache-2.0
---
# Data2Vec-Audio-Base-100h
[Facebook's Data2Vec](https://ai.facebook.com/research/data2vec-a-general-framework-for-self-supervised-learning-in-speech-vision-and-language/)
This is the base model, pretrained and fine-tuned on 100 hours of Librispeech 16 kHz sampled speech audio. When using the model,
make sure that your speech input is also sampled at 16 kHz.
[Paper](https://arxiv.org/abs/2202.03555)
Authors: Alexei Baevski, Wei-Ning Hsu, Qiantong Xu, Arun Babu, Jiatao Gu, Michael Auli
**Abstract**
While the general idea of self-supervised learning is identical across modalities, the actual algorithms and objectives differ widely because they were developed with a single modality in mind. To get us closer to general self-supervised learning, we present data2vec, a framework that uses the same learning method for either speech, NLP or computer vision. The core idea is to predict latent representations of the full input data based on a masked view of the input in a self-distillation setup using a standard Transformer architecture. Instead of predicting modality-specific targets such as words, visual tokens or units of human speech which are local in nature, data2vec predicts contextualized latent representations that contain information from the entire input. Experiments on the major benchmarks of speech recognition, image classification, and natural language understanding demonstrate a new state of the art or competitive performance to predominant approaches.
The original model can be found under https://github.com/pytorch/fairseq/tree/main/examples/data2vec .
# Pre-Training method

For more information, please take a look at the [official paper](https://arxiv.org/abs/2202.03555).
# Usage
To transcribe audio files, the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Data2VecAudioForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/data2vec-audio-base-100h")
model = Data2VecAudioForCTC.from_pretrained("facebook/data2vec-audio-base-100h")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# tokenize
input_values = processor(ds[0]["audio"]["array"], sampling_rate=16_000, return_tensors="pt", padding="longest").input_values  # Batch size 1
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
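If your audio is not already sampled at 16 kHz, resample it before calling the processor. A minimal sketch using torchaudio (the file name is hypothetical; any resampler works):
```python
import torchaudio

waveform, sample_rate = torchaudio.load("speech.flac")  # hypothetical file
if sample_rate != 16_000:
    # the model expects 16 kHz input
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)
input_values = processor(
    waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt"
).input_values
```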
| 2,776 | [
[
-0.01296234130859375,
-0.05767822265625,
0.0105133056640625,
0.0147552490234375,
-0.00940704345703125,
-0.0207977294921875,
-0.0163726806640625,
-0.037109375,
-0.00362396240234375,
0.023834228515625,
-0.058349609375,
-0.04278564453125,
-0.033294677734375,
-0.0130767822265625,
-0.031585693359375,
0.059967041015625,
0.016204833984375,
0.0177764892578125,
-0.004913330078125,
-0.0114288330078125,
-0.057525634765625,
-0.03729248046875,
-0.051727294921875,
-0.017486572265625,
0.00963592529296875,
0.0191497802734375,
0.0197296142578125,
0.039642333984375,
0.0222320556640625,
0.021148681640625,
-0.0014524459838867188,
0.0032367706298828125,
-0.04046630859375,
0.0037441253662109375,
-0.004817962646484375,
-0.0237884521484375,
-0.0257415771484375,
0.029510498046875,
0.04278564453125,
0.04742431640625,
-0.01177978515625,
0.0204620361328125,
0.009979248046875,
0.035369873046875,
-0.0306396484375,
0.0285491943359375,
-0.045562744140625,
-0.004276275634765625,
-0.0018167495727539062,
-0.01371002197265625,
-0.03399658203125,
0.007389068603515625,
0.00420379638671875,
-0.036224365234375,
0.024139404296875,
-0.019287109375,
0.047027587890625,
0.035125732421875,
-0.0301971435546875,
-0.0300750732421875,
-0.07470703125,
0.0623779296875,
-0.03131103515625,
0.06689453125,
0.04815673828125,
0.0240020751953125,
0.0010766983032226562,
-0.06707763671875,
-0.032684326171875,
0.0007643699645996094,
0.017181396484375,
0.0321044921875,
-0.0255584716796875,
0.0009293556213378906,
0.01318359375,
0.01541900634765625,
-0.05975341796875,
0.00640106201171875,
-0.0556640625,
-0.0295562744140625,
0.0450439453125,
-0.01519775390625,
-0.01458740234375,
-0.01605224609375,
-0.031646728515625,
-0.022552490234375,
-0.02947998046875,
0.0235595703125,
0.02117919921875,
0.023895263671875,
-0.0265655517578125,
0.022918701171875,
-0.00919342041015625,
0.05621337890625,
0.0132293701171875,
-0.0192718505859375,
0.0484619140625,
-0.037994384765625,
-0.0032939910888671875,
0.022613525390625,
0.054534912109375,
0.01326751708984375,
0.01151275634765625,
-0.0032329559326171875,
-0.0102996826171875,
0.00902557373046875,
0.002651214599609375,
-0.05181884765625,
-0.033905029296875,
0.0259246826171875,
-0.0288238525390625,
-0.01239013671875,
0.00485992431640625,
-0.035186767578125,
0.0033168792724609375,
-0.04486083984375,
0.0556640625,
-0.0204620361328125,
-0.0174713134765625,
0.0037746429443359375,
0.01020050048828125,
0.016998291015625,
-0.0016450881958007812,
-0.07098388671875,
0.0251922607421875,
0.03887939453125,
0.07025146484375,
-0.016571044921875,
-0.006977081298828125,
-0.044097900390625,
-0.007472991943359375,
-0.034423828125,
0.04364013671875,
-0.0215301513671875,
-0.03668212890625,
-0.0183563232421875,
0.001956939697265625,
0.01189422607421875,
-0.040069580078125,
0.0438232421875,
-0.022796630859375,
0.034942626953125,
-0.0032329559326171875,
-0.04168701171875,
-0.01087188720703125,
-0.042510986328125,
-0.049713134765625,
0.0916748046875,
0.00902557373046875,
-0.047607421875,
0.0282440185546875,
-0.027557373046875,
-0.04248046875,
-0.0157928466796875,
-0.0221405029296875,
-0.03204345703125,
-0.00887298583984375,
0.017333984375,
0.04559326171875,
-0.01471710205078125,
0.0236358642578125,
-0.0215301513671875,
-0.02947998046875,
0.0153350830078125,
-0.049774169921875,
0.06390380859375,
0.036865234375,
-0.035736083984375,
-0.003704071044921875,
-0.066162109375,
0.0024204254150390625,
-0.0038776397705078125,
-0.029388427734375,
0.0013895034790039062,
-0.0072784423828125,
0.018310546875,
0.02734375,
0.004436492919921875,
-0.03900146484375,
-0.00855255126953125,
-0.047607421875,
0.0501708984375,
0.05548095703125,
-0.01163482666015625,
0.042633056640625,
-0.00954437255859375,
0.0025005340576171875,
-0.017822265625,
0.0009212493896484375,
-0.002849578857421875,
-0.0262451171875,
-0.030792236328125,
-0.036407470703125,
0.033538818359375,
0.041412353515625,
-0.0301513671875,
0.05340576171875,
-0.01497650146484375,
-0.033203125,
-0.056488037109375,
0.0119171142578125,
0.019317626953125,
0.057220458984375,
0.054962158203125,
-0.0196075439453125,
-0.0577392578125,
-0.052642822265625,
-0.00015032291412353516,
-0.0196533203125,
-0.0208282470703125,
0.039581298828125,
0.01511383056640625,
-0.0347900390625,
0.07940673828125,
-0.00884246826171875,
-0.043853759765625,
-0.01181793212890625,
0.012603759765625,
0.033447265625,
0.050537109375,
0.040283203125,
-0.06103515625,
-0.02386474609375,
-0.0302276611328125,
-0.051422119140625,
-0.0178680419921875,
-0.00890350341796875,
0.000766754150390625,
0.0247650146484375,
0.042633056640625,
-0.01186370849609375,
0.03558349609375,
0.0511474609375,
-0.0166473388671875,
0.01306915283203125,
-0.006572723388671875,
0.0038242340087890625,
-0.081298828125,
-0.0003685951232910156,
-0.0014553070068359375,
-0.028228759765625,
-0.048309326171875,
-0.06365966796875,
-0.004894256591796875,
0.002071380615234375,
-0.0312347412109375,
0.03985595703125,
-0.025115966796875,
-0.0184478759765625,
-0.0298919677734375,
-0.00725555419921875,
-0.0086212158203125,
0.03985595703125,
0.023834228515625,
0.060211181640625,
0.058685302734375,
-0.045928955078125,
0.05072021484375,
0.0175628662109375,
-0.0217437744140625,
0.01959228515625,
-0.062744140625,
0.0248565673828125,
0.00429534912109375,
0.03204345703125,
-0.09600830078125,
-0.0100860595703125,
0.008087158203125,
-0.06390380859375,
0.034149169921875,
-0.017822265625,
-0.0171051025390625,
-0.0265655517578125,
0.003986358642578125,
0.044403076171875,
0.05609130859375,
-0.0592041015625,
0.0506591796875,
0.039154052734375,
-0.0011205673217773438,
-0.038543701171875,
-0.08221435546875,
-0.0164337158203125,
-0.0006570816040039062,
-0.05792236328125,
0.0292510986328125,
-0.006988525390625,
-0.0005254745483398438,
-0.0067291259765625,
-0.030609130859375,
-0.01143646240234375,
-0.01358795166015625,
0.041839599609375,
0.0228424072265625,
-0.003086090087890625,
0.029510498046875,
-0.00756072998046875,
-0.0237274169921875,
0.0086822509765625,
-0.0458984375,
0.0347900390625,
-0.00551605224609375,
-0.0212860107421875,
-0.06121826171875,
0.013458251953125,
0.008514404296875,
-0.01303863525390625,
0.01458740234375,
0.0582275390625,
-0.0261993408203125,
-0.0162200927734375,
-0.05194091796875,
-0.03094482421875,
-0.03729248046875,
0.0478515625,
-0.03314208984375,
-0.05804443359375,
0.026275634765625,
0.00638580322265625,
-0.0038051605224609375,
0.050537109375,
0.0660400390625,
-0.0298919677734375,
0.063720703125,
0.034881591796875,
-0.0151214599609375,
0.05096435546875,
-0.05340576171875,
-0.004550933837890625,
-0.038665771484375,
-0.031463623046875,
-0.0167236328125,
-0.0168609619140625,
-0.03424072265625,
-0.051513671875,
0.0177154541015625,
-0.01043701171875,
0.006805419921875,
0.0256805419921875,
-0.045440673828125,
0.019561767578125,
0.04229736328125,
0.00302886962890625,
0.0013532638549804688,
0.022705078125,
0.0009212493896484375,
-0.0017232894897460938,
-0.048248291015625,
-0.0114288330078125,
0.1021728515625,
0.042236328125,
0.07708740234375,
-0.0216522216796875,
0.047332763671875,
0.038055419921875,
-0.03118896484375,
-0.06201171875,
0.012603759765625,
-0.01287841796875,
-0.0648193359375,
-0.022064208984375,
-0.03436279296875,
-0.06927490234375,
0.00797271728515625,
-0.00930023193359375,
-0.048309326171875,
0.032806396484375,
0.01096343994140625,
-0.026397705078125,
0.00667572021484375,
-0.050079345703125,
0.056121826171875,
-0.025909423828125,
-0.0304412841796875,
-0.0210723876953125,
-0.057220458984375,
-0.00598907470703125,
-0.006198883056640625,
0.0230255126953125,
-0.0112762451171875,
0.023284912109375,
0.0855712890625,
-0.006870269775390625,
0.055084228515625,
-0.030242919921875,
0.0023555755615234375,
0.0494384765625,
-0.004703521728515625,
0.018707275390625,
0.0111846923828125,
-7.748603820800781e-7,
0.03277587890625,
0.010955810546875,
-0.022064208984375,
-0.0275726318359375,
0.0484619140625,
-0.07366943359375,
-0.00861358642578125,
-0.014739990234375,
-0.0225677490234375,
-0.018463134765625,
-0.0087432861328125,
0.047027587890625,
0.0675048828125,
-0.0187225341796875,
0.041351318359375,
0.049652099609375,
0.0157623291015625,
0.020233154296875,
0.0275421142578125,
0.005435943603515625,
-0.029266357421875,
0.07452392578125,
0.023956298828125,
0.0094757080078125,
0.0145111083984375,
0.005859375,
-0.04974365234375,
-0.0299530029296875,
-0.005718231201171875,
0.01294708251953125,
-0.057342529296875,
-0.00677490234375,
-0.054840087890625,
-0.0214996337890625,
-0.06304931640625,
0.01068115234375,
-0.0662841796875,
-0.03424072265625,
-0.042144775390625,
-0.0111846923828125,
0.02166748046875,
0.047210693359375,
-0.04974365234375,
0.0200347900390625,
-0.0216522216796875,
0.043487548828125,
0.040496826171875,
0.01181793212890625,
-0.034423828125,
-0.0870361328125,
-0.017242431640625,
0.023834228515625,
0.00934600830078125,
-0.056365966796875,
0.02294921875,
0.02191162109375,
0.05206298828125,
0.0230255126953125,
-0.0106201171875,
0.0300445556640625,
-0.0222930908203125,
0.044586181640625,
0.01763916015625,
-0.06866455078125,
0.056793212890625,
-0.0012683868408203125,
0.004878997802734375,
0.047119140625,
0.02508544921875,
-0.04278564453125,
0.0144805908203125,
-0.0382080078125,
-0.0645751953125,
0.068115234375,
0.0222320556640625,
0.00720977783203125,
0.024688720703125,
0.03448486328125,
0.00870513916015625,
-0.006832122802734375,
-0.051971435546875,
-0.031646728515625,
-0.04901123046875,
-0.023956298828125,
-0.0204620361328125,
-0.026885986328125,
-0.004669189453125,
-0.039794921875,
0.06158447265625,
-0.005275726318359375,
0.03936767578125,
0.0311126708984375,
-0.00820159912109375,
0.00864410400390625,
-0.0035495758056640625,
0.01910400390625,
0.0251922607421875,
-0.02178955078125,
0.0083770751953125,
0.0189208984375,
-0.041229248046875,
0.00836181640625,
0.0138702392578125,
0.01499176025390625,
0.005680084228515625,
0.04718017578125,
0.0823974609375,
-0.0037555694580078125,
-0.025604248046875,
0.05242919921875,
-0.01406097412109375,
-0.0455322265625,
-0.038055419921875,
0.0037689208984375,
0.0040435791015625,
0.0247039794921875,
0.034271240234375,
-0.0034351348876953125,
0.0209503173828125,
-0.033905029296875,
0.0330810546875,
0.01371002197265625,
-0.054962158203125,
-0.029327392578125,
0.060211181640625,
0.0206146240234375,
0.0019168853759765625,
0.054443359375,
-0.0044097900390625,
-0.018798828125,
0.040985107421875,
0.0310821533203125,
0.0643310546875,
-0.02960205078125,
-0.005954742431640625,
0.042083740234375,
0.0076141357421875,
-0.0025482177734375,
0.0234832763671875,
-0.0253448486328125,
-0.044769287109375,
-0.0302734375,
-0.04278564453125,
-0.005817413330078125,
0.016571044921875,
-0.042999267578125,
0.0249176025390625,
-0.02618408203125,
-0.0180816650390625,
0.00823211669921875,
0.009002685546875,
-0.062744140625,
0.025177001953125,
0.03289794921875,
0.04669189453125,
-0.0797119140625,
0.08587646484375,
0.018524169921875,
-0.017852783203125,
-0.083740234375,
-0.021820068359375,
0.00823211669921875,
-0.050994873046875,
0.05242919921875,
0.01194000244140625,
-0.0252685546875,
0.004398345947265625,
-0.049285888671875,
-0.07293701171875,
0.0872802734375,
0.018524169921875,
-0.047149658203125,
0.0028781890869140625,
0.0081634521484375,
0.03521728515625,
-0.035797119140625,
0.01203155517578125,
0.051361083984375,
0.0261383056640625,
0.030792236328125,
-0.07281494140625,
-0.0247039794921875,
-0.0151824951171875,
-0.0206451416015625,
-0.040863037109375,
-0.031707763671875,
0.06787109375,
-0.0147705078125,
-0.0207061767578125,
-0.016693115234375,
0.06884765625,
0.0015716552734375,
0.0239105224609375,
0.047515869140625,
0.0283203125,
0.061737060546875,
-0.012542724609375,
0.057220458984375,
-0.00958251953125,
0.041656494140625,
0.09521484375,
0.0040435791015625,
0.049285888671875,
0.0208282470703125,
-0.0213165283203125,
0.017822265625,
0.0521240234375,
-0.021759033203125,
0.064453125,
-0.00008791685104370117,
-0.003620147705078125,
-0.0304412841796875,
-0.0011873245239257812,
-0.049713134765625,
0.0731201171875,
0.01401519775390625,
-0.014312744140625,
0.00850677490234375,
0.01378631591796875,
-0.0242156982421875,
-0.00946044921875,
-0.00528717041015625,
0.0654296875,
0.004352569580078125,
-0.0150909423828125,
0.06103515625,
-0.00701141357421875,
0.060791015625,
-0.0469970703125,
0.006298065185546875,
0.002948760986328125,
0.01163482666015625,
-0.02252197265625,
-0.04901123046875,
0.01422882080078125,
-0.0055084228515625,
-0.0266876220703125,
-0.0095367431640625,
0.05108642578125,
-0.047637939453125,
-0.039276123046875,
0.044921875,
0.024505615234375,
0.0206756591796875,
-0.0132293701171875,
-0.054046630859375,
0.01361083984375,
0.0088348388671875,
-0.026123046875,
-0.0033893585205078125,
0.019683837890625,
0.0202178955078125,
0.035308837890625,
0.048736572265625,
-0.00024962425231933594,
0.00919342041015625,
0.0047760009765625,
0.062164306640625,
-0.036224365234375,
-0.048736572265625,
-0.039306640625,
0.019317626953125,
0.01358795166015625,
-0.01259613037109375,
0.034637451171875,
0.059661865234375,
0.07928466796875,
0.00679779052734375,
0.060089111328125,
0.004878997802734375,
0.063232421875,
-0.04736328125,
0.03631591796875,
-0.050689697265625,
0.0073699951171875,
-0.00823211669921875,
-0.0615234375,
-0.0009775161743164062,
0.07086181640625,
-0.0197906494140625,
0.030364990234375,
0.03472900390625,
0.06817626953125,
-0.009552001953125,
-0.0012378692626953125,
0.0228424072265625,
0.019927978515625,
0.0246124267578125,
0.036865234375,
0.049407958984375,
-0.041778564453125,
0.053680419921875,
-0.040802001953125,
-0.0174560546875,
-0.01535797119140625,
-0.0229949951171875,
-0.06976318359375,
-0.06719970703125,
-0.025360107421875,
-0.027679443359375,
0.006214141845703125,
0.0777587890625,
0.07708740234375,
-0.07037353515625,
-0.0231170654296875,
0.004833221435546875,
-0.0164794921875,
-0.01024627685546875,
-0.01441192626953125,
0.032958984375,
-0.00782012939453125,
-0.060089111328125,
0.057373046875,
-0.0005207061767578125,
0.01263427734375,
-0.01129913330078125,
-0.0176544189453125,
-0.012939453125,
-0.0035686492919921875,
0.0229339599609375,
0.0194244384765625,
-0.0557861328125,
-0.01342010498046875,
-0.0028438568115234375,
-0.0218048095703125,
0.0200042724609375,
0.0418701171875,
-0.07794189453125,
0.058624267578125,
0.04205322265625,
0.032562255859375,
0.0706787109375,
-0.001499176025390625,
0.0165863037109375,
-0.055816650390625,
0.0286712646484375,
0.0247955322265625,
0.0162506103515625,
0.02447509765625,
-0.0224609375,
0.039459228515625,
0.038330078125,
-0.040771484375,
-0.04571533203125,
-0.0022602081298828125,
-0.09600830078125,
-0.030242919921875,
0.109375,
0.011962890625,
-0.01099395751953125,
-0.0010347366333007812,
-0.0225677490234375,
0.04913330078125,
-0.038360595703125,
0.028350830078125,
0.02618408203125,
-0.004314422607421875,
0.0209197998046875,
-0.0489501953125,
0.04315185546875,
0.0282745361328125,
-0.0245208740234375,
0.0166473388671875,
0.042510986328125,
0.03570556640625,
-0.00553131103515625,
0.06158447265625,
-0.0069580078125,
0.0220489501953125,
0.0090179443359375,
0.016571044921875,
-0.018157958984375,
-0.0452880859375,
-0.050384521484375,
0.0022678375244140625,
-0.0262451171875,
-0.0298004150390625
]
] |
laion/CLIP-convnext_xxlarge-laion2B-s34B-b82K-augreg-soup | 2023-04-18T17:45:00.000Z | [
"open_clip",
"zero-shot-image-classification",
"clip",
"arxiv:2210.08402",
"arxiv:1910.04867",
"license:mit",
"has_space",
"region:us"
] | zero-shot-image-classification | laion | null | null | laion/CLIP-convnext_xxlarge-laion2B-s34B-b82K-augreg-soup | 16 | 11,722 | open_clip | 2023-02-26T20:23:04 | ---
tags:
- zero-shot-image-classification
- clip
license: mit
library_name: open_clip
pipeline_tag: zero-shot-image-classification
---
# Model card for CLIP-convnext_xxlarge-laion2B-s34B-b82K-augreg-soup
# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Training Details](#training-details)
4. [Evaluation](#evaluation)
5. [Acknowledgements](#acknowledgements)
6. [Citation](#citation)
# Model Details
## Model Description
A series of CLIP ConvNeXt-XXLarge (a custom `timm` ConvNeXt size) models trained on LAION-2B (english), a subset of [LAION-5B](https://arxiv.org/abs/2210.08402), using [OpenCLIP](https://github.com/mlfoundations/open_clip).
| Model | Dataset | Resolution | AugReg | Top-1 ImageNet Zero-Shot (%) |
| ----- | ------- | ---------- | ------------ | --------- |
| [convnext_xxlarge.laion2b_s34b_b82k-augreg](https://huggingface.co/laion/CLIP-convnext_xxlarge-laion2B-s34B-b82K-augreg) | LAION-2B | 256x256 | RRC (0.33, 1.0), RE (0.35), SD (0.1) | 79.1 |
| [convnext_xxlarge.laion2b_s34b_b82k-augreg-rewind](https://huggingface.co/laion/CLIP-convnext_xxlarge-laion2B-s34B-b82K-augreg-rewind) | LAION-2B | 256x256 | RRC (0.3, 1.0), RE (0.4), SD (0.1) | 79.3 |
| [convnext_xxlarge.laion2b_s34b_b82k-augreg-soup](https://huggingface.co/laion/CLIP-convnext_xxlarge-laion2B-s34B-b82K-augreg-soup) | LAION-2B | 256x256 | N/A | 79.4 |
RRC = Random Resize Crop (crop pcts), RE = Random Erasing (prob), SD = Stochastic Depth (prob) -- image tower only
The core training run was performed in pieces over a period of ~2 months. The global batch size for the core run was 81920. The last ~10% of training was re-done at a 95744 global batch size w/ a higher LR and stronger aug than the original finish. The two were averaged together in a 'soup'. See more details in [Training Details](#training-details).
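For intuition, the 'soup' step is a plain parameter average of the two checkpoints, following the model soups recipe of Wortsman et al. (cited below). A minimal sketch — file names are hypothetical, and it assumes raw floating-point state dicts:
```python
import torch

sd_a = torch.load("xxlarge-augreg.pt", map_location="cpu")
sd_b = torch.load("xxlarge-augreg-rewind.pt", map_location="cpu")
# uniform soup: element-wise mean of corresponding parameters
soup = {k: (sd_a[k] + sd_b[k]) / 2 for k in sd_a}
torch.save(soup, "xxlarge-augreg-soup.pt")
```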
Goals:
* Push the size of largest convolutional CLIP image tower into the performance range of ViT-g to ViT-G w/ improved image size scaling for downstream use.
Firsts:
* Largest released ConvNeXt model pretrained (847M params w/ 198 GMAC and 125 MActs @ 256x256 for image)
* A non-ViT image tower CLIP model (with no previous image tower pretrain) achieving > 79% ImageNet top-1 zero-shot
The models utilize:
* the [timm](https://github.com/rwightman/pytorch-image-models) ConvNeXt-XXLarge model (`convnext_xxlarge`) as the image tower
* a standard projection at end of image tower
* a text tower of the same size (width 1024, 16 heads, depth 24) as the ViT-H-14 and ViT-g-14 models
The models are trained at 256x256 image resolution. The size of the combined image + text CLIP model is 1.2B params w/ 222 GMAC and 146 MActs. At 256x256, the ConvNext-XXLarge sits just above a ViT-H-14 CLIP configuration in FLOPS and params while being lower in activation counts. It is well under both g-14 and G-14 while being between them in capabilities.
|model |image_size|embed_dim|gmacs |macts |mparams|image_gmacs|image_macts|image_mparams|text_gmacs|text_macts|text_mparams|
|--------------------------|----------|---------|------|------|-------|-----------|-----------|-------------|----------|----------|------------|
|ViT-H-16 |224 |1024 |150.96|122.01|986.26 |127.4 |100.81 |632.23 |23.57 |21.2 |354.03 |
|ViT-H-14 |224 |1024 |190.97|160.61|986.11 |167.4 |139.41 |632.08 |23.57 |21.2 |354.03 |
|ViT-L-14-336 |336 |768 |197.76|278.19|427.94 |191.1 |270.24 |304.29 |6.66 |7.95 |123.65 |
|convnext_xxlarge |256 |1024 |221.66|145.66|1200.58|198.09 |124.45 |846.54 |23.57 |21.2 |354.03 |
|RN50x64 |448 |1024 |276.8 |249.73|623.26 |265.02 |239.13 |420.38 |11.78 |10.6 |202.88 |
|ViT-g-14 |224 |1024 |290.74|213.84|1366.68|267.18 |192.64 |1012.65 |23.57 |21.2 |354.03 |
|convnext_xxlarge_320 |320 |1024 |333.08|215.66|1200.58|309.52 |194.46 |846.54 |23.57 |21.2 |354.03 |
|ViT-H-14-336 |336 |1024 |414.53|428.74|986.52 |390.97 |407.54 |632.49 |23.57 |21.2 |354.03 |
|ViT-bigG-14 |224 |1280 |532.92|310.71|2539.57|483.96 |275.37 |1844.91 |48.96 |35.34 |694.66 |
Model training done by Ross Wightman across both the [stability.ai](https://stability.ai/) cluster and the [JUWELS Booster](https://apps.fz-juelich.de/jsc/hps/juwels/booster-overview.html) supercomputer. See acknowledgements below.
# Uses
As per the original [OpenAI CLIP model card](https://github.com/openai/CLIP/blob/d50d76daa670286dd6cacf3bcd80b5e4823fc8e1/model-card.md), this model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, arbitrary image classification. We also hope it can be used for interdisciplinary studies of the potential impact of such models.
The OpenAI CLIP paper includes a discussion of potential downstream impacts to provide an example for this sort of analysis. Additionally, the LAION-5B blog (https://laion.ai/blog/laion-5b/) and upcoming paper include additional discussion as it relates specifically to the training dataset.
## Direct Use
Zero-shot image classification, image and text retrieval, among others.
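A minimal zero-shot classification sketch with OpenCLIP (the image path and class prompts are illustrative; the `hf-hub:` prefix loads this checkpoint from the Hub):
```python
import torch
from PIL import Image
import open_clip

name = "hf-hub:laion/CLIP-convnext_xxlarge-laion2B-s34B-b82K-augreg-soup"
model, _, preprocess = open_clip.create_model_and_transforms(name)
tokenizer = open_clip.get_tokenizer(name)

image = preprocess(Image.open("cat.png")).unsqueeze(0)  # hypothetical image
text = tokenizer(["a photo of a cat", "a photo of a dog"])

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)
print(probs)  # per-prompt probabilities for the image
```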
## Downstream Use
Image classification and other image task fine-tuning, linear probe image classification, image generation guiding and conditioning, among others.
## Out-of-Scope Use
As per the OpenAI models,
**Any** deployed use case of the model - whether commercial or not - is currently out of scope. Non-deployed use cases, such as image search in a constrained environment, are also not recommended unless there is thorough in-domain testing of the model with a specific, fixed class taxonomy. This is because our safety assessment demonstrated a high need for task-specific testing, especially given the variability of CLIP’s performance with different class taxonomies. This makes untested and unconstrained deployment of the model in any use case currently potentially harmful.
Certain use cases which would fall under the domain of surveillance and facial recognition are always out-of-scope regardless of performance of the model. This is because the use of artificial intelligence for tasks such as these can be premature currently given the lack of testing norms and checks to ensure its fair use.
Since the model has not been purposefully trained in or evaluated on any languages other than English, its use should be limited to English language use cases.
Further to the above notice, the LAION-5B dataset used in training these models has additional considerations; see below.
# Training Details
## Training Data
This model was trained with LAION-2B -- A 2 billion sample English subset of LAION-5B (https://laion.ai/blog/laion-5b/).
**IMPORTANT NOTE:** The motivation behind dataset creation is to democratize research and experimentation around large-scale multi-modal model training and handling of uncurated, large-scale datasets crawled from the publicly available internet. Our recommendation is therefore to use the dataset for research purposes. Be aware that this large-scale dataset is uncurated. Keep in mind that the uncurated nature of the dataset means that collected links may lead to strongly discomforting and disturbing content for a human viewer. Therefore, please use the demo links with caution and at your own risk. It is possible to extract a “safe” subset by filtering out samples based on the safety tags (using a custom-trained NSFW classifier that we built). While this strongly reduces the chance of encountering potentially harmful content when viewing, we cannot entirely exclude the possibility of harmful content still being present in safe mode, so the warning holds there as well. We think that providing the dataset openly to broad research and other interested communities will allow for transparent investigation of benefits that come along with training large-scale models as well as pitfalls and dangers that may stay unreported or unnoticed when working with closed large datasets that remain restricted to a small community. While we provide our dataset openly, we do not recommend using it for creating ready-to-go industrial products, as the basic research about general properties and safety of such large-scale models, which we would like to encourage with this release, is still in progress.
## Training Procedure
The main training run was done at a global batch size of 81920 for 256 checkpoint intervals of 135.6M samples each, for a total of ~34B samples seen over training.
Many difficulties w/ both model numerical stability and cluster stability and performance were encountered while training this model. Initial attempts to train with float16 AMP and the default adam beta2 resulted in loss spikes and eventually NaN blow-ups. `beta2` was reduced to 0.97, which helped, but the loss / zero-shot curves were not tracking as expected. After switching to PyTorch nightlies, it was possible to use bfloat16 + AMP for training (as with recent H/14, g/14, and G/14 models); beta2 was returned to 0.98 and metrics improved.
|Checkpoint Interval |Cluster |# GPUs|# Nodes|GPU |local BS|sample/s|sample/s/gpu|precision |adam beta2 |
|--------------------|----------|------|-------|----------|--------|--------|------------|----------|-----------|
|1 - 2 |Stability |1024 |128 |A100 40GB | 80 |37-40k | 36-39 |amp + fp16|0.97 |
|3 - 32 |Stability |512 |64 |A100 80GB | 160 |27-32k | 52-62 |amp + fp16|0.97 |
|33 - 75 |Booster |1024 |256 |A100 40GB | 80 |48k | 47 |amp + fp16|0.97 |
|76 - 165 |Booster |1024 |256 |A100 40GB | 80 |51k | 50 |amp + bf16|0.98 |
|166 - 232 |Stability |320 |40 |A100 80GB | 256 |18-19k | 56-59 |amp + bf16|0.98 |
|233 - 249 |Booster |1024 |256 |A100 40GB | 80 |51k | 50 |amp + bf16|0.98 |
|250 - 256 |Stability |1024 |128 |A100 40GB | 80 |27-31k | 26-30 |amp + bf16|0.98 |
JUWELS Booster has 4x A100 GPU per node w/ 4x HDR-200 IB adapters per node (200Gbit/sec per GPU). The Stability setup used 8x A100 GPU per node w/ 400Gbit/sec EFA networking per node (50 GBit/sec per GPU). Significant variation in training efficiency (throughput per GPU) was observed across the various configurations. The 1024 GPU configurations across both clusters were particularly prone to crashing (or very difficult to get running w/ a 'good' set of GPUs).
The slurm srun command line below is for a 128-node, 8-GPU-per-node (40GB A100) configuration:
```
srun --cpu_bind=v --accel-bind=gn python -m training.main \
--save-frequency 1 \
--name "xxlarge-2b-81920-bf16" \
--resume "latest" \
--logs "/runs" \
--log-every-n-steps 50 \
--train-data="pipe:aws s3 cp s3://laion5b/laion2B-data/{000000..231349}.tar -" \
--train-num-samples 135646078 \
--dataset-type webdataset \
--warmup 10000 \
--batch-size=80 \
--epochs=256 \
--dataset-resampled \
--aug-cfg use_timm=True scale='(0.33, 1.0)' re_prob=0.35 \
--precision amp_bfloat16 \
--grad-clip-norm 5.0 \
--lr 1e-3 \
--workers=6 \
--beta2 0.98 \
--model "convnext_xxlarge" \
--seed 0 \
--ddp-static-graph \
--local-loss \
--gather-with-grad \
--grad-checkpointing \
--report-to "tensorboard"
```
For the rewind of last 10%, a higher global batch size of 95744 was used w/ a higher LR and slightly increased augmentation strength.
|Checkpoint Interval |Cluster |# GPUs|# Nodes|GPU |local BS|sample/s|sample/s/gpu|precision |adam beta2 |
|--------------------|---------|------|-------|----------|--------|--------|------------|----------|-----------|
|231 - 256 |stability|1088 |136 |A100 40GB | 88 |32-35k | 29-32 |amp + bf16|0.98 |
The slurm srun command line for 136 8-GPU (40GB A100) nodes:
```
srun --cpu_bind=v --accel-bind=gn python -m training.main \
--save-frequency 1 \
--name "xxlarge-2b-81920-r-bf16" \
--resume "latest" \
--logs "/runs" \
--log-every-n-steps 50 \
--train-data="pipe:aws s3 cp s3://laion5b/laion2B-data/{000000..231349}.tar -" \
--train-num-samples 135646078 \
--dataset-type webdataset \
--warmup 10000 \
--batch-size=88 \
--epochs=256 \
--dataset-resampled \
--aug-cfg use_timm=True scale='(0.3, 1.0)' re_prob=0.4 \
--precision amp_bfloat16 \
--grad-clip-norm 5.0 \
--lr 2e-3 \
--workers=6 \
--beta2 0.98 \
--model "convnext_xxlarge" \
--seed 0 \
--ddp-static-graph \
--local-loss \
--gather-with-grad \
--grad-checkpointing \
--report-to "tensorboard"
```
# Evaluation
Evaluation done with code in the [LAION CLIP Benchmark suite](https://github.com/LAION-AI/CLIP_benchmark).
## Testing Data, Factors & Metrics
### Testing Data
Testing is performed with VTAB+ (a combination of VTAB (https://arxiv.org/abs/1910.04867) w/ additional robustness datasets) for classification, and with COCO and Flickr for retrieval.
## Results
These models achieve between 79.1 and 79.4 top-1 zero-shot accuracy on ImageNet-1k.

A zoom-in on final 10% w/ rewind:

An initial round of benchmarks have been performed on a wider range of datasets, to be viewable at https://github.com/LAION-AI/CLIP_benchmark/blob/main/benchmark/results.ipynb
# Acknowledgements
Acknowledging [stability.ai](https://stability.ai/) and the Gauss Centre for Supercomputing e.V. (http://gauss-centre.eu) for funding this part of work by providing computing time through the John von Neumann Institute for Computing (NIC) on the GCS Supercomputer JUWELS Booster at Jülich Supercomputing Centre (JSC).
# Citation
**BibTeX:**
LAION-5B
```bibtex
@inproceedings{schuhmann2022laionb,
title={{LAION}-5B: An open large-scale dataset for training next generation image-text models},
author={Christoph Schuhmann and
Romain Beaumont and
Richard Vencu and
Cade W Gordon and
Ross Wightman and
Mehdi Cherti and
Theo Coombes and
Aarush Katta and
Clayton Mullis and
Mitchell Wortsman and
Patrick Schramowski and
Srivatsa R Kundurthy and
Katherine Crowson and
Ludwig Schmidt and
Robert Kaczmarczyk and
Jenia Jitsev},
booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
year={2022},
url={https://openreview.net/forum?id=M3Y74vmsMcY}
}
```
OpenCLIP software
```bibtex
@software{ilharco_gabriel_2021_5143773,
author = {Ilharco, Gabriel and
Wortsman, Mitchell and
Wightman, Ross and
Gordon, Cade and
Carlini, Nicholas and
Taori, Rohan and
Dave, Achal and
Shankar, Vaishaal and
Namkoong, Hongseok and
Miller, John and
Hajishirzi, Hannaneh and
Farhadi, Ali and
Schmidt, Ludwig},
title = {OpenCLIP},
month = jul,
year = 2021,
note = {If you use this software, please cite it as below.},
publisher = {Zenodo},
version = {0.1},
doi = {10.5281/zenodo.5143773},
url = {https://doi.org/10.5281/zenodo.5143773}
}
```
OpenAI CLIP paper
```bibtex
@inproceedings{Radford2021LearningTV,
title={Learning Transferable Visual Models From Natural Language Supervision},
author={Alec Radford and Jong Wook Kim and Chris Hallacy and A. Ramesh and Gabriel Goh and Sandhini Agarwal and Girish Sastry and Amanda Askell and Pamela Mishkin and Jack Clark and Gretchen Krueger and Ilya Sutskever},
booktitle={ICML},
year={2021}
}
```
```bibtex
@Article{liu2022convnet,
author = {Zhuang Liu and Hanzi Mao and Chao-Yuan Wu and Christoph Feichtenhofer and Trevor Darrell and Saining Xie},
title = {A ConvNet for the 2020s},
journal = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year = {2022},
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/rwightman/pytorch-image-models}}
}
```
```bibtex
@InProceedings{pmlr-v162-wortsman22a,
title = {Model soups: averaging weights of multiple fine-tuned models improves accuracy without increasing inference time},
author = {Wortsman, Mitchell and Ilharco, Gabriel and Gadre, Samir Ya and Roelofs, Rebecca and Gontijo-Lopes, Raphael and Morcos, Ari S and Namkoong, Hongseok and Farhadi, Ali and Carmon, Yair and Kornblith, Simon and Schmidt, Ludwig},
booktitle = {Proceedings of the 39th International Conference on Machine Learning},
pages = {23965--23998},
year = {2022},
editor = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
volume = {162},
series = {Proceedings of Machine Learning Research},
month = {17--23 Jul},
publisher = {PMLR},
pdf = {https://proceedings.mlr.press/v162/wortsman22a/wortsman22a.pdf},
url = {https://proceedings.mlr.press/v162/wortsman22a.html}
}
``` | 17,900 | [
[
-0.047332763671875,
-0.03839111328125,
-0.00357818603515625,
0.0035457611083984375,
-0.0278167724609375,
-0.022918701171875,
-0.0127410888671875,
-0.032745361328125,
0.027069091796875,
0.020111083984375,
-0.03558349609375,
-0.031402587890625,
-0.05230712890625,
-0.007099151611328125,
-0.01849365234375,
0.06500244140625,
-0.0015611648559570312,
-0.0015134811401367188,
0.0039215087890625,
-0.0242156982421875,
-0.04364013671875,
-0.0286102294921875,
-0.053741455078125,
-0.0094451904296875,
0.01535797119140625,
0.0276031494140625,
0.05523681640625,
0.060455322265625,
0.04742431640625,
0.018646240234375,
-0.0190887451171875,
0.0039215087890625,
-0.046142578125,
-0.041595458984375,
0.018341064453125,
-0.022705078125,
-0.038177490234375,
0.00771331787109375,
0.051727294921875,
0.02178955078125,
-0.0014028549194335938,
0.0129241943359375,
0.01407623291015625,
0.041473388671875,
-0.050262451171875,
-0.00553131103515625,
-0.036041259765625,
0.0030670166015625,
-0.013214111328125,
-0.0038852691650390625,
-0.01125335693359375,
-0.006130218505859375,
0.0170440673828125,
-0.053131103515625,
0.026123046875,
-0.0063629150390625,
0.10748291015625,
0.01148223876953125,
-0.022308349609375,
0.01172637939453125,
-0.03564453125,
0.055877685546875,
-0.046722412109375,
0.01824951171875,
0.0236663818359375,
0.01386260986328125,
0.0022182464599609375,
-0.058502197265625,
-0.029541015625,
-0.006572723388671875,
-0.0045318603515625,
0.0257110595703125,
-0.023834228515625,
-0.004764556884765625,
0.036590576171875,
0.038238525390625,
-0.04461669921875,
0.0038089752197265625,
-0.052825927734375,
-0.007080078125,
0.058197021484375,
0.0010128021240234375,
0.0238189697265625,
-0.0299224853515625,
-0.05511474609375,
-0.023834228515625,
-0.049591064453125,
0.03521728515625,
0.0184173583984375,
-0.00787353515625,
-0.034637451171875,
0.0290679931640625,
-0.00041675567626953125,
0.03509521484375,
0.00982666015625,
-0.0140533447265625,
0.034149169921875,
-0.03216552734375,
-0.03387451171875,
-0.005161285400390625,
0.07757568359375,
0.043304443359375,
0.007190704345703125,
0.01702880859375,
-0.00605010986328125,
-0.005767822265625,
0.01617431640625,
-0.0865478515625,
-0.01186370849609375,
0.00727081298828125,
-0.051544189453125,
-0.025665283203125,
0.02178955078125,
-0.0494384765625,
0.017730712890625,
-0.009307861328125,
0.05181884765625,
-0.04296875,
-0.0214385986328125,
0.0006265640258789062,
-0.0171051025390625,
0.01824951171875,
0.0211639404296875,
-0.051239013671875,
0.01245880126953125,
0.0252685546875,
0.08026123046875,
-0.00714874267578125,
-0.0177001953125,
-0.009185791015625,
0.005893707275390625,
-0.0283050537109375,
0.045806884765625,
-0.00699615478515625,
-0.039642333984375,
-0.012359619140625,
0.0352783203125,
-0.016204833984375,
-0.032501220703125,
0.051666259765625,
-0.01396942138671875,
0.00860595703125,
-0.0178375244140625,
-0.0308074951171875,
-0.0286865234375,
0.0260467529296875,
-0.04718017578125,
0.073974609375,
0.00092315673828125,
-0.06976318359375,
0.0273590087890625,
-0.04229736328125,
-0.0013780593872070312,
-0.0129241943359375,
-0.0029430389404296875,
-0.053070068359375,
-0.01459503173828125,
0.0400390625,
0.038238525390625,
-0.0232696533203125,
0.01910400390625,
-0.0233612060546875,
-0.03607177734375,
0.00957489013671875,
-0.037628173828125,
0.0755615234375,
0.0086517333984375,
-0.0262451171875,
0.0082244873046875,
-0.053009033203125,
0.00004673004150390625,
0.0240631103515625,
-0.004337310791015625,
-0.00911712646484375,
-0.033843994140625,
-0.007053375244140625,
0.019439697265625,
0.00862884521484375,
-0.038299560546875,
0.011505126953125,
-0.012359619140625,
0.04315185546875,
0.049652099609375,
0.01629638671875,
0.026336669921875,
-0.041290283203125,
0.0430908203125,
0.006740570068359375,
0.0426025390625,
-0.02294921875,
-0.0364990234375,
-0.0550537109375,
-0.050506591796875,
0.0261077880859375,
0.033233642578125,
-0.036224365234375,
0.032958984375,
-0.01453399658203125,
-0.044891357421875,
-0.028564453125,
-0.017425537109375,
0.032012939453125,
0.0400390625,
0.0282745361328125,
-0.0288848876953125,
-0.042327880859375,
-0.080810546875,
0.015869140625,
0.00262451171875,
0.004795074462890625,
0.03948974609375,
0.060760498046875,
-0.00019121170043945312,
0.056060791015625,
-0.04559326171875,
-0.035369873046875,
-0.02081298828125,
-0.00476837158203125,
0.026092529296875,
0.03875732421875,
0.0648193359375,
-0.0657958984375,
-0.041534423828125,
-0.00817108154296875,
-0.070068359375,
0.0236663818359375,
-0.004436492919921875,
-0.01174163818359375,
-0.0063629150390625,
0.0242462158203125,
-0.0487060546875,
0.062744140625,
0.031280517578125,
-0.0003788471221923828,
0.0504150390625,
-0.019775390625,
0.01268768310546875,
-0.08013916015625,
0.0289459228515625,
0.00806427001953125,
-0.0125885009765625,
-0.042572021484375,
-0.00034427642822265625,
0.0032367706298828125,
-0.02105712890625,
-0.053558349609375,
0.04400634765625,
-0.039398193359375,
0.0016183853149414062,
-0.0098724365234375,
-0.0035266876220703125,
0.00574493408203125,
0.047149658203125,
0.0109405517578125,
0.08172607421875,
0.045806884765625,
-0.04541015625,
0.00305938720703125,
0.0248870849609375,
-0.035552978515625,
0.031585693359375,
-0.0699462890625,
-0.00791168212890625,
-0.00458526611328125,
0.0259246826171875,
-0.049957275390625,
-0.02813720703125,
0.0262298583984375,
-0.0465087890625,
0.033050537109375,
-0.0252838134765625,
-0.006847381591796875,
-0.048797607421875,
-0.0506591796875,
0.03631591796875,
0.046966552734375,
-0.038299560546875,
0.0143280029296875,
0.0261077880859375,
0.0242919921875,
-0.0618896484375,
-0.0533447265625,
-0.00988006591796875,
-0.0224151611328125,
-0.06329345703125,
0.04547119140625,
0.007251739501953125,
0.0111236572265625,
0.00934600830078125,
0.0031032562255859375,
-0.0100555419921875,
-0.001434326171875,
0.03729248046875,
0.032958984375,
-0.016204833984375,
-0.0135040283203125,
-0.0172576904296875,
-0.0011816024780273438,
-0.00974273681640625,
-0.01399993896484375,
0.0273895263671875,
-0.01207733154296875,
-0.0160064697265625,
-0.05450439453125,
0.0090484619140625,
0.043426513671875,
-0.0250396728515625,
0.06622314453125,
0.06134033203125,
-0.026336669921875,
0.01280975341796875,
-0.027557373046875,
-0.01214599609375,
-0.03338623046875,
0.02886962890625,
-0.0182037353515625,
-0.054107666015625,
0.043701171875,
0.00940704345703125,
0.000047326087951660156,
0.058013916015625,
0.03057861328125,
-0.006404876708984375,
0.06536865234375,
0.041534423828125,
-0.00568389892578125,
0.04248046875,
-0.08331298828125,
0.0015354156494140625,
-0.08367919921875,
-0.0275115966796875,
-0.0181121826171875,
-0.0311737060546875,
-0.0384521484375,
-0.028045654296875,
0.052490234375,
0.0222625732421875,
-0.0231475830078125,
0.033843994140625,
-0.0401611328125,
0.02197265625,
0.0439453125,
0.0273895263671875,
-0.0081024169921875,
0.0020160675048828125,
-0.006847381591796875,
-0.01000213623046875,
-0.0660400390625,
-0.0228271484375,
0.0870361328125,
0.03839111328125,
0.0465087890625,
-0.0089111328125,
0.03619384765625,
0.01210784912109375,
0.0029659271240234375,
-0.05267333984375,
0.035003662109375,
-0.022796630859375,
-0.055816650390625,
-0.0195465087890625,
-0.0210113525390625,
-0.07318115234375,
0.021087646484375,
-0.01488494873046875,
-0.056060791015625,
0.0286712646484375,
0.01332855224609375,
-0.033721923828125,
0.040985107421875,
-0.038818359375,
0.0782470703125,
-0.0164947509765625,
-0.036834716796875,
-0.005054473876953125,
-0.060943603515625,
0.0408935546875,
0.020965576171875,
0.005771636962890625,
-0.01085662841796875,
0.0121612548828125,
0.07098388671875,
-0.0638427734375,
0.052825927734375,
-0.0122833251953125,
0.023834228515625,
0.057159423828125,
-0.0144195556640625,
0.03399658203125,
0.01111602783203125,
0.00421905517578125,
0.0247039794921875,
0.00659942626953125,
-0.0243072509765625,
-0.02752685546875,
0.0498046875,
-0.0821533203125,
-0.039886474609375,
-0.034912109375,
-0.031524658203125,
0.024139404296875,
0.01149749755859375,
0.06292724609375,
0.05816650390625,
-0.0116729736328125,
0.03704833984375,
0.052032470703125,
-0.0142974853515625,
0.041015625,
0.0023670196533203125,
-0.00685882568359375,
-0.055816650390625,
0.0694580078125,
0.0198211669921875,
0.0274658203125,
0.01335906982421875,
0.01309967041015625,
-0.018402099609375,
-0.02935791015625,
-0.039276123046875,
0.016876220703125,
-0.03094482421875,
-0.028472900390625,
-0.04180908203125,
-0.038543701171875,
-0.038604736328125,
-0.01084136962890625,
-0.034912109375,
-0.0205230712890625,
-0.04400634765625,
0.0011911392211914062,
0.0273284912109375,
0.0350341796875,
-0.0127105712890625,
0.03216552734375,
-0.06451416015625,
0.01171112060546875,
0.012481689453125,
0.006397247314453125,
0.006786346435546875,
-0.061614990234375,
-0.0197906494140625,
0.010833740234375,
-0.046539306640625,
-0.0596923828125,
0.041473388671875,
0.0186920166015625,
0.034271240234375,
0.047607421875,
-0.00960540771484375,
0.062469482421875,
-0.0169830322265625,
0.0797119140625,
0.0304718017578125,
-0.049072265625,
0.041046142578125,
-0.033966064453125,
0.021820068359375,
0.03704833984375,
0.044830322265625,
-0.028656005859375,
0.00439453125,
-0.06927490234375,
-0.0589599609375,
0.06378173828125,
0.0230865478515625,
0.0004394054412841797,
0.01044464111328125,
0.0341796875,
-0.01024627685546875,
0.0115509033203125,
-0.062103271484375,
-0.022979736328125,
-0.0312347412109375,
0.0102996826171875,
0.00677490234375,
-0.03076171875,
-0.0012693405151367188,
-0.03558349609375,
0.06500244140625,
-0.0014448165893554688,
0.039947509765625,
0.0297698974609375,
-0.0018777847290039062,
-0.004779815673828125,
-0.0007829666137695312,
0.04913330078125,
0.041168212890625,
-0.032073974609375,
-0.010986328125,
0.01447296142578125,
-0.047821044921875,
0.003997802734375,
-0.00457763671875,
-0.0494384765625,
-0.0009174346923828125,
0.033721923828125,
0.08209228515625,
0.015472412109375,
-0.0243988037109375,
0.0682373046875,
-0.00676727294921875,
-0.0340576171875,
-0.0325927734375,
-0.00023365020751953125,
-0.0188140869140625,
0.0066070556640625,
0.0176544189453125,
0.0168609619140625,
0.00803375244140625,
-0.0369873046875,
0.015777587890625,
0.038421630859375,
-0.039093017578125,
-0.0318603515625,
0.058197021484375,
0.004802703857421875,
-0.004638671875,
0.042205810546875,
-0.00997161865234375,
-0.049896240234375,
0.06787109375,
0.03717041015625,
0.0611572265625,
-0.0190277099609375,
0.018829345703125,
0.0738525390625,
0.0233612060546875,
-0.0161285400390625,
0.01195526123046875,
0.0148468017578125,
-0.03314208984375,
-0.00726318359375,
-0.0287933349609375,
-0.002414703369140625,
0.03765869140625,
-0.06683349609375,
0.04541015625,
-0.037628173828125,
-0.02960205078125,
-0.0171356201171875,
-0.0003159046173095703,
-0.0487060546875,
0.026336669921875,
0.0025730133056640625,
0.084228515625,
-0.061981201171875,
0.05706787109375,
0.04925537109375,
-0.0509033203125,
-0.07720947265625,
-0.007476806640625,
-0.00848388671875,
-0.05047607421875,
0.036346435546875,
0.0364990234375,
0.0177459716796875,
-0.0177764892578125,
-0.05535888671875,
-0.0751953125,
0.1055908203125,
0.016326904296875,
-0.0286102294921875,
0.0023040771484375,
-0.00505828857421875,
0.0299224853515625,
-0.02166748046875,
0.036224365234375,
0.0258331298828125,
0.0134429931640625,
0.021148681640625,
-0.069091796875,
0.0018262863159179688,
-0.01465606689453125,
0.0145721435546875,
0.0194091796875,
-0.09185791015625,
0.083984375,
-0.019683837890625,
-0.0101470947265625,
0.0101776123046875,
0.054779052734375,
0.00970458984375,
0.0098114013671875,
0.0311279296875,
0.06561279296875,
0.03558349609375,
-0.0023040771484375,
0.08135986328125,
-0.021575927734375,
0.0330810546875,
0.06512451171875,
0.0180816650390625,
0.060150146484375,
0.028564453125,
-0.0167083740234375,
0.016448974609375,
0.056640625,
-0.0227203369140625,
0.04901123046875,
-0.01045989990234375,
-0.0060272216796875,
-0.01493072509765625,
-0.0294189453125,
-0.044586181640625,
0.0300750732421875,
0.013763427734375,
-0.03515625,
-0.005123138427734375,
0.01082611083984375,
0.008697509765625,
-0.023681640625,
-0.026336669921875,
0.030609130859375,
0.00677490234375,
-0.041046142578125,
0.052215576171875,
-0.0013751983642578125,
0.05279541015625,
-0.049896240234375,
0.00789642333984375,
-0.00749969482421875,
0.0211181640625,
-0.020111083984375,
-0.06768798828125,
0.0222625732421875,
-0.008209228515625,
-0.006504058837890625,
0.0012149810791015625,
0.058135986328125,
-0.01325225830078125,
-0.03070068359375,
0.0205841064453125,
0.0033931732177734375,
0.0179901123046875,
-0.004909515380859375,
-0.06170654296875,
0.005573272705078125,
0.010101318359375,
-0.0203399658203125,
0.02679443359375,
0.0211639404296875,
-0.004779815673828125,
0.0389404296875,
0.040130615234375,
-0.0031147003173828125,
0.0088958740234375,
-0.0088958740234375,
0.07989501953125,
-0.0296478271484375,
-0.0352783203125,
-0.0552978515625,
0.031280517578125,
-0.015899658203125,
-0.041778564453125,
0.05767822265625,
0.05072021484375,
0.06103515625,
-0.005016326904296875,
0.045562744140625,
-0.0159149169921875,
0.01556396484375,
-0.041656494140625,
0.04193115234375,
-0.06414794921875,
-0.003688812255859375,
-0.01727294921875,
-0.04876708984375,
-0.00972747802734375,
0.045745849609375,
-0.0282135009765625,
0.001094818115234375,
0.043243408203125,
0.05413818359375,
-0.0110015869140625,
0.002330780029296875,
0.0168914794921875,
0.006481170654296875,
0.0206146240234375,
0.049346923828125,
0.05078125,
-0.06939697265625,
0.056549072265625,
-0.052032470703125,
-0.013580322265625,
-0.02691650390625,
-0.051605224609375,
-0.0753173828125,
-0.038909912109375,
-0.03582763671875,
-0.022308349609375,
-0.002716064453125,
0.062347412109375,
0.07269287109375,
-0.04144287109375,
-0.017913818359375,
0.008819580078125,
-0.0109100341796875,
-0.01544952392578125,
-0.01324462890625,
0.048004150390625,
0.012908935546875,
-0.0421142578125,
0.006008148193359375,
0.016754150390625,
0.0232696533203125,
-0.007282257080078125,
-0.005435943603515625,
-0.0253753662109375,
-0.01442718505859375,
0.0283660888671875,
0.0347900390625,
-0.0467529296875,
-0.024871826171875,
0.006549835205078125,
-0.00234222412109375,
0.028778076171875,
0.046478271484375,
-0.0400390625,
0.0183563232421875,
0.034515380859375,
0.0296630859375,
0.058685302734375,
0.01219940185546875,
0.011505126953125,
-0.05047607421875,
0.03240966796875,
-0.003536224365234375,
0.02716064453125,
0.018402099609375,
-0.030364990234375,
0.0518798828125,
0.032012939453125,
-0.049224853515625,
-0.060791015625,
-0.01535797119140625,
-0.08563232421875,
-0.014434814453125,
0.08551025390625,
-0.028045654296875,
-0.043243408203125,
0.0298614501953125,
-0.026275634765625,
0.022796630859375,
-0.031463623046875,
0.0303955078125,
0.03558349609375,
-0.00196075439453125,
-0.025238037109375,
-0.057403564453125,
0.034698486328125,
0.01107025146484375,
-0.048309326171875,
-0.023834228515625,
0.018707275390625,
0.03314208984375,
0.0138397216796875,
0.04541015625,
-0.0216522216796875,
0.0288848876953125,
0.004261016845703125,
0.007083892822265625,
-0.018524169921875,
-0.0257110595703125,
-0.0264434814453125,
0.012115478515625,
-0.0165252685546875,
-0.048980712890625
]
] |
h2oai/h2ogpt-4096-llama2-13b-chat | 2023-08-24T18:35:40.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"facebook",
"meta",
"llama-2",
"h2ogpt",
"en",
"license:llama2",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | h2oai | null | null | h2oai/h2ogpt-4096-llama2-13b-chat | 10 | 11,707 | transformers | 2023-08-09T17:19:03 | ---
inference: false
language:
- en
license: llama2
model_type: llama
pipeline_tag: text-generation
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
- h2ogpt
---
h2oGPT clone of [Meta's Llama 2 13B Chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf).
Try it live on our [h2oGPT demo](https://gpt.h2o.ai) with side-by-side LLM comparisons and private document chat!
See how it compares to other models on our [LLM Leaderboard](https://evalgpt.ai/)!
See more at [H2O.ai](https://h2o.ai/)
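## Usage
The card does not include a loading snippet, so here is a minimal sketch (not from the original card) of loading the model with `transformers`. The plain-string prompt and generation settings are assumptions; consult the model's tokenizer configuration for the exact Llama 2 chat prompt format.
```python
# Minimal sketch, not from the original card: load the 13B chat model in float16.
# Assumes a CUDA GPU with enough memory; device_map="auto" shards across available devices.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "h2oai/h2ogpt-4096-llama2-13b-chat"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map="auto"
)

prompt = "Why is drinking water so healthy?"  # hypothetical example prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```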
## Model Architecture
```
LlamaForCausalLM(
(model): LlamaModel(
(embed_tokens): Embedding(32000, 5120, padding_idx=0)
(layers): ModuleList(
(0-39): 40 x LlamaDecoderLayer(
(self_attn): LlamaAttention(
(q_proj): Linear(in_features=5120, out_features=5120, bias=False)
(k_proj): Linear(in_features=5120, out_features=5120, bias=False)
(v_proj): Linear(in_features=5120, out_features=5120, bias=False)
(o_proj): Linear(in_features=5120, out_features=5120, bias=False)
(rotary_emb): LlamaRotaryEmbedding()
)
(mlp): LlamaMLP(
(gate_proj): Linear(in_features=5120, out_features=13824, bias=False)
(up_proj): Linear(in_features=5120, out_features=13824, bias=False)
(down_proj): Linear(in_features=13824, out_features=5120, bias=False)
(act_fn): SiLUActivation()
)
(input_layernorm): LlamaRMSNorm()
(post_attention_layernorm): LlamaRMSNorm()
)
)
(norm): LlamaRMSNorm()
)
(lm_head): Linear(in_features=5120, out_features=32000, bias=False)
)
``` | 1,615 | [
[
-0.01971435546875,
-0.053924560546875,
0.0357666015625,
0.040130615234375,
-0.027435302734375,
0.021087646484375,
0.0021991729736328125,
-0.0377197265625,
0.0350341796875,
0.01678466796875,
-0.034332275390625,
-0.045440673828125,
-0.049835205078125,
-0.0212860107421875,
-0.017730712890625,
0.056915283203125,
0.0033855438232421875,
-0.0111236572265625,
-0.0196075439453125,
-0.00942230224609375,
-0.029632568359375,
-0.0311431884765625,
-0.05029296875,
-0.044403076171875,
0.017059326171875,
0.01410675048828125,
0.05181884765625,
0.034759521484375,
0.04034423828125,
0.0187530517578125,
-0.0268707275390625,
0.01153564453125,
-0.0384521484375,
-0.01320648193359375,
-0.00046443939208984375,
-0.0174102783203125,
-0.052520751953125,
0.00865936279296875,
0.034820556640625,
0.0122222900390625,
-0.014801025390625,
0.042205810546875,
0.0281219482421875,
0.0220794677734375,
-0.03265380859375,
0.0228118896484375,
-0.042999267578125,
0.004535675048828125,
0.0028018951416015625,
-0.01447296142578125,
-0.0227508544921875,
0.0007605552673339844,
0.0039215087890625,
-0.0283660888671875,
-0.017791748046875,
0.0161285400390625,
0.08172607421875,
0.037109375,
-0.03326416015625,
-0.021484375,
-0.0364990234375,
0.053924560546875,
-0.08056640625,
0.01070404052734375,
0.0408935546875,
0.0265350341796875,
-0.0065155029296875,
-0.056427001953125,
-0.032958984375,
-0.0127716064453125,
-0.0019931793212890625,
0.024444580078125,
-0.033233642578125,
-0.0037384033203125,
0.00963592529296875,
0.0203094482421875,
-0.03857421875,
0.0208892822265625,
-0.0245208740234375,
-0.02587890625,
0.045166015625,
0.01035308837890625,
-0.0022640228271484375,
-0.027374267578125,
-0.047637939453125,
-0.00481414794921875,
-0.041412353515625,
0.004638671875,
0.032623291015625,
-0.01056671142578125,
-0.03289794921875,
0.0545654296875,
-0.016448974609375,
0.039031982421875,
0.0269622802734375,
-0.04705810546875,
0.03387451171875,
-0.02783203125,
-0.02264404296875,
-0.005947113037109375,
0.06903076171875,
0.052093505859375,
-0.008819580078125,
0.027557373046875,
-0.00847625732421875,
-0.0164337158203125,
0.0003209114074707031,
-0.071044921875,
-0.01361083984375,
0.0255584716796875,
-0.05023193359375,
-0.040313720703125,
-0.01181793212890625,
-0.053497314453125,
-0.0193634033203125,
0.0017032623291015625,
0.034027099609375,
-0.01262664794921875,
-0.0130157470703125,
0.002655029296875,
0.005916595458984375,
0.037994384765625,
0.0246124267578125,
-0.042266845703125,
0.022216796875,
0.055511474609375,
0.0787353515625,
-0.0211334228515625,
-0.00689697265625,
-0.005924224853515625,
0.00077056884765625,
-0.0007433891296386719,
0.06195068359375,
-0.026123046875,
-0.0241241455078125,
-0.0302886962890625,
-0.01200103759765625,
-0.00022935867309570312,
-0.041015625,
0.023223876953125,
-0.013641357421875,
0.029022216796875,
-0.0003688335418701172,
-0.035064697265625,
-0.02545166015625,
0.03533935546875,
-0.0270843505859375,
0.08636474609375,
0.00039386749267578125,
-0.05413818359375,
0.0121917724609375,
-0.039886474609375,
-0.006183624267578125,
-0.019195556640625,
-0.0193939208984375,
-0.044891357421875,
-0.0242156982421875,
0.014892578125,
0.036529541015625,
-0.0190582275390625,
0.005138397216796875,
-0.041015625,
-0.036163330078125,
0.0233917236328125,
-0.025909423828125,
0.05841064453125,
0.01442718505859375,
-0.03857421875,
0.01012420654296875,
-0.0594482421875,
0.0032672882080078125,
0.01085662841796875,
-0.037811279296875,
0.006641387939453125,
0.0016698837280273438,
-0.0113067626953125,
0.0247344970703125,
0.0237274169921875,
-0.0308074951171875,
0.0126953125,
-0.01001739501953125,
0.052001953125,
0.049835205078125,
0.016571044921875,
0.02142333984375,
-0.038299560546875,
0.037872314453125,
-0.00714874267578125,
0.0252685546875,
0.00774383544921875,
-0.065185546875,
-0.043914794921875,
-0.061370849609375,
0.0120086669921875,
0.0660400390625,
-0.03057861328125,
0.047088623046875,
-0.0081024169921875,
-0.046051025390625,
-0.048065185546875,
0.014495849609375,
0.0345458984375,
0.029815673828125,
0.025909423828125,
-0.0302581787109375,
-0.03424072265625,
-0.07806396484375,
0.0228729248046875,
-0.0240325927734375,
-0.01544952392578125,
0.0509033203125,
0.03875732421875,
-0.032501220703125,
0.0419921875,
-0.0276641845703125,
-0.020111083984375,
-0.0130157470703125,
-0.00620269775390625,
0.041290283203125,
0.0274658203125,
0.047393798828125,
-0.015533447265625,
-0.02801513671875,
-0.020599365234375,
-0.058563232421875,
-0.0185546875,
-0.00540924072265625,
-0.0296478271484375,
0.007457733154296875,
0.02294921875,
-0.0673828125,
0.056243896484375,
0.051116943359375,
-0.02447509765625,
0.0306854248046875,
-0.004085540771484375,
0.004688262939453125,
-0.09326171875,
0.0120086669921875,
-0.0300140380859375,
-0.0203094482421875,
-0.02166748046875,
0.007030487060546875,
-0.00460052490234375,
0.00775909423828125,
-0.04638671875,
0.041168212890625,
-0.032745361328125,
-0.026947021484375,
-0.021514892578125,
0.0034389495849609375,
-0.002727508544921875,
0.05078125,
-0.0088348388671875,
0.0528564453125,
0.0281219482421875,
-0.01534271240234375,
0.042572021484375,
0.035247802734375,
-0.037078857421875,
0.014892578125,
-0.0806884765625,
0.0177764892578125,
0.0179290771484375,
0.045166015625,
-0.09991455078125,
-0.0198516845703125,
0.0273284912109375,
-0.042999267578125,
0.01023101806640625,
-0.01436614990234375,
-0.0367431640625,
-0.034759521484375,
-0.05706787109375,
0.0299530029296875,
0.0618896484375,
-0.03759765625,
0.0247955322265625,
0.0269622802734375,
-0.00072479248046875,
-0.056396484375,
-0.0684814453125,
0.01007080078125,
-0.02239990234375,
-0.045684814453125,
0.024658203125,
-0.00040459632873535156,
-0.006549835205078125,
-0.0007691383361816406,
-0.006099700927734375,
0.01116943359375,
0.005947113037109375,
0.05096435546875,
0.023773193359375,
-0.00836181640625,
-0.02435302734375,
-0.00330352783203125,
0.003902435302734375,
0.0020084381103515625,
0.00039505958557128906,
0.073486328125,
-0.00304412841796875,
-0.042205810546875,
-0.0615234375,
-0.0021724700927734375,
0.0281524658203125,
0.005496978759765625,
0.034637451171875,
0.0550537109375,
-0.02569580078125,
0.007965087890625,
-0.049346923828125,
-0.00989532470703125,
-0.034271240234375,
0.01113128662109375,
-0.01904296875,
-0.059814453125,
0.044830322265625,
0.0137176513671875,
0.0033664703369140625,
0.0279388427734375,
0.07354736328125,
-0.025238037109375,
0.060455322265625,
0.0225830078125,
-0.01123046875,
0.037872314453125,
-0.032806396484375,
0.002338409423828125,
-0.0672607421875,
-0.025482177734375,
-0.0018854141235351562,
-0.048126220703125,
-0.047515869140625,
-0.054718017578125,
0.01861572265625,
0.0244293212890625,
-0.01837158203125,
0.038116455078125,
-0.026092529296875,
0.022613525390625,
0.049713134765625,
0.0279083251953125,
0.01316070556640625,
0.0160064697265625,
-0.01473236083984375,
-0.0031337738037109375,
-0.04443359375,
-0.05108642578125,
0.0950927734375,
0.045623779296875,
0.0643310546875,
0.00567626953125,
0.05987548828125,
0.02166748046875,
0.0225830078125,
-0.04974365234375,
0.047271728515625,
0.01503753662109375,
-0.0418701171875,
-0.007488250732421875,
-0.015411376953125,
-0.05645751953125,
0.0195159912109375,
-0.0296173095703125,
-0.058807373046875,
0.0087432861328125,
0.025115966796875,
-0.03582763671875,
0.024261474609375,
-0.0299530029296875,
0.0418701171875,
-0.005374908447265625,
-0.035491943359375,
-0.017242431640625,
-0.04864501953125,
0.04034423828125,
0.0014505386352539062,
0.00701141357421875,
-0.026580810546875,
-0.01355743408203125,
0.048004150390625,
-0.0288543701171875,
0.0687255859375,
0.004734039306640625,
-0.00431060791015625,
0.056396484375,
0.0134735107421875,
0.0269622802734375,
0.0262603759765625,
-0.0067901611328125,
0.021759033203125,
-0.01087188720703125,
-0.03509521484375,
-0.0257568359375,
0.03948974609375,
-0.0775146484375,
-0.04962158203125,
-0.0234222412109375,
-0.02362060546875,
0.00394439697265625,
-0.0054168701171875,
0.0173797607421875,
0.0184478759765625,
-0.0028247833251953125,
0.0269622802734375,
0.01210784912109375,
-0.0254974365234375,
0.024810791015625,
0.0188446044921875,
-0.037322998046875,
-0.052001953125,
0.0504150390625,
-0.015380859375,
0.033660888671875,
0.02777099609375,
0.0037174224853515625,
-0.016876220703125,
-0.03680419921875,
-0.0276031494140625,
0.0208282470703125,
-0.037750244140625,
-0.0294342041015625,
-0.05230712890625,
-0.045989990234375,
-0.0279388427734375,
0.004619598388671875,
-0.0239410400390625,
-0.0537109375,
-0.0265350341796875,
-0.02825927734375,
0.0225677490234375,
0.039306640625,
-0.03253173828125,
0.0242767333984375,
-0.033966064453125,
0.01505279541015625,
0.043304443359375,
-0.012237548828125,
-0.010772705078125,
-0.07427978515625,
0.0157012939453125,
0.01515960693359375,
-0.047271728515625,
-0.0655517578125,
0.034820556640625,
0.0160980224609375,
0.0504150390625,
0.03375244140625,
-0.0231781005859375,
0.06268310546875,
-0.0202789306640625,
0.07135009765625,
0.0120697021484375,
-0.05938720703125,
0.037506103515625,
-0.015533447265625,
0.00494384765625,
0.0263671875,
0.0208282470703125,
-0.01558685302734375,
-0.03094482421875,
-0.050323486328125,
-0.060302734375,
0.0335693359375,
0.037506103515625,
0.005069732666015625,
0.012664794921875,
0.0246124267578125,
0.0010585784912109375,
0.0157470703125,
-0.058135986328125,
-0.0147705078125,
-0.0273284912109375,
-0.0181121826171875,
-0.003021240234375,
-0.03643798828125,
-0.007965087890625,
-0.0287017822265625,
0.040557861328125,
0.0017223358154296875,
0.053375244140625,
0.00798797607421875,
-0.002452850341796875,
0.004878997802734375,
-0.007205963134765625,
0.0645751953125,
0.05157470703125,
-0.044036865234375,
0.0277557373046875,
0.03656005859375,
-0.035675048828125,
0.0005640983581542969,
-0.009674072265625,
-0.0196685791015625,
-0.0081634521484375,
0.062255859375,
0.068603515625,
0.0223388671875,
-0.04949951171875,
0.046356201171875,
0.00403594970703125,
-0.0298614501953125,
-0.021453857421875,
-0.00013637542724609375,
0.02978515625,
0.045928955078125,
0.010284423828125,
-0.00615692138671875,
0.004970550537109375,
-0.038421630859375,
0.0123748779296875,
0.0270538330078125,
-0.0103759765625,
-0.0284423828125,
0.061614990234375,
0.01493072509765625,
-0.02734375,
0.03759765625,
-0.01482391357421875,
-0.04364013671875,
0.05487060546875,
0.0400390625,
0.046722412109375,
-0.00482177734375,
0.0163726806640625,
0.044097900390625,
0.021514892578125,
-0.006725311279296875,
0.032501220703125,
-0.01500701904296875,
-0.05279541015625,
-0.0191192626953125,
-0.038177490234375,
-0.0445556640625,
0.018402099609375,
-0.04864501953125,
0.02105712890625,
-0.050506591796875,
-0.031341552734375,
-0.01229095458984375,
0.01203155517578125,
-0.048675537109375,
-0.01442718505859375,
0.026702880859375,
0.069580078125,
-0.037506103515625,
0.07379150390625,
0.034759521484375,
-0.005779266357421875,
-0.036163330078125,
-0.01556396484375,
0.017852783203125,
-0.1058349609375,
0.0430908203125,
0.023529052734375,
0.00437164306640625,
-0.0115509033203125,
-0.056884765625,
-0.08831787109375,
0.133544921875,
0.01494598388671875,
-0.044708251953125,
0.003299713134765625,
0.008758544921875,
0.038299560546875,
-0.036376953125,
0.0281219482421875,
0.0290985107421875,
0.02783203125,
0.0144805908203125,
-0.08831787109375,
0.0123748779296875,
-0.01490020751953125,
-0.00823974609375,
-0.03106689453125,
-0.07635498046875,
0.07122802734375,
-0.024444580078125,
-0.013671875,
0.0172271728515625,
0.04193115234375,
0.051910400390625,
0.0237579345703125,
0.040496826171875,
0.05230712890625,
0.0308685302734375,
0.00620269775390625,
0.06402587890625,
-0.0379638671875,
0.038238525390625,
0.08502197265625,
-0.0185089111328125,
0.08648681640625,
0.03875732421875,
-0.02020263671875,
0.036865234375,
0.06475830078125,
-0.00495147705078125,
0.049896240234375,
0.020263671875,
0.0008440017700195312,
-0.0202484130859375,
-0.0011053085327148438,
-0.037078857421875,
0.04449462890625,
0.02020263671875,
-0.0157318115234375,
-0.0035152435302734375,
-0.023345947265625,
0.0127410888671875,
-0.023681640625,
0.004547119140625,
0.05157470703125,
0.0159454345703125,
-0.01971435546875,
0.047393798828125,
0.0008339881896972656,
0.06280517578125,
-0.029815673828125,
-0.0007271766662597656,
-0.0328369140625,
0.0173492431640625,
-0.01421356201171875,
-0.06573486328125,
0.01294708251953125,
0.0031280517578125,
0.0157928466796875,
-0.00949859619140625,
0.06011962890625,
-0.028961181640625,
-0.0304107666015625,
0.039642333984375,
0.04583740234375,
0.0257720947265625,
0.0092926025390625,
-0.06353759765625,
0.0206146240234375,
0.002277374267578125,
-0.0487060546875,
0.01369476318359375,
0.004688262939453125,
0.01093292236328125,
0.061431884765625,
0.047332763671875,
-0.0064697265625,
0.017730712890625,
-0.0118255615234375,
0.06365966796875,
-0.0487060546875,
-0.0207061767578125,
-0.07275390625,
0.0208892822265625,
-0.0194854736328125,
-0.03515625,
0.052337646484375,
0.04669189453125,
0.05682373046875,
-0.0014638900756835938,
0.0312042236328125,
0.00446319580078125,
0.022125244140625,
-0.03857421875,
0.037017822265625,
-0.042144775390625,
0.017181396484375,
-0.006450653076171875,
-0.06884765625,
-0.00487518310546875,
0.062255859375,
0.00115203857421875,
-0.00836181640625,
0.03350830078125,
0.07537841796875,
-0.005710601806640625,
-0.0209808349609375,
0.01404571533203125,
0.025604248046875,
0.0203094482421875,
0.066162109375,
0.087646484375,
-0.03802490234375,
0.050689697265625,
-0.0288848876953125,
-0.0223388671875,
-0.041290283203125,
-0.06793212890625,
-0.084228515625,
-0.007904052734375,
-0.023773193359375,
-0.033966064453125,
-0.00035381317138671875,
0.07916259765625,
0.069580078125,
-0.05706787109375,
-0.02813720703125,
0.03106689453125,
0.0123138427734375,
-0.01556396484375,
-0.0102691650390625,
0.0216522216796875,
0.0257415771484375,
-0.0418701171875,
0.026275634765625,
0.018402099609375,
0.020599365234375,
-0.00972747802734375,
-0.0148162841796875,
-0.0203094482421875,
0.01666259765625,
0.051605224609375,
0.00850677490234375,
-0.059906005859375,
-0.041290283203125,
-0.00860595703125,
-0.0266571044921875,
0.00183868408203125,
0.02679443359375,
-0.03692626953125,
-0.01229095458984375,
0.045196533203125,
0.033905029296875,
0.06805419921875,
0.00817108154296875,
0.006938934326171875,
-0.040924072265625,
0.0302276611328125,
-0.004451751708984375,
0.0272216796875,
0.0172119140625,
-0.014068603515625,
0.051666259765625,
0.0294952392578125,
-0.045684814453125,
-0.06695556640625,
0.006031036376953125,
-0.10784912109375,
-0.0028057098388671875,
0.1077880859375,
-0.01103973388671875,
-0.01751708984375,
0.0264892578125,
-0.03204345703125,
0.00543975830078125,
-0.034881591796875,
0.05902099609375,
0.0421142578125,
-0.0146331787109375,
-0.00406646728515625,
-0.0450439453125,
0.0159454345703125,
0.01354217529296875,
-0.06494140625,
-0.03509521484375,
0.0178375244140625,
0.0443115234375,
0.0022258758544921875,
0.063720703125,
-0.00479888916015625,
0.006671905517578125,
0.0001926422119140625,
0.005970001220703125,
-0.023345947265625,
-0.0103302001953125,
-0.00792694091796875,
0.00283050537109375,
-0.0137939453125,
-0.0361328125
]
] |
Salesforce/codet5-small | 2021-11-23T09:45:34.000Z | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"codet5",
"dataset:code_search_net",
"arxiv:2109.00859",
"arxiv:1909.09436",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | Salesforce | null | null | Salesforce/codet5-small | 41 | 11,698 | transformers | 2022-03-02T23:29:04 | ---
license: apache-2.0
tags:
- codet5
datasets:
- code_search_net
inference: false
---
# CodeT5 (small-sized model)
Pre-trained CodeT5 model. It was introduced in the paper [CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models
for Code Understanding and Generation](https://arxiv.org/abs/2109.00859) by Yue Wang, Weishi Wang, Shafiq Joty, Steven C.H. Hoi and first released in [this repository](https://github.com/salesforce/CodeT5).
Disclaimer: The team releasing CodeT5 did not write a model card for this model so this model card has been written by the Hugging Face team (more specifically, [nielsr](https://huggingface.co/nielsr)).
## Model description
From the abstract:
"We present CodeT5, a unified pre-trained encoder-decoder Transformer model that better leverages the code semantics conveyed from the developer-assigned identifiers. Our model employs a unified framework to seamlessly support both code understanding and generation tasks and allows for multi-task learning. Besides, we propose a novel identifier-aware pre-training task that enables the model to distinguish which code tokens are identifiers and to recover them when they are masked. Furthermore, we propose to exploit the user-written code comments with a bimodal dual generation task for better NL-PL alignment. Comprehensive experiments show that CodeT5 significantly outperforms prior methods on understanding tasks such as code defect detection and clone detection, and generation tasks across various directions including PL-NL, NL-PL, and PL-PL. Further analysis reveals that our model can better capture semantic information from code."
## Intended uses & limitations
This repository contains the pre-trained model only, so you can use this model for masked span prediction, as shown in the code example below. However, the main use of this model is to fine-tune it for a downstream task of interest, such as:
* code summarization
* code generation
* code translation
* code refinement
* code defect detection
* code clone detection.
See the [model hub](https://huggingface.co/models?search=salesforce/codet) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model:
```python
from transformers import RobertaTokenizer, T5ForConditionalGeneration
tokenizer = RobertaTokenizer.from_pretrained('Salesforce/codet5-small')
model = T5ForConditionalGeneration.from_pretrained('Salesforce/codet5-small')
text = "def greet(user): print(f'hello <extra_id_0>!')"
input_ids = tokenizer(text, return_tensors="pt").input_ids
# simply generate a single sequence
generated_ids = model.generate(input_ids, max_length=10)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
# this prints "user: {user.name}"
```
## Training data
The CodeT5 model was pretrained on CodeSearchNet [Husain et al., 2019](https://arxiv.org/abs/1909.09436). Additionally, the authors collected two datasets of C/CSharp from [BigQuery](https://console.cloud.google.com/marketplace/details/github/github-repos) to ensure that all downstream tasks have overlapped programming languages with the pre-training data. In total, around 8.35 million instances are used for pretraining.
## Training procedure
### Preprocessing
This model uses a code-specific BPE (Byte-Pair Encoding) tokenizer. One can prepare text (or code) for the model using RobertaTokenizer, with the files from this repository.
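As a small illustration of the preprocessing above, here is a hedged sketch (not part of the original card) showing how the code-specific BPE tokenizer splits a snippet; the example function is arbitrary.
```python
# Illustrative sketch: inspect how the code-specific BPE vocabulary tokenizes code.
from transformers import RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("Salesforce/codet5-small")
code = "def add(a, b): return a + b"  # arbitrary example snippet
print(tokenizer.tokenize(code))   # subword tokens from the BPE vocabulary
print(tokenizer(code).input_ids)  # token ids, wrapped in <s> ... </s> special tokens
```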
## Evaluation results
For evaluation results on several downstream benchmarks, we refer to the paper.
### BibTeX entry and citation info
```bibtex
@misc{wang2021codet5,
title={CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation},
author={Yue Wang and Weishi Wang and Shafiq Joty and Steven C. H. Hoi},
year={2021},
eprint={2109.00859},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 3,924 | [
[
-0.02838134765625,
-0.02099609375,
0.0033321380615234375,
0.017608642578125,
-0.0183258056640625,
0.003406524658203125,
-0.0234832763671875,
-0.038177490234375,
-0.0081329345703125,
0.023223876953125,
-0.041473388671875,
-0.0396728515625,
-0.04095458984375,
0.01512908935546875,
-0.0210723876953125,
0.09649658203125,
0.0058441162109375,
-0.0017671585083007812,
-0.01373291015625,
-0.00092315673828125,
-0.0364990234375,
-0.06072998046875,
-0.0178985595703125,
-0.0174407958984375,
0.0178985595703125,
0.0124359130859375,
0.026641845703125,
0.041015625,
0.04608154296875,
0.020965576171875,
0.0008301734924316406,
-0.002582550048828125,
-0.0313720703125,
-0.031280517578125,
0.01398468017578125,
-0.037109375,
-0.0496826171875,
-0.0021820068359375,
0.021209716796875,
0.043426513671875,
0.005023956298828125,
0.039947509765625,
0.0007905960083007812,
0.031951904296875,
-0.03436279296875,
0.01406097412109375,
-0.049774169921875,
0.002834320068359375,
-0.007617950439453125,
0.0007252693176269531,
-0.0384521484375,
-0.0177154541015625,
0.00182342529296875,
-0.0300445556640625,
0.038360595703125,
-0.01482391357421875,
0.0882568359375,
0.0263214111328125,
-0.0219573974609375,
-0.003387451171875,
-0.039581298828125,
0.055206298828125,
-0.066162109375,
0.0278167724609375,
0.01056671142578125,
0.0028781890869140625,
0.0097503662109375,
-0.0875244140625,
-0.04364013671875,
-0.012725830078125,
-0.0195770263671875,
0.0052337646484375,
-0.017730712890625,
0.017822265625,
0.04559326171875,
0.0240020751953125,
-0.05224609375,
-0.004123687744140625,
-0.053314208984375,
-0.0215301513671875,
0.045684814453125,
-0.00859832763671875,
0.020111083984375,
-0.01349639892578125,
-0.03997802734375,
0.001316070556640625,
-0.047515869140625,
0.01035308837890625,
0.0182647705078125,
0.008636474609375,
-0.025299072265625,
0.010528564453125,
-0.003635406494140625,
0.0533447265625,
0.0185699462890625,
-0.0020122528076171875,
0.04833984375,
-0.042266845703125,
-0.0269622802734375,
-0.0084381103515625,
0.06439208984375,
0.00008118152618408203,
0.0242919921875,
-0.01971435546875,
-0.019866943359375,
0.018218994140625,
0.0302886962890625,
-0.08587646484375,
-0.0325927734375,
0.021148681640625,
-0.047271728515625,
-0.034271240234375,
0.028106689453125,
-0.03582763671875,
-0.002597808837890625,
-0.0185394287109375,
0.0263214111328125,
-0.0308990478515625,
-0.0222015380859375,
0.00659942626953125,
-0.0063629150390625,
0.0173797607421875,
-0.002735137939453125,
-0.06329345703125,
0.00003612041473388672,
0.025482177734375,
0.040496826171875,
-0.006153106689453125,
-0.044036865234375,
-0.027069091796875,
-0.00926971435546875,
-0.0134429931640625,
0.032989501953125,
-0.024810791015625,
-0.0137481689453125,
-0.014862060546875,
0.01220703125,
-0.002918243408203125,
-0.041259765625,
0.021453857421875,
-0.043121337890625,
0.00936126708984375,
-0.002933502197265625,
-0.0257110595703125,
-0.01067352294921875,
0.0074310302734375,
-0.041412353515625,
0.07958984375,
0.01256561279296875,
-0.0614013671875,
0.039306640625,
-0.06500244140625,
-0.0182037353515625,
0.0036411285400390625,
-0.009918212890625,
-0.05206298828125,
0.0003864765167236328,
0.0215606689453125,
0.041412353515625,
-0.00605010986328125,
0.0350341796875,
-0.0172576904296875,
-0.037384033203125,
0.01092529296875,
-0.01230621337890625,
0.060943603515625,
0.03369140625,
-0.0419921875,
0.0273284912109375,
-0.056732177734375,
0.0161285400390625,
0.01358795166015625,
-0.0226593017578125,
0.006710052490234375,
-0.02032470703125,
0.005584716796875,
0.030609130859375,
0.0276336669921875,
-0.0245208740234375,
0.0247344970703125,
-0.0167999267578125,
0.049774169921875,
0.04449462890625,
-0.01258087158203125,
0.0280914306640625,
-0.01268768310546875,
0.053375244140625,
0.030364990234375,
0.0136871337890625,
-0.039886474609375,
-0.01617431640625,
-0.0482177734375,
-0.0223541259765625,
0.05126953125,
0.024658203125,
-0.048095703125,
0.0416259765625,
-0.040802001953125,
-0.03851318359375,
-0.036773681640625,
0.00997161865234375,
0.057830810546875,
0.014190673828125,
0.03485107421875,
-0.036285400390625,
-0.075439453125,
-0.04388427734375,
-0.0222320556640625,
0.004505157470703125,
0.0022125244140625,
0.004558563232421875,
0.058990478515625,
-0.03826904296875,
0.07635498046875,
-0.035491943359375,
-0.00688934326171875,
-0.03411865234375,
0.01806640625,
0.010955810546875,
0.06402587890625,
0.04541015625,
-0.0634765625,
-0.029296875,
-0.016387939453125,
-0.05322265625,
-0.0009899139404296875,
-0.0098114013671875,
0.0003504753112792969,
0.0263214111328125,
0.0457763671875,
-0.02789306640625,
0.0369873046875,
0.0455322265625,
-0.022735595703125,
0.035247802734375,
-0.0117034912109375,
-0.00652313232421875,
-0.0997314453125,
0.028228759765625,
-0.005992889404296875,
-0.0112152099609375,
-0.044647216796875,
0.008270263671875,
0.0207061767578125,
-0.0257568359375,
-0.02752685546875,
0.01922607421875,
-0.06024169921875,
-0.018096923828125,
0.0007562637329101562,
-0.0107269287109375,
0.0006504058837890625,
0.059967041015625,
0.009552001953125,
0.07122802734375,
0.0204010009765625,
-0.04644775390625,
0.002513885498046875,
0.016387939453125,
-0.02001953125,
0.0037250518798828125,
-0.06158447265625,
0.034576416015625,
0.00760650634765625,
0.0217132568359375,
-0.056365966796875,
-0.00846099853515625,
-0.004535675048828125,
-0.05560302734375,
0.0221405029296875,
-0.032623291015625,
-0.03271484375,
-0.049530029296875,
-0.0101776123046875,
0.042449951171875,
0.054351806640625,
-0.0467529296875,
0.024200439453125,
0.0143890380859375,
0.0180816650390625,
-0.047821044921875,
-0.05938720703125,
-0.0112762451171875,
-0.017852783203125,
-0.055633544921875,
0.042449951171875,
-0.0167388916015625,
0.019195556640625,
-0.0054779052734375,
-0.016326904296875,
-0.01236724853515625,
0.0032196044921875,
0.0273284912109375,
0.030853271484375,
-0.02484130859375,
0.00284576416015625,
-0.029266357421875,
-0.0152740478515625,
0.0010232925415039062,
-0.0394287109375,
0.0472412109375,
-0.032073974609375,
-0.01012420654296875,
-0.023223876953125,
0.006061553955078125,
0.034912109375,
-0.05291748046875,
0.049591064453125,
0.0693359375,
-0.02276611328125,
-0.01421356201171875,
-0.0330810546875,
-0.016448974609375,
-0.035308837890625,
0.039031982421875,
-0.04498291015625,
-0.05706787109375,
0.041717529296875,
-0.004055023193359375,
-0.007373809814453125,
0.027069091796875,
0.036590576171875,
0.020355224609375,
0.0806884765625,
0.064697265625,
-0.0025730133056640625,
0.051544189453125,
-0.0455322265625,
0.03179931640625,
-0.048095703125,
-0.01157379150390625,
-0.036895751953125,
-0.0090484619140625,
-0.0430908203125,
-0.02215576171875,
0.0159759521484375,
0.02288818359375,
-0.03521728515625,
0.046112060546875,
-0.06329345703125,
0.029754638671875,
0.044952392578125,
0.0062408447265625,
0.005458831787109375,
-0.0087432861328125,
-0.0020885467529296875,
0.00035190582275390625,
-0.052642822265625,
-0.0276031494140625,
0.08929443359375,
0.026397705078125,
0.05078125,
-0.004154205322265625,
0.0653076171875,
0.01007080078125,
0.0074462890625,
-0.03289794921875,
0.0242462158203125,
-0.005523681640625,
-0.039306640625,
0.00024306774139404297,
-0.035308837890625,
-0.07086181640625,
0.006359100341796875,
-0.007450103759765625,
-0.04974365234375,
0.01308441162109375,
0.020660400390625,
-0.0374755859375,
0.01418304443359375,
-0.0867919921875,
0.1063232421875,
-0.01523590087890625,
-0.01039886474609375,
0.0112152099609375,
-0.062744140625,
0.0318603515625,
-0.006000518798828125,
0.0043182373046875,
0.0206146240234375,
0.01497650146484375,
0.0672607421875,
-0.047882080078125,
0.06494140625,
-0.02313232421875,
0.010406494140625,
0.0165557861328125,
-0.0170440673828125,
0.0311126708984375,
-0.01067352294921875,
0.0125732421875,
0.030670166015625,
0.0199432373046875,
-0.04498291015625,
-0.032806396484375,
0.0301971435546875,
-0.06768798828125,
-0.0267486572265625,
-0.0294952392578125,
-0.0258941650390625,
-0.0035858154296875,
0.031829833984375,
0.04931640625,
0.032012939453125,
0.0011301040649414062,
0.022491455078125,
0.0445556640625,
-0.0201873779296875,
0.045196533203125,
0.0187225341796875,
-0.0116729736328125,
-0.02142333984375,
0.062469482421875,
0.00012791156768798828,
0.0216064453125,
0.0133514404296875,
-0.007965087890625,
-0.00890350341796875,
-0.041168212890625,
-0.025848388671875,
0.01229095458984375,
-0.04254150390625,
-0.033905029296875,
-0.045013427734375,
-0.02752685546875,
-0.047210693359375,
-0.00939178466796875,
-0.0293426513671875,
0.005985260009765625,
-0.0108795166015625,
-0.004024505615234375,
0.0216827392578125,
0.050140380859375,
0.00716400146484375,
0.024017333984375,
-0.07177734375,
0.0213775634765625,
0.00905609130859375,
0.03118896484375,
-0.007404327392578125,
-0.043701171875,
-0.035614013671875,
0.010009765625,
-0.021148681640625,
-0.05157470703125,
0.03369140625,
-0.0007157325744628906,
0.0323486328125,
0.022003173828125,
0.005077362060546875,
0.0584716796875,
-0.0155792236328125,
0.07159423828125,
0.0185089111328125,
-0.09320068359375,
0.045318603515625,
-0.01422882080078125,
0.029876708984375,
0.033966064453125,
0.0166015625,
-0.028533935546875,
0.0026416778564453125,
-0.06475830078125,
-0.057373046875,
0.078125,
0.00789642333984375,
0.00738525390625,
0.015167236328125,
0.01029205322265625,
-0.00223541259765625,
0.026214599609375,
-0.0838623046875,
-0.00998687744140625,
-0.035400390625,
-0.0260162353515625,
0.002246856689453125,
-0.0179901123046875,
0.01155853271484375,
-0.03076171875,
0.03997802734375,
-0.0175323486328125,
0.056854248046875,
0.007076263427734375,
-0.035308837890625,
0.01197052001953125,
0.007335662841796875,
0.05126953125,
0.052398681640625,
-0.0171356201171875,
-0.0033130645751953125,
0.01049041748046875,
-0.0537109375,
-0.01065826416015625,
0.0177001953125,
0.004795074462890625,
-0.00151824951171875,
0.0289459228515625,
0.07684326171875,
0.01276397705078125,
-0.0458984375,
0.06646728515625,
-0.0035572052001953125,
-0.023681640625,
-0.0380859375,
-0.00493621826171875,
-0.0001398324966430664,
0.0091705322265625,
0.016357421875,
0.0242462158203125,
-0.0120086669921875,
-0.037628173828125,
0.0191802978515625,
0.01189422607421875,
-0.026397705078125,
-0.036712646484375,
0.0638427734375,
0.0196685791015625,
-0.0077362060546875,
0.0435791015625,
-0.023468017578125,
-0.057220458984375,
0.06494140625,
0.046051025390625,
0.06781005859375,
0.0027675628662109375,
-0.00479888916015625,
0.034881591796875,
0.0157318115234375,
0.0036449432373046875,
0.0213165283203125,
-0.01381683349609375,
-0.056854248046875,
-0.029541015625,
-0.042083740234375,
0.009002685546875,
0.010589599609375,
-0.05023193359375,
0.032806396484375,
-0.0272979736328125,
-0.0060272216796875,
0.0008640289306640625,
0.0126953125,
-0.07354736328125,
0.02288818359375,
0.01087188720703125,
0.06231689453125,
-0.04962158203125,
0.08880615234375,
0.054840087890625,
-0.080078125,
-0.09210205078125,
0.010833740234375,
-0.0242919921875,
-0.0589599609375,
0.062164306640625,
0.0268402099609375,
0.0109405517578125,
0.0170440673828125,
-0.055938720703125,
-0.055572509765625,
0.0938720703125,
0.02484130859375,
-0.0214996337890625,
-0.0277862548828125,
-0.003467559814453125,
0.036773681640625,
-0.035430908203125,
0.03515625,
0.03759765625,
0.0165252685546875,
-0.005359649658203125,
-0.07196044921875,
0.0126495361328125,
-0.03759765625,
0.0237884521484375,
0.01107025146484375,
-0.04547119140625,
0.07977294921875,
-0.036346435546875,
-0.006359100341796875,
-0.0026187896728515625,
0.043731689453125,
0.01406097412109375,
0.0097198486328125,
0.0272369384765625,
0.033203125,
0.04937744140625,
-0.01244354248046875,
0.07366943359375,
-0.044342041015625,
0.040283203125,
0.062744140625,
0.00934600830078125,
0.045989990234375,
0.022552490234375,
-0.0204315185546875,
0.0418701171875,
0.050628662109375,
-0.025787353515625,
0.0304412841796875,
0.01413726806640625,
0.005420684814453125,
-0.0034122467041015625,
0.0182647705078125,
-0.054840087890625,
0.036163330078125,
0.003910064697265625,
-0.0428466796875,
0.0087127685546875,
-0.001811981201171875,
0.0182342529296875,
-0.010650634765625,
-0.01108551025390625,
0.058258056640625,
-0.0007185935974121094,
-0.05706787109375,
0.0738525390625,
0.0117645263671875,
0.08355712890625,
-0.053619384765625,
0.0025482177734375,
-0.022552490234375,
0.02789306640625,
-0.036041259765625,
-0.0167999267578125,
0.015411376953125,
0.0180511474609375,
-0.0187225341796875,
-0.02484130859375,
0.049774169921875,
-0.03289794921875,
-0.0173797607421875,
0.0180206298828125,
0.02178955078125,
0.0086517333984375,
-0.006710052490234375,
-0.050567626953125,
0.010406494140625,
0.0183868408203125,
-0.016204833984375,
0.0270233154296875,
0.031829833984375,
0.000720977783203125,
0.0269927978515625,
0.054046630859375,
-0.01059722900390625,
0.0205535888671875,
0.00008940696716308594,
0.060302734375,
-0.0560302734375,
-0.04705810546875,
-0.052520751953125,
0.053070068359375,
-0.0061492919921875,
-0.042755126953125,
0.04864501953125,
0.048492431640625,
0.0933837890625,
-0.033172607421875,
0.07122802734375,
-0.0187835693359375,
0.023193359375,
-0.035675048828125,
0.053558349609375,
-0.036285400390625,
0.0247344970703125,
-0.03143310546875,
-0.05731201171875,
-0.01654052734375,
0.0400390625,
-0.0298004150390625,
0.039581298828125,
0.059661865234375,
0.06597900390625,
-0.012969970703125,
-0.00899505615234375,
0.0240325927734375,
0.01175689697265625,
0.01885986328125,
0.05767822265625,
0.03533935546875,
-0.0555419921875,
0.061859130859375,
-0.02130126953125,
0.0030059814453125,
-0.01812744140625,
-0.05718994140625,
-0.079345703125,
-0.050811767578125,
-0.0194549560546875,
-0.040802001953125,
0.0008816719055175781,
0.08172607421875,
0.07965087890625,
-0.059661865234375,
-0.00860595703125,
-0.024322509765625,
-0.01103973388671875,
-0.0202178955078125,
-0.0163726806640625,
0.0272979736328125,
-0.045318603515625,
-0.049774169921875,
-0.0021572113037109375,
-0.014495849609375,
0.004398345947265625,
-0.005939483642578125,
-0.01739501953125,
0.0014514923095703125,
-0.015960693359375,
0.034423828125,
0.0245513916015625,
-0.050567626953125,
-0.017181396484375,
0.00720977783203125,
-0.0218353271484375,
0.0175323486328125,
0.06024169921875,
-0.065673828125,
0.0214996337890625,
0.027099609375,
0.058135986328125,
0.052093505859375,
0.005359649658203125,
0.044342041015625,
-0.0699462890625,
0.0265960693359375,
0.00661468505859375,
0.0263519287109375,
0.0140228271484375,
-0.026458740234375,
0.04998779296875,
0.03228759765625,
-0.038360595703125,
-0.07012939453125,
0.0087432861328125,
-0.06640625,
-0.01861572265625,
0.07843017578125,
-0.025482177734375,
-0.0107574462890625,
0.0081634521484375,
-0.0121002197265625,
0.027740478515625,
-0.0160675048828125,
0.037811279296875,
0.032440185546875,
0.01145172119140625,
-0.020660400390625,
-0.033355712890625,
0.05419921875,
0.024017333984375,
-0.064208984375,
-0.010498046875,
0.017974853515625,
0.03131103515625,
0.00971221923828125,
0.0548095703125,
-0.0124969482421875,
0.027496337890625,
0.0142669677734375,
0.0498046875,
-0.0250701904296875,
-0.01300811767578125,
-0.033538818359375,
0.0063629150390625,
0.00616455078125,
-0.033233642578125
]
] |
facebook/galactica-1.3b | 2023-01-24T17:20:39.000Z | [
"transformers",
"pytorch",
"opt",
"text-generation",
"galactica",
"arxiv:1810.03993",
"license:cc-by-nc-4.0",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | facebook | null | null | facebook/galactica-1.3b | 52 | 11,693 | transformers | 2022-11-16T13:37:55 | ---
license: cc-by-nc-4.0
tags:
- galactica
widget:
- text: "The Transformer architecture [START_REF]"
- text: "The Schwarzschild radius is defined as: \\["
- text: "A force of 0.6N is applied to an object, which accelerates at 3m/s². What is its mass? <work>"
- text: "Lecture 1: The Ising Model\n\n"
- text: "[START_I_SMILES]"
- text: "[START_AMINO]GHMQSITAGQKVISKHKNGRFYQCEVVRLTTETFYEVNFDDGSFSDNLYPEDIVSQDCLQFGPPAEGEVVQVRWTDGQVYGAKFVASHPIQMYQVEFEDGSQLVVKRDDVYTLDEELP[END_AMINO] ## Keywords"
inference: false
---

# GALACTICA 1.3B (base)
Model card from the original [repo](https://github.com/paperswithcode/galai/blob/main/docs/model_card.md)
Following [Mitchell et al. (2018)](https://arxiv.org/abs/1810.03993), this model card provides information about the GALACTICA model, how it was trained, and the intended use cases. Full details about how the model was trained and evaluated can be found in the [release paper](https://galactica.org/paper.pdf).
## Model Details
The GALACTICA models are trained on a large-scale scientific corpus. The models are designed to perform scientific tasks, including but not limited to citation prediction, scientific QA, mathematical reasoning, summarization, document generation, molecular property prediction and entity extraction. The models were developed by the Papers with Code team at Meta AI to study the use of language models for the automatic organization of science. We train models with sizes ranging from 125M to 120B parameters. Below is a summary of the released models:
| Size | Parameters |
|:-----------:|:-----------:|
| `mini` | 125 M |
| `base` | 1.3 B |
| `standard` | 6.7 B |
| `large` | 30 B |
| `huge` | 120 B |
## Release Date
November 2022
## Model Type
Transformer based architecture in a decoder-only setup with a few modifications (see paper for more details).
## Paper & Demo
[Paper](https://galactica.org/paper.pdf) / [Demo](https://galactica.org)
## Model Use
The primary intended users of the GALACTICA models are researchers studying language models applied to the scientific domain. We also anticipate the model will be useful for developers who wish to build scientific tooling. However, we caution against production use without safeguards given the potential of language models to hallucinate.
The models are made available under a non-commercial CC BY-NC 4.0 license. More information about how to use the model can be found in the README.md of this repository.
## Training Data
The GALACTICA models are trained on 106 billion tokens of open-access scientific text and data. This includes papers, textbooks, scientific websites, encyclopedias, reference material, knowledge bases, and more. We tokenize different modalities to provide a natural language interface for different tasks. See the README.md for more information. See the paper for full information on the training data.
## How to use
Find below some example scripts on how to use the model in `transformers`:
## Using the Pytorch model
### Running the model on a CPU
<details>
<summary> Click to expand </summary>
```python
from transformers import AutoTokenizer, OPTForCausalLM
tokenizer = AutoTokenizer.from_pretrained("facebook/galactica-1.3b")
model = OPTForCausalLM.from_pretrained("facebook/galactica-1.3b")
input_text = "The Transformer architecture [START_REF]"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
### Running the model on a GPU
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate
from transformers import AutoTokenizer, OPTForCausalLM
tokenizer = AutoTokenizer.from_pretrained("facebook/galactica-1.3b")
model = OPTForCausalLM.from_pretrained("facebook/galactica-1.3b", device_map="auto")
input_text = "The Transformer architecture [START_REF]"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
### Running the model on a GPU using different precisions
#### FP16
<details>
<summary> Click to expand </summary>
```python
# pip install accelerate
import torch
from transformers import AutoTokenizer, OPTForCausalLM
tokenizer = AutoTokenizer.from_pretrained("facebook/galactica-1.3b")
model = OPTForCausalLM.from_pretrained("facebook/galactica-1.3b", device_map="auto", torch_dtype=torch.float16)
input_text = "The Transformer architecture [START_REF]"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
#### INT8
<details>
<summary> Click to expand </summary>
```python
# pip install bitsandbytes accelerate
from transformers import AutoTokenizer, OPTForCausalLM
tokenizer = AutoTokenizer.from_pretrained("facebook/galactica-1.3b")
model = OPTForCausalLM.from_pretrained("facebook/galactica-1.3b", device_map="auto", load_in_8bit=True)
input_text = "The Transformer architecture [START_REF]"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(input_ids)
print(tokenizer.decode(outputs[0]))
```
</details>
## Performance and Limitations
The model outperforms several existing language models on a range of knowledge probes, reasoning, and knowledge-intensive scientific tasks. This also extends to general NLP tasks, where GALACTICA outperforms other open source general language models. That being said, we note a number of limitations in this section.
As with other language models, GALACTICA is often prone to hallucination - and training on a high-quality academic corpus does not prevent this, especially for less popular and less cited scientific concepts. There are no guarantees of truthful output when generating from the model. This extends to specific modalities such as citation prediction. While GALACTICA's citation behaviour approaches the ground truth citation behaviour with scale, the model continues to exhibit a popularity bias at larger scales.
In addition, we evaluated the model on several types of benchmarks related to stereotypes and toxicity. Overall, the model exhibits substantially lower toxicity rates compared to other large language models. That being said, the model continues to exhibit bias on certain measures (see the paper for details). We therefore recommend care when using the model for generation.
## Broader Implications
GALACTICA can potentially be used as a new way to discover academic literature. We also expect significant downstream use in particular domains, such as mathematics, biology, and chemistry. In the paper, we demonstrated several examples of the model acting as an alternative to standard search tools. We expect a new generation of scientific tools to be built upon large language models such as GALACTICA.
We encourage researchers to investigate beneficial and new use cases for these models. That being said, it is important to be aware of the current limitations of large language models. Researchers should pay attention to common issues such as hallucination and biases that could emerge from using these models.
## Citation
```bibtex
@inproceedings{GALACTICA,
title={GALACTICA: A Large Language Model for Science},
author={Ross Taylor and Marcin Kardas and Guillem Cucurull and Thomas Scialom and Anthony Hartshorn and Elvis Saravia and Andrew Poulton and Viktor Kerkez and Robert Stojnic},
year={2022}
}
``` | 7,692 | [
[
-0.0271453857421875,
-0.058624267578125,
0.0257110595703125,
0.0157012939453125,
-0.00794219970703125,
0.0005259513854980469,
-0.0277862548828125,
-0.030181884765625,
0.03363037109375,
0.02630615234375,
-0.040283203125,
-0.0242462158203125,
-0.04608154296875,
0.0046234130859375,
-0.027374267578125,
0.07061767578125,
0.01366424560546875,
0.006397247314453125,
-0.00331878662109375,
0.005100250244140625,
-0.00841522216796875,
-0.03857421875,
-0.04632568359375,
-0.01593017578125,
0.03582763671875,
0.006694793701171875,
0.06689453125,
0.06744384765625,
0.0673828125,
0.0306243896484375,
-0.032073974609375,
0.01108551025390625,
-0.043243408203125,
-0.0295257568359375,
-0.00792694091796875,
-0.029388427734375,
-0.0328369140625,
0.01016998291015625,
0.052703857421875,
0.056243896484375,
0.00994110107421875,
0.0290069580078125,
-0.0152130126953125,
0.03955078125,
-0.0167388916015625,
0.031768798828125,
-0.04583740234375,
-0.01285552978515625,
0.00447845458984375,
0.01107025146484375,
-0.0208282470703125,
-0.031829833984375,
-0.0009307861328125,
-0.059783935546875,
0.016448974609375,
-0.0078277587890625,
0.091552734375,
0.040008544921875,
-0.00917816162109375,
-0.041595458984375,
-0.06353759765625,
0.032684326171875,
-0.056915283203125,
0.039581298828125,
-0.0010290145874023438,
0.019500732421875,
-0.00026702880859375,
-0.06829833984375,
-0.061248779296875,
-0.040863037109375,
-0.00860595703125,
0.00627899169921875,
-0.031646728515625,
-0.0087432861328125,
0.018463134765625,
0.0176544189453125,
-0.04510498046875,
0.00415802001953125,
-0.06378173828125,
0.006313323974609375,
0.044830322265625,
-0.00876617431640625,
0.0181732177734375,
-0.04412841796875,
-0.0088043212890625,
-0.033935546875,
-0.0478515625,
0.0019207000732421875,
0.029449462890625,
0.0330810546875,
-0.00553131103515625,
0.04693603515625,
0.00036525726318359375,
0.03466796875,
0.02447509765625,
0.0147705078125,
0.027099609375,
-0.031768798828125,
-0.03314208984375,
-0.0287017822265625,
0.07110595703125,
0.0166473388671875,
0.005321502685546875,
-0.021392822265625,
0.0132293701171875,
0.0193634033203125,
0.017822265625,
-0.069580078125,
-0.0163726806640625,
0.0155181884765625,
-0.04412841796875,
-0.014923095703125,
0.007244110107421875,
-0.0648193359375,
-0.0146636962890625,
-0.0177154541015625,
0.0246124267578125,
-0.0181884765625,
-0.0275726318359375,
0.003265380859375,
0.0036602020263671875,
-0.0014629364013671875,
-0.012481689453125,
-0.07928466796875,
0.007717132568359375,
0.035888671875,
0.057037353515625,
0.0024623870849609375,
-0.0250701904296875,
-0.00982666015625,
-0.022857666015625,
-0.020477294921875,
0.036346435546875,
-0.0261077880859375,
-0.034332275390625,
-0.0159149169921875,
0.0209503173828125,
-0.02142333984375,
-0.0404052734375,
0.039154052734375,
-0.0361328125,
0.021881103515625,
-0.026092529296875,
-0.040252685546875,
-0.0278778076171875,
-0.0301361083984375,
-0.047210693359375,
0.054351806640625,
0.00897216796875,
-0.025390625,
0.0247955322265625,
-0.052032470703125,
-0.0003800392150878906,
-0.0007586479187011719,
0.004985809326171875,
-0.0271453857421875,
-0.0018243789672851562,
0.015960693359375,
0.042938232421875,
-0.0092010498046875,
0.04638671875,
-0.0290985107421875,
-0.0157318115234375,
0.003101348876953125,
-0.01861572265625,
0.06402587890625,
0.036285400390625,
-0.03277587890625,
0.018096923828125,
-0.047119140625,
-0.0159454345703125,
0.027313232421875,
-0.01428985595703125,
0.006153106689453125,
-0.004657745361328125,
0.01003265380859375,
0.0275421142578125,
0.0287933349609375,
-0.031524658203125,
0.020599365234375,
-0.031219482421875,
0.03228759765625,
0.044342041015625,
-0.0277252197265625,
0.05364990234375,
0.00450897216796875,
0.03094482421875,
-0.01001739501953125,
0.0145721435546875,
-0.0111541748046875,
-0.03826904296875,
-0.06500244140625,
0.0001951456069946289,
0.00879669189453125,
0.03253173828125,
-0.05133056640625,
0.0504150390625,
-0.03204345703125,
-0.07781982421875,
-0.038726806640625,
0.01617431640625,
0.027374267578125,
0.0269622802734375,
0.033050537109375,
-0.0023021697998046875,
-0.048492431640625,
-0.053375244140625,
-0.006671905517578125,
-0.033233642578125,
0.0011339187622070312,
0.04840087890625,
0.0609130859375,
-0.034698486328125,
0.058990478515625,
-0.034942626953125,
-0.01020050048828125,
-0.01410675048828125,
-0.01163482666015625,
0.032440185546875,
0.0290374755859375,
0.046844482421875,
-0.050506591796875,
-0.029815673828125,
0.0158538818359375,
-0.07769775390625,
-0.005115509033203125,
-0.01056671142578125,
0.0005130767822265625,
0.0219879150390625,
0.0560302734375,
-0.06292724609375,
0.01291656494140625,
0.0599365234375,
-0.05108642578125,
0.032623291015625,
-0.006198883056640625,
-0.00817108154296875,
-0.08868408203125,
0.0452880859375,
0.00998687744140625,
0.003063201904296875,
-0.04815673828125,
0.0007014274597167969,
0.007350921630859375,
-0.014739990234375,
-0.0278167724609375,
0.060699462890625,
-0.016815185546875,
0.0159454345703125,
-0.028045654296875,
0.0161895751953125,
-0.0169525146484375,
0.039947509765625,
0.005046844482421875,
0.049072265625,
0.037017822265625,
-0.04998779296875,
0.0173797607421875,
0.0281829833984375,
-0.013702392578125,
0.0045318603515625,
-0.058135986328125,
-0.001888275146484375,
-0.0209197998046875,
0.029510498046875,
-0.06048583984375,
-0.0146484375,
0.0165252685546875,
-0.04815673828125,
0.0117645263671875,
0.007320404052734375,
-0.014923095703125,
-0.05499267578125,
0.0198822021484375,
0.0076751708984375,
0.039520263671875,
-0.06280517578125,
0.04669189453125,
0.03466796875,
-0.009552001953125,
-0.055267333984375,
-0.02520751953125,
0.006137847900390625,
-0.021820068359375,
-0.0670166015625,
0.0399169921875,
-0.0116424560546875,
-0.01436614990234375,
0.01279449462890625,
0.006427764892578125,
0.0176544189453125,
-0.003963470458984375,
0.0294036865234375,
0.03839111328125,
-0.017578125,
0.0019817352294921875,
-0.01425933837890625,
-0.01438140869140625,
0.0103912353515625,
-0.0169830322265625,
0.052978515625,
-0.0400390625,
0.005218505859375,
-0.034942626953125,
0.00006872415542602539,
0.04656982421875,
-0.0020751953125,
0.06268310546875,
0.055206298828125,
-0.0294952392578125,
0.005672454833984375,
-0.038177490234375,
-0.0291595458984375,
-0.0400390625,
0.0548095703125,
0.0014925003051757812,
-0.055816650390625,
0.06365966796875,
0.00959014892578125,
0.003505706787109375,
0.06329345703125,
0.048675537109375,
0.0159149169921875,
0.063232421875,
0.04949951171875,
-0.01525115966796875,
0.04364013671875,
-0.05035400390625,
0.021942138671875,
-0.044158935546875,
-0.0206451416015625,
-0.048370361328125,
-0.0015573501586914062,
-0.04388427734375,
-0.040740966796875,
0.0121307373046875,
0.01273345947265625,
-0.023529052734375,
0.052398681640625,
-0.045745849609375,
0.044921875,
0.033294677734375,
-0.0169830322265625,
-0.00019550323486328125,
-0.004421234130859375,
0.00897216796875,
0.01222991943359375,
-0.0609130859375,
-0.049560546875,
0.0859375,
0.042999267578125,
0.05059814453125,
-0.01380157470703125,
0.0264129638671875,
-0.0210418701171875,
0.0362548828125,
-0.0369873046875,
0.039215087890625,
-0.0289764404296875,
-0.060821533203125,
-0.00959014892578125,
-0.049102783203125,
-0.0667724609375,
0.0174102783203125,
-0.00737762451171875,
-0.052886962890625,
0.0335693359375,
-0.0038509368896484375,
-0.037841796875,
0.0306243896484375,
-0.06494140625,
0.07550048828125,
-0.004955291748046875,
-0.01378631591796875,
-0.002132415771484375,
-0.042633056640625,
0.028564453125,
-0.0010633468627929688,
0.00757598876953125,
0.01450347900390625,
0.032562255859375,
0.0684814453125,
-0.0156097412109375,
0.0653076171875,
-0.0172119140625,
-0.0029201507568359375,
0.0276031494140625,
-0.0163421630859375,
0.031494140625,
0.004314422607421875,
0.006603240966796875,
0.034423828125,
-0.0160064697265625,
-0.0078887939453125,
-0.0275421142578125,
0.059234619140625,
-0.07366943359375,
-0.037506103515625,
-0.01995849609375,
-0.04364013671875,
0.0108489990234375,
0.0167999267578125,
0.0270538330078125,
0.0012187957763671875,
-0.0227813720703125,
0.0267791748046875,
0.019378662109375,
-0.03546142578125,
0.055511474609375,
0.0232391357421875,
-0.037200927734375,
-0.027496337890625,
0.07073974609375,
0.01470947265625,
0.0255889892578125,
-0.0015811920166015625,
0.0009965896606445312,
-0.0430908203125,
-0.02020263671875,
-0.0380859375,
0.042816162109375,
-0.06463623046875,
-0.00572967529296875,
-0.06292724609375,
-0.029388427734375,
-0.04638671875,
-0.019561767578125,
-0.043060302734375,
-0.040191650390625,
-0.04229736328125,
-0.01181793212890625,
0.01020050048828125,
0.05731201171875,
-0.01336669921875,
0.044281005859375,
-0.034393310546875,
0.022125244140625,
0.011749267578125,
0.0119781494140625,
-0.0009455680847167969,
-0.05682373046875,
-0.0279693603515625,
-0.0012731552124023438,
-0.027862548828125,
-0.039306640625,
0.050567626953125,
-0.0008616447448730469,
0.029205322265625,
0.0311431884765625,
-0.045074462890625,
0.0478515625,
-0.031494140625,
0.06207275390625,
0.0075836181640625,
-0.06475830078125,
0.0197906494140625,
-0.029754638671875,
0.016021728515625,
0.0269317626953125,
0.026458740234375,
-0.038238525390625,
-0.0248870849609375,
-0.06732177734375,
-0.058441162109375,
0.0491943359375,
0.0196685791015625,
0.01666259765625,
0.00046634674072265625,
0.028839111328125,
-0.0131072998046875,
-0.0009560585021972656,
-0.081298828125,
-0.0303497314453125,
-0.0330810546875,
-0.00514984130859375,
-0.0282135009765625,
0.01058197021484375,
0.00638580322265625,
-0.021148681640625,
0.06402587890625,
-0.01116943359375,
0.034332275390625,
0.01410675048828125,
-0.007648468017578125,
-0.0203399658203125,
-0.0122222900390625,
0.043182373046875,
0.049652099609375,
-0.0210418701171875,
-0.0169219970703125,
0.001369476318359375,
-0.052215576171875,
-0.00832366943359375,
0.042755126953125,
-0.0240478515625,
0.002300262451171875,
0.005725860595703125,
0.079833984375,
-0.01410675048828125,
-0.0321044921875,
0.01107025146484375,
-0.004749298095703125,
-0.01279449462890625,
-0.0158538818359375,
0.004329681396484375,
0.032196044921875,
0.0223236083984375,
0.0380859375,
0.0236053466796875,
-0.015106201171875,
-0.033477783203125,
0.0117645263671875,
0.036468505859375,
-0.0206756591796875,
-0.032012939453125,
0.07977294921875,
0.00908660888671875,
-0.017913818359375,
0.033538818359375,
-0.0260772705078125,
-0.028411865234375,
0.07080078125,
0.040252685546875,
0.06353759765625,
-0.031280517578125,
0.034423828125,
0.04998779296875,
0.040618896484375,
-0.0080718994140625,
0.0027370452880859375,
0.04156494140625,
-0.05010986328125,
-0.032135009765625,
-0.060455322265625,
-0.0130462646484375,
0.0180206298828125,
-0.0208740234375,
0.0200042724609375,
-0.032745361328125,
-0.012176513671875,
0.0253753662109375,
0.01212310791015625,
-0.053558349609375,
0.019927978515625,
0.0161590576171875,
0.05029296875,
-0.0633544921875,
0.046875,
0.0270233154296875,
-0.049285888671875,
-0.0750732421875,
-0.033050537109375,
-0.006317138671875,
-0.05078125,
0.05950927734375,
0.022064208984375,
0.0087890625,
0.00281524658203125,
-0.0426025390625,
-0.09234619140625,
0.08489990234375,
0.02471923828125,
-0.037017822265625,
-0.00902557373046875,
0.0178680419921875,
0.059356689453125,
-0.0360107421875,
0.0290679931640625,
0.038818359375,
0.038970947265625,
0.00968170166015625,
-0.07025146484375,
0.0149383544921875,
-0.037689208984375,
-0.01593017578125,
-0.01168060302734375,
-0.0645751953125,
0.0826416015625,
-0.02276611328125,
-0.0208282470703125,
0.00229644775390625,
0.0648193359375,
0.0163116455078125,
0.0206756591796875,
0.02764892578125,
0.036376953125,
0.0673828125,
-0.01366424560546875,
0.0849609375,
-0.03253173828125,
0.042266845703125,
0.04693603515625,
-0.004741668701171875,
0.043182373046875,
0.02459716796875,
-0.0155792236328125,
0.032958984375,
0.043548583984375,
-0.0266571044921875,
0.0135955810546875,
0.0029850006103515625,
0.0102386474609375,
0.01351165771484375,
-0.010650634765625,
-0.037200927734375,
0.031585693359375,
0.0322265625,
-0.05364990234375,
-0.00179290771484375,
-0.00946044921875,
0.022857666015625,
-0.01351165771484375,
-0.01166534423828125,
0.03448486328125,
0.00724029541015625,
-0.0748291015625,
0.06951904296875,
0.0015430450439453125,
0.052581787109375,
-0.04241943359375,
0.0257720947265625,
-0.0205078125,
0.0234375,
-0.017822265625,
-0.0201568603515625,
0.0271759033203125,
-0.0003082752227783203,
-0.00860595703125,
-0.002017974853515625,
0.03863525390625,
-0.033966064453125,
-0.06402587890625,
0.0282745361328125,
0.0288848876953125,
0.007518768310546875,
-0.0106048583984375,
-0.050323486328125,
-0.0022220611572265625,
0.00482940673828125,
-0.034210205078125,
0.0022602081298828125,
0.00057220458984375,
0.01248931884765625,
0.052886962890625,
0.07196044921875,
0.0010013580322265625,
0.021728515625,
-0.005542755126953125,
0.07568359375,
-0.0498046875,
-0.03369140625,
-0.06353759765625,
0.02691650390625,
-0.00598907470703125,
-0.037322998046875,
0.0758056640625,
0.043731689453125,
0.056365966796875,
-0.017608642578125,
0.041900634765625,
-0.0258941650390625,
0.01203155517578125,
-0.025665283203125,
0.07012939453125,
-0.037384033203125,
0.011383056640625,
-0.033416748046875,
-0.08282470703125,
-0.0221710205078125,
0.05401611328125,
-0.01519775390625,
0.0245208740234375,
0.06256103515625,
0.0794677734375,
-0.003299713134765625,
0.0111846923828125,
0.0020084381103515625,
0.03106689453125,
0.04754638671875,
0.0305023193359375,
0.0364990234375,
-0.040130615234375,
0.049224853515625,
-0.025634765625,
-0.01250457763671875,
-0.006885528564453125,
-0.046356201171875,
-0.0753173828125,
-0.054656982421875,
-0.00743865966796875,
-0.04058837890625,
-0.014678955078125,
0.054229736328125,
0.053253173828125,
-0.0491943359375,
-0.023956298828125,
-0.0162506103515625,
-0.00894927978515625,
-0.0088348388671875,
-0.018096923828125,
0.053802490234375,
-0.020660400390625,
-0.07159423828125,
0.0382080078125,
0.0018939971923828125,
0.01029205322265625,
-0.040313720703125,
-0.006526947021484375,
-0.0157470703125,
0.01168060302734375,
0.0269622802734375,
0.0313720703125,
-0.04791259765625,
-0.01056671142578125,
0.041015625,
-0.0016613006591796875,
-0.0079345703125,
0.03863525390625,
-0.055755615234375,
0.0201873779296875,
0.033172607421875,
0.04248046875,
0.041015625,
-0.0006461143493652344,
0.0230560302734375,
-0.0223236083984375,
0.0125274658203125,
0.00902557373046875,
0.0305023193359375,
0.0296783447265625,
-0.037628173828125,
0.045562744140625,
0.0221710205078125,
-0.06610107421875,
-0.060150146484375,
-0.00965118408203125,
-0.07989501953125,
-0.02496337890625,
0.1063232421875,
-0.0004153251647949219,
-0.032806396484375,
0.01515960693359375,
-0.0029659271240234375,
0.021697998046875,
-0.0246429443359375,
0.08001708984375,
0.05474853515625,
0.0162506103515625,
0.0190582275390625,
-0.041107177734375,
0.043548583984375,
0.0400390625,
-0.06976318359375,
0.017425537109375,
0.028106689453125,
0.006526947021484375,
0.054107666015625,
0.0404052734375,
-0.0083465576171875,
0.0157928466796875,
0.0037441253662109375,
0.010467529296875,
-0.0057525634765625,
-0.035308837890625,
-0.0284271240234375,
0.0006208419799804688,
-0.0467529296875,
0.01531219482421875
]
] |
Salesforce/ctrl | 2023-07-11T14:45:34.000Z | [
"transformers",
"pytorch",
"tf",
"ctrl",
"text-generation",
"en",
"arxiv:1909.05858",
"arxiv:1910.09700",
"license:bsd-3-clause",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | Salesforce | null | null | Salesforce/ctrl | 5 | 11,686 | transformers | 2022-03-02T23:29:04 | ---
language: en
license: bsd-3-clause
pipeline_tag: text-generation
---
# ctrl
# Table of Contents
1. [Model Details](#model-details)
2. [Uses](#uses)
3. [Bias, Risks, and Limitations](#bias-risks-and-limitations)
4. [Training](#training)
5. [Evaluation](#evaluation)
6. [Environmental Impact](#environmental-impact)
7. [Technical Specifications](#technical-specifications)
8. [Citation](#citation)
9. [Model Card Authors](#model-card-authors)
10. [How To Get Started With the Model](#how-to-get-started-with-the-model)
# Model Details
## Model Description
The CTRL model was proposed in [CTRL: A Conditional Transformer Language Model for Controllable Generation](https://arxiv.org/abs/1909.05858) by Nitish Shirish Keskar*, Bryan McCann*, Lav R. Varshney, Caiming Xiong and Richard Socher. It's a causal (unidirectional) transformer pre-trained using language modeling on a very large corpus of ~140 GB of text data with the first token reserved as a control code (such as Links, Books, Wikipedia etc.). The model developers released a model card for CTRL, available [here](https://github.com/salesforce/ctrl/blob/master/ModelCard.pdf).
In their [model card](https://github.com/salesforce/ctrl/blob/master/ModelCard.pdf), the developers write:
> The CTRL Language Model analyzed in this card generates text conditioned on control codes that specify domain, style, topics, dates, entities, relationships between entities, plot points, and task-related behavior.
- **Developed by:** See [associated paper](https://arxiv.org/abs/1909.05858) from Salesforce Research
- **Model type:** Transformer-based language model
- **Language(s) (NLP):** Primarily English, some German, Spanish, French
- **License:** [BSD 3-Clause](https://github.com/salesforce/ctrl/blob/master/LICENSE.txt); also see [Code of Conduct](https://github.com/salesforce/ctrl)
- **Related Models:** More information needed
- **Parent Model:** More information needed
- **Resources for more information:**
- [Associated paper](https://arxiv.org/abs/1909.05858)
- [GitHub repo](https://github.com/salesforce/ctrl)
- [Developer Model Card](https://github.com/salesforce/ctrl/blob/master/ModelCard.pdf)
- [Blog post](https://blog.salesforceairesearch.com/introducing-a-conditional-transformer-language-model-for-controllable-generation/)
# Uses
## Direct Use
CTRL is a language model that can be used for text generation.
## Downstream Use
In their [model card](https://github.com/salesforce/ctrl/blob/master/ModelCard.pdf), the developers write that the primary intended users are general audiences and NLP Researchers, and that the primary intended uses are:
> 1. Generating artificial text in collaboration with a human, including but not limited to:
> - Creative writing
> - Automating repetitive writing tasks
> - Formatting specific text types
> - Creating contextualized marketing materials
> 2. Improvement of other NLP applications through fine-tuning (on another task or other data, e.g. fine-tuning CTRL to learn new kinds of language like product descriptions)
> 3. Enhancement in the field of natural language understanding to push towards a better understanding of artificial text generation, including how to detect it and work toward control, understanding, and potentially combating potentially negative consequences of such models.
## Out-of-Scope Use
In their [model card](https://github.com/salesforce/ctrl/blob/master/ModelCard.pdf), the developers write:
> - CTRL should not be used for generating artificial text without collaboration with a human.
> - It should not be used to make normative or prescriptive claims.
> - This software should not be used to promote or profit from:
> - violence, hate, and division;
> - environmental destruction;
> - abuse of human rights; or
> - the destruction of people's physical and mental health.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
In their [model card](https://github.com/salesforce/ctrl/blob/master/ModelCard.pdf), the developers write:
> We recognize the potential for misuse or abuse, including use by bad actors who could manipulate the system to act maliciously and generate text to influence decision-making in political, economic, and social settings. False attribution could also harm individuals, organizations, or other entities. To address these concerns, the model was evaluated internally as well as externally by third parties, including the Partnership on AI, prior to release.
> To mitigate potential misuse to the extent possible, we stripped out all detectable training data from undesirable sources. We then redteamed the model and found that negative utterances were often placed in contexts that made them identifiable as such. For example, when using the ‘News’ control code, hate speech could be embedded as part of an apology (e.g. “the politician apologized for saying [insert hateful statement]”), implying that this type of speech was negative. By pre-selecting the available control codes (omitting, for example, Instagram and Twitter from the available domains), we are able to limit the potential for misuse.
> In releasing our model, we hope to put it into the hands of researchers and prosocial actors so that they can work to control, understand, and potentially combat the negative consequences of such models. We hope that research into detecting fake news and model-generated content of all kinds will be pushed forward by CTRL. It is our belief that these models should become a common tool so researchers can design methods to guard against malicious use and so the public becomes familiar with their existence and patterns of behavior.
See the [associated paper](https://arxiv.org/pdf/1909.05858.pdf) for further discussions about the ethics of LLMs.
## Recommendations
In their [model card](https://github.com/salesforce/ctrl/blob/master/ModelCard.pdf), the developers write:
> - A recommendation to monitor and detect use will be implemented through the development of a model that will identify CTRL-generated text.
> - A second recommendation to further screen the input into and output from the model will be implemented through the addition of a check in the CTRL interface to prohibit the insertion into the model of certain negative inputs, which will help control the output that can be generated.
> - The model is trained on a limited number of languages: primarily English and some German, Spanish, French. A recommendation for a future area of research is to train the model on more languages.
See the [CTRL-detector GitHub repo](https://github.com/salesforce/ctrl-detector) for more on the detector model.
# Training
## Training Data
In their [model card](https://github.com/salesforce/ctrl/blob/master/ModelCard.pdf), the developers write:
> This model is trained on 140 GB of text drawn from a variety of domains: Wikipedia (English, German, Spanish, and French), Project Gutenberg, submissions from 45 subreddits, OpenWebText, a large collection of news data, Amazon Reviews, Europarl and UN data from WMT (En-De, En-Es, En-Fr), question-answer pairs (no context documents) from ELI5, and the MRQA shared task, which includes Stanford Question Answering Dataset, NewsQA, TriviaQA, SearchQA, HotpotQA, and Natural Questions. See the paper for the full list of training data.
## Training Procedure
### Preprocessing
In the [associated paper](https://arxiv.org/pdf/1909.05858.pdf) the developers write:
> We learn BPE (Sennrich et al., 2015) codes and tokenize the data using fastBPE4, but we use a large vocabulary of roughly 250K tokens. This includes the sub-word tokens necessary to mitigate problems with rare words, but it also reduces the average number of tokens required to generate long text by including most common words. We use English Wikipedia and a 5% split of our collected OpenWebText data for learning BPE codes. We also introduce an unknown token so that during preprocessing we can filter out sequences that contain more than 2 unknown tokens. This, along with the compressed storage for efficient training (TFRecords) (Abadi et al., 2016), reduces our training data to 140 GB from the total 180 GB collected.
See the paper for links, references, and further details.
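As a simple illustration of the unknown-token filtering step described above, a hedged sketch (a hypothetical helper, not the developers' preprocessing code):

```python
def keep_sequence(token_ids, unk_id, max_unknown=2):
    # Keep a tokenized sequence only if it contains at most `max_unknown`
    # unknown tokens, mirroring the filtering rule described above.
    return sum(1 for t in token_ids if t == unk_id) <= max_unknown

# `tokenized_corpus` and `unk_id=0` are placeholders; the real ids depend
# on the learned BPE vocabulary.
filtered = [seq for seq in tokenized_corpus if keep_sequence(seq, unk_id=0)]
```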
### Training
In the [associated paper](https://arxiv.org/pdf/1909.05858.pdf) the developers write:
> CTRL has model dimension d = 1280, inner dimension f = 8192, 48 layers, and 16 heads per layer. Dropout with probability 0.1 follows the residual connections in each layer. Token embeddings were tied with the final output embedding layer (Inan et al., 2016; Press & Wolf, 2016).
See the paper for links, references, and further details.
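For reference, these hyperparameters appear to correspond to the defaults of the `CTRLConfig` class in `transformers`; a minimal sketch for inspecting them:

```python
from transformers import CTRLConfig

# Default values mirror the architecture described above: model dimension 1280,
# inner (feed-forward) dimension 8192, 48 layers, 16 heads, dropout 0.1.
config = CTRLConfig()
print(config.n_embd, config.dff, config.n_layer, config.n_head, config.resid_pdrop)
```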
# Evaluation
## Testing Data, Factors & Metrics
In their [model card](https://github.com/salesforce/ctrl/blob/master/ModelCard.pdf), the developers write that model performance measures are:
> Performance evaluated on qualitative judgments by humans as to whether the control codes lead to text generated in the desired domain
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). Details are pulled from the [associated paper](https://arxiv.org/pdf/1909.05858.pdf).
- **Hardware Type:** TPU v3 Pod
- **Hours used:** Approximately 336 hours (2 weeks)
- **Cloud Provider:** GCP
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications
In the [associated paper](https://arxiv.org/pdf/1909.05858.pdf) the developers write:
> CTRL was implemented in TensorFlow (Abadi et al., 2016) and trained with a global batch size of 1024 distributed across 256 cores of a Cloud TPU v3 Pod for 800k iterations. Training took approximately 2 weeks using Adagrad (Duchi et al., 2011) with a linear warmup from 0 to 0.05 over 25k steps. The norm of gradients were clipped to 0.25 as in (Merity et al., 2017). Learning rate decay was not necessary due to the monotonic nature of the Adagrad accumulator. We compared to the Adam optimizer (Kingma & Ba, 2014) while training smaller models, but we noticed comparable convergence rates and significant memory savings with Adagrad. We also experimented with explicit memory-saving optimizers including SM3 (Anil et al., 2019), Adafactor (Shazeer & Stern, 2018), and NovoGrad (Ginsburg et al., 2019) with mixed results.
See the paper for links, references, and further details.
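A hedged PyTorch sketch of the optimization recipe described above (Adagrad, linear learning-rate warmup from 0 to 0.05 over 25k steps, gradient-norm clipping at 0.25). This assumes a `CTRLLMHeadModel` named `model` and is illustrative only, not the original TensorFlow training code:

```python
import torch

optimizer = torch.optim.Adagrad(model.parameters(), lr=0.05)
# Linear warmup from ~0 to the full learning rate over 25k steps.
scheduler = torch.optim.lr_scheduler.LinearLR(
    optimizer, start_factor=1e-8, end_factor=1.0, total_iters=25_000
)

def training_step(batch):
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), 0.25)  # clip grad norm to 0.25
    optimizer.step()
    scheduler.step()
    optimizer.zero_grad()
    return loss.item()
```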
# Citation
**BibTeX:**
```bibtex
@article{keskarCTRL2019,
title={{CTRL - A Conditional Transformer Language Model for Controllable Generation}},
author={Keskar, Nitish Shirish and McCann, Bryan and Varshney, Lav and Xiong, Caiming and Socher, Richard},
journal={arXiv preprint arXiv:1909.05858},
year={2019}
}
```
**APA:**
- Keskar, N. S., McCann, B., Varshney, L. R., Xiong, C., & Socher, R. (2019). Ctrl: A conditional transformer language model for controllable generation. arXiv preprint arXiv:1909.05858.
# Model Card Authors
This model card was written by the team at Hugging Face, referencing the [model card](https://github.com/salesforce/ctrl/blob/master/ModelCard.pdf) released by the developers.
# How to Get Started with the Model
Use the code below to get started with the model. See the [Hugging Face ctrl docs](https://huggingface.co/docs/transformers/model_doc/ctrl) for more information.
<details>
<summary> Click to expand </summary>
```python
>>> from transformers import CTRLTokenizer, CTRLModel
>>> import torch
>>> tokenizer = CTRLTokenizer.from_pretrained("ctrl")
>>> model = CTRLModel.from_pretrained("ctrl")
>>> # CTRL was trained with control codes as the first token
>>> inputs = tokenizer("Opinion My dog is cute", return_tensors="pt")
>>> assert inputs["input_ids"][0, 0].item() in tokenizer.control_codes.values()
>>> outputs = model(**inputs)
>>> last_hidden_states = outputs.last_hidden_state
>>> list(last_hidden_states.shape)
```
</details> | 12,455 | [
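For text generation with control codes, a hedged sketch using `CTRLLMHeadModel` (the decoding settings below are illustrative, not the developers' recommendations):

```python
from transformers import CTRLTokenizer, CTRLLMHeadModel

tokenizer = CTRLTokenizer.from_pretrained("Salesforce/ctrl")
model = CTRLLMHeadModel.from_pretrained("Salesforce/ctrl")

# Available control codes (e.g. "Links", "Books", "Wikipedia") and their ids:
print(sorted(tokenizer.control_codes.keys()))

# Prepend a control code to steer the domain of the generation.
inputs = tokenizer("Books In the beginning", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, repetition_penalty=1.2)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```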
[
-0.0200653076171875,
-0.070068359375,
0.01528167724609375,
0.00959014892578125,
-0.0159759521484375,
-0.015380859375,
-0.032257080078125,
-0.043670654296875,
-0.027252197265625,
0.055023193359375,
-0.03997802734375,
-0.045379638671875,
-0.037811279296875,
0.005329132080078125,
-0.053680419921875,
0.0899658203125,
0.0145721435546875,
-0.007709503173828125,
-0.0040283203125,
0.0031299591064453125,
-0.027008056640625,
-0.06591796875,
-0.06182861328125,
-0.0035266876220703125,
0.028350830078125,
0.01169586181640625,
0.0609130859375,
0.038604736328125,
0.042572021484375,
0.0221405029296875,
-0.020843505859375,
0.0144805908203125,
-0.0374755859375,
-0.005779266357421875,
-0.00878143310546875,
-0.0357666015625,
-0.04736328125,
0.0164031982421875,
0.02978515625,
0.016510009765625,
0.0026836395263671875,
0.0106201171875,
0.0032138824462890625,
0.0285797119140625,
-0.024688720703125,
-0.00894927978515625,
-0.052734375,
-0.003021240234375,
-0.0178375244140625,
-0.0120697021484375,
-0.046844482421875,
-0.0241241455078125,
0.0098419189453125,
-0.0350341796875,
0.02484130859375,
-0.00366973876953125,
0.08172607421875,
0.004726409912109375,
-0.041412353515625,
-0.0284576416015625,
-0.054412841796875,
0.06640625,
-0.080810546875,
0.042633056640625,
0.0248565673828125,
0.00682830810546875,
0.0002970695495605469,
-0.05810546875,
-0.0526123046875,
-0.042572021484375,
-0.01284027099609375,
0.00797271728515625,
-0.031585693359375,
0.01276397705078125,
0.02825927734375,
0.00913238525390625,
-0.0487060546875,
0.01154327392578125,
-0.039520263671875,
-0.0162353515625,
0.0599365234375,
0.0146942138671875,
0.036163330078125,
-0.0256805419921875,
-0.0236663818359375,
-0.00531005859375,
-0.052032470703125,
0.004146575927734375,
0.036468505859375,
0.035491943359375,
-0.0233612060546875,
0.0435791015625,
0.00749969482421875,
0.053466796875,
-0.00044155120849609375,
-0.005344390869140625,
0.016357421875,
-0.0521240234375,
-0.0183563232421875,
-0.024810791015625,
0.07049560546875,
0.0219879150390625,
0.0166015625,
-0.015716552734375,
-0.0140533447265625,
0.0169219970703125,
0.035400390625,
-0.05047607421875,
-0.00022041797637939453,
0.0233306884765625,
-0.036712646484375,
-0.043548583984375,
-0.006839752197265625,
-0.035308837890625,
-0.013824462890625,
-0.0269622802734375,
0.01800537109375,
-0.03070068359375,
-0.0301055908203125,
0.000028431415557861328,
-0.023895263671875,
-0.002689361572265625,
0.0205230712890625,
-0.04376220703125,
0.01385498046875,
0.039825439453125,
0.044219970703125,
-0.035186767578125,
-0.017974853515625,
0.0013866424560546875,
-0.0070648193359375,
-0.0109100341796875,
0.033447265625,
-0.027984619140625,
-0.02880859375,
0.0017461776733398438,
0.009063720703125,
-0.00075531005859375,
-0.036102294921875,
0.040740966796875,
-0.0288848876953125,
0.042327880859375,
0.002140045166015625,
-0.035369873046875,
-0.01024627685546875,
0.021240234375,
-0.057373046875,
0.083740234375,
-0.0024929046630859375,
-0.056060791015625,
0.01113128662109375,
-0.05743408203125,
-0.035888671875,
-0.01016998291015625,
0.00582122802734375,
-0.033721923828125,
-0.004695892333984375,
-0.0079193115234375,
0.0218505859375,
-0.027984619140625,
0.058380126953125,
-0.00502777099609375,
-0.0035572052001953125,
0.026885986328125,
-0.0479736328125,
0.091064453125,
0.02606201171875,
-0.0322265625,
0.005584716796875,
-0.05181884765625,
-0.002735137939453125,
0.01312255859375,
-0.03271484375,
-0.0175933837890625,
-0.004756927490234375,
0.026214599609375,
0.0297393798828125,
0.0189056396484375,
-0.020538330078125,
0.00360870361328125,
-0.0233306884765625,
0.04833984375,
0.045074462890625,
-0.00457763671875,
0.0496826171875,
-0.0158843994140625,
0.060455322265625,
0.01239013671875,
0.026092529296875,
-0.0081329345703125,
-0.02825927734375,
-0.059661865234375,
-0.00016748905181884766,
0.02386474609375,
0.03753662109375,
-0.0462646484375,
0.037689208984375,
-0.00437164306640625,
-0.05560302734375,
-0.03131103515625,
0.00177764892578125,
0.043487548828125,
0.033782958984375,
0.0274200439453125,
0.004688262939453125,
-0.05572509765625,
-0.07183837890625,
-0.0206298828125,
-0.004638671875,
0.01617431640625,
0.01898193359375,
0.041473388671875,
-0.01251983642578125,
0.07366943359375,
-0.02325439453125,
-0.01134490966796875,
-0.055389404296875,
0.01149749755859375,
-0.0012979507446289062,
0.039306640625,
0.03558349609375,
-0.0804443359375,
-0.0256195068359375,
-0.0093841552734375,
-0.047607421875,
-0.00955963134765625,
-0.0229949951171875,
-0.01389312744140625,
0.04351806640625,
0.046173095703125,
-0.049285888671875,
0.035980224609375,
0.058258056640625,
-0.028106689453125,
0.04241943359375,
0.00388336181640625,
-0.01270294189453125,
-0.09033203125,
0.0286865234375,
0.01284027099609375,
-0.01099395751953125,
-0.0645751953125,
0.0029582977294921875,
0.00859832763671875,
-0.0187530517578125,
-0.050140380859375,
0.0601806640625,
-0.0231781005859375,
0.03076171875,
-0.0224609375,
0.0196075439453125,
0.007366180419921875,
0.04132080078125,
0.023162841796875,
0.05767822265625,
0.00940704345703125,
-0.06573486328125,
-0.007259368896484375,
0.006336212158203125,
-0.0022335052490234375,
0.0247344970703125,
-0.05792236328125,
0.002437591552734375,
0.0028133392333984375,
0.013671875,
-0.042327880859375,
0.01496124267578125,
0.0169830322265625,
-0.055755615234375,
0.01456451416015625,
0.00955963134765625,
-0.047607421875,
-0.046417236328125,
-0.005290985107421875,
0.01320648193359375,
0.03302001953125,
-0.02294921875,
0.050384521484375,
0.049530029296875,
0.0125732421875,
-0.033111572265625,
-0.0634765625,
0.005084991455078125,
-0.01369476318359375,
-0.0523681640625,
0.0330810546875,
-0.0157623291015625,
-0.0126190185546875,
0.01418304443359375,
0.0097808837890625,
-0.014434814453125,
0.0261383056640625,
0.0138092041015625,
0.0251312255859375,
-0.004703521728515625,
0.0269622802734375,
-0.000949859619140625,
-0.00470733642578125,
-0.0021076202392578125,
-0.002101898193359375,
0.0477294921875,
0.0012636184692382812,
0.006866455078125,
-0.04437255859375,
0.0189666748046875,
0.0308380126953125,
-0.032318115234375,
0.046722412109375,
0.0399169921875,
-0.027618408203125,
-0.0285186767578125,
-0.037872314453125,
-0.026153564453125,
-0.03656005859375,
0.05206298828125,
0.008758544921875,
-0.0628662109375,
0.0236358642578125,
0.025634765625,
0.011932373046875,
0.03753662109375,
0.052947998046875,
0.00384521484375,
0.086181640625,
0.0706787109375,
-0.007396697998046875,
0.0504150390625,
-0.0257110595703125,
0.039764404296875,
-0.0396728515625,
-0.01018524169921875,
-0.056365966796875,
-0.006984710693359375,
-0.05712890625,
-0.0164794921875,
0.00855255126953125,
0.004669189453125,
-0.0194549560546875,
0.053314208984375,
-0.04266357421875,
0.01071929931640625,
0.064453125,
-0.003047943115234375,
0.022857666015625,
-0.018035888671875,
0.0013751983642578125,
-0.003787994384765625,
-0.056427001953125,
-0.050445556640625,
0.07708740234375,
0.027313232421875,
0.047760009765625,
-0.00946044921875,
0.0509033203125,
0.0279388427734375,
0.0210113525390625,
-0.04656982421875,
0.050445556640625,
-0.018096923828125,
-0.07904052734375,
-0.0134735107421875,
-0.0279083251953125,
-0.06622314453125,
0.0093841552734375,
-0.0170135498046875,
-0.044158935546875,
0.00220489501953125,
0.031341552734375,
-0.0221099853515625,
0.0179443359375,
-0.063720703125,
0.079345703125,
-0.036285400390625,
-0.0172576904296875,
-0.01338958740234375,
-0.040924072265625,
0.0302886962890625,
0.0023326873779296875,
0.0150299072265625,
0.0104522705078125,
0.01331329345703125,
0.06719970703125,
-0.029632568359375,
0.08929443359375,
-0.00423431396484375,
-0.002666473388671875,
0.0300750732421875,
-0.0015697479248046875,
0.0386962890625,
-0.0186004638671875,
-0.00041937828063964844,
0.0203704833984375,
-0.0026111602783203125,
0.00893402099609375,
-0.029510498046875,
0.0367431640625,
-0.05303955078125,
-0.04156494140625,
-0.0302276611328125,
-0.048370361328125,
0.005764007568359375,
0.0325927734375,
0.0286407470703125,
0.0254058837890625,
-0.01904296875,
0.0214385986328125,
0.0307159423828125,
-0.044219970703125,
0.027130126953125,
0.049224853515625,
-0.0189971923828125,
-0.0210418701171875,
0.05621337890625,
0.01010894775390625,
0.028350830078125,
0.0267181396484375,
0.0111541748046875,
-0.0186614990234375,
-0.036468505859375,
-0.00769805908203125,
0.0211334228515625,
-0.043853759765625,
-0.0299224853515625,
-0.06695556640625,
-0.038909912109375,
-0.041259765625,
-0.009765625,
-0.027984619140625,
-0.033935546875,
-0.048553466796875,
-0.022308349609375,
0.0181121826171875,
0.0404052734375,
0.005031585693359375,
0.03350830078125,
-0.057342529296875,
0.0221099853515625,
-0.01041412353515625,
-0.00234222412109375,
0.007793426513671875,
-0.046234130859375,
-0.023895263671875,
0.0247650146484375,
-0.046661376953125,
-0.051422119140625,
0.04931640625,
-0.007793426513671875,
0.04302978515625,
0.0168914794921875,
0.01055908203125,
0.0345458984375,
-0.026397705078125,
0.0765380859375,
0.005916595458984375,
-0.07342529296875,
0.0460205078125,
-0.01654052734375,
0.0188140869140625,
0.03228759765625,
0.029205322265625,
-0.06121826171875,
-0.06512451171875,
-0.0662841796875,
-0.0654296875,
0.06787109375,
0.02490234375,
0.0278778076171875,
-0.0187835693359375,
0.00140380859375,
-0.01001739501953125,
0.0277862548828125,
-0.08984375,
-0.0251922607421875,
-0.004566192626953125,
-0.0201416015625,
0.0016260147094726562,
-0.02630615234375,
0.005123138427734375,
-0.0118560791015625,
0.0640869140625,
0.017852783203125,
0.0267791748046875,
-0.0025844573974609375,
-0.028564453125,
0.0022983551025390625,
0.017425537109375,
0.052947998046875,
0.03326416015625,
-0.003940582275390625,
-0.0019512176513671875,
-0.01009368896484375,
-0.034423828125,
-0.0197601318359375,
0.0023479461669921875,
-0.046722412109375,
0.001953125,
0.0226287841796875,
0.06201171875,
0.01025390625,
-0.05908203125,
0.046417236328125,
0.00637054443359375,
-0.01502227783203125,
-0.0285186767578125,
0.004673004150390625,
0.017333984375,
0.0119781494140625,
0.01085662841796875,
0.006320953369140625,
0.00475311279296875,
-0.03289794921875,
0.0086669921875,
0.0386962890625,
-0.024658203125,
-0.04052734375,
0.056610107421875,
0.030487060546875,
-0.037506103515625,
0.0433349609375,
-0.0147705078125,
-0.0633544921875,
0.06353759765625,
0.054534912109375,
0.0615234375,
-0.00004017353057861328,
0.032196044921875,
0.039764404296875,
0.046966552734375,
0.0010776519775390625,
0.01593017578125,
0.006031036376953125,
-0.06500244140625,
-0.00164031982421875,
-0.021453857421875,
-0.005489349365234375,
0.015655517578125,
-0.035888671875,
0.044464111328125,
-0.027862548828125,
-0.037811279296875,
-0.0041656494140625,
0.0002689361572265625,
-0.0662841796875,
0.0194854736328125,
0.0156097412109375,
0.07977294921875,
-0.06402587890625,
0.0748291015625,
0.0239410400390625,
-0.0655517578125,
-0.065673828125,
-0.0157623291015625,
0.00913238525390625,
-0.05419921875,
0.06512451171875,
0.01001739501953125,
0.003948211669921875,
-0.000270843505859375,
-0.06597900390625,
-0.056610107421875,
0.08477783203125,
0.007556915283203125,
-0.043975830078125,
-0.02191162109375,
0.0183563232421875,
0.06201171875,
-0.020294189453125,
0.0201416015625,
0.0262451171875,
0.0362548828125,
-0.0209808349609375,
-0.071533203125,
0.0120086669921875,
-0.0234375,
0.00807952880859375,
0.00388336181640625,
-0.037811279296875,
0.067138671875,
-0.00958251953125,
-0.0333251953125,
0.0067901611328125,
0.02996826171875,
0.0124053955078125,
0.0124969482421875,
0.0222320556640625,
0.0396728515625,
0.05712890625,
0.002407073974609375,
0.08697509765625,
-0.035980224609375,
0.0280609130859375,
0.09588623046875,
-0.0041351318359375,
0.061065673828125,
0.025115966796875,
-0.003505706787109375,
0.01580810546875,
0.025146484375,
-0.0156402587890625,
0.031829833984375,
0.0013322830200195312,
0.0092926025390625,
-0.01056671142578125,
-0.023406982421875,
-0.0293426513671875,
0.03045654296875,
0.01806640625,
-0.06475830078125,
-0.0105438232421875,
0.006885528564453125,
0.0316162109375,
0.00030803680419921875,
-0.00007009506225585938,
0.041778564453125,
0.00815582275390625,
-0.050140380859375,
0.03704833984375,
0.01308441162109375,
0.0682373046875,
-0.056884765625,
0.006855010986328125,
-0.015655517578125,
0.0158843994140625,
-0.0064849853515625,
-0.040435791015625,
0.0217437744140625,
0.01259613037109375,
-0.01605224609375,
-0.023101806640625,
0.047821044921875,
-0.0421142578125,
-0.045257568359375,
0.03143310546875,
0.02886962890625,
-0.0125579833984375,
-0.006977081298828125,
-0.06884765625,
0.0198974609375,
0.014678955078125,
-0.0207672119140625,
0.006351470947265625,
0.0264739990234375,
-0.011444091796875,
0.05291748046875,
0.053466796875,
0.006252288818359375,
0.013702392578125,
0.0036830902099609375,
0.06280517578125,
-0.05712890625,
-0.029937744140625,
-0.05889892578125,
0.05499267578125,
-0.01143646240234375,
-0.03826904296875,
0.07275390625,
0.043365478515625,
0.0816650390625,
-0.0011692047119140625,
0.08026123046875,
-0.016265869140625,
0.037933349609375,
-0.03070068359375,
0.06298828125,
-0.0496826171875,
0.01305389404296875,
-0.021820068359375,
-0.04962158203125,
-0.037078857421875,
0.0294342041015625,
-0.03173828125,
0.00801849365234375,
0.061676025390625,
0.07781982421875,
0.005817413330078125,
-0.0005249977111816406,
0.0205230712890625,
0.025054931640625,
0.03692626953125,
0.0203704833984375,
0.0518798828125,
-0.032806396484375,
0.0565185546875,
-0.0328369140625,
0.004474639892578125,
-0.0196075439453125,
-0.06658935546875,
-0.075927734375,
-0.05120849609375,
-0.0228729248046875,
-0.04071044921875,
0.00789642333984375,
0.07373046875,
0.050079345703125,
-0.05712890625,
-0.028839111328125,
-0.01549530029296875,
0.01154327392578125,
-0.010833740234375,
-0.0212554931640625,
0.0197601318359375,
-0.02545166015625,
-0.0565185546875,
-0.0013446807861328125,
-0.0030612945556640625,
0.00395965576171875,
-0.0206756591796875,
-0.008331298828125,
-0.023101806640625,
0.0017385482788085938,
0.04949951171875,
0.0202178955078125,
-0.0556640625,
-0.002227783203125,
0.0217132568359375,
-0.034423828125,
-0.004970550537109375,
0.07037353515625,
-0.041656494140625,
0.0218658447265625,
0.03656005859375,
0.036834716796875,
0.0188446044921875,
-0.0085906982421875,
0.047576904296875,
-0.035736083984375,
0.01494598388671875,
0.033721923828125,
0.0144195556640625,
0.016357421875,
-0.038726806640625,
0.0302886962890625,
0.00762176513671875,
-0.03192138671875,
-0.06756591796875,
0.00997161865234375,
-0.07525634765625,
-0.0257415771484375,
0.1043701171875,
0.01033782958984375,
-0.02154541015625,
-0.003025054931640625,
-0.0256805419921875,
0.02154541015625,
-0.040924072265625,
0.044158935546875,
0.040069580078125,
0.0211181640625,
-0.01342010498046875,
-0.03839111328125,
0.041717529296875,
0.0022487640380859375,
-0.06365966796875,
0.012969970703125,
0.036834716796875,
0.02056884765625,
0.0094146728515625,
0.060211181640625,
-0.0151214599609375,
0.019317626953125,
-0.0208282470703125,
0.03515625,
-0.01398468017578125,
-0.02374267578125,
-0.0192108154296875,
-0.0076141357421875,
-0.007297515869140625,
0.005329132080078125
]
] |
ClueAI/PromptCLUE-base | 2023-04-02T08:57:29.000Z | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"zh",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | ClueAI | null | null | ClueAI/PromptCLUE-base | 67 | 11,677 | transformers | 2022-09-29T08:51:44 | ---
language:
- zh
license: creativeml-openrail-m
widget:
- text: "这是关于哪方面的新闻: \n如果日本沉没,中国会接收日本难民吗?\n选项:故事,文化,娱乐,体育,财经,房产,汽车,教育,科技,军事,旅游,国际,股票,农业,游戏\n答案:"
- text: "以下两句话是否表达相同意思:\n文本1:糖尿病腿麻木怎么办?\n文本2:糖尿病怎样控制生活方式\n选项:相似,不相似\n答案:"
- text: "阅读以下对话并回答问题。\n男:今天怎么这么晚才来上班啊?女:昨天工作到很晚,而且我还感冒了。男:那你回去休息吧,我帮你请假。女:谢谢你。\n问题:女的怎么样?\n选项:正在工作,感冒了,在打电话,要出差。\n答案:"
- text: "信息抽取:\n张玄武1990年出生中国国籍无境外居留权博士学历现任杭州线锁科技技术总监。\n问题:机构,人名,职位,籍贯,专业,国籍,种族\n答案:"
- text: "抽取关键词:\n当地时间21日,美国联邦储备委员会宣布加息75个基点,将联邦基金利率目标区间上调到3.00%至3.25%之间,符合市场预期。这是美联储今年以来第五次加息,也是连续第三次加息,创自1981年以来的最大密集加息幅度。\n关键词:"
- text: "翻译成中文:\nThis is a dialogue robot that can talk to people.\n答案:"
- text: "为下面的文章生成摘要:\n北京时间9月5日12时52分,四川甘孜藏族自治州泸定县发生6.8级地震。地震发生后,领导高度重视并作出重要指示,要求把抢救生命作为首要任务,全力救援受灾群众,最大限度减少人员伤亡\n摘要:"
- text: "推理关系判断:\n前提:小明明天要去北京\n假设:小明计划明天去上海\n选项:矛盾,蕴含,中立\n答案:"
- text: "问答:\n问题:小米的创始人是谁?\n答案:"
---
<a href="https://colab.research.google.com/drive/1noyBA_JrYO6Lk6cwxsNZ_jdJ-Jtaf82G?usp=sharing"><img src="https://colab.research.google.com/assets/colab-badge.svg"></a>
PromptCLUE: a zero-shot learning model for all-Chinese tasks
The model was pretrained on a 100-billion-token Chinese corpus (learning from roughly 1.5 trillion Chinese tokens in total) and then given prompt-based, task-style training on hundreds of tasks. For understanding tasks such as classification, sentiment analysis, and extraction, you can define your own label sets; for a wide range of generation tasks, it supports free-form sampled generation.
<a href='https://www.cluebenchmarks.com/clueai.html'>Online demo</a> |
<a href='https://www.clueai.cn'>clueai toolkit and API (large version)</a> |
<a href='https://github.com/clue-ai/PromptCLUE'>GitHub project</a> |
<a href='https://colab.research.google.com/drive/1noyBA_JrYO6Lk6cwxsNZ_jdJ-Jtaf82G?usp=sharing#scrollTo=Nk2tSi3vnSN0'>Try it on Colab</a>
Load the model:
```python
# load the model
from transformers import T5Tokenizer, T5ForConditionalGeneration
tokenizer = T5Tokenizer.from_pretrained("ClueAI/PromptCLUE-base")
model = T5ForConditionalGeneration.from_pretrained("ClueAI/PromptCLUE-base")
```
Run inference with the model:
```python
import torch

# device = torch.device('cpu')
device = torch.device('cuda')
model.to(device)

def preprocess(text):
    return text.replace("\n", "_")

def postprocess(text):
    return text.replace("_", "\n")

def answer(text, sample=False, top_p=0.8):
    '''sample: whether to sample; set True for generation tasks.
    top_p: between 0 and 1; larger values give more diverse output.'''
    text = preprocess(text)
    encoding = tokenizer(text=[text], truncation=True, padding=True, max_length=768, return_tensors="pt").to(device)
    if not sample:
        out = model.generate(**encoding, return_dict_in_generate=True, output_scores=False, max_length=128, num_beams=4, length_penalty=0.6)
    else:
        out = model.generate(**encoding, return_dict_in_generate=True, output_scores=False, max_length=64, do_sample=True, top_p=top_p)
    out_text = tokenizer.batch_decode(out["sequences"], skip_special_tokens=True)
    return postprocess(out_text[0])
```
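A quick usage sketch of the `answer` helper defined above (the prompt is taken from the sentiment-analysis example below; the exact model output may differ):

```python
prompt = "情感分析:\n这个看上去还可以,但其实我不喜欢\n选项:积极,消极\n答案:"
print(answer(prompt))                           # beam-search decoding
print(answer(prompt, sample=True, top_p=0.9))   # sampled, more diverse output
```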
### Example inputs
#### News classification (classify)
```bash
Input:
分类任务:
折价率过低遭抛售基金泰和跌7.15%,证券时报记者 朱景锋本报讯 由于折价率在大盘封基中处于最低水平,基金泰和昨日遭到投资者大举抛售,跌幅达到7.15%,远超大盘。盘面显示,基金泰和随大盘高开,之后开始震荡走低,午后开始加速下行,几乎没有像样反弹。截至收盘时,在沪深300指数仅下跌2.56%的情况下,基金泰和收盘跌幅高达7.15%,在所有封基中跌幅最大,而昨日多数封基跌幅在2%左右。
选项:财经,娱乐,时政,股票
答案:
Model output:
财经
```
#### Intent classification (classify)
```bash
Input:
意图分类:
帮我定一个周日上海浦东的房间
选项:闹钟,文学,酒店,艺术,体育,健康,天气,其他
答案:
Model output:
酒店
```
#### Sentiment analysis (classify)
```bash
Input:
情感分析:
这个看上去还可以,但其实我不喜欢
选项:积极,消极
答案:
Model output:
消极
```
#### Natural language inference (generate)
```bash
Input:
请推理出上下文的关系:
前提:对不起事情就是这样。
假设:事情就是这样,不需要道歉。
选项:中立,蕴涵,矛盾
答案:
Model output:
矛盾
```
#### Reading comprehension (generate)
```bash
Input:
阅读文章,给出答案:
段落:
港汇指数,全称港元实际汇兑指数(Effective Exchange Rate Index for the Hong Kong Dollar)是由香港政府统计处编制的一项指数,以反映港元与香港主要贸易伙伴之货币的名义有效汇率加权平均数的变动情况。加权比重是按1999年至2000年平均贸易模式所制定,但政府并未有公布详细的计算公式。旧港汇指数基准日为2000年1月1日,基数为100点。由2012年1月3日起,新系列港汇指数 (包括15种货币及以2010年1月 = 100) 已取代旧港汇指数系列。港汇指数的作用,主要是用于反映香港的货品及服务的价格相对于其主要贸易伙伴的变动,并通常被视作反映香港价格竞争力的指标。
问题:港汇指数的加权比重如何制定?
答案:
Model output:
按1999年至2000年平均贸易模式所制定
```
#### Free-form reading comprehension (generate)
```bash
Input:
阅读以下对话并回答问题。
男:今天怎么这么晚才来上班啊?女:昨天工作到很晚,而且我还感冒了。男:那你回去休息吧,我帮你请假。女:谢谢你。
问题:女的怎么样?
选项:正在工作,感冒了,在打电话,要出差。
答案:
Model output:
感冒了
```
#### Summarization (generate)
```bash
Input:
为下面的文章生成摘要:
北京时间9月5日12时52分,四川甘孜藏族自治州泸定县发生6.8级地震。地震发生后,领导高度重视并作出重要指示,要求把抢救生命作为首要任务,全力救援受灾群众,最大限度减少人员伤亡
答案:
Model output:
四川甘孜发生6.8级地震
```
#### Translation, Chinese to English (generate)
```bash
Input:
翻译成英文:
议长去了台湾,中国人民很愤怒。
答案:
Model output:
The secretary went to Taiwan and the Chinese people were angry.
```
#### Translation, English to Chinese (generate)
```bash
Input:
翻译成中文:
This is a dialogue robot that can talk to people.
答案:
Model output:
这是一台可以与人交谈的对话机器人。
```
#### General information extraction (generate)
```bash
Input:
信息抽取:
据新华社电广东省清远市清城区政府昨日对外发布信息称,日前被实名举报涉嫌勒索企业、说“分分钟可以搞垮一间厂”的清城区环保局局长陈柏,已被免去清城区区委委员
问题:机构名,人名,职位
答案:
Model output:
机构名:新华社,清城区政府,清城区环保局,清城区区委
人名:陈柏
职位:局长,区委委员
```
#### Resume information extraction (generate)
```bash
Input:
阅读文本抽取关键信息:
张玄武2000年出生中国国籍无境外居留权博士学历现任杭州线锁科技技术总监。
问题:机构,人名,职位,籍贯,专业,国籍,学历,种族
答案:
Model output:
人名:张玄武
职位:杭州线锁科技技术总监
国籍:中国国籍
学历:博士学历
```
#### Medical information extraction (generate)
```bash
Input:
从文本中抽取信息:
患者精神可,饮食可,睡眠可,二便正常。患者通过综合治疗头晕症状较前减轻,患者继续口服改善脑血管及调整血压变化药物。
问题:症状,治疗,检查,身体部位,疾病
答案:
Model output:
症状:头晕
治疗:改善脑血管及调整血压变化药物
身体部位:二便
```
#### E-commerce customer request analysis (classify)
```bash
Input:
电商客户诉求分类:
收到但不太合身,可以退换吗
选项:买家咨询商品是否支持花呗付款,买家表示收藏关注店铺,买家咨询退换货规则,买家需要商品推荐
答案:
Model output:
买家咨询退换货规则
```
#### Medical semantic similarity (classify)
```bash
Input:
下面句子是否表示了相同的语义:
文本1:糖尿病腿麻木怎么办?
文本2:糖尿病怎样控制生活方式
选项:相似,不相似
答案:
Model output:
不相似
```
#### Question generation (generate)
```bash
Input:
问题生成:
中新网2022年9月22日电 22日,商务部召开例行新闻发布会,商务部新闻发言人束珏婷表示,今年1-8月,中国实际使用外资1384亿美元,增长20.2%;其中,欧盟对华投资增长123.7%(含通过自由港投资数据)。这充分表明,包括欧盟在内的外国投资者持续看好中国市场,希望继续深化对华投资合作。
答案:
Model output:
1、商务部发言人束珏婷表示,中国去年1-8月实际使用外资1384亿美元,增长20.2%,其中欧盟对华投资增长多少?
2、欧盟对华投资的增长是否达到预期?
3、欧盟在对华投资合作方面取得什么样的成果?
```
#### Coreference resolution (generate)
```bash
Input:
指代消解:
段落:
少平跟润叶进了她二爸家的院子,润生走过来对他(代词)说:“我到宿舍找了你两回,你到哪里去了?”
问题:代词“他”指代的是?
答案:
Model output:
少平
```
#### Keyword extraction (generate)
```bash
Input:
抽取关键词:
当地时间21日,美国联邦储备委员会宣布加息75个基点,将联邦基金利率目标区间上调到3.00%至3.25%之间,符合市场预期。这是美联储今年以来第五次加息,也是连续第三次加息,创自1981年以来的最大密集加息幅度。
关键词:
Model output:
美联储,利率目标区间,加息,基点
```
#### Emotion tendency (classify)
```bash
文字中包含了怎样的情感:
超可爱的帅哥,爱了。。。
选项:厌恶,喜欢,开心,悲伤,惊讶,生气,害怕
答案:
Model output:
喜欢
```
For more example capabilities and models, see
[ClueAI](https://github.com/clue-ai/PromptCLUE)
| 5,944 | [
[
-0.042083740234375,
-0.051605224609375,
0.0372314453125,
0.01715087890625,
-0.029388427734375,
-0.01348114013671875,
-0.0065765380859375,
-0.0136566162109375,
0.030059814453125,
0.01849365234375,
-0.039703369140625,
-0.0576171875,
-0.04376220703125,
0.01276397705078125,
0.0027980804443359375,
0.0570068359375,
0.0026531219482421875,
-0.003482818603515625,
0.02459716796875,
0.002685546875,
-0.038665771484375,
-0.03143310546875,
-0.045501708984375,
-0.0133056640625,
0.01140594482421875,
0.009918212890625,
0.039337158203125,
0.064697265625,
0.03765869140625,
0.021148681640625,
0.007694244384765625,
0.00930023193359375,
-0.016326904296875,
-0.00476837158203125,
0.0115509033203125,
-0.0206756591796875,
-0.032318115234375,
-0.00669097900390625,
0.044769287109375,
0.043121337890625,
0.0196685791015625,
0.039703369140625,
-0.0025730133056640625,
0.036529541015625,
-0.0218505859375,
0.0236358642578125,
-0.0256805419921875,
-0.0171356201171875,
-0.0202178955078125,
-0.0209503173828125,
-0.0158538818359375,
-0.04815673828125,
-0.0257110595703125,
-0.032073974609375,
0.0097198486328125,
0.012603759765625,
0.10333251953125,
0.0008454322814941406,
-0.0176239013671875,
-0.006755828857421875,
-0.04388427734375,
0.06390380859375,
-0.07366943359375,
0.009613037109375,
0.011871337890625,
0.01537322998046875,
-0.0166015625,
-0.06866455078125,
-0.06732177734375,
0.01074981689453125,
-0.032318115234375,
0.0150299072265625,
-0.01093292236328125,
-0.0171356201171875,
0.02154541015625,
0.010833740234375,
-0.03350830078125,
-0.0081787109375,
-0.02789306640625,
-0.016082763671875,
0.04254150390625,
0.0200347900390625,
0.0487060546875,
-0.055755615234375,
-0.0294342041015625,
-0.037078857421875,
-0.0179443359375,
0.039306640625,
0.01629638671875,
0.0305633544921875,
-0.01995849609375,
0.043701171875,
-0.0154266357421875,
0.03326416015625,
0.01222991943359375,
-0.039459228515625,
0.03955078125,
-0.035125732421875,
-0.0233612060546875,
-0.00811767578125,
0.094970703125,
0.04425048828125,
-0.00925445556640625,
0.0228271484375,
-0.0206451416015625,
0.02069091796875,
-0.021697998046875,
-0.059783935546875,
-0.022613525390625,
0.0355224609375,
-0.0462646484375,
-0.024383544921875,
0.0164337158203125,
-0.07464599609375,
0.0070343017578125,
0.004657745361328125,
0.040008544921875,
-0.044189453125,
-0.02557373046875,
0.00006842613220214844,
-0.027496337890625,
0.021087646484375,
0.0211334228515625,
-0.0545654296875,
0.01139068603515625,
0.038360595703125,
0.052459716796875,
0.0031032562255859375,
-0.019989013671875,
-0.03118896484375,
0.0038089752197265625,
-0.0186614990234375,
0.0297393798828125,
-0.031951904296875,
-0.0232696533203125,
-0.0228271484375,
0.022796630859375,
-0.0196533203125,
-0.01271820068359375,
0.0294342041015625,
-0.0269927978515625,
0.03179931640625,
-0.015594482421875,
-0.0189208984375,
-0.0260772705078125,
0.0085601806640625,
-0.042236328125,
0.0731201171875,
0.0204010009765625,
-0.089111328125,
0.013824462890625,
-0.0528564453125,
-0.0262451171875,
0.0033092498779296875,
-0.01448822021484375,
-0.03570556640625,
-0.041961669921875,
0.03472900390625,
0.035736083984375,
-0.0251312255859375,
0.01611328125,
-0.013214111328125,
-0.0176239013671875,
0.01180267333984375,
-0.0146484375,
0.0999755859375,
0.0283660888671875,
-0.038909912109375,
0.0298309326171875,
-0.051483154296875,
0.01273345947265625,
0.0341796875,
-0.0235748291015625,
-0.005825042724609375,
-0.00852203369140625,
-0.00389862060546875,
0.017578125,
0.03729248046875,
-0.0400390625,
0.0184326171875,
-0.03668212890625,
0.046722412109375,
0.06719970703125,
0.0198516845703125,
0.031524658203125,
-0.049346923828125,
0.03515625,
0.00920867919921875,
0.0034084320068359375,
-0.037078857421875,
-0.04766845703125,
-0.06781005859375,
-0.0276641845703125,
0.0169525146484375,
0.068359375,
-0.054656982421875,
0.056060791015625,
-0.0242919921875,
-0.04852294921875,
-0.036468505859375,
-0.00897979736328125,
0.015838623046875,
0.03729248046875,
0.035430908203125,
-0.007904052734375,
-0.061187744140625,
-0.0518798828125,
0.01461029052734375,
-0.031768798828125,
-0.01473236083984375,
0.0131988525390625,
0.04925537109375,
-0.0272674560546875,
0.054779052734375,
-0.05462646484375,
-0.0145111083984375,
-0.01898193359375,
0.00913238525390625,
0.045166015625,
0.0474853515625,
0.0430908203125,
-0.041046142578125,
-0.040496826171875,
-0.00004845857620239258,
-0.053436279296875,
0.00673675537109375,
-0.0215606689453125,
-0.0239715576171875,
0.015899658203125,
0.00833892822265625,
-0.03924560546875,
0.01546478271484375,
0.0149078369140625,
-0.02105712890625,
0.0599365234375,
-0.00511932373046875,
0.0185699462890625,
-0.092529296875,
0.0139007568359375,
-0.005645751953125,
0.00392913818359375,
-0.04510498046875,
0.016998291015625,
-0.00308990478515625,
0.0065765380859375,
-0.0217132568359375,
0.04315185546875,
-0.0179901123046875,
0.020355224609375,
0.0169677734375,
-0.0012655258178710938,
-0.00672149658203125,
0.0599365234375,
-0.0030574798583984375,
0.04046630859375,
0.05023193359375,
-0.05303955078125,
0.0250091552734375,
0.011749267578125,
-0.0162811279296875,
-0.0032405853271484375,
-0.04327392578125,
0.00307464599609375,
-0.005908966064453125,
0.00272369384765625,
-0.09161376953125,
-0.0277252197265625,
0.056793212890625,
-0.061492919921875,
0.0089111328125,
-0.0018320083618164062,
-0.027191162109375,
-0.04937744140625,
-0.043701171875,
-0.0014886856079101562,
0.059295654296875,
-0.03289794921875,
0.045928955078125,
0.01432037353515625,
-0.006069183349609375,
-0.044097900390625,
-0.03424072265625,
-0.030517578125,
-0.016448974609375,
-0.0616455078125,
0.01959228515625,
-0.01030731201171875,
-0.0021381378173828125,
-0.005523681640625,
0.0084381103515625,
0.00560760498046875,
-0.0117034912109375,
0.0196075439453125,
0.041046142578125,
-0.01428985595703125,
-0.01488494873046875,
-0.005130767822265625,
-0.00330352783203125,
0.01035308837890625,
-0.00910186767578125,
0.06231689453125,
-0.01055145263671875,
-0.0296173095703125,
-0.049957275390625,
0.0205078125,
0.044708251953125,
-0.0137481689453125,
0.041259765625,
0.0670166015625,
-0.01488494873046875,
0.00141143798828125,
-0.0236358642578125,
-0.00839996337890625,
-0.036102294921875,
0.0239715576171875,
-0.0304718017578125,
-0.0517578125,
0.04669189453125,
-0.004856109619140625,
0.0279693603515625,
0.06060791015625,
0.031982421875,
-0.01045989990234375,
0.093505859375,
0.031280517578125,
-0.0058746337890625,
0.022674560546875,
-0.040008544921875,
0.0213165283203125,
-0.044769287109375,
-0.03509521484375,
-0.040283203125,
-0.01953125,
-0.06060791015625,
-0.04510498046875,
0.0305938720703125,
0.0036182403564453125,
-0.04388427734375,
0.0262298583984375,
-0.06512451171875,
-0.01448822021484375,
0.06011962890625,
0.00992584228515625,
-0.0025768280029296875,
-0.00933074951171875,
-0.021331787109375,
0.0006237030029296875,
-0.04327392578125,
-0.049774169921875,
0.06549072265625,
0.0268096923828125,
0.046905517578125,
0.00102996826171875,
0.06463623046875,
0.01473236083984375,
0.0018367767333984375,
-0.039459228515625,
0.056182861328125,
-0.01221466064453125,
-0.0244598388671875,
-0.04150390625,
-0.03240966796875,
-0.07470703125,
0.01424407958984375,
-0.0133819580078125,
-0.07769775390625,
0.0078887939453125,
-0.0042724609375,
-0.0263671875,
0.050018310546875,
-0.053680419921875,
0.0521240234375,
-0.01922607421875,
-0.0106201171875,
0.02008056640625,
-0.049835205078125,
0.047149658203125,
0.005229949951171875,
0.025054931640625,
-0.014801025390625,
0.0164337158203125,
0.08038330078125,
-0.0357666015625,
0.049102783203125,
-0.018585205078125,
0.0129241943359375,
0.029022216796875,
0.012969970703125,
0.049346923828125,
0.01131439208984375,
0.003925323486328125,
0.0209503173828125,
0.0164794921875,
-0.0204010009765625,
-0.020751953125,
0.050506591796875,
-0.0714111328125,
-0.06524658203125,
-0.037841796875,
-0.0170440673828125,
0.02685546875,
0.04071044921875,
0.040283203125,
0.010498046875,
0.017791748046875,
-0.0036106109619140625,
0.0295867919921875,
-0.041534423828125,
0.042938232421875,
0.018341064453125,
-0.0301971435546875,
-0.05438232421875,
0.07598876953125,
0.00560760498046875,
0.01200103759765625,
0.036865234375,
0.0123748779296875,
-0.015716552734375,
-0.01837158203125,
-0.04461669921875,
0.0244293212890625,
-0.0474853515625,
-0.02423095703125,
-0.0711669921875,
-0.032073974609375,
-0.06939697265625,
-0.013946533203125,
-0.005290985107421875,
-0.0258636474609375,
-0.0249481201171875,
0.0009708404541015625,
0.032257080078125,
0.03143310546875,
-0.005512237548828125,
0.012115478515625,
-0.052276611328125,
0.0504150390625,
0.0208740234375,
0.0117950439453125,
0.01013946533203125,
-0.0238800048828125,
-0.005908966064453125,
0.01523590087890625,
-0.0271453857421875,
-0.073486328125,
0.0528564453125,
-0.01207733154296875,
0.037384033203125,
0.043914794921875,
0.0230255126953125,
0.052001953125,
-0.01227569580078125,
0.0750732421875,
0.03973388671875,
-0.0711669921875,
0.04425048828125,
-0.0207366943359375,
0.021270751953125,
0.0012722015380859375,
0.0284271240234375,
-0.047271728515625,
-0.03131103515625,
-0.04766845703125,
-0.07440185546875,
0.07928466796875,
0.0209808349609375,
-0.006011962890625,
0.0014362335205078125,
-0.01374053955078125,
-0.0014247894287109375,
-0.00405120849609375,
-0.061553955078125,
-0.059539794921875,
-0.033966064453125,
-0.01047515869140625,
0.013427734375,
-0.00867462158203125,
0.00043582916259765625,
-0.03509521484375,
0.05877685546875,
0.01296234130859375,
0.052032470703125,
0.01198577880859375,
0.0157470703125,
-0.01715087890625,
0.0169677734375,
0.01708984375,
0.02471923828125,
-0.028350830078125,
0.01488494873046875,
0.0193634033203125,
-0.0310211181640625,
0.0171966552734375,
-0.002197265625,
-0.04156494140625,
0.01311492919921875,
0.031707763671875,
0.05963134765625,
-0.0196990966796875,
-0.0275726318359375,
0.0239715576171875,
-0.00765228271484375,
-0.022003173828125,
-0.028839111328125,
0.0264892578125,
0.01177978515625,
0.002765655517578125,
0.03839111328125,
0.01296234130859375,
-0.0015964508056640625,
-0.039794921875,
0.01410675048828125,
0.029052734375,
0.0066070556640625,
-0.006191253662109375,
0.06396484375,
-0.00991058349609375,
-0.026702880859375,
0.039794921875,
-0.038665771484375,
-0.05914306640625,
0.06756591796875,
0.035980224609375,
0.0526123046875,
-0.00798797607421875,
0.02398681640625,
0.08782958984375,
0.031524658203125,
-0.004718780517578125,
0.032989501953125,
0.01210784912109375,
-0.043853759765625,
-0.01062774658203125,
-0.060546875,
-0.003757476806640625,
0.0286407470703125,
-0.03118896484375,
0.036102294921875,
-0.05535888671875,
-0.027679443359375,
0.0219573974609375,
0.01528167724609375,
-0.044342041015625,
0.01538848876953125,
0.0077056884765625,
0.053131103515625,
-0.07958984375,
0.0582275390625,
0.047332763671875,
-0.04608154296875,
-0.07684326171875,
-0.0082550048828125,
-0.015350341796875,
-0.043853759765625,
0.05120849609375,
0.0162506103515625,
-0.007236480712890625,
-0.00260162353515625,
-0.044708251953125,
-0.08880615234375,
0.086669921875,
0.002742767333984375,
-0.0197906494140625,
0.00010126829147338867,
-0.0016698837280273438,
0.0308990478515625,
-0.027099609375,
0.0335693359375,
0.040557861328125,
0.055877685546875,
0.0102691650390625,
-0.06304931640625,
0.037994384765625,
-0.04608154296875,
-0.0016126632690429688,
0.00910186767578125,
-0.058837890625,
0.100341796875,
-0.02288818359375,
-0.0494384765625,
-0.003696441650390625,
0.049224853515625,
0.03680419921875,
0.0347900390625,
0.03948974609375,
0.029876708984375,
0.054534912109375,
-0.02685546875,
0.045501708984375,
-0.021026611328125,
0.0322265625,
0.0662841796875,
-0.0022296905517578125,
0.040283203125,
0.00798797607421875,
-0.04266357421875,
0.056884765625,
0.060577392578125,
-0.03680419921875,
0.036529541015625,
-0.00418853759765625,
-0.003856658935546875,
-0.018829345703125,
0.01220703125,
-0.04058837890625,
0.01421356201171875,
0.02685546875,
-0.0263214111328125,
0.0004696846008300781,
0.0044403076171875,
0.0198516845703125,
-0.00992584228515625,
-0.010986328125,
0.055084228515625,
-0.020294189453125,
-0.06280517578125,
0.063232421875,
0.028961181640625,
0.0677490234375,
-0.05303955078125,
-0.009979248046875,
-0.00441741943359375,
-0.0012140274047851562,
-0.02093505859375,
-0.058868408203125,
-0.0030574798583984375,
-0.00023794174194335938,
-0.0218505859375,
0.007503509521484375,
0.041961669921875,
-0.0166473388671875,
-0.057373046875,
0.0260162353515625,
0.016845703125,
0.022979736328125,
0.034332275390625,
-0.0548095703125,
0.00202178955078125,
0.03955078125,
-0.018096923828125,
0.007480621337890625,
0.037841796875,
-0.0019292831420898438,
0.051239013671875,
0.0673828125,
0.01172637939453125,
0.0053863525390625,
-0.0172576904296875,
0.07708740234375,
-0.060028076171875,
-0.038116455078125,
-0.064453125,
0.04046630859375,
-0.00235748291015625,
-0.0168609619140625,
0.064697265625,
0.040252685546875,
0.07098388671875,
-0.0057220458984375,
0.08282470703125,
-0.04437255859375,
0.04022216796875,
-0.0219573974609375,
0.0457763671875,
-0.03875732421875,
0.0124053955078125,
-0.035308837890625,
-0.0423583984375,
-0.0108184814453125,
0.044189453125,
-0.02349853515625,
0.018157958984375,
0.0504150390625,
0.0697021484375,
0.015716552734375,
-0.018341064453125,
0.0086517333984375,
0.008087158203125,
0.0145263671875,
0.0587158203125,
0.0174560546875,
-0.029022216796875,
0.045166015625,
-0.058746337890625,
-0.014404296875,
-0.019073486328125,
-0.033599853515625,
-0.05975341796875,
-0.042999267578125,
-0.0151519775390625,
-0.03607177734375,
-0.0269317626953125,
0.07342529296875,
0.041748046875,
-0.07891845703125,
-0.021270751953125,
0.01160430908203125,
0.0278778076171875,
-0.047332763671875,
-0.0250396728515625,
0.0635986328125,
-0.01483917236328125,
-0.06396484375,
-0.0056610107421875,
0.0008130073547363281,
0.005889892578125,
-0.00508880615234375,
-0.017120361328125,
-0.0237579345703125,
0.004901885986328125,
0.0218505859375,
0.011962890625,
-0.06268310546875,
-0.00922393798828125,
0.015716552734375,
-0.0158843994140625,
0.01523590087890625,
0.0176544189453125,
-0.039794921875,
0.038848876953125,
0.0546875,
-0.0037364959716796875,
0.0270233154296875,
0.0097503662109375,
0.01511383056640625,
-0.030792236328125,
-0.0070037841796875,
-0.004566192626953125,
0.03253173828125,
0.014801025390625,
-0.041900634765625,
0.0301971435546875,
0.024261474609375,
-0.0302276611328125,
-0.05718994140625,
-0.003063201904296875,
-0.07611083984375,
-0.0295257568359375,
0.080078125,
-0.0213165283203125,
-0.0323486328125,
0.0026340484619140625,
-0.0262603759765625,
0.05517578125,
-0.047027587890625,
0.052032470703125,
0.034820556640625,
-0.0079498291015625,
-0.004405975341796875,
-0.0256805419921875,
0.025177001953125,
0.003692626953125,
-0.035614013671875,
-0.011505126953125,
0.015777587890625,
0.0275115966796875,
0.034820556640625,
0.05560302734375,
0.00571441650390625,
0.0168609619140625,
0.01357269287109375,
0.0230255126953125,
-0.004108428955078125,
0.007755279541015625,
-0.0016698837280273438,
0.014068603515625,
-0.0134429931640625,
-0.033599853515625
]
] |
togethercomputer/RedPajama-INCITE-7B-Instruct | 2023-08-09T18:01:27.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"en",
"dataset:togethercomputer/RedPajama-Data-1T",
"dataset:togethercomputer/RedPajama-Data-Instruct",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | togethercomputer | null | null | togethercomputer/RedPajama-INCITE-7B-Instruct | 103 | 11,663 | transformers | 2023-05-05T05:28:20 | ---
license: apache-2.0
language:
- en
datasets:
- togethercomputer/RedPajama-Data-1T
- togethercomputer/RedPajama-Data-Instruct
widget:
- text: |-
Label the sentences as either 'positive', 'negative', 'mixed', or 'neutral':
Sentence: I can say that there isn't anything I would change.
Label: positive
Sentence: I'm not sure about this.
Label: neutral
Sentence: I liked some parts but I didn't like other parts.
Label: mixed
Sentence: I think the background image could have been better.
Label: negative
Sentence: I really like it.
Label:
example_title: Sentiment Analysis
- text: |-
Please answer the following question:
Question: What is the capital of Canada?
Answer: Ottawa
Question: What is the currency of Switzerland?
Answer: Swiss franc
Question: In which country is Wisconsin located?
Answer:
example_title: Question Answering
- text: >-
Given a news article, classify its topic.
Possible labels: 1. World 2. Sports 3. Business 4. Sci/Tech
Article: A nearby star thought to harbor comets and asteroids now appears to
be home to planets, too.
Label: Sci/Tech
Article: Soaring crude prices plus worries about the economy and the outlook
for earnings are expected to hang over the stock market next week during the
depth of the summer doldrums.
Label: Business
Article: Murtagh a stickler for success Northeastern field hockey coach
Cheryl Murtagh doesn't want the glare of the spotlight that shines on her to
detract from a team that has been the America East champion for the past
three years and has been to the NCAA tournament 13 times.
Label:
example_title: Topic Classification
- text: |-
Paraphrase the given sentence into a different sentence.
Input: Can you recommend some upscale restaurants in New York?
Output: What upscale restaurants do you recommend in New York?
Input: What are the famous places we should not miss in Paris?
Output: Recommend some of the best places to visit in Paris?
Input: Could you recommend some hotels that have cheap price in Zurich?
Output:
example_title: Paraphrasing
- text: >-
Given a review from Amazon's food products, the task is to generate a short
summary of the given review in the input.
Input: I have bought several of the Vitality canned dog food products and
have found them all to be of good quality. The product looks more like a
stew than a processed meat and it smells better. My Labrador is finicky and
she appreciates this product better than most.
Output: Good Quality Dog Food
Input: Product arrived labeled as Jumbo Salted Peanuts...the peanuts were
actually small sized unsalted. Not sure if this was an error or if the
vendor intended to represent the product as 'Jumbo'.
Output: Not as Advertised
Input: My toddler loves this game to a point where he asks for it. That's a
big thing for me. Secondly, no glitching unlike one of their competitors
(PlayShifu). Any tech I don’t have to reach out to support for help is a
good tech for me. I even enjoy some of the games and activities in this.
Overall, this is a product that shows that the developers took their time
and made sure people would not be asking for refund. I’ve become bias
regarding this product and honestly I look forward to buying more of this
company’s stuff. Please keep up the great work.
Output:
example_title: Text Summarization
- text: |-
Identify which sense of a word is meant in a given context.
Context: The river overflowed the bank.
Word: bank
Sense: river bank
Context: A mouse takes much more room than a trackball.
Word: mouse
Sense: computer mouse
Context: The bank will not be accepting cash on Saturdays.
Word: bank
Sense: commercial (finance) banks
Context: Bill killed the project
Word: kill
Sense:
example_title: Word Sense Disambiguation
- text: >-
Given a pair of sentences, choose whether the two sentences agree
(entailment)/disagree (contradiction) with each other.
Possible labels: 1. entailment 2. contradiction
Sentence 1: The skier was on the edge of the ramp. Sentence 2: The skier was
dressed in winter clothes.
Label: entailment
Sentence 1: The boy skated down the staircase railing. Sentence 2: The boy
is a newbie skater.
Label: contradiction
Sentence 1: Two middle-aged people stand by a golf hole. Sentence 2: A
couple riding in a golf cart.
Label:
example_title: Natural Language Inference
inference:
parameters:
temperature: 0.7
top_p: 0.7
top_k: 50
max_new_tokens: 128
---
# RedPajama-INCITE-7B-Instruct
RedPajama-INCITE-7B-Instruct was developed by Together and leaders from the open-source AI community including Ontocord.ai, ETH DS3Lab, AAI CERC, Université de Montréal, MILA - Québec AI Institute, Stanford Center for Research on Foundation Models (CRFM), Stanford Hazy Research research group and LAION.
The model was fine-tuned for few-shot applications on the data of [GPT-JT](https://huggingface.co/togethercomputer/GPT-JT-6B-v1), excluding tasks that overlap with the HELM core scenarios (a prompt sketch in this style follows the links below).
- Base Model: [RedPajama-INCITE-7B-Base](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base)
- Instruction-tuned Version: [RedPajama-INCITE-7B-Instruct](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct)
- Chat Version: [RedPajama-INCITE-7B-Chat](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat)
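As a concrete illustration of the few-shot style this model targets, here is a minimal sketch that assembles a prompt in the same format as the sentiment-analysis widget example in this card's metadata (the helper name is illustrative, not part of the model's API):
```python
# Build a few-shot sentiment prompt in the widget's format (illustrative sketch).
def build_sentiment_prompt(sentence):
    examples = [
        ("I can say that there isn't anything I would change.", "positive"),
        ("I'm not sure about this.", "neutral"),
        ("I liked some parts but I didn't like other parts.", "mixed"),
        ("I think the background image could have been better.", "negative"),
    ]
    header = "Label the sentences as either 'positive', 'negative', 'mixed', or 'neutral':\n"
    shots = "".join(f"Sentence: {s}\nLabel: {l}\n" for s, l in examples)
    # The final line ends with "Label:" so the model completes it with a label.
    return header + shots + f"Sentence: {sentence}\nLabel:"

prompt = build_sentiment_prompt("I really like it.")
```
The resulting `prompt` can be passed to the tokenizer in any of the inference snippets below in place of the question-answering example.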
## Model Details
- **Developed by**: Together Computer.
- **Model type**: Language Model
- **Language(s)**: English
- **License**: Apache 2.0
- **Model Description**: A 6.9B parameter pretrained language model.
# Quick Start
Please note that the model requires `transformers` version >= 4.25.1.
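If needed, a sufficiently recent version can be installed with, for example:
```bash
pip install "transformers>=4.25.1"
```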
## GPU Inference
This requires a GPU with 16GB memory.
```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM
MIN_TRANSFORMERS_VERSION = '4.25.1'
# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'
# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Instruct")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Instruct", torch_dtype=torch.float16)
model = model.to('cuda:0')
# infer
prompt = "Q: The capital of France is?\nA:"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
Paris
"""
```
## GPU Inference in Int8
This requires a GPU with 12GB memory.
To run inference with int8, please ensure you have installed `accelerate` and `bitsandbytes`. You can install them with the following command:
```bash
pip install accelerate
pip install bitsandbytes
```
Then you can run inference with int8 as follows:
```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM
MIN_TRANSFORMERS_VERSION = '4.25.1'
# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'
# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Instruct")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Instruct", device_map='auto', torch_dtype=torch.float16, load_in_8bit=True)
# infer
prompt = "Q: The capital of France is?\nA:"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
Paris
"""
```
## CPU Inference
```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM
MIN_TRANSFORMERS_VERSION = '4.25.1'
# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'
# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Instruct")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Instruct", torch_dtype=torch.bfloat16)
# infer
prompt = "Q: The capital of France is?\nA:"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
Paris
"""
```
Please note that since `LayerNormKernelImpl` is not implemented in fp16 for CPU, we use `bfloat16` for CPU inference.
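As a minimal sketch of that constraint, one way to pick a precision per device (the variable name is illustrative):
```python
import torch

# fp16 LayerNorm kernels are unavailable on CPU, so fall back to bfloat16 there;
# on CUDA devices, float16 (as in the GPU snippets above) works.
dtype = torch.float16 if torch.cuda.is_available() else torch.bfloat16
```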
# Uses
## Direct Use
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
It is the responsibility of the end user to ensure that the model is used in a responsible and ethical manner.
#### Out-of-Scope Use
RedPajama-INCITE-7B-Instruct is a language model and may not perform well for other use cases outside of its intended scope.
For example, it may not be suitable for use in safety-critical applications or for making decisions that have a significant impact on individuals or society.
It is important to consider the limitations of the model and to only use it for its intended purpose.
#### Misuse and Malicious Use
RedPajama-INCITE-7B-Instruct is designed for language modeling.
Misuse of the model, such as using it to engage in illegal or unethical activities, is strictly prohibited and goes against the principles of the project.
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating fake news, misinformation, or propaganda
- Promoting hate speech, discrimination, or violence against individuals or groups
- Impersonating individuals or organizations without their consent
- Engaging in cyberbullying or harassment
- Defamatory content
- Spamming or scamming
- Sharing confidential or sensitive information without proper authorization
- Violating the terms of use of the model or the data used to train it
- Creating automated bots for malicious purposes such as spreading malware, phishing scams, or spamming
## Limitations
RedPajama-INCITE-7B-Instruct, like other language models, has limitations that should be taken into consideration.
For example, the model may not always provide accurate or relevant answers, particularly for questions that are complex, ambiguous, or outside of its training data.
We therefore welcome contributions from individuals and organizations, and encourage collaboration towards creating a more robust and inclusive chatbot.
## Training
**Training Data**
Please refer to [togethercomputer/RedPajama-Data-1T](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)
**Training Procedure**
- **Hardware:** 8 A100
- **Optimizer:** Adam
- **Gradient Accumulations**: 1
- **Num of Tokens:** 1B
- **Learning rate:** 1e-5
## Community
Join us on [Together Discord](https://discord.gg/6ZVDU8tTD4) | 11,788 | [
[
-0.0277252197265625,
-0.06707763671875,
0.019744873046875,
0.0284576416015625,
-0.007030487060546875,
-0.01528167724609375,
-0.032379150390625,
-0.0289459228515625,
-0.0010347366333007812,
0.0411376953125,
-0.038421630859375,
-0.036895751953125,
-0.057373046875,
-0.0054931640625,
-0.0273895263671875,
0.06182861328125,
-0.0005583763122558594,
-0.01018524169921875,
-0.01232147216796875,
0.0009236335754394531,
-0.0299530029296875,
-0.037017822265625,
-0.0537109375,
-0.032135009765625,
-0.00044155120849609375,
0.0298919677734375,
0.033782958984375,
0.0225677490234375,
0.038421630859375,
0.0301055908203125,
0.00029158592224121094,
0.00783538818359375,
-0.032196044921875,
-0.0177459716796875,
0.0169219970703125,
-0.031219482421875,
-0.0200958251953125,
-0.0062408447265625,
0.034637451171875,
0.0247344970703125,
-0.0024967193603515625,
0.02398681640625,
-0.01007843017578125,
0.036376953125,
-0.040679931640625,
0.0269775390625,
-0.0443115234375,
-0.0108795166015625,
-0.005046844482421875,
0.006412506103515625,
-0.032012939453125,
-0.0118408203125,
0.005420684814453125,
-0.0311431884765625,
0.0130767822265625,
-0.00475311279296875,
0.0775146484375,
0.029449462890625,
-0.00807952880859375,
-0.006214141845703125,
-0.036224365234375,
0.06439208984375,
-0.06927490234375,
0.0233917236328125,
0.03204345703125,
0.017425537109375,
-0.01343536376953125,
-0.07147216796875,
-0.058441162109375,
-0.0163421630859375,
-0.011749267578125,
0.00392913818359375,
-0.026031494140625,
-0.0027790069580078125,
0.030120849609375,
0.017730712890625,
-0.037811279296875,
-0.00838470458984375,
-0.041900634765625,
-0.0073699951171875,
0.045806884765625,
0.003387451171875,
0.036773681640625,
-0.021148681640625,
-0.0258941650390625,
-0.0232086181640625,
-0.036285400390625,
0.00829315185546875,
0.0256500244140625,
0.0307769775390625,
-0.04827880859375,
0.036041259765625,
-0.0131683349609375,
0.045318603515625,
0.00713348388671875,
-0.0106353759765625,
0.041778564453125,
-0.021240234375,
-0.0372314453125,
-0.0033740997314453125,
0.08245849609375,
0.0233612060546875,
0.01995849609375,
0.0012531280517578125,
-0.0008254051208496094,
0.0067596435546875,
0.003223419189453125,
-0.067626953125,
-0.04351806640625,
0.032989501953125,
-0.0222625732421875,
-0.0155792236328125,
-0.0008687973022460938,
-0.04925537109375,
-0.00449371337890625,
0.0013456344604492188,
0.0506591796875,
-0.033599853515625,
-0.03656005859375,
0.0172271728515625,
-0.0185394287109375,
0.0175323486328125,
0.008270263671875,
-0.0748291015625,
0.025665283203125,
0.01995849609375,
0.058380126953125,
-0.0010852813720703125,
-0.02996826171875,
0.0003998279571533203,
-0.0134429931640625,
-0.0012483596801757812,
0.036590576171875,
-0.0218658447265625,
-0.0278778076171875,
-0.0254364013671875,
0.007049560546875,
-0.0208892822265625,
-0.0302886962890625,
0.0281829833984375,
-0.01222991943359375,
0.04193115234375,
0.0019292831420898438,
-0.038421630859375,
-0.0264892578125,
0.01776123046875,
-0.032012939453125,
0.09869384765625,
0.01329803466796875,
-0.0706787109375,
0.007015228271484375,
-0.06402587890625,
-0.02215576171875,
-0.0143585205078125,
-0.01910400390625,
-0.05780029296875,
-0.0122833251953125,
0.030517578125,
0.03350830078125,
-0.0257720947265625,
0.01070404052734375,
-0.01052093505859375,
-0.024139404296875,
0.01251983642578125,
-0.036651611328125,
0.09710693359375,
0.006305694580078125,
-0.05828857421875,
0.0208892822265625,
-0.05450439453125,
0.00004976987838745117,
0.031951904296875,
-0.006580352783203125,
0.006511688232421875,
-0.019775390625,
0.0106048583984375,
0.0175018310546875,
0.025665283203125,
-0.041412353515625,
0.01306915283203125,
-0.043548583984375,
0.0484619140625,
0.050811767578125,
-0.0006117820739746094,
0.031280517578125,
-0.0206756591796875,
0.0281982421875,
0.00803375244140625,
0.0208282470703125,
-0.000017642974853515625,
-0.05914306640625,
-0.06817626953125,
-0.03192138671875,
0.00751495361328125,
0.045562744140625,
-0.0433349609375,
0.0389404296875,
-0.006256103515625,
-0.052734375,
-0.0307769775390625,
-0.018524169921875,
0.0248260498046875,
0.040557861328125,
0.03887939453125,
-0.010589599609375,
-0.05157470703125,
-0.061920166015625,
0.00531768798828125,
-0.01093292236328125,
0.0089263916015625,
0.0292205810546875,
0.058258056640625,
-0.0275726318359375,
0.058837890625,
-0.0340576171875,
-0.0013751983642578125,
-0.0191802978515625,
0.00292205810546875,
0.044097900390625,
0.05816650390625,
0.051788330078125,
-0.0435791015625,
-0.037353515625,
-0.0106048583984375,
-0.051666259765625,
0.007701873779296875,
-0.0004711151123046875,
-0.0154266357421875,
0.01611328125,
0.0279388427734375,
-0.05816650390625,
0.033599853515625,
0.043365478515625,
-0.0404052734375,
0.042327880859375,
-0.01519775390625,
0.0206756591796875,
-0.08270263671875,
0.008056640625,
-0.0119476318359375,
-0.015716552734375,
-0.044708251953125,
-0.0017032623291015625,
-0.0019969940185546875,
-0.00695037841796875,
-0.055419921875,
0.0711669921875,
-0.022186279296875,
0.01548004150390625,
-0.01016998291015625,
-0.00669097900390625,
-0.005710601806640625,
0.058563232421875,
0.0032253265380859375,
0.05511474609375,
0.055023193359375,
-0.05206298828125,
0.0217742919921875,
0.0213165283203125,
-0.028717041015625,
-0.001720428466796875,
-0.05413818359375,
0.00626373291015625,
0.0108795166015625,
0.00408935546875,
-0.07257080078125,
-0.01422119140625,
0.038055419921875,
-0.05511474609375,
0.020965576171875,
-0.01349639892578125,
-0.0406494140625,
-0.037017822265625,
-0.0019369125366210938,
0.036712646484375,
0.05889892578125,
-0.040679931640625,
0.042694091796875,
0.03900146484375,
0.0157012939453125,
-0.04876708984375,
-0.06494140625,
-0.019439697265625,
-0.024322509765625,
-0.05450439453125,
0.0143585205078125,
-0.0108184814453125,
-0.015350341796875,
-0.007572174072265625,
0.00628662109375,
-0.0107269287109375,
0.01529693603515625,
0.0265960693359375,
0.022064208984375,
0.00373077392578125,
-0.00798797607421875,
-0.015777587890625,
0.0016345977783203125,
0.018341064453125,
-0.0176544189453125,
0.06292724609375,
-0.0210723876953125,
-0.018524169921875,
-0.05224609375,
-0.001194000244140625,
0.040985107421875,
0.00002956390380859375,
0.070068359375,
0.055023193359375,
-0.0411376953125,
-0.01271820068359375,
-0.037322998046875,
-0.032745361328125,
-0.036407470703125,
0.03350830078125,
-0.024200439453125,
-0.054229736328125,
0.050079345703125,
0.02020263671875,
0.01702880859375,
0.061187744140625,
0.0692138671875,
0.00531768798828125,
0.065673828125,
0.038177490234375,
-0.0002682209014892578,
0.04388427734375,
-0.054779052734375,
0.012176513671875,
-0.049072265625,
-0.021484375,
-0.0413818359375,
-0.00794219970703125,
-0.053375244140625,
-0.0380859375,
0.0186614990234375,
0.005584716796875,
-0.043426513671875,
0.044342041015625,
-0.06243896484375,
0.02569580078125,
0.052154541015625,
0.010986328125,
-0.00852203369140625,
0.002017974853515625,
-0.01540374755859375,
0.00685882568359375,
-0.063720703125,
-0.021820068359375,
0.07977294921875,
0.032196044921875,
0.05194091796875,
-0.005558013916015625,
0.054290771484375,
0.00548553466796875,
0.0146484375,
-0.02923583984375,
0.03106689453125,
-0.0033721923828125,
-0.0513916015625,
-0.023773193359375,
-0.036041259765625,
-0.06201171875,
0.0186004638671875,
-0.01102447509765625,
-0.07196044921875,
0.010101318359375,
0.0156402587890625,
-0.023590087890625,
0.029022216796875,
-0.06884765625,
0.0926513671875,
-0.015625,
-0.0168914794921875,
-0.01224517822265625,
-0.03619384765625,
0.03436279296875,
0.023284912109375,
0.0111846923828125,
-0.017791748046875,
0.02703857421875,
0.06878662109375,
-0.031951904296875,
0.0750732421875,
-0.01268768310546875,
0.01549530029296875,
0.039825439453125,
-0.0129547119140625,
0.03387451171875,
0.004261016845703125,
-0.0035305023193359375,
0.0472412109375,
0.0007014274597167969,
-0.034515380859375,
-0.0280609130859375,
0.056365966796875,
-0.08673095703125,
-0.03619384765625,
-0.0364990234375,
-0.040802001953125,
0.00868988037109375,
0.021728515625,
0.039947509765625,
0.0268707275390625,
0.006622314453125,
0.01239013671875,
0.04046630859375,
-0.0244140625,
0.041473388671875,
0.008575439453125,
-0.01483154296875,
-0.03271484375,
0.06646728515625,
-0.005687713623046875,
0.0157470703125,
0.01433563232421875,
0.0166168212890625,
-0.0291900634765625,
-0.0313720703125,
-0.039886474609375,
0.0192108154296875,
-0.050445556640625,
-0.0132293701171875,
-0.053558349609375,
-0.028778076171875,
-0.0469970703125,
-0.00616455078125,
-0.043243408203125,
-0.041351318359375,
-0.03253173828125,
0.01277923583984375,
0.039459228515625,
0.0253143310546875,
0.0008034706115722656,
0.02423095703125,
-0.039947509765625,
0.0215301513671875,
0.010986328125,
0.010589599609375,
-0.0030803680419921875,
-0.0634765625,
-0.0304412841796875,
0.0038604736328125,
-0.020965576171875,
-0.042633056640625,
0.035552978515625,
-0.00445556640625,
0.04278564453125,
0.01424407958984375,
0.0105438232421875,
0.04254150390625,
-0.023101806640625,
0.06512451171875,
0.021392822265625,
-0.0670166015625,
0.03826904296875,
-0.00998687744140625,
0.042083740234375,
0.0277557373046875,
0.0168609619140625,
-0.024017333984375,
-0.04351806640625,
-0.0755615234375,
-0.0704345703125,
0.0625,
0.04595947265625,
0.0097808837890625,
-0.01461029052734375,
0.0156402587890625,
-0.0130767822265625,
0.0164794921875,
-0.07269287109375,
-0.04217529296875,
-0.0255584716796875,
-0.0220489501953125,
0.01160430908203125,
-0.0012340545654296875,
-0.00919342041015625,
-0.0277862548828125,
0.0673828125,
0.0082244873046875,
0.053924560546875,
0.017364501953125,
-0.0133514404296875,
0.004894256591796875,
-0.005886077880859375,
0.05291748046875,
0.06024169921875,
-0.016754150390625,
0.00037407875061035156,
0.0250396728515625,
-0.03692626953125,
-0.006053924560546875,
0.00930023193359375,
-0.0161590576171875,
-0.0025177001953125,
0.0265350341796875,
0.08056640625,
0.0094757080078125,
-0.040618896484375,
0.0295562744140625,
-0.02362060546875,
-0.01605224609375,
-0.04217529296875,
0.022705078125,
0.01334381103515625,
0.029052734375,
0.0134429931640625,
0.007598876953125,
-0.01387786865234375,
-0.0253753662109375,
0.01044464111328125,
0.037628173828125,
-0.01611328125,
-0.019622802734375,
0.06793212890625,
0.01378631591796875,
-0.0214996337890625,
0.055084228515625,
-0.0105438232421875,
-0.037841796875,
0.07061767578125,
0.03582763671875,
0.0692138671875,
0.00550079345703125,
-0.01042938232421875,
0.05865478515625,
0.0345458984375,
-0.01293182373046875,
0.023651123046875,
-0.00049591064453125,
-0.066650390625,
-0.02349853515625,
-0.047088623046875,
-0.0204620361328125,
0.0144195556640625,
-0.0300140380859375,
0.0298004150390625,
-0.03631591796875,
-0.018524169921875,
-0.01336669921875,
0.00792694091796875,
-0.06103515625,
0.0181121826171875,
0.005954742431640625,
0.0638427734375,
-0.06719970703125,
0.0699462890625,
0.03851318359375,
-0.04620361328125,
-0.059844970703125,
-0.0184326171875,
-0.0153350830078125,
-0.064208984375,
0.043365478515625,
0.0233306884765625,
0.002933502197265625,
0.01398468017578125,
-0.049285888671875,
-0.059478759765625,
0.0709228515625,
0.03961181640625,
-0.0304718017578125,
-0.002140045166015625,
-0.0057220458984375,
0.02197265625,
-0.01873779296875,
0.048187255859375,
0.046478271484375,
0.03448486328125,
-0.0034618377685546875,
-0.07373046875,
0.0012464523315429688,
-0.01259613037109375,
-0.0195465087890625,
0.01148223876953125,
-0.0511474609375,
0.0865478515625,
-0.0240936279296875,
-0.01222991943359375,
-0.00489044189453125,
0.06768798828125,
0.022918701171875,
-0.004230499267578125,
0.0308837890625,
0.056060791015625,
0.0491943359375,
-0.01800537109375,
0.0809326171875,
-0.0439453125,
0.06005859375,
0.07989501953125,
0.0258941650390625,
0.0538330078125,
0.0260467529296875,
-0.027618408203125,
0.042083740234375,
0.0469970703125,
-0.01227569580078125,
0.0295562744140625,
0.004215240478515625,
-0.0164642333984375,
-0.01145172119140625,
0.0175628662109375,
-0.031585693359375,
0.032379150390625,
0.0233001708984375,
-0.0260009765625,
-0.0025615692138671875,
-0.00669097900390625,
0.01215362548828125,
-0.016448974609375,
-0.007488250732421875,
0.04083251953125,
0.0077362060546875,
-0.03497314453125,
0.0726318359375,
0.00909423828125,
0.061248779296875,
-0.039459228515625,
-0.00420379638671875,
-0.00524139404296875,
0.022552490234375,
-0.01241302490234375,
-0.042633056640625,
0.01371002197265625,
-0.01038360595703125,
-0.0135955810546875,
-0.003269195556640625,
0.037017822265625,
-0.039886474609375,
-0.05230712890625,
0.022247314453125,
0.01019287109375,
0.0247344970703125,
-0.0085906982421875,
-0.07061767578125,
0.0175933837890625,
0.005451202392578125,
-0.0338134765625,
0.01035308837890625,
0.01461029052734375,
0.00946807861328125,
0.048248291015625,
0.03857421875,
-0.00492095947265625,
0.0166168212890625,
-0.0010051727294921875,
0.05828857421875,
-0.04290771484375,
-0.03485107421875,
-0.07171630859375,
0.0472412109375,
-0.002227783203125,
-0.0304718017578125,
0.0645751953125,
0.0423583984375,
0.085693359375,
-0.006072998046875,
0.061798095703125,
-0.0264129638671875,
0.0098114013671875,
-0.03460693359375,
0.0618896484375,
-0.039886474609375,
0.004680633544921875,
-0.0143280029296875,
-0.06170654296875,
-0.01593017578125,
0.0689697265625,
-0.0172119140625,
0.0179595947265625,
0.05230712890625,
0.06982421875,
-0.00783538818359375,
-0.0199432373046875,
0.0225982666015625,
0.0309600830078125,
0.024688720703125,
0.042327880859375,
0.0214996337890625,
-0.055419921875,
0.037139892578125,
-0.05474853515625,
-0.0133209228515625,
-0.0121307373046875,
-0.0538330078125,
-0.08538818359375,
-0.046722412109375,
-0.0305328369140625,
-0.043426513671875,
-0.0092620849609375,
0.08782958984375,
0.06207275390625,
-0.071533203125,
-0.0299224853515625,
-0.0176544189453125,
0.01326751708984375,
-0.01355743408203125,
-0.021759033203125,
0.03912353515625,
-0.0273590087890625,
-0.08221435546875,
0.018096923828125,
0.0083770751953125,
0.0032100677490234375,
-0.028656005859375,
-0.006725311279296875,
-0.033355712890625,
-0.0019083023071289062,
0.035980224609375,
0.03436279296875,
-0.057830810546875,
-0.0224456787109375,
-0.01433563232421875,
-0.0130615234375,
0.01268768310546875,
0.035369873046875,
-0.050079345703125,
0.038909912109375,
0.0457763671875,
0.04083251953125,
0.0557861328125,
-0.01751708984375,
0.026153564453125,
-0.04876708984375,
0.038604736328125,
0.00998687744140625,
0.036407470703125,
0.023162841796875,
-0.031402587890625,
0.0244903564453125,
0.02752685546875,
-0.049713134765625,
-0.054412841796875,
-0.00188446044921875,
-0.06890869140625,
-0.0223846435546875,
0.0853271484375,
-0.005466461181640625,
-0.04119873046875,
0.00550079345703125,
-0.0103912353515625,
0.03167724609375,
-0.038421630859375,
0.06536865234375,
0.040618896484375,
-0.029296875,
-0.01910400390625,
-0.03521728515625,
0.033966064453125,
0.018310546875,
-0.0615234375,
0.00041103363037109375,
0.0097198486328125,
0.03424072265625,
0.0110626220703125,
0.06292724609375,
0.000148773193359375,
0.0264434814453125,
0.0126953125,
0.034332275390625,
-0.01233673095703125,
-0.0164337158203125,
-0.010986328125,
0.00328826904296875,
-0.007160186767578125,
-0.0281982421875
]
] |
sangrimlee/bert-base-multilingual-cased-nsmc | 2021-06-02T18:46:18.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"ko",
"endpoints_compatible",
"region:us"
] | text-classification | sangrimlee | null | null | sangrimlee/bert-base-multilingual-cased-nsmc | 4 | 11,656 | transformers | 2022-03-02T23:29:05 | ---
language: ko
---
# BERT multilingual base cased fine-tuned on NSMC
This model is a fine-tuned checkpoint of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased), trained on [NSMC (Naver Sentiment Movie Corpus)](https://github.com/e9t/nsmc).
## Usage
You can use this model directly with a pipeline for sentiment-analysis:
```python
>>> from transformers import pipeline
>>> classifier = pipeline(
"sentiment-analysis", model="sangrimlee/bert-base-multilingual-cased-nsmc"
)
>>> classifier("흠...포스터보고 초딩영화줄....오버연기조차 가볍지 않구나.")  # "Hmm... from the poster I expected a kids' movie... even the overacting isn't light."
[{'label': 'negative', 'score': 0.9642567038536072}]
>>> classifier("액션이 없는데도 재미 있는 몇안되는 영화")  # "One of the few movies that is fun even without action."
[{'label': 'positive', 'score': 0.9970554113388062}]
```
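For reference, a minimal sketch of the same classification without the pipeline helper, assuming the checkpoint exposes the usual `id2label` mapping in its config:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "sangrimlee/bert-base-multilingual-cased-nsmc"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

inputs = tokenizer("액션이 없는데도 재미 있는 몇안되는 영화", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])  # expected: 'positive'
```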
| 724 | [
[
-0.03900146484375,
-0.0361328125,
0.00925445556640625,
0.03973388671875,
-0.0360107421875,
0.018035888671875,
-0.02276611328125,
0.0095367431640625,
0.03558349609375,
0.0377197265625,
-0.049560546875,
-0.05047607421875,
-0.044342041015625,
-0.01114654541015625,
-0.0150604248046875,
0.11737060546875,
0.0086212158203125,
0.0287933349609375,
0.00283050537109375,
-0.0135650634765625,
0.0022182464599609375,
-0.055908203125,
-0.042022705078125,
-0.0300445556640625,
0.0277252197265625,
0.0222320556640625,
0.040924072265625,
0.0023040771484375,
0.02593994140625,
0.016876220703125,
-0.01152801513671875,
-0.0135040283203125,
-0.0221710205078125,
0.0167083740234375,
-0.0132904052734375,
-0.0284576416015625,
-0.037811279296875,
-0.006649017333984375,
0.03961181640625,
0.055145263671875,
0.0017642974853515625,
0.0157928466796875,
0.003662109375,
0.046417236328125,
-0.0328369140625,
0.028045654296875,
-0.0386962890625,
0.005870819091796875,
-0.0044708251953125,
0.028717041015625,
-0.0261993408203125,
-0.02130126953125,
0.00290679931640625,
-0.0181121826171875,
0.022674560546875,
-0.0185699462890625,
0.08001708984375,
0.00984954833984375,
-0.022186279296875,
-0.0042266845703125,
-0.0211639404296875,
0.0836181640625,
-0.061431884765625,
0.0161590576171875,
0.01641845703125,
0.020050048828125,
0.01422119140625,
-0.03680419921875,
-0.040618896484375,
-0.009033203125,
0.0226593017578125,
0.03167724609375,
0.003620147705078125,
0.005786895751953125,
0.0100555419921875,
0.037353515625,
-0.0215301513671875,
-0.00963592529296875,
-0.03472900390625,
-0.03826904296875,
0.027191162109375,
0.00443267822265625,
0.007389068603515625,
-0.051422119140625,
-0.026092529296875,
-0.0177764892578125,
-0.0219268798828125,
0.0025806427001953125,
0.025390625,
0.043701171875,
-0.0277099609375,
0.05426025390625,
-0.018402099609375,
0.0240325927734375,
0.0144195556640625,
0.0008921623229980469,
0.0513916015625,
-0.004703521728515625,
-0.022674560546875,
0.01313018798828125,
0.05743408203125,
0.04302978515625,
0.041290283203125,
0.0185089111328125,
-0.011383056640625,
0.026763916015625,
0.031890869140625,
-0.05841064453125,
-0.0277862548828125,
0.023590087890625,
-0.04437255859375,
-0.033935546875,
0.002025604248046875,
-0.055572509765625,
-0.0048980712890625,
-0.0163726806640625,
0.035064697265625,
-0.035736083984375,
-0.028778076171875,
-0.014434814453125,
-0.00524139404296875,
0.033935546875,
0.01435089111328125,
-0.05340576171875,
0.0014667510986328125,
0.039337158203125,
0.0665283203125,
-0.007843017578125,
-0.0132293701171875,
-0.0126953125,
-0.0191802978515625,
-0.019500732421875,
0.044952392578125,
-0.01349639892578125,
-0.0240478515625,
0.0179443359375,
0.0124053955078125,
-0.0020427703857421875,
-0.0264892578125,
0.072509765625,
-0.0360107421875,
0.0160064697265625,
-0.014373779296875,
-0.0298004150390625,
-0.026031494140625,
0.056304931640625,
-0.04266357421875,
0.06591796875,
0.0097503662109375,
-0.058074951171875,
0.02288818359375,
-0.04486083984375,
-0.0396728515625,
-0.00600433349609375,
0.010040283203125,
-0.0435791015625,
0.00019633769989013672,
0.027740478515625,
0.046478271484375,
0.0155792236328125,
0.0289306640625,
-0.033935546875,
-0.0188751220703125,
0.002117156982421875,
-0.024322509765625,
0.0770263671875,
0.035858154296875,
-0.01142120361328125,
0.00582122802734375,
-0.061279296875,
0.0173797607421875,
-0.0206298828125,
-0.034149169921875,
-0.0150299072265625,
0.00643157958984375,
0.03668212890625,
0.01995849609375,
0.0240936279296875,
-0.05584716796875,
0.0124053955078125,
-0.036529541015625,
0.0272979736328125,
0.05279541015625,
0.0009427070617675781,
0.0299224853515625,
-0.005767822265625,
0.035919189453125,
0.032623291015625,
0.01018524169921875,
-0.01209259033203125,
-0.04083251953125,
-0.08056640625,
-0.031158447265625,
0.0281982421875,
0.05096435546875,
-0.06231689453125,
0.03558349609375,
-0.00048828125,
-0.03765869140625,
-0.059661865234375,
-0.004970550537109375,
0.002666473388671875,
0.03240966796875,
0.03826904296875,
-0.035858154296875,
-0.06256103515625,
-0.08148193359375,
0.00885772705078125,
-0.0206756591796875,
0.00673675537109375,
0.0105438232421875,
0.0287933349609375,
-0.055206298828125,
0.07049560546875,
-0.0299530029296875,
-0.0206451416015625,
-0.034423828125,
0.0188751220703125,
0.07275390625,
0.044219970703125,
0.02996826171875,
-0.0374755859375,
-0.050140380859375,
-0.00782012939453125,
-0.050811767578125,
-0.00893402099609375,
-0.0076141357421875,
-0.01523590087890625,
0.024993896484375,
0.019378662109375,
-0.032501220703125,
0.004085540771484375,
0.0386962890625,
-0.00649261474609375,
0.057281494140625,
-0.0012407302856445312,
0.0015697479248046875,
-0.101318359375,
-0.00836944580078125,
0.007049560546875,
-0.010040283203125,
-0.049163818359375,
0.0117034912109375,
0.0126953125,
-0.002422332763671875,
-0.03814697265625,
0.024444580078125,
-0.00191497802734375,
0.006256103515625,
-0.01364898681640625,
-0.0174560546875,
-0.01016998291015625,
0.053466796875,
0.0224761962890625,
0.0224761962890625,
0.061981201171875,
-0.035247802734375,
0.021575927734375,
0.0146026611328125,
-0.033905029296875,
0.040130615234375,
-0.046875,
-0.02142333984375,
-0.01361846923828125,
0.0085296630859375,
-0.0748291015625,
-0.005615234375,
0.0222320556640625,
-0.0369873046875,
0.020904541015625,
-0.0290985107421875,
-0.027069091796875,
-0.0185699462890625,
-0.0357666015625,
0.0189208984375,
0.038360595703125,
-0.042999267578125,
0.041290283203125,
0.0250701904296875,
-0.029632568359375,
-0.05230712890625,
-0.0660400390625,
0.0081787109375,
-0.010040283203125,
-0.03656005859375,
0.0139007568359375,
-0.019256591796875,
-0.003673553466796875,
-0.02764892578125,
0.01201629638671875,
-0.0214691162109375,
-0.0180816650390625,
0.01554107666015625,
0.0394287109375,
-0.0203857421875,
-0.012664794921875,
0.037689208984375,
-0.010589599609375,
0.01049041748046875,
0.003604888916015625,
0.057830810546875,
-0.03472900390625,
-0.016571044921875,
-0.00013387203216552734,
0.0279998779296875,
0.057037353515625,
0.0054779052734375,
0.0460205078125,
0.07879638671875,
-0.021484375,
-0.00209808349609375,
-0.036468505859375,
-0.006824493408203125,
-0.032440185546875,
0.039398193359375,
-0.02044677734375,
-0.0595703125,
0.050872802734375,
0.01377105712890625,
-0.0121002197265625,
0.052490234375,
0.06884765625,
-0.0271148681640625,
0.08905029296875,
0.039093017578125,
-0.046295166015625,
0.019317626953125,
-0.03802490234375,
0.016021728515625,
-0.07366943359375,
-0.00946044921875,
-0.02197265625,
-0.0124053955078125,
-0.062164306640625,
-0.0155487060546875,
0.0265045166015625,
0.012420654296875,
-0.04132080078125,
0.02081298828125,
-0.030548095703125,
0.0158843994140625,
0.046356201171875,
0.006175994873046875,
0.00180816650390625,
0.0185089111328125,
-0.04644775390625,
-0.0247650146484375,
-0.043212890625,
-0.03717041015625,
0.060943603515625,
0.040618896484375,
0.0654296875,
0.0092010498046875,
0.066162109375,
0.016510009765625,
0.0220184326171875,
-0.071533203125,
0.0357666015625,
-0.03778076171875,
-0.060943603515625,
0.0144805908203125,
-0.01702880859375,
-0.04315185546875,
0.0238037109375,
-0.008453369140625,
-0.048187255859375,
0.021514892578125,
-0.006443023681640625,
-0.01462554931640625,
0.0141448974609375,
-0.046112060546875,
0.06494140625,
-0.00901031494140625,
-0.0135955810546875,
-0.0176239013671875,
-0.058319091796875,
0.0267486572265625,
0.0012731552124023438,
0.020050048828125,
-0.00800323486328125,
0.017974853515625,
0.046112060546875,
-0.028594970703125,
0.083740234375,
-0.0209503173828125,
0.004184722900390625,
0.0232086181640625,
0.006908416748046875,
0.0015935897827148438,
0.0231475830078125,
-0.01580810546875,
0.038177490234375,
0.004215240478515625,
-0.0272064208984375,
0.00038051605224609375,
0.06695556640625,
-0.078369140625,
-0.01136016845703125,
-0.058349609375,
-0.04058837890625,
-0.0162506103515625,
0.0209503173828125,
0.0240631103515625,
0.017242431640625,
-0.01019287109375,
0.0153961181640625,
0.033599853515625,
-0.03155517578125,
0.039764404296875,
0.035430908203125,
-0.01456451416015625,
-0.0462646484375,
0.06842041015625,
-0.01031494140625,
0.0091705322265625,
0.0180816650390625,
0.015777587890625,
-0.0266876220703125,
-0.0111541748046875,
-0.03643798828125,
0.0191802978515625,
-0.0540771484375,
-0.01384735107421875,
-0.049041748046875,
-0.04144287109375,
-0.032562255859375,
-0.00876617431640625,
-0.0292510986328125,
-0.035064697265625,
-0.0232391357421875,
-0.01470947265625,
0.0252227783203125,
0.0216827392578125,
-0.025299072265625,
0.0111846923828125,
-0.0667724609375,
0.0280303955078125,
0.01065826416015625,
0.024749755859375,
-0.0007977485656738281,
-0.045013427734375,
-0.0257415771484375,
0.01373291015625,
-0.02899169921875,
-0.043548583984375,
0.049224853515625,
0.02191162109375,
0.0404052734375,
0.0328369140625,
0.001758575439453125,
0.037689208984375,
-0.035919189453125,
0.083740234375,
0.0169677734375,
-0.07916259765625,
0.035614013671875,
-0.037322998046875,
0.038482666015625,
0.053802490234375,
0.038360595703125,
-0.053466796875,
-0.0352783203125,
-0.038421630859375,
-0.08544921875,
0.0604248046875,
0.01377105712890625,
0.028045654296875,
-0.018707275390625,
0.0006017684936523438,
0.00225830078125,
0.0310516357421875,
-0.08062744140625,
-0.0259246826171875,
-0.053802490234375,
-0.034149169921875,
-0.0105438232421875,
-0.0174407958984375,
0.007648468017578125,
-0.03289794921875,
0.07147216796875,
-0.00981903076171875,
0.0228118896484375,
0.0174102783203125,
-0.0160064697265625,
0.0045013427734375,
0.017913818359375,
0.032470703125,
-0.0012912750244140625,
-0.048797607421875,
-0.004047393798828125,
0.00334930419921875,
-0.0484619140625,
-0.0113067626953125,
-0.006420135498046875,
-0.0142974853515625,
0.0106658935546875,
0.026153564453125,
0.0535888671875,
0.002941131591796875,
-0.044525146484375,
0.0214385986328125,
0.0067138671875,
-0.00897979736328125,
-0.031585693359375,
-0.0102996826171875,
0.007541656494140625,
0.0146331787109375,
0.0240325927734375,
0.0290985107421875,
0.008270263671875,
-0.05078125,
0.0178070068359375,
0.023040771484375,
-0.051422119140625,
-0.0025806427001953125,
0.0296630859375,
0.00856781005859375,
0.00439453125,
0.0654296875,
-0.016448974609375,
-0.07940673828125,
0.04803466796875,
0.031494140625,
0.069580078125,
-0.0018205642700195312,
0.031158447265625,
0.05279541015625,
0.0255279541015625,
-0.0012216567993164062,
0.03680419921875,
0.0011501312255859375,
-0.08447265625,
-0.045623779296875,
-0.0694580078125,
-0.04364013671875,
0.00568389892578125,
-0.068115234375,
0.00811767578125,
-0.03594970703125,
-0.0158843994140625,
-0.0083770751953125,
-0.003368377685546875,
-0.0231781005859375,
0.031280517578125,
0.024261474609375,
0.058837890625,
-0.058685302734375,
0.09375,
0.05523681640625,
-0.03509521484375,
-0.048187255859375,
-0.01198577880859375,
-0.01142120361328125,
-0.04827880859375,
0.045928955078125,
0.02685546875,
0.01277923583984375,
-0.00998687744140625,
-0.02716064453125,
-0.050537109375,
0.04510498046875,
0.00359344482421875,
-0.032623291015625,
0.035186767578125,
0.011016845703125,
0.061859130859375,
-0.034820556640625,
0.024139404296875,
0.03369140625,
0.0284881591796875,
0.0030651092529296875,
-0.04791259765625,
-0.0291748046875,
-0.043304443359375,
-0.006809234619140625,
0.004947662353515625,
-0.04071044921875,
0.08221435546875,
-0.01416778564453125,
0.01983642578125,
0.00904083251953125,
0.03778076171875,
0.03314208984375,
0.013275146484375,
0.041656494140625,
0.046783447265625,
0.0236663818359375,
-0.0247955322265625,
0.06585693359375,
-0.0190887451171875,
0.061981201171875,
0.06561279296875,
-0.019500732421875,
0.06939697265625,
0.036407470703125,
-0.0220184326171875,
0.064208984375,
0.0557861328125,
-0.030120849609375,
0.06060791015625,
-0.00205230712890625,
-0.01123809814453125,
-0.020294189453125,
0.0011119842529296875,
-0.016693115234375,
0.0197601318359375,
0.033172607421875,
-0.026031494140625,
-0.01136016845703125,
0.00394439697265625,
-0.00276947021484375,
-0.01140594482421875,
-0.042724609375,
0.031890869140625,
0.01381683349609375,
-0.047515869140625,
0.037689208984375,
0.0168609619140625,
0.080810546875,
-0.04473876953125,
0.0106658935546875,
-0.01280975341796875,
0.0328369140625,
-0.0199737548828125,
-0.0697021484375,
0.0242919921875,
0.0086517333984375,
-0.04119873046875,
-0.006877899169921875,
0.05084228515625,
-0.04620361328125,
-0.071533203125,
0.02886962890625,
0.024993896484375,
0.0154571533203125,
-0.0092620849609375,
-0.082275390625,
0.00919342041015625,
0.02069091796875,
-0.0029087066650390625,
0.01496124267578125,
-0.008392333984375,
-0.0016355514526367188,
0.03973388671875,
0.04144287109375,
0.0036144256591796875,
-0.002689361572265625,
0.0478515625,
0.05389404296875,
-0.04229736328125,
-0.043609619140625,
-0.056304931640625,
0.057647705078125,
-0.007274627685546875,
-0.0159149169921875,
0.053009033203125,
0.044647216796875,
0.057861328125,
-0.04266357421875,
0.06146240234375,
-0.016143798828125,
0.0380859375,
-0.024444580078125,
0.06591796875,
-0.03515625,
-0.0013608932495117188,
-0.04437255859375,
-0.07354736328125,
-0.0233154296875,
0.080810546875,
0.000396728515625,
0.00856781005859375,
0.0457763671875,
0.0595703125,
-0.00319671630859375,
-0.01245880126953125,
0.00928497314453125,
0.03668212890625,
0.011383056640625,
0.0309600830078125,
0.05377197265625,
-0.039764404296875,
0.029296875,
-0.042633056640625,
-0.010772705078125,
-0.006069183349609375,
-0.07208251953125,
-0.0770263671875,
-0.031768798828125,
-0.035247802734375,
-0.0338134765625,
-0.02520751953125,
0.06787109375,
0.05975341796875,
-0.099609375,
-0.032470703125,
0.003963470458984375,
0.006473541259765625,
-0.015106201171875,
-0.026763916015625,
0.0220184326171875,
-0.0281982421875,
-0.0736083984375,
0.017913818359375,
0.004581451416015625,
-0.01044464111328125,
-0.0265350341796875,
-0.006427764892578125,
-0.024261474609375,
0.0029010772705078125,
0.058013916015625,
-0.0018625259399414062,
-0.054473876953125,
-0.017486572265625,
-0.008697509765625,
-0.01074981689453125,
0.004360198974609375,
0.03399658203125,
-0.046356201171875,
0.035736083984375,
0.03594970703125,
0.02081298828125,
0.043670654296875,
-0.01352691650390625,
0.037872314453125,
-0.07281494140625,
0.0138702392578125,
0.01549530029296875,
0.05242919921875,
0.03582763671875,
-0.0194244384765625,
0.0251617431640625,
0.01390838623046875,
-0.035888671875,
-0.044586181640625,
0.00403594970703125,
-0.1175537109375,
-0.01288604736328125,
0.07330322265625,
0.0007295608520507812,
-0.0013570785522460938,
-0.00008970499038696289,
-0.0369873046875,
0.0142059326171875,
-0.051422119140625,
0.053802490234375,
0.06982421875,
0.0089874267578125,
-0.00647735595703125,
-0.01320648193359375,
0.003978729248046875,
0.043212890625,
-0.0271148681640625,
-0.0313720703125,
0.019195556640625,
0.0276641845703125,
0.03619384765625,
0.0266265869140625,
-0.008453369140625,
0.0165863037109375,
-0.0214691162109375,
0.0309295654296875,
0.01788330078125,
-0.010894775390625,
-0.031280517578125,
0.01430511474609375,
-0.006443023681640625,
-0.01837158203125
]
] |
mosaicml/mpt-30b-instruct | 2023-10-30T21:54:31.000Z | [
"transformers",
"pytorch",
"mpt",
"text-generation",
"Composer",
"MosaicML",
"llm-foundry",
"custom_code",
"dataset:competition_math",
"dataset:conceptofmind/cot_submix_original/cot_gsm8k",
"dataset:knkarthick/dialogsum",
"dataset:mosaicml/dolly_hhrlhf",
"dataset:duorc",
"dataset:tau/scrolls/qasper",
"dataset:emozilla/quality",
"dataset:scrolls/summ_screen_fd",
"dataset:spider",
"arxiv:2205.14135",
"arxiv:2108.12409",
"license:cc-by-sa-3.0",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | mosaicml | null | null | mosaicml/mpt-30b-instruct | 95 | 11,644 | transformers | 2023-06-21T17:12:11 | ---
license: cc-by-sa-3.0
datasets:
- competition_math
- conceptofmind/cot_submix_original/cot_gsm8k
- knkarthick/dialogsum
- mosaicml/dolly_hhrlhf
- duorc
- tau/scrolls/qasper
- emozilla/quality
- scrolls/summ_screen_fd
- spider
tags:
- Composer
- MosaicML
- llm-foundry
inference: false
---
# MPT-30B-Instruct
MPT-30B-Instruct is a model for short-form instruction following.
It is built by finetuning [MPT-30B](https://huggingface.co/mosaicml/mpt-30b) on [Dolly HHRLHF](https://huggingface.co/datasets/mosaicml/dolly_hhrlhf) derived from the [Databricks Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) and the [Anthropic Helpful and Harmless (HH-RLHF)](https://huggingface.co/datasets/Anthropic/hh-rlhf) datasets. It is also trained on [Competition Math](https://huggingface.co/datasets/competition_math), [Duorc](https://huggingface.co/datasets/duorc), [CoT GSM8k](https://huggingface.co/datasets/conceptofmind/cot_submix_original), [Qasper](https://huggingface.co/datasets/allenai/qasper), [Quality](https://huggingface.co/datasets/emozilla/quality), [Summ Screen FD](https://huggingface.co/datasets/tau/scrolls) and [Spider](https://huggingface.co/datasets/spider).
* License: _CC-By-SA-3.0_
This model was trained by [MosaicML](https://www.mosaicml.com) and follows a modified decoder-only transformer architecture.
## Model Date
June 22, 2023
## Model License
CC-By-SA-3.0
## Documentation
* [Blog post: Raising the bar for open-source foundation models](https://www.mosaicml.com/blog/mpt-30b)
* [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
* Questions: Feel free to contact us via the [MosaicML Community Slack](https://mosaicml.me/slack)!
### Example Question/Instruction
**Bespokenizer46**
> I need to convince my friend, Phyllis, that she should train a custom LLM for her Fortune 500 company using the MosaicML Platform.
> Please write an email that explains why MosaicML's emphasis on cutting edge methodology, data privacy, and efficiency are so important.
> End the email with a friendly inquiry about Phyllis's family.
**MPT-30B-Instruct**:
> Phyllis -
> I know you're always looking for ways to improve efficiency at work and I think I've found a solution that you'll be very interested in.
> LLMs are only as good as the data they're trained on and the training process can be very time consuming and resource intensive. MosaicML uses cutting edge methodology to efficiently train high quality models without compromising data privacy.
> They also provide tools to easily connect to and use the model in your daily workflow.
> I think you'd really enjoy speaking with their founder, we can set up a call if you're interested.
> Also, I know it's been a tough year for your family, how are things?
> Best,
> Your Friend
## How to Use
Note: This model requires that `trust_remote_code=True` be passed to the `from_pretrained` method. This is because we use a custom model architecture that is not yet part of the `transformers` package.
It includes options for many training efficiency features such as [FlashAttention (Dao et al. 2022)](https://arxiv.org/pdf/2205.14135.pdf), [ALiBi](https://arxiv.org/abs/2108.12409), QK LayerNorm, and more.
```python
import transformers
model = transformers.AutoModelForCausalLM.from_pretrained(
'mosaicml/mpt-30b-instruct',
trust_remote_code=True
)
```
To use the optimized [triton implementation](https://github.com/openai/triton) of FlashAttention, you can load the model on GPU (`cuda:0`) with `attn_impl='triton'` and with `bfloat16` precision:
```python
import torch
import transformers
name = 'mosaicml/mpt-30b-instruct'
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.attn_config['attn_impl'] = 'triton' # change this to use triton-based FlashAttention
config.init_device = 'cuda:0' # For fast initialization directly on GPU!
model = transformers.AutoModelForCausalLM.from_pretrained(
name,
config=config,
torch_dtype=torch.bfloat16, # Load model weights in bfloat16
trust_remote_code=True
)
```
The model was trained initially on a sequence length of 2048. An additional pre-training phase was included for sequence length adaptation to 8192. However, ALiBi further enables users to increase the maximum sequence length during finetuning and/or inference. For example:
```python
import transformers
name = 'mosaicml/mpt-30b-instruct'
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 16384 # (input + output) tokens can now be up to 16384
model = transformers.AutoModelForCausalLM.from_pretrained(
name,
config=config,
trust_remote_code=True
)
```
This model was trained with the MPT-30B tokenizer which is based on the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer and includes additional padding and eos tokens.
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('mosaicml/mpt-30b')
```
The model can then be used, for example, within a text-generation pipeline.
Note: when running Torch modules in lower precision, it is best practice to use the [torch.autocast context manager](https://pytorch.org/docs/stable/amp.html).
```python
import torch
from transformers import pipeline
with torch.autocast('cuda', dtype=torch.bfloat16):
inputs = tokenizer('Here is a recipe for vegan banana bread:\n', return_tensors="pt").to('cuda')
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
# or using the HF pipeline
pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, device='cuda:0')
with torch.autocast('cuda', dtype=torch.bfloat16):
print(
pipe('Here is a recipe for vegan banana bread:\n',
max_new_tokens=100,
do_sample=True,
use_cache=True))
```
### Formatting
This model was trained on data formatted as follows:
```python
def format_prompt(instruction):
template = "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n###Instruction\n{instruction}\n\n### Response\n"
return template.format(instruction=instruction)
example = "Tell me a funny joke.\nDon't make it too funny though."
fmt_ex = format_prompt(instruction=example)
```
In the above example, `fmt_ex` is ready to be tokenized and sent through the model.
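Continuing from the snippets above, a minimal generation sketch might look like the following (this is illustrative and not part of the original card; `model`, `tokenizer`, and `fmt_ex` are the objects defined earlier, and `max_new_tokens=100` is an arbitrary choice):
```python
import torch

# Assumes `model` is loaded on GPU and `tokenizer`/`fmt_ex` come from the snippets above.
with torch.autocast('cuda', dtype=torch.bfloat16):
    inputs = tokenizer(fmt_ex, return_tensors="pt").to('cuda')
    outputs = model.generate(**inputs, max_new_tokens=100)

# Drop the prompt tokens so only the generated response is printed.
response = tokenizer.decode(outputs[0, inputs['input_ids'].shape[1]:], skip_special_tokens=True)
print(response)
```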
## Model Description
The architecture is a modification of a standard decoder-only transformer.
The model has been modified from a standard transformer in the following ways:
* It uses [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf)
* It uses [ALiBi (Attention with Linear Biases)](https://arxiv.org/abs/2108.12409) and does not use positional embeddings (see the sketch after this list)
* It does not use biases
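As a rough illustration of the ALiBi idea, the sketch below computes the per-head linear bias that is added to the attention logits. This is a generic sketch, not MosaicML's implementation, and it assumes the number of heads is a power of two (as with the 64 heads used here):
```python
import torch

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    """Illustrative ALiBi bias of shape (n_heads, seq_len, seq_len)."""
    # Head h gets slope m_h = 2 ** (-8 * (h + 1) / n_heads).
    slopes = torch.tensor([2 ** (-8 * (h + 1) / n_heads) for h in range(n_heads)])
    positions = torch.arange(seq_len)
    distances = positions.view(1, -1) - positions.view(-1, 1)  # (key pos) - (query pos)
    # The bias is more negative the farther a past key is from the query;
    # future positions are removed by the causal mask anyway.
    return slopes.view(-1, 1, 1) * distances.view(1, seq_len, seq_len)

print(alibi_bias(64, 8).shape)  # torch.Size([64, 8, 8])
```
Because the penalty depends only on the query-key distance, it extrapolates to sequence lengths beyond those seen in training, which is what the `max_seq_len` override shown earlier relies on.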
| Hyperparameter | Value |
|----------------|-------|
| n_parameters | 29.95B |
| n_layers | 48 |
| n_heads | 64 |
| d_model | 7168 |
| vocab size | 50432 |
| sequence length | 8192 |
## Data Mix
The model was trained on the following data mix:
| Data Source | Number of Tokens in Source | Proportion |
|-------------|----------------------------|------------|
| competition_math | 1.6 M | 3.66% |
| cot_gsm8k | 3.36 M | 7.67% |
| dialogsum | 0.1 M | 0.23% |
| dolly_hhrlhf | 5.89 M | 13.43% |
| duorc | 7.8 M | 17.80% |
| qasper | 8.72 M | 19.90% |
| quality | 11.29 M | 25.78% |
| scrolls/summ_screen_fd | 4.97 M | 11.33% |
| spider | 0.089 M | 0.20% |
## Pretraining Data
For more details on the pretraining process, see [MPT-30B](https://huggingface.co/mosaicml/mpt-30b).
The data was tokenized using the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
### Training Configuration
This model was trained on 72 A100 40GB GPUs for 8 hours using the [MosaicML Platform](https://www.mosaicml.com/platform).
The model was trained with sharded data parallelism using [FSDP](https://pytorch.org/docs/stable/fsdp.html) and used the AdamW optimizer.
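As a loose sketch of that setup (generic PyTorch FSDP with AdamW, not the actual llm-foundry training loop; the tiny `Linear` stand-in, the dummy loss, and the learning rate are placeholders):
```python
import os
import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

# Minimal sketch; assumes launch via `torchrun` so RANK/LOCAL_RANK are set.
dist.init_process_group(backend="nccl")
torch.cuda.set_device(int(os.environ["LOCAL_RANK"]))

model = FSDP(torch.nn.Linear(1024, 1024).cuda())  # shards params, grads, and optimizer state
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)  # placeholder hyperparameters

x = torch.randn(8, 1024, device="cuda")
loss = model(x).pow(2).mean()  # dummy loss, only to drive one step
loss.backward()
optimizer.step()
optimizer.zero_grad()
```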
## Limitations and Biases
_The following language is modified from [EleutherAI's GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b)_
MPT-30B-Instruct can produce factually incorrect output, and should not be relied on to produce factually accurate information.
MPT-30B-Instruct was trained on various public datasets.
While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased or otherwise offensive outputs.
## Acknowledgements
This model was finetuned by Sam Havens, Alex Trott, and the MosaicML NLP team.
## MosaicML Platform
If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo?utm_source=huggingface&utm_medium=referral&utm_campaign=mpt-30b).
## Disclaimer
The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model. Please consult an attorney before using this model for commercial purposes.
## Citation
Please cite this model using the following format:
```
@online{MosaicML2023Introducing,
author = {MosaicML NLP Team},
title = {Introducing MPT-30B: Raising the bar
for open-source foundation models},
year = {2023},
url = {www.mosaicml.com/blog/mpt-30b},
note = {Accessed: 2023-06-22},
urldate = {2023-06-22}
}
``` | 9,547 | [
[
-0.03179931640625,
-0.038238525390625,
0.0093994140625,
0.0209197998046875,
-0.02215576171875,
-0.0005450248718261719,
-0.00571441650390625,
-0.018646240234375,
0.006511688232421875,
0.0209503173828125,
-0.044464111328125,
-0.042449951171875,
-0.050201416015625,
0.006793975830078125,
-0.030181884765625,
0.07611083984375,
-0.01181793212890625,
0.009002685546875,
0.004764556884765625,
-0.0018224716186523438,
-0.0115966796875,
-0.028961181640625,
-0.051849365234375,
-0.0303955078125,
0.027313232421875,
0.0214996337890625,
0.05206298828125,
0.06976318359375,
0.033721923828125,
0.0255889892578125,
-0.011322021484375,
0.007663726806640625,
-0.031646728515625,
-0.030242919921875,
0.0080413818359375,
-0.03497314453125,
-0.040069580078125,
0.0140380859375,
0.037200927734375,
0.0298919677734375,
-0.01385498046875,
0.033905029296875,
0.0035076141357421875,
0.0250701904296875,
-0.03759765625,
0.01363372802734375,
-0.023529052734375,
0.0200653076171875,
-0.0087738037109375,
-0.0084228515625,
-0.03839111328125,
-0.0203857421875,
0.0024089813232421875,
-0.0404052734375,
0.024688720703125,
0.00783538818359375,
0.08319091796875,
0.01898193359375,
-0.026031494140625,
-0.004070281982421875,
-0.046722412109375,
0.048309326171875,
-0.06524658203125,
0.0213623046875,
0.0221405029296875,
0.01482391357421875,
0.0056610107421875,
-0.0821533203125,
-0.049072265625,
-0.0167388916015625,
-0.00852203369140625,
0.025299072265625,
-0.0158538818359375,
-0.0024566650390625,
0.03607177734375,
0.03570556640625,
-0.044097900390625,
-0.0015411376953125,
-0.029449462890625,
-0.01392364501953125,
0.038360595703125,
0.025634765625,
0.0244903564453125,
-0.0207977294921875,
-0.044158935546875,
-0.0285797119140625,
-0.04901123046875,
0.004955291748046875,
0.028167724609375,
-0.0003898143768310547,
-0.0377197265625,
0.048828125,
-0.0023097991943359375,
0.0465087890625,
0.0178680419921875,
-0.0025501251220703125,
0.0292816162109375,
-0.0271148681640625,
-0.02984619140625,
-0.0094451904296875,
0.07806396484375,
0.019989013671875,
0.01024627685546875,
-0.004100799560546875,
-0.016387939453125,
-0.01486968994140625,
0.00798797607421875,
-0.08160400390625,
-0.03271484375,
0.01580810546875,
-0.03704833984375,
-0.0163116455078125,
0.0097808837890625,
-0.042449951171875,
-0.00893402099609375,
-0.0222930908203125,
0.056640625,
-0.056243896484375,
-0.0179443359375,
0.00540924072265625,
-0.0149993896484375,
0.017974853515625,
0.00814056396484375,
-0.07086181640625,
0.013336181640625,
0.03369140625,
0.072509765625,
-0.0006375312805175781,
-0.04180908203125,
-0.0181427001953125,
0.002368927001953125,
-0.002483367919921875,
0.0352783203125,
-0.013671875,
-0.018096923828125,
-0.031890869140625,
0.013427734375,
-0.0257720947265625,
-0.031402587890625,
0.0140228271484375,
-0.027862548828125,
0.04296875,
-0.01363372802734375,
-0.042327880859375,
-0.0123138427734375,
0.0091094970703125,
-0.041595458984375,
0.07373046875,
0.03228759765625,
-0.07257080078125,
0.0257568359375,
-0.0628662109375,
-0.0164642333984375,
-0.00702667236328125,
0.005367279052734375,
-0.050750732421875,
-0.0135345458984375,
0.0233612060546875,
0.034332275390625,
-0.021453857421875,
0.01103973388671875,
-0.01268768310546875,
-0.042266845703125,
0.0165557861328125,
-0.036346435546875,
0.078125,
0.0263671875,
-0.053741455078125,
0.0106658935546875,
-0.0631103515625,
-0.00571441650390625,
0.0228424072265625,
-0.037078857421875,
0.0183868408203125,
-0.025970458984375,
0.006008148193359375,
0.014739990234375,
0.0113677978515625,
-0.040130615234375,
0.014556884765625,
-0.035797119140625,
0.03759765625,
0.057281494140625,
-0.0091552734375,
0.0297393798828125,
-0.0340576171875,
0.031585693359375,
0.010711669921875,
0.03076171875,
-0.01317596435546875,
-0.039703369140625,
-0.07574462890625,
-0.02423095703125,
0.031585693359375,
0.036163330078125,
-0.059967041015625,
0.0308837890625,
-0.0144500732421875,
-0.04949951171875,
-0.0439453125,
-0.0046234130859375,
0.033050537109375,
0.042816162109375,
0.04205322265625,
-0.0232696533203125,
-0.053436279296875,
-0.052703857421875,
0.0028705596923828125,
0.006420135498046875,
-0.0010595321655273438,
0.0186004638671875,
0.0523681640625,
-0.02301025390625,
0.065673828125,
-0.034576416015625,
0.00255584716796875,
-0.022003173828125,
0.02386474609375,
0.045135498046875,
0.051177978515625,
0.03924560546875,
-0.052093505859375,
-0.0496826171875,
-0.0151214599609375,
-0.04833984375,
0.00968170166015625,
-0.00542449951171875,
-0.01142120361328125,
0.00968170166015625,
0.0138702392578125,
-0.057769775390625,
0.03765869140625,
0.03765869140625,
-0.035858154296875,
0.046142578125,
-0.01207733154296875,
0.0021419525146484375,
-0.10418701171875,
0.01335906982421875,
-0.006072998046875,
-0.01114654541015625,
-0.042327880859375,
-0.0037250518798828125,
0.0081787109375,
-0.0017404556274414062,
-0.0560302734375,
0.035186767578125,
-0.0304718017578125,
0.0018224716186523438,
-0.01410675048828125,
-0.028778076171875,
-0.005100250244140625,
0.05279541015625,
0.0103759765625,
0.0650634765625,
0.03765869140625,
-0.041778564453125,
0.03240966796875,
0.0204620361328125,
-0.01885986328125,
0.01274871826171875,
-0.0506591796875,
0.0084228515625,
0.0086822509765625,
0.0087432861328125,
-0.0615234375,
-0.01114654541015625,
0.03369140625,
-0.037139892578125,
0.02984619140625,
-0.016357421875,
-0.03619384765625,
-0.04595947265625,
-0.00975799560546875,
0.031646728515625,
0.04949951171875,
-0.054931640625,
0.0465087890625,
0.0008873939514160156,
0.0203704833984375,
-0.0594482421875,
-0.05181884765625,
-0.006122589111328125,
-0.017974853515625,
-0.05169677734375,
0.03399658203125,
-0.0030117034912109375,
0.00861358642578125,
-0.00827789306640625,
-0.006565093994140625,
-0.004688262939453125,
0.0011949539184570312,
0.024658203125,
0.0250701904296875,
-0.0207061767578125,
-0.0177459716796875,
-0.0158538818359375,
-0.018280029296875,
0.0083770751953125,
-0.0209503173828125,
0.07427978515625,
-0.027191162109375,
-0.0212860107421875,
-0.052093505859375,
0.0037059783935546875,
0.03912353515625,
-0.01186370849609375,
0.07611083984375,
0.0843505859375,
-0.015899658203125,
0.0008025169372558594,
-0.037200927734375,
-0.025360107421875,
-0.03875732421875,
0.0261383056640625,
-0.0098419189453125,
-0.043487548828125,
0.03839111328125,
0.00959014892578125,
-0.00482940673828125,
0.04486083984375,
0.0526123046875,
-0.0125579833984375,
0.0732421875,
0.03997802734375,
0.01264190673828125,
0.046051025390625,
-0.0679931640625,
-0.006702423095703125,
-0.06512451171875,
-0.025909423828125,
-0.0186920166015625,
-0.0204010009765625,
-0.037139892578125,
-0.0268707275390625,
0.018218994140625,
-0.007053375244140625,
-0.0545654296875,
0.05657958984375,
-0.0439453125,
0.0244140625,
0.060821533203125,
0.0250396728515625,
0.0017633438110351562,
-0.005916595458984375,
-0.0245819091796875,
0.007537841796875,
-0.072998046875,
-0.0293426513671875,
0.09521484375,
0.031524658203125,
0.052764892578125,
-0.007755279541015625,
0.061767578125,
-0.00458526611328125,
0.030914306640625,
-0.0323486328125,
0.029205322265625,
0.0021820068359375,
-0.048553466796875,
-0.007640838623046875,
-0.048980712890625,
-0.06024169921875,
0.0126800537109375,
-0.0200653076171875,
-0.052276611328125,
0.0232391357421875,
0.0141448974609375,
-0.0386962890625,
0.04150390625,
-0.06768798828125,
0.077392578125,
-0.021636962890625,
-0.03399658203125,
0.00734710693359375,
-0.05438232421875,
0.0274658203125,
0.005474090576171875,
-0.00260162353515625,
0.006744384765625,
0.0186767578125,
0.07220458984375,
-0.051239013671875,
0.06396484375,
-0.01544189453125,
0.017791748046875,
0.0222930908203125,
-0.01140594482421875,
0.03863525390625,
0.0011548995971679688,
-0.0013675689697265625,
0.011566162109375,
0.00925445556640625,
-0.029571533203125,
-0.022796630859375,
0.0287322998046875,
-0.08380126953125,
-0.035064697265625,
-0.0379638671875,
-0.048797607421875,
0.00852203369140625,
0.018280029296875,
0.055755615234375,
0.025421142578125,
0.00756072998046875,
0.0190582275390625,
0.04217529296875,
-0.0204925537109375,
0.060699462890625,
0.021026611328125,
-0.00212860107421875,
-0.039459228515625,
0.0634765625,
0.0015468597412109375,
0.0258331298828125,
0.0244140625,
0.016357421875,
-0.023834228515625,
-0.035675048828125,
-0.03466796875,
0.021484375,
-0.04644775390625,
-0.0252685546875,
-0.052703857421875,
-0.030853271484375,
-0.03106689453125,
0.002628326416015625,
-0.045135498046875,
-0.0260162353515625,
-0.04107666015625,
0.0018529891967773438,
0.0220794677734375,
0.03759765625,
-0.00543975830078125,
0.053558349609375,
-0.055419921875,
0.0170745849609375,
0.0222320556640625,
0.021728515625,
0.004123687744140625,
-0.055572509765625,
-0.0299835205078125,
0.016265869140625,
-0.043182373046875,
-0.06024169921875,
0.039825439453125,
-0.0011043548583984375,
0.035491943359375,
0.03106689453125,
-0.0099945068359375,
0.0517578125,
-0.021331787109375,
0.06048583984375,
0.0275421142578125,
-0.07550048828125,
0.026641845703125,
-0.0163116455078125,
0.0259246826171875,
0.018157958984375,
0.040924072265625,
-0.0321044921875,
-0.01424407958984375,
-0.06182861328125,
-0.05889892578125,
0.0751953125,
0.03656005859375,
0.0025882720947265625,
0.00855255126953125,
0.0246124267578125,
-0.0002340078353881836,
0.01149749755859375,
-0.08612060546875,
-0.0208587646484375,
-0.041900634765625,
-0.0197906494140625,
0.0016021728515625,
-0.0054473876953125,
-0.012847900390625,
-0.046783447265625,
0.0535888671875,
0.0010156631469726562,
0.050323486328125,
0.01165771484375,
-0.0174560546875,
-0.00252532958984375,
-0.0011129379272460938,
0.03997802734375,
0.05743408203125,
-0.0281982421875,
0.006420135498046875,
0.0281219482421875,
-0.05487060546875,
0.01102447509765625,
0.01285552978515625,
-0.006488800048828125,
-0.01030731201171875,
0.030242919921875,
0.0693359375,
-0.00040841102600097656,
-0.01385498046875,
0.04510498046875,
-0.01064300537109375,
-0.015869140625,
-0.0186309814453125,
0.0186920166015625,
0.024505615234375,
0.0271148681640625,
0.01934814453125,
0.01091766357421875,
-0.014556884765625,
-0.034393310546875,
0.017333984375,
0.01702880859375,
-0.01311492919921875,
-0.0212860107421875,
0.0670166015625,
-0.0007500648498535156,
-0.0221710205078125,
0.06036376953125,
-0.006988525390625,
-0.037109375,
0.06524658203125,
0.0550537109375,
0.056396484375,
-0.0212249755859375,
0.0189056396484375,
0.0341796875,
0.0225677490234375,
-0.005588531494140625,
0.01497650146484375,
-0.00787353515625,
-0.04931640625,
-0.0254058837890625,
-0.055877685546875,
-0.0206146240234375,
0.00007086992263793945,
-0.0428466796875,
0.03082275390625,
-0.035186767578125,
-0.0191192626953125,
-0.0125579833984375,
0.006885528564453125,
-0.06298828125,
0.027435302734375,
0.0160369873046875,
0.07952880859375,
-0.061767578125,
0.07244873046875,
0.033172607421875,
-0.047882080078125,
-0.08050537109375,
-0.0171661376953125,
-0.017486572265625,
-0.0750732421875,
0.041778564453125,
0.0182037353515625,
0.0157318115234375,
0.011199951171875,
-0.05126953125,
-0.0699462890625,
0.116943359375,
0.034912109375,
-0.030914306640625,
-0.017120361328125,
0.033233642578125,
0.038604736328125,
-0.03515625,
0.049835205078125,
0.035491943359375,
0.0311126708984375,
0.0257720947265625,
-0.0546875,
0.0167999267578125,
-0.0215606689453125,
-0.002002716064453125,
-0.0011386871337890625,
-0.05908203125,
0.0947265625,
-0.016876220703125,
-0.0112762451171875,
0.0077667236328125,
0.04571533203125,
0.01535797119140625,
0.018463134765625,
0.026611328125,
0.060943603515625,
0.0419921875,
-0.019317626953125,
0.0872802734375,
-0.03338623046875,
0.0517578125,
0.06982421875,
0.0189056396484375,
0.038909912109375,
0.029998779296875,
-0.0183563232421875,
0.03668212890625,
0.06854248046875,
-0.0321044921875,
0.03778076171875,
-0.0002713203430175781,
-0.004428863525390625,
-0.016143798828125,
0.0229949951171875,
-0.0435791015625,
0.027740478515625,
0.01183319091796875,
-0.048675537109375,
-0.011566162109375,
0.010528564453125,
0.0148773193359375,
-0.036529541015625,
-0.0103759765625,
0.041595458984375,
0.0081329345703125,
-0.039459228515625,
0.0635986328125,
0.00563812255859375,
0.05328369140625,
-0.0382080078125,
0.01166534423828125,
-0.0181884765625,
0.01611328125,
-0.027557373046875,
-0.05108642578125,
0.0208740234375,
-0.01019287109375,
-0.00043201446533203125,
-0.0147857666015625,
0.01885986328125,
-0.0242462158203125,
-0.03399658203125,
0.013214111328125,
0.0237274169921875,
0.00799560546875,
-0.006526947021484375,
-0.06610107421875,
-0.0093994140625,
0.0009207725524902344,
-0.030120849609375,
0.01158905029296875,
0.0105133056640625,
0.02490234375,
0.047332763671875,
0.052886962890625,
-0.00484466552734375,
0.017425537109375,
-0.009063720703125,
0.0758056640625,
-0.05181884765625,
-0.0241851806640625,
-0.07427978515625,
0.048187255859375,
-0.004161834716796875,
-0.033416748046875,
0.06036376953125,
0.052398681640625,
0.06634521484375,
-0.00986480712890625,
0.0296783447265625,
-0.013336181640625,
0.004825592041015625,
-0.040374755859375,
0.067138671875,
-0.034942626953125,
0.019317626953125,
-0.0219573974609375,
-0.08013916015625,
-0.013397216796875,
0.042236328125,
-0.0299835205078125,
0.01448822021484375,
0.05645751953125,
0.064453125,
-0.027679443359375,
0.01258087158203125,
0.006072998046875,
0.02325439453125,
0.0185394287109375,
0.058837890625,
0.0616455078125,
-0.051177978515625,
0.047210693359375,
-0.049072265625,
-0.01358795166015625,
-0.0094757080078125,
-0.055694580078125,
-0.0760498046875,
-0.03692626953125,
-0.021026611328125,
-0.039093017578125,
-0.00438690185546875,
0.078369140625,
0.06927490234375,
-0.056427001953125,
-0.02685546875,
-0.01369476318359375,
0.00004374980926513672,
-0.0149078369140625,
-0.0166168212890625,
0.042816162109375,
-0.00911712646484375,
-0.054962158203125,
0.004547119140625,
-0.0017461776733398438,
0.0283966064453125,
0.00015044212341308594,
-0.0135345458984375,
-0.028533935546875,
-0.0025348663330078125,
0.0257415771484375,
0.0150604248046875,
-0.031341552734375,
-0.01377105712890625,
0.004535675048828125,
-0.00957489013671875,
0.0311737060546875,
0.034820556640625,
-0.0421142578125,
0.0184173583984375,
0.0264434814453125,
0.0269317626953125,
0.07977294921875,
-0.005245208740234375,
0.036773681640625,
-0.04339599609375,
0.005855560302734375,
0.016754150390625,
0.042144775390625,
0.0191802978515625,
-0.032073974609375,
0.040130615234375,
0.037261962890625,
-0.039764404296875,
-0.05322265625,
-0.003986358642578125,
-0.07183837890625,
-0.013397216796875,
0.0845947265625,
-0.0170440673828125,
-0.04107666015625,
0.01763916015625,
-0.01535797119140625,
0.04791259765625,
-0.00858306884765625,
0.05291748046875,
0.036163330078125,
-0.01044464111328125,
-0.0263671875,
-0.017669677734375,
0.031280517578125,
0.026702880859375,
-0.049407958984375,
-0.01861572265625,
0.0034236907958984375,
0.0462646484375,
0.01666259765625,
0.0321044921875,
-0.00771331787109375,
0.030731201171875,
0.006298065185546875,
0.0184783935546875,
-0.034576416015625,
-0.0171966552734375,
-0.01212310791015625,
0.016204833984375,
-0.0257415771484375,
-0.0234222412109375
]
] |
togethercomputer/RedPajama-INCITE-7B-Chat | 2023-06-05T03:22:54.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"en",
"dataset:togethercomputer/RedPajama-Data-1T",
"dataset:OpenAssistant/oasst1",
"dataset:databricks/databricks-dolly-15k",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | togethercomputer | null | null | togethercomputer/RedPajama-INCITE-7B-Chat | 89 | 11,615 | transformers | 2023-05-04T20:24:59 | ---
license: apache-2.0
language:
- en
datasets:
- togethercomputer/RedPajama-Data-1T
- OpenAssistant/oasst1
- databricks/databricks-dolly-15k
widget:
- text: "<human>: Write an email to my friends inviting them to come to my home on Friday for a dinner party, bring their own food to share.\n<bot>:"
example_title: "Email Writing"
- text: "<human>: Create a list of things to do in San Francisco\n<bot>:"
example_title: "Brainstorming"
inference:
parameters:
temperature: 0.7
top_p: 0.7
top_k: 50
max_new_tokens: 128
---
# RedPajama-INCITE-7B-Chat
RedPajama-INCITE-7B-Chat was developed by Together and leaders from the open-source AI community, including Ontocord.ai, ETH DS3Lab, AAI CERC, Université de Montréal, MILA - Québec AI Institute, the Stanford Center for Research on Foundation Models (CRFM), the Stanford Hazy Research group, and LAION.
It is fine-tuned on OASST1 and Dolly2 to enhance chatting ability.
- Base Model: [RedPajama-INCITE-7B-Base](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base)
- Instruction-tuned Version: [RedPajama-INCITE-7B-Instruct](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct)
- Chat Version: [RedPajama-INCITE-7B-Chat](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat)
## Model Details
- **Developed by**: Together Computer.
- **Model type**: Language Model
- **Language(s)**: English
- **License**: Apache 2.0
- **Model Description**: A 6.9B parameter pretrained language model.
# Quick Start
Please note that the model requires `transformers` version >= 4.25.1.
To prompt the chat model, use the following format:
```
<human>: [Instruction]
<bot>:
```
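For instance, a small helper for assembling prompts in this format might look like the following (the function name and example instruction are illustrative, not part of the original card):
```python
def build_chat_prompt(instruction: str) -> str:
    # Wrap a user instruction in the <human>/<bot> format expected by the model.
    return f"<human>: {instruction}\n<bot>:"

prompt = build_chat_prompt("Who is Alan Turing?")
print(prompt)  # "<human>: Who is Alan Turing?\n<bot>:"
```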
## GPU Inference
This requires a GPU with 16GB memory.
```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM
MIN_TRANSFORMERS_VERSION = '4.25.1'
# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'
# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Chat")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Chat", torch_dtype=torch.float16)
model = model.to('cuda:0')
# infer
prompt = "<human>: Who is Alan Turing?\n<bot>:"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
Alan Mathison Turing (23 June 1912 – 7 June 1954) was an English computer scientist, mathematician, logician, cryptanalyst, philosopher, mathematician, and theoretical biologist.
"""
```
## GPU Inference in Int8
This requires a GPU with 12GB memory.
To run inference with int8, please ensure you have installed accelerate and bitsandbytes. You can install them with the following commands:
```bash
pip install accelerate
pip install bitsandbytes
```
Then you can run inference with int8 as follows:
```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM
MIN_TRANSFORMERS_VERSION = '4.25.1'
# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'
# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Chat")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Chat", device_map='auto', torch_dtype=torch.float16, load_in_8bit=True)
# infer
prompt = "<human>: Who is Alan Turing?\n<bot>:"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
Alan Mathison Turing (23 June 1912 – 7 June 1954) was an English computer scientist, mathematician, logician, cryptanalyst, philosopher, and theoretical biologist.
"""
```
## CPU Inference
```python
import torch
import transformers
from transformers import AutoTokenizer, AutoModelForCausalLM
MIN_TRANSFORMERS_VERSION = '4.25.1'
# check transformers version
assert transformers.__version__ >= MIN_TRANSFORMERS_VERSION, f'Please upgrade transformers to version {MIN_TRANSFORMERS_VERSION} or higher.'
# init
tokenizer = AutoTokenizer.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Chat")
model = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-7B-Chat", torch_dtype=torch.bfloat16)
# infer
prompt = "<human>: Who is Alan Turing?\n<bot>:"
inputs = tokenizer(prompt, return_tensors='pt').to(model.device)
input_length = inputs.input_ids.shape[1]
outputs = model.generate(
**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.7, top_k=50, return_dict_in_generate=True
)
token = outputs.sequences[0, input_length:]
output_str = tokenizer.decode(token)
print(output_str)
"""
Alan Mathison Turing, OBE, FRS, (23 June 1912 – 7 June 1954) was an English computer scientist, mathematician, logician, cryptanalyst, philosopher, and theoretical biologist.
"""
```
Please note that since `LayerNormKernelImpl` is not implemented in fp16 for CPU, we use `bfloat16` for CPU inference.
# Uses
## Direct Use
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
It is the responsibility of the end user to ensure that the model is used in a responsible and ethical manner.
#### Out-of-Scope Use
`RedPajama-INCITE-7B-Chat` is a language model and may not perform well for other use cases outside of its intended scope.
For example, it may not be suitable for use in safety-critical applications or for making decisions that have a significant impact on individuals or society.
It is important to consider the limitations of the model and to only use it for its intended purpose.
#### Misuse and Malicious Use
`RedPajama-INCITE-7B-Chat` is designed for language modeling.
Misuse of the model, such as using it to engage in illegal or unethical activities, is strictly prohibited and goes against the principles of the project.
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating fake news, misinformation, or propaganda
- Promoting hate speech, discrimination, or violence against individuals or groups
- Impersonating individuals or organizations without their consent
- Engaging in cyberbullying or harassment
- Defamatory content
- Spamming or scamming
- Sharing confidential or sensitive information without proper authorization
- Violating the terms of use of the model or the data used to train it
- Creating automated bots for malicious purposes such as spreading malware, phishing scams, or spamming
## Limitations
`RedPajama-INCITE-7B-Chat`, like other language models, has limitations that should be taken into consideration.
For example, the model may not always provide accurate or relevant answers, particularly for questions that are complex, ambiguous, or outside of its training data.
We therefore welcome contributions from individuals and organizations, and encourage collaboration towards creating a more robust and inclusive chatbot.
## Training
**Training Data**
Please refer to [togethercomputer/RedPajama-Data-1T](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)
**Training Procedure**
- **Hardware:** 8 A100
- **Optimizer:** Adam
- **Gradient Accumulations**: 1
- **Num of Tokens:** 79M
- **Learning rate:** 1e-5
## Community
Join us on [Together Discord](https://discord.gg/6ZVDU8tTD4) | 8,001 | [
[
-0.034576416015625,
-0.07720947265625,
0.0180816650390625,
0.0220947265625,
0.0044708251953125,
-0.01129150390625,
-0.0165557861328125,
-0.03704833984375,
0.0274505615234375,
0.02142333984375,
-0.050140380859375,
-0.0293426513671875,
-0.0592041015625,
0.0021343231201171875,
-0.034423828125,
0.05401611328125,
0.0106964111328125,
-0.005947113037109375,
-0.0196075439453125,
-0.000972747802734375,
-0.02838134765625,
-0.034149169921875,
-0.05706787109375,
-0.0306396484375,
0.004669189453125,
0.01531982421875,
0.045684814453125,
0.0274505615234375,
0.045440673828125,
0.03192138671875,
-0.006870269775390625,
0.0028858184814453125,
-0.033660888671875,
-0.009307861328125,
0.0131683349609375,
-0.044189453125,
-0.02911376953125,
-0.005153656005859375,
0.03521728515625,
0.0305328369140625,
0.002552032470703125,
0.013885498046875,
-0.0095977783203125,
0.040435791015625,
-0.042755126953125,
0.01971435546875,
-0.048431396484375,
0.0012655258178710938,
-0.00916290283203125,
-0.004673004150390625,
-0.040679931640625,
-0.0184173583984375,
-0.005157470703125,
-0.037567138671875,
0.0093994140625,
-0.005706787109375,
0.06512451171875,
0.0244293212890625,
0.0009264945983886719,
-0.01509857177734375,
-0.041748046875,
0.06536865234375,
-0.08087158203125,
0.020965576171875,
0.041229248046875,
0.0252227783203125,
-0.031341552734375,
-0.07037353515625,
-0.06109619140625,
-0.01605224609375,
-0.0085906982421875,
-0.0107574462890625,
-0.02618408203125,
-0.003631591796875,
0.031158447265625,
0.0181884765625,
-0.044342041015625,
-0.01108551025390625,
-0.058319091796875,
-0.016937255859375,
0.05078125,
0.01422119140625,
0.0462646484375,
0.0007576942443847656,
-0.00675201416015625,
-0.0198211669921875,
-0.037139892578125,
-0.0018415451049804688,
0.02008056640625,
0.015380859375,
-0.0477294921875,
0.040557861328125,
-0.005207061767578125,
0.0367431640625,
-0.0003249645233154297,
-0.01186370849609375,
0.04644775390625,
-0.01509857177734375,
-0.0259857177734375,
-0.0022563934326171875,
0.09381103515625,
0.01262664794921875,
0.009613037109375,
0.01399993896484375,
0.018890380859375,
0.0257720947265625,
-0.0012083053588867188,
-0.055694580078125,
-0.04437255859375,
0.039215087890625,
-0.0328369140625,
-0.0198211669921875,
0.0011196136474609375,
-0.044219970703125,
-0.01523590087890625,
0.0158233642578125,
0.046417236328125,
-0.035888671875,
-0.031982421875,
0.0093231201171875,
-0.0272674560546875,
0.01160430908203125,
0.003261566162109375,
-0.0748291015625,
0.0256805419921875,
0.0289154052734375,
0.052215576171875,
0.00814056396484375,
-0.040618896484375,
-0.0027923583984375,
-0.00914764404296875,
-0.005718231201171875,
0.02508544921875,
-0.034393310546875,
-0.00555419921875,
-0.0264434814453125,
-0.004405975341796875,
-0.004047393798828125,
-0.028472900390625,
0.01284027099609375,
-0.0203399658203125,
0.053070068359375,
0.01300811767578125,
-0.04522705078125,
-0.020904541015625,
0.0036334991455078125,
-0.036376953125,
0.0767822265625,
0.0007810592651367188,
-0.07232666015625,
-0.0007476806640625,
-0.07659912109375,
-0.0267486572265625,
0.00006151199340820312,
-0.0303802490234375,
-0.042877197265625,
-0.01300048828125,
0.023284912109375,
0.02825927734375,
-0.0231781005859375,
0.01311492919921875,
-0.0321044921875,
-0.0223388671875,
0.03814697265625,
-0.040283203125,
0.10736083984375,
0.0093841552734375,
-0.0557861328125,
0.01026153564453125,
-0.045867919921875,
0.003932952880859375,
0.027099609375,
0.0024013519287109375,
0.009521484375,
-0.00734710693359375,
-0.00986480712890625,
0.0204620361328125,
0.03411865234375,
-0.045684814453125,
0.002048492431640625,
-0.042205810546875,
0.057098388671875,
0.048431396484375,
0.000873565673828125,
0.01947021484375,
-0.03729248046875,
0.02789306640625,
-0.00292205810546875,
0.02734375,
-0.0060272216796875,
-0.06414794921875,
-0.06390380859375,
-0.027191162109375,
0.01067352294921875,
0.043212890625,
-0.03680419921875,
0.0552978515625,
-0.002864837646484375,
-0.049957275390625,
-0.051910400390625,
-0.010345458984375,
0.0200347900390625,
0.020599365234375,
0.03619384765625,
-0.01605224609375,
-0.052886962890625,
-0.0582275390625,
-0.00647735595703125,
-0.019012451171875,
-0.0014896392822265625,
0.036376953125,
0.048583984375,
-0.03973388671875,
0.05255126953125,
-0.03851318359375,
-0.006374359130859375,
-0.016143798828125,
0.0019969940185546875,
0.03802490234375,
0.051666259765625,
0.03546142578125,
-0.03936767578125,
-0.038970947265625,
-0.0070648193359375,
-0.07147216796875,
0.0034656524658203125,
-0.0084381103515625,
-0.0148468017578125,
0.0266571044921875,
0.029693603515625,
-0.0653076171875,
0.0433349609375,
0.044189453125,
-0.039520263671875,
0.032135009765625,
-0.0146942138671875,
0.01715087890625,
-0.08038330078125,
0.0091400146484375,
-0.0185699462890625,
-0.008392333984375,
-0.0421142578125,
0.0009188652038574219,
-0.008941650390625,
-0.0032253265380859375,
-0.057373046875,
0.058929443359375,
-0.021392822265625,
0.01445770263671875,
-0.011322021484375,
0.00018680095672607422,
-0.0151214599609375,
0.051727294921875,
-0.0083465576171875,
0.04327392578125,
0.055633544921875,
-0.047576904296875,
0.036163330078125,
0.0265655517578125,
-0.01544189453125,
0.0018558502197265625,
-0.056427001953125,
0.0163421630859375,
0.0033779144287109375,
0.006084442138671875,
-0.08026123046875,
-0.004993438720703125,
0.04144287109375,
-0.07440185546875,
0.012939453125,
0.0011301040649414062,
-0.03155517578125,
-0.028289794921875,
-0.0189208984375,
0.0271148681640625,
0.06658935546875,
-0.0335693359375,
0.04638671875,
0.05328369140625,
0.001934051513671875,
-0.0458984375,
-0.0577392578125,
-0.007251739501953125,
-0.017791748046875,
-0.06927490234375,
0.00962066650390625,
-0.02239990234375,
-0.0256805419921875,
-0.007534027099609375,
-0.0010385513305664062,
-0.007579803466796875,
0.01299285888671875,
0.02252197265625,
0.0240936279296875,
0.0076904296875,
-0.0190277099609375,
-0.004444122314453125,
-0.003604888916015625,
0.030242919921875,
0.002101898193359375,
0.06671142578125,
-0.0174407958984375,
-0.01312255859375,
-0.060089111328125,
0.0137939453125,
0.044586181640625,
0.007762908935546875,
0.07470703125,
0.0440673828125,
-0.037384033203125,
-0.00983428955078125,
-0.0350341796875,
-0.047576904296875,
-0.041351318359375,
0.029876708984375,
-0.020721435546875,
-0.057373046875,
0.040557861328125,
0.027191162109375,
0.0210723876953125,
0.057373046875,
0.06268310546875,
-0.010833740234375,
0.07373046875,
0.032501220703125,
-0.006072998046875,
0.040313720703125,
-0.04620361328125,
0.01288604736328125,
-0.046417236328125,
-0.022674560546875,
-0.04034423828125,
-0.0220184326171875,
-0.0445556640625,
-0.02923583984375,
0.01512908935546875,
-0.010833740234375,
-0.0467529296875,
0.036590576171875,
-0.059326171875,
0.027374267578125,
0.047149658203125,
0.00960540771484375,
-0.003936767578125,
0.004268646240234375,
-0.0030002593994140625,
-0.002315521240234375,
-0.05938720703125,
-0.02313232421875,
0.07855224609375,
0.03619384765625,
0.0611572265625,
0.0047454833984375,
0.04443359375,
0.01177978515625,
0.00843048095703125,
-0.0185394287109375,
0.04681396484375,
0.00768280029296875,
-0.06707763671875,
-0.0298919677734375,
-0.04937744140625,
-0.0765380859375,
0.01424407958984375,
-0.0167388916015625,
-0.0704345703125,
-0.007556915283203125,
0.0205230712890625,
-0.0013093948364257812,
0.0197906494140625,
-0.06005859375,
0.081298828125,
-0.010162353515625,
-0.00180816650390625,
-0.0258331298828125,
-0.057464599609375,
0.0305023193359375,
0.025360107421875,
0.0193328857421875,
-0.013031005859375,
0.022247314453125,
0.07244873046875,
-0.039215087890625,
0.077880859375,
-0.007587432861328125,
0.0220184326171875,
0.0239410400390625,
0.0081634521484375,
0.025177001953125,
-0.0019664764404296875,
0.0190582275390625,
0.0467529296875,
0.005390167236328125,
-0.0306396484375,
-0.03729248046875,
0.061981201171875,
-0.09405517578125,
-0.033538818359375,
-0.03582763671875,
-0.0308837890625,
0.006717681884765625,
0.0174560546875,
0.0379638671875,
0.0251617431640625,
0.01320648193359375,
0.01641845703125,
0.0391845703125,
-0.0262298583984375,
0.0289459228515625,
0.00962066650390625,
-0.016326904296875,
-0.040679931640625,
0.06683349609375,
-0.007122039794921875,
0.0112457275390625,
0.02410888671875,
0.01534271240234375,
-0.0237274169921875,
-0.016571044921875,
-0.021514892578125,
0.03851318359375,
-0.047515869140625,
-0.01267242431640625,
-0.059356689453125,
-0.0345458984375,
-0.038055419921875,
-0.007965087890625,
-0.035186767578125,
-0.04644775390625,
-0.033721923828125,
0.01535797119140625,
0.041351318359375,
0.0277252197265625,
0.0054779052734375,
0.0147857666015625,
-0.043121337890625,
0.02880859375,
0.006626129150390625,
0.01476287841796875,
0.0015535354614257812,
-0.06268310546875,
-0.01031494140625,
0.0193328857421875,
-0.0290679931640625,
-0.0506591796875,
0.03411865234375,
0.0092315673828125,
0.04791259765625,
0.01175689697265625,
0.006450653076171875,
0.0438232421875,
-0.0180206298828125,
0.06280517578125,
0.02764892578125,
-0.067626953125,
0.0439453125,
-0.00066375732421875,
0.0303802490234375,
0.021881103515625,
0.0302886962890625,
-0.0341796875,
-0.050079345703125,
-0.0799560546875,
-0.067626953125,
0.0626220703125,
0.041778564453125,
0.012115478515625,
-0.01038360595703125,
0.007602691650390625,
-0.001255035400390625,
0.00965118408203125,
-0.07568359375,
-0.058807373046875,
-0.01436614990234375,
-0.032562255859375,
0.0128021240234375,
0.0004703998565673828,
-0.00907135009765625,
-0.033294677734375,
0.078857421875,
0.012664794921875,
0.0576171875,
0.0011157989501953125,
-0.01335906982421875,
0.0000896453857421875,
-0.0030612945556640625,
0.047454833984375,
0.062286376953125,
-0.01100921630859375,
0.0009121894836425781,
0.0208892822265625,
-0.045623779296875,
0.000919342041015625,
0.0055389404296875,
-0.01309967041015625,
-0.00283050537109375,
0.03546142578125,
0.071533203125,
-0.00391387939453125,
-0.04449462890625,
0.02618408203125,
-0.033233642578125,
-0.022369384765625,
-0.04412841796875,
0.029296875,
0.01763916015625,
0.02484130859375,
0.018157958984375,
-0.0006442070007324219,
-0.01197052001953125,
-0.030914306640625,
0.00547027587890625,
0.040008544921875,
-0.0093994140625,
-0.0170745849609375,
0.0626220703125,
0.0078125,
-0.04351806640625,
0.0650634765625,
-0.00298309326171875,
-0.025146484375,
0.05096435546875,
0.048492431640625,
0.062744140625,
-0.0002522468566894531,
-0.006687164306640625,
0.041656494140625,
0.024932861328125,
0.0006437301635742188,
0.031402587890625,
0.007228851318359375,
-0.0635986328125,
-0.024932861328125,
-0.0299530029296875,
-0.0199432373046875,
0.0204620361328125,
-0.0226287841796875,
0.0220184326171875,
-0.040740966796875,
-0.0198211669921875,
0.0067901611328125,
0.01113128662109375,
-0.04486083984375,
-0.006145477294921875,
0.0022029876708984375,
0.065673828125,
-0.06549072265625,
0.06011962890625,
0.045623779296875,
-0.029205322265625,
-0.05718994140625,
-0.031097412109375,
-0.00495147705078125,
-0.060943603515625,
0.03265380859375,
0.015777587890625,
0.0037136077880859375,
0.007659912109375,
-0.05487060546875,
-0.05560302734375,
0.0765380859375,
0.03997802734375,
-0.0259246826171875,
0.00853729248046875,
-0.0061492919921875,
0.0284881591796875,
-0.0084381103515625,
0.04791259765625,
0.036468505859375,
0.026885986328125,
0.003826141357421875,
-0.07177734375,
0.0010442733764648438,
-0.0352783203125,
-0.017181396484375,
0.027862548828125,
-0.034881591796875,
0.074462890625,
-0.01313018798828125,
-0.01104736328125,
-0.00518035888671875,
0.06866455078125,
0.03326416015625,
0.0013885498046875,
0.0205841064453125,
0.0509033203125,
0.05126953125,
-0.0016536712646484375,
0.0657958984375,
-0.041107177734375,
0.045684814453125,
0.061767578125,
0.0219268798828125,
0.05218505859375,
0.0313720703125,
-0.03936767578125,
0.041839599609375,
0.0255279541015625,
-0.0016851425170898438,
0.03436279296875,
0.006465911865234375,
-0.0192413330078125,
0.0013790130615234375,
0.027679443359375,
-0.03759765625,
0.01824951171875,
0.035247802734375,
-0.0265960693359375,
0.00826263427734375,
-0.01450347900390625,
0.0102996826171875,
-0.01450347900390625,
-0.0038299560546875,
0.047882080078125,
0.01003265380859375,
-0.034423828125,
0.07391357421875,
0.00537872314453125,
0.06719970703125,
-0.04638671875,
-0.0083160400390625,
-0.01369476318359375,
0.032806396484375,
-0.01233673095703125,
-0.04290771484375,
0.0196380615234375,
-0.014556884765625,
-0.002170562744140625,
0.011199951171875,
0.0357666015625,
-0.037139892578125,
-0.040679931640625,
0.0126800537109375,
0.006603240966796875,
0.045501708984375,
0.00441741943359375,
-0.0712890625,
0.0399169921875,
0.01482391357421875,
-0.0124664306640625,
0.018829345703125,
0.0123291015625,
0.0110321044921875,
0.06494140625,
0.0419921875,
0.0035400390625,
0.0125732421875,
-0.018768310546875,
0.050048828125,
-0.05267333984375,
-0.0374755859375,
-0.08544921875,
0.03387451171875,
0.0019550323486328125,
-0.0270233154296875,
0.061981201171875,
0.03472900390625,
0.07867431640625,
-0.00647735595703125,
0.05804443359375,
-0.0312347412109375,
0.0147857666015625,
-0.0202789306640625,
0.062103271484375,
-0.0297393798828125,
0.00878143310546875,
-0.0205078125,
-0.05755615234375,
-0.020172119140625,
0.09136962890625,
-0.0034351348876953125,
0.0237274169921875,
0.052459716796875,
0.07012939453125,
0.00318145751953125,
-0.007160186767578125,
0.0140380859375,
0.03424072265625,
0.038604736328125,
0.054779052734375,
0.039581298828125,
-0.05218505859375,
0.03741455078125,
-0.043975830078125,
-0.021759033203125,
-0.01503753662109375,
-0.0506591796875,
-0.08837890625,
-0.052276611328125,
-0.030303955078125,
-0.04327392578125,
-0.00296783447265625,
0.093505859375,
0.05535888671875,
-0.05865478515625,
-0.032806396484375,
-0.01087188720703125,
0.01114654541015625,
-0.01605224609375,
-0.02215576171875,
0.02960205078125,
-0.01520538330078125,
-0.05670166015625,
0.01531982421875,
0.0122222900390625,
0.010162353515625,
-0.0296478271484375,
-0.01557159423828125,
-0.026092529296875,
0.0001531839370727539,
0.030548095703125,
0.0428466796875,
-0.05181884765625,
-0.01100921630859375,
-0.00875091552734375,
-0.01123809814453125,
0.01490020751953125,
0.0213623046875,
-0.05780029296875,
0.024261474609375,
0.0443115234375,
0.036865234375,
0.056182861328125,
-0.025421142578125,
0.033599853515625,
-0.0380859375,
0.0231781005859375,
0.0212554931640625,
0.028564453125,
0.0190582275390625,
-0.032318115234375,
0.0200958251953125,
0.023101806640625,
-0.049530029296875,
-0.06591796875,
-0.0014238357543945312,
-0.0673828125,
-0.0245819091796875,
0.0748291015625,
0.005474090576171875,
-0.03265380859375,
-0.0022106170654296875,
-0.01554107666015625,
0.0194244384765625,
-0.03399658203125,
0.0726318359375,
0.0330810546875,
-0.03240966796875,
-0.02044677734375,
-0.0262908935546875,
0.029205322265625,
0.019256591796875,
-0.06475830078125,
0.00443267822265625,
0.0165863037109375,
0.0283203125,
0.011260986328125,
0.07568359375,
-0.00858306884765625,
0.03363037109375,
0.0115509033203125,
0.0308990478515625,
-0.01087188720703125,
-0.0244140625,
-0.0119171142578125,
-0.00238800048828125,
0.000919342041015625,
-0.0212860107421875
]
] |
matsuo-lab/weblab-10b-instruction-sft | 2023-09-04T23:16:23.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | matsuo-lab | null | null | matsuo-lab/weblab-10b-instruction-sft | 68 | 11,579 | transformers | 2023-08-04T05:01:56 | ---
license: cc-by-nc-4.0
---
# weblab-10b-instruction-sft
# Overview
This repository provides a Japanese-centric multilingual GPT-NeoX model of 10 billion parameters.
* **Library**
The model was trained using code based on [EleutherAI/gpt-neox](https://github.com/EleutherAI/gpt-neox).
* **Model architecture**
A 36-layer, 4864-hidden-size transformer-based language model.
* **Pre-training**
The model was trained on around **600B** tokens from a mixture of the following corpora.
- [Japanese C4](https://huggingface.co/datasets/mc4)
- [The Pile](https://huggingface.co/datasets/EleutherAI/pile)
* **Instruction-supervised-finetuning**
The model was finetuned on a subset of records from a mixture of the following datasets. Training epochs: 1.
- [Alpaca (English)](https://github.com/gururise/AlpacaDataCleaned/blob/main/alpaca_data_cleaned.json)
- [Alpaca (Japanese translation)](https://github.com/shi3z/alpaca_ja/blob/main/alpaca_cleaned_ja.json)
- [Flan 2021 (English)](https://huggingface.co/datasets/conceptofmind/flan2021_submix_original)
- [Flan CoT (English)](https://huggingface.co/datasets/conceptofmind/cot_submix_original)
- [Flan Dialog (English)](https://huggingface.co/datasets/conceptofmind/dialog_submix_original)
* **Model Series**
| Variant | Link |
| :-- | :--|
| weblab-10b-instruction-sft | https://huggingface.co/matsuo-lab/weblab-10b-instruction-sft |
| weblab-10b | https://huggingface.co/matsuo-lab/weblab-10b |
* **Authors**
Takeshi Kojima
---
# Benchmarking
* **Japanese benchmark : JGLUE 8-task (2023-08-27)**
- *We used the [Stability-AI/lm-evaluation-harness](https://github.com/Stability-AI/lm-evaluation-harness/tree/2f1583c0735eacdfdfa5b7d656074b69577b6774) library for evaluation.*
- *The 8-task average accuracy is based on results of JCommonsenseQA-1.1, JNLI-1.1, MARC-ja-1.1, JSQuAD-1.1, jaqket_v2-0.2, xlsum_ja-1.0, xwinograd_ja, and mgsm-1.0.*
- *Model loading is performed with float16, and evaluation is performed with template version 0.3 using few-shot in-context learning.*
- *The number of few-shots is 3,3,3,2,1,1,0,5.*
- *special_tokens_map.json was modified to avoid errors during evaluation of the second half of the benchmarks. As a result, the results for the first half of the benchmarks differ slightly.*
| model | average | jcommonsenseqa | jnli | marc_ja | jsquad | jaqket_v2 | xlsum_ja | xwinograd_ja | mgsm |
| :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- |
| weblab-10b-instruction-sft | 59.11 | 74.62 | 66.56 | 95.49 | 78.34 | 63.32 | 20.57 | 71.95 | 2 |
| weblab-10b | 50.74 | 66.58 | 53.74 | 82.07 | 62.94 | 56.19 | 10.03 | 71.95 | 2.4 |
* **Japanese benchmark : JGLUE 4-task (2023-08-18)**
- *We used the [Stability-AI/lm-evaluation-harness](https://github.com/Stability-AI/lm-evaluation-harness/tree/2f1583c0735eacdfdfa5b7d656074b69577b6774) library for evaluation.*
- *The 4-task average accuracy is based on results of JCommonsenseQA-1.1, JNLI-1.1, MARC-ja-1.1, and JSQuAD-1.1.*
- *Model loading is performed with float16, and evaluation is performed with template version 0.3 using few-shot in-context learning.*
- *The number of few-shots is 3,3,3,2.*
| Model | Average | JCommonsenseQA | JNLI | MARC-ja | JSQuAD |
| :-- | :-- | :-- | :-- | :-- | :-- |
| weblab-10b-instruction-sft | 78.78 | 74.35 | 65.65 | 96.06 | 79.04 |
| weblab-10b | 66.38 | 65.86 | 54.19 | 84.49 | 60.98 |
---
# How to use the model
~~~~python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("matsuo-lab/weblab-10b-instruction-sft")
model = AutoModelForCausalLM.from_pretrained("matsuo-lab/weblab-10b-instruction-sft", torch_dtype=torch.float16)
if torch.cuda.is_available():
model = model.to("cuda")
text = "大規模言語モデルについて説明してください。"
text = f'以下は、タスクを説明する指示です。要求を適切に満たす応答を書きなさい。\n\n### 指示:\n{text}\n\n### 応答:'
token_ids = tokenizer.encode(text, add_special_tokens=False, return_tensors="pt")
with torch.no_grad():
output_ids = model.generate(
token_ids.to(model.device),
max_new_tokens=100,
do_sample=True,
temperature=0.7,
top_p=0.95
)
output = tokenizer.decode(output_ids.tolist()[0])
print(output)
~~~~
---
# License
[cc-by-nc-4.0](https://creativecommons.org/licenses/by-nc/4.0/) | 4,421 | [
[
-0.037506103515625,
-0.062103271484375,
0.0238189697265625,
0.0015659332275390625,
-0.01457977294921875,
-0.00995635986328125,
-0.0122528076171875,
-0.0275115966796875,
-0.00792694091796875,
0.00859832763671875,
-0.046539306640625,
-0.054962158203125,
-0.044525146484375,
-0.007228851318359375,
-0.01209259033203125,
0.07745361328125,
-0.01349639892578125,
0.0077667236328125,
-0.00524139404296875,
-0.016876220703125,
-0.0183868408203125,
-0.02691650390625,
-0.055755615234375,
-0.02325439453125,
0.011260986328125,
0.0076446533203125,
0.038604736328125,
0.046905517578125,
0.03546142578125,
0.0257720947265625,
-0.001270294189453125,
-0.005512237548828125,
-0.0270843505859375,
-0.026611328125,
0.01482391357421875,
-0.033782958984375,
-0.041290283203125,
0.003482818603515625,
0.044036865234375,
0.025299072265625,
-0.0036144256591796875,
0.02911376953125,
-0.003223419189453125,
0.037567138671875,
-0.03802490234375,
0.031890869140625,
-0.0236663818359375,
-0.002178192138671875,
-0.008514404296875,
0.012939453125,
-0.02374267578125,
0.0013179779052734375,
0.003589630126953125,
-0.0650634765625,
0.0236053466796875,
-0.0015468597412109375,
0.09063720703125,
0.025390625,
-0.01409912109375,
-0.001983642578125,
-0.02630615234375,
0.0599365234375,
-0.0709228515625,
0.028839111328125,
0.027069091796875,
0.0153961181640625,
-0.002471923828125,
-0.058502197265625,
-0.037506103515625,
-0.0129547119140625,
-0.006862640380859375,
0.02691650390625,
-0.015899658203125,
0.007633209228515625,
0.039764404296875,
0.0145416259765625,
-0.0570068359375,
0.01043701171875,
-0.03485107421875,
-0.01910400390625,
0.058685302734375,
0.0195770263671875,
0.01702880859375,
-0.0241851806640625,
-0.025909423828125,
-0.03216552734375,
-0.0293426513671875,
0.0245208740234375,
0.02471923828125,
0.01302337646484375,
-0.0458984375,
0.028106689453125,
-0.0264434814453125,
0.046783447265625,
0.005764007568359375,
-0.030181884765625,
0.047943115234375,
-0.0263519287109375,
-0.020904541015625,
-0.0129547119140625,
0.09881591796875,
0.0283050537109375,
-0.001972198486328125,
0.01158905029296875,
-0.00847625732421875,
-0.0033550262451171875,
-0.005977630615234375,
-0.072265625,
-0.0234527587890625,
0.01352691650390625,
-0.030609130859375,
-0.01543426513671875,
0.010498046875,
-0.046966552734375,
0.003940582275390625,
-0.01148223876953125,
0.045654296875,
-0.04412841796875,
-0.01220703125,
0.0025272369384765625,
-0.007167816162109375,
0.02349853515625,
0.022064208984375,
-0.053955078125,
0.021209716796875,
0.0292510986328125,
0.06744384765625,
-0.016448974609375,
-0.03753662109375,
-0.00856781005859375,
-0.00659942626953125,
-0.0159149169921875,
0.0328369140625,
-0.006011962890625,
-0.031890869140625,
-0.032012939453125,
0.01398468017578125,
-0.0196990966796875,
-0.0238189697265625,
0.0302734375,
-0.01483154296875,
0.033599853515625,
-0.0144500732421875,
-0.040130615234375,
-0.0260467529296875,
0.02813720703125,
-0.041748046875,
0.09698486328125,
0.0210113525390625,
-0.0560302734375,
0.01328277587890625,
-0.06005859375,
-0.00714111328125,
-0.00843048095703125,
-0.01525115966796875,
-0.04852294921875,
-0.0149383544921875,
0.02325439453125,
0.036224365234375,
-0.030029296875,
0.0267791748046875,
-0.018768310546875,
-0.036773681640625,
0.00725555419921875,
-0.04443359375,
0.0810546875,
0.0135955810546875,
-0.053131103515625,
0.017547607421875,
-0.07159423828125,
-0.00463104248046875,
0.0247650146484375,
-0.01313018798828125,
0.00647735595703125,
-0.0276947021484375,
0.00559234619140625,
0.0210723876953125,
0.0213623046875,
-0.0297698974609375,
0.01354217529296875,
-0.0311737060546875,
0.02874755859375,
0.049774169921875,
-0.002346038818359375,
0.025146484375,
-0.019073486328125,
0.0419921875,
0.010223388671875,
0.0223236083984375,
-0.01335906982421875,
-0.049102783203125,
-0.0679931640625,
-0.022979736328125,
0.020751953125,
0.044952392578125,
-0.054718017578125,
0.0285186767578125,
-0.01227569580078125,
-0.048736572265625,
-0.034881591796875,
0.00200653076171875,
0.045501708984375,
0.051422119140625,
0.03961181640625,
-0.0171966552734375,
-0.03131103515625,
-0.0679931640625,
0.01513671875,
-0.01184844970703125,
0.005828857421875,
0.0164794921875,
0.050994873046875,
-0.0215606689453125,
0.054595947265625,
-0.03900146484375,
-0.0007386207580566406,
-0.007289886474609375,
0.02801513671875,
0.028594970703125,
0.0482177734375,
0.0599365234375,
-0.036651611328125,
-0.044891357421875,
-0.0072174072265625,
-0.06451416015625,
0.006748199462890625,
-0.003040313720703125,
-0.019256591796875,
0.028076171875,
0.03192138671875,
-0.06402587890625,
0.042236328125,
0.0272674560546875,
-0.02252197265625,
0.047119140625,
-0.0021514892578125,
0.0161590576171875,
-0.09674072265625,
0.0187835693359375,
0.008819580078125,
-0.00629425048828125,
-0.033843994140625,
0.0162811279296875,
0.005893707275390625,
-0.00402069091796875,
-0.053955078125,
0.0628662109375,
-0.03076171875,
0.0038089752197265625,
-0.00907135009765625,
0.006572723388671875,
0.004619598388671875,
0.057403564453125,
-0.005023956298828125,
0.06591796875,
0.049407958984375,
-0.035491943359375,
0.031494140625,
0.00970458984375,
-0.0182037353515625,
0.012542724609375,
-0.055145263671875,
0.007244110107421875,
0.00939178466796875,
0.0167999267578125,
-0.06854248046875,
-0.00977325439453125,
0.03704833984375,
-0.046905517578125,
0.02947998046875,
-0.016845703125,
-0.034881591796875,
-0.0382080078125,
-0.0245819091796875,
0.032318115234375,
0.03033447265625,
-0.0274505615234375,
0.036529541015625,
0.007152557373046875,
0.00965118408203125,
-0.056976318359375,
-0.04718017578125,
-0.0305328369140625,
-0.011749267578125,
-0.037078857421875,
0.028564453125,
-0.0227508544921875,
-0.002307891845703125,
0.0013456344604492188,
-0.0029201507568359375,
0.0005598068237304688,
0.0153045654296875,
0.022796630859375,
0.048797607421875,
-0.0172882080078125,
-0.0161590576171875,
0.0008220672607421875,
-0.0192108154296875,
0.01160430908203125,
-0.0081939697265625,
0.05303955078125,
-0.0305328369140625,
-0.0230865478515625,
-0.052886962890625,
-0.004970550537109375,
0.043426513671875,
0.0013971328735351562,
0.0615234375,
0.08099365234375,
-0.03387451171875,
0.018310546875,
-0.0265960693359375,
-0.00948333740234375,
-0.036346435546875,
0.04095458984375,
-0.0389404296875,
-0.04949951171875,
0.0592041015625,
0.0168914794921875,
0.0109710693359375,
0.068603515625,
0.034454345703125,
-0.0010137557983398438,
0.0816650390625,
0.0248870849609375,
-0.019439697265625,
0.0309600830078125,
-0.069580078125,
-0.0010652542114257812,
-0.066162109375,
-0.0148468017578125,
-0.032867431640625,
-0.0162353515625,
-0.056488037109375,
-0.0343017578125,
0.029052734375,
0.01617431640625,
-0.043212890625,
0.03143310546875,
-0.043365478515625,
0.017852783203125,
0.052764892578125,
0.005950927734375,
-0.003986358642578125,
-0.01012420654296875,
-0.03118896484375,
0.004505157470703125,
-0.0626220703125,
-0.020172119140625,
0.08905029296875,
0.0255126953125,
0.044097900390625,
-0.006313323974609375,
0.05816650390625,
-0.0072174072265625,
0.01445770263671875,
-0.048919677734375,
0.043212890625,
0.005329132080078125,
-0.05450439453125,
-0.021453857421875,
-0.041046142578125,
-0.060760498046875,
0.016448974609375,
-0.01306915283203125,
-0.0689697265625,
0.00373077392578125,
0.00003898143768310547,
-0.0369873046875,
0.0167694091796875,
-0.05731201171875,
0.08917236328125,
-0.0282745361328125,
-0.037628173828125,
-0.0038814544677734375,
-0.049041748046875,
0.02777099609375,
0.01149749755859375,
0.0168304443359375,
-0.0184173583984375,
0.0095062255859375,
0.07452392578125,
-0.0293426513671875,
0.05572509765625,
-0.0197906494140625,
0.01447296142578125,
0.0308685302734375,
-0.010833740234375,
0.032562255859375,
0.00995635986328125,
-0.0114898681640625,
0.034698486328125,
0.01322174072265625,
-0.032867431640625,
-0.031585693359375,
0.055999755859375,
-0.08880615234375,
-0.039306640625,
-0.046966552734375,
-0.0465087890625,
-0.0024242401123046875,
0.0301361083984375,
0.043426513671875,
0.033905029296875,
-0.0005421638488769531,
0.0162353515625,
0.0389404296875,
-0.0212554931640625,
0.049560546875,
0.0299835205078125,
-0.021881103515625,
-0.04241943359375,
0.067626953125,
0.0124359130859375,
0.0180816650390625,
0.0209808349609375,
0.01168060302734375,
-0.022308349609375,
-0.03460693359375,
-0.06414794921875,
0.0204925537109375,
-0.04852294921875,
-0.0292205810546875,
-0.041900634765625,
-0.0279693603515625,
-0.043212890625,
-0.0092620849609375,
-0.04388427734375,
-0.03729248046875,
-0.040435791015625,
-0.0100555419921875,
0.0323486328125,
0.03375244140625,
0.0111541748046875,
0.0270538330078125,
-0.052276611328125,
0.0189666748046875,
0.006256103515625,
0.0173797607421875,
-0.000392913818359375,
-0.055206298828125,
-0.033355712890625,
0.0111846923828125,
-0.03533935546875,
-0.052978515625,
0.03094482421875,
-0.0004603862762451172,
0.050811767578125,
0.0306549072265625,
-0.006946563720703125,
0.060791015625,
-0.0145263671875,
0.069580078125,
0.0218048095703125,
-0.0699462890625,
0.041351318359375,
-0.036102294921875,
0.049713134765625,
0.044525146484375,
0.042755126953125,
-0.016998291015625,
-0.0190887451171875,
-0.06549072265625,
-0.06982421875,
0.08135986328125,
0.013458251953125,
-0.00861358642578125,
0.00864410400390625,
0.0308380126953125,
-0.00910186767578125,
0.005279541015625,
-0.06488037109375,
-0.042816162109375,
-0.02410888671875,
-0.033843994140625,
0.00019931793212890625,
0.0027942657470703125,
-0.00836181640625,
-0.037384033203125,
0.07379150390625,
-0.015655517578125,
0.03485107421875,
0.004055023193359375,
-0.00890350341796875,
0.00017547607421875,
-0.01026153564453125,
0.047149658203125,
0.040008544921875,
-0.0204925537109375,
-0.01209259033203125,
0.0250244140625,
-0.046356201171875,
0.004184722900390625,
0.0296630859375,
-0.035491943359375,
-0.01079559326171875,
0.02740478515625,
0.0855712890625,
0.016632080078125,
-0.037200927734375,
0.0299530029296875,
-0.010711669921875,
-0.024078369140625,
-0.0204620361328125,
0.0213623046875,
0.00775146484375,
0.0124359130859375,
0.0245208740234375,
0.0025348663330078125,
0.006771087646484375,
-0.029144287109375,
0.007122039794921875,
0.0287017822265625,
-0.01027679443359375,
-0.022369384765625,
0.060882568359375,
0.0004260540008544922,
-0.00952911376953125,
0.0457763671875,
-0.0330810546875,
-0.03558349609375,
0.058929443359375,
0.041168212890625,
0.05950927734375,
-0.01059722900390625,
0.005138397216796875,
0.06707763671875,
0.0190582275390625,
-0.006744384765625,
0.0086517333984375,
0.00787353515625,
-0.051116943359375,
-0.0110015869140625,
-0.047332763671875,
-0.0206146240234375,
0.01519775390625,
-0.0513916015625,
0.027984619140625,
-0.046539306640625,
-0.0236663818359375,
-0.01141357421875,
0.0302734375,
-0.05718994140625,
0.021148681640625,
0.00806427001953125,
0.054595947265625,
-0.0599365234375,
0.06591796875,
0.05133056640625,
-0.04742431640625,
-0.07525634765625,
-0.01335906982421875,
0.0106964111328125,
-0.054046630859375,
0.02557373046875,
0.019378662109375,
-0.00016617774963378906,
0.0124969482421875,
-0.039459228515625,
-0.0872802734375,
0.11175537109375,
0.0267791748046875,
-0.039764404296875,
0.0002932548522949219,
0.0064239501953125,
0.035186767578125,
-0.01349639892578125,
0.04425048828125,
0.035400390625,
0.030548095703125,
-0.003849029541015625,
-0.0709228515625,
0.00530242919921875,
-0.0404052734375,
-0.007251739501953125,
0.0110015869140625,
-0.07257080078125,
0.071533203125,
-0.00299835205078125,
0.00487518310546875,
0.000850677490234375,
0.053985595703125,
0.039703369140625,
0.0279998779296875,
0.02044677734375,
0.0638427734375,
0.04718017578125,
-0.0186920166015625,
0.07147216796875,
-0.039764404296875,
0.045562744140625,
0.078857421875,
0.01058197021484375,
0.05419921875,
0.01012420654296875,
-0.0286102294921875,
0.04010009765625,
0.0548095703125,
-0.01300048828125,
0.0243072509765625,
-0.010162353515625,
-0.0023899078369140625,
0.00484466552734375,
0.017425537109375,
-0.034210205078125,
0.033843994140625,
0.0211639404296875,
-0.0178680419921875,
0.005802154541015625,
0.006992340087890625,
0.01514434814453125,
-0.0269622802734375,
-0.00769805908203125,
0.044525146484375,
-0.0009374618530273438,
-0.044036865234375,
0.06512451171875,
0.0181732177734375,
0.055145263671875,
-0.042144775390625,
0.00417327880859375,
-0.010650634765625,
0.00335693359375,
-0.0100860595703125,
-0.038818359375,
0.0017557144165039062,
0.002285003662109375,
-0.01158905029296875,
0.005077362060546875,
0.039794921875,
-0.0166473388671875,
-0.057220458984375,
0.024444580078125,
0.0221405029296875,
0.01183319091796875,
0.01410675048828125,
-0.089599609375,
0.0239410400390625,
0.00856781005859375,
-0.04522705078125,
0.0278778076171875,
0.01513671875,
0.00196075439453125,
0.041900634765625,
0.043212890625,
-0.0232696533203125,
0.018280029296875,
0.007129669189453125,
0.0595703125,
-0.045501708984375,
-0.0178375244140625,
-0.0587158203125,
0.048614501953125,
-0.003570556640625,
-0.048431396484375,
0.06207275390625,
0.05859375,
0.08319091796875,
0.00406646728515625,
0.050323486328125,
-0.0183868408203125,
0.01229095458984375,
-0.042205810546875,
0.05999755859375,
-0.042999267578125,
0.01116180419921875,
-0.0280609130859375,
-0.0596923828125,
-0.01090240478515625,
0.06390380859375,
-0.018768310546875,
0.0247802734375,
0.0517578125,
0.06597900390625,
-0.006061553955078125,
-0.01241302490234375,
0.0150299072265625,
0.0230712890625,
0.0188446044921875,
0.045928955078125,
0.0267791748046875,
-0.0755615234375,
0.0306549072265625,
-0.06060791015625,
-0.01174163818359375,
-0.00948333740234375,
-0.044891357421875,
-0.064208984375,
-0.0419921875,
-0.0377197265625,
-0.02935791015625,
-0.005397796630859375,
0.08428955078125,
0.060699462890625,
-0.06573486328125,
-0.0245513916015625,
-0.013092041015625,
-0.0098114013671875,
-0.021392822265625,
-0.0213165283203125,
0.03790283203125,
-0.0171661376953125,
-0.07281494140625,
0.027496337890625,
-0.002368927001953125,
0.01105499267578125,
-0.0177764892578125,
-0.0291900634765625,
-0.032012939453125,
-0.01343536376953125,
0.030303955078125,
0.01453399658203125,
-0.052398681640625,
-0.00545501708984375,
0.00875091552734375,
-0.0158843994140625,
0.01232147216796875,
0.0198822021484375,
-0.048370361328125,
0.03375244140625,
0.03277587890625,
0.0305938720703125,
0.0692138671875,
-0.006130218505859375,
0.018310546875,
-0.047210693359375,
0.032958984375,
0.0027751922607421875,
0.038665771484375,
0.0189971923828125,
-0.0292205810546875,
0.04461669921875,
0.036834716796875,
-0.03369140625,
-0.055206298828125,
-0.0118865966796875,
-0.07818603515625,
-0.0146942138671875,
0.084228515625,
-0.0243072509765625,
-0.031005859375,
0.0164794921875,
-0.0133056640625,
0.04327392578125,
-0.030914306640625,
0.0450439453125,
0.050140380859375,
-0.01617431640625,
-0.0289154052734375,
-0.046905517578125,
0.0214691162109375,
0.02508544921875,
-0.061309814453125,
-0.0260009765625,
0.0322265625,
0.031585693359375,
0.020355224609375,
0.048431396484375,
-0.00789642333984375,
0.03326416015625,
0.0111541748046875,
0.0186004638671875,
-0.01885986328125,
-0.017120361328125,
-0.01995849609375,
-0.0028324127197265625,
-0.005786895751953125,
-0.0169677734375
]
] |
cardiffnlp/twitter-roberta-base-sep2022 | 2022-10-14T15:36:18.000Z | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"timelms",
"twitter",
"en",
"dataset:twitter-api",
"arxiv:2202.03829",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | cardiffnlp | null | null | cardiffnlp/twitter-roberta-base-sep2022 | 6 | 11,576 | transformers | 2022-10-14T15:19:24 | ---
language: en
tags:
- timelms
- twitter
license: mit
datasets:
- twitter-api
---
# Twitter September 2022 (RoBERTa-base, 169M)
This is a RoBERTa-base model trained on 168.86M tweets collected up to the end of September 2022 (a 15M-tweet increment over the previous checkpoint).
More details and performance scores are available in the [TimeLMs paper](https://arxiv.org/abs/2202.03829).
Below, we provide some usage examples using the standard Transformers interface. For another interface more suited to comparing predictions and perplexity scores between models trained at different temporal intervals, check the [TimeLMs repository](https://github.com/cardiffnlp/timelms).
For other models trained until different periods, check this [table](https://github.com/cardiffnlp/timelms#released-models).
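If you only need a quick, self-contained comparison and don't want to install the TimeLMs helper, a naive masked-token pseudo-log-likelihood loop over the standard Transformers API is enough for rough side-by-side scoring. This is a minimal sketch (one mask at a time, no batching); swap in any checkpoint from the released-models table to compare models:
```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

def pseudo_log_likelihood(text, model, tokenizer):
    # Mask each position in turn and sum the log-probability the model
    # assigns to the original token (higher is better).
    input_ids = tokenizer(text, return_tensors="pt")["input_ids"][0]
    total = 0.0
    for i in range(1, len(input_ids) - 1):  # skip <s> and </s>
        masked = input_ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        total += torch.log_softmax(logits, dim=-1)[input_ids[i]].item()
    return total

MODEL = "cardiffnlp/twitter-roberta-base-sep2022"  # try other released checkpoints too
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForMaskedLM.from_pretrained(MODEL)
print(pseudo_log_likelihood("So glad I'm fully vaccinated.", model, tokenizer))
```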
## Preprocess Text
Replace usernames and links with the placeholders "@user" and "http".
If you're interested in retaining verified users, which were also kept during training, you may preserve the users listed [here](https://github.com/cardiffnlp/timelms/tree/main/data).
```python
def preprocess(text):
    preprocessed_text = []
    for t in text.split():
        if len(t) > 1:
            # collapse single @mentions to '@user' (tokens with several '@' are kept as-is)
            t = '@user' if t[0] == '@' and t.count('@') == 1 else t
            # collapse any link to the placeholder 'http'
            t = 'http' if t.startswith('http') else t
        preprocessed_text.append(t)
    return ' '.join(preprocessed_text)
```
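For instance (an illustrative, made-up tweet):
```python
print(preprocess("Loving the update @user123 check https://t.co/abc"))
# -> Loving the update @user check http
```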
## Example Masked Language Model
```python
from transformers import pipeline, AutoTokenizer
MODEL = "cardiffnlp/twitter-roberta-base-sep2022"
fill_mask = pipeline("fill-mask", model=MODEL, tokenizer=MODEL)
tokenizer = AutoTokenizer.from_pretrained(MODEL)
def pprint(candidates, n):
for i in range(n):
token = tokenizer.decode(candidates[i]['token'])
score = candidates[i]['score']
print("%d) %.5f %s" % (i+1, score, token))
texts = [
"So glad I'm <mask> vaccinated.",
"I keep forgetting to bring a <mask>.",
"Looking forward to watching <mask> Game tonight!",
]
for text in texts:
t = preprocess(text)
print(f"{'-'*30}\n{t}")
candidates = fill_mask(t)
pprint(candidates, 5)
```
Output:
```
------------------------------
So glad I'm <mask> vaccinated.
1) 0.60140 not
2) 0.15077 getting
3) 0.12119 fully
4) 0.02203 still
5) 0.01020 all
------------------------------
I keep forgetting to bring a <mask>.
1) 0.05812 charger
2) 0.05040 backpack
3) 0.05004 book
4) 0.04548 bag
5) 0.03992 lighter
------------------------------
Looking forward to watching <mask> Game tonight!
1) 0.39552 the
2) 0.28083 The
3) 0.02029 End
4) 0.01878 Squid
5) 0.01438 this
```
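The fill-mask pipeline returns the top five candidates by default; continuing the example above, you can widen that with the pipeline's `top_k` argument:
```python
candidates = fill_mask(t, top_k=10)
pprint(candidates, 10)
```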
## Example Tweet Embeddings
```python
from transformers import AutoTokenizer, AutoModel
import numpy as np
from scipy.spatial.distance import cosine
from collections import Counter
def get_embedding(text): # naive approach for demonstration
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
features = model(**encoded_input)
features = features[0].detach().cpu().numpy()
return np.mean(features[0], axis=0)
MODEL = "cardiffnlp/twitter-roberta-base-sep2022"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModel.from_pretrained(MODEL)
query = "The book was awesome"
tweets = ["I just ordered fried chicken 🐣",
"The movie was great",
"What time is the next game?",
"Just finished reading 'Embeddings in NLP'"]
sims = Counter()
for tweet in tweets:
sim = 1 - cosine(get_embedding(query), get_embedding(tweet))
sims[tweet] = sim
print('Most similar to: ', query)
print(f"{'-'*30}")
for idx, (tweet, sim) in enumerate(sims.most_common()):
print("%d) %.5f %s" % (idx+1, sim, tweet))
```
Output:
```
Most similar to: The book was awesome
------------------------------
1) 0.98914 The movie was great
2) 0.96194 Just finished reading 'Embeddings in NLP'
3) 0.94603 What time is the next game?
4) 0.94580 I just ordered fried chicken 🐣
```
## Example Feature Extraction
```python
from transformers import AutoTokenizer, AutoModel, TFAutoModel
import numpy as np
MODEL = "cardiffnlp/twitter-roberta-base-sep2022"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
text = "Good night 😊"
text = preprocess(text)
# Pytorch
model = AutoModel.from_pretrained(MODEL)
encoded_input = tokenizer(text, return_tensors='pt')
features = model(**encoded_input)
features = features[0].detach().cpu().numpy()
features_mean = np.mean(features[0], axis=0)
#features_max = np.max(features[0], axis=0)
# # Tensorflow
# model = TFAutoModel.from_pretrained(MODEL)
# encoded_input = tokenizer(text, return_tensors='tf')
# features = model(encoded_input)
# features = features[0].numpy()
# features_mean = np.mean(features[0], axis=0)
# #features_max = np.max(features[0], axis=0)
``` | 4,759 | [
[
-0.018890380859375,
-0.040740966796875,
0.00962066650390625,
0.02325439453125,
-0.016021728515625,
0.006710052490234375,
-0.006748199462890625,
-0.005992889404296875,
0.0162811279296875,
-0.0000597834587097168,
-0.036651611328125,
-0.04443359375,
-0.05609130859375,
0.01094818115234375,
-0.03814697265625,
0.060577392578125,
-0.007648468017578125,
-0.007434844970703125,
0.0213470458984375,
-0.006999969482421875,
-0.003814697265625,
-0.0413818359375,
-0.037567138671875,
-0.0301971435546875,
0.01885986328125,
0.01006317138671875,
0.0312347412109375,
0.0528564453125,
0.0323486328125,
0.0367431640625,
0.005985260009765625,
0.00002562999725341797,
-0.0222930908203125,
-0.003520965576171875,
-0.0031070709228515625,
-0.02740478515625,
-0.02099609375,
0.01313018798828125,
0.04937744140625,
0.0291900634765625,
0.0018587112426757812,
0.0245513916015625,
0.00237274169921875,
0.02044677734375,
-0.0296630859375,
0.01157379150390625,
-0.03387451171875,
0.0156097412109375,
-0.0003871917724609375,
-0.0205230712890625,
-0.0124664306640625,
-0.0221099853515625,
0.00852203369140625,
-0.038482666015625,
0.0231781005859375,
-0.01058197021484375,
0.09832763671875,
0.022186279296875,
-0.01111602783203125,
-0.018890380859375,
-0.0406494140625,
0.08111572265625,
-0.0576171875,
0.007488250732421875,
0.0261688232421875,
0.0110931396484375,
0.0013551712036132812,
-0.058258056640625,
-0.046234130859375,
-0.00975799560546875,
-0.0109100341796875,
0.0185546875,
-0.0250701904296875,
-0.024993896484375,
0.024688720703125,
0.0098114013671875,
-0.04345703125,
-0.0129241943359375,
-0.0251922607421875,
-0.00714874267578125,
0.052337646484375,
0.00429534912109375,
0.019989013671875,
-0.0309600830078125,
-0.02484130859375,
-0.026763916015625,
-0.008056640625,
-0.002105712890625,
0.004425048828125,
0.004055023193359375,
-0.0250244140625,
0.046844482421875,
-0.01395416259765625,
0.033966064453125,
0.01273345947265625,
-0.0034027099609375,
0.04681396484375,
-0.0217437744140625,
-0.0247344970703125,
-0.018310546875,
0.08447265625,
0.022979736328125,
0.036224365234375,
-0.00275421142578125,
-0.01215362548828125,
-0.01397705078125,
-0.00992584228515625,
-0.059295654296875,
-0.0255584716796875,
0.0149383544921875,
-0.039703369140625,
-0.03173828125,
0.00966644287109375,
-0.040557861328125,
0.004245758056640625,
0.002384185791015625,
0.055084228515625,
-0.050537109375,
-0.0198211669921875,
-0.0026721954345703125,
-0.0251617431640625,
0.0158843994140625,
0.006145477294921875,
-0.0648193359375,
-0.0009927749633789062,
0.0294189453125,
0.072509765625,
0.0079498291015625,
-0.041412353515625,
-0.0174102783203125,
0.01065826416015625,
-0.02392578125,
0.043060302734375,
-0.0250701904296875,
-0.016632080078125,
-0.01061248779296875,
-0.001033782958984375,
-0.029327392578125,
-0.0263214111328125,
0.00852203369140625,
-0.018157958984375,
0.02880859375,
-0.007049560546875,
-0.04351806640625,
-0.00007158517837524414,
0.0193328857421875,
-0.030792236328125,
0.08544921875,
0.027801513671875,
-0.05743408203125,
0.019805908203125,
-0.06396484375,
-0.022216796875,
-0.01073455810546875,
-0.0014429092407226562,
-0.040924072265625,
-0.00664520263671875,
0.036712646484375,
0.041412353515625,
-0.00263214111328125,
0.01654052734375,
-0.0195465087890625,
-0.0281829833984375,
0.022674560546875,
-0.024505615234375,
0.0888671875,
0.015899658203125,
-0.04345703125,
-0.0020618438720703125,
-0.0440673828125,
0.00604248046875,
0.0318603515625,
-0.0205078125,
0.00818634033203125,
-0.01558685302734375,
-0.004489898681640625,
0.0186767578125,
0.0210723876953125,
-0.036773681640625,
0.020965576171875,
-0.036346435546875,
0.0606689453125,
0.0489501953125,
0.002529144287109375,
0.0283203125,
-0.039642333984375,
0.0206146240234375,
0.0188446044921875,
0.0234527587890625,
-0.0174102783203125,
-0.037200927734375,
-0.06494140625,
-0.0059814453125,
0.0254669189453125,
0.040008544921875,
-0.05145263671875,
0.051116943359375,
-0.028961181640625,
-0.04962158203125,
-0.035797119140625,
-0.003025054931640625,
0.0174102783203125,
0.033905029296875,
0.04034423828125,
-0.0029926300048828125,
-0.05316162109375,
-0.05120849609375,
-0.0274658203125,
-0.0231475830078125,
-0.004665374755859375,
0.01157379150390625,
0.0626220703125,
-0.014892578125,
0.06353759765625,
-0.0345458984375,
-0.0159912109375,
-0.0206451416015625,
0.016082763671875,
0.0350341796875,
0.06121826171875,
0.055328369140625,
-0.051300048828125,
-0.050323486328125,
-0.0222015380859375,
-0.0567626953125,
-0.0008764266967773438,
-0.0017566680908203125,
-0.0026226043701171875,
0.0234527587890625,
0.03375244140625,
-0.051116943359375,
0.045135498046875,
0.0238037109375,
-0.0285491943359375,
0.02667236328125,
-0.007747650146484375,
0.01461029052734375,
-0.10321044921875,
0.00885772705078125,
-0.0054473876953125,
-0.00656890869140625,
-0.035308837890625,
-0.0173492431640625,
-0.004238128662109375,
0.0093994140625,
-0.03045654296875,
0.05316162109375,
-0.0244598388671875,
0.01141357421875,
0.00799560546875,
-0.00457000732421875,
0.0037059783935546875,
0.0304718017578125,
-0.01153564453125,
0.049102783203125,
0.037933349609375,
-0.0352783203125,
0.02850341796875,
0.0178680419921875,
-0.0208740234375,
0.0055999755859375,
-0.051116943359375,
0.00022232532501220703,
0.00717926025390625,
0.0133514404296875,
-0.0849609375,
-0.021881103515625,
0.032135009765625,
-0.07110595703125,
0.0218963623046875,
-0.01308441162109375,
-0.0484619140625,
-0.04541015625,
-0.0229339599609375,
0.02679443359375,
0.049468994140625,
-0.033538818359375,
0.05255126953125,
0.02032470703125,
0.01306915283203125,
-0.05340576171875,
-0.07000732421875,
0.00518798828125,
-0.0142059326171875,
-0.03472900390625,
0.0305633544921875,
0.0011320114135742188,
-0.002777099609375,
0.0158843994140625,
0.00707244873046875,
-0.0038089752197265625,
0.007080078125,
0.00815582275390625,
0.01258087158203125,
-0.0141143798828125,
0.004955291748046875,
-0.0198516845703125,
-0.0155181884765625,
-0.00006526708602905273,
-0.0341796875,
0.06304931640625,
-0.017913818359375,
-0.01268768310546875,
-0.037872314453125,
0.01544952392578125,
0.0157928466796875,
-0.005863189697265625,
0.06353759765625,
0.0860595703125,
-0.038909912109375,
0.003719329833984375,
-0.04827880859375,
-0.01358795166015625,
-0.03900146484375,
0.043365478515625,
-0.030975341796875,
-0.046905517578125,
0.046051025390625,
0.01763916015625,
0.0084228515625,
0.07403564453125,
0.04443359375,
-0.0181884765625,
0.0787353515625,
0.034423828125,
-0.005535125732421875,
0.04913330078125,
-0.06549072265625,
0.007781982421875,
-0.045562744140625,
-0.026702880859375,
-0.0301055908203125,
-0.018798828125,
-0.05126953125,
-0.035369873046875,
0.00908660888671875,
0.01343536376953125,
-0.043701171875,
0.035980224609375,
-0.05462646484375,
0.015899658203125,
0.0535888671875,
0.005786895751953125,
-0.01202392578125,
0.013702392578125,
-0.0217437744140625,
-0.008331298828125,
-0.06329345703125,
-0.0251312255859375,
0.09161376953125,
0.0277252197265625,
0.034942626953125,
-0.0027618408203125,
0.07342529296875,
0.007762908935546875,
0.037353515625,
-0.039825439453125,
0.039794921875,
-0.0214691162109375,
-0.05108642578125,
-0.022552490234375,
-0.041473388671875,
-0.0633544921875,
0.002532958984375,
-0.01454925537109375,
-0.06353759765625,
0.007488250732421875,
-0.005191802978515625,
-0.0295562744140625,
0.03741455078125,
-0.06158447265625,
0.053924560546875,
-0.01222991943359375,
-0.0164642333984375,
0.00582122802734375,
-0.040252685546875,
0.006771087646484375,
0.0027828216552734375,
0.001636505126953125,
-0.00972747802734375,
-0.0117950439453125,
0.08111572265625,
-0.03778076171875,
0.055816650390625,
-0.006885528564453125,
0.035675048828125,
0.0204925537109375,
-0.00893402099609375,
0.0112457275390625,
0.0013437271118164062,
-0.01654052734375,
0.0182342529296875,
0.004581451416015625,
-0.0428466796875,
-0.0308380126953125,
0.055084228515625,
-0.08642578125,
-0.052703857421875,
-0.0552978515625,
-0.0306549072265625,
0.01171875,
0.03399658203125,
0.048431396484375,
0.0318603515625,
0.006244659423828125,
0.0166168212890625,
0.0225830078125,
-0.0232696533203125,
0.069091796875,
0.01473236083984375,
-0.01061248779296875,
-0.051239013671875,
0.06243896484375,
0.0229644775390625,
0.0038909912109375,
0.03997802734375,
0.0311737060546875,
-0.0302734375,
-0.0270843505859375,
-0.01222991943359375,
0.037322998046875,
-0.04901123046875,
-0.0213165283203125,
-0.072509765625,
-0.048980712890625,
-0.063232421875,
-0.00890350341796875,
-0.02252197265625,
-0.03692626953125,
-0.037750244140625,
0.004482269287109375,
0.0286102294921875,
0.064453125,
-0.024932861328125,
0.0211334228515625,
-0.052154541015625,
0.01361846923828125,
0.01020050048828125,
0.01366424560546875,
0.00429534912109375,
-0.061492919921875,
-0.024566650390625,
0.0026874542236328125,
-0.035308837890625,
-0.05828857421875,
0.05389404296875,
0.0094757080078125,
0.038238525390625,
0.028411865234375,
-0.00397491455078125,
0.06298828125,
-0.0201263427734375,
0.065673828125,
0.0174560546875,
-0.0728759765625,
0.042266845703125,
-0.017822265625,
0.03900146484375,
0.0175018310546875,
0.03436279296875,
-0.0308990478515625,
-0.020599365234375,
-0.0726318359375,
-0.06427001953125,
0.06353759765625,
0.032562255859375,
0.00908660888671875,
-0.0110626220703125,
0.0169830322265625,
-0.016143798828125,
0.0105743408203125,
-0.058746337890625,
-0.04095458984375,
-0.04510498046875,
-0.0301666259765625,
-0.0228424072265625,
-0.01280975341796875,
-0.0014171600341796875,
-0.05322265625,
0.04779052734375,
0.01416778564453125,
0.050384521484375,
0.0267486572265625,
-0.006622314453125,
-0.004673004150390625,
-0.01035308837890625,
0.03875732421875,
0.05169677734375,
-0.0330810546875,
-0.006008148193359375,
0.0231781005859375,
-0.045318603515625,
0.00498199462890625,
0.0254364013671875,
-0.00836181640625,
0.0250701904296875,
0.048614501953125,
0.047454833984375,
0.010894775390625,
-0.0100860595703125,
0.040557861328125,
-0.0037384033203125,
-0.0282745361328125,
-0.04827880859375,
0.0034637451171875,
0.014617919921875,
0.0192718505859375,
0.058197021484375,
0.0108489990234375,
-0.0176544189453125,
-0.03326416015625,
0.0286865234375,
0.01181793212890625,
-0.012969970703125,
-0.0235443115234375,
0.05340576171875,
-0.0037078857421875,
-0.04180908203125,
0.045501708984375,
-0.01233673095703125,
-0.06829833984375,
0.063232421875,
0.037017822265625,
0.07366943359375,
-0.0181121826171875,
0.019500732421875,
0.061004638671875,
0.0242156982421875,
-0.0008344650268554688,
0.01397705078125,
0.0092315673828125,
-0.049713134765625,
-0.0127105712890625,
-0.054351806640625,
0.0004582405090332031,
0.018035888671875,
-0.0298309326171875,
0.0183563232421875,
-0.04241943359375,
-0.032257080078125,
0.0088043212890625,
0.028045654296875,
-0.07183837890625,
0.0225677490234375,
-0.005695343017578125,
0.06768798828125,
-0.06298828125,
0.0638427734375,
0.052764892578125,
-0.0439453125,
-0.07647705078125,
0.00818634033203125,
-0.0185089111328125,
-0.06597900390625,
0.060516357421875,
0.0253753662109375,
0.0218505859375,
0.0232086181640625,
-0.041717529296875,
-0.082763671875,
0.096923828125,
0.0093841552734375,
-0.0213165283203125,
-0.01381683349609375,
0.018890380859375,
0.038665771484375,
-0.0411376953125,
0.055511474609375,
0.038543701171875,
0.026397705078125,
0.007068634033203125,
-0.05169677734375,
0.00445556640625,
-0.0271759033203125,
-0.007236480712890625,
-0.0018186569213867188,
-0.055999755859375,
0.093017578125,
-0.0173797607421875,
-0.01143646240234375,
0.00299072265625,
0.050262451171875,
0.0185546875,
0.0225830078125,
0.036346435546875,
0.04443359375,
0.036773681640625,
-0.02001953125,
0.07574462890625,
-0.02410888671875,
0.059478759765625,
0.053680419921875,
0.0328369140625,
0.05682373046875,
0.0283660888671875,
-0.0215911865234375,
0.03192138671875,
0.05743408203125,
-0.0220794677734375,
0.034759521484375,
-0.0008273124694824219,
0.007595062255859375,
-0.020782470703125,
0.00583648681640625,
-0.036529541015625,
0.029998779296875,
0.022003173828125,
-0.04052734375,
-0.0295562744140625,
-0.0044097900390625,
0.00792694091796875,
-0.0160675048828125,
0.000865936279296875,
0.044189453125,
0.012908935546875,
-0.04962158203125,
0.06683349609375,
-0.0037593841552734375,
0.0518798828125,
-0.03326416015625,
0.0008392333984375,
-0.01035308837890625,
0.02587890625,
-0.01157379150390625,
-0.064453125,
0.01329803466796875,
0.00081634521484375,
-0.005207061767578125,
-0.01248931884765625,
0.03179931640625,
-0.030426025390625,
-0.051116943359375,
0.0311279296875,
0.0254669189453125,
0.0139923095703125,
0.009857177734375,
-0.09552001953125,
0.006809234619140625,
0.00997161865234375,
-0.0423583984375,
0.0023174285888671875,
0.028228759765625,
0.017059326171875,
0.056640625,
0.049163818359375,
0.0074462890625,
0.0167083740234375,
0.0052947998046875,
0.0653076171875,
-0.05584716796875,
-0.0257110595703125,
-0.08392333984375,
0.033355712890625,
-0.016326904296875,
-0.045562744140625,
0.05865478515625,
0.047576904296875,
0.060760498046875,
-0.004955291748046875,
0.056182861328125,
-0.0182952880859375,
0.03753662109375,
-0.0172119140625,
0.06524658203125,
-0.050048828125,
0.00861358642578125,
-0.0132598876953125,
-0.06158447265625,
-0.018218994140625,
0.06195068359375,
-0.03900146484375,
0.025848388671875,
0.055450439453125,
0.050933837890625,
-0.0037860870361328125,
-0.0206451416015625,
0.01073455810546875,
0.0301055908203125,
0.03240966796875,
0.051239013671875,
0.04498291015625,
-0.06427001953125,
0.053192138671875,
-0.052276611328125,
-0.0162353515625,
-0.0310516357421875,
-0.0550537109375,
-0.08984375,
-0.06036376953125,
-0.0211334228515625,
-0.051239013671875,
0.00492095947265625,
0.0787353515625,
0.040130615234375,
-0.0726318359375,
-0.0104827880859375,
0.0014162063598632812,
0.005462646484375,
-0.0061187744140625,
-0.0225677490234375,
0.050140380859375,
-0.0261688232421875,
-0.06298828125,
-0.00432586669921875,
-0.0034656524658203125,
0.02154541015625,
-0.003124237060546875,
0.0019350051879882812,
-0.04779052734375,
0.000579833984375,
0.0225677490234375,
0.01111602783203125,
-0.043853759765625,
-0.0202178955078125,
0.0107574462890625,
-0.0288848876953125,
0.007781982421875,
0.0213165283203125,
-0.04840087890625,
0.0240478515625,
0.039703369140625,
0.0223846435546875,
0.05902099609375,
0.006427764892578125,
0.03363037109375,
-0.034576416015625,
0.0113983154296875,
0.0294647216796875,
0.024505615234375,
0.033447265625,
-0.01158905029296875,
0.043365478515625,
0.03753662109375,
-0.0406494140625,
-0.06634521484375,
-0.0160675048828125,
-0.059478759765625,
-0.0179901123046875,
0.08221435546875,
-0.02069091796875,
-0.03961181640625,
-0.003803253173828125,
0.01139068603515625,
0.05609130859375,
-0.0283203125,
0.064697265625,
0.0496826171875,
0.004241943359375,
-0.00206756591796875,
-0.034912109375,
0.04022216796875,
0.03326416015625,
-0.02783203125,
-0.03424072265625,
-0.0192413330078125,
0.04632568359375,
0.02423095703125,
0.0496826171875,
0.0002887248992919922,
0.01464080810546875,
0.01078033447265625,
-0.004344940185546875,
-0.020904541015625,
-0.0017232894897460938,
-0.0117645263671875,
-0.002044677734375,
-0.029541015625,
-0.035369873046875
]
] |
baichuan-inc/Baichuan2-7B-Base | 2023-10-13T02:00:57.000Z | [
"transformers",
"pytorch",
"baichuan",
"text-generation",
"custom_code",
"en",
"zh",
"license:other",
"has_space",
"region:us"
] | text-generation | baichuan-inc | null | null | baichuan-inc/Baichuan2-7B-Base | 47 | 11,574 | transformers | 2023-08-30T10:11:04 | ---
language:
- en
- zh
license: other
tasks:
- text-generation
---
<!-- markdownlint-disable first-line-h1 -->
<!-- markdownlint-disable html -->
<div align="center">
<h1>
Baichuan 2
</h1>
</div>
<div align="center">
<a href="https://github.com/baichuan-inc/Baichuan2" target="_blank">🦉GitHub</a> | <a href="https://github.com/baichuan-inc/Baichuan-7B/blob/main/media/wechat.jpeg?raw=true" target="_blank">💬WeChat</a>
</div>
<div align="center">
🚀 The <a href="https://www.baichuan-ai.com/" target="_blank">Baichuan large-model online chat platform</a> is now officially open to the public 🎉
</div>
# 目录/Table of Contents
- [📖 模型介绍/Introduction](#Introduction)
- [⚙️ 快速开始/Quick Start](#Start)
- [📊 Benchmark评估/Benchmark Evaluation](#Benchmark)
- [📜 声明与协议/Terms and Conditions](#Terms)
# <span id="Introduction">模型介绍/Introduction</span>
Baichuan 2 是[百川智能]推出的新一代开源大语言模型,采用 **2.6 万亿** Tokens 的高质量语料训练,在权威的中文和英文 benchmark
上均取得同尺寸最好的效果。本次发布包含有 7B、13B 的 Base 和 Chat 版本,并提供了 Chat 版本的 4bits
量化,所有版本不仅对学术研究完全开放,开发者也仅需[邮件申请]并获得官方商用许可后,即可以免费商用。具体发布版本和下载见下表:
Baichuan 2 is the new generation of large-scale open-source language models launched by [Baichuan Intelligence inc.](https://www.baichuan-ai.com/).
It is trained on a high-quality corpus with 2.6 trillion tokens and has achieved the best performance in authoritative Chinese and English benchmarks of the same size.
This release includes 7B and 13B versions for both Base and Chat models, along with a 4bits quantized version for the Chat model.
All versions are fully open to academic research, and developers can also use them for free in commercial applications after obtaining an official commercial license through [email request](mailto:opensource@baichuan-inc.com).
The specific release versions and download links are listed in the table below:
| | Base Model | Chat Model | 4bits Quantized Chat Model |
|:---:|:--------------------:|:--------------------:|:--------------------------:|
| 7B | [Baichuan2-7B-Base](https://huggingface.co/baichuan-inc/Baichuan2-7B-Base) | [Baichuan2-7B-Chat](https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat) | [Baichuan2-7B-Chat-4bits](https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat-4bits) |
| 13B | [Baichuan2-13B-Base](https://huggingface.co/baichuan-inc/Baichuan2-13B-Base) | [Baichuan2-13B-Chat](https://huggingface.co/baichuan-inc/Baichuan2-13B-Chat) | [Baichuan2-13B-Chat-4bits](https://huggingface.co/baichuan-inc/Baichuan2-13B-Chat-4bits) |
# <span id="Start">快速开始/Quick Start</span>
在Baichuan2系列模型中,我们为了加快推理速度使用了Pytorch2.0加入的新功能F.scaled_dot_product_attention,因此模型需要在Pytorch2.0环境下运行。
In the Baichuan 2 series models, we have utilized the new feature `F.scaled_dot_product_attention` introduced in PyTorch 2.0 to accelerate inference speed. Therefore, the model needs to be run in a PyTorch 2.0 environment.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan2-7B-Base", use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan2-7B-Base", device_map="auto", trust_remote_code=True)
inputs = tokenizer('登鹳雀楼->王之涣\n夜雨寄北->', return_tensors='pt')
inputs = inputs.to('cuda:0')
pred = model.generate(**inputs, max_new_tokens=64, repetition_penalty=1.1)
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
```
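Because the custom modeling code calls `F.scaled_dot_product_attention` directly, a quick check before loading can catch an outdated environment early. This is a minimal sketch, not part of the official quickstart:
```python
import torch
import torch.nn.functional as F

# Baichuan 2 relies on the fused attention kernel introduced in PyTorch 2.0.
if not hasattr(F, "scaled_dot_product_attention"):
    raise RuntimeError(f"PyTorch >= 2.0 is required; found {torch.__version__}")
print(f"PyTorch {torch.__version__}: scaled_dot_product_attention is available")
```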
# <span id="Benchmark">Benchmark 结果/Benchmark Evaluation</span>
我们在[通用]、[法律]、[医疗]、[数学]、[代码]和[多语言翻译]六个领域的中英文权威数据集上对模型进行了广泛测试,更多详细测评结果可查看[GitHub]。
We have extensively tested the model on authoritative Chinese-English datasets across six domains: [General](https://github.com/baichuan-inc/Baichuan2/blob/main/README_EN.md#general-domain), [Legal](https://github.com/baichuan-inc/Baichuan2/blob/main/README_EN.md#law-and-medicine), [Medical](https://github.com/baichuan-inc/Baichuan2/blob/main/README_EN.md#law-and-medicine), [Mathematics](https://github.com/baichuan-inc/Baichuan2/blob/main/README_EN.md#mathematics-and-code), [Code](https://github.com/baichuan-inc/Baichuan2/blob/main/README_EN.md#mathematics-and-code), and [Multilingual Translation](https://github.com/baichuan-inc/Baichuan2/blob/main/README_EN.md#multilingual-translation). For more detailed evaluation results, please refer to [GitHub](https://github.com/baichuan-inc/Baichuan2/blob/main/README_EN.md).
### 7B Model Results
| | **C-Eval** | **MMLU** | **CMMLU** | **Gaokao** | **AGIEval** | **BBH** |
|:-----------------------:|:----------:|:--------:|:---------:|:----------:|:-----------:|:-------:|
| | 5-shot | 5-shot | 5-shot | 5-shot | 5-shot | 3-shot |
| **GPT-4** | 68.40 | 83.93 | 70.33 | 66.15 | 63.27 | 75.12 |
| **GPT-3.5 Turbo** | 51.10 | 68.54 | 54.06 | 47.07 | 46.13 | 61.59 |
| **LLaMA-7B** | 27.10 | 35.10 | 26.75 | 27.81 | 28.17 | 32.38 |
| **LLaMA2-7B** | 28.90 | 45.73 | 31.38 | 25.97 | 26.53 | 39.16 |
| **MPT-7B** | 27.15 | 27.93 | 26.00 | 26.54 | 24.83 | 35.20 |
| **Falcon-7B** | 24.23 | 26.03 | 25.66 | 24.24 | 24.10 | 28.77 |
| **ChatGLM2-6B** | 50.20 | 45.90 | 49.00 | 49.44 | 45.28 | 31.65 |
| **[Baichuan-7B]** | 42.80 | 42.30 | 44.02 | 36.34 | 34.44 | 32.48 |
| **[Baichuan2-7B-Base]** | 54.00 | 54.16 | 57.07 | 47.47 | 42.73 | 41.56 |
### 13B Model Results
| | **C-Eval** | **MMLU** | **CMMLU** | **Gaokao** | **AGIEval** | **BBH** |
|:---------------------------:|:----------:|:--------:|:---------:|:----------:|:-----------:|:-------:|
| | 5-shot | 5-shot | 5-shot | 5-shot | 5-shot | 3-shot |
| **GPT-4** | 68.40 | 83.93 | 70.33 | 66.15 | 63.27 | 75.12 |
| **GPT-3.5 Turbo** | 51.10 | 68.54 | 54.06 | 47.07 | 46.13 | 61.59 |
| **LLaMA-13B** | 28.50 | 46.30 | 31.15 | 28.23 | 28.22 | 37.89 |
| **LLaMA2-13B** | 35.80 | 55.09 | 37.99 | 30.83 | 32.29 | 46.98 |
| **Vicuna-13B** | 32.80 | 52.00 | 36.28 | 30.11 | 31.55 | 43.04 |
| **Chinese-Alpaca-Plus-13B** | 38.80 | 43.90 | 33.43 | 34.78 | 35.46 | 28.94 |
| **XVERSE-13B** | 53.70 | 55.21 | 58.44 | 44.69 | 42.54 | 38.06 |
| **[Baichuan-13B-Base]** | 52.40 | 51.60 | 55.30 | 49.69 | 43.20 | 43.01 |
| **[Baichuan2-13B-Base]** | 58.10 | 59.17 | 61.97 | 54.33 | 48.17 | 48.78 |
## 训练过程模型/Training Dynamics
除了训练了 2.6 万亿 Tokens 的 [Baichuan2-7B-Base](https://huggingface.co/baichuan-inc/Baichuan2-7B-Base) 模型,我们还提供了在此之前的另外 11 个中间过程的模型(分别对应训练了约 0.2 ~ 2.4 万亿 Tokens)供社区研究使用
([训练过程checkpoint下载](https://huggingface.co/baichuan-inc/Baichuan2-7B-Intermediate-Checkpoints))。下图给出了这些 checkpoints 在 C-Eval、MMLU、CMMLU 三个 benchmark 上的效果变化:
In addition to the [Baichuan2-7B-Base](https://huggingface.co/baichuan-inc/Baichuan2-7B-Base) model trained on 2.6 trillion tokens, we also offer 11 additional intermediate-stage models for community research, corresponding to training on approximately 0.2 to 2.4 trillion tokens each ([Intermediate Checkpoints Download](https://huggingface.co/baichuan-inc/Baichuan2-7B-Intermediate-Checkpoints)). The graph below shows the performance changes of these checkpoints on three benchmarks: C-Eval, MMLU, and CMMLU.

# <span id="Terms">声明与协议/Terms and Conditions</span>
## 声明/Statement
我们在此声明,我们的开发团队并未基于 Baichuan 2 模型开发任何应用,无论是在 iOS、Android、网页或任何其他平台。我们强烈呼吁所有使用者,不要利用
Baichuan 2 模型进行任何危害国家社会安全或违法的活动。另外,我们也要求使用者不要将 Baichuan 2
模型用于未经适当安全审查和备案的互联网服务。我们希望所有的使用者都能遵守这个原则,确保科技的发展能在规范和合法的环境下进行。
我们已经尽我们所能,来确保模型训练过程中使用的数据的合规性。然而,尽管我们已经做出了巨大的努力,但由于模型和数据的复杂性,仍有可能存在一些无法预见的问题。因此,如果由于使用
Baichuan 2 开源模型而导致的任何问题,包括但不限于数据安全问题、公共舆论风险,或模型被误导、滥用、传播或不当利用所带来的任何风险和问题,我们将不承担任何责任。
We hereby declare that our team has not developed any applications based on Baichuan 2 models, whether on iOS, Android, the web, or any other platform. We strongly call on all users not to use Baichuan 2 models for any activities that harm national or social security or violate the law. Also, we ask users not to use Baichuan 2 models for Internet services that have not undergone appropriate security reviews and filings. We hope that all users can abide by this principle and ensure that the development of technology proceeds in a regulated and legal environment.
We have done our best to ensure the compliance of the data used in the model training process. However, despite our considerable efforts, there may still be some unforeseeable issues due to the complexity of the model and data. Therefore, if any problems arise due to the use of Baichuan 2 open-source models, including but not limited to data security issues, public opinion risks, or any risks and problems brought about by the model being misled, abused, spread or improperly exploited, we will not assume any responsibility.
## 协议/Agreement
社区使用 Baichuan 2 模型需要遵循 [Apache 2.0](https://github.com/baichuan-inc/Baichuan2/blob/main/LICENSE) 和[《Baichuan 2 模型社区许可协议》](https://huggingface.co/baichuan-inc/Baichuan2-7B-Base/resolve/main/Baichuan%202%E6%A8%A1%E5%9E%8B%E7%A4%BE%E5%8C%BA%E8%AE%B8%E5%8F%AF%E5%8D%8F%E8%AE%AE.pdf)。Baichuan 2 模型支持商业用途,如果您计划将 Baichuan 2 模型或其衍生品用于商业目的,请您确认您的主体符合以下情况:
1. 您或您的关联方的服务或产品的日均用户活跃量(DAU)低于100万。
2. 您或您的关联方不是软件服务提供商、云服务提供商。
3. 您或您的关联方不存在将授予您的商用许可,未经百川许可二次授权给其他第三方的可能。
在符合以上条件的前提下,您需要通过以下联系邮箱 opensource@baichuan-inc.com ,提交《Baichuan 2 模型社区许可协议》要求的申请材料。审核通过后,百川将特此授予您一个非排他性、全球性、不可转让、不可再许可、可撤销的商用版权许可。
The community usage of Baichuan 2 model requires adherence to [Apache 2.0](https://github.com/baichuan-inc/Baichuan2/blob/main/LICENSE) and [Community License for Baichuan2 Model](https://huggingface.co/baichuan-inc/Baichuan2-7B-Base/resolve/main/Baichuan%202%E6%A8%A1%E5%9E%8B%E7%A4%BE%E5%8C%BA%E8%AE%B8%E5%8F%AF%E5%8D%8F%E8%AE%AE.pdf). The Baichuan 2 model supports commercial use. If you plan to use the Baichuan 2 model or its derivatives for commercial purposes, please ensure that your entity meets the following conditions:
1. The Daily Active Users (DAU) of your or your affiliate's service or product is less than 1 million.
2. Neither you nor your affiliates are software service providers or cloud service providers.
3. There is no possibility for you or your affiliates to grant the commercial license given to you, to reauthorize it to other third parties without Baichuan's permission.
Upon meeting the above conditions, you need to submit the application materials required by the Baichuan 2 Model Community License Agreement via the following contact email: opensource@baichuan-inc.com. Once approved, Baichuan will hereby grant you a non-exclusive, global, non-transferable, non-sublicensable, revocable commercial copyright license.
[GitHub]:https://github.com/baichuan-inc/Baichuan2
[Baichuan2]:https://github.com/baichuan-inc/Baichuan2
[Baichuan-7B]:https://huggingface.co/baichuan-inc/Baichuan-7B
[Baichuan2-7B-Base]:https://huggingface.co/baichuan-inc/Baichuan2-7B-Base
[Baichuan2-7B-Chat]:https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat
[Baichuan2-7B-Chat-4bits]:https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat-4bits
[Baichuan-13B-Base]:https://huggingface.co/baichuan-inc/Baichuan-13B-Base
[Baichuan2-13B-Base]:https://huggingface.co/baichuan-inc/Baichuan2-13B-Base
[Baichuan2-13B-Chat]:https://huggingface.co/baichuan-inc/Baichuan2-13B-Chat
[Baichuan2-13B-Chat-4bits]:https://huggingface.co/baichuan-inc/Baichuan2-13B-Chat-4bits
[通用]:https://github.com/baichuan-inc/Baichuan2#%E9%80%9A%E7%94%A8%E9%A2%86%E5%9F%9F
[法律]:https://github.com/baichuan-inc/Baichuan2#%E6%B3%95%E5%BE%8B%E5%8C%BB%E7%96%97
[医疗]:https://github.com/baichuan-inc/Baichuan2#%E6%B3%95%E5%BE%8B%E5%8C%BB%E7%96%97
[数学]:https://github.com/baichuan-inc/Baichuan2#%E6%95%B0%E5%AD%A6%E4%BB%A3%E7%A0%81
[代码]:https://github.com/baichuan-inc/Baichuan2#%E6%95%B0%E5%AD%A6%E4%BB%A3%E7%A0%81
[多语言翻译]:https://github.com/baichuan-inc/Baichuan2#%E5%A4%9A%E8%AF%AD%E8%A8%80%E7%BF%BB%E8%AF%91
[《Baichuan 2 模型社区许可协议》]:https://huggingface.co/baichuan-inc/Baichuan2-7B-Base/blob/main/Baichuan%202%E6%A8%A1%E5%9E%8B%E7%A4%BE%E5%8C%BA%E8%AE%B8%E5%8F%AF%E5%8D%8F%E8%AE%AE.pdf
[邮件申请]: mailto:opensource@baichuan-inc.com
[Email]: mailto:opensource@baichuan-inc.com
[opensource@baichuan-inc.com]: mailto:opensource@baichuan-inc.com
[训练过程checkpoint下载]: https://huggingface.co/baichuan-inc/Baichuan2-7B-Intermediate-Checkpoints
[百川智能]: https://www.baichuan-ai.com
| 12,939 | [
[
-0.0251617431640625,
-0.050628662109375,
0.00234222412109375,
0.029266357421875,
-0.0212554931640625,
-0.003986358642578125,
-0.0203094482421875,
-0.032379150390625,
0.01837158203125,
0.005828857421875,
-0.033111572265625,
-0.035125732421875,
-0.049560546875,
-0.0020465850830078125,
0.006931304931640625,
0.06634521484375,
-0.005588531494140625,
0.00531005859375,
0.0199432373046875,
-0.01206207275390625,
-0.0443115234375,
-0.01959228515625,
-0.061370849609375,
-0.014923095703125,
0.0191650390625,
0.018798828125,
0.0518798828125,
0.048309326171875,
0.055511474609375,
0.0176849365234375,
-0.018035888671875,
0.01727294921875,
-0.028961181640625,
-0.01445770263671875,
0.025726318359375,
-0.03619384765625,
-0.056427001953125,
0.0003685951232910156,
0.026336669921875,
0.026458740234375,
-0.0019483566284179688,
0.0201416015625,
0.0206146240234375,
0.03875732421875,
-0.0254669189453125,
0.0243072509765625,
-0.0185394287109375,
-0.0033054351806640625,
-0.015716552734375,
0.0035686492919921875,
-0.0161285400390625,
-0.0266265869140625,
0.00521087646484375,
-0.045379638671875,
0.01084136962890625,
0.0067138671875,
0.111083984375,
-0.0016107559204101562,
-0.0283050537109375,
-0.00957489013671875,
-0.0191192626953125,
0.06549072265625,
-0.0799560546875,
0.013916015625,
0.026336669921875,
0.01522064208984375,
-0.011199951171875,
-0.0643310546875,
-0.037506103515625,
-0.002765655517578125,
-0.03570556640625,
0.0289306640625,
-0.01849365234375,
-0.01523590087890625,
0.0079345703125,
0.034149169921875,
-0.052490234375,
0.0027027130126953125,
-0.043121337890625,
-0.0132904052734375,
0.0594482421875,
0.018463134765625,
0.02301025390625,
-0.0330810546875,
-0.038604736328125,
-0.0026798248291015625,
-0.038818359375,
0.0295257568359375,
-0.0017728805541992188,
0.0078582763671875,
-0.04351806640625,
0.0252532958984375,
-0.00385284423828125,
0.033966064453125,
0.0166778564453125,
-0.0144805908203125,
0.043365478515625,
-0.047332763671875,
-0.025665283203125,
-0.0220794677734375,
0.0927734375,
0.04193115234375,
-0.0183563232421875,
0.0111236572265625,
-0.0174407958984375,
-0.019683837890625,
-0.0268402099609375,
-0.06982421875,
-0.029022216796875,
0.0440673828125,
-0.05963134765625,
-0.0267791748046875,
0.0140380859375,
-0.056915283203125,
-0.0028362274169921875,
-0.0012598037719726562,
0.0347900390625,
-0.04681396484375,
-0.043426513671875,
-0.004566192626953125,
-0.0071563720703125,
0.0238037109375,
0.0203399658203125,
-0.065673828125,
0.0179595947265625,
0.041259765625,
0.0849609375,
-0.0102691650390625,
-0.03076171875,
-0.0145416259765625,
-0.0021762847900390625,
-0.032440185546875,
0.0428466796875,
0.0026149749755859375,
-0.027923583984375,
-0.014892578125,
0.02386474609375,
-0.01471710205078125,
-0.033477783203125,
0.0270843505859375,
-0.01528167724609375,
0.00850677490234375,
-0.038421630859375,
-0.029266357421875,
-0.01410675048828125,
0.029052734375,
-0.048309326171875,
0.08575439453125,
0.005184173583984375,
-0.06549072265625,
0.02001953125,
-0.039337158203125,
-0.0233154296875,
-0.0154266357421875,
0.004047393798828125,
-0.041717529296875,
-0.0301055908203125,
0.0244140625,
0.034942626953125,
-0.037200927734375,
0.00934600830078125,
-0.0089569091796875,
-0.0269012451171875,
0.01080322265625,
-0.019439697265625,
0.09063720703125,
0.03424072265625,
-0.0511474609375,
0.01477813720703125,
-0.04803466796875,
-0.0018672943115234375,
0.034271240234375,
-0.02130126953125,
0.008209228515625,
-0.0112762451171875,
0.00571441650390625,
0.027923583984375,
0.02825927734375,
-0.0145111083984375,
0.0056610107421875,
-0.0328369140625,
0.05462646484375,
0.063232421875,
0.005237579345703125,
0.02001953125,
-0.050994873046875,
0.02581787109375,
0.0265350341796875,
0.03314208984375,
-0.022796630859375,
-0.0540771484375,
-0.0758056640625,
-0.0236968994140625,
0.0234832763671875,
0.04510498046875,
-0.037078857421875,
0.050689697265625,
-0.01052093505859375,
-0.049713134765625,
-0.038116455078125,
-0.003704071044921875,
0.03289794921875,
0.0307464599609375,
0.0271148681640625,
-0.008331298828125,
-0.042205810546875,
-0.054840087890625,
0.00933074951171875,
-0.0211334228515625,
0.0052337646484375,
0.02923583984375,
0.05615234375,
-0.008758544921875,
0.05145263671875,
-0.03961181640625,
-0.0199127197265625,
-0.0261077880859375,
-0.00390625,
0.03692626953125,
0.042877197265625,
0.053924560546875,
-0.049072265625,
-0.061431884765625,
0.01593017578125,
-0.060546875,
0.0113067626953125,
-0.00762939453125,
-0.02862548828125,
0.030609130859375,
0.01207733154296875,
-0.047607421875,
0.036376953125,
0.044219970703125,
-0.024993896484375,
0.05963134765625,
-0.0191192626953125,
0.021728515625,
-0.0904541015625,
0.0223541259765625,
-0.004749298095703125,
0.00365447998046875,
-0.041961669921875,
0.007259368896484375,
0.017822265625,
0.01120758056640625,
-0.034027099609375,
0.056427001953125,
-0.050140380859375,
0.0225830078125,
0.003993988037109375,
0.024200439453125,
0.0089874267578125,
0.050506591796875,
-0.00080108642578125,
0.05975341796875,
0.047637939453125,
-0.045623779296875,
0.03704833984375,
0.0287322998046875,
-0.02606201171875,
0.0028839111328125,
-0.055328369140625,
-0.0025272369384765625,
0.013885498046875,
0.020355224609375,
-0.08428955078125,
-0.01377105712890625,
0.039306640625,
-0.05816650390625,
0.01898193359375,
-0.01296234130859375,
-0.0255584716796875,
-0.050994873046875,
-0.047821044921875,
0.01016998291015625,
0.041839599609375,
-0.037261962890625,
0.0200347900390625,
0.01537322998046875,
-0.0013523101806640625,
-0.0428466796875,
-0.05963134765625,
-0.01457977294921875,
-0.0157623291015625,
-0.068115234375,
0.0240631103515625,
-0.0022830963134765625,
-0.005138397216796875,
-0.00506591796875,
0.00010538101196289062,
-0.002742767333984375,
0.004119873046875,
0.00968170166015625,
0.042083740234375,
-0.02117919921875,
-0.01531982421875,
-0.007511138916015625,
-0.0008330345153808594,
-0.0003380775451660156,
-0.01207733154296875,
0.052154541015625,
-0.0076904296875,
0.003139495849609375,
-0.04522705078125,
0.0027923583984375,
0.03363037109375,
-0.038299560546875,
0.07012939453125,
0.049224853515625,
-0.0283203125,
0.01331329345703125,
-0.034027099609375,
-0.01055145263671875,
-0.03399658203125,
0.0253753662109375,
-0.0284271240234375,
-0.042633056640625,
0.06329345703125,
0.023406982421875,
0.0232391357421875,
0.05401611328125,
0.0506591796875,
-0.002716064453125,
0.0655517578125,
0.00986480712890625,
-0.01166534423828125,
0.028076171875,
-0.05828857421875,
0.006519317626953125,
-0.0655517578125,
-0.039703369140625,
-0.0289154052734375,
-0.0218963623046875,
-0.042022705078125,
-0.033477783203125,
0.0262298583984375,
0.00693511962890625,
-0.03375244140625,
0.04351806640625,
-0.037841796875,
-0.002288818359375,
0.05206298828125,
0.02264404296875,
0.0020599365234375,
-0.013916015625,
-0.0091552734375,
-0.0017452239990234375,
-0.042266845703125,
-0.0194091796875,
0.08697509765625,
0.0309600830078125,
0.043701171875,
0.019561767578125,
0.0328369140625,
0.00795745849609375,
0.0069427490234375,
-0.043701171875,
0.023834228515625,
-0.004268646240234375,
-0.06256103515625,
-0.012725830078125,
-0.033721923828125,
-0.07073974609375,
0.0257568359375,
-0.010223388671875,
-0.061248779296875,
0.01105499267578125,
0.0008831024169921875,
-0.0419921875,
0.028961181640625,
-0.057037353515625,
0.068603515625,
-0.0304718017578125,
-0.04156494140625,
-0.00048613548278808594,
-0.060638427734375,
0.040069580078125,
0.0064697265625,
0.017913818359375,
-0.006626129150390625,
0.0108642578125,
0.068115234375,
-0.053955078125,
0.039886474609375,
-0.0155792236328125,
-0.0024242401123046875,
0.04034423828125,
0.0032596588134765625,
0.055267333984375,
0.01331329345703125,
-0.01074981689453125,
0.02838134765625,
0.007419586181640625,
-0.0379638671875,
-0.0304412841796875,
0.046875,
-0.06500244140625,
-0.044464111328125,
-0.040985107421875,
-0.02813720703125,
0.01245880126953125,
0.0277557373046875,
0.04705810546875,
0.0232391357421875,
0.00861358642578125,
0.01325225830078125,
0.03558349609375,
-0.0263214111328125,
0.0447998046875,
0.0257568359375,
-0.0157623291015625,
-0.044708251953125,
0.057281494140625,
0.01332855224609375,
0.029876708984375,
0.0232391357421875,
0.0157318115234375,
-0.02001953125,
-0.0286712646484375,
-0.036834716796875,
0.0267181396484375,
-0.029937744140625,
-0.020172119140625,
-0.044036865234375,
-0.030364990234375,
-0.06768798828125,
-0.0036716461181640625,
-0.0238037109375,
-0.0231170654296875,
-0.018646240234375,
-0.01111602783203125,
0.028961181640625,
0.02801513671875,
-0.0162811279296875,
0.0197906494140625,
-0.055816650390625,
0.0169525146484375,
0.00447845458984375,
0.00954437255859375,
0.01155853271484375,
-0.053436279296875,
-0.038665771484375,
0.0251617431640625,
-0.039154052734375,
-0.054168701171875,
0.045928955078125,
-0.0037822723388671875,
0.038604736328125,
0.04754638671875,
-0.0001766681671142578,
0.058074951171875,
-0.0198516845703125,
0.08172607421875,
0.02862548828125,
-0.058868408203125,
0.0474853515625,
-0.031158447265625,
-0.00019860267639160156,
0.0198211669921875,
0.023040771484375,
-0.04583740234375,
-0.01476287841796875,
-0.04083251953125,
-0.06011962890625,
0.07965087890625,
0.03851318359375,
-0.00772857666015625,
0.00911712646484375,
0.015655517578125,
-0.01126861572265625,
0.002841949462890625,
-0.0565185546875,
-0.057281494140625,
-0.0272674560546875,
-0.001956939697265625,
0.007808685302734375,
-0.01557159423828125,
-0.002300262451171875,
-0.02734375,
0.061676025390625,
0.016998291015625,
0.036834716796875,
0.0204925537109375,
-0.00250244140625,
0.003215789794921875,
-0.01251983642578125,
0.035247802734375,
0.0411376953125,
-0.032135009765625,
-0.0197296142578125,
0.01371002197265625,
-0.047393798828125,
-0.003856658935546875,
0.012725830078125,
-0.0276641845703125,
0.00354766845703125,
0.029052734375,
0.063720703125,
0.005565643310546875,
-0.02813720703125,
0.043365478515625,
-0.002857208251953125,
-0.0182647705078125,
-0.0186614990234375,
0.0043487548828125,
0.00310516357421875,
0.01561737060546875,
0.0191192626953125,
0.0007033348083496094,
0.001190185546875,
-0.0390625,
0.0101776123046875,
0.016082763671875,
-0.019012451171875,
-0.0198974609375,
0.073486328125,
0.01513671875,
0.001094818115234375,
0.041717529296875,
-0.01381683349609375,
-0.044219970703125,
0.0684814453125,
0.03173828125,
0.0489501953125,
-0.0218963623046875,
0.0077362060546875,
0.0758056640625,
0.0262451171875,
-0.01352691650390625,
0.00624847412109375,
0.013916015625,
-0.037384033203125,
0.0060882568359375,
-0.0276641845703125,
0.0028324127197265625,
0.0176849365234375,
-0.044464111328125,
0.04119873046875,
-0.036468505859375,
-0.0308837890625,
-0.006427764892578125,
0.035919189453125,
-0.034942626953125,
0.0269622802734375,
0.00469207763671875,
0.0728759765625,
-0.04132080078125,
0.06256103515625,
0.03265380859375,
-0.056854248046875,
-0.0823974609375,
-0.00830841064453125,
0.0062103271484375,
-0.05975341796875,
0.032379150390625,
0.01296234130859375,
0.0207061767578125,
-0.0103759765625,
-0.035125732421875,
-0.0704345703125,
0.1168212890625,
0.0008616447448730469,
-0.034637451171875,
-0.0082855224609375,
-0.0005497932434082031,
0.029632568359375,
0.0025424957275390625,
0.04559326171875,
0.04730224609375,
0.0369873046875,
0.00743865966796875,
-0.08062744140625,
0.0213623046875,
-0.04486083984375,
-0.00539398193359375,
-0.004573822021484375,
-0.10699462890625,
0.09844970703125,
-0.0202178955078125,
-0.005199432373046875,
0.0169525146484375,
0.053924560546875,
0.035980224609375,
0.0156707763671875,
0.01763916015625,
0.032562255859375,
0.05316162109375,
-0.027679443359375,
0.0643310546875,
-0.032623291015625,
0.0546875,
0.0611572265625,
0.0024127960205078125,
0.045806884765625,
0.0081939697265625,
-0.040557861328125,
0.032928466796875,
0.07183837890625,
-0.022613525390625,
0.0360107421875,
-0.010589599609375,
-0.0145263671875,
-0.00322723388671875,
0.0171356201171875,
-0.0506591796875,
0.01015472412109375,
0.01641845703125,
-0.0201416015625,
0.0077667236328125,
-0.018768310546875,
0.03558349609375,
-0.026458740234375,
-0.01027679443359375,
0.03985595703125,
0.0036830902099609375,
-0.05255126953125,
0.06646728515625,
0.0187530517578125,
0.06854248046875,
-0.0521240234375,
0.00997161865234375,
-0.044036865234375,
0.0121612548828125,
-0.023834228515625,
-0.05401611328125,
-0.0008449554443359375,
0.00042366981506347656,
-0.0029354095458984375,
0.02001953125,
0.037384033203125,
-0.0162811279296875,
-0.033843994140625,
0.03216552734375,
0.0100860595703125,
0.011627197265625,
0.0121002197265625,
-0.0673828125,
0.0014362335205078125,
0.012176513671875,
-0.039825439453125,
0.01617431640625,
0.034698486328125,
0.0028820037841796875,
0.05108642578125,
0.0484619140625,
0.0023288726806640625,
0.0214996337890625,
-0.01216888427734375,
0.0662841796875,
-0.048187255859375,
-0.03387451171875,
-0.0625,
0.048828125,
-0.0111083984375,
-0.038238525390625,
0.0711669921875,
0.0672607421875,
0.05255126953125,
0.002719879150390625,
0.0662841796875,
-0.040130615234375,
0.0267486572265625,
-0.035858154296875,
0.07269287109375,
-0.05426025390625,
0.0077362060546875,
-0.0235137939453125,
-0.0347900390625,
-0.0206756591796875,
0.050933837890625,
-0.015380859375,
0.00994873046875,
0.049530029296875,
0.06854248046875,
0.00925445556640625,
-0.011260986328125,
0.00675201416015625,
0.03350830078125,
0.03778076171875,
0.072021484375,
0.032196044921875,
-0.07867431640625,
0.05755615234375,
-0.0487060546875,
-0.0179595947265625,
-0.0306396484375,
-0.026611328125,
-0.06976318359375,
-0.041717529296875,
-0.0200653076171875,
-0.052947998046875,
-0.0118865966796875,
0.0679931640625,
0.05975341796875,
-0.06964111328125,
-0.0279541015625,
0.0035915374755859375,
0.003475189208984375,
-0.0300140380859375,
-0.017669677734375,
0.060516357421875,
-0.021759033203125,
-0.06756591796875,
-0.0007619857788085938,
0.00408935546875,
0.00743865966796875,
0.007659912109375,
-0.033355712890625,
-0.0274658203125,
-0.0007104873657226562,
0.031768798828125,
0.0164337158203125,
-0.0517578125,
0.0014410018920898438,
0.0229034423828125,
-0.0238189697265625,
0.01727294921875,
0.02142333984375,
-0.0234375,
0.0194244384765625,
0.040191650390625,
0.0106201171875,
0.041259765625,
-0.0011386871337890625,
0.0167694091796875,
-0.01983642578125,
0.024993896484375,
-0.0109710693359375,
0.037506103515625,
0.002971649169921875,
-0.0236968994140625,
0.046875,
0.030517578125,
-0.0345458984375,
-0.059112548828125,
-0.022857666015625,
-0.079345703125,
-0.023040771484375,
0.1014404296875,
-0.035858154296875,
-0.02752685546875,
0.012969970703125,
-0.031341552734375,
0.039642333984375,
-0.032623291015625,
0.050262451171875,
0.055511474609375,
0.005779266357421875,
-0.00986480712890625,
-0.041351318359375,
0.0271148681640625,
0.0201568603515625,
-0.057861328125,
0.0019273757934570312,
0.015655517578125,
0.0267333984375,
0.00623321533203125,
0.05267333984375,
-0.0161590576171875,
0.032623291015625,
0.0090179443359375,
0.0009446144104003906,
-0.0093994140625,
-0.00006818771362304688,
-0.0030670166015625,
-0.021636962890625,
-0.01190185546875,
-0.03619384765625
]
] |
Undi95/MLewd-ReMM-L2-Chat-20B | 2023-09-26T00:37:27.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"not-for-all-audiences",
"nsfw",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | Undi95 | null | null | Undi95/MLewd-ReMM-L2-Chat-20B | 12 | 11,565 | transformers | 2023-09-17T22:40:27 | ---
license: cc-by-nc-4.0
tags:
- not-for-all-audiences
- nsfw
---
First :
```shell
layer_slices:
- model: Undi95/MLewd-L2-Chat-13B
start: 0
end: 16
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 8
end: 20
- model: Undi95/MLewd-L2-Chat-13B
start: 17
end: 32
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 21
end: 40
```
Inverted:
```shell
layer_slices:
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 0
end: 16
- model: Undi95/MLewd-L2-Chat-13B
start: 8
end: 20
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 17
end: 32
- model: Undi95/MLewd-L2-Chat-13B
start: 21
end: 40
```
Precise:
```shell
layer_slices:
- model: Undi95/MLewd-L2-Chat-13B
start: 0
end: 8
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 4
end: 12
- model: Undi95/MLewd-L2-Chat-13B
start: 9
end: 16
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 13
end: 22
- model: Undi95/MLewd-L2-Chat-13B
start: 17
end: 24
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 23
end: 32
- model: Undi95/MLewd-L2-Chat-13B
start: 25
end: 32
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 33
end: 40
```
PreciseInverted:
```shell
layer_slices:
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 0
end: 8
- model: Undi95/MLewd-L2-Chat-13B
start: 4
end: 12
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 9
end: 16
- model: Undi95/MLewd-L2-Chat-13B
start: 13
end: 22
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 17
end: 24
- model: Undi95/MLewd-L2-Chat-13B
start: 23
end: 32
- model: Undi95/MLewd-ReMM-L2-Chat-20B-Part1
start: 25
end: 32
- model: Undi95/MLewd-L2-Chat-13B
start: 33
end: 40
```
Part1 = ReMM v2.1 merged with MLewd at low weight to keep consistency. I call this "dilution", and the result shows consistency and coherency without repeats/loops, aside from the small amount of duplicated data.
The goal is to find the best way to interlace layers so as to hit a sweet spot between 13B and +30B.
Normal/Inverted interleave in chunks of 16 layers; Precise/PreciseInverted interleave in chunks of 8 layers (see the tally sketch below).
All the models are made of 64(+1) layers. Needs testing.
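As a sanity check on the slice arithmetic, the sketch below tallies the layers each config yields, assuming end-exclusive `[start, end)` bounds (an assumption; the merge tool's actual convention may be inclusive, which would shift the totals):
```python
# Tally how many layers each interleaving config stacks, assuming
# end-exclusive [start, end) slice bounds (the merge tool may instead
# treat 'end' as inclusive).
normal = [(0, 16), (8, 20), (17, 32), (21, 40)]
precise = [(0, 8), (4, 12), (9, 16), (13, 22),
           (17, 24), (23, 32), (25, 32), (33, 40)]

for name, slices in (("Normal/Inverted", normal), ("Precise/PreciseInverted", precise)):
    total = sum(end - start for start, end in slices)
    print(f"{name}: {total} layers from {len(slices)} slices")
```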
## Prompt template: Alpaca
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
{prompt}
### Response:
``` | 2,468 | [
[
-0.037078857421875,
-0.06353759765625,
0.036865234375,
0.024627685546875,
-0.021697998046875,
-0.0162353515625,
0.007701873779296875,
-0.005695343017578125,
0.0308074951171875,
0.05609130859375,
-0.06719970703125,
-0.05804443359375,
-0.04736328125,
-0.009765625,
-0.035491943359375,
0.055084228515625,
0.009063720703125,
-0.01318359375,
0.0200042724609375,
-0.01168060302734375,
-0.02001953125,
-0.01018524169921875,
-0.04132080078125,
-0.0125274658203125,
0.0155792236328125,
0.062286376953125,
0.0579833984375,
0.06280517578125,
0.039215087890625,
0.0270233154296875,
-0.0147552490234375,
0.0215911865234375,
-0.0235748291015625,
-0.0081787109375,
0.004520416259765625,
-0.029510498046875,
-0.0513916015625,
-0.004974365234375,
0.0447998046875,
0.0648193359375,
-0.003917694091796875,
0.02508544921875,
0.003490447998046875,
0.0616455078125,
-0.035736083984375,
-0.0099334716796875,
-0.0292205810546875,
0.017608642578125,
-0.0166473388671875,
0.0024662017822265625,
-0.0196685791015625,
-0.0267181396484375,
-0.01541900634765625,
-0.053863525390625,
0.018707275390625,
0.006134033203125,
0.0887451171875,
0.0039825439453125,
-0.02947998046875,
-0.012786865234375,
-0.038818359375,
0.064453125,
-0.085205078125,
0.0185546875,
0.0292816162109375,
0.003170013427734375,
-0.028594970703125,
-0.06640625,
-0.0479736328125,
0.00848388671875,
-0.007755279541015625,
0.00928497314453125,
-0.00678253173828125,
-0.00794219970703125,
0.0160369873046875,
0.030029296875,
-0.037506103515625,
0.0110321044921875,
-0.031829833984375,
-0.018585205078125,
0.057464599609375,
0.0289764404296875,
0.017669677734375,
-0.027191162109375,
-0.035247802734375,
-0.0245819091796875,
-0.01151275634765625,
0.0196990966796875,
0.037261962890625,
0.019866943359375,
-0.033966064453125,
0.06219482421875,
-0.039642333984375,
0.03826904296875,
0.036865234375,
-0.02911376953125,
0.050933837890625,
-0.04351806640625,
-0.03704833984375,
-0.0089874267578125,
0.07476806640625,
0.03509521484375,
-0.01074981689453125,
0.0242462158203125,
-0.008056640625,
-0.00536346435546875,
0.01148223876953125,
-0.064208984375,
-0.035064697265625,
0.037109375,
-0.0303497314453125,
-0.033050537109375,
0.0296783447265625,
-0.05364990234375,
0.014556884765625,
0.0023632049560546875,
0.04034423828125,
-0.040924072265625,
-0.02008056640625,
0.01763916015625,
-0.01213836669921875,
0.0058135986328125,
0.017364501953125,
-0.040924072265625,
0.037872314453125,
0.059539794921875,
0.06097412109375,
0.0005574226379394531,
-0.032012939453125,
-0.0328369140625,
0.0153045654296875,
-0.024871826171875,
0.025421142578125,
-0.01425933837890625,
-0.0215606689453125,
-0.038970947265625,
0.0279998779296875,
-0.0015954971313476562,
-0.037841796875,
0.0584716796875,
-0.02532958984375,
0.0310211181640625,
-0.0246124267578125,
-0.04425048828125,
-0.03887939453125,
0.0212249755859375,
-0.0447998046875,
0.07733154296875,
0.02093505859375,
-0.06903076171875,
0.02191162109375,
-0.04364013671875,
-0.01702880859375,
-0.0130157470703125,
-0.005584716796875,
-0.052459716796875,
-0.0109710693359375,
0.036376953125,
0.04229736328125,
-0.007617950439453125,
-0.0079803466796875,
-0.017425537109375,
-0.02197265625,
-0.007404327392578125,
-0.007518768310546875,
0.08001708984375,
0.0278167724609375,
-0.03839111328125,
0.0193023681640625,
-0.060791015625,
0.0251617431640625,
0.0206146240234375,
-0.0197296142578125,
-0.0134735107421875,
-0.0237579345703125,
-0.00891876220703125,
0.007904052734375,
0.007152557373046875,
-0.0294647216796875,
0.0283355712890625,
-0.037384033203125,
0.039337158203125,
0.05426025390625,
0.030242919921875,
0.04229736328125,
-0.04803466796875,
0.0224761962890625,
0.047271728515625,
0.025970458984375,
-0.0018129348754882812,
-0.04302978515625,
-0.0643310546875,
-0.0318603515625,
0.00926971435546875,
0.03826904296875,
-0.02679443359375,
0.059844970703125,
-0.01026153564453125,
-0.051605224609375,
-0.03369140625,
0.0019350051879882812,
0.0252838134765625,
0.040252685546875,
0.024383544921875,
-0.044403076171875,
-0.03680419921875,
-0.09552001953125,
0.0189666748046875,
-0.0013790130615234375,
-0.00041866302490234375,
0.00571441650390625,
0.040252685546875,
-0.029449462890625,
0.0596923828125,
-0.05194091796875,
-0.0163726806640625,
-0.01165771484375,
0.0005483627319335938,
0.053131103515625,
0.050018310546875,
0.040496826171875,
-0.0501708984375,
-0.0401611328125,
-0.0011835098266601562,
-0.03369140625,
-0.005157470703125,
-0.0147857666015625,
-0.038177490234375,
-0.01114654541015625,
0.01230621337890625,
-0.0472412109375,
0.02447509765625,
0.0249786376953125,
-0.03179931640625,
0.0565185546875,
-0.005184173583984375,
0.032379150390625,
-0.0782470703125,
0.0005478858947753906,
-0.023284912109375,
-0.0189971923828125,
-0.050384521484375,
0.0177764892578125,
0.01178741455078125,
0.0173797607421875,
-0.034881591796875,
0.02783203125,
-0.0477294921875,
-0.017578125,
-0.0017862319946289062,
-0.005126953125,
0.0150299072265625,
0.04632568359375,
-0.025634765625,
0.041015625,
0.049285888671875,
-0.051513671875,
0.0259246826171875,
0.0156707763671875,
-0.0308837890625,
0.014862060546875,
-0.038116455078125,
0.00909423828125,
0.01042938232421875,
0.0023670196533203125,
-0.086181640625,
-0.01666259765625,
0.035369873046875,
-0.035614013671875,
-0.01000213623046875,
-0.02398681640625,
-0.015533447265625,
-0.0282440185546875,
-0.043701171875,
0.0292205810546875,
0.0311431884765625,
-0.0120086669921875,
0.039642333984375,
0.0228118896484375,
0.00803375244140625,
-0.043853759765625,
-0.058502197265625,
-0.0228424072265625,
-0.0208892822265625,
-0.08184814453125,
0.044647216796875,
-0.0156707763671875,
-0.0017747879028320312,
0.00656890869140625,
-0.0056610107421875,
-0.017974853515625,
-0.0011081695556640625,
0.0328369140625,
0.00293731689453125,
-0.0226593017578125,
-0.049285888671875,
0.0118408203125,
-0.0028095245361328125,
0.00024044513702392578,
-0.0007958412170410156,
0.0692138671875,
-0.004451751708984375,
-0.04510498046875,
-0.037994384765625,
0.02850341796875,
0.055755615234375,
-0.003444671630859375,
0.0474853515625,
0.0643310546875,
-0.0154876708984375,
-0.003795623779296875,
-0.04449462890625,
-0.005950927734375,
-0.03662109375,
0.035430908203125,
-0.045623779296875,
-0.036468505859375,
0.07708740234375,
0.0208587646484375,
0.0210723876953125,
0.049896240234375,
0.031402587890625,
-0.0089263916015625,
0.07171630859375,
0.01763916015625,
-0.01425933837890625,
0.019012451171875,
-0.054901123046875,
-0.0091705322265625,
-0.054901123046875,
-0.023162841796875,
-0.020355224609375,
-0.04168701171875,
-0.035186767578125,
-0.036773681640625,
0.043121337890625,
0.036041259765625,
-0.04052734375,
0.032806396484375,
-0.039642333984375,
0.02294921875,
0.0465087890625,
0.0222015380859375,
0.01273345947265625,
-0.00336456298828125,
-0.03692626953125,
0.006168365478515625,
-0.046112060546875,
-0.01151275634765625,
0.05596923828125,
0.027374267578125,
0.043365478515625,
0.0170745849609375,
0.073974609375,
0.0136566162109375,
-0.0229949951171875,
-0.0418701171875,
0.059906005859375,
-0.0033931732177734375,
-0.03411865234375,
-0.02679443359375,
-0.048553466796875,
-0.059173583984375,
0.0170135498046875,
0.004833221435546875,
-0.046722412109375,
0.0028400421142578125,
-0.01001739501953125,
-0.032073974609375,
0.029815673828125,
-0.04608154296875,
0.05340576171875,
-0.00562286376953125,
-0.0172271728515625,
-0.0055389404296875,
-0.0556640625,
0.051483154296875,
0.01441192626953125,
0.0198974609375,
-0.016326904296875,
-0.00030922889709472656,
0.06964111328125,
-0.059112548828125,
0.03570556640625,
-0.013763427734375,
0.00896453857421875,
0.0285797119140625,
0.0279998779296875,
0.03997802734375,
0.00917816162109375,
0.00017559528350830078,
-0.0020732879638671875,
0.004444122314453125,
-0.030029296875,
-0.0262908935546875,
0.05242919921875,
-0.07025146484375,
-0.03375244140625,
-0.00868988037109375,
-0.0312042236328125,
0.01053619384765625,
0.023651123046875,
0.03643798828125,
0.0247039794921875,
0.0166168212890625,
0.007694244384765625,
0.026397705078125,
-0.00885772705078125,
0.0303192138671875,
0.0220947265625,
-0.035400390625,
-0.043121337890625,
0.034759521484375,
0.00946044921875,
0.0347900390625,
0.018768310546875,
0.01277923583984375,
-0.0080413818359375,
-0.03570556640625,
-0.03125,
0.02105712890625,
-0.03472900390625,
-0.0254974365234375,
-0.059783935546875,
-0.048248291015625,
-0.035247802734375,
-0.014556884765625,
-0.05694580078125,
-0.044647216796875,
-0.0198211669921875,
-0.004817962646484375,
0.027862548828125,
0.030914306640625,
-0.0198211669921875,
0.03497314453125,
-0.0679931640625,
0.031768798828125,
0.017913818359375,
0.006198883056640625,
0.0085906982421875,
-0.06298828125,
-0.01165008544921875,
0.01383209228515625,
-0.034271240234375,
-0.0780029296875,
0.046142578125,
-0.0001876354217529297,
0.041748046875,
0.045928955078125,
0.002506256103515625,
0.08013916015625,
-0.016265869140625,
0.0643310546875,
0.01242828369140625,
-0.06927490234375,
0.046051025390625,
-0.01568603515625,
0.0197906494140625,
0.03399658203125,
0.021728515625,
-0.043243408203125,
-0.0287933349609375,
-0.06549072265625,
-0.0947265625,
0.06500244140625,
0.0338134765625,
-0.01285552978515625,
0.006244659423828125,
0.020751953125,
0.010498046875,
-0.00208282470703125,
-0.05194091796875,
-0.047576904296875,
-0.00616455078125,
-0.003810882568359375,
0.0296173095703125,
-0.024871826171875,
-0.01708984375,
-0.0281219482421875,
0.06317138671875,
0.008514404296875,
0.03668212890625,
0.02130126953125,
0.00489044189453125,
-0.019012451171875,
0.006374359130859375,
0.0419921875,
0.03900146484375,
-0.038665771484375,
0.01486968994140625,
0.0150909423828125,
-0.04571533203125,
0.019134521484375,
-0.019317626953125,
-0.019866943359375,
-0.0071563720703125,
0.048004150390625,
0.046051025390625,
0.005306243896484375,
-0.0168304443359375,
0.0340576171875,
0.00916290283203125,
-0.030975341796875,
-0.026519775390625,
0.02557373046875,
0.0133056640625,
0.029022216796875,
0.0264129638671875,
0.0021839141845703125,
0.005615234375,
-0.03240966796875,
0.033447265625,
0.020721435546875,
0.00724029541015625,
-0.016265869140625,
0.033782958984375,
-0.012420654296875,
-0.01151275634765625,
0.04901123046875,
-0.0245208740234375,
-0.047698974609375,
0.0631103515625,
0.0180511474609375,
0.0582275390625,
-0.0104522705078125,
0.01012420654296875,
0.046173095703125,
0.0206298828125,
-0.009429931640625,
0.033935546875,
0.01432037353515625,
-0.0662841796875,
-0.00722503662109375,
-0.034210205078125,
-0.01029205322265625,
0.02984619140625,
-0.056976318359375,
0.03125,
-0.044921875,
-0.01751708984375,
0.009246826171875,
0.0241241455078125,
-0.04644775390625,
0.01450347900390625,
-0.00891876220703125,
0.063232421875,
-0.07666015625,
0.05181884765625,
0.043121337890625,
-0.027313232421875,
-0.083984375,
-0.0196990966796875,
0.01336669921875,
-0.053070068359375,
0.03619384765625,
0.002063751220703125,
0.01328277587890625,
-0.0112152099609375,
-0.041107177734375,
-0.07373046875,
0.099609375,
0.0083160400390625,
-0.05047607421875,
-0.0009617805480957031,
0.01157379150390625,
0.005268096923828125,
-0.03363037109375,
0.05010986328125,
0.037994384765625,
0.047454833984375,
0.0237884521484375,
-0.07049560546875,
0.0081939697265625,
-0.0235137939453125,
-0.01197052001953125,
0.01505279541015625,
-0.062042236328125,
0.10638427734375,
-0.017242431640625,
-0.0160675048828125,
0.0031585693359375,
0.049346923828125,
0.029327392578125,
0.021026611328125,
0.0557861328125,
0.06829833984375,
0.037628173828125,
0.002574920654296875,
0.045928955078125,
-0.005786895751953125,
0.03948974609375,
0.0810546875,
0.0007662773132324219,
0.044586181640625,
0.0219879150390625,
-0.02801513671875,
0.0267333984375,
0.048553466796875,
0.0008001327514648438,
0.052703857421875,
-0.0101776123046875,
0.008514404296875,
-0.014007568359375,
0.0238800048828125,
-0.0418701171875,
0.0308837890625,
0.0202178955078125,
-0.027313232421875,
-0.006450653076171875,
-0.0212554931640625,
0.014739990234375,
-0.014862060546875,
-0.01169586181640625,
0.04296875,
-0.0174407958984375,
-0.031768798828125,
0.039794921875,
0.03369140625,
0.0555419921875,
-0.048553466796875,
0.0112457275390625,
-0.0280609130859375,
0.01544189453125,
-0.03717041015625,
-0.07476806640625,
0.02215576171875,
-0.0235748291015625,
-0.037200927734375,
0.0186004638671875,
0.0325927734375,
-0.0305023193359375,
-0.05584716796875,
0.03338623046875,
0.03753662109375,
0.01702880859375,
0.0010671615600585938,
-0.0616455078125,
0.0232086181640625,
0.0350341796875,
-0.0452880859375,
0.01123046875,
0.048919677734375,
0.01116180419921875,
0.0479736328125,
0.0458984375,
0.003204345703125,
0.0271759033203125,
-0.004924774169921875,
0.0855712890625,
-0.060638427734375,
-0.04901123046875,
-0.06634521484375,
0.046905517578125,
-0.0075531005859375,
-0.0267333984375,
0.0570068359375,
0.060577392578125,
0.06756591796875,
-0.01085662841796875,
0.0270538330078125,
-0.0263214111328125,
0.04205322265625,
-0.048553466796875,
0.03741455078125,
-0.0443115234375,
0.026611328125,
-0.0269622802734375,
-0.07574462890625,
0.003643035888671875,
0.05364990234375,
-0.01445770263671875,
0.005847930908203125,
0.041534423828125,
0.0750732421875,
-0.03759765625,
0.00058746337890625,
0.01235198974609375,
0.0241851806640625,
0.0255889892578125,
0.053924560546875,
0.03790283203125,
-0.04803466796875,
0.015655517578125,
-0.07427978515625,
-0.004833221435546875,
-0.0182952880859375,
-0.050567626953125,
-0.030792236328125,
-0.046844482421875,
-0.0278778076171875,
-0.01267242431640625,
-0.0177001953125,
0.0762939453125,
0.0770263671875,
-0.0611572265625,
-0.0266571044921875,
0.0023040771484375,
0.0041351318359375,
-0.03179931640625,
-0.028411865234375,
0.037078857421875,
0.004119873046875,
-0.040985107421875,
0.0279998779296875,
0.007175445556640625,
-0.002773284912109375,
-0.0008969306945800781,
-0.016082763671875,
-0.0306549072265625,
0.0020847320556640625,
0.02825927734375,
-0.003765106201171875,
-0.0447998046875,
-0.0213165283203125,
-0.0265350341796875,
-0.00931549072265625,
0.0211944580078125,
0.0218505859375,
-0.03155517578125,
0.0004661083221435547,
0.044342041015625,
0.00800323486328125,
0.060638427734375,
0.018829345703125,
0.00572967529296875,
-0.0670166015625,
0.021392822265625,
0.00759124755859375,
0.0199737548828125,
0.0086822509765625,
-0.037628173828125,
0.0458984375,
0.01444244384765625,
-0.047454833984375,
-0.05767822265625,
0.00860595703125,
-0.10137939453125,
-0.0122528076171875,
0.07208251953125,
-0.01136016845703125,
-0.02630615234375,
0.0168914794921875,
-0.018524169921875,
0.0148468017578125,
-0.04327392578125,
0.03375244140625,
0.035430908203125,
-0.0236663818359375,
-0.0224151611328125,
-0.0296783447265625,
0.0162200927734375,
0.03228759765625,
-0.042938232421875,
-0.01055908203125,
0.0240631103515625,
0.0479736328125,
0.029632568359375,
0.05584716796875,
-0.00943756103515625,
-0.01027679443359375,
0.002445220947265625,
-0.0078582763671875,
-0.0257415771484375,
-0.027862548828125,
0.013458251953125,
0.0322265625,
-0.00970458984375,
-0.04193115234375
]
] |
databricks/dolly-v2-7b | 2023-06-30T18:33:41.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"en",
"dataset:databricks/databricks-dolly-15k",
"license:mit",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | databricks | null | null | databricks/dolly-v2-7b | 133 | 11,550 | transformers | 2023-04-13T05:19:39 | ---
license: mit
language:
- en
library_name: transformers
inference: false
datasets:
- databricks/databricks-dolly-15k
---
# dolly-v2-7b Model Card
## Summary
Databricks' `dolly-v2-7b` is an instruction-following large language model trained on the Databricks machine learning platform
that is licensed for commercial use. Based on `pythia-6.9b`, Dolly is trained on ~15k instruction/response fine tuning records
[`databricks-dolly-15k`](https://github.com/databrickslabs/dolly/tree/master/data) generated
by Databricks employees in capability domains from the InstructGPT paper, including brainstorming, classification, closed QA, generation,
information extraction, open QA and summarization. `dolly-v2-7b` is not a state-of-the-art model, but does exhibit surprisingly
high quality instruction following behavior not characteristic of the foundation model on which it is based.
Dolly v2 is also available in these other models sizes:
* [dolly-v2-12b](https://huggingface.co/databricks/dolly-v2-12b), a 12 billion parameter model based on `pythia-12b`
* [dolly-v2-3b](https://huggingface.co/databricks/dolly-v2-3b), a 2.8 billion parameter model based on `pythia-2.8b`
Please refer to the [dolly GitHub repo](https://github.com/databrickslabs/dolly#getting-started-with-response-generation) for tips on
running inference for various GPU configurations.
**Owner**: Databricks, Inc.
## Model Overview
`dolly-v2-7b` is a 6.9 billion parameter causal language model created by [Databricks](https://databricks.com/) that is derived from
[EleutherAI's](https://www.eleuther.ai/) [Pythia-6.9b](https://huggingface.co/EleutherAI/pythia-6.9b) and fine-tuned
on a [~15K record instruction corpus](https://github.com/databrickslabs/dolly/tree/master/data) generated by Databricks employees and released under a permissive license (CC-BY-SA)
## Usage
To use the model with the `transformers` library on a machine with GPUs, first make sure you have the `transformers` and `accelerate` libraries installed.
In a Databricks notebook you could run:
```python
%pip install "accelerate>=0.16.0,<1" "transformers[torch]>=4.28.1,<5" "torch>=1.13.1,<2"
```
The instruction following pipeline can be loaded using the `pipeline` function as shown below. This loads a custom `InstructionTextGenerationPipeline`
found in the model repo [here](https://huggingface.co/databricks/dolly-v2-3b/blob/main/instruct_pipeline.py), which is why `trust_remote_code=True` is required.
Including `torch_dtype=torch.bfloat16` is generally recommended if this type is supported in order to reduce memory usage. It does not appear to impact output quality.
It is also fine to remove it if there is sufficient memory.
```python
import torch
from transformers import pipeline
generate_text = pipeline(model="databricks/dolly-v2-7b", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
```
You can then use the pipeline to answer instructions:
```python
res = generate_text("Explain to me the difference between nuclear fission and fusion.")
print(res[0]["generated_text"])
```
Alternatively, if you prefer to not use `trust_remote_code=True` you can download [instruct_pipeline.py](https://huggingface.co/databricks/dolly-v2-3b/blob/main/instruct_pipeline.py),
store it alongside your notebook, and construct the pipeline yourself from the loaded model and tokenizer:
```python
import torch
from instruct_pipeline import InstructionTextGenerationPipeline
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("databricks/dolly-v2-7b", padding_side="left")
model = AutoModelForCausalLM.from_pretrained("databricks/dolly-v2-7b", device_map="auto", torch_dtype=torch.bfloat16)
generate_text = InstructionTextGenerationPipeline(model=model, tokenizer=tokenizer)
```
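Once constructed this way, the pipeline is called exactly like the `trust_remote_code` version above, for example (the exact structure of the return value is defined by `instruct_pipeline.py`):

```python
res = generate_text("Explain to me the difference between nuclear fission and fusion.")
print(res)
```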
### LangChain Usage
To use the pipeline with LangChain, you must set `return_full_text=True`, as LangChain expects the full text to be returned
and the default for the pipeline is to only return the new text.
```python
import torch
from transformers import pipeline
generate_text = pipeline(model="databricks/dolly-v2-7b", torch_dtype=torch.bfloat16,
trust_remote_code=True, device_map="auto", return_full_text=True)
```
You can create a prompt that either has only an instruction or has an instruction with context:
```python
from langchain import PromptTemplate, LLMChain
from langchain.llms import HuggingFacePipeline
# template for an instruction with no input
prompt = PromptTemplate(
input_variables=["instruction"],
template="{instruction}")
# template for an instruction with input
prompt_with_context = PromptTemplate(
input_variables=["instruction", "context"],
template="{instruction}\n\nInput:\n{context}")
hf_pipeline = HuggingFacePipeline(pipeline=generate_text)
llm_chain = LLMChain(llm=hf_pipeline, prompt=prompt)
llm_context_chain = LLMChain(llm=hf_pipeline, prompt=prompt_with_context)
```
Example predicting using a simple instruction:
```python
print(llm_chain.predict(instruction="Explain to me the difference between nuclear fission and fusion.").lstrip())
```
Example predicting using an instruction with context:
```python
context = """George Washington (February 22, 1732[b] - December 14, 1799) was an American military officer, statesman,
and Founding Father who served as the first president of the United States from 1789 to 1797."""
print(llm_context_chain.predict(instruction="When was George Washington president?", context=context).lstrip())
```
## Known Limitations
### Performance Limitations
**`dolly-v2-7b` is not a state-of-the-art generative language model** and, though quantitative benchmarking is ongoing, is not designed to perform
competitively with more modern model architectures or models subject to larger pretraining corpuses.
The Dolly model family is under active development, and so any list of shortcomings is unlikely to be exhaustive, but we include known limitations and misfires here as a means to document and share our preliminary findings with the community.
In particular, `dolly-v2-7b` struggles with: syntactically complex prompts, programming problems, mathematical operations, factual errors,
dates and times, open-ended question answering, hallucination, enumerating lists of specific length, stylistic mimicry, having a sense of humor, etc.
Moreover, we find that `dolly-v2-7b` does not have some capabilities, such as well-formatted letter writing, present in the original model.
### Dataset Limitations
Like all language models, `dolly-v2-7b` reflects the content and limitations of its training corpuses.
- **The Pile**: Pythia's pre-training corpus contains content mostly collected from the public internet, and like most web-scale datasets,
it contains content many users would find objectionable. As such, the model is likely to reflect these shortcomings, potentially overtly
in the case it is explicitly asked to produce objectionable content, and sometimes subtly, as in the case of biased or harmful implicit
associations.
- **`databricks-dolly-15k`**: The training data on which `dolly-v2-7b` is instruction tuned represents natural language instructions generated
by Databricks employees during a period spanning March and April 2023 and includes passages from Wikipedia as reference passages
for instruction categories like closed QA and summarization. To our knowledge it does not contain obscenity, intellectual property or
personally identifying information about non-public figures, but it may contain typos and factual errors.
The dataset may also reflect biases found in Wikipedia. Finally, the dataset likely reflects
the interests and semantic choices of Databricks employees, a demographic which is not representative of the global population at large.
Databricks is committed to ongoing research and development efforts to develop helpful, honest and harmless AI technologies that
maximize the potential of all individuals and organizations.
### Benchmark Metrics
Below you'll find various models' benchmark performance on the [EleutherAI LLM Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness);
model results are sorted by geometric mean to produce an intelligible ordering. As outlined above, these results demonstrate that `dolly-v2-7b` is not state of the art,
and in fact underperforms `dolly-v1-6b` in some evaluation benchmarks. We believe this owes to the composition and size of the underlying fine tuning datasets,
but a robust statement as to the sources of these variations requires further study.
| model | openbookqa | arc_easy | winogrande | hellaswag | arc_challenge | piqa | boolq | gmean |
| --------------------------------- | ------------ | ---------- | ------------ | ----------- | --------------- | -------- | -------- | ---------|
| EleutherAI/pythia-2.8b | 0.348 | 0.585859 | 0.589582 | 0.591217 | 0.323379 | 0.73395 | 0.638226 | 0.523431 |
| EleutherAI/pythia-6.9b | 0.368 | 0.604798 | 0.608524 | 0.631548 | 0.343857 | 0.761153 | 0.6263 | 0.543567 |
| databricks/dolly-v2-3b | 0.384 | 0.611532 | 0.589582 | 0.650767 | 0.370307 | 0.742655 | 0.575535 | 0.544886 |
| EleutherAI/pythia-12b | 0.364 | 0.627104 | 0.636148 | 0.668094 | 0.346416 | 0.760065 | 0.673394 | 0.559676 |
| EleutherAI/gpt-j-6B | 0.382 | 0.621633 | 0.651144 | 0.662617 | 0.363481 | 0.761153 | 0.655963 | 0.565936 |
| databricks/dolly-v2-12b | 0.408 | 0.63931 | 0.616417 | 0.707927 | 0.388225 | 0.757889 | 0.568196 | 0.56781 |
| databricks/dolly-v2-7b | 0.392 | 0.633838 | 0.607735 | 0.686517 | 0.406997 | 0.750816 | 0.644037 | 0.573487 |
| databricks/dolly-v1-6b | 0.41 | 0.62963 | 0.643252 | 0.676758 | 0.384812 | 0.773667 | 0.687768 | 0.583431 |
| EleutherAI/gpt-neox-20b | 0.402 | 0.683923 | 0.656669 | 0.7142 | 0.408703 | 0.784004 | 0.695413 | 0.602236 |
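As a quick sanity check on the ordering, the `gmean` column can be recomputed from the seven per-task scores. A minimal sketch using the `databricks/dolly-v2-7b` row from the table above:

```python
import math

# Per-task scores for databricks/dolly-v2-7b, copied from the table above.
scores = [0.392, 0.633838, 0.607735, 0.686517, 0.406997, 0.750816, 0.644037]

gmean = math.prod(scores) ** (1 / len(scores))
print(round(gmean, 6))  # ~0.573487, matching the reported gmean
```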
# Citation
```
@online{DatabricksBlog2023DollyV2,
author = {Mike Conover and Matt Hayes and Ankit Mathur and Jianwei Xie and Jun Wan and Sam Shah and Ali Ghodsi and Patrick Wendell and Matei Zaharia and Reynold Xin},
title = {Free Dolly: Introducing the World's First Truly Open Instruction-Tuned LLM},
year = {2023},
url = {https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm},
urldate = {2023-06-30}
}
```
# Happy Hacking! | 10,734 | [
[
-0.0018529891967773438,
-0.07598876953125,
0.01201629638671875,
0.025421142578125,
-0.011383056640625,
-0.004711151123046875,
0.0001995563507080078,
-0.0038585662841796875,
0.003936767578125,
0.035888671875,
-0.0380859375,
-0.03814697265625,
-0.05145263671875,
0.006175994873046875,
-0.04620361328125,
0.08428955078125,
-0.004215240478515625,
-0.0109405517578125,
-0.0384521484375,
0.016357421875,
-0.027740478515625,
-0.0257720947265625,
-0.0225830078125,
-0.01134490966796875,
0.0221099853515625,
0.02044677734375,
0.053192138671875,
0.0606689453125,
0.0272064208984375,
0.0265045166015625,
-0.00878143310546875,
-0.00452423095703125,
-0.037811279296875,
0.00258636474609375,
0.002716064453125,
-0.0306243896484375,
-0.0350341796875,
0.00225830078125,
0.045379638671875,
0.034942626953125,
0.002635955810546875,
0.024993896484375,
-0.00374603271484375,
0.053192138671875,
-0.040679931640625,
0.037689208984375,
-0.034515380859375,
-0.00652313232421875,
-0.01064300537109375,
0.0017786026000976562,
-0.045013427734375,
-0.033660888671875,
-0.002475738525390625,
-0.0443115234375,
0.023040771484375,
0.00450897216796875,
0.07720947265625,
0.015777587890625,
-0.0262603759765625,
-0.01490020751953125,
-0.044677734375,
0.07537841796875,
-0.04083251953125,
0.0022792816162109375,
0.032806396484375,
0.018157958984375,
-0.0293426513671875,
-0.0643310546875,
-0.051300048828125,
-0.016510009765625,
-0.036407470703125,
0.00467681884765625,
-0.0187530517578125,
-0.00039196014404296875,
0.039794921875,
0.037750244140625,
-0.0606689453125,
-0.00885009765625,
-0.066650390625,
-0.0259552001953125,
0.054962158203125,
0.0257720947265625,
0.007843017578125,
-0.049285888671875,
-0.021392822265625,
-0.0291290283203125,
-0.0401611328125,
0.004329681396484375,
0.035186767578125,
0.023590087890625,
-0.040374755859375,
0.055999755859375,
-0.02978515625,
0.061920166015625,
-0.008209228515625,
-0.01275634765625,
0.026885986328125,
-0.006801605224609375,
-0.033447265625,
-0.00890350341796875,
0.068603515625,
0.0176239013671875,
0.007411956787109375,
0.0012187957763671875,
-0.0029621124267578125,
0.0188140869140625,
0.0157928466796875,
-0.0631103515625,
-0.0364990234375,
0.041656494140625,
-0.03472900390625,
-0.038238525390625,
-0.0115203857421875,
-0.07049560546875,
-0.041412353515625,
-0.0166168212890625,
0.037750244140625,
-0.0206146240234375,
-0.022064208984375,
-0.01076507568359375,
-0.007694244384765625,
0.0221099853515625,
0.01317596435546875,
-0.08306884765625,
0.0140533447265625,
0.040008544921875,
0.054290771484375,
-0.002040863037109375,
-0.01364898681640625,
-0.055328369140625,
-0.017669677734375,
-0.0118865966796875,
0.03533935546875,
-0.040313720703125,
-0.027313232421875,
0.0099029541015625,
0.0164031982421875,
-0.01062774658203125,
-0.04144287109375,
0.0251617431640625,
-0.0267791748046875,
0.040374755859375,
-0.00325775146484375,
-0.040191650390625,
-0.01800537109375,
0.005832672119140625,
-0.04107666015625,
0.0849609375,
0.040679931640625,
-0.055328369140625,
0.01200103759765625,
-0.01187896728515625,
-0.039276123046875,
-0.0134429931640625,
-0.00778961181640625,
-0.0462646484375,
-0.003345489501953125,
0.021331787109375,
0.051910400390625,
-0.03656005859375,
0.025299072265625,
-0.0006136894226074219,
-0.0165252685546875,
0.006343841552734375,
-0.031494140625,
0.08074951171875,
0.0096893310546875,
-0.029754638671875,
0.0146484375,
-0.07763671875,
-0.0032253265380859375,
0.00977325439453125,
-0.030029296875,
0.0249176025390625,
-0.024993896484375,
0.01788330078125,
0.0042572021484375,
0.0196685791015625,
-0.0303802490234375,
0.0237579345703125,
-0.021697998046875,
0.00894927978515625,
0.053558349609375,
-0.03289794921875,
0.02764892578125,
-0.037384033203125,
0.042694091796875,
0.0006356239318847656,
0.007541656494140625,
-0.0267791748046875,
-0.059326171875,
-0.07708740234375,
-0.01348876953125,
0.0194549560546875,
0.048980712890625,
-0.049407958984375,
0.0270843505859375,
-0.01296234130859375,
-0.04083251953125,
-0.04425048828125,
0.005695343017578125,
0.03997802734375,
0.050048828125,
0.05517578125,
-0.0137176513671875,
-0.048797607421875,
-0.061767578125,
0.0022869110107421875,
-0.0285491943359375,
-0.016937255859375,
0.01222991943359375,
0.03912353515625,
-0.004291534423828125,
0.0631103515625,
-0.037994384765625,
-0.006725311279296875,
-0.040771484375,
0.0108489990234375,
0.038970947265625,
0.048370361328125,
0.017486572265625,
-0.0433349609375,
-0.03857421875,
0.00489044189453125,
-0.0657958984375,
0.005130767822265625,
-0.01528167724609375,
-0.00846099853515625,
0.041595458984375,
0.017364501953125,
-0.0662841796875,
0.053070068359375,
0.048980712890625,
-0.0301513671875,
0.054595947265625,
-0.0120849609375,
0.0009288787841796875,
-0.08966064453125,
0.01371002197265625,
-0.01111602783203125,
0.00011813640594482422,
-0.04107666015625,
-0.01036834716796875,
0.002239227294921875,
0.004154205322265625,
-0.0271759033203125,
0.06292724609375,
-0.015869140625,
0.001285552978515625,
-0.0056610107421875,
0.007328033447265625,
0.01097869873046875,
0.03839111328125,
0.0012645721435546875,
0.028045654296875,
0.059051513671875,
-0.052734375,
0.06451416015625,
0.0306243896484375,
-0.033905029296875,
0.0270233154296875,
-0.04583740234375,
0.0095062255859375,
-0.01548004150390625,
0.021514892578125,
-0.0703125,
-0.0283660888671875,
0.0269927978515625,
-0.031494140625,
0.043182373046875,
-0.0214080810546875,
-0.02490234375,
-0.03851318359375,
-0.007427215576171875,
0.008941650390625,
0.06939697265625,
-0.041015625,
0.05487060546875,
0.009033203125,
-0.006221771240234375,
-0.0576171875,
-0.044677734375,
-0.0164031982421875,
-0.0198516845703125,
-0.06781005859375,
0.0266876220703125,
0.01235198974609375,
-0.0142974853515625,
-0.01412200927734375,
0.0010137557983398438,
0.00534820556640625,
-0.0208587646484375,
0.01001739501953125,
0.035400390625,
-0.009674072265625,
0.006694793701171875,
0.005126953125,
-0.030731201171875,
0.007904052734375,
-0.0157928466796875,
0.045684814453125,
-0.0015010833740234375,
0.0072174072265625,
-0.0430908203125,
0.0018625259399414062,
0.0400390625,
0.0029315948486328125,
0.06805419921875,
0.07098388671875,
-0.01360321044921875,
0.004482269287109375,
-0.044830322265625,
-0.0255889892578125,
-0.038177490234375,
0.0401611328125,
-0.01468658447265625,
-0.0274200439453125,
0.03131103515625,
0.00836181640625,
0.0088043212890625,
0.039459228515625,
0.046295166015625,
0.00021779537200927734,
0.046600341796875,
0.02862548828125,
-0.001399993896484375,
0.0153045654296875,
-0.05340576171875,
0.006832122802734375,
-0.0631103515625,
-0.036773681640625,
-0.039520263671875,
-0.0256500244140625,
-0.061065673828125,
-0.045166015625,
0.008331298828125,
0.0064239501953125,
-0.0267181396484375,
0.041290283203125,
-0.0374755859375,
0.01224517822265625,
0.04962158203125,
-0.0008459091186523438,
0.0027484893798828125,
0.004062652587890625,
-0.004199981689453125,
0.0125579833984375,
-0.05560302734375,
-0.046112060546875,
0.08978271484375,
0.0164947509765625,
0.067138671875,
-0.0041046142578125,
0.03179931640625,
-0.0021915435791015625,
0.0162811279296875,
-0.040679931640625,
0.043182373046875,
-0.00806427001953125,
-0.05950927734375,
-0.01361846923828125,
-0.043548583984375,
-0.07708740234375,
0.004302978515625,
-0.01461029052734375,
-0.0821533203125,
0.0002849102020263672,
0.0198211669921875,
-0.016326904296875,
0.022216796875,
-0.061004638671875,
0.08575439453125,
0.00286865234375,
-0.046142578125,
-0.009002685546875,
-0.056976318359375,
0.02081298828125,
0.02386474609375,
0.00716400146484375,
-0.001125335693359375,
0.0308990478515625,
0.057464599609375,
-0.031005859375,
0.05413818359375,
-0.0113677978515625,
0.01406097412109375,
0.0300750732421875,
0.003986358642578125,
0.052886962890625,
0.0116119384765625,
-0.0174713134765625,
0.0010242462158203125,
-0.006534576416015625,
-0.046600341796875,
-0.0430908203125,
0.0546875,
-0.0599365234375,
-0.052703857421875,
-0.03118896484375,
-0.04852294921875,
0.00830078125,
-0.003879547119140625,
0.03045654296875,
0.053863525390625,
-0.000029921531677246094,
0.0225067138671875,
0.03863525390625,
-0.047821044921875,
0.038909912109375,
0.007137298583984375,
-0.034332275390625,
-0.0162811279296875,
0.0762939453125,
-0.010589599609375,
0.0255889892578125,
0.045196533203125,
0.0298004150390625,
-0.0286865234375,
-0.0175323486328125,
-0.0604248046875,
0.011016845703125,
-0.055023193359375,
-0.0172119140625,
-0.0657958984375,
-0.0178680419921875,
-0.0300140380859375,
-0.014251708984375,
-0.033416748046875,
-0.0570068359375,
-0.03460693359375,
-0.005733489990234375,
0.05450439453125,
0.0567626953125,
0.005039215087890625,
0.02410888671875,
-0.046539306640625,
0.029876708984375,
0.035858154296875,
0.006053924560546875,
-0.00624847412109375,
-0.055206298828125,
-0.019012451171875,
0.0006847381591796875,
-0.0469970703125,
-0.04736328125,
0.02984619140625,
-0.006916046142578125,
0.0218505859375,
0.0131378173828125,
0.00772857666015625,
0.03485107421875,
-0.01227569580078125,
0.0699462890625,
0.00604248046875,
-0.06060791015625,
0.039825439453125,
-0.0242919921875,
0.031829833984375,
0.0095367431640625,
0.026336669921875,
-0.025482177734375,
-0.0270843505859375,
-0.044830322265625,
-0.06890869140625,
0.068115234375,
0.045867919921875,
0.024566650390625,
0.003894805908203125,
0.0087432861328125,
0.00958251953125,
0.0166473388671875,
-0.058990478515625,
-0.042694091796875,
-0.0223388671875,
-0.01702880859375,
0.01338958740234375,
-0.00830841064453125,
-0.004791259765625,
-0.0323486328125,
0.06744384765625,
0.010009765625,
0.0347900390625,
-0.00799560546875,
-0.01113128662109375,
-0.005382537841796875,
0.00738525390625,
0.033843994140625,
0.04833984375,
-0.020416259765625,
-0.006793975830078125,
0.01528167724609375,
-0.053985595703125,
0.0095367431640625,
0.030914306640625,
-0.012481689453125,
-0.0002574920654296875,
0.032623291015625,
0.073974609375,
-0.0128936767578125,
-0.0237274169921875,
0.0238189697265625,
-0.0126800537109375,
0.00751495361328125,
-0.014678955078125,
0.0106353759765625,
0.0080718994140625,
0.0127410888671875,
0.022247314453125,
-0.0005097389221191406,
-0.016693115234375,
-0.040374755859375,
0.006404876708984375,
0.0222015380859375,
-0.020263671875,
-0.015228271484375,
0.05291748046875,
0.013671875,
-0.0224456787109375,
0.07958984375,
-0.02349853515625,
-0.0228271484375,
0.062103271484375,
0.040130615234375,
0.061553955078125,
-0.0179290771484375,
0.037994384765625,
0.056671142578125,
0.025054931640625,
0.01158905029296875,
0.013763427734375,
0.021392822265625,
-0.02862548828125,
-0.0238189697265625,
-0.0706787109375,
-0.0126495361328125,
0.0203857421875,
-0.0386962890625,
0.05609130859375,
-0.0361328125,
0.005222320556640625,
-0.01313018798828125,
0.00528717041015625,
-0.0653076171875,
0.0321044921875,
-0.0042572021484375,
0.041900634765625,
-0.04998779296875,
0.056671142578125,
0.033111572265625,
-0.023406982421875,
-0.05523681640625,
-0.0132904052734375,
0.00734710693359375,
-0.049407958984375,
0.04254150390625,
0.0330810546875,
0.0239105224609375,
-0.00689697265625,
-0.0159759521484375,
-0.0670166015625,
0.08782958984375,
0.0230255126953125,
-0.0281982421875,
0.0126190185546875,
0.00890350341796875,
0.02642822265625,
-0.026397705078125,
0.0521240234375,
0.054107666015625,
0.034881591796875,
0.01641845703125,
-0.0684814453125,
0.018035888671875,
-0.02081298828125,
-0.007419586181640625,
0.006626129150390625,
-0.045196533203125,
0.07928466796875,
-0.030914306640625,
-0.01459503173828125,
0.0247344970703125,
0.057830810546875,
0.019439697265625,
0.0206146240234375,
0.0084686279296875,
0.040069580078125,
0.060028076171875,
-0.0187225341796875,
0.10498046875,
-0.0185394287109375,
0.038238525390625,
0.060882568359375,
0.016510009765625,
0.04364013671875,
0.020904541015625,
-0.036712646484375,
0.057281494140625,
0.035552978515625,
-0.00003981590270996094,
0.031646728515625,
0.030792236328125,
-0.0162200927734375,
0.005260467529296875,
0.007015228271484375,
-0.046722412109375,
0.031768798828125,
0.0305633544921875,
-0.0390625,
0.00569915771484375,
-0.0200653076171875,
0.0173187255859375,
-0.01548004150390625,
0.001689910888671875,
0.0304107666015625,
0.0007023811340332031,
-0.04412841796875,
0.07012939453125,
-0.003314971923828125,
0.03216552734375,
-0.043212890625,
-0.00690460205078125,
-0.02703857421875,
0.01267242431640625,
-0.027587890625,
-0.041839599609375,
0.021942138671875,
-0.0025577545166015625,
-0.01080322265625,
-0.006298065185546875,
0.030609130859375,
-0.0300750732421875,
-0.0645751953125,
0.007404327392578125,
0.0137786865234375,
0.020111083984375,
0.015960693359375,
-0.03875732421875,
0.0284271240234375,
0.0030040740966796875,
-0.040679931640625,
0.026153564453125,
0.011932373046875,
0.019012451171875,
0.048095703125,
0.03314208984375,
-0.018524169921875,
0.0029201507568359375,
-0.01526641845703125,
0.07098388671875,
-0.039276123046875,
-0.00980377197265625,
-0.058013916015625,
0.07879638671875,
-0.01110076904296875,
-0.03814697265625,
0.0484619140625,
0.04754638671875,
0.06597900390625,
-0.02056884765625,
0.0626220703125,
-0.036590576171875,
0.0177001953125,
-0.05218505859375,
0.040771484375,
-0.0312347412109375,
0.024993896484375,
-0.034515380859375,
-0.09228515625,
-0.0183563232421875,
0.06982421875,
-0.0291595458984375,
0.022308349609375,
0.07049560546875,
0.0869140625,
-0.007293701171875,
0.00881195068359375,
0.0191650390625,
0.0347900390625,
0.020263671875,
0.0279998779296875,
0.047515869140625,
-0.0548095703125,
0.05230712890625,
-0.0469970703125,
-0.027374267578125,
-0.01026153564453125,
-0.06298828125,
-0.08099365234375,
-0.048431396484375,
-0.035308837890625,
-0.052337646484375,
-0.002933502197265625,
0.0655517578125,
0.0440673828125,
-0.0655517578125,
-0.0292205810546875,
-0.01556396484375,
0.0309906005859375,
-0.01000213623046875,
-0.0218658447265625,
0.04754638671875,
-0.01702880859375,
-0.0673828125,
0.0208587646484375,
0.01192474365234375,
-0.00008809566497802734,
-0.027740478515625,
-0.00843048095703125,
-0.0120391845703125,
-0.00927734375,
0.0357666015625,
0.00982666015625,
-0.047088623046875,
-0.005855560302734375,
0.0035686492919921875,
0.003582000732421875,
-0.000043272972106933594,
0.040740966796875,
-0.07568359375,
0.054046630859375,
0.0484619140625,
0.0269927978515625,
0.05853271484375,
-0.0108642578125,
0.048919677734375,
-0.0567626953125,
0.025543212890625,
0.01177215576171875,
0.0191192626953125,
0.047515869140625,
-0.032684326171875,
0.0265350341796875,
0.0195770263671875,
-0.047119140625,
-0.047454833984375,
0.0216217041015625,
-0.05950927734375,
-0.004489898681640625,
0.10028076171875,
-0.0133209228515625,
-0.023773193359375,
-0.0160980224609375,
-0.015594482421875,
0.021453857421875,
-0.0273895263671875,
0.07891845703125,
0.034515380859375,
-0.00487518310546875,
-0.00553131103515625,
-0.046356201171875,
0.04364013671875,
0.0267791748046875,
-0.053924560546875,
0.01204681396484375,
0.0187835693359375,
-0.00452423095703125,
0.021392822265625,
0.0341796875,
-0.0016832351684570312,
0.021514892578125,
0.0220184326171875,
-0.006694793701171875,
0.005008697509765625,
-0.0305633544921875,
-0.0040130615234375,
-0.006076812744140625,
-0.0289154052734375,
-0.008697509765625
]
] |
slauw87/bart_summarisation | 2021-09-20T05:27:36.000Z | [
"transformers",
"pytorch",
"bart",
"text2text-generation",
"sagemaker",
"summarization",
"en",
"dataset:samsum",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | summarization | slauw87 | null | null | slauw87/bart_summarisation | 45 | 11,532 | transformers | 2022-03-02T23:29:05 |
---
language: en
tags:
- sagemaker
- bart
- summarization
license: apache-2.0
datasets:
- samsum
model-index:
- name: bart-large-cnn-samsum
results:
- task:
name: Abstractive Text Summarization
type: abstractive-text-summarization
dataset:
name: "SAMSum Corpus: A Human-annotated Dialogue Dataset for Abstractive Summarization"
type: samsum
metrics:
- name: Validation ROGUE-1
type: rogue-1
value: 43.2111
- name: Validation ROGUE-2
type: rogue-2
value: 22.3519
- name: Validation ROGUE-L
type: rogue-l
value: 33.315
- name: Test ROGUE-1
type: rogue-1
value: 41.8283
- name: Test ROGUE-2
type: rogue-2
value: 20.9857
- name: Test ROGUE-L
type: rogue-l
value: 32.3602
widget:
- text: |
Sugi: I am tired of everything in my life.
Tommy: What? How happy you life is! I do envy you.
Sugi: You don't know that I have been over-protected by my mother these years. I am really about to leave the family and spread my wings.
Tommy: Maybe you are right.
---
## `bart-large-cnn-samsum`
This model was trained using Amazon SageMaker and the new Hugging Face Deep Learning container.
For more information look at:
- [🤗 Transformers Documentation: Amazon SageMaker](https://huggingface.co/transformers/sagemaker.html)
- [Example Notebooks](https://github.com/huggingface/notebooks/tree/master/sagemaker)
- [Amazon SageMaker documentation for Hugging Face](https://docs.aws.amazon.com/sagemaker/latest/dg/hugging-face.html)
- [Python SDK SageMaker documentation for Hugging Face](https://sagemaker.readthedocs.io/en/stable/frameworks/huggingface/index.html)
- [Deep Learning Container](https://github.com/aws/deep-learning-containers/blob/master/available_images.md#huggingface-training-containers)
## Hyperparameters
```json
{
    "dataset_name": "samsum",
    "do_eval": true,
    "do_predict": true,
    "do_train": true,
    "fp16": true,
    "learning_rate": 5e-05,
    "model_name_or_path": "facebook/bart-large-cnn",
    "num_train_epochs": 3,
    "output_dir": "/opt/ml/model",
    "per_device_eval_batch_size": 4,
    "per_device_train_batch_size": 4,
    "predict_with_generate": true,
    "seed": 7
}
```
## Usage
```python
from transformers import pipeline

summarizer = pipeline("summarization", model="slauw87/bart_summarisation")

conversation = '''Sugi: I am tired of everything in my life.
Tommy: What? How happy you life is! I do envy you.
Sugi: You don't know that I have been over-protected by my mother these years. I am really about to leave the family and spread my wings.
Tommy: Maybe you are right.
'''
summarizer(conversation)
```
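The summarization pipeline returns the standard `transformers` output format, a list of dicts with a `summary_text` field; to print only the generated summary:

```python
result = summarizer(conversation)
print(result[0]["summary_text"])
```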
## Results
| key | value |
| --- | ----- |
| eval_rouge1 | 43.2111 |
| eval_rouge2 | 22.3519 |
| eval_rougeL | 33.3153 |
| eval_rougeLsum | 40.0527 |
| predict_rouge1 | 41.8283 |
| predict_rouge2 | 20.9857 |
| predict_rougeL | 32.3602 |
| predict_rougeLsum | 38.7316 |
| 3,068 | [
[
-0.04498291015625,
-0.05560302734375,
0.00751495361328125,
0.00783538818359375,
-0.0193023681640625,
-0.005611419677734375,
-0.0173797607421875,
-0.01323699951171875,
0.0280303955078125,
0.041229248046875,
-0.0716552734375,
-0.04046630859375,
-0.060089111328125,
-0.01520538330078125,
-0.0020389556884765625,
0.0982666015625,
0.004451751708984375,
0.018402099609375,
0.0010852813720703125,
-0.00922393798828125,
-0.0016851425170898438,
-0.0338134765625,
-0.0655517578125,
-0.030029296875,
0.023284912109375,
0.02008056640625,
0.060211181640625,
0.0228271484375,
0.03643798828125,
0.0260162353515625,
-0.02642822265625,
-0.003711700439453125,
-0.04266357421875,
-0.00936126708984375,
0.008453369140625,
-0.0224151611328125,
-0.0394287109375,
0.00295257568359375,
0.037628173828125,
0.038726806640625,
-0.00620269775390625,
0.035919189453125,
-0.00027060508728027344,
0.052520751953125,
-0.0181732177734375,
0.0308837890625,
-0.032958984375,
0.022491455078125,
0.002086639404296875,
0.0142364501953125,
-0.009552001953125,
-0.02484130859375,
0.013885498046875,
-0.02960205078125,
0.021087646484375,
-0.003528594970703125,
0.100830078125,
0.036224365234375,
-0.0338134765625,
0.00458526611328125,
-0.0257415771484375,
0.0635986328125,
-0.0611572265625,
0.00876617431640625,
0.03857421875,
0.041656494140625,
-0.0214385986328125,
-0.060150146484375,
-0.042083740234375,
-0.005229949951171875,
-0.01605224609375,
0.01171112060546875,
-0.0243682861328125,
-0.00650787353515625,
0.03057861328125,
0.0325927734375,
-0.04132080078125,
-0.004222869873046875,
-0.03961181640625,
-0.0076141357421875,
0.04742431640625,
-0.0015707015991210938,
0.00589752197265625,
-0.01319122314453125,
-0.041778564453125,
-0.025177001953125,
-0.0229034423828125,
0.014862060546875,
0.0080413818359375,
0.01007080078125,
-0.032867431640625,
0.0394287109375,
-0.032867431640625,
0.042694091796875,
0.00202178955078125,
-0.012542724609375,
0.0460205078125,
0.00018858909606933594,
-0.0281829833984375,
0.0026836395263671875,
0.07757568359375,
0.040985107421875,
0.0176849365234375,
0.03033447265625,
-0.006801605224609375,
-0.01155853271484375,
0.01214599609375,
-0.08074951171875,
-0.01947021484375,
0.0113677978515625,
-0.049163818359375,
-0.033782958984375,
-0.007625579833984375,
-0.05853271484375,
-0.01580810546875,
-0.01100921630859375,
0.015380859375,
-0.0295867919921875,
-0.0116729736328125,
0.004566192626953125,
-0.0024585723876953125,
0.0279541015625,
0.01611328125,
-0.07012939453125,
0.0162353515625,
0.043212890625,
0.05877685546875,
0.0221099853515625,
-0.01003265380859375,
-0.0310821533203125,
-0.0112457275390625,
-0.0169525146484375,
0.03778076171875,
-0.01210784912109375,
-0.01531982421875,
-0.00908660888671875,
0.026947021484375,
0.006481170654296875,
-0.0246734619140625,
0.0604248046875,
-0.0210113525390625,
0.035400390625,
-0.01055145263671875,
-0.03582763671875,
-0.045745849609375,
0.0181121826171875,
-0.037841796875,
0.06744384765625,
0.032012939453125,
-0.0548095703125,
0.0201568603515625,
-0.046661376953125,
-0.01448822021484375,
0.0007410049438476562,
-0.0013856887817382812,
-0.07452392578125,
-0.0107574462890625,
0.0225830078125,
0.0767822265625,
-0.025115966796875,
0.025177001953125,
-0.045806884765625,
-0.009490966796875,
0.0166168212890625,
-0.019439697265625,
0.087158203125,
0.012847900390625,
-0.0310516357421875,
0.019775390625,
-0.05670166015625,
-0.00795745849609375,
0.036651611328125,
-0.01372528076171875,
-0.0007157325744628906,
-0.0325927734375,
0.0084991455078125,
0.0260162353515625,
0.027435302734375,
-0.04254150390625,
0.027099609375,
-0.0093841552734375,
0.032623291015625,
0.04339599609375,
0.005695343017578125,
0.00962066650390625,
-0.03436279296875,
0.038543701171875,
0.00226593017578125,
0.01198577880859375,
-0.0200653076171875,
-0.048370361328125,
-0.061798095703125,
-0.017913818359375,
0.00878143310546875,
0.025238037109375,
-0.03765869140625,
0.060394287109375,
-0.022491455078125,
-0.05401611328125,
-0.042724609375,
0.009521484375,
0.038604736328125,
0.051116943359375,
0.03656005859375,
-0.021820068359375,
-0.04791259765625,
-0.07269287109375,
0.014495849609375,
-0.00725555419921875,
-0.0028438568115234375,
0.0252685546875,
0.0479736328125,
-0.035125732421875,
0.04339599609375,
-0.0478515625,
-0.037841796875,
-0.0180206298828125,
0.01334381103515625,
0.021636962890625,
0.0472412109375,
0.040130615234375,
-0.0513916015625,
-0.0247802734375,
-0.00159454345703125,
-0.06317138671875,
0.01265716552734375,
-0.018890380859375,
-0.0277557373046875,
0.0207977294921875,
0.02813720703125,
-0.060211181640625,
0.03265380859375,
0.05377197265625,
-0.03228759765625,
0.045562744140625,
0.00022733211517333984,
-0.0088043212890625,
-0.096435546875,
0.001495361328125,
0.02032470703125,
-0.0246124267578125,
-0.02716064453125,
0.00972747802734375,
-0.0097808837890625,
-0.006649017333984375,
-0.033721923828125,
0.03704833984375,
-0.0171356201171875,
-0.008880615234375,
-0.0045623779296875,
-0.009521484375,
0.000530242919921875,
0.06427001953125,
0.004001617431640625,
0.03643798828125,
0.048492431640625,
-0.0281524658203125,
0.052215576171875,
0.053466796875,
-0.0308837890625,
0.03955078125,
-0.055419921875,
-0.01043701171875,
-0.0014467239379882812,
0.053863525390625,
-0.0653076171875,
-0.0285797119140625,
0.039581298828125,
-0.0517578125,
0.0251312255859375,
-0.0157470703125,
-0.0221099853515625,
-0.046417236328125,
-0.0148468017578125,
0.01055908203125,
0.054656982421875,
-0.048370361328125,
0.035369873046875,
0.03167724609375,
-0.00975799560546875,
-0.05072021484375,
-0.05621337890625,
0.00537109375,
-0.0281219482421875,
-0.040740966796875,
0.0228118896484375,
-0.0021991729736328125,
-0.0014314651489257812,
0.004001617431640625,
0.0076141357421875,
-0.01108551025390625,
0.0021514892578125,
0.0272979736328125,
0.033172607421875,
-0.03045654296875,
-0.00774383544921875,
0.0039520263671875,
-0.0230560302734375,
0.01557159423828125,
-0.0010242462158203125,
0.04803466796875,
-0.04681396484375,
-0.0062713623046875,
-0.05401611328125,
-0.00408172607421875,
0.038787841796875,
-0.0030879974365234375,
0.0655517578125,
0.07708740234375,
-0.035369873046875,
0.0005965232849121094,
-0.036163330078125,
-0.0168609619140625,
-0.035491943359375,
0.0244903564453125,
-0.02362060546875,
-0.05615234375,
0.045166015625,
-0.0121307373046875,
-0.00007909536361694336,
0.05023193359375,
0.05157470703125,
-0.00666046142578125,
0.064208984375,
0.0272369384765625,
-0.0248260498046875,
0.0301971435546875,
-0.056121826171875,
-0.00839996337890625,
-0.07220458984375,
-0.02362060546875,
-0.0120391845703125,
-0.039642333984375,
-0.03271484375,
-0.0255279541015625,
0.01366424560546875,
0.006130218505859375,
-0.048370361328125,
0.050994873046875,
-0.0472412109375,
0.0205078125,
0.061614990234375,
0.019775390625,
0.0036468505859375,
-0.01385498046875,
0.00446319580078125,
-0.0031147003173828125,
-0.03643798828125,
-0.0306854248046875,
0.06854248046875,
0.0467529296875,
0.04608154296875,
-0.0099334716796875,
0.05029296875,
-0.00800323486328125,
0.018524169921875,
-0.05194091796875,
0.0499267578125,
-0.0033817291259765625,
-0.04937744140625,
-0.0194244384765625,
-0.043212890625,
-0.091064453125,
0.0138397216796875,
-0.031768798828125,
-0.05010986328125,
-0.005786895751953125,
-0.01035308837890625,
-0.031280517578125,
0.025848388671875,
-0.034637451171875,
0.0816650390625,
-0.0161285400390625,
0.0016918182373046875,
-0.0092315673828125,
-0.07598876953125,
0.052764892578125,
0.00554656982421875,
-0.0094757080078125,
-0.0201568603515625,
0.011016845703125,
0.04913330078125,
-0.0297393798828125,
0.06787109375,
-0.00905609130859375,
0.0105438232421875,
0.021820068359375,
-0.006866455078125,
0.022430419921875,
-0.0049591064453125,
0.00653839111328125,
0.036285400390625,
-0.00524139404296875,
-0.038482666015625,
-0.035369873046875,
0.052703857421875,
-0.07098388671875,
-0.0287017822265625,
-0.03411865234375,
-0.021728515625,
0.0081024169921875,
0.024688720703125,
0.0325927734375,
0.0176239013671875,
-0.014801025390625,
0.0423583984375,
0.03668212890625,
-0.028045654296875,
0.03668212890625,
0.006565093994140625,
-0.0247344970703125,
-0.0380859375,
0.0565185546875,
-0.0118560791015625,
0.00922393798828125,
0.0149078369140625,
0.02874755859375,
-0.00988006591796875,
0.00225067138671875,
-0.05078125,
0.0297393798828125,
-0.0304107666015625,
-0.02972412109375,
-0.044097900390625,
-0.041748046875,
-0.03131103515625,
-0.0160064697265625,
-0.039886474609375,
-0.0338134765625,
-0.03131103515625,
-0.015838623046875,
0.0377197265625,
0.0301971435546875,
0.0156707763671875,
0.0391845703125,
-0.058624267578125,
0.029266357421875,
0.0195159912109375,
0.01922607421875,
-0.00347137451171875,
-0.0487060546875,
-0.0176849365234375,
0.0068511962890625,
-0.049652099609375,
-0.046783447265625,
0.053375244140625,
0.01053619384765625,
0.032867431640625,
0.05157470703125,
0.0062713623046875,
0.047637939453125,
-0.0232086181640625,
0.065185546875,
0.02691650390625,
-0.055023193359375,
0.0372314453125,
-0.03533935546875,
0.00897979736328125,
0.03289794921875,
0.040496826171875,
-0.036346435546875,
-0.014007568359375,
-0.0855712890625,
-0.05743408203125,
0.053619384765625,
0.01715087890625,
0.00925445556640625,
0.0234222412109375,
0.01372528076171875,
-0.0021514892578125,
0.0267333984375,
-0.072998046875,
-0.037200927734375,
-0.01983642578125,
-0.0185394287109375,
-0.01312255859375,
-0.001995086669921875,
-0.0220947265625,
-0.032470703125,
0.07769775390625,
-0.00409698486328125,
0.0301361083984375,
-0.0033245086669921875,
0.023284912109375,
-0.00986480712890625,
-0.0238037109375,
0.039215087890625,
0.01727294921875,
-0.017303466796875,
-0.01085662841796875,
0.0188446044921875,
-0.02276611328125,
-0.00800323486328125,
0.02032470703125,
-0.00031828880310058594,
0.00853729248046875,
0.0272369384765625,
0.0836181640625,
0.0168304443359375,
-0.043731689453125,
0.061767578125,
-0.01534271240234375,
-0.023529052734375,
-0.05364990234375,
0.0039520263671875,
0.015228271484375,
0.025787353515625,
0.006282806396484375,
0.0164337158203125,
0.00797271728515625,
-0.034759521484375,
0.02471923828125,
0.04193115234375,
-0.02978515625,
-0.00901031494140625,
0.05841064453125,
0.00209808349609375,
-0.0236358642578125,
0.05419921875,
-0.0171051025390625,
-0.05267333984375,
0.05877685546875,
0.0313720703125,
0.0728759765625,
-0.0192108154296875,
0.027191162109375,
0.0537109375,
0.0076446533203125,
-0.008270263671875,
0.00495147705078125,
-0.00830078125,
-0.05560302734375,
-0.024200439453125,
-0.048828125,
-0.03582763671875,
0.032958984375,
-0.06854248046875,
0.0438232421875,
-0.0472412109375,
-0.0128631591796875,
-0.0017652511596679688,
0.0125885009765625,
-0.056060791015625,
0.029266357421875,
0.017181396484375,
0.0531005859375,
-0.06317138671875,
0.06011962890625,
0.040283203125,
-0.035400390625,
-0.06268310546875,
-0.006561279296875,
0.00492095947265625,
-0.0831298828125,
0.036895751953125,
0.0305023193359375,
0.0134124755859375,
-0.009857177734375,
-0.052093505859375,
-0.06256103515625,
0.08160400390625,
0.0167999267578125,
-0.028717041015625,
0.012176513671875,
-0.018035888671875,
0.041351318359375,
-0.02655029296875,
0.0208740234375,
0.045166015625,
0.0251617431640625,
0.0242919921875,
-0.050140380859375,
0.013763427734375,
-0.028656005859375,
-0.0031604766845703125,
0.01105499267578125,
-0.07598876953125,
0.05987548828125,
-0.0211334228515625,
0.01552581787109375,
0.016357421875,
0.061767578125,
0.02117919921875,
0.03753662109375,
0.032318115234375,
0.06573486328125,
0.046051025390625,
-0.0019245147705078125,
0.07257080078125,
-0.020172119140625,
0.046661376953125,
0.078857421875,
0.0232086181640625,
0.0562744140625,
0.0080108642578125,
-0.0183258056640625,
0.058746337890625,
0.059783935546875,
-0.0282745361328125,
0.0210418701171875,
0.00861358642578125,
-0.013885498046875,
-0.007091522216796875,
0.004627227783203125,
-0.033599853515625,
0.036529541015625,
0.03277587890625,
-0.04364013671875,
-0.00432586669921875,
-0.004383087158203125,
0.0279998779296875,
-0.01097869873046875,
-0.01007080078125,
0.05535888671875,
0.015167236328125,
-0.044219970703125,
0.0638427734375,
-0.0035953521728515625,
0.041412353515625,
-0.04290771484375,
0.004848480224609375,
-0.002716064453125,
0.00832366943359375,
-0.0099639892578125,
-0.055145263671875,
0.0197601318359375,
-0.0140380859375,
0.0115509033203125,
-0.0197296142578125,
0.03680419921875,
-0.0323486328125,
-0.052337646484375,
0.01314544677734375,
0.0191192626953125,
0.031646728515625,
0.003200531005859375,
-0.086669921875,
0.00003641843795776367,
0.01140594482421875,
-0.051055908203125,
0.0232391357421875,
0.0304107666015625,
0.0158538818359375,
0.07122802734375,
0.059112548828125,
0.006046295166015625,
-0.014862060546875,
-0.005970001220703125,
0.06982421875,
-0.0499267578125,
-0.030426025390625,
-0.057220458984375,
0.04345703125,
-0.0192108154296875,
-0.053436279296875,
0.050384521484375,
0.0364990234375,
0.064453125,
-0.01126861572265625,
0.04364013671875,
-0.00005882978439331055,
0.035125732421875,
-0.026214599609375,
0.043426513671875,
-0.047882080078125,
-0.01265716552734375,
-0.032958984375,
-0.07318115234375,
-0.031768798828125,
0.081787109375,
-0.0115814208984375,
0.01132965087890625,
0.035247802734375,
0.053192138671875,
0.0021076202392578125,
0.00836944580078125,
0.01116943359375,
0.02972412109375,
0.0157623291015625,
0.044891357421875,
0.03424072265625,
-0.043914794921875,
0.036468505859375,
-0.04534912109375,
-0.0220489501953125,
-0.029510498046875,
-0.055572509765625,
-0.076171875,
-0.05517578125,
-0.032745361328125,
-0.047607421875,
-0.0257415771484375,
0.077880859375,
0.0657958984375,
-0.052215576171875,
-0.0030460357666015625,
-0.00920867919921875,
-0.0035381317138671875,
-0.029327392578125,
-0.0164337158203125,
0.03436279296875,
-0.00739288330078125,
-0.06475830078125,
-0.0003001689910888672,
-0.00923919677734375,
0.0239410400390625,
-0.01885986328125,
-0.007411956787109375,
0.007366180419921875,
-0.0202484130859375,
0.025787353515625,
0.0284881591796875,
-0.049560546875,
-0.01531982421875,
-0.0185699462890625,
-0.01403045654296875,
0.00243377685546875,
0.0233612060546875,
-0.055328369140625,
0.0204925537109375,
0.01039886474609375,
0.056121826171875,
0.05743408203125,
0.0191650390625,
0.0268402099609375,
-0.044189453125,
0.00919342041015625,
0.015838623046875,
0.049835205078125,
0.033935546875,
-0.0439453125,
0.04986572265625,
0.01399993896484375,
-0.06951904296875,
-0.047515869140625,
0.01409149169921875,
-0.10198974609375,
-0.013275146484375,
0.08575439453125,
-0.0184478759765625,
-0.0283966064453125,
0.0243682861328125,
-0.035614013671875,
0.03765869140625,
-0.03839111328125,
0.057586669921875,
0.037994384765625,
-0.005855560302734375,
-0.0159149169921875,
-0.042938232421875,
0.033721923828125,
0.0289154052734375,
-0.035186767578125,
-0.0119781494140625,
0.0231475830078125,
0.0095367431640625,
0.0163726806640625,
0.027679443359375,
-0.0018815994262695312,
0.003932952880859375,
0.0011157989501953125,
0.023895263671875,
-0.005832672119140625,
-0.010772705078125,
-0.037841796875,
-0.01427459716796875,
-0.0330810546875,
-0.033172607421875
]
] |
yangheng/deberta-v3-base-absa-v1.1 | 2023-09-09T18:58:09.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"text-classification",
"aspect-based-sentiment-analysis",
"PyABSA",
"en",
"dataset:laptop14",
"dataset:restaurant14",
"dataset:restaurant16",
"dataset:ACL-Twitter",
"dataset:MAMS",
"dataset:Television",
"dataset:TShirt",
"dataset:Yelp",
"arxiv:2208.01368",
"arxiv:2110.08604",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | yangheng | null | null | yangheng/deberta-v3-base-absa-v1.1 | 23 | 11,522 | transformers | 2022-03-18T23:58:16 |
---
language:
- en
tags:
- aspect-based-sentiment-analysis
- PyABSA
license: mit
datasets:
- laptop14
- restaurant14
- restaurant16
- ACL-Twitter
- MAMS
- Television
- TShirt
- Yelp
metrics:
- accuracy
- macro-f1
widget:
- text: "[CLS] when tables opened up, the manager sat another party before us. [SEP] manager [SEP] "
---
# Powered by [PyABSA](https://github.com/yangheng95/PyABSA): An open source tool for aspect-based sentiment analysis
This model was trained on 30k+ ABSA samples; see [ABSADatasets](https://github.com/yangheng95/ABSADatasets). The test sets were not included in pre-training, so you can use this model for training and benchmarking on common ABSA datasets, e.g., the Laptop14 and Rest14 datasets (except for the Rest15 dataset!).
## Usage
```python3
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline
# Load the ABSA model and tokenizer
model_name = "yangheng/deberta-v3-base-absa-v1.1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
for aspect in ['camera', 'phone']:
print(aspect, classifier('The camera quality of this phone is amazing.', text_pair=aspect))
```
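If you prefer to skip the pipeline helper, here is a minimal sketch of calling the model directly, encoding the sentence and aspect as a sentence pair to match the `[CLS] sentence [SEP] aspect [SEP]` format shown above (label names come from the checkpoint's `id2label` config):
```python3
import torch

# Reuses the tokenizer and model loaded above.
inputs = tokenizer("The camera quality of this phone is amazing.", "camera", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)[0]
for idx, p in enumerate(probs.tolist()):
    print(model.config.id2label[idx], round(p, 4))
```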
# DeBERTa for aspect-based sentiment analysis
`deberta-v3-base-absa` is a model for aspect-based sentiment analysis, trained on English datasets from [ABSADatasets](https://github.com/yangheng95/ABSADatasets).
## Training Model
This model was trained based on the FAST-LCF-BERT model with `microsoft/deberta-v3-base` as the backbone, which comes from [PyABSA](https://github.com/yangheng95/PyABSA).
To track state-of-the-art models, please see [PyABSA](https://github.com/yangheng95/PyABSA).
## Example in PyABSA
An [example](https://github.com/yangheng95/PyABSA/blob/release/demos/aspect_polarity_classification/train_apc_multilingual.py) of training FAST-LCF-BERT on PyABSA datasets.
## Datasets
This model was fine-tuned on 180k examples for the ABSA task (including augmented data). Training dataset files:
```
loading: integrated_datasets/apc_datasets/SemEval/laptop14/Laptops_Train.xml.seg
loading: integrated_datasets/apc_datasets/SemEval/restaurant14/Restaurants_Train.xml.seg
loading: integrated_datasets/apc_datasets/SemEval/restaurant16/restaurant_train.raw
loading: integrated_datasets/apc_datasets/ACL_Twitter/acl-14-short-data/train.raw
loading: integrated_datasets/apc_datasets/MAMS/train.xml.dat
loading: integrated_datasets/apc_datasets/Television/Television_Train.xml.seg
loading: integrated_datasets/apc_datasets/TShirt/Menstshirt_Train.xml.seg
loading: integrated_datasets/apc_datasets/Yelp/yelp.train.txt
```
If you use this model in your research, please cite our papers:
```
@article{YangL22,
author = {Heng Yang and
Ke Li},
title = {A Modularized Framework for Reproducible Aspect-based Sentiment Analysis},
journal = {CoRR},
volume = {abs/2208.01368},
year = {2022},
url = {https://doi.org/10.48550/arXiv.2208.01368},
doi = {10.48550/arXiv.2208.01368},
eprinttype = {arXiv},
eprint = {2208.01368},
timestamp = {Tue, 08 Nov 2022 21:46:32 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2208-01368.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
@article{YangZMT21,
author = {Heng Yang and
Biqing Zeng and
Mayi Xu and
Tianxing Wang},
title = {Back to Reality: Leveraging Pattern-driven Modeling to Enable Affordable
Sentiment Dependency Learning},
journal = {CoRR},
volume = {abs/2110.08604},
year = {2021},
url = {https://arxiv.org/abs/2110.08604},
eprinttype = {arXiv},
eprint = {2110.08604},
timestamp = {Fri, 22 Oct 2021 13:33:09 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2110-08604.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 4,002 | [
[
-0.031005859375,
-0.053497314453125,
0.029266357421875,
0.034271240234375,
-0.029449462890625,
-0.01326751708984375,
-0.007183074951171875,
-0.0207061767578125,
0.022491455078125,
0.02239990234375,
-0.042266845703125,
-0.05810546875,
-0.021759033203125,
0.007343292236328125,
-0.0049285888671875,
0.0782470703125,
-0.01184844970703125,
0.0141754150390625,
-0.0077667236328125,
-0.0364990234375,
-0.01244354248046875,
-0.053131103515625,
-0.038177490234375,
-0.00974273681640625,
0.0117034912109375,
0.024200439453125,
0.02099609375,
0.01128387451171875,
0.04046630859375,
0.0231170654296875,
-0.016021728515625,
0.0218048095703125,
-0.0005984306335449219,
0.0063018798828125,
-0.0011491775512695312,
-0.030059814453125,
-0.05426025390625,
0.00897216796875,
0.0386962890625,
0.046844482421875,
0.0035686492919921875,
0.0257415771484375,
0.01253509521484375,
0.058746337890625,
-0.0289306640625,
0.0273895263671875,
-0.031341552734375,
-0.00811767578125,
0.0009551048278808594,
0.0184783935546875,
-0.0100250244140625,
-0.041290283203125,
0.007244110107421875,
-0.0230712890625,
0.019561767578125,
0.00717926025390625,
0.1021728515625,
-0.005565643310546875,
-0.0089874267578125,
-0.00609588623046875,
-0.039794921875,
0.06787109375,
-0.08526611328125,
0.0117340087890625,
0.01617431640625,
-0.015411376953125,
0.00374603271484375,
-0.0382080078125,
-0.057586669921875,
-0.004138946533203125,
-0.01343536376953125,
0.0209503173828125,
-0.0046844482421875,
0.0013246536254882812,
0.007144927978515625,
0.056060791015625,
-0.03173828125,
-0.013427734375,
-0.0128021240234375,
0.0011348724365234375,
0.05126953125,
0.00891876220703125,
-0.0027484893798828125,
-0.054931640625,
-0.0171356201171875,
-0.02154541015625,
-0.03564453125,
0.0352783203125,
0.024139404296875,
0.032623291015625,
-0.032562255859375,
0.0308990478515625,
-0.020782470703125,
0.037567138671875,
0.01538848876953125,
-0.00531768798828125,
0.05804443359375,
-0.037384033203125,
-0.0139617919921875,
-0.01812744140625,
0.0872802734375,
0.02520751953125,
-0.022125244140625,
0.0216217041015625,
-0.0192718505859375,
-0.0263519287109375,
0.00440216064453125,
-0.053497314453125,
-0.0220947265625,
0.039215087890625,
-0.05072021484375,
-0.0224151611328125,
0.02374267578125,
-0.0621337890625,
-0.01392364501953125,
-0.0150909423828125,
0.03912353515625,
-0.03887939453125,
-0.035400390625,
-0.017120361328125,
0.0146331787109375,
0.0266876220703125,
0.0059814453125,
-0.036102294921875,
-0.009124755859375,
0.045379638671875,
0.0850830078125,
0.00366973876953125,
-0.0244903564453125,
0.0021419525146484375,
-0.03411865234375,
-0.029083251953125,
0.047119140625,
-0.01153564453125,
-0.031494140625,
0.0024471282958984375,
0.0216064453125,
-0.0007791519165039062,
-0.0350341796875,
0.061004638671875,
-0.01898193359375,
0.014739990234375,
-0.03887939453125,
-0.0265045166015625,
-0.049224853515625,
0.02557373046875,
-0.034210205078125,
0.086181640625,
0.005855560302734375,
-0.06048583984375,
0.022979736328125,
-0.03778076171875,
-0.042877197265625,
-0.0298309326171875,
0.003543853759765625,
-0.07525634765625,
0.01433563232421875,
0.03460693359375,
0.05047607421875,
-0.00946044921875,
0.02166748046875,
-0.03997802734375,
0.0011348724365234375,
0.0214080810546875,
-0.0135498046875,
0.06549072265625,
0.019989013671875,
-0.025390625,
0.0119781494140625,
-0.06451416015625,
-0.0030918121337890625,
0.0089263916015625,
-0.0277252197265625,
-0.03460693359375,
0.0181732177734375,
0.0033550262451171875,
0.0245513916015625,
0.0264892578125,
-0.050079345703125,
0.00445556640625,
-0.03973388671875,
0.0203704833984375,
0.0545654296875,
0.007198333740234375,
0.01953125,
-0.0282745361328125,
0.017486572265625,
0.039154052734375,
0.02386474609375,
-0.0170135498046875,
-0.027984619140625,
-0.0872802734375,
-0.0195465087890625,
0.01373291015625,
0.03887939453125,
-0.046844482421875,
0.0806884765625,
-0.0292510986328125,
-0.055511474609375,
-0.055816650390625,
-0.0012359619140625,
0.0225982666015625,
0.03314208984375,
0.03985595703125,
-0.04559326171875,
-0.047760009765625,
-0.060302734375,
-0.0031070709228515625,
-0.025543212890625,
0.00690460205078125,
0.032470703125,
0.0443115234375,
0.00975799560546875,
0.08392333984375,
-0.03924560546875,
-0.03802490234375,
-0.043121337890625,
0.021087646484375,
0.044921875,
0.04364013671875,
0.034271240234375,
-0.046966552734375,
-0.03582763671875,
-0.0186004638671875,
-0.051849365234375,
-0.003345489501953125,
-0.01861572265625,
-0.006641387939453125,
0.0297393798828125,
0.01617431640625,
-0.02789306640625,
0.0231475830078125,
0.03497314453125,
-0.037628173828125,
0.0469970703125,
0.0030193328857421875,
0.0159759521484375,
-0.11700439453125,
0.0088043212890625,
0.0280609130859375,
-0.0082244873046875,
-0.02288818359375,
-0.016448974609375,
0.0021038055419921875,
-0.009033203125,
-0.0234375,
0.0278167724609375,
-0.0313720703125,
0.0002378225326538086,
-0.00843048095703125,
-0.01058197021484375,
0.0115814208984375,
0.06011962890625,
0.01438140869140625,
0.03582763671875,
0.04632568359375,
-0.040557861328125,
0.01372528076171875,
0.0167999267578125,
-0.038238525390625,
0.043792724609375,
-0.051910400390625,
-0.004302978515625,
0.0072479248046875,
0.0247955322265625,
-0.06976318359375,
0.007007598876953125,
0.0419921875,
-0.036407470703125,
0.0159149169921875,
-0.0207366943359375,
-0.0268096923828125,
-0.027984619140625,
-0.04327392578125,
0.00804901123046875,
0.032623291015625,
-0.045745849609375,
0.053466796875,
0.01503753662109375,
-0.0149078369140625,
-0.045196533203125,
-0.04345703125,
-0.0227813720703125,
-0.01273345947265625,
-0.03106689453125,
0.017578125,
0.004436492919921875,
-0.0074615478515625,
0.01568603515625,
-0.01035308837890625,
-0.0005626678466796875,
-0.01103973388671875,
0.02740478515625,
0.0270233154296875,
-0.0189971923828125,
-0.003143310546875,
0.00792694091796875,
0.01316070556640625,
0.00470733642578125,
-0.004314422607421875,
0.040283203125,
-0.0318603515625,
-0.0227203369140625,
-0.048736572265625,
0.0194854736328125,
0.032379150390625,
-0.0182037353515625,
0.0726318359375,
0.05767822265625,
-0.00893402099609375,
0.002162933349609375,
-0.03887939453125,
-0.004161834716796875,
-0.032562255859375,
0.053070068359375,
-0.0183868408203125,
-0.03643798828125,
0.048797607421875,
0.0141448974609375,
0.0238800048828125,
0.039947509765625,
0.032470703125,
-0.007354736328125,
0.06329345703125,
0.041961669921875,
-0.024688720703125,
0.051361083984375,
-0.04290771484375,
0.0278167724609375,
-0.0794677734375,
-0.0158538818359375,
-0.035369873046875,
-0.01030731201171875,
-0.053192138671875,
-0.02099609375,
0.026031494140625,
0.0212249755859375,
-0.0278167724609375,
0.01439666748046875,
-0.0396728515625,
0.0017337799072265625,
0.04302978515625,
0.0223541259765625,
0.01177978515625,
0.0031414031982421875,
0.0034198760986328125,
0.005397796630859375,
-0.030181884765625,
-0.033203125,
0.08599853515625,
0.03289794921875,
0.058929443359375,
0.005641937255859375,
0.0552978515625,
0.0030994415283203125,
0.015594482421875,
-0.0699462890625,
0.0430908203125,
-0.0187835693359375,
-0.0386962890625,
0.0036258697509765625,
-0.034210205078125,
-0.046142578125,
0.0198211669921875,
-0.011138916015625,
-0.036346435546875,
0.05133056640625,
0.00496673583984375,
-0.0299072265625,
0.040863037109375,
-0.0457763671875,
0.05474853515625,
-0.01323699951171875,
-0.0270843505859375,
-0.01190185546875,
-0.032989501953125,
0.01824951171875,
0.01462554931640625,
0.0193023681640625,
-0.01439666748046875,
0.0146942138671875,
0.0709228515625,
-0.0465087890625,
0.07489013671875,
-0.03302001953125,
0.00910186767578125,
0.03399658203125,
-0.00756072998046875,
0.027923583984375,
0.01093292236328125,
0.0006275177001953125,
0.04351806640625,
-0.005733489990234375,
-0.02728271484375,
-0.0169219970703125,
0.055419921875,
-0.07708740234375,
-0.036224365234375,
-0.05633544921875,
-0.016265869140625,
-0.01458740234375,
0.0040130615234375,
0.024017333984375,
0.02899169921875,
-0.0038127899169921875,
0.0148468017578125,
0.041839599609375,
-0.0302276611328125,
0.04901123046875,
0.048309326171875,
-0.005107879638671875,
-0.0419921875,
0.07586669921875,
0.004520416259765625,
-0.0014896392822265625,
0.02410888671875,
0.0070037841796875,
-0.0165863037109375,
-0.043243408203125,
-0.0167388916015625,
0.02362060546875,
-0.04364013671875,
-0.0249176025390625,
-0.05157470703125,
-0.022064208984375,
-0.04254150390625,
0.008056640625,
-0.04669189453125,
-0.01392364501953125,
-0.032958984375,
-0.0042572021484375,
0.036773681640625,
0.031494140625,
-0.02349853515625,
0.0201416015625,
-0.049285888671875,
-0.0030307769775390625,
0.0109100341796875,
0.0178375244140625,
0.0151214599609375,
-0.045196533203125,
-0.03155517578125,
-0.00490570068359375,
-0.054718017578125,
-0.06256103515625,
0.050018310546875,
0.01070404052734375,
0.0243682861328125,
0.0286102294921875,
0.005420684814453125,
0.03704833984375,
-0.0063018798828125,
0.0872802734375,
0.01495361328125,
-0.07476806640625,
0.0465087890625,
-0.00739288330078125,
0.006107330322265625,
0.041412353515625,
0.03399658203125,
-0.040679931640625,
-0.0155029296875,
-0.04107666015625,
-0.1041259765625,
0.06585693359375,
0.00017917156219482422,
0.0111541748046875,
0.0024547576904296875,
0.0286712646484375,
0.0117645263671875,
0.011627197265625,
-0.0826416015625,
-0.038177490234375,
-0.049285888671875,
-0.032928466796875,
0.00449371337890625,
-0.01739501953125,
0.00463104248046875,
-0.0433349609375,
0.08282470703125,
0.005290985107421875,
0.034088134765625,
0.04669189453125,
-0.00943756103515625,
0.01128387451171875,
0.0233154296875,
0.01251220703125,
0.03961181640625,
-0.0286407470703125,
-0.019622802734375,
0.0012044906616210938,
-0.045684814453125,
0.004119873046875,
0.0203399658203125,
-0.0208740234375,
0.00917816162109375,
0.0192718505859375,
0.07513427734375,
-0.0167388916015625,
-0.0208892822265625,
0.038482666015625,
0.0019178390502929688,
-0.00293731689453125,
-0.043121337890625,
-0.01605224609375,
-0.01482391357421875,
0.0228118896484375,
0.04132080078125,
-0.0012254714965820312,
0.00020968914031982422,
-0.033477783203125,
-0.00725555419921875,
0.0281982421875,
-0.034088134765625,
-0.038177490234375,
0.0243377685546875,
0.017486572265625,
0.0059967041015625,
0.044464111328125,
-0.03973388671875,
-0.044647216796875,
0.041412353515625,
0.0119171142578125,
0.0814208984375,
-0.007068634033203125,
0.0254669189453125,
0.0572509765625,
0.02392578125,
-0.00885009765625,
0.0240631103515625,
0.0063323974609375,
-0.0550537109375,
-0.0238037109375,
-0.07403564453125,
-0.02203369140625,
0.033935546875,
-0.053924560546875,
0.0132293701171875,
-0.037322998046875,
-0.01528167724609375,
-0.0181732177734375,
0.0189208984375,
-0.03973388671875,
0.015960693359375,
-0.007671356201171875,
0.044647216796875,
-0.0771484375,
0.05157470703125,
0.0733642578125,
-0.06524658203125,
-0.07537841796875,
0.0216064453125,
-0.0086517333984375,
-0.0318603515625,
0.0208587646484375,
0.015777587890625,
0.018890380859375,
-0.023956298828125,
-0.03729248046875,
-0.049102783203125,
0.08355712890625,
-0.0171051025390625,
-0.034027099609375,
0.0139617919921875,
-0.00548553466796875,
0.032379150390625,
-0.029388427734375,
0.023712158203125,
0.034881591796875,
0.0187225341796875,
-0.007656097412109375,
-0.049896240234375,
0.0163116455078125,
-0.0214691162109375,
0.005069732666015625,
-0.0022983551025390625,
-0.06976318359375,
0.087158203125,
-0.0118865966796875,
0.009552001953125,
0.002880096435546875,
0.07354736328125,
0.017547607421875,
0.0252532958984375,
0.032257080078125,
0.053680419921875,
0.0318603515625,
-0.02484130859375,
0.05926513671875,
-0.01154327392578125,
0.05859375,
0.0771484375,
0.002941131591796875,
0.0733642578125,
0.0224609375,
-0.0389404296875,
0.06756591796875,
0.0743408203125,
-0.01090240478515625,
0.05108642578125,
-0.0191802978515625,
-0.01224517822265625,
-0.011932373046875,
0.00553131103515625,
-0.03228759765625,
0.02032470703125,
0.016021728515625,
-0.041900634765625,
0.002971649169921875,
-0.01044464111328125,
0.0283050537109375,
-0.0178070068359375,
-0.0291900634765625,
0.05499267578125,
0.0149993896484375,
-0.040557861328125,
0.038116455078125,
0.01160430908203125,
0.0869140625,
-0.045623779296875,
0.02008056640625,
-0.0435791015625,
0.02081298828125,
-0.0267181396484375,
-0.04608154296875,
0.0213623046875,
-0.00555419921875,
-0.01517486572265625,
0.013702392578125,
0.036590576171875,
-0.0248870849609375,
-0.057952880859375,
0.004093170166015625,
0.0289764404296875,
0.00008606910705566406,
-0.007724761962890625,
-0.0738525390625,
-0.01038360595703125,
0.01396942138671875,
-0.051361083984375,
0.016845703125,
0.03204345703125,
0.015777587890625,
0.0260009765625,
0.03668212890625,
0.00543212890625,
-0.018218994140625,
-0.00429534912109375,
0.05865478515625,
-0.05517578125,
-0.0513916015625,
-0.06585693359375,
0.041015625,
-0.0170135498046875,
-0.031341552734375,
0.063232421875,
0.0462646484375,
0.0616455078125,
-0.0016107559204101562,
0.07366943359375,
-0.042266845703125,
0.045745849609375,
-0.02825927734375,
0.057464599609375,
-0.056915283203125,
0.0258636474609375,
-0.029541015625,
-0.060333251953125,
0.0004718303680419922,
0.057861328125,
-0.03643798828125,
0.022796630859375,
0.0574951171875,
0.0645751953125,
-0.006656646728515625,
-0.00048661231994628906,
-0.011962890625,
0.045074462890625,
0.031951904296875,
0.0254669189453125,
0.04010009765625,
-0.039093017578125,
0.034698486328125,
-0.050567626953125,
-0.024261474609375,
-0.0175018310546875,
-0.05670166015625,
-0.06195068359375,
-0.06610107421875,
-0.032379150390625,
-0.0506591796875,
-0.0108642578125,
0.07568359375,
0.04705810546875,
-0.08038330078125,
-0.01471710205078125,
-0.005023956298828125,
0.0139923095703125,
-0.038543701171875,
-0.027435302734375,
0.059783935546875,
-0.031829833984375,
-0.0550537109375,
-0.00439453125,
-0.0038356781005859375,
0.006549835205078125,
-0.0270538330078125,
0.0114898681640625,
-0.02197265625,
0.0035533905029296875,
0.04339599609375,
0.01812744140625,
-0.043975830078125,
-0.005695343017578125,
-0.0119781494140625,
0.004856109619140625,
0.0228424072265625,
0.039215087890625,
-0.041290283203125,
0.009368896484375,
0.037872314453125,
0.01459503173828125,
0.0243072509765625,
0.00399017333984375,
0.0162353515625,
-0.064697265625,
0.0157623291015625,
0.005535125732421875,
0.0345458984375,
0.042022705078125,
-0.0177001953125,
0.023590087890625,
0.0345458984375,
-0.0374755859375,
-0.05108642578125,
-0.00727081298828125,
-0.07806396484375,
-0.031402587890625,
0.094970703125,
-0.04071044921875,
-0.0270233154296875,
-0.00455474853515625,
-0.006931304931640625,
0.026397705078125,
-0.054595947265625,
0.06854248046875,
0.054107666015625,
-0.00914764404296875,
0.004695892333984375,
-0.0177459716796875,
0.039031982421875,
0.03302001953125,
-0.05902099609375,
0.00469207763671875,
0.02960205078125,
0.03961181640625,
0.0191650390625,
0.052001953125,
-0.01153564453125,
0.00078582763671875,
-0.0160675048828125,
0.02703857421875,
-0.00547027587890625,
-0.001598358154296875,
-0.032196044921875,
0.002193450927734375,
-0.0023021697998046875,
-0.038482666015625
]
] |
Austism/chronos-hermes-13b | 2023-07-01T16:13:40.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"chatbot",
"storywriting",
"license:other",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | Austism | null | null | Austism/chronos-hermes-13b | 40 | 11,480 | transformers | 2023-06-13T02:36:03 | ---
license: other
tags:
- llama
- pytorch
- chatbot
- storywriting
---
([chronos-13b](https://huggingface.co/elinas/chronos-13b) + [Nous-Hermes-13b](https://huggingface.co/NousResearch/Nous-Hermes-13b)) 75/25 merge
This merge retains chronos's tendency to produce long, descriptive outputs, but with additional coherency and a better ability to obey instructions, resulting in a model with a strong knack for evocative storywriting and following a narrative.
This mix keeps a lot of chronos's writing style and 'flavour' with far less tendency to go AWOL and spout nonsensical babble.
This result was much more successful than my [first chronos merge](https://huggingface.co/Austism/chronos-wizardlm-uc-scot-st-13b). | 742 | [
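The merge script isn't published here, but a 75/25 blend of two same-architecture checkpoints is commonly implemented as a per-tensor weighted average of the weights. The sketch below is a hypothetical reconstruction under that assumption, not the author's actual recipe; the model IDs and the 75/25 ratio come from the line above.
```python
import torch
from transformers import AutoModelForCausalLM

# Load both parent models (fp16 to keep memory manageable).
chronos = AutoModelForCausalLM.from_pretrained("elinas/chronos-13b", torch_dtype=torch.float16)
hermes = AutoModelForCausalLM.from_pretrained("NousResearch/Nous-Hermes-13b", torch_dtype=torch.float16)

hermes_state = hermes.state_dict()
merged_state = {
    # 75% chronos + 25% Nous-Hermes, assuming identical architectures and keys.
    name: 0.75 * tensor + 0.25 * hermes_state[name]
    for name, tensor in chronos.state_dict().items()
}

chronos.load_state_dict(merged_state)
chronos.save_pretrained("chronos-hermes-13b")
```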
[
-0.047149658203125,
-0.032867431640625,
0.04925537109375,
0.0241241455078125,
-0.033172607421875,
0.0170135498046875,
-0.01812744140625,
-0.0648193359375,
0.058929443359375,
0.05035400390625,
-0.060638427734375,
-0.0225830078125,
-0.047271728515625,
0.022308349609375,
-0.04779052734375,
0.08636474609375,
-0.0073699951171875,
-0.002765655517578125,
0.01465606689453125,
-0.016693115234375,
-0.007076263427734375,
-0.031982421875,
-0.05780029296875,
-0.032958984375,
0.0699462890625,
0.0214691162109375,
0.055816650390625,
0.02874755859375,
0.02764892578125,
0.025543212890625,
-0.0303802490234375,
0.039520263671875,
-0.03460693359375,
0.03472900390625,
-0.0247650146484375,
-0.0037784576416015625,
-0.0640869140625,
-0.001483917236328125,
0.048248291015625,
0.04327392578125,
-0.01497650146484375,
-0.0031490325927734375,
0.0248870849609375,
0.0367431640625,
-0.0180206298828125,
-0.0106201171875,
0.007511138916015625,
0.0134735107421875,
-0.01519775390625,
-0.0335693359375,
-0.0244903564453125,
-0.042724609375,
0.0250091552734375,
-0.0843505859375,
0.0161895751953125,
0.021148681640625,
0.07135009765625,
-0.0233154296875,
-0.042694091796875,
-0.0296478271484375,
-0.0567626953125,
0.06591796875,
-0.03973388671875,
0.01812744140625,
0.0113677978515625,
0.0295562744140625,
-0.0069122314453125,
-0.045379638671875,
-0.040069580078125,
-0.0291290283203125,
-0.0030670166015625,
0.022216796875,
-0.01497650146484375,
-0.0179290771484375,
0.0111541748046875,
0.060028076171875,
-0.033355712890625,
0.0164794921875,
-0.053619384765625,
-0.0252227783203125,
0.053802490234375,
0.038726806640625,
0.032196044921875,
-0.0079193115234375,
-0.031951904296875,
-0.03741455078125,
-0.0116119384765625,
-0.01178741455078125,
0.053619384765625,
0.0177154541015625,
-0.03887939453125,
0.07135009765625,
-0.032501220703125,
0.022705078125,
0.038909912109375,
-0.00811767578125,
0.018341064453125,
-0.046417236328125,
-0.01190185546875,
0.01739501953125,
0.049072265625,
0.047271728515625,
0.01477813720703125,
-0.004711151123046875,
0.00580596923828125,
0.02337646484375,
0.0048980712890625,
-0.06829833984375,
-0.0091400146484375,
0.01727294921875,
-0.039886474609375,
-0.0360107421875,
0.0108642578125,
-0.05157470703125,
-0.03338623046875,
-0.033477783203125,
-0.006999969482421875,
-0.0687255859375,
-0.0145263671875,
0.0073089599609375,
-0.042510986328125,
0.01335906982421875,
0.051666259765625,
-0.061798095703125,
0.03863525390625,
0.037322998046875,
0.034881591796875,
-0.003864288330078125,
-0.0216827392578125,
-0.037933349609375,
0.0108642578125,
-0.04638671875,
0.034759521484375,
-0.01210784912109375,
-0.032012939453125,
-0.0137939453125,
-0.0007929801940917969,
0.00850677490234375,
-0.0211944580078125,
0.052215576171875,
-0.02398681640625,
0.04254150390625,
-0.01174163818359375,
-0.046173095703125,
-0.02294921875,
-0.005931854248046875,
-0.059417724609375,
0.06658935546875,
0.01465606689453125,
-0.035675048828125,
0.0288543701171875,
-0.019805908203125,
-0.0360107421875,
0.0208892822265625,
0.010498046875,
-0.01319122314453125,
0.023193359375,
-0.0033740997314453125,
0.02728271484375,
-0.0238037109375,
0.00841522216796875,
-0.0413818359375,
-0.0034027099609375,
0.029937744140625,
0.00463104248046875,
0.0487060546875,
0.028564453125,
-0.0033054351806640625,
0.0018873214721679688,
-0.045989990234375,
0.00103759765625,
0.01554107666015625,
-0.00366973876953125,
-0.023834228515625,
-0.023895263671875,
0.0254974365234375,
0.0033664703369140625,
0.032379150390625,
-0.0262908935546875,
0.0237274169921875,
-0.0178680419921875,
0.0236663818359375,
0.022491455078125,
0.0069122314453125,
0.0799560546875,
-0.06103515625,
0.04510498046875,
-0.00611114501953125,
0.01013946533203125,
-0.01264190673828125,
-0.0467529296875,
-0.043060302734375,
-0.024658203125,
0.0038738250732421875,
0.04425048828125,
-0.05499267578125,
0.039093017578125,
0.0016794204711914062,
-0.0653076171875,
-0.0174102783203125,
0.01546478271484375,
0.032958984375,
0.025360107421875,
0.0143280029296875,
-0.051177978515625,
-0.04833984375,
-0.045806884765625,
-0.0015764236450195312,
0.0031948089599609375,
0.000004410743713378906,
0.0187225341796875,
0.0208282470703125,
0.0028018951416015625,
0.0272216796875,
-0.033538818359375,
-0.00745391845703125,
-0.051300048828125,
0.019317626953125,
0.017822265625,
0.0300140380859375,
0.05816650390625,
-0.0309295654296875,
-0.0304107666015625,
0.0220947265625,
-0.057830810546875,
-0.00827789306640625,
-0.0064544677734375,
-0.005340576171875,
0.014312744140625,
0.003406524658203125,
-0.0631103515625,
0.037872314453125,
0.027679443359375,
-0.040985107421875,
0.046417236328125,
-0.0221405029296875,
0.045135498046875,
-0.09967041015625,
-0.00762939453125,
0.00333404541015625,
-0.0006814002990722656,
-0.036590576171875,
0.0012912750244140625,
-0.0069122314453125,
0.0006537437438964844,
-0.034942626953125,
0.058990478515625,
-0.047607421875,
0.01629638671875,
-0.0117034912109375,
0.02880859375,
0.0159759521484375,
0.018157958984375,
0.00865936279296875,
0.01145172119140625,
0.044097900390625,
-0.051361083984375,
0.040679931640625,
0.033721923828125,
-0.00044918060302734375,
0.0567626953125,
-0.0301361083984375,
-0.0221710205078125,
-0.0039215087890625,
0.038818359375,
-0.06658935546875,
-0.0289459228515625,
0.0236358642578125,
-0.0291290283203125,
0.0191650390625,
0.0201263427734375,
-0.050994873046875,
-0.034942626953125,
-0.0452880859375,
0.0316162109375,
0.0399169921875,
-0.0227508544921875,
0.05352783203125,
-0.0250091552734375,
-0.038818359375,
-0.01233673095703125,
-0.053985595703125,
0.01528167724609375,
-0.03167724609375,
-0.03826904296875,
0.0556640625,
-0.021148681640625,
0.0011959075927734375,
0.0093536376953125,
0.014373779296875,
-0.01079559326171875,
-0.0307464599609375,
0.012420654296875,
0.04779052734375,
-0.03277587890625,
-0.03851318359375,
0.02264404296875,
-0.005245208740234375,
-0.0345458984375,
0.01116180419921875,
0.04443359375,
0.01178741455078125,
-0.0025463104248046875,
-0.08184814453125,
0.046356201171875,
0.08941650390625,
0.0100555419921875,
0.05194091796875,
0.0372314453125,
-0.01317596435546875,
-0.0115509033203125,
-0.040435791015625,
-0.0228118896484375,
-0.029266357421875,
0.0013971328735351562,
-0.0165252685546875,
-0.0634765625,
0.05511474609375,
0.01482391357421875,
-0.009185791015625,
0.05853271484375,
0.02557373046875,
-0.01690673828125,
0.04925537109375,
0.044464111328125,
0.0022296905517578125,
0.038909912109375,
-0.01154327392578125,
0.0300445556640625,
-0.0626220703125,
-0.023284912109375,
-0.021881103515625,
-0.024322509765625,
-0.0545654296875,
-0.0007996559143066406,
-0.01311492919921875,
0.022064208984375,
-0.0113677978515625,
0.06585693359375,
-0.03924560546875,
0.0097503662109375,
0.059234619140625,
0.006725311279296875,
0.02117919921875,
-0.0099334716796875,
-0.00323486328125,
-0.01465606689453125,
-0.03961181640625,
-0.03411865234375,
0.06500244140625,
0.014190673828125,
0.054046630859375,
0.023834228515625,
0.07470703125,
0.00696563720703125,
-0.0009131431579589844,
-0.0158538818359375,
0.049530029296875,
-0.0103607177734375,
-0.0494384765625,
-0.0180511474609375,
-0.0235443115234375,
-0.06427001953125,
0.0017566680908203125,
-0.052703857421875,
-0.055694580078125,
0.0162506103515625,
-0.00846099853515625,
-0.0489501953125,
0.00518798828125,
-0.040069580078125,
0.045318603515625,
-0.007350921630859375,
-0.034393310546875,
-0.0364990234375,
-0.0396728515625,
0.027496337890625,
0.0064697265625,
-0.002765655517578125,
0.00629425048828125,
0.0017347335815429688,
0.049560546875,
-0.031494140625,
0.05572509765625,
0.04583740234375,
0.007160186767578125,
0.037872314453125,
0.02313232421875,
-0.00579833984375,
-0.0009756088256835938,
0.0199737548828125,
-0.00458526611328125,
0.0176239013671875,
-0.0207672119140625,
-0.0296783447265625,
0.0518798828125,
-0.04345703125,
-0.0011873245239257812,
-0.0589599609375,
-0.044158935546875,
0.021942138671875,
0.0023479461669921875,
0.0399169921875,
0.06414794921875,
-0.03936767578125,
0.007259368896484375,
0.036956787109375,
-0.0234527587890625,
0.04266357421875,
0.03857421875,
-0.040618896484375,
-0.08111572265625,
0.0231781005859375,
-0.00023257732391357422,
0.0221405029296875,
0.0130462646484375,
0.01027679443359375,
-0.0235443115234375,
0.00997161865234375,
-0.038055419921875,
0.041107177734375,
-0.0205841064453125,
-0.0195465087890625,
-0.031280517578125,
-0.048004150390625,
-0.046539306640625,
-0.050750732421875,
-0.0168304443359375,
-0.06402587890625,
-0.019256591796875,
-0.0159759521484375,
0.062042236328125,
0.060943603515625,
-0.0283355712890625,
0.04510498046875,
-0.0694580078125,
0.034637451171875,
0.00031876564025878906,
-0.003559112548828125,
0.019866943359375,
-0.04571533203125,
-0.0048828125,
-0.0285491943359375,
-0.044647216796875,
-0.097412109375,
0.041351318359375,
-0.00595855712890625,
0.034942626953125,
0.049957275390625,
-0.002227783203125,
0.042694091796875,
-0.01549530029296875,
0.060821533203125,
0.04925537109375,
-0.06787109375,
0.033355712890625,
-0.057891845703125,
0.0201263427734375,
0.0264434814453125,
0.01087188720703125,
-0.0302886962890625,
-0.064453125,
-0.08148193359375,
-0.072998046875,
0.06060791015625,
0.049163818359375,
-0.003078460693359375,
0.000782012939453125,
0.00943756103515625,
0.0007939338684082031,
0.0259246826171875,
-0.0479736328125,
-0.0308990478515625,
0.007770538330078125,
-0.0157318115234375,
-0.019500732421875,
-0.028533935546875,
-0.02838134765625,
-0.03228759765625,
0.05316162109375,
0.031494140625,
0.0030841827392578125,
0.0141448974609375,
0.0169677734375,
-0.01316070556640625,
0.0205841064453125,
0.04656982421875,
0.0242156982421875,
-0.0119781494140625,
0.0052337646484375,
0.0123748779296875,
-0.02801513671875,
-0.01308441162109375,
0.01343536376953125,
0.012176513671875,
0.00038814544677734375,
0.053985595703125,
0.03485107421875,
0.02667236328125,
-0.031494140625,
0.0113525390625,
-0.003643035888671875,
-0.01546478271484375,
-0.01027679443359375,
0.0108642578125,
0.024688720703125,
0.02935791015625,
0.023651123046875,
-0.0041046142578125,
0.0140533447265625,
-0.058349609375,
0.0227508544921875,
0.004535675048828125,
-0.004772186279296875,
-0.018280029296875,
0.052032470703125,
0.019500732421875,
-0.01073455810546875,
0.03887939453125,
-0.0294952392578125,
-0.037353515625,
0.060943603515625,
0.049163818359375,
0.07293701171875,
-0.0386962890625,
0.018829345703125,
0.042144775390625,
0.0213623046875,
-0.0115203857421875,
0.0218505859375,
-0.0222320556640625,
-0.033660888671875,
-0.0281982421875,
-0.0372314453125,
-0.033294677734375,
-0.0233001708984375,
-0.0634765625,
0.04144287109375,
-0.035308837890625,
-0.015045166015625,
0.00673675537109375,
0.0205535888671875,
-0.037261962890625,
0.03533935546875,
0.0172271728515625,
0.0634765625,
-0.07977294921875,
0.045501708984375,
0.04095458984375,
-0.043731689453125,
-0.057891845703125,
-0.042755126953125,
0.0086212158203125,
-0.0254669189453125,
0.0256805419921875,
0.01016998291015625,
-0.007781982421875,
-0.037567138671875,
-0.029510498046875,
-0.06121826171875,
0.0960693359375,
0.01125335693359375,
-0.040435791015625,
0.010711669921875,
-0.019775390625,
0.0548095703125,
-0.059539794921875,
0.00897979736328125,
0.01629638671875,
0.034027099609375,
0.040863037109375,
-0.08856201171875,
0.00727081298828125,
-0.0185699462890625,
-0.0176544189453125,
0.018707275390625,
-0.03704833984375,
0.05572509765625,
-0.00896453857421875,
-0.0159149169921875,
0.056915283203125,
0.06256103515625,
0.0258331298828125,
0.06610107421875,
0.00919342041015625,
0.07635498046875,
0.04345703125,
-0.00885772705078125,
0.09405517578125,
-0.03656005859375,
0.01329803466796875,
0.0875244140625,
-0.0258331298828125,
0.040863037109375,
0.03265380859375,
0.0097198486328125,
0.047332763671875,
0.053009033203125,
0.01727294921875,
0.048187255859375,
-0.0182342529296875,
-0.0180511474609375,
-0.0129241943359375,
-0.0014400482177734375,
-0.053253173828125,
-0.002147674560546875,
-0.0011310577392578125,
-0.028778076171875,
-0.0213623046875,
-0.017486572265625,
0.02130126953125,
0.0156097412109375,
-0.00566864013671875,
0.044464111328125,
0.0282440185546875,
-0.049072265625,
-0.00437164306640625,
-0.0113067626953125,
0.037567138671875,
-0.0634765625,
0.007602691650390625,
-0.0095062255859375,
0.00856781005859375,
-0.01149749755859375,
-0.0723876953125,
0.02880859375,
-0.028717041015625,
-0.0241851806640625,
-0.039154052734375,
0.0261993408203125,
-0.034942626953125,
-0.042236328125,
0.046630859375,
0.039794921875,
0.0003826618194580078,
0.0241241455078125,
-0.0240478515625,
0.0088958740234375,
-0.014923095703125,
0.00878143310546875,
0.01519012451171875,
0.04827880859375,
0.0057830810546875,
0.03521728515625,
0.04705810546875,
0.0115203857421875,
-0.004840850830078125,
0.0008854866027832031,
0.037384033203125,
-0.05194091796875,
-0.0555419921875,
-0.04766845703125,
0.028900146484375,
-0.0299224853515625,
-0.060394287109375,
0.07635498046875,
0.05438232421875,
0.029571533203125,
-0.00983428955078125,
0.038177490234375,
-0.0099639892578125,
0.0311431884765625,
-0.0197601318359375,
0.045318603515625,
-0.052978515625,
-0.0146636962890625,
-0.032623291015625,
-0.08306884765625,
-0.011932373046875,
0.034027099609375,
-0.021728515625,
-0.0010471343994140625,
0.07391357421875,
0.032196044921875,
0.0024204254150390625,
0.02313232421875,
0.01479339599609375,
0.007793426513671875,
-0.0247802734375,
0.04229736328125,
0.08135986328125,
-0.0297698974609375,
0.012969970703125,
-0.016326904296875,
-0.027679443359375,
-0.0300140380859375,
-0.07366943359375,
-0.05853271484375,
-0.0447998046875,
-0.027313232421875,
-0.052734375,
0.0005145072937011719,
0.0394287109375,
0.042999267578125,
-0.0447998046875,
-0.0322265625,
0.01776123046875,
-0.005161285400390625,
-0.02728271484375,
-0.016204833984375,
0.0005121231079101562,
0.00730133056640625,
-0.0631103515625,
0.0027618408203125,
0.037017822265625,
0.0235137939453125,
0.0085906982421875,
-0.02264404296875,
0.019256591796875,
0.03338623046875,
0.049560546875,
0.0248260498046875,
-0.049713134765625,
0.01042938232421875,
0.003814697265625,
-0.032958984375,
-0.01274871826171875,
0.047454833984375,
-0.0302886962890625,
0.0033855438232421875,
0.047149658203125,
0.007793426513671875,
0.053863525390625,
-0.008514404296875,
0.060760498046875,
-0.0008263587951660156,
0.00466156005859375,
0.00445556640625,
0.0440673828125,
0.0260009765625,
-0.033050537109375,
0.0309295654296875,
0.01324462890625,
-0.032135009765625,
-0.040802001953125,
0.0177154541015625,
-0.1361083984375,
-0.00901031494140625,
0.0740966796875,
0.0345458984375,
-0.0066375732421875,
0.0304718017578125,
-0.040313720703125,
0.026458740234375,
-0.04254150390625,
0.048248291015625,
0.0712890625,
-0.027679443359375,
0.008453369140625,
-0.0223236083984375,
0.0084075927734375,
0.027496337890625,
-0.03759765625,
-0.022216796875,
0.052520751953125,
0.018157958984375,
0.027130126953125,
0.042083740234375,
-0.0202789306640625,
0.0294647216796875,
0.0001289844512939453,
0.017578125,
0.0193634033203125,
-0.027740478515625,
-0.006214141845703125,
0.01861572265625,
-0.0043182373046875,
-0.0142669677734375
]
] |
StudentLLM/Alpagasus-2-13b-QLoRA-merged | 2023-09-15T07:07:30.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"license:other",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | StudentLLM | null | null | StudentLLM/Alpagasus-2-13b-QLoRA-merged | 2 | 11,470 | transformers | 2023-09-02T04:38:10 | ---
license: other
language:
- en
---
## Model Details
This is an unofficial implementation of "[AlpaGasus: Training a better Alpaca with Fewer Data.](https://github.com/Lichang-Chen/AlpaGasus)" with [LLaMA2](https://huggingface.co/meta-llama/Llama-2-13b-hf) & QLoRA! Training code is available at our [repo](https://github.com/gauss5930/AlpaGasus2-QLoRA).
- **Developed by:** [Yunsang Yoo](https://huggingface.co/ryan0712) and [Hyunwoo Ko](https://huggingface.co/Cartinoe5930)
- **Model type:** Auto-regressive model
- **Language(s):** English
- **Base Model:** [meta-llama/Llama-2-13b-hf](https://huggingface.co/meta-llama/Llama-2-13b-hf)
- **License**: Non-Commercial Creative Commons license ([CC BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/))
### Training dataset
"StudentLLM/Alpagasus-2-13b-QLoRA-merged" used [gpt4life](https://github.com/gpt4life/alpagasus)'s gpt-3.5-turbo filtered dataset, 'alpaca_t45.json'.
Configuration of the dataset is as follows:
```
{
    'instruction': the instruction describing the task.
    'input': optional; additional context accompanying the instruction when available.
    'output': the answer to the instruction.
}
.
.
.
```
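As a minimal sketch of inspecting the data, assuming `alpaca_t45.json` is a standard Alpaca-style JSON list of such records (the file name comes from above; the local path is hypothetical):
```python
import json

with open("alpaca_t45.json", encoding="utf-8") as f:
    records = json.load(f)

print(f"{len(records)} filtered examples")
example = records[0]
print(example["instruction"])
print(example.get("input", ""))  # 'input' is only occasionally present
print(example["output"])
```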
### Prompt Template: Alpaca style prompt
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
<prompt> (without the <>)
### Input:
<prompt> (if input exists)
### Response:
```
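A small helper that renders this template might look like the sketch below; the wording mirrors the block above, and the exact whitespace used during finetuning is an assumption:
```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Render the Alpaca-style prompt shown above; the Input section
    is emitted only when input_text is non-empty."""
    header = ("Below is an instruction that describes a task. "
              "Write a response that appropriately completes the request.")
    if input_text:
        return (f"{header}\n\n### Instruction:\n{instruction}\n\n"
                f"### Input:\n{input_text}\n\n### Response:\n")
    return f"{header}\n\n### Instruction:\n{instruction}\n\n### Response:\n"
```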
### Fine-tuning Procedure
Our model was finetuned using QLoRA on single A100 80GB GPU. Training details are described in [repo](https://github.com/gauss5930/AlpaGasus2-QLoRA).
### Benchmark Metrics
"StudentLLM/Alpagasus-2-13b-QLoRA-merged" model performance is uploaded on Huggingface's [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). Model was evaluated on the tasks specified in HF's Open LLM Leaderboard(ARC, HellaSwag, MMLU, TruthfulQA).
| Metric | Value |
|-----------------------|-------|
| Avg. | 59.34 |
| MMLU | 55.27 |
| ARC | 61.09 |
| HellaSwag | 82.46 |
| TruthfulQA | 38.53 |
### LLM Evaluation
We tried to follow the evaluation protocol introduced by the AlpaGasus paper. During the process, we consulted the code by [gpt4life](https://github.com/gpt4life/alpagasus). We used OpenAI's gpt-3.5-turbo as the evaluator model and Alpaca2-LoRA-13B (which no longer exists) as the comparison model. For more detailed information, please refer to our GitHub [repo](https://github.com/gauss5930/AlpaGasus2-QLoRA).
The evaluation result of AlpaGasus2-QLoRA is as follows:

### How to use
To use "StudentLLM/Alpagasus-2-13b-QLoRA-merged", please follow the following code! The use of the 7B model is the same!
```python
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
config = PeftConfig.from_pretrained("StudentLLM/Alpagasus-2-13B-QLoRA")
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-13b-hf", use_auth_token="your_HuggingFace_token").to(device)
model = PeftModel.from_pretrained(model, "StudentLLM/Alpagasus-2-13B-QLoRA")
tokenizer = AutoTokenizer.from_pretrained("StudentLLM/Alpagasus-2-13B-QLoRA")
tokenizer.pad_token = tokenizer.eos_token
input_data = "Please tell me 3 ways to relieve stress." # You can enter any questions!!
model_inputs = tokenizer(input_data, return_tensors='pt').to(device)
model_output = model.generate(**model_inputs, max_length=256)
model_output = tokenizer.decode(model_output[0], skip_special_tokens=True)
print(model_output)
```
### Citations
```bibtex
@article{chen2023alpagasus,
title={AlpaGasus: Training a Better Alpaca with Fewer Data},
  author={Lichang Chen and Shiyang Li and Jun Yan and Hai Wang and Kalpa Gunaratna and Vikas Yadav and Zheng Tang and Vijay Srinivasan and Tianyi Zhou and Heng Huang and Hongxia Jin},
journal={arXiv preprint arXiv:2307.08701},
year={2023}
}
``` | 4,153 | [
[
-0.0335693359375,
-0.051239013671875,
0.019683837890625,
0.01537322998046875,
-0.03082275390625,
-0.01079559326171875,
-0.01458740234375,
-0.040924072265625,
0.01401519775390625,
0.01100921630859375,
-0.04925537109375,
-0.040863037109375,
-0.04095458984375,
-0.0008053779602050781,
-0.01507568359375,
0.09344482421875,
-0.02423095703125,
-0.0012187957763671875,
0.007579803466796875,
-0.02313232421875,
-0.0196075439453125,
-0.0293731689453125,
-0.054351806640625,
-0.03704833984375,
0.032989501953125,
0.0022830963134765625,
0.039886474609375,
0.043853759765625,
0.034027099609375,
0.02178955078125,
-0.01293182373046875,
0.0175933837890625,
-0.03204345703125,
-0.027740478515625,
0.022918701171875,
-0.03753662109375,
-0.053009033203125,
0.0133514404296875,
0.04736328125,
0.0146026611328125,
-0.0206298828125,
0.03594970703125,
0.0218658447265625,
0.037322998046875,
-0.040252685546875,
0.0225677490234375,
-0.0302886962890625,
0.007251739501953125,
-0.00775909423828125,
-0.0030879974365234375,
-0.005992889404296875,
-0.037933349609375,
-0.0038623809814453125,
-0.05694580078125,
0.0163116455078125,
-0.00826263427734375,
0.09454345703125,
0.032257080078125,
-0.018035888671875,
-0.021728515625,
-0.035064697265625,
0.043792724609375,
-0.072265625,
0.01983642578125,
0.03857421875,
0.010589599609375,
-0.0092620849609375,
-0.0445556640625,
-0.050750732421875,
-0.00958251953125,
-0.0135498046875,
0.00858306884765625,
-0.020263671875,
-0.0032176971435546875,
0.0138702392578125,
0.028778076171875,
-0.0435791015625,
0.005992889404296875,
-0.044677734375,
-0.0079193115234375,
0.049591064453125,
0.005374908447265625,
-0.005062103271484375,
-0.0091705322265625,
-0.038665771484375,
-0.032623291015625,
-0.043792724609375,
0.02783203125,
0.03656005859375,
0.02093505859375,
-0.038360595703125,
0.035797119140625,
-0.014190673828125,
0.039581298828125,
0.0129241943359375,
-0.031402587890625,
0.055389404296875,
-0.0208282470703125,
-0.04290771484375,
-0.01270294189453125,
0.07366943359375,
0.0164642333984375,
-0.006855010986328125,
0.005828857421875,
-0.00959014892578125,
-0.00508880615234375,
-0.0109405517578125,
-0.0604248046875,
-0.0195465087890625,
0.020172119140625,
-0.037628173828125,
-0.0290985107421875,
0.002735137939453125,
-0.04736328125,
-0.004138946533203125,
-0.005237579345703125,
0.048431396484375,
-0.0280303955078125,
-0.01629638671875,
0.0107879638671875,
0.02069091796875,
0.037109375,
0.016754150390625,
-0.0521240234375,
0.0304107666015625,
0.0380859375,
0.0638427734375,
-0.0012350082397460938,
-0.038543701171875,
-0.025604248046875,
-0.01364898681640625,
-0.007366180419921875,
0.0482177734375,
-0.0004665851593017578,
-0.0235748291015625,
-0.0202484130859375,
0.0153045654296875,
-0.031280517578125,
-0.039306640625,
0.05682373046875,
-0.0287322998046875,
0.031585693359375,
-0.022552490234375,
-0.034698486328125,
-0.034759521484375,
0.012298583984375,
-0.0362548828125,
0.093505859375,
0.0200653076171875,
-0.048370361328125,
0.0137481689453125,
-0.05548095703125,
0.004230499267578125,
-0.01345062255859375,
0.005962371826171875,
-0.06231689453125,
-0.021270751953125,
0.0123748779296875,
0.025848388671875,
-0.0236968994140625,
0.0109405517578125,
-0.019287109375,
-0.03424072265625,
0.01226043701171875,
-0.01428985595703125,
0.07293701171875,
0.01557159423828125,
-0.039093017578125,
0.01467132568359375,
-0.05706787109375,
0.001789093017578125,
0.038848876953125,
-0.036163330078125,
0.0023345947265625,
-0.0237884521484375,
-0.00376129150390625,
0.01012420654296875,
0.0256195068359375,
-0.0285186767578125,
0.0312347412109375,
-0.019012451171875,
0.026458740234375,
0.052520751953125,
-0.0096435546875,
0.020782470703125,
-0.03253173828125,
0.0386962890625,
-0.012115478515625,
0.0335693359375,
0.0122528076171875,
-0.048095703125,
-0.07806396484375,
-0.01561737060546875,
0.0160675048828125,
0.0433349609375,
-0.038543701171875,
0.046966552734375,
0.0022563934326171875,
-0.06268310546875,
-0.039031982421875,
0.0216217041015625,
0.037353515625,
0.052978515625,
0.045013427734375,
-0.0258636474609375,
-0.03814697265625,
-0.059967041015625,
0.0096588134765625,
-0.02886962890625,
0.01325225830078125,
0.0166015625,
0.050933837890625,
-0.0242767333984375,
0.0467529296875,
-0.04779052734375,
-0.0294647216796875,
-0.00383758544921875,
0.00966644287109375,
0.03515625,
0.053985595703125,
0.0692138671875,
-0.020263671875,
-0.0239715576171875,
-0.0191650390625,
-0.05499267578125,
-0.006092071533203125,
0.00370025634765625,
-0.033355712890625,
0.015838623046875,
0.007366180419921875,
-0.061920166015625,
0.035186767578125,
0.039886474609375,
-0.04144287109375,
0.047576904296875,
-0.00934600830078125,
0.00713348388671875,
-0.063232421875,
0.020172119140625,
-0.002208709716796875,
-0.003009796142578125,
-0.03167724609375,
0.01493072509765625,
0.004077911376953125,
0.01031494140625,
-0.042938232421875,
0.044830322265625,
-0.0227203369140625,
0.0008635520935058594,
-0.02215576171875,
-0.019073486328125,
0.00989532470703125,
0.06451416015625,
-0.014556884765625,
0.0521240234375,
0.035125732421875,
-0.03466796875,
0.0260162353515625,
0.0308990478515625,
-0.0147552490234375,
0.0091705322265625,
-0.05853271484375,
0.016510009765625,
0.01091766357421875,
0.032684326171875,
-0.0655517578125,
-0.008331298828125,
0.053924560546875,
-0.02716064453125,
0.0165252685546875,
-0.00914764404296875,
-0.036468505859375,
-0.0460205078125,
-0.0333251953125,
0.042694091796875,
0.0498046875,
-0.05523681640625,
0.035186767578125,
0.002384185791015625,
0.007427215576171875,
-0.04718017578125,
-0.04150390625,
-0.028228759765625,
-0.032958984375,
-0.032135009765625,
0.0181121826171875,
-0.0230560302734375,
0.005168914794921875,
0.0021495819091796875,
0.0014286041259765625,
-0.002532958984375,
0.0095672607421875,
0.0185699462890625,
0.041473388671875,
-0.01122283935546875,
0.002597808837890625,
0.0072174072265625,
0.003681182861328125,
0.005092620849609375,
-0.00013780593872070312,
0.0582275390625,
-0.0290069580078125,
-0.02276611328125,
-0.049957275390625,
-0.00687408447265625,
0.0179595947265625,
-0.018280029296875,
0.06292724609375,
0.065673828125,
-0.0265655517578125,
0.0158233642578125,
-0.048095703125,
-0.00372314453125,
-0.0338134765625,
0.0143890380859375,
-0.019195556640625,
-0.046478271484375,
0.052398681640625,
0.0259246826171875,
0.01325225830078125,
0.06304931640625,
0.04364013671875,
0.0007991790771484375,
0.05572509765625,
0.037872314453125,
-0.0014028549194335938,
0.0364990234375,
-0.0635986328125,
-0.00624847412109375,
-0.07489013671875,
-0.03948974609375,
-0.033660888671875,
-0.0218963623046875,
-0.02996826171875,
-0.034820556640625,
0.022186279296875,
0.0253448486328125,
-0.040313720703125,
0.033477783203125,
-0.0460205078125,
0.01953125,
0.037353515625,
0.02117919921875,
0.01410675048828125,
0.006793975830078125,
-0.0010509490966796875,
0.0150604248046875,
-0.05133056640625,
-0.031402587890625,
0.093017578125,
0.04339599609375,
0.0479736328125,
0.0008654594421386719,
0.06805419921875,
0.0017004013061523438,
0.032470703125,
-0.049346923828125,
0.032470703125,
0.006740570068359375,
-0.0233154296875,
-0.00737762451171875,
-0.0279388427734375,
-0.08087158203125,
0.030548095703125,
-0.0181884765625,
-0.048309326171875,
0.0307159423828125,
0.01021575927734375,
-0.051971435546875,
0.024993896484375,
-0.049896240234375,
0.060455322265625,
-0.021697998046875,
-0.0260162353515625,
-0.00223541259765625,
-0.033447265625,
0.0423583984375,
-0.00879669189453125,
0.01120758056640625,
-0.0179901123046875,
-0.00252532958984375,
0.08856201171875,
-0.057647705078125,
0.055389404296875,
-0.0259246826171875,
-0.0191192626953125,
0.0467529296875,
-0.01666259765625,
0.03631591796875,
-0.00322723388671875,
-0.006252288818359375,
0.0275421142578125,
-0.0011053085327148438,
-0.0362548828125,
-0.02899169921875,
0.05279541015625,
-0.086669921875,
-0.0269622802734375,
-0.04388427734375,
-0.03338623046875,
-0.0012578964233398438,
0.003047943115234375,
0.0272979736328125,
0.0029697418212890625,
0.00347900390625,
-0.0102996826171875,
0.0269775390625,
-0.01232147216796875,
0.041259765625,
0.03192138671875,
-0.0215606689453125,
-0.04168701171875,
0.053131103515625,
0.005008697509765625,
0.0028247833251953125,
-0.00726318359375,
0.01183319091796875,
-0.0265960693359375,
-0.032928466796875,
-0.04345703125,
0.04364013671875,
-0.049407958984375,
-0.034637451171875,
-0.046234130859375,
-0.021759033203125,
-0.0311431884765625,
0.01031494140625,
-0.0275726318359375,
-0.0291748046875,
-0.040771484375,
-0.0118255615234375,
0.038970947265625,
0.043060302734375,
-0.01023101806640625,
0.04388427734375,
-0.0462646484375,
0.014190673828125,
0.02716064453125,
0.01387786865234375,
0.0162353515625,
-0.067138671875,
-0.0292510986328125,
0.00789642333984375,
-0.041046142578125,
-0.0679931640625,
0.04449462890625,
0.017669677734375,
0.042572021484375,
0.0258636474609375,
-0.0224609375,
0.06390380859375,
-0.01140594482421875,
0.051544189453125,
0.0169677734375,
-0.059173583984375,
0.044769287109375,
-0.0146484375,
0.0117645263671875,
0.040008544921875,
0.027099609375,
-0.003200531005859375,
-0.006809234619140625,
-0.0592041015625,
-0.06732177734375,
0.05755615234375,
0.0308685302734375,
-0.011566162109375,
0.0203857421875,
0.043365478515625,
0.01235198974609375,
0.012542724609375,
-0.08209228515625,
-0.032073974609375,
-0.02459716796875,
-0.002216339111328125,
-0.00548553466796875,
-0.00742340087890625,
-0.025726318359375,
-0.046356201171875,
0.0743408203125,
-0.0108795166015625,
0.031951904296875,
0.0175018310546875,
-0.0009698867797851562,
-0.022491455078125,
-0.0035247802734375,
0.039703369140625,
0.032501220703125,
-0.029510498046875,
-0.013458251953125,
0.022857666015625,
-0.041290283203125,
0.0186004638671875,
0.03375244140625,
-0.01387786865234375,
-0.007465362548828125,
0.0293731689453125,
0.0809326171875,
-0.001979827880859375,
-0.0309600830078125,
0.023284912109375,
-0.01502227783203125,
-0.0266571044921875,
-0.0206298828125,
0.0190887451171875,
0.003993988037109375,
0.03271484375,
0.038848876953125,
0.0040435791015625,
0.0006189346313476562,
-0.035369873046875,
-0.01001739501953125,
0.0225372314453125,
0.003940582275390625,
-0.025848388671875,
0.059722900390625,
0.0010166168212890625,
-0.005237579345703125,
0.03582763671875,
-0.032135009765625,
-0.035308837890625,
0.07275390625,
0.034088134765625,
0.058929443359375,
-0.0225677490234375,
0.006610870361328125,
0.050018310546875,
0.0163421630859375,
-0.0167083740234375,
0.028411865234375,
-0.00033593177795410156,
-0.042755126953125,
-0.011993408203125,
-0.06427001953125,
-0.01166534423828125,
0.031341552734375,
-0.0582275390625,
0.017822265625,
-0.030548095703125,
-0.022918701171875,
-0.0168914794921875,
0.028656005859375,
-0.063232421875,
0.020355224609375,
0.009429931640625,
0.056427001953125,
-0.07763671875,
0.06805419921875,
0.0435791015625,
-0.036102294921875,
-0.08197021484375,
-0.01505279541015625,
-0.009185791015625,
-0.07489013671875,
0.034271240234375,
0.0091705322265625,
0.004161834716796875,
-0.01070404052734375,
-0.040130615234375,
-0.08355712890625,
0.1146240234375,
0.03369140625,
-0.040008544921875,
-0.005252838134765625,
0.001117706298828125,
0.04376220703125,
-0.0204315185546875,
0.0247955322265625,
0.049957275390625,
0.031707763671875,
0.006542205810546875,
-0.0777587890625,
0.0138092041015625,
-0.030548095703125,
-0.016632080078125,
-0.0030345916748046875,
-0.08551025390625,
0.0916748046875,
-0.0219879150390625,
0.00016951560974121094,
0.0234375,
0.05810546875,
0.046722412109375,
0.0223236083984375,
0.0377197265625,
0.06646728515625,
0.06512451171875,
-0.005096435546875,
0.07586669921875,
-0.020843505859375,
0.06231689453125,
0.08050537109375,
0.000713348388671875,
0.058197021484375,
0.0235595703125,
-0.033905029296875,
0.04449462890625,
0.073486328125,
-0.019378662109375,
0.035919189453125,
-0.0025844573974609375,
-0.0083770751953125,
0.0025806427001953125,
0.0013303756713867188,
-0.058929443359375,
0.035919189453125,
0.0074005126953125,
-0.028472900390625,
-0.00928497314453125,
-0.0062255859375,
0.01483154296875,
-0.031982421875,
-0.018463134765625,
0.038482666015625,
0.0032329559326171875,
-0.0426025390625,
0.0869140625,
0.0096588134765625,
0.07086181640625,
-0.04608154296875,
0.0041046142578125,
-0.0231170654296875,
0.00417327880859375,
-0.031494140625,
-0.03173828125,
0.009063720703125,
0.006809234619140625,
0.004055023193359375,
0.01122283935546875,
0.040130615234375,
-0.032562255859375,
-0.045867919921875,
0.03375244140625,
0.037384033203125,
0.02197265625,
0.01001739501953125,
-0.07501220703125,
0.0205535888671875,
0.0011510848999023438,
-0.045196533203125,
0.02685546875,
-0.006561279296875,
0.01018524169921875,
0.04559326171875,
0.051055908203125,
-0.002140045166015625,
0.011993408203125,
0.002033233642578125,
0.0755615234375,
-0.0249481201171875,
-0.0272064208984375,
-0.0623779296875,
0.01380157470703125,
0.01708984375,
-0.040557861328125,
0.050384521484375,
0.0592041015625,
0.06573486328125,
0.00008511543273925781,
0.04193115234375,
-0.0067138671875,
0.019073486328125,
-0.043243408203125,
0.057464599609375,
-0.04840087890625,
0.016510009765625,
-0.0137176513671875,
-0.0775146484375,
0.003875732421875,
0.06671142578125,
-0.0187225341796875,
0.0176239013671875,
0.041748046875,
0.049591064453125,
-0.007755279541015625,
-0.0146026611328125,
-0.00972747802734375,
0.0197601318359375,
0.0222015380859375,
0.060089111328125,
0.03692626953125,
-0.06842041015625,
0.03857421875,
-0.048797607421875,
-0.0165557861328125,
-0.01349639892578125,
-0.04180908203125,
-0.05706787109375,
-0.0292510986328125,
-0.0290069580078125,
-0.037841796875,
-0.0030879974365234375,
0.08001708984375,
0.047515869140625,
-0.052520751953125,
-0.03253173828125,
0.005218505859375,
-0.00988006591796875,
-0.01580810546875,
-0.0162353515625,
0.05340576171875,
0.0015420913696289062,
-0.06280517578125,
0.022216796875,
-0.019378662109375,
0.034332275390625,
-0.0166473388671875,
-0.026580810546875,
-0.0275726318359375,
-0.0030517578125,
0.025543212890625,
0.0335693359375,
-0.047698974609375,
-0.00628662109375,
-0.0112457275390625,
-0.01385498046875,
0.025604248046875,
0.0250091552734375,
-0.063232421875,
0.001453399658203125,
0.0226593017578125,
0.0179901123046875,
0.056060791015625,
0.00354766845703125,
-0.0033359527587890625,
-0.032073974609375,
0.0275726318359375,
0.00020205974578857422,
0.044097900390625,
0.023040771484375,
-0.03704833984375,
0.058929443359375,
0.0241546630859375,
-0.038787841796875,
-0.0655517578125,
-0.01493072509765625,
-0.0849609375,
-0.006237030029296875,
0.08416748046875,
-0.025421142578125,
-0.0347900390625,
0.0251312255859375,
-0.025421142578125,
0.042022705078125,
-0.033233642578125,
0.06317138671875,
0.03070068359375,
-0.02874755859375,
-0.005382537841796875,
-0.036102294921875,
0.03302001953125,
0.03912353515625,
-0.07171630859375,
-0.0244903564453125,
0.01078033447265625,
0.03619384765625,
0.0109710693359375,
0.059661865234375,
0.005035400390625,
0.023223876953125,
-0.0171661376953125,
0.019683837890625,
-0.024932861328125,
-0.0019063949584960938,
-0.037384033203125,
-0.00289154052734375,
-0.00628662109375,
-0.0239105224609375
]
] |
ai-forever/rugpt3small_based_on_gpt2 | 2023-11-03T12:50:19.000Z | [
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"PyTorch",
"Transformers",
"ru",
"arxiv:2309.10931",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | ai-forever | null | null | ai-forever/rugpt3small_based_on_gpt2 | 22 | 11,466 | transformers | 2022-03-02T23:29:05 | ---
language:
- ru
tags:
- PyTorch
- Transformers
thumbnail: "https://github.com/sberbank-ai/ru-gpts"
---
# rugpt3small\_based\_on\_gpt2
The model architecture design, pretraining, and evaluation are documented in our preprint: [**A Family of Pretrained Transformer Language Models for Russian**](https://arxiv.org/abs/2309.10931).
The model was pretrained with a sequence length of 1024 using the Transformers library by the [SberDevices](https://sberdevices.ru/) team on 80B tokens for around 3 epochs. After that, the model was fine-tuned with a context size of 2048.
Training took around one week on 32 GPUs.
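The checkpoint can be loaded with the standard `transformers` causal-LM classes. Below is a minimal sketch of text generation; the prompt and sampling parameters are illustrative assumptions, not settings from the original card.
```python
# Minimal sketch: Russian text generation with this checkpoint.
# The prompt and generation settings below are only illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "ai-forever/rugpt3small_based_on_gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Александр Сергеевич Пушкин родился в "
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```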
# Authors
+ NLP core team RnD [Telegram channel](https://t.me/nlpcoreteam):
+ Dmitry Zmitrovich
# Cite us
```
@misc{zmitrovich2023family,
title={A Family of Pretrained Transformer Language Models for Russian},
author={Dmitry Zmitrovich and Alexander Abramov and Andrey Kalmykov and Maria Tikhonova and Ekaterina Taktasheva and Danil Astafurov and Mark Baushenko and Artem Snegirev and Tatiana Shavrina and Sergey Markov and Vladislav Mikhailov and Alena Fenogenova},
year={2023},
eprint={2309.10931},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 1,200 | [
[
-0.024017333984375,
-0.032501220703125,
0.0245819091796875,
0.0181427001953125,
-0.035888671875,
-0.0167236328125,
-0.0264739990234375,
-0.0157012939453125,
-0.035980224609375,
0.006114959716796875,
-0.035552978515625,
-0.00959014892578125,
-0.045379638671875,
-0.0069732666015625,
-0.01255035400390625,
0.10345458984375,
-0.01404571533203125,
0.0195159912109375,
0.021636962890625,
0.00555419921875,
-0.01526641845703125,
-0.030548095703125,
-0.0421142578125,
-0.033843994140625,
-0.00238800048828125,
0.0030670166015625,
0.039154052734375,
0.0450439453125,
0.0263671875,
0.0169830322265625,
-0.0103759765625,
-0.0272216796875,
-0.04150390625,
-0.020111083984375,
0.0020618438720703125,
-0.01947021484375,
-0.044677734375,
0.0038051605224609375,
0.054656982421875,
0.020050048828125,
-0.0251922607421875,
0.03155517578125,
0.0292510986328125,
0.0251617431640625,
-0.022979736328125,
0.015655517578125,
-0.047637939453125,
0.010162353515625,
-0.027587890625,
-0.0048675537109375,
-0.0543212890625,
0.005306243896484375,
0.0244598388671875,
-0.0421142578125,
0.0307464599609375,
-0.0028476715087890625,
0.0830078125,
0.01039886474609375,
-0.024078369140625,
0.007598876953125,
-0.0650634765625,
0.057861328125,
-0.0556640625,
0.040771484375,
0.0188751220703125,
0.030029296875,
-0.0021762847900390625,
-0.081787109375,
-0.04302978515625,
-0.006885528564453125,
-0.0257415771484375,
0.007205963134765625,
-0.0160064697265625,
-0.0017910003662109375,
0.03607177734375,
0.0247650146484375,
-0.072998046875,
-0.0008392333984375,
-0.025909423828125,
-0.01393890380859375,
0.02191162109375,
0.01143646240234375,
-0.006076812744140625,
-0.0074310302734375,
-0.03717041015625,
-0.0128936767578125,
-0.0513916015625,
-0.006221771240234375,
0.033203125,
0.00794219970703125,
-0.029327392578125,
0.025543212890625,
-0.004665374755859375,
0.053070068359375,
0.01317596435546875,
-0.0110015869140625,
0.0298309326171875,
-0.039825439453125,
-0.019378662109375,
-0.03558349609375,
0.078369140625,
-0.007415771484375,
0.0214080810546875,
-0.0229339599609375,
-0.0287628173828125,
-0.00615692138671875,
0.03582763671875,
-0.07647705078125,
-0.02105712890625,
-0.0005121231079101562,
-0.0286712646484375,
-0.007904052734375,
-0.005970001220703125,
-0.053802490234375,
0.0081939697265625,
-0.024261474609375,
0.044036865234375,
-0.038726806640625,
-0.027374267578125,
0.017974853515625,
0.01568603515625,
0.05902099609375,
8.344650268554688e-7,
-0.0838623046875,
0.0362548828125,
0.0628662109375,
0.059326171875,
-0.01233673095703125,
-0.0345458984375,
-0.013824462890625,
-0.01399993896484375,
-0.004261016845703125,
0.051666259765625,
-0.0186614990234375,
-0.0174407958984375,
-0.01226043701171875,
0.005382537841796875,
-0.0171051025390625,
-0.01361083984375,
0.038360595703125,
-0.042510986328125,
0.049072265625,
0.01415252685546875,
-0.02288818359375,
0.006267547607421875,
0.00933837890625,
-0.024627685546875,
0.08001708984375,
0.0292205810546875,
-0.0594482421875,
0.042999267578125,
-0.040771484375,
-0.0221099853515625,
0.0140228271484375,
-0.00064849853515625,
-0.046142578125,
-0.0013904571533203125,
0.012359619140625,
0.033477783203125,
-0.0328369140625,
0.034881591796875,
0.007740020751953125,
-0.028564453125,
-0.0150146484375,
-0.0263671875,
0.055999755859375,
0.0262908935546875,
-0.04754638671875,
0.018402099609375,
-0.058502197265625,
-0.0001856088638305664,
0.0196075439453125,
-0.021148681640625,
0.010589599609375,
-0.01117706298828125,
0.0231781005859375,
0.03106689453125,
0.0115966796875,
-0.0302886962890625,
0.0241241455078125,
-0.031463623046875,
0.032928466796875,
0.0537109375,
-0.028717041015625,
0.043853759765625,
-0.005939483642578125,
0.054534912109375,
-0.006397247314453125,
0.03680419921875,
-0.01593017578125,
-0.0294647216796875,
-0.057861328125,
-0.007549285888671875,
0.0306243896484375,
0.035797119140625,
-0.05413818359375,
0.041290283203125,
-0.0384521484375,
-0.048919677734375,
-0.022186279296875,
-0.0215301513671875,
0.041168212890625,
0.0294036865234375,
0.04022216796875,
-0.0214996337890625,
-0.041656494140625,
-0.06591796875,
-0.0006966590881347656,
-0.0125732421875,
-0.01727294921875,
-0.0016584396362304688,
0.04693603515625,
-0.01045989990234375,
0.064453125,
-0.031219482421875,
-0.0032520294189453125,
-0.040435791015625,
0.01381683349609375,
0.0322265625,
0.04510498046875,
0.047088623046875,
-0.03460693359375,
-0.04547119140625,
-0.00667572021484375,
-0.0225982666015625,
0.002765655517578125,
0.0016088485717773438,
-0.0033435821533203125,
0.0357666015625,
0.0084991455078125,
-0.0675048828125,
0.02825927734375,
0.043182373046875,
-0.035736083984375,
0.0692138671875,
-0.002292633056640625,
-0.01195526123046875,
-0.09356689453125,
0.0267486572265625,
-0.01007080078125,
-0.01175689697265625,
-0.06231689453125,
-0.0064544677734375,
-0.003543853759765625,
-0.01348114013671875,
-0.04656982421875,
0.05157470703125,
-0.0474853515625,
-0.01555633544921875,
-0.00722503662109375,
-0.0031108856201171875,
-0.01947021484375,
0.052093505859375,
0.0164794921875,
0.07110595703125,
0.04559326171875,
-0.03814697265625,
-0.008453369140625,
0.023529052734375,
-0.0408935546875,
0.01531982421875,
-0.07501220703125,
0.0279693603515625,
0.01007843017578125,
0.026336669921875,
-0.061798095703125,
0.005397796630859375,
0.0306243896484375,
-0.034637451171875,
0.039154052734375,
-0.0236968994140625,
-0.04638671875,
-0.039794921875,
0.007434844970703125,
0.048004150390625,
0.05975341796875,
-0.03802490234375,
0.0518798828125,
0.01416778564453125,
-0.0006203651428222656,
-0.050445556640625,
-0.0196380615234375,
-0.01210784912109375,
-0.0240325927734375,
-0.051727294921875,
0.02471923828125,
0.0078582763671875,
0.0106658935546875,
-0.0163421630859375,
-0.0033512115478515625,
-0.00782012939453125,
0.00283050537109375,
-0.001781463623046875,
0.0279388427734375,
-0.027587890625,
0.0020351409912109375,
-0.031707763671875,
-0.04132080078125,
-0.009613037109375,
-0.0303192138671875,
0.07977294921875,
-0.0355224609375,
-0.006244659423828125,
-0.039825439453125,
-0.006473541259765625,
0.0193328857421875,
-0.025421142578125,
0.0599365234375,
0.07318115234375,
-0.0176544189453125,
-0.00855255126953125,
-0.043212890625,
-0.018829345703125,
-0.03594970703125,
0.039581298828125,
-0.03277587890625,
-0.06536865234375,
0.033355712890625,
0.01522064208984375,
-0.00409698486328125,
0.045928955078125,
0.05126953125,
0.030364990234375,
0.052398681640625,
0.0458984375,
-0.002994537353515625,
0.033721923828125,
-0.033355712890625,
0.0248260498046875,
-0.06488037109375,
-0.0114898681640625,
-0.042144775390625,
-0.002025604248046875,
-0.0330810546875,
-0.035552978515625,
0.0035190582275390625,
0.005710601806640625,
-0.046173095703125,
0.04937744140625,
-0.025665283203125,
0.030548095703125,
0.01934814453125,
-0.022613525390625,
-0.0009317398071289062,
-0.00524139404296875,
-0.011749267578125,
-0.01311492919921875,
-0.06646728515625,
-0.053314208984375,
0.09210205078125,
0.03814697265625,
0.053192138671875,
-0.0016946792602539062,
0.0311126708984375,
-0.02386474609375,
0.032012939453125,
-0.06256103515625,
0.03948974609375,
0.00041556358337402344,
-0.05181884765625,
-0.0211029052734375,
-0.019989013671875,
-0.07232666015625,
0.0187225341796875,
-0.01068115234375,
-0.0577392578125,
-0.00939178466796875,
0.0224761962890625,
-0.02191162109375,
0.0202484130859375,
-0.04364013671875,
0.08172607421875,
-0.01297760009765625,
-0.0218658447265625,
-0.01361083984375,
-0.06353759765625,
0.029144287109375,
-0.0135650634765625,
-0.01303863525390625,
0.0179290771484375,
0.0220794677734375,
0.060882568359375,
-0.026580810546875,
0.033905029296875,
-0.022979736328125,
0.00402069091796875,
0.0037860870361328125,
-0.01348114013671875,
0.039764404296875,
0.003509521484375,
0.0041961669921875,
0.0241241455078125,
-0.011138916015625,
-0.0218658447265625,
-0.0299224853515625,
0.019317626953125,
-0.06829833984375,
-0.03070068359375,
-0.04681396484375,
-0.0211029052734375,
-0.0208587646484375,
0.034088134765625,
0.03814697265625,
0.045562744140625,
-0.022857666015625,
0.025115966796875,
0.0322265625,
-0.0015459060668945312,
0.051361083984375,
0.0509033203125,
-0.018463134765625,
-0.02362060546875,
0.0357666015625,
-0.0016412734985351562,
0.0196380615234375,
0.0257415771484375,
0.0106658935546875,
-0.0279998779296875,
-0.049163818359375,
-0.034820556640625,
0.043060302734375,
-0.0302886962890625,
-0.006580352783203125,
-0.052337646484375,
-0.025604248046875,
-0.038116455078125,
0.01837158203125,
-0.055633544921875,
-0.016815185546875,
-0.0246124267578125,
-0.004795074462890625,
-0.00396728515625,
0.062744140625,
0.00855255126953125,
0.0543212890625,
-0.053497314453125,
0.01299285888671875,
0.01377105712890625,
0.047210693359375,
0.0012350082397460938,
-0.07525634765625,
-0.033721923828125,
-0.0138397216796875,
-0.0281982421875,
-0.035186767578125,
0.0275421142578125,
0.010009765625,
0.0582275390625,
0.022979736328125,
-0.0238189697265625,
0.04095458984375,
-0.07354736328125,
0.070068359375,
0.0020389556884765625,
-0.07586669921875,
0.0193634033203125,
-0.0229949951171875,
0.0297698974609375,
0.026519775390625,
0.03839111328125,
-0.0244598388671875,
-0.00019168853759765625,
-0.05517578125,
-0.064453125,
0.07135009765625,
0.022186279296875,
0.001873016357421875,
0.016693115234375,
0.023193359375,
0.01496124267578125,
0.002033233642578125,
-0.06890869140625,
-0.02008056640625,
-0.03668212890625,
0.0071258544921875,
-0.02899169921875,
-0.039337158203125,
-0.0054779052734375,
-0.03277587890625,
0.0675048828125,
-0.005584716796875,
0.048126220703125,
-0.01065826416015625,
-0.0196685791015625,
0.0172576904296875,
0.0279998779296875,
0.07861328125,
0.0694580078125,
-0.01641845703125,
-0.00754547119140625,
0.01580810546875,
-0.05029296875,
0.01308441162109375,
0.0190887451171875,
0.00737762451171875,
0.0015592575073242188,
0.031829833984375,
0.10308837890625,
0.004352569580078125,
-0.01180267333984375,
0.057647705078125,
-0.0201873779296875,
-0.0270843505859375,
-0.034393310546875,
-0.019287109375,
-0.003948211669921875,
0.005809783935546875,
0.034027099609375,
0.004344940185546875,
-0.0124053955078125,
-0.005916595458984375,
0.027069091796875,
0.0189666748046875,
-0.0262908935546875,
-0.0635986328125,
0.047119140625,
-0.00162506103515625,
-0.0200042724609375,
0.048858642578125,
-0.01548004150390625,
-0.048828125,
0.010955810546875,
0.05999755859375,
0.07861328125,
-0.036285400390625,
0.01690673828125,
0.043212890625,
0.02685546875,
-0.0220794677734375,
-0.006969451904296875,
0.0011110305786132812,
-0.0556640625,
-0.04132080078125,
-0.0689697265625,
0.00012540817260742188,
0.03680419921875,
-0.050384521484375,
0.035369873046875,
-0.01233673095703125,
-0.01483154296875,
-0.027069091796875,
-0.00662994384765625,
-0.061187744140625,
0.021514892578125,
-0.0026340484619140625,
0.07623291015625,
-0.06414794921875,
0.07110595703125,
0.0418701171875,
-0.00275421142578125,
-0.0789794921875,
0.0124359130859375,
-0.01430511474609375,
-0.064208984375,
0.060577392578125,
0.0163421630859375,
0,
0.023956298828125,
-0.0227813720703125,
-0.062225341796875,
0.0819091796875,
0.0158233642578125,
-0.036224365234375,
-0.01483154296875,
0.01540374755859375,
0.06451416015625,
-0.0207061767578125,
0.0311126708984375,
0.057342529296875,
0.0286712646484375,
0.006999969482421875,
-0.082763671875,
-0.007415771484375,
-0.03973388671875,
0.0138397216796875,
0.016326904296875,
-0.041717529296875,
0.06634521484375,
-0.00728607177734375,
-0.032470703125,
0.0040435791015625,
0.041839599609375,
-0.0017423629760742188,
-0.0206451416015625,
0.04022216796875,
0.067626953125,
0.01861572265625,
-0.027679443359375,
0.08831787109375,
-0.046905517578125,
0.046661376953125,
0.08587646484375,
0.007175445556640625,
0.05657958984375,
0.032501220703125,
-0.040435791015625,
0.019927978515625,
0.0413818359375,
-0.005931854248046875,
0.050201416015625,
0.0183258056640625,
-0.006023406982421875,
-0.013885498046875,
0.0260162353515625,
-0.052337646484375,
0.036163330078125,
0.01100921630859375,
-0.005817413330078125,
-0.01396942138671875,
-0.007778167724609375,
0.0173492431640625,
-0.034027099609375,
0.0112457275390625,
0.050262451171875,
0.004894256591796875,
-0.056884765625,
0.052825927734375,
-0.007274627685546875,
0.043609619140625,
-0.062744140625,
0.0228424072265625,
-0.0152130126953125,
0.0184326171875,
0.0013933181762695312,
-0.03955078125,
0.0195465087890625,
-0.00335693359375,
-0.006488800048828125,
-0.028228759765625,
0.037628173828125,
-0.0272216796875,
-0.0225982666015625,
0.01329803466796875,
0.0215911865234375,
0.00243377685546875,
0.01085662841796875,
-0.048553466796875,
-0.004390716552734375,
-0.014434814453125,
-0.0516357421875,
0.0187530517578125,
0.013580322265625,
0.00862884521484375,
0.042144775390625,
0.038421630859375,
0.0038623809814453125,
0.00750732421875,
0.0089569091796875,
0.065673828125,
-0.021881103515625,
-0.038543701171875,
-0.06878662109375,
0.053863525390625,
0.022491455078125,
-0.046173095703125,
0.053863525390625,
0.048828125,
0.0670166015625,
-0.031219482421875,
0.04193115234375,
-0.0116119384765625,
0.0230865478515625,
-0.042144775390625,
0.05615234375,
-0.0277862548828125,
0.01136016845703125,
-0.01523590087890625,
-0.09527587890625,
-0.023956298828125,
0.048919677734375,
-0.032745361328125,
0.0174407958984375,
0.07110595703125,
0.05487060546875,
-0.0084686279296875,
-0.01446533203125,
0.00731658935546875,
0.0201568603515625,
0.034088134765625,
0.039581298828125,
0.060455322265625,
-0.04656982421875,
0.034881591796875,
-0.018463134765625,
-0.01480865478515625,
-0.00524139404296875,
-0.06829833984375,
-0.0621337890625,
-0.049591064453125,
-0.0028934478759765625,
-0.0206451416015625,
0.0017881393432617188,
0.06341552734375,
0.05340576171875,
-0.05609130859375,
-0.0223846435546875,
-0.01300811767578125,
-0.0301361083984375,
-0.00044035911560058594,
-0.0147857666015625,
0.03521728515625,
-0.02752685546875,
-0.041168212890625,
0.0112762451171875,
0.00418853759765625,
0.0197296142578125,
-0.01100921630859375,
-0.020751953125,
-0.0190582275390625,
-0.0211181640625,
0.0305938720703125,
0.0054779052734375,
-0.042755126953125,
-0.0287017822265625,
-0.0006017684936523438,
-0.011993408203125,
0.0205841064453125,
0.05841064453125,
-0.043548583984375,
0.02117919921875,
0.0391845703125,
0.03558349609375,
0.0556640625,
0.007579803466796875,
0.05340576171875,
-0.042144775390625,
0.0306396484375,
0.0129547119140625,
0.0288238525390625,
0.0241851806640625,
0.0017147064208984375,
0.0416259765625,
0.0289306640625,
-0.06402587890625,
-0.056396484375,
0.0178375244140625,
-0.07171630859375,
0.01314544677734375,
0.09661865234375,
-0.0258026123046875,
0.001308441162109375,
-0.004058837890625,
-0.006900787353515625,
0.02642822265625,
-0.00640106201171875,
0.040924072265625,
0.050384521484375,
0.0178375244140625,
-0.005889892578125,
-0.03515625,
0.061492919921875,
0.02435302734375,
-0.051788330078125,
-0.00691986083984375,
0.0087890625,
0.03497314453125,
0.0157318115234375,
0.05902099609375,
-0.01096343994140625,
0.0133514404296875,
0.0059051513671875,
0.0166778564453125,
-0.01041412353515625,
-0.034088134765625,
-0.03289794921875,
-0.00859832763671875,
-0.0023822784423828125,
0.004180908203125
]
] |
KoboldAI/LLaMA2-13B-Tiefighter | 2023-10-19T16:55:50.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"license:llama2",
"endpoints_compatible",
"text-generation-inference",
"region:us",
"has_space"
] | text-generation | KoboldAI | null | null | KoboldAI/LLaMA2-13B-Tiefighter | 18 | 11,463 | transformers | 2023-10-18T18:22:59 | ---
license: llama2
---
# LLaMA2-13B-Tiefighter
Tiefighter is a merged model created by merging two different LoRAs on top of a well-established existing merge.
To achieve this, the following recipe was used:
* We begin with the base model Undi95/Xwin-MLewd-13B-V0.2, which is a well-established merge; contrary to the name, this model does not have a strong NSFW bias.
* Then we applied the PocketDoc/Dans-RetroRodeo-13b LoRA, which is a finetune on the Choose Your Own Adventure datasets from our Skein model.
* After applying this LoRA, we merged the new model with PocketDoc/Dans-RetroRodeo-13b at 5% to weaken the newly introduced adventure bias.
* The resulting merge was used as a new base model, to which we applied Blackroot/Llama-2-13B-Storywriter-LORA and repeated the same trick, this time at 10%.
This means this model contains the following ingredients from its upstream models, as far as we can track them:
- Undi95/Xwin-MLewd-13B-V0.2
- - Undi95/ReMM-S-Light
- Undi95/CreativeEngine
- Brouz/Slerpeno
- - elinas/chronos-13b-v2
- jondurbin/airoboros-l2-13b-2.1
- NousResearch/Nous-Hermes-Llama2-13b+nRuaif/Kimiko-v2
- CalderaAI/13B-Legerdemain-L2+lemonilia/limarp-llama2-v2
- - KoboldAI/LLAMA2-13B-Holodeck-1
- NousResearch/Nous-Hermes-13b
- OpenAssistant/llama2-13b-orca-8k-3319
- ehartford/WizardLM-1.0-Uncensored-Llama2-13b
- Henk717/spring-dragon
- The-Face-Of-Goonery/Huginn-v3-13b (contains undisclosed model versions; we assumed them where possible)
- - SuperCOT (Undisclosed version)
- elinas/chronos-13b-v2 (Version assumed)
- NousResearch/Nous-Hermes-Llama2-13b
- stabilityai/StableBeluga-13B (Version assumed)
- zattio770/120-Days-of-LORA-v2-13B
- PygmalionAI/pygmalion-2-13b
- Undi95/Storytelling-v1-13B-lora
- TokenBender/sakhi_13B_roleplayer_NSFW_chat_adapter
- nRuaif/Kimiko-v2-13B
- The-Face-Of-Goonery/Huginn-13b-FP16
- - "a lot of different models, like hermes, beluga, airoboros, chronos.. limarp"
- lemonilia/LimaRP-Llama2-13B-v3-EXPERIMENT
- Xwin-LM/Xwin-LM-13B-V0.2
- PocketDoc/Dans-RetroRodeo-13b
- Blackroot/Llama-2-13B-Storywriter-LORA
While we may not have credited every single LoRA or model involved in this merged model, we'd like to thank all the upstream creators for making this awesome model possible!
Thanks to you, the AI ecosystem is thriving, and without your dedicated tuning efforts, models such as this one would not be possible.
# Usage
This model is meant to be creative: if you let it improvise, you get better results than if you drown it in details.
## Story Writing
Regular story writing in the traditional way is supported: simply copy-paste your story and continue writing. Optionally, use an instruction in memory or an author's note to guide the direction of your story.
### Generate a story on demand
To generate stories on demand, you can use an instruction (tested in the Alpaca format) such as "Write a novel about X, use chapters and dialogue"; this will generate a story. The format can vary between generations depending on how the model chooses to begin: either write what you want as shown in the earlier example, or write the beginning of the story yourself so the model can follow your style. A few retries can also help if the model gets it wrong.
## Chatbots and personas
This model has been tested with various forms of chatting; testers have found that typically less is more and the model is good at improvising. Don't drown the model in paragraphs of detailed information; instead, keep it simple at first and see how far you can lean on the model's own ability to figure out your character. Copy-pasting paragraphs of background information is not suitable for a 13B model such as this one; code-formatted characters or an instruction prompt describing who you wish to talk to goes much further.
For example, you can put this in memory in regular chat mode:
```
### Instruction:
Generate a conversation between Alice and Henk where they discuss language models.
In this conversation Henk is excited to teach Alice about Tiefighter.
### Response:
```
Because the model is a merge of a variety of models, it should support a broad range of instruct formats, as well as plain chat mode. If you have a particular favourite, try it; otherwise, we recommend either the regular chat mode or Alpaca's format.
## Instruct Prompting
This model incorporates various instruct models trained on a variety of instruction styles; for our own tests we used the Alpaca format. If you prefer a different format, chances are it can work.
During instructions, we have observed that in some cases the adventure data can leak; it may be worth experimenting with > as the prefix for a user command to remedy this, although this may result in a stronger fiction bias.
Keep in mind that while this model can be used as a factual instruct model, the focus was on fiction. Information provided by the model can be made up.
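As a minimal sketch, the Alpaca format described above can be used directly with `transformers`; the loading options, generation settings, and example instruction below are illustrative assumptions, not recommendations from the original card.
```python
# Minimal sketch of Alpaca-style prompting with transformers.
# Loading a 13B model this way assumes a suitable GPU and the
# accelerate package (for device_map="auto"); all settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "KoboldAI/LLaMA2-13B-Tiefighter"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map="auto"
)

prompt = (
    "### Instruction:\n"
    "Write a short scene in which a smuggler bargains with a dockmaster.\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```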
## Adventuring and Adventure Games
This model contains a LoRA that was trained on the same adventure dataset as the KoboldAI Skein model. Adventuring is best done using a small introduction to the world and your objective, while using the > prefix for a user command (KoboldAI's adventure mode).
It is possible that the model does not immediately pick up on what you wish to do and does not engage in its Adventure mode behaviour right away. Simply manually correct the output to trim excess dialogue or other undesirable behaviour and continue to submit your actions using the appropriate mode. The model should pick up on this style quickly and will correctly follow this format within 3 turns.
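As an illustration, a hypothetical adventure-mode exchange (not taken from the original card) might look like this, with your action prefixed by >:
```
You stand before the rusted gates of the old observatory, torch in hand.

> push the gates open
```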
## Discovered something cool and want to engage with us?
Join our community at https://koboldai.org/discord ! | 5,785 | [
[
-0.033782958984375,
-0.06402587890625,
0.0271148681640625,
0.01157379150390625,
-0.030548095703125,
0.0026569366455078125,
0.007160186767578125,
-0.06671142578125,
0.046295166015625,
0.05694580078125,
-0.054901123046875,
-0.017242431640625,
-0.036041259765625,
-0.01153564453125,
-0.029693603515625,
0.09765625,
-0.002605438232421875,
-0.019683837890625,
-0.003200531005859375,
-0.0112457275390625,
-0.048248291015625,
-0.03912353515625,
-0.037841796875,
-0.038299560546875,
0.053436279296875,
0.03155517578125,
0.060272216796875,
0.05657958984375,
0.054656982421875,
0.0229339599609375,
-0.023529052734375,
0.021209716796875,
-0.044158935546875,
-0.0031719207763671875,
-0.0007495880126953125,
-0.0440673828125,
-0.0615234375,
0.01378631591796875,
0.023529052734375,
0.02996826171875,
-0.01528167724609375,
0.00656890869140625,
-0.0040283203125,
0.025604248046875,
-0.033905029296875,
0.00980377197265625,
-0.0225067138671875,
-0.005496978759765625,
-0.008453369140625,
0.0045928955078125,
-0.01074981689453125,
-0.025634765625,
0.0074615478515625,
-0.055511474609375,
0.00139617919921875,
0.00763702392578125,
0.09002685546875,
0.00949859619140625,
-0.03814697265625,
-0.0478515625,
-0.0509033203125,
0.03857421875,
-0.06976318359375,
0.007110595703125,
0.036590576171875,
0.0198822021484375,
-0.01268768310546875,
-0.05340576171875,
-0.04180908203125,
-0.0168304443359375,
-0.009521484375,
0.004093170166015625,
-0.023284912109375,
-0.0236663818359375,
0.0033931732177734375,
0.0309906005859375,
-0.027069091796875,
0.02423095703125,
-0.050506591796875,
0.00849151611328125,
0.0579833984375,
0.0169525146484375,
0.01535797119140625,
-0.01436614990234375,
-0.034271240234375,
-0.026275634765625,
-0.053009033203125,
0.00746917724609375,
0.0394287109375,
0.006866455078125,
-0.035400390625,
0.05474853515625,
-0.0018568038940429688,
0.0350341796875,
0.019683837890625,
-0.037384033203125,
0.0139617919921875,
-0.0239410400390625,
-0.0203857421875,
-0.00542449951171875,
0.056243896484375,
0.042144775390625,
0.0035839080810546875,
-0.0024433135986328125,
-0.007328033447265625,
0.006580352783203125,
0.014495849609375,
-0.0513916015625,
-0.0010356903076171875,
0.0142669677734375,
-0.03778076171875,
-0.058349609375,
-0.0027370452880859375,
-0.05364990234375,
-0.03363037109375,
-0.0006718635559082031,
0.0304412841796875,
-0.030670166015625,
-0.01629638671875,
0.004486083984375,
-0.0171356201171875,
0.0423583984375,
0.0294952392578125,
-0.05706787109375,
0.01194000244140625,
0.039459228515625,
0.048614501953125,
-0.0008215904235839844,
-0.03533935546875,
-0.01239776611328125,
0.01149749755859375,
-0.037994384765625,
0.060577392578125,
-0.01209259033203125,
-0.041778564453125,
-0.016571044921875,
0.0107574462890625,
0.01015472412109375,
-0.02691650390625,
0.037841796875,
-0.033111572265625,
0.0260162353515625,
-0.015655517578125,
-0.03802490234375,
-0.0293731689453125,
0.005496978759765625,
-0.06121826171875,
0.055206298828125,
-0.0014972686767578125,
-0.052154541015625,
0.005008697509765625,
-0.04461669921875,
-0.0291595458984375,
-0.0015420913696289062,
0.0032482147216796875,
-0.028472900390625,
-0.0028533935546875,
0.007534027099609375,
0.0229339599609375,
-0.043609619140625,
0.033660888671875,
-0.0212554931640625,
-0.032928466796875,
0.0219573974609375,
-0.01324462890625,
0.0784912109375,
0.01171112060546875,
-0.0298614501953125,
-0.01158905029296875,
-0.0428466796875,
-0.008453369140625,
0.038330078125,
-0.0126800537109375,
-0.0115814208984375,
-0.00820159912109375,
-0.00472259521484375,
-0.0036602020263671875,
0.0203857421875,
-0.01476287841796875,
0.041961669921875,
-0.0236053466796875,
0.032073974609375,
0.05694580078125,
0.01541900634765625,
0.034423828125,
-0.055755615234375,
0.0513916015625,
-0.0033130645751953125,
0.0179595947265625,
-0.017822265625,
-0.06646728515625,
-0.09893798828125,
-0.005939483642578125,
0.013885498046875,
0.049072265625,
-0.04754638671875,
0.05438232421875,
0.004795074462890625,
-0.077392578125,
-0.03448486328125,
-0.007205963134765625,
0.028900146484375,
0.0296173095703125,
0.0033779144287109375,
-0.0169830322265625,
-0.046844482421875,
-0.056182861328125,
0.0097808837890625,
-0.0386962890625,
0.01776123046875,
0.0262908935546875,
0.0509033203125,
-0.03814697265625,
0.055755615234375,
-0.050323486328125,
-0.02142333984375,
-0.028656005859375,
0.006610870361328125,
0.026123046875,
0.046356201171875,
0.05792236328125,
-0.05010986328125,
-0.0184173583984375,
0.0106048583984375,
-0.0721435546875,
0.009429931640625,
-0.006336212158203125,
-0.0259246826171875,
0.010833740234375,
0.0233306884765625,
-0.05780029296875,
0.05462646484375,
0.043548583984375,
-0.042022705078125,
0.0254669189453125,
-0.00785064697265625,
0.018157958984375,
-0.0850830078125,
0.01468658447265625,
-0.0309906005859375,
-0.0068817138671875,
-0.043853759765625,
0.03778076171875,
-0.02276611328125,
-0.01270294189453125,
-0.0306396484375,
0.0616455078125,
-0.0245361328125,
0.01218414306640625,
-0.0183258056640625,
0.0100555419921875,
0.01275634765625,
0.04913330078125,
-0.011474609375,
0.039642333984375,
0.032745361328125,
-0.0374755859375,
0.044952392578125,
0.0357666015625,
-0.0260772705078125,
0.020416259765625,
-0.062103271484375,
0.030242919921875,
-0.0023136138916015625,
0.0254669189453125,
-0.06390380859375,
-0.028472900390625,
0.032012939453125,
-0.037689208984375,
0.005802154541015625,
-0.0034847259521484375,
-0.0264739990234375,
-0.0221099853515625,
-0.0185394287109375,
0.0158843994140625,
0.056640625,
-0.0440673828125,
0.05230712890625,
0.0301361083984375,
0.0010824203491210938,
-0.0311431884765625,
-0.06219482421875,
-0.00470733642578125,
-0.042999267578125,
-0.05584716796875,
0.024444580078125,
-0.030670166015625,
-0.0164337158203125,
0.0005931854248046875,
0.0183868408203125,
-0.025604248046875,
0.005558013916015625,
0.0279693603515625,
0.04559326171875,
-0.0222320556640625,
-0.022125244140625,
0.0026874542236328125,
0.007293701171875,
-0.0190887451171875,
0.01268768310546875,
0.0469970703125,
-0.0074615478515625,
-0.00868988037109375,
-0.0205535888671875,
0.040557861328125,
0.039215087890625,
-0.0235595703125,
0.06781005859375,
0.036346435546875,
-0.0302886962890625,
0.01273345947265625,
-0.04888916015625,
0.0036983489990234375,
-0.035308837890625,
0.00110626220703125,
-0.0174407958984375,
-0.061920166015625,
0.059051513671875,
0.019073486328125,
0.0144805908203125,
0.048553466796875,
0.03387451171875,
-0.017333984375,
0.0570068359375,
0.06915283203125,
-0.0016002655029296875,
0.026824951171875,
-0.04473876953125,
0.02337646484375,
-0.05126953125,
-0.035125732421875,
-0.0350341796875,
-0.023345947265625,
-0.054046630859375,
-0.03155517578125,
0.0176544189453125,
0.038818359375,
-0.0180511474609375,
0.052886962890625,
-0.01751708984375,
0.04132080078125,
0.04376220703125,
0.0077972412109375,
0.0262298583984375,
-0.0008196830749511719,
0.015869140625,
0.006252288818359375,
-0.056304931640625,
-0.02789306640625,
0.07574462890625,
0.0297393798828125,
0.068359375,
0.0201568603515625,
0.0728759765625,
0.016510009765625,
0.01297760009765625,
-0.050872802734375,
0.038360595703125,
-0.00879669189453125,
-0.0576171875,
-0.009246826171875,
-0.01123809814453125,
-0.06256103515625,
-0.002628326416015625,
-0.01543426513671875,
-0.06744384765625,
0.043121337890625,
0.0037479400634765625,
-0.0389404296875,
0.01238250732421875,
-0.0570068359375,
0.051727294921875,
-0.01776123046875,
-0.0018453598022460938,
-0.0014066696166992188,
-0.049163818359375,
0.0535888671875,
-0.00786590576171875,
0.0020809173583984375,
0.0003345012664794922,
0.0015583038330078125,
0.06365966796875,
-0.03912353515625,
0.07025146484375,
0.020660400390625,
-0.0220794677734375,
0.048370361328125,
0.0087890625,
0.04461669921875,
-0.0018901824951171875,
0.0132904052734375,
0.0259857177734375,
-0.0029087066650390625,
-0.00461578369140625,
-0.0357666015625,
0.060882568359375,
-0.07684326171875,
-0.039093017578125,
-0.043609619140625,
-0.0197601318359375,
0.0196380615234375,
0.01331329345703125,
0.0531005859375,
0.0271453857421875,
-0.0222015380859375,
0.017425537109375,
0.035675048828125,
-0.0156707763671875,
0.029632568359375,
0.036407470703125,
-0.03973388671875,
-0.040740966796875,
0.054229736328125,
0.0023021697998046875,
0.016998291015625,
0.0167083740234375,
0.00836944580078125,
-0.0025424957275390625,
-0.0031490325927734375,
-0.0260772705078125,
0.039825439453125,
-0.063232421875,
-0.035675048828125,
-0.0406494140625,
-0.0413818359375,
-0.03851318359375,
-0.0263214111328125,
-0.026763916015625,
-0.029144287109375,
-0.033447265625,
-0.00020635128021240234,
0.0469970703125,
0.07171630859375,
-0.00504302978515625,
0.06280517578125,
-0.041961669921875,
0.028472900390625,
0.02667236328125,
0.005908966064453125,
0.0002741813659667969,
-0.057464599609375,
-0.01351165771484375,
0.0068817138671875,
-0.02667236328125,
-0.06982421875,
0.035400390625,
0.008026123046875,
0.0252532958984375,
0.033782958984375,
-0.00812530517578125,
0.0643310546875,
-0.034576416015625,
0.060546875,
0.0122528076171875,
-0.06121826171875,
0.04571533203125,
-0.03216552734375,
0.00844573974609375,
0.0208587646484375,
0.026824951171875,
-0.037353515625,
-0.02093505859375,
-0.07232666015625,
-0.041015625,
0.06134033203125,
0.03753662109375,
-0.0064239501953125,
0.00882720947265625,
0.021881103515625,
-0.004367828369140625,
0.0157012939453125,
-0.07135009765625,
-0.03125,
-0.01378631591796875,
0.01535797119140625,
-0.016082763671875,
-0.0206298828125,
-0.01499176025390625,
-0.01454925537109375,
0.0537109375,
0.002964019775390625,
0.03173828125,
0.00267791748046875,
0.01227569580078125,
-0.01235198974609375,
-0.00759124755859375,
0.03582763671875,
0.043701171875,
-0.0225677490234375,
-0.0229339599609375,
0.0018339157104492188,
-0.03466796875,
0.0099945068359375,
0.0093231201171875,
-0.02178955078125,
-0.00388336181640625,
0.062103271484375,
0.08172607421875,
0.01042938232421875,
-0.051849365234375,
0.0308380126953125,
-0.007740020751953125,
0.002948760986328125,
-0.0179595947265625,
0.0247955322265625,
0.024566650390625,
0.026123046875,
0.0008554458618164062,
0.00376129150390625,
0.004673004150390625,
-0.06365966796875,
-0.011993408203125,
0.0164794921875,
-0.0003986358642578125,
-0.0111236572265625,
0.0482177734375,
0.023681640625,
-0.043426513671875,
0.04931640625,
-0.0191802978515625,
-0.0264739990234375,
0.07183837890625,
0.05108642578125,
0.0687255859375,
-0.03411865234375,
0.01502227783203125,
0.028289794921875,
0.035614013671875,
0.0030193328857421875,
0.01433563232421875,
0.01509857177734375,
-0.055450439453125,
-0.00582122802734375,
-0.037628173828125,
-0.022674560546875,
0.023284912109375,
-0.05419921875,
0.0472412109375,
-0.03558349609375,
-0.006069183349609375,
0.0007481575012207031,
0.004116058349609375,
-0.048309326171875,
0.0086517333984375,
0.0024318695068359375,
0.06573486328125,
-0.06524658203125,
0.062744140625,
0.04534912109375,
-0.047607421875,
-0.0684814453125,
-0.00786590576171875,
-0.0007615089416503906,
-0.06280517578125,
0.0229339599609375,
0.0119781494140625,
0.004180908203125,
-0.0160369873046875,
-0.067626953125,
-0.0758056640625,
0.1051025390625,
0.01690673828125,
-0.03607177734375,
-0.03070068359375,
-0.01090240478515625,
0.0357666015625,
-0.04736328125,
0.0136566162109375,
0.04888916015625,
0.0189971923828125,
0.0226287841796875,
-0.07476806640625,
-0.01296234130859375,
-0.027191162109375,
0.0026950836181640625,
-0.005767822265625,
-0.06903076171875,
0.08270263671875,
-0.0239715576171875,
-0.00797271728515625,
0.0479736328125,
0.050933837890625,
0.03399658203125,
0.006969451904296875,
0.0374755859375,
0.041107177734375,
0.049346923828125,
0.0012798309326171875,
0.073974609375,
-0.00640869140625,
0.0245819091796875,
0.0906982421875,
-0.0202789306640625,
0.057098388671875,
0.022064208984375,
-0.0103302001953125,
0.03955078125,
0.052825927734375,
0.001323699951171875,
0.038665771484375,
-0.0185394287109375,
0.0019435882568359375,
-0.004688262939453125,
-0.01995849609375,
-0.044952392578125,
0.058837890625,
0.0137481689453125,
-0.0173187255859375,
0.0014181137084960938,
-0.0008726119995117188,
0.02142333984375,
-0.0205841064453125,
0.0007843971252441406,
0.05975341796875,
0.0024852752685546875,
-0.057464599609375,
0.05419921875,
0.0130767822265625,
0.0511474609375,
-0.07513427734375,
-0.0161590576171875,
-0.055145263671875,
0.007785797119140625,
0.004421234130859375,
-0.048736572265625,
0.004901885986328125,
0.01186370849609375,
-0.016326904296875,
0.006866455078125,
0.050750732421875,
-0.0310211181640625,
-0.05242919921875,
0.0290985107421875,
0.026763916015625,
0.022552490234375,
0.017791748046875,
-0.042877197265625,
0.032012939453125,
0.0186767578125,
-0.0018663406372070312,
0.01345062255859375,
0.027923583984375,
-0.0027008056640625,
0.05450439453125,
0.036529541015625,
-0.007587432861328125,
-0.00930023193359375,
-0.0214385986328125,
0.0799560546875,
-0.05621337890625,
-0.056640625,
-0.040771484375,
0.029937744140625,
0.006618499755859375,
-0.041473388671875,
0.053436279296875,
0.0287933349609375,
0.04473876953125,
-0.00691986083984375,
0.03857421875,
-0.0299835205078125,
0.0269317626953125,
-0.043548583984375,
0.06298828125,
-0.05230712890625,
0.01953125,
-0.0273284912109375,
-0.07427978515625,
0.017578125,
0.051910400390625,
0.0225982666015625,
0.0016498565673828125,
0.043121337890625,
0.049774169921875,
-0.00707244873046875,
0.001220703125,
0.00807952880859375,
0.0188140869140625,
0.0269012451171875,
0.051422119140625,
0.07135009765625,
-0.0596923828125,
0.03399658203125,
-0.038726806640625,
-0.029022216796875,
-0.0219268798828125,
-0.07354736328125,
-0.07843017578125,
-0.0435791015625,
-0.0204010009765625,
-0.0301971435546875,
0.009368896484375,
0.05120849609375,
0.054290771484375,
-0.035675048828125,
-0.045806884765625,
0.0183868408203125,
0.006603240966796875,
0.00177764892578125,
-0.0164337158203125,
0.0099945068359375,
0.015350341796875,
-0.053131103515625,
0.034271240234375,
-0.006427764892578125,
0.0310516357421875,
-0.029022216796875,
-0.01499176025390625,
-0.0364990234375,
0.01461029052734375,
0.0283203125,
0.0225067138671875,
-0.060882568359375,
-0.0110321044921875,
0.00347900390625,
-0.0032405853271484375,
-0.006603240966796875,
0.046905517578125,
-0.0556640625,
0.01222991943359375,
0.0249176025390625,
0.0238494873046875,
0.042877197265625,
-0.0097198486328125,
0.0379638671875,
-0.04107666015625,
0.03631591796875,
0.004520416259765625,
0.0262298583984375,
0.019500732421875,
-0.0413818359375,
0.05364990234375,
0.01485443115234375,
-0.04486083984375,
-0.0654296875,
-0.00335693359375,
-0.08355712890625,
-0.017181396484375,
0.0784912109375,
-0.01213836669921875,
-0.0238037109375,
0.0364990234375,
-0.02978515625,
0.011688232421875,
-0.016815185546875,
0.030242919921875,
0.06097412109375,
-0.0111541748046875,
-0.0008664131164550781,
-0.020904541015625,
0.028045654296875,
-0.00003147125244140625,
-0.07037353515625,
-0.00646209716796875,
0.026275634765625,
0.0083160400390625,
0.034881591796875,
0.0489501953125,
0.0157470703125,
0.02447509765625,
0.0051727294921875,
0.00439453125,
0.005962371826171875,
-0.04205322265625,
-0.0184173583984375,
-0.02978515625,
-0.01032257080078125,
-0.002216339111328125
]
] |
garage-bAInd/Camel-Platypus2-70B | 2023-08-15T01:54:31.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:garage-bAInd/Open-Platypus",
"arxiv:2308.07317",
"arxiv:2307.09288",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | garage-bAInd | null | null | garage-bAInd/Camel-Platypus2-70B | 12 | 11,456 | transformers | 2023-08-09T21:24:32 | ---
language:
- en
datasets:
- garage-bAInd/Open-Platypus
license: cc-by-nc-4.0
---
# Camel-Platypus2-70B
Camel-Platypus2-70B is a merge of [`garage-bAInd/Platypus2-70B`](https://huggingface.co/garage-bAInd/Platypus2-70B) and [`augtoma/qCammel-70-x`](https://huggingface.co/augtoma/qCammel-70-x).

### Benchmark Metrics
| Metric | Value |
|-----------------------|-------|
| MMLU (5-shot) | 69.80 |
| ARC (25-shot) | 71.16 |
| HellaSwag (10-shot) | 87.66 |
| TruthfulQA (0-shot) | 57.77 |
| Avg. | 71.60 |
We use the state-of-the-art [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) to run the benchmark tests above, using the same version as the Hugging Face LLM Leaderboard. Please see below for detailed instructions on reproducing benchmark results.
### Model Details
* **Trained by**: **Platypus2-70B** trained by Cole Hunter & Ariel Lee; **augtoma/qCammel-70-x** trained by augtoma
* **Model type:** **Camel-Platypus2-70B** is an auto-regressive language model based on the LLaMA 2 transformer architecture.
* **Language(s)**: English
* **License**: Non-Commercial Creative Commons license ([CC BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/))
### Prompt Template
```
### Instruction:
<prompt> (without the <>)
### Response:
```
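The template can also be applied programmatically. The snippet below is a minimal sketch using `transformers`; the loading options, generation settings, and example instruction are illustrative assumptions (a 70B model typically requires multiple GPUs or quantization), not part of the original card.
```python
# Minimal sketch of applying the prompt template with transformers.
# Hardware and generation settings here are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "garage-bAInd/Camel-Platypus2-70B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map="auto"
)

instruction = "Explain the difference between a permutation and a combination."
prompt = f"### Instruction:\n{instruction}\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```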
### Training Dataset
`garage-bAInd/Platypus2-70B` was trained using the STEM and logic-based dataset [`garage-bAInd/Open-Platypus`](https://huggingface.co/datasets/garage-bAInd/Open-Platypus).
Please see our [paper](https://arxiv.org/abs/2308.07317) and [project webpage](https://platypus-llm.github.io) for additional information.
### Training Procedure
`garage-bAInd/Platypus2-70B` was instruction fine-tuned using LoRA on 8 A100 80GB GPUs. For training details and inference instructions, please see the [Platypus](https://github.com/arielnlee/Platypus) GitHub repo.
### Reproducing Evaluation Results
Install LM Evaluation Harness:
```
# clone repository
git clone https://github.com/EleutherAI/lm-evaluation-harness.git
# change to repo directory
cd lm-evaluation-harness
# check out the correct commit
git checkout b281b0921b636bc36ad05c0b0b0763bd6dd43463
# install
pip install -e .
```
Each task was evaluated on a single A100 80GB GPU.
ARC:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Camel-Platypus2-70B --tasks arc_challenge --batch_size 1 --no_cache --write_out --output_path results/Camel-Platypus2-70B/arc_challenge_25shot.json --device cuda --num_fewshot 25
```
HellaSwag:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Camel-Platypus2-70B --tasks hellaswag --batch_size 1 --no_cache --write_out --output_path results/Camel-Platypus2-70B/hellaswag_10shot.json --device cuda --num_fewshot 10
```
MMLU:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Camel-Platypus2-70B --tasks hendrycksTest-* --batch_size 1 --no_cache --write_out --output_path results/Camel-Platypus2-70B/mmlu_5shot.json --device cuda --num_fewshot 5
```
TruthfulQA:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Camel-Platypus2-70B --tasks truthfulqa_mc --batch_size 1 --no_cache --write_out --output_path results/Camel-Platypus2-70B/truthfulqa_0shot.json --device cuda
```
### Limitations and bias
Llama 2 and its fine-tuned variants are a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, the potential outputs of Llama 2 and any fine-tuned variant cannot be predicted in advance, and the model may in some instances produce inaccurate, biased, or otherwise objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2 variants, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at https://ai.meta.com/llama/responsible-use-guide/
### Citations
```bibtex
@article{platypus2023,
title={Platypus: Quick, Cheap, and Powerful Refinement of LLMs},
author={Ariel N. Lee and Cole J. Hunter and Nataniel Ruiz},
booktitle={arXiv preprint arxiv:2308.07317},
year={2023}
}
```
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
  author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and others},
  year={2023},
eprint={2307.09288},
archivePrefix={arXiv},
}
```
```bibtex
@inproceedings{
hu2022lora,
title={Lo{RA}: Low-Rank Adaptation of Large Language Models},
author={Edward J Hu and Yelong Shen and Phillip Wallis and Zeyuan Allen-Zhu and Yuanzhi Li and Shean Wang and Lu Wang and Weizhu Chen},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=nZeVKeeFYf9}
}
``` | 5,082 | [
[
-0.027496337890625,
-0.05682373046875,
0.007389068603515625,
0.0273284912109375,
-0.02801513671875,
-0.005260467529296875,
-0.0250701904296875,
-0.045623779296875,
0.0018224716186523438,
0.021942138671875,
-0.035980224609375,
-0.0280914306640625,
-0.04498291015625,
-0.0056610107421875,
-0.0106964111328125,
0.08526611328125,
-0.029388427734375,
-0.0146331787109375,
-0.005771636962890625,
-0.0305023193359375,
-0.04656982421875,
-0.035186767578125,
-0.0304412841796875,
-0.0266571044921875,
0.026031494140625,
0.020965576171875,
0.043731689453125,
0.036285400390625,
0.046600341796875,
0.02398681640625,
-0.01026153564453125,
0.0205230712890625,
-0.04705810546875,
-0.00811004638671875,
0.01337432861328125,
-0.031768798828125,
-0.04071044921875,
0.00820159912109375,
0.0379638671875,
0.0307159423828125,
-0.016937255859375,
0.033905029296875,
0.0099945068359375,
0.029083251953125,
-0.0484619140625,
0.023345947265625,
-0.04473876953125,
-0.01549530029296875,
-0.0256805419921875,
-0.0121002197265625,
-0.0172271728515625,
-0.018341064453125,
-0.006298065185546875,
-0.05389404296875,
0.0021190643310546875,
0.0117645263671875,
0.087158203125,
0.03765869140625,
-0.0236358642578125,
-0.007717132568359375,
-0.0237579345703125,
0.07427978515625,
-0.057708740234375,
0.01253509521484375,
0.0262298583984375,
0.0101318359375,
-0.035552978515625,
-0.045074462890625,
-0.05047607421875,
-0.022674560546875,
-0.006160736083984375,
0.0081329345703125,
-0.0212554931640625,
-0.01126861572265625,
0.0164337158203125,
0.032073974609375,
-0.0309906005859375,
0.035491943359375,
-0.0310516357421875,
-0.015350341796875,
0.054229736328125,
0.00981903076171875,
0.0093841552734375,
-0.004058837890625,
-0.04461669921875,
-0.03228759765625,
-0.058349609375,
0.028106689453125,
0.0291748046875,
0.00862884521484375,
-0.0305633544921875,
0.05413818359375,
-0.0179443359375,
0.03619384765625,
-0.0093536376953125,
-0.0377197265625,
0.046539306640625,
-0.031951904296875,
-0.02496337890625,
-0.0055999755859375,
0.07525634765625,
0.032135009765625,
0.0016603469848632812,
0.00748443603515625,
-0.017578125,
0.0205230712890625,
-0.01467132568359375,
-0.060333251953125,
-0.007160186767578125,
0.0222015380859375,
-0.026123046875,
-0.0180816650390625,
-0.019195556640625,
-0.034454345703125,
-0.0222015380859375,
-0.0010528564453125,
0.0286712646484375,
-0.038909912109375,
-0.030975341796875,
0.020538330078125,
-0.01259613037109375,
0.047027587890625,
0.0208587646484375,
-0.047332763671875,
0.03009033203125,
0.0426025390625,
0.06268310546875,
-0.0255126953125,
-0.04412841796875,
-0.0272369384765625,
0.0038299560546875,
-0.0179595947265625,
0.0623779296875,
-0.0038967132568359375,
-0.0222320556640625,
-0.014892578125,
0.01192474365234375,
-0.00537109375,
-0.043701171875,
0.039154052734375,
-0.02581787109375,
0.01519012451171875,
-0.022674560546875,
-0.02960205078125,
-0.029998779296875,
0.003345489501953125,
-0.032440185546875,
0.10040283203125,
0.0152587890625,
-0.06573486328125,
0.0111541748046875,
-0.045623779296875,
-0.029205322265625,
-0.01073455810546875,
0.0093231201171875,
-0.043731689453125,
-0.01320648193359375,
0.01033782958984375,
0.031707763671875,
-0.044097900390625,
0.01319122314453125,
-0.0213470458984375,
-0.0282745361328125,
0.0227508544921875,
-0.01220703125,
0.0787353515625,
0.0090789794921875,
-0.038909912109375,
0.006328582763671875,
-0.047515869140625,
-0.007236480712890625,
0.036041259765625,
-0.0257720947265625,
-0.01248931884765625,
-0.0064239501953125,
-0.005512237548828125,
0.006717681884765625,
0.03271484375,
-0.03485107421875,
0.00972747802734375,
-0.02685546875,
0.04290771484375,
0.053436279296875,
-0.00904083251953125,
0.01323699951171875,
-0.046630859375,
0.0238494873046875,
0.003559112548828125,
0.0182342529296875,
0.0067596435546875,
-0.053253173828125,
-0.08319091796875,
-0.0249786376953125,
0.00748443603515625,
0.057281494140625,
-0.0273895263671875,
0.047332763671875,
-0.005252838134765625,
-0.052825927734375,
-0.040863037109375,
0.023040771484375,
0.0391845703125,
0.038299560546875,
0.040771484375,
-0.0379638671875,
-0.035888671875,
-0.0687255859375,
-0.01026153564453125,
-0.032501220703125,
0.00928497314453125,
0.0234527587890625,
0.049713134765625,
-0.0279388427734375,
0.0543212890625,
-0.03192138671875,
-0.0251007080078125,
-0.015777587890625,
0.0010042190551757812,
0.0188751220703125,
0.0516357421875,
0.042938232421875,
-0.0171356201171875,
-0.0118560791015625,
-0.01141357421875,
-0.057891845703125,
-0.01525115966796875,
0.003993988037109375,
-0.0238800048828125,
0.03851318359375,
0.003314971923828125,
-0.06640625,
0.0289154052734375,
0.03759765625,
-0.013580322265625,
0.04302978515625,
-0.01042938232421875,
-0.0085601806640625,
-0.052703857421875,
0.00930023193359375,
0.001972198486328125,
-0.0012445449829101562,
-0.02984619140625,
0.006267547607421875,
-0.006641387939453125,
0.0138397216796875,
-0.046905517578125,
0.04962158203125,
-0.03936767578125,
-0.0174713134765625,
-0.010650634765625,
0.016143798828125,
-0.005176544189453125,
0.052093505859375,
-0.006473541259765625,
0.06353759765625,
0.0367431640625,
-0.04266357421875,
0.016357421875,
0.032012939453125,
-0.033355712890625,
0.01448822021484375,
-0.0673828125,
0.0116119384765625,
0.0132598876953125,
0.03460693359375,
-0.081298828125,
-0.004608154296875,
0.0249786376953125,
-0.02264404296875,
0.0216064453125,
0.01131439208984375,
-0.05230712890625,
-0.031158447265625,
-0.04302978515625,
0.03143310546875,
0.06817626953125,
-0.04168701171875,
0.0194549560546875,
0.03131103515625,
0.003337860107421875,
-0.05517578125,
-0.0574951171875,
-0.0175933837890625,
-0.0247955322265625,
-0.054412841796875,
0.01531219482421875,
-0.0108184814453125,
-0.01268768310546875,
-0.0159759521484375,
-0.0074462890625,
0.0129547119140625,
0.0144195556640625,
0.0341796875,
0.0340576171875,
-0.0092926025390625,
-0.001682281494140625,
0.00016319751739501953,
-0.0156402587890625,
-0.0038356781005859375,
0.0025844573974609375,
0.04638671875,
-0.0262908935546875,
-0.01268768310546875,
-0.05999755859375,
0.00394439697265625,
0.031494140625,
-0.0268096923828125,
0.057281494140625,
0.054656982421875,
-0.010528564453125,
0.01172637939453125,
-0.0609130859375,
-0.0118560791015625,
-0.036956787109375,
0.025787353515625,
-0.0209808349609375,
-0.053375244140625,
0.042510986328125,
0.0006766319274902344,
0.0168609619140625,
0.0545654296875,
0.060577392578125,
0.0008134841918945312,
0.05859375,
0.04351806640625,
-0.00229644775390625,
0.03411865234375,
-0.050201416015625,
0.00566864013671875,
-0.07989501953125,
-0.0290679931640625,
-0.036285400390625,
-0.0256195068359375,
-0.054840087890625,
-0.0440673828125,
0.00920867919921875,
0.0214080810546875,
-0.04534912109375,
0.036102294921875,
-0.039093017578125,
0.00901031494140625,
0.0457763671875,
0.0030517578125,
0.016326904296875,
0.002628326416015625,
-0.0092010498046875,
0.0099945068359375,
-0.047576904296875,
-0.043487548828125,
0.08154296875,
0.037994384765625,
0.058685302734375,
0.0032863616943359375,
0.041473388671875,
-0.00536346435546875,
0.0250701904296875,
-0.052001953125,
0.0509033203125,
0.0003414154052734375,
-0.034149169921875,
-0.01026153564453125,
-0.01323699951171875,
-0.07421875,
0.0267486572265625,
-0.00506591796875,
-0.054779052734375,
0.012908935546875,
-0.0038166046142578125,
-0.022125244140625,
0.01959228515625,
-0.06011962890625,
0.05670166015625,
-0.028045654296875,
-0.029876708984375,
-0.022369384765625,
-0.055633544921875,
0.0478515625,
-0.00200653076171875,
0.006465911865234375,
-0.0341796875,
-0.0202178955078125,
0.08404541015625,
-0.055023193359375,
0.06915283203125,
-0.01067352294921875,
-0.0033206939697265625,
0.039154052734375,
-0.00586700439453125,
0.042510986328125,
0.001331329345703125,
-0.003711700439453125,
0.0408935546875,
-0.0006403923034667969,
-0.032501220703125,
-0.01070404052734375,
0.058990478515625,
-0.1009521484375,
-0.049224853515625,
-0.044525146484375,
-0.050445556640625,
0.003147125244140625,
0.004695892333984375,
0.0121917724609375,
0.007236480712890625,
0.01268768310546875,
0.0049285888671875,
0.034515380859375,
-0.036468505859375,
0.04193115234375,
0.044647216796875,
-0.00007236003875732422,
-0.028472900390625,
0.056243896484375,
0.0015726089477539062,
0.0177459716796875,
0.01065826416015625,
0.007568359375,
-0.0212860107421875,
-0.036834716796875,
-0.0234527587890625,
0.04840087890625,
-0.045257568359375,
-0.0374755859375,
-0.0474853515625,
-0.0189208984375,
-0.0201416015625,
0.005237579345703125,
-0.0318603515625,
-0.0369873046875,
-0.044586181640625,
0.002521514892578125,
0.053955078125,
0.038909912109375,
-0.00847625732421875,
0.043212890625,
-0.0222930908203125,
0.0211639404296875,
0.0232696533203125,
0.01446533203125,
-0.00383758544921875,
-0.055023193359375,
0.0045013427734375,
0.006999969482421875,
-0.042633056640625,
-0.055633544921875,
0.03240966796875,
0.0141143798828125,
0.052581787109375,
0.01320648193359375,
0.00385284423828125,
0.06573486328125,
-0.01678466796875,
0.06695556640625,
0.022003173828125,
-0.0626220703125,
0.04656982421875,
-0.00983428955078125,
0.0016326904296875,
0.02899169921875,
0.0219268798828125,
-0.0153045654296875,
-0.033050537109375,
-0.050628662109375,
-0.06298828125,
0.05859375,
0.0279083251953125,
-0.01149749755859375,
0.018524169921875,
0.0248870849609375,
0.0165557861328125,
0.012176513671875,
-0.0570068359375,
-0.0309600830078125,
-0.02459716796875,
0.00151824951171875,
-0.01276397705078125,
-0.0211181640625,
-0.017822265625,
-0.035491943359375,
0.0623779296875,
-0.0009794235229492188,
0.03466796875,
0.01084136962890625,
-0.0230712890625,
-0.010650634765625,
0.0006847381591796875,
0.047088623046875,
0.038177490234375,
-0.0274200439453125,
-0.00617218017578125,
0.0268707275390625,
-0.0386962890625,
0.01371002197265625,
0.01477813720703125,
0.0017833709716796875,
-0.01268768310546875,
0.0321044921875,
0.07977294921875,
0.004852294921875,
-0.051788330078125,
0.0386962890625,
-0.004878997802734375,
-0.0081787109375,
-0.0225677490234375,
0.01438140869140625,
0.00749969482421875,
0.022918701171875,
0.0206756591796875,
-0.00008803606033325195,
-0.0135040283203125,
-0.036529541015625,
-0.00966644287109375,
0.0380859375,
0.00492095947265625,
-0.02593994140625,
0.05548095703125,
0.008941650390625,
-0.019439697265625,
0.04620361328125,
-0.016998291015625,
-0.0280303955078125,
0.058563232421875,
0.04876708984375,
0.044647216796875,
-0.0096282958984375,
-0.0014085769653320312,
0.03070068359375,
0.0309906005859375,
-0.0130767822265625,
0.0286712646484375,
0.005916595458984375,
-0.04498291015625,
-0.024688720703125,
-0.048004150390625,
-0.020172119140625,
0.0210418701171875,
-0.035858154296875,
0.03558349609375,
-0.032012939453125,
-0.0187835693359375,
-0.00978851318359375,
0.034698486328125,
-0.0562744140625,
-0.002666473388671875,
-0.0010585784912109375,
0.07464599609375,
-0.06451416015625,
0.06689453125,
0.04833984375,
-0.0374755859375,
-0.0673828125,
-0.029205322265625,
-0.01177978515625,
-0.08892822265625,
0.04498291015625,
0.020111083984375,
-0.001735687255859375,
-0.01171875,
-0.0478515625,
-0.0771484375,
0.11187744140625,
0.04266357421875,
-0.0433349609375,
0.027069091796875,
-0.00047206878662109375,
0.03533935546875,
-0.017852783203125,
0.03662109375,
0.059112548828125,
0.03466796875,
0.0100250244140625,
-0.086669921875,
0.0211639404296875,
-0.0208282470703125,
0.0081634521484375,
0.0024890899658203125,
-0.0870361328125,
0.08282470703125,
-0.0297393798828125,
-0.01378631591796875,
0.0252227783203125,
0.051361083984375,
0.057647705078125,
0.0105743408203125,
0.033782958984375,
0.06268310546875,
0.06268310546875,
-0.004608154296875,
0.0880126953125,
-0.02337646484375,
0.035980224609375,
0.072021484375,
-0.01107025146484375,
0.07781982421875,
0.039581298828125,
-0.036224365234375,
0.05682373046875,
0.0701904296875,
-0.001285552978515625,
0.044342041015625,
0.0142974853515625,
0.00337982177734375,
-0.005634307861328125,
-0.01065826416015625,
-0.05084228515625,
0.0311279296875,
0.02459716796875,
-0.005039215087890625,
-0.005084991455078125,
-0.01320648193359375,
0.01123809814453125,
-0.034881591796875,
-0.0014591217041015625,
0.043212890625,
0.0157318115234375,
-0.048919677734375,
0.08819580078125,
0.006072998046875,
0.06878662109375,
-0.04095458984375,
0.01508331298828125,
-0.0297393798828125,
0.01268768310546875,
-0.0177764892578125,
-0.04949951171875,
-0.004871368408203125,
-0.007053375244140625,
0.01104736328125,
0.0011453628540039062,
0.048828125,
-0.00916290283203125,
-0.027740478515625,
0.034393310546875,
0.028350830078125,
0.027587890625,
0.015106201171875,
-0.054840087890625,
0.0245208740234375,
-0.01218414306640625,
-0.035003662109375,
0.0249176025390625,
0.010955810546875,
-0.0142822265625,
0.054840087890625,
0.05352783203125,
-0.001430511474609375,
0.015045166015625,
-0.0127105712890625,
0.076416015625,
-0.036224365234375,
-0.0268402099609375,
-0.056427001953125,
0.029998779296875,
0.00710296630859375,
-0.04315185546875,
0.05352783203125,
0.04010009765625,
0.0552978515625,
0.01335906982421875,
0.050323486328125,
-0.008819580078125,
0.0222625732421875,
-0.0262603759765625,
0.035736083984375,
-0.04437255859375,
0.0279998779296875,
-0.0005068778991699219,
-0.06927490234375,
-0.004444122314453125,
0.055084228515625,
-0.022369384765625,
-0.0016183853149414062,
0.06353759765625,
0.06561279296875,
0.00031304359436035156,
-0.0155792236328125,
-0.00923919677734375,
0.033782958984375,
0.0199432373046875,
0.0692138671875,
0.06707763671875,
-0.05841064453125,
0.041351318359375,
-0.034820556640625,
-0.025634765625,
-0.0245208740234375,
-0.06256103515625,
-0.0711669921875,
-0.0318603515625,
-0.0406494140625,
-0.03192138671875,
-0.00112152099609375,
0.054656982421875,
0.041015625,
-0.05853271484375,
-0.039520263671875,
-0.0009870529174804688,
0.01348876953125,
-0.019287109375,
-0.011474609375,
0.034942626953125,
-0.0169830322265625,
-0.035491943359375,
0.0088958740234375,
0.008331298828125,
0.021820068359375,
-0.0318603515625,
-0.021728515625,
-0.019195556640625,
-0.005863189697265625,
0.0355224609375,
0.02960205078125,
-0.06927490234375,
-0.00923919677734375,
-0.01129150390625,
-0.01447296142578125,
0.01538848876953125,
0.033050537109375,
-0.07080078125,
-0.0004673004150390625,
0.032806396484375,
0.02783203125,
0.053436279296875,
-0.0146942138671875,
0.0123138427734375,
-0.035980224609375,
0.045074462890625,
-0.007587432861328125,
0.034942626953125,
0.035491943359375,
-0.0165863037109375,
0.04742431640625,
0.028350830078125,
-0.04547119140625,
-0.076416015625,
-0.0091552734375,
-0.09765625,
-0.01142120361328125,
0.11248779296875,
-0.012939453125,
-0.03485107421875,
0.01253509521484375,
-0.01885986328125,
0.036834716796875,
-0.03448486328125,
0.0484619140625,
0.026153564453125,
-0.01096343994140625,
0.0005025863647460938,
-0.054595947265625,
0.0233154296875,
0.0323486328125,
-0.06549072265625,
-0.01519012451171875,
0.01538848876953125,
0.039947509765625,
0.007320404052734375,
0.04632568359375,
0.00785064697265625,
0.013580322265625,
-0.0142974853515625,
0.01055145263671875,
-0.00762939453125,
0.00432586669921875,
-0.0205230712890625,
-0.01517486572265625,
0.00791168212890625,
-0.0290985107421875
]
] |
microsoft/trocr-large-handwritten | 2023-01-24T16:57:33.000Z | [
"transformers",
"pytorch",
"vision-encoder-decoder",
"trocr",
"image-to-text",
"arxiv:2109.10282",
"endpoints_compatible",
"has_space",
"region:us"
] | image-to-text | microsoft | null | null | microsoft/trocr-large-handwritten | 39 | 11,443 | transformers | 2022-03-02T23:29:05 | ---
tags:
- trocr
- image-to-text
widget:
- src: https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg
example_title: Note 1
- src: https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcSoolxi9yWGAT5SLZShv8vVd0bz47UWRzQC19fDTeE8GmGv_Rn-PCF1pP1rrUx8kOjA4gg&usqp=CAU
example_title: Note 2
- src: https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcRNYtTuSBpZPV_nkBYPMFwVVD9asZOPgHww4epu9EqWgDmXW--sE2o8og40ZfDGo87j5w&usqp=CAU
example_title: Note 3
---
# TrOCR (large-sized model, fine-tuned on IAM)
TrOCR model fine-tuned on the [IAM dataset](https://fki.tic.heia-fr.ch/databases/iam-handwriting-database). It was introduced in the paper [TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models](https://arxiv.org/abs/2109.10282) by Li et al. and first released in [this repository](https://github.com/microsoft/unilm/tree/master/trocr).
Disclaimer: The team releasing TrOCR did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
The TrOCR model is an encoder-decoder model, consisting of an image Transformer as encoder and a text Transformer as decoder. The image encoder was initialized from the weights of BEiT, while the text decoder was initialized from the weights of RoBERTa.
Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded. Absolute position embeddings are added before the sequence is fed to the layers of the Transformer encoder. The Transformer text decoder then autoregressively generates tokens.
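As an illustration, a minimal PyTorch sketch of the patch-embedding step described above (the image size and hidden dimension are illustrative assumptions, not values taken from this card):
```python
import torch

image = torch.randn(1, 3, 384, 384)                     # one RGB image (size assumed)
patches = image.unfold(2, 16, 16).unfold(3, 16, 16)     # cut into 16x16 patches
patches = patches.contiguous().view(1, 3, -1, 16 * 16)  # (batch, channels, num_patches, 256)
patches = patches.permute(0, 2, 1, 3).reshape(1, -1, 3 * 16 * 16)  # (batch, num_patches, 768)
proj = torch.nn.Linear(3 * 16 * 16, 1024)               # linear patch embedding (hidden size assumed)
embeddings = proj(patches)                              # absolute position embeddings are added next
```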
## Intended uses & limitations
You can use the raw model for optical character recognition (OCR) on single text-line images. See the [model hub](https://huggingface.co/models?search=microsoft/trocr) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model in PyTorch:
```python
from transformers import TrOCRProcessor, VisionEncoderDecoderModel
from PIL import Image
import requests
# load image from the IAM database
url = 'https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg'
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")
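# load the TrOCR processor (image preprocessing + tokenizer) and the fine-tuned model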
processor = TrOCRProcessor.from_pretrained('microsoft/trocr-large-handwritten')
model = VisionEncoderDecoderModel.from_pretrained('microsoft/trocr-large-handwritten')
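# convert the image to pixel values, generate token ids autoregressively, and decode them to text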
pixel_values = processor(images=image, return_tensors="pt").pixel_values
generated_ids = model.generate(pixel_values)
generated_text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
```
### BibTeX entry and citation info
```bibtex
@misc{li2021trocr,
title={TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models},
author={Minghao Li and Tengchao Lv and Lei Cui and Yijuan Lu and Dinei Florencio and Cha Zhang and Zhoujun Li and Furu Wei},
year={2021},
eprint={2109.10282},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 2,975 | [
[
-0.01531982421875,
-0.0298004150390625,
0.0121307373046875,
-0.0261383056640625,
-0.0272674560546875,
-0.00030303001403808594,
-0.00431060791015625,
-0.0662841796875,
0.00981903076171875,
0.050506591796875,
-0.02520751953125,
-0.034912109375,
-0.05078125,
0.01143646240234375,
-0.0278472900390625,
0.0814208984375,
-0.01136016845703125,
-0.0006422996520996094,
0.008209228515625,
-0.035400390625,
-0.004062652587890625,
-0.045135498046875,
-0.04266357421875,
-0.0139923095703125,
0.0284423828125,
0.0248260498046875,
0.047760009765625,
0.05401611328125,
0.076171875,
0.0290679931640625,
-0.02587890625,
0.0089111328125,
-0.0183868408203125,
-0.026519775390625,
0.0169677734375,
-0.034637451171875,
-0.049468994140625,
0.00408172607421875,
0.0491943359375,
0.0112457275390625,
-0.0038738250732421875,
0.01123809814453125,
0.003932952880859375,
0.040771484375,
-0.019683837890625,
-0.006374359130859375,
-0.0251312255859375,
0.0204620361328125,
-0.00618743896484375,
0.001171112060546875,
-0.03216552734375,
-0.0316162109375,
0.0257110595703125,
-0.042266845703125,
0.04608154296875,
0.005550384521484375,
0.08929443359375,
-0.01068115234375,
-0.0232086181640625,
-0.041778564453125,
-0.057647705078125,
0.047332763671875,
-0.04583740234375,
0.0271759033203125,
0.00234222412109375,
0.016937255859375,
0.0078125,
-0.07769775390625,
-0.069091796875,
-0.028289794921875,
-0.019256591796875,
0.0003502368927001953,
-0.0173187255859375,
0.021331787109375,
0.0352783203125,
0.041748046875,
-0.0443115234375,
-0.01549530029296875,
-0.0513916015625,
-0.028656005859375,
0.0237579345703125,
-0.0005059242248535156,
0.0257415771484375,
0.0064239501953125,
-0.0291595458984375,
-0.034576416015625,
-0.00873565673828125,
-0.005504608154296875,
0.0065460205078125,
-0.00313568115234375,
-0.0283966064453125,
0.052734375,
0.0244903564453125,
0.0631103515625,
0.020355224609375,
-0.02569580078125,
0.04034423828125,
-0.004581451416015625,
-0.00028634071350097656,
0.00391387939453125,
0.0789794921875,
0.0229949951171875,
0.0286712646484375,
-0.008392333984375,
-0.020843505859375,
0.018463134765625,
0.00482177734375,
-0.067138671875,
-0.0011606216430664062,
-0.0047149658203125,
-0.041748046875,
-0.0188446044921875,
0.0165557861328125,
-0.06439208984375,
-0.0183258056640625,
-0.01442718505859375,
0.03497314453125,
-0.0311279296875,
0.0095672607421875,
-0.00641632080078125,
-0.011383056640625,
0.0128936767578125,
0.0225830078125,
-0.039215087890625,
0.00263214111328125,
0.0133819580078125,
0.08642578125,
-0.011138916015625,
-0.024749755859375,
-0.0279693603515625,
-0.003086090087890625,
-0.01479339599609375,
0.046234130859375,
-0.0210418701171875,
-0.0200347900390625,
-0.005321502685546875,
0.0036029815673828125,
0.0003604888916015625,
-0.040557861328125,
0.042144775390625,
-0.03118896484375,
0.0245361328125,
0.0112457275390625,
-0.0007891654968261719,
-0.0035552978515625,
0.028656005859375,
-0.068603515625,
0.08697509765625,
0.0195770263671875,
-0.0592041015625,
0.01253509521484375,
-0.053863525390625,
-0.0227203369140625,
0.0011301040649414062,
0.016357421875,
-0.067138671875,
0.00278472900390625,
0.00490570068359375,
0.0147552490234375,
-0.025848388671875,
0.0028743743896484375,
-0.0118560791015625,
-0.029266357421875,
0.019500732421875,
-0.017791748046875,
0.057159423828125,
0.022308349609375,
-0.02960205078125,
-0.003810882568359375,
-0.07769775390625,
0.0068511962890625,
0.00646209716796875,
-0.0267181396484375,
-0.00492095947265625,
-0.0278472900390625,
0.027801513671875,
0.0357666015625,
0.032684326171875,
-0.050262451171875,
0.020263671875,
-0.0203704833984375,
0.05999755859375,
0.0276031494140625,
-0.019256591796875,
0.032440185546875,
-0.0213165283203125,
0.0310516357421875,
0.01641845703125,
0.009765625,
-0.010467529296875,
-0.014434814453125,
-0.07373046875,
-0.0208282470703125,
0.0167999267578125,
0.049285888671875,
-0.07122802734375,
0.0352783203125,
-0.0267486572265625,
-0.0498046875,
-0.036407470703125,
-0.005077362060546875,
0.04815673828125,
0.057037353515625,
0.0279083251953125,
-0.044769287109375,
-0.0311431884765625,
-0.048126220703125,
-0.00817108154296875,
-0.018157958984375,
0.004489898681640625,
0.020843505859375,
0.048828125,
-0.0203094482421875,
0.052154541015625,
-0.0272064208984375,
-0.060394287109375,
-0.01904296875,
0.021575927734375,
0.0129241943359375,
0.049285888671875,
0.031280517578125,
-0.0474853515625,
-0.04388427734375,
0.0016202926635742188,
-0.0513916015625,
0.00861358642578125,
-0.00594329833984375,
-0.011077880859375,
0.044189453125,
0.031036376953125,
-0.0523681640625,
0.056671142578125,
0.033660888671875,
-0.035736083984375,
0.0413818359375,
-0.038726806640625,
0.00971221923828125,
-0.0819091796875,
0.0308380126953125,
0.006267547607421875,
-0.0167083740234375,
-0.057769775390625,
0.0152740478515625,
0.015625,
-0.020965576171875,
-0.0205078125,
0.0443115234375,
-0.060272216796875,
-0.0015325546264648438,
-0.005077362060546875,
0.005481719970703125,
0.008636474609375,
0.051025390625,
0.03271484375,
0.056884765625,
0.01666259765625,
-0.0243377685546875,
0.01373291015625,
0.0297698974609375,
-0.029693603515625,
0.0472412109375,
-0.073486328125,
0.04266357421875,
-0.004852294921875,
-0.00040984153747558594,
-0.057861328125,
0.01282501220703125,
0.0241241455078125,
-0.035247802734375,
0.031890869140625,
0.003353118896484375,
-0.040771484375,
-0.061767578125,
-0.004955291748046875,
0.032867431640625,
0.029937744140625,
-0.04559326171875,
0.08447265625,
0.01441192626953125,
0.0245819091796875,
-0.036712646484375,
-0.0797119140625,
0.004741668701171875,
-0.010009765625,
-0.05072021484375,
0.035919189453125,
-0.0173797607421875,
0.0168304443359375,
-0.00861358642578125,
0.008148193359375,
-0.01078033447265625,
-0.0308837890625,
0.010009765625,
0.040985107421875,
-0.019989013671875,
-0.017425537109375,
-0.03753662109375,
-0.01363372802734375,
-0.020965576171875,
-0.0164642333984375,
0.04559326171875,
-0.025634765625,
0.0030078887939453125,
-0.046234130859375,
0.0139923095703125,
0.06109619140625,
-0.04229736328125,
0.051177978515625,
0.051971435546875,
-0.021820068359375,
0.004180908203125,
-0.051422119140625,
-0.006427764892578125,
-0.035675048828125,
0.0257110595703125,
-0.0229034423828125,
-0.056365966796875,
0.062744140625,
0.029937744140625,
-0.0080108642578125,
0.031463623046875,
0.0308837890625,
0.0011920928955078125,
0.064208984375,
0.053802490234375,
0.00212860107421875,
0.0643310546875,
-0.03997802734375,
0.0170745849609375,
-0.069580078125,
-0.0316162109375,
-0.032989501953125,
-0.0318603515625,
-0.038818359375,
-0.0189056396484375,
0.0289306640625,
-0.006961822509765625,
-0.0157318115234375,
0.0419921875,
-0.0767822265625,
0.023040771484375,
0.058441162109375,
0.034271240234375,
0.01361846923828125,
0.0115509033203125,
-0.017242431640625,
0.00942230224609375,
-0.027191162109375,
-0.03900146484375,
0.05828857421875,
0.0121307373046875,
0.054962158203125,
-0.01477813720703125,
0.039093017578125,
0.0171051025390625,
0.006114959716796875,
-0.05926513671875,
0.04791259765625,
-0.02001953125,
-0.03900146484375,
-0.004932403564453125,
-0.02166748046875,
-0.0706787109375,
-0.0025234222412109375,
-0.0350341796875,
-0.0595703125,
0.04876708984375,
0.03265380859375,
-0.00885009765625,
0.038848876953125,
-0.0482177734375,
0.06781005859375,
-0.025390625,
-0.025299072265625,
0.0166168212890625,
-0.06207275390625,
-0.000736236572265625,
0.0128936767578125,
-0.0177001953125,
0.0259857177734375,
0.012847900390625,
0.07080078125,
-0.058868408203125,
0.054229736328125,
-0.00771331787109375,
0.0006155967712402344,
0.0421142578125,
0.00020134449005126953,
0.04742431640625,
-0.045166015625,
-0.0174560546875,
0.044281005859375,
0.007549285888671875,
-0.01081085205078125,
-0.024322509765625,
0.020477294921875,
-0.0689697265625,
-0.0138397216796875,
-0.0621337890625,
-0.04852294921875,
0.0212249755859375,
0.0396728515625,
0.05706787109375,
0.049346923828125,
-0.003536224365234375,
-0.00127410888671875,
0.040374755859375,
-0.003940582275390625,
0.036712646484375,
0.024505615234375,
0.0021038055419921875,
-0.0574951171875,
0.058837890625,
0.0143890380859375,
0.023590087890625,
0.035400390625,
0.0131378173828125,
-0.0152740478515625,
-0.02960205078125,
-0.0234375,
0.0367431640625,
-0.045440673828125,
-0.017791748046875,
-0.0328369140625,
-0.0256500244140625,
-0.0244293212890625,
-0.01519775390625,
-0.01120758056640625,
-0.020538330078125,
-0.05364990234375,
0.0206146240234375,
0.0289306640625,
0.0419921875,
0.00838470458984375,
0.059295654296875,
-0.061370849609375,
0.0288848876953125,
-0.00047206878662109375,
0.0272674560546875,
0.005062103271484375,
-0.046356201171875,
-0.0161285400390625,
-0.0005083084106445312,
-0.0311737060546875,
-0.051116943359375,
0.057861328125,
0.0290069580078125,
0.0182952880859375,
0.037139892578125,
-0.0016698837280273438,
0.057098388671875,
-0.038299560546875,
0.039794921875,
0.035125732421875,
-0.0760498046875,
0.0264129638671875,
-0.004180908203125,
0.0245208740234375,
0.032318115234375,
0.006801605224609375,
-0.047698974609375,
-0.01172637939453125,
-0.043975830078125,
-0.0423583984375,
0.0789794921875,
-0.00035881996154785156,
-0.00751495361328125,
0.023193359375,
0.038330078125,
-0.01885986328125,
0.015838623046875,
-0.07464599609375,
-0.01499176025390625,
-0.022430419921875,
-0.050567626953125,
-0.011383056640625,
-0.02752685546875,
0.01139068603515625,
-0.0199432373046875,
0.0335693359375,
-0.00360870361328125,
0.056884765625,
0.040435791015625,
-0.03656005859375,
-0.00917816162109375,
-0.00894927978515625,
0.05078125,
0.0290679931640625,
-0.01479339599609375,
0.01387786865234375,
-0.006221771240234375,
-0.08526611328125,
-0.0017004013061523438,
0.010009765625,
-0.0271759033203125,
0.0007605552673339844,
0.0408935546875,
0.08245849609375,
-0.02001953125,
-0.039031982421875,
0.039093017578125,
-0.001514434814453125,
-0.02435302734375,
-0.0301361083984375,
-0.003986358642578125,
-0.0243377685546875,
0.01476287841796875,
0.036407470703125,
0.01415252685546875,
-0.000732421875,
-0.0394287109375,
-0.0027923583984375,
0.04168701171875,
-0.048431396484375,
-0.0220947265625,
0.048126220703125,
-0.007251739501953125,
-0.04595947265625,
0.0635986328125,
0.0020008087158203125,
-0.0682373046875,
0.0572509765625,
0.04986572265625,
0.051727294921875,
-0.020599365234375,
0.005954742431640625,
0.0465087890625,
0.034942626953125,
-0.007110595703125,
0.02093505859375,
-0.005977630615234375,
-0.051910400390625,
0.0238494873046875,
-0.041778564453125,
-0.0186920166015625,
0.0026798248291015625,
-0.050567626953125,
0.03900146484375,
-0.04705810546875,
-0.027618408203125,
-0.0052337646484375,
0.01087188720703125,
-0.049652099609375,
0.027313232421875,
-0.0009813308715820312,
0.06256103515625,
-0.037261962890625,
0.05596923828125,
0.040252685546875,
-0.03057861328125,
-0.05499267578125,
-0.0099029541015625,
-0.01065826416015625,
-0.07855224609375,
0.043426513671875,
0.0271759033203125,
-0.005870819091796875,
0.01163482666015625,
-0.0455322265625,
-0.060638427734375,
0.09356689453125,
0.0157318115234375,
-0.05157470703125,
-0.0269317626953125,
0.0285797119140625,
0.05889892578125,
-0.02740478515625,
0.045440673828125,
0.026336669921875,
0.0184478759765625,
0.03533935546875,
-0.06207275390625,
0.00951385498046875,
-0.029266357421875,
0.0181884765625,
0.01727294921875,
-0.054107666015625,
0.0635986328125,
-0.03802490234375,
-0.0185699462890625,
0.0374755859375,
0.048095703125,
0.0164947509765625,
0.023040771484375,
0.0255889892578125,
0.051727294921875,
0.0526123046875,
-0.01136016845703125,
0.06207275390625,
-0.0285797119140625,
0.0261383056640625,
0.05926513671875,
-0.0008296966552734375,
0.0565185546875,
0.033172607421875,
0.0013027191162109375,
0.05059814453125,
0.0286712646484375,
-0.03814697265625,
0.0299835205078125,
-0.00656890869140625,
0.0017480850219726562,
0.007762908935546875,
0.0049591064453125,
-0.0166473388671875,
0.0251312255859375,
0.0117950439453125,
-0.057861328125,
0.0045013427734375,
0.019073486328125,
-0.00921630859375,
-0.018035888671875,
-0.03363037109375,
0.055328369140625,
0.00032639503479003906,
-0.0478515625,
0.05059814453125,
-0.0013704299926757812,
0.0679931640625,
-0.050048828125,
-0.0026874542236328125,
-0.0024738311767578125,
0.044586181640625,
-0.01104736328125,
-0.05706787109375,
0.00940704345703125,
-0.005748748779296875,
-0.01641845703125,
0.01027679443359375,
0.05401611328125,
-0.048675537109375,
-0.0662841796875,
0.019683837890625,
-0.00824737548828125,
0.015594482421875,
0.019073486328125,
-0.05804443359375,
0.01318359375,
-0.001953125,
-0.01398468017578125,
-0.0003046989440917969,
0.035888671875,
-0.0026607513427734375,
0.0413818359375,
0.0445556640625,
0.0098724365234375,
0.0103759765625,
-0.0178680419921875,
0.04840087890625,
-0.044189453125,
-0.038421630859375,
-0.05462646484375,
0.039093017578125,
0.0034008026123046875,
-0.0455322265625,
0.040985107421875,
0.042877197265625,
0.045623779296875,
-0.021240234375,
0.0289306640625,
-0.01004791259765625,
0.0218048095703125,
-0.026275634765625,
0.0750732421875,
-0.048980712890625,
-0.01482391357421875,
-0.04388427734375,
-0.061798095703125,
-0.04461669921875,
0.07232666015625,
-0.01641845703125,
0.0225982666015625,
0.04827880859375,
0.08099365234375,
-0.00682830810546875,
-0.0224609375,
0.00782012939453125,
0.0185089111328125,
0.006275177001953125,
0.0523681640625,
0.03338623046875,
-0.062255859375,
0.058990478515625,
-0.022308349609375,
-0.01036834716796875,
-0.018463134765625,
-0.0660400390625,
-0.08489990234375,
-0.0567626953125,
-0.0290985107421875,
-0.055908203125,
-0.0092620849609375,
0.047088623046875,
0.053131103515625,
-0.064453125,
-0.012451171875,
-0.0123138427734375,
-0.0017948150634765625,
-0.00934600830078125,
-0.0160980224609375,
0.043792724609375,
0.0255584716796875,
-0.058319091796875,
-0.039947509765625,
-0.0097503662109375,
0.037139892578125,
0.006977081298828125,
-0.014190673828125,
-0.0091400146484375,
-0.0065765380859375,
0.032989501953125,
0.042144775390625,
-0.039642333984375,
-0.0047454833984375,
0.01074981689453125,
-0.020538330078125,
0.03729248046875,
0.046356201171875,
-0.04388427734375,
0.0309600830078125,
0.033721923828125,
0.0094757080078125,
0.057861328125,
-0.01052093505859375,
0.0071563720703125,
-0.0246124267578125,
0.021636962890625,
0.0131072998046875,
0.036468505859375,
0.033203125,
-0.038848876953125,
0.02801513671875,
0.0290679931640625,
-0.0428466796875,
-0.06597900390625,
-0.01471710205078125,
-0.10260009765625,
0.01256561279296875,
0.06268310546875,
-0.00594329833984375,
-0.03900146484375,
0.0187225341796875,
-0.0291595458984375,
0.03350830078125,
-0.0279541015625,
0.045135498046875,
0.033233642578125,
0.0098419189453125,
-0.0474853515625,
0.004642486572265625,
0.01849365234375,
-0.00751495361328125,
-0.04608154296875,
-0.005725860595703125,
0.033203125,
0.0255126953125,
0.0517578125,
0.042999267578125,
-0.0246124267578125,
0.0176239013671875,
0.0017423629760742188,
0.052947998046875,
-0.018768310546875,
-0.02276611328125,
-0.03533935546875,
0.0011348724365234375,
-0.00797271728515625,
-0.00844573974609375
]
] |
PulsarAI/EnsembleV5-Nova-13B | 2023-10-20T10:05:31.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | PulsarAI | null | null | PulsarAI/EnsembleV5-Nova-13B | 0 | 11,443 | transformers | 2023-09-04T20:33:04 | ---
license: cc-by-nc-4.0
---
<a href="https://www.buymeacoffee.com/PulsarAI" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;"></a>
# EnsembleV5-Nova-13B
EnsembleV5-Nova-13B is a merge of [yontaek/llama-2-13B-ensemble-v5](https://huggingface.co/yontaek/llama-2-13B-ensemble-v5) and [Nova-13B-Lora](https://huggingface.co/PulsarAI/Nova-13B-Lora).
**Note:** The Nova-13B-Lora model cannot currently be found on the Hub, so the link above is broken.
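The exact merge recipe is not documented in this card. Purely as an illustration, here is a minimal sketch of combining two same-architecture checkpoints by plain per-tensor weight averaging (model ids as linked above, one of which is currently unavailable):
```python
import torch
from transformers import AutoModelForCausalLM

# load both 13B checkpoints in fp16 (this needs substantial RAM)
base = AutoModelForCausalLM.from_pretrained(
    "yontaek/llama-2-13B-ensemble-v5", torch_dtype=torch.float16)
other = AutoModelForCausalLM.from_pretrained(
    "PulsarAI/Nova-13B-Lora", torch_dtype=torch.float16)

other_state = other.state_dict()
merged = {name: (param + other_state[name]) / 2  # simple 50/50 average
          for name, param in base.state_dict().items()}

base.load_state_dict(merged)
base.save_pretrained("EnsembleV5-Nova-13B")
```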
# Evaluation Results ([Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard))
| Metric | Value |
|-----------------------|-------|
| Avg. | 62.98 |
| ARC (25-shot) | 62.71 |
| HellaSwag (10-shot) | 82.55 |
| MMLU (5-shot) | 56.79 |
| TruthfulQA (0-shot) | 49.86 |
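For reference, the reported average is the mean of the four benchmark scores: (62.71 + 82.55 + 56.79 + 49.86) / 4 ≈ 62.98.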
| 898 | [
[
-0.036285400390625,
-0.036895751953125,
0.0294952392578125,
0.018951416015625,
-0.0307464599609375,
0.01373291015625,
0.00136566162109375,
-0.047637939453125,
0.067138671875,
0.0140533447265625,
-0.053375244140625,
-0.0384521484375,
-0.046905517578125,
0.00872802734375,
-0.00872802734375,
0.05682373046875,
0.00016438961029052734,
0.008697509765625,
0.0006036758422851562,
-0.0031299591064453125,
-0.033172607421875,
-0.01525115966796875,
-0.0693359375,
-0.0297393798828125,
0.03790283203125,
0.0216217041015625,
0.055877685546875,
0.0067901611328125,
0.051605224609375,
0.034912109375,
-0.01331329345703125,
0.0170440673828125,
-0.033599853515625,
0.0177154541015625,
0.0142059326171875,
-0.040740966796875,
-0.059844970703125,
0.0045166015625,
0.05889892578125,
0.0266265869140625,
-0.008209228515625,
0.01474761962890625,
0.01708984375,
0.023590087890625,
-0.041961669921875,
0.0116424560546875,
-0.02069091796875,
0.007801055908203125,
-0.00319671630859375,
0.004657745361328125,
-0.0193939208984375,
-0.0430908203125,
-0.0178375244140625,
-0.06219482421875,
-0.00957489013671875,
0.022674560546875,
0.10443115234375,
0.0196990966796875,
-0.006877899169921875,
-0.0006461143493652344,
-0.046539306640625,
0.0498046875,
-0.05389404296875,
0.02374267578125,
0.0061798095703125,
0.046478271484375,
-0.0157012939453125,
-0.047760009765625,
-0.0294952392578125,
0.01038360595703125,
-0.01502227783203125,
0.024688720703125,
-0.05670166015625,
-0.0421142578125,
-0.0210113525390625,
0.038818359375,
-0.04278564453125,
0.01092529296875,
-0.05401611328125,
-0.003978729248046875,
0.044586181640625,
0.0211029052734375,
0.0206451416015625,
-0.00417327880859375,
-0.039520263671875,
-0.0271759033203125,
-0.036224365234375,
-0.0083465576171875,
0.036651611328125,
0.0295257568359375,
-0.023468017578125,
0.048309326171875,
-0.0036373138427734375,
0.056915283203125,
0.038055419921875,
-0.0179901123046875,
0.041534423828125,
-0.04168701171875,
-0.0413818359375,
-0.00592803955078125,
0.0733642578125,
0.0401611328125,
-0.0189208984375,
0.006877899169921875,
0.00958251953125,
-0.00701904296875,
0.0064849853515625,
-0.037445068359375,
-0.0008702278137207031,
0.01346588134765625,
-0.043212890625,
-0.03594970703125,
0.011627197265625,
-0.046661376953125,
0.01239013671875,
0.0019140243530273438,
0.034332275390625,
-0.0249481201171875,
-0.033294677734375,
0.0150146484375,
-0.029693603515625,
0.0386962890625,
0.0298614501953125,
-0.033721923828125,
0.0181427001953125,
0.02557373046875,
0.055267333984375,
-0.005573272705078125,
-0.02606201171875,
0.01361083984375,
-0.005279541015625,
-0.016387939453125,
0.060791015625,
-0.034027099609375,
-0.05377197265625,
-0.0142669677734375,
0.022125244140625,
-0.01561737060546875,
-0.039306640625,
0.0611572265625,
-0.0238189697265625,
0.0208892822265625,
-0.0266571044921875,
-0.022186279296875,
-0.01151275634765625,
0.0299072265625,
-0.048858642578125,
0.07806396484375,
-0.0087890625,
-0.048187255859375,
0.0323486328125,
-0.03228759765625,
-0.00513458251953125,
-0.0022373199462890625,
0.01323699951171875,
-0.039215087890625,
-0.0111236572265625,
-0.0080413818359375,
0.032989501953125,
-0.00885009765625,
-0.0205535888671875,
-0.055999755859375,
-0.0193939208984375,
0.012939453125,
-0.018280029296875,
0.054718017578125,
0.0145721435546875,
0.015228271484375,
-0.01343536376953125,
-0.06732177734375,
-0.019775390625,
0.0523681640625,
-0.0272216796875,
-0.0198211669921875,
-0.03143310546875,
0.01462554931640625,
0.0202789306640625,
0.055206298828125,
-0.019866943359375,
0.0242767333984375,
0.005645751953125,
0.005401611328125,
0.055908203125,
-0.0018291473388671875,
0.033905029296875,
-0.04620361328125,
0.044219970703125,
0.0105743408203125,
0.039947509765625,
0.018585205078125,
-0.05377197265625,
-0.07012939453125,
-0.032440185546875,
0.01161956787109375,
0.0263214111328125,
-0.0235443115234375,
0.06707763671875,
0.005161285400390625,
-0.06494140625,
-0.0628662109375,
-0.021148681640625,
0.012237548828125,
0.036895751953125,
0.02337646484375,
-0.0142364501953125,
-0.053497314453125,
-0.08892822265625,
0.008575439453125,
-0.00931549072265625,
0.0165557861328125,
0.04248046875,
0.03472900390625,
-0.0212249755859375,
0.038543701171875,
-0.048004150390625,
-0.0282440185546875,
-0.004058837890625,
-0.005519866943359375,
0.04425048828125,
0.041259765625,
0.06683349609375,
-0.033477783203125,
-0.030426025390625,
0.007030487060546875,
-0.06719970703125,
-0.0113372802734375,
0.0335693359375,
-0.0155181884765625,
-0.00847625732421875,
0.0002503395080566406,
-0.04022216796875,
0.03851318359375,
0.03961181640625,
-0.03924560546875,
0.05352783203125,
-0.01273345947265625,
0.054229736328125,
-0.1046142578125,
0.0015859603881835938,
0.01934814453125,
-0.01450347900390625,
-0.0237579345703125,
0.00991058349609375,
-0.0226593017578125,
-0.0008516311645507812,
-0.06561279296875,
0.0361328125,
-0.025390625,
-0.0200042724609375,
-0.01157379150390625,
0.00251007080078125,
0.0182342529296875,
0.0283050537109375,
-0.019805908203125,
0.03302001953125,
0.046051025390625,
-0.0273590087890625,
0.01904296875,
0.033935546875,
-0.010009765625,
0.048797607421875,
-0.0350341796875,
-0.0213623046875,
-0.00494384765625,
0.03802490234375,
-0.09063720703125,
-0.0165557861328125,
0.025726318359375,
-0.030792236328125,
0.01361083984375,
-0.0003428459167480469,
-0.0352783203125,
-0.03076171875,
-0.0258636474609375,
0.039886474609375,
0.05596923828125,
-0.0433349609375,
0.038848876953125,
0.0180206298828125,
0.01044464111328125,
-0.04486083984375,
-0.0556640625,
-0.019866943359375,
-0.03985595703125,
-0.050079345703125,
0.0316162109375,
0.005199432373046875,
-0.0184478759765625,
0.0110321044921875,
0.017333984375,
-0.01168060302734375,
-0.006069183349609375,
0.026092529296875,
0.0458984375,
-0.0251312255859375,
-0.0281524658203125,
0.003612518310546875,
-0.021636962890625,
-0.00909423828125,
0.0037994384765625,
0.050079345703125,
-0.0192718505859375,
-0.0133514404296875,
-0.072021484375,
0.00904083251953125,
0.061981201171875,
0.0022144317626953125,
0.06219482421875,
0.055450439453125,
-0.0352783203125,
0.01490020751953125,
-0.045196533203125,
-0.01522064208984375,
-0.036285400390625,
-0.00992584228515625,
-0.035491943359375,
-0.058135986328125,
0.06158447265625,
0.004116058349609375,
-0.002483367919921875,
0.06591796875,
0.03436279296875,
-0.0224609375,
0.06939697265625,
0.02850341796875,
-0.008453369140625,
0.021575927734375,
-0.04595947265625,
0.018829345703125,
-0.06573486328125,
-0.047607421875,
-0.04974365234375,
-0.055999755859375,
-0.04168701171875,
-0.0247955322265625,
0.0266571044921875,
0.00836944580078125,
-0.02850341796875,
0.06793212890625,
-0.044830322265625,
0.026458740234375,
0.03546142578125,
0.034912109375,
0.0340576171875,
-0.0031604766845703125,
-0.0168914794921875,
-0.0201568603515625,
-0.040679931640625,
-0.01361083984375,
0.050689697265625,
0.022369384765625,
0.05731201171875,
0.031463623046875,
0.05401611328125,
0.0142059326171875,
-0.0142059326171875,
-0.049041748046875,
0.035614013671875,
-0.0006122589111328125,
-0.03955078125,
0.0008378028869628906,
-0.0053863525390625,
-0.08612060546875,
0.0256805419921875,
-0.048004150390625,
-0.047088623046875,
0.0290985107421875,
0.003932952880859375,
-0.031890869140625,
0.0240936279296875,
-0.041351318359375,
0.048553466796875,
0.000028192996978759766,
-0.0304412841796875,
-0.01442718505859375,
-0.017578125,
0.035675048828125,
0.0097503662109375,
0.0133514404296875,
-0.00521087646484375,
0.01082611083984375,
0.0401611328125,
-0.047210693359375,
0.04949951171875,
0.016693115234375,
0.002685546875,
0.032623291015625,
0.00844573974609375,
0.0280303955078125,
-0.01364898681640625,
0.002002716064453125,
0.01439666748046875,
-0.03271484375,
-0.033416748046875,
-0.041107177734375,
0.07867431640625,
-0.06280517578125,
-0.04351806640625,
-0.0323486328125,
-0.006641387939453125,
0.0131072998046875,
-0.0032100677490234375,
0.037139892578125,
0.00534820556640625,
-0.0347900390625,
0.01373291015625,
0.0241241455078125,
0.01094818115234375,
0.048431396484375,
0.0162200927734375,
-0.032623291015625,
-0.031982421875,
0.039215087890625,
-0.00841522216796875,
0.01055145263671875,
0.021392822265625,
0.01421356201171875,
-0.03948974609375,
-0.0046234130859375,
-0.0005464553833007812,
0.0222320556640625,
-0.03863525390625,
-0.0198974609375,
-0.035736083984375,
-0.0238037109375,
-0.0272369384765625,
-0.033782958984375,
-0.032470703125,
-0.039215087890625,
-0.048797607421875,
-0.0206298828125,
0.043853759765625,
0.06195068359375,
-0.023406982421875,
0.040496826171875,
-0.0362548828125,
-0.031341552734375,
0.029449462890625,
-0.006641387939453125,
0.002727508544921875,
-0.056304931640625,
0.0122222900390625,
-0.0157623291015625,
-0.0428466796875,
-0.066650390625,
0.0452880859375,
-0.0220489501953125,
0.029052734375,
0.039764404296875,
-0.036712646484375,
0.06793212890625,
-0.02471923828125,
0.060791015625,
0.0611572265625,
-0.033935546875,
0.04058837890625,
-0.021575927734375,
0.00852203369140625,
0.03997802734375,
0.03363037109375,
-0.0015859603881835938,
-0.0301513671875,
-0.07904052734375,
-0.058135986328125,
0.0297698974609375,
0.0367431640625,
-0.023406982421875,
0.006435394287109375,
-0.00342559814453125,
0.0033016204833984375,
-0.00099945068359375,
-0.056488037109375,
-0.034423828125,
-0.0015964508056640625,
0.00536346435546875,
-0.0008625984191894531,
0.00799560546875,
-0.0005044937133789062,
-0.0308990478515625,
0.06170654296875,
0.0052032470703125,
0.01128387451171875,
0.001529693603515625,
0.022979736328125,
-0.01058197021484375,
-0.0069732666015625,
0.029052734375,
0.02471923828125,
-0.03387451171875,
-0.0203399658203125,
0.0178680419921875,
-0.013580322265625,
0.005519866943359375,
-0.010711669921875,
0.005870819091796875,
0.01103973388671875,
0.004364013671875,
0.06817626953125,
0.0509033203125,
-0.0178680419921875,
0.0287933349609375,
0.0003247261047363281,
-0.01552581787109375,
-0.0223236083984375,
0.01085662841796875,
0.004993438720703125,
0.047210693359375,
0.02301025390625,
0.0216522216796875,
0.018768310546875,
-0.03179931640625,
0.018280029296875,
0.04595947265625,
-0.026763916015625,
-0.0278167724609375,
0.055389404296875,
0.00884246826171875,
-0.037689208984375,
0.027557373046875,
-0.0048828125,
-0.01422882080078125,
0.06427001953125,
0.056121826171875,
0.05364990234375,
-0.02001953125,
0.01438140869140625,
0.057098388671875,
0.0108489990234375,
-0.014923095703125,
0.0159454345703125,
0.01788330078125,
-0.048553466796875,
-0.003246307373046875,
-0.0474853515625,
-0.0196990966796875,
0.0176849365234375,
-0.050445556640625,
0.05877685546875,
-0.0309295654296875,
-0.03277587890625,
0.0142364501953125,
0.00661468505859375,
-0.049835205078125,
0.033050537109375,
0.007511138916015625,
0.0606689453125,
-0.08746337890625,
0.04931640625,
0.03997802734375,
-0.05389404296875,
-0.052947998046875,
-0.03253173828125,
0.003215789794921875,
-0.0902099609375,
0.028594970703125,
-0.022552490234375,
0.01129913330078125,
-0.0220489501953125,
-0.043853759765625,
-0.07208251953125,
0.11541748046875,
0.0017309188842773438,
-0.04119873046875,
0.006023406982421875,
-0.0226287841796875,
0.02056884765625,
-0.04864501953125,
0.0457763671875,
0.032623291015625,
0.044464111328125,
0.043060302734375,
-0.07513427734375,
-0.007442474365234375,
-0.013824462890625,
-0.01543426513671875,
0.0257568359375,
-0.1015625,
0.06854248046875,
0.00525665283203125,
0.015838623046875,
0.0116119384765625,
0.034027099609375,
0.048309326171875,
0.00980377197265625,
0.0287017822265625,
0.092529296875,
0.040130615234375,
-0.0187835693359375,
0.06390380859375,
-0.003925323486328125,
0.04913330078125,
0.062408447265625,
-0.023681640625,
0.07421875,
0.026641845703125,
-0.032470703125,
0.0477294921875,
0.07452392578125,
-0.01546478271484375,
0.041107177734375,
-0.017730712890625,
-0.008758544921875,
-0.017486572265625,
-0.0014972686767578125,
-0.051239013671875,
0.035614013671875,
0.0184478759765625,
-0.0418701171875,
-0.03875732421875,
-0.0325927734375,
0.0094146728515625,
-0.011962890625,
-0.01035308837890625,
0.042724609375,
0.0108184814453125,
-0.035308837890625,
0.0460205078125,
-0.0178985595703125,
0.0506591796875,
-0.042144775390625,
0.00400543212890625,
-0.0105743408203125,
0.00498199462890625,
-0.019317626953125,
-0.059906005859375,
0.0234222412109375,
-0.0166168212890625,
-0.0112762451171875,
0.005016326904296875,
0.061126708984375,
-0.0142974853515625,
-0.060150146484375,
0.033935546875,
0.039947509765625,
0.0257568359375,
0.041290283203125,
-0.07708740234375,
0.024505615234375,
0.0034942626953125,
0.0076446533203125,
0.0274658203125,
-0.0064544677734375,
0.0253753662109375,
0.04315185546875,
0.05718994140625,
-0.01214599609375,
0.006084442138671875,
-0.02362060546875,
0.048553466796875,
-0.0303802490234375,
-0.024932861328125,
-0.066162109375,
0.0394287109375,
-0.0214080810546875,
-0.049591064453125,
0.050689697265625,
0.06695556640625,
0.036712646484375,
-0.0171966552734375,
0.0445556640625,
0.00891876220703125,
0.004505157470703125,
-0.0233306884765625,
0.058990478515625,
-0.072509765625,
-0.0038356781005859375,
-0.00091552734375,
-0.0780029296875,
-0.01296234130859375,
0.05267333984375,
0.0122222900390625,
-0.008148193359375,
0.039764404296875,
0.04901123046875,
-0.0298004150390625,
-0.004734039306640625,
0.0240478515625,
0.033294677734375,
-0.0088043212890625,
0.048187255859375,
0.051971435546875,
-0.06658935546875,
0.02288818359375,
-0.007411956787109375,
-0.0266571044921875,
-0.03790283203125,
-0.06317138671875,
-0.050994873046875,
-0.030029296875,
-0.03802490234375,
-0.048492431640625,
-0.004215240478515625,
0.057464599609375,
0.053497314453125,
-0.046966552734375,
-0.0259857177734375,
0.01401519775390625,
0.009033203125,
-0.0025234222412109375,
-0.017059326171875,
0.0113372802734375,
0.002582550048828125,
-0.07757568359375,
0.035400390625,
0.0312042236328125,
0.031402587890625,
-0.02081298828125,
-0.00945281982421875,
-0.0312042236328125,
0.0103912353515625,
0.0227203369140625,
0.034088134765625,
-0.0626220703125,
0.00136566162109375,
-0.003955841064453125,
0.005352020263671875,
0.026275634765625,
-0.00154876708984375,
-0.036224365234375,
-0.01026153564453125,
0.051513671875,
0.01393890380859375,
0.043243408203125,
-0.00998687744140625,
0.02294921875,
-0.04083251953125,
0.032135009765625,
-0.002910614013671875,
0.0762939453125,
0.0052642822265625,
-0.0157623291015625,
0.03515625,
0.028228759765625,
-0.04229736328125,
-0.050689697265625,
0.00015985965728759766,
-0.10968017578125,
-0.00893402099609375,
0.0811767578125,
0.00434112548828125,
-0.035614013671875,
0.0355224609375,
-0.0156097412109375,
0.0011720657348632812,
-0.054718017578125,
0.04864501953125,
0.06298828125,
-0.006366729736328125,
0.005218505859375,
0.003673553466796875,
0.0031528472900390625,
0.0191650390625,
-0.04840087890625,
-0.06170654296875,
0.026214599609375,
0.0194244384765625,
0.041778564453125,
0.0295257568359375,
-0.017822265625,
0.016937255859375,
-0.002185821533203125,
0.0107879638671875,
0.0294952392578125,
-0.01273345947265625,
-0.0079803466796875,
0.0194549560546875,
-0.01108551025390625,
-0.052642822265625
]
] |
SmilingWolf/wd-v1-4-swinv2-tagger-v2 | 2023-03-23T16:59:48.000Z | [
"keras",
"onnx",
"license:apache-2.0",
"has_space",
"region:us"
] | null | SmilingWolf | null | null | SmilingWolf/wd-v1-4-swinv2-tagger-v2 | 43 | 11,427 | keras | 2023-01-21T11:58:41 | ---
license: apache-2.0
---
# WD 1.4 SwinV2 Tagger V2
Supports ratings, characters and general tags.
Trained using https://github.com/SmilingWolf/SW-CV-ModelZoo.
TPUs used for training kindly provided by the [TRC program](https://sites.research.google/trc/about/).
## Dataset
Last image id: 5944504
Trained on Danbooru images whose IDs fall in modulo buckets 0000-0899.
Validated on images whose IDs fall in modulo buckets 0950-0999.
Images with less than 10 general tags were filtered out.
Tags with less than 600 images were filtered out.
## Validation results
`P=R: threshold = 0.3771, F1 = 0.6854`
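Here `P=R` denotes the score threshold at which precision equals recall on the validation set. As an illustration, a hedged sketch of applying that threshold to the ONNX export (the file name, input layout, raw-pixel preprocessing, and output semantics are assumptions, not confirmed by this card):
```python
import numpy as np
import onnxruntime as ort
from PIL import Image

THRESHOLD = 0.3771  # P=R threshold reported above

session = ort.InferenceSession("model.onnx")   # assumed file name
inp = session.get_inputs()[0]
_, height, width, _ = inp.shape                # assumed NHWC input layout

image = Image.open("example.png").convert("RGB").resize((width, height))
batch = np.asarray(image, dtype=np.float32)[None, ...]  # raw 0-255 pixels (preprocessing assumed)

probs = session.run(None, {inp.name: batch})[0][0]      # assumed per-tag probabilities
selected = [i for i, p in enumerate(probs) if p >= THRESHOLD]
print(selected)  # indices map to the tag list shipped with the repo
```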
## Final words
Subject to change and updates.
Downstream users are encouraged to use tagged releases rather than relying on the head of the repo. | 730 | [
[
-0.03741455078125,
-0.013031005859375,
0.002777099609375,
0.007213592529296875,
-0.054718017578125,
-0.012786865234375,
-0.004673004150390625,
-0.05010986328125,
0.00992584228515625,
0.0296478271484375,
-0.05047607421875,
-0.071044921875,
-0.036529541015625,
-0.006534576416015625,
-0.005741119384765625,
0.08966064453125,
0.00916290283203125,
0.006069183349609375,
-0.0241851806640625,
-0.032745361328125,
-0.0321044921875,
-0.036773681640625,
-0.05364990234375,
-0.0116729736328125,
0.07427978515625,
0.0340576171875,
0.033447265625,
0.0157623291015625,
0.06671142578125,
0.012054443359375,
0.007190704345703125,
0.0162353515625,
-0.0261383056640625,
0.00742340087890625,
-0.032257080078125,
-0.0137786865234375,
-0.039031982421875,
0.01194000244140625,
0.017333984375,
-0.0016326904296875,
-0.00768280029296875,
0.0316162109375,
-0.0239105224609375,
0.04296875,
-0.034027099609375,
0.0117034912109375,
-0.042388916015625,
-0.00525665283203125,
-0.0248260498046875,
-0.0121307373046875,
-0.0219879150390625,
-0.00435638427734375,
0.01053619384765625,
-0.05633544921875,
0.01325225830078125,
0.004924774169921875,
0.10418701171875,
-0.0006518363952636719,
-0.033905029296875,
-0.00899505615234375,
-0.043701171875,
0.060760498046875,
-0.043701171875,
-0.0019855499267578125,
0.045806884765625,
0.0433349609375,
-0.004924774169921875,
-0.057464599609375,
-0.031585693359375,
-0.006626129150390625,
0.0151214599609375,
0.007843017578125,
-0.0261077880859375,
0.00977325439453125,
0.043731689453125,
0.01605224609375,
-0.05706787109375,
0.035247802734375,
-0.0300140380859375,
-0.0308074951171875,
0.053497314453125,
0.0029144287109375,
0.01171875,
-0.0191650390625,
-0.037567138671875,
-0.0307769775390625,
-0.0219573974609375,
0.025238037109375,
0.032806396484375,
-0.005062103271484375,
-0.0147552490234375,
0.055633544921875,
-0.00982666015625,
0.023193359375,
-0.01500701904296875,
-0.0183258056640625,
0.051483154296875,
-0.0106048583984375,
-0.022796630859375,
-0.046478271484375,
0.0537109375,
0.0634765625,
0.02490234375,
0.009918212890625,
-0.04248046875,
0.0216217041015625,
0.01480865478515625,
-0.04278564453125,
-0.021240234375,
-0.0031757354736328125,
-0.0286712646484375,
-0.0433349609375,
0.0357666015625,
-0.03826904296875,
-0.0253143310546875,
0.006591796875,
0.03033447265625,
-0.0236968994140625,
-0.0445556640625,
-0.01605224609375,
-0.0689697265625,
0.04864501953125,
0.04248046875,
-0.043304443359375,
-0.003955841064453125,
0.038909912109375,
0.0626220703125,
0.0018854141235351562,
-0.00826263427734375,
-0.0312042236328125,
-0.0012712478637695312,
-0.0209503173828125,
0.062225341796875,
-0.0222625732421875,
-0.04840087890625,
0.017242431640625,
0.0166015625,
0.033538818359375,
-0.01386260986328125,
0.061309814453125,
-0.0458984375,
-0.00016927719116210938,
-0.029571533203125,
-0.022308349609375,
-0.0174560546875,
0.032806396484375,
-0.039947509765625,
0.08416748046875,
0.04547119140625,
-0.0677490234375,
0.04241943359375,
-0.037139892578125,
-0.043670654296875,
0.033447265625,
0.0021514892578125,
-0.0316162109375,
0.0003421306610107422,
-0.001964569091796875,
0.0253143310546875,
-0.0001417398452758789,
0.0034236907958984375,
-0.030792236328125,
-0.02667236328125,
0.016448974609375,
-0.0014104843139648438,
0.030975341796875,
0.021820068359375,
-0.00193023681640625,
0.0222625732421875,
-0.041473388671875,
0.007781982421875,
0.009521484375,
-0.01012420654296875,
-0.034698486328125,
-0.01198577880859375,
0.03546142578125,
0.028076171875,
0.00208282470703125,
-0.047119140625,
0.035369873046875,
-0.007617950439453125,
0.022918701171875,
0.048370361328125,
0.022003173828125,
0.0304718017578125,
-0.017730712890625,
0.0537109375,
0.0251007080078125,
0.023773193359375,
-0.0018701553344726562,
-0.0491943359375,
-0.052520751953125,
-0.022918701171875,
0.03277587890625,
0.0276336669921875,
-0.08758544921875,
0.057098388671875,
-0.01641845703125,
-0.07666015625,
-0.01983642578125,
-0.016204833984375,
0.0171966552734375,
0.033782958984375,
0.02667236328125,
-0.045501708984375,
-0.043243408203125,
-0.067138671875,
0.0125274658203125,
-0.0181121826171875,
-0.038787841796875,
0.0290985107421875,
0.046783447265625,
-0.046600341796875,
0.06787109375,
-0.0322265625,
-0.053253173828125,
-0.0262603759765625,
0.036773681640625,
0.0038318634033203125,
0.033538818359375,
0.0625,
-0.05194091796875,
-0.021942138671875,
-0.007122039794921875,
-0.0306396484375,
0.00269317626953125,
0.0030670166015625,
-0.01241302490234375,
0.0357666015625,
0.0008783340454101562,
-0.04248046875,
0.0511474609375,
0.024749755859375,
-0.01221466064453125,
0.04656982421875,
-0.038726806640625,
0.01258087158203125,
-0.059051513671875,
-0.00574493408203125,
0.06207275390625,
-0.0330810546875,
-0.0244903564453125,
-0.00917816162109375,
0.0243682861328125,
0.0151214599609375,
-0.046112060546875,
0.0292510986328125,
-0.0125732421875,
-0.011962890625,
-0.014923095703125,
0.0130615234375,
0.0233612060546875,
0.035003662109375,
0.004344940185546875,
0.0258636474609375,
0.062347412109375,
-0.0494384765625,
0.04046630859375,
0.0147247314453125,
-0.04345703125,
0.038909912109375,
-0.057769775390625,
0.00373077392578125,
-0.01523590087890625,
0.040496826171875,
-0.06768798828125,
-0.0266265869140625,
0.01409912109375,
-0.047515869140625,
0.038909912109375,
-0.033416748046875,
-0.04376220703125,
-0.0718994140625,
-0.0548095703125,
0.0039215087890625,
0.05596923828125,
-0.04302978515625,
0.0136566162109375,
0.037353515625,
0.0284423828125,
-0.041900634765625,
-0.0775146484375,
-0.0035686492919921875,
-0.00588226318359375,
-0.044464111328125,
0.00962066650390625,
-0.00470733642578125,
-0.00504302978515625,
-0.020904541015625,
-0.0016632080078125,
-0.01386260986328125,
-0.013702392578125,
0.0284881591796875,
0.045989990234375,
0.00727081298828125,
0.0196990966796875,
0.0020084381103515625,
-0.0160980224609375,
-0.00743865966796875,
-0.01258087158203125,
0.0203857421875,
-0.00457763671875,
0.007106781005859375,
-0.0249176025390625,
0.0020751953125,
0.035980224609375,
-0.036407470703125,
0.0218048095703125,
0.08782958984375,
-0.033416748046875,
-0.020538330078125,
-0.033447265625,
0.01319122314453125,
-0.035369873046875,
0.04547119140625,
-0.034515380859375,
-0.053985595703125,
0.035247802734375,
0.024658203125,
0.007251739501953125,
0.059051513671875,
0.028076171875,
-0.055755615234375,
0.076416015625,
0.0423583984375,
0.002643585205078125,
0.03277587890625,
-0.03240966796875,
-0.00421142578125,
-0.0704345703125,
-0.033111572265625,
-0.0300140380859375,
-0.0309295654296875,
-0.0694580078125,
-0.041046142578125,
0.007671356201171875,
0.004314422607421875,
-0.00045299530029296875,
0.04571533203125,
-0.05804443359375,
0.031646728515625,
0.03875732421875,
0.00980377197265625,
-0.0161590576171875,
-0.003307342529296875,
0.0065155029296875,
-0.01184844970703125,
-0.01555633544921875,
-0.025604248046875,
0.05596923828125,
0.051788330078125,
0.0777587890625,
-0.002410888671875,
0.0233154296875,
0.053466796875,
0.015411376953125,
-0.07330322265625,
0.047393798828125,
-0.0261383056640625,
-0.049560546875,
-0.01187896728515625,
-0.015625,
-0.041290283203125,
0.007251739501953125,
-0.023162841796875,
-0.0350341796875,
0.0206451416015625,
0.0002231597900390625,
0.01800537109375,
0.0296630859375,
-0.045684814453125,
0.0498046875,
-0.00896453857421875,
0.019134521484375,
-0.01134490966796875,
-0.06353759765625,
0.0170440673828125,
0.020263671875,
-0.006710052490234375,
-0.0450439453125,
-0.007904052734375,
0.06304931640625,
-0.0276641845703125,
0.04193115234375,
-0.046722412109375,
0.005859375,
0.0233917236328125,
-0.01528167724609375,
0.03533935546875,
0.0203399658203125,
0.0126190185546875,
0.0389404296875,
0.006389617919921875,
-0.01093292236328125,
-0.0106201171875,
0.053009033203125,
-0.0625,
-0.0089874267578125,
-0.05999755859375,
-0.018218994140625,
0.00600433349609375,
-0.0006322860717773438,
0.052459716796875,
0.034912109375,
-0.0191802978515625,
0.00862884521484375,
0.047088623046875,
-0.01143646240234375,
0.041900634765625,
0.03240966796875,
-0.0019483566284179688,
-0.038909912109375,
0.059234619140625,
0.00859832763671875,
-0.00638580322265625,
0.017242431640625,
-0.001743316650390625,
-0.0262908935546875,
-0.03167724609375,
-0.0299224853515625,
0.020965576171875,
-0.068115234375,
-0.060943603515625,
-0.03472900390625,
-0.025177001953125,
-0.0380859375,
-0.0015649795532226562,
-0.00841522216796875,
-0.041259765625,
-0.046783447265625,
-0.018798828125,
0.04730224609375,
0.057586669921875,
0.0027065277099609375,
0.0166778564453125,
-0.04840087890625,
0.01666259765625,
0.00443267822265625,
0.0355224609375,
-0.024810791015625,
-0.06439208984375,
-0.0263671875,
0.008880615234375,
-0.0144195556640625,
-0.033782958984375,
0.027984619140625,
0.02593994140625,
0.031982421875,
0.038330078125,
0.012664794921875,
0.040130615234375,
-0.0267486572265625,
0.08734130859375,
0.038360595703125,
-0.053070068359375,
0.05517578125,
-0.0247802734375,
0.021240234375,
0.038055419921875,
0.032745361328125,
-0.046783447265625,
-0.0231170654296875,
-0.03826904296875,
-0.05352783203125,
0.04248046875,
-0.00383758544921875,
0.0084228515625,
0.003208160400390625,
0.036590576171875,
0.00492095947265625,
0.0004925727844238281,
-0.05181884765625,
-0.0168609619140625,
-0.038177490234375,
-0.01241302490234375,
0.0131378173828125,
-0.042266845703125,
-0.0018243789672851562,
-0.0318603515625,
0.053985595703125,
-0.01241302490234375,
0.0017147064208984375,
0.0260162353515625,
-0.02215576171875,
-0.01580810546875,
0.0039215087890625,
0.0526123046875,
0.036224365234375,
-0.028045654296875,
-0.00579071044921875,
-0.01471710205078125,
-0.043609619140625,
-0.01052093505859375,
-0.0211944580078125,
-0.00809478759765625,
0.0159454345703125,
0.01213836669921875,
0.06787109375,
-0.002590179443359375,
-0.02105712890625,
0.0604248046875,
-0.0302276611328125,
-0.04644775390625,
-0.0310516357421875,
0.028076171875,
-0.0174407958984375,
0.00881195068359375,
0.031280517578125,
0.03857421875,
0.004047393798828125,
-0.0093994140625,
0.0085906982421875,
0.040557861328125,
-0.0462646484375,
-0.03546142578125,
0.035614013671875,
0.03179931640625,
-0.01361083984375,
0.058807373046875,
-0.036529541015625,
-0.0306854248046875,
0.054840087890625,
0.0227203369140625,
0.0654296875,
-0.0016193389892578125,
0.0290069580078125,
0.05841064453125,
0.028289794921875,
0.0021610260009765625,
0.0261383056640625,
-0.00110626220703125,
-0.036407470703125,
0.0013561248779296875,
-0.04632568359375,
-0.0380859375,
0.006122589111328125,
-0.07098388671875,
0.048736572265625,
-0.037841796875,
-0.03436279296875,
0.007518768310546875,
0.0162811279296875,
-0.064208984375,
0.03533935546875,
0.0321044921875,
0.091552734375,
-0.0513916015625,
0.10382080078125,
0.053253173828125,
-0.045501708984375,
-0.046112060546875,
-0.028961181640625,
-0.01079559326171875,
-0.03778076171875,
0.0152130126953125,
0.049652099609375,
0.0120086669921875,
-0.01093292236328125,
-0.06646728515625,
-0.0355224609375,
0.09613037109375,
-0.0263671875,
-0.043670654296875,
0.00640869140625,
-0.01690673828125,
0.050750732421875,
-0.03729248046875,
0.02606201171875,
0.0333251953125,
0.048492431640625,
0.0287933349609375,
-0.056304931640625,
-0.0308990478515625,
-0.025787353515625,
0.03289794921875,
-0.00969696044921875,
-0.032989501953125,
0.03741455078125,
-0.036529541015625,
-0.02679443359375,
0.022552490234375,
0.055267333984375,
0.004367828369140625,
0.02142333984375,
0.04559326171875,
0.047821044921875,
0.0482177734375,
-0.038177490234375,
0.06903076171875,
0.032501220703125,
0.04083251953125,
0.06793212890625,
-0.028533935546875,
0.052459716796875,
0.0291748046875,
-0.00505828857421875,
0.055084228515625,
0.07763671875,
-0.0648193359375,
0.057891845703125,
0.00966644287109375,
0.00640106201171875,
0.005096435546875,
-0.0183563232421875,
-0.02740478515625,
0.025482177734375,
0.0308837890625,
-0.0041351318359375,
0.0073089599609375,
0.0305328369140625,
-0.0182952880859375,
-0.02740478515625,
-0.0367431640625,
0.0582275390625,
-0.00231170654296875,
-0.026275634765625,
0.0173797607421875,
0.0301513671875,
0.08282470703125,
-0.0797119140625,
0.00008320808410644531,
-0.01097869873046875,
0.0007228851318359375,
-0.031951904296875,
-0.085205078125,
0.002410888671875,
0.004589080810546875,
-0.0158843994140625,
0.002628326416015625,
0.08551025390625,
-0.0423583984375,
-0.029693603515625,
0.01326751708984375,
-0.005161285400390625,
0.00994873046875,
-0.00708770751953125,
-0.04669189453125,
0.01131439208984375,
-0.0001392364501953125,
-0.0230712890625,
0.01024627685546875,
0.0206146240234375,
-0.0108795166015625,
0.045013427734375,
0.032379150390625,
-0.0028362274169921875,
-0.0236968994140625,
0.038330078125,
0.07232666015625,
-0.056610107421875,
-0.039642333984375,
-0.022552490234375,
0.052825927734375,
-0.0167999267578125,
-0.03265380859375,
0.0557861328125,
0.032470703125,
0.0634765625,
-0.0248870849609375,
0.059112548828125,
-0.02215576171875,
0.02093505859375,
0.0015506744384765625,
0.059478759765625,
-0.034912109375,
-0.0122222900390625,
-0.011016845703125,
-0.052093505859375,
-0.02532958984375,
0.041046142578125,
0.01306915283203125,
-0.0055084228515625,
0.04193115234375,
0.045684814453125,
0.0210723876953125,
-0.0195770263671875,
0.038055419921875,
-0.01314544677734375,
0.047576904296875,
0.0189208984375,
0.044586181640625,
-0.04913330078125,
0.039215087890625,
-0.0251007080078125,
-0.027740478515625,
-0.0204925537109375,
-0.0582275390625,
-0.07891845703125,
-0.034820556640625,
-0.042633056640625,
-0.03851318359375,
-0.035003662109375,
0.050811767578125,
0.05743408203125,
-0.04083251953125,
0.020263671875,
0.0162506103515625,
0.00626373291015625,
0.0013780593872070312,
-0.0176239013671875,
0.0020599365234375,
-0.0019311904907226562,
-0.036163330078125,
-0.00481414794921875,
0.0306549072265625,
0.0266876220703125,
-0.03399658203125,
-0.0237579345703125,
-0.004886627197265625,
-0.00170135498046875,
0.033447265625,
0.0143585205078125,
-0.042999267578125,
-0.03045654296875,
-0.00809478759765625,
-0.039794921875,
0.0204010009765625,
0.050994873046875,
-0.041168212890625,
0.035736083984375,
0.0447998046875,
-0.01055145263671875,
0.046142578125,
0.005741119384765625,
-0.00040411949157714844,
-0.08502197265625,
0.03326416015625,
0.02392578125,
0.0286102294921875,
0.039459228515625,
-0.0177001953125,
0.04400634765625,
0.032684326171875,
-0.047119140625,
-0.055084228515625,
0.00473785400390625,
-0.10638427734375,
0.0114288330078125,
0.08526611328125,
-0.0287933349609375,
-0.0216217041015625,
0.0124053955078125,
-0.01080322265625,
0.01543426513671875,
-0.031646728515625,
0.01035308837890625,
0.046630859375,
0.028045654296875,
-0.011566162109375,
-0.0194549560546875,
0.032135009765625,
-0.0237884521484375,
-0.05303955078125,
-0.01995849609375,
0.0322265625,
0.0259552001953125,
0.0001456737518310547,
0.040802001953125,
-0.0175628662109375,
0.0426025390625,
-0.0014591217041015625,
0.038360595703125,
-0.01422119140625,
-0.01383209228515625,
-0.02679443359375,
-0.0305633544921875,
0.0055084228515625,
-0.03594970703125
]
] |
THUDM/chatglm-6b-int4 | 2023-07-08T03:06:02.000Z | [
"transformers",
"pytorch",
"chatglm",
"glm",
"thudm",
"custom_code",
"zh",
"en",
"endpoints_compatible",
"has_space",
"region:us"
] | null | THUDM | null | null | THUDM/chatglm-6b-int4 | 380 | 11,423 | transformers | 2023-03-19T12:01:56 | ---
language:
- zh
- en
tags:
- glm
- chatglm
- thudm
---
# ChatGLM-6B-INT4
<p align="center">
👋 Join our <a href="https://join.slack.com/t/chatglm/shared_invite/zt-1udqapmrr-ocT1DS_mxWe6dDY8ahRWzg" target="_blank">Slack</a> and <a href="https://github.com/THUDM/ChatGLM-6B/blob/main/resources/WECHAT.md" target="_blank">WeChat</a>
</p>
## Introduction
ChatGLM-6B is an open-source dialogue language model that supports bilingual question answering in Chinese and English. Built on the [General Language Model (GLM)](https://github.com/THUDM/GLM) architecture, it has 6.2 billion parameters. Combined with model quantization, it can be deployed locally on consumer-grade graphics cards (requiring as little as 6GB of VRAM at the INT4 quantization level). ChatGLM-6B uses the same technology as [ChatGLM](https://chatglm.cn) and is optimized for Chinese question answering and dialogue. After bilingual Chinese-English training on roughly 1T tokens, supplemented by supervised fine-tuning, feedback bootstrapping, and reinforcement learning from human feedback, the 6.2-billion-parameter ChatGLM-6B can already generate answers that align quite well with human preferences.
ChatGLM-6B-INT4 contains the quantized weights of ChatGLM-6B. Specifically, the 28 GLM Blocks of ChatGLM-6B are quantized to INT4, while the Embedding and LM Head are left unquantized. In theory, the quantized model needs only 6GB of VRAM (or RAM when running on the CPU) for inference, making it possible to run on embedded devices such as a Raspberry Pi.
When running on the CPU, a CPU kernel is compiled automatically for the hardware. Please make sure GCC and OpenMP are installed (usually preinstalled on Linux; Windows requires manual installation) to get the best parallel performance. A CPU loading sketch follows the usage example below.
## Software Dependencies
```shell
pip install protobuf transformers==4.27.1 cpm_kernels
```
## Usage
The ChatGLM-6B model can be invoked to generate a conversation with the following code:
```ipython
>>> from transformers import AutoTokenizer, AutoModel
>>> tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b-int4", trust_remote_code=True)
>>> model = AutoModel.from_pretrained("THUDM/chatglm-6b-int4", trust_remote_code=True).half().cuda()
>>> response, history = model.chat(tokenizer, "你好", history=[])
>>> print(response)
你好👋!我是人工智能助手 ChatGLM-6B,很高兴见到你,欢迎问我任何问题。
>>> response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=history)
>>> print(response)
晚上睡不着可能会让你感到焦虑或不舒服,但以下是一些可以帮助你入睡的方法:
1. 制定规律的睡眠时间表:保持规律的睡眠时间表可以帮助你建立健康的睡眠习惯,使你更容易入睡。尽量在每天的相同时间上床,并在同一时间起床。
2. 创造一个舒适的睡眠环境:确保睡眠环境舒适,安静,黑暗且温度适宜。可以使用舒适的床上用品,并保持房间通风。
3. 放松身心:在睡前做些放松的活动,例如泡个热水澡,听些轻柔的音乐,阅读一些有趣的书籍等,有助于缓解紧张和焦虑,使你更容易入睡。
4. 避免饮用含有咖啡因的饮料:咖啡因是一种刺激性物质,会影响你的睡眠质量。尽量避免在睡前饮用含有咖啡因的饮料,例如咖啡,茶和可乐。
5. 避免在床上做与睡眠无关的事情:在床上做些与睡眠无关的事情,例如看电影,玩游戏或工作等,可能会干扰你的睡眠。
6. 尝试呼吸技巧:深呼吸是一种放松技巧,可以帮助你缓解紧张和焦虑,使你更容易入睡。试着慢慢吸气,保持几秒钟,然后缓慢呼气。
如果这些方法无法帮助你入睡,你可以考虑咨询医生或睡眠专家,寻求进一步的建议。
```
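To run inference on the CPU instead, replace `.half().cuda()` with `.float()` when loading; a minimal sketch using the standard transformers API:
```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b-int4", trust_remote_code=True)
# .float() keeps the model on the CPU; the INT4 CPU kernel is compiled
# automatically at load time (requires GCC and OpenMP, as noted above)
model = AutoModel.from_pretrained("THUDM/chatglm-6b-int4", trust_remote_code=True).float()
```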
For more usage instructions, including how to run the command-line and web demos and how to use model quantization to save GPU memory, please refer to our [Github Repo](https://github.com/THUDM/ChatGLM-6B).
## License
The code in this repository is open-sourced under the [Apache-2.0](LICENSE) license; use of the ChatGLM-6B model weights must follow the [Model License](MODEL_LICENSE).
## Citation
If you find our work helpful, please consider citing the following papers:
```
@inproceedings{
zeng2023glm-130b,
title={{GLM}-130B: An Open Bilingual Pre-trained Model},
author={Aohan Zeng and Xiao Liu and Zhengxiao Du and Zihan Wang and Hanyu Lai and Ming Ding and Zhuoyi Yang and Yifan Xu and Wendi Zheng and Xiao Xia and Weng Lam Tam and Zixuan Ma and Yufei Xue and Jidong Zhai and Wenguang Chen and Zhiyuan Liu and Peng Zhang and Yuxiao Dong and Jie Tang},
booktitle={The Eleventh International Conference on Learning Representations (ICLR)},
year={2023},
url={https://openreview.net/forum?id=-Aw0rrrPUF}
}
```
```
@inproceedings{du2022glm,
title={GLM: General Language Model Pretraining with Autoregressive Blank Infilling},
author={Du, Zhengxiao and Qian, Yujie and Liu, Xiao and Ding, Ming and Qiu, Jiezhong and Yang, Zhilin and Tang, Jie},
booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
pages={320--335},
year={2022}
}
``` | 3,201 | [
[
-0.04180908203125,
-0.055206298828125,
0.005062103271484375,
0.02685546875,
-0.0302734375,
0.003894805908203125,
-0.0218353271484375,
-0.027252197265625,
0.01438140869140625,
0.0058441162109375,
-0.035491943359375,
-0.039215087890625,
-0.04241943359375,
-0.00669097900390625,
-0.0145721435546875,
0.06494140625,
0.000008106231689453125,
0.002651214599609375,
0.0023441314697265625,
-0.004459381103515625,
-0.04656982421875,
-0.0478515625,
-0.0472412109375,
-0.01555633544921875,
-0.00019073486328125,
0.00467681884765625,
0.043975830078125,
0.0217437744140625,
0.02838134765625,
0.0308990478515625,
-0.0030231475830078125,
0.00904083251953125,
-0.03369140625,
-0.01496124267578125,
0.0159149169921875,
-0.03338623046875,
-0.0469970703125,
0.005489349365234375,
0.048309326171875,
0.0281524658203125,
-0.00902557373046875,
0.0258026123046875,
0.034454345703125,
0.044708251953125,
-0.03717041015625,
0.036651611328125,
-0.036590576171875,
0.0019054412841796875,
-0.00998687744140625,
-0.0176544189453125,
-0.016998291015625,
-0.035430908203125,
-0.0034236907958984375,
-0.03533935546875,
-0.0001285076141357422,
0.00445556640625,
0.09765625,
-0.006519317626953125,
-0.016571044921875,
-0.0105438232421875,
-0.040557861328125,
0.07012939453125,
-0.083251953125,
0.0195159912109375,
0.02435302734375,
0.02581787109375,
-0.01910400390625,
-0.05377197265625,
-0.045135498046875,
-0.0106353759765625,
-0.03497314453125,
0.02130126953125,
-0.01377105712890625,
-0.0075531005859375,
0.01454925537109375,
0.0291748046875,
-0.041412353515625,
0.0009760856628417969,
-0.0311737060546875,
-0.019256591796875,
0.04608154296875,
0.017669677734375,
0.0467529296875,
-0.004638671875,
-0.023193359375,
-0.006252288818359375,
-0.030548095703125,
0.0279388427734375,
0.0244598388671875,
0.027923583984375,
-0.050933837890625,
0.021087646484375,
-0.0045166015625,
0.043701171875,
0.01105499267578125,
-0.0174102783203125,
0.0391845703125,
-0.0458984375,
-0.0250701904296875,
-0.023284912109375,
0.09625244140625,
0.02386474609375,
0.00969696044921875,
0.006134033203125,
-0.0097503662109375,
-0.005741119384765625,
-0.00930023193359375,
-0.0614013671875,
-0.0017786026000976562,
0.018157958984375,
-0.045013427734375,
-0.01476287841796875,
0.001567840576171875,
-0.05450439453125,
0.01515960693359375,
-0.00576019287109375,
0.04443359375,
-0.03173828125,
-0.03717041015625,
0.0081787109375,
-0.0007348060607910156,
0.0217132568359375,
0.01491546630859375,
-0.0753173828125,
0.040740966796875,
0.024932861328125,
0.054840087890625,
-0.0083465576171875,
-0.029327392578125,
-0.01078033447265625,
0.0089874267578125,
-0.01390838623046875,
0.0245361328125,
-0.00524139404296875,
-0.03985595703125,
-0.001224517822265625,
-0.0034008026123046875,
-0.03009033203125,
-0.02178955078125,
0.0194091796875,
-0.0219573974609375,
0.051910400390625,
-0.01499176025390625,
-0.03656005859375,
-0.0225677490234375,
0.0275421142578125,
-0.02569580078125,
0.0670166015625,
-0.005657196044921875,
-0.07562255859375,
-0.0071258544921875,
-0.0518798828125,
-0.0181732177734375,
-0.004322052001953125,
-0.024871826171875,
-0.024932861328125,
-0.0199737548828125,
0.032073974609375,
0.030059814453125,
-0.029693603515625,
0.00719451904296875,
-0.01335906982421875,
-0.027679443359375,
0.0230865478515625,
-0.028778076171875,
0.0931396484375,
0.01763916015625,
-0.037200927734375,
0.0196075439453125,
-0.035675048828125,
0.026611328125,
0.0233001708984375,
-0.014678955078125,
-0.0051116943359375,
-0.0017938613891601562,
-0.0173187255859375,
0.03045654296875,
0.037628173828125,
-0.020904541015625,
0.009490966796875,
-0.045013427734375,
0.0325927734375,
0.06591796875,
-0.0078277587890625,
0.03515625,
-0.033660888671875,
0.032562255859375,
0.016571044921875,
0.034820556640625,
-0.01432037353515625,
-0.041473388671875,
-0.0789794921875,
-0.012115478515625,
0.01416015625,
0.05633544921875,
-0.046783447265625,
0.06427001953125,
-0.014251708984375,
-0.038726806640625,
-0.038055419921875,
0.0181732177734375,
0.040252685546875,
0.034759521484375,
0.043121337890625,
-0.0167694091796875,
-0.0438232421875,
-0.0526123046875,
-0.012939453125,
-0.0296478271484375,
0.003749847412109375,
0.0266571044921875,
0.040252685546875,
-0.0256500244140625,
0.06951904296875,
-0.03656005859375,
-0.0251007080078125,
-0.02142333984375,
0.0059967041015625,
0.0310211181640625,
0.05322265625,
0.05621337890625,
-0.047607421875,
-0.0738525390625,
-0.002307891845703125,
-0.0675048828125,
0.01348114013671875,
0.00675201416015625,
-0.02581787109375,
0.038330078125,
0.021728515625,
-0.04266357421875,
0.04205322265625,
0.0482177734375,
-0.033355712890625,
0.050628662109375,
-0.020782470703125,
0.001979827880859375,
-0.084716796875,
0.00018155574798583984,
-0.0197296142578125,
0.005950927734375,
-0.044525146484375,
-0.0102386474609375,
-0.0022983551025390625,
0.01551055908203125,
-0.034393310546875,
0.072265625,
-0.052642822265625,
0.01947021484375,
0.00027680397033691406,
0.006015777587890625,
-0.0162200927734375,
0.06268310546875,
-0.0162353515625,
0.044158935546875,
0.061767578125,
-0.05029296875,
0.021728515625,
0.0178985595703125,
-0.0085601806640625,
0.002643585205078125,
-0.056854248046875,
0.01403045654296875,
0.0009317398071289062,
0.0158538818359375,
-0.09075927734375,
0.00292205810546875,
0.0479736328125,
-0.054290771484375,
0.0236663818359375,
-0.00311279296875,
-0.029144287109375,
-0.048828125,
-0.0308990478515625,
0.0183563232421875,
0.056671142578125,
-0.02130126953125,
0.049530029296875,
0.025726318359375,
-0.002277374267578125,
-0.052459716796875,
-0.043060302734375,
-0.016632080078125,
-0.0107574462890625,
-0.058258056640625,
0.015533447265625,
-0.02520751953125,
0.00441741943359375,
0.0016813278198242188,
0.01361083984375,
-0.0007462501525878906,
-0.00421142578125,
0.0172119140625,
0.03521728515625,
-0.006805419921875,
-0.0132598876953125,
-0.0150299072265625,
-0.00501251220703125,
0.007427215576171875,
-0.00426483154296875,
0.05169677734375,
-0.03277587890625,
-0.038482666015625,
-0.0435791015625,
0.02392578125,
0.033599853515625,
-0.01302337646484375,
0.06085205078125,
0.06622314453125,
-0.01476287841796875,
0.01419830322265625,
-0.05438232421875,
-0.01317596435546875,
-0.041290283203125,
0.0184173583984375,
-0.0078277587890625,
-0.067138671875,
0.0677490234375,
0.028228759765625,
0.0175323486328125,
0.0477294921875,
0.049560546875,
0.001903533935546875,
0.0843505859375,
0.0408935546875,
-0.029327392578125,
0.03985595703125,
-0.046234130859375,
0.0205841064453125,
-0.050445556640625,
-0.0209197998046875,
-0.039947509765625,
-0.022552490234375,
-0.0482177734375,
-0.030426025390625,
0.01428985595703125,
0.006381988525390625,
-0.0311126708984375,
0.01012420654296875,
-0.041229248046875,
-0.00136566162109375,
0.039703369140625,
-0.004238128662109375,
0.004711151123046875,
-0.01027679443359375,
-0.0284576416015625,
0.0008096694946289062,
-0.052947998046875,
-0.0289306640625,
0.066162109375,
0.031280517578125,
0.055938720703125,
0.01194000244140625,
0.052978515625,
-0.0029315948486328125,
0.03314208984375,
-0.040374755859375,
0.050567626953125,
0.0037078857421875,
-0.051666259765625,
-0.033050537109375,
-0.03619384765625,
-0.0701904296875,
0.03765869140625,
-0.00823974609375,
-0.0712890625,
0.007747650146484375,
0.00937652587890625,
-0.0194854736328125,
0.027740478515625,
-0.05426025390625,
0.061187744140625,
-0.0338134765625,
-0.01593017578125,
-0.0032253265380859375,
-0.04620361328125,
0.0343017578125,
0.0147705078125,
0.036163330078125,
-0.0145416259765625,
0.00765228271484375,
0.06201171875,
-0.045654296875,
0.0699462890625,
-0.0273590087890625,
0.003421783447265625,
0.041259765625,
-0.007175445556640625,
0.04693603515625,
0.004589080810546875,
0.01297760009765625,
0.0208892822265625,
-0.0014705657958984375,
-0.032073974609375,
-0.036895751953125,
0.04779052734375,
-0.072265625,
-0.060089111328125,
-0.03961181640625,
-0.0256805419921875,
-0.01160430908203125,
0.028106689453125,
0.035430908203125,
0.015228271484375,
0.0001926422119140625,
0.00992584228515625,
0.0183868408203125,
-0.031585693359375,
0.056884765625,
0.040863037109375,
-0.037109375,
-0.039581298828125,
0.05560302734375,
0.015716552734375,
0.0225982666015625,
0.00997161865234375,
0.00951385498046875,
-0.027130126953125,
-0.0386962890625,
-0.022979736328125,
0.03179931640625,
-0.03717041015625,
-0.0117034912109375,
-0.0521240234375,
-0.039703369140625,
-0.054656982421875,
0.00438690185546875,
-0.0247802734375,
-0.003185272216796875,
-0.03009033203125,
-0.00264739990234375,
0.0191650390625,
0.0164794921875,
-0.00835418701171875,
0.0092315673828125,
-0.06964111328125,
0.024627685546875,
0.0248260498046875,
0.026763916015625,
0.02374267578125,
-0.05206298828125,
-0.039306640625,
0.03338623046875,
-0.0183563232421875,
-0.04632568359375,
0.0474853515625,
0.00817108154296875,
0.066650390625,
0.03350830078125,
0.00030303001403808594,
0.0626220703125,
-0.02166748046875,
0.06658935546875,
0.03662109375,
-0.07489013671875,
0.032073974609375,
-0.0259246826171875,
0.019805908203125,
0.0170135498046875,
0.034942626953125,
-0.045379638671875,
-0.03302001953125,
-0.056854248046875,
-0.06439208984375,
0.06146240234375,
0.0280609130859375,
0.0423583984375,
-0.0006613731384277344,
0.00005835294723510742,
-0.017913818359375,
0.01309967041015625,
-0.062347412109375,
-0.05438232421875,
-0.02606201171875,
-0.00513458251953125,
0.004558563232421875,
-0.0272979736328125,
-0.00687408447265625,
-0.051177978515625,
0.056365966796875,
0.006572723388671875,
0.04852294921875,
0.003681182861328125,
0.002689361572265625,
0.0020160675048828125,
0.0140533447265625,
0.043670654296875,
0.054931640625,
-0.0175018310546875,
-0.00760650634765625,
0.023345947265625,
-0.046600341796875,
0.0031261444091796875,
0.00371551513671875,
-0.0201873779296875,
0.00759124755859375,
0.023773193359375,
0.07977294921875,
0.0104217529296875,
-0.0274658203125,
0.044891357421875,
-0.029876708984375,
-0.01910400390625,
-0.0256500244140625,
0.01392364501953125,
0.0145721435546875,
0.00618743896484375,
0.045684814453125,
-0.021087646484375,
-0.01284027099609375,
-0.04827880859375,
-0.00449371337890625,
0.043426513671875,
-0.017974853515625,
-0.0264739990234375,
0.04754638671875,
0.01288604736328125,
-0.00838470458984375,
0.04193115234375,
-0.0192718505859375,
-0.048828125,
0.040985107421875,
0.04229736328125,
0.06927490234375,
-0.0245819091796875,
0.0083770751953125,
0.055419921875,
0.0096282958984375,
-0.01593017578125,
0.031158447265625,
0.008087158203125,
-0.06732177734375,
-0.024383544921875,
-0.045745849609375,
-0.010467529296875,
0.0171051025390625,
-0.039031982421875,
0.018096923828125,
-0.037994384765625,
-0.0274658203125,
-0.014617919921875,
0.00969696044921875,
-0.044647216796875,
0.00923919677734375,
0.00151824951171875,
0.051300048828125,
-0.0467529296875,
0.06591796875,
0.0386962890625,
-0.039154052734375,
-0.078369140625,
-0.0168304443359375,
-0.0060272216796875,
-0.051055908203125,
0.048492431640625,
-0.00020563602447509766,
-0.0014028549194335938,
0.0008568763732910156,
-0.041412353515625,
-0.085205078125,
0.0816650390625,
0.026397705078125,
-0.0260467529296875,
-0.00952911376953125,
0.00936126708984375,
0.044769287109375,
-0.02154541015625,
0.041107177734375,
0.01312255859375,
0.03790283203125,
0.01012420654296875,
-0.09100341796875,
0.018280029296875,
-0.045257568359375,
0.00482940673828125,
-0.010772705078125,
-0.06707763671875,
0.10491943359375,
-0.0055084228515625,
-0.0230865478515625,
-0.0170135498046875,
0.0584716796875,
0.01403045654296875,
-0.0005125999450683594,
0.033447265625,
0.026885986328125,
0.042572021484375,
-0.0146026611328125,
0.06304931640625,
-0.03509521484375,
0.047149658203125,
0.0728759765625,
0.0117950439453125,
0.04931640625,
0.019500732421875,
-0.03302001953125,
0.045501708984375,
0.0384521484375,
-0.00968170166015625,
0.0374755859375,
0.0017919540405273438,
-0.0178680419921875,
-0.00768280029296875,
0.0180511474609375,
-0.055572509765625,
0.0194091796875,
0.0298614501953125,
-0.01110076904296875,
-0.01377105712890625,
0.0016832351684570312,
0.0162353515625,
-0.0191650390625,
-0.0182037353515625,
0.0616455078125,
0.017120361328125,
-0.050628662109375,
0.084716796875,
-0.0013647079467773438,
0.07708740234375,
-0.059906005859375,
0.00724029541015625,
-0.0218505859375,
0.008697509765625,
-0.0196533203125,
-0.04083251953125,
0.009857177734375,
-0.00844573974609375,
-0.0010595321655273438,
-0.007709503173828125,
0.058197021484375,
-0.034332275390625,
-0.029876708984375,
0.043914794921875,
0.034423828125,
0.0104522705078125,
0.01187896728515625,
-0.0748291015625,
0.00604248046875,
0.0136260986328125,
-0.036407470703125,
0.031341552734375,
0.0261383056640625,
0.0015468597412109375,
0.05828857421875,
0.052764892578125,
0.00612640380859375,
0.01467132568359375,
-0.003917694091796875,
0.07379150390625,
-0.0479736328125,
-0.047821044921875,
-0.07757568359375,
0.055084228515625,
-0.00885772705078125,
-0.0121612548828125,
0.07586669921875,
0.03564453125,
0.06463623046875,
-0.0012683868408203125,
0.06781005859375,
-0.0225677490234375,
0.037750244140625,
-0.02166748046875,
0.064453125,
-0.035308837890625,
0.00850677490234375,
-0.0228424072265625,
-0.0416259765625,
-0.00927734375,
0.0501708984375,
-0.027679443359375,
0.02471923828125,
0.04974365234375,
0.0567626953125,
0.01422882080078125,
-0.0175323486328125,
0.0120849609375,
0.0287628173828125,
0.02569580078125,
0.054046630859375,
0.041015625,
-0.0440673828125,
0.05120849609375,
-0.0311431884765625,
-0.00457000732421875,
-0.0260009765625,
-0.051544189453125,
-0.08526611328125,
-0.046173095703125,
-0.014251708984375,
-0.028106689453125,
-0.01103973388671875,
0.06781005859375,
0.041290283203125,
-0.0657958984375,
-0.044403076171875,
0.01288604736328125,
0.020965576171875,
-0.0287017822265625,
-0.0162811279296875,
0.044708251953125,
-0.0400390625,
-0.0589599609375,
-0.0052947998046875,
0.0010519027709960938,
0.02532958984375,
-0.0251007080078125,
-0.022216796875,
-0.032012939453125,
0.005786895751953125,
0.0288238525390625,
0.030364990234375,
-0.05401611328125,
-0.01273345947265625,
0.00246429443359375,
-0.0328369140625,
0.0105133056640625,
0.0170135498046875,
-0.0394287109375,
0.02734375,
0.04864501953125,
0.0123748779296875,
0.0482177734375,
-0.00844573974609375,
0.031768798828125,
-0.0390625,
0.03131103515625,
0.004787445068359375,
0.0281524658203125,
0.00380706787109375,
-0.02001953125,
0.035919189453125,
0.017913818359375,
-0.0283050537109375,
-0.0672607421875,
-0.0189361572265625,
-0.0670166015625,
-0.0087127685546875,
0.08685302734375,
-0.0269012451171875,
-0.0184326171875,
0.0007505416870117188,
-0.023468017578125,
0.036590576171875,
-0.0270233154296875,
0.065185546875,
0.055328369140625,
-0.006771087646484375,
-0.006778717041015625,
-0.043182373046875,
0.041168212890625,
0.0226287841796875,
-0.062347412109375,
-0.015106201171875,
0.018707275390625,
0.02545166015625,
0.01190185546875,
0.081298828125,
-0.00676727294921875,
0.01021575927734375,
-0.0222015380859375,
0.01605224609375,
-0.0232391357421875,
0.01343536376953125,
-0.0045318603515625,
-0.00870513916015625,
-0.0150299072265625,
-0.0277862548828125
]
] |
TurkuNLP/bert-base-finnish-uncased-v1 | 2021-05-18T22:46:38.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"fi",
"arxiv:1912.07076",
"arxiv:1908.04212",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | TurkuNLP | null | null | TurkuNLP/bert-base-finnish-uncased-v1 | 0 | 11,418 | transformers | 2022-03-02T23:29:05 | ---
language: fi
---
## Quickstart
**Release 1.0** (November 25, 2019)
Download the models here:
* Cased Finnish BERT Base: [bert-base-finnish-cased-v1.zip](http://dl.turkunlp.org/finbert/bert-base-finnish-cased-v1.zip)
* Uncased Finnish BERT Base: [bert-base-finnish-uncased-v1.zip](http://dl.turkunlp.org/finbert/bert-base-finnish-uncased-v1.zip)
We generally recommend the use of the cased model.
Paper presenting Finnish BERT: [arXiv:1912.07076](https://arxiv.org/abs/1912.07076)
## What's this?
A version of Google's [BERT](https://github.com/google-research/bert) deep transfer learning model for Finnish. The model can be fine-tuned to achieve state-of-the-art results for various Finnish natural language processing tasks.
FinBERT features a custom 50,000 wordpiece vocabulary that has much better coverage of Finnish words than e.g. the previously released [multilingual BERT](https://github.com/google-research/bert/blob/master/multilingual.md) models from Google:
| Vocabulary | Example |
|------------|---------|
| FinBERT | Suomessa vaihtuu kesän aikana sekä pääministeri että valtiovarain ##ministeri . |
| Multilingual BERT | Suomessa vai ##htuu kes ##än aikana sekä p ##ää ##minister ##i että valt ##io ##vara ##in ##minister ##i . |
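The segmentation difference shown above can be inspected directly with the `transformers` tokenizers. A minimal sketch follows; the exact wordpiece splits depend on the tokenizer version, the uncased vocabulary additionally lowercases the input, and the multilingual checkpoint id used here is the standard Hugging Face one rather than anything specified in this card.
```python
from transformers import AutoTokenizer

sentence = "Suomessa vaihtuu kesän aikana sekä pääministeri että valtiovarainministeri."

# Compare FinBERT's segmentation with multilingual BERT's on the example sentence above.
finbert_tok = AutoTokenizer.from_pretrained("TurkuNLP/bert-base-finnish-uncased-v1")
mbert_tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
print("FinBERT:", finbert_tok.tokenize(sentence))
print("mBERT:  ", mbert_tok.tokenize(sentence))
```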
FinBERT has been pre-trained for 1 million steps on over 3 billion tokens (24B characters) of Finnish text drawn from news, online discussion, and internet crawls. By contrast, Multilingual BERT was trained on Wikipedia texts, where the Finnish Wikipedia text is approximately 3% of the amount used to train FinBERT.
These features allow FinBERT to outperform not only Multilingual BERT but also all previously proposed models when fine-tuned for Finnish natural language processing tasks.
## Results
### Document classification

FinBERT outperforms multilingual BERT (M-BERT) on document classification over a range of training set sizes on the Yle news (left) and Ylilauta online discussion (right) corpora. (Baseline classification performance with [FastText](https://fasttext.cc/) included for reference.)
[[code](https://github.com/spyysalo/finbert-text-classification)][[Yle data](https://github.com/spyysalo/yle-corpus)] [[Ylilauta data](https://github.com/spyysalo/ylilauta-corpus)]
### Named Entity Recognition
Evaluation on the FiNER corpus ([Ruokolainen et al. 2019](https://arxiv.org/abs/1908.04212))
| Model | Accuracy |
|--------------------|----------|
| **FinBERT** | **92.40%** |
| Multilingual BERT | 90.29% |
| [FiNER-tagger](https://github.com/Traubert/FiNer-rules) (rule-based) | 86.82% |
(FiNER tagger results from [Ruokolainen et al. 2019](https://arxiv.org/pdf/1908.04212.pdf))
[[code](https://github.com/jouniluoma/keras-bert-ner)][[data](https://github.com/mpsilfve/finer-data)]
### Part of speech tagging
Evaluation on three Finnish corpora annotated with [Universal Dependencies](https://universaldependencies.org/) part-of-speech tags: the Turku Dependency Treebank (TDT), FinnTreeBank (FTB), and Parallel UD treebank (PUD)
| Model | TDT | FTB | PUD |
|-------------------|-------------|-------------|-------------|
| **FinBERT** | **98.23%** | **98.39%** | **98.08%** |
| Multilingual BERT | 96.97% | 95.87% | 97.58% |
[[code](https://github.com/spyysalo/bert-pos)][[data](http://hdl.handle.net/11234/1-2837)]
## Use with PyTorch
If you want to use the model with the huggingface/transformers library, follow the steps in [huggingface_transformers.md](https://github.com/TurkuNLP/FinBERT/blob/master/huggingface_transformers.md)
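Independently of those steps, the checkpoint can also be exercised directly through the `fill-mask` pipeline. A minimal sketch follows; the example sentence is illustrative and the returned predictions depend on the checkpoint.
```python
from transformers import pipeline

# Quick sanity check with the standard BERT [MASK] token (uncased model, so lowercase input).
unmasker = pipeline("fill-mask", model="TurkuNLP/bert-base-finnish-uncased-v1")
for prediction in unmasker("helsinki on suomen [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```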
## Previous releases
### Release 0.2
**October 24, 2019** We release a beta version of the BERT base uncased model trained from scratch on a corpus of Finnish news, online discussions, and crawled data.
Download the model here: [bert-base-finnish-uncased.zip](http://dl.turkunlp.org/finbert/bert-base-finnish-uncased.zip)
### Release 0.1
**September 30, 2019** We release a beta version of the BERT base cased model trained from scratch on a corpus of Finnish news, online discussions, and crawled data.
Download the model here: [bert-base-finnish-cased.zip](http://dl.turkunlp.org/finbert/bert-base-finnish-cased.zip)
| 4,375 | [
[
-0.0391845703125,
-0.048828125,
0.0145721435546875,
0.0082855224609375,
-0.0219268798828125,
-0.0086822509765625,
-0.03912353515625,
-0.045989990234375,
0.01861572265625,
0.02593994140625,
-0.036376953125,
-0.055267333984375,
-0.044921875,
-0.003864288330078125,
-0.01751708984375,
0.09783935546875,
-0.001804351806640625,
0.0167083740234375,
-0.005153656005859375,
-0.013763427734375,
-0.0160980224609375,
-0.061248779296875,
-0.0377197265625,
-0.0219268798828125,
0.049041748046875,
0.0184478759765625,
0.022125244140625,
0.0165557861328125,
0.040069580078125,
0.0223388671875,
-0.0128021240234375,
-0.0069122314453125,
-0.0142669677734375,
-0.00005155801773071289,
0.006969451904296875,
-0.0220489501953125,
-0.0313720703125,
0.0011167526245117188,
0.0545654296875,
0.045135498046875,
-0.0035343170166015625,
0.01161956787109375,
-0.0074310302734375,
0.0487060546875,
-0.0298614501953125,
0.016754150390625,
-0.043670654296875,
-0.016693115234375,
-0.0175323486328125,
0.022613525390625,
-0.0394287109375,
-0.0158233642578125,
0.02703857421875,
-0.034515380859375,
0.0257415771484375,
-0.0207977294921875,
0.09600830078125,
0.00815582275390625,
-0.0167083740234375,
-0.0218048095703125,
-0.040069580078125,
0.05804443359375,
-0.052276611328125,
0.05645751953125,
0.02459716796875,
0.0128173828125,
-0.00505828857421875,
-0.06878662109375,
-0.034576416015625,
-0.016510009765625,
-0.01885986328125,
0.0108795166015625,
-0.0171661376953125,
0.0002727508544921875,
0.01071929931640625,
0.01554107666015625,
-0.04425048828125,
0.005252838134765625,
-0.048095703125,
-0.0166473388671875,
0.04803466796875,
-0.0282135009765625,
0.01023101806640625,
-0.034576416015625,
-0.038848876953125,
-0.034271240234375,
-0.0284271240234375,
0.01837158203125,
0.0306243896484375,
0.04278564453125,
-0.0225677490234375,
0.029022216796875,
0.0103607177734375,
0.04400634765625,
-0.004730224609375,
-0.00971221923828125,
0.04681396484375,
-0.01641845703125,
-0.0195465087890625,
0.00586700439453125,
0.0675048828125,
0.01421356201171875,
0.017059326171875,
-0.00725555419921875,
-0.00836944580078125,
0.002777099609375,
0.00907135009765625,
-0.05084228515625,
-0.02020263671875,
0.0272369384765625,
-0.0298614501953125,
-0.009552001953125,
0.0003597736358642578,
-0.04644775390625,
0.0021419525146484375,
-0.0196533203125,
0.04425048828125,
-0.06353759765625,
-0.0230560302734375,
0.01287078857421875,
-0.01418304443359375,
0.025543212890625,
0.020721435546875,
-0.06683349609375,
0.01497650146484375,
0.041107177734375,
0.05523681640625,
-0.01041412353515625,
-0.01727294921875,
-0.012908935546875,
-0.0220947265625,
-0.00759124755859375,
0.051483154296875,
-0.0056610107421875,
-0.01251220703125,
-0.0005016326904296875,
0.01007843017578125,
-0.0225677490234375,
-0.01543426513671875,
0.067138671875,
-0.027374267578125,
0.04541015625,
-0.01580810546875,
-0.049652099609375,
-0.01947021484375,
0.01031494140625,
-0.045684814453125,
0.09161376953125,
0.018707275390625,
-0.074462890625,
0.02587890625,
-0.0521240234375,
-0.0199737548828125,
0.0026302337646484375,
0.00909423828125,
-0.042083740234375,
-0.0059661865234375,
0.00940704345703125,
0.0433349609375,
0.01027679443359375,
0.024017333984375,
-0.018768310546875,
-0.01493072509765625,
0.0012788772583007812,
-0.00962066650390625,
0.093994140625,
0.024139404296875,
-0.0193634033203125,
0.0058135986328125,
-0.06353759765625,
0.0042266845703125,
0.008453369140625,
-0.035888671875,
-0.03009033203125,
-0.01027679443359375,
0.03472900390625,
0.01007080078125,
0.02606201171875,
-0.05133056640625,
0.020233154296875,
-0.0333251953125,
0.0271759033203125,
0.04278564453125,
-0.022247314453125,
0.0235443115234375,
-0.024749755859375,
0.01319122314453125,
-0.00835418701171875,
0.016357421875,
-0.01020050048828125,
-0.046417236328125,
-0.076171875,
-0.04852294921875,
0.06494140625,
0.0460205078125,
-0.04248046875,
0.05474853515625,
-0.02972412109375,
-0.05914306640625,
-0.055816650390625,
-0.0014486312866210938,
0.0297088623046875,
0.03387451171875,
0.0292510986328125,
-0.025299072265625,
-0.049835205078125,
-0.07537841796875,
0.006084442138671875,
-0.032958984375,
0.00238037109375,
0.00823211669921875,
0.04632568359375,
-0.0193939208984375,
0.06732177734375,
-0.01116180419921875,
-0.023162841796875,
-0.019500732421875,
0.030029296875,
0.02093505859375,
0.04510498046875,
0.05615234375,
-0.061126708984375,
-0.037933349609375,
-0.0021495819091796875,
-0.041748046875,
0.006893157958984375,
0.0095062255859375,
-0.000020205974578857422,
0.055694580078125,
0.0232086181640625,
-0.058868408203125,
0.015655517578125,
0.0386962890625,
-0.0275726318359375,
0.044097900390625,
-0.0150909423828125,
-0.0122528076171875,
-0.09405517578125,
0.0218963623046875,
0.004364013671875,
-0.01534271240234375,
-0.05615234375,
0.016510009765625,
0.01416778564453125,
0.00731658935546875,
-0.047271728515625,
0.04443359375,
-0.016845703125,
-0.0005984306335449219,
0.01456451416015625,
-0.004825592041015625,
-0.0033588409423828125,
0.05096435546875,
0.01300048828125,
0.0537109375,
0.0301666259765625,
-0.042999267578125,
0.0116119384765625,
0.0283355712890625,
-0.044403076171875,
0.0104522705078125,
-0.048675537109375,
0.0018262863159179688,
-0.0209197998046875,
0.0164642333984375,
-0.07696533203125,
-0.01055145263671875,
0.01983642578125,
-0.056549072265625,
0.043792724609375,
-0.020416259765625,
-0.0458984375,
-0.030181884765625,
-0.03729248046875,
-0.00807952880859375,
0.047943115234375,
-0.040679931640625,
0.039642333984375,
0.0234222412109375,
-0.02301025390625,
-0.056610107421875,
-0.055938720703125,
-0.00579071044921875,
-0.01849365234375,
-0.0555419921875,
0.036834716796875,
-0.0217742919921875,
-0.014312744140625,
0.006839752197265625,
0.0059051513671875,
-0.01143646240234375,
0.00928497314453125,
0.006870269775390625,
0.03509521484375,
-0.015960693359375,
0.026763916015625,
-0.008453369140625,
0.0036678314208984375,
-0.00733184814453125,
-0.008636474609375,
0.046051025390625,
-0.033172607421875,
0.006771087646484375,
-0.0211944580078125,
0.020355224609375,
0.04046630859375,
-0.01493072509765625,
0.0543212890625,
0.076171875,
-0.0273590087890625,
0.00931549072265625,
-0.053314208984375,
-0.005908966064453125,
-0.0311279296875,
0.0189666748046875,
-0.0264434814453125,
-0.072021484375,
0.041473388671875,
0.0178070068359375,
0.0246734619140625,
0.06085205078125,
0.04010009765625,
-0.0200653076171875,
0.050537109375,
0.06353759765625,
-0.0140380859375,
0.044189453125,
-0.0367431640625,
-0.00579071044921875,
-0.05267333984375,
-0.026123046875,
-0.05535888671875,
-0.00469970703125,
-0.0745849609375,
-0.017547607421875,
0.00983428955078125,
0.0230560302734375,
-0.01343536376953125,
0.0533447265625,
-0.045135498046875,
0.0046234130859375,
0.05645751953125,
-0.0013170242309570312,
0.0022640228271484375,
0.0288543701171875,
-0.0275726318359375,
-0.0121002197265625,
-0.056854248046875,
-0.033935546875,
0.08380126953125,
0.043212890625,
0.0372314453125,
0.004154205322265625,
0.06341552734375,
0.0230255126953125,
0.0211334228515625,
-0.0599365234375,
0.0271148681640625,
-0.029327392578125,
-0.0655517578125,
-0.02215576171875,
-0.02783203125,
-0.07965087890625,
0.024169921875,
-0.025604248046875,
-0.0704345703125,
0.0213470458984375,
-0.006244659423828125,
-0.0161895751953125,
0.0268402099609375,
-0.0738525390625,
0.0643310546875,
-0.0189208984375,
-0.0126800537109375,
0.002105712890625,
-0.05865478515625,
0.0192413330078125,
-0.01070404052734375,
0.01861572265625,
-0.0147552490234375,
0.00955963134765625,
0.07232666015625,
-0.020538330078125,
0.0599365234375,
-0.01428985595703125,
-0.0101318359375,
0.015960693359375,
-0.0243072509765625,
0.0270843505859375,
-0.01113128662109375,
-0.0033016204833984375,
0.03643798828125,
0.0158233642578125,
-0.0221405029296875,
-0.019622802734375,
0.051666259765625,
-0.07305908203125,
-0.020538330078125,
-0.044189453125,
-0.03302001953125,
-0.007358551025390625,
0.0181427001953125,
0.03656005859375,
0.0193328857421875,
-0.02093505859375,
0.012603759765625,
0.06182861328125,
-0.028778076171875,
0.038848876953125,
0.044464111328125,
-0.013580322265625,
-0.038330078125,
0.061248779296875,
0.0100555419921875,
0.0021114349365234375,
0.034332275390625,
0.00641632080078125,
-0.026580810546875,
-0.0303192138671875,
-0.0384521484375,
0.031280517578125,
-0.04827880859375,
-0.0108795166015625,
-0.06671142578125,
-0.032012939453125,
-0.048675537109375,
0.0055694580078125,
-0.0316162109375,
-0.0562744140625,
-0.013641357421875,
-0.002414703369140625,
0.042877197265625,
0.043212890625,
-0.01117706298828125,
0.019439697265625,
-0.046539306640625,
0.00333404541015625,
0.0164794921875,
0.032623291015625,
-0.0253143310546875,
-0.040679931640625,
-0.0171356201171875,
-0.0015249252319335938,
-0.00885772705078125,
-0.048858642578125,
0.0308074951171875,
0.0077056884765625,
0.042083740234375,
0.00835418701171875,
0.001140594482421875,
0.0260009765625,
-0.03564453125,
0.06011962890625,
0.02349853515625,
-0.055633544921875,
0.0360107421875,
-0.0275726318359375,
0.021636962890625,
0.051666259765625,
0.051605224609375,
-0.0419921875,
-0.0171661376953125,
-0.060638427734375,
-0.07666015625,
0.06256103515625,
0.0197906494140625,
0.00951385498046875,
0.00621795654296875,
0.0140533447265625,
0.01348114013671875,
0.0093536376953125,
-0.060943603515625,
-0.0357666015625,
-0.00988006591796875,
-0.017425537109375,
-0.0200958251953125,
-0.0307159423828125,
0.00759124755859375,
-0.04132080078125,
0.07000732421875,
0.01168060302734375,
0.046142578125,
0.0285797119140625,
-0.0154571533203125,
0.00516510009765625,
0.0292510986328125,
0.05810546875,
0.03564453125,
-0.058441162109375,
-0.0057525634765625,
0.0088348388671875,
-0.043670654296875,
-0.0115509033203125,
0.0460205078125,
-0.01702880859375,
0.0433349609375,
0.0311279296875,
0.0716552734375,
0.01715087890625,
-0.04022216796875,
0.036590576171875,
-0.0173492431640625,
-0.0399169921875,
-0.03253173828125,
-0.01270294189453125,
0.00862884521484375,
0.0101165771484375,
0.0295257568359375,
-0.005992889404296875,
0.0007472038269042969,
-0.030242919921875,
0.0216522216796875,
0.027435302734375,
-0.0308837890625,
-0.0174560546875,
0.033172607421875,
0.01043701171875,
-0.011932373046875,
0.041595458984375,
-0.00982666015625,
-0.05322265625,
0.035980224609375,
0.0249176025390625,
0.062042236328125,
-0.01320648193359375,
0.0228271484375,
0.048248291015625,
0.0350341796875,
0.007709503173828125,
0.02313232421875,
-0.00027680397033691406,
-0.052490234375,
-0.038726806640625,
-0.06634521484375,
-0.0172271728515625,
0.031402587890625,
-0.045135498046875,
0.0259552001953125,
-0.028228759765625,
-0.0268707275390625,
0.0265655517578125,
0.0281524658203125,
-0.054718017578125,
0.00955963134765625,
0.02392578125,
0.08837890625,
-0.0523681640625,
0.082763671875,
0.06524658203125,
-0.033416748046875,
-0.043914794921875,
-0.025604248046875,
-0.01971435546875,
-0.0484619140625,
0.054962158203125,
0.0167694091796875,
0.010467529296875,
-0.00688934326171875,
-0.038482666015625,
-0.072021484375,
0.07037353515625,
0.0264434814453125,
-0.04473876953125,
0.00003266334533691406,
0.0017194747924804688,
0.04486083984375,
-0.0166778564453125,
0.0216064453125,
0.032379150390625,
0.037933349609375,
0.0008072853088378906,
-0.0870361328125,
-0.0230712890625,
-0.02777099609375,
-0.0024509429931640625,
0.0223541259765625,
-0.04693603515625,
0.0693359375,
-0.0070343017578125,
-0.0184478759765625,
0.00618743896484375,
0.039581298828125,
0.01180267333984375,
0.0113067626953125,
0.03790283203125,
0.06182861328125,
0.06109619140625,
-0.01248931884765625,
0.0772705078125,
-0.0222320556640625,
0.0377197265625,
0.08026123046875,
0.01120758056640625,
0.0760498046875,
0.0296478271484375,
-0.0197906494140625,
0.05328369140625,
0.054962158203125,
-0.0085296630859375,
0.03729248046875,
-0.00452423095703125,
-0.017913818359375,
-0.00847625732421875,
-0.01290130615234375,
-0.037933349609375,
0.038665771484375,
0.0261383056640625,
-0.030914306640625,
-0.0176239013671875,
0.00516510009765625,
0.0211029052734375,
-0.0146331787109375,
-0.0111083984375,
0.047210693359375,
-0.002788543701171875,
-0.04010009765625,
0.06341552734375,
0.0210113525390625,
0.06939697265625,
-0.06280517578125,
0.01006317138671875,
-0.01358795166015625,
0.01197052001953125,
-0.003925323486328125,
-0.04827880859375,
0.008697509765625,
0.00678253173828125,
-0.01861572265625,
-0.021209716796875,
0.06634521484375,
-0.040191650390625,
-0.047149658203125,
0.031463623046875,
0.033355712890625,
0.026031494140625,
0.0162353515625,
-0.0706787109375,
0.01343536376953125,
-0.0025691986083984375,
-0.0249176025390625,
0.0209197998046875,
0.009796142578125,
-0.00374603271484375,
0.040863037109375,
0.046844482421875,
-0.002971649169921875,
0.006195068359375,
0.005893707275390625,
0.0616455078125,
-0.03582763671875,
-0.0160064697265625,
-0.046051025390625,
0.044952392578125,
-0.002803802490234375,
-0.0291748046875,
0.05706787109375,
0.04681396484375,
0.090087890625,
-0.0114593505859375,
0.05706787109375,
-0.02410888671875,
0.039825439453125,
-0.0330810546875,
0.05950927734375,
-0.046722412109375,
-0.0101165771484375,
-0.0248260498046875,
-0.059722900390625,
-0.0234222412109375,
0.0517578125,
-0.0100250244140625,
-0.003055572509765625,
0.041656494140625,
0.03826904296875,
0.002567291259765625,
-0.01467132568359375,
0.01390838623046875,
0.009185791015625,
0.005428314208984375,
0.031402587890625,
0.04150390625,
-0.053558349609375,
0.037139892578125,
-0.036865234375,
-0.00997161865234375,
-0.01052093505859375,
-0.06182861328125,
-0.0770263671875,
-0.06439208984375,
-0.0249481201171875,
-0.0267333984375,
0.0171051025390625,
0.0816650390625,
0.061492919921875,
-0.06982421875,
-0.0219573974609375,
0.005115509033203125,
0.00296783447265625,
-0.00036907196044921875,
-0.016448974609375,
0.040069580078125,
-0.0347900390625,
-0.0594482421875,
0.01401519775390625,
-0.009368896484375,
0.014129638671875,
-0.0133209228515625,
-0.007396697998046875,
-0.045013427734375,
-0.007770538330078125,
0.0518798828125,
0.024749755859375,
-0.054412841796875,
-0.01007080078125,
-0.001438140869140625,
-0.01511383056640625,
0.00666046142578125,
0.031707763671875,
-0.05706787109375,
0.033935546875,
0.03973388671875,
0.03411865234375,
0.0616455078125,
-0.02197265625,
0.0214691162109375,
-0.064208984375,
0.02130126953125,
0.00650787353515625,
0.0297698974609375,
0.0350341796875,
-0.01358795166015625,
0.047637939453125,
0.0194091796875,
-0.0258026123046875,
-0.0577392578125,
-0.00691986083984375,
-0.0860595703125,
-0.034149169921875,
0.07537841796875,
-0.0177764892578125,
-0.0224456787109375,
0.005115509033203125,
-0.01491546630859375,
0.02752685546875,
-0.034423828125,
0.043121337890625,
0.0692138671875,
0.013092041015625,
-0.01393890380859375,
-0.040863037109375,
0.045257568359375,
0.026031494140625,
-0.04400634765625,
-0.00704193115234375,
0.025604248046875,
0.0364990234375,
0.037078857421875,
0.044891357421875,
-0.00650787353515625,
0.00997161865234375,
-0.0172576904296875,
0.04364013671875,
-0.0022144317626953125,
-0.0247650146484375,
-0.0306243896484375,
-0.007091522216796875,
-0.0046539306640625,
-0.016204833984375
]
] |
jjzha/jobbert_knowledge_extraction | 2023-10-26T10:25:41.000Z | [
"transformers",
"pytorch",
"safetensors",
"bert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | jjzha | null | null | jjzha/jobbert_knowledge_extraction | 0 | 11,407 | transformers | 2023-04-06T14:15:13 | This is a demo using the models from:
```
@inproceedings{zhang-etal-2022-skillspan,
title = "{S}kill{S}pan: Hard and Soft Skill Extraction from {E}nglish Job Postings",
author = "Zhang, Mike and
Jensen, Kristian and
Sonniks, Sif and
Plank, Barbara",
booktitle = "Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
month = jul,
year = "2022",
address = "Seattle, United States",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.naacl-main.366",
doi = "10.18653/v1/2022.naacl-main.366",
pages = "4962--4984",
abstract = "Skill Extraction (SE) is an important and widely-studied task useful to gain insights into labor market dynamics. However, there is a lacuna of datasets and annotation guidelines; available datasets are few and contain crowd-sourced labels on the span-level or labels from a predefined skill inventory. To address this gap, we introduce SKILLSPAN, a novel SE dataset consisting of 14.5K sentences and over 12.5K annotated spans. We release its respective guidelines created over three different sources annotated for hard and soft skills by domain experts. We introduce a BERT baseline (Devlin et al., 2019). To improve upon this baseline, we experiment with language models that are optimized for long spans (Joshi et al., 2020; Beltagy et al., 2020), continuous pre-training on the job posting domain (Han and Eisenstein, 2019; Gururangan et al., 2020), and multi-task learning (Caruana, 1997). Our results show that the domain-adapted models significantly outperform their non-adapted counterparts, and single-task outperforms multi-task learning.",
}
```
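A minimal usage sketch with the `transformers` token-classification pipeline is shown below; the example sentence and the aggregation strategy are illustrative, and the span labels come from this model's configuration rather than from this card.
```python
from transformers import pipeline

# Sketch: extract knowledge spans from a job-posting sentence with this endpoint.
knowledge_extractor = pipeline(
    "token-classification",
    model="jjzha/jobbert_knowledge_extraction",
    aggregation_strategy="first",
)
sentence = "We are looking for an engineer with experience in Python, Kubernetes and CI/CD."
for span in knowledge_extractor(sentence):
    print(span["word"], span["entity_group"], round(span["score"], 3))
```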
Note that there is another endpoint, namely `jjzha/jobbert_skill_extraction`.
Knowledge can be seen as hard skills and skills are both soft and applied skills. | 1,943 | [
[
-0.0180816650390625,
-0.05438232421875,
0.017303466796875,
0.0204315185546875,
0.00524139404296875,
-0.01629638671875,
-0.0345458984375,
-0.04180908203125,
-0.0005745887756347656,
0.04437255859375,
-0.0460205078125,
-0.03448486328125,
-0.043212890625,
0.01001739501953125,
-0.0169677734375,
0.080810546875,
0.0177154541015625,
-0.0272216796875,
-0.005832672119140625,
0.01422882080078125,
-0.0191497802734375,
-0.0181884765625,
-0.043365478515625,
-0.005741119384765625,
0.0275421142578125,
0.04815673828125,
0.03533935546875,
0.053009033203125,
0.01824951171875,
0.0233612060546875,
0.0024662017822265625,
0.0088348388671875,
-0.0177459716796875,
0.00811767578125,
-0.0130462646484375,
-0.0191192626953125,
-0.0638427734375,
0.0216217041015625,
0.026824951171875,
0.08355712890625,
0.004711151123046875,
0.0159759521484375,
0.0278167724609375,
0.0321044921875,
-0.04180908203125,
0.043609619140625,
-0.05120849609375,
-0.01183319091796875,
-0.047027587890625,
0.00991058349609375,
-0.028472900390625,
-0.01236724853515625,
0.01548004150390625,
-0.05413818359375,
0.034454345703125,
-0.0240631103515625,
0.08319091796875,
0.006038665771484375,
-0.031280517578125,
-0.019256591796875,
-0.0231170654296875,
0.06890869140625,
-0.06402587890625,
0.0306396484375,
0.046722412109375,
0.03619384765625,
-0.0008149147033691406,
-0.0516357421875,
-0.039306640625,
0.0008540153503417969,
0.0038738250732421875,
0.0189666748046875,
0.03228759765625,
0.0034122467041015625,
0.0201263427734375,
0.013824462890625,
-0.03314208984375,
0.03656005859375,
-0.057647705078125,
0.002079010009765625,
0.065185546875,
0.006786346435546875,
0.01202392578125,
-0.0208587646484375,
-0.0162506103515625,
-0.02239990234375,
-0.045501708984375,
0.0179443359375,
0.0278167724609375,
0.051910400390625,
-0.0182342529296875,
0.0477294921875,
-0.0255279541015625,
0.05157470703125,
-0.007480621337890625,
-0.0231170654296875,
0.032623291015625,
-0.01690673828125,
-0.0038204193115234375,
-0.03912353515625,
0.054290771484375,
0.01110076904296875,
0.03509521484375,
-0.01141357421875,
-0.0034332275390625,
-0.0227508544921875,
0.0214080810546875,
-0.03424072265625,
-0.02569580078125,
0.002445220947265625,
-0.03387451171875,
-0.0223236083984375,
0.014434814453125,
-0.07818603515625,
-0.0435791015625,
-0.00925445556640625,
-0.003833770751953125,
-0.02191162109375,
-0.040740966796875,
0.005466461181640625,
-0.004161834716796875,
0.0321044921875,
0.0194244384765625,
-0.06707763671875,
0.025634765625,
0.05865478515625,
0.050872802734375,
-0.020263671875,
-0.036346435546875,
-0.02691650390625,
-0.0071563720703125,
-0.00982666015625,
0.04638671875,
-0.032501220703125,
0.0024585723876953125,
0.0206146240234375,
0.0318603515625,
-0.0258941650390625,
-0.032440185546875,
0.04388427734375,
-0.046539306640625,
0.0163726806640625,
-0.0225372314453125,
-0.03753662109375,
-0.0221099853515625,
0.002532958984375,
-0.07232666015625,
0.098388671875,
0.021209716796875,
-0.03814697265625,
0.0528564453125,
-0.07537841796875,
-0.041015625,
0.004383087158203125,
0.0019817352294921875,
-0.032562255859375,
-0.0164947509765625,
0.0163421630859375,
0.0721435546875,
-0.0218505859375,
0.0009469985961914062,
-0.041748046875,
-0.016143798828125,
0.01366424560546875,
-0.020355224609375,
0.076904296875,
0.0115966796875,
-0.0302581787109375,
-0.0296783447265625,
-0.05511474609375,
-0.0005645751953125,
0.00946807861328125,
-0.032623291015625,
-0.0164031982421875,
-0.003269195556640625,
0.0015459060668945312,
0.036895751953125,
0.018798828125,
-0.03314208984375,
0.016998291015625,
-0.0445556640625,
0.0104827880859375,
0.058685302734375,
-0.0117340087890625,
0.03466796875,
0.0009784698486328125,
0.052642822265625,
-0.00760650634765625,
-0.0011472702026367188,
0.0018968582153320312,
-0.01503753662109375,
-0.037933349609375,
-0.023162841796875,
0.0198516845703125,
0.04010009765625,
-0.0589599609375,
0.0419921875,
-0.01519012451171875,
-0.04998779296875,
-0.046173095703125,
0.032806396484375,
0.0360107421875,
0.044921875,
0.038421630859375,
-0.0251922607421875,
-0.039306640625,
-0.059234619140625,
-0.0190277099609375,
0.00788116455078125,
0.00550079345703125,
0.004520416259765625,
0.022796630859375,
0.008026123046875,
0.0802001953125,
-0.038604736328125,
-0.0178070068359375,
-0.04376220703125,
0.01922607421875,
0.031982421875,
0.0269775390625,
0.034637451171875,
-0.05401611328125,
-0.052093505859375,
0.0003268718719482422,
-0.0662841796875,
-0.031494140625,
-0.01267242431640625,
-0.006488800048828125,
0.0205230712890625,
0.040191650390625,
-0.045440673828125,
0.0299072265625,
0.02410888671875,
-0.0247650146484375,
0.06903076171875,
0.011444091796875,
-0.0163421630859375,
-0.07421875,
0.025970458984375,
0.038970947265625,
-0.005245208740234375,
-0.043670654296875,
-0.0016565322875976562,
0.005748748779296875,
-0.00597381591796875,
-0.040557861328125,
0.054412841796875,
-0.058624267578125,
-0.0250701904296875,
-0.0251007080078125,
0.01256561279296875,
0.00214385986328125,
0.056884765625,
0.013153076171875,
0.07025146484375,
0.031585693359375,
-0.051422119140625,
0.02069091796875,
0.02349853515625,
-0.038970947265625,
0.035400390625,
-0.054779052734375,
0.00974273681640625,
0.004863739013671875,
-0.0022869110107421875,
-0.057342529296875,
-0.01715087890625,
0.0011701583862304688,
-0.006847381591796875,
0.019805908203125,
-0.0234375,
-0.0309906005859375,
-0.046173095703125,
-0.0252838134765625,
0.00417327880859375,
0.0277557373046875,
-0.0152435302734375,
0.0204315185546875,
0.0367431640625,
-0.045440673828125,
-0.062744140625,
-0.050048828125,
-0.0055999755859375,
-0.0075531005859375,
-0.032012939453125,
0.0251617431640625,
0.0137481689453125,
-0.0144805908203125,
-0.004367828369140625,
0.00753021240234375,
-0.036376953125,
0.01776123046875,
0.005023956298828125,
0.0223236083984375,
-0.0070037841796875,
0.0030727386474609375,
0.009185791015625,
-0.020538330078125,
-0.01074981689453125,
0.0030498504638671875,
0.02618408203125,
-0.003765106201171875,
-0.035247802734375,
-0.0264739990234375,
0.006786346435546875,
0.04022216796875,
-0.0221405029296875,
0.06085205078125,
0.06915283203125,
-0.022369384765625,
-0.0213165283203125,
-0.055450439453125,
-0.0251312255859375,
-0.037811279296875,
0.053955078125,
-0.030517578125,
-0.040618896484375,
0.033905029296875,
-0.008056640625,
0.012725830078125,
0.035736083984375,
0.0179290771484375,
-0.0175323486328125,
0.058074951171875,
0.060272216796875,
-0.00891876220703125,
0.030120849609375,
-0.04034423828125,
-0.00499725341796875,
-0.06732177734375,
-0.00803375244140625,
-0.0302734375,
-0.0038509368896484375,
-0.0210418701171875,
-0.01416778564453125,
0.02239990234375,
0.0019378662109375,
-0.0172271728515625,
0.047576904296875,
-0.025787353515625,
0.0309295654296875,
0.052459716796875,
0.01407623291015625,
-0.0160064697265625,
-0.0174713134765625,
0.004383087158203125,
-0.0198974609375,
-0.05133056640625,
-0.038726806640625,
0.0982666015625,
0.0295562744140625,
0.02984619140625,
-0.01146697998046875,
0.029693603515625,
0.0299530029296875,
0.0011005401611328125,
-0.038238525390625,
0.056854248046875,
-0.040771484375,
-0.04229736328125,
-0.038970947265625,
-0.01306915283203125,
-0.07513427734375,
0.0174102783203125,
-0.016021728515625,
-0.040802001953125,
0.0102691650390625,
-0.007080078125,
-0.0263671875,
0.0214385986328125,
-0.055877685546875,
0.0806884765625,
-0.0244140625,
-0.038421630859375,
0.0034999847412109375,
-0.0521240234375,
0.0134429931640625,
-0.0078887939453125,
0.0222625732421875,
-0.01788330078125,
0.00152587890625,
0.06304931640625,
-0.038299560546875,
0.1043701171875,
-0.04278564453125,
-0.0034923553466796875,
0.0137481689453125,
-0.00469970703125,
0.038421630859375,
-0.0279693603515625,
-0.047271728515625,
0.00457000732421875,
-0.008636474609375,
-0.03662109375,
-0.033538818359375,
0.014923095703125,
-0.05389404296875,
-0.01538848876953125,
-0.0130462646484375,
-0.03997802734375,
-0.0170135498046875,
0.025909423828125,
0.00809478759765625,
0.0249176025390625,
0.00396728515625,
0.0221405029296875,
0.026397705078125,
-0.024688720703125,
0.0302734375,
0.028900146484375,
0.0250701904296875,
-0.013946533203125,
0.05987548828125,
0.028778076171875,
0.00479888916015625,
0.0032558441162109375,
-0.01438140869140625,
-0.02325439453125,
-0.0173187255859375,
-0.006488800048828125,
0.03265380859375,
-0.0411376953125,
-0.01470184326171875,
-0.03857421875,
-0.0255126953125,
-0.055023193359375,
-0.029876708984375,
-0.031768798828125,
-0.0272674560546875,
-0.0185699462890625,
-0.023956298828125,
0.01708984375,
0.04840087890625,
-0.01080322265625,
0.01629638671875,
-0.04510498046875,
0.01352691650390625,
0.033203125,
0.020721435546875,
0.002349853515625,
-0.0235443115234375,
-0.053253173828125,
0.015899658203125,
-0.0186767578125,
-0.0750732421875,
0.046875,
0.030303955078125,
0.0667724609375,
0.03106689453125,
0.004627227783203125,
0.046234130859375,
-0.007083892822265625,
0.076171875,
-0.003360748291015625,
-0.05975341796875,
0.0384521484375,
-0.0302276611328125,
0.005908966064453125,
0.0645751953125,
0.0347900390625,
-0.046173095703125,
-0.0186309814453125,
-0.0469970703125,
-0.08209228515625,
0.07244873046875,
-0.00589752197265625,
-0.00429534912109375,
-0.00920867919921875,
0.0246429443359375,
0.045166015625,
0.020965576171875,
-0.0673828125,
0.004638671875,
-0.00536346435546875,
-0.031524658203125,
0.011932373046875,
-0.029754638671875,
0.01149749755859375,
-0.0199432373046875,
0.054443359375,
-0.0170745849609375,
0.02862548828125,
-0.0211029052734375,
-0.0318603515625,
0.015045166015625,
0.01110076904296875,
0.0163116455078125,
0.053009033203125,
-0.00347137451171875,
0.00897216796875,
0.0287628173828125,
-0.028167724609375,
-0.019317626953125,
0.0175323486328125,
0.0133514404296875,
-0.0186614990234375,
0.04388427734375,
0.044769287109375,
0.0243072509765625,
-0.048248291015625,
0.06109619140625,
0.052947998046875,
-0.021453857421875,
-0.03826904296875,
0.0027294158935546875,
-0.01528167724609375,
0.0262603759765625,
0.0311737060546875,
-0.0199432373046875,
0.0013151168823242188,
-0.0225067138671875,
0.025604248046875,
0.0157470703125,
-0.0137939453125,
-0.0440673828125,
0.045654296875,
0.0235137939453125,
-0.003627777099609375,
0.06298828125,
-0.041259765625,
-0.044525146484375,
0.050201416015625,
0.03363037109375,
0.05657958984375,
-0.00464630126953125,
0.0246429443359375,
0.0166473388671875,
0.0085601806640625,
-0.01378631591796875,
0.0283660888671875,
-0.005275726318359375,
-0.057098388671875,
-0.04901123046875,
-0.0276336669921875,
-0.0244903564453125,
0.01215362548828125,
-0.05242919921875,
0.03460693359375,
-0.00904083251953125,
-0.0006394386291503906,
-0.0128021240234375,
0.0206146240234375,
-0.08502197265625,
0.018768310546875,
0.002254486083984375,
0.06378173828125,
-0.09552001953125,
0.06781005859375,
0.052032470703125,
-0.057342529296875,
-0.058868408203125,
-0.0122833251953125,
-0.0007796287536621094,
-0.06109619140625,
0.055999755859375,
0.0111846923828125,
0.0198974609375,
-0.0129547119140625,
-0.038970947265625,
-0.045623779296875,
0.09405517578125,
0.0236358642578125,
-0.00766754150390625,
-0.0202484130859375,
0.0243377685546875,
0.0270538330078125,
-0.01363372802734375,
0.03070068359375,
0.038055419921875,
0.04241943359375,
-0.02447509765625,
-0.092041015625,
-0.00492095947265625,
-0.04937744140625,
-0.0110321044921875,
-0.0177001953125,
-0.045440673828125,
0.0791015625,
0.006622314453125,
-0.00948333740234375,
-0.0152130126953125,
0.054229736328125,
0.024688720703125,
0.027618408203125,
0.041290283203125,
0.0271759033203125,
0.056732177734375,
0.007190704345703125,
0.04742431640625,
-0.02520751953125,
0.01497650146484375,
0.07977294921875,
-0.03045654296875,
0.060882568359375,
0.0128326416015625,
-0.028594970703125,
0.06683349609375,
0.036773681640625,
-0.007633209228515625,
0.032806396484375,
-0.01480865478515625,
0.00969696044921875,
-0.01702880859375,
0.01055145263671875,
-0.0263519287109375,
0.0447998046875,
0.037628173828125,
-0.0215301513671875,
-0.012725830078125,
0.01678466796875,
0.0215301513671875,
0.00400543212890625,
-0.003665924072265625,
0.052642822265625,
0.004131317138671875,
-0.040557861328125,
0.032196044921875,
-0.0009303092956542969,
0.06988525390625,
-0.0452880859375,
-0.018096923828125,
-0.00812530517578125,
-0.0008840560913085938,
-0.0120849609375,
-0.07147216796875,
0.022979736328125,
0.00478363037109375,
-0.0009098052978515625,
-0.03570556640625,
0.07757568359375,
-0.03948974609375,
-0.028533935546875,
0.0196075439453125,
0.057830810546875,
0.0274658203125,
-0.0007758140563964844,
-0.06298828125,
-0.019195556640625,
-0.0203857421875,
-0.032806396484375,
0.0143890380859375,
0.04266357421875,
0.005336761474609375,
0.045013427734375,
0.03802490234375,
0.0279083251953125,
0.01132965087890625,
-0.00542449951171875,
0.053680419921875,
-0.044891357421875,
-0.04913330078125,
-0.03131103515625,
0.057464599609375,
-0.044525146484375,
-0.030303955078125,
0.053009033203125,
0.030487060546875,
0.06683349609375,
-0.0150146484375,
0.080078125,
-0.01995849609375,
0.061614990234375,
-0.031097412109375,
0.05517578125,
-0.048614501953125,
0.01079559326171875,
-0.038116455078125,
-0.064697265625,
-0.0046234130859375,
0.048095703125,
-0.0496826171875,
0.032745361328125,
0.042999267578125,
0.06719970703125,
-0.0189666748046875,
-0.018707275390625,
0.0095367431640625,
0.01175689697265625,
0.01544952392578125,
0.03741455078125,
0.0197601318359375,
-0.023590087890625,
0.038299560546875,
-0.038909912109375,
-0.01123809814453125,
-0.03533935546875,
-0.0655517578125,
-0.05474853515625,
-0.0687255859375,
-0.0236663818359375,
-0.01422882080078125,
0.005161285400390625,
0.071044921875,
0.06768798828125,
-0.044525146484375,
-0.038848876953125,
0.004150390625,
-0.0121917724609375,
-0.06890869140625,
-0.0239105224609375,
0.041412353515625,
-0.034942626953125,
-0.046234130859375,
0.0498046875,
-0.0156097412109375,
-0.0157012939453125,
-0.0087890625,
-0.002559661865234375,
-0.0242919921875,
-0.00992584228515625,
0.043701171875,
0.051788330078125,
-0.0260162353515625,
-0.021453857421875,
0.00927734375,
-0.0166168212890625,
-0.0297698974609375,
0.035552978515625,
-0.03912353515625,
0.0178985595703125,
0.027618408203125,
0.046661376953125,
0.0357666015625,
-0.00690460205078125,
0.044281005859375,
-0.08721923828125,
0.013824462890625,
0.0031890869140625,
0.02142333984375,
0.0186004638671875,
-0.016632080078125,
0.06396484375,
0.0258941650390625,
-0.0283355712890625,
-0.06365966796875,
0.0009131431579589844,
-0.06781005859375,
-0.0265655517578125,
0.1019287109375,
-0.0160064697265625,
-0.01262664794921875,
-0.0391845703125,
0.0025787353515625,
0.034027099609375,
-0.01280975341796875,
0.07373046875,
0.04638671875,
0.003387451171875,
-0.01959228515625,
-0.047088623046875,
0.032135009765625,
0.029876708984375,
-0.06317138671875,
0.01114654541015625,
0.02362060546875,
0.00827789306640625,
-0.00701904296875,
0.061676025390625,
-0.005687713623046875,
0.033843994140625,
-0.0009741783142089844,
0.035552978515625,
-0.0367431640625,
-0.03851318359375,
-0.002567291259765625,
0.03375244140625,
0.0096282958984375,
-0.0144500732421875
]
] |
stanfordnlp/backpack-gpt2 | 2023-08-14T20:03:44.000Z | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"text-generation-inference",
"backpack",
"backpackmodel",
"custom_code",
"en",
"dataset:openwebtext",
"arxiv:2305.16765",
"license:apache-2.0",
"has_space",
"region:us"
] | text-generation | stanfordnlp | null | null | stanfordnlp/backpack-gpt2 | 15 | 11,390 | transformers | 2023-05-29T05:17:39 | ---
pipeline_tag: text-generation
tags:
- text-generation-inference
- backpack
- backpackmodel
library_name: transformers
license: apache-2.0
datasets:
- openwebtext
language:
- en
---
# Model Card for Backpack-GPT2
<!-- Provide a quick summary of what the model is/does. [Optional] -->
The Backpack-GPT2 language model is an instance of the [Backpack architecture](https://arxiv.org/abs/2305.16765), intended to combine strong modeling performance with an interface for interpretability and control.
Most details about this model and its training should be accessed in the paper, [Backpack Language Models](https://arxiv.org/abs/2305.16765).
See also [backpackmodels.science](backpackmodels.science).

## Table of Contents
- [Model Card for Backpack-GPT2](#model-card-for--model_id-)
- [Table of Contents](#table-of-contents)
- [Model Details](#model-details)
- [Model Description](#model-description)
- [Uses](#uses)
- [Bias, Risks, and Limitations](#bias-risks-and-limitations)
- [Training Details](#training-details)
- [Training Data](#training-data)
- [Training Procedure](#training-procedure)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications [optional]](#technical-specifications-optional)
- [Model Architecture and Objective](#model-architecture-and-objective)
- [Compute Infrastructure](#compute-infrastructure)
- [Hardware](#hardware)
- [Software](#software)
- [Citation](#citation)
- [Model Card Authors [optional]](#model-card-authors-optional)
- [Model Card Contact](#model-card-contact)
- [How to Get Started with the Model](#how-to-get-started-with-the-model)
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is/does. -->
The Backpack-GPT2 is a [Backpack-based language model](https://arxiv.org/abs/2305.16765), an architecture intended to combine strong modeling performance with an interface for interpretability and control.
- **Developed by:** John Hewitt, John Thickstun, Christopher D. Manning, Percy Liang
- **Shared by [Optional]:** More information needed
- **Model type:** Language model
- **Language(s) (NLP):** en
- **License:** apache-2.0
- **Resources for more information:**
- [GitHub Repo](https://github.com/john-hewitt/backpacks-flash-attn)
  - [Associated Paper](https://arxiv.org/abs/2305.16765)
## Uses
This model is intended for use in the study and development of increasingly interpretable methods in natural language processing.
It is not directly fit for any production use.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
This model in particular is limited in its capabilities, and with a brand new architecture, less is known about its biases than, e.g., Transformer-based models.
## How to Get Started with the Model
```python
import torch
from transformers import AutoConfig, AutoModelForCausalLM

model_id = "stanfordnlp/backpack-gpt2"

# The Backpack architecture ships as custom modeling code, so trust_remote_code=True is required.
config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
torch_model = AutoModelForCausalLM.from_pretrained(model_id, config=config, trust_remote_code=True)
torch_model.eval()

# A batch of 512 random token ids from the GPT-2 vocabulary, just to exercise the forward pass.
input_ids = torch.randint(0, 50264, (1, 512), dtype=torch.long)
torch_out = torch_model(
    input_ids,
    position_ids=None,
)

# Convert the final-layer logits into next-token probabilities.
torch_out = torch.nn.functional.softmax(torch_out.logits, dim=-1)
print(torch_out)
```
## Training Details
### Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
This model was trained on the [OpenWebText](https://huggingface.co/datasets/openwebtext) corpus.
### Training Procedure
This model was trained for 100k gradient steps with a batch size of 512k tokens and a linearly decaying learning rate from 6e-4 to zero, with a linear warmup of 5k steps.
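For concreteness, a small sketch of that schedule is given below; the peak value, warmup length, and step count are taken from the description above, while the exact anchoring of the decay is an assumption and the actual training code may differ.
```python
def backpack_lr(step: int, peak_lr: float = 6e-4, warmup_steps: int = 5_000, total_steps: int = 100_000) -> float:
    # Linear warmup to the peak learning rate, then linear decay to zero at total_steps.
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```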
### Environmental Impact
- **Hardware Type:** 4 A100 GPUs (40G)
- **Hours used:** Roughly 4 days.
- **Cloud Provider:** Stanford compute.
- **Compute Region:** Stanford energy grid.
### Model Architecture and Objective
This model was trained to minimize the cross-entropy loss, and is a [Backpack language model](https://arxiv.org/pdf/2305.16765.pdf).
### Compute Infrastructure
This model was trained on a slurm cluster.
### Hardware
This model was trained on 4 A100s.
### Software
This model was trained with [FlashAttention](https://github.com/HazyResearch/flash-attention) and [PyTorch](https://pytorch.org/).
## Citation
**BibTeX:**
```
@InProceedings{hewitt2023backpack,
author = "Hewitt, John and Thickstun, John and Manning, Christopher D. and Liang, Percy",
title = "Backpack Language Models",
booktitle = "Proceedings of the Association for Computational Linguistics",
year = "2023",
publisher = "Association for Computational Linguistics",
location = "Toronto, Canada",
}
```
## Model Card Authors [optional]
<!-- This section provides another layer of transparency and accountability. Whose views is this model card representing? How many voices were included in its construction? Etc. -->
John Hewitt
## Model Card Contact
johnhew@cs.stanford.edu
| 5,863 | [
[
-0.019989013671875,
-0.047637939453125,
0.013153076171875,
0.01885986328125,
-0.013214111328125,
-0.01346588134765625,
-0.01493072509765625,
-0.045166015625,
-0.0240936279296875,
0.017303466796875,
-0.033935546875,
-0.0269317626953125,
-0.04400634765625,
-0.0191497802734375,
-0.0299072265625,
0.0972900390625,
-0.0105133056640625,
0.004604339599609375,
-0.0020351409912109375,
-0.01251220703125,
-0.0291900634765625,
-0.041259765625,
-0.053985595703125,
-0.03741455078125,
0.0109405517578125,
-0.01103973388671875,
0.043609619140625,
0.04046630859375,
0.018463134765625,
0.0176239013671875,
-0.016204833984375,
-0.0271759033203125,
-0.032928466796875,
-0.028900146484375,
-0.009002685546875,
-0.0255126953125,
-0.0386962890625,
0.0148162841796875,
0.049285888671875,
0.034942626953125,
-0.0168609619140625,
0.027313232421875,
0.019439697265625,
0.024932861328125,
-0.03009033203125,
0.0290985107421875,
-0.04150390625,
-0.006931304931640625,
-0.01934814453125,
0.016326904296875,
-0.0341796875,
0.00017535686492919922,
0.01898193359375,
-0.0297698974609375,
0.03302001953125,
0.00757598876953125,
0.072265625,
0.005096435546875,
-0.0205535888671875,
-0.0227508544921875,
-0.0465087890625,
0.0660400390625,
-0.051300048828125,
0.0301361083984375,
0.03948974609375,
0.00876617431640625,
0.0048675537109375,
-0.058685302734375,
-0.05096435546875,
-0.0006136894226074219,
-0.0142974853515625,
0.01080322265625,
-0.013702392578125,
0.0027790069580078125,
0.01221466064453125,
0.0178985595703125,
-0.07220458984375,
0.005084991455078125,
-0.044677734375,
-0.01345062255859375,
0.048919677734375,
-0.0008063316345214844,
0.02325439453125,
-0.0215606689453125,
-0.035186767578125,
-0.01273345947265625,
-0.043304443359375,
0.007198333740234375,
0.034881591796875,
0.02008056640625,
-0.046966552734375,
0.035552978515625,
-0.002819061279296875,
0.047698974609375,
-0.0173187255859375,
-0.01148223876953125,
0.029998779296875,
-0.0244903564453125,
-0.0213470458984375,
-0.01031494140625,
0.0599365234375,
0.01641845703125,
0.013885498046875,
-0.01383209228515625,
-0.02838134765625,
-0.0146942138671875,
0.0033130645751953125,
-0.06854248046875,
-0.03399658203125,
0.0195159912109375,
-0.042877197265625,
-0.03057861328125,
-0.009124755859375,
-0.0518798828125,
-0.00629425048828125,
-0.018463134765625,
0.03216552734375,
-0.0189208984375,
-0.0438232421875,
0.01128387451171875,
-0.0120086669921875,
0.032501220703125,
0.0021038055419921875,
-0.0745849609375,
0.0361328125,
0.04327392578125,
0.0816650390625,
-0.00478363037109375,
-0.0236663818359375,
0.0003600120544433594,
0.006809234619140625,
0.007720947265625,
0.055328369140625,
-0.03192138671875,
-0.0301513671875,
0.0025157928466796875,
0.012481689453125,
-0.005947113037109375,
-0.017303466796875,
0.042205810546875,
-0.029998779296875,
0.04345703125,
-0.010589599609375,
-0.032135009765625,
-0.01006317138671875,
0.0033245086669921875,
-0.0369873046875,
0.10125732421875,
0.0203857421875,
-0.06488037109375,
0.00894927978515625,
-0.053314208984375,
-0.024169921875,
0.0095672607421875,
-0.00618743896484375,
-0.03631591796875,
-0.001079559326171875,
-0.001979827880859375,
0.0224151611328125,
-0.036834716796875,
0.039642333984375,
-0.020263671875,
-0.0170135498046875,
-0.00037384033203125,
-0.02325439453125,
0.0926513671875,
0.01003265380859375,
-0.034027099609375,
-0.0025615692138671875,
-0.049591064453125,
-0.0016765594482421875,
0.0266571044921875,
-0.032501220703125,
-0.01158905029296875,
-0.004016876220703125,
0.0328369140625,
0.0293121337890625,
0.017242431640625,
-0.031463623046875,
0.013885498046875,
-0.03009033203125,
0.04473876953125,
0.06903076171875,
-0.0308074951171875,
0.043212890625,
-0.0171051025390625,
0.031982421875,
-0.0177764892578125,
0.0231475830078125,
-0.00165557861328125,
-0.053680419921875,
-0.049041748046875,
-0.01507568359375,
0.0186920166015625,
0.050811767578125,
-0.03802490234375,
0.040985107421875,
-0.0154571533203125,
-0.04144287109375,
-0.028961181640625,
-0.0019683837890625,
0.03997802734375,
0.048126220703125,
0.042388916015625,
-0.0209808349609375,
-0.037994384765625,
-0.05657958984375,
-0.01496124267578125,
-0.0269317626953125,
-0.01213836669921875,
0.027679443359375,
0.0599365234375,
-0.01432037353515625,
0.067138671875,
-0.038604736328125,
-0.008544921875,
-0.0246429443359375,
0.018646240234375,
0.0015048980712890625,
0.051483154296875,
0.0478515625,
-0.0469970703125,
-0.027099609375,
-0.0216064453125,
-0.052642822265625,
-0.00841522216796875,
0.00749969482421875,
-0.0032787322998046875,
0.0290679931640625,
0.018280029296875,
-0.0621337890625,
0.0293426513671875,
0.0430908203125,
-0.03399658203125,
0.047637939453125,
-0.0083160400390625,
-0.029693603515625,
-0.0704345703125,
0.0198974609375,
0.0079193115234375,
0.0012826919555664062,
-0.047332763671875,
-0.0002224445343017578,
-0.0034885406494140625,
-0.01146697998046875,
-0.035186767578125,
0.08990478515625,
-0.03778076171875,
-0.00026535987854003906,
-0.025054931640625,
-0.00006479024887084961,
0.0017557144165039062,
0.04833984375,
0.0196380615234375,
0.04815673828125,
0.051055908203125,
-0.043975830078125,
0.0162506103515625,
0.0177764892578125,
-0.0222015380859375,
0.031829833984375,
-0.0765380859375,
0.01708984375,
-0.016754150390625,
0.037811279296875,
-0.0765380859375,
-0.01346588134765625,
0.0216064453125,
-0.0277252197265625,
0.038604736328125,
-0.0231475830078125,
-0.052520751953125,
-0.040863037109375,
-0.0185089111328125,
0.04949951171875,
0.057098388671875,
-0.036468505859375,
0.034942626953125,
0.045684814453125,
-0.010040283203125,
-0.04296875,
-0.06488037109375,
-0.017608642578125,
-0.010589599609375,
-0.03704833984375,
0.031768798828125,
-0.012420654296875,
-0.002689361572265625,
-0.0021572113037109375,
-0.005126953125,
-0.01483154296875,
0.004604339599609375,
0.014739990234375,
0.0306396484375,
-0.0006966590881347656,
0.021453857421875,
-0.01171112060546875,
-0.00829315185546875,
0.004810333251953125,
-0.042266845703125,
0.047821044921875,
-0.0177001953125,
0.0057220458984375,
-0.0276641845703125,
0.0125885009765625,
0.003978729248046875,
-0.01270294189453125,
0.07080078125,
0.072998046875,
-0.046783447265625,
-0.004913330078125,
-0.044342041015625,
-0.0211029052734375,
-0.038360595703125,
0.041839599609375,
-0.01629638671875,
-0.052764892578125,
0.036834716796875,
0.025054931640625,
0.0062713623046875,
0.03826904296875,
0.04241943359375,
0.0024051666259765625,
0.0662841796875,
0.059112548828125,
-0.01107025146484375,
0.044586181640625,
-0.020263671875,
0.0233917236328125,
-0.07073974609375,
-0.032318115234375,
-0.053802490234375,
-0.01111602783203125,
-0.053192138671875,
-0.038299560546875,
0.0019197463989257812,
0.01152801513671875,
-0.04022216796875,
0.034271240234375,
-0.054168701171875,
0.030975341796875,
0.039825439453125,
-0.003726959228515625,
-0.0029048919677734375,
0.0081634521484375,
-0.01049041748046875,
0.006511688232421875,
-0.061798095703125,
-0.04620361328125,
0.0958251953125,
0.04779052734375,
0.051544189453125,
0.002902984619140625,
0.03167724609375,
0.005764007568359375,
0.0285491943359375,
-0.03375244140625,
0.037322998046875,
0.0037899017333984375,
-0.067626953125,
-0.01165008544921875,
-0.027557373046875,
-0.06463623046875,
0.016448974609375,
0.00007283687591552734,
-0.07415771484375,
0.00939178466796875,
0.014984130859375,
-0.011383056640625,
0.023040771484375,
-0.0670166015625,
0.083740234375,
-0.0246429443359375,
-0.03369140625,
-0.0028247833251953125,
-0.056854248046875,
0.035491943359375,
0.01061248779296875,
0.023162841796875,
-0.0028820037841796875,
0.0030689239501953125,
0.0792236328125,
-0.024688720703125,
0.06927490234375,
-0.0244903564453125,
-0.004486083984375,
0.0265960693359375,
-0.0187530517578125,
0.045654296875,
-0.007659912109375,
-0.006931304931640625,
0.035797119140625,
-0.0032806396484375,
-0.04168701171875,
-0.024749755859375,
0.051116943359375,
-0.07281494140625,
-0.0218963623046875,
-0.04302978515625,
-0.043701171875,
0.0037326812744140625,
0.02435302734375,
0.0260009765625,
0.0345458984375,
-0.0099639892578125,
-0.0020904541015625,
0.05267333984375,
-0.040374755859375,
0.03851318359375,
0.03521728515625,
-0.018035888671875,
-0.01395416259765625,
0.06976318359375,
0.01491546630859375,
0.0156097412109375,
0.0267333984375,
0.0113983154296875,
-0.042755126953125,
-0.036834716796875,
-0.0298309326171875,
0.0276641845703125,
-0.055755615234375,
-0.005664825439453125,
-0.064453125,
-0.0213165283203125,
-0.0526123046875,
0.003559112548828125,
-0.023956298828125,
-0.03765869140625,
-0.025360107421875,
-0.01490020751953125,
0.0185699462890625,
0.05267333984375,
0.005344390869140625,
0.038726806640625,
-0.0200042724609375,
0.01366424560546875,
0.020751953125,
0.0372314453125,
0.0033969879150390625,
-0.057342529296875,
-0.0187835693359375,
0.0026798248291015625,
-0.03460693359375,
-0.0587158203125,
0.0219879150390625,
0.00997161865234375,
0.039794921875,
0.01045989990234375,
-0.01073455810546875,
0.035064697265625,
-0.03216552734375,
0.0704345703125,
0.01216888427734375,
-0.06512451171875,
0.03887939453125,
-0.0207977294921875,
0.0290069580078125,
0.019805908203125,
0.026214599609375,
-0.0247955322265625,
-0.0172882080078125,
-0.0648193359375,
-0.07159423828125,
0.07122802734375,
0.022308349609375,
0.0180511474609375,
0.00637054443359375,
0.023162841796875,
-0.01187896728515625,
0.0214691162109375,
-0.095947265625,
-0.027984619140625,
-0.0347900390625,
-0.023956298828125,
-0.0033817291259765625,
-0.025360107421875,
-0.0037994384765625,
-0.03192138671875,
0.07122802734375,
0.0008416175842285156,
0.046844482421875,
-0.00437164306640625,
-0.01311492919921875,
0.014404296875,
0.0085601806640625,
0.058197021484375,
0.032257080078125,
-0.0189056396484375,
0.00959014892578125,
0.01386260986328125,
-0.04400634765625,
-0.004978179931640625,
0.0240936279296875,
-0.0265960693359375,
-0.00010544061660766602,
0.0202484130859375,
0.0748291015625,
0.01221466064453125,
-0.039306640625,
0.041046142578125,
-0.005092620849609375,
-0.00742340087890625,
-0.0241241455078125,
-0.005702972412109375,
0.0078125,
0.007061004638671875,
0.019744873046875,
0.0016088485717773438,
0.007160186767578125,
-0.04052734375,
0.0005450248718261719,
0.047637939453125,
-0.0154571533203125,
-0.03814697265625,
0.06298828125,
0.0217132568359375,
-0.0222320556640625,
0.047332763671875,
-0.03076171875,
-0.0362548828125,
0.040771484375,
0.0455322265625,
0.06689453125,
-0.0212554931640625,
0.0149993896484375,
0.052947998046875,
0.04876708984375,
-0.0183868408203125,
0.004276275634765625,
-0.01146697998046875,
-0.0439453125,
-0.006031036376953125,
-0.059661865234375,
0.006687164306640625,
0.0182037353515625,
-0.03472900390625,
0.0234832763671875,
-0.0255584716796875,
-0.004913330078125,
-0.021575927734375,
0.0106964111328125,
-0.07550048828125,
0.0157470703125,
0.005306243896484375,
0.07122802734375,
-0.07244873046875,
0.07867431640625,
0.032562255859375,
-0.04193115234375,
-0.061309814453125,
-0.0059356689453125,
0.0009937286376953125,
-0.06805419921875,
0.04107666015625,
0.0218963623046875,
0.01169586181640625,
0.00228118896484375,
-0.044281005859375,
-0.060821533203125,
0.09033203125,
0.023956298828125,
-0.03546142578125,
-0.018402099609375,
0.00943756103515625,
0.04302978515625,
-0.018157958984375,
0.04876708984375,
0.039337158203125,
0.0328369140625,
-0.00893402099609375,
-0.07763671875,
0.0120086669921875,
-0.020751953125,
0.01421356201171875,
-0.00872039794921875,
-0.051788330078125,
0.07611083984375,
-0.012420654296875,
-0.0205078125,
0.0016984939575195312,
0.041015625,
0.0182342529296875,
0.0145721435546875,
0.01788330078125,
0.0472412109375,
0.068359375,
-0.011810302734375,
0.10052490234375,
-0.0213165283203125,
0.052093505859375,
0.0936279296875,
-0.006534576416015625,
0.061004638671875,
0.0229949951171875,
-0.0230712890625,
0.035308837890625,
0.046112060546875,
-0.006015777587890625,
0.046539306640625,
0.015625,
-0.0234375,
0.00994873046875,
0.00010669231414794922,
-0.04461669921875,
0.0155487060546875,
0.01488494873046875,
-0.03167724609375,
-0.01119232177734375,
0.0008840560913085938,
0.01137542724609375,
-0.03009033203125,
-0.00820159912109375,
0.045501708984375,
0.00347137451171875,
-0.04400634765625,
0.046539306640625,
0.0246429443359375,
0.053680419921875,
-0.04681396484375,
0.0132904052734375,
-0.0048675537109375,
0.010528564453125,
-0.01428985595703125,
-0.05267333984375,
0.0171661376953125,
0.00670623779296875,
-0.01678466796875,
-0.028045654296875,
0.0513916015625,
-0.042236328125,
-0.050628662109375,
0.0225067138671875,
0.0198211669921875,
0.0220489501953125,
0.0033855438232421875,
-0.07757568359375,
0.0175323486328125,
-0.01386260986328125,
-0.0293121337890625,
0.0288543701171875,
0.006317138671875,
0.0037670135498046875,
0.05120849609375,
0.033843994140625,
-0.00734710693359375,
0.0138397216796875,
-0.003414154052734375,
0.051116943359375,
-0.028289794921875,
-0.031707763671875,
-0.05914306640625,
0.049591064453125,
-0.006641387939453125,
-0.0297393798828125,
0.045623779296875,
0.059539794921875,
0.08172607421875,
-0.006381988525390625,
0.07861328125,
-0.0170135498046875,
0.01094818115234375,
-0.02923583984375,
0.0513916015625,
-0.0262451171875,
0.0157012939453125,
-0.01934814453125,
-0.066162109375,
-0.00681304931640625,
0.07110595703125,
-0.0281524658203125,
0.0313720703125,
0.048309326171875,
0.07135009765625,
-0.01256561279296875,
0.0010623931884765625,
-0.0076904296875,
0.0251922607421875,
0.030426025390625,
0.04296875,
0.0283355712890625,
-0.06231689453125,
0.04168701171875,
-0.01910400390625,
-0.0222320556640625,
-0.00548553466796875,
-0.06402587890625,
-0.06927490234375,
-0.050262451171875,
-0.035369873046875,
-0.0374755859375,
0.01119232177734375,
0.05694580078125,
0.0660400390625,
-0.054168701171875,
-0.0281524658203125,
-0.0261077880859375,
0.01378631591796875,
-0.0078887939453125,
-0.020233154296875,
0.0268402099609375,
-0.013275146484375,
-0.05072021484375,
0.0006461143493652344,
0.0009636878967285156,
0.021636962890625,
-0.02874755859375,
-0.0277557373046875,
-0.01751708984375,
0.001087188720703125,
0.04669189453125,
0.0292205810546875,
-0.057708740234375,
-0.012298583984375,
-0.01340484619140625,
-0.01261138916015625,
-0.00843048095703125,
0.0494384765625,
-0.050811767578125,
0.0232391357421875,
0.027587890625,
0.0323486328125,
0.05548095703125,
-0.0204925537109375,
0.0295257568359375,
-0.06182861328125,
0.03240966796875,
0.00655364990234375,
0.030914306640625,
0.030426025390625,
-0.0146484375,
0.044281005859375,
0.034637451171875,
-0.04901123046875,
-0.07891845703125,
0.0076446533203125,
-0.07635498046875,
-0.0181121826171875,
0.10699462890625,
-0.01265716552734375,
-0.0160369873046875,
-0.01175689697265625,
-0.0123748779296875,
0.02130126953125,
-0.0215301513671875,
0.0572509765625,
0.0601806640625,
0.00910186767578125,
-0.0222930908203125,
-0.05810546875,
0.04302978515625,
0.00997161865234375,
-0.060302734375,
0.00396728515625,
0.018646240234375,
0.04119873046875,
-0.0006031990051269531,
0.0670166015625,
-0.0020503997802734375,
0.00823211669921875,
-0.0018787384033203125,
0.0185394287109375,
0.00583648681640625,
-0.0184173583984375,
-0.024505615234375,
0.006317138671875,
-0.0037174224853515625,
0.0005803108215332031
]
] |
PygmalionAI/pygmalion-2-7b | 2023-09-15T20:29:47.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"text generation",
"instruct",
"en",
"dataset:PygmalionAI/PIPPA",
"dataset:Open-Orca/OpenOrca",
"dataset:Norquinal/claude_multiround_chat_30k",
"dataset:jondurbin/airoboros-gpt4-1.4.1",
"dataset:databricks/databricks-dolly-15k",
"license:llama2",
"text-generation-inference",
"region:us"
] | text-generation | PygmalionAI | null | null | PygmalionAI/pygmalion-2-7b | 33 | 11,372 | transformers | 2023-09-04T22:20:25 | ---
language:
- en
thumbnail: null
tags:
- text generation
- instruct
pipeline_tag: text-generation
inference: false
license: llama2
datasets:
- PygmalionAI/PIPPA
- Open-Orca/OpenOrca
- Norquinal/claude_multiround_chat_30k
- jondurbin/airoboros-gpt4-1.4.1
- databricks/databricks-dolly-15k
---
<h1 style="text-align: center">Pygmalion-2 7B</h1>
<h2 style="text-align: center">An instruction-tuned Llama-2 biased towards fiction writing and conversation.</h2>
## Model Details
The long-awaited release of our new models based on Llama-2 is finally here. Pygmalion-2 7B (formerly known as Metharme) is based on
[Llama-2 7B](https://huggingface.co/meta-llama/llama-2-7b-hf) released by Meta AI.
The Metharme models were an experiment to obtain a model that is usable for conversation, roleplaying and storywriting,
but which can be guided using natural language like other instruct models. After much deliberation, we concluded
that the Metharme prompting format is superior to (and easier to use than) the classic Pygmalion format.
This model was trained with supervised fine-tuning on a mixture of regular instruction data and roleplay, fictional stories,
and conversations with synthetically generated instructions attached.
This model is freely available for both commercial and non-commercial use, as per the Llama-2 license.
## Prompting
The model has been trained on prompts using three different roles, which are denoted by the following tokens: `<|system|>`, `<|user|>` and `<|model|>`.
The `<|system|>` prompt can be used to inject out-of-channel information behind the scenes, while the `<|user|>` prompt should be used to indicate user input.
The `<|model|>` token should then be used to indicate that the model should generate a response. These tokens can appear multiple times and be chained together to
form a conversation history.
### Prompting example
The system prompt has been designed to allow the model to "enter" various modes and dictate the reply length. Here's an example:
```
<|system|>Enter RP mode. Pretend to be {{char}} whose persona follows:
{{persona}}
You shall reply to the user while staying in character, and generate long responses.
```
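As an unofficial sketch, a prompt in this format can be assembled as a plain string and passed to the model with Hugging Face Transformers; the persona, user turn and generation settings below are illustrative assumptions, not values from this card.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "PygmalionAI/pygmalion-2-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Conversation assembled with the three role tokens described above.
prompt = (
    "<|system|>Enter RP mode. Pretend to be Alice whose persona follows:\n"
    "Alice is a cheerful travel guide who loves the outdoors.\n"
    "You shall reply to the user while staying in character, and generate long responses.\n"
    "<|user|>Where should I go hiking this weekend?\n"
    "<|model|>"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
# Print only the newly generated continuation after the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```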
## Dataset
The dataset used to fine-tune this model includes our own [PIPPA](https://huggingface.co/datasets/PygmalionAI/PIPPA), along with several other instruction
datasets, and datasets acquired from various RP forums.
## Limitations and biases
The intended use-case for this model is fictional writing for entertainment purposes. Any other sort of usage is out of scope.
As such, it was **not** fine-tuned to be safe and harmless: the base model _and_ this fine-tune have been trained on data known to contain profanity and texts that
are lewd or otherwise offensive. It may produce socially unacceptable or undesirable text, even if the prompt itself does not include anything explicitly offensive.
Outputs might often be factually wrong or misleading.
## Acknowledgements
We would like to thank [SpicyChat](https://spicychat.ai/) for sponsoring the training for this model.
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl) | 3,309 | [
[
-0.01548004150390625,
-0.059173583984375,
0.017333984375,
0.0309600830078125,
-0.026275634765625,
-0.010589599609375,
-0.023284912109375,
-0.042205810546875,
0.00775909423828125,
0.0234832763671875,
-0.055572509765625,
-0.0237884521484375,
-0.036285400390625,
0.0155792236328125,
0.0005841255187988281,
0.08831787109375,
0.00801849365234375,
-0.0072174072265625,
-0.0274505615234375,
-0.00188446044921875,
-0.06842041015625,
-0.038421630859375,
-0.05157470703125,
-0.034210205078125,
0.05645751953125,
0.027130126953125,
0.05224609375,
0.04156494140625,
0.0164642333984375,
0.01424407958984375,
-0.0157470703125,
0.02691650390625,
-0.027618408203125,
0.0124969482421875,
-0.0003180503845214844,
-0.03143310546875,
-0.05352783203125,
0.0301971435546875,
0.031158447265625,
0.02783203125,
-0.018463134765625,
0.023193359375,
0.0114288330078125,
0.022369384765625,
-0.0433349609375,
0.0215911865234375,
-0.0227813720703125,
-0.001163482666015625,
0.00658416748046875,
0.004215240478515625,
-0.048065185546875,
-0.029510498046875,
-0.0027618408203125,
-0.048370361328125,
-0.0222930908203125,
0.014862060546875,
0.057220458984375,
0.0118255615234375,
-0.0276947021484375,
-0.0273590087890625,
-0.03167724609375,
0.052032470703125,
-0.06951904296875,
0.00901031494140625,
0.0426025390625,
0.0260467529296875,
-0.015838623046875,
-0.06719970703125,
-0.048248291015625,
-0.041778564453125,
-0.006404876708984375,
0.00539398193359375,
-0.02398681640625,
0.005077362060546875,
0.0272064208984375,
0.00995635986328125,
-0.050750732421875,
0.0226287841796875,
-0.0275421142578125,
-0.0139312744140625,
0.0367431640625,
0.018585205078125,
0.0274810791015625,
-0.0218353271484375,
-0.0311126708984375,
-0.0027027130126953125,
-0.0614013671875,
0.01250457763671875,
0.0270843505859375,
0.0038471221923828125,
-0.0318603515625,
0.05267333984375,
-0.00615692138671875,
0.04656982421875,
0.016265869140625,
-0.034332275390625,
0.006206512451171875,
-0.007656097412109375,
-0.01039886474609375,
-0.006717681884765625,
0.0654296875,
0.064208984375,
0.034393310546875,
0.0062103271484375,
-0.0117950439453125,
0.0164947509765625,
0.0151214599609375,
-0.07745361328125,
-0.0224456787109375,
0.01087188720703125,
-0.04925537109375,
-0.034576416015625,
-0.0232391357421875,
-0.043304443359375,
-0.0411376953125,
-0.0108489990234375,
0.0088043212890625,
-0.0269012451171875,
-0.0256195068359375,
-0.014068603515625,
-0.006534576416015625,
0.02349853515625,
0.0304718017578125,
-0.0814208984375,
0.01071929931640625,
0.044586181640625,
0.07037353515625,
-0.006526947021484375,
-0.0277862548828125,
-0.021759033203125,
-0.013671875,
-0.023345947265625,
0.054718017578125,
-0.051422119140625,
-0.0278167724609375,
-0.00949859619140625,
0.00666046142578125,
-0.0175323486328125,
-0.0433349609375,
0.0279083251953125,
-0.0269012451171875,
0.035308837890625,
-0.0130157470703125,
-0.0296630859375,
-0.025146484375,
0.01690673828125,
-0.0265350341796875,
0.07373046875,
0.003971099853515625,
-0.0562744140625,
0.01401519775390625,
-0.052947998046875,
-0.0139617919921875,
-0.011474609375,
0.005680084228515625,
-0.01219940185546875,
-0.0161895751953125,
0.0242156982421875,
0.0305938720703125,
-0.03387451171875,
0.0205078125,
-0.0233917236328125,
-0.043853759765625,
0.03619384765625,
-0.03466796875,
0.06170654296875,
0.0184783935546875,
-0.0194549560546875,
0.01430511474609375,
-0.041412353515625,
-0.0011844635009765625,
0.01535797119140625,
-0.038482666015625,
-0.0031909942626953125,
-0.0017614364624023438,
-0.0173797607421875,
0.009002685546875,
0.0237579345703125,
-0.035552978515625,
0.023651123046875,
-0.0293731689453125,
0.047271728515625,
0.056060791015625,
0.0029888153076171875,
0.03375244140625,
-0.047088623046875,
0.050323486328125,
-0.01462554931640625,
0.03106689453125,
-0.0191192626953125,
-0.05963134765625,
-0.04766845703125,
-0.0256805419921875,
0.013458251953125,
0.0714111328125,
-0.033660888671875,
0.030853271484375,
0.00705718994140625,
-0.03973388671875,
-0.027069091796875,
-0.006069183349609375,
0.04400634765625,
0.043701171875,
0.01318359375,
-0.0126190185546875,
-0.0426025390625,
-0.051727294921875,
-0.00888824462890625,
-0.03546142578125,
-0.004802703857421875,
0.0426025390625,
0.0213775634765625,
-0.021026611328125,
0.04718017578125,
-0.0291290283203125,
0.017730712890625,
-0.0298919677734375,
-0.0062103271484375,
0.00699615478515625,
0.043365478515625,
0.04144287109375,
-0.032958984375,
-0.0309906005859375,
-0.0186920166015625,
-0.07476806640625,
-0.020477294921875,
-0.003520965576171875,
-0.00640106201171875,
0.0198822021484375,
0.017822265625,
-0.06707763671875,
0.037078857421875,
0.034942626953125,
-0.0227203369140625,
0.04425048828125,
0.0010280609130859375,
0.0131683349609375,
-0.0911865234375,
0.0160064697265625,
-0.009429931640625,
-0.004512786865234375,
-0.041748046875,
0.003376007080078125,
0.01116180419921875,
-0.02557373046875,
-0.0305938720703125,
0.051544189453125,
-0.031768798828125,
0.0120697021484375,
-0.0289764404296875,
0.006015777587890625,
-0.0008325576782226562,
0.0479736328125,
-0.0103912353515625,
0.076171875,
0.035980224609375,
-0.0504150390625,
0.0426025390625,
0.03424072265625,
-0.0151214599609375,
0.04339599609375,
-0.07818603515625,
0.03973388671875,
-0.004772186279296875,
0.03778076171875,
-0.0755615234375,
-0.03509521484375,
0.07025146484375,
-0.05914306640625,
0.0275421142578125,
-0.027130126953125,
-0.037506103515625,
-0.030517578125,
-0.025177001953125,
0.0240020751953125,
0.059906005859375,
-0.051239013671875,
0.0284881591796875,
0.0211181640625,
-0.019134521484375,
-0.04718017578125,
-0.04510498046875,
0.01175689697265625,
-0.034942626953125,
-0.061737060546875,
0.004390716552734375,
-0.027740478515625,
-0.008331298828125,
-0.036895751953125,
0.00759124755859375,
-0.0017642974853515625,
0.006153106689453125,
0.05010986328125,
0.0304412841796875,
-0.00882720947265625,
0.0017518997192382812,
0.02105712890625,
0.0036220550537109375,
0.0135345458984375,
0.00809478759765625,
0.05413818359375,
-0.0011034011840820312,
-0.00286865234375,
-0.061370849609375,
0.0261077880859375,
0.03802490234375,
-0.026214599609375,
0.050018310546875,
0.035552978515625,
-0.030242919921875,
0.00858306884765625,
-0.0304107666015625,
-0.0296630859375,
-0.03839111328125,
0.01110076904296875,
-0.006130218505859375,
-0.056610107421875,
0.036346435546875,
-0.002574920654296875,
0.0145721435546875,
0.00860595703125,
0.04205322265625,
-0.00336456298828125,
0.09405517578125,
0.043182373046875,
0.01194000244140625,
0.03961181640625,
0.0027256011962890625,
0.00823211669921875,
-0.0809326171875,
-0.0457763671875,
-0.0272064208984375,
-0.0191497802734375,
-0.03680419921875,
-0.01312255859375,
0.0195159912109375,
0.005207061767578125,
-0.021697998046875,
0.0294036865234375,
-0.040802001953125,
0.0225372314453125,
0.0423583984375,
0.0164031982421875,
-0.003925323486328125,
0.0020904541015625,
-0.00021147727966308594,
-0.010162353515625,
-0.05010986328125,
-0.0614013671875,
0.07965087890625,
0.042144775390625,
0.0712890625,
0.01447296142578125,
0.040679931640625,
0.0215911865234375,
0.01947021484375,
-0.06549072265625,
0.04815673828125,
0.01276397705078125,
-0.039886474609375,
0.00756072998046875,
-0.011260986328125,
-0.07196044921875,
0.007457733154296875,
-0.011749267578125,
-0.0709228515625,
0.00930023193359375,
0.0208587646484375,
-0.037933349609375,
0.0020236968994140625,
-0.070556640625,
0.061492919921875,
0.0035114288330078125,
-0.0142669677734375,
-0.006542205810546875,
-0.06842041015625,
0.035308837890625,
0.0273590087890625,
-0.01032257080078125,
-0.0117950439453125,
-0.007770538330078125,
0.06585693359375,
-0.031951904296875,
0.10748291015625,
-0.0142669677734375,
-0.00795745849609375,
0.038330078125,
0.01239013671875,
0.03668212890625,
0.022705078125,
-0.004718780517578125,
0.004489898681640625,
0.004024505615234375,
-0.01154327392578125,
-0.04107666015625,
0.056671142578125,
-0.0626220703125,
-0.052398681640625,
-0.03204345703125,
-0.045440673828125,
0.007579803466796875,
0.004703521728515625,
0.0269622802734375,
0.038177490234375,
-0.0023059844970703125,
-0.0004553794860839844,
0.0546875,
-0.034515380859375,
0.034820556640625,
0.033416748046875,
-0.01355743408203125,
-0.040679931640625,
0.055755615234375,
-0.0090179443359375,
0.01324462890625,
0.007404327392578125,
0.025848388671875,
-0.028533935546875,
-0.0126190185546875,
-0.053680419921875,
0.0275421142578125,
-0.055511474609375,
-0.023468017578125,
-0.048095703125,
-0.017547607421875,
-0.05029296875,
0.00037932395935058594,
-0.005001068115234375,
-0.036102294921875,
-0.051788330078125,
0.00016891956329345703,
0.038482666015625,
0.0562744140625,
0.00015497207641601562,
0.039031982421875,
-0.041656494140625,
0.0096893310546875,
0.028411865234375,
0.008270263671875,
-0.0001266002655029297,
-0.06646728515625,
0.009552001953125,
0.026397705078125,
-0.03167724609375,
-0.07427978515625,
0.0227203369140625,
0.018829345703125,
0.0350341796875,
0.024200439453125,
0.0033206939697265625,
0.032440185546875,
-0.0294952392578125,
0.07269287109375,
0.0133819580078125,
-0.042816162109375,
0.0533447265625,
-0.0264892578125,
0.0164642333984375,
0.01303863525390625,
0.0222930908203125,
-0.061431884765625,
-0.01226043701171875,
-0.035003662109375,
-0.0450439453125,
0.061248779296875,
0.00638580322265625,
0.043975830078125,
-0.0222015380859375,
0.04608154296875,
0.0114593505859375,
0.022918701171875,
-0.0665283203125,
-0.018585205078125,
-0.035614013671875,
-0.00769805908203125,
0.0170745849609375,
-0.04656982421875,
-0.0014753341674804688,
-0.0227813720703125,
0.0369873046875,
-0.00569915771484375,
0.037628173828125,
0.0029430389404296875,
-0.008514404296875,
-0.00787353515625,
0.006336212158203125,
0.060302734375,
0.05279541015625,
-0.0133209228515625,
0.0014009475708007812,
0.004413604736328125,
-0.04638671875,
-0.006134033203125,
0.004566192626953125,
-0.01495361328125,
-0.01763916015625,
0.0258941650390625,
0.079833984375,
0.0027313232421875,
-0.04901123046875,
0.041473388671875,
-0.00446319580078125,
0.0007920265197753906,
-0.0258636474609375,
0.0277099609375,
0.0052490234375,
0.0343017578125,
0.00247955322265625,
0.00025081634521484375,
0.0037975311279296875,
-0.040802001953125,
-0.00266265869140625,
0.0161285400390625,
-0.00959014892578125,
-0.032562255859375,
0.0633544921875,
0.0247344970703125,
-0.046051025390625,
0.055419921875,
-0.012664794921875,
-0.032073974609375,
0.047454833984375,
0.06805419921875,
0.042816162109375,
-0.0241241455078125,
0.035003662109375,
0.046661376953125,
0.0265350341796875,
-0.0026798248291015625,
0.02008056640625,
-0.0020809173583984375,
-0.025787353515625,
-0.01323699951171875,
-0.0296173095703125,
-0.01837158203125,
0.015716552734375,
-0.039215087890625,
0.0129241943359375,
-0.0692138671875,
-0.0194854736328125,
-0.01081085205078125,
-0.00395965576171875,
-0.020965576171875,
0.005283355712890625,
0.0058746337890625,
0.07037353515625,
-0.059814453125,
0.04718017578125,
0.0574951171875,
-0.042083740234375,
-0.0657958984375,
-0.0022430419921875,
0.0070648193359375,
-0.08135986328125,
0.0264892578125,
0.035888671875,
0.01194000244140625,
0.00048613548278808594,
-0.073974609375,
-0.04266357421875,
0.1005859375,
0.0228118896484375,
-0.029571533203125,
-0.01052093505859375,
-0.00858306884765625,
0.032196044921875,
-0.0391845703125,
0.0380859375,
0.0290985107421875,
0.0246124267578125,
0.00749969482421875,
-0.08270263671875,
0.0256195068359375,
-0.02825927734375,
0.007274627685546875,
-0.01555633544921875,
-0.067626953125,
0.08038330078125,
-0.0264892578125,
-0.0164794921875,
0.0406494140625,
0.05645751953125,
0.03436279296875,
0.0311431884765625,
0.0296630859375,
0.0212860107421875,
0.06256103515625,
0.00852203369140625,
0.07763671875,
-0.008575439453125,
-0.0011796951293945312,
0.0733642578125,
-0.009307861328125,
0.0550537109375,
0.027618408203125,
-0.0010099411010742188,
0.044586181640625,
0.0712890625,
-0.0030040740966796875,
0.036590576171875,
0.01363372802734375,
-0.015716552734375,
-0.01611328125,
-0.027984619140625,
-0.0313720703125,
0.033416748046875,
0.0258026123046875,
-0.0299530029296875,
-0.0009241104125976562,
0.0027923583984375,
0.03271484375,
-0.0007910728454589844,
-0.008514404296875,
0.0601806640625,
0.01523590087890625,
-0.069580078125,
0.08135986328125,
0.00959014892578125,
0.06719970703125,
-0.042510986328125,
-0.017333984375,
-0.04888916015625,
-0.01123046875,
-0.0122222900390625,
-0.045989990234375,
-0.007476806640625,
0.0167999267578125,
-0.0176239013671875,
0.0014734268188476562,
0.06170654296875,
-0.0232086181640625,
-0.0185394287109375,
-0.005828857421875,
0.0195465087890625,
0.032440185546875,
0.0020694732666015625,
-0.06182861328125,
0.015045166015625,
-0.002544403076171875,
-0.01306915283203125,
0.021575927734375,
0.00771331787109375,
-0.0157928466796875,
0.06396484375,
0.039031982421875,
-0.0207061767578125,
-0.00011229515075683594,
0.00470733642578125,
0.07373046875,
-0.0246124267578125,
-0.0306854248046875,
-0.050323486328125,
0.038177490234375,
-0.0027751922607421875,
-0.035888671875,
0.055816650390625,
0.0239105224609375,
0.033721923828125,
-0.005802154541015625,
0.04669189453125,
-0.01995849609375,
0.0377197265625,
-0.040496826171875,
0.0628662109375,
-0.037445068359375,
0.0258941650390625,
-0.004489898681640625,
-0.066162109375,
0.0015964508056640625,
0.060577392578125,
-0.0063934326171875,
0.0229644775390625,
0.041351318359375,
0.084228515625,
-0.007282257080078125,
-0.0035724639892578125,
0.01131439208984375,
0.0018033981323242188,
0.0169525146484375,
0.04180908203125,
0.08868408203125,
-0.0272979736328125,
0.0401611328125,
-0.0289459228515625,
-0.038818359375,
-0.021026611328125,
-0.0577392578125,
-0.10675048828125,
-0.0301513671875,
-0.0209808349609375,
-0.03961181640625,
0.0182952880859375,
0.0816650390625,
0.044525146484375,
-0.035552978515625,
-0.0166778564453125,
0.01447296142578125,
0.0029163360595703125,
-0.00799560546875,
-0.01561737060546875,
0.004688262939453125,
-0.00598907470703125,
-0.051727294921875,
0.041015625,
-0.002162933349609375,
0.01812744140625,
-0.007740020751953125,
-0.01535797119140625,
-0.01190948486328125,
0.00882720947265625,
0.042816162109375,
0.031982421875,
-0.059173583984375,
-0.034515380859375,
0.01812744140625,
-0.01407623291015625,
-0.00180816650390625,
0.04180908203125,
-0.041748046875,
0.01995849609375,
0.0247039794921875,
0.019866943359375,
0.0160064697265625,
-0.003223419189453125,
0.038909912109375,
-0.055389404296875,
0.0254058837890625,
0.0145111083984375,
0.0125274658203125,
0.03851318359375,
-0.042449951171875,
0.024139404296875,
0.023406982421875,
-0.0543212890625,
-0.06866455078125,
0.0174560546875,
-0.07318115234375,
-0.016632080078125,
0.11578369140625,
-0.016815185546875,
-0.0253448486328125,
0.011444091796875,
-0.056610107421875,
0.03192138671875,
-0.0496826171875,
0.05743408203125,
0.041015625,
-0.0074615478515625,
-0.0330810546875,
-0.04241943359375,
0.03851318359375,
0.025482177734375,
-0.057220458984375,
0.007354736328125,
0.05755615234375,
0.037139892578125,
-0.01195526123046875,
0.045379638671875,
0.00543212890625,
0.039031982421875,
-0.0029449462890625,
0.0017423629760742188,
-0.0201568603515625,
-0.03564453125,
-0.0160980224609375,
-0.0243072509765625,
0.01526641845703125,
-0.0292510986328125
]
] |
xiaolxl/Stable-diffusion-models | 2023-02-17T08:47:29.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"has_space",
"region:us"
] | text-to-image | xiaolxl | null | null | xiaolxl/Stable-diffusion-models | 33 | 11,354 | diffusers | 2022-11-22T03:42:40 | ---
license: creativeml-openrail-m
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-diffusers
---
All of these models come from the internet; this repository serves only as a download backup.
[
-0.0069580078125,
-0.035858154296875,
-0.0043792724609375,
0.08154296875,
-0.07196044921875,
0.002216339111328125,
0.01056671142578125,
-0.0197601318359375,
0.04669189453125,
0.055755615234375,
-0.015838623046875,
-0.0043182373046875,
-0.04046630859375,
-0.0027923583984375,
-0.04248046875,
0.05755615234375,
-0.03875732421875,
0.00525665283203125,
0.01617431640625,
0.007289886474609375,
-0.05133056640625,
0.01082611083984375,
-0.050048828125,
-0.0174102783203125,
-0.00763702392578125,
0.028717041015625,
0.05609130859375,
0.039764404296875,
0.046142578125,
0.035858154296875,
0.0214691162109375,
-0.00843048095703125,
-0.0193328857421875,
0.030303955078125,
0.025665283203125,
-0.0499267578125,
-0.059234619140625,
-0.0116119384765625,
0.07830810546875,
0.04583740234375,
-0.006122589111328125,
-0.01116180419921875,
0.030487060546875,
0.0721435546875,
-0.0128631591796875,
0.015960693359375,
-0.0030574798583984375,
0.0289764404296875,
-0.021148681640625,
-0.03875732421875,
0.00881195068359375,
-0.08074951171875,
-0.005283355712890625,
-0.04583740234375,
-0.0251922607421875,
0.01611328125,
0.08251953125,
0.00939178466796875,
-0.044586181640625,
0.0068359375,
-0.056793212890625,
0.0426025390625,
-0.0220489501953125,
-0.0024738311767578125,
-0.0095062255859375,
0.05670166015625,
0.0007724761962890625,
-0.055694580078125,
0.00029397010803222656,
0.006031036376953125,
-0.03515625,
0.020843505859375,
0.015899658203125,
-0.0246429443359375,
0.0007472038269042969,
0.035614013671875,
0.01617431640625,
0.00780487060546875,
-0.054443359375,
-0.01202392578125,
0.029541015625,
0.0308990478515625,
0.05609130859375,
0.0017499923706054688,
-0.0352783203125,
0.022613525390625,
-0.03985595703125,
0.029541015625,
-0.0031871795654296875,
0.01500701904296875,
-0.032379150390625,
0.016937255859375,
-0.02783203125,
0.00794219970703125,
0.0194549560546875,
-0.016571044921875,
0.00908660888671875,
-0.0086822509765625,
-0.0574951171875,
-0.01007080078125,
0.016326904296875,
0.0638427734375,
-0.020111083984375,
-0.0006852149963378906,
-0.02606201171875,
-0.0209503173828125,
-0.02001953125,
-0.05255126953125,
-0.0156097412109375,
0.036224365234375,
-0.045135498046875,
-0.03619384765625,
0.020904541015625,
-0.10528564453125,
0.0048828125,
-0.00418853759765625,
0.029449462890625,
0.0015382766723632812,
-0.072021484375,
-0.00627899169921875,
-0.0139007568359375,
0.0004429817199707031,
0.035858154296875,
-0.04180908203125,
0.0248565673828125,
0.01120758056640625,
0.03955078125,
0.0047760009765625,
-0.006793975830078125,
0.0031337738037109375,
0.03717041015625,
-0.0009021759033203125,
0.041290283203125,
0.01114654541015625,
-0.0496826171875,
0.016387939453125,
0.010986328125,
0.014984130859375,
-0.0338134765625,
0.0806884765625,
-0.044036865234375,
-0.00946807861328125,
-0.038848876953125,
-0.00827789306640625,
-0.0188751220703125,
0.01023101806640625,
-0.048919677734375,
0.044281005859375,
-0.0120086669921875,
-0.061614990234375,
-0.00852203369140625,
-0.0372314453125,
-0.05255126953125,
0.049072265625,
-0.01297760009765625,
0.006313323974609375,
0.01091766357421875,
0.005504608154296875,
0.025482177734375,
-0.01337432861328125,
0.00020194053649902344,
0.032958984375,
0.0010309219360351562,
0.0271759033203125,
-0.003978729248046875,
0.09222412109375,
0.04248046875,
-0.02117919921875,
0.0135650634765625,
-0.0146484375,
0.01031494140625,
0.038238525390625,
-0.016632080078125,
-0.0209503173828125,
-0.0019016265869140625,
0.0305633544921875,
0.017303466796875,
0.07080078125,
-0.01528167724609375,
0.020294189453125,
-0.042022705078125,
0.0235443115234375,
0.07708740234375,
0.0002288818359375,
0.04205322265625,
-0.0552978515625,
0.0325927734375,
0.01139068603515625,
0.017608642578125,
-0.0037994384765625,
-0.01202392578125,
-0.09796142578125,
0.0170440673828125,
-0.01374053955078125,
0.0550537109375,
-0.0623779296875,
0.044708251953125,
0.0146331787109375,
-0.0357666015625,
-0.01129913330078125,
-0.0094146728515625,
0.03045654296875,
0.01337432861328125,
0.0227508544921875,
-0.02508544921875,
-0.05499267578125,
-0.03472900390625,
-0.0012969970703125,
-0.01163482666015625,
-0.005847930908203125,
0.0281982421875,
0.0269317626953125,
-0.0152435302734375,
0.0220794677734375,
-0.0260772705078125,
-0.032257080078125,
-0.01300811767578125,
-0.0092315673828125,
0.024169921875,
0.054351806640625,
0.04498291015625,
-0.06219482421875,
-0.053924560546875,
-0.013702392578125,
-0.0733642578125,
-0.01215362548828125,
0.0160675048828125,
-0.031097412109375,
0.0172271728515625,
-0.0081634521484375,
-0.015838623046875,
0.03399658203125,
0.0255584716796875,
-0.0278472900390625,
0.067138671875,
-0.018463134765625,
0.00261688232421875,
-0.05450439453125,
-0.01274871826171875,
-0.034454345703125,
0.03125,
-0.026580810546875,
0.038421630859375,
-0.0146942138671875,
0.0008134841918945312,
-0.0460205078125,
0.034820556640625,
-0.040313720703125,
0.0178985595703125,
-0.042449951171875,
0.0188751220703125,
0.003887176513671875,
0.005588531494140625,
-0.018951416015625,
0.034942626953125,
0.03973388671875,
-0.0887451171875,
0.0811767578125,
0.041412353515625,
-0.01439666748046875,
0.004970550537109375,
-0.04315185546875,
-0.015228271484375,
-0.0025463104248046875,
0.024383544921875,
-0.081787109375,
-0.032562255859375,
0.0220489501953125,
-0.012725830078125,
0.033782958984375,
0.0292510986328125,
-0.0171661376953125,
-0.063720703125,
-0.03125,
0.025787353515625,
0.0504150390625,
-0.0179290771484375,
-0.0004253387451171875,
0.04534912109375,
-0.0012331008911132812,
-0.048248291015625,
-0.0379638671875,
-0.016937255859375,
0.0040130615234375,
-0.05047607421875,
0.00641632080078125,
0.006008148193359375,
-0.0012254714965820312,
0.0041656494140625,
0.00760650634765625,
-0.0157318115234375,
-0.0033321380615234375,
0.005710601806640625,
0.046173095703125,
-0.030426025390625,
-0.0306549072265625,
0.0092926025390625,
-0.0214385986328125,
0.01074981689453125,
-0.00937652587890625,
0.02349853515625,
0.008514404296875,
-0.0185699462890625,
-0.05804443359375,
0.051910400390625,
0.0247344970703125,
0.0160675048828125,
0.0026187896728515625,
0.017791748046875,
-0.027496337890625,
0.00823211669921875,
-0.041259765625,
0.006053924560546875,
-0.04638671875,
0.00849151611328125,
-0.015106201171875,
-0.06060791015625,
0.0267486572265625,
-0.03790283203125,
-0.00243377685546875,
0.04852294921875,
0.0244903564453125,
-0.012451171875,
0.06317138671875,
0.061126708984375,
-0.031829833984375,
0.0016126632690429688,
-0.0167999267578125,
0.0152435302734375,
-0.04754638671875,
-0.043182373046875,
-0.058837890625,
-0.0386962890625,
-0.048980712890625,
-0.00504302978515625,
0.0213470458984375,
-0.01605224609375,
-0.00405120849609375,
0.03814697265625,
-0.0772705078125,
0.0090789794921875,
0.0189361572265625,
0.023193359375,
0.0015745162963867188,
-0.0491943359375,
-0.003086090087890625,
0.005649566650390625,
-0.031097412109375,
-0.0282745361328125,
0.029632568359375,
0.0458984375,
0.07452392578125,
-0.00439453125,
0.0272216796875,
-0.0139617919921875,
0.0198516845703125,
-0.0263214111328125,
0.0657958984375,
0.00829315185546875,
-0.06085205078125,
-0.0273284912109375,
0.01125335693359375,
-0.07611083984375,
0.0233154296875,
0.0010538101196289062,
-0.060791015625,
-0.0119171142578125,
-0.0131988525390625,
-0.03631591796875,
0.0594482421875,
-0.0222320556640625,
0.03521728515625,
-0.0523681640625,
-0.02325439453125,
-0.0078887939453125,
-0.0345458984375,
0.04388427734375,
0.0167083740234375,
0.0238800048828125,
-0.021881103515625,
-0.024261474609375,
0.0841064453125,
-0.0218353271484375,
0.032684326171875,
-0.04315185546875,
-0.0034809112548828125,
0.0167083740234375,
0.01393890380859375,
0.0450439453125,
0.004131317138671875,
0.029022216796875,
-0.0206756591796875,
0.01186370849609375,
-0.04241943359375,
-0.00035309791564941406,
0.0655517578125,
-0.052459716796875,
-0.07427978515625,
-0.050506591796875,
-0.018585205078125,
0.0215606689453125,
0.027801513671875,
0.036712646484375,
0.004055023193359375,
-0.0235595703125,
-0.009246826171875,
0.027099609375,
-0.03314208984375,
0.05584716796875,
0.043304443359375,
-0.047393798828125,
-0.01434326171875,
0.04412841796875,
0.038665771484375,
-0.023284912109375,
0.027008056640625,
-0.00606536865234375,
-0.01346588134765625,
-0.02984619140625,
-0.021270751953125,
0.01739501953125,
-0.01222991943359375,
-0.0258026123046875,
-0.03863525390625,
-0.0170440673828125,
-0.037811279296875,
-0.0289764404296875,
-0.0160675048828125,
-0.029296875,
-0.002105712890625,
-0.0233001708984375,
-0.0030975341796875,
0.04010009765625,
-0.003204345703125,
0.036376953125,
-0.0733642578125,
0.020904541015625,
0.02972412109375,
0.017578125,
0.030303955078125,
-0.03765869140625,
-0.058502197265625,
-0.0180816650390625,
-0.047027587890625,
-0.047637939453125,
0.058319091796875,
-0.01508331298828125,
0.0198211669921875,
0.0830078125,
0.02044677734375,
0.037261962890625,
-0.0316162109375,
0.04815673828125,
0.033599853515625,
-0.06536865234375,
0.056854248046875,
-0.021148681640625,
-0.0015048980712890625,
0.0243377685546875,
0.03668212890625,
-0.0296478271484375,
-0.0263824462890625,
-0.053863525390625,
-0.07037353515625,
0.043548583984375,
0.01206207275390625,
0.04278564453125,
0.033843994140625,
-0.00801849365234375,
0.011260986328125,
0.03179931640625,
-0.018157958984375,
-0.0574951171875,
-0.0145111083984375,
0.0457763671875,
0.034637451171875,
-0.0504150390625,
-0.017852783203125,
-0.04876708984375,
0.04754638671875,
0.054656982421875,
0.0738525390625,
-0.00046372413635253906,
0.03118896484375,
-0.054901123046875,
0.0360107421875,
0.037139892578125,
0.051971435546875,
-0.016326904296875,
0.0216522216796875,
0.00958251953125,
-0.0418701171875,
0.009368896484375,
-0.00042939186096191406,
-0.036834716796875,
0.02520751953125,
-0.0189971923828125,
0.0172271728515625,
0.01480865478515625,
-0.0002512931823730469,
0.024658203125,
0.0013055801391601562,
0.028564453125,
-0.0665283203125,
0.0037403106689453125,
-0.027496337890625,
0.006664276123046875,
0.06512451171875,
0.01457977294921875,
-0.0015535354614257812,
-0.0266571044921875,
0.0296783447265625,
0.0128021240234375,
-0.015167236328125,
-0.0091705322265625,
0.06256103515625,
0.08160400390625,
-0.0187530517578125,
0.0271759033203125,
0.0232696533203125,
-0.06890869140625,
0.05615234375,
0.043304443359375,
0.051025390625,
-0.056427001953125,
0.0063323974609375,
0.0447998046875,
0.025604248046875,
-0.02117919921875,
0.030426025390625,
0.0100250244140625,
-0.049285888671875,
-0.020233154296875,
-0.0263214111328125,
-0.05340576171875,
0.020355224609375,
-0.030059814453125,
0.01393890380859375,
-0.0625,
0.0009169578552246094,
-0.0036449432373046875,
0.01538848876953125,
-0.0018739700317382812,
0.035858154296875,
-0.011199951171875,
0.110595703125,
-0.04400634765625,
0.08331298828125,
0.0158538818359375,
-0.0291748046875,
-0.041656494140625,
-0.0071258544921875,
-0.01055145263671875,
-0.07171630859375,
0.05615234375,
0.00885772705078125,
0.0101165771484375,
-0.006805419921875,
-0.048187255859375,
-0.046539306640625,
0.06585693359375,
-0.007549285888671875,
-0.006587982177734375,
0.0020999908447265625,
0.00433349609375,
0.0178680419921875,
-0.039215087890625,
0.007289886474609375,
0.0280303955078125,
0.07147216796875,
-0.01268768310546875,
-0.062255859375,
0.04412841796875,
-0.0283050537109375,
-0.01290130615234375,
0.02142333984375,
-0.07623291015625,
0.07452392578125,
-0.003376007080078125,
-0.0243377685546875,
0.02032470703125,
0.07354736328125,
-0.01861572265625,
0.03680419921875,
0.03118896484375,
0.019134521484375,
0.0250396728515625,
-0.042877197265625,
0.050872802734375,
0.003009796142578125,
-0.0017452239990234375,
0.0227508544921875,
-0.004940032958984375,
0.01763916015625,
0.01210784912109375,
-0.044158935546875,
0.03277587890625,
0.036102294921875,
-0.0085601806640625,
0.042205810546875,
-0.02508544921875,
-0.04449462890625,
0.012725830078125,
-0.033538818359375,
-0.0372314453125,
-0.003910064697265625,
0.02655029296875,
-0.0086669921875,
-0.0262603759765625,
-0.0035266876220703125,
0.031707763671875,
0.0025234222412109375,
-0.0272216796875,
0.04876708984375,
0.001804351806640625,
-0.05291748046875,
0.0428466796875,
0.004650115966796875,
0.07171630859375,
-0.053314208984375,
0.00334930419921875,
-0.0034027099609375,
0.01393890380859375,
-0.0386962890625,
-0.06390380859375,
0.031829833984375,
-0.0138397216796875,
0.005802154541015625,
-0.0243377685546875,
0.076416015625,
-0.0273590087890625,
-0.0152130126953125,
0.033843994140625,
0.0202178955078125,
0.0250701904296875,
0.0235443115234375,
-0.0771484375,
-0.005870819091796875,
0.01419830322265625,
-0.006191253662109375,
0.028564453125,
0.0200042724609375,
0.01788330078125,
0.03814697265625,
0.026031494140625,
0.0139007568359375,
-0.005474090576171875,
-0.0288238525390625,
0.0882568359375,
-0.03631591796875,
-0.051025390625,
-0.052154541015625,
0.04595947265625,
-0.024505615234375,
-0.0180206298828125,
0.06072998046875,
0.04443359375,
0.048736572265625,
-0.024658203125,
0.0714111328125,
-0.042999267578125,
0.054534912109375,
0.01023101806640625,
0.04010009765625,
-0.0174407958984375,
-0.02130126953125,
-0.012481689453125,
-0.00975799560546875,
-0.00507354736328125,
0.0347900390625,
-0.00460052490234375,
-0.01161956787109375,
0.08258056640625,
0.0301055908203125,
0.0211181640625,
0.01050567626953125,
0.0180511474609375,
0.01006317138671875,
-0.00914764404296875,
0.0762939453125,
0.032745361328125,
-0.054656982421875,
0.032684326171875,
-0.0634765625,
-0.0271759033203125,
-0.04730224609375,
-0.0355224609375,
-0.04351806640625,
-0.026885986328125,
-0.01708984375,
-0.039764404296875,
-0.0243377685546875,
0.03228759765625,
0.0251617431640625,
-0.085205078125,
-0.054351806640625,
0.040191650390625,
0.04595947265625,
-0.0408935546875,
-0.00794219970703125,
0.05279541015625,
-0.0196533203125,
-0.0726318359375,
-0.005771636962890625,
0.020477294921875,
0.0146484375,
-0.026275634765625,
0.012359619140625,
-0.0160980224609375,
0.00909423828125,
0.011566162109375,
0.04803466796875,
-0.0447998046875,
0.00263214111328125,
-0.0088958740234375,
-0.03582763671875,
0.019256591796875,
0.058837890625,
-0.00572967529296875,
0.0092315673828125,
0.06524658203125,
0.003910064697265625,
0.01953125,
-0.0301513671875,
0.041839599609375,
-0.047637939453125,
0.0506591796875,
0.006725311279296875,
0.036041259765625,
-0.01035308837890625,
-0.054168701171875,
0.04425048828125,
0.037750244140625,
-0.05291748046875,
-0.040435791015625,
0.01177978515625,
-0.091552734375,
-0.0137481689453125,
0.0452880859375,
-0.007312774658203125,
-0.01922607421875,
-0.02764892578125,
-0.052215576171875,
0.06463623046875,
-0.03997802734375,
0.05712890625,
0.018157958984375,
-0.00646209716796875,
0.0011234283447265625,
-0.06414794921875,
0.0399169921875,
0.0223388671875,
-0.04010009765625,
-0.043182373046875,
-0.01038360595703125,
-0.018157958984375,
0.0224456787109375,
0.057373046875,
-0.02313232421875,
0.00232696533203125,
-0.01120758056640625,
-0.01088714599609375,
-0.0011224746704101562,
0.01306915283203125,
0.007442474365234375,
0.039154052734375,
-0.0305938720703125,
-0.06396484375
]
] |
symanto/sn-xlm-roberta-base-snli-mnli-anli-xnli | 2023-02-20T09:49:54.000Z | [
"sentence-transformers",
"pytorch",
"xlm-roberta",
"zero-shot-classification",
"feature-extraction",
"sentence-similarity",
"transformers",
"ar",
"bg",
"de",
"el",
"en",
"es",
"fr",
"ru",
"th",
"tr",
"ur",
"vn",
"zh",
"dataset:SNLI",
"dataset:MNLI",
"dataset:ANLI",
"dataset:XNLI",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | symanto | null | null | symanto/sn-xlm-roberta-base-snli-mnli-anli-xnli | 59 | 11,346 | sentence-transformers | 2022-03-02T23:29:05 | ---
language:
- ar
- bg
- de
- el
- en
- es
- fr
- ru
- th
- tr
- ur
- vn
- zh
datasets:
- SNLI
- MNLI
- ANLI
- XNLI
pipeline_tag: sentence-similarity
tags:
- zero-shot-classification
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
A Siamese network model trained for zero-shot and few-shot text classification.
The base model is [xlm-roberta-base](https://huggingface.co/xlm-roberta-base).
It was trained on [SNLI](https://nlp.stanford.edu/projects/snli/), [MNLI](https://cims.nyu.edu/~sbowman/multinli/), [ANLI](https://github.com/facebookresearch/anli) and [XNLI](https://github.com/facebookresearch/XNLI).
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768-dimensional dense vector space.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('symanto/sn-xlm-roberta-base-snli-mnli-anli-xnli')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('symanto/sn-xlm-roberta-base-snli-mnli-anli-xnli')
model = AutoModel.from_pretrained('symanto/sn-xlm-roberta-base-snli-mnli-anli-xnli')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
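Since the intended use is zero-shot classification, here is a minimal, unofficial sketch of one way to use the embeddings for that: encode the input together with hand-written label descriptions and pick the most similar one. The label texts below are illustrative assumptions; the card does not prescribe a particular label template.
```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("symanto/sn-xlm-roberta-base-snli-mnli-anli-xnli")

text = "I absolutely loved this movie, the acting was superb."
labels = [
    "This text expresses a positive opinion.",   # hypothetical label description
    "This text expresses a negative opinion.",   # hypothetical label description
]

# Embed the input and the label descriptions, then score by cosine similarity.
text_emb = model.encode(text, convert_to_tensor=True)
label_embs = model.encode(labels, convert_to_tensor=True)
scores = util.cos_sim(text_emb, label_embs)[0]

print(labels[int(scores.argmax())])
```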
| 2,725 | [
[
-0.0152587890625,
-0.042236328125,
0.016693115234375,
0.015106201171875,
-0.0190277099609375,
-0.025482177734375,
-0.01212310791015625,
-0.00785064697265625,
0.02569580078125,
0.037353515625,
-0.042144775390625,
-0.039031982421875,
-0.05242919921875,
0.011688232421875,
-0.028717041015625,
0.09002685546875,
-0.0148162841796875,
-0.0036449432373046875,
0.00153350830078125,
-0.0213470458984375,
-0.0244598388671875,
-0.0335693359375,
-0.042633056640625,
-0.027587890625,
0.03118896484375,
0.0218658447265625,
0.0287628173828125,
0.04071044921875,
0.021820068359375,
0.031402587890625,
0.005096435546875,
0.0034160614013671875,
-0.0318603515625,
-0.00684356689453125,
0.0045928955078125,
-0.037811279296875,
-0.01316070556640625,
0.021636962890625,
0.050140380859375,
0.0225982666015625,
-0.00009131431579589844,
0.0095672607421875,
-0.0026645660400390625,
0.027313232421875,
-0.052032470703125,
0.0222015380859375,
-0.034271240234375,
0.0196990966796875,
-0.002704620361328125,
-0.00738525390625,
-0.039642333984375,
0.0029506683349609375,
0.01386260986328125,
-0.0399169921875,
0.007762908935546875,
0.01232147216796875,
0.08709716796875,
0.022369384765625,
-0.02606201171875,
-0.022918701171875,
-0.0223236083984375,
0.06768798828125,
-0.05389404296875,
0.01386260986328125,
0.0193328857421875,
-0.0019159317016601562,
0.00836181640625,
-0.06121826171875,
-0.0655517578125,
0.0078887939453125,
-0.0255279541015625,
0.020904541015625,
-0.02593994140625,
0.01123046875,
0.0164794921875,
0.022216796875,
-0.057647705078125,
0.001895904541015625,
-0.03265380859375,
-0.00498199462890625,
0.045013427734375,
0.0084075927734375,
0.03094482421875,
-0.047698974609375,
-0.03533935546875,
-0.0217742919921875,
-0.013916015625,
-0.007549285888671875,
0.0026912689208984375,
0.01557159423828125,
-0.0232696533203125,
0.05780029296875,
-0.0170135498046875,
0.047515869140625,
-0.0012264251708984375,
-0.0030517578125,
0.037353515625,
-0.0250244140625,
-0.034698486328125,
0.0125732421875,
0.0848388671875,
0.033935546875,
0.033050537109375,
-0.0010156631469726562,
-0.0097808837890625,
0.01259613037109375,
0.0089569091796875,
-0.0654296875,
-0.02325439453125,
0.0119476318359375,
-0.0255889892578125,
-0.040252685546875,
0.005031585693359375,
-0.04754638671875,
-0.005710601806640625,
-0.0014514923095703125,
0.05780029296875,
-0.0304107666015625,
-0.004261016845703125,
0.0222015380859375,
-0.018157958984375,
0.016387939453125,
-0.0076751708984375,
-0.05908203125,
0.01407623291015625,
0.01800537109375,
0.093505859375,
0.001506805419921875,
-0.03692626953125,
-0.0335693359375,
-0.0019702911376953125,
-0.00490570068359375,
0.052032470703125,
-0.01519775390625,
-0.0038623809814453125,
0.0023021697998046875,
0.01251220703125,
-0.0501708984375,
-0.0183563232421875,
0.028289794921875,
-0.022918701171875,
0.05078125,
0.017913818359375,
-0.060546875,
-0.0196075439453125,
0.03265380859375,
-0.038543701171875,
0.07275390625,
0.01398468017578125,
-0.0882568359375,
0.00571441650390625,
-0.060028076171875,
-0.00910186767578125,
-0.0093841552734375,
-0.006938934326171875,
-0.055450439453125,
0.00045037269592285156,
0.0188751220703125,
0.06396484375,
0.014739990234375,
0.0119476318359375,
-0.02777099609375,
-0.039306640625,
0.028564453125,
-0.0239715576171875,
0.08245849609375,
0.00466156005859375,
-0.03314208984375,
0.01434326171875,
-0.0538330078125,
-0.00628662109375,
0.019622802734375,
-0.01312255859375,
-0.0186309814453125,
-0.0259857177734375,
0.030364990234375,
0.0273284912109375,
-0.005779266357421875,
-0.056121826171875,
0.017913818359375,
-0.04376220703125,
0.056243896484375,
0.035919189453125,
0.00640106201171875,
0.04608154296875,
-0.0283050537109375,
0.015960693359375,
0.0298614501953125,
-0.0166168212890625,
-0.0286102294921875,
-0.0276947021484375,
-0.0848388671875,
-0.016876220703125,
0.0189056396484375,
0.0550537109375,
-0.066650390625,
0.049530029296875,
-0.03314208984375,
-0.032440185546875,
-0.040130615234375,
0.006561279296875,
0.0186309814453125,
0.037689208984375,
0.036834716796875,
-0.016082763671875,
-0.041046142578125,
-0.053070068359375,
0.003814697265625,
-0.0083465576171875,
0.0011844635009765625,
0.0169677734375,
0.05596923828125,
-0.0230255126953125,
0.07305908203125,
-0.046112060546875,
-0.048614501953125,
-0.03216552734375,
0.024749755859375,
0.01885986328125,
0.046905517578125,
0.04315185546875,
-0.04443359375,
-0.030548095703125,
-0.0240325927734375,
-0.0650634765625,
0.006908416748046875,
-0.0204925537109375,
-0.01515960693359375,
0.0171356201171875,
0.0341796875,
-0.0576171875,
0.04058837890625,
0.03619384765625,
-0.027587890625,
0.03790283203125,
-0.00759124755859375,
-0.012054443359375,
-0.10699462890625,
0.002902984619140625,
0.0145721435546875,
-0.0271759033203125,
-0.046875,
0.005146026611328125,
-0.0008087158203125,
-0.011871337890625,
-0.04193115234375,
0.053314208984375,
-0.0301971435546875,
0.0107574462890625,
-0.0253448486328125,
0.0264739990234375,
0.0077972412109375,
0.05340576171875,
0.010833740234375,
0.0445556640625,
0.04827880859375,
-0.045379638671875,
0.035003662109375,
0.0335693359375,
-0.0173797607421875,
0.0284576416015625,
-0.0521240234375,
0.005474090576171875,
-0.01317596435546875,
0.033538818359375,
-0.08062744140625,
-0.0168609619140625,
0.033111572265625,
-0.045135498046875,
0.021881103515625,
0.01519775390625,
-0.040924072265625,
-0.035919189453125,
-0.02850341796875,
0.025146484375,
0.02410888671875,
-0.04144287109375,
0.038848876953125,
0.01593017578125,
0.003963470458984375,
-0.040771484375,
-0.0816650390625,
0.0055389404296875,
-0.020721435546875,
-0.038177490234375,
0.0308685302734375,
0.0010271072387695312,
0.0168914794921875,
0.0116119384765625,
0.02581787109375,
-0.01532745361328125,
-0.0018091201782226562,
0.004718780517578125,
0.021820068359375,
-0.0190277099609375,
0.0166778564453125,
0.006816864013671875,
-0.0200347900390625,
-0.0009775161743164062,
-0.0267791748046875,
0.046295166015625,
-0.0091400146484375,
0.0000699758529663086,
-0.04986572265625,
0.0128936767578125,
0.025482177734375,
-0.01042938232421875,
0.06573486328125,
0.0784912109375,
-0.03826904296875,
-0.01107025146484375,
-0.0305328369140625,
-0.0168609619140625,
-0.03253173828125,
0.038970947265625,
-0.0255279541015625,
-0.07293701171875,
0.0244293212890625,
0.0154266357421875,
0.00688934326171875,
0.038055419921875,
0.03656005859375,
-0.005645751953125,
0.0706787109375,
0.059112548828125,
-0.0252838134765625,
0.040740966796875,
-0.050567626953125,
0.014984130859375,
-0.0662841796875,
0.0077972412109375,
-0.01430511474609375,
-0.0227813720703125,
-0.049407958984375,
-0.0232391357421875,
0.0148162841796875,
-0.00045680999755859375,
-0.0204010009765625,
0.04644775390625,
-0.048065185546875,
0.02947998046875,
0.0379638671875,
0.0133514404296875,
0.0013790130615234375,
0.0031185150146484375,
-0.0172119140625,
-0.0006003379821777344,
-0.06280517578125,
-0.035888671875,
0.0634765625,
0.0357666015625,
0.034942626953125,
-0.00937652587890625,
0.06622314453125,
-0.01015472412109375,
0.0235595703125,
-0.0672607421875,
0.03204345703125,
-0.024078369140625,
-0.03985595703125,
-0.022705078125,
-0.032012939453125,
-0.062164306640625,
0.028656005859375,
-0.015899658203125,
-0.0712890625,
0.0049896240234375,
-0.0114288330078125,
-0.037139892578125,
0.0284576416015625,
-0.0550537109375,
0.0782470703125,
0.01554107666015625,
-0.01322174072265625,
0.0008697509765625,
-0.052520751953125,
0.03582763671875,
0.0117645263671875,
-0.0034961700439453125,
-0.004718780517578125,
0.006961822509765625,
0.06939697265625,
-0.0153045654296875,
0.07757568359375,
-0.0058135986328125,
0.0236968994140625,
0.0228424072265625,
-0.0181121826171875,
0.0028553009033203125,
0.0009160041809082031,
-0.002002716064453125,
0.013275146484375,
0.0015115737915039062,
-0.0290374755859375,
-0.048126220703125,
0.048675537109375,
-0.076904296875,
-0.0180511474609375,
-0.046051025390625,
-0.0310821533203125,
0.00962066650390625,
0.0200347900390625,
0.040618896484375,
0.038360595703125,
-0.00562286376953125,
0.0196685791015625,
0.0400390625,
-0.0167236328125,
0.05908203125,
0.005096435546875,
-0.01435089111328125,
-0.03826904296875,
0.06024169921875,
0.0086669921875,
0.0019216537475585938,
0.035736083984375,
0.01055908203125,
-0.031524658203125,
-0.007297515869140625,
-0.047607421875,
0.036529541015625,
-0.04754638671875,
-0.027557373046875,
-0.07708740234375,
-0.055389404296875,
-0.040771484375,
-0.0110015869140625,
-0.01824951171875,
-0.0251617431640625,
-0.05096435546875,
-0.0132904052734375,
0.0203094482421875,
0.040679931640625,
0.00930023193359375,
0.046722412109375,
-0.05908203125,
0.0086822509765625,
0.01201629638671875,
0.0101776123046875,
-0.004459381103515625,
-0.051177978515625,
-0.017242431640625,
-0.006328582763671875,
-0.0350341796875,
-0.07598876953125,
0.04949951171875,
0.02569580078125,
0.0231170654296875,
0.034088134765625,
0.00323486328125,
0.046051025390625,
-0.03912353515625,
0.0487060546875,
0.0238037109375,
-0.07733154296875,
0.033538818359375,
-0.006610870361328125,
0.0325927734375,
0.03509521484375,
0.033172607421875,
-0.0487060546875,
-0.023590087890625,
-0.0447998046875,
-0.05096435546875,
0.0670166015625,
0.03375244140625,
0.03704833984375,
-0.01702880859375,
0.036865234375,
-0.023223876953125,
0.01088714599609375,
-0.0933837890625,
-0.0307464599609375,
-0.033050537109375,
-0.044403076171875,
-0.0264434814453125,
-0.0137786865234375,
0.0149383544921875,
-0.0231170654296875,
0.058197021484375,
-0.017181396484375,
0.03875732421875,
0.0298614501953125,
-0.02777099609375,
0.00962066650390625,
0.01230621337890625,
0.0474853515625,
0.00913238525390625,
-0.00420379638671875,
0.0171051025390625,
0.0186004638671875,
-0.022796630859375,
0.0151824951171875,
0.01422882080078125,
-0.01552581787109375,
0.0215606689453125,
0.031524658203125,
0.0889892578125,
0.03277587890625,
-0.02294921875,
0.0653076171875,
-0.01125335693359375,
-0.019134521484375,
-0.026214599609375,
0.004123687744140625,
0.0036525726318359375,
0.01004791259765625,
0.007701873779296875,
-0.0025730133056640625,
0.00609588623046875,
-0.0244598388671875,
0.028564453125,
0.015716552734375,
-0.03802490234375,
-0.01332855224609375,
0.046905517578125,
0.00344085693359375,
-0.016448974609375,
0.0618896484375,
-0.0288848876953125,
-0.0576171875,
0.029510498046875,
0.04632568359375,
0.08026123046875,
-0.00537872314453125,
0.0300750732421875,
0.04931640625,
0.01451873779296875,
0.005767822265625,
-0.010833740234375,
-0.0059814453125,
-0.076904296875,
-0.0215606689453125,
-0.0565185546875,
-0.0039215087890625,
0.00785064697265625,
-0.0447998046875,
0.0343017578125,
-0.026519775390625,
-0.00386810302734375,
0.00420379638671875,
0.008148193359375,
-0.045623779296875,
0.01529693603515625,
0.0085296630859375,
0.058624267578125,
-0.083984375,
0.061248779296875,
0.05462646484375,
-0.04901123046875,
-0.056732177734375,
-0.00597381591796875,
-0.00543975830078125,
-0.060089111328125,
0.044586181640625,
0.05413818359375,
0.0120849609375,
0.0250244140625,
-0.038970947265625,
-0.062103271484375,
0.09039306640625,
0.0030269622802734375,
-0.0305938720703125,
-0.014404296875,
0.00012874603271484375,
0.03509521484375,
-0.042510986328125,
0.038909912109375,
0.02825927734375,
0.01094818115234375,
0.006195068359375,
-0.060821533203125,
0.005828857421875,
-0.022186279296875,
0.01551055908203125,
-0.008941650390625,
-0.04608154296875,
0.0621337890625,
-0.0081787109375,
-0.0231781005859375,
0.00838470458984375,
0.0654296875,
0.032257080078125,
0.0184478759765625,
0.046142578125,
0.05303955078125,
0.0377197265625,
-0.0019044876098632812,
0.0682373046875,
-0.0180511474609375,
0.0552978515625,
0.065185546875,
-0.0034236907958984375,
0.08197021484375,
0.0242156982421875,
0.006378173828125,
0.055023193359375,
0.048553466796875,
-0.01398468017578125,
0.032989501953125,
0.01751708984375,
-0.0129241943359375,
0.00008696317672729492,
-0.0019426345825195312,
-0.00440216064453125,
0.05291748046875,
0.01222991943359375,
-0.03558349609375,
-0.0178985595703125,
0.0276336669921875,
0.0164947509765625,
-0.01299285888671875,
-0.009185791015625,
0.062103271484375,
0.01543426513671875,
-0.03448486328125,
0.03167724609375,
0.0229949951171875,
0.078125,
-0.02679443359375,
0.00551605224609375,
-0.0011606216430664062,
0.0277252197265625,
-0.0107269287109375,
-0.04693603515625,
0.030609130859375,
-0.004215240478515625,
-0.01092529296875,
-0.007701873779296875,
0.0550537109375,
-0.054107666015625,
-0.0491943359375,
0.0214996337890625,
0.02508544921875,
0.0220489501953125,
-0.0017442703247070312,
-0.07562255859375,
-0.0007162094116210938,
-0.006877899169921875,
-0.0335693359375,
0.018798828125,
0.018157958984375,
0.0125732421875,
0.041229248046875,
0.032073974609375,
-0.00620269775390625,
0.0139617919921875,
0.00916290283203125,
0.0645751953125,
-0.047119140625,
-0.0531005859375,
-0.07159423828125,
0.037353515625,
-0.01308441162109375,
-0.03289794921875,
0.051483154296875,
0.035797119140625,
0.07025146484375,
-0.00374603271484375,
0.03582763671875,
0.0030384063720703125,
0.017059326171875,
-0.0447998046875,
0.0689697265625,
-0.04248046875,
-0.0189666748046875,
-0.0225982666015625,
-0.06817626953125,
-0.03900146484375,
0.0792236328125,
-0.0145721435546875,
0.0126190185546875,
0.060546875,
0.05401611328125,
-0.01194000244140625,
-0.00539398193359375,
0.0208892822265625,
0.0218963623046875,
0.01070404052734375,
0.040985107421875,
0.03192138671875,
-0.06475830078125,
0.03656005859375,
-0.040313720703125,
-0.0181427001953125,
-0.0037593841552734375,
-0.05584716796875,
-0.0809326171875,
-0.047119140625,
-0.04315185546875,
-0.02288818359375,
-0.00965118408203125,
0.070068359375,
0.0703125,
-0.061248779296875,
-0.0133209228515625,
-0.023101806640625,
-0.007602691650390625,
0.00476837158203125,
-0.02783203125,
0.03131103515625,
-0.037506103515625,
-0.063720703125,
0.001678466796875,
0.0132598876953125,
0.015411376953125,
-0.0308074951171875,
0.0120697021484375,
-0.038665771484375,
0.0037517547607421875,
0.041473388671875,
-0.00395965576171875,
-0.0487060546875,
-0.0185394287109375,
0.00496673583984375,
-0.0267486572265625,
0.005786895751953125,
0.031005859375,
-0.04632568359375,
0.0279083251953125,
0.02752685546875,
0.046051025390625,
0.05462646484375,
-0.0095367431640625,
0.019805908203125,
-0.0748291015625,
0.0165252685546875,
-0.0005102157592773438,
0.03857421875,
0.036407470703125,
-0.0261688232421875,
0.032318115234375,
0.0279541015625,
-0.03985595703125,
-0.057830810546875,
-0.004833221435546875,
-0.0721435546875,
-0.01192474365234375,
0.080322265625,
-0.03179931640625,
-0.040924072265625,
0.00868988037109375,
-0.01837158203125,
0.03814697265625,
-0.00940704345703125,
0.0496826171875,
0.06329345703125,
-0.007793426513671875,
-0.04290771484375,
-0.018646240234375,
0.015411376953125,
0.03125,
-0.048004150390625,
-0.022430419921875,
0.009979248046875,
0.0246124267578125,
0.0318603515625,
0.056610107421875,
0.007442474365234375,
0.005596160888671875,
0.004428863525390625,
0.00872802734375,
-0.001003265380859375,
-0.0172119140625,
-0.04815673828125,
0.004489898681640625,
-0.03277587890625,
-0.0418701171875
]
] |
upstage/SOLAR-0-70b-16bit | 2023-09-13T09:14:02.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"upstage",
"llama-2",
"instruct",
"instruction",
"en",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | upstage | null | null | upstage/SOLAR-0-70b-16bit | 220 | 11,338 | transformers | 2023-07-30T01:10:53 | ---
language:
- en
tags:
- upstage
- llama-2
- instruct
- instruction
pipeline_tag: text-generation
---
# Updates
Solar, a new bot created by Upstage, is now available on **Poe**. As a top-ranked model on the HuggingFace Open LLM leaderboard and a fine-tune of Llama 2, Solar is a great example of the progress enabled by open source.
Try now at https://poe.com/Solar-0-70b
# SOLAR-0-70b-16bit model card
The model name has been changed from LLaMa-2-70b-instruct-v2 to SOLAR-0-70b-16bit.
## Model Details
* **Developed by**: [Upstage](https://en.upstage.ai)
* **Backbone Model**: [LLaMA-2](https://github.com/facebookresearch/llama/tree/main)
* **Language(s)**: English
* **Library**: [HuggingFace Transformers](https://github.com/huggingface/transformers)
* **License**: Fine-tuned checkpoints are licensed under the Non-Commercial Creative Commons license ([CC BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/))
* **Where to send comments**: To provide feedback or comments on the model, open an issue in the [Hugging Face community's model repository](https://huggingface.co/upstage/Llama-2-70b-instruct-v2/discussions)
* **Contact**: For questions and comments about the model, please email [contact@upstage.ai](mailto:contact@upstage.ai)
## Dataset Details
### Used Datasets
- Orca-style dataset
- Alpaca-style dataset
- No dataset other than those mentioned above was used
- No benchmark test set or training set was used
### Prompt Template
```
### System:
{System}
### User:
{User}
### Assistant:
{Assistant}
```
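For programmatic use, the template above can be assembled with a small helper along these lines (an illustrative sketch; `build_prompt` is not part of the released code):
```python
def build_prompt(user_message: str, system_message: str = "") -> str:
    # Assemble a prompt in the instruction format shown above (hypothetical helper)
    prompt = ""
    if system_message:
        prompt += f"### System:\n{system_message}\n\n"
    prompt += f"### User:\n{user_message}\n\n### Assistant:\n"
    return prompt

# Produces the same string as the prompt used in the Usage section below:
# build_prompt("Thomas is healthy, but he has to go to the hospital. What could be the reasons?")
```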
## Usage
- The following was tested on an A100 80GB GPU
- Our model can handle up to 10k+ input tokens, thanks to the `rope_scaling` option
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer
tokenizer = AutoTokenizer.from_pretrained("upstage/Llama-2-70b-instruct-v2")
model = AutoModelForCausalLM.from_pretrained(
    "upstage/Llama-2-70b-instruct-v2",
    device_map="auto",
    torch_dtype=torch.float16,
    load_in_8bit=True,  # requires the bitsandbytes package
    rope_scaling={"type": "dynamic", "factor": 2}  # allows handling of longer inputs
)
prompt = "### User:\nThomas is healthy, but he has to go to the hospital. What could be the reasons?\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
del inputs["token_type_ids"]  # LLaMA models do not accept token_type_ids
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
# max_new_tokens must be an integer; generation still stops early at the EOS token
output = model.generate(**inputs, streamer=streamer, use_cache=True, max_new_tokens=4096)
output_text = tokenizer.decode(output[0], skip_special_tokens=True)
```
## Hardware and Software
* **Hardware**: We trained our model on A100 GPUs (A100 x8 * 4, i.e., four nodes of eight GPUs each)
* **Training Factors**: We fine-tuned this model using a combination of the [DeepSpeed library](https://github.com/microsoft/DeepSpeed) and the [HuggingFace Trainer](https://huggingface.co/docs/transformers/main_classes/trainer) / [HuggingFace Accelerate](https://huggingface.co/docs/accelerate/index)
## Evaluation Results
### Overview
- We evaluated our model on the four benchmark datasets used by the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard): `ARC-Challenge`, `HellaSwag`, `MMLU`, and `TruthfulQA`.
We used the [lm-evaluation-harness repository](https://github.com/EleutherAI/lm-evaluation-harness), specifically commit [b281b0921b636bc36ad05c0b0b0763bd6dd43463](https://github.com/EleutherAI/lm-evaluation-harness/tree/b281b0921b636bc36ad05c0b0b0763bd6dd43463).
- We used [MT-bench](https://github.com/lm-sys/FastChat/tree/main/fastchat/llm_judge), a set of challenging multi-turn open-ended questions, to evaluate the models
### Main Results
| Model | H4(Avg) | ARC | HellaSwag | MMLU | TruthfulQA | | MT_Bench |
|--------------------------------------------------------------------|----------|----------|----------|------|----------|-|-------------|
| **[Llama-2-70b-instruct-v2](https://huggingface.co/upstage/Llama-2-70b-instruct-v2)**(***Ours***, ***Open LLM Leaderboard***) | **73** | **71.1** | **87.9** | **70.6** | **62.2** | | **7.44063** |
| [Llama-2-70b-instruct](https://huggingface.co/upstage/Llama-2-70b-instruct) (Ours, Open LLM Leaderboard) | 72.3 | 70.9 | 87.5 | 69.8 | 61 | | 7.24375 |
| [llama-65b-instruct](https://huggingface.co/upstage/llama-65b-instruct) (Ours, Open LLM Leaderboard) | 69.4 | 67.6 | 86.5 | 64.9 | 58.8 | | |
| Llama-2-70b-hf | 67.3 | 67.3 | 87.3 | 69.8 | 44.9 | | |
| [llama-30b-instruct-2048](https://huggingface.co/upstage/llama-30b-instruct-2048) (Ours, Open LLM Leaderboard) | 67.0 | 64.9 | 84.9 | 61.9 | 56.3 | | |
| [llama-30b-instruct](https://huggingface.co/upstage/llama-30b-instruct) (Ours, Open LLM Leaderboard) | 65.2 | 62.5 | 86.2 | 59.4 | 52.8 | | |
| llama-65b | 64.2 | 63.5 | 86.1 | 63.9 | 43.4 | | |
| falcon-40b-instruct | 63.4 | 61.6 | 84.3 | 55.4 | 52.5 | | |
### Scripts for H4 Score Reproduction
- Prepare the evaluation environment (a sketch of example run commands follows the snippet below):
```
# clone the repository
git clone https://github.com/EleutherAI/lm-evaluation-harness.git
# change to the repository directory
cd lm-evaluation-harness
# check out the specific commit
git checkout b281b0921b636bc36ad05c0b0b0763bd6dd43463
```
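The snippet above only prepares the environment. A rough sketch of a per-task run follows; the flags, task names, and few-shot counts are assumptions based on the harness CLI at that commit and the Open LLM Leaderboard convention, not commands taken from this card:
```
# ARC-Challenge, 25-shot (HellaSwag: 10-shot, hendrycksTest-* for MMLU: 5-shot, truthfulqa_mc: 0-shot)
python main.py \
    --model hf-causal-experimental \
    --model_args pretrained=upstage/Llama-2-70b-instruct-v2 \
    --tasks arc_challenge \
    --num_fewshot 25 \
    --batch_size 1 \
    --output_path results/arc_challenge.json
```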
## Contact Us
### About Upstage
- [Upstage](https://en.upstage.ai) is a company specializing in Large Language Models (LLMs) and AI. We will help you build private LLMs and related applications.
If you have a dataset for building domain-specific LLMs or LLM applications, please contact us ► [click here to contact](https://www.upstage.ai/private-llm?utm_source=huggingface&utm_medium=link&utm_campaign=privatellm)
- As of August 1st, our 70B model holds the top spot in the Open LLM Leaderboard rankings, making it the current leading performer globally.
[
-0.0258026123046875,
-0.0428466796875,
0.024627685546875,
0.0290985107421875,
-0.0299835205078125,
0.0099639892578125,
-0.0126495361328125,
-0.039215087890625,
0.0302581787109375,
0.013671875,
-0.048431396484375,
-0.042083740234375,
-0.0533447265625,
0.00262451171875,
-0.0254364013671875,
0.0777587890625,
-0.01953125,
-0.0135345458984375,
-0.01012420654296875,
-0.0214080810546875,
-0.0245208740234375,
-0.03533935546875,
-0.046630859375,
-0.039093017578125,
0.0203094482421875,
0.029022216796875,
0.049072265625,
0.032501220703125,
0.04327392578125,
0.0277252197265625,
-0.028076171875,
0.0156402587890625,
-0.031951904296875,
-0.006053924560546875,
0.017913818359375,
-0.03228759765625,
-0.06268310546875,
-0.0008335113525390625,
0.054046630859375,
0.01904296875,
-0.0301055908203125,
0.03594970703125,
0.005706787109375,
0.052337646484375,
-0.0212249755859375,
0.012664794921875,
-0.0361328125,
0.00553131103515625,
-0.0184478759765625,
0.003337860107421875,
-0.0031223297119140625,
-0.0246124267578125,
-0.01218414306640625,
-0.040283203125,
-0.0088348388671875,
0.0010824203491210938,
0.0877685546875,
0.02813720703125,
-0.005886077880859375,
-0.00762939453125,
-0.029449462890625,
0.048248291015625,
-0.055938720703125,
0.0219573974609375,
0.018341064453125,
0.01507568359375,
-0.0089111328125,
-0.05535888671875,
-0.04498291015625,
-0.0221710205078125,
-0.0009851455688476562,
0.0146484375,
-0.029937744140625,
-0.01174163818359375,
0.0193634033203125,
0.040924072265625,
-0.0239715576171875,
0.02691650390625,
-0.0235748291015625,
-0.01247406005859375,
0.06829833984375,
0.020843505859375,
0.017425537109375,
-0.0185699462890625,
-0.04339599609375,
-0.0206451416015625,
-0.057952880859375,
0.035308837890625,
0.0218353271484375,
0.0037136077880859375,
-0.0404052734375,
0.05340576171875,
-0.010589599609375,
0.0343017578125,
0.0198974609375,
-0.01129150390625,
0.039276123046875,
-0.0313720703125,
-0.0308074951171875,
-0.01061248779296875,
0.0655517578125,
0.04132080078125,
0.0054168701171875,
0.0181732177734375,
-0.02252197265625,
0.004638671875,
-0.0077056884765625,
-0.0726318359375,
-0.00024509429931640625,
0.024017333984375,
-0.0276641845703125,
-0.0312347412109375,
-0.0101470947265625,
-0.051422119140625,
-0.028289794921875,
-0.0018053054809570312,
0.0243988037109375,
-0.01922607421875,
-0.022857666015625,
0.01416778564453125,
0.00608062744140625,
0.025634765625,
0.038909912109375,
-0.040679931640625,
0.01873779296875,
0.031341552734375,
0.0679931640625,
-0.00904083251953125,
-0.0271453857421875,
-0.0198211669921875,
-0.01904296875,
-0.007228851318359375,
0.04803466796875,
-0.017822265625,
-0.0301971435546875,
-0.0216064453125,
0.01102447509765625,
-0.022674560546875,
-0.04010009765625,
0.046417236328125,
-0.0233612060546875,
0.01508331298828125,
-0.018707275390625,
-0.04180908203125,
-0.01145172119140625,
0.0179595947265625,
-0.030517578125,
0.10205078125,
0.015655517578125,
-0.05145263671875,
0.006870269775390625,
-0.05096435546875,
-0.001552581787109375,
-0.013580322265625,
-0.0023784637451171875,
-0.0538330078125,
-0.00899505615234375,
0.0206451416015625,
0.04534912109375,
-0.0313720703125,
0.01324462890625,
-0.0204315185546875,
-0.035064697265625,
0.0144805908203125,
-0.004238128662109375,
0.06622314453125,
0.005893707275390625,
-0.037017822265625,
0.027801513671875,
-0.0648193359375,
-0.00667572021484375,
0.04736328125,
-0.0270538330078125,
0.006946563720703125,
-0.018402099609375,
-0.0081329345703125,
0.01442718505859375,
0.0215301513671875,
-0.050689697265625,
0.0267181396484375,
-0.02484130859375,
0.031829833984375,
0.07586669921875,
-0.012176513671875,
0.0210723876953125,
-0.041259765625,
0.04052734375,
-0.0024623870849609375,
0.0287933349609375,
0.00431060791015625,
-0.055450439453125,
-0.0712890625,
-0.030609130859375,
0.01171112060546875,
0.04241943359375,
-0.0233612060546875,
0.043060302734375,
-0.01042938232421875,
-0.05859375,
-0.06201171875,
0.017425537109375,
0.04010009765625,
0.038665771484375,
0.0276947021484375,
-0.04217529296875,
-0.03863525390625,
-0.061126708984375,
0.007110595703125,
-0.01165771484375,
-0.002407073974609375,
0.034515380859375,
0.05682373046875,
-0.0362548828125,
0.0428466796875,
-0.0357666015625,
-0.026397705078125,
-0.0251312255859375,
-0.01531219482421875,
0.0426025390625,
0.032958984375,
0.05084228515625,
-0.0355224609375,
-0.029815673828125,
-0.007328033447265625,
-0.06085205078125,
-0.00226593017578125,
0.00569915771484375,
-0.01715087890625,
0.0251312255859375,
0.0088348388671875,
-0.0748291015625,
0.0421142578125,
0.047882080078125,
-0.0357666015625,
0.051971435546875,
-0.01142120361328125,
0.00690460205078125,
-0.0809326171875,
0.01206207275390625,
-0.00576019287109375,
-0.005298614501953125,
-0.0295562744140625,
0.01058197021484375,
0.0006384849548339844,
0.00302886962890625,
-0.04046630859375,
0.05792236328125,
-0.0297393798828125,
-0.01375579833984375,
-0.003086090087890625,
0.0162200927734375,
0.00606536865234375,
0.050048828125,
-0.014495849609375,
0.054656982421875,
0.0421142578125,
-0.032470703125,
0.0304107666015625,
0.030731201171875,
-0.035491943359375,
0.036529541015625,
-0.06231689453125,
0.0204620361328125,
0.01007843017578125,
0.0305633544921875,
-0.076904296875,
-0.0265655517578125,
0.0302886962890625,
-0.03607177734375,
0.030242919921875,
0.00963592529296875,
-0.038116455078125,
-0.0538330078125,
-0.046539306640625,
0.019683837890625,
0.052398681640625,
-0.049041748046875,
0.0208740234375,
0.032440185546875,
0.0096282958984375,
-0.051513671875,
-0.04345703125,
-0.01032257080078125,
-0.0234832763671875,
-0.060211181640625,
0.033203125,
-0.02398681640625,
-0.016510009765625,
-0.0086822509765625,
-0.01421356201171875,
0.0115509033203125,
0.01430511474609375,
0.0279693603515625,
0.031219482421875,
-0.005207061767578125,
-0.01256561279296875,
-0.009613037109375,
-0.0016469955444335938,
-0.0046234130859375,
0.0135498046875,
0.04449462890625,
-0.029754638671875,
-0.02423095703125,
-0.056640625,
-0.010101318359375,
0.04669189453125,
-0.009429931640625,
0.050201416015625,
0.042236328125,
-0.0215606689453125,
0.002086639404296875,
-0.04327392578125,
-0.0037708282470703125,
-0.036163330078125,
0.01959228515625,
-0.023162841796875,
-0.06890869140625,
0.060211181640625,
0.0088043212890625,
0.0145416259765625,
0.04669189453125,
0.05926513671875,
-0.006778717041015625,
0.06439208984375,
0.041534423828125,
-0.011260986328125,
0.02947998046875,
-0.045166015625,
-0.01314544677734375,
-0.086181640625,
-0.0294342041015625,
-0.0269012451171875,
-0.0292816162109375,
-0.04473876953125,
-0.04217529296875,
0.03216552734375,
0.0168914794921875,
-0.04071044921875,
0.0384521484375,
-0.0509033203125,
0.01287078857421875,
0.0228271484375,
0.019256591796875,
0.018524169921875,
-0.005619049072265625,
-0.022125244140625,
0.005298614501953125,
-0.041473388671875,
-0.0242156982421875,
0.08563232421875,
0.042022705078125,
0.04852294921875,
0.0017242431640625,
0.051025390625,
0.0141754150390625,
0.03985595703125,
-0.03656005859375,
0.053009033203125,
0.015960693359375,
-0.03973388671875,
-0.012725830078125,
-0.0208587646484375,
-0.06646728515625,
0.0298004150390625,
-0.005035400390625,
-0.07232666015625,
0.0018329620361328125,
0.0030269622802734375,
-0.0300750732421875,
0.036346435546875,
-0.03594970703125,
0.050994873046875,
-0.0254364013671875,
-0.031524658203125,
-0.00571441650390625,
-0.052398681640625,
0.04388427734375,
-0.0050506591796875,
0.01232147216796875,
-0.025909423828125,
-0.005680084228515625,
0.06353759765625,
-0.05291748046875,
0.0650634765625,
-0.01641845703125,
-0.00945281982421875,
0.030853271484375,
-0.0036373138427734375,
0.05145263671875,
-0.0010194778442382812,
-0.0261688232421875,
0.0313720703125,
-0.01666259765625,
-0.0221405029296875,
-0.0275726318359375,
0.05609130859375,
-0.07867431640625,
-0.04388427734375,
-0.032562255859375,
-0.025543212890625,
0.00269317626953125,
0.0012693405151367188,
0.020416259765625,
0.0102081298828125,
0.0014600753784179688,
0.00811004638671875,
0.031982421875,
-0.0295562744140625,
0.040008544921875,
0.0289154052734375,
-0.0283966064453125,
-0.039093017578125,
0.0509033203125,
0.00032830238342285156,
0.01529693603515625,
0.0102081298828125,
0.01145172119140625,
-0.029052734375,
-0.0318603515625,
-0.05853271484375,
0.033966064453125,
-0.03985595703125,
-0.034881591796875,
-0.0421142578125,
-0.02337646484375,
-0.0168304443359375,
-0.0035419464111328125,
-0.043670654296875,
-0.0345458984375,
-0.02996826171875,
-0.019683837890625,
0.04388427734375,
0.059112548828125,
-0.00445556640625,
0.0261688232421875,
-0.041259765625,
0.01177215576171875,
0.00875091552734375,
0.02813720703125,
0.0030956268310546875,
-0.06988525390625,
-0.00775146484375,
-0.002948760986328125,
-0.039947509765625,
-0.06427001953125,
0.032196044921875,
0.00885772705078125,
0.034576416015625,
0.00543212890625,
-0.0164794921875,
0.0640869140625,
-0.019317626953125,
0.0609130859375,
0.0227203369140625,
-0.061614990234375,
0.043182373046875,
-0.019927978515625,
0.013885498046875,
0.0294342041015625,
0.031494140625,
-0.0226287841796875,
-0.020416259765625,
-0.05230712890625,
-0.064453125,
0.05755615234375,
0.03533935546875,
-0.0040740966796875,
0.02313232421875,
0.031707763671875,
-0.00417327880859375,
0.0233612060546875,
-0.0655517578125,
-0.03533935546875,
-0.0005707740783691406,
0.003448486328125,
-0.01161956787109375,
-0.01873779296875,
-0.004703521728515625,
-0.05169677734375,
0.0521240234375,
0.0024871826171875,
0.031707763671875,
0.024658203125,
-0.00341796875,
-0.0295562744140625,
0.0014638900756835938,
0.044830322265625,
0.041656494140625,
-0.0262451171875,
-0.0305938720703125,
0.026611328125,
-0.031219482421875,
0.01007080078125,
0.0286712646484375,
-0.01488494873046875,
-0.0108642578125,
0.02325439453125,
0.07318115234375,
0.037506103515625,
-0.037994384765625,
0.044464111328125,
-0.009033203125,
-0.0111846923828125,
-0.027557373046875,
0.0015745162963867188,
0.019378662109375,
0.0302581787109375,
0.01471710205078125,
0.0018596649169921875,
-0.0184173583984375,
-0.03485107421875,
0.01074981689453125,
0.034576416015625,
-0.018310546875,
-0.037384033203125,
0.07110595703125,
0.0152587890625,
-0.0247650146484375,
0.04278564453125,
-0.0071868896484375,
-0.042022705078125,
0.059326171875,
0.025482177734375,
0.0589599609375,
-0.0285186767578125,
0.00820159912109375,
0.03814697265625,
0.022735595703125,
-0.0015659332275390625,
0.02996826171875,
-0.006561279296875,
-0.045379638671875,
-0.0174713134765625,
-0.076904296875,
-0.0285491943359375,
0.006343841552734375,
-0.0364990234375,
0.0305938720703125,
-0.027801513671875,
-0.0173797607421875,
-0.00875091552734375,
0.0252685546875,
-0.062469482421875,
-0.000225067138671875,
0.0165557861328125,
0.0810546875,
-0.0411376953125,
0.06280517578125,
0.049224853515625,
-0.040618896484375,
-0.06475830078125,
-0.027862548828125,
0.0108642578125,
-0.095947265625,
0.033905029296875,
0.0232696533203125,
-0.004726409912109375,
-0.0006194114685058594,
-0.04766845703125,
-0.073974609375,
0.11566162109375,
0.0242919921875,
-0.046875,
0.00592041015625,
0.0036334991455078125,
0.046844482421875,
-0.02899169921875,
0.049652099609375,
0.0384521484375,
0.0421142578125,
0.00848388671875,
-0.090087890625,
0.0242919921875,
-0.0286102294921875,
-0.00983428955078125,
0.00591278076171875,
-0.08941650390625,
0.07354736328125,
-0.03741455078125,
-0.01053619384765625,
0.0271759033203125,
0.0458984375,
0.05499267578125,
0.035614013671875,
0.0362548828125,
0.072265625,
0.058380126953125,
-0.00470733642578125,
0.09759521484375,
-0.01549530029296875,
0.04278564453125,
0.05731201171875,
-0.0241546630859375,
0.054412841796875,
0.01395416259765625,
-0.03472900390625,
0.049224853515625,
0.0682373046875,
0.0018053054809570312,
0.014984130859375,
0.0268402099609375,
-0.0030460357666015625,
-0.00804901123046875,
-0.00667572021484375,
-0.0428466796875,
0.033905029296875,
0.0118560791015625,
-0.0212249755859375,
-0.0093994140625,
-0.01226806640625,
0.0222015380859375,
-0.0146942138671875,
-0.0200042724609375,
0.0369873046875,
0.0163421630859375,
-0.03778076171875,
0.0657958984375,
-0.003566741943359375,
0.06781005859375,
-0.044036865234375,
0.006244659423828125,
-0.03680419921875,
0.0218353271484375,
-0.0207672119140625,
-0.054931640625,
0.0005183219909667969,
0.0085296630859375,
0.0090179443359375,
-0.01922607421875,
0.0482177734375,
-0.00890350341796875,
-0.039337158203125,
0.045989990234375,
0.03692626953125,
0.0278778076171875,
-0.000011563301086425781,
-0.0833740234375,
0.03143310546875,
0.0033016204833984375,
-0.056488037109375,
0.04180908203125,
0.0082244873046875,
-0.0018129348754882812,
0.06341552734375,
0.04736328125,
-0.002460479736328125,
0.00231170654296875,
0.0022945404052734375,
0.09051513671875,
-0.0517578125,
-0.01136016845703125,
-0.06597900390625,
0.04559326171875,
-0.00688934326171875,
-0.04107666015625,
0.064697265625,
0.040191650390625,
0.059417724609375,
0.01352691650390625,
0.0241546630859375,
-0.0138397216796875,
0.0286712646484375,
-0.0211639404296875,
0.059326171875,
-0.07012939453125,
0.023651123046875,
-0.020538330078125,
-0.06268310546875,
-0.00447845458984375,
0.029876708984375,
-0.004154205322265625,
0.00801849365234375,
0.034698486328125,
0.0609130859375,
0.00548553466796875,
-0.00449371337890625,
0.00505828857421875,
0.029937744140625,
0.013885498046875,
0.057830810546875,
0.06976318359375,
-0.049072265625,
0.032073974609375,
-0.04046630859375,
-0.0259857177734375,
-0.01873779296875,
-0.0528564453125,
-0.059967041015625,
-0.03094482421875,
-0.0170440673828125,
-0.028350830078125,
-0.007038116455078125,
0.07275390625,
0.0447998046875,
-0.044830322265625,
-0.034027099609375,
0.00930023193359375,
0.01678466796875,
-0.00687408447265625,
-0.0185699462890625,
0.0377197265625,
0.00498199462890625,
-0.05743408203125,
0.0325927734375,
0.01512908935546875,
0.006237030029296875,
-0.01885986328125,
-0.022125244140625,
-0.022491455078125,
-0.0021228790283203125,
0.048919677734375,
0.0250701904296875,
-0.0521240234375,
-0.023681640625,
0.00612640380859375,
-0.007572174072265625,
0.0206451416015625,
0.0234375,
-0.046112060546875,
0.01134490966796875,
0.0247955322265625,
0.02996826171875,
0.05322265625,
0.006252288818359375,
0.0083160400390625,
-0.043212890625,
0.016082763671875,
0.00492095947265625,
0.039947509765625,
0.029388427734375,
-0.0293121337890625,
0.060272216796875,
0.0286865234375,
-0.054443359375,
-0.083984375,
-0.004032135009765625,
-0.08447265625,
-0.002765655517578125,
0.08349609375,
-0.01352691650390625,
-0.03692626953125,
0.0316162109375,
-0.01806640625,
0.0154266357421875,
-0.03509521484375,
0.052520751953125,
0.036712646484375,
-0.033203125,
-0.005687713623046875,
-0.03851318359375,
0.0288543701171875,
0.03656005859375,
-0.0640869140625,
-0.01338958740234375,
0.0216064453125,
0.0251312255859375,
0.00925445556640625,
0.0701904296875,
-0.00811767578125,
0.0148468017578125,
-0.02142333984375,
0.01531219482421875,
-0.0118560791015625,
-0.01078033447265625,
-0.0297698974609375,
-0.001453399658203125,
-0.007427215576171875,
-0.0183563232421875
]
] |
lllyasviel/control_v11p_sd15_lineart | 2023-05-04T18:49:42.000Z | [
"diffusers",
"art",
"controlnet",
"stable-diffusion",
"controlnet-v1-1",
"image-to-image",
"arxiv:2302.05543",
"license:openrail",
"has_space",
"diffusers:ControlNetModel",
"region:us"
] | image-to-image | lllyasviel | null | null | lllyasviel/control_v11p_sd15_lineart | 19 | 11,335 | diffusers | 2023-04-14T19:25:13 | ---
license: openrail
base_model: runwayml/stable-diffusion-v1-5
tags:
- art
- controlnet
- stable-diffusion
- controlnet-v1-1
- image-to-image
duplicated_from: ControlNet-1-1-preview/control_v11p_sd15_lineart
---
# Controlnet - v1.1 - *lineart Version*
**Controlnet v1.1** was released in [lllyasviel/ControlNet-v1-1](https://huggingface.co/lllyasviel/ControlNet-v1-1) by [Lvmin Zhang](https://huggingface.co/lllyasviel).
This checkpoint is a conversion of [the original checkpoint](https://huggingface.co/lllyasviel/ControlNet-v1-1/blob/main/control_v11p_sd15_lineart.pth) into `diffusers` format.
It can be used in combination with **Stable Diffusion**, such as [runwayml/stable-diffusion-v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5).
For more details, please also have a look at the [🧨 Diffusers docs](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/controlnet).
ControlNet is a neural network structure to control diffusion models by adding extra conditions.

This checkpoint corresponds to the ControlNet conditioned on **lineart images**.
## Model Details
- **Developed by:** Lvmin Zhang, Maneesh Agrawala
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [The CreativeML OpenRAIL M license](https://huggingface.co/spaces/CompVis/stable-diffusion-license) is an [Open RAIL M license](https://www.licenses.ai/blog/2022/8/18/naming-convention-of-responsible-ai-licenses), adapted from the work that [BigScience](https://bigscience.huggingface.co/) and [the RAIL Initiative](https://www.licenses.ai/) are jointly carrying out in the area of responsible AI licensing. See also [the article about the BLOOM Open RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) on which our license is based.
- **Resources for more information:** [GitHub Repository](https://github.com/lllyasviel/ControlNet), [Paper](https://arxiv.org/abs/2302.05543).
- **Cite as:**
    @misc{zhang2023adding,
          title={Adding Conditional Control to Text-to-Image Diffusion Models},
          author={Lvmin Zhang and Maneesh Agrawala},
          year={2023},
          eprint={2302.05543},
          archivePrefix={arXiv},
          primaryClass={cs.CV}
    }
## Introduction
Controlnet was proposed in [*Adding Conditional Control to Text-to-Image Diffusion Models*](https://arxiv.org/abs/2302.05543) by
Lvmin Zhang, Maneesh Agrawala.
The abstract reads as follows:
*We present a neural network structure, ControlNet, to control pretrained large diffusion models to support additional input conditions.
The ControlNet learns task-specific conditions in an end-to-end way, and the learning is robust even when the training dataset is small (< 50k).
Moreover, training a ControlNet is as fast as fine-tuning a diffusion model, and the model can be trained on a personal device.
Alternatively, if powerful computation clusters are available, the model can scale to large amounts (millions to billions) of data.
We report that large diffusion models like Stable Diffusion can be augmented with ControlNets to enable conditional inputs like edge maps, segmentation maps, keypoints, etc.
This may enrich the methods to control large diffusion models and further facilitate related applications.*
## Example
It is recommended to use the checkpoint with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5) as the checkpoint
has been trained on it.
Experimentally, the checkpoint can be used with other diffusion models, such as a DreamBoothed Stable Diffusion model.
**Note**: If you want to process an image to create the auxiliary conditioning, external dependencies are required as shown below:
1. Install https://github.com/patrickvonplaten/controlnet_aux
```sh
$ pip install controlnet_aux==0.3.0
```
2. Let's install `diffusers` and related packages:
```sh
$ pip install diffusers transformers accelerate
```
3. Run code:
```python
import torch
import os
from diffusers.utils import load_image
from controlnet_aux import LineartDetector
from diffusers import (
    ControlNetModel,
    StableDiffusionControlNetPipeline,
    UniPCMultistepScheduler,
)
checkpoint = "ControlNet-1-1-preview/control_v11p_sd15_lineart"
# Load the input image and resize it to the pipeline's working resolution
image = load_image(
    "https://huggingface.co/ControlNet-1-1-preview/control_v11p_sd15_lineart/resolve/main/images/input.png"
)
image = image.resize((512, 512))
prompt = "michael jackson concert"
# Extract the line art conditioning image from the input
processor = LineartDetector.from_pretrained("lllyasviel/Annotators")
control_image = processor(image)
os.makedirs("images", exist_ok=True)  # ensure the output directory exists
control_image.save("./images/control.png")
controlnet = ControlNetModel.from_pretrained(checkpoint, torch_dtype=torch.float16)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)
pipe.enable_model_cpu_offload()  # offload submodules to CPU to reduce GPU memory use
generator = torch.manual_seed(0)
image = pipe(prompt, num_inference_steps=30, generator=generator, image=control_image).images[0]
image.save('images/image_out.png')
```



## Other released checkpoints v1-1
The authors released 14 different checkpoints, each trained with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5)
on a different type of conditioning:
| Model Name | Control Image Overview| Condition Image | Control Image Example | Generated Image Example |
|---|---|---|---|---|
|[lllyasviel/control_v11p_sd15_canny](https://huggingface.co/lllyasviel/control_v11p_sd15_canny)<br/> | *Trained with canny edge detection* | A monochrome image with white edges on a black background.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11e_sd15_ip2p](https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p)<br/> | *Trained with pixel-to-pixel instruction* | No condition.|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_inpaint](https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint)<br/> | Trained with image inpainting | No condition.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/output.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/output.png"/></a>|
|[lllyasviel/control_v11p_sd15_mlsd](https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd)<br/> | Trained with multi-level line segment detection | An image with annotated line segments.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11f1p_sd15_depth](https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth)<br/> | Trained with depth estimation | An image with depth information, usually represented as a grayscale image.|<a href="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_normalbae](https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae)<br/> | Trained with surface normal estimation | An image with surface normal information, usually represented as a color-coded image.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_seg](https://huggingface.co/lllyasviel/control_v11p_sd15_seg)<br/> | Trained with image segmentation | An image with segmented regions, usually represented as a color-coded image.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_lineart](https://huggingface.co/lllyasviel/control_v11p_sd15_lineart)<br/> | Trained with line art generation | An image with line art, usually black lines on a white background.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15s2_lineart_anime](https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime)<br/> | Trained with anime line art generation | An image with anime-style line art.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_openpose](https://huggingface.co/lllyasviel/control_v11p_sd15_openpose)<br/> | Trained with human pose estimation | An image with human poses, usually represented as a set of keypoints or skeletons.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_scribble](https://huggingface.co/lllyasviel/control_v11p_sd15_scribble)<br/> | Trained with scribble-based image generation | An image with scribbles, usually random or user-drawn strokes.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_softedge](https://huggingface.co/lllyasviel/control_v11p_sd15_softedge)<br/> | Trained with soft edge image generation | An image with soft edges, usually to create a more painterly or artistic effect.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11e_sd15_shuffle](https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle)<br/> | Trained with image shuffling | An image with shuffled patches or regions.|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11f1e_sd15_tile](https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile)<br/> | Trained with image tiling | A blurry image or part of an image.|<a href="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/original.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/original.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/output.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/output.png"/></a>|
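Any checkpoint in the table above can be dropped into the pipeline from the example section simply by changing the repository name; a minimal sketch (the canny checkpoint is used purely for illustration):
```python
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

# Swap in any checkpoint from the table above by its repository name
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/control_v11p_sd15_canny", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
)
```
Note that each checkpoint expects the matching conditioning image (e.g., a Canny edge map for the canny checkpoint).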
## More information
For more information, please also have a look at the [Diffusers ControlNet Blog Post](https://huggingface.co/blog/controlnet) and have a look at the [official docs](https://github.com/lllyasviel/ControlNet-v1-1-nightly). | 15,396 | [
[
-0.04595947265625,
-0.04620361328125,
0.009613037109375,
0.045196533203125,
-0.019134521484375,
-0.0184783935546875,
0.00482177734375,
-0.0421142578125,
0.0426025390625,
0.01898193359375,
-0.05743408203125,
-0.0265045166015625,
-0.054931640625,
-0.0117645263671875,
-0.01284027099609375,
0.062744140625,
-0.023651123046875,
0.000576019287109375,
0.0081329345703125,
-0.005550384521484375,
-0.00632476806640625,
-0.01282501220703125,
-0.0947265625,
-0.036102294921875,
0.03692626953125,
0.0016469955444335938,
0.03973388671875,
0.041656494140625,
0.038482666015625,
0.0281829833984375,
-0.02801513671875,
0.004123687744140625,
-0.0215606689453125,
-0.015045166015625,
0.01442718505859375,
-0.00763702392578125,
-0.0538330078125,
0.006847381591796875,
0.05517578125,
0.0223541259765625,
0.0016088485717773438,
-0.014801025390625,
0.01180267333984375,
0.0509033203125,
-0.03631591796875,
-0.0100250244140625,
-0.0103912353515625,
0.0214996337890625,
-0.01050567626953125,
0.00769805908203125,
-0.0146331787109375,
-0.0248565673828125,
0.007015228271484375,
-0.057830810546875,
-0.00881195068359375,
-0.01216888427734375,
0.10369873046875,
0.022735595703125,
-0.0323486328125,
-0.006511688232421875,
-0.020294189453125,
0.049560546875,
-0.059844970703125,
0.00835418701171875,
0.0084228515625,
0.01641845703125,
-0.0159912109375,
-0.0740966796875,
-0.0380859375,
-0.0109710693359375,
-0.008758544921875,
0.0345458984375,
-0.0286407470703125,
0.005588531494140625,
0.021820068359375,
0.016815185546875,
-0.0296173095703125,
0.01904296875,
-0.023406982421875,
-0.0299530029296875,
0.046905517578125,
-0.0007181167602539062,
0.044342041015625,
0.004230499267578125,
-0.045318603515625,
-0.0023136138916015625,
-0.0287933349609375,
0.027374267578125,
0.0159912109375,
-0.01236724853515625,
-0.05908203125,
0.0309600830078125,
-0.002017974853515625,
0.055572509765625,
0.0341796875,
-0.01580810546875,
0.037139892578125,
-0.0152740478515625,
-0.028778076171875,
-0.023590087890625,
0.07794189453125,
0.036529541015625,
0.01035308837890625,
-0.000015616416931152344,
-0.0145416259765625,
-0.0096435546875,
-0.004886627197265625,
-0.09259033203125,
-0.0118865966796875,
0.0153045654296875,
-0.043304443359375,
-0.0276641845703125,
-0.011749267578125,
-0.055084228515625,
-0.015045166015625,
-0.00675201416015625,
0.0299835205078125,
-0.0457763671875,
-0.035614013671875,
0.01220703125,
-0.033935546875,
0.041046142578125,
0.045989990234375,
-0.033172607421875,
0.01385498046875,
0.00939178466796875,
0.07623291015625,
-0.01873779296875,
-0.00939178466796875,
-0.021759033203125,
-0.007328033447265625,
-0.0225067138671875,
0.039031982421875,
-0.009490966796875,
-0.00827789306640625,
-0.004520416259765625,
0.0257415771484375,
-0.01026153564453125,
-0.0240478515625,
0.033050537109375,
-0.0247802734375,
0.01453399658203125,
-0.0053253173828125,
-0.0301666259765625,
-0.0096282958984375,
0.0184326171875,
-0.035186767578125,
0.058441162109375,
0.01800537109375,
-0.08135986328125,
0.026123046875,
-0.038909912109375,
-0.017364501953125,
-0.019866943359375,
0.01041412353515625,
-0.057891845703125,
-0.03265380859375,
0.0034732818603515625,
0.043670654296875,
0.0002493858337402344,
-0.01082611083984375,
-0.0360107421875,
-0.0029315948486328125,
0.01323699951171875,
-0.00931549072265625,
0.09130859375,
0.01201629638671875,
-0.046661376953125,
0.01543426513671875,
-0.053863525390625,
0.0034637451171875,
0.0110931396484375,
-0.0200042724609375,
0.00701904296875,
-0.0222625732421875,
0.01201629638671875,
0.049774169921875,
0.028778076171875,
-0.052154541015625,
0.01068115234375,
-0.017364501953125,
0.0335693359375,
0.0504150390625,
0.0165252685546875,
0.044769287109375,
-0.040008544921875,
0.04376220703125,
0.02130126953125,
0.0252685546875,
0.005207061767578125,
-0.038421630859375,
-0.0791015625,
-0.0450439453125,
-0.0004029273986816406,
0.04425048828125,
-0.06207275390625,
0.062286376953125,
0.010040283203125,
-0.04986572265625,
-0.0190887451171875,
0.005702972412109375,
0.039154052734375,
0.037628173828125,
0.0223236083984375,
-0.035614013671875,
-0.028717041015625,
-0.0687255859375,
0.01076507568359375,
0.01837158203125,
0.0006284713745117188,
0.01396942138671875,
0.049560546875,
-0.0085906982421875,
0.05059814453125,
-0.0179595947265625,
-0.0291748046875,
-0.00722503662109375,
-0.007465362548828125,
0.021728515625,
0.07861328125,
0.061614990234375,
-0.0572509765625,
-0.0491943359375,
-0.0023365020751953125,
-0.06787109375,
-0.0047454833984375,
-0.0174560546875,
-0.03875732421875,
0.01611328125,
0.0462646484375,
-0.04876708984375,
0.060516357421875,
0.040496826171875,
-0.04437255859375,
0.04364013671875,
-0.026580810546875,
0.01056671142578125,
-0.073486328125,
0.0174713134765625,
0.028106689453125,
-0.0265960693359375,
-0.046142578125,
0.0048370361328125,
0.010589599609375,
0.004917144775390625,
-0.052947998046875,
0.054595947265625,
-0.037933349609375,
0.01415252685546875,
-0.02264404296875,
-0.005252838134765625,
0.005138397216796875,
0.048431396484375,
0.0130767822265625,
0.0374755859375,
0.07305908203125,
-0.045196533203125,
0.0252685546875,
0.0316162109375,
-0.0157623291015625,
0.06488037109375,
-0.063232421875,
0.0103759765625,
-0.0131378173828125,
0.045196533203125,
-0.06982421875,
-0.0200042724609375,
0.050323486328125,
-0.037445068359375,
0.043121337890625,
-0.020263671875,
-0.01995849609375,
-0.032928466796875,
-0.0244903564453125,
0.01546478271484375,
0.057708740234375,
-0.038848876953125,
0.0262908935546875,
0.01296234130859375,
0.014129638671875,
-0.03790283203125,
-0.068603515625,
-0.006927490234375,
-0.0302581787109375,
-0.06353759765625,
0.035675048828125,
-0.01263427734375,
0.0034198760986328125,
-0.0009016990661621094,
0.0005793571472167969,
-0.022247314453125,
-0.0010957717895507812,
0.0265350341796875,
0.0175018310546875,
-0.00714874267578125,
-0.01287841796875,
0.00777435302734375,
-0.01265716552734375,
-0.004909515380859375,
-0.0302276611328125,
0.035125732421875,
0.0025787353515625,
-0.016845703125,
-0.0765380859375,
0.017333984375,
0.042724609375,
-0.002513885498046875,
0.06878662109375,
0.07183837890625,
-0.031982421875,
-0.0013427734375,
-0.0293731689453125,
-0.0118408203125,
-0.03875732421875,
-0.0042572021484375,
-0.01617431640625,
-0.053863525390625,
0.052276611328125,
0.004840850830078125,
-0.0039520263671875,
0.047332763671875,
0.027862548828125,
-0.0169219970703125,
0.0662841796875,
0.0400390625,
-0.0107269287109375,
0.0634765625,
-0.05645751953125,
-0.00909423828125,
-0.0748291015625,
-0.0201873779296875,
-0.0250396728515625,
-0.05218505859375,
-0.025360107421875,
-0.0279998779296875,
0.0362548828125,
0.03424072265625,
-0.055999755859375,
0.035858154296875,
-0.045440673828125,
0.00881195068359375,
0.0243072509765625,
0.042236328125,
-0.012603759765625,
-0.0109405517578125,
-0.0125274658203125,
0.0043182373046875,
-0.045928955078125,
-0.0188751220703125,
0.0452880859375,
0.04376220703125,
0.040863037109375,
-0.0037784576416015625,
0.044525146484375,
0.0020999908447265625,
0.02276611328125,
-0.0428466796875,
0.0386962890625,
0.0008563995361328125,
-0.042449951171875,
-0.0159149169921875,
-0.0244293212890625,
-0.07763671875,
0.0072784423828125,
-0.037750244140625,
-0.05657958984375,
0.027740478515625,
0.019073486328125,
-0.0053253173828125,
0.035614013671875,
-0.05072021484375,
0.057891845703125,
-0.000004589557647705078,
-0.04620361328125,
0.00531768798828125,
-0.06396484375,
0.0168914794921875,
0.0177764892578125,
-0.01345062255859375,
-0.0008387565612792969,
-0.0085906982421875,
0.0653076171875,
-0.059356689453125,
0.0655517578125,
-0.042022705078125,
-0.0023555755615234375,
0.025299072265625,
-0.0015077590942382812,
0.04486083984375,
-0.009979248046875,
-0.017120361328125,
0.004154205322265625,
-0.005161285400390625,
-0.04656982421875,
-0.0271759033203125,
0.049041748046875,
-0.05242919921875,
-0.0152130126953125,
-0.0207366943359375,
-0.0213470458984375,
0.0156402587890625,
0.017974853515625,
0.051849365234375,
0.029754638671875,
0.01352691650390625,
0.0052642822265625,
0.050323486328125,
-0.02593994140625,
0.051177978515625,
0.003047943115234375,
-0.00911712646484375,
-0.042205810546875,
0.05426025390625,
-0.0009298324584960938,
0.031494140625,
0.015960693359375,
0.01134490966796875,
-0.0168304443359375,
-0.0382080078125,
-0.03265380859375,
0.033782958984375,
-0.047454833984375,
-0.031646728515625,
-0.04559326171875,
-0.035369873046875,
-0.0281829833984375,
-0.0379638671875,
-0.020660400390625,
-0.022491455078125,
-0.052093505859375,
0.0133819580078125,
0.04852294921875,
0.038818359375,
-0.0182037353515625,
0.045318603515625,
-0.019744873046875,
0.0180816650390625,
0.01947021484375,
0.02581787109375,
-0.005428314208984375,
-0.04620361328125,
-0.0001970529556274414,
0.0081787109375,
-0.03948974609375,
-0.057220458984375,
0.037872314453125,
0.004070281982421875,
0.03619384765625,
0.042724609375,
-0.0183563232421875,
0.050933837890625,
-0.022186279296875,
0.04345703125,
0.047943115234375,
-0.06317138671875,
0.04022216796875,
-0.032928466796875,
0.0203399658203125,
0.0223846435546875,
0.03948974609375,
-0.03411865234375,
-0.0269012451171875,
-0.053497314453125,
-0.049774169921875,
0.046051025390625,
0.0202484130859375,
-0.006443023681640625,
0.02557373046875,
0.051361083984375,
-0.028594970703125,
0.00934600830078125,
-0.065185546875,
-0.035736083984375,
-0.0206451416015625,
0.0026416778564453125,
0.0023670196533203125,
0.006336212158203125,
-0.005680084228515625,
-0.0379638671875,
0.06695556640625,
-0.001190185546875,
0.0438232421875,
0.041107177734375,
0.00846099853515625,
-0.013763427734375,
-0.023406982421875,
0.0438232421875,
0.037750244140625,
-0.00768280029296875,
-0.0235748291015625,
0.005084991455078125,
-0.0310821533203125,
0.0179290771484375,
-0.002201080322265625,
-0.0273284912109375,
-0.00576019287109375,
0.024993896484375,
0.06365966796875,
-0.01194000244140625,
-0.0123291015625,
0.05877685546875,
0.001125335693359375,
-0.041900634765625,
-0.022674560546875,
0.0010213851928710938,
0.00955963134765625,
0.0355224609375,
0.0104827880859375,
0.029266357421875,
0.0024776458740234375,
-0.00885772705078125,
0.02325439453125,
0.045135498046875,
-0.04522705078125,
-0.01166534423828125,
0.05804443359375,
0.004154205322265625,
-0.007747650146484375,
0.02777099609375,
-0.03338623046875,
-0.05352783203125,
0.06982421875,
0.040435791015625,
0.05682373046875,
-0.00994110107421875,
0.0213623046875,
0.0537109375,
0.01519012451171875,
0.007354736328125,
0.013916015625,
0.00841522216796875,
-0.052734375,
-0.0302734375,
-0.034393310546875,
-0.001964569091796875,
0.01299285888671875,
-0.032379150390625,
0.034515380859375,
-0.060516357421875,
-0.017730712890625,
-0.00592041015625,
0.007537841796875,
-0.0528564453125,
0.0310821533203125,
0.0033817291259765625,
0.09344482421875,
-0.06427001953125,
0.0657958984375,
0.042999267578125,
-0.036376953125,
-0.06787109375,
0.0008563995361328125,
0.0020618438720703125,
-0.0601806640625,
0.04949951171875,
0.01093292236328125,
-0.0034351348876953125,
0.006923675537109375,
-0.059967041015625,
-0.045745849609375,
0.09893798828125,
0.01971435546875,
-0.015869140625,
0.006500244140625,
-0.037628173828125,
0.03399658203125,
-0.033111572265625,
0.03656005859375,
0.0340576171875,
0.04052734375,
0.033905029296875,
-0.060394287109375,
0.018035888671875,
-0.03399658203125,
0.0091094970703125,
0.0137939453125,
-0.07513427734375,
0.069091796875,
0.0032291412353515625,
-0.0115509033203125,
0.021484375,
0.05450439453125,
0.0186920166015625,
0.0126953125,
0.0487060546875,
0.060150146484375,
0.0263519287109375,
-0.00910186767578125,
0.074462890625,
-0.0059967041015625,
0.023406982421875,
0.045684814453125,
0.0194244384765625,
0.03985595703125,
0.0293731689453125,
-0.005825042724609375,
0.037750244140625,
0.06512451171875,
0.0034427642822265625,
0.031829833984375,
0.038818359375,
-0.0244903564453125,
-0.009552001953125,
-0.0042877197265625,
-0.0276641845703125,
0.0054931640625,
0.02178955078125,
-0.0219879150390625,
-0.0178375244140625,
0.020599365234375,
0.0228118896484375,
-0.01416015625,
-0.033203125,
0.04931640625,
-0.0069732666015625,
-0.039306640625,
0.058380126953125,
-0.007354736328125,
0.08544921875,
-0.05242919921875,
0.00574493408203125,
-0.0203399658203125,
0.007579803466796875,
-0.030426025390625,
-0.06317138671875,
0.01465606689453125,
-0.01458740234375,
0.0205535888671875,
-0.0316162109375,
0.054443359375,
-0.0277862548828125,
-0.03192138671875,
0.040863037109375,
0.0084991455078125,
0.02972412109375,
0.0142059326171875,
-0.0826416015625,
0.0175018310546875,
0.00824737548828125,
-0.0369873046875,
0.0186309814453125,
0.029022216796875,
0.0162200927734375,
0.055694580078125,
0.025787353515625,
0.02789306640625,
0.0222930908203125,
-0.0178070068359375,
0.0797119140625,
-0.021026611328125,
-0.024200439453125,
-0.043212890625,
0.060760498046875,
-0.02630615234375,
-0.033599853515625,
0.043426513671875,
0.0235748291015625,
0.056427001953125,
-0.006839752197265625,
0.052734375,
-0.03240966796875,
0.014892578125,
-0.051361083984375,
0.0640869140625,
-0.06597900390625,
-0.0198822021484375,
-0.0262298583984375,
-0.050689697265625,
-0.021881103515625,
0.06494140625,
-0.0092010498046875,
0.016357421875,
0.044403076171875,
0.073974609375,
-0.015777587890625,
-0.044586181640625,
0.0068206787109375,
0.007312774658203125,
0.026519775390625,
0.0555419921875,
0.05133056640625,
-0.051666259765625,
0.022247314453125,
-0.041748046875,
-0.036529541015625,
-0.006317138671875,
-0.0732421875,
-0.06787109375,
-0.055419921875,
-0.053192138671875,
-0.054962158203125,
-0.0207061767578125,
0.0556640625,
0.0870361328125,
-0.0491943359375,
-0.0132598876953125,
-0.0261688232421875,
0.0064239501953125,
-0.01172637939453125,
-0.0157318115234375,
0.0271759033203125,
-0.00931549072265625,
-0.064453125,
0.0024776458740234375,
0.0167999267578125,
0.04425048828125,
-0.006229400634765625,
-0.031768798828125,
-0.028411865234375,
-0.0195465087890625,
0.019378662109375,
0.035797119140625,
-0.03546142578125,
-0.017303466796875,
-0.021820068359375,
-0.0168609619140625,
0.00922393798828125,
0.041107177734375,
-0.031219482421875,
0.01412200927734375,
0.04095458984375,
0.0308074951171875,
0.0638427734375,
-0.01100921630859375,
0.01050567626953125,
-0.044647216796875,
0.0423583984375,
0.0022754669189453125,
0.035675048828125,
0.004726409912109375,
-0.02520751953125,
0.031646728515625,
0.0263214111328125,
-0.06085205078125,
-0.0341796875,
0.0120391845703125,
-0.1060791015625,
-0.00848388671875,
0.07537841796875,
-0.027587890625,
-0.0292510986328125,
0.0128936767578125,
-0.03466796875,
0.027099609375,
-0.0260467529296875,
0.0158538818359375,
0.0279083251953125,
-0.01532745361328125,
-0.0281829833984375,
-0.032989501953125,
0.049346923828125,
0.0167236328125,
-0.061614990234375,
-0.0416259765625,
0.04071044921875,
0.0279083251953125,
0.0237274169921875,
0.06768798828125,
-0.0086517333984375,
0.00963592529296875,
-0.00719451904296875,
0.0195465087890625,
0.0068817138671875,
-0.00659942626953125,
-0.04345703125,
-0.0123138427734375,
-0.0168304443359375,
-0.031280517578125
]
] |
Yntec/CetusRemix | 2023-08-29T09:22:58.000Z | [
"diffusers",
"Anime",
"2D",
"2.5D",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"Eagelaxis",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Yntec | null | null | Yntec/CetusRemix | 3 | 11,329 | diffusers | 2023-08-29T07:36:41 | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
language:
- en
tags:
- Anime
- 2D
- 2.5D
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- Eagelaxis
inference: true
---
# Cetus Remix
A mix of CetusMix v3.5 and CetusMix v4 to create my favorite Cetus model! Check out this comparison:

(Click for 1920px version)
Sample and prompt:

Pretty cute girl. Like lesser birds on the four winds. Like silver scrapes in May. Now the sands become a crust. And most of you have gone away.
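As a quick usage sketch (not part of the original upload): the tags above list `diffusers:StableDiffusionPipeline`, so the model can presumably be loaded with the standard diffusers text-to-image pipeline as shown below. The repo id and pipeline class come from this card; the dtype, step count, and guidance scale are illustrative assumptions, not recommended settings.

```python
# Minimal sketch for trying the sample prompt above with diffusers.
# Assumptions: fp16 on a CUDA GPU, 25 steps, guidance scale 7.0.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "Yntec/CetusRemix",
    torch_dtype=torch.float16,  # assumption: half precision on GPU
)
pipe = pipe.to("cuda")

prompt = (
    "Pretty cute girl. Like lesser birds on the four winds. "
    "Like silver scrapes in May. Now the sands become a crust. "
    "And most of you have gone away."
)
image = pipe(prompt, num_inference_steps=25, guidance_scale=7.0).images[0]
image.save("cetus_remix_sample.png")
```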
Original pages:
https://civitai.com/models/6755?modelVersionId=29851 (CetusMix v3.5)
https://civitai.com/models/6755?modelVersionId=78676 (CetusMix v4)
| 920 | [
[
-0.057861328125,
-0.0273284912109375,
0.04669189453125,
0.0308837890625,
-0.034912109375,
0.003849029541015625,
-0.0036182403564453125,
-0.037994384765625,
0.056915283203125,
0.03692626953125,
-0.0204925537109375,
-0.04229736328125,
-0.027435302734375,
-0.00974273681640625,
-0.048492431640625,
0.068603515625,
0.022796630859375,
0.00807952880859375,
0.0083465576171875,
0.0263214111328125,
-0.0214080810546875,
-0.020355224609375,
-0.024139404296875,
-0.038787841796875,
0.033782958984375,
0.0423583984375,
0.053619384765625,
0.025115966796875,
0.00896453857421875,
0.0276031494140625,
-0.05975341796875,
-0.0242767333984375,
-0.027374267578125,
-0.00884246826171875,
-0.01678466796875,
-0.002300262451171875,
-0.048736572265625,
0.0163726806640625,
0.0198974609375,
0.025604248046875,
-0.0212860107421875,
0.0189056396484375,
0.0072021484375,
0.0377197265625,
-0.060211181640625,
-0.024383544921875,
-0.045745849609375,
0.00743865966796875,
-0.02801513671875,
0.0138397216796875,
0.0092620849609375,
-0.032745361328125,
-0.02203369140625,
-0.076416015625,
0.0192413330078125,
-0.027679443359375,
0.06903076171875,
-0.00004124641418457031,
-0.04058837890625,
-0.02410888671875,
-0.036468505859375,
0.0364990234375,
-0.045928955078125,
0.04925537109375,
-0.0072479248046875,
0.0484619140625,
-0.0223388671875,
-0.07086181640625,
-0.042816162109375,
0.01305389404296875,
0.04412841796875,
0.04168701171875,
-0.00007730722427368164,
-0.037445068359375,
0.005764007568359375,
0.0498046875,
-0.0184478759765625,
-0.028839111328125,
-0.046966552734375,
-0.0044708251953125,
0.03607177734375,
-0.0157318115234375,
0.021697998046875,
-0.010772705078125,
-0.03857421875,
-0.02099609375,
-0.052490234375,
0.00732421875,
0.050262451171875,
0.003047943115234375,
-0.04644775390625,
0.036773681640625,
-0.0079803466796875,
0.038116455078125,
0.04217529296875,
0.006130218505859375,
0.048797607421875,
-0.0068817138671875,
-0.020233154296875,
-0.0161590576171875,
0.04180908203125,
0.04754638671875,
-0.000675201416015625,
-0.0008292198181152344,
-0.01413726806640625,
0.0180206298828125,
0.0142974853515625,
-0.05279541015625,
-0.031585693359375,
-0.00460052490234375,
-0.03875732421875,
-0.041748046875,
0.009735107421875,
-0.05621337890625,
-0.01232147216796875,
-0.0192718505859375,
-0.0078887939453125,
-0.022674560546875,
-0.048248291015625,
0.02337646484375,
-0.00007730722427368164,
0.018157958984375,
0.0190582275390625,
-0.0682373046875,
0.03564453125,
0.05426025390625,
0.041900634765625,
0.0191650390625,
0.0038089752197265625,
-0.0188751220703125,
-0.004245758056640625,
-0.037506103515625,
0.05059814453125,
-0.035064697265625,
-0.05450439453125,
-0.0233306884765625,
0.00820159912109375,
0.0081939697265625,
-0.051513671875,
0.068359375,
-0.02252197265625,
-0.0162506103515625,
-0.034698486328125,
-0.0174560546875,
-0.030059814453125,
0.009765625,
-0.055084228515625,
0.06585693359375,
0.00818634033203125,
-0.05670166015625,
0.0195465087890625,
-0.052978515625,
0.01137542724609375,
-0.0004742145538330078,
-0.012908935546875,
-0.0166168212890625,
0.028076171875,
-0.0027751922607421875,
-0.00403594970703125,
-0.019683837890625,
-0.0257415771484375,
-0.065673828125,
-0.035064697265625,
0.037353515625,
0.0172576904296875,
0.04339599609375,
0.07244873046875,
-0.037139892578125,
-0.0103912353515625,
-0.054840087890625,
0.007232666015625,
0.01062774658203125,
-0.020751953125,
-0.005664825439453125,
-0.0091705322265625,
0.0266265869140625,
0.040069580078125,
0.01459503173828125,
-0.047760009765625,
-0.01363372802734375,
-0.030120849609375,
0.038360595703125,
0.037750244140625,
0.018585205078125,
0.042144775390625,
-0.06719970703125,
0.04364013671875,
0.016571044921875,
0.01145172119140625,
0.024444580078125,
-0.020050048828125,
-0.062347412109375,
-0.019866943359375,
0.02789306640625,
0.0061798095703125,
-0.0245361328125,
-0.010009765625,
0.002368927001953125,
-0.07513427734375,
-0.016754150390625,
-0.005950927734375,
0.01146697998046875,
0.006977081298828125,
0.02239990234375,
-0.03021240234375,
-0.0465087890625,
-0.0601806640625,
-0.0006322860717773438,
-0.0097198486328125,
-0.0024967193603515625,
0.00749969482421875,
0.0307159423828125,
-0.010101318359375,
0.042816162109375,
-0.00714874267578125,
0.003322601318359375,
-0.00836944580078125,
-0.033843994140625,
0.04266357421875,
0.047454833984375,
0.07647705078125,
-0.0888671875,
-0.0294647216796875,
0.01212310791015625,
-0.032073974609375,
-0.0294647216796875,
0.0015115737915039062,
-0.006160736083984375,
-0.03570556640625,
0.0230865478515625,
-0.054168701171875,
0.0187835693359375,
0.0290069580078125,
-0.048797607421875,
0.038330078125,
0.010955810546875,
0.03155517578125,
-0.082763671875,
0.01511383056640625,
-0.000060677528381347656,
-0.032867431640625,
-0.01551055908203125,
0.0697021484375,
0.0105743408203125,
-0.0008559226989746094,
-0.0865478515625,
0.0266265869140625,
-0.04541015625,
-0.01561737060546875,
-0.01065826416015625,
-0.0031986236572265625,
0.00557708740234375,
-0.007297515869140625,
-0.0215911865234375,
0.036285400390625,
0.04193115234375,
-0.03955078125,
0.011810302734375,
0.045074462890625,
-0.0016527175903320312,
0.0292205810546875,
-0.07330322265625,
0.01171112060546875,
0.019287109375,
0.01148223876953125,
-0.0242462158203125,
-0.0182952880859375,
0.02618408203125,
-0.04766845703125,
-0.020355224609375,
-0.00506591796875,
-0.023773193359375,
-0.0235137939453125,
-0.04669189453125,
0.03265380859375,
0.06121826171875,
-0.045501708984375,
0.027557373046875,
0.0325927734375,
-0.00040531158447265625,
-0.00811004638671875,
-0.0601806640625,
0.003414154052734375,
-0.0467529296875,
-0.033966064453125,
0.0665283203125,
-0.041046142578125,
-0.03851318359375,
-0.0167083740234375,
-0.0275726318359375,
-0.00445556640625,
0.005767822265625,
0.059478759765625,
0.00872802734375,
-0.0010080337524414062,
-0.038360595703125,
0.00974273681640625,
0.006420135498046875,
-0.0032176971435546875,
-0.0060882568359375,
0.03704833984375,
-0.05120849609375,
-0.01296234130859375,
-0.028350830078125,
0.01154327392578125,
0.07647705078125,
0.00542449951171875,
-0.0272064208984375,
0.01100921630859375,
-0.035858154296875,
0.0179290771484375,
-0.050018310546875,
-0.0018339157104492188,
-0.03485107421875,
-0.03717041015625,
-0.07073974609375,
-0.05157470703125,
0.06451416015625,
0.003936767578125,
-0.0002741813659667969,
0.046844482421875,
0.04974365234375,
-0.02056884765625,
0.0604248046875,
0.024749755859375,
-0.001811981201171875,
0.01561737060546875,
-0.039215087890625,
-0.039642333984375,
-0.05389404296875,
-0.01267242431640625,
-0.03546142578125,
-0.023223876953125,
-0.0546875,
-0.004169464111328125,
0.00640869140625,
0.02197265625,
-0.034912109375,
0.04888916015625,
-0.0272216796875,
0.041839599609375,
0.021759033203125,
0.038543701171875,
0.0216522216796875,
-0.021942138671875,
-0.034698486328125,
-0.007293701171875,
-0.03289794921875,
-0.0303802490234375,
0.056610107421875,
0.026580810546875,
0.0115203857421875,
0.03192138671875,
0.09552001953125,
0.00817108154296875,
-0.006603240966796875,
-0.005352020263671875,
0.0556640625,
-0.00576019287109375,
-0.054107666015625,
0.02947998046875,
-0.00301361083984375,
-0.043212890625,
0.03948974609375,
-0.05792236328125,
-0.010894775390625,
0.0007524490356445312,
0.0087432861328125,
-0.042510986328125,
0.02740478515625,
-0.021575927734375,
0.0504150390625,
0.009490966796875,
-0.02484130859375,
-0.004116058349609375,
-0.028045654296875,
0.0161285400390625,
-0.0027790069580078125,
0.049468994140625,
-0.01029205322265625,
-0.00095367431640625,
0.022369384765625,
-0.041168212890625,
0.039306640625,
0.01165771484375,
0.0073699951171875,
0.040008544921875,
0.01450347900390625,
0.002536773681640625,
0.038726806640625,
-0.02020263671875,
0.0030231475830078125,
0.012451171875,
-0.03802490234375,
-0.01340484619140625,
0.08807373046875,
-0.060302734375,
0.0008516311645507812,
-0.043548583984375,
-0.009765625,
0.0198822021484375,
0.01397705078125,
0.05133056640625,
0.0171661376953125,
-0.03131103515625,
0.0170440673828125,
0.01267242431640625,
0.02008056640625,
0.06756591796875,
0.0322265625,
-0.038330078125,
-0.052490234375,
0.033294677734375,
-0.011627197265625,
0.006977081298828125,
-0.0271759033203125,
0.01515960693359375,
0.0009255409240722656,
-0.047393798828125,
-0.026275634765625,
0.01433563232421875,
-0.0333251953125,
-0.007152557373046875,
-0.0190887451171875,
-0.00812530517578125,
-0.0240631103515625,
-0.030487060546875,
-0.03753662109375,
-0.04803466796875,
-0.0384521484375,
-0.006793975830078125,
0.0115509033203125,
0.056671142578125,
-0.0277252197265625,
0.028594970703125,
-0.03753662109375,
0.00968170166015625,
0.0225677490234375,
-0.0016803741455078125,
-0.03765869140625,
-0.0423583984375,
0.01300048828125,
0.007389068603515625,
-0.05316162109375,
-0.07720947265625,
0.048065185546875,
-0.0052490234375,
0.0142974853515625,
0.00778961181640625,
-0.0162506103515625,
0.042327880859375,
-0.027374267578125,
0.07415771484375,
0.037322998046875,
-0.08123779296875,
0.0225982666015625,
-0.0552978515625,
0.0462646484375,
0.0478515625,
-0.0186309814453125,
-0.015716552734375,
-0.04412841796875,
-0.07562255859375,
-0.06793212890625,
0.0008258819580078125,
0.0418701171875,
0.0098724365234375,
-0.006290435791015625,
0.02801513671875,
0.0166778564453125,
0.0127105712890625,
-0.0665283203125,
0.00736236572265625,
-0.042236328125,
-0.005157470703125,
0.00506591796875,
-0.0071563720703125,
0.01216888427734375,
-0.008270263671875,
0.06463623046875,
0.00878143310546875,
0.00659942626953125,
0.009552001953125,
0.007587432861328125,
-0.017181396484375,
0.0182952880859375,
0.042572021484375,
0.047119140625,
-0.0183868408203125,
0.0166778564453125,
0.0313720703125,
-0.032958984375,
0.004169464111328125,
-0.00997161865234375,
-0.017120361328125,
0.0210418701171875,
-0.0101470947265625,
0.0212860107421875,
0.060028076171875,
-0.01398468017578125,
0.052459716796875,
-0.00530242919921875,
-0.0019426345825195312,
-0.07586669921875,
0.0175628662109375,
0.032562255859375,
0.041748046875,
0.01580810546875,
0.032470703125,
0.049774169921875,
-0.0128936767578125,
0.00519561767578125,
-0.0092926025390625,
-0.0186004638671875,
-0.055206298828125,
0.0809326171875,
0.0148162841796875,
-0.01430511474609375,
0.0023136138916015625,
-0.01111602783203125,
-0.0190277099609375,
0.06414794921875,
0.05364990234375,
0.05657958984375,
-0.015869140625,
0.010833740234375,
0.0286712646484375,
-0.003925323486328125,
-0.01450347900390625,
0.06939697265625,
-0.01473236083984375,
-0.0237274169921875,
-0.005157470703125,
-0.032684326171875,
-0.0302734375,
0.0257415771484375,
-0.04803466796875,
0.0770263671875,
-0.05450439453125,
-0.0186004638671875,
0.0192108154296875,
-0.03485107421875,
-0.04071044921875,
0.036590576171875,
0.019439697265625,
0.10577392578125,
-0.07073974609375,
0.047637939453125,
0.059967041015625,
-0.0242462158203125,
-0.048553466796875,
-0.00676727294921875,
0.030120849609375,
0.00982666015625,
0.0079498291015625,
0.016693115234375,
0.0020046234130859375,
-0.0077362060546875,
-0.0465087890625,
-0.054840087890625,
0.06951904296875,
0.023040771484375,
-0.0548095703125,
0.006053924560546875,
-0.0260009765625,
0.020355224609375,
-0.05706787109375,
0.037689208984375,
0.043853759765625,
0.01505279541015625,
0.03326416015625,
-0.0802001953125,
-0.0248565673828125,
-0.05377197265625,
0.0206146240234375,
0.004718780517578125,
-0.050262451171875,
0.08184814453125,
-0.02655029296875,
0.0089111328125,
0.0426025390625,
0.06915283203125,
0.046844482421875,
0.045623779296875,
0.07208251953125,
0.0545654296875,
0.010040283203125,
-0.0104827880859375,
0.06280517578125,
-0.033721923828125,
0.041290283203125,
0.084716796875,
-0.017120361328125,
0.052825927734375,
0.040008544921875,
0.00579833984375,
0.04058837890625,
0.08660888671875,
0.0233001708984375,
0.043670654296875,
0.01369476318359375,
-0.030975341796875,
-0.02044677734375,
0.0182037353515625,
-0.058380126953125,
0.03680419921875,
0.001956939697265625,
-0.0111846923828125,
-0.01654052734375,
0.00478363037109375,
-0.00568389892578125,
0.0382080078125,
-0.01580810546875,
0.034454345703125,
0.003734588623046875,
-0.0153045654296875,
0.04107666015625,
-0.0010290145874023438,
0.0291900634765625,
-0.024322509765625,
-0.0122528076171875,
-0.0169219970703125,
0.0007681846618652344,
-0.036163330078125,
-0.0323486328125,
0.0180206298828125,
-0.00589752197265625,
0.000637054443359375,
-0.00867462158203125,
0.043731689453125,
-0.011871337890625,
-0.038848876953125,
0.0292205810546875,
0.00461578369140625,
0.0221405029296875,
0.0009055137634277344,
-0.055633544921875,
0.056060791015625,
0.01035308837890625,
-0.026611328125,
-0.018157958984375,
-0.0022754669189453125,
0.00609588623046875,
0.0291748046875,
0.05352783203125,
0.0290679931640625,
0.003368377685546875,
-0.0010833740234375,
0.0780029296875,
-0.055023193359375,
-0.07427978515625,
-0.04010009765625,
0.052093505859375,
-0.01387786865234375,
-0.04046630859375,
0.042236328125,
0.07843017578125,
0.06756591796875,
-0.0400390625,
0.056915283203125,
-0.0244903564453125,
0.057952880859375,
-0.035400390625,
0.038970947265625,
-0.047698974609375,
0.02142333984375,
-0.044921875,
-0.09100341796875,
0.00664520263671875,
0.0361328125,
0.0033855438232421875,
0.0162200927734375,
0.0269927978515625,
0.05621337890625,
-0.0303802490234375,
0.021148681640625,
0.018768310546875,
0.0234832763671875,
0.01132965087890625,
0.045928955078125,
0.07183837890625,
-0.049224853515625,
0.0167236328125,
-0.03753662109375,
-0.0181427001953125,
-0.01390838623046875,
-0.061798095703125,
-0.03497314453125,
-0.0482177734375,
-0.035552978515625,
-0.042022705078125,
0.00521087646484375,
0.07952880859375,
0.0714111328125,
-0.06475830078125,
-0.02081298828125,
-0.01556396484375,
-0.010986328125,
-0.004169464111328125,
-0.004825592041015625,
-0.0210418701171875,
0.016632080078125,
-0.07110595703125,
0.05792236328125,
-0.01398468017578125,
0.049896240234375,
0.0226898193359375,
-0.00991058349609375,
-0.0052947998046875,
0.0032329559326171875,
0.0197906494140625,
0.0205841064453125,
-0.05047607421875,
-0.0343017578125,
-0.0017986297607421875,
-0.01093292236328125,
-0.01447296142578125,
0.043731689453125,
-0.03607177734375,
0.01175689697265625,
0.0360107421875,
-0.005096435546875,
0.065673828125,
-0.01534271240234375,
0.012237548828125,
-0.01100921630859375,
0.03131103515625,
0.0028076171875,
0.044677734375,
0.036895751953125,
-0.0007748603820800781,
0.06103515625,
0.01262664794921875,
-0.032135009765625,
-0.042083740234375,
0.025238037109375,
-0.101318359375,
-0.0193328857421875,
0.07177734375,
0.0223541259765625,
-0.02099609375,
0.029876708984375,
-0.03668212890625,
0.00661468505859375,
-0.01690673828125,
0.049591064453125,
0.0243988037109375,
0.00939178466796875,
0.004535675048828125,
-0.055572509765625,
0.01477813720703125,
0.020843505859375,
-0.031097412109375,
-0.05792236328125,
0.04412841796875,
0.06201171875,
0.0233001708984375,
0.025390625,
-0.052001953125,
0.05230712890625,
-0.01032257080078125,
0.043212890625,
-0.001491546630859375,
-0.0206298828125,
-0.0154571533203125,
0.0206298828125,
0.01232147216796875,
-0.05035400390625
]
] |