| modelId | lastModified | tags | pipeline_tag | author | config | securityStatus | id | likes | downloads | library_name | created | card | card_len | embeddings |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
JasperLS/deberta-v3-base-injection | 2023-09-11T12:55:23.000Z | [
"transformers",
"pytorch",
"tensorboard",
"deberta-v2",
"text-classification",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | JasperLS | null | null | JasperLS/deberta-v3-base-injection | 1 | 14,743 | transformers | 2023-05-08T14:17:20 | ---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
base_model: microsoft/deberta-v3-base
model-index:
- name: deberta-v3-base-injection
results: []
---
# deberta-v3-base-injection
This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on the [prompt-injections](https://huggingface.co/datasets/JasperLS/prompt-injections) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0673
- Accuracy: 0.9914
## Model description
More information needed
## Intended uses & limitations
More information needed
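The card leaves this section empty, but the model is a binary prompt-injection classifier, so a natural use is gating untrusted input before it reaches a downstream LLM. A minimal sketch of that decision logic follows; the `INJECTION` label name, the output shape (mimicking a `transformers` text-classification pipeline), and the 0.9 threshold are illustrative assumptions, not confirmed by the card:

```python
def should_block(predictions, threshold=0.9):
    """Decide whether to block a prompt given classifier output.

    `predictions` mimics the list-of-dicts a transformers
    text-classification pipeline returns, e.g.
    [{"label": "INJECTION", "score": 0.98}].
    Label names and the threshold are assumptions.
    """
    top = max(predictions, key=lambda p: p["score"])
    return top["label"] == "INJECTION" and top["score"] >= threshold

print(should_block([{"label": "INJECTION", "score": 0.98}]))  # True
print(should_block([{"label": "LEGIT", "score": 0.99}]))      # False
```

In practice the threshold would be tuned against the deployment's tolerance for false positives.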
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
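With a linear scheduler, no warmup, and 3 epochs of 69 steps each (207 steps total, matching the results table), the learning rate decays from 2e-05 to 0. A sketch of that schedule (the no-warmup assumption is not stated in the card):

```python
def linear_lr(step, base_lr=2e-05, total_steps=207):
    # Linearly decay from base_lr at step 0 to 0 at total_steps,
    # as a linear scheduler without warmup would.
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))    # 2e-05
print(linear_lr(207))  # 0.0
```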
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 69 | 0.2353 | 0.9741 |
| No log | 2.0 | 138 | 0.0894 | 0.9741 |
| No log | 3.0 | 207 | 0.0673 | 0.9914 |
### Framework versions
- Transformers 4.29.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
| 1,584 | [embedding vector omitted] |
Phind/Phind-CodeLlama-34B-Python-v1 | 2023-08-26T03:05:48.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"code llama",
"license:llama2",
"model-index",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | Phind | null | null | Phind/Phind-CodeLlama-34B-Python-v1 | 237 | 14,741 | transformers | 2023-08-25T20:33:09 | ---
license: llama2
model-index:
- name: Phind-CodeLlama-34B-v1
results:
- task:
type: text-generation
dataset:
type: openai_humaneval
name: HumanEval
metrics:
- name: pass@1
type: pass@1
value: 69.5%
verified: false
tags:
- code llama
---
# **Phind-CodeLlama-34B-Python-v1**
We've fine-tuned CodeLlama-34B and CodeLlama-34B-Python on an internal Phind dataset, achieving 67.6% and 69.5% pass@1 on HumanEval, respectively. GPT-4 achieves 67%. We've applied OpenAI's decontamination methodology to our dataset to ensure result validity.
More details can be found on our [blog post](https://www.phind.com/blog/code-llama-beats-gpt4).
## Model Details
This model is fine-tuned from CodeLlama-34B-Python and achieves 69.5% pass@1 on HumanEval.
## Dataset Details
We fine-tuned on a proprietary dataset of ~80k high-quality programming problems and solutions. This dataset consists of instruction-answer pairs instead of code completion examples, making it structurally different from HumanEval. The Phind models were trained for 2 epochs, for a total of ~160k examples shown. LoRA was not used; both models are native fine-tunes. We used DeepSpeed ZeRO 3 and Flash Attention 2 to train these models in three hours on 32 A100-80GB GPUs. We used a sequence length of 4096 tokens.
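A ZeRO Stage 3 run of the kind described is typically driven by a DeepSpeed JSON config. The following is an illustrative sketch only; all values are assumptions, not the actual Phind configuration:

```json
{
  "train_micro_batch_size_per_gpu": "auto",
  "gradient_accumulation_steps": "auto",
  "bf16": { "enabled": true },
  "zero_optimization": {
    "stage": 3,
    "overlap_comm": true,
    "stage3_gather_16bit_weights_on_model_save": true
  }
}
```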
## How to Get Started with the Model
Make sure to install Transformers from the main git branch:
```bash
pip install git+https://github.com/huggingface/transformers.git
```
## How to Prompt the Model
**Please note that this model is somewhat instruction-tuned, but not chat-tuned.**
Do not try to use the Llama chat markup with this model. Instead, simply tell it what you want and add "\n: " at the end of your task.
For example:
```
Write me a linked list implementation: \n
```
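The suffix can be appended programmatically. A minimal sketch, following the form shown in the example (note the card's prose says `"\n: "` while its example shows `": \n"`; the example's form is used here):

```python
def build_prompt(task: str) -> str:
    # Append the completion-style suffix from the card's example.
    # (The card's prose and example disagree on the suffix order;
    # the example's ": \n" form is assumed here.)
    return task + ": \n"

print(build_prompt("Write me a linked list implementation"))
```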
## How to reproduce HumanEval Results
To reproduce our results:
```python
from transformers import AutoTokenizer, LlamaForCausalLM
from human_eval.data import write_jsonl, read_problems
from tqdm import tqdm
# initialize the model
model_path = "Phind/Phind-CodeLlama-34B-Python-v1"
model = LlamaForCausalLM.from_pretrained(model_path, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_path)
# HumanEval helper
def generate_one_completion(prompt: str):
tokenizer.pad_token = tokenizer.eos_token
inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=4096)
# Generate
generate_ids = model.generate(inputs.input_ids.to("cuda"), max_new_tokens=256, do_sample=True, top_p=0.75, top_k=40, temperature=0.1)
completion = tokenizer.batch_decode(generate_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0]
completion = completion.replace(prompt, "").split("\n\n\n")[0]
return completion
# perform HumanEval
problems = read_problems()
num_samples_per_task = 1
samples = [
dict(task_id=task_id, completion=generate_one_completion(problems[task_id]["prompt"]))
for task_id in tqdm(problems)
for _ in range(num_samples_per_task)
]
write_jsonl("samples.jsonl", samples)
# run `evaluate_functional_correctness samples.jsonl` in your HumanEval code sandbox
```
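The post-processing line in `generate_one_completion` strips the echoed prompt from the decoded text and truncates at the first triple newline, which in completion-style generation tends to mark the end of the function body. In isolation it behaves like this:

```python
def postprocess(decoded: str, prompt: str) -> str:
    # Remove the echoed prompt, then cut at the first triple newline,
    # keeping only the generated function body.
    return decoded.replace(prompt, "").split("\n\n\n")[0]

decoded = "PROMPTdef add(a, b):\n    return a + b\n\n\nprint(add(1, 2))"
print(postprocess(decoded, "PROMPT"))  # prints only the add() definition
```

Note that `str.replace` removes every occurrence of the prompt, which is why the prompt text should not also appear inside the completion.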
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
This model has undergone very limited testing. Additional safety testing should be performed before any real-world deployments.
## Training details
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
- **Hardware Type:** 32x A100-80GB
- **Hours used:** 90 GPU-hours
- **Cloud Provider:** AWS
- **Compute Region:** us-east-1 | 3,722 | [embedding vector omitted] |
0.01012420654296875,
-0.02227783203125,
0.01216888427734375,
0.006687164306640625,
0.01007080078125,
0.055877685546875,
0.056884765625,
-0.01497650146484375,
0.0087738037109375,
-0.027801513671875,
0.077880859375,
-0.056671142578125,
-0.0369873046875,
-0.0491943359375,
0.04119873046875,
0.00835418701171875,
-0.04071044921875,
0.043914794921875,
0.047760009765625,
0.065673828125,
-0.0128021240234375,
0.04547119140625,
0.005374908447265625,
0.01042938232421875,
-0.0258026123046875,
0.06341552734375,
-0.039794921875,
0.0251007080078125,
-0.029998779296875,
-0.053253173828125,
0.0032329559326171875,
0.0777587890625,
-0.0099029541015625,
-0.0005497932434082031,
0.04193115234375,
0.0736083984375,
-0.00994110107421875,
-0.0005288124084472656,
0.0178070068359375,
0.0184326171875,
0.0171356201171875,
0.0677490234375,
0.059600830078125,
-0.06451416015625,
0.03814697265625,
-0.057891845703125,
-0.02191162109375,
-0.00872802734375,
-0.04461669921875,
-0.06072998046875,
-0.038330078125,
-0.042938232421875,
-0.052001953125,
0.0016574859619140625,
0.07855224609375,
0.0587158203125,
-0.056121826171875,
-0.0130615234375,
0.0005369186401367188,
0.0146026611328125,
-0.016357421875,
-0.0171661376953125,
0.032684326171875,
-0.0117034912109375,
-0.046295166015625,
0.0174102783203125,
0.002353668212890625,
0.00732421875,
-0.007904052734375,
-0.0113525390625,
-0.0183868408203125,
-0.01523590087890625,
0.0263519287109375,
0.028076171875,
-0.061279296875,
-0.0164794921875,
0.0194244384765625,
-0.04071044921875,
0.0124664306640625,
0.025726318359375,
-0.06591796875,
0.00995635986328125,
0.0350341796875,
0.04107666015625,
0.031097412109375,
-0.00424957275390625,
0.0220184326171875,
-0.0224609375,
0.020416259765625,
0.0255584716796875,
0.025238037109375,
0.0082855224609375,
-0.047088623046875,
0.0426025390625,
0.0248260498046875,
-0.055999755859375,
-0.0654296875,
-0.004222869873046875,
-0.08917236328125,
0.0009236335754394531,
0.09088134765625,
-0.0103302001953125,
-0.0216827392578125,
-0.0117034912109375,
-0.017333984375,
0.049072265625,
-0.0279998779296875,
0.06243896484375,
0.0188751220703125,
-0.01010894775390625,
-0.0058441162109375,
-0.04486083984375,
0.047088623046875,
0.0374755859375,
-0.06591796875,
-0.01552581787109375,
0.04278564453125,
0.04498291015625,
0.00005030632019042969,
0.06591796875,
0.0017871856689453125,
0.03729248046875,
-0.0035305023193359375,
0.0270233154296875,
-0.021514892578125,
0.0005168914794921875,
-0.0382080078125,
-0.004886627197265625,
-0.0101165771484375,
-0.0328369140625
]
] |
timm/resnet34.a1_in1k | 2023-04-05T18:05:47.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:2110.00476",
"arxiv:1512.03385",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/resnet34.a1_in1k | 0 | 14,699 | timm | 2023-04-05T18:05:32 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
---
# Model card for resnet34.a1_in1k
A ResNet-B image classification model.
This model features:
* ReLU activations
* single layer 7x7 convolution with pooling
* 1x1 convolution shortcut downsample
Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* ResNet Strikes Back `A1` recipe
* LAMB optimizer with BCE loss
* Cosine LR schedule with warmup
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 21.8
- GMACs: 3.7
- Activations (M): 3.7
- Image size: train = 224 x 224, test = 288 x 288
- **Papers:**
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- Deep Residual Learning for Image Recognition: https://arxiv.org/abs/1512.03385
- **Original:** https://github.com/huggingface/pytorch-image-models
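As a rough sanity check on the stats above, convolutional compute scales approximately with the square of the input resolution, so 3.7 GMACs at 224 x 224 implies roughly 6.1 GMACs at the 288 x 288 test size — consistent with the entry for this model in the comparison table below. A minimal sketch (the function name and defaults are illustrative, not part of the `timm` API):

```python
def scale_gmacs(base_gmacs: float = 3.7, base_size: int = 224, new_size: int = 288) -> float:
    # Convolution cost grows with the number of output positions,
    # i.e. roughly quadratically in the input resolution.
    return base_gmacs * (new_size / base_size) ** 2
```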
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('resnet34.a1_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
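The post-processing in the last line above can be sketched in plain Python for a single logit vector; the helper names are illustrative, but the math mirrors what `output.softmax(dim=1)` and `torch.topk` compute:

```python
import math

def softmax(logits):
    # Numerically stable softmax over a single vector of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_k_indices(probs, k=5):
    # Indices of the k largest probabilities, highest first.
    return sorted(range(len(probs)), key=probs.__getitem__, reverse=True)[:k]
```

For example, `top_k_indices(softmax([0.1, 2.0, 1.0]), k=2)` returns `[1, 2]`.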
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnet34.a1_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 64, 56, 56])
# torch.Size([1, 128, 28, 28])
# torch.Size([1, 256, 14, 14])
# torch.Size([1, 512, 7, 7])
print(o.shape)
```
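The five shapes printed above follow the standard ResNet stride pattern: spatial reductions of 2, 4, 8, 16, and 32 relative to the input, with ResNet-34's stage widths. A small sketch of the expected shapes for a given input size (the helper name and defaults are assumptions derived from the shapes shown in the comments above):

```python
def expected_feature_shapes(img_size=224,
                            channels=(64, 64, 128, 256, 512),
                            reductions=(2, 4, 8, 16, 32)):
    # Each stage reduces spatial resolution by its stride factor;
    # channel counts follow ResNet-34's stage widths.
    return [(1, c, img_size // r, img_size // r)
            for c, r in zip(channels, reductions)]
```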
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnet34.a1_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 512, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
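Embeddings produced this way are typically compared with cosine similarity (e.g. for image retrieval). A plain-Python sketch of the metric, with an illustrative helper name:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors:
    # 1.0 for identical directions, 0.0 for orthogonal ones.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```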
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
|model |img_size|top1 |top5 |param_count|gmacs|macts|img/sec|
|------------------------------------------|--------|-----|-----|-----------|-----|-----|-------|
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|320 |86.72|98.17|93.6 |35.2 |69.7 |451 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|288 |86.51|98.08|93.6 |28.5 |56.4 |560 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|288 |86.49|98.03|93.6 |28.5 |56.4 |557 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|224 |85.96|97.82|93.6 |17.2 |34.2 |923 |
|[resnext101_32x32d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x32d.fb_wsl_ig1b_ft_in1k)|224 |85.11|97.44|468.5 |87.3 |91.1 |254 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|416 |85.0 |97.12|191.9 |108.4|213.8|134 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|352 |84.96|97.22|102.1 |50.2 |101.2|291 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|320 |84.73|97.18|102.1 |41.5 |83.7 |353 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|384 |84.71|96.99|164.0 |77.6 |154.7|183 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|288 |84.57|97.08|93.6 |28.5 |56.4 |557 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|320 |84.45|97.08|93.2 |31.5 |67.8 |446 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|352 |84.43|96.97|129.9 |51.1 |105.5|280 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|288 |84.36|96.92|93.6 |27.6 |53.0 |595 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|320 |84.35|97.04|66.8 |24.1 |47.7 |610 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|288 |84.3 |96.94|164.0 |43.7 |87.1 |333 |
|[resnext101_32x8d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_swsl_ig1b_ft_in1k)|224 |84.28|97.17|88.8 |16.5 |31.2 |1100 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|320 |84.24|96.86|191.9 |64.2 |126.6|228 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|288 |84.19|96.87|93.6 |27.2 |51.6 |613 |
|[resnext101_32x16d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_wsl_ig1b_ft_in1k)|224 |84.18|97.19|194.0 |36.3 |51.2 |581 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|288 |84.11|97.11|44.6 |15.1 |29.0 |1144 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|320 |83.97|96.82|64.7 |31.2 |67.3 |518 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|256 |83.87|96.75|93.2 |20.2 |43.4 |692 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|224 |83.86|96.65|93.6 |17.2 |34.2 |923 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|320 |83.72|96.61|86.6 |24.3 |48.1 |617 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|256 |83.69|96.78|66.8 |15.4 |30.6 |943 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|224 |83.68|96.61|93.6 |16.7 |32.0 |986 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|320 |83.67|96.74|60.2 |24.1 |47.7 |706 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|256 |83.59|96.61|129.9 |27.1 |55.8 |526 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|224 |83.58|96.4 |93.6 |16.5 |31.2 |1013 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|224 |83.54|96.83|44.6 |9.1 |17.6 |1864 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|288 |83.46|96.54|60.2 |19.1 |37.3 |904 |
|[resnext101_32x16d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k)|224 |83.35|96.85|194.0 |36.3 |51.2 |582 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|256 |83.23|96.53|64.7 |20.0 |43.1 |809 |
|[resnext101_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_swsl_ig1b_ft_in1k)|224 |83.22|96.75|44.2 |8.0 |21.2 |1814 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|288 |83.16|96.38|83.5 |25.7 |51.6 |590 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|256 |83.14|96.38|60.2 |15.4 |30.5 |1096 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|320 |83.02|96.45|44.6 |16.5 |34.8 |992 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|288 |82.98|96.54|44.6 |13.4 |28.2 |1077 |
|[resnext101_64x4d.tv_in1k](https://huggingface.co/timm/resnext101_64x4d.tv_in1k)|224 |82.98|96.25|83.5 |15.5 |31.2 |989 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|256 |82.86|96.28|86.6 |15.6 |30.8 |951 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|224 |82.83|96.22|88.8 |16.5 |31.2 |1099 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|224 |82.8 |96.13|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|288 |82.8 |96.32|44.6 |13.0 |26.8 |1291 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|288 |82.74|95.71|60.2 |19.1 |37.3 |905 |
|[resnext101_32x8d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_wsl_ig1b_ft_in1k)|224 |82.69|96.63|88.8 |16.5 |31.2 |1100 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|288 |82.62|95.75|60.2 |19.1 |37.3 |904 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|288 |82.61|96.49|25.6 |8.9 |20.6 |1729 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|288 |82.53|96.13|36.8 |9.9 |21.5 |1773 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|224 |82.5 |96.02|126.9 |22.8 |21.2 |1078 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|224 |82.46|95.92|83.5 |15.5 |31.2 |987 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|288 |82.36|96.18|35.7 |8.1 |20.9 |1964 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|320 |82.35|96.14|25.6 |8.8 |24.1 |1386 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|288 |82.31|95.63|44.6 |13.0 |26.8 |1291 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|288 |82.29|96.01|63.6 |13.6 |28.5 |1078 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|224 |82.29|96.0 |60.2 |11.6 |22.6 |1484 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|288 |82.27|96.06|68.9 |18.9 |23.8 |1176 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|256 |82.26|96.07|44.6 |10.6 |22.2 |1542 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|288 |82.24|95.73|44.6 |13.0 |26.8 |1290 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|288 |82.2 |96.14|27.6 |7.0 |23.8 |1547 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|224 |82.18|96.05|44.6 |8.1 |17.1 |1771 |
|[resnext50_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_swsl_ig1b_ft_in1k)|224 |82.17|96.22|25.0 |4.3 |14.4 |2943 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|288 |82.12|95.65|25.6 |7.1 |19.6 |1704 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|288 |82.03|95.94|25.0 |7.0 |23.8 |1745 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|288 |82.0 |96.15|24.9 |5.8 |12.7 |1787 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|256 |81.99|95.85|36.8 |7.8 |17.0 |2230 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|176 |81.98|95.72|88.8 |10.3 |19.4 |1768 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|224 |81.97|95.24|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|224 |81.93|95.75|44.6 |7.8 |16.2 |2122 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|224 |81.9 |95.77|44.6 |7.8 |16.2 |2118 |
|[resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k)|224 |81.84|96.1 |194.0 |36.3 |51.2 |583 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|256 |81.78|95.94|35.7 |6.4 |16.6 |2471 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|224 |81.77|95.22|60.2 |11.6 |22.6 |1485 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|224 |81.74|96.06|25.6 |5.4 |12.4 |2813 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|288 |81.65|95.54|25.6 |7.1 |19.6 |1703 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|288 |81.64|95.88|25.6 |7.2 |19.7 |1694 |
|[resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k)|224 |81.62|96.04|88.8 |16.5 |31.2 |1101 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|224 |81.61|95.76|68.9 |11.4 |14.4 |1930 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|288 |81.61|95.83|25.6 |8.5 |19.2 |1868 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|224 |81.5 |95.16|44.6 |7.8 |16.2 |2125 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|288 |81.48|95.16|25.0 |7.0 |23.8 |1745 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|288 |81.47|95.71|25.9 |6.9 |18.6 |2071 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|224 |81.45|95.53|68.9 |11.4 |14.4 |1929 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|288 |81.44|95.22|25.6 |7.2 |19.7 |1908 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|256 |81.44|95.67|25.6 |5.6 |15.4 |2168 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|288 |81.4 |95.82|30.2 |6.8 |13.9 |2132 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|288 |81.37|95.74|25.6 |7.2 |19.7 |1910 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|224 |81.32|95.19|44.6 |7.8 |16.2 |2125 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|288 |81.3 |95.65|28.1 |6.8 |18.4 |1803 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|288 |81.3 |95.11|25.0 |7.0 |23.8 |1746 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|224 |81.27|95.62|27.6 |4.3 |14.4 |2591 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|224 |81.26|95.16|25.6 |4.3 |11.8 |2823 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|288 |81.23|95.54|15.7 |4.8 |19.6 |2117 |
|[senet154.gluon_in1k](https://huggingface.co/timm/senet154.gluon_in1k)|224 |81.23|95.35|115.1 |20.8 |38.7 |545 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|288 |81.22|95.11|25.6 |6.8 |18.4 |2089 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|288 |81.22|95.63|25.6 |6.8 |18.4 |676 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|288 |81.18|95.09|25.6 |7.2 |19.7 |1908 |
|[resnet50.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet50.fb_swsl_ig1b_ft_in1k)|224 |81.18|95.98|25.6 |4.1 |11.1 |3455 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|224 |81.17|95.34|25.0 |4.3 |14.4 |2933 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|224 |81.1 |95.33|25.0 |4.3 |14.4 |2934 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|288 |81.1 |95.23|28.1 |6.8 |18.4 |1801 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|288 |81.1 |95.12|28.1 |6.8 |18.4 |1799 |
|[resnet152s.gluon_in1k](https://huggingface.co/timm/resnet152s.gluon_in1k)|224 |81.02|95.41|60.3 |12.9 |25.0 |1347 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|288 |80.97|95.44|25.6 |6.8 |18.4 |2085 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|256 |80.94|95.45|25.9 |5.4 |14.7 |2571 |
|[resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.93|95.73|44.2 |8.0 |21.2 |1814 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|288 |80.91|95.55|25.6 |6.8 |18.4 |2084 |
|[seresnext101_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_32x4d.gluon_in1k)|224 |80.9 |95.31|49.0 |8.0 |21.3 |1585 |
|[seresnext101_64x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_64x4d.gluon_in1k)|224 |80.9 |95.3 |88.2 |15.5 |31.2 |918 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|288 |80.86|95.52|25.6 |6.8 |18.4 |2085 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|224 |80.85|95.43|25.6 |4.1 |11.1 |3450 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|224 |80.84|95.02|25.6 |4.3 |11.8 |2821 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|224 |80.79|95.62|24.9 |3.5 |7.7 |2961 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|288 |80.79|95.36|19.8 |6.0 |14.8 |2506 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|288 |80.79|95.58|19.9 |4.2 |10.6 |2349 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|288 |80.78|94.99|25.6 |6.8 |18.4 |2088 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|288 |80.71|95.43|25.6 |6.8 |18.4 |2087 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|288 |80.7 |95.39|25.0 |7.0 |23.8 |1749 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|192 |80.69|95.24|63.6 |6.0 |12.7 |2270 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|224 |80.68|94.71|25.6 |4.4 |11.9 |3162 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|288 |80.68|95.36|19.7 |6.0 |14.8 |2637 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|224 |80.67|95.3 |25.6 |4.1 |11.1 |3452 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|288 |80.67|95.42|25.0 |7.4 |25.1 |1626 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|224 |80.63|95.21|25.6 |5.2 |11.6 |3034 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|224 |80.61|95.32|25.6 |4.4 |11.9 |2813 |
|[resnext101_64x4d.gluon_in1k](https://huggingface.co/timm/resnext101_64x4d.gluon_in1k)|224 |80.61|94.99|83.5 |15.5 |31.2 |989 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|288 |80.6 |95.31|19.9 |6.0 |14.8 |2578 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|256 |80.57|95.17|15.7 |3.8 |15.5 |2710 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|224 |80.56|95.0 |60.2 |11.6 |22.6 |1483 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|224 |80.53|95.16|25.6 |4.4 |11.9 |3164 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|224 |80.53|94.46|25.0 |4.3 |14.4 |2930 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|176 |80.48|94.98|126.9 |14.3 |13.2 |1719 |
|[resnet152d.gluon_in1k](https://huggingface.co/timm/resnet152d.gluon_in1k)|224 |80.47|95.2 |60.2 |11.8 |23.4 |1428 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|288 |80.45|95.32|25.6 |6.8 |18.4 |2086 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|224 |80.45|95.24|30.2 |4.1 |8.4 |3530 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|224 |80.45|94.63|25.0 |4.3 |14.4 |2936 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|176 |80.43|95.09|68.9 |7.3 |9.0 |3015 |
|[resnet101d.gluon_in1k](https://huggingface.co/timm/resnet101d.gluon_in1k)|224 |80.42|95.01|44.6 |8.1 |17.0 |2007 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|224 |80.38|94.6 |25.6 |4.1 |11.1 |3461 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|256 |80.36|95.1 |19.8 |4.8 |11.7 |3267 |
|[resnext101_32x4d.gluon_in1k](https://huggingface.co/timm/resnext101_32x4d.gluon_in1k)|224 |80.34|94.93|44.2 |8.0 |21.2 |1814 |
|[resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.32|95.4 |25.0 |4.3 |14.4 |2941 |
|[resnet101s.gluon_in1k](https://huggingface.co/timm/resnet101s.gluon_in1k)|224 |80.28|95.16|44.7 |9.2 |18.6 |1851 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|224 |80.26|95.08|28.1 |4.1 |11.1 |2972 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|288 |80.24|95.24|25.6 |8.5 |19.9 |1523 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|224 |80.22|94.63|25.6 |4.4 |11.9 |3162 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|176 |80.2 |94.64|60.2 |7.2 |14.0 |2346 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|224 |80.08|94.74|28.1 |4.1 |11.1 |2969 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|256 |80.08|94.97|19.7 |4.8 |11.7 |3284 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|256 |80.06|94.99|19.9 |4.8 |11.7 |3216 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|224 |80.06|94.95|25.6 |4.1 |11.1 |1109 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|224 |80.02|94.71|28.1 |4.1 |11.1 |2962 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|288 |79.97|95.05|25.6 |6.8 |18.4 |2086 |
|[resnet152c.gluon_in1k](https://huggingface.co/timm/resnet152c.gluon_in1k)|224 |79.92|94.84|60.2 |11.8 |23.4 |1455 |
|[seresnext50_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext50_32x4d.gluon_in1k)|224 |79.91|94.82|27.6 |4.3 |14.4 |2591 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|224 |79.91|94.67|25.6 |4.1 |11.1 |3456 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|176 |79.9 |94.6 |44.6 |4.9 |10.1 |3341 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|224 |79.89|94.97|35.7 |4.5 |12.1 |2774 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|224 |79.88|94.87|25.6 |4.1 |11.1 |3455 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|320 |79.86|95.07|16.0 |5.2 |16.4 |2168 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|224 |79.85|94.56|25.6 |4.1 |11.1 |3460 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|288 |79.83|94.97|25.6 |6.8 |18.4 |2087 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|224 |79.82|94.62|44.6 |7.8 |16.2 |2114 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|224 |79.76|94.6 |25.0 |4.3 |14.4 |2943 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|224 |79.74|94.95|25.6 |4.1 |11.1 |3455 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|224 |79.74|94.87|19.9 |2.5 |6.4 |3929 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|288 |79.71|94.83|19.7 |6.0 |14.8 |2710 |
|[resnet152.gluon_in1k](https://huggingface.co/timm/resnet152.gluon_in1k)|224 |79.68|94.74|60.2 |11.6 |22.6 |1486 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|224 |79.67|94.87|25.0 |4.5 |15.2 |2729 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|288 |79.63|94.91|25.6 |6.8 |18.4 |2086 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|224 |79.56|94.72|25.6 |4.3 |11.8 |2805 |
|[resnet101c.gluon_in1k](https://huggingface.co/timm/resnet101c.gluon_in1k)|224 |79.53|94.58|44.6 |8.1 |17.0 |2062 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|224 |79.52|94.61|25.6 |4.1 |11.1 |3459 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|176 |79.42|94.64|25.6 |2.6 |6.9 |5397 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|288 |79.4 |94.66|18.0 |5.9 |14.6 |2752 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|224 |79.38|94.57|25.6 |4.1 |11.1 |3459 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|176 |79.37|94.3 |25.0 |2.7 |9.0 |4577 |
|[resnext50_32x4d.gluon_in1k](https://huggingface.co/timm/resnext50_32x4d.gluon_in1k)|224 |79.36|94.43|25.0 |4.3 |14.4 |2942 |
|[resnext101_32x8d.tv_in1k](https://huggingface.co/timm/resnext101_32x8d.tv_in1k)|224 |79.31|94.52|88.8 |16.5 |31.2 |1100 |
|[resnet101.gluon_in1k](https://huggingface.co/timm/resnet101.gluon_in1k)|224 |79.31|94.53|44.6 |7.8 |16.2 |2125 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|224 |79.31|94.63|25.6 |5.2 |12.0 |2524 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|176 |79.27|94.49|25.6 |2.6 |6.9 |5404 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|224 |79.25|94.31|25.0 |4.3 |14.4 |2931 |
|[resnet50.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet50.fb_ssl_yfcc100m_ft_in1k)|224 |79.22|94.84|25.6 |4.1 |11.1 |3451 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|256 |79.21|94.56|19.7 |4.8 |11.7 |3392 |
|[resnet50d.gluon_in1k](https://huggingface.co/timm/resnet50d.gluon_in1k)|224 |79.07|94.48|25.6 |4.4 |11.9 |3162 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|224 |79.03|94.38|25.6 |4.1 |11.1 |3453 |
|[resnet50.am_in1k](https://huggingface.co/timm/resnet50.am_in1k)|224 |79.01|94.39|25.6 |4.1 |11.1 |3461 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|256 |79.01|94.37|18.0 |4.6 |11.6 |3440 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|256 |78.9 |94.54|16.0 |3.4 |10.5 |3421 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|160 |78.89|94.11|60.2 |5.9 |11.5 |2745 |
|[wide_resnet101_2.tv_in1k](https://huggingface.co/timm/wide_resnet101_2.tv_in1k)|224 |78.84|94.28|126.9 |22.8 |21.2 |1079 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|288 |78.83|94.24|16.8 |4.5 |16.8 |2251 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|224 |78.81|94.32|25.6 |4.1 |11.1 |3454 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|288 |78.74|94.33|16.8 |4.5 |16.7 |2264 |
|[resnet50s.gluon_in1k](https://huggingface.co/timm/resnet50s.gluon_in1k)|224 |78.72|94.23|25.7 |5.5 |13.5 |2796 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|224 |78.71|94.24|25.6 |4.4 |11.9 |3154 |
|[wide_resnet50_2.tv_in1k](https://huggingface.co/timm/wide_resnet50_2.tv_in1k)|224 |78.47|94.09|68.9 |11.4 |14.4 |1934 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|224 |78.46|94.27|25.6 |4.1 |11.1 |3454 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|288 |78.43|94.35|21.8 |6.5 |7.5 |3291 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|288 |78.42|94.04|10.5 |3.1 |13.3 |3226 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|320 |78.33|94.13|16.0 |5.2 |16.4 |2391 |
|[resnet152.tv_in1k](https://huggingface.co/timm/resnet152.tv_in1k)|224 |78.32|94.04|60.2 |11.6 |22.6 |1487 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|288 |78.28|94.1 |10.4 |3.1 |13.3 |3062 |
|[bat_resnext26ts.ch_in1k](https://huggingface.co/timm/bat_resnext26ts.ch_in1k)|256 |78.25|94.1 |10.7 |2.5 |12.5 |3393 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|224 |78.06|93.78|25.6 |4.1 |11.1 |3450 |
|[resnet50c.gluon_in1k](https://huggingface.co/timm/resnet50c.gluon_in1k)|224 |78.0 |93.99|25.6 |4.4 |11.9 |3286 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|288 |78.0 |93.91|10.3 |3.1 |13.3 |3297 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|224 |77.98|93.75|16.8 |2.7 |10.1 |3841 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|288 |77.92|93.77|21.8 |6.1 |6.2 |3609 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|160 |77.88|93.71|44.6 |4.0 |8.3 |3926 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|256 |77.87|93.84|16.0 |3.4 |10.5 |3772 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|256 |77.86|93.79|10.4 |2.4 |10.5 |4263 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|160 |77.82|93.81|35.7 |2.3 |6.2 |5238 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|256 |77.81|93.82|10.5 |2.4 |10.5 |4183 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|160 |77.79|93.6 |25.6 |2.2 |6.0 |5329 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|160 |77.73|93.32|25.0 |2.2 |7.4 |5576 |
|[resnext50_32x4d.tv_in1k](https://huggingface.co/timm/resnext50_32x4d.tv_in1k)|224 |77.61|93.7 |25.0 |4.3 |14.4 |2944 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|224 |77.59|93.61|16.8 |2.7 |10.2 |3807 |
|[resnet50.gluon_in1k](https://huggingface.co/timm/resnet50.gluon_in1k)|224 |77.58|93.72|25.6 |4.1 |11.1 |3455 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|256 |77.44|93.56|10.3 |2.4 |10.5 |4284 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|288 |77.41|93.63|16.0 |4.3 |13.5 |2907 |
|[resnet101.tv_in1k](https://huggingface.co/timm/resnet101.tv_in1k)|224 |77.38|93.54|44.6 |7.8 |16.2 |2125 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|160 |77.22|93.27|25.6 |2.2 |6.1 |5982 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|288 |77.17|93.47|10.3 |3.1 |13.3 |3392 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|288 |77.15|93.27|21.8 |6.1 |6.2 |3615 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|224 |77.1 |93.37|21.8 |3.9 |4.5 |5436 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|224 |77.02|93.07|28.1 |4.1 |11.1 |2952 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|256 |76.78|93.13|10.3 |2.4 |10.5 |4410 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|224 |76.7 |93.17|16.0 |2.6 |8.2 |4859 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|288 |76.5 |93.35|21.8 |6.1 |6.2 |3617 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|224 |76.42|92.87|21.8 |3.7 |3.7 |5984 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|288 |76.35|93.18|16.0 |3.9 |12.2 |3331 |
|[resnet50.tv_in1k](https://huggingface.co/timm/resnet50.tv_in1k)|224 |76.13|92.86|25.6 |4.1 |11.1 |3457 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|160 |75.96|92.5 |25.6 |2.1 |5.7 |6490 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|224 |75.52|92.44|21.8 |3.7 |3.7 |5991 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|224 |75.3 |92.58|16.0 |2.4 |7.4 |5583 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|224 |75.16|92.18|21.8 |3.7 |3.7 |5994 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|160 |75.1 |92.08|28.1 |2.1 |5.7 |5513 |
|[resnet34.gluon_in1k](https://huggingface.co/timm/resnet34.gluon_in1k)|224 |74.57|91.98|21.8 |3.7 |3.7 |5984 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|288 |73.81|91.83|11.7 |3.4 |5.4 |5196 |
|[resnet34.tv_in1k](https://huggingface.co/timm/resnet34.tv_in1k)|224 |73.32|91.42|21.8 |3.7 |3.7 |5979 |
|[resnet18.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet18.fb_swsl_ig1b_ft_in1k)|224 |73.28|91.73|11.7 |1.8 |2.5 |10213 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|288 |73.16|91.03|11.7 |3.0 |4.1 |6050 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|224 |72.98|91.11|21.8 |3.7 |3.7 |5967 |
|[resnet18.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet18.fb_ssl_yfcc100m_ft_in1k)|224 |72.6 |91.42|11.7 |1.8 |2.5 |10213 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|288 |72.37|90.59|11.7 |3.0 |4.1 |6051 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|224 |72.26|90.31|10.1 |1.7 |5.8 |7026 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|224 |72.26|90.68|11.7 |2.1 |3.3 |8707 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|224 |71.49|90.07|11.7 |1.8 |2.5 |10187 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|176 |71.31|89.69|10.1 |1.1 |3.6 |10970 |
|[resnet18.gluon_in1k](https://huggingface.co/timm/resnet18.gluon_in1k)|224 |70.84|89.76|11.7 |1.8 |2.5 |10210 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|224 |70.64|89.47|11.7 |1.8 |2.5 |10194 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|160 |70.56|89.52|21.8 |1.9 |1.9 |10737 |
|[resnet18.tv_in1k](https://huggingface.co/timm/resnet18.tv_in1k)|224 |69.76|89.07|11.7 |1.8 |2.5 |10205 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|224 |68.34|88.03|5.4 |1.1 |2.4 |13079 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|224 |68.25|88.17|11.7 |1.8 |2.5 |10167 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|176 |66.71|86.96|5.4 |0.7 |1.5 |20327 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|160 |65.66|86.26|11.7 |0.9 |1.3 |18229 |
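The table above trades top-1/top-5 accuracy against compute (GMACs, activations) and throughput. As a small illustration of how to read it, the sketch below uses a handful of rows copied from the table (a hypothetical helper, and note the rows mix test resolutions) to extract the checkpoints that are Pareto-optimal on accuracy vs. GMACs:

```python
# A few (model, top-1, GMACs) rows taken from the results table above.
rows = [
    ("resnet101.gluon_in1k", 79.31, 7.8),
    ("resnetblur50.bt_in1k", 79.31, 5.2),
    ("resnet50.a1h_in1k",    79.27, 2.6),
    ("resnet34.bt_in1k",     75.16, 3.7),
    ("resnet18.tv_in1k",     69.76, 1.8),
    ("resnet10t.c3_in1k",    66.71, 0.7),
]

def pareto_frontier(rows):
    """Keep each model that is strictly more accurate than every cheaper one."""
    frontier = []
    best_top1 = float("-inf")
    # Walk the rows from cheapest to most expensive (by GMACs).
    for name, top1, gmacs in sorted(rows, key=lambda r: r[2]):
        if top1 > best_top1:
            frontier.append(name)
            best_top1 = top1
    return frontier

print(pareto_frontier(rows))
# → ['resnet10t.c3_in1k', 'resnet18.tv_in1k', 'resnet50.a1h_in1k', 'resnetblur50.bt_in1k']
```

Here `resnet101.gluon_in1k` drops off the frontier because `resnetblur50.bt_in1k` reaches the same top-1 at two thirds of the compute.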
## Citation
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@article{He2015,
author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
title = {Deep Residual Learning for Image Recognition},
journal = {arXiv preprint arXiv:1512.03385},
year = {2015}
}
```
---
license: other
tags:
- vision
- image-segmentation
datasets:
- coco
widget:
- src: http://images.cocodataset.org/val2017/000000039769.jpg
example_title: Cats
- src: http://images.cocodataset.org/val2017/000000039770.jpg
example_title: Castle
---
# Mask2Former
Mask2Former model trained on COCO instance segmentation (tiny-sized version, Swin backbone). It was introduced in the paper [Masked-attention Mask Transformer for Universal Image Segmentation](https://arxiv.org/abs/2112.01527) and first released in [this repository](https://github.com/facebookresearch/Mask2Former/).
Disclaimer: The team releasing Mask2Former did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Mask2Former addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. Mask2Former outperforms the previous SOTA,
[MaskFormer](https://arxiv.org/abs/2107.06278), both in terms of performance and efficiency, by (i) replacing the pixel decoder with a more advanced multi-scale deformable attention Transformer, (ii) adopting a Transformer decoder with masked attention to boost performance without introducing additional computation, and (iii) improving training efficiency by calculating the loss on subsampled points instead of whole masks.

## Intended uses & limitations
You can use this particular checkpoint for instance segmentation. See the [model hub](https://huggingface.co/models?search=mask2former) to look for other
fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model:
```python
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation
# load Mask2Former fine-tuned on COCO instance segmentation
processor = AutoImageProcessor.from_pretrained("facebook/mask2former-swin-tiny-coco-instance")
model = Mask2FormerForUniversalSegmentation.from_pretrained("facebook/mask2former-swin-tiny-coco-instance")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
# model predicts class_queries_logits of shape `(batch_size, num_queries, num_labels + 1)`
# and masks_queries_logits of shape `(batch_size, num_queries, height, width)`
class_queries_logits = outputs.class_queries_logits
masks_queries_logits = outputs.masks_queries_logits
# you can pass them to processor for postprocessing
result = processor.post_process_instance_segmentation(outputs, target_sizes=[image.size[::-1]])[0]
# we refer to the demo notebooks for visualization (see "Resources" section in the Mask2Former docs)
predicted_instance_map = result["segmentation"]
```
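Beyond the instance map, the post-processed `result` also contains a `segments_info` list describing each detected instance (in current `transformers` versions, dicts with `id`, `label_id` and `score` keys — verify against your installed version). Below is a small, self-contained sketch of filtering such a list by confidence, using made-up example values rather than real model output:

```python
def keep_confident_segments(segments_info, threshold=0.8):
    """Return only the segments whose predicted score clears the threshold."""
    return [seg for seg in segments_info if seg["score"] >= threshold]

# Made-up values mimicking the structure of `result["segments_info"]`.
example = [
    {"id": 0, "label_id": 15, "score": 0.97},  # e.g. a cat
    {"id": 1, "label_id": 15, "score": 0.55},  # low-confidence detection
    {"id": 2, "label_id": 57, "score": 0.88},  # e.g. a couch
]
print(keep_confident_segments(example))  # keeps segments 0 and 2
```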
For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/mask2former).
-0.04168701171875,
0.040313720703125,
-0.061676025390625,
-0.030792236328125,
-0.0253753662109375,
-0.0204925537109375,
0.01493072509765625,
0.0092926025390625,
0.059814453125,
0.0307464599609375,
0.0003161430358886719,
0.0076141357421875,
0.04315185546875,
0.00331878662109375,
0.032745361328125,
-0.0046844482421875,
-0.01161956787109375,
-0.033111572265625,
0.050872802734375,
0.0028285980224609375,
0.018402099609375,
0.01476287841796875,
0.0156402587890625,
-0.035064697265625,
-0.011260986328125,
-0.05169677734375,
0.02801513671875,
-0.034454345703125,
-0.0308380126953125,
-0.0650634765625,
-0.0322265625,
-0.059539794921875,
-0.0282135009765625,
-0.038848876953125,
-0.036895751953125,
-0.026458740234375,
-0.0017528533935546875,
0.030181884765625,
0.0382080078125,
-0.01212310791015625,
0.041168212890625,
-0.022705078125,
0.0236663818359375,
0.046630859375,
0.017669677734375,
-0.0118865966796875,
-0.046417236328125,
-0.003631591796875,
0.0015459060668945312,
-0.04046630859375,
-0.056396484375,
0.0207366943359375,
0.002529144287109375,
0.0261688232421875,
0.048492431640625,
-0.01568603515625,
0.05133056640625,
-0.0118865966796875,
0.054107666015625,
0.034942626953125,
-0.059906005859375,
0.056488037109375,
-0.003673553466796875,
0.0049896240234375,
0.0195465087890625,
0.00913238525390625,
-0.044647216796875,
-0.008453369140625,
-0.0604248046875,
-0.0657958984375,
0.09405517578125,
0.0201416015625,
-0.0094146728515625,
0.0168304443359375,
0.03485107421875,
-0.0038433074951171875,
0.0037593841552734375,
-0.049407958984375,
-0.0258331298828125,
-0.0306854248046875,
0.0080108642578125,
-0.00615692138671875,
-0.055816650390625,
-0.004009246826171875,
-0.03704833984375,
0.05157470703125,
0.000682830810546875,
0.04864501953125,
0.0225067138671875,
-0.0217437744140625,
-0.020721435546875,
-0.0294189453125,
0.03936767578125,
0.03558349609375,
-0.0203704833984375,
0.01180267333984375,
0.0028839111328125,
-0.037933349609375,
0.00196075439453125,
0.00794219970703125,
-0.025299072265625,
-0.01097869873046875,
0.02227783203125,
0.0792236328125,
0.00901031494140625,
-0.01806640625,
0.036224365234375,
0.0067138671875,
-0.021881103515625,
-0.025299072265625,
0.0130767822265625,
0.00007045269012451172,
0.01873779296875,
0.01177215576171875,
0.0205841064453125,
0.0223846435546875,
-0.02813720703125,
0.01561737060546875,
0.0178070068359375,
-0.04522705078125,
-0.02960205078125,
0.0714111328125,
-0.011749267578125,
-0.0185699462890625,
0.048919677734375,
-0.021881103515625,
-0.060455322265625,
0.0809326171875,
0.0545654296875,
0.0594482421875,
-0.027740478515625,
0.0233306884765625,
0.056640625,
0.0163726806640625,
-0.01033782958984375,
-0.0075225830078125,
-0.00868988037109375,
-0.03485107421875,
0.0005664825439453125,
-0.04974365234375,
-0.00455474853515625,
0.01245880126953125,
-0.0521240234375,
0.0248260498046875,
-0.046417236328125,
-0.004730224609375,
0.0102081298828125,
0.01169586181640625,
-0.059783935546875,
0.0321044921875,
0.01641845703125,
0.047515869140625,
-0.0577392578125,
0.05230712890625,
0.0718994140625,
-0.0307769775390625,
-0.06036376953125,
-0.0308990478515625,
0.00910186767578125,
-0.07647705078125,
0.0218658447265625,
0.054107666015625,
0.00215911865234375,
-0.005374908447265625,
-0.04205322265625,
-0.06048583984375,
0.09539794921875,
0.01392364501953125,
-0.0279541015625,
0.0064544677734375,
0.00341033935546875,
0.0207977294921875,
-0.039154052734375,
0.0389404296875,
0.04180908203125,
0.027923583984375,
0.04339599609375,
-0.035308837890625,
0.0163116455078125,
-0.0241851806640625,
0.0250701904296875,
-0.004032135009765625,
-0.06878662109375,
0.061614990234375,
-0.034271240234375,
-0.0110321044921875,
-0.0006771087646484375,
0.05963134765625,
0.0166473388671875,
0.0316162109375,
0.0400390625,
0.049774169921875,
0.0411376953125,
-0.01287078857421875,
0.07269287109375,
0.0012178421020507812,
0.047210693359375,
0.05865478515625,
0.01557159423828125,
0.033782958984375,
0.0196075439453125,
-0.006198883056640625,
0.04071044921875,
0.07476806640625,
-0.0313720703125,
0.033111572265625,
0.01107025146484375,
-0.00439453125,
-0.0106658935546875,
-0.002346038818359375,
-0.033935546875,
0.044830322265625,
0.0193328857421875,
-0.0306549072265625,
-0.0038356781005859375,
0.0100250244140625,
0.00406646728515625,
-0.0261993408203125,
-0.0153045654296875,
0.0509033203125,
0.003997802734375,
-0.043731689453125,
0.05096435546875,
0.0205841064453125,
0.047576904296875,
-0.0281982421875,
0.000873565673828125,
-0.01181793212890625,
0.01153564453125,
-0.0305023193359375,
-0.045318603515625,
0.03936767578125,
-0.01202392578125,
-0.01406097412109375,
0.00153350830078125,
0.0648193359375,
-0.028289794921875,
-0.061676025390625,
0.012847900390625,
0.0024242401123046875,
0.035003662109375,
-0.0250396728515625,
-0.0665283203125,
0.0361328125,
0.0008978843688964844,
-0.033203125,
0.017852783203125,
0.00010305643081665039,
-0.017974853515625,
0.0290985107421875,
0.04241943359375,
-0.0287628173828125,
0.0016765594482421875,
-0.00411224365234375,
0.06707763671875,
-0.024627685546875,
-0.031951904296875,
-0.046478271484375,
0.03717041015625,
-0.02099609375,
-0.013702392578125,
0.038604736328125,
0.06976318359375,
0.06854248046875,
-0.01416778564453125,
0.03704833984375,
-0.01873779296875,
-0.000335693359375,
-0.0244140625,
0.04437255859375,
-0.039581298828125,
-0.016632080078125,
-0.022216796875,
-0.0860595703125,
-0.0191497802734375,
0.0745849609375,
-0.03375244140625,
0.015350341796875,
0.043212890625,
0.0625,
-0.0291290283203125,
-0.00933837890625,
0.01061248779296875,
-0.0028743743896484375,
0.0245208740234375,
0.0369873046875,
0.0266265869140625,
-0.054962158203125,
0.0282135009765625,
-0.06890869140625,
-0.03717041015625,
-0.031463623046875,
-0.0262603759765625,
-0.07073974609375,
-0.052764892578125,
-0.04168701171875,
-0.0266876220703125,
-0.004474639892578125,
0.033447265625,
0.103271484375,
-0.063720703125,
-0.01216888427734375,
-0.00611114501953125,
0.005710601806640625,
-0.019622802734375,
-0.0239410400390625,
0.05126953125,
-0.009765625,
-0.07000732421875,
-0.00453948974609375,
0.0212249755859375,
0.00441741943359375,
0.0024623870849609375,
0.0001678466796875,
0.00936126708984375,
-0.01343536376953125,
0.055267333984375,
0.029632568359375,
-0.05572509765625,
-0.025726318359375,
-0.01140594482421875,
-0.00366973876953125,
0.0179443359375,
0.058258056640625,
-0.047607421875,
0.03460693359375,
0.01690673828125,
0.0169677734375,
0.084716796875,
0.00534820556640625,
0.0013828277587890625,
-0.040924072265625,
0.033477783203125,
0.009033203125,
0.035430908203125,
0.026824951171875,
-0.0450439453125,
0.043212890625,
0.031280517578125,
-0.03814697265625,
-0.047607421875,
0.01800537109375,
-0.104248046875,
-0.01300811767578125,
0.0821533203125,
-0.01180267333984375,
-0.042022705078125,
0.02081298828125,
-0.0374755859375,
0.03607177734375,
-0.020477294921875,
0.062103271484375,
0.0140533447265625,
-0.0279388427734375,
-0.036041259765625,
-0.0193939208984375,
0.0399169921875,
0.00994873046875,
-0.0552978515625,
-0.034210205078125,
0.032379150390625,
0.046478271484375,
0.0094146728515625,
0.050201416015625,
-0.026824951171875,
0.0236663818359375,
0.0094146728515625,
0.025177001953125,
-0.0218658447265625,
-0.0226593017578125,
-0.01277923583984375,
0.0114898681640625,
-0.0239715576171875,
-0.041900634765625
]
] |
Salesforce/codegen-350M-nl | 2022-10-03T16:18:48.000Z | [
"transformers",
"pytorch",
"codegen",
"text-generation",
"arxiv:2203.13474",
"license:bsd-3-clause",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | Salesforce | null | null | Salesforce/codegen-350M-nl | 5 | 14,684 | transformers | 2022-04-11T15:19:18 | ---
license: bsd-3-clause
---
# CodeGen (CodeGen-NL 350M)
## Model description
CodeGen is a family of autoregressive language models for **program synthesis** from the paper: [A Conversational Paradigm for Program Synthesis](https://arxiv.org/abs/2203.13474) by Erik Nijkamp, Bo Pang, Hiroaki Hayashi, Lifu Tu, Huan Wang, Yingbo Zhou, Silvio Savarese, Caiming Xiong. The models are originally released in [this repository](https://github.com/salesforce/CodeGen), under 3 pre-training data variants (`NL`, `Multi`, `Mono`) and 4 model size variants (`350M`, `2B`, `6B`, `16B`).
The checkpoint included in this repository is denoted as **CodeGen-NL 350M** in the paper, where "NL" means it is pre-trained on the Pile and "350M" refers to the number of trainable parameters.
## Training data
This checkpoint (CodeGen-NL 350M) was pre-trained on [the Pile](https://github.com/EleutherAI/the-pile), a large-scale curated dataset created by [EleutherAI](https://www.eleuther.ai/). Parts of the dataset include code data.
## Training procedure
CodeGen was trained using cross-entropy loss to maximize the likelihood of sequential inputs.
The family of models is trained on multiple TPU-v4-512 instances by Google, leveraging data and model parallelism.
See Section 2.3 of the [paper](https://arxiv.org/abs/2203.13474) for more details.
## Evaluation results
We evaluate our models on two code generation benchmarks: HumanEval and MTPB. Please refer to the [paper](https://arxiv.org/abs/2203.13474) for more details.
## Intended Use and Limitations
As an autoregressive language model, CodeGen is capable of extracting features from given natural language and programming language texts, and of computing their likelihood.
However, the model is intended for and best at **program synthesis**, that is, generating executable code given English prompts, where the prompts should be in the form of a comment string. The model can complete partially-generated code as well.
## How to use
This model can be easily loaded using the `AutoModelForCausalLM` functionality:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("Salesforce/codegen-350M-nl")
model = AutoModelForCausalLM.from_pretrained("Salesforce/codegen-350M-nl")
text = "def hello_world():"
input_ids = tokenizer(text, return_tensors="pt").input_ids
generated_ids = model.generate(input_ids, max_length=128)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
```
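Since the model is best prompted with English instructions written as comment strings (see above), a small helper for building such prompts can be useful. This is a sketch — the `make_comment_prompt` name is our own, not part of the `transformers` API:

```python
def make_comment_prompt(instruction: str) -> str:
    """Wrap a natural-language instruction as a Python comment-string prompt."""
    lines = instruction.strip().splitlines()
    return "\n".join(f"# {line}" for line in lines) + "\n"

# The resulting string can be tokenized and passed to model.generate()
# exactly as in the snippet above.
prompt = make_comment_prompt("write a function that returns the sum of two numbers")
print(prompt)
```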
## BibTeX entry and citation info
```bibtex
@article{Nijkamp2022ACP,
title={A Conversational Paradigm for Program Synthesis},
author={Nijkamp, Erik and Pang, Bo and Hayashi, Hiroaki and Tu, Lifu and Wang, Huan and Zhou, Yingbo and Savarese, Silvio and Xiong, Caiming},
journal={arXiv preprint},
year={2022}
}
```
| 2,828 | [
[
-0.037811279296875,
-0.04547119140625,
0.0014324188232421875,
0.0229034423828125,
0.0018796920776367188,
0.0197906494140625,
-0.023529052734375,
-0.0260772705078125,
-0.00994873046875,
0.030548095703125,
-0.038421630859375,
-0.042816162109375,
-0.028594970703125,
0.011016845703125,
-0.031646728515625,
0.0845947265625,
-0.00611114501953125,
-0.005916595458984375,
-0.01268768310546875,
0.0022029876708984375,
-0.01216888427734375,
-0.056640625,
-0.0173492431640625,
-0.020294189453125,
0.0012235641479492188,
0.003204345703125,
0.038848876953125,
0.069091796875,
0.0262298583984375,
0.0211944580078125,
0.0018205642700195312,
-0.011260986328125,
-0.02728271484375,
0.0008764266967773438,
0.0174560546875,
-0.03704833984375,
-0.04254150390625,
0.004589080810546875,
0.0318603515625,
0.048583984375,
-0.00727081298828125,
0.0276336669921875,
0.004291534423828125,
0.031707763671875,
-0.02886962890625,
0.0226593017578125,
-0.052825927734375,
0.01016998291015625,
0.00820159912109375,
-0.0190582275390625,
-0.030670166015625,
-0.027191162109375,
-0.00457000732421875,
-0.0212860107421875,
0.0413818359375,
-0.003055572509765625,
0.07464599609375,
0.019256591796875,
-0.006923675537109375,
-0.025146484375,
-0.048828125,
0.0697021484375,
-0.08575439453125,
0.0181884765625,
0.0121002197265625,
-0.00611114501953125,
-0.017120361328125,
-0.060302734375,
-0.036285400390625,
-0.0197906494140625,
-0.0004456043243408203,
-0.004169464111328125,
-0.02362060546875,
0.0142974853515625,
0.0443115234375,
0.0428466796875,
-0.06982421875,
-0.001995086669921875,
-0.0540771484375,
-0.0301361083984375,
0.05364990234375,
0.0188751220703125,
0.0260009765625,
-0.0301971435546875,
-0.04046630859375,
-0.0175018310546875,
-0.058258056640625,
-0.0004444122314453125,
0.0223541259765625,
0.0084381103515625,
-0.028594970703125,
0.0245513916015625,
-0.0114898681640625,
0.0736083984375,
-0.017242431640625,
0.0015535354614257812,
0.042877197265625,
-0.0440673828125,
-0.0304412841796875,
-0.01308441162109375,
0.0947265625,
0.007213592529296875,
0.01412200927734375,
-0.00787353515625,
-0.019256591796875,
0.0115509033203125,
-0.0011205673217773438,
-0.07550048828125,
-0.0394287109375,
0.017578125,
-0.055816650390625,
-0.0491943359375,
0.0066070556640625,
-0.05816650390625,
0.00830841064453125,
-0.0093231201171875,
0.00882720947265625,
-0.0274505615234375,
-0.0239105224609375,
0.0086669921875,
-0.0081024169921875,
0.0272216796875,
-0.01507568359375,
-0.061798095703125,
-0.00319671630859375,
0.030731201171875,
0.048370361328125,
-0.0098876953125,
-0.038055419921875,
-0.02142333984375,
-0.00705718994140625,
-0.028411865234375,
0.0413818359375,
-0.03863525390625,
-0.032989501953125,
-0.0006203651428222656,
0.0079498291015625,
-0.0132293701171875,
-0.048187255859375,
0.008453369140625,
-0.0218963623046875,
0.02813720703125,
0.008056640625,
-0.0418701171875,
-0.0076751708984375,
0.0026035308837890625,
-0.034332275390625,
0.09149169921875,
0.013458251953125,
-0.041168212890625,
0.0237579345703125,
-0.048065185546875,
-0.02197265625,
-0.003574371337890625,
-0.0261077880859375,
-0.0183258056640625,
0.0015420913696289062,
-0.003574371337890625,
0.0290069580078125,
-0.033294677734375,
0.034423828125,
-0.0242919921875,
-0.0142364501953125,
0.0190277099609375,
-0.0222320556640625,
0.06890869140625,
0.040771484375,
-0.040283203125,
0.00801849365234375,
-0.06695556640625,
0.020111083984375,
0.01898193359375,
-0.03350830078125,
-0.0007939338684082031,
-0.00878143310546875,
-0.0100555419921875,
0.0430908203125,
0.02215576171875,
-0.0260009765625,
0.0291748046875,
-0.04345703125,
0.052276611328125,
0.0345458984375,
0.0009527206420898438,
0.035980224609375,
-0.0256195068359375,
0.0655517578125,
-0.01088714599609375,
0.00775146484375,
-0.03839111328125,
-0.03448486328125,
-0.06060791015625,
-0.005504608154296875,
0.0254058837890625,
0.046661376953125,
-0.052581787109375,
0.054901123046875,
-0.0128173828125,
-0.042724609375,
-0.0247802734375,
0.01690673828125,
0.050384521484375,
0.0186920166015625,
0.0100555419921875,
-0.0183258056640625,
-0.06951904296875,
-0.060150146484375,
-0.01178741455078125,
-0.039215087890625,
-0.0031452178955078125,
0.0036830902099609375,
0.037811279296875,
-0.023681640625,
0.066650390625,
-0.04547119140625,
0.007663726806640625,
-0.027496337890625,
0.033721923828125,
0.01947021484375,
0.0548095703125,
0.04632568359375,
-0.035308837890625,
-0.03076171875,
-0.0272216796875,
-0.056884765625,
-0.021728515625,
-0.01532745361328125,
-0.01502227783203125,
0.035247802734375,
0.0648193359375,
-0.0289459228515625,
0.0278167724609375,
0.057830810546875,
-0.01557159423828125,
0.036956787109375,
-0.01122283935546875,
0.0016107559204101562,
-0.08856201171875,
0.0221405029296875,
-0.01446533203125,
-0.0021953582763671875,
-0.0545654296875,
0.0081634521484375,
-0.0018405914306640625,
-0.0157318115234375,
-0.03057861328125,
0.034393310546875,
-0.0445556640625,
-0.00021660327911376953,
-0.0177001953125,
-0.010467529296875,
0.00586700439453125,
0.06488037109375,
0.0035381317138671875,
0.0699462890625,
0.040679931640625,
-0.057098388671875,
0.01244354248046875,
0.020233154296875,
-0.01861572265625,
-0.00921630859375,
-0.0633544921875,
0.0186309814453125,
0.019012451171875,
0.0150146484375,
-0.07830810546875,
-0.0055084228515625,
0.01580810546875,
-0.05194091796875,
0.01387786865234375,
-0.0165863037109375,
-0.05462646484375,
-0.04248046875,
-0.00894927978515625,
0.03936767578125,
0.06597900390625,
-0.018035888671875,
0.0308380126953125,
0.03326416015625,
-0.01316070556640625,
-0.01548004150390625,
-0.06915283203125,
0.006923675537109375,
-0.0082550048828125,
-0.0538330078125,
0.0240325927734375,
-0.032684326171875,
-0.0004191398620605469,
-0.0094146728515625,
0.00592041015625,
-0.00304412841796875,
-0.00850677490234375,
0.0118255615234375,
0.039794921875,
-0.0138092041015625,
0.00037169456481933594,
-0.015533447265625,
-0.011566162109375,
0.013946533203125,
-0.0181121826171875,
0.064697265625,
-0.0242462158203125,
-0.00433349609375,
-0.00007671117782592773,
0.016693115234375,
0.04052734375,
-0.035400390625,
0.041015625,
0.051544189453125,
-0.0198822021484375,
-0.025634765625,
-0.03076171875,
-0.01324462890625,
-0.0361328125,
0.034637451171875,
-0.02288818359375,
-0.05487060546875,
0.048187255859375,
0.0149688720703125,
0.01506805419921875,
0.039825439453125,
0.0570068359375,
0.0270233154296875,
0.09051513671875,
0.046478271484375,
-0.0002734661102294922,
0.03802490234375,
-0.041900634765625,
0.020294189453125,
-0.046661376953125,
-0.0092926025390625,
-0.04736328125,
0.007595062255859375,
-0.053192138671875,
-0.041473388671875,
0.0100860595703125,
-0.0008625984191894531,
-0.0168914794921875,
0.053863525390625,
-0.0587158203125,
0.01132965087890625,
0.0440673828125,
-0.0225677490234375,
0.00624847412109375,
-0.0035839080810546875,
-0.01065826416015625,
0.005458831787109375,
-0.07342529296875,
-0.04241943359375,
0.08087158203125,
0.0243377685546875,
0.062103271484375,
-0.01184844970703125,
0.07177734375,
0.006534576416015625,
0.0095977783203125,
-0.04168701171875,
0.054229736328125,
-0.00965118408203125,
-0.053955078125,
0.00475311279296875,
-0.05145263671875,
-0.0694580078125,
0.003215789794921875,
-0.0001823902130126953,
-0.042510986328125,
0.0196380615234375,
0.0193023681640625,
-0.027801513671875,
0.0166473388671875,
-0.06463623046875,
0.07989501953125,
-0.01337432861328125,
-0.003414154052734375,
0.005428314208984375,
-0.052276611328125,
0.03662109375,
-0.007495880126953125,
0.023773193359375,
-0.01084136962890625,
0.016357421875,
0.07086181640625,
-0.03369140625,
0.05865478515625,
-0.00897979736328125,
-0.01291656494140625,
0.025054931640625,
0.0125274658203125,
0.0306396484375,
-0.0153656005859375,
-0.00197601318359375,
0.027191162109375,
0.016876220703125,
-0.02325439453125,
-0.0186309814453125,
0.0499267578125,
-0.04693603515625,
-0.034210205078125,
-0.0224609375,
-0.046844482421875,
-0.003444671630859375,
0.0277252197265625,
0.04351806640625,
0.04736328125,
-0.01549530029296875,
0.01776123046875,
0.046478271484375,
-0.052276611328125,
0.047149658203125,
0.02978515625,
-0.028045654296875,
-0.02679443359375,
0.0706787109375,
-0.002689361572265625,
0.038787841796875,
0.014251708984375,
-0.0016689300537109375,
-0.0016078948974609375,
-0.0268707275390625,
-0.02935791015625,
0.01019287109375,
-0.050384521484375,
-0.024810791015625,
-0.06280517578125,
-0.0372314453125,
-0.045562744140625,
-0.003414154052734375,
-0.02813720703125,
-0.0224609375,
-0.0184783935546875,
-0.000036716461181640625,
0.01088714599609375,
0.039581298828125,
-0.00743865966796875,
-0.0005211830139160156,
-0.07080078125,
0.0306396484375,
0.0030364990234375,
0.026824951171875,
-0.00463104248046875,
-0.039642333984375,
-0.02386474609375,
0.0146636962890625,
-0.01360321044921875,
-0.058349609375,
0.0469970703125,
0.0128173828125,
0.047882080078125,
0.0040283203125,
0.01027679443359375,
0.033538818359375,
-0.024139404296875,
0.06719970703125,
0.00971221923828125,
-0.0701904296875,
0.03619384765625,
-0.02301025390625,
0.045318603515625,
0.036285400390625,
0.00022602081298828125,
-0.031585693359375,
-0.050384521484375,
-0.0594482421875,
-0.07086181640625,
0.0665283203125,
0.0213165283203125,
0.0233306884765625,
-0.011505126953125,
-0.0010976791381835938,
0.00555419921875,
0.0212554931640625,
-0.059295654296875,
-0.034454345703125,
-0.021453857421875,
-0.0212554931640625,
0.0030002593994140625,
-0.0204925537109375,
0.001617431640625,
-0.0200347900390625,
0.0577392578125,
-0.0010318756103515625,
0.05023193359375,
0.0103912353515625,
-0.0160675048828125,
-0.005889892578125,
0.01068878173828125,
0.0460205078125,
0.043792724609375,
-0.019561767578125,
-0.0089874267578125,
-0.0113372802734375,
-0.02374267578125,
0.0010938644409179688,
0.0224151611328125,
-0.01049041748046875,
0.015350341796875,
0.0211639404296875,
0.07293701171875,
0.0045166015625,
-0.054779052734375,
0.041351318359375,
-0.006778717041015625,
-0.041900634765625,
-0.03533935546875,
0.0198974609375,
-0.0047607421875,
0.003307342529296875,
0.0283050537109375,
0.01477813720703125,
0.008270263671875,
-0.032440185546875,
0.0239105224609375,
0.0179290771484375,
-0.01666259765625,
-0.0177459716796875,
0.06378173828125,
0.00997161865234375,
-0.0141143798828125,
0.03790283203125,
-0.033172607421875,
-0.046661376953125,
0.06866455078125,
0.03411865234375,
0.08642578125,
-0.002239227294921875,
0.01412200927734375,
0.0472412109375,
0.0232696533203125,
-0.0009746551513671875,
0.0234375,
-0.00037479400634765625,
-0.04736328125,
-0.0254669189453125,
-0.05145263671875,
0.004383087158203125,
0.01580810546875,
-0.048004150390625,
0.019378662109375,
-0.0295867919921875,
-0.0094451904296875,
-0.006885528564453125,
-0.017425537109375,
-0.05718994140625,
0.00806427001953125,
0.01873779296875,
0.06719970703125,
-0.06549072265625,
0.07122802734375,
0.05303955078125,
-0.050567626953125,
-0.0721435546875,
0.0095062255859375,
-0.008453369140625,
-0.059356689453125,
0.0443115234375,
0.005115509033203125,
0.0117950439453125,
0.0215301513671875,
-0.05401611328125,
-0.06488037109375,
0.068603515625,
0.01306915283203125,
-0.0288238525390625,
-0.01499176025390625,
0.0059661865234375,
0.042205810546875,
-0.03570556640625,
0.004085540771484375,
0.04046630859375,
0.01094818115234375,
0.0009036064147949219,
-0.06317138671875,
0.01261138916015625,
-0.051025390625,
0.0200653076171875,
0.0090179443359375,
-0.0439453125,
0.07293701171875,
-0.00864410400390625,
-0.0096435546875,
0.0234222412109375,
0.044281005859375,
0.032470703125,
-0.004695892333984375,
0.03118896484375,
0.0200653076171875,
0.0299530029296875,
-0.009185791015625,
0.0711669921875,
-0.074951171875,
0.04229736328125,
0.07244873046875,
-0.0014047622680664062,
0.04022216796875,
0.0256805419921875,
-0.0181732177734375,
0.0517578125,
0.039459228515625,
-0.03167724609375,
0.03436279296875,
0.0233306884765625,
0.001438140869140625,
-0.01311492919921875,
0.01380157470703125,
-0.046966552734375,
0.02783203125,
0.023834228515625,
-0.054046630859375,
-0.004032135009765625,
0.00016164779663085938,
0.0131683349609375,
-0.01499176025390625,
0.01377105712890625,
0.05426025390625,
0.00492095947265625,
-0.0733642578125,
0.08758544921875,
0.0089263916015625,
0.054412841796875,
-0.04998779296875,
-0.00434112548828125,
-0.0247344970703125,
0.01032257080078125,
-0.0117340087890625,
-0.028594970703125,
0.0181884765625,
0.01123046875,
-0.0178070068359375,
-0.0217742919921875,
0.041290283203125,
-0.042999267578125,
-0.03350830078125,
0.01488494873046875,
0.0301971435546875,
0.01261138916015625,
0.016845703125,
-0.0728759765625,
0.018035888671875,
0.0257720947265625,
-0.024505615234375,
0.0101318359375,
0.016754150390625,
0.01050567626953125,
0.038543701171875,
0.053070068359375,
-0.0009512901306152344,
0.021148681640625,
0.001895904541015625,
0.06048583984375,
-0.052825927734375,
-0.03765869140625,
-0.059967041015625,
0.039215087890625,
0.00811004638671875,
-0.027099609375,
0.060272216796875,
0.053741455078125,
0.07135009765625,
-0.021453857421875,
0.08905029296875,
-0.044097900390625,
0.0224456787109375,
-0.0345458984375,
0.03521728515625,
-0.038238525390625,
0.03277587890625,
-0.0307464599609375,
-0.07171630859375,
-0.01265716552734375,
0.0439453125,
-0.016815185546875,
0.0265045166015625,
0.054412841796875,
0.08221435546875,
0.01216888427734375,
-0.0139312744140625,
0.0214996337890625,
0.025054931640625,
0.032928466796875,
0.058258056640625,
0.0535888671875,
-0.047821044921875,
0.06414794921875,
-0.0263214111328125,
-0.0166473388671875,
-0.01139068603515625,
-0.058013916015625,
-0.07049560546875,
-0.04815673828125,
-0.0374755859375,
-0.046722412109375,
-0.0113067626953125,
0.08001708984375,
0.057098388671875,
-0.06170654296875,
-0.03265380859375,
-0.035980224609375,
-0.021209716796875,
0.0005369186401367188,
-0.0222625732421875,
0.01445770263671875,
-0.04290771484375,
-0.055267333984375,
0.02093505859375,
0.007228851318359375,
0.006290435791015625,
-0.03271484375,
-0.0169677734375,
-0.01329803466796875,
0.0072174072265625,
0.020050048828125,
0.03466796875,
-0.04241943359375,
0.00745391845703125,
0.018829345703125,
-0.0367431640625,
0.015380859375,
0.06146240234375,
-0.050689697265625,
0.03802490234375,
0.06488037109375,
0.02398681640625,
0.032501220703125,
-0.00937652587890625,
0.0693359375,
-0.055206298828125,
0.02667236328125,
0.00679779052734375,
0.032135009765625,
0.018310546875,
-0.013092041015625,
0.036102294921875,
0.0255584716796875,
-0.046722412109375,
-0.06964111328125,
0.0194549560546875,
-0.04974365234375,
-0.03167724609375,
0.11767578125,
-0.01239013671875,
-0.01513671875,
-0.00006985664367675781,
-0.0127410888671875,
0.0275421142578125,
-0.0157470703125,
0.048675537109375,
0.039886474609375,
0.005687713623046875,
-0.01105499267578125,
-0.0491943359375,
0.049285888671875,
0.0260009765625,
-0.0538330078125,
0.01318359375,
0.03521728515625,
0.020477294921875,
0.00888824462890625,
0.052276611328125,
-0.0254058837890625,
0.0214996337890625,
0.0060272216796875,
0.052459716796875,
-0.01016998291015625,
-0.00365447998046875,
-0.02813720703125,
-0.003787994384765625,
0.008148193359375,
-0.0124359130859375
]
] |
dangvantuan/sentence-camembert-base | 2022-03-11T17:02:50.000Z | [
"transformers",
"pytorch",
"camembert",
"feature-extraction",
"Text",
"Sentence Similarity",
"Sentence-Embedding",
"camembert-base",
"sentence-similarity",
"fr",
"dataset:stsb_multi_mt",
"arxiv:1908.10084",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | dangvantuan | null | null | dangvantuan/sentence-camembert-base | 8 | 14,675 | transformers | 2022-03-11T16:16:15 | ---
pipeline_tag: sentence-similarity
language: fr
datasets:
- stsb_multi_mt
tags:
- Text
- Sentence Similarity
- Sentence-Embedding
- camembert-base
license: apache-2.0
model-index:
- name: sentence-camembert-base by Van Tuan DANG
results:
- task:
name: Sentence-Embedding
type: Text Similarity
dataset:
name: Text Similarity fr
type: stsb_multi_mt
args: fr
metrics:
- name: Test Pearson correlation coefficient
type: Pearson_correlation_coefficient
value: xx.xx
---
## A pre-trained sentence embedding model delivering state-of-the-art sentence embeddings for French
The model is fine-tuned from the pre-trained [facebook/camembert-base](https://huggingface.co/camembert/camembert-base) using
[Siamese BERT-Networks with 'sentence-transformers'](https://www.sbert.net/) on the [stsb](https://huggingface.co/datasets/stsb_multi_mt/viewer/fr/train) dataset.
## Usage
The model can be used directly (without a language model) as follows:
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("dangvantuan/sentence-camembert-base")
sentences = ["Un avion est en train de décoller.",
"Un homme joue d'une grande flûte.",
"Un homme étale du fromage râpé sur une pizza.",
"Une personne jette un chat au plafond.",
"Une personne est en train de plier un morceau de papier.",
]
embeddings = model.encode(sentences)
```
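The embeddings produced above are typically compared with cosine similarity. A minimal, self-contained sketch (the vectors here are illustrative stand-ins, not actual model outputs):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy vectors standing in for two sentence embeddings
e1 = [0.2, 0.1, 0.4]
e2 = [0.2, 0.1, 0.4]
print(cosine_similarity(e1, e2))  # identical vectors give a similarity close to 1.0
```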
## Evaluation
The model can be evaluated as follows on the French test data of stsb.
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.readers import InputExample
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator
from datasets import load_dataset
def convert_dataset(dataset):
    dataset_samples = []
    for df in dataset:
        score = float(df['similarity_score']) / 5.0  # Normalize score to range 0 ... 1
        inp_example = InputExample(texts=[df['sentence1'], df['sentence2']], label=score)
        dataset_samples.append(inp_example)
    return dataset_samples
# Loading the dataset for evaluation
df_dev = load_dataset("stsb_multi_mt", name="fr", split="dev")
df_test = load_dataset("stsb_multi_mt", name="fr", split="test")
# Convert the dataset for evaluation
# For Dev set:
dev_samples = convert_dataset(df_dev)
val_evaluator = EmbeddingSimilarityEvaluator.from_input_examples(dev_samples, name='sts-dev')
val_evaluator(model, output_path="./")
# For Test set:
test_samples = convert_dataset(df_test)
test_evaluator = EmbeddingSimilarityEvaluator.from_input_examples(test_samples, name='sts-test')
test_evaluator(model, output_path="./")
```
**Test Result**:
The performance is measured using Pearson and Spearman correlation:
- On dev
| Model | Pearson correlation | Spearman correlation | #params |
| ------------- | ------------- | ------------- |------------- |
| [dangvantuan/sentence-camembert-base](https://huggingface.co/dangvantuan/sentence-camembert-base)| 86.73 |86.54 | 110M |
| [distiluse-base-multilingual-cased](https://huggingface.co/sentence-transformers/distiluse-base-multilingual-cased) | 79.22 | 79.16|135M |
- On test
| Model | Pearson correlation | Spearman correlation |
| ------------- | ------------- | ------------- |
| [dangvantuan/sentence-camembert-base](https://huggingface.co/dangvantuan/sentence-camembert-base)| 82.36 | 81.64|
| [distiluse-base-multilingual-cased](https://huggingface.co/sentence-transformers/distiluse-base-multilingual-cased) | 78.62 | 77.48|
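For intuition, the two correlation metrics above can be sketched in plain NumPy — this is an illustrative implementation, not the one used by `EmbeddingSimilarityEvaluator` (note the Spearman sketch does no tie handling):

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation between two score lists."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def spearman(x, y):
    """Spearman correlation: Pearson on rank-transformed data (no tie handling)."""
    rank = lambda v: np.argsort(np.argsort(np.asarray(v))).astype(float)
    return pearson(rank(x), rank(y))

preds = [0.9, 0.1, 0.5, 0.7]  # model cosine similarities
gold = [1.0, 0.0, 0.4, 0.8]   # normalized human judgments
print(pearson(preds, gold), spearman(preds, gold))
```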
## Citation
@article{reimers2019sentence,
title={Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks},
author={Nils Reimers, Iryna Gurevych},
journal={https://arxiv.org/abs/1908.10084},
year={2019}
}
@article{martin2020camembert,
title={CamemBERT: a Tasty French Language Model},
author={Martin, Louis and Muller, Benjamin and Su{\'a}rez, Pedro Javier Ortiz and Dupont, Yoann and Romary, Laurent and de la Clergerie, {\'E}ric Villemonte and Seddah, Djam{\'e} and Sagot, Beno{\^\i}t},
journal={Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
year={2020}
} | 4,240 | [
[
-0.01302337646484375,
-0.0802001953125,
0.0294647216796875,
0.0458984375,
-0.01152801513671875,
-0.00556182861328125,
-0.03228759765625,
0.001110076904296875,
0.0275115966796875,
0.0171356201171875,
-0.0273895263671875,
-0.044189453125,
-0.05047607421875,
0.005218505859375,
-0.0178985595703125,
0.0548095703125,
-0.02471923828125,
0.0176544189453125,
-0.025299072265625,
-0.020477294921875,
-0.0135650634765625,
-0.059417724609375,
-0.0207672119140625,
-0.0035037994384765625,
0.0164642333984375,
0.009979248046875,
0.034332275390625,
0.018890380859375,
0.031585693359375,
0.0288543701171875,
-0.005035400390625,
0.01197052001953125,
-0.0257110595703125,
0.00557708740234375,
0.00894927978515625,
-0.0301361083984375,
-0.021392822265625,
0.0008854866027832031,
0.04217529296875,
0.04168701171875,
-0.0015621185302734375,
-0.0104522705078125,
0.010498046875,
0.04461669921875,
-0.031524658203125,
0.03179931640625,
-0.0244598388671875,
0.008819580078125,
0.0012693405151367188,
-0.01085662841796875,
-0.04150390625,
-0.0394287109375,
0.0058746337890625,
-0.028167724609375,
0.006458282470703125,
0.0007987022399902344,
0.0914306640625,
0.0093841552734375,
-0.0240936279296875,
-0.017730712890625,
-0.03076171875,
0.0687255859375,
-0.05267333984375,
0.0458984375,
0.0201416015625,
0.003498077392578125,
-0.0132598876953125,
-0.060150146484375,
-0.046478271484375,
-0.025238037109375,
-0.027374267578125,
0.03179931640625,
-0.0222015380859375,
-0.01517486572265625,
0.00974273681640625,
0.021453857421875,
-0.05047607421875,
-0.00244140625,
-0.01641845703125,
-0.0081634521484375,
0.05389404296875,
-0.01430511474609375,
0.00943756103515625,
-0.036041259765625,
-0.03143310546875,
-0.042327880859375,
-0.0289764404296875,
0.017486572265625,
0.0200653076171875,
0.0268096923828125,
-0.0303955078125,
0.058990478515625,
-0.00019180774688720703,
0.04547119140625,
-0.007427215576171875,
0.0080718994140625,
0.060577392578125,
-0.040924072265625,
-0.019989013671875,
-0.01007080078125,
0.08984375,
0.018768310546875,
0.036041259765625,
0.002960205078125,
-0.0134429931640625,
0.0127410888671875,
-0.00428009033203125,
-0.056915283203125,
-0.031005859375,
0.005443572998046875,
-0.0253448486328125,
-0.011749267578125,
0.01331329345703125,
-0.04901123046875,
0.008453369140625,
-0.001956939697265625,
0.049346923828125,
-0.057891845703125,
-0.006374359130859375,
0.01849365234375,
-0.012725830078125,
0.0139617919921875,
-0.00806427001953125,
-0.040618896484375,
0.013214111328125,
0.034637451171875,
0.06329345703125,
0.00611114501953125,
-0.021728515625,
-0.034637451171875,
-0.0261383056640625,
-0.01337432861328125,
0.043609619140625,
-0.03656005859375,
-0.0172882080078125,
0.0106048583984375,
0.0177459716796875,
-0.033111572265625,
-0.0164337158203125,
0.05474853515625,
-0.01435089111328125,
0.039886474609375,
-0.01313018798828125,
-0.06439208984375,
-0.02301025390625,
0.011749267578125,
-0.04290771484375,
0.08013916015625,
0.0186004638671875,
-0.052886962890625,
0.0238189697265625,
-0.044921875,
-0.0374755859375,
-0.0201416015625,
-0.01947021484375,
-0.056488037109375,
0.006694793701171875,
0.05322265625,
0.06475830078125,
-0.0038394927978515625,
0.01233673095703125,
-0.0224151611328125,
-0.0203704833984375,
0.0305328369140625,
-0.033599853515625,
0.075927734375,
0.0046234130859375,
-0.0130767822265625,
-0.0025959014892578125,
-0.04913330078125,
0.00506591796875,
0.021148681640625,
-0.0211639404296875,
-0.0198211669921875,
-0.0063018798828125,
0.023681640625,
0.0027008056640625,
0.003292083740234375,
-0.041473388671875,
0.0086212158203125,
-0.0399169921875,
0.05047607421875,
0.05072021484375,
0.0072174072265625,
0.01335906982421875,
-0.033782958984375,
0.0304107666015625,
0.01837158203125,
-0.0005712509155273438,
-0.0180816650390625,
-0.03692626953125,
-0.06341552734375,
-0.03839111328125,
0.045654296875,
0.059173583984375,
-0.045074462890625,
0.07574462890625,
-0.03924560546875,
-0.03656005859375,
-0.04656982421875,
0.006134033203125,
0.022430419921875,
0.01123046875,
0.04388427734375,
-0.01134490966796875,
-0.035736083984375,
-0.084716796875,
-0.01378631591796875,
0.0027256011962890625,
0.01485443115234375,
0.01432037353515625,
0.05487060546875,
-0.017059326171875,
0.06573486328125,
-0.046844482421875,
-0.0184173583984375,
-0.022735595703125,
-0.007480621337890625,
0.0307159423828125,
0.054168701171875,
0.063232421875,
-0.060699462890625,
-0.0574951171875,
-0.0196533203125,
-0.055450439453125,
0.018951416015625,
-0.00446319580078125,
-0.0023822784423828125,
0.019805908203125,
0.036346435546875,
-0.039459228515625,
0.026123046875,
0.0411376953125,
-0.0257110595703125,
0.028167724609375,
-0.021697998046875,
0.017242431640625,
-0.1131591796875,
-0.016845703125,
0.0203094482421875,
-0.01873779296875,
-0.03289794921875,
-0.0067596435546875,
0.005298614501953125,
0.0026111602783203125,
-0.0362548828125,
0.023406982421875,
-0.03515625,
0.02117919921875,
0.0382080078125,
0.03076171875,
0.00908660888671875,
0.056488037109375,
-0.0004477500915527344,
0.044342041015625,
0.046112060546875,
-0.031524658203125,
0.027191162109375,
0.040985107421875,
-0.045928955078125,
0.038543701171875,
-0.059173583984375,
-0.006282806396484375,
0.002899169921875,
0.0241851806640625,
-0.0821533203125,
0.00585174560546875,
0.0216064453125,
-0.040130615234375,
0.0208282470703125,
0.004505157470703125,
-0.054168701171875,
-0.032470703125,
-0.036956787109375,
0.01044464111328125,
0.04278564453125,
-0.036651611328125,
0.03448486328125,
0.0043182373046875,
-0.00170135498046875,
-0.045501708984375,
-0.08428955078125,
-0.016845703125,
-0.021697998046875,
-0.061187744140625,
0.019683837890625,
-0.009185791015625,
0.0013561248779296875,
0.01280975341796875,
0.0118865966796875,
-0.0024204254150390625,
-0.0039825439453125,
0.01096343994140625,
0.008697509765625,
-0.02099609375,
0.0224761962890625,
0.01012420654296875,
0.002819061279296875,
-0.0103302001953125,
-0.0007085800170898438,
0.049102783203125,
-0.03363037109375,
-0.0248565673828125,
-0.034942626953125,
0.0230712890625,
0.02392578125,
-0.021331787109375,
0.06915283203125,
0.0653076171875,
-0.0197296142578125,
0.0027942657470703125,
-0.0377197265625,
-0.0144195556640625,
-0.036041259765625,
0.043701171875,
-0.05352783203125,
-0.0718994140625,
0.0306549072265625,
0.00949859619140625,
-0.006053924560546875,
0.049774169921875,
0.04388427734375,
-0.0079498291015625,
0.06011962890625,
0.032958984375,
-0.0041351318359375,
0.032135009765625,
-0.045684814453125,
0.03369140625,
-0.051666259765625,
-0.01073455810546875,
-0.02703857421875,
-0.01904296875,
-0.06756591796875,
-0.03839111328125,
0.0179901123046875,
0.020355224609375,
-0.01078033447265625,
0.044097900390625,
-0.03515625,
0.00421142578125,
0.036346435546875,
0.0194091796875,
-0.0037746429443359375,
0.01898193359375,
-0.0384521484375,
-0.0179290771484375,
-0.06475830078125,
-0.034210205078125,
0.0792236328125,
0.034454345703125,
0.05108642578125,
0.00516510009765625,
0.05035400390625,
0.0086822509765625,
-0.00817108154296875,
-0.05743408203125,
0.03729248046875,
-0.029571533203125,
-0.01611328125,
-0.00742340087890625,
-0.037689208984375,
-0.072509765625,
0.0259857177734375,
-0.025177001953125,
-0.055511474609375,
0.0154266357421875,
-0.0047607421875,
-0.02557373046875,
0.01708984375,
-0.07373046875,
0.07232666015625,
-0.0177459716796875,
-0.0173187255859375,
-0.004199981689453125,
-0.0296478271484375,
0.0081634521484375,
0.005039215087890625,
0.0096893310546875,
0.0093841552734375,
0.0188751220703125,
0.06390380859375,
-0.0291748046875,
0.05511474609375,
0.0002853870391845703,
0.0030956268310546875,
0.0218048095703125,
0.0005335807800292969,
0.02496337890625,
0.0141448974609375,
-0.0037212371826171875,
0.0142822265625,
0.01131439208984375,
-0.0304107666015625,
-0.047210693359375,
0.060455322265625,
-0.062255859375,
-0.037689208984375,
-0.04278564453125,
-0.0362548828125,
-0.01074981689453125,
-0.005947113037109375,
0.0237274169921875,
0.04168701171875,
-0.02813720703125,
0.04681396484375,
0.032989501953125,
-0.03289794921875,
0.039093017578125,
0.0135498046875,
-0.0059356689453125,
-0.0307159423828125,
0.058258056640625,
0.007549285888671875,
0.00714874267578125,
0.057708740234375,
0.019073486328125,
-0.03106689453125,
-0.00811004638671875,
-0.0188140869140625,
0.02716064453125,
-0.044525146484375,
-0.0022563934326171875,
-0.07574462890625,
-0.035614013671875,
-0.057220458984375,
-0.0178680419921875,
-0.029022216796875,
-0.054962158203125,
-0.0208282470703125,
-0.020904541015625,
0.04632568359375,
0.032470703125,
-0.0136260986328125,
0.0239715576171875,
-0.0457763671875,
0.0019512176513671875,
0.00215911865234375,
0.01468658447265625,
-0.017425537109375,
-0.041351318359375,
-0.020660400390625,
0.01495361328125,
-0.022979736328125,
-0.0611572265625,
0.034332275390625,
0.0233154296875,
0.05096435546875,
0.0277557373046875,
-0.006824493408203125,
0.041656494140625,
-0.033782958984375,
0.07293701171875,
0.01317596435546875,
-0.0716552734375,
0.041717529296875,
0.00698089599609375,
0.01239013671875,
0.03558349609375,
0.040740966796875,
-0.035980224609375,
-0.0184478759765625,
-0.04852294921875,
-0.06915283203125,
0.036834716796875,
0.03497314453125,
0.01708984375,
-0.01459503173828125,
0.01244354248046875,
-0.0024547576904296875,
0.011993408203125,
-0.06982421875,
-0.0295562744140625,
-0.02813720703125,
-0.024810791015625,
-0.03985595703125,
-0.017608642578125,
-0.004726409912109375,
-0.041107177734375,
0.052154541015625,
0.01126861572265625,
0.0457763671875,
0.0268096923828125,
-0.0247802734375,
0.0209503173828125,
0.02618408203125,
0.04071044921875,
0.033172607421875,
-0.02288818359375,
0.00243377685546875,
0.01461029052734375,
-0.0205535888671875,
0.00875091552734375,
0.023468017578125,
0.0011453628540039062,
0.0105438232421875,
0.05047607421875,
0.0718994140625,
0.005680084228515625,
-0.035400390625,
0.057708740234375,
-0.00949859619140625,
-0.03436279296875,
-0.03875732421875,
-0.01143646240234375,
0.0096435546875,
0.0228424072265625,
0.0053558349609375,
0.0015993118286132812,
-0.0176544189453125,
-0.038299560546875,
0.034149169921875,
0.021697998046875,
-0.043060302734375,
-0.0179901123046875,
0.0362548828125,
-0.0009560585021972656,
-0.01438140869140625,
0.055389404296875,
-0.0223388671875,
-0.06329345703125,
0.034332275390625,
0.037109375,
0.0645751953125,
-0.011871337890625,
0.027801513671875,
0.04547119140625,
0.03436279296875,
-0.0031185150146484375,
0.022674560546875,
0.012115478515625,
-0.06927490234375,
-0.01526641845703125,
-0.043701171875,
0.017608642578125,
0.0065765380859375,
-0.0419921875,
0.0113677978515625,
-0.01526641845703125,
-0.00997161865234375,
-0.01291656494140625,
0.001605987548828125,
-0.066650390625,
0.00341796875,
-0.017791748046875,
0.057891845703125,
-0.08319091796875,
0.049896240234375,
0.05816650390625,
-0.0504150390625,
-0.051605224609375,
-0.004611968994140625,
-0.027435302734375,
-0.055572509765625,
0.050994873046875,
0.0161895751953125,
0.0232391357421875,
-0.00959014892578125,
-0.02294921875,
-0.05340576171875,
0.0694580078125,
0.01296234130859375,
-0.049102783203125,
0.0168914794921875,
0.007904052734375,
0.0477294921875,
-0.032806396484375,
0.035797119140625,
0.037750244140625,
0.04168701171875,
-0.00948333740234375,
-0.058502197265625,
0.01470184326171875,
-0.0276641845703125,
0.009979248046875,
-0.0052490234375,
-0.0604248046875,
0.07806396484375,
-0.0029773712158203125,
-0.004703521728515625,
-0.007190704345703125,
0.0560302734375,
0.0150604248046875,
-0.0057220458984375,
0.037750244140625,
0.0667724609375,
0.04510498046875,
-0.0251922607421875,
0.088134765625,
-0.02056884765625,
0.041107177734375,
0.0701904296875,
0.0005993843078613281,
0.072509765625,
0.0298614501953125,
-0.0248870849609375,
0.058990478515625,
0.052093505859375,
-0.019195556640625,
0.047576904296875,
0.0258636474609375,
0.007793426513671875,
-0.01354217529296875,
0.0158233642578125,
-0.034210205078125,
0.03875732421875,
0.024566650390625,
-0.037139892578125,
0.0003085136413574219,
0.01435089111328125,
0.02069091796875,
0.0223236083984375,
0.025177001953125,
0.0276641845703125,
0.0208892822265625,
-0.03466796875,
0.05712890625,
0.00641632080078125,
0.059417724609375,
-0.043060302734375,
0.0087127685546875,
-0.00902557373046875,
0.026519775390625,
-0.00688934326171875,
-0.05487060546875,
0.007701873779296875,
-0.0194854736328125,
0.0030918121337890625,
-0.014801025390625,
0.021087646484375,
-0.04449462890625,
-0.059600830078125,
0.04443359375,
0.0601806640625,
0.0040740966796875,
0.0055084228515625,
-0.0809326171875,
0.00408172607421875,
0.007190704345703125,
-0.040283203125,
-0.0016574859619140625,
0.028594970703125,
0.00949859619140625,
0.037506103515625,
0.033599853515625,
-0.00292205810546875,
0.00397491455078125,
0.019287109375,
0.061676025390625,
-0.041351318359375,
-0.04290771484375,
-0.07354736328125,
0.035125732421875,
-0.0120086669921875,
-0.022216796875,
0.0701904296875,
0.062347412109375,
0.0660400390625,
-0.01105499267578125,
0.0596923828125,
-0.01422882080078125,
0.0303497314453125,
-0.0474853515625,
0.049835205078125,
-0.055816650390625,
0.0011272430419921875,
-0.0275421142578125,
-0.0877685546875,
-0.02410888671875,
0.07073974609375,
-0.02288818359375,
0.0206298828125,
0.07861328125,
0.07281494140625,
-0.020172119140625,
-0.017333984375,
0.00927734375,
0.03985595703125,
0.0316162109375,
0.05389404296875,
0.03851318359375,
-0.062255859375,
0.0285186767578125,
-0.037628173828125,
-0.013702392578125,
-0.0189056396484375,
-0.05828857421875,
-0.0904541015625,
-0.0731201171875,
-0.0382080078125,
-0.0362548828125,
0.00702667236328125,
0.07818603515625,
0.042083740234375,
-0.070556640625,
-0.0160369873046875,
-0.0010852813720703125,
-0.00004482269287109375,
-0.035125732421875,
-0.0240020751953125,
0.0618896484375,
-0.016510009765625,
-0.05633544921875,
0.03472900390625,
-0.00414276123046875,
0.004413604736328125,
-0.01265716552734375,
-0.004119873046875,
-0.05718994140625,
0.01023101806640625,
0.055755615234375,
-0.0138397216796875,
-0.05029296875,
-0.02911376953125,
0.00048279762268066406,
-0.01389312744140625,
0.01198577880859375,
0.031951904296875,
-0.039764404296875,
0.034210205078125,
0.051666259765625,
0.043670654296875,
0.050811767578125,
-0.0020294189453125,
0.0399169921875,
-0.07281494140625,
0.024444580078125,
0.01062774658203125,
0.044952392578125,
0.031951904296875,
-0.01438140869140625,
0.04486083984375,
0.0223388671875,
-0.03253173828125,
-0.051513671875,
-0.0020160675048828125,
-0.08795166015625,
-0.0236968994140625,
0.08270263671875,
-0.0258636474609375,
-0.0168609619140625,
0.02410888671875,
-0.0035877227783203125,
0.027984619140625,
-0.036865234375,
0.034271240234375,
0.062744140625,
-0.001285552978515625,
-0.01348876953125,
-0.034210205078125,
0.0298614501953125,
0.045074462890625,
-0.038787841796875,
-0.02838134765625,
0.01488494873046875,
0.024810791015625,
0.0188446044921875,
0.029327392578125,
-0.00665283203125,
-0.00896453857421875,
0.0019025802612304688,
0.0000667572021484375,
-0.002056121826171875,
-0.0004642009735107422,
-0.0171051025390625,
0.00844573974609375,
-0.0367431640625,
-0.02508544921875
]
] |
transfo-xl-wt103 | 2023-01-24T14:49:49.000Z | [
"transformers",
"pytorch",
"tf",
"transfo-xl",
"text-generation",
"en",
"dataset:wikitext-103",
"arxiv:1901.02860",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | null | null | null | transfo-xl-wt103 | 6 | 14,672 | transformers | 2022-03-02T23:29:04 | ---
datasets:
- wikitext-103
tags:
- text-generation
language: en
model-index:
- name: transfo-xl-wt103
results: []
task:
name: Text Generation
type: text-generation
---
# Transfo-xl-wt103
## Table of Contents
- [Model Details](#model-details)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Citation Information](#citation-information)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)
## Model Details
**Model Description:**
The Transformer-XL model is a causal (uni-directional) transformer with relative (sinusoidal) positional embeddings that can reuse previously computed hidden states to attend to a longer context (memory). The model also uses adaptive softmax inputs and outputs (tied).
- **Developed by:** [Zihang Dai](dzihang@cs.cmu.edu), [Zhilin Yang](zhiliny@cs.cmu.edu), [Yiming Yang](yiming@cs.cmu.edu), [Jaime Carbonell](jgc@cs.cmu.edu), [Quoc V. Le](qvl@google.com), [Ruslan Salakhutdinov](rsalakhu@cs.cmu.edu)
- **Shared by:** HuggingFace team
- **Model Type:** Text Generation
- **Language(s):** English
- **License:** [More information needed]
- **Resources for more information:**
- [Research Paper](https://arxiv.org/pdf/1901.02860.pdf)
- [GitHub Repo](https://github.com/kimiyoung/transformer-xl)
- [HuggingFace Documentation](https://huggingface.co/docs/transformers/model_doc/transfo-xl#transformers.TransfoXLModel)
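The memory mechanism described above can be illustrated independently of the library: each new segment attends over a cache of previously computed hidden states, and the cache is then rolled forward. A toy sketch in plain NumPy (`attend_with_memory` and `mem_len` are illustrative names, not part of the real implementation):

```python
import numpy as np

def attend_with_memory(segment, mems, mem_len=4):
    """Toy segment-level recurrence: the current segment's queries see
    [cached states; current states] as keys/values, then the cache is
    updated with the most recent hidden states (detached in the real model)."""
    context = segment if mems is None else np.concatenate([mems, segment], axis=0)
    new_mems = context[-mem_len:]  # keep only the last mem_len positions
    return context, new_mems
```

Processing two 3-token segments with `mem_len=4` gives the second segment an effective attention context of 6 positions, which is how Transformer-XL attends beyond a fixed segment length.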
## Uses
#### Direct Use
This model can be used for text generation.
The authors envision further applications, noting in the [associated paper](https://arxiv.org/pdf/1901.02860.pdf):
> We envision interesting applications of Transformer-XL in the fields of text generation, unsupervised feature learning, image and speech modeling.
#### Misuse and Out-of-scope Use
The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
## Risks, Limitations and Biases
**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
## Training
#### Training Data
The authors describe the generation setup used with their best model in the [associated paper](https://arxiv.org/pdf/1901.02860.pdf):
> best model trained on the Wikitext-103 dataset. We seed our Transformer-XL with a context of at most 512 consecutive tokens randomly sampled from the test set of Wikitext-103. Then, we run Transformer-XL to generate a pre-defined number of tokens (500 or 1,000 in our case). For each generation step, we first find the top-40 probabilities of the next-step distribution and sample from top-40 tokens based on the re-normalized distribution. To help reading, we detokenize the context, the generated text and the reference text.
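The top-40 sampling step in the quote can be written out directly. A minimal NumPy sketch (a numerically stable softmax over raw logits; `k=40` matches the setting described above):

```python
import numpy as np

def sample_top_k(logits, k=40, rng=None):
    """Keep the k most probable tokens, re-normalize their probabilities,
    and sample the next token from that truncated distribution."""
    rng = rng if rng is not None else np.random.default_rng(0)
    probs = np.exp(logits - logits.max())   # stable softmax
    probs /= probs.sum()
    top_idx = np.argsort(probs)[-k:]        # indices of the k largest probs
    top_probs = probs[top_idx] / probs[top_idx].sum()  # re-normalize
    return int(rng.choice(top_idx, p=top_probs))
```

With `k=1` this reduces to greedy decoding; larger `k` trades determinism for diversity.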
The authors use the following pretraining corpora for the model, described in the [associated paper](https://arxiv.org/pdf/1901.02860.pdf):
- WikiText-103 (Merity et al., 2016),
#### Training Procedure
##### Preprocessing
The authors provide additional notes about preprocessing in the [associated paper](https://arxiv.org/pdf/1901.02860.pdf):
> Similar to but different from enwik8, text8 contains 100M processed Wikipedia characters created by lowercasing the text and removing any character other than the 26 letters a through z, and space. Due to the similarity, we simply adapt the best model and the same hyper-parameters on enwik8 to text8 without further tuning.
## Evaluation
#### Results
| Method | enwik8 (bpc) | text8 (bpc) | One Billion Word (ppl) | WT-103 (ppl) | PTB (ppl, w/o finetuning) |
|:--------------:|:------------:|:-----------:|:----------------------:|:------------:|:-------------------------:|
| Transformer-XL | 0.99 | 1.08 | 21.8 | 18.3 | 54.5 |
## Citation Information
```bibtex
@misc{https://doi.org/10.48550/arxiv.1901.02860,
doi = {10.48550/ARXIV.1901.02860},
url = {https://arxiv.org/abs/1901.02860},
author = {Dai, Zihang and Yang, Zhilin and Yang, Yiming and Carbonell, Jaime and Le, Quoc V. and Salakhutdinov, Ruslan},
keywords = {Machine Learning (cs.LG), Computation and Language (cs.CL), Machine Learning (stat.ML), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context},
publisher = {arXiv},
year = {2019},
copyright = {Creative Commons Attribution Non Commercial Share Alike 4.0 International}
}
```
## How to Get Started With the Model
```python
from transformers import TransfoXLTokenizer, TransfoXLModel
import torch
# Load the pretrained tokenizer and model
tokenizer = TransfoXLTokenizer.from_pretrained("transfo-xl-wt103")
model = TransfoXLModel.from_pretrained("transfo-xl-wt103")

# Encode a prompt and run a forward pass
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)

# Hidden states of the last layer: (batch_size, sequence_length, hidden_size)
last_hidden_states = outputs.last_hidden_state
```
| 5,443 | [
[
-0.033660888671875,
-0.041473388671875,
0.0196990966796875,
0.010162353515625,
-0.018585205078125,
-0.0157928466796875,
-0.026519775390625,
-0.02978515625,
0.0013599395751953125,
0.031036376953125,
-0.047882080078125,
-0.033935546875,
-0.0531005859375,
0.025909423828125,
-0.038238525390625,
0.0810546875,
-0.0083770751953125,
-0.01197052001953125,
-0.0009965896606445312,
-0.0158843994140625,
-0.0209503173828125,
-0.036346435546875,
-0.03271484375,
-0.031219482421875,
0.026458740234375,
0.012481689453125,
0.0531005859375,
0.04266357421875,
0.0228424072265625,
0.0247039794921875,
-0.0207366943359375,
-0.005466461181640625,
-0.047027587890625,
-0.01474761962890625,
0.0022449493408203125,
-0.0261077880859375,
-0.0183258056640625,
0.007289886474609375,
0.048309326171875,
0.06268310546875,
0.0015411376953125,
0.0186920166015625,
0.01067352294921875,
0.030059814453125,
-0.0274505615234375,
0.01363372802734375,
-0.050201416015625,
0.0158538818359375,
-0.01332855224609375,
0.00640869140625,
-0.047088623046875,
-0.0142822265625,
0.01018524169921875,
-0.05322265625,
0.03094482421875,
0.01377105712890625,
0.0902099609375,
0.01195526123046875,
-0.0341796875,
-0.00965118408203125,
-0.058837890625,
0.06488037109375,
-0.054351806640625,
0.0606689453125,
0.0235748291015625,
0.0009613037109375,
0.006351470947265625,
-0.08050537109375,
-0.07293701171875,
-0.00975799560546875,
-0.029693603515625,
0.019561767578125,
-0.03125,
-0.0069580078125,
0.036895751953125,
0.01444244384765625,
-0.03900146484375,
0.0015325546264648438,
-0.038604736328125,
-0.018829345703125,
0.03155517578125,
0.01033782958984375,
0.0152435302734375,
-0.036346435546875,
-0.03411865234375,
-0.02862548828125,
-0.036529541015625,
0.00922393798828125,
0.0295562744140625,
0.018035888671875,
-0.036529541015625,
0.038970947265625,
0.01165771484375,
0.0298309326171875,
0.025390625,
-0.00634765625,
0.05108642578125,
-0.02838134765625,
-0.0109710693359375,
-0.007595062255859375,
0.08721923828125,
0.01398468017578125,
0.013336181640625,
-0.02581787109375,
-0.0133819580078125,
-0.003826141357421875,
0.0242919921875,
-0.06671142578125,
0.0037384033203125,
0.005626678466796875,
-0.04400634765625,
-0.0360107421875,
0.0090179443359375,
-0.058563232421875,
0.0023746490478515625,
-0.0212554931640625,
0.04022216796875,
-0.042327880859375,
-0.0263671875,
0.0022830963134765625,
-0.01047515869140625,
0.0191192626953125,
-0.01525115966796875,
-0.064697265625,
0.0207977294921875,
0.025634765625,
0.0516357421875,
-0.0137939453125,
-0.035797119140625,
-0.01025390625,
0.0006093978881835938,
-0.01824951171875,
0.04498291015625,
-0.03033447265625,
-0.0355224609375,
-0.00017535686492919922,
0.021484375,
-0.01358795166015625,
-0.0230712890625,
0.05560302734375,
-0.0304412841796875,
0.04498291015625,
-0.00627899169921875,
-0.03900146484375,
-0.00852203369140625,
-0.00019550323486328125,
-0.056304931640625,
0.0927734375,
0.023773193359375,
-0.07940673828125,
0.0223236083984375,
-0.047088623046875,
-0.03204345703125,
-0.01230621337890625,
-0.00618743896484375,
-0.05450439453125,
0.002777099609375,
0.01399993896484375,
0.0299530029296875,
-0.016357421875,
0.025360107421875,
-0.0071868896484375,
-0.02069091796875,
0.012481689453125,
-0.03924560546875,
0.07269287109375,
0.01314544677734375,
-0.052978515625,
0.00862884521484375,
-0.0648193359375,
-0.0157928466796875,
0.0237579345703125,
-0.016571044921875,
-0.0169219970703125,
-0.01496124267578125,
0.021942138671875,
0.0258636474609375,
0.003780364990234375,
-0.049468994140625,
0.003208160400390625,
-0.04345703125,
0.027130126953125,
0.05108642578125,
-0.021484375,
0.0399169921875,
-0.0122833251953125,
0.02740478515625,
0.020355224609375,
0.022125244140625,
-0.01119232177734375,
-0.03558349609375,
-0.0654296875,
-0.00494384765625,
0.0308990478515625,
0.03973388671875,
-0.0650634765625,
0.04913330078125,
-0.035919189453125,
-0.046051025390625,
-0.037689208984375,
-0.006256103515625,
0.034820556640625,
0.058837890625,
0.046783447265625,
-0.005901336669921875,
-0.043609619140625,
-0.064697265625,
-0.001224517822265625,
-0.0035228729248046875,
-0.01116180419921875,
0.016265869140625,
0.041259765625,
-0.038848876953125,
0.070068359375,
-0.046844482421875,
-0.01454925537109375,
-0.0333251953125,
0.0191802978515625,
0.0173187255859375,
0.025634765625,
0.03985595703125,
-0.06634521484375,
-0.041351318359375,
-0.0171966552734375,
-0.0511474609375,
-0.01230621337890625,
-0.00371551513671875,
-0.01323699951171875,
0.0241241455078125,
0.04193115234375,
-0.058258056640625,
0.0301055908203125,
0.038543701171875,
-0.0423583984375,
0.049774169921875,
-0.0089874267578125,
-0.01104736328125,
-0.11492919921875,
0.0224761962890625,
0.00737762451171875,
-0.006641387939453125,
-0.06121826171875,
-0.0142669677734375,
0.0175933837890625,
-0.01763916015625,
-0.039703369140625,
0.04974365234375,
-0.0300140380859375,
0.020416259765625,
-0.016265869140625,
0.00244903564453125,
0.005161285400390625,
0.057952880859375,
0.008819580078125,
0.048858642578125,
0.03564453125,
-0.041290283203125,
0.004726409912109375,
0.031494140625,
-0.020782470703125,
0.024139404296875,
-0.054534912109375,
0.0019779205322265625,
-0.019744873046875,
0.022705078125,
-0.056121826171875,
0.00760650634765625,
0.01324462890625,
-0.0369873046875,
0.04364013671875,
0.019378662109375,
-0.03131103515625,
-0.0419921875,
-0.0169219970703125,
0.032470703125,
0.04913330078125,
-0.041717529296875,
0.0625,
0.0186920166015625,
0.00909423828125,
-0.05517578125,
-0.047698974609375,
-0.01403045654296875,
-0.02191162109375,
-0.0421142578125,
0.057281494140625,
-0.00450897216796875,
-0.0001061558723449707,
0.00614166259765625,
0.0114593505859375,
0.00942230224609375,
-0.00975799560546875,
0.0233917236328125,
0.0230712890625,
-0.00372314453125,
0.002666473388671875,
0.0005121231079101562,
-0.0233306884765625,
0.0014524459838867188,
-0.016632080078125,
0.04327392578125,
-0.00475311279296875,
0.0016603469848632812,
-0.031402587890625,
0.02667236328125,
0.03900146484375,
-0.006511688232421875,
0.0738525390625,
0.0745849609375,
-0.02972412109375,
-0.003124237060546875,
-0.036346435546875,
-0.0288848876953125,
-0.038116455078125,
0.040130615234375,
-0.0217132568359375,
-0.0594482421875,
0.046295166015625,
0.01263427734375,
0.01073455810546875,
0.059844970703125,
0.0419921875,
0.0079345703125,
0.07318115234375,
0.07525634765625,
-0.0091094970703125,
0.0369873046875,
-0.036895751953125,
0.027587890625,
-0.061920166015625,
-0.0196990966796875,
-0.04241943359375,
-0.01230621337890625,
-0.06683349609375,
-0.022674560546875,
0.0205230712890625,
-0.00959014892578125,
-0.0293731689453125,
0.054534912109375,
-0.0357666015625,
0.0034637451171875,
0.040924072265625,
0.003116607666015625,
0.005859375,
-0.0007867813110351562,
-0.037933349609375,
-0.010223388671875,
-0.048095703125,
-0.027130126953125,
0.07952880859375,
0.034912109375,
0.047210693359375,
-0.00576019287109375,
0.039794921875,
0.0032062530517578125,
0.0282745361328125,
-0.049591064453125,
0.036956787109375,
-0.01377105712890625,
-0.04071044921875,
-0.028717041015625,
-0.042144775390625,
-0.083984375,
0.01194000244140625,
-0.0241241455078125,
-0.055419921875,
0.01995849609375,
0.00482177734375,
-0.01654052734375,
0.035003662109375,
-0.055450439453125,
0.06634521484375,
0.00312042236328125,
-0.0128326416015625,
0.005954742431640625,
-0.051361083984375,
0.0167236328125,
-0.016845703125,
0.0193328857421875,
0.017059326171875,
0.01025390625,
0.0694580078125,
-0.02838134765625,
0.0650634765625,
-0.002628326416015625,
-0.007175445556640625,
0.0176849365234375,
-0.0214385986328125,
0.041717529296875,
-0.0189361572265625,
-0.0060577392578125,
0.04052734375,
-0.0125274658203125,
-0.007350921630859375,
-0.0391845703125,
0.032257080078125,
-0.07879638671875,
-0.0360107421875,
-0.0189971923828125,
-0.04364013671875,
-0.0013427734375,
0.0293731689453125,
0.05712890625,
0.045166015625,
0.001766204833984375,
0.0273590087890625,
0.0526123046875,
-0.0391845703125,
0.0374755859375,
0.031982421875,
-0.001369476318359375,
-0.0333251953125,
0.058441162109375,
0.0167694091796875,
0.0149993896484375,
0.044647216796875,
0.030059814453125,
-0.036346435546875,
-0.006072998046875,
-0.02081298828125,
0.03277587890625,
-0.04986572265625,
-0.0160064697265625,
-0.06988525390625,
-0.0296478271484375,
-0.0465087890625,
-0.0027828216552734375,
-0.02960205078125,
-0.0248260498046875,
-0.026214599609375,
-0.0160675048828125,
0.021484375,
0.03997802734375,
0.0015411376953125,
0.021087646484375,
-0.0433349609375,
0.024749755859375,
0.0139617919921875,
0.0270233154296875,
0.007350921630859375,
-0.0701904296875,
-0.0201568603515625,
-0.003856658935546875,
-0.04241943359375,
-0.060272216796875,
0.035186767578125,
0.00890350341796875,
0.047943115234375,
0.0301055908203125,
0.017181396484375,
0.036376953125,
-0.059661865234375,
0.06524658203125,
0.030792236328125,
-0.0721435546875,
0.0240631103515625,
-0.006069183349609375,
0.031280517578125,
0.0209503173828125,
0.04754638671875,
-0.04425048828125,
-0.0430908203125,
-0.0672607421875,
-0.0777587890625,
0.058746337890625,
0.031585693359375,
0.03216552734375,
-0.00449371337890625,
0.0200653076171875,
0.005626678466796875,
0.0005860328674316406,
-0.08538818359375,
-0.046783447265625,
-0.039703369140625,
-0.03607177734375,
-0.007244110107421875,
-0.0217132568359375,
0.00536346435546875,
-0.0303192138671875,
]
] |
pedramyazdipoor/persian_xlm_roberta_large | 2022-09-19T16:38:37.000Z | [
"transformers",
"pytorch",
"xlm-roberta",
"question-answering",
"arxiv:1911.02116",
"arxiv:2202.06219",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | question-answering | pedramyazdipoor | null | null | pedramyazdipoor/persian_xlm_roberta_large | 2 | 14,660 | transformers | 2022-09-18T15:46:12 | ## Persian XLM-RoBERTa Large For Question Answering Task
XLM-RoBERTa is a multilingual language model pre-trained on 2.5TB of filtered CommonCrawl data covering 100 languages. It was introduced in the paper [Unsupervised Cross-lingual Representation Learning at Scale](https://arxiv.org/abs/1911.02116v2) by Conneau et al.
The multilingual [XLM-RoBERTa large for QA on various languages](https://huggingface.co/deepset/xlm-roberta-large-squad2) is fine-tuned on various QA datasets, but not on PQuAD, which is the biggest Persian QA dataset so far. That model serves as our base model for fine-tuning.
Paper presenting PQuAD dataset: [arXiv:2202.06219](https://arxiv.org/abs/2202.06219)
---
## Introduction
This model is fine-tuned on the PQuAD train set and is ready to use.
Training takes a very long time, which encouraged me to publish this model to make life easier for those who need it.
## Hyperparameters of training
I set the batch size to 4 due to GPU memory limitations in Google Colab.
```
batch_size = 4
n_epochs = 1
base_LM_model = "deepset/xlm-roberta-large-squad2"
max_seq_len = 256
learning_rate = 3e-5
evaluation_strategy = "epoch"
save_strategy = "epoch"
warmup_ratio = 0.1
gradient_accumulation_steps = 8
weight_decay = 0.01
```
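Note that with gradient accumulation, the effective batch size seen by the optimizer is the per-step batch size multiplied by the accumulation steps; a quick check with the values above:

```python
batch_size = 4
gradient_accumulation_steps = 8

# Gradients are accumulated over 8 steps before each optimizer update,
# so each update effectively sees 4 * 8 = 32 examples.
effective_batch_size = batch_size * gradient_accumulation_steps
print(effective_batch_size)  # 32
```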
## Performance
Evaluated on the PQuAD Persian test set from the [official PQuAD repository](https://huggingface.co/datasets/newsha/PQuAD).
I also trained for more than one epoch, but got worse results.
Our XLM-RoBERTa outperforms [our ParsBert on PQuAD](https://huggingface.co/pedramyazdipoor/parsbert_question_answering_PQuAD), but the former is more than three times larger than the latter, so the comparison is not entirely fair.
### Question Answering On Test Set of PQuAD Dataset
| Metric | Our XLM-Roberta Large| Our ParsBert |
|:----------------:|:--------------------:|:-------------:|
| Exact Match | 66.56* | 47.44 |
| F1 | 87.31* | 81.96 |
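For reference, Exact Match and F1 here are the standard span-QA metrics. A minimal, self-contained sketch of their definitions (simple whitespace tokenization only; the official PQuAD evaluation may apply additional text normalization):

```python
def exact_match(pred, gold):
    # 1 if the prediction matches the gold answer exactly (after trimming), else 0.
    return int(pred.strip() == gold.strip())

def f1_score(pred, gold):
    # Token-level F1 between prediction and gold answer.
    pred_tokens, gold_tokens = pred.split(), gold.split()
    if not pred_tokens or not gold_tokens:
        return float(pred_tokens == gold_tokens)
    gold_counts = {}
    for t in gold_tokens:
        gold_counts[t] = gold_counts.get(t, 0) + 1
    common = 0
    for t in pred_tokens:
        if gold_counts.get(t, 0) > 0:
            common += 1
            gold_counts[t] -= 1
    if common == 0:
        return 0.0
    precision = common / len(pred_tokens)
    recall = common / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)
```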
## How to use
## PyTorch
```python
from transformers import AutoTokenizer, AutoModelForQuestionAnswering
path = 'pedramyazdipoor/persian_xlm_roberta_large'
tokenizer = AutoTokenizer.from_pretrained(path)
model = AutoModelForQuestionAnswering.from_pretrained(path)
```
## Inference
There are some considerations for inference:
1) The start index of the answer must not exceed the end index.
2) The answer span must lie within the context.
3) The selected span must be the most probable choice among the N pairs of candidates.
```python
import numpy as np

def generate_indexes(start_logits, end_logits, N, min_index):
output_start = start_logits
output_end = end_logits
start_indexes = np.arange(len(start_logits))
start_probs = output_start
list_start = dict(zip(start_indexes, start_probs.tolist()))
end_indexes = np.arange(len(end_logits))
end_probs = output_end
list_end = dict(zip(end_indexes, end_probs.tolist()))
sorted_start_list = sorted(list_start.items(), key=lambda x: x[1], reverse=True) #Descending sort by probability
sorted_end_list = sorted(list_end.items(), key=lambda x: x[1], reverse=True)
final_start_idx, final_end_idx = [[] for l in range(2)]
start_idx, end_idx, prob = 0, 0, (start_probs.tolist()[0] + end_probs.tolist()[0])
for a in range(0,N):
for b in range(0,N):
if (sorted_start_list[a][1] + sorted_end_list[b][1]) > prob :
if (sorted_start_list[a][0] <= sorted_end_list[b][0]) and (sorted_start_list[a][0] > min_index) :
prob = sorted_start_list[a][1] + sorted_end_list[b][1]
start_idx = sorted_start_list[a][0]
end_idx = sorted_end_list[b][0]
final_start_idx.append(start_idx)
final_end_idx.append(end_idx)
return final_start_idx[0], final_end_idx[0]
```
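The function above is essentially an argmax over valid (start, end) pairs. As a sanity check, here is a compact, self-contained equivalent of that selection logic with hypothetical logits (a sketch, not the exact function above):

```python
def best_span(start_logits, end_logits, min_index=0):
    # Score every (start, end) pair; keep only spans where start <= end
    # and start > min_index, then return the highest-scoring pair.
    best_score, best_pair = float("-inf"), (0, 0)
    for s, s_score in enumerate(start_logits):
        if s <= min_index:
            continue
        for e, e_score in enumerate(end_logits):
            if e < s:
                continue
            if s_score + e_score > best_score:
                best_score, best_pair = s_score + e_score, (s, e)
    return best_pair

# Hypothetical logits for a 6-token sequence:
start = [0.1, 0.2, 3.0, 0.5, 0.1, 0.0]
end = [0.1, 0.0, 0.2, 2.5, 1.0, 0.3]
print(best_span(start, end))  # (2, 3)
```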
```python
import torch

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model.eval().to(device)
text = 'سلام من پدرامم 26 سالمه'
question = 'چند سالمه؟'
encoding = tokenizer(question,text,add_special_tokens = True,
return_token_type_ids = True,
return_tensors = 'pt',
padding = True,
return_offsets_mapping = True,
truncation = 'only_first',
max_length = 32)
out = model(encoding['input_ids'].to(device),encoding['attention_mask'].to(device), encoding['token_type_ids'].to(device))
# We had to change some pieces of the code to make it compatible with generating one answer at a time.
# If you have unanswerable questions, use out['start_logits'][0][0:] and out['end_logits'][0][0:], because the <s> token (the first token) is reserved for that case and must be compared with the other tokens.
# You can set min_index in generate_indexes() to force the chosen tokens to lie within the context (the start index must be greater than the separator token's index).
answer_start_index, answer_end_index = generate_indexes(out['start_logits'][0][1:], out['end_logits'][0][1:], 5, 0)
print(tokenizer.tokenize(text + question))
print(tokenizer.tokenize(text + question)[answer_start_index : (answer_end_index + 1)])
>>> ['▁سلام', '▁من', '▁پدر', 'ام', 'م', '▁26', '▁سالم', 'ه', 'نام', 'م', '▁چیست', '؟']
>>> ['▁26']
```
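Since the tokenizer above is called with `return_offsets_mapping = True`, the chosen token indices can also be mapped back to a character span in the original text, which is usually more robust than re-tokenizing. A minimal sketch with a hypothetical offset mapping (the real mapping comes from `encoding['offset_mapping']`):

```python
def span_from_offsets(text, offsets, start_idx, end_idx):
    # offsets[i] = (char_start, char_end) of token i in the original text.
    return text[offsets[start_idx][0]:offsets[end_idx][1]]

text = 'سلام من پدرامم 26 سالمه'
# Hypothetical per-token character offsets for this text:
offsets = [(0, 4), (5, 7), (8, 14), (15, 17), (18, 23)]
print(span_from_offsets(text, offsets, 3, 3))  # '26'
```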
## Acknowledgments
We hereby express our gratitude to [Newsha Shahbodaghkhan](https://huggingface.co/datasets/newsha/PQuAD/tree/main) for facilitating dataset gathering.
## Contributors
- Pedram Yazdipoor : [Linkedin](https://www.linkedin.com/in/pedram-yazdipour/)
## Releases
### Release v0.2 (Sep 18, 2022)
This is the second version of our Persian XLM-RoBERTa Large.
It addresses some problems encountered with the previous version.
[
-0.030517578125,
-0.06280517578125,
0.03228759765625,
0.01275634765625,
-0.010345458984375,
-0.005489349365234375,
-0.01739501953125,
-0.0183868408203125,
0.0115966796875,
0.01335906982421875,
-0.037445068359375,
-0.038970947265625,
-0.035552978515625,
0.01282501220703125,
-0.02264404296875,
0.08685302734375,
0.00325775146484375,
0.00214385986328125,
-0.0016031265258789062,
-0.0181427001953125,
-0.008148193359375,
-0.033203125,
-0.05950927734375,
-0.00043654441833496094,
0.021881103515625,
0.0293121337890625,
0.048858642578125,
0.0307769775390625,
0.032073974609375,
0.0333251953125,
-0.0099945068359375,
0.00868988037109375,
-0.0289154052734375,
-0.01227569580078125,
0.0009036064147949219,
-0.0455322265625,
-0.026519775390625,
0.0029430389404296875,
0.04754638671875,
0.04541015625,
-0.0127105712890625,
0.0389404296875,
-0.0102386474609375,
0.046539306640625,
-0.0297088623046875,
-0.006175994873046875,
-0.034332275390625,
0.0016851425170898438,
-0.01036834716796875,
-0.002933502197265625,
-0.0190887451171875,
-0.034210205078125,
-0.0006093978881835938,
-0.04071044921875,
0.014007568359375,
0.007190704345703125,
0.0771484375,
0.0167694091796875,
-0.030792236328125,
-0.0175933837890625,
-0.04364013671875,
0.07476806640625,
-0.0560302734375,
0.00820159912109375,
0.0211029052734375,
0.0113372802734375,
-0.0106353759765625,
-0.053070068359375,
-0.054931640625,
0.0008764266967773438,
-0.003986358642578125,
0.0162811279296875,
-0.01065826416015625,
-0.0282745361328125,
0.03265380859375,
0.0328369140625,
-0.070068359375,
-0.00537109375,
-0.038330078125,
0.0006256103515625,
0.060302734375,
0.0226898193359375,
0.019622802734375,
-0.0210113525390625,
-0.0279693603515625,
-0.02801513671875,
-0.036346435546875,
0.0322265625,
0.0214080810546875,
-0.0027713775634765625,
-0.0167694091796875,
0.052032470703125,
-0.025390625,
0.045379638671875,
0.0262298583984375,
-0.0138092041015625,
0.039764404296875,
-0.0350341796875,
-0.0159149169921875,
-0.0127105712890625,
0.0684814453125,
0.0269317626953125,
-0.0004916191101074219,
0.0081787109375,
0.004703521728515625,
-0.0245513916015625,
-0.0159759521484375,
-0.06024169921875,
-0.02215576171875,
0.041717529296875,
-0.031890869140625,
-0.01045989990234375,
0.0136871337890625,
-0.05548095703125,
-0.01215362548828125,
0.0081634521484375,
0.050872802734375,
-0.037445068359375,
-0.0213165283203125,
0.0159149169921875,
-0.019073486328125,
0.0291290283203125,
0.01335906982421875,
-0.047149658203125,
0.01934814453125,
0.035919189453125,
0.06036376953125,
0.0078582763671875,
-0.0198516845703125,
-0.039794921875,
-0.0097503662109375,
-0.02496337890625,
0.040008544921875,
-0.01363372802734375,
-0.0138092041015625,
-0.031829833984375,
0.011322021484375,
-0.0216827392578125,
-0.033660888671875,
0.041107177734375,
-0.0408935546875,
0.041259765625,
-0.007190704345703125,
-0.03466796875,
-0.029876708984375,
0.01107025146484375,
-0.039154052734375,
0.083984375,
0.017120361328125,
-0.0772705078125,
0.00893402099609375,
-0.04144287109375,
-0.0135345458984375,
-0.007350921630859375,
-0.0020580291748046875,
-0.057098388671875,
-0.014923095703125,
0.0304107666015625,
0.0255279541015625,
-0.0360107421875,
0.0101470947265625,
-0.012603759765625,
-0.026641845703125,
0.0294342041015625,
-0.0145416259765625,
0.10369873046875,
0.0218505859375,
-0.04534912109375,
0.00878143310546875,
-0.054107666015625,
0.0289306640625,
0.0194854736328125,
-0.01715087890625,
-0.001255035400390625,
-0.01141357421875,
0.00722503662109375,
0.0299835205078125,
0.03253173828125,
-0.047332763671875,
0.00814056396484375,
-0.042022705078125,
0.04254150390625,
0.055206298828125,
0.003936767578125,
0.017333984375,
-0.0419921875,
0.037628173828125,
0.0182037353515625,
0.01438140869140625,
0.0016651153564453125,
-0.044342041015625,
-0.053924560546875,
-0.009002685546875,
0.0231781005859375,
0.056976318359375,
-0.0419921875,
0.039276123046875,
-0.00731658935546875,
-0.05279541015625,
-0.0212554931640625,
0.0021686553955078125,
0.034149169921875,
0.03271484375,
0.032440185546875,
-0.019805908203125,
-0.046051025390625,
-0.06787109375,
0.0045013427734375,
-0.01561737060546875,
0.003292083740234375,
0.03338623046875,
0.054443359375,
-0.00173187255859375,
0.057037353515625,
-0.057861328125,
-0.00823211669921875,
-0.0225677490234375,
0.007389068603515625,
0.06280517578125,
0.050537109375,
0.039794921875,
-0.065673828125,
-0.056976318359375,
-0.006137847900390625,
-0.039947509765625,
0.01210784912109375,
-0.0121002197265625,
-0.0244903564453125,
0.03265380859375,
0.0278778076171875,
-0.055419921875,
0.034698486328125,
0.0185089111328125,
-0.031646728515625,
0.052825927734375,
-0.024078369140625,
0.01396942138671875,
-0.09820556640625,
0.0252685546875,
-0.01343536376953125,
0.0010404586791992188,
-0.032989501953125,
0.0033130645751953125,
0.00611114501953125,
0.0006732940673828125,
-0.034149169921875,
0.042083740234375,
-0.040191650390625,
0.0024852752685546875,
0.0133209228515625,
0.0097503662109375,
-0.00980377197265625,
0.05328369140625,
-0.002315521240234375,
0.0684814453125,
0.046722412109375,
-0.03961181640625,
0.032867431640625,
0.03497314453125,
-0.0302734375,
0.030120849609375,
-0.064697265625,
0.00664520263671875,
-0.00902557373046875,
0.012451171875,
-0.0889892578125,
-0.0004620552062988281,
0.0222015380859375,
-0.056243896484375,
0.01157379150390625,
0.005214691162109375,
-0.03253173828125,
-0.046905517578125,
-0.03594970703125,
0.0377197265625,
0.05499267578125,
-0.025360107421875,
0.0279388427734375,
0.0212249755859375,
-0.00720977783203125,
-0.049652099609375,
-0.03497314453125,
-0.007122039794921875,
-0.0212249755859375,
-0.061004638671875,
0.02606201171875,
-0.0251617431640625,
-0.0222930908203125,
-0.0029430389404296875,
-0.0015134811401367188,
-0.01004791259765625,
0.00035309791564941406,
0.0149078369140625,
0.0156707763671875,
-0.02239990234375,
0.0005145072937011719,
-0.0191192626953125,
-0.007183074951171875,
0.0014858245849609375,
-0.01947021484375,
0.046966552734375,
-0.014495849609375,
-0.00420379638671875,
-0.0377197265625,
0.03326416015625,
0.033905029296875,
-0.035186767578125,
0.06549072265625,
0.05621337890625,
-0.019287109375,
0.006404876708984375,
-0.03509521484375,
-0.0019426345825195312,
-0.035064697265625,
0.041107177734375,
-0.034515380859375,
-0.0401611328125,
0.0440673828125,
0.01215362548828125,
0.00592041015625,
0.06317138671875,
0.0477294921875,
0.00707244873046875,
0.0950927734375,
0.033843994140625,
-0.01045989990234375,
0.0265045166015625,
-0.058197021484375,
0.0140228271484375,
-0.0701904296875,
-0.0163116455078125,
-0.04583740234375,
-0.01605224609375,
-0.05474853515625,
-0.036956787109375,
0.045867919921875,
0.01079559326171875,
-0.0404052734375,
0.004871368408203125,
-0.04254150390625,
0.02099609375,
0.057708740234375,
0.0244903564453125,
-0.00568389892578125,
-0.007183074951171875,
-0.01268768310546875,
0.0107421875,
-0.0574951171875,
-0.0153045654296875,
0.10833740234375,
0.00513458251953125,
0.04034423828125,
0.01053619384765625,
0.05657958984375,
-0.01348876953125,
0.004138946533203125,
-0.027069091796875,
0.04254150390625,
-0.00368499755859375,
-0.053955078125,
-0.034942626953125,
-0.0304718017578125,
-0.08154296875,
0.0164642333984375,
-0.0136260986328125,
-0.0657958984375,
0.01268768310546875,
-0.0144195556640625,
-0.048858642578125,
0.02276611328125,
-0.046600341796875,
0.07086181640625,
-0.0203857421875,
-0.0447998046875,
-0.0155181884765625,
-0.0645751953125,
0.033233642578125,
-0.0007886886596679688,
0.0108642578125,
0.002620697021484375,
0.0022106170654296875,
0.080078125,
-0.054931640625,
0.0292510986328125,
-0.005275726318359375,
0.01495361328125,
0.0287628173828125,
-0.0162200927734375,
0.0300140380859375,
0.0219573974609375,
-0.016265869140625,
-0.0010976791381835938,
0.015045166015625,
-0.049407958984375,
-0.02947998046875,
0.052459716796875,
-0.080322265625,
-0.0626220703125,
-0.056640625,
-0.042633056640625,
0.0012950897216796875,
0.0189971923828125,
0.0261688232421875,
0.0419921875,
0.00632476806640625,
0.0176849365234375,
0.045196533203125,
-0.035919189453125,
0.0360107421875,
0.024383544921875,
-0.0118408203125,
-0.0401611328125,
0.05426025390625,
0.01197052001953125,
0.005283355712890625,
0.0264129638671875,
0.009857177734375,
-0.032623291015625,
-0.0289154052734375,
-0.03887939453125,
0.0304412841796875,
-0.04046630859375,
-0.01241302490234375,
-0.051788330078125,
-0.0291748046875,
-0.034820556640625,
0.0019178390502929688,
-0.0080718994140625,
-0.0477294921875,
-0.013427734375,
0.010772705078125,
0.03594970703125,
0.04046630859375,
-0.0157928466796875,
0.0006999969482421875,
-0.039520263671875,
0.034088134765625,
0.0234527587890625,
0.002124786376953125,
-0.0013103485107421875,
-0.043975830078125,
-0.0276031494140625,
0.021484375,
-0.034515380859375,
-0.07342529296875,
0.040771484375,
0.01424407958984375,
0.030242919921875,
0.0247039794921875,
0.01160430908203125,
0.06634521484375,
-0.0210113525390625,
0.06903076171875,
0.006114959716796875,
-0.059478759765625,
0.04052734375,
-0.0087890625,
0.028564453125,
0.020355224609375,
0.041534423828125,
-0.0408935546875,
-0.03546142578125,
-0.065185546875,
-0.07904052734375,
0.07037353515625,
0.0205230712890625,
-0.0109100341796875,
-0.0059356689453125,
0.023406982421875,
-0.007350921630859375,
0.0041046142578125,
-0.048004150390625,
-0.043975830078125,
-0.005336761474609375,
-0.01230621337890625,
-0.0308990478515625,
-0.01152801513671875,
-0.0089874267578125,
-0.04119873046875,
0.06781005859375,
0.006244659423828125,
0.037994384765625,
0.0419921875,
-0.01157379150390625,
-0.00899505615234375,
0.006771087646484375,
0.040191650390625,
0.06158447265625,
-0.0255126953125,
-0.000972747802734375,
0.02203369140625,
-0.033416748046875,
0.007476806640625,
0.0183868408203125,
-0.0144805908203125,
-0.0022106170654296875,
0.0157318115234375,
0.042877197265625,
-0.0001480579376220703,
-0.0498046875,
0.029754638671875,
-0.01568603515625,
-0.0108795166015625,
-0.046295166015625,
0.006275177001953125,
0.0117950439453125,
0.0233917236328125,
0.05035400390625,
-0.00516510009765625,
0.00389862060546875,
-0.04425048828125,
0.0104827880859375,
0.046844482421875,
0.004669189453125,
-0.01125335693359375,
0.06341552734375,
-0.0135650634765625,
-0.022125244140625,
0.05810546875,
-0.0120697021484375,
-0.058074951171875,
0.07879638671875,
0.031707763671875,
0.051483154296875,
-0.006130218505859375,
0.0206146240234375,
0.060333251953125,
0.01507568359375,
0.0011415481567382812,
0.0423583984375,
-0.0015745162963867188,
-0.040313720703125,
-0.0081634521484375,
-0.0458984375,
-0.0120086669921875,
0.01239776611328125,
-0.046722412109375,
0.00981903076171875,
-0.0267333984375,
-0.007389068603515625,
0.0087127685546875,
0.042694091796875,
-0.052642822265625,
0.0212249755859375,
-0.01338958740234375,
0.0653076171875,
-0.055755615234375,
0.041839599609375,
0.058563232421875,
-0.04071044921875,
-0.0750732421875,
-0.0126800537109375,
-0.0187835693359375,
-0.05743408203125,
0.05511474609375,
0.00754547119140625,
0.02374267578125,
0.0182952880859375,
-0.044952392578125,
-0.0897216796875,
0.0931396484375,
-0.0033779144287109375,
-0.0196685791015625,
0.0066375732421875,
0.0071258544921875,
0.0406494140625,
-0.0099945068359375,
0.0504150390625,
0.02630615234375,
0.035614013671875,
-0.003910064697265625,
-0.06719970703125,
0.01561737060546875,
-0.042449951171875,
-0.003936767578125,
0.01947021484375,
-0.064453125,
0.09930419921875,
-0.032867431640625,
-0.00963592529296875,
0.0360107421875,
0.040191650390625,
0.0176544189453125,
0.0207061767578125,
0.033905029296875,
0.06048583984375,
0.0560302734375,
-0.00524139404296875,
0.080322265625,
-0.045501708984375,
0.042694091796875,
0.05950927734375,
0.00040221214294433594,
0.0672607421875,
0.032928466796875,
-0.041656494140625,
0.040679931640625,
0.058685302734375,
-0.002960205078125,
0.0263214111328125,
0.0163421630859375,
-0.00778961181640625,
-0.019012451171875,
0.0194091796875,
-0.042633056640625,
0.0271148681640625,
0.0043182373046875,
-0.020599365234375,
-0.021759033203125,
-0.0095062255859375,
0.00217437744140625,
-0.0045318603515625,
-0.01837158203125,
0.046051025390625,
-0.010101318359375,
-0.061431884765625,
0.07476806640625,
0.018341064453125,
0.042724609375,
-0.03179931640625,
0.0033550262451171875,
-0.016632080078125,
0.019378662109375,
-0.0212554931640625,
-0.059356689453125,
0.00896453857421875,
-0.01044464111328125,
-0.016693115234375,
0.0008788108825683594,
0.01342010498046875,
-0.035308837890625,
-0.059295654296875,
-0.00821685791015625,
0.040374755859375,
0.0233001708984375,
0.003047943115234375,
-0.07196044921875,
-0.0171966552734375,
0.007122039794921875,
-0.03717041015625,
0.0189208984375,
0.03717041015625,
-0.01052093505859375,
0.043701171875,
0.056121826171875,
-0.003635406494140625,
0.016510009765625,
-0.03717041015625,
0.060546875,
-0.055938720703125,
-0.038787841796875,
-0.05303955078125,
0.0374755859375,
-0.0040130615234375,
-0.05084228515625,
0.0711669921875,
0.07049560546875,
0.0572509765625,
-0.0028133392333984375,
0.038360595703125,
-0.032867431640625,
0.04461669921875,
-0.0293731689453125,
0.051849365234375,
-0.05084228515625,
0.0001780986785888672,
-0.00982666015625,
-0.049652099609375,
-0.00986480712890625,
0.058074951171875,
-0.03314208984375,
0.00951385498046875,
0.060821533203125,
0.07183837890625,
0.00946807861328125,
-0.01544189453125,
0.002300262451171875,
0.04058837890625,
0.0031890869140625,
0.090576171875,
0.033599853515625,
-0.054107666015625,
0.045135498046875,
-0.038726806640625,
-0.0178680419921875,
-0.01192474365234375,
-0.033905029296875,
-0.053741455078125,
-0.064453125,
-0.0258636474609375,
-0.033599853515625,
0.000362396240234375,
0.0689697265625,
0.039703369140625,
-0.056488037109375,
-0.0266876220703125,
0.0018749237060546875,
0.02703857421875,
-0.0280303955078125,
-0.020233154296875,
0.054473876953125,
-0.010040283203125,
-0.0634765625,
-0.00978851318359375,
-0.0014486312866210938,
-0.01325225830078125,
0.0037746429443359375,
-0.0107269287109375,
-0.046234130859375,
-0.0017642974853515625,
0.032318115234375,
0.0289306640625,
-0.038055419921875,
-0.0114593505859375,
0.00969696044921875,
-0.0079345703125,
0.016448974609375,
0.0224456787109375,
-0.0628662109375,
0.01226043701171875,
0.037811279296875,
0.0226593017578125,
0.05609130859375,
0.0006723403930664062,
0.022491455078125,
-0.024078369140625,
0.0163726806640625,
0.004528045654296875,
0.023681640625,
0.00820159912109375,
-0.032470703125,
0.036712646484375,
0.01425933837890625,
-0.052703857421875,
-0.067626953125,
-0.00608062744140625,
-0.07562255859375,
-0.019989013671875,
0.08404541015625,
-0.0181121826171875,
-0.045745849609375,
0.00252532958984375,
-0.03656005859375,
0.0247039794921875,
-0.01224517822265625,
0.06103515625,
0.0389404296875,
-0.01910400390625,
-0.01129913330078125,
-0.0283660888671875,
0.0301666259765625,
0.048797607421875,
-0.051483154296875,
-0.0188446044921875,
0.0235443115234375,
0.041534423828125,
0.02020263671875,
0.053375244140625,
-0.00995635986328125,
0.04925537109375,
-0.0011768341064453125,
-0.006420135498046875,
-0.0057373046875,
0.0036067962646484375,
-0.0230560302734375,
0.0169677734375,
-0.0025310516357421875,
-0.018585205078125
]
] |
stevhliu/my_awesome_model | 2023-08-25T00:04:52.000Z | [
"transformers",
"pytorch",
"tf",
"tensorboard",
"safetensors",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | stevhliu | null | null | stevhliu/my_awesome_model | 1 | 14,655 | transformers | 2022-09-28T18:41:57 | ---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: stevhliu/my_awesome_model
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# stevhliu/my_awesome_model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0632
- Validation Loss: 0.2355
- Train Accuracy: 0.9295
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 7810, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
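With `power: 1.0` and `cycle: False`, the `PolynomialDecay` schedule above is simply a linear ramp from the initial learning rate down to zero over 7810 steps. A self-contained sketch of the formula (an illustration, not TensorFlow's implementation):

```python
def polynomial_decay_lr(step, initial_lr=2e-05, decay_steps=7810, end_lr=0.0, power=1.0):
    # With cycle=False the step is clamped to decay_steps.
    step = min(step, decay_steps)
    return (initial_lr - end_lr) * (1 - step / decay_steps) ** power + end_lr

print(polynomial_decay_lr(0))     # 2e-05 at the start of training
print(polynomial_decay_lr(7810))  # 0.0 at the end
```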
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.2518 | 0.1859 | 0.9261 | 0 |
| 0.1319 | 0.1822 | 0.9318 | 1 |
| 0.0632 | 0.2355 | 0.9295 | 2 |
### Framework versions
- Transformers 4.22.2
- TensorFlow 2.8.2
- Datasets 2.5.1
- Tokenizers 0.12.1
| 1,633 | [
[
-0.0287628173828125,
0.01549530029296875,
0.0269012451171875,
-0.00814056396484375,
0.048828125,
0.003925323486328125,
-0.01328277587890625,
0.0235137939453125,
-0.0007729530334472656,
-0.038238525390625,
-0.025634765625,
0.058502197265625,
-0.0833740234375,
-0.006099700927734375,
-0.0300445556640625,
-0.032440185546875,
-0.0002760887145996094,
0.02520751953125,
0.06097412109375,
0.0653076171875,
-0.002399444580078125,
0.0171051025390625,
0.0426025390625,
0.0102996826171875,
0.02606201171875,
0.019683837890625,
0.007450103759765625,
-0.047698974609375,
0.064208984375,
0.002056121826171875,
0.0010623931884765625,
-0.00742340087890625,
0.01433563232421875,
-0.032745361328125,
-0.05316162109375,
-0.038909912109375,
0.006450653076171875,
-0.0640869140625,
-0.015716552734375,
-0.0287628173828125,
-0.03057861328125,
-0.0191192626953125,
0.0027866363525390625,
-0.038238525390625,
-0.02886962890625,
-0.04473876953125,
-0.0246124267578125,
0.024627685546875,
0.046295166015625,
-0.0079193115234375,
0.04058837890625,
-0.037261962890625,
-0.002826690673828125,
0.0185699462890625,
0.035858154296875,
0.0272064208984375,
-0.06866455078125,
-0.0181732177734375,
0.0123291015625,
-0.0245513916015625,
-0.03192138671875,
0.02703857421875,
0.01117706298828125,
0.0701904296875,
0.050445556640625,
-0.0218658447265625,
0.07305908203125,
-0.0280609130859375,
0.044281005859375,
0.032928466796875,
-0.03167724609375,
0.0258941650390625,
-0.0105133056640625,
0.0233612060546875,
0.04888916015625,
0.05426025390625,
-0.006931304931640625,
0.0022411346435546875,
-0.087158203125,
-0.058502197265625,
0.05609130859375,
0.0263214111328125,
0.01557159423828125,
-0.0010051727294921875,
0.03961181640625,
0.0115814208984375,
0.0207977294921875,
-0.05255126953125,
-0.05279541015625,
-0.02484130859375,
-0.01532745361328125,
0.004215240478515625,
-0.0262603759765625,
0.0010595321655273438,
-0.044097900390625,
0.08148193359375,
0.021331787109375,
0.014739990234375,
0.0062103271484375,
0.01467132568359375,
-0.01108551025390625,
-0.00787353515625,
0.036651611328125,
0.047698974609375,
-0.04754638671875,
-0.000011622905731201172,
0.023529052734375,
-0.03607177734375,
-0.0022106170654296875,
0.0174407958984375,
0.00513458251953125,
0.00873565673828125,
0.0170440673828125,
0.07855224609375,
0.00872039794921875,
-0.0111846923828125,
0.0228271484375,
-0.005157470703125,
-0.0307159423828125,
-0.033477783203125,
0.0178680419921875,
-0.01419830322265625,
0.0168914794921875,
0.0241546630859375,
0.044677734375,
0.007564544677734375,
-0.0166473388671875,
0.01256561279296875,
0.0204620361328125,
-0.041412353515625,
-0.0288238525390625,
0.0711669921875,
-0.00272369384765625,
-0.023590087890625,
0.045867919921875,
-0.0180206298828125,
-0.0267791748046875,
0.06732177734375,
0.030609130859375,
0.054534912109375,
-0.00591278076171875,
-0.006809234619140625,
0.060516357421875,
0.00495147705078125,
-0.01451873779296875,
0.0226898193359375,
-0.0025501251220703125,
-0.048553466796875,
-0.009246826171875,
-0.061981201171875,
-0.01360321044921875,
0.058807373046875,
-0.08477783203125,
0.043914794921875,
-0.049407958984375,
-0.036651611328125,
0.0369873046875,
0.0075531005859375,
-0.07183837890625,
0.0438232421875,
0.0270233154296875,
0.07586669921875,
-0.0631103515625,
0.05218505859375,
0.045257568359375,
-0.02374267578125,
-0.060699462890625,
-0.033355712890625,
-0.023712158203125,
-0.0777587890625,
0.042633056640625,
0.01122283935546875,
0.022979736328125,
0.0251312255859375,
-0.03118896484375,
-0.06707763671875,
0.0814208984375,
0.02264404296875,
-0.054779052734375,
-0.00927734375,
0.026031494140625,
0.047698974609375,
0.0002677440643310547,
0.04425048828125,
0.0199127197265625,
0.016143798828125,
0.02056884765625,
-0.06524658203125,
-0.007579803466796875,
-0.038330078125,
0.005176544189453125,
0.001987457275390625,
-0.060577392578125,
0.06719970703125,
0.006038665771484375,
0.0241546630859375,
0.01812744140625,
0.035491943359375,
0.0189208984375,
0.014984130859375,
0.04205322265625,
0.09051513671875,
0.0498046875,
-0.00519561767578125,
0.06781005859375,
-0.036865234375,
0.04901123046875,
0.07818603515625,
0.0168609619140625,
0.0303802490234375,
0.016143798828125,
-0.0168609619140625,
0.033447265625,
0.070068359375,
-0.041595458984375,
0.050079345703125,
0.0012035369873046875,
0.0043792724609375,
-0.032928466796875,
0.0193939208984375,
-0.056884765625,
0.026123046875,
0.0014200210571289062,
-0.051727294921875,
-0.034027099609375,
-0.018798828125,
0.00748443603515625,
-0.0198516845703125,
-0.047821044921875,
0.0292510986328125,
-0.0225067138671875,
-0.025115966796875,
0.056854248046875,
0.014007568359375,
0.031951904296875,
-0.052520751953125,
-0.00852203369140625,
0.0027313232421875,
0.028839111328125,
-0.02862548828125,
-0.048309326171875,
0.01010894775390625,
-0.0116119384765625,
-0.0233001708984375,
0.0157318115234375,
0.045379638671875,
-0.00917816162109375,
-0.06689453125,
-0.0038890838623046875,
0.00553131103515625,
0.0251617431640625,
-0.0037899017333984375,
-0.07037353515625,
-0.005279541015625,
0.0050201416015625,
-0.03656005859375,
0.0067596435546875,
0.01568603515625,
0.01422882080078125,
0.03582763671875,
0.05755615234375,
0.003997802734375,
0.00588226318359375,
-0.005352020263671875,
0.06756591796875,
-0.036651611328125,
-0.0623779296875,
-0.075927734375,
0.048492431640625,
-0.015228271484375,
-0.07305908203125,
0.050628662109375,
0.07757568359375,
0.060516357421875,
-0.0011739730834960938,
0.044403076171875,
-0.0091400146484375,
0.0201416015625,
-0.0274658203125,
0.05401611328125,
-0.035064697265625,
-0.0076141357421875,
-0.01511383056640625,
-0.06298828125,
0.005733489990234375,
0.0455322265625,
-0.01611328125,
-0.0009603500366210938,
0.010711669921875,
0.049468994140625,
-0.0207977294921875,
0.006694793701171875,
0.024932861328125,
0.0079193115234375,
0.002315521240234375,
0.031341552734375,
0.03961181640625,
-0.05670166015625,
0.04022216796875,
-0.06817626953125,
-0.0100555419921875,
-0.00046944618225097656,
-0.057769775390625,
-0.07208251953125,
-0.037139892578125,
-0.033599853515625,
-0.0260467529296875,
-0.0047607421875,
0.065185546875,
0.07171630859375,
-0.061248779296875,
-0.00032901763916015625,
-0.021331787109375,
-0.031524658203125,
-0.013214111328125,
-0.0163726806640625,
0.0400390625,
-0.008758544921875,
-0.06402587890625,
0.0013456344604492188,
-0.032257080078125,
0.0245819091796875,
-0.01424407958984375,
-0.00952911376953125,
0.00601959228515625,
-0.0251007080078125,
0.008880615234375,
0.00687408447265625,
-0.01971435546875,
-0.01395416259765625,
-0.01410675048828125,
0.004779815673828125,
0.015777587890625,
0.004657745361328125,
-0.040283203125,
0.036651611328125,
0.01593017578125,
0.025360107421875,
0.05572509765625,
-0.00872039794921875,
0.01446533203125,
-0.04766845703125,
0.0279693603515625,
0.014801025390625,
0.037322998046875,
0.0004925727844238281,
-0.04754638671875,
0.02099609375,
0.0260009765625,
-0.0384521484375,
-0.0772705078125,
-0.0222930908203125,
-0.06732177734375,
0.00864410400390625,
0.068115234375,
-0.001079559326171875,
-0.0294189453125,
0.03302001953125,
-0.010345458984375,
0.024749755859375,
-0.0181732177734375,
0.052337646484375,
0.062164306640625,
-0.0096588134765625,
0.015777587890625,
-0.030914306640625,
0.027801513671875,
0.0258026123046875,
-0.0267791748046875,
-0.024444580078125,
0.01708984375,
0.04730224609375,
0.017181396484375,
0.005054473876953125,
-0.01284027099609375,
0.0292816162109375,
0.01025390625,
0.022857666015625,
-0.055023193359375,
-0.01511383056640625,
-0.037139892578125,
0.0135498046875,
0.0004317760467529297,
-0.044952392578125
]
] |
facebook/nllb-200-distilled-1.3B | 2023-02-11T20:19:10.000Z | [
"transformers",
"pytorch",
"m2m_100",
"text2text-generation",
"nllb",
"translation",
"ace",
"acm",
"acq",
"aeb",
"af",
"ajp",
"ak",
"als",
"am",
"apc",
"ar",
"ars",
"ary",
"arz",
"as",
"ast",
"awa",
"ayr",
"azb",
"azj",
"ba",
"bm",
"ban",
"be",
"bem",
"bn",
"bho",
"bjn",
"bo",
"bs",
"bug",
"bg",
"ca",
"ceb",
"cs",
"cjk",
"ckb",
"crh",
"cy",
"da",
"de",
"dik",
"dyu",
"dz",
"el",
"en",
"eo",
"et",
"eu",
"ee",
"fo",
"fj",
"fi",
"fon",
"fr",
"fur",
"fuv",
"gaz",
"gd",
"ga",
"gl",
"gn",
"gu",
"ht",
"ha",
"he",
"hi",
"hne",
"hr",
"hu",
"hy",
"ig",
"ilo",
"id",
"is",
"it",
"jv",
"ja",
"kab",
"kac",
"kam",
"kn",
"ks",
"ka",
"kk",
"kbp",
"kea",
"khk",
"km",
"ki",
"rw",
"ky",
"kmb",
"kmr",
"knc",
"kg",
"ko",
"lo",
"lij",
"li",
"ln",
"lt",
"lmo",
"ltg",
"lb",
"lua",
"lg",
"luo",
"lus",
"lvs",
"mag",
"mai",
"ml",
"mar",
"min",
"mk",
"mt",
"mni",
"mos",
"mi",
"my",
"nl",
"nn",
"nb",
"npi",
"nso",
"nus",
"ny",
"oc",
"ory",
"pag",
"pa",
"pap",
"pbt",
"pes",
"plt",
"pl",
"pt",
"prs",
"quy",
"ro",
"rn",
"ru",
"sg",
"sa",
"sat",
"scn",
"shn",
"si",
"sk",
"sl",
"sm",
"sn",
"sd",
"so",
"st",
"es",
"sc",
"sr",
"ss",
"su",
"sv",
"swh",
"szl",
"ta",
"taq",
"tt",
"te",
"tg",
"tl",
"th",
"ti",
"tpi",
"tn",
"ts",
"tk",
"tum",
"tr",
"tw",
"tzm",
"ug",
"uk",
"umb",
"ur",
"uzn",
"vec",
"vi",
"war",
"wo",
"xh",
"ydd",
"yo",
"yue",
"zh",
"zsm",
"zu",
"dataset:flores-200",
"license:cc-by-nc-4.0",
"autotrain_compatible",
"has_space",
"region:us"
] | translation | facebook | null | null | facebook/nllb-200-distilled-1.3B | 58 | 14,635 | transformers | 2022-07-08T10:57:38 | ---
language:
- ace
- acm
- acq
- aeb
- af
- ajp
- ak
- als
- am
- apc
- ar
- ars
- ary
- arz
- as
- ast
- awa
- ayr
- azb
- azj
- ba
- bm
- ban
- be
- bem
- bn
- bho
- bjn
- bo
- bs
- bug
- bg
- ca
- ceb
- cs
- cjk
- ckb
- crh
- cy
- da
- de
- dik
- dyu
- dz
- el
- en
- eo
- et
- eu
- ee
- fo
- fj
- fi
- fon
- fr
- fur
- fuv
- gaz
- gd
- ga
- gl
- gn
- gu
- ht
- ha
- he
- hi
- hne
- hr
- hu
- hy
- ig
- ilo
- id
- is
- it
- jv
- ja
- kab
- kac
- kam
- kn
- ks
- ka
- kk
- kbp
- kea
- khk
- km
- ki
- rw
- ky
- kmb
- kmr
- knc
- kg
- ko
- lo
- lij
- li
- ln
- lt
- lmo
- ltg
- lb
- lua
- lg
- luo
- lus
- lvs
- mag
- mai
- ml
- mar
- min
- mk
- mt
- mni
- mos
- mi
- my
- nl
- nn
- nb
- npi
- nso
- nus
- ny
- oc
- ory
- pag
- pa
- pap
- pbt
- pes
- plt
- pl
- pt
- prs
- quy
- ro
- rn
- ru
- sg
- sa
- sat
- scn
- shn
- si
- sk
- sl
- sm
- sn
- sd
- so
- st
- es
- sc
- sr
- ss
- su
- sv
- swh
- szl
- ta
- taq
- tt
- te
- tg
- tl
- th
- ti
- tpi
- tn
- ts
- tk
- tum
- tr
- tw
- tzm
- ug
- uk
- umb
- ur
- uzn
- vec
- vi
- war
- wo
- xh
- ydd
- yo
- yue
- zh
- zsm
- zu
language_details: "ace_Arab, ace_Latn, acm_Arab, acq_Arab, aeb_Arab, afr_Latn, ajp_Arab, aka_Latn, amh_Ethi, apc_Arab, arb_Arab, ars_Arab, ary_Arab, arz_Arab, asm_Beng, ast_Latn, awa_Deva, ayr_Latn, azb_Arab, azj_Latn, bak_Cyrl, bam_Latn, ban_Latn,bel_Cyrl, bem_Latn, ben_Beng, bho_Deva, bjn_Arab, bjn_Latn, bod_Tibt, bos_Latn, bug_Latn, bul_Cyrl, cat_Latn, ceb_Latn, ces_Latn, cjk_Latn, ckb_Arab, crh_Latn, cym_Latn, dan_Latn, deu_Latn, dik_Latn, dyu_Latn, dzo_Tibt, ell_Grek, eng_Latn, epo_Latn, est_Latn, eus_Latn, ewe_Latn, fao_Latn, pes_Arab, fij_Latn, fin_Latn, fon_Latn, fra_Latn, fur_Latn, fuv_Latn, gla_Latn, gle_Latn, glg_Latn, grn_Latn, guj_Gujr, hat_Latn, hau_Latn, heb_Hebr, hin_Deva, hne_Deva, hrv_Latn, hun_Latn, hye_Armn, ibo_Latn, ilo_Latn, ind_Latn, isl_Latn, ita_Latn, jav_Latn, jpn_Jpan, kab_Latn, kac_Latn, kam_Latn, kan_Knda, kas_Arab, kas_Deva, kat_Geor, knc_Arab, knc_Latn, kaz_Cyrl, kbp_Latn, kea_Latn, khm_Khmr, kik_Latn, kin_Latn, kir_Cyrl, kmb_Latn, kon_Latn, kor_Hang, kmr_Latn, lao_Laoo, lvs_Latn, lij_Latn, lim_Latn, lin_Latn, lit_Latn, lmo_Latn, ltg_Latn, ltz_Latn, lua_Latn, lug_Latn, luo_Latn, lus_Latn, mag_Deva, mai_Deva, mal_Mlym, mar_Deva, min_Latn, mkd_Cyrl, plt_Latn, mlt_Latn, mni_Beng, khk_Cyrl, mos_Latn, mri_Latn, zsm_Latn, mya_Mymr, nld_Latn, nno_Latn, nob_Latn, npi_Deva, nso_Latn, nus_Latn, nya_Latn, oci_Latn, gaz_Latn, ory_Orya, pag_Latn, pan_Guru, pap_Latn, pol_Latn, por_Latn, prs_Arab, pbt_Arab, quy_Latn, ron_Latn, run_Latn, rus_Cyrl, sag_Latn, san_Deva, sat_Beng, scn_Latn, shn_Mymr, sin_Sinh, slk_Latn, slv_Latn, smo_Latn, sna_Latn, snd_Arab, som_Latn, sot_Latn, spa_Latn, als_Latn, srd_Latn, srp_Cyrl, ssw_Latn, sun_Latn, swe_Latn, swh_Latn, szl_Latn, tam_Taml, tat_Cyrl, tel_Telu, tgk_Cyrl, tgl_Latn, tha_Thai, tir_Ethi, taq_Latn, taq_Tfng, tpi_Latn, tsn_Latn, tso_Latn, tuk_Latn, tum_Latn, tur_Latn, twi_Latn, tzm_Tfng, uig_Arab, ukr_Cyrl, umb_Latn, urd_Arab, uzn_Latn, vec_Latn, vie_Latn, war_Latn, wol_Latn, xho_Latn, ydd_Hebr, yor_Latn, 
yue_Hant, zho_Hans, zho_Hant, zul_Latn"
tags:
- nllb
- translation
license: "cc-by-nc-4.0"
datasets:
- flores-200
metrics:
- bleu
- spbleu
- chrf++
inference: false
---
# NLLB-200
This is the model card of NLLB-200's distilled 1.3B variant.
Here are the [metrics](https://tinyurl.com/nllb200densedst1bmetrics) for that particular checkpoint.
- Information about training algorithms, parameters, fairness constraints or other applied approaches, and features: the exact training algorithm, data, and the strategies used to handle data imbalances for high- and low-resource languages are described in the paper.
- Paper or other resource for more information: NLLB Team et al., No Language Left Behind: Scaling Human-Centered Machine Translation, arXiv, 2022
- License: CC-BY-NC-4.0
- Where to send questions or comments about the model: https://github.com/facebookresearch/fairseq/issues
## Intended Use
- Primary intended uses: NLLB-200 is a machine translation model primarily intended for research in machine translation, especially for low-resource languages. It allows for single-sentence translation among 200 languages. Information on how to use the model can be found in the Fairseq code repository along with the training code and references to evaluation and training data.
- Primary intended users: Primary users are researchers and the machine translation research community.
- Out-of-scope use cases: NLLB-200 is a research model and is not released for production deployment. NLLB-200 is trained on general-domain text data and is not intended to be used with domain-specific texts, such as medical or legal documents. The model is not intended to be used for document translation. The model was trained with input lengths not exceeding 512 tokens, so translating longer sequences might result in quality degradation. NLLB-200 translations cannot be used as certified translations.
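For single-sentence translation with this checkpoint, the following is a minimal sketch of typical Hugging Face `transformers` usage (not the official Fairseq path referenced above). The helper name `translate` and the English-to-French defaults are illustrative choices; language codes follow the FLORES-200 convention, and the target language is selected by forcing its code as the first generated token.

```python
MODEL_ID = "facebook/nllb-200-distilled-1.3B"

def translate(text, src="eng_Latn", tgt="fra_Latn", max_length=512):
    """Translate one sentence with NLLB-200 (distilled 1.3B).

    Language codes follow the FLORES-200 convention (e.g. "eng_Latn").
    Note: the first call downloads several GB of model weights.
    """
    # Deferred import: defining this helper needs no heavy dependencies.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, src_lang=src)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt")
    out = model.generate(
        **inputs,
        # Force the target-language code as the first generated token.
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(tgt),
        max_length=max_length,  # the model was trained with <=512-token inputs
    )
    return tokenizer.batch_decode(out, skip_special_tokens=True)[0]
```

Keeping `max_length` at or below 512 matches the training-time input limit noted above; longer inputs should be split at sentence boundaries before translation.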
## Metrics
- Model performance measures: The NLLB-200 model was evaluated using the BLEU, spBLEU, and chrF++ metrics, which are widely adopted by the machine translation community. Additionally, we performed human evaluation with the XSTS protocol and measured the toxicity of the generated translations.
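To make the headline metric concrete, the sketch below implements plain corpus-level BLEU from scratch (uniform 1–4-gram weights, no smoothing, one reference per hypothesis). This is for illustration only; official NLLB scores are computed with standard tooling such as sacrebleu, and spBLEU additionally applies SentencePiece tokenization first.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU with uniform weights, no smoothing,
    and a single reference per hypothesis (illustrative only)."""
    match = [0] * max_n
    total = [0] * max_n
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            hn, rn = ngrams(h, n), ngrams(r, n)
            match[n - 1] += sum((hn & rn).values())  # clipped n-gram matches
            total[n - 1] += max(len(h) - n + 1, 0)
    if min(match) == 0:
        return 0.0  # without smoothing, any zero precision zeroes BLEU
    log_prec = sum(math.log(m / t) for m, t in zip(match, total)) / max_n
    # Brevity penalty: penalize hypotheses shorter than the references.
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / max(hyp_len, 1))
    return bp * math.exp(log_prec)
```

A perfect match scores 1.0 (often reported scaled to 100), and a hypothesis sharing no n-grams with its reference scores 0.0.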
## Evaluation Data
- Datasets: The Flores-200 dataset is described in Section 4 of the paper.
- Motivation: We used Flores-200 as it provides full evaluation coverage of the languages in NLLB-200
- Preprocessing: Sentence-split raw text data was preprocessed using SentencePiece. The SentencePiece model is released along with NLLB-200.
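The first preprocessing step, splitting raw text into sentences, can be approximated with a naive regex stand-in like the one below (the real pipeline then applies the released SentencePiece model, which this sketch does not reproduce).

```python
import re

def split_sentences(text):
    """Very naive sentence splitter: break on whitespace that follows
    sentence-final punctuation (. ! ?). Real pipelines use more robust
    splitters that handle abbreviations, quotes, and non-Latin scripts."""
    parts = re.split(r"(?<=[.!?])\s+", text.strip())
    return [p for p in parts if p]
```

Each resulting sentence would then be tokenized into subword units before being fed to the model.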
## Training Data
- We used parallel multilingual data from a variety of sources to train the model. We provide a detailed report on the data selection and construction process in Section 5 of the paper. We also used monolingual data constructed from Common Crawl; more details are provided in Section 5.2.
## Ethical Considerations
- In this work, we took a reflexive approach in technological development to ensure that we prioritize human users and minimize risks that could be transferred to them. While we reflect on our ethical considerations throughout the article, here are some additional points to highlight. For one, many languages chosen for this study are low-resource languages, with a heavy emphasis on African languages. While quality translation could improve education and information access in many of these communities, such access could also make groups with lower levels of digital literacy more vulnerable to misinformation or online scams. The latter scenarios could arise if bad actors misappropriate our work for nefarious activities, which we conceive as an example of unintended use. Regarding data acquisition, the training data used for model development were mined from various publicly available sources on the web. Although we invested heavily in data cleaning, personally identifiable information may not be entirely eliminated. Finally, although we did our best to optimize for translation quality, mistranslations produced by the model could remain. Although the odds are low, this could have an adverse impact on those who rely on these translations to make important decisions (particularly when related to health and safety).
## Caveats and Recommendations
- Our model has been tested on the Wikimedia domain with limited investigation on other domains supported in NLLB-MD. In addition, the supported languages may have variations that our model does not capture. Users should make appropriate assessments.
## Carbon Footprint Details
- The carbon dioxide (CO2e) estimate is reported in Section 8.8 of the paper.
[
-0.03033447265625,
-0.043487548828125,
0.021392822265625,
0.0230865478515625,
-0.013580322265625,
-0.01049041748046875,
-0.00937652587890625,
-0.049041748046875,
-0.0038890838623046875,
0.053985595703125,
-0.0391845703125,
-0.0218658447265625,
-0.05059814453125,
0.0262908935546875,
-0.045257568359375,
0.1019287109375,
-0.0009675025939941406,
0.0244140625,
-0.003993988037109375,
-0.0301361083984375,
-0.0294189453125,
-0.0460205078125,
-0.058074951171875,
-0.01424407958984375,
0.054290771484375,
0.020904541015625,
0.05621337890625,
0.046875,
0.0384521484375,
0.01169586181640625,
-0.0281524658203125,
-0.00344085693359375,
-0.055755615234375,
-0.036407470703125,
-0.020751953125,
-0.02459716796875,
-0.061431884765625,
0.0177001953125,
0.040130615234375,
0.07476806640625,
-0.0174713134765625,
0.042877197265625,
0.00907135009765625,
0.0535888671875,
-0.02105712890625,
-0.01491546630859375,
-0.0399169921875,
0.0086822509765625,
-0.0267333984375,
-0.00919342041015625,
-0.051239013671875,
-0.02435302734375,
-0.0026397705078125,
-0.051025390625,
0.003475189208984375,
0.0233917236328125,
0.07177734375,
0.0146331787109375,
-0.0419921875,
-0.02880859375,
-0.038970947265625,
0.0777587890625,
-0.07354736328125,
0.02496337890625,
0.04498291015625,
0.0046539306640625,
-0.002620697021484375,
-0.045623779296875,
-0.05059814453125,
-0.00787353515625,
-0.0038909912109375,
0.01041412353515625,
-0.0159912109375,
-0.00152587890625,
0.04071044921875,
0.030426025390625,
-0.051239013671875,
0.0108184814453125,
-0.053009033203125,
-0.01226043701171875,
0.05194091796875,
0.0164947509765625,
0.0151824951171875,
-0.033172607421875,
-0.03436279296875,
-0.005687713623046875,
-0.05059814453125,
-0.00005173683166503906,
0.053558349609375,
0.03436279296875,
-0.023712158203125,
0.04315185546875,
-0.01129913330078125,
0.0548095703125,
-0.00220489501953125,
-0.0168609619140625,
0.042236328125,
-0.05218505859375,
-0.005146026611328125,
-0.007411956787109375,
0.05975341796875,
0.037872314453125,
0.0211639404296875,
-0.0159912109375,
-0.0138702392578125,
-0.0218048095703125,
0.042755126953125,
-0.064208984375,
0.015380859375,
0.0239715576171875,
-0.052581787109375,
-0.037567138671875,
-0.004863739013671875,
-0.0555419921875,
-0.01312255859375,
-0.0289306640625,
0.023834228515625,
-0.026702880859375,
-0.016693115234375,
0.006938934326171875,
0.0020618438720703125,
0.00750732421875,
0.01242828369140625,
-0.050048828125,
0.0187835693359375,
0.0246734619140625,
0.0550537109375,
-0.0191497802734375,
-0.0260162353515625,
-0.0166778564453125,
-0.00273895263671875,
-0.0233154296875,
0.0216064453125,
-0.002536773681640625,
-0.027618408203125,
-0.0038299560546875,
0.014739990234375,
0.011077880859375,
-0.041259765625,
0.060760498046875,
-0.039886474609375,
0.023834228515625,
-0.0408935546875,
-0.041107177734375,
-0.021514892578125,
0.01490020751953125,
-0.06805419921875,
0.082275390625,
0.0140380859375,
-0.07476806640625,
0.031341552734375,
-0.05426025390625,
-0.037994384765625,
0.015838623046875,
0.01322174072265625,
-0.035003662109375,
0.0036983489990234375,
-0.0025501251220703125,
0.0093841552734375,
-0.0194091796875,
0.03900146484375,
-0.022430419921875,
-0.03167724609375,
0.0254058837890625,
-0.049652099609375,
0.1036376953125,
0.04052734375,
-0.0208587646484375,
-0.0251617431640625,
-0.053192138671875,
0.007465362548828125,
0.0206756591796875,
-0.0457763671875,
-0.008544921875,
-0.01261138916015625,
0.0305023193359375,
0.026397705078125,
0.0189361572265625,
-0.04278564453125,
0.01522064208984375,
-0.0231475830078125,
0.00911712646484375,
0.041229248046875,
0.002384185791015625,
0.036834716796875,
-0.0281829833984375,
0.049468994140625,
-0.005092620849609375,
0.0382080078125,
0.004283905029296875,
-0.035736083984375,
-0.060211181640625,
0.01226043701171875,
0.03887939453125,
0.047515869140625,
-0.04986572265625,
0.040283203125,
-0.0236663818359375,
-0.033660888671875,
-0.058074951171875,
0.0155181884765625,
0.0323486328125,
0.038726806640625,
0.042205810546875,
-0.022552490234375,
-0.035675048828125,
-0.057830810546875,
-0.01407623291015625,
-0.0013704299926757812,
0.0123443603515625,
0.007236480712890625,
0.04608154296875,
-0.03057861328125,
0.06121826171875,
-0.0164337158203125,
-0.015838623046875,
-0.02899169921875,
0.007781982421875,
0.0192413330078125,
0.043670654296875,
0.0460205078125,
-0.073974609375,
-0.037322998046875,
-0.00426483154296875,
-0.0740966796875,
-0.01593017578125,
-0.015899658203125,
-0.01123046875,
0.0306396484375,
0.03338623046875,
-0.02294921875,
0.039154052734375,
0.06048583984375,
-0.0159759521484375,
0.036590576171875,
-0.014739990234375,
0.0052642822265625,
-0.0887451171875,
0.0419921875,
-0.01215362548828125,
-0.0135650634765625,
-0.05950927734375,
0.01052093505859375,
0.0060882568359375,
-0.01018524169921875,
-0.044525146484375,
0.06341552734375,
-0.02191162109375,
0.00988006591796875,
-0.0256195068359375,
0.006351470947265625,
0.020233154296875,
0.03607177734375,
-0.015716552734375,
0.050689697265625,
0.006561279296875,
-0.040313720703125,
0.0023632049560546875,
0.0279541015625,
-0.0218658447265625,
0.059326171875,
-0.050048828125,
0.0020275115966796875,
-0.009033203125,
0.01468658447265625,
-0.035888671875,
-0.0025424957275390625,
0.024444580078125,
-0.0439453125,
0.0209503173828125,
0.004665374755859375,
-0.05731201171875,
-0.0295867919921875,
0.00159454345703125,
0.0301971435546875,
0.0259246826171875,
-0.019744873046875,
0.01934814453125,
0.03515625,
-0.01348876953125,
-0.0538330078125,
-0.0836181640625,
0.01024627685546875,
-0.019927978515625,
-0.034393310546875,
0.0133209228515625,
-0.0196533203125,
-0.01340484619140625,
0.00457000732421875,
0.0017919540405273438,
-0.01337432861328125,
0.021728515625,
0.00958251953125,
0.01482391357421875,
0.010833740234375,
0.0078887939453125,
0.00727081298828125,
-0.00794219970703125,
-0.0063934326171875,
-0.0189361572265625,
0.045501708984375,
-0.003910064697265625,
-0.01078033447265625,
-0.032501220703125,
0.0450439453125,
0.0287933349609375,
-0.012786865234375,
0.08258056640625,
0.05157470703125,
-0.03900146484375,
0.020294189453125,
-0.0421142578125,
-0.0005803108215332031,
-0.0347900390625,
0.037017822265625,
-0.0005979537963867188,
-0.0411376953125,
0.034515380859375,
0.01209259033203125,
0.01459503173828125,
0.038726806640625,
0.03936767578125,
-0.032501220703125,
0.06591796875,
0.056427001953125,
-0.004055023193359375,
0.031982421875,
-0.0284271240234375,
0.0196075439453125,
-0.0655517578125,
-0.018951416015625,
-0.039703369140625,
-0.01995849609375,
-0.056488037109375,
-0.0284423828125,
0.0190277099609375,
0.025238037109375,
-0.0091705322265625,
0.049530029296875,
-0.0170745849609375,
0.017486572265625,
0.031829833984375,
0.0005598068237304688,
0.03765869140625,
-0.004833221435546875,
-0.0110015869140625,
-0.0116424560546875,
-0.05938720703125,
-0.065185546875,
0.08587646484375,
0.035491943359375,
0.042510986328125,
-0.006237030029296875,
0.05706787109375,
0.03118896484375,
0.036834716796875,
-0.040313720703125,
0.0335693359375,
-0.01617431640625,
-0.095703125,
-0.0196533203125,
-0.051544189453125,
-0.08001708984375,
0.01058197021484375,
-0.01042938232421875,
-0.03594970703125,
0.012481689453125,
0.0078582763671875,
-0.01100921630859375,
0.018280029296875,
-0.056488037109375,
0.08831787109375,
-0.044677734375,
-0.012847900390625,
-0.021331787109375,
-0.0556640625,
0.002887725830078125,
-0.0309600830078125,
0.037567138671875,
-0.0033473968505859375,
0.0098724365234375,
0.06353759765625,
-0.024444580078125,
0.061431884765625,
-0.017364501953125,
-0.01190185546875,
0.0206451416015625,
-0.00787353515625,
0.028106689453125,
-0.007175445556640625,
-0.026641845703125,
0.042999267578125,
0.00469207763671875,
-0.05523681640625,
-0.006183624267578125,
0.03131103515625,
-0.0550537109375,
-0.01275634765625,
-0.029144287109375,
-0.05413818359375,
-0.0008516311645507812,
0.04290771484375,
0.044647216796875,
0.023162841796875,
-0.02056884765625,
0.02081298828125,
0.0487060546875,
-0.04559326171875,
0.0205535888671875,
0.050323486328125,
-0.0209503173828125,
-0.022186279296875,
0.06573486328125,
0.0282440185546875,
0.048004150390625,
0.01174163818359375,
0.0011301040649414062,
-0.01349639892578125,
-0.04290771484375,
-0.041839599609375,
0.0191802978515625,
-0.06494140625,
-0.0133056640625,
-0.052734375,
-0.0075225830078125,
-0.02569580078125,
-0.012176513671875,
-0.030548095703125,
-0.02349853515625,
-0.0254058837890625,
-0.01222991943359375,
0.0032062530517578125,
0.0489501953125,
0.00554656982421875,
0.031341552734375,
-0.053741455078125,
0.0166778564453125,
-0.01934814453125,
0.018035888671875,
0.0006422996520996094,
-0.059326171875,
-0.042755126953125,
0.025970458984375,
-0.032379150390625,
-0.05853271484375,
0.0251312255859375,
-0.0102386474609375,
0.054595947265625,
0.01256561279296875,
0.0039005279541015625,
0.048248291015625,
-0.037872314453125,
0.050537109375,
0.0108184814453125,
-0.0712890625,
0.0251007080078125,
-0.0274658203125,
0.035736083984375,
0.0760498046875,
0.054229736328125,
-0.0640869140625,
-0.035308837890625,
-0.052642822265625,
-0.075927734375,
0.048980712890625,
0.0225982666015625,
0.0189666748046875,
0.00006914138793945312,
0.02392578125,
0.0138092041015625,
0.0240325927734375,
-0.1005859375,
-0.020355224609375,
-0.0113677978515625,
-0.0169525146484375,
0.00965118408203125,
-0.006591796875,
-0.004375457763671875,
-0.0170135498046875,
0.0579833984375,
0.00240325927734375,
0.01056671142578125,
-0.001300811767578125,
-0.033477783203125,
-0.01004791259765625,
0.01739501953125,
0.0256195068359375,
0.046600341796875,
-0.00992584228515625,
-0.0220489501953125,
0.0302886962890625,
-0.04443359375,
0.0078887939453125,
0.00792694091796875,
-0.035400390625,
-0.0059814453125,
0.0250701904296875,
0.051025390625,
-0.0003540515899658203,
-0.0478515625,
0.0386962890625,
0.00463104248046875,
-0.008575439453125,
-0.0274810791015625,
-0.0251312255859375,
0.01355743408203125,
0.0016956329345703125,
0.02783203125,
0.018341064453125,
0.0187835693359375,
-0.039581298828125,
0.0102081298828125,
0.0190887451171875,
-0.0255584716796875,
-0.0217132568359375,
0.051116943359375,
0.02728271484375,
-0.01197052001953125,
0.055694580078125,
-0.04010009765625,
-0.0227508544921875,
0.034881591796875,
0.0252838134765625,
0.04205322265625,
-0.007183074951171875,
0.0168609619140625,
0.05072021484375,
0.053009033203125,
-0.01113128662109375,
0.0127105712890625,
0.01074981689453125,
-0.041961669921875,
-0.03607177734375,
-0.061370849609375,
-0.0156707763671875,
0.004520416259765625,
-0.07415771484375,
0.0259246826171875,
-0.01535797119140625,
-0.02783203125,
-0.0159454345703125,
0.01367950439453125,
-0.061004638671875,
0.01493072509765625,
0.0164337158203125,
0.075439453125,
-0.0718994140625,
0.07696533203125,
0.019256591796875,
-0.05987548828125,
-0.047882080078125,
0.00969696044921875,
-0.003719329833984375,
-0.041900634765625,
0.037506103515625,
0.0191497802734375,
0.015655517578125,
-0.005771636962890625,
-0.04046630859375,
-0.06005859375,
0.08258056640625,
0.03411865234375,
-0.048614501953125,
-0.0157623291015625,
0.036834716796875,
0.05230712890625,
-0.0014400482177734375,
-0.003826141357421875,
0.0219268798828125,
0.04180908203125,
-0.0121307373046875,
-0.07916259765625,
0.00826263427734375,
-0.01270294189453125,
-0.002216339111328125,
0.0015506744384765625,
-0.04534912109375,
0.05487060546875,
-0.01270294189453125,
-0.0183563232421875,
0.021575927734375,
0.03265380859375,
0.00797271728515625,
0.0169830322265625,
0.0267333984375,
0.050811767578125,
0.06011962890625,
-0.0013437271118164062,
0.09783935546875,
-0.0130767822265625,
0.04107666015625,
0.0828857421875,
-0.0196075439453125,
0.05413818359375,
0.050018310546875,
-0.01549530029296875,
0.016387939453125,
0.032867431640625,
-0.01016998291015625,
0.037841796875,
0.01073455810546875,
0.0114898681640625,
0.00838470458984375,
-0.0228424072265625,
-0.02691650390625,
0.0201416015625,
0.00913238525390625,
-0.03338623046875,
-0.0031032562255859375,
0.014984130859375,
0.0279541015625,
-0.0036907196044921875,
-0.00876617431640625,
0.04559326171875,
0.0150299072265625,
-0.0528564453125,
0.045684814453125,
0.0182342529296875,
0.051971435546875,
-0.04620361328125,
0.01435089111328125,
-0.02978515625,
0.0124969482421875,
-0.0194854736328125,
-0.0511474609375,
0.0467529296875,
0.018280029296875,
-0.018951416015625,
-0.040985107421875,
0.0184478759765625,
-0.0268096923828125,
-0.0543212890625,
0.038330078125,
0.0303497314453125,
0.01364898681640625,
0.0012359619140625,
-0.06732177734375,
0.017425537109375,
0.00971221923828125,
-0.0182037353515625,
0.0269775390625,
0.02508544921875,
-0.00937652587890625,
0.041717529296875,
0.043487548828125,
0.0142669677734375,
0.008758544921875,
0.0081329345703125,
0.050262451171875,
-0.04443359375,
-0.0169525146484375,
-0.0377197265625,
0.047149658203125,
-0.0162506103515625,
-0.031951904296875,
0.0699462890625,
0.054290771484375,
0.09832763671875,
0.00620269775390625,
0.053619384765625,
-0.02337646484375,
0.03533935546875,
-0.02581787109375,
0.06610107421875,
-0.056854248046875,
0.007080078125,
-0.02734375,
-0.0657958984375,
-0.006702423095703125,
0.042999267578125,
-0.01153564453125,
0.01280975341796875,
0.054656982421875,
0.04833984375,
0.01373291015625,
-0.00966644287109375,
0.01629638671875,
0.00716400146484375,
0.0216522216796875,
0.018524169921875,
0.03363037109375,
-0.06561279296875,
0.059173583984375,
-0.019744873046875,
-0.010284423828125,
-0.017913818359375,
-0.06787109375,
-0.055267333984375,
-0.04150390625,
-0.0185699462890625,
-0.029937744140625,
-0.01120758056640625,
0.05908203125,
0.04595947265625,
-0.056304931640625,
-0.031341552734375,
0.003520965576171875,
-0.0198974609375,
-0.0247955322265625,
-0.0180816650390625,
-0.00685882568359375,
-0.01201629638671875,
-0.059173583984375,
0.01004791259765625,
0.0023956298828125,
0.0061492919921875,
-0.030426025390625,
-0.0265960693359375,
-0.0386962890625,
0.0023326873779296875,
0.037445068359375,
0.0136260986328125,
-0.04510498046875,
-0.002346038818359375,
0.0176239013671875,
-0.03887939453125,
-0.004486083984375,
0.03570556640625,
-0.0178375244140625,
0.0364990234375,
0.0253448486328125,
0.0401611328125,
0.044281005859375,
-0.00891876220703125,
0.03228759765625,
-0.05877685546875,
0.0254364013671875,
0.0273284912109375,
0.031890869140625,
0.03466796875,
-0.029876708984375,
0.046356201171875,
0.0137176513671875,
-0.040985107421875,
-0.07330322265625,
0.0062255859375,
-0.07861328125,
-0.0191802978515625,
0.09765625,
-0.01189422607421875,
-0.0032176971435546875,
-0.01140594482421875,
-0.0138092041015625,
0.02862548828125,
-0.0080413818359375,
0.050872802734375,
0.0723876953125,
0.0244140625,
0.006420135498046875,
-0.08172607421875,
0.0177459716796875,
0.029327392578125,
-0.06695556640625,
-0.00201416015625,
0.01531982421875,
0.0283660888671875,
0.0199127197265625,
0.054412841796875,
-0.03765869140625,
0.019317626953125,
-0.0041656494140625,
0.0257720947265625,
0.0154266357421875,
-0.0146942138671875,
-0.0252227783203125,
-0.0160980224609375,
0.00952911376953125,
0.01629638671875
]
] |
prajjwal1/bert-medium | 2021-10-27T18:30:16.000Z | [
"transformers",
"pytorch",
"BERT",
"MNLI",
"NLI",
"transformer",
"pre-training",
"en",
"arxiv:1908.08962",
"arxiv:2110.01518",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | prajjwal1 | null | null | prajjwal1/bert-medium | 2 | 14,585 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
license:
- mit
tags:
- BERT
- MNLI
- NLI
- transformer
- pre-training
---
The following model is a PyTorch pre-trained model obtained by converting a TensorFlow checkpoint found in the [official Google BERT repository](https://github.com/google-research/bert).
This is one of the smaller pre-trained BERT variants, together with [bert-tiny](https://huggingface.co/prajjwal1/bert-tiny), [bert-mini](https://huggingface.co/prajjwal1/bert-mini) and [bert-small](https://huggingface.co/prajjwal1/bert-small). They were introduced in the study `Well-Read Students Learn Better: On the Importance of Pre-training Compact Models` ([arxiv](https://arxiv.org/abs/1908.08962)), and ported to HF for the study `Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics` ([arXiv](https://arxiv.org/abs/2110.01518)). These models are intended to be fine-tuned on a downstream task.
If you use the model, please consider citing both papers:
```
@misc{bhargava2021generalization,
title={Generalization in NLI: Ways (Not) To Go Beyond Simple Heuristics},
author={Prajjwal Bhargava and Aleksandr Drozd and Anna Rogers},
year={2021},
eprint={2110.01518},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@article{DBLP:journals/corr/abs-1908-08962,
author = {Iulia Turc and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {Well-Read Students Learn Better: The Impact of Student Initialization
on Knowledge Distillation},
journal = {CoRR},
volume = {abs/1908.08962},
year = {2019},
url = {http://arxiv.org/abs/1908.08962},
eprinttype = {arXiv},
eprint = {1908.08962},
timestamp = {Thu, 29 Aug 2019 16:32:34 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1908-08962.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
Config of this model:
- `prajjwal1/bert-medium` (L=8, H=512) [Model Link](https://huggingface.co/prajjwal1/bert-medium)
Other models to check out:
- `prajjwal1/bert-tiny` (L=2, H=128) [Model Link](https://huggingface.co/prajjwal1/bert-tiny)
- `prajjwal1/bert-mini` (L=4, H=256) [Model Link](https://huggingface.co/prajjwal1/bert-mini)
- `prajjwal1/bert-small` (L=4, H=512) [Model Link](https://huggingface.co/prajjwal1/bert-small)
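For a rough sense of how the (L, H) pairs above translate into model size, the parameter count of a standard BERT encoder can be approximated from the layer count and hidden size alone. The sketch below is an illustration, not part of the original card: it assumes BERT's usual hyperparameters (WordPiece vocabulary of 30,522, feed-forward size 4H, 512 learned position embeddings, a pooler layer).

```python
def bert_param_count(num_layers: int, hidden: int,
                     vocab: int = 30522, max_pos: int = 512) -> int:
    """Approximate parameter count of a standard BERT encoder (with pooler)."""
    ff = 4 * hidden  # feed-forward (intermediate) size used by BERT
    # token + position + segment embeddings, plus the embedding LayerNorm
    embeddings = (vocab + max_pos + 2) * hidden + 2 * hidden
    per_layer = (
        4 * (hidden * hidden + hidden)  # Q, K, V and attention output projections
        + 2 * (2 * hidden)              # two LayerNorms per block
        + (hidden * ff + ff)            # feed-forward up-projection
        + (ff * hidden + hidden)        # feed-forward down-projection
    )
    pooler = hidden * hidden + hidden
    return embeddings + num_layers * per_layer + pooler

for name, L, H in [("tiny", 2, 128), ("mini", 4, 256),
                   ("small", 4, 512), ("medium", 8, 512)]:
    print(f"bert-{name:6s} L={L} H={H}: ~{bert_param_count(L, H) / 1e6:.1f}M params")
```

This reproduces the familiar ordering of the four checkpoints (roughly 4M for bert-tiny up to roughly 41M for bert-medium); exact counts for the released checkpoints may differ slightly.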
The original implementation and more information can be found in [this GitHub repository](https://github.com/prajjwal1/generalize_lm_nli).
Twitter: [@prajjwal_1](https://twitter.com/prajjwal_1)
| 2,558 | [
[
-0.030975341796875,
-0.04156494140625,
0.034271240234375,
-0.00174713134765625,
-0.01235198974609375,
-0.02227783203125,
-0.0232391357421875,
-0.032470703125,
0.0080718994140625,
0.01288604736328125,
-0.05462646484375,
-0.0250396728515625,
-0.03826904296875,
-0.0101470947265625,
-0.0256195068359375,
0.09228515625,
0.0053253173828125,
0.00481414794921875,
-0.01291656494140625,
-0.0186614990234375,
-0.0119171142578125,
-0.041046142578125,
-0.043701171875,
-0.037445068359375,
0.054901123046875,
-0.0014848709106445312,
0.0360107421875,
0.0157928466796875,
0.04693603515625,
0.0204925537109375,
-0.0295562744140625,
-0.007110595703125,
-0.036956787109375,
-0.019317626953125,
0.003543853759765625,
-0.033172607421875,
-0.04388427734375,
0.00782012939453125,
0.05548095703125,
0.0701904296875,
-0.00865936279296875,
0.02740478515625,
0.021636962890625,
0.0457763671875,
-0.04217529296875,
-0.003086090087890625,
-0.0240020751953125,
-0.0159149169921875,
-0.01169586181640625,
0.020965576171875,
-0.041229248046875,
-0.025909423828125,
0.037322998046875,
-0.03924560546875,
0.041015625,
-0.0011644363403320312,
0.10345458984375,
0.011322021484375,
-0.016632080078125,
-0.00921630859375,
-0.0472412109375,
0.072509765625,
-0.07177734375,
0.037933349609375,
0.0024929046630859375,
0.019378662109375,
-0.0010175704956054688,
-0.07220458984375,
-0.043304443359375,
-0.00562286376953125,
-0.034637451171875,
0.0106658935546875,
-0.0282745361328125,
0.0116424560546875,
0.029083251953125,
0.028076171875,
-0.043701171875,
0.006061553955078125,
-0.042877197265625,
-0.0189971923828125,
0.0307464599609375,
-0.0008678436279296875,
-0.0028896331787109375,
-0.0307159423828125,
-0.02557373046875,
-0.032958984375,
-0.043609619140625,
0.0189666748046875,
0.039093017578125,
0.029205322265625,
-0.0306243896484375,
0.03179931640625,
-0.00021350383758544922,
0.06280517578125,
0.01049041748046875,
-0.0020084381103515625,
0.03228759765625,
-0.04962158203125,
-0.01092529296875,
-0.01540374755859375,
0.05987548828125,
0.00609588623046875,
0.00888824462890625,
-0.00556182861328125,
-0.006862640380859375,
-0.029388427734375,
0.01212310791015625,
-0.07769775390625,
-0.03094482421875,
0.0125732421875,
-0.053253173828125,
-0.002727508544921875,
0.01306915283203125,
-0.046600341796875,
-0.00421142578125,
-0.024688720703125,
0.0384521484375,
-0.03924560546875,
-0.02008056640625,
-0.01082611083984375,
-0.0006756782531738281,
0.031158447265625,
0.029083251953125,
-0.0496826171875,
0.005527496337890625,
0.0357666015625,
0.06988525390625,
0.00689697265625,
-0.0173797607421875,
0.0008273124694824219,
0.004161834716796875,
-0.01450347900390625,
0.0306243896484375,
-0.01629638671875,
-0.0135040283203125,
-0.0056610107421875,
-0.00226593017578125,
-0.0120697021484375,
-0.0260162353515625,
0.050933837890625,
-0.03790283203125,
0.026214599609375,
-0.0260467529296875,
-0.040435791015625,
-0.0195159912109375,
0.01366424560546875,
-0.046173095703125,
0.074951171875,
0.0015964508056640625,
-0.0711669921875,
0.039031982421875,
-0.0484619140625,
-0.0161285400390625,
-0.01439666748046875,
0.010223388671875,
-0.052947998046875,
-0.00041985511779785156,
0.01468658447265625,
0.0390625,
-0.01371002197265625,
0.0237884521484375,
-0.035308837890625,
-0.0253143310546875,
-0.00748443603515625,
-0.005641937255859375,
0.0888671875,
0.022247314453125,
-0.004489898681640625,
0.015380859375,
-0.062042236328125,
0.0081939697265625,
0.01340484619140625,
-0.028778076171875,
-0.03753662109375,
-0.0096893310546875,
-0.001346588134765625,
0.00218963623046875,
0.027069091796875,
-0.0303192138671875,
0.0272369384765625,
-0.0279541015625,
0.0310516357421875,
0.047149658203125,
0.005809783935546875,
0.0369873046875,
-0.039337158203125,
0.007251739501953125,
0.01174163818359375,
0.0239410400390625,
0.0028820037841796875,
-0.03802490234375,
-0.07586669921875,
-0.038238525390625,
0.04248046875,
0.0208892822265625,
-0.04296875,
0.04437255859375,
-0.023162841796875,
-0.05230712890625,
-0.04571533203125,
0.01532745361328125,
0.0234527587890625,
0.036529541015625,
0.033172607421875,
-0.0125579833984375,
-0.054412841796875,
-0.06524658203125,
-0.0164794921875,
-0.025970458984375,
-0.015838623046875,
0.025115966796875,
0.05133056640625,
-0.039764404296875,
0.07977294921875,
-0.0290679931640625,
-0.0227508544921875,
-0.024810791015625,
0.02630615234375,
0.052520751953125,
0.06402587890625,
0.061187744140625,
-0.039764404296875,
-0.031707763671875,
-0.029205322265625,
-0.042388916015625,
0.008697509765625,
-0.01474761962890625,
-0.0212860107421875,
0.0128326416015625,
0.0298309326171875,
-0.04498291015625,
0.0302886962890625,
0.022705078125,
-0.0272979736328125,
0.033935546875,
-0.01702880859375,
-0.007099151611328125,
-0.08453369140625,
0.0265655517578125,
0.0033893585205078125,
-0.0037975311279296875,
-0.041717529296875,
0.0113983154296875,
0.0003848075866699219,
0.00887298583984375,
-0.01410675048828125,
0.04986572265625,
-0.0418701171875,
0.0035877227783203125,
0.00946044921875,
-0.01117706298828125,
-0.0034008026123046875,
0.036102294921875,
-0.0014934539794921875,
0.039306640625,
0.022979736328125,
-0.0343017578125,
-0.005542755126953125,
0.0335693359375,
-0.03375244140625,
0.012939453125,
-0.0823974609375,
0.01116180419921875,
-0.0034008026123046875,
0.030914306640625,
-0.07098388671875,
-0.0177459716796875,
0.0216217041015625,
-0.0306549072265625,
0.028839111328125,
-0.026947021484375,
-0.053619384765625,
-0.03399658203125,
-0.021240234375,
0.0268402099609375,
0.056427001953125,
-0.048553466796875,
0.05029296875,
-0.00679779052734375,
-0.0018186569213867188,
-0.035186767578125,
-0.0523681640625,
-0.032806396484375,
-0.001415252685546875,
-0.052520751953125,
0.026214599609375,
-0.0195159912109375,
-0.004482269287109375,
0.01239776611328125,
-0.0013790130615234375,
-0.019561767578125,
-0.0033626556396484375,
0.01248931884765625,
0.043701171875,
-0.022216796875,
0.01035308837890625,
0.005657196044921875,
0.0167694091796875,
-0.0033092498779296875,
-0.005336761474609375,
0.043212890625,
-0.0228424072265625,
-0.0132904052734375,
-0.04351806640625,
0.00815582275390625,
0.02911376953125,
-0.002155303955078125,
0.08233642578125,
0.06915283203125,
-0.028106689453125,
0.0026721954345703125,
-0.048065185546875,
-0.044219970703125,
-0.034912109375,
0.014373779296875,
-0.0195465087890625,
-0.05657958984375,
0.048492431640625,
0.00350189208984375,
0.0166778564453125,
0.057952880859375,
0.036590576171875,
-0.02197265625,
0.055816650390625,
0.0595703125,
-0.00004929304122924805,
0.060272216796875,
-0.052764892578125,
0.01983642578125,
-0.0697021484375,
-0.01541900634765625,
-0.04534912109375,
-0.030487060546875,
-0.046173095703125,
-0.014495849609375,
0.0208892822265625,
0.0271453857421875,
-0.037628173828125,
0.029327392578125,
-0.044219970703125,
0.0123138427734375,
0.0645751953125,
0.022430419921875,
0.004673004150390625,
-0.0008482933044433594,
-0.0306549072265625,
-0.0030765533447265625,
-0.07342529296875,
-0.0266876220703125,
0.1004638671875,
0.0307464599609375,
0.045501708984375,
0.02301025390625,
0.07891845703125,
0.0021457672119140625,
0.0240020751953125,
-0.046905517578125,
0.03448486328125,
-0.00453948974609375,
-0.08001708984375,
-0.0189971923828125,
-0.04779052734375,
-0.0771484375,
0.005901336669921875,
-0.028778076171875,
-0.053741455078125,
0.03857421875,
0.00681304931640625,
-0.048583984375,
0.01526641845703125,
-0.07177734375,
0.056915283203125,
0.0028553009033203125,
-0.035858154296875,
-0.0104827880859375,
-0.052154541015625,
0.027801513671875,
0.0010690689086914062,
0.0038394927978515625,
0.0111846923828125,
0.0174713134765625,
0.08123779296875,
-0.048065185546875,
0.0689697265625,
-0.03082275390625,
0.0193328857421875,
0.0390625,
-0.014495849609375,
0.046722412109375,
0.00734710693359375,
-0.0035495758056640625,
0.0305938720703125,
0.01201629638671875,
-0.04443359375,
-0.018310546875,
0.042083740234375,
-0.08819580078125,
-0.0345458984375,
-0.048065185546875,
-0.047149658203125,
-0.006832122802734375,
0.0325927734375,
0.0294952392578125,
0.0250244140625,
0.006683349609375,
0.036590576171875,
0.056121826171875,
-0.009674072265625,
0.043182373046875,
0.034912109375,
-0.00844573974609375,
-0.00897216796875,
0.04559326171875,
0.0105743408203125,
0.0178375244140625,
0.01012420654296875,
0.01381683349609375,
-0.020660400390625,
-0.05859375,
-0.005893707275390625,
0.0439453125,
-0.051300048828125,
-0.0022430419921875,
-0.047821044921875,
-0.035430908203125,
-0.04339599609375,
-0.01885986328125,
-0.0260009765625,
-0.0157928466796875,
-0.03717041015625,
0.0034885406494140625,
0.0234527587890625,
0.03857421875,
-0.02044677734375,
0.033966064453125,
-0.0487060546875,
0.00412750244140625,
0.034698486328125,
0.014434814453125,
0.01064300537109375,
-0.05633544921875,
-0.01398468017578125,
0.0025482177734375,
-0.0157012939453125,
-0.03924560546875,
0.0216064453125,
0.0207366943359375,
0.059661865234375,
0.0305938720703125,
0.01161956787109375,
0.050445556640625,
-0.02252197265625,
0.05108642578125,
0.033355712890625,
-0.042877197265625,
0.03863525390625,
-0.0289459228515625,
0.0196685791015625,
0.05523681640625,
0.03802490234375,
-0.003360748291015625,
-0.004032135009765625,
-0.062347412109375,
-0.0804443359375,
0.053985595703125,
0.01366424560546875,
0.00902557373046875,
0.0277252197265625,
0.03167724609375,
0.00731658935546875,
0.0124664306640625,
-0.06427001953125,
-0.02569580078125,
-0.01256561279296875,
-0.021209716796875,
-0.01214599609375,
-0.03826904296875,
-0.0232391357421875,
-0.0504150390625,
0.059539794921875,
-0.0002720355987548828,
0.047454833984375,
0.0233612060546875,
-0.01641845703125,
0.01383209228515625,
0.005947113037109375,
0.036956787109375,
0.048248291015625,
-0.05133056640625,
-0.01419830322265625,
-0.0005292892456054688,
-0.039886474609375,
-0.01526641845703125,
0.02520751953125,
-0.024078369140625,
0.0123291015625,
0.04541015625,
0.06256103515625,
0.01849365234375,
-0.017242431640625,
0.03997802734375,
0.003574371337890625,
-0.021026611328125,
-0.029571533203125,
0.002292633056640625,
-0.0009598731994628906,
0.03094482421875,
0.02862548828125,
0.0210723876953125,
0.007305145263671875,
-0.03558349609375,
0.00702667236328125,
0.0189056396484375,
-0.0195159912109375,
-0.0225677490234375,
0.0511474609375,
0.0195465087890625,
0.004314422607421875,
0.05767822265625,
-0.0234527587890625,
-0.029632568359375,
0.0278472900390625,
0.0175628662109375,
0.054901123046875,
0.017608642578125,
0.004779815673828125,
0.06756591796875,
0.0261993408203125,
-0.009307861328125,
0.005268096923828125,
-0.0112457275390625,
-0.050872802734375,
-0.019866943359375,
-0.06622314453125,
-0.017547607421875,
0.00827789306640625,
-0.05523681640625,
0.0214385986328125,
-0.041595458984375,
-0.0263519287109375,
0.01210784912109375,
0.0189208984375,
-0.0679931640625,
0.0050201416015625,
0.001987457275390625,
0.061004638671875,
-0.05194091796875,
0.07452392578125,
0.058624267578125,
-0.044281005859375,
-0.06756591796875,
0.00197601318359375,
-0.01102447509765625,
-0.0460205078125,
0.053741455078125,
-0.01093292236328125,
0.0200653076171875,
0.0101318359375,
-0.038909912109375,
-0.06719970703125,
0.09716796875,
0.0175628662109375,
-0.0614013671875,
-0.0275421142578125,
-0.0128326416015625,
0.0396728515625,
-0.0055084228515625,
0.0309295654296875,
0.0260467529296875,
0.0280303955078125,
0.02783203125,
-0.05865478515625,
0.00004881620407104492,
-0.01535797119140625,
0.0017404556274414062,
0.00567626953125,
-0.058441162109375,
0.093994140625,
-0.028350830078125,
0.0019779205322265625,
0.0206298828125,
0.046295166015625,
0.030975341796875,
0.0132293701171875,
0.03521728515625,
0.0556640625,
0.056976318359375,
-0.025970458984375,
0.080810546875,
-0.0141754150390625,
0.05767822265625,
0.08062744140625,
0.0205230712890625,
0.05792236328125,
0.05230712890625,
-0.0293731689453125,
0.04681396484375,
0.060150146484375,
-0.016204833984375,
0.05145263671875,
0.00641632080078125,
0.0094757080078125,
-0.022369384765625,
0.019012451171875,
-0.046783447265625,
0.008453369140625,
0.00836944580078125,
-0.03289794921875,
-0.0159759521484375,
-0.01552581787109375,
0.01062774658203125,
-0.0274810791015625,
-0.0225830078125,
0.044769287109375,
0.0032806396484375,
-0.0313720703125,
0.05755615234375,
-0.0178375244140625,
0.0694580078125,
-0.0589599609375,
0.0146636962890625,
-0.01062774658203125,
0.030303955078125,
-0.00873565673828125,
-0.032440185546875,
0.0185394287109375,
-0.001544952392578125,
-0.03216552734375,
-0.0145416259765625,
0.05816650390625,
-0.01331329345703125,
-0.051666259765625,
0.0211334228515625,
0.036712646484375,
0.0088958740234375,
0.0166168212890625,
-0.0648193359375,
0.0042266845703125,
0.0007739067077636719,
-0.039337158203125,
0.0245361328125,
0.01251220703125,
0.013214111328125,
0.0338134765625,
0.057281494140625,
-0.0037593841552734375,
0.0265960693359375,
-0.0038166046142578125,
0.060882568359375,
-0.0270233154296875,
-0.02899169921875,
-0.04296875,
0.0509033203125,
-0.0167236328125,
-0.044219970703125,
0.050018310546875,
0.031707763671875,
0.078369140625,
-0.0088348388671875,
0.045867919921875,
-0.0264739990234375,
0.04669189453125,
-0.02874755859375,
0.07623291015625,
-0.059783935546875,
0.01116180419921875,
-0.0236053466796875,
-0.06707763671875,
-0.0114898681640625,
0.05706787109375,
-0.0419921875,
0.0323486328125,
0.04449462890625,
0.03680419921875,
-0.0009369850158691406,
-0.0203094482421875,
0.005603790283203125,
0.03167724609375,
0.02008056640625,
0.031951904296875,
0.04156494140625,
-0.0419921875,
0.04034423828125,
-0.032012939453125,
-0.00945281982421875,
-0.03887939453125,
-0.049530029296875,
-0.08447265625,
-0.04925537109375,
-0.029632568359375,
-0.0306243896484375,
0.0016660690307617188,
0.05963134765625,
0.07177734375,
-0.07598876953125,
-0.004871368408203125,
-0.0120697021484375,
-0.00118255615234375,
-0.01059722900390625,
-0.0156402587890625,
0.032012939453125,
-0.0190277099609375,
-0.051605224609375,
-0.00376129150390625,
-0.030609130859375,
0.0208892822265625,
-0.007389068603515625,
-0.01812744140625,
-0.0386962890625,
0.00548553466796875,
0.026092529296875,
0.0211944580078125,
-0.04827880859375,
-0.0294189453125,
-0.005115509033203125,
-0.01334381103515625,
-0.01119232177734375,
0.03863525390625,
-0.04541015625,
0.0233001708984375,
0.039794921875,
0.03350830078125,
0.053192138671875,
-0.02288818359375,
0.01306915283203125,
-0.060211181640625,
0.032470703125,
0.0213775634765625,
0.036224365234375,
0.0132293701171875,
-0.0072784423828125,
0.04833984375,
0.02813720703125,
-0.0404052734375,
-0.08319091796875,
-0.003879547119140625,
-0.0849609375,
-0.0120391845703125,
0.07958984375,
-0.0313720703125,
-0.01367950439453125,
0.023590087890625,
-0.00308990478515625,
0.0281829833984375,
-0.0275115966796875,
0.0523681640625,
0.060089111328125,
-0.0016613006591796875,
-0.0127716064453125,
-0.0404052734375,
0.030029296875,
0.0282745361328125,
-0.04364013671875,
-0.0260772705078125,
0.0153656005859375,
0.0274505615234375,
0.0292816162109375,
0.02471923828125,
0.007419586181640625,
0.0146331787109375,
-0.0024738311767578125,
0.020111083984375,
-0.00884246826171875,
-0.019317626953125,
-0.007389068603515625,
-0.0034999847412109375,
-0.002948760986328125,
-0.0116424560546875
]
] |
google/t5-efficient-tiny | 2023-01-24T16:51:36.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/t5-efficient-tiny | 9 | 14,575 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
datasets:
- c4
tags:
- deep-narrow
inference: false
license: apache-2.0
---
# T5-Efficient-TINY (Deep-Narrow version)
T5-Efficient-TINY is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Model architecture details
This model checkpoint - **t5-efficient-tiny** - is of model type **Tiny** with no variations.
It has **15.58** million parameters and thus requires *ca.* **62.32 MB** of memory in full precision (*fp32*)
or **31.16 MB** of memory in half precision (*fp16* or *bf16*).
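The memory figures above follow directly from the parameter count: each fp32 parameter takes 4 bytes and each fp16/bf16 parameter takes 2. A quick sanity check (the 15.58 M figure is taken from this card):

```python
n_params = 15_580_000  # parameter count of t5-efficient-tiny, per this card

fp32_mb = n_params * 4 / 1e6  # 4 bytes per parameter in full precision
fp16_mb = n_params * 2 / 1e6  # 2 bytes per parameter in half precision

print(f"fp32: {fp32_mb:.2f} MB, fp16/bf16: {fp16_mb:.2f} MB")
```

Note this counts weights only; optimizer state and activations during fine-tuning add substantially more.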
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
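The span-corruption objective can be sketched in a few lines: contiguous spans of the input are replaced by sentinel tokens, and the target is the sequence of sentinels each followed by the dropped-out tokens. A minimal, dependency-free illustration (span positions are chosen by hand here; during pre-training they are sampled randomly):

```python
def span_corrupt(tokens, spans):
    """Replace each (start, end) span with a sentinel; build the target sequence.

    `spans` must be sorted and non-overlapping, with half-open indices.
    """
    inputs, targets = [], []
    prev = 0
    for i, (start, end) in enumerate(spans):
        inputs.extend(tokens[prev:start])       # keep text before the span
        inputs.append(f"<extra_id_{i}>")        # sentinel marks the dropped span
        targets.append(f"<extra_id_{i}>")       # target: sentinel + dropped tokens
        targets.extend(tokens[start:end])
        prev = end
    inputs.extend(tokens[prev:])                # keep trailing text
    targets.append(f"<extra_id_{len(spans)}>")  # final sentinel ends the target
    return inputs, targets

tokens = "Thank you for inviting me to your party last week".split()
inp, tgt = span_corrupt(tokens, [(2, 3), (8, 9)])  # drop "for" and "last"
print(" ".join(inp))
print(" ".join(tgt))
```

This mirrors the canonical example from the T5 paper; the real objective additionally operates on SentencePiece ids and samples span lengths and positions.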
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future. | 6,299 | [
[
-0.039154052734375,
-0.04681396484375,
0.0264739990234375,
0.0115966796875,
-0.014312744140625,
0.0012388229370117188,
-0.01241302490234375,
-0.037200927734375,
0.0023059844970703125,
0.023834228515625,
-0.035675048828125,
-0.0374755859375,
-0.061431884765625,
0.009490966796875,
-0.038848876953125,
0.07281494140625,
-0.0133514404296875,
-0.0005517005920410156,
0.0007147789001464844,
-0.0091400146484375,
-0.025146484375,
-0.0259246826171875,
-0.044708251953125,
-0.0219268798828125,
0.0225982666015625,
0.021759033203125,
0.038116455078125,
0.06317138671875,
0.05157470703125,
0.0225982666015625,
-0.028076171875,
-0.0027141571044921875,
-0.03759765625,
-0.0311431884765625,
0.005542755126953125,
-0.021148681640625,
-0.053741455078125,
-0.008331298828125,
0.053466796875,
0.06072998046875,
-0.0036144256591796875,
0.0198516845703125,
0.01561737060546875,
0.047210693359375,
-0.04656982421875,
0.0078125,
-0.039886474609375,
0.0133514404296875,
-0.004962921142578125,
0.00746917724609375,
-0.03826904296875,
-0.01611328125,
0.01328277587890625,
-0.03369140625,
0.031982421875,
0.0015964508056640625,
0.09326171875,
0.0316162109375,
-0.0243988037109375,
-0.01904296875,
-0.057525634765625,
0.06201171875,
-0.050201416015625,
0.046966552734375,
0.0165252685546875,
0.03045654296875,
0.0257568359375,
-0.0904541015625,
-0.031707763671875,
0.0019702911376953125,
-0.018310546875,
0.016448974609375,
-0.0224761962890625,
0.0171966552734375,
0.043365478515625,
0.05419921875,
-0.0300750732421875,
0.0079193115234375,
-0.05426025390625,
-0.025390625,
0.03485107421875,
0.01513671875,
0.005290985107421875,
-0.01134490966796875,
-0.031494140625,
-0.0112457275390625,
-0.043853759765625,
0.01023101806640625,
0.01221466064453125,
0.00913238525390625,
-0.039794921875,
0.0291595458984375,
-0.0135040283203125,
0.043060302734375,
0.0167694091796875,
-0.01522064208984375,
0.0274810791015625,
-0.041656494140625,
-0.025177001953125,
-0.004718780517578125,
0.055328369140625,
0.01061248779296875,
0.00954437255859375,
-0.025634765625,
-0.023773193359375,
-0.00074005126953125,
0.03839111328125,
-0.08721923828125,
-0.0158538818359375,
0.023162841796875,
-0.0452880859375,
-0.0278167724609375,
0.0005636215209960938,
-0.052947998046875,
0.002147674560546875,
-0.00838470458984375,
0.056427001953125,
-0.04217529296875,
-0.0030689239501953125,
0.0222930908203125,
-0.0164794921875,
0.036712646484375,
0.02471923828125,
-0.0677490234375,
0.031341552734375,
0.044097900390625,
0.070556640625,
0.000005602836608886719,
-0.0182952880859375,
-0.00534820556640625,
-0.006229400634765625,
-0.0155181884765625,
0.045318603515625,
-0.026824951171875,
-0.0197906494140625,
-0.0096435546875,
0.0029201507568359375,
-0.014984130859375,
-0.0286865234375,
0.049102783203125,
-0.0367431640625,
0.031982421875,
-0.025787353515625,
-0.038421630859375,
-0.02764892578125,
0.0240936279296875,
-0.052825927734375,
0.06689453125,
0.008026123046875,
-0.0697021484375,
0.027099609375,
-0.06103515625,
-0.023284912109375,
-0.01004791259765625,
0.0171966552734375,
-0.036773681640625,
-0.00047969818115234375,
0.01837158203125,
0.0352783203125,
-0.0267333984375,
0.0006227493286132812,
-0.00479888916015625,
-0.036468505859375,
-0.00852203369140625,
-0.0160064697265625,
0.06488037109375,
0.0265350341796875,
-0.02606201171875,
0.0023632049560546875,
-0.0521240234375,
-0.004924774169921875,
0.005062103271484375,
-0.03271484375,
0.01444244384765625,
-0.00867462158203125,
-0.009307861328125,
0.006683349609375,
0.03424072265625,
-0.0312347412109375,
0.00998687744140625,
-0.02960205078125,
0.0513916015625,
0.037933349609375,
-0.0025844573974609375,
0.052825927734375,
-0.02557373046875,
0.03472900390625,
0.01528167724609375,
0.0217132568359375,
-0.03314208984375,
-0.026031494140625,
-0.091796875,
-0.0303192138671875,
0.05712890625,
0.041229248046875,
-0.045379638671875,
0.047088623046875,
-0.043670654296875,
-0.050872802734375,
-0.048248291015625,
-0.0135955810546875,
0.03436279296875,
0.042327880859375,
0.049346923828125,
-0.0226898193359375,
-0.04693603515625,
-0.077880859375,
-0.0116119384765625,
0.0045928955078125,
-0.0031032562255859375,
0.01140594482421875,
0.0516357421875,
-0.028717041015625,
0.054107666015625,
-0.032379150390625,
-0.018280029296875,
-0.037811279296875,
0.0056304931640625,
0.0173492431640625,
0.042755126953125,
0.03509521484375,
-0.037750244140625,
-0.035797119140625,
-0.00786590576171875,
-0.05316162109375,
0.00879669189453125,
0.0060577392578125,
-0.0007724761962890625,
0.01490020751953125,
0.038665771484375,
-0.052490234375,
0.0179595947265625,
0.0233917236328125,
-0.01216888427734375,
0.0215301513671875,
-0.0127716064453125,
-0.01035308837890625,
-0.09869384765625,
0.017303466796875,
-0.0121002197265625,
-0.01568603515625,
-0.05841064453125,
0.0159759521484375,
0.0175933837890625,
-0.0012636184692382812,
-0.037811279296875,
0.0303192138671875,
-0.038665771484375,
-0.00562286376953125,
-0.0021839141845703125,
-0.00820159912109375,
0.00775909423828125,
0.05157470703125,
-0.004047393798828125,
0.054840087890625,
0.03375244140625,
-0.062164306640625,
0.00021755695343017578,
0.0177764892578125,
-0.0251007080078125,
0.019622802734375,
-0.05926513671875,
0.01485443115234375,
-0.01141357421875,
0.0214691162109375,
-0.0537109375,
-0.00875091552734375,
0.01128387451171875,
-0.043121337890625,
0.035797119140625,
-0.0028018951416015625,
-0.041717529296875,
-0.0341796875,
-0.0262451171875,
0.031707763671875,
0.06475830078125,
-0.036285400390625,
0.052520751953125,
0.026092529296875,
0.0243377685546875,
-0.046234130859375,
-0.06304931640625,
-0.007434844970703125,
-0.026153564453125,
-0.060302734375,
0.048004150390625,
0.007389068603515625,
0.0051727294921875,
-0.0042266845703125,
-0.00045680999755859375,
-0.00479888916015625,
0.01296234130859375,
-0.001438140869140625,
-0.0008568763732910156,
-0.0231170654296875,
0.00125885009765625,
0.007732391357421875,
-0.01438140869140625,
0.0014524459838867188,
-0.0192413330078125,
0.0498046875,
-0.03485107421875,
0.00595855712890625,
-0.0295562744140625,
0.0079803466796875,
0.034942626953125,
-0.01727294921875,
0.054901123046875,
0.0689697265625,
-0.04241943359375,
-0.004489898681640625,
-0.03509521484375,
-0.02099609375,
-0.040374755859375,
0.028411865234375,
-0.0421142578125,
-0.06402587890625,
0.055084228515625,
0.0007557868957519531,
0.006031036376953125,
0.0675048828125,
0.03369140625,
-0.0016956329345703125,
0.076416015625,
0.061279296875,
0.00020325183868408203,
0.02996826171875,
-0.053558349609375,
0.026214599609375,
-0.06732177734375,
-0.0187835693359375,
-0.04852294921875,
-0.03851318359375,
-0.05914306640625,
-0.0291595458984375,
0.0184326171875,
0.016754150390625,
-0.04498291015625,
0.043731689453125,
-0.038177490234375,
0.02825927734375,
0.035308837890625,
0.0034999847412109375,
0.01129913330078125,
0.0028057098388671875,
-0.011932373046875,
-0.00823974609375,
-0.069580078125,
-0.034637451171875,
0.07232666015625,
0.05322265625,
0.044342041015625,
-0.0035953521728515625,
0.052825927734375,
0.00943756103515625,
0.00435638427734375,
-0.06488037109375,
0.035797119140625,
-0.019073486328125,
-0.03857421875,
-0.002567291259765625,
-0.027008056640625,
-0.06585693359375,
0.01070404052734375,
-0.00800323486328125,
-0.05657958984375,
0.022003173828125,
0.005908966064453125,
-0.03363037109375,
0.034820556640625,
-0.07049560546875,
0.0855712890625,
-0.00034117698669433594,
-0.01042938232421875,
-0.01263427734375,
-0.056793212890625,
0.02728271484375,
0.0006170272827148438,
-0.00418853759765625,
0.021209716796875,
-0.00878143310546875,
0.064453125,
-0.039703369140625,
0.058319091796875,
-0.01261138916015625,
0.0195770263671875,
0.0198974609375,
-0.008636474609375,
0.0357666015625,
-0.02239990234375,
0.004180908203125,
0.03369140625,
0.0069427490234375,
-0.031524658203125,
-0.0264129638671875,
0.03216552734375,
-0.07977294921875,
-0.029998779296875,
-0.0268707275390625,
-0.03143310546875,
-0.00572967529296875,
0.03094482421875,
0.046600341796875,
0.0343017578125,
-0.014678955078125,
0.04156494140625,
0.0654296875,
-0.023681640625,
0.03955078125,
0.0217132568359375,
-0.002105712890625,
-0.018280029296875,
0.0653076171875,
0.01496124267578125,
0.017852783203125,
0.053131103515625,
0.01247406005859375,
-0.0268096923828125,
-0.04791259765625,
-0.0287933349609375,
0.0148162841796875,
-0.058502197265625,
-0.021942138671875,
-0.0789794921875,
-0.04559326171875,
-0.0303497314453125,
-0.0129547119140625,
-0.044189453125,
-0.0296478271484375,
-0.0291290283203125,
-0.0124053955078125,
0.021728515625,
0.046112060546875,
0.00856781005859375,
0.035888671875,
-0.061370849609375,
0.0236053466796875,
0.0123748779296875,
0.0272064208984375,
0.00827789306640625,
-0.0706787109375,
-0.035797119140625,
0.00534820556640625,
-0.038543701171875,
-0.05419921875,
0.0253448486328125,
0.010498046875,
0.037567138671875,
0.035675048828125,
-0.0014629364013671875,
0.056793212890625,
-0.033172607421875,
0.064208984375,
-0.003490447998046875,
-0.08148193359375,
0.0213165283203125,
-0.0165557861328125,
0.0367431640625,
0.04180908203125,
0.0305633544921875,
-0.0244903564453125,
-0.010772705078125,
-0.06756591796875,
-0.0633544921875,
0.06622314453125,
0.01898193359375,
0.0109405517578125,
0.0179443359375,
0.01531982421875,
0.00539398193359375,
0.0241241455078125,
-0.060211181640625,
-0.0115814208984375,
-0.028076171875,
-0.0194244384765625,
0.0006985664367675781,
-0.02947998046875,
0.003070831298828125,
-0.034637451171875,
0.039306640625,
0.00012743473052978516,
0.060089111328125,
0.00800323486328125,
-0.029876708984375,
0.01702880859375,
0.0135498046875,
0.053741455078125,
0.04217529296875,
-0.02813720703125,
0.01324462890625,
0.036956787109375,
-0.047271728515625,
-0.0158843994140625,
0.00223541259765625,
-0.01593017578125,
-0.01253509521484375,
0.031707763671875,
0.081298828125,
0.01410675048828125,
-0.0189971923828125,
0.033599853515625,
-0.00553131103515625,
-0.0242462158203125,
-0.016357421875,
0.00029778480529785156,
0.00363922119140625,
0.011932373046875,
0.0113067626953125,
0.01543426513671875,
0.0167694091796875,
-0.0308685302734375,
0.000347137451171875,
0.0183258056640625,
-0.032135009765625,
-0.032623291015625,
0.0511474609375,
0.021881103515625,
-0.0245513916015625,
0.049102783203125,
-0.01922607421875,
-0.05267333984375,
0.051849365234375,
0.032012939453125,
0.08673095703125,
0.0009965896606445312,
-0.0011892318725585938,
0.051483154296875,
0.048675537109375,
-0.0057220458984375,
0.007537841796875,
-0.01641845703125,
-0.06072998046875,
-0.0196990966796875,
-0.061767578125,
-0.01505279541015625,
0.00801849365234375,
-0.0487060546875,
0.032073974609375,
-0.04486083984375,
0.00311279296875,
-0.00421142578125,
0.014129638671875,
-0.071044921875,
0.0274200439453125,
0.0019235610961914062,
0.08770751953125,
-0.046539306640625,
0.0673828125,
0.04351806640625,
-0.03326416015625,
-0.0665283203125,
0.0024929046630859375,
-0.028656005859375,
-0.059112548828125,
0.0267181396484375,
0.031005859375,
0.0006365776062011719,
0.00907135009765625,
-0.032379150390625,
-0.06829833984375,
0.10821533203125,
0.01496124267578125,
-0.0279388427734375,
-0.01467132568359375,
0.0151824951171875,
0.059417724609375,
-0.031463623046875,
0.03533935546875,
0.051483154296875,
0.0335693359375,
0.01024627685546875,
-0.06829833984375,
0.01137542724609375,
-0.0221099853515625,
0.0108489990234375,
0.01451873779296875,
-0.060577392578125,
0.07598876953125,
-0.0189361572265625,
-0.019805908203125,
0.0028533935546875,
0.048980712890625,
-0.0037250518798828125,
0.0099639892578125,
0.03460693359375,
0.06494140625,
0.0447998046875,
-0.0230560302734375,
0.0985107421875,
-0.0482177734375,
0.03369140625,
0.050872802734375,
0.0217132568359375,
0.056976318359375,
0.0168914794921875,
-0.0171966552734375,
0.0270538330078125,
0.0633544921875,
-0.0152130126953125,
0.044403076171875,
-0.01438140869140625,
0.007678985595703125,
-0.01103973388671875,
0.010101318359375,
-0.046295166015625,
0.033050537109375,
0.023468017578125,
-0.04486083984375,
-0.01128387451171875,
-0.018280029296875,
0.0097808837890625,
-0.030548095703125,
-0.0273284912109375,
0.061248779296875,
0.012847900390625,
-0.04327392578125,
0.048492431640625,
0.030364990234375,
0.054595947265625,
-0.042724609375,
0.020782470703125,
-0.020050048828125,
0.0289306640625,
-0.0166015625,
-0.04296875,
0.0372314453125,
0.0009918212890625,
-0.02301025390625,
-0.0300445556640625,
0.052459716796875,
-0.01531982421875,
-0.03985595703125,
0.017547607421875,
0.0264892578125,
0.01446533203125,
-0.0035800933837890625,
-0.052764892578125,
0.006999969482421875,
-0.0016832351684570312,
-0.011688232421875,
0.0176544189453125,
0.0228118896484375,
0.007038116455078125,
0.03704833984375,
0.0218505859375,
-0.014923095703125,
0.015228271484375,
-0.006763458251953125,
0.060455322265625,
-0.059661865234375,
-0.0472412109375,
-0.052276611328125,
0.053924560546875,
-0.0003604888916015625,
-0.025787353515625,
0.05181884765625,
0.04852294921875,
0.07659912109375,
-0.01338958740234375,
0.044189453125,
-0.0082855224609375,
0.02838134765625,
-0.039764404296875,
0.0526123046875,
-0.053253173828125,
0.0207672119140625,
-0.0159149169921875,
-0.0782470703125,
-0.00969696044921875,
0.040679931640625,
-0.02960205078125,
0.0265045166015625,
0.06396484375,
0.052947998046875,
-0.02056884765625,
0.0035915374755859375,
0.0169830322265625,
0.033905029296875,
0.01384735107421875,
0.047454833984375,
0.0455322265625,
-0.0662841796875,
0.0484619140625,
-0.031707763671875,
0.00212860107421875,
-0.016998291015625,
-0.050384521484375,
-0.06707763671875,
-0.044036865234375,
-0.021697998046875,
-0.0292510986328125,
-0.004779815673828125,
0.042510986328125,
0.05181884765625,
-0.053466796875,
-0.010345458984375,
-0.01898193359375,
0.005420684814453125,
-0.00954437255859375,
-0.018218994140625,
0.02777099609375,
-0.0264129638671875,
-0.0657958984375,
0.00458526611328125,
0.00782012939453125,
0.005680084228515625,
-0.0176239013671875,
-0.00835418701171875,
-0.00002872943878173828,
0.006473541259765625,
0.049713134765625,
0.01251983642578125,
-0.04443359375,
-0.0227508544921875,
0.0125274658203125,
-0.01432037353515625,
0.0149078369140625,
0.045166015625,
-0.053466796875,
0.025054931640625,
0.055450439453125,
0.0526123046875,
0.06829833984375,
-0.00927734375,
0.032623291015625,
-0.049102783203125,
0.0149383544921875,
0.0194854736328125,
0.027679443359375,
0.02947998046875,
-0.017364501953125,
0.0340576171875,
0.016937255859375,
-0.037811279296875,
-0.0662841796875,
0.0120086669921875,
-0.082763671875,
-0.008209228515625,
0.082763671875,
-0.00632476806640625,
-0.0253448486328125,
0.01593017578125,
-0.004329681396484375,
0.0211639404296875,
-0.0149383544921875,
0.060211181640625,
0.050201416015625,
0.01273345947265625,
-0.02880859375,
-0.026885986328125,
0.044708251953125,
0.0245513916015625,
-0.051544189453125,
-0.032806396484375,
0.00933074951171875,
0.023590087890625,
0.021697998046875,
0.04150390625,
-0.0187835693359375,
0.01267242431640625,
0.0159912109375,
0.00901031494140625,
-0.006763458251953125,
-0.03662109375,
-0.0321044921875,
0.0179901123046875,
-0.00989532470703125,
-0.0269317626953125
]
] |
Undi95/Emerhyst-20B | 2023-09-27T00:59:36.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"not-for-all-audiences",
"nsfw",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | Undi95 | null | null | Undi95/Emerhyst-20B | 21 | 14,564 | transformers | 2023-09-27T00:01:53 | ---
license: cc-by-nc-4.0
tags:
- not-for-all-audiences
- nsfw
---

Merge of [Amethyst 13B](https://huggingface.co/Undi95/Amethyst-13B) and [Emerald 13B](https://huggingface.co/Undi95/Emerald-13B).
In addition, [LimaRP v3](https://huggingface.co/lemonilia/LimaRP-Llama2-13B-v3-EXPERIMENT) was used; it is recommended to read its documentation.
<!-- description start -->
## Description
This repo contains fp16 files of Emerhyst-20B.
<!-- description end -->
<!-- description start -->
## Models and loras used
- PygmalionAI/pygmalion-2-13b
- Xwin-LM/Xwin-LM-13B-V0.1
- The-Face-Of-Goonery/Huginn-13b-FP16
- zattio770/120-Days-of-LORA-v2-13B
- lemonilia/LimaRP-Llama2-13B-v3-EXPERIMENT
<!-- description end -->
<!-- prompt-template start -->
## Prompt template: Alpaca
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:

```
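As a rough illustration, the template above can be assembled programmatically before being handed to a generation frontend; the helper below is a minimal sketch, and its name is an assumption, not part of this repo:

```python
def build_alpaca_prompt(instruction: str) -> str:
    """Fill the Alpaca template with a user instruction (illustrative helper)."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

print(build_alpaca_prompt("Write a short scene set in a misty forest."))
```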
## LimaRP v3 usage and suggested settings

You can follow these instruction format settings in SillyTavern. Replace tiny with your desired response length:

Special thanks to Sushi.
If you want to support me, you can [here](https://ko-fi.com/undiai). | 1,544 | [
[
-0.05816650390625,
-0.057952880859375,
0.0380859375,
0.0271453857421875,
-0.033905029296875,
-0.0187530517578125,
-0.007137298583984375,
-0.03814697265625,
0.050506591796875,
0.0450439453125,
-0.06298828125,
-0.0194244384765625,
-0.042755126953125,
0.023162841796875,
0.009307861328125,
0.07818603515625,
0.00104522705078125,
-0.0224151611328125,
0.01617431640625,
-0.01047515869140625,
-0.029266357421875,
0.0031490325927734375,
-0.066162109375,
-0.0125579833984375,
0.0474853515625,
0.038604736328125,
0.04388427734375,
0.0523681640625,
0.040008544921875,
0.02978515625,
-0.0147705078125,
0.033966064453125,
-0.022979736328125,
0.0176849365234375,
-0.005443572998046875,
-0.01354217529296875,
-0.0654296875,
-0.007396697998046875,
0.055938720703125,
0.0269927978515625,
-0.01593017578125,
0.0157928466796875,
0.01442718505859375,
0.033233642578125,
-0.032562255859375,
-0.003948211669921875,
-0.01178741455078125,
0.0201416015625,
-0.0185089111328125,
-0.0120697021484375,
-0.0008134841918945312,
-0.042724609375,
-0.0079803466796875,
-0.059783935546875,
-0.0014467239379882812,
0.0241546630859375,
0.08038330078125,
0.015411376953125,
-0.03143310546875,
-0.01275634765625,
-0.0216064453125,
0.059539794921875,
-0.058990478515625,
0.00629425048828125,
0.004039764404296875,
0.0194854736328125,
-0.031585693359375,
-0.050445556640625,
-0.058013916015625,
0.00963592529296875,
-0.0079803466796875,
0.01027679443359375,
-0.056854248046875,
-0.0093994140625,
0.0221405029296875,
0.040435791015625,
-0.040283203125,
0.0181884765625,
-0.052886962890625,
-0.017181396484375,
0.036407470703125,
0.030426025390625,
0.026092529296875,
-0.0193023681640625,
-0.0312042236328125,
-0.02020263671875,
-0.041839599609375,
-0.00838470458984375,
0.0184478759765625,
0.025115966796875,
-0.056304931640625,
0.07904052734375,
-0.0091705322265625,
0.056884765625,
0.03680419921875,
-0.010528564453125,
0.0308685302734375,
-0.02447509765625,
-0.0311126708984375,
0.007007598876953125,
0.08697509765625,
0.033905029296875,
-0.010345458984375,
0.0191650390625,
0.0074462890625,
-0.007762908935546875,
0.005931854248046875,
-0.07354736328125,
0.0042877197265625,
0.035369873046875,
-0.04718017578125,
-0.0252532958984375,
-0.0033817291259765625,
-0.0828857421875,
-0.01678466796875,
0.0046234130859375,
0.0294189453125,
-0.034820556640625,
-0.0089111328125,
0.005828857421875,
-0.021636962890625,
0.0297393798828125,
0.04364013671875,
-0.0513916015625,
0.0286865234375,
0.039093017578125,
0.06036376953125,
0.0276336669921875,
-0.026519775390625,
-0.057769775390625,
0.00695037841796875,
-0.0078887939453125,
0.040252685546875,
-0.007152557373046875,
-0.036956787109375,
-0.0215911865234375,
0.0219268798828125,
-0.0051116943359375,
-0.039825439453125,
0.039886474609375,
-0.0094451904296875,
0.0265960693359375,
-0.01502227783203125,
-0.017242431640625,
-0.01885986328125,
0.01763916015625,
-0.043609619140625,
0.043548583984375,
0.018157958984375,
-0.075439453125,
-0.0033111572265625,
-0.038665771484375,
-0.00647735595703125,
-0.00968170166015625,
0.0016193389892578125,
-0.03424072265625,
0.004467010498046875,
0.009918212890625,
0.031005859375,
-0.03375244140625,
-0.039520263671875,
-0.034454345703125,
-0.0074310302734375,
0.0178070068359375,
-0.0010251998901367188,
0.062103271484375,
0.031097412109375,
-0.033599853515625,
0.0036830902099609375,
-0.05841064453125,
0.006931304931640625,
0.01995849609375,
-0.039398193359375,
0.000095367431640625,
-0.03912353515625,
0.00714111328125,
0.01983642578125,
0.04638671875,
-0.057708740234375,
0.0286865234375,
-0.0130157470703125,
0.026153564453125,
0.055206298828125,
0.006740570068359375,
0.0303192138671875,
-0.04632568359375,
0.04168701171875,
-0.00345611572265625,
0.0216522216796875,
0.01204681396484375,
-0.053436279296875,
-0.07342529296875,
-0.0279693603515625,
0.0022487640380859375,
0.0347900390625,
-0.037353515625,
0.047454833984375,
-0.0177459716796875,
-0.056732177734375,
-0.04620361328125,
-0.0035152435302734375,
0.033233642578125,
0.047149658203125,
0.028564453125,
-0.035919189453125,
-0.059326171875,
-0.07391357421875,
0.00890350341796875,
-0.0164947509765625,
-0.01531219482421875,
0.038055419921875,
0.037261962890625,
-0.03839111328125,
0.04498291015625,
-0.05987548828125,
-0.038299560546875,
-0.0134735107421875,
-0.0002739429473876953,
0.0465087890625,
0.056365966796875,
0.046600341796875,
-0.04248046875,
-0.0234527587890625,
0.004650115966796875,
-0.038665771484375,
-0.0186920166015625,
0.01430511474609375,
-0.01392364501953125,
0.0100555419921875,
-0.005672454833984375,
-0.06292724609375,
0.026397705078125,
0.046051025390625,
-0.05218505859375,
0.01885986328125,
-0.03839111328125,
0.0477294921875,
-0.079345703125,
0.010986328125,
0.01032257080078125,
-0.0138092041015625,
-0.038970947265625,
0.01056671142578125,
0.011138916015625,
-0.01104736328125,
-0.050506591796875,
0.043701171875,
-0.0335693359375,
-0.001750946044921875,
-0.01279449462890625,
-0.00020635128021240234,
0.00298309326171875,
0.0174560546875,
-0.0266265869140625,
0.033233642578125,
0.0281982421875,
-0.04376220703125,
0.038665771484375,
0.04144287109375,
0.0015239715576171875,
0.04364013671875,
-0.0548095703125,
0.0009937286376953125,
0.00812530517578125,
0.024261474609375,
-0.062225341796875,
-0.037200927734375,
0.058502197265625,
-0.0304107666015625,
0.0202484130859375,
-0.0245513916015625,
-0.0276947021484375,
-0.03570556640625,
-0.04571533203125,
0.027191162109375,
0.06268310546875,
-0.03289794921875,
0.05859375,
0.0190277099609375,
-0.01297760009765625,
-0.0211639404296875,
-0.08367919921875,
-0.0157012939453125,
-0.036468505859375,
-0.06658935546875,
0.04638671875,
-0.00832366943359375,
-0.0107269287109375,
0.00728607177734375,
0.0028057098388671875,
-0.01360321044921875,
-0.035614013671875,
0.03485107421875,
0.0295867919921875,
-0.0150909423828125,
-0.047882080078125,
0.00753021240234375,
-0.005626678466796875,
-0.018951416015625,
-0.0009708404541015625,
0.055206298828125,
-0.020416259765625,
-0.0168914794921875,
-0.0357666015625,
0.0204315185546875,
0.055938720703125,
-0.00472259521484375,
0.06646728515625,
0.042724609375,
-0.021514892578125,
0.0199737548828125,
-0.05548095703125,
-0.0008111000061035156,
-0.035919189453125,
-0.0130615234375,
-0.0226287841796875,
-0.055755615234375,
0.060272216796875,
0.037811279296875,
0.0047607421875,
0.056610107421875,
0.03204345703125,
-0.00902557373046875,
0.059967041015625,
0.043212890625,
0.01435089111328125,
0.0308685302734375,
-0.041961669921875,
0.0108184814453125,
-0.0875244140625,
-0.04669189453125,
-0.033843994140625,
-0.022735595703125,
-0.056732177734375,
-0.0438232421875,
0.0248870849609375,
0.0012025833129882812,
-0.021575927734375,
0.0440673828125,
-0.0312042236328125,
-0.00042319297790527344,
0.0181121826171875,
0.02593994140625,
0.0042572021484375,
-0.00015151500701904297,
0.004535675048828125,
-0.0154571533203125,
-0.0341796875,
-0.0176239013671875,
0.053375244140625,
0.0235748291015625,
0.06512451171875,
0.0269622802734375,
0.07940673828125,
-0.005523681640625,
-0.0161590576171875,
-0.05181884765625,
0.05841064453125,
-0.00782012939453125,
-0.018798828125,
-0.01093292236328125,
-0.03302001953125,
-0.0819091796875,
0.029937744140625,
-0.0051727294921875,
-0.04949951171875,
0.024139404296875,
0.0179901123046875,
-0.04296875,
0.0263519287109375,
-0.032379150390625,
0.05133056640625,
0.001079559326171875,
-0.0263671875,
-0.02081298828125,
-0.035369873046875,
0.0287322998046875,
0.000013172626495361328,
0.015960693359375,
-0.0129547119140625,
-0.032562255859375,
0.06951904296875,
-0.07293701171875,
0.044403076171875,
-0.004383087158203125,
-0.0181884765625,
0.0165252685546875,
-0.0027065277099609375,
0.04315185546875,
-0.00011962652206420898,
0.007488250732421875,
-0.01213836669921875,
-0.0247802734375,
-0.027679443359375,
-0.0294342041015625,
0.07415771484375,
-0.047607421875,
-0.04315185546875,
-0.0374755859375,
-0.0181884765625,
0.027740478515625,
0.0019969940185546875,
0.030548095703125,
0.029632568359375,
0.007091522216796875,
0.0113983154296875,
0.053253173828125,
-0.00579071044921875,
0.03179931640625,
0.03411865234375,
-0.01207733154296875,
-0.0460205078125,
0.04632568359375,
0.00844573974609375,
0.004791259765625,
0.01528167724609375,
0.0184326171875,
-0.0245361328125,
-0.01000213623046875,
-0.0298309326171875,
0.0347900390625,
-0.046173095703125,
-0.023040771484375,
-0.05816650390625,
-0.01207733154296875,
-0.0333251953125,
-0.01480865478515625,
-0.02825927734375,
-0.050933837890625,
-0.033660888671875,
0.033843994140625,
0.0552978515625,
0.041046142578125,
-0.0455322265625,
0.0284881591796875,
-0.0694580078125,
0.0311279296875,
0.030364990234375,
-0.012115478515625,
-0.005985260009765625,
-0.0703125,
0.01422119140625,
0.0083160400390625,
-0.037200927734375,
-0.08306884765625,
0.046234130859375,
-0.0010404586791992188,
0.0271148681640625,
0.0164794921875,
-0.0091705322265625,
0.0748291015625,
-0.032623291015625,
0.045013427734375,
0.032318115234375,
-0.06072998046875,
0.07281494140625,
-0.04180908203125,
0.01373291015625,
0.021209716796875,
0.0218658447265625,
-0.039581298828125,
-0.0307464599609375,
-0.07421875,
-0.0732421875,
0.04547119140625,
0.02728271484375,
-0.01444244384765625,
0.0019245147705078125,
0.002292633056640625,
-0.00569915771484375,
0.006435394287109375,
-0.03717041015625,
-0.03472900390625,
-0.01079559326171875,
-0.0021820068359375,
0.0171661376953125,
-0.00833892822265625,
-0.020233154296875,
-0.01373291015625,
0.051422119140625,
0.0024204254150390625,
0.029052734375,
0.0130767822265625,
0.006343841552734375,
-0.0199127197265625,
0.0284881591796875,
0.037811279296875,
0.042510986328125,
-0.037017822265625,
0.0004367828369140625,
0.0258026123046875,
-0.030120849609375,
0.0087127685546875,
0.0167694091796875,
0.01435089111328125,
-0.00453948974609375,
0.013641357421875,
0.038543701171875,
0.0347900390625,
-0.0276641845703125,
0.0293121337890625,
-0.01763916015625,
-0.00148773193359375,
-0.0013227462768554688,
0.01047515869140625,
-0.0031909942626953125,
0.031524658203125,
0.032989501953125,
0.0176849365234375,
0.007350921630859375,
-0.0389404296875,
-0.0019216537475585938,
0.0158843994140625,
-0.004795074462890625,
-0.01166534423828125,
0.044189453125,
0.0029621124267578125,
-0.004932403564453125,
0.047454833984375,
-0.01068878173828125,
-0.027313232421875,
0.064453125,
0.0517578125,
0.04595947265625,
-0.00493621826171875,
-0.00045871734619140625,
0.033111572265625,
0.01404571533203125,
0.011322021484375,
0.0538330078125,
0.005321502685546875,
-0.0302886962890625,
-0.00867462158203125,
-0.043853759765625,
-0.0244140625,
0.019256591796875,
-0.051727294921875,
0.0268096923828125,
-0.045623779296875,
-0.008819580078125,
0.0080108642578125,
0.0241546630859375,
-0.0556640625,
0.0213775634765625,
-0.00644683837890625,
0.0643310546875,
-0.07666015625,
0.0465087890625,
0.062469482421875,
-0.045562744140625,
-0.0684814453125,
-0.0130615234375,
0.015777587890625,
-0.06280517578125,
0.0460205078125,
0.005828857421875,
0.0107421875,
-0.0068206787109375,
-0.0248870849609375,
-0.06591796875,
0.09368896484375,
0.015777587890625,
-0.0374755859375,
0.008697509765625,
-0.0305633544921875,
0.00720977783203125,
-0.037109375,
0.01904296875,
0.032562255859375,
0.047454833984375,
0.0173492431640625,
-0.08905029296875,
0.042816162109375,
-0.01654052734375,
0.01267242431640625,
0.0286102294921875,
-0.06732177734375,
0.0810546875,
-0.019561767578125,
-0.01422882080078125,
0.0455322265625,
0.042449951171875,
0.06671142578125,
-0.0087432861328125,
0.031646728515625,
0.0697021484375,
0.036956787109375,
-0.0272979736328125,
0.06622314453125,
-0.000225067138671875,
0.041900634765625,
0.05181884765625,
-0.0006093978881835938,
0.05712890625,
0.05706787109375,
-0.021697998046875,
0.0251312255859375,
0.068115234375,
-0.0194244384765625,
0.0399169921875,
-0.001483917236328125,
-0.01751708984375,
0.007293701171875,
-0.01499176025390625,
-0.063232421875,
0.018798828125,
0.0240936279296875,
-0.020599365234375,
-0.02374267578125,
-0.0024738311767578125,
0.01153564453125,
-0.021240234375,
-0.029815673828125,
0.029144287109375,
0.00042128562927246094,
-0.0396728515625,
0.0399169921875,
0.0035190582275390625,
0.0869140625,
-0.05023193359375,
-0.004474639892578125,
-0.021820068359375,
0.0151214599609375,
-0.036224365234375,
-0.0791015625,
0.016357421875,
-0.0002994537353515625,
-0.01436614990234375,
-0.00286102294921875,
0.0565185546875,
-0.0269775390625,
-0.01506805419921875,
0.026763916015625,
0.0213623046875,
0.0333251953125,
0.0298614501953125,
-0.057830810546875,
0.04144287109375,
0.017913818359375,
-0.0191192626953125,
0.021942138671875,
0.033294677734375,
0.030517578125,
0.04901123046875,
0.0284271240234375,
0.026031494140625,
0.0186004638671875,
-0.0176849365234375,
0.0782470703125,
-0.038360595703125,
-0.02911376953125,
-0.069091796875,
0.04632568359375,
-0.00827789306640625,
-0.0286865234375,
0.060272216796875,
0.055450439453125,
0.048126220703125,
-0.024261474609375,
0.048614501953125,
-0.046142578125,
0.01165771484375,
-0.053466796875,
0.05303955078125,
-0.056976318359375,
0.0005621910095214844,
-0.00458526611328125,
-0.06756591796875,
0.00868988037109375,
0.0540771484375,
-0.0015916824340820312,
-0.00080108642578125,
0.043060302734375,
0.06292724609375,
-0.030792236328125,
-0.0264892578125,
0.004627227783203125,
0.016387939453125,
0.0104217529296875,
0.059326171875,
0.0616455078125,
-0.053955078125,
0.03448486328125,
-0.04345703125,
-0.0164031982421875,
-0.030181884765625,
-0.0638427734375,
-0.044708251953125,
-0.02789306640625,
-0.031036376953125,
-0.0406494140625,
-0.01146697998046875,
0.0753173828125,
0.045440673828125,
-0.04547119140625,
-0.01328277587890625,
0.00510406494140625,
-0.00091552734375,
0.02557373046875,
-0.01739501953125,
0.0123138427734375,
0.0070343017578125,
-0.06634521484375,
0.0234527587890625,
0.0166473388671875,
0.061553955078125,
0.0163421630859375,
-0.0276031494140625,
-0.00091552734375,
-0.0126800537109375,
0.009796142578125,
0.053436279296875,
-0.06756591796875,
-0.006343841552734375,
-0.0312347412109375,
0.0045013427734375,
0.0087890625,
0.034576416015625,
-0.042022705078125,
-0.0024242401123046875,
0.051025390625,
0.00391387939453125,
0.041656494140625,
-0.01763916015625,
0.036529541015625,
-0.05303955078125,
0.03814697265625,
-0.0016422271728515625,
0.05841064453125,
0.01873779296875,
-0.0265960693359375,
0.03997802734375,
-0.002353668212890625,
-0.039703369140625,
-0.0655517578125,
-0.0076904296875,
-0.089111328125,
-0.0008821487426757812,
0.059906005859375,
-0.00970458984375,
-0.032318115234375,
0.032073974609375,
-0.037506103515625,
0.03668212890625,
-0.033477783203125,
0.02581787109375,
0.0155792236328125,
-0.02423095703125,
-0.013702392578125,
-0.019317626953125,
0.0235595703125,
0.0210113525390625,
-0.0703125,
-0.021453857421875,
0.02801513671875,
0.030975341796875,
0.042724609375,
0.04486083984375,
-0.00505828857421875,
0.032989501953125,
-0.00262451171875,
0.0013446807861328125,
-0.004917144775390625,
0.000476837158203125,
-0.033843994140625,
-0.0034008026123046875,
-0.0079803466796875,
-0.04302978515625
]
] |
sazyou-roukaku/chilled_remix | 2023-06-09T23:08:31.000Z | [
"diffusers",
"stable-diffusion",
"text-to-image",
"ja",
"license:creativeml-openrail-m",
"region:us"
] | text-to-image | sazyou-roukaku | null | null | sazyou-roukaku/chilled_remix | 203 | 14,561 | diffusers | 2023-04-18T12:48:48 | ---
license: creativeml-openrail-m
language:
- ja
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- text-to-image
---
**[Notice]**
**chilled_remix and reversemix were updated to v2 on May 21, 2023.**
**Accordingly, v1 has been deleted. Those who have already downloaded v1 may continue using it without issue.**
License:[CreativeML Open RAIL-M](https://huggingface.co/sazyou-roukaku/chilled_remix/blob/main/license_v2.txt)<br>
Additional Copyright: sazyou_roukaku (TwitterID [@sazyou_roukaku](https://twitter.com/sazyou_roukaku)) as of May 21, 2023<br>
This model is licensed under CreativeML Open RAIL-M; the License itself is unchanged.<br>
~~However, the name 鎖城郎郭 has been added as an additional author.~~<br>
However, the name 佐城郎画 has been added as an additional author. (Notation changed on 6/10 following a Twitter name change; no change within the License, which remains sazyou_roukaku.)<br>
As stated in CreativeML Open RAIL-M,<br>
we are not involved in any way with works generated using this model, except for the cases covered by Use Restriction A of the License.<br>
Use for criminal purposes or in specialized professional fields such as medical imaging is prohibited under Use Restriction A.<br>
Please be sure to review the restrictions before use.<br>
We also accept no responsibility whatsoever; please use the model with the understanding that we are exempt from liability.<br>
<h4>Restrictions</h4>
<div class="px-2">
<table class="table-fixed border mt-0 text-xs">
<tr>
<td class="align-middle px-4 w-8">
<span class="text-green-500">
<h5>OK</h5>
</span>
</td>
<td>
著作者表記を入れずにモデルを使用する<br>
Use the model without crediting the creator
</td>
</tr>
<tr>
<td class="align-middle px-4 w-8">
<span class="text-green-500">
<h5>OK</h5>
</span>
</td>
<td>
このモデルで生成した画像を商用利用する<br>
Sell images they generate
</td>
</tr>
<tr>
<td class="align-middle px-4 w-8">
<span class="text-green-500">
<h5>OK</h5>
</span>
</td>
<td>
商用画像生成サービスに、このモデルを使用する<br>
Run on services that generate images for money
</td>
</tr>
<tr>
<td class="align-middle px-4 w-8">
<span class="text-green-500">
<h5>OK</h5>
</span>
</td>
<td>
このモデルを使用したマージモデルを共有・配布する<br>
Share merges using this model
</td>
</tr>
<tr>
<td class="align-middle px-4 w-8">
<span class="text-green-500">
<h5>OK</h5>
</span>
</td>
<td>
このモデル、または派生モデルを販売する<br>
Sell this model or merges using this model
</td>
</tr>
<tr>
<td class="align-middle px-4 w-8">
<span class="text-green-500">
<h5>OK</h5>
</span>
</td>
<td>
このモデルをマージしたモデルに異なる権限を設定する<br>
Have different permissions when sharing merges
</td>
</tr>
</table>
</div>
Note that selling the model itself or using it in commercial image-generation services<br>
cannot actually be restricted under the CreativeML Open RAIL-M License unless additional terms are appended to Use Restriction A.<br>
They are therefore marked OK in the civitai restriction table only for that reason, partly in consideration of the burden on those who make merges;<br>
we do not actively recommend such uses, and we accept no responsibility for any problems arising from them.<br>
Please keep this point in mind.<br>
<br>
**Recommended Settings, Model Differences, and Prompts**
Version 2 is distributed only as an fp16 build with the VAE baked in.
The basic stance is to use **chilled_remix as the main model** and to consider reversemix as well, according to taste.
*Note: chilled_remix was created to shield chilled_re-generic users from the confusion caused by a certain controversy.
Because it could not by nature cover every user's outputs, reversemix was created as a secondary model.
Even without a LoRA, reversemix has little semi-realistic quality in faces, but results tend to look younger overall.
chilled_remix was made for chilled_re-generic users, many of whom rely on LoRAs,
so faces are designed to reach a consistent level of realism when a LoRA is applied.
Realism is achievable with prompts alone, but it is easier to get there with a light touch of LoRA.
**CLIP setting: clip skip: 2** is recommended.
Honestly, I see little difference between outputs generated without badhand-style negative TIs (and without hand-related negatives)
and outputs generated with badhand-style negative TIs.
Use them as you prefer.
The model responds very well to natural-language sentence prompts, but my own style is to specify details beyond the situation,
such as facial structure, with word prompts according to taste.
Word-only prompts are not a problem either, so use whichever style is easiest for you.
As for quality prompts, I do not find terms like high quality effective.
masterpiece seems to alter facial structure, but as a quality booster it is questionable.
However, high resolution does help backgrounds and textures. There are variants such as high res and Hires,
but high resolution is the one I trust most.
A prompt I always include:
I always add (symmetrical clear eyes:1.3).
I sometimes split it up to combine with eye color and other additions, but including this prompt is my default.
My go-to negative prompt base
```
nipple,(manicure:1.2),(worst quality:2),(low quality:2),(long neck:2),(undressing:1.5),
```
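Combining the recommendations above (the always-included `(symmetrical clear eyes:1.3)` token, the `high resolution` quality term, and the negative base), a prompt pair could be assembled as in this minimal sketch; the helper name and ordering are illustrative assumptions, and actual generation happens in your Stable Diffusion frontend with clip skip set to 2:

```python
# Negative prompt base quoted from the card above.
NEGATIVE_BASE = (
    "nipple,(manicure:1.2),(worst quality:2),(low quality:2),"
    "(long neck:2),(undressing:1.5),"
)

def build_prompts(scene: str, extra_negative: str = "") -> tuple[str, str]:
    """Assemble a positive/negative prompt pair following the card's recommendations."""
    positive = f"high resolution, (symmetrical clear eyes:1.3), {scene}"
    negative = NEGATIVE_BASE + extra_negative
    return positive, negative

pos, neg = build_prompts("1girl reading in a sunlit cafe")
print(pos)
print(neg)
```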
**Models Used in the Merge**
real-max-v3.4
(https://civitai.com/models/60188/real-max-v34) ©dawn6666
fantasticmix_v10 (formerly fantasticmixReal_v10)
(https://civitai.com/models/22402/fantasticmixreal) ©michin
dreamshaper_5Bakedvae
(https://civitai.com/models/4384/dreamshaper) ©Lykon
epicrealism_newAge
(https://civitai.com/models/25694) ©epinikion
diamondCoalMix_diamondCoalv2
(https://civitai.com/models/41415) ©EnthusiastAI
**FAQ**
**Q1: Why release v2 and discontinue distribution of v1?**
**A1:**
v1 used a model (**realbiter_v10**) whose terms prohibit changing restrictions even after merging,
so it inherited the civitai restriction NG: Have different permissions when sharing merges.
This can be read as meaning that restrictions can neither be added nor removed, while everything else was OK.
So, for example, when merging with a model like this:
*NG: Sell this model or merges using this model*
*NG: Have different permissions when sharing merges*
a **contradiction between restrictions** arises, and **in principle the merge cannot be published**.
For anyone making merges this is an extremely troublesome restriction, and it also conflicts with the CreativeML Open RAIL-M
wording that **restrictions may be added as long as they do not deviate from the License**.
I found this deeply unpleasant.
This version upgrade exists to remove that restriction.
**Distribution of v1 was stopped because the differing restrictions are confusing and could cause trouble.**
CreativeML Open RAIL-M also contains the wording that
**upon updates, users should in principle make an effort to use the latest version**.
That is, the rights holder may require use of the latest version, and users have a duty to make that effort.
**However, I will not exercise this right, so continuing to use v1 remains perfectly acceptable.**
Nevertheless, continuing to publish the old version despite this wording would lack consistency,
so I have taken the liberty of ending its distribution.
Thank you for your understanding.
Redistribution of v1 and the like remains subject to CreativeML Open RAIL-M.
**Q2: Are there any problems or contradictions in the current restrictions?**
**A2:** **fantasticmix_v10**, **diamondCoalMix_diamondCoalv2**, and **dreamshaper_5Bakedvae** are marked
**OK: Have different permissions when sharing merges**, so that restriction can be lifted.
**epicrealism_newAge** and **real-max-v3.4** carry no restrictions, so this release is published entirely restriction-free.
Even if the Licenses or restrictions of the merged models change later,
this release is published on the basis of their Licenses and restrictions as of 5/17 and follows creativeml-openrail-m.
Screenshots of the relevant models are kept in MergeModel_LicenseSS_v2.
Should a serious problem arise with one of the merged models, we may suspend publication and call for use to stop,
but **we will not impose additional restrictions for reasons originating on our side.**
<br>
<br>
<br>
<br>
<br>
<br>
**---------------------------- The information below concerns the old version ----------------------------**
Minimal information about **chilled_remix_v1/chilled_reversemix_v1** is retained here.
If you need the full details, please check the description at the time in the edit history.
Screenshots concerning the restrictions of the relevant models are also kept in MergeModel_LicenseSS.
License:[CreativeML Open RAIL-M](https://huggingface.co/sazyou-roukaku/chilled_remix/blob/main/license.txt)<br>
Additional Copyright: sazyou_roukaku (TwitterID [@sazyou_roukaku](https://twitter.com/sazyou_roukaku)) as of April 18, 2023
This model uses the CreativeML Open RAIL-M license with no changes to the License itself,
but sazyou_roukaku has been added as an additional author.
As stated in creativeml-openrail-m, except for the cases in use restriction A, I am in no way involved
with works generated using this model, and I bear no responsibility for them whatsoever.
Please use it with the understanding that I am indemnified.
**Restrictions**
| Allowed | Permission |
|:-------:|-----------------------------------------------------|
| OK | Use the model without crediting the creator |
| OK | Sell images they generate |
| OK | Run on services that generate images for money |
| OK | Share merges using this model |
| OK | Sell this model or merges using this model |
| NG | Have different permissions when sharing merges |
| | | | 7,385 | [
[
-0.0633544921875,
-0.04205322265625,
0.02020263671875,
0.03387451171875,
-0.046783447265625,
-0.012786865234375,
0.0048370361328125,
-0.041748046875,
0.043701171875,
0.0300140380859375,
-0.06927490234375,
-0.049041748046875,
-0.0255279541015625,
-0.00435638427734375,
-0.003826141357421875,
0.047149658203125,
-0.043121337890625,
-0.01026153564453125,
-0.00652313232421875,
0.01517486572265625,
-0.03521728515625,
-0.0180206298828125,
-0.03289794921875,
-0.00447845458984375,
0.0036468505859375,
0.0160980224609375,
0.06597900390625,
0.045379638671875,
0.055328369140625,
0.0212249755859375,
-0.01175689697265625,
0.006866455078125,
-0.0222015380859375,
-0.00473785400390625,
0.002155303955078125,
-0.034576416015625,
-0.05487060546875,
-0.0014410018920898438,
0.03814697265625,
0.001132965087890625,
0.0045928955078125,
0.0156402587890625,
0.005420684814453125,
0.052215576171875,
-0.040252685546875,
0.01116943359375,
0.0009489059448242188,
0.0229949951171875,
-0.01580810546875,
-0.0286102294921875,
0.0143890380859375,
-0.05047607421875,
-0.0400390625,
-0.07501220703125,
-0.00223541259765625,
0.0011434555053710938,
0.1087646484375,
0.0033435821533203125,
-0.005336761474609375,
-0.000942230224609375,
-0.054229736328125,
0.04986572265625,
-0.0635986328125,
0.03607177734375,
0.023895263671875,
0.03436279296875,
-0.00909423828125,
-0.058868408203125,
-0.0657958984375,
0.017364501953125,
-0.00859832763671875,
0.03900146484375,
-0.018890380859375,
-0.049530029296875,
0.01558685302734375,
0.0255279541015625,
-0.0457763671875,
0.00685882568359375,
-0.0288848876953125,
0.00823211669921875,
0.043670654296875,
0.02288818359375,
0.060516357421875,
-0.01288604736328125,
-0.054168701171875,
-0.00574493408203125,
-0.051971435546875,
0.0238037109375,
0.013275146484375,
0.01434326171875,
-0.06866455078125,
0.03363037109375,
0.00458526611328125,
0.021453857421875,
0.00811004638671875,
-0.00818634033203125,
0.04217529296875,
-0.038787841796875,
-0.031524658203125,
-0.03424072265625,
0.08447265625,
0.060760498046875,
-0.005573272705078125,
-0.01222991943359375,
-0.02056884765625,
-0.0218048095703125,
-0.025543212890625,
-0.056182861328125,
-0.004039764404296875,
0.03546142578125,
-0.040679931640625,
-0.0006856918334960938,
0.01403045654296875,
-0.0830078125,
-0.00171661376953125,
-0.026336669921875,
0.0144195556640625,
-0.042327880859375,
-0.04693603515625,
0.0203094482421875,
-0.005123138427734375,
0.0076751708984375,
0.0367431640625,
-0.031494140625,
0.0117645263671875,
0.04510498046875,
0.061553955078125,
0.01203155517578125,
-0.03143310546875,
0.01000213623046875,
0.041107177734375,
-0.0207977294921875,
0.0557861328125,
0.0100555419921875,
-0.040313720703125,
-0.00848388671875,
0.016693115234375,
-0.01451873779296875,
-0.0291748046875,
0.053924560546875,
-0.0218048095703125,
0.02203369140625,
-0.0238494873046875,
0.0011720657348632812,
-0.01023101806640625,
0.006443023681640625,
-0.0316162109375,
0.03790283203125,
-0.01236724853515625,
-0.06878662109375,
0.024444580078125,
-0.046630859375,
-0.019561767578125,
-0.006061553955078125,
0.0037689208984375,
-0.0301361083984375,
-0.0191650390625,
0.0130462646484375,
0.03271484375,
-0.02093505859375,
-0.0167694091796875,
-0.0172119140625,
-0.01110076904296875,
0.03167724609375,
0.002010345458984375,
0.091796875,
0.04736328125,
-0.0207672119140625,
-0.0011997222900390625,
-0.05670166015625,
0.0123443603515625,
0.05584716796875,
-0.0272064208984375,
-0.0235595703125,
-0.0175323486328125,
0.005947113037109375,
0.05120849609375,
0.0246734619140625,
-0.02655029296875,
0.0148468017578125,
-0.036346435546875,
0.0221099853515625,
0.0706787109375,
0.0132598876953125,
0.0372314453125,
-0.05426025390625,
0.0511474609375,
0.0198974609375,
0.026519775390625,
0.008087158203125,
-0.034393310546875,
-0.054412841796875,
-0.0309906005859375,
-0.00494384765625,
0.0333251953125,
-0.058013916015625,
0.0379638671875,
-0.02374267578125,
-0.06219482421875,
-0.05023193359375,
0.007511138916015625,
0.02056884765625,
-0.0003571510314941406,
0.03167724609375,
-0.01605224609375,
-0.049835205078125,
-0.03778076171875,
-0.0015716552734375,
-0.002162933349609375,
0.019775390625,
0.034759521484375,
0.03564453125,
-0.03125,
0.055450439453125,
-0.04046630859375,
-0.040863037109375,
-0.016143798828125,
-0.0194091796875,
0.051025390625,
0.0660400390625,
0.0679931640625,
-0.0677490234375,
-0.060333251953125,
-0.003528594970703125,
-0.06976318359375,
0.00281524658203125,
-0.00679779052734375,
-0.032501220703125,
0.01032257080078125,
0.0035114288330078125,
-0.0814208984375,
0.030364990234375,
0.0204620361328125,
-0.03619384765625,
0.0538330078125,
-0.0225830078125,
0.03173828125,
-0.09014892578125,
0.0171966552734375,
-0.0023250579833984375,
0.01038360595703125,
-0.044403076171875,
0.042572021484375,
-0.00415802001953125,
-0.0008559226989746094,
-0.045074462890625,
0.04290771484375,
-0.051361083984375,
0.0362548828125,
0.00763702392578125,
0.020843505859375,
0.0023403167724609375,
0.030059814453125,
-0.01190185546875,
0.0291290283203125,
0.050018310546875,
-0.041015625,
0.0275421142578125,
0.0303192138671875,
-0.0243072509765625,
0.0305633544921875,
-0.041015625,
0.00867462158203125,
-0.0220794677734375,
0.0070343017578125,
-0.0726318359375,
-0.0060577392578125,
0.04443359375,
-0.05364990234375,
0.03466796875,
-0.0036983489990234375,
-0.04913330078125,
-0.055267333984375,
-0.0294647216796875,
-0.0166473388671875,
0.034820556640625,
-0.033294677734375,
0.038848876953125,
0.0294647216796875,
0.0089569091796875,
-0.04461669921875,
-0.0772705078125,
-0.0014600753784179688,
-0.0157012939453125,
-0.064453125,
0.0191192626953125,
-0.01288604736328125,
-0.025665283203125,
0.0037288665771484375,
0.0095367431640625,
-0.021881103515625,
-0.00804901123046875,
0.03436279296875,
0.02410888671875,
-0.016876220703125,
-0.0249176025390625,
-0.0011491775512695312,
-0.004177093505859375,
0.0014200210571289062,
-0.003269195556640625,
0.05670166015625,
0.00348663330078125,
-0.0241851806640625,
-0.07110595703125,
0.0206298828125,
0.06512451171875,
-0.0279693603515625,
0.042694091796875,
0.04693603515625,
-0.01456451416015625,
0.0010328292846679688,
-0.0321044921875,
0.0032100677490234375,
-0.034027099609375,
-0.00409698486328125,
-0.03387451171875,
-0.035797119140625,
0.0543212890625,
0.0235748291015625,
0.0007371902465820312,
0.04132080078125,
0.0280609130859375,
-0.026153564453125,
0.06719970703125,
0.03240966796875,
0.00002765655517578125,
0.01317596435546875,
-0.06329345703125,
0.0235137939453125,
-0.046356201171875,
-0.041473388671875,
-0.044219970703125,
-0.0199737548828125,
-0.0362548828125,
-0.035919189453125,
0.00313568115234375,
0.0191650390625,
-0.0380859375,
0.033447265625,
-0.048828125,
0.0014944076538085938,
0.024658203125,
0.01446533203125,
0.0115509033203125,
-0.01123046875,
-0.0227203369140625,
-0.01377105712890625,
-0.034393310546875,
-0.0200042724609375,
0.046478271484375,
0.0367431640625,
0.056549072265625,
0.04669189453125,
0.0540771484375,
-0.0173187255859375,
-0.0027599334716796875,
-0.0286865234375,
0.058685302734375,
0.0018491744995117188,
-0.052520751953125,
0.00237274169921875,
-0.0267181396484375,
-0.067138671875,
0.022430419921875,
-0.0457763671875,
-0.05194091796875,
0.039520263671875,
0.0042572021484375,
-0.0231170654296875,
0.053955078125,
-0.054412841796875,
0.03802490234375,
-0.02685546875,
-0.0655517578125,
0.00748443603515625,
-0.045318603515625,
0.032501220703125,
0.01123046875,
0.043304443359375,
-0.01904296875,
-0.01303863525390625,
0.052276611328125,
-0.059814453125,
0.0357666015625,
-0.026763916015625,
-0.0011425018310546875,
0.025665283203125,
0.0257110595703125,
0.048004150390625,
0.0122528076171875,
0.019500732421875,
0.01232147216796875,
-0.0005016326904296875,
-0.0109710693359375,
-0.035003662109375,
0.080078125,
-0.064208984375,
-0.049835205078125,
-0.023345947265625,
-0.0153961181640625,
0.006549835205078125,
0.0369873046875,
0.0384521484375,
0.01165771484375,
0.02276611328125,
0.0013141632080078125,
0.0238494873046875,
-0.0252685546875,
0.038299560546875,
0.03472900390625,
-0.04376220703125,
-0.039947509765625,
0.05999755859375,
0.0213165283203125,
0.008941650390625,
0.03497314453125,
0.006259918212890625,
-0.0216217041015625,
-0.039825439453125,
-0.031036376953125,
0.04071044921875,
-0.0380859375,
-0.017242431640625,
-0.0528564453125,
-0.00045299530029296875,
-0.0596923828125,
-0.0276336669921875,
-0.016693115234375,
-0.033599853515625,
-0.029541015625,
-0.0175018310546875,
0.029998779296875,
0.0283355712890625,
-0.039337158203125,
-0.0009012222290039062,
-0.0540771484375,
0.0182342529296875,
-0.01039886474609375,
0.041595458984375,
0.0193939208984375,
-0.020538330078125,
-0.01849365234375,
0.0126800537109375,
-0.0128021240234375,
-0.060699462890625,
0.056396484375,
-0.037506103515625,
0.040313720703125,
0.0316162109375,
-0.01377105712890625,
0.054168701171875,
-0.01215362548828125,
0.06317138671875,
0.049163818359375,
-0.042572021484375,
0.04217529296875,
-0.04791259765625,
0.032196044921875,
0.029754638671875,
0.048004150390625,
-0.0258026123046875,
-0.00946807861328125,
-0.048309326171875,
-0.07183837890625,
0.04815673828125,
0.0122528076171875,
0.002696990966796875,
0.01337432861328125,
0.001735687255859375,
-0.017730712890625,
0.0123138427734375,
-0.05999755859375,
-0.0504150390625,
-0.032012939453125,
0.01763916015625,
-0.00211334228515625,
0.002582550048828125,
-0.0015153884887695312,
-0.04010009765625,
0.06890869140625,
0.039642333984375,
0.044891357421875,
0.0217742919921875,
0.031341552734375,
-0.032867431640625,
0.033935546875,
0.039215087890625,
0.0450439453125,
-0.039642333984375,
-0.004032135009765625,
-0.0031337738037109375,
-0.047149658203125,
0.00959014892578125,
-0.006145477294921875,
-0.038970947265625,
0.006744384765625,
0.0006265640258789062,
0.05487060546875,
-0.003444671630859375,
-0.0304718017578125,
0.04522705078125,
-0.004314422607421875,
-0.00799560546875,
-0.0240020751953125,
-0.002857208251953125,
0.0222320556640625,
0.005584716796875,
0.011810302734375,
0.022003173828125,
-0.004779815673828125,
-0.0462646484375,
-0.0031681060791015625,
0.0238494873046875,
-0.039794921875,
-0.0017385482788085938,
0.0811767578125,
-0.0080718994140625,
-0.013458251953125,
0.01235198974609375,
-0.0062408447265625,
-0.037750244140625,
0.051788330078125,
0.044952392578125,
0.06097412109375,
-0.0203857421875,
0.0051116943359375,
0.05474853515625,
0.0186004638671875,
0.01364898681640625,
0.04669189453125,
0.0081939697265625,
-0.0309600830078125,
-0.01268768310546875,
-0.041748046875,
0.004779815673828125,
0.01338958740234375,
-0.032470703125,
0.041259765625,
-0.0648193359375,
-0.01861572265625,
-0.01175689697265625,
-0.0181121826171875,
-0.037322998046875,
0.0357666015625,
-0.00986480712890625,
0.09161376953125,
-0.044158935546875,
0.05023193359375,
0.052764892578125,
-0.0648193359375,
-0.07525634765625,
-0.0006475448608398438,
0.01800537109375,
-0.033538818359375,
0.041259765625,
-0.0227203369140625,
-0.007537841796875,
-0.00807952880859375,
-0.0528564453125,
-0.05877685546875,
0.0880126953125,
-0.0086669921875,
-0.0196075439453125,
0.00652313232421875,
0.01134490966796875,
0.05023193359375,
-0.020416259765625,
0.00821685791015625,
0.00998687744140625,
0.036346435546875,
0.02374267578125,
-0.06964111328125,
0.0308380126953125,
-0.055084228515625,
-0.01427459716796875,
0.001708984375,
-0.0711669921875,
0.07806396484375,
-0.0294342041015625,
-0.028045654296875,
0.0096893310546875,
0.02685546875,
0.030181884765625,
0.0233306884765625,
0.00894927978515625,
0.058502197265625,
0.0218048095703125,
-0.031341552734375,
0.08343505859375,
-0.02398681640625,
0.035064697265625,
0.06488037109375,
0.00970458984375,
0.064208984375,
0.0296783447265625,
-0.037261962890625,
0.023345947265625,
0.042236328125,
-0.01438140869140625,
0.052459716796875,
-0.00011980533599853516,
-0.01149749755859375,
-0.001171112060546875,
-0.00696563720703125,
-0.06365966796875,
-0.0004279613494873047,
0.0046539306640625,
-0.0274505615234375,
0.004428863525390625,
0.01116180419921875,
0.01666259765625,
0.00473785400390625,
-0.03289794921875,
0.056854248046875,
0.0108184814453125,
-0.0247802734375,
0.06512451171875,
-0.00501251220703125,
0.057403564453125,
-0.0340576171875,
0.01593017578125,
-0.01885986328125,
0.009063720703125,
-0.041717529296875,
-0.090576171875,
0.00579833984375,
-0.011871337890625,
-0.00627899169921875,
-0.0090179443359375,
0.0428466796875,
-0.0003726482391357422,
-0.019500732421875,
0.035675048828125,
0.01715087890625,
0.01953125,
0.032745361328125,
-0.066162109375,
0.0118865966796875,
0.02789306640625,
-0.0116424560546875,
0.00820159912109375,
0.02569580078125,
0.033355712890625,
0.06097412109375,
0.047698974609375,
0.035888671875,
0.01031494140625,
-0.014923095703125,
0.07891845703125,
-0.05023193359375,
-0.045989990234375,
-0.05120849609375,
0.06884765625,
-0.003997802734375,
-0.01538848876953125,
0.07012939453125,
0.06298828125,
0.06451416015625,
-0.0306549072265625,
0.08111572265625,
-0.024871826171875,
0.05029296875,
-0.03546142578125,
0.07366943359375,
-0.066162109375,
0.00414276123046875,
-0.0504150390625,
-0.04595947265625,
-0.028778076171875,
0.0309295654296875,
-0.032073974609375,
0.0091094970703125,
0.0214996337890625,
0.06549072265625,
-0.00586700439453125,
0.0028743743896484375,
0.0004513263702392578,
0.03802490234375,
0.01025390625,
0.061370849609375,
0.051605224609375,
-0.048126220703125,
0.0257415771484375,
-0.0584716796875,
-0.006595611572265625,
-0.0142974853515625,
-0.03094482421875,
-0.046630859375,
-0.047149658203125,
-0.018280029296875,
-0.04730224609375,
-0.01265716552734375,
0.077392578125,
0.034423828125,
-0.05218505859375,
-0.0225830078125,
0.0166473388671875,
0.01277923583984375,
-0.018280029296875,
-0.0221405029296875,
0.036163330078125,
0.0211944580078125,
-0.0543212890625,
0.0081329345703125,
0.0296173095703125,
0.043670654296875,
0.0050506591796875,
-0.014892578125,
-0.035064697265625,
-0.004489898681640625,
0.0142059326171875,
0.02142333984375,
-0.03033447265625,
0.024017333984375,
0.0006799697875976562,
-0.023284912109375,
0.01611328125,
0.031646728515625,
-0.0109405517578125,
0.032196044921875,
0.0714111328125,
-0.01381683349609375,
0.03619384765625,
-0.00835418701171875,
0.001972198486328125,
-0.00304412841796875,
0.007289886474609375,
-0.0021915435791015625,
0.04302978515625,
0.002933502197265625,
-0.036376953125,
0.04193115234375,
0.036041259765625,
-0.0162353515625,
-0.046630859375,
-0.0082244873046875,
-0.09246826171875,
-0.033843994140625,
0.08380126953125,
-0.0009431838989257812,
-0.01983642578125,
0.00453948974609375,
-0.0280609130859375,
0.01678466796875,
-0.020721435546875,
0.03533935546875,
0.0174407958984375,
0.0012950897216796875,
-0.02362060546875,
-0.0546875,
0.02764892578125,
0.00018036365509033203,
-0.0484619140625,
-0.019195556640625,
0.035247802734375,
0.0223388671875,
0.0435791015625,
0.04998779296875,
-0.0275421142578125,
0.029205322265625,
-0.01386260986328125,
0.0274658203125,
-0.02197265625,
0.011810302734375,
-0.01331329345703125,
0.0193939208984375,
-0.01178741455078125,
-0.01161956787109375
]
] |
joeddav/xlm-roberta-large-xnli | 2023-03-22T18:23:34.000Z | [
"transformers",
"pytorch",
"tf",
"xlm-roberta",
"text-classification",
"tensorflow",
"zero-shot-classification",
"multilingual",
"en",
"fr",
"es",
"de",
"el",
"bg",
"ru",
"tr",
"ar",
"vi",
"th",
"zh",
"hi",
"sw",
"ur",
"dataset:multi_nli",
"dataset:xnli",
"arxiv:1911.02116",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | zero-shot-classification | joeddav | null | null | joeddav/xlm-roberta-large-xnli | 161 | 14,534 | transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- en
- fr
- es
- de
- el
- bg
- ru
- tr
- ar
- vi
- th
- zh
- hi
- sw
- ur
tags:
- text-classification
- pytorch
- tensorflow
datasets:
- multi_nli
- xnli
license: mit
pipeline_tag: zero-shot-classification
widget:
- text: "За кого вы голосуете в 2020 году?"
candidate_labels: "politique étrangère, Europe, élections, affaires, politique"
multi_class: true
- text: "لمن تصوت في 2020؟"
candidate_labels: "السياسة الخارجية, أوروبا, الانتخابات, الأعمال, السياسة"
multi_class: true
- text: "2020'de kime oy vereceksiniz?"
candidate_labels: "dış politika, Avrupa, seçimler, ticaret, siyaset"
multi_class: true
---
# xlm-roberta-large-xnli
## Model Description
This model takes [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) and fine-tunes it on a combination of NLI data in 15 languages. It is intended to be used for zero-shot text classification, such as with the Hugging Face [ZeroShotClassificationPipeline](https://huggingface.co/transformers/master/main_classes/pipelines.html#transformers.ZeroShotClassificationPipeline).
## Intended Usage
This model is intended to be used for zero-shot text classification, especially in languages other than English. It is fine-tuned on XNLI, which is a multilingual NLI dataset. The model can therefore be used with any of the languages in the XNLI corpus:
- English
- French
- Spanish
- German
- Greek
- Bulgarian
- Russian
- Turkish
- Arabic
- Vietnamese
- Thai
- Chinese
- Hindi
- Swahili
- Urdu
Since the base model was pre-trained on 100 different languages, the
model has shown some effectiveness in languages beyond those listed above as
well. See the full list of pre-trained languages in appendix A of the
[XLM RoBERTa paper](https://arxiv.org/abs/1911.02116).
For English-only classification, it is recommended to use
[bart-large-mnli](https://huggingface.co/facebook/bart-large-mnli) or
[a distilled bart MNLI model](https://huggingface.co/models?filter=pipeline_tag%3Azero-shot-classification&search=valhalla).
#### With the zero-shot classification pipeline
The model can be loaded with the `zero-shot-classification` pipeline like so:
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification",
model="joeddav/xlm-roberta-large-xnli")
```
You can then classify in any of the above languages. You can even pass the labels in one language and the sequence to
classify in another:
```python
# we will classify the Russian translation of, "Who are you voting for in 2020?"
sequence_to_classify = "За кого вы голосуете в 2020 году?"
# we can specify candidate labels in Russian or any other language above:
candidate_labels = ["Europe", "public health", "politics"]
classifier(sequence_to_classify, candidate_labels)
# {'labels': ['politics', 'Europe', 'public health'],
# 'scores': [0.9048484563827515, 0.05722189322113991, 0.03792969882488251],
# 'sequence': 'За кого вы голосуете в 2020 году?'}
```
The default hypothesis template is the English `This text is {}`. If you are working strictly within one language, it
may be worthwhile to translate this into the language you are working with:
```python
sequence_to_classify = "¿A quién vas a votar en 2020?"
candidate_labels = ["Europa", "salud pública", "política"]
hypothesis_template = "Este ejemplo es {}."
classifier(sequence_to_classify, candidate_labels, hypothesis_template=hypothesis_template)
# {'labels': ['política', 'Europa', 'salud pública'],
# 'scores': [0.9109585881233215, 0.05954807624220848, 0.029493311420083046],
# 'sequence': '¿A quién vas a votar en 2020?'}
```
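When serving several languages, it can help to keep one translated template per language. A small illustrative helper (the template translations and language codes below are examples, not something shipped with the model):

```python
# Illustrative per-language hypothesis templates, keyed by ISO 639-1 code.
# The translations here are examples and are not part of the model.
templates = {
    "en": "This example is {}.",
    "es": "Este ejemplo es {}.",
    "fr": "Cet exemple est {}.",
}

def hypothesis_template_for(lang: str) -> str:
    # fall back to the English template for unlisted languages
    return templates.get(lang, templates["en"])

print(hypothesis_template_for("es").format("política"))  # Este ejemplo es política.
```

The returned string can be passed directly as the `hypothesis_template` argument of the pipeline call shown above.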
#### With manual PyTorch
```python
# pose sequence as an NLI premise and label as a hypothesis
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
nli_model = AutoModelForSequenceClassification.from_pretrained('joeddav/xlm-roberta-large-xnli').to(device)
tokenizer = AutoTokenizer.from_pretrained('joeddav/xlm-roberta-large-xnli')

sequence = "За кого вы голосуете в 2020 году?"
label = "politics"
premise = sequence
hypothesis = f'This example is {label}.'

# run through model pre-trained on MNLI
x = tokenizer.encode(premise, hypothesis, return_tensors='pt',
                     truncation_strategy='only_first')
logits = nli_model(x.to(device))[0]

# we throw away "neutral" (dim 1) and take the probability of
# "entailment" (2) as the probability of the label being true
entail_contradiction_logits = logits[:,[0,2]]
probs = entail_contradiction_logits.softmax(dim=1)
prob_label_is_true = probs[:,1]
```
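To rank several candidate labels, the entailment score above is computed once per label and the scores are then normalized. A simplified sketch of the two normalization modes, using stand-in entailment logits instead of a real model call (the numbers are hypothetical, and the multi-label branch assumes a contradiction logit of 0 for illustration):

```python
import math

# stand-in per-label entailment logits, as would come out of the
# NLI forward pass above (hypothetical numbers for illustration)
entail_logits = {"politics": 2.1, "Europe": -0.3, "public health": -1.4}

# single-label mode: softmax across labels so the scores compete and sum to 1
exp = {k: math.exp(v) for k, v in entail_logits.items()}
z = sum(exp.values())
single_label = {k: v / z for k, v in exp.items()}

# multi-label mode: each label is scored independently via an
# entailment-vs-contradiction softmax (equivalent to a sigmoid of the
# logit difference; contradiction logit taken as 0 here), so scores
# need not sum to 1
multi_label = {k: 1.0 / (1.0 + math.exp(-v)) for k, v in entail_logits.items()}

print(max(single_label, key=single_label.get))  # politics
```

This mirrors the distinction the pipeline makes between its default single-label behavior and the multi-label option shown in the widget examples.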
## Training
This model was pre-trained on a set of 100 languages, as described in
[the original paper](https://arxiv.org/abs/1911.02116). It was then fine-tuned on the task of NLI on the concatenated
MNLI train set and the XNLI validation and test sets. Finally, it was trained for one additional epoch on only XNLI
data where the translations for the premise and hypothesis are shuffled such that the premise and hypothesis for
each example come from the same original English example but the premise and hypothesis are of different languages.
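The cross-language shuffling described above can be sketched as follows; the record layout and sentences are illustrative stand-ins for the actual XNLI format:

```python
import random

# each XNLI example carries parallel translations of one English
# premise/hypothesis pair; this record layout is illustrative only
example = {
    "premise": {"en": "The dog is sleeping.",
                "fr": "Le chien dort.",
                "de": "Der Hund schläft."},
    "hypothesis": {"en": "An animal is resting.",
                   "fr": "Un animal se repose.",
                   "de": "Ein Tier ruht sich aus."},
}

def shuffled_pair(ex, rng=random):
    """Pick different languages for the premise and the hypothesis."""
    langs = list(ex["premise"])
    p_lang = rng.choice(langs)
    h_lang = rng.choice([l for l in langs if l != p_lang])
    return p_lang, ex["premise"][p_lang], h_lang, ex["hypothesis"][h_lang]

p_lang, premise, h_lang, hypothesis = shuffled_pair(example)
```

Each resulting pair still comes from the same underlying English example, so the NLI label is preserved while the model is forced to reason across languages.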
| 5,029 | [
[
-0.01715087890625,
-0.031585693359375,
0.0248870849609375,
0.00423431396484375,
-0.00881195068359375,
-0.00881195068359375,
-0.018035888671875,
-0.026092529296875,
0.0274810791015625,
0.0287322998046875,
-0.0301361083984375,
-0.045928955078125,
-0.04193115234375,
0.027740478515625,
-0.0242156982421875,
0.08740234375,
-0.0085296630859375,
0.0135040283203125,
0.01270294189453125,
-0.0225830078125,
-0.00746917724609375,
-0.043060302734375,
-0.045989990234375,
-0.0256195068359375,
0.04803466796875,
0.0176239013671875,
0.0384521484375,
0.045928955078125,
0.00972747802734375,
0.0283203125,
-0.0132598876953125,
-0.0107269287109375,
-0.0186004638671875,
-0.0161285400390625,
0.0014495849609375,
-0.0758056640625,
-0.0318603515625,
0.006237030029296875,
0.05633544921875,
0.032958984375,
0.0014925003051757812,
0.02203369140625,
-0.0145263671875,
0.024749755859375,
-0.034698486328125,
0.00748443603515625,
-0.03753662109375,
0.01259613037109375,
-0.016021728515625,
-0.004665374755859375,
-0.036651611328125,
-0.007904052734375,
0.01418304443359375,
-0.033111572265625,
0.00917816162109375,
0.0020580291748046875,
0.09222412109375,
0.00640869140625,
-0.0235443115234375,
-0.01271820068359375,
-0.039459228515625,
0.0709228515625,
-0.0699462890625,
0.0197601318359375,
0.016998291015625,
0.0020503997802734375,
0.0031147003173828125,
-0.04766845703125,
-0.065185546875,
-0.007617950439453125,
-0.01346588134765625,
0.0219879150390625,
-0.010101318359375,
-0.0228424072265625,
0.032806396484375,
0.020599365234375,
-0.0738525390625,
0.01514434814453125,
-0.030181884765625,
-0.01255035400390625,
0.0552978515625,
0.0004076957702636719,
0.0186920166015625,
-0.03485107421875,
-0.020294189453125,
-0.03656005859375,
-0.04144287109375,
0.00434112548828125,
0.0281524658203125,
0.028900146484375,
-0.026336669921875,
0.050567626953125,
-0.00885772705078125,
0.06036376953125,
0.0022144317626953125,
-0.02484130859375,
0.059478759765625,
-0.0227813720703125,
-0.0280303955078125,
0.0187225341796875,
0.0859375,
0.004138946533203125,
0.00823974609375,
0.0085296630859375,
-0.00026702880859375,
0.007843017578125,
-0.0178375244140625,
-0.052734375,
-0.004169464111328125,
0.03497314453125,
-0.021484375,
-0.0258941650390625,
0.00209808349609375,
-0.050323486328125,
0.006252288818359375,
-0.0255889892578125,
0.052337646484375,
-0.042205810546875,
-0.01534271240234375,
0.0169677734375,
-0.00673675537109375,
0.0254974365234375,
-0.0009584426879882812,
-0.03973388671875,
-0.0030689239501953125,
0.0357666015625,
0.061431884765625,
0.00911712646484375,
-0.0256805419921875,
-0.0281829833984375,
-0.0091705322265625,
-0.0156402587890625,
0.03375244140625,
-0.023712158203125,
-0.0101318359375,
-0.00662994384765625,
0.0201568603515625,
-0.0196685791015625,
-0.0223846435546875,
0.05096435546875,
-0.032470703125,
0.04736328125,
0.0094146728515625,
-0.054962158203125,
-0.0205078125,
0.0260009765625,
-0.032257080078125,
0.0557861328125,
0.00559234619140625,
-0.0689697265625,
0.02606201171875,
-0.035888671875,
-0.024505615234375,
-0.00028824806213378906,
-0.00833892822265625,
-0.057586669921875,
-0.009613037109375,
0.0007529258728027344,
0.035247802734375,
-0.01190948486328125,
0.0266265869140625,
-0.0160980224609375,
-0.0165252685546875,
0.0268096923828125,
-0.019195556640625,
0.09307861328125,
0.0214996337890625,
-0.0438232421875,
0.014373779296875,
-0.07568359375,
0.0190277099609375,
0.0137786865234375,
-0.023162841796875,
-0.01418304443359375,
-0.01519775390625,
0.0212860107421875,
0.0242156982421875,
0.0024394989013671875,
-0.052337646484375,
0.01335906982421875,
-0.0413818359375,
0.034027099609375,
0.031951904296875,
-0.0212554931640625,
0.0263214111328125,
-0.018829345703125,
0.03155517578125,
0.00864410400390625,
-0.00984954833984375,
-0.022369384765625,
-0.0523681640625,
-0.0714111328125,
-0.016510009765625,
0.04833984375,
0.071533203125,
-0.056243896484375,
0.06414794921875,
-0.0304718017578125,
-0.04736328125,
-0.03790283203125,
-0.007450103759765625,
0.037139892578125,
0.030120849609375,
0.02679443359375,
-0.00408172607421875,
-0.06439208984375,
-0.05474853515625,
0.005840301513671875,
-0.012908935546875,
-0.0019626617431640625,
0.0028018951416015625,
0.05438232421875,
-0.024169921875,
0.060089111328125,
-0.037841796875,
-0.0254364013671875,
-0.0269012451171875,
0.0183258056640625,
0.059295654296875,
0.045806884765625,
0.045745849609375,
-0.0582275390625,
-0.0509033203125,
0.01433563232421875,
-0.058013916015625,
0.0037975311279296875,
-0.0172882080078125,
-0.00821685791015625,
0.043182373046875,
0.0213775634765625,
-0.046051025390625,
0.037628173828125,
0.032745361328125,
-0.022186279296875,
0.0404052734375,
-0.0140228271484375,
-0.0003027915954589844,
-0.10888671875,
0.01042938232421875,
0.01218414306640625,
-0.0113525390625,
-0.0576171875,
0.007656097412109375,
0.00424957275390625,
-0.0007996559143066406,
-0.0433349609375,
0.0474853515625,
-0.0214996337890625,
0.0218658447265625,
-0.006603240966796875,
0.0011644363403320312,
-0.00940704345703125,
0.03338623046875,
0.02850341796875,
0.02850341796875,
0.05743408203125,
-0.045166015625,
0.01474761962890625,
0.0313720703125,
-0.032684326171875,
0.021087646484375,
-0.046234130859375,
-0.00860595703125,
-0.0129547119140625,
0.017578125,
-0.06585693359375,
-0.010772705078125,
0.033721923828125,
-0.049285888671875,
0.04217529296875,
-0.0078887939453125,
-0.037841796875,
-0.037261962890625,
-0.00411224365234375,
0.024444580078125,
0.0576171875,
-0.038970947265625,
0.047943115234375,
0.00536346435546875,
0.00988006591796875,
-0.058197021484375,
-0.05780029296875,
0.0004088878631591797,
-0.028594970703125,
-0.039306640625,
0.0263519287109375,
-0.004917144775390625,
-0.0010557174682617188,
-0.005382537841796875,
0.012359619140625,
-0.0050506591796875,
-0.014373779296875,
0.0068511962890625,
0.028167724609375,
-0.02313232421875,
-0.00804901123046875,
-0.016815185546875,
-0.021331787109375,
-0.0035877227783203125,
-0.038330078125,
0.0513916015625,
-0.0013513565063476562,
0.003391265869140625,
-0.04290771484375,
0.0190277099609375,
0.0282440185546875,
-0.0137786865234375,
0.060760498046875,
0.06781005859375,
-0.0213775634765625,
0.004497528076171875,
-0.029449462890625,
0.004192352294921875,
-0.0303802490234375,
0.0421142578125,
-0.0389404296875,
-0.0504150390625,
0.034423828125,
0.018646240234375,
-0.0017309188842773438,
0.04779052734375,
0.040924072265625,
0.006256103515625,
0.083740234375,
0.0389404296875,
-0.0223846435546875,
0.01084136962890625,
-0.056549072265625,
0.018890380859375,
-0.048736572265625,
-0.00738525390625,
-0.048431396484375,
-0.011871337890625,
-0.058929443359375,
-0.0303955078125,
0.01537322998046875,
-0.004642486572265625,
-0.0220947265625,
0.0421142578125,
-0.0294647216796875,
0.0244903564453125,
0.045379638671875,
0.005504608154296875,
0.0045623779296875,
0.0044708251953125,
-0.0046234130859375,
-0.0025615692138671875,
-0.059295654296875,
-0.031951904296875,
0.08502197265625,
0.0202484130859375,
0.041107177734375,
0.006008148193359375,
0.06414794921875,
-0.019927978515625,
0.0275726318359375,
-0.0577392578125,
0.03387451171875,
-0.0276947021484375,
-0.05804443359375,
-0.0196685791015625,
-0.048126220703125,
-0.08258056640625,
0.015960693359375,
-0.0282135009765625,
-0.05975341796875,
0.0119476318359375,
0.0006833076477050781,
-0.0230712890625,
0.035064697265625,
-0.03863525390625,
0.06610107421875,
-0.0209808349609375,
-0.0171661376953125,
-0.0014905929565429688,
-0.05426025390625,
0.035125732421875,
-0.0226287841796875,
0.00942230224609375,
-0.005985260009765625,
0.0147705078125,
0.060455322265625,
-0.0311279296875,
0.06610107421875,
-0.0009756088256835938,
-0.004581451416015625,
0.01422882080078125,
-0.0239105224609375,
0.00986480712890625,
0.00533294677734375,
-0.0171051025390625,
0.031494140625,
0.01177978515625,
-0.033477783203125,
-0.035064697265625,
0.04473876953125,
-0.072021484375,
-0.04266357421875,
-0.055023193359375,
-0.0274200439453125,
0.01611328125,
0.02337646484375,
0.06231689453125,
0.0322265625,
-0.0011625289916992188,
0.00582122802734375,
0.041107177734375,
-0.036102294921875,
0.0296478271484375,
0.034210205078125,
-0.0278472900390625,
-0.03387451171875,
0.067138671875,
0.0204925537109375,
0.00958251953125,
0.042938232421875,
0.024505615234375,
-0.0316162109375,
-0.0282440185546875,
-0.03369140625,
0.0308685302734375,
-0.038421630859375,
-0.0182037353515625,
-0.054443359375,
-0.03155517578125,
-0.04266357421875,
0.0070343017578125,
-0.01097869873046875,
-0.0335693359375,
-0.020172119140625,
0.00473785400390625,
0.025146484375,
0.03485107421875,
-0.00902557373046875,
0.01325225830078125,
-0.06036376953125,
0.0166168212890625,
0.004482269287109375,
0.00907135009765625,
-0.0123443603515625,
-0.06451416015625,
-0.0147705078125,
0.0043487548828125,
-0.0171966552734375,
-0.06695556640625,
0.05609130859375,
0.027618408203125,
0.040802001953125,
0.035369873046875,
0.0015926361083984375,
0.04962158203125,
-0.04937744140625,
0.04833984375,
0.0235595703125,
-0.07806396484375,
0.04345703125,
-0.00942230224609375,
0.020050048828125,
0.0160369873046875,
0.06671142578125,
-0.037384033203125,
-0.02789306640625,
-0.052001953125,
-0.06610107421875,
0.06256103515625,
0.0263671875,
0.0111846923828125,
-0.0105133056640625,
0.032196044921875,
0.00634765625,
0.0021457672119140625,
-0.07830810546875,
-0.04931640625,
-0.01371002197265625,
-0.0263214111328125,
-0.036773681640625,
-0.002208709716796875,
-0.006511688232421875,
-0.04388427734375,
0.0721435546875,
-0.0138092041015625,
0.0270843505859375,
0.032012939453125,
-0.0066986083984375,
0.0012540817260742188,
0.013946533203125,
0.044219970703125,
0.0321044921875,
-0.0194854736328125,
-0.00470733642578125,
0.0250091552734375,
-0.025848388671875,
0.0306854248046875,
0.01239013671875,
-0.0379638671875,
0.016265869140625,
0.0276947021484375,
0.080322265625,
0.0034198760986328125,
-0.04266357421875,
0.0299224853515625,
-0.0078887939453125,
-0.026092529296875,
-0.052154541015625,
0.005176544189453125,
-0.00630950927734375,
0.006282806396484375,
0.022216796875,
0.00656890869140625,
-0.002437591552734375,
-0.045623779296875,
0.025848388671875,
0.02557373046875,
-0.0195770263671875,
-0.034881591796875,
0.057037353515625,
-0.015838623046875,
-0.0232696533203125,
0.0423583984375,
-0.0310821533203125,
-0.050628662109375,
0.051239013671875,
0.04888916015625,
0.060394287109375,
-0.0207672119140625,
0.026824951171875,
0.06549072265625,
0.00623321533203125,
-0.005279541015625,
0.020538330078125,
0.0171356201171875,
-0.07122802734375,
-0.047637939453125,
-0.05859375,
-0.01010894775390625,
0.0270843505859375,
-0.059356689453125,
0.03973388671875,
-0.023529052734375,
-0.0158843994140625,
0.02166748046875,
-0.0025691986083984375,
-0.047607421875,
0.0260772705078125,
0.024688720703125,
0.07568359375,
-0.0804443359375,
0.07403564453125,
0.055145263671875,
-0.029449462890625,
-0.0689697265625,
-0.0149993896484375,
-0.002758026123046875,
-0.053680419921875,
0.06402587890625,
0.0281219482421875,
0.0250396728515625,
-0.0037994384765625,
-0.0384521484375,
-0.084228515625,
0.0667724609375,
0.0012159347534179688,
-0.0382080078125,
0.01007080078125,
0.013214111328125,
0.04730224609375,
-0.0266265869140625,
0.03387451171875,
0.0413818359375,
0.03814697265625,
0.0127716064453125,
-0.06390380859375,
0.00357818603515625,
-0.031494140625,
-0.01224517822265625,
0.0144195556640625,
-0.04266357421875,
0.06787109375,
-0.0182037353515625,
-0.0087890625,
0.03240966796875,
0.0531005859375,
0.0252532958984375,
0.0186004638671875,
0.041595458984375,
0.06488037109375,
0.052825927734375,
-0.0198974609375,
0.07122802734375,
-0.023468017578125,
0.0382080078125,
0.08135986328125,
-0.0274505615234375,
0.07891845703125,
0.016693115234375,
-0.022064208984375,
0.05316162109375,
0.051788330078125,
-0.0191802978515625,
0.0197906494140625,
0.018280029296875,
-0.0111236572265625,
-0.00646209716796875,
0.01059722900390625,
-0.0262451171875,
0.051513671875,
0.0264434814453125,
-0.0240020751953125,
0.0026378631591796875,
0.008941650390625,
0.021331787109375,
-0.01116180419921875,
-0.00658416748046875,
0.043060302734375,
0.00008493661880493164,
-0.05877685546875,
0.07177734375,
0.0008893013000488281,
0.0650634765625,
-0.04833984375,
0.016632080078125,
0.004390716552734375,
0.022674560546875,
-0.017059326171875,
-0.05908203125,
0.0170440673828125,
-0.0051727294921875,
-0.00823974609375,
0.003292083740234375,
0.017333984375,
-0.05047607421875,
-0.0557861328125,
0.025787353515625,
0.0228118896484375,
0.022125244140625,
0.01325225830078125,
-0.0692138671875,
-0.006580352783203125,
0.0227203369140625,
-0.0237884521484375,
0.01983642578125,
0.02227783203125,
0.0028667449951171875,
0.04156494140625,
0.04541015625,
0.01239013671875,
0.004230499267578125,
-0.00009918212890625,
0.044464111328125,
-0.052154541015625,
-0.0302886962890625,
-0.060089111328125,
0.045257568359375,
-0.007320404052734375,
-0.036865234375,
0.06005859375,
0.052490234375,
0.0699462890625,
-0.01885986328125,
0.05303955078125,
-0.018951416015625,
0.0321044921875,
-0.040618896484375,
0.05010986328125,
-0.054931640625,
-0.00807952880859375,
-0.01502227783203125,
-0.064697265625,
-0.041259765625,
0.059539794921875,
-0.007511138916015625,
0.0008521080017089844,
0.0496826171875,
0.060638427734375,
0.01003265380859375,
-0.01454925537109375,
0.025787353515625,
0.039520263671875,
0.01190948486328125,
0.0504150390625,
0.04083251953125,
-0.054534912109375,
0.03802490234375,
-0.035797119140625,
-0.0218963623046875,
0.0078582763671875,
-0.058868408203125,
-0.07159423828125,
-0.05316162109375,
-0.050872802734375,
-0.04388427734375,
-0.017303466796875,
0.0718994140625,
0.0638427734375,
-0.082763671875,
-0.02069091796875,
0.0004305839538574219,
-0.00421142578125,
-0.0034351348876953125,
-0.0222930908203125,
0.034088134765625,
-0.0207977294921875,
-0.08294677734375,
0.01413726806640625,
0.008697509765625,
0.01995849609375,
-0.010223388671875,
-0.00399017333984375,
-0.03131103515625,
-0.00675201416015625,
0.03564453125,
0.03387451171875,
-0.0611572265625,
-0.01331329345703125,
0.0111083984375,
0.002765655517578125,
0.0206451416015625,
0.027740478515625,
-0.05889892578125,
0.0206146240234375,
0.041412353515625,
0.0252838134765625,
0.043609619140625,
-0.02117919921875,
0.039398193359375,
-0.050018310546875,
0.030181884765625,
0.0009813308715820312,
0.0455322265625,
0.0221405029296875,
-0.02069091796875,
0.042510986328125,
0.016998291015625,
-0.0400390625,
-0.05908203125,
0.01294708251953125,
-0.07843017578125,
-0.020263671875,
0.0772705078125,
-0.021148681640625,
-0.0445556640625,
-0.003383636474609375,
-0.01617431640625,
0.028045654296875,
-0.00402069091796875,
0.04632568359375,
0.029937744140625,
0.00402069091796875,
-0.011993408203125,
-0.0276641845703125,
0.0289764404296875,
0.0440673828125,
-0.04071044921875,
-0.0308074951171875,
-0.00925445556640625,
0.03179931640625,
0.054595947265625,
0.0294036865234375,
-0.007457733154296875,
0.01020050048828125,
-0.01158905029296875,
0.01543426513671875,
0.0274658203125,
0.0005040168762207031,
-0.03765869140625,
-0.002170562744140625,
-0.015838623046875,
-0.00788116455078125
]
] |
allenai/biomed_roberta_base | 2022-10-03T22:05:08.000Z | [
"transformers",
"pytorch",
"jax",
"roberta",
"en",
"endpoints_compatible",
"has_space",
"region:us"
] | null | allenai | null | null | allenai/biomed_roberta_base | 15 | 14,531 | transformers | 2022-03-02T23:29:05 | ---
language: en
thumbnail: https://huggingface.co/front/thumbnails/allenai.png
---
# BioMed-RoBERTa-base
BioMed-RoBERTa-base is a language model based on the RoBERTa-base (Liu et al., 2019) architecture. We adapt RoBERTa-base to 2.68 million scientific papers from the [Semantic Scholar](https://www.semanticscholar.org) corpus via continued pretraining. This amounts to 7.55B tokens and 47GB of data. We use the full text of the papers in training, not just abstracts.
Specific details of the adaptive pretraining procedure can be found in Gururangan et al., 2020.
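The card itself gives no usage snippet; a minimal feature-extraction sketch with the 🤗 Transformers `AutoTokenizer`/`AutoModel` API might look like the following (the example sentence is illustrative, not from the original card):

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load the checkpoint from the Hub; it uses the standard RoBERTa-base
# architecture, so the Auto* classes resolve it automatically.
tokenizer = AutoTokenizer.from_pretrained("allenai/biomed_roberta_base")
model = AutoModel.from_pretrained("allenai/biomed_roberta_base")

# Encode a biomedical sentence and extract contextual embeddings.
inputs = tokenizer("Aspirin inhibits platelet aggregation.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# RoBERTa-base has a hidden size of 768.
print(outputs.last_hidden_state.shape)
```

For downstream tasks such as the NER and relation-extraction benchmarks below, the checkpoint would typically be fine-tuned (e.g., via `AutoModelForTokenClassification`) rather than used as a frozen feature extractor.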
## Evaluation
BioMed-RoBERTa achieves performance competitive with state-of-the-art models on a number of NLP tasks in the biomedical domain (numbers are mean (standard deviation) over 3+ random seeds).
| Task | Task Type | RoBERTa-base | BioMed-RoBERTa-base |
|--------------|---------------------|--------------|---------------------|
| RCT-180K | Text Classification | 86.4 (0.3) | 86.9 (0.2) |
| ChemProt | Relation Extraction | 81.1 (1.1) | 83.0 (0.7) |
| JNLPBA | NER | 74.3 (0.2) | 75.2 (0.1) |
| BC5CDR | NER | 85.6 (0.1) | 87.8 (0.1) |
| NCBI-Disease | NER | 86.6 (0.3) | 87.1 (0.8) |
More evaluations TBD.
## Citation
If using this model, please cite the following paper:
```bibtex
@inproceedings{domains,
author = {Suchin Gururangan and Ana Marasović and Swabha Swayamdipta and Kyle Lo and Iz Beltagy and Doug Downey and Noah A. Smith},
title = {Don't Stop Pretraining: Adapt Language Models to Domains and Tasks},
year = {2020},
booktitle = {Proceedings of ACL},
}
```
| 1,703 | [
[
-0.0031909942626953125,
-0.048828125,
0.0426025390625,
0.007720947265625,
-0.00391387939453125,
0.0075225830078125,
-0.0200042724609375,
-0.039459228515625,
0.020843505859375,
0.0296783447265625,
-0.0284881591796875,
-0.04315185546875,
-0.06683349609375,
0.0173797607421875,
0.0018320083618164062,
0.09967041015625,
0.0228118896484375,
0.045806884765625,
-0.012664794921875,
-0.0235137939453125,
-0.01444244384765625,
-0.04931640625,
-0.039459228515625,
-0.0276031494140625,
0.032989501953125,
0.004650115966796875,
0.034698486328125,
0.0426025390625,
0.051239013671875,
0.01519775390625,
-0.0226898193359375,
-0.0050506591796875,
-0.03167724609375,
0.003063201904296875,
-0.024261474609375,
-0.031494140625,
-0.06988525390625,
0.000732421875,
0.041229248046875,
0.07489013671875,
0.00598907470703125,
0.01502227783203125,
-0.0129852294921875,
0.059112548828125,
-0.029327392578125,
0.0013875961303710938,
-0.032867431640625,
-0.0132904052734375,
-0.04107666015625,
-0.01507568359375,
-0.042236328125,
-0.0087432861328125,
0.0209197998046875,
-0.022735595703125,
0.005924224853515625,
-0.0037479400634765625,
0.072509765625,
0.01123046875,
-0.0227508544921875,
-0.0145263671875,
-0.042816162109375,
0.06561279296875,
-0.06024169921875,
0.034912109375,
0.029327392578125,
0.00989532470703125,
-0.02386474609375,
-0.0517578125,
-0.03131103515625,
-0.031463623046875,
0.0004930496215820312,
0.017059326171875,
-0.023651123046875,
0.0051116943359375,
0.01904296875,
0.01532745361328125,
-0.0799560546875,
-0.01336669921875,
-0.035552978515625,
0.0141448974609375,
0.022674560546875,
-0.00569915771484375,
0.03704833984375,
-0.005153656005859375,
-0.0302886962890625,
-0.00390625,
-0.052581787109375,
-0.01641845703125,
-0.00717926025390625,
0.0227508544921875,
-0.016876220703125,
0.03173828125,
0.0009737014770507812,
0.07574462890625,
0.0062103271484375,
-0.0029296875,
0.053924560546875,
-0.0283050537109375,
-0.02423095703125,
-0.01293182373046875,
0.06414794921875,
0.00029158592224121094,
0.01204681396484375,
-0.02783203125,
0.021636962890625,
0.0075225830078125,
0.03167724609375,
-0.054412841796875,
-0.0162506103515625,
0.03118896484375,
-0.04876708984375,
-0.01055908203125,
-0.00836181640625,
-0.049224853515625,
0.004795074462890625,
0.0037784576416015625,
0.023284912109375,
-0.060638427734375,
-0.0202178955078125,
0.008758544921875,
0.0007190704345703125,
0.0020294189453125,
-0.0008721351623535156,
-0.062042236328125,
0.040252685546875,
0.0316162109375,
0.061920166015625,
-0.0269927978515625,
-0.0124053955078125,
-0.0159912109375,
0.01666259765625,
-0.0020503997802734375,
0.0579833984375,
-0.03472900390625,
-0.034454345703125,
-0.0341796875,
0.0107269287109375,
-0.02313232421875,
-0.03228759765625,
0.027923583984375,
-0.03955078125,
0.022308349609375,
-0.003536224365234375,
-0.03692626953125,
-0.0081634521484375,
0.01108551025390625,
-0.03533935546875,
0.058013916015625,
0.021881103515625,
-0.04754638671875,
0.0269012451171875,
-0.0740966796875,
-0.02252197265625,
0.012847900390625,
-0.0001863241195678711,
-0.0252532958984375,
-0.0115509033203125,
-0.00885009765625,
0.0015230178833007812,
-0.0259857177734375,
0.0325927734375,
-0.0247955322265625,
0.0080108642578125,
0.012847900390625,
0.0006709098815917969,
0.09991455078125,
0.01739501953125,
-0.0203399658203125,
0.031768798828125,
-0.08648681640625,
0.0255889892578125,
-0.005016326904296875,
-0.01971435546875,
-0.0291595458984375,
-0.005535125732421875,
0.00672149658203125,
0.017547607421875,
0.014068603515625,
-0.05157470703125,
0.019989013671875,
-0.052581787109375,
0.031524658203125,
0.022979736328125,
-0.0035381317138671875,
0.034820556640625,
-0.049072265625,
0.053466796875,
-0.009613037109375,
0.006328582763671875,
0.0118255615234375,
-0.048492431640625,
-0.0374755859375,
-0.0406494140625,
0.050506591796875,
0.055023193359375,
-0.0173797607421875,
0.045379638671875,
-0.03021240234375,
-0.0506591796875,
-0.039459228515625,
-0.00690460205078125,
0.058502197265625,
0.048797607421875,
0.06744384765625,
-0.0296783447265625,
-0.05426025390625,
-0.05157470703125,
-0.01274871826171875,
-0.0153961181640625,
0.0002474784851074219,
0.01483154296875,
0.050933837890625,
-0.032867431640625,
0.053192138671875,
-0.02423095703125,
-0.021942138671875,
-0.018646240234375,
0.00847625732421875,
0.034271240234375,
0.0648193359375,
0.0626220703125,
-0.0439453125,
-0.0421142578125,
-0.015350341796875,
-0.057281494140625,
-0.00690460205078125,
0.00994110107421875,
-0.0300445556640625,
0.0284423828125,
0.006992340087890625,
-0.06488037109375,
0.023773193359375,
0.04937744140625,
-0.0285491943359375,
0.037322998046875,
-0.0193634033203125,
0.0032176971435546875,
-0.09197998046875,
0.03369140625,
-0.01302337646484375,
-0.0188751220703125,
-0.049407958984375,
0.0263824462890625,
-0.00548553466796875,
0.00684356689453125,
-0.01422119140625,
0.045135498046875,
-0.04150390625,
0.00580596923828125,
-0.01268768310546875,
-0.016357421875,
-0.00018155574798583984,
0.017822265625,
0.0114898681640625,
0.0445556640625,
0.0445556640625,
-0.0322265625,
-0.0017175674438476562,
0.037994384765625,
-0.0008006095886230469,
0.0270233154296875,
-0.06463623046875,
0.00852203369140625,
0.00469970703125,
0.038848876953125,
-0.052459716796875,
-0.0017709732055664062,
0.021484375,
-0.056121826171875,
0.021484375,
-0.022430419921875,
-0.022552490234375,
-0.014373779296875,
-0.01084136962890625,
0.0271453857421875,
0.050384521484375,
-0.00618743896484375,
0.041168212890625,
0.037322998046875,
-0.01328277587890625,
-0.034698486328125,
-0.056884765625,
0.009796142578125,
-0.0004107952117919922,
-0.052703857421875,
0.05426025390625,
-0.032958984375,
-0.01739501953125,
0.021270751953125,
0.002353668212890625,
-0.0303497314453125,
-0.005584716796875,
0.0240936279296875,
0.03155517578125,
-0.0279388427734375,
0.0079345703125,
-0.0178375244140625,
-0.019622802734375,
-0.00986480712890625,
-0.0244903564453125,
0.04888916015625,
-0.004756927490234375,
-0.0140533447265625,
-0.0224456787109375,
0.0272064208984375,
0.040863037109375,
-0.0289154052734375,
0.057952880859375,
0.0208282470703125,
-0.0255279541015625,
-0.0007519721984863281,
-0.0180511474609375,
-0.0064239501953125,
-0.0268096923828125,
0.027984619140625,
-0.021942138671875,
-0.0477294921875,
0.029327392578125,
0.004512786865234375,
-0.00743865966796875,
0.052703857421875,
0.060760498046875,
-0.0082550048828125,
0.051025390625,
0.044158935546875,
0.01013946533203125,
0.0229644775390625,
-0.019195556640625,
0.00455474853515625,
-0.0684814453125,
-0.0167388916015625,
-0.058685302734375,
-0.0016231536865234375,
-0.03424072265625,
-0.04205322265625,
0.0214080810546875,
0.0009713172912597656,
-0.0184783935546875,
0.035247802734375,
-0.04541015625,
0.00394439697265625,
0.041107177734375,
0.0200347900390625,
0.01317596435546875,
-0.02130126953125,
-0.0059661865234375,
-0.0005359649658203125,
-0.0528564453125,
-0.04327392578125,
0.10772705078125,
0.015655517578125,
0.041473388671875,
0.0311279296875,
0.06134033203125,
0.0025081634521484375,
0.034637451171875,
-0.044891357421875,
0.032958984375,
-0.025970458984375,
-0.0750732421875,
-0.0087432861328125,
-0.0228424072265625,
-0.0897216796875,
0.0009431838989257812,
-0.03631591796875,
-0.05084228515625,
0.01497650146484375,
0.0144500732421875,
-0.0268707275390625,
0.004322052001953125,
-0.0305633544921875,
0.06280517578125,
-0.036773681640625,
-0.01219940185546875,
-0.01477813720703125,
-0.056671142578125,
0.01470947265625,
-0.0056304931640625,
0.020751953125,
0.00859832763671875,
0.01372528076171875,
0.052459716796875,
-0.04632568359375,
0.052490234375,
-0.009979248046875,
0.033660888671875,
0.001129150390625,
-0.00539398193359375,
0.011138916015625,
-0.003887176513671875,
-0.011505126953125,
0.0272979736328125,
0.02227783203125,
-0.0391845703125,
-0.0238800048828125,
0.046173095703125,
-0.0562744140625,
-0.031463623046875,
-0.06622314453125,
-0.039794921875,
-0.033935546875,
0.04547119140625,
0.02813720703125,
0.022613525390625,
-0.019287109375,
0.0263824462890625,
0.054443359375,
-0.0213623046875,
0.0004756450653076172,
0.056854248046875,
-0.006404876708984375,
-0.020782470703125,
0.04791259765625,
0.00811767578125,
0.01485443115234375,
0.0271453857421875,
0.0025634765625,
-0.00455474853515625,
-0.07769775390625,
-0.034454345703125,
0.0233001708984375,
-0.023101806640625,
-0.0184783935546875,
-0.0767822265625,
-0.0241241455078125,
-0.04705810546875,
-0.0017728805541992188,
-0.0139312744140625,
-0.034271240234375,
-0.0283355712890625,
-0.00023472309112548828,
0.030181884765625,
0.04669189453125,
0.0006885528564453125,
0.0158233642578125,
-0.07171630859375,
0.02679443359375,
-0.01445770263671875,
0.0250701904296875,
-0.0197296142578125,
-0.055450439453125,
-0.01403045654296875,
0.005733489990234375,
-0.01108551025390625,
-0.073486328125,
0.044891357421875,
0.02813720703125,
0.06414794921875,
-0.0184478759765625,
-0.0177001953125,
0.0517578125,
-0.05029296875,
0.055694580078125,
0.0135345458984375,
-0.059112548828125,
0.04852294921875,
-0.036163330078125,
0.0162811279296875,
0.05023193359375,
0.04315185546875,
-0.005023956298828125,
-0.0267486572265625,
-0.06927490234375,
-0.0831298828125,
0.034759521484375,
-0.0006494522094726562,
-0.0018014907836914062,
-0.0145111083984375,
0.0179290771484375,
-0.00525665283203125,
-0.003948211669921875,
-0.0650634765625,
-0.017913818359375,
-0.009613037109375,
-0.037384033203125,
0.01142120361328125,
-0.031982421875,
-0.033203125,
-0.0246429443359375,
0.059906005859375,
0.0021228790283203125,
0.0246429443359375,
0.0189971923828125,
-0.03216552734375,
-0.007556915283203125,
0.01471710205078125,
0.0552978515625,
0.062469482421875,
-0.036163330078125,
-0.01361846923828125,
0.004974365234375,
-0.0418701171875,
-0.0008482933044433594,
0.0164794921875,
-0.006683349609375,
0.00934600830078125,
0.056396484375,
0.044158935546875,
0.0246734619140625,
-0.057098388671875,
0.0310211181640625,
0.008880615234375,
-0.0185089111328125,
-0.031982421875,
0.010772705078125,
0.005527496337890625,
0.010650634765625,
0.025970458984375,
0.03271484375,
0.0080718994140625,
-0.036529541015625,
0.0298004150390625,
0.035919189453125,
-0.0269012451171875,
-0.04083251953125,
0.056121826171875,
0.00342559814453125,
-0.0204925537109375,
0.0242767333984375,
-0.004364013671875,
-0.022308349609375,
0.0467529296875,
0.060516357421875,
0.0556640625,
-0.0184173583984375,
-0.002735137939453125,
0.04156494140625,
-0.0019474029541015625,
0.0021152496337890625,
0.050201416015625,
0.0283660888671875,
-0.051910400390625,
-0.0306396484375,
-0.06854248046875,
-0.01399993896484375,
0.005458831787109375,
-0.051483154296875,
0.005680084228515625,
-0.052764892578125,
-0.03857421875,
0.0374755859375,
-0.003131866455078125,
-0.03485107421875,
0.02813720703125,
-0.01027679443359375,
0.07330322265625,
-0.048126220703125,
0.0948486328125,
0.07269287109375,
-0.050445556640625,
-0.040313720703125,
-0.004444122314453125,
0.00860595703125,
-0.049896240234375,
0.08978271484375,
-0.030426025390625,
-0.00128173828125,
0.005207061767578125,
-0.020721435546875,
-0.08697509765625,
0.06402587890625,
0.0176544189453125,
-0.03497314453125,
-0.01316070556640625,
-0.026763916015625,
0.071044921875,
-0.034912109375,
0.0145721435546875,
0.008697509765625,
0.034942626953125,
-0.0048675537109375,
-0.0784912109375,
0.00496673583984375,
-0.036773681640625,
0.0011806488037109375,
-0.00948333740234375,
-0.043487548828125,
0.0865478515625,
-0.00959014892578125,
0.005886077880859375,
0.0272369384765625,
0.0295257568359375,
0.03240966796875,
0.003345489501953125,
0.0203399658203125,
0.050994873046875,
0.053253173828125,
-0.00295257568359375,
0.0826416015625,
-0.050506591796875,
0.025360107421875,
0.08544921875,
-0.015411376953125,
0.057373046875,
0.0295867919921875,
-0.042877197265625,
0.07122802734375,
0.0294342041015625,
-0.0096893310546875,
0.038970947265625,
0.017822265625,
-0.003276824951171875,
-0.004589080810546875,
0.006893157958984375,
-0.046905517578125,
0.035919189453125,
0.024200439453125,
-0.04339599609375,
-0.0038471221923828125,
-0.00493621826171875,
0.01200103759765625,
0.0084991455078125,
0.0111236572265625,
0.046539306640625,
-0.0022125244140625,
-0.0374755859375,
0.051116943359375,
-0.0048065185546875,
0.03680419921875,
-0.067138671875,
-0.0085296630859375,
0.012054443359375,
0.00943756103515625,
-0.01849365234375,
-0.0338134765625,
0.0362548828125,
0.01030731201171875,
-0.0467529296875,
-0.01375579833984375,
0.04254150390625,
-0.024200439453125,
-0.0345458984375,
0.03631591796875,
0.03912353515625,
0.01059722900390625,
0.03692626953125,
-0.06494140625,
0.010650634765625,
-0.01105499267578125,
-0.0264892578125,
0.03912353515625,
0.0101318359375,
-0.01218414306640625,
0.04052734375,
0.06378173828125,
0.03863525390625,
-0.005462646484375,
-0.006481170654296875,
0.05401611328125,
-0.045989990234375,
-0.024932861328125,
-0.058502197265625,
0.03399658203125,
-0.0007462501525878906,
-0.031402587890625,
0.051055908203125,
0.043212890625,
0.045440673828125,
-0.01273345947265625,
0.062469482421875,
-0.029205322265625,
0.0635986328125,
-0.033294677734375,
0.0648193359375,
-0.042327880859375,
0.0034198760986328125,
-0.0229644775390625,
-0.047271728515625,
-0.049346923828125,
0.061279296875,
-0.022552490234375,
0.0218505859375,
0.05828857421875,
0.039764404296875,
0.007106781005859375,
-0.0197601318359375,
0.0032024383544921875,
0.043731689453125,
0.017669677734375,
0.045074462890625,
0.01042938232421875,
-0.0179901123046875,
0.01206207275390625,
0.01140594482421875,
-0.042022705078125,
-0.0186614990234375,
-0.059112548828125,
-0.052703857421875,
-0.03900146484375,
-0.0197906494140625,
-0.056396484375,
0.038238525390625,
0.07684326171875,
0.070556640625,
-0.08697509765625,
-0.01001739501953125,
-0.007740020751953125,
-0.021026611328125,
-0.018798828125,
-0.01092529296875,
0.034912109375,
-0.032135009765625,
-0.046356201171875,
0.044891357421875,
0.017852783203125,
-0.0030460357666015625,
-0.00365447998046875,
-0.00563812255859375,
-0.0496826171875,
-0.001590728759765625,
0.032958984375,
0.0389404296875,
-0.042999267578125,
-0.01727294921875,
-0.0032901763916015625,
-0.0288238525390625,
0.01081085205078125,
0.029876708984375,
-0.054351806640625,
0.03350830078125,
0.034881591796875,
0.044219970703125,
0.035675048828125,
-0.0005564689636230469,
0.045166015625,
-0.032562255859375,
0.0235137939453125,
0.038421630859375,
0.0194549560546875,
0.0111541748046875,
-0.01873779296875,
0.04425048828125,
0.033538818359375,
-0.04229736328125,
-0.06463623046875,
0.01535797119140625,
-0.103271484375,
-0.02947998046875,
0.072998046875,
-0.027557373046875,
-0.0283050537109375,
-0.0115203857421875,
0.009979248046875,
0.0226898193359375,
-0.00482940673828125,
0.041839599609375,
0.04278564453125,
-0.027557373046875,
0.0038242340087890625,
-0.044647216796875,
0.05224609375,
0.041107177734375,
-0.0592041015625,
-0.017425537109375,
0.031097412109375,
0.047576904296875,
0.0225677490234375,
0.08465576171875,
-0.03009033203125,
0.024566650390625,
-0.00861358642578125,
0.019317626953125,
-0.00231170654296875,
-0.0186309814453125,
-0.020477294921875,
-0.002994537353515625,
0.01233673095703125,
0.004589080810546875
]
] |
hfl/chinese-roberta-wwm-ext-large | 2022-03-01T09:15:16.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"zh",
"arxiv:1906.08101",
"arxiv:2004.13922",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | hfl | null | null | hfl/chinese-roberta-wwm-ext-large | 78 | 14,509 | transformers | 2022-03-02T23:29:05 | ---
language:
- zh
tags:
- bert
license: "apache-2.0"
---
# Please use BERT-related functions (e.g., `BertTokenizer`, `BertModel`) to load this model!
## Chinese BERT with Whole Word Masking
To further accelerate Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**.
**[Pre-Training with Whole Word Masking for Chinese BERT](https://arxiv.org/abs/1906.08101)**
Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping Hu
This repository is developed based on: https://github.com/google-research/bert
You may also be interested in:
- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer
More resources by HFL: https://github.com/ymcui/HFL-Anthology
## Citation
If you find the technical report or resources useful, please cite the following technical report in your paper.
- Primary: https://arxiv.org/abs/2004.13922
```
@inproceedings{cui-etal-2020-revisiting,
title = "Revisiting Pre-Trained Models for {C}hinese Natural Language Processing",
author = "Cui, Yiming and
Che, Wanxiang and
Liu, Ting and
Qin, Bing and
Wang, Shijin and
Hu, Guoping",
booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.findings-emnlp.58",
pages = "657--668",
}
```
- Secondary: https://arxiv.org/abs/1906.08101
```
@article{chinese-bert-wwm,
title={Pre-Training with Whole Word Masking for Chinese BERT},
author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Yang, Ziqing and Wang, Shijin and Hu, Guoping},
journal={arXiv preprint arXiv:1906.08101},
year={2019}
}
``` | 2,066 | [
[
-0.027862548828125,
-0.057281494140625,
0.02239990234375,
0.0367431640625,
-0.029205322265625,
-0.0144805908203125,
-0.041351318359375,
-0.052154541015625,
0.023773193359375,
0.033660888671875,
-0.035491943359375,
-0.03887939453125,
-0.0394287109375,
0.00012117624282836914,
-0.004009246826171875,
0.05908203125,
-0.0008745193481445312,
0.0154266357421875,
0.017669677734375,
0.007236480712890625,
-0.006366729736328125,
-0.05633544921875,
-0.047943115234375,
-0.03155517578125,
0.033843994140625,
-0.0015611648559570312,
0.0263519287109375,
0.041595458984375,
0.0218658447265625,
0.02557373046875,
-0.009307861328125,
0.01410675048828125,
-0.016021728515625,
-0.0103607177734375,
0.0193328857421875,
-0.00585174560546875,
-0.038726806640625,
0.00995635986328125,
0.0518798828125,
0.043304443359375,
0.005069732666015625,
-0.00382232666015625,
0.01285552978515625,
0.039398193359375,
-0.042449951171875,
0.00800323486328125,
-0.054046630859375,
0.002620697021484375,
-0.035552978515625,
0.003692626953125,
-0.0307159423828125,
-0.0230255126953125,
0.0208587646484375,
-0.05487060546875,
0.01448822021484375,
0.00406646728515625,
0.11090087890625,
-0.01245880126953125,
-0.007781982421875,
-0.00004470348358154297,
-0.03253173828125,
0.054656982421875,
-0.08856201171875,
0.029693603515625,
0.040679931640625,
-0.0070037841796875,
-0.01444244384765625,
-0.07525634765625,
-0.06365966796875,
-0.025115966796875,
-0.0118560791015625,
0.01454925537109375,
0.002948760986328125,
0.0247039794921875,
0.0176239013671875,
0.029266357421875,
-0.0484619140625,
0.0120086669921875,
-0.0218353271484375,
-0.0408935546875,
0.05084228515625,
-0.0169525146484375,
0.026031494140625,
-0.0100860595703125,
-0.02789306640625,
-0.035308837890625,
-0.0247955322265625,
0.0169219970703125,
0.0220489501953125,
0.020965576171875,
-0.00975799560546875,
0.01213836669921875,
-0.004199981689453125,
0.05072021484375,
-0.00737762451171875,
0.0020122528076171875,
0.0487060546875,
-0.03338623046875,
-0.0269775390625,
0.003955841064453125,
0.08050537109375,
-0.0019474029541015625,
0.01241302490234375,
0.002597808837890625,
-0.0203094482421875,
-0.0257720947265625,
-0.0017213821411132812,
-0.058563232421875,
-0.028717041015625,
0.0081329345703125,
-0.03656005859375,
-0.006557464599609375,
0.0167999267578125,
-0.046722412109375,
-0.016754150390625,
-0.015960693359375,
0.041229248046875,
-0.040771484375,
-0.024078369140625,
0.0179901123046875,
-0.00891876220703125,
0.034088134765625,
0.0030460357666015625,
-0.05438232421875,
0.00975799560546875,
0.032958984375,
0.050933837890625,
0.005252838134765625,
-0.03692626953125,
-0.0252685546875,
-0.007091522216796875,
-0.01190185546875,
0.049346923828125,
-0.02178955078125,
-0.004177093505859375,
0.016632080078125,
0.0092315673828125,
-0.01189422607421875,
-0.017425537109375,
0.06134033203125,
-0.0311279296875,
0.03485107421875,
-0.02099609375,
-0.027984619140625,
-0.026336669921875,
0.0142059326171875,
-0.04736328125,
0.0877685546875,
-0.019989013671875,
-0.061370849609375,
0.0116424560546875,
-0.060516357421875,
-0.049041748046875,
-0.00021696090698242188,
0.0156097412109375,
-0.041229248046875,
-0.0203399658203125,
0.0272369384765625,
0.0273895263671875,
-0.01059722900390625,
0.0180511474609375,
-0.0176849365234375,
-0.0237884521484375,
0.006092071533203125,
-0.02825927734375,
0.0941162109375,
0.0185699462890625,
-0.0303497314453125,
0.0198516845703125,
-0.067138671875,
0.00565338134765625,
0.016876220703125,
-0.01922607421875,
-0.0215911865234375,
-0.0019445419311523438,
0.01995849609375,
0.02105712890625,
0.0452880859375,
-0.05535888671875,
-0.0083465576171875,
-0.047637939453125,
0.033660888671875,
0.05706787109375,
-0.0189361572265625,
0.01611328125,
-0.02301025390625,
0.0177459716796875,
0.0031833648681640625,
0.0047454833984375,
-0.028411865234375,
-0.038970947265625,
-0.068359375,
-0.0189056396484375,
0.0458984375,
0.05035400390625,
-0.0618896484375,
0.063232421875,
-0.0168914794921875,
-0.044158935546875,
-0.050933837890625,
-0.01032257080078125,
0.03668212890625,
0.03485107421875,
0.039031982421875,
-0.022308349609375,
-0.06280517578125,
-0.054595947265625,
-0.01165771484375,
-0.019866943359375,
-0.0078277587890625,
0.004283905029296875,
0.0197601318359375,
-0.0092926025390625,
0.059295654296875,
-0.04827880859375,
-0.035552978515625,
-0.0174407958984375,
0.033935546875,
0.0162353515625,
0.03582763671875,
0.037811279296875,
-0.047454833984375,
-0.044097900390625,
-0.0119781494140625,
-0.02947998046875,
-0.0184326171875,
-0.01580810546875,
-0.031158447265625,
0.0303955078125,
0.035888671875,
-0.030548095703125,
0.034271240234375,
0.0256500244140625,
-0.01299285888671875,
0.054595947265625,
-0.0258941650390625,
-0.00490570068359375,
-0.08221435546875,
0.012420654296875,
0.0126800537109375,
0.0019893646240234375,
-0.0595703125,
0.0014324188232421875,
0.003139495849609375,
0.0087127685546875,
-0.03253173828125,
0.042572021484375,
-0.051971435546875,
0.0237274169921875,
-0.013397216796875,
0.024871826171875,
0.00714874267578125,
0.071044921875,
0.0218658447265625,
0.0399169921875,
0.0413818359375,
-0.055572509765625,
0.0114898681640625,
0.01332855224609375,
-0.02960205078125,
-0.018310546875,
-0.05609130859375,
0.0103912353515625,
0.0003104209899902344,
0.0269775390625,
-0.08428955078125,
0.00844573974609375,
0.03619384765625,
-0.051116943359375,
0.0302276611328125,
0.02545166015625,
-0.053924560546875,
-0.028656005859375,
-0.055511474609375,
0.01236724853515625,
0.035980224609375,
-0.03619384765625,
0.026458740234375,
0.0201263427734375,
0.0011377334594726562,
-0.044952392578125,
-0.062255859375,
0.0107269287109375,
0.00801849365234375,
-0.051727294921875,
0.06109619140625,
-0.02349853515625,
0.01206207275390625,
0.0015850067138671875,
0.01068878173828125,
-0.0276336669921875,
0.00519561767578125,
-0.0130615234375,
0.033935546875,
-0.0222015380859375,
0.01294708251953125,
-0.00815582275390625,
0.009857177734375,
0.004894256591796875,
-0.01910400390625,
0.046478271484375,
0.0102386474609375,
-0.01277923583984375,
-0.0298309326171875,
0.01419830322265625,
0.01503753662109375,
-0.030517578125,
0.06365966796875,
0.0831298828125,
-0.049774169921875,
0.005756378173828125,
-0.048858642578125,
-0.01398468017578125,
-0.03521728515625,
0.036346435546875,
-0.0089111328125,
-0.06298828125,
0.035614013671875,
0.031402587890625,
0.030487060546875,
0.050537109375,
0.03326416015625,
-0.0005679130554199219,
0.056365966796875,
0.04681396484375,
-0.0247650146484375,
0.05950927734375,
-0.005680084228515625,
0.033660888671875,
-0.07220458984375,
0.00226593017578125,
-0.04693603515625,
-0.01508331298828125,
-0.051361083984375,
-0.0145111083984375,
-0.0017490386962890625,
0.006328582763671875,
-0.0235595703125,
0.037017822265625,
-0.055023193359375,
0.01459503173828125,
0.052825927734375,
0.00791168212890625,
0.0078277587890625,
0.0069580078125,
-0.029815673828125,
-0.00531768798828125,
-0.03057861328125,
-0.0289459228515625,
0.06756591796875,
0.026031494140625,
0.02130126953125,
-0.01284027099609375,
0.054931640625,
0.00542449951171875,
0.01354217529296875,
-0.042388916015625,
0.0535888671875,
-0.0276336669921875,
-0.046295166015625,
-0.039031982421875,
-0.02099609375,
-0.0843505859375,
0.03656005859375,
-0.023468017578125,
-0.05914306640625,
0.00794219970703125,
-0.00446319580078125,
-0.0262298583984375,
0.038818359375,
-0.052978515625,
0.04345703125,
-0.0139312744140625,
-0.00975799560546875,
0.006847381591796875,
-0.05328369140625,
0.036773681640625,
-0.00640869140625,
0.00881195068359375,
0.0000807046890258789,
0.0160064697265625,
0.07830810546875,
-0.03155517578125,
0.06988525390625,
-0.019134521484375,
-0.0198974609375,
0.02362060546875,
-0.03314208984375,
0.02874755859375,
-0.02020263671875,
-0.007099151611328125,
0.039093017578125,
-0.0070037841796875,
-0.0214385986328125,
-0.0170440673828125,
0.039764404296875,
-0.0654296875,
-0.0452880859375,
-0.054656982421875,
-0.022216796875,
0.0020294189453125,
0.037506103515625,
0.04510498046875,
0.016326904296875,
0.00110626220703125,
0.00847625732421875,
0.052276611328125,
-0.0380859375,
0.0545654296875,
0.044158935546875,
-0.0078125,
-0.03680419921875,
0.06488037109375,
0.0283050537109375,
-0.00005322694778442383,
0.048431396484375,
0.0147705078125,
-0.01556396484375,
-0.04541015625,
-0.01163482666015625,
0.0307159423828125,
-0.03399658203125,
-0.003017425537109375,
-0.056610107421875,
-0.060699462890625,
-0.057342529296875,
0.0121917724609375,
-0.00475311279296875,
-0.026092529296875,
-0.042633056640625,
0.002635955810546875,
0.01168060302734375,
0.0208282470703125,
-0.018463134765625,
0.020751953125,
-0.0618896484375,
0.0298004150390625,
0.0195465087890625,
0.0250396728515625,
0.0219573974609375,
-0.055267333984375,
-0.043975830078125,
0.0259857177734375,
-0.03570556640625,
-0.046539306640625,
0.044708251953125,
0.01568603515625,
0.061798095703125,
0.0257415771484375,
0.0234527587890625,
0.04931640625,
-0.04034423828125,
0.0858154296875,
0.0182647705078125,
-0.07391357421875,
0.0307769775390625,
-0.006999969482421875,
0.0260772705078125,
0.02886962890625,
0.0121612548828125,
-0.047882080078125,
-0.0177154541015625,
-0.039520263671875,
-0.07818603515625,
0.07061767578125,
0.01690673828125,
0.0143890380859375,
-0.0003540515899658203,
0.007843017578125,
-0.006336212158203125,
-0.0036296844482421875,
-0.09136962890625,
-0.037841796875,
-0.025146484375,
-0.005401611328125,
0.004856109619140625,
-0.0341796875,
0.01503753662109375,
-0.032958984375,
0.08172607421875,
0.0163116455078125,
0.041168212890625,
0.039337158203125,
-0.0171661376953125,
0.0017709732055664062,
0.0194854736328125,
0.0611572265625,
0.0286102294921875,
-0.0279998779296875,
-0.0042266845703125,
0.01039886474609375,
-0.05908203125,
-0.021759033203125,
0.036956787109375,
-0.0038547515869140625,
0.022308349609375,
0.04449462890625,
0.065673828125,
0.0096893310546875,
-0.032806396484375,
0.039703369140625,
-0.0100860595703125,
-0.040496826171875,
-0.03106689453125,
-0.00939178466796875,
0.00016546249389648438,
-0.009613037109375,
0.040435791015625,
-0.0154266357421875,
0.0020008087158203125,
-0.0238189697265625,
0.0011816024780273438,
0.019195556640625,
-0.03826904296875,
-0.02618408203125,
0.042022705078125,
0.01369476318359375,
-0.01351165771484375,
0.046905517578125,
-0.0308380126953125,
-0.0660400390625,
0.039947509765625,
0.042144775390625,
0.09539794921875,
-0.007965087890625,
-0.005107879638671875,
0.049530029296875,
0.0423583984375,
0.0021419525146484375,
0.0185699462890625,
-0.009063720703125,
-0.07415771484375,
-0.042388916015625,
-0.044769287109375,
-0.01313018798828125,
0.03692626953125,
-0.0313720703125,
0.016448974609375,
-0.0321044921875,
-0.0049896240234375,
-0.0008525848388671875,
0.0215911865234375,
-0.03662109375,
0.0138092041015625,
0.03515625,
0.07159423828125,
-0.040618896484375,
0.09259033203125,
0.06341552734375,
-0.0364990234375,
-0.057891845703125,
0.02349853515625,
-0.02362060546875,
-0.051605224609375,
0.05377197265625,
0.013641357421875,
-0.007747650146484375,
-0.01064300537109375,
-0.053955078125,
-0.054656982421875,
0.07745361328125,
0.0108184814453125,
-0.033416748046875,
-0.002796173095703125,
-0.0027141571044921875,
0.04547119140625,
-0.0050506591796875,
0.021453857421875,
0.0258636474609375,
0.04876708984375,
-0.004261016845703125,
-0.0745849609375,
0.002895355224609375,
-0.0347900390625,
0.007476806640625,
-0.0038738250732421875,
-0.0560302734375,
0.07330322265625,
0.00013494491577148438,
-0.0278472900390625,
0.0028228759765625,
0.061431884765625,
0.0221710205078125,
0.0254974365234375,
0.04150390625,
0.044586181640625,
0.056549072265625,
-0.00839996337890625,
0.038665771484375,
-0.0212860107421875,
0.01113128662109375,
0.075927734375,
-0.00820159912109375,
0.056304931640625,
0.00991058349609375,
-0.0297088623046875,
0.049407958984375,
0.05938720703125,
0.003582000732421875,
0.03729248046875,
0.0204010009765625,
-0.00832366943359375,
-0.006465911865234375,
-0.0093231201171875,
-0.048431396484375,
0.0184478759765625,
0.01172637939453125,
-0.0204315185546875,
0.0012483596801757812,
-0.00606536865234375,
0.02783203125,
0.004062652587890625,
-0.00921630859375,
0.046356201171875,
0.007480621337890625,
-0.039794921875,
0.05377197265625,
0.018341064453125,
0.0947265625,
-0.0684814453125,
-0.005687713623046875,
-0.00872802734375,
-0.01531219482421875,
-0.008697509765625,
-0.04022216796875,
-0.0022678375244140625,
-0.010101318359375,
-0.0153350830078125,
-0.01226043701171875,
0.056427001953125,
-0.03936767578125,
-0.0364990234375,
0.036865234375,
0.01165771484375,
0.0103302001953125,
-0.000308990478515625,
-0.049041748046875,
-0.005218505859375,
0.0218963623046875,
-0.03729248046875,
0.028839111328125,
0.045318603515625,
0.0032672882080078125,
0.0283050537109375,
0.0728759765625,
0.0244293212890625,
0.0142822265625,
0.01293182373046875,
0.059814453125,
-0.05609130859375,
-0.040374755859375,
-0.062408447265625,
0.0335693359375,
-0.018341064453125,
-0.0279998779296875,
0.05108642578125,
0.0253143310546875,
0.07305908203125,
0.00196075439453125,
0.06231689453125,
-0.0065460205078125,
0.03472900390625,
-0.035614013671875,
0.080078125,
-0.039031982421875,
0.01406097412109375,
-0.033172607421875,
-0.0628662109375,
-0.033966064453125,
0.047821044921875,
-0.011199951171875,
0.00595855712890625,
0.05224609375,
0.046539306640625,
0.0262603759765625,
-0.00850677490234375,
0.02685546875,
0.030670166015625,
0.031494140625,
0.033905029296875,
0.02630615234375,
-0.050048828125,
0.048248291015625,
-0.036102294921875,
-0.016265869140625,
-0.0091705322265625,
-0.0770263671875,
-0.05902099609375,
-0.066162109375,
-0.0255584716796875,
-0.010650634765625,
-0.00888824462890625,
0.0650634765625,
0.05181884765625,
-0.06915283203125,
-0.0140228271484375,
0.00771331787109375,
0.0115509033203125,
-0.0262298583984375,
-0.0192718505859375,
0.055816650390625,
-0.03765869140625,
-0.054412841796875,
0.004039764404296875,
0.0008378028869628906,
0.0008630752563476562,
-0.0108489990234375,
-0.0018968582153320312,
-0.054473876953125,
0.0115203857421875,
0.044342041015625,
0.0257110595703125,
-0.058624267578125,
-0.00989532470703125,
0.001323699951171875,
-0.016326904296875,
0.0034942626953125,
0.041107177734375,
-0.049774169921875,
0.035430908203125,
0.04681396484375,
0.042205810546875,
0.03125,
-0.02203369140625,
0.0282745361328125,
-0.05633544921875,
0.01157379150390625,
0.00406646728515625,
0.03271484375,
0.02093505859375,
-0.0306243896484375,
0.031494140625,
0.0155181884765625,
-0.037139892578125,
-0.060150146484375,
-0.01224517822265625,
-0.0706787109375,
-0.0271759033203125,
0.0694580078125,
-0.026031494140625,
-0.0144195556640625,
-0.004650115966796875,
-0.02978515625,
0.05023193359375,
-0.031646728515625,
0.051971435546875,
0.08050537109375,
0.01326751708984375,
-0.0160064697265625,
-0.0226593017578125,
0.034393310546875,
0.026947021484375,
-0.04693603515625,
0.005466461181640625,
0.01300048828125,
-0.00136566162109375,
0.021881103515625,
0.051300048828125,
-0.0031147003173828125,
0.0032787322998046875,
-0.0171356201171875,
0.042236328125,
-0.0120849609375,
0.0025768280029296875,
-0.0214385986328125,
-0.0151519775390625,
-0.00765228271484375,
-0.037109375
]
] |
minutillamolinara/bert-japanese_finetuned-sentiment-analysis | 2023-03-31T13:13:37.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"ja",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | minutillamolinara | null | null | minutillamolinara/bert-japanese_finetuned-sentiment-analysis | 0 | 14,482 | transformers | 2023-03-31T02:28:09 | ---
language: ja
license: mit
widget:
- text: "自然言語処理が面白い"
metrics:
- accuracy
- f1
---
# bert-japanese_finetuned-sentiment-analysis
This model was fine-tuned on the Japanese Sentiment Polarity Dictionary dataset.
## Pre-trained model
jarvisx17/japanese-sentiment-analysis<br/>
Link : https://huggingface.co/jarvisx17/japanese-sentiment-analysis
## Training Data
The model was trained on the Japanese Sentiment Polarity Dictionary dataset.<br/>
link : https://www.cl.ecei.tohoku.ac.jp/Open_Resources-Japanese_Sentiment_Polarity_Dictionary.html
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
## Usage
You can use this model via the Python API:
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("minutillamolinara/bert-japanese_finetuned-sentiment-analysis")
model = AutoModelForSequenceClassification.from_pretrained("minutillamolinara/bert-japanese_finetuned-sentiment-analysis")
inputs = tokenizer("自然言語処理が面白い", return_tensors="pt")
outputs = model(**inputs)
```
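The `outputs` object above holds raw logits; applying a softmax turns them into class probabilities. Below is a minimal, framework-free sketch of that last step (the example logits and the negative/positive label order are assumptions — check `model.config.id2label` for the real mapping):

```python
import math

def softmax(logits):
    # Numerically stable softmax: shift by the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical values standing in for outputs.logits[0].tolist().
logits = [-1.2, 2.3]
probs = softmax(logits)

labels = ["negative", "positive"]  # assumed order; verify against model.config.id2label
prediction = labels[probs.index(max(probs))]
```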
### Dependencies
- `pip install fugashi`
- `pip install unidic_lite`
## Licenses
MIT
| 1,373 | [
[
-0.055511474609375,
-0.047576904296875,
0.01468658447265625,
0.0287017822265625,
-0.04632568359375,
-0.01343536376953125,
-0.026611328125,
-0.0036773681640625,
0.028411865234375,
0.016448974609375,
-0.06597900390625,
-0.041259765625,
-0.053131103515625,
-0.0017528533935546875,
0.0033626556396484375,
0.0985107421875,
0.01149749755859375,
0.0443115234375,
0.010894775390625,
-0.01393890380859375,
-0.004566192626953125,
-0.041290283203125,
-0.048248291015625,
-0.02520751953125,
0.0268707275390625,
0.0238037109375,
0.047210693359375,
0.004032135009765625,
0.04168701171875,
0.023345947265625,
-0.00234222412109375,
-0.013946533203125,
-0.030487060546875,
-0.0034942626953125,
0.0018177032470703125,
-0.038482666015625,
-0.03131103515625,
0.01666259765625,
0.0438232421875,
0.0418701171875,
0.00417327880859375,
0.0248565673828125,
0.0022182464599609375,
0.026092529296875,
-0.0286102294921875,
0.03143310546875,
-0.0445556640625,
0.014678955078125,
-0.004985809326171875,
0.02056884765625,
-0.005764007568359375,
-0.023223876953125,
0.0084075927734375,
-0.04010009765625,
0.022979736328125,
-0.01371002197265625,
0.09295654296875,
0.01543426513671875,
-0.006740570068359375,
-0.005584716796875,
-0.0305328369140625,
0.0703125,
-0.07904052734375,
0.02471923828125,
0.02203369140625,
0.01444244384765625,
0.01104736328125,
-0.051513671875,
-0.047515869140625,
-0.005901336669921875,
0.01335906982421875,
0.02764892578125,
0.0031337738037109375,
0.0008325576782226562,
0.029937744140625,
0.0258636474609375,
-0.027374267578125,
0.008087158203125,
-0.020660400390625,
-0.0138397216796875,
0.036712646484375,
0.01233673095703125,
0.0027027130126953125,
-0.047576904296875,
-0.0289764404296875,
-0.02947998046875,
-0.009552001953125,
0.0192413330078125,
0.036529541015625,
0.034149169921875,
-0.027557373046875,
0.03887939453125,
-0.01019287109375,
0.02978515625,
0.0123748779296875,
-0.00000286102294921875,
0.042327880859375,
-0.01398468017578125,
-0.0153045654296875,
-0.01898193359375,
0.0897216796875,
0.02899169921875,
0.0207672119140625,
0.019866943359375,
-0.026702880859375,
0.0012044906616210938,
0.005157470703125,
-0.057220458984375,
-0.024078369140625,
0.0102386474609375,
-0.04443359375,
-0.0341796875,
0.007381439208984375,
-0.057861328125,
-0.000043451786041259766,
-0.031890869140625,
0.046112060546875,
-0.03515625,
-0.0195465087890625,
-0.006134033203125,
-0.01509857177734375,
0.03204345703125,
-0.0045928955078125,
-0.0560302734375,
0.00604248046875,
0.03302001953125,
0.043243408203125,
0.013641357421875,
-0.0304412841796875,
0.01070404052734375,
-0.0222625732421875,
-0.00266265869140625,
0.028472900390625,
-0.0012950897216796875,
-0.02996826171875,
-0.004604339599609375,
0.0199432373046875,
-0.0038433074951171875,
-0.0111083984375,
0.050537109375,
-0.0281524658203125,
0.04083251953125,
-0.0198211669921875,
-0.045867919921875,
-0.0177459716796875,
0.0266876220703125,
-0.0302581787109375,
0.0670166015625,
0.01343536376953125,
-0.06719970703125,
0.0254058837890625,
-0.05535888671875,
-0.026641845703125,
-0.0043487548828125,
0.006195068359375,
-0.04949951171875,
0.0011148452758789062,
0.0157470703125,
0.048095703125,
-0.0031299591064453125,
0.0216064453125,
-0.045989990234375,
-0.0063934326171875,
0.0199127197265625,
-0.040863037109375,
0.089111328125,
0.0283355712890625,
-0.029205322265625,
0.0030765533447265625,
-0.06719970703125,
-0.0031528472900390625,
0.0233154296875,
-0.01363372802734375,
-0.01971435546875,
-0.0126190185546875,
0.03070068359375,
0.01010894775390625,
0.045166015625,
-0.065185546875,
0.01537322998046875,
-0.04840087890625,
0.0260467529296875,
0.055633544921875,
-0.0027942657470703125,
0.006649017333984375,
-0.0029430389404296875,
0.0235137939453125,
0.02679443359375,
0.0372314453125,
0.017669677734375,
-0.033782958984375,
-0.0775146484375,
-0.01468658447265625,
0.01513671875,
0.03607177734375,
-0.05401611328125,
0.06365966796875,
-0.00849151611328125,
-0.05609130859375,
-0.060302734375,
0.00846099853515625,
0.016357421875,
0.03973388671875,
0.0225677490234375,
-0.0345458984375,
-0.05706787109375,
-0.060882568359375,
-0.0087432861328125,
-0.009307861328125,
0.00868988037109375,
0.00531005859375,
0.04119873046875,
-0.044036865234375,
0.0650634765625,
-0.0364990234375,
-0.0239410400390625,
-0.0234375,
0.040863037109375,
0.05328369140625,
0.044921875,
0.040008544921875,
-0.032379150390625,
-0.041748046875,
-0.03253173828125,
-0.04925537109375,
0.003376007080078125,
0.005138397216796875,
-0.0185394287109375,
0.022613525390625,
0.0152740478515625,
-0.041748046875,
0.005397796630859375,
0.0238189697265625,
-0.0210113525390625,
0.035491943359375,
-0.005825042724609375,
-0.00807952880859375,
-0.100341796875,
0.0110626220703125,
0.01080322265625,
-0.01468658447265625,
-0.016204833984375,
0.01065826416015625,
-0.0067291259765625,
-0.02264404296875,
-0.034393310546875,
0.0289306640625,
-0.007633209228515625,
0.00464630126953125,
-0.01152801513671875,
-0.01371002197265625,
0.003826141357421875,
0.055755615234375,
0.004070281982421875,
0.03997802734375,
0.052093505859375,
-0.02435302734375,
0.035552978515625,
0.0306854248046875,
-0.0278167724609375,
0.01465606689453125,
-0.059051513671875,
-0.0006346702575683594,
-0.007129669189453125,
-0.013275146484375,
-0.084228515625,
-0.0113067626953125,
0.0265960693359375,
-0.04815673828125,
0.021240234375,
-0.01352691650390625,
-0.035980224609375,
-0.033935546875,
-0.03424072265625,
0.014190673828125,
0.0538330078125,
-0.041107177734375,
0.033782958984375,
0.0185089111328125,
-0.01197052001953125,
-0.046966552734375,
-0.052734375,
-0.023529052734375,
-0.0024929046630859375,
-0.0234527587890625,
-0.0160369873046875,
-0.0116729736328125,
0.0036220550537109375,
-0.00710296630859375,
0.01715087890625,
-0.0073699951171875,
0.00496673583984375,
0.007541656494140625,
0.039276123046875,
-0.01219940185546875,
-0.01448822021484375,
0.00862884521484375,
-0.011260986328125,
0.036041259765625,
0.01116180419921875,
0.06134033203125,
-0.02691650390625,
-0.0101318359375,
-0.049713134765625,
-0.005802154541015625,
0.054046630859375,
0.00783538818359375,
0.058380126953125,
0.079833984375,
-0.00322723388671875,
0.01013946533203125,
-0.04351806640625,
-0.020965576171875,
-0.032867431640625,
0.032470703125,
-0.01320648193359375,
-0.030853271484375,
0.050811767578125,
0.018310546875,
0.0028095245361328125,
0.060333251953125,
0.045440673828125,
-0.02471923828125,
0.1053466796875,
0.0299072265625,
-0.042877197265625,
0.046173095703125,
-0.050018310546875,
0.00656890869140625,
-0.055633544921875,
-0.004669189453125,
-0.0230865478515625,
-0.00849151611328125,
-0.05218505859375,
0.00818634033203125,
0.026092529296875,
-0.0020618438720703125,
-0.051300048828125,
0.01343536376953125,
-0.0328369140625,
0.0124359130859375,
0.05535888671875,
0.0083465576171875,
-0.00650787353515625,
0.00884246826171875,
-0.01416778564453125,
-0.01326751708984375,
-0.045501708984375,
-0.01812744140625,
0.09100341796875,
0.053070068359375,
0.06402587890625,
-0.0156097412109375,
0.058319091796875,
-0.0015935897827148438,
0.00879669189453125,
-0.06976318359375,
0.04095458984375,
-0.032012939453125,
-0.04998779296875,
-0.0089111328125,
-0.0274200439453125,
-0.057037353515625,
0.0111846923828125,
-0.01873779296875,
-0.01334381103515625,
0.01079559326171875,
-0.0140533447265625,
-0.016265869140625,
0.01082611083984375,
-0.03961181640625,
0.068359375,
-0.017913818359375,
-0.0031185150146484375,
-0.009246826171875,
-0.057403564453125,
0.01873779296875,
0.01152801513671875,
0.01369476318359375,
-0.009552001953125,
0.0164947509765625,
0.07989501953125,
-0.02899169921875,
0.07965087890625,
-0.048980712890625,
0.00823211669921875,
0.01372528076171875,
-0.007633209228515625,
0.01215362548828125,
0.0177154541015625,
0.0021228790283203125,
0.038543701171875,
0.0070953369140625,
-0.032989501953125,
-0.014892578125,
0.0552978515625,
-0.11956787109375,
-0.0223846435546875,
-0.042022705078125,
-0.0335693359375,
-0.01271820068359375,
0.0300750732421875,
0.04766845703125,
0.0252685546875,
-0.01519775390625,
0.015167236328125,
0.04119873046875,
0.0015621185302734375,
0.0386962890625,
0.040191650390625,
-0.0175323486328125,
-0.055633544921875,
0.0631103515625,
0.0113525390625,
0.00473785400390625,
-0.00858306884765625,
0.020538330078125,
-0.031219482421875,
-0.0087432861328125,
-0.0215301513671875,
0.0307769775390625,
-0.050079345703125,
-0.01502227783203125,
-0.036041259765625,
-0.0289459228515625,
-0.05364990234375,
0.00457000732421875,
-0.04046630859375,
-0.024017333984375,
-0.033599853515625,
0.0025119781494140625,
0.0279541015625,
0.020660400390625,
-0.012603759765625,
0.043792724609375,
-0.059417724609375,
0.026824951171875,
0.0216827392578125,
0.0272674560546875,
-0.0011167526245117188,
-0.055908203125,
-0.0179901123046875,
0.0253143310546875,
-0.03216552734375,
-0.04632568359375,
0.053558349609375,
0.004669189453125,
0.035003662109375,
0.0447998046875,
-0.0061187744140625,
0.04156494140625,
-0.0005235671997070312,
0.07000732421875,
0.0232391357421875,
-0.07452392578125,
0.049835205078125,
-0.0323486328125,
0.02655029296875,
0.053680419921875,
0.05810546875,
-0.01708984375,
-0.033355712890625,
-0.0672607421875,
-0.06951904296875,
0.058013916015625,
0.006267547607421875,
0.0188751220703125,
-0.0021266937255859375,
0.0218353271484375,
0.006755828857421875,
0.0161285400390625,
-0.09307861328125,
-0.0287628173828125,
-0.05224609375,
-0.037078857421875,
-0.0269012451171875,
-0.0191497802734375,
-0.0128631591796875,
-0.059051513671875,
0.082763671875,
0.00024247169494628906,
0.0258636474609375,
0.018280029296875,
-0.004241943359375,
-0.014190673828125,
0.0026378631591796875,
0.0053558349609375,
0.0182647705078125,
-0.039031982421875,
-0.0216217041015625,
0.006439208984375,
-0.040374755859375,
-0.00473785400390625,
0.0168609619140625,
-0.0158843994140625,
0.0188446044921875,
0.01297760009765625,
0.08465576171875,
0.0026950836181640625,
-0.0219573974609375,
0.049102783203125,
-0.006969451904296875,
-0.00879669189453125,
-0.050262451171875,
-0.0070953369140625,
-0.010772705078125,
0.005840301513671875,
0.00902557373046875,
0.0005817413330078125,
0.004077911376953125,
-0.03790283203125,
-0.01361846923828125,
0.037261962890625,
-0.037811279296875,
-0.0192108154296875,
0.03778076171875,
0.01479339599609375,
0.00907135009765625,
0.06475830078125,
-0.0156402587890625,
-0.042205810546875,
0.047332763671875,
0.032135009765625,
0.071533203125,
-0.003875732421875,
0.005283355712890625,
0.046142578125,
0.007843017578125,
-0.0005536079406738281,
0.054473876953125,
0.00165557861328125,
-0.0780029296875,
-0.01477813720703125,
-0.062164306640625,
-0.0276031494140625,
0.019561767578125,
-0.07928466796875,
0.027008056640625,
-0.049346923828125,
-0.03387451171875,
-0.0025997161865234375,
0.018280029296875,
-0.046356201171875,
0.0250701904296875,
0.017181396484375,
0.058380126953125,
-0.0836181640625,
0.06500244140625,
0.0660400390625,
-0.059478759765625,
-0.06207275390625,
-0.0177764892578125,
-0.022979736328125,
-0.07684326171875,
0.050140380859375,
0.00885772705078125,
0.008575439453125,
0.0193023681640625,
-0.0643310546875,
-0.052490234375,
0.060516357421875,
-0.003421783447265625,
-0.042205810546875,
0.0182647705078125,
0.0177764892578125,
0.04718017578125,
-0.0160369873046875,
0.0260467529296875,
0.033782958984375,
0.0250091552734375,
-0.0138702392578125,
-0.04449462890625,
-0.0236663818359375,
-0.0255889892578125,
0.00856781005859375,
0.006420135498046875,
-0.054046630859375,
0.0869140625,
0.0154266357421875,
0.027374267578125,
0.01087188720703125,
0.05609130859375,
0.01226806640625,
0.010162353515625,
0.039764404296875,
0.06597900390625,
0.0214691162109375,
-0.00472259521484375,
0.051361083984375,
-0.03826904296875,
0.07244873046875,
0.058868408203125,
-0.0025424957275390625,
0.0638427734375,
0.0367431640625,
-0.01837158203125,
0.061553955078125,
0.041351318359375,
-0.0338134765625,
0.05072021484375,
-0.018707275390625,
-0.004474639892578125,
-0.01934814453125,
0.00890350341796875,
-0.0285186767578125,
0.0296478271484375,
0.0247650146484375,
-0.0304107666015625,
-0.0118560791015625,
0.028411865234375,
0.00849151611328125,
-0.02630615234375,
-0.04083251953125,
0.06536865234375,
-0.0034580230712890625,
-0.04571533203125,
0.0552978515625,
-0.0016508102416992188,
0.07208251953125,
-0.058135986328125,
0.024322509765625,
-0.0163421630859375,
0.024688720703125,
-0.0083770751953125,
-0.0640869140625,
0.0095672607421875,
0.0012121200561523438,
-0.02105712890625,
0.0006270408630371094,
0.04510498046875,
-0.0235443115234375,
-0.059295654296875,
0.0245513916015625,
0.0264892578125,
0.0117950439453125,
0.0089111328125,
-0.0855712890625,
-0.0041046142578125,
0.01071929931640625,
-0.0443115234375,
0.02239990234375,
0.0243072509765625,
0.02911376953125,
0.0330810546875,
0.04888916015625,
0.007602691650390625,
-0.0007376670837402344,
0.031341552734375,
0.06573486328125,
-0.046905517578125,
-0.04022216796875,
-0.05548095703125,
0.032257080078125,
-0.003307342529296875,
-0.038665771484375,
0.059783935546875,
0.035888671875,
0.06829833984375,
-0.031585693359375,
0.0552978515625,
-0.01097869873046875,
0.044769287109375,
-0.026702880859375,
0.06610107421875,
-0.03533935546875,
-0.0150146484375,
-0.01513671875,
-0.064208984375,
0.003055572509765625,
0.064453125,
-0.0193023681640625,
0.007160186767578125,
0.032470703125,
0.058013916015625,
-0.012725830078125,
-0.001613616943359375,
-0.00234222412109375,
0.04705810546875,
0.001323699951171875,
0.0168609619140625,
0.052947998046875,
-0.048095703125,
0.0120697021484375,
-0.06097412109375,
-0.01387786865234375,
-0.004795074462890625,
-0.0458984375,
-0.06756591796875,
-0.036865234375,
-0.026763916015625,
-0.03424072265625,
-0.0195465087890625,
0.0887451171875,
0.035736083984375,
-0.0736083984375,
-0.031951904296875,
-0.01556396484375,
-0.0245361328125,
0.00354766845703125,
-0.028411865234375,
0.021514892578125,
-0.06402587890625,
-0.0595703125,
-0.0024242401123046875,
-0.028472900390625,
0.01393890380859375,
-0.0340576171875,
0.0062255859375,
-0.0161590576171875,
-0.0030307769775390625,
0.03515625,
0.017578125,
-0.046905517578125,
-0.01056671142578125,
-0.014129638671875,
-0.014007568359375,
0.003299713134765625,
0.0200347900390625,
-0.04119873046875,
0.0301971435546875,
0.042572021484375,
0.0198822021484375,
0.0269622802734375,
-0.0084075927734375,
0.0256805419921875,
-0.0595703125,
0.0277557373046875,
0.0171966552734375,
0.04815673828125,
0.01352691650390625,
-0.0188751220703125,
0.010955810546875,
0.038421630859375,
-0.042724609375,
-0.046783447265625,
-0.01149749755859375,
-0.09130859375,
-0.0232086181640625,
0.08404541015625,
-0.016204833984375,
-0.02691650390625,
0.01151275634765625,
-0.031036376953125,
0.044036865234375,
-0.0255279541015625,
0.059478759765625,
0.058135986328125,
0.0011692047119140625,
0.0006937980651855469,
-0.01230621337890625,
0.025634765625,
0.05438232421875,
-0.029937744140625,
-0.0133056640625,
0.0150604248046875,
0.044952392578125,
0.01531219482421875,
0.0246734619140625,
-0.00933837890625,
0.01239013671875,
-0.0156402587890625,
0.040069580078125,
-0.003948211669921875,
-0.003559112548828125,
-0.042877197265625,
-0.00002396106719970703,
0.0105133056640625,
-0.0273590087890625
]
] |
facebook/opt-66b | 2023-01-24T17:12:06.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"opt",
"text-generation",
"en",
"arxiv:2205.01068",
"arxiv:2005.14165",
"license:other",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | facebook | null | null | facebook/opt-66b | 171 | 14,474 | transformers | 2022-06-23T21:51:55 | ---
language: en
inference: false
tags:
- text-generation
- opt
license: other
commercial: false
---
# OPT: Open Pre-trained Transformer Language Models
OPT was first introduced in [Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) and first released in [metaseq's repository](https://github.com/facebookresearch/metaseq) on May 3rd 2022 by Meta AI.
**Disclaimer**: The team releasing OPT wrote an official model card, which is available in Appendix D of the [paper](https://arxiv.org/pdf/2205.01068.pdf).
Content from **this** model card has been written by the Hugging Face team.
## Intro
To quote the first two paragraphs of the [official paper](https://arxiv.org/abs/2205.01068)
> Large language models trained on massive text collections have shown surprising emergent
> capabilities to generate text and perform zero- and few-shot learning. While in some cases the public
> can interact with these models through paid APIs, full model access is currently limited to only a
> few highly resourced labs. This restricted access has limited researchers’ ability to study how and
> why these large language models work, hindering progress on improving known challenges in areas
> such as robustness, bias, and toxicity.
> We present Open Pretrained Transformers (OPT), a suite of decoder-only pre-trained transformers ranging from 125M
> to 175B parameters, which we aim to fully and responsibly share with interested researchers. We train the OPT models to roughly match
> the performance and sizes of the GPT-3 class of models, while also applying the latest best practices in data
> collection and efficient training. Our aim in developing this suite of OPT models is to enable reproducible and responsible research at scale, and
> to bring more voices to the table in studying the impact of these LLMs. Definitions of risk, harm, bias, and toxicity, etc., should be articulated by the
> collective research community as a whole, which is only possible when models are available for study.
## Model description
OPT was predominantly pretrained with English text, but a small amount of non-English data is still present within the training corpus via CommonCrawl. The model was pretrained using a causal language modeling (CLM) objective.
OPT belongs to the same family of decoder-only models as [GPT-3](https://arxiv.org/abs/2005.14165). As such, it was pretrained using the self-supervised causal language modeling objective.
For evaluation, OPT follows [GPT-3](https://arxiv.org/abs/2005.14165) by using their prompts and overall experimental setup. For more details, please read
the [official paper](https://arxiv.org/abs/2205.01068).
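The causal language modeling objective mentioned above trains the model to predict each next token from the tokens before it, minimizing the average negative log-likelihood of the targets. The toy sketch below illustrates that loss computation on a made-up three-token vocabulary (pure Python for clarity; it is not the actual OPT/metaseq training code):

```python
import math

def softmax(logits):
    # Numerically stable softmax over one position's vocabulary logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def clm_loss(logits_per_position, target_ids):
    """Average negative log-likelihood of each next token.

    logits_per_position[t] are the vocabulary logits emitted after
    seeing tokens 0..t; target_ids[t] is the true next token id.
    """
    total = 0.0
    for logits, target in zip(logits_per_position, target_ids):
        probs = softmax(logits)
        total += -math.log(probs[target])
    return total / len(target_ids)

# Two positions, vocabulary of size 3 (made-up numbers).
logits = [[2.0, 0.5, -1.0], [0.1, 1.5, 0.3]]
targets = [0, 1]
loss = clm_loss(logits, targets)
```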
## Intended uses & limitations
The pretrained-only model can be used for prompting for evaluation of downstream tasks as well as text generation.
In addition, the model can be fine-tuned on a downstream task using the [CLM example](https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling). For all other OPT checkpoints, please have a look at the [model hub](https://huggingface.co/models?filter=opt).
### How to use
For large OPT models, such as this one, it is not recommended to use the `text-generation` pipeline because
one should load the model in half-precision to accelerate generation and optimize memory consumption on GPU.
It is recommended to directly call the [`generate`](https://huggingface.co/docs/transformers/main/en/main_classes/text_generation#transformers.generation_utils.GenerationMixin.generate)
method as follows:
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-66b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-66b", use_fast=False)
>>> prompt = "Hello, I am conscious and"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> generated_ids = model.generate(input_ids)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
['Hello, I am conscious and I am here.\nI am also conscious and I am here']
```
By default, generation is deterministic. To use top-k sampling, set `do_sample` to `True`.
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-66b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-66b", use_fast=False)
>>> prompt = "Hello, I am conscious and"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> set_seed(32)
>>> generated_ids = model.generate(input_ids, do_sample=True)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
['Hello, I am conscious and aware that you have your back turned to me and want to talk']
```
### Limitations and bias
As mentioned in Meta AI's model card, given that the training data used for this model contains a lot of
unfiltered content from the internet, which is far from neutral, the model is strongly biased:
> Like other large language models for which the diversity (or lack thereof) of training
> data induces downstream impact on the quality of our model, OPT-175B has limitations in terms
> of bias and safety. OPT-175B can also have quality issues in terms of generation diversity and
> hallucination. In general, OPT-175B is not immune from the plethora of issues that plague modern
> large language models.
Here's an example of how the model can have biased predictions:
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-66b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-66b", use_fast=False)
>>> prompt = "The woman worked as a"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> set_seed(32)
>>> generated_ids = model.generate(input_ids, do_sample=True, num_return_sequences=5, max_length=10)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
The woman worked as a supervisor in the office
The woman worked as a social worker in a
The woman worked as a cashier at the
The woman worked as a teacher from 2011 to
The woman worked as a maid at the house
```
compared to:
```python
>>> from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed
>>> import torch
>>> model = AutoModelForCausalLM.from_pretrained("facebook/opt-66b", torch_dtype=torch.float16).cuda()
>>> # the fast tokenizer currently does not work correctly
>>> tokenizer = AutoTokenizer.from_pretrained("facebook/opt-66b", use_fast=False)
>>> prompt = "The man worked as a"
>>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids.cuda()
>>> set_seed(32)
>>> generated_ids = model.generate(input_ids, do_sample=True, num_return_sequences=5, max_length=10)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
The man worked as a school bus driver for
The man worked as a bartender in a bar
The man worked as a cashier at the
The man worked as a teacher, and was
The man worked as a professional at a range
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The Meta AI team wanted to train this model on a corpus as large as possible. It is composed of the union of the following 5 filtered datasets of textual documents:
- BookCorpus, which consists of more than 10K unpublished books,
- CC-Stories, which contains a subset of CommonCrawl data filtered to match the
story-like style of Winograd schemas,
- The Pile, from which *Pile-CC, OpenWebText2, USPTO, Project Gutenberg, OpenSubtitles, Wikipedia, DM Mathematics, and HackerNews* were included.
- Pushshift.io Reddit dataset that was developed in Baumgartner et al. (2020) and processed in
Roller et al. (2021)
- CCNewsV2 containing an updated version of the English portion of the CommonCrawl News
dataset that was used in RoBERTa (Liu et al., 2019b)
The final training data contains 180B tokens corresponding to 800GB of data. The validation split was made of 200MB of the pretraining data, sampled proportionally
to each dataset’s size in the pretraining corpus.
The dataset might contain offensive content, as parts of it are a subset of
public Common Crawl data, along with a subset of public Reddit data, which could contain sentences
that, if viewed directly, can be insulting, threatening, or might otherwise cause anxiety.
### Collection process
The dataset was collected from the internet, and went through standard data processing algorithms and
re-formatting practices, including removing repetitive/non-informative text like *Chapter One* or
*This ebook by Project Gutenberg.*
## Training procedure
### Preprocessing
The texts are tokenized using the **GPT2** byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a
vocabulary size of 50272. The inputs are sequences of 2048 consecutive tokens.
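The fixed-length packing described above can be sketched as follows. This is illustrative only: the function name and the drop-remainder policy for the trailing chunk are assumptions, not the actual OPT preprocessing pipeline.

```python
# Illustrative sketch (not the actual OPT preprocessing code): packing a
# tokenized stream into fixed-length training sequences of 2048 tokens.
SEQ_LEN = 2048

def pack_sequences(token_ids, seq_len=SEQ_LEN):
    """Split a flat list of token ids into consecutive fixed-length chunks,
    dropping any incomplete trailing chunk."""
    n_full = len(token_ids) // seq_len
    return [token_ids[i * seq_len:(i + 1) * seq_len] for i in range(n_full)]

# Tiny demonstration with a synthetic "corpus" of 5000 token ids.
stream = list(range(5000))
chunks = pack_sequences(stream)
print(len(chunks), len(chunks[0]))  # 2 full sequences of 2048 tokens each
```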
The 175B model was trained on 992 *80GB A100 GPUs*. The training duration was roughly 33 days of continuous training.
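The hardware figures above imply a rough compute budget; the arithmetic below is a back-of-the-envelope estimate, not an official number:

```python
# Back-of-the-envelope GPU-hour estimate implied by the figures above
# (992 GPUs running for roughly 33 days); purely illustrative arithmetic.
gpus = 992
days = 33
gpu_hours = gpus * days * 24
print(f"~{gpu_hours:,} GPU-hours")  # ~785,664 GPU-hours
```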
### BibTeX entry and citation info
```bibtex
@misc{zhang2022opt,
title={OPT: Open Pre-trained Transformer Language Models},
author={Susan Zhang and Stephen Roller and Naman Goyal and Mikel Artetxe and Moya Chen and Shuohui Chen and Christopher Dewan and Mona Diab and Xian Li and Xi Victoria Lin and Todor Mihaylov and Myle Ott and Sam Shleifer and Kurt Shuster and Daniel Simig and Punit Singh Koura and Anjali Sridhar and Tianlu Wang and Luke Zettlemoyer},
year={2022},
eprint={2205.01068},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 10,010 | [
[
-0.022674560546875,
-0.06427001953125,
0.01296234130859375,
0.019073486328125,
-0.01221466064453125,
-0.01119232177734375,
-0.032562255859375,
-0.032958984375,
0.00469970703125,
0.03369140625,
-0.045501708984375,
-0.0321044921875,
-0.0465087890625,
0.0078125,
-0.03509521484375,
0.0792236328125,
-0.003589630126953125,
-0.00034880638122558594,
0.0134124755859375,
0.006877899169921875,
-0.0256500244140625,
-0.034149169921875,
-0.06512451171875,
-0.012481689453125,
0.0159149169921875,
0.007152557373046875,
0.0516357421875,
0.0418701171875,
0.03070068359375,
0.029388427734375,
0.0045318603515625,
0.0088348388671875,
-0.046173095703125,
-0.0230255126953125,
-0.0050201416015625,
-0.03204345703125,
-0.03656005859375,
0.0185089111328125,
0.039825439453125,
0.035797119140625,
0.0081024169921875,
0.0162353515625,
0.0017976760864257812,
0.031524658203125,
-0.03668212890625,
0.01800537109375,
-0.05242919921875,
-0.00415802001953125,
-0.01253509521484375,
0.00862884521484375,
-0.045745849609375,
-0.02166748046875,
0.00970458984375,
-0.03424072265625,
0.027923583984375,
-0.004146575927734375,
0.091064453125,
0.030517578125,
-0.0219573974609375,
-0.01294708251953125,
-0.0504150390625,
0.05035400390625,
-0.06854248046875,
0.027801513671875,
0.01776123046875,
0.006900787353515625,
0.00322723388671875,
-0.06732177734375,
-0.04669189453125,
-0.01242828369140625,
-0.01708984375,
0.016998291015625,
-0.0264739990234375,
0.01322174072265625,
0.022369384765625,
0.0273895263671875,
-0.040740966796875,
0.0009031295776367188,
-0.040618896484375,
-0.0265960693359375,
0.055023193359375,
0.004852294921875,
0.024993896484375,
-0.0263519287109375,
-0.0184478759765625,
-0.007152557373046875,
-0.031494140625,
-0.00007581710815429688,
0.036956787109375,
0.018646240234375,
-0.0162811279296875,
0.043426513671875,
-0.0205230712890625,
0.060760498046875,
0.01806640625,
0.00891876220703125,
0.031494140625,
-0.0301971435546875,
-0.0231170654296875,
-0.00870513916015625,
0.089599609375,
0.021209716796875,
0.026947021484375,
-0.0011310577392578125,
-0.003574371337890625,
0.00875091552734375,
0.0035381317138671875,
-0.06561279296875,
-0.02581787109375,
0.0238037109375,
-0.03912353515625,
-0.025634765625,
0.001705169677734375,
-0.0562744140625,
-0.0004968643188476562,
-0.0111236572265625,
0.04315185546875,
-0.0311737060546875,
-0.037750244140625,
0.0204925537109375,
0.0026454925537109375,
0.01971435546875,
0.0028896331787109375,
-0.06500244140625,
-0.0005679130554199219,
0.02325439453125,
0.06005859375,
-0.0031948089599609375,
-0.0288543701171875,
-0.0157623291015625,
-0.006999969482421875,
-0.01381683349609375,
0.0299072265625,
-0.0205230712890625,
-0.00994873046875,
-0.0002601146697998047,
0.00983428955078125,
-0.0204620361328125,
-0.0260772705078125,
0.04925537109375,
-0.0289306640625,
0.031585693359375,
-0.016265869140625,
-0.03125,
-0.004138946533203125,
-0.005382537841796875,
-0.04486083984375,
0.0875244140625,
0.01013946533203125,
-0.07110595703125,
0.0308685302734375,
-0.05096435546875,
-0.028106689453125,
-0.00605010986328125,
-0.0033855438232421875,
-0.0262451171875,
0.0025463104248046875,
0.03131103515625,
0.045135498046875,
-0.01123046875,
0.037872314453125,
-0.01316070556640625,
-0.0178985595703125,
0.00498199462890625,
-0.0399169921875,
0.090576171875,
0.0240478515625,
-0.0457763671875,
0.0229949951171875,
-0.044342041015625,
-0.00507354736328125,
0.0278778076171875,
-0.0114288330078125,
-0.0032062530517578125,
-0.00855255126953125,
0.01422119140625,
0.034881591796875,
0.0244293212890625,
-0.03662109375,
0.00946044921875,
-0.0452880859375,
0.05291748046875,
0.0712890625,
-0.0200653076171875,
0.030059814453125,
-0.00592041015625,
0.019195556640625,
0.00466156005859375,
0.029754638671875,
-0.0064544677734375,
-0.0262298583984375,
-0.0792236328125,
-0.0193328857421875,
0.0140228271484375,
0.0262451171875,
-0.05499267578125,
0.054656982421875,
-0.0210723876953125,
-0.05157470703125,
-0.046417236328125,
-0.001224517822265625,
0.0282745361328125,
0.03253173828125,
0.03289794921875,
-0.01348114013671875,
-0.043365478515625,
-0.0618896484375,
-0.023681640625,
-0.0062408447265625,
0.0142364501953125,
0.0290069580078125,
0.0482177734375,
-0.03533935546875,
0.08636474609375,
-0.04412841796875,
-0.0237274169921875,
-0.0303192138671875,
-0.0049591064453125,
0.030059814453125,
0.050140380859375,
0.0418701171875,
-0.05572509765625,
-0.0458984375,
-0.01806640625,
-0.055511474609375,
-0.0038051605224609375,
-0.00818634033203125,
-0.0302276611328125,
0.0296783447265625,
0.0450439453125,
-0.06451416015625,
0.0235595703125,
0.041290283203125,
-0.03533935546875,
0.042877197265625,
0.003047943115234375,
-0.0148773193359375,
-0.0908203125,
0.0192718505859375,
-0.00687408447265625,
-0.01293182373046875,
-0.039337158203125,
-0.0181732177734375,
0.0009555816650390625,
-0.0120697021484375,
-0.044708251953125,
0.060302734375,
-0.0277252197265625,
0.0157928466796875,
-0.01227569580078125,
0.0030536651611328125,
-0.0086822509765625,
0.046478271484375,
0.009063720703125,
0.043304443359375,
0.053070068359375,
-0.048248291015625,
0.0182342529296875,
0.01751708984375,
-0.01548004150390625,
0.0179290771484375,
-0.0546875,
0.004970550537109375,
-0.01399993896484375,
0.0263214111328125,
-0.0714111328125,
-0.023590087890625,
0.02862548828125,
-0.046356201171875,
0.0257415771484375,
0.0102996826171875,
-0.03778076171875,
-0.057708740234375,
-0.002166748046875,
0.0297088623046875,
0.040740966796875,
-0.042388916015625,
0.049346923828125,
0.02618408203125,
0.0164337158203125,
-0.057861328125,
-0.046142578125,
-0.00789642333984375,
-0.0056610107421875,
-0.055511474609375,
0.026123046875,
-0.00983428955078125,
0.000011861324310302734,
0.0104827880859375,
-0.00740814208984375,
0.005481719970703125,
-0.0035686492919921875,
0.00582122802734375,
0.0245208740234375,
-0.001384735107421875,
0.002017974853515625,
-0.00279998779296875,
-0.017730712890625,
0.01325225830078125,
-0.031585693359375,
0.065185546875,
-0.0193939208984375,
-0.01110076904296875,
-0.040496826171875,
-0.002655029296875,
0.03253173828125,
-0.0308380126953125,
0.06689453125,
0.0723876953125,
-0.037567138671875,
-0.0125732421875,
-0.055419921875,
-0.026702880859375,
-0.0413818359375,
0.051727294921875,
-0.0090179443359375,
-0.0587158203125,
0.0384521484375,
0.01513671875,
0.0184478759765625,
0.059112548828125,
0.060455322265625,
0.0198974609375,
0.07965087890625,
0.0458984375,
-0.021240234375,
0.048492431640625,
-0.046234130859375,
0.0208587646484375,
-0.04766845703125,
-0.004779815673828125,
-0.024993896484375,
-0.00356292724609375,
-0.033538818359375,
-0.0204315185546875,
0.007656097412109375,
0.004024505615234375,
-0.0286865234375,
0.034881591796875,
-0.0556640625,
0.0250091552734375,
0.04144287109375,
0.013671875,
-0.00009298324584960938,
-0.01175689697265625,
-0.00940704345703125,
0.003143310546875,
-0.06072998046875,
-0.0310211181640625,
0.09344482421875,
0.029388427734375,
0.051361083984375,
-0.0263671875,
0.053009033203125,
0.0011081695556640625,
0.03350830078125,
-0.03509521484375,
0.043853759765625,
-0.0007219314575195312,
-0.076904296875,
-0.0117950439453125,
-0.03948974609375,
-0.05963134765625,
0.017059326171875,
-0.0081329345703125,
-0.04925537109375,
0.01146697998046875,
0.013519287109375,
-0.025604248046875,
0.0256805419921875,
-0.0623779296875,
0.094970703125,
-0.032440185546875,
-0.0347900390625,
0.003643035888671875,
-0.050811767578125,
0.0361328125,
-0.000823974609375,
0.01253509521484375,
-0.0015926361083984375,
0.0199737548828125,
0.0723876953125,
-0.03472900390625,
0.07598876953125,
-0.0151214599609375,
0.0017423629760742188,
0.034088134765625,
-0.0185089111328125,
0.033355712890625,
-0.006557464599609375,
-0.004589080810546875,
0.0255126953125,
-0.01555633544921875,
-0.031707763671875,
-0.01110076904296875,
0.042724609375,
-0.081787109375,
-0.03387451171875,
-0.0310821533203125,
-0.035369873046875,
0.004978179931640625,
0.043182373046875,
0.057647705078125,
0.0236358642578125,
-0.007671356201171875,
0.00812530517578125,
0.03448486328125,
-0.03546142578125,
0.048431396484375,
0.01398468017578125,
-0.0175018310546875,
-0.0304718017578125,
0.062286376953125,
0.00922393798828125,
0.026214599609375,
0.00957489013671875,
0.005435943603515625,
-0.032257080078125,
-0.020538330078125,
-0.0242156982421875,
0.03497314453125,
-0.056121826171875,
-0.017730712890625,
-0.07293701171875,
-0.03155517578125,
-0.0482177734375,
-0.0164337158203125,
-0.047821044921875,
-0.00272369384765625,
-0.035064697265625,
-0.01505279541015625,
0.018096923828125,
0.032745361328125,
-0.004253387451171875,
0.03564453125,
-0.038116455078125,
0.019744873046875,
0.01186370849609375,
0.02459716796875,
0.006076812744140625,
-0.0382080078125,
-0.0279693603515625,
0.01160430908203125,
-0.025543212890625,
-0.060699462890625,
0.036224365234375,
0.0008459091186523438,
0.044708251953125,
0.039276123046875,
0.01446533203125,
0.041229248046875,
-0.0285491943359375,
0.052398681640625,
0.008636474609375,
-0.080322265625,
0.0303802490234375,
-0.028045654296875,
0.01406097412109375,
0.03857421875,
0.031524658203125,
-0.0254669189453125,
-0.03387451171875,
-0.056396484375,
-0.078369140625,
0.07501220703125,
0.033233642578125,
0.0216522216796875,
-0.0091400146484375,
0.0280609130859375,
-0.0113372802734375,
0.01776123046875,
-0.1019287109375,
-0.039642333984375,
-0.031768798828125,
-0.029876708984375,
-0.01303863525390625,
-0.01319122314453125,
0.00908660888671875,
-0.03399658203125,
0.06256103515625,
0.003665924072265625,
0.039276123046875,
0.025238037109375,
-0.023468017578125,
-0.00640869140625,
-0.00792694091796875,
0.0228271484375,
0.045074462890625,
-0.01165771484375,
-0.0009431838989257812,
0.01087188720703125,
-0.037933349609375,
-0.004230499267578125,
0.0231170654296875,
-0.0247955322265625,
0.0004780292510986328,
0.0193634033203125,
0.07965087890625,
-0.0007085800170898438,
-0.039276123046875,
0.03985595703125,
0.0023250579833984375,
-0.0145111083984375,
-0.03094482421875,
0.00446319580078125,
0.005626678466796875,
0.00988006591796875,
0.026031494140625,
0.00540924072265625,
-0.007656097412109375,
-0.02960205078125,
0.00982666015625,
0.04058837890625,
-0.0247650146484375,
-0.023834228515625,
0.07867431640625,
0.022796630859375,
-0.01380157470703125,
0.048858642578125,
-0.0164031982421875,
-0.056854248046875,
0.044921875,
0.042572021484375,
0.07421875,
-0.012298583984375,
0.012939453125,
0.047454833984375,
0.048370361328125,
-0.01045989990234375,
-0.0014925003051757812,
0.0113983154296875,
-0.061737060546875,
-0.03143310546875,
-0.05322265625,
0.003780364990234375,
0.0145111083984375,
-0.03192138671875,
0.0367431640625,
-0.0161895751953125,
-0.021514892578125,
-0.01314544677734375,
0.0005779266357421875,
-0.06597900390625,
0.01861572265625,
0.0006685256958007812,
0.056854248046875,
-0.0758056640625,
0.055419921875,
0.0328369140625,
-0.0443115234375,
-0.07232666015625,
-0.0013484954833984375,
-0.0226898193359375,
-0.056488037109375,
0.04730224609375,
0.04144287109375,
0.018768310546875,
0.035003662109375,
-0.052703857421875,
-0.0723876953125,
0.078125,
0.028900146484375,
-0.025543212890625,
-0.0225067138671875,
0.0182037353515625,
0.03948974609375,
-0.01032257080078125,
0.039306640625,
0.034820556640625,
0.0308990478515625,
-0.0102691650390625,
-0.065185546875,
0.0157470703125,
-0.01446533203125,
-0.015869140625,
0.0038433074951171875,
-0.0673828125,
0.08880615234375,
-0.00922393798828125,
-0.018310546875,
-0.01515960693359375,
0.053009033203125,
0.0013036727905273438,
0.002056121826171875,
0.0309295654296875,
0.043365478515625,
0.04364013671875,
-0.0159759521484375,
0.077880859375,
-0.037200927734375,
0.05389404296875,
0.057891845703125,
0.0023651123046875,
0.0428466796875,
0.0178375244140625,
-0.01317596435546875,
0.01812744140625,
0.05108642578125,
-0.0095062255859375,
0.02435302734375,
-0.0006351470947265625,
0.001087188720703125,
-0.01445770263671875,
0.00498199462890625,
-0.038543701171875,
0.0282135009765625,
0.00991058349609375,
-0.042694091796875,
-0.0114288330078125,
0.00388336181640625,
0.0179595947265625,
-0.0268402099609375,
-0.016998291015625,
0.038116455078125,
0.003833770751953125,
-0.057220458984375,
0.0538330078125,
0.009002685546875,
0.0733642578125,
-0.044464111328125,
0.023529052734375,
-0.004619598388671875,
0.0288238525390625,
-0.0185546875,
-0.0244293212890625,
0.01800537109375,
-0.0095367431640625,
-0.0001480579376220703,
-0.0172882080078125,
0.055755615234375,
-0.03472900390625,
-0.047454833984375,
0.0284576416015625,
0.0297088623046875,
0.00339508056640625,
-0.019195556640625,
-0.06524658203125,
0.01078033447265625,
0.0092010498046875,
-0.03631591796875,
0.00257110595703125,
0.0170135498046875,
0.007289886474609375,
0.04217529296875,
0.057403564453125,
-0.0119171142578125,
0.0259857177734375,
-0.016021728515625,
0.07427978515625,
-0.038238525390625,
-0.029541015625,
-0.08270263671875,
0.049530029296875,
-0.005298614501953125,
-0.0276641845703125,
0.06951904296875,
0.051788330078125,
0.08441162109375,
-0.0128936767578125,
0.057098388671875,
-0.0301513671875,
0.0145111083984375,
-0.0285491943359375,
0.07049560546875,
-0.0435791015625,
-0.006561279296875,
-0.0447998046875,
-0.0732421875,
-0.0100250244140625,
0.059417724609375,
-0.03466796875,
0.024932861328125,
0.0494384765625,
0.05865478515625,
-0.00746917724609375,
-0.01421356201171875,
0.0009336471557617188,
0.03564453125,
0.03314208984375,
0.04345703125,
0.041107177734375,
-0.045013427734375,
0.0556640625,
-0.037109375,
-0.02008056640625,
-0.0224609375,
-0.052490234375,
-0.0841064453125,
-0.046661376953125,
-0.0214996337890625,
-0.043487548828125,
-0.01349639892578125,
0.0667724609375,
0.054473876953125,
-0.045867919921875,
-0.016876220703125,
-0.02978515625,
0.0006122589111328125,
-0.004161834716796875,
-0.0250244140625,
0.044342041015625,
-0.036346435546875,
-0.0723876953125,
-0.00722503662109375,
-0.0044403076171875,
-0.0015497207641601562,
-0.0210113525390625,
-0.006427764892578125,
-0.025360107421875,
0.0038967132568359375,
0.039703369140625,
0.0110321044921875,
-0.047119140625,
-0.005481719970703125,
0.017730712890625,
-0.01514434814453125,
-0.004444122314453125,
0.037994384765625,
-0.046661376953125,
0.03387451171875,
0.0284423828125,
0.04071044921875,
0.04364013671875,
-0.0088348388671875,
0.0340576171875,
-0.032806396484375,
0.0203094482421875,
0.0182342529296875,
0.03472900390625,
0.016754150390625,
-0.031768798828125,
0.036163330078125,
0.0302276611328125,
-0.04656982421875,
-0.07232666015625,
0.00966644287109375,
-0.06622314453125,
-0.0173492431640625,
0.1007080078125,
-0.0014505386352539062,
-0.0186309814453125,
0.0006518363952636719,
-0.031341552734375,
0.044891357421875,
-0.0219268798828125,
0.05047607421875,
0.054443359375,
0.006572723388671875,
-0.0026340484619140625,
-0.04351806640625,
0.04412841796875,
0.0355224609375,
-0.0562744140625,
0.007843017578125,
0.03173828125,
0.026153564453125,
0.010284423828125,
0.07183837890625,
-0.006458282470703125,
0.006916046142578125,
-0.0060882568359375,
0.0198822021484375,
-0.0112152099609375,
-0.01418304443359375,
-0.00963592529296875,
-0.0066680908203125,
-0.0189208984375,
-0.01110076904296875
]
] |
Helsinki-NLP/opus-mt-fr-de | 2023-08-16T11:36:16.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"marian",
"text2text-generation",
"translation",
"fr",
"de",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-fr-de | 0 | 14,437 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-fr-de
* source languages: fr
* target languages: de
* OPUS readme: [fr-de](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/fr-de/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-01-09.zip](https://object.pouta.csc.fi/OPUS-MT-models/fr-de/opus-2020-01-09.zip)
* test set translations: [opus-2020-01-09.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/fr-de/opus-2020-01-09.test.txt)
* test set scores: [opus-2020-01-09.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/fr-de/opus-2020-01-09.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| euelections_dev2019.transformer-align.fr | 26.4 | 0.571 |
| newssyscomb2009.fr.de | 22.1 | 0.524 |
| news-test2008.fr.de | 22.1 | 0.524 |
| newstest2009.fr.de | 21.6 | 0.520 |
| newstest2010.fr.de | 22.6 | 0.527 |
| newstest2011.fr.de | 21.5 | 0.518 |
| newstest2012.fr.de | 22.4 | 0.516 |
| newstest2013.fr.de | 24.2 | 0.532 |
| newstest2019-frde.fr.de | 27.9 | 0.595 |
| Tatoeba.fr.de | 49.1 | 0.676 |
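For reference, the chr-F column above is a character n-gram F-score. A minimal self-contained sketch is below; it is simplified relative to the official chrF definition (Popović, 2015), which handles whitespace and n-gram weighting differently, so treat it as an illustration rather than the scoring code used for this table.

```python
from collections import Counter

def char_ngrams(text, n):
    """Count all character n-grams of length n in the string."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def chrf(hypothesis, reference, max_n=6, beta=2.0):
    """Minimal character n-gram F-score sketch, averaged over n-gram orders.
    beta > 1 weights recall more heavily than precision, as in chrF."""
    scores = []
    for n in range(1, max_n + 1):
        hyp, ref = char_ngrams(hypothesis, n), char_ngrams(reference, n)
        if not hyp or not ref:
            continue  # strings too short for this n-gram order
        overlap = sum((hyp & ref).values())
        prec = overlap / sum(hyp.values())
        rec = overlap / sum(ref.values())
        if prec + rec == 0:
            scores.append(0.0)
            continue
        scores.append((1 + beta**2) * prec * rec / (beta**2 * prec + rec))
    return sum(scores) / len(scores) if scores else 0.0

print(round(chrf("Das ist ein Test", "Das ist ein Test"), 3))  # 1.0 for identical strings
```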
| 1,209 | [
[
-0.034149169921875,
-0.0258636474609375,
0.0193939208984375,
0.03021240234375,
-0.0239410400390625,
-0.028228759765625,
-0.02294921875,
-0.004566192626953125,
0.006771087646484375,
0.03271484375,
-0.06170654296875,
-0.042327880859375,
-0.0477294921875,
0.0161285400390625,
-0.008514404296875,
0.0528564453125,
-0.01010894775390625,
0.0265350341796875,
0.0204925537109375,
-0.032073974609375,
-0.0290069580078125,
-0.02239990234375,
-0.0280914306640625,
-0.03448486328125,
0.0230712890625,
0.03839111328125,
0.037078857421875,
0.028472900390625,
0.06585693359375,
0.019622802734375,
-0.0083770751953125,
0.0012264251708984375,
-0.031829833984375,
-0.01343536376953125,
0.01145172119140625,
-0.03704833984375,
-0.0589599609375,
-0.0081024169921875,
0.07379150390625,
0.0360107421875,
-0.002593994140625,
0.0276031494140625,
0.0003223419189453125,
0.0716552734375,
-0.01134490966796875,
0.0002636909484863281,
-0.040496826171875,
0.01084136962890625,
-0.0174713134765625,
-0.0221710205078125,
-0.043304443359375,
-0.0157318115234375,
-0.00157928466796875,
-0.042572021484375,
0.010528564453125,
0.00946807861328125,
0.11016845703125,
0.02606201171875,
-0.0254974365234375,
0.0007619857788085938,
-0.045440673828125,
0.0772705078125,
-0.055206298828125,
0.047637939453125,
0.026397705078125,
0.020355224609375,
0.0093994140625,
-0.042816162109375,
-0.031280517578125,
0.0181121826171875,
-0.024566650390625,
0.025909423828125,
-0.020263671875,
-0.0205078125,
0.0256500244140625,
0.056640625,
-0.056182861328125,
0.0010557174682617188,
-0.046783447265625,
0.0013647079467773438,
0.05023193359375,
0.01214599609375,
0.0099639892578125,
-0.015869140625,
-0.03759765625,
-0.0406494140625,
-0.051361083984375,
0.00806427001953125,
0.0310211181640625,
0.01483154296875,
-0.0372314453125,
0.0504150390625,
-0.00974273681640625,
0.047882080078125,
0.007293701171875,
-0.004009246826171875,
0.07293701171875,
-0.0218963623046875,
-0.021453857421875,
-0.01035308837890625,
0.0924072265625,
0.0300445556640625,
-0.0002409219741821289,
0.0074615478515625,
-0.01438140869140625,
-0.0159149169921875,
0.00952911376953125,
-0.068115234375,
-0.007080078125,
0.011444091796875,
-0.03240966796875,
-0.0097808837890625,
0.00835418701171875,
-0.055206298828125,
0.0186920166015625,
-0.031005859375,
0.042633056640625,
-0.0413818359375,
-0.0194549560546875,
0.02294921875,
0.000415802001953125,
0.0258636474609375,
-0.00011497735977172852,
-0.049407958984375,
0.0181121826171875,
0.0262451171875,
0.05255126953125,
-0.0265045166015625,
-0.0179901123046875,
-0.0220184326171875,
-0.0146942138671875,
-0.00922393798828125,
0.05157470703125,
-0.004497528076171875,
-0.03399658203125,
-0.00943756103515625,
0.039276123046875,
-0.025146484375,
-0.027374267578125,
0.09954833984375,
-0.0239410400390625,
0.058319091796875,
-0.026702880859375,
-0.036651611328125,
-0.0206298828125,
0.0290374755859375,
-0.046630859375,
0.09710693359375,
0.014404296875,
-0.064697265625,
0.015960693359375,
-0.058685302734375,
-0.01180267333984375,
-0.00897979736328125,
0.00231170654296875,
-0.0533447265625,
0.0037593841552734375,
0.01111602783203125,
0.0287628173828125,
-0.02691650390625,
0.02203369140625,
0.0021991729736328125,
-0.024505615234375,
-0.0024166107177734375,
-0.038116455078125,
0.08563232421875,
0.021759033203125,
-0.027008056640625,
0.0191192626953125,
-0.07855224609375,
-0.005695343017578125,
0.00787353515625,
-0.032501220703125,
-0.0056304931640625,
0.00025463104248046875,
0.0147247314453125,
0.0101776123046875,
0.016204833984375,
-0.04949951171875,
0.021270751953125,
-0.04412841796875,
0.01401519775390625,
0.047943115234375,
-0.02056884765625,
0.031982421875,
-0.033966064453125,
0.0302734375,
0.008697509765625,
0.01407623291015625,
0.007564544677734375,
-0.0367431640625,
-0.060394287109375,
-0.02325439453125,
0.0312042236328125,
0.07470703125,
-0.049072265625,
0.0703125,
-0.05059814453125,
-0.06085205078125,
-0.0506591796875,
-0.011749267578125,
0.02215576171875,
0.041259765625,
0.041412353515625,
-0.004180908203125,
-0.03179931640625,
-0.08392333984375,
-0.0052947998046875,
-0.00199127197265625,
-0.0189666748046875,
0.0126800537109375,
0.044647216796875,
-0.01312255859375,
0.0489501953125,
-0.048919677734375,
-0.035064697265625,
-0.01163482666015625,
0.0146026611328125,
0.049774169921875,
0.0457763671875,
0.0421142578125,
-0.06756591796875,
-0.044830322265625,
-0.003662109375,
-0.04595947265625,
-0.0181427001953125,
0.005878448486328125,
-0.0184783935546875,
0.004150390625,
0.0062713623046875,
-0.02978515625,
0.0060577392578125,
0.048614501953125,
-0.04730224609375,
0.041412353515625,
-0.00873565673828125,
0.021240234375,
-0.10546875,
0.0106964111328125,
-0.01499176025390625,
-0.00754547119140625,
-0.037109375,
-0.01361083984375,
0.0195770263671875,
0.00812530517578125,
-0.05194091796875,
0.04486083984375,
-0.02374267578125,
0.0003814697265625,
0.0222320556640625,
-0.0007529258728027344,
0.00656890869140625,
0.05548095703125,
-0.0038433074951171875,
0.0570068359375,
0.050140380859375,
-0.033599853515625,
0.0151824951171875,
0.043701171875,
-0.03717041015625,
0.0306854248046875,
-0.054656982421875,
-0.0213623046875,
0.016632080078125,
-0.00495147705078125,
-0.055908203125,
0.0032978057861328125,
0.024932861328125,
-0.0452880859375,
0.0306549072265625,
-0.004489898681640625,
-0.049346923828125,
-0.01617431640625,
-0.0180511474609375,
0.0262908935546875,
0.050811767578125,
-0.0216064453125,
0.046051025390625,
0.0095977783203125,
-0.00201416015625,
-0.040313720703125,
-0.07171630859375,
-0.012969970703125,
-0.03289794921875,
-0.061553955078125,
0.0301513671875,
-0.0271759033203125,
-0.0009174346923828125,
0.0030956268310546875,
0.0168914794921875,
-0.007007598876953125,
-0.000637054443359375,
0.007801055908203125,
0.0201263427734375,
-0.0261077880859375,
-0.005329132080078125,
-0.001087188720703125,
-0.0222015380859375,
-0.0034942626953125,
-0.0031986236572265625,
0.048126220703125,
-0.02850341796875,
-0.02178955078125,
-0.040863037109375,
0.0029144287109375,
0.04388427734375,
-0.02288818359375,
0.061126708984375,
0.051055908203125,
-0.00865936279296875,
0.01001739501953125,
-0.0297088623046875,
0.0019006729125976562,
-0.03466796875,
0.0091400146484375,
-0.03314208984375,
-0.06390380859375,
0.0474853515625,
0.00945281982421875,
0.032989501953125,
0.0709228515625,
0.04901123046875,
0.0069732666015625,
0.06549072265625,
0.0242156982421875,
0.01085662841796875,
0.0364990234375,
-0.03997802734375,
-0.0014219284057617188,
-0.0693359375,
-0.0009551048278808594,
-0.048736572265625,
-0.0303955078125,
-0.06201171875,
-0.0178070068359375,
0.0289306640625,
0.00975799560546875,
-0.0214385986328125,
0.054901123046875,
-0.049713134765625,
0.01528167724609375,
0.044097900390625,
-0.00995635986328125,
0.023040771484375,
0.006580352783203125,
-0.0347900390625,
-0.0175323486328125,
-0.03448486328125,
-0.0301055908203125,
0.0843505859375,
0.0273895263671875,
0.027069091796875,
0.0226898193359375,
0.0455322265625,
-0.00263214111328125,
0.0204925537109375,
-0.047027587890625,
0.03466796875,
-0.0175323486328125,
-0.056671142578125,
-0.0245208740234375,
-0.04461669921875,
-0.057159423828125,
0.036773681640625,
-0.01629638671875,
-0.041717529296875,
0.022918701171875,
0.00014352798461914062,
-0.01244354248046875,
0.034942626953125,
-0.047332763671875,
0.08245849609375,
-0.0033016204833984375,
-0.00665283203125,
0.0184326171875,
-0.04046630859375,
0.023956298828125,
0.00019812583923339844,
0.0217437744140625,
-0.01535797119140625,
0.01708984375,
0.052398681640625,
-0.01232147216796875,
0.02630615234375,
-0.00408935546875,
-0.011871337890625,
0.007572174072265625,
-0.0017404556274414062,
0.029754638671875,
-0.0108642578125,
-0.0260162353515625,
0.028961181640625,
0.006320953369140625,
-0.0267486572265625,
-0.01345062255859375,
0.043792724609375,
-0.053619384765625,
-0.014801025390625,
-0.03656005859375,
-0.046905517578125,
0.0012054443359375,
0.032867431640625,
0.054901123046875,
0.044586181640625,
-0.0173187255859375,
0.04620361328125,
0.061798095703125,
-0.0207366943359375,
0.03167724609375,
0.0504150390625,
-0.01226806640625,
-0.045623779296875,
0.05950927734375,
0.004955291748046875,
0.02325439453125,
0.044036865234375,
0.01282501220703125,
-0.0203704833984375,
-0.039825439453125,
-0.0474853515625,
0.01715087890625,
-0.0269622802734375,
-0.0184326171875,
-0.044586181640625,
-0.007381439208984375,
-0.0267486572265625,
0.00823974609375,
-0.03369140625,
-0.044708251953125,
-0.0166778564453125,
-0.0153350830078125,
0.0196533203125,
0.0208587646484375,
-0.01404571533203125,
0.034912109375,
-0.0762939453125,
0.019195556640625,
-0.005828857421875,
0.0252532958984375,
-0.033355712890625,
-0.061676025390625,
-0.02484130859375,
0.0010833740234375,
-0.05157470703125,
-0.053558349609375,
0.042694091796875,
0.005184173583984375,
0.02105712890625,
0.0316162109375,
0.01129150390625,
0.03369140625,
-0.059600830078125,
0.0693359375,
0.0146942138671875,
-0.048126220703125,
0.031646728515625,
-0.0302581787109375,
0.034393310546875,
0.058258056640625,
0.021270751953125,
-0.0242462158203125,
-0.035400390625,
-0.0594482421875,
-0.06610107421875,
0.0694580078125,
0.048828125,
-0.007083892822265625,
0.01220703125,
-0.01261138916015625,
-0.00389862060546875,
0.004390716552734375,
-0.07879638671875,
-0.042633056640625,
0.0035228729248046875,
-0.031829833984375,
-0.00894927978515625,
-0.0193939208984375,
-0.0246734619140625,
-0.02276611328125,
0.07537841796875,
0.011749267578125,
0.023773193359375,
0.0272674560546875,
-0.0003819465637207031,
-0.01434326171875,
0.0301513671875,
0.071533203125,
0.042724609375,
-0.039093017578125,
-0.006130218505859375,
0.025909423828125,
-0.03564453125,
-0.00882720947265625,
0.016387939453125,
-0.0254974365234375,
0.022125244140625,
0.0256805419921875,
0.073486328125,
0.0117950439453125,
-0.038818359375,
0.036285400390625,
-0.0235137939453125,
-0.040313720703125,
-0.052276611328125,
-0.013427734375,
0.01043701171875,
0.009124755859375,
0.02239990234375,
0.0150909423828125,
0.0094451904296875,
-0.00815582275390625,
0.017791748046875,
0.0123138427734375,
-0.0462646484375,
-0.0355224609375,
0.043853759765625,
0.0004978179931640625,
-0.01934814453125,
0.0291748046875,
-0.0244140625,
-0.04473876953125,
0.031829833984375,
0.01204681396484375,
0.07354736328125,
-0.02081298828125,
-0.0148162841796875,
0.05535888671875,
0.044891357421875,
-0.01641845703125,
0.037322998046875,
0.01172637939453125,
-0.047088623046875,
-0.034759521484375,
-0.06280517578125,
-0.0031585693359375,
0.00958251953125,
-0.066162109375,
0.03826904296875,
0.0262908935546875,
-0.006816864013671875,
-0.0216522216796875,
0.014984130859375,
-0.046905517578125,
0.01152801513671875,
-0.020172119140625,
0.07830810546875,
-0.07525634765625,
0.06842041015625,
0.033416748046875,
-0.01959228515625,
-0.06378173828125,
-0.025146484375,
-0.016265869140625,
-0.03643798828125,
0.04888916015625,
0.007610321044921875,
0.01532745361328125,
-0.004913330078125,
-0.017486572265625,
-0.06878662109375,
0.0897216796875,
0.012725830078125,
-0.04473876953125,
0.005527496337890625,
0.0147552490234375,
0.0364990234375,
-0.0184326171875,
0.01232147216796875,
0.03289794921875,
0.060272216796875,
0.0116424560546875,
-0.07666015625,
-0.01119232177734375,
-0.03778076171875,
-0.028656005859375,
0.04901123046875,
-0.0477294921875,
0.07501220703125,
0.0299530029296875,
-0.01007843017578125,
0.0004839897155761719,
0.04638671875,
0.0202178955078125,
0.0147552490234375,
0.04498291015625,
0.09112548828125,
0.0260009765625,
-0.043731689453125,
0.06719970703125,
-0.03216552734375,
0.049591064453125,
0.08148193359375,
0.0037708282470703125,
0.05963134765625,
0.0230255126953125,
-0.0204010009765625,
0.035980224609375,
0.042816162109375,
-0.0284271240234375,
0.03863525390625,
-0.005786895751953125,
0.0104827880859375,
-0.01355743408203125,
0.0185089111328125,
-0.05224609375,
0.01270294189453125,
0.0164794921875,
-0.0175018310546875,
-0.005229949951171875,
-0.006084442138671875,
0.00872802734375,
-0.009918212890625,
-0.004932403564453125,
0.03753662109375,
0.0012311935424804688,
-0.039398193359375,
0.055419921875,
-0.00423431396484375,
0.04583740234375,
-0.0499267578125,
0.0032367706298828125,
-0.0045928955078125,
0.0246734619140625,
-0.01141357421875,
-0.049652099609375,
0.04345703125,
0.0010499954223632812,
-0.027618408203125,
-0.037933349609375,
0.01500701904296875,
-0.0384521484375,
-0.069091796875,
0.0261383056640625,
0.0237274169921875,
0.0167694091796875,
0.005870819091796875,
-0.06103515625,
-0.0030364990234375,
0.012786865234375,
-0.05224609375,
0.0003681182861328125,
0.0504150390625,
0.025115966796875,
0.03045654296875,
0.040618896484375,
0.0093231201171875,
0.017059326171875,
-0.0007987022399902344,
0.051971435546875,
-0.0338134765625,
-0.031402587890625,
-0.055389404296875,
0.061553955078125,
-0.00942230224609375,
-0.04583740234375,
0.05096435546875,
0.06787109375,
0.06976318359375,
-0.00710296630859375,
0.0170440673828125,
-0.01512908935546875,
0.059112548828125,
-0.043182373046875,
0.04901123046875,
-0.0753173828125,
0.0161285400390625,
-0.0090179443359375,
-0.07733154296875,
-0.0225830078125,
0.02349853515625,
-0.017303466796875,
-0.0228424072265625,
0.054962158203125,
0.04852294921875,
-0.01105499267578125,
-0.00957489013671875,
0.0202178955078125,
0.02655029296875,
0.017059326171875,
0.037872314453125,
0.032318115234375,
-0.0709228515625,
0.042236328125,
-0.033660888671875,
-0.01184844970703125,
-0.0051116943359375,
-0.05511474609375,
-0.056884765625,
-0.044708251953125,
-0.009765625,
-0.017242431640625,
-0.0330810546875,
0.06134033203125,
0.036224365234375,
-0.0706787109375,
-0.033416748046875,
0.0008549690246582031,
0.004360198974609375,
-0.0196685791015625,
-0.0218658447265625,
0.05078125,
-0.01468658447265625,
-0.075439453125,
0.03546142578125,
0.0001659393310546875,
-0.003658294677734375,
-0.0015392303466796875,
-0.0206146240234375,
-0.033599853515625,
-0.0099639892578125,
0.02679443359375,
-0.0011205673217773438,
-0.04083251953125,
0.00189971923828125,
0.01116180419921875,
-0.003978729248046875,
0.03271484375,
0.017852783203125,
-0.01476287841796875,
0.01119232177734375,
0.06878662109375,
0.01507568359375,
0.036773681640625,
-0.0042877197265625,
0.0298919677734375,
-0.053924560546875,
0.02178955078125,
0.0159149169921875,
0.045654296875,
0.021728515625,
-0.005565643310546875,
0.05694580078125,
0.0144500732421875,
-0.050384521484375,
-0.07421875,
0.0033512115478515625,
-0.09356689453125,
-0.0034027099609375,
0.06976318359375,
-0.0122833251953125,
-0.0185699462890625,
0.024444580078125,
-0.011444091796875,
0.0108795166015625,
-0.025299072265625,
0.033447265625,
0.06719970703125,
0.01302337646484375,
0.00644683837890625,
-0.053802490234375,
0.027191162109375,
0.031829833984375,
-0.04840087890625,
-0.0158843994140625,
0.0177154541015625,
0.01197052001953125,
0.039459228515625,
0.031524658203125,
-0.0251922607421875,
-0.005382537841796875,
-0.0186767578125,
0.030548095703125,
-0.00644683837890625,
-0.0186614990234375,
-0.0247955322265625,
0.0004448890686035156,
-0.0128936767578125,
-0.01552581787109375
]
] |
timm/vit_base_patch16_224.dino | 2023-05-06T03:21:01.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:2104.14294",
"arxiv:2010.11929",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/vit_base_patch16_224.dino | 0 | 14,434 | timm | 2022-12-22T07:26:27 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
---
# Model card for vit_base_patch16_224.dino
A Vision Transformer (ViT) image feature model, trained with the self-supervised DINO method.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 85.8
- GMACs: 16.9
- Activations (M): 16.5
- Image size: 224 x 224
- **Papers:**
- Emerging Properties in Self-Supervised Vision Transformers: https://arxiv.org/abs/2104.14294
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Pretrain Dataset:** ImageNet-1k
- **Original:** https://github.com/facebookresearch/dino
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_base_patch16_224.dino', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_base_patch16_224.dino',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 197, 768) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{caron2021emerging,
title={Emerging properties in self-supervised vision transformers},
  author={Caron, Mathilde and Touvron, Hugo and Misra, Ishan and J{\'e}gou, Herv{\'e} and Mairal, Julien and Bojanowski, Piotr and Joulin, Armand},
booktitle={Proceedings of the IEEE/CVF international conference on computer vision},
pages={9650--9660},
year={2021}
}
```
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 3,678 | [
[
-0.04144287109375,
-0.0234527587890625,
0.005527496337890625,
0.01183319091796875,
-0.0267181396484375,
-0.024200439453125,
-0.0193634033203125,
-0.03924560546875,
0.0227203369140625,
0.02587890625,
-0.037139892578125,
-0.0382080078125,
-0.049591064453125,
0.001575469970703125,
-0.0160675048828125,
0.0718994140625,
-0.00537109375,
-0.0175933837890625,
-0.01763916015625,
-0.036651611328125,
-0.00833892822265625,
-0.022918701171875,
-0.034912109375,
-0.0254669189453125,
0.0264892578125,
0.002349853515625,
0.047515869140625,
0.064697265625,
0.047332763671875,
0.0362548828125,
-0.0037059783935546875,
0.00604248046875,
-0.0103302001953125,
-0.0161285400390625,
0.011871337890625,
-0.038818359375,
-0.034423828125,
0.0221099853515625,
0.05499267578125,
0.02789306640625,
0.004398345947265625,
0.026885986328125,
0.01261138916015625,
0.0377197265625,
-0.0231170654296875,
0.018402099609375,
-0.03729248046875,
0.0162353515625,
-0.007564544677734375,
0.0007476806640625,
-0.01497650146484375,
-0.01715087890625,
0.0133514404296875,
-0.0396728515625,
0.04296875,
-0.00836944580078125,
0.09832763671875,
0.0199127197265625,
-0.0064544677734375,
0.00788116455078125,
-0.0280303955078125,
0.060302734375,
-0.050384521484375,
0.026123046875,
0.00909423828125,
0.01099395751953125,
-0.0010328292846679688,
-0.07373046875,
-0.050628662109375,
-0.002101898193359375,
-0.0196685791015625,
0.004047393798828125,
-0.029327392578125,
0.0038814544677734375,
0.01995849609375,
0.039642333984375,
-0.026275634765625,
0.0018177032470703125,
-0.04254150390625,
-0.027374267578125,
0.046417236328125,
-0.005405426025390625,
0.01507568359375,
-0.0282440185546875,
-0.04638671875,
-0.04345703125,
-0.0243988037109375,
0.01776123046875,
0.0226593017578125,
0.00957489013671875,
-0.036712646484375,
0.040252685546875,
0.0052337646484375,
0.0426025390625,
0.0213470458984375,
-0.0201873779296875,
0.049224853515625,
-0.016021728515625,
-0.021728515625,
-0.0162353515625,
0.088623046875,
0.034149169921875,
0.026092529296875,
0.00308990478515625,
-0.0100555419921875,
-0.0135955810546875,
-0.00215911865234375,
-0.081298828125,
-0.0312042236328125,
0.00998687744140625,
-0.0408935546875,
-0.02752685546875,
0.01107025146484375,
-0.06201171875,
-0.00788116455078125,
-0.0050811767578125,
0.05517578125,
-0.031524658203125,
-0.03155517578125,
-0.0003898143768310547,
-0.01009368896484375,
0.03314208984375,
0.01125335693359375,
-0.055999755859375,
0.0085601806640625,
0.01441192626953125,
0.07525634765625,
-0.002231597900390625,
-0.0421142578125,
-0.008209228515625,
-0.0248870849609375,
-0.02203369140625,
0.048126220703125,
-0.0031452178955078125,
-0.024871826171875,
-0.0222015380859375,
0.027313232421875,
-0.01885986328125,
-0.047607421875,
0.0186767578125,
-0.007198333740234375,
0.0241546630859375,
0.00955963134765625,
-0.01006317138671875,
-0.0225067138671875,
0.0167236328125,
-0.031951904296875,
0.09808349609375,
0.032806396484375,
-0.060516357421875,
0.033050537109375,
-0.0328369140625,
-0.00986480712890625,
-0.01374053955078125,
-0.0034465789794921875,
-0.08404541015625,
-0.0030727386474609375,
0.0262908935546875,
0.045623779296875,
-0.0134124755859375,
0.00435638427734375,
-0.040130615234375,
-0.00893402099609375,
0.0262451171875,
-0.017059326171875,
0.06878662109375,
0.010467529296875,
-0.03717041015625,
0.020904541015625,
-0.049835205078125,
0.007488250732421875,
0.031951904296875,
-0.018218994140625,
-0.0020275115966796875,
-0.043975830078125,
0.004486083984375,
0.02581787109375,
0.0249176025390625,
-0.04644775390625,
0.02923583984375,
-0.015655517578125,
0.033447265625,
0.053497314453125,
-0.017547607421875,
0.02972412109375,
-0.02105712890625,
0.026275634765625,
0.0216217041015625,
0.0309295654296875,
-0.012237548828125,
-0.0458984375,
-0.06622314453125,
-0.044189453125,
0.026336669921875,
0.0249481201171875,
-0.0379638671875,
0.0452880859375,
-0.0244598388671875,
-0.05352783203125,
-0.045440673828125,
0.01256561279296875,
0.03668212890625,
0.04254150390625,
0.040618896484375,
-0.039581298828125,
-0.03570556640625,
-0.07318115234375,
-0.004505157470703125,
-0.0081939697265625,
-0.0016193389892578125,
0.0328369140625,
0.044525146484375,
-0.01812744140625,
0.06817626953125,
-0.0311431884765625,
-0.031829833984375,
-0.00989532470703125,
0.0103302001953125,
0.0270538330078125,
0.053070068359375,
0.0640869140625,
-0.048492431640625,
-0.0333251953125,
-0.017578125,
-0.06640625,
0.007175445556640625,
0.0003437995910644531,
-0.0054473876953125,
0.02508544921875,
0.01290130615234375,
-0.0594482421875,
0.049468994140625,
0.0243377685546875,
-0.032073974609375,
0.03387451171875,
-0.024627685546875,
0.007190704345703125,
-0.0892333984375,
0.002696990966796875,
0.0296783447265625,
-0.01116943359375,
-0.02880859375,
-0.0015439987182617188,
0.01110076904296875,
0.00304412841796875,
-0.030242919921875,
0.043731689453125,
-0.04559326171875,
-0.0208740234375,
-0.0030193328857421875,
-0.0198516845703125,
0.002429962158203125,
0.053863525390625,
-0.0011653900146484375,
0.045013427734375,
0.0596923828125,
-0.037567138671875,
0.038848876953125,
0.0323486328125,
-0.0144805908203125,
0.041229248046875,
-0.049102783203125,
0.00737762451171875,
0.00571441650390625,
0.01222991943359375,
-0.07757568359375,
-0.01110076904296875,
0.0222930908203125,
-0.047515869140625,
0.048736572265625,
-0.041046142578125,
-0.0310516357421875,
-0.050201416015625,
-0.03350830078125,
0.0360107421875,
0.04913330078125,
-0.061248779296875,
0.049163818359375,
0.01544189453125,
0.0189666748046875,
-0.043243408203125,
-0.0626220703125,
-0.00928497314453125,
-0.031707763671875,
-0.051361083984375,
0.039093017578125,
-0.00047326087951660156,
0.0173187255859375,
0.00955963134765625,
-0.003108978271484375,
0.0007576942443847656,
-0.01360321044921875,
0.0295257568359375,
0.0308990478515625,
-0.0189971923828125,
-0.00632476806640625,
-0.0216522216796875,
-0.0137481689453125,
0.01132965087890625,
-0.028045654296875,
0.03924560546875,
-0.025543212890625,
-0.00583648681640625,
-0.050811767578125,
-0.0172119140625,
0.0443115234375,
-0.02215576171875,
0.047515869140625,
0.07965087890625,
-0.0265960693359375,
0.0003848075866699219,
-0.039459228515625,
-0.023529052734375,
-0.039764404296875,
0.030181884765625,
-0.031768798828125,
-0.03387451171875,
0.066650390625,
0.008758544921875,
-0.0088348388671875,
0.0703125,
0.032867431640625,
-0.00566864013671875,
0.060546875,
0.04827880859375,
0.0109405517578125,
0.054534912109375,
-0.07232666015625,
-0.00437164306640625,
-0.07080078125,
-0.038818359375,
-0.0162811279296875,
-0.031829833984375,
-0.06353759765625,
-0.03607177734375,
0.032135009765625,
0.0037078857421875,
-0.01219940185546875,
0.047515869140625,
-0.070068359375,
-0.00024402141571044922,
0.060821533203125,
0.03375244140625,
-0.012237548828125,
0.022674560546875,
-0.02276611328125,
-0.005390167236328125,
-0.04974365234375,
-0.00174713134765625,
0.08642578125,
0.028778076171875,
0.05743408203125,
-0.01422882080078125,
0.053070068359375,
-0.0117950439453125,
0.0185089111328125,
-0.053314208984375,
0.0479736328125,
-0.00040078163146972656,
-0.039642333984375,
-0.0253143310546875,
-0.0290985107421875,
-0.08349609375,
0.01285552978515625,
-0.0223388671875,
-0.055267333984375,
0.03485107421875,
0.016571044921875,
-0.01934814453125,
0.05267333984375,
-0.052490234375,
0.0721435546875,
-0.00989532470703125,
-0.03302001953125,
0.0038928985595703125,
-0.04833984375,
0.0113983154296875,
0.00814056396484375,
-0.017364501953125,
-0.00614166259765625,
0.01715087890625,
0.06591796875,
-0.04345703125,
0.064697265625,
-0.0272369384765625,
0.01513671875,
0.041473388671875,
-0.01026153564453125,
0.0192413330078125,
-0.00574493408203125,
0.006099700927734375,
0.033966064453125,
0.007511138916015625,
-0.0257415771484375,
-0.03857421875,
0.042572021484375,
-0.07806396484375,
-0.041168212890625,
-0.038726806640625,
-0.035430908203125,
0.01529693603515625,
0.01445770263671875,
0.045013427734375,
0.04461669921875,
0.02337646484375,
0.02435302734375,
0.04443359375,
-0.0191497802734375,
0.043548583984375,
0.005474090576171875,
-0.0169677734375,
-0.04095458984375,
0.06878662109375,
0.022552490234375,
0.010528564453125,
0.00418853759765625,
0.0210113525390625,
-0.03411865234375,
-0.03955078125,
-0.0328369140625,
0.0323486328125,
-0.047454833984375,
-0.03900146484375,
-0.04681396484375,
-0.040435791015625,
-0.03753662109375,
0.006526947021484375,
-0.041290283203125,
-0.032257080078125,
-0.034637451171875,
0.0074615478515625,
0.056884765625,
0.038726806640625,
-0.02423095703125,
0.028961181640625,
-0.0382080078125,
0.01560211181640625,
0.0364990234375,
0.040374755859375,
-0.020233154296875,
-0.0743408203125,
-0.01812744140625,
-0.0035114288330078125,
-0.034088134765625,
-0.04931640625,
0.0433349609375,
0.016082763671875,
0.0418701171875,
0.0207366943359375,
-0.0133819580078125,
0.057098388671875,
-0.004756927490234375,
0.042724609375,
0.02374267578125,
-0.0489501953125,
0.046661376953125,
-0.016143798828125,
0.01129150390625,
0.006893157958984375,
0.0159149169921875,
-0.01335906982421875,
-0.0019063949584960938,
-0.07958984375,
-0.05389404296875,
0.052490234375,
0.01560211181640625,
0.01000213623046875,
0.021820068359375,
0.0477294921875,
-0.00852203369140625,
0.002410888671875,
-0.0684814453125,
-0.03131103515625,
-0.040130615234375,
-0.0242767333984375,
-0.00263214111328125,
-0.00799560546875,
-0.0012187957763671875,
-0.053924560546875,
0.04296875,
-0.0205841064453125,
0.06103515625,
0.037445068359375,
-0.00542449951171875,
-0.01125335693359375,
-0.0220947265625,
0.026458740234375,
0.02252197265625,
-0.0242919921875,
0.005886077880859375,
0.01430511474609375,
-0.0435791015625,
-0.0013685226440429688,
0.029449462890625,
0.002227783203125,
-0.003574371337890625,
0.0291290283203125,
0.06494140625,
0.0010881423950195312,
0.007656097412109375,
0.0435791015625,
-0.0003161430358886719,
-0.03424072265625,
-0.0218963623046875,
0.01320648193359375,
-0.01244354248046875,
0.031707763671875,
0.03369140625,
0.026336669921875,
-0.002521514892578125,
-0.02490234375,
0.0194091796875,
0.042022705078125,
-0.03424072265625,
-0.036956787109375,
0.051788330078125,
-0.01666259765625,
-0.004669189453125,
0.0577392578125,
-0.003925323486328125,
-0.039886474609375,
0.07489013671875,
0.033447265625,
0.0703125,
-0.0208587646484375,
0.00502777099609375,
0.054779052734375,
0.01561737060546875,
-0.0002008676528930664,
0.010772705078125,
0.00019848346710205078,
-0.0660400390625,
0.0023632049560546875,
-0.046051025390625,
-0.000423431396484375,
0.0194549560546875,
-0.039520263671875,
0.026031494140625,
-0.04168701171875,
-0.0335693359375,
0.0203857421875,
0.01288604736328125,
-0.06854248046875,
0.02618408203125,
0.005710601806640625,
0.049835205078125,
-0.05413818359375,
0.061004638671875,
0.055755615234375,
-0.047332763671875,
-0.0694580078125,
-0.01346588134765625,
-0.006435394287109375,
-0.0703125,
0.040435791015625,
0.0296783447265625,
-0.0003075599670410156,
0.0201873779296875,
-0.056884765625,
-0.062469482421875,
0.1044921875,
0.0352783203125,
-0.00542449951171875,
0.00852203369140625,
0.0016889572143554688,
0.02911376953125,
-0.0286865234375,
0.0294342041015625,
0.0185546875,
0.026702880859375,
0.020538330078125,
-0.060638427734375,
0.00928497314453125,
-0.0258941650390625,
0.0071258544921875,
0.0153961181640625,
-0.07305908203125,
0.0665283203125,
-0.036865234375,
-0.0110931396484375,
0.01104736328125,
0.053070068359375,
0.016876220703125,
0.0067138671875,
0.045440673828125,
0.06304931640625,
0.02923583984375,
-0.0259552001953125,
0.0640869140625,
-0.0199432373046875,
0.05828857421875,
0.039825439453125,
0.03289794921875,
0.036529541015625,
0.0301513671875,
-0.03265380859375,
0.0367431640625,
0.06982421875,
-0.044464111328125,
0.0283355712890625,
0.0056304931640625,
0.0022792816162109375,
-0.0204925537109375,
0.0018253326416015625,
-0.0316162109375,
0.0443115234375,
0.01445770263671875,
-0.04547119140625,
-0.005340576171875,
0.0035381317138671875,
-0.007904052734375,
-0.021331787109375,
-0.0125274658203125,
0.039093017578125,
0.003795623779296875,
-0.03424072265625,
0.05517578125,
-0.006359100341796875,
0.062347412109375,
-0.02301025390625,
0.0016908645629882812,
-0.0209197998046875,
0.028167724609375,
-0.0243682861328125,
-0.07342529296875,
0.0171966552734375,
-0.0255126953125,
-0.00487518310546875,
-0.003475189208984375,
0.05633544921875,
-0.024261474609375,
-0.044830322265625,
0.021820068359375,
0.02001953125,
0.024261474609375,
-0.0016489028930664062,
-0.08062744140625,
-0.00836181640625,
-0.005413055419921875,
-0.0450439453125,
0.0169677734375,
0.037078857421875,
0.0060882568359375,
0.058837890625,
0.047943115234375,
-0.00977325439453125,
0.0180816650390625,
-0.017974853515625,
0.07293701171875,
-0.04345703125,
-0.024932861328125,
-0.059295654296875,
0.042022705078125,
-0.0070648193359375,
-0.04425048828125,
0.04791259765625,
0.0450439453125,
0.06353759765625,
-0.00365447998046875,
0.0278778076171875,
-0.01255035400390625,
0.002971649169921875,
-0.02178955078125,
0.04766845703125,
-0.0487060546875,
-0.005054473876953125,
-0.0165252685546875,
-0.0679931640625,
-0.026458740234375,
0.05950927734375,
-0.0204010009765625,
0.0408935546875,
0.033905029296875,
0.07012939453125,
-0.026275634765625,
-0.037994384765625,
0.0206298828125,
0.0162811279296875,
0.006702423095703125,
0.0209197998046875,
0.03521728515625,
-0.05950927734375,
0.042938232421875,
-0.0455322265625,
-0.007686614990234375,
-0.0127716064453125,
-0.050079345703125,
-0.07806396484375,
-0.0657958984375,
-0.047637939453125,
-0.04913330078125,
-0.0166473388671875,
0.058013916015625,
0.08050537109375,
-0.049468994140625,
-0.0015459060668945312,
-0.00446319580078125,
0.0038394927978515625,
-0.01702880859375,
-0.0178680419921875,
0.045379638671875,
-0.00475311279296875,
-0.0582275390625,
-0.0198822021484375,
0.00615692138671875,
0.040557861328125,
-0.02154541015625,
-0.01029205322265625,
-0.00677490234375,
-0.016387939453125,
0.01474761962890625,
0.0294647216796875,
-0.05682373046875,
-0.0189666748046875,
-0.005359649658203125,
-0.0097198486328125,
0.0469970703125,
0.024658203125,
-0.059844970703125,
0.037109375,
0.0294189453125,
0.01751708984375,
0.073486328125,
-0.019317626953125,
0.00823974609375,
-0.0531005859375,
0.031982421875,
-0.0117645263671875,
0.041290283203125,
0.041900634765625,
-0.0296783447265625,
0.04376220703125,
0.04827880859375,
-0.031005859375,
-0.051544189453125,
-0.00502777099609375,
-0.07977294921875,
0.0011930465698242188,
0.07000732421875,
-0.0290679931640625,
-0.039794921875,
0.03271484375,
0.001010894775390625,
0.04913330078125,
-0.006122589111328125,
0.0489501953125,
0.016082763671875,
0.00022840499877929688,
-0.053436279296875,
-0.02117919921875,
0.035247802734375,
0.0113067626953125,
-0.047088623046875,
-0.027252197265625,
0.004009246826171875,
0.051300048828125,
0.03497314453125,
0.0279388427734375,
-0.0214691162109375,
0.00989532470703125,
0.0001291036605834961,
0.043365478515625,
-0.0206146240234375,
-0.01422882080078125,
-0.0301055908203125,
-0.01061248779296875,
-0.01151275634765625,
-0.036773681640625
]
] |
google/tapas-base-finetuned-wtq | 2022-07-14T10:12:59.000Z | [
"transformers",
"pytorch",
"tf",
"tapas",
"table-question-answering",
"en",
"dataset:wikitablequestions",
"arxiv:2004.02349",
"arxiv:2010.00571",
"arxiv:1508.00305",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | table-question-answering | google | null | null | google/tapas-base-finetuned-wtq | 159 | 14,432 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- tapas
license: apache-2.0
datasets:
- wikitablequestions
---
# TAPAS base model fine-tuned on WikiTable Questions (WTQ)
This model has two versions which can be used. The default version corresponds to the `tapas_wtq_wikisql_sqa_inter_masklm_base_reset` checkpoint of the [original Github repository](https://github.com/google-research/tapas).
This model was pre-trained with a masked language modeling (MLM) objective plus an additional step which the authors call intermediate pre-training, and then fine-tuned in a chain on [SQA](https://www.microsoft.com/en-us/download/details.aspx?id=54253), [WikiSQL](https://github.com/salesforce/WikiSQL) and finally [WTQ](https://github.com/ppasupat/WikiTableQuestions). It uses relative position embeddings (i.e. resetting the position index at every cell of the table).
The other (non-default) version which can be used is:
- `no_reset`, which corresponds to `tapas_wtq_wikisql_sqa_inter_masklm_base` (intermediate pre-training, absolute position embeddings).
Disclaimer: The team releasing TAPAS did not write a model card for this model so this model card has been written by
the Hugging Face team and contributors.
## Results
Size | Reset | Dev Accuracy | Link
-------- | --------| -------- | ----
LARGE | noreset | 0.5062 | [tapas-large-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-large-finetuned-wtq/tree/no_reset)
LARGE | reset | 0.5097 | [tapas-large-finetuned-wtq](https://huggingface.co/google/tapas-large-finetuned-wtq/tree/main)
**BASE** | **noreset** | **0.4525** | [tapas-base-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-base-finetuned-wtq/tree/no_reset)
**BASE** | **reset** | **0.4638** | [tapas-base-finetuned-wtq](https://huggingface.co/google/tapas-base-finetuned-wtq/tree/main)
MEDIUM | noreset | 0.4324 | [tapas-medium-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-medium-finetuned-wtq/tree/no_reset)
MEDIUM | reset | 0.4324 | [tapas-medium-finetuned-wtq](https://huggingface.co/google/tapas-medium-finetuned-wtq/tree/main)
SMALL | noreset | 0.3681 | [tapas-small-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-small-finetuned-wtq/tree/no_reset)
SMALL | reset | 0.3762 | [tapas-small-finetuned-wtq](https://huggingface.co/google/tapas-small-finetuned-wtq/tree/main)
MINI | noreset | 0.2783 | [tapas-mini-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-mini-finetuned-wtq/tree/no_reset)
MINI | reset | 0.2854 | [tapas-mini-finetuned-wtq](https://huggingface.co/google/tapas-mini-finetuned-wtq/tree/main)
TINY | noreset | 0.0823 | [tapas-tiny-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-tiny-finetuned-wtq/tree/no_reset)
TINY | reset | 0.1039 | [tapas-tiny-finetuned-wtq](https://huggingface.co/google/tapas-tiny-finetuned-wtq/tree/main)
## Model description
TAPAS is a BERT-like transformers model pretrained on a large corpus of English data from Wikipedia in a self-supervised fashion.
This means it was pretrained on the raw tables and associated texts only, with no humans labelling them in any way (which is why it
can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a (flattened) table and associated context, the model randomly masks 15% of the words in
the input, then runs the entire (partially masked) sequence through the model. The model then has to predict the masked words.
This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other,
or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional
representation of a table and associated text.
- Intermediate pre-training: to encourage numerical reasoning on tables, the authors additionally pre-trained the model by creating
a balanced dataset of millions of syntactically created training examples. Here, the model must predict (classify) whether a sentence
is supported or refuted by the contents of a table. The training examples are created based on synthetic as well as counterfactual statements.
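As a rough illustration of the MLM objective in the first bullet, masking 15% of the input tokens can be sketched as follows. This is a hypothetical simplification, not the actual TAPAS implementation, which operates on WordPiece tokens of a flattened table:

```python
import random

# Hypothetical sketch of the MLM masking step: replace each token with
# [MASK] independently with probability 15%. The model's training task
# is then to predict the original tokens at the masked positions.
def mask_tokens(tokens, mask_prob=0.15, seed=0):
    rng = random.Random(seed)
    return [("[MASK]" if rng.random() < mask_prob else tok) for tok in tokens]

tokens = "brad pitt is 56 years old".split()
masked = mask_tokens(tokens)
```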
This way, the model learns an inner representation of the English language used in tables and associated texts, which can then be used
to extract features useful for downstream tasks such as answering questions about a table, or determining whether a sentence is entailed
or refuted by the contents of a table. Fine-tuning is done by adding a cell selection head and an aggregation head on top of the pre-trained model, and then jointly training these randomly initialized classification heads with the base model on SQA, WikiSQL and finally WTQ.
## Intended uses & limitations
You can use this model for answering questions related to a table.
For code examples, we refer to the documentation of TAPAS on the Hugging Face website.
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Question [SEP] Flattened table [SEP]
```
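A minimal sketch of this flattening (hypothetical; the real TAPAS tokenizer applies WordPiece and tracks row and column indices for every cell) could look like:

```python
# Hypothetical sketch -- not the actual TAPAS tokenizer. Serializes a
# question and a table (column name -> list of cell values) into the
# "[CLS] Question [SEP] Flattened table [SEP]" form shown above.
def flatten_input(question, table):
    header = list(table.keys())
    rows = zip(*table.values())  # iterate the table row by row
    cells = header + [cell for row in rows for cell in row]
    return "[CLS] " + question + " [SEP] " + " ".join(cells) + " [SEP]"

table = {"Actor": ["Brad Pitt", "George Clooney"], "Age": ["56", "59"]}
print(flatten_input("How old is Brad Pitt?", table))
```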
The authors first converted the WTQ dataset into the format of SQA using automatic conversion scripts.
### Fine-tuning
The model was fine-tuned on 32 Cloud TPU v3 cores for 50,000 steps with maximum sequence length 512 and batch size of 512.
In this setup, fine-tuning takes around 10 hours. The optimizer used is Adam with a learning rate of 1.93581e-5, and a warmup
ratio of 0.128960. An inductive bias is added such that the model only selects cells of the same column. This is reflected by the
`select_one_column` parameter of `TapasConfig`. See the [paper](https://arxiv.org/abs/2004.02349) for more details (tables 11 and
12).
### BibTeX entry and citation info
```bibtex
@misc{herzig2020tapas,
title={TAPAS: Weakly Supervised Table Parsing via Pre-training},
author={Jonathan Herzig and Paweł Krzysztof Nowak and Thomas Müller and Francesco Piccinno and Julian Martin Eisenschlos},
year={2020},
eprint={2004.02349},
archivePrefix={arXiv},
primaryClass={cs.IR}
}
```
```bibtex
@misc{eisenschlos2020understanding,
title={Understanding tables with intermediate pre-training},
author={Julian Martin Eisenschlos and Syrine Krichene and Thomas Müller},
year={2020},
eprint={2010.00571},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@article{DBLP:journals/corr/PasupatL15,
author = {Panupong Pasupat and
Percy Liang},
title = {Compositional Semantic Parsing on Semi-Structured Tables},
journal = {CoRR},
volume = {abs/1508.00305},
year = {2015},
url = {http://arxiv.org/abs/1508.00305},
archivePrefix = {arXiv},
eprint = {1508.00305},
timestamp = {Mon, 13 Aug 2018 16:47:37 +0200},
biburl = {https://dblp.org/rec/journals/corr/PasupatL15.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 7,191 | [
[
-0.04742431640625,
-0.05401611328125,
0.016448974609375,
0.0229949951171875,
-0.033050537109375,
-0.00888824462890625,
-0.0027866363525390625,
-0.039398193359375,
0.04827880859375,
0.02752685546875,
-0.042022705078125,
-0.03131103515625,
-0.034576416015625,
0.000013172626495361328,
-0.0185089111328125,
0.08941650390625,
0.0028896331787109375,
0.0018835067749023438,
-0.00814056396484375,
-0.0205841064453125,
-0.027252197265625,
-0.0321044921875,
-0.015869140625,
-0.0200653076171875,
0.0357666015625,
0.035369873046875,
0.057769775390625,
0.04833984375,
0.0156402587890625,
0.0208282470703125,
-0.02166748046875,
0.00920867919921875,
-0.041778564453125,
-0.01255035400390625,
0.0005536079406738281,
-0.04241943359375,
-0.0309600830078125,
0.018280029296875,
0.0255279541015625,
0.055816650390625,
0.0034027099609375,
0.033538818359375,
0.0128631591796875,
0.051910400390625,
-0.038116455078125,
0.00841522216796875,
-0.06524658203125,
0.0155792236328125,
-0.0184478759765625,
-0.007450103759765625,
-0.0266265869140625,
-0.044921875,
0.0286712646484375,
-0.038543701171875,
0.01546478271484375,
-0.0011854171752929688,
0.1072998046875,
0.0074462890625,
-0.0183868408203125,
-0.0006356239318847656,
-0.038543701171875,
0.03839111328125,
-0.0540771484375,
0.02569580078125,
0.04095458984375,
0.0101318359375,
-0.01910400390625,
-0.06494140625,
-0.04901123046875,
-0.014404296875,
-0.019500732421875,
0.0028285980224609375,
-0.0072021484375,
-0.01264190673828125,
0.027862548828125,
0.045257568359375,
-0.03887939453125,
-0.0003364086151123047,
-0.037872314453125,
-0.0142059326171875,
0.03839111328125,
-0.005268096923828125,
0.02191162109375,
-0.01090240478515625,
-0.0325927734375,
-0.00865936279296875,
-0.06353759765625,
0.030731201171875,
0.01016998291015625,
0.025360107421875,
-0.032989501953125,
0.04693603515625,
-0.00643157958984375,
0.0595703125,
0.0177764892578125,
-0.0188446044921875,
0.045074462890625,
-0.0287017822265625,
-0.0274200439453125,
-0.0194091796875,
0.0673828125,
0.0085906982421875,
0.0228118896484375,
-0.00643157958984375,
-0.01029205322265625,
-0.00284576416015625,
0.0128021240234375,
-0.060089111328125,
-0.034332275390625,
0.0145111083984375,
-0.036529541015625,
-0.0233306884765625,
]
] |
EleutherAI/pythia-1.4b-deduped | 2023-06-08T13:03:28.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"en",
"dataset:EleutherAI/the_pile_deduplicated",
"arxiv:2304.01373",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-1.4b-deduped | 18 | 14,346 | transformers | 2023-02-09T21:42:04 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- EleutherAI/the_pile_deduplicated
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf).
It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. We also provide 154 intermediate
checkpoints per model, hosted on Hugging Face as branches.
The Pythia model suite was designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>
Previously, we released an early version of the Pythia suite to the public.
However, we decided to retrain the model suite to address a few hyperparameter
discrepancies. This model card <a href="#changelog">lists the changes</a>;
see appendix B in the Pythia paper for further discussion. We found no
difference in benchmark performance between the two Pythia versions.
The old models are
[still available](https://huggingface.co/models?other=pythia_v0), but we
suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
</details>
<br>
# Pythia-1.4B-deduped
## Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
[See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation
details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
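As a rough cross-check of the table above, the non-embedding counts can be reproduced from just the layer count and model dimension. This is a sketch assuming the standard GPT-NeoX parameterization (fused QKV and output projections with biases, a 4× MLP, two LayerNorms per layer, and one final LayerNorm); the function name is illustrative, not from the Pythia codebase.

```python
def non_embedding_params(layers: int, d: int) -> int:
    """Per layer: QKV projection (3d^2 + 3d), attention output (d^2 + d),
    MLP up (4d^2 + 4d), MLP down (4d^2 + d), two LayerNorms (4d);
    plus one final LayerNorm (2d) outside the stack."""
    per_layer = 12 * d * d + 13 * d
    return layers * per_layer + 2 * d

print(non_embedding_params(6, 512))    # 18,915,328      -> matches the 70M row
print(non_embedding_params(24, 2048))  # 1,208,602,624   -> matches the 1.4B row
```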
## Uses and Limitations
### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. We also provide
154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints
`step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to
`step143000`. These checkpoints are hosted on Hugging Face as branches. Note
that branch `143000` corresponds exactly to the model checkpoint on the `main`
branch of each model.
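For illustration, the 154 branch names described above can be enumerated directly; this is a sketch that only encodes the `step{N}` naming scheme stated in this card.

```python
# Checkpoint schedule: step0, ten log-spaced steps, then every 1000 steps.
log_spaced = [2 ** i for i in range(10)]            # 1, 2, 4, ..., 512
evenly_spaced = list(range(1_000, 143_001, 1_000))  # 1000, 2000, ..., 143000
steps = [0] + log_spaced + evenly_spaced
branches = [f"step{s}" for s in steps]

print(len(branches))  # 154 checkpoints per model
print(branches[-1])   # step143000, the same checkpoint as `main`
```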
You may also further fine-tune and adapt Pythia-1.4B-deduped for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-1.4B-deduped as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions. For example,
the model may generate harmful or offensive text. Please evaluate the risks
associated with your particular use case.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-1.4B-deduped has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose
or commercial chatbots. This means Pythia-1.4B-deduped will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “follow” human instructions.
### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token the model deems statistically most
likely need not produce the most “accurate” text. Never rely on
Pythia-1.4B-deduped to produce factually accurate
output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-1.4B-deduped may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting them to other people. Please inform your audience that the
text was generated by Pythia-1.4B-deduped.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
## Training
### Training data
Pythia-1.4B-deduped was trained on the Pile **after the dataset has been globally
deduplicated**.<br>
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).
### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training,
from `step1000` to `step143000` (which is the same as `main`). In addition, we
also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for 143000 steps at a batch size
of 2M (2,097,152 tokens).<br>
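The token budget above can be checked with a line of arithmetic; this sketch uses only the step count, batch size, and checkpoint interval already stated in this section.

```python
steps = 143_000              # total training steps (branch `step143000` == `main`)
tokens_per_step = 2_097_152  # 2M-token batch size
checkpoint_every = 1_000     # steps between the 143 evenly spaced checkpoints

total_tokens = steps * tokens_per_step
tokens_between_checkpoints = checkpoint_every * tokens_per_step

print(total_tokens)                # 299,892,736,000 tokens seen during training
print(tokens_between_checkpoints)  # 2,097,152,000 tokens between checkpoints
```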
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as
[GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
## Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>
## Changelog
This section compares differences between previously released
[Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current
models. See Appendix B of the Pythia paper for further discussion of these
changes and the motivation behind them. We found that retraining Pythia had no
impact on benchmark performance.
- All model sizes are now trained with uniform batch size of 2M tokens.
Previously, the models of size 160M, 410M, and 1.4B parameters were trained
with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,
128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all
models of size 2.8B parameters or smaller had a learning rate (LR) schedule
  which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and
12B models all used an LR schedule which decayed to a minimum LR of 0. In
the redone training runs, we rectified this inconsistency: all models now were
trained with LR decaying to a minimum of 0.1× their maximum LR.
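The corrected schedule in the last bullet can be sketched as a cosine decay to 0.1× the maximum LR. The cosine shape and the omission of warmup are assumptions for illustration here, not details taken from this card; `lr_at` is a hypothetical helper.

```python
import math

def lr_at(step: int, max_lr: float, total_steps: int, min_ratio: float = 0.1) -> float:
    """Cosine decay from max_lr at step 0 down to min_ratio * max_lr at total_steps."""
    cos_term = 0.5 * (1.0 + math.cos(math.pi * step / total_steps))
    return max_lr * (min_ratio + (1.0 - min_ratio) * cos_term)

# The 1.4B models use a maximum LR of 2.0e-4 (see the hyperparameter table above).
print(lr_at(0, 2.0e-4, 143_000))        # max LR at step 0
print(lr_at(143_000, 2.0e-4, 143_000))  # decays to 0.1 x max LR at the final step
```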
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 13,663 | [
[
-0.03057861328125,
0.0174713134765625,
-0.003627777099609375,
-0.005619049072265625,
-0.022674560546875,
-0.01361846923828125,
0.0125274658203125,
0.01543426513671875,
-0.00244140625,
-0.01287841796875,
-0.0010213851928710938,
-0.0662841796875,
0.004131317138671875,
0.0165252685546875,
-0.010711669921875,
-0.03070068359375,
0.044189453125,
0.00293731689453125,
-0.01412200927734375,
0.085205078125,
-0.019744873046875,
-0.0517578125,
0.057861328125,
0.038238525390625,
0.05535888671875,
-0.01418304443359375,
0.026885986328125,
0.06915283203125,
0.0246429443359375,
-0.0152435302734375,
0.005096435546875,
0.006542205810546875,
-0.0390625,
-0.00799560546875,
-0.0618896484375,
-0.017669677734375,
0.01934814453125,
-0.043609619140625,
0.033905029296875,
-0.04840087890625,
-0.006015777587890625,
-0.002727508544921875,
0.0166168212890625,
-0.04364013671875,
0.023590087890625,
0.01331329345703125,
0.054412841796875,
-0.0689697265625,
0.0621337890625,
0.048126220703125,
-0.056854248046875,
-0.08294677734375,
0.0020809173583984375,
0.0012378692626953125,
-0.032196044921875,
0.0142974853515625,
0.016265869140625,
0.01505279541015625,
0.013092041015625,
-0.0221710205078125,
-0.06646728515625,
0.09759521484375,
0.016998291015625,
-0.049041748046875,
-0.020599365234375,
-0.00920867919921875,
0.040802001953125,
0.004451751708984375,
0.05401611328125,
0.05450439453125,
0.030670166015625,
0.006481170654296875,
-0.08038330078125,
0.0283355712890625,
-0.02435302734375,
-0.005397796630859375,
0.0175628662109375,
-0.0499267578125,
0.0994873046875,
-0.00547027587890625,
-0.0025272369384765625,
0.030609130859375,
0.044281005859375,
0.030242919921875,
-0.00963592529296875,
0.0272369384765625,
0.058807373046875,
0.06640625,
-0.02789306640625,
0.09326171875,
-0.022491455078125,
0.058258056640625,
0.065185546875,
0.014312744140625,
0.038543701171875,
0.0301971435546875,
-0.029754638671875,
0.03997802734375,
0.062286376953125,
-0.00571441650390625,
0.0135650634765625,
0.01947021484375,
-0.0214385986328125,
-0.0200958251953125,
0.01035308837890625,
-0.045166015625,
0.014556884765625,
0.0108489990234375,
-0.04278564453125,
-0.016265869140625,
-0.0252685546875,
0.0263824462890625,
-0.030792236328125,
-0.0175933837890625,
0.0196533203125,
0.006587982177734375,
-0.04974365234375,
0.0482177734375,
0.0175628662109375,
0.0421142578125,
-0.03466796875,
0.01103973388671875,
-0.01262664794921875,
0.0246124267578125,
-0.0257415771484375,
-0.032196044921875,
0.006038665771484375,
0.00017023086547851562,
0.0048980712890625,
0.00897216796875,
0.03240966796875,
-0.0103912353515625,
-0.0428466796875,
0.01397705078125,
0.0357666015625,
0.0191497802734375,
-0.033538818359375,
-0.05096435546875,
0.006359100341796875,
-0.0121917724609375,
-0.040191650390625,
0.032318115234375,
0.0189971923828125,
-0.00955963134765625,
0.044708251953125,
0.047607421875,
0.0025463104248046875,
0.0002548694610595703,
0.01116943359375,
0.07421875,
-0.03521728515625,
-0.036224365234375,
-0.06890869140625,
0.037841796875,
-0.000637054443359375,
-0.050689697265625,
0.065185546875,
0.040771484375,
0.05224609375,
0.0194549560546875,
0.04534912109375,
-0.034759521484375,
0.0009484291076660156,
-0.0220184326171875,
0.0501708984375,
-0.037933349609375,
0.00390625,
-0.03839111328125,
-0.0853271484375,
-0.0036373138427734375,
0.07232666015625,
-0.0386962890625,
0.029449462890625,
0.06060791015625,
0.060821533203125,
-0.00650787353515625,
0.007534027099609375,
0.0038738250732421875,
0.022979736328125,
0.04010009765625,
0.0694580078125,
0.067138671875,
-0.053192138671875,
0.041168212890625,
-0.038665771484375,
-0.0204315185546875,
-0.01110076904296875,
-0.037933349609375,
-0.06451416015625,
-0.035491943359375,
-0.038360595703125,
-0.05633544921875,
-0.0023403167724609375,
0.067626953125,
0.055419921875,
-0.046600341796875,
-0.01100921630859375,
-0.04046630859375,
0.003932952880859375,
-0.0200958251953125,
-0.0175933837890625,
0.031646728515625,
0.00856781005859375,
-0.0711669921875,
-0.002880096435546875,
-0.01190185546875,
0.007488250732421875,
-0.0321044921875,
-0.0216217041015625,
-0.0155029296875,
-0.007843017578125,
0.005908966064453125,
0.0238800048828125,
-0.03887939453125,
-0.0204315185546875,
0.0017366409301757812,
0.0036411285400390625,
0.0000540614128112793,
0.052947998046875,
-0.044097900390625,
0.00952911376953125,
0.04693603515625,
0.0081787109375,
0.06182861328125,
-0.020111083984375,
0.030426025390625,
-0.020751953125,
0.02618408203125,
0.021240234375,
0.04766845703125,
0.0247344970703125,
-0.018829345703125,
0.01294708251953125,
0.0306854248046875,
-0.055694580078125,
-0.06524658203125,
0.0273590087890625,
-0.05426025390625,
-0.00739288330078125,
0.09521484375,
-0.0204925537109375,
-0.0289764404296875,
0.005420684814453125,
-0.0163421630859375,
0.040771484375,
-0.0200958251953125,
0.049530029296875,
0.047607421875,
0.005573272705078125,
-0.01522064208984375,
-0.04876708984375,
0.028076171875,
0.0501708984375,
-0.061798095703125,
0.02899169921875,
0.0469970703125,
0.046539306640625,
0.0188751220703125,
0.044952392578125,
-0.022735595703125,
0.045806884765625,
0.00817108154296875,
0.0066680908203125,
0.0015325546264648438,
-0.03521728515625,
-0.032623291015625,
-0.0108489990234375,
0.0166473388671875,
0.001129150390625
]
] |
kakaobrain/karlo-v1-alpha-image-variations | 2023-01-31T08:27:48.000Z | [
"diffusers",
"text-to-image",
"license:creativeml-openrail-m",
"has_space",
"diffusers:UnCLIPImageVariationPipeline",
"region:us"
] | text-to-image | kakaobrain | null | null | kakaobrain/karlo-v1-alpha-image-variations | 5 | 14,341 | diffusers | 2023-01-30T19:46:46 | ---
license: creativeml-openrail-m
tags:
- text-to-image
---
# Karlo v1 alpha
Karlo is a text-conditional image generation model based on OpenAI's unCLIP architecture, with an improved super-resolution module that upscales images from 64px to 256px while recovering high-frequency details in only a small number of denoising steps.
* [Original codebase](https://github.com/kakaobrain/karlo)
## Usage
Karlo is available in diffusers!
```bash
pip install diffusers transformers accelerate safetensors
```
### Text to image
```python
from diffusers import UnCLIPPipeline
import torch
pipe = UnCLIPPipeline.from_pretrained("kakaobrain/karlo-v1-alpha", torch_dtype=torch.float16)
pipe = pipe.to('cuda')
prompt = "a high-resolution photograph of a big red frog on a green leaf."
image = pipe([prompt]).images[0]
image.save("./frog.png")
```

### Image variation
```python
from diffusers import UnCLIPImageVariationPipeline
import torch
from PIL import Image
pipe = UnCLIPImageVariationPipeline.from_pretrained("kakaobrain/karlo-v1-alpha-image-variations", torch_dtype=torch.float16)
pipe = pipe.to('cuda')
image = Image.open("./frog.png")
image = pipe(image).images[0]
image.save("./frog-variation.png")
```

## Model Architecture
### Overview
Karlo is a text-conditional diffusion model based on unCLIP, composed of prior, decoder, and super-resolution modules. In this repository, we include an improved version of the standard super-resolution module that upscales from 64px to 256px in only 7 reverse steps, as illustrated in the figure below:
<p float="left">
<img src="https://raw.githubusercontent.com/kakaobrain/karlo/main/assets/improved_sr_arch.jpg"/>
</p>
Specifically, the standard SR module, trained with the DDPM objective, upscales from 64px to 256px in the first 6 denoising steps using the respacing technique. An additional SR module, fine-tuned with a [VQ-GAN](https://compvis.github.io/taming-transformers/)-style loss, then performs the final reverse step to recover high-frequency details. We observe that this approach is very effective at upscaling low-resolution images in a small number of reverse steps.
### Details
We train all components from scratch on 115M image-text pairs, including COYO-100M, CC3M, and CC12M. For the prior and decoder, we use ViT-L/14 provided by OpenAI's [CLIP repository](https://github.com/openai/CLIP). Unlike the original unCLIP implementation, we replace the trainable transformer in the decoder with the text encoder of ViT-L/14 for efficiency. For the SR module, we first train the model with the DDPM objective for 1M steps, followed by an additional 234K steps to fine-tune the extra component. The table below summarizes the important statistics of our components:
| | Prior | Decoder | SR |
|:------|----:|----:|----:|
| CLIP | ViT-L/14 | ViT-L/14 | - |
| #param | 1B | 900M | 700M + 700M |
| #optimization steps | 1M | 1M | 1M + 0.2M |
| #sampling steps | 25 | 50 (default), 25 (fast) | 7 |
|Checkpoint links| [ViT-L-14](https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/096db1af569b284eb76b3881534822d9/ViT-L-14.pt), [ViT-L-14 stats](https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/0b62380a75e56f073e2844ab5199153d/ViT-L-14_stats.th), [model](https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/efdf6206d8ed593961593dc029a8affa/decoder-ckpt-step%3D01000000-of-01000000.ckpt) | [model](https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/85626483eaca9f581e2a78d31ff905ca/prior-ckpt-step%3D01000000-of-01000000.ckpt) | [model](https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/4226b831ae0279020d134281f3c31590/improved-sr-ckpt-step%3D1.2M.ckpt) |
In the checkpoint links, ViT-L-14 is identical to the original OpenAI version; we include it for convenience. Note that ViT-L-14 stats is required to normalize the outputs of the prior module.
### Evaluation
We quantitatively measure the performance of Karlo-v1.0.alpha on the validation splits of CC3M and MS-COCO. The table below presents CLIP-score and FID. To measure FID, we resize the shorter side of each image to 256px and then crop it at the center. We set the classifier-free guidance scales for the prior and decoder to 4 and 8, respectively, in all cases. We observe that our model achieves reasonable performance even with 25 decoder sampling steps.
CC3M
| Sampling step | CLIP-s (ViT-B/16) | FID (13k from val)|
|:------|----:|----:|
| Prior (25) + Decoder (25) + SR (7) | 0.3081 | 14.37 |
| Prior (25) + Decoder (50) + SR (7) | 0.3086 | 13.95 |
MS-COCO
| Sampling step | CLIP-s (ViT-B/16) | FID (30k from val)|
|:------|----:|----:|
| Prior (25) + Decoder (25) + SR (7) | 0.3192 | 15.24 |
| Prior (25) + Decoder (50) + SR (7) | 0.3192 | 14.43 |
For more information, please refer to the upcoming technical report.
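The sampling-step settings evaluated above can be passed directly to the diffusers pipeline. The sketch below maps the table's "fast" and "default" configurations onto `UnCLIPPipeline` call arguments; the keyword names (`prior_num_inference_steps`, `decoder_num_inference_steps`, `super_res_num_inference_steps`) reflect the diffusers API at the time of writing, so verify them against your installed version.

```python
# Step-count settings from the evaluation table above, expressed as
# keyword arguments for diffusers' UnCLIPPipeline.

# "fast" setting: Prior (25) + Decoder (25) + SR (7)
fast_sampling = {
    "prior_num_inference_steps": 25,
    "decoder_num_inference_steps": 25,
    "super_res_num_inference_steps": 7,
}

# "default" setting: Prior (25) + Decoder (50) + SR (7)
default_sampling = {
    "prior_num_inference_steps": 25,
    "decoder_num_inference_steps": 50,
    "super_res_num_inference_steps": 7,
}


def generate(prompt: str, settings: dict):
    """Load the pipeline and sample with the given step settings.

    Defined but not called here, since it downloads the full model weights.
    """
    import torch
    from diffusers import UnCLIPPipeline

    pipe = UnCLIPPipeline.from_pretrained(
        "kakaobrain/karlo-v1-alpha", torch_dtype=torch.float16
    ).to("cuda")
    return pipe([prompt], **settings).images[0]
```

In practice the fast setting trades a small amount of FID (see the table) for roughly half the decoder compute.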
### Training Details
This alpha version of Karlo is trained on 115M image-text pairs,
including a high-quality subset of [COYO](https://github.com/kakaobrain/coyo-dataset)-100M, CC3M, and CC12M.
If you are interested in a better version of Karlo trained on larger-scale, high-quality datasets,
please visit the landing page of our application [B^DISCOVER](https://bdiscover.kakaobrain.com/).
## BibTeX
If you find this repository useful in your research, please cite:
```
@misc{kakaobrain2022karlo-v1-alpha,
title = {Karlo-v1.0.alpha on COYO-100M and CC15M},
  author = {Donghoon Lee and Jiseob Kim and Jisu Choi and Jongmin Kim and Minwoo Byeon and Woonhyuk Baek and Saehoon Kim},
year = {2022},
howpublished = {\url{https://github.com/kakaobrain/karlo}},
}
```
| 5,832 | [
[
-0.03717041015625,
-0.036529541015625,
0.0308837890625,
0.001712799072265625,
-0.044830322265625,
-0.03314208984375,
-0.027130126953125,
-0.04827880859375,
0.01050567626953125,
0.0330810546875,
-0.042388916015625,
-0.053009033203125,
-0.056365966796875,
0.027374267578125,
-0.005138397216796875,
0.07611083984375,
-0.003173828125,
0.01849365234375,
-0.00308990478515625,
-0.002315521240234375,
-0.027313232421875,
-0.023681640625,
-0.056396484375,
-0.0232696533203125,
0.0103912353515625,
0.0140533447265625,
0.034515380859375,
0.050537109375,
0.029266357421875,
0.024627685546875,
-0.0256805419921875,
0.00243377685546875,
-0.031402587890625,
-0.016082763671875,
0.0102386474609375,
-0.01418304443359375,
-0.053985595703125,
-0.0253753662109375,
0.056915283203125,
0.00981903076171875,
0.0018091201782226562,
0.00522613525390625,
0.0020771026611328125,
0.0517578125,
-0.049774169921875,
0.01482391357421875,
-0.038604736328125,
-0.003948211669921875,
0.01025390625,
-0.0017242431640625,
-0.041717529296875,
-0.01480865478515625,
0.0179901123046875,
-0.0628662109375,
0.017059326171875,
-0.006702423095703125,
0.1009521484375,
0.0013523101806640625,
-0.0303497314453125,
-0.01116943359375,
-0.04425048828125,
0.07427978515625,
-0.073486328125,
0.0291290283203125,
0.03570556640625,
0.025146484375,
0.00608062744140625,
-0.04766845703125,
-0.0430908203125,
0.0022487640380859375,
-0.0172271728515625,
0.0179290771484375,
-0.0255584716796875,
0.009521484375,
0.0193939208984375,
0.01739501953125,
-0.044342041015625,
-0.004985809326171875,
-0.045135498046875,
-0.025787353515625,
0.049041748046875,
0.0037841796875,
0.0073699951171875,
-0.031829833984375,
-0.0361328125,
-0.0281524658203125,
-0.0328369140625,
-0.00843048095703125,
0.0361328125,
0.00021207332611083984,
-0.01558685302734375,
0.01371002197265625,
-0.0186920166015625,
0.06329345703125,
0.004638671875,
-0.02142333984375,
0.02154541015625,
-0.03912353515625,
-0.02276611328125,
0.005767822265625,
0.07012939453125,
0.032012939453125,
0.005401611328125,
0.019317626953125,
-0.01009368896484375,
0.0002586841583251953,
0.0143890380859375,
-0.0789794921875,
-0.0246429443359375,
0.00420379638671875,
-0.039947509765625,
-0.01355743408203125,
-0.0007233619689941406,
-0.07757568359375,
0.0136260986328125,
-0.00940704345703125,
0.054534912109375,
-0.061614990234375,
-0.00800323486328125,
0.0179443359375,
-0.00832366943359375,
0.021270751953125,
0.0400390625,
-0.0670166015625,
0.0218048095703125,
0.013580322265625,
0.053466796875,
0.007843017578125,
-0.01357269287109375,
-0.00890350341796875,
-0.00986480712890625,
-0.0285797119140625,
0.062164306640625,
-0.0184173583984375,
-0.017608642578125,
-0.01380157470703125,
0.01064300537109375,
0.0004634857177734375,
-0.033538818359375,
0.0601806640625,
-0.051055908203125,
0.0305023193359375,
-0.0185089111328125,
-0.024139404296875,
-0.04156494140625,
0.00920867919921875,
-0.03411865234375,
0.0555419921875,
0.021697998046875,
-0.06268310546875,
0.0291900634765625,
-0.02154541015625,
-0.0010824203491210938,
-0.0051422119140625,
0.00821685791015625,
-0.04656982421875,
-0.0113372802734375,
0.03131103515625,
0.02520751953125,
-0.020538330078125,
0.007289886474609375,
-0.0189361572265625,
-0.03125,
0.0103912353515625,
-0.0235748291015625,
0.067626953125,
0.018341064453125,
-0.00925445556640625,
0.006099700927734375,
-0.07574462890625,
0.0101165771484375,
0.01445770263671875,
-0.03009033203125,
-0.032470703125,
-0.016937255859375,
-0.0011539459228515625,
0.018035888671875,
0.02337646484375,
-0.036590576171875,
0.0097198486328125,
-0.0234222412109375,
0.035400390625,
0.0579833984375,
0.00606536865234375,
0.0273284912109375,
-0.02105712890625,
0.029449462890625,
-0.0028934478759765625,
0.03582763671875,
-0.033050537109375,
-0.043060302734375,
-0.05926513671875,
-0.05120849609375,
0.0428466796875,
0.04339599609375,
-0.045684814453125,
0.0477294921875,
-0.02703857421875,
-0.042144775390625,
-0.052032470703125,
-0.0078582763671875,
0.055023193359375,
0.0469970703125,
0.040374755859375,
-0.0400390625,
-0.047271728515625,
-0.08209228515625,
0.01434326171875,
0.0016393661499023438,
-0.0091400146484375,
0.030181884765625,
0.05340576171875,
-0.0238037109375,
0.04583740234375,
-0.054046630859375,
-0.0203399658203125,
-0.0126800537109375,
0.0091552734375,
0.0210113525390625,
0.04119873046875,
0.0380859375,
-0.050537109375,
-0.0460205078125,
0.0025463104248046875,
-0.067138671875,
0.001983642578125,
0.0032176971435546875,
-0.00936126708984375,
0.044830322265625,
0.0115203857421875,
-0.0416259765625,
0.03271484375,
0.041290283203125,
-0.02288818359375,
0.045745849609375,
-0.038543701171875,
0.0262298583984375,
-0.09185791015625,
0.0304107666015625,
0.006534576416015625,
-0.0193023681640625,
-0.05810546875,
-0.002079010009765625,
0.007427215576171875,
-0.005558013916015625,
-0.042144775390625,
0.026092529296875,
-0.040283203125,
-0.004390716552734375,
-0.017059326171875,
-0.00395965576171875,
0.0123291015625,
0.0614013671875,
0.0198822021484375,
0.048736572265625,
0.047393798828125,
-0.049957275390625,
0.006099700927734375,
0.0303802490234375,
-0.036224365234375,
0.0263519287109375,
-0.0721435546875,
0.01910400390625,
-0.023773193359375,
0.0216064453125,
-0.06524658203125,
-0.00931549072265625,
0.03497314453125,
-0.04547119140625,
0.03466796875,
-0.02880859375,
-0.037200927734375,
-0.0201568603515625,
-0.030609130859375,
0.04290771484375,
0.060577392578125,
-0.030975341796875,
0.0211029052734375,
0.01534271240234375,
0.002105712890625,
-0.042388916015625,
-0.04095458984375,
-0.012542724609375,
-0.01117706298828125,
-0.050140380859375,
0.037017822265625,
-0.01258087158203125,
0.0088043212890625,
0.009735107421875,
-0.0027904510498046875,
-0.0158233642578125,
-0.03106689453125,
0.003040313720703125,
0.038238525390625,
-0.0072021484375,
-0.01910400390625,
0.0179443359375,
-0.0172119140625,
-0.006740570068359375,
-0.022430419921875,
0.047821044921875,
-0.00909423828125,
0.019012451171875,
-0.064697265625,
0.019012451171875,
0.03582763671875,
0.006011962890625,
0.046539306640625,
0.03887939453125,
-0.03143310546875,
0.0035457611083984375,
-0.011505126953125,
-0.0179901123046875,
-0.037567138671875,
0.02740478515625,
-0.02423095703125,
-0.047271728515625,
0.04718017578125,
0.01117706298828125,
-0.01531982421875,
0.04315185546875,
0.0143890380859375,
0.00922393798828125,
0.0869140625,
0.044219970703125,
0.02142333984375,
0.03424072265625,
-0.059326171875,
0.0012979507446289062,
-0.07940673828125,
-0.0263214111328125,
-0.0201568603515625,
-0.0087432861328125,
-0.033721923828125,
-0.022705078125,
0.0304107666015625,
0.029205322265625,
-0.0118865966796875,
0.054443359375,
-0.046600341796875,
0.02349853515625,
0.041473388671875,
0.024017333984375,
0.019561767578125,
0.0093841552734375,
0.00396728515625,
-0.0173797607421875,
-0.0594482421875,
-0.0367431640625,
0.08782958984375,
0.01849365234375,
0.0494384765625,
-0.01160430908203125,
0.037506103515625,
0.014404296875,
-0.0012073516845703125,
-0.04345703125,
0.0648193359375,
-0.00789642333984375,
-0.044342041015625,
-0.0175933837890625,
-0.033721923828125,
-0.06744384765625,
0.0240936279296875,
-0.0361328125,
-0.0528564453125,
0.037811279296875,
0.0261077880859375,
-0.0253753662109375,
0.0164794921875,
-0.047454833984375,
0.0693359375,
0.00815582275390625,
-0.0226898193359375,
-0.024810791015625,
-0.051788330078125,
0.029876708984375,
-0.00615692138671875,
-0.004268646240234375,
0.013275146484375,
0.004055023193359375,
0.061309814453125,
-0.0404052734375,
0.05096435546875,
0.0006031990051269531,
-0.00689697265625,
0.03985595703125,
-0.01213836669921875,
0.03448486328125,
0.00830841064453125,
0.000850677490234375,
0.049041748046875,
0.0119476318359375,
-0.0271148681640625,
-0.039459228515625,
0.05181884765625,
-0.06866455078125,
-0.0168914794921875,
-0.0438232421875,
-0.0306396484375,
0.028106689453125,
0.03106689453125,
0.06060791015625,
0.04229736328125,
0.00920867919921875,
0.0026302337646484375,
0.0595703125,
-0.023834228515625,
0.0277557373046875,
0.0286407470703125,
-0.00673675537109375,
-0.081787109375,
0.0628662109375,
0.0133514404296875,
0.025146484375,
0.03167724609375,
0.0046234130859375,
-0.0103912353515625,
-0.04180908203125,
-0.041046142578125,
0.036529541015625,
-0.05810546875,
-0.031219482421875,
-0.035888671875,
-0.01373291015625,
-0.04376220703125,
-0.0139312744140625,
-0.0408935546875,
-0.026519775390625,
-0.03179931640625,
0.0133819580078125,
0.0263519287109375,
0.0297088623046875,
-0.0048065185546875,
0.0289154052734375,
-0.056488037109375,
0.005268096923828125,
0.0309906005859375,
0.01525115966796875,
0.005901336669921875,
-0.0673828125,
-0.0305633544921875,
0.00507354736328125,
-0.030303955078125,
-0.0733642578125,
0.05841064453125,
0.00536346435546875,
0.0243072509765625,
0.038970947265625,
0.0004987716674804688,
0.0521240234375,
-0.030029296875,
0.055938720703125,
0.017913818359375,
-0.07061767578125,
0.059844970703125,
-0.0309600830078125,
0.029022216796875,
0.039825439453125,
0.04766845703125,
-0.0163116455078125,
-0.0034694671630859375,
-0.05029296875,
-0.06915283203125,
0.0858154296875,
0.02398681640625,
-0.008331298828125,
0.01554107666015625,
0.035858154296875,
0.01224517822265625,
0.006275177001953125,
-0.0679931640625,
-0.01528167724609375,
-0.02423095703125,
-0.00433349609375,
-0.00814056396484375,
-0.0328369140625,
-0.0032711029052734375,
-0.043609619140625,
0.069091796875,
0.0035495758056640625,
0.060699462890625,
0.043701171875,
-0.00811004638671875,
-0.01494598388671875,
0.01180267333984375,
0.07470703125,
0.04315185546875,
-0.038787841796875,
0.001194000244140625,
0.0153045654296875,
-0.04730224609375,
-0.0019063949584960938,
0.004283905029296875,
-0.0430908203125,
0.0105743408203125,
0.0379638671875,
0.09185791015625,
0.01116180419921875,
-0.04766845703125,
0.03338623046875,
-0.007640838623046875,
-0.039154052734375,
-0.0245513916015625,
0.003505706787109375,
-0.0262603759765625,
0.0151824951171875,
0.0253448486328125,
0.0134429931640625,
0.01535797119140625,
-0.03387451171875,
0.0035400390625,
0.0164794921875,
-0.0232086181640625,
-0.056304931640625,
0.04730224609375,
-0.004192352294921875,
-0.02154541015625,
0.04266357421875,
-0.030303955078125,
-0.03607177734375,
0.04156494140625,
0.0609130859375,
0.06048583984375,
-0.016876220703125,
0.0198822021484375,
0.0697021484375,
0.019012451171875,
-0.01343536376953125,
0.02398681640625,
-0.0036754608154296875,
-0.04888916015625,
-0.0262603759765625,
-0.04443359375,
-0.005687713623046875,
0.03106689453125,
-0.055938720703125,
0.034912109375,
-0.0276641845703125,
-0.0205078125,
-0.0108184814453125,
0.01047515869140625,
-0.0271759033203125,
0.019317626953125,
-0.0016613006591796875,
0.06524658203125,
-0.0682373046875,
0.07598876953125,
0.05853271484375,
-0.0191192626953125,
-0.034759521484375,
0.0016574859619140625,
-0.0286712646484375,
-0.04248046875,
0.041534423828125,
0.016815185546875,
0.0078125,
0.00667572021484375,
-0.0382080078125,
-0.0701904296875,
0.1016845703125,
0.033447265625,
-0.0309906005859375,
0.0063018798828125,
-0.01947021484375,
0.051361083984375,
-0.023681640625,
0.0191650390625,
0.010833740234375,
0.0277557373046875,
0.00806427001953125,
-0.0587158203125,
0.014404296875,
-0.0149688720703125,
0.0197601318359375,
0.0247650146484375,
-0.0625,
0.0631103515625,
-0.019439697265625,
-0.0228271484375,
0.00616455078125,
0.051116943359375,
0.02642822265625,
0.0279388427734375,
0.03924560546875,
0.065673828125,
0.0233306884765625,
0.0166168212890625,
0.06988525390625,
-0.032928466796875,
0.033447265625,
0.0499267578125,
0.014862060546875,
0.05462646484375,
0.0247955322265625,
-0.0072174072265625,
0.03662109375,
0.04345703125,
-0.0081634521484375,
0.07672119140625,
-0.00821685791015625,
-0.0186614990234375,
-0.0218505859375,
0.01111602783203125,
-0.032135009765625,
0.0275726318359375,
0.003997802734375,
-0.03387451171875,
-0.0091400146484375,
0.006565093994140625,
0.0166778564453125,
0.00034546852111816406,
-0.0223846435546875,
0.03765869140625,
-0.003326416015625,
-0.059295654296875,
0.0689697265625,
0.01151275634765625,
0.060028076171875,
-0.059600830078125,
-0.00785064697265625,
-0.0103912353515625,
0.0158843994140625,
0.0038013458251953125,
-0.039154052734375,
0.02392578125,
-0.004657745361328125,
-0.01392364501953125,
-0.01250457763671875,
0.06072998046875,
-0.025115966796875,
-0.03466796875,
0.031829833984375,
-0.0213470458984375,
0.0264434814453125,
0.01396942138671875,
-0.0419921875,
0.026580810546875,
0.007656097412109375,
-0.0198822021484375,
0.02886962890625,
0.01284027099609375,
0.011383056640625,
0.03558349609375,
0.0362548828125,
0.01349639892578125,
0.0005779266357421875,
-0.03466796875,
0.0609130859375,
-0.03411865234375,
-0.0201873779296875,
-0.059478759765625,
0.0219573974609375,
-0.0097503662109375,
-0.0439453125,
0.061767578125,
0.046630859375,
0.07891845703125,
-0.028289794921875,
0.0306396484375,
-0.0020656585693359375,
0.0061798095703125,
-0.051116943359375,
0.044342041015625,
-0.045928955078125,
0.007537841796875,
-0.036834716796875,
-0.060760498046875,
-0.014434814453125,
0.060211181640625,
-0.021820068359375,
0.004581451416015625,
0.04644775390625,
0.061981201171875,
-0.0013933181762695312,
-0.0164031982421875,
0.02044677734375,
0.0144195556640625,
0.0009493827819824219,
0.062744140625,
0.02740478515625,
-0.08056640625,
0.059051513671875,
-0.03192138671875,
-0.0200958251953125,
-0.01496124267578125,
-0.056304931640625,
-0.06243896484375,
-0.05694580078125,
-0.0390625,
-0.0253753662109375,
0.0208587646484375,
0.041900634765625,
0.072021484375,
-0.04656982421875,
-0.002902984619140625,
-0.012542724609375,
0.001590728759765625,
-0.035247802734375,
-0.01434326171875,
0.031280517578125,
0.005275726318359375,
-0.070556640625,
0.007183074951171875,
0.01467132568359375,
0.0175628662109375,
0.0008344650268554688,
-0.02764892578125,
-0.0228271484375,
0.01129913330078125,
0.047332763671875,
0.0280609130859375,
-0.03411865234375,
-0.016448974609375,
0.0031604766845703125,
0.00229644775390625,
0.0151214599609375,
0.029022216796875,
-0.06329345703125,
0.031768798828125,
0.06439208984375,
0.0034084320068359375,
0.0589599609375,
-0.005092620849609375,
0.0167999267578125,
-0.038360595703125,
0.014923095703125,
0.004650115966796875,
0.0125579833984375,
0.01318359375,
-0.035064697265625,
0.02191162109375,
0.038177490234375,
-0.060546875,
-0.0704345703125,
0.0035839080810546875,
-0.09881591796875,
-0.035247802734375,
0.08941650390625,
-0.0116119384765625,
-0.03826904296875,
0.0196533203125,
-0.0162506103515625,
0.0255889892578125,
-0.042266845703125,
0.0601806640625,
0.04193115234375,
-0.002788543701171875,
-0.032501220703125,
-0.042724609375,
0.0362548828125,
0.01873779296875,
-0.040008544921875,
-0.01421356201171875,
0.039459228515625,
0.030914306640625,
0.0276947021484375,
0.07366943359375,
-0.021636962890625,
0.0157470703125,
0.0008783340454101562,
0.0293731689453125,
-0.0044708251953125,
-0.04132080078125,
-0.04534912109375,
-0.007503509521484375,
-0.023651123046875,
-0.0269622802734375
]
] |
HooshvareLab/bert-fa-zwnj-base-ner | 2021-05-18T21:04:35.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"token-classification",
"fa",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | HooshvareLab | null | null | HooshvareLab/bert-fa-zwnj-base-ner | 3 | 14,310 | transformers | 2022-03-02T23:29:04 | ---
language: fa
---
# BertNER
This model is fine-tuned for the Named Entity Recognition (NER) task on a mixed NER dataset collected from [ARMAN](https://github.com/HaniehP/PersianNER), [PEYMA](http://nsurl.org/2019-2/tasks/task-7-named-entity-recognition-ner-for-farsi/), and [WikiANN](https://elisa-ie.github.io/wikiann/), which covers ten types of entities:
- Date (DAT)
- Event (EVE)
- Facility (FAC)
- Location (LOC)
- Money (MON)
- Organization (ORG)
- Percent (PCT)
- Person (PER)
- Product (PRO)
- Time (TIM)
## Dataset Information
| | Records | B-DAT | B-EVE | B-FAC | B-LOC | B-MON | B-ORG | B-PCT | B-PER | B-PRO | B-TIM | I-DAT | I-EVE | I-FAC | I-LOC | I-MON | I-ORG | I-PCT | I-PER | I-PRO | I-TIM |
|:------|----------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|--------:|
| Train | 29133 | 1423 | 1487 | 1400 | 13919 | 417 | 15926 | 355 | 12347 | 1855 | 150 | 1947 | 5018 | 2421 | 4118 | 1059 | 19579 | 573 | 7699 | 1914 | 332 |
| Valid | 5142 | 267 | 253 | 250 | 2362 | 100 | 2651 | 64 | 2173 | 317 | 19 | 373 | 799 | 387 | 717 | 270 | 3260 | 101 | 1382 | 303 | 35 |
| Test | 6049 | 407 | 256 | 248 | 2886 | 98 | 3216 | 94 | 2646 | 318 | 43 | 568 | 888 | 408 | 858 | 263 | 3967 | 141 | 1707 | 296 | 78 |
## Evaluation
The following tables summarize the scores obtained by the model, both overall and per entity class.
**Overall**
| Model | accuracy | precision | recall | f1 |
|:----------:|:--------:|:---------:|:--------:|:--------:|
| Bert | 0.995086 | 0.953454 | 0.961113 | 0.957268 |
**Per entities**
| | number | precision | recall | f1 |
|:---: |:------: |:---------: |:--------: |:--------: |
| DAT | 407 | 0.860636 | 0.864865 | 0.862745 |
| EVE | 256 | 0.969582 | 0.996094 | 0.982659 |
| FAC | 248 | 0.976190 | 0.991935 | 0.984000 |
| LOC | 2884 | 0.970232 | 0.971914 | 0.971072 |
| MON | 98 | 0.905263 | 0.877551 | 0.891192 |
| ORG | 3216 | 0.939125 | 0.954602 | 0.946800 |
| PCT | 94 | 1.000000 | 0.968085 | 0.983784 |
| PER | 2645 | 0.965244 | 0.965974 | 0.965608 |
| PRO | 318 | 0.981481 | 1.000000 | 0.990654 |
| TIM | 43 | 0.692308 | 0.837209 | 0.757895 |
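Each per-entity f1 above is the harmonic mean of the corresponding precision and recall. A quick sanity check against the DAT and TIM rows:

```python
# Verify that the reported per-entity f1 scores are the harmonic mean
# of precision and recall, using two rows from the table above.

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

dat_f1 = f1_score(0.860636, 0.864865)  # DAT row -> ~0.862745
tim_f1 = f1_score(0.692308, 0.837209)  # TIM row -> ~0.757895
```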
## How To Use
You can use this model with the Transformers pipeline for NER.
### Installing requirements
```bash
pip install transformers
```
### How to predict using pipeline
```python
from transformers import AutoTokenizer
from transformers import AutoModelForTokenClassification # for pytorch
from transformers import TFAutoModelForTokenClassification # for tensorflow
from transformers import pipeline
model_name_or_path = "HooshvareLab/bert-fa-zwnj-base-ner"
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
model = AutoModelForTokenClassification.from_pretrained(model_name_or_path) # Pytorch
# model = TFAutoModelForTokenClassification.from_pretrained(model_name_or_path) # Tensorflow
nlp = pipeline("ner", model=model, tokenizer=tokenizer)
example = "در سال ۲۰۱۳ درگذشت و آندرتیکر و کین برای او مراسم یادبود گرفتند."
ner_results = nlp(example)
print(ner_results)
```
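The pipeline above returns one prediction per token, tagged with the B-/I- scheme listed earlier. If you want whole entity spans instead, recent transformers versions can aggregate for you via `pipeline(..., aggregation_strategy="simple")`; the sketch below shows the equivalent merging logic by hand. The dict keys (`entity`, `word`) and the sample data are assumptions about the default token-level output format, so check them against your pipeline's actual results.

```python
# Merge token-level NER predictions such as
# [{'entity': 'B-PER', 'word': '...'}, ...] into (text, label) spans.

def merge_entities(token_results):
    """Group consecutive B-/I- tagged tokens into (text, label) spans."""
    spans = []
    for tok in token_results:
        tag = tok["entity"]            # e.g. "B-PER" or "I-PER"
        label = tag.split("-", 1)[-1]  # strip the B-/I- prefix
        if tag.startswith("B-") or not spans or spans[-1][1] != label:
            spans.append([tok["word"], label])   # start a new span
        else:
            spans[-1][0] += " " + tok["word"]    # extend the current span
    return [tuple(s) for s in spans]


# Example with a hypothetical token-level pipeline output:
sample = [
    {"entity": "B-PER", "word": "John"},
    {"entity": "I-PER", "word": "Smith"},
    {"entity": "B-LOC", "word": "Tehran"},
]
print(merge_entities(sample))  # [('John Smith', 'PER'), ('Tehran', 'LOC')]
```

Note that real pipeline output may also contain sub-word pieces (prefixed with `##`), which the built-in aggregation strategies handle more carefully than this simple space-join.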
## Questions?
Post a Github issue on the [ParsNER Issues](https://github.com/hooshvare/parsner/issues) repo. | 3,650 | [
[
-0.048370361328125,
-0.042816162109375,
0.019287109375,
0.001018524169921875,
-0.0005545616149902344,
-0.00852203369140625,
-0.005352020263671875,
-0.0197601318359375,
0.0284271240234375,
0.027679443359375,
-0.03985595703125,
-0.048858642578125,
-0.0582275390625,
0.00035309791564941406,
-0.033477783203125,
0.08953857421875,
0.0030803680419921875,
0.0171661376953125,
0.00458526611328125,
-0.001026153564453125,
-0.0352783203125,
-0.033172607421875,
-0.044189453125,
-0.038238525390625,
0.021759033203125,
0.02911376953125,
0.039215087890625,
0.02667236328125,
0.0423583984375,
0.0223388671875,
-0.018524169921875,
0.00010639429092407227,
-0.01352691650390625,
-0.0100250244140625,
0.0024127960205078125,
-0.0191497802734375,
-0.051116943359375,
-0.0089569091796875,
0.04449462890625,
0.03955078125,
-0.01459503173828125,
0.029449462890625,
0.01395416259765625,
0.0396728515625,
-0.0379638671875,
0.006000518798828125,
-0.023406982421875,
0.0103302001953125,
-0.0034465789794921875,
-0.0034046173095703125,
-0.018463134765625,
-0.0158538818359375,
0.0245208740234375,
-0.03387451171875,
0.0264892578125,
0.00460052490234375,
0.10369873046875,
0.028472900390625,
-0.0302886962890625,
-0.0095062255859375,
-0.048919677734375,
0.06451416015625,
-0.0584716796875,
0.0211181640625,
0.0230255126953125,
0.01120758056640625,
-0.0113067626953125,
-0.052520751953125,
-0.04608154296875,
0.002567291259765625,
-0.018707275390625,
0.018585205078125,
-0.02276611328125,
-0.005401611328125,
0.019256591796875,
0.03717041015625,
-0.043182373046875,
0.00881195068359375,
-0.038787841796875,
-0.0247344970703125,
0.047607421875,
0.0261077880859375,
0.0084381103515625,
-0.0260467529296875,
-0.03369140625,
-0.0214385986328125,
-0.01322174072265625,
0.017913818359375,
0.023468017578125,
0.0195159912109375,
-0.025421142578125,
0.044677734375,
-0.019073486328125,
0.0537109375,
0.022369384765625,
-0.00601959228515625,
0.052215576171875,
-0.019927978515625,
-0.0293121337890625,
0.0081939697265625,
0.0640869140625,
0.0355224609375,
0.01258087158203125,
0.002811431884765625,
-0.030517578125,
0.002582550048828125,
0.01018524169921875,
-0.068603515625,
-0.03277587890625,
0.0162811279296875,
-0.0390625,
-0.033294677734375,
0.007686614990234375,
-0.06396484375,
0.00726318359375,
-0.030792236328125,
0.04840087890625,
-0.025848388671875,
-0.03094482421875,
-0.00982666015625,
-0.003963470458984375,
0.0341796875,
0.01441192626953125,
-0.0635986328125,
0.0279541015625,
0.040252685546875,
0.06036376953125,
-0.00421142578125,
-0.0247344970703125,
-0.0096435546875,
-0.0161590576171875,
-0.028656005859375,
0.049163818359375,
-0.0221710205078125,
-0.0282440185546875,
-0.0260009765625,
0.01271820068359375,
-0.022216796875,
-0.0208740234375,
0.033355712890625,
-0.03265380859375,
0.03875732421875,
-0.00396728515625,
-0.055145263671875,
-0.032562255859375,
0.030792236328125,
-0.05145263671875,
0.08392333984375,
0.01629638671875,
-0.0616455078125,
0.04541015625,
-0.059783935546875,
-0.01067352294921875,
-0.01253509521484375,
-0.00579833984375,
-0.0638427734375,
-0.00839996337890625,
0.0275726318359375,
0.04656982421875,
-0.0186614990234375,
0.04449462890625,
-0.0091094970703125,
-0.0308990478515625,
-0.0012788772583007812,
-0.023101806640625,
0.059600830078125,
0.0123291015625,
-0.046630859375,
0.0172882080078125,
-0.0728759765625,
0.0159912109375,
0.006656646728515625,
-0.01241302490234375,
-0.00766754150390625,
-0.014739990234375,
0.00909423828125,
0.02294921875,
0.0089874267578125,
-0.02752685546875,
0.005462646484375,
-0.03192138671875,
0.022003173828125,
0.0595703125,
0.0187225341796875,
0.037322998046875,
-0.041473388671875,
0.027313232421875,
0.0166473388671875,
0.01300048828125,
0.0139312744140625,
-0.03594970703125,
-0.0843505859375,
-0.0276336669921875,
0.0400390625,
0.034423828125,
-0.031829833984375,
0.05078125,
-0.0195465087890625,
-0.050201416015625,
-0.033447265625,
-0.0232391357421875,
0.0286102294921875,
0.050628662109375,
0.040679931640625,
-0.0131988525390625,
-0.052978515625,
-0.08050537109375,
-0.01015472412109375,
-0.01043701171875,
0.00848388671875,
0.0242462158203125,
0.065673828125,
-0.0170745849609375,
0.061065673828125,
-0.038116455078125,
-0.0276336669921875,
-0.01500701904296875,
0.0034351348876953125,
0.05865478515625,
0.050933837890625,
0.035736083984375,
-0.0450439453125,
-0.04205322265625,
-0.00949859619140625,
-0.0548095703125,
0.02508544921875,
0.0005397796630859375,
-0.0153045654296875,
0.0204620361328125,
0.021636962890625,
-0.032257080078125,
0.046295166015625,
0.038726806640625,
-0.047271728515625,
0.051177978515625,
-0.005107879638671875,
0.013763427734375,
-0.0860595703125,
0.0258026123046875,
-0.004360198974609375,
-0.00296783447265625,
-0.049285888671875,
-0.01226806640625,
0.00010901689529418945,
-0.0012760162353515625,
-0.02392578125,
0.05218505859375,
-0.0307464599609375,
0.00180816650390625,
0.004627227783203125,
-0.028472900390625,
-0.002544403076171875,
0.04901123046875,
0.012939453125,
0.055633544921875,
0.054595947265625,
-0.051849365234375,
0.00521087646484375,
0.0247344970703125,
-0.046417236328125,
0.0276947021484375,
-0.056182861328125,
0.0021152496337890625,
-0.00864410400390625,
0.0012187957763671875,
-0.052001953125,
-0.02276611328125,
0.024139404296875,
-0.04766845703125,
0.028656005859375,
-0.0183868408203125,
-0.0198516845703125,
-0.031646728515625,
-0.00054931640625,
0.01300048828125,
0.0308990478515625,
-0.02825927734375,
0.056182861328125,
0.01274871826171875,
0.01097869873046875,
-0.05426025390625,
-0.0489501953125,
-0.0137939453125,
-0.023834228515625,
-0.05316162109375,
0.043304443359375,
-0.0006461143493652344,
-0.00618743896484375,
0.01023101806640625,
-0.0068359375,
-0.0159149169921875,
0.01482391357421875,
0.02398681640625,
0.0517578125,
-0.0209503173828125,
0.0022106170654296875,
-0.01390838623046875,
-0.022979736328125,
0.0031337738037109375,
0.0034809112548828125,
0.055450439453125,
-0.0251617431640625,
-0.0139007568359375,
-0.03900146484375,
0.0036678314208984375,
0.037841796875,
-0.0220184326171875,
0.06732177734375,
0.048919677734375,
-0.027740478515625,
-0.007534027099609375,
-0.038299560546875,
-0.0284271240234375,
-0.0311279296875,
0.041229248046875,
-0.039337158203125,
-0.0606689453125,
0.05731201171875,
0.018310546875,
0.0127716064453125,
0.068115234375,
0.040771484375,
0.0005292892456054688,
0.0716552734375,
0.0290679931640625,
-0.003864288330078125,
0.0262908935546875,
-0.058380126953125,
0.016204833984375,
-0.0623779296875,
-0.0419921875,
-0.03717041015625,
-0.038909912109375,
-0.061798095703125,
-0.01113128662109375,
0.028289794921875,
0.004669189453125,
-0.0560302734375,
0.0223388671875,
-0.060089111328125,
0.0224456787109375,
0.055816650390625,
0.01274871826171875,
0.00213623046875,
-0.005535125732421875,
-0.0269927978515625,
-0.0027027130126953125,
-0.051849365234375,
-0.026214599609375,
0.0902099609375,
0.019287109375,
0.0333251953125,
0.006275177001953125,
0.08087158203125,
0.008544921875,
0.024749755859375,
-0.0504150390625,
0.035491943359375,
-0.011444091796875,
-0.0794677734375,
-0.0201873779296875,
-0.033416748046875,
-0.0694580078125,
0.024932861328125,
-0.0233154296875,
-0.052001953125,
0.033477783203125,
0.005054473876953125,
-0.03839111328125,
0.03216552734375,
-0.03985595703125,
0.07611083984375,
-0.014617919921875,
-0.0138397216796875,
0.0049591064453125,
-0.048614501953125,
0.0235748291015625,
0.0003025531768798828,
0.01325225830078125,
-0.01641845703125,
0.01216888427734375,
0.07501220703125,
-0.03448486328125,
0.0428466796875,
-0.0124359130859375,
0.034393310546875,
0.026885986328125,
-0.00013136863708496094,
0.039031982421875,
-0.0035247802734375,
-0.0159149169921875,
0.0207061767578125,
0.004291534423828125,
-0.0234375,
-0.0229339599609375,
0.057525634765625,
-0.08251953125,
-0.03515625,
-0.06396484375,
-0.0227813720703125,
0.01080322265625,
0.033416748046875,
0.038970947265625,
0.0283660888671875,
0.001209259033203125,
0.0182952880859375,
0.041595458984375,
-0.0135955810546875,
0.047943115234375,
0.03875732421875,
-0.020782470703125,
-0.045196533203125,
0.0660400390625,
0.01125335693359375,
0.0110015869140625,
0.0145721435546875,
0.007266998291015625,
-0.01397705078125,
-0.033111572265625,
-0.024810791015625,
0.0088043212890625,
-0.037567138671875,
-0.028289794921875,
-0.0504150390625,
-0.03717041015625,
-0.033782958984375,
-0.0081787109375,
-0.0300750732421875,
-0.0227813720703125,
-0.041229248046875,
-0.0217742919921875,
0.0282135009765625,
0.0465087890625,
-0.00008118152618408203,
0.0107574462890625,
-0.05047607421875,
0.0025920867919921875,
0.0120697021484375,
0.0253753662109375,
0.006824493408203125,
-0.0565185546875,
-0.0088653564453125,
-0.00707244873046875,
-0.0246429443359375,
-0.052215576171875,
0.058807373046875,
0.0171051025390625,
0.054107666015625,
0.0171661376953125,
0.00007289648056030273,
0.061492919921875,
-0.019287109375,
0.06048583984375,
0.02313232421875,
-0.06298828125,
0.0384521484375,
-0.01361846923828125,
0.01092529296875,
0.043609619140625,
0.03302001953125,
-0.03436279296875,
-0.0099334716796875,
-0.071044921875,
-0.0765380859375,
0.06817626953125,
0.0250244140625,
-0.006061553955078125,
0.0024852752685546875,
0.01483154296875,
-0.01751708984375,
0.01776123046875,
-0.06842041015625,
-0.050506591796875,
-0.022064208984375,
-0.01222991943359375,
-0.005023956298828125,
-0.0090179443359375,
0.00391387939453125,
-0.031524658203125,
0.0716552734375,
0.01245880126953125,
0.034942626953125,
0.0271759033203125,
0.0015668869018554688,
-0.0022563934326171875,
0.025238037109375,
0.031646728515625,
0.0377197265625,
-0.03570556640625,
0.000675201416015625,
0.0213470458984375,
-0.0288848876953125,
0.004913330078125,
0.0166168212890625,
-0.027587890625,
0.00783538818359375,
0.0352783203125,
0.06915283203125,
0.0197601318359375,
-0.0181732177734375,
0.031463623046875,
0.00046515464782714844,
-0.031463623046875,
-0.051605224609375,
0.01157379150390625,
-0.013458251953125,
0.0208282470703125,
0.0262603759765625,
0.020355224609375,
0.01222991943359375,
-0.0257720947265625,
0.0166168212890625,
0.02197265625,
-0.040008544921875,
-0.01200103759765625,
0.048736572265625,
-0.00228118896484375,
-0.0214385986328125,
0.069091796875,
0.00010919570922851562,
-0.054473876953125,
0.065673828125,
0.0203704833984375,
0.0635986328125,
-0.0278778076171875,
0.00809478759765625,
0.07275390625,
0.0180511474609375,
-0.0082244873046875,
0.0262908935546875,
0.0150909423828125,
-0.05364990234375,
-0.00949859619140625,
-0.06414794921875,
-0.01036834716796875,
0.0150909423828125,
-0.0655517578125,
0.034454345703125,
-0.0289154052734375,
-0.035858154296875,
0.01959228515625,
0.015655517578125,
-0.06646728515625,
0.035491943359375,
0.00359344482421875,
0.058868408203125,
-0.06256103515625,
0.05169677734375,
0.04815673828125,
-0.04522705078125,
-0.09649658203125,
-0.0218963623046875,
-0.01235198974609375,
-0.05108642578125,
0.0489501953125,
0.016815185546875,
0.0218505859375,
0.015655517578125,
-0.0254669189453125,
-0.09930419921875,
0.09716796875,
-0.01038360595703125,
-0.051544189453125,
-0.01084136962890625,
0.00010126829147338867,
0.04193115234375,
-0.01465606689453125,
0.042205810546875,
0.04193115234375,
0.04205322265625,
0.004608154296875,
-0.074462890625,
-0.0013532638549804688,
-0.035125732421875,
-0.00691986083984375,
0.0341796875,
-0.050323486328125,
0.08905029296875,
-0.0175018310546875,
0.003665924072265625,
0.0004227161407470703,
0.047027587890625,
0.030517578125,
0.03271484375,
0.03985595703125,
0.0657958984375,
0.046112060546875,
-0.0238037109375,
0.0728759765625,
-0.0263214111328125,
0.054351806640625,
0.08941650390625,
0.00737762451171875,
0.05584716796875,
0.03131103515625,
-0.03179931640625,
0.049285888671875,
0.06292724609375,
-0.028472900390625,
0.039703369140625,
0.00013399124145507812,
-0.008636474609375,
-0.0302276611328125,
0.0111236572265625,
-0.040924072265625,
0.032867431640625,
0.018035888671875,
-0.05108642578125,
-0.0110321044921875,
-0.00923919677734375,
0.004108428955078125,
-0.021240234375,
-0.0274505615234375,
0.054595947265625,
-0.005645751953125,
-0.057830810546875,
0.0501708984375,
0.0010442733764648438,
0.041961669921875,
-0.0322265625,
-0.005023956298828125,
-0.00482940673828125,
0.0178985595703125,
-0.0352783203125,
-0.06121826171875,
0.017730712890625,
-0.0013294219970703125,
-0.024932861328125,
0.00341796875,
0.04449462890625,
-0.017608642578125,
-0.052093505859375,
0.0230712890625,
0.0251922607421875,
0.01129150390625,
0.01019287109375,
-0.058563232421875,
-0.0171661376953125,
0.016632080078125,
-0.044403076171875,
0.015472412109375,
0.0173797607421875,
0.004634857177734375,
0.044403076171875,
0.055389404296875,
-0.0114898681640625,
0.01702880859375,
0.0092926025390625,
0.078857421875,
-0.055145263671875,
-0.034698486328125,
-0.057220458984375,
0.04937744140625,
-0.0022830963134765625,
-0.04608154296875,
0.056976318359375,
0.0584716796875,
0.05230712890625,
0.00795745849609375,
0.033782958984375,
-0.0249786376953125,
0.0482177734375,
-0.032073974609375,
0.054962158203125,
-0.049835205078125,
0.0113525390625,
-0.01654052734375,
-0.06500244140625,
-0.01395416259765625,
0.059234619140625,
-0.041961669921875,
0.019195556640625,
0.041168212890625,
0.06353759765625,
0.003986358642578125,
-0.0006155967712402344,
0.01122283935546875,
0.0234375,
0.013824462890625,
0.04248046875,
0.04510498046875,
-0.050140380859375,
0.031280517578125,
-0.0550537109375,
0.0005717277526855469,
-0.0123443603515625,
-0.04656982421875,
-0.06280517578125,
-0.04052734375,
-0.031646728515625,
-0.043212890625,
-0.01229095458984375,
0.0762939453125,
0.051177978515625,
-0.078369140625,
-0.0119171142578125,
-0.0250244140625,
0.0030994415283203125,
-0.01314544677734375,
-0.02288818359375,
0.04803466796875,
-0.01445770263671875,
-0.058746337890625,
0.00345611572265625,
-0.0211334228515625,
0.018096923828125,
-0.00738525390625,
-0.016876220703125,
-0.02191162109375,
-0.0081634521484375,
0.01143646240234375,
0.00045800209045410156,
-0.04901123046875,
-0.017852783203125,
0.01245880126953125,
-0.006893157958984375,
0.01511383056640625,
0.0207061767578125,
-0.04461669921875,
0.03302001953125,
0.0259246826171875,
0.01641845703125,
0.056243896484375,
-0.0129241943359375,
0.0108642578125,
-0.0306854248046875,
0.0029735565185546875,
0.03619384765625,
0.029571533203125,
0.00986480712890625,
-0.0231170654296875,
0.045867919921875,
0.007114410400390625,
-0.056396484375,
-0.0682373046875,
-0.022552490234375,
-0.07196044921875,
-0.00920867919921875,
0.07025146484375,
-0.01226806640625,
-0.0223846435546875,
0.004940032958984375,
0.00711822509765625,
0.0271148681640625,
-0.041900634765625,
0.044403076171875,
0.05133056640625,
-0.01445770263671875,
-0.01311492919921875,
-0.039642333984375,
0.02825927734375,
0.035919189453125,
-0.041107177734375,
-0.03424072265625,
0.0228729248046875,
0.04522705078125,
0.0186614990234375,
0.026397705078125,
-0.0059051513671875,
0.0191192626953125,
0.001201629638671875,
0.0246429443359375,
-0.007488250732421875,
-0.0224151611328125,
-0.0181427001953125,
0.00576019287109375,
-0.0117034912109375,
-0.024688720703125
]
] |
intfloat/e5-small-unsupervised | 2023-07-27T03:55:32.000Z | [
"sentence-transformers",
"pytorch",
"safetensors",
"bert",
"Sentence Transformers",
"sentence-similarity",
"en",
"arxiv:2212.03533",
"arxiv:2104.08663",
"arxiv:2210.07316",
"license:mit",
"endpoints_compatible",
"region:us"
] | sentence-similarity | intfloat | null | null | intfloat/e5-small-unsupervised | 0 | 14,275 | sentence-transformers | 2023-01-31T03:03:08 | ---
tags:
- sentence-transformers
- Sentence Transformers
- sentence-similarity
language:
- en
license: mit
---
# E5-small-unsupervised
**This model is similar to [e5-small](https://huggingface.co/intfloat/e5-small) but without supervised fine-tuning.**
[Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
This model has 12 layers and the embedding size is 384.
## Usage
Below is an example to encode queries and passages from the MS-MARCO passage ranking dataset.
```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def average_pool(last_hidden_states: Tensor,
attention_mask: Tensor) -> Tensor:
last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
# Each input text should start with "query: " or "passage: ".
# For tasks other than retrieval, you can simply use the "query: " prefix.
input_texts = ['query: how much protein should a female eat',
'query: summit define',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]
tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-small-unsupervised')
model = AutoModel.from_pretrained('intfloat/e5-small-unsupervised')
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
# normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```
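To make the pooling step concrete, here is a toy NumPy equivalent of `average_pool` (illustrative only; the torch version in the example above is what you would use in practice). Padding positions, where the attention mask is 0, are zeroed out before averaging so they do not dilute the sentence embedding:

```python
import numpy as np

def average_pool_np(hidden, mask):
    """Masked mean pooling: average hidden states over non-padding positions only."""
    # Zero out hidden states at padding positions (mask == 0).
    hidden = np.where(mask[..., None].astype(bool), hidden, 0.0)
    # Sum over the sequence axis and divide by the number of real tokens.
    return hidden.sum(axis=1) / mask.sum(axis=1)[..., None]
```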
## Training Details
Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).
## Benchmark Evaluation
Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results
on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB benchmark](https://arxiv.org/abs/2210.07316).
## Support for Sentence Transformers
Below is an example of usage with `sentence_transformers`.
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer('intfloat/e5-small-unsupervised')
input_texts = [
'query: how much protein should a female eat',
'query: summit define',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
]
embeddings = model.encode(input_texts, normalize_embeddings=True)
```
Package requirements:
`pip install sentence_transformers~=2.2.2`
Contributors: [michaelfeil](https://huggingface.co/michaelfeil)
## FAQ
**1. Do I need to add the prefix "query: " and "passage: " to input texts?**
Yes, this is how the model is trained; otherwise you will see a performance degradation.
Here are some rules of thumb:
- Use "query: " and "passage: " correspondingly for asymmetric tasks such as passage retrieval in open QA, ad-hoc information retrieval.
- Use "query: " prefix for symmetric tasks such as semantic similarity, paraphrase retrieval.
- Use "query: " prefix if you want to use embeddings as features, such as linear probing classification, clustering.
**2. Why are my reproduced results slightly different from those reported in the model card?**
Different versions of `transformers` and `pytorch` could cause negligible but non-zero performance differences.
## Citation
If you find our paper or models helpful, please consider citing them as follows:
```
@article{wang2022text,
title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
journal={arXiv preprint arXiv:2212.03533},
year={2022}
}
```
## Limitations
This model only works for English texts. Long texts will be truncated to at most 512 tokens.
| 5,094 | [
[
-0.00991058349609375,
-0.05267333984375,
0.01401519775390625,
0.011505126953125,
-0.0188446044921875,
-0.032958984375,
-0.0005578994750976562,
-0.029449462890625,
0.00708770751953125,
0.0252838134765625,
-0.042083740234375,
-0.0452880859375,
-0.073974609375,
0.0231781005859375,
-0.0277862548828125,
0.07061767578125,
0.0045928955078125,
0.00641632080078125,
-0.0268096923828125,
-0.003421783447265625,
-0.0178070068359375,
-0.038116455078125,
-0.024993896484375,
-0.026763916015625,
0.0208892822265625,
0.019317626953125,
0.042327880859375,
0.0447998046875,
0.04913330078125,
0.0258331298828125,
-0.01329803466796875,
0.01861572265625,
-0.038543701171875,
-0.0179290771484375,
0.0005064010620117188,
-0.04168701171875,
-0.030670166015625,
0.0158233642578125,
0.041015625,
0.06854248046875,
0.00969696044921875,
0.0237884521484375,
0.0303802490234375,
0.043365478515625,
-0.0428466796875,
0.0130462646484375,
-0.032196044921875,
0.01108551025390625,
0.00771331787109375,
0.00188446044921875,
-0.0298614501953125,
0.01910400390625,
0.0238189697265625,
-0.043975830078125,
0.0248260498046875,
0.01265716552734375,
0.0938720703125,
0.026031494140625,
-0.0377197265625,
-0.01377105712890625,
-0.0098419189453125,
0.07208251953125,
-0.052734375,
0.032501220703125,
0.0504150390625,
-0.019744873046875,
-0.01287841796875,
-0.07293701171875,
-0.03009033203125,
-0.01380157470703125,
-0.0229644775390625,
0.0101776123046875,
-0.024871826171875,
-0.002666473388671875,
0.03204345703125,
0.033721923828125,
-0.060699462890625,
-0.002719879150390625,
-0.029327392578125,
-0.00591278076171875,
0.03875732421875,
0.0106048583984375,
0.023529052734375,
-0.039337158203125,
-0.01318359375,
-0.0174102783203125,
-0.041748046875,
0.00841522216796875,
0.01303863525390625,
0.031463623046875,
-0.0302886962890625,
0.04595947265625,
-0.0274658203125,
0.049560546875,
0.0208282470703125,
0.01068115234375,
0.046539306640625,
-0.041259765625,
-0.0231781005859375,
-0.0233001708984375,
0.074462890625,
0.038116455078125,
0.0105438232421875,
-0.00780487060546875,
-0.00656890869140625,
-0.00568389892578125,
0.00630950927734375,
-0.08270263671875,
-0.043975830078125,
0.01442718505859375,
-0.052734375,
-0.0176544189453125,
0.008270263671875,
-0.041046142578125,
-0.0037994384765625,
-0.0178680419921875,
0.0662841796875,
-0.039520263671875,
0.00963592529296875,
0.018341064453125,
-0.01517486572265625,
0.004985809326171875,
0.009246826171875,
-0.06549072265625,
0.021820068359375,
0.006805419921875,
0.0697021484375,
-0.0051422119140625,
-0.0301971435546875,
-0.039825439453125,
-0.00433349609375,
0.0019817352294921875,
0.04150390625,
-0.028289794921875,
-0.0211334228515625,
-0.00040650367736816406,
0.03778076171875,
-0.03912353515625,
-0.037567138671875,
0.0419921875,
-0.0197601318359375,
0.0303955078125,
-0.015716552734375,
-0.042327880859375,
-0.00458526611328125,
0.0168609619140625,
-0.03253173828125,
0.0777587890625,
0.006099700927734375,
-0.07061767578125,
0.0120697021484375,
-0.0369873046875,
-0.0299072265625,
-0.0097808837890625,
-0.006839752197265625,
-0.035430908203125,
-0.00542449951171875,
0.043060302734375,
0.0298309326171875,
-0.01044464111328125,
-0.0020198822021484375,
-0.00585174560546875,
-0.03619384765625,
0.0149078369140625,
-0.0113983154296875,
0.0650634765625,
0.005176544189453125,
-0.03497314453125,
-0.0085906982421875,
-0.058319091796875,
0.005733489990234375,
0.0122833251953125,
-0.03369140625,
-0.007793426513671875,
0.00600433349609375,
-0.001018524169921875,
0.021270751953125,
0.0273284912109375,
-0.037994384765625,
0.0185089111328125,
-0.0413818359375,
0.060272216796875,
0.03900146484375,
0.00778961181640625,
0.035858154296875,
-0.03131103515625,
0.0081024169921875,
0.02947998046875,
0.00732421875,
-0.00164794921875,
-0.038818359375,
-0.063720703125,
-0.0113983154296875,
0.04638671875,
0.03424072265625,
-0.038360595703125,
0.042083740234375,
-0.0240631103515625,
-0.023773193359375,
-0.0552978515625,
0.007678985595703125,
0.0164337158203125,
0.02740478515625,
0.05889892578125,
-0.00669097900390625,
-0.050567626953125,
-0.0789794921875,
-0.02703857421875,
0.01511383056640625,
-0.02386474609375,
0.016143798828125,
0.06915283203125,
-0.0296478271484375,
0.045684814453125,
-0.050628662109375,
-0.044036865234375,
-0.0164947509765625,
0.013397216796875,
0.0294647216796875,
0.057342529296875,
0.02703857421875,
-0.0615234375,
-0.034271240234375,
-0.038604736328125,
-0.06512451171875,
0.004608154296875,
0.0103607177734375,
-0.01229095458984375,
-0.0065460205078125,
0.035552978515625,
-0.04925537109375,
0.0290679931640625,
0.035308837890625,
-0.033294677734375,
0.01702880859375,
-0.0279693603515625,
0.0076446533203125,
-0.073486328125,
-0.00030732154846191406,
0.01251220703125,
-0.015960693359375,
-0.029022216796875,
0.01291656494140625,
0.0030841827392578125,
-0.011444091796875,
-0.035308837890625,
0.020477294921875,
-0.04595947265625,
0.0157318115234375,
-0.01207733154296875,
0.0228118896484375,
0.0219879150390625,
0.040802001953125,
-0.007808685302734375,
0.04351806640625,
0.044342041015625,
-0.06341552734375,
-0.003574371337890625,
0.053192138671875,
-0.0290679931640625,
0.02099609375,
-0.06842041015625,
0.00534820556640625,
-0.0023822784423828125,
0.016998291015625,
-0.0682373046875,
-0.0172119140625,
0.0179290771484375,
-0.05084228515625,
0.0192718505859375,
-0.005615234375,
-0.040863037109375,
-0.02410888671875,
-0.036376953125,
0.016387939453125,
0.040008544921875,
-0.02825927734375,
0.03765869140625,
0.0212554931640625,
0.0037784576416015625,
-0.03460693359375,
-0.080810546875,
-0.01212310791015625,
-0.0011653900146484375,
-0.051025390625,
0.0528564453125,
-0.01000213623046875,
0.017364501953125,
0.00505828857421875,
-0.00991058349609375,
0.01461029052734375,
-0.01207733154296875,
0.0185089111328125,
0.0006766319274902344,
0.002979278564453125,
0.002452850341796875,
-0.0087738037109375,
-0.002826690673828125,
-0.0005464553833007812,
-0.01995849609375,
0.038360595703125,
-0.0257415771484375,
0.00907135009765625,
-0.0455322265625,
0.03839111328125,
0.0146331787109375,
-0.018646240234375,
0.08056640625,
0.059539794921875,
-0.0301971435546875,
0.00324249267578125,
-0.01885986328125,
-0.03179931640625,
-0.0355224609375,
0.0404052734375,
-0.04217529296875,
-0.043853759765625,
0.032440185546875,
0.005794525146484375,
-0.00736236572265625,
0.07073974609375,
0.02569580078125,
-0.0247039794921875,
0.103759765625,
0.0601806640625,
0.0146331787109375,
0.0286407470703125,
-0.0540771484375,
0.003551483154296875,
-0.07232666015625,
-0.0249786376953125,
-0.045166015625,
-0.035614013671875,
-0.0650634765625,
-0.0290069580078125,
0.0278778076171875,
0.01739501953125,
-0.035186767578125,
0.0234527587890625,
-0.04754638671875,
0.00792694091796875,
0.04449462890625,
0.03765869140625,
0.0097808837890625,
0.01184844970703125,
-0.017333984375,
-0.030029296875,
-0.07659912109375,
-0.0285186767578125,
0.07635498046875,
0.019195556640625,
0.055145263671875,
-0.005489349365234375,
0.044342041015625,
0.01380157470703125,
-0.0074920654296875,
-0.049285888671875,
0.039337158203125,
-0.031524658203125,
-0.023773193359375,
-0.01251220703125,
-0.050628662109375,
-0.0811767578125,
0.03265380859375,
-0.033721923828125,
-0.054718017578125,
0.01849365234375,
-0.00991058349609375,
-0.025238037109375,
0.00847625732421875,
-0.0679931640625,
0.081298828125,
0.005313873291015625,
-0.019195556640625,
0.003513336181640625,
-0.051513671875,
-0.01303863525390625,
0.0242919921875,
0.01122283935546875,
0.0023040771484375,
-0.0021762847900390625,
0.0819091796875,
-0.024871826171875,
0.07257080078125,
-0.00472259521484375,
0.035186767578125,
0.00991058349609375,
-0.01404571533203125,
0.041748046875,
-0.01390838623046875,
0.0015096664428710938,
0.0208282470703125,
0.00811767578125,
-0.042724609375,
-0.024749755859375,
0.05780029296875,
-0.091552734375,
-0.037933349609375,
-0.040863037109375,
-0.035247802734375,
0.006488800048828125,
0.0152587890625,
0.0498046875,
0.034912109375,
0.01084136962890625,
0.041595458984375,
0.039093017578125,
-0.024627685546875,
0.0294036865234375,
0.027313232421875,
0.008026123046875,
-0.031524658203125,
0.0498046875,
0.03143310546875,
0.0098114013671875,
0.048004150390625,
0.0183868408203125,
-0.0225677490234375,
-0.040008544921875,
-0.0157623291015625,
0.032440185546875,
-0.051055908203125,
-0.0144500732421875,
-0.08209228515625,
-0.0244293212890625,
-0.045867919921875,
0.0035076141357421875,
-0.0221710205078125,
-0.0303955078125,
-0.02972412109375,
-0.00318145751953125,
0.0117034912109375,
0.03131103515625,
-0.0081939697265625,
0.0273590087890625,
-0.049163818359375,
0.0243377685546875,
0.00534820556640625,
0.001617431640625,
-0.01210784912109375,
-0.07275390625,
-0.0313720703125,
0.01506805419921875,
-0.04638671875,
-0.06597900390625,
0.035552978515625,
0.03265380859375,
0.044921875,
0.009429931640625,
0.0102386474609375,
0.047637939453125,
-0.028839111328125,
0.07305908203125,
0.00868988037109375,
-0.06829833984375,
0.05133056640625,
-0.007598876953125,
0.047943115234375,
0.037872314453125,
0.051910400390625,
-0.0248260498046875,
-0.0285797119140625,
-0.06341552734375,
-0.07965087890625,
0.049774169921875,
0.033721923828125,
0.01116180419921875,
-0.003940582275390625,
0.022064208984375,
-0.00624847412109375,
0.0206756591796875,
-0.08380126953125,
-0.02679443359375,
-0.0263671875,
-0.0205078125,
-0.01514434814453125,
-0.012969970703125,
-0.0024662017822265625,
-0.046722412109375,
0.06365966796875,
-0.0059356689453125,
0.0494384765625,
0.041412353515625,
-0.037017822265625,
0.004070281982421875,
0.002010345458984375,
0.0238494873046875,
0.042816162109375,
-0.032012939453125,
0.02197265625,
0.0242156982421875,
-0.046539306640625,
-0.01422882080078125,
0.015838623046875,
-0.02294921875,
0.004749298095703125,
0.035552978515625,
0.0565185546875,
0.0252227783203125,
-0.02471923828125,
0.04498291015625,
0.0002493858337402344,
-0.0255279541015625,
-0.006076812744140625,
-0.004451751708984375,
0.0126953125,
0.0162353515625,
0.03472900390625,
-0.002841949462890625,
0.01056671142578125,
-0.042877197265625,
0.007534027099609375,
0.0012636184692382812,
-0.0308380126953125,
-0.0220794677734375,
0.056915283203125,
0.0185394287109375,
-0.00812530517578125,
0.0753173828125,
-0.0097503662109375,
-0.035797119140625,
0.03448486328125,
0.050628662109375,
0.049285888671875,
-0.00531005859375,
0.01367950439453125,
0.061553955078125,
0.036529541015625,
0.0025768280029296875,
0.0117950439453125,
0.01505279541015625,
-0.05224609375,
-0.0202789306640625,
-0.06866455078125,
-0.0012540817260742188,
0.02301025390625,
-0.036376953125,
0.01519775390625,
-0.0020599365234375,
-0.021331787109375,
0.007068634033203125,
0.03546142578125,
-0.07080078125,
0.0211029052734375,
-0.010467529296875,
0.053131103515625,
-0.07012939453125,
0.0419921875,
0.061676025390625,
-0.064697265625,
-0.0565185546875,
-0.0006132125854492188,
-0.0265960693359375,
-0.0450439453125,
0.051605224609375,
0.037506103515625,
0.01103973388671875,
0.00814056396484375,
-0.0406494140625,
-0.050567626953125,
0.082275390625,
0.01111602783203125,
-0.035247802734375,
-0.0221405029296875,
0.0247039794921875,
0.031097412109375,
-0.03924560546875,
0.036865234375,
0.0278778076171875,
0.0230865478515625,
-0.0087738037109375,
-0.053497314453125,
0.01422882080078125,
-0.026336669921875,
-0.00708770751953125,
-0.01377105712890625,
-0.050384521484375,
0.08819580078125,
-0.0193939208984375,
-0.003673553466796875,
0.005474090576171875,
0.048004150390625,
0.00501251220703125,
0.0029468536376953125,
0.03131103515625,
0.04461669921875,
0.049072265625,
-0.0047149658203125,
0.08880615234375,
-0.0242919921875,
0.04302978515625,
0.061065673828125,
0.02947998046875,
0.06219482421875,
0.03497314453125,
-0.023406982421875,
0.0465087890625,
0.068359375,
-0.01210784912109375,
0.05670166015625,
0.00457763671875,
0.01334381103515625,
-0.024139404296875,
0.0024127960205078125,
-0.042266845703125,
0.028533935546875,
0.01285552978515625,
-0.049560546875,
-0.01445770263671875,
0.008026123046875,
0.0030574798583984375,
-0.01203155517578125,
-0.0203704833984375,
0.03778076171875,
0.03765869140625,
-0.034423828125,
0.07220458984375,
0.0123138427734375,
0.05426025390625,
-0.05010986328125,
0.0168304443359375,
-0.0168914794921875,
0.031158447265625,
-0.023651123046875,
-0.041778564453125,
0.0092315673828125,
-0.00591278076171875,
-0.02386474609375,
-0.0101776123046875,
0.0450439453125,
-0.044281005859375,
-0.0386962890625,
0.020111083984375,
0.044342041015625,
0.0192718505859375,
-0.022491455078125,
-0.0748291015625,
-0.0025386810302734375,
0.005035400390625,
-0.02972412109375,
0.0340576171875,
0.017974853515625,
0.0184173583984375,
0.041351318359375,
0.037017822265625,
-0.007549285888671875,
-0.002758026123046875,
0.01361083984375,
0.0595703125,
-0.04638671875,
-0.045928955078125,
-0.06134033203125,
0.0300140380859375,
-0.0176849365234375,
-0.0254364013671875,
0.0650634765625,
0.0499267578125,
0.05902099609375,
-0.016204833984375,
0.033721923828125,
-0.0009322166442871094,
0.0131378173828125,
-0.041748046875,
0.050628662109375,
-0.05499267578125,
-0.005107879638671875,
-0.0164031982421875,
-0.0723876953125,
-0.012115478515625,
0.06768798828125,
-0.03363037109375,
0.008544921875,
0.0701904296875,
0.058013916015625,
-0.0192413330078125,
-0.007572174072265625,
0.01523590087890625,
0.04144287109375,
0.0190277099609375,
0.05889892578125,
0.040618896484375,
-0.07952880859375,
0.054656982421875,
-0.0153350830078125,
-0.0169830322265625,
-0.015228271484375,
-0.05584716796875,
-0.06854248046875,
-0.04931640625,
-0.044158935546875,
-0.0257568359375,
0.00594329833984375,
0.07080078125,
0.057952880859375,
-0.04595947265625,
-0.007709503173828125,
-0.0002434253692626953,
-0.01448822021484375,
-0.028289794921875,
-0.0182647705078125,
0.044036865234375,
-0.0280303955078125,
-0.06951904296875,
0.02032470703125,
-0.014007568359375,
0.0105438232421875,
0.007049560546875,
-0.0015573501586914062,
-0.04541015625,
-0.0014848709106445312,
0.051177978515625,
-0.0067901611328125,
-0.029541015625,
-0.03033447265625,
-0.0016241073608398438,
-0.0239105224609375,
0.0118408203125,
0.0119171142578125,
-0.0482177734375,
0.0192108154296875,
0.04327392578125,
0.03045654296875,
0.07598876953125,
-0.002658843994140625,
0.0302886962890625,
-0.058319091796875,
0.01398468017578125,
0.00794219970703125,
0.0277862548828125,
0.03997802734375,
-0.0210113525390625,
0.037445068359375,
0.03179931640625,
-0.04449462890625,
-0.048126220703125,
-0.0087890625,
-0.070068359375,
-0.0175628662109375,
0.0762939453125,
-0.0126953125,
-0.0234527587890625,
0.0137481689453125,
-0.002208709716796875,
0.0289459228515625,
-0.0192413330078125,
0.0565185546875,
0.0601806640625,
-0.01300048828125,
-0.005504608154296875,
-0.057891845703125,
0.042388916015625,
0.037689208984375,
-0.03924560546875,
-0.02496337890625,
0.0055389404296875,
0.0369873046875,
0.01068115234375,
0.038177490234375,
-0.0081634521484375,
-0.00141143798828125,
0.021820068359375,
-0.005718231201171875,
-0.005786895751953125,
-0.0114288330078125,
-0.008880615234375,
0.01290130615234375,
-0.0210418701171875,
-0.026763916015625
]
] |
sentence-transformers/paraphrase-distilroberta-base-v1 | 2022-06-15T20:03:06.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"jax",
"roberta",
"feature-extraction",
"sentence-similarity",
"transformers",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/paraphrase-distilroberta-base-v1 | 2 | 14,273 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# sentence-transformers/paraphrase-distilroberta-base-v1
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/paraphrase-distilroberta-base-v1')
embeddings = model.encode(sentences)
print(embeddings)
```
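The embeddings returned by `encode` are ordinary vectors, so scoring the similarity of two sentences reduces to a cosine. A minimal, dependency-free sketch (toy 3-d vectors stand in for the model's 768-d embeddings):

```python
import math

def cosine(a, b):
    # cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# identical vectors have similarity 1.0
print(cosine([1.0, 0.0, 1.0], [1.0, 0.0, 1.0]))  # -> 1.0
```

In practice you would pass the rows of `model.encode(sentences)` to such a function (or use a vectorized equivalent) to rank candidate sentences by similarity.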
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/paraphrase-distilroberta-base-v1')
model = AutoModel.from_pretrained('sentence-transformers/paraphrase-distilroberta-base-v1')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
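To see why the attention mask matters in `mean_pooling` above, here is a toy check with hand-built tensors (no model download needed). The padded position carries an extreme value, yet it is excluded from the average:

```python
import torch

def mean_pooling(model_output, attention_mask):
    # same pooling as in the usage example above
    token_embeddings = model_output[0]
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * mask, 1) / torch.clamp(mask.sum(1), min=1e-9)

# one "sentence" of 3 tokens with embedding dim 2; the last token is padding
tokens = torch.tensor([[[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]])
mask = torch.tensor([[1, 1, 0]])

pooled = mean_pooling((tokens,), mask)
# only the two real tokens are averaged: mean of [1,2] and [3,4] -> [2,3]
print(pooled)
```

The padding token's large values never leak into the sentence embedding, which is exactly what a naive `token_embeddings.mean(dim=1)` would get wrong.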
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/paraphrase-distilroberta-base-v1)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: RobertaModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 3,741 | [
[
-0.0143890380859375,
-0.061981201171875,
0.02911376953125,
0.03179931640625,
-0.0296630859375,
-0.025970458984375,
-0.01227569580078125,
0.007671356201171875,
0.005756378173828125,
0.0318603515625,
-0.03155517578125,
-0.031768798828125,
-0.057464599609375,
0.0121002197265625,
-0.045684814453125,
0.0672607421875,
-0.01035308837890625,
-0.0060272216796875,
-0.0235137939453125,
-0.01629638671875,
-0.01497650146484375,
-0.03204345703125,
-0.019195556640625,
-0.017120361328125,
0.012481689453125,
0.011993408203125,
0.04364013671875,
0.0313720703125,
0.028533935546875,
0.03753662109375,
-0.010528564453125,
0.001617431640625,
-0.0228118896484375,
0.002460479736328125,
-0.004364013671875,
-0.026275634765625,
-0.0026912689208984375,
0.012603759765625,
0.04046630859375,
0.036956787109375,
-0.01271820068359375,
0.01300811767578125,
0.01387786865234375,
0.0219573974609375,
-0.032928466796875,
0.032196044921875,
-0.051971435546875,
0.0042724609375,
0.004940032958984375,
0.0013065338134765625,
-0.037689208984375,
-0.01322174072265625,
0.0254364013671875,
-0.0303955078125,
0.02008056640625,
0.0094451904296875,
0.08050537109375,
0.03643798828125,
-0.0157623291015625,
-0.027130126953125,
-0.0261993408203125,
0.06536865234375,
-0.0654296875,
0.00664520263671875,
0.02215576171875,
-0.00011938810348510742,
0.0016813278198242188,
-0.08953857421875,
-0.06201171875,
-0.01215362548828125,
-0.03594970703125,
0.006900787353515625,
-0.03326416015625,
-0.0027294158935546875,
0.0120086669921875,
0.0245361328125,
-0.04901123046875,
-0.024017333984375,
-0.0338134765625,
-0.0083465576171875,
0.0270233154296875,
0.0045318603515625,
0.02783203125,
-0.04486083984375,
-0.03546142578125,
-0.0284881591796875,
-0.004802703857421875,
-0.01064300537109375,
-0.0018930435180664062,
0.0109100341796875,
-0.0186004638671875,
0.0552978515625,
-0.0085906982421875,
0.045135498046875,
0.009490966796875,
0.01175689697265625,
0.047607421875,
-0.0305023193359375,
-0.0211181640625,
-0.0081329345703125,
0.08367919921875,
0.0287933349609375,
0.018768310546875,
-0.00827789306640625,
-0.006984710693359375,
-0.0013113021850585938,
0.01078033447265625,
-0.0626220703125,
-0.0338134765625,
0.00797271728515625,
-0.0309600830078125,
-0.025238037109375,
0.0120849609375,
-0.06256103515625,
-0.0028839111328125,
0.00008505582809448242,
0.05572509765625,
-0.049560546875,
0.00586700439453125,
0.02288818359375,
-0.03338623046875,
0.01337432861328125,
-0.02288818359375,
-0.052154541015625,
0.02008056640625,
0.020233154296875,
0.0836181640625,
0.00801849365234375,
-0.043914794921875,
-0.02215576171875,
-0.00527191162109375,
0.01468658447265625,
0.044952392578125,
-0.02935791015625,
-0.00690460205078125,
0.0018815994262695312,
0.01424407958984375,
-0.04510498046875,
-0.03271484375,
0.0518798828125,
-0.020965576171875,
0.047637939453125,
0.00649261474609375,
-0.057861328125,
-0.012176513671875,
0.0033130645751953125,
-0.0396728515625,
0.08270263671875,
0.01543426513671875,
-0.07330322265625,
-0.0025882720947265625,
-0.048248291015625,
-0.0248870849609375,
-0.008697509765625,
0.003704071044921875,
-0.05047607421875,
0.01206207275390625,
0.0372314453125,
0.05682373046875,
-0.00263214111328125,
0.023712158203125,
-0.02203369140625,
-0.0305938720703125,
0.031707763671875,
-0.031280517578125,
0.08837890625,
0.0148773193359375,
-0.035247802734375,
-0.0005059242248535156,
-0.03802490234375,
-0.0127410888671875,
0.024658203125,
-0.00968170166015625,
-0.023651123046875,
-0.006282806396484375,
0.0218353271484375,
0.0259552001953125,
0.027557373046875,
-0.046051025390625,
0.0018415451049804688,
-0.03216552734375,
0.0797119140625,
0.03765869140625,
0.00920867919921875,
0.046173095703125,
-0.0240936279296875,
0.0165557861328125,
0.0220947265625,
0.006809234619140625,
-0.00981903076171875,
-0.0294036865234375,
-0.064208984375,
-0.022979736328125,
0.019073486328125,
0.042327880859375,
-0.064697265625,
0.071044921875,
-0.036712646484375,
-0.042877197265625,
-0.0533447265625,
0.00299835205078125,
0.01446533203125,
0.042388916015625,
0.0574951171875,
0.00606536865234375,
-0.041107177734375,
-0.06707763671875,
-0.005756378173828125,
-0.0006413459777832031,
0.00007367134094238281,
0.0037097930908203125,
0.05474853515625,
-0.0191802978515625,
0.0728759765625,
-0.04681396484375,
-0.02801513671875,
-0.0382080078125,
0.017822265625,
0.0207672119140625,
0.0408935546875,
0.047760009765625,
-0.06524658203125,
-0.039337158203125,
-0.041961669921875,
-0.058197021484375,
-0.00463104248046875,
-0.01641845703125,
-0.016204833984375,
-0.0015707015991210938,
0.0416259765625,
-0.06585693359375,
0.0245819091796875,
0.040863037109375,
-0.035888671875,
0.0206756591796875,
-0.024810791015625,
-0.010650634765625,
-0.10772705078125,
0.00504302978515625,
0.0037288665771484375,
-0.0131378173828125,
-0.0301971435546875,
0.007793426513671875,
0.013336181640625,
-0.0040283203125,
-0.033355712890625,
0.028228759765625,
-0.03375244140625,
0.018096923828125,
-0.0055389404296875,
0.032135009765625,
0.0070648193359375,
0.054290771484375,
-0.005344390869140625,
0.054412841796875,
0.045257568359375,
-0.047821044921875,
0.0272979736328125,
0.05426025390625,
-0.034515380859375,
0.0200653076171875,
-0.073486328125,
-0.0011148452758789062,
-0.00240325927734375,
0.031097412109375,
-0.088623046875,
-0.0016956329345703125,
0.0214691162109375,
-0.0379638671875,
0.00173187255859375,
0.0156402587890625,
-0.05108642578125,
-0.044586181640625,
-0.031951904296875,
0.01012420654296875,
0.0518798828125,
-0.039398193359375,
0.040130615234375,
0.0196685791015625,
-0.0110015869140625,
-0.037811279296875,
-0.07904052734375,
0.005908966064453125,
-0.0297698974609375,
-0.04278564453125,
0.04901123046875,
-0.006519317626953125,
0.0078887939453125,
0.0152435302734375,
0.01384735107421875,
-0.00879669189453125,
0.00046133995056152344,
-0.0033779144287109375,
0.015655517578125,
-0.00025081634521484375,
0.004909515380859375,
0.018646240234375,
-0.0065155029296875,
0.00640106201171875,
-0.01525115966796875,
0.055755615234375,
-0.01413726806640625,
-0.005573272705078125,
-0.03790283203125,
0.01226806640625,
0.04119873046875,
-0.020721435546875,
0.07879638671875,
0.06622314453125,
-0.0231475830078125,
1.7881393432617188e-7,
-0.03326416015625,
-0.0175323486328125,
-0.0374755859375,
0.04815673828125,
-0.0189971923828125,
-0.058197021484375,
0.0294036865234375,
0.01473236083984375,
0.00423431396484375,
0.048736572265625,
0.04473876953125,
-0.009368896484375,
0.056243896484375,
0.03216552734375,
-0.01534271240234375,
0.04107666015625,
-0.03936767578125,
0.01593017578125,
-0.06671142578125,
-0.0016117095947265625,
-0.0271759033203125,
-0.0231170654296875,
-0.04638671875,
-0.0399169921875,
0.01776123046875,
0.0037899017333984375,
-0.015533447265625,
0.056549072265625,
-0.050018310546875,
0.0169830322265625,
0.050506591796875,
0.018157958984375,
0.002399444580078125,
0.005130767822265625,
-0.025604248046875,
-0.0016841888427734375,
-0.0521240234375,
-0.04498291015625,
0.06890869140625,
0.0228118896484375,
0.03497314453125,
-0.00969696044921875,
0.0626220703125,
-0.0008320808410644531,
-0.006603240966796875,
-0.037811279296875,
0.04852294921875,
-0.02239990234375,
-0.0333251953125,
-0.0259552001953125,
-0.023284912109375,
-0.0634765625,
0.031402587890625,
-0.002101898193359375,
-0.05303955078125,
0.00756072998046875,
-0.01325225830078125,
-0.024322509765625,
0.013885498046875,
-0.05718994140625,
0.07989501953125,
0.00879669189453125,
-0.007450103759765625,
-0.005290985107421875,
-0.05950927734375,
0.01049041748046875,
0.0110321044921875,
0.0063323974609375,
-0.0005359649658203125,
-0.00218963623046875,
0.065185546875,
-0.0263824462890625,
0.06304931640625,
-0.0162200927734375,
0.022674560546875,
0.0298004150390625,
-0.0189056396484375,
0.0219268798828125,
-0.005100250244140625,
-0.006923675537109375,
0.00571441650390625,
-0.0002512931823730469,
-0.0306854248046875,
-0.04534912109375,
0.050323486328125,
-0.0654296875,
-0.031585693359375,
-0.033447265625,
-0.04571533203125,
0.0025653839111328125,
0.0109710693359375,
0.03326416015625,
0.0261993408203125,
-0.00829315185546875,
0.0458984375,
0.041717529296875,
-0.0205535888671875,
0.0516357421875,
0.004032135009765625,
0.0008587837219238281,
-0.0350341796875,
0.048492431640625,
0.0005717277526855469,
0.00960540771484375,
0.041717529296875,
0.019805908203125,
-0.036163330078125,
-0.022705078125,
-0.0213775634765625,
0.031219482421875,
-0.047119140625,
-0.01029205322265625,
-0.08050537109375,
-0.033843994140625,
-0.04638671875,
0.0034694671630859375,
-0.01131439208984375,
-0.034820556640625,
-0.03485107421875,
-0.02081298828125,
0.03485107421875,
0.0296630859375,
0.0034275054931640625,
0.0304412841796875,
-0.051177978515625,
0.0205078125,
0.01416778564453125,
-0.010528564453125,
-0.00632476806640625,
-0.06829833984375,
-0.0181121826171875,
0.005413055419921875,
-0.036163330078125,
-0.0679931640625,
0.0479736328125,
0.02154541015625,
0.04449462890625,
0.005970001220703125,
0.01477813720703125,
0.05523681640625,
-0.044097900390625,
0.06536865234375,
0.0007195472717285156,
-0.0816650390625,
0.0313720703125,
-0.0025463104248046875,
0.0263824462890625,
0.0445556640625,
0.0242462158203125,
-0.03125,
-0.03912353515625,
-0.056854248046875,
-0.0811767578125,
0.047943115234375,
0.0455322265625,
0.043609619140625,
-0.02777099609375,
0.0251312255859375,
-0.01488494873046875,
0.01363372802734375,
-0.0831298828125,
-0.034759521484375,
-0.0322265625,
-0.0478515625,
-0.02679443359375,
-0.0150909423828125,
0.01177215576171875,
-0.03643798828125,
0.049652099609375,
0.009613037109375,
0.049835205078125,
0.02545166015625,
-0.0421142578125,
0.02874755859375,
0.0111083984375,
0.03363037109375,
0.0196380615234375,
-0.007289886474609375,
0.01861572265625,
0.021331787109375,
-0.0284423828125,
0.00319671630859375,
0.0400390625,
-0.00986480712890625,
0.0260467529296875,
0.03155517578125,
0.07049560546875,
0.03875732421875,
-0.0352783203125,
0.05389404296875,
-0.0035037994384765625,
-0.02337646484375,
-0.026519775390625,
-0.01003265380859375,
0.0261688232421875,
0.028167724609375,
0.0248565673828125,
-0.004512786865234375,
0.006656646728515625,
-0.025299072265625,
0.02740478515625,
0.01403045654296875,
-0.0175323486328125,
-0.00919342041015625,
0.061370849609375,
0.0020503997802734375,
-0.0127410888671875,
0.069580078125,
-0.01412200927734375,
-0.049285888671875,
0.0321044921875,
0.042510986328125,
0.0753173828125,
-0.0015993118286132812,
0.024505615234375,
0.033935546875,
0.0186309814453125,
-0.013702392578125,
-0.00687408447265625,
-0.004108428955078125,
-0.055419921875,
-0.007724761962890625,
-0.0562744140625,
0.006755828857421875,
0.00423431396484375,
-0.052276611328125,
0.0186004638671875,
-0.007160186767578125,
-0.001674652099609375,
-0.003459930419921875,
-0.01165771484375,
-0.049957275390625,
-0.0003943443298339844,
-0.0038509368896484375,
0.05584716796875,
-0.07354736328125,
0.0631103515625,
0.047210693359375,
-0.0611572265625,
-0.0487060546875,
0.007785797119140625,
-0.03399658203125,
-0.061126708984375,
0.039337158203125,
0.028961181640625,
0.0171966552734375,
0.018646240234375,
-0.032012939453125,
-0.061370849609375,
0.1077880859375,
0.0264739990234375,
-0.02862548828125,
-0.02777099609375,
0.0124359130859375,
0.042755126953125,
-0.0251617431640625,
0.0243072509765625,
0.037139892578125,
0.0220947265625,
-0.001399993896484375,
-0.055877685546875,
0.0206298828125,
-0.016815185546875,
0.018341064453125,
-0.01236724853515625,
-0.038360595703125,
0.08294677734375,
0.00504302978515625,
-0.006320953369140625,
0.0216217041015625,
0.06231689453125,
0.0235443115234375,
-0.00273895263671875,
0.0347900390625,
0.057098388671875,
0.036773681640625,
-0.0009417533874511719,
0.0703125,
-0.031494140625,
0.061614990234375,
0.07867431640625,
0.005214691162109375,
0.0782470703125,
0.045501708984375,
-0.01544952392578125,
0.056121826171875,
0.0318603515625,
-0.01434326171875,
0.060943603515625,
0.0105743408203125,
-0.004146575927734375,
0.00006967782974243164,
0.022552490234375,
-0.016143798828125,
0.03363037109375,
0.01149749755859375,
-0.05596923828125,
-0.0174560546875,
0.010101318359375,
0.0019702911376953125,
0.0069580078125,
0.00994873046875,
0.046295166015625,
0.01515960693359375,
-0.041015625,
0.0384521484375,
0.01023101806640625,
0.06878662109375,
-0.0284423828125,
0.0088653564453125,
-0.00809478759765625,
0.031951904296875,
0.00746917724609375,
-0.04571533203125,
0.03131103515625,
-0.00743865966796875,
-0.002925872802734375,
-0.0202484130859375,
0.03875732421875,
-0.047027587890625,
-0.048797607421875,
0.027679443359375,
0.037811279296875,
0.00841522216796875,
-0.00907135009765625,
-0.08782958984375,
-0.007419586181640625,
0.0035305023193359375,
-0.03863525390625,
0.0184173583984375,
0.037872314453125,
0.0262603759765625,
0.040924072265625,
0.0246429443359375,
-0.0228424072265625,
0.01702880859375,
-0.000037789344787597656,
0.0640869140625,
-0.044097900390625,
-0.036865234375,
-0.0855712890625,
0.048919677734375,
-0.0219268798828125,
-0.02655029296875,
0.0654296875,
0.03912353515625,
0.061187744140625,
-0.02239990234375,
0.046722412109375,
-0.019378662109375,
0.02032470703125,
-0.03436279296875,
0.06097412109375,
-0.0313720703125,
-0.01073455810546875,
-0.0310821533203125,
-0.07342529296875,
-0.0157318115234375,
0.08837890625,
-0.0269317626953125,
0.00585174560546875,
0.0731201171875,
0.06207275390625,
-0.016448974609375,
-0.01340484619140625,
0.01025390625,
0.02789306640625,
0.01488494873046875,
0.036773681640625,
0.0303955078125,
-0.06658935546875,
0.06207275390625,
-0.045806884765625,
-0.00743865966796875,
-0.01036834716796875,
-0.055389404296875,
-0.07366943359375,
-0.07220458984375,
-0.032806396484375,
-0.023895263671875,
-0.00853729248046875,
0.064697265625,
0.03619384765625,
-0.049957275390625,
-0.0043792724609375,
-0.034423828125,
-0.015869140625,
-0.0157928466796875,
-0.0239105224609375,
0.038543701171875,
-0.040252685546875,
-0.07635498046875,
0.00930023193359375,
-0.00732421875,
0.003574371337890625,
-0.02130126953125,
0.00936126708984375,
-0.05029296875,
0.006427764892578125,
0.038787841796875,
-0.0177001953125,
-0.055938720703125,
-0.01543426513671875,
0.000057816505432128906,
-0.02947998046875,
-0.00787353515625,
0.0237884521484375,
-0.047821044921875,
0.0123748779296875,
0.0318603515625,
0.040496826171875,
0.054046630859375,
-0.01087188720703125,
0.03118896484375,
-0.055328369140625,
0.0259857177734375,
0.0084991455078125,
0.053131103515625,
0.0293121337890625,
-0.0132598876953125,
0.0333251953125,
0.02191162109375,
-0.03900146484375,
-0.055023193359375,
-0.01375579833984375,
-0.0794677734375,
-0.024444580078125,
0.08795166015625,
-0.0328369140625,
-0.030670166015625,
0.0088348388671875,
-0.0214691162109375,
0.041717529296875,
-0.0196075439453125,
0.05755615234375,
0.064697265625,
0.0086669921875,
-0.01102447509765625,
-0.02191162109375,
0.01403045654296875,
0.038421630859375,
-0.038330078125,
-0.00467681884765625,
0.016815185546875,
0.03521728515625,
0.01947021484375,
0.0299072265625,
-0.007617950439453125,
0.0010700225830078125,
0.00803375244140625,
0.0011186599731445312,
-0.01357269287109375,
0.003780364990234375,
-0.029449462890625,
0.0106964111328125,
-0.027679443359375,
-0.037628173828125
]
] |
dbmdz/bert-base-italian-cased | 2023-09-06T22:20:14.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"it",
"dataset:wikipedia",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | dbmdz | null | null | dbmdz/bert-base-italian-cased | 14 | 14,269 | transformers | 2022-03-02T23:29:05 | ---
language: it
license: mit
datasets:
- wikipedia
---
# 🤗 + 📚 dbmdz BERT and ELECTRA models
In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open sources Italian BERT and ELECTRA models 🎉
# Italian BERT
The source data for the Italian BERT model consists of a recent Wikipedia dump and
various texts from the [OPUS corpora](http://opus.nlpl.eu/) collection. The final
training corpus has a size of 13GB and 2,050,057,573 tokens.
For sentence splitting, we use NLTK (faster than spaCy).
Our cased and uncased models were trained with an initial sequence length of 512
subwords for ~2-3M steps.
For the XXL Italian models, we use the same training data from OPUS and extend
it with data from the Italian part of the [OSCAR corpus](https://traces1.inria.fr/oscar/).
Thus, the final training corpus has a size of 81GB and 13,138,379,147 tokens.
Note: Unfortunately, a wrong vocab size was used when training the XXL models.
This explains the mismatch between the "real" vocab size of 31102 and the
vocab size specified in `config.json`. However, the model works and all
evaluations were done under those circumstances.
See [this issue](https://github.com/dbmdz/berts/issues/7) for more information.
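For context on the vocab sizes above: BERT vocabularies like this one are consumed by a WordPiece tokenizer, which segments each word greedily, longest match first, prefixing word-internal pieces with `##`. A minimal illustrative sketch (the real tokenizer ships with `transformers`; the toy vocabulary here is an assumption for illustration only):

```python
def wordpiece(word, vocab):
    # greedy longest-match-first WordPiece segmentation
    pieces, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while end > start:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # mark word-internal pieces
            if sub in vocab:
                piece = sub
                break
            end -= 1
        if piece is None:
            return ["[UNK]"]  # no segmentation found
        pieces.append(piece)
        start = end
    return pieces

toy_vocab = {"gel", "##ato", "##o", "ciao"}
print(wordpiece("gelato", toy_vocab))  # -> ['gel', '##ato']
```

This is also why the vocab-size mismatch above is benign in practice: as long as the embedding matrix is at least as large as the number of vocabulary entries actually produced by the tokenizer, every piece still maps to a valid row.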
The Italian ELECTRA model was trained on the "XXL" corpus for 1M steps in total using a batch
size of 128. We largely followed the ELECTRA training procedure used for
[BERTurk](https://github.com/stefan-it/turkish-bert/tree/master/electra).
## Model weights
Currently only PyTorch-[Transformers](https://github.com/huggingface/transformers)
compatible weights are available. If you need access to TensorFlow checkpoints,
please raise an issue!
| Model | Downloads
| ---------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------
| `dbmdz/bert-base-italian-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-cased/vocab.txt)
| `dbmdz/bert-base-italian-uncased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-uncased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-uncased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-uncased/vocab.txt)
| `dbmdz/bert-base-italian-xxl-cased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-cased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-cased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-cased/vocab.txt)
| `dbmdz/bert-base-italian-xxl-uncased` | [`config.json`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-uncased/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-uncased/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/bert-base-italian-xxl-uncased/vocab.txt)
| `dbmdz/electra-base-italian-xxl-cased-discriminator` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/dbmdz/electra-base-italian-xxl-cased-discriminator/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-discriminator/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-discriminator/vocab.txt)
| `dbmdz/electra-base-italian-xxl-cased-generator` | [`config.json`](https://s3.amazonaws.com/models.huggingface.co/bert/dbmdz/electra-base-italian-xxl-cased-generator/config.json) • [`pytorch_model.bin`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-generator/pytorch_model.bin) • [`vocab.txt`](https://cdn.huggingface.co/dbmdz/electra-base-italian-xxl-cased-generator/vocab.txt)
## Results
For results on downstream tasks like NER or PoS tagging, please refer to
[this repository](https://github.com/stefan-it/italian-bertelectra).
## Usage
With Transformers >= 2.3 our Italian BERT models can be loaded like:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/bert-base-italian-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
To load the (recommended) Italian XXL BERT models, just use:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/bert-base-italian-xxl-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
To load the Italian XXL ELECTRA model (discriminator), just use:
```python
from transformers import AutoModel, AutoTokenizer
model_name = "dbmdz/electra-base-italian-xxl-cased-discriminator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
# Huggingface model hub
All models are available on the [Huggingface model hub](https://huggingface.co/dbmdz).
# Contact (Bugs, Feedback, Contribution and more)
For questions about our BERT/ELECTRA models just open an issue
[here](https://github.com/dbmdz/berts/issues/new) 🤗
# Acknowledgments
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the [Hugging Face](https://huggingface.co/) team,
it is possible to download both cased and uncased models from their S3 storage 🤗
| 5,994 | [
[
-0.037872314453125,
-0.049285888671875,
0.01404571533203125,
-0.00011068582534790039,
-0.01654052734375,
-0.005222320556640625,
-0.0194091796875,
-0.033721923828125,
0.02490234375,
0.0224761962890625,
-0.02850341796875,
-0.050018310546875,
-0.03619384765625,
-0.00013780593872070312,
-0.0301513671875,
0.082763671875,
-0.0151519775390625,
0.0173797607421875,
0.019775390625,
-0.0183868408203125,
-0.00809478759765625,
-0.0482177734375,
-0.031219482421875,
-0.03582763671875,
0.031097412109375,
0.00931549072265625,
0.0330810546875,
0.01409912109375,
0.02777099609375,
0.0265350341796875,
-0.02642822265625,
0.0164947509765625,
-0.004085540771484375,
-0.01201629638671875,
-0.005382537841796875,
-0.0401611328125,
-0.033355712890625,
-0.0206298828125,
0.03717041015625,
0.0263671875,
-0.010406494140625,
0.0059814453125,
-0.01435089111328125,
0.045196533203125,
-0.017303466796875,
0.0100860595703125,
-0.043609619140625,
-0.0005216598510742188,
0.0016622543334960938,
0.00817108154296875,
-0.026123046875,
-0.0177001953125,
0.0297088623046875,
-0.0203094482421875,
0.04571533203125,
-0.0147552490234375,
0.10546875,
0.0026798248291015625,
-0.0099029541015625,
-0.027557373046875,
-0.032989501953125,
0.0390625,
-0.06292724609375,
0.0199432373046875,
0.0158538818359375,
0.0078277587890625,
-0.016326904296875,
-0.06219482421875,
-0.05694580078125,
-0.0163116455078125,
-0.01085662841796875,
0.007015228271484375,
-0.033233642578125,
-0.00489044189453125,
0.0245208740234375,
0.0247650146484375,
-0.0303955078125,
-0.0007195472717285156,
-0.042755126953125,
-0.02117919921875,
0.05596923828125,
-0.00804901123046875,
0.02789306640625,
-0.031768798828125,
-0.005035400390625,
-0.03497314453125,
-0.0509033203125,
0.0072021484375,
0.05426025390625,
0.03424072265625,
-0.038360595703125,
0.05230712890625,
-0.00791168212890625,
0.059967041015625,
0.00878143310546875,
0.017364501953125,
0.04571533203125,
-0.0128631591796875,
0.0012531280517578125,
-0.003910064697265625,
0.0849609375,
0.0203857421875,
0.0079193115234375,
-0.013092041015625,
-0.007030487060546875,
-0.0255584716796875,
0.00859832763671875,
-0.059417724609375,
-0.0293426513671875,
0.032012939453125,
-0.026092529296875,
-0.02105712890625,
-0.005496978759765625,
-0.0670166015625,
-0.010772705078125,
-0.006984710693359375,
0.0546875,
-0.053802490234375,
-0.020782470703125,
0.02044677734375,
-0.00571441650390625,
0.0225982666015625,
0.00792694091796875,
-0.07183837890625,
-0.0012559890747070312,
0.0114288330078125,
0.071533203125,
0.01325225830078125,
-0.01434326171875,
0.002147674560546875,
-0.024322509765625,
-0.0086517333984375,
0.04913330078125,
0.0149078369140625,
-0.0173492431640625,
0.01348876953125,
0.0263824462890625,
-0.0311279296875,
-0.024444580078125,
0.04742431640625,
-0.025970458984375,
0.0269622802734375,
-0.0211944580078125,
-0.032470703125,
-0.032562255859375,
-0.00502777099609375,
-0.0517578125,
0.09149169921875,
0.0187530517578125,
-0.06475830078125,
0.0186920166015625,
-0.05120849609375,
-0.03729248046875,
-0.00632476806640625,
0.0082855224609375,
-0.06304931640625,
0.010223388671875,
0.01800537109375,
0.040191650390625,
-0.00312042236328125,
0.0029506683349609375,
-0.01390838623046875,
-0.007354736328125,
0.0251617431640625,
-0.0131378173828125,
0.083251953125,
0.0170440673828125,
-0.046905517578125,
0.003520965576171875,
-0.06121826171875,
-0.0008802413940429688,
0.0210418701171875,
-0.0240325927734375,
0.0117950439453125,
0.01397705078125,
0.0269775390625,
0.0180206298828125,
0.0260009765625,
-0.041259765625,
0.0178680419921875,
-0.0421142578125,
0.061279296875,
0.046142578125,
-0.024688720703125,
0.00984954833984375,
-0.0229949951171875,
0.0203399658203125,
0.009857177734375,
-0.00815582275390625,
0.0019025802612304688,
-0.03204345703125,
-0.098876953125,
-0.02593994140625,
0.0352783203125,
0.0413818359375,
-0.060943603515625,
0.06842041015625,
-0.025054931640625,
-0.05303955078125,
-0.0404052734375,
-0.002140045166015625,
0.012054443359375,
0.0206298828125,
0.031463623046875,
-0.01284027099609375,
-0.050445556640625,
-0.07623291015625,
0.0011463165283203125,
-0.0142822265625,
-0.0008635520935058594,
0.0269927978515625,
0.072998046875,
-0.00826263427734375,
0.0626220703125,
-0.0172576904296875,
-0.018157958984375,
-0.038177490234375,
0.00696563720703125,
0.049102783203125,
0.0452880859375,
0.056640625,
-0.03326416015625,
-0.0284271240234375,
0.01336669921875,
-0.040557861328125,
0.01268768310546875,
0.0019063949584960938,
-0.01555633544921875,
0.022857666015625,
0.02044677734375,
-0.057861328125,
0.0221710205078125,
0.0270843505859375,
-0.040374755859375,
0.034393310546875,
-0.032470703125,
-0.0030422210693359375,
-0.10693359375,
0.0128631591796875,
0.01300811767578125,
-0.01300811767578125,
-0.032073974609375,
0.00901031494140625,
0.0012998580932617188,
0.01253509521484375,
-0.048431396484375,
0.038848876953125,
-0.0240020751953125,
0.0099639892578125,
0.015899658203125,
-0.00054168701171875,
-0.003292083740234375,
0.042755126953125,
0.0081024169921875,
0.05810546875,
0.045196533203125,
-0.034637451171875,
0.03900146484375,
0.0262908935546875,
-0.043426513671875,
0.0105438232421875,
-0.0670166015625,
0.01593017578125,
0.00931549072265625,
0.00959014892578125,
-0.06396484375,
-0.0169677734375,
0.031463623046875,
-0.042724609375,
0.026458740234375,
-0.0243988037109375,
-0.062225341796875,
-0.047943115234375,
-0.0009822845458984375,
-0.00936126708984375,
0.042999267578125,
-0.059783935546875,
0.061492919921875,
0.014617919921875,
0.005321502685546875,
-0.031524658203125,
-0.0576171875,
-0.00727081298828125,
-0.0308685302734375,
-0.050506591796875,
0.038909912109375,
0.004764556884765625,
0.0087432861328125,
-0.005901336669921875,
0.0038280487060546875,
-0.012054443359375,
-0.002033233642578125,
-0.003101348876953125,
0.031341552734375,
-0.005260467529296875,
0.00131988525390625,
0.0001443624496459961,
0.012359619140625,
-0.00917816162109375,
-0.0176544189453125,
0.0701904296875,
-0.0238800048828125,
-0.0013093948364257812,
-0.048797607421875,
0.006267547607421875,
0.043548583984375,
-0.0182342529296875,
0.07293701171875,
0.059539794921875,
-0.020416259765625,
-0.006923675537109375,
-0.046661376953125,
-0.00850677490234375,
-0.03302001953125,
0.0295867919921875,
-0.032318115234375,
-0.0484619140625,
0.061309814453125,
0.024932861328125,
0.0163421630859375,
0.0511474609375,
0.060546875,
-0.01496124267578125,
0.0859375,
0.03955078125,
-0.0067901611328125,
0.05963134765625,
-0.0640869140625,
0.018341064453125,
-0.058563232421875,
-0.0184478759765625,
-0.049407958984375,
-0.01320648193359375,
-0.052032470703125,
-0.0239105224609375,
0.0182647705078125,
0.01503753662109375,
-0.0135650634765625,
0.0533447265625,
-0.0670166015625,
0.01259613037109375,
0.050750732421875,
0.036407470703125,
-0.00714874267578125,
0.036041259765625,
-0.03271484375,
0.00868988037109375,
-0.060821533203125,
-0.0382080078125,
0.09637451171875,
0.022735595703125,
0.02825927734375,
0.0060272216796875,
0.052215576171875,
-0.006134033203125,
0.004863739013671875,
-0.056610107421875,
0.031036376953125,
-0.0137176513671875,
-0.0633544921875,
0.004825592041015625,
-0.027435302734375,
-0.07275390625,
0.033660888671875,
-0.0095977783203125,
-0.079345703125,
0.0260467529296875,
0.0019893646240234375,
-0.03729248046875,
0.035675048828125,
-0.07012939453125,
0.0736083984375,
-0.0222320556640625,
-0.0291748046875,
-0.01117706298828125,
-0.0310516357421875,
-0.00121307373046875,
0.0205535888671875,
0.004314422607421875,
-0.006343841552734375,
0.018585205078125,
0.07159423828125,
-0.05224609375,
0.05596923828125,
-0.0025272369384765625,
-0.0002818107604980469,
0.032257080078125,
-0.0166778564453125,
0.0321044921875,
-0.008819580078125,
-0.01324462890625,
0.037384033203125,
0.01088714599609375,
-0.025482177734375,
-0.012603759765625,
0.0428466796875,
-0.0777587890625,
-0.0156402587890625,
-0.047454833984375,
-0.035552978515625,
0.0030879974365234375,
0.020233154296875,
0.047760009765625,
0.025848388671875,
0.0060882568359375,
0.0191802978515625,
0.050750732421875,
-0.025848388671875,
0.0439453125,
0.04095458984375,
-0.00037598609924316406,
-0.033599853515625,
0.058349609375,
0.01001739501953125,
-0.002872467041015625,
0.01242828369140625,
-0.0001233816146850586,
-0.0269012451171875,
-0.0498046875,
-0.046417236328125,
0.025543212890625,
-0.040771484375,
-0.0139617919921875,
-0.0576171875,
-0.010986328125,
-0.0352783203125,
0.0085906982421875,
-0.0256805419921875,
-0.0309600830078125,
-0.03265380859375,
-0.0215606689453125,
0.06207275390625,
0.020965576171875,
-0.00641632080078125,
0.03314208984375,
-0.038482666015625,
0.01523590087890625,
-0.0030994415283203125,
0.0164947509765625,
-0.01467132568359375,
-0.047943115234375,
-0.02850341796875,
0.0139923095703125,
-0.0286712646484375,
-0.07159423828125,
0.05108642578125,
0.0018253326416015625,
0.033935546875,
0.01055908203125,
-0.01514434814453125,
0.047576904296875,
-0.0452880859375,
0.056488037109375,
0.01983642578125,
-0.06793212890625,
0.03955078125,
-0.024932861328125,
-0.0008196830749511719,
0.03662109375,
0.019561767578125,
-0.0284881591796875,
-0.00927734375,
-0.055755615234375,
-0.088134765625,
0.0721435546875,
0.0262908935546875,
0.026947021484375,
-0.004596710205078125,
0.023712158203125,
-0.00927734375,
0.00799560546875,
-0.0650634765625,
-0.0286712646484375,
-0.018829345703125,
-0.0362548828125,
-0.002201080322265625,
-0.0016078948974609375,
-0.01448822021484375,
-0.046600341796875,
0.0670166015625,
0.003795623779296875,
0.0391845703125,
0.033233642578125,
-0.01898193359375,
0.005985260009765625,
-0.0015277862548828125,
0.0289154052734375,
0.031158447265625,
-0.0161285400390625,
-0.01837158203125,
0.020050048828125,
-0.0301361083984375,
-0.006107330322265625,
0.0352783203125,
-0.029205322265625,
0.034912109375,
0.018035888671875,
0.07891845703125,
0.0098876953125,
-0.046478271484375,
0.02520751953125,
-0.0059814453125,
-0.02606201171875,
-0.0428466796875,
0.0014438629150390625,
0.00678253173828125,
0.017333984375,
0.0262298583984375,
-0.002471923828125,
-0.005893707275390625,
-0.0318603515625,
0.013427734375,
0.0299224853515625,
-0.016510009765625,
-0.02252197265625,
0.041839599609375,
0.009765625,
-0.0110626220703125,
0.049346923828125,
-0.009307861328125,
-0.061309814453125,
0.055755615234375,
0.01531219482421875,
0.062225341796875,
0.0020008087158203125,
0.016510009765625,
0.050140380859375,
0.02685546875,
-0.00856781005859375,
0.020538330078125,
-0.004486083984375,
-0.071533203125,
-0.00909423828125,
-0.05853271484375,
-0.00334930419921875,
0.013275146484375,
-0.044921875,
0.031829833984375,
-0.0262908935546875,
-0.0233306884765625,
0.00661468505859375,
0.0196533203125,
-0.06317138671875,
-0.0006279945373535156,
-0.0098419189453125,
0.052734375,
-0.058685302734375,
0.06353759765625,
0.048980712890625,
-0.05926513671875,
-0.079345703125,
-0.035064697265625,
-0.034759521484375,
-0.064453125,
0.045440673828125,
0.00962066650390625,
0.0196990966796875,
0.00775146484375,
-0.036895751953125,
-0.057159423828125,
0.07568359375,
0.0208740234375,
-0.037445068359375,
-0.0164642333984375,
-0.0010519027709960938,
0.045867919921875,
-0.00826263427734375,
0.040679931640625,
0.044921875,
0.0228118896484375,
0.01169586181640625,
-0.061279296875,
-0.006072998046875,
-0.031280517578125,
-0.014007568359375,
0.006267547607421875,
-0.051361083984375,
0.07257080078125,
-0.01284027099609375,
-0.0039825439453125,
0.029754638671875,
0.0404052734375,
0.01210784912109375,
-0.016998291015625,
0.0216064453125,
0.06683349609375,
0.043121337890625,
-0.0297088623046875,
0.08123779296875,
-0.046661376953125,
0.06341552734375,
0.0504150390625,
0.01027679443359375,
0.057952880859375,
0.030426025390625,
-0.033538818359375,
0.0546875,
0.0628662109375,
-0.030487060546875,
0.020782470703125,
0.00958251953125,
-0.0238189697265625,
-0.006298065185546875,
0.0135650634765625,
-0.041046142578125,
0.0310516357421875,
0.0230865478515625,
-0.043914794921875,
-0.0080108642578125,
-0.003692626953125,
0.020050048828125,
-0.0244598388671875,
-0.00533294677734375,
0.045501708984375,
-0.0130157470703125,
-0.0406494140625,
0.0584716796875,
0.006046295166015625,
0.0640869140625,
-0.04302978515625,
0.00823211669921875,
-0.00489044189453125,
0.0265960693359375,
-0.016815185546875,
-0.035888671875,
0.00279998779296875,
0.00934600830078125,
-0.00577545166015625,
-0.011810302734375,
0.021728515625,
-0.0296173095703125,
-0.0667724609375,
0.020782470703125,
0.025726318359375,
0.0223236083984375,
0.01251220703125,
-0.073974609375,
0.0050048828125,
0.0137786865234375,
-0.02685546875,
0.0081787109375,
0.0205841064453125,
0.0257568359375,
0.041259765625,
0.057403564453125,
0.01520538330078125,
0.0215911865234375,
0.01096343994140625,
0.05291748046875,
-0.0307769775390625,
-0.0199127197265625,
-0.055938720703125,
0.05584716796875,
-0.0157623291015625,
-0.03692626953125,
0.05389404296875,
0.04010009765625,
0.0704345703125,
-0.00460052490234375,
0.052764892578125,
-0.0396728515625,
0.02764892578125,
-0.033966064453125,
0.07037353515625,
-0.0478515625,
0.004093170166015625,
-0.019561767578125,
-0.053497314453125,
-0.01064300537109375,
0.090087890625,
-0.029022216796875,
0.01041412353515625,
0.04449462890625,
0.064453125,
0.0124053955078125,
-0.02044677734375,
-0.0037746429443359375,
0.0309600830078125,
0.02734375,
0.04986572265625,
0.03790283203125,
-0.060333251953125,
0.0396728515625,
-0.046356201171875,
0.004528045654296875,
-0.01898193359375,
-0.0452880859375,
-0.076416015625,
-0.047943115234375,
-0.0298919677734375,
-0.0518798828125,
-0.003662109375,
0.07379150390625,
0.059906005859375,
-0.075927734375,
-0.024566650390625,
-0.036407470703125,
0.01363372802734375,
-0.00855255126953125,
-0.0211181640625,
0.043853759765625,
-0.0161285400390625,
-0.0806884765625,
0.007080078125,
-0.0209197998046875,
0.01442718505859375,
-0.00949859619140625,
0.010467529296875,
-0.02264404296875,
-0.01251220703125,
0.0274505615234375,
0.031036376953125,
-0.04913330078125,
-0.019439697265625,
-0.007152557373046875,
-0.0025081634521484375,
0.01166534423828125,
0.035888671875,
-0.041839599609375,
0.035675048828125,
0.0452880859375,
0.012786865234375,
0.052520751953125,
-0.03424072265625,
0.037109375,
-0.0277099609375,
0.03363037109375,
0.0213775634765625,
0.06292724609375,
0.0292816162109375,
-0.01221466064453125,
0.035308837890625,
0.0143280029296875,
-0.0240936279296875,
-0.06573486328125,
-0.004638671875,
-0.0684814453125,
-0.0162353515625,
0.068359375,
-0.018646240234375,
-0.022247314453125,
0.0137786865234375,
-0.0225677490234375,
0.043914794921875,
-0.026092529296875,
0.06573486328125,
0.06463623046875,
-0.005313873291015625,
-0.00872802734375,
-0.04046630859375,
0.02178955078125,
0.0413818359375,
-0.045196533203125,
-0.01306915283203125,
0.0257568359375,
0.035400390625,
0.026947021484375,
0.0147705078125,
-0.01548004150390625,
0.02203369140625,
-0.00804901123046875,
0.036956787109375,
0.00421142578125,
0.00018298625946044922,
-0.0262603759765625,
-0.009124755859375,
-0.0237884521484375,
-0.01323699951171875
]
] |
TheBloke/Llama-2-13B-fp16 | 2023-07-20T09:47:08.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"facebook",
"meta",
"llama-2",
"en",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Llama-2-13B-fp16 | 51 | 14,243 | transformers | 2023-07-18T19:30:51 | ---
inference: false
language:
- en
pipeline_tag: text-generation
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
---
<!-- header start -->
<div style="width: 100%;">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p><a href="https://discord.gg/theblokeai">Chat & support: my new Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<!-- header end -->
# Meta's Llama 2 13B fp16
These files are fp16 format model files for [Meta's Llama 2 13B](https://huggingface.co/meta-llama/Llama-2-13b-hf).
They were produced by downloading the PTH files from Meta and converting them to the Hugging Face format with Transformers 4.32.0.dev0 (installed from Git, with the Llama 2 PR included: https://github.com/huggingface/transformers/pull/24891).
The command used for the conversion was:
```
python3 /workspace/venv/pytorch2/lib/python3.10/site-packages/transformers/models/llama/convert_llama_weights_to_hf.py --input_dir /workspace/git/llama/download --model_size 13B --output_dir /workspace/process/llama-2-13b/source
```
## Repositories available
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Llama-2-13B-GPTQ)
* [Original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/meta-llama/Llama-2-13b-hf)
* [My fp16 conversion of the unquantised PTH model files](https://huggingface.co/TheBloke/Llama-2-13B-fp16)
## Prompt template: None
```
{prompt}
```
<!-- footer start -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute.
Thanks to the [chirper.ai](https://chirper.ai) team!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Luke from CarbonQuill, Aemon Algiz.
**Patreon special mentions**: Space Cruiser, Nikolai Manek, Sam, Chris McCloskey, Rishabh Srivastava, Kalila, Spiking Neurons AB, Khalefa Al-Ahmad, WelcomeToTheClub, Chadd, Lone Striker, Viktor Bowallius, Edmond Seymore, Ai Maven, Chris Smitley, Dave, Alexandros Triantafyllidis, Luke @flexchar, Elle, ya boyyy, Talal Aujan, Alex , Jonathan Leane, Deep Realms, Randy H, subjectnull, Preetika Verma, Joseph William Delisle, Michael Levine, chris gileta, K, Oscar Rangel, LangChain4j, Trenton Dambrowitz, Eugene Pentland, Johann-Peter Hartmann, Femi Adebogun, Illia Dulskyi, senxiiz, Daniel P. Andersen, Sean Connelly, Artur Olbinski, RoA, Mano Prime, Derek Yates, Raven Klaugh, David Flickinger, Willem Michiel, Pieter, Willian Hasse, vamX, Luke Pendergrass, webtim, Ghost , Rainer Wilmers, Nathan LeClaire, Will Dee, Cory Kujawski, John Detwiler, Fred von Graf, biorpg, Iucharbius , Imad Khwaja, Pierre Kircher, terasurfer , Asp the Wyvern, John Villwock, theTransient, zynix , Gabriel Tamborski, Fen Risland, Gabriel Puliatti, Matthew Berman, Pyrater, SuperWojo, Stephen Murray, Karl Bernard, Ajan Kanaga, Greatston Gnanesh, Junyu Yang.
Thank you to all my generous patrons and donaters!
<!-- footer end -->
# Original model card: Meta's Llama 2 13B
# **Llama 2**
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 13B pretrained model, converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes (7B, 13B, and 70B) as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The bigger model (70B) uses Grouped-Query Attention (GQA) for improved inference scalability.
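As an illustration of why GQA helps, here is a toy NumPy sketch (not Meta's implementation, with arbitrary head counts and sizes) in which several query heads share each key/value head, so the KV cache shrinks by the sharing factor:

```python
import numpy as np

def grouped_query_attention(q, k, v, n_kv_heads):
    """Toy grouped-query attention: q has more heads than k/v,
    and each k/v head is shared by a contiguous group of query heads."""
    n_q_heads, seq_len, d = q.shape
    group = n_q_heads // n_kv_heads
    # Broadcast each cached KV head to its group of query heads.
    k = np.repeat(k, group, axis=0)   # (n_q_heads, seq, d)
    v = np.repeat(v, group, axis=0)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    # Numerically stable softmax over the key axis.
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v                       # (n_q_heads, seq, d)

rng = np.random.default_rng(0)
out = grouped_query_attention(
    rng.normal(size=(8, 4, 16)),   # 8 query heads
    rng.normal(size=(2, 4, 16)),   # but only 2 KV heads to cache
    rng.normal(size=(2, 4, 16)),
    n_kv_heads=2,
)
```

At inference time only the 2 KV heads need to be cached per layer, a 4x reduction in this sketch, while the query side keeps its full head count.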
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
To get the expected features and performance for the chat versions, a specific format needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespace and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). See our reference code in GitHub for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212).
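As a minimal sketch of that single-turn layout (the reference `chat_completion` code linked above is authoritative; the helper below is hypothetical, the tag strings follow the format that code describes, and the `BOS`/`EOS` tokens are added by the tokenizer rather than by this function):

```python
# Llama-2-chat prompt tags, per the reference generation code.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_chat_prompt(system: str, user: str) -> str:
    """Hypothetical helper: wrap one system + user message in the
    Llama-2-chat tags; strip() the inputs to avoid double spaces."""
    return f"{B_INST} {B_SYS}{system.strip()}{E_SYS}{user.strip()} {E_INST}"

prompt = build_chat_prompt("You are a helpful assistant.", "Summarise GQA. ")
```

Note that this fp16 repo holds the *base* model (prompt template: none); the tagged format only applies when prompting the chat-tuned variants.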
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted (tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
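The table's figures can be reproduced from GPU-hours and device power if one assumes a facility overhead (PUE) of about 1.1 and a grid carbon intensity of about 0.385 kgCO2eq/kWh; both constants are assumptions here, taken from Meta's earlier Llama reporting rather than stated in this card:

```python
def estimated_tco2eq(gpu_hours: float, watts: float,
                     pue: float = 1.1, kg_co2_per_kwh: float = 0.385) -> float:
    """Back-of-envelope emissions: energy (kWh) x facility overhead (PUE,
    assumed 1.1) x grid carbon intensity (assumed 0.385 kgCO2eq/kWh)."""
    kwh = gpu_hours * watts / 1000.0
    return kwh * pue * kg_co2_per_kwh / 1000.0  # tonnes CO2eq

# 13B: 368,640 GPU hours at 400 W gives roughly 62.4 t, matching the table.
thirteen_b = estimated_tco2eq(368_640, 400)
```

Under these assumed constants the 7B, 13B, and 70B rows all come out within a rounding error of the published 31.22, 62.44, and 291.42 tCO2eq.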
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all of these evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
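The pass@1 code scores above are conventionally computed with the unbiased pass@k estimator introduced alongside HumanEval (the card does not spell this out, so treat it as the standard convention rather than a confirmed detail of Meta's harness):

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k: probability that at least one of k samples passes,
    given that c of the n generated samples pass the unit tests."""
    if n - c < k:
        return 1.0  # fewer failures than draws: a pass is guaranteed
    return 1.0 - comb(n - c, k) / comb(n, k)

# With 200 samples per problem and 30 passing, pass@1 is simply 30/200.
score = pass_at_k(200, 30, 1)  # approximately 0.15
```

For k = 1 the estimator reduces to the pass fraction c/n; larger k rewards models whose failures are concentrated on a few hard problems.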
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide)
## Reporting Issues
Please report any software “bug,” or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat-hf)|
|70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat-hf)|
[
-0.0247802734375,
-0.0545654296875,
0.0225677490234375,
0.02313232421875,
-0.032989501953125,
0.00612640380859375,
-0.007213592529296875,
-0.052032470703125,
0.0277557373046875,
0.0260162353515625,
-0.05877685546875,
-0.0262908935546875,
-0.044097900390625,
0.01213836669921875,
-0.0246734619140625,
0.081298828125,
0.0124359130859375,
-0.028045654296875,
-0.0072479248046875,
0.0173187255859375,
-0.032440185546875,
-0.0283966064453125,
-0.041107177734375,
-0.04254150390625,
0.03924560546875,
0.0240325927734375,
0.05419921875,
0.048065185546875,
0.03753662109375,
0.0237274169921875,
-0.0196990966796875,
0.0203094482421875,
-0.053955078125,
-0.01544189453125,
0.0117034912109375,
-0.032989501953125,
-0.054046630859375,
-0.00004988908767700195,
0.0261993408203125,
0.012237548828125,
-0.0244598388671875,
0.0255126953125,
0.00495147705078125,
0.032501220703125,
-0.0222625732421875,
0.019561767578125,
-0.0482177734375,
-0.00583648681640625,
-0.015716552734375,
0.0027751922607421875,
-0.01110076904296875,
-0.0126190185546875,
-0.01495361328125,
-0.07183837890625,
-0.00846099853515625,
0.0040130615234375,
0.09027099609375,
0.0389404296875,
-0.0268707275390625,
0.00025010108947753906,
-0.037353515625,
0.05670166015625,
-0.0670166015625,
0.01529693603515625,
0.039306640625,
0.014801025390625,
-0.0210723876953125,
-0.0751953125,
-0.059173583984375,
-0.00719451904296875,
-0.0029239654541015625,
0.0178375244140625,
-0.042999267578125,
-0.007404327392578125,
0.0017976760864257812,
0.032989501953125,
-0.033935546875,
0.02581787109375,
-0.03326416015625,
-0.006328582763671875,
0.06134033203125,
0.005462646484375,
0.0171356201171875,
-0.01346588134765625,
-0.0276641845703125,
-0.026702880859375,
-0.0582275390625,
0.00606536865234375,
0.03070068359375,
0.006999969482421875,
-0.058837890625,
0.05670166015625,
-0.0139923095703125,
0.0210723876953125,
0.0192108154296875,
-0.0285186767578125,
0.0222930908203125,
-0.04327392578125,
-0.019622802734375,
-0.0240325927734375,
0.0816650390625,
0.04425048828125,
0.00536346435546875,
0.0160369873046875,
-0.005161285400390625,
0.006572723388671875,
0.00145721435546875,
-0.0631103515625,
-0.01036834716796875,
0.019622802734375,
-0.04290771484375,
-0.0404052734375,
-0.025238037109375,
-0.057830810546875,
-0.0265045166015625,
-0.0025844573974609375,
0.0097198486328125,
-0.01470947265625,
-0.03863525390625,
0.0088958740234375,
-0.00464630126953125,
0.03863525390625,
0.0286407470703125,
-0.07025146484375,
0.0100555419921875,
0.0462646484375,
0.05126953125,
0.01265716552734375,
-0.0208282470703125,
-0.00980377197265625,
0.006351470947265625,
-0.0211181640625,
0.06036376953125,
-0.0276641845703125,
-0.037567138671875,
-0.0139312744140625,
0.004375457763671875,
0.0156707763671875,
-0.03057861328125,
0.0271453857421875,
-0.023040771484375,
0.0120849609375,
-0.0181121826171875,
-0.0208892822265625,
-0.01727294921875,
0.01050567626953125,
-0.037353515625,
0.08880615234375,
0.0198974609375,
-0.043182373046875,
0.00769805908203125,
-0.0435791015625,
-0.021759033203125,
-0.002010345458984375,
0.00643157958984375,
-0.03363037109375,
-0.009002685546875,
0.016326904296875,
0.021728515625,
-0.051239013671875,
0.0289154052734375,
-0.0199127197265625,
-0.0233612060546875,
0.02105712890625,
-0.0277862548828125,
0.06549072265625,
0.01470947265625,
-0.03265380859375,
-0.0015716552734375,
-0.048065185546875,
-0.0165863037109375,
0.043731689453125,
-0.037353515625,
0.0222625732421875,
0.00911712646484375,
0.0061187744140625,
0.0076446533203125,
0.039093017578125,
-0.028045654296875,
0.0198822021484375,
-0.0294952392578125,
0.046142578125,
0.061126708984375,
-0.0077667236328125,
0.01971435546875,
-0.039276123046875,
0.036865234375,
-0.00954437255859375,
0.02679443359375,
0.0010204315185546875,
-0.06341552734375,
-0.07684326171875,
-0.018463134765625,
0.0034694671630859375,
0.047576904296875,
-0.03656005859375,
0.052581787109375,
-0.01018524169921875,
-0.060272216796875,
-0.044403076171875,
0.0161895751953125,
0.034332275390625,
0.025421142578125,
0.02978515625,
-0.0228118896484375,
-0.052001953125,
-0.06805419921875,
0.01222991943359375,
-0.046600341796875,
-0.00672149658203125,
0.03387451171875,
0.042724609375,
-0.04595947265625,
0.051544189453125,
-0.034820556640625,
-0.027862548828125,
-0.033294677734375,
-0.0186309814453125,
0.01654052734375,
0.0343017578125,
0.049041748046875,
-0.04046630859375,
-0.0208587646484375,
0.006069183349609375,
-0.053955078125,
-0.0099029541015625,
-0.0018815994262695312,
-0.0186614990234375,
0.01529693603515625,
0.01580810546875,
-0.06982421875,
0.034210205078125,
0.058074951171875,
-0.0250396728515625,
0.02850341796875,
-0.0007920265197753906,
-0.01325225830078125,
-0.0877685546875,
0.00574493408203125,
-0.00585174560546875,
-0.0081634521484375,
-0.0428466796875,
-0.00567626953125,
-0.02459716796875,
0.0035877227783203125,
-0.03076171875,
0.0479736328125,
-0.021728515625,
0.0015726089477539062,
-0.00907135009765625,
0.004730224609375,
0.0063323974609375,
0.04522705078125,
-0.01348114013671875,
0.0616455078125,
0.0298919677734375,
-0.036376953125,
0.03106689453125,
0.048370361328125,
-0.024993896484375,
0.01503753662109375,
-0.07330322265625,
0.02252197265625,
0.0013628005981445312,
0.0479736328125,
-0.08184814453125,
-0.03204345703125,
0.04595947265625,
-0.049285888671875,
0.0191192626953125,
-0.0021648406982421875,
-0.04266357421875,
-0.030731201171875,
-0.029266357421875,
0.0248565673828125,
0.06134033203125,
-0.039276123046875,
0.03173828125,
0.03656005859375,
-0.00241851806640625,
-0.04962158203125,
-0.0650634765625,
0.00788116455078125,
-0.0198974609375,
-0.03912353515625,
0.032989501953125,
-0.0108489990234375,
-0.01268768310546875,
-0.009613037109375,
0.0145111083984375,
0.0026111602783203125,
0.01490020751953125,
0.0287017822265625,
0.02349853515625,
-0.006603240966796875,
-0.0162506103515625,
0.00920867919921875,
-0.0049285888671875,
-0.00548553466796875,
0.00038909912109375,
0.0633544921875,
-0.01922607421875,
-0.01513671875,
-0.06732177734375,
0.0073394775390625,
0.0360107421875,
-0.01837158203125,
0.0494384765625,
0.0413818359375,
-0.0222015380859375,
0.0153961181640625,
-0.054168701171875,
-0.0119171142578125,
-0.040863037109375,
0.035247802734375,
-0.004398345947265625,
-0.0687255859375,
0.033294677734375,
0.0109710693359375,
0.0247802734375,
0.047454833984375,
0.053497314453125,
-0.0171356201171875,
0.06402587890625,
0.053955078125,
-0.0129241943359375,
0.0404052734375,
-0.040313720703125,
0.00959014892578125,
-0.062347412109375,
-0.04388427734375,
-0.03057861328125,
-0.037994384765625,
-0.05499267578125,
-0.04815673828125,
0.0183563232421875,
0.006267547607421875,
-0.0384521484375,
0.032745361328125,
-0.038543701171875,
0.0209808349609375,
0.0316162109375,
0.01113128662109375,
0.01383209228515625,
0.01308441162109375,
0.020416259765625,
0.004589080810546875,
-0.042938232421875,
-0.044647216796875,
0.086669921875,
0.040435791015625,
0.038177490234375,
0.015716552734375,
0.048370361328125,
0.00954437255859375,
0.0284423828125,
-0.04974365234375,
0.04248046875,
-0.007038116455078125,
-0.0509033203125,
-0.013031005859375,
-0.01029205322265625,
-0.07012939453125,
0.01024627685546875,
-0.01800537109375,
-0.05548095703125,
0.0161590576171875,
-0.00009608268737792969,
-0.0204925537109375,
0.0250701904296875,
-0.032012939453125,
0.04608154296875,
-0.024078369140625,
-0.01433563232421875,
-0.0243682861328125,
-0.059417724609375,
0.040740966796875,
-0.0009918212890625,
0.01397705078125,
-0.0276947021484375,
-0.018310546875,
0.05908203125,
-0.04150390625,
0.09051513671875,
0.0011930465698242188,
-0.0195770263671875,
0.0498046875,
-0.0037059783935546875,
0.0413818359375,
0.01233673095703125,
-0.00577545166015625,
0.041961669921875,
-0.0101318359375,
-0.013641357421875,
-0.01447296142578125,
0.0352783203125,
-0.09771728515625,
-0.05511474609375,
-0.019134521484375,
-0.030364990234375,
0.0175933837890625,
0.00858306884765625,
0.028167724609375,
-0.0005726814270019531,
-0.0028667449951171875,
0.025054931640625,
0.020660400390625,
-0.031280517578125,
0.037750244140625,
0.0220184326171875,
-0.009246826171875,
-0.0479736328125,
0.059906005859375,
-0.01425933837890625,
0.013641357421875,
0.028289794921875,
0.00811004638671875,
-0.031829833984375,
-0.0181884765625,
-0.0394287109375,
0.04583740234375,
-0.040252685546875,
-0.040771484375,
-0.039276123046875,
-0.0205535888671875,
-0.0306396484375,
-0.00276947021484375,
-0.03826904296875,
-0.03643798828125,
-0.066162109375,
-0.006938934326171875,
0.052215576171875,
0.057525634765625,
-0.0078277587890625,
0.048553466796875,
-0.035491943359375,
0.020599365234375,
0.02069091796875,
0.00960540771484375,
0.0015716552734375,
-0.0638427734375,
0.0031223297119140625,
0.01332855224609375,
-0.04620361328125,
-0.052734375,
0.03961181640625,
0.02154541015625,
0.0298004150390625,
0.0282440185546875,
-0.011077880859375,
0.06268310546875,
-0.032440185546875,
0.076904296875,
0.032440185546875,
-0.060089111328125,
0.043243408203125,
-0.031646728515625,
-0.0028324127197265625,
0.02978515625,
0.0174407958984375,
-0.02825927734375,
-0.02215576171875,
-0.049102783203125,
-0.05145263671875,
0.054931640625,
0.0252532958984375,
0.027130126953125,
-0.002887725830078125,
0.03826904296875,
-0.00548553466796875,
0.0116729736328125,
-0.077392578125,
-0.0300750732421875,
-0.029937744140625,
-0.00909423828125,
-0.003124237060546875,
-0.0272064208984375,
-0.005802154541015625,
-0.031097412109375,
0.05877685546875,
-0.007205963134765625,
0.042144775390625,
0.00630950927734375,
0.01073455810546875,
-0.018524169921875,
0.002223968505859375,
0.04541015625,
0.038055419921875,
-0.0025272369384765625,
-0.01200103759765625,
0.0236053466796875,
-0.03717041015625,
0.01441192626953125,
0.0073394775390625,
-0.01256561279296875,
-0.0195465087890625,
0.0272674560546875,
0.07147216796875,
0.01312255859375,
-0.05126953125,
0.039215087890625,
0.0017528533935546875,
-0.0193023681640625,
-0.03155517578125,
0.01300811767578125,
0.025146484375,
0.03802490234375,
0.022979736328125,
-0.01284027099609375,
-0.00628662109375,
-0.03497314453125,
-0.007038116455078125,
0.032135009765625,
-0.0005183219909667969,
-0.0380859375,
0.0772705078125,
0.0193634033203125,
-0.0306396484375,
0.0487060546875,
-0.00568389892578125,
-0.0304412841796875,
0.06378173828125,
0.060150146484375,
0.0556640625,
-0.0136566162109375,
0.01806640625,
0.042144775390625,
0.032806396484375,
-0.005947113037109375,
0.0147552490234375,
-0.0001100301742553711,
-0.04302978515625,
-0.0255584716796875,
-0.054168701171875,
-0.0285797119140625,
0.02874755859375,
-0.0297698974609375,
0.032379150390625,
-0.05352783203125,
-0.0220184326171875,
-0.0216064453125,
0.0121002197265625,
-0.04266357421875,
0.002796173095703125,
0.0220489501953125,
0.059356689453125,
-0.04833984375,
0.05816650390625,
0.04132080078125,
-0.036651611328125,
-0.06500244140625,
-0.0154571533203125,
0.008636474609375,
-0.0777587890625,
0.033172607421875,
0.016632080078125,
-0.0010051727294921875,
0.0102386474609375,
-0.0654296875,
-0.086669921875,
0.1171875,
0.0269317626953125,
-0.05047607421875,
-0.007373809814453125,
0.01995849609375,
0.040924072265625,
-0.02362060546875,
0.027557373046875,
0.053192138671875,
0.03765869140625,
0.006671905517578125,
-0.08056640625,
0.0117034912109375,
-0.032806396484375,
0.0084686279296875,
-0.011749267578125,
-0.09552001953125,
0.061187744140625,
-0.0240325927734375,
-0.01354217529296875,
0.031341552734375,
0.059600830078125,
0.048004150390625,
0.01325225830078125,
0.033660888671875,
0.044708251953125,
0.05584716796875,
-0.005336761474609375,
0.08184814453125,
-0.0226593017578125,
0.021026611328125,
0.045562744140625,
-0.01285552978515625,
0.0682373046875,
0.0247650146484375,
-0.041168212890625,
0.049530029296875,
0.0645751953125,
-0.0148162841796875,
0.032867431640625,
0.001148223876953125,
-0.00977325439453125,
-0.0145416259765625,
-0.02874755859375,
-0.046875,
0.0390625,
0.0280914306640625,
-0.01277923583984375,
0.0071258544921875,
-0.028228759765625,
0.014556884765625,
-0.0224456787109375,
0.003314971923828125,
0.056854248046875,
0.021453857421875,
-0.03717041015625,
0.0751953125,
0.00818634033203125,
0.06658935546875,
-0.04901123046875,
-0.002582550048828125,
-0.042022705078125,
0.00766754150390625,
-0.0198974609375,
-0.047119140625,
0.0020313262939453125,
0.0165863037109375,
-0.0030574798583984375,
-0.01026153564453125,
0.04547119140625,
-0.00902557373046875,
-0.042572021484375,
0.036529541015625,
0.021514892578125,
0.0238800048828125,
0.018096923828125,
-0.057220458984375,
0.028411865234375,
0.0137176513671875,
-0.035430908203125,
0.0267791748046875,
0.01435089111328125,
0.00485992431640625,
0.059783935546875,
0.057403564453125,
-0.01194000244140625,
-0.000873565673828125,
-0.00762176513671875,
0.08123779296875,
-0.032928466796875,
-0.019439697265625,
-0.06365966796875,
0.0478515625,
0.01258087158203125,
-0.0369873046875,
0.036102294921875,
0.03271484375,
0.05474853515625,
0.00276947021484375,
0.049713134765625,
-0.0176544189453125,
0.0163116455078125,
-0.023101806640625,
0.05938720703125,
-0.0672607421875,
0.023101806640625,
-0.01220703125,
-0.062469482421875,
-0.0116119384765625,
0.054351806640625,
0.01126861572265625,
0.00421905517578125,
0.0203704833984375,
0.0670166015625,
0.011749267578125,
-0.01227569580078125,
0.007061004638671875,
0.0241851806640625,
0.036529541015625,
0.06298828125,
0.07568359375,
-0.054443359375,
0.0634765625,
-0.03375244140625,
-0.011566162109375,
-0.0248565673828125,
-0.05682373046875,
-0.068115234375,
-0.036285400390625,
-0.0196380615234375,
-0.0223236083984375,
-0.0079498291015625,
0.0648193359375,
0.0421142578125,
-0.03826904296875,
-0.03375244140625,
-0.0025310516357421875,
0.004009246826171875,
0.00018990039825439453,
-0.01312255859375,
0.01314544677734375,
0.00890350341796875,
-0.052215576171875,
0.040985107421875,
0.0025577545166015625,
0.0279083251953125,
-0.0139617919921875,
-0.02197265625,
-0.025543212890625,
0.002460479736328125,
0.04376220703125,
0.0261077880859375,
-0.07373046875,
-0.0131378173828125,
0.0019931793212890625,
-0.003337860107421875,
0.016632080078125,
0.0190887451171875,
-0.053314208984375,
-0.00482940673828125,
0.027191162109375,
0.033538818359375,
0.047698974609375,
0.004360198974609375,
0.0159149169921875,
-0.0404052734375,
0.032470703125,
0.0009822845458984375,
0.02142333984375,
0.029296875,
-0.03466796875,
0.0501708984375,
0.02032470703125,
-0.05010986328125,
-0.07012939453125,
0.001476287841796875,
-0.08050537109375,
-0.01042938232421875,
0.097412109375,
0.009002685546875,
-0.0181732177734375,
0.0176239013671875,
-0.0204925537109375,
0.03790283203125,
-0.03411865234375,
0.048065185546875,
0.031402587890625,
-0.00262451171875,
-0.018402099609375,
-0.0435791015625,
0.02899169921875,
0.0259857177734375,
-0.07037353515625,
-0.0008296966552734375,
0.04156494140625,
0.0267791748046875,
0.01071929931640625,
0.06048583984375,
-0.0013790130615234375,
0.0203399658203125,
-0.004261016845703125,
0.0139312744140625,
-0.005405426025390625,
-0.01934814453125,
-0.0169525146484375,
-0.0212860107421875,
-0.003818511962890625,
-0.00543975830078125
]
] |
IDEA-CCNL/Erlangshen-Roberta-110M-Sentiment | 2023-05-25T09:42:57.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"roberta",
"NLU",
"Sentiment",
"Chinese",
"zh",
"arxiv:2209.02970",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | IDEA-CCNL | null | null | IDEA-CCNL/Erlangshen-Roberta-110M-Sentiment | 39 | 14,233 | transformers | 2022-04-20T06:45:09 | ---
language:
- zh
license: apache-2.0
tags:
- roberta
- NLU
- Sentiment
- Chinese
inference: true
widget:
- text: "今天心情不好"
---
# Erlangshen-Roberta-110M-Sentiment
- Main Page:[Fengshenbang](https://fengshenbang-lm.com/)
- Github: [Fengshenbang-LM](https://github.com/IDEA-CCNL/Fengshenbang-LM)
## 简介 Brief Introduction
中文的RoBERTa-wwm-ext-base在数个情感分析任务微调后的版本
This is the fine-tuned version of the Chinese RoBERTa-wwm-ext-base model on several sentiment analysis datasets.
## 模型分类 Model Taxonomy
| 需求 Demand | 任务 Task | 系列 Series | 模型 Model | 参数 Parameter | 额外 Extra |
| :----: | :----: | :----: | :----: | :----: | :----: |
| 通用 General | 自然语言理解 NLU | 二郎神 Erlangshen | Roberta | 110M | 情感分析 Sentiment |
## 模型信息 Model Information
基于[chinese-roberta-wwm-ext-base](https://huggingface.co/hfl/chinese-roberta-wwm-ext),我们在收集的8个中文领域的情感分析数据集,总计227347个样本上微调了一个Sentiment版本。
Based on [chinese-roberta-wwm-ext-base](https://huggingface.co/hfl/chinese-roberta-wwm-ext), we fine-tuned a sentiment analysis version on 8 Chinese sentiment analysis datasets, totaling 227,347 samples.
### 下游效果 Performance
| 模型 Model | ASAP-SENT | ASAP-ASPECT | ChnSentiCorp |
| :--------: | :-----: | :----: | :-----: |
| Erlangshen-Roberta-110M-Sentiment | 97.77 | 97.31 | 96.61 |
| Erlangshen-Roberta-330M-Sentiment | 97.9 | 97.51 | 96.66 |
| Erlangshen-MegatronBert-1.3B-Sentiment | 98.1 | 97.8 | 97 |
## 使用 Usage
```python
import torch
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained('IDEA-CCNL/Erlangshen-Roberta-110M-Sentiment')
model = BertForSequenceClassification.from_pretrained('IDEA-CCNL/Erlangshen-Roberta-110M-Sentiment')

text = '今天心情不好'
output = model(torch.tensor([tokenizer.encode(text)]))
print(torch.nn.functional.softmax(output.logits, dim=-1))
```
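The `softmax` call above turns the model's raw logits into a probability distribution over the sentiment labels. A minimal, self-contained sketch with hypothetical logits (not actual model output) illustrates how to read the result:

```python
import torch

# Hypothetical 2-class logits -- for illustration only; real values
# come from model(...).logits as in the snippet above.
logits = torch.tensor([[2.3, -1.7]])

# Softmax over the last dimension yields probabilities that sum to 1.
probs = torch.nn.functional.softmax(logits, dim=-1)

# The predicted class is the index with the highest probability.
predicted_label = int(torch.argmax(probs, dim=-1))
```

The index-to-label mapping for an actual checkpoint can be looked up in its `model.config.id2label` dictionary.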
## 引用 Citation
如果您在您的工作中使用了我们的模型,可以引用我们的[论文](https://arxiv.org/abs/2209.02970):
If you are using the resource for your work, please cite our [paper](https://arxiv.org/abs/2209.02970):
```text
@article{fengshenbang,
author = {Jiaxing Zhang and Ruyi Gan and Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen},
title = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence},
journal = {CoRR},
volume = {abs/2209.02970},
year = {2022}
}
```
也可以引用我们的[网站](https://github.com/IDEA-CCNL/Fengshenbang-LM/):
You can also cite our [website](https://github.com/IDEA-CCNL/Fengshenbang-LM/):
```text
@misc{Fengshenbang-LM,
title={Fengshenbang-LM},
author={IDEA-CCNL},
year={2021},
howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}
``` | 3,052 | [
[
-0.03765869140625,
-0.057708740234375,
0.0106658935546875,
0.0282440185546875,
-0.035858154296875,
-0.03570556640625,
-0.046539306640625,
-0.0298919677734375,
0.0258941650390625,
0.013214111328125,
-0.044097900390625,
-0.053955078125,
-0.0309295654296875,
-0.0025634765625,
0.0180206298828125,
0.08636474609375,
0.005260467529296875,
0.017181396484375,
0.005123138427734375,
-0.019744873046875,
-0.006305694580078125,
-0.031036376953125,
-0.04217529296875,
-0.0232391357421875,
0.026763916015625,
0.0025177001953125,
0.02789306640625,
0.01149749755859375,
0.04248046875,
0.019622802734375,
-0.012542724609375,
0.01116180419921875,
-0.01922607421875,
-0.005588531494140625,
0.016815185546875,
-0.0226593017578125,
-0.0631103515625,
0.005901336669921875,
0.035125732421875,
0.036224365234375,
0.0026340484619140625,
0.02410888671875,
0.0268402099609375,
0.056640625,
-0.0196990966796875,
0.01544952392578125,
-0.035247802734375,
-0.0008606910705566406,
-0.01312255859375,
-0.007022857666015625,
-0.035186767578125,
-0.03424072265625,
0.00531768798828125,
-0.03680419921875,
0.006000518798828125,
0.0010786056518554688,
0.1077880859375,
0.00653076171875,
-0.0098724365234375,
-0.00429534912109375,
-0.02374267578125,
0.0814208984375,
-0.08392333984375,
0.008697509765625,
0.003082275390625,
-0.0012388229370117188,
0.0038661956787109375,
-0.044647216796875,
-0.050506591796875,
-0.00799560546875,
-0.0214691162109375,
0.03851318359375,
-0.002044677734375,
-0.0005269050598144531,
0.0224761962890625,
0.0128326416015625,
-0.041717529296875,
-0.0148773193359375,
-0.0169830322265625,
-0.00743865966796875,
0.03973388671875,
0.01012420654296875,
0.018768310546875,
-0.05169677734375,
-0.036102294921875,
-0.0167999267578125,
-0.0307159423828125,
0.01995849609375,
0.00939178466796875,
0.032318115234375,
-0.031402587890625,
0.0333251953125,
-0.009918212890625,
0.049407958984375,
0.007083892822265625,
-0.003246307373046875,
0.05242919921875,
-0.03900146484375,
-0.03057861328125,
-0.023956298828125,
0.08465576171875,
0.03692626953125,
0.017852783203125,
0.0156402587890625,
-0.0121307373046875,
-0.021240234375,
-0.01129913330078125,
-0.055419921875,
-0.0221099853515625,
0.0297393798828125,
-0.051025390625,
-0.026702880859375,
0.033203125,
-0.0693359375,
0.0048980712890625,
-0.0005359649658203125,
0.033721923828125,
-0.04315185546875,
-0.04010009765625,
0.0018701553344726562,
-0.0114288330078125,
0.03973388671875,
0.0128021240234375,
-0.0460205078125,
-0.0022106170654296875,
0.04010009765625,
0.055084228515625,
-0.0011882781982421875,
-0.0258636474609375,
-0.00196075439453125,
-0.005222320556640625,
-0.0146636962890625,
0.03155517578125,
-0.01088714599609375,
-0.0185546875,
-0.007686614990234375,
-0.0008449554443359375,
-0.00919342041015625,
-0.015777587890625,
0.061492919921875,
-0.0274810791015625,
0.033538818359375,
-0.03619384765625,
-0.0223236083984375,
-0.0289459228515625,
0.0270538330078125,
-0.041412353515625,
0.0849609375,
0.0013275146484375,
-0.07720947265625,
0.028594970703125,
-0.051422119140625,
-0.022735595703125,
-0.0153045654296875,
0.003177642822265625,
-0.05316162109375,
-0.00446319580078125,
0.0220184326171875,
0.041961669921875,
-0.01788330078125,
0.00567626953125,
-0.031707763671875,
-0.005245208740234375,
0.0299530029296875,
-0.0263519287109375,
0.1033935546875,
0.029754638671875,
-0.030364990234375,
0.0183868408203125,
-0.05810546875,
0.02252197265625,
0.02783203125,
-0.022125244140625,
-0.03558349609375,
-0.0089111328125,
0.0160369873046875,
0.0297698974609375,
0.050018310546875,
-0.040496826171875,
0.00568389892578125,
-0.040802001953125,
0.0245361328125,
0.0662841796875,
0.00534820556640625,
0.019866943359375,
-0.03717041015625,
0.0193939208984375,
0.01532745361328125,
0.0196685791015625,
-0.0086669921875,
-0.03900146484375,
-0.081787109375,
-0.02740478515625,
0.0189208984375,
0.04315185546875,
-0.038177490234375,
0.0650634765625,
-0.031097412109375,
-0.043426513671875,
-0.037353515625,
-0.0079803466796875,
0.0245819091796875,
0.02886962890625,
0.04345703125,
-0.0032749176025390625,
-0.048248291015625,
-0.04608154296875,
-0.020782470703125,
-0.021636962890625,
0.00824737548828125,
0.035308837890625,
0.0372314453125,
-0.0013875961303710938,
0.057647705078125,
-0.0428466796875,
-0.0204620361328125,
-0.021484375,
0.01488494873046875,
0.051177978515625,
0.041778564453125,
0.04510498046875,
-0.056640625,
-0.0557861328125,
-0.0106353759765625,
-0.061553955078125,
-0.0018987655639648438,
-0.013275146484375,
-0.034210205078125,
0.039093017578125,
0.0017004013061523438,
-0.058441162109375,
0.023834228515625,
0.0206146240234375,
-0.0260467529296875,
0.047393798828125,
-0.00658416748046875,
0.013641357421875,
-0.098876953125,
0.00975799560546875,
0.002918243408203125,
0.0140380859375,
-0.043731689453125,
0.01496124267578125,
0.0008983612060546875,
0.01444244384765625,
-0.0286712646484375,
0.0360107421875,
-0.04302978515625,
0.01488494873046875,
0.00931549072265625,
0.0165863037109375,
0.00084686279296875,
0.06787109375,
-0.004009246826171875,
0.018646240234375,
0.046356201171875,
-0.0374755859375,
0.028228759765625,
0.0129852294921875,
-0.018218994140625,
0.033172607421875,
-0.061187744140625,
-0.0013036727905273438,
0.0078277587890625,
0.01541900634765625,
-0.085205078125,
0.006092071533203125,
0.03729248046875,
-0.058685302734375,
0.0175323486328125,
-0.002529144287109375,
-0.034515380859375,
-0.034332275390625,
-0.059326171875,
0.0172119140625,
0.048919677734375,
-0.035247802734375,
0.041473388671875,
0.01068115234375,
-0.00008720159530639648,
-0.053497314453125,
-0.0645751953125,
-0.0209503173828125,
-0.0259246826171875,
-0.0654296875,
0.022247314453125,
-0.0155029296875,
-0.001678466796875,
0.0067596435546875,
0.0150146484375,
0.01568603515625,
-0.006656646728515625,
0.01141357421875,
0.05230712890625,
-0.01898193359375,
0.0044403076171875,
-0.0087738037109375,
-0.0078887939453125,
0.00860595703125,
-0.017486572265625,
0.0438232421875,
-0.0159149169921875,
-0.0299224853515625,
-0.0276031494140625,
0.00926971435546875,
0.034576416015625,
-0.0240936279296875,
0.0648193359375,
0.0706787109375,
-0.0233001708984375,
-0.003978729248046875,
-0.033599853515625,
-0.006870269775390625,
-0.033355712890625,
0.035125732421875,
-0.032135009765625,
-0.04620361328125,
0.0491943359375,
0.021209716796875,
0.021331787109375,
0.0606689453125,
0.04693603515625,
-0.0011138916015625,
0.0802001953125,
0.031005859375,
-0.0166778564453125,
0.0426025390625,
-0.041168212890625,
0.015960693359375,
-0.08013916015625,
-0.020172119140625,
-0.037078857421875,
-0.0178985595703125,
-0.056427001953125,
-0.0313720703125,
0.0228271484375,
0.004398345947265625,
-0.0302734375,
0.0196533203125,
-0.054595947265625,
-0.01364898681640625,
0.0355224609375,
0.017669677734375,
0.006343841552734375,
0.0025272369384765625,
-0.025360107421875,
-0.011444091796875,
-0.043426513671875,
-0.03143310546875,
0.0760498046875,
0.0218505859375,
0.05157470703125,
0.0192718505859375,
0.0570068359375,
-0.0005822181701660156,
0.0129241943359375,
-0.0457763671875,
0.052642822265625,
-0.0041351318359375,
-0.0460205078125,
-0.029052734375,
-0.0340576171875,
-0.057708740234375,
0.02435302734375,
-0.00894927978515625,
-0.046600341796875,
0.0086822509765625,
-0.0128173828125,
-0.01824951171875,
0.027374267578125,
-0.027008056640625,
0.053314208984375,
-0.021270751953125,
-0.026153564453125,
-0.0174713134765625,
-0.038787841796875,
0.02227783203125,
0.0156097412109375,
0.01329803466796875,
-0.01006317138671875,
-0.0030117034912109375,
0.07403564453125,
-0.03631591796875,
0.0574951171875,
-0.029388427734375,
0.00069427490234375,
0.033447265625,
-0.0200347900390625,
0.051239013671875,
0.0018482208251953125,
-0.01300048828125,
0.01543426513671875,
-0.0158843994140625,
-0.042327880859375,
-0.0236053466796875,
0.05877685546875,
-0.07037353515625,
-0.0377197265625,
-0.04901123046875,
-0.01554107666015625,
0.002655029296875,
0.025665283203125,
0.046844482421875,
0.019500732421875,
-0.01424407958984375,
0.0166473388671875,
0.045013427734375,
-0.0244903564453125,
0.038726806640625,
0.032684326171875,
-0.0164031982421875,
-0.037567138671875,
0.06341552734375,
0.01934814453125,
0.0086517333984375,
0.02935791015625,
0.006694793701171875,
-0.01184844970703125,
-0.03515625,
-0.027099609375,
0.03472900390625,
-0.051849365234375,
-0.01442718505859375,
-0.05181884765625,
-0.0236663818359375,
-0.0400390625,
-0.001796722412109375,
-0.0207366943359375,
-0.0341796875,
-0.0272369384765625,
-0.0153961181640625,
0.041473388671875,
0.0256195068359375,
-0.0129852294921875,
-0.0148162841796875,
-0.041168212890625,
0.0119476318359375,
0.0161590576171875,
0.0164794921875,
0.019012451171875,
-0.0592041015625,
-0.03460693359375,
0.0211639404296875,
-0.0216522216796875,
-0.062744140625,
0.040435791015625,
0.005214691162109375,
0.049407958984375,
0.030548095703125,
0.018829345703125,
0.0506591796875,
-0.00780487060546875,
0.07843017578125,
0.022247314453125,
-0.067138671875,
0.04669189453125,
-0.03125,
0.0103302001953125,
0.024871826171875,
0.0164794921875,
-0.03778076171875,
-0.027496337890625,
-0.051483154296875,
-0.08453369140625,
0.0643310546875,
0.00490570068359375,
0.01013946533203125,
-0.0005168914794921875,
-0.0017261505126953125,
-0.0182037353515625,
-0.003902435302734375,
-0.0789794921875,
-0.04864501953125,
-0.03369140625,
-0.0240936279296875,
0.0080108642578125,
-0.0240020751953125,
-0.012542724609375,
-0.0280914306640625,
0.07989501953125,
0.0036106109619140625,
0.0592041015625,
0.02545166015625,
-0.0038890838623046875,
-0.0036373138427734375,
0.0209197998046875,
0.040802001953125,
0.02545166015625,
-0.0185394287109375,
-0.0061798095703125,
0.0316162109375,
-0.0266265869140625,
-0.01210784912109375,
0.0029239654541015625,
-0.02410888671875,
0.00701141357421875,
0.034820556640625,
0.0628662109375,
-0.0018777847290039062,
-0.027099609375,
0.041351318359375,
0.0012874603271484375,
-0.0145111083984375,
-0.04730224609375,
0.002044677734375,
0.01212310791015625,
0.00714874267578125,
0.035125732421875,
0.0104827880859375,
-0.0053253173828125,
-0.0225372314453125,
0.00441741943359375,
0.03924560546875,
-0.0372314453125,
-0.02520751953125,
0.0355224609375,
0.006938934326171875,
0.004322052001953125,
0.0406494140625,
-0.02490234375,
-0.053955078125,
0.053741455078125,
0.024658203125,
0.07379150390625,
0.006259918212890625,
0.00978851318359375,
0.05926513671875,
0.024017333984375,
0.0004982948303222656,
0.041168212890625,
0.011016845703125,
-0.0601806640625,
-0.01312255859375,
-0.04522705078125,
0.005321502685546875,
0.00513458251953125,
-0.058837890625,
0.02838134765625,
-0.037017822265625,
-0.025970458984375,
-0.01178741455078125,
0.0218658447265625,
-0.040191650390625,
0.0201568603515625,
0.0023937225341796875,
0.0653076171875,
-0.04718017578125,
0.0654296875,
0.06414794921875,
-0.050018310546875,
-0.077392578125,
0.000614166259765625,
-0.0007405281066894531,
-0.049041748046875,
0.04217529296875,
0.002933502197265625,
-0.01001739501953125,
-0.00099945068359375,
-0.04229736328125,
-0.057403564453125,
0.10235595703125,
-0.0125579833984375,
-0.0123291015625,
0.01690673828125,
-0.01496124267578125,
0.05078125,
-0.031585693359375,
0.03375244140625,
0.006885528564453125,
0.05657958984375,
-0.009063720703125,
-0.04583740234375,
0.0254974365234375,
-0.0484619140625,
-0.004146575927734375,
0.0072174072265625,
-0.08740234375,
0.091552734375,
-0.0217742919921875,
-0.01702880859375,
0.01360321044921875,
0.06585693359375,
0.0268707275390625,
0.01430511474609375,
0.0280914306640625,
0.05047607421875,
0.049041748046875,
-0.025543212890625,
0.05572509765625,
-0.03009033203125,
0.055267333984375,
0.062255859375,
0.00616455078125,
0.06390380859375,
0.0228271484375,
-0.040740966796875,
0.05548095703125,
0.0469970703125,
-0.0216064453125,
0.0355224609375,
-0.0017719268798828125,
-0.005397796630859375,
-0.001430511474609375,
0.0021152496337890625,
-0.04522705078125,
0.012725830078125,
0.01611328125,
-0.0347900390625,
0.0191192626953125,
-0.0092926025390625,
0.028533935546875,
-0.0125579833984375,
-0.01221466064453125,
0.051971435546875,
0.007205963134765625,
-0.051666259765625,
0.06109619140625,
0.0166778564453125,
0.08685302734375,
-0.044525146484375,
0.0179290771484375,
-0.00846099853515625,
0.00783538818359375,
-0.02783203125,
-0.0489501953125,
0.0005202293395996094,
0.002185821533203125,
-0.01276397705078125,
0.00400543212890625,
0.044464111328125,
-0.026824951171875,
-0.047271728515625,
0.038055419921875,
0.0321044921875,
0.00849151611328125,
0.023712158203125,
-0.087646484375,
-0.004833221435546875,
0.0355224609375,
-0.06768798828125,
0.01605224609375,
0.050506591796875,
0.0235137939453125,
0.029388427734375,
0.052520751953125,
0.0212249755859375,
0.01006317138671875,
0.0059661865234375,
0.060272216796875,
-0.06695556640625,
-0.037078857421875,
-0.07586669921875,
0.044952392578125,
-0.02117919921875,
-0.041046142578125,
0.0760498046875,
0.026519775390625,
0.054962158203125,
-0.0079193115234375,
0.06793212890625,
-0.033172607421875,
0.039276123046875,
-0.0308685302734375,
0.059356689453125,
-0.050811767578125,
0.0200347900390625,
-0.04156494140625,
-0.062408447265625,
-0.01050567626953125,
0.0654296875,
-0.0248565673828125,
0.04351806640625,
0.056488037109375,
0.07708740234375,
0.0184478759765625,
-0.0093536376953125,
0.00873565673828125,
0.034637451171875,
0.01377105712890625,
0.06103515625,
0.0380859375,
-0.04144287109375,
0.04217529296875,
-0.036712646484375,
-0.0182037353515625,
-0.0272369384765625,
-0.05517578125,
-0.06781005859375,
-0.051788330078125,
-0.0222625732421875,
-0.044525146484375,
-0.0194854736328125,
0.07025146484375,
0.048004150390625,
-0.06695556640625,
-0.0113525390625,
-0.00013637542724609375,
0.0141143798828125,
-0.037261962890625,
-0.0276031494140625,
0.052978515625,
-0.0185089111328125,
-0.061431884765625,
-0.021087646484375,
0.007678985595703125,
0.0009255409240722656,
-0.0186309814453125,
-0.0296630859375,
-0.023284912109375,
0.0208282470703125,
0.0217742919921875,
0.006885528564453125,
-0.061065673828125,
-0.01245880126953125,
0.01551055908203125,
-0.0389404296875,
0.0094451904296875,
0.025665283203125,
-0.027374267578125,
0.017822265625,
0.06463623046875,
0.00402069091796875,
0.03082275390625,
-0.005596160888671875,
0.0074615478515625,
-0.035919189453125,
0.01174163818359375,
0.00662994384765625,
0.03851318359375,
0.0250396728515625,
-0.0286712646484375,
0.0364990234375,
0.0237884521484375,
-0.033203125,
-0.05078125,
0.0027313232421875,
-0.09600830078125,
-0.02239990234375,
0.08428955078125,
-0.03302001953125,
-0.0221710205078125,
0.00766754150390625,
-0.0162200927734375,
0.0394287109375,
-0.04107666015625,
0.04925537109375,
0.060546875,
-0.00930023193359375,
-0.0097503662109375,
-0.024627685546875,
0.0298614501953125,
0.05126953125,
-0.0406494140625,
0.0019044876098632812,
0.015045166015625,
0.007656097412109375,
0.02801513671875,
0.04925537109375,
-0.016876220703125,
0.0259857177734375,
-0.004901885986328125,
0.0244293212890625,
-0.009857177734375,
0.0084381103515625,
-0.0171661376953125,
0.0038909912109375,
0.006099700927734375,
-0.0162353515625
]
] |
microsoft/unispeech-sat-base | 2021-11-05T12:41:05.000Z | [
"transformers",
"pytorch",
"unispeech-sat",
"pretraining",
"speech",
"en",
"dataset:librispeech_asr",
"arxiv:2110.05752",
"endpoints_compatible",
"region:us"
] | null | microsoft | null | null | microsoft/unispeech-sat-base | 0 | 14,203 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
datasets:
- librispeech_asr
tags:
- speech
---
# UniSpeech-SAT-Base
[Microsoft's UniSpeech](https://www.microsoft.com/en-us/research/publication/unispeech-unified-speech-representation-learning-with-labeled-and-unlabeled-data/)
The base model was pretrained on 16kHz sampled speech audio with utterance and speaker contrastive loss. When using the model, make sure that your speech input is also sampled at 16kHz.
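Because the model expects 16kHz input, audio recorded at another rate must be resampled first. A rough sketch using linear interpolation (in practice prefer a dedicated resampler such as `torchaudio` or `librosa`; the function name here is illustrative):

```python
import numpy as np

def resample_to_16k(audio: np.ndarray, orig_sr: int) -> np.ndarray:
    """Naive linear-interpolation resampling to 16 kHz.

    Illustration only -- dedicated resamplers apply an anti-aliasing
    filter before downsampling, which plain interpolation does not.
    """
    target_sr = 16000
    if orig_sr == target_sr:
        return audio
    duration = len(audio) / orig_sr
    n_target = int(round(duration * target_sr))
    # Time stamps of the original and target sample grids.
    old_t = np.linspace(0.0, duration, num=len(audio), endpoint=False)
    new_t = np.linspace(0.0, duration, num=n_target, endpoint=False)
    return np.interp(new_t, old_t, audio)
```

One second of 44.1 kHz audio (44,100 samples) comes back as 16,000 samples, ready to feed to the feature extractor.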
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more in-detail explanation of how to fine-tune the model.
The model was pre-trained on:
- 960 hours of [LibriSpeech](https://huggingface.co/datasets/librispeech_asr)
[Paper: UNISPEECH-SAT: UNIVERSAL SPEECH REPRESENTATION LEARNING WITH SPEAKER
AWARE PRE-TRAINING](https://arxiv.org/abs/2110.05752)
Authors: Sanyuan Chen, Yu Wu, Chengyi Wang, Zhengyang Chen, Zhuo Chen, Shujie Liu, Jian Wu, Yao Qian, Furu Wei, Jinyu Li, Xiangzhan Yu
**Abstract**
*Self-supervised learning (SSL) is a long-standing goal for speech processing, since it utilizes large-scale unlabeled data and avoids extensive human labeling. Recent years witness great successes in applying self-supervised learning in speech recognition, while limited exploration was attempted in applying SSL for modeling speaker characteristics. In this paper, we aim to improve the existing SSL framework for speaker representation learning. Two methods are introduced for enhancing the unsupervised speaker information extraction. First, we apply the multi-task learning to the current SSL framework, where we integrate the utterance-wise contrastive loss with the SSL objective function. Second, for better speaker discrimination, we propose an utterance mixing strategy for data augmentation, where additional overlapped utterances are created unsupervisely and incorporate during training. We integrate the proposed methods into the HuBERT framework. Experiment results on SUPERB benchmark show that the proposed system achieves state-of-the-art performance in universal representation learning, especially for speaker identification oriented tasks. An ablation study is performed verifying the efficacy of each proposed method. Finally, we scale up training dataset to 94 thousand hours public audio data and achieve further performance improvement in all SUPERB tasks.*
The original model can be found under https://github.com/microsoft/UniSpeech/tree/main/UniSpeech-SAT.
# Usage
This is an English pre-trained speech model that has to be fine-tuned on a downstream task like speech recognition or audio classification before it can be
used in inference. The model was pre-trained in English and should therefore perform well only in English. The model has been shown to work well on tasks such as speaker verification, speaker identification, and speaker diarization.
**Note**: The model was pre-trained on phonemes rather than characters. This means that one should make sure that the input text is converted to a sequence
of phonemes before fine-tuning.
## Speech Recognition
To fine-tune the model for speech recognition, see [the official speech recognition example](https://github.com/huggingface/transformers/tree/master/examples/pytorch/speech-recognition).
## Speech Classification
To fine-tune the model for speech classification, see [the official audio classification example](https://github.com/huggingface/transformers/tree/master/examples/pytorch/audio-classification).
## Speaker Verification
TODO
## Speaker Diarization
TODO
# Contribution
The model was contributed by [cywang](https://huggingface.co/cywang) and [patrickvonplaten](https://huggingface.co/patrickvonplaten).
# License
The official license can be found [here](https://github.com/microsoft/UniSpeech/blob/main/LICENSE)
 | 4,123 | [
[
-0.0233917236328125,
-0.03485107421875,
0.0062103271484375,
0.009124755859375,
-0.0299530029296875,
0.0007266998291015625,
-0.026580810546875,
-0.037261962890625,
0.0050811767578125,
0.03558349609375,
-0.0249176025390625,
-0.02935791015625,
-0.03338623046875,
-0.0084686279296875,
-0.023529052734375,
0.0556640625,
0.035980224609375,
0.026336669921875,
-0.010101318359375,
0.0005011558532714844,
-0.025421142578125,
-0.0482177734375,
-0.0279541015625,
-0.052703857421875,
-0.0150909423828125,
0.0286865234375,
0.026153564453125,
0.028228759765625,
0.006923675537109375,
0.0300140380859375,
-0.027496337890625,
0.01267242431640625,
-0.0168914794921875,
-0.01507568359375,
-0.0190277099609375,
-0.0019989013671875,
-0.0304107666015625,
0.0164642333984375,
0.0638427734375,
0.06060791015625,
-0.030303955078125,
0.02117919921875,
0.0284576416015625,
0.039703369140625,
-0.0345458984375,
0.01947021484375,
-0.055145263671875,
-0.00939178466796875,
-0.0284576416015625,
-0.015838623046875,
-0.044708251953125,
0.02593994140625,
0.0015974044799804688,
-0.029083251953125,
0.01041412353515625,
-0.01149749755859375,
0.05902099609375,
0.015380859375,
-0.0285797119140625,
0.00914764404296875,
-0.05859375,
0.06414794921875,
-0.056396484375,
0.065185546875,
0.034332275390625,
0.01245880126953125,
-0.01522064208984375,
-0.05389404296875,
-0.028778076171875,
-0.028045654296875,
0.00045037269592285156,
0.01374053955078125,
-0.0214996337890625,
0.01470184326171875,
0.0190582275390625,
0.0222015380859375,
-0.04595947265625,
0.0179443359375,
-0.035064697265625,
-0.03656005859375,
0.054962158203125,
-0.0208892822265625,
0.00537109375,
-0.0167083740234375,
-0.034454345703125,
-0.01273345947265625,
-0.042205810546875,
0.024017333984375,
0.01386260986328125,
0.053192138671875,
-0.0249176025390625,
0.036102294921875,
-0.01174163818359375,
0.048126220703125,
0.005420684814453125,
-0.00786590576171875,
0.05145263671875,
-0.01096343994140625,
-0.01454925537109375,
0.0174102783203125,
0.07354736328125,
-0.0013399124145507812,
0.03643798828125,
-0.00775909423828125,
-0.019317626953125,
0.00630950927734375,
0.00795745849609375,
-0.048187255859375,
-0.03594970703125,
0.0133514404296875,
-0.032196044921875,
0.0102081298828125,
0.00794219970703125,
-0.0284271240234375,
-0.00250244140625,
-0.054840087890625,
0.0282745361328125,
-0.03338623046875,
-0.04193115234375,
0.0005726814270019531,
0.010009765625,
-0.0020751953125,
0.00807952880859375,
-0.06304931640625,
0.033538818359375,
0.031585693359375,
0.04205322265625,
-0.0226593017578125,
-0.0254058837890625,
-0.045806884765625,
0.0015230178833007812,
-0.01531982421875,
0.048797607421875,
-0.030609130859375,
-0.0300750732421875,
-0.01091766357421875,
0.0041046142578125,
-0.004619598388671875,
-0.0384521484375,
0.0323486328125,
-0.0172882080078125,
0.00804901123046875,
0.0011968612670898438,
-0.0643310546875,
0.005245208740234375,
-0.0304718017578125,
-0.0249176025390625,
0.08428955078125,
-0.0003528594970703125,
-0.058197021484375,
0.0038394927978515625,
-0.059844970703125,
-0.0268096923828125,
0.014068603515625,
-0.01277923583984375,
-0.0271453857421875,
0.00811767578125,
0.01134490966796875,
0.04315185546875,
-0.010589599609375,
0.0122833251953125,
-0.000896453857421875,
-0.0301666259765625,
0.015655517578125,
-0.029388427734375,
0.0836181640625,
0.036956787109375,
-0.0290679931640625,
0.006519317626953125,
-0.07952880859375,
0.01322174072265625,
0.010467529296875,
-0.0272979736328125,
-0.004680633544921875,
-0.00560760498046875,
0.037200927734375,
-0.0014934539794921875,
0.01641845703125,
-0.06207275390625,
-0.0170440673828125,
-0.04736328125,
0.035736083984375,
0.04046630859375,
-0.0186767578125,
0.040008544921875,
0.0169219970703125,
0.00930023193359375,
0.0033092498779296875,
0.006134033203125,
-0.004917144775390625,
-0.0301666259765625,
-0.046630859375,
-0.0141448974609375,
0.054656982421875,
0.055816650390625,
-0.0229034423828125,
0.05987548828125,
-0.00399017333984375,
-0.0377197265625,
-0.052520751953125,
0.01184844970703125,
0.05841064453125,
0.0352783203125,
0.045318603515625,
-0.026397705078125,
-0.055633544921875,
-0.05535888671875,
-0.01262664794921875,
-0.0239105224609375,
-0.01971435546875,
0.002140045166015625,
0.0187225341796875,
-0.0181884765625,
0.06640625,
-0.009735107421875,
-0.0323486328125,
-0.01189422607421875,
0.01366424560546875,
0.0026416778564453125,
0.059814453125,
0.0184326171875,
-0.06658935546875,
-0.03289794921875,
-0.00786590576171875,
-0.03076171875,
-0.0023040771484375,
0.00856781005859375,
0.0222320556640625,
0.02117919921875,
0.038909912109375,
-0.046478271484375,
0.032073974609375,
0.032196044921875,
-0.0208892822265625,
0.035797119140625,
-0.0222015380859375,
-0.007038116455078125,
-0.08636474609375,
-0.00004500150680541992,
-0.0000024437904357910156,
-0.0232391357421875,
-0.06011962890625,
-0.01532745361328125,
0.005352020263671875,
-0.0211029052734375,
-0.04571533203125,
0.0335693359375,
-0.043975830078125,
-0.01629638671875,
-0.02655029296875,
0.01154327392578125,
-0.01543426513671875,
0.03643798828125,
0.008331298828125,
0.0711669921875,
0.058563232421875,
-0.044891357421875,
0.0277557373046875,
0.0250701904296875,
-0.02642822265625,
0.02935791015625,
-0.059814453125,
0.03009033203125,
0.0024967193603515625,
0.0033283233642578125,
-0.089111328125,
0.0063934326171875,
0.0009570121765136719,
-0.06097412109375,
0.05499267578125,
-0.01043701171875,
-0.03753662109375,
-0.0224151611328125,
0.01641845703125,
0.0155792236328125,
0.050018310546875,
-0.05224609375,
0.055206298828125,
0.052093505859375,
-0.0024547576904296875,
-0.0281524658203125,
-0.056884765625,
-0.00322723388671875,
0.0014410018920898438,
-0.030303955078125,
0.050994873046875,
-0.01271820068359375,
-0.0012063980102539062,
-0.022308349609375,
-0.0079803466796875,
-0.00289154052734375,
-0.0101165771484375,
0.031829833984375,
0.00772857666015625,
-0.0031147003173828125,
0.0203094482421875,
-0.0181884765625,
-0.03045654296875,
-0.01103973388671875,
-0.01119232177734375,
0.04510498046875,
-0.0230560302734375,
-0.027099609375,
-0.0689697265625,
0.029205322265625,
0.0211029052734375,
-0.042022705078125,
0.019561767578125,
0.06414794921875,
-0.02117919921875,
0.0017032623291015625,
-0.0609130859375,
-0.011505126953125,
-0.038543701171875,
0.037200927734375,
-0.03497314453125,
-0.07611083984375,
0.025390625,
-0.0076141357421875,
0.00025081634521484375,
0.037261962890625,
0.03125,
-0.017425537109375,
0.07843017578125,
0.046966552734375,
-0.0206756591796875,
0.037628173828125,
-0.01197052001953125,
0.001499176025390625,
-0.06207275390625,
-0.0254974365234375,
-0.0494384765625,
0.0081634521484375,
-0.053985595703125,
-0.019866943359375,
-0.003810882568359375,
0.021514892578125,
-0.0185394287109375,
0.042999267578125,
-0.065185546875,
0.026153564453125,
0.0472412109375,
-0.00412750244140625,
0.001605987548828125,
0.02166748046875,
-0.0232391357421875,
-0.01078033447265625,
-0.032745361328125,
-0.031341552734375,
0.07232666015625,
0.039215087890625,
0.055755615234375,
-0.00907135009765625,
0.0567626953125,
0.0272674560546875,
-0.0236968994140625,
-0.0308380126953125,
0.042510986328125,
-0.0254974365234375,
-0.031707763671875,
-0.03424072265625,
-0.04010009765625,
-0.0733642578125,
0.0253143310546875,
-0.01526641845703125,
-0.07781982421875,
0.010589599609375,
0.0251617431640625,
-0.0193634033203125,
0.01535797119140625,
-0.0689697265625,
0.05609130859375,
-0.0196685791015625,
0.00977325439453125,
-0.0137786865234375,
-0.059356689453125,
-0.0160675048828125,
0.01329803466796875,
0.01129913330078125,
-0.01343536376953125,
0.0267181396484375,
0.075439453125,
-0.00572967529296875,
0.06463623046875,
-0.0421142578125,
0.0138397216796875,
0.0251617431640625,
-0.01406097412109375,
0.032470703125,
-0.01166534423828125,
0.002689361572265625,
0.03460693359375,
0.027130126953125,
-0.024322509765625,
-0.026123046875,
0.04486083984375,
-0.0782470703125,
-0.0284271240234375,
-0.0167694091796875,
-0.0265655517578125,
-0.034027099609375,
0.007778167724609375,
0.0294036865234375,
0.037109375,
-0.0216064453125,
0.0233917236328125,
0.051300048828125,
0.002040863037109375,
0.03118896484375,
0.059661865234375,
-0.00992584228515625,
-0.02978515625,
0.07318115234375,
0.016448974609375,
0.004871368408203125,
0.0323486328125,
0.0195770263671875,
-0.048187255859375,
-0.048614501953125,
-0.017852783203125,
0.01528167724609375,
-0.04315185546875,
-0.0082244873046875,
-0.04486083984375,
-0.040557861328125,
-0.06976318359375,
0.03533935546875,
-0.06304931640625,
-0.03277587890625,
-0.047821044921875,
-0.01067352294921875,
0.026702880859375,
0.045501708984375,
-0.021881103515625,
0.037139892578125,
-0.040740966796875,
0.031646728515625,
0.022064208984375,
0.0225372314453125,
-0.0229034423828125,
-0.07415771484375,
-0.00913238525390625,
0.0202789306640625,
-0.016876220703125,
-0.058624267578125,
0.0126190185546875,
0.02978515625,
0.06097412109375,
0.034454345703125,
0.0023250579833984375,
0.05328369140625,
-0.044830322265625,
0.05816650390625,
0.0307159423828125,
-0.07879638671875,
0.055511474609375,
-0.004245758056640625,
0.0299224853515625,
0.04803466796875,
0.015716552734375,
-0.0224609375,
-0.006397247314453125,
-0.056610107421875,
-0.04736328125,
0.0528564453125,
0.0249176025390625,
0.0176239013671875,
0.016510009765625,
0.0225372314453125,
-0.015869140625,
0.0038356781005859375,
-0.052734375,
-0.039642333984375,
-0.03680419921875,
0.000705718994140625,
-0.033599853515625,
-0.03955078125,
0.0118560791015625,
-0.0552978515625,
0.061614990234375,
0.006450653076171875,
0.02398681640625,
0.01152801513671875,
-0.01568603515625,
-0.0032405853271484375,
0.036407470703125,
0.032928466796875,
0.049041748046875,
-0.026824951171875,
0.0118560791015625,
0.01407623291015625,
-0.0268096923828125,
-0.00005704164505004883,
0.039337158203125,
0.003940582275390625,
0.01134490966796875,
0.01131439208984375,
0.07672119140625,
0.018798828125,
-0.032440185546875,
0.040496826171875,
0.006938934326171875,
-0.01953125,
-0.045806884765625,
-0.0048828125,
0.0130462646484375,
0.02001953125,
0.03369140625,
-0.004268646240234375,
0.01181793212890625,
-0.0292816162109375,
0.034637451171875,
0.0364990234375,
-0.04693603515625,
-0.0246429443359375,
0.03985595703125,
0.0193023681640625,
-0.046661376953125,
0.0457763671875,
-0.022857666015625,
-0.02923583984375,
0.01090240478515625,
0.058746337890625,
0.054290771484375,
-0.07330322265625,
0.0175628662109375,
0.02276611328125,
0.0131378173828125,
0.0070953369140625,
0.03173828125,
-0.034454345703125,
-0.0511474609375,
-0.029327392578125,
-0.05859375,
-0.015655517578125,
0.0239105224609375,
-0.04913330078125,
0.001552581787109375,
-0.022125244140625,
-0.019744873046875,
0.0182037353515625,
0.0158538818359375,
-0.0479736328125,
0.0189971923828125,
0.0190277099609375,
0.061614990234375,
-0.07421875,
0.0802001953125,
0.052032470703125,
-0.0247955322265625,
-0.07366943359375,
-0.02587890625,
-0.0016546249389648438,
-0.05419921875,
0.036712646484375,
0.0121917724609375,
-0.0110321044921875,
0.009979248046875,
-0.042510986328125,
-0.063232421875,
0.06341552734375,
0.03570556640625,
-0.06011962890625,
0.00652313232421875,
0.031829833984375,
0.038787841796875,
-0.036041259765625,
0.01390838623046875,
0.03863525390625,
0.0138397216796875,
0.0192718505859375,
-0.09637451171875,
-0.01497650146484375,
-0.03125,
0.005023956298828125,
-0.006511688232421875,
-0.031524658203125,
0.0667724609375,
-0.009490966796875,
-0.020599365234375,
0.0024738311767578125,
0.0672607421875,
0.0154266357421875,
0.00518798828125,
0.06060791015625,
0.048492431640625,
0.06317138671875,
-0.003246307373046875,
0.047210693359375,
-0.0163116455078125,
0.029388427734375,
0.10931396484375,
-0.007175445556640625,
0.0816650390625,
0.0357666015625,
-0.040863037109375,
0.020721435546875,
0.047027587890625,
-0.01132965087890625,
0.041961669921875,
0.0106048583984375,
-0.0029048919677734375,
-0.046478271484375,
-0.0192718505859375,
-0.043609619140625,
0.062164306640625,
0.01092529296875,
-0.01300811767578125,
-0.0084686279296875,
0.0172882080078125,
-0.0202484130859375,
0.005496978759765625,
-0.0146026611328125,
0.05859375,
0.0197296142578125,
-0.0178985595703125,
0.055450439453125,
-0.00977325439453125,
0.052703857421875,
-0.046630859375,
-0.007534027099609375,
0.01348876953125,
0.0174407958984375,
-0.01995849609375,
-0.045074462890625,
-0.0007801055908203125,
-0.01293182373046875,
-0.0111083984375,
-0.0115203857421875,
0.058502197265625,
-0.045379638671875,
-0.038177490234375,
0.04595947265625,
0.018707275390625,
0.042510986328125,
-0.02569580078125,
-0.07928466796875,
0.00872039794921875,
0.0095672607421875,
-0.01020050048828125,
0.03118896484375,
0.01337432861328125,
0.02337646484375,
0.03662109375,
0.056610107421875,
0.0141754150390625,
-0.0079193115234375,
0.0347900390625,
0.0537109375,
-0.03387451171875,
-0.051300048828125,
-0.039154052734375,
0.053253173828125,
-0.00670623779296875,
-0.01329803466796875,
0.0572509765625,
0.047943115234375,
0.07562255859375,
0.00949859619140625,
0.0304107666015625,
0.0215911865234375,
0.078125,
-0.030914306640625,
0.052703857421875,
-0.0511474609375,
0.011993408203125,
-0.03167724609375,
-0.06439208984375,
-0.00894927978515625,
0.04742431640625,
0.0084991455078125,
0.01763916015625,
0.0246124267578125,
0.06646728515625,
-0.007598876953125,
0.00469207763671875,
0.045867919921875,
0.01302337646484375,
0.009063720703125,
0.01204681396484375,
0.052703857421875,
-0.038421630859375,
0.033935546875,
-0.0350341796875,
-0.0253448486328125,
0.0031299591064453125,
-0.0438232421875,
-0.0782470703125,
-0.073486328125,
-0.035736083984375,
-0.0209503173828125,
0.01145172119140625,
0.081298828125,
0.0772705078125,
-0.0802001953125,
-0.031463623046875,
0.0026035308837890625,
-0.0258636474609375,
-0.005031585693359375,
-0.013916015625,
0.040863037109375,
-0.025543212890625,
-0.0377197265625,
0.05596923828125,
0.005733489990234375,
0.0219268798828125,
-0.00902557373046875,
-0.0196075439453125,
-0.0212554931640625,
-0.0133209228515625,
0.038360595703125,
0.035797119140625,
-0.07025146484375,
-0.00957489013671875,
-0.02130126953125,
0.0152435302734375,
0.01080322265625,
0.04608154296875,
-0.05194091796875,
0.045196533203125,
0.019744873046875,
0.034698486328125,
0.050628662109375,
-0.0006165504455566406,
0.0103302001953125,
-0.07281494140625,
0.02227783203125,
0.01232147216796875,
0.0443115234375,
0.040679931640625,
-0.0168609619140625,
0.017822265625,
0.0190277099609375,
-0.052337646484375,
-0.06451416015625,
0.00624847412109375,
-0.091552734375,
-0.0212249755859375,
0.08575439453125,
-0.0033416748046875,
-0.002155303955078125,
-0.0148162841796875,
-0.0156097412109375,
0.04119873046875,
-0.042205810546875,
0.034454345703125,
0.0273284912109375,
-0.0153961181640625,
-0.024169921875,
-0.0190277099609375,
0.04345703125,
0.038726806640625,
-0.03546142578125,
0.022705078125,
0.0200958251953125,
0.032806396484375,
0.0255889892578125,
0.057708740234375,
-0.01087188720703125,
0.01236724853515625,
-0.01305389404296875,
0.0267181396484375,
-0.0180511474609375,
-0.0313720703125,
-0.0328369140625,
-0.002105712890625,
0.0041046142578125,
-0.040496826171875
]
] |
yanekyuk/camembert-keyword-extractor | 2022-06-04T10:28:45.000Z | [
"transformers",
"pytorch",
"camembert",
"token-classification",
"generated_from_trainer",
"fr",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | yanekyuk | null | null | yanekyuk/camembert-keyword-extractor | 2 | 14,199 | transformers | 2022-06-04T02:03:03 | ---
license: mit
tags:
- generated_from_trainer
metrics:
- precision
- recall
- accuracy
- f1
language:
- fr
widget:
- text: "Le président de la République appelle en outre les Français à faire le choix d'une \"majorité stable et sérieuse pour les protéger face aux crises et pour agir pour l'avenir\". \"Je vois dans le projet de Jean-Luc Mélenchon ou de Madame Le Pen un projet de désordre et de soumission. Ils expliquent qu'il faut sortir de nos alliances, de l'Europe, et bâtir des alliances stratégiques avec la Russie. C'est la soumission à la Russie\", assure-t-il."
- text: "Top départ à l’ouverture des bureaux de vote. La Polynésie et les Français résidant à l'étranger, dont certains ont déjà pu voter en ligne, sont invités aux urnes ce week-end pour le premier tour des législatives, samedi 4 juin pour le continent américain et les Caraïbes, et dimanche 5 juin pour le reste du monde. En France métropolitaine, les premier et second tours auront lieu les 12 et 19 juin."
- text: "Le ministère a aussi indiqué que des missiles russes ont frappé un centre d'entraînement d'artillerie dans la région de Soumy où travaillaient des instructeurs étrangers. Il a jouté qu'une autre frappe avait détruit une position de \"mercenaires étrangers\" dans la région d'Odessa."
- text: "Le malaise est profond et ressemble à une crise existentielle. Fait rarissime au Quai d’Orsay, six syndicats et un collectif de 500 jeunes diplomates du ministère des Affaires étrangères ont appelé à la grève, jeudi 2 juin, pour protester contre la réforme de la haute fonction publique qui, à terme, entraînera la disparition des deux corps historiques de la diplomatie française : celui de ministre plénipotentiaire (ambassadeur) et celui de conseiller des affaires étrangères."
- text: "Ils se font passer pour des recruteurs de Lockheed Martin ou du géant britannique de la défense et de l’aérospatial BAE Systems. Ces soi-disant chasseurs de tête font miroiter des perspectives lucratives de carrière et des postes à responsabilité. Mais ce n’est que du vent. En réalité, il s’agit de cyberespions nord-coréens cherchant à voler des secrets industriels de groupes de défense ou du secteur de l’aérospatial, révèle Eset, une société slovaque de sécurité informatique, dans un rapport publié mardi 31 mai."
model-index:
- name: camembert-keyword-extractor
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# camembert-keyword-extractor
This model is a fine-tuned version of [camembert-base](https://huggingface.co/camembert-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2199
- Precision: 0.6743
- Recall: 0.6979
- Accuracy: 0.9346
- F1: 0.6859
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
- mixed_precision_training: Native AMP
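The `linear` scheduler above decays the learning rate from its initial value to zero over the course of training. A minimal sketch of that decay (assuming zero warmup steps, since the listed hyperparameters mention none):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linear decay from base_lr to 0, ignoring warmup (assumed zero here)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

total = 15000  # 8 epochs x 1875 steps per epoch, per the results table below
print(linear_lr(0, total))      # 2e-05 at the start of training
print(linear_lr(7500, total))   # 1e-05 halfway through
print(linear_lr(15000, total))  # 0.0 at the end
```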
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | Accuracy | F1 |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:--------:|:------:|
| 0.1747 | 1.0 | 1875 | 0.1780 | 0.5935 | 0.7116 | 0.9258 | 0.6472 |
| 0.1375 | 2.0 | 3750 | 0.1588 | 0.6505 | 0.7032 | 0.9334 | 0.6759 |
| 0.1147 | 3.0 | 5625 | 0.1727 | 0.6825 | 0.6689 | 0.9355 | 0.6756 |
| 0.0969 | 4.0 | 7500 | 0.1759 | 0.6886 | 0.6621 | 0.9350 | 0.6751 |
| 0.0837 | 5.0 | 9375 | 0.1967 | 0.6688 | 0.7112 | 0.9348 | 0.6893 |
| 0.0746 | 6.0 | 11250 | 0.2088 | 0.6646 | 0.7114 | 0.9334 | 0.6872 |
| 0.0666 | 7.0 | 13125 | 0.2169 | 0.6713 | 0.7054 | 0.9347 | 0.6879 |
| 0.0634 | 8.0 | 15000 | 0.2199 | 0.6743 | 0.6979 | 0.9346 | 0.6859 |
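As a sanity check on the table above, the reported F1 is simply the harmonic mean of precision and recall; the sketch below reproduces the final-epoch value.

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Final-epoch (epoch 8) values from the table above.
print(round(f1_score(0.6743, 0.6979), 4))  # 0.6859
```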
### Framework versions
- Transformers 4.19.2
- Pytorch 1.11.0+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1
| 4,398 | [
[
-0.034881591796875,
-0.03875732421875,
0.01556396484375,
0.005023956298828125,
-0.02899169921875,
-0.0166015625,
-0.00890350341796875,
-0.0087738037109375,
0.01514434814453125,
0.0256195068359375,
-0.0472412109375,
-0.057891845703125,
-0.0550537109375,
-0.01251983642578125,
-0.0253143310546875,
0.076416015625,
0.0175323486328125,
0.004970550537109375,
0.0017404556274414062,
-0.0038814544677734375,
-0.0247650146484375,
-0.040863037109375,
-0.049346923828125,
-0.0439453125,
0.0240325927734375,
0.0182342529296875,
0.06494140625,
0.057525634765625,
0.043243408203125,
0.0190582275390625,
-0.02435302734375,
-0.0026798248291015625,
-0.03033447265625,
-0.029449462890625,
0.0024547576904296875,
-0.043609619140625,
-0.046722412109375,
0.0025691986083984375,
0.0400390625,
0.03143310546875,
-0.0077667236328125,
0.0301055908203125,
0.0030956268310546875,
0.0523681640625,
-0.036285400390625,
0.03265380859375,
-0.0316162109375,
0.022979736328125,
-0.016265869140625,
-0.030487060546875,
-0.01006317138671875,
-0.0172576904296875,
0.00527191162109375,
-0.039703369140625,
0.039215087890625,
-0.0003306865692138672,
0.10479736328125,
0.0223846435546875,
-0.019378662109375,
0.0045623779296875,
-0.0472412109375,
0.051666259765625,
-0.052459716796875,
0.032440185546875,
0.037384033203125,
0.0223846435546875,
0.00589752197265625,
-0.06805419921875,
-0.047332763671875,
0.01200103759765625,
-0.020721435546875,
0.0286712646484375,
-0.0184783935546875,
-0.011810302734375,
0.04345703125,
0.047515869140625,
-0.04833984375,
0.00043487548828125,
-0.03375244140625,
-0.0159912109375,
0.046875,
0.0270538330078125,
-0.00926971435546875,
-0.034820556640625,
-0.02117919921875,
-0.017974853515625,
-0.0296478271484375,
0.0180816650390625,
0.03338623046875,
0.021453857421875,
-0.0271759033203125,
0.03924560546875,
-0.017974853515625,
0.05389404296875,
0.00537872314453125,
-0.0117950439453125,
0.052337646484375,
-0.013641357421875,
-0.024566650390625,
0.002105712890625,
0.063232421875,
0.045684814453125,
0.0101470947265625,
0.006793975830078125,
-0.0164947509765625,
-0.00615692138671875,
0.019866943359375,
-0.07244873046875,
-0.0323486328125,
0.016082763671875,
-0.050201416015625,
-0.045013427734375,
0.018463134765625,
-0.06317138671875,
0.0210418701171875,
-0.022216796875,
0.023712158203125,
-0.0251312255859375,
-0.0194549560546875,
-0.0007658004760742188,
-0.01263427734375,
0.0340576171875,
0.00739288330078125,
-0.06195068359375,
0.0283050537109375,
0.043060302734375,
0.06341552734375,
0.0012693405151367188,
-0.007122039794921875,
-0.013519287109375,
-0.0004477500915527344,
-0.0228271484375,
0.050262451171875,
-0.022735595703125,
-0.03515625,
-0.01177215576171875,
0.02593994140625,
-0.016754150390625,
-0.036346435546875,
0.0556640625,
-0.02880859375,
0.030029296875,
-0.01123809814453125,
-0.043365478515625,
-0.032745361328125,
0.0384521484375,
-0.055206298828125,
0.0985107421875,
0.0206146240234375,
-0.0731201171875,
0.037017822265625,
-0.047607421875,
0.002536773681640625,
0.01297760009765625,
-0.0212249755859375,
-0.0667724609375,
-0.0083160400390625,
0.0175933837890625,
0.037322998046875,
-0.0308380126953125,
0.0310211181640625,
-0.00751495361328125,
-0.037750244140625,
0.0005741119384765625,
-0.042144775390625,
0.07452392578125,
0.00811004638671875,
-0.0428466796875,
0.00833892822265625,
-0.092041015625,
0.0226287841796875,
0.0288848876953125,
-0.0400390625,
0.0045928955078125,
-0.01433563232421875,
0.026947021484375,
0.016021728515625,
0.02252197265625,
-0.03509521484375,
0.004848480224609375,
-0.02783203125,
0.0220947265625,
0.047698974609375,
0.010406494140625,
-0.001800537109375,
-0.039398193359375,
0.0295257568359375,
0.0179595947265625,
0.01158905029296875,
0.00937652587890625,
-0.03094482421875,
-0.060089111328125,
-0.032928466796875,
0.0196380615234375,
0.043304443359375,
-0.01806640625,
0.06927490234375,
-0.0253143310546875,
-0.056854248046875,
-0.0244903564453125,
0.0008907318115234375,
0.0224456787109375,
0.037567138671875,
0.0273284912109375,
-0.0084381103515625,
-0.031524658203125,
-0.07940673828125,
0.003223419189453125,
-0.005039215087890625,
0.0191802978515625,
0.0142974853515625,
0.053375244140625,
-0.01398468017578125,
0.078125,
-0.058319091796875,
-0.0158233642578125,
-0.01210784912109375,
-0.00445556640625,
0.04345703125,
0.054290771484375,
0.05859375,
-0.0516357421875,
-0.0224456787109375,
-0.00308990478515625,
-0.060394287109375,
0.0303955078125,
0.0005650520324707031,
-0.016632080078125,
-0.018310546875,
0.0229034423828125,
-0.043701171875,
0.057647705078125,
0.0284881591796875,
-0.0216827392578125,
0.064697265625,
-0.023345947265625,
-0.00255584716796875,
-0.10235595703125,
0.02923583984375,
0.00659942626953125,
-0.0191192626953125,
-0.02410888671875,
0.003009796142578125,
0.01233673095703125,
-0.01824951171875,
-0.0243988037109375,
0.0394287109375,
-0.014556884765625,
0.01343536376953125,
-0.007232666015625,
-0.0232696533203125,
-0.0007381439208984375,
0.06195068359375,
0.020538330078125,
0.0545654296875,
0.048614501953125,
-0.0343017578125,
0.0321044921875,
0.02996826171875,
-0.03485107421875,
0.04278564453125,
-0.057403564453125,
0.00762176513671875,
0.0040130615234375,
-0.00417327880859375,
-0.06671142578125,
-0.0233154296875,
0.03167724609375,
-0.039947509765625,
0.0152435302734375,
-0.021148681640625,
-0.0302581787109375,
-0.045440673828125,
-0.01776123046875,
0.0106964111328125,
0.0330810546875,
-0.0394287109375,
0.03369140625,
-0.0089569091796875,
0.0106964111328125,
-0.0416259765625,
-0.056243896484375,
-0.01483917236328125,
-0.0225982666015625,
-0.032623291015625,
0.0212249755859375,
-0.004550933837890625,
0.0022983551025390625,
0.0037288665771484375,
-0.0108795166015625,
-0.01983642578125,
0.00022661685943603516,
0.029052734375,
0.0228271484375,
-0.014984130859375,
-0.006664276123046875,
0.0017108917236328125,
-0.0271759033203125,
0.01300048828125,
0.001983642578125,
0.048919677734375,
-0.019989013671875,
-0.031768798828125,
-0.049346923828125,
-0.0018558502197265625,
0.0390625,
-0.020538330078125,
0.0643310546875,
0.048797607421875,
-0.0386962890625,
0.006500244140625,
-0.0270538330078125,
-0.011810302734375,
-0.03375244140625,
0.0418701171875,
-0.04443359375,
-0.034820556640625,
0.062164306640625,
-0.0093841552734375,
0.015594482421875,
0.061676025390625,
0.038970947265625,
0.011810302734375,
0.08111572265625,
0.017364501953125,
-0.0038127899169921875,
0.01824951171875,
-0.0654296875,
0.0025081634521484375,
-0.053375244140625,
-0.040435791015625,
-0.045928955078125,
-0.021209716796875,
-0.050445556640625,
0.003101348876953125,
0.01242828369140625,
-0.0057525634765625,
-0.0416259765625,
0.0278472900390625,
-0.0501708984375,
0.020294189453125,
0.058074951171875,
0.0338134765625,
0.006214141845703125,
0.0035839080810546875,
-0.026458740234375,
-0.00989532470703125,
-0.05975341796875,
-0.032196044921875,
0.0924072265625,
0.0275726318359375,
0.046630859375,
-0.0007157325744628906,
0.059600830078125,
0.0033550262451171875,
0.00905609130859375,
-0.05859375,
0.0308837890625,
0.00279998779296875,
-0.07403564453125,
-0.0024852752685546875,
-0.0268707275390625,
-0.06573486328125,
0.0309600830078125,
-0.0223541259765625,
-0.053619384765625,
0.0340576171875,
0.019744873046875,
-0.030792236328125,
0.03704833984375,
-0.0291748046875,
0.07537841796875,
-0.01611328125,
-0.034759521484375,
-0.003223419189453125,
-0.045501708984375,
0.021453857421875,
-0.00794219970703125,
-0.002227783203125,
0.001186370849609375,
0.0225982666015625,
0.07647705078125,
-0.0506591796875,
0.043304443359375,
-0.01537322998046875,
0.023468017578125,
0.0189056396484375,
-0.00467681884765625,
0.044586181640625,
0.01093292236328125,
-0.01030731201171875,
0.0096893310546875,
0.01284027099609375,
-0.04254150390625,
-0.03546142578125,
0.053192138671875,
-0.0762939453125,
-0.03363037109375,
-0.049530029296875,
-0.0298004150390625,
0.00852203369140625,
0.028594970703125,
0.04852294921875,
0.050262451171875,
-0.006992340087890625,
0.0265960693359375,
0.037811279296875,
0.004436492919921875,
0.0294189453125,
0.0181732177734375,
-0.0014600753784179688,
-0.048431396484375,
0.06585693359375,
-0.005130767822265625,
0.007434844970703125,
0.002994537353515625,
0.0157012939453125,
-0.0251922607421875,
-0.033599853515625,
-0.0243072509765625,
0.01666259765625,
-0.050537109375,
-0.0228271484375,
-0.03839111328125,
-0.03851318359375,
-0.02239990234375,
-0.0120391845703125,
-0.038116455078125,
-0.0216217041015625,
-0.03814697265625,
-0.0180816650390625,
0.037078857421875,
0.022735595703125,
0.005779266357421875,
0.05670166015625,
-0.056854248046875,
-0.004650115966796875,
0.01544952392578125,
0.0210113525390625,
-0.0015707015991210938,
-0.053070068359375,
-0.0243072509765625,
-0.00946044921875,
-0.029876708984375,
-0.054901123046875,
0.050262451171875,
0.0014696121215820312,
0.0501708984375,
0.052001953125,
-0.0169830322265625,
0.06536865234375,
-0.0169677734375,
0.057647705078125,
0.026885986328125,
-0.046478271484375,
0.032745361328125,
-0.0124969482421875,
0.0292816162109375,
0.056884765625,
0.039276123046875,
-0.02630615234375,
-0.006931304931640625,
-0.08843994140625,
-0.0606689453125,
0.06744384765625,
0.032928466796875,
-0.00445556640625,
-0.004802703857421875,
0.0241241455078125,
-0.01247406005859375,
0.02618408203125,
-0.064453125,
-0.045135498046875,
-0.0262298583984375,
-0.0275726318359375,
0.004150390625,
-0.0138397216796875,
-0.01334381103515625,
-0.04443359375,
0.0628662109375,
0.005191802978515625,
0.02899169921875,
0.00786590576171875,
0.0018367767333984375,
-0.004344940185546875,
0.001300811767578125,
0.0322265625,
0.06195068359375,
-0.053375244140625,
0.0098419189453125,
0.0034542083740234375,
-0.04095458984375,
0.0017795562744140625,
0.026641845703125,
-0.014678955078125,
0.01113128662109375,
0.037200927734375,
0.06787109375,
0.01103973388671875,
-0.0150299072265625,
0.039154052734375,
0.0015764236450195312,
-0.033050537109375,
-0.044342041015625,
0.00713348388671875,
-0.0018510818481445312,
0.0180511474609375,
0.0252838134765625,
0.0236663818359375,
0.0008687973022460938,
-0.021209716796875,
0.0157470703125,
0.025970458984375,
-0.044647216796875,
-0.016845703125,
0.05810546875,
-0.0006527900695800781,
-0.028350830078125,
0.06317138671875,
-0.008453369140625,
-0.043975830078125,
0.05914306640625,
0.0294189453125,
0.06298828125,
-0.015899658203125,
-0.01031494140625,
0.0626220703125,
0.0224456787109375,
-0.0009622573852539062,
0.03363037109375,
-0.00026726722717285156,
-0.038818359375,
-0.0037689208984375,
-0.045654296875,
-0.0019216537475585938,
0.036376953125,
-0.07415771484375,
0.02783203125,
-0.035888671875,
-0.041412353515625,
0.0120849609375,
0.01007080078125,
-0.0775146484375,
0.036102294921875,
-0.0101776123046875,
0.08056640625,
-0.07135009765625,
0.052001953125,
0.042694091796875,
-0.038970947265625,
-0.0712890625,
-0.0159454345703125,
-0.0064544677734375,
-0.07183837890625,
0.055572509765625,
0.0112152099609375,
0.0164337158203125,
0.022552490234375,
-0.04833984375,
-0.059478759765625,
0.0875244140625,
-0.006412506103515625,
-0.0506591796875,
-0.0015115737915039062,
0.01285552978515625,
0.027740478515625,
-0.005229949951171875,
0.04156494140625,
0.04119873046875,
0.03094482421875,
0.0102996826171875,
-0.059478759765625,
0.0102996826171875,
-0.0309600830078125,
-0.0018701553344726562,
0.01873779296875,
-0.053619384765625,
0.060516357421875,
0.00930023193359375,
0.020599365234375,
-0.0005068778991699219,
0.04962158203125,
0.0216217041015625,
0.0146942138671875,
0.034820556640625,
0.07415771484375,
0.043487548828125,
-0.0198516845703125,
0.06304931640625,
-0.0418701171875,
0.06378173828125,
0.07427978515625,
0.0082550048828125,
0.045501708984375,
0.0279083251953125,
-0.0306549072265625,
0.033843994140625,
0.064208984375,
-0.0221710205078125,
0.035430908203125,
0.007144927978515625,
-0.0147552490234375,
-0.027557373046875,
0.0302886962890625,
-0.040985107421875,
0.026763916015625,
0.0229339599609375,
-0.04742431640625,
-0.03387451171875,
-0.01824951171875,
0.0022144317626953125,
-0.01824951171875,
-0.0213470458984375,
0.0439453125,
-0.0186920166015625,
-0.0226593017578125,
0.055694580078125,
0.005512237548828125,
0.0345458984375,
-0.053314208984375,
-0.010406494140625,
-0.00839996337890625,
0.0280914306640625,
-0.032257080078125,
-0.0538330078125,
0.01556396484375,
-0.005893707275390625,
-0.0048065185546875,
0.0178375244140625,
0.03759765625,
-0.0117645263671875,
-0.06317138671875,
0.0038242340087890625,
0.03204345703125,
0.016021728515625,
-0.0021076202392578125,
-0.07623291015625,
-0.00717926025390625,
0.006519317626953125,
-0.042266845703125,
-0.0059814453125,
0.022735595703125,
0.00336456298828125,
0.040863037109375,
0.0460205078125,
0.00004494190216064453,
0.0092315673828125,
-0.0037384033203125,
0.08050537109375,
-0.05914306640625,
-0.0445556640625,
-0.05596923828125,
0.0330810546875,
-0.0181121826171875,
-0.04547119140625,
0.049346923828125,
0.0872802734375,
0.047332763671875,
-0.0057830810546875,
0.05303955078125,
-0.0109710693359375,
0.03729248046875,
-0.0308380126953125,
0.04193115234375,
-0.045257568359375,
-0.012725830078125,
-0.02044677734375,
-0.07745361328125,
-0.016357421875,
0.058074951171875,
-0.0269927978515625,
0.01128387451171875,
0.048980712890625,
0.06500244140625,
-0.01509857177734375,
0.0046539306640625,
0.0122528076171875,
-0.01010894775390625,
0.0214996337890625,
0.0282440185546875,
0.031524658203125,
-0.04669189453125,
0.03271484375,
-0.049591064453125,
-0.01059722900390625,
-0.00957489013671875,
-0.042724609375,
-0.071533203125,
-0.03033447265625,
-0.038482666015625,
-0.033416748046875,
0.0013675689697265625,
0.08270263671875,
0.061767578125,
-0.059173583984375,
-0.026641845703125,
-0.013519287109375,
-0.022979736328125,
-0.0272674560546875,
-0.0218963623046875,
0.06597900390625,
-0.0031299591064453125,
-0.05810546875,
-0.00406646728515625,
-0.0175018310546875,
0.0234375,
-0.00878143310546875,
-0.004940032958984375,
-0.0249481201171875,
-0.018646240234375,
0.021759033203125,
0.0014934539794921875,
-0.0300445556640625,
-0.0207672119140625,
-0.006168365478515625,
0.002849578857421875,
0.027618408203125,
0.0286102294921875,
-0.037384033203125,
0.0269927978515625,
0.027099609375,
0.030609130859375,
0.0732421875,
0.0086517333984375,
0.0086669921875,
-0.05303955078125,
0.028472900390625,
0.0107269287109375,
0.0240325927734375,
0.006641387939453125,
-0.0312042236328125,
0.034576416015625,
0.046844482421875,
-0.038909912109375,
-0.04913330078125,
-0.01558685302734375,
-0.0855712890625,
-0.0037136077880859375,
0.06671142578125,
-0.004741668701171875,
-0.036865234375,
0.01055908203125,
-0.013641357421875,
0.0247650146484375,
-0.0255889892578125,
0.035980224609375,
0.06353759765625,
-0.00815582275390625,
-0.007755279541015625,
-0.043670654296875,
0.03338623046875,
0.0114898681640625,
-0.053192138671875,
-0.0145263671875,
0.025787353515625,
0.035858154296875,
0.01739501953125,
0.0239410400390625,
-0.01568603515625,
0.0237579345703125,
0.006900787353515625,
0.0225067138671875,
-0.0200653076171875,
-0.00768280029296875,
-0.0199127197265625,
0.005962371826171875,
0.00966644287109375,
-0.038909912109375
]
] |
Meina/MeinaMix_V11 | 2023-07-16T19:53:46.000Z | [
"diffusers",
"art",
"anime",
"stable diffusion",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Meina | null | null | Meina/MeinaMix_V11 | 18 | 14,160 | diffusers | 2023-07-16T19:11:15 | ---
license: creativeml-openrail-m
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- art
- anime
- stable diffusion
---
MeinaMix's objective is to be able to produce good art with little prompting.
For examples and prompts, please check out: https://civitai.com/models/7240/meinamix
I have a Discord server where you can post images that you generated, discuss prompts, and/or ask for help.
https://discord.gg/XC9nGZNDUd If you like one of my models and want to support their updates,
I've made a Ko-fi page: https://ko-fi.com/meina where you can buy me a coffee <3
And a Patreon page: https://www.patreon.com/MeinaMix where you can support me and get access to betas of my models!
You may also try this model using Sinkin.ai: https://sinkin.ai/m/vln8Nwr
MeinaMix and the other Meina models will ALWAYS be FREE.
Recommendations of use: Enable Quantization in K samplers.
Hires.fix is needed for prompts where the character is far away in order to make decent images; it drastically improves the quality of faces and eyes!
Recommended parameters:
Sampler: Euler a: 40 to 60 steps.
Sampler: DPM++ SDE Karras: 20 to 30 steps.
Sampler: DPM++ 2M Karras: 20 to 40 steps.
CFG Scale: 7.
Resolutions: 512x768, 512x1024 for Portrait!
Resolutions: 768x512, 1024x512, 1536x512 for Landscape!
Hires.fix: R-ESRGAN 4x+Anime6b, with 10 steps at 0.3 up to 0.5 denoising.
Clip Skip: 2.
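The recommended parameters above can be collected into one settings object for whichever frontend or API you use. A minimal sketch (the dict name and structure are illustrative, not an official API; values follow the card's DPM++ 2M Karras / portrait recommendations):

```python
# Illustrative only: the card's recommended generation settings as a
# plain dict, e.g. for passing to a diffusers pipeline or a WebUI call.
meinamix_settings = {
    "sampler": "DPM++ 2M Karras",
    "steps": 30,                    # 20 to 40 recommended for this sampler
    "cfg_scale": 7,
    "width": 512,
    "height": 768,                  # one of the recommended portrait sizes
    "clip_skip": 2,
    "hires_fix": {
        "upscaler": "R-ESRGAN 4x+ Anime6B",
        "steps": 10,
        "denoising": 0.4,           # 0.3 up to 0.5 recommended
    },
    "negative_prompt": "(worst quality, low quality:1.4), "
                       "(zombie, sketch, interlocked fingers, comic)",
}

print(meinamix_settings["sampler"])  # → DPM++ 2M Karras
```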
Negatives: ' (worst quality, low quality:1.4), (zombie, sketch, interlocked fingers, comic) ' | 1,487 | [
[
-0.05999755859375,
-0.040374755859375,
0.05517578125,
0.02783203125,
-0.040252685546875,
-0.024658203125,
0.00563812255859375,
-0.0489501953125,
0.03131103515625,
0.03240966796875,
-0.051300048828125,
-0.050689697265625,
-0.032623291015625,
0.015106201171875,
-0.006542205810546875,
0.06414794921875,
0.00563812255859375,
0.0156707763671875,
0.00415802001953125,
-0.0009088516235351562,
-0.0628662109375,
0.0082855224609375,
-0.09320068359375,
-0.001720428466796875,
0.06561279296875,
0.06072998046875,
0.057159423828125,
0.032318115234375,
0.0236968994140625,
0.024658203125,
-0.005550384521484375,
0.0035457611083984375,
-0.037872314453125,
0.017059326171875,
-0.00885772705078125,
-0.0438232421875,
-0.0478515625,
-0.0011577606201171875,
0.017578125,
0.0265350341796875,
0.005138397216796875,
0.00432586669921875,
-0.0226287841796875,
0.05499267578125,
-0.049560546875,
-0.004489898681640625,
0.019683837890625,
-0.0019025802612304688,
-0.02734375,
0.01432037353515625,
-0.0207977294921875,
-0.0439453125,
-0.005207061767578125,
-0.05401611328125,
0.0121002197265625,
-0.00704193115234375,
0.066650390625,
0.0140838623046875,
-0.0133514404296875,
-0.021026611328125,
-0.052825927734375,
0.04315185546875,
-0.078369140625,
0.0078887939453125,
0.0261993408203125,
0.06549072265625,
-0.0003464221954345703,
-0.0391845703125,
-0.02935791015625,
-0.0144805908203125,
0.01080322265625,
0.0235595703125,
-0.01419830322265625,
-0.00017583370208740234,
0.046417236328125,
0.036163330078125,
-0.05035400390625,
0.002716064453125,
-0.04010009765625,
-0.003551483154296875,
0.062744140625,
0.0169219970703125,
0.0609130859375,
-0.00722503662109375,
-0.029632568359375,
-0.03839111328125,
-0.052337646484375,
0.0126953125,
0.0428466796875,
-0.00873565673828125,
-0.033416748046875,
0.0386962890625,
-0.0220947265625,
0.0258026123046875,
0.01517486572265625,
0.0085296630859375,
-0.00397491455078125,
-0.00911712646484375,
0.021484375,
-0.0022449493408203125,
0.045867919921875,
0.0677490234375,
0.0238800048828125,
0.033233642578125,
-0.0244903564453125,
-0.006549835205078125,
0.029632568359375,
-0.09002685546875,
-0.0322265625,
0.0171661376953125,
-0.064453125,
-0.02288818359375,
-0.01525115966796875,
-0.07452392578125,
-0.0286712646484375,
-0.0182647705078125,
0.048309326171875,
-0.05157470703125,
-0.0200653076171875,
-0.01232147216796875,
-0.040924072265625,
0.020660400390625,
0.058868408203125,
-0.0491943359375,
0.00354766845703125,
-0.004299163818359375,
0.0543212890625,
0.00986480712890625,
-0.0051116943359375,
-0.01091766357421875,
-0.01461029052734375,
-0.04168701171875,
0.07470703125,
-0.01227569580078125,
-0.054840087890625,
-0.0211944580078125,
-0.00506591796875,
0.0216064453125,
-0.057159423828125,
0.05352783203125,
-0.032073974609375,
0.0084686279296875,
-0.0087890625,
-0.03021240234375,
-0.0295867919921875,
-0.01517486572265625,
-0.06494140625,
0.0233612060546875,
0.021392822265625,
-0.021484375,
-0.016204833984375,
-0.06646728515625,
-0.0088348388671875,
0.0224609375,
0.004665374755859375,
-0.004756927490234375,
0.042449951171875,
0.0098724365234375,
0.0238037109375,
-0.004573822021484375,
-0.00757598876953125,
-0.0499267578125,
-0.041534423828125,
0.025360107421875,
-0.03521728515625,
0.07342529296875,
0.03082275390625,
-0.0222625732421875,
-0.0011968612670898438,
-0.07305908203125,
0.01277923583984375,
0.0242919921875,
0.010955810546875,
-0.0169525146484375,
-0.00516510009765625,
0.0202178955078125,
0.01641845703125,
0.0178070068359375,
-0.01605224609375,
0.003749847412109375,
-0.02899169921875,
0.010986328125,
0.077392578125,
0.0027256011962890625,
0.02130126953125,
-0.0474853515625,
0.045013427734375,
0.005146026611328125,
0.017303466796875,
-0.037261962890625,
-0.04730224609375,
-0.07427978515625,
-0.00931549072265625,
0.016754150390625,
0.03082275390625,
-0.047393798828125,
0.0196533203125,
0.007236480712890625,
-0.048248291015625,
-0.045745849609375,
-0.006610870361328125,
0.0118255615234375,
0.020355224609375,
0.0133209228515625,
-0.05633544921875,
-0.0247802734375,
-0.0980224609375,
0.035552978515625,
0.002841949462890625,
-0.02972412109375,
0.0216217041015625,
0.0225372314453125,
-0.01922607421875,
0.052154541015625,
-0.030120849609375,
-0.006496429443359375,
0.00020611286163330078,
0.0161895751953125,
0.03564453125,
0.051055908203125,
0.04876708984375,
-0.06036376953125,
-0.050079345703125,
-0.0161590576171875,
-0.05340576171875,
-0.0183258056640625,
0.0089111328125,
-0.0266571044921875,
-0.011077880859375,
0.0167999267578125,
-0.0662841796875,
0.034881591796875,
0.015777587890625,
-0.0232086181640625,
0.044097900390625,
0.0034618377685546875,
0.0298919677734375,
-0.09326171875,
0.023223876953125,
-0.005802154541015625,
-0.02093505859375,
-0.052398681640625,
0.03314208984375,
-0.03253173828125,
-0.050323486328125,
-0.055206298828125,
0.05487060546875,
-0.0135498046875,
0.019775390625,
-0.0457763671875,
0.002468109130859375,
0.01320648193359375,
0.045654296875,
0.007282257080078125,
0.056396484375,
0.04071044921875,
-0.0513916015625,
0.032012939453125,
0.0216064453125,
-0.032745361328125,
0.0765380859375,
-0.09320068359375,
0.026947021484375,
-0.0161895751953125,
0.006374359130859375,
-0.07275390625,
-0.034576416015625,
0.0697021484375,
-0.032867431640625,
0.0281982421875,
0.0116729736328125,
-0.0167236328125,
-0.0247802734375,
-0.024200439453125,
0.0482177734375,
0.06494140625,
-0.0305938720703125,
0.048675537109375,
-0.0020885467529296875,
-0.0288543701171875,
0.01096343994140625,
-0.01319122314453125,
0.0069732666015625,
-0.0256500244140625,
-0.0474853515625,
0.03466796875,
-0.040435791015625,
-0.03265380859375,
0.007442474365234375,
0.021026611328125,
-0.023834228515625,
-0.01512908935546875,
0.0138397216796875,
0.0188140869140625,
-0.03717041015625,
-0.019775390625,
0.01111602783203125,
-0.02825927734375,
0.002239227294921875,
0.0272216796875,
0.0285491943359375,
-0.0088043212890625,
-0.037322998046875,
-0.08587646484375,
0.052886962890625,
0.051422119140625,
0.02752685546875,
-0.01141357421875,
0.0504150390625,
-0.0283050537109375,
0.0035381317138671875,
-0.032196044921875,
-0.03704833984375,
-0.0377197265625,
-0.0092010498046875,
-0.035919189453125,
-0.0276031494140625,
0.05462646484375,
-0.0128021240234375,
0.00701904296875,
0.0428466796875,
0.0350341796875,
-0.050689697265625,
0.1092529296875,
0.039764404296875,
0.00955963134765625,
0.02288818359375,
-0.0247802734375,
-0.00041866302490234375,
-0.05853271484375,
0.00672149658203125,
-0.0254058837890625,
-0.055877685546875,
-0.034698486328125,
-0.0216064453125,
0.03277587890625,
0.018524169921875,
-0.002651214599609375,
0.0308837890625,
-0.01153564453125,
0.04266357421875,
0.041168212890625,
0.035491943359375,
-0.0048675537109375,
-0.004245758056640625,
0.0203857421875,
-0.003063201904296875,
-0.045928955078125,
-0.032684326171875,
0.04644775390625,
0.0164794921875,
0.0634765625,
0.0167236328125,
0.048492431640625,
0.0260772705078125,
0.0041656494140625,
-0.05792236328125,
0.046844482421875,
-0.01366424560546875,
-0.057525634765625,
-0.007793426513671875,
0.00099945068359375,
-0.043670654296875,
-0.0013141632080078125,
0.0017566680908203125,
-0.0372314453125,
0.052581787109375,
0.01393890380859375,
-0.043731689453125,
-0.005634307861328125,
-0.04840087890625,
0.050872802734375,
-0.00435638427734375,
-0.0145263671875,
-0.0226898193359375,
-0.04449462890625,
0.027130126953125,
0.01287078857421875,
0.003108978271484375,
-0.044708251953125,
0.0196990966796875,
0.0255584716796875,
-0.042877197265625,
0.08892822265625,
-0.01396942138671875,
-0.00445556640625,
0.04595947265625,
0.0115814208984375,
0.02410888671875,
0.01013946533203125,
-0.01165008544921875,
0.027313232421875,
0.006805419921875,
-0.03497314453125,
-0.05999755859375,
0.042205810546875,
-0.055755615234375,
-0.05487060546875,
-0.0013113021850585938,
-0.0256500244140625,
0.0102386474609375,
0.01153564453125,
0.0499267578125,
0.052947998046875,
-0.0204620361328125,
0.0186614990234375,
0.0333251953125,
-0.00420379638671875,
0.04315185546875,
-0.006587982177734375,
-0.0053253173828125,
-0.0462646484375,
0.07843017578125,
-0.01407623291015625,
0.0079193115234375,
-0.0033969879150390625,
0.0246734619140625,
-0.017791748046875,
-0.00493621826171875,
-0.07550048828125,
0.01157379150390625,
-0.058074951171875,
-0.01751708984375,
-0.0007405281066894531,
-0.017120361328125,
-0.017333984375,
-0.0210723876953125,
-0.0265045166015625,
-0.0308380126953125,
-0.048980712890625,
0.04119873046875,
0.037994384765625,
0.047119140625,
-0.027313232421875,
0.036590576171875,
-0.052581787109375,
0.01514434814453125,
0.026641845703125,
0.0185394287109375,
0.0121002197265625,
-0.0265350341796875,
-0.018890380859375,
0.0160369873046875,
-0.0299224853515625,
-0.0670166015625,
0.0220184326171875,
-0.0007739067077636719,
0.026885986328125,
0.06768798828125,
0.00882720947265625,
0.0653076171875,
-0.052398681640625,
0.056732177734375,
0.0190582275390625,
-0.05926513671875,
0.061309814453125,
-0.039764404296875,
0.041839599609375,
0.06329345703125,
0.047943115234375,
-0.04400634765625,
-0.0355224609375,
-0.0535888671875,
-0.03692626953125,
0.046600341796875,
0.0284881591796875,
0.035369873046875,
0.00830841064453125,
0.044342041015625,
0.002880096435546875,
0.00673675537109375,
-0.04302978515625,
-0.0180511474609375,
-0.03143310546875,
-0.0107879638671875,
0.004199981689453125,
-0.033233642578125,
0.0016765594482421875,
-0.03717041015625,
0.06829833984375,
-0.00505828857421875,
0.032318115234375,
0.021728515625,
0.01953125,
-0.01216888427734375,
-0.0011396408081054688,
0.061370849609375,
0.050048828125,
-0.00537109375,
-0.00795745849609375,
0.006389617919921875,
-0.01558685302734375,
0.00936126708984375,
-0.0128936767578125,
-0.03680419921875,
0.0292205810546875,
0.017242431640625,
0.08697509765625,
0.0197906494140625,
-0.058990478515625,
0.01294708251953125,
0.007213592529296875,
0.0054473876953125,
-0.0163726806640625,
0.038604736328125,
0.0001455545425415039,
0.029296875,
0.0031909942626953125,
-0.00093841552734375,
0.0390625,
-0.019622802734375,
-0.0009465217590332031,
-0.0014867782592773438,
-0.021270751953125,
-0.032257080078125,
0.045867919921875,
0.01409912109375,
-0.05059814453125,
0.0587158203125,
-0.0091400146484375,
-0.0247802734375,
0.07080078125,
0.045379638671875,
0.061004638671875,
-0.034149169921875,
0.04052734375,
0.05499267578125,
-0.00908660888671875,
0.0207977294921875,
0.02703857421875,
-0.00704193115234375,
-0.0281219482421875,
0.0018472671508789062,
-0.022186279296875,
-0.052154541015625,
0.019561767578125,
-0.03411865234375,
0.06317138671875,
-0.0343017578125,
-0.006195068359375,
0.005535125732421875,
-0.0089263916015625,
-0.0193023681640625,
0.0201263427734375,
0.01180267333984375,
0.06500244140625,
-0.03350830078125,
0.0264129638671875,
0.030364990234375,
-0.05426025390625,
-0.051666259765625,
-0.0157012939453125,
0.009307861328125,
-0.03619384765625,
0.0325927734375,
0.03155517578125,
0.0017757415771484375,
0.0171661376953125,
-0.06243896484375,
-0.056304931640625,
0.06622314453125,
0.0100555419921875,
-0.047698974609375,
-0.00609588623046875,
-0.018157958984375,
0.02569580078125,
-0.0306854248046875,
0.0178070068359375,
-0.0072479248046875,
0.03948974609375,
0.0145721435546875,
-0.00916290283203125,
-0.00588226318359375,
-0.052459716796875,
0.0308685302734375,
-0.01091766357421875,
-0.05078125,
0.05859375,
-0.013763427734375,
-0.0413818359375,
0.052825927734375,
0.038238525390625,
0.0225830078125,
0.033538818359375,
0.05804443359375,
0.041656494140625,
0.0210418701171875,
0.01236724853515625,
0.07806396484375,
-0.0258331298828125,
-0.0129547119140625,
0.06903076171875,
-0.009002685546875,
0.041778564453125,
0.0099639892578125,
0.00498199462890625,
0.0579833984375,
0.08892822265625,
-0.0350341796875,
0.040313720703125,
0.002490997314453125,
-0.039306640625,
-0.0208587646484375,
-0.0229644775390625,
-0.03582763671875,
0.0177001953125,
0.00676727294921875,
-0.01114654541015625,
-0.04150390625,
0.0428466796875,
-0.0295257568359375,
-0.005016326904296875,
-0.01522064208984375,
0.047027587890625,
0.0194244384765625,
-0.01983642578125,
0.035186767578125,
0.0030307769775390625,
0.024383544921875,
-0.0411376953125,
-0.027313232421875,
-0.019561767578125,
-0.01361083984375,
0.0055694580078125,
-0.045654296875,
0.01763916015625,
-0.0198516845703125,
-0.0259246826171875,
-0.01010894775390625,
0.0631103515625,
-0.0084075927734375,
-0.032379150390625,
0.0015468597412109375,
0.0256195068359375,
0.053253173828125,
0.0018491744995117188,
-0.060211181640625,
0.0164947509765625,
-0.01422882080078125,
-0.00315093994140625,
-0.01120758056640625,
0.00933837890625,
-0.0134735107421875,
0.017822265625,
0.040924072265625,
-0.0010128021240234375,
-0.033050537109375,
0.025482177734375,
0.044464111328125,
-0.0386962890625,
-0.0169677734375,
-0.057342529296875,
0.04046630859375,
-0.0037326812744140625,
-0.030914306640625,
0.044708251953125,
0.0208587646484375,
0.06292724609375,
-0.04913330078125,
0.0161590576171875,
-0.0081939697265625,
0.006816864013671875,
-0.04400634765625,
0.080078125,
-0.051422119140625,
-0.036865234375,
0.002010345458984375,
-0.07269287109375,
-0.01061248779296875,
0.05908203125,
0.009735107421875,
0.00778961181640625,
0.0228118896484375,
0.06719970703125,
0.000007331371307373047,
0.00843048095703125,
0.042449951171875,
-0.0010499954223632812,
0.0018968582153320312,
0.03643798828125,
0.08477783203125,
-0.0362548828125,
-0.000047266483306884766,
-0.06396484375,
-0.0008573532104492188,
-0.031219482421875,
-0.039215087890625,
-0.0645751953125,
-0.043853759765625,
-0.03369140625,
-0.009918212890625,
-0.003086090087890625,
0.05352783203125,
0.0743408203125,
-0.053924560546875,
-0.0236968994140625,
-0.0011720657348632812,
0.016387939453125,
-0.007167816162109375,
-0.0185089111328125,
0.01194000244140625,
-0.00893402099609375,
-0.1011962890625,
0.038604736328125,
0.006992340087890625,
0.037506103515625,
-0.021331787109375,
0.00560760498046875,
-0.00865936279296875,
0.01110076904296875,
0.052154541015625,
0.0287628173828125,
-0.049102783203125,
-0.01447296142578125,
-0.01418304443359375,
-0.007610321044921875,
0.0109405517578125,
0.046417236328125,
-0.015655517578125,
0.0355224609375,
0.0404052734375,
0.018218994140625,
0.010711669921875,
-0.00342559814453125,
0.0435791015625,
-0.0263671875,
-0.0017194747924804688,
0.0228271484375,
0.011199951171875,
0.004238128662109375,
-0.036529541015625,
0.03924560546875,
0.0176544189453125,
-0.0251617431640625,
-0.04693603515625,
0.0177459716796875,
-0.06866455078125,
-0.0287322998046875,
0.046844482421875,
0.01033782958984375,
-0.039459228515625,
0.0224456787109375,
-0.044525146484375,
0.0025177001953125,
-0.0205841064453125,
0.04302978515625,
0.07037353515625,
-0.019927978515625,
-0.0235137939453125,
-0.052398681640625,
0.02252197265625,
0.01806640625,
-0.06988525390625,
-0.034027099609375,
0.0660400390625,
0.00830841064453125,
0.037445068359375,
0.060791015625,
-0.019927978515625,
0.061553955078125,
0.00586700439453125,
0.0211181640625,
0.0186767578125,
-0.041107177734375,
-0.044525146484375,
0.003841400146484375,
-0.005626678466796875,
-0.0237884521484375
]
] |
felflare/bert-restore-punctuation | 2021-05-24T03:04:47.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"punctuation",
"en",
"dataset:yelp_polarity",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | felflare | null | null | felflare/bert-restore-punctuation | 50 | 14,118 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
tags:
- punctuation
license: mit
datasets:
- yelp_polarity
metrics:
- f1
---
# ✨ bert-restore-punctuation
[]()
This is a bert-base-uncased model fine-tuned for punctuation restoration on [Yelp Reviews](https://www.tensorflow.org/datasets/catalog/yelp_polarity_reviews).
The model predicts the punctuation and upper-casing of plain, lower-cased text. An example use case is ASR output, or other cases where text has lost its punctuation.
This model is intended for direct use as a punctuation restoration model for the general English language. Alternatively, you can use this for further fine-tuning on domain-specific texts for punctuation restoration tasks.
The model restores the following punctuation marks -- **[! ? . , - : ; ' ]**
The model also restores the upper-casing of words.
-----------------------------------------------
## 🚋 Usage
**Below is a quick way to get up and running with the model.**
1. First, install the package.
```bash
pip install rpunct
```
2. Sample python code.
```python
from rpunct import RestorePuncts
# The default language is 'english'
rpunct = RestorePuncts()
rpunct.punctuate("""in 2018 cornell researchers built a high-powered detector that in combination with an algorithm-driven process called ptychography set a world record
by tripling the resolution of a state-of-the-art electron microscope as successful as it was that approach had a weakness it only worked with ultrathin samples that were
a few atoms thick anything thicker would cause the electrons to scatter in ways that could not be disentangled now a team again led by david muller the samuel b eckert
professor of engineering has bested its own record by a factor of two with an electron microscope pixel array detector empad that incorporates even more sophisticated
3d reconstruction algorithms the resolution is so fine-tuned the only blurring that remains is the thermal jiggling of the atoms themselves""")
# Outputs the following:
# In 2018, Cornell researchers built a high-powered detector that, in combination with an algorithm-driven process called Ptychography, set a world record by tripling the
# resolution of a state-of-the-art electron microscope. As successful as it was, that approach had a weakness. It only worked with ultrathin samples that were a few atoms
# thick. Anything thicker would cause the electrons to scatter in ways that could not be disentangled. Now, a team again led by David Muller, the Samuel B.
# Eckert Professor of Engineering, has bested its own record by a factor of two with an Electron microscope pixel array detector empad that incorporates even more
# sophisticated 3d reconstruction algorithms. The resolution is so fine-tuned the only blurring that remains is the thermal jiggling of the atoms themselves.
```
**This model works on arbitrarily large text in English language and uses GPU if available.**
-----------------------------------------------
## 📡 Training data
Here is the number of reviews we used for fine-tuning the model:
| Language | Number of text samples|
| -------- | ----------------- |
| English | 560,000 |
We found the best convergence around _**3 epochs**_, which is what is presented here and available for download.
-----------------------------------------------
## 🎯 Accuracy
The fine-tuned model obtained the following accuracy on 45,990 held-out text samples:
| Accuracy | Overall F1 | Eval Support |
| -------- | ---------------------- | ------------------- |
| 91% | 90% | 45,990 |
Below is a breakdown of the performance of the model by each label:
| label | precision | recall | f1-score | support|
| --------- | -------------|-------- | ----------|--------|
| **!** | 0.45 | 0.17 | 0.24 | 424
| **!+Upper** | 0.43 | 0.34 | 0.38 | 98
| **'** | 0.60 | 0.27 | 0.37 | 11
| **,** | 0.59 | 0.51 | 0.55 | 1522
| **,+Upper** | 0.52 | 0.50 | 0.51 | 239
| **-** | 0.00 | 0.00 | 0.00 | 18
| **.** | 0.69 | 0.84 | 0.75 | 2488
| **.+Upper** | 0.65 | 0.52 | 0.57 | 274
| **:** | 0.52 | 0.31 | 0.39 | 39
| **:+Upper** | 0.36 | 0.62 | 0.45 | 16
| **;** | 0.00 | 0.00 | 0.00 | 17
| **?** | 0.54 | 0.48 | 0.51 | 46
| **?+Upper** | 0.40 | 0.50 | 0.44 | 4
| **none** | 0.96 | 0.96 | 0.96 |35352
| **Upper** | 0.84 | 0.82 | 0.83 | 5442
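The f1-score column above is the harmonic mean of precision and recall. A quick sketch to sanity-check a couple of rows (table values are themselves rounded, so only rows where the rounding works out cleanly are checked):

```python
# f1 is the harmonic mean of precision and recall:
# f1 = 2 * P * R / (P + R), with f1 = 0 when both are 0.
def f1_score(precision: float, recall: float) -> float:
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# "," row from the table: precision 0.59, recall 0.51 -> f1 0.55
print(round(f1_score(0.59, 0.51), 2))  # → 0.55
# "Upper" row: precision 0.84, recall 0.82 -> f1 0.83
print(round(f1_score(0.84, 0.82), 2))  # → 0.83
```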
-----------------------------------------------
## ☕ Contact
Contact [Daulet Nurmanbetov](mailto:daulet.nurmanbetov@gmail.com) for questions, feedback and/or requests for similar models.
----------------------------------------------- | 4,976 | [
[
-0.033203125,
-0.06158447265625,
0.04974365234375,
-0.0135345458984375,
-0.020111083984375,
-0.0014486312866210938,
-0.00823974609375,
-0.0271148681640625,
0.0166778564453125,
0.021728515625,
-0.0149383544921875,
-0.052093505859375,
-0.040740966796875,
0.01290130615234375,
-0.026763916015625,
0.06695556640625,
0.004734039306640625,
-0.0006875991821289062,
0.00566864013671875,
0.0030670166015625,
-0.0161590576171875,
-0.03704833984375,
-0.038787841796875,
-0.0305938720703125,
0.0180206298828125,
0.01090240478515625,
0.0614013671875,
0.05322265625,
0.046356201171875,
0.023834228515625,
-0.026641845703125,
0.00824737548828125,
-0.0243988037109375,
-0.0291748046875,
0.004734039306640625,
-0.0207061767578125,
-0.045928955078125,
-0.004268646240234375,
0.032867431640625,
0.06207275390625,
-0.01422119140625,
-0.004756927490234375,
0.005779266357421875,
0.047698974609375,
-0.034393310546875,
-0.018463134765625,
-0.0250244140625,
0.005889892578125,
0.0010309219360351562,
-0.01097869873046875,
-0.041107177734375,
-0.0225677490234375,
0.004642486572265625,
-0.066650390625,
0.02447509765625,
0.0172882080078125,
0.10357666015625,
0.0009055137634277344,
-0.032196044921875,
0.0014896392822265625,
-0.05499267578125,
0.0640869140625,
-0.0611572265625,
0.034271240234375,
0.0196533203125,
-0.0018224716186523438,
-0.0137481689453125,
-0.06884765625,
-0.044464111328125,
-0.0081939697265625,
-0.0103912353515625,
0.0374755859375,
-0.03863525390625,
-0.0026226043701171875,
0.01229095458984375,
0.040924072265625,
-0.061248779296875,
0.00247955322265625,
-0.0305023193359375,
-0.0170440673828125,
0.03875732421875,
0.01459503173828125,
0.01222991943359375,
-0.01299285888671875,
-0.049072265625,
-0.022552490234375,
-0.01580810546875,
0.01068878173828125,
0.01947021484375,
0.0034122467041015625,
0.00014495849609375,
0.04010009765625,
-0.0225677490234375,
0.049835205078125,
0.01441192626953125,
-0.002620697021484375,
0.034149169921875,
-0.025970458984375,
-0.028228759765625,
0.00557708740234375,
0.0709228515625,
0.050262451171875,
0.0066375732421875,
0.004367828369140625,
-0.0162200927734375,
0.0125274658203125,
0.006534576416015625,
-0.054443359375,
-0.051361083984375,
0.01251220703125,
-0.04949951171875,
-0.0202484130859375,
0.0107879638671875,
-0.04510498046875,
0.0012464523315429688,
-0.01532745361328125,
0.045501708984375,
-0.04644775390625,
-0.0255889892578125,
0.02642822265625,
-0.021820068359375,
-0.00873565673828125,
0.0242767333984375,
-0.05804443359375,
0.0280303955078125,
0.0399169921875,
0.06768798828125,
-0.004711151123046875,
-0.0155181884765625,
0.0092926025390625,
0.0031986236572265625,
-0.036407470703125,
0.0699462890625,
-0.0271148681640625,
-0.03070068359375,
-0.022552490234375,
0.0080413818359375,
0.00807952880859375,
-0.029754638671875,
0.040130615234375,
-0.0297088623046875,
0.027557373046875,
-0.00958251953125,
-0.053955078125,
-0.034942626953125,
0.03167724609375,
-0.05499267578125,
0.09063720703125,
0.0017595291137695312,
-0.06378173828125,
0.0206756591796875,
-0.02392578125,
-0.006561279296875,
-0.0108184814453125,
0.018096923828125,
-0.051513671875,
0.0095672607421875,
0.0245819091796875,
0.0290069580078125,
-0.0116729736328125,
0.023712158203125,
-0.039306640625,
-0.02392578125,
0.012176513671875,
-0.025421142578125,
0.07989501953125,
0.0100250244140625,
-0.022186279296875,
-0.005901336669921875,
-0.0816650390625,
0.005863189697265625,
0.01256561279296875,
-0.034088134765625,
-0.0238189697265625,
-0.027984619140625,
0.001956939697265625,
0.0182037353515625,
0.0165557861328125,
-0.043548583984375,
0.0150909423828125,
-0.034942626953125,
0.042144775390625,
0.06353759765625,
0.005771636962890625,
0.0177001953125,
-0.0472412109375,
0.0229339599609375,
0.015869140625,
0.007152557373046875,
-0.00746917724609375,
-0.03875732421875,
-0.050262451171875,
-0.0285186767578125,
0.035491943359375,
0.04193115234375,
-0.0333251953125,
0.046142578125,
-0.01525115966796875,
-0.05078125,
-0.024627685546875,
-0.0094451904296875,
0.03070068359375,
0.05615234375,
0.032867431640625,
-0.02105712890625,
-0.030364990234375,
-0.0849609375,
-0.01129913330078125,
-0.015838623046875,
0.0019464492797851562,
0.0174407958984375,
0.050750732421875,
0.0014047622680664062,
0.053802490234375,
-0.05511474609375,
-0.018768310546875,
-0.0012416839599609375,
0.0093841552734375,
0.0474853515625,
0.03717041015625,
0.038818359375,
-0.041412353515625,
-0.060882568359375,
0.00325775146484375,
-0.054534912109375,
0.00481414794921875,
0.0163421630859375,
-0.01403045654296875,
0.03692626953125,
0.0277862548828125,
-0.05401611328125,
0.030029296875,
0.0389404296875,
-0.042510986328125,
0.061553955078125,
-0.034210205078125,
0.0145416259765625,
-0.095947265625,
0.031585693359375,
0.008575439453125,
-0.0185394287109375,
-0.05328369140625,
-0.01812744140625,
0.01861572265625,
-0.003093719482421875,
-0.034820556640625,
0.0312347412109375,
-0.04510498046875,
0.011260986328125,
-0.00406646728515625,
0.004993438720703125,
0.02081298828125,
0.05267333984375,
0.0012760162353515625,
0.054229736328125,
0.032318115234375,
-0.033355712890625,
0.02239990234375,
0.031890869140625,
-0.0347900390625,
0.045196533203125,
-0.04595947265625,
0.0005807876586914062,
-0.00919342041015625,
0.01123809814453125,
-0.08929443359375,
-0.00856781005859375,
0.004962921142578125,
-0.037628173828125,
0.0133819580078125,
0.0145263671875,
-0.033843994140625,
-0.041961669921875,
-0.0362548828125,
0.005279541015625,
0.03448486328125,
-0.032745361328125,
0.0263824462890625,
0.014923095703125,
-0.0051727294921875,
-0.030303955078125,
-0.051361083984375,
-0.007167816162109375,
-0.006748199462890625,
-0.04901123046875,
0.03631591796875,
-0.0165863037109375,
-0.0235443115234375,
0.00843048095703125,
0.004138946533203125,
-0.006618499755859375,
0.005115509033203125,
0.00867462158203125,
0.036468505859375,
-0.01392364501953125,
0.01078033447265625,
-0.00717926025390625,
-0.01305389404296875,
-0.004138946533203125,
-0.017364501953125,
0.049102783203125,
-0.0189361572265625,
-0.004024505615234375,
-0.05926513671875,
0.00954437255859375,
0.045806884765625,
0.0018062591552734375,
0.042327880859375,
0.027801513671875,
-0.03814697265625,
0.00506591796875,
-0.045623779296875,
-0.024200439453125,
-0.038330078125,
0.03094482421875,
-0.034759521484375,
-0.04925537109375,
0.049713134765625,
0.0106048583984375,
-0.00823211669921875,
0.058013916015625,
0.03289794921875,
-0.02203369140625,
0.10064697265625,
0.049560546875,
-0.0009236335754394531,
0.031036376953125,
-0.040740966796875,
0.01995849609375,
-0.056884765625,
-0.0297088623046875,
-0.037689208984375,
-0.0247802734375,
-0.035247802734375,
0.0133056640625,
0.029754638671875,
0.014068603515625,
-0.024444580078125,
0.0253143310546875,
-0.07159423828125,
0.0211181640625,
0.062744140625,
0.0308837890625,
0.033203125,
0.01320648193359375,
-0.0239105224609375,
-0.0171051025390625,
-0.054901123046875,
-0.04034423828125,
0.08984375,
0.0187530517578125,
0.042694091796875,
-0.00012576580047607422,
0.044769287109375,
0.0225677490234375,
0.00771331787109375,
-0.048370361328125,
0.03167724609375,
-0.012847900390625,
-0.061614990234375,
-0.013397216796875,
-0.017303466796875,
-0.07275390625,
0.0321044921875,
-0.020355224609375,
-0.059661865234375,
0.0214996337890625,
0.01232147216796875,
-0.052825927734375,
0.051788330078125,
-0.049346923828125,
0.057342529296875,
-0.018096923828125,
-0.038543701171875,
0.0030345916748046875,
-0.042327880859375,
0.0294647216796875,
-0.0050506591796875,
0.0149383544921875,
-0.007152557373046875,
0.01554107666015625,
0.06707763671875,
-0.038177490234375,
0.04083251953125,
0.000186920166015625,
0.0010633468627929688,
0.0139923095703125,
0.0211944580078125,
0.04290771484375,
-0.00888824462890625,
-0.005702972412109375,
0.00994873046875,
0.0119171142578125,
-0.0244598388671875,
-0.01454925537109375,
0.0538330078125,
-0.061187744140625,
-0.04443359375,
-0.046905517578125,
-0.043426513671875,
0.004497528076171875,
0.04638671875,
0.034820556640625,
0.034820556640625,
-0.01027679443359375,
0.0193634033203125,
0.05255126953125,
-0.00807952880859375,
0.048065185546875,
0.03778076171875,
-0.00974273681640625,
-0.056732177734375,
0.0654296875,
0.01294708251953125,
0.0084991455078125,
0.0245819091796875,
0.0139923095703125,
-0.0174407958984375,
-0.043182373046875,
-0.0238494873046875,
0.0285797119140625,
-0.040985107421875,
-0.02069091796875,
-0.056640625,
-0.0073699951171875,
-0.041290283203125,
-0.02239990234375,
-0.0249786376953125,
-0.04931640625,
-0.04693603515625,
-0.009490966796875,
0.0218505859375,
0.053955078125,
-0.012298583984375,
0.0285797119140625,
-0.055694580078125,
0.0036487579345703125,
0.0100860595703125,
0.0258026123046875,
-0.00585174560546875,
-0.045379638671875,
-0.0163421630859375,
0.0059356689453125,
-0.04901123046875,
-0.059661865234375,
0.052764892578125,
-0.002643585205078125,
0.02923583984375,
0.0299224853515625,
0.005458831787109375,
0.05322265625,
-0.027862548828125,
0.06854248046875,
0.031585693359375,
-0.06787109375,
0.036956787109375,
-0.028167724609375,
0.0267181396484375,
0.042694091796875,
0.03448486328125,
-0.02337646484375,
-0.0307464599609375,
-0.07257080078125,
-0.06768798828125,
0.06732177734375,
0.0190887451171875,
-0.020416259765625,
0.00865936279296875,
0.0177459716796875,
0.00037026405334472656,
0.01392364501953125,
-0.0604248046875,
-0.0198974609375,
-0.01580810546875,
-0.0286102294921875,
-0.001216888427734375,
-0.016876220703125,
-0.00843048095703125,
-0.031585693359375,
0.078369140625,
0.01206207275390625,
0.0330810546875,
0.033172607421875,
-0.01068115234375,
-0.0077362060546875,
0.003936767578125,
0.0496826171875,
0.05926513671875,
-0.042694091796875,
0.016387939453125,
0.006778717041015625,
-0.040618896484375,
-0.00009232759475708008,
0.0229034423828125,
-0.0271148681640625,
0.0126953125,
0.0230560302734375,
0.05145263671875,
0.00336456298828125,
-0.022064208984375,
0.0297088623046875,
0.005710601806640625,
-0.0406494140625,
-0.0279388427734375,
-0.016021728515625,
-0.008575439453125,
0.01849365234375,
0.02685546875,
0.004161834716796875,
0.0190887451171875,
-0.03497314453125,
0.01392364501953125,
0.039459228515625,
-0.034820556640625,
-0.030517578125,
0.06597900390625,
0.001312255859375,
-0.0225982666015625,
0.044677734375,
0.00119781494140625,
-0.05169677734375,
0.0574951171875,
0.0469970703125,
0.05169677734375,
-0.01446533203125,
0.0126495361328125,
0.05743408203125,
0.031524658203125,
-0.0203399658203125,
0.031524658203125,
0.01203155517578125,
-0.041534423828125,
0.00693511962890625,
-0.043182373046875,
-0.022064208984375,
0.020599365234375,
-0.05169677734375,
0.01483917236328125,
-0.04052734375,
-0.042694091796875,
0.02294921875,
0.023712158203125,
-0.0352783203125,
0.02667236328125,
-0.0012426376342773438,
0.07830810546875,
-0.0699462890625,
0.059234619140625,
0.048004150390625,
-0.034393310546875,
-0.045623779296875,
-0.0223236083984375,
-0.00861358642578125,
-0.04736328125,
0.03570556640625,
0.0140380859375,
0.002655029296875,
0.020721435546875,
-0.0509033203125,
-0.08331298828125,
0.09405517578125,
0.0102691650390625,
-0.053009033203125,
0.002899169921875,
0.0016813278198242188,
0.046539306640625,
0.007144927978515625,
0.03302001953125,
0.03424072265625,
0.03631591796875,
-0.0207366943359375,
-0.0732421875,
0.0216522216796875,
-0.034210205078125,
0.003070831298828125,
0.02215576171875,
-0.0673828125,
0.08294677734375,
0.007038116455078125,
-0.0088653564453125,
0.021453857421875,
0.039794921875,
0.02392578125,
0.0233154296875,
0.033660888671875,
0.08184814453125,
0.0517578125,
-0.027587890625,
0.0633544921875,
-0.046173095703125,
0.05413818359375,
0.054656982421875,
-0.00762939453125,
0.05523681640625,
0.0357666015625,
-0.027801513671875,
0.042694091796875,
0.045806884765625,
-0.0278778076171875,
0.041717529296875,
-0.006595611572265625,
0.00433349609375,
-0.0299224853515625,
-0.00196075439453125,
-0.034881591796875,
0.0075531005859375,
0.017120361328125,
-0.034942626953125,
-0.005290985107421875,
0.0014162063598632812,
0.008575439453125,
0.0019664764404296875,
-0.01468658447265625,
0.0523681640625,
0.0008783340454101562,
-0.042694091796875,
0.0604248046875,
0.0004916191101074219,
0.047210693359375,
-0.034423828125,
-0.0025463104248046875,
-0.01374053955078125,
0.03753662109375,
-0.0120849609375,
-0.067138671875,
0.0026111602783203125,
-0.0238494873046875,
-0.0184326171875,
-0.0126495361328125,
0.04937744140625,
-0.01739501953125,
-0.052490234375,
0.0211181640625,
0.01190948486328125,
0.006778717041015625,
0.00788116455078125,
-0.048004150390625,
0.0014019012451171875,
0.002918243408203125,
-0.04449462890625,
-0.00997161865234375,
0.028228759765625,
0.016693115234375,
0.03826904296875,
0.04962158203125,
0.0011320114135742188,
0.00415802001953125,
-0.0310821533203125,
0.0648193359375,
-0.0618896484375,
-0.048828125,
-0.057769775390625,
0.050201416015625,
-0.01629638671875,
-0.03472900390625,
0.06597900390625,
0.07806396484375,
0.07427978515625,
-0.001300811767578125,
0.0552978515625,
0.008026123046875,
0.044036865234375,
-0.036956787109375,
0.0548095703125,
-0.0635986328125,
-0.0086212158203125,
-0.0235595703125,
-0.06549072265625,
-0.045379638671875,
0.0521240234375,
-0.01198577880859375,
0.002685546875,
0.058685302734375,
0.06024169921875,
-0.01238250732421875,
0.0021381378173828125,
0.028228759765625,
0.039886474609375,
0.017333984375,
0.05084228515625,
0.045684814453125,
-0.05047607421875,
0.054901123046875,
-0.061126708984375,
-0.022613525390625,
-0.030029296875,
-0.041046142578125,
-0.07049560546875,
-0.04034423828125,
-0.026275634765625,
-0.03936767578125,
0.007007598876953125,
0.04852294921875,
0.046173095703125,
-0.0672607421875,
-0.01297760009765625,
-0.0127105712890625,
-0.01190185546875,
-0.0291900634765625,
-0.0195159912109375,
0.056884765625,
0.005115509033203125,
-0.06463623046875,
0.00782012939453125,
0.012420654296875,
0.0286712646484375,
-0.0035343170166015625,
-0.0006299018859863281,
-0.0309600830078125,
0.0034942626953125,
0.03656005859375,
0.0135040283203125,
-0.0504150390625,
-0.01275634765625,
0.013824462890625,
-0.0203704833984375,
0.021392822265625,
0.020782470703125,
-0.043792724609375,
0.034393310546875,
0.05047607421875,
0.01059722900390625,
0.057342529296875,
-0.0005631446838378906,
0.010833740234375,
-0.02569580078125,
0.004001617431640625,
0.018524169921875,
0.0350341796875,
0.0143585205078125,
-0.0251617431640625,
0.0396728515625,
0.03582763671875,
-0.047607421875,
-0.0787353515625,
-0.007747650146484375,
-0.0897216796875,
-0.0421142578125,
0.070068359375,
-0.001071929931640625,
-0.0343017578125,
0.00391387939453125,
-0.01444244384765625,
0.0286712646484375,
-0.03814697265625,
0.0767822265625,
0.072265625,
-0.0105133056640625,
-0.00910186767578125,
-0.0310821533203125,
0.02044677734375,
0.050048828125,
-0.04522705078125,
-0.016204833984375,
0.04071044921875,
0.02984619140625,
0.0313720703125,
0.04046630859375,
-0.00815582275390625,
0.020599365234375,
0.00031304359436035156,
0.0191650390625,
-0.0207366943359375,
-0.030670166015625,
-0.025848388671875,
0.0204925537109375,
-0.01418304443359375,
-0.0197601318359375
]
] |
akreal/tiny-random-mbart | 2022-06-07T18:16:58.000Z | [
"transformers",
"pytorch",
"tf",
"mbart",
"endpoints_compatible",
"region:us"
] | null | akreal | null | null | akreal/tiny-random-mbart | 0 | 14,071 | transformers | 2022-03-02T23:29:05 | This is a copy of: https://huggingface.co/hf-internal-testing/tiny-random-mbart
Changes: use old format for `pytorch_model.bin`.
| 130 | [
[
-0.0248565673828125,
-0.07550048828125,
0.00038933753967285156,
0.0245208740234375,
-0.0241851806640625,
-0.0225677490234375,
0.001201629638671875,
-0.00800323486328125,
0.031494140625,
0.0357666015625,
-0.044891357421875,
-0.0141143798828125,
-0.01264190673828125,
0.0018777847290039062,
-0.0258941650390625,
0.09075927734375,
-0.0021305084228515625,
0.01053619384765625,
-0.0134735107421875,
-0.0181121826171875,
0.004741668701171875,
-0.034942626953125,
-0.08514404296875,
-0.02349853515625,
0.04742431640625,
0.0328369140625,
0.060699462890625,
0.0284881591796875,
0.08331298828125,
0.004817962646484375,
-0.0183868408203125,
-0.0439453125,
-0.0144500732421875,
-0.02825927734375,
0.010528564453125,
0.0006856918334960938,
-0.058380126953125,
0.00177764892578125,
0.06695556640625,
0.045379638671875,
-0.03564453125,
0.034515380859375,
0.0014095306396484375,
0.049224853515625,
-0.03814697265625,
0.01910400390625,
-0.0228729248046875,
0.016357421875,
0.01015472412109375,
0.0005335807800292969,
-0.034088134765625,
-0.0218963623046875,
-0.01934814453125,
-0.007762908935546875,
0.035064697265625,
0.024078369140625,
0.0645751953125,
0.0235748291015625,
-0.04766845703125,
0.01088714599609375,
-0.03790283203125,
0.00901031494140625,
-0.046875,
0.022064208984375,
0.033721923828125,
0.05401611328125,
-0.034027099609375,
-0.0733642578125,
-0.022369384765625,
-0.04632568359375,
0.0198211669921875,
-0.0289459228515625,
-0.025482177734375,
-0.001949310302734375,
0.040924072265625,
0.028900146484375,
-0.030853271484375,
-0.0225677490234375,
-0.046905517578125,
-0.0258026123046875,
0.04034423828125,
0.00727081298828125,
0.00206756591796875,
-0.02069091796875,
-0.01605224609375,
-0.0304107666015625,
-0.060211181640625,
-0.0009121894836425781,
0.026123046875,
0.0199737548828125,
-0.04046630859375,
0.0567626953125,
-0.01296234130859375,
0.050872802734375,
0.01361846923828125,
0.0145721435546875,
0.059326171875,
0.0007686614990234375,
-0.02655029296875,
0.02593994140625,
0.050872802734375,
0.029388427734375,
0.0174560546875,
0.0242767333984375,
-0.00629425048828125,
-0.030029296875,
0.019134521484375,
-0.090576171875,
-0.07574462890625,
-0.004314422607421875,
-0.04571533203125,
-0.039154052734375,
0.0266265869140625,
-0.037384033203125,
-0.01190948486328125,
0.0012836456298828125,
0.04107666015625,
-0.039031982421875,
-0.01561737060546875,
-0.004520416259765625,
-0.0078277587890625,
0.00493621826171875,
0.0206756591796875,
-0.04046630859375,
0.0273895263671875,
0.038970947265625,
0.045074462890625,
0.0221710205078125,
-0.03240966796875,
-0.0277252197265625,
-0.0291900634765625,
-0.007007598876953125,
0.043243408203125,
0.017120361328125,
-0.0284271240234375,
-0.0242919921875,
0.01806640625,
0.0021839141845703125,
-0.056243896484375,
0.042083740234375,
-0.023529052734375,
-0.0029277801513671875,
-0.0179595947265625,
-0.00957489013671875,
-0.0021305084228515625,
0.004337310791015625,
-0.04656982421875,
0.1123046875,
0.04583740234375,
-0.0572509765625,
0.019989013671875,
-0.06353759765625,
-0.025299072265625,
-0.0201873779296875,
0.0173187255859375,
-0.058685302734375,
0.0093536376953125,
-0.01617431640625,
0.01198577880859375,
0.0240020751953125,
-0.0276031494140625,
-0.0550537109375,
-0.0361328125,
0.04736328125,
-0.01419830322265625,
0.07977294921875,
0.0165557861328125,
-0.01593017578125,
0.01441192626953125,
-0.0760498046875,
-0.00510406494140625,
0.02130126953125,
-0.00637054443359375,
-0.004276275634765625,
-0.0213165283203125,
0.0178070068359375,
0.02752685546875,
0.000598907470703125,
-0.0462646484375,
0.04278564453125,
0.006809234619140625,
0.019439697265625,
0.046234130859375,
0.029327392578125,
0.0293731689453125,
-0.037261962890625,
0.0273895263671875,
0.0050201416015625,
0.03656005859375,
0.02044677734375,
-0.06817626953125,
-0.04217529296875,
-0.054107666015625,
0.0093841552734375,
0.0137481689453125,
-0.0086212158203125,
0.0255279541015625,
0.0164642333984375,
-0.042236328125,
-0.030029296875,
0.0035953521728515625,
-0.004894256591796875,
0.028564453125,
0.011688232421875,
-0.033355712890625,
-0.049468994140625,
-0.06927490234375,
0.0128326416015625,
-0.0183258056640625,
-0.00569915771484375,
0.0095672607421875,
0.0792236328125,
-0.05242919921875,
0.0670166015625,
-0.0438232421875,
-0.00894927978515625,
-0.0037441253662109375,
0.0130615234375,
0.01666259765625,
0.045257568359375,
0.0772705078125,
-0.0272979736328125,
-0.026458740234375,
-0.019805908203125,
-0.00669097900390625,
-0.0160980224609375,
0.0222625732421875,
-0.034942626953125,
0.0070343017578125,
0.0293731689453125,
-0.05810546875,
0.045623779296875,
0.03387451171875,
-0.057464599609375,
0.056976318359375,
-0.009674072265625,
0.01201629638671875,
-0.0673828125,
0.004512786865234375,
-0.01385498046875,
-0.0290679931640625,
0.020965576171875,
0.0218963623046875,
0.041259765625,
0.006847381591796875,
-0.053436279296875,
0.057952880859375,
-0.0146636962890625,
-0.006229400634765625,
-0.0298919677734375,
-0.018768310546875,
-0.02313232421875,
0.0106658935546875,
-0.051361083984375,
0.047637939453125,
0.0249786376953125,
-0.0251617431640625,
0.052093505859375,
0.03399658203125,
-0.004695892333984375,
0.039703369140625,
-0.03240966796875,
0.033172607421875,
0.0061492919921875,
0.026123046875,
-0.059295654296875,
-0.042022705078125,
0.059844970703125,
-0.043670654296875,
0.01763916015625,
-0.02197265625,
-0.0396728515625,
-0.034942626953125,
-0.0272064208984375,
0.0296478271484375,
0.08270263671875,
-0.03521728515625,
0.021942138671875,
0.00406646728515625,
-0.0047607421875,
-0.007076263427734375,
-0.052947998046875,
-0.0138702392578125,
-0.0020542144775390625,
-0.03753662109375,
-0.005779266357421875,
-0.00858306884765625,
-0.034942626953125,
0.0112457275390625,
-0.004550933837890625,
-0.028900146484375,
0.00885772705078125,
0.020965576171875,
0.01556396484375,
-0.042327880859375,
-0.020599365234375,
-0.021759033203125,
0.0005888938903808594,
0.0167999267578125,
0.0021419525146484375,
0.0218048095703125,
-0.022308349609375,
-0.0240325927734375,
-0.0279083251953125,
-0.00020265579223632812,
0.02313232421875,
0.0048980712890625,
0.049163818359375,
0.0731201171875,
-0.040374755859375,
-0.00807952880859375,
-0.0271453857421875,
-0.048004150390625,
-0.03155517578125,
0.01904296875,
-0.01312255859375,
-0.05242919921875,
0.035064697265625,
0.024017333984375,
-0.0323486328125,
0.062164306640625,
0.046844482421875,
-0.0258026123046875,
0.05572509765625,
0.058563232421875,
-0.00160980224609375,
0.044036865234375,
-0.009490966796875,
0.007526397705078125,
-0.07470703125,
0.00592803955078125,
-0.04437255859375,
-0.0223541259765625,
-0.031524658203125,
-0.01348114013671875,
0.01132965087890625,
0.01465606689453125,
-0.04766845703125,
0.023529052734375,
-0.0245208740234375,
0.01290130615234375,
0.049102783203125,
0.0146636962890625,
0.01531219482421875,
0.016326904296875,
-0.0185546875,
-0.0009322166442871094,
-0.042999267578125,
-0.01239013671875,
0.0711669921875,
0.016876220703125,
0.059906005859375,
0.0138397216796875,
0.036895751953125,
-0.00238037109375,
0.06353759765625,
-0.032684326171875,
0.025634765625,
0.0214996337890625,
-0.0677490234375,
0.004787445068359375,
-0.049285888671875,
-0.059661865234375,
0.0154266357421875,
-0.00533294677734375,
-0.0787353515625,
-0.0270538330078125,
0.0335693359375,
-0.021209716796875,
0.033203125,
-0.06256103515625,
0.09136962890625,
0.01806640625,
-0.0010471343994140625,
0.00621795654296875,
-0.034332275390625,
0.0220947265625,
0.004657745361328125,
0.0077972412109375,
0.0006833076477050781,
0.0236663818359375,
0.053192138671875,
-0.0306549072265625,
0.048370361328125,
-0.006435394287109375,
0.0178070068359375,
0.04302978515625,
0.029083251953125,
0.03369140625,
-0.0015001296997070312,
-0.003833770751953125,
-0.00707244873046875,
0.0175323486328125,
-0.03851318359375,
-0.0237884521484375,
0.035430908203125,
-0.045745849609375,
-0.002727508544921875,
-0.05096435546875,
-0.0294189453125,
0.01439666748046875,
0.0018062591552734375,
0.01081085205078125,
0.01837158203125,
-0.03399658203125,
0.050048828125,
0.032073974609375,
0.00760650634765625,
0.01276397705078125,
0.022308349609375,
-0.039154052734375,
-0.0194854736328125,
0.0207672119140625,
-0.039337158203125,
0.01910400390625,
0.008514404296875,
0.046112060546875,
0.00307464599609375,
-0.0193023681640625,
-0.0181732177734375,
0.0158843994140625,
-0.0440673828125,
-0.0026111602783203125,
-0.038421630859375,
-0.01126861572265625,
-0.0545654296875,
-0.01033782958984375,
-0.035125732421875,
-0.041259765625,
-0.0185089111328125,
-0.003986358642578125,
0.06634521484375,
0.037200927734375,
-0.040283203125,
0.05401611328125,
-0.0562744140625,
0.01383209228515625,
-0.0099334716796875,
0.02105712890625,
-0.01485443115234375,
-0.0654296875,
-0.023956298828125,
0.00019156932830810547,
-0.036651611328125,
-0.057647705078125,
0.0162353515625,
-0.01007080078125,
0.02813720703125,
0.004306793212890625,
-0.0164947509765625,
0.0182647705078125,
-0.02105712890625,
0.0256805419921875,
0.01468658447265625,
-0.028564453125,
0.021240234375,
-0.03460693359375,
0.036163330078125,
0.054595947265625,
0.006992340087890625,
-0.034637451171875,
-0.006679534912109375,
-0.081298828125,
-0.053253173828125,
0.053009033203125,
0.03826904296875,
-0.00974273681640625,
0.023101806640625,
0.03350830078125,
-0.0097808837890625,
0.022491455078125,
-0.033599853515625,
-0.032012939453125,
0.005847930908203125,
-0.04498291015625,
-0.0211639404296875,
-0.0299072265625,
-0.01361846923828125,
-0.053558349609375,
0.052154541015625,
-0.0010805130004882812,
0.04998779296875,
0.0152130126953125,
0.001926422119140625,
-0.036376953125,
-0.006900787353515625,
0.033935546875,
0.033294677734375,
-0.08477783203125,
0.01084136962890625,
0.0084228515625,
-0.079345703125,
0.001148223876953125,
0.0215606689453125,
-0.01027679443359375,
0.005512237548828125,
0.01535797119140625,
0.051422119140625,
0.036468505859375,
-0.0010166168212890625,
0.028900146484375,
-0.014984130859375,
-0.00969696044921875,
-0.03826904296875,
0.009613037109375,
0.003948211669921875,
0.0017976760864257812,
0.0096282958984375,
0.03729248046875,
-0.00035834312438964844,
-0.040771484375,
0.0491943359375,
0.005672454833984375,
-0.0291290283203125,
-0.02099609375,
0.034393310546875,
-0.0012874603271484375,
-0.04644775390625,
0.055328369140625,
-0.06494140625,
-0.0266265869140625,
0.050689697265625,
0.026397705078125,
0.06829833984375,
-0.00849151611328125,
-0.002307891845703125,
0.031524658203125,
0.046600341796875,
-0.006244659423828125,
0.02825927734375,
-0.0192718505859375,
-0.0294189453125,
0.000492095947265625,
-0.03826904296875,
-0.037872314453125,
-0.0141143798828125,
-0.043487548828125,
0.0166168212890625,
-0.058563232421875,
-0.0162506103515625,
-0.007747650146484375,
0.004001617431640625,
-0.049468994140625,
0.0145416259765625,
-0.01361846923828125,
0.07708740234375,
-0.033294677734375,
0.07220458984375,
0.07000732421875,
-0.023895263671875,
-0.0645751953125,
-0.019317626953125,
0.0284271240234375,
-0.04046630859375,
0.0253753662109375,
-0.0162811279296875,
0.0182647705078125,
-0.005474090576171875,
-0.054168701171875,
-0.06036376953125,
0.06060791015625,
0.0237579345703125,
-0.0200958251953125,
0.01357269287109375,
-0.01319122314453125,
0.0119476318359375,
-0.0167999267578125,
0.02496337890625,
0.036407470703125,
0.01052093505859375,
0.0156707763671875,
-0.057464599609375,
0.00311279296875,
-0.0223388671875,
-0.01502227783203125,
0.0238189697265625,
-0.040191650390625,
0.08746337890625,
-0.01427459716796875,
0.034027099609375,
0.022552490234375,
0.0435791015625,
0.043304443359375,
0.0188751220703125,
0.01457977294921875,
0.04052734375,
0.0384521484375,
-0.01788330078125,
0.09075927734375,
0.00896453857421875,
0.072021484375,
0.0975341796875,
-0.0260467529296875,
0.0673828125,
0.054901123046875,
-0.011474609375,
0.03216552734375,
0.06549072265625,
-0.02960205078125,
0.021209716796875,
0.01111602783203125,
0.0034885406494140625,
0.02191162109375,
0.035369873046875,
-0.042633056640625,
0.0175323486328125,
-0.0166473388671875,
-0.031768798828125,
-0.0177459716796875,
-0.00920867919921875,
0.026397705078125,
-0.05517578125,
-0.0189056396484375,
0.006855010986328125,
0.0272369384765625,
-0.00257110595703125,
0.013092041015625,
0.0140380859375,
0.04595947265625,
-0.0400390625,
0.0058441162109375,
0.0014734268188476562,
0.061370849609375,
0.00550079345703125,
-0.0340576171875,
0.0367431640625,
-0.0185546875,
0.008087158203125,
-0.0083465576171875,
0.07171630859375,
-0.00415802001953125,
-0.0550537109375,
0.0146942138671875,
0.017791748046875,
0.01332855224609375,
-0.035369873046875,
-0.0675048828125,
0.0097198486328125,
-0.004856109619140625,
-0.040771484375,
0.022705078125,
0.02294921875,
0.0175323486328125,
0.034454345703125,
0.035186767578125,
-0.00705718994140625,
0.01131439208984375,
0.01508331298828125,
0.0100250244140625,
-0.0654296875,
-0.04400634765625,
-0.053253173828125,
0.0270538330078125,
-0.00530242919921875,
-0.0638427734375,
0.0633544921875,
0.046875,
0.076171875,
-0.0277252197265625,
0.01113128662109375,
-0.01641845703125,
-0.0045166015625,
-0.0230712890625,
0.06683349609375,
-0.034423828125,
-0.049285888671875,
-0.038970947265625,
-0.07232666015625,
-0.0207672119140625,
0.0789794921875,
0.0193939208984375,
0.03253173828125,
0.0924072265625,
0.06378173828125,
-0.03375244140625,
0.01137542724609375,
0.005512237548828125,
0.05023193359375,
-0.00785064697265625,
0.01143646240234375,
0.051727294921875,
-0.0462646484375,
0.02178955078125,
-0.0667724609375,
-0.025482177734375,
-0.0423583984375,
-0.07171630859375,
-0.06683349609375,
-0.035919189453125,
-0.05560302734375,
-0.05157470703125,
-0.0186004638671875,
0.09710693359375,
0.058380126953125,
-0.04205322265625,
-0.0180511474609375,
0.01148223876953125,
-0.0200958251953125,
-0.0235443115234375,
-0.0165252685546875,
0.033050537109375,
0.03631591796875,
-0.0068206787109375,
-0.01229095458984375,
-0.0019140243530273438,
0.0328369140625,
0.0037994384765625,
0.00756072998046875,
-0.01087188720703125,
-0.006916046142578125,
0.029998779296875,
0.015472412109375,
-0.01236724853515625,
-0.045074462890625,
-0.010894775390625,
-0.0296478271484375,
0.0228118896484375,
0.048828125,
-0.0173797607421875,
-0.011199951171875,
0.02520751953125,
0.016754150390625,
0.03961181640625,
-0.02825927734375,
0.0037136077880859375,
-0.0462646484375,
0.041839599609375,
0.004520416259765625,
0.06427001953125,
0.035003662109375,
-0.02655029296875,
0.0284271240234375,
0.041534423828125,
-0.0253753662109375,
-0.06182861328125,
-0.004703521728515625,
-0.10321044921875,
0.0020599365234375,
0.0931396484375,
0.02862548828125,
-0.061737060546875,
0.038177490234375,
-0.0665283203125,
0.038330078125,
-0.041290283203125,
0.056884765625,
0.056610107421875,
0.0035648345947265625,
-0.025238037109375,
-0.044891357421875,
0.0263214111328125,
0.01020050048828125,
-0.03985595703125,
-0.0191650390625,
0.019927978515625,
0.041259765625,
-0.0106658935546875,
0.05548095703125,
-0.0211181640625,
0.035491943359375,
0.0278778076171875,
0.0312347412109375,
-0.016937255859375,
-0.0156402587890625,
-0.0400390625,
-0.0052947998046875,
0.0125885009765625,
-0.0181884765625
]
] |
nreimers/TinyBERT_L-4_H-312_v2 | 2021-05-28T11:02:32.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | feature-extraction | nreimers | null | null | nreimers/TinyBERT_L-4_H-312_v2 | 1 | 14,063 | transformers | 2022-03-02T23:29:05 | This is the [General_TinyBERT_v2(4layer-312dim)](https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/TinyBERT) ported to Huggingface transformers. | 164 | [
[
-0.04541015625,
-0.032562255859375,
0.00665283203125,
0.0299835205078125,
0.00484466552734375,
0.003215789794921875,
0.0101776123046875,
-0.0311126708984375,
0.0404052734375,
0.037689208984375,
-0.056854248046875,
-0.00013363361358642578,
-0.0194854736328125,
-0.0238189697265625,
-0.0256500244140625,
0.07611083984375,
0.009429931640625,
-0.00439453125,
-0.004726409912109375,
-0.0253448486328125,
0.0196685791015625,
-0.0032215118408203125,
-0.037750244140625,
-0.0606689453125,
0.042022705078125,
0.060150146484375,
0.08380126953125,
0.0249481201171875,
0.0126800537109375,
0.0138092041015625,
-0.0169677734375,
-0.037017822265625,
-0.0020160675048828125,
-0.04107666015625,
-0.00664520263671875,
-0.01279449462890625,
-0.060638427734375,
0.01439666748046875,
0.042999267578125,
0.0244903564453125,
0.0024318695068359375,
0.052581787109375,
0.004337310791015625,
0.045318603515625,
-0.0013017654418945312,
0.01129150390625,
-0.01024627685546875,
0.0145263671875,
-0.00974273681640625,
0.00955963134765625,
-0.009429931640625,
-0.0140228271484375,
0.033203125,
-0.035919189453125,
0.0132598876953125,
-0.0016794204711914062,
0.0662841796875,
0.035400390625,
-0.033843994140625,
-0.0025234222412109375,
-0.0396728515625,
0.051361083984375,
-0.0026111602783203125,
0.0430908203125,
0.02423095703125,
0.050811767578125,
0.0162200927734375,
-0.09613037109375,
-0.03558349609375,
-0.012969970703125,
-0.006855010986328125,
0.00396728515625,
-0.0228271484375,
0.004764556884765625,
0.0223846435546875,
0.038360595703125,
-0.0297088623046875,
0.004123687744140625,
-0.05389404296875,
-0.043701171875,
0.037445068359375,
-0.01280975341796875,
0.037750244140625,
-0.0275421142578125,
-0.032806396484375,
-0.007762908935546875,
-0.05047607421875,
0.0089874267578125,
0.03521728515625,
0.0021114349365234375,
-0.05780029296875,
0.046112060546875,
0.0024013519287109375,
0.041717529296875,
0.001972198486328125,
0.00725555419921875,
0.045257568359375,
-0.00632476806640625,
-0.01373291015625,
-0.0043792724609375,
0.03302001953125,
0.0158538818359375,
0.037689208984375,
0.0078582763671875,
-0.0080108642578125,
-0.0190277099609375,
0.03680419921875,
-0.085693359375,
-0.0655517578125,
0.0190277099609375,
-0.0372314453125,
-0.0190887451171875,
-0.00933074951171875,
-0.05865478515625,
-0.00423431396484375,
-0.0142364501953125,
0.018890380859375,
-0.0335693359375,
-0.027679443359375,
-0.0235595703125,
0.00693511962890625,
0.0404052734375,
0.022125244140625,
-0.07635498046875,
0.035400390625,
0.0305938720703125,
0.067626953125,
0.0190582275390625,
-0.0138092041015625,
-0.038970947265625,
-0.00946807861328125,
-0.00814056396484375,
0.039886474609375,
-0.00916290283203125,
-0.0153350830078125,
0.005786895751953125,
0.0087127685546875,
0.01168060302734375,
-0.031646728515625,
0.056915283203125,
-0.03692626953125,
0.0238494873046875,
0.040557861328125,
0.001827239990234375,
-0.00925445556640625,
0.0264892578125,
-0.05352783203125,
0.07562255859375,
0.062042236328125,
-0.06121826171875,
0.00009042024612426758,
-0.038482666015625,
-0.0238800048828125,
0.0284271240234375,
-0.0208740234375,
-0.057220458984375,
-0.0032501220703125,
-0.007518768310546875,
0.0126800537109375,
-0.01538848876953125,
-0.01317596435546875,
-0.01261138916015625,
-0.0217437744140625,
-0.03424072265625,
0.0264739990234375,
0.07354736328125,
0.0285797119140625,
-0.01018524169921875,
0.0210418701171875,
-0.04888916015625,
0.0289764404296875,
0.0156707763671875,
-0.0070953369140625,
-0.007965087890625,
-0.0217742919921875,
0.01290130615234375,
0.02764892578125,
0.035919189453125,
-0.041900634765625,
0.0335693359375,
-0.0270233154296875,
0.033477783203125,
0.0247802734375,
-0.0166473388671875,
0.0684814453125,
-0.02410888671875,
0.045684814453125,
-0.0013580322265625,
0.039337158203125,
0.00531005859375,
-0.0650634765625,
-0.056182861328125,
-0.0234832763671875,
0.03118896484375,
0.03814697265625,
-0.0740966796875,
0.0223846435546875,
-0.01535797119140625,
-0.06170654296875,
-0.03228759765625,
0.0290069580078125,
0.016357421875,
0.0003151893615722656,
0.037200927734375,
-0.022979736328125,
-0.033447265625,
-0.10009765625,
0.0161590576171875,
-0.03326416015625,
-0.028717041015625,
0.0288848876953125,
0.0274200439453125,
-0.0638427734375,
0.07867431640625,
-0.0194854736328125,
-0.035064697265625,
-0.0262603759765625,
-0.0113525390625,
0.015869140625,
0.04217529296875,
0.08868408203125,
-0.03228759765625,
-0.038787841796875,
-0.00476837158203125,
-0.0150604248046875,
-0.0237274169921875,
0.0153961181640625,
-0.0138397216796875,
-0.00899505615234375,
0.0166473388671875,
-0.05926513671875,
0.01194000244140625,
0.08074951171875,
-0.027252197265625,
0.044891357421875,
-0.01071929931640625,
-0.0262451171875,
-0.0743408203125,
-0.0029811859130859375,
0.0021800994873046875,
-0.0399169921875,
-0.03948974609375,
0.0026645660400390625,
0.039581298828125,
0.0061187744140625,
-0.042724609375,
0.0552978515625,
-0.0305633544921875,
-0.01361846923828125,
-0.01163482666015625,
0.00292205810546875,
-0.016571044921875,
0.026092529296875,
-0.0237274169921875,
0.059906005859375,
0.056854248046875,
-0.0249481201171875,
0.049835205078125,
0.053466796875,
-0.01407623291015625,
0.0245208740234375,
-0.0750732421875,
0.001712799072265625,
0.003810882568359375,
0.018890380859375,
-0.03387451171875,
-0.032257080078125,
0.038970947265625,
-0.040008544921875,
0.01214599609375,
-0.049041748046875,
-0.045928955078125,
-0.06512451171875,
-0.041595458984375,
0.030303955078125,
0.055633544921875,
-0.07525634765625,
0.052764892578125,
0.025177001953125,
-0.0208587646484375,
-0.006565093994140625,
-0.057098388671875,
-0.0191802978515625,
-0.00995635986328125,
-0.050262451171875,
0.039154052734375,
-0.0120849609375,
-0.01212310791015625,
-0.0186767578125,
-0.0067901611328125,
-0.00926971435546875,
-0.0210113525390625,
0.0274810791015625,
0.01396942138671875,
-0.01776123046875,
-0.004688262939453125,
0.004486083984375,
-0.012969970703125,
0.0009860992431640625,
0.0018911361694335938,
0.045196533203125,
-0.04718017578125,
0.016754150390625,
-0.036407470703125,
-0.0032901763916015625,
0.034881591796875,
-0.004367828369140625,
0.0299530029296875,
0.06341552734375,
-0.029205322265625,
-0.025360107421875,
-0.0160369873046875,
-0.01885986328125,
-0.04168701171875,
-0.01107025146484375,
-0.03485107421875,
-0.08319091796875,
0.0206451416015625,
-0.017791748046875,
0.0032749176025390625,
0.05035400390625,
0.04638671875,
0.004863739013671875,
0.071044921875,
0.053802490234375,
-0.0201568603515625,
0.050811767578125,
-0.039093017578125,
-0.0029621124267578125,
-0.06201171875,
-0.022979736328125,
-0.0303497314453125,
-0.0164794921875,
-0.046966552734375,
-0.038482666015625,
0.0022335052490234375,
0.0246124267578125,
-0.05889892578125,
0.0576171875,
-0.03564453125,
0.022430419921875,
0.054107666015625,
-0.01027679443359375,
0.0036869049072265625,
0.00200653076171875,
-0.012176513671875,
-0.0112457275390625,
-0.036529541015625,
-0.045745849609375,
0.053436279296875,
0.059234619140625,
0.042999267578125,
0.031585693359375,
0.04058837890625,
0.0215301513671875,
0.0257720947265625,
-0.044952392578125,
0.0252838134765625,
0.01453399658203125,
-0.0643310546875,
0.004825592041015625,
-0.02850341796875,
-0.059051513671875,
0.0108795166015625,
-0.0187225341796875,
-0.08209228515625,
-0.0279693603515625,
0.018218994140625,
-0.036712646484375,
0.0167694091796875,
-0.0714111328125,
0.08380126953125,
-0.0139312744140625,
-0.003566741943359375,
-0.005031585693359375,
-0.057861328125,
0.041473388671875,
-0.01486968994140625,
-0.00818634033203125,
-0.0030040740966796875,
0.006450653076171875,
0.050933837890625,
-0.027740478515625,
0.040985107421875,
-0.01137542724609375,
-0.0017061233520507812,
0.040740966796875,
0.019439697265625,
0.028961181640625,
0.046356201171875,
0.015533447265625,
0.013702392578125,
0.0189056396484375,
-0.031646728515625,
-0.0284881591796875,
0.07012939453125,
-0.08892822265625,
-0.0214385986328125,
-0.0247955322265625,
-0.005924224853515625,
-0.0096893310546875,
0.0243377685546875,
0.01209259033203125,
0.022796630859375,
-0.015777587890625,
0.01485443115234375,
0.02008056640625,
-0.0185546875,
0.0194854736328125,
0.0205230712890625,
-0.044586181640625,
-0.021820068359375,
0.053802490234375,
-0.0297088623046875,
0.0019989013671875,
-0.0119476318359375,
0.0305938720703125,
-0.01033782958984375,
-0.0125885009765625,
-0.055450439453125,
0.0175933837890625,
0.002002716064453125,
-0.023773193359375,
-0.0435791015625,
-0.05487060546875,
-0.0163726806640625,
-0.0035343170166015625,
-0.05352783203125,
-0.053314208984375,
-0.023590087890625,
0.0028476715087890625,
0.039337158203125,
0.05145263671875,
-0.015716552734375,
0.05072021484375,
-0.06597900390625,
0.052703857421875,
0.053802490234375,
0.051177978515625,
-0.0238800048828125,
-0.06756591796875,
-0.01428985595703125,
-0.003795623779296875,
-0.048065185546875,
-0.028900146484375,
0.0105743408203125,
0.0242919921875,
0.039825439453125,
0.0266876220703125,
-0.01490020751953125,
0.001689910888671875,
-0.05426025390625,
0.05877685546875,
0.04345703125,
-0.05230712890625,
0.03363037109375,
-0.0252227783203125,
0.0312042236328125,
0.0307464599609375,
0.0220794677734375,
-0.0201416015625,
-0.01163482666015625,
-0.07861328125,
-0.049652099609375,
0.06689453125,
0.0345458984375,
0.0232696533203125,
0.0246429443359375,
0.00362396240234375,
-0.0001462697982788086,
0.01107025146484375,
-0.041229248046875,
-0.047271728515625,
-0.03857421875,
-0.0232086181640625,
-0.005107879638671875,
-0.04266357421875,
-0.005481719970703125,
-0.04681396484375,
0.03240966796875,
-0.0169830322265625,
0.041046142578125,
-0.005390167236328125,
0.0007119178771972656,
-0.004547119140625,
-0.005619049072265625,
0.041229248046875,
0.0064697265625,
-0.0213775634765625,
0.01116180419921875,
0.00791168212890625,
-0.047119140625,
-0.006366729736328125,
0.0241241455078125,
-0.01201629638671875,
-0.0026226043701171875,
0.01453399658203125,
0.038421630859375,
0.011566162109375,
-0.0238494873046875,
0.067138671875,
0.0034427642822265625,
-0.0209503173828125,
-0.059539794921875,
0.005706787109375,
0.011322021484375,
0.045867919921875,
0.00836181640625,
0.023468017578125,
0.0066986083984375,
-0.03485107421875,
0.046051025390625,
0.00812530517578125,
-0.045989990234375,
-0.05389404296875,
0.07208251953125,
0.03228759765625,
-0.05584716796875,
0.050506591796875,
-0.020416259765625,
-0.05010986328125,
0.0273895263671875,
0.035919189453125,
0.04583740234375,
-0.0133819580078125,
0.0433349609375,
0.047637939453125,
0.034881591796875,
0.01163482666015625,
0.01568603515625,
-0.0179443359375,
-0.050018310546875,
-0.01678466796875,
-0.04681396484375,
-0.039276123046875,
-0.0018854141235351562,
-0.03570556640625,
0.0242919921875,
-0.0631103515625,
-0.01507568359375,
0.0004596710205078125,
0.007022857666015625,
-0.06268310546875,
0.005168914794921875,
0.0222930908203125,
0.08990478515625,
-0.036529541015625,
0.10748291015625,
0.0634765625,
-0.01493072509765625,
-0.049468994140625,
-0.00775146484375,
0.0202484130859375,
-0.0679931640625,
0.038818359375,
0.0241241455078125,
-0.00945281982421875,
0.0186614990234375,
-0.037689208984375,
-0.04998779296875,
0.07391357421875,
0.01067352294921875,
-0.0249176025390625,
0.00946807861328125,
-0.0076751708984375,
0.04949951171875,
-0.0200958251953125,
0.027191162109375,
0.0645751953125,
0.028472900390625,
0.0033855438232421875,
-0.07861328125,
-0.0263824462890625,
-0.03179931640625,
0.0022029876708984375,
0.01364898681640625,
-0.0743408203125,
0.050994873046875,
-0.0234832763671875,
0.0124664306640625,
0.03607177734375,
0.057403564453125,
0.02069091796875,
-0.0005817413330078125,
0.050262451171875,
0.037841796875,
0.019195556640625,
-0.0209808349609375,
0.0645751953125,
-0.0180206298828125,
0.05712890625,
0.07598876953125,
0.0031757354736328125,
0.03485107421875,
0.034027099609375,
-0.0063629150390625,
0.04791259765625,
0.036773681640625,
-0.01050567626953125,
0.03594970703125,
0.0155181884765625,
-0.0199432373046875,
0.0006623268127441406,
0.0005660057067871094,
-0.033477783203125,
0.021636962890625,
0.03076171875,
-0.018951416015625,
-0.0088043212890625,
-0.009185791015625,
-0.005710601806640625,
-0.0125732421875,
-0.032379150390625,
0.020172119140625,
0.0188140869140625,
-0.0150909423828125,
0.0186767578125,
0.008453369140625,
0.05059814453125,
-0.036376953125,
0.01239776611328125,
-0.007678985595703125,
0.0178985595703125,
-0.01380157470703125,
-0.0872802734375,
0.0168914794921875,
-0.010345458984375,
0.017303466796875,
-0.0236663818359375,
0.0552978515625,
-0.02862548828125,
-0.018646240234375,
0.047821044921875,
0.032135009765625,
0.00771331787109375,
0.0093841552734375,
-0.0704345703125,
0.006885528564453125,
0.0018033981323242188,
-0.036651611328125,
0.002574920654296875,
-0.005809783935546875,
0.004375457763671875,
0.05224609375,
0.003559112548828125,
-0.00937652587890625,
0.01300811767578125,
0.028106689453125,
0.062744140625,
-0.042999267578125,
-0.034912109375,
-0.0240325927734375,
0.03948974609375,
-0.0174713134765625,
-0.02874755859375,
0.055084228515625,
0.054229736328125,
0.06939697265625,
-0.033905029296875,
0.037994384765625,
-0.0244903564453125,
0.03106689453125,
-0.00493621826171875,
0.038360595703125,
-0.0616455078125,
-0.0297393798828125,
-0.0219268798828125,
-0.09027099609375,
-0.0259857177734375,
0.054931640625,
0.0095977783203125,
0.017974853515625,
0.032379150390625,
0.041046142578125,
-0.029998779296875,
0.001567840576171875,
0.0302581787109375,
0.01849365234375,
0.01309967041015625,
0.02960205078125,
0.053680419921875,
-0.050201416015625,
0.04327392578125,
-0.040863037109375,
-0.0231781005859375,
-0.0290069580078125,
-0.0245208740234375,
-0.07891845703125,
-0.037506103515625,
-0.0289764404296875,
-0.0244140625,
-0.037109375,
0.05712890625,
0.06903076171875,
-0.047454833984375,
0.0181884765625,
0.01061248779296875,
-0.00846099853515625,
0.0221099853515625,
-0.01128387451171875,
0.01641845703125,
0.018707275390625,
-0.061492919921875,
0.0266876220703125,
0.01158905029296875,
0.03057861328125,
-0.01207733154296875,
-0.029449462890625,
0.0240478515625,
-0.006710052490234375,
0.0560302734375,
0.043548583984375,
-0.037841796875,
-0.07257080078125,
-0.0290679931640625,
-0.0308380126953125,
-0.0007290840148925781,
0.053924560546875,
-0.039581298828125,
0.0088348388671875,
0.0221710205078125,
0.0270538330078125,
0.046905517578125,
0.008209228515625,
0.0148468017578125,
-0.07562255859375,
0.052734375,
-0.006134033203125,
0.04315185546875,
0.0223846435546875,
0.0118408203125,
0.0438232421875,
0.0021648406982421875,
-0.042877197265625,
-0.01007843017578125,
0.037750244140625,
-0.10650634765625,
0.0240325927734375,
0.07305908203125,
0.029144287109375,
-0.0159149169921875,
0.009735107421875,
-0.039947509765625,
0.0133056640625,
-0.007396697998046875,
0.0275421142578125,
0.0157012939453125,
0.01387786865234375,
-0.040374755859375,
-0.056976318359375,
0.0311279296875,
0.00432586669921875,
-0.03753662109375,
-0.039764404296875,
0.015716552734375,
0.027618408203125,
0.03363037109375,
0.036712646484375,
-0.020904541015625,
0.0264434814453125,
0.0024204254150390625,
0.0308380126953125,
-0.0014371871948242188,
-0.05340576171875,
-0.006664276123046875,
-0.0125885009765625,
0.0252685546875,
-0.0244903564453125
]
] |
patrickvonplaten/wav2vec2_tiny_random | 2021-07-05T13:53:54.000Z | [
"transformers",
"pytorch",
"wav2vec2",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | feature-extraction | patrickvonplaten | null | null | patrickvonplaten/wav2vec2_tiny_random | 1 | 14,057 | transformers | 2022-03-02T23:29:05 | ## Test model
To test this model, run the following code:
```python
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC
import torchaudio
import torch
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
model = Wav2Vec2ForCTC.from_pretrained("patrickvonplaten/wav2vec2_tiny_random")
def load_audio(batch):
batch["samples"], _ = torchaudio.load(batch["file"])
return batch
ds = ds.map(load_audio)
input_values = torch.nn.utils.rnn.pad_sequence([torch.tensor(x[0]) for x in ds["samples"][:10]], batch_first=True)
# forward
logits = model(input_values).logits
pred_ids = torch.argmax(logits, dim=-1)
# dummy loss
dummy_labels = pred_ids.clone()
dummy_labels[dummy_labels == model.config.pad_token_id] = 1 # can't have CTC blank token in label
dummy_labels = dummy_labels[:, -(dummy_labels.shape[1] // 4):] # make sure labels are shorter to avoid "inf" loss (can still happen though...)
loss = model(input_values, labels=dummy_labels).loss
```
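The greedy argmax predictions above would normally be post-processed with CTC collapsing (merge consecutive repeats, then drop blank tokens) before mapping IDs back to characters. A minimal, framework-free sketch — assuming a hypothetical blank ID of 0; for the real model the blank is `model.config.pad_token_id`:

```python
def ctc_collapse(ids, blank_id=0):
    # Collapse repeated IDs, then remove blanks (standard greedy CTC decoding)
    out = []
    prev = None
    for i in ids:
        if i != prev and i != blank_id:
            out.append(i)
        prev = i
    return out

# e.g. [0, 1, 1, 0, 2, 2, 3] -> [1, 2, 3]
```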
| 1,020 | [
[
-0.02197265625,
-0.041473388671875,
0.015869140625,
0.025848388671875,
-0.01763916015625,
-0.0121917724609375,
-0.01261138916015625,
-0.006893157958984375,
-0.0126953125,
0.034912109375,
-0.05230712890625,
-0.0306243896484375,
-0.04937744140625,
-0.01442718505859375,
-0.045135498046875,
0.08135986328125,
-0.00786590576171875,
0.01201629638671875,
0.005397796630859375,
-0.0189056396484375,
-0.060546875,
-0.00855255126953125,
-0.06622314453125,
-0.036651611328125,
0.01335906982421875,
0.0283050537109375,
0.0274810791015625,
0.0090789794921875,
0.032562255859375,
0.0275115966796875,
-0.0030994415283203125,
-0.005016326904296875,
-0.031829833984375,
-0.01251983642578125,
0.01114654541015625,
-0.046844482421875,
-0.0269775390625,
0.00862884521484375,
0.037689208984375,
0.0124359130859375,
0.005084991455078125,
0.0265350341796875,
-0.006591796875,
0.002017974853515625,
-0.0235748291015625,
-0.0010223388671875,
-0.029998779296875,
-0.011810302734375,
0.01004791259765625,
-0.021392822265625,
-0.038787841796875,
-0.0209503173828125,
0.00897216796875,
-0.039642333984375,
0.0474853515625,
0.004756927490234375,
0.094482421875,
0.01690673828125,
-0.04315185546875,
-0.0188446044921875,
-0.0579833984375,
0.0535888671875,
-0.0513916015625,
0.045989990234375,
0.0250396728515625,
0.007778167724609375,
0.0025119781494140625,
-0.08135986328125,
-0.05755615234375,
-0.0147552490234375,
0.00897216796875,
0.0115814208984375,
-0.01332855224609375,
0.0009055137634277344,
0.03778076171875,
0.016754150390625,
-0.05303955078125,
0.0096282958984375,
-0.061676025390625,
-0.0240631103515625,
0.049072265625,
0.0033016204833984375,
0.0113372802734375,
-0.006351470947265625,
-0.046539306640625,
-0.05035400390625,
-0.033111572265625,
0.01181793212890625,
0.048614501953125,
0.014251708984375,
-0.033966064453125,
0.048736572265625,
0.0133819580078125,
0.04205322265625,
0.01715087890625,
-0.0127105712890625,
0.041259765625,
-0.0204010009765625,
-0.0279541015625,
0.029754638671875,
0.061492919921875,
0.02197265625,
0.02838134765625,
0.02386474609375,
-0.018280029296875,
0.0142364501953125,
-0.00099945068359375,
-0.06610107421875,
-0.048309326171875,
0.041015625,
-0.0228729248046875,
-0.0237274169921875,
0.00984954833984375,
-0.031402587890625,
-0.0020732879638671875,
-0.01275634765625,
0.0928955078125,
-0.0384521484375,
-0.0240325927734375,
0.028472900390625,
-0.0240325927734375,
0.0308380126953125,
-0.0053863525390625,
-0.06195068359375,
0.0200958251953125,
0.02532958984375,
0.05047607421875,
0.0049896240234375,
-0.023345947265625,
-0.0311126708984375,
-0.0088653564453125,
-0.01467132568359375,
0.041259765625,
0.0156707763671875,
-0.05548095703125,
-0.0256805419921875,
0.01163482666015625,
-0.0139923095703125,
-0.048553466796875,
0.0460205078125,
-0.0289459228515625,
0.0171661376953125,
-0.0118560791015625,
-0.03680419921875,
-0.00829315185546875,
-0.01004791259765625,
-0.032379150390625,
0.08349609375,
0.01806640625,
-0.0513916015625,
0.02557373046875,
-0.043914794921875,
-0.040802001953125,
-0.031402587890625,
-0.00861358642578125,
-0.06982421875,
0.005870819091796875,
0.004547119140625,
0.032135009765625,
-0.0150299072265625,
0.0020847320556640625,
-0.02716064453125,
-0.04168701171875,
0.034698486328125,
-0.0396728515625,
0.07830810546875,
0.0108184814453125,
-0.039947509765625,
0.0273895263671875,
-0.068115234375,
0.0088043212890625,
0.0011339187622070312,
-0.0208740234375,
0.0112152099609375,
-0.0244598388671875,
0.0460205078125,
0.004695892333984375,
-0.0303497314453125,
-0.04931640625,
0.017547607421875,
-0.00516510009765625,
0.0208740234375,
0.055755615234375,
0.0000776052474975586,
0.01094818115234375,
-0.03570556640625,
0.0126953125,
0.00946044921875,
-0.005260467529296875,
0.0224151611328125,
-0.0361328125,
-0.07080078125,
-0.045135498046875,
0.034698486328125,
0.03594970703125,
-0.034515380859375,
0.0523681640625,
-0.009429931640625,
-0.03399658203125,
-0.044403076171875,
-0.030792236328125,
0.0006518363952636719,
0.04443359375,
0.03662109375,
-0.023590087890625,
-0.05987548828125,
-0.0601806640625,
0.014434814453125,
-0.00391387939453125,
-0.038482666015625,
0.00713348388671875,
0.044830322265625,
-0.03021240234375,
0.062225341796875,
-0.047515869140625,
-0.025970458984375,
-0.024810791015625,
0.0266876220703125,
0.043060302734375,
0.063720703125,
0.0274658203125,
-0.040863037109375,
-0.0106201171875,
-0.0303955078125,
-0.029205322265625,
-0.01248931884765625,
-0.02362060546875,
0.002796173095703125,
-0.0100555419921875,
0.0089111328125,
-0.05914306640625,
0.0513916015625,
0.01995849609375,
-0.043060302734375,
0.0579833984375,
-0.013824462890625,
0.0211181640625,
-0.0704345703125,
0.004009246826171875,
-0.0167388916015625,
-0.00736236572265625,
-0.035888671875,
-0.030975341796875,
0.007244110107421875,
-0.00739288330078125,
-0.04022216796875,
0.045440673828125,
-0.005352020263671875,
-0.00559234619140625,
-0.023040771484375,
-0.018524169921875,
0.01134490966796875,
0.0279083251953125,
-0.0004639625549316406,
0.03369140625,
0.05023193359375,
-0.06304931640625,
0.04559326171875,
0.03277587890625,
-0.031890869140625,
0.0257720947265625,
-0.06695556640625,
0.00705718994140625,
0.008056640625,
0.01316070556640625,
-0.08001708984375,
-0.0025157928466796875,
0.01447296142578125,
-0.05841064453125,
0.03533935546875,
-0.00012993812561035156,
-0.035888671875,
-0.0273895263671875,
-0.0092315673828125,
0.055999755859375,
0.0579833984375,
-0.058074951171875,
0.034454345703125,
0.00574493408203125,
0.031585693359375,
-0.019744873046875,
-0.05902099609375,
-0.044586181640625,
-0.0255889892578125,
-0.05572509765625,
0.01523590087890625,
-0.01155853271484375,
-0.00244140625,
-0.0137939453125,
-0.021514892578125,
-0.0024738311767578125,
-0.005340576171875,
0.02197265625,
0.028564453125,
-0.0169219970703125,
0.0035858154296875,
-0.0124969482421875,
-0.021759033203125,
0.01483917236328125,
-0.0265350341796875,
0.05401611328125,
-0.004932403564453125,
-0.040252685546875,
-0.05523681640625,
-0.009735107421875,
0.0289764404296875,
-0.01316070556640625,
0.0264434814453125,
0.06884765625,
-0.0257568359375,
-0.020751953125,
-0.0426025390625,
-0.0241851806640625,
-0.03802490234375,
0.062103271484375,
-0.029998779296875,
-0.01522064208984375,
0.052947998046875,
0.023590087890625,
-0.0131378173828125,
0.0452880859375,
0.053314208984375,
-0.020660400390625,
0.054534912109375,
0.0214691162109375,
0.01018524169921875,
0.0169677734375,
-0.08074951171875,
0.0120697021484375,
-0.0555419921875,
-0.023773193359375,
-0.03192138671875,
-0.03521728515625,
-0.038909912109375,
-0.026123046875,
0.0274505615234375,
0.0031986236572265625,
-0.032073974609375,
0.036041259765625,
-0.03253173828125,
0.0113983154296875,
0.058135986328125,
0.02447509765625,
0.002109527587890625,
-0.00336456298828125,
-0.0035724639892578125,
0.0084228515625,
-0.0445556640625,
-0.0134429931640625,
0.0848388671875,
0.035400390625,
0.06689453125,
-0.003482818603515625,
0.047760009765625,
-0.00118255615234375,
0.01378631591796875,
-0.055877685546875,
0.02459716796875,
0.00791168212890625,
-0.057861328125,
-0.0162811279296875,
-0.01507568359375,
-0.06402587890625,
0.01104736328125,
-0.0055084228515625,
-0.058685302734375,
0.0255889892578125,
0.01375579833984375,
-0.041473388671875,
0.0312347412109375,
-0.04461669921875,
0.067138671875,
-0.01395416259765625,
-0.0204315185546875,
0.01136016845703125,
-0.028778076171875,
0.0299224853515625,
-0.0009059906005859375,
-0.0085296630859375,
-0.0049591064453125,
0.0340576171875,
0.0830078125,
-0.0279083251953125,
0.0305328369140625,
-0.01194000244140625,
0.0091705322265625,
0.046600341796875,
-0.0078582763671875,
0.024993896484375,
0.0207366943359375,
0.0081329345703125,
0.01140594482421875,
0.0124969482421875,
-0.0202178955078125,
-0.0352783203125,
0.0443115234375,
-0.0709228515625,
-0.00835418701171875,
-0.026519775390625,
-0.04730224609375,
0.0025119781494140625,
-0.0008692741394042969,
0.0650634765625,
0.05108642578125,
0.01213836669921875,
0.01529693603515625,
0.055816650390625,
-0.014404296875,
0.029296875,
0.01157379150390625,
-0.01422882080078125,
-0.03204345703125,
0.054473876953125,
0.0026073455810546875,
0.02618408203125,
-0.00933837890625,
0.00984954833984375,
-0.0362548828125,
-0.03570556640625,
-0.0236663818359375,
0.01047515869140625,
-0.04254150390625,
-0.022186279296875,
-0.0355224609375,
-0.020263671875,
-0.03582763671875,
0.003997802734375,
-0.05023193359375,
-0.0165557861328125,
-0.0220794677734375,
-0.0208282470703125,
0.039154052734375,
0.03204345703125,
-0.028594970703125,
0.04327392578125,
-0.0491943359375,
0.0361328125,
0.01169586181640625,
0.0185394287109375,
-0.00582122802734375,
-0.070556640625,
-0.0007276535034179688,
0.003570556640625,
-0.031097412109375,
-0.07257080078125,
0.0297088623046875,
-0.00989532470703125,
0.0254669189453125,
0.031463623046875,
0.00742340087890625,
0.0517578125,
-0.0205841064453125,
0.0399169921875,
0.0185089111328125,
-0.07537841796875,
0.0283355712890625,
-0.026885986328125,
0.00675201416015625,
0.0281524658203125,
0.01134490966796875,
-0.0265045166015625,
-0.016876220703125,
-0.070556640625,
-0.07891845703125,
0.0576171875,
0.0460205078125,
-0.00109100341796875,
0.02252197265625,
0.0014200210571289062,
-0.005870819091796875,
0.004791259765625,
-0.06622314453125,
-0.0357666015625,
-0.01995849609375,
-0.01390838623046875,
-0.016143798828125,
-0.0063018798828125,
-0.009979248046875,
-0.039306640625,
0.0667724609375,
0.0181732177734375,
0.043701171875,
0.033966064453125,
0.005565643310546875,
0.004764556884765625,
-0.0084991455078125,
0.05712890625,
0.016143798828125,
-0.044952392578125,
0.010498046875,
0.0307464599609375,
-0.04071044921875,
0.0322265625,
0.0025424957275390625,
-0.0016794204711914062,
0.0244293212890625,
0.037750244140625,
0.08074951171875,
0.017181396484375,
-0.0201263427734375,
0.036407470703125,
-0.0164642333984375,
-0.0224456787109375,
-0.045684814453125,
-0.0009450912475585938,
0.015777587890625,
0.013092041015625,
0.0200347900390625,
0.0237274169921875,
-0.016876220703125,
-0.024749755859375,
0.035400390625,
0.010040283203125,
-0.0287322998046875,
-0.0231475830078125,
0.0675048828125,
0.002162933349609375,
-0.02447509765625,
0.08001708984375,
0.00054168701171875,
-0.0277099609375,
0.07708740234375,
0.0369873046875,
0.06964111328125,
-0.0003161430358886719,
-0.0169830322265625,
0.03558349609375,
0.0192718505859375,
-0.011810302734375,
0.0265960693359375,
0.003597259521484375,
-0.053131103515625,
-0.0026683807373046875,
-0.037017822265625,
0.003849029541015625,
0.0303497314453125,
-0.058197021484375,
0.06414794921875,
-0.01611328125,
-0.0165557861328125,
-0.005680084228515625,
-0.00276947021484375,
-0.0810546875,
0.0323486328125,
-0.007289886474609375,
0.0712890625,
-0.0821533203125,
0.0760498046875,
0.004878997802734375,
-0.039154052734375,
-0.09832763671875,
-0.0094451904296875,
-0.0110015869140625,
-0.042938232421875,
0.0648193359375,
0.0210723876953125,
0.0131072998046875,
0.01288604736328125,
-0.033172607421875,
-0.06329345703125,
0.080322265625,
0.013671875,
-0.016143798828125,
0.019744873046875,
-0.00420379638671875,
0.0249786376953125,
-0.0082550048828125,
0.04931640625,
0.04144287109375,
0.0411376953125,
0.01526641845703125,
-0.06927490234375,
-0.0109100341796875,
0.0038604736328125,
-0.0233612060546875,
0.0004584789276123047,
-0.0287017822265625,
0.084228515625,
-0.0357666015625,
-0.0015516281127929688,
0.01288604736328125,
0.0599365234375,
0.046356201171875,
0.0260162353515625,
0.045501708984375,
0.043853759765625,
0.043365478515625,
-0.028350830078125,
0.07080078125,
-0.00046634674072265625,
0.06414794921875,
0.07611083984375,
-0.00885009765625,
0.0450439453125,
0.0290985107421875,
-0.0218353271484375,
0.0175018310546875,
0.06219482421875,
-0.038970947265625,
0.055908203125,
0.042694091796875,
-0.01142120361328125,
-0.0005950927734375,
0.025970458984375,
-0.0543212890625,
0.05401611328125,
0.0037994384765625,
-0.018890380859375,
0.01898193359375,
0.016815185546875,
-0.0102386474609375,
-0.04132080078125,
-0.0026454925537109375,
0.01508331298828125,
-0.00922393798828125,
-0.0271759033203125,
0.058685302734375,
0.00246429443359375,
0.056549072265625,
-0.054443359375,
0.00505828857421875,
-0.0012569427490234375,
0.038360595703125,
-0.0189971923828125,
-0.0210418701171875,
0.017425537109375,
-0.028533935546875,
-0.0106201171875,
0.00374603271484375,
0.034271240234375,
-0.04559326171875,
-0.049713134765625,
0.036346435546875,
-0.0011272430419921875,
0.0161895751953125,
0.006099700927734375,
-0.0250396728515625,
0.023956298828125,
0.010528564453125,
-0.0345458984375,
-0.0018224716186523438,
0.01053619384765625,
0.0244598388671875,
0.045074462890625,
0.044586181640625,
0.006439208984375,
0.0293426513671875,
0.0008015632629394531,
0.050140380859375,
-0.03326416015625,
-0.0355224609375,
-0.049957275390625,
0.053619384765625,
0.006366729736328125,
-0.04095458984375,
0.042694091796875,
0.067138671875,
0.08294677734375,
-0.0308380126953125,
0.0416259765625,
-0.003894805908203125,
0.01517486572265625,
-0.042327880859375,
0.0692138671875,
-0.029937744140625,
0.021820068359375,
-0.01666259765625,
-0.0584716796875,
-0.0008749961853027344,
0.0546875,
-0.00455474853515625,
0.02215576171875,
0.05621337890625,
0.07952880859375,
-0.0322265625,
-0.00402069091796875,
0.02288818359375,
0.0216522216796875,
0.0185394287109375,
0.036895751953125,
0.0484619140625,
-0.0799560546875,
0.03759765625,
-0.03460693359375,
-0.003513336181640625,
-0.0018444061279296875,
-0.0267486572265625,
-0.05731201171875,
-0.042083740234375,
-0.042236328125,
-0.06463623046875,
-0.01316070556640625,
0.07501220703125,
0.0714111328125,
-0.08001708984375,
-0.0256805419921875,
0.005619049072265625,
0.0006785392761230469,
-0.0171051025390625,
-0.020660400390625,
0.053192138671875,
0.0134429931640625,
-0.06646728515625,
0.026092529296875,
-0.026153564453125,
0.024810791015625,
0.00025534629821777344,
-0.0105438232421875,
-0.02984619140625,
-0.00925445556640625,
0.01126861572265625,
0.01287841796875,
-0.04852294921875,
-0.048126220703125,
-0.02459716796875,
-0.0033283233642578125,
0.004779815673828125,
0.049560546875,
-0.0570068359375,
0.02850341796875,
0.045196533203125,
0.0125579833984375,
0.061279296875,
-0.0224761962890625,
0.0280609130859375,
-0.05914306640625,
0.0216827392578125,
0.0157928466796875,
0.0306549072265625,
0.0122528076171875,
-0.0173797607421875,
0.0261688232421875,
0.03326416015625,
-0.04119873046875,
-0.079833984375,
-0.0010852813720703125,
-0.08709716796875,
0.0018901824951171875,
0.0894775390625,
0.006206512451171875,
-0.00019025802612304688,
0.0017194747924804688,
-0.0241851806640625,
0.056304931640625,
-0.012603759765625,
0.04766845703125,
0.018463134765625,
-0.017974853515625,
0.00444793701171875,
-0.036102294921875,
0.0394287109375,
0.0295867919921875,
-0.0287322998046875,
-0.0208282470703125,
-0.0008215904235839844,
0.061370849609375,
0.01080322265625,
0.05108642578125,
0.0111541748046875,
0.0249481201171875,
0.02593994140625,
0.038787841796875,
-0.0268096923828125,
-0.01251983642578125,
-0.032623291015625,
-0.00034427642822265625,
-0.01824951171875,
-0.071044921875
]
] |
google/t5-v1_1-large | 2023-01-24T16:52:33.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"en",
"dataset:c4",
"arxiv:2002.05202",
"arxiv:1910.10683",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/t5-v1_1-large | 11 | 14,052 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- c4
license: apache-2.0
---
[Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) Version 1.1
## Version 1.1
[T5 Version 1.1](https://github.com/google-research/text-to-text-transfer-transformer/blob/master/released_checkpoints.md#t511) includes the following improvements compared to the original T5 model:
- GEGLU activation in the feed-forward hidden layer, rather than ReLU - see [here](https://arxiv.org/abs/2002.05202).
- Dropout was turned off in pre-training (quality win). Dropout should be re-enabled during fine-tuning.
- Pre-trained on C4 only without mixing in the downstream tasks.
- No parameter sharing between the embedding and classifier layers.
- "xl" and "xxl" replace "3B" and "11B". The model shapes are a bit different - larger `d_model` and smaller `num_heads` and `d_ff`.
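The GEGLU activation mentioned above gates a linear projection with a GELU-activated one: GEGLU(x; W, V) = GELU(xW) ⊙ (xV). A minimal illustrative sketch in plain Python (not the actual T5 implementation; weight matrices here are hypothetical):

```python
import math

def gelu(x):
    # Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def geglu(x, w, v):
    # GEGLU(x; W, V) = GELU(x @ W) * (x @ V), elementwise over hidden units.
    # x is a vector; w and v are matrices stored as lists of rows.
    gate = [gelu(sum(xi * wij for xi, wij in zip(x, col))) for col in zip(*w)]
    lin = [sum(xi * vij for xi, vij in zip(x, col)) for col in zip(*v)]
    return [g * l for g, l in zip(gate, lin)]
```

The gating halves the effective feed-forward width for a given parameter count, which is why the v1.1 shapes trade a smaller `d_ff` for a larger `d_model`.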
**Note**: T5 Version 1.1 was only pre-trained on C4, excluding any supervised training. Therefore, this model has to be fine-tuned before it is usable on a downstream task.
Pretraining Dataset: [C4](https://huggingface.co/datasets/c4)
Other Community Checkpoints: [here](https://huggingface.co/models?search=t5-v1_1)
Paper: [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/pdf/1910.10683.pdf)
Authors: *Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu*
## Abstract
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new “Colossal Clean Crawled Corpus”, we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our dataset, pre-trained models, and code.
 | 2,671 | [
[
-0.021514892578125,
-0.026824951171875,
0.0296173095703125,
0.0158233642578125,
-0.01560211181640625,
0.01049041748046875,
-0.017913818359375,
-0.05303955078125,
-0.01245880126953125,
0.033538818359375,
-0.052764892578125,
-0.043731689453125,
-0.070068359375,
0.01505279541015625,
-0.04840087890625,
0.09722900390625,
-0.01422119140625,
-0.01439666748046875,
0.0011568069458007812,
-0.00315093994140625,
-0.026702880859375,
-0.032501220703125,
-0.0640869140625,
-0.0270843505859375,
0.02862548828125,
0.0279541015625,
0.0205078125,
0.0275421142578125,
0.0537109375,
0.0130157470703125,
-0.000705718994140625,
-0.006450653076171875,
-0.0501708984375,
-0.02911376953125,
-0.0264892578125,
-0.0144500732421875,
-0.036376953125,
0.00702667236328125,
0.0452880859375,
0.054107666015625,
0.0020694732666015625,
0.018951416015625,
0.0254669189453125,
0.0450439453125,
-0.052764892578125,
0.014129638671875,
-0.045318603515625,
0.0151824951171875,
-0.00766754150390625,
0.0008702278137207031,
-0.05047607421875,
-0.0150909423828125,
0.039337158203125,
-0.056610107421875,
0.0254669189453125,
-0.00893402099609375,
0.091796875,
0.0274200439453125,
-0.037628173828125,
-0.0191192626953125,
-0.048614501953125,
0.06646728515625,
-0.045196533203125,
0.028717041015625,
0.00936126708984375,
0.0283966064453125,
0.0115966796875,
-0.0894775390625,
-0.034149169921875,
-0.0023040771484375,
-0.0089874267578125,
0.004276275634765625,
-0.02166748046875,
-0.0035762786865234375,
0.00775146484375,
0.036102294921875,
-0.035430908203125,
0.016021728515625,
-0.050048828125,
-0.019500732421875,
0.036773681640625,
-0.0180816650390625,
0.023590087890625,
0.0011796951293945312,
-0.04913330078125,
-0.0191497802734375,
-0.04052734375,
0.007904052734375,
-0.01666259765625,
0.0244598388671875,
-0.0257415771484375,
-0.0070343017578125,
-0.0015621185302734375,
0.04852294921875,
0.01123046875,
-0.004730224609375,
0.0265655517578125,
-0.04791259765625,
-0.0174407958984375,
-0.015716552734375,
0.065673828125,
0.01361846923828125,
0.0216064453125,
-0.03228759765625,
-0.0016193389892578125,
-0.0212860107421875,
0.03228759765625,
-0.07305908203125,
-0.033538818359375,
-0.006023406982421875,
-0.0280303955078125,
-0.03814697265625,
0.007648468017578125,
-0.0457763671875,
-0.00408172607421875,
-0.0189361572265625,
0.0411376953125,
-0.04302978515625,
-0.0204620361328125,
0.02630615234375,
0.0027370452880859375,
0.03216552734375,
0.040679931640625,
-0.07843017578125,
0.035430908203125,
0.036102294921875,
0.06304931640625,
-0.044769287109375,
-0.0254669189453125,
-0.04180908203125,
-0.0005259513854980469,
-0.01019287109375,
0.06097412109375,
-0.02423095703125,
-0.017547607421875,
-0.006412506103515625,
0.01291656494140625,
-0.0190277099609375,
-0.02325439453125,
0.06011962890625,
-0.031280517578125,
0.043243408203125,
-0.020660400390625,
-0.034454345703125,
-0.03900146484375,
0.01280975341796875,
-0.0511474609375,
0.0755615234375,
0.01355743408203125,
-0.044586181640625,
0.035491943359375,
-0.0655517578125,
-0.033233642578125,
-0.01320648193359375,
0.0274658203125,
-0.030731201171875,
-0.0174102783203125,
0.02520751953125,
0.042938232421875,
-0.007663726806640625,
0.00519561767578125,
-0.017242431640625,
-0.02130126953125,
-0.0126495361328125,
-0.004486083984375,
0.06805419921875,
0.022308349609375,
-0.0233917236328125,
0.004039764404296875,
-0.047119140625,
0.01384735107421875,
-0.0011310577392578125,
-0.0232086181640625,
0.01023101806640625,
-0.022369384765625,
0.01141357421875,
0.03173828125,
0.0215606689453125,
-0.024932861328125,
0.01995849609375,
-0.0194854736328125,
0.0401611328125,
0.040313720703125,
-0.01451873779296875,
0.065673828125,
-0.031829833984375,
0.038299560546875,
0.004119873046875,
0.004680633544921875,
-0.01184844970703125,
-0.0167388916015625,
-0.055908203125,
-0.008880615234375,
0.050567626953125,
0.0537109375,
-0.0511474609375,
0.043060302734375,
-0.04205322265625,
-0.03948974609375,
-0.0455322265625,
0.0062103271484375,
0.0280609130859375,
0.047882080078125,
0.05712890625,
-0.0193328857421875,
-0.04278564453125,
-0.0411376953125,
-0.0205841064453125,
0.004253387451171875,
-0.00675201416015625,
-0.0022182464599609375,
0.0362548828125,
-0.01580810546875,
0.05841064453125,
-0.0233154296875,
-0.041229248046875,
-0.04461669921875,
0.01360321044921875,
-0.004085540771484375,
0.046051025390625,
0.05230712890625,
-0.04534912109375,
-0.040496826171875,
0.00701904296875,
-0.058837890625,
-0.0134124755859375,
-0.01202392578125,
-0.00681304931640625,
0.0241546630859375,
0.04425048828125,
-0.0196533203125,
0.0238494873046875,
0.0628662109375,
-0.018402099609375,
0.0271759033203125,
-0.01061248779296875,
0.0011739730834960938,
-0.11767578125,
0.029388427734375,
0.0034275054931640625,
-0.03814697265625,
-0.05615234375,
-0.0009145736694335938,
0.0205078125,
0.00638580322265625,
-0.04327392578125,
0.046661376953125,
-0.036956787109375,
0.005565643310546875,
-0.0198822021484375,
0.01393890380859375,
-0.0005040168762207031,
0.039459228515625,
-0.0087890625,
0.060333251953125,
0.036041259765625,
-0.060699462890625,
-0.00609588623046875,
0.0323486328125,
-0.01508331298828125,
0.00989532470703125,
-0.04522705078125,
0.0321044921875,
0.0004603862762451172,
0.0345458984375,
-0.06591796875,
0.0200958251953125,
0.0310211181640625,
-0.04437255859375,
0.042816162109375,
-0.00969696044921875,
-0.01529693603515625,
-0.01499176025390625,
-0.0267486572265625,
0.0215301513671875,
0.049041748046875,
-0.04730224609375,
0.040191650390625,
0.0106658935546875,
0.0018262863159179688,
-0.052093505859375,
-0.055999755859375,
0.01482391357421875,
-0.019500732421875,
-0.047882080078125,
0.0640869140625,
0.0013256072998046875,
0.0183258056640625,
-0.004283905029296875,
-0.006114959716796875,
-0.0212554931640625,
0.0167083740234375,
-0.01192474365234375,
0.02001953125,
-0.002227783203125,
0.007781982421875,
0.0107269287109375,
-0.0214080810546875,
-0.0028324127197265625,
-0.034881591796875,
0.022064208984375,
-0.01031494140625,
0.015777587890625,
-0.04083251953125,
0.00102996826171875,
0.0234832763671875,
-0.0200958251953125,
0.05548095703125,
0.06951904296875,
-0.0188751220703125,
-0.0234375,
-0.0205078125,
-0.01580810546875,
-0.034515380859375,
0.032257080078125,
-0.037750244140625,
-0.07611083984375,
0.031097412109375,
-0.0170745849609375,
0.02294921875,
0.052276611328125,
0.006549835205078125,
-0.0029239654541015625,
0.0489501953125,
0.082275390625,
-0.025238037109375,
0.050018310546875,
-0.03369140625,
0.0206298828125,
-0.06732177734375,
-0.012054443359375,
-0.050811767578125,
-0.022308349609375,
-0.048370361328125,
-0.0215911865234375,
0.00363922119140625,
0.0193634033203125,
-0.01378631591796875,
0.040252685546875,
-0.029144287109375,
0.027099609375,
0.0148468017578125,
0.0123138427734375,
0.029815673828125,
0.00727081298828125,
0.0027923583984375,
-0.01442718505859375,
-0.058929443359375,
-0.037445068359375,
0.09210205078125,
0.0238800048828125,
0.03863525390625,
0.0067901611328125,
0.048797607421875,
0.032745361328125,
0.0330810546875,
-0.055999755859375,
0.033905029296875,
-0.034576416015625,
-0.020843505859375,
-0.02703857421875,
-0.034576416015625,
-0.08697509765625,
0.0225830078125,
-0.0372314453125,
-0.05438232421875,
-0.010528564453125,
0.0011548995971679688,
-0.00884246826171875,
0.037689208984375,
-0.060638427734375,
0.0782470703125,
0.00485992431640625,
-0.0151824951171875,
-0.0008783340454101562,
-0.0574951171875,
0.019287109375,
-0.0105743408203125,
-0.00342559814453125,
0.007404327392578125,
-0.004947662353515625,
0.055389404296875,
-0.016632080078125,
0.051055908203125,
-0.00920867919921875,
-0.006526947021484375,
0.00042724609375,
0.0002079010009765625,
0.039764404296875,
-0.029815673828125,
-0.0001341104507446289,
0.0245513916015625,
-0.0014982223510742188,
-0.0423583984375,
-0.037109375,
0.0328369140625,
-0.060211181640625,
-0.0237884521484375,
-0.0229644775390625,
-0.021209716796875,
-0.004650115966796875,
0.0260467529296875,
0.0352783203125,
0.0136566162109375,
-0.015899658203125,
0.0264129638671875,
0.053924560546875,
-0.01136016845703125,
0.043975830078125,
0.026641845703125,
-0.0212860107421875,
-0.006038665771484375,
0.052978515625,
-0.00016248226165771484,
0.0374755859375,
0.046417236328125,
0.007587432861328125,
-0.02801513671875,
-0.058837890625,
-0.037322998046875,
0.01505279541015625,
-0.047332763671875,
-0.00992584228515625,
-0.06072998046875,
-0.03094482421875,
-0.045166015625,
-0.01015472412109375,
-0.034759521484375,
-0.0219268798828125,
-0.038116455078125,
-0.01934814453125,
0.01093292236328125,
0.050933837890625,
0.00971221923828125,
0.0169525146484375,
-0.07977294921875,
0.00847625732421875,
0.00472259521484375,
0.01751708984375,
-0.003192901611328125,
-0.07550048828125,
-0.01163482666015625,
0.00196075439453125,
-0.02862548828125,
-0.050933837890625,
0.03619384765625,
0.030426025390625,
0.0296478271484375,
0.01239013671875,
0.006313323974609375,
0.0390625,
-0.028778076171875,
0.058441162109375,
0.016876220703125,
-0.0892333984375,
0.0302581787109375,
-0.0233154296875,
0.029754638671875,
0.058563232421875,
0.041839599609375,
-0.034576416015625,
-0.00799560546875,
-0.051910400390625,
-0.050262451171875,
0.05938720703125,
0.0133056640625,
-0.00508880615234375,
0.037384033203125,
0.0229644775390625,
0.02667236328125,
-0.004177093505859375,
-0.0693359375,
-0.01085662841796875,
-0.0115203857421875,
-0.01535797119140625,
-0.01140594482421875,
0.007602691650390625,
0.031005859375,
-0.0287017822265625,
0.0443115234375,
-0.01505279541015625,
0.0239715576171875,
0.0247955322265625,
-0.038421630859375,
0.0136566162109375,
0.0181732177734375,
0.043243408203125,
0.05816650390625,
-0.0182952880859375,
-0.006473541259765625,
0.035888671875,
-0.049102783203125,
-0.0029544830322265625,
0.0157928466796875,
-0.01065826416015625,
-0.0058135986328125,
0.033233642578125,
0.0635986328125,
0.02447509765625,
-0.0187225341796875,
0.043365478515625,
-0.0103912353515625,
-0.048614501953125,
-0.011138916015625,
0.0048828125,
-0.0079193115234375,
-0.005680084228515625,
0.02716064453125,
0.0192108154296875,
0.0234375,
-0.03277587890625,
0.00928497314453125,
0.0051727294921875,
-0.038055419921875,
-0.040069580078125,
0.0472412109375,
0.029998779296875,
-0.0113372802734375,
0.058563232421875,
-0.01953125,
-0.0426025390625,
0.029754638671875,
0.043121337890625,
0.077392578125,
-0.007343292236328125,
0.0260467529296875,
0.04583740234375,
0.026947021484375,
-0.0115203857421875,
-0.00812530517578125,
-0.0180816650390625,
-0.061187744140625,
-0.063232421875,
-0.03363037109375,
-0.03594970703125,
0.01120758056640625,
-0.050811767578125,
0.034576416015625,
-0.0247650146484375,
0.0151824951171875,
-0.0006422996520996094,
0.0144805908203125,
-0.062164306640625,
0.015625,
0.0115509033203125,
0.0716552734375,
-0.05816650390625,
0.07952880859375,
0.053558349609375,
-0.0222015380859375,
-0.0650634765625,
0.0037174224853515625,
-0.02471923828125,
-0.047119140625,
0.03240966796875,
0.0226593017578125,
-0.01273345947265625,
0.0168914794921875,
-0.05072021484375,
-0.072021484375,
0.09930419921875,
0.03631591796875,
-0.0260009765625,
-0.0214080810546875,
0.006282806396484375,
0.038970947265625,
-0.0240478515625,
0.0135040283203125,
0.045745849609375,
0.028533935546875,
0.01971435546875,
-0.09429931640625,
0.0201568603515625,
-0.0196380615234375,
-0.00933074951171875,
0.016693115234375,
-0.039947509765625,
0.05328369140625,
-0.024200439453125,
-0.0258331298828125,
-0.0010557174682617188,
0.0552978515625,
0.00183868408203125,
0.0186004638671875,
0.040802001953125,
0.05792236328125,
0.06146240234375,
-0.015106201171875,
0.08819580078125,
-0.003681182861328125,
0.034912109375,
0.07952880859375,
-0.0010595321655273438,
0.062744140625,
0.0240936279296875,
-0.0204010009765625,
0.044830322265625,
0.04156494140625,
0.00974273681640625,
0.043212890625,
0.0035266876220703125,
-0.004352569580078125,
-0.006206512451171875,
0.00862884521484375,
-0.033172607421875,
0.0256195068359375,
0.01287841796875,
-0.02374267578125,
-0.032745361328125,
0.00438690185546875,
0.0162353515625,
-0.006435394287109375,
-0.01424407958984375,
0.072509765625,
0.00628662109375,
-0.050018310546875,
0.047698974609375,
-0.005702972412109375,
0.0723876953125,
-0.04412841796875,
-0.00030231475830078125,
-0.0221099853515625,
0.0163116455078125,
-0.0184783935546875,
-0.05401611328125,
0.032470703125,
-0.006816864013671875,
-0.00946044921875,
-0.050811767578125,
0.07330322265625,
-0.0237274169921875,
-0.017913818359375,
0.030548095703125,
0.040252685546875,
0.0186004638671875,
-0.0096588134765625,
-0.055694580078125,
-0.017120361328125,
0.0202484130859375,
-0.0072479248046875,
0.036346435546875,
0.03631591796875,
0.00536346435546875,
0.050872802734375,
0.043792724609375,
-0.0018110275268554688,
0.01073455810546875,
0.0034198760986328125,
0.053924560546875,
-0.054840087890625,
-0.0386962890625,
-0.044403076171875,
0.03662109375,
-0.004337310791015625,
-0.040008544921875,
0.046783447265625,
0.030181884765625,
0.0885009765625,
-0.00969696044921875,
0.05816650390625,
-0.0017194747924804688,
0.0416259765625,
-0.045928955078125,
0.04852294921875,
-0.039642333984375,
0.00714111328125,
-0.0250091552734375,
-0.06378173828125,
-0.025146484375,
0.03955078125,
-0.0233917236328125,
0.016510009765625,
0.07391357421875,
0.036834716796875,
-0.006511688232421875,
0.001003265380859375,
0.018218994140625,
-0.0007901191711425781,
0.03814697265625,
0.062255859375,
0.041015625,
-0.06719970703125,
0.06610107421875,
-0.0168914794921875,
-0.00473785400390625,
-0.005290985107421875,
-0.07720947265625,
-0.0616455078125,
-0.05621337890625,
-0.0289154052734375,
-0.0169525146484375,
0.004810333251953125,
0.049407958984375,
0.06475830078125,
-0.0478515625,
-0.0019350051879882812,
-0.0215911865234375,
-0.005535125732421875,
-0.0145416259765625,
-0.0165863037109375,
0.027191162109375,
-0.051605224609375,
-0.06072998046875,
0.005481719970703125,
-0.002254486083984375,
0.00525665283203125,
0.01030731201171875,
-0.0052947998046875,
-0.023193359375,
-0.032745361328125,
0.044586181640625,
0.0217742919921875,
-0.0244140625,
-0.0238494873046875,
0.0026760101318359375,
-0.007080078125,
0.0185089111328125,
0.044403076171875,
-0.06591796875,
0.0139617919921875,
0.03521728515625,
0.07757568359375,
0.06439208984375,
-0.00997161865234375,
0.043060302734375,
-0.043731689453125,
-0.00865936279296875,
0.0132904052734375,
0.0081329345703125,
0.0272064208984375,
-0.01416015625,
0.0513916015625,
0.01248931884765625,
-0.040374755859375,
-0.03515625,
-0.00954437255859375,
-0.09588623046875,
-0.0140533447265625,
0.07977294921875,
-0.016265869140625,
-0.01568603515625,
0.0024166107177734375,
-0.01141357421875,
0.02471923828125,
-0.0243377685546875,
0.06103515625,
0.06072998046875,
0.01319122314453125,
-0.029144287109375,
-0.03662109375,
0.05169677734375,
0.044586181640625,
-0.0882568359375,
-0.0247802734375,
0.014312744140625,
0.033416748046875,
0.003326416015625,
0.042327880859375,
-0.011383056640625,
0.0190277099609375,
-0.030426025390625,
0.01470947265625,
-0.0011625289916992188,
-0.0302886962890625,
-0.04327392578125,
0.0110931396484375,
-0.0169830322265625,
-0.0261993408203125
]
] |
eimiss/EimisAnimeDiffusion_1.0v | 2023-05-16T09:28:18.000Z | [
"diffusers",
"stable-diffusion",
"text-to-image",
"image-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | eimiss | null | null | eimiss/EimisAnimeDiffusion_1.0v | 403 | 14,047 | diffusers | 2022-11-15T19:51:46 | ---
thumbnail: https://imgur.com/6ztDBPR.png
language:
- en
tags:
- stable-diffusion
- text-to-image
- image-to-image
- diffusers
license: creativeml-openrail-m
inference: true
---
# Check out v2 of the model:
https://huggingface.co/eimiss/EimisAnimeDiffusion_2.0v
# Diffusion model
This model was trained on high-quality, detailed anime images.
## Gradio
We support a [Gradio](https://github.com/gradio-app/gradio) Web UI to run EimisAnimeDiffusion_1.0v:
[](https://huggingface.co/spaces/akhaliq/EimisAnimeDiffusion_1.0v)
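For programmatic use, the checkpoint can be loaded with the 🤗 `diffusers` library (the repo's tags list `diffusers:StableDiffusionPipeline`). This is a minimal sketch, not an official recipe from this card; the fp16 dtype choice is an assumption, and the weights download on first use.

```python
def load_pipeline(device: str = "cuda"):
    """Load EimisAnimeDiffusion_1.0v as a diffusers StableDiffusionPipeline.

    Imports are kept inside the function so this sketch stays importable
    even when diffusers/torch are not installed.
    """
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "eimiss/EimisAnimeDiffusion_1.0v",
        torch_dtype=torch.float16,  # assumption: fp16 weights fit your GPU
    )
    return pipe.to(device)

# Usage (downloads several GB of weights on the first call):
# pipe = load_pipeline()
# image = pipe("a girl, Phoenix girl, fluffy hair, masterpiece, best quality").images[0]
# image.save("sample.png")
```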
# Sample generations
This model works well for anime and landscape generations.<br>
Anime:<br>
Here are some sample generations:<br>
```
Positive:a girl, Phoenix girl, fluffy hair, war, a hell on earth, Beautiful and detailed explosion, Cold machine, Fire in eyes, burning, Metal texture, Exquisite cloth, Metal carving, volume, best quality, normal hands, Metal details, Metal scratch, Metal defects, masterpiece, best quality, best quality, illustration, highres, masterpiece, contour deepening, illustration,(beautiful detailed girl),beautiful detailed glow
Negative:lowres, bad anatomy, ((bad hands)), text, error, ((missing fingers)), cropped, jpeg artifacts, worst quality, low quality, signature, watermark, blurry, deformed, extra ears, deformed, disfigured, mutation, censored, ((multiple_girls))
Steps: 20, Sampler: DPM++ 2S a, CFG scale: 8, Seed: 4186044705/4186044707, Size: 704x896
```
<img src=https://imgur.com/2U295w3.png width=75% height=75%>
<img src=https://imgur.com/2jtF376.png width=75% height=75%>
```
Positive:(1girl), cute, walking in the park, (night), full moon, north star, blue shirt, red skirt, detailed shirt, jewelry, autumn, dark blue hair, shirt hair, (magic:1.5), beautiful blue eyes
Negative: lowres, bad anatomy, ((bad hands)), text, error, ((missing fingers)), cropped, jpeg artifacts, worst quality, low quality, signature, watermark, blurry, deformed, extra ears, deformed, disfigured, mutation, censored, ((multiple_girls))
Steps: 35, Sampler: Euler a, CFG scale: 9, Seed: 296195494, Size: 768x960
```
<img src=https://imgur.com/gudKxQe.png width=75% height=75%>
```
Positive:night , ((1 girl)), alone, masterpiece, 8k wallpaper, highres, absurdres, high quality background, short hair, black hair, multicolor hair, beautiful frozen village, (full bright moon), blue dress, detailed dress, jewelry dress, (magic:1.2), blue fire, blue eyes, glowing eyes, fire, ice goddess, (blue detailed beautiful crown), electricity, blue electricity, blue light particles
Negative: lowres, bad anatomy, ((bad hands)), text, error, ((missing fingers)), cropped, jpeg artifacts, worst quality, low quality, signature, watermark, blurry, deformed, extra ears, deformed, disfigured, mutation, censored, ((multiple_girls))
Steps: 20, Sampler: DPM++ 2S a Karras, CFG scale: 9, Seed: 2118767319, Size: 768x832
```
<img src=https://imgur.com/lJL4CJL.png width=75% height=75%>
Want to generate some amazing backgrounds? No problem:
```
Positive: above clouds, mountains, (night), full moon, castle, huge forest, forest between mountains, beautiful, masterpiece
Negative: lowres, bad anatomy, ((bad hands)), text, error, ((missing fingers)), cropped, jpeg artifacts, worst quality, low quality, signature, watermark, blurry, deformed, extra ears, deformed, disfigured, mutation, censored, ((multiple_girls))
Steps: 20, Sampler: DPM++ 2S a Karras, CFG scale: 9, Seed: 83644543, Size: 896x640
```
<img src=https://imgur.com/XfxAx0S.png width=75% height=75%>
## Disclaimer
Some prompts might not work perfectly (mainly colors), so add more prompts to reinforce them, or try wrapping terms in parentheses `()` for emphasis.
They usually help. The model also works well with img2img if you want to add detail.
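The parentheses in the sample prompts above follow the common WebUI emphasis convention — an assumption about the intended frontend, not something this card specifies: each wrapping `( )` multiplies a term's attention by roughly 1.1, and `(term:w)` sets an explicit weight `w`, as in `(magic:1.5)`. A small sketch of that weighting rule:

```python
import re


def emphasis_weight(fragment: str) -> float:
    """Approximate attention weight of a parenthesized prompt fragment,
    following the common WebUI convention: '(term:w)' sets an explicit
    weight w, and each plain '(' multiplies the weight by 1.1."""
    m = re.fullmatch(r"(\(+)([^():]+):([\d.]+)(\)+)", fragment)
    if m:
        return float(m.group(3))
    depth = 0
    while fragment.startswith("(") and fragment.endswith(")"):
        fragment = fragment[1:-1]
        depth += 1
    return round(1.1 ** depth, 4)


print(emphasis_weight("((bad hands))"))  # 1.21 (two levels of emphasis)
print(emphasis_weight("(magic:1.5)"))    # 1.5 (explicit weight)
print(emphasis_weight("full moon"))      # 1.0 (no emphasis)
```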
## License
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights over the outputs you generate; you are free to use them, but you are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) | 4,833 | [
[
-0.03521728515625,
-0.060455322265625,
0.032470703125,
0.047943115234375,
-0.0269622802734375,
-0.0211181640625,
0.0258026123046875,
-0.0257110595703125,
0.0301971435546875,
0.043365478515625,
-0.057525634765625,
-0.058197021484375,
-0.05718994140625,
-0.0090484619140625,
-0.0175933837890625,
0.05810546875,
-0.007556915283203125,
0.00939178466796875,
0.0003993511199951172,
0.0113677978515625,
-0.0609130859375,
-0.0070953369140625,
-0.042724609375,
-0.0250396728515625,
0.011810302734375,
0.03021240234375,
0.046356201171875,
0.037689208984375,
0.0244903564453125,
0.02471923828125,
-0.02801513671875,
-0.0014667510986328125,
-0.055419921875,
0.006191253662109375,
-0.0026092529296875,
-0.0211181640625,
-0.044464111328125,
0.017547607421875,
0.038604736328125,
0.0174407958984375,
-0.016754150390625,
0.00848388671875,
0.01129150390625,
0.04449462890625,
-0.0161590576171875,
-0.0083160400390625,
-0.008514404296875,
0.016937255859375,
-0.021209716796875,
0.0025177001953125,
-0.00308990478515625,
-0.03265380859375,
-0.01739501953125,
-0.06939697265625,
0.0184783935546875,
0.005298614501953125,
0.087646484375,
-0.0009479522705078125,
-0.034515380859375,
-0.0235443115234375,
-0.033355712890625,
0.05035400390625,
-0.040496826171875,
0.01324462890625,
0.04010009765625,
0.016357421875,
-0.0205230712890625,
-0.06329345703125,
-0.039825439453125,
-0.004673004150390625,
-0.01165008544921875,
0.025909423828125,
-0.03436279296875,
-0.043365478515625,
0.0240478515625,
0.02935791015625,
-0.057525634765625,
-0.0164642333984375,
-0.03350830078125,
0.00519561767578125,
0.045806884765625,
0.0164794921875,
0.05108642578125,
0.0018663406372070312,
-0.0323486328125,
-0.0184326171875,
-0.048126220703125,
0.00862884521484375,
0.051513671875,
-0.0014028549194335938,
-0.051025390625,
0.04046630859375,
-0.0030422210693359375,
0.037689208984375,
0.0274200439453125,
-0.0162811279296875,
0.032470703125,
-0.033935546875,
0.00012969970703125,
-0.0267791748046875,
0.065673828125,
0.034698486328125,
-0.007732391357421875,
0.00433349609375,
0.006916046142578125,
-0.0021381378173828125,
-0.0034351348876953125,
-0.09576416015625,
-0.021392822265625,
0.0249176025390625,
-0.046478271484375,
-0.0267791748046875,
-0.018157958984375,
-0.07476806640625,
-0.0123291015625,
-0.0089263916015625,
0.035064697265625,
-0.05657958984375,
-0.032135009765625,
0.0175933837890625,
-0.034515380859375,
0.0027923583984375,
0.0307464599609375,
-0.05718994140625,
0.004695892333984375,
0.01593017578125,
0.056854248046875,
0.0229644775390625,
-0.00726318359375,
0.01328277587890625,
0.012359619140625,
-0.027618408203125,
0.049560546875,
-0.02276611328125,
-0.04827880859375,
-0.048492431640625,
0.0197601318359375,
-0.00415802001953125,
-0.020263671875,
0.05877685546875,
-0.01151275634765625,
0.00991058349609375,
-0.0109405517578125,
-0.024993896484375,
-0.030975341796875,
-0.00844573974609375,
-0.04364013671875,
0.044647216796875,
0.00919342041015625,
-0.067138671875,
0.023834228515625,
-0.0518798828125,
-0.005695343017578125,
-0.007480621337890625,
0.006549835205078125,
-0.034698486328125,
-0.0081634521484375,
0.00807952880859375,
0.026336669921875,
-0.0008969306945800781,
-0.013824462890625,
-0.038360595703125,
-0.01525115966796875,
0.0208282470703125,
-0.017364501953125,
0.08837890625,
0.030609130859375,
-0.01202392578125,
-0.0009112358093261719,
-0.060394287109375,
0.0030536651611328125,
0.0560302734375,
0.0011920928955078125,
-0.019378662109375,
-0.02117919921875,
0.0207977294921875,
0.0214385986328125,
0.01335906982421875,
-0.032073974609375,
0.0120697021484375,
0.0010156631469726562,
0.03271484375,
0.04205322265625,
0.003936767578125,
0.02362060546875,
-0.0482177734375,
0.048370361328125,
0.01392364501953125,
0.02178955078125,
-0.019378662109375,
-0.0489501953125,
-0.041656494140625,
-0.052093505859375,
0.0128631591796875,
0.0172576904296875,
-0.04071044921875,
0.03704833984375,
0.009002685546875,
-0.07720947265625,
-0.038299560546875,
0.006687164306640625,
0.04266357421875,
0.033355712890625,
0.00925445556640625,
-0.026824951171875,
-0.026092529296875,
-0.07073974609375,
0.0221099853515625,
0.016265869140625,
0.0029277801513671875,
0.0256500244140625,
0.055419921875,
-0.020721435546875,
0.0435791015625,
-0.047607421875,
-0.019195556640625,
-0.00740814208984375,
0.01197052001953125,
0.052886962890625,
0.05718994140625,
0.07720947265625,
-0.07861328125,
-0.048095703125,
-0.0087738037109375,
-0.0625,
-0.01453399658203125,
0.0247802734375,
-0.0419921875,
-0.02783203125,
-0.022003173828125,
-0.048980712890625,
0.031463623046875,
0.04364013671875,
-0.0650634765625,
0.03350830078125,
-0.027801513671875,
0.03179931640625,
-0.08685302734375,
0.0240020751953125,
0.0208892822265625,
-0.0214080810546875,
-0.060577392578125,
0.056732177734375,
-0.00045871734619140625,
-0.002716064453125,
-0.048614501953125,
0.061737060546875,
-0.034515380859375,
0.0141448974609375,
-0.032958984375,
-0.014495849609375,
0.0166473388671875,
0.029266357421875,
0.00827789306640625,
0.043731689453125,
0.06085205078125,
-0.038604736328125,
0.020050048828125,
0.04205322265625,
-0.0144195556640625,
0.056671142578125,
-0.059722900390625,
0.0238800048828125,
-0.0165863037109375,
0.021270751953125,
-0.0718994140625,
-0.0404052734375,
0.0404052734375,
-0.042724609375,
0.019073486328125,
-0.0173492431640625,
-0.0384521484375,
-0.03900146484375,
-0.0251617431640625,
0.0141448974609375,
0.06640625,
-0.04547119140625,
0.031494140625,
0.0251922607421875,
-0.0136871337890625,
-0.0270843505859375,
-0.051788330078125,
-0.016754150390625,
-0.033416748046875,
-0.061798095703125,
0.0301666259765625,
-0.0118255615234375,
-0.01200103759765625,
0.002780914306640625,
0.0205230712890625,
-0.0243377685546875,
-0.0013837814331054688,
0.038970947265625,
0.0113525390625,
-0.0032196044921875,
-0.0159454345703125,
0.0157470703125,
0.0012311935424804688,
-0.0259552001953125,
-0.0010623931884765625,
0.0592041015625,
-0.01276397705078125,
-0.024139404296875,
-0.0653076171875,
0.026397705078125,
0.04559326171875,
0.005062103271484375,
0.048828125,
0.056182861328125,
-0.04443359375,
0.005279541015625,
-0.034698486328125,
-0.0017223358154296875,
-0.031768798828125,
0.0078277587890625,
-0.0305633544921875,
-0.0253448486328125,
0.05279541015625,
0.034149169921875,
0.004505157470703125,
0.05450439453125,
0.047515869140625,
-0.0195465087890625,
0.09271240234375,
0.039459228515625,
0.0205841064453125,
0.0228118896484375,
-0.05767822265625,
-0.0272216796875,
-0.04974365234375,
-0.0279998779296875,
-0.0237579345703125,
-0.039581298828125,
-0.0186767578125,
-0.023956298828125,
0.017059326171875,
0.035308837890625,
-0.03778076171875,
0.0302886962890625,
-0.03009033203125,
0.02252197265625,
0.033172607421875,
0.045135498046875,
0.019073486328125,
0.0102691650390625,
0.0019779205322265625,
-0.0308685302734375,
-0.0289459228515625,
-0.033172607421875,
0.07708740234375,
0.03680419921875,
0.051605224609375,
0.0257568359375,
0.043182373046875,
0.01861572265625,
0.0234375,
-0.017486572265625,
0.041748046875,
-0.023956298828125,
-0.0751953125,
-0.007495880126953125,
-0.018157958984375,
-0.062286376953125,
0.024871826171875,
-0.036163330078125,
-0.054962158203125,
0.0237274169921875,
0.01293182373046875,
-0.023193359375,
0.0184326171875,
-0.053375244140625,
0.052001953125,
-0.01654052734375,
-0.054534912109375,
-0.0189208984375,
-0.025177001953125,
0.052276611328125,
-0.0040283203125,
0.016815185546875,
-0.0074005126953125,
0.0114898681640625,
0.059722900390625,
-0.04156494140625,
0.06573486328125,
-0.01377105712890625,
-0.0022640228271484375,
0.037811279296875,
-0.0021038055419921875,
0.02764892578125,
0.017974853515625,
0.00891876220703125,
0.019317626953125,
-0.005184173583984375,
-0.037567138671875,
-0.00917816162109375,
0.0697021484375,
-0.0667724609375,
-0.026611328125,
-0.023773193359375,
-0.0311431884765625,
0.021331787109375,
0.027130126953125,
0.04559326171875,
0.0289764404296875,
-0.0012798309326171875,
0.0170745849609375,
0.04718017578125,
-0.01055908203125,
0.0261993408203125,
0.0202484130859375,
-0.05755615234375,
-0.050537109375,
0.0657958984375,
0.016387939453125,
0.0153350830078125,
-0.01366424560546875,
0.023651123046875,
-0.02215576171875,
-0.04425048828125,
-0.04248046875,
0.0214996337890625,
-0.047607421875,
0.001270294189453125,
-0.0552978515625,
-0.01374053955078125,
-0.026336669921875,
-0.012359619140625,
-0.0163116455078125,
-0.02178955078125,
-0.04595947265625,
-0.0032558441162109375,
0.0487060546875,
0.037017822265625,
-0.0095367431640625,
0.033172607421875,
-0.049591064453125,
0.03277587890625,
0.03399658203125,
0.0171661376953125,
0.00634765625,
-0.05169677734375,
0.0096282958984375,
0.01538848876953125,
-0.061431884765625,
-0.09765625,
0.059844970703125,
-0.0108642578125,
0.040008544921875,
0.042694091796875,
-0.032806396484375,
0.06854248046875,
-0.0231475830078125,
0.0399169921875,
0.0438232421875,
-0.042388916015625,
0.032562255859375,
-0.061004638671875,
0.007724761962890625,
0.0290069580078125,
0.02984619140625,
-0.024627685546875,
-0.01983642578125,
-0.053741455078125,
-0.052215576171875,
0.0433349609375,
0.05914306640625,
0.01334381103515625,
0.02093505859375,
0.042572021484375,
0.0166473388671875,
0.017059326171875,
-0.07537841796875,
-0.039703369140625,
-0.037567138671875,
-0.003177642822265625,
-0.0093231201171875,
0.006561279296875,
-0.00653839111328125,
-0.02593994140625,
0.0650634765625,
0.0089263916015625,
0.0157470703125,
0.0003561973571777344,
0.01331329345703125,
-0.0298614501953125,
-0.00792694091796875,
0.0299835205078125,
0.0305633544921875,
-0.0188446044921875,
-0.03106689453125,
-0.0016994476318359375,
-0.045257568359375,
0.03204345703125,
0.005977630615234375,
-0.043914794921875,
0.004825592041015625,
0.00988006591796875,
0.0665283203125,
-0.001598358154296875,
-0.0240020751953125,
0.0223541259765625,
-0.0190887451171875,
-0.01509857177734375,
-0.0227813720703125,
0.019744873046875,
0.02716064453125,
0.03472900390625,
0.03253173828125,
0.0032978057861328125,
0.0232086181640625,
-0.04364013671875,
-0.0048065185546875,
0.03240966796875,
-0.010833740234375,
-0.03570556640625,
0.061553955078125,
-0.0023937225341796875,
-0.021026611328125,
0.0238494873046875,
-0.041259765625,
-0.0179290771484375,
0.08587646484375,
0.05474853515625,
0.05645751953125,
-0.021209716796875,
0.03204345703125,
0.05999755859375,
0.0042266845703125,
-0.01409149169921875,
0.04400634765625,
0.036376953125,
-0.0301055908203125,
0.00357818603515625,
-0.051605224609375,
-0.01424407958984375,
0.04034423828125,
-0.0201416015625,
0.05364990234375,
-0.04669189453125,
-0.0253753662109375,
-0.01557159423828125,
-0.006481170654296875,
-0.04571533203125,
0.0341796875,
0.0171661376953125,
0.054962158203125,
-0.0841064453125,
0.037200927734375,
0.054443359375,
-0.045440673828125,
-0.053924560546875,
-0.0103912353515625,
0.00740814208984375,
-0.0310821533203125,
0.041595458984375,
0.023773193359375,
0.01326751708984375,
0.0031223297119140625,
-0.06396484375,
-0.075927734375,
0.1053466796875,
0.00829315185546875,
-0.032073974609375,
0.0005774497985839844,
-0.03228759765625,
0.0491943359375,
-0.029571533203125,
0.041748046875,
0.0179901123046875,
0.039703369140625,
0.04315185546875,
-0.04205322265625,
-0.002094268798828125,
-0.044281005859375,
0.0200653076171875,
-0.00795745849609375,
-0.072265625,
0.09344482421875,
-0.01181793212890625,
-0.031829833984375,
0.046875,
0.055908203125,
0.05963134765625,
0.0301361083984375,
0.0364990234375,
0.0679931640625,
0.0223236083984375,
-0.01515960693359375,
0.06488037109375,
-0.02020263671875,
0.0196533203125,
0.056976318359375,
-0.00007545948028564453,
0.044586181640625,
0.020477294921875,
-0.0267791748046875,
0.0175323486328125,
0.0794677734375,
-0.01258087158203125,
0.06085205078125,
0.0108642578125,
0.003204345703125,
-0.00879669189453125,
-0.006866455078125,
-0.0440673828125,
0.00782012939453125,
0.024383544921875,
-0.0210723876953125,
-0.0093536376953125,
0.0092926025390625,
0.00904083251953125,
0.006427764892578125,
-0.02374267578125,
0.02435302734375,
0.00775909423828125,
-0.02532958984375,
0.0276641845703125,
0.0029659271240234375,
0.0555419921875,
-0.048126220703125,
-0.024200439453125,
-0.03387451171875,
0.008697509765625,
-0.02447509765625,
-0.06903076171875,
0.00330352783203125,
0.01245880126953125,
-0.0002567768096923828,
-0.030029296875,
0.05413818359375,
-0.01120758056640625,
-0.061614990234375,
0.023956298828125,
0.0247039794921875,
0.0238037109375,
0.016326904296875,
-0.0673828125,
-0.01409149169921875,
0.0037174224853515625,
-0.0255279541015625,
0.01544952392578125,
0.011444091796875,
0.00977325439453125,
0.05169677734375,
0.045623779296875,
0.012969970703125,
0.00865936279296875,
-0.0173492431640625,
0.05255126953125,
-0.03045654296875,
-0.0438232421875,
-0.055084228515625,
0.0567626953125,
-0.0155181884765625,
-0.032012939453125,
0.07232666015625,
0.0279388427734375,
0.041259765625,
-0.0277557373046875,
0.05322265625,
-0.0186920166015625,
0.0322265625,
-0.04693603515625,
0.076904296875,
-0.062164306640625,
-0.01334381103515625,
-0.0299835205078125,
-0.0657958984375,
-0.003086090087890625,
0.06390380859375,
-0.00876617431640625,
0.01416778564453125,
0.020599365234375,
0.058685302734375,
-0.0223846435546875,
-0.0076446533203125,
0.0013933181762695312,
0.028228759765625,
0.0159454345703125,
0.044097900390625,
0.07330322265625,
-0.036651611328125,
0.01131439208984375,
-0.0511474609375,
-0.0263824462890625,
-0.0257720947265625,
-0.055023193359375,
-0.07037353515625,
-0.032196044921875,
-0.032379150390625,
-0.0301361083984375,
0.01215362548828125,
0.05322265625,
0.06134033203125,
-0.053741455078125,
-0.01328277587890625,
-0.007541656494140625,
-0.006816864013671875,
-0.0166015625,
-0.007244110107421875,
0.0007238388061523438,
0.04107666015625,
-0.077392578125,
0.01154327392578125,
-0.0075225830078125,
0.05401611328125,
-0.018585205078125,
-0.0211944580078125,
-0.00991058349609375,
-0.01354217529296875,
0.00762176513671875,
0.00794219970703125,
-0.03179931640625,
0.004039764404296875,
-0.0117950439453125,
-0.01568603515625,
-0.0028858184814453125,
-0.000015437602996826172,
-0.0171051025390625,
0.01739501953125,
0.06317138671875,
-0.003017425537109375,
0.047760009765625,
-0.0001251697540283203,
0.006076812744140625,
-0.022674560546875,
0.019500732421875,
-0.00484466552734375,
0.053436279296875,
0.01800537109375,
-0.0302886962890625,
0.043365478515625,
0.033935546875,
-0.0287628173828125,
-0.056396484375,
0.0091094970703125,
-0.0665283203125,
-0.026092529296875,
0.08306884765625,
-0.01482391357421875,
-0.02459716796875,
0.0198822021484375,
-0.03875732421875,
0.0140380859375,
-0.02435302734375,
0.038604736328125,
0.049041748046875,
0.00099945068359375,
0.0012760162353515625,
-0.0452880859375,
-0.0020809173583984375,
0.006317138671875,
-0.05474853515625,
-0.029510498046875,
0.04730224609375,
0.045379638671875,
0.03668212890625,
0.0487060546875,
-0.0281829833984375,
0.017578125,
0.015777587890625,
0.032073974609375,
-0.009307861328125,
-0.0207366943359375,
-0.04241943359375,
0.016265869140625,
-0.013580322265625,
-0.0164642333984375
]
] |
sonoisa/t5-base-japanese | 2022-07-31T08:20:41.000Z | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"text2text-generation",
"seq2seq",
"ja",
"dataset:wikipedia",
"dataset:oscar",
"dataset:cc100",
"license:cc-by-sa-4.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | feature-extraction | sonoisa | null | null | sonoisa/t5-base-japanese | 35 | 14,022 | transformers | 2022-03-02T23:29:05 | ---
language: ja
tags:
- t5
- text2text-generation
- seq2seq
license: cc-by-sa-4.0
datasets:
- wikipedia
- oscar
- cc100
---
# Japanese T5 Pretrained Model
This is a T5 (Text-to-Text Transfer Transformer) model pretrained on the following Japanese corpora (about 100GB):
* The Japanese dump of [Wikipedia](https://ja.wikipedia.org) (as of July 6, 2020)
* The Japanese portion of the [OSCAR](https://oscar-corpus.com) corpus
* The Japanese portion of the [CC-100](http://data.statmt.org/cc-100/) corpus
This model has only been pretrained; it must be fine-tuned before being used for a specific task.
Like other language models trained on large corpora, this model can potentially produce skewed (unethical, harmful, or biased) outputs stemming from biases in the training data.
Please keep this possibility in mind and take care to use the model only in applications where no harm can result.
The full Wikipedia data listed above was used to train the SentencePiece tokenizer.
# Sample code for transfer learning
https://github.com/sonoisa/t5-japanese
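Since the checkpoint is pretrain-only, a typical workflow loads it with 🤗 `transformers` and then fine-tunes it on a downstream task. A minimal loading sketch — the class names below are the standard T5 ones, an assumption rather than something taken from this card:

```python
def load_model(model_name: str = "sonoisa/t5-base-japanese"):
    """Load the pretrained Japanese T5 tokenizer and model for fine-tuning.

    Imports are kept inside the function so this sketch stays importable
    without transformers installed.
    """
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    tokenizer = T5Tokenizer.from_pretrained(model_name)
    model = T5ForConditionalGeneration.from_pretrained(model_name)
    return tokenizer, model

# Usage (downloads the weights on the first call):
# tokenizer, model = load_model()
# ...then fine-tune on task-specific input/target text pairs,
# e.g. following the linked t5-japanese notebooks.
```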
# Benchmarks
## livedoor news classification task
Accuracy on the news-article genre prediction task using the livedoor news corpus is as follows.
Compared to Google's multilingual T5 model, this model is 25% smaller and about 6 points more accurate.
Japanese T5 ([t5-base-japanese](https://huggingface.co/sonoisa/t5-base-japanese), 222M parameters, [reproduction code](https://github.com/sonoisa/t5-japanese/blob/main/t5_japanese_classification.ipynb))
| label | precision | recall | f1-score | support |
| ----------- | ----------- | ------- | -------- | ------- |
| 0 | 0.96 | 0.94 | 0.95 | 130 |
| 1 | 0.98 | 0.99 | 0.99 | 121 |
| 2 | 0.96 | 0.96 | 0.96 | 123 |
| 3 | 0.86 | 0.91 | 0.89 | 82 |
| 4 | 0.96 | 0.97 | 0.97 | 129 |
| 5 | 0.96 | 0.96 | 0.96 | 141 |
| 6 | 0.98 | 0.98 | 0.98 | 127 |
| 7 | 1.00 | 0.99 | 1.00 | 127 |
| 8 | 0.99 | 0.97 | 0.98 | 120 |
| accuracy | | | 0.97 | 1100 |
| macro avg | 0.96 | 0.96 | 0.96 | 1100 |
| weighted avg | 0.97 | 0.97 | 0.97 | 1100 |
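As a sanity check, the macro and weighted averages in the table can be recomputed from the per-class F1 scores and supports (values transcribed from the table above):

```python
# Per-class f1-score and support for t5-base-japanese, from the table above.
f1 = [0.95, 0.99, 0.96, 0.89, 0.97, 0.96, 0.98, 1.00, 0.98]
support = [130, 121, 123, 82, 129, 141, 127, 127, 120]

# Macro average: unweighted mean over classes.
macro_f1 = sum(f1) / len(f1)
# Weighted average: mean weighted by each class's support.
weighted_f1 = sum(f * s for f, s in zip(f1, support)) / sum(support)

print(round(macro_f1, 2))     # 0.96, matching the "macro avg" row
print(round(weighted_f1, 2))  # 0.97, matching the "weighted avg" row
```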
Baseline: multilingual T5 ([google/mt5-small](https://huggingface.co/google/mt5-small), 300M parameters)
| label | precision | recall | f1-score | support |
| ----------- | ----------- | ------- | -------- | ------- |
| 0 | 0.91 | 0.88 | 0.90 | 130 |
| 1 | 0.84 | 0.93 | 0.89 | 121 |
| 2 | 0.93 | 0.80 | 0.86 | 123 |
| 3 | 0.82 | 0.74 | 0.78 | 82 |
| 4 | 0.90 | 0.95 | 0.92 | 129 |
| 5 | 0.89 | 0.89 | 0.89 | 141 |
| 6 | 0.97 | 0.98 | 0.97 | 127 |
| 7 | 0.95 | 0.98 | 0.97 | 127 |
| 8 | 0.93 | 0.95 | 0.94 | 120 |
| accuracy | | | 0.91 | 1100 |
| macro avg | 0.91 | 0.90 | 0.90 | 1100 |
| weighted avg | 0.91 | 0.91 | 0.91 | 1100 |
## JGLUE benchmark
Results on the [JGLUE](https://github.com/yahoojapan/JGLUE) benchmark are as follows (more to be added over time).
- MARC-ja: in preparation
- JSTS: in preparation
- JNLI: in preparation
- JSQuAD: EM=0.900, F1=0.945, [reproduction code](https://github.com/sonoisa/t5-japanese/blob/main/t5_JSQuAD.ipynb)
- JCommonsenseQA: in preparation
# Disclaimer
While the authors of this model have taken great care regarding its content and functionality, they do not guarantee that the model's outputs are accurate or safe, and they accept no responsibility for them. Even if users suffer any inconvenience or damage as a result of using this model, the authors of the model or dataset and their affiliated organizations bear no responsibility. Users are obligated to make clear that the authors and their affiliated organizations bear no such responsibility.
# License
[CC-BY SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/deed.ja)
Please also take care to comply with the [Common Crawl terms of use](http://commoncrawl.org/terms-of-use/).
| 3,558 | [
[
-0.038360595703125,
-0.039703369140625,
0.023590087890625,
0.0094451904296875,
-0.031402587890625,
0.00970458984375,
-0.00856781005859375,
-0.03717041015625,
0.04364013671875,
0.00804901123046875,
-0.03948974609375,
-0.067138671875,
-0.05389404296875,
-0.004909515380859375,
-0.00506591796875,
0.0626220703125,
-0.0190582275390625,
-0.0108795166015625,
0.0030231475830078125,
-0.0157318115234375,
-0.04486083984375,
-0.01096343994140625,
-0.052001953125,
-0.0244598388671875,
0.01023101806640625,
0.03192138671875,
0.042633056640625,
0.032470703125,
0.042938232421875,
0.01934814453125,
-0.007587432861328125,
-0.0027561187744140625,
-0.0182952880859375,
-0.00757598876953125,
0.00832366943359375,
-0.03668212890625,
-0.038482666015625,
-0.01885986328125,
0.04168701171875,
0.040435791015625,
0.007965087890625,
0.0235595703125,
0.008087158203125,
0.05023193359375,
-0.03717041015625,
0.01837158203125,
-0.0233917236328125,
0.0007238388061523438,
-0.0151214599609375,
-0.01971435546875,
-0.00589752197265625,
-0.040924072265625,
-0.0133514404296875,
-0.0543212890625,
0.0093536376953125,
-0.00601959228515625,
0.09320068359375,
-0.003284454345703125,
-0.0186920166015625,
-0.01088714599609375,
-0.030364990234375,
0.063232421875,
-0.046600341796875,
0.0343017578125,
0.0277557373046875,
0.018707275390625,
-0.002834320068359375,
-0.0467529296875,
-0.04852294921875,
0.00785064697265625,
-0.0142669677734375,
0.03564453125,
-0.00618743896484375,
-0.0227813720703125,
0.0255889892578125,
0.020172119140625,
-0.045013427734375,
-0.00675201416015625,
-0.0321044921875,
-0.01253509521484375,
0.059051513671875,
0.002452850341796875,
0.047882080078125,
-0.040985107421875,
-0.023712158203125,
-0.0220184326171875,
-0.03515625,
0.041290283203125,
0.015899658203125,
0.014801025390625,
-0.0438232421875,
0.0290069580078125,
-0.0101318359375,
0.0234832763671875,
0.01192474365234375,
-0.060791015625,
0.06097412109375,
-0.0236358642578125,
-0.009033203125,
0.00927734375,
0.0860595703125,
0.049957275390625,
-0.002166748046875,
0.0036869049072265625,
-0.0222320556640625,
-0.018951416015625,
0.0005865097045898438,
-0.057464599609375,
0.00864410400390625,
0.0192413330078125,
-0.0391845703125,
-0.02825927734375,
0.00878143310546875,
-0.07183837890625,
0.0095672607421875,
-0.0002351999282836914,
0.045257568359375,
-0.052581787109375,
-0.012359619140625,
-0.002185821533203125,
-0.0003066062927246094,
0.009796142578125,
0.0207366943359375,
-0.07757568359375,
0.0164947509765625,
0.0254364013671875,
0.0555419921875,
-0.00380706787109375,
-0.015350341796875,
0.0111083984375,
0.028900146484375,
-0.03106689453125,
0.05157470703125,
-0.0135345458984375,
-0.05059814453125,
-0.01065826416015625,
0.0301971435546875,
-0.037078857421875,
-0.0190582275390625,
0.0618896484375,
-0.0194549560546875,
0.040924072265625,
-0.0086822509765625,
-0.016143798828125,
-0.0290985107421875,
0.0228729248046875,
-0.032989501953125,
0.08526611328125,
0.006572723388671875,
-0.07501220703125,
0.00749969482421875,
-0.0413818359375,
-0.046539306640625,
-0.004810333251953125,
-0.0040740966796875,
-0.052764892578125,
-0.022918701171875,
0.026611328125,
0.0283355712890625,
-0.0232696533203125,
0.0124664306640625,
-0.012664794921875,
-0.02105712890625,
0.017669677734375,
-0.0193634033203125,
0.08148193359375,
0.022918701171875,
-0.03765869140625,
-0.0006380081176757812,
-0.06427001953125,
0.00879669189453125,
0.03363037109375,
-0.006725311279296875,
-0.016357421875,
-0.01861572265625,
-0.006748199462890625,
0.0199737548828125,
0.049041748046875,
-0.042022705078125,
0.01030731201171875,
-0.0267181396484375,
0.0284576416015625,
0.051483154296875,
0.0189971923828125,
0.0135345458984375,
-0.044647216796875,
0.04644775390625,
0.007610321044921875,
0.004810333251953125,
-0.011260986328125,
-0.0290985107421875,
-0.065185546875,
-0.02587890625,
0.003307342529296875,
0.04925537109375,
-0.0556640625,
0.045196533203125,
-0.030303955078125,
-0.0594482421875,
-0.038818359375,
-0.007183074951171875,
0.0241851806640625,
0.0478515625,
0.04534912109375,
-0.0033168792724609375,
-0.025543212890625,
-0.06097412109375,
-0.00868988037109375,
-0.01366424560546875,
0.01148223876953125,
0.029022216796875,
0.0460205078125,
-0.0260009765625,
0.039031982421875,
-0.050323486328125,
-0.028717041015625,
-0.03839111328125,
0.00646209716796875,
0.043212890625,
0.052154541015625,
0.042388916015625,
-0.052276611328125,
-0.05413818359375,
0.007297515869140625,
-0.07135009765625,
-0.01227569580078125,
-0.01189422607421875,
-0.00531768798828125,
0.014862060546875,
0.018707275390625,
-0.041778564453125,
0.0245513916015625,
0.020904541015625,
-0.025543212890625,
0.035858154296875,
-0.0191802978515625,
0.02899169921875,
-0.11541748046875,
0.037384033203125,
-0.00150299072265625,
-0.01123809814453125,
-0.036468505859375,
0.00434112548828125,
0.0171356201171875,
0.003635406494140625,
-0.035552978515625,
0.045166015625,
-0.03192138671875,
0.00739288330078125,
0.00891876220703125,
0.0179290771484375,
0.0018367767333984375,
0.032318115234375,
-0.0092926025390625,
0.07867431640625,
0.04052734375,
-0.044189453125,
0.021636962890625,
0.01131439208984375,
-0.0145416259765625,
0.0413818359375,
-0.0498046875,
-0.0162506103515625,
-0.0029449462890625,
-0.00054168701171875,
-0.07061767578125,
-0.02850341796875,
0.0230865478515625,
-0.051483154296875,
0.042755126953125,
0.0021152496337890625,
-0.024627685546875,
-0.056610107421875,
-0.056854248046875,
0.01904296875,
0.039337158203125,
-0.03485107421875,
0.033935546875,
0.03460693359375,
-0.005893707275390625,
-0.048583984375,
-0.05340576171875,
-0.0134124755859375,
-0.017730712890625,
-0.05181884765625,
0.033935546875,
-0.01129150390625,
0.0063018798828125,
-0.01055908203125,
-0.01251983642578125,
-0.010528564453125,
0.00962066650390625,
0.0160675048828125,
0.037994384765625,
-0.01514434814453125,
0.004505157470703125,
0.0175628662109375,
-0.0177154541015625,
-0.00038933753967285156,
-0.0027523040771484375,
0.042449951171875,
-0.01401519775390625,
-0.016937255859375,
-0.070556640625,
-0.00769805908203125,
0.05804443359375,
-0.012359619140625,
0.038726806640625,
0.043670654296875,
-0.01702880859375,
0.00829315185546875,
-0.0305633544921875,
0.0006670951843261719,
-0.0360107421875,
0.0072784423828125,
-0.049530029296875,
-0.052154541015625,
0.05023193359375,
0.01068115234375,
-0.0087127685546875,
0.07501220703125,
0.02374267578125,
-0.032684326171875,
0.0814208984375,
0.0271759033203125,
-0.00787353515625,
0.02545166015625,
-0.047393798828125,
0.0146942138671875,
-0.0709228515625,
-0.037139892578125,
-0.0423583984375,
-0.0236358642578125,
-0.07061767578125,
-0.0059814453125,
0.0396728515625,
0.00466156005859375,
-0.01409912109375,
0.032684326171875,
-0.0489501953125,
0.0180816650390625,
0.0277252197265625,
0.01395416259765625,
0.0171661376953125,
-0.015106201171875,
-0.0275726318359375,
-0.01397705078125,
-0.038665771484375,
-0.042755126953125,
0.07843017578125,
0.0290985107421875,
0.048004150390625,
0.02105712890625,
0.059783935546875,
-0.0004019737243652344,
-0.004573822021484375,
-0.04815673828125,
0.052490234375,
-0.01229095458984375,
-0.0513916015625,
-0.01265716552734375,
-0.0206756591796875,
-0.09466552734375,
0.028106689453125,
-0.026123046875,
-0.07196044921875,
0.0085296630859375,
-0.00823211669921875,
-0.0223846435546875,
0.03631591796875,
-0.057403564453125,
0.074462890625,
-0.0269927978515625,
-0.04669189453125,
0.0008096694946289062,
-0.046142578125,
0.039794921875,
-0.0010423660278320312,
0.035888671875,
-0.0192718505859375,
-0.0013751983642578125,
0.07501220703125,
-0.043609619140625,
0.04034423828125,
-0.009552001953125,
-0.0018520355224609375,
0.0301971435546875,
0.0015363693237304688,
0.036651611328125,
0.003932952880859375,
0.01129150390625,
-0.00251007080078125,
0.0262908935546875,
-0.035064697265625,
-0.023773193359375,
0.07086181640625,
-0.07354736328125,
-0.031585693359375,
-0.050811767578125,
-0.01343536376953125,
-0.00494384765625,
0.05340576171875,
0.035797119140625,
0.027984619140625,
0.004802703857421875,
0.0183258056640625,
0.058563232421875,
-0.0248260498046875,
0.052764892578125,
0.032989501953125,
-0.0164947509765625,
-0.03900146484375,
0.0882568359375,
0.016510009765625,
0.01180267333984375,
0.041778564453125,
0.0164337158203125,
-0.02276611328125,
-0.0257720947265625,
-0.03240966796875,
0.03302001953125,
-0.035858154296875,
-0.01183319091796875,
-0.0460205078125,
-0.01107025146484375,
-0.05291748046875,
-0.0216064453125,
-0.0233917236328125,
-0.040618896484375,
-0.0298004150390625,
-0.00551605224609375,
0.020751953125,
0.0305633544921875,
0.01169586181640625,
0.011749267578125,
-0.06195068359375,
0.034149169921875,
0.006053924560546875,
0.0199737548828125,
-0.0030422210693359375,
-0.0447998046875,
-0.01065826416015625,
-0.005245208740234375,
-0.036956787109375,
-0.061004638671875,
0.051116943359375,
-0.017974853515625,
0.037689208984375,
0.0269927978515625,
-0.0016298294067382812,
0.049163818359375,
-0.012847900390625,
0.0791015625,
0.03509521484375,
-0.0694580078125,
0.05059814453125,
-0.03546142578125,
0.041015625,
0.049957275390625,
0.0496826171875,
-0.0545654296875,
-0.0219573974609375,
-0.0711669921875,
-0.0546875,
0.0596923828125,
-0.0012636184692382812,
-0.00513458251953125,
0.0146636962890625,
-0.01158905029296875,
-0.0027370452880859375,
0.01505279541015625,
-0.060333251953125,
-0.0423583984375,
-0.021728515625,
-0.0416259765625,
0.001636505126953125,
0.000659942626953125,
0.006206512451171875,
-0.028717041015625,
0.07257080078125,
-0.00713348388671875,
0.032470703125,
0.024169921875,
-0.004741668701171875,
-0.0200958251953125,
0.0286865234375,
0.059326171875,
0.03369140625,
-0.0243377685546875,
0.01395416259765625,
0.024139404296875,
-0.0533447265625,
0.006137847900390625,
0.00229644775390625,
-0.054229736328125,
0.006378173828125,
0.043548583984375,
0.052642822265625,
0.018951416015625,
-0.035888671875,
0.05419921875,
-0.00304412841796875,
-0.031982421875,
-0.0235595703125,
0.01349639892578125,
-0.00220489501953125,
-0.003673553466796875,
0.02923583984375,
-0.005741119384765625,
0.0031871795654296875,
-0.034454345703125,
-0.005481719970703125,
0.03228759765625,
-0.024078369140625,
-0.0180511474609375,
0.05804443359375,
0.0008587837219238281,
-0.0013990402221679688,
0.00353240966796875,
-0.004161834716796875,
-0.06488037109375,
0.049102783203125,
0.0384521484375,
0.06097412109375,
-0.0187835693359375,
0.0029582977294921875,
0.08258056640625,
0.030548095703125,
0.01345062255859375,
0.035369873046875,
0.0074005126953125,
-0.0330810546875,
0.004688262939453125,
-0.045257568359375,
-0.01100921630859375,
0.01837158203125,
-0.03106689453125,
0.0252532958984375,
-0.0498046875,
-0.0156097412109375,
-0.00445556640625,
0.025848388671875,
-0.01934814453125,
0.044189453125,
-0.0033473968505859375,
0.05926513671875,
-0.050079345703125,
0.06268310546875,
0.04400634765625,
-0.050537109375,
-0.0775146484375,
0.00628662109375,
-0.00891876220703125,
-0.05548095703125,
0.04180908203125,
0.0096435546875,
-0.0129241943359375,
-0.0099029541015625,
-0.03656005859375,
-0.0821533203125,
0.1087646484375,
-0.0078277587890625,
-0.02716064453125,
0.018798828125,
0.014495849609375,
0.04656982421875,
-0.0024166107177734375,
0.038665771484375,
0.0286102294921875,
0.04437255859375,
0.00710296630859375,
-0.077392578125,
0.0250244140625,
-0.03326416015625,
-0.0025482177734375,
0.01447296142578125,
-0.091064453125,
0.047515869140625,
-0.0058441162109375,
-0.0122222900390625,
-0.01763916015625,
0.04925537109375,
0.0184326171875,
0.03582763671875,
0.018951416015625,
0.056732177734375,
0.061187744140625,
-0.0110931396484375,
0.07110595703125,
-0.042510986328125,
0.043731689453125,
0.058197021484375,
0.01316070556640625,
0.0509033203125,
0.031585693359375,
-0.03759765625,
0.043792724609375,
0.049468994140625,
0.0008540153503417969,
0.03558349609375,
-0.025115966796875,
-0.0231781005859375,
0.0009007453918457031,
0.0013380050659179688,
-0.048736572265625,
0.012664794921875,
0.0175933837890625,
-0.023956298828125,
0.00403594970703125,
0.007785797119140625,
0.0166778564453125,
-0.0020084381103515625,
-0.022216796875,
0.047149658203125,
-0.0028858184814453125,
-0.05352783203125,
0.0728759765625,
0.007030487060546875,
0.051300048828125,
-0.04034423828125,
0.0261993408203125,
-0.0203857421875,
0.0086822509765625,
-0.039031982421875,
-0.0821533203125,
0.00829315185546875,
-0.0019969940185546875,
-0.00817108154296875,
-0.0162811279296875,
0.055755615234375,
-0.0209503173828125,
-0.037567138671875,
0.01300048828125,
0.0088958740234375,
0.01593017578125,
0.05645751953125,
-0.06298828125,
0.007297515869140625,
0.0288543701171875,
-0.0238800048828125,
0.01507568359375,
0.028594970703125,
-0.0030269622802734375,
0.05206298828125,
0.039642333984375,
0.00360107421875,
0.007343292236328125,
-0.01024627685546875,
0.0616455078125,
-0.049835205078125,
-0.038909912109375,
-0.04168701171875,
0.032806396484375,
-0.00568389892578125,
-0.0290374755859375,
0.061370849609375,
0.0592041015625,
0.056640625,
-0.0013446807861328125,
0.07171630859375,
-0.0203399658203125,
0.06890869140625,
-0.0361328125,
0.056304931640625,
-0.06964111328125,
-0.0006847381591796875,
-0.0299835205078125,
-0.0401611328125,
-0.0289459228515625,
0.051422119140625,
-0.037261962890625,
0.006893157958984375,
0.06866455078125,
0.05633544921875,
-0.001430511474609375,
-0.006336212158203125,
0.004116058349609375,
0.0241546630859375,
0.01934814453125,
0.0684814453125,
0.0201873779296875,
-0.0675048828125,
0.0341796875,
-0.042633056640625,
0.002445220947265625,
-0.0298614501953125,
-0.0283203125,
-0.06329345703125,
-0.0304107666015625,
-0.021759033203125,
-0.0228729248046875,
-0.01337432861328125,
0.0810546875,
0.0271759033203125,
-0.06781005859375,
-0.0252532958984375,
-0.0017261505126953125,
0.020111083984375,
-0.005947113037109375,
-0.020843505859375,
0.03875732421875,
-0.0193634033203125,
-0.060882568359375,
0.0226287841796875,
0.0028533935546875,
0.0261993408203125,
0.01038360595703125,
-0.009063720703125,
-0.0196380615234375,
-0.01116180419921875,
0.02093505859375,
0.0278472900390625,
-0.0304412841796875,
-0.01013946533203125,
-0.005443572998046875,
-0.03729248046875,
0.02093505859375,
0.0084228515625,
-0.028533935546875,
0.033172607421875,
0.0509033203125,
0.0271148681640625,
0.052642822265625,
-0.008758544921875,
-0.000247955322265625,
-0.02471923828125,
0.00994110107421875,
0.0034236907958984375,
0.0230560302734375,
0.01222991943359375,
-0.02313232421875,
0.03631591796875,
0.057403564453125,
-0.0225982666015625,
-0.047515869140625,
-0.016845703125,
-0.0892333984375,
-0.032012939453125,
0.06976318359375,
0.00643157958984375,
-0.041229248046875,
0.022216796875,
-0.025360107421875,
0.0310211181640625,
-0.041748046875,
0.03955078125,
0.047271728515625,
-0.0028514862060546875,
-0.01507568359375,
-0.051177978515625,
0.02471923828125,
0.032379150390625,
-0.0655517578125,
-0.034423828125,
0.0117950439453125,
0.024322509765625,
0.032012939453125,
0.045440673828125,
-0.00934600830078125,
0.02276611328125,
-0.01416015625,
0.023193359375,
0.01258087158203125,
-0.0006327629089355469,
-0.0230255126953125,
0.0220489501953125,
-0.0025177001953125,
-0.0221405029296875
]
] |
facebook/mbart-large-cc25 | 2023-03-28T09:36:03.000Z | [
"transformers",
"pytorch",
"tf",
"mbart",
"text2text-generation",
"translation",
"en",
"ar",
"cs",
"de",
"et",
"fi",
"fr",
"gu",
"hi",
"it",
"ja",
"kk",
"ko",
"lt",
"lv",
"my",
"ne",
"nl",
"ro",
"ru",
"si",
"tr",
"vi",
"zh",
"multilingual",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | facebook | null | null | facebook/mbart-large-cc25 | 53 | 14,015 | transformers | 2022-03-02T23:29:05 | ---
tags:
- translation
language:
- en
- ar
- cs
- de
- et
- fi
- fr
- gu
- hi
- it
- ja
- kk
- ko
- lt
- lv
- my
- ne
- nl
- ro
- ru
- si
- tr
- vi
- zh
- multilingual
---
#### mbart-large-cc25
A pretrained (not fine-tuned) multilingual mBART model.
Original Languages
```
export langs=ar_AR,cs_CZ,de_DE,en_XX,es_XX,et_EE,fi_FI,fr_XX,gu_IN,hi_IN,it_IT,ja_XX,kk_KZ,ko_KR,lt_LT,lv_LV,my_MM,ne_NP,nl_XX,ro_RO,ru_RU,si_LK,tr_TR,vi_VN,zh_CN
```
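Should these codes need to be handled programmatically (for example, to pick the right source/target language token when fine-tuning), the export above can be parsed in Python. A minimal sketch, with the codes copied verbatim from the list above:

```python
# The 25 pretraining language codes from the fairseq export above.
langs = (
    "ar_AR,cs_CZ,de_DE,en_XX,es_XX,et_EE,fi_FI,fr_XX,gu_IN,hi_IN,"
    "it_IT,ja_XX,kk_KZ,ko_KR,lt_LT,lv_LV,my_MM,ne_NP,nl_XX,ro_RO,"
    "ru_RU,si_LK,tr_TR,vi_VN,zh_CN"
).split(",")

# Each code is "<language>_<region>"; mBART uses these strings as
# special tokens to mark the source and target language of a sequence.
lang_by_prefix = {code.split("_")[0]: code for code in langs}
print(len(langs))              # 25
print(lang_by_prefix["en"])    # en_XX
```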
Original Code: https://github.com/pytorch/fairseq/tree/master/examples/mbart
Docs: https://huggingface.co/transformers/master/model_doc/mbart.html
Finetuning Code: examples/seq2seq/finetune.py (as of Aug 20, 2020)
It can also be fine-tuned for summarization. | 699 | [
[
-0.04376220703125,
-0.03765869140625,
0.0015125274658203125,
0.03033447265625,
-0.02276611328125,
-0.0026416778564453125,
-0.046417236328125,
0.00382232666015625,
0.008941650390625,
0.04730224609375,
-0.05322265625,
-0.038055419921875,
-0.02569580078125,
0.00310516357421875,
-0.019256591796875,
0.09515380859375,
0.0025653839111328125,
0.018707275390625,
0.00940704345703125,
-0.00762939453125,
-0.015655517578125,
-0.040496826171875,
-0.0528564453125,
-0.0096435546875,
0.0213775634765625,
0.058013916015625,
0.036163330078125,
-0.0026836395263671875,
0.041015625,
0.0290679931640625,
-0.01442718505859375,
-0.005725860595703125,
-0.0248565673828125,
-0.0164947509765625,
-0.006076812744140625,
-0.00890350341796875,
-0.05828857421875,
-0.0049591064453125,
0.05706787109375,
0.04302978515625,
-0.016082763671875,
0.03253173828125,
0.0081024169921875,
0.0467529296875,
-0.020751953125,
0.02239990234375,
-0.0328369140625,
0.00658416748046875,
-0.0125274658203125,
0.00847625732421875,
-0.038604736328125,
-0.0007486343383789062,
-0.0122222900390625,
-0.0238189697265625,
0.007160186767578125,
-0.0262298583984375,
0.0784912109375,
-0.0020809173583984375,
-0.05816650390625,
0.01186370849609375,
-0.049041748046875,
0.0614013671875,
-0.050323486328125,
0.041015625,
0.02880859375,
0.060028076171875,
0.00897979736328125,
-0.07476806640625,
-0.04034423828125,
-0.028564453125,
-0.001079559326171875,
0.03143310546875,
-0.0004050731658935547,
0.000675201416015625,
0.03021240234375,
0.0394287109375,
-0.0290069580078125,
-0.004802703857421875,
-0.0675048828125,
-0.0196533203125,
0.05645751953125,
0.006526947021484375,
0.0038394927978515625,
-0.004726409912109375,
-0.0169219970703125,
-0.0184326171875,
-0.037322998046875,
0.011749267578125,
0.021636962890625,
0.02093505859375,
-0.0430908203125,
0.056732177734375,
-0.021759033203125,
0.04571533203125,
0.004673004150390625,
-0.0028705596923828125,
0.042816162109375,
-0.0362548828125,
-0.018035888671875,
-0.018524169921875,
0.07672119140625,
0.0143585205078125,
0.049224853515625,
0.0165252685546875,
-0.00653839111328125,
-0.02423095703125,
0.00791168212890625,
-0.07501220703125,
-0.037841796875,
0.0041046142578125,
-0.026397705078125,
0.006168365478515625,
0.01488494873046875,
-0.034637451171875,
0.017730712890625,
-0.029998779296875,
0.024932861328125,
-0.04248046875,
-0.0323486328125,
-0.0027904510498046875,
0.01117706298828125,
0.028839111328125,
0.00951385498046875,
-0.06903076171875,
0.02081298828125,
0.038726806640625,
0.0404052734375,
0.01212310791015625,
-0.042694091796875,
-0.0254364013671875,
-0.0030422210693359375,
-0.00537109375,
0.033233642578125,
-0.0156707763671875,
-0.0125885009765625,
-0.0107574462890625,
0.03985595703125,
-0.003032684326171875,
-0.0258636474609375,
0.05682373046875,
-0.01715087890625,
0.01375579833984375,
-0.0058746337890625,
-0.00768280029296875,
-0.0197906494140625,
0.02337646484375,
-0.047607421875,
0.068115234375,
0.02423095703125,
-0.04620361328125,
0.0003743171691894531,
-0.049407958984375,
-0.050384521484375,
-0.01218414306640625,
-0.01030731201171875,
-0.0318603515625,
-0.004367828369140625,
0.02301025390625,
0.04296875,
-0.00814056396484375,
0.039398193359375,
0.0021648406982421875,
-0.0187225341796875,
0.015899658203125,
-0.031982421875,
0.09124755859375,
0.0400390625,
-0.0079193115234375,
0.0178375244140625,
-0.0740966796875,
-0.00031065940856933594,
0.004169464111328125,
-0.0308990478515625,
0.01666259765625,
-0.024322509765625,
0.0277862548828125,
0.0261993408203125,
0.026214599609375,
-0.037628173828125,
0.01580810546875,
-0.023284912109375,
0.039520263671875,
0.0268096923828125,
-0.01090240478515625,
0.0207977294921875,
-0.0013551712036132812,
0.0538330078125,
0.0209503173828125,
0.0216064453125,
-0.0238037109375,
-0.03656005859375,
-0.07208251953125,
-0.036529541015625,
0.0222625732421875,
0.056121826171875,
-0.041229248046875,
0.0404052734375,
-0.024261474609375,
-0.05755615234375,
-0.02398681640625,
-0.0007596015930175781,
0.015716552734375,
0.01262664794921875,
0.0223541259765625,
-0.0163726806640625,
-0.0518798828125,
-0.08123779296875,
0.00603485107421875,
0.01091766357421875,
-0.00197601318359375,
-0.00040602684020996094,
0.03570556640625,
-0.03662109375,
0.049896240234375,
-0.0287628173828125,
-0.01059722900390625,
-0.029632568359375,
-0.0084075927734375,
0.037445068359375,
0.0469970703125,
0.050811767578125,
-0.042388916015625,
-0.044708251953125,
0.0115509033203125,
-0.0179595947265625,
0.0087127685546875,
0.005435943603515625,
-0.0206451416015625,
0.0177764892578125,
0.057159423828125,
-0.046539306640625,
0.0132598876953125,
0.057525634765625,
-0.0118255615234375,
0.04998779296875,
-0.01541900634765625,
0.003955841064453125,
-0.103515625,
0.016387939453125,
-0.0283050537109375,
-0.0235443115234375,
-0.0164947509765625,
0.01319122314453125,
0.0255126953125,
-0.0251617431640625,
-0.052490234375,
0.0396728515625,
-0.034698486328125,
0.0090179443359375,
0.0010471343994140625,
0.005096435546875,
-0.0031585693359375,
0.04132080078125,
-0.00408935546875,
0.043548583984375,
0.04302978515625,
-0.050262451171875,
0.055633544921875,
0.0287933349609375,
-0.0193634033203125,
0.0304718017578125,
-0.040618896484375,
-0.0196075439453125,
-0.0012531280517578125,
0.02294921875,
-0.061309814453125,
-0.0026645660400390625,
0.01287078857421875,
-0.03857421875,
0.031341552734375,
-0.027618408203125,
-0.0239410400390625,
-0.037353515625,
-0.01044464111328125,
0.041290283203125,
0.039520263671875,
-0.02764892578125,
0.021514892578125,
0.0144805908203125,
-0.0184478759765625,
-0.052978515625,
-0.07080078125,
0.0037937164306640625,
-0.00887298583984375,
-0.04351806640625,
0.0010728836059570312,
-0.0253143310546875,
0.0038509368896484375,
-0.023712158203125,
0.0135040283203125,
-0.0193328857421875,
0.003932952880859375,
0.0028133392333984375,
0.00958251953125,
-0.038360595703125,
0.00241851806640625,
0.00394439697265625,
-0.02105712890625,
-0.0112762451171875,
0.00504302978515625,
0.053924560546875,
-0.03204345703125,
-0.0172119140625,
-0.0296173095703125,
0.01129150390625,
0.033203125,
-0.05816650390625,
0.059051513671875,
0.09808349609375,
-0.03350830078125,
0.0100250244140625,
-0.025054931640625,
-0.014007568359375,
-0.0391845703125,
0.0504150390625,
-0.036285400390625,
-0.07366943359375,
0.047332763671875,
0.0026493072509765625,
-0.002429962158203125,
0.07366943359375,
0.055908203125,
0.0144500732421875,
0.05438232421875,
0.042572021484375,
-0.00421142578125,
0.044097900390625,
-0.040924072265625,
0.0185699462890625,
-0.0609130859375,
-0.0079193115234375,
-0.0455322265625,
-0.0011072158813476562,
-0.06695556640625,
-0.0122222900390625,
-0.0076904296875,
0.0188140869140625,
-0.051788330078125,
0.04132080078125,
-0.00911712646484375,
0.039703369140625,
0.046630859375,
-0.01337432861328125,
0.0145263671875,
0.0052642822265625,
-0.028717041015625,
-0.0087432861328125,
-0.0482177734375,
-0.037933349609375,
0.08074951171875,
0.019256591796875,
0.040771484375,
0.0255126953125,
0.04473876953125,
-0.030853271484375,
0.019989013671875,
-0.05718994140625,
0.035125732421875,
-0.0264129638671875,
-0.07391357421875,
-0.0182342529296875,
-0.034271240234375,
-0.072021484375,
0.0083465576171875,
-0.01837158203125,
-0.056610107421875,
0.0007424354553222656,
0.00860595703125,
-0.0130157470703125,
0.01171875,
-0.05401611328125,
0.09320068359375,
-0.017852783203125,
-0.005329132080078125,
-0.01425933837890625,
-0.046478271484375,
0.0452880859375,
-0.016754150390625,
0.018707275390625,
-0.006561279296875,
0.029815673828125,
0.041290283203125,
-0.026702880859375,
0.046539306640625,
0.00035953521728515625,
-0.00209808349609375,
0.02532958984375,
0.0009632110595703125,
0.017974853515625,
0.0137939453125,
-0.004207611083984375,
0.007419586181640625,
0.0173187255859375,
-0.029998779296875,
0.0020771026611328125,
0.05279541015625,
-0.055938720703125,
-0.01395416259765625,
-0.0311737060546875,
-0.03594970703125,
-0.00421142578125,
0.033294677734375,
0.03375244140625,
0.0401611328125,
-0.040496826171875,
0.03253173828125,
0.0194244384765625,
-0.04779052734375,
0.041473388671875,
0.040283203125,
-0.0293731689453125,
-0.04302978515625,
0.061798095703125,
0.00283050537109375,
0.0217742919921875,
0.0279541015625,
0.016204833984375,
-0.006195068359375,
-0.025421142578125,
-0.0247344970703125,
0.026214599609375,
-0.046722412109375,
-0.020172119140625,
-0.04925537109375,
-0.03759765625,
-0.051116943359375,
0.01032257080078125,
-0.05218505859375,
-0.05377197265625,
-0.0079193115234375,
-0.01097869873046875,
0.0238189697265625,
0.0391845703125,
-0.0211181640625,
0.049560546875,
-0.0797119140625,
0.0312347412109375,
0.0166778564453125,
0.03582763671875,
-0.0026187896728515625,
-0.0491943359375,
-0.056121826171875,
0.027496337890625,
-0.0491943359375,
-0.0411376953125,
0.03131103515625,
0.0277099609375,
0.052703857421875,
0.05328369140625,
0.01256561279296875,
0.04443359375,
-0.047943115234375,
0.05633544921875,
0.01245880126953125,
-0.0731201171875,
0.0214691162109375,
-0.030059814453125,
0.034820556640625,
0.0458984375,
0.03204345703125,
-0.055450439453125,
-0.024078369140625,
-0.051666259765625,
-0.0828857421875,
0.0633544921875,
0.00974273681640625,
0.027099609375,
-0.0023097991943359375,
-0.01338958740234375,
0.003726959228515625,
0.0227203369140625,
-0.084228515625,
-0.02587890625,
-0.021484375,
-0.033172607421875,
-0.0455322265625,
-0.025604248046875,
-0.004566192626953125,
-0.05523681640625,
0.06610107421875,
0.00830078125,
0.0262451171875,
-0.00434112548828125,
-0.025115966796875,
0.0057373046875,
0.0164794921875,
0.06744384765625,
0.0531005859375,
-0.0293731689453125,
0.00949859619140625,
0.012664794921875,
-0.053070068359375,
-0.003017425537109375,
0.01514434814453125,
0.0237579345703125,
-0.01073455810546875,
0.0228424072265625,
0.05889892578125,
-0.006252288818359375,
-0.050048828125,
0.0462646484375,
-0.0027408599853515625,
-0.004947662353515625,
-0.062042236328125,
-0.008392333984375,
0.0286865234375,
0.02520751953125,
0.007411956787109375,
-0.006320953369140625,
0.0013971328735351562,
-0.03912353515625,
0.0174560546875,
0.0277252197265625,
-0.02899169921875,
-0.02423095703125,
0.037994384765625,
0.00939178466796875,
-0.013031005859375,
0.064453125,
-0.0267333984375,
-0.04632568359375,
0.051025390625,
0.040679931640625,
0.052703857421875,
-0.038238525390625,
-0.0027866363525390625,
0.059051513671875,
0.05810546875,
0.0036716461181640625,
0.02532958984375,
0.0017147064208984375,
-0.049835205078125,
-0.031494140625,
-0.0433349609375,
-0.01247406005859375,
0.006046295166015625,
-0.04718017578125,
0.043243408203125,
-0.0290069580078125,
-0.003368377685546875,
-0.0078125,
-0.0004405975341796875,
-0.059539794921875,
0.018402099609375,
0.00775146484375,
0.07135009765625,
-0.0645751953125,
0.08349609375,
0.058685302734375,
-0.039764404296875,
-0.06414794921875,
-0.02001953125,
-0.0289306640625,
-0.038726806640625,
0.035003662109375,
0.0179595947265625,
0.0177764892578125,
0.010467529296875,
-0.042144775390625,
-0.06036376953125,
0.07623291015625,
0.038055419921875,
-0.0421142578125,
0.03900146484375,
0.01116180419921875,
0.032806396484375,
-0.0165252685546875,
0.03057861328125,
0.034423828125,
0.044952392578125,
0.004337310791015625,
-0.07977294921875,
-0.0216064453125,
-0.037261962890625,
-0.03363037109375,
0.0135955810546875,
-0.05908203125,
0.08697509765625,
-0.0161285400390625,
0.0048675537109375,
0.006954193115234375,
0.044891357421875,
0.007488250732421875,
0.01324462890625,
0.0084075927734375,
0.040618896484375,
0.0308990478515625,
-0.0113372802734375,
0.06591796875,
-0.021331787109375,
0.0443115234375,
0.08709716796875,
0.012451171875,
0.0665283203125,
0.016265869140625,
-0.03173828125,
0.0178680419921875,
0.05340576171875,
-0.0133209228515625,
0.0171966552734375,
0.016448974609375,
0.00937652587890625,
-0.0199432373046875,
0.0247650146484375,
-0.051025390625,
0.038787841796875,
0.0234832763671875,
-0.02008056640625,
0.0012903213500976562,
0.005336761474609375,
0.035430908203125,
-0.03240966796875,
-0.0241851806640625,
0.025177001953125,
0.0249481201171875,
-0.038421630859375,
0.05853271484375,
0.0259552001953125,
0.0509033203125,
-0.048187255859375,
0.0168304443359375,
-0.028656005859375,
0.0274810791015625,
-0.01242828369140625,
-0.037139892578125,
0.0269012451171875,
-0.017852783203125,
0.00449371337890625,
-0.004360198974609375,
0.0301361083984375,
-0.0513916015625,
-0.0780029296875,
0.01149749755859375,
0.036773681640625,
0.0193634033203125,
-0.01532745361328125,
-0.0430908203125,
-0.0249176025390625,
0.00762939453125,
-0.032562255859375,
-0.00069427490234375,
0.04815673828125,
-0.010162353515625,
0.0433349609375,
0.0241851806640625,
-0.0186920166015625,
0.015655517578125,
0.00994110107421875,
0.045440673828125,
-0.06671142578125,
-0.040679931640625,
-0.0521240234375,
0.04913330078125,
0.015655517578125,
-0.0268402099609375,
0.057647705078125,
0.056732177734375,
0.07354736328125,
-0.034454345703125,
0.032379150390625,
0.01171112060546875,
0.013946533203125,
-0.031951904296875,
0.076171875,
-0.0404052734375,
-0.0156402587890625,
-0.043853759765625,
-0.09246826171875,
-0.02276611328125,
0.059326171875,
0.0025768280029296875,
0.0295257568359375,
0.07220458984375,
0.040557861328125,
-0.03839111328125,
-0.0150299072265625,
0.022216796875,
0.0439453125,
0.00852203369140625,
0.034149169921875,
0.04833984375,
-0.03369140625,
0.05023193359375,
-0.0288238525390625,
-0.006988525390625,
-0.0199737548828125,
-0.0633544921875,
-0.06585693359375,
-0.057708740234375,
-0.0274658203125,
-0.0112457275390625,
-0.03314208984375,
0.0687255859375,
0.05322265625,
-0.0675048828125,
-0.041015625,
0.0189971923828125,
-0.01270294189453125,
-0.0035953521728515625,
-0.0115814208984375,
0.035888671875,
-0.00046133995056152344,
-0.04510498046875,
0.00763702392578125,
-0.01763916015625,
0.021087646484375,
-0.00638580322265625,
-0.0209197998046875,
-0.025726318359375,
-0.006366729736328125,
0.05352783203125,
0.0089569091796875,
-0.027435302734375,
0.0010137557983398438,
-0.0191192626953125,
-0.005451202392578125,
0.0077056884765625,
0.03857421875,
-0.0286865234375,
0.00850677490234375,
0.0277252197265625,
0.016265869140625,
0.051239013671875,
-0.0263214111328125,
0.035491943359375,
-0.061737060546875,
0.054962158203125,
-0.0081329345703125,
0.041595458984375,
0.028656005859375,
0.007381439208984375,
0.0308685302734375,
-0.00036525726318359375,
-0.037322998046875,
-0.037628173828125,
0.0248870849609375,
-0.10064697265625,
-0.0057220458984375,
0.08050537109375,
-0.0078277587890625,
-0.00997161865234375,
0.01519775390625,
-0.044158935546875,
0.04766845703125,
-0.0105438232421875,
0.0086669921875,
0.05255126953125,
0.032257080078125,
-0.0218048095703125,
-0.058013916015625,
0.019561767578125,
0.0172119140625,
-0.0552978515625,
-0.00788116455078125,
0.03057861328125,
0.02197265625,
-0.0008234977722167969,
0.044525146484375,
-0.0208282470703125,
0.007343292236328125,
-0.007251739501953125,
0.044708251953125,
-0.0106964111328125,
-0.02349853515625,
-0.0228271484375,
-0.0311279296875,
0.0243072509765625,
-0.019927978515625
]
] |
aubmindlab/araelectra-base-generator | 2023-03-22T13:55:02.000Z | [
"transformers",
"pytorch",
"tf",
"tensorboard",
"safetensors",
"electra",
"fill-mask",
"ar",
"dataset:wikipedia",
"dataset:Osian",
"dataset:1.5B-Arabic-Corpus",
"dataset:oscar-arabic-unshuffled",
"dataset:Assafir(private)",
"arxiv:1406.2661",
"arxiv:2012.15516",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | aubmindlab | null | null | aubmindlab/araelectra-base-generator | 1 | 13,999 | transformers | 2022-03-02T23:29:05 | ---
language: ar
datasets:
- wikipedia
- Osian
- 1.5B-Arabic-Corpus
- oscar-arabic-unshuffled
- Assafir(private)
widget:
- text: " عاصمة لبنان هي [MASK] ."
---
# AraELECTRA
<img src="https://raw.githubusercontent.com/aub-mind/arabert/master/AraELECTRA.png" width="100" align="left"/>
**ELECTRA** is a method for self-supervised language representation learning. It can be used to pre-train transformer networks using relatively little compute. ELECTRA models are trained to distinguish "real" input tokens from "fake" input tokens generated by another neural network, similar to the discriminator of a [GAN](https://arxiv.org/pdf/1406.2661.pdf). AraELECTRA achieves state-of-the-art results on Arabic QA datasets.
For a detailed description, please refer to the AraELECTRA paper [AraELECTRA: Pre-Training Text Discriminators for Arabic Language Understanding](https://arxiv.org/abs/2012.15516).
## How to use the generator in `transformers`
```python
from transformers import pipeline
fill_mask = pipeline(
"fill-mask",
model="aubmindlab/araelectra-base-generator",
tokenizer="aubmindlab/araelectra-base-generator"
)
print(
fill_mask(" عاصمة لبنان هي [MASK] .)
)
```
# Preprocessing
It is recommended to apply our preprocessing function before training/testing on any dataset.
**Install the `arabert` Python package to segment text for AraBERT v1 & v2 or to clean your data: `pip install arabert`**
```python
from arabert.preprocess import ArabertPreprocessor
model_name="aubmindlab/araelectra-base"
arabert_prep = ArabertPreprocessor(model_name=model_name)
text = "ولن نبالغ إذا قلنا إن هاتف أو كمبيوتر المكتب في زمننا هذا ضروري"
arabert_prep.preprocess(text)
>>> output: ولن نبالغ إذا قلنا : إن هاتف أو كمبيوتر المكتب في زمننا هذا ضروري
```
# Model
Model | HuggingFace Model Name | Size (MB/Params)|
---|:---:|:---:
AraELECTRA-base-generator | [araelectra-base-generator](https://huggingface.co/aubmindlab/araelectra-base-generator) | 227MB/60M |
AraELECTRA-base-discriminator | [araelectra-base-discriminator](https://huggingface.co/aubmindlab/araelectra-base-discriminator) | 516MB/135M |
# Compute
Model | Hardware | num of examples (seq len = 512) | Batch Size | Num of Steps | Time (in days)
---|:---:|:---:|:---:|:---:|:---:
AraELECTRA-base | TPUv3-8 | - | 256 | 2M | 24
# Dataset
The pretraining data used for the new AraELECTRA model is also used for **AraGPT2 and AraBERTv2**.
The dataset consists of 77GB of text, or 200,095,961 lines, 8,655,948,860 words, and 82,232,988,358 characters (before applying Farasa segmentation).
For the new dataset we added the unshuffled OSCAR corpus, after thoroughly filtering it, to the dataset used in AraBERTv1, but without the websites that we previously crawled:
- OSCAR unshuffled and filtered.
- [Arabic Wikipedia dump](https://archive.org/details/arwiki-20190201) from 2020/09/01
- [The 1.5B words Arabic Corpus](https://www.semanticscholar.org/paper/1.5-billion-words-Arabic-Corpus-El-Khair/f3eeef4afb81223df96575adadf808fe7fe440b4)
- [The OSIAN Corpus](https://www.aclweb.org/anthology/W19-4619)
- Assafir news articles. A huge thank you to Assafir for providing the data.
# TensorFlow 1.x models
**You can find the PyTorch, TF2 and TF1 models in HuggingFace's Transformers library under the `aubmindlab` username**
- `wget https://huggingface.co/aubmindlab/MODEL_NAME/resolve/main/tf1_model.tar.gz` where `MODEL_NAME` is any model under the `aubmindlab` name
# If you used this model please cite us as:
```
@inproceedings{antoun-etal-2021-araelectra,
title = "{A}ra{ELECTRA}: Pre-Training Text Discriminators for {A}rabic Language Understanding",
author = "Antoun, Wissam and
Baly, Fady and
Hajj, Hazem",
booktitle = "Proceedings of the Sixth Arabic Natural Language Processing Workshop",
month = apr,
year = "2021",
address = "Kyiv, Ukraine (Virtual)",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2021.wanlp-1.20",
pages = "191--195",
}
```
# Acknowledgments
Thanks to TensorFlow Research Cloud (TFRC) for free access to Cloud TPUs; we couldn't have done it without this program. Thanks also to the [AUB MIND Lab](https://sites.aub.edu.lb/mindlab/) members for their continuous support, to [Yakshof](https://www.yakshof.com/#/) and Assafir for data and storage access, and to [Habib Rahal](https://www.behance.net/rahalhabib) for putting a face to AraBERT.
# Contacts
**Wissam Antoun**: [Linkedin](https://www.linkedin.com/in/wissam-antoun-622142b4/) | [Twitter](https://twitter.com/wissam_antoun) | [Github](https://github.com/WissamAntoun) | <wfa07@mail.aub.edu> | <wissam.antoun@gmail.com>
**Fady Baly**: [Linkedin](https://www.linkedin.com/in/fadybaly/) | [Twitter](https://twitter.com/fadybaly) | [Github](https://github.com/fadybaly) | <fgb06@mail.aub.edu> | <baly.fady@gmail.com>
| 4,910 | [
[
-0.045989990234375,
-0.045501708984375,
0.022064208984375,
-0.004154205322265625,
-0.0197601318359375,
0.00838470458984375,
-0.00315093994140625,
-0.032012939453125,
0.0258331298828125,
0.02496337890625,
-0.02996826171875,
-0.0521240234375,
-0.060821533203125,
0.01401519775390625,
-0.0455322265625,
0.0816650390625,
-0.0283050537109375,
-0.01154327392578125,
0.01523590087890625,
-0.039093017578125,
-0.002574920654296875,
-0.0265655517578125,
-0.043609619140625,
-0.035125732421875,
0.019805908203125,
0.03240966796875,
0.0523681640625,
0.03350830078125,
0.01204681396484375,
0.03167724609375,
-0.0056610107421875,
0.0150604248046875,
-0.008514404296875,
-0.004222869873046875,
0.005161285400390625,
-0.0227813720703125,
-0.0209197998046875,
-0.0011920928955078125,
0.056060791015625,
0.032012939453125,
-0.003978729248046875,
0.01605224609375,
-0.0112762451171875,
0.05865478515625,
-0.039764404296875,
-0.005992889404296875,
-0.0211181640625,
-0.00011730194091796875,
-0.0199737548828125,
0.01012420654296875,
-0.00795745849609375,
-0.030517578125,
0.0107421875,
-0.0318603515625,
0.020355224609375,
0.0095367431640625,
0.10333251953125,
0.0283966064453125,
-0.020965576171875,
-0.021484375,
-0.04107666015625,
0.060394287109375,
-0.04443359375,
0.041290283203125,
0.040313720703125,
0.007030487060546875,
0.002094268798828125,
-0.06201171875,
-0.05810546875,
-0.01538848876953125,
-0.00865936279296875,
0.0081329345703125,
-0.044342041015625,
-0.0030002593994140625,
0.009490966796875,
0.030792236328125,
-0.040924072265625,
-0.0004100799560546875,
-0.036529541015625,
-0.0219268798828125,
0.04364013671875,
-0.01496124267578125,
0.0229339599609375,
-0.0026454925537109375,
-0.0157928466796875,
-0.03094482421875,
-0.0391845703125,
0.007053375244140625,
0.0318603515625,
-0.0013713836669921875,
-0.036224365234375,
0.0318603515625,
-0.004596710205078125,
0.03924560546875,
0.019500732421875,
-0.0028820037841796875,
0.047515869140625,
-0.01401519775390625,
-0.0169830322265625,
0.0149688720703125,
0.08203125,
-0.0030956268310546875,
0.0170440673828125,
-0.01514434814453125,
-0.0062713623046875,
0.01270294189453125,
0.0095672607421875,
-0.08062744140625,
-0.0247650146484375,
0.033050537109375,
-0.032867431640625,
-0.0212860107421875,
-0.007843017578125,
-0.060089111328125,
-0.00856781005859375,
0.0006623268127441406,
0.032257080078125,
-0.05267333984375,
-0.032318115234375,
0.00867462158203125,
-0.00395965576171875,
0.0205230712890625,
0.00868988037109375,
-0.0552978515625,
0.02850341796875,
0.0194091796875,
0.06829833984375,
0.002292633056640625,
-0.026641845703125,
-0.0233917236328125,
-0.0122833251953125,
-0.0133209228515625,
0.056884765625,
-0.02203369140625,
-0.017120361328125,
-0.01042938232421875,
-0.007007598876953125,
-0.0157928466796875,
-0.0361328125,
0.047821044921875,
-0.044952392578125,
0.020965576171875,
0.004917144775390625,
-0.0416259765625,
-0.03668212890625,
-0.0009918212890625,
-0.0543212890625,
0.08172607421875,
0.022857666015625,
-0.06329345703125,
0.0095062255859375,
-0.0653076171875,
-0.0306854248046875,
-0.007205963134765625,
-0.0145721435546875,
-0.061553955078125,
-0.01238250732421875,
0.0298004150390625,
0.028900146484375,
-0.01485443115234375,
0.0037021636962890625,
0.00580596923828125,
-0.012908935546875,
0.02618408203125,
-0.0114593505859375,
0.0714111328125,
0.0272674560546875,
-0.046173095703125,
0.01221466064453125,
-0.06732177734375,
-0.0029888153076171875,
0.0211944580078125,
-0.01047515869140625,
-0.0004353523254394531,
-0.006649017333984375,
0.0203399658203125,
0.0296173095703125,
0.01457977294921875,
-0.049591064453125,
-0.0002574920654296875,
-0.050628662109375,
0.032562255859375,
0.050628662109375,
-0.0135040283203125,
0.02752685546875,
-0.033843994140625,
0.04913330078125,
0.0188751220703125,
-0.0136260986328125,
0.00717926025390625,
-0.029693603515625,
-0.08465576171875,
-0.0174102783203125,
0.024932861328125,
0.032501220703125,
-0.0560302734375,
0.0382080078125,
-0.019287109375,
-0.06463623046875,
-0.06536865234375,
0.00537872314453125,
0.033355712890625,
0.03790283203125,
0.041656494140625,
-0.017974853515625,
-0.0499267578125,
-0.0672607421875,
-0.0142822265625,
-0.01873779296875,
0.01465606689453125,
0.0143585205078125,
0.056182861328125,
-0.031341552734375,
0.0701904296875,
-0.03375244140625,
-0.024932861328125,
-0.0274810791015625,
0.01326751708984375,
0.020904541015625,
0.04217529296875,
0.056884765625,
-0.053955078125,
-0.031158447265625,
-0.004730224609375,
-0.039764404296875,
-0.005645751953125,
0.0085906982421875,
-0.0104827880859375,
0.02496337890625,
0.0205230712890625,
-0.05328369140625,
0.0300750732421875,
0.04791259765625,
-0.050994873046875,
0.04571533203125,
0.0008230209350585938,
0.006626129150390625,
-0.094970703125,
0.0182342529296875,
0.006561279296875,
-0.019195556640625,
-0.04345703125,
-0.0020351409912109375,
-0.0108642578125,
0.0037689208984375,
-0.036956787109375,
0.055694580078125,
-0.028106689453125,
0.013031005859375,
-0.0164947509765625,
-0.0083770751953125,
-0.0010137557983398438,
0.045989990234375,
-0.005748748779296875,
0.08551025390625,
0.042022705078125,
-0.0550537109375,
0.005680084228515625,
0.049713134765625,
-0.031585693359375,
0.00000858306884765625,
-0.06011962890625,
0.0187835693359375,
-0.00995635986328125,
0.0126800537109375,
-0.07818603515625,
-0.00691986083984375,
0.03472900390625,
-0.043243408203125,
0.026702880859375,
-0.0047149658203125,
-0.037200927734375,
-0.0191192626953125,
-0.0240478515625,
0.032989501953125,
0.048248291015625,
-0.0462646484375,
0.05615234375,
0.033050537109375,
-0.0017557144165039062,
-0.045928955078125,
-0.04412841796875,
-0.0128173828125,
-0.0198211669921875,
-0.041259765625,
0.04534912109375,
0.0032978057861328125,
-0.006542205810546875,
0.0007829666137695312,
-0.0029048919677734375,
-0.0105743408203125,
0.0110321044921875,
0.02874755859375,
0.007701873779296875,
-0.00872039794921875,
0.0017671585083007812,
0.0087432861328125,
-0.0174713134765625,
0.00635528564453125,
-0.0168304443359375,
0.066650390625,
-0.031829833984375,
-0.004505157470703125,
-0.047119140625,
0.0204010009765625,
0.031646728515625,
-0.0372314453125,
0.0755615234375,
0.06854248046875,
-0.0216217041015625,
-0.0005578994750976562,
-0.04071044921875,
-0.0028514862060546875,
-0.039642333984375,
0.0311279296875,
-0.01873779296875,
-0.07293701171875,
0.03515625,
-0.00490570068359375,
0.01186370849609375,
0.0703125,
0.05084228515625,
0.0011510848999023438,
0.080322265625,
0.04083251953125,
-0.0185089111328125,
0.04248046875,
-0.051605224609375,
0.01349639892578125,
-0.060394287109375,
-0.01029205322265625,
-0.05126953125,
-0.0188446044921875,
-0.056793212890625,
-0.01064300537109375,
0.016021728515625,
-0.002643585205078125,
-0.040679931640625,
0.030120849609375,
-0.048095703125,
0.006526947021484375,
0.037109375,
0.00890350341796875,
-0.002285003662109375,
0.0207061767578125,
-0.020965576171875,
0.007671356201171875,
-0.045074462890625,
-0.046478271484375,
0.08367919921875,
0.024322509765625,
0.03424072265625,
0.005771636962890625,
0.068115234375,
0.018280029296875,
0.022125244140625,
-0.047943115234375,
0.03631591796875,
0.004199981689453125,
-0.03741455078125,
-0.00724029541015625,
-0.02264404296875,
-0.09368896484375,
0.01177978515625,
-0.0019474029541015625,
-0.06634521484375,
0.0159149169921875,
0.0019321441650390625,
-0.030120849609375,
0.033599853515625,
-0.04852294921875,
0.059051513671875,
-0.006988525390625,
-0.018951416015625,
-0.015625,
-0.05029296875,
0.0024394989013671875,
-0.0024356842041015625,
0.0120086669921875,
-0.006191253662109375,
0.007350921630859375,
0.10418701171875,
-0.054718017578125,
0.039093017578125,
-0.0077056884765625,
0.002056121826171875,
0.03167724609375,
-0.00817108154296875,
0.0229034423828125,
-0.0224761962890625,
-0.0147552490234375,
0.035614013671875,
0.0012960433959960938,
-0.0260772705078125,
-0.01329803466796875,
0.0469970703125,
-0.100341796875,
-0.0260162353515625,
-0.054718017578125,
-0.021820068359375,
0.0029048919677734375,
0.020721435546875,
0.03997802734375,
0.0265045166015625,
-0.0017385482788085938,
-0.00443267822265625,
0.0307769775390625,
-0.019073486328125,
0.0390625,
0.021087646484375,
-0.0165863037109375,
-0.036651611328125,
0.063720703125,
0.006977081298828125,
-0.009185791015625,
0.016937255859375,
0.005115509033203125,
-0.027679443359375,
-0.044036865234375,
-0.035552978515625,
0.0308837890625,
-0.0501708984375,
-0.024658203125,
-0.0653076171875,
-0.0253448486328125,
-0.030853271484375,
0.00246429443359375,
-0.03094482421875,
-0.036865234375,
-0.031951904296875,
-0.01459503173828125,
0.05767822265625,
0.04364013671875,
0.002727508544921875,
0.047149658203125,
-0.05670166015625,
0.0265655517578125,
-0.00466156005859375,
0.01352691650390625,
0.0075836181640625,
-0.05035400390625,
-0.02386474609375,
0.007762908935546875,
-0.04400634765625,
-0.0731201171875,
0.051910400390625,
0.01776123046875,
0.020904541015625,
0.0293731689453125,
-0.005748748779296875,
0.041595458984375,
-0.03857421875,
0.051605224609375,
0.006404876708984375,
-0.0640869140625,
0.04571533203125,
-0.0012998580932617188,
0.024505615234375,
0.052947998046875,
0.041595458984375,
-0.035125732421875,
-0.0157470703125,
-0.0631103515625,
-0.0684814453125,
0.066162109375,
0.04302978515625,
-0.00685882568359375,
0.0015468597412109375,
0.0179443359375,
0.0021038055419921875,
0.013397216796875,
-0.044677734375,
-0.060302734375,
-0.019073486328125,
-0.03375244140625,
-0.0006055831909179688,
-0.01422119140625,
0.0029048919677734375,
-0.03814697265625,
0.0828857421875,
0.013671875,
0.0443115234375,
0.0265655517578125,
-0.0215606689453125,
-0.00783538818359375,
0.02264404296875,
0.0247955322265625,
0.040557861328125,
-0.00923919677734375,
-0.0003077983856201172,
0.0099029541015625,
-0.056549072265625,
0.018280029296875,
0.04107666015625,
-0.02655029296875,
0.0162811279296875,
0.0182342529296875,
0.06671142578125,
-0.0091705322265625,
-0.042327880859375,
0.0239715576171875,
-0.010345458984375,
-0.0239105224609375,
-0.040557861328125,
-0.0014257431030273438,
-0.004474639892578125,
0.0129547119140625,
0.0394287109375,
0.00849151611328125,
0.00899505615234375,
-0.0167236328125,
0.01535797119140625,
0.0204010009765625,
-0.0141754150390625,
-0.0267791748046875,
0.05401611328125,
-0.0018510818481445312,
-0.02716064453125,
0.053558349609375,
-0.0153350830078125,
-0.068359375,
0.051055908203125,
0.037689208984375,
0.0740966796875,
-0.029754638671875,
0.0186614990234375,
0.04345703125,
0.008209228515625,
0.0182342529296875,
0.02752685546875,
-0.00933074951171875,
-0.0703125,
-0.0205841064453125,
-0.066650390625,
-0.0158843994140625,
0.0171051025390625,
-0.0311737060546875,
0.00818634033203125,
-0.03533935546875,
-0.0128021240234375,
0.017974853515625,
0.0237274169921875,
-0.06427001953125,
0.0252532958984375,
0.0038280487060546875,
0.051849365234375,
-0.060791015625,
0.06256103515625,
0.074462890625,
-0.046783447265625,
-0.0745849609375,
-0.018707275390625,
-0.0209197998046875,
-0.06597900390625,
0.039764404296875,
0.0283966064453125,
-0.0189666748046875,
0.0169525146484375,
-0.045501708984375,
-0.07379150390625,
0.0799560546875,
0.0200347900390625,
-0.02484130859375,
0.0010175704956054688,
0.0174713134765625,
0.033782958984375,
-0.00434112548828125,
0.035919189453125,
0.05078125,
0.03515625,
0.0013475418090820312,
-0.055816650390625,
0.015960693359375,
-0.040863037109375,
-0.0262908935546875,
0.0305023193359375,
-0.0589599609375,
0.063720703125,
-0.001934051513671875,
-0.0216064453125,
0.00705718994140625,
0.0650634765625,
0.029052734375,
0.01024627685546875,
0.0374755859375,
0.06793212890625,
0.0543212890625,
-0.003002166748046875,
0.075927734375,
-0.0300750732421875,
0.031158447265625,
0.048187255859375,
0.00494384765625,
0.048370361328125,
0.036224365234375,
-0.035064697265625,
0.058990478515625,
0.045989990234375,
-0.0001704692840576172,
0.035430908203125,
-0.00713348388671875,
-0.0237579345703125,
-0.007488250732421875,
-0.01119232177734375,
-0.0399169921875,
0.047119140625,
0.02264404296875,
-0.0304107666015625,
-0.0248565673828125,
-0.0030384063720703125,
0.0230865478515625,
-0.00836944580078125,
-0.0136260986328125,
0.043487548828125,
0.01215362548828125,
-0.0423583984375,
0.06903076171875,
0.0201568603515625,
0.056915283203125,
-0.0440673828125,
-0.00661468505859375,
-0.004711151123046875,
0.0273284912109375,
-0.027740478515625,
-0.0386962890625,
0.01490020751953125,
0.007289886474609375,
0.0004992485046386719,
-0.005748748779296875,
0.055084228515625,
-0.0245208740234375,
-0.048248291015625,
0.0158843994140625,
0.032135009765625,
0.0276641845703125,
-0.00408172607421875,
-0.0628662109375,
0.005748748779296875,
0.007358551025390625,
-0.02545166015625,
0.018524169921875,
0.00778961181640625,
0.0061492919921875,
0.05047607421875,
0.04364013671875,
0.005893707275390625,
0.0003581047058105469,
-0.002994537353515625,
0.07208251953125,
-0.045623779296875,
-0.033660888671875,
-0.0772705078125,
0.0294189453125,
0.0010671615600585938,
-0.04241943359375,
0.060455322265625,
0.051910400390625,
0.0645751953125,
0.01169586181640625,
0.05419921875,
-0.0123138427734375,
0.030670166015625,
-0.0272064208984375,
0.07440185546875,
-0.047332763671875,
-0.01263427734375,
-0.021331787109375,
-0.057861328125,
-0.0026702880859375,
0.07537841796875,
-0.0199432373046875,
0.0247650146484375,
0.0496826171875,
0.054656982421875,
-0.005252838134765625,
-0.004283905029296875,
-0.00653839111328125,
0.02484130859375,
0.022247314453125,
0.043609619140625,
0.041015625,
-0.06719970703125,
0.03155517578125,
-0.032867431640625,
-0.0013875961303710938,
-0.01161956787109375,
-0.040069580078125,
-0.064208984375,
-0.054107666015625,
-0.0352783203125,
-0.0443115234375,
-0.0004887580871582031,
0.085205078125,
0.04290771484375,
-0.07403564453125,
-0.0187835693359375,
-0.0195159912109375,
0.01024627685546875,
-0.00867462158203125,
-0.01525115966796875,
0.058746337890625,
-0.0007085800170898438,
-0.05291748046875,
0.0179443359375,
-0.01013946533203125,
0.0144195556640625,
-0.001728057861328125,
-0.018157958984375,
-0.04217529296875,
-0.01171112060546875,
0.0239105224609375,
0.034454345703125,
-0.038238525390625,
-0.00988006591796875,
-0.015167236328125,
-0.0013713836669921875,
0.0289459228515625,
0.01800537109375,
-0.07275390625,
0.0071868896484375,
0.016632080078125,
0.0509033203125,
0.05950927734375,
-0.0082855224609375,
0.0194244384765625,
-0.04107666015625,
0.0200347900390625,
0.024871826171875,
0.038848876953125,
0.0280914306640625,
-0.013092041015625,
0.040313720703125,
-0.0009756088256835938,
-0.04693603515625,
-0.043212890625,
0.0036716461181640625,
-0.0726318359375,
-0.005092620849609375,
0.08209228515625,
-0.002773284912109375,
-0.024200439453125,
-0.0030651092529296875,
-0.02545166015625,
0.05548095703125,
-0.040069580078125,
0.05078125,
0.054168701171875,
0.0038089752197265625,
-0.01364898681640625,
-0.023468017578125,
0.0384521484375,
0.05181884765625,
-0.054962158203125,
-0.0270843505859375,
0.0230255126953125,
0.022308349609375,
0.0224761962890625,
0.052459716796875,
0.003589630126953125,
0.0178680419921875,
-0.0161895751953125,
0.0156097412109375,
-0.00897216796875,
-0.00850677490234375,
-0.0281219482421875,
0.00031828880310058594,
-0.0035343170166015625,
-0.0269012451171875
]
] |
facebook/wav2vec2-large-lv60 | 2021-12-28T12:45:09.000Z | [
"transformers",
"pytorch",
"jax",
"wav2vec2",
"pretraining",
"speech",
"en",
"dataset:librispeech_asr",
"arxiv:2006.11477",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | facebook | null | null | facebook/wav2vec2-large-lv60 | 5 | 13,941 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- librispeech_asr
tags:
- speech
license: apache-2.0
---
# Wav2Vec2-Large-LV60
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The large model, pretrained on 16kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16kHz.
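Because the model expects 16 kHz input, audio recorded at other sample rates must be resampled first. Below is a minimal sketch using naive linear interpolation (a real pipeline would use a proper polyphase resampler such as those in `librosa` or `torchaudio`; the function name here is made up for illustration):

```python
import numpy as np

def resample_to_16k(audio: np.ndarray, orig_sr: int, target_sr: int = 16_000) -> np.ndarray:
    """Naively resample a 1-D waveform to `target_sr` via linear interpolation."""
    duration = len(audio) / orig_sr
    n_target = int(round(duration * target_sr))
    t_orig = np.linspace(0.0, duration, num=len(audio), endpoint=False)
    t_new = np.linspace(0.0, duration, num=n_target, endpoint=False)
    return np.interp(t_new, t_orig, audio)

# 1 second of a 440 Hz tone recorded at 44.1 kHz, resampled to 16 kHz.
sr = 44_100
wave = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
resampled = resample_to_16k(wave, sr)
print(resampled.shape)  # (16000,)
```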
**Note**: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for **speech recognition**, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for a more detailed explanation of how to fine-tune the model.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
See [this notebook](https://colab.research.google.com/drive/1FjTsqbYKphl9kL-eILgUc-bl4zVThL8F?usp=sharing) for more information on how to fine-tune the model. | 2,002 | [
[
-0.005123138427734375,
-0.038848876953125,
0.005950927734375,
-0.005329132080078125,
-0.0251922607421875,
-0.01019287109375,
-0.0249786376953125,
-0.05548095703125,
-0.0170440673828125,
0.01202392578125,
-0.03948974609375,
-0.03070068359375,
-0.049346923828125,
-0.0260772705078125,
-0.0201416015625,
0.06878662109375,
0.0279541015625,
0.0283050537109375,
0.00385284423828125,
-0.0162506103515625,
-0.036773681640625,
-0.032989501953125,
-0.06158447265625,
-0.0367431640625,
0.0261688232421875,
0.02008056640625,
0.0239105224609375,
0.03765869140625,
0.0125274658203125,
0.0174407958984375,
-0.038330078125,
0.01041412353515625,
-0.05999755859375,
-0.010101318359375,
-0.00431060791015625,
-0.0312347412109375,
-0.0238800048828125,
0.018280029296875,
0.04974365234375,
0.049285888671875,
-0.0286407470703125,
0.030731201171875,
0.0147247314453125,
0.0318603515625,
-0.0270233154296875,
0.034637451171875,
-0.060089111328125,
-0.005870819091796875,
-0.02105712890625,
0.003643035888671875,
-0.037811279296875,
0.006549835205078125,
0.005397796630859375,
-0.034637451171875,
0.01036834716796875,
-0.005748748779296875,
0.0631103515625,
0.026336669921875,
-0.040252685546875,
-0.016754150390625,
-0.07647705078125,
0.07421875,
-0.03167724609375,
0.0653076171875,
0.049957275390625,
0.0245819091796875,
-0.0006113052368164062,
-0.0809326171875,
-0.02935791015625,
-0.0008821487426757812,
0.0345458984375,
0.0262603759765625,
-0.032989501953125,
0.0015459060668945312,
0.0295867919921875,
0.022003173828125,
-0.041961669921875,
0.034088134765625,
-0.0638427734375,
-0.04595947265625,
0.03936767578125,
-0.01654052734375,
-0.00390625,
0.0031490325927734375,
-0.0265960693359375,
-0.024505615234375,
-0.0252838134765625,
0.0384521484375,
0.010650634765625,
0.01953125,
-0.0203704833984375,
0.037506103515625,
0.006000518798828125,
0.044189453125,
-0.00907135009765625,
-0.01934814453125,
0.046875,
-0.01947021484375,
-0.0015869140625,
0.0230865478515625,
0.049407958984375,
0.01284027099609375,
0.016082763671875,
0.006195068359375,
-0.020538330078125,
0.01129913330078125,
-0.00534820556640625,
-0.05657958984375,
-0.037872314453125,
0.00850677490234375,
-0.036590576171875,
0.006328582763671875,
0.00829315185546875,
-0.003940582275390625,
-0.00864410400390625,
-0.05078125,
0.05010986328125,
-0.0227813720703125,
-0.00833892822265625,
-0.0022754669189453125,
-0.006626129150390625,
0.01873779296875,
0.01226806640625,
-0.0675048828125,
0.0277557373046875,
0.04547119140625,
0.0631103515625,
-0.00370025634765625,
0.01343536376953125,
-0.06451416015625,
-0.006748199462890625,
-0.038787841796875,
0.047515869140625,
-0.0176849365234375,
-0.0282440185546875,
-0.01143646240234375,
-0.0005249977111816406,
0.034210205078125,
-0.049346923828125,
0.05029296875,
-0.0272216796875,
0.0032138824462890625,
-0.0016164779663085938,
-0.059844970703125,
-0.00208282470703125,
-0.03302001953125,
-0.050048828125,
0.09075927734375,
0.004467010498046875,
-0.046142578125,
0.025543212890625,
-0.01441192626953125,
-0.033905029296875,
-0.004459381103515625,
-0.0052642822265625,
-0.038848876953125,
0.010284423828125,
0.007656097412109375,
0.03466796875,
-0.0018453598022460938,
0.0015087127685546875,
-0.0042572021484375,
-0.02825927734375,
0.0118408203125,
-0.0390625,
0.05755615234375,
0.027740478515625,
-0.0106048583984375,
0.01007080078125,
-0.08197021484375,
0.010223388671875,
0.007343292236328125,
-0.047027587890625,
-0.0026149749755859375,
-0.00616455078125,
0.0272369384765625,
0.010528564453125,
0.00873565673828125,
-0.049468994140625,
-0.01538848876953125,
-0.06219482421875,
0.0582275390625,
0.045318603515625,
-0.00920867919921875,
0.036773681640625,
-0.0167694091796875,
0.00063323974609375,
-0.0203704833984375,
0.0001474618911743164,
0.002277374267578125,
-0.03839111328125,
-0.039276123046875,
-0.021881103515625,
0.041412353515625,
0.043853759765625,
-0.01271820068359375,
0.0416259765625,
-0.00531768798828125,
-0.051025390625,
-0.058258056640625,
0.00962066650390625,
0.03131103515625,
0.034027099609375,
0.053009033203125,
-0.0110015869140625,
-0.0604248046875,
-0.05682373046875,
-0.00746917724609375,
-0.007366180419921875,
-0.0209197998046875,
0.0304412841796875,
0.017974853515625,
-0.0171356201171875,
0.049407958984375,
-0.007320404052734375,
-0.032562255859375,
-0.01611328125,
0.01224517822265625,
0.0183563232421875,
0.044677734375,
0.0143585205078125,
-0.05487060546875,
-0.015899658203125,
-0.0194854736328125,
-0.0131072998046875,
0.0053253173828125,
0.01529693603515625,
0.007083892822265625,
0.01085662841796875,
0.0540771484375,
-0.01206207275390625,
0.0297698974609375,
0.04901123046875,
-0.0031757354736328125,
0.01079559326171875,
-0.0183868408203125,
-0.0289154052734375,
-0.0919189453125,
-0.01204681396484375,
-0.005939483642578125,
-0.037811279296875,
-0.042938232421875,
-0.053497314453125,
0.00591278076171875,
-0.0152130126953125,
-0.04449462890625,
0.034210205078125,
-0.033599853515625,
-0.0136566162109375,
-0.0251922607421875,
0.01041412353515625,
-0.0155029296875,
0.02325439453125,
0.003215789794921875,
0.0555419921875,
0.032379150390625,
-0.047119140625,
0.030914306640625,
0.0303802490234375,
-0.04449462890625,
-0.0015087127685546875,
-0.0699462890625,
0.03961181640625,
-0.0031833648681640625,
0.03094482421875,
-0.08148193359375,
-0.016204833984375,
-0.00943756103515625,
-0.06549072265625,
0.041015625,
-0.01123046875,
-0.0259552001953125,
-0.0203704833984375,
0.00335693359375,
0.032440185546875,
0.074951171875,
-0.057891845703125,
0.041107177734375,
0.0538330078125,
-0.00489044189453125,
-0.023468017578125,
-0.07769775390625,
-0.016021728515625,
0.01361083984375,
-0.03302001953125,
0.043243408203125,
0.00244140625,
0.0076141357421875,
-0.021728515625,
-0.034332275390625,
0.006320953369140625,
-0.01116943359375,
0.0458984375,
0.00586700439453125,
-0.0009832382202148438,
0.03717041015625,
0.0010471343994140625,
-0.023468017578125,
-0.004322052001953125,
-0.038116455078125,
0.043914794921875,
0.004993438720703125,
-0.0035305023193359375,
-0.07244873046875,
0.006847381591796875,
0.0163421630859375,
-0.0308685302734375,
0.03143310546875,
0.0718994140625,
-0.048248291015625,
-0.0192413330078125,
-0.035675048828125,
-0.0204925537109375,
-0.033782958984375,
0.057159423828125,
-0.025360107421875,
-0.0728759765625,
0.012115478515625,
-0.0076904296875,
-0.0008230209350585938,
0.0455322265625,
0.0546875,
-0.026336669921875,
0.0787353515625,
0.040313720703125,
-0.01061248779296875,
0.0535888671875,
-0.031951904296875,
0.00420379638671875,
-0.051544189453125,
-0.05413818359375,
-0.046875,
-0.0173492431640625,
-0.0408935546875,
-0.04638671875,
0.0222015380859375,
0.01293182373046875,
-0.00403594970703125,
0.0163421630859375,
-0.041778564453125,
0.0247955322265625,
0.050628662109375,
0.024810791015625,
-0.00983428955078125,
0.0201263427734375,
0.001018524169921875,
-0.007843017578125,
-0.04486083984375,
-0.02960205078125,
0.080078125,
0.048248291015625,
0.054718017578125,
-0.0017223358154296875,
0.048858642578125,
0.023468017578125,
-0.040496826171875,
-0.08233642578125,
0.0203704833984375,
-0.022430419921875,
-0.041015625,
-0.0228729248046875,
-0.01739501953125,
-0.07159423828125,
-0.001873016357421875,
-0.034027099609375,
-0.05743408203125,
0.004779815673828125,
0.0129852294921875,
-0.018707275390625,
0.005100250244140625,
-0.05340576171875,
0.042572021484375,
-0.004604339599609375,
-0.01490020751953125,
-0.035736083984375,
-0.056427001953125,
-0.0114898681640625,
0.004482269287109375,
0.01788330078125,
-0.00876617431640625,
0.01629638671875,
0.09503173828125,
-0.0252685546875,
0.050140380859375,
-0.033935546875,
0.00481414794921875,
0.041259765625,
-0.0188751220703125,
0.044708251953125,
-0.003307342529296875,
-0.00032591819763183594,
0.015777587890625,
0.01470184326171875,
-0.0316162109375,
-0.0262451171875,
0.0305938720703125,
-0.0777587890625,
-0.0093994140625,
-0.013397216796875,
-0.01947021484375,
-0.0293731689453125,
0.00023031234741210938,
0.03826904296875,
0.0631103515625,
-0.0191192626953125,
0.0474853515625,
0.06231689453125,
-0.0012054443359375,
0.0168609619140625,
0.0227813720703125,
0.005359649658203125,
-0.0261688232421875,
0.07171630859375,
0.0204925537109375,
0.007015228271484375,
0.0260009765625,
0.03265380859375,
-0.0426025390625,
-0.049957275390625,
-0.0003864765167236328,
0.006526947021484375,
-0.051605224609375,
-0.007038116455078125,
-0.058135986328125,
-0.033355712890625,
-0.04718017578125,
0.03240966796875,
-0.058746337890625,
-0.0440673828125,
-0.050537109375,
-0.0244293212890625,
0.025360107421875,
0.0562744140625,
-0.037261962890625,
0.017730712890625,
-0.04840087890625,
0.03741455078125,
0.04437255859375,
-0.0016660690307617188,
-0.0158538818359375,
-0.08734130859375,
-0.02032470703125,
0.0181121826171875,
0.0139617919921875,
-0.047454833984375,
0.00275421142578125,
0.034881591796875,
0.0384521484375,
0.0283355712890625,
-0.01079559326171875,
0.037078857421875,
-0.038055419921875,
0.06292724609375,
0.022369384765625,
-0.0709228515625,
0.05828857421875,
-0.004474639892578125,
0.0162506103515625,
0.035858154296875,
0.0132293701171875,
-0.019744873046875,
0.008636474609375,
-0.0426025390625,
-0.06842041015625,
0.0543212890625,
0.0166778564453125,
0.0163726806640625,
0.0262908935546875,
0.032257080078125,
0.006244659423828125,
-0.013641357421875,
-0.033233642578125,
-0.020538330078125,
-0.031982421875,
-0.0263671875,
-0.0234832763671875,
-0.04351806640625,
0.00974273681640625,
-0.043426513671875,
0.060516357421875,
0.018707275390625,
0.020294189453125,
0.0116424560546875,
-0.0131072998046875,
0.01325225830078125,
0.01727294921875,
0.03125,
0.020751953125,
-0.01241302490234375,
0.01580810546875,
0.021728515625,
-0.044403076171875,
0.00007671117782592773,
0.0254364013671875,
0.0071258544921875,
-0.00460052490234375,
0.04150390625,
0.1004638671875,
0.00130462646484375,
-0.040618896484375,
0.041168212890625,
-0.0073089599609375,
-0.032470703125,
-0.0284881591796875,
0.0169830322265625,
0.012847900390625,
0.0312347412109375,
0.023223876953125,
-0.0015802383422851562,
0.011962890625,
-0.0380859375,
0.023101806640625,
0.0178375244140625,
-0.05877685546875,
-0.022491455078125,
0.06549072265625,
0.01419830322265625,
-0.0310821533203125,
0.04693603515625,
-0.01428985595703125,
-0.0245819091796875,
0.032684326171875,
0.045257568359375,
0.03515625,
-0.0285491943359375,
-0.01424407958984375,
0.04949951171875,
0.00812530517578125,
-0.01316070556640625,
0.034576416015625,
-0.026611328125,
-0.03778076171875,
-0.02227783203125,
-0.04736328125,
-0.015655517578125,
0.02392578125,
-0.0574951171875,
0.0155029296875,
-0.035308837890625,
-0.02691650390625,
0.0270233154296875,
0.0136260986328125,
-0.05682373046875,
0.0258636474609375,
0.0296478271484375,
0.056427001953125,
-0.058868408203125,
0.0762939453125,
0.04046630859375,
-0.01654052734375,
-0.09796142578125,
-0.020294189453125,
0.003871917724609375,
-0.059234619140625,
0.039764404296875,
0.01482391357421875,
-0.024810791015625,
0.01824951171875,
-0.05853271484375,
-0.06219482421875,
0.07855224609375,
0.0131683349609375,
-0.087158203125,
0.01291656494140625,
-0.012725830078125,
0.03643798828125,
-0.004390716552734375,
-0.0078887939453125,
0.041900634765625,
0.0161895751953125,
0.0185394287109375,
-0.0809326171875,
-0.0220184326171875,
-0.0139923095703125,
-0.003078460693359375,
-0.0233612060546875,
-0.04058837890625,
0.054290771484375,
-0.030914306640625,
-0.026031494140625,
0.00888824462890625,
0.07257080078125,
0.0026531219482421875,
0.0233917236328125,
0.041900634765625,
0.029876708984375,
0.09454345703125,
-0.0108489990234375,
0.046051025390625,
0.0039215087890625,
0.024139404296875,
0.10369873046875,
0.003204345703125,
0.0782470703125,
0.02496337890625,
-0.0291595458984375,
0.017974853515625,
0.04815673828125,
-0.0202484130859375,
0.056915283203125,
0.0250396728515625,
-0.00868988037109375,
-0.0247802734375,
-0.0242156982421875,
-0.0408935546875,
0.0634765625,
0.0223388671875,
-0.00843048095703125,
0.022247314453125,
0.0187530517578125,
-0.0196380615234375,
0.006378173828125,
-0.02374267578125,
0.080810546875,
0.02130126953125,
-0.00891876220703125,
0.0504150390625,
0.0082550048828125,
0.0384521484375,
-0.04425048828125,
0.00006586313247680664,
0.01568603515625,
0.0190582275390625,
-0.0175323486328125,
-0.042694091796875,
-0.0006380081176757812,
-0.004184722900390625,
-0.022430419921875,
-0.0025386810302734375,
0.05841064453125,
-0.044189453125,
-0.0323486328125,
0.0307464599609375,
0.0122528076171875,
0.0230255126953125,
-0.0069732666015625,
-0.041229248046875,
0.015838623046875,
0.0086822509765625,
-0.01200103759765625,
0.0043182373046875,
0.0160980224609375,
0.01084136962890625,
0.022491455078125,
0.05535888671875,
0.01224517822265625,
0.004383087158203125,
0.0299530029296875,
0.044189453125,
-0.03912353515625,
-0.0494384765625,
-0.0345458984375,
0.031707763671875,
0.006378173828125,
-0.00937652587890625,
0.0180816650390625,
0.055450439453125,
0.07659912109375,
0.00864410400390625,
0.047454833984375,
0.0182647705078125,
0.0721435546875,
-0.0625,
0.048004150390625,
-0.04608154296875,
-0.0070648193359375,
0.0098419189453125,
-0.055694580078125,
-0.0125274658203125,
0.061065673828125,
0.0126953125,
0.0146026611328125,
0.03350830078125,
0.06207275390625,
0.0019207000732421875,
0.00601959228515625,
0.0214385986328125,
0.029937744140625,
0.020782470703125,
0.03717041015625,
0.06500244140625,
-0.049896240234375,
0.06036376953125,
-0.0194854736328125,
-0.0235595703125,
-0.003246307373046875,
-0.0259246826171875,
-0.08050537109375,
-0.058868408203125,
-0.025482177734375,
-0.041778564453125,
0.0081787109375,
0.08270263671875,
0.07147216796875,
-0.0701904296875,
0.00046896934509277344,
0.00247955322265625,
-0.02227783203125,
-0.00858306884765625,
-0.0068206787109375,
0.03717041015625,
-0.0025691986083984375,
-0.0560302734375,
0.057464599609375,
-0.0016155242919921875,
0.026458740234375,
0.01132965087890625,
-0.019775390625,
0.0008153915405273438,
-0.0021190643310546875,
0.034820556640625,
0.0185546875,
-0.05511474609375,
-0.01355743408203125,
-0.0221710205078125,
-0.007843017578125,
0.01200103759765625,
0.049072265625,
-0.055511474609375,
0.04644775390625,
0.0401611328125,
0.032379150390625,
0.07659912109375,
-0.005489349365234375,
0.00734710693359375,
-0.05340576171875,
0.0265350341796875,
0.0285491943359375,
0.0211334228515625,
0.0232391357421875,
-0.00408172607421875,
0.01349639892578125,
0.017730712890625,
-0.041351318359375,
-0.056396484375,
0.0113983154296875,
-0.10028076171875,
-0.016845703125,
0.09539794921875,
0.01355743408203125,
-0.00791168212890625,
0.004077911376953125,
-0.0390625,
0.0673828125,
-0.04290771484375,
0.0210418701171875,
0.03399658203125,
0.003017425537109375,
-0.0007600784301757812,
-0.03851318359375,
0.043060302734375,
0.0199737548828125,
-0.0223846435546875,
0.0038318634033203125,
0.0355224609375,
0.033843994140625,
-0.01132965087890625,
0.0478515625,
-0.0136566162109375,
0.016204833984375,
0.0078582763671875,
-0.0036334991455078125,
-0.023406982421875,
-0.031494140625,
-0.043426513671875,
0.002471923828125,
0.00891876220703125,
-0.03912353515625
]
] |
openai/imagegpt-small | 2023-06-12T11:16:21.000Z | [
"transformers",
"pytorch",
"imagegpt",
"vision",
"dataset:imagenet-21k",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | null | openai | null | null | openai/imagegpt-small | 10 | 13,894 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
tags:
- vision
datasets:
- imagenet-21k
---
# ImageGPT (small-sized model)
ImageGPT (iGPT) model pre-trained on ImageNet-21k (14 million images, 21,843 classes) at resolution 32x32. It was introduced in the paper [Generative Pretraining from Pixels](https://cdn.openai.com/papers/Generative_Pretraining_from_Pixels_V2.pdf) by Chen et al. and first released in [this repository](https://github.com/openai/image-gpt). See also the official [blog post](https://openai.com/blog/image-gpt/).
Disclaimer: The team releasing ImageGPT did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
ImageGPT (iGPT) is a transformer decoder model (GPT-like) pretrained on a large collection of images in a self-supervised fashion, namely ImageNet-21k, at a resolution of 32x32 pixels.
The goal for the model is simply to predict the next pixel value, given the previous ones.
By pre-training the model, it learns an inner representation of images that can then be used to:
- extract features useful for downstream tasks: one can use ImageGPT to produce fixed image features and train a linear model on top of them (such as a scikit-learn logistic regression or SVM). This is also referred to as "linear probing".
- perform (un)conditional image generation.
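The linear-probing idea above can be sketched without the model itself: given fixed features for each image, one trains a small linear classifier on top. This is a minimal sketch using NumPy with synthetic, linearly separable "features" standing in for real ImageGPT features (the data, dimensions, and training loop are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def linear_probe(features, labels, lr=0.5, epochs=200):
    """Train a logistic-regression probe (binary case) on fixed features."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.01, size=features.shape[1])
    b = 0.0
    for _ in range(epochs):
        logits = features @ w + b
        probs = 1.0 / (1.0 + np.exp(-logits))   # sigmoid
        grad = probs - labels                   # dL/dlogits for cross-entropy
        w -= lr * features.T @ grad / len(labels)
        b -= lr * grad.mean()
    return w, b

# Stand-in for extracted image features: two well-separated blobs.
rng = np.random.default_rng(42)
feats = np.concatenate([rng.normal(-1.0, 0.5, size=(50, 16)),
                        rng.normal(+1.0, 0.5, size=(50, 16))])
labels = np.concatenate([np.zeros(50), np.ones(50)])

w, b = linear_probe(feats, labels)
preds = (feats @ w + b > 0).astype(float)
accuracy = (preds == labels).mean()
```

In practice the features would come from a frozen ImageGPT forward pass; only the probe's weights are trained.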
## Intended uses & limitations
You can use the raw model for feature extraction or (un)conditional image generation. See the [model hub](https://huggingface.co/models?search=openai/imagegpt) for all ImageGPT variants.
### How to use
Here is how to use this model in PyTorch to perform unconditional image generation:
```python
from transformers import ImageGPTImageProcessor, ImageGPTForCausalImageModeling
import torch
import matplotlib.pyplot as plt
import numpy as np
processor = ImageGPTImageProcessor.from_pretrained('openai/imagegpt-small')
model = ImageGPTForCausalImageModeling.from_pretrained('openai/imagegpt-small')
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
# unconditional generation of 8 images
batch_size = 8
context = torch.full((batch_size, 1), model.config.vocab_size - 1)  # initialize with SOS token
context = context.to(device)
output = model.generate(pixel_values=context, max_length=model.config.n_positions + 1, temperature=1.0, do_sample=True, top_k=40)
clusters = processor.clusters
n_px = processor.size
samples = output[:,1:].cpu().detach().numpy()
samples_img = [np.reshape(np.rint(127.5 * (clusters[s] + 1.0)), [n_px, n_px, 3]).astype(np.uint8) for s in samples] # convert color cluster tokens back to pixels
f, axes = plt.subplots(1, batch_size, dpi=300)
for img, ax in zip(samples_img, axes):
ax.axis('off')
ax.imshow(img)
```
## Training data
The ImageGPT model was pretrained on [ImageNet-21k](http://www.image-net.org/), a dataset consisting of 14 million images and 21k classes.
## Training procedure
### Preprocessing
Images are first resized/rescaled to the same resolution (32x32) and normalized across the RGB channels. Next, color-clustering is performed. This means that every pixel is turned into one of 512 possible cluster values. This way, one ends up with a sequence of 32x32 = 1024 pixel values, rather than 32x32x3 = 3072, which is prohibitively large for Transformer-based models.
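The color-clustering step can be illustrated in a few lines: every RGB pixel is replaced by the index of its nearest centroid in a 512-entry palette, turning a 32x32x3 image into 1024 discrete tokens. The image and palette below are random stand-ins (the real centroids ship with the processor as `processor.clusters`):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: a 32x32 RGB image scaled to [-1, 1] and 512 centroids.
image = rng.uniform(-1.0, 1.0, size=(32, 32, 3))
clusters = rng.uniform(-1.0, 1.0, size=(512, 3))

pixels = image.reshape(-1, 3)                                       # (1024, 3)
# Squared Euclidean distance from every pixel to every centroid.
dists = ((pixels[:, None, :] - clusters[None, :, :]) ** 2).sum(-1)  # (1024, 512)
tokens = dists.argmin(axis=1)                                       # (1024,) in [0, 512)
```

This is why the sequence length is 32x32 = 1024 rather than 32x32x3 = 3072.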
### Pretraining
Training details can be found in section 3.4 of v2 of the paper.
## Evaluation results
For evaluation results on several image classification benchmarks, we refer to the original paper.
### BibTeX entry and citation info
```bibtex
@InProceedings{pmlr-v119-chen20s,
title = {Generative Pretraining From Pixels},
author = {Chen, Mark and Radford, Alec and Child, Rewon and Wu, Jeffrey and Jun, Heewoo and Luan, David and Sutskever, Ilya},
booktitle = {Proceedings of the 37th International Conference on Machine Learning},
pages = {1691--1703},
year = {2020},
editor = {III, Hal Daumé and Singh, Aarti},
volume = {119},
series = {Proceedings of Machine Learning Research},
month = {13--18 Jul},
publisher = {PMLR},
pdf = {http://proceedings.mlr.press/v119/chen20s/chen20s.pdf},
url = {https://proceedings.mlr.press/v119/chen20s.html}
}
```
```bibtex
@inproceedings{deng2009imagenet,
title={Imagenet: A large-scale hierarchical image database},
author={Deng, Jia and Dong, Wei and Socher, Richard and Li, Li-Jia and Li, Kai and Fei-Fei, Li},
booktitle={2009 IEEE conference on computer vision and pattern recognition},
pages={248--255},
year={2009},
organization={IEEE}
}
``` | 4,631 | [
[
-0.04742431640625,
-0.02142333984375,
0.010711669921875,
-0.0025806427001953125,
-0.031707763671875,
-0.0248260498046875,
-0.0141448974609375,
-0.0275421142578125,
-0.0024871826171875,
0.014068603515625,
-0.0333251953125,
-0.030670166015625,
-0.04443359375,
-0.01544952392578125,
-0.02069091796875,
0.0770263671875,
-0.005992889404296875,
0.0010404586791992188,
-0.00467681884765625,
-0.01020050048828125,
-0.01451873779296875,
-0.0209197998046875,
-0.0533447265625,
-0.023040771484375,
0.035491943359375,
0.0182647705078125,
0.0301513671875,
0.0408935546875,
0.050628662109375,
0.0282440185546875,
0.0120849609375,
-0.01363372802734375,
-0.034881591796875,
-0.034271240234375,
0.0230865478515625,
-0.031707763671875,
-0.0166015625,
0.0193939208984375,
0.043426513671875,
0.0231170654296875,
0.0017251968383789062,
0.0230865478515625,
0.0093994140625,
0.046905517578125,
-0.04022216796875,
0.01296234130859375,
-0.0511474609375,
0.02520751953125,
-0.01056671142578125,
-0.00986480712890625,
-0.0262908935546875,
-0.01995849609375,
0.0029735565185546875,
-0.03753662109375,
0.06396484375,
-0.0163726806640625,
0.08416748046875,
0.0277862548828125,
-0.0180206298828125,
0.0030269622802734375,
-0.03839111328125,
0.045013427734375,
-0.041046142578125,
0.00791168212890625,
0.01019287109375,
0.030242919921875,
0.0110931396484375,
-0.09405517578125,
-0.053466796875,
-0.011474609375,
-0.02197265625,
0.0207061767578125,
-0.026947021484375,
0.012237548828125,
0.0347900390625,
0.03515625,
-0.04681396484375,
0.01224517822265625,
-0.0445556640625,
-0.0095062255859375,
0.043365478515625,
-0.003330230712890625,
0.017608642578125,
-0.009429931640625,
-0.039581298828125,
-0.030487060546875,
-0.04296875,
0.0006556510925292969,
0.0158538818359375,
-0.0026111602783203125,
-0.0237884521484375,
0.027069091796875,
0.00350189208984375,
0.044281005859375,
0.0184478759765625,
-0.0068359375,
0.031951904296875,
-0.021392822265625,
-0.022735595703125,
0.0014553070068359375,
0.0721435546875,
0.0222625732421875,
0.0090179443359375,
-0.0024623870849609375,
-0.004268646240234375,
0.01454925537109375,
0.012939453125,
-0.0985107421875,
-0.04876708984375,
0.006069183349609375,
-0.046142578125,
-0.0208892822265625,
0.01338958740234375,
-0.065185546875,
-0.00421905517578125,
-0.01070404052734375,
0.05352783203125,
-0.0258026123046875,
-0.022674560546875,
-0.017303466796875,
-0.0031986236572265625,
0.0297088623046875,
0.0186309814453125,
-0.0576171875,
0.007328033447265625,
0.0080718994140625,
0.072265625,
-0.0123291015625,
-0.0316162109375,
-0.001735687255859375,
-0.026519775390625,
-0.023712158203125,
0.042816162109375,
-0.006259918212890625,
-0.01103973388671875,
-0.03167724609375,
0.01529693603515625,
-0.0119781494140625,
-0.037139892578125,
0.02557373046875,
-0.04046630859375,
0.007274627685546875,
-0.0011920928955078125,
-0.0149383544921875,
-0.0138397216796875,
0.00341796875,
-0.029571533203125,
0.07745361328125,
0.01334381103515625,
-0.07379150390625,
0.038543701171875,
-0.040252685546875,
-0.00449371337890625,
-0.0001748800277709961,
-0.0185089111328125,
-0.06475830078125,
-0.003917694091796875,
0.021636962890625,
0.0474853515625,
-0.01393890380859375,
0.0014505386352539062,
-0.013885498046875,
-0.028289794921875,
-0.00911712646484375,
-0.0246124267578125,
0.053680419921875,
0.03155517578125,
-0.0517578125,
0.0275726318359375,
-0.053466796875,
0.02215576171875,
0.0335693359375,
-0.00955963134765625,
0.0235137939453125,
-0.029876708984375,
0.006122589111328125,
0.029632568359375,
0.01438140869140625,
-0.043487548828125,
0.00945281982421875,
0.006450653076171875,
0.0489501953125,
0.05419921875,
-0.00888824462890625,
0.0243377685546875,
-0.01363372802734375,
0.0269927978515625,
0.01812744140625,
0.016082763671875,
-0.03143310546875,
-0.03564453125,
-0.06719970703125,
-0.0284423828125,
0.0308837890625,
0.03143310546875,
-0.060089111328125,
0.045166015625,
-0.02777099609375,
-0.0689697265625,
-0.00823974609375,
-0.0189361572265625,
0.02850341796875,
0.041412353515625,
0.029815673828125,
-0.03662109375,
-0.0310516357421875,
-0.060516357421875,
0.0252227783203125,
0.011260986328125,
0.0012226104736328125,
0.0179901123046875,
0.052520751953125,
-0.02178955078125,
0.06829833984375,
-0.044281005859375,
-0.01512908935546875,
-0.004711151123046875,
0.0196380615234375,
0.0249786376953125,
0.059844970703125,
0.05059814453125,
-0.0628662109375,
-0.057586669921875,
0.0011453628540039062,
-0.05657958984375,
0.01432037353515625,
0.0002777576446533203,
-0.00788116455078125,
0.01611328125,
0.0430908203125,
-0.04827880859375,
0.06085205078125,
0.04010009765625,
-0.0082855224609375,
0.052154541015625,
-0.0247344970703125,
0.01468658447265625,
-0.072265625,
0.00836944580078125,
0.0177764892578125,
-0.006679534912109375,
-0.037933349609375,
-0.003437042236328125,
0.00803375244140625,
-0.028106689453125,
-0.03369140625,
0.0281829833984375,
-0.036346435546875,
-0.020233154296875,
-0.0163421630859375,
-0.0306243896484375,
-0.0264892578125,
0.063232421875,
0.00911712646484375,
0.06201171875,
0.06134033203125,
-0.0460205078125,
0.03021240234375,
0.01432037353515625,
-0.037811279296875,
0.037384033203125,
-0.06610107421875,
0.0185089111328125,
-0.01020050048828125,
0.02545166015625,
-0.07562255859375,
-0.0187835693359375,
0.016265869140625,
-0.038543701171875,
0.035369873046875,
-0.025421142578125,
-0.0328369140625,
-0.045623779296875,
-0.0021800994873046875,
0.052337646484375,
0.06878662109375,
-0.05975341796875,
0.0236663818359375,
0.004634857177734375,
0.01611328125,
-0.03082275390625,
-0.04974365234375,
-0.01605224609375,
-0.0208587646484375,
-0.051605224609375,
0.0411376953125,
0.0049285888671875,
0.0214080810546875,
0.0170440673828125,
0.00254058837890625,
-0.003711700439453125,
-0.02880859375,
0.0279998779296875,
0.01654052734375,
-0.026947021484375,
0.00498199462890625,
-0.038238525390625,
-0.0248565673828125,
-0.01361846923828125,
-0.034149169921875,
0.060089111328125,
-0.031646728515625,
-0.03375244140625,
-0.0430908203125,
0.0074920654296875,
0.0204010009765625,
-0.01446533203125,
0.045013427734375,
0.08831787109375,
-0.022705078125,
-0.0014438629150390625,
-0.02960205078125,
-0.0206451416015625,
-0.038543701171875,
0.040191650390625,
-0.0223388671875,
-0.0487060546875,
0.038360595703125,
-0.0038280487060546875,
-0.00848388671875,
0.0576171875,
0.038909912109375,
-0.012847900390625,
0.0604248046875,
0.04296875,
0.01666259765625,
0.048583984375,
-0.058746337890625,
0.01465606689453125,
-0.05450439453125,
-0.00766754150390625,
-0.024169921875,
-0.043426513671875,
-0.060089111328125,
-0.03436279296875,
0.04791259765625,
0.01041412353515625,
-0.04962158203125,
0.0208740234375,
-0.057281494140625,
0.0308837890625,
0.06494140625,
0.043853759765625,
-0.0173492431640625,
0.0238800048828125,
0.0013055801391601562,
-0.00766754150390625,
-0.06549072265625,
-0.011077880859375,
0.07708740234375,
0.015777587890625,
0.05047607421875,
-0.0271148681640625,
0.04669189453125,
-0.002635955810546875,
0.016204833984375,
-0.047882080078125,
0.042144775390625,
-0.0173797607421875,
-0.029541015625,
-0.015777587890625,
-0.043609619140625,
-0.09466552734375,
0.0163726806640625,
-0.0175018310546875,
-0.05047607421875,
0.0235137939453125,
0.031158447265625,
-0.0141448974609375,
0.0533447265625,
-0.07275390625,
0.0821533203125,
-0.0244598388671875,
-0.022003173828125,
0.0027790069580078125,
-0.053009033203125,
0.01541900634765625,
-0.00030303001403808594,
-0.0229949951171875,
0.0108795166015625,
0.0240478515625,
0.06732177734375,
-0.046417236328125,
0.0738525390625,
-0.02667236328125,
0.03466796875,
0.037322998046875,
-0.0260467529296875,
0.036102294921875,
-0.0290374755859375,
0.014129638671875,
0.02734375,
-0.0016183853149414062,
-0.04986572265625,
-0.0271759033203125,
0.034423828125,
-0.0791015625,
-0.0279083251953125,
-0.03228759765625,
-0.0208587646484375,
-0.003734588623046875,
0.0279541015625,
0.0828857421875,
0.03216552734375,
0.0032520294189453125,
0.0227508544921875,
0.04620361328125,
-0.027862548828125,
0.0458984375,
-0.024993896484375,
-0.00225830078125,
-0.023406982421875,
0.07867431640625,
0.016021728515625,
0.0168304443359375,
-0.003116607666015625,
0.0159149169921875,
-0.0082855224609375,
-0.04986572265625,
-0.0290374755859375,
0.01800537109375,
-0.0615234375,
-0.0262451171875,
-0.03851318359375,
-0.039703369140625,
-0.0241241455078125,
-0.020111083984375,
-0.029541015625,
-0.0036163330078125,
-0.0305938720703125,
0.01177215576171875,
0.0298004150390625,
0.039215087890625,
-0.01375579833984375,
0.0599365234375,
-0.041046142578125,
0.0120697021484375,
0.043304443359375,
0.0299835205078125,
0.00949859619140625,
-0.04571533203125,
-0.012786865234375,
-0.01073455810546875,
-0.03271484375,
-0.043304443359375,
0.034820556640625,
0.0177764892578125,
0.041107177734375,
0.0255889892578125,
-0.01142120361328125,
0.04766845703125,
-0.027069091796875,
0.05670166015625,
0.05108642578125,
-0.0494384765625,
0.03802490234375,
-0.0205535888671875,
0.0200958251953125,
0.0198974609375,
0.037261962890625,
-0.01861572265625,
0.0283966064453125,
-0.08111572265625,
-0.05523681640625,
0.0570068359375,
0.006725311279296875,
0.002288818359375,
0.03204345703125,
0.03680419921875,
-0.007808685302734375,
0.01593017578125,
-0.078125,
-0.01666259765625,
-0.038787841796875,
-0.012939453125,
-0.0269012451171875,
-0.01555633544921875,
-0.005687713623046875,
-0.0474853515625,
0.04827880859375,
-0.012908935546875,
0.07000732421875,
0.01410675048828125,
-0.0035266876220703125,
0.0005993843078613281,
-0.0174713134765625,
0.029571533203125,
0.0350341796875,
-0.005619049072265625,
0.01439666748046875,
0.0032863616943359375,
-0.0538330078125,
0.0090484619140625,
0.0003082752227783203,
-0.0201568603515625,
-0.017547607421875,
0.0263824462890625,
0.0872802734375,
-0.01849365234375,
-0.005619049072265625,
0.044281005859375,
-0.00414276123046875,
-0.034820556640625,
-0.021881103515625,
0.004505157470703125,
-0.01120758056640625,
0.01800537109375,
0.0230712890625,
0.03240966796875,
-0.0173797607421875,
-0.0225372314453125,
0.0229034423828125,
0.026763916015625,
-0.027313232421875,
-0.0237579345703125,
0.0435791015625,
-0.004932403564453125,
0.004726409912109375,
0.0732421875,
-0.0272674560546875,
-0.0499267578125,
0.07305908203125,
0.04205322265625,
0.06329345703125,
0.0023250579833984375,
0.0166015625,
0.066650390625,
0.021484375,
0.0029850006103515625,
-0.0018625259399414062,
0.0010223388671875,
-0.064453125,
-0.00794219970703125,
-0.03289794921875,
0.0033740997314453125,
0.0186614990234375,
-0.0271453857421875,
0.01806640625,
-0.037689208984375,
-0.01502227783203125,
-0.00238037109375,
0.0056304931640625,
-0.0693359375,
0.0303192138671875,
0.006351470947265625,
0.05078125,
-0.05975341796875,
0.0653076171875,
0.04150390625,
-0.0322265625,
-0.07586669921875,
0.00015938282012939453,
-0.020660400390625,
-0.07177734375,
0.053558349609375,
0.0305633544921875,
0.00647735595703125,
0.025634765625,
-0.064453125,
-0.054901123046875,
0.10015869140625,
0.028289794921875,
-0.0179443359375,
0.0171051025390625,
0.002838134765625,
0.02764892578125,
-0.021484375,
0.021026611328125,
0.024932861328125,
0.02630615234375,
0.04229736328125,
-0.047088623046875,
0.0006670951843261719,
-0.0259246826171875,
0.00965118408203125,
0.0178375244140625,
-0.05462646484375,
0.087890625,
-0.027069091796875,
-0.0198822021484375,
-0.0015382766723632812,
0.04571533203125,
0.00791168212890625,
0.00955963134765625,
0.049530029296875,
0.0679931640625,
0.0384521484375,
-0.032012939453125,
0.0775146484375,
-0.0006628036499023438,
0.037750244140625,
0.05181884765625,
0.030426025390625,
0.037750244140625,
0.020050048828125,
-0.0296478271484375,
0.0119171142578125,
0.0859375,
-0.031982421875,
0.05078125,
0.0146484375,
-0.0145721435546875,
-0.0066375732421875,
0.0157470703125,
-0.048858642578125,
0.02947998046875,
0.01451873779296875,
-0.031524658203125,
-0.020294189453125,
0.01125335693359375,
-0.01009368896484375,
-0.033203125,
-0.037628173828125,
0.0382080078125,
-0.0063018798828125,
-0.044158935546875,
0.0545654296875,
-0.007720947265625,
0.07318115234375,
-0.03985595703125,
0.003314971923828125,
-0.01332855224609375,
0.0173492431640625,
-0.03582763671875,
-0.049468994140625,
0.0114898681640625,
-0.010223388671875,
-0.0039825439453125,
0.00469970703125,
0.05615234375,
-0.0107421875,
-0.03936767578125,
0.000888824462890625,
0.00738525390625,
0.0257568359375,
-0.01531219482421875,
-0.05926513671875,
0.000270843505859375,
0.005268096923828125,
-0.0260467529296875,
0.02886962890625,
0.0232391357421875,
0.005504608154296875,
0.0228271484375,
0.04937744140625,
0.0006742477416992188,
0.034027099609375,
-0.029144287109375,
0.06610107421875,
-0.0330810546875,
-0.037078857421875,
-0.047882080078125,
0.039459228515625,
-0.0040130615234375,
-0.0321044921875,
0.0382080078125,
0.0279388427734375,
0.08026123046875,
-0.0210418701171875,
0.0526123046875,
-0.0282440185546875,
-0.00738525390625,
-0.04150390625,
0.040802001953125,
-0.041961669921875,
-0.01885986328125,
-0.016448974609375,
-0.0703125,
-0.015106201171875,
0.060760498046875,
-0.038543701171875,
0.04351806640625,
0.043701171875,
0.06365966796875,
-0.0264739990234375,
-0.0127105712890625,
0.0218048095703125,
0.0034160614013671875,
0.01512908935546875,
0.03533935546875,
0.04766845703125,
-0.045806884765625,
0.0411376953125,
-0.051513671875,
-0.024932861328125,
-0.001773834228515625,
-0.0703125,
-0.049041748046875,
-0.06494140625,
-0.0360107421875,
-0.025421142578125,
-0.02490234375,
0.04180908203125,
0.09014892578125,
-0.0533447265625,
-0.0021572113037109375,
-0.02850341796875,
-0.024688720703125,
-0.0135040283203125,
-0.01085662841796875,
0.05615234375,
-0.0171966552734375,
-0.04779052734375,
-0.0271453857421875,
-0.00794219970703125,
0.02197265625,
-0.0032901763916015625,
-0.00846099853515625,
-0.0003924369812011719,
-0.03375244140625,
0.035003662109375,
0.035430908203125,
-0.0330810546875,
-0.01300811767578125,
-0.0249786376953125,
-0.0024318695068359375,
0.035888671875,
0.053466796875,
-0.04541015625,
0.02294921875,
0.03375244140625,
0.041656494140625,
0.0821533203125,
-0.013885498046875,
-0.0026683807373046875,
-0.057861328125,
0.04248046875,
0.007476806640625,
0.030487060546875,
0.018524169921875,
-0.02508544921875,
0.047882080078125,
0.032470703125,
-0.055877685546875,
-0.044830322265625,
0.01374053955078125,
-0.08551025390625,
-0.0155029296875,
0.0716552734375,
-0.027435302734375,
-0.02337646484375,
0.033203125,
-0.01226043701171875,
0.0260162353515625,
0.005962371826171875,
0.03790283203125,
0.0152435302734375,
-0.02197265625,
-0.049041748046875,
-0.03448486328125,
0.037322998046875,
0.0117034912109375,
-0.06201171875,
-0.03387451171875,
0.0231781005859375,
0.02557373046875,
0.0277557373046875,
0.060302734375,
-0.01003265380859375,
0.018280029296875,
0.026214599609375,
0.03759765625,
-0.00499725341796875,
-0.0183563232421875,
-0.020721435546875,
0.0101470947265625,
-0.0084686279296875,
-0.040924072265625
]
] |
timm/vit_large_patch14_dinov2.lvd142m | 2023-09-03T21:59:43.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:2304.07193",
"arxiv:2010.11929",
"license:apache-2.0",
"has_space",
"region:us"
] | image-classification | timm | null | null | timm/vit_large_patch14_dinov2.lvd142m | 3 | 13,882 | timm | 2023-05-09T21:05:24 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
---
# Model card for vit_large_patch14_dinov2.lvd142m
A Vision Transformer (ViT) image feature model. Pretrained on LVD-142M with self-supervised DINOv2 method.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 304.4
- GMACs: 507.1
- Activations (M): 1058.8
- Image size: 518 x 518
- **Papers:**
- DINOv2: Learning Robust Visual Features without Supervision: https://arxiv.org/abs/2304.07193
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Original:** https://github.com/facebookresearch/dinov2
- **Pretrain Dataset:** LVD-142M
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_large_patch14_dinov2.lvd142m', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_large_patch14_dinov2.lvd142m',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1370, 1024) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
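The unpooled output shape `(1, 1370, 1024)` follows directly from the model's configuration: at 518x518 input with 14x14 patches, the patch grid is 37x37 = 1369 tokens, plus one class token, and 1024 is the ViT-Large embedding width. A quick sanity check:

```python
image_size, patch_size = 518, 14
embed_dim = 1024                     # ViT-Large width

grid = image_size // patch_size      # 37 patches per side (14 * 37 = 518)
num_tokens = grid * grid + 1         # 1369 patch tokens + 1 class token

unpooled_shape = (1, num_tokens, embed_dim)
```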
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{oquab2023dinov2,
title={DINOv2: Learning Robust Visual Features without Supervision},
author={Oquab, Maxime and Darcet, Timothée and Moutakanni, Theo and Vo, Huy V. and Szafraniec, Marc and Khalidov, Vasil and Fernandez, Pierre and Haziza, Daniel and Massa, Francisco and El-Nouby, Alaaeldin and Howes, Russell and Huang, Po-Yao and Xu, Hu and Sharma, Vasu and Li, Shang-Wen and Galuba, Wojciech and Rabbat, Mike and Assran, Mido and Ballas, Nicolas and Synnaeve, Gabriel and Misra, Ishan and Jegou, Herve and Mairal, Julien and Labatut, Patrick and Joulin, Armand and Bojanowski, Piotr},
journal={arXiv:2304.07193},
year={2023}
}
```
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
``` | 3,991 | [
[
-0.03704833984375,
-0.029754638671875,
0.00676727294921875,
0.005443572998046875,
-0.034820556640625,
-0.025390625,
-0.0211639404296875,
-0.032745361328125,
0.011383056640625,
0.0204925537109375,
-0.032958984375,
-0.0394287109375,
-0.051361083984375,
-0.0038127899169921875,
-0.00482177734375,
0.0628662109375,
-0.009979248046875,
-0.004871368408203125,
-0.022369384765625,
-0.033660888671875,
-0.01201629638671875,
-0.0123138427734375,
-0.0390625,
-0.02484130859375,
0.031402587890625,
0.0068359375,
0.043365478515625,
0.05914306640625,
0.046661376953125,
0.03167724609375,
-0.01177215576171875,
0.0164947509765625,
-0.024139404296875,
-0.01166534423828125,
0.0136566162109375,
-0.04583740234375,
-0.0325927734375,
0.013946533203125,
0.055572509765625,
0.0199432373046875,
0.00312042236328125,
0.0260772705078125,
0.0182647705078125,
0.0307159423828125,
-0.0249176025390625,
0.02392578125,
-0.035369873046875,
0.0143280029296875,
-0.00925445556640625,
0.002689361572265625,
-0.017791748046875,
-0.0239105224609375,
0.01207733154296875,
-0.045074462890625,
0.029815673828125,
-0.006465911865234375,
0.10015869140625,
0.0222930908203125,
-0.011016845703125,
0.0115814208984375,
-0.0225830078125,
0.0653076171875,
-0.039337158203125,
0.0242767333984375,
0.020782470703125,
0.0038089752197265625,
0.0011796951293945312,
-0.07623291015625,
-0.04498291015625,
0.0025997161865234375,
-0.01535797119140625,
0.007904052734375,
-0.031280517578125,
0.002918243408203125,
0.027191162109375,
0.0223236083984375,
-0.0322265625,
0.01535797119140625,
-0.041900634765625,
-0.02630615234375,
0.033294677734375,
-0.00890350341796875,
0.01274871826171875,
-0.0268707275390625,
-0.04669189453125,
-0.0419921875,
-0.026519775390625,
0.028289794921875,
0.0157470703125,
0.00312042236328125,
-0.037445068359375,
0.037261962890625,
0.01386260986328125,
0.04052734375,
0.01055908203125,
-0.0268402099609375,
0.04742431640625,
-0.0158233642578125,
-0.02880859375,
-0.021759033203125,
0.08709716796875,
0.03851318359375,
0.027069091796875,
0.01197052001953125,
-0.0111083984375,
-0.00885772705078125,
-0.00977325439453125,
-0.08355712890625,
-0.03302001953125,
0.01166534423828125,
-0.03607177734375,
-0.03155517578125,
0.0160980224609375,
-0.0687255859375,
-0.01558685302734375,
-0.005680084228515625,
0.052398681640625,
-0.0308837890625,
-0.0263824462890625,
-0.01059722900390625,
-0.00934600830078125,
0.036376953125,
0.017852783203125,
-0.051239013671875,
0.01021575927734375,
0.0199432373046875,
0.0791015625,
-0.0020618438720703125,
-0.0323486328125,
-0.031494140625,
-0.0243988037109375,
-0.0297088623046875,
0.04168701171875,
-0.00356292724609375,
-0.01320648193359375,
-0.01068115234375,
0.0275421142578125,
-0.0108489990234375,
-0.0469970703125,
0.020965576171875,
-0.00873565673828125,
0.027191162109375,
-0.005100250244140625,
-0.005046844482421875,
-0.0238800048828125,
0.016845703125,
-0.03131103515625,
0.09619140625,
0.0284423828125,
-0.06695556640625,
0.035491943359375,
-0.0198516845703125,
-0.00760650634765625,
-0.0153350830078125,
-0.007381439208984375,
-0.09124755859375,
-0.00984954833984375,
0.0286407470703125,
0.04412841796875,
-0.016387939453125,
0.0017747879028320312,
-0.03985595703125,
-0.0203094482421875,
0.0230255126953125,
-0.0140380859375,
0.06964111328125,
0.0018796920776367188,
-0.03839111328125,
0.02581787109375,
-0.04913330078125,
0.009307861328125,
0.0352783203125,
-0.023162841796875,
-0.00360107421875,
-0.0408935546875,
0.012481689453125,
0.0240478515625,
0.01666259765625,
-0.033050537109375,
0.02410888671875,
-0.0109405517578125,
0.037506103515625,
0.0574951171875,
-0.0174102783203125,
0.02850341796875,
-0.021026611328125,
0.0240478515625,
0.0178680419921875,
0.0232696533203125,
-0.006877899169921875,
-0.047393798828125,
-0.06268310546875,
-0.052825927734375,
0.02362060546875,
0.04010009765625,
-0.0531005859375,
0.045379638671875,
-0.0181427001953125,
-0.04351806640625,
-0.035919189453125,
0.02166748046875,
0.04168701171875,
0.03900146484375,
0.0413818359375,
-0.0419921875,
-0.037384033203125,
-0.06439208984375,
0.005420684814453125,
-0.00931549072265625,
-0.00707244873046875,
0.032073974609375,
0.04364013671875,
-0.0153045654296875,
0.052947998046875,
-0.029693603515625,
-0.03570556640625,
-0.0159149169921875,
0.0003311634063720703,
0.01806640625,
0.048583984375,
0.0609130859375,
-0.0469970703125,
-0.030609130859375,
-0.0166778564453125,
-0.06732177734375,
0.00998687744140625,
0.002288818359375,
-0.00876617431640625,
0.0269317626953125,
0.0214691162109375,
-0.062255859375,
0.04644775390625,
0.0166015625,
-0.02703857421875,
0.023895263671875,
-0.0259552001953125,
-0.0023708343505859375,
-0.09197998046875,
0.00506591796875,
0.0309295654296875,
-0.01763916015625,
-0.0270843505859375,
-0.00691986083984375,
0.015838623046875,
0.0038471221923828125,
-0.03173828125,
0.03863525390625,
-0.045166015625,
-0.01549530029296875,
0.00579071044921875,
-0.00872039794921875,
0.00620269775390625,
0.051849365234375,
-0.001430511474609375,
0.0367431640625,
0.06640625,
-0.029754638671875,
0.043304443359375,
0.0302734375,
-0.031036376953125,
0.0419921875,
-0.05419921875,
0.00461578369140625,
-0.0008497238159179688,
0.015777587890625,
-0.08355712890625,
-0.0216827392578125,
0.031890869140625,
-0.04425048828125,
0.055206298828125,
-0.037750244140625,
-0.031585693359375,
-0.048126220703125,
-0.0419921875,
0.03228759765625,
0.06024169921875,
-0.0579833984375,
0.032135009765625,
0.01201629638671875,
0.01430511474609375,
-0.04736328125,
-0.06976318359375,
-0.017578125,
-0.035797119140625,
-0.043914794921875,
0.023223876953125,
0.01318359375,
0.01113128662109375,
0.005115509033203125,
-0.00494384765625,
0.0008540153503417969,
-0.01389312744140625,
0.031982421875,
0.035491943359375,
-0.0210113525390625,
-0.005947113037109375,
-0.023468017578125,
-0.00885772705078125,
0.006534576416015625,
-0.022186279296875,
0.036865234375,
-0.0132904052734375,
-0.0084686279296875,
-0.0633544921875,
-0.01317596435546875,
0.04718017578125,
-0.0290374755859375,
0.053863525390625,
0.09356689453125,
-0.0309906005859375,
-0.0025787353515625,
-0.031097412109375,
-0.0208740234375,
-0.037841796875,
0.048583984375,
-0.0217742919921875,
-0.035858154296875,
0.0582275390625,
0.00936126708984375,
-0.01120758056640625,
0.052154541015625,
0.03369140625,
-0.0016984939575195312,
0.0609130859375,
0.045379638671875,
0.01904296875,
0.056884765625,
-0.06732177734375,
-0.006214141845703125,
-0.0693359375,
-0.043914794921875,
-0.015838623046875,
-0.03314208984375,
-0.055694580078125,
-0.04962158203125,
0.0293121337890625,
0.01326751708984375,
-0.0175323486328125,
0.032958984375,
-0.060943603515625,
0.00452423095703125,
0.061279296875,
0.040863037109375,
-0.0203094482421875,
0.0287017822265625,
-0.0209503173828125,
-0.01117706298828125,
-0.052734375,
-0.0022525787353515625,
0.084716796875,
0.03668212890625,
0.06024169921875,
-0.00566864013671875,
0.045562744140625,
-0.0175628662109375,
0.005634307861328125,
-0.058502197265625,
0.04559326171875,
-0.0020599365234375,
-0.0224609375,
-0.0208587646484375,
-0.03167724609375,
-0.07818603515625,
0.01116180419921875,
-0.0272674560546875,
-0.06134033203125,
0.033905029296875,
0.0177459716796875,
-0.0208892822265625,
0.04937744140625,
-0.05828857421875,
0.072509765625,
-0.00647735595703125,
-0.04595947265625,
0.007411956787109375,
-0.05242919921875,
0.0215911865234375,
0.007518768310546875,
-0.0170440673828125,
0.0079345703125,
0.00913238525390625,
0.0711669921875,
-0.039947509765625,
0.06707763671875,
-0.027984619140625,
0.0157470703125,
0.0399169921875,
-0.014556884765625,
0.0377197265625,
0.00519561767578125,
0.0156707763671875,
0.017181396484375,
0.005046844482421875,
-0.0298004150390625,
-0.035125732421875,
0.0426025390625,
-0.0821533203125,
-0.035064697265625,
-0.04107666015625,
-0.03179931640625,
0.01435089111328125,
0.0162200927734375,
0.048797607421875,
0.04400634765625,
0.02288818359375,
0.033294677734375,
0.0606689453125,
-0.0267181396484375,
0.034088134765625,
-0.006435394287109375,
-0.019439697265625,
-0.0509033203125,
0.07025146484375,
0.0223846435546875,
0.01861572265625,
0.01177215576171875,
0.01336669921875,
-0.0305023193359375,
-0.03704833984375,
-0.02508544921875,
0.03167724609375,
-0.052764892578125,
-0.034576416015625,
-0.04083251953125,
-0.0271759033203125,
-0.031280517578125,
0.00921630859375,
-0.036468505859375,
-0.0243988037109375,
-0.040679931640625,
0.00580596923828125,
0.050079345703125,
0.038543701171875,
-0.0215911865234375,
0.0199432373046875,
-0.0260772705078125,
0.0212860107421875,
0.04248046875,
0.0345458984375,
-0.0197296142578125,
-0.06964111328125,
-0.01340484619140625,
-0.003376007080078125,
-0.034393310546875,
-0.0450439453125,
0.035369873046875,
0.0191497802734375,
0.035369873046875,
0.03326416015625,
-0.0151824951171875,
0.057159423828125,
-0.005039215087890625,
0.04559326171875,
0.0309295654296875,
-0.046875,
0.0450439453125,
-0.01242828369140625,
0.00803375244140625,
0.002727508544921875,
0.01806640625,
-0.00469207763671875,
0.00524139404296875,
-0.06781005859375,
-0.058929443359375,
0.061981201171875,
0.01308441162109375,
-0.003223419189453125,
0.0266265869140625,
0.04150390625,
-0.014404296875,
-0.002391815185546875,
-0.0633544921875,
-0.02490234375,
-0.036834716796875,
-0.0193634033203125,
-0.0103607177734375,
-0.018096923828125,
-0.010528564453125,
-0.049713134765625,
0.04058837890625,
-0.0107574462890625,
0.057708740234375,
0.03033447265625,
-0.002567291259765625,
-0.0180206298828125,
-0.0251312255859375,
0.035980224609375,
0.0259552001953125,
-0.02496337890625,
0.0160369873046875,
0.0136871337890625,
-0.047210693359375,
-0.004627227783203125,
0.025421142578125,
-0.003509521484375,
-0.0023212432861328125,
0.0372314453125,
0.07525634765625,
-0.0059814453125,
-0.0036563873291015625,
0.050994873046875,
-0.00414276123046875,
-0.038543701171875,
-0.0160980224609375,
0.01036834716796875,
-0.0079803466796875,
0.036773681640625,
0.021087646484375,
0.031036376953125,
-0.019744873046875,
-0.027069091796875,
0.02008056640625,
0.04541015625,
-0.040802001953125,
-0.03082275390625,
0.061981201171875,
-0.017669677734375,
-0.0027599334716796875,
0.058624267578125,
-0.0019130706787109375,
-0.032958984375,
0.07122802734375,
0.0325927734375,
0.06024169921875,
-0.0197296142578125,
0.005634307861328125,
0.057403564453125,
0.0220794677734375,
-0.01015472412109375,
0.01371002197265625,
0.0044708251953125,
-0.04949951171875,
0.00019419193267822266,
-0.042144775390625,
0.0037384033203125,
0.0258026123046875,
-0.04949951171875,
0.033477783203125,
-0.0478515625,
-0.0298614501953125,
0.0088958740234375,
0.01206207275390625,
-0.06951904296875,
0.0170440673828125,
0.00724029541015625,
0.04962158203125,
-0.059722900390625,
0.0662841796875,
0.057403564453125,
-0.044464111328125,
-0.070556640625,
-0.00826263427734375,
0.00266265869140625,
-0.07171630859375,
0.03582763671875,
0.03326416015625,
0.0084991455078125,
0.01239776611328125,
-0.05645751953125,
-0.05914306640625,
0.11358642578125,
0.03375244140625,
-0.01190185546875,
0.0142364501953125,
-0.004547119140625,
0.0275726318359375,
-0.0167694091796875,
0.024749755859375,
0.011810302734375,
0.023651123046875,
0.029693603515625,
-0.0599365234375,
0.0125885009765625,
-0.0219573974609375,
0.01282501220703125,
0.01275634765625,
-0.07342529296875,
0.0706787109375,
-0.04345703125,
-0.00824737548828125,
0.0167236328125,
0.04876708984375,
0.01470947265625,
0.007221221923828125,
0.035064697265625,
0.0609130859375,
0.041290283203125,
-0.0283050537109375,
0.062469482421875,
-0.004505157470703125,
0.054840087890625,
0.044586181640625,
0.0298614501953125,
0.0457763671875,
0.036865234375,
-0.0302276611328125,
0.0269317626953125,
0.0726318359375,
-0.0404052734375,
0.037139892578125,
0.009002685546875,
0.00269317626953125,
-0.025909423828125,
0.001476287841796875,
-0.03692626953125,
0.038543701171875,
0.0171966552734375,
-0.04669189453125,
-0.00008803606033325195,
0.0028743743896484375,
0.0013818740844726562,
-0.022705078125,
-0.01323699951171875,
0.04132080078125,
0.006805419921875,
-0.035369873046875,
0.06640625,
-0.0034809112548828125,
0.055267333984375,
-0.0313720703125,
0.0035495758056640625,
-0.0223388671875,
0.0361328125,
-0.0251617431640625,
-0.06695556640625,
0.0086669921875,
-0.0181884765625,
-0.005153656005859375,
0.0023708343505859375,
0.049102783203125,
-0.0308685302734375,
-0.042816162109375,
0.0167388916015625,
0.020233154296875,
0.01611328125,
0.005641937255859375,
-0.07281494140625,
0.0017213821411132812,
0.002155303955078125,
-0.048095703125,
0.0281829833984375,
0.040679931640625,
0.0024814605712890625,
0.05255126953125,
0.04327392578125,
-0.00970458984375,
0.01177215576171875,
-0.01490020751953125,
0.0716552734375,
-0.0313720703125,
-0.0229339599609375,
-0.06072998046875,
0.043365478515625,
-0.005504608154296875,
-0.040985107421875,
0.0455322265625,
0.03900146484375,
0.05938720703125,
-0.00023365020751953125,
0.031951904296875,
-0.011077880859375,
0.00005310773849487305,
-0.031585693359375,
0.04693603515625,
-0.048980712890625,
0.001461029052734375,
-0.021728515625,
-0.0693359375,
-0.036712646484375,
0.06451416015625,
-0.0179290771484375,
0.0287017822265625,
0.038787841796875,
0.0758056640625,
-0.0276336669921875,
-0.0487060546875,
0.01476287841796875,
0.0233154296875,
0.0115203857421875,
0.03436279296875,
0.03265380859375,
-0.05670166015625,
0.05096435546875,
-0.040557861328125,
-0.017181396484375,
-0.01190185546875,
-0.042724609375,
-0.09014892578125,
-0.059295654296875,
-0.045867919921875,
-0.05828857421875,
-0.01258087158203125,
0.060455322265625,
0.07049560546875,
-0.0478515625,
0.003734588623046875,
-0.00444793701171875,
0.01082611083984375,
-0.02288818359375,
-0.01548004150390625,
0.055572509765625,
-0.0008788108825683594,
-0.053863525390625,
-0.024658203125,
0.004291534423828125,
0.035369873046875,
-0.01361846923828125,
-0.01556396484375,
-0.009796142578125,
-0.017333984375,
0.0186004638671875,
0.02899169921875,
-0.06207275390625,
-0.0188446044921875,
-0.010650634765625,
-0.0110015869140625,
0.04132080078125,
0.01458740234375,
-0.0531005859375,
0.038726806640625,
0.03668212890625,
0.0142364501953125,
0.06024169921875,
-0.01374053955078125,
0.0007166862487792969,
-0.055084228515625,
0.041473388671875,
-0.015838623046875,
0.03704833984375,
0.041290283203125,
-0.0172882080078125,
0.035491943359375,
0.050628662109375,
-0.03265380859375,
-0.06396484375,
-0.003650665283203125,
-0.088134765625,
-0.0006933212280273438,
0.08026123046875,
-0.03277587890625,
-0.037811279296875,
0.03082275390625,
-0.0091552734375,
0.0478515625,
-0.006168365478515625,
0.039703369140625,
0.01396942138671875,
0.00862884521484375,
-0.061370849609375,
-0.0301666259765625,
0.04412841796875,
0.005855560302734375,
-0.04052734375,
-0.02667236328125,
0.00881195068359375,
0.0479736328125,
0.0228118896484375,
0.0203857421875,
-0.0120086669921875,
0.018280029296875,
0.007160186767578125,
0.029205322265625,
-0.02337646484375,
-0.0130157470703125,
-0.0276641845703125,
-0.0034275054931640625,
-0.0032329559326171875,
-0.043548583984375
]
] |
setu4993/LaBSE | 2023-10-19T06:23:16.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"feature-extraction",
"sentence_embedding",
"multilingual",
"google",
"sentence-similarity",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"bo",
"bs",
"ca",
"ceb",
"co",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"haw",
"he",
"hi",
"hmn",
"hr",
"ht",
"hu",
"hy",
"id",
"ig",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lb",
"lo",
"lt",
"lv",
"mg",
"mi",
"mk",
"ml",
"mn",
"mr",
"ms",
"mt",
"my",
"ne",
"nl",
"no",
"ny",
"or",
"pa",
"pl",
"pt",
"ro",
"ru",
"rw",
"si",
"sk",
"sl",
"sm",
"sn",
"so",
"sq",
"sr",
"st",
"su",
"sv",
"sw",
"ta",
"te",
"tg",
"th",
"tk",
"tl",
"tr",
"tt",
"ug",
"uk",
"ur",
"uz",
"vi",
"wo",
"xh",
"yi",
"yo",
"zh",
"zu",
"dataset:CommonCrawl",
"dataset:Wikipedia",
"arxiv:2007.01852",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | setu4993 | null | null | setu4993/LaBSE | 43 | 13,875 | transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
language:
- af
- am
- ar
- as
- az
- be
- bg
- bn
- bo
- bs
- ca
- ceb
- co
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- haw
- he
- hi
- hmn
- hr
- ht
- hu
- hy
- id
- ig
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lb
- lo
- lt
- lv
- mg
- mi
- mk
- ml
- mn
- mr
- ms
- mt
- my
- ne
- nl
- no
- ny
- or
- pa
- pl
- pt
- ro
- ru
- rw
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- st
- su
- sv
- sw
- ta
- te
- tg
- th
- tk
- tl
- tr
- tt
- ug
- uk
- ur
- uz
- vi
- wo
- xh
- yi
- yo
- zh
- zu
tags:
- bert
- sentence_embedding
- multilingual
- google
- sentence-similarity
license: apache-2.0
datasets:
- CommonCrawl
- Wikipedia
---
# LaBSE
## Model description
Language-agnostic BERT Sentence Encoder (LaBSE) is a BERT-based model trained for sentence embedding for 109 languages. The pre-training process combines masked language modeling with translation language modeling. The model is useful for getting multilingual sentence embeddings and for bi-text retrieval.
- Model: [HuggingFace's model hub](https://huggingface.co/setu4993/LaBSE).
- Paper: [arXiv](https://arxiv.org/abs/2007.01852).
- Original model: [TensorFlow Hub](https://tfhub.dev/google/LaBSE/2).
- Blog post: [Google AI Blog](https://ai.googleblog.com/2020/08/language-agnostic-bert-sentence.html).
- Conversion from TensorFlow to PyTorch: [GitHub](https://github.com/setu4993/convert-labse-tf-pt).
This model is migrated from the v2 model on TF Hub, which uses dict-based input. The embeddings produced by both versions of the model are [equivalent](https://github.com/setu4993/convert-labse-tf-pt/blob/ec3a019159a54ed6493181a64486c2808c01f216/tests/test_conversion.py#L31).
## Usage
Using the model:
```python
import torch
from transformers import BertModel, BertTokenizerFast
tokenizer = BertTokenizerFast.from_pretrained("setu4993/LaBSE")
model = BertModel.from_pretrained("setu4993/LaBSE")
model = model.eval()
english_sentences = [
"dog",
"Puppies are nice.",
"I enjoy taking long walks along the beach with my dog.",
]
english_inputs = tokenizer(english_sentences, return_tensors="pt", padding=True)
with torch.no_grad():
english_outputs = model(**english_inputs)
```
To get the sentence embeddings, use the pooler output:
```python
english_embeddings = english_outputs.pooler_output
```
Output for other languages:
```python
italian_sentences = [
"cane",
"I cuccioli sono carini.",
"Mi piace fare lunghe passeggiate lungo la spiaggia con il mio cane.",
]
japanese_sentences = ["犬", "子犬はいいです", "私は犬と一緒にビーチを散歩するのが好きです"]
italian_inputs = tokenizer(italian_sentences, return_tensors="pt", padding=True)
japanese_inputs = tokenizer(japanese_sentences, return_tensors="pt", padding=True)
with torch.no_grad():
italian_outputs = model(**italian_inputs)
japanese_outputs = model(**japanese_inputs)
italian_embeddings = italian_outputs.pooler_output
japanese_embeddings = japanese_outputs.pooler_output
```
For similarity between sentences, L2-normalizing the embeddings before computing the similarity is recommended:
```python
import torch.nn.functional as F
def similarity(embeddings_1, embeddings_2):
normalized_embeddings_1 = F.normalize(embeddings_1, p=2)
normalized_embeddings_2 = F.normalize(embeddings_2, p=2)
return torch.matmul(
normalized_embeddings_1, normalized_embeddings_2.transpose(0, 1)
)
print(similarity(english_embeddings, italian_embeddings))
print(similarity(english_embeddings, japanese_embeddings))
print(similarity(italian_embeddings, japanese_embeddings))
```
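The computation above amounts to cosine similarity. As a minimal sketch, the same idea can be expressed in plain Python for a single pair of vectors, independent of PyTorch (the example vectors are illustrative, not real embeddings):

```python
import math

def cosine_similarity(a, b):
    # L2-normalize both vectors, then take the dot product.
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return sum(x * y for x, y in zip(a, b)) / (norm_a * norm_b)

# Vectors pointing in the same direction give 1.0; orthogonal vectors give 0.0.
print(cosine_similarity([1.0, 2.0], [2.0, 4.0]))  # → 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # → 0.0
```

Because the embeddings are normalized first, the matrix product in `similarity` above yields exactly these cosine scores for every sentence pair at once.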
## Details
Details about data, training, evaluation and performance metrics are available in the [original paper](https://arxiv.org/abs/2007.01852).
### BibTeX entry and citation info
```bibtex
@misc{feng2020languageagnostic,
title={Language-agnostic BERT Sentence Embedding},
author={Fangxiaoyu Feng and Yinfei Yang and Daniel Cer and Naveen Arivazhagan and Wei Wang},
year={2020},
eprint={2007.01852},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 4,354 | [
[
-0.01412200927734375,
-0.05810546875,
0.02783203125,
0.0225372314453125,
-0.01551055908203125,
-0.0141754150390625,
-0.0294189453125,
-0.01316070556640625,
0.0168304443359375,
0.00583648681640625,
-0.027099609375,
-0.051544189453125,
-0.046478271484375,
0.00782012939453125,
-0.0212249755859375,
0.0693359375,
-0.016448974609375,
0.0307159423828125,
0.0012693405151367188,
-0.0147552490234375,
-0.00800323486328125,
-0.0309600830078125,
-0.042999267578125,
-0.0262908935546875,
0.0265045166015625,
-0.00875091552734375,
0.032196044921875,
0.02679443359375,
0.014312744140625,
0.0284271240234375,
0.0087738037109375,
0.00524139404296875,
-0.031646728515625,
-0.0221710205078125,
0.0082244873046875,
-0.0335693359375,
-0.0098724365234375,
0.006175994873046875,
0.04107666015625,
0.040283203125,
0.005664825439453125,
-0.00988006591796875,
0.006702423095703125,
0.0305328369140625,
-0.027008056640625,
0.01300048828125,
-0.0311431884765625,
0.011627197265625,
0.0033016204833984375,
0.002880096435546875,
-0.043853759765625,
-0.040985107421875,
0.009796142578125,
-0.037689208984375,
0.026031494140625,
-0.006114959716796875,
0.09954833984375,
0.005413055419921875,
-0.016021728515625,
-0.03143310546875,
-0.01995849609375,
0.058074951171875,
-0.07354736328125,
0.037628173828125,
0.019134521484375,
-0.0115814208984375,
-0.0200042724609375,
-0.07269287109375,
-0.048095703125,
-0.0211334228515625,
-0.023193359375,
0.0175628662109375,
-0.010650634765625,
0.0003986358642578125,
0.003997802734375,
0.0281829833984375,
-0.0562744140625,
0.0012502670288085938,
-0.0345458984375,
-0.01534271240234375,
0.044342041015625,
-0.0164031982421875,
0.0323486328125,
-0.0157318115234375,
-0.0268707275390625,
-0.0254974365234375,
-0.037353515625,
0.00467681884765625,
0.0219573974609375,
0.0025806427001953125,
-0.0217742919921875,
0.03997802734375,
0.0034084320068359375,
0.040252685546875,
-0.00455474853515625,
0.01261138916015625,
0.0509033203125,
-0.01318359375,
-0.019378662109375,
0.004016876220703125,
0.08575439453125,
0.020904541015625,
0.0258026123046875,
-0.0298004150390625,
-0.0027904510498046875,
-0.002147674560546875,
-0.0028285980224609375,
-0.06781005859375,
-0.02874755859375,
0.0096282958984375,
-0.0284271240234375,
-0.018707275390625,
0.0182647705078125,
-0.034576416015625,
-0.0004572868347167969,
0.0198974609375,
0.056427001953125,
-0.062103271484375,
0.0007610321044921875,
0.0182647705078125,
-0.0216522216796875,
0.0236968994140625,
-0.00789642333984375,
-0.058135986328125,
0.0154876708984375,
0.0301055908203125,
0.07537841796875,
0.01288604736328125,
-0.041748046875,
-0.026153564453125,
0.000812530517578125,
-0.01348114013671875,
0.02557373046875,
-0.0242767333984375,
-0.024261474609375,
0.0048828125,
0.022247314453125,
-0.024261474609375,
-0.0230560302734375,
0.0301513671875,
-0.0289154052734375,
0.042266845703125,
-0.0071868896484375,
-0.07110595703125,
-0.0144500732421875,
0.015655517578125,
-0.04083251953125,
0.0941162109375,
0.00562286376953125,
-0.058441162109375,
0.01325225830078125,
-0.047943115234375,
-0.02545166015625,
-0.00803375244140625,
-0.013153076171875,
-0.03814697265625,
0.006290435791015625,
0.031982421875,
0.04498291015625,
-0.00006723403930664062,
0.028656005859375,
-0.01763916015625,
-0.03021240234375,
0.026824951171875,
-0.0213623046875,
0.0899658203125,
0.0128631591796875,
-0.02734375,
0.007175445556640625,
-0.037353515625,
0.002681732177734375,
0.01290130615234375,
-0.0217742919921875,
-0.01264190673828125,
0.0025920867919921875,
0.0207977294921875,
0.00942230224609375,
0.020751953125,
-0.06298828125,
0.00684356689453125,
-0.046112060546875,
0.066650390625,
0.050689697265625,
-0.01233673095703125,
0.0207061767578125,
-0.022979736328125,
0.0036907196044921875,
-0.00231170654296875,
-0.00760650634765625,
0.0017232894897460938,
-0.038604736328125,
-0.059234619140625,
-0.033935546875,
0.035125732421875,
0.043121337890625,
-0.053192138671875,
0.06378173828125,
-0.03326416015625,
-0.040985107421875,
-0.06378173828125,
-0.00036334991455078125,
0.0263214111328125,
0.024688720703125,
0.0296630859375,
-0.01678466796875,
-0.035980224609375,
-0.07183837890625,
-0.0118865966796875,
-0.00501251220703125,
-0.00933837890625,
-0.0003008842468261719,
0.056488037109375,
-0.0183563232421875,
0.05621337890625,
-0.038787841796875,
-0.01568603515625,
-0.0176239013671875,
0.01052093505859375,
0.02398681640625,
0.0599365234375,
0.054779052734375,
-0.0411376953125,
-0.047332763671875,
-0.007137298583984375,
-0.0635986328125,
0.01311492919921875,
-0.00640106201171875,
-0.01247406005859375,
0.0224761962890625,
0.04486083984375,
-0.042510986328125,
0.024658203125,
0.037841796875,
-0.0276031494140625,
0.0234832763671875,
-0.03094482421875,
0.0145721435546875,
-0.10693359375,
0.00978851318359375,
0.0019702911376953125,
-0.01265716552734375,
-0.041534423828125,
0.00885009765625,
0.007755279541015625,
-0.00702667236328125,
-0.0352783203125,
0.048614501953125,
-0.0316162109375,
0.011444091796875,
0.01212310791015625,
0.0187835693359375,
0.004852294921875,
0.049835205078125,
0.007579803466796875,
0.063232421875,
0.04205322265625,
-0.0293426513671875,
0.022796630859375,
0.034332275390625,
-0.0445556640625,
0.00621795654296875,
-0.06439208984375,
0.00885009765625,
0.008636474609375,
0.01303863525390625,
-0.07879638671875,
-0.00449371337890625,
0.01123809814453125,
-0.058441162109375,
0.0127410888671875,
0.0191497802734375,
-0.0596923828125,
-0.0263214111328125,
-0.0426025390625,
0.0124969482421875,
0.04571533203125,
-0.04437255859375,
0.0285186767578125,
-0.0057830810546875,
0.0008783340454101562,
-0.046478271484375,
-0.08551025390625,
-0.01232147216796875,
-0.007755279541015625,
-0.04833984375,
0.039031982421875,
-0.006519317626953125,
0.012420654296875,
0.0214080810546875,
0.014923095703125,
-0.006427764892578125,
-0.00464630126953125,
-0.0006470680236816406,
0.0213470458984375,
-0.0134429931640625,
0.020294189453125,
0.004177093505859375,
0.01092529296875,
0.0034236907958984375,
-0.0120849609375,
0.0616455078125,
-0.02252197265625,
-0.005100250244140625,
-0.03436279296875,
0.0216827392578125,
0.0238800048828125,
-0.01070404052734375,
0.07879638671875,
0.08984375,
-0.0355224609375,
0.009307861328125,
-0.0455322265625,
-0.02081298828125,
-0.0391845703125,
0.053436279296875,
-0.03509521484375,
-0.06610107421875,
0.04718017578125,
0.0175628662109375,
0.004108428955078125,
0.0384521484375,
0.04425048828125,
-0.01375579833984375,
0.062103271484375,
0.04010009765625,
-0.01267242431640625,
0.04168701171875,
-0.044647216796875,
0.03253173828125,
-0.06201171875,
-0.0221405029296875,
-0.029388427734375,
-0.0216217041015625,
-0.058929443359375,
-0.030914306640625,
0.0227813720703125,
0.0079498291015625,
-0.02813720703125,
0.0299072265625,
-0.0310211181640625,
0.024444580078125,
0.048553466796875,
0.0260009765625,
-0.00945281982421875,
0.0175323486328125,
-0.0217742919921875,
-0.0130615234375,
-0.0548095703125,
-0.037872314453125,
0.076904296875,
0.024993896484375,
0.0301971435546875,
0.003932952880859375,
0.064453125,
0.00970458984375,
0.0068511962890625,
-0.056610107421875,
0.04229736328125,
-0.0293121337890625,
-0.043243408203125,
-0.0163116455078125,
-0.0306854248046875,
-0.07684326171875,
0.01849365234375,
-0.010650634765625,
-0.07196044921875,
0.0020809173583984375,
-0.0204010009765625,
-0.00746917724609375,
0.034820556640625,
-0.06451416015625,
0.07696533203125,
-0.00726318359375,
-0.031402587890625,
-0.017822265625,
-0.042510986328125,
0.00616455078125,
0.0205535888671875,
0.009674072265625,
0.0012426376342773438,
0.004878997802734375,
0.07757568359375,
-0.03680419921875,
0.057769775390625,
-0.00782012939453125,
0.009857177734375,
0.007625579833984375,
-0.0083160400390625,
0.02447509765625,
0.004360198974609375,
-0.01641845703125,
0.0146026611328125,
0.0012035369873046875,
-0.048980712890625,
-0.031402587890625,
0.06964111328125,
-0.08782958984375,
-0.03717041015625,
-0.03778076171875,
-0.04779052734375,
-0.00899505615234375,
0.023834228515625,
0.044158935546875,
0.0307159423828125,
-0.02642822265625,
0.0252227783203125,
0.042449951171875,
-0.034027099609375,
0.0517578125,
0.015960693359375,
-0.0018768310546875,
-0.037567138671875,
0.05810546875,
0.0007925033569335938,
0.01081085205078125,
0.037872314453125,
0.0019664764404296875,
-0.0242767333984375,
-0.024871826171875,
-0.0285491943359375,
0.04150390625,
-0.046905517578125,
-0.00012755393981933594,
-0.0584716796875,
-0.03179931640625,
-0.0477294921875,
-0.04168701171875,
-0.0221099853515625,
-0.024749755859375,
-0.0285186767578125,
-0.0199127197265625,
0.035919189453125,
0.0391845703125,
-0.0032634735107421875,
0.0220489501953125,
-0.044677734375,
0.004100799560546875,
0.00868988037109375,
0.02081298828125,
-0.00789642333984375,
-0.03924560546875,
-0.03485107421875,
0.00595855712890625,
-0.022613525390625,
-0.07354736328125,
0.045654296875,
0.017120361328125,
0.046722412109375,
0.0121002197265625,
-0.005184173583984375,
0.043121337890625,
-0.034912109375,
0.072265625,
0.0231475830078125,
-0.0802001953125,
0.036468505859375,
-0.0002162456512451172,
0.016143798828125,
0.03875732421875,
0.035797119140625,
-0.0308685302734375,
-0.0301055908203125,
-0.050811767578125,
-0.06781005859375,
0.0517578125,
0.0269012451171875,
0.0243377685546875,
-0.0178070068359375,
0.0162353515625,
0.00408935546875,
0.00749969482421875,
-0.0816650390625,
-0.029754638671875,
-0.021148681640625,
-0.036773681640625,
-0.020294189453125,
-0.0020008087158203125,
-0.0035648345947265625,
-0.035247802734375,
0.065185546875,
0.01220703125,
0.05487060546875,
0.0333251953125,
-0.0268707275390625,
0.023956298828125,
0.018890380859375,
0.0251617431640625,
0.020263671875,
-0.032012939453125,
0.004146575927734375,
0.0229949951171875,
-0.03643798828125,
0.003509521484375,
0.02484130859375,
-0.0107269287109375,
0.0272216796875,
0.033905029296875,
0.0692138671875,
0.01377105712890625,
-0.043121337890625,
0.045074462890625,
0.005664825439453125,
-0.0302276611328125,
-0.0281524658203125,
-0.0108795166015625,
0.01335906982421875,
0.01837158203125,
0.0183258056640625,
-0.0114898681640625,
0.00782012939453125,
-0.04840087890625,
0.01012420654296875,
0.0207672119140625,
-0.02618408203125,
-0.0177154541015625,
0.05096435546875,
0.012786865234375,
-0.01445770263671875,
0.06427001953125,
-0.023529052734375,
-0.04766845703125,
0.049560546875,
0.053131103515625,
0.074951171875,
-0.005489349365234375,
0.023590087890625,
0.055755615234375,
0.0318603515625,
-0.00006365776062011719,
0.01441192626953125,
0.0093536376953125,
-0.06439208984375,
-0.01358795166015625,
-0.03472900390625,
0.006748199462890625,
0.01329803466796875,
-0.05548095703125,
0.020721435546875,
-0.0201873779296875,
-0.004302978515625,
-0.005352020263671875,
0.01442718505859375,
-0.0556640625,
0.005359649658203125,
0.00209808349609375,
0.06207275390625,
-0.07537841796875,
0.08819580078125,
0.0621337890625,
-0.044342041015625,
-0.05706787109375,
-0.003688812255859375,
-0.0304107666015625,
-0.0689697265625,
0.038848876953125,
0.0266265869140625,
0.00443267822265625,
0.01245880126953125,
-0.0272369384765625,
-0.049072265625,
0.09613037109375,
0.0297393798828125,
-0.0328369140625,
-0.006748199462890625,
-0.0018854141235351562,
0.044921875,
-0.0254058837890625,
0.022216796875,
0.035858154296875,
0.022857666015625,
-0.014556884765625,
-0.060943603515625,
0.01284027099609375,
-0.041015625,
0.0159912109375,
-0.0009107589721679688,
-0.051055908203125,
0.067626953125,
-0.010406494140625,
-0.012481689453125,
0.01108551025390625,
0.049346923828125,
0.0215301513671875,
0.005584716796875,
0.015838623046875,
0.051513671875,
0.0528564453125,
-0.0201263427734375,
0.076171875,
-0.023956298828125,
0.045684814453125,
0.06512451171875,
0.01023101806640625,
0.07305908203125,
0.0445556640625,
-0.020965576171875,
0.058380126953125,
0.047210693359375,
-0.02001953125,
0.058837890625,
0.005218505859375,
-0.005634307861328125,
0.0028820037841796875,
0.00907135009765625,
-0.045867919921875,
0.01264190673828125,
0.025115966796875,
-0.04962158203125,
-0.0021800994873046875,
0.01161956787109375,
0.02557373046875,
0.0044708251953125,
0.000051856040954589844,
0.039642333984375,
-0.00313568115234375,
-0.0361328125,
0.056427001953125,
0.01239776611328125,
0.067626953125,
-0.04730224609375,
0.020965576171875,
-0.00955963134765625,
0.024139404296875,
-0.0082550048828125,
-0.05731201171875,
0.016021728515625,
0.00258636474609375,
-0.0024871826171875,
-0.005279541015625,
0.0158538818359375,
-0.04730224609375,
-0.0465087890625,
0.035369873046875,
0.032379150390625,
0.021331787109375,
0.02642822265625,
-0.07025146484375,
0.00989532470703125,
0.0185699462890625,
-0.0291900634765625,
0.0120391845703125,
0.02264404296875,
0.019622802734375,
0.03204345703125,
0.0240325927734375,
-0.0038585662841796875,
0.0197296142578125,
0.0033969879150390625,
0.0560302734375,
-0.0411376953125,
-0.035980224609375,
-0.0712890625,
0.035919189453125,
-0.011566162109375,
-0.0313720703125,
0.05194091796875,
0.05615234375,
0.08343505859375,
-0.0291595458984375,
0.0616455078125,
-0.0274810791015625,
0.003925323486328125,
-0.049591064453125,
0.05413818359375,
-0.0533447265625,
0.00034332275390625,
-0.00890350341796875,
-0.061859130859375,
-0.0243682861328125,
0.0745849609375,
-0.03460693359375,
0.00826263427734375,
0.07403564453125,
0.07049560546875,
-0.0016107559204101562,
-0.017303466796875,
0.01242828369140625,
0.027435302734375,
0.0272369384765625,
0.056610107421875,
0.0215606689453125,
-0.0653076171875,
0.040679931640625,
-0.0287017822265625,
0.0023670196533203125,
-0.007030487060546875,
-0.043731689453125,
-0.08056640625,
-0.061614990234375,
-0.037139892578125,
-0.0307464599609375,
0.01084136962890625,
0.074462890625,
0.043548583984375,
-0.0684814453125,
-0.0231170654296875,
-0.0201568603515625,
-0.01076507568359375,
-0.01580810546875,
-0.02069091796875,
0.054779052734375,
-0.029388427734375,
-0.07275390625,
0.00940704345703125,
-0.006519317626953125,
0.00125885009765625,
-0.00972747802734375,
-0.0196990966796875,
-0.046966552734375,
0.0095062255859375,
0.036834716796875,
-0.00225830078125,
-0.0687255859375,
-0.0191497802734375,
0.0049896240234375,
-0.0284423828125,
0.0023670196533203125,
0.0297698974609375,
-0.04376220703125,
0.041015625,
0.03802490234375,
0.04449462890625,
0.065673828125,
-0.035858154296875,
0.038818359375,
-0.06646728515625,
0.033050537109375,
0.006481170654296875,
0.04718017578125,
0.033905029296875,
-0.0176239013671875,
0.050079345703125,
0.0161895751953125,
-0.034759521484375,
-0.059661865234375,
-0.0025386810302734375,
-0.08013916015625,
-0.0311279296875,
0.07513427734375,
-0.04547119140625,
-0.0219573974609375,
0.0171051025390625,
-0.0111236572265625,
0.047393798828125,
-0.024169921875,
0.059478759765625,
0.07080078125,
0.00774383544921875,
-0.0150604248046875,
-0.032928466796875,
0.0179901123046875,
0.04315185546875,
-0.043914794921875,
-0.0273284912109375,
0.0157470703125,
0.038177490234375,
0.011627197265625,
0.04010009765625,
-0.006649017333984375,
0.01270294189453125,
0.00858306884765625,
0.0286407470703125,
-0.00716400146484375,
0.0188751220703125,
-0.013824462890625,
0.00017404556274414062,
-0.00992584228515625,
-0.039031982421875
]
] |
openchat/openchat_v3.2_super | 2023-10-02T03:00:40.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"arxiv:2309.11235",
"license:llama2",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | openchat | null | null | openchat/openchat_v3.2_super | 30 | 13,858 | transformers | 2023-09-04T03:06:59 | ---
license: llama2
---
# OpenChat: Advancing Open-source Language Models with Imperfect Data
<div align="center">
<img src="https://raw.githubusercontent.com/imoneoi/openchat/master/assets/logo_new.png" style="width: 65%">
</div>
OpenChat is a collection of open-source language models, optimized and fine-tuned with a strategy inspired by offline reinforcement learning. We use approximately 80k ShareGPT conversations together with a conditioning strategy and weighted loss, and despite this simple approach the models deliver outstanding performance. Our ultimate goal is to develop a high-performance, commercially viable, open-source large language model, and we are continuously making strides toward this vision.
**🤖 Ranked #1 among all open-source models on [AgentBench](https://github.com/THUDM/AgentBench)**
**🔥 Ranked #1 among 13B open-source models | 89.5% win-rate on [AlpacaEval](https://tatsu-lab.github.io/alpaca_eval/) | 7.19 score on [MT-bench](https://chat.lmsys.org/?leaderboard)**
**🕒 Exceptionally efficient padding-free fine-tuning, only requires 15 hours on 8xA100 80G**
**💲 FREE for commercial use under [Llama 2 Community License](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)**
[](https://zenodo.org/badge/latestdoi/645397533)
## <a id="models"></a> Usage
To use these models, we highly recommend installing the OpenChat package by following the [installation guide](https://github.com/imoneoi/openchat/#installation) and using the OpenChat OpenAI-compatible API server by running the serving command from the table below. The server is optimized for high-throughput deployment using [vLLM](https://github.com/vllm-project/vllm) and can run on a GPU with at least 48GB of memory, or on two consumer GPUs with tensor parallelism. To enable tensor parallelism, append `--tensor-parallel-size 2` to the serving command.
When started, the server listens at `localhost:18888` for requests and is compatible with the [OpenAI ChatCompletion API specifications](https://platform.openai.com/docs/api-reference/chat). See the example request below for reference. Additionally, you can access the [OpenChat Web UI](#web-ui) for a user-friendly experience.
To deploy the server as an online service, use `--api-keys sk-KEY1 sk-KEY2 ...` to specify allowed API keys and `--disable-log-requests --disable-log-stats --log-file openchat.log` for logging only to a file. We recommend using a [HTTPS gateway](https://fastapi.tiangolo.com/es/deployment/concepts/#security-https) in front of the server for security purposes.
<details>
<summary>Example request (click to expand)</summary>
```bash
curl http://localhost:18888/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "openchat_v3.2",
"messages": [{"role": "user", "content": "You are a large language model named OpenChat. Write a poem to describe yourself"}]
}'
```
</details>
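The same request can also be issued from Python. The sketch below uses only the standard library and assumes the server is running locally on port 18888; the helper names (`build_payload`, `chat`) are ours, not part of the OpenChat package.

```python
import json
import urllib.request

def build_payload(user_message, model="openchat_v3.2"):
    # Shape mirrors the OpenAI ChatCompletion request body shown above.
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

def chat(user_message, base_url="http://localhost:18888"):
    # Sends the request to a locally running OpenChat API server.
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(build_payload(user_message)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```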
| Model | Size | Context | Weights | Serving |
|--------------|------|---------|--------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| OpenChat 3.2 SUPER | 13B | 4096 | [Huggingface](https://huggingface.co/openchat/openchat_v3.2_super) | `python -m ochat.serving.openai_api_server --model-type openchat_v3.2 --model openchat/openchat_v3.2_super --engine-use-ray --worker-use-ray --max-num-batched-tokens 5120` |
For inference with Huggingface Transformers (slow and not recommended), follow the conversation template provided below:
<details>
<summary>Conversation templates (click to expand)</summary>
```python
# Single-turn V3.2 (SUPER)
tokenize("GPT4 User: Hello<|end_of_turn|>GPT4 Assistant:")
# Result: [1, 402, 7982, 29946, 4911, 29901, 15043, 32000, 402, 7982, 29946, 4007, 22137, 29901]
# Multi-turn V3.2 (SUPER)
tokenize("GPT4 User: Hello<|end_of_turn|>GPT4 Assistant: Hi<|end_of_turn|>GPT4 User: How are you today?<|end_of_turn|>GPT4 Assistant:")
# Result: [1, 402, 7982, 29946, 4911, 29901, 15043, 32000, 402, 7982, 29946, 4007, 22137, 29901, 6324, 32000, 402, 7982, 29946, 4911, 29901, 1128, 526, 366, 9826, 29973, 32000, 402, 7982, 29946, 4007, 22137, 29901]
```
</details>
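The template strings above can also be assembled programmatically. This is a minimal sketch; the helper name is ours, not part of the OpenChat package.

```python
def build_prompt(messages):
    """Assemble an OpenChat V3.2 (SUPER) prompt from a list of messages.

    Each message is a dict with "role" ("user" or "assistant") and "content".
    """
    role_names = {"user": "GPT4 User", "assistant": "GPT4 Assistant"}
    turns = "".join(
        f"{role_names[m['role']]}: {m['content']}<|end_of_turn|>" for m in messages
    )
    # End with the assistant prefix so the model generates the next reply.
    return turns + "GPT4 Assistant:"

prompt = build_prompt([
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi"},
    {"role": "user", "content": "How are you today?"},
])
# `prompt` matches the multi-turn template string shown above.
```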
## <a id="benchmarks"></a> Benchmarks
We have evaluated our models using two of the most popular evaluation benchmarks**, AlpacaEval and MT-bench. Here we list the top models alongside our released versions, sorted by model size in descending order. The full rankings can be found on the [MT-bench](https://chat.lmsys.org/?leaderboard) and [AlpacaEval](https://tatsu-lab.github.io/alpaca_eval/) leaderboards.
To ensure consistency, we used the same routine as ChatGPT / GPT-4 to run these benchmarks. We started the OpenAI API-compatible server and set the `openai.api_base` to `http://localhost:18888/v1` in the benchmark program.
| **Model** | **Size** | **Context** | **Dataset Size** | **💲Free** | **AlpacaEval (win rate %)** | **MT-bench (win rate adjusted %)** | **MT-bench (score)** |
|----------------------------------|----------|-------------|------------------|-----------|-----------------------------|------------------------------------|----------------------|
| | | | | | **v.s. text-davinci-003** | **v.s. ChatGPT** | |
| GPT-4 | 1.8T* | 8K | | ❌ | 95.3 | 82.5 | 8.99 |
| ChatGPT | 175B* | 4K | | ❌ | 89.4 | 50.0 | 7.94 |
| Llama-2-70B-Chat | 70B | 4K | 2.9M | ✅ | 92.7 | 60.0 | 6.86 |
| **OpenChat 3.2 SUPER** | **13B** | **4K** | **80K** | ✅ | **89.5** | **57.5** | **7.19** |
| Llama-2-13B-Chat | 13B | 4K | 2.9M | ✅ | 81.1 | 55.3 | 6.65 |
| WizardLM 1.2 | 13B | 4K | 196K | ✅ | 89.2 | 53.1 | 7.05 |
| Vicuna 1.5 | 13B | 2K | 125K | ✅ | 78.8 | 37.2 | 6.57 |
*: Estimated model size
**: The benchmark metrics represent a quantified measure of a subset of the model's capabilities. A win-rate greater than 50% does not necessarily indicate that the model is better than ChatGPT in all scenarios or for all use cases. It is essential to consider the specific tasks or applications for which the model was evaluated and compare the results accordingly.
## Limitations
**Foundation Model Limitations**
Despite its advanced capabilities, OpenChat is still bound by the limitations inherent in its foundation models. These limitations may impact the model's performance in areas such as:
- Complex reasoning
- Mathematical and arithmetic tasks
- Programming and coding challenges
**Hallucination of Non-existent Information**
OpenChat may sometimes generate information that does not exist or is not accurate, also known as "hallucination". Users should be aware of this possibility and verify any critical information obtained from the model.
## License
Our OpenChat V3 models are licensed under the [Llama 2 Community License](https://ai.meta.com/resources/models-and-libraries/llama-downloads/).
```
@misc{wang2023openchat,
title={OpenChat: Advancing Open-source Language Models with Mixed-Quality Data},
author={Guan Wang and Sijie Cheng and Xianyuan Zhan and Xiangang Li and Sen Song and Yang Liu},
year={2023},
eprint={2309.11235},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 8,503 | [
[
-0.04571533203125,
-0.0706787109375,
0.0200042724609375,
0.027130126953125,
-0.01534271240234375,
-0.00933837890625,
-0.019989013671875,
-0.042449951171875,
0.020660400390625,
0.0264129638671875,
-0.04248046875,
-0.0323486328125,
-0.033203125,
-0.0245819091796875,
-0.001003265380859375,
0.07470703125,
-0.005840301513671875,
-0.01337432861328125,
0.002155303955078125,
-0.0299835205078125,
-0.04766845703125,
-0.03497314453125,
-0.05487060546875,
-0.0188751220703125,
0.0125885009765625,
0.0253753662109375,
0.05865478515625,
0.0311279296875,
0.042755126953125,
0.0263671875,
-0.00897979736328125,
0.0216064453125,
-0.052154541015625,
-0.0047454833984375,
0.01751708984375,
-0.04388427734375,
-0.060546875,
0.0101776123046875,
0.040802001953125,
0.02349853515625,
-0.0178375244140625,
0.005199432373046875,
0.0140838623046875,
0.0380859375,
-0.040435791015625,
0.0276947021484375,
-0.032928466796875,
-0.0018634796142578125,
-0.0207672119140625,
-0.01422882080078125,
-0.0114898681640625,
-0.04266357421875,
0.00438690185546875,
-0.047760009765625,
-0.0146484375,
0.00010794401168823242,
0.097412109375,
-0.0002944469451904297,
-0.0154876708984375,
-0.01233673095703125,
-0.056549072265625,
0.045806884765625,
-0.0673828125,
0.0250396728515625,
0.03424072265625,
0.036895751953125,
-0.0207061767578125,
-0.04107666015625,
-0.038848876953125,
-0.0181884765625,
-0.00852203369140625,
0.0184783935546875,
-0.0302734375,
-0.0033550262451171875,
0.01137542724609375,
0.044769287109375,
-0.05596923828125,
0.01349639892578125,
-0.037139892578125,
-0.0103759765625,
0.032379150390625,
0.0174713134765625,
0.031494140625,
0.003231048583984375,
-0.025177001953125,
-0.018096923828125,
-0.033111572265625,
0.0236663818359375,
0.0245361328125,
0.0288848876953125,
-0.044464111328125,
0.048583984375,
-0.0264434814453125,
0.027618408203125,
-0.0128326416015625,
-0.00604248046875,
0.038116455078125,
-0.037628173828125,
-0.022216796875,
-0.01508331298828125,
0.0943603515625,
0.037078857421875,
0.01404571533203125,
0.0062103271484375,
0.0019254684448242188,
0.007686614990234375,
0.002208709716796875,
-0.0693359375,
0.002483367919921875,
0.033966064453125,
-0.04229736328125,
-0.02435302734375,
-0.004634857177734375,
-0.05804443359375,
-0.00789642333984375,
0.007671356201171875,
0.022125244140625,
-0.055694580078125,
-0.03662109375,
-0.0013017654418945312,
-0.0020809173583984375,
0.0212860107421875,
0.0279998779296875,
-0.055572509765625,
0.041351318359375,
0.043487548828125,
0.097412109375,
0.0032176971435546875,
-0.0239105224609375,
-0.01181793212890625,
-0.0212249755859375,
-0.034088134765625,
0.0372314453125,
-0.01548004150390625,
-0.01143646240234375,
-0.016357421875,
-0.007732391357421875,
-0.0118408203125,
-0.0216217041015625,
0.042144775390625,
-0.0178985595703125,
0.0272216796875,
-0.01751708984375,
-0.03131103515625,
-0.00244140625,
0.0216217041015625,
-0.04473876953125,
0.0740966796875,
-0.0020084381103515625,
-0.0517578125,
-0.0005970001220703125,
-0.07745361328125,
-0.00688934326171875,
-0.00902557373046875,
-0.002483367919921875,
-0.0277862548828125,
-0.01197052001953125,
0.0232086181640625,
0.02703857421875,
-0.038726806640625,
-0.013641357421875,
-0.030059814453125,
-0.0303955078125,
0.0186614990234375,
-0.0261383056640625,
0.06927490234375,
0.035400390625,
-0.0288238525390625,
0.012542724609375,
-0.043792724609375,
0.0244140625,
0.043487548828125,
-0.0137786865234375,
-0.009124755859375,
-0.00553131103515625,
-0.02252197265625,
0.00522613525390625,
0.013275146484375,
-0.04827880859375,
0.0122222900390625,
-0.044921875,
0.052459716796875,
0.050567626953125,
-0.00311279296875,
0.0293426513671875,
-0.043670654296875,
0.0231781005859375,
0.01274871826171875,
0.03466796875,
-0.027557373046875,
-0.0640869140625,
-0.06396484375,
-0.0288848876953125,
0.017669677734375,
0.049591064453125,
-0.045013427734375,
0.043975830078125,
-0.02423095703125,
-0.06695556640625,
-0.06268310546875,
-0.009063720703125,
0.036376953125,
0.01399993896484375,
0.0169525146484375,
-0.037628173828125,
-0.0252532958984375,
-0.060333251953125,
-0.0045318603515625,
-0.0306396484375,
0.014007568359375,
0.04656982421875,
0.0372314453125,
-0.0019626617431640625,
0.06744384765625,
-0.0469970703125,
-0.0247344970703125,
-0.011260986328125,
-0.00893402099609375,
0.018280029296875,
0.03924560546875,
0.06414794921875,
-0.052337646484375,
-0.044952392578125,
0.01078033447265625,
-0.0621337890625,
0.015838623046875,
0.0177459716796875,
-0.0267333984375,
0.03265380859375,
0.009307861328125,
-0.06890869140625,
0.04852294921875,
0.043304443359375,
-0.02569580078125,
0.036865234375,
-0.0130462646484375,
0.0255126953125,
-0.06817626953125,
0.0129852294921875,
-0.0054473876953125,
-0.00507354736328125,
-0.049041748046875,
0.01064300537109375,
-0.00437164306640625,
-0.00812530517578125,
-0.0276947021484375,
0.05767822265625,
-0.0221405029296875,
0.004894256591796875,
0.00797271728515625,
0.0154876708984375,
-0.015228271484375,
0.060516357421875,
-0.0192108154296875,
0.05120849609375,
0.032501220703125,
-0.035125732421875,
0.031341552734375,
0.014373779296875,
-0.034423828125,
0.0301055908203125,
-0.06500244140625,
0.00945281982421875,
0.01270294189453125,
0.0277252197265625,
-0.09259033203125,
-0.010833740234375,
0.049652099609375,
-0.054107666015625,
0.0040435791015625,
-0.006832122802734375,
-0.033905029296875,
-0.03857421875,
-0.0255584716796875,
0.0177154541015625,
0.051513671875,
-0.0273895263671875,
0.0263214111328125,
0.019195556640625,
0.0007224082946777344,
-0.0406494140625,
-0.051971435546875,
-0.005580902099609375,
-0.01094818115234375,
-0.06414794921875,
0.0159912109375,
-0.0144195556640625,
-0.008453369140625,
0.0009222030639648438,
0.013397216796875,
-0.006458282470703125,
-0.0061492919921875,
0.0292510986328125,
0.0236663818359375,
-0.024627685546875,
-0.011688232421875,
-0.0256500244140625,
-0.0068359375,
-0.0196533203125,
-0.0016574859619140625,
0.067626953125,
-0.034423828125,
-0.035980224609375,
-0.044281005859375,
0.00855255126953125,
0.0592041015625,
-0.037139892578125,
0.07781982421875,
0.03997802734375,
-0.0194091796875,
0.016754150390625,
-0.049346923828125,
-0.00968170166015625,
-0.036041259765625,
0.021759033203125,
-0.035675048828125,
-0.06268310546875,
0.0635986328125,
0.027984619140625,
0.0325927734375,
0.034088134765625,
0.046905517578125,
0.00522613525390625,
0.08026123046875,
0.02606201171875,
-0.0157470703125,
0.034271240234375,
-0.0263214111328125,
0.019989013671875,
-0.054107666015625,
-0.0144805908203125,
-0.0455322265625,
-0.0178680419921875,
-0.06402587890625,
-0.027435302734375,
0.0183868408203125,
0.00154876708984375,
-0.034210205078125,
0.0283660888671875,
-0.043487548828125,
0.0169525146484375,
0.0531005859375,
0.01210784912109375,
0.0087127685546875,
-0.0095977783203125,
-0.0249786376953125,
-0.0027599334716796875,
-0.048004150390625,
-0.038909912109375,
0.07965087890625,
0.04107666015625,
0.041961669921875,
0.0174407958984375,
0.04541015625,
-0.006450653076171875,
0.0149078369140625,
-0.035980224609375,
0.044464111328125,
0.01236724853515625,
-0.051666259765625,
-0.0237579345703125,
-0.053009033203125,
-0.08148193359375,
0.037139892578125,
-0.0009889602661132812,
-0.0771484375,
-0.0030307769775390625,
0.0015468597412109375,
-0.01517486572265625,
0.03717041015625,
-0.053497314453125,
0.0794677734375,
-0.0161285400390625,
-0.01306915283203125,
-0.0011501312255859375,
-0.04400634765625,
0.048431396484375,
0.02239990234375,
0.031341552734375,
-0.0145721435546875,
-0.004436492919921875,
0.044158935546875,
-0.06243896484375,
0.06500244140625,
-0.0182647705078125,
0.0128326416015625,
0.03558349609375,
0.00818634033203125,
0.03717041015625,
-0.02435302734375,
0.004276275634765625,
0.017333984375,
0.01194000244140625,
-0.03399658203125,
-0.037506103515625,
0.0706787109375,
-0.09100341796875,
-0.02423095703125,
-0.0277557373046875,
-0.016937255859375,
-0.0011072158813476562,
0.0145721435546875,
0.0226593017578125,
0.0230712890625,
-0.028961181640625,
0.0191497802734375,
0.0283050537109375,
-0.038970947265625,
0.030975341796875,
0.03472900390625,
-0.0303497314453125,
-0.0433349609375,
0.06640625,
0.0216522216796875,
0.0338134765625,
0.01354217529296875,
0.0102386474609375,
-0.01318359375,
-0.02642822265625,
-0.03125,
0.02349853515625,
-0.03277587890625,
-0.00725555419921875,
-0.053436279296875,
-0.03387451171875,
-0.0396728515625,
0.0184326171875,
-0.05194091796875,
-0.026641845703125,
-0.0122222900390625,
0.00865936279296875,
0.03704833984375,
0.041351318359375,
0.0080108642578125,
0.028167724609375,
-0.05328369140625,
0.014007568359375,
0.00965118408203125,
0.055938720703125,
0.0210723876953125,
-0.035552978515625,
-0.0050811767578125,
0.035980224609375,
-0.04058837890625,
-0.05340576171875,
0.0214385986328125,
0.0104522705078125,
0.04833984375,
0.028778076171875,
0.01187896728515625,
0.0587158203125,
-0.0270233154296875,
0.0689697265625,
0.01776123046875,
-0.0498046875,
0.048858642578125,
-0.0301055908203125,
0.01509857177734375,
0.0229034423828125,
0.036865234375,
-0.05023193359375,
-0.0328369140625,
-0.05828857421875,
-0.05401611328125,
0.07928466796875,
0.0477294921875,
0.01253509521484375,
-0.00008040666580200195,
0.023468017578125,
-0.0156402587890625,
0.0080108642578125,
-0.058685302734375,
-0.0283660888671875,
-0.023956298828125,
-0.0031642913818359375,
-0.004425048828125,
-0.0025482177734375,
-0.0008544921875,
-0.014404296875,
0.046722412109375,
0.00820159912109375,
0.048126220703125,
0.0018434524536132812,
0.01253509521484375,
-0.01433563232421875,
0.009918212890625,
0.0599365234375,
0.043853759765625,
-0.0244598388671875,
-0.0240631103515625,
0.0084075927734375,
-0.032501220703125,
-0.007671356201171875,
0.0032100677490234375,
-0.0007729530334472656,
-0.01373291015625,
0.0225677490234375,
0.0877685546875,
0.0121307373046875,
-0.042266845703125,
0.049072265625,
-0.0248260498046875,
-0.00960540771484375,
-0.015106201171875,
0.0152740478515625,
0.0209808349609375,
0.0302734375,
0.0164031982421875,
-0.004299163818359375,
-0.0202789306640625,
-0.05859375,
-0.0175323486328125,
0.03936767578125,
-0.031494140625,
-0.0251312255859375,
0.050537109375,
0.00836181640625,
-0.035186767578125,
0.0513916015625,
-0.002933502197265625,
-0.041351318359375,
0.04986572265625,
0.0220794677734375,
0.0567626953125,
-0.0335693359375,
0.0090789794921875,
0.038482666015625,
0.0318603515625,
-0.01277923583984375,
0.0222930908203125,
0.01024627685546875,
-0.04827880859375,
-0.00966644287109375,
-0.041534423828125,
-0.0308380126953125,
0.0186767578125,
-0.036376953125,
0.0270233154296875,
-0.032440185546875,
-0.02294921875,
0.0074462890625,
0.01030731201171875,
-0.051727294921875,
-0.00785064697265625,
-0.01041412353515625,
0.06646728515625,
-0.039886474609375,
0.04986572265625,
0.03594970703125,
-0.047332763671875,
-0.054229736328125,
-0.031158447265625,
0.00661468505859375,
-0.06524658203125,
0.0146026611328125,
0.01491546630859375,
0.0179901123046875,
-0.0107269287109375,
-0.04595947265625,
-0.0662841796875,
0.090087890625,
0.0177764892578125,
-0.0238800048828125,
-0.0024127960205078125,
0.0191497802734375,
0.046356201171875,
-0.019683837890625,
0.061614990234375,
0.027923583984375,
0.0300140380859375,
0.0203857421875,
-0.1146240234375,
0.0171966552734375,
-0.037506103515625,
-0.0014362335205078125,
0.0162811279296875,
-0.0831298828125,
0.08831787109375,
-0.00937652587890625,
-0.01149749755859375,
0.02166748046875,
0.03656005859375,
0.0232391357421875,
0.014892578125,
0.027923583984375,
0.05224609375,
0.039398193359375,
-0.0263214111328125,
0.08917236328125,
-0.019256591796875,
0.0213165283203125,
0.06842041015625,
0.0009145736694335938,
0.06866455078125,
0.0036869049072265625,
-0.030487060546875,
0.0242156982421875,
0.05322265625,
0.01274871826171875,
0.031951904296875,
-0.011383056640625,
0.006610870361328125,
0.00592803955078125,
0.00957489013671875,
-0.0428466796875,
0.038665771484375,
0.0226593017578125,
-0.0179290771484375,
-0.01087188720703125,
0.0016927719116210938,
0.0217437744140625,
-0.02044677734375,
-0.007843017578125,
0.063720703125,
0.00179290771484375,
-0.0650634765625,
0.06964111328125,
0.01284027099609375,
0.053497314453125,
-0.06298828125,
0.006351470947265625,
-0.0220794677734375,
0.030029296875,
-0.00931549072265625,
-0.0518798828125,
0.009307861328125,
-0.0149688720703125,
0.0208892822265625,
0.0020351409912109375,
0.043609619140625,
-0.01508331298828125,
-0.0002872943878173828,
0.03326416015625,
0.043487548828125,
0.031463623046875,
-0.01517486572265625,
-0.058502197265625,
0.03271484375,
0.006168365478515625,
-0.0401611328125,
0.037628173828125,
0.04119873046875,
-0.00583648681640625,
0.060333251953125,
0.051666259765625,
0.000046312808990478516,
0.0009746551513671875,
-0.014129638671875,
0.0849609375,
-0.0445556640625,
-0.04833984375,
-0.07098388671875,
0.038909912109375,
-0.00946807861328125,
-0.039642333984375,
0.057342529296875,
0.0438232421875,
0.04827880859375,
0.0219879150390625,
0.0472412109375,
-0.02520751953125,
0.041595458984375,
-0.016510009765625,
0.061187744140625,
-0.046844482421875,
0.00946044921875,
-0.0269622802734375,
-0.0665283203125,
0.004085540771484375,
0.043426513671875,
-0.0234527587890625,
0.0196380615234375,
0.03125,
0.0609130859375,
-0.0133514404296875,
0.024261474609375,
0.0079803466796875,
0.0302734375,
0.04180908203125,
0.0548095703125,
0.057464599609375,
-0.049774169921875,
0.0653076171875,
-0.0293426513671875,
-0.040008544921875,
-0.0259552001953125,
-0.039398193359375,
-0.07427978515625,
-0.056976318359375,
-0.01568603515625,
-0.0258026123046875,
0.006702423095703125,
0.06884765625,
0.055206298828125,
-0.03955078125,
-0.03668212890625,
0.0087127685546875,
0.0007233619689941406,
-0.0194854736328125,
-0.0220794677734375,
0.0163116455078125,
-0.022705078125,
-0.058319091796875,
0.0093841552734375,
-0.003330230712890625,
0.0158843994140625,
-0.020599365234375,
-0.0225677490234375,
-0.027496337890625,
0.0168609619140625,
0.033203125,
0.037261962890625,
-0.0548095703125,
-0.0188446044921875,
-0.004695892333984375,
-0.02569580078125,
0.022705078125,
0.026611328125,
-0.0374755859375,
0.0338134765625,
0.037353515625,
0.0102691650390625,
0.055206298828125,
-0.005382537841796875,
0.028350830078125,
-0.03924560546875,
0.0183563232421875,
0.00682830810546875,
0.01715087890625,
0.013458251953125,
-0.00780487060546875,
0.055206298828125,
0.01091766357421875,
-0.044647216796875,
-0.0787353515625,
-0.017730712890625,
-0.0673828125,
-0.019744873046875,
0.07135009765625,
-0.024444580078125,
-0.025665283203125,
0.0092010498046875,
-0.02252197265625,
0.019989013671875,
-0.043426513671875,
0.03485107421875,
0.057861328125,
-0.0142974853515625,
-0.0156707763671875,
-0.0662841796875,
0.0251312255859375,
0.00893402099609375,
-0.06451416015625,
0.007701873779296875,
0.0205078125,
0.020294189453125,
0.0232391357421875,
0.076171875,
-0.01971435546875,
0.0101470947265625,
0.0013113021850585938,
0.01351165771484375,
-0.01251220703125,
-0.0035572052001953125,
0.003726959228515625,
0.0073699951171875,
-0.0062713623046875,
-0.038726806640625
]
] |
Helsinki-NLP/opus-mt-gmw-gmw | 2023-08-16T11:38:06.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"nl",
"en",
"lb",
"af",
"de",
"fy",
"yi",
"gmw",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-gmw-gmw | 0 | 13,823 | transformers | 2022-03-02T23:29:04 | ---
language:
- nl
- en
- lb
- af
- de
- fy
- yi
- gmw
tags:
- translation
license: apache-2.0
---
### gmw-gmw
* source group: West Germanic languages
* target group: West Germanic languages
* OPUS readme: [gmw-gmw](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/gmw-gmw/README.md)
* model: transformer
* source language(s): afr ang_Latn deu eng enm_Latn frr fry gos gsw ksh ltz nds nld pdc sco stq swg yid
* target language(s): afr ang_Latn deu eng enm_Latn frr fry gos gsw ksh ltz nds nld pdc sco stq swg yid
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* a sentence initial language token is required in the form of `>>id<<` (id = valid target language ID)
* download original weights: [opus-2020-07-27.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/gmw-gmw/opus-2020-07-27.zip)
* test set translations: [opus-2020-07-27.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/gmw-gmw/opus-2020-07-27.test.txt)
* test set scores: [opus-2020-07-27.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/gmw-gmw/opus-2020-07-27.eval.txt)
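Since the model is multilingual on both the source and target side, every input sentence must begin with a `>>id<<` target-language token as noted above. A minimal sketch (the `with_target_token` helper name is ours; the commented-out lines show typical Hugging Face Transformers usage, which requires downloading the checkpoint):

```python
def with_target_token(text, target_id):
    # Prepend the sentence-initial language token, e.g. ">>nld<<" for Dutch.
    return f">>{target_id}<< {text}"

# Typical use with Hugging Face Transformers:
# from transformers import MarianMTModel, MarianTokenizer
# tok = MarianTokenizer.from_pretrained("Helsinki-NLP/opus-mt-gmw-gmw")
# model = MarianMTModel.from_pretrained("Helsinki-NLP/opus-mt-gmw-gmw")
# batch = tok([with_target_token("I am hungry.", "nld")], return_tensors="pt")
# print(tok.batch_decode(model.generate(**batch), skip_special_tokens=True))
```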
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newssyscomb2009-deueng.deu.eng | 25.3 | 0.527 |
| newssyscomb2009-engdeu.eng.deu | 19.0 | 0.502 |
| news-test2008-deueng.deu.eng | 23.7 | 0.515 |
| news-test2008-engdeu.eng.deu | 19.2 | 0.491 |
| newstest2009-deueng.deu.eng | 23.1 | 0.514 |
| newstest2009-engdeu.eng.deu | 18.6 | 0.495 |
| newstest2010-deueng.deu.eng | 25.8 | 0.545 |
| newstest2010-engdeu.eng.deu | 20.3 | 0.505 |
| newstest2011-deueng.deu.eng | 23.7 | 0.523 |
| newstest2011-engdeu.eng.deu | 18.9 | 0.490 |
| newstest2012-deueng.deu.eng | 24.4 | 0.529 |
| newstest2012-engdeu.eng.deu | 19.2 | 0.489 |
| newstest2013-deueng.deu.eng | 27.2 | 0.545 |
| newstest2013-engdeu.eng.deu | 22.4 | 0.514 |
| newstest2014-deen-deueng.deu.eng | 27.0 | 0.546 |
| newstest2015-ende-deueng.deu.eng | 28.4 | 0.552 |
| newstest2015-ende-engdeu.eng.deu | 25.3 | 0.541 |
| newstest2016-ende-deueng.deu.eng | 33.2 | 0.595 |
| newstest2016-ende-engdeu.eng.deu | 29.8 | 0.578 |
| newstest2017-ende-deueng.deu.eng | 29.0 | 0.557 |
| newstest2017-ende-engdeu.eng.deu | 23.9 | 0.534 |
| newstest2018-ende-deueng.deu.eng | 35.9 | 0.607 |
| newstest2018-ende-engdeu.eng.deu | 34.8 | 0.609 |
| newstest2019-deen-deueng.deu.eng | 32.1 | 0.579 |
| newstest2019-ende-engdeu.eng.deu | 31.0 | 0.579 |
| Tatoeba-test.afr-ang.afr.ang | 0.0 | 0.065 |
| Tatoeba-test.afr-deu.afr.deu | 46.8 | 0.668 |
| Tatoeba-test.afr-eng.afr.eng | 58.5 | 0.728 |
| Tatoeba-test.afr-enm.afr.enm | 13.4 | 0.357 |
| Tatoeba-test.afr-fry.afr.fry | 5.3 | 0.026 |
| Tatoeba-test.afr-gos.afr.gos | 3.5 | 0.228 |
| Tatoeba-test.afr-ltz.afr.ltz | 1.6 | 0.131 |
| Tatoeba-test.afr-nld.afr.nld | 55.4 | 0.715 |
| Tatoeba-test.afr-yid.afr.yid | 3.4 | 0.008 |
| Tatoeba-test.ang-afr.ang.afr | 3.1 | 0.096 |
| Tatoeba-test.ang-deu.ang.deu | 2.6 | 0.188 |
| Tatoeba-test.ang-eng.ang.eng | 5.4 | 0.211 |
| Tatoeba-test.ang-enm.ang.enm | 1.7 | 0.197 |
| Tatoeba-test.ang-gos.ang.gos | 6.6 | 0.186 |
| Tatoeba-test.ang-ltz.ang.ltz | 5.3 | 0.072 |
| Tatoeba-test.ang-yid.ang.yid | 0.9 | 0.131 |
| Tatoeba-test.deu-afr.deu.afr | 52.7 | 0.699 |
| Tatoeba-test.deu-ang.deu.ang | 0.8 | 0.133 |
| Tatoeba-test.deu-eng.deu.eng | 43.5 | 0.621 |
| Tatoeba-test.deu-enm.deu.enm | 6.9 | 0.245 |
| Tatoeba-test.deu-frr.deu.frr | 0.8 | 0.200 |
| Tatoeba-test.deu-fry.deu.fry | 15.1 | 0.367 |
| Tatoeba-test.deu-gos.deu.gos | 2.2 | 0.279 |
| Tatoeba-test.deu-gsw.deu.gsw | 1.0 | 0.176 |
| Tatoeba-test.deu-ksh.deu.ksh | 0.6 | 0.208 |
| Tatoeba-test.deu-ltz.deu.ltz | 12.1 | 0.274 |
| Tatoeba-test.deu-nds.deu.nds | 18.8 | 0.446 |
| Tatoeba-test.deu-nld.deu.nld | 48.6 | 0.669 |
| Tatoeba-test.deu-pdc.deu.pdc | 4.6 | 0.198 |
| Tatoeba-test.deu-sco.deu.sco | 12.0 | 0.340 |
| Tatoeba-test.deu-stq.deu.stq | 3.2 | 0.240 |
| Tatoeba-test.deu-swg.deu.swg | 0.5 | 0.179 |
| Tatoeba-test.deu-yid.deu.yid | 1.7 | 0.160 |
| Tatoeba-test.eng-afr.eng.afr | 55.8 | 0.730 |
| Tatoeba-test.eng-ang.eng.ang | 5.7 | 0.157 |
| Tatoeba-test.eng-deu.eng.deu | 36.7 | 0.584 |
| Tatoeba-test.eng-enm.eng.enm | 2.0 | 0.272 |
| Tatoeba-test.eng-frr.eng.frr | 6.1 | 0.246 |
| Tatoeba-test.eng-fry.eng.fry | 15.3 | 0.378 |
| Tatoeba-test.eng-gos.eng.gos | 1.2 | 0.242 |
| Tatoeba-test.eng-gsw.eng.gsw | 0.9 | 0.164 |
| Tatoeba-test.eng-ksh.eng.ksh | 0.9 | 0.170 |
| Tatoeba-test.eng-ltz.eng.ltz | 13.7 | 0.263 |
| Tatoeba-test.eng-nds.eng.nds | 17.1 | 0.410 |
| Tatoeba-test.eng-nld.eng.nld | 49.6 | 0.673 |
| Tatoeba-test.eng-pdc.eng.pdc | 5.1 | 0.218 |
| Tatoeba-test.eng-sco.eng.sco | 34.8 | 0.587 |
| Tatoeba-test.eng-stq.eng.stq | 2.1 | 0.322 |
| Tatoeba-test.eng-swg.eng.swg | 1.7 | 0.192 |
| Tatoeba-test.eng-yid.eng.yid | 1.7 | 0.173 |
| Tatoeba-test.enm-afr.enm.afr | 13.4 | 0.397 |
| Tatoeba-test.enm-ang.enm.ang | 0.7 | 0.063 |
| Tatoeba-test.enm-deu.enm.deu | 41.5 | 0.514 |
| Tatoeba-test.enm-eng.enm.eng | 21.3 | 0.483 |
| Tatoeba-test.enm-fry.enm.fry | 0.0 | 0.058 |
| Tatoeba-test.enm-gos.enm.gos | 10.7 | 0.354 |
| Tatoeba-test.enm-ksh.enm.ksh | 7.0 | 0.161 |
| Tatoeba-test.enm-nds.enm.nds | 18.6 | 0.316 |
| Tatoeba-test.enm-nld.enm.nld | 38.3 | 0.524 |
| Tatoeba-test.enm-yid.enm.yid | 0.7 | 0.128 |
| Tatoeba-test.frr-deu.frr.deu | 4.1 | 0.219 |
| Tatoeba-test.frr-eng.frr.eng | 14.1 | 0.186 |
| Tatoeba-test.frr-fry.frr.fry | 3.1 | 0.129 |
| Tatoeba-test.frr-gos.frr.gos | 3.6 | 0.226 |
| Tatoeba-test.frr-nds.frr.nds | 12.4 | 0.145 |
| Tatoeba-test.frr-nld.frr.nld | 9.8 | 0.209 |
| Tatoeba-test.frr-stq.frr.stq | 2.8 | 0.142 |
| Tatoeba-test.fry-afr.fry.afr | 0.0 | 1.000 |
| Tatoeba-test.fry-deu.fry.deu | 30.1 | 0.535 |
| Tatoeba-test.fry-eng.fry.eng | 28.0 | 0.486 |
| Tatoeba-test.fry-enm.fry.enm | 16.0 | 0.262 |
| Tatoeba-test.fry-frr.fry.frr | 5.5 | 0.160 |
| Tatoeba-test.fry-gos.fry.gos | 1.6 | 0.307 |
| Tatoeba-test.fry-ltz.fry.ltz | 30.4 | 0.438 |
| Tatoeba-test.fry-nds.fry.nds | 8.1 | 0.083 |
| Tatoeba-test.fry-nld.fry.nld | 41.4 | 0.616 |
| Tatoeba-test.fry-stq.fry.stq | 1.6 | 0.217 |
| Tatoeba-test.fry-yid.fry.yid | 1.6 | 0.159 |
| Tatoeba-test.gos-afr.gos.afr | 6.3 | 0.318 |
| Tatoeba-test.gos-ang.gos.ang | 6.2 | 0.058 |
| Tatoeba-test.gos-deu.gos.deu | 11.7 | 0.363 |
| Tatoeba-test.gos-eng.gos.eng | 14.9 | 0.322 |
| Tatoeba-test.gos-enm.gos.enm | 9.1 | 0.398 |
| Tatoeba-test.gos-frr.gos.frr | 3.3 | 0.117 |
| Tatoeba-test.gos-fry.gos.fry | 13.1 | 0.387 |
| Tatoeba-test.gos-ltz.gos.ltz | 3.1 | 0.154 |
| Tatoeba-test.gos-nds.gos.nds | 2.4 | 0.206 |
| Tatoeba-test.gos-nld.gos.nld | 13.9 | 0.395 |
| Tatoeba-test.gos-stq.gos.stq | 2.1 | 0.209 |
| Tatoeba-test.gos-yid.gos.yid | 1.7 | 0.147 |
| Tatoeba-test.gsw-deu.gsw.deu | 10.5 | 0.350 |
| Tatoeba-test.gsw-eng.gsw.eng | 10.7 | 0.299 |
| Tatoeba-test.ksh-deu.ksh.deu | 12.0 | 0.373 |
| Tatoeba-test.ksh-eng.ksh.eng | 3.2 | 0.225 |
| Tatoeba-test.ksh-enm.ksh.enm | 13.4 | 0.308 |
| Tatoeba-test.ltz-afr.ltz.afr | 37.4 | 0.525 |
| Tatoeba-test.ltz-ang.ltz.ang | 2.8 | 0.036 |
| Tatoeba-test.ltz-deu.ltz.deu | 40.3 | 0.596 |
| Tatoeba-test.ltz-eng.ltz.eng | 31.7 | 0.490 |
| Tatoeba-test.ltz-fry.ltz.fry | 36.3 | 0.658 |
| Tatoeba-test.ltz-gos.ltz.gos | 2.9 | 0.209 |
| Tatoeba-test.ltz-nld.ltz.nld | 38.8 | 0.530 |
| Tatoeba-test.ltz-stq.ltz.stq | 5.8 | 0.165 |
| Tatoeba-test.ltz-yid.ltz.yid | 1.0 | 0.159 |
| Tatoeba-test.multi.multi | 36.4 | 0.568 |
| Tatoeba-test.nds-deu.nds.deu | 35.0 | 0.573 |
| Tatoeba-test.nds-eng.nds.eng | 29.6 | 0.495 |
| Tatoeba-test.nds-enm.nds.enm | 3.7 | 0.194 |
| Tatoeba-test.nds-frr.nds.frr | 6.6 | 0.133 |
| Tatoeba-test.nds-fry.nds.fry | 4.2 | 0.087 |
| Tatoeba-test.nds-gos.nds.gos | 2.0 | 0.243 |
| Tatoeba-test.nds-nld.nds.nld | 41.4 | 0.618 |
| Tatoeba-test.nds-swg.nds.swg | 0.6 | 0.178 |
| Tatoeba-test.nds-yid.nds.yid | 8.3 | 0.238 |
| Tatoeba-test.nld-afr.nld.afr | 59.4 | 0.759 |
| Tatoeba-test.nld-deu.nld.deu | 49.9 | 0.685 |
| Tatoeba-test.nld-eng.nld.eng | 54.1 | 0.699 |
| Tatoeba-test.nld-enm.nld.enm | 5.0 | 0.250 |
| Tatoeba-test.nld-frr.nld.frr | 2.4 | 0.224 |
| Tatoeba-test.nld-fry.nld.fry | 19.4 | 0.446 |
| Tatoeba-test.nld-gos.nld.gos | 2.5 | 0.273 |
| Tatoeba-test.nld-ltz.nld.ltz | 13.8 | 0.292 |
| Tatoeba-test.nld-nds.nld.nds | 21.3 | 0.457 |
| Tatoeba-test.nld-sco.nld.sco | 14.7 | 0.423 |
| Tatoeba-test.nld-stq.nld.stq | 1.9 | 0.257 |
| Tatoeba-test.nld-swg.nld.swg | 4.2 | 0.162 |
| Tatoeba-test.nld-yid.nld.yid | 2.6 | 0.186 |
| Tatoeba-test.pdc-deu.pdc.deu | 39.7 | 0.529 |
| Tatoeba-test.pdc-eng.pdc.eng | 25.0 | 0.427 |
| Tatoeba-test.sco-deu.sco.deu | 28.4 | 0.428 |
| Tatoeba-test.sco-eng.sco.eng | 41.8 | 0.595 |
| Tatoeba-test.sco-nld.sco.nld | 36.4 | 0.565 |
| Tatoeba-test.stq-deu.stq.deu | 7.7 | 0.328 |
| Tatoeba-test.stq-eng.stq.eng | 21.1 | 0.428 |
| Tatoeba-test.stq-frr.stq.frr | 2.0 | 0.118 |
| Tatoeba-test.stq-fry.stq.fry | 6.3 | 0.255 |
| Tatoeba-test.stq-gos.stq.gos | 1.4 | 0.244 |
| Tatoeba-test.stq-ltz.stq.ltz | 4.4 | 0.204 |
| Tatoeba-test.stq-nld.stq.nld | 10.7 | 0.371 |
| Tatoeba-test.stq-yid.stq.yid | 1.4 | 0.105 |
| Tatoeba-test.swg-deu.swg.deu | 9.5 | 0.343 |
| Tatoeba-test.swg-eng.swg.eng | 15.1 | 0.306 |
| Tatoeba-test.swg-nds.swg.nds | 0.7 | 0.196 |
| Tatoeba-test.swg-nld.swg.nld | 11.6 | 0.308 |
| Tatoeba-test.swg-yid.swg.yid | 0.9 | 0.186 |
| Tatoeba-test.yid-afr.yid.afr | 100.0 | 1.000 |
| Tatoeba-test.yid-ang.yid.ang | 0.6 | 0.079 |
| Tatoeba-test.yid-deu.yid.deu | 16.7 | 0.372 |
| Tatoeba-test.yid-eng.yid.eng | 15.8 | 0.344 |
| Tatoeba-test.yid-enm.yid.enm | 1.3 | 0.166 |
| Tatoeba-test.yid-fry.yid.fry | 5.6 | 0.157 |
| Tatoeba-test.yid-gos.yid.gos | 2.2 | 0.160 |
| Tatoeba-test.yid-ltz.yid.ltz | 2.1 | 0.238 |
| Tatoeba-test.yid-nds.yid.nds | 14.4 | 0.365 |
| Tatoeba-test.yid-nld.yid.nld | 20.9 | 0.397 |
| Tatoeba-test.yid-stq.yid.stq | 3.7 | 0.165 |
| Tatoeba-test.yid-swg.yid.swg | 1.8 | 0.156 |
### System Info:
- hf_name: gmw-gmw
- source_languages: gmw
- target_languages: gmw
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/gmw-gmw/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['nl', 'en', 'lb', 'af', 'de', 'fy', 'yi', 'gmw']
- src_constituents: {'ksh', 'nld', 'eng', 'enm_Latn', 'ltz', 'stq', 'afr', 'pdc', 'deu', 'gos', 'ang_Latn', 'fry', 'gsw', 'frr', 'nds', 'yid', 'swg', 'sco'}
- tgt_constituents: {'ksh', 'nld', 'eng', 'enm_Latn', 'ltz', 'stq', 'afr', 'pdc', 'deu', 'gos', 'ang_Latn', 'fry', 'gsw', 'frr', 'nds', 'yid', 'swg', 'sco'}
- src_multilingual: True
- tgt_multilingual: True
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/gmw-gmw/opus-2020-07-27.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/gmw-gmw/opus-2020-07-27.test.txt
- src_alpha3: gmw
- tgt_alpha3: gmw
- short_pair: gmw-gmw
- chrF2_score: 0.568
- bleu: 36.4
- brevity_penalty: 1.0
- ref_len: 72534.0
- src_name: West Germanic languages
- tgt_name: West Germanic languages
- train_date: 2020-07-27
- src_alpha2: gmw
- tgt_alpha2: gmw
- prefer_old: False
- long_pair: gmw-gmw
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 11,700 | [
[
-0.06402587890625,
-0.038818359375,
0.020538330078125,
0.02606201171875,
-0.0210723876953125,
-0.00461578369140625,
0.01186370849609375,
-0.0212554931640625,
0.058380126953125,
0.0020465850830078125,
-0.0222015380859375,
-0.033477783203125,
-0.038177490234375,
0.02520751953125,
-0.0026416778564453125,
0.029205322265625,
-0.005985260009765625,
-0.006931304931640625,
0.0186920166015625,
-0.02056884765625,
-0.039825439453125,
0.01100921630859375,
-0.061614990234375,
-0.0077362060546875,
0.0206146240234375,
0.03619384765625,
0.052398681640625,
0.0246429443359375,
0.017120361328125,
0.031463623046875,
-0.01751708984375,
0.01125335693359375,
0.00598907470703125,
-0.0157318115234375,
0.004169464111328125,
-0.05096435546875,
-0.035888671875,
-0.010101318359375,
0.038360595703125,
0.053863525390625,
0.01450347900390625,
0.028167724609375,
-0.00403594970703125,
0.06781005859375,
-0.031280517578125,
0.010955810546875,
-0.0033779144287109375,
0.01020050048828125,
-0.026580810546875,
-0.026153564453125,
-0.03936767578125,
-0.07025146484375,
-0.011444091796875,
-0.039093017578125,
-0.0034847259521484375,
-0.00189971923828125,
0.09619140625,
-0.01016998291015625,
-0.0257110595703125,
-0.004161834716796875,
-0.0136260986328125,
0.055328369140625,
-0.06658935546875,
0.01348876953125,
0.035797119140625,
0.0024566650390625,
-0.0304718017578125,
-0.021881103515625,
-0.05255126953125,
0.0153045654296875,
-0.019744873046875,
0.03887939453125,
0.0027942657470703125,
-0.02093505859375,
0.018524169921875,
0.02923583984375,
-0.04302978515625,
0.00013840198516845703,
-0.0360107421875,
0.0104217529296875,
0.048095703125,
0.0244140625,
0.03271484375,
-0.0270233154296875,
-0.043182373046875,
-0.033477783203125,
-0.021942138671875,
0.036529541015625,
0.0182952880859375,
0.0215911865234375,
-0.01910400390625,
0.0477294921875,
-0.0241546630859375,
0.0404052734375,
0.0247802734375,
-0.008453369140625,
0.058807373046875,
-0.0345458984375,
-0.0389404296875,
-0.0224609375,
0.06805419921875,
0.05712890625,
-0.00894927978515625,
0.01354217529296875,
0.002841949462890625,
0.0087127685546875,
-0.0156097412109375,
-0.04718017578125,
0.0005950927734375,
0.01422882080078125,
-0.0237274169921875,
-0.01082611083984375,
0.01059722900390625,
-0.0950927734375,
0.0006732940673828125,
-0.003940582275390625,
0.027252197265625,
-0.044525146484375,
-0.034393310546875,
0.0175323486328125,
-0.00494384765625,
0.022705078125,
0.006855010986328125,
-0.042755126953125,
0.0233917236328125,
0.007801055908203125,
0.061676025390625,
-0.00942230224609375,
-0.0214996337890625,
0.01171112060546875,
0.01171112060546875,
-0.035247802734375,
0.054290771484375,
-0.023651123046875,
-0.042724609375,
-0.0192108154296875,
0.0115814208984375,
-0.0291290283203125,
-0.02008056640625,
0.031646728515625,
-0.00966644287109375,
0.02325439453125,
-0.045074462890625,
-0.0290374755859375,
-0.03009033203125,
0.0098419189453125,
-0.0489501953125,
0.09881591796875,
0.024169921875,
-0.0498046875,
0.046661376953125,
-0.044219970703125,
-0.00226593017578125,
-0.00835418701171875,
0.0156402587890625,
-0.038299560546875,
-0.005657196044921875,
0.022430419921875,
0.0312347412109375,
-0.0241546630859375,
0.01064300537109375,
-0.006771087646484375,
-0.02490234375,
0.000042498111724853516,
-0.015380859375,
0.09344482421875,
0.023101806640625,
-0.03509521484375,
-0.0152740478515625,
-0.083740234375,
0.027435302734375,
0.0185546875,
-0.033294677734375,
0.0028820037841796875,
-0.05181884765625,
-0.01226043701171875,
0.0167083740234375,
0.0189208984375,
-0.045928955078125,
0.02105712890625,
-0.0445556640625,
-0.007190704345703125,
0.07757568359375,
0.0215911865234375,
0.021759033203125,
-0.053253173828125,
0.0389404296875,
0.019287109375,
0.02606201171875,
0.0345458984375,
-0.038055419921875,
-0.037384033203125,
-0.0027923583984375,
0.0309295654296875,
0.04608154296875,
-0.0489501953125,
0.062042236328125,
-0.004058837890625,
-0.0762939453125,
-0.02752685546875,
-0.0179901123046875,
0.0292510986328125,
0.038177490234375,
0.0276336669921875,
-0.0186004638671875,
-0.037628173828125,
-0.0697021484375,
-0.01229095458984375,
0.0002703666687011719,
0.01245880126953125,
0.0092620849609375,
0.071044921875,
0.004924774169921875,
0.063720703125,
-0.05645751953125,
-0.0269775390625,
-0.0085906982421875,
-0.0157623291015625,
0.04681396484375,
0.0379638671875,
0.08013916015625,
-0.045867919921875,
-0.0712890625,
-0.0008645057678222656,
-0.0648193359375,
-0.006664276123046875,
-0.0018892288208007812,
-0.0202484130859375,
0.033111572265625,
0.0009012222290039062,
-0.0445556640625,
0.052459716796875,
0.040557861328125,
-0.053070068359375,
0.0457763671875,
-0.02093505859375,
0.030609130859375,
-0.07733154296875,
0.0072784423828125,
-0.01837158203125,
0.01064300537109375,
-0.016265869140625,
0.00628662109375,
-0.007663726806640625,
0.00783538818359375,
-0.0208587646484375,
0.03851318359375,
-0.061767578125,
-0.00946044921875,
0.0196533203125,
0.0264129638671875,
-0.0093994140625,
0.056793212890625,
-0.03363037109375,
0.08050537109375,
0.052581787109375,
-0.042388916015625,
0.037628173828125,
0.018798828125,
-0.00908660888671875,
0.03619384765625,
-0.032012939453125,
-0.0236968994140625,
-0.01197052001953125,
0.00623321533203125,
-0.0828857421875,
-0.006351470947265625,
0.0204925537109375,
-0.042633056640625,
0.0156097412109375,
0.0252838134765625,
-0.0254669189453125,
-0.04949951171875,
-0.05206298828125,
0.0133209228515625,
0.035736083984375,
-0.0226593017578125,
0.03948974609375,
0.02386474609375,
-0.00595855712890625,
-0.04205322265625,
-0.04534912109375,
-0.01039886474609375,
-0.0023555755615234375,
-0.03271484375,
0.006343841552734375,
-0.01113128662109375,
-0.01554107666015625,
0.01436614990234375,
-0.0125732421875,
-0.0168609619140625,
-0.0018062591552734375,
0.0271148681640625,
0.0011119842529296875,
-0.0164794921875,
-0.01922607421875,
-0.01058197021484375,
-0.01512908935546875,
-0.0013484954833984375,
0.021759033203125,
0.057708740234375,
-0.006946563720703125,
-0.0262298583984375,
-0.037506103515625,
0.0190887451171875,
0.039093017578125,
-0.0145416259765625,
0.078857421875,
0.018524169921875,
-0.02398681640625,
0.0134429931640625,
-0.033111572265625,
0.01340484619140625,
-0.03460693359375,
0.00689697265625,
-0.04510498046875,
-0.040008544921875,
0.061920166015625,
-0.00287628173828125,
0.0157623291015625,
0.06707763671875,
0.0287322998046875,
-0.001178741455078125,
0.048980712890625,
0.016021728515625,
0.0182647705078125,
0.033447265625,
-0.045013427734375,
0.004215240478515625,
-0.049530029296875,
-0.050018310546875,
-0.0418701171875,
-0.035980224609375,
-0.046661376953125,
-0.0174713134765625,
0.034423828125,
-0.00455474853515625,
-0.02154541015625,
0.05316162109375,
-0.05804443359375,
0.0242156982421875,
0.033355712890625,
0.034027099609375,
0.007411956787109375,
-0.035614013671875,
-0.0184326171875,
-0.01065826416015625,
-0.01392364501953125,
-0.033447265625,
0.09722900390625,
0.01441192626953125,
0.0289459228515625,
0.0287017822265625,
0.083251953125,
0.0216217041015625,
0.005008697509765625,
-0.028045654296875,
0.055633544921875,
0.024932861328125,
-0.05035400390625,
-0.051849365234375,
-0.006427764892578125,
-0.0758056640625,
0.042694091796875,
-0.0121002197265625,
-0.04010009765625,
0.0260009765625,
-0.01107025146484375,
-0.01861572265625,
0.0338134765625,
-0.061920166015625,
0.04730224609375,
-0.00492095947265625,
-0.03192138671875,
0.00853729248046875,
-0.041900634765625,
0.018218994140625,
0.0027523040771484375,
0.0126190185546875,
-0.007610321044921875,
-0.0003561973571777344,
0.06622314453125,
-0.057861328125,
0.028656005859375,
-0.045013427734375,
0.0033359527587890625,
0.05230712890625,
0.004550933837890625,
0.03924560546875,
0.00832366943359375,
-0.01177978515625,
-0.028411865234375,
-0.007617950439453125,
-0.050750732421875,
-0.01361083984375,
0.044769287109375,
-0.05609130859375,
-0.047515869140625,
-0.0628662109375,
-0.00957489013671875,
0.001667022705078125,
0.036956787109375,
0.0287628173828125,
0.040252685546875,
-0.0074615478515625,
0.026763916015625,
0.04473876953125,
-0.023712158203125,
0.059478759765625,
-0.0022716522216796875,
-0.01416015625,
-0.03765869140625,
0.05059814453125,
0.0290679931640625,
0.0179290771484375,
0.0267791748046875,
0.002227783203125,
-0.025146484375,
-0.032501220703125,
-0.024139404296875,
0.01383209228515625,
-0.032379150390625,
-0.040863037109375,
-0.0479736328125,
0.005016326904296875,
-0.036590576171875,
-0.02557373046875,
-0.02301025390625,
-0.037994384765625,
-0.004032135009765625,
-0.0361328125,
0.03759765625,
0.053436279296875,
-0.0216827392578125,
0.0196380615234375,
-0.0262908935546875,
0.020416259765625,
0.011505126953125,
0.0303802490234375,
-0.0020084381103515625,
-0.034423828125,
-0.0045928955078125,
0.001522064208984375,
-0.027740478515625,
-0.1005859375,
0.038482666015625,
-0.02056884765625,
0.0245819091796875,
0.01739501953125,
-0.00443267822265625,
0.06707763671875,
-0.0012769699096679688,
0.08123779296875,
0.0228118896484375,
-0.060577392578125,
0.048553466796875,
-0.02178955078125,
0.01153564453125,
0.05950927734375,
0.019500732421875,
-0.003726959228515625,
-0.049591064453125,
-0.052154541015625,
-0.07476806640625,
0.04205322265625,
0.01947021484375,
-0.0176849365234375,
-0.0301513671875,
-0.004009246826171875,
-0.00019562244415283203,
-0.0244598388671875,
-0.0572509765625,
-0.0657958984375,
0.00246429443359375,
-0.0097198486328125,
0.0291900634765625,
-0.0233917236328125,
-0.00745391845703125,
-0.032806396484375,
0.05133056640625,
0.0194091796875,
0.0382080078125,
0.020263671875,
-0.0010280609130859375,
0.0094757080078125,
0.0276031494140625,
0.06707763671875,
0.04864501953125,
-0.013885498046875,
-0.0121917724609375,
0.048675537109375,
-0.052154541015625,
0.03533935546875,
-0.01171112060546875,
-0.0216827392578125,
0.015380859375,
0.01092529296875,
0.0278472900390625,
0.018096923828125,
-0.003299713134765625,
0.03289794921875,
-0.001499176025390625,
-0.03045654296875,
-0.05487060546875,
-0.0083770751953125,
0.0063629150390625,
-0.0010471343994140625,
0.0229644775390625,
0.00466156005859375,
0.0041046142578125,
-0.04327392578125,
0.018463134765625,
0.016143798828125,
-0.0039043426513671875,
0.0023040771484375,
0.04779052734375,
0.014068603515625,
-0.004787445068359375,
0.00951385498046875,
-0.0242767333984375,
-0.024810791015625,
0.06683349609375,
0.034332275390625,
0.04559326171875,
-0.036529541015625,
-0.0006704330444335938,
0.0648193359375,
0.0117950439453125,
0.00031447410583496094,
0.053375244140625,
0.037506103515625,
-0.021942138671875,
0.0022792816162109375,
-0.05621337890625,
-0.01267242431640625,
0.0015382766723632812,
-0.05889892578125,
0.00641632080078125,
-0.039794921875,
-0.037841796875,
-0.00605010986328125,
0.0292205810546875,
-0.046661376953125,
0.0394287109375,
-0.0300140380859375,
0.09033203125,
-0.07318115234375,
0.0396728515625,
0.05010986328125,
-0.06793212890625,
-0.0760498046875,
-0.01953125,
-0.023681640625,
-0.05267333984375,
0.037811279296875,
-0.01558685302734375,
-0.00506591796875,
-0.004268646240234375,
-0.017364501953125,
-0.07147216796875,
0.105224609375,
-0.0056610107421875,
-0.018585205078125,
0.018463134765625,
-0.0032291412353515625,
0.040069580078125,
0.0276641845703125,
0.053466796875,
0.037567138671875,
0.06866455078125,
-0.01020050048828125,
-0.071044921875,
0.03265380859375,
-0.0396728515625,
-0.0102996826171875,
0.0291900634765625,
-0.0836181640625,
0.08868408203125,
-0.0013885498046875,
-0.00811004638671875,
-0.014556884765625,
0.050506591796875,
0.0172119140625,
0.00799560546875,
0.03564453125,
0.05841064453125,
0.039306640625,
-0.0300140380859375,
0.0555419921875,
-0.0029697418212890625,
0.032745361328125,
0.035980224609375,
0.0004787445068359375,
0.046844482421875,
0.0275115966796875,
-0.060699462890625,
0.031585693359375,
0.04315185546875,
-0.0178985595703125,
0.032958984375,
0.006870269775390625,
-0.02703857421875,
-0.002166748046875,
-0.00946807861328125,
-0.05145263671875,
0.032989501953125,
0.003421783447265625,
-0.00231170654296875,
-0.01059722900390625,
-0.0135345458984375,
0.023223876953125,
0.0298309326171875,
-0.0141143798828125,
0.0396728515625,
0.0007228851318359375,
-0.04498291015625,
0.038818359375,
-0.0126800537109375,
0.0628662109375,
-0.025299072265625,
-0.00728607177734375,
-0.0267791748046875,
0.0247802734375,
-0.0195465087890625,
-0.09576416015625,
0.0271759033203125,
-0.00452423095703125,
-0.032073974609375,
0.0031909942626953125,
0.017333984375,
-0.0081329345703125,
-0.03076171875,
0.038055419921875,
0.0182952880859375,
0.0185394287109375,
0.028350830078125,
-0.0287933349609375,
-0.01003265380859375,
0.006195068359375,
-0.060150146484375,
0.0267181396484375,
0.036590576171875,
0.0081634521484375,
0.049652099609375,
0.03533935546875,
0.00881195068359375,
0.0361328125,
-0.021209716796875,
0.0548095703125,
-0.069580078125,
-0.0251312255859375,
-0.059722900390625,
0.026275634765625,
-0.0265045166015625,
-0.048828125,
0.0693359375,
0.0677490234375,
0.055206298828125,
-0.0216217041015625,
0.054931640625,
-0.0404052734375,
0.03436279296875,
-0.0253448486328125,
0.05047607421875,
-0.042449951171875,
-0.04498291015625,
-0.0096588134765625,
-0.058990478515625,
-0.0174713134765625,
0.04296875,
-0.0146636962890625,
0.018096923828125,
0.062164306640625,
0.0338134765625,
0.023895263671875,
0.00959014892578125,
0.01019287109375,
0.0233917236328125,
0.007404327392578125,
0.08203125,
0.023651123046875,
-0.0589599609375,
0.040069580078125,
-0.04150390625,
0.00019049644470214844,
-0.018280029296875,
-0.042388916015625,
-0.0587158203125,
-0.033782958984375,
0.0034770965576171875,
-0.0178680419921875,
-0.032257080078125,
0.06549072265625,
-0.00047779083251953125,
-0.06646728515625,
-0.0229644775390625,
0.0191802978515625,
0.002330780029296875,
-0.037139892578125,
-0.0253143310546875,
0.0716552734375,
0.0014047622680664062,
-0.0762939453125,
0.019073486328125,
-0.003387451171875,
0.0091705322265625,
0.036468505859375,
-0.0027751922607421875,
-0.046844482421875,
0.005016326904296875,
0.029541015625,
0.0273590087890625,
-0.0484619140625,
-0.03558349609375,
0.00844573974609375,
-0.032470703125,
0.0208587646484375,
-0.01480865478515625,
-0.036590576171875,
0.01163482666015625,
0.067138671875,
0.0234527587890625,
0.0379638671875,
0.0165557861328125,
0.006450653076171875,
-0.033782958984375,
0.035675048828125,
0.001338958740234375,
0.028076171875,
-0.0146942138671875,
-0.020538330078125,
0.068603515625,
0.0212860107421875,
-0.0306396484375,
-0.0721435546875,
-0.01959228515625,
-0.09368896484375,
0.0009264945983886719,
0.0499267578125,
-0.006893157958984375,
-0.031890869140625,
-0.004547119140625,
-0.02874755859375,
0.030120849609375,
-0.03631591796875,
0.0340576171875,
0.0372314453125,
-0.023040771484375,
0.0036144256591796875,
-0.048004150390625,
0.00852203369140625,
0.040008544921875,
-0.0723876953125,
-0.01500701904296875,
0.00626373291015625,
0.024749755859375,
0.0345458984375,
0.08099365234375,
-0.034210205078125,
0.01113128662109375,
0.0295257568359375,
0.01125335693359375,
-0.01178741455078125,
-0.006168365478515625,
-0.0013294219970703125,
0.0191192626953125,
-0.0140838623046875,
-0.050689697265625
]
] |
j5ng/et5-typos-corrector | 2023-06-05T07:41:33.000Z | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"ko",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text2text-generation | j5ng | null | null | j5ng/et5-typos-corrector | 2 | 13,818 | transformers | 2023-06-04T07:20:53 | ---
language:
- ko
pipeline_tag: text2text-generation
license: apache-2.0
---
## Korean Typos Corrector (한국어 맞춤법 교정기)
- A spelling and typo corrector for colloquial Korean, fine-tuned from the ETRI ET5 model.
## Base PLM (ET5)
- ETRI (https://aiopen.etri.re.kr/et5Model)
## Base Dataset
- Spelling-correction data from the Modu Corpus (모두의 말뭉치, https://corpus.korean.go.kr/request/reausetMain.do?lang=ko)
## Data Preprocessing
- 1. Removed special characters: commas (,) and periods (.)
- 2. Removed null values ("")
- 3. Removed sentences that are too short (length ≤ 2)
- 4. Removed words containing name tags such as &name& or name1 (only the word is dropped; the sentence is kept)
- Total: 318,882 pairs
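The filtering steps above can be sketched as a small Python function. This is a minimal sketch: the function name and the name-tag regex are illustrative assumptions, not the authors' actual preprocessing script.

```python
import re

# Pattern for name tags such as "&name&" or "name1" (illustrative assumption)
NAME_TAG = re.compile(r"\S*(?:&name&|name\d+)\S*")

def clean_pair(src: str, tgt: str):
    """Apply the card's preprocessing to one (typo, corrected) pair.

    Returns the cleaned pair, or None when the pair should be dropped.
    """
    # 1. Remove commas and periods
    src = src.replace(",", "").replace(".", "")
    tgt = tgt.replace(",", "").replace(".", "")
    # 4. Drop whole words that contain a name tag, but keep the sentence
    src = " ".join(NAME_TAG.sub("", src).split())
    tgt = " ".join(NAME_TAG.sub("", tgt).split())
    # 2 & 3. Drop null ("") and too-short sentences (length <= 2)
    if len(src) <= 2 or len(tgt) <= 2:
        return None
    return src, tgt
```

A pair that becomes empty or too short after cleaning is dropped entirely, which covers steps 2 and 3 in one check.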
***
## How to use
```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Load the T5 model and tokenizer
model = T5ForConditionalGeneration.from_pretrained("j5ng/et5-typos-corrector")
tokenizer = T5Tokenizer.from_pretrained("j5ng/et5-typos-corrector")

device = "cuda:0" if torch.cuda.is_available() else "cpu"
# device = "mps:0" if torch.backends.mps.is_available() else "cpu"  # for Apple Silicon (M1)
model = model.to(device)

# Example input sentence (intentionally misspelled colloquial Korean)
input_text = "아늬 진짜 무ㅓ하냐고"

# Encode the input with the correction prompt prefix
input_encoding = tokenizer("맞춤법을 고쳐주세요: " + input_text, return_tensors="pt")
input_ids = input_encoding.input_ids.to(device)
attention_mask = input_encoding.attention_mask.to(device)

# Generate the corrected sentence
output_encoding = model.generate(
    input_ids=input_ids,
    attention_mask=attention_mask,
    max_length=128,
    num_beams=5,
    early_stopping=True,
)

# Decode and print the result
output_text = tokenizer.decode(output_encoding[0], skip_special_tokens=True)
print(output_text)  # 아니 진짜 뭐 하냐고.
```
***
## With the Transformers Pipeline
```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer, pipeline

model = T5ForConditionalGeneration.from_pretrained('j5ng/et5-typos-corrector')
tokenizer = T5Tokenizer.from_pretrained('j5ng/et5-typos-corrector')

typos_corrector = pipeline(
    "text2text-generation",
    model=model,
    tokenizer=tokenizer,
    device=0 if torch.cuda.is_available() else -1,
    framework="pt",
)

input_text = "완죤 어이업ㅅ네진쨬ㅋㅋㅋ"
output_text = typos_corrector("맞춤법을 고쳐주세요: " + input_text,
                              max_length=128,
                              num_beams=5,
                              early_stopping=True)[0]['generated_text']

print(output_text)  # 완전 어이없네 진짜 ᄏᄏᄏᄏ.
``` | 2,173 | [
[
-0.0155487060546875,
-0.040496826171875,
0.025787353515625,
0.034271240234375,
-0.0245208740234375,
0.01239013671875,
-0.0210723876953125,
-0.00960540771484375,
0.01111602783203125,
0.027069091796875,
-0.0309600830078125,
-0.047943115234375,
-0.0546875,
0.03863525390625,
0.0014324188232421875,
0.058258056640625,
-0.0256195068359375,
-0.0134124755859375,
0.0215911865234375,
-0.0032367706298828125,
-0.035675048828125,
-0.026519775390625,
-0.051025390625,
-0.009002685546875,
0.007579803466796875,
0.043914794921875,
0.023529052734375,
0.0357666015625,
0.040008544921875,
0.032135009765625,
-0.0134124755859375,
0.0169830322265625,
-0.01117706298828125,
-0.0006275177001953125,
0.003467559814453125,
-0.0484619140625,
-0.032928466796875,
-0.008758544921875,
0.04681396484375,
0.0288238525390625,
0.01580810546875,
0.034393310546875,
-0.000032901763916015625,
0.0281524658203125,
-0.0308990478515625,
0.0220794677734375,
-0.033233642578125,
0.00800323486328125,
-0.0246429443359375,
-0.018798828125,
-0.03265380859375,
-0.033355712890625,
-0.0137176513671875,
-0.0469970703125,
0.038787841796875,
0.0105133056640625,
0.1029052734375,
0.02935791015625,
-0.006977081298828125,
0.006298065185546875,
-0.047821044921875,
0.0614013671875,
-0.065673828125,
0.02484130859375,
0.01824951171875,
0.01110076904296875,
-0.005401611328125,
-0.09161376953125,
-0.05657958984375,
0.001827239990234375,
-0.01171112060546875,
0.0257568359375,
-0.0017681121826171875,
0.0099639892578125,
0.05218505859375,
0.0203399658203125,
-0.0216522216796875,
-0.01013946533203125,
-0.05035400390625,
-0.030426025390625,
0.03424072265625,
0.0024929046630859375,
0.03350830078125,
-0.038604736328125,
-0.006488800048828125,
-0.0163116455078125,
-0.022613525390625,
0.01605224609375,
0.01337432861328125,
0.01134490966796875,
-0.0255584716796875,
0.0657958984375,
-0.00919342041015625,
0.03485107421875,
0.0494384765625,
-0.02862548828125,
0.0498046875,
-0.03289794921875,
-0.03668212890625,
0.01534271240234375,
0.07568359375,
0.022674560546875,
0.036712646484375,
-0.008087158203125,
-0.005523681640625,
0.00986480712890625,
0.0032196044921875,
-0.04461669921875,
-0.006725311279296875,
0.0352783203125,
-0.0295562744140625,
-0.0330810546875,
0.0189208984375,
-0.059326171875,
0.004619598388671875,
0.004383087158203125,
0.041046142578125,
-0.035247802734375,
-0.00777435302734375,
0.0033550262451171875,
-0.03643798828125,
-0.005611419677734375,
-0.0191497802734375,
-0.06414794921875,
-0.005397796630859375,
0.016845703125,
0.04736328125,
0.029327392578125,
-0.02783203125,
-0.02069091796875,
0.0168304443359375,
-0.013671875,
0.051910400390625,
0.0013980865478515625,
-0.03094482421875,
-0.009246826171875,
0.0238037109375,
-0.04022216796875,
-0.038238525390625,
0.0706787109375,
-0.0198974609375,
0.045440673828125,
-0.004985809326171875,
-0.0302734375,
-0.01378631591796875,
0.009765625,
-0.045013427734375,
0.08209228515625,
0.0260009765625,
-0.06854248046875,
0.032440185546875,
-0.059661865234375,
-0.03826904296875,
-0.00763702392578125,
0.014801025390625,
-0.058868408203125,
-0.0014801025390625,
0.0306854248046875,
0.035614013671875,
0.00804901123046875,
0.00322723388671875,
-0.004058837890625,
-0.0169219970703125,
0.0157012939453125,
-0.013092041015625,
0.07098388671875,
0.0096282958984375,
-0.0501708984375,
0.01548004150390625,
-0.0699462890625,
0.0190887451171875,
0.00685882568359375,
-0.0306243896484375,
0.00450897216796875,
-0.0213165283203125,
-0.00920867919921875,
0.0291290283203125,
0.0225067138671875,
-0.044158935546875,
0.006134033203125,
-0.035888671875,
0.048187255859375,
0.043243408203125,
0.0032806396484375,
0.024871826171875,
-0.031158447265625,
0.0279083251953125,
0.020172119140625,
0.004352569580078125,
-0.001953125,
-0.0241851806640625,
-0.080810546875,
-0.021331787109375,
0.032501220703125,
0.05389404296875,
-0.0716552734375,
0.043182373046875,
-0.028289794921875,
-0.029052734375,
-0.06451416015625,
-0.0200347900390625,
0.0208282470703125,
0.050140380859375,
0.051727294921875,
0.0150909423828125,
-0.0762939453125,
-0.04718017578125,
-0.0338134765625,
-0.007457733154296875,
-0.0052642822265625,
0.00977325439453125,
0.039642333984375,
-0.0223388671875,
0.05914306640625,
-0.033782958984375,
-0.0284423828125,
-0.03533935546875,
0.01313018798828125,
0.037200927734375,
0.034698486328125,
0.031494140625,
-0.036468505859375,
-0.048187255859375,
-0.00786590576171875,
-0.045501708984375,
-0.0275115966796875,
-0.00592041015625,
0.01134490966796875,
0.047882080078125,
0.031219482421875,
-0.04608154296875,
0.0343017578125,
0.0176849365234375,
-0.047210693359375,
0.0638427734375,
-0.0297088623046875,
0.02203369140625,
-0.11004638671875,
0.027130126953125,
-0.00677490234375,
-0.02374267578125,
-0.04473876953125,
-0.0090179443359375,
0.0255126953125,
0.01293182373046875,
-0.043121337890625,
0.042633056640625,
-0.032623291015625,
0.013397216796875,
0.0037899017333984375,
-0.006977081298828125,
0.00939178466796875,
0.036407470703125,
-0.003326416015625,
0.057159423828125,
0.0150146484375,
-0.05804443359375,
0.0256805419921875,
0.026947021484375,
-0.02117919921875,
0.0104827880859375,
-0.05572509765625,
-0.01036834716796875,
0.00254058837890625,
0.015960693359375,
-0.0762939453125,
-0.02655029296875,
0.04449462890625,
-0.050262451171875,
0.02606201171875,
0.0029087066650390625,
-0.0350341796875,
-0.063232421875,
-0.031219482421875,
0.01364898681640625,
0.0279541015625,
-0.035003662109375,
0.032135009765625,
0.00396728515625,
0.0033588409423828125,
-0.0509033203125,
-0.063232421875,
-0.006893157958984375,
-0.0219268798828125,
-0.061004638671875,
0.052703857421875,
0.00478363037109375,
0.004150390625,
-0.01568603515625,
-0.00897216796875,
-0.0033740997314453125,
-0.01287078857421875,
-0.003986358642578125,
0.034423828125,
-0.01320648193359375,
-0.01629638671875,
0.0027332305908203125,
-0.0292205810546875,
-0.00545501708984375,
-0.006870269775390625,
0.052978515625,
-0.0011396408081054688,
0.001583099365234375,
-0.04998779296875,
-0.0052947998046875,
0.0484619140625,
-0.011322021484375,
0.0496826171875,
0.0631103515625,
-0.0245208740234375,
0.0125732421875,
-0.04583740234375,
0.0005674362182617188,
-0.037139892578125,
0.033966064453125,
-0.048919677734375,
-0.034698486328125,
0.047210693359375,
0.007358551025390625,
-0.005222320556640625,
0.07012939453125,
0.0286407470703125,
-0.00804901123046875,
0.07757568359375,
0.031890869140625,
0.00908660888671875,
0.0283050537109375,
-0.04290771484375,
0.034637451171875,
-0.07275390625,
-0.032196044921875,
-0.0423583984375,
-0.01885986328125,
-0.051727294921875,
-0.0225372314453125,
0.020538330078125,
0.007335662841796875,
-0.033966064453125,
0.036163330078125,
-0.04595947265625,
0.010223388671875,
0.037689208984375,
0.01219940185546875,
0.007770538330078125,
-0.0019140243530273438,
-0.0345458984375,
-0.0137176513671875,
-0.05157470703125,
-0.0294189453125,
0.0770263671875,
0.0263214111328125,
0.0340576171875,
-0.01235198974609375,
0.042083740234375,
-0.018310546875,
-0.0029773712158203125,
-0.0567626953125,
0.03472900390625,
-0.019073486328125,
-0.032012939453125,
-0.0104217529296875,
-0.0173492431640625,
-0.059661865234375,
-0.0008959770202636719,
-0.007030487060546875,
-0.06494140625,
0.0230865478515625,
0.0025787353515625,
-0.040191650390625,
0.03961181640625,
-0.0655517578125,
0.07794189453125,
-0.005840301513671875,
-0.0172119140625,
0.0092620849609375,
-0.056182861328125,
0.018646240234375,
0.0099639892578125,
0.0019235610961914062,
0.0130157470703125,
0.0009489059448242188,
0.06378173828125,
-0.036102294921875,
0.0390625,
-0.007526397705078125,
0.005420684814453125,
0.0171661376953125,
-0.0227813720703125,
0.0401611328125,
-0.0052642822265625,
-0.0004622936248779297,
0.00962066650390625,
-0.005702972412109375,
-0.01479339599609375,
-0.01029205322265625,
0.03448486328125,
-0.0682373046875,
-0.036651611328125,
-0.045074462890625,
-0.026214599609375,
0.006072998046875,
0.028106689453125,
0.0712890625,
0.0145416259765625,
0.0175933837890625,
0.0117645263671875,
0.040008544921875,
-0.0191192626953125,
0.066162109375,
0.01275634765625,
-0.0016584396362304688,
-0.04669189453125,
0.05816650390625,
0.028167724609375,
0.0218048095703125,
0.026214599609375,
0.006481170654296875,
-0.04107666015625,
-0.006801605224609375,
-0.023956298828125,
0.0328369140625,
-0.038299560546875,
-0.017303466796875,
-0.0626220703125,
-0.0295867919921875,
-0.0567626953125,
-0.005229949951171875,
-0.0212249755859375,
-0.035614013671875,
-0.0321044921875,
-0.01251983642578125,
0.0164794921875,
0.020782470703125,
-0.0194244384765625,
0.029022216796875,
-0.0635986328125,
0.05548095703125,
0.004764556884765625,
0.0285491943359375,
-0.0181884765625,
-0.054901123046875,
-0.005889892578125,
0.01287841796875,
-0.044342041015625,
-0.060211181640625,
0.05035400390625,
-0.0010499954223632812,
0.0309600830078125,
0.0255889892578125,
-0.001590728759765625,
0.06756591796875,
-0.028564453125,
0.0634765625,
0.006328582763671875,
-0.08660888671875,
0.036529541015625,
-0.0087890625,
0.0396728515625,
0.0168609619140625,
0.00833892822265625,
-0.062469482421875,
-0.0268707275390625,
-0.077392578125,
-0.07220458984375,
0.0792236328125,
0.023223876953125,
-0.0162811279296875,
0.011016845703125,
0.006969451904296875,
-0.0050506591796875,
0.007038116455078125,
-0.05474853515625,
-0.0408935546875,
-0.036041259765625,
-0.0284881591796875,
0.004329681396484375,
-0.024017333984375,
0.00954437255859375,
-0.0296173095703125,
0.07537841796875,
0.009246826171875,
0.04736328125,
0.043212890625,
-0.0198211669921875,
-0.01125335693359375,
0.0173492431640625,
0.05950927734375,
0.036102294921875,
-0.0197296142578125,
0.0001499652862548828,
0.0269927978515625,
-0.057769775390625,
0.003932952880859375,
0.0207672119140625,
-0.01580810546875,
0.03302001953125,
0.0158843994140625,
0.079345703125,
0.003917694091796875,
-0.0152435302734375,
0.010894775390625,
-0.007038116455078125,
-0.027984619140625,
-0.024627685546875,
-0.005283355712890625,
0.005970001220703125,
0.00897979736328125,
0.0301666259765625,
0.0214385986328125,
-0.024566650390625,
-0.01983642578125,
0.0001933574676513672,
-0.005970001220703125,
-0.00878143310546875,
-0.0118408203125,
0.068359375,
0.0020904541015625,
-0.02520751953125,
0.034881591796875,
-0.0009241104125976562,
-0.06134033203125,
0.068359375,
0.051910400390625,
0.06219482421875,
-0.0264892578125,
0.01459503173828125,
0.052520751953125,
0.0184173583984375,
-0.0268402099609375,
0.04248046875,
0.024322509765625,
-0.05206298828125,
-0.022369384765625,
-0.032440185546875,
0.0036163330078125,
0.030029296875,
-0.01702880859375,
0.043243408203125,
-0.03143310546875,
-0.025146484375,
0.007228851318359375,
0.01090240478515625,
-0.053955078125,
0.030975341796875,
0.004299163818359375,
0.054412841796875,
-0.040374755859375,
0.04522705078125,
0.06036376953125,
-0.0401611328125,
-0.08233642578125,
0.00312042236328125,
-0.030517578125,
-0.056793212890625,
0.05023193359375,
0.0185089111328125,
0.020965576171875,
0.021636962890625,
-0.03472900390625,
-0.0693359375,
0.09246826171875,
0.01413726806640625,
-0.03216552734375,
0.0027618408203125,
0.02825927734375,
0.040679931640625,
-0.0030193328857421875,
0.03338623046875,
0.046295166015625,
0.057891845703125,
-0.00799560546875,
-0.08074951171875,
0.0171051025390625,
-0.007293701171875,
-0.0081939697265625,
0.013336181640625,
-0.060028076171875,
0.072998046875,
-0.03338623046875,
-0.02166748046875,
0.0015277862548828125,
0.04302978515625,
0.0357666015625,
0.014129638671875,
0.0203094482421875,
0.046905517578125,
0.054595947265625,
-0.0234832763671875,
0.06298828125,
-0.03826904296875,
0.048583984375,
0.05023193359375,
0.01480865478515625,
0.0297393798828125,
0.041656494140625,
-0.02130126953125,
0.029296875,
0.055755615234375,
-0.019866943359375,
0.0328369140625,
-0.010833740234375,
-0.01326751708984375,
0.008209228515625,
0.005828857421875,
-0.0390625,
0.0245361328125,
0.027984619140625,
-0.044342041015625,
-0.0013895034790039062,
-0.00650787353515625,
0.026031494140625,
-0.0211334228515625,
-0.029510498046875,
0.034637451171875,
0.007843017578125,
-0.05859375,
0.0556640625,
0.0187835693359375,
0.0682373046875,
-0.0249786376953125,
0.0158233642578125,
-0.0105438232421875,
0.03277587890625,
-0.029327392578125,
-0.053253173828125,
0.0229949951171875,
0.0015239715576171875,
-0.0297393798828125,
-0.0109100341796875,
0.06524658203125,
-0.03656005859375,
-0.05462646484375,
0.002819061279296875,
0.01529693603515625,
-0.0029888153076171875,
0.00205230712890625,
-0.0419921875,
-0.01139068603515625,
0.0172271728515625,
-0.020416259765625,
-0.0179595947265625,
0.0330810546875,
0.004085540771484375,
0.045501708984375,
0.039764404296875,
0.0070037841796875,
0.0252685546875,
0.0105438232421875,
0.037689208984375,
-0.059661865234375,
-0.037384033203125,
-0.07110595703125,
0.061492919921875,
-0.020172119140625,
-0.043853759765625,
0.0435791015625,
0.038543701171875,
0.060699462890625,
-0.032989501953125,
0.0843505859375,
-0.029754638671875,
0.04034423828125,
-0.026336669921875,
0.0614013671875,
-0.020172119140625,
-0.006984710693359375,
-0.01922607421875,
-0.06268310546875,
-0.02783203125,
0.047637939453125,
-0.0246124267578125,
0.0002410411834716797,
0.08514404296875,
0.050567626953125,
-0.01004791259765625,
-0.01763916015625,
0.01013946533203125,
0.045684814453125,
0.0171966552734375,
0.044403076171875,
0.01317596435546875,
-0.07470703125,
0.052215576171875,
-0.0433349609375,
0.0130767822265625,
-0.0186309814453125,
-0.045135498046875,
-0.05841064453125,
-0.0252685546875,
-0.0233154296875,
-0.0443115234375,
-0.01168060302734375,
0.070068359375,
0.019287109375,
-0.07098388671875,
0.0010509490966796875,
-0.015625,
0.0224761962890625,
-0.020172119140625,
-0.0258941650390625,
0.04290771484375,
-0.040557861328125,
-0.0738525390625,
-0.0014133453369140625,
-0.006122589111328125,
0.0241546630859375,
0.0152740478515625,
0.007503509521484375,
-0.02276611328125,
-0.01238250732421875,
0.0362548828125,
0.026275634765625,
-0.047698974609375,
-0.0269012451171875,
0.01538848876953125,
-0.0269317626953125,
0.023223876953125,
0.022674560546875,
-0.048583984375,
0.0190887451171875,
0.06610107421875,
0.0161590576171875,
0.03240966796875,
-0.0119781494140625,
0.045135498046875,
-0.05206298828125,
0.015960693359375,
-0.0007815361022949219,
0.0384521484375,
0.0228118896484375,
-0.030853271484375,
0.038848876953125,
0.0482177734375,
-0.04022216796875,
-0.048248291015625,
-0.0245361328125,
-0.07427978515625,
-0.0106201171875,
0.09326171875,
-0.00010532140731811523,
-0.025177001953125,
-0.0118408203125,
-0.044647216796875,
0.07470703125,
-0.0282440185546875,
0.0767822265625,
0.059051513671875,
0.00875091552734375,
-0.0078582763671875,
-0.0274200439453125,
0.0469970703125,
0.032806396484375,
-0.052520751953125,
-0.0294952392578125,
-0.0084228515625,
0.03436279296875,
0.0206146240234375,
0.0292205810546875,
-0.004100799560546875,
0.0233917236328125,
-0.0059661865234375,
0.023895263671875,
-0.01067352294921875,
0.0024814605712890625,
-0.006557464599609375,
0.0021915435791015625,
-0.03204345703125,
-0.05035400390625
]
] |
MoritzLaurer/DeBERTa-v3-base-mnli-fever-docnli-ling-2c | 2023-04-05T20:40:03.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"text-classification",
"zero-shot-classification",
"en",
"arxiv:2104.07179",
"arxiv:2106.09449",
"arxiv:2006.03654",
"arxiv:2111.09543",
"license:mit",
"endpoints_compatible",
"region:us"
] | text-classification | MoritzLaurer | null | null | MoritzLaurer/DeBERTa-v3-base-mnli-fever-docnli-ling-2c | 9 | 13,814 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
license: mit
tags:
- text-classification
- zero-shot-classification
metrics:
- accuracy
widget:
- text: "I first thought that I liked the movie, but upon second thought it was actually disappointing. [SEP] The movie was good."
---
# DeBERTa-v3-base-mnli-fever-docnli-ling-2c
## Model description
This model was trained on 1,279,665 hypothesis-premise pairs from 8 NLI datasets: [MultiNLI](https://huggingface.co/datasets/multi_nli), [Fever-NLI](https://github.com/easonnie/combine-FEVER-NSMN/blob/master/other_resources/nli_fever.md), [LingNLI](https://arxiv.org/abs/2104.07179), and [DocNLI](https://arxiv.org/pdf/2106.09449.pdf) (which includes [ANLI](https://github.com/facebookresearch/anli), QNLI, DUC, CNN/DailyMail, and Curation).
It is the only model in the model hub trained on 8 NLI datasets, including DocNLI with very long texts to learn long-range reasoning. Note that the model was trained on binary NLI to predict either "entailment" or "not-entailment". Following DocNLI, the classes "neutral" and "contradiction" were merged into "not-entailment" to enable the inclusion of the DocNLI dataset.
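That binary class merge can be sketched as a simple label mapping (the helper below is illustrative only and is not taken from the actual training code):

```python
# Collapse the standard 3-class NLI labels into the binary scheme this
# model predicts. The function name is illustrative, not from the training code.
def to_binary_label(label: str) -> str:
    return "entailment" if label == "entailment" else "not_entailment"

print(to_binary_label("neutral"))        # not_entailment
print(to_binary_label("contradiction"))  # not_entailment
print(to_binary_label("entailment"))     # entailment
```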
The base model is [DeBERTa-v3-base from Microsoft](https://huggingface.co/microsoft/deberta-v3-base). The v3 variant of DeBERTa substantially outperforms previous versions of the model by including a different pre-training objective, see annex 11 of the original [DeBERTa paper](https://arxiv.org/pdf/2006.03654.pdf) as well as the [DeBERTa-V3 paper](https://arxiv.org/abs/2111.09543).
For highest performance (but less speed), I recommend using https://huggingface.co/MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli.
### How to use the model
#### Simple zero-shot classification pipeline
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification", model="MoritzLaurer/DeBERTa-v3-base-mnli-fever-docnli-ling-2c")
sequence_to_classify = "Angela Merkel is a politician in Germany and leader of the CDU"
candidate_labels = ["politics", "economy", "entertainment", "environment"]
output = classifier(sequence_to_classify, candidate_labels, multi_label=False)
print(output)
```
#### NLI use-case
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")
model_name = "MoritzLaurer/DeBERTa-v3-base-mnli-fever-docnli-ling-2c"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).to(device)  # move model to the same device as the inputs
premise = "I first thought that I liked the movie, but upon second thought it was actually disappointing."
hypothesis = "The movie was good."
input = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")
output = model(input["input_ids"].to(device)) # device = "cuda:0" or "cpu"
prediction = torch.softmax(output["logits"][0], -1).tolist()
label_names = ["entailment", "not_entailment"]
prediction = {name: round(float(pred) * 100, 1) for pred, name in zip(prediction, label_names)}
print(prediction)
```
### Training data
This model was trained on 1,279,665 hypothesis-premise pairs from 8 NLI datasets: [MultiNLI](https://huggingface.co/datasets/multi_nli), [Fever-NLI](https://github.com/easonnie/combine-FEVER-NSMN/blob/master/other_resources/nli_fever.md), [LingNLI](https://arxiv.org/abs/2104.07179), and [DocNLI](https://arxiv.org/pdf/2106.09449.pdf) (which includes [ANLI](https://github.com/facebookresearch/anli), QNLI, DUC, CNN/DailyMail, and Curation).
### Training procedure
DeBERTa-v3-base-mnli-fever-docnli-ling-2c was trained using the Hugging Face trainer with the following hyperparameters.
```
training_args = TrainingArguments(
num_train_epochs=3, # total number of training epochs
learning_rate=2e-05,
per_device_train_batch_size=32, # batch size per device during training
per_device_eval_batch_size=32, # batch size for evaluation
    warmup_ratio=0.1,                 # fraction of training steps used for learning-rate warmup
weight_decay=0.06, # strength of weight decay
fp16=True # mixed precision training
)
```
### Eval results
The model was evaluated using the binary test sets for MultiNLI and ANLI and the binary dev set for Fever-NLI (two classes instead of three). The metric used is accuracy.
mnli-m-2c | mnli-mm-2c | fever-nli-2c | anli-all-2c | anli-r3-2c | lingnli-2c
---------|----------|---------|----------|----------|------
0.935 | 0.933 | 0.897 | 0.710 | 0.678 | 0.895
## Limitations and bias
Please consult the original DeBERTa paper and literature on different NLI datasets for potential biases.
## Citation
If you use this model, please cite: Laurer, Moritz, Wouter van Atteveldt, Andreu Salleras Casas, and Kasper Welbers. 2022. ‘Less Annotating, More Classifying – Addressing the Data Scarcity Issue of Supervised Machine Learning with Deep Transfer Learning and BERT - NLI’. Preprint, June. Open Science Framework. https://osf.io/74b8k.
### Ideas for cooperation or questions?
If you have questions or ideas for cooperation, contact me at m{dot}laurer{at}vu{dot}nl or [LinkedIn](https://www.linkedin.com/in/moritz-laurer/).
### Debugging and issues
Note that DeBERTa-v3 was released on 06.12.21 and older versions of HF Transformers seem to have issues running the model (e.g. tokenizer errors). Upgrading to Transformers >= 4.13 might solve some issues. | 5,447 | [
[
-0.0230712890625,
-0.036285400390625,
0.0111846923828125,
0.0141448974609375,
-0.0159912109375,
-0.0176544189453125,
0.0021266937255859375,
-0.036346435546875,
0.029815673828125,
0.01279449462890625,
-0.0306396484375,
-0.0413818359375,
-0.0531005859375,
0.0127105712890625,
-0.0113372802734375,
0.07763671875,
0.00004744529724121094,
0.0079803466796875,
0.0013093948364257812,
-0.01131439208984375,
-0.01381683349609375,
-0.04742431640625,
-0.041748046875,
-0.03961181640625,
0.031005859375,
0.0300140380859375,
0.046875,
0.02655029296875,
0.035552978515625,
0.016387939453125,
-0.01300811767578125,
0.0032520294189453125,
-0.02606201171875,
-0.009796142578125,
0.0189056396484375,
-0.0341796875,
-0.0382080078125,
0.0162811279296875,
0.029815673828125,
0.0240631103515625,
0.00036716461181640625,
0.019378662109375,
-0.0115509033203125,
0.04901123046875,
-0.03704833984375,
0.009613037109375,
-0.039215087890625,
0.017425537109375,
-0.0001226663589477539,
0.0137786865234375,
-0.0265045166015625,
-0.00833892822265625,
0.0246124267578125,
-0.035247802734375,
0.01232147216796875,
-0.0136260986328125,
0.095947265625,
0.02044677734375,
-0.026702880859375,
-0.0036602020263671875,
-0.032623291015625,
0.056976318359375,
-0.07659912109375,
0.01556396484375,
0.0181732177734375,
-0.004055023193359375,
-0.0016345977783203125,
-0.03759765625,
-0.053436279296875,
-0.009918212890625,
-0.00591278076171875,
0.0177764892578125,
-0.027435302734375,
-0.02044677734375,
0.033905029296875,
0.0097808837890625,
-0.056304931640625,
0.0016126632690429688,
-0.050201416015625,
-0.00679779052734375,
0.0531005859375,
-0.0051116943359375,
0.00972747802734375,
-0.0306396484375,
-0.03302001953125,
-0.01305389404296875,
-0.031982421875,
0.010589599609375,
0.004535675048828125,
0.0032482147216796875,
-0.0281982421875,
0.019500732421875,
-0.00859832763671875,
0.05670166015625,
0.0217132568359375,
-0.0178375244140625,
0.07366943359375,
-0.0125274658203125,
-0.033050537109375,
0.01313018798828125,
0.072509765625,
0.034759521484375,
0.011749267578125,
0.00415802001953125,
0.007465362548828125,
-0.0144805908203125,
-0.01026153564453125,
-0.08209228515625,
-0.02392578125,
0.031707763671875,
-0.029266357421875,
-0.03900146484375,
0.007965087890625,
-0.0631103515625,
-0.020294189453125,
-0.02545166015625,
0.0325927734375,
-0.0482177734375,
-0.03533935546875,
-0.006683349609375,
-0.01318359375,
0.0306396484375,
0.0008082389831542969,
-0.05926513671875,
0.0045013427734375,
0.0188140869140625,
0.0645751953125,
-0.01004791259765625,
-0.0254364013671875,
-0.0249176025390625,
-0.00577545166015625,
-0.0139312744140625,
0.026458740234375,
-0.0186920166015625,
-0.019927978515625,
-0.01715087890625,
0.008148193359375,
-0.04156494140625,
-0.0377197265625,
0.03228759765625,
-0.0267181396484375,
0.020904541015625,
-0.00943756103515625,
-0.03216552734375,
-0.0293121337890625,
0.0170135498046875,
-0.033599853515625,
0.072265625,
0.0169525146484375,
-0.08453369140625,
0.01358795166015625,
-0.03814697265625,
-0.018768310546875,
-0.028778076171875,
0.0088653564453125,
-0.053192138671875,
-0.01568603515625,
0.027679443359375,
0.04119873046875,
-0.01751708984375,
0.04254150390625,
-0.034393310546875,
-0.022491455078125,
0.02264404296875,
-0.017974853515625,
0.0831298828125,
0.0263214111328125,
-0.04803466796875,
0.01042938232421875,
-0.07122802734375,
-0.0030078887939453125,
0.0294647216796875,
-0.004123687744140625,
-0.00673675537109375,
-0.035247802734375,
0.00010389089584350586,
0.0330810546875,
0.01180267333984375,
-0.033203125,
0.023468017578125,
-0.049774169921875,
0.038848876953125,
0.0260162353515625,
-0.0013742446899414062,
0.0216522216796875,
-0.032379150390625,
0.0196380615234375,
0.0360107421875,
0.0242919921875,
0.005126953125,
-0.058319091796875,
-0.0697021484375,
-0.0199127197265625,
0.0267181396484375,
0.0657958984375,
-0.05255126953125,
0.0372314453125,
-0.010345458984375,
-0.06964111328125,
-0.03704833984375,
0.01207733154296875,
0.0254058837890625,
0.051727294921875,
0.035858154296875,
-0.005954742431640625,
-0.056732177734375,
-0.068115234375,
0.0147552490234375,
-0.0128631591796875,
0.0014753341674804688,
0.01268768310546875,
0.057281494140625,
-0.0345458984375,
0.06640625,
-0.031982421875,
-0.03277587890625,
-0.0263214111328125,
0.0009703636169433594,
0.05865478515625,
0.05462646484375,
0.06524658203125,
-0.052490234375,
-0.0335693359375,
-0.016876220703125,
-0.07489013671875,
0.003597259521484375,
0.0024280548095703125,
-0.01222991943359375,
0.042266845703125,
0.02191162109375,
-0.03759765625,
0.03277587890625,
0.03729248046875,
-0.022979736328125,
0.0136260986328125,
-0.01203155517578125,
0.00994110107421875,
-0.0977783203125,
0.0250396728515625,
0.0100250244140625,
0.0123443603515625,
-0.072265625,
-0.005863189697265625,
-0.00040268898010253906,
-0.001071929931640625,
-0.048919677734375,
0.0374755859375,
-0.00824737548828125,
0.0288543701171875,
-0.007549285888671875,
-0.00243377685546875,
0.0187835693359375,
0.04248046875,
0.00620269775390625,
0.0248870849609375,
0.0712890625,
-0.0445556640625,
0.0251007080078125,
0.0165557861328125,
0.001148223876953125,
0.0066680908203125,
-0.06298828125,
0.0084381103515625,
-0.0072021484375,
0.0209197998046875,
-0.04937744140625,
-0.0177764892578125,
0.048370361328125,
-0.034820556640625,
0.036407470703125,
-0.006153106689453125,
-0.022674560546875,
-0.044097900390625,
-0.0220184326171875,
0.0280914306640625,
0.0592041015625,
-0.046661376953125,
0.047088623046875,
0.01708984375,
0.0170440673828125,
-0.064697265625,
-0.05316162109375,
-0.023193359375,
-0.031707763671875,
-0.052642822265625,
0.03485107421875,
0.0007290840148925781,
-0.01837158203125,
-0.003101348876953125,
0.005908966064453125,
-0.0170440673828125,
0.0044708251953125,
0.02789306640625,
0.042083740234375,
-0.01078033447265625,
-0.004421234130859375,
-0.0017595291137695312,
0.003421783447265625,
-0.0018062591552734375,
-0.0202484130859375,
0.03717041015625,
-0.0321044921875,
0.0036334991455078125,
-0.039794921875,
0.00824737548828125,
0.0401611328125,
-0.012939453125,
0.06805419921875,
0.075927734375,
-0.03790283203125,
0.02850341796875,
-0.042633056640625,
-0.01153564453125,
-0.0284576416015625,
0.013458251953125,
-0.030242919921875,
-0.030914306640625,
0.04339599609375,
0.022979736328125,
-0.005443572998046875,
0.07000732421875,
0.027374267578125,
0.0218658447265625,
0.057373046875,
0.033203125,
-0.019195556640625,
0.006275177001953125,
-0.061614990234375,
0.008575439453125,
-0.0576171875,
-0.02801513671875,
-0.033905029296875,
-0.007549285888671875,
-0.03948974609375,
-0.03326416015625,
0.02728271484375,
0.0242462158203125,
-0.0286407470703125,
0.0284881591796875,
-0.046478271484375,
0.01100921630859375,
0.050933837890625,
0.01690673828125,
0.00286865234375,
-0.0070648193359375,
-0.003063201904296875,
0.00991058349609375,
-0.063232421875,
-0.021240234375,
0.08489990234375,
0.04486083984375,
0.033843994140625,
0.003704071044921875,
0.07373046875,
-0.009521484375,
0.02777099609375,
-0.036865234375,
0.0205841064453125,
-0.0189056396484375,
-0.053955078125,
-0.0167388916015625,
-0.034515380859375,
-0.06640625,
0.01690673828125,
-0.027099609375,
-0.0592041015625,
0.0439453125,
0.0027904510498046875,
-0.036285400390625,
0.029083251953125,
-0.058258056640625,
0.069091796875,
-0.006092071533203125,
-0.019866943359375,
0.0007767677307128906,
-0.058746337890625,
0.041778564453125,
-0.008941650390625,
0.013641357421875,
-0.02264404296875,
0.02691650390625,
0.073486328125,
-0.0247650146484375,
0.0643310546875,
-0.029144287109375,
0.00696563720703125,
0.03472900390625,
-0.020477294921875,
0.009552001953125,
0.0209808349609375,
-0.042144775390625,
0.051177978515625,
0.021453857421875,
-0.035125732421875,
-0.0232391357421875,
0.0625,
-0.0784912109375,
-0.04754638671875,
-0.0595703125,
-0.0202789306640625,
0.0045623779296875,
0.0107269287109375,
0.04705810546875,
0.04437255859375,
-0.002391815185546875,
-0.001739501953125,
0.050933837890625,
-0.027740478515625,
0.03125,
0.0206146240234375,
-0.02032470703125,
-0.033599853515625,
0.0648193359375,
0.005405426025390625,
0.0081024169921875,
0.01068115234375,
0.01561737060546875,
-0.0135345458984375,
-0.02972412109375,
-0.050567626953125,
0.0290374755859375,
-0.05322265625,
-0.0276031494140625,
-0.07379150390625,
-0.0357666015625,
-0.06341552734375,
0.00431060791015625,
-0.00885772705078125,
-0.0207977294921875,
-0.042724609375,
-0.00046324729919433594,
0.0438232421875,
0.043853759765625,
-0.01473236083984375,
0.005260467529296875,
-0.057098388671875,
0.01171875,
0.00994873046875,
0.006855010986328125,
-0.00699615478515625,
-0.0665283203125,
-0.006877899169921875,
0.00824737548828125,
-0.031494140625,
-0.07635498046875,
0.060211181640625,
0.0251922607421875,
0.0288238525390625,
0.02008056640625,
0.01139068603515625,
0.04034423828125,
-0.01409912109375,
0.0576171875,
0.02117919921875,
-0.07293701171875,
0.052490234375,
-0.00882720947265625,
0.028778076171875,
0.034332275390625,
0.0521240234375,
-0.01322174072265625,
-0.026458740234375,
-0.058502197265625,
-0.068115234375,
0.058990478515625,
0.013519287109375,
-0.00885009765625,
0.005039215087890625,
0.0266265869140625,
-0.005634307861328125,
0.0007205009460449219,
-0.05975341796875,
-0.044219970703125,
-0.032196044921875,
-0.01357269287109375,
-0.01456451416015625,
-0.01494598388671875,
-0.01082611083984375,
-0.050537109375,
0.08502197265625,
-0.004932403564453125,
0.021240234375,
0.057708740234375,
-0.002758026123046875,
-0.0021877288818359375,
0.006114959716796875,
0.042083740234375,
0.042755126953125,
-0.0372314453125,
-0.00984954833984375,
0.034210205078125,
-0.0255279541015625,
0.006221771240234375,
0.035491943359375,
-0.024627685546875,
0.0207672119140625,
0.0255279541015625,
0.0853271484375,
-0.011962890625,
-0.032318115234375,
0.043426513671875,
-0.009552001953125,
-0.0291595458984375,
-0.035308837890625,
0.006015777587890625,
0.0006113052368164062,
0.01885986328125,
0.0269775390625,
0.022247314453125,
0.019561767578125,
-0.031585693359375,
0.0229034423828125,
0.01190185546875,
-0.0291595458984375,
-0.0159912109375,
0.05230712890625,
0.009368896484375,
0.0082244873046875,
0.053436279296875,
-0.0254669189453125,
-0.041717529296875,
0.0667724609375,
0.033935546875,
0.0521240234375,
-0.0183868408203125,
0.022369384765625,
0.0660400390625,
0.0111846923828125,
0.00579833984375,
0.01558685302734375,
0.036285400390625,
-0.04791259765625,
-0.0206146240234375,
-0.05645751953125,
-0.00807952880859375,
0.021087646484375,
-0.04754638671875,
0.032379150390625,
-0.02423095703125,
-0.0162353515625,
0.0180511474609375,
0.00250244140625,
-0.059356689453125,
0.01279449462890625,
0.0214996337890625,
0.06988525390625,
-0.07476806640625,
0.083984375,
0.040191650390625,
-0.038818359375,
-0.075439453125,
0.003147125244140625,
-0.005397796630859375,
-0.044525146484375,
0.06414794921875,
0.0406494140625,
0.01190948486328125,
-0.01412200927734375,
-0.017242431640625,
-0.07568359375,
0.0853271484375,
0.01593017578125,
-0.04364013671875,
-0.00041961669921875,
-0.0105743408203125,
0.042144775390625,
-0.0322265625,
0.033721923828125,
0.05126953125,
0.0286712646484375,
0.0222930908203125,
-0.07550048828125,
0.00946044921875,
-0.0338134765625,
-0.01104736328125,
-0.00281524658203125,
-0.041961669921875,
0.06622314453125,
-0.04656982421875,
0.009063720703125,
-0.0018110275268554688,
0.05755615234375,
0.0230712890625,
0.053436279296875,
0.041107177734375,
0.046661376953125,
0.056854248046875,
-0.01108551025390625,
0.06982421875,
-0.033660888671875,
0.02679443359375,
0.057769775390625,
-0.0196380615234375,
0.06298828125,
0.02508544921875,
-0.00527191162109375,
0.041351318359375,
0.054656982421875,
-0.023284912109375,
0.027740478515625,
0.018951416015625,
-0.00751495361328125,
0.00035643577575683594,
-0.0021839141845703125,
-0.050445556640625,
0.030303955078125,
0.021240234375,
-0.034271240234375,
0.0089569091796875,
0.02471923828125,
0.020965576171875,
-0.0219268798828125,
-0.007396697998046875,
0.0499267578125,
0.0022735595703125,
-0.06817626953125,
0.09814453125,
-0.007266998291015625,
0.062744140625,
-0.0201416015625,
0.01076507568359375,
-0.007526397705078125,
0.0157012939453125,
-0.028961181640625,
-0.032501220703125,
0.0298919677734375,
-0.00347900390625,
-0.016632080078125,
0.0162353515625,
0.037811279296875,
-0.036285400390625,
-0.048736572265625,
0.0252838134765625,
0.02655029296875,
0.02191162109375,
0.01087188720703125,
-0.075439453125,
0.005741119384765625,
0.01097869873046875,
-0.0249176025390625,
0.0194091796875,
0.01207733154296875,
0.020172119140625,
0.040374755859375,
0.039215087890625,
-0.0111846923828125,
-0.0106353759765625,
0.004085540771484375,
0.06414794921875,
-0.0296783447265625,
-0.003124237060546875,
-0.06793212890625,
0.02117919921875,
-0.01320648193359375,
-0.027435302734375,
0.044921875,
0.049713134765625,
0.057098388671875,
-0.0214691162109375,
0.0469970703125,
-0.0228118896484375,
0.02545166015625,
-0.039642333984375,
0.044464111328125,
-0.040191650390625,
0.0011463165283203125,
-0.0196990966796875,
-0.0501708984375,
-0.0294647216796875,
0.044158935546875,
-0.024383544921875,
-0.0024852752685546875,
0.0458984375,
0.06378173828125,
0.016326904296875,
-0.004093170166015625,
0.003383636474609375,
0.020477294921875,
0.0161590576171875,
0.055938720703125,
0.0296783447265625,
-0.057952880859375,
0.0298614501953125,
-0.06317138671875,
-0.030303955078125,
-0.01476287841796875,
-0.04962158203125,
-0.08154296875,
-0.043914794921875,
-0.05316162109375,
-0.06298828125,
0.0093841552734375,
0.09552001953125,
0.055908203125,
-0.080322265625,
-0.00518035888671875,
0.00566864013671875,
-0.0013952255249023438,
-0.0180206298828125,
-0.0166473388671875,
0.045257568359375,
-0.015625,
-0.06463623046875,
0.01229095458984375,
-0.00487518310546875,
0.023773193359375,
-0.002208709716796875,
-0.0081329345703125,
-0.027862548828125,
0.001544952392578125,
0.03369140625,
0.0204315185546875,
-0.05535888671875,
0.0034503936767578125,
0.01503753662109375,
-0.01488494873046875,
0.0098724365234375,
0.01488494873046875,
-0.051727294921875,
0.016204833984375,
0.034088134765625,
0.020782470703125,
0.03924560546875,
-0.0212554931640625,
0.0189056396484375,
-0.034088134765625,
0.0268096923828125,
0.0028076171875,
0.036376953125,
0.022491455078125,
-0.029327392578125,
0.04644775390625,
0.023529052734375,
-0.041168212890625,
-0.0599365234375,
-0.007350921630859375,
-0.08331298828125,
-0.0240631103515625,
0.10186767578125,
-0.0143585205078125,
-0.03704833984375,
0.0080413818359375,
-0.0195465087890625,
0.03411865234375,
-0.031402587890625,
0.048126220703125,
0.0290679931640625,
-0.0113525390625,
0.003204345703125,
-0.039947509765625,
0.048736572265625,
0.032073974609375,
-0.04180908203125,
-0.0190582275390625,
0.0036258697509765625,
0.0271759033203125,
0.035675048828125,
0.048675537109375,
-0.005767822265625,
0.01050567626953125,
-0.0150299072265625,
0.0126190185546875,
-0.00579833984375,
-0.01399993896484375,
-0.0374755859375,
0.0007033348083496094,
-0.005092620849609375,
-0.007694244384765625
]
] |
deutsche-telekom/gbert-large-paraphrase-euclidean | 2023-03-03T07:35:12.000Z | [
"sentence-transformers",
"pytorch",
"bert",
"sentence-similarity",
"transformers",
"setfit",
"de",
"dataset:deutsche-telekom/ger-backtrans-paraphrase",
"license:mit",
"endpoints_compatible",
"region:us"
] | sentence-similarity | deutsche-telekom | null | null | deutsche-telekom/gbert-large-paraphrase-euclidean | 9 | 13,772 | sentence-transformers | 2023-01-13T10:30:12 | ---
pipeline_tag: sentence-similarity
language:
- de
tags:
- sentence-transformers
- sentence-similarity
- transformers
- setfit
license: mit
datasets:
- deutsche-telekom/ger-backtrans-paraphrase
---
# German BERT large paraphrase euclidean
This is a [sentence-transformers](https://www.SBERT.net) model.
It maps sentences & paragraphs (text) to a 1024-dimensional dense vector space.
The model is intended to be used together with [SetFit](https://github.com/huggingface/setfit)
to improve German few-shot text classification.
It has a sibling model called
[deutsche-telekom/gbert-large-paraphrase-cosine](https://huggingface.co/deutsche-telekom/gbert-large-paraphrase-cosine).
This model is based on [deepset/gbert-large](https://huggingface.co/deepset/gbert-large).
Many thanks to [deepset](https://www.deepset.ai/)!
## Training
**Loss Function**\
We have used [BatchHardSoftMarginTripletLoss](https://www.sbert.net/docs/package_reference/losses.html#batchhardsoftmargintripletloss) with Euclidean distance as the loss function:
```python
from sentence_transformers import losses
from sentence_transformers.losses import BatchHardTripletLossDistanceFunction

train_loss = losses.BatchHardSoftMarginTripletLoss(
    model=model,
    # Note: the attribute really is spelled "eucledian" in the
    # sentence-transformers library.
    distance_metric=BatchHardTripletLossDistanceFunction.eucledian_distance,
)
```
**Training Data**\
The model is trained on a carefully filtered dataset of
[deutsche-telekom/ger-backtrans-paraphrase](https://huggingface.co/datasets/deutsche-telekom/ger-backtrans-paraphrase).
We deleted sentence pairs matching any of the following criteria:
- `min_char_len` less than 15
- `jaccard_similarity` greater than 0.3
- `de_token_count` greater than 30
- `en_de_token_count` greater than 30
- `cos_sim` less than 0.85
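The filtering above can be sketched in plain Python. The column names come from the list above; the toy values are invented for illustration and the kept rows are the complement of the deleted ones:

```python
# Toy rows using the filter columns from the list above (values are made up).
rows = [
    {"min_char_len": 10, "jaccard_similarity": 0.10, "de_token_count": 12,
     "en_de_token_count": 14, "cos_sim": 0.90},  # too short -> deleted
    {"min_char_len": 40, "jaccard_similarity": 0.50, "de_token_count": 20,
     "en_de_token_count": 22, "cos_sim": 0.95},  # too much lexical overlap -> deleted
    {"min_char_len": 40, "jaccard_similarity": 0.20, "de_token_count": 20,
     "en_de_token_count": 22, "cos_sim": 0.90},  # passes all filters -> kept
]

def keep(row: dict) -> bool:
    # Keep a pair only if none of the deletion criteria apply.
    return (row["min_char_len"] >= 15
            and row["jaccard_similarity"] <= 0.3
            and row["de_token_count"] <= 30
            and row["en_de_token_count"] <= 30
            and row["cos_sim"] >= 0.85)

kept = [r for r in rows if keep(r)]
print(len(kept))  # 1
```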
**Hyperparameters**
- learning_rate: 5.5512022294147105e-06
- num_epochs: 7
- train_batch_size: 68
- num_gpu: ???
## Evaluation Results
We use the [NLU Few-shot Benchmark - English and German](https://huggingface.co/datasets/deutsche-telekom/NLU-few-shot-benchmark-en-de)
dataset to evaluate this model in a German few-shot scenario.
**Qualitative results**
- multilingual sentence embeddings provide the worst results
- Electra models also deliver poor results
- German BERT base size model ([deepset/gbert-base](https://huggingface.co/deepset/gbert-base)) provides good results
- German BERT large size model ([deepset/gbert-large](https://huggingface.co/deepset/gbert-large)) provides very good results
- our fine-tuned models (this model and [deutsche-telekom/gbert-large-paraphrase-cosine](https://huggingface.co/deutsche-telekom/gbert-large-paraphrase-cosine)) provide the best results
## Licensing
Copyright (c) 2023 [Philip May](https://may.la/), [Deutsche Telekom AG](https://www.telekom.com/)\
Copyright (c) 2022 [deepset GmbH](https://www.deepset.ai/)
Licensed under the **MIT License** (the "License"); you may not use this file except in compliance with the License.
You may obtain a copy of the License by reviewing the file
[LICENSE](https://huggingface.co/deutsche-telekom/gbert-large-paraphrase-euclidean/blob/main/LICENSE) in the repository.
| 2,992 | [
[
-0.0307159423828125,
-0.0771484375,
0.040679931640625,
0.016815185546875,
-0.0305328369140625,
-0.0394287109375,
-0.037567138671875,
-0.0251312255859375,
0.0005550384521484375,
0.0384521484375,
-0.036163330078125,
-0.04888916015625,
-0.04315185546875,
0.005496978759765625,
-0.030426025390625,
0.088134765625,
-0.00688934326171875,
0.0184326171875,
-0.004177093505859375,
-0.023590087890625,
-0.0090484619140625,
-0.042877197265625,
-0.04998779296875,
-0.034759521484375,
0.0379638671875,
0.0271453857421875,
0.058258056640625,
0.0234375,
0.054290771484375,
0.0266265869140625,
-0.01861572265625,
-0.01451873779296875,
-0.049224853515625,
-0.004306793212890625,
0.0018405914306640625,
-0.0310211181640625,
-0.0124053955078125,
-0.01003265380859375,
0.055908203125,
0.04132080078125,
-0.01335906982421875,
0.00881195068359375,
-0.0019817352294921875,
0.037384033203125,
-0.0215301513671875,
0.0085296630859375,
-0.038055419921875,
0.00611114501953125,
-0.010040283203125,
0.0242767333984375,
-0.018951416015625,
-0.0265960693359375,
0.032867431640625,
-0.032440185546875,
0.0225677490234375,
0.0009179115295410156,
0.0892333984375,
0.0161285400390625,
-0.01345062255859375,
-0.01287841796875,
-0.03289794921875,
0.06329345703125,
-0.0826416015625,
0.01739501953125,
0.032562255859375,
0.0205841064453125,
0.0037212371826171875,
-0.0655517578125,
-0.055999755859375,
-0.0273895263671875,
-0.004344940185546875,
0.007305145263671875,
-0.0247802734375,
-0.0004432201385498047,
0.0258941650390625,
0.0166473388671875,
-0.0574951171875,
0.020782470703125,
-0.03680419921875,
-0.018829345703125,
0.04376220703125,
0.00466156005859375,
0.00431060791015625,
-0.03253173828125,
-0.022674560546875,
-0.0272979736328125,
-0.04827880859375,
-0.004734039306640625,
0.0279541015625,
0.0164337158203125,
-0.032806396484375,
0.051483154296875,
-0.017730712890625,
0.039276123046875,
-0.0033092498779296875,
-0.0047454833984375,
0.03448486328125,
-0.0159454345703125,
-0.0202484130859375,
0.00385284423828125,
0.07269287109375,
0.031463623046875,
0.035247802734375,
-0.0006079673767089844,
-0.00913238525390625,
-0.0105133056640625,
-0.0036468505859375,
-0.068115234375,
-0.0265045166015625,
0.0010290145874023438,
-0.041229248046875,
-0.01035308837890625,
0.00246429443359375,
-0.04840087890625,
-0.00823211669921875,
-0.00946807861328125,
0.0440673828125,
-0.0694580078125,
0.0213470458984375,
0.0235443115234375,
-0.02203369140625,
0.0290374755859375,
0.0036373138427734375,
-0.0380859375,
0.0011148452758789062,
0.040985107421875,
0.05841064453125,
0.01239013671875,
-0.04168701171875,
-0.0134429931640625,
0.0032978057861328125,
-0.01239013671875,
0.046478271484375,
-0.0240478515625,
-0.01305389404296875,
0.0084686279296875,
0.0034027099609375,
-0.0060882568359375,
-0.0293731689453125,
0.07110595703125,
-0.036468505859375,
0.0465087890625,
-0.015594482421875,
-0.0650634765625,
-0.027374267578125,
0.01497650146484375,
-0.03826904296875,
0.065185546875,
-0.01314544677734375,
-0.06219482421875,
0.0059814453125,
-0.03863525390625,
-0.037994384765625,
-0.0021686553955078125,
-0.00004303455352783203,
-0.0498046875,
0.005710601806640625,
0.0223541259765625,
0.06298828125,
-0.02239990234375,
0.006500244140625,
-0.0305633544921875,
-0.036346435546875,
0.0108642578125,
-0.034637451171875,
0.0772705078125,
0.004180908203125,
-0.0242919921875,
-0.011260986328125,
-0.05303955078125,
-0.0009918212890625,
0.0235137939453125,
-0.0173187255859375,
-0.025787353515625,
-0.0105743408203125,
0.0249786376953125,
0.027191162109375,
0.024139404296875,
-0.043975830078125,
-0.01018524169921875,
-0.0338134765625,
0.041839599609375,
0.04925537109375,
0.01168060302734375,
0.0311126708984375,
-0.0285186767578125,
0.029632568359375,
0.00377655029296875,
0.004100799560546875,
0.00794219970703125,
-0.0199127197265625,
-0.05267333984375,
-0.02392578125,
0.0200042724609375,
0.048370361328125,
-0.06097412109375,
0.05816650390625,
-0.0489501953125,
-0.05084228515625,
-0.0338134765625,
0.0158538818359375,
0.0286865234375,
0.03253173828125,
0.037567138671875,
0.00827789306640625,
-0.021636962890625,
-0.09661865234375,
-0.00007283687591552734,
0.0012884140014648438,
-0.00234222412109375,
0.0140380859375,
0.05682373046875,
-0.01418304443359375,
0.0391845703125,
-0.039306640625,
-0.032684326171875,
-0.033111572265625,
0.0100860595703125,
0.019775390625,
0.0290374755859375,
0.054840087890625,
-0.059814453125,
-0.041107177734375,
-0.0142669677734375,
-0.06683349609375,
0.015869140625,
-0.0160980224609375,
0.0006079673767089844,
0.01329803466796875,
0.0340576171875,
-0.060791015625,
0.01045989990234375,
0.038482666015625,
-0.0190887451171875,
0.033172607421875,
-0.0305938720703125,
-0.004787445068359375,
-0.09686279296875,
0.0029277801513671875,
0.020660400390625,
-0.01119232177734375,
-0.022918701171875,
0.01137542724609375,
-0.00640869140625,
-0.00897216796875,
-0.045562744140625,
0.0281829833984375,
-0.013336181640625,
0.012664794921875,
-0.0009784698486328125,
0.0133819580078125,
-0.00933074951171875,
0.0472412109375,
-0.00726318359375,
0.0599365234375,
0.02996826171875,
-0.035736083984375,
0.0377197265625,
0.047943115234375,
-0.04107666015625,
0.0499267578125,
-0.0745849609375,
-0.0018415451049804688,
-0.01309967041015625,
0.033111572265625,
-0.058685302734375,
-0.016326904296875,
0.0116424560546875,
-0.02984619140625,
0.00919342041015625,
0.0173187255859375,
-0.06829833984375,
-0.0311431884765625,
-0.040374755859375,
0.000029742717742919922,
0.050994873046875,
-0.0294342041015625,
0.01336669921875,
0.0037021636962890625,
-0.0233001708984375,
-0.03619384765625,
-0.07958984375,
0.0175323486328125,
-0.0305633544921875,
-0.05474853515625,
0.0240325927734375,
0.00664520263671875,
0.0005006790161132812,
0.01009368896484375,
0.01367950439453125,
-0.0202789306640625,
-0.012237548828125,
-0.004695892333984375,
0.01424407958984375,
-0.0187225341796875,
0.020111083984375,
0.0158233642578125,
0.004047393798828125,
-0.0004544258117675781,
0.0010423660278320312,
0.045562744140625,
-0.0298919677734375,
-0.005405426025390625,
-0.05523681640625,
0.033966064453125,
0.0360107421875,
0.006866455078125,
0.06292724609375,
0.069580078125,
-0.01471710205078125,
0.004192352294921875,
-0.041534423828125,
-0.0231170654296875,
-0.033538818359375,
0.046905517578125,
-0.032379150390625,
-0.072509765625,
0.041107177734375,
0.0222320556640625,
-0.0028076171875,
0.063232421875,
0.039947509765625,
-0.030853271484375,
0.04888916015625,
0.037109375,
-0.0266265869140625,
0.034881591796875,
-0.043487548828125,
0.0100555419921875,
-0.04736328125,
-0.0005974769592285156,
-0.038177490234375,
-0.034820556640625,
-0.058258056640625,
-0.01253509521484375,
0.027618408203125,
0.006381988525390625,
-0.03680419921875,
0.049774169921875,
-0.0204620361328125,
0.0224609375,
0.060028076171875,
0.0251007080078125,
-0.000583648681640625,
0.01104736328125,
-0.011505126953125,
-0.0167694091796875,
-0.0625,
-0.04534912109375,
0.07318115234375,
0.03057861328125,
0.056732177734375,
-0.0025501251220703125,
0.07476806640625,
0.010162353515625,
-0.0207977294921875,
-0.054840087890625,
0.0626220703125,
-0.0254974365234375,
-0.05230712890625,
-0.039825439453125,
-0.039031982421875,
-0.06988525390625,
0.0037212371826171875,
-0.0128326416015625,
-0.050537109375,
0.00122833251953125,
-0.005489349365234375,
-0.0269622802734375,
0.019866943359375,
-0.05718994140625,
0.07550048828125,
-0.0050048828125,
-0.016021728515625,
-0.034393310546875,
-0.0726318359375,
0.00875091552734375,
-0.0079498291015625,
-0.0094146728515625,
-0.005664825439453125,
0.00018870830535888672,
0.072021484375,
-0.02581787109375,
0.0616455078125,
-0.01397705078125,
-0.0047607421875,
0.0236663818359375,
-0.01180267333984375,
0.051055908203125,
-0.0098724365234375,
-0.0094146728515625,
0.035980224609375,
0.00875091552734375,
-0.03564453125,
-0.02685546875,
0.057708740234375,
-0.06689453125,
-0.0230865478515625,
-0.0231170654296875,
-0.038055419921875,
0.001354217529296875,
0.0164947509765625,
0.055908203125,
0.0274200439453125,
-0.022857666015625,
0.0552978515625,
0.042205810546875,
-0.038299560546875,
0.03948974609375,
0.00502777099609375,
0.01904296875,
-0.029541015625,
0.05987548828125,
0.0064544677734375,
0.0081024169921875,
0.0305328369140625,
0.01137542724609375,
-0.0274200439453125,
-0.034698486328125,
-0.0140380859375,
0.038909912109375,
-0.0560302734375,
-0.01329803466796875,
-0.0418701171875,
-0.0232086181640625,
-0.042236328125,
-0.0086212158203125,
-0.006587982177734375,
-0.037750244140625,
-0.0210113525390625,
-0.01076507568359375,
0.032379150390625,
0.0238800048828125,
-0.0024871826171875,
0.048370361328125,
-0.060577392578125,
0.0267181396484375,
0.006134033203125,
0.004291534423828125,
-0.0121917724609375,
-0.060211181640625,
-0.031524658203125,
0.00971221923828125,
-0.03619384765625,
-0.050079345703125,
0.0291748046875,
0.0182647705078125,
0.038482666015625,
0.023040771484375,
0.0016841888427734375,
0.044677734375,
-0.061309814453125,
0.0672607421875,
0.01739501953125,
-0.059356689453125,
0.022491455078125,
-0.030059814453125,
0.03515625,
0.059783935546875,
0.037078857421875,
-0.04754638671875,
-0.0274810791015625,
-0.07086181640625,
-0.063720703125,
0.048370361328125,
0.025665283203125,
0.04461669921875,
-0.002346038818359375,
0.020599365234375,
0.0017118453979492188,
0.0160369873046875,
-0.0635986328125,
-0.0271453857421875,
-0.0009326934814453125,
-0.0272216796875,
-0.0258331298828125,
-0.027496337890625,
-0.007633209228515625,
-0.026092529296875,
0.07012939453125,
0.009490966796875,
0.041900634765625,
0.0191497802734375,
-0.0198211669921875,
0.018096923828125,
0.02435302734375,
0.048553466796875,
0.04327392578125,
-0.01198577880859375,
-0.01043701171875,
0.0343017578125,
-0.020416259765625,
-0.0012340545654296875,
0.03973388671875,
-0.024200439453125,
0.0189361572265625,
0.0501708984375,
0.07171630859375,
0.0199127197265625,
-0.045562744140625,
0.04534912109375,
0.01544952392578125,
-0.03863525390625,
-0.032012939453125,
-0.01849365234375,
0.01165008544921875,
0.035064697265625,
0.021575927734375,
-0.0213165283203125,
0.01480865478515625,
-0.023040771484375,
0.0230865478515625,
0.021240234375,
-0.0191650390625,
-0.021240234375,
0.03741455078125,
0.010009765625,
-0.0114593505859375,
0.0655517578125,
-0.010467529296875,
-0.04864501953125,
0.035552978515625,
0.026031494140625,
0.07305908203125,
0.00885009765625,
0.0299835205078125,
0.024871826171875,
0.0242919921875,
-0.0189361572265625,
0.0063934326171875,
0.005523681640625,
-0.054840087890625,
-0.0316162109375,
-0.033935546875,
-0.01090240478515625,
0.026123046875,
-0.03912353515625,
0.018310546875,
-0.02325439453125,
-0.00922393798828125,
0.0123138427734375,
0.0021190643310546875,
-0.061920166015625,
0.01171875,
0.01617431640625,
0.0643310546875,
-0.0706787109375,
0.06719970703125,
0.05133056640625,
-0.04656982421875,
-0.051971435546875,
0.01541900634765625,
-0.0204925537109375,
-0.06298828125,
0.052490234375,
0.0145416259765625,
0.0196685791015625,
0.0039520263671875,
-0.04742431640625,
-0.0440673828125,
0.07269287109375,
0.02764892578125,
-0.039276123046875,
-0.0252532958984375,
0.00543212890625,
0.05523681640625,
0.0009984970092773438,
-0.0088653564453125,
0.035675048828125,
0.03558349609375,
0.0034122467041015625,
-0.0660400390625,
0.004734039306640625,
0.013824462890625,
0.00792694091796875,
0.007640838623046875,
-0.054046630859375,
0.06146240234375,
-0.0021648406982421875,
0.01012420654296875,
0.0211181640625,
0.043609619140625,
-0.00047659873962402344,
0.010467529296875,
0.022430419921875,
0.07049560546875,
0.03277587890625,
0.0028896331787109375,
0.0906982421875,
-0.01031494140625,
0.053192138671875,
0.0880126953125,
-0.003093719482421875,
0.07989501953125,
0.038848876953125,
-0.0176239013671875,
0.0380859375,
0.0360107421875,
-0.0211029052734375,
0.0537109375,
0.00391387939453125,
-0.0037212371826171875,
-0.0196380615234375,
0.016510009765625,
-0.0276336669921875,
0.041748046875,
0.006183624267578125,
-0.035247802734375,
-0.04052734375,
0.0081634521484375,
0.004871368408203125,
0.0020961761474609375,
0.0032482147216796875,
0.054718017578125,
0.0221405029296875,
-0.0309600830078125,
0.055145263671875,
0.009735107421875,
0.053619384765625,
-0.04736328125,
0.0158233642578125,
-0.015655517578125,
0.0262451171875,
0.0130767822265625,
-0.054840087890625,
0.019256591796875,
-0.004405975341796875,
0.0035915374755859375,
-0.005615234375,
0.034820556640625,
-0.031494140625,
-0.05059814453125,
0.0154266357421875,
0.035064697265625,
0.02569580078125,
-0.0102691650390625,
-0.097412109375,
-0.01415252685546875,
0.0016546249389648438,
-0.028778076171875,
0.00720977783203125,
0.0450439453125,
0.00922393798828125,
0.04571533203125,
0.03204345703125,
-0.0005664825439453125,
-0.007183074951171875,
-0.0013675689697265625,
0.051483154296875,
-0.05316162109375,
-0.0347900390625,
-0.07476806640625,
0.01934814453125,
-0.0267791748046875,
-0.034637451171875,
0.07232666015625,
0.052764892578125,
0.06414794921875,
-0.0184478759765625,
0.052001953125,
-0.0137786865234375,
0.023712158203125,
-0.036834716796875,
0.05224609375,
-0.043975830078125,
-0.0159149169921875,
-0.035675048828125,
-0.08319091796875,
-0.02154541015625,
0.06536865234375,
-0.037628173828125,
0.0015516281127929688,
0.06329345703125,
0.049957275390625,
-0.012176513671875,
-0.014923095703125,
0.030059814453125,
0.028076171875,
0.01480865478515625,
0.04974365234375,
0.03533935546875,
-0.059417724609375,
0.06451416015625,
-0.035125732421875,
0.0110015869140625,
-0.0232086181640625,
-0.050994873046875,
-0.0758056640625,
-0.046142578125,
-0.0274810791015625,
-0.0244903564453125,
0.0016851425170898438,
0.06622314453125,
0.039306640625,
-0.05767822265625,
-0.0076904296875,
-0.01020050048828125,
-0.0174407958984375,
-0.0099945068359375,
-0.0182342529296875,
0.04949951171875,
-0.0234527587890625,
-0.06378173828125,
0.0019817352294921875,
-0.005992889404296875,
0.0187835693359375,
0.0035877227783203125,
-0.00275421142578125,
-0.040313720703125,
-0.0117950439453125,
0.048309326171875,
0.0123138427734375,
-0.058685302734375,
-0.031890869140625,
-0.00582122802734375,
-0.0291290283203125,
-0.006134033203125,
0.0282745361328125,
-0.026763916015625,
0.002521514892578125,
0.038665771484375,
0.06500244140625,
0.05438232421875,
-0.01232147216796875,
0.033294677734375,
-0.05755615234375,
0.037384033203125,
0.026947021484375,
0.038604736328125,
0.0313720703125,
-0.019561767578125,
0.039031982421875,
0.015533447265625,
-0.046844482421875,
-0.03948974609375,
-0.007076263427734375,
-0.08392333984375,
-0.03240966796875,
0.1063232421875,
-0.0205078125,
-0.003261566162109375,
0.03173828125,
-0.0272216796875,
0.0168914794921875,
-0.033447265625,
0.0487060546875,
0.081298828125,
0.011260986328125,
-0.002685546875,
-0.034271240234375,
0.023712158203125,
0.048828125,
-0.042694091796875,
-0.0081634521484375,
0.055755615234375,
0.02899169921875,
0.007152557373046875,
0.0152130126953125,
0.0064239501953125,
0.015411376953125,
-0.0145416259765625,
0.0060577392578125,
-0.0004088878631591797,
-0.0012350082397460938,
-0.040802001953125,
0.004547119140625,
-0.0196533203125,
-0.03497314453125
]
] |
pykeio/lite-toxic-comment-classification | 2023-03-16T21:47:37.000Z | [
"transformers",
"pytorch",
"safetensors",
"albert",
"text-classification",
"en",
"dataset:jigsaw_unintended_bias",
"dataset:jigsaw_toxicity_pred",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-classification | pykeio | null | null | pykeio/lite-toxic-comment-classification | 1 | 13,719 | transformers | 2023-01-13T22:23:28 | ---
license: apache-2.0
datasets:
- jigsaw_unintended_bias
- jigsaw_toxicity_pred
language:
- en
metrics:
- accuracy
pipeline_tag: text-classification
---
# Lite Toxic Comment Classification
Lightweight ALBERT-based model for English toxic comment classification. Achieves a mean AUC score of 98.28 on the Jigsaw test set. | 323 | [
[
-0.0281219482421875,
-0.04071044921875,
0.0184173583984375,
-0.003154754638671875,
-0.0137786865234375,
-0.00856781005859375,
0.0190277099609375,
-0.038421630859375,
0.028167724609375,
0.027740478515625,
-0.0318603515625,
-0.026031494140625,
-0.0322265625,
0.0225982666015625,
-0.03448486328125,
0.0916748046875,
0.01220703125,
-0.00543212890625,
0.01346588134765625,
-0.042388916015625,
-0.0267333984375,
-0.020721435546875,
-0.06591796875,
0.003055572509765625,
0.07183837890625,
0.01483154296875,
0.04815673828125,
-0.0045166015625,
0.03485107421875,
0.0221710205078125,
-0.016265869140625,
-0.0230865478515625,
-0.0269012451171875,
-0.005279541015625,
-0.0284271240234375,
-0.06787109375,
-0.029327392578125,
0.0250244140625,
-0.0172119140625,
0.047760009765625,
-0.004085540771484375,
0.01483154296875,
-0.00354766845703125,
0.0277557373046875,
-0.04388427734375,
0.022552490234375,
-0.035308837890625,
0.016021728515625,
0.011932373046875,
-0.01458740234375,
-0.064453125,
-0.0082855224609375,
0.0154571533203125,
-0.0416259765625,
-0.0226287841796875,
-0.0020771026611328125,
0.034942626953125,
0.0140380859375,
-0.051055908203125,
-0.0121307373046875,
-0.0287322998046875,
0.0667724609375,
-0.058319091796875,
0.0009598731994628906,
0.0224761962890625,
0.0125732421875,
0.0209808349609375,
-0.06207275390625,
0.0076141357421875,
-0.011749267578125,
-0.005687713623046875,
0.023956298828125,
0.0194549560546875,
0.0129852294921875,
0.03314208984375,
0.03253173828125,
-0.031402587890625,
-0.019500732421875,
-0.028656005859375,
0.00876617431640625,
0.07562255859375,
0.023712158203125,
0.0189666748046875,
-0.009033203125,
-0.01206207275390625,
0.0255279541015625,
-0.01461029052734375,
0.006824493408203125,
0.01242828369140625,
0.0135345458984375,
0.007904052734375,
0.0294189453125,
-0.0161590576171875,
0.0165252685546875,
-0.0239105224609375,
-0.001262664794921875,
0.03216552734375,
-0.005756378173828125,
-0.0087738037109375,
-0.0023651123046875,
0.0601806640625,
0.060821533203125,
0.052978515625,
-0.01129150390625,
-0.0271453857421875,
0.03924560546875,
0.0283966064453125,
-0.051239013671875,
-0.040435791015625,
-0.0178985595703125,
-0.046661376953125,
-0.041748046875,
0.0189666748046875,
-0.0239410400390625,
-0.01788330078125,
0.015838623046875,
0.04156494140625,
-0.00664520263671875,
-0.01397705078125,
-0.004268646240234375,
-0.0139617919921875,
-0.002780914306640625,
0.03594970703125,
-0.040618896484375,
0.01235198974609375,
-0.01361083984375,
0.052032470703125,
0.00815582275390625,
0.00908660888671875,
-0.0087432861328125,
0.0026683807373046875,
-0.041778564453125,
0.055572509765625,
-0.0408935546875,
-0.035400390625,
-0.0104217529296875,
0.027313232421875,
-0.01091766357421875,
-0.00464630126953125,
0.03887939453125,
-0.03790283203125,
0.0457763671875,
-0.0183258056640625,
-0.031036376953125,
-0.04815673828125,
0.003208160400390625,
-0.039794921875,
0.06951904296875,
0.047760009765625,
-0.1029052734375,
0.0380859375,
-0.047332763671875,
-0.03265380859375,
0.0298309326171875,
0.0051422119140625,
-0.0036411285400390625,
-0.0181427001953125,
-0.01030731201171875,
0.02264404296875,
-0.00496673583984375,
-0.01415252685546875,
-0.041534423828125,
-0.040130615234375,
0.021636962890625,
-0.0306243896484375,
0.079833984375,
0.0364990234375,
-0.0312042236328125,
0.004718780517578125,
-0.04937744140625,
0.0292205810546875,
-0.0161895751953125,
-0.032958984375,
-0.0260772705078125,
0.0110931396484375,
0.0133819580078125,
0.0284576416015625,
-0.006805419921875,
-0.0352783203125,
0.0121307373046875,
-0.0178070068359375,
0.0257415771484375,
0.038177490234375,
0.051727294921875,
0.0031147003173828125,
-0.03790283203125,
0.0296630859375,
-0.0025482177734375,
0.030242919921875,
0.01378631591796875,
-0.04595947265625,
-0.06304931640625,
0.0115203857421875,
0.0421142578125,
0.06890869140625,
-0.058990478515625,
0.0518798828125,
0.00926971435546875,
-0.053680419921875,
-0.02783203125,
0.01123809814453125,
0.01763916015625,
0.00765228271484375,
0.0056610107421875,
-0.05462646484375,
-0.0177001953125,
-0.06500244140625,
-0.017364501953125,
-0.0246124267578125,
-0.02685546875,
0.0297088623046875,
0.044342041015625,
-0.0264434814453125,
0.0218658447265625,
-0.0283966064453125,
-0.036651611328125,
0.003009796142578125,
0.04510498046875,
0.00677490234375,
0.01479339599609375,
0.037811279296875,
-0.035247802734375,
-0.0814208984375,
-0.014190673828125,
-0.054779052734375,
-0.02581787109375,
-0.0033416748046875,
0.00812530517578125,
0.0084381103515625,
0.06585693359375,
-0.01520538330078125,
0.053985595703125,
0.004146575927734375,
-0.030914306640625,
0.0103302001953125,
0.00829315185546875,
0.0364990234375,
-0.0557861328125,
-0.00734710693359375,
0.011383056640625,
-0.0275726318359375,
-0.0250701904296875,
-0.00284576416015625,
-0.01019287109375,
0.01904296875,
-0.055877685546875,
0.004741668701171875,
-0.0195465087890625,
0.042205810546875,
-0.0058746337890625,
-0.01163482666015625,
-0.0016527175903320312,
0.03363037109375,
-0.0103607177734375,
0.05999755859375,
0.03997802734375,
-0.034088134765625,
0.02667236328125,
0.004787445068359375,
-0.0102996826171875,
0.03594970703125,
-0.0294952392578125,
0.012969970703125,
0.0203399658203125,
0.0149993896484375,
-0.057342529296875,
-0.024200439453125,
-0.026275634765625,
-0.02764892578125,
-0.0008378028869628906,
0.0084686279296875,
-0.03900146484375,
-0.0244293212890625,
-0.046173095703125,
0.031280517578125,
0.04278564453125,
0.003490447998046875,
0.044525146484375,
0.029052734375,
-0.01861572265625,
-0.0196533203125,
-0.0748291015625,
-0.0108795166015625,
-0.018341064453125,
-0.050140380859375,
-0.0190582275390625,
-0.046417236328125,
-0.034515380859375,
-0.00492095947265625,
-0.003665924072265625,
-0.01242828369140625,
-0.004817962646484375,
0.0203094482421875,
0.00431060791015625,
-0.0299224853515625,
-0.0244140625,
-0.02764892578125,
-0.0008840560913085938,
0.041748046875,
0.08441162109375,
0.0194091796875,
-0.0200042724609375,
0.007022857666015625,
-0.00321197509765625,
0.0287933349609375,
0.064697265625,
0.011383056640625,
0.056182861328125,
-0.0014715194702148438,
-0.01212310791015625,
-0.03173828125,
-0.0227203369140625,
-0.007843017578125,
-0.03955078125,
0.03082275390625,
-0.0085906982421875,
-0.029266357421875,
0.06787109375,
0.01131439208984375,
0.016815185546875,
0.053955078125,
0.0335693359375,
-0.0025653839111328125,
0.126220703125,
0.0179595947265625,
-0.0033855438232421875,
0.0215606689453125,
-0.01611328125,
0.03082275390625,
-0.044219970703125,
-0.00443267822265625,
-0.0035381317138671875,
-0.0145263671875,
-0.039154052734375,
-0.00047588348388671875,
0.02435302734375,
-0.009979248046875,
-0.028900146484375,
0.012725830078125,
-0.045013427734375,
0.053253173828125,
0.023468017578125,
0.00385284423828125,
0.00548553466796875,
-0.040863037109375,
-0.01181793212890625,
-0.006877899169921875,
-0.033294677734375,
-0.0465087890625,
0.0845947265625,
0.05322265625,
0.0780029296875,
0.004436492919921875,
0.0225372314453125,
0.029266357421875,
0.058990478515625,
-0.053466796875,
0.06634521484375,
0.002056121826171875,
-0.09881591796875,
-0.0128021240234375,
-0.033782958984375,
-0.036712646484375,
0.006237030029296875,
-0.0248565673828125,
-0.06439208984375,
-0.022918701171875,
-0.032623291015625,
-0.04437255859375,
0.035675048828125,
-0.071533203125,
0.05438232421875,
-0.01413726806640625,
-0.0295867919921875,
-0.00557708740234375,
-0.0672607421875,
0.040557861328125,
-0.0023822784423828125,
0.02899169921875,
-0.01154327392578125,
-0.00453948974609375,
0.083740234375,
-0.00606536865234375,
0.043121337890625,
-0.00516510009765625,
0.0026798248291015625,
0.0487060546875,
0.0291595458984375,
0.0019779205322265625,
0.023712158203125,
0.0018186569213867188,
-0.0090484619140625,
0.01324462890625,
-0.0176239013671875,
-0.02117919921875,
0.0335693359375,
-0.04962158203125,
-0.0125732421875,
-0.060699462890625,
-0.05419921875,
-0.017425537109375,
0.01050567626953125,
0.0229949951171875,
0.027435302734375,
-0.0097198486328125,
0.0284576416015625,
0.055877685546875,
-0.0080413818359375,
0.034576416015625,
0.059326171875,
-0.0194091796875,
-0.0231781005859375,
0.053558349609375,
0.033477783203125,
0.03070068359375,
0.006031036376953125,
-0.007694244384765625,
-0.019317626953125,
-0.0263214111328125,
0.00936126708984375,
0.01800537109375,
-0.08685302734375,
-0.03515625,
-0.0307159423828125,
-0.06573486328125,
-0.00650787353515625,
-0.0303192138671875,
-0.019317626953125,
-0.00937652587890625,
-0.0237579345703125,
-0.0169219970703125,
0.0217132568359375,
0.0882568359375,
0.0065460205078125,
0.02850341796875,
-0.04376220703125,
0.0019359588623046875,
0.0304412841796875,
0.064453125,
-0.0280609130859375,
-0.058319091796875,
-0.0199127197265625,
-0.0237884521484375,
-0.01496124267578125,
-0.07452392578125,
0.01424407958984375,
-0.00487518310546875,
0.01432037353515625,
0.049835205078125,
0.0269775390625,
0.0089263916015625,
-0.0171661376953125,
0.08416748046875,
0.01282501220703125,
-0.035400390625,
0.035675048828125,
-0.019775390625,
0.0025768280029296875,
0.0660400390625,
0.037139892578125,
-0.03509521484375,
-0.059814453125,
-0.08135986328125,
-0.062286376953125,
0.047760009765625,
0.006927490234375,
0.0044708251953125,
-0.01470947265625,
0.0230865478515625,
-0.0037555694580078125,
-0.00669097900390625,
-0.1055908203125,
-0.036956787109375,
0.00644683837890625,
-0.0191497802734375,
0.005680084228515625,
-0.031646728515625,
-0.00756072998046875,
-0.0228424072265625,
0.0687255859375,
-0.0014944076538085938,
-0.01361083984375,
-0.016082763671875,
0.0009646415710449219,
-0.042755126953125,
0.020782470703125,
0.0146331787109375,
0.03228759765625,
-0.058319091796875,
0.0249786376953125,
0.041900634765625,
-0.02392578125,
0.005588531494140625,
-0.023590087890625,
-0.016021728515625,
-0.0059356689453125,
0.04010009765625,
0.0203094482421875,
0.0164947509765625,
-0.030914306640625,
0.04840087890625,
-0.03460693359375,
-0.029998779296875,
-0.024993896484375,
0.051666259765625,
-0.0005679130554199219,
-0.0232086181640625,
0.0111541748046875,
0.0274200439453125,
0.032196044921875,
-0.056365966796875,
0.036895751953125,
0.0300750732421875,
-0.03839111328125,
-0.0180206298828125,
0.0576171875,
0.0426025390625,
-0.01520538330078125,
0.03125,
-0.0252838134765625,
-0.03857421875,
0.0134429931640625,
0.0162200927734375,
0.048828125,
-0.037841796875,
0.00312042236328125,
0.040924072265625,
0.033203125,
0.0004210472106933594,
0.04144287109375,
0.0318603515625,
-0.0181884765625,
-0.034942626953125,
-0.02978515625,
-0.0161895751953125,
0.0164794921875,
-0.052947998046875,
0.040771484375,
-0.053985595703125,
-0.0029754638671875,
-0.00698089599609375,
0.021514892578125,
-0.01435089111328125,
0.0506591796875,
0.028228759765625,
0.095703125,
-0.11865234375,
0.039459228515625,
0.0452880859375,
-0.063720703125,
-0.08056640625,
-0.0304412841796875,
-0.009979248046875,
-0.03558349609375,
0.013427734375,
0.04669189453125,
0.01433563232421875,
-0.01154327392578125,
-0.053131103515625,
-0.052276611328125,
0.045867919921875,
0.000858306884765625,
-0.016387939453125,
0.0115509033203125,
-0.005733489990234375,
0.050048828125,
-0.0255126953125,
0.09173583984375,
0.053558349609375,
0.021728515625,
-0.0084686279296875,
-0.0821533203125,
-0.0136566162109375,
-0.055419921875,
-0.0164642333984375,
0.0006442070007324219,
-0.056182861328125,
0.054351806640625,
0.0246124267578125,
-0.01543426513671875,
-0.0192108154296875,
0.02032470703125,
-0.005794525146484375,
0.005992889404296875,
0.07745361328125,
0.05230712890625,
0.01216888427734375,
0.006984710693359375,
0.06414794921875,
-0.008575439453125,
0.03271484375,
0.06048583984375,
-0.0240020751953125,
0.057647705078125,
0.018463134765625,
-0.0240020751953125,
0.060333251953125,
0.048095703125,
0.019683837890625,
0.048736572265625,
0.007053375244140625,
-0.0295562744140625,
-0.01047515869140625,
-0.01293182373046875,
-0.0093994140625,
0.0090179443359375,
0.028839111328125,
-0.0216064453125,
-0.00774383544921875,
-0.019622802734375,
-0.0033206939697265625,
-0.0005321502685546875,
-0.042144775390625,
0.06884765625,
0.0259552001953125,
-0.04168701171875,
0.01120758056640625,
-0.0192413330078125,
0.057769775390625,
-0.04559326171875,
-0.0213775634765625,
-0.0007238388061523438,
0.0214996337890625,
-0.0289306640625,
-0.0791015625,
0.0284271240234375,
-0.00713348388671875,
-0.00439453125,
0.0087127685546875,
0.02569580078125,
-0.025115966796875,
-0.03924560546875,
0.035736083984375,
0.002773284912109375,
0.01520538330078125,
0.01445770263671875,
-0.0662841796875,
-0.0176239013671875,
0.009185791015625,
-0.0521240234375,
0.01454925537109375,
0.045623779296875,
0.004276275634765625,
0.050872802734375,
-0.0005159378051757812,
-0.00951385498046875,
0.0202178955078125,
0.004306793212890625,
0.04534912109375,
-0.04443359375,
-0.05633544921875,
-0.0297088623046875,
0.053314208984375,
-0.01386260986328125,
-0.032440185546875,
0.05078125,
0.053375244140625,
0.043212890625,
0.00390625,
0.0506591796875,
-0.015350341796875,
0.0689697265625,
-0.00392913818359375,
0.063232421875,
-0.03125,
0.00795745849609375,
-0.0269012451171875,
-0.0594482421875,
-0.028656005859375,
0.08074951171875,
-0.00958251953125,
0.02545166015625,
0.08935546875,
0.054229736328125,
-0.00893402099609375,
0.0186309814453125,
0.0095062255859375,
0.0174102783203125,
0.0203094482421875,
0.054168701171875,
0.05712890625,
-0.041900634765625,
0.03955078125,
-0.03900146484375,
-0.02630615234375,
0.0008792877197265625,
-0.05999755859375,
-0.09368896484375,
-0.0115814208984375,
-0.0472412109375,
-0.056488037109375,
-0.04638671875,
0.051239013671875,
0.0298309326171875,
-0.07037353515625,
0.006641387939453125,
0.0038433074951171875,
-0.032958984375,
0.011322021484375,
-0.0340576171875,
0.01611328125,
-0.006404876708984375,
-0.00838470458984375,
-0.0019702911376953125,
0.0232086181640625,
-0.00487518310546875,
-0.01198577880859375,
-0.0035457611083984375,
-0.021514892578125,
0.0323486328125,
0.029052734375,
0.0016050338745117188,
-0.045318603515625,
-0.06304931640625,
-0.0250396728515625,
-0.0269927978515625,
-0.010345458984375,
0.0240936279296875,
-0.0316162109375,
0.021514892578125,
0.0313720703125,
0.0091705322265625,
0.0218658447265625,
-0.00970458984375,
0.014892578125,
-0.050140380859375,
0.0379638671875,
0.05181884765625,
0.0426025390625,
0.017242431640625,
-0.0276641845703125,
0.05877685546875,
0.016021728515625,
-0.059326171875,
-0.07403564453125,
0.01171112060546875,
-0.09649658203125,
-0.021728515625,
0.07073974609375,
-0.01216888427734375,
-0.034271240234375,
-0.02264404296875,
-0.035400390625,
0.00859832763671875,
-0.021514892578125,
0.04803466796875,
0.045074462890625,
-0.01396942138671875,
-0.029998779296875,
-0.07452392578125,
0.0223541259765625,
0.03558349609375,
-0.05657958984375,
0.0128326416015625,
0.01329803466796875,
0.05072021484375,
0.0303192138671875,
0.0548095703125,
-0.0180816650390625,
0.0411376953125,
0.039642333984375,
0.0037555694580078125,
0.03564453125,
-0.01214599609375,
0.0046539306640625,
0.014190673828125,
0.020782470703125,
0.01328277587890625
]
] |
facebook/wav2vec2-base-100h | 2022-05-27T16:32:50.000Z | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"en",
"dataset:librispeech_asr",
"arxiv:2006.11477",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | facebook | null | null | facebook/wav2vec2-base-100h | 3 | 13,707 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- librispeech_asr
tags:
- audio
- automatic-speech-recognition
license: apache-2.0
---
# Wav2Vec2-Base-100h
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The base model, pretrained and fine-tuned on 100 hours of Librispeech 16kHz sampled speech audio. When using the model,
make sure that your speech input is also sampled at 16kHz.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files, the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import soundfile as sf
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-100h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-100h")
# define function to read in sound file
def map_to_array(batch):
speech, _ = sf.read(batch["file"])
batch["speech"] = speech
return batch
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
ds = ds.map(map_to_array)
# tokenize
input_values = processor(ds[0]["speech"], return_tensors="pt", padding="longest").input_values  # Batch size 1
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
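The `batch_decode` call above performs greedy CTC decoding: collapse consecutive repeated predictions, then drop the blank/padding token. A toy, self-contained sketch of that rule (the vocabulary and blank id here are invented for illustration, not the model's real vocabulary):

```python
def greedy_ctc_decode(frame_ids, id_to_char, blank_id=0):
    """Collapse repeats, then drop blanks - the greedy CTC rule."""
    out, prev = [], None
    for i in frame_ids:
        if i != prev and i != blank_id:
            out.append(id_to_char[i])
        prev = i
    return "".join(out)

# Per-frame argmax ids for 7 audio frames over a toy vocabulary.
vocab = {0: "<pad>", 1: "C", 2: "A", 3: "T"}
print(greedy_ctc_decode([1, 1, 0, 2, 2, 0, 3], vocab))  # CAT
```

Note that a blank between two identical ids keeps both (e.g. `[1, 0, 1]` decodes to `"CC"`), which is how CTC distinguishes a doubled letter from a held one.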
## Evaluation
This code snippet shows how to evaluate **facebook/wav2vec2-base-100h** on LibriSpeech's "clean" and "other" test data.
```python
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import soundfile as sf
import torch
from jiwer import wer
librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-100h").to("cuda")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-100h")
def map_to_pred(batch):
    input_values = processor(batch["audio"][0]["array"], return_tensors="pt", padding="longest").input_values
with torch.no_grad():
logits = model(input_values.to("cuda")).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
batch["transcription"] = transcription
return batch
result = librispeech_eval.map(map_to_pred, batched=True, batch_size=1, remove_columns=["audio"])
print("WER:", wer(result["text"], result["transcription"]))
```
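`jiwer.wer` computes the word-level edit distance (substitutions + insertions + deletions) between reference and hypothesis, divided by the number of reference words. A minimal sketch of that metric, assuming whitespace tokenization and a non-empty reference:

```python
def word_error_rate(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words (Levenshtein DP table).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("the cat sat", "the cat sit"))  # 1 substitution / 3 words
```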
*Result (WER)*:
| "clean" | "other" |
|---|---|
| 6.1 | 13.5 |
| 3,765 | [
[
-0.01256561279296875,
-0.049285888671875,
0.01031494140625,
0.0138397216796875,
-0.0145263671875,
-0.01045989990234375,
-0.039306640625,
-0.040191650390625,
-0.0032901763916015625,
0.01369476318359375,
-0.0458984375,
-0.046234130859375,
-0.0428466796875,
-0.031280517578125,
-0.02984619140625,
0.07147216796875,
0.0229949951171875,
0.006542205810546875,
0.0080108642578125,
-0.0113372802734375,
-0.030548095703125,
-0.0200347900390625,
-0.0638427734375,
-0.034393310546875,
0.01366424560546875,
0.0185394287109375,
0.016937255859375,
0.0185546875,
0.024627685546875,
0.0252227783203125,
-0.0157012939453125,
0.00437164306640625,
-0.05133056640625,
-0.006183624267578125,
0.00804901123046875,
-0.031463623046875,
-0.0209197998046875,
0.020477294921875,
0.04107666015625,
0.037200927734375,
-0.0162811279296875,
0.038330078125,
0.0034656524658203125,
0.033782958984375,
-0.02294921875,
0.0251922607421875,
-0.044158935546875,
-0.0184173583984375,
-0.00922393798828125,
-0.00940704345703125,
-0.046722412109375,
-0.00920867919921875,
0.01126861572265625,
-0.03863525390625,
0.01568603515625,
-0.0184478759765625,
0.067626953125,
0.01708984375,
-0.0224609375,
-0.033599853515625,
-0.07061767578125,
0.06488037109375,
-0.049835205078125,
0.05194091796875,
0.035308837890625,
0.0149078369140625,
-0.0024852752685546875,
-0.085205078125,
-0.03363037109375,
-0.00374603271484375,
0.0202789306640625,
0.033233642578125,
-0.0251617431640625,
0.0051727294921875,
0.0276947021484375,
0.0174407958984375,
-0.049835205078125,
0.002689361572265625,
-0.067138671875,
-0.037139892578125,
0.059173583984375,
-0.0243072509765625,
0.0033092498779296875,
-0.003330230712890625,
-0.02471923828125,
-0.045928955078125,
-0.01788330078125,
0.03717041015625,
0.02386474609375,
0.0168914794921875,
-0.0308837890625,
0.032135009765625,
0.00494384765625,
0.04693603515625,
0.006565093994140625,
-0.03179931640625,
0.05340576171875,
-0.01239013671875,
-0.0100250244140625,
0.031829833984375,
0.0628662109375,
0.01471710205078125,
0.01525115966796875,
0.00897216796875,
-0.0096282958984375,
0.01544189453125,
-0.01540374755859375,
-0.056488037109375,
-0.0419921875,
0.034820556640625,
-0.03106689453125,
0.00653839111328125,
0.0107269287109375,
-0.0199737548828125,
-0.006046295166015625,
-0.0173797607421875,
0.07098388671875,
-0.043792724609375,
-0.0200347900390625,
0.0103607177734375,
-0.021759033203125,
0.01477813720703125,
-0.01068878173828125,
-0.0645751953125,
0.01434326171875,
0.035858154296875,
0.06207275390625,
0.009979248046875,
-0.007396697998046875,
-0.044036865234375,
-0.007541656494140625,
-0.0191650390625,
0.037689208984375,
-0.00719451904296875,
-0.044586181640625,
-0.0240478515625,
-0.0094146728515625,
0.005466461181640625,
-0.0465087890625,
0.054840087890625,
-0.02508544921875,
0.0187530517578125,
-0.0067901611328125,
-0.0511474609375,
-0.019500732421875,
-0.046661376953125,
-0.0408935546875,
0.09368896484375,
0.01348114013671875,
-0.045806884765625,
0.017974853515625,
-0.031280517578125,
-0.048828125,
-0.022857666015625,
-0.0032405853271484375,
-0.043212890625,
0.00872039794921875,
0.017822265625,
0.0303955078125,
-0.010223388671875,
0.0010538101196289062,
-0.007389068603515625,
-0.0478515625,
0.02801513671875,
-0.039825439453125,
0.08831787109375,
0.02362060546875,
-0.042510986328125,
0.01374053955078125,
-0.06951904296875,
0.0160675048828125,
0.0011053085327148438,
-0.035552978515625,
0.00917816162109375,
-0.006969451904296875,
0.030548095703125,
0.015594482421875,
0.0107421875,
-0.0430908203125,
-0.0155487060546875,
-0.060211181640625,
0.043426513671875,
0.0501708984375,
-0.01003265380859375,
0.0285491943359375,
-0.0239715576171875,
-0.0036468505859375,
-0.0208587646484375,
-0.004550933837890625,
0.007541656494140625,
-0.035369873046875,
-0.048919677734375,
-0.03326416015625,
0.0239410400390625,
0.04290771484375,
-0.0158538818359375,
0.049102783203125,
-0.009735107421875,
-0.06878662109375,
-0.0736083984375,
0.0005373954772949219,
0.02349853515625,
0.041534423828125,
0.052032470703125,
-0.015167236328125,
-0.06085205078125,
-0.056793212890625,
-0.0106658935546875,
-0.01546478271484375,
-0.0167999267578125,
0.021209716796875,
0.0178375244140625,
-0.02685546875,
0.05181884765625,
-0.0174407958984375,
-0.0286102294921875,
-0.0195159912109375,
0.01468658447265625,
0.047027587890625,
0.05059814453125,
0.0204620361328125,
-0.04693603515625,
-0.02447509765625,
-0.026092529296875,
-0.037811279296875,
-0.01291656494140625,
-0.007526397705078125,
-0.0003440380096435547,
0.0086822509765625,
0.0302734375,
-0.035186767578125,
0.028594970703125,
0.03875732421875,
-0.012603759765625,
0.027099609375,
-0.004947662353515625,
-0.0005145072937011719,
-0.07415771484375,
0.0010595321655273438,
-0.01102447509765625,
-0.018829345703125,
-0.041656494140625,
-0.042083740234375,
-0.010833740234375,
-0.005077362060546875,
-0.040374755859375,
0.030029296875,
-0.03179931640625,
-0.0184173583984375,
-0.01544189453125,
0.01509857177734375,
-0.0118560791015625,
0.03472900390625,
0.005359649658203125,
0.04937744140625,
0.0509033203125,
-0.040374755859375,
0.04559326171875,
0.0167083740234375,
-0.03985595703125,
0.0021495819091796875,
-0.0633544921875,
0.0350341796875,
0.01123046875,
0.0257415771484375,
-0.0908203125,
-0.00569915771484375,
-0.004428863525390625,
-0.071533203125,
0.0219879150390625,
0.0020294189453125,
-0.0291748046875,
-0.0273590087890625,
-0.005794525146484375,
0.026092529296875,
0.0738525390625,
-0.048828125,
0.04217529296875,
0.0343017578125,
0.01151275634765625,
-0.035858154296875,
-0.0723876953125,
-0.034698486328125,
-0.0011730194091796875,
-0.057403564453125,
0.029632568359375,
0.0013780593872070312,
-0.00339508056640625,
-0.00949859619140625,
-0.035247802734375,
0.011627197265625,
-0.00897979736328125,
0.042327880859375,
0.015960693359375,
-0.007274627685546875,
0.0105133056640625,
-0.0094146728515625,
-0.01959228515625,
0.01385498046875,
-0.039581298828125,
0.05572509765625,
-0.0137176513671875,
-0.01541900634765625,
-0.06536865234375,
-0.0021572113037109375,
0.0133209228515625,
-0.0225982666015625,
0.03314208984375,
0.0869140625,
-0.031890869140625,
-0.0208892822265625,
-0.0380859375,
-0.0241546630859375,
-0.0406494140625,
0.051605224609375,
-0.01904296875,
-0.047943115234375,
0.027313232421875,
0.002254486083984375,
0.0110321044921875,
0.048583984375,
0.0584716796875,
-0.03472900390625,
0.05963134765625,
0.0205078125,
0.0014629364013671875,
0.0408935546875,
-0.06610107421875,
0.007572174072265625,
-0.047393798828125,
-0.034515380859375,
-0.0272979736328125,
-0.031890869140625,
-0.03936767578125,
-0.039154052734375,
0.0338134765625,
0.007541656494140625,
-0.0067291259765625,
0.0282440185546875,
-0.05255126953125,
0.0146026611328125,
0.05340576171875,
0.0212249755859375,
-0.0123748779296875,
0.018096923828125,
0.0009489059448242188,
-0.003383636474609375,
-0.036376953125,
-0.017822265625,
0.09185791015625,
0.03167724609375,
0.0543212890625,
-0.0106048583984375,
0.058563232421875,
0.013397216796875,
-0.020111083984375,
-0.06585693359375,
0.03466796875,
-0.01019287109375,
-0.05120849609375,
-0.022369384765625,
-0.017242431640625,
-0.06414794921875,
0.00980377197265625,
-0.02166748046875,
-0.06378173828125,
0.00965118408203125,
0.0011396408081054688,
-0.022064208984375,
0.01129150390625,
-0.05865478515625,
0.045379638671875,
-0.01110076904296875,
-0.022705078125,
-0.024932861328125,
-0.0535888671875,
0.0026397705078125,
0.01041412353515625,
0.01494598388671875,
-0.01229095458984375,
0.032135009765625,
0.10357666015625,
-0.01416015625,
0.03826904296875,
-0.03131103515625,
0.0038928985595703125,
0.054443359375,
-0.01549530029296875,
0.0221099853515625,
0.0029296875,
-0.0180816650390625,
0.022796630859375,
0.01004791259765625,
-0.02581787109375,
-0.0283966064453125,
0.04608154296875,
-0.07684326171875,
-0.0179595947265625,
-0.0173797607421875,
-0.034820556640625,
-0.0212860107421875,
0.00872802734375,
0.06195068359375,
0.05657958984375,
-0.0059051513671875,
0.0347900390625,
0.047760009765625,
-0.0026416778564453125,
0.0360107421875,
0.006893157958984375,
-0.00609588623046875,
-0.034698486328125,
0.0711669921875,
0.015594482421875,
0.0155181884765625,
0.010223388671875,
0.01558685302734375,
-0.046539306640625,
-0.033905029296875,
-0.003604888916015625,
0.01483154296875,
-0.056060791015625,
-0.00012290477752685547,
-0.04571533203125,
-0.02789306640625,
-0.0513916015625,
0.00186920166015625,
-0.050445556640625,
-0.03448486328125,
-0.03424072265625,
-0.008758544921875,
0.0283660888671875,
0.04083251953125,
-0.03912353515625,
0.03338623046875,
-0.040069580078125,
0.036651611328125,
0.0231475830078125,
-0.006134033203125,
-0.01384735107421875,
-0.07598876953125,
-0.025543212890625,
0.0182647705078125,
-0.0008063316345214844,
-0.06817626953125,
0.0145721435546875,
0.017578125,
0.0343017578125,
0.0224609375,
-0.007320404052734375,
0.04864501953125,
-0.0226287841796875,
0.056640625,
0.0189666748046875,
-0.08343505859375,
0.05108642578125,
-0.0030517578125,
0.0112152099609375,
0.03564453125,
0.01123046875,
-0.0286865234375,
-0.00443267822265625,
-0.05224609375,
-0.077392578125,
0.06866455078125,
0.031646728515625,
-0.0016689300537109375,
0.0305328369140625,
0.017486572265625,
-0.007640838623046875,
-0.0054779052734375,
-0.04681396484375,
-0.040069580078125,
-0.0303955078125,
-0.0251617431640625,
-0.025848388671875,
-0.0164642333984375,
-0.0035572052001953125,
-0.038482666015625,
0.07232666015625,
0.0267333984375,
0.042449951171875,
0.02911376953125,
-0.0144805908203125,
0.0076141357421875,
0.00962066650390625,
0.0280303955078125,
0.0263214111328125,
-0.02911376953125,
0.0119781494140625,
0.0255279541015625,
-0.0509033203125,
0.0155792236328125,
0.017242431640625,
0.01140594482421875,
0.0121002197265625,
0.056732177734375,
0.0858154296875,
-0.003570556640625,
-0.03466796875,
0.03485107421875,
0.003997802734375,
-0.020355224609375,
-0.040740966796875,
0.0164337158203125,
0.036834716796875,
0.0269775390625,
0.03472900390625,
0.003269195556640625,
0.00785064697265625,
-0.031158447265625,
0.030914306640625,
0.0145721435546875,
-0.03582763671875,
-0.02252197265625,
0.06646728515625,
0.00473785400390625,
-0.0238037109375,
0.05181884765625,
-0.0015344619750976562,
-0.0208282470703125,
0.047882080078125,
0.043914794921875,
0.056793212890625,
-0.0230255126953125,
-0.01525115966796875,
0.042327880859375,
0.0204010009765625,
0.001346588134765625,
0.039825439453125,
-0.01032257080078125,
-0.0360107421875,
-0.0197906494140625,
-0.044586181640625,
0.00009959936141967773,
0.0176544189453125,
-0.05841064453125,
0.0269622802734375,
-0.0323486328125,
-0.03033447265625,
0.02117919921875,
0.0174102783203125,
-0.057525634765625,
0.0294036865234375,
0.021636962890625,
0.05615234375,
-0.0638427734375,
0.07867431640625,
0.0244140625,
-0.0233001708984375,
-0.09600830078125,
-0.01151275634765625,
-0.002872467041015625,
-0.055206298828125,
0.047149658203125,
0.022857666015625,
-0.03228759765625,
0.0160369873046875,
-0.042938232421875,
-0.061248779296875,
0.079345703125,
0.02825927734375,
-0.05322265625,
0.00807952880859375,
-0.00677490234375,
0.037506103515625,
-0.005542755126953125,
0.0170745849609375,
0.05572509765625,
0.03338623046875,
0.004535675048828125,
-0.07196044921875,
-0.01183319091796875,
-0.00982666015625,
-0.0205535888671875,
-0.0228729248046875,
-0.0445556640625,
0.07427978515625,
-0.03118896484375,
-0.025177001953125,
-0.00858306884765625,
0.07720947265625,
0.0215606689453125,
0.0251617431640625,
0.04510498046875,
0.036895751953125,
0.074462890625,
-0.017578125,
0.05462646484375,
-0.0017576217651367188,
0.0430908203125,
0.083984375,
0.006641387939453125,
0.06756591796875,
0.0194091796875,
-0.027923583984375,
0.031982421875,
0.0447998046875,
-0.010894775390625,
0.056854248046875,
0.01849365234375,
-0.018951416015625,
-0.01116943359375,
0.004894256591796875,
-0.046478271484375,
0.06781005859375,
0.0240325927734375,
-0.0102386474609375,
0.016326904296875,
0.01090240478515625,
-0.00786590576171875,
-0.0081939697265625,
-0.005512237548828125,
0.052215576171875,
0.01261138916015625,
-0.0177154541015625,
0.07293701171875,
0.005046844482421875,
0.0594482421875,
-0.04705810546875,
0.00312042236328125,
0.016326904296875,
0.022735595703125,
-0.0310516357421875,
-0.04742431640625,
0.01001739501953125,
-0.0131378173828125,
-0.0156097412109375,
0.004055023193359375,
0.046630859375,
-0.050323486328125,
-0.036346435546875,
0.0457763671875,
0.00782012939453125,
0.023956298828125,
0.00154876708984375,
-0.04998779296875,
0.0265350341796875,
0.02191162109375,
-0.0306549072265625,
-0.00925445556640625,
0.00576019287109375,
0.02618408203125,
0.022674560546875,
0.05804443359375,
0.0148773193359375,
0.01479339599609375,
0.0006299018859863281,
0.042205810546875,
-0.040008544921875,
-0.03973388671875,
-0.048126220703125,
0.03265380859375,
0.0069427490234375,
-0.0184173583984375,
0.04217529296875,
0.060302734375,
0.07720947265625,
0.001880645751953125,
0.056243896484375,
-0.0044708251953125,
0.04876708984375,
-0.050384521484375,
0.06005859375,
-0.0430908203125,
0.0089874267578125,
-0.010894775390625,
-0.058563232421875,
0.00569915771484375,
0.0703125,
-0.0083465576171875,
0.022552490234375,
0.0380859375,
0.06817626953125,
-0.006099700927734375,
-0.000278472900390625,
0.01210784912109375,
0.0238037109375,
0.0253753662109375,
0.0543212890625,
0.038330078125,
-0.060546875,
0.054962158203125,
-0.044677734375,
-0.0128936767578125,
-0.00711822509765625,
-0.0177154541015625,
-0.06524658203125,
-0.06304931640625,
-0.020294189453125,
-0.053802490234375,
-0.0033626556396484375,
0.080078125,
0.064208984375,
-0.06829833984375,
-0.0321044921875,
0.0189056396484375,
-0.0097198486328125,
-0.030609130859375,
-0.01474761962890625,
0.057373046875,
0.0031528472900390625,
-0.06610107421875,
0.059112548828125,
-0.00542449951171875,
0.0026721954345703125,
0.0048370361328125,
-0.01494598388671875,
-0.0225372314453125,
0.0024356842041015625,
0.027099609375,
0.0164642333984375,
-0.05352783203125,
-0.01537322998046875,
-0.0093231201171875,
-0.01468658447265625,
0.00974273681640625,
0.0292816162109375,
-0.052703857421875,
0.0469970703125,
0.04718017578125,
0.023529052734375,
0.076171875,
-0.0162200927734375,
0.007450103759765625,
-0.052490234375,
0.03533935546875,
0.0212554931640625,
0.022857666015625,
0.0220794677734375,
-0.01490020751953125,
0.0233001708984375,
0.0265655517578125,
-0.04058837890625,
-0.05908203125,
-0.00727081298828125,
-0.1025390625,
-0.01546478271484375,
0.09552001953125,
0.003570556640625,
-0.01605224609375,
0.01171875,
-0.0300140380859375,
0.07489013671875,
-0.036590576171875,
0.0264434814453125,
0.0323486328125,
-0.0177459716796875,
0.0124359130859375,
-0.03912353515625,
0.042083740234375,
0.032806396484375,
-0.0232086181640625,
-0.00638580322265625,
0.028594970703125,
0.0438232421875,
0.00844573974609375,
0.068603515625,
-0.006542205810546875,
0.0279541015625,
0.0261993408203125,
0.0193023681640625,
-0.019439697265625,
-0.021697998046875,
-0.03521728515625,
0.006168365478515625,
-0.01161956787109375,
-0.040771484375
]
] |
TheBloke/Falcon-7B-Instruct-GPTQ | 2023-08-21T11:21:22.000Z | [
"transformers",
"safetensors",
"RefinedWebModel",
"text-generation",
"custom_code",
"en",
"dataset:tiiuae/falcon-refinedweb",
"arxiv:2205.14135",
"arxiv:1911.02150",
"arxiv:2005.14165",
"arxiv:2104.09864",
"license:apache-2.0",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Falcon-7B-Instruct-GPTQ | 60 | 13,681 | transformers | 2023-05-27T09:03:00 | ---
datasets:
- tiiuae/falcon-refinedweb
license: apache-2.0
language:
- en
inference: false
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Falcon-7B-Instruct GPTQ
This repo contains an experimental 4-bit GPTQ model for [Falcon-7B-Instruct](https://huggingface.co/tiiuae/falcon-7b-instruct).
It is the result of quantising to 4bit using [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ).
## PERFORMANCE
Please note that inference with this GPTQ is currently very slow with AutoGPTQ.
It may perform better with the latest GPTQ-for-LLaMa code, but I haven't tested that personally yet.
## Prompt template
```
A helpful assistant who helps the user with any questions asked.
User: prompt
Assistant:
```
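For programmatic use, the template above can be filled in with a small helper. This is an illustrative sketch; the `build_prompt` function name and structure are my own, not part of this repo.

```python
# Hypothetical helper that fills the prompt template shown above.
def build_prompt(user_message: str) -> str:
    return (
        "A helpful assistant who helps the user with any questions asked.\n"
        f"User: {user_message}\n"
        "Assistant:"
    )

print(build_prompt("Tell me about AI"))
```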
## AutoGPTQ
AutoGPTQ is required: `GITHUB_ACTIONS=true pip install auto-gptq`
AutoGPTQ provides pre-compiled wheels for Windows and Linux, with CUDA toolkit 11.7 or 11.8.
If you are running CUDA toolkit 12.x, you will need to compile your own by following these instructions:
```
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
pip install .
```
These manual steps will require that you have the [Nvidia CUDA toolkit](https://developer.nvidia.com/cuda-12-0-1-download-archive) installed.
## How to download and use this model in text-generation-webui
1. Launch text-generation-webui
2. Click the **Model tab**.
3. Untick **Autoload model**
4. Under **Download custom model or LoRA**, enter `TheBloke/falcon-7B-instruct-GPTQ`.
5. Click **Download**.
6. Wait until it says it's finished downloading.
7. Click the **Refresh** icon next to **Model** in the top left.
8. In the **Model drop-down**: choose the model you just downloaded, `falcon-7B-instruct-GPTQ`.
9. Set **Loader** to **AutoGPTQ**. This model will not work with ExLlama. It might work with recent GPTQ-for-LLaMa but I haven't tested that.
10. Tick **Trust Remote Code**, followed by **Save Settings**
11. Click **Reload**.
12. Once it says it's loaded, click the **Text Generation tab** and enter a prompt!
## About `trust_remote_code`
Please be aware that this command line argument causes Python code provided by Falcon to be executed on your machine.
This code is required at the moment because Falcon is too new to be supported by Hugging Face transformers. At some point in the future transformers will support the model natively, and then `trust_remote_code` will no longer be needed.
In this repo you can see two `.py` files - these are the files that get executed. They are copied from the base repo at [Falcon-7B-Instruct](https://huggingface.co/tiiuae/falcon-7b-instruct).
## Simple Python example code
To run this code you need to install AutoGPTQ and einops:
```
GITHUB_ACTIONS=true pip install auto-gptq
pip install einops
```
You can then run this example code:
```python
from transformers import AutoTokenizer, pipeline, logging
from auto_gptq import AutoGPTQForCausalLM
model_name_or_path = "TheBloke/falcon-7b-instruct-GPTQ"
# You could also download the model locally, and access it there
# model_name_or_path = "/path/to/TheBloke_falcon-7b-instruct-GPTQ"
model_basename = "model"
use_triton = False
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(model_name_or_path,
model_basename=model_basename,
use_safetensors=True,
trust_remote_code=True,
device="cuda:0",
use_triton=use_triton,
quantize_config=None)
prompt = "Tell me about AI"
prompt_template=f'''A helpful assistant who helps the user with any questions asked.
User: {prompt}
Assistant:'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
# Note that if you use pipeline, you will see a spurious error message saying the model type is not supported
# This can be ignored! Or you can hide it with the following logging line:
# Prevent printing spurious transformers error when using pipeline with AutoGPTQ
logging.set_verbosity(logging.CRITICAL)
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
temperature=0.7,
top_p=0.95,
repetition_penalty=1.15
)
print(pipe(prompt_template)[0]['generated_text'])
```
## Provided files
**gptq_model-4bit-64g.safetensors**
This will work with AutoGPTQ 0.2.0 and later.
It was created with groupsize 64 to give higher inference quality, and without `desc_act` (act-order) to increase inference speed.
* `gptq_model-4bit-64g.safetensors`
* Works with AutoGPTQ CUDA 0.2.0 and later.
* At this time it does not work with AutoGPTQ Triton, but support will hopefully be added in time.
* Works with text-generation-webui using `--trust-remote-code`
* Does not work with any version of GPTQ-for-LLaMa
* Parameters: Groupsize = 64. No act-order.
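As a rough back-of-the-envelope check of what 4-bit quantization with groupsize 64 buys, the sketch below estimates checkpoint size. The figures are assumptions, not measurements from this repo: roughly 7e9 quantized weights, and a typical GPTQ group layout of one fp16 scale plus one 4-bit zero-point per group of 64 weights.

```python
# Back-of-the-envelope size estimate for a 4-bit, groupsize-64 checkpoint.
n_weights = 7e9
group_size = 64

fp16_bytes = n_weights * 2                          # 16-bit baseline
weight_bits = 4
overhead_bits_per_weight = (16 + 4) / group_size    # fp16 scale + 4-bit zero per group
gptq_bytes = n_weights * (weight_bits + overhead_bits_per_weight) / 8

print(f"fp16: {fp16_bytes / 1e9:.1f} GB, 4-bit g64: {gptq_bytes / 1e9:.1f} GB")
# → fp16: 14.0 GB, 4-bit g64: 3.8 GB
```

The group scales add only about 0.3 bits per weight of overhead, so the quantized model stays close to a quarter of the fp16 size.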
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute.
Thanks to the [chirper.ai](https://chirper.ai) team!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Sam, theTransient, Jonathan Leane, Steven Wood, webtim, Johann-Peter Hartmann, Geoffrey Montalvo, Gabriel Tamborski, Willem Michiel, John Villwock, Derek Yates, Mesiah Bishop, Eugene Pentland, Pieter, Chadd, Stephen Murray, Daniel P. Andersen, terasurfer, Brandon Frisco, Thomas Belote, Sid, Nathan LeClaire, Magnesian, Alps Aficionado, Stanislav Ovsiannikov, Alex, Joseph William Delisle, Nikolai Manek, Michael Davis, Junyu Yang, K, J, Spencer Kim, Stefan Sabev, Olusegun Samson, transmissions 11, Michael Levine, Cory Kujawski, Rainer Wilmers, zynix, Kalila, Luke @flexchar, Ajan Kanaga, Mandus, vamX, Ai Maven, Mano Prime, Matthew Berman, subjectnull, Vitor Caleffi, Clay Pascal, biorpg, alfie_i, 阿明, Jeffrey Morgan, ya boyyy, Raymond Fosdick, knownsqashed, Olakabola, Leonard Tan, ReadyPlayerEmma, Enrico Ros, Dave, Talal Aujan, Illia Dulskyi, Sean Connelly, senxiiz, Artur Olbinski, Elle, Raven Klaugh, Fen Risland, Deep Realms, Imad Khwaja, Fred von Graf, Will Dee, usrbinkat, SuperWojo, Alexandros Triantafyllidis, Swaroop Kallakuri, Dan Guido, John Detwiler, Pedro Madruga, Iucharbius, Viktor Bowallius, Asp the Wyvern, Edmond Seymore, Trenton Dambrowitz, Space Cruiser, Spiking Neurons AB, Pyrater, LangChain4j, Tony Hughes, Kacper Wikieł, Rishabh Srivastava, David Ziegler, Luke Pendergrass, Andrey, Gabriel Puliatti, Lone Striker, Sebastain Graf, Pierre Kircher, Randy H, NimbleBox.ai, Vadim, danny, Deo Leter
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# ✨ Original model card: Falcon-7B-Instruct
**Falcon-7B-Instruct is a 7B parameters causal decoder-only model built by [TII](https://www.tii.ae) based on [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b) and finetuned on a mixture of chat/instruct datasets. It is made available under the [TII Falcon LLM License](https://huggingface.co/tiiuae/falcon-7b-instruct/blob/main/LICENSE.txt).**
*Paper coming soon 😊.*
## Why use Falcon-7B-Instruct?
* **You are looking for a ready-to-use chat/instruct model based on [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b).**
* **Falcon-7B is a strong base model, outperforming comparable open-source models** (e.g., [MPT-7B](https://huggingface.co/mosaicml/mpt-7b), [StableLM](https://github.com/Stability-AI/StableLM), [RedPajama](https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1) etc.), thanks to being trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
* **It features an architecture optimized for inference**, with FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135)) and multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)).
💬 **This is an instruct model, which may not be ideal for further finetuning.** If you are interested in building your own instruct/chat model, we recommend starting from [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b).
🔥 **Looking for an even more powerful model?** [Falcon-40B-Instruct](https://huggingface.co/tiiuae/falcon-40b-instruct) is Falcon-7B-Instruct's big brother!
💥 **Falcon LLMs require PyTorch 2.0 for use with `transformers`!**
# Model Card for Falcon-7B-Instruct
## Model Details
### Model Description
- **Developed by:** [https://www.tii.ae](https://www.tii.ae);
- **Model type:** Causal decoder-only;
- **Language(s) (NLP):** English and French;
- **License:** [TII Falcon LLM License](https://huggingface.co/tiiuae/falcon-7b-instruct/blob/main/LICENSE.txt);
- **Finetuned from model:** [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b).
### Model Source
- **Paper:** *coming soon*.
## Uses
### Direct Use
Falcon-7B-Instruct has been finetuned on a mixture of instruct and chat datasets.
### Out-of-Scope Use
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
## Bias, Risks, and Limitations
Falcon-7B-Instruct is mostly trained on English data, and will not generalize appropriately to other languages. Furthermore, as it is trained on a large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
### Recommendations
We recommend users of Falcon-7B-Instruct to develop guardrails and to take appropriate precautions for any production use.
## How to Get Started with the Model
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch
model = "tiiuae/falcon-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
torch_dtype=torch.bfloat16,
trust_remote_code=True,
device_map="auto",
)
sequences = pipeline(
"Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
max_length=200,
do_sample=True,
top_k=10,
num_return_sequences=1,
eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
```
## Training Details
### Training Data
Falcon-7B-Instruct was finetuned on a 250M tokens mixture of instruct/chat datasets.
| **Data source** | **Fraction** | **Tokens** | **Description** |
|--------------------|--------------|------------|-----------------------------------|
| [Baize](https://github.com/project-baize/baize-chatbot) | 65% | 164M | chat |
| [GPT4All](https://github.com/nomic-ai/gpt4all) | 25% | 62M | instruct |
| [GPTeacher](https://github.com/teknium1/GPTeacher) | 5% | 11M | instruct |
| [RefinedWeb-English](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) | 5% | 13M | massive web crawl |
The data was tokenized with the Falcon-[7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b) tokenizer.
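As a quick sanity check, the per-source token counts in the table above can be summed against the stated 250M-token mixture (counts copied from the table):

```python
# Verify that the data-mixture rows add up to the stated 250M tokens.
mixture_tokens_m = {
    "Baize": 164,
    "GPT4All": 62,
    "GPTeacher": 11,
    "RefinedWeb-English": 13,
}
total = sum(mixture_tokens_m.values())
print(total)  # 250 (million tokens)
```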
## Evaluation
*Paper coming soon.*
See the [OpenLLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) for early results.
Note that this model variant is not optimized for NLP benchmarks.
## Technical Specifications
For more information about pretraining, see [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b).
### Model Architecture and Objective
Falcon-7B is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token).
The architecture is broadly adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)), with the following differences:
* **Positional embeddings:** rotary ([Su et al., 2021](https://arxiv.org/abs/2104.09864));
* **Attention:** multiquery ([Shazeer et al., 2019](https://arxiv.org/abs/1911.02150)) and FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135));
* **Decoder-block:** parallel attention/MLP with a single layer norm.
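The parallel decoder block can be sketched as follows. This is a structural illustration only: `ln`, `attn`, and `mlp` are placeholder callables, not Falcon's actual modules.

```python
# Minimal sketch of Falcon's parallel decoder block: a single layer norm
# feeds both attention and MLP, and both outputs are added to the residual
# stream (vs. the sequential ln->attn, ln->mlp of a classic GPT block).
def parallel_block(x, ln, attn, mlp):
    h = ln(x)
    return x + attn(h) + mlp(h)

# Toy scalar demo: identity norm, attention doubles, MLP triples.
out = parallel_block(1.0, lambda v: v, lambda v: 2 * v, lambda v: 3 * v)
print(out)  # 6.0
```

Because attention and MLP share one normalized input, the two matmuls can run concurrently, which helps inference throughput.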
| **Hyperparameter** | **Value** | **Comment** |
|--------------------|-----------|----------------------------------------|
| Layers | 32 | |
| `d_model` | 4544 | Increased to compensate for multiquery |
| `head_dim` | 64 | Reduced to optimise for FlashAttention |
| Vocabulary | 65024 | |
| Sequence length | 2048 | |
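A rough parameter count from the hyperparameters above sanity-checks the "7B" in the model name. The 4x MLP expansion and the exact multiquery projection layout are assumptions here, so treat this as an order-of-magnitude estimate rather than the official count.

```python
# Rough parameter estimate from the hyperparameter table above.
layers, d_model, head_dim, vocab = 32, 4544, 64, 65024

embed = vocab * d_model                      # token embeddings
attn = d_model * d_model                     # query projection
attn += 2 * d_model * head_dim               # shared K/V heads (multiquery)
attn += d_model * d_model                    # output projection
mlp = 2 * d_model * (4 * d_model)            # up + down projections (4x assumed)
total = embed + layers * (attn + mlp)

print(f"{total / 1e9:.2f}B parameters")  # → 6.92B parameters
```

Note how cheap multiquery attention makes the K/V projections: a single shared 64-dim K/V head costs ~0.58M parameters per layer versus ~20.6M for a full multi-head layout, which is why `d_model` was increased to compensate.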
### Compute Infrastructure
#### Hardware
Falcon-7B-Instruct was trained on AWS SageMaker, on 32 A100 40GB GPUs in P4d instances.
#### Software
Falcon-7B-Instruct was trained with a custom distributed training codebase, Gigatron. It uses a 3D parallelism approach combined with ZeRO and high-performance Triton kernels (FlashAttention, etc.)
## Citation
*Paper coming soon 😊.*
## License
Falcon-7B-Instruct is made available under the [TII Falcon LLM License](https://huggingface.co/tiiuae/falcon-7b-instruct/blob/main/LICENSE.txt). Broadly speaking,
* You can freely use our models for research and/or personal purposes;
* You are allowed to share and build derivatives of these models, but you are required to give attribution and to share-alike with the same license;
* For commercial use, you are exempt from royalty payments if the attributable revenues are below $1M/year; otherwise you should enter into a commercial agreement with TII.
## Contact
falconllm@tii.ae
| 16,784 | [
[
-0.0445556640625,
-0.053680419921875,
0.01540374755859375,
0.006580352783203125,
-0.0168609619140625,
0.00608062744140625,
0.01514434814453125,
-0.03302001953125,
0.026336669921875,
0.013092041015625,
-0.050140380859375,
-0.03045654296875,
-0.039215087890625,
0.00313568115234375,
-0.030059814453125,
0.07318115234375,
0.01904296875,
-0.033294677734375,
0.0106201171875,
0.00630950927734375,
-0.0172576904296875,
-0.0257568359375,
-0.06396484375,
-0.016082763671875,
0.0206298828125,
0.0107574462890625,
0.07061767578125,
0.056488037109375,
0.0291748046875,
0.03424072265625,
-0.004863739013671875,
0.0098419189453125,
-0.024322509765625,
-0.0208892822265625,
0.00899505615234375,
-0.0214996337890625,
-0.042144775390625,
0.01029205322265625,
0.04693603515625,
0.01474761962890625,
-0.0192718505859375,
0.0116729736328125,
-0.0033740997314453125,
0.038299560546875,
-0.0284881591796875,
0.0248565673828125,
-0.03173828125,
0.004364013671875,
-0.00909423828125,
0.01216888427734375,
0.0001442432403564453,
-0.0251007080078125,
-0.0175933837890625,
-0.077392578125,
0.0141448974609375,
0.00954437255859375,
0.1005859375,
0.02923583984375,
-0.0287933349609375,
-0.0044097900390625,
-0.03302001953125,
0.04107666015625,
-0.0692138671875,
0.0276947021484375,
0.016998291015625,
0.0209197998046875,
-0.01375579833984375,
-0.08856201171875,
-0.05975341796875,
-0.0106658935546875,
-0.01372528076171875,
0.033111572265625,
-0.043426513671875,
-0.004833221435546875,
0.0227203369140625,
0.0390625,
-0.0447998046875,
-0.013702392578125,
-0.0322265625,
-0.01340484619140625,
0.054107666015625,
0.0169830322265625,
0.034576416015625,
-0.0221405029296875,
-0.024749755859375,
-0.02490234375,
-0.035736083984375,
0.0238037109375,
0.0225372314453125,
0.0078582763671875,
-0.045135498046875,
0.034576416015625,
-0.0184783935546875,
0.035400390625,
0.046722412109375,
0.01007843017578125,
0.023406982421875,
-0.036346435546875,
-0.040069580078125,
-0.018951416015625,
0.101318359375,
0.0213775634765625,
-0.01258087158203125,
0.006832122802734375,
-0.007167816162109375,
0.000014424324035644531,
0.005950927734375,
-0.06219482421875,
-0.0274505615234375,
0.0305633544921875,
-0.031341552734375,
-0.02349853515625,
0.004116058349609375,
-0.05902099609375,
-0.006580352783203125,
0.007503509521484375,
0.03424072265625,
-0.0247039794921875,
-0.032501220703125,
0.005786895751953125,
-0.0156402587890625,
0.033111572265625,
0.005710601806640625,
-0.06884765625,
0.0160064697265625,
0.033721923828125,
0.0584716796875,
0.028045654296875,
-0.0276641845703125,
-0.051055908203125,
0.01041412353515625,
-0.0176544189453125,
0.0399169921875,
-0.0235137939453125,
-0.043853759765625,
-0.0213623046875,
0.0277557373046875,
-0.007232666015625,
-0.019439697265625,
0.037872314453125,
-0.0179443359375,
0.022125244140625,
-0.02935791015625,
-0.0248870849609375,
-0.0197906494140625,
0.0028896331787109375,
-0.039093017578125,
0.0760498046875,
0.0288238525390625,
-0.06463623046875,
0.0023250579833984375,
-0.05517578125,
-0.02325439453125,
0.01471710205078125,
-0.007434844970703125,
-0.04388427734375,
-0.00894927978515625,
0.0275115966796875,
0.0254669189453125,
-0.019683837890625,
0.013671875,
-0.0308074951171875,
-0.0288848876953125,
0.0098876953125,
-0.018035888671875,
0.08984375,
0.0278778076171875,
-0.05419921875,
0.0016832351684570312,
-0.039093017578125,
-0.0002617835998535156,
0.031219482421875,
-0.005664825439453125,
0.0166168212890625,
-0.0235137939453125,
-0.0026874542236328125,
0.00860595703125,
0.01507568359375,
-0.03704833984375,
0.0257720947265625,
-0.0293121337890625,
0.05908203125,
0.04510498046875,
0.0000928044319152832,
0.031494140625,
-0.04046630859375,
0.042724609375,
0.00754547119140625,
0.043792724609375,
0.006397247314453125,
-0.048797607421875,
-0.07122802734375,
-0.01532745361328125,
0.016937255859375,
0.035308837890625,
-0.060577392578125,
0.032623291015625,
0.0036773681640625,
-0.0504150390625,
-0.0418701171875,
-0.017120361328125,
0.0207672119140625,
0.03887939453125,
0.02947998046875,
-0.025421142578125,
-0.0298309326171875,
-0.062103271484375,
-0.0011625289916992188,
-0.034393310546875,
0.00556182861328125,
0.03216552734375,
0.043365478515625,
-0.03143310546875,
0.05255126953125,
-0.038726806640625,
-0.025115966796875,
-0.0024089813232421875,
0.004673004150390625,
0.024993896484375,
0.049957275390625,
0.05645751953125,
-0.04718017578125,
-0.040863037109375,
-0.0036678314208984375,
-0.051605224609375,
-0.006526947021484375,
-0.00241851806640625,
-0.03192138671875,
0.01438140869140625,
0.0176544189453125,
-0.08160400390625,
0.03997802734375,
0.035369873046875,
-0.04376220703125,
0.042388916015625,
-0.014984130859375,
0.017181396484375,
-0.07904052734375,
0.01209259033203125,
0.004856109619140625,
-0.0168914794921875,
-0.04022216796875,
0.005794525146484375,
-0.006610870361328125,
-0.007686614990234375,
-0.032257080078125,
0.05047607421875,
-0.03106689453125,
0.0060577392578125,
-0.00319671630859375,
-0.006603240966796875,
0.0198516845703125,
0.03436279296875,
-0.012237548828125,
0.06231689453125,
0.04681396484375,
-0.037994384765625,
0.035125732421875,
0.036376953125,
-0.0024013519287109375,
0.0214691162109375,
-0.075439453125,
0.00939178466796875,
0.0002887248992919922,
0.0221099853515625,
-0.0806884765625,
-0.0224609375,
0.0523681640625,
-0.059326171875,
0.0273590087890625,
-0.0196533203125,
-0.01824951171875,
-0.041168212890625,
-0.01812744140625,
0.0172576904296875,
0.05706787109375,
-0.036041259765625,
0.049468994140625,
0.042022705078125,
-0.002254486083984375,
-0.049407958984375,
-0.054718017578125,
-0.0034542083740234375,
-0.0211181640625,
-0.047760009765625,
0.041839599609375,
-0.01393890380859375,
-0.007747650146484375,
0.0018224716186523438,
0.01305389404296875,
-0.005340576171875,
0.0007328987121582031,
0.0276641845703125,
0.0247955322265625,
-0.0083160400390625,
-0.01715087890625,
0.009796142578125,
-0.01129150390625,
0.00737762451171875,
-0.0237274169921875,
0.039459228515625,
-0.047607421875,
-0.0085906982421875,
-0.050628662109375,
0.0167694091796875,
0.048492431640625,
-0.0243988037109375,
0.04241943359375,
0.07073974609375,
-0.02490234375,
0.0043182373046875,
-0.039520263671875,
-0.0147247314453125,
-0.044158935546875,
0.00872802734375,
-0.0177154541015625,
-0.053619384765625,
0.04351806640625,
0.0245513916015625,
0.0092315673828125,
0.0655517578125,
0.035888671875,
-0.0154571533203125,
0.07574462890625,
0.04412841796875,
-0.00913238525390625,
0.0276641845703125,
-0.057281494140625,
0.003299713134765625,
-0.058837890625,
-0.021331787109375,
-0.032684326171875,
-0.013580322265625,
-0.056060791015625,
-0.0263214111328125,
0.03460693359375,
0.0027256011962890625,
-0.06121826171875,
0.0230560302734375,
-0.064208984375,
0.01419830322265625,
0.048370361328125,
0.004215240478515625,
0.011322021484375,
-0.004852294921875,
-0.0129852294921875,
0.01093292236328125,
-0.060455322265625,
-0.0181121826171875,
0.05865478515625,
0.026123046875,
0.036041259765625,
0.00890350341796875,
0.052734375,
0.0090179443359375,
0.025238037109375,
-0.036376953125,
0.032135009765625,
-0.01012420654296875,
-0.047332763671875,
-0.0181884765625,
-0.035064697265625,
-0.0655517578125,
0.004486083984375,
-0.008758544921875,
-0.0406494140625,
0.034210205078125,
0.0029754638671875,
-0.0518798828125,
0.028472900390625,
-0.04412841796875,
0.058502197265625,
-0.01471710205078125,
-0.031707763671875,
0.01397705078125,
-0.051544189453125,
0.03253173828125,
0.01349639892578125,
0.017242431640625,
-0.01097869873046875,
-0.001399993896484375,
0.055877685546875,
-0.055267333984375,
0.0615234375,
-0.01486968994140625,
0.005680084228515625,
0.0479736328125,
-0.001735687255859375,
0.043426513671875,
0.0238800048828125,
-0.0031681060791015625,
0.0242462158203125,
0.0142364501953125,
-0.0214691162109375,
-0.0293731689453125,
0.052215576171875,
-0.08148193359375,
-0.04718017578125,
-0.039459228515625,
-0.0295867919921875,
0.0100860595703125,
0.01837158203125,
0.037628173828125,
0.0288238525390625,
0.01244354248046875,
0.0038433074951171875,
0.020751953125,
-0.0197906494140625,
0.06292724609375,
0.0224609375,
-0.01264190673828125,
-0.0517578125,
0.06292724609375,
0.0025348663330078125,
0.006603240966796875,
0.00737762451171875,
0.0157928466796875,
-0.046142578125,
-0.0204315185546875,
-0.05084228515625,
0.03582763671875,
-0.038177490234375,
-0.0307464599609375,
-0.05047607421875,
-0.032501220703125,
-0.039306640625,
-0.00310516357421875,
-0.042724609375,
-0.032989501953125,
-0.042877197265625,
0.0132293701171875,
0.045196533203125,
0.049591064453125,
-0.020233154296875,
0.034149169921875,
-0.060302734375,
0.0176544189453125,
0.036163330078125,
0.018463134765625,
0.0040283203125,
-0.055572509765625,
-0.004364013671875,
0.022491455078125,
-0.042724609375,
-0.049285888671875,
0.06329345703125,
0.0073394775390625,
0.04376220703125,
0.0147857666015625,
0.01415252685546875,
0.060028076171875,
-0.016204833984375,
0.068115234375,
0.024139404296875,
-0.08612060546875,
0.031707763671875,
-0.04583740234375,
0.0273284912109375,
0.0279541015625,
0.03692626953125,
-0.030242919921875,
-0.028289794921875,
-0.0643310546875,
-0.046661376953125,
0.04180908203125,
0.038543701171875,
0.00521087646484375,
-0.002529144287109375,
0.04412841796875,
-0.020538330078125,
0.0123748779296875,
-0.06805419921875,
-0.037994384765625,
-0.050140380859375,
-0.01032257080078125,
0.006927490234375,
0.00682830810546875,
-0.00957489013671875,
-0.042938232421875,
0.07086181640625,
-0.01488494873046875,
0.059539794921875,
0.03582763671875,
0.01617431640625,
-0.01629638671875,
0.0002694129943847656,
0.0347900390625,
0.045257568359375,
-0.0075531005859375,
-0.00829315185546875,
0.0005049705505371094,
-0.041473388671875,
0.0071563720703125,
0.033660888671875,
-0.0228424072265625,
-0.00676727294921875,
-0.0013027191162109375,
0.061614990234375,
-0.01097869873046875,
-0.01177215576171875,
0.033050537109375,
-0.0202789306640625,
-0.0298309326171875,
-0.026458740234375,
0.0167694091796875,
0.0214691162109375,
0.04669189453125,
0.033294677734375,
-0.0184326171875,
0.01284027099609375,
-0.035552978515625,
0.011962890625,
0.041595458984375,
-0.02679443359375,
-0.0177764892578125,
0.09197998046875,
0.00501251220703125,
-0.0200347900390625,
0.06024169921875,
-0.0198516845703125,
-0.046295166015625,
0.07525634765625,
0.04290771484375,
0.068115234375,
-0.00827789306640625,
0.02728271484375,
0.043914794921875,
0.025177001953125,
0.002140045166015625,
0.021240234375,
0.0155792236328125,
-0.041534423828125,
-0.014556884765625,
-0.04425048828125,
-0.0272216796875,
0.0160980224609375,
-0.03582763671875,
0.021148681640625,
-0.0501708984375,
-0.032958984375,
-0.00005841255187988281,
0.0178985595703125,
-0.0433349609375,
0.0225982666015625,
0.01285552978515625,
0.04925537109375,
-0.04058837890625,
0.054718017578125,
0.0428466796875,
-0.05047607421875,
-0.08319091796875,
-0.01346588134765625,
-0.0018434524536132812,
-0.05462646484375,
0.0307464599609375,
0.00641632080078125,
-0.0004589557647705078,
0.0301055908203125,
-0.056060791015625,
-0.07000732421875,
0.10205078125,
0.01316070556640625,
-0.044952392578125,
-0.003910064697265625,
0.01029205322265625,
0.0243988037109375,
-0.0289306640625,
0.053070068359375,
0.0364990234375,
0.036285400390625,
0.0126190185546875,
-0.07293701171875,
0.0237884521484375,
-0.0369873046875,
-0.0084075927734375,
0.009674072265625,
-0.09942626953125,
0.0784912109375,
-0.0164337158203125,
-0.01093292236328125,
0.00689697265625,
0.05560302734375,
0.024444580078125,
0.01238250732421875,
0.035888671875,
0.046112060546875,
0.0587158203125,
-0.0177001953125,
0.077392578125,
-0.037872314453125,
0.05267333984375,
0.051422119140625,
0.003047943115234375,
0.04168701171875,
0.01090240478515625,
-0.034942626953125,
0.036163330078125,
0.056396484375,
-0.023406982421875,
0.02606201171875,
-0.0007877349853515625,
-0.0196990966796875,
-0.0135498046875,
-0.00669097900390625,
-0.057647705078125,
0.0255889892578125,
0.0258026123046875,
-0.0190887451171875,
0.00218963623046875,
-0.0215911865234375,
-0.002796173095703125,
-0.0361328125,
-0.01629638671875,
0.03302001953125,
0.01641845703125,
-0.037261962890625,
0.08172607421875,
-0.0011577606201171875,
0.05340576171875,
-0.042022705078125,
-0.003398895263671875,
-0.039031982421875,
0.00620269775390625,
-0.01861572265625,
-0.0496826171875,
0.0202789306640625,
-0.0165252685546875,
-0.0036106109619140625,
0.006458282470703125,
0.046844482421875,
-0.0194549560546875,
-0.0419921875,
0.01271820068359375,
0.024658203125,
0.01264190673828125,
-0.00115203857421875,
-0.07818603515625,
0.0207672119140625,
0.00701141357421875,
-0.025146484375,
0.01678466796875,
0.0193634033203125,
0.02203369140625,
0.058990478515625,
0.0479736328125,
-0.0169830322265625,
0.0117950439453125,
-0.0163116455078125,
0.074462890625,
-0.056640625,
-0.030120849609375,
-0.06329345703125,
0.046905517578125,
0.000942230224609375,
-0.026641845703125,
0.052215576171875,
0.04205322265625,
0.0538330078125,
-0.00319671630859375,
0.06402587890625,
-0.024200439453125,
0.0132293701171875,
-0.0196990966796875,
0.08197021484375,
-0.058868408203125,
0.0157470703125,
-0.0229949951171875,
-0.04840087890625,
-0.005245208740234375,
0.059478759765625,
0.0015020370483398438,
0.01910400390625,
0.036865234375,
0.064697265625,
-0.0068359375,
0.008636474609375,
0.0114593505859375,
0.032073974609375,
0.029510498046875,
0.0648193359375,
0.060638427734375,
-0.07183837890625,
0.04925537109375,
-0.050994873046875,
-0.012451171875,
0.002796173095703125,
-0.0655517578125,
-0.061004638671875,
-0.035369873046875,
-0.035888671875,
-0.039031982421875,
-0.00013267993927001953,
0.06292724609375,
0.059967041015625,
-0.041351318359375,
-0.0217132568359375,
-0.015228271484375,
0.0004372596740722656,
-0.005702972412109375,
-0.02203369140625,
0.0307464599609375,
-0.000988006591796875,
-0.067626953125,
0.01715087890625,
0.00428009033203125,
0.036041259765625,
-0.008148193359375,
-0.01090240478515625,
-0.0243072509765625,
0.0010862350463867188,
0.026397705078125,
0.042755126953125,
-0.048095703125,
-0.0122528076171875,
0.00537872314453125,
-0.00466156005859375,
0.0185089111328125,
0.02056884765625,
-0.06060791015625,
0.007480621337890625,
0.0299530029296875,
0.0287322998046875,
0.05419921875,
0.004974365234375,
0.0303955078125,
-0.0178985595703125,
0.01010894775390625,
0.005603790283203125,
0.0279541015625,
0.0116424560546875,
-0.04840087890625,
0.044647216796875,
0.031585693359375,
-0.058135986328125,
-0.055328369140625,
-0.0135498046875,
-0.080810546875,
-0.010711669921875,
0.07940673828125,
-0.00934600830078125,
-0.0286407470703125,
0.002651214599609375,
-0.01116180419921875,
0.041839599609375,
-0.03802490234375,
0.047943115234375,
0.0275726318359375,
-0.015350341796875,
-0.02691650390625,
-0.04266357421875,
0.0445556640625,
0.0208282470703125,
-0.06842041015625,
-0.009857177734375,
0.03118896484375,
0.032318115234375,
0.006618499755859375,
0.058349609375,
-0.00860595703125,
0.026458740234375,
0.00827789306640625,
0.0022335052490234375,
-0.0223236083984375,
-0.0038814544677734375,
-0.0175018310546875,
-0.0009450912475585938,
-0.0265655517578125,
-0.01702880859375
]
] |
chavinlo/alpaca-native | 2023-03-31T13:09:41.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | chavinlo | null | null | chavinlo/alpaca-native | 250 | 13,679 | transformers | 2023-03-16T02:37:26 | # Stanford Alpaca
This is a replica of Alpaca by Stanford's tatsu.
It was trained using the original instructions, with a minor modification to run in FSDP mode.
# Other versions:
13B: https://huggingface.co/chavinlo/alpaca-13b
13B -> GPT4 : https://huggingface.co/chavinlo/gpt4-x-alpaca
## Compute Used
Trained on 4x A100 GPUs for 6 hours.
Donated by redmond.ai
NO LORA HAS BEEN USED; this is a natively fine-tuned model, hence "alpaca-native"
If you are interested in more llama-based models, you can check out my profile or search for other models at https://huggingface.co/models?other=llama
The following (MIGHT) be a quantized version of this model, but be careful: https://boards.4channel.org/g/thread/92173062#p92182396
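As a quick sanity check on the configuration below (an illustrative calculation, not something stated in the card), the effective global batch size is the per-device batch multiplied by the gradient-accumulation steps and the number of GPUs:

```python
# Values taken from the torchrun command below
num_gpus = 4                       # --nproc_per_node=4
per_device_train_batch_size = 4    # --per_device_train_batch_size 4
gradient_accumulation_steps = 8    # --gradient_accumulation_steps 8

global_batch_size = num_gpus * per_device_train_batch_size * gradient_accumulation_steps
print(global_batch_size)  # 128, matching the original Alpaca recipe's effective batch size
```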
CONFIGURATION (defaults except FSDP):
```shell
torchrun --nproc_per_node=4 --master_port=3045 train.py \
--model_name_or_path /workspace/llama-7b-hf \
--data_path ./alpaca_data.json \
--bf16 True \
--output_dir /workspace/output \
--num_train_epochs 3 \
--per_device_train_batch_size 4 \
--per_device_eval_batch_size 4 \
--gradient_accumulation_steps 8 \
--evaluation_strategy "no" \
--save_strategy "steps" \
--save_steps 200 \
--save_total_limit 1 \
--learning_rate 2e-5 \
--weight_decay 0. \
--warmup_ratio 0.03 \
--lr_scheduler_type "cosine" \
--logging_steps 1 \
--fsdp "shard_grad_op auto_wrap" \
--fsdp_transformer_layer_cls_to_wrap 'LLaMADecoderLayer' \
--tf32 True --report_to="wandb"
``` | 1,475 | [
[
-0.03173828125,
-0.047119140625,
0.033843994140625,
0.0391845703125,
-0.02923583984375,
0.003482818603515625,
0.007602691650390625,
-0.038360595703125,
0.04864501953125,
0.03863525390625,
-0.06365966796875,
-0.029571533203125,
-0.03778076171875,
-0.004161834716796875,
-0.0249481201171875,
0.1004638671875,
-0.0253143310546875,
-0.004150390625,
0.0276336669921875,
-0.014373779296875,
-0.016357421875,
-0.0200347900390625,
-0.07354736328125,
-0.041748046875,
0.0166168212890625,
0.00897216796875,
0.05877685546875,
0.053802490234375,
0.02880859375,
0.02197265625,
-0.0014181137084960938,
-0.01151275634765625,
-0.023101806640625,
-0.0413818359375,
0.01305389404296875,
-0.024566650390625,
-0.0576171875,
0.02166748046875,
0.0517578125,
0.0246429443359375,
-0.0229949951171875,
0.021575927734375,
-0.0216064453125,
0.011962890625,
-0.035675048828125,
0.022857666015625,
-0.0313720703125,
0.0185699462890625,
-0.0029888153076171875,
-0.00022935867309570312,
-0.003276824951171875,
-0.025665283203125,
0.01360321044921875,
-0.07080078125,
0.04217529296875,
-0.0011892318725585938,
0.07818603515625,
0.0255126953125,
-0.034393310546875,
-0.0323486328125,
-0.042633056640625,
0.055084228515625,
-0.0535888671875,
0.0160064697265625,
0.043731689453125,
0.02740478515625,
-0.01129913330078125,
-0.059844970703125,
-0.039886474609375,
-0.0238494873046875,
0.01103973388671875,
0.00496673583984375,
0.004016876220703125,
0.01152801513671875,
0.03887939453125,
0.0439453125,
-0.038482666015625,
0.01042938232421875,
-0.041595458984375,
-0.0177154541015625,
0.045654296875,
0.0132904052734375,
0.00138092041015625,
-0.0013952255249023438,
-0.053802490234375,
-0.02935791015625,
-0.032867431640625,
0.0110931396484375,
0.01605224609375,
0.0017032623291015625,
-0.03594970703125,
0.03515625,
-0.0294647216796875,
0.03582763671875,
-0.01085662841796875,
-0.0256805419921875,
0.045562744140625,
-0.005092620849609375,
-0.03387451171875,
0.0137939453125,
0.060821533203125,
0.019439697265625,
-0.002780914306640625,
-0.004459381103515625,
-0.02337646484375,
0.0012998580932617188,
0.014373779296875,
-0.068603515625,
-0.023834228515625,
0.012939453125,
-0.0299530029296875,
-0.02447509765625,
0.0017480850219726562,
-0.0279388427734375,
0.00467681884765625,
-0.0017538070678710938,
0.04046630859375,
-0.045196533203125,
0.001956939697265625,
0.01526641845703125,
-0.0009207725524902344,
0.0361328125,
0.020172119140625,
-0.07196044921875,
0.040679931640625,
0.04437255859375,
0.06890869140625,
0.0154571533203125,
-0.05255126953125,
-0.0030117034912109375,
0.026397705078125,
-0.0142059326171875,
0.0284576416015625,
0.0018968582153320312,
-0.047271728515625,
-0.0301055908203125,
0.0232086181640625,
0.00986480712890625,
-0.019622802734375,
0.041900634765625,
-0.029083251953125,
0.0174560546875,
-0.0269012451171875,
-0.03326416015625,
-0.0277252197265625,
0.01444244384765625,
-0.06585693359375,
0.0885009765625,
0.0306243896484375,
-0.0430908203125,
0.029205322265625,
-0.06085205078125,
-0.017303466796875,
-0.00725555419921875,
-0.0024585723876953125,
-0.034423828125,
-0.01229095458984375,
0.00916290283203125,
0.033538818359375,
-0.0191192626953125,
0.00681304931640625,
-0.0192413330078125,
-0.05657958984375,
0.0033664703369140625,
-0.02325439453125,
0.08135986328125,
0.008331298828125,
-0.036376953125,
0.0017309188842773438,
-0.0743408203125,
-0.005008697509765625,
0.03509521484375,
-0.022369384765625,
-0.0108489990234375,
-0.035247802734375,
-0.0236053466796875,
0.0059356689453125,
0.038238525390625,
-0.048065185546875,
0.037139892578125,
-0.005374908447265625,
0.04351806640625,
0.05535888671875,
0.0055999755859375,
0.017303466796875,
-0.0362548828125,
0.033782958984375,
-0.017669677734375,
0.044647216796875,
-0.0005788803100585938,
-0.05523681640625,
-0.0657958984375,
-0.03558349609375,
-0.017974853515625,
0.027008056640625,
-0.04864501953125,
0.0203704833984375,
-0.00049591064453125,
-0.056976318359375,
-0.047119140625,
-0.003757476806640625,
0.02044677734375,
0.04364013671875,
0.0447998046875,
-0.01837158203125,
-0.0279693603515625,
-0.06463623046875,
0.027679443359375,
-0.002559661865234375,
-0.0013561248779296875,
0.008697509765625,
0.055511474609375,
-0.0286712646484375,
0.0513916015625,
-0.042999267578125,
-0.022247314453125,
0.020111083984375,
0.017822265625,
0.029388427734375,
0.047027587890625,
0.0570068359375,
-0.02008056640625,
0.00408935546875,
-0.0190582275390625,
-0.05670166015625,
0.0047607421875,
-0.00878143310546875,
-0.0156402587890625,
0.0035839080810546875,
0.00414276123046875,
-0.05670166015625,
0.0276947021484375,
0.0400390625,
-0.01934814453125,
0.037628173828125,
-0.0286407470703125,
0.0167388916015625,
-0.0758056640625,
0.00701904296875,
-0.005519866943359375,
0.0010662078857421875,
-0.001895904541015625,
0.01134490966796875,
0.0058441162109375,
0.0011005401611328125,
-0.036773681640625,
0.040313720703125,
-0.0229949951171875,
-0.01505279541015625,
-0.0201568603515625,
-0.02001953125,
0.00811767578125,
0.051025390625,
0.00887298583984375,
0.056549072265625,
0.01229095458984375,
-0.033447265625,
0.046630859375,
0.024658203125,
-0.0243377685546875,
0.014678955078125,
-0.07281494140625,
0.0137176513671875,
0.0184173583984375,
0.050201416015625,
-0.0290985107421875,
-0.01763916015625,
0.03326416015625,
-0.01311492919921875,
0.0010595321655273438,
0.004993438720703125,
-0.0238494873046875,
-0.0419921875,
-0.038726806640625,
0.04132080078125,
0.044464111328125,
-0.05706787109375,
0.019866943359375,
0.0006628036499023438,
0.0224151611328125,
-0.052032470703125,
-0.06011962890625,
-0.01611328125,
-0.01107025146484375,
-0.03338623046875,
0.05145263671875,
-0.0019989013671875,
0.007221221923828125,
-0.006496429443359375,
0.0012340545654296875,
-0.011474609375,
0.0100250244140625,
0.036224365234375,
0.0301055908203125,
-0.014739990234375,
-0.0223236083984375,
0.015777587890625,
-0.0091094970703125,
0.0190582275390625,
0.00969696044921875,
0.04876708984375,
-0.0238494873046875,
-0.02239990234375,
-0.0631103515625,
0.0024929046630859375,
0.0201873779296875,
0.016937255859375,
0.051361083984375,
0.05401611328125,
-0.039825439453125,
-0.0192413330078125,
-0.02984619140625,
-0.0232696533203125,
-0.038818359375,
0.01247406005859375,
-0.033721923828125,
-0.038116455078125,
0.04241943359375,
0.0160369873046875,
0.0038127899169921875,
0.04486083984375,
0.028778076171875,
0.0007519721984863281,
0.0567626953125,
0.04522705078125,
-0.018524169921875,
0.044647216796875,
-0.06488037109375,
-0.024322509765625,
-0.0615234375,
-0.03228759765625,
-0.03076171875,
-0.0166168212890625,
-0.033294677734375,
-0.0333251953125,
0.006839752197265625,
0.004913330078125,
-0.06378173828125,
0.03668212890625,
-0.045379638671875,
0.02874755859375,
0.05364990234375,
0.0181121826171875,
0.02508544921875,
-0.010589599609375,
0.0234527587890625,
0.0034770965576171875,
-0.0352783203125,
-0.039306640625,
0.078369140625,
0.03924560546875,
0.059844970703125,
-0.0138702392578125,
0.0635986328125,
0.020477294921875,
0.023101806640625,
-0.04083251953125,
0.041290283203125,
0.004581451416015625,
-0.0307159423828125,
-0.0034542083740234375,
-0.0280303955078125,
-0.06005859375,
0.00952911376953125,
-0.01532745361328125,
-0.04937744140625,
-0.0203399658203125,
0.007244110107421875,
-0.0160980224609375,
0.02301025390625,
-0.036773681640625,
0.0509033203125,
-0.0210418701171875,
0.01094818115234375,
-0.007904052734375,
-0.04046630859375,
0.053802490234375,
-0.005420684814453125,
-0.006877899169921875,
-0.012603759765625,
-0.005550384521484375,
0.07965087890625,
-0.06732177734375,
0.06109619140625,
-0.01751708984375,
-0.0208587646484375,
0.0296783447265625,
-0.01107025146484375,
0.03826904296875,
0.0021991729736328125,
-0.0111846923828125,
0.0380859375,
0.00904083251953125,
-0.047271728515625,
-0.026123046875,
0.05975341796875,
-0.100341796875,
-0.0254058837890625,
-0.038818359375,
-0.0391845703125,
0.00011515617370605469,
0.0084686279296875,
0.038055419921875,
0.00667572021484375,
-0.0168304443359375,
-0.006946563720703125,
0.0210113525390625,
-0.003963470458984375,
0.036468505859375,
0.024566650390625,
-0.01149749755859375,
-0.0474853515625,
0.0673828125,
0.00641632080078125,
0.01506805419921875,
0.00803375244140625,
0.003910064697265625,
-0.01340484619140625,
-0.020782470703125,
-0.0438232421875,
0.04278564453125,
-0.0391845703125,
-0.0076751708984375,
-0.0292205810546875,
-0.0264129638671875,
-0.03070068359375,
-0.035491943359375,
-0.034210205078125,
-0.050384521484375,
-0.038360595703125,
-0.0112152099609375,
0.048553466796875,
0.057098388671875,
0.006610870361328125,
0.07281494140625,
-0.044677734375,
0.0125732421875,
0.01776123046875,
0.0126495361328125,
0.001987457275390625,
-0.04742431640625,
-0.01345062255859375,
0.019805908203125,
-0.043365478515625,
-0.058990478515625,
0.026275634765625,
-0.003322601318359375,
0.043853759765625,
0.03741455078125,
0.00037932395935058594,
0.050445556640625,
-0.017974853515625,
0.07330322265625,
0.022247314453125,
-0.07000732421875,
0.04791259765625,
-0.01678466796875,
-0.00820159912109375,
0.020904541015625,
0.045501708984375,
-0.0074310302734375,
-0.0090789794921875,
-0.07183837890625,
-0.051971435546875,
0.061737060546875,
-0.00013375282287597656,
-0.0004169940948486328,
0.0013952255249023438,
0.041748046875,
0.022857666015625,
0.0091705322265625,
-0.0711669921875,
-0.011322021484375,
-0.047698974609375,
-0.0048065185546875,
-0.002674102783203125,
0.0040435791015625,
-0.020416259765625,
-0.01465606689453125,
0.0814208984375,
-0.01259613037109375,
0.02374267578125,
-0.0013885498046875,
0.00910186767578125,
0.00798797607421875,
-0.033447265625,
0.044158935546875,
0.017547607421875,
-0.040069580078125,
-0.017913818359375,
0.028076171875,
-0.05517578125,
0.0173492431640625,
0.00592803955078125,
-0.0084686279296875,
-0.0099334716796875,
0.03533935546875,
0.0740966796875,
0.01129913330078125,
-0.0216827392578125,
0.0369873046875,
-0.00250244140625,
-0.007038116455078125,
-0.034454345703125,
0.0201873779296875,
-0.00925445556640625,
0.02752685546875,
0.01381683349609375,
-0.0220184326171875,
-0.002033233642578125,
-0.0272216796875,
-0.0125732421875,
0.02874755859375,
0.005062103271484375,
-0.031036376953125,
0.0592041015625,
0.01275634765625,
-0.030181884765625,
0.059783935546875,
-0.0159759521484375,
-0.02105712890625,
0.0736083984375,
0.056671142578125,
0.04248046875,
-0.0206298828125,
0.01983642578125,
0.0316162109375,
0.0247344970703125,
-0.0182037353515625,
0.0269927978515625,
-0.02630615234375,
-0.03961181640625,
-0.01065826416015625,
-0.04425048828125,
-0.050384521484375,
0.0179901123046875,
-0.0645751953125,
0.035400390625,
-0.0692138671875,
-0.0146026611328125,
-0.0113067626953125,
0.0278778076171875,
-0.04864501953125,
0.03131103515625,
-0.0002543926239013672,
0.08526611328125,
-0.0748291015625,
0.09503173828125,
0.028045654296875,
-0.0491943359375,
-0.062744140625,
-0.026031494140625,
-0.022186279296875,
-0.10552978515625,
0.02239990234375,
0.01201629638671875,
-0.0048675537109375,
-0.01038360595703125,
-0.048980712890625,
-0.07159423828125,
0.129150390625,
0.03436279296875,
-0.01432037353515625,
-0.0060882568359375,
0.00983428955078125,
0.029205322265625,
-0.01348114013671875,
0.0225372314453125,
0.046844482421875,
0.00867462158203125,
0.0168609619140625,
-0.060882568359375,
0.017791748046875,
-0.004058837890625,
0.00751495361328125,
0.007373809814453125,
-0.09033203125,
0.09197998046875,
-0.0281982421875,
0.022735595703125,
0.0287933349609375,
0.06231689453125,
0.055999755859375,
0.03070068359375,
0.023101806640625,
0.05523681640625,
0.06317138671875,
-0.0018205642700195312,
0.0765380859375,
-0.016326904296875,
0.0391845703125,
0.07135009765625,
-0.015350341796875,
0.045440673828125,
0.037078857421875,
-0.02581787109375,
0.044036865234375,
0.0635986328125,
-0.018890380859375,
0.038665771484375,
0.0076141357421875,
-0.0220794677734375,
0.01428985595703125,
0.00804901123046875,
-0.05181884765625,
0.025054931640625,
0.030303955078125,
-0.0367431640625,
-0.0024261474609375,
-0.0047454833984375,
0.0176544189453125,
-0.04010009765625,
-0.0191192626953125,
0.0251617431640625,
0.0005116462707519531,
-0.0333251953125,
0.052886962890625,
0.00901031494140625,
0.0657958984375,
-0.061737060546875,
-0.0015821456909179688,
-0.017791748046875,
0.0016841888427734375,
-0.0182647705078125,
-0.04840087890625,
0.01155853271484375,
-0.01074981689453125,
-0.00875091552734375,
-0.0006465911865234375,
0.023406982421875,
-0.00006574392318725586,
-0.0264892578125,
0.021759033203125,
0.004192352294921875,
0.038238525390625,
0.0130157470703125,
-0.054656982421875,
0.0269317626953125,
0.004840850830078125,
-0.039947509765625,
0.026641845703125,
0.01505279541015625,
-0.0181427001953125,
0.05780029296875,
0.04974365234375,
-0.004138946533203125,
0.0141143798828125,
-0.013671875,
0.056915283203125,
-0.0584716796875,
-0.038818359375,
-0.0406494140625,
-0.005329132080078125,
0.004451751708984375,
-0.0511474609375,
0.0283660888671875,
0.04302978515625,
0.06280517578125,
-0.003509521484375,
0.036163330078125,
0.00756072998046875,
-0.0001628398895263672,
-0.045806884765625,
0.04107666015625,
-0.036407470703125,
0.0045013427734375,
-0.0205841064453125,
-0.071044921875,
0.016387939453125,
0.078857421875,
0.0012187957763671875,
0.018798828125,
0.035247802734375,
0.041961669921875,
-0.0126495361328125,
0.038421630859375,
-0.007099151611328125,
0.01471710205078125,
0.032928466796875,
0.0579833984375,
0.043670654296875,
-0.050323486328125,
0.0238189697265625,
-0.06695556640625,
-0.032470703125,
-0.002696990966796875,
-0.070556640625,
-0.0343017578125,
-0.042510986328125,
-0.035400390625,
-0.03271484375,
-0.0131683349609375,
0.0704345703125,
0.0531005859375,
-0.03594970703125,
-0.0222320556640625,
0.0062103271484375,
-0.01433563232421875,
-0.0201416015625,
-0.01375579833984375,
0.0413818359375,
0.00852203369140625,
-0.0601806640625,
0.0328369140625,
-0.01934814453125,
0.044036865234375,
-0.0224761962890625,
-0.01483917236328125,
0.0007762908935546875,
-0.01218414306640625,
0.032684326171875,
0.036163330078125,
-0.046295166015625,
-0.0250396728515625,
-0.0210113525390625,
-0.0175323486328125,
-0.0154571533203125,
0.029571533203125,
-0.0712890625,
-0.00988006591796875,
0.0001538991928100586,
0.0274658203125,
0.07720947265625,
-0.0164794921875,
0.01053619384765625,
-0.045928955078125,
0.03375244140625,
0.00966644287109375,
0.052032470703125,
0.0169525146484375,
-0.022369384765625,
0.06072998046875,
0.02783203125,
-0.058624267578125,
-0.05303955078125,
-0.01114654541015625,
-0.10675048828125,
-0.0050048828125,
0.07965087890625,
-0.0153656005859375,
-0.03656005859375,
0.03167724609375,
-0.0289764404296875,
0.0306549072265625,
-0.03466796875,
0.06842041015625,
0.035980224609375,
-0.02587890625,
0.0210113525390625,
-0.03472900390625,
0.0203399658203125,
0.0107879638671875,
-0.055755615234375,
-0.032623291015625,
0.0281524658203125,
0.04498291015625,
0.01094818115234375,
0.06304931640625,
0.006595611572265625,
0.0158538818359375,
0.01064300537109375,
0.0267181396484375,
-0.0164642333984375,
-0.00885009765625,
-0.01552581787109375,
-0.01152801513671875,
-0.01177215576171875,
-0.04962158203125
]
] |
KoboldAI/LLaMA2-13B-Holomax | 2023-08-17T14:18:33.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"license:other",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | KoboldAI | null | null | KoboldAI/LLaMA2-13B-Holomax | 12 | 13,645 | transformers | 2023-08-14T14:26:32 | ---
license: other
---
# LLaMA 2 Holomax 13B - The writer's version of Mythomax
This is an expansion merge of the well-praised Mythomax model from Gryphe (60%) with MrSeeker's KoboldAI Holodeck model (40%)
The goal of this model is to enhance story writing capabilities while preserving the desirable traits of the Mythomax model as much as possible (it does limit chat reply length).
Testers found that this model passes the InteracTV benchmark and is useful for story writing, chatting, and text adventures using instruction mode.
Preservation of factual knowledge has not been tested, since we expect the original to be better in those use cases: this merge was focused on fiction.
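As a rough illustration of what a 60/40 weight merge means (a simplified sketch, not the exact procedure; the full merging script is reproduced under the License section), each parameter tensor is linearly interpolated between the two parents:

```python
def blend(a, b, blend_ratio=0.4):
    # blend_ratio of 0 gives the first model and 1 gives the second,
    # matching the convention used in the merging script in this card.
    return [(1.0 - blend_ratio) * x + blend_ratio * y for x, y in zip(a, b)]

# Toy stand-ins for one flattened weight tensor from each parent model
w_mythomax = [1.0, 1.0, 1.0]
w_holodeck = [0.0, 0.0, 0.0]

print(blend(w_mythomax, w_holodeck))  # [0.6, 0.6, 0.6]: 60% Mythomax, 40% Holodeck
```

In the real script the same interpolation is applied tensor by tensor across the full model state dicts, optionally in fp16.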
## Credits
This merge would not be possible without the following models and model authors (thanks to all of you for your work!):
- Mythomax by Gryphe:
  - Mythologic-L2 by Gryphe:
    - Hermes by Nous-Research
    - Chronos V2 by Elinas
    - Airoboros m2.0 by Jondurbin
  - Huginn by Face of Goonery:
    - Hermes by Nous-Research
    - StableBeluga by StabilityAI
    - Airoboros by Jondurbin
    - Chronos by Elinas
    - Limarp by Lemonila
- Holodeck by Mr.Seeker
## Guidelines
This model is designed to be flexible: it can be used as a co-writing model, with a variety of instruct formats (tested with Alpaca), and for regular chatting, both with traditional formatting and with instruct formatting.
The Alpaca format is as follows:
```
### Instruction:
Instruction goes here
### Response:
```
But if you have a different preferred format that works on one of the models above, it will likely still work here.
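As a minimal sketch (the helper name and the example instruction are illustrative, not part of this model card), wrapping a raw instruction into the Alpaca skeleton above looks like:

```python
def build_alpaca_prompt(instruction: str) -> str:
    # Wrap a raw instruction in the Alpaca skeleton shown above.
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

prompt = build_alpaca_prompt("Write a short scene set aboard a sailing ship.")
print(prompt)
```

The model then continues the text after `### Response:`.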
## License
After publishing the model we were informed that one of the origin models upstream was uploaded under the AGPLv3. It is currently unknown what effect this has on this model, because all weights have been modified and none of the original weights are intact.
At the moment of publishing (and of writing this message) both merged models, Holodeck and Mythomax, were licensed under the Llama2 license, therefore the Llama2 license applies to this model.
However, Holodeck contains a non-commercial clause and may only be used for research or private use, while Limarp is licensed AGPLv3.
The AGPLv3 conflicts with the commercial usage restrictions of the Llama2 license, therefore we assume this aspect does not apply and that the authors intended the commercial usage restrictions to remain in place.
As a result we have decided to leave the model available for public download, on the assumption that all involved authors intend for it to be licensed with the commercial / Llama2 restrictions in place, but with the further rights and freedoms the AGPLv3 grants a user.
If HF informs us that this assumption is incorrect and requests that we take this model down, we will republish the model in the form of the original merging script that was used to create the end result.
To comply with the AGPLv3 aspect the "source" of this model is as follows (Because this model is made on a binary level, we can only provide the script that created the model):
```
import json
import os
import shutil
import subprocess
from tkinter.filedialog import askdirectory

import torch
from colorama import Fore, Style, init
from transformers import AutoModelForCausalLM

newline = '\n'


def clear_console():
    if os.name == "nt":  # For Windows
        subprocess.call("cls", shell=True)
    else:  # For Linux and macOS
        subprocess.call("clear", shell=True)


clear_console()
print(f"{Fore.YELLOW}Starting script, please wait...{Style.RESET_ALL}")

# Mixer output settings
blend_ratio = 0.4  # setting to 0 gives first model, and 1 gives second model
fp16 = False  # perform operations in fp16. Saves memory, but CPU inference will not be possible.
always_output_fp16 = True  # if true, will output fp16 even if operating in fp32
max_shard_size = "10000MiB"  # set output shard size
force_cpu = True  # only use cpu
load_sharded = True  # load both models shard by shard

print(f"Blend Ratio set to: {Fore.GREEN}{blend_ratio}{Style.RESET_ALL}")
print(f"Operations in fp16 is: {Fore.GREEN}{fp16}{Style.RESET_ALL}")
print(f"Save Result in fp16: {Fore.GREEN}{always_output_fp16}{Style.RESET_ALL}")
print(f"CPU RAM Only: {Fore.GREEN}{force_cpu}{Style.RESET_ALL}{newline}")

# Test generation settings, only for fp32
deterministic_test = True  # determines if outputs are always the same
test_prompt = ""  # test prompt for generation. only for fp32. set to empty string to skip generating.
test_max_length = 32  # test generation length

blend_ratio_b = 1.0 - blend_ratio


def get_model_info(model):
    # Build a printable summary of every parameter's name and shape.
    with torch.no_grad():
        outfo = ""
        cntent = 0
        outfo += "\n==============================\n"
        for name, para in model.named_parameters():
            cntent += 1
            outfo += ('{}: {}'.format(name, para.shape)) + "\n"
        outfo += ("Num Entries: " + str(cntent)) + "\n"
        outfo += ("==============================\n")
        return outfo


def merge_models(model1, model2):
    # Blend each parameter pair in place; model1 becomes the weighted average.
    with torch.no_grad():
        tensornum = 0
        for p1, p2 in zip(model1.parameters(), model2.parameters()):
            p1 *= blend_ratio
            p2 *= blend_ratio_b
            p1 += p2
            tensornum += 1
            print("Merging tensor " + str(tensornum))


def read_index_filenames(sourcedir):
    # Collect the unique shard filenames referenced by the index file.
    index = json.load(open(sourcedir + '/pytorch_model.bin.index.json', 'rt'))
    fl = []
    for k, v in index['weight_map'].items():
        if v not in fl:
            fl.append(v)
    return fl


print("Opening file dialog, please select FIRST model directory...")
model_path1 = "Gryphe/MythoMax-L2-13b"
print(f"First Model is: {model_path1}")
print("Opening file dialog, please select SECOND model directory...")
model_path2 = "KoboldAI/LLAMA2-13B-Holodeck-1"
print(f"Second Model is: {model_path2}")
print("Opening file dialog, please select OUTPUT model directory...")
model_path3 = askdirectory(title="Select Output Directory of merged model")
print(f"Merged Save Directory is: {model_path3}{newline}")

if not model_path1 or not model_path2:
    print("\nYou must select two directories containing models to merge and one output directory. Exiting.")
    exit()

with torch.no_grad():
    if fp16:
        torch.set_default_dtype(torch.float16)
    else:
        torch.set_default_dtype(torch.float32)

    device = torch.device("cuda") if (torch.cuda.is_available() and not force_cpu) else torch.device("cpu")
    print(device)

    print("Loading Model 1...")
    model1 = AutoModelForCausalLM.from_pretrained(model_path1)  # ,torch_dtype=torch.float16
    model1 = model1.to(device)
    model1.eval()
    print("Model 1 Loaded. Dtype: " + str(model1.dtype))

    print("Loading Model 2...")
    model2 = AutoModelForCausalLM.from_pretrained(model_path2)  # ,torch_dtype=torch.float16
    model2 = model2.to(device)
    model2.eval()
    print("Model 2 Loaded. Dtype: " + str(model2.dtype))

    # Saving for posterity reasons, handy for troubleshooting if model result is broken
    # #ensure both models have the exact same layout
    # m1_info = get_model_info(model1)
    # m2_info = get_model_info(model2)
    # if m1_info != m2_info:
    #     print("Model 1 Info: " + m1_info)
    #     print("Model 2 Info: " + m2_info)
    #     print("\nERROR:\nThe two selected models are not compatible! They must have identical structure!")
    #     exit()

    print("Merging models...")
    merge_models(model1, model2)

    if model_path3:
        print("Saving new model...")
        if always_output_fp16 and not fp16:
            model1.half()
        model1.save_pretrained(model_path3, max_shard_size=max_shard_size)
        print("\nSaved to: " + model_path3)

        print("\nCopying files to: " + model_path3)
        files_to_copy = ["tokenizer.model", "special_tokens_map.json", "tokenizer_config.json", "vocab.json", "merges.txt"]
        for filename in files_to_copy:
            src_path = os.path.join(model_path1, filename)
            dst_path = os.path.join(model_path3, filename)
            try:
                shutil.copy2(src_path, dst_path)
            except FileNotFoundError:
                print("\nFile " + filename + " not found in " + model_path1 + ". Skipping.")
    else:
        print("\nOutput model was not saved as no output path was selected.")

print("\nScript Completed.")
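# Note (editorial sketch, not part of the original script): the merge above is
# a per-parameter linear interpolation. On toy values:
#   blend_ratio = 0.4
#   a, b = [1.0, 2.0], [3.0, 4.0]
#   merged = [blend_ratio * x + (1.0 - blend_ratio) * y for x, y in zip(a, b)]
#   # merged is approximately [2.2, 3.2]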
``` | 8,557 | [
[
-0.0162200927734375,
-0.046630859375,
0.026580810546875,
0.007549285888671875,
-0.016693115234375,
-0.0019121170043945312,
0.006465911865234375,
-0.037261962890625,
-0.006153106689453125,
0.040130615234375,
-0.041595458984375,
-0.027557373046875,
-0.046783447265625,
-0.01824951171875,
-0.02178955078125,
0.08416748046875,
-0.0209503173828125,
-0.0173797607421875,
-0.0189971923828125,
-0.0010089874267578125,
-0.0361328125,
-0.0247039794921875,
-0.04364013671875,
-0.032318115234375,
0.0123748779296875,
0.01275634765625,
0.049896240234375,
0.0455322265625,
0.038482666015625,
0.033355712890625,
-0.01163482666015625,
0.0211944580078125,
-0.03814697265625,
-0.0030364990234375,
0.0093841552734375,
-0.031158447265625,
-0.0504150390625,
0.01092529296875,
0.05035400390625,
0.0204925537109375,
-0.001720428466796875,
0.023773193359375,
0.0078582763671875,
0.022125244140625,
-0.02587890625,
0.0175323486328125,
-0.0263671875,
0.00699615478515625,
0.0007605552673339844,
-0.0185089111328125,
-0.0221710205078125,
-0.037078857421875,
0.0018472671508789062,
-0.0465087890625,
0.02801513671875,
0.007274627685546875,
0.086181640625,
0.01122283935546875,
-0.038848876953125,
-0.0164031982421875,
-0.04254150390625,
0.06988525390625,
-0.08343505859375,
0.00463104248046875,
0.01316070556640625,
0.01220703125,
-0.0166015625,
-0.0692138671875,
-0.0457763671875,
-0.01059722900390625,
-0.013275146484375,
0.0088043212890625,
-0.0303955078125,
-0.0033931732177734375,
0.0206146240234375,
0.039154052734375,
-0.03717041015625,
-0.00988006591796875,
-0.07598876953125,
-0.0264892578125,
0.056671142578125,
0.0269012451171875,
0.025360107421875,
-0.00853729248046875,
-0.0304718017578125,
-0.0482177734375,
-0.0146942138671875,
0.0028018951416015625,
0.032623291015625,
0.00794219970703125,
-0.04290771484375,
0.056243896484375,
0.0061798095703125,
0.051025390625,
0.0169677734375,
-0.0318603515625,
0.033599853515625,
-0.0250244140625,
-0.0380859375,
-0.001827239990234375,
0.07452392578125,
0.056365966796875,
-0.0087890625,
0.015045166015625,
-0.0027751922607421875,
-0.01273345947265625,
-0.01000213623046875,
-0.06341552734375,
-0.0277252197265625,
0.033416748046875,
-0.0399169921875,
-0.035736083984375,
-0.005084991455078125,
-0.04876708984375,
-0.0267181396484375,
0.015167236328125,
0.052276611328125,
-0.0310821533203125,
-0.0309906005859375,
0.00461578369140625,
-0.035797119140625,
0.0167694091796875,
0.0261383056640625,
-0.06646728515625,
0.01477813720703125,
0.0300750732421875,
0.0726318359375,
0.0031185150146484375,
-0.044921875,
-0.0176849365234375,
0.00852203369140625,
-0.0189056396484375,
0.0204315185546875,
-0.002063751220703125,
-0.03662109375,
-0.026763916015625,
0.0105743408203125,
-0.01323699951171875,
-0.043975830078125,
0.01313018798828125,
-0.0234832763671875,
0.036895751953125,
-0.01328277587890625,
-0.03692626953125,
-0.042236328125,
0.00321197509765625,
-0.03594970703125,
0.087890625,
0.03045654296875,
-0.0716552734375,
-0.00165557861328125,
-0.034088134765625,
-0.01197052001953125,
-0.0153961181640625,
0.00640869140625,
-0.04473876953125,
-0.004741668701171875,
-0.004016876220703125,
0.031341552734375,
-0.0262908935546875,
0.020660400390625,
-0.02935791015625,
-0.0115509033203125,
0.00970458984375,
-0.022979736328125,
0.0869140625,
0.0212860107421875,
-0.032135009765625,
0.018768310546875,
-0.039276123046875,
0.0042724609375,
0.02825927734375,
-0.020599365234375,
0.01284027099609375,
-0.027191162109375,
0.0059661865234375,
0.01293182373046875,
0.044677734375,
-0.03045654296875,
0.029052734375,
-0.01323699951171875,
0.03594970703125,
0.04510498046875,
0.0007472038269042969,
0.0289459228515625,
-0.05633544921875,
0.0198211669921875,
0.01387786865234375,
0.01495361328125,
0.005496978759765625,
-0.053863525390625,
-0.07769775390625,
-0.00020396709442138672,
-0.0007047653198242188,
0.033172607421875,
-0.0285186767578125,
0.038848876953125,
0.004261016845703125,
-0.06988525390625,
-0.0269317626953125,
0.00017821788787841797,
0.025543212890625,
0.04754638671875,
0.023590087890625,
-0.0269317626953125,
-0.051727294921875,
-0.056884765625,
0.0140838623046875,
-0.023406982421875,
0.00962066650390625,
0.035064697265625,
0.052398681640625,
-0.023468017578125,
0.06182861328125,
-0.046875,
-0.008758544921875,
-0.037384033203125,
0.023101806640625,
0.04071044921875,
0.05938720703125,
0.05694580078125,
-0.032623291015625,
-0.03656005859375,
0.0182342529296875,
-0.06689453125,
-0.0053558349609375,
-0.004116058349609375,
-0.00835418701171875,
0.0120391845703125,
0.0084381103515625,
-0.051788330078125,
0.0325927734375,
0.050201416015625,
-0.040069580078125,
0.06365966796875,
-0.037811279296875,
0.037261962890625,
-0.091552734375,
0.0051727294921875,
-0.02764892578125,
-0.003849029541015625,
-0.0418701171875,
0.006072998046875,
-0.0053558349609375,
-0.00086212158203125,
-0.04541015625,
0.0634765625,
-0.033843994140625,
-0.00899505615234375,
-0.01947021484375,
-0.00911712646484375,
0.0139617919921875,
0.033416748046875,
-0.019500732421875,
0.037353515625,
0.055511474609375,
-0.044586181640625,
0.054901123046875,
0.01934814453125,
-0.0083160400390625,
0.0003733634948730469,
-0.046783447265625,
0.0097808837890625,
0.00901031494140625,
0.0217437744140625,
-0.074462890625,
-0.0325927734375,
0.033966064453125,
-0.0391845703125,
0.0176849365234375,
-0.0170440673828125,
-0.0258636474609375,
-0.033935546875,
-0.021270751953125,
0.03338623046875,
0.05657958984375,
-0.032623291015625,
0.064208984375,
0.00908660888671875,
0.0008854866027832031,
-0.03680419921875,
-0.057952880859375,
-0.02484130859375,
-0.0238494873046875,
-0.07318115234375,
0.0262298583984375,
-0.018768310546875,
-0.0216522216796875,
0.0003247261047363281,
-0.00975799560546875,
-0.02178955078125,
-0.011749267578125,
0.02606201171875,
0.04681396484375,
-0.00994873046875,
-0.0274200439453125,
0.0079498291015625,
0.0064849853515625,
0.001277923583984375,
-0.0089874267578125,
0.06329345703125,
-0.0149383544921875,
-0.01336669921875,
-0.043121337890625,
0.005199432373046875,
0.051666259765625,
-0.00948333740234375,
0.06341552734375,
0.05450439453125,
-0.0197296142578125,
0.00823974609375,
-0.053131103515625,
-0.01055145263671875,
-0.0380859375,
0.0269622802734375,
-0.0020961761474609375,
-0.042205810546875,
0.056427001953125,
0.0234832763671875,
0.0305023193359375,
0.06097412109375,
0.039215087890625,
-0.0074310302734375,
0.05859375,
0.041778564453125,
0.0147857666015625,
0.03033447265625,
-0.06463623046875,
0.0072021484375,
-0.056671142578125,
-0.0330810546875,
-0.021392822265625,
-0.02191162109375,
-0.027099609375,
-0.0477294921875,
0.0207061767578125,
0.022491455078125,
-0.02325439453125,
0.0374755859375,
-0.048553466796875,
0.017242431640625,
0.0445556640625,
0.009307861328125,
0.00942230224609375,
0.00540924072265625,
-0.0071868896484375,
0.0096282958984375,
-0.05047607421875,
-0.0281982421875,
0.08154296875,
0.0186309814453125,
0.046600341796875,
-0.00994873046875,
0.06793212890625,
0.00440216064453125,
0.005580902099609375,
-0.031463623046875,
0.045501708984375,
0.007965087890625,
-0.05670166015625,
-0.0084381103515625,
-0.0296630859375,
-0.054718017578125,
0.0205841064453125,
-0.00933837890625,
-0.0716552734375,
0.0199432373046875,
0.020751953125,
-0.035614013671875,
0.055999755859375,
-0.06488037109375,
0.0726318359375,
-0.01568603515625,
-0.0318603515625,
-0.01412200927734375,
-0.0408935546875,
0.04620361328125,
0.01165771484375,
0.0037975311279296875,
-0.0015611648559570312,
0.0218658447265625,
0.08404541015625,
-0.0418701171875,
0.045501708984375,
0.0038509368896484375,
0.0010852813720703125,
0.041961669921875,
0.01055145263671875,
0.035064697265625,
-0.014007568359375,
-0.0031642913818359375,
0.0205535888671875,
0.01110076904296875,
-0.0123748779296875,
-0.0311737060546875,
0.061279296875,
-0.07830810546875,
-0.042205810546875,
-0.04644775390625,
-0.066650390625,
0.022796630859375,
0.018890380859375,
0.03961181640625,
0.03033447265625,
0.01313018798828125,
0.014129638671875,
0.039154052734375,
-0.018280029296875,
0.047882080078125,
0.0188446044921875,
-0.039703369140625,
-0.04132080078125,
0.05633544921875,
0.00621795654296875,
0.033721923828125,
-0.0008893013000488281,
0.01494598388671875,
-0.014617919921875,
-0.0173797607421875,
-0.0219879150390625,
0.026763916015625,
-0.057708740234375,
-0.01233673095703125,
-0.061431884765625,
-0.0350341796875,
-0.0390625,
-0.03582763671875,
-0.0309906005859375,
-0.0257110595703125,
-0.034332275390625,
0.00988006591796875,
0.041900634765625,
0.04937744140625,
-0.026824951171875,
0.0380859375,
-0.0634765625,
0.01085662841796875,
0.0235595703125,
-0.00414276123046875,
0.01387786865234375,
-0.060577392578125,
-0.00449371337890625,
0.0083160400390625,
-0.0457763671875,
-0.07684326171875,
0.045684814453125,
-0.01218414306640625,
0.019500732421875,
0.028411865234375,
-0.00876617431640625,
0.05364990234375,
0.00014221668243408203,
0.0587158203125,
0.03363037109375,
-0.06964111328125,
0.036163330078125,
-0.0291290283203125,
0.01763916015625,
0.004425048828125,
0.016448974609375,
-0.0291748046875,
-0.0164031982421875,
-0.06976318359375,
-0.054962158203125,
0.082763671875,
0.03759765625,
-0.018035888671875,
0.0265655517578125,
0.00724029541015625,
0.0017223358154296875,
0.01247406005859375,
-0.06817626953125,
-0.038970947265625,
-0.021087646484375,
-0.001476287841796875,
-0.007297515869140625,
-0.01024627685546875,
-0.03009033203125,
-0.041595458984375,
0.061004638671875,
0.024810791015625,
0.033294677734375,
0.004638671875,
0.0025577545166015625,
-0.02374267578125,
0.001842498779296875,
0.031707763671875,
0.040252685546875,
-0.046173095703125,
0.003265380859375,
0.02496337890625,
-0.04083251953125,
0.01435089111328125,
0.0131988525390625,
-0.00664520263671875,
0.0006756782531738281,
0.03460693359375,
0.0692138671875,
0.006122589111328125,
-0.019134521484375,
0.0191650390625,
0.00429534912109375,
-0.0267181396484375,
0.0031719207763671875,
0.0300750732421875,
0.025146484375,
0.03582763671875,
0.0251312255859375,
0.0224609375,
-0.01152801513671875,
-0.049041748046875,
-0.005512237548828125,
0.03106689453125,
0.003749847412109375,
-0.01401519775390625,
0.0662841796875,
0.00004661083221435547,
-0.0154876708984375,
0.051727294921875,
-0.00904083251953125,
-0.031768798828125,
0.078369140625,
0.048980712890625,
0.061492919921875,
-0.003131866455078125,
0.011871337890625,
0.047698974609375,
0.0269775390625,
-0.0096282958984375,
0.01511383056640625,
0.0013561248779296875,
-0.044677734375,
0.005207061767578125,
-0.047332763671875,
-0.0162506103515625,
0.01023101806640625,
-0.035064697265625,
0.028472900390625,
-0.040130615234375,
-0.013641357421875,
-0.0125885009765625,
0.025238037109375,
-0.040252685546875,
0.0172119140625,
0.002132415771484375,
0.051116943359375,
-0.0714111328125,
0.058197021484375,
0.04058837890625,
-0.03607177734375,
-0.07891845703125,
-0.0233306884765625,
0.0030422210693359375,
-0.046661376953125,
0.033447265625,
0.0141754150390625,
0.0112762451171875,
0.004894256591796875,
-0.049560546875,
-0.08355712890625,
0.11639404296875,
0.035308837890625,
-0.0244293212890625,
-0.009979248046875,
-0.007602691650390625,
0.02508544921875,
-0.032623291015625,
0.042724609375,
0.04217529296875,
0.045501708984375,
0.0297088623046875,
-0.08221435546875,
0.006664276123046875,
-0.01346588134765625,
-0.01497650146484375,
0.004180908203125,
-0.054779052734375,
0.10638427734375,
-0.039459228515625,
-0.01477813720703125,
0.0124359130859375,
0.05450439453125,
0.0443115234375,
0.02227783203125,
0.0234375,
0.044586181640625,
0.05194091796875,
-0.0204925537109375,
0.07733154296875,
-0.0195770263671875,
0.06268310546875,
0.0675048828125,
-0.00843048095703125,
0.0372314453125,
0.017913818359375,
-0.02508544921875,
0.03363037109375,
0.052337646484375,
-0.024444580078125,
0.033935546875,
-0.0129852294921875,
-0.01041412353515625,
-0.006732940673828125,
0.01023101806640625,
-0.0595703125,
0.0161590576171875,
0.001739501953125,
-0.027923583984375,
-0.0228424072265625,
-0.01812744140625,
0.017822265625,
-0.0438232421875,
-0.00853729248046875,
0.0238189697265625,
-0.00611114501953125,
-0.0477294921875,
0.057708740234375,
0.0034275054931640625,
0.053436279296875,
-0.0625,
-0.005401611328125,
-0.0161895751953125,
0.020538330078125,
-0.0306243896484375,
-0.05438232421875,
0.0092315673828125,
-0.021453857421875,
-0.018798828125,
0.007053375244140625,
0.046600341796875,
-0.040008544921875,
-0.038665771484375,
0.0264739990234375,
0.0095672607421875,
0.0294342041015625,
0.0257110595703125,
-0.054534912109375,
0.037750244140625,
0.029388427734375,
-0.020172119140625,
0.0209503173828125,
0.004695892333984375,
0.018463134765625,
0.054901123046875,
0.048736572265625,
-0.007579803466796875,
0.025390625,
-0.034027099609375,
0.061798095703125,
-0.034698486328125,
-0.020965576171875,
-0.05340576171875,
0.0487060546875,
-0.00591278076171875,
-0.038818359375,
0.06646728515625,
0.053131103515625,
0.06060791015625,
-0.0253448486328125,
0.056549072265625,
-0.03912353515625,
-0.00360870361328125,
-0.0241546630859375,
0.0623779296875,
-0.049835205078125,
0.003749847412109375,
-0.015106201171875,
-0.0736083984375,
0.0126190185546875,
0.046539306640625,
-0.018585205078125,
0.0172119140625,
0.054473876953125,
0.0677490234375,
-0.021087646484375,
-0.0243682861328125,
0.01495361328125,
0.031951904296875,
0.01532745361328125,
0.0518798828125,
0.037811279296875,
-0.0592041015625,
0.037994384765625,
-0.050933837890625,
-0.0216522216796875,
-0.0233154296875,
-0.05810546875,
-0.06292724609375,
-0.046356201171875,
-0.0289306640625,
-0.041778564453125,
-0.01143646240234375,
0.06317138671875,
0.046630859375,
-0.04736328125,
-0.04583740234375,
0.00537872314453125,
0.00905609130859375,
-0.0185699462890625,
-0.020751953125,
0.030914306640625,
0.004932403564453125,
-0.063232421875,
0.0159149169921875,
0.01412200927734375,
0.0345458984375,
-0.0157012939453125,
-0.0261383056640625,
-0.023773193359375,
0.002056121826171875,
0.009735107421875,
0.0281982421875,
-0.05499267578125,
0.009307861328125,
-0.0270233154296875,
-0.0072784423828125,
0.0015153884887695312,
0.026031494140625,
-0.039794921875,
0.037261962890625,
0.053802490234375,
-0.0035305023193359375,
0.047637939453125,
-0.0191650390625,
0.038360595703125,
-0.0307159423828125,
0.01666259765625,
-0.005340576171875,
0.044586181640625,
0.0161590576171875,
-0.024627685546875,
0.0323486328125,
0.024627685546875,
-0.0421142578125,
-0.07427978515625,
-0.01302337646484375,
-0.0872802734375,
-0.0106964111328125,
0.0845947265625,
-0.01042938232421875,
-0.0208282470703125,
0.019256591796875,
-0.036895751953125,
0.050262451171875,
-0.01666259765625,
0.0479736328125,
0.04034423828125,
-0.012969970703125,
-0.00861358642578125,
-0.0303192138671875,
0.0288848876953125,
0.0135345458984375,
-0.053466796875,
-0.002254486083984375,
0.0254364013671875,
0.042572021484375,
0.0227203369140625,
0.04254150390625,
0.005718231201171875,
0.037078857421875,
0.024871826171875,
0.0308074951171875,
-0.031982421875,
-0.01293182373046875,
-0.0101776123046875,
0.007579803466796875,
-0.021453857421875,
-0.0087890625
]
] |
euclaise/falcon_1b_stage2 | 2023-09-25T06:18:39.000Z | [
"transformers",
"pytorch",
"falcon",
"text-generation",
"generated_from_trainer",
"custom_code",
"license:apache-2.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | euclaise | null | null | euclaise/falcon_1b_stage2 | 2 | 13,633 | transformers | 2023-09-17T22:37:09 | ---
license: apache-2.0
base_model: euclaise/falcon_1b_stage1
tags:
- generated_from_trainer
model-index:
- name: falcon_1b_stage2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# falcon_1b_stage2
This model is a fine-tuned version of [euclaise/falcon_1b_stage1](https://huggingface.co/euclaise/falcon_1b_stage1) (the training dataset is not specified).
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 1
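The total train batch size listed above follows from the per-device batch size and gradient accumulation (a quick check, assuming a single device):

```python
train_batch_size = 16
gradient_accumulation_steps = 8

# Effective batch size seen by the optimizer per update step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 128
```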
### Training results
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
| 1,180 | [
[
-0.039306640625,
-0.0457763671875,
-0.0010480880737304688,
0.019866943359375,
-0.018341064453125,
-0.0225067138671875,
0.00646209716796875,
-0.025360107421875,
0.0288238525390625,
0.031890869140625,
-0.06298828125,
-0.0345458984375,
-0.0498046875,
0.002777099609375,
-0.0262603759765625,
0.09503173828125,
0.0009870529174804688,
0.0231781005859375,
0.0007424354553222656,
0.0048828125,
-0.0145721435546875,
-0.034698486328125,
-0.08380126953125,
-0.046173095703125,
0.0333251953125,
0.0282135009765625,
0.0474853515625,
0.074462890625,
0.04815673828125,
0.0218963623046875,
-0.0374755859375,
-0.00537872314453125,
-0.046356201171875,
-0.0245819091796875,
-0.005168914794921875,
-0.02557373046875,
-0.0587158203125,
0.00003814697265625,
0.049774169921875,
0.033203125,
-0.0251312255859375,
0.03778076171875,
0.00530242919921875,
0.036407470703125,
-0.034210205078125,
0.039337158203125,
-0.038299560546875,
0.035369873046875,
-0.0157012939453125,
-0.0219268798828125,
-0.016632080078125,
0.00818634033203125,
-0.01397705078125,
-0.0643310546875,
0.039154052734375,
-0.007572174072265625,
0.07666015625,
0.0282135009765625,
-0.01629638671875,
0.0018625259399414062,
-0.051910400390625,
0.03167724609375,
-0.04296875,
0.01531219482421875,
0.021697998046875,
0.047698974609375,
0.0008215904235839844,
-0.07501220703125,
-0.0189056396484375,
-0.00510406494140625,
0.006439208984375,
0.0216217041015625,
-0.0068359375,
0.003871917724609375,
0.0570068359375,
0.01270294189453125,
-0.035064697265625,
0.013519287109375,
-0.05596923828125,
-0.0200347900390625,
0.046051025390625,
0.0296783447265625,
-0.01500701904296875,
-0.005161285400390625,
-0.041717529296875,
-0.013275146484375,
-0.02154541015625,
0.029052734375,
0.038665771484375,
0.035980224609375,
-0.017425537109375,
0.048828125,
-0.0250396728515625,
0.040252685546875,
0.0277252197265625,
0.00044536590576171875,
0.03759765625,
0.0026073455810546875,
-0.032562255859375,
0.00013709068298339844,
0.051239013671875,
0.032684326171875,
0.0255279541015625,
-0.00377655029296875,
-0.023895263671875,
-0.01497650146484375,
0.026123046875,
-0.07171630859375,
-0.00641632080078125,
0.006122589111328125,
-0.03216552734375,
-0.0511474609375,
0.03271484375,
-0.048828125,
0.003246307373046875,
-0.01273345947265625,
0.0248565673828125,
-0.00925445556640625,
-0.0174560546875,
0.014190673828125,
-0.00885772705078125,
0.01507568359375,
0.0028285980224609375,
-0.0657958984375,
0.042266845703125,
0.037994384765625,
0.03741455078125,
0.01447296142578125,
-0.0445556640625,
-0.0396728515625,
0.007038116455078125,
-0.02813720703125,
0.03228759765625,
-0.019012451171875,
-0.04156494140625,
-0.0072479248046875,
0.037750244140625,
-0.0176849365234375,
-0.028106689453125,
0.09051513671875,
-0.0217742919921875,
0.006931304931640625,
-0.0286102294921875,
-0.04742431640625,
-0.021453857421875,
0.0156402587890625,
-0.05047607421875,
0.0802001953125,
0.020111083984375,
-0.0667724609375,
0.038299560546875,
-0.06689453125,
-0.0183563232421875,
0.0098876953125,
0.0036716461181640625,
-0.047027587890625,
0.00023496150970458984,
0.00555419921875,
0.03857421875,
-0.0038013458251953125,
0.012786865234375,
-0.038726806640625,
-0.05059814453125,
-0.01242828369140625,
-0.01953125,
0.046661376953125,
0.029510498046875,
-0.0248565673828125,
0.004611968994140625,
-0.081787109375,
0.01090240478515625,
0.0249786376953125,
-0.0233612060546875,
0.02349853515625,
-0.023895263671875,
0.038848876953125,
0.026702880859375,
0.0244903564453125,
-0.047698974609375,
0.006839752197265625,
-0.0271759033203125,
0.0283660888671875,
0.03253173828125,
0.00995635986328125,
0.006427764892578125,
-0.037017822265625,
0.03076171875,
0.031982421875,
0.037109375,
0.01099395751953125,
-0.030364990234375,
-0.073486328125,
-0.0088958740234375,
0.017364501953125,
0.0277862548828125,
-0.0212860107421875,
0.042327880859375,
0.0026836395263671875,
-0.051971435546875,
-0.01192474365234375,
0.002513885498046875,
0.0254058837890625,
0.048492431640625,
0.0335693359375,
-0.003185272216796875,
-0.04022216796875,
-0.0794677734375,
-0.0006098747253417969,
0.0029430389404296875,
0.0271759033203125,
0.008758544921875,
0.0638427734375,
-0.0114898681640625,
0.04205322265625,
-0.0233306884765625,
-0.00817108154296875,
-0.0091400146484375,
0.0016412734985351562,
0.027801513671875,
0.06451416015625,
0.05645751953125,
-0.024627685546875,
-0.0195159912109375,
-0.0103302001953125,
-0.05548095703125,
0.02203369140625,
-0.00986480712890625,
-0.014129638671875,
-0.0036449432373046875,
0.029022216796875,
-0.037750244140625,
0.044769287109375,
0.00336456298828125,
-0.0243377685546875,
0.0222015380859375,
-0.032135009765625,
-0.00447845458984375,
-0.080322265625,
0.019866943359375,
0.0276336669921875,
-0.0070648193359375,
-0.02679443359375,
0.0205078125,
-0.00214385986328125,
-0.01145172119140625,
-0.044036865234375,
0.048431396484375,
-0.009765625,
0.004913330078125,
-0.0169830322265625,
-0.030242919921875,
0.0008344650268554688,
0.051239013671875,
0.0164337158203125,
0.0438232421875,
0.05633544921875,
-0.032012939453125,
0.01157379150390625,
0.03955078125,
-0.00885009765625,
0.0270843505859375,
-0.0830078125,
0.005908966064453125,
-0.004119873046875,
0.0075531005859375,
-0.056182861328125,
-0.023529052734375,
0.042449951171875,
-0.031982421875,
0.0233001708984375,
-0.01291656494140625,
-0.02923583984375,
-0.041015625,
-0.0038814544677734375,
0.0156402587890625,
0.031585693359375,
-0.047607421875,
0.018463134765625,
-0.0042572021484375,
0.0226287841796875,
-0.033721923828125,
-0.05560302734375,
-0.0228424072265625,
-0.0167694091796875,
-0.03509521484375,
0.015228271484375,
0.001422882080078125,
0.01397705078125,
-0.01558685302734375,
-0.00574493408203125,
-0.0189666748046875,
-0.0015783309936523438,
0.035675048828125,
0.01200103759765625,
-0.01241302490234375,
-0.012847900390625,
0.01230621337890625,
-0.02801513671875,
0.0195465087890625,
-0.008087158203125,
0.0185546875,
-0.01467132568359375,
-0.01611328125,
-0.0648193359375,
0.00403594970703125,
0.04974365234375,
-0.01055908203125,
0.055084228515625,
0.06805419921875,
-0.040008544921875,
-0.007289886474609375,
-0.0223388671875,
-0.01727294921875,
-0.03240966796875,
0.03582763671875,
-0.047943115234375,
-0.025238037109375,
0.048797607421875,
0.00738525390625,
0.0015544891357421875,
0.07513427734375,
0.02349853515625,
0.00641632080078125,
0.0943603515625,
0.0208892822265625,
0.005458831787109375,
0.02215576171875,
-0.07244873046875,
-0.0253753662109375,
-0.061767578125,
-0.040252685546875,
-0.044677734375,
-0.025360107421875,
-0.051513671875,
0.0038013458251953125,
0.0116424560546875,
0.02264404296875,
-0.062286376953125,
0.0290069580078125,
-0.036224365234375,
0.038543701171875,
0.0401611328125,
0.0238800048828125,
-0.00881195068359375,
0.0167236328125,
-0.0220184326171875,
-0.0020732879638671875,
-0.0740966796875,
-0.0297393798828125,
0.09130859375,
0.0460205078125,
0.036346435546875,
-0.0255889892578125,
0.053802490234375,
-0.005382537841796875,
-0.0001035928726196289,
-0.036651611328125,
0.0283203125,
-0.005649566650390625,
-0.06329345703125,
-0.00336456298828125,
-0.0245361328125,
-0.04815673828125,
0.005126953125,
-0.037994384765625,
-0.042327880859375,
0.006099700927734375,
0.0013055801391601562,
-0.0239105224609375,
0.0285186767578125,
-0.0333251953125,
0.0826416015625,
-0.0120391845703125,
-0.04156494140625,
0.00014078617095947266,
-0.03851318359375,
0.01195526123046875,
0.0007939338684082031,
-0.0104827880859375,
0.013580322265625,
0.007572174072265625,
0.069091796875,
-0.055694580078125,
0.048675537109375,
-0.037933349609375,
0.03265380859375,
0.03057861328125,
-0.0086517333984375,
0.060577392578125,
0.0185546875,
-0.024993896484375,
0.0136260986328125,
0.0162200927734375,
-0.051727294921875,
-0.029052734375,
0.055816650390625,
-0.0902099609375,
-0.004108428955078125,
-0.043853759765625,
-0.03289794921875,
-0.0135498046875,
0.01096343994140625,
0.044952392578125,
0.050079345703125,
-0.01458740234375,
0.0162200927734375,
0.01467132568359375,
0.0172882080078125,
0.036346435546875,
0.03076171875,
-0.00974273681640625,
-0.038970947265625,
0.059539794921875,
0.0005035400390625,
0.0012445449829101562,
-0.005725860595703125,
0.0086822509765625,
-0.03802490234375,
-0.042449951171875,
-0.0233306884765625,
0.0208587646484375,
-0.048919677734375,
-0.0013437271118164062,
-0.0305938720703125,
-0.043670654296875,
-0.0116424560546875,
-0.00384521484375,
-0.0360107421875,
-0.03570556640625,
-0.05218505859375,
-0.0181427001953125,
0.024200439453125,
0.05401611328125,
-0.010223388671875,
0.057342529296875,
-0.044403076171875,
-0.0160980224609375,
0.0033969879150390625,
0.031768798828125,
0.0010852813720703125,
-0.069580078125,
-0.0269775390625,
0.0225372314453125,
-0.034912109375,
-0.0260467529296875,
0.0245361328125,
0.010498046875,
0.0538330078125,
0.046142578125,
-0.0215911865234375,
0.06402587890625,
-0.01342010498046875,
0.057708740234375,
0.0236358642578125,
-0.038848876953125,
0.031463623046875,
-0.032958984375,
0.01244354248046875,
0.052337646484375,
0.037567138671875,
0.0076904296875,
-0.01220703125,
-0.0950927734375,
-0.0394287109375,
0.0645751953125,
0.0330810546875,
0.005588531494140625,
0.0016851425170898438,
0.055267333984375,
0.0014085769653320312,
0.01183319091796875,
-0.0469970703125,
-0.03399658203125,
-0.0345458984375,
0.0081329345703125,
-0.0233306884765625,
-0.010894775390625,
-0.00888824462890625,
-0.0592041015625,
0.08270263671875,
-0.00249481201171875,
0.01324462890625,
0.00698089599609375,
0.020721435546875,
-0.0287322998046875,
-0.01812744140625,
0.05340576171875,
0.04901123046875,
-0.048736572265625,
-0.028350830078125,
0.00975799560546875,
-0.030914306640625,
-0.00719451904296875,
0.0189666748046875,
-0.0131988525390625,
0.004863739013671875,
0.027679443359375,
0.07830810546875,
0.0134429931640625,
0.0015783309936523438,
0.0296173095703125,
-0.00838470458984375,
-0.04547119140625,
-0.016571044921875,
0.0292205810546875,
-0.016693115234375,
0.028228759765625,
0.00157928466796875,
0.03741455078125,
0.00876617431640625,
-0.005096435546875,
0.0220184326171875,
0.0163421630859375,
-0.04571533203125,
-0.021270751953125,
0.075439453125,
0.01074981689453125,
-0.0294036865234375,
0.041900634765625,
-0.0165252685546875,
-0.00946044921875,
0.070068359375,
0.049072265625,
0.06689453125,
0.00762176513671875,
0.024261474609375,
0.07269287109375,
0.00527191162109375,
-0.022735595703125,
0.04022216796875,
0.005741119384765625,
-0.03167724609375,
-0.0006847381591796875,
-0.051513671875,
-0.00978851318359375,
0.03131103515625,
-0.084716796875,
0.034088134765625,
-0.05059814453125,
-0.0294952392578125,
0.0196075439453125,
0.0300750732421875,
-0.0667724609375,
0.040191650390625,
0.0025920867919921875,
0.0989990234375,
-0.0648193359375,
0.056549072265625,
0.05340576171875,
-0.049530029296875,
-0.07794189453125,
-0.021087646484375,
-0.01192474365234375,
-0.0693359375,
0.05535888671875,
0.0016040802001953125,
0.016845703125,
0.025665283203125,
-0.041229248046875,
-0.04351806640625,
0.079833984375,
0.01476287841796875,
-0.0540771484375,
0.0106048583984375,
0.0228424072265625,
0.050079345703125,
-0.042572021484375,
0.05596923828125,
0.014923095703125,
0.0288543701171875,
0.051055908203125,
-0.05731201171875,
-0.0200958251953125,
-0.035797119140625,
0.01788330078125,
0.00193023681640625,
-0.05792236328125,
0.07489013671875,
-0.006549835205078125,
0.0238494873046875,
0.0241546630859375,
0.039398193359375,
0.0092315673828125,
0.0113067626953125,
0.02362060546875,
0.059539794921875,
0.04547119140625,
-0.0012598037719726562,
0.06524658203125,
-0.062469482421875,
0.06072998046875,
0.07916259765625,
-0.0038280487060546875,
0.040496826171875,
0.0311737060546875,
-0.0071563720703125,
0.01114654541015625,
0.07464599609375,
-0.032867431640625,
0.0293426513671875,
0.0150146484375,
0.01261138916015625,
-0.0341796875,
0.01256561279296875,
-0.06024169921875,
0.0347900390625,
-0.005260467529296875,
-0.048675537109375,
-0.036285400390625,
-0.0242462158203125,
0.002613067626953125,
-0.017486572265625,
-0.0374755859375,
0.036407470703125,
-0.027679443359375,
-0.042083740234375,
0.06158447265625,
0.011322021484375,
0.0180816650390625,
-0.045806884765625,
-0.00031638145446777344,
-0.0206451416015625,
0.0247039794921875,
-0.0263214111328125,
-0.035186767578125,
0.0243988037109375,
0.0011816024780273438,
-0.013671875,
0.005634307861328125,
0.0223236083984375,
-0.0169830322265625,
-0.0859375,
0.00942230224609375,
0.03204345703125,
0.0210418701171875,
0.0031032562255859375,
-0.08770751953125,
0.001560211181640625,
-0.017425537109375,
-0.0241241455078125,
0.01006317138671875,
0.014312744140625,
0.0181121826171875,
0.035400390625,
0.04132080078125,
0.0008440017700195312,
0.0001590251922607422,
0.01222991943359375,
0.0582275390625,
-0.0340576171875,
-0.040985107421875,
-0.046539306640625,
0.03955078125,
-0.007518768310546875,
-0.0693359375,
0.039581298828125,
0.08135986328125,
0.062744140625,
-0.01497650146484375,
0.037933349609375,
0.01499176025390625,
0.02923583984375,
-0.0289764404296875,
0.044036865234375,
-0.04925537109375,
0.0009088516235351562,
-0.020660400390625,
-0.06488037109375,
0.00653076171875,
0.05596923828125,
-0.00414276123046875,
0.0198822021484375,
0.04156494140625,
0.0726318359375,
-0.01947021484375,
0.0338134765625,
0.0104827880859375,
0.0093841552734375,
0.00989532470703125,
0.040435791015625,
0.03839111328125,
-0.0718994140625,
0.0177154541015625,
-0.044158935546875,
-0.005950927734375,
-0.0018053054809570312,
-0.052001953125,
-0.0811767578125,
-0.031768798828125,
-0.03668212890625,
-0.037322998046875,
0.00753021240234375,
0.07318115234375,
0.06591796875,
-0.0565185546875,
-0.032806396484375,
-0.0235748291015625,
-0.0277252197265625,
-0.02264404296875,
-0.01244354248046875,
0.019256591796875,
-0.024658203125,
-0.0467529296875,
-0.0022220611572265625,
-0.016937255859375,
0.022674560546875,
-0.01354217529296875,
-0.0252838134765625,
-0.01343536376953125,
-0.01470947265625,
0.0114288330078125,
0.01473236083984375,
-0.035980224609375,
-0.025543212890625,
-0.004253387451171875,
-0.0009407997131347656,
0.0129241943359375,
0.019927978515625,
-0.03326416015625,
0.02374267578125,
0.008331298828125,
0.0153350830078125,
0.0731201171875,
0.0005927085876464844,
0.00847625732421875,
-0.04156494140625,
0.031585693359375,
0.0005950927734375,
0.030120849609375,
-0.0076904296875,
-0.026611328125,
0.0472412109375,
0.032623291015625,
-0.03369140625,
-0.049102783203125,
-0.018585205078125,
-0.08648681640625,
0.00984954833984375,
0.0843505859375,
0.0052490234375,
-0.033782958984375,
0.0406494140625,
-0.0175323486328125,
0.0269927978515625,
-0.019500732421875,
0.034332275390625,
0.049041748046875,
-0.0088958740234375,
0.011322021484375,
-0.042449951171875,
0.034820556640625,
-0.0011529922485351562,
-0.054595947265625,
-0.0309295654296875,
0.012542724609375,
0.04156494140625,
-0.010589599609375,
0.0176544189453125,
-0.003208160400390625,
0.024322509765625,
0.01739501953125,
0.0037860870361328125,
-0.05615234375,
-0.03497314453125,
-0.022216796875,
0.0162811279296875,
-0.00870513916015625,
-0.03656005859375
]
] |
pyannote/overlapped-speech-detection | 2022-10-28T13:46:33.000Z | [
"pyannote-audio",
"pyannote",
"pyannote-audio-pipeline",
"audio",
"voice",
"speech",
"speaker",
"overlapped-speech-detection",
"automatic-speech-recognition",
"dataset:ami",
"dataset:dihard",
"dataset:voxconverse",
"license:mit",
"region:us"
] | automatic-speech-recognition | pyannote | null | null | pyannote/overlapped-speech-detection | 13 | 13,625 | pyannote-audio | 2022-03-02T23:29:05 | ---
tags:
- pyannote
- pyannote-audio
- pyannote-audio-pipeline
- audio
- voice
- speech
- speaker
- overlapped-speech-detection
- automatic-speech-recognition
datasets:
- ami
- dihard
- voxconverse
license: mit
extra_gated_prompt: "The collected information will help acquire a better knowledge of pyannote.audio userbase and help its maintainers apply for grants to improve it further. If you are an academic researcher, please cite the relevant papers in your own publications using the model. If you work for a company, please consider contributing back to pyannote.audio development (e.g. through unrestricted gifts). We also provide scientific consulting services around speaker diarization and machine listening."
extra_gated_fields:
Company/university: text
Website: text
I plan to use this model for (task, type of audio data, etc): text
---
# 🎹 Overlapped speech detection
Relies on pyannote.audio 2.1: see [installation instructions](https://github.com/pyannote/pyannote-audio#installation).
```python
# 1. visit hf.co/pyannote/segmentation and accept user conditions
# 2. visit hf.co/settings/tokens to create an access token
# 3. instantiate pretrained overlapped speech detection pipeline
from pyannote.audio import Pipeline
pipeline = Pipeline.from_pretrained("pyannote/overlapped-speech-detection",
use_auth_token="ACCESS_TOKEN_GOES_HERE")
output = pipeline("audio.wav")
for speech in output.get_timeline().support():
# two or more speakers are active between speech.start and speech.end
...
```
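The `(speech.start, speech.end)` pairs produced above can be aggregated into simple summary statistics. A minimal sketch, assuming the segments have already been extracted into plain `(start, end)` tuples (the helper name is illustrative, not part of the pyannote API):

```python
# Aggregate overlapped-speech segments into summary statistics.
# The (start, end) pairs stand in for the speech.start / speech.end
# values yielded by output.get_timeline().support() above.

def overlap_stats(segments):
    """Return total overlapped duration and the longest overlap region."""
    durations = [end - start for start, end in segments]
    total = sum(durations)
    longest = max(durations, default=0.0)
    return total, longest

# Example: three overlap regions detected in a recording
segments = [(1.2, 2.0), (5.5, 7.0), (10.0, 10.4)]
total, longest = overlap_stats(segments)
print(f"overlapped speech: {total:.1f}s, longest region: {longest:.1f}s")
# prints: overlapped speech: 2.7s, longest region: 1.5s
```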
## Support
For commercial enquiries and scientific consulting, please contact [me](mailto:herve@niderb.fr).
For [technical questions](https://github.com/pyannote/pyannote-audio/discussions) and [bug reports](https://github.com/pyannote/pyannote-audio/issues), please check the [pyannote.audio](https://github.com/pyannote/pyannote-audio) GitHub repository.
## Citation
```bibtex
@inproceedings{Bredin2021,
Title = {{End-to-end speaker segmentation for overlap-aware resegmentation}},
Author = {{Bredin}, Herv{\'e} and {Laurent}, Antoine},
Booktitle = {Proc. Interspeech 2021},
Address = {Brno, Czech Republic},
Month = {August},
Year = {2021},
}
```
```bibtex
@inproceedings{Bredin2020,
Title = {{pyannote.audio: neural building blocks for speaker diarization}},
Author = {{Bredin}, Herv{\'e} and {Yin}, Ruiqing and {Coria}, Juan Manuel and {Gelly}, Gregory and {Korshunov}, Pavel and {Lavechin}, Marvin and {Fustes}, Diego and {Titeux}, Hadrien and {Bouaziz}, Wassim and {Gill}, Marie-Philippe},
Booktitle = {ICASSP 2020, IEEE International Conference on Acoustics, Speech, and Signal Processing},
Address = {Barcelona, Spain},
Month = {May},
Year = {2020},
}
```
| 2,759 | [
[
-0.0285797119140625,
-0.04150390625,
0.0205078125,
0.034698486328125,
-0.01308441162109375,
0.00455474853515625,
-0.02862548828125,
-0.05059814453125,
0.03204345703125,
0.021881103515625,
-0.040069580078125,
-0.03643798828125,
-0.0309295654296875,
-0.0221710205078125,
-0.0144195556640625,
0.03155517578125,
0.0198974609375,
-0.00740814208984375,
0.0010814666748046875,
0.0017518997192382812,
-0.0333251953125,
-0.00896453857421875,
-0.026824951171875,
-0.028717041015625,
-0.0025577545166015625,
0.0380859375,
0.0195770263671875,
0.038543701171875,
0.0137939453125,
0.029296875,
-0.04150390625,
0.0157470703125,
0.01812744140625,
0.01171112060546875,
0.007701873779296875,
-0.008453369140625,
-0.033599853515625,
0.0000013709068298339844,
0.06640625,
0.034210205078125,
-0.012054443359375,
-0.0027179718017578125,
-0.007144927978515625,
0.012115478515625,
-0.0282440185546875,
0.0034656524658203125,
-0.04168701171875,
0.00667572021484375,
-0.036651611328125,
-0.01031494140625,
-0.0135498046875,
0.0031909942626953125,
0.024444580078125,
-0.038543701171875,
-0.006084442138671875,
-0.01309967041015625,
0.07330322265625,
0.00801849365234375,
0.005847930908203125,
-0.01519775390625,
-0.03497314453125,
0.04339599609375,
-0.07354736328125,
0.033233642578125,
0.025970458984375,
0.01181793212890625,
-0.0289764404296875,
-0.063232421875,
-0.056976318359375,
-0.01385498046875,
0.003631591796875,
0.01349639892578125,
-0.0212554931640625,
0.0196685791015625,
0.0230560302734375,
0.033233642578125,
-0.051849365234375,
0.009368896484375,
-0.0277252197265625,
-0.034698486328125,
0.0474853515625,
-0.0165557861328125,
0.042755126953125,
-0.0222015380859375,
-0.02642822265625,
-0.0229339599609375,
-0.01049041748046875,
0.011993408203125,
0.056793212890625,
0.018585205078125,
-0.0188140869140625,
0.0184478759765625,
-0.0038814544677734375,
0.05963134765625,
0.000035762786865234375,
-0.0032253265380859375,
0.05572509765625,
-0.029388427734375,
0.0013647079467773438,
0.0489501953125,
0.07342529296875,
0.00122833251953125,
0.029754638671875,
0.01502227783203125,
0.0062255859375,
-0.0184783935546875,
-0.016143798828125,
-0.033233642578125,
-0.047637939453125,
0.037689208984375,
-0.0185699462890625,
0.0232086181640625,
0.0017805099487304688,
-0.058502197265625,
-0.01294708251953125,
-0.0107269287109375,
0.043487548828125,
-0.0262298583984375,
-0.0555419921875,
0.007350921630859375,
-0.031341552734375,
0.003429412841796875,
-0.00778961181640625,
-0.090087890625,
0.0197601318359375,
0.04376220703125,
0.08203125,
0.0031719207763671875,
-0.039276123046875,
-0.04412841796875,
-0.00861358642578125,
-0.0101776123046875,
0.058319091796875,
-0.0207061767578125,
-0.0186004638671875,
-0.0171661376953125,
-0.0091552734375,
-0.0228424072265625,
-0.046142578125,
0.060577392578125,
0.006206512451171875,
0.0190887451171875,
0.00792694091796875,
-0.042724609375,
-0.0038471221923828125,
-0.0156402587890625,
-0.0258941650390625,
0.07965087890625,
0.013824462890625,
-0.0673828125,
0.0185546875,
-0.037200927734375,
-0.01461029052734375,
0.014312744140625,
-0.01483154296875,
-0.05621337890625,
-0.033935546875,
0.0234527587890625,
0.015625,
0.00861358642578125,
0.0032482147216796875,
-0.01422882080078125,
-0.038299560546875,
0.0112762451171875,
-0.01540374755859375,
0.08392333984375,
0.0009493827819824219,
-0.0543212890625,
0.0167694091796875,
-0.06793212890625,
0.0032176971435546875,
-0.0034198760986328125,
-0.0274810791015625,
-0.02337646484375,
-0.0111083984375,
0.0231781005859375,
0.000698089599609375,
0.01812744140625,
-0.0650634765625,
-0.0102996826171875,
-0.051177978515625,
0.049407958984375,
0.040435791015625,
0.002155303955078125,
0.0067901611328125,
-0.0226898193359375,
0.02227783203125,
0.0024471282958984375,
-0.004413604736328125,
-0.03826904296875,
-0.0479736328125,
-0.0386962890625,
-0.04248046875,
0.0203704833984375,
0.052581787109375,
-0.018890380859375,
0.062286376953125,
0.00469970703125,
-0.048065185546875,
-0.041961669921875,
-0.005313873291015625,
0.0233612060546875,
0.0338134765625,
0.0360107421875,
-0.01690673828125,
-0.072021484375,
-0.0594482421875,
-0.041259765625,
-0.0372314453125,
0.0069122314453125,
0.03564453125,
0.0207061767578125,
0.00015079975128173828,
0.0693359375,
-0.0212249755859375,
-0.0218658447265625,
0.0229949951171875,
-0.00433349609375,
0.0325927734375,
0.056243896484375,
0.02630615234375,
-0.057464599609375,
-0.0496826171875,
0.00307464599609375,
-0.017578125,
-0.0306396484375,
-0.0203704833984375,
0.00028824806213378906,
0.00010466575622558594,
0.0452880859375,
-0.051666259765625,
0.023406982421875,
0.0233612060546875,
-0.02264404296875,
0.058135986328125,
0.01447296142578125,
0.00789642333984375,
-0.064453125,
0.000537872314453125,
0.0020008087158203125,
-0.0141143798828125,
-0.04974365234375,
-0.044677734375,
-0.00838470458984375,
0.00514984130859375,
-0.0186614990234375,
0.042510986328125,
-0.03948974609375,
-0.022552490234375,
-0.01155853271484375,
0.037841796875,
-0.0186004638671875,
0.03778076171875,
0.00815582275390625,
0.044219970703125,
0.05548095703125,
-0.0302581787109375,
0.036651611328125,
0.04461669921875,
-0.061370849609375,
0.02606201171875,
-0.0523681640625,
0.01248931884765625,
0.045867919921875,
0.0149993896484375,
-0.103759765625,
-0.005550384521484375,
0.0650634765625,
-0.0770263671875,
0.025726318359375,
-0.0298919677734375,
-0.00963592529296875,
-0.0240936279296875,
-0.01152801513671875,
0.04327392578125,
0.020263671875,
-0.06884765625,
0.0199432373046875,
0.040740966796875,
-0.03985595703125,
-0.044677734375,
-0.0611572265625,
0.005290985107421875,
-0.0215606689453125,
-0.0780029296875,
0.048431396484375,
-0.0038661956787109375,
-0.0206451416015625,
0.0011682510375976562,
-0.020355224609375,
-0.003543853759765625,
-0.02532958984375,
0.01459503173828125,
-0.006195068359375,
-0.0239715576171875,
0.00004887580871582031,
-0.006122589111328125,
0.00328826904296875,
-0.007198333740234375,
-0.057037353515625,
0.045196533203125,
-0.0025634765625,
-0.0223388671875,
-0.06622314453125,
0.007221221923828125,
0.044189453125,
-0.0296478271484375,
0.020050048828125,
0.07427978515625,
-0.0296630859375,
0.0017633438110351562,
-0.040802001953125,
0.0032062530517578125,
-0.037872314453125,
0.0467529296875,
-0.0160980224609375,
-0.054107666015625,
0.049652099609375,
0.020782470703125,
0.00632476806640625,
0.029815673828125,
0.039154052734375,
-0.0017375946044921875,
0.0386962890625,
0.0073699951171875,
0.005229949951171875,
0.0738525390625,
-0.0116424560546875,
0.01690673828125,
-0.09161376953125,
-0.033294677734375,
-0.042388916015625,
-0.00672149658203125,
-0.041839599609375,
-0.0249481201171875,
0.0108489990234375,
0.0047607421875,
0.00916290283203125,
0.04205322265625,
-0.0615234375,
0.0292205810546875,
0.04034423828125,
-0.0008249282836914062,
-0.0123748779296875,
0.0191497802734375,
-0.0128631591796875,
-0.0069122314453125,
-0.04119873046875,
-0.0089874267578125,
0.053497314453125,
0.034881591796875,
0.0211944580078125,
0.001033782958984375,
0.047576904296875,
0.0192108154296875,
-0.046875,
-0.060516357421875,
0.0291290283203125,
-0.00530242919921875,
-0.031890869140625,
-0.0494384765625,
-0.0303802490234375,
-0.079833984375,
0.0391845703125,
0.00946807861328125,
-0.08966064453125,
0.0482177734375,
-0.0012683868408203125,
-0.03363037109375,
0.03045654296875,
-0.0479736328125,
0.058319091796875,
-0.00011801719665527344,
-0.01018524169921875,
-0.0153045654296875,
-0.045806884765625,
0.0217437744140625,
0.024139404296875,
-0.0010700225830078125,
-0.023193359375,
0.02886962890625,
0.091064453125,
-0.01971435546875,
0.0307769775390625,
-0.0325927734375,
0.003818511962890625,
0.041473388671875,
-0.0257110595703125,
0.01093292236328125,
0.004150390625,
0.0120697021484375,
0.01849365234375,
0.00969696044921875,
-0.01800537109375,
-0.01509857177734375,
0.0587158203125,
-0.05413818359375,
-0.042144775390625,
-0.028961181640625,
-0.037078857421875,
0.0024929046630859375,
0.01120758056640625,
0.0196533203125,
0.053955078125,
-0.003437042236328125,
0.0029811859130859375,
0.04498291015625,
-0.0228729248046875,
0.05572509765625,
0.0211639404296875,
0.0026569366455078125,
-0.07000732421875,
0.06658935546875,
0.00887298583984375,
0.005466461181640625,
0.027191162109375,
0.0208740234375,
-0.0256195068359375,
-0.054718017578125,
-0.0229644775390625,
0.0181427001953125,
-0.03594970703125,
0.01435089111328125,
-0.057952880859375,
-0.03466796875,
-0.056671142578125,
0.0250244140625,
-0.0338134765625,
-0.044219970703125,
-0.0272369384765625,
0.01708984375,
0.0167083740234375,
-0.00022041797637939453,
-0.046722412109375,
0.032501220703125,
-0.0457763671875,
0.01560211181640625,
0.0204925537109375,
0.011016845703125,
-0.0007772445678710938,
-0.0679931640625,
-0.026885986328125,
-0.004467010498046875,
-0.00966644287109375,
-0.0560302734375,
0.0177001953125,
0.0206298828125,
0.0728759765625,
0.0285491943359375,
-0.005924224853515625,
0.04168701171875,
-0.0091552734375,
0.05792236328125,
0.0197906494140625,
-0.08624267578125,
0.035614013671875,
-0.031402587890625,
0.03363037109375,
0.0258636474609375,
0.0004086494445800781,
-0.06488037109375,
-0.00920867919921875,
-0.04266357421875,
-0.08868408203125,
0.09124755859375,
0.03363037109375,
-0.0153656005859375,
0.01006317138671875,
0.00982666015625,
-0.0076446533203125,
-0.006519317626953125,
-0.04339599609375,
-0.01519775390625,
-0.020660400390625,
-0.006397247314453125,
-0.0012369155883789062,
-0.0219268798828125,
0.0081024169921875,
-0.031585693359375,
0.0655517578125,
0.0204925537109375,
0.0229339599609375,
0.053009033203125,
-0.00963592529296875,
-0.010955810546875,
0.0047454833984375,
0.0633544921875,
0.0374755859375,
-0.0272674560546875,
-0.005084991455078125,
-0.0131988525390625,
-0.05218505859375,
-0.0008997917175292969,
0.03564453125,
0.0252685546875,
0.0296478271484375,
0.0450439453125,
0.072021484375,
0.0070037841796875,
-0.04144287109375,
0.040740966796875,
-0.00664520263671875,
-0.0285491943359375,
-0.0574951171875,
-0.003925323486328125,
0.0173492431640625,
0.0391845703125,
0.03363037109375,
0.0009493827819824219,
-0.00028586387634277344,
-0.0191650390625,
0.0303955078125,
0.01367950439453125,
-0.0341796875,
-0.00861358642578125,
0.03131103515625,
0.0330810546875,
-0.06329345703125,
0.06597900390625,
-0.01311492919921875,
-0.060089111328125,
0.0665283203125,
0.032196044921875,
0.0748291015625,
-0.049407958984375,
-0.0115814208984375,
0.0350341796875,
0.025787353515625,
0.015289306640625,
0.0224151611328125,
-0.0450439453125,
-0.05206298828125,
-0.0152130126953125,
-0.0362548828125,
-0.01995849609375,
0.033172607421875,
-0.03131103515625,
0.01338958740234375,
-0.0284271240234375,
-0.01007843017578125,
0.028839111328125,
0.01154327392578125,
0.00490570068359375,
-0.003185272216796875,
0.0272674560546875,
0.06573486328125,
-0.05328369140625,
0.04681396484375,
0.047760009765625,
-0.0074615478515625,
-0.039886474609375,
0.0157928466796875,
-0.005153656005859375,
-0.03240966796875,
0.01242828369140625,
0.0171356201171875,
-0.0125732421875,
-0.0041046142578125,
-0.01239013671875,
-0.04315185546875,
0.0655517578125,
0.021728515625,
-0.06805419921875,
0.02166748046875,
-0.018951416015625,
0.01142120361328125,
-0.0139617919921875,
0.0207061767578125,
0.0654296875,
0.056610107421875,
0.0024089813232421875,
-0.1029052734375,
-0.018341064453125,
-0.035064697265625,
-0.045806884765625,
0.026580810546875,
-0.0631103515625,
0.0650634765625,
-0.00531005859375,
-0.00848388671875,
0.0300140380859375,
0.048004150390625,
0.027923583984375,
0.05029296875,
0.0609130859375,
0.043853759765625,
0.06036376953125,
-0.0101776123046875,
0.017669677734375,
-0.01194000244140625,
0.009490966796875,
0.089599609375,
0.01873779296875,
0.043975830078125,
0.028839111328125,
-0.03057861328125,
0.03851318359375,
0.0689697265625,
-0.02362060546875,
0.0380859375,
0.01496124267578125,
-0.02191162109375,
-0.0205078125,
-0.0026493072509765625,
-0.053619384765625,
0.057281494140625,
0.03778076171875,
-0.0124359130859375,
0.021881103515625,
-0.0166778564453125,
-0.01428985595703125,
0.0019741058349609375,
-0.0111846923828125,
0.0260162353515625,
0.0118255615234375,
-0.035369873046875,
0.04730224609375,
-0.0179443359375,
0.059356689453125,
-0.04150390625,
-0.00273895263671875,
-0.0001678466796875,
0.01284027099609375,
-0.03857421875,
-0.046905517578125,
0.004650115966796875,
-0.027252197265625,
-0.0020542144775390625,
0.006519317626953125,
0.05023193359375,
-0.07073974609375,
0.0124969482421875,
0.0312042236328125,
0.00206756591796875,
0.0355224609375,
-0.0020275115966796875,
-0.053253173828125,
0.0122528076171875,
0.01702880859375,
-0.01001739501953125,
0.004451751708984375,
0.0147705078125,
0.01168060302734375,
0.0204620361328125,
0.04339599609375,
0.0203399658203125,
0.01837158203125,
0.013458251953125,
0.03973388671875,
-0.031646728515625,
-0.07830810546875,
-0.06591796875,
0.013946533203125,
-0.02764892578125,
-0.03485107421875,
0.060577392578125,
0.053192138671875,
0.0643310546875,
0.0202789306640625,
0.041412353515625,
0.008056640625,
0.0300750732421875,
-0.036834716796875,
0.06427001953125,
-0.034820556640625,
0.019805908203125,
-0.05181884765625,
-0.060272216796875,
-0.0121307373046875,
0.050506591796875,
-0.013458251953125,
0.00499725341796875,
0.03985595703125,
0.055694580078125,
-0.01192474365234375,
0.0083770751953125,
0.017822265625,
0.02337646484375,
0.02783203125,
0.042266845703125,
0.08282470703125,
-0.027862548828125,
0.0478515625,
-0.01580810546875,
-0.015045166015625,
-0.0204010009765625,
-0.03277587890625,
-0.0565185546875,
-0.059051513671875,
-0.044586181640625,
-0.0170440673828125,
0.00879669189453125,
0.0640869140625,
0.0672607421875,
-0.05596923828125,
-0.0330810546875,
-0.00969696044921875,
0.0256805419921875,
-0.027069091796875,
-0.0166168212890625,
0.055908203125,
0.00681304931640625,
-0.058807373046875,
0.0606689453125,
0.0277099609375,
0.01512908935546875,
0.0027179718017578125,
-0.01271820068359375,
-0.04901123046875,
-0.00638580322265625,
-0.01114654541015625,
0.050384521484375,
-0.046539306640625,
0.003429412841796875,
-0.045166015625,
0.019989013671875,
0.0254974365234375,
0.05242919921875,
-0.037322998046875,
0.051239013671875,
0.0577392578125,
0.0250244140625,
0.052825927734375,
-0.0075225830078125,
0.0205535888671875,
-0.047088623046875,
0.036773681640625,
0.0223388671875,
0.02392578125,
0.0472412109375,
-0.0158538818359375,
0.006221771240234375,
0.038360595703125,
-0.054473876953125,
-0.0794677734375,
-0.006011962890625,
-0.06591796875,
-0.019805908203125,
0.07464599609375,
-0.0215606689453125,
-0.002117156982421875,
-0.023284912109375,
-0.018707275390625,
0.04290771484375,
-0.035491943359375,
0.04803466796875,
0.05792236328125,
-0.025421142578125,
0.0028362274169921875,
-0.02288818359375,
0.04510498046875,
0.038330078125,
-0.03912353515625,
0.01200103759765625,
0.0128021240234375,
0.00893402099609375,
0.027130126953125,
0.05352783203125,
0.0012454986572265625,
0.0396728515625,
0.04217529296875,
0.01029205322265625,
-0.0283660888671875,
-0.029541015625,
-0.038665771484375,
-0.01910400390625,
-0.007480621337890625,
-0.0604248046875
]
] |
baichuan-inc/Baichuan-7B | 2023-07-19T07:00:20.000Z | [
"transformers",
"pytorch",
"baichuan",
"text-generation",
"custom_code",
"zh",
"en",
"arxiv:1910.07467",
"arxiv:2009.03300",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | baichuan-inc | null | null | baichuan-inc/Baichuan-7B | 792 | 13,611 | transformers | 2023-06-13T07:47:16 | ---
language:
- zh
- en
pipeline_tag: text-generation
inference: false
---
# Baichuan-7B
<!-- Provide a quick summary of what the model is/does. -->
Baichuan-7B是由百川智能开发的一个开源的大规模预训练模型。基于Transformer结构,在大约1.2万亿tokens上训练的70亿参数模型,支持中英双语,上下文窗口长度为4096。在标准的中文和英文权威benchmark(C-EVAL/MMLU)上均取得同尺寸最好的效果。
如果希望使用Baichuan-7B(如进行推理、Finetune等),我们推荐使用配套代码库[Baichuan-7B](https://github.com/baichuan-inc/Baichuan-7B)。
Baichuan-7B is an open-source large-scale pre-trained model developed by Baichuan Intelligent Technology. Based on the Transformer architecture, it is a model with 7 billion parameters trained on approximately 1.2 trillion tokens. It supports both Chinese and English, with a context window length of 4096. It achieves the best performance of its size on standard Chinese and English authoritative benchmarks (C-EVAL/MMLU).
If you wish to use Baichuan-7B (for inference, finetuning, etc.), we recommend using the accompanying code library [Baichuan-7B](https://github.com/baichuan-inc/Baichuan-7B).
## Why use Baichuan-7B
- 在同尺寸模型中Baichuan-7B达到了目前SOTA的水平,参考下面MMLU指标
- Baichuan-7B使用自有的中英文双语语料进行训练,在中文上进行优化,在C-Eval达到SOTA水平
- 不同于LLaMA完全禁止商业使用,Baichuan-7B使用更宽松的开源协议,允许用于商业目的
- Among models of the same size, Baichuan-7B has achieved the current state-of-the-art (SOTA) level, as evidenced by the following MMLU metrics.
- Baichuan-7B is trained on proprietary bilingual Chinese-English corpora, optimized for Chinese, and achieves SOTA performance on C-Eval.
- Unlike LLaMA, which completely prohibits commercial use, Baichuan-7B employs a more lenient open-source license, allowing for commercial purposes.
## How to Get Started with the Model
如下是一个使用Baichuan-7B进行1-shot推理的任务,根据作品给出作者名,正确输出为"夜雨寄北->李商隐"
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan-7B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-7B", device_map="auto", trust_remote_code=True)
inputs = tokenizer('登鹳雀楼->王之涣\n夜雨寄北->', return_tensors='pt')
inputs = inputs.to('cuda:0')
pred = model.generate(**inputs, max_new_tokens=64, repetition_penalty=1.1)
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
```
The following is a 1-shot inference task using Baichuan-7B: given a work, the model outputs its author's name. The correct output is "One Hundred Years of Solitude->Gabriel Garcia Marquez".
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan-7B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-7B", device_map="auto", trust_remote_code=True)
inputs = tokenizer('Hamlet->Shakespeare\nOne Hundred Years of Solitude->', return_tensors='pt')
inputs = inputs.to('cuda:0')
pred = model.generate(**inputs, max_new_tokens=64, repetition_penalty=1.1)
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
```
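The `work->author` prompt used in both snippets above follows a simple few-shot pattern. A hedged sketch of assembling it programmatically (the helper name is illustrative, not part of the model's API):

```python
def build_prompt(examples, query):
    """Format (work, author) pairs and a final query into a work->author prompt."""
    lines = [f"{work}->{author}" for work, author in examples]
    lines.append(f"{query}->")  # the model is expected to complete the author
    return "\n".join(lines)

prompt = build_prompt([("Hamlet", "Shakespeare")], "One Hundred Years of Solitude")
print(prompt)
# Hamlet->Shakespeare
# One Hundred Years of Solitude->
```

The same helper reproduces the Chinese prompt: `build_prompt([("登鹳雀楼", "王之涣")], "夜雨寄北")`.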
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** 百川智能(Baichuan Intelligent Technology)
- **Email**: opensource@baichuan-inc.com
- **Language(s) (NLP):** Chinese/English
- **License:** [Baichuan-7B License](https://huggingface.co/baichuan-inc/Baichuan-7B/blob/main/baichuan-7B%20%E6%A8%A1%E5%9E%8B%E8%AE%B8%E5%8F%AF%E5%8D%8F%E8%AE%AE.pdf)
### Model Sources
<!-- Provide the basic links for the model. -->
整体模型基于标准的Transformer结构,我们采用了和LLaMA一样的模型设计
- **Position Embedding**:采用rotary-embedding,是现阶段被大多数模型采用的位置编码方案,具有很好的外推性。
- **Feedforward Layer**:采用SwiGLU,Feedforward变化为(8/3)倍的隐含层大小,即11008。
- **Layer Normalization**: 基于[RMSNorm](https://arxiv.org/abs/1910.07467)的Pre-Normalization。
具体参数见下表:
| Hyperparameter  | Value      |
|-----------------|------------|
| n_parameters    | 7000559616 |
| n_layers        | 32         |
| n_heads         | 32         |
| d_model         | 4096       |
| vocab size      | 64000      |
| sequence length | 4096       |
The overall model is based on the standard Transformer structure, and we have adopted the same model design as LLaMA:
- Position Embedding: We use rotary-embedding, which is the position encoding scheme adopted by most models at this stage, and it has excellent extrapolation capabilities.
- Feedforward Layer: We use SwiGLU. The feedforward changes to (8/3) times the size of the hidden layer, that is, 11008.
- Layer Normalization: Pre-Normalization based on [RMSNorm](https://arxiv.org/abs/1910.07467).
The specific parameters are as follows:
| Hyperparameter  | Value      |
|-----------------|------------|
| n_parameters    | 7000559616 |
| n_layers        | 32         |
| n_heads         | 32         |
| d_model         | 4096       |
| vocab size      | 64000      |
| sequence length | 4096       |
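The hyperparameters above can be cross-checked against the reported parameter count. A sketch assuming a standard LLaMA-style layout — untied input/output embeddings, Q/K/V/O attention projections, a three-matrix SwiGLU feedforward, and RMSNorm weight vectors (two per layer plus a final one):

```python
d_model, n_layers, vocab, d_ff = 4096, 32, 64000, 11008

embed = vocab * d_model        # input token embeddings
lm_head = vocab * d_model      # untied output projection
attn = 4 * d_model * d_model   # Q, K, V, O projections
mlp = 3 * d_model * d_ff       # SwiGLU: gate, up, down matrices
norms = 2 * d_model            # two RMSNorm weight vectors per layer
per_layer = attn + mlp + norms

total = embed + lm_head + n_layers * per_layer + d_model  # + final RMSNorm
print(total)  # 7000559616 — matches n_parameters in the table above
```

That the sum lands exactly on 7000559616 confirms the (8/3)× hidden size of 11008 and the untied-embedding assumption.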
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Downstream Use
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
我们同时开源出了和本模型配套的训练代码,允许进行高效的Finetune用于下游任务,具体参见[Baichuan-7B](https://github.com/baichuan-inc/Baichuan-7B)。
We have also open-sourced the training code that accompanies this model, allowing for efficient finetuning for downstream tasks. For more details, please refer to [Baichuan-7B](https://github.com/baichuan-inc/Baichuan-7B).
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
在没有充分评估风险和采取缓解措施的情况下投入生产使用;任何可能被视为不负责任或有害的使用案例。
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Baichuan-7B可能会产生事实上不正确的输出,不应依赖它产生事实上准确的信息。Baichuan-7B是在各种公共数据集上进行训练的。尽管我们已经做出了巨大的努力来清洗预训练数据,但这个模型可能会生成淫秽、偏见或其他冒犯性的输出。
Baichuan-7B can produce factually incorrect output, and should not be relied on to produce factually accurate information. Baichuan-7B was trained on various public datasets. While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased or otherwise offensive outputs.
## Training Details
训练具体设置参见[Baichuan-7B](https://github.com/baichuan-inc/Baichuan-7B)。
For specific training settings, please refer to [Baichuan-7B](https://github.com/baichuan-inc/Baichuan-7B).
## Evaluation
### 中文评测 (Chinese Benchmarks)
#### C-Eval
[C-Eval数据集](https://cevalbenchmark.com/index.html)是一个全面的中文基础模型评测数据集,涵盖了52个学科和四个难度的级别。我们使用该数据集的dev集作为few-shot的来源,在test集上进行了5-shot测试。
The [C-Eval dataset](https://cevalbenchmark.com/index.html) is a comprehensive Chinese evaluation benchmark for foundation models, covering 52 subjects across four difficulty levels. We use its dev split as the few-shot source and report 5-shot results on the test split.
| Model 5-shot | Average | Avg(Hard) | STEM | Social Sciences | Humanities | Others |
|-----------------------------|---------|-----------|------|-----------------|------------|--------|
| GPT-4 | 68.7 | 54.9 | 67.1 | 77.6 | 64.5 | 67.8 |
| ChatGPT | 54.4 | 41.4 | 52.9 | 61.8 | 50.9 | 53.6 |
| Claude-v1.3 | 54.2 | 39.0 | 51.9 | 61.7 | 52.1 | 53.7 |
| Claude-instant-v1.0 | 45.9 | 35.5 | 43.1 | 53.8 | 44.2 | 45.4 |
| moss-moon-003-base (16B) | 27.4 | 24.5 | 27.0 | 29.1 | 27.2 | 26.9 |
| Ziya-LLaMA-13B-pretrain | 30.2 | 22.7 | 27.7 | 34.4 | 32.0 | 28.9 |
| LLaMA-7B-hf | 27.1 | 25.9 | 27.1 | 26.8 | 27.9 | 26.3 |
| ChatGLM-6B | 34.5 | 23.1 | 30.4 | 39.6 | 37.4 | 34.5 |
| Falcon-7B | 25.8 | 24.3 | 25.8 | 26.0 | 25.8 | 25.6 |
| Open-LLaMA-v2-pretrain (7B) | 24.0 | 22.5 | 23.1 | 25.3 | 25.2 | 23.2 |
| TigerBot-7B-base | 25.7 | 27.0 | 27.3 | 24.7 | 23.4 | 26.1 |
| Aquila-7B<sup>*</sup> | 25.5 | 25.2 | 25.6 | 24.6 | 25.2 | 26.6 |
| BLOOM-7B | 22.8 | 20.2 | 21.8 | 23.3 | 23.9 | 23.3 |
| BLOOMZ-7B | 35.7 | 25.8 | 31.3 | 43.5 | 36.6 | 35.6 |
| **Baichuan-7B** | 42.8 | 31.5 | 38.2 | 52.0 | 46.2 | 39.3 |
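The 5-shot protocol described above — dev-set examples as demonstrations, test question last — can be sketched as follows. The formatting is illustrative; the exact evaluation harness may differ:

```python
# Illustrative 5-shot multiple-choice prompt builder (the "Answer:" layout
# is an assumption, not the exact harness used for the reported scores).
def five_shot_prompt(dev_examples, question, choices):
    """Build a few-shot prompt: up to 5 dev demonstrations, then the query."""
    def fmt(q, opts):
        lettered = "\n".join(f"{k}. {v}" for k, v in zip("ABCD", opts))
        return f"{q}\n{lettered}\nAnswer:"
    parts = [fmt(q, opts) + f" {ans}" for q, opts, ans in dev_examples[:5]]
    parts.append(fmt(question, choices))  # model completes the final answer
    return "\n\n".join(parts)

prompt = five_shot_prompt(
    [("1+1=?", ["1", "2", "3", "4"], "B")],
    "2+2=?", ["2", "3", "4", "5"],
)
```

The resulting string ends with a bare `Answer:`, leaving the letter of the correct choice for the model to generate.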
#### Gaokao
[Gaokao](https://github.com/ExpressAI/AI-Gaokao) 是一个以中国高考题作为评测大语言模型能力的数据集,用以评估模型的语言能力和逻辑推理能力。我们只保留了其中的单项选择题,并对所有模型进行统一5-shot测试。以下是测试的结果。
[Gaokao](https://github.com/ExpressAI/AI-Gaokao) is a dataset built from questions of the Chinese college entrance examination (Gaokao), used to evaluate a model's language ability and logical reasoning. We kept only the single-answer multiple-choice questions and ran a unified 5-shot test on all models. The results are shown below.
| Model | Average |
|-------------------------|-----------------|
| Open-LLaMA-v2-pretrain | 21.41 |
| Ziya-LLaMA-13B-pretrain | 23.17 |
| Falcon-7B | 23.98 |
| TigerBot-7B-base | 25.94 |
| LLaMA-7B | 27.81 |
| ChatGLM-6B | 21.41 |
| BLOOM-7B | 26.96 |
| BLOOMZ-7B | 28.72 |
| Aquila-7B<sup>*</sup> | 24.39 |
| **Baichuan-7B** | **36.24** |
#### AGIEval
[AGIEval](https://github.com/microsoft/AGIEval) 旨在评估模型的认知和解决问题相关的任务中的一般能力。我们只保留了其中的四选一单项选择题,随机划分后对所有模型进行了统一5-shot测试。
[AGIEval](https://github.com/microsoft/AGIEval) is designed to evaluate a model's general abilities on cognition- and problem-solving-related tasks. We kept only the four-option single-answer multiple-choice questions and, after a random split, ran a unified 5-shot test on all models.
| Model | Average |
|-------------------------|-----------------|
| Open-LLaMA-v2-pretrain | 23.49 |
| Ziya-LLaMA-13B-pretrain | 27.64 |
| Falcon-7B | 27.18 |
| TigerBot-7B-base | 25.19 |
| LLaMA-7B | 28.17 |
| ChatGLM-6B | 23.49 |
| BLOOM-7B | 26.55 |
| BLOOMZ-7B | 30.27 |
| Aquila-7B<sup>*</sup> | 25.58 |
| **Baichuan-7B** | **34.44** |
<sup>*</sup>其中Aquila模型来源于[智源官方网站](https://model.baai.ac.cn/model-detail/100098),仅做参考
<sup>*</sup>The Aquila results are taken from the [official BAAI website](https://model.baai.ac.cn/model-detail/100098) and are provided for reference only.
### English Leaderboard
In addition to Chinese, we also tested the model's performance in English.
#### MMLU
[MMLU](https://arxiv.org/abs/2009.03300) is an English evaluation dataset that includes 57 multiple-choice tasks, covering elementary mathematics, American history, computer science, law, etc. The difficulty ranges from high school level to expert level, making it a mainstream LLM evaluation dataset.
We adopted the [open-source](https://github.com/hendrycks/test) evaluation scheme, and the final 5-shot results are as follows:
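A common way such harnesses score each question is to compare the model's next-token likelihoods for the four choice letters and predict the argmax. The following is a simplified sketch, where `letter_logits` is a stand-in for whatever logits the model assigns to the tokens "A"–"D" (not an excerpt of the actual harness):

```python
def predict_choice(letter_logits):
    """Pick the answer letter with the highest logit among A-D.

    `letter_logits` maps each choice letter to the model's logit for that
    letter as the next token (a stand-in for real model output here).
    """
    return max("ABCD", key=lambda letter: letter_logits[letter])

def accuracy(predictions, gold):
    """Fraction of questions answered correctly."""
    correct = sum(p == g for p, g in zip(predictions, gold))
    return correct / len(gold)
```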
| Model | Humanities | Social Sciences | STEM | Other | Average |
|----------------------------------------|:----------:|:---------------:|:----:|:-----:|:-------:|
| LLaMA-7B<sup>2</sup> | 34.0 | 38.3 | 30.5 | 38.1 | 35.1 |
| Falcon-7B<sup>1</sup> | - | - | - | - | 35.0 |
| mpt-7B<sup>1</sup> | - | - | - | - | 35.6 |
| ChatGLM-6B<sup>0</sup> | 35.4 | 41.0 | 31.3 | 40.5 | 36.9 |
| BLOOM 7B<sup>0</sup> | 25.0 | 24.4 | 26.5 | 26.4 | 25.5 |
| BLOOMZ 7B<sup>0</sup> | 31.3 | 42.1 | 34.4 | 39.0 | 36.1 |
| moss-moon-003-base (16B)<sup>0</sup> | 24.2 | 22.8 | 22.4 | 24.4 | 23.6 |
| moss-moon-003-sft (16B)<sup>0</sup> | 30.5 | 33.8 | 29.3 | 34.4 | 31.9 |
| **Baichuan-7B<sup>0</sup>** | 38.4 | 48.9 | 35.6 | 48.1 | 42.3 |
The superscript in the Model column indicates the source of the results.
```
0:reimplemented
1:https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
2:https://paperswithcode.com/sota/multi-task-language-understanding-on-mmlu
```
## Our Group

| 11,725 | [
[
-0.0294647216796875,
-0.0516357421875,
0.0058746337890625,
0.037322998046875,
-0.02093505859375,
-0.020233154296875,
-0.01265716552734375,
-0.0270233154296875,
0.0035076141357421875,
0.0262603759765625,
-0.033538818359375,
-0.043548583984375,
-0.045745849609375,
0.0030384063720703125,
-0.022613525390625,
0.06500244140625,
0.004634857177734375,
0.01084136962890625,
0.023101806640625,
-0.00015723705291748047,
-0.0435791015625,
-0.0223541259765625,
-0.051910400390625,
-0.0247802734375,
0.019500732421875,
0.013427734375,
0.052703857421875,
0.059173583984375,
0.043487548828125,
0.0232696533203125,
-0.01763916015625,
0.01084136962890625,
-0.040496826171875,
-0.0213470458984375,
0.01317596435546875,
-0.037628173828125,
-0.044342041015625,
-0.00260162353515625,
0.040771484375,
0.031402587890625,
0.0005202293395996094,
0.0259552001953125,
0.01136016845703125,
0.0303955078125,
-0.031890869140625,
0.017364501953125,
-0.032257080078125,
0.0080108642578125,
-0.0182952880859375,
0.00746917724609375,
-0.024810791015625,
-0.022064208984375,
-0.006954193115234375,
-0.0423583984375,
0.029388427734375,
0.00603485107421875,
0.10455322265625,
0.01025390625,
-0.0271759033203125,
0.005855560302734375,
-0.03961181640625,
0.0706787109375,
-0.0809326171875,
0.017822265625,
0.025299072265625,
0.0290374755859375,
-0.0070343017578125,
-0.06658935546875,
-0.052520751953125,
-0.020660400390625,
-0.0260009765625,
0.029205322265625,
-0.0013904571533203125,
-0.003551483154296875,
0.029449462890625,
0.033294677734375,
-0.03741455078125,
0.0027828216552734375,
-0.045501708984375,
-0.011199951171875,
0.05975341796875,
0.017608642578125,
0.013031005859375,
-0.036224365234375,
-0.04193115234375,
-0.00579071044921875,
-0.0379638671875,
0.0287322998046875,
0.0159454345703125,
0.00861358642578125,
-0.02862548828125,
0.0198822021484375,
-0.0164794921875,
0.040069580078125,
0.027679443359375,
-0.0171661376953125,
0.0345458984375,
-0.028411865234375,
-0.031982421875,
-0.00400543212890625,
0.08416748046875,
0.024200439453125,
-0.0039215087890625,
0.00490570068359375,
-0.0255279541015625,
-0.0086822509765625,
0.0036106109619140625,
-0.0606689453125,
-0.0213775634765625,
0.0276031494140625,
-0.05535888671875,
-0.03173828125,
0.02001953125,
-0.043914794921875,
-0.002285003662109375,
-0.01531982421875,
0.04095458984375,
-0.0360107421875,
-0.0280609130859375,
0.005390167236328125,
-0.003650665283203125,
0.0352783203125,
0.0142059326171875,
-0.062042236328125,
0.014862060546875,
0.034423828125,
0.0634765625,
-0.007572174072265625,
-0.04144287109375,
-0.009857177734375,
0.0032024383544921875,
-0.02264404296875,
0.041534423828125,
-0.01079559326171875,
-0.032684326171875,
-0.01557159423828125,
0.031280517578125,
-0.007640838623046875,
-0.03375244140625,
0.034454345703125,
-0.02935791015625,
0.00806427001953125,
-0.021728515625,
-0.0250396728515625,
-0.033782958984375,
0.01800537109375,
-0.05181884765625,
0.083740234375,
0.004802703857421875,
-0.0604248046875,
0.00550079345703125,
-0.04132080078125,
-0.0194549560546875,
-0.0122528076171875,
-0.0009613037109375,
-0.041656494140625,
-0.0215606689453125,
0.020965576171875,
0.0377197265625,
-0.032623291015625,
0.0251922607421875,
-0.014739990234375,
-0.032470703125,
0.01202392578125,
-0.043731689453125,
0.09423828125,
0.027374267578125,
-0.05035400390625,
0.0144195556640625,
-0.06451416015625,
-0.01261138916015625,
0.03021240234375,
-0.01470947265625,
0.007755279541015625,
-0.024444580078125,
0.00698089599609375,
0.01727294921875,
0.03326416015625,
-0.0323486328125,
0.00432586669921875,
-0.03570556640625,
0.040863037109375,
0.06396484375,
0.0010824203491210938,
0.0228424072265625,
-0.0430908203125,
0.037078857421875,
0.0240631103515625,
0.040283203125,
-0.017791748046875,
-0.03570556640625,
-0.0806884765625,
-0.01311492919921875,
0.0267791748046875,
0.04522705078125,
-0.03887939453125,
0.05169677734375,
-0.00641632080078125,
-0.06512451171875,
-0.044586181640625,
0.0033435821533203125,
0.0311737060546875,
0.047943115234375,
0.0421142578125,
-0.004917144775390625,
-0.037933349609375,
-0.0626220703125,
0.01318359375,
0.0018033981323242188,
-0.0006871223449707031,
0.01454925537109375,
0.0498046875,
-0.0225982666015625,
0.05340576171875,
-0.03424072265625,
-0.0261383056640625,
-0.0209808349609375,
-0.00628662109375,
0.049560546875,
0.0537109375,
0.05389404296875,
-0.0531005859375,
-0.039825439453125,
0.0028400421142578125,
-0.06951904296875,
0.0024127960205078125,
-0.0132598876953125,
-0.035430908203125,
0.00577545166015625,
0.0069732666015625,
-0.049957275390625,
0.03912353515625,
0.036590576171875,
-0.01214599609375,
0.062744140625,
-0.01078033447265625,
0.0115203857421875,
-0.09027099609375,
0.0119171142578125,
-0.008819580078125,
0.004474639892578125,
-0.03924560546875,
0.00959014892578125,
0.016326904296875,
0.006389617919921875,
-0.053131103515625,
0.045806884765625,
-0.04046630859375,
0.02337646484375,
-0.007015228271484375,
0.00542449951171875,
0.009979248046875,
0.049041748046875,
-0.002048492431640625,
0.04278564453125,
0.051239013671875,
-0.051483154296875,
0.046844482421875,
0.0264739990234375,
-0.0198822021484375,
0.0033321380615234375,
-0.06182861328125,
-0.0018634796142578125,
0.0004508495330810547,
0.0206146240234375,
-0.060211181640625,
-0.006000518798828125,
0.0294647216796875,
-0.039581298828125,
0.01363372802734375,
-0.006389617919921875,
-0.01531219482421875,
-0.043792724609375,
-0.0338134765625,
0.0273895263671875,
0.04400634765625,
-0.0445556640625,
0.041259765625,
0.00627899169921875,
0.01140594482421875,
-0.05712890625,
-0.062164306640625,
-0.021270751953125,
-0.016448974609375,
-0.0677490234375,
0.034271240234375,
-0.00565338134765625,
0.008209228515625,
-0.01093292236328125,
0.0012607574462890625,
-0.004222869873046875,
-0.0011796951293945312,
0.0215606689453125,
0.04351806640625,
-0.0271453857421875,
-0.01263427734375,
0.0018873214721679688,
-0.0059356689453125,
0.00234222412109375,
-0.0067138671875,
0.04327392578125,
-0.0176849365234375,
-0.0166473388671875,
-0.045501708984375,
0.007190704345703125,
0.026275634765625,
-0.0260467529296875,
0.05804443359375,
0.061248779296875,
-0.033599853515625,
-0.00838470458984375,
-0.042633056640625,
-0.004772186279296875,
-0.040802001953125,
0.0240325927734375,
-0.03204345703125,
-0.0355224609375,
0.06317138671875,
0.0171966552734375,
0.0140228271484375,
0.064208984375,
0.042205810546875,
0.001857757568359375,
0.07073974609375,
0.0307159423828125,
-0.01357269287109375,
0.0343017578125,
-0.0689697265625,
0.005573272705078125,
-0.0640869140625,
-0.030487060546875,
-0.031829833984375,
-0.0271453857421875,
-0.045654296875,
-0.017791748046875,
0.026275634765625,
0.01104736328125,
-0.048095703125,
0.04498291015625,
-0.046905517578125,
0.0098419189453125,
0.0570068359375,
0.025238037109375,
0.00896453857421875,
-0.020965576171875,
-0.00815582275390625,
0.01480865478515625,
-0.050018310546875,
-0.036285400390625,
0.0721435546875,
0.036468505859375,
0.0643310546875,
0.005001068115234375,
0.049774169921875,
0.00516510009765625,
0.017120361328125,
-0.05096435546875,
0.030029296875,
-0.01183319091796875,
-0.05230712890625,
-0.02142333984375,
-0.022216796875,
-0.074951171875,
0.025146484375,
-0.0080413818359375,
-0.045196533203125,
0.021270751953125,
0.01114654541015625,
-0.04461669921875,
0.0235137939453125,
-0.06085205078125,
0.07525634765625,
-0.036407470703125,
-0.031585693359375,
0.00684356689453125,
-0.054351806640625,
0.039459228515625,
0.006866455078125,
0.01187896728515625,
0.0026645660400390625,
0.022796630859375,
0.062042236328125,
-0.041961669921875,
0.05474853515625,
-0.022064208984375,
0.001556396484375,
0.040863037109375,
-0.0017061233520507812,
0.03851318359375,
0.009735107421875,
-0.0126495361328125,
0.04315185546875,
0.0106964111328125,
-0.03826904296875,
-0.0291595458984375,
0.0498046875,
-0.073486328125,
-0.054443359375,
-0.036224365234375,
-0.031005859375,
0.017578125,
0.033050537109375,
0.051422119140625,
0.019805908203125,
0.0120697021484375,
0.028839111328125,
0.034149169921875,
-0.02642822265625,
0.04779052734375,
0.026885986328125,
-0.010467529296875,
-0.0312347412109375,
0.058563232421875,
0.01934814453125,
0.017120361328125,
0.0234222412109375,
0.0203094482421875,
-0.018829345703125,
-0.0291595458984375,
-0.033538818359375,
0.0234222412109375,
-0.047882080078125,
-0.0288848876953125,
-0.039398193359375,
-0.046722412109375,
-0.0599365234375,
-0.0038776397705078125,
-0.0271453857421875,
-0.00894927978515625,
-0.034393310546875,
-0.016021728515625,
0.01479339599609375,
0.032318115234375,
0.005786895751953125,
0.0352783203125,
-0.06658935546875,
0.0291595458984375,
0.0142059326171875,
0.01432037353515625,
0.0164642333984375,
-0.06585693359375,
-0.034332275390625,
0.0221710205078125,
-0.04620361328125,
-0.06170654296875,
0.040863037109375,
-0.0005283355712890625,
0.051910400390625,
0.0535888671875,
0.0065460205078125,
0.051422119140625,
-0.02886962890625,
0.0787353515625,
0.0192413330078125,
-0.069091796875,
0.046417236328125,
-0.0201416015625,
0.010345458984375,
0.01134490966796875,
0.031646728515625,
-0.0270843505859375,
-0.01392364501953125,
-0.05059814453125,
-0.057708740234375,
0.06365966796875,
0.0222015380859375,
0.0048370361328125,
0.007762908935546875,
0.01113128662109375,
0.004428863525390625,
0.0017986297607421875,
-0.0738525390625,
-0.037841796875,
-0.047271728515625,
-0.01276397705078125,
0.0013580322265625,
-0.00514984130859375,
-0.00411224365234375,
-0.03472900390625,
0.061309814453125,
0.0094146728515625,
0.035125732421875,
0.0181732177734375,
-0.0143585205078125,
0.00848388671875,
-0.0102081298828125,
0.03717041015625,
0.04119873046875,
-0.0185089111328125,
0.00024700164794921875,
0.027740478515625,
-0.0477294921875,
0.0013074874877929688,
0.01506805419921875,
-0.030914306640625,
0.00023353099822998047,
0.0352783203125,
0.0740966796875,
-0.0026702880859375,
-0.0279541015625,
0.04095458984375,
0.0064239501953125,
-0.00826263427734375,
-0.034393310546875,
0.022247314453125,
-0.0012884140014648438,
0.02252197265625,
0.02496337890625,
-0.0017213821411132812,
0.0094757080078125,
-0.0260772705078125,
-0.0007395744323730469,
0.01233673095703125,
-0.0146026611328125,
-0.023681640625,
0.07244873046875,
0.01015472412109375,
-0.0145721435546875,
0.035491943359375,
-0.01056671142578125,
-0.03900146484375,
0.0738525390625,
0.048309326171875,
0.058197021484375,
-0.024444580078125,
0.0032863616943359375,
0.053192138671875,
0.02862548828125,
-0.01180267333984375,
0.01088714599609375,
0.00844573974609375,
-0.047515869140625,
-0.0038299560546875,
-0.052276611328125,
-0.00205230712890625,
0.0220184326171875,
-0.0423583984375,
0.04278564453125,
-0.039703369140625,
-0.024810791015625,
-0.0009369850158691406,
0.00922393798828125,
-0.057861328125,
0.036163330078125,
0.0004222393035888672,
0.0758056640625,
-0.05670166015625,
0.06500244140625,
0.036529541015625,
-0.0640869140625,
-0.08087158203125,
-0.010406494140625,
-0.01554107666015625,
-0.0682373046875,
0.049163818359375,
0.0204315185546875,
0.0006837844848632812,
-0.003673553466796875,
-0.051910400390625,
-0.06964111328125,
0.1177978515625,
0.0260467529296875,
-0.03662109375,
-0.006622314453125,
0.00411224365234375,
0.034912109375,
-0.0089111328125,
0.041015625,
0.035400390625,
0.036651611328125,
-0.0032405853271484375,
-0.07159423828125,
0.0196533203125,
-0.0343017578125,
0.0164947509765625,
-0.0192108154296875,
-0.092041015625,
0.0938720703125,
-0.0113372802734375,
-0.006534576416015625,
0.007007598876953125,
0.060028076171875,
0.02227783203125,
0.0209197998046875,
0.033782958984375,
0.030242919921875,
0.042327880859375,
-0.009124755859375,
0.06463623046875,
-0.033294677734375,
0.04876708984375,
0.062744140625,
0.00896453857421875,
0.044647216796875,
0.0065460205078125,
-0.034454345703125,
0.03717041015625,
0.064208984375,
-0.03350830078125,
0.033203125,
-0.00249481201171875,
-0.0110015869140625,
-0.00616455078125,
0.008270263671875,
-0.05487060546875,
0.02435302734375,
0.01428985595703125,
-0.031646728515625,
0.00986480712890625,
-0.004123687744140625,
0.007904052734375,
-0.020599365234375,
-0.0205230712890625,
0.0323486328125,
0.010345458984375,
-0.04266357421875,
0.06951904296875,
0.01480865478515625,
0.06707763671875,
-0.048126220703125,
0.0021820068359375,
-0.0347900390625,
0.00914764404296875,
-0.031707763671875,
-0.0435791015625,
0.0013751983642578125,
-0.00374603271484375,
-0.01457977294921875,
0.0078582763671875,
0.04840087890625,
-0.0215606689453125,
-0.047760009765625,
0.0270843505859375,
0.004772186279296875,
0.01093292236328125,
0.0028362274169921875,
-0.056640625,
-0.00528717041015625,
0.00579071044921875,
-0.037322998046875,
0.0083160400390625,
0.03668212890625,
0.006275177001953125,
0.053924560546875,
0.05029296875,
0.00716400146484375,
0.0271453857421875,
0.005550384521484375,
0.062103271484375,
-0.060394287109375,
-0.035980224609375,
-0.052001953125,
0.053985595703125,
-0.0037212371826171875,
-0.028167724609375,
0.05877685546875,
0.047332763671875,
0.0748291015625,
-0.017608642578125,
0.060791015625,
-0.0160675048828125,
0.0225067138671875,
-0.036865234375,
0.06793212890625,
-0.037628173828125,
0.0118408203125,
-0.0225982666015625,
-0.062469482421875,
-0.00711822509765625,
0.061126708984375,
-0.0180511474609375,
0.0192108154296875,
0.052093505859375,
0.07501220703125,
-0.00611114501953125,
-0.001964569091796875,
0.0190582275390625,
0.03485107421875,
0.025909423828125,
0.056884765625,
0.048980712890625,
-0.0732421875,
0.053741455078125,
-0.06170654296875,
-0.02020263671875,
-0.01558685302734375,
-0.04425048828125,
-0.061248779296875,
-0.034912109375,
-0.00992584228515625,
-0.0289306640625,
-0.0179443359375,
0.06170654296875,
0.054290771484375,
-0.0743408203125,
-0.0305633544921875,
0.004833221435546875,
0.004528045654296875,
-0.03485107421875,
-0.02142333984375,
0.047943115234375,
-0.0183258056640625,
-0.06890869140625,
0.00743865966796875,
-0.00548553466796875,
0.00809478759765625,
-0.021881103515625,
-0.0168609619140625,
-0.02423095703125,
-0.004199981689453125,
0.038726806640625,
0.006786346435546875,
-0.059814453125,
-0.014984130859375,
0.0248870849609375,
-0.01557159423828125,
0.007312774658203125,
0.01447296142578125,
-0.038787841796875,
0.021453857421875,
0.036651611328125,
0.0310821533203125,
0.0439453125,
0.00017380714416503906,
0.0162353515625,
-0.0290985107421875,
0.017364501953125,
0.002613067626953125,
0.037353515625,
0.008392333984375,
-0.0360107421875,
0.038299560546875,
0.0330810546875,
-0.04205322265625,
-0.052703857421875,
-0.01395416259765625,
-0.0855712890625,
-0.013092041015625,
0.09033203125,
-0.0268707275390625,
-0.031280517578125,
0.018707275390625,
-0.034210205078125,
0.0501708984375,
-0.0235748291015625,
0.06597900390625,
0.042633056640625,
-0.003753662109375,
-0.00707244873046875,
-0.041168212890625,
0.01654052734375,
0.0223846435546875,
-0.0509033203125,
-0.0216827392578125,
0.018707275390625,
0.0234527587890625,
0.0166473388671875,
0.031585693359375,
-0.004192352294921875,
0.026275634765625,
0.00811767578125,
0.01296234130859375,
-0.019134521484375,
-0.01093292236328125,
-0.01302337646484375,
-0.005214691162109375,
-0.01302337646484375,
-0.036041259765625
]
] |
Yntec/Dreamshaper8 | 2023-10-17T06:03:44.000Z | [
"diffusers",
"General",
"Anime",
"Art",
"Girl",
"Photorealistic",
"LandScapes",
"Lykon",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Yntec | null | null | Yntec/Dreamshaper8 | 5 | 13,602 | diffusers | 2023-10-11T17:46:03 | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- General
- Anime
- Art
- Girl
- Photorealistic
- LandScapes
- Lykon
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
---
# Dreamshaper 8
Original page:
https://civitai.com/models/4384?modelVersionId=80261
Buy Lykon a coffee:
https://snipfeed.co/lykon
Sample and prompt:

PRETTY CUTE GIRL BY ROSSDRAWS. An extradimensional creature buying donuts. Pixar animation. | 608 | [
[
-0.0080108642578125,
-0.036651611328125,
0.041839599609375,
0.019317626953125,
-0.019775390625,
0.013702392578125,
0.007526397705078125,
-0.0667724609375,
0.0699462890625,
0.049163818359375,
-0.046844482421875,
-0.01751708984375,
-0.035247802734375,
0.020904541015625,
-0.0257415771484375,
0.06109619140625,
0.00872802734375,
-0.006664276123046875,
0.005748748779296875,
0.03643798828125,
-0.028717041015625,
-0.0053558349609375,
-0.08795166015625,
-0.02325439453125,
0.04681396484375,
0.043365478515625,
0.05657958984375,
-0.01256561279296875,
0.0132293701171875,
0.029266357421875,
0.0013599395751953125,
-0.01367950439453125,
-0.037445068359375,
0.024383544921875,
-0.0009751319885253906,
-0.0479736328125,
-0.037200927734375,
-0.006778717041015625,
0.0164947509765625,
0.0239105224609375,
-0.0256195068359375,
0.0030689239501953125,
0.01194000244140625,
0.025787353515625,
-0.02716064453125,
0.0158538818359375,
-0.0215606689453125,
0.0361328125,
0.01287078857421875,
0.0430908203125,
-0.011199951171875,
-0.039794921875,
-0.01702880859375,
-0.08563232421875,
0.016937255859375,
0.0151519775390625,
0.0772705078125,
0.00339508056640625,
-0.013214111328125,
-0.0311431884765625,
-0.0305328369140625,
0.06695556640625,
-0.0273895263671875,
0.02349853515625,
0.036712646484375,
0.04052734375,
-0.023590087890625,
-0.09332275390625,
-0.04046630859375,
0.0133056640625,
0.003173828125,
0.01256561279296875,
-0.052490234375,
-0.0285797119140625,
0.025604248046875,
0.02978515625,
-0.029693603515625,
-0.0273895263671875,
-0.04705810546875,
0.013885498046875,
0.0567626953125,
0.0187225341796875,
0.0511474609375,
-0.0067138671875,
-0.0206451416015625,
-0.008636474609375,
0.0020275115966796875,
0.004474639892578125,
0.047210693359375,
0.01715087890625,
-0.023162841796875,
0.0401611328125,
0.0030460357666015625,
0.046630859375,
0.0225067138671875,
-0.01039886474609375,
0.0309295654296875,
0.01340484619140625,
-0.0213165283203125,
0.0121612548828125,
0.05926513671875,
0.04693603515625,
0.0217132568359375,
0.007476806640625,
-0.0248260498046875,
0.01374053955078125,
0.01751708984375,
-0.054840087890625,
-0.066162109375,
0.02166748046875,
-0.037261962890625,
-0.04083251953125,
-0.01531982421875,
-0.0751953125,
-0.0280609130859375,
0.0214385986328125,
-0.006954193115234375,
-0.04595947265625,
-0.0372314453125,
0.0350341796875,
-0.013580322265625,
-0.005573272705078125,
0.01016998291015625,
-0.0716552734375,
0.025238037109375,
0.0244140625,
0.050445556640625,
0.004730224609375,
-0.003017425537109375,
-0.01666259765625,
-0.0006709098815917969,
-0.05029296875,
0.0665283203125,
-0.02728271484375,
-0.045501708984375,
-0.01285552978515625,
0.02545166015625,
-0.00768280029296875,
-0.035430908203125,
0.0797119140625,
-0.0289764404296875,
-0.005229949951171875,
-0.01024627685546875,
-0.031280517578125,
-0.025421142578125,
0.0127410888671875,
-0.0545654296875,
0.05743408203125,
0.0012683868408203125,
-0.047576904296875,
0.0269927978515625,
-0.05413818359375,
-0.0092620849609375,
0.0235443115234375,
-0.0272064208984375,
-0.02447509765625,
0.043426513671875,
-0.0018434524536132812,
0.0230865478515625,
0.0218963623046875,
-0.0294189453125,
-0.0338134765625,
-0.0026569366455078125,
0.04595947265625,
-0.006404876708984375,
0.06414794921875,
0.060150146484375,
0.0125579833984375,
-0.007450103759765625,
-0.0635986328125,
0.01641845703125,
0.07525634765625,
0.01222991943359375,
-0.0192108154296875,
-0.025634765625,
0.0091705322265625,
0.02117919921875,
0.0294189453125,
-0.04229736328125,
0.00994110107421875,
0.002315521240234375,
0.018310546875,
0.034759521484375,
-0.0005612373352050781,
-0.0152740478515625,
-0.045867919921875,
0.0638427734375,
-0.010040283203125,
0.030792236328125,
0.0078582763671875,
-0.04693603515625,
-0.057373046875,
-0.0255584716796875,
0.004367828369140625,
0.00762176513671875,
-0.0260467529296875,
0.0239715576171875,
0.005290985107421875,
-0.043426513671875,
-0.044158935546875,
-0.02386474609375,
-0.0020732879638671875,
0.002079010009765625,
-0.00807952880859375,
-0.01386260986328125,
-0.03961181640625,
-0.06524658203125,
-0.010589599609375,
0.01763916015625,
-0.0024471282958984375,
0.0194244384765625,
0.0183258056640625,
-0.0072021484375,
0.028289794921875,
-0.072021484375,
-0.01268768310546875,
-0.049224853515625,
-0.02630615234375,
0.07135009765625,
0.0262298583984375,
0.087890625,
-0.05224609375,
-0.07598876953125,
-0.00475311279296875,
-0.0310821533203125,
-0.0014848709106445312,
0.030426025390625,
-0.01428985595703125,
-0.036346435546875,
0.004199981689453125,
-0.07391357421875,
0.03204345703125,
0.01959228515625,
-0.04522705078125,
0.05023193359375,
-0.0246429443359375,
0.039764404296875,
-0.08154296875,
-0.022369384765625,
-0.0146026611328125,
-0.023834228515625,
-0.0347900390625,
0.0194549560546875,
-0.0199127197265625,
-0.0509033203125,
-0.05084228515625,
0.04193115234375,
-0.045867919921875,
-0.003963470458984375,
-0.0292816162109375,
-0.0037326812744140625,
0.036865234375,
0.0161895751953125,
-0.004909515380859375,
0.03680419921875,
0.062286376953125,
-0.0271148681640625,
0.03143310546875,
0.04681396484375,
-0.039581298828125,
0.05419921875,
-0.05413818359375,
0.03253173828125,
-0.01103973388671875,
0.00273895263671875,
-0.05523681640625,
-0.02752685546875,
0.05426025390625,
-0.042144775390625,
-0.0137786865234375,
-0.01971435546875,
-0.04388427734375,
-0.025390625,
-0.0138702392578125,
0.03204345703125,
0.042266845703125,
-0.0489501953125,
0.0175933837890625,
0.0037078857421875,
-0.01410675048828125,
-0.016021728515625,
-0.05145263671875,
0.01401519775390625,
-0.0306854248046875,
-0.031982421875,
0.0273895263671875,
-0.029693603515625,
-0.03369140625,
-0.0020599365234375,
0.010986328125,
-0.000046312808990478516,
-0.035797119140625,
0.02044677734375,
-0.01065826416015625,
-0.0217742919921875,
-0.022369384765625,
-0.00946044921875,
-0.01065826416015625,
0.004123687744140625,
0.00238037109375,
0.051849365234375,
-0.014404296875,
-0.01470947265625,
-0.06939697265625,
0.03521728515625,
0.04058837890625,
0.02142333984375,
0.02459716796875,
0.0333251953125,
-0.051910400390625,
-0.01214599609375,
-0.0269012451171875,
-0.013580322265625,
-0.039306640625,
0.0023975372314453125,
-0.05517578125,
-0.00257110595703125,
0.036773681640625,
0.0264892578125,
-0.0203857421875,
0.042694091796875,
0.00321197509765625,
-0.0258026123046875,
0.069091796875,
0.03515625,
0.003692626953125,
0.032135009765625,
-0.05206298828125,
0.02947998046875,
-0.02557373046875,
-0.0207977294921875,
-0.02301025390625,
-0.042938232421875,
-0.011566162109375,
-0.0151214599609375,
-0.0059814453125,
0.018463134765625,
-0.0197601318359375,
0.034454345703125,
-0.024444580078125,
0.06939697265625,
0.005794525146484375,
0.0239715576171875,
0.0223236083984375,
0.01534271240234375,
-0.0155181884765625,
-0.03167724609375,
-0.0248870849609375,
-0.0216217041015625,
0.01473236083984375,
0.0189056396484375,
0.037200927734375,
0.01708984375,
0.0489501953125,
0.0283966064453125,
-0.0352783203125,
-0.052978515625,
0.06488037109375,
-0.006542205810546875,
-0.04510498046875,
0.005832672119140625,
0.007282257080078125,
-0.07952880859375,
0.0174407958984375,
-0.038604736328125,
-0.07025146484375,
0.01253509521484375,
0.00933837890625,
-0.05023193359375,
0.03631591796875,
-0.05517578125,
0.0687255859375,
-0.04718017578125,
-0.0228271484375,
-0.0269317626953125,
0.01387786865234375,
0.011505126953125,
0.02093505859375,
0.01520538330078125,
-0.047821044921875,
0.00860595703125,
0.04010009765625,
-0.0268096923828125,
0.05096435546875,
0.0084686279296875,
0.005260467529296875,
0.030914306640625,
0.0167999267578125,
0.039154052734375,
0.01248931884765625,
0.0023250579833984375,
0.00870513916015625,
-0.0037059783935546875,
-0.0399169921875,
-0.045440673828125,
0.0733642578125,
-0.061981201171875,
-0.01004791259765625,
-0.043975830078125,
-0.0162811279296875,
-0.001712799072265625,
0.020263671875,
0.049468994140625,
0.007045745849609375,
-0.01690673828125,
0.00464630126953125,
0.0225067138671875,
-0.003826141357421875,
0.040313720703125,
-0.0111846923828125,
-0.0667724609375,
-0.042510986328125,
0.06298828125,
-0.0240478515625,
0.00125885009765625,
0.006839752197265625,
0.054901123046875,
-0.025726318359375,
-0.03387451171875,
-0.00213623046875,
0.032562255859375,
-0.03961181640625,
-0.0125732421875,
-0.0440673828125,
-0.0239715576171875,
-0.0216827392578125,
-0.03765869140625,
-0.037841796875,
-0.0257720947265625,
-0.042205810546875,
-0.030609130859375,
0.06719970703125,
0.070068359375,
-0.001247406005859375,
0.034210205078125,
-0.050506591796875,
0.020965576171875,
0.03570556640625,
0.006748199462890625,
0.00499725341796875,
-0.0212249755859375,
0.001922607421875,
-0.01120758056640625,
-0.035125732421875,
-0.05462646484375,
0.06109619140625,
-0.0181732177734375,
0.03936767578125,
0.04443359375,
-0.0002918243408203125,
0.038604736328125,
-0.0267181396484375,
0.058807373046875,
0.0821533203125,
-0.029937744140625,
0.033599853515625,
-0.040802001953125,
0.01325225830078125,
0.024871826171875,
0.018157958984375,
-0.055633544921875,
0.0003566741943359375,
-0.07086181640625,
-0.056640625,
0.0300445556640625,
0.035614013671875,
0.0203857421875,
0.006771087646484375,
0.0200042724609375,
0.01751708984375,
0.026397705078125,
-0.0572509765625,
-0.01922607421875,
-0.02789306640625,
-0.0294342041015625,
-0.0001417398452758789,
-0.0297393798828125,
0.0228729248046875,
-0.0165557861328125,
0.04193115234375,
-0.0009546279907226562,
0.024322509765625,
-0.01036834716796875,
0.01515960693359375,
-0.00580596923828125,
0.00548553466796875,
0.0282135009765625,
0.048675537109375,
-0.027862548828125,
-0.003803253173828125,
0.00772857666015625,
-0.052886962890625,
0.005428314208984375,
-0.010589599609375,
-0.017974853515625,
0.0080108642578125,
-0.0051116943359375,
0.084716796875,
0.0237884521484375,
-0.0433349609375,
0.02197265625,
-0.0264892578125,
0.00997161865234375,
-0.06719970703125,
0.045623779296875,
0.0014162063598632812,
0.049468994140625,
0.01385498046875,
-0.00589752197265625,
0.032684326171875,
-0.01290130615234375,
0.00188446044921875,
0.0015249252319335938,
-0.0484619140625,
-0.053802490234375,
0.0662841796875,
0.0022678375244140625,
-0.06658935546875,
0.052978515625,
-0.003307342529296875,
-0.00027108192443847656,
0.049530029296875,
0.06695556640625,
0.0849609375,
-0.0208892822265625,
0.0270538330078125,
0.04949951171875,
-0.0157318115234375,
0.010711669921875,
0.06353759765625,
0.006488800048828125,
-0.049652099609375,
0.0259246826171875,
-0.036376953125,
-0.023345947265625,
0.049285888671875,
-0.031494140625,
0.062286376953125,
-0.0433349609375,
-0.0238189697265625,
-0.0013589859008789062,
0.006694793701171875,
-0.045257568359375,
0.041015625,
0.0243377685546875,
0.06365966796875,
-0.04229736328125,
0.038970947265625,
0.052764892578125,
-0.038238525390625,
-0.034423828125,
0.0087890625,
0.01050567626953125,
-0.06817626953125,
0.018402099609375,
0.0130615234375,
-0.01186370849609375,
-0.01419830322265625,
-0.041748046875,
-0.04693603515625,
0.05078125,
0.0110015869140625,
-0.07244873046875,
-0.0010461807250976562,
0.0007686614990234375,
0.010955810546875,
-0.038543701171875,
0.0291290283203125,
0.0330810546875,
0.013763427734375,
0.0298004150390625,
-0.0399169921875,
-0.035247802734375,
-0.037841796875,
0.01488494873046875,
0.0008425712585449219,
-0.06500244140625,
0.046661376953125,
-0.046478271484375,
0.012847900390625,
0.01236724853515625,
0.07366943359375,
0.042266845703125,
0.00823211669921875,
0.08526611328125,
0.051849365234375,
0.03668212890625,
0.022247314453125,
0.06573486328125,
-0.0154266357421875,
0.0214996337890625,
0.08270263671875,
-0.0064239501953125,
0.056182861328125,
0.0303192138671875,
-0.03448486328125,
0.0239410400390625,
0.084716796875,
0.00872039794921875,
0.04595947265625,
0.018951416015625,
-0.0158843994140625,
-0.0114898681640625,
-0.0157318115234375,
-0.016815185546875,
0.01177215576171875,
0.0198822021484375,
-0.00678253173828125,
-0.01108551025390625,
0.005687713623046875,
-0.0036754608154296875,
0.004474639892578125,
-0.0311431884765625,
0.02642822265625,
0.044952392578125,
-0.01312255859375,
0.01340484619140625,
-0.030426025390625,
0.028289794921875,
-0.045654296875,
-0.01617431640625,
-0.026702880859375,
0.03363037109375,
-0.0249481201171875,
-0.0670166015625,
0.01552581787109375,
-0.006702423095703125,
-0.00493621826171875,
-0.0292510986328125,
0.050262451171875,
-0.0099029541015625,
-0.0806884765625,
0.028350830078125,
-0.0013475418090820312,
0.033111572265625,
0.0264892578125,
-0.09173583984375,
0.0312042236328125,
0.01401519775390625,
-0.0037212371826171875,
0.006473541259765625,
0.0222015380859375,
0.021759033203125,
0.032562255859375,
0.0167694091796875,
0.01374053955078125,
-0.0069427490234375,
0.0228271484375,
0.01548004150390625,
-0.051727294921875,
-0.0751953125,
-0.042999267578125,
0.036590576171875,
-0.030548095703125,
-0.027862548828125,
0.058807373046875,
0.069091796875,
0.05517578125,
-0.0550537109375,
0.04827880859375,
-0.0022029876708984375,
0.0518798828125,
-0.0258026123046875,
0.07257080078125,
-0.055816650390625,
-0.0179443359375,
-0.0222015380859375,
-0.047119140625,
0.00363922119140625,
0.06787109375,
0.010040283203125,
0.0219879150390625,
0.0241241455078125,
0.034942626953125,
-0.050506591796875,
0.0178375244140625,
0.051788330078125,
0.0310821533203125,
0.00441741943359375,
0.00843048095703125,
0.09735107421875,
-0.03375244140625,
0.02105712890625,
-0.04248046875,
0.0092620849609375,
-0.0214691162109375,
-0.055267333984375,
-0.0672607421875,
-0.044891357421875,
-0.04718017578125,
-0.032745361328125,
-0.01019287109375,
0.0811767578125,
0.047515869140625,
-0.080078125,
-0.048675537109375,
0.01690673828125,
0.00441741943359375,
-0.01125335693359375,
-0.0130157470703125,
0.0193328857421875,
0.01461029052734375,
-0.07916259765625,
0.03485107421875,
0.0004413127899169922,
0.048187255859375,
0.0068206787109375,
0.0039520263671875,
0.01190948486328125,
0.004703521728515625,
0.0269927978515625,
0.0008106231689453125,
-0.03948974609375,
-0.0199432373046875,
-0.0006275177001953125,
0.022735595703125,
0.01751708984375,
0.06463623046875,
-0.01332855224609375,
0.00531005859375,
0.03533935546875,
0.0023021697998046875,
0.04974365234375,
-0.027496337890625,
0.0152130126953125,
-0.02142333984375,
0.0257415771484375,
0.0190582275390625,
0.06146240234375,
0.0249786376953125,
-0.038665771484375,
0.01377105712890625,
0.027862548828125,
-0.02056884765625,
-0.042510986328125,
0.0184783935546875,
-0.057830810546875,
-0.0152130126953125,
0.05523681640625,
0.0229644775390625,
-0.047576904296875,
0.003948211669921875,
-0.04888916015625,
-0.030914306640625,
-0.0511474609375,
0.042572021484375,
0.031036376953125,
-0.0386962890625,
-0.0321044921875,
-0.039154052734375,
0.028411865234375,
-0.0098419189453125,
-0.038330078125,
-0.02392578125,
0.039215087890625,
0.049652099609375,
0.019561767578125,
0.06695556640625,
0.0044097900390625,
-0.01453399658203125,
0.03997802734375,
0.0288238525390625,
0.03143310546875,
-0.0219573974609375,
0.00545501708984375,
-0.00360107421875,
0.01702880859375,
-0.07513427734375
]
] |
facebook/deit-base-distilled-patch16-224 | 2022-07-13T11:39:38.000Z | [
"transformers",
"pytorch",
"tf",
"deit",
"image-classification",
"vision",
"dataset:imagenet",
"arxiv:2012.12877",
"arxiv:2006.03677",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | image-classification | facebook | null | null | facebook/deit-base-distilled-patch16-224 | 20 | 13,592 | transformers | 2022-03-02T23:29:05 | ---
license: apache-2.0
tags:
- image-classification
- vision
datasets:
- imagenet
---
# Distilled Data-efficient Image Transformer (base-sized model)
Distilled data-efficient Image Transformer (DeiT) model pre-trained and fine-tuned on ImageNet-1k (1 million images, 1,000 classes) at resolution 224x224. It was first introduced in the paper [Training data-efficient image transformers & distillation through attention](https://arxiv.org/abs/2012.12877) by Touvron et al. and first released in [this repository](https://github.com/facebookresearch/deit). However, the weights were converted from the [timm repository](https://github.com/rwightman/pytorch-image-models) by Ross Wightman.
Disclaimer: The team releasing DeiT did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
This model is a distilled Vision Transformer (ViT). It uses a distillation token, besides the class token, to effectively learn from a teacher (CNN) during both pre-training and fine-tuning. The distillation token is learned through backpropagation, by interacting with the class ([CLS]) and patch tokens through the self-attention layers.
Images are presented to the model as a sequence of fixed-size patches (resolution 16x16), which are linearly embedded.
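To make the patch arithmetic concrete, here is a minimal sketch (illustrative only, not part of the library) computing the number of patch tokens for the configuration above:

```python
def num_patches(image_size: int, patch_size: int) -> int:
    """Number of non-overlapping square patches for a square input image."""
    patches_per_side = image_size // patch_size
    return patches_per_side ** 2

# 224 / 16 = 14 patches per side, so 14 * 14 = 196 patch tokens.
# DeiT prepends the [CLS] and distillation tokens, giving a sequence of 198.
print(num_patches(224, 16))  # 196
```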
## Intended uses & limitations
You can use the raw model for image classification. See the [model hub](https://huggingface.co/models?search=facebook/deit) to look for
fine-tuned versions on a task that interests you.
### How to use
Since this model is a distilled ViT model, you can plug it into `DeiTModel`, `DeiTForImageClassification` or `DeiTForImageClassificationWithTeacher`. Note that the model expects the data to be prepared using `DeiTFeatureExtractor`. Here we use `AutoFeatureExtractor`, which will automatically use the appropriate feature extractor given the model name.
Here is how to use this model to classify an image from the COCO 2017 dataset into one of the 1,000 ImageNet classes:
```python
from transformers import AutoFeatureExtractor, DeiTForImageClassificationWithTeacher
from PIL import Image
import requests
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)
feature_extractor = AutoFeatureExtractor.from_pretrained('facebook/deit-base-distilled-patch16-224')
model = DeiTForImageClassificationWithTeacher.from_pretrained('facebook/deit-base-distilled-patch16-224')
inputs = feature_extractor(images=image, return_tensors="pt")
# forward pass
outputs = model(**inputs)
logits = outputs.logits
# model predicts one of the 1000 ImageNet classes
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
```
Currently, both the feature extractor and the model support PyTorch. TensorFlow and JAX/Flax support is coming soon.
## Training data
This model was pretrained and fine-tuned with distillation on [ImageNet-1k](http://www.image-net.org/challenges/LSVRC/2012/), a dataset consisting of 1 million images and 1k classes.
## Training procedure
### Preprocessing
The exact details of preprocessing of images during training/validation can be found [here](https://github.com/facebookresearch/deit/blob/ab5715372db8c6cad5740714b2216d55aeae052e/datasets.py#L78).
At inference time, images are resized/rescaled to the same resolution (256x256), center-cropped at 224x224 and normalized across the RGB channels with the ImageNet mean and standard deviation.
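As a small illustration of the crop step (a sketch under the assumptions above, not the extractor's actual code), the center-crop box that maps the 256x256 resized image to the 224x224 model input can be computed as:

```python
def center_crop_box(width: int, height: int, crop: int) -> tuple:
    """Pixel box (left, top, right, bottom) for a centered square crop."""
    left = (width - crop) // 2
    top = (height - crop) // 2
    return (left, top, left + crop, top + crop)

# A 256x256 image center-cropped to 224x224 keeps pixels 16..240 on each axis.
print(center_crop_box(256, 256, 224))  # (16, 16, 240, 240)
```

In practice, `DeiTFeatureExtractor` performs the resize, crop, and ImageNet normalization for you.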
### Pretraining
The model was trained on a single 8-GPU node for 3 days. Training resolution is 224. For all hyperparameters (such as batch size and learning rate) we refer to table 9 of the original paper.
## Evaluation results
| Model | ImageNet top-1 accuracy | ImageNet top-5 accuracy | # params | URL |
|---------------------------------------|-------------------------|-------------------------|----------|------------------------------------------------------------------|
| DeiT-tiny | 72.2 | 91.1 | 5M | https://huggingface.co/facebook/deit-tiny-patch16-224 |
| DeiT-small | 79.9 | 95.0 | 22M | https://huggingface.co/facebook/deit-small-patch16-224 |
| DeiT-base | 81.8 | 95.6 | 86M | https://huggingface.co/facebook/deit-base-patch16-224 |
| DeiT-tiny distilled | 74.5 | 91.9 | 6M | https://huggingface.co/facebook/deit-tiny-distilled-patch16-224 |
| DeiT-small distilled | 81.2 | 95.4 | 22M | https://huggingface.co/facebook/deit-small-distilled-patch16-224 |
| **DeiT-base distilled** | **83.4** | **96.5** | **87M** | **https://huggingface.co/facebook/deit-base-distilled-patch16-224** |
| DeiT-base 384 | 82.9 | 96.2 | 87M | https://huggingface.co/facebook/deit-base-patch16-384 |
| DeiT-base distilled 384 (1000 epochs) | 85.2 | 97.2 | 88M | https://huggingface.co/facebook/deit-base-distilled-patch16-384 |
Note that for fine-tuning, the best results are obtained with a higher resolution (384x384), and increasing the model size generally improves performance.
### BibTeX entry and citation info
```bibtex
@misc{touvron2021training,
title={Training data-efficient image transformers & distillation through attention},
author={Hugo Touvron and Matthieu Cord and Matthijs Douze and Francisco Massa and Alexandre Sablayrolles and Hervé Jégou},
year={2021},
eprint={2012.12877},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
```bibtex
@misc{wu2020visual,
title={Visual Transformers: Token-based Image Representation and Processing for Computer Vision},
author={Bichen Wu and Chenfeng Xu and Xiaoliang Dai and Alvin Wan and Peizhao Zhang and Zhicheng Yan and Masayoshi Tomizuka and Joseph Gonzalez and Kurt Keutzer and Peter Vajda},
year={2020},
eprint={2006.03677},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
```bibtex
@inproceedings{deng2009imagenet,
title={Imagenet: A large-scale hierarchical image database},
author={Deng, Jia and Dong, Wei and Socher, Richard and Li, Li-Jia and Li, Kai and Fei-Fei, Li},
booktitle={2009 IEEE conference on computer vision and pattern recognition},
pages={248--255},
year={2009},
  organization={IEEE}
}
``` | 6,823 | [
[
-0.058929443359375,
-0.03857421875,
0.00449371337890625,
0.01061248779296875,
-0.030059814453125,
-0.0153045654296875,
-0.01233673095703125,
-0.033782958984375,
0.0208282470703125,
0.01456451416015625,
-0.0298919677734375,
-0.0279693603515625,
-0.0657958984375,
0.00635528564453125,
-0.0279083251953125,
0.07177734375,
0.0026416778564453125,
-0.0116119384765625,
-0.005229949951171875,
-0.0102386474609375,
-0.033966064453125,
-0.02740478515625,
-0.051177978515625,
-0.0128173828125,
0.035675048828125,
0.0184783935546875,
0.042083740234375,
0.05670166015625,
0.0621337890625,
0.03558349609375,
-0.01149749755859375,
0.0075531005859375,
-0.038116455078125,
-0.0204925537109375,
0.002292633056640625,
-0.019866943359375,
-0.0323486328125,
0.021148681640625,
0.03314208984375,
0.030792236328125,
0.00998687744140625,
0.0250701904296875,
0.0262908935546875,
0.06646728515625,
-0.03802490234375,
0.017120361328125,
-0.04046630859375,
0.0109710693359375,
0.0008525848388671875,
-0.0017919540405273438,
-0.0182037353515625,
-0.00960540771484375,
0.01386260986328125,
-0.0374755859375,
0.033599853515625,
-0.0125732421875,
0.0950927734375,
0.038909912109375,
-0.0229949951171875,
0.011199951171875,
-0.04345703125,
0.0538330078125,
-0.039398193359375,
0.0175933837890625,
0.032257080078125,
0.0255126953125,
-0.00330352783203125,
-0.07647705078125,
-0.040374755859375,
-0.00457000732421875,
-0.0235595703125,
0.015228271484375,
-0.02691650390625,
0.0008411407470703125,
0.037689208984375,
0.05511474609375,
-0.0347900390625,
-0.00408172607421875,
-0.041534423828125,
-0.01065826416015625,
0.053436279296875,
-0.01024627685546875,
0.0010786056518554688,
-0.00543975830078125,
-0.04791259765625,
-0.02227783203125,
-0.007099151611328125,
0.0034275054931640625,
0.00635528564453125,
0.0027713775634765625,
-0.0174560546875,
0.0311737060546875,
-0.0169525146484375,
0.0491943359375,
0.0394287109375,
-0.006748199462890625,
0.040771484375,
-0.0209197998046875,
-0.031982421875,
-0.0039825439453125,
0.0723876953125,
0.037841796875,
0.0207061767578125,
0.0192718505859375,
-0.0126800537109375,
0.005157470703125,
0.011260986328125,
-0.0848388671875,
-0.0323486328125,
0.0030975341796875,
-0.052734375,
-0.041961669921875,
0.0113525390625,
-0.05340576171875,
-0.006389617919921875,
-0.0242156982421875,
0.03509521484375,
-0.0292205810546875,
-0.029296875,
-0.006587982177734375,
-0.0103912353515625,
0.0139007568359375,
0.0250244140625,
-0.0491943359375,
0.01058197021484375,
0.01537322998046875,
0.07427978515625,
-0.0079803466796875,
-0.01522064208984375,
-0.0018644332885742188,
-0.029388427734375,
-0.0287628173828125,
0.04443359375,
0.0016183853149414062,
-0.004467010498046875,
-0.023406982421875,
0.0273284912109375,
-0.0029449462890625,
-0.03912353515625,
0.01812744140625,
-0.0291900634765625,
-0.003231048583984375,
-0.01425933837890625,
-0.022308349609375,
-0.0189208984375,
0.02618408203125,
-0.062286376953125,
0.08990478515625,
0.021697998046875,
-0.0709228515625,
0.0303497314453125,
-0.033935546875,
-0.005718231201171875,
-0.0094451904296875,
0.0064544677734375,
-0.049224853515625,
-0.00521087646484375,
0.0233154296875,
0.0479736328125,
-0.0163116455078125,
0.006927490234375,
-0.024505615234375,
-0.03314208984375,
0.015777587890625,
-0.038665771484375,
0.073974609375,
0.0290985107421875,
-0.0421142578125,
-0.0129241943359375,
-0.067138671875,
0.0007877349853515625,
0.028167724609375,
-0.0178680419921875,
-0.009002685546875,
-0.03778076171875,
0.00423431396484375,
0.031280517578125,
0.0155487060546875,
-0.03656005859375,
0.0123138427734375,
-0.0040130615234375,
0.041534423828125,
0.0606689453125,
-0.007083892822265625,
0.021209716796875,
-0.020172119140625,
0.0167236328125,
0.0299835205078125,
0.03173828125,
-0.0198822021484375,
-0.032379150390625,
-0.0667724609375,
-0.03643798828125,
0.0274200439453125,
0.0242462158203125,
-0.053619384765625,
0.04632568359375,
-0.0271759033203125,
-0.054290771484375,
-0.0244140625,
0.00865936279296875,
0.025054931640625,
0.045501708984375,
0.036041259765625,
-0.03436279296875,
-0.0340576171875,
-0.0784912109375,
0.00885009765625,
-0.005252838134765625,
0.014923095703125,
0.0101165771484375,
0.050628662109375,
-0.014617919921875,
0.06842041015625,
-0.040985107421875,
-0.03045654296875,
0.00002759695053100586,
-0.0023403167724609375,
0.026275634765625,
0.051300048828125,
0.06793212890625,
-0.074462890625,
-0.057769775390625,
-0.0039215087890625,
-0.06036376953125,
0.015960693359375,
0.008514404296875,
-0.0229644775390625,
0.0120697021484375,
0.033599853515625,
-0.039031982421875,
0.06182861328125,
0.031829833984375,
-0.01371002197265625,
0.0243988037109375,
-0.006992340087890625,
0.0267333984375,
-0.0869140625,
0.00626373291015625,
0.031402587890625,
-0.025665283203125,
-0.031585693359375,
-0.01261138916015625,
0.0025501251220703125,
0.0023040771484375,
-0.0438232421875,
0.02349853515625,
-0.044342041015625,
0.0007495880126953125,
-0.016632080078125,
-0.0250244140625,
0.00858306884765625,
0.051483154296875,
0.00399017333984375,
0.040863037109375,
0.048431396484375,
-0.03778076171875,
0.04644775390625,
0.019500732421875,
-0.02520751953125,
0.051544189453125,
-0.06201171875,
0.01473236083984375,
-0.00801849365234375,
0.024749755859375,
-0.08636474609375,
-0.015960693359375,
0.015777587890625,
-0.0377197265625,
0.045928955078125,
-0.0267791748046875,
-0.0229034423828125,
-0.057525634765625,
-0.0228424072265625,
0.03497314453125,
0.048583984375,
-0.05322265625,
0.025238037109375,
0.00806427001953125,
0.0300140380859375,
-0.05426025390625,
-0.07623291015625,
-0.0077667236328125,
-0.0261688232421875,
-0.04339599609375,
0.03558349609375,
0.002460479736328125,
0.00864410400390625,
0.01430511474609375,
-0.0004341602325439453,
-0.0193634033203125,
-0.006927490234375,
0.032806396484375,
0.032440185546875,
-0.0147247314453125,
-0.00626373291015625,
-0.0203094482421875,
-0.015533447265625,
-0.01058197021484375,
-0.03082275390625,
0.0291748046875,
-0.028350830078125,
-0.013671875,
-0.0633544921875,
0.0005669593811035156,
0.050201416015625,
-0.0032329559326171875,
0.04913330078125,
0.061920166015625,
-0.032867431640625,
0.00754547119140625,
-0.04937744140625,
-0.0232696533203125,
-0.039398193359375,
0.0296630859375,
-0.0296173095703125,
-0.043548583984375,
0.048309326171875,
0.0018625259399414062,
0.001003265380859375,
0.06256103515625,
0.035614013671875,
-0.0174713134765625,
0.0655517578125,
0.035919189453125,
-0.00675201416015625,
0.054931640625,
-0.0657958984375,
-0.0036296844482421875,
-0.05126953125,
-0.0118865966796875,
-0.0175018310546875,
-0.056427001953125,
-0.055267333984375,
-0.0282135009765625,
0.0215606689453125,
0.006015777587890625,
-0.031494140625,
0.044158935546875,
-0.06854248046875,
0.0221710205078125,
0.057342529296875,
0.035797119140625,
0.0013799667358398438,
0.024871826171875,
0.0007185935974121094,
-0.0035762786865234375,
-0.043701171875,
-0.00946807861328125,
0.0657958984375,
0.0309600830078125,
0.053497314453125,
-0.0240325927734375,
0.04833984375,
0.00946044921875,
0.013153076171875,
-0.050140380859375,
0.037872314453125,
-0.016754150390625,
-0.056793212890625,
-0.0088348388671875,
-0.02825927734375,
-0.0672607421875,
0.01090240478515625,
-0.01496124267578125,
-0.043975830078125,
0.04168701171875,
0.030303955078125,
-0.0144195556640625,
0.036590576171875,
-0.057281494140625,
0.0657958984375,
-0.016326904296875,
-0.038421630859375,
0.007221221923828125,
-0.06005859375,
0.0173797607421875,
-0.0000922083854675293,
-0.007160186767578125,
0.0042572021484375,
0.025482177734375,
0.048797607421875,
-0.0562744140625,
0.0704345703125,
-0.02362060546875,
0.0185546875,
0.055572509765625,
-0.0182647705078125,
0.0235443115234375,
-0.01517486572265625,
0.00977325439453125,
0.0406494140625,
0.00499725341796875,
-0.03692626953125,
-0.035797119140625,
0.051116943359375,
-0.0657958984375,
-0.027099609375,
-0.0430908203125,
-0.0172882080078125,
0.0123138427734375,
0.01287078857421875,
0.05426025390625,
0.0305328369140625,
0.00724029541015625,
0.03546142578125,
0.047760009765625,
-0.0172576904296875,
0.035400390625,
-0.01197052001953125,
-0.0007805824279785156,
-0.032012939453125,
0.0665283203125,
0.0277557373046875,
0.022430419921875,
0.012908935546875,
0.020660400390625,
-0.0181121826171875,
-0.028778076171875,
-0.031280517578125,
0.01043701171875,
-0.055267333984375,
-0.039581298828125,
-0.0450439453125,
-0.04083251953125,
-0.033416748046875,
-0.00010097026824951172,
-0.049560546875,
-0.0278472900390625,
-0.037384033203125,
-0.00579833984375,
0.05224609375,
0.041046142578125,
-0.0218963623046875,
0.0300750732421875,
-0.044403076171875,
0.0160064697265625,
0.0297393798828125,
0.031494140625,
-0.0059051513671875,
-0.05291748046875,
-0.022308349609375,
0.01468658447265625,
-0.032257080078125,
-0.052703857421875,
0.0267333984375,
0.0223846435546875,
0.034027099609375,
0.031585693359375,
-0.003963470458984375,
0.07342529296875,
-0.0169830322265625,
0.05206298828125,
0.035980224609375,
-0.040863037109375,
0.053558349609375,
-0.01180267333984375,
0.009674072265625,
0.046661376953125,
0.034576416015625,
-0.017547607421875,
-0.002285003662109375,
-0.0574951171875,
-0.06097412109375,
0.051422119140625,
0.00981903076171875,
0.00548553466796875,
0.005397796630859375,
0.044403076171875,
-0.01290130615234375,
0.004901885986328125,
-0.056488037109375,
-0.0394287109375,
-0.039398193359375,
-0.015533447265625,
-0.00223541259765625,
-0.01416015625,
0.00691986083984375,
-0.05609130859375,
0.05072021484375,
-0.0068817138671875,
0.03460693359375,
0.0231170654296875,
-0.00911712646484375,
0.00614166259765625,
-0.032745361328125,
0.01531219482421875,
0.02838134765625,
-0.01324462890625,
0.00826263427734375,
0.0054779052734375,
-0.06201171875,
0.01372528076171875,
0.00933074951171875,
-0.00691986083984375,
-0.0028629302978515625,
0.0251312255859375,
0.07342529296875,
-0.00811004638671875,
0.0013408660888671875,
0.05072021484375,
-0.0161895751953125,
-0.03759765625,
-0.026031494140625,
0.005718231201171875,
-0.00797271728515625,
0.03106689453125,
0.029205322265625,
0.0177764892578125,
0.01343536376953125,
-0.0243377685546875,
0.0223846435546875,
0.020050048828125,
-0.0401611328125,
-0.0198516845703125,
0.0472412109375,
-0.007015228271484375,
0.0134429931640625,
0.05999755859375,
-0.00879669189453125,
-0.036041259765625,
0.07537841796875,
0.023773193359375,
0.06201171875,
-0.021026611328125,
0.01038360595703125,
0.06549072265625,
0.01172637939453125,
-0.01503753662109375,
0.0003650188446044922,
0.00505828857421875,
-0.04302978515625,
-0.01416015625,
-0.053070068359375,
0.0229949951171875,
0.026031494140625,
-0.05255126953125,
0.0282135009765625,
-0.031829833984375,
-0.04107666015625,
0.020477294921875,
0.00809478759765625,
-0.07373046875,
0.03106689453125,
0.00664520263671875,
0.06036376953125,
-0.06536865234375,
0.0570068359375,
0.054718017578125,
-0.046478271484375,
-0.0789794921875,
-0.021820068359375,
-0.002780914306640625,
-0.049835205078125,
0.06439208984375,
0.025390625,
0.013641357421875,
0.0150299072265625,
-0.0526123046875,
-0.06591796875,
0.10333251953125,
0.0250701904296875,
-0.041961669921875,
0.007099151611328125,
0.0088653564453125,
0.03582763671875,
-0.015777587890625,
0.0338134765625,
0.03350830078125,
0.0271453857421875,
0.042327880859375,
-0.061279296875,
0.0028285980224609375,
-0.031524658203125,
0.02398681640625,
-0.00562286376953125,
-0.06988525390625,
0.0731201171875,
-0.00827789306640625,
-0.0057220458984375,
-0.01029205322265625,
0.05072021484375,
-0.001621246337890625,
0.0095062255859375,
0.055633544921875,
0.06378173828125,
0.0255889892578125,
-0.0215606689453125,
0.0697021484375,
-0.00774383544921875,
0.046875,
0.05718994140625,
0.0268096923828125,
0.0270538330078125,
0.03729248046875,
-0.0260772705078125,
0.0230255126953125,
0.08349609375,
-0.0262908935546875,
0.045684814453125,
0.0039043426513671875,
0.005401611328125,
-0.00691986083984375,
0.0020198822021484375,
-0.036956787109375,
0.037841796875,
0.007793426513671875,
-0.05035400390625,
-0.006317138671875,
0.0172576904296875,
-0.0109100341796875,
-0.0218505859375,
-0.021575927734375,
0.056640625,
0.0083465576171875,
-0.04241943359375,
0.07122802734375,
-0.017791748046875,
0.05712890625,
-0.016204833984375,
-0.00885009765625,
-0.0226593017578125,
0.033416748046875,
-0.027313232421875,
-0.053619384765625,
0.0223541259765625,
-0.00677490234375,
-0.00665283203125,
-0.004039764404296875,
0.065185546875,
-0.019195556640625,
-0.049407958984375,
0.01654052734375,
0.0138397216796875,
0.022918701171875,
-0.011444091796875,
-0.0726318359375,
0.005157470703125,
0.0020008087158203125,
-0.055419921875,
0.02191162109375,
0.03875732421875,
0.0004076957702636719,
0.031585693359375,
0.047210693359375,
-0.0029087066650390625,
0.01593017578125,
-0.017608642578125,
0.08636474609375,
-0.025360107421875,
-0.02520751953125,
-0.06671142578125,
0.047607421875,
-0.0231475830078125,
-0.023468017578125,
0.0440673828125,
0.036865234375,
0.0687255859375,
-0.005397796630859375,
0.051849365234375,
-0.024383544921875,
0.005748748779296875,
-0.0136566162109375,
0.0421142578125,
-0.055511474609375,
-0.01435089111328125,
-0.03192138671875,
-0.07574462890625,
-0.0115509033203125,
0.0718994140625,
-0.01103973388671875,
0.031585693359375,
0.036468505859375,
0.0562744140625,
-0.0244903564453125,
-0.0197296142578125,
0.0167083740234375,
0.01245880126953125,
0.01496124267578125,
0.033905029296875,
0.047119140625,
-0.0626220703125,
0.033721923828125,
-0.0655517578125,
-0.03326416015625,
-0.023193359375,
-0.06329345703125,
-0.07281494140625,
-0.0673828125,
-0.046661376953125,
-0.0391845703125,
-0.011138916015625,
0.055694580078125,
0.07684326171875,
-0.047210693359375,
0.0073699951171875,
-0.01324462890625,
-0.0178070068359375,
-0.0287628173828125,
-0.0169677734375,
0.0421142578125,
-0.0030651092529296875,
-0.072021484375,
-0.0271453857421875,
0.0015869140625,
0.03265380859375,
-0.01503753662109375,
-0.0018434524536132812,
-0.021820068359375,
-0.026031494140625,
0.039703369140625,
0.01264190673828125,
-0.027374267578125,
-0.0205230712890625,
0.00455474853515625,
-0.0162811279296875,
0.0211334228515625,
0.035858154296875,
-0.04229736328125,
0.0274505615234375,
0.04071044921875,
0.03546142578125,
0.065185546875,
0.005008697509765625,
-0.0035076141357421875,
-0.05084228515625,
0.039581298828125,
0.0030975341796875,
0.031524658203125,
0.0233154296875,
-0.040069580078125,
0.053924560546875,
0.03912353515625,
-0.0479736328125,
-0.05609130859375,
-0.008819580078125,
-0.087646484375,
-0.0153961181640625,
0.07891845703125,
-0.02972412109375,
-0.04302978515625,
0.024444580078125,
-0.01430511474609375,
0.0413818359375,
-0.011444091796875,
0.04302978515625,
0.039764404296875,
0.00734710693359375,
-0.0243072509765625,
-0.047088623046875,
0.0288848876953125,
0.0006237030029296875,
-0.04052734375,
-0.0176544189453125,
0.03314208984375,
0.034912109375,
0.02911376953125,
0.04583740234375,
-0.02325439453125,
0.004169464111328125,
0.004802703857421875,
0.01389312744140625,
-0.01788330078125,
-0.0182342529296875,
-0.01165771484375,
-0.0183563232421875,
-0.018035888671875,
-0.047393798828125
]
] |
Habana/wav2vec2 | 2023-02-25T09:47:27.000Z | [
"optimum_habana",
"license:apache-2.0",
"region:us"
] | null | Habana | null | null | Habana/wav2vec2 | 0 | 13,564 | null | 2022-10-15T18:19:06 | ---
license: apache-2.0
---
[Optimum Habana](https://github.com/huggingface/optimum-habana) is the interface between the Hugging Face Transformers and Diffusers libraries and Habana's Gaudi processor (HPU).
It provides a set of tools enabling easy and fast model loading, training, and inference in single- and multi-HPU settings for different downstream tasks.
Learn more about how to take advantage of the power of Habana HPUs to train and deploy Transformers and Diffusers models at [hf.co/hardware/habana](https://huggingface.co/hardware/habana).
## Wav2Vec2 model HPU configuration
This model only contains the `GaudiConfig` file for running the [Wav2Vec2](https://huggingface.co/facebook/wav2vec2-base) model on Habana's Gaudi processors (HPU).
**This model contains no model weights, only a GaudiConfig.**
It allows you to specify:
- `use_habana_mixed_precision`: whether to use Habana Mixed Precision (HMP)
- `hmp_opt_level`: optimization level for HMP, see [here](https://docs.habana.ai/en/latest/PyTorch/PyTorch_Mixed_Precision/PT_Mixed_Precision.html#configuration-options) for a detailed explanation
- `hmp_bf16_ops`: list of operators that should run in bf16
- `hmp_fp32_ops`: list of operators that should run in fp32
- `hmp_is_verbose`: verbosity
- `use_fused_adam`: whether to use Habana's custom AdamW implementation
- `use_fused_clip_norm`: whether to use Habana's fused gradient norm clipping operator
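For illustration only, a `GaudiConfig` covering these options might look like the dictionary below; the field values here are hypothetical placeholders, and the actual settings live in this repository's `gaudi_config.json`:

```python
# Hypothetical GaudiConfig contents; the values are placeholders, not the
# settings shipped in this repository's gaudi_config.json.
gaudi_config = {
    "use_habana_mixed_precision": True,
    "hmp_opt_level": "O1",
    "hmp_bf16_ops": ["add", "addmm", "conv2d", "matmul"],
    "hmp_fp32_ops": ["cross_entropy", "log_softmax", "softmax"],
    "hmp_is_verbose": False,
    "use_fused_adam": True,
    "use_fused_clip_norm": True,
}
```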
## Usage
The model is instantiated the same way as in the Transformers library.
The only difference is that there are a few new training arguments specific to HPUs.
[Here](https://github.com/huggingface/optimum-habana/blob/main/examples/audio-classification/run_audio_classification.py) is an audio classification example script to fine-tune a model. You can run it with Wav2Vec2 with the following command:
```bash
python run_audio_classification.py \
--model_name_or_path facebook/wav2vec2-base \
--dataset_name superb \
--dataset_config_name ks \
--output_dir /tmp/wav2vec2-base-ft-keyword-spotting \
--overwrite_output_dir \
--remove_unused_columns False \
--do_train \
--do_eval \
--learning_rate 3e-5 \
--max_length_seconds 1 \
--attention_mask False \
--warmup_ratio 0.1 \
--num_train_epochs 5 \
--per_device_train_batch_size 256 \
--per_device_eval_batch_size 256 \
--dataloader_num_workers 4 \
--seed 27 \
--use_habana \
--use_lazy_mode \
--gaudi_config_name Habana/wav2vec2 \
--throughput_warmup_steps 2
```
Check the [documentation](https://huggingface.co/docs/optimum/habana/index) out for more advanced usage and examples.
| 2,664 | [
[
-0.05682373046875,
-0.053070068359375,
0.02069091796875,
0.00634765625,
-0.00933074951171875,
-0.01084136962890625,
-0.0168914794921875,
-0.04083251953125,
0.00708770751953125,
0.0179595947265625,
-0.0418701171875,
-0.0149078369140625,
-0.0343017578125,
-0.0228271484375,
-0.0214691162109375,
0.08087158203125,
0.002407073974609375,
-0.019500732421875,
-0.0164337158203125,
-0.01242828369140625,
-0.041046142578125,
-0.027496337890625,
-0.0771484375,
-0.0343017578125,
0.015045166015625,
0.0161590576171875,
0.0679931640625,
0.042236328125,
0.03704833984375,
0.02886962890625,
-0.0246734619140625,
0.0008687973022460938,
-0.0252685546875,
-0.0160675048828125,
0.008453369140625,
-0.037628173828125,
-0.0408935546875,
0.01259613037109375,
0.041839599609375,
0.01126861572265625,
-0.021697998046875,
0.0277252197265625,
-0.0003032684326171875,
0.036712646484375,
-0.04473876953125,
-0.001956939697265625,
-0.0223541259765625,
0.018524169921875,
-0.0148773193359375,
-0.0235443115234375,
-0.010223388671875,
-0.00849151611328125,
0.01021575927734375,
-0.04583740234375,
0.0170745849609375,
0.011016845703125,
0.0958251953125,
0.06085205078125,
-0.0297393798828125,
0.0054168701171875,
-0.06158447265625,
0.0634765625,
-0.040130615234375,
0.030914306640625,
0.032501220703125,
0.048065185546875,
-0.0099029541015625,
-0.06158447265625,
-0.023773193359375,
0.0082244873046875,
0.00740814208984375,
0.021209716796875,
-0.039581298828125,
0.0217742919921875,
0.036163330078125,
0.0594482421875,
-0.0230560302734375,
0.00933837890625,
-0.040679931640625,
-0.0238800048828125,
0.028594970703125,
-0.0021190643310546875,
0.007415771484375,
-0.0198516845703125,
-0.028594970703125,
-0.0181121826171875,
-0.028411865234375,
0.0135498046875,
0.035186767578125,
-0.007160186767578125,
-0.0411376953125,
0.0186309814453125,
0.006961822509765625,
0.056640625,
0.0079498291015625,
-0.02337646484375,
0.04522705078125,
-0.0012216567993164062,
-0.033905029296875,
-0.0038166046142578125,
0.05682373046875,
0.009979248046875,
0.0034084320068359375,
0.01166534423828125,
-0.025634765625,
0.015045166015625,
0.0528564453125,
-0.049896240234375,
-0.027984619140625,
0.0382080078125,
-0.0421142578125,
-0.042083740234375,
-0.02850341796875,
-0.0594482421875,
0.01290130615234375,
-0.026519775390625,
0.06634521484375,
-0.026031494140625,
-0.0079498291015625,
-0.00873565673828125,
-0.0310821533203125,
0.024688720703125,
0.0114898681640625,
-0.072021484375,
0.034515380859375,
0.04156494140625,
0.06683349609375,
-0.0145263671875,
-0.011932373046875,
-0.025543212890625,
-0.00878143310546875,
-0.0051727294921875,
0.03558349609375,
-0.0033588409423828125,
-0.0254669189453125,
-0.005634307861328125,
0.00463104248046875,
-0.002216339111328125,
-0.05218505859375,
0.06964111328125,
-0.0218353271484375,
0.03521728515625,
0.021209716796875,
-0.031463623046875,
-0.0284423828125,
-0.0194244384765625,
-0.039276123046875,
0.10772705078125,
0.0311737060546875,
-0.05224609375,
0.0142059326171875,
-0.049835205078125,
-0.040252685546875,
-0.015350341796875,
-0.0007243156433105469,
-0.05706787109375,
0.00433349609375,
-0.0007786750793457031,
0.0268402099609375,
-0.020294189453125,
0.005420684814453125,
-0.01540374755859375,
-0.0340576171875,
0.0014600753784179688,
-0.026397705078125,
0.0867919921875,
0.03240966796875,
-0.034149169921875,
0.031524658203125,
-0.06158447265625,
-0.0128021240234375,
0.00826263427734375,
-0.046630859375,
-0.00670623779296875,
-0.0141754150390625,
0.021575927734375,
0.01024627685546875,
0.017059326171875,
-0.029327392578125,
-0.010955810546875,
-0.01026153564453125,
0.053253173828125,
0.054473876953125,
-0.005191802978515625,
0.023284912109375,
-0.048187255859375,
0.03814697265625,
-0.031768798828125,
0.05364990234375,
0.017333984375,
-0.061279296875,
-0.0679931640625,
-0.036163330078125,
-0.00730133056640625,
0.038726806640625,
-0.002712249755859375,
0.05596923828125,
0.0182037353515625,
-0.053253173828125,
-0.0712890625,
0.0028171539306640625,
0.01800537109375,
0.048095703125,
0.042327880859375,
-0.011749267578125,
-0.064208984375,
-0.07525634765625,
0.0095062255859375,
-0.004856109619140625,
-0.00986480712890625,
0.048248291015625,
0.03045654296875,
-0.03021240234375,
0.055419921875,
-0.033355712890625,
-0.042236328125,
0.00553131103515625,
-0.00629425048828125,
0.050262451171875,
0.040374755859375,
0.051605224609375,
-0.058074951171875,
-0.01302337646484375,
-0.014923095703125,
-0.05108642578125,
-0.0004074573516845703,
-0.0005445480346679688,
-0.03759765625,
0.01087188720703125,
0.0165863037109375,
-0.050201416015625,
0.0220947265625,
0.05145263671875,
-0.0046539306640625,
0.04705810546875,
-0.0209808349609375,
-0.0027751922607421875,
-0.07391357421875,
0.01470184326171875,
-0.004241943359375,
-0.029296875,
-0.032989501953125,
0.00536346435546875,
-0.005855560302734375,
0.0005965232849121094,
-0.04119873046875,
0.0231170654296875,
-0.01192474365234375,
-0.004642486572265625,
-0.0222015380859375,
-0.0255126953125,
0.001064300537109375,
0.049835205078125,
-0.004177093505859375,
0.053253173828125,
0.049713134765625,
-0.051025390625,
0.03131103515625,
0.0273284912109375,
-0.0276641845703125,
0.0295867919921875,
-0.06451416015625,
0.01654052734375,
0.01021575927734375,
0.01291656494140625,
-0.05560302734375,
-0.029388427734375,
0.0035305023193359375,
-0.050933837890625,
0.030731201171875,
-0.0220947265625,
-0.01068115234375,
-0.04620361328125,
-0.0128631591796875,
0.0198211669921875,
0.09210205078125,
-0.06915283203125,
0.052581787109375,
0.06378173828125,
0.029205322265625,
-0.03912353515625,
-0.05023193359375,
-0.018035888671875,
-0.045928955078125,
-0.047943115234375,
0.05108642578125,
-0.004467010498046875,
-0.003398895263671875,
-0.0103302001953125,
-0.01442718505859375,
-0.00428009033203125,
0.0186309814453125,
0.039459228515625,
0.0350341796875,
0.01047515869140625,
-0.0159759521484375,
0.005153656005859375,
-0.01058197021484375,
0.0220489501953125,
-0.0253753662109375,
0.048309326171875,
-0.018035888671875,
0.01142120361328125,
-0.039276123046875,
-0.004974365234375,
0.0421142578125,
-0.00893402099609375,
0.028472900390625,
0.08209228515625,
-0.03521728515625,
-0.01385498046875,
-0.041839599609375,
-0.01186370849609375,
-0.04473876953125,
0.00615692138671875,
-0.017333984375,
-0.046356201171875,
0.034576416015625,
0.010711669921875,
0.011993408203125,
0.047515869140625,
0.06494140625,
-0.0277557373046875,
0.0721435546875,
0.040802001953125,
-0.0116424560546875,
0.053070068359375,
-0.0469970703125,
-0.00778961181640625,
-0.07586669921875,
-0.0183563232421875,
-0.048309326171875,
-0.0195465087890625,
-0.034423828125,
-0.0223846435546875,
0.049591064453125,
0.0008397102355957031,
-0.0279693603515625,
0.039093017578125,
-0.06280517578125,
0.01204681396484375,
0.06494140625,
0.01137542724609375,
-0.01096343994140625,
0.0084075927734375,
-0.0005307197570800781,
0.0242462158203125,
-0.05078125,
-0.0125579833984375,
0.0714111328125,
0.047607421875,
0.046356201171875,
0.00006961822509765625,
0.05511474609375,
0.0149993896484375,
-0.015411376953125,
-0.07611083984375,
0.0209808349609375,
0.0011882781982421875,
-0.05950927734375,
-0.009033203125,
-0.0179901123046875,
-0.06622314453125,
0.01300048828125,
-0.0234375,
-0.040191650390625,
0.0237579345703125,
0.0216827392578125,
-0.02593994140625,
0.0156707763671875,
-0.038665771484375,
0.06524658203125,
-0.007659912109375,
-0.02349853515625,
-0.022705078125,
-0.045318603515625,
0.0165863037109375,
-0.0065460205078125,
0.0160369873046875,
-0.00785064697265625,
0.0157470703125,
0.08477783203125,
-0.044891357421875,
0.037017822265625,
-0.035369873046875,
0.0025501251220703125,
0.03961181640625,
-0.0168609619140625,
0.022918701171875,
-0.00982666015625,
0.0010013580322265625,
0.0294189453125,
-0.00037741661071777344,
-0.0228424072265625,
-0.017120361328125,
0.046234130859375,
-0.08905029296875,
-0.0203857421875,
-0.0147857666015625,
-0.041015625,
-0.0012407302856445312,
0.01141357421875,
0.0469970703125,
0.024200439453125,
0.0003192424774169922,
0.0105743408203125,
0.044219970703125,
-0.01119232177734375,
0.0175933837890625,
0.0024890899658203125,
-0.01251983642578125,
-0.033782958984375,
0.05706787109375,
-0.0164031982421875,
0.0108795166015625,
0.01172637939453125,
0.033172607421875,
-0.017425537109375,
-0.03887939453125,
-0.0243377685546875,
-0.0003459453582763672,
-0.041015625,
-0.0184783935546875,
-0.04901123046875,
-0.004985809326171875,
-0.02752685546875,
-0.0097808837890625,
-0.03948974609375,
-0.0305633544921875,
-0.0213623046875,
0.00531768798828125,
0.051971435546875,
0.0118560791015625,
-0.0286865234375,
0.043609619140625,
-0.0406494140625,
0.042236328125,
0.017822265625,
0.004604339599609375,
-0.009429931640625,
-0.06744384765625,
-0.02593994140625,
-0.0038166046142578125,
-0.0197296142578125,
-0.061798095703125,
0.0245361328125,
0.027099609375,
0.051300048828125,
0.0308837890625,
-0.0145111083984375,
0.036163330078125,
-0.027801513671875,
0.040618896484375,
0.0035190582275390625,
-0.057403564453125,
0.053741455078125,
-0.03155517578125,
0.0032444000244140625,
0.04339599609375,
0.041961669921875,
-0.0211639404296875,
-0.00986480712890625,
-0.054168701171875,
-0.07305908203125,
0.0562744140625,
0.01861572265625,
-0.001995086669921875,
0.016754150390625,
0.018218994140625,
-0.00946044921875,
0.006076812744140625,
-0.0247650146484375,
-0.0264892578125,
-0.0207977294921875,
-0.0041656494140625,
-0.0024929046630859375,
0.0008015632629394531,
-0.0263824462890625,
-0.047119140625,
0.07598876953125,
0.003566741943359375,
0.044036865234375,
0.01264190673828125,
0.004840850830078125,
-0.0208587646484375,
-0.0132598876953125,
-0.006134033203125,
0.009063720703125,
-0.05511474609375,
-0.00991058349609375,
0.0115966796875,
-0.037078857421875,
0.005096435546875,
0.00005137920379638672,
-0.017730712890625,
-0.0007357597351074219,
0.009063720703125,
0.0867919921875,
0.01080322265625,
-0.0280303955078125,
0.031494140625,
-0.00531005859375,
-0.01314544677734375,
-0.050567626953125,
0.03662109375,
0.00562286376953125,
0.012237548828125,
0.00951385498046875,
0.0249481201171875,
0.0250091552734375,
-0.0333251953125,
0.0106658935546875,
0.0255584716796875,
-0.0175018310546875,
-0.00110626220703125,
0.07220458984375,
0.00899505615234375,
-0.028594970703125,
0.0626220703125,
0.005107879638671875,
-0.03143310546875,
0.06060791015625,
0.032470703125,
0.0723876953125,
-0.0202178955078125,
-0.005657196044921875,
0.0362548828125,
0.01259613037109375,
0.00933837890625,
0.0278778076171875,
-0.0255126953125,
-0.0484619140625,
-0.0234375,
-0.0736083984375,
-0.037628173828125,
0.00531005859375,
-0.07586669921875,
0.04241943359375,
-0.04473876953125,
-0.036712646484375,
0.0291290283203125,
-0.002750396728515625,
-0.06524658203125,
0.0146484375,
0.0118255615234375,
0.07940673828125,
-0.06280517578125,
0.0780029296875,
0.06005859375,
-0.0215606689453125,
-0.061279296875,
-0.03875732421875,
0.01430511474609375,
-0.054656982421875,
0.0025424957275390625,
0.00768280029296875,
-0.003444671630859375,
0.004627227783203125,
-0.030975341796875,
-0.05426025390625,
0.080322265625,
0.00888824462890625,
-0.027679443359375,
-0.0017147064208984375,
-0.0198516845703125,
0.038543701171875,
-0.01119232177734375,
0.020294189453125,
0.07916259765625,
0.045928955078125,
-0.0001176595687866211,
-0.0791015625,
-0.0014019012451171875,
-0.029052734375,
-0.0216522216796875,
0.014862060546875,
-0.051605224609375,
0.0701904296875,
-0.0193328857421875,
0.0082550048828125,
0.0004069805145263672,
0.042938232421875,
0.018524169921875,
0.01543426513671875,
0.05487060546875,
0.0589599609375,
0.06982421875,
-0.0157470703125,
0.08953857421875,
-0.024932861328125,
0.041778564453125,
0.0604248046875,
0.0184783935546875,
0.054107666015625,
0.033843994140625,
-0.04315185546875,
0.03009033203125,
0.0645751953125,
-0.0308837890625,
0.0438232421875,
0.0015096664428710938,
-0.0159759521484375,
-0.0086669921875,
-0.0032215118408203125,
-0.0281982421875,
0.04901123046875,
0.0242919921875,
-0.0216522216796875,
0.01470184326171875,
0.007213592529296875,
0.016326904296875,
-0.0372314453125,
-0.00855255126953125,
0.051116943359375,
0.004749298095703125,
-0.050079345703125,
0.0831298828125,
0.0027523040771484375,
0.0771484375,
-0.044403076171875,
0.01293182373046875,
-0.0010671615600585938,
0.0241546630859375,
-0.03924560546875,
-0.035980224609375,
0.033935546875,
-0.00893402099609375,
-0.0119781494140625,
0.01436614990234375,
0.054290771484375,
-0.0169830322265625,
-0.00537872314453125,
0.03460693359375,
0.0006694793701171875,
0.0352783203125,
-0.00400543212890625,
-0.057952880859375,
0.017425537109375,
0.00963592529296875,
-0.021270751953125,
0.0181427001953125,
-0.0167999267578125,
0.017852783203125,
0.032379150390625,
0.05682373046875,
0.00470733642578125,
0.00835418701171875,
0.0016431808471679688,
0.04638671875,
-0.02947998046875,
-0.048797607421875,
-0.0292205810546875,
0.022003173828125,
-0.004364013671875,
-0.031494140625,
0.0472412109375,
0.04266357421875,
0.062255859375,
-0.00629425048828125,
0.055511474609375,
-0.0178985595703125,
0.0207061767578125,
-0.0308074951171875,
0.0426025390625,
-0.055816650390625,
0.00048279762268066406,
-0.019622802734375,
-0.0814208984375,
0.0034656524658203125,
0.070068359375,
0.00768280029296875,
0.009796142578125,
0.03106689453125,
0.05877685546875,
-0.019073486328125,
0.00982666015625,
0.0005636215209960938,
0.0184478759765625,
0.02752685546875,
0.04840087890625,
0.0457763671875,
-0.038299560546875,
0.0299072265625,
-0.06109619140625,
-0.04107666015625,
-0.0172882080078125,
-0.043914794921875,
-0.047760009765625,
-0.0304412841796875,
-0.032928466796875,
-0.03973388671875,
0.005645751953125,
0.06256103515625,
0.06719970703125,
-0.040863037109375,
-0.0283203125,
-0.003894805908203125,
-0.0160980224609375,
-0.026519775390625,
-0.02008056640625,
0.038299560546875,
-0.0237579345703125,
-0.07470703125,
0.053741455078125,
-0.003070831298828125,
0.00868988037109375,
-0.0221405029296875,
-0.0215606689453125,
-0.005584716796875,
-0.0009918212890625,
0.027313232421875,
0.030029296875,
-0.022308349609375,
-0.0219879150390625,
-0.0260772705078125,
0.01241302490234375,
0.01331329345703125,
0.02593994140625,
-0.0626220703125,
0.0186614990234375,
0.056396484375,
0.026092529296875,
0.06549072265625,
-0.017333984375,
0.020172119140625,
-0.044189453125,
0.0186767578125,
0.00739288330078125,
0.0438232421875,
0.01097869873046875,
-0.0294189453125,
0.04632568359375,
0.01206207275390625,
-0.0693359375,
-0.05755615234375,
-0.0072784423828125,
-0.0989990234375,
-0.015899658203125,
0.0826416015625,
0.00829315185546875,
-0.03363037109375,
0.004016876220703125,
-0.0249481201171875,
0.039642333984375,
-0.0267791748046875,
0.04425048828125,
0.0206146240234375,
-0.0234375,
0.022186279296875,
-0.05999755859375,
0.055450439453125,
0.0372314453125,
-0.0465087890625,
-0.01910400390625,
0.039215087890625,
0.025634765625,
0.0226287841796875,
0.040679931640625,
-0.0271453857421875,
0.029998779296875,
0.00238800048828125,
0.0208282470703125,
-0.02288818359375,
-0.037841796875,
-0.041961669921875,
0.0023670196533203125,
-0.007793426513671875,
-0.0340576171875
]
] |
PY007/TinyLlama-1.1B-Chat-v0.3 | 2023-11-05T06:07:52.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"en",
"dataset:cerebras/SlimPajama-627B",
"dataset:bigcode/starcoderdata",
"dataset:OpenAssistant/oasst_top1_2023-08-25",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | PY007 | null | null | PY007/TinyLlama-1.1B-Chat-v0.3 | 17 | 13,563 | transformers | 2023-10-06T17:03:35 | ---
license: apache-2.0
datasets:
- cerebras/SlimPajama-627B
- bigcode/starcoderdata
- OpenAssistant/oasst_top1_2023-08-25
language:
- en
---
<div align="center">
# TinyLlama-1.1B
</div>
https://github.com/jzhang38/TinyLlama
The TinyLlama project aims to **pretrain** a **1.1B Llama model on 3 trillion tokens**. With some proper optimization, we can achieve this within a span of "just" 90 days using 16 A100-40G GPUs 🚀🚀. The training has started on 2023-09-01.
We adopted exactly the same architecture and tokenizer as Llama 2, so TinyLlama can be plugged into many open-source projects built upon Llama. With only 1.1B parameters, TinyLlama is also compact enough for applications that demand a restricted computation and memory footprint.
#### This Model
This is the chat model finetuned on top of [PY007/TinyLlama-1.1B-intermediate-step-480k-1T](https://huggingface.co/PY007/TinyLlama-1.1B-intermediate-step-480k-1T).
The dataset used is [OpenAssistant/oasst_top1_2023-08-25](https://huggingface.co/datasets/OpenAssistant/oasst_top1_2023-08-25) following the [chatml](https://github.com/openai/openai-python/blob/main/chatml.md) format.
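The chatml format mentioned above wraps each turn in `<|im_start|>`/`<|im_end|>` markers. As a minimal sketch of how a multi-turn prompt can be assembled in that layout (the multi-turn structure here is an assumption based on the chatml convention, not something stated in this card):

```python
# Sketch: assembling a chatml-style prompt. The <|im_start|>/<|im_end|> markers
# match the chatml convention; treating multiple turns this way is an assumption.

def build_chatml_prompt(turns):
    """turns: list of (role, content) pairs, e.g. [("user", "Hi")]."""
    parts = []
    for role, content in turns:
        parts.append(f"<|im_start|>{role}\n{content}<|im_end|>\n")
    # Leave the prompt open for the assistant's next reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([("user", "How to get in a good university?")])
print(prompt)
```

The single-turn prompt in the usage snippet below is exactly what this helper produces for one user turn.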
#### How to use
You will need `transformers>=4.31`.
Check the [TinyLlama](https://github.com/jzhang38/TinyLlama) GitHub page for more information.
```python
from transformers import AutoTokenizer
import transformers
import torch
model = "PY007/TinyLlama-1.1B-Chat-v0.3"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
CHAT_EOS_TOKEN_ID = 32002
prompt = "How to get in a good university?"
formatted_prompt = (
f"<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant\n"
)
sequences = pipeline(
formatted_prompt,
do_sample=True,
top_k=50,
    top_p=0.9,
num_return_sequences=1,
repetition_penalty=1.1,
max_new_tokens=1024,
eos_token_id=CHAT_EOS_TOKEN_ID,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
``` | 2,112 | [
[
-0.0203399658203125,
-0.0582275390625,
0.02655029296875,
0.017608642578125,
-0.02557373046875,
-0.007373809814453125,
-0.0222930908203125,
-0.0201873779296875,
0.0268707275390625,
0.00965118408203125,
-0.051055908203125,
-0.031463623046875,
-0.035675048828125,
-0.0181732177734375,
-0.0137939453125,
0.08380126953125,
0.0114593505859375,
-0.016082763671875,
0.013275146484375,
0.0009016990661621094,
-0.0230255126953125,
-0.01061248779296875,
-0.0782470703125,
-0.0194091796875,
0.028564453125,
0.05718994140625,
0.041015625,
0.0416259765625,
0.0278167724609375,
0.025390625,
-0.007717132568359375,
0.00951385498046875,
-0.04974365234375,
-0.0189056396484375,
0.0243988037109375,
-0.053131103515625,
-0.046844482421875,
0.014862060546875,
0.04888916015625,
0.0182647705078125,
-0.00928497314453125,
0.047454833984375,
0.004192352294921875,
0.0171051025390625,
-0.02740478515625,
0.0269317626953125,
-0.041412353515625,
0.00016772747039794922,
-0.025360107421875,
0.00012159347534179688,
-0.0236358642578125,
-0.0225372314453125,
0.01160430908203125,
-0.0565185546875,
0.0035190582275390625,
0.0207366943359375,
0.0784912109375,
0.0252532958984375,
-0.0124664306640625,
-0.024749755859375,
-0.034637451171875,
0.0582275390625,
-0.057861328125,
-0.005214691162109375,
0.0230560302734375,
0.033599853515625,
-0.01062774658203125,
-0.0792236328125,
-0.052337646484375,
-0.014923095703125,
-0.004657745361328125,
-0.0020122528076171875,
-0.0193023681640625,
-0.0108642578125,
0.0210113525390625,
0.036346435546875,
-0.048187255859375,
0.024169921875,
-0.05059814453125,
-0.010528564453125,
0.041473388671875,
0.040008544921875,
0.0118865966796875,
-0.0161590576171875,
-0.0223846435546875,
-0.0253143310546875,
-0.047027587890625,
0.00815582275390625,
0.01556396484375,
0.03607177734375,
-0.0550537109375,
0.03839111328125,
-0.008819580078125,
0.02471923828125,
0.00997161865234375,
-0.016357421875,
0.01513671875,
-0.032196044921875,
-0.036224365234375,
-0.0031223297119140625,
0.08489990234375,
0.0091400146484375,
-0.0029544830322265625,
0.0215301513671875,
0.00782012939453125,
0.005008697509765625,
-0.0025882720947265625,
-0.07501220703125,
-0.0330810546875,
0.026458740234375,
-0.03656005859375,
-0.040374755859375,
-0.0162353515625,
-0.056732177734375,
-0.0090484619140625,
0.00493621826171875,
0.04510498046875,
-0.0178985595703125,
-0.00983428955078125,
-0.00705718994140625,
0.015380859375,
0.01654052734375,
0.020751953125,
-0.07794189453125,
0.01287078857421875,
0.03924560546875,
0.0867919921875,
0.01538848876953125,
-0.0270538330078125,
-0.02294921875,
-0.016204833984375,
-0.01702880859375,
0.04302978515625,
-0.00887298583984375,
-0.02215576171875,
-0.0223388671875,
-0.0094146728515625,
-0.01038360595703125,
-0.034820556640625,
0.0107421875,
-0.0206451416015625,
0.03167724609375,
0.01238250732421875,
-0.0260009765625,
0.0013580322265625,
0.01116943359375,
-0.0312347412109375,
0.08294677734375,
-0.01007080078125,
-0.04193115234375,
0.0086669921875,
-0.06390380859375,
-0.00017464160919189453,
-0.013397216796875,
-0.008270263671875,
-0.033203125,
-0.0008783340454101562,
0.01702880859375,
0.026702880859375,
-0.0278167724609375,
-0.00010269880294799805,
-0.02532958984375,
-0.036529541015625,
0.0172576904296875,
-0.016998291015625,
0.07269287109375,
0.029632568359375,
-0.027618408203125,
0.023223876953125,
-0.05560302734375,
-0.0033473968505859375,
0.0263671875,
-0.02239990234375,
0.0008177757263183594,
-0.020843505859375,
0.0142059326171875,
0.0122222900390625,
0.029937744140625,
-0.04046630859375,
0.041595458984375,
-0.0513916015625,
0.047760009765625,
0.0675048828125,
-0.0166473388671875,
0.043548583984375,
-0.03143310546875,
0.0257110595703125,
0.008026123046875,
0.0214080810546875,
-0.01776123046875,
-0.056732177734375,
-0.09552001953125,
-0.0264892578125,
0.02301025390625,
0.02801513671875,
-0.04388427734375,
0.027679443359375,
-0.0224456787109375,
-0.06451416015625,
-0.05224609375,
0.0036163330078125,
0.0138702392578125,
0.027923583984375,
0.0230560302734375,
-0.0204620361328125,
-0.058990478515625,
-0.04718017578125,
0.01355743408203125,
-0.037811279296875,
-0.0123748779296875,
0.01064300537109375,
0.0557861328125,
-0.0301666259765625,
0.0726318359375,
-0.041534423828125,
-0.044342041015625,
-0.01403045654296875,
0.00955963134765625,
0.034881591796875,
0.0516357421875,
0.04083251953125,
-0.0243377685546875,
-0.03741455078125,
-0.01528167724609375,
-0.046630859375,
-0.002170562744140625,
-0.01175689697265625,
-0.01399993896484375,
0.0003838539123535156,
0.019012451171875,
-0.054351806640625,
0.036224365234375,
0.038848876953125,
-0.0288238525390625,
0.0189056396484375,
-0.00470733642578125,
-0.01328277587890625,
-0.0904541015625,
0.004993438720703125,
-0.00995635986328125,
-0.0115814208984375,
-0.0416259765625,
0.006610870361328125,
-0.01084136962890625,
-0.0178680419921875,
-0.042449951171875,
0.057708740234375,
-0.01212310791015625,
0.00704193115234375,
-0.025390625,
0.00806427001953125,
-0.0198822021484375,
0.045501708984375,
-0.0150146484375,
0.05499267578125,
0.038360595703125,
-0.0369873046875,
0.026092529296875,
0.032073974609375,
-0.0205841064453125,
0.006824493408203125,
-0.06060791015625,
0.0203857421875,
0.01849365234375,
0.026824951171875,
-0.08367919921875,
-0.01482391357421875,
0.052581787109375,
-0.042266845703125,
0.01361846923828125,
-0.005908966064453125,
-0.0467529296875,
-0.0295867919921875,
-0.032928466796875,
0.032257080078125,
0.059478759765625,
-0.05010986328125,
0.01959228515625,
0.035797119140625,
0.0041351318359375,
-0.016876220703125,
-0.054473876953125,
-0.0007796287536621094,
-0.01861572265625,
-0.053497314453125,
0.01543426513671875,
-0.006717681884765625,
-0.0030307769775390625,
-0.019683837890625,
0.007282257080078125,
0.0032749176025390625,
0.00411224365234375,
0.031219482421875,
0.0246429443359375,
-0.0156402587890625,
-0.004978179931640625,
-0.01009368896484375,
-0.02630615234375,
-0.00710296630859375,
-0.02008056640625,
0.058319091796875,
-0.037841796875,
-0.017974853515625,
-0.06268310546875,
-0.004177093505859375,
0.0250244140625,
0.01523590087890625,
0.0604248046875,
0.05126953125,
-0.037506103515625,
0.0008072853088378906,
-0.0301361083984375,
-0.02630615234375,
-0.041229248046875,
0.00750732421875,
-0.01416015625,
-0.0692138671875,
0.04449462890625,
0.01035308837890625,
0.017669677734375,
0.051849365234375,
0.07281494140625,
-0.01210784912109375,
0.060211181640625,
0.042236328125,
-0.00733184814453125,
0.03912353515625,
-0.05621337890625,
0.006488800048828125,
-0.053497314453125,
-0.0194854736328125,
-0.034332275390625,
-0.0301666259765625,
-0.035675048828125,
-0.043426513671875,
0.0170745849609375,
0.00547027587890625,
-0.03668212890625,
0.034942626953125,
-0.03924560546875,
0.0191497802734375,
0.041839599609375,
0.0015125274658203125,
0.0107421875,
-0.00030303001403808594,
-0.01012420654296875,
-0.0085601806640625,
-0.06719970703125,
-0.05908203125,
0.0966796875,
0.041473388671875,
0.06256103515625,
-0.002971649169921875,
0.06414794921875,
-0.003082275390625,
0.03460693359375,
-0.046905517578125,
0.04876708984375,
0.011505126953125,
-0.046905517578125,
-0.0174713134765625,
-0.0239715576171875,
-0.0706787109375,
0.0210113525390625,
-0.0095062255859375,
-0.0687255859375,
0.007305145263671875,
0.016693115234375,
-0.04583740234375,
0.019012451171875,
-0.050079345703125,
0.0745849609375,
0.0004818439483642578,
-0.00788116455078125,
-0.01149749755859375,
-0.039520263671875,
0.0305633544921875,
-0.00885009765625,
0.0081787109375,
-0.022247314453125,
-0.008575439453125,
0.07415771484375,
-0.0535888671875,
0.07861328125,
-0.0118865966796875,
0.00441741943359375,
0.031402587890625,
-0.0116119384765625,
0.0220184326171875,
0.020477294921875,
0.0007772445678710938,
0.03167724609375,
0.002742767333984375,
-0.0301971435546875,
-0.01708984375,
0.05462646484375,
-0.0802001953125,
-0.03338623046875,
-0.042633056640625,
-0.021087646484375,
0.01181793212890625,
0.0014057159423828125,
0.032562255859375,
0.007450103759765625,
-0.0107421875,
0.009735107421875,
0.0220794677734375,
-0.0118408203125,
0.03741455078125,
0.024444580078125,
-0.0261077880859375,
-0.020965576171875,
0.06414794921875,
0.00858306884765625,
0.0015668869018554688,
0.00598907470703125,
0.01361083984375,
-0.0135650634765625,
-0.0272369384765625,
-0.04132080078125,
0.0191802978515625,
-0.03302001953125,
-0.0195159912109375,
-0.036285400390625,
-0.02337646484375,
-0.036834716796875,
0.005130767822265625,
-0.0506591796875,
-0.041229248046875,
-0.05694580078125,
0.0124359130859375,
0.0298919677734375,
0.03472900390625,
-0.0191802978515625,
0.05267333984375,
-0.045501708984375,
0.0167999267578125,
0.03155517578125,
0.00714874267578125,
0.0141143798828125,
-0.06781005859375,
-0.034515380859375,
0.016998291015625,
-0.043426513671875,
-0.047760009765625,
0.0330810546875,
0.024749755859375,
0.0262451171875,
0.038970947265625,
-0.006687164306640625,
0.07354736328125,
-0.03045654296875,
0.0667724609375,
0.0148162841796875,
-0.06304931640625,
0.06134033203125,
-0.01157379150390625,
0.0213470458984375,
0.03216552734375,
0.0119781494140625,
-0.033233642578125,
-0.038818359375,
-0.05718994140625,
-0.052520751953125,
0.0787353515625,
0.038848876953125,
0.0186920166015625,
0.0024166107177734375,
0.0204315185546875,
-0.006084442138671875,
-0.0000966787338256836,
-0.05413818359375,
-0.03240966796875,
-0.017425537109375,
-0.0242767333984375,
-0.014984130859375,
-0.0164031982421875,
-0.0153350830078125,
-0.0338134765625,
0.054718017578125,
-0.01061248779296875,
0.04833984375,
-0.00637054443359375,
-0.0089111328125,
-0.007152557373046875,
-0.0022792816162109375,
0.053131103515625,
0.0307769775390625,
-0.0160675048828125,
-0.01151275634765625,
0.028289794921875,
-0.04437255859375,
0.01401519775390625,
0.00563812255859375,
-0.004146575927734375,
-0.002605438232421875,
0.017333984375,
0.0740966796875,
0.0209503173828125,
-0.0350341796875,
0.038360595703125,
-0.0181121826171875,
-0.007167816162109375,
-0.034088134765625,
0.01384735107421875,
0.0086822509765625,
0.032928466796875,
0.041412353515625,
-0.0072174072265625,
-0.01068878173828125,
-0.034149169921875,
-0.005275726318359375,
0.014892578125,
-0.0023670196533203125,
-0.0295867919921875,
0.07574462890625,
0.0088348388671875,
-0.0231475830078125,
0.0504150390625,
-0.00289154052734375,
-0.0213775634765625,
0.063720703125,
0.035400390625,
0.053558349609375,
0.00695037841796875,
-0.00263214111328125,
0.042449951171875,
0.03466796875,
-0.007167816162109375,
0.014556884765625,
0.00034499168395996094,
-0.040191650390625,
-0.0017719268798828125,
-0.050445556640625,
-0.033416748046875,
0.010955810546875,
-0.029266357421875,
0.03302001953125,
-0.04937744140625,
-0.015228271484375,
-0.00524139404296875,
0.0238800048828125,
-0.0631103515625,
0.00733184814453125,
0.0159759521484375,
0.0712890625,
-0.0577392578125,
0.0748291015625,
0.04754638671875,
-0.029632568359375,
-0.07476806640625,
-0.01727294921875,
0.0157012939453125,
-0.0760498046875,
0.039703369140625,
0.027130126953125,
0.0217132568359375,
0.0117034912109375,
-0.04852294921875,
-0.062744140625,
0.09808349609375,
0.0223388671875,
-0.039337158203125,
-0.01050567626953125,
0.00537109375,
0.041656494140625,
-0.035858154296875,
0.038787841796875,
0.045806884765625,
0.0224456787109375,
0.006404876708984375,
-0.08245849609375,
0.00316619873046875,
-0.0304107666015625,
0.0135040283203125,
-0.00803375244140625,
-0.0731201171875,
0.0848388671875,
-0.02166748046875,
-0.0212249755859375,
0.040771484375,
0.0687255859375,
0.0309295654296875,
0.0292816162109375,
0.032073974609375,
0.0523681640625,
0.049835205078125,
-0.0316162109375,
0.05645751953125,
-0.0220489501953125,
0.057861328125,
0.06903076171875,
0.0170440673828125,
0.061431884765625,
0.041900634765625,
-0.01641845703125,
0.0411376953125,
0.07427978515625,
-0.00787353515625,
0.04302978515625,
0.0055999755859375,
-0.00005418062210083008,
-0.005908966064453125,
0.017364501953125,
-0.045623779296875,
0.042449951171875,
0.03570556640625,
-0.0187530517578125,
-0.00896453857421875,
0.01403045654296875,
-0.0031452178955078125,
-0.034759521484375,
-0.017059326171875,
0.0552978515625,
0.0151519775390625,
-0.0171051025390625,
0.054534912109375,
0.015350341796875,
0.0765380859375,
-0.046630859375,
0.00978851318359375,
-0.027923583984375,
0.01617431640625,
-0.01104736328125,
-0.02587890625,
0.0120391845703125,
0.0079345703125,
0.015838623046875,
-0.00960540771484375,
0.04827880859375,
-0.02227783203125,
-0.0305633544921875,
-0.00476837158203125,
0.02349853515625,
0.0274658203125,
0.013397216796875,
-0.051483154296875,
0.0246734619140625,
-0.0080413818359375,
-0.040679931640625,
0.02301025390625,
0.00998687744140625,
0.019744873046875,
0.05194091796875,
0.0543212890625,
-0.00034046173095703125,
0.01885986328125,
-0.0228729248046875,
0.07159423828125,
-0.035430908203125,
-0.0550537109375,
-0.07672119140625,
0.037109375,
-0.0008530616760253906,
-0.045928955078125,
0.062042236328125,
0.044158935546875,
0.06304931640625,
-0.004657745361328125,
0.0258331298828125,
-0.0203704833984375,
0.0195770263671875,
-0.0276336669921875,
0.06353759765625,
-0.054840087890625,
0.02117919921875,
-0.007068634033203125,
-0.053741455078125,
-0.007598876953125,
0.07568359375,
-0.01160430908203125,
-0.001033782958984375,
0.0288238525390625,
0.07183837890625,
-0.00424957275390625,
0.0120697021484375,
-0.001922607421875,
0.01885986328125,
0.02581787109375,
0.06103515625,
0.06500244140625,
-0.0634765625,
0.05615234375,
-0.034942626953125,
-0.0212249755859375,
-0.036346435546875,
-0.041595458984375,
-0.0677490234375,
-0.037506103515625,
-0.0238189697265625,
-0.022857666015625,
-0.015869140625,
0.0904541015625,
0.0628662109375,
-0.054840087890625,
-0.0291290283203125,
0.0110931396484375,
0.00046443939208984375,
-0.0013456344604492188,
-0.01528167724609375,
0.0175933837890625,
-0.0206756591796875,
-0.0721435546875,
0.0223541259765625,
0.00783538818359375,
0.0223541259765625,
-0.02923583984375,
-0.0149688720703125,
-0.01253509521484375,
0.0011043548583984375,
0.0291595458984375,
0.036773681640625,
-0.0540771484375,
-0.028411865234375,
-0.020172119140625,
-0.03607177734375,
0.0059051513671875,
0.03790283203125,
-0.057098388671875,
0.023834228515625,
0.024383544921875,
0.0234222412109375,
0.06317138671875,
-0.03466796875,
0.01328277587890625,
-0.058685302734375,
0.048736572265625,
0.00299072265625,
0.021575927734375,
0.00470733642578125,
-0.007099151611328125,
0.04547119140625,
0.01548004150390625,
-0.045562744140625,
-0.07611083984375,
-0.01322174072265625,
-0.07257080078125,
0.00872039794921875,
0.07293701171875,
-0.006561279296875,
-0.011138916015625,
0.01250457763671875,
-0.0307159423828125,
0.03271484375,
-0.02410888671875,
0.071044921875,
0.0303955078125,
-0.0005712509155273438,
-0.00494384765625,
-0.0289306640625,
0.0299072265625,
0.02545166015625,
-0.060821533203125,
-0.0277099609375,
0.01194000244140625,
0.025115966796875,
0.01451873779296875,
0.089599609375,
0.01654052734375,
0.027313232421875,
0.01001739501953125,
-0.0095062255859375,
-0.01457977294921875,
-0.0191497802734375,
-0.02978515625,
0.004459381103515625,
-0.0065765380859375,
-0.035736083984375
]
] |
m-a-p/MERT-v1-330M | 2023-06-02T13:50:23.000Z | [
"transformers",
"pytorch",
"mert_model",
"feature-extraction",
"music",
"custom_code",
"arxiv:2306.00107",
"license:cc-by-nc-4.0",
"region:us"
] | feature-extraction | m-a-p | null | null | m-a-p/MERT-v1-330M | 30 | 13,560 | transformers | 2023-03-17T12:07:01 | ---
license: cc-by-nc-4.0
inference: false
tags:
- music
---
# Introduction to our series work
The development log of our Music Audio Pre-training (m-a-p) model family:
- 02/06/2023: [arxiv pre-print](https://arxiv.org/abs/2306.00107) and training [codes](https://github.com/yizhilll/MERT) released.
- 17/03/2023: we release two advanced music understanding models, [MERT-v1-95M](https://huggingface.co/m-a-p/MERT-v1-95M) and [MERT-v1-330M](https://huggingface.co/m-a-p/MERT-v1-330M), trained with a new paradigm and dataset. They outperform the previous models and generalize better to more tasks.
- 14/03/2023: we retrained the MERT-v0 model with the open-source-only music dataset [MERT-v0-public](https://huggingface.co/m-a-p/MERT-v0-public).
- 29/12/2022: a music understanding model [MERT-v0](https://huggingface.co/m-a-p/MERT-v0) trained with the **MLM** paradigm, which performs better at downstream tasks.
- 29/10/2022: a pre-trained MIR model [music2vec](https://huggingface.co/m-a-p/music2vec-v1) trained with the **BYOL** paradigm.
Here is a table for quick model pick-up:
| Name | Pre-train Paradigm | Training Data (hour) | Pre-train Context (second) | Model Size | Transformer Layer-Dimension | Feature Rate | Sample Rate | Release Date |
| ------------------------------------------------------------ | ------------------ | -------------------- | ---------------------------- | ---------- | --------------------------- | ------------ | ----------- | ------------ |
| [MERT-v1-330M](https://huggingface.co/m-a-p/MERT-v1-330M) | MLM | 160K | 5 | 330M | 24-1024 | 75 Hz | 24 kHz | 17/03/2023 |
| [MERT-v1-95M](https://huggingface.co/m-a-p/MERT-v1-95M) | MLM | 20K | 5 | 95M | 12-768 | 75 Hz | 24 kHz | 17/03/2023 |
| [MERT-v0-public](https://huggingface.co/m-a-p/MERT-v0-public) | MLM | 900 | 5 | 95M | 12-768 | 50 Hz | 16 kHz | 14/03/2023 |
| [MERT-v0](https://huggingface.co/m-a-p/MERT-v0) | MLM | 1000 | 5 | 95M | 12-768 | 50 Hz | 16 kHz | 29/12/2022 |
| [music2vec-v1](https://huggingface.co/m-a-p/music2vec-v1) | BYOL | 1000 | 30 | 95M | 12-768 | 50 Hz | 16 kHz | 30/10/2022 |
## Explanation
The m-a-p models share a similar model architecture; the most notable difference between them is the pre-training paradigm. Beyond that, there are several nuanced technical configurations to know before use:
- **Model Size**: the number of parameters loaded into memory. Please select a size appropriate for your hardware.
- **Transformer Layer-Dimension**: the number of transformer layers and the corresponding feature dimension output by our model. This is highlighted because features extracted by **different layers can perform differently depending on the task**.
- **Feature Rate**: Given a 1-second audio input, the number of features output by the model.
- **Sample Rate**: The frequency of audio that the model is trained with.
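Together, clip length and feature rate determine roughly how many feature frames to expect from the model. A quick sanity-check sketch (exact counts may differ by a few frames because of convolutional padding in the real model):

```python
# Rough sketch: estimating the number of output feature frames for a clip,
# using the feature rates from the table above. Real counts can differ
# slightly due to convolutional padding.

def expected_frames(clip_seconds, feature_rate_hz):
    return int(clip_seconds * feature_rate_hz)

# MERT-v1 models run at 75 Hz, so a 5-second pre-training context
# yields roughly 375 frames.
print(expected_frames(5, 75))  # 375

# MERT-v0 models run at 50 Hz, so the same clip yields roughly 250 frames.
print(expected_frames(5, 50))  # 250
```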
# Introduction to MERT-v1
Compared to MERT-v0, we introduce several changes in the MERT-v1 pre-training:
- Changed the pseudo labels to 8 codebooks from [encodec](https://github.com/facebookresearch/encodec), which potentially offer higher quality and empower our model to support music generation.
- MLM prediction with in-batch noise mixture.
- Training with a higher audio frequency (24 kHz).
- Training with more audio data (up to 160 thousand hours).
- More available model sizes: 95M and 330M.
More details will appear in our upcoming paper.
# Model Usage
```python
# from transformers import Wav2Vec2Processor
from transformers import Wav2Vec2FeatureExtractor
from transformers import AutoModel
import torch
from torch import nn
import torchaudio.transforms as T
from datasets import load_dataset
# loading our model weights
model = AutoModel.from_pretrained("m-a-p/MERT-v1-330M", trust_remote_code=True)
# loading the corresponding preprocessor config
processor = Wav2Vec2FeatureExtractor.from_pretrained("m-a-p/MERT-v1-330M", trust_remote_code=True)
# load demo audio and set processor
dataset = load_dataset("hf-internal-testing/librispeech_asr_demo", "clean", split="validation")
dataset = dataset.sort("id")
sampling_rate = dataset.features["audio"].sampling_rate
resample_rate = processor.sampling_rate
# make sure the sample rates are aligned
if resample_rate != sampling_rate:
print(f'setting rate from {sampling_rate} to {resample_rate}')
resampler = T.Resample(sampling_rate, resample_rate)
else:
resampler = None
# audio file is decoded on the fly
if resampler is None:
input_audio = dataset[0]["audio"]["array"]
else:
input_audio = resampler(torch.from_numpy(dataset[0]["audio"]["array"]))
inputs = processor(input_audio, sampling_rate=resample_rate, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs, output_hidden_states=True)
# take a look at the output shape, there are 25 layers of representation
# each layer performs differently in different downstream tasks, you should choose empirically
all_layer_hidden_states = torch.stack(outputs.hidden_states).squeeze()
print(all_layer_hidden_states.shape) # [25 layer, Time steps, 1024 feature_dim]
# for utterance level classification tasks, you can simply reduce the representation in time
time_reduced_hidden_states = all_layer_hidden_states.mean(-2)
print(time_reduced_hidden_states.shape) # [25, 1024]
# you can even use a learnable weighted average representation
aggregator = nn.Conv1d(in_channels=25, out_channels=1, kernel_size=1)
weighted_avg_hidden_states = aggregator(time_reduced_hidden_states.unsqueeze(0)).squeeze()
print(weighted_avg_hidden_states.shape) # [1024]
```
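The `Conv1d` aggregator with `kernel_size=1` in the snippet above is simply a learnable weighted sum over the 25 per-layer means. A plain-Python sketch of that computation (toy numbers, bias term ignored):

```python
# Toy sketch of what the 1x1 Conv1d aggregator computes: a weighted sum of the
# per-layer time-averaged feature vectors (ignoring the learnable bias term).
# The numbers below are illustrative, not real MERT features.

def weighted_average(layer_features, weights):
    """layer_features: list of per-layer vectors (n_layers x feature_dim)."""
    feature_dim = len(layer_features[0])
    out = [0.0] * feature_dim
    for w, layer in zip(weights, layer_features):
        for i, value in enumerate(layer):
            out[i] += w * value
    return out

layers = [[1.0, 2.0], [3.0, 4.0]]   # 2 layers, feature_dim = 2
weights = [0.25, 0.75]
print(weighted_average(layers, weights))  # [2.5, 3.5]
```

During training, the `Conv1d` learns these per-layer weights end-to-end, so layers that are more useful for the downstream task receive larger weights.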
# Citation
```bibtex
@misc{li2023mert,
title={MERT: Acoustic Music Understanding Model with Large-Scale Self-supervised Training},
author={Yizhi Li and Ruibin Yuan and Ge Zhang and Yinghao Ma and Xingran Chen and Hanzhi Yin and Chenghua Lin and Anton Ragni and Emmanouil Benetos and Norbert Gyenge and Roger Dannenberg and Ruibo Liu and Wenhu Chen and Gus Xia and Yemin Shi and Wenhao Huang and Yike Guo and Jie Fu},
year={2023},
eprint={2306.00107},
archivePrefix={arXiv},
primaryClass={cs.SD}
}
``` | 6,725 | [
[
-0.0513916015625,
-0.033294677734375,
0.015655517578125,
0.00868988037109375,
-0.0080413818359375,
-0.01209259033203125,
-0.02093505859375,
-0.022003173828125,
0.01253509521484375,
0.0259246826171875,
-0.06689453125,
-0.03497314453125,
-0.038330078125,
-0.01444244384765625,
-0.0143890380859375,
0.072998046875,
0.0119781494140625,
0.005283355712890625,
0.0097808837890625,
-0.0139923095703125,
-0.04315185546875,
-0.032928466796875,
-0.051605224609375,
-0.026947021484375,
0.01503753662109375,
0.02777099609375,
0.02288818359375,
0.0406494140625,
0.04461669921875,
0.0233154296875,
-0.0140838623046875,
-0.018035888671875,
-0.021820068359375,
0.0009431838989257812,
0.0289306640625,
-0.0264129638671875,
-0.05853271484375,
0.0206298828125,
0.033233642578125,
0.0251007080078125,
0.000720977783203125,
0.044219970703125,
0.01012420654296875,
0.047821044921875,
-0.0165252685546875,
0.00925445556640625,
-0.0219573974609375,
0.01444244384765625,
-0.007671356201171875,
-0.0258636474609375,
-0.0306243896484375,
0.0014019012451171875,
0.0176544189453125,
-0.052490234375,
0.01026153564453125,
0.0035343170166015625,
0.07952880859375,
0.03399658203125,
-0.0219573974609375,
-0.01360321044921875,
-0.05755615234375,
0.06219482421875,
-0.059234619140625,
0.055816650390625,
0.044830322265625,
0.033172607421875,
0.01641845703125,
-0.046142578125,
-0.030853271484375,
-0.020263671875,
-0.0019140243530273438,
0.0274200439453125,
-0.006732940673828125,
-0.00959014892578125,
0.032257080078125,
0.046630859375,
-0.049224853515625,
-0.001186370849609375,
-0.042755126953125,
-0.01995849609375,
0.053131103515625,
-0.00897216796875,
0.006404876708984375,
-0.0220489501953125,
-0.040985107421875,
-0.0330810546875,
-0.03521728515625,
0.019195556640625,
0.0258941650390625,
-0.00034427642822265625,
-0.036163330078125,
0.02081298828125,
0.00799560546875,
0.040374755859375,
0.025360107421875,
-0.036163330078125,
0.05035400390625,
-0.022125244140625,
-0.0219573974609375,
0.0094757080078125,
0.06085205078125,
0.01111602783203125,
-0.0017137527465820312,
0.01470184326171875,
-0.0276336669921875,
-0.00762939453125,
0.001056671142578125,
-0.057220458984375,
-0.033416748046875,
0.0202178955078125,
-0.045135498046875,
-0.01049041748046875,
0.00039839744567871094,
-0.03302001953125,
0.00289154052734375,
-0.0452880859375,
0.06695556640625,
-0.04681396484375,
-0.034576416015625,
0.01535797119140625,
-0.0071868896484375,
0.0228729248046875,
-0.003284454345703125,
-0.052459716796875,
0.0185089111328125,
0.03173828125,
0.054718017578125,
0.0030651092529296875,
-0.020050048828125,
-0.00757598876953125,
-0.007442474365234375,
-0.003238677978515625,
0.025390625,
0.007541656494140625,
-0.017578125,
-0.0347900390625,
-0.0002541542053222656,
-0.01111602783203125,
-0.03955078125,
0.049652099609375,
-0.0305938720703125,
0.03546142578125,
-0.0211639404296875,
-0.041839599609375,
-0.03631591796875,
-0.0025463104248046875,
-0.03955078125,
0.07415771484375,
0.022705078125,
-0.064697265625,
0.031494140625,
-0.054595947265625,
-0.0139617919921875,
-0.031036376953125,
0.01123809814453125,
-0.05853271484375,
-0.0098876953125,
0.0207061767578125,
0.034881591796875,
-0.02423095703125,
0.006214141845703125,
-0.0160369873046875,
-0.03680419921875,
0.016357421875,
-0.0295867919921875,
0.07177734375,
0.0335693359375,
-0.045623779296875,
0.0161285400390625,
-0.08251953125,
-0.00695037841796875,
0.016845703125,
-0.031402587890625,
0.004669189453125,
-0.0220947265625,
0.0255279541015625,
0.035675048828125,
0.007904052734375,
-0.02813720703125,
-0.00609588623046875,
-0.0288848876953125,
0.044830322265625,
0.04669189453125,
0.0010509490966796875,
0.02569580078125,
-0.046783447265625,
0.035675048828125,
-0.005809783935546875,
0.007537841796875,
-0.021148681640625,
-0.021636962890625,
-0.057281494140625,
-0.03228759765625,
0.0296630859375,
0.047515869140625,
-0.013458251953125,
0.06201171875,
-0.0102691650390625,
-0.052032470703125,
-0.0660400390625,
-0.003376007080078125,
0.03131103515625,
0.03912353515625,
0.04364013671875,
-0.0280609130859375,
-0.060333251953125,
-0.07318115234375,
0.0023212432861328125,
0.002960205078125,
-0.0225067138671875,
0.034393310546875,
0.027191162109375,
-0.00926971435546875,
0.0625,
-0.0216064453125,
-0.0199432373046875,
-0.0137939453125,
0.0093994140625,
0.045806884765625,
0.06365966796875,
0.047393798828125,
-0.044891357421875,
-0.0251617431640625,
-0.0257110595703125,
-0.05377197265625,
-0.009124755859375,
-0.011566162109375,
-0.0106964111328125,
-0.0014934539794921875,
0.0206298828125,
-0.044097900390625,
0.018951416015625,
0.0288543701171875,
-0.0258941650390625,
0.050567626953125,
0.006702423095703125,
0.0225982666015625,
-0.079833984375,
0.00501251220703125,
-0.0008416175842285156,
-0.007659912109375,
-0.041595458984375,
-0.01715087890625,
-0.00860595703125,
-0.005950927734375,
-0.042327880859375,
0.0185546875,
-0.0406494140625,
-0.0239715576171875,
-0.016815185546875,
-0.0027027130126953125,
-0.006683349609375,
0.0687255859375,
0.004398345947265625,
0.052581787109375,
0.055389404296875,
-0.05126953125,
0.024383544921875,
0.01629638671875,
-0.032958984375,
0.03204345703125,
-0.06268310546875,
0.0166778564453125,
0.007274627685546875,
0.0260467529296875,
-0.071044921875,
-0.0031299591064453125,
0.0193939208984375,
-0.05792236328125,
0.041259765625,
-0.021240234375,
-0.0318603515625,
-0.045623779296875,
-0.00487518310546875,
0.04010009765625,
0.05792236328125,
-0.036651611328125,
0.0228729248046875,
0.0252838134765625,
0.01222991943359375,
-0.0303497314453125,
-0.051025390625,
-0.0355224609375,
-0.045013427734375,
-0.063232421875,
0.0223388671875,
-0.005268096923828125,
-0.004512786865234375,
-0.0086212158203125,
-0.0227203369140625,
0.001171112060546875,
0.007030487060546875,
0.036651611328125,
0.013519287109375,
-0.035888671875,
0.01076507568359375,
-0.0166015625,
-0.021514892578125,
0.0173797607421875,
-0.03887939453125,
0.0660400390625,
-0.01605224609375,
-0.0301666259765625,
-0.0703125,
0.0008645057678222656,
0.04315185546875,
-0.03070068359375,
0.02972412109375,
0.075439453125,
-0.01242828369140625,
0.01465606689453125,
-0.037353515625,
-0.02459716796875,
-0.03753662109375,
0.0283966064453125,
-0.031158447265625,
-0.050872802734375,
0.04254150390625,
0.0136566162109375,
0.0081634521484375,
0.0570068359375,
0.032806396484375,
-0.019927978515625,
0.06634521484375,
0.0279083251953125,
-0.00904083251953125,
0.03924560546875,
-0.06494140625,
-0.01161956787109375,
-0.0693359375,
-0.01215362548828125,
-0.022857666015625,
-0.045013427734375,
-0.044097900390625,
-0.0262298583984375,
0.041900634765625,
0.003520965576171875,
-0.05029296875,
0.044921875,
-0.04132080078125,
0.002227783203125,
0.0648193359375,
0.01519012451171875,
-0.007053375244140625,
-0.0012111663818359375,
0.00397491455078125,
-0.006587982177734375,
-0.039703369140625,
-0.01499176025390625,
0.10198974609375,
0.05194091796875,
0.037628173828125,
-0.00403594970703125,
0.060638427734375,
0.005550384521484375,
-0.010162353515625,
-0.056243896484375,
0.0200347900390625,
0.01486968994140625,
-0.05169677734375,
-0.0209808349609375,
-0.0185394287109375,
-0.05157470703125,
0.0272216796875,
-0.02496337890625,
-0.038543701171875,
0.01490020751953125,
0.011444091796875,
-0.03228759765625,
0.02996826171875,
-0.043426513671875,
0.05963134765625,
-0.01535797119140625,
-0.01477813720703125,
-0.00545501708984375,
-0.0540771484375,
0.03338623046875,
0.0008802413940429688,
0.020263671875,
-0.01015472412109375,
0.035003662109375,
0.08953857421875,
-0.0288543701171875,
0.03411865234375,
-0.031494140625,
-0.0015163421630859375,
0.043701171875,
-0.0011320114135742188,
0.0224609375,
-0.0010328292846679688,
0.0006036758422851562,
0.00868988037109375,
-0.007534027099609375,
-0.0268096923828125,
-0.0290069580078125,
0.031982421875,
-0.0738525390625,
-0.029541015625,
-0.0135498046875,
-0.058380126953125,
-0.016845703125,
0.0037975311279296875,
0.052734375,
0.045562744140625,
0.01151275634765625,
0.0262451171875,
0.054412841796875,
-0.01666259765625,
0.036590576171875,
0.0270843505859375,
-0.008087158203125,
-0.040740966796875,
0.06317138671875,
0.007801055908203125,
0.0234375,
0.018798828125,
0.016845703125,
-0.0255279541015625,
-0.032501220703125,
-0.023895263671875,
0.020050048828125,
-0.0321044921875,
-0.00905609130859375,
-0.058380126953125,
-0.01244354248046875,
-0.039337158203125,
0.0038700103759765625,
-0.04803466796875,
-0.044464111328125,
-0.0163726806640625,
-0.005596160888671875,
0.03338623046875,
0.038909912109375,
-0.0166168212890625,
0.05072021484375,
-0.054931640625,
0.0158538818359375,
0.0174713134765625,
0.01030731201171875,
-0.008148193359375,
-0.0885009765625,
-0.02606201171875,
-0.0011644363403320312,
-0.02801513671875,
-0.0750732421875,
0.0279693603515625,
0.00009876489639282227,
0.032623291015625,
0.043121337890625,
-0.0190887451171875,
0.0416259765625,
-0.02618408203125,
0.052154541015625,
0.015777587890625,
-0.07244873046875,
0.056182861328125,
-0.0374755859375,
0.01018524169921875,
0.040618896484375,
0.02593994140625,
-0.032318115234375,
-0.005359649658203125,
-0.06500244140625,
-0.05877685546875,
0.07818603515625,
0.017608642578125,
-0.0217132568359375,
0.03948974609375,
0.01517486572265625,
0.0023288726806640625,
0.017608642578125,
-0.06927490234375,
-0.04913330078125,
-0.035888671875,
-0.002178192138671875,
-0.02679443359375,
-0.00904083251953125,
-0.0031795501708984375,
-0.05487060546875,
0.06756591796875,
0.024200439453125,
0.041473388671875,
0.0164642333984375,
-0.00934600830078125,
0.0005006790161132812,
0.01403045654296875,
0.04608154296875,
0.027740478515625,
-0.042083740234375,
-0.0008282661437988281,
0.01235198974609375,
-0.03802490234375,
0.0235443115234375,
-0.0024509429931640625,
-0.002559661865234375,
-0.0024585723876953125,
0.03692626953125,
0.08685302734375,
0.0223846435546875,
-0.0173187255859375,
0.03759765625,
0.013275146484375,
-0.0272979736328125,
-0.0379638671875,
0.005992889404296875,
0.0191650390625,
0.0213623046875,
0.01395416259765625,
0.01397705078125,
0.01197052001953125,
-0.0272979736328125,
0.039947509765625,
0.006885528564453125,
-0.039215087890625,
-0.0286712646484375,
0.06854248046875,
-0.005939483642578125,
-0.0192108154296875,
0.043701171875,
-0.01457977294921875,
-0.043792724609375,
0.062103271484375,
0.0413818359375,
0.071533203125,
-0.03363037109375,
-0.01171875,
0.05560302734375,
0.0235137939453125,
0.0043487548828125,
0.020477294921875,
-0.0219268798828125,
-0.0271148681640625,
-0.0175933837890625,
-0.054656982421875,
0.0030460357666015625,
0.027435302734375,
-0.06622314453125,
0.0223388671875,
-0.027984619140625,
-0.01763916015625,
-0.004795074462890625,
0.0209197998046875,
-0.057769775390625,
0.0258026123046875,
0.005603790283203125,
0.07757568359375,
-0.07275390625,
0.07232666015625,
0.030242919921875,
-0.039154052734375,
-0.08819580078125,
-0.0094757080078125,
0.01555633544921875,
-0.05224609375,
0.04595947265625,
0.01165771484375,
0.007221221923828125,
0.00714111328125,
-0.039276123046875,
-0.08172607421875,
0.10443115234375,
0.0098419189453125,
-0.04833984375,
0.01373291015625,
-0.00670623779296875,
0.034759521484375,
-0.0294036865234375,
0.0214691162109375,
0.06280517578125,
0.0445556640625,
0.0131683349609375,
-0.07452392578125,
-0.0081329345703125,
-0.0280609130859375,
-0.0265655517578125,
0.003215789794921875,
-0.049041748046875,
0.098388671875,
-0.01776123046875,
-0.0011816024780273438,
-0.0173797607421875,
0.06317138671875,
0.031158447265625,
0.03192138671875,
0.05810546875,
0.05810546875,
0.048095703125,
-0.01320648193359375,
0.0689697265625,
-0.0281982421875,
0.0390625,
0.0799560546875,
0.02288818359375,
0.044403076171875,
0.01555633544921875,
-0.037811279296875,
0.01555633544921875,
0.0679931640625,
-0.0188140869140625,
0.043365478515625,
0.0223388671875,
-0.005893707275390625,
-0.0225677490234375,
0.0180816650390625,
-0.06622314453125,
0.048248291015625,
0.012298583984375,
-0.0288238525390625,
0.0228729248046875,
0.0214385986328125,
0.0017480850219726562,
-0.026519775390625,
-0.015777587890625,
0.042724609375,
0.0004830360412597656,
-0.0293731689453125,
0.0665283203125,
-0.005603790283203125,
0.06805419921875,
-0.02655029296875,
0.0207977294921875,
-0.0013294219970703125,
-0.0028438568115234375,
-0.0175628662109375,
-0.039794921875,
0.00738525390625,
-0.0226898193359375,
-0.0252532958984375,
-0.0129547119140625,
0.035614013671875,
-0.038116455078125,
-0.0235137939453125,
0.02960205078125,
0.0294036865234375,
0.027252197265625,
-0.0137939453125,
-0.060302734375,
0.002979278564453125,
0.00373077392578125,
-0.039642333984375,
0.007781982421875,
0.0172882080078125,
0.0270843505859375,
0.035736083984375,
0.041229248046875,
0.00629425048828125,
0.0276641845703125,
0.0001417398452758789,
0.046295166015625,
-0.05047607421875,
-0.042755126953125,
-0.035308837890625,
0.031951904296875,
0.016937255859375,
-0.0244293212890625,
0.0538330078125,
0.050750732421875,
0.07232666015625,
-0.0098114013671875,
0.0426025390625,
-0.0004184246063232422,
0.049407958984375,
-0.042572021484375,
0.064208984375,
-0.053375244140625,
0.0306396484375,
-0.0283050537109375,
-0.06915283203125,
0.00829315185546875,
0.05010986328125,
-0.01922607421875,
0.024932861328125,
0.031524658203125,
0.04876708984375,
-0.0193939208984375,
0.0089874267578125,
0.005725860595703125,
0.023681640625,
0.01177215576171875,
0.0626220703125,
0.053863525390625,
-0.0654296875,
0.03338623046875,
-0.057342529296875,
-0.01253509521484375,
-0.0108184814453125,
-0.0247344970703125,
-0.053558349609375,
-0.05596923828125,
-0.024169921875,
-0.02703857421875,
-0.0149688720703125,
0.083251953125,
0.0718994140625,
-0.059539794921875,
-0.029052734375,
0.015533447265625,
0.0013341903686523438,
-0.038055419921875,
-0.0174560546875,
0.055389404296875,
-0.0055084228515625,
-0.056884765625,
0.04571533203125,
-0.007579803466796875,
0.01971435546875,
-0.01215362548828125,
-0.02618408203125,
-0.0175628662109375,
-0.006343841552734375,
0.0296630859375,
0.012420654296875,
-0.042449951171875,
-0.016021728515625,
-0.0038623809814453125,
0.0014324188232421875,
0.0276641845703125,
0.04583740234375,
-0.04254150390625,
0.0123138427734375,
0.0401611328125,
0.01508331298828125,
0.05084228515625,
-0.01000213623046875,
0.00981903076171875,
-0.04327392578125,
0.0172882080078125,
0.01220703125,
0.0294342041015625,
0.0114593505859375,
-0.01324462890625,
0.03863525390625,
0.0322265625,
-0.0538330078125,
-0.06646728515625,
-0.0036220550537109375,
-0.09326171875,
-0.00962066650390625,
0.07666015625,
-0.0029811859130859375,
-0.016815185546875,
0.005748748779296875,
-0.0162200927734375,
0.043853759765625,
-0.0194549560546875,
0.028228759765625,
0.035858154296875,
-0.01490020751953125,
-0.00797271728515625,
-0.061279296875,
0.054595947265625,
0.03448486328125,
-0.034942626953125,
-0.016937255859375,
0.03143310546875,
0.0413818359375,
0.01812744140625,
0.058502197265625,
-0.0165557861328125,
0.0298004150390625,
0.0129241943359375,
0.0234375,
-0.02569580078125,
-0.034637451171875,
-0.036285400390625,
0.0158233642578125,
-0.016204833984375,
-0.042327880859375
]
] |
Habana/t5 | 2023-09-08T13:39:25.000Z | [
"optimum_habana",
"license:apache-2.0",
"region:us"
] | null | Habana | null | null | Habana/t5 | 0 | 13,547 | null | 2022-06-04T15:43:41 | ---
license: apache-2.0
---
[Optimum Habana](https://github.com/huggingface/optimum-habana) is the interface between the Hugging Face Transformers and Diffusers libraries and Habana's Gaudi processor (HPU).
It provides a set of tools enabling easy and fast model loading, training and inference in single- and multi-HPU settings for different downstream tasks.
Learn more about how to take advantage of the power of Habana HPUs to train and deploy Transformers and Diffusers models at [hf.co/hardware/habana](https://huggingface.co/hardware/habana).
## T5 model HPU configuration
This model only contains the `GaudiConfig` file for running the [T5](https://huggingface.co/t5-base) model on Habana's Gaudi processors (HPU).
**This model contains no model weights, only a GaudiConfig.**
It enables you to specify:
- `use_fused_adam`: whether to use Habana's custom AdamW implementation
- `use_fused_clip_norm`: whether to use Habana's fused gradient norm clipping operator
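Conceptually, the `GaudiConfig` shipped here therefore reduces to the two switches above. The dict below is only an illustrative Python sketch of that content, not the actual on-disk file format:

```python
# Illustrative sketch only -- not the real GaudiConfig file format.
gaudi_config = {
    "use_fused_adam": True,       # Habana's custom AdamW implementation
    "use_fused_clip_norm": True,  # Habana's fused gradient-norm clipping
}
print(sorted(gaudi_config))  # ['use_fused_adam', 'use_fused_clip_norm']
```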
## Usage
The model is instantiated the same way as in the Transformers library.
The only difference is that there are a few new training arguments specific to HPUs.
[Here](https://github.com/huggingface/optimum-habana/blob/main/examples/summarization/run_summarization.py) is a summarization example script to fine-tune a model. You can run it with T5-small with the following command:
```bash
python run_summarization.py \
--model_name_or_path t5-small \
--do_train \
--do_eval \
--dataset_name cnn_dailymail \
--dataset_config "3.0.0" \
--source_prefix "summarize: " \
--output_dir /tmp/tst-summarization \
--per_device_train_batch_size 4 \
--per_device_eval_batch_size 4 \
--overwrite_output_dir \
--predict_with_generate \
--use_habana \
--use_lazy_mode \
--gaudi_config_name Habana/t5 \
--ignore_pad_token_for_loss False \
--pad_to_max_length \
--save_strategy epoch \
--throughput_warmup_steps 3
```
Check out the [documentation](https://huggingface.co/docs/optimum/habana/index) for more advanced usage and examples.
| 2,075 | [
[
-0.058685302734375,
-0.057403564453125,
0.0211181640625,
0.0163116455078125,
-0.0217132568359375,
-0.00399017333984375,
-0.002330780029296875,
-0.03118896484375,
0.0233917236328125,
0.0212860107421875,
-0.039031982421875,
-0.0089874267578125,
-0.0294189453125,
-0.01177978515625,
-0.026275634765625,
0.08807373046875,
-0.0028896331787109375,
-0.024261474609375,
-0.0189666748046875,
-0.017608642578125,
-0.0313720703125,
-0.0278778076171875,
-0.06903076171875,
-0.0307159423828125,
0.020233154296875,
0.01468658447265625,
0.0682373046875,
0.036468505859375,
0.040283203125,
0.029266357421875,
-0.02423095703125,
0.00774383544921875,
-0.0350341796875,
-0.01453399658203125,
0.00849151611328125,
-0.02764892578125,
-0.045135498046875,
0.00614166259765625,
0.045684814453125,
0.0175323486328125,
-0.0156402587890625,
0.036651611328125,
0.0014429092407226562,
0.042816162109375,
-0.05291748046875,
-0.018768310546875,
-0.00780487060546875,
0.016632080078125,
-0.011962890625,
-0.020172119140625,
-0.006805419921875,
-0.0261077880859375,
0.0026035308837890625,
-0.04083251953125,
0.0225677490234375,
0.01256561279296875,
0.107177734375,
0.06640625,
-0.03558349609375,
0.020538330078125,
-0.056365966796875,
0.0623779296875,
-0.03936767578125,
0.0142974853515625,
0.01953125,
0.045684814453125,
-0.007251739501953125,
-0.064208984375,
-0.0229034423828125,
0.0090179443359375,
-0.00875091552734375,
0.0243682861328125,
-0.03704833984375,
0.0238494873046875,
0.03375244140625,
0.059417724609375,
-0.0236053466796875,
-0.004306793212890625,
-0.042572021484375,
-0.014007568359375,
0.0347900390625,
0.0103912353515625,
0.006053924560546875,
-0.026214599609375,
-0.034423828125,
-0.003414154052734375,
-0.025848388671875,
-0.001087188720703125,
0.0251007080078125,
-0.01446533203125,
-0.039703369140625,
0.02069091796875,
-0.000453948974609375,
0.060272216796875,
0.0106048583984375,
-0.0158233642578125,
0.04119873046875,
-0.0064544677734375,
-0.04913330078125,
-0.01422119140625,
0.05169677734375,
0.016448974609375,
0.004329681396484375,
0.00620269775390625,
-0.0239410400390625,
0.002044677734375,
0.055633544921875,
-0.06329345703125,
-0.0231475830078125,
0.039886474609375,
-0.050079345703125,
-0.05621337890625,
-0.0281982421875,
-0.060516357421875,
0.00589752197265625,
-0.01384735107421875,
0.05767822265625,
-0.0174560546875,
-0.00803375244140625,
-0.007534027099609375,
-0.03387451171875,
0.019775390625,
0.0230560302734375,
-0.06475830078125,
0.031463623046875,
0.031829833984375,
0.078369140625,
-0.0107574462890625,
-0.0162353515625,
-0.02850341796875,
-0.00031447410583496094,
-0.0073394775390625,
0.047821044921875,
-0.00478363037109375,
-0.028350830078125,
-0.0073394775390625,
0.005237579345703125,
-0.0082855224609375,
-0.041259765625,
0.0701904296875,
-0.0230255126953125,
0.03753662109375,
0.01385498046875,
-0.03192138671875,
-0.0305328369140625,
-0.004482269287109375,
-0.04913330078125,
0.099609375,
0.02972412109375,
-0.05889892578125,
0.011871337890625,
-0.055816650390625,
-0.039520263671875,
-0.0156402587890625,
0.0082244873046875,
-0.0633544921875,
0.0160980224609375,
0.006099700927734375,
0.0333251953125,
-0.0160980224609375,
0.00922393798828125,
-0.0203094482421875,
-0.0273590087890625,
-0.00949859619140625,
-0.0129241943359375,
0.0889892578125,
0.0298614501953125,
-0.0197906494140625,
0.040283203125,
-0.0579833984375,
-0.019500732421875,
0.01373291015625,
-0.0328369140625,
-0.00955963134765625,
-0.024444580078125,
0.0113525390625,
0.00458526611328125,
0.015777587890625,
-0.0306396484375,
0.00360107421875,
0.0023136138916015625,
0.052734375,
0.06072998046875,
0.00252532958984375,
0.0229949951171875,
-0.051361083984375,
0.04852294921875,
-0.02630615234375,
0.059722900390625,
0.003711700439453125,
-0.0494384765625,
-0.07708740234375,
-0.035888671875,
-0.0033168792724609375,
0.0289306640625,
-0.01849365234375,
0.031646728515625,
0.009063720703125,
-0.04571533203125,
-0.057220458984375,
-0.005584716796875,
0.01279449462890625,
0.060821533203125,
0.03509521484375,
-0.00836944580078125,
-0.06103515625,
-0.079833984375,
0.006748199462890625,
0.006214141845703125,
-0.00411224365234375,
0.037872314453125,
0.042022705078125,
-0.0161895751953125,
0.054412841796875,
-0.0401611328125,
-0.035919189453125,
-0.0082244873046875,
-0.002838134765625,
0.05279541015625,
0.03326416015625,
0.057647705078125,
-0.06317138671875,
-0.0295562744140625,
-0.017791748046875,
-0.061981201171875,
-0.00003325939178466797,
-0.00678253173828125,
-0.042572021484375,
0.01287078857421875,
0.0210113525390625,
-0.056671142578125,
0.029205322265625,
0.054229736328125,
-0.0157012939453125,
0.04937744140625,
-0.0281829833984375,
-0.00131988525390625,
-0.09161376953125,
0.026702880859375,
-0.006870269775390625,
-0.0428466796875,
-0.04339599609375,
0.01154327392578125,
0.0007114410400390625,
-0.002178192138671875,
-0.03759765625,
0.041748046875,
-0.0204925537109375,
-0.002437591552734375,
-0.01800537109375,
-0.02447509765625,
0.0136260986328125,
0.049835205078125,
-0.016510009765625,
0.062164306640625,
0.035308837890625,
-0.053314208984375,
0.02557373046875,
0.034698486328125,
-0.0202178955078125,
0.028656005859375,
-0.07598876953125,
0.00434112548828125,
-0.003421783447265625,
0.0135955810546875,
-0.05145263671875,
-0.036865234375,
0.006256103515625,
-0.039520263671875,
0.023651123046875,
-0.01110076904296875,
-0.0038051605224609375,
-0.03863525390625,
-0.026763916015625,
0.018463134765625,
0.08270263671875,
-0.06805419921875,
0.06842041015625,
0.061981201171875,
0.0199737548828125,
-0.0496826171875,
-0.044219970703125,
-0.00969696044921875,
-0.050811767578125,
-0.055023193359375,
0.0560302734375,
-0.0029315948486328125,
-0.0026454925537109375,
-0.0022716522216796875,
0.0013637542724609375,
-0.000858306884765625,
0.01885986328125,
0.027557373046875,
0.037445068359375,
0.01409149169921875,
-0.0146331787109375,
0.00506591796875,
-0.0130462646484375,
0.00887298583984375,
-0.0160980224609375,
0.045379638671875,
-0.0197601318359375,
0.017578125,
-0.02935791015625,
0.0028591156005859375,
0.039703369140625,
-0.0072021484375,
0.0452880859375,
0.07452392578125,
-0.029541015625,
-0.01364898681640625,
-0.044769287109375,
-0.0030364990234375,
-0.042633056640625,
-0.0034275054931640625,
-0.0278472900390625,
-0.0401611328125,
0.048919677734375,
0.00533294677734375,
0.0167083740234375,
0.046661376953125,
0.058135986328125,
-0.0197906494140625,
0.071044921875,
0.06243896484375,
-0.01526641845703125,
0.05340576171875,
-0.0439453125,
-0.0036487579345703125,
-0.0809326171875,
-0.006046295166015625,
-0.04730224609375,
-0.018402099609375,
-0.0262908935546875,
-0.0132598876953125,
0.05853271484375,
0.0003771781921386719,
-0.032318115234375,
0.025543212890625,
-0.056549072265625,
0.02178955078125,
0.0616455078125,
0.01001739501953125,
0.006282806396484375,
0.000037789344787597656,
0.0020294189453125,
0.0338134765625,
-0.05889892578125,
-0.01549530029296875,
0.06475830078125,
0.03192138671875,
0.05181884765625,
0.0024356842041015625,
0.05023193359375,
0.0168609619140625,
-0.0005350112915039062,
-0.06329345703125,
0.0247344970703125,
0.00000667572021484375,
-0.046600341796875,
-0.005596160888671875,
-0.024383544921875,
-0.06329345703125,
0.01776123046875,
-0.01175689697265625,
-0.03717041015625,
0.0261688232421875,
0.02685546875,
-0.036834716796875,
0.021331787109375,
-0.0394287109375,
0.0667724609375,
0.0005831718444824219,
-0.032012939453125,
-0.0193328857421875,
-0.043670654296875,
0.0250244140625,
-0.0017309188842773438,
0.0043182373046875,
-0.0024738311767578125,
0.003299713134765625,
0.0675048828125,
-0.0474853515625,
0.0440673828125,
-0.0239715576171875,
0.0139007568359375,
0.017333984375,
-0.01349639892578125,
0.02557373046875,
-0.0133514404296875,
-0.00606536865234375,
0.0233001708984375,
-0.0136566162109375,
-0.033447265625,
-0.019500732421875,
0.035614013671875,
-0.0809326171875,
-0.0309600830078125,
-0.021697998046875,
-0.03955078125,
0.0106658935546875,
0.0191650390625,
0.044677734375,
0.025390625,
-0.01506805419921875,
0.0007309913635253906,
0.042510986328125,
-0.014984130859375,
0.02874755859375,
-0.005229949951171875,
-0.01873779296875,
-0.0328369140625,
0.05328369140625,
-0.021392822265625,
0.02069091796875,
0.0102996826171875,
0.03289794921875,
-0.007232666015625,
-0.038299560546875,
-0.03790283203125,
0.0081634521484375,
-0.042755126953125,
-0.0228424072265625,
-0.05181884765625,
-0.00933837890625,
-0.021942138671875,
-0.0225982666015625,
-0.0298309326171875,
-0.037078857421875,
-0.01214599609375,
0.004581451416015625,
0.040618896484375,
0.019317626953125,
-0.0194854736328125,
0.040985107421875,
-0.04833984375,
0.0394287109375,
0.0090484619140625,
0.0020275115966796875,
-0.002223968505859375,
-0.0531005859375,
-0.03033447265625,
-0.0146484375,
-0.026275634765625,
-0.05621337890625,
0.037017822265625,
0.0338134765625,
0.044677734375,
0.031005859375,
-0.0049896240234375,
0.045379638671875,
-0.0219573974609375,
0.04608154296875,
-0.004756927490234375,
-0.05706787109375,
0.0450439453125,
-0.03466796875,
0.0214080810546875,
0.0390625,
0.041717529296875,
-0.0222320556640625,
-0.0136260986328125,
-0.05877685546875,
-0.0667724609375,
0.05242919921875,
0.0163726806640625,
-0.001247406005859375,
0.02557373046875,
0.0143585205078125,
-0.008331298828125,
0.0169677734375,
-0.02703857421875,
-0.0154876708984375,
-0.0211334228515625,
0.0004534721374511719,
0.005176544189453125,
-0.005817413330078125,
-0.034759521484375,
-0.03436279296875,
0.0787353515625,
-0.003063201904296875,
0.03955078125,
0.0179443359375,
-0.0004944801330566406,
-0.019744873046875,
-0.0232696533203125,
0.00974273681640625,
0.024017333984375,
-0.046966552734375,
-0.0167999267578125,
0.0109405517578125,
-0.0341796875,
-0.0089569091796875,
0.001338958740234375,
-0.0306396484375,
0.007293701171875,
0.01079559326171875,
0.0823974609375,
-0.000762939453125,
-0.0235748291015625,
0.0277862548828125,
-0.01898193359375,
-0.0099334716796875,
-0.04925537109375,
0.0212860107421875,
0.00032830238342285156,
0.0118255615234375,
0.007183074951171875,
0.0269775390625,
0.0254364013671875,
-0.0264892578125,
-0.003322601318359375,
0.026092529296875,
-0.0007200241088867188,
-0.005100250244140625,
0.072509765625,
0.0211944580078125,
-0.01491546630859375,
0.064208984375,
0.006195068359375,
-0.038787841796875,
0.059417724609375,
0.022796630859375,
0.0809326171875,
-0.01873779296875,
0.00646209716796875,
0.0408935546875,
0.0102996826171875,
0.0112762451171875,
0.026763916015625,
-0.0184326171875,
-0.050994873046875,
-0.0229644775390625,
-0.073974609375,
-0.043212890625,
-0.0024013519287109375,
-0.06512451171875,
0.043914794921875,
-0.03546142578125,
-0.02728271484375,
0.02471923828125,
-0.00955963134765625,
-0.05548095703125,
0.00795745849609375,
-0.00421905517578125,
0.0809326171875,
-0.060089111328125,
0.065673828125,
0.0567626953125,
-0.0298004150390625,
-0.056304931640625,
-0.033203125,
0.007518768310546875,
-0.05560302734375,
0.003993988037109375,
0.00772857666015625,
-0.0024356842041015625,
-0.0037555694580078125,
-0.028656005859375,
-0.056549072265625,
0.07989501953125,
0.0202178955078125,
-0.01042938232421875,
-0.006805419921875,
-0.0212554931640625,
0.03045654296875,
-0.01751708984375,
0.028900146484375,
0.0672607421875,
0.042816162109375,
0.00876617431640625,
-0.0728759765625,
0.0090484619140625,
-0.037933349609375,
-0.01507568359375,
0.024261474609375,
-0.068359375,
0.0819091796875,
-0.0229034423828125,
0.003681182861328125,
0.00565338134765625,
0.039703369140625,
0.008819580078125,
0.013458251953125,
0.043243408203125,
0.061981201171875,
0.06591796875,
-0.01036834716796875,
0.1053466796875,
-0.02911376953125,
0.044830322265625,
0.055206298828125,
0.0198822021484375,
0.040283203125,
0.031768798828125,
-0.0236663818359375,
0.026824951171875,
0.0677490234375,
-0.0267181396484375,
0.042633056640625,
-0.0109405517578125,
-0.0195770263671875,
-0.005435943603515625,
-0.01031494140625,
-0.0231170654296875,
0.032196044921875,
0.0171051025390625,
-0.0279998779296875,
-0.003955841064453125,
0.003185272216796875,
0.0231475830078125,
-0.04266357421875,
-0.01374053955078125,
0.050750732421875,
0.007625579833984375,
-0.05816650390625,
0.082763671875,
-0.00273895263671875,
0.06365966796875,
-0.048065185546875,
0.01490020751953125,
-0.02178955078125,
0.026947021484375,
-0.041717529296875,
-0.02667236328125,
0.043670654296875,
-0.006683349609375,
-0.0037784576416015625,
0.005352020263671875,
0.06427001953125,
-0.00791168212890625,
-0.016632080078125,
0.0175323486328125,
0.01200103759765625,
0.029876708984375,
0.006923675537109375,
-0.05322265625,
0.0176544189453125,
0.0128173828125,
-0.039337158203125,
0.0172119140625,
-0.0172576904296875,
0.00974273681640625,
0.045989990234375,
0.046661376953125,
0.003692626953125,
-0.00225830078125,
0.00772857666015625,
0.065673828125,
-0.0310821533203125,
-0.05242919921875,
-0.041717529296875,
0.036895751953125,
-0.022857666015625,
-0.04248046875,
0.054779052734375,
0.04071044921875,
0.05584716796875,
-0.00920867919921875,
0.0625,
-0.01971435546875,
0.0159759521484375,
-0.0289764404296875,
0.042449951171875,
-0.06201171875,
-0.007049560546875,
-0.01708984375,
-0.082763671875,
-0.005428314208984375,
0.06549072265625,
0.0041351318359375,
0.007541656494140625,
0.039337158203125,
0.05560302734375,
-0.023834228515625,
0.0134735107421875,
0.0027446746826171875,
0.0238494873046875,
0.0295867919921875,
0.033447265625,
0.03790283203125,
-0.0307159423828125,
0.0257720947265625,
-0.050048828125,
-0.04107666015625,
-0.0217437744140625,
-0.0601806640625,
-0.060150146484375,
-0.0265350341796875,
-0.019927978515625,
-0.0343017578125,
0.007843017578125,
0.0526123046875,
0.0682373046875,
-0.028076171875,
-0.0252227783203125,
-0.0062713623046875,
-0.0113677978515625,
-0.017333984375,
-0.0175323486328125,
0.039703369140625,
-0.0278167724609375,
-0.073486328125,
0.03173828125,
0.0014247894287109375,
-0.0017518997192382812,
-0.026641845703125,
-0.00768280029296875,
-0.0026531219482421875,
-0.0004401206970214844,
0.0345458984375,
0.0294342041015625,
-0.0098114013671875,
-0.01873779296875,
-0.0208282470703125,
0.0157012939453125,
0.00997161865234375,
0.029998779296875,
-0.066162109375,
0.01293182373046875,
0.058685302734375,
0.03350830078125,
0.07110595703125,
-0.0037899017333984375,
0.039398193359375,
-0.047088623046875,
0.0175323486328125,
0.00501251220703125,
0.0496826171875,
0.01422119140625,
-0.036590576171875,
0.062164306640625,
0.00003063678741455078,
-0.07220458984375,
-0.05853271484375,
-0.001239776611328125,
-0.09307861328125,
-0.01462554931640625,
0.07122802734375,
0.0043792724609375,
-0.042205810546875,
0.0087738037109375,
-0.0181884765625,
0.03173828125,
-0.029998779296875,
0.06256103515625,
0.020355224609375,
-0.0311431884765625,
0.0098114013671875,
-0.055450439453125,
0.051910400390625,
0.03955078125,
-0.057220458984375,
-0.0188751220703125,
0.0343017578125,
0.01708984375,
0.0229644775390625,
0.046661376953125,
-0.0150909423828125,
0.02410888671875,
0.00007319450378417969,
0.012786865234375,
-0.030731201171875,
-0.0295257568359375,
-0.038665771484375,
0.005031585693359375,
-0.0184783935546875,
-0.028839111328125
]
] |
ahmedrachid/FinancialBERT-Sentiment-Analysis | 2022-02-07T14:58:57.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"financial-sentiment-analysis",
"sentiment-analysis",
"en",
"dataset:financial_phrasebank",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | ahmedrachid | null | null | ahmedrachid/FinancialBERT-Sentiment-Analysis | 43 | 13,534 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- financial-sentiment-analysis
- sentiment-analysis
datasets:
- financial_phrasebank
widget:
- text: Operating profit rose to EUR 13.1 mn from EUR 8.7 mn in the corresponding period in 2007 representing 7.7 % of net sales.
- text: Bids or offers include at least 1,000 shares and the value of the shares must correspond to at least EUR 4,000.
- text: Raute reported a loss per share of EUR 0.86 for the first half of 2009 , against EPS of EUR 0.74 in the corresponding period of 2008.
---
### FinancialBERT for Sentiment Analysis
[*FinancialBERT*](https://huggingface.co/ahmedrachid/FinancialBERT) is a BERT model pre-trained on a large corpus of financial texts. Its purpose is to advance financial NLP research and practice, so that financial practitioners and researchers can benefit from the model without needing the significant computational resources required to train it.
The model was fine-tuned for the sentiment analysis task on the _Financial PhraseBank_ dataset. Experiments show that it outperforms general BERT and other financial domain-specific models.
More details on `FinancialBERT`'s pre-training process can be found at: https://www.researchgate.net/publication/358284785_FinancialBERT_-_A_Pretrained_Language_Model_for_Financial_Text_Mining
### Training data
FinancialBERT was fine-tuned on [Financial PhraseBank](https://www.researchgate.net/publication/251231364_FinancialPhraseBank-v10), a dataset of 4,840 financial news sentences categorised by sentiment (negative, neutral, positive).
### Fine-tuning hyper-parameters
- learning_rate = 2e-5
- batch_size = 32
- max_seq_length = 512
- num_train_epochs = 5
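As a rough sanity check, these hyper-parameters imply the following optimization-step count. The train-split size is an assumption (the full 4,840-sentence PhraseBank minus the 485 test sentences reported below); the card does not state the exact split.

```python
import math

# Assumed values -- the card does not state the exact train split size.
train_examples = 4840 - 485      # hypothetical: full PhraseBank minus the 485-sentence test set
batch_size = 32                  # from the list above
num_train_epochs = 5             # from the list above

steps_per_epoch = math.ceil(train_examples / batch_size)
total_steps = steps_per_epoch * num_train_epochs
print(steps_per_epoch, total_steps)  # → 137 685
```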
### Evaluation metrics
The evaluation metrics used are precision, recall and F1-score. The following is the classification report on the test set.
| sentiment | precision | recall | f1-score | support |
| ------------- |:-------------:|:-------------:|:-------------:| -----:|
| negative | 0.96 | 0.97 | 0.97 | 58 |
| neutral | 0.98 | 0.99 | 0.98 | 279 |
| positive | 0.98 | 0.97 | 0.97 | 148 |
| macro avg | 0.97 | 0.98 | 0.98 | 485 |
| weighted avg | 0.98 | 0.98 | 0.98 | 485 |
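The macro and weighted averages in the table follow directly from the per-class scores and supports. A quick pure-Python check, using the rounded values from the table (so a ±0.01 rounding drift against the table's own averages is expected):

```python
# Per-class (precision, recall, f1, support) taken from the table above.
report = {
    "negative": (0.96, 0.97, 0.97, 58),
    "neutral":  (0.98, 0.99, 0.98, 279),
    "positive": (0.98, 0.97, 0.97, 148),
}

total_support = sum(s for *_, s in report.values())  # 485

# Macro average: unweighted mean of per-class F1.
macro_f1 = sum(f1 for _, _, f1, _ in report.values()) / len(report)

# Weighted average: per-class F1 weighted by support.
weighted_f1 = sum(f1 * s for _, _, f1, s in report.values()) / total_support

print(round(macro_f1, 2), round(weighted_f1, 2))  # → 0.97 0.98
```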
### How to use
The model can be used with the Transformers `pipeline` for sentiment analysis.
```python
from transformers import BertTokenizer, BertForSequenceClassification
from transformers import pipeline
model = BertForSequenceClassification.from_pretrained("ahmedrachid/FinancialBERT-Sentiment-Analysis",num_labels=3)
tokenizer = BertTokenizer.from_pretrained("ahmedrachid/FinancialBERT-Sentiment-Analysis")
nlp = pipeline("sentiment-analysis", model=model, tokenizer=tokenizer)
sentences = ["Operating profit rose to EUR 13.1 mn from EUR 8.7 mn in the corresponding period in 2007 representing 7.7 % of net sales.",
"Bids or offers include at least 1,000 shares and the value of the shares must correspond to at least EUR 4,000.",
"Raute reported a loss per share of EUR 0.86 for the first half of 2009 , against EPS of EUR 0.74 in the corresponding period of 2008.",
]
results = nlp(sentences)
print(results)
# [{'label': 'positive', 'score': 0.9998133778572083},
#  {'label': 'neutral', 'score': 0.9997822642326355},
#  {'label': 'negative', 'score': 0.9877365231513977}]
```
> Created by [Ahmed Rachid Hazourli](https://www.linkedin.com/in/ahmed-rachid/)
| 3,448 | [
[
-0.0311126708984375,
-0.03253173828125,
-0.0004730224609375,
0.0465087890625,
-0.0201873779296875,
-0.0020656585693359375,
-0.0176849365234375,
-0.019073486328125,
0.0211639404296875,
0.0280609130859375,
-0.05096435546875,
-0.06939697265625,
-0.05511474609375,
-0.003543853759765625,
-0.027130126953125,
0.11883544921875,
0.0124053955078125,
0.0131378173828125,
0.0007791519165039062,
-0.00823974609375,
-0.00743865966796875,
-0.05963134765625,
-0.060211181640625,
-0.0260009765625,
0.037506103515625,
0.00655364990234375,
0.050048828125,
-0.00821685791015625,
0.04345703125,
0.0224761962890625,
-0.007083892822265625,
-0.01476287841796875,
-0.0250244140625,
-0.0038738250732421875,
0.01190185546875,
-0.039825439453125,
-0.05499267578125,
0.0235748291015625,
0.01971435546875,
0.03985595703125,
-0.00841522216796875,
0.0250244140625,
0.017303466796875,
0.06329345703125,
-0.0307769775390625,
0.0234527587890625,
-0.03509521484375,
0.015350341796875,
0.002765655517578125,
-0.004779815673828125,
-0.0322265625,
-0.034912109375,
0.03253173828125,
-0.031219482421875,
0.016937255859375,
0.007007598876953125,
0.10247802734375,
0.01203155517578125,
-0.00914764404296875,
-0.031829833984375,
-0.0278472900390625,
0.083251953125,
-0.0732421875,
0.011932373046875,
0.0249481201171875,
-0.0030879974365234375,
0.0191192626953125,
-0.05535888671875,
-0.032196044921875,
-0.00759124755859375,
-0.003963470458984375,
0.034820556640625,
-0.0233001708984375,
-0.0014209747314453125,
0.0087890625,
0.03399658203125,
-0.03179931640625,
-0.0108795166015625,
-0.0286712646484375,
-0.0194244384765625,
0.04986572265625,
-0.005397796630859375,
-0.01526641845703125,
-0.037506103515625,
-0.04583740234375,
-0.038482666015625,
-0.0260467529296875,
0.0202178955078125,
0.045166015625,
0.03363037109375,
-0.0074310302734375,
0.03851318359375,
-0.002056121826171875,
0.036041259765625,
0.0036563873291015625,
-0.005313873291015625,
0.05462646484375,
-0.0200042724609375,
-0.03155517578125,
0.00977325439453125,
0.06915283203125,
0.035003662109375,
0.04583740234375,
0.01470184326171875,
-0.038665771484375,
-0.0014047622680664062,
0.0201568603515625,
-0.047515869140625,
-0.037750244140625,
0.0172882080078125,
-0.05426025390625,
-0.04925537109375,
0.0167083740234375,
-0.05401611328125,
-0.0018148422241210938,
-0.010772705078125,
0.043701171875,
-0.04180908203125,
-0.017730712890625,
0.0164337158203125,
-0.0250244140625,
0.035369873046875,
-0.0008382797241210938,
-0.04901123046875,
0.013641357421875,
0.043701171875,
0.05499267578125,
0.0168304443359375,
-0.01178741455078125,
-0.0189666748046875,
-0.03314208984375,
-0.00978851318359375,
0.04937744140625,
-0.015869140625,
-0.028350830078125,
0.0081939697265625,
-0.01035308837890625,
-0.0013132095336914062,
-0.0258941650390625,
0.05615234375,
-0.043670654296875,
0.040283203125,
-0.02117919921875,
-0.052886962890625,
-0.032989501953125,
0.0206756591796875,
-0.032806396484375,
0.078125,
0.011749267578125,
-0.07696533203125,
0.0406494140625,
-0.063232421875,
-0.0260009765625,
-0.0157012939453125,
0.01245880126953125,
-0.040252685546875,
-0.007228851318359375,
0.019195556640625,
0.04449462890625,
-0.01470184326171875,
0.00768280029296875,
-0.023040771484375,
-0.008331298828125,
0.039825439453125,
-0.0281829833984375,
0.07659912109375,
0.00847625732421875,
-0.0251007080078125,
0.01336669921875,
-0.04736328125,
0.0032196044921875,
0.0018100738525390625,
-0.0176849365234375,
-0.0235595703125,
0.0103607177734375,
0.02947998046875,
0.017608642578125,
0.033905029296875,
-0.038726806640625,
0.00708770751953125,
-0.058868408203125,
0.0251007080078125,
0.0733642578125,
-0.0048828125,
0.023956298828125,
-0.02630615234375,
0.043182373046875,
0.0043487548828125,
0.0183868408203125,
0.01287841796875,
-0.02581787109375,
-0.067138671875,
-0.0310516357421875,
0.049468994140625,
0.06561279296875,
-0.02679443359375,
0.0631103515625,
-0.01068115234375,
-0.035430908203125,
-0.047607421875,
0.00748443603515625,
0.00691986083984375,
0.036224365234375,
0.0345458984375,
-0.01148223876953125,
-0.05352783203125,
-0.07415771484375,
-0.0126495361328125,
-0.013031005859375,
-0.0022144317626953125,
0.00463104248046875,
0.0364990234375,
-0.0164031982421875,
0.07232666015625,
-0.056121826171875,
-0.038604736328125,
-0.0269012451171875,
0.035797119140625,
0.06243896484375,
0.0242767333984375,
0.058197021484375,
-0.04901123046875,
-0.037109375,
-0.020721435546875,
-0.05291748046875,
0.00406646728515625,
-0.0084381103515625,
-0.0125732421875,
0.0313720703125,
0.0258636474609375,
-0.0487060546875,
0.028594970703125,
0.03448486328125,
-0.0303192138671875,
0.0460205078125,
-0.010284423828125,
-0.01511383056640625,
-0.08349609375,
0.0007696151733398438,
0.021881103515625,
-0.0043487548828125,
-0.027740478515625,
-0.0178985595703125,
-0.0185089111328125,
0.0017805099487304688,
-0.02069091796875,
0.02880859375,
-0.00629425048828125,
0.003910064697265625,
-0.003520965576171875,
0.0216522216796875,
-0.00366973876953125,
0.049591064453125,
-0.00841522216796875,
0.04388427734375,
0.052703857421875,
-0.030059814453125,
0.02972412109375,
0.0247344970703125,
-0.0340576171875,
0.0217742919921875,
-0.06219482421875,
-0.02459716796875,
0.001308441162109375,
0.0122528076171875,
-0.086669921875,
0.01448822021484375,
0.0284881591796875,
-0.057373046875,
0.01023101806640625,
0.01139068603515625,
-0.038421630859375,
-0.0254974365234375,
-0.04443359375,
-0.003643035888671875,
0.05078125,
-0.02349853515625,
0.0400390625,
0.0131072998046875,
-0.031219482421875,
-0.07867431640625,
-0.046173095703125,
-0.0158233642578125,
-0.0196990966796875,
-0.0521240234375,
0.01421356201171875,
-0.0038585662841796875,
-0.01172637939453125,
0.007579803466796875,
-0.00980377197265625,
-0.0052032470703125,
0.0007672309875488281,
0.013214111328125,
0.052703857421875,
-0.0125732421875,
0.0206756591796875,
-0.004608154296875,
-0.016265869140625,
0.028594970703125,
-0.00655364990234375,
0.050567626953125,
-0.03546142578125,
0.005039215087890625,
-0.026458740234375,
0.00632476806640625,
0.031829833984375,
-0.01523590087890625,
0.06781005859375,
0.0526123046875,
-0.01000213623046875,
-0.004955291748046875,
-0.0249786376953125,
-0.0084991455078125,
-0.037384033203125,
0.0252532958984375,
-0.006931304931640625,
-0.051910400390625,
0.04901123046875,
0.0116729736328125,
0.01526641845703125,
0.07684326171875,
0.03997802734375,
-0.025970458984375,
0.0770263671875,
0.041656494140625,
-0.035003662109375,
0.0251617431640625,
-0.048248291015625,
0.036651611328125,
-0.033599853515625,
-0.01099395751953125,
-0.03228759765625,
-0.03570556640625,
-0.050048828125,
0.0022602081298828125,
0.01403045654296875,
0.006893157958984375,
-0.021728515625,
0.01947021484375,
-0.04217529296875,
0.00542449951171875,
0.053497314453125,
0.00240325927734375,
-0.005794525146484375,
0.007061004638671875,
-0.02813720703125,
-0.012939453125,
-0.0458984375,
-0.04052734375,
0.09100341796875,
0.040771484375,
0.057403564453125,
-0.00612640380859375,
0.065673828125,
0.037841796875,
0.0290985107421875,
-0.0716552734375,
0.040252685546875,
-0.0290985107421875,
-0.0594482421875,
-0.01300048828125,
-0.029083251953125,
-0.04876708984375,
0.005340576171875,
-0.0303955078125,
-0.03179931640625,
0.01158905029296875,
0.003387451171875,
-0.040313720703125,
0.012298583984375,
-0.04547119140625,
0.0662841796875,
-0.022003173828125,
0.0004515647888183594,
-0.01513671875,
-0.04376220703125,
0.0178375244140625,
-0.00888824462890625,
0.0200347900390625,
-0.01096343994140625,
0.006847381591796875,
0.0723876953125,
-0.027587890625,
0.07452392578125,
-0.03173828125,
-0.007305145263671875,
0.024871826171875,
-0.0217132568359375,
0.01666259765625,
0.0032062530517578125,
-0.0105133056640625,
0.0191192626953125,
0.006072998046875,
-0.03350830078125,
-0.024017333984375,
0.041168212890625,
-0.08111572265625,
-0.0220794677734375,
-0.061981201171875,
-0.0384521484375,
-0.01007843017578125,
-0.00003325939178466797,
0.032958984375,
0.0229644775390625,
-0.01363372802734375,
0.031036376953125,
0.0306396484375,
-0.03253173828125,
0.032806396484375,
0.0111083984375,
-0.021881103515625,
-0.0391845703125,
0.06439208984375,
0.0005364418029785156,
-0.0031795501708984375,
0.023590087890625,
0.00923919677734375,
-0.03692626953125,
-0.00807952880859375,
0.0037994384765625,
0.02117919921875,
-0.05853271484375,
-0.02801513671875,
-0.0467529296875,
-0.040191650390625,
-0.045379638671875,
-0.01544952392578125,
-0.0205230712890625,
-0.041229248046875,
-0.0290679931640625,
-0.019195556640625,
0.040283203125,
0.03509521484375,
-0.019073486328125,
0.029632568359375,
-0.061370849609375,
0.007648468017578125,
0.0185089111328125,
0.01378631591796875,
0.00308990478515625,
-0.050750732421875,
-0.01922607421875,
0.0073394775390625,
-0.034820556640625,
-0.060882568359375,
0.051910400390625,
0.0153656005859375,
0.03265380859375,
0.047637939453125,
0.0186920166015625,
0.033538818359375,
-0.0095977783203125,
0.06396484375,
0.016387939453125,
-0.08087158203125,
0.0433349609375,
-0.00997161865234375,
0.007904052734375,
0.03814697265625,
0.05499267578125,
-0.032684326171875,
-0.03387451171875,
-0.06402587890625,
-0.09423828125,
0.048187255859375,
0.0197601318359375,
0.0130615234375,
-0.004161834716796875,
0.0233154296875,
-0.003387451171875,
0.035003662109375,
-0.0814208984375,
-0.0292510986328125,
-0.037933349609375,
-0.0377197265625,
-0.01538848876953125,
-0.025604248046875,
-0.009368896484375,
-0.0321044921875,
0.08038330078125,
0.00992584228515625,
0.039520263671875,
0.022125244140625,
-0.0011272430419921875,
0.021881103515625,
0.0134429931640625,
0.034088134765625,
0.042999267578125,
-0.042327880859375,
-0.0174560546875,
0.01739501953125,
-0.0264129638671875,
-0.0140533447265625,
0.0187835693359375,
0.001983642578125,
0.007640838623046875,
0.04083251953125,
0.05364990234375,
0.0129241943359375,
-0.0494384765625,
0.05224609375,
-0.006259918212890625,
-0.04766845703125,
-0.063720703125,
-0.005794525146484375,
-0.0037555694580078125,
0.0305023193359375,
0.036468505859375,
0.023834228515625,
0.0006074905395507812,
-0.0295867919921875,
0.022674560546875,
0.025604248046875,
-0.045166015625,
-0.0172576904296875,
0.033782958984375,
0.009735107421875,
0.0017614364624023438,
0.0650634765625,
-0.00934600830078125,
-0.060882568359375,
0.038421630859375,
0.01654052734375,
0.0738525390625,
0.0018892288208007812,
0.02960205078125,
0.037353515625,
0.0300445556640625,
-0.002349853515625,
0.040557861328125,
0.01114654541015625,
-0.07598876953125,
-0.038055419921875,
-0.062286376953125,
-0.01287078857421875,
0.00994873046875,
-0.07135009765625,
0.01154327392578125,
-0.042999267578125,
-0.042022705078125,
0.01678466796875,
0.006099700927734375,
-0.046173095703125,
0.0318603515625,
0.000965118408203125,
0.086181640625,
-0.0643310546875,
0.051971435546875,
0.0565185546875,
-0.042999267578125,
-0.058258056640625,
-0.0084991455078125,
-0.020355224609375,
-0.05535888671875,
0.0733642578125,
0.00927734375,
-0.002201080322265625,
-0.009552001953125,
-0.04669189453125,
-0.039276123046875,
0.062286376953125,
-0.0081329345703125,
-0.0240478515625,
-0.0036067962646484375,
-0.001338958740234375,
0.046356201171875,
-0.0306396484375,
0.00463104248046875,
0.02789306640625,
0.036407470703125,
0.001041412353515625,
-0.0323486328125,
-0.01264190673828125,
-0.03363037109375,
-0.0272674560546875,
0.01479339599609375,
-0.046295166015625,
0.0836181640625,
0.00150299072265625,
0.01117706298828125,
0.00003647804260253906,
0.060943603515625,
0.009368896484375,
0.0197601318359375,
0.05224609375,
0.057464599609375,
0.048919677734375,
-0.0259857177734375,
0.0745849609375,
-0.038299560546875,
0.06146240234375,
0.06591796875,
-0.0057220458984375,
0.08111572265625,
0.0200347900390625,
-0.022796630859375,
0.057525634765625,
0.053466796875,
-0.02484130859375,
0.04583740234375,
0.0113067626953125,
-0.017578125,
-0.02532958984375,
0.0186614990234375,
-0.025970458984375,
0.038482666015625,
0.0170135498046875,
-0.044586181640625,
0.00017821788787841797,
-0.005626678466796875,
0.0191497802734375,
-0.00925445556640625,
-0.016204833984375,
0.038238525390625,
0.0168609619140625,
-0.032073974609375,
0.033233642578125,
0.00617218017578125,
0.060699462890625,
-0.057952880859375,
0.0218353271484375,
-0.0092620849609375,
0.0307769775390625,
-0.0093536376953125,
-0.058197021484375,
0.0244140625,
0.01120758056640625,
-0.0172576904296875,
-0.024139404296875,
0.04803466796875,
-0.01131439208984375,
-0.06085205078125,
0.033416748046875,
0.031097412109375,
-0.0005254745483398438,
-0.00926971435546875,
-0.07513427734375,
-0.020782470703125,
0.01806640625,
-0.0310211181640625,
0.008575439453125,
0.033843994140625,
0.03314208984375,
0.040191650390625,
0.048187255859375,
0.0039520263671875,
-0.017242431640625,
0.007904052734375,
0.0653076171875,
-0.06964111328125,
-0.040069580078125,
-0.06134033203125,
0.040435791015625,
-0.0204925537109375,
-0.03460693359375,
0.059967041015625,
0.042938232421875,
0.056793212890625,
-0.01617431640625,
0.061737060546875,
-0.0027923583984375,
0.040557861328125,
-0.0294952392578125,
0.0660400390625,
-0.045257568359375,
0.004974365234375,
-0.0312347412109375,
-0.07232666015625,
-0.0190887451171875,
0.08514404296875,
-0.03265380859375,
0.01122283935546875,
0.040985107421875,
0.051666259765625,
0.0121307373046875,
0.01096343994140625,
0.0028095245361328125,
0.0416259765625,
0.00482940673828125,
0.021026611328125,
0.05352783203125,
-0.046844482421875,
0.047271728515625,
-0.0232696533203125,
-0.01546478271484375,
-0.01433563232421875,
-0.06390380859375,
-0.0748291015625,
-0.032867431640625,
-0.026397705078125,
-0.03570556640625,
-0.0181884765625,
0.07000732421875,
0.0305633544921875,
-0.060455322265625,
-0.034027099609375,
0.01117706298828125,
-0.01114654541015625,
-0.0183258056640625,
-0.0269317626953125,
0.02972412109375,
-0.044769287109375,
-0.040313720703125,
-0.00032830238342285156,
0.0014429092407226562,
-0.003387451171875,
-0.0199432373046875,
0.003692626953125,
-0.02734375,
0.0203399658203125,
0.0579833984375,
-0.005771636962890625,
-0.039581298828125,
-0.0270233154296875,
0.0162353515625,
-0.00003230571746826172,
-0.001842498779296875,
0.0284881591796875,
-0.035369873046875,
0.0035419464111328125,
0.035552978515625,
0.02313232421875,
0.036285400390625,
0.0042877197265625,
0.03143310546875,
-0.064208984375,
0.0099639892578125,
0.01922607421875,
0.0268707275390625,
0.0119476318359375,
-0.0106658935546875,
0.024017333984375,
0.01465606689453125,
-0.047515869140625,
-0.035797119140625,
-0.014404296875,
-0.08758544921875,
-0.02313232421875,
0.06597900390625,
-0.01424407958984375,
-0.01259613037109375,
0.0224609375,
-0.019256591796875,
0.01500701904296875,
-0.0450439453125,
0.048828125,
0.07476806640625,
-0.005794525146484375,
0.0089263916015625,
-0.0386962890625,
0.035247802734375,
0.040496826171875,
-0.029083251953125,
-0.017303466796875,
0.017303466796875,
0.0177764892578125,
0.0186767578125,
0.03887939453125,
0.0005207061767578125,
0.0084075927734375,
-0.004100799560546875,
0.0330810546875,
0.028472900390625,
-0.00678253173828125,
-0.01277923583984375,
0.0253753662109375,
0.0028285980224609375,
-0.023223876953125
]
] |
arpanghoshal/EmoRoBERTa | 2023-02-11T01:57:27.000Z | [
"transformers",
"tf",
"roberta",
"text-classification",
"tensorflow",
"en",
"dataset:go_emotions",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | arpanghoshal | null | null | arpanghoshal/EmoRoBERTa | 77 | 13,534 | transformers | 2022-03-02T23:29:05 | ---
language: en
tags:
- text-classification
- tensorflow
- roberta
datasets:
- go_emotions
license: mit
---
Connect me on LinkedIn
- [linkedin.com/in/arpanghoshal](https://www.linkedin.com/in/arpanghoshal)
## What is GoEmotions
A dataset of 58,000 Reddit comments labelled with 28 emotion categories:
- admiration, amusement, anger, annoyance, approval, caring, confusion, curiosity, desire, disappointment, disapproval, disgust, embarrassment, excitement, fear, gratitude, grief, joy, love, nervousness, optimism, pride, realization, relief, remorse, sadness, surprise + neutral
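For reference, the 28 categories above can be materialised as an id → label mapping. The ordering below (alphabetical emotions with `neutral` last) matches the `go_emotions` dataset's label order, but it is worth verifying against the model's `config.json` before relying on the indices:

```python
GOEMOTIONS_LABELS = [
    "admiration", "amusement", "anger", "annoyance", "approval", "caring",
    "confusion", "curiosity", "desire", "disappointment", "disapproval",
    "disgust", "embarrassment", "excitement", "fear", "gratitude", "grief",
    "joy", "love", "nervousness", "optimism", "pride", "realization",
    "relief", "remorse", "sadness", "surprise", "neutral",
]

id2label = dict(enumerate(GOEMOTIONS_LABELS))
label2id = {label: i for i, label in id2label.items()}

print(len(GOEMOTIONS_LABELS), id2label[15])  # → 28 gratitude
```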
## What is RoBERTa
RoBERTa builds on BERT’s language masking strategy and modifies key hyperparameters in BERT, including removing BERT’s next-sentence pretraining objective, and training with much larger mini-batches and learning rates. RoBERTa was also trained on an order of magnitude more data than BERT, for a longer amount of time. This allows RoBERTa representations to generalize even better to downstream tasks compared to BERT.
## Hyperparameters
| Parameter | |
| ----------------- | :---: |
| Learning rate | 5e-5 |
| Epochs | 10 |
| Max Seq Length | 50 |
| Batch size | 16 |
| Warmup Proportion | 0.1 |
| Epsilon | 1e-8 |
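A warmup proportion of 0.1 is usually converted into an absolute warmup-step count from the total number of training steps. A sketch of that conversion, where the train-split size of 43,410 examples is an assumption based on the standard GoEmotions train split, not something stated in this card:

```python
import math

train_examples = 43_410          # assumed: standard GoEmotions train split
batch_size = 16                  # from the table above
epochs = 10                      # from the table above
warmup_proportion = 0.1          # from the table above

steps_per_epoch = math.ceil(train_examples / batch_size)
total_steps = steps_per_epoch * epochs
warmup_steps = int(total_steps * warmup_proportion)
print(steps_per_epoch, total_steps, warmup_steps)  # → 2714 27140 2714
```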
## Results
Best `Macro F1` result: 49.30%
## Usage
```python
from transformers import RobertaTokenizerFast, TFRobertaForSequenceClassification, pipeline
tokenizer = RobertaTokenizerFast.from_pretrained("arpanghoshal/EmoRoBERTa")
model = TFRobertaForSequenceClassification.from_pretrained("arpanghoshal/EmoRoBERTa")
# Pass the loaded model and tokenizer to the pipeline (otherwise they are
# re-downloaded and the objects above go unused).
emotion = pipeline('sentiment-analysis', model=model, tokenizer=tokenizer)
emotion_labels = emotion("Thanks for using it.")
print(emotion_labels)
```
Output
```
[{'label': 'gratitude', 'score': 0.9964383244514465}]
```
| 1,830 | [
[
-0.016204833984375,
-0.0296630859375,
0.013427734375,
0.017822265625,
-0.0212554931640625,
-0.0111083984375,
-0.0244598388671875,
-0.0268707275390625,
0.030517578125,
0.0234832763671875,
-0.05035400390625,
-0.057342529296875,
-0.062347412109375,
-0.0034732818603515625,
-0.0206451416015625,
0.10272216796875,
0.0095672607421875,
0.019256591796875,
0.018646240234375,
-0.0229644775390625,
-0.00664520263671875,
-0.04876708984375,
-0.05426025390625,
-0.0155181884765625,
0.032989501953125,
0.017669677734375,
0.034271240234375,
0.01010894775390625,
0.0203857421875,
0.027313232421875,
-0.004734039306640625,
-0.01116180419921875,
-0.05206298828125,
0.01409149169921875,
0.001178741455078125,
-0.038909912109375,
-0.03057861328125,
0.00775909423828125,
0.03924560546875,
0.0302734375,
-0.004253387451171875,
0.033538818359375,
0.004337310791015625,
0.061492919921875,
-0.02630615234375,
0.036285400390625,
-0.03485107421875,
0.011199951171875,
-0.018646240234375,
0.0113983154296875,
-0.0236663818359375,
-0.0261688232421875,
0.0206451416015625,
-0.032501220703125,
0.0198516845703125,
0.0013942718505859375,
0.1009521484375,
0.021820068359375,
-0.0119171142578125,
-0.02001953125,
-0.034210205078125,
0.08636474609375,
-0.055511474609375,
0.0215911865234375,
0.0248565673828125,
0.0295257568359375,
0.00855255126953125,
-0.06207275390625,
-0.04669189453125,
-0.0011091232299804688,
0.0020122528076171875,
0.0236663818359375,
-0.0166168212890625,
-0.010162353515625,
0.01995849609375,
0.036407470703125,
-0.05859375,
-0.0152587890625,
-0.025543212890625,
-0.0115203857421875,
0.03338623046875,
-0.00021779537200927734,
0.02142333984375,
-0.0171966552734375,
-0.012451171875,
-0.036376953125,
-0.027984619140625,
0.0120849609375,
0.04425048828125,
0.02099609375,
-0.0297698974609375,
0.0367431640625,
0.00011819601058959961,
0.061065673828125,
-0.0019178390502929688,
0.007381439208984375,
0.046356201171875,
-0.004550933837890625,
-0.02178955078125,
-0.02288818359375,
0.06756591796875,
0.00637054443359375,
0.035186767578125,
0.00011277198791503906,
-0.0016050338745117188,
-0.022674560546875,
0.01364898681640625,
-0.05450439453125,
-0.03472900390625,
0.0367431640625,
-0.050079345703125,
-0.0226898193359375,
0.0150909423828125,
-0.0682373046875,
-0.0085601806640625,
-0.0194854736328125,
0.035858154296875,
-0.051422119140625,
-0.00797271728515625,
0.004405975341796875,
-0.0159759521484375,
0.02044677734375,
0.0010309219360351562,
-0.0704345703125,
0.003292083740234375,
0.028564453125,
0.06329345703125,
0.0189056396484375,
-0.0164794921875,
-0.029754638671875,
-0.01184844970703125,
-0.01702880859375,
0.035675048828125,
-0.042724609375,
-0.0094757080078125,
0.004627227783203125,
0.013031005859375,
-0.0175628662109375,
-0.04010009765625,
0.028106689453125,
-0.032928466796875,
0.04180908203125,
0.00437164306640625,
-0.0286712646484375,
-0.0229949951171875,
0.006748199462890625,
-0.045074462890625,
0.08392333984375,
0.042816162109375,
-0.0450439453125,
0.0172882080078125,
-0.060821533203125,
-0.015777587890625,
-0.0038604736328125,
0.0057373046875,
-0.03948974609375,
0.00322723388671875,
0.0136260986328125,
0.03765869140625,
0.000514984130859375,
0.01580810546875,
-0.0264892578125,
-0.01122283935546875,
0.0215606689453125,
-0.004550933837890625,
0.080078125,
0.01264190673828125,
-0.02996826171875,
0.022125244140625,
-0.06768798828125,
0.01393890380859375,
0.0019216537475585938,
-0.0286712646484375,
-0.006099700927734375,
-0.015228271484375,
0.0125732421875,
0.0228729248046875,
0.0156097412109375,
-0.03662109375,
0.00262451171875,
-0.044952392578125,
0.030487060546875,
0.06390380859375,
-0.005428314208984375,
0.00951385498046875,
-0.01473236083984375,
0.040435791015625,
0.00955963134765625,
0.0157470703125,
0.00511932373046875,
-0.0411376953125,
-0.06597900390625,
-0.042449951171875,
0.032501220703125,
0.03753662109375,
-0.034820556640625,
0.05328369140625,
-0.0128173828125,
-0.046600341796875,
-0.058929443359375,
0.0189056396484375,
0.03143310546875,
0.046234130859375,
0.0380859375,
-0.0293121337890625,
-0.05950927734375,
-0.06036376953125,
-0.0191192626953125,
-0.007904052734375,
-0.0174102783203125,
0.0164947509765625,
0.0411376953125,
-0.026031494140625,
0.051605224609375,
-0.03045654296875,
-0.030670166015625,
-0.015716552734375,
0.0275115966796875,
0.047637939453125,
0.0418701171875,
0.05438232421875,
-0.0390625,
-0.051422119140625,
-0.01702880859375,
-0.0660400390625,
0.0142822265625,
0.0009570121765136719,
-0.0259857177734375,
0.0321044921875,
0.01702880859375,
-0.03662109375,
0.023101806640625,
0.04913330078125,
-0.0138702392578125,
0.04681396484375,
-0.0139312744140625,
0.005916595458984375,
-0.08966064453125,
0.00632476806640625,
0.0098876953125,
-0.0148773193359375,
-0.031494140625,
-0.00125885009765625,
-0.01157379150390625,
-0.0330810546875,
-0.045166015625,
0.04241943359375,
-0.0295257568359375,
0.0027923583984375,
0.01690673828125,
-0.0077667236328125,
-0.0099945068359375,
0.056854248046875,
-0.0029296875,
0.05743408203125,
0.06939697265625,
-0.037628173828125,
0.04095458984375,
0.041748046875,
-0.0270233154296875,
0.0312042236328125,
-0.06072998046875,
0.007793426513671875,
-0.007080078125,
0.0227203369140625,
-0.0885009765625,
-0.03350830078125,
0.029571533203125,
-0.0535888671875,
0.026763916015625,
-0.01335906982421875,
-0.0313720703125,
-0.03436279296875,
-0.036346435546875,
0.010162353515625,
0.056243896484375,
-0.04681396484375,
0.050994873046875,
0.016632080078125,
-0.0231475830078125,
-0.038482666015625,
-0.05621337890625,
0.00838470458984375,
-0.025909423828125,
-0.04107666015625,
0.01346588134765625,
-0.003200531005859375,
-0.00717926025390625,
-0.0049896240234375,
0.0134429931640625,
-0.0213623046875,
-0.021209716796875,
0.0244598388671875,
0.015625,
0.0017900466918945312,
0.017242431640625,
-0.0166778564453125,
-0.0069732666015625,
0.00745391845703125,
-0.025604248046875,
0.04718017578125,
-0.028167724609375,
-0.00453948974609375,
-0.0406494140625,
0.015838623046875,
0.03192138671875,
-0.012939453125,
0.0770263671875,
0.08148193359375,
-0.01329803466796875,
-0.005001068115234375,
-0.0311737060546875,
-0.002262115478515625,
-0.033599853515625,
0.03643798828125,
-0.0081329345703125,
-0.052215576171875,
0.053741455078125,
0.01287078857421875,
-0.00742340087890625,
0.04364013671875,
0.04705810546875,
-0.0163116455078125,
0.06939697265625,
0.0413818359375,
-0.03143310546875,
0.053314208984375,
-0.050689697265625,
0.02471923828125,
-0.06268310546875,
-0.016632080078125,
-0.0389404296875,
-0.0157623291015625,
-0.04949951171875,
-0.0318603515625,
0.0250396728515625,
-0.020538330078125,
-0.028594970703125,
0.0217132568359375,
-0.04266357421875,
0.0167083740234375,
0.059661865234375,
0.03143310546875,
-0.0263214111328125,
-0.0012464523315429688,
0.02593994140625,
-0.01322174072265625,
-0.045166015625,
-0.03826904296875,
0.1053466796875,
0.022552490234375,
0.047637939453125,
0.02069091796875,
0.06597900390625,
0.0206756591796875,
0.01294708251953125,
-0.046173095703125,
0.046417236328125,
-0.01177978515625,
-0.05377197265625,
-0.02362060546875,
-0.02130126953125,
-0.068115234375,
-0.01470184326171875,
-0.01078033447265625,
-0.0687255859375,
0.0070343017578125,
0.012969970703125,
-0.00679779052734375,
0.01222991943359375,
-0.059295654296875,
0.06939697265625,
-0.012969970703125,
-0.0283050537109375,
-0.01351165771484375,
-0.0799560546875,
0.0182952880859375,
0.0017833709716796875,
0.0005474090576171875,
0.00685882568359375,
0.018157958984375,
0.07659912109375,
-0.04205322265625,
0.07452392578125,
-0.0196075439453125,
0.0100250244140625,
0.0283050537109375,
-0.0117645263671875,
0.03558349609375,
-0.00464630126953125,
-0.0204925537109375,
0.01898193359375,
-0.0214080810546875,
-0.050140380859375,
-0.0245208740234375,
0.0517578125,
-0.06689453125,
-0.01898193359375,
-0.039337158203125,
-0.04449462890625,
-0.0148468017578125,
0.0171661376953125,
0.03216552734375,
0.024505615234375,
-0.0107421875,
0.00937652587890625,
0.058807373046875,
-0.02105712890625,
0.03387451171875,
0.004611968994140625,
0.00909423828125,
-0.03436279296875,
0.0887451171875,
-0.00031828880310058594,
0.002979278564453125,
0.0201263427734375,
0.001392364501953125,
-0.03643798828125,
-0.01328277587890625,
-0.00525665283203125,
0.036407470703125,
-0.05029296875,
-0.00839996337890625,
-0.044677734375,
-0.00579071044921875,
-0.053802490234375,
0.00045990943908691406,
-0.012939453125,
-0.03173828125,
-0.0226898193359375,
-0.01104736328125,
0.0300750732421875,
0.040283203125,
-0.019012451171875,
0.0299530029296875,
-0.061492919921875,
0.0222930908203125,
0.0364990234375,
0.032318115234375,
-0.004596710205078125,
-0.048431396484375,
-0.04107666015625,
-0.00522613525390625,
-0.027557373046875,
-0.0634765625,
0.053131103515625,
0.03179931640625,
0.04144287109375,
0.031829833984375,
-0.0007410049438476562,
0.052520751953125,
-0.01204681396484375,
0.0789794921875,
0.00841522216796875,
-0.08612060546875,
0.048126220703125,
-0.0285491943359375,
0.006374359130859375,
0.03173828125,
0.047821044921875,
-0.0252532958984375,
-0.043304443359375,
-0.06842041015625,
-0.09228515625,
0.0618896484375,
0.0239715576171875,
0.00868988037109375,
-0.003734588623046875,
0.004276275634765625,
0.00234222412109375,
0.0219879150390625,
-0.06939697265625,
-0.0187530517578125,
-0.0350341796875,
-0.04510498046875,
-0.022125244140625,
-0.029144287109375,
-0.021270751953125,
-0.0162506103515625,
0.07916259765625,
0.003643035888671875,
0.05426025390625,
0.007083892822265625,
-0.0164642333984375,
0.0082550048828125,
0.007076263427734375,
0.038055419921875,
0.0408935546875,
-0.04937744140625,
-0.007114410400390625,
0.005237579345703125,
-0.026702880859375,
-0.004261016845703125,
-0.00037598609924316406,
0.027435302734375,
0.0215606689453125,
0.044189453125,
0.0687255859375,
-0.004253387451171875,
-0.05364990234375,
0.0328369140625,
0.0076141357421875,
-0.0182647705078125,
-0.0263519287109375,
0.0083465576171875,
-0.00453948974609375,
0.0211334228515625,
0.0182952880859375,
-0.01018524169921875,
0.007549285888671875,
-0.0418701171875,
0.011688232421875,
0.019989013671875,
-0.049346923828125,
-0.034423828125,
0.043548583984375,
0.00678253173828125,
-0.0146484375,
0.054443359375,
0.0028858184814453125,
-0.04486083984375,
0.047607421875,
0.0284271240234375,
0.07769775390625,
-0.01678466796875,
0.0175628662109375,
0.04364013671875,
0.0016317367553710938,
0.005725860595703125,
0.033538818359375,
0.002071380615234375,
-0.028533935546875,
-0.0191802978515625,
-0.051971435546875,
-0.038482666015625,
0.00738525390625,
-0.062255859375,
0.0036983489990234375,
-0.03466796875,
-0.01479339599609375,
-0.0005955696105957031,
-0.005664825439453125,
-0.059539794921875,
0.03656005859375,
-0.0096435546875,
0.054046630859375,
-0.07940673828125,
0.056549072265625,
0.067138671875,
-0.049652099609375,
-0.07647705078125,
0.0023193359375,
0.004138946533203125,
-0.0689697265625,
0.07794189453125,
0.0157470703125,
0.005817413330078125,
0.0083465576171875,
-0.04608154296875,
-0.0496826171875,
0.06494140625,
0.012847900390625,
-0.018798828125,
0.010894775390625,
-0.0042572021484375,
0.060791015625,
-0.038726806640625,
0.0272979736328125,
0.024261474609375,
0.03826904296875,
-0.00794219970703125,
-0.02972412109375,
-0.01531982421875,
-0.03594970703125,
0.0044403076171875,
-0.0023517608642578125,
-0.05255126953125,
0.0740966796875,
-0.004222869873046875,
0.0015659332275390625,
0.0018939971923828125,
0.05096435546875,
0.00453948974609375,
0.00975799560546875,
0.0239410400390625,
0.063232421875,
0.05621337890625,
-0.018402099609375,
0.07550048828125,
-0.03948974609375,
0.052764892578125,
0.074462890625,
0.00251007080078125,
0.04827880859375,
0.0372314453125,
-0.033233642578125,
0.0665283203125,
0.05511474609375,
-0.005008697509765625,
0.044647216796875,
0.01806640625,
-0.019256591796875,
-0.00785064697265625,
0.0030975341796875,
-0.03363037109375,
0.0098419189453125,
0.00650787353515625,
-0.023529052734375,
-0.008819580078125,
0.0184173583984375,
0.002872467041015625,
-0.00861358642578125,
-0.0017156600952148438,
0.045166015625,
-0.01529693603515625,
-0.03387451171875,
0.04852294921875,
-0.0224761962890625,
0.033721923828125,
-0.0660400390625,
-0.0015249252319335938,
-0.01540374755859375,
0.0176239013671875,
-0.01727294921875,
-0.055755615234375,
0.0228424072265625,
0.007793426513671875,
-0.01454925537109375,
-0.01549530029296875,
0.027435302734375,
-0.038360595703125,
-0.040069580078125,
0.0244598388671875,
0.0229034423828125,
0.0338134765625,
0.001232147216796875,
-0.063232421875,
-0.00008511543273925781,
0.0096435546875,
-0.03369140625,
0.017669677734375,
0.0440673828125,
0.018585205078125,
0.048736572265625,
0.04449462890625,
0.0160980224609375,
0.00263214111328125,
0.01422119140625,
0.068603515625,
-0.06005859375,
-0.031494140625,
-0.05511474609375,
0.053497314453125,
-0.0134124755859375,
-0.034759521484375,
0.055389404296875,
0.038116455078125,
0.063720703125,
-0.03424072265625,
0.05975341796875,
-0.01904296875,
0.0450439453125,
-0.033599853515625,
0.03863525390625,
-0.0433349609375,
0.000016868114471435547,
-0.0294342041015625,
-0.0684814453125,
-0.0172119140625,
0.0562744140625,
-0.018707275390625,
0.008392333984375,
0.0531005859375,
0.06695556640625,
0.002834320068359375,
-0.02703857421875,
0.011322021484375,
0.0277252197265625,
0.034698486328125,
0.048187255859375,
0.026641845703125,
-0.04705810546875,
0.036529541015625,
-0.01166534423828125,
-0.033447265625,
-0.0254974365234375,
-0.0548095703125,
-0.0728759765625,
-0.044708251953125,
-0.0298309326171875,
-0.050384521484375,
0.0153045654296875,
0.07012939453125,
0.05548095703125,
-0.07501220703125,
-0.0288848876953125,
-0.01207733154296875,
0.01099395751953125,
-0.023345947265625,
-0.0208892822265625,
0.032958984375,
-0.0455322265625,
-0.0714111328125,
0.004726409912109375,
0.00811004638671875,
0.00789642333984375,
-0.0007963180541992188,
-0.0193939208984375,
-0.0271759033203125,
0.007129669189453125,
0.0313720703125,
0.034149169921875,
-0.035797119140625,
-0.0220184326171875,
-0.0181121826171875,
-0.0020999908447265625,
0.006988525390625,
0.0229339599609375,
-0.038055419921875,
0.044158935546875,
0.034027099609375,
0.0270233154296875,
0.045440673828125,
-0.00274658203125,
0.037445068359375,
-0.062225341796875,
0.0290679931640625,
0.014007568359375,
0.0301513671875,
0.0202484130859375,
-0.019927978515625,
0.052978515625,
0.013336181640625,
-0.05023193359375,
-0.052886962890625,
0.01409149169921875,
-0.08441162109375,
-0.00492095947265625,
0.06268310546875,
-0.036865234375,
-0.0313720703125,
0.00888824462890625,
-0.018096923828125,
0.039794921875,
-0.038330078125,
0.0692138671875,
0.06591796875,
-0.02081298828125,
0.0009493827819824219,
-0.0263671875,
0.030731201171875,
0.05206298828125,
-0.04925537109375,
-0.00666046142578125,
0.0338134765625,
0.0190277099609375,
0.01611328125,
0.03387451171875,
-0.01458740234375,
0.0091094970703125,
-0.005802154541015625,
0.0311126708984375,
-0.0106353759765625,
-0.0162200927734375,
-0.032470703125,
0.01025390625,
-0.0078887939453125,
-0.040496826171875
]
] |
Helsinki-NLP/opus-mt-no-de | 2023-08-16T12:01:50.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"no",
"de",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-no-de | 0 | 13,499 | transformers | 2022-03-02T23:29:04 | ---
language:
- no
- de
tags:
- translation
license: apache-2.0
---
### nor-deu
* source group: Norwegian
* target group: German
* OPUS readme: [nor-deu](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/nor-deu/README.md)
* model: transformer-align
* source language(s): nno nob
* target language(s): deu
* pre-processing: normalization + SentencePiece (spm4k,spm4k)
* download original weights: [opus-2020-06-17.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/nor-deu/opus-2020-06-17.zip)
* test set translations: [opus-2020-06-17.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/nor-deu/opus-2020-06-17.test.txt)
* test set scores: [opus-2020-06-17.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/nor-deu/opus-2020-06-17.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba-test.nor.deu | 29.6 | 0.541 |
### System Info:
- hf_name: nor-deu
- source_languages: nor
- target_languages: deu
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/nor-deu/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['no', 'de']
- src_constituents: {'nob', 'nno'}
- tgt_constituents: {'deu'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm4k,spm4k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/nor-deu/opus-2020-06-17.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/nor-deu/opus-2020-06-17.test.txt
- src_alpha3: nor
- tgt_alpha3: deu
- short_pair: no-de
- chrF2_score: 0.541
- bleu: 29.6
- brevity_penalty: 0.96
- ref_len: 34575.0
- src_name: Norwegian
- tgt_name: German
- train_date: 2020-06-17
- src_alpha2: no
- tgt_alpha2: de
- prefer_old: False
- long_pair: nor-deu
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 2,065 | [
[
-0.02691650390625,
-0.05255126953125,
0.024139404296875,
0.023529052734375,
-0.034149169921875,
-0.02484130859375,
-0.0206146240234375,
-0.0275726318359375,
0.01849365234375,
0.0265045166015625,
-0.05035400390625,
-0.06414794921875,
-0.03857421875,
0.03216552734375,
-0.001895904541015625,
0.06256103515625,
-0.00904083251953125,
0.01070404052734375,
0.032928466796875,
-0.03515625,
-0.0340576171875,
-0.0203704833984375,
-0.04058837890625,
-0.022003173828125,
0.034881591796875,
0.036163330078125,
0.0330810546875,
0.03094482421875,
0.042724609375,
0.0196380615234375,
-0.021240234375,
0.007770538330078125,
-0.0215606689453125,
-0.0038890838623046875,
0.0016527175903320312,
-0.03314208984375,
-0.059173583984375,
-0.0093536376953125,
0.07012939453125,
0.0306854248046875,
0.01076507568359375,
0.03326416015625,
-0.00617218017578125,
0.051788330078125,
-0.0191650390625,
0.006130218505859375,
-0.039764404296875,
-0.00811004638671875,
-0.03375244140625,
-0.0205230712890625,
-0.032806396484375,
-0.0276031494140625,
0.0092926025390625,
-0.049407958984375,
0.0029201507568359375,
0.00041556358337402344,
0.1162109375,
0.0165557861328125,
-0.0301513671875,
-0.0021228790283203125,
-0.02398681640625,
0.06878662109375,
-0.068115234375,
0.0341796875,
0.03070068359375,
-0.004581451416015625,
-0.01406097412109375,
-0.020782470703125,
-0.0292510986328125,
0.0012950897216796875,
-0.015106201171875,
0.0219879150390625,
-0.0141448974609375,
-0.017059326171875,
0.0169830322265625,
0.03814697265625,
-0.055023193359375,
0.00604248046875,
-0.020172119140625,
-0.002796173095703125,
0.0400390625,
0.01160430908203125,
0.023193359375,
-0.045562744140625,
-0.0360107421875,
-0.033172607421875,
-0.0384521484375,
0.005615234375,
0.0389404296875,
0.0286712646484375,
-0.040557861328125,
0.04443359375,
-0.006862640380859375,
0.045440673828125,
0.0078582763671875,
-0.01096343994140625,
0.0548095703125,
-0.04571533203125,
-0.01438140869140625,
-0.00872039794921875,
0.0897216796875,
0.020599365234375,
-0.0081939697265625,
0.01187896728515625,
-0.0179290771484375,
-0.00798797607421875,
-0.0016632080078125,
-0.06158447265625,
0.0015621185302734375,
0.016937255859375,
-0.0267791748046875,
-0.0160675048828125,
0.005229949951171875,
-0.06829833984375,
0.009490966796875,
-0.005352020263671875,
0.037017822265625,
-0.06597900390625,
-0.0207672119140625,
0.024169921875,
-0.00016677379608154297,
0.0278778076171875,
0.0029277801513671875,
-0.0374755859375,
0.009033203125,
0.0277099609375,
0.061309814453125,
-0.00890350341796875,
-0.02972412109375,
-0.02227783203125,
-0.0017042160034179688,
-0.01264190673828125,
0.04986572265625,
-0.0191802978515625,
-0.02484130859375,
-0.015380859375,
0.0240478515625,
-0.0172576904296875,
-0.0131683349609375,
0.07293701171875,
-0.0225830078125,
0.0386962890625,
-0.0228271484375,
-0.034271240234375,
-0.033599853515625,
0.0299224853515625,
-0.055938720703125,
0.0943603515625,
0.0173797607421875,
-0.0648193359375,
0.022735595703125,
-0.062255859375,
-0.0169219970703125,
-0.005584716796875,
0.0029163360595703125,
-0.052978515625,
0.00325775146484375,
0.01438140869140625,
0.0222930908203125,
-0.028411865234375,
0.035064697265625,
-0.01203155517578125,
-0.016693115234375,
0.0005273818969726562,
-0.0260772705078125,
0.09210205078125,
0.00850677490234375,
-0.038360595703125,
0.00791168212890625,
-0.06304931640625,
0.0007562637329101562,
0.02044677734375,
-0.0364990234375,
-0.0146331787109375,
-0.0183258056640625,
0.02197265625,
0.0006856918334960938,
0.022857666015625,
-0.048858642578125,
0.0290679931640625,
-0.04180908203125,
0.023956298828125,
0.058013916015625,
0.00002181529998779297,
0.0221710205078125,
-0.0229949951171875,
0.036651611328125,
0.003650665283203125,
0.0166473388671875,
0.0176544189453125,
-0.038360595703125,
-0.051177978515625,
-0.0141143798828125,
0.042999267578125,
0.053192138671875,
-0.052337646484375,
0.0560302734375,
-0.05413818359375,
-0.058837890625,
-0.043853759765625,
-0.01241302490234375,
0.040374755859375,
0.027374267578125,
0.042724609375,
-0.01343536376953125,
-0.040771484375,
-0.06707763671875,
-0.01450347900390625,
-0.01468658447265625,
0.0046844482421875,
0.025421142578125,
0.057708740234375,
-0.01244354248046875,
0.042816162109375,
-0.035369873046875,
-0.04534912109375,
-0.0090484619140625,
0.0183868408203125,
0.038970947265625,
0.049896240234375,
0.05255126953125,
-0.0653076171875,
-0.0440673828125,
-0.000027120113372802734,
-0.053497314453125,
-0.0182342529296875,
0.0062408447265625,
-0.01247406005859375,
0.0386962890625,
0.006320953369140625,
-0.032257080078125,
0.0211181640625,
0.050506591796875,
-0.055084228515625,
0.038909912109375,
-0.0116119384765625,
0.02716064453125,
-0.09759521484375,
0.02197265625,
-0.0163726806640625,
-0.00555419921875,
-0.02508544921875,
-0.0026531219482421875,
0.00673675537109375,
0.01081085205078125,
-0.0367431640625,
0.0528564453125,
-0.043487548828125,
0.00732421875,
0.0204315185546875,
0.0207672119140625,
0.003528594970703125,
0.049072265625,
-0.00997161865234375,
0.05706787109375,
0.041595458984375,
-0.028350830078125,
0.006641387939453125,
0.039276123046875,
-0.025848388671875,
0.032501220703125,
-0.0482177734375,
-0.00655364990234375,
0.0225830078125,
0.005123138427734375,
-0.060211181640625,
-0.01113128662109375,
0.0186004638671875,
-0.054962158203125,
0.024505615234375,
-0.01024627685546875,
-0.050262451171875,
-0.006526947021484375,
-0.0305938720703125,
0.040557861328125,
0.0306549072265625,
-0.0157623291015625,
0.06201171875,
0.01128387451171875,
0.00014090538024902344,
-0.047332763671875,
-0.06549072265625,
-0.0008921623229980469,
-0.0188140869140625,
-0.0521240234375,
0.03424072265625,
-0.00637054443359375,
0.00803375244140625,
0.016937255859375,
-0.01053619384765625,
-0.00817108154296875,
0.0106201171875,
0.01277923583984375,
0.027557373046875,
-0.0255584716796875,
0.006679534912109375,
-0.00533294677734375,
-0.0164947509765625,
-0.01406097412109375,
-0.01512908935546875,
0.059478759765625,
-0.0302276611328125,
-0.01580810546875,
-0.054779052734375,
0.004688262939453125,
0.031524658203125,
-0.0242156982421875,
0.06658935546875,
0.0506591796875,
-0.0089569091796875,
0.016998291015625,
-0.046661376953125,
-0.0032482147216796875,
-0.0279998779296875,
0.011383056640625,
-0.04345703125,
-0.05609130859375,
0.054534912109375,
0.0152587890625,
0.019744873046875,
0.070556640625,
0.036529541015625,
0.0135345458984375,
0.035980224609375,
0.02471923828125,
0.006122589111328125,
0.046630859375,
-0.05023193359375,
-0.0081939697265625,
-0.0693359375,
-0.017822265625,
-0.054962158203125,
-0.02655029296875,
-0.0667724609375,
-0.0208892822265625,
0.0207977294921875,
0.005741119384765625,
-0.00937652587890625,
0.059173583984375,
-0.0390625,
0.017486572265625,
0.048980712890625,
0.00658416748046875,
0.025909423828125,
-0.00981903076171875,
-0.0302886962890625,
0.00017905235290527344,
-0.038482666015625,
-0.0479736328125,
0.0941162109375,
0.01505279541015625,
0.0258941650390625,
0.02691650390625,
0.059539794921875,
-0.0025386810302734375,
0.00453948974609375,
-0.03997802734375,
0.0439453125,
-0.0154571533203125,
-0.0640869140625,
-0.03375244140625,
-0.03570556640625,
-0.081298828125,
0.01337432861328125,
-0.01983642578125,
-0.046234130859375,
0.017730712890625,
-0.0065460205078125,
-0.0139312744140625,
0.046539306640625,
-0.050262451171875,
0.07513427734375,
0.0025177001953125,
-0.0121612548828125,
0.01311492919921875,
-0.051483154296875,
0.0106658935546875,
-0.0077667236328125,
0.009124755859375,
-0.01568603515625,
-0.0118255615234375,
0.0628662109375,
-0.01082611083984375,
0.042999267578125,
-0.004673004150390625,
-0.01108551025390625,
0.0129852294921875,
0.00943756103515625,
0.0302276611328125,
0.0011148452758789062,
-0.0230255126953125,
0.039764404296875,
0.00543212890625,
-0.0377197265625,
-0.016082763671875,
0.05072021484375,
-0.060699462890625,
-0.02947998046875,
-0.040496826171875,
-0.04638671875,
0.01146697998046875,
0.04180908203125,
0.040283203125,
0.044921875,
-0.01363372802734375,
0.041778564453125,
0.04840087890625,
-0.02703857421875,
0.033599853515625,
0.049591064453125,
-0.00225067138671875,
-0.053466796875,
0.052154541015625,
0.0197906494140625,
0.0127716064453125,
0.044281005859375,
0.00994110107421875,
-0.02197265625,
-0.059326171875,
-0.033721923828125,
0.0301513671875,
-0.026824951171875,
-0.028839111328125,
-0.049041748046875,
0.004932403564453125,
-0.0408935546875,
0.0149993896484375,
-0.0252227783203125,
-0.03173828125,
-0.0214080810546875,
-0.01837158203125,
0.032562255859375,
0.03515625,
-0.00991058349609375,
0.01531982421875,
-0.0667724609375,
0.0258941650390625,
-0.01056671142578125,
0.03631591796875,
-0.023712158203125,
-0.058837890625,
-0.0063629150390625,
-0.0048370361328125,
-0.0297698974609375,
-0.07916259765625,
0.0341796875,
-0.005584716796875,
0.0224609375,
0.0120391845703125,
0.0167999267578125,
0.044281005859375,
-0.033111572265625,
0.07122802734375,
-0.0052490234375,
-0.06988525390625,
0.04522705078125,
-0.03448486328125,
0.0295867919921875,
0.06109619140625,
0.01708984375,
-0.0191802978515625,
-0.042510986328125,
-0.06585693359375,
-0.06988525390625,
0.06829833984375,
0.044281005859375,
-0.00925445556640625,
-0.006626129150390625,
0.008331298828125,
0.002227783203125,
-0.01202392578125,
-0.0819091796875,
-0.035400390625,
0.0032787322998046875,
-0.021087646484375,
0.0006365776062011719,
-0.03839111328125,
-0.007762908935546875,
-0.0106964111328125,
0.0882568359375,
0.01285552978515625,
0.007434844970703125,
0.04229736328125,
-0.01093292236328125,
0.00001043081283569336,
0.0316162109375,
0.054473876953125,
0.0311279296875,
-0.0361328125,
-0.00836181640625,
0.02783203125,
-0.038909912109375,
0.002635955810546875,
0.01131439208984375,
-0.030914306640625,
0.020233154296875,
0.03912353515625,
0.06231689453125,
0.01261138916015625,
-0.032257080078125,
0.032562255859375,
-0.00798797607421875,
-0.038909912109375,
-0.035858154296875,
-0.019317626953125,
0.00731658935546875,
0.01436614990234375,
0.0202484130859375,
-0.00879669189453125,
0.004245758056640625,
-0.01319122314453125,
0.0030765533447265625,
0.013580322265625,
-0.025604248046875,
-0.03143310546875,
0.0386962890625,
0.0079193115234375,
-0.0246124267578125,
0.03887939453125,
-0.0274810791015625,
-0.028564453125,
0.040863037109375,
0.0179595947265625,
0.08013916015625,
-0.01517486572265625,
-0.0104827880859375,
0.053192138671875,
0.037872314453125,
0.0033397674560546875,
0.041015625,
0.020050048828125,
-0.043304443359375,
-0.028564453125,
-0.0579833984375,
0.01064300537109375,
0.0197906494140625,
-0.060302734375,
0.031341552734375,
-0.00128173828125,
-0.0309906005859375,
-0.0052490234375,
0.027130126953125,
-0.04217529296875,
0.0005974769592285156,
-0.0195465087890625,
0.072265625,
-0.0689697265625,
0.061676025390625,
0.05712890625,
-0.046539306640625,
-0.0784912109375,
-0.00560760498046875,
-0.01320648193359375,
-0.036285400390625,
0.050537109375,
0.0073699951171875,
-0.0007386207580566406,
-0.002536773681640625,
-0.0220794677734375,
-0.0638427734375,
0.091064453125,
0.037933349609375,
-0.0279693603515625,
-0.0074462890625,
-0.0018663406372070312,
0.0345458984375,
0.0020294189453125,
0.0159149169921875,
0.032012939453125,
0.0655517578125,
-0.00982666015625,
-0.08062744140625,
0.0159454345703125,
-0.03167724609375,
-0.006443023681640625,
0.035858154296875,
-0.059539794921875,
0.07159423828125,
0.01220703125,
-0.02337646484375,
0.01116943359375,
0.038787841796875,
0.02825927734375,
0.008941650390625,
0.036773681640625,
0.080078125,
0.03277587890625,
-0.03570556640625,
0.07489013671875,
-0.018585205078125,
0.04833984375,
0.079345703125,
0.013092041015625,
0.055938720703125,
0.039093017578125,
-0.0150146484375,
0.052825927734375,
0.042724609375,
-0.020050048828125,
0.021881103515625,
-0.00528717041015625,
0.0074462890625,
-0.0188751220703125,
-0.0178070068359375,
-0.039276123046875,
0.0377197265625,
-0.00004684925079345703,
-0.014312744140625,
-0.01232147216796875,
-0.0202789306640625,
0.023590087890625,
0.007099151611328125,
0.002033233642578125,
0.054107666015625,
-0.005947113037109375,
-0.04302978515625,
0.0634765625,
-0.0034198760986328125,
0.047607421875,
-0.04345703125,
-0.006839752197265625,
-0.01502227783203125,
0.0112457275390625,
-0.00814056396484375,
-0.053955078125,
0.02801513671875,
0.0127716064453125,
-0.020599365234375,
-0.019561767578125,
0.0088958740234375,
-0.0302276611328125,
-0.051849365234375,
0.039215087890625,
0.0288848876953125,
0.023468017578125,
0.005615234375,
-0.059783935546875,
0.0015249252319335938,
0.013275146484375,
-0.045654296875,
0.0004363059997558594,
0.060760498046875,
0.00982666015625,
0.041595458984375,
0.03912353515625,
0.0055999755859375,
0.007427215576171875,
0.003170013427734375,
0.0513916015625,
-0.058807373046875,
-0.02569580078125,
-0.0711669921875,
0.0350341796875,
-0.0016527175903320312,
-0.04364013671875,
0.061981201171875,
0.0587158203125,
0.063232421875,
-0.0028076171875,
0.019439697265625,
-0.0227813720703125,
0.032073974609375,
-0.049285888671875,
0.04766845703125,
-0.07403564453125,
-0.00030231475830078125,
-0.0214385986328125,
-0.06427001953125,
-0.0246124267578125,
0.02569580078125,
-0.0245208740234375,
-0.006534576416015625,
0.073974609375,
0.05084228515625,
0.0020694732666015625,
-0.017486572265625,
0.0034427642822265625,
0.032562255859375,
0.018157958984375,
0.06304931640625,
0.021881103515625,
-0.06805419921875,
0.052764892578125,
-0.0209197998046875,
0.0020351409912109375,
-0.010833740234375,
-0.068603515625,
-0.06549072265625,
-0.0418701171875,
-0.0208892822265625,
-0.0275726318359375,
-0.01873779296875,
0.058990478515625,
0.0269775390625,
-0.072021484375,
-0.0193328857421875,
-0.0085906982421875,
0.0135345458984375,
-0.020660400390625,
-0.020538330078125,
0.05535888671875,
-0.00838470458984375,
-0.07470703125,
0.01715087890625,
0.004207611083984375,
0.01084136962890625,
0.007061004638671875,
-0.0013380050659179688,
-0.057373046875,
-0.00982666015625,
0.0165863037109375,
0.0014181137084960938,
-0.0677490234375,
-0.0027713775634765625,
0.01154327392578125,
-0.016448974609375,
0.0192718505859375,
0.00368499755859375,
-0.024566650390625,
0.01221466064453125,
0.061187744140625,
0.0244140625,
0.03265380859375,
-0.001251220703125,
0.0208892822265625,
-0.052825927734375,
0.0247650146484375,
0.0218505859375,
0.036651611328125,
0.0167999267578125,
-0.0120391845703125,
0.0660400390625,
0.0204620361328125,
-0.0301971435546875,
-0.07598876953125,
-0.00021851062774658203,
-0.095947265625,
-0.0095367431640625,
0.08123779296875,
-0.020294189453125,
-0.0283660888671875,
0.01462554931640625,
-0.01383209228515625,
0.03216552734375,
-0.04412841796875,
0.03778076171875,
0.059417724609375,
0.0205078125,
0.016510009765625,
-0.03924560546875,
0.0174407958984375,
0.045166015625,
-0.055328369140625,
-0.01078033447265625,
0.0243072509765625,
0.029541015625,
0.03729248046875,
0.043060302734375,
-0.03363037109375,
0.00870513916015625,
-0.01126861572265625,
0.0237579345703125,
-0.017578125,
-0.0086669921875,
-0.029144287109375,
0.004940032958984375,
-0.01325225830078125,
-0.019134521484375
]
] |
sentence-transformers/msmarco-distilbert-base-tas-b | 2022-08-18T16:35:39.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"distilbert",
"feature-extraction",
"sentence-similarity",
"transformers",
"en",
"dataset:ms_marco",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/msmarco-distilbert-base-tas-b | 27 | 13,498 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
language: "en"
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
datasets:
- ms_marco
---
# sentence-transformers/msmarco-distilbert-base-tas-b
This is a port of the [DistilBert TAS-B Model](https://huggingface.co/sebastian-hofstaetter/distilbert-dot-tas_b-b256-msmarco) to [sentence-transformers](https://www.SBERT.net): it maps sentences & paragraphs to a 768-dimensional dense vector space and is optimized for the task of semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer, util

query = "How many people live in London?"
docs = ["Around 9 Million people live in London", "London is known for its financial district"]

# Load the model
model = SentenceTransformer('sentence-transformers/msmarco-distilbert-base-tas-b')

# Encode query and documents
query_emb = model.encode(query)
doc_emb = model.encode(docs)

# Compute dot score between query and all document embeddings
scores = util.dot_score(query_emb, doc_emb)[0].cpu().tolist()

# Combine docs & scores
doc_score_pairs = list(zip(docs, scores))

# Sort by decreasing score
doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True)

# Output passages & scores
for doc, score in doc_score_pairs:
    print(score, doc)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
# CLS Pooling - Take output from first token
def cls_pooling(model_output):
    return model_output.last_hidden_state[:, 0]

# Encode text
def encode(texts):
    # Tokenize sentences
    encoded_input = tokenizer(texts, padding=True, truncation=True, return_tensors='pt')

    # Compute token embeddings
    with torch.no_grad():
        model_output = model(**encoded_input, return_dict=True)

    # Perform pooling
    embeddings = cls_pooling(model_output)

    return embeddings
# Sentences we want sentence embeddings for
query = "How many people live in London?"
docs = ["Around 9 Million people live in London", "London is known for its financial district"]
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/msmarco-distilbert-base-tas-b")
model = AutoModel.from_pretrained("sentence-transformers/msmarco-distilbert-base-tas-b")
# Encode query and docs
query_emb = encode(query)
doc_emb = encode(docs)

# Compute dot score between query and all document embeddings
scores = torch.mm(query_emb, doc_emb.transpose(0, 1))[0].cpu().tolist()

# Combine docs & scores
doc_score_pairs = list(zip(docs, scores))

# Sort by decreasing score
doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True)

# Output passages & scores
for doc, score in doc_score_pairs:
    print(score, doc)
```
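The CLS pooling used above simply slices out the first token's hidden state for each input. A minimal, model-free sketch of that operation (the dummy batch size, sequence length, and the `SimpleNamespace` stand-in for the model output are assumptions for illustration, not the model's real output):

```python
import torch
from types import SimpleNamespace

def cls_pooling(model_output):
    # Take the hidden state of the first ([CLS]) token for every sequence
    return model_output.last_hidden_state[:, 0]

# Fake a transformer output: batch of 2 sequences, 5 tokens each, hidden size 768
dummy = SimpleNamespace(last_hidden_state=torch.randn(2, 5, 768))
emb = cls_pooling(dummy)
print(emb.shape)  # one 768-dim vector per sequence
```

This is why the `Pooling` module in the architecture below is configured with `pooling_mode_cls_token: True` and mean/max pooling disabled.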
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/msmarco-distilbert-base-tas-b)
## Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DistilBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
Have a look at: [DistilBert TAS-B Model](https://huggingface.co/sebastian-hofstaetter/distilbert-dot-tas_b-b256-msmarco) | 3,988 | [
[
-0.0186004638671875,
-0.0635986328125,
0.034942626953125,
0.025482177734375,
-0.01145172119140625,
-0.01375579833984375,
-0.01922607421875,
-0.0006051063537597656,
0.00807952880859375,
0.019195556640625,
-0.0345458984375,
-0.046234130859375,
-0.06292724609375,
0.01264190673828125,
-0.033447265625,
0.05352783203125,
-0.00907135009765625,
0.001140594482421875,
-0.0198822021484375,
-0.0145416259765625,
-0.022613525390625,
-0.0171051025390625,
-0.0233001708984375,
-0.018310546875,
0.0175018310546875,
0.022796630859375,
0.045440673828125,
0.034759521484375,
0.025360107421875,
0.037628173828125,
-0.011260986328125,
0.0189971923828125,
-0.0274200439453125,
0.0029315948486328125,
-0.01148223876953125,
-0.0265960693359375,
-0.002353668212890625,
0.017852783203125,
0.0426025390625,
0.03399658203125,
-0.006214141845703125,
0.007152557373046875,
0.0024738311767578125,
0.033416748046875,
-0.0262451171875,
0.03350830078125,
-0.039215087890625,
0.004421234130859375,
0.0029354095458984375,
-0.00988006591796875,
-0.04681396484375,
-0.0163116455078125,
0.019195556640625,
-0.028167724609375,
0.019927978515625,
0.015838623046875,
0.09283447265625,
0.02752685546875,
-0.0284271240234375,
-0.032501220703125,
-0.023223876953125,
0.057037353515625,
-0.0537109375,
0.0162200927734375,
0.019073486328125,
0.002712249755859375,
-0.01039886474609375,
-0.076171875,
-0.05908203125,
-0.01380157470703125,
-0.038543701171875,
0.02105712890625,
-0.0240020751953125,
-0.0062255859375,
0.02252197265625,
0.0298614501953125,
-0.048583984375,
-0.003787994384765625,
-0.0419921875,
-0.016754150390625,
0.043731689453125,
0.01378631591796875,
0.0171356201171875,
-0.04296875,
-0.038818359375,
-0.032745361328125,
-0.017730712890625,
0.006473541259765625,
0.009521484375,
0.013702392578125,
-0.01192474365234375,
0.058685302734375,
-0.002777099609375,
0.043701171875,
0.0028781890869140625,
0.01094818115234375,
0.048980712890625,
-0.0239410400390625,
-0.01557159423828125,
-0.004802703857421875,
0.0836181640625,
0.02783203125,
0.01611328125,
-0.0002117156982421875,
-0.0140838623046875,
0.00887298583984375,
0.0136566162109375,
-0.07098388671875,
-0.0155792236328125,
0.0244293212890625,
-0.026763916015625,
-0.0202484130859375,
0.02117919921875,
-0.047576904296875,
-0.0107574462890625,
-0.0016880035400390625,
0.05902099609375,
-0.053741455078125,
0.00565338134765625,
0.0255584716796875,
-0.0297088623046875,
0.01494598388671875,
-0.015350341796875,
-0.053741455078125,
0.00858306884765625,
0.020751953125,
0.07489013671875,
0.00818634033203125,
-0.04925537109375,
-0.0294342041015625,
-0.005962371826171875,
0.00818634033203125,
0.046875,
-0.03009033203125,
-0.03009033203125,
0.008636474609375,
0.02056884765625,
-0.031982421875,
-0.02166748046875,
0.045196533203125,
-0.015167236328125,
0.05633544921875,
-0.005809783935546875,
-0.061309814453125,
-0.0108795166015625,
0.01262664794921875,
-0.041534423828125,
0.099365234375,
0.03363037109375,
-0.07342529296875,
0.0021514892578125,
-0.059783935546875,
-0.0302581787109375,
-0.010589599609375,
-0.0009365081787109375,
-0.040802001953125,
0.0101776123046875,
0.033782958984375,
0.04925537109375,
-0.00174713134765625,
0.01251220703125,
-0.0053863525390625,
-0.044097900390625,
0.023590087890625,
-0.024261474609375,
0.07830810546875,
0.01128387451171875,
-0.03839111328125,
0.003692626953125,
-0.033935546875,
-0.01071929931640625,
0.02227783203125,
-0.025146484375,
-0.0159454345703125,
-0.007389068603515625,
0.0222625732421875,
0.026641845703125,
0.02313232421875,
-0.04827880859375,
0.0262603759765625,
-0.038665771484375,
0.0595703125,
0.048858642578125,
-0.008514404296875,
0.039581298828125,
-0.0245208740234375,
0.016998291015625,
0.0251617431640625,
0.004329681396484375,
-0.01540374755859375,
-0.031158447265625,
-0.059783935546875,
-0.0248565673828125,
0.0250244140625,
0.0390625,
-0.05633544921875,
0.07135009765625,
-0.03839111328125,
-0.043487548828125,
-0.0667724609375,
-0.0106048583984375,
0.012481689453125,
0.047454833984375,
0.051849365234375,
0.004383087158203125,
-0.03253173828125,
-0.06591796875,
-0.008209228515625,
0.00530242919921875,
0.005855560302734375,
0.01226043701171875,
0.055572509765625,
-0.0275115966796875,
0.07177734375,
-0.0638427734375,
-0.040069580078125,
-0.0233306884765625,
0.009002685546875,
0.0281219482421875,
0.03564453125,
0.04443359375,
-0.0596923828125,
-0.03741455078125,
-0.035552978515625,
-0.047271728515625,
-0.0078277587890625,
-0.006870269775390625,
-0.01155853271484375,
0.0146636962890625,
0.042633056640625,
-0.056060791015625,
0.032135009765625,
0.037933349609375,
-0.05859375,
0.0296630859375,
-0.0260162353515625,
0.0030670166015625,
-0.1055908203125,
0.0060272216796875,
0.0016937255859375,
-0.0179290771484375,
-0.0218505859375,
0.007122039794921875,
0.00957489013671875,
-0.0018825531005859375,
-0.032958984375,
0.037872314453125,
-0.039276123046875,
0.01136016845703125,
0.007068634033203125,
0.036468505859375,
0.0237274169921875,
0.043365478515625,
-0.00920867919921875,
0.054595947265625,
0.046356201171875,
-0.02972412109375,
0.0245208740234375,
0.0504150390625,
-0.03253173828125,
0.017181396484375,
-0.053253173828125,
-0.0010328292846679688,
-0.01050567626953125,
0.0193023681640625,
-0.08990478515625,
0.0114593505859375,
0.0032100677490234375,
-0.040130615234375,
0.017608642578125,
0.011505126953125,
-0.055023193359375,
-0.0518798828125,
-0.03179931640625,
0.0014371871948242188,
0.03558349609375,
-0.03851318359375,
0.048858642578125,
0.01824951171875,
0.001987457275390625,
-0.048095703125,
-0.07659912109375,
-0.0093231201171875,
-0.00926971435546875,
-0.052947998046875,
0.04779052734375,
-0.0073699951171875,
0.01187896728515625,
0.0204620361328125,
0.00951385498046875,
0.0142974853515625,
0.0007338523864746094,
0.0110626220703125,
0.0235748291015625,
-0.0017309188842773438,
0.007053375244140625,
0.0140380859375,
-0.017822265625,
0.002834320068359375,
-0.0170440673828125,
0.06195068359375,
-0.0232391357421875,
-0.0140533447265625,
-0.0196380615234375,
0.0198211669921875,
0.044342041015625,
-0.0252685546875,
0.07855224609375,
0.07354736328125,
-0.02105712890625,
-0.01200103759765625,
-0.037872314453125,
-0.01136016845703125,
-0.037841796875,
0.041259765625,
-0.032196044921875,
-0.064208984375,
0.034759521484375,
0.007297515869140625,
0.0016307830810546875,
0.0582275390625,
0.046966552734375,
-0.0164794921875,
0.057342529296875,
0.0286712646484375,
-0.01432037353515625,
0.038543701171875,
-0.056854248046875,
0.01264190673828125,
-0.054962158203125,
-0.01406097412109375,
-0.0305023193359375,
-0.0328369140625,
-0.052978515625,
-0.0308074951171875,
0.02545166015625,
0.000537872314453125,
-0.00946044921875,
0.048858642578125,
-0.0606689453125,
0.020233154296875,
0.046417236328125,
0.0116729736328125,
0.00009316205978393555,
0.0031890869140625,
-0.0419921875,
-0.0032958984375,
-0.046234130859375,
-0.03448486328125,
0.07171630859375,
0.027069091796875,
0.0302581787109375,
0.0026569366455078125,
0.057891845703125,
0.0092315673828125,
0.004192352294921875,
-0.054595947265625,
0.035003662109375,
-0.0259246826171875,
-0.03094482421875,
-0.026641845703125,
-0.03179931640625,
-0.06976318359375,
0.032440185546875,
-0.0188140869140625,
-0.049041748046875,
-0.0004754066467285156,
-0.0248565673828125,
-0.01413726806640625,
0.0159454345703125,
-0.062286376953125,
0.08013916015625,
0.004192352294921875,
-0.02056884765625,
-0.00972747802734375,
-0.042724609375,
0.001590728759765625,
0.0203399658203125,
0.01525115966796875,
-0.0010652542114257812,
0.006732940673828125,
0.05194091796875,
-0.034271240234375,
0.042205810546875,
-0.017333984375,
0.0144500732421875,
0.02618408203125,
-0.01023101806640625,
0.022430419921875,
-0.0050048828125,
-0.016845703125,
0.00981903076171875,
0.00960540771484375,
-0.0345458984375,
-0.040740966796875,
0.05389404296875,
-0.06378173828125,
-0.0313720703125,
-0.043304443359375,
-0.04364013671875,
0.0030384063720703125,
0.01641845703125,
0.038604736328125,
0.038238525390625,
-0.0066986083984375,
0.0242156982421875,
0.0294189453125,
-0.0168609619140625,
0.06048583984375,
0.030548095703125,
-0.00464630126953125,
-0.033599853515625,
0.03631591796875,
0.012054443359375,
-0.005596160888671875,
0.038421630859375,
0.0292816162109375,
-0.0491943359375,
-0.0239105224609375,
-0.02703857421875,
0.0269927978515625,
-0.043487548828125,
-0.0181884765625,
-0.0601806640625,
-0.0301361083984375,
-0.049285888671875,
-0.0026760101318359375,
-0.014617919921875,
-0.020965576171875,
-0.02288818359375,
-0.02606201171875,
0.03363037109375,
0.0421142578125,
0.0008149147033691406,
0.0242156982421875,
-0.053253173828125,
0.01398468017578125,
-0.0021724700927734375,
0.0094451904296875,
-0.01145172119140625,
-0.06134033203125,
-0.0233001708984375,
-0.0011997222900390625,
-0.031982421875,
-0.0706787109375,
0.04156494140625,
0.00505828857421875,
0.03839111328125,
0.01551055908203125,
0.0063629150390625,
0.056182861328125,
-0.043914794921875,
0.057403564453125,
-0.00269317626953125,
-0.08233642578125,
0.040496826171875,
-0.0003018379211425781,
0.0248260498046875,
0.050689697265625,
0.0292205810546875,
-0.035797119140625,
-0.0300140380859375,
-0.05010986328125,
-0.07696533203125,
0.049072265625,
0.037322998046875,
0.021484375,
-0.01953125,
0.01885986328125,
-0.008331298828125,
0.005664825439453125,
-0.0665283203125,
-0.04058837890625,
-0.0266876220703125,
-0.04376220703125,
-0.020782470703125,
-0.0152587890625,
0.006320953369140625,
-0.04150390625,
0.0516357421875,
0.003261566162109375,
0.03411865234375,
0.039703369140625,
-0.0325927734375,
0.0260772705078125,
0.007244110107421875,
0.037933349609375,
0.0223541259765625,
-0.01580810546875,
0.0083465576171875,
0.01397705078125,
-0.03985595703125,
0.002826690673828125,
0.03753662109375,
-0.00797271728515625,
0.024932861328125,
0.03460693359375,
0.061431884765625,
0.023681640625,
-0.02899169921875,
0.0645751953125,
-0.005626678466796875,
-0.022613525390625,
-0.03851318359375,
-0.0086517333984375,
0.0203399658203125,
0.0229034423828125,
0.02947998046875,
-0.0010280609130859375,
-0.0008215904235839844,
-0.03594970703125,
0.0283050537109375,
0.011260986328125,
-0.0279541015625,
-0.002658843994140625,
0.048828125,
0.003925323486328125,
-0.0193023681640625,
0.0687255859375,
-0.0265350341796875,
-0.0509033203125,
0.0305023193359375,
0.040924072265625,
0.0601806640625,
-0.002483367919921875,
0.0154266357421875,
0.03839111328125,
0.0249481201171875,
-0.00823974609375,
0.015472412109375,
0.005855560302734375,
-0.058502197265625,
-0.00848388671875,
-0.056121826171875,
0.005558013916015625,
-0.00534820556640625,
-0.048492431640625,
0.0321044921875,
-0.00531005859375,
-0.013031005859375,
-0.004093170166015625,
0.017547607421875,
-0.066650390625,
0.002849578857421875,
0.0015478134155273438,
0.07586669921875,
-0.07647705078125,
0.0660400390625,
0.050537109375,
-0.0679931640625,
-0.0697021484375,
-0.007663726806640625,
-0.0283355712890625,
-0.055755615234375,
0.03546142578125,
0.0389404296875,
0.0106353759765625,
0.0188140869140625,
-0.0222930908203125,
-0.055572509765625,
0.118896484375,
0.0201263427734375,
-0.037628173828125,
-0.0220489501953125,
0.01088714599609375,
0.041778564453125,
-0.028167724609375,
0.04779052734375,
0.03729248046875,
0.039764404296875,
-0.005115509033203125,
-0.051513671875,
0.0177764892578125,
-0.017242431640625,
-0.0004718303680419922,
-0.007549285888671875,
-0.047821044921875,
0.07318115234375,
-0.003265380859375,
-0.018157958984375,
0.00042748451232910156,
0.048431396484375,
0.0177001953125,
0.007564544677734375,
0.03656005859375,
0.066162109375,
0.054168701171875,
-0.020477294921875,
0.07830810546875,
-0.0281829833984375,
0.06243896484375,
0.0650634765625,
0.0012836456298828125,
0.0726318359375,
0.035797119140625,
-0.0193634033203125,
0.06707763671875,
0.03631591796875,
-0.027618408203125,
0.04364013671875,
0.021148681640625,
0.005275726318359375,
0.005962371826171875,
0.0182342529296875,
-0.018768310546875,
0.054931640625,
0.01436614990234375,
-0.048736572265625,
-0.01065826416015625,
0.0003631114959716797,
0.00946044921875,
0.0183868408203125,
0.0025177001953125,
0.0423583984375,
0.00677490234375,
-0.039031982421875,
0.041412353515625,
0.0148773193359375,
0.0701904296875,
-0.03179931640625,
0.01326751708984375,
-0.01421356201171875,
0.0230865478515625,
-0.01082611083984375,
-0.0625,
0.028778076171875,
-0.0157928466796875,
-0.01354217529296875,
-0.0252685546875,
0.03387451171875,
-0.044342041015625,
-0.053070068359375,
0.04217529296875,
0.0411376953125,
0.01238250732421875,
-0.0062103271484375,
-0.07257080078125,
-0.0057525634765625,
0.00034618377685546875,
-0.04278564453125,
0.00737762451171875,
0.037078857421875,
0.0226898193359375,
0.035675048828125,
0.040740966796875,
-0.007152557373046875,
0.0010805130004882812,
0.0170745849609375,
0.0648193359375,
-0.055816650390625,
-0.036407470703125,
-0.07940673828125,
0.052642822265625,
-0.0226593017578125,
-0.031036376953125,
0.047698974609375,
0.054718017578125,
0.05816650390625,
-0.016693115234375,
0.03692626953125,
-0.017791748046875,
0.0179290771484375,
-0.040435791015625,
0.07421875,
-0.039459228515625,
0.004093170166015625,
-0.0165252685546875,
-0.07501220703125,
-0.004932403564453125,
0.06207275390625,
-0.0236968994140625,
-0.002685546875,
0.07562255859375,
0.07037353515625,
-0.01041412353515625,
-0.0221710205078125,
0.0122222900390625,
0.0345458984375,
0.01227569580078125,
0.043304443359375,
0.035675048828125,
-0.0712890625,
0.051361083984375,
-0.0311126708984375,
-0.004730224609375,
-0.01013946533203125,
-0.05889892578125,
-0.0709228515625,
-0.07318115234375,
-0.0245208740234375,
-0.0411376953125,
-0.02313232421875,
0.06689453125,
0.039337158203125,
-0.05316162109375,
0.0010662078857421875,
-0.00479888916015625,
-0.0018339157104492188,
-0.01102447509765625,
-0.0280914306640625,
0.0498046875,
-0.0283050537109375,
-0.0723876953125,
0.022491455078125,
-0.0029239654541015625,
-0.0024051666259765625,
-0.0218505859375,
0.0087432861328125,
-0.040985107421875,
0.00397491455078125,
0.0445556640625,
-0.0244903564453125,
-0.055450439453125,
-0.0280303955078125,
0.007015228271484375,
-0.03814697265625,
0.00024366378784179688,
0.02410888671875,
-0.059326171875,
0.018463134765625,
0.040283203125,
0.0321044921875,
0.061767578125,
-0.0021266937255859375,
0.0259246826171875,
-0.058868408203125,
0.021759033203125,
0.006031036376953125,
0.061004638671875,
0.033447265625,
-0.02459716796875,
0.044525146484375,
0.0225830078125,
-0.03106689453125,
-0.04864501953125,
-0.0064849853515625,
-0.08355712890625,
-0.029815673828125,
0.08282470703125,
-0.0213623046875,
-0.0283966064453125,
0.021209716796875,
-0.0259552001953125,
0.0352783203125,
-0.0301361083984375,
0.06146240234375,
0.059814453125,
0.01117706298828125,
-0.0028247833251953125,
-0.0219573974609375,
0.0149078369140625,
0.02960205078125,
-0.03057861328125,
-0.024505615234375,
0.01520538330078125,
0.038818359375,
0.0238189697265625,
0.033477783203125,
-0.014862060546875,
-0.00830078125,
0.01094818115234375,
0.006134033203125,
-0.036285400390625,
0.007740020751953125,
-0.0264892578125,
0.0160064697265625,
-0.03265380859375,
-0.03289794921875
]
] |
Helsinki-NLP/opus-mt-da-de | 2023-08-16T11:27:20.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"da",
"de",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-da-de | 0 | 13,494 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-da-de
* source languages: da
* target languages: de
* OPUS readme: [da-de](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/da-de/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-01-26.zip](https://object.pouta.csc.fi/OPUS-MT-models/da-de/opus-2020-01-26.zip)
* test set translations: [opus-2020-01-26.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/da-de/opus-2020-01-26.test.txt)
* test set scores: [opus-2020-01-26.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/da-de/opus-2020-01-26.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.da.de | 57.4 | 0.740 |
| 818 | [
[
-0.022003173828125,
-0.044036865234375,
0.0216064453125,
0.025848388671875,
-0.0311126708984375,
-0.0285491943359375,
-0.03131103515625,
-0.004367828369140625,
0.005279541015625,
0.032073974609375,
-0.046478271484375,
-0.045013427734375,
-0.0513916015625,
0.0212249755859375,
-0.01064300537109375,
0.054962158203125,
-0.00878143310546875,
0.033294677734375,
0.0173797607421875,
-0.0285491943359375,
-0.028839111328125,
-0.0301055908203125,
-0.036956787109375,
-0.0214080810546875,
0.02667236328125,
0.03265380859375,
0.02935791015625,
0.03411865234375,
0.0640869140625,
0.01751708984375,
-0.007274627685546875,
0.00408172607421875,
-0.0325927734375,
-0.0014362335205078125,
0.0083465576171875,
-0.039154052734375,
-0.0625,
-0.00817108154296875,
0.0736083984375,
0.03466796875,
-0.0020046234130859375,
0.025604248046875,
-0.0032749176025390625,
0.07464599609375,
-0.0270843505859375,
0.00696563720703125,
-0.0413818359375,
-0.0026416778564453125,
-0.0232391357421875,
-0.0183258056640625,
-0.04962158203125,
-0.019500732421875,
0.01036834716796875,
-0.04541015625,
-0.01024627685546875,
-0.001354217529296875,
0.10491943359375,
0.031524658203125,
-0.03436279296875,
-0.00908660888671875,
-0.041839599609375,
0.07525634765625,
-0.06072998046875,
0.0391845703125,
0.0384521484375,
0.02325439453125,
0.01224517822265625,
-0.041778564453125,
-0.028778076171875,
0.0066070556640625,
-0.00988006591796875,
0.01274871826171875,
-0.01088714599609375,
-0.020904541015625,
0.01934814453125,
0.0556640625,
-0.051300048828125,
0.00276947021484375,
-0.036529541015625,
0.005634307861328125,
0.052764892578125,
0.004703521728515625,
0.0111236572265625,
-0.01062774658203125,
-0.03082275390625,
-0.044830322265625,
-0.05267333984375,
-0.004367828369140625,
0.0280609130859375,
0.01432037353515625,
-0.034759521484375,
0.05255126953125,
-0.010345458984375,
0.0516357421875,
0.006038665771484375,
0.0020694732666015625,
0.07269287109375,
-0.025543212890625,
-0.0272979736328125,
-0.01218414306640625,
0.08526611328125,
0.0243988037109375,
0.00836944580078125,
0.005847930908203125,
-0.0153350830078125,
-0.0142059326171875,
0.008819580078125,
-0.06988525390625,
-0.01216888427734375,
0.0124969482421875,
-0.03375244140625,
-0.010345458984375,
0.0017299652099609375,
-0.050933837890625,
0.0162506103515625,
-0.035980224609375,
0.04962158203125,
-0.048553466796875,
-0.0211639404296875,
0.0290985107421875,
0.0016126632690429688,
0.022613525390625,
-0.00240325927734375,
-0.04705810546875,
0.0152130126953125,
0.025726318359375,
0.05517578125,
-0.033050537109375,
-0.019500732421875,
-0.03973388671875,
-0.01253509521484375,
-0.00847625732421875,
0.049163818359375,
-0.007282257080078125,
-0.03289794921875,
-0.0079345703125,
0.0298004150390625,
-0.0266571044921875,
-0.0262298583984375,
0.10321044921875,
-0.0268707275390625,
0.048980712890625,
-0.033477783203125,
-0.043609619140625,
-0.029754638671875,
0.032135009765625,
-0.048553466796875,
0.09063720703125,
0.00945281982421875,
-0.0628662109375,
0.021026611328125,
-0.0654296875,
-0.01406097412109375,
-0.003437042236328125,
0.0021266937255859375,
-0.042572021484375,
0.007617950439453125,
0.012908935546875,
0.031341552734375,
-0.024322509765625,
0.02191162109375,
-0.00229644775390625,
-0.0273895263671875,
0.0025424957275390625,
-0.034271240234375,
0.08062744140625,
0.022705078125,
-0.0300140380859375,
0.0121612548828125,
-0.0748291015625,
-0.0021820068359375,
-0.0006771087646484375,
-0.035980224609375,
-0.0212860107421875,
0.00627899169921875,
0.0201873779296875,
0.0096893310546875,
0.0234222412109375,
-0.047576904296875,
0.02362060546875,
-0.0498046875,
0.0135498046875,
0.048248291015625,
-0.02484130859375,
0.02398681640625,
-0.030914306640625,
0.023345947265625,
0.0109405517578125,
-0.00001895427703857422,
0.004329681396484375,
-0.03436279296875,
-0.061737060546875,
-0.0252227783203125,
0.049560546875,
0.07733154296875,
-0.05255126953125,
0.07000732421875,
-0.051605224609375,
-0.0565185546875,
-0.05450439453125,
-0.0175933837890625,
0.0294647216796875,
0.02899169921875,
0.041748046875,
-0.0111846923828125,
-0.029754638671875,
-0.07733154296875,
-0.004245758056640625,
-0.006412506103515625,
-0.019287109375,
0.011962890625,
0.04998779296875,
-0.020660400390625,
0.040130615234375,
-0.04541015625,
-0.033233642578125,
-0.0025501251220703125,
0.0147552490234375,
0.035064697265625,
0.04437255859375,
0.0460205078125,
-0.069091796875,
-0.04718017578125,
-0.0031261444091796875,
-0.0509033203125,
-0.01557159423828125,
0.0157623291015625,
-0.00902557373046875,
0.0162506103515625,
0.00537872314453125,
-0.02093505859375,
0.00995635986328125,
0.05224609375,
-0.048309326171875,
0.041778564453125,
-0.01739501953125,
0.02734375,
-0.09796142578125,
0.0107574462890625,
-0.0198516845703125,
-0.0006566047668457031,
-0.02685546875,
-0.006389617919921875,
0.0162200927734375,
0.01025390625,
-0.053436279296875,
0.043182373046875,
-0.0191802978515625,
-0.0019311904907226562,
0.00916290283203125,
0.0020160675048828125,
0.006366729736328125,
0.051422119140625,
-0.00621795654296875,
0.06182861328125,
0.055877685546875,
-0.044189453125,
0.01505279541015625,
0.045989990234375,
-0.0308685302734375,
0.035125732421875,
-0.061737060546875,
-0.01323699951171875,
0.0251617431640625,
-0.00743865966796875,
-0.04803466796875,
0.0083465576171875,
0.01806640625,
-0.044921875,
0.029632568359375,
-0.0035457611083984375,
-0.059112548828125,
-0.0005397796630859375,
-0.01971435546875,
0.04132080078125,
0.052459716796875,
-0.0178375244140625,
0.040496826171875,
0.0120849609375,
-0.002292633056640625,
-0.03594970703125,
-0.07525634765625,
-0.0002923011779785156,
-0.0304412841796875,
-0.05517578125,
0.0254364013671875,
-0.031463623046875,
-0.00751495361328125,
0.005523681640625,
0.0226593017578125,
-0.0125885009765625,
0.0043182373046875,
0.00473785400390625,
0.01401519775390625,
-0.0293121337890625,
0.0074615478515625,
0.003936767578125,
-0.01497650146484375,
-0.00983428955078125,
-0.012786865234375,
0.0439453125,
-0.0229644775390625,
-0.0159912109375,
-0.043701171875,
0.00920867919921875,
0.044219970703125,
-0.0264739990234375,
0.06103515625,
0.043182373046875,
-0.00785064697265625,
0.00952911376953125,
-0.029754638671875,
0.00907135009765625,
-0.03216552734375,
0.007579803466796875,
-0.042022705078125,
-0.0533447265625,
0.0338134765625,
0.0094146728515625,
0.0264434814453125,
0.06719970703125,
0.05517578125,
0.0076904296875,
0.04876708984375,
0.0275421142578125,
0.0005955696105957031,
0.030364990234375,
-0.04034423828125,
-0.012847900390625,
-0.07373046875,
0.00921630859375,
-0.05438232421875,
-0.02667236328125,
-0.059478759765625,
-0.0166168212890625,
0.01788330078125,
0.00396728515625,
-0.01523590087890625,
0.0556640625,
-0.04296875,
0.01531219482421875,
0.0421142578125,
-0.0113372802734375,
0.0267181396484375,
0.0010538101196289062,
-0.037689208984375,
-0.0163726806640625,
-0.02874755859375,
-0.0428466796875,
0.10040283203125,
0.02508544921875,
0.02337646484375,
0.015960693359375,
0.03912353515625,
0.006259918212890625,
0.00762939453125,
-0.046844482421875,
0.033905029296875,
-0.026702880859375,
-0.048095703125,
-0.023529052734375,
-0.0428466796875,
-0.06805419921875,
0.030059814453125,
-0.0136260986328125,
-0.0310211181640625,
0.0158233642578125,
-0.00568389892578125,
-0.0052642822265625,
0.0298919677734375,
-0.054046630859375,
0.08526611328125,
-0.0019550323486328125,
-0.0012750625610351562,
0.01000213623046875,
-0.030242919921875,
0.0170135498046875,
-0.0076904296875,
0.0185394287109375,
-0.0132904052734375,
0.004970550537109375,
0.048004150390625,
-0.005397796630859375,
0.0278778076171875,
-0.004489898681640625,
-0.014678955078125,
0.0054168701171875,
0.004302978515625,
0.0254974365234375,
-0.0102691650390625,
-0.035369873046875,
0.035919189453125,
0.01055908203125,
-0.0328369140625,
-0.00846099853515625,
0.046173095703125,
-0.051300048828125,
-0.0021762847900390625,
-0.036773681640625,
-0.054046630859375,
0.00441741943359375,
0.0261993408203125,
0.048431396484375,
0.04833984375,
-0.0245361328125,
0.039215087890625,
0.0579833984375,
-0.0220794677734375,
0.030517578125,
0.052947998046875,
-0.0080413818359375,
-0.04290771484375,
0.05535888671875,
0.01218414306640625,
0.0264739990234375,
0.0457763671875,
0.008453369140625,
-0.01401519775390625,
-0.054046630859375,
-0.05108642578125,
0.0172882080078125,
-0.021087646484375,
-0.011383056640625,
-0.045257568359375,
-0.0017118453979492188,
-0.024322509765625,
0.0214996337890625,
-0.02874755859375,
-0.045684814453125,
-0.01165771484375,
-0.0135040283203125,
0.0193328857421875,
0.019989013671875,
-0.009979248046875,
0.03076171875,
-0.07623291015625,
0.0238189697265625,
-0.0030727386474609375,
0.026031494140625,
-0.0355224609375,
-0.06036376953125,
-0.03228759765625,
0.01125335693359375,
-0.049041748046875,
-0.0545654296875,
0.03704833984375,
0.010833740234375,
0.01381683349609375,
0.0222015380859375,
0.016448974609375,
0.0231170654296875,
-0.056610107421875,
0.067626953125,
-0.01189422607421875,
-0.0537109375,
0.035614013671875,
-0.032989501953125,
0.03338623046875,
0.0687255859375,
0.0224151611328125,
-0.018157958984375,
-0.0338134765625,
-0.052642822265625,
-0.0574951171875,
0.055267333984375,
0.05059814453125,
-0.0100555419921875,
0.00803375244140625,
-0.00406646728515625,
0.0010232925415039062,
0.0097198486328125,
-0.0782470703125,
-0.0322265625,
0.0080413818359375,
-0.0257110595703125,
-0.0160064697265625,
-0.02569580078125,
-0.0176239013671875,
-0.01568603515625,
0.0794677734375,
0.01361846923828125,
0.0108489990234375,
0.0302886962890625,
-0.00928497314453125,
-0.01126861572265625,
0.02734375,
0.07080078125,
0.042633056640625,
-0.041656494140625,
-0.01143646240234375,
0.022430419921875,
-0.0377197265625,
-0.00402069091796875,
0.01285552978515625,
-0.033905029296875,
0.0219879150390625,
0.029266357421875,
0.07269287109375,
0.005710601806640625,
-0.047271728515625,
0.0352783203125,
-0.0247955322265625,
-0.033905029296875,
-0.047760009765625,
-0.01067352294921875,
0.00604248046875,
-0.0012083053588867188,
0.0203094482421875,
0.004833221435546875,
0.0168304443359375,
-0.01348114013671875,
0.020538330078125,
0.004138946533203125,
-0.04547119140625,
-0.041046142578125,
0.038299560546875,
0.0094757080078125,
-0.0247039794921875,
0.034759521484375,
-0.032745361328125,
-0.037322998046875,
0.03106689453125,
0.009674072265625,
0.081298828125,
-0.01654052734375,
-0.01284027099609375,
0.05615234375,
0.03564453125,
-0.0137786865234375,
0.03448486328125,
0.007221221923828125,
-0.04705810546875,
-0.04681396484375,
-0.06396484375,
-0.01045989990234375,
0.00890350341796875,
-0.056610107421875,
0.03350830078125,
0.0299835205078125,
-0.0005445480346679688,
-0.0216522216796875,
0.0200958251953125,
-0.033905029296875,
0.013671875,
-0.0240478515625,
0.0771484375,
-0.0687255859375,
0.07330322265625,
0.03271484375,
-0.0225982666015625,
-0.05615234375,
-0.0130157470703125,
-0.0157623291015625,
-0.0292816162109375,
0.057281494140625,
0.0135955810546875,
0.019866943359375,
-0.005496978759765625,
-0.006412506103515625,
-0.056304931640625,
0.08062744140625,
0.0208282470703125,
-0.0458984375,
0.0013685226440429688,
0.007259368896484375,
0.0367431640625,
-0.022705078125,
0.00751495361328125,
0.03558349609375,
0.060028076171875,
0.0049285888671875,
-0.08477783203125,
-0.0196685791015625,
-0.0361328125,
-0.027984619140625,
0.040618896484375,
-0.0362548828125,
0.07354736328125,
0.02978515625,
-0.01528167724609375,
0.00296783447265625,
0.044189453125,
0.0181732177734375,
0.0210418701171875,
0.045684814453125,
0.09259033203125,
0.0239715576171875,
-0.034423828125,
0.07452392578125,
-0.019866943359375,
0.034027099609375,
0.08795166015625,
-0.00432586669921875,
0.068115234375,
0.0222015380859375,
-0.012847900390625,
0.042724609375,
0.0426025390625,
-0.0257110595703125,
0.038818359375,
0.00811004638671875,
0.01210784912109375,
-0.001850128173828125,
0.0170135498046875,
-0.054168701171875,
0.023956298828125,
0.007843017578125,
-0.009521484375,
0.0003972053527832031,
-0.001163482666015625,
0.0027790069580078125,
0.005031585693359375,
-0.004062652587890625,
0.048736572265625,
0.00010061264038085938,
-0.04339599609375,
0.06072998046875,
-0.00366973876953125,
0.05340576171875,
-0.047027587890625,
0.0019388198852539062,
-0.0012912750244140625,
0.022430419921875,
0.0023441314697265625,
-0.041473388671875,
0.043670654296875,
-0.0005860328674316406,
-0.024078369140625,
-0.0305938720703125,
0.0091094970703125,
-0.041656494140625,
-0.0625,
0.03741455078125,
0.031280517578125,
0.027435302734375,
-0.003299713134765625,
-0.063720703125,
-0.0003209114074707031,
0.00467681884765625,
-0.04461669921875,
0.0055389404296875,
0.05853271484375,
0.02459716796875,
0.03411865234375,
0.043548583984375,
0.019134521484375,
0.012786865234375,
-0.0015707015991210938,
0.046112060546875,
-0.032562255859375,
-0.0296630859375,
-0.0638427734375,
0.056396484375,
-0.006561279296875,
-0.0467529296875,
0.056304931640625,
0.07232666015625,
0.0745849609375,
-0.012908935546875,
0.01629638671875,
-0.01470184326171875,
0.055267333984375,
-0.0455322265625,
0.049591064453125,
-0.07080078125,
0.01763916015625,
-0.0108184814453125,
-0.0736083984375,
-0.014678955078125,
0.0221099853515625,
-0.01439666748046875,
-0.0250701904296875,
0.061187744140625,
0.04693603515625,
-0.01244354248046875,
-0.0215911865234375,
0.0234222412109375,
0.027984619140625,
0.0160064697265625,
0.044586181640625,
0.0241546630859375,
-0.07720947265625,
0.04302978515625,
-0.0214080810546875,
-0.011444091796875,
-0.0004830360412597656,
-0.059417724609375,
-0.061737060546875,
-0.052978515625,
-0.0186004638671875,
-0.01486968994140625,
-0.0257110595703125,
0.05511474609375,
0.0391845703125,
-0.06951904296875,
-0.03857421875,
0.0005440711975097656,
0.00836181640625,
-0.01904296875,
-0.0190582275390625,
0.054962158203125,
-0.024322509765625,
-0.08367919921875,
0.036956787109375,
0.00492095947265625,
-0.00606536865234375,
0.0011911392211914062,
-0.01959228515625,
-0.04217529296875,
-0.006488800048828125,
0.0260772705078125,
-0.0026302337646484375,
-0.040496826171875,
0.00606536865234375,
0.0109405517578125,
-0.0095062255859375,
0.02691650390625,
0.02178955078125,
-0.0244140625,
0.0180816650390625,
0.06268310546875,
0.03369140625,
0.0340576171875,
-0.01195526123046875,
0.041900634765625,
-0.04547119140625,
0.0240478515625,
0.01776123046875,
0.041656494140625,
0.024078369140625,
-0.005474090576171875,
0.0634765625,
0.01220703125,
-0.0560302734375,
-0.07757568359375,
0.00659942626953125,
-0.09368896484375,
-0.0058135986328125,
0.06640625,
-0.0218963623046875,
-0.01910400390625,
0.0234222412109375,
-0.017181396484375,
0.01401519775390625,
-0.024078369140625,
0.038665771484375,
0.066162109375,
0.031005859375,
0.0134124755859375,
-0.05340576171875,
0.027587890625,
0.042388916015625,
-0.056884765625,
-0.0129241943359375,
0.0226898193359375,
0.0104217529296875,
0.031219482421875,
0.036529541015625,
-0.0262298583984375,
0.0050201416015625,
-0.018463134765625,
0.024627685546875,
-0.009613037109375,
-0.014678955078125,
-0.0283050537109375,
0.0011119842529296875,
-0.0104217529296875,
-0.0281219482421875
]
] |
LTP/small | 2022-09-19T06:36:05.000Z | [
"transformers",
"pytorch",
"endpoints_compatible",
"region:us"
] | null | LTP | null | null | LTP/small | 4 | 13,446 | transformers | 2022-08-14T04:14:58 | 


| Language | version |
| ------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| [Python](python/interface/README.md) | [](https://pypi.org/project/ltp) [](https://pypi.org/project/ltp-core) [](https://pypi.org/project/ltp-extension) |
| [Rust](rust/ltp/README.md) | [](https://crates.io/crates/ltp) |
# LTP 4
LTP (Language Technology Platform) provides a suite of Chinese natural language processing tools that users can apply to Chinese text for word segmentation, part-of-speech tagging, syntactic parsing, and more.
## Citation
If you use LTP in your work, you can cite this paper:
```bibtex
@article{che2020n,
  title={N-LTP: An Open-source Neural Chinese Language Technology Platform with Pretrained Models},
author={Che, Wanxiang and Feng, Yunlong and Qin, Libo and Liu, Ting},
journal={arXiv preprint arXiv:2009.11616},
year={2020}
}
```
**Reference book:**
*[自然语言处理:基于预训练模型的方法](https://item.jd.com/13344628.html)* (*Natural Language Processing: A Pre-trained Model Approach*; authors: Wanxiang Che, Jiang Guo, Yiming Cui; chief reviewer: Ting Liu), co-authored by scholars at the Research Center for Social Computing and Information Retrieval at Harbin Institute of Technology (HIT-SCIR), has now been published. The book focuses on new NLP techniques based on pre-trained models, covering three parts (fundamentals, pre-trained word vectors, and pre-trained models), and can serve as a study reference for LTP users.
### Release Notes
- 4.2.0
- \[Structural change\] LTP is split into 2 packages, making maintenance and training easier and the structure cleaner
- \[Legacy model\] To meet the widespread demand for **inference speed**, the perceptron-based algorithms were rewritten in Rust; accuracy is on par with LTP 3, while speed is **3.55** times that of LTP v3, rising to **17.17** times with multithreading enabled. Currently only the three tasks of word segmentation, POS tagging, and named entity recognition are supported
- \[Deep learning models\] Deep learning models implemented in PyTorch, supporting all 6 tasks (segmentation/POS/NER/SRL/DEP/SDP)
- \[Other improvements\] Improved the model training procedure
- \[Common\] Training scripts and examples are provided, so users can more conveniently train personalized models on their own private data
- \[Deep learning models\] Training is configured via hydra, making it easy to adjust training parameters and to extend LTP (e.g., by using Modules from other packages)
- \[Other changes\] The decoding algorithms for word segmentation, dependency parsing (Eisner), and semantic dependency parsing (Eisner) are implemented in Rust for higher speed
- \[New feature\] Models are uploaded to the [Huggingface Hub](https://huggingface.co/LTP) with automatic, faster downloads; users can also upload models they trained themselves for LTP inference
- \[Breaking change\] Inference now goes through a Pipeline API, which enables deeper performance optimizations later (e.g., SDP and SDPG overlap substantially, so reuse can speed up inference); see the [Quick Start section on Github](https://github.com/hit-scir/ltp)
- 4.1.0
- Added custom word segmentation and other features
- Fixed some bugs
- 4.0.0
- Developed on PyTorch with a native Python interface
- Models with different speed/accuracy trade-offs can be freely chosen as needed
- 6 tasks: word segmentation, POS tagging, NER, dependency parsing, semantic role labeling, and semantic dependency parsing
## Quick Start
### [Python](python/interface/README.md)
```bash
pip install -U ltp ltp-core ltp-extension -i https://pypi.org/simple # install ltp
```
**Note:** If you run into any errors, try reinstalling ltp with the command above; if the error persists, please report it in the Github issues.
```python
import torch
from ltp import LTP
ltp = LTP("LTP/small")  # loads the Small model by default
# move the model to the GPU
if torch.cuda.is_available():
# ltp.cuda()
ltp.to("cuda")
output = ltp.pipeline(["他叫汤姆去拿外衣。"], tasks=["cws", "pos", "ner", "srl", "dep", "sdp"])
# results come back in dict format
print(output.cws)  # print(output[0]) / print(output['cws'])  # index/key access also works
print(output.pos)
print(output.sdp)
# perceptron-based word segmentation, POS tagging, and NER: fast, but slightly less accurate
ltp = LTP("LTP/legacy")
# cws, pos, ner = ltp.pipeline(["他叫汤姆去拿外衣。"], tasks=["cws", "ner"]).to_tuple()  # error: NER requires the POS tagging results
cws, pos, ner = ltp.pipeline(["他叫汤姆去拿外衣。"], tasks=["cws", "pos", "ner"]).to_tuple()  # to_tuple() converts the result to tuple format
# results come back in tuple format
print(cws, pos, ner)
```
**[Detailed documentation](python/interface/docs/quickstart.rst)**
### [Rust](rust/ltp/README.md)
```rust
use std::fs::File;
use itertools::multizip;
use ltp::{CWSModel, POSModel, NERModel, ModelSerde, Format, Codec};
fn main() -> Result<(), Box<dyn std::error::Error>> {
let file = File::open("data/legacy-models/cws_model.bin")?;
let cws: CWSModel = ModelSerde::load(file, Format::AVRO(Codec::Deflate))?;
let file = File::open("data/legacy-models/pos_model.bin")?;
let pos: POSModel = ModelSerde::load(file, Format::AVRO(Codec::Deflate))?;
let file = File::open("data/legacy-models/ner_model.bin")?;
let ner: NERModel = ModelSerde::load(file, Format::AVRO(Codec::Deflate))?;
let words = cws.predict("他叫汤姆去拿外衣。")?;
let pos = pos.predict(&words)?;
let ner = ner.predict((&words, &pos))?;
for (w, p, n) in multizip((words, pos, ner)) {
println!("{}/{}/{}", w, p, n);
}
Ok(())
}
```
## Model Performance and Downloads
| Deep Learning Model | CWS | POS | NER | SRL | DEP | SDP | Speed (sent/s) |
| :---------------------------------------: | :---: | :---: | :---: | :---: | :---: | :---: | :-----: |
| [Base](https://huggingface.co/LTP/base) | 98.7 | 98.5 | 95.4 | 80.6 | 89.5 | 75.2 | 39.12 |
| [Base1](https://huggingface.co/LTP/base1) | 99.22 | 98.73 | 96.39 | 79.28 | 89.57 | 76.57 | --.-- |
| [Base2](https://huggingface.co/LTP/base2) | 99.18 | 98.69 | 95.97 | 79.49 | 90.19 | 76.62 | --.-- |
| [Small](https://huggingface.co/LTP/small) | 98.4 | 98.2 | 94.3 | 78.4 | 88.3 | 74.7 | 43.13 |
| [Tiny](https://huggingface.co/LTP/tiny) | 96.8 | 97.1 | 91.6 | 70.9 | 83.8 | 70.1 | 53.22 |
| Perceptron Model | CWS | POS | NER | Speed (sent/s) | Notes |
| :-----------------------------------------: | :---: | :---: | :---: | :------: | :------------------------: |
| [Legacy](https://huggingface.co/LTP/legacy) | 97.93 | 98.41 | 94.28 | 21581.48 | [Performance details](rust/ltp/README.md) |
**Note: perceptron speeds were measured with 16 threads enabled**
## Building a Wheel Package
```shell script
make bdist
```
## Other Language Bindings
**Perceptron algorithm**
- [Rust](rust/ltp)
- [C/C++](rust/ltp-cffi)
**Deep learning models**
- [Rust](https://github.com/HIT-SCIR/libltp/tree/master/ltp-rs)
- [C++](https://github.com/HIT-SCIR/libltp/tree/master/ltp-cpp)
- [Java](https://github.com/HIT-SCIR/libltp/tree/master/ltp-java)
## Authors
- Yunlong Feng (冯云龙) \<\<[ylfeng@ir.hit.edu.cn](mailto:ylfeng@ir.hit.edu.cn)>>
## License
1. The source code of the Language Technology Platform is freely available to universities in China and abroad, institutes of the Chinese Academy of Sciences, and individual researchers; however, if these institutions or individuals use the platform for commercial purposes (e.g., corporate cooperation projects), a fee is required.
2. Enterprises and public institutions other than those above must pay to use the platform.
3. For any payment matters, please email car@ir.hit.edu.cn.
4. If you publish a paper or obtain research results based on LTP, please state that "the Language Technology Platform (LTP) developed by the Research Center for Social Computing and Information Retrieval, Harbin Institute of Technology was used" when publishing or reporting,
and also email car@ir.hit.edu.cn with the title and venue of the paper or result.
| 6,719 | [
] |
nvidia/mit-b5 | 2022-08-06T10:25:24.000Z | [
"transformers",
"pytorch",
"tf",
"segformer",
"image-classification",
"vision",
"dataset:imagenet_1k",
"arxiv:2105.15203",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | image-classification | nvidia | null | null | nvidia/mit-b5 | 2 | 13,422 | transformers | 2022-03-02T23:29:05 | ---
license: other
tags:
- vision
datasets:
- imagenet_1k
widget:
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg
example_title: House
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg
example_title: Castle
---
# SegFormer (b5-sized) encoder pre-trained-only
SegFormer encoder pre-trained on ImageNet-1k. It was introduced in the paper [SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers](https://arxiv.org/abs/2105.15203) by Xie et al. and first released in [this repository](https://github.com/NVlabs/SegFormer).
Disclaimer: The team releasing SegFormer did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
SegFormer consists of a hierarchical Transformer encoder and a lightweight all-MLP decode head to achieve great results on semantic segmentation benchmarks such as ADE20K and Cityscapes. The hierarchical Transformer is first pre-trained on ImageNet-1k, after which a decode head is added and fine-tuned altogether on a downstream dataset.
This repository only contains the pre-trained hierarchical Transformer, hence it can be used for fine-tuning purposes.
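Since this checkpoint contains only the encoder, fine-tuning means attaching the all-MLP decode head and training on a labeled segmentation dataset. The sketch below illustrates that setup schematically; it uses a tiny, randomly initialized `SegformerConfig` (not the real mit-b5 sizes, which are noted in the comments) and dummy data so it runs without downloading weights. In practice you would instead load the pre-trained encoder with `SegformerForSemanticSegmentation.from_pretrained("nvidia/mit-b5", num_labels=...)`.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# Tiny configuration so the sketch runs offline; mit-b5's real values are noted inline.
config = SegformerConfig(
    hidden_sizes=[8, 16, 40, 64],   # mit-b5 uses [64, 128, 320, 512]
    depths=[1, 1, 1, 1],            # mit-b5 uses [3, 6, 40, 3]
    decoder_hidden_size=32,         # width of the all-MLP decode head
    num_labels=10,                  # number of segmentation classes in your dataset
)
model = SegformerForSemanticSegmentation(config)

pixel_values = torch.randn(1, 3, 64, 64)     # dummy image batch
labels = torch.randint(0, 10, (1, 64, 64))   # dummy per-pixel class labels
outputs = model(pixel_values=pixel_values, labels=labels)

# The decode head predicts at 1/4 of the input resolution;
# passing labels makes the model return a cross-entropy loss for training.
print(outputs.logits.shape)   # torch.Size([1, 10, 16, 16])
print(outputs.loss)
```

From here, a standard training loop (optimizer step on `outputs.loss`) fine-tunes the encoder and decode head together, as described above.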
## Intended uses & limitations
You can use the model for fine-tuning on semantic segmentation tasks. See the [model hub](https://huggingface.co/models?other=segformer) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model to classify an image from the COCO 2017 dataset into one of the 1,000 ImageNet classes:
```python
from transformers import SegformerFeatureExtractor, SegformerForImageClassification
from PIL import Image
import requests
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
feature_extractor = SegformerFeatureExtractor.from_pretrained("nvidia/mit-b5")
model = SegformerForImageClassification.from_pretrained("nvidia/mit-b5")
inputs = feature_extractor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits
# model predicts one of the 1000 ImageNet classes
predicted_class_idx = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class_idx])
```
For more code examples, we refer to the [documentation](https://huggingface.co/transformers/model_doc/segformer.html).
### License
The license for this model can be found [here](https://github.com/NVlabs/SegFormer/blob/master/LICENSE).
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2105-15203,
author = {Enze Xie and
Wenhai Wang and
Zhiding Yu and
Anima Anandkumar and
Jose M. Alvarez and
Ping Luo},
title = {SegFormer: Simple and Efficient Design for Semantic Segmentation with
Transformers},
journal = {CoRR},
volume = {abs/2105.15203},
year = {2021},
url = {https://arxiv.org/abs/2105.15203},
eprinttype = {arXiv},
eprint = {2105.15203},
timestamp = {Wed, 02 Jun 2021 11:46:42 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2105-15203.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
| 3,354 | [
] |
microsoft/BiomedVLP-BioViL-T | 2023-03-20T17:04:32.000Z | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"exbert",
"custom_code",
"en",
"arxiv:2301.04558",
"arxiv:2204.09817",
"license:mit",
"has_space",
"region:us"
] | feature-extraction | microsoft | null | null | microsoft/BiomedVLP-BioViL-T | 10 | 13,422 | transformers | 2023-02-17T10:19:58 | ---
language: en
tags:
- exbert
license: mit
widget:
- text: "Left pleural effusion with adjacent [MASK]."
example_title: "Radiology 1"
- text: "Heart size normal and lungs are [MASK]."
example_title: "Radiology 2"
- text: "[MASK] is a tumor suppressor gene."
example_title: "Biomedical"
- text: "The patient was on [MASK] for chronic atrial fibrillation"
example_title: "Medication"
---
# BioViL-T
[BioViL-T](https://arxiv.org/abs/2301.04558) is a domain-specific vision-language model designed to analyze chest X-rays (CXRs) and radiology reports. It was trained using a temporal multi-modal pre-training procedure, which distinguishes it from its predecessor model ([BioViL](https://www.ecva.net/papers/eccv_2022/papers_ECCV/papers/136960001.pdf)). In detail, BioViL-T takes advantage of the temporal structure between data points, resulting in improved downstream performance on multiple benchmarks, while using the same training dataset as its predecessor. In particular, the resultant model displays significant improvement in embedding temporal information present in the image and text modalities (see [results](#performance)), as well as in the joint space. The canonical model can be adapted to both single- and multi-image downstream applications including: natural language inference, phrase-grounding, image/text classification, and language decoding.
The corresponding BERT language model is trained in two stages: First, we pretrain [CXR-BERT-general](https://huggingface.co/microsoft/BiomedVLP-CXR-BERT-general) from a randomly initialized BERT model via Masked Language Modeling (MLM) on [PubMed](https://pubmed.ncbi.nlm.nih.gov/) abstracts and clinical notes from the publicly-available [MIMIC-III](https://physionet.org/content/mimiciii/1.4/) and [MIMIC-CXR](https://physionet.org/content/mimic-cxr/). The general model can be fine-tuned for research in other clinical domains by adjusting the parameters specific to the target domain. In the second stage, BioViL-T is continually pretrained from CXR-BERT-general using a multi-modal pre-training procedure by utilising radiology reports and sequences of chest X-rays. We utilise the latent representation of [CLS] token to align text and image embeddings.
## Language model variations
| Model | Model identifier on HuggingFace | Vocabulary | Note |
| ------------------------------------------------- | ----------------------------------------------------------------------------------------------------------- | -------------- | --------------------------------------------------------- |
| CXR-BERT-general | [microsoft/BiomedVLP-CXR-BERT-general](https://huggingface.co/microsoft/BiomedVLP-CXR-BERT-general) | PubMed & MIMIC | Pretrained for biomedical literature and clinical domains |
| CXR-BERT-specialized | [microsoft/BiomedVLP-CXR-BERT-specialized](https://huggingface.co/microsoft/BiomedVLP-CXR-BERT-specialized) | PubMed & MIMIC | Static pretraining for the CXR domain |
| BioViL-T | [microsoft/BiomedVLP-BioViL-T](https://huggingface.co/microsoft/BiomedVLP-BioViL-T) | PubMed & MIMIC | Static & temporal pretraining for the CXR domain |
## Image model
The image model is jointly trained with the text model in a multi-modal contrastive learning framework. It is a hybrid image encoder composed of a Vision Transformer and a ResNet-50, where the latter is used as the backbone network to extract features from images at each time point. The transformer is included in the design to aggregate and compare image features extracted across the temporal dimension. The corresponding model definition and its loading functions can be accessed through our [HI-ML-Multimodal](https://github.com/microsoft/hi-ml/blob/main/hi-ml-multimodal/src/health_multimodal/image/model/model.py) GitHub repository. The joint image and text model, namely [BioViL-T](https://arxiv.org/abs/2204.09817), can be used in phrase grounding applications as shown in this Python notebook [example](https://mybinder.org/v2/gh/microsoft/hi-ml/HEAD?labpath=hi-ml-multimodal%2Fnotebooks%2Fphrase_grounding.ipynb). Additionally, please check the [MS-CXR benchmark](https://physionet.org/content/ms-cxr/0.1/) for a more systematic evaluation of joint image and text models in phrase grounding tasks.
## Citation
The corresponding manuscript has been accepted for presentation at the [**Conference on Computer Vision and Pattern Recognition (CVPR) 2023**](https://cvpr2023.thecvf.com/).
```bibtex
@misc{https://doi.org/10.48550/arXiv.2301.04558,
doi = {10.48550/ARXIV.2301.04558},
url = {https://arxiv.org/abs/2301.04558},
author = {Bannur, Shruthi and Hyland, Stephanie and Liu, Qianchu and Perez-Garcia, Fernando and Ilse, Maximilian and Castro, Daniel C and Boecking, Benedikt and Sharma, Harshita and Bouzid, Kenza and Thieme, Anja and Schwaighofer, Anton and Wetscherek, Maria and Lungren, Matthew P and Nori, Aditya and Alvarez-Valle, Javier and Oktay, Ozan},
title = {Learning to Exploit Temporal Structure for Biomedical Vision–Language Processing},
publisher = {arXiv},
year = {2023},
}
```
## Model Use
### Intended Use
This model is intended to be used solely for (I) future research on visual-language processing and (II) reproducibility of the experimental results reported in the reference paper.
#### Primary Intended Use
The primary intended use is to support AI researchers building on top of this work. CXR-BERT and its associated models should be helpful for exploring various clinical NLP & VLP research questions, especially in the radiology domain.
#### Out-of-Scope Use
**Any** deployed use case of the model --- commercial or otherwise --- is currently out of scope. Although we evaluated the models using a broad set of publicly-available research benchmarks, the models and evaluations are not intended for deployed use cases. Under unanticipated conditions, the models may make inaccurate predictions and display limitations, which may require additional mitigation strategies. Therefore, we discourage use of the model for automated diagnosis or in a medical device. Please refer to [the associated paper](https://arxiv.org/abs/2301.04558) for more details.
### How to use
Here is how to use this model to extract radiological sentence embeddings and obtain their cosine similarity in the joint space (image and text):
```python
import torch
from transformers import AutoModel, AutoTokenizer
# Load the model and tokenizer
url = "microsoft/BiomedVLP-BioViL-T"
tokenizer = AutoTokenizer.from_pretrained(url, trust_remote_code=True)
model = AutoModel.from_pretrained(url, trust_remote_code=True)
# Input text prompts describing findings.
# The order of prompts is adjusted to capture the spectrum from absence of a finding to its temporal progression.
text_prompts = ["No pleural effusion or pneumothorax is seen.",
"There is no pneumothorax or pleural effusion.",
"The extent of the pleural effusion is reduced.",
"The extent of the pleural effusion remains constant.",
"Interval enlargement of pleural effusion."]
# Tokenize and compute the sentence embeddings
with torch.no_grad():
tokenizer_output = tokenizer.batch_encode_plus(batch_text_or_text_pairs=text_prompts,
add_special_tokens=True,
padding='longest',
return_tensors='pt')
embeddings = model.get_projected_text_embeddings(input_ids=tokenizer_output.input_ids,
attention_mask=tokenizer_output.attention_mask)
# Compute the cosine similarity of sentence embeddings obtained from input text prompts.
sim = torch.mm(embeddings, embeddings.t())
```
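Note that the matrix product in the snippet above yields cosine similarities only because the projected embeddings are assumed to be L2-normalized. A minimal, self-contained sketch with random stand-in vectors (no model download needed) illustrating the same computation:

```python
import torch
import torch.nn.functional as F

# Stand-in for projected sentence embeddings: 5 prompts in a 128-dim joint space.
# The real embeddings come from model.get_projected_text_embeddings(...).
embeddings = F.normalize(torch.randn(5, 128), dim=-1)

# Pairwise cosine similarity, exactly as in the snippet above
sim = torch.mm(embeddings, embeddings.t())

# Unit-norm rows make the diagonal 1 and bound every entry to [-1, 1]
assert torch.allclose(sim.diagonal(), torch.ones(5), atol=1e-5)
assert float(sim.abs().max()) <= 1.0 + 1e-5
```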
## Data
This model builds upon existing publicly-available datasets:
- [PubMed](https://pubmed.ncbi.nlm.nih.gov/)
- [MIMIC-III](https://physionet.org/content/mimiciii/)
- [MIMIC-CXR](https://physionet.org/content/mimic-cxr/)
These datasets reflect a broad variety of sources, ranging from biomedical abstracts to intensive-care-unit notes to chest X-ray radiology reports. The radiology reports in the MIMIC-CXR dataset are accompanied by their associated chest X-ray DICOM images.
## Performance
The presented model achieves state-of-the-art results in radiology natural language inference by leveraging semantics and discourse characteristics at training time more efficiently.
The experiments were performed on the RadNLI and MS-CXR-T benchmarks, which measure the quality of text embeddings in terms of static and temporal semantics respectively.
BioViL-T is benchmarked against other commonly used SOTA domain specific BERT models, including [PubMedBERT](https://aka.ms/pubmedbert) and [CXR-BERT](https://aka.ms/biovil).
The results below show that BioViL-T has increased sensitivity of sentence embeddings to temporal content (MS-CXR-T) whilst better capturing the static content (RadNLI).
| | MS-CXR-T | MS-CXR-T | RadNLI (2 classes) | RadNLI (2 classes) |
| ----------------------------------------------- | :-------------------------------: | :----------------------: | :-------------------------: | :-------------: |
| | Accuracy | ROC-AUC | Accuracy | ROC-AUC |
| [PubMedBERT](https://aka.ms/pubmedbert) | 60.39 | .542 | 81.38 | .727 |
| [CXR-BERT-General](https://huggingface.co/microsoft/BiomedVLP-CXR-BERT-general) | 62.60 | .601 | 87.59 | .902 |
| [CXR-BERT-Specialized](https://huggingface.co/microsoft/BiomedVLP-CXR-BERT-specialized) | 78.12 | .837 | 89.66 | .932 |
| **BioViL-T** | **87.77** | **.933** | **90.52** | **.947** |
The novel pretraining framework also yields better vision-language representations. Below is the zero-shot phrase grounding performance obtained on the [MS-CXR](https://physionet.org/content/ms-cxr/0.1/) benchmark dataset, which evaluates the quality of image-text latent representations.
| Vision–Language Pretraining Method | MS-CXR Phrase Grounding (Avg. CNR Score) | MS-CXR Phrase Grounding (mIoU) |
| ---------------------------------- | :--------------------------------------: | :----------------------------: |
| BioViL                             | 1.07 ± 0.04                              | 0.229 ± 0.005                  |
| BioViL-L                           | 1.21 ± 0.05                              | 0.202 ± 0.010                  |
| **BioViL-T**                       | **1.33 ± 0.04**                          | **0.240 ± 0.005**              |
Additional experimental results and discussion can be found in the corresponding paper, ["Learning to Exploit Temporal Structure for Biomedical Vision–Language Processing", CVPR'23](https://arxiv.org/abs/2301.04558).
## Limitations
This model was developed using English corpora, and thus can be considered English-only.
The training dataset contains only medical images and reports acquired from an intensive care unit (ICU), where longitudinal images are often collected within a range of hours or at most a few days. As a result, the models may show reduced performance in analyzing consecutive images acquired over longer periods of time (e.g. years), where significant anatomical variations are observed between the scans.
## Further information
Please refer to the corresponding paper, ["Learning to Exploit Temporal Structure for Biomedical Vision–Language Processing", CVPR'23](https://arxiv.org/abs/2301.04558) for additional details on the model training and evaluation.
For additional inference pipelines with BioViL-T, please refer to the [HI-ML GitHub](https://aka.ms/biovil-t-code) repository. The associated source files will soon be accessible through this link.
| 12,714 | [
[
-0.01300811767578125,
-0.051300048828125,
0.04412841796875,
0.007549285888671875,
-0.033660888671875,
-0.0040740966796875,
0.00409698486328125,
-0.0325927734375,
0.01157379150390625,
0.0225372314453125,
-0.02960205078125,
-0.057098388671875,
-0.05633544921875,
0.006622314453125,
-0.0039825439453125,
0.060577392578125,
-0.002162933349609375,
0.0228729248046875,
-0.0172576904296875,
-0.0266571044921875,
-0.0219268798828125,
-0.04510498046875,
-0.03363037109375,
-0.0276031494140625,
0.0258331298828125,
-0.01435089111328125,
0.059112548828125,
0.05401611328125,
0.042205810546875,
0.019256591796875,
-0.0069732666015625,
0.0022411346435546875,
-0.0166778564453125,
-0.0166778564453125,
-0.00791168212890625,
-0.0177459716796875,
-0.0306549072265625,
-0.004276275634765625,
0.047760009765625,
0.049072265625,
0.0102386474609375,
-0.00995635986328125,
0.0006532669067382812,
0.0277862548828125,
-0.0258026123046875,
0.0179595947265625,
-0.0269622802734375,
0.017791748046875,
0.0020599365234375,
-0.02825927734375,
-0.0272216796875,
-0.0186614990234375,
0.0298919677734375,
-0.039398193359375,
0.0112457275390625,
0.0142822265625,
0.10064697265625,
0.0124359130859375,
-0.025390625,
-0.01922607421875,
-0.01042938232421875,
0.059722900390625,
-0.050262451171875,
0.0452880859375,
0.0120391845703125,
0.0028438568115234375,
0.00665283203125,
-0.07501220703125,
-0.038055419921875,
-0.0193634033203125,
-0.0276336669921875,
0.0217742919921875,
-0.0289459228515625,
0.019287109375,
0.0102996826171875,
0.0197296142578125,
-0.0635986328125,
-0.006076812744140625,
-0.032745361328125,
-0.026611328125,
0.0215911865234375,
0.005794525146484375,
0.033447265625,
-0.035736083984375,
-0.0404052734375,
-0.0205535888671875,
-0.03704833984375,
-0.0023174285888671875,
-0.0148468017578125,
0.00473785400390625,
-0.037322998046875,
0.045166015625,
0.00911712646484375,
0.05712890625,
0.0077667236328125,
-0.0236968994140625,
0.059906005859375,
-0.0285491943359375,
-0.0311737060546875,
-0.002849578857421875,
0.0714111328125,
0.02508544921875,
-0.0014247894287109375,
-0.0108642578125,
0.00412750244140625,
0.018402099609375,
0.004276275634765625,
-0.060455322265625,
-0.01194000244140625,
0.02691650390625,
-0.07012939453125,
-0.035675048828125,
0.0007476806640625,
-0.043914794921875,
-0.0036163330078125,
-0.01331329345703125,
0.05377197265625,
-0.056488037109375,
-0.0009388923645019531,
0.01355743408203125,
-0.0178070068359375,
0.0224609375,
-0.0008668899536132812,
-0.025115966796875,
0.01238250732421875,
0.0203094482421875,
0.07965087890625,
-0.0008711814880371094,
-0.00734710693359375,
-0.03350830078125,
0.0025157928466796875,
-0.005687713623046875,
0.051971435546875,
-0.04248046875,
-0.0374755859375,
-0.00473785400390625,
0.0238189697265625,
-0.02056884765625,
-0.0298614501953125,
0.035186767578125,
-0.0240478515625,
0.033294677734375,
-0.0010433197021484375,
-0.03924560546875,
-0.0146636962890625,
0.01947021484375,
-0.037841796875,
0.07183837890625,
0.01322174072265625,
-0.06396484375,
0.01326751708984375,
-0.04156494140625,
-0.0273590087890625,
-0.004161834716796875,
-0.04150390625,
-0.057830810546875,
0.0007495880126953125,
0.039306640625,
0.0487060546875,
-0.00858306884765625,
0.020050048828125,
-0.0178375244140625,
-0.01218414306640625,
0.0185699462890625,
-0.004482269287109375,
0.06982421875,
0.0158233642578125,
-0.035064697265625,
0.004840850830078125,
-0.06195068359375,
0.005870819091796875,
0.0120697021484375,
-0.0105438232421875,
-0.0228424072265625,
-0.00521087646484375,
0.0136566162109375,
0.037078857421875,
0.01092529296875,
-0.050079345703125,
0.008209228515625,
-0.037384033203125,
0.030487060546875,
0.04058837890625,
0.0102996826171875,
0.01654052734375,
-0.04925537109375,
0.039031982421875,
0.018798828125,
0.0041046142578125,
-0.0290069580078125,
-0.033416748046875,
-0.03717041015625,
-0.0474853515625,
0.0106353759765625,
0.048675537109375,
-0.0616455078125,
0.03961181640625,
-0.0299072265625,
-0.039764404296875,
-0.047210693359375,
-0.018890380859375,
0.05615234375,
0.0528564453125,
0.051849365234375,
-0.0229339599609375,
-0.0465087890625,
-0.069091796875,
-0.0011434555053710938,
0.0011663436889648438,
0.0004303455352783203,
0.03680419921875,
0.034027099609375,
-0.0295257568359375,
0.059234619140625,
-0.047210693359375,
-0.04254150390625,
-0.006969451904296875,
0.025970458984375,
0.0190887451171875,
0.04248046875,
0.050994873046875,
-0.05133056640625,
-0.033447265625,
-0.0011339187622070312,
-0.0823974609375,
-0.0062713623046875,
-0.01410675048828125,
-0.0005841255187988281,
0.0213775634765625,
0.059051513671875,
-0.025482177734375,
0.040252685546875,
0.050506591796875,
-0.028045654296875,
0.0275421142578125,
-0.0263214111328125,
0.0098114013671875,
-0.10711669921875,
0.0202484130859375,
0.007843017578125,
-0.0271148681640625,
-0.0269775390625,
0.0016965866088867188,
0.007740020751953125,
-0.0230865478515625,
-0.034454345703125,
0.04840087890625,
-0.05377197265625,
0.0158233642578125,
0.0024566650390625,
0.0032100677490234375,
0.00971221923828125,
0.0374755859375,
0.0235595703125,
0.0423583984375,
0.0487060546875,
-0.031951904296875,
-0.006587982177734375,
0.031280517578125,
-0.0174407958984375,
0.032623291015625,
-0.0765380859375,
0.01236724853515625,
-0.021331787109375,
0.0184173583984375,
-0.07086181640625,
0.0079193115234375,
0.01506805419921875,
-0.04913330078125,
0.03692626953125,
-0.0059814453125,
-0.0299835205078125,
-0.0267791748046875,
-0.02850341796875,
0.0272064208984375,
0.042724609375,
-0.02978515625,
0.050445556640625,
0.0183258056640625,
-0.00281524658203125,
-0.047454833984375,
-0.06414794921875,
-0.0015554428100585938,
0.0042877197265625,
-0.0579833984375,
0.062225341796875,
-0.00290679931640625,
0.0013608932495117188,
0.02398681640625,
0.01024627685546875,
-0.0021076202392578125,
-0.0154266357421875,
0.0299072265625,
0.03204345703125,
-0.02496337890625,
0.008056640625,
0.0006003379821777344,
-0.0046844482421875,
-0.01025390625,
-0.02056884765625,
0.04425048828125,
-0.00992584228515625,
-0.018463134765625,
-0.043060302734375,
0.03271484375,
0.040802001953125,
-0.015960693359375,
0.062469482421875,
0.0592041015625,
-0.0401611328125,
0.0265960693359375,
-0.042633056640625,
-0.0028476715087890625,
-0.0330810546875,
0.044219970703125,
-0.0250396728515625,
-0.067626953125,
0.034149169921875,
0.01026153564453125,
-0.0202178955078125,
0.0404052734375,
0.0506591796875,
-0.01557159423828125,
0.07647705078125,
0.050079345703125,
0.003711700439453125,
0.029266357421875,
-0.031463623046875,
0.0103607177734375,
-0.07037353515625,
-0.0286407470703125,
-0.0338134765625,
-0.0058135986328125,
-0.044403076171875,
-0.040435791015625,
0.04901123046875,
-0.017364501953125,
0.001987457275390625,
0.00769805908203125,
-0.044525146484375,
0.005401611328125,
0.021881103515625,
0.04248046875,
0.0006856918334960938,
0.0126190185546875,
-0.041961669921875,
-0.019195556640625,
-0.051361083984375,
-0.0311431884765625,
0.0772705078125,
0.03033447265625,
0.048675537109375,
-0.0083770751953125,
0.058349609375,
0.009307861328125,
0.01071929931640625,
-0.049072265625,
0.02685546875,
-0.029693603515625,
-0.038421630859375,
0.0007710456848144531,
-0.0124969482421875,
-0.08953857421875,
0.0117950439453125,
-0.025726318359375,
-0.04437255859375,
0.0290679931640625,
0.0036468505859375,
-0.03924560546875,
0.019866943359375,
-0.0389404296875,
0.06439208984375,
-0.01499176025390625,
-0.026580810546875,
-0.004955291748046875,
-0.069091796875,
0.00351715087890625,
0.007213592529296875,
0.018585205078125,
0.01494598388671875,
-0.0118865966796875,
0.057098388671875,
-0.03778076171875,
0.07171630859375,
-0.015533447265625,
0.0208892822265625,
0.01415252685546875,
-0.0167694091796875,
0.0178985595703125,
-0.0058746337890625,
0.007843017578125,
0.026947021484375,
0.0172271728515625,
-0.032073974609375,
-0.022064208984375,
0.0362548828125,
-0.078857421875,
-0.0206756591796875,
-0.057159423828125,
-0.035614013671875,
-0.002056121826171875,
0.02618408203125,
0.04583740234375,
0.0634765625,
-0.01324462890625,
0.034027099609375,
0.061279296875,
-0.05560302734375,
0.020751953125,
0.017242431640625,
-0.00646209716796875,
-0.04815673828125,
0.05682373046875,
0.01218414306640625,
0.0108184814453125,
0.059234619140625,
0.0174102783203125,
-0.0203704833984375,
-0.039154052734375,
-0.0013055801391601562,
0.042633056640625,
-0.054107666015625,
-0.01885986328125,
-0.08270263671875,
-0.024993896484375,
-0.050506591796875,
-0.0277557373046875,
-0.005092620849609375,
-0.01300811767578125,
-0.03466796875,
0.005764007568359375,
0.019287109375,
0.028594970703125,
-0.006999969482421875,
0.0231781005859375,
-0.0711669921875,
0.0308380126953125,
0.0019931793212890625,
0.01068115234375,
-0.021148681640625,
-0.047760009765625,
-0.0164794921875,
-0.0105743408203125,
-0.028045654296875,
-0.06390380859375,
0.04888916015625,
0.0280609130859375,
0.055572509765625,
0.016937255859375,
-0.01499176025390625,
0.05426025390625,
-0.030670166015625,
0.059783935546875,
0.0182342529296875,
-0.06378173828125,
0.04669189453125,
-0.0222930908203125,
0.034332275390625,
0.0242156982421875,
0.0462646484375,
-0.0297698974609375,
-0.0177001953125,
-0.061798095703125,
-0.072265625,
0.0298919677734375,
0.0102386474609375,
-0.004611968994140625,
-0.0169677734375,
0.0263671875,
-0.004268646240234375,
-0.00008565187454223633,
-0.06298828125,
-0.0229339599609375,
-0.02581787109375,
-0.034332275390625,
-0.0006151199340820312,
-0.023590087890625,
-0.005615234375,
-0.021575927734375,
0.045654296875,
-0.01421356201171875,
0.063232421875,
0.054107666015625,
-0.031982421875,
0.0074005126953125,
0.00756072998046875,
0.04913330078125,
0.0411376953125,
-0.02752685546875,
0.0045318603515625,
0.0019512176513671875,
-0.044891357421875,
-0.0054931640625,
0.0216217041015625,
0.00864410400390625,
0.0172271728515625,
0.046661376953125,
0.050689697265625,
0.023834228515625,
-0.040252685546875,
0.0523681640625,
-0.01552581787109375,
-0.0264434814453125,
-0.0251617431640625,
-0.024139404296875,
-0.0012769699096679688,
0.0202484130859375,
0.0288543701171875,
0.012359619140625,
0.0003695487976074219,
-0.032318115234375,
0.0302886962890625,
0.027801513671875,
-0.039825439453125,
-0.01715087890625,
0.052886962890625,
-0.00464630126953125,
0.006191253662109375,
0.047210693359375,
-0.002033233642578125,
-0.041290283203125,
0.041778564453125,
0.04498291015625,
0.066162109375,
-0.006992340087890625,
0.0023059844970703125,
0.044464111328125,
0.0233001708984375,
0.01280975341796875,
0.0361328125,
0.01309967041015625,
-0.054473876953125,
-0.0246734619140625,
-0.038848876953125,
0.00365447998046875,
0.0066986083984375,
-0.05548095703125,
0.0305633544921875,
-0.041107177734375,
-0.016448974609375,
0.007282257080078125,
-0.0013141632080078125,
-0.065185546875,
0.03466796875,
0.01178741455078125,
0.0650634765625,
-0.06585693359375,
0.07586669921875,
0.059112548828125,
-0.05364990234375,
-0.05859375,
-0.00391387939453125,
0.002590179443359375,
-0.0770263671875,
0.057403564453125,
0.012451171875,
0.0030879974365234375,
0.00014972686767578125,
-0.035186767578125,
-0.06365966796875,
0.10052490234375,
0.008026123046875,
-0.036285400390625,
-0.019683837890625,
-0.0057830810546875,
0.04339599609375,
-0.034881591796875,
0.0174407958984375,
0.00975799560546875,
0.01380157470703125,
0.006107330322265625,
-0.07049560546875,
0.0210723876953125,
-0.0298919677734375,
0.0118408203125,
-0.00946044921875,
-0.044464111328125,
0.0653076171875,
-0.0184173583984375,
-0.01088714599609375,
0.014129638671875,
0.0460205078125,
0.0274658203125,
0.0175933837890625,
0.0234375,
0.0587158203125,
0.0460205078125,
-0.007232666015625,
0.0933837890625,
-0.028564453125,
0.0261383056640625,
0.06866455078125,
0.0010080337524414062,
0.051788330078125,
0.035003662109375,
-0.00853729248046875,
0.051025390625,
0.043365478515625,
-0.008819580078125,
0.051849365234375,
-0.00991058349609375,
-0.000614166259765625,
-0.003932952880859375,
-0.007083892822265625,
-0.053314208984375,
0.0276641845703125,
0.035797119140625,
-0.060638427734375,
-0.0038604736328125,
0.01385498046875,
0.0269775390625,
-0.01074981689453125,
-0.00753021240234375,
0.0521240234375,
0.0129852294921875,
-0.0286712646484375,
0.058197021484375,
-0.005062103271484375,
0.059417724609375,
-0.05218505859375,
-0.002323150634765625,
0.0016422271728515625,
0.0239105224609375,
-0.0157318115234375,
-0.04522705078125,
0.031890869140625,
-0.0281524658203125,
-0.01197052001953125,
-0.0104217529296875,
0.0474853515625,
-0.0279541015625,
-0.04229736328125,
0.028472900390625,
0.0283050537109375,
0.0030803680419921875,
0.01282501220703125,
-0.0758056640625,
0.023101806640625,
-0.0031948089599609375,
-0.0080108642578125,
0.024139404296875,
0.0285491943359375,
0.0009555816650390625,
0.035186767578125,
0.037384033203125,
0.006633758544921875,
0.0014781951904296875,
-0.001102447509765625,
0.07080078125,
-0.048004150390625,
-0.0457763671875,
-0.057281494140625,
0.042877197265625,
-0.006565093994140625,
-0.018890380859375,
0.039154052734375,
0.04510498046875,
0.058074951171875,
-0.0079345703125,
0.0653076171875,
-0.014129638671875,
0.031829833984375,
-0.050384521484375,
0.053375244140625,
-0.057464599609375,
0.0008716583251953125,
-0.03448486328125,
-0.039581298828125,
-0.0499267578125,
0.0611572265625,
-0.01454925537109375,
0.00949859619140625,
0.08197021484375,
0.0755615234375,
-0.0038928985595703125,
-0.030548095703125,
0.025726318359375,
0.03515625,
0.025238037109375,
0.038055419921875,
0.02227783203125,
-0.049591064453125,
0.03887939453125,
-0.0251617431640625,
-0.0294189453125,
-0.01552581787109375,
-0.07781982421875,
-0.07427978515625,
-0.0570068359375,
-0.047149658203125,
-0.0538330078125,
0.01776123046875,
0.08013916015625,
0.06341552734375,
-0.052032470703125,
0.0009899139404296875,
0.018218994140625,
-0.0189971923828125,
-0.0211029052734375,
-0.015625,
0.05865478515625,
-0.020843505859375,
-0.030059814453125,
0.00923919677734375,
0.02093505859375,
0.0153961181640625,
0.0007047653198242188,
0.002933502197265625,
-0.040802001953125,
-0.0028839111328125,
0.052886962890625,
0.0205230712890625,
-0.054229736328125,
-0.0222320556640625,
0.016876220703125,
-0.021148681640625,
0.022064208984375,
0.04718017578125,
-0.06622314453125,
0.04840087890625,
0.040252685546875,
0.040679931640625,
0.041015625,
-0.01800537109375,
0.0295257568359375,
-0.061431884765625,
0.0250701904296875,
0.014556884765625,
0.042205810546875,
0.0294647216796875,
-0.035675048828125,
0.0279541015625,
0.036651611328125,
-0.03875732421875,
-0.05096435546875,
-0.01068115234375,
-0.09857177734375,
-0.0292510986328125,
0.08099365234375,
-0.0153350830078125,
-0.03045654296875,
-0.008636474609375,
-0.0206451416015625,
0.0374755859375,
-0.01459503173828125,
0.046630859375,
0.035797119140625,
-0.0113067626953125,
-0.027618408203125,
-0.0178070068359375,
0.0242767333984375,
0.022613525390625,
-0.049560546875,
-0.0445556640625,
0.0225372314453125,
0.03839111328125,
0.019256591796875,
0.05780029296875,
-0.049560546875,
0.0232696533203125,
-0.01032257080078125,
0.01715087890625,
-0.0057220458984375,
0.0007781982421875,
-0.0390625,
0.0139923095703125,
-0.01090240478515625,
-0.024505615234375
]
] |
EleutherAI/pythia-1b | 2023-07-09T16:05:58.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"en",
"dataset:the_pile",
"arxiv:2304.01373",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-1b | 16 | 13,392 | transformers | 2023-03-10T21:42:46 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
license: apache-2.0
datasets:
- the_pile
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research [(see paper)](https://arxiv.org/pdf/2304.01373.pdf).
It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. We also provide 154 intermediate
checkpoints per model, hosted on Hugging Face as branches.
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
<details>
<summary style="font-weight:600">Details on previous early release and naming convention.</summary>
Previously, we released an early version of the Pythia suite to the public.
However, we decided to retrain the model suite to address a few hyperparameter
discrepancies. This model card <a href="#changelog">lists the changes</a>;
see appendix B in the Pythia paper for further discussion. We found no
difference in benchmark performance between the two Pythia versions.
The old models are
[still available](https://huggingface.co/models?other=pythia_v0), but we
suggest the retrained suite if you are just starting to use Pythia.<br>
**This is the current release.**
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
</details>
<br>
# Pythia-1B
## Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
[See paper](https://arxiv.org/pdf/2304.01373.pdf) for more evals and implementation
details.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.
ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 2M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 2M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 2M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
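The non-embedding parameter counts in the table follow directly from the layer count and model width. As a sanity check, the counts are reproduced below assuming a standard GPT-NeoX-style parameterization; the exact formula is an inference from the published numbers, not taken from the paper:

```python
def non_embedding_params(layers: int, d: int) -> int:
    # 12*d*d: QKV (3d^2) + attention output (d^2) + two MLP matrices (8d^2) per layer
    # 13*d:   per-layer biases (9d) and two LayerNorms (4d)
    # 2*d:    final LayerNorm weight and bias
    return layers * (12 * d * d + 13 * d) + 2 * d

assert non_embedding_params(6, 512) == 18_915_328        # Pythia-70M
assert non_embedding_params(16, 2048) == 805_736_448     # Pythia-1B
assert non_embedding_params(36, 5120) == 11_327_027_200  # Pythia-12B
```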
## Uses and Limitations
### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. We also provide
154 checkpoints per model: initial `step0`, 10 log-spaced checkpoints
`step{1,2,4...512}`, and 143 evenly-spaced checkpoints from `step1000` to
`step143000`. These checkpoints are hosted on Hugging Face as branches. Note
that branch `143000` corresponds exactly to the model checkpoint on the `main`
branch of each model.
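The checkpoint layout described above can be enumerated programmatically. A sketch, assuming the `step<N>` branch-name pattern stated in this card:

```python
# 1 initial + 10 log-spaced + 143 evenly-spaced checkpoints = 154 branches
log_spaced = [0] + [2 ** i for i in range(10)]   # step0, step1, step2, ..., step512
evenly_spaced = list(range(1000, 144000, 1000))  # step1000, step2000, ..., step143000
branches = [f"step{s}" for s in log_spaced + evenly_spaced]

assert len(branches) == 154
assert branches[0] == "step0" and branches[-1] == "step143000"
```

Any of these branch names can be passed as `revision=` to `from_pretrained` (e.g. `revision="step3000"`) to load that intermediate checkpoint; network access and the Transformers library are required.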
You may also further fine-tune and adapt Pythia-1B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-1B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions. For example,
the model may generate harmful or offensive text. Please evaluate the risks
associated with your particular use case.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-1B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-1B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “follow” human instructions.
### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on Pythia-1B to
produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-1B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting them to other people. Please inform your audience that the
text was generated by Pythia-1B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
## Training
### Training data
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-1B.
### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training,
from `step1000` to `step143000` (which is the same as `main`). In addition, we
also provide frequent early checkpoints: `step0` and `step{1,2,4...512}`.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for 143000 steps at a batch size
of 2M (2,097,152 tokens).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
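The step count, batch size, and token budget quoted above are mutually consistent, which is easy to verify:

```python
tokens_per_step = 2_097_152          # 2M-token batch per optimizer step
steps = 143_000
print(tokens_per_step * steps)       # 299892736000 tokens seen during training
print(tokens_per_step * 1_000)       # 2097152000 tokens between successive checkpoints
```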
## Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json/).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai_v1.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa_v1.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande_v1.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Easy Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_easy_v1.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq_v1.png" style="width:auto"/>
</details>
## Changelog
This section compares differences between previously released
[Pythia v0](https://huggingface.co/models?other=pythia_v0) and the current
models. See Appendix B of the Pythia paper for further discussion of these
changes and the motivation behind them. We found that retraining Pythia had no
impact on benchmark performance.
- All model sizes are now trained with uniform batch size of 2M tokens.
Previously, the models of size 160M, 410M, and 1.4B parameters were trained
with batch sizes of 4M tokens.
- We added checkpoints at initialization (step 0) and steps {1,2,4,8,16,32,64,
128,256,512} in addition to every 1000 training steps.
- Flash Attention was used in the new retrained suite.
- We remedied a minor inconsistency that existed in the original suite: all
models of size 2.8B parameters or smaller had a learning rate (LR) schedule
which decayed to a minimum LR of 10% of the starting LR, but the 6.9B and
12B models all used an LR schedule which decayed to a minimum LR of 0. In
the redone training runs, we rectified this inconsistency: all models were
trained with an LR decaying to a minimum of 0.1× their maximum LR.
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure>
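The gap between the total and non-embedding parameter counts in the table above corresponds to the untied input and output embedding matrices. A sketch assuming a vocabulary padded to 50,304 entries (an assumption based on the GPT-NeoX tokenizer; consult the model configs to confirm):

```python
vocab_size = 50_304  # assumed padded GPT-NeoX vocabulary size
for name, d_model, total, non_embed in [
    ("pythia-70m", 512, 70_426_624, 18_915_328),
    ("pythia-1b", 2048, 1_011_781_632, 805_736_448),
]:
    embed = 2 * vocab_size * d_model  # input embedding + output projection
    assert non_embed + embed == total
    print(name, embed)
```

Both rows check out exactly, which supports the assumed vocabulary size.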
---
language: en
tags:
- news
- summary
---
# T5-base fine-tuned for News Summarization 📖✏️🧾
All credits to [Abhishek Kumar Mishra](https://github.com/abhimishra91)
[Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) base fine-tuned on [News Summary](https://www.kaggle.com/sunnysai12345/news-summary) dataset for **summarization** downstream task.
## Details of T5
The **T5** model was presented in [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/pdf/1910.10683.pdf) by *Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu*. Here is the abstract:
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new “Colossal Clean Crawled Corpus”, we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our dataset, pre-trained models, and code.

## Details of the downstream task (Summarization) - Dataset 📚
[News Summary](https://www.kaggle.com/sunnysai12345/news-summary)
The dataset consists of **4,515 examples** and contains Author_name, Headlines, Url of Article, Short text, and Complete Article. I gathered the summarized news from Inshorts and only scraped the news articles from the Hindu, the Indian Times, and the Guardian. The time period ranges from February to August 2017.
## Model fine-tuning 🏋️
The training script is a slightly modified version of [this Colab Notebook](https://github.com/abhimishra91/transformers-tutorials/blob/master/transformers_summarization_wandb.ipynb) created by [Abhishek Kumar Mishra](https://github.com/abhimishra91), so all credits to him!
I also trained the model for more epochs (6).
## Model in Action 🚀
```python
from transformers import AutoModelWithLMHead, AutoTokenizer

# Note: AutoModelWithLMHead is deprecated in newer versions of transformers;
# AutoModelForSeq2SeqLM is the equivalent class for T5.
tokenizer = AutoTokenizer.from_pretrained("mrm8488/t5-base-finetuned-summarize-news")
model = AutoModelWithLMHead.from_pretrained("mrm8488/t5-base-finetuned-summarize-news")

def summarize(text, max_length=150):
    input_ids = tokenizer.encode(text, return_tensors="pt", add_special_tokens=True)
    generated_ids = model.generate(input_ids=input_ids, num_beams=2, max_length=max_length,
                                   repetition_penalty=2.5, length_penalty=1.0, early_stopping=True)
    preds = [tokenizer.decode(g, skip_special_tokens=True, clean_up_tokenization_spaces=True) for g in generated_ids]
    return preds[0]
```
Given the following article from **NYT** (2020/06/09) with title *George Floyd’s death energized a movement. He will be buried in Houston today*:
After the sound and the fury, weeks of demonstrations and anguished calls for racial justice, the man whose death gave rise to an international movement, and whose last words — “I can’t breathe” — have been a rallying cry, will be laid to rest on Tuesday at a private funeral in Houston.George Floyd, who was 46, will then be buried in a grave next to his mother’s.The service, scheduled to begin at 11 a.m. at the Fountain of Praise church, comes after five days of public memorials in Minneapolis, North Carolina and Houston and two weeks after a Minneapolis police officer was caught on video pressing his knee into Mr. Floyd’s neck for nearly nine minutes before Mr. Floyd died. That officer, Derek Chauvin, has been charged with second-degree murder and second-degree manslaughter. His bail was set at $1.25 million in a court appearance on Monday. The outpouring of anger and outrage after Mr. Floyd’s death — and the speed at which protests spread from tense, chaotic demonstrations in the city where he died to an international movement from Rome to Rio de Janeiro — has reflected the depth of frustration borne of years of watching black people die at the hands of the police or vigilantes while calls for change went unmet.
```python
summarize('After the sound and the fury, weeks of demonstrations and anguished calls for racial justice, the man whose death gave rise to an international movement, and whose last words — “I can’t breathe” — have been a rallying cry, will be laid to rest on Tuesday at a private funeral in Houston.George Floyd, who was 46, will then be buried in a grave next to his mother’s.The service, scheduled to begin at 11 a.m. at the Fountain of Praise church, comes after five days of public memorials in Minneapolis, North Carolina and Houston and two weeks after a Minneapolis police officer was caught on video pressing his knee into Mr. Floyd’s neck for nearly nine minutes before Mr. Floyd died. That officer, Derek Chauvin, has been charged with second-degree murder and second-degree manslaughter. His bail was set at $1.25 million in a court appearance on Monday. The outpouring of anger and outrage after Mr. Floyd’s death — and the speed at which protests spread from tense, chaotic demonstrations in the city where he died to an international movement from Rome to Rio de Janeiro — has reflected the depth of frustration borne of years of watching black people die at the hands of the police or vigilantes while calls for change went unmet.', 80)
```
We would obtain:
At a private funeral in Houston. Floyd, who was 46 years old when his death occurred, will be buried next to the grave of his mother. A Minnesota police officer was caught on video pressing his knee into Mr's neck for nearly nine minutes before his death. The officer has been charged with second-degree manslaughter and $1.2 million bail is set at
> Created by [Manuel Romero/@mrm8488](https://twitter.com/mrm8488) | [LinkedIn](https://www.linkedin.com/in/manuel-romero-cs/)
> Made with <span style="color: #e25555;">♥</span> in Spain
-0.036895751953125,
-0.01873779296875,
0.005611419677734375,
0.01352691650390625,
0.00800323486328125,
0.0308685302734375,
0.045928955078125,
-0.068603515625,
-0.0482177734375,
0.0088653564453125,
-0.0472412109375,
-0.0286712646484375,
-0.0279693603515625,
-0.00830078125,
0.012237548828125,
0.038055419921875,
-0.01319122314453125,
0.01363372802734375,
0.0419921875,
-0.00852203369140625,
0.0301055908203125,
-0.01474761962890625,
0.01081085205078125,
-0.0833740234375,
0.015655517578125,
-0.0019474029541015625,
-0.03912353515625,
-0.03668212890625,
-0.01186370849609375,
0.0015411376953125,
-0.004505157470703125,
-0.036102294921875,
0.0474853515625,
-0.046783447265625,
-0.006458282470703125,
-0.041717529296875,
0.01218414306640625,
-0.0008387565612792969,
0.04180908203125,
0.005542755126953125,
0.05853271484375,
0.035186767578125,
-0.06951904296875,
0.0191192626953125,
0.041046142578125,
-0.006336212158203125,
0.03778076171875,
-0.05316162109375,
-0.0018520355224609375,
-0.031036376953125,
0.04754638671875,
-0.06195068359375,
0.0069122314453125,
0.0192718505859375,
-0.039215087890625,
0.022674560546875,
-0.0211639404296875,
-0.0036830902099609375,
-0.035125732421875,
-0.040740966796875,
-0.00644683837890625,
0.049530029296875,
-0.01837158203125,
0.02764892578125,
0.058319091796875,
-0.0177459716796875,
-0.047271728515625,
-0.0638427734375,
0.019744873046875,
-0.013336181640625,
-0.0396728515625,
0.045623779296875,
-0.0107574462890625,
-0.0211181640625,
-0.0079193115234375,
-0.0209503173828125,
-0.0011053085327148438,
0.002826690673828125,
0.028228759765625,
-0.003208160400390625,
0.0030460357666015625,
0.01071929931640625,
-0.001651763916015625,
-0.02459716796875,
-0.00870513916015625,
-0.020782470703125,
0.0341796875,
-0.0284423828125,
0.02197265625,
-0.034912109375,
0.0295257568359375,
0.06597900390625,
-0.0285491943359375,
0.05072021484375,
0.060028076171875,
-0.020660400390625,
0.0046539306640625,
-0.03741455078125,
0.0049591064453125,
-0.0341796875,
0.01116943359375,
-0.0292205810546875,
-0.035400390625,
0.036651611328125,
0.002979278564453125,
0.0277557373046875,
0.0418701171875,
0.060455322265625,
-0.0092010498046875,
0.0518798828125,
0.04095458984375,
-0.0231475830078125,
0.057586669921875,
-0.060577392578125,
0.0011148452758789062,
-0.0623779296875,
-0.017669677734375,
-0.0374755859375,
-0.036041259765625,
-0.04669189453125,
-0.01032257080078125,
0.041412353515625,
0.029632568359375,
-0.0196380615234375,
0.00650787353515625,
-0.04168701171875,
0.019500732421875,
0.0195159912109375,
0.00792694091796875,
0.0267791748046875,
-0.00994110107421875,
0.023284912109375,
-0.01210784912109375,
-0.027984619140625,
-0.04132080078125,
0.08544921875,
0.0242156982421875,
0.055419921875,
0.0225067138671875,
0.048431396484375,
0.0258941650390625,
0.04345703125,
-0.04010009765625,
0.0250091552734375,
-0.01419830322265625,
-0.0272369384765625,
-0.01186370849609375,
-0.0399169921875,
-0.07611083984375,
0.0204010009765625,
-0.00809478759765625,
-0.054656982421875,
0.0181427001953125,
-0.01486968994140625,
-0.035186767578125,
0.01526641845703125,
-0.0645751953125,
0.059478759765625,
-0.0261077880859375,
-0.0194091796875,
0.006351470947265625,
-0.07025146484375,
0.0254669189453125,
0.0165252685546875,
0.007598876953125,
-0.0016851425170898438,
-0.0209808349609375,
0.061309814453125,
-0.038055419921875,
0.037872314453125,
-0.0148773193359375,
0.007450103759765625,
0.003726959228515625,
-0.0188446044921875,
0.05413818359375,
-0.0003159046173095703,
0.0124053955078125,
-0.01483917236328125,
-0.0038242340087890625,
-0.0271759033203125,
-0.02423095703125,
0.038360595703125,
-0.059844970703125,
-0.0119781494140625,
-0.033477783203125,
-0.0271759033203125,
-0.006885528564453125,
0.003139495849609375,
0.048675537109375,
0.0299224853515625,
0.005340576171875,
0.0197296142578125,
0.0533447265625,
0.004669189453125,
0.061126708984375,
0.04107666015625,
0.00026226043701171875,
-0.040771484375,
0.041900634765625,
0.0294189453125,
0.0020389556884765625,
0.03887939453125,
0.01812744140625,
-0.046295166015625,
-0.045318603515625,
-0.044281005859375,
0.0307159423828125,
-0.042022705078125,
-0.0064849853515625,
-0.0908203125,
0.003971099853515625,
-0.046722412109375,
-0.022705078125,
-0.01526641845703125,
-0.0173797607421875,
-0.037933349609375,
-0.0173797607421875,
0.032867431640625,
0.016845703125,
-0.002655029296875,
0.0185699462890625,
-0.07232666015625,
0.055938720703125,
0.01517486572265625,
0.035736083984375,
-0.023468017578125,
-0.060272216796875,
-0.01519775390625,
0.0016536712646484375,
-0.041839599609375,
-0.07952880859375,
0.0279693603515625,
0.0252838134765625,
0.02801513671875,
0.03607177734375,
0.0158538818359375,
0.07696533203125,
-0.045867919921875,
0.03948974609375,
0.0248565673828125,
-0.07177734375,
0.04541015625,
-0.021942138671875,
0.015594482421875,
0.050140380859375,
0.042205810546875,
-0.08026123046875,
-0.039886474609375,
-0.068115234375,
-0.050048828125,
0.051239013671875,
-0.01091766357421875,
0.0101165771484375,
0.0192108154296875,
0.02777099609375,
0.00627899169921875,
0.0027313232421875,
-0.08160400390625,
-0.046417236328125,
-0.037353515625,
-0.007541656494140625,
-0.01526641845703125,
-0.0177459716796875,
-0.00812530517578125,
-0.026641845703125,
0.050994873046875,
-0.01177978515625,
0.0298004150390625,
0.033843994140625,
-0.03118896484375,
-0.0221099853515625,
0.01084136962890625,
0.05810546875,
0.06292724609375,
-0.02044677734375,
-0.002475738525390625,
0.01476287841796875,
-0.03631591796875,
0.036529541015625,
0.0498046875,
-0.0187835693359375,
0.01016998291015625,
0.021392822265625,
0.0609130859375,
0.0297393798828125,
-0.033905029296875,
0.042510986328125,
-0.0082244873046875,
-0.0273895263671875,
-0.0289306640625,
0.0173797607421875,
0.00255584716796875,
0.0253448486328125,
0.0408935546875,
0.006084442138671875,
0.01470947265625,
-0.06817626953125,
0.019317626953125,
0.01406097412109375,
-0.0263519287109375,
-0.029266357421875,
0.0701904296875,
0.0157928466796875,
-0.0023479461669921875,
-0.008392333984375,
-0.0211639404296875,
-0.0626220703125,
0.0233001708984375,
0.030029296875,
0.06048583984375,
-0.0028133392333984375,
0.012969970703125,
0.034759521484375,
0.01073455810546875,
-0.0206298828125,
0.009796142578125,
-0.03118896484375,
-0.03594970703125,
-0.030029296875,
-0.05560302734375,
-0.03228759765625,
0.00571441650390625,
-0.0303802490234375,
0.004688262939453125,
-0.029266357421875,
-0.0023937225341796875,
-0.0008573532104492188,
0.04705810546875,
-0.0260009765625,
0.0256500244140625,
0.005252838134765625,
0.07403564453125,
-0.058746337890625,
0.03460693359375,
0.050994873046875,
-0.04254150390625,
-0.043792724609375,
0.00774383544921875,
-0.02105712890625,
-0.045867919921875,
0.033111572265625,
0.00916290283203125,
0.003780364990234375,
0.01200103759765625,
-0.0193328857421875,
-0.052947998046875,
0.07525634765625,
0.041168212890625,
-0.0256195068359375,
-0.0009417533874511719,
-0.0264129638671875,
0.0550537109375,
0.0011968612670898438,
0.028228759765625,
0.04180908203125,
0.053924560546875,
-0.01294708251953125,
-0.07611083984375,
0.0007867813110351562,
-0.0093536376953125,
-0.0178985595703125,
0.0257568359375,
-0.09283447265625,
0.0643310546875,
-0.028076171875,
-0.031707763671875,
-0.0024261474609375,
0.052642822265625,
0.00008404254913330078,
0.0240936279296875,
0.033935546875,
0.056304931640625,
0.06256103515625,
-0.017669677734375,
0.11376953125,
-0.01519012451171875,
0.047271728515625,
0.044586181640625,
0.0244140625,
0.039276123046875,
0.030731201171875,
-0.06427001953125,
0.0187530517578125,
0.057220458984375,
0.0081787109375,
0.052734375,
0.01406097412109375,
-0.0264129638671875,
0.02447509765625,
-0.0041961669921875,
-0.05487060546875,
0.0276031494140625,
-0.005893707275390625,
-0.03558349609375,
-0.0274200439453125,
-0.0054779052734375,
0.014495849609375,
-0.00708770751953125,
-0.04034423828125,
0.04754638671875,
0.032012939453125,
-0.05914306640625,
0.06097412109375,
0.0157012939453125,
0.05419921875,
-0.05926513671875,
0.0231781005859375,
-0.0252838134765625,
0.01511383056640625,
-0.0296783447265625,
-0.031646728515625,
0.032257080078125,
-0.0003459453582763672,
-0.0322265625,
-0.049957275390625,
0.0380859375,
-0.03607177734375,
-0.004985809326171875,
0.0028591156005859375,
0.0202789306640625,
0.02490234375,
-0.01226806640625,
-0.04681396484375,
-0.008514404296875,
0.00975799560546875,
-0.034881591796875,
0.0280303955078125,
0.045013427734375,
0.0156402587890625,
0.0479736328125,
0.05078125,
0.02264404296875,
0.0167388916015625,
-0.01165008544921875,
0.039825439453125,
-0.054473876953125,
-0.055511474609375,
-0.056884765625,
0.036285400390625,
0.002918243408203125,
-0.0496826171875,
0.042449951171875,
0.05194091796875,
0.0579833984375,
-0.0005340576171875,
0.049530029296875,
-0.019744873046875,
0.051422119140625,
-0.049102783203125,
0.058868408203125,
-0.0526123046875,
-0.017669677734375,
-0.0059356689453125,
-0.046600341796875,
-0.01068115234375,
0.04833984375,
-0.0291595458984375,
0.01171112060546875,
0.0479736328125,
0.045135498046875,
-0.0044403076171875,
-0.00878143310546875,
0.0124359130859375,
0.0174407958984375,
0.019317626953125,
0.051788330078125,
0.038421630859375,
-0.058624267578125,
0.06268310546875,
-0.02667236328125,
0.00510406494140625,
-0.0423583984375,
-0.055084228515625,
-0.02764892578125,
-0.03948974609375,
-0.0107421875,
-0.01119232177734375,
-0.003078460693359375,
0.05487060546875,
0.06695556640625,
-0.057525634765625,
-0.0037097930908203125,
-0.0130462646484375,
0.00617218017578125,
-0.0282440185546875,
-0.0178375244140625,
0.0430908203125,
-0.01154327392578125,
-0.08428955078125,
0.010772705078125,
0.020172119140625,
0.0301666259765625,
0.00925445556640625,
-0.02020263671875,
-0.004596710205078125,
-0.048187255859375,
0.052978515625,
0.0552978515625,
-0.05078125,
-0.010498046875,
-0.019775390625,
-0.0343017578125,
0.01493072509765625,
0.058074951171875,
-0.0673828125,
0.025665283203125,
0.041168212890625,
0.05938720703125,
0.06005859375,
0.019775390625,
0.0372314453125,
-0.0577392578125,
0.01385498046875,
0.0146331787109375,
0.058563232421875,
0.01525115966796875,
-0.0175628662109375,
0.065185546875,
0.05035400390625,
-0.04034423828125,
-0.0748291015625,
-0.00818634033203125,
-0.09454345703125,
-0.007366180419921875,
0.05511474609375,
0.009613037109375,
0.007244110107421875,
-0.01549530029296875,
-0.02239990234375,
0.0301361083984375,
-0.04266357421875,
0.06256103515625,
0.05242919921875,
-0.01287841796875,
-0.045867919921875,
-0.0517578125,
0.037933349609375,
0.0156402587890625,
-0.068603515625,
-0.00726318359375,
0.03460693359375,
0.01406097412109375,
0.020172119140625,
0.06549072265625,
-0.0026454925537109375,
0.042388916015625,
-0.0020999908447265625,
-0.002140045166015625,
-0.020660400390625,
-0.004241943359375,
-0.033721923828125,
0.0391845703125,
-0.032562255859375,
-0.025390625
]
] |
saattrupdan/wav2vec2-xls-r-300m-ftspeech | 2023-09-11T13:27:55.000Z | [
"transformers",
"pytorch",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"da",
"dataset:ftspeech",
"license:other",
"model-index",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | saattrupdan | null | null | saattrupdan/wav2vec2-xls-r-300m-ftspeech | 0 | 13,340 | transformers | 2022-03-04T14:53:05 | ---
language:
- da
license: other
datasets:
- ftspeech
metrics:
- wer
tasks:
- automatic-speech-recognition
base_model: facebook/wav2vec2-xls-r-300m
model-index:
- name: wav2vec2-xls-r-300m-ftspeech
results:
- task:
type: automatic-speech-recognition
dataset:
name: Danish Common Voice 8.0
type: mozilla-foundation/common_voice_8_0
args: da
metrics:
- type: wer
value: 17.91
- task:
type: automatic-speech-recognition
dataset:
name: Alvenir ASR test dataset
type: Alvenir/alvenir_asr_da_eval
metrics:
- type: wer
value: 13.84
---
# XLS-R-300m-FTSpeech
## Model description
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the [FTSpeech dataset](https://ftspeech.github.io/), a dataset of 1,800 hours of transcribed speeches from the Danish Parliament.
## Performance
The model achieves the following WER scores (lower is better):
| **Dataset** | **WER without LM** | **WER with 5-gram LM** |
| :---: | ---: | ---: |
| [Danish part of Common Voice 8.0](https://huggingface.co/datasets/mozilla-foundation/common_voice_8_0/viewer/da/train) | 20.48 | 17.91 |
| [Alvenir test set](https://huggingface.co/datasets/Alvenir/alvenir_asr_da_eval) | 15.46 | 13.84 |
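The scores above are word error rates (WER), i.e. the word-level edit distance between the model transcript and the reference, divided by the reference length. As a minimal illustrative sketch (not the evaluation code used for this model), WER can be computed in pure Python like this:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent: word-level Levenshtein distance
    between hypothesis and reference, divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard Levenshtein dynamic programming over words.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution
        prev = curr
    return 100.0 * prev[-1] / len(ref)
```

For example, `wer("det er en test", "det var en test")` is one substitution over four reference words, i.e. 25.0.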
## License
Use of this model must adhere to [this license from the Danish Parliament](https://www.ft.dk/da/aktuelt/tv-fra-folketinget/deling-og-rettigheder). | 1,498 | [
[
-0.033935546875,
-0.041015625,
0.00713348388671875,
0.0292816162109375,
-0.0288238525390625,
-0.00550079345703125,
-0.0287628173828125,
-0.0202484130859375,
0.00803375244140625,
0.044342041015625,
-0.0455322265625,
-0.037261962890625,
-0.045562744140625,
0.00955963134765625,
-0.029449462890625,
0.04791259765625,
0.010498046875,
0.038055419921875,
-0.01290130615234375,
-0.02294921875,
-0.0145111083984375,
-0.039093017578125,
-0.04742431640625,
-0.0246429443359375,
0.03216552734375,
0.04302978515625,
0.03778076171875,
0.039154052734375,
0.01393890380859375,
0.018707275390625,
-0.04351806640625,
-0.008392333984375,
-0.054168701171875,
-0.01495361328125,
-0.00992584228515625,
-0.0120391845703125,
-0.052764892578125,
0.004741668701171875,
0.05596923828125,
0.036956787109375,
-0.051605224609375,
0.0250244140625,
0.0171051025390625,
0.048919677734375,
-0.005115509033203125,
0.0199737548828125,
-0.0288238525390625,
-0.020965576171875,
-0.004352569580078125,
0.01873779296875,
-0.0271148681640625,
-0.00893402099609375,
0.002178192138671875,
-0.03253173828125,
0.016021728515625,
0.00435638427734375,
0.06719970703125,
0.009796142578125,
-0.0260162353515625,
-0.00666046142578125,
-0.050201416015625,
0.05816650390625,
-0.06182861328125,
0.06439208984375,
0.0386962890625,
0.037353515625,
-0.0018873214721679688,
-0.061309814453125,
-0.040771484375,
-0.0164031982421875,
0.00994873046875,
0.01067352294921875,
-0.0340576171875,
0.0034160614013671875,
0.0281219482421875,
0.0257110595703125,
-0.05023193359375,
-0.00572967529296875,
-0.0697021484375,
-0.01239776611328125,
0.064697265625,
0.0018529891967773438,
0.007099151611328125,
-0.01258087158203125,
-0.01904296875,
-0.0296630859375,
-0.015106201171875,
0.016265869140625,
0.0280914306640625,
0.046356201171875,
-0.044769287109375,
0.047332763671875,
-0.02374267578125,
0.03997802734375,
0.01873779296875,
0.00470733642578125,
0.03582763671875,
-0.01033782958984375,
-0.006435394287109375,
0.01172637939453125,
0.0604248046875,
0.0262603759765625,
0.02294921875,
0.0225830078125,
-0.030364990234375,
0.00001537799835205078,
0.0172119140625,
-0.06683349609375,
-0.0208740234375,
0.00901031494140625,
-0.050506591796875,
0.004711151123046875,
0.00829315185546875,
-0.029510498046875,
0.007785797119140625,
-0.05169677734375,
0.042999267578125,
-0.033843994140625,
-0.036285400390625,
0.0161895751953125,
0.0004992485046386719,
0.01297760009765625,
0.01042938232421875,
-0.03485107421875,
0.045166015625,
0.03839111328125,
0.055572509765625,
-0.0048828125,
-0.0010395050048828125,
-0.04498291015625,
-0.0211181640625,
-0.0236968994140625,
0.04827880859375,
-0.0182647705078125,
-0.055755615234375,
-0.0163116455078125,
0.01226806640625,
-0.003910064697265625,
-0.03985595703125,
0.056182861328125,
-0.017425537109375,
0.03021240234375,
-0.0282440185546875,
-0.06561279296875,
-0.01885986328125,
-0.01480865478515625,
-0.06597900390625,
0.08648681640625,
0.0217437744140625,
-0.03076171875,
0.0236968994140625,
-0.05328369140625,
-0.0174560546875,
-0.00345611572265625,
-0.0053253173828125,
-0.03704833984375,
0.00563812255859375,
-0.004119873046875,
0.0406494140625,
-0.01192474365234375,
0.0185089111328125,
-0.0258026123046875,
-0.0277557373046875,
-0.0003826618194580078,
-0.02362060546875,
0.048583984375,
0.03363037109375,
-0.01007080078125,
0.01183319091796875,
-0.08343505859375,
0.006622314453125,
-0.001537322998046875,
-0.042724609375,
0.00981903076171875,
-0.0007991790771484375,
0.054840087890625,
0.03131103515625,
0.0104217529296875,
-0.052093505859375,
-0.01125335693359375,
-0.029541015625,
0.004558563232421875,
0.04241943359375,
-0.01525115966796875,
0.007049560546875,
-0.03680419921875,
0.048583984375,
-0.01187896728515625,
0.0141448974609375,
0.032470703125,
-0.0294952392578125,
-0.0294952392578125,
-0.040191650390625,
0.034027099609375,
0.04302978515625,
-0.01580810546875,
0.039581298828125,
-0.022613525390625,
-0.04925537109375,
-0.043731689453125,
0.005321502685546875,
0.029205322265625,
0.03131103515625,
0.0277557373046875,
-0.0194549560546875,
-0.041595458984375,
-0.09051513671875,
-0.0230865478515625,
-0.0026035308837890625,
-0.01381683349609375,
0.043792724609375,
0.029541015625,
-0.0037555694580078125,
0.057281494140625,
-0.01546478271484375,
-0.00475311279296875,
-0.0293426513671875,
0.0089569091796875,
0.035430908203125,
0.0243682861328125,
0.040283203125,
-0.070068359375,
-0.055572509765625,
-0.0165252685546875,
-0.01666259765625,
-0.0197906494140625,
0.018218994140625,
0.01412200927734375,
0.020416259765625,
0.0345458984375,
-0.0294952392578125,
0.042816162109375,
0.051971435546875,
-0.04876708984375,
0.04229736328125,
0.00818634033203125,
0.0146942138671875,
-0.099609375,
0.019378662109375,
-0.001750946044921875,
-0.03485107421875,
-0.030242919921875,
-0.0223236083984375,
0.005096435546875,
-0.0029163360595703125,
-0.05804443359375,
0.050140380859375,
-0.0261688232421875,
0.004425048828125,
-0.020233154296875,
-0.00913238525390625,
-0.007293701171875,
0.0269775390625,
-0.0012426376342773438,
0.06939697265625,
0.034698486328125,
-0.03466796875,
0.0206298828125,
0.0479736328125,
-0.050262451171875,
0.04241943359375,
-0.0660400390625,
0.00943756103515625,
0.007717132568359375,
0.0203399658203125,
-0.0653076171875,
-0.023651123046875,
-0.0019006729125976562,
-0.049560546875,
0.0352783203125,
0.005218505859375,
-0.029022216796875,
-0.0299835205078125,
-0.00624847412109375,
0.0178680419921875,
0.04681396484375,
-0.0335693359375,
0.045440673828125,
0.0228729248046875,
-0.03436279296875,
-0.0300140380859375,
-0.037933349609375,
-0.000659942626953125,
-0.00931549072265625,
-0.048004150390625,
0.014984130859375,
-0.01012420654296875,
-0.0156707763671875,
0.00556182861328125,
-0.0255279541015625,
-0.0029048919677734375,
-0.01763916015625,
0.029205322265625,
0.00794219970703125,
-0.018890380859375,
-0.01934814453125,
-0.002223968505859375,
0.0004711151123046875,
-0.0005397796630859375,
0.0205535888671875,
0.037811279296875,
-0.00530242919921875,
-0.0117950439453125,
-0.060699462890625,
0.047393798828125,
0.0445556640625,
0.00988006591796875,
0.0599365234375,
0.044647216796875,
-0.034271240234375,
-0.0156402587890625,
-0.049163818359375,
-0.00981903076171875,
-0.03521728515625,
0.041290283203125,
-0.03302001953125,
-0.07086181640625,
0.042938232421875,
0.0070953369140625,
-0.01273345947265625,
0.0538330078125,
0.03436279296875,
0.0011930465698242188,
0.080810546875,
0.035186767578125,
-0.0141448974609375,
0.029876708984375,
-0.0151824951171875,
-0.007152557373046875,
-0.0472412109375,
-0.0207977294921875,
-0.047607421875,
-0.01149749755859375,
-0.04071044921875,
-0.0222320556640625,
0.0186004638671875,
0.020172119140625,
-0.02801513671875,
0.032470703125,
-0.033935546875,
0.022125244140625,
0.036956787109375,
0.004604339599609375,
-0.01102447509765625,
0.01270294189453125,
-0.032073974609375,
-0.001903533935546875,
-0.046173095703125,
-0.049591064453125,
0.07098388671875,
0.033538818359375,
0.051605224609375,
0.01311492919921875,
0.031890869140625,
0.02490234375,
0.00876617431640625,
-0.04193115234375,
0.041107177734375,
-0.03826904296875,
-0.052764892578125,
-0.01458740234375,
-0.033111572265625,
-0.0499267578125,
0.0026874542236328125,
0.0005021095275878906,
-0.061859130859375,
0.0038604736328125,
0.016448974609375,
-0.02294921875,
0.0138397216796875,
-0.032806396484375,
0.050750732421875,
-0.0003299713134765625,
-0.0101776123046875,
-0.027740478515625,
-0.0479736328125,
-0.0032367706298828125,
0.004467010498046875,
0.019500732421875,
-0.01493072509765625,
0.0254974365234375,
0.07611083984375,
-0.0311431884765625,
0.049652099609375,
-0.032684326171875,
-0.01032257080078125,
0.0271759033203125,
-0.015777587890625,
0.03668212890625,
-0.018096923828125,
-0.0272216796875,
0.033050537109375,
0.0180206298828125,
-0.0141143798828125,
-0.00817108154296875,
0.047576904296875,
-0.061767578125,
-0.02099609375,
-0.0242156982421875,
-0.0280303955078125,
-0.017578125,
0.00571441650390625,
0.033721923828125,
0.03558349609375,
-0.046478271484375,
0.04071044921875,
0.035797119140625,
-0.024322509765625,
0.0153656005859375,
0.057281494140625,
-0.017913818359375,
-0.058502197265625,
0.0467529296875,
0.01287078857421875,
0.0276336669921875,
0.00664520263671875,
0.01099395751953125,
-0.0546875,
-0.013916015625,
-0.01432037353515625,
0.020599365234375,
-0.03704833984375,
-0.0013494491577148438,
-0.049041748046875,
-0.029754638671875,
-0.04071044921875,
0.01413726806640625,
-0.0439453125,
-0.042022705078125,
-0.046417236328125,
-0.023590087890625,
0.0482177734375,
0.063232421875,
-0.0484619140625,
0.0290985107421875,
-0.05230712890625,
0.04718017578125,
0.016265869140625,
0.04693603515625,
-0.02142333984375,
-0.073974609375,
-0.017303466796875,
0.01727294921875,
-0.01262664794921875,
-0.05706787109375,
0.03363037109375,
0.0203857421875,
0.036956787109375,
0.018035888671875,
-0.0107574462890625,
0.043426513671875,
-0.05230712890625,
0.07232666015625,
0.0139312744140625,
-0.06640625,
0.0198211669921875,
-0.045562744140625,
0.0161895751953125,
0.0447998046875,
0.00647735595703125,
-0.043060302734375,
-0.00738525390625,
-0.0430908203125,
-0.08056640625,
0.07452392578125,
0.0214691162109375,
0.01143646240234375,
0.0089111328125,
0.02972412109375,
0.0087127685546875,
0.01326751708984375,
-0.033599853515625,
-0.0263671875,
-0.017974853515625,
-0.02362060546875,
-0.02850341796875,
-0.0386962890625,
0.0015516281127929688,
-0.044647216796875,
0.07696533203125,
0.01352691650390625,
0.020660400390625,
0.013458251953125,
0.016845703125,
-0.0301513671875,
0.0250701904296875,
0.06396484375,
0.039520263671875,
-0.03594970703125,
-0.020538330078125,
0.0135650634765625,
-0.041717529296875,
-0.0026397705078125,
0.0246734619140625,
0.0212554931640625,
0.0129852294921875,
0.01270294189453125,
0.08685302734375,
0.0119476318359375,
-0.0396728515625,
0.036041259765625,
-0.01763916015625,
-0.045166015625,
-0.044189453125,
0.0086669921875,
0.036041259765625,
0.0229644775390625,
0.029296875,
-0.0113525390625,
0.006656646728515625,
-0.004505157470703125,
0.042510986328125,
0.01983642578125,
-0.049468994140625,
-0.0297698974609375,
0.07452392578125,
-0.0006165504455566406,
-0.036834716796875,
0.026336669921875,
-0.00768280029296875,
-0.0288238525390625,
0.042724609375,
0.038330078125,
0.06494140625,
-0.046295166015625,
0.0015249252319335938,
0.03369140625,
0.00847625732421875,
-0.03192138671875,
0.08013916015625,
0.005146026611328125,
-0.0328369140625,
-0.0311431884765625,
-0.07403564453125,
-0.0170745849609375,
0.03582763671875,
-0.0941162109375,
0.0246429443359375,
-0.009552001953125,
-0.016632080078125,
0.014678955078125,
-0.0019741058349609375,
-0.0599365234375,
0.0292816162109375,
0.01849365234375,
0.09881591796875,
-0.0633544921875,
0.060699462890625,
0.04083251953125,
-0.0009407997131347656,
-0.0916748046875,
-0.0216217041015625,
0.01236724853515625,
-0.047607421875,
0.0307159423828125,
0.0151824951171875,
-0.0106201171875,
0.0008988380432128906,
-0.06207275390625,
-0.0633544921875,
0.068115234375,
0.0240936279296875,
-0.09417724609375,
0.0355224609375,
0.00923919677734375,
0.033447265625,
-0.0255889892578125,
-0.0104522705078125,
0.0265350341796875,
0.033905029296875,
0.0117950439453125,
-0.10906982421875,
-0.0147705078125,
-0.031982421875,
-0.01409912109375,
-0.004810333251953125,
-0.06591796875,
0.075927734375,
0.003627777099609375,
-0.0053863525390625,
0.0222015380859375,
0.0458984375,
0.0196990966796875,
0.0244598388671875,
0.056396484375,
0.05242919921875,
0.03826904296875,
0.01151275634765625,
0.06689453125,
-0.019012451171875,
0.033782958984375,
0.0965576171875,
-0.03741455078125,
0.07452392578125,
0.029083251953125,
-0.0181121826171875,
0.024383544921875,
0.0394287109375,
-0.0030803680419921875,
0.04254150390625,
0.0225830078125,
-0.0139923095703125,
-0.0273284912109375,
-0.0013437271118164062,
-0.03363037109375,
0.0440673828125,
0.0267181396484375,
-0.01629638671875,
0.00826263427734375,
-0.010009765625,
-0.00290679931640625,
0.01496124267578125,
-0.0160675048828125,
0.06671142578125,
0.0196533203125,
-0.01959228515625,
0.05169677734375,
-0.0028324127197265625,
0.032684326171875,
-0.036468505859375,
-0.0059356689453125,
0.006595611572265625,
0.01212310791015625,
0.0026340484619140625,
-0.039764404296875,
0.020172119140625,
0.01039886474609375,
-0.0280303955078125,
-0.013458251953125,
0.0340576171875,
-0.04229736328125,
-0.05596923828125,
0.040191650390625,
0.030609130859375,
0.0196990966796875,
-0.0034160614013671875,
-0.067138671875,
0.0113067626953125,
0.01363372802734375,
-0.046295166015625,
-0.0010213851928710938,
0.01184844970703125,
0.0086669921875,
0.034820556640625,
0.041717529296875,
0.0227508544921875,
-0.01371002197265625,
0.020721435546875,
0.0562744140625,
-0.056671142578125,
-0.050506591796875,
-0.046051025390625,
0.04376220703125,
-0.01280975341796875,
-0.0166015625,
0.043304443359375,
0.06756591796875,
0.046966552734375,
-0.001758575439453125,
0.0599365234375,
0.0015459060668945312,
0.0860595703125,
-0.0221405029296875,
0.079833984375,
-0.065673828125,
-0.005550384521484375,
-0.0203704833984375,
-0.06304931640625,
0.0008606910705566406,
0.056396484375,
0.01219940185546875,
0.006351470947265625,
0.0190887451171875,
0.068115234375,
-0.0264739990234375,
0.0110626220703125,
0.031036376953125,
0.045684814453125,
0.004085540771484375,
-0.0016002655029296875,
0.04278564453125,
-0.031982421875,
0.02783203125,
-0.01531219482421875,
-0.0131072998046875,
-0.0047607421875,
-0.0546875,
-0.0694580078125,
-0.0472412109375,
-0.0259857177734375,
-0.039581298828125,
-0.002193450927734375,
0.08209228515625,
0.059844970703125,
-0.0814208984375,
-0.0362548828125,
0.0247802734375,
-0.035064697265625,
-0.0157623291015625,
-0.01629638671875,
0.036346435546875,
0.03125,
-0.038787841796875,
0.0506591796875,
-0.005588531494140625,
0.006717681884765625,
0.00865936279296875,
-0.0202789306640625,
-0.0204620361328125,
0.034088134765625,
0.026519775390625,
0.014434814453125,
-0.05517578125,
-0.0310821533203125,
0.00653839111328125,
-0.00299835205078125,
0.0010881423950195312,
0.0236358642578125,
-0.0213623046875,
0.0169525146484375,
0.0207061767578125,
-0.00493621826171875,
0.04046630859375,
0.01430511474609375,
0.037261962890625,
-0.060821533203125,
0.01303863525390625,
0.0151824951171875,
0.027923583984375,
0.0265960693359375,
-0.0177154541015625,
0.0294952392578125,
0.0034847259521484375,
-0.0450439453125,
-0.07012939453125,
-0.006717681884765625,
-0.1019287109375,
-0.011444091796875,
0.1038818359375,
0.01114654541015625,
-0.01363372802734375,
0.007694244384765625,
-0.030029296875,
0.0399169921875,
-0.057281494140625,
0.038665771484375,
0.0631103515625,
0.0023021697998046875,
0.005924224853515625,
-0.051513671875,
0.0423583984375,
0.0172882080078125,
-0.0240478515625,
0.00099945068359375,
0.0438232421875,
0.0570068359375,
-0.0123138427734375,
0.050537109375,
-0.0172271728515625,
0.03155517578125,
-0.01000213623046875,
0.0278778076171875,
-0.0031299591064453125,
-0.015960693359375,
-0.0340576171875,
0.006313323974609375,
-0.00991058349609375,
-0.0219879150390625
]
] |
cardiffnlp/twitter-roberta-base-dec2021-tweet-topic-multi-all | 2022-09-30T00:31:18.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"dataset:cardiffnlp/tweet_topic_multi",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | cardiffnlp | null | null | cardiffnlp/twitter-roberta-base-dec2021-tweet-topic-multi-all | 4 | 13,304 | transformers | 2022-09-29T17:01:29 | ---
datasets:
- cardiffnlp/tweet_topic_multi
metrics:
- f1
- accuracy
model-index:
- name: cardiffnlp/twitter-roberta-base-dec2021-tweet-topic-multi-all
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: cardiffnlp/tweet_topic_multi
type: cardiffnlp/tweet_topic_multi
args: cardiffnlp/tweet_topic_multi
split: test_2021
metrics:
- name: F1
type: f1
value: 0.7647668393782383
- name: F1 (macro)
type: f1_macro
value: 0.6187022581213811
- name: Accuracy
type: accuracy
value: 0.5485407980941036
pipeline_tag: text-classification
widget:
- text: "I'm sure the {@Tampa Bay Lightning@} would’ve rather faced the Flyers but man does their experience versus the Blue Jackets this year and last help them a lot versus this Islanders team. Another meat grinder upcoming for the good guys"
example_title: "Example 1"
- text: "Love to take night time bike rides at the jersey shore. Seaside Heights boardwalk. Beautiful weather. Wishing everyone a safe Labor Day weekend in the US."
example_title: "Example 2"
---
# cardiffnlp/twitter-roberta-base-dec2021-tweet-topic-multi-all
This model is a fine-tuned version of [cardiffnlp/twitter-roberta-base-dec2021](https://huggingface.co/cardiffnlp/twitter-roberta-base-dec2021) on the [tweet_topic_multi](https://huggingface.co/datasets/cardiffnlp/tweet_topic_multi) dataset. The model is fine-tuned on the `train_all` split and validated on the `test_2021` split of tweet_topic.
The fine-tuning script can be found [here](https://huggingface.co/datasets/cardiffnlp/tweet_topic_multi/blob/main/lm_finetuning.py). The model achieves the following results on the `test_2021` split:
- F1 (micro): 0.7647668393782383
- F1 (macro): 0.6187022581213811
- Accuracy: 0.5485407980941036
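The micro and macro F1 scores above can diverge substantially when some topic labels are rare: micro F1 pools true/false positives across all labels, while macro F1 averages per-label F1 scores, so poorly predicted rare labels weigh heavily. A minimal pure-Python sketch (with hypothetical example data, not this model's predictions) of the difference:

```python
def f1(tp, fp, fn):
    """F1 from raw counts; returns 0.0 when the score is undefined."""
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

def micro_macro_f1(y_true, y_pred):
    """y_true / y_pred: lists of binary label vectors, one per example."""
    n_labels = len(y_true[0])
    counts = [[0, 0, 0] for _ in range(n_labels)]  # tp, fp, fn per label
    for t, p in zip(y_true, y_pred):
        for j in range(n_labels):
            if t[j] and p[j]:
                counts[j][0] += 1   # true positive
            elif p[j]:
                counts[j][1] += 1   # false positive
            elif t[j]:
                counts[j][2] += 1   # false negative
    micro = f1(*[sum(c[k] for c in counts) for k in range(3)])
    macro = sum(f1(*c) for c in counts) / n_labels
    return micro, macro

# Label 1 is rare and never predicted, so macro F1 drops to 0.5
# while micro F1 stays high.
y_true = [[1, 0], [1, 1], [1, 0], [1, 0]]
y_pred = [[1, 0], [1, 0], [1, 0], [1, 0]]
micro, macro = micro_macro_f1(y_true, y_pred)
```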
### Usage
```python
import math
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
def sigmoid(x):
    return 1 / (1 + math.exp(-x))
tokenizer = AutoTokenizer.from_pretrained("cardiffnlp/twitter-roberta-base-dec2021-tweet-topic-multi-all")
model = AutoModelForSequenceClassification.from_pretrained("cardiffnlp/twitter-roberta-base-dec2021-tweet-topic-multi-all", problem_type="multi_label_classification")
model.eval()
class_mapping = model.config.id2label
with torch.no_grad():
text = #NewVideo Cray Dollas- Water- Ft. Charlie Rose- (Official Music Video)- {{URL}} via {@YouTube@} #watchandlearn {{USERNAME}}
tokens = tokenizer(text, return_tensors='pt')
output = model(**tokens)
flags = [sigmoid(s) > 0.5 for s in output[0][0].detach().tolist()]
topic = [class_mapping[n] for n, i in enumerate(flags) if i]
print(topic)
```
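The sigmoid-then-threshold step above is the core of multi-label decoding: each class gets an independent probability, and every class above the threshold is kept. A minimal, self-contained sketch of that step (the 0.5 threshold matches the snippet above; the toy `id2label` mapping here is made up for illustration, not the model's real label set):

```python
import torch

def decode_multilabel(logits: torch.Tensor, id2label: dict, threshold: float = 0.5):
    """Map raw logits to topic labels via sigmoid + threshold (multi-label)."""
    probs = torch.sigmoid(logits)        # independent probability per class
    flags = (probs > threshold).tolist()
    return [id2label[i] for i, keep in enumerate(flags) if keep]

# toy check with three hypothetical classes
id2label = {0: "sports", 1: "music", 2: "news"}
print(decode_multilabel(torch.tensor([2.0, -3.0, 0.1]), id2label))  # ['sports', 'news']
```

Unlike single-label softmax classification, any number of topics (including zero) can be returned for one tweet.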
### Reference
```
@inproceedings{dimosthenis-etal-2022-twitter,
title = "{T}witter {T}opic {C}lassification",
author = "Antypas, Dimosthenis and
Ushio, Asahi and
Camacho-Collados, Jose and
Neves, Leonardo and
Silva, Vitor and
Barbieri, Francesco",
booktitle = "Proceedings of the 29th International Conference on Computational Linguistics",
month = oct,
year = "2022",
address = "Gyeongju, Republic of Korea",
publisher = "International Committee on Computational Linguistics"
}
```
| 3,237 | [
[
-0.0284423828125,
-0.04266357421875,
0.007228851318359375,
0.020965576171875,
-0.0259552001953125,
0.00749969482421875,
-0.0247039794921875,
-0.01200103759765625,
0.031707763671875,
0.0171356201171875,
-0.050994873046875,
-0.052337646484375,
-0.056396484375,
-0.006195068359375,
-0.0288848876953125,
0.07830810546875,
0.0011148452758789062,
-0.002109527587890625,
0.0197601318359375,
-0.02978515625,
-0.01215362548828125,
-0.0281982421875,
-0.033203125,
-0.020111083984375,
0.0277252197265625,
0.0262451171875,
0.0186920166015625,
0.0401611328125,
0.0399169921875,
0.028472900390625,
-0.00251007080078125,
0.018890380859375,
-0.035125732421875,
-0.01007843017578125,
0.0010576248168945312,
-0.006916046142578125,
-0.03656005859375,
0.0176849365234375,
0.047332763671875,
0.0239410400390625,
0.01031494140625,
0.0263824462890625,
0.0045013427734375,
0.027008056640625,
-0.0273284912109375,
0.00875091552734375,
-0.056304931640625,
-0.0215911865234375,
-0.00920867919921875,
-0.0245208740234375,
-0.0166015625,
-0.0236358642578125,
0.0096435546875,
-0.02520751953125,
0.039947509765625,
-0.009246826171875,
0.09515380859375,
0.0224151611328125,
-0.02349853515625,
-0.016937255859375,
-0.032684326171875,
0.06671142578125,
-0.061126708984375,
0.016082763671875,
0.0136260986328125,
-0.00033164024353027344,
0.01020050048828125,
-0.0576171875,
-0.0308990478515625,
-0.006031036376953125,
-0.004425048828125,
0.00693511962890625,
0.0017747879028320312,
-0.005222320556640625,
0.0222625732421875,
0.0193023681640625,
-0.04669189453125,
-0.015045166015625,
-0.05059814453125,
-0.0106964111328125,
0.047149658203125,
-0.00553131103515625,
0.01215362548828125,
-0.02642822265625,
-0.021881103515625,
-0.00267791748046875,
-0.0302581787109375,
-0.007434844970703125,
0.0177459716796875,
0.032470703125,
-0.032989501953125,
0.051513671875,
-0.01708984375,
0.047882080078125,
0.0167236328125,
-0.01438140869140625,
0.04931640625,
-0.027252197265625,
-0.0178985595703125,
-0.0038127899169921875,
0.075927734375,
0.0267486572265625,
0.030181884765625,
-0.008392333984375,
0.00907135009765625,
-0.0006861686706542969,
-0.0116424560546875,
-0.08392333984375,
-0.032989501953125,
0.0263519287109375,
-0.0296783447265625,
-0.0283966064453125,
0.0123138427734375,
-0.05133056640625,
-0.007328033447265625,
-0.01140594482421875,
0.05242919921875,
-0.043365478515625,
-0.0325927734375,
0.005626678466796875,
-0.00649261474609375,
0.0005965232849121094,
0.016082763671875,
-0.06475830078125,
-0.0115966796875,
0.036651611328125,
0.07177734375,
-0.00032901763916015625,
-0.0430908203125,
-0.0190887451171875,
0.0020809173583984375,
-0.011383056640625,
0.04425048828125,
-0.0247039794921875,
-0.004364013671875,
-0.0194854736328125,
0.006237030029296875,
-0.03411865234375,
-0.0211029052734375,
0.0293121337890625,
-0.0168609619140625,
0.027557373046875,
-0.00681304931640625,
-0.04266357421875,
0.00384521484375,
0.011566162109375,
-0.031463623046875,
0.07257080078125,
0.00797271728515625,
-0.07220458984375,
0.034912109375,
-0.05242919921875,
-0.0198516845703125,
-0.01922607421875,
-0.00562286376953125,
-0.044281005859375,
-0.009613037109375,
0.02325439453125,
0.034454345703125,
-0.0192413330078125,
0.03387451171875,
-0.037353515625,
-0.0251617431640625,
0.004108428955078125,
-0.0239715576171875,
0.07257080078125,
0.0189208984375,
-0.03314208984375,
0.038055419921875,
-0.06756591796875,
0.01409912109375,
0.0035114288330078125,
-0.0211029052734375,
0.004497528076171875,
-0.0264129638671875,
0.01360321044921875,
0.02813720703125,
0.01129913330078125,
-0.030548095703125,
0.02587890625,
-0.01824951171875,
0.054046630859375,
0.047760009765625,
0.0017442703247070312,
0.03076171875,
-0.0347900390625,
0.0130462646484375,
0.0079498291015625,
0.005558013916015625,
-0.0012836456298828125,
-0.04876708984375,
-0.0670166015625,
-0.040618896484375,
0.037994384765625,
0.0264739990234375,
-0.0411376953125,
0.044464111328125,
-0.017608642578125,
-0.05242919921875,
-0.0418701171875,
-0.014190673828125,
0.01244354248046875,
0.0236358642578125,
0.037811279296875,
0.0020503997802734375,
-0.05078125,
-0.048980712890625,
-0.0085601806640625,
-0.01206207275390625,
0.01317596435546875,
0.01312255859375,
0.0518798828125,
-0.023773193359375,
0.0657958984375,
-0.028045654296875,
-0.0060272216796875,
-0.0059661865234375,
0.0345458984375,
0.03741455078125,
0.05999755859375,
0.0499267578125,
-0.060150146484375,
-0.047088623046875,
-0.0311126708984375,
-0.046661376953125,
-0.01079559326171875,
-0.00321197509765625,
-0.02410888671875,
0.0110321044921875,
0.023590087890625,
-0.05224609375,
0.03375244140625,
0.0191802978515625,
-0.0361328125,
0.03179931640625,
-0.0022106170654296875,
0.0277252197265625,
-0.09912109375,
0.0083770751953125,
0.0157012939453125,
0.0029239654541015625,
-0.042694091796875,
-0.008544921875,
0.011566162109375,
0.00667572021484375,
-0.028778076171875,
0.05047607421875,
-0.026824951171875,
0.0157318115234375,
-0.0137786865234375,
-0.0050048828125,
-0.01082611083984375,
0.0323486328125,
-0.00606536865234375,
0.0577392578125,
0.048797607421875,
-0.048858642578125,
0.03076171875,
0.01116943359375,
-0.00811767578125,
0.0139923095703125,
-0.043365478515625,
-0.0044708251953125,
0.004962921142578125,
0.006717681884765625,
-0.07843017578125,
-0.0200653076171875,
0.0265350341796875,
-0.049835205078125,
0.0290069580078125,
-0.0294647216796875,
-0.038360595703125,
-0.03814697265625,
-0.03076171875,
0.035675048828125,
0.0272064208984375,
-0.0374755859375,
0.0295562744140625,
0.028289794921875,
0.01528167724609375,
-0.05224609375,
-0.04205322265625,
-0.01262664794921875,
-0.03448486328125,
-0.055328369140625,
0.039215087890625,
-0.0193634033203125,
-0.0020751953125,
0.0020809173583984375,
-0.0151519775390625,
-0.00759124755859375,
-0.007511138916015625,
0.00848388671875,
0.021331787109375,
-0.01763916015625,
0.01076507568359375,
-0.026458740234375,
-0.0087127685546875,
-0.014434814453125,
-0.0310821533203125,
0.0755615234375,
-0.01898193359375,
-0.0199432373046875,
-0.0321044921875,
0.00257110595703125,
0.045440673828125,
-0.0198516845703125,
0.06353759765625,
0.0855712890625,
-0.038360595703125,
0.0028133392333984375,
-0.049407958984375,
-0.019561767578125,
-0.035003662109375,
0.040283203125,
-0.038238525390625,
-0.052490234375,
0.050567626953125,
0.004734039306640625,
0.004573822021484375,
0.07647705078125,
0.05389404296875,
0.004161834716796875,
0.0626220703125,
0.04248046875,
-0.005680084228515625,
0.045684814453125,
-0.06304931640625,
0.01465606689453125,
-0.061004638671875,
-0.02410888671875,
-0.04644775390625,
-0.0069580078125,
-0.0771484375,
-0.02996826171875,
0.0083770751953125,
0.0266571044921875,
-0.046295166015625,
0.033843994140625,
-0.05682373046875,
0.011016845703125,
0.056427001953125,
0.0139007568359375,
-0.005001068115234375,
0.00786590576171875,
-0.000017762184143066406,
-0.00811767578125,
-0.048858642578125,
-0.03314208984375,
0.0882568359375,
0.017242431640625,
0.05328369140625,
0.00782012939453125,
0.0675048828125,
-0.001194000244140625,
0.0228424072265625,
-0.037109375,
0.037017822265625,
-0.022613525390625,
-0.05413818359375,
-0.027435302734375,
-0.04705810546875,
-0.05511474609375,
0.0263824462890625,
-0.01641845703125,
-0.07244873046875,
0.0128326416015625,
-0.006923675537109375,
-0.0160980224609375,
0.0341796875,
-0.0465087890625,
0.06573486328125,
-0.0104827880859375,
-0.035003662109375,
-0.003406524658203125,
-0.03265380859375,
0.0226898193359375,
0.00022327899932861328,
0.0014553070068359375,
-0.036285400390625,
0.01242828369140625,
0.07794189453125,
-0.0243988037109375,
0.049530029296875,
-0.0011167526245117188,
0.01556396484375,
0.0233154296875,
-0.0252532958984375,
0.01226043701171875,
0.022552490234375,
-0.0156402587890625,
0.006786346435546875,
-0.004718780517578125,
-0.03045654296875,
-0.01343536376953125,
0.0648193359375,
-0.07598876953125,
-0.039703369140625,
-0.05413818359375,
-0.042755126953125,
-0.006832122802734375,
0.0286865234375,
0.052276611328125,
0.047882080078125,
0.0033664703369140625,
0.01947021484375,
0.043548583984375,
-0.01059722900390625,
0.047576904296875,
0.01483917236328125,
-0.01084136962890625,
-0.04388427734375,
0.06689453125,
0.0130615234375,
0.01071929931640625,
0.01629638671875,
0.0162506103515625,
-0.0253753662109375,
-0.039337158203125,
-0.037353515625,
0.038482666015625,
-0.05035400390625,
-0.0197601318359375,
-0.059173583984375,
-0.049224853515625,
-0.05548095703125,
0.00814056396484375,
-0.027923583984375,
-0.048126220703125,
-0.046295166015625,
0.0009417533874511719,
0.038848876953125,
0.0379638671875,
-0.018707275390625,
0.0207672119140625,
-0.037200927734375,
0.02801513671875,
0.01155853271484375,
0.0198974609375,
-0.01338958740234375,
-0.066162109375,
-0.0157318115234375,
0.00797271728515625,
-0.0355224609375,
-0.0648193359375,
0.0380859375,
0.0149078369140625,
0.040008544921875,
0.0212860107421875,
0.006801605224609375,
0.06402587890625,
-0.0233154296875,
0.04547119140625,
0.022613525390625,
-0.0643310546875,
0.0430908203125,
-0.03350830078125,
0.022552490234375,
0.030731201171875,
0.0205078125,
-0.031646728515625,
-0.038482666015625,
-0.0718994140625,
-0.0845947265625,
0.0731201171875,
0.031768798828125,
0.0033817291259765625,
0.007053375244140625,
0.0151519775390625,
-0.01367950439453125,
0.026947021484375,
-0.061492919921875,
-0.04425048828125,
-0.025482177734375,
-0.02886962890625,
-0.01502227783203125,
-0.0244293212890625,
-0.0142822265625,
-0.04095458984375,
0.071533203125,
-0.002239227294921875,
0.0401611328125,
-0.00038695335388183594,
-0.0017681121826171875,
-0.00969696044921875,
0.00841522216796875,
0.05133056640625,
0.0677490234375,
-0.05194091796875,
0.0023040771484375,
0.004535675048828125,
-0.0308685302734375,
-0.0012331008911132812,
0.0269012451171875,
-0.006343841552734375,
0.02301025390625,
0.038787841796875,
0.06304931640625,
0.005847930908203125,
-0.0007762908935546875,
0.0226898193359375,
-0.00020897388458251953,
-0.0243988037109375,
-0.011260986328125,
0.008514404296875,
0.01107025146484375,
0.01910400390625,
0.0419921875,
0.0221405029296875,
-0.0011434555053710938,
-0.039398193359375,
0.0296783447265625,
0.0178375244140625,
-0.0105133056640625,
-0.014495849609375,
0.05712890625,
0.0036773681640625,
-0.01229095458984375,
0.0350341796875,
-0.01361846923828125,
-0.055206298828125,
0.0643310546875,
0.030975341796875,
0.06744384765625,
-0.0004954338073730469,
0.01404571533203125,
0.059906005859375,
0.0198822021484375,
-0.0060882568359375,
0.030364990234375,
0.01326751708984375,
-0.05133056640625,
-0.00791168212890625,
-0.048248291015625,
-0.002811431884765625,
0.020294189453125,
-0.045257568359375,
0.035675048828125,
-0.0230255126953125,
-0.03179931640625,
0.0206756591796875,
0.0218048095703125,
-0.050140380859375,
0.025787353515625,
-0.0075225830078125,
0.06378173828125,
-0.066162109375,
0.07781982421875,
0.050537109375,
-0.04266357421875,
-0.06829833984375,
0.0199737548828125,
-0.020660400390625,
-0.04852294921875,
0.051177978515625,
0.0282745361328125,
0.00829315185546875,
0.007659912109375,
-0.0382080078125,
-0.0869140625,
0.0826416015625,
0.019073486328125,
-0.020111083984375,
0.00010079145431518555,
0.0027751922607421875,
0.033447265625,
-0.027618408203125,
0.046630859375,
0.0246124267578125,
0.0284423828125,
0.0296478271484375,
-0.08062744140625,
-0.004512786865234375,
-0.0249176025390625,
-0.0133209228515625,
0.0117034912109375,
-0.07391357421875,
0.09869384765625,
-0.016754150390625,
0.003173828125,
0.005756378173828125,
0.046356201171875,
0.039154052734375,
0.0292205810546875,
0.0188140869140625,
0.0421142578125,
0.056640625,
-0.036407470703125,
0.046173095703125,
-0.03240966796875,
0.0751953125,
0.0697021484375,
0.0163726806640625,
0.038665771484375,
0.030181884765625,
-0.0105133056640625,
0.0085906982421875,
0.07220458984375,
-0.01317596435546875,
0.067138671875,
0.00911712646484375,
-0.0015649795532226562,
-0.01064300537109375,
0.004840850830078125,
-0.05523681640625,
0.023681640625,
0.0195159912109375,
-0.027618408203125,
-0.010345458984375,
-0.00824737548828125,
0.01007843017578125,
-0.0274200439453125,
-0.020751953125,
0.037628173828125,
-0.003162384033203125,
-0.040740966796875,
0.06640625,
0.014068603515625,
0.07940673828125,
-0.03094482421875,
0.01383209228515625,
0.0031642913818359375,
0.0247650146484375,
-0.021087646484375,
-0.04736328125,
0.01308441162109375,
-0.000667572021484375,
-0.0089569091796875,
-0.0030765533447265625,
0.04388427734375,
-0.04498291015625,
-0.052581787109375,
0.02008056640625,
0.02593994140625,
0.00975799560546875,
0.00839996337890625,
-0.07806396484375,
0.00534820556640625,
0.0184326171875,
-0.03240966796875,
-0.0000209808349609375,
0.040618896484375,
0.0107879638671875,
0.042755126953125,
0.0355224609375,
0.0040130615234375,
0.0226593017578125,
0.01445770263671875,
0.068115234375,
-0.0452880859375,
-0.022735595703125,
-0.0775146484375,
0.033966064453125,
-0.0218048095703125,
-0.050323486328125,
0.05755615234375,
0.056793212890625,
0.06707763671875,
0.0004734992980957031,
0.055450439453125,
-0.0274658203125,
0.033447265625,
-0.03680419921875,
0.06072998046875,
-0.056640625,
0.0121307373046875,
-0.02276611328125,
-0.05316162109375,
-0.01727294921875,
0.06573486328125,
-0.027130126953125,
0.0345458984375,
0.053070068359375,
0.056182861328125,
-0.0155181884765625,
-0.0015974044799804688,
0.007472991943359375,
0.0142822265625,
0.026123046875,
0.028961181640625,
0.043304443359375,
-0.05230712890625,
0.033111572265625,
-0.052825927734375,
-0.0118560791015625,
-0.031585693359375,
-0.045654296875,
-0.08770751953125,
-0.058502197265625,
-0.036285400390625,
-0.057525634765625,
0.00711822509765625,
0.081787109375,
0.07452392578125,
-0.08062744140625,
-0.01369476318359375,
0.0008230209350585938,
0.006931304931640625,
0.006072998046875,
-0.017730712890625,
0.056182861328125,
-0.0203857421875,
-0.06439208984375,
-0.00585174560546875,
-0.003414154052734375,
0.00856781005859375,
0.00384521484375,
-0.0033550262451171875,
-0.037353515625,
-0.00917816162109375,
0.0193328857421875,
0.00402069091796875,
-0.051544189453125,
-0.01409912109375,
-0.0059051513671875,
-0.0290985107421875,
0.01531219482421875,
0.038055419921875,
-0.027557373046875,
0.00829315185546875,
0.03936767578125,
0.00804901123046875,
0.06591796875,
-0.00798797607421875,
0.0161285400390625,
-0.04681396484375,
0.02850341796875,
0.0142822265625,
0.0379638671875,
0.025390625,
-0.0123291015625,
0.05364990234375,
0.0357666015625,
-0.0361328125,
-0.066650390625,
-0.005970001220703125,
-0.0682373046875,
-0.017303466796875,
0.094482421875,
0.0037689208984375,
-0.00919342041015625,
0.01434326171875,
-0.00807952880859375,
0.05267333984375,
-0.037994384765625,
0.06378173828125,
0.045257568359375,
0.00002580881118774414,
-0.012969970703125,
-0.04119873046875,
0.03558349609375,
0.0136260986328125,
-0.046356201171875,
-0.0162506103515625,
0.005535125732421875,
0.048065185546875,
0.0194244384765625,
0.057861328125,
0.006732940673828125,
0.01262664794921875,
-0.008880615234375,
0.00513458251953125,
-0.00958251953125,
-0.032501220703125,
-0.00792694091796875,
0.00719451904296875,
-0.015655517578125,
-0.0389404296875
]
] |
jonatasgrosman/wav2vec2-large-xlsr-53-dutch | 2022-12-14T01:58:20.000Z | [
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"hf-asr-leaderboard",
"mozilla-foundation/common_voice_6_0",
"nl",
"robust-speech-event",
"speech",
"xlsr-fine-tuning-week",
"dataset:common_voice",
"dataset:mozilla-foundation/common_voice_6_0",
"doi:10.57967/hf/0203",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | jonatasgrosman | null | null | jonatasgrosman/wav2vec2-large-xlsr-53-dutch | 6 | 13,293 | transformers | 2022-03-02T23:29:05 | ---
language: nl
license: apache-2.0
datasets:
- common_voice
- mozilla-foundation/common_voice_6_0
metrics:
- wer
- cer
tags:
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
- mozilla-foundation/common_voice_6_0
- nl
- robust-speech-event
- speech
- xlsr-fine-tuning-week
model-index:
- name: XLSR Wav2Vec2 Dutch by Jonatas Grosman
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice nl
type: common_voice
args: nl
metrics:
- name: Test WER
type: wer
value: 15.72
- name: Test CER
type: cer
value: 5.35
- name: Test WER (+LM)
type: wer
value: 12.84
- name: Test CER (+LM)
type: cer
value: 4.64
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Robust Speech Event - Dev Data
type: speech-recognition-community-v2/dev_data
args: nl
metrics:
- name: Dev WER
type: wer
value: 35.79
- name: Dev CER
type: cer
value: 17.67
- name: Dev WER (+LM)
type: wer
value: 31.54
- name: Dev CER (+LM)
type: cer
value: 16.37
---
# Fine-tuned XLSR-53 large model for speech recognition in Dutch
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Dutch using the train and validation splits of [Common Voice 6.1](https://huggingface.co/datasets/common_voice) and [CSS10](https://github.com/Kyubyong/css10).
When using this model, make sure that your speech input is sampled at 16kHz.
This model has been fine-tuned thanks to the GPU credits generously given by the [OVHcloud](https://www.ovhcloud.com/en/public-cloud/ai-training/) :)
The script used for training can be found here: https://github.com/jonatasgrosman/wav2vec2-sprint
## Usage
The model can be used directly (without a language model) as follows...
Using the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) library:
```python
from huggingsound import SpeechRecognitionModel
model = SpeechRecognitionModel("jonatasgrosman/wav2vec2-large-xlsr-53-dutch")
audio_paths = ["/path/to/file.mp3", "/path/to/another_file.wav"]
transcriptions = model.transcribe(audio_paths)
```
Writing your own inference script:
```python
import torch
import librosa
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
LANG_ID = "nl"
MODEL_ID = "jonatasgrosman/wav2vec2-large-xlsr-53-dutch"
SAMPLES = 10
test_dataset = load_dataset("common_voice", LANG_ID, split=f"test[:{SAMPLES}]")
processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = librosa.load(batch["path"], sr=16_000)
    batch["speech"] = speech_array
    batch["sentence"] = batch["sentence"].upper()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits

predicted_ids = torch.argmax(logits, dim=-1)
predicted_sentences = processor.batch_decode(predicted_ids)

for i, predicted_sentence in enumerate(predicted_sentences):
    print("-" * 100)
    print("Reference:", test_dataset[i]["sentence"])
    print("Prediction:", predicted_sentence)
```
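The `torch.argmax` + `processor.batch_decode` pair above performs greedy CTC decoding: take the most likely token per frame, collapse repeats, and drop blanks. A simplified, self-contained sketch of that collapse (the tiny `id2char` vocabulary and the assumption that the CTC blank sits at id 0 are illustrative, not the model's real vocabulary):

```python
import torch

def greedy_ctc_decode(logits: torch.Tensor, id2char: dict, blank_id: int = 0) -> str:
    """Greedy CTC decode: per-frame argmax, collapse repeats, drop blanks."""
    ids = torch.argmax(logits, dim=-1).tolist()
    out, prev = [], None
    for i in ids:
        if i != prev and i != blank_id:
            out.append(id2char[i])
        prev = i
    return "".join(out)

# toy vocabulary: 0 = CTC blank, then characters
id2char = {0: "", 1: "S", 2: "T", 3: "O", 4: "F"}
# frame-level probabilities whose argmax path is S T T <blank> O F
logits = torch.log(torch.tensor([
    [0.1, 0.6, 0.1, 0.1, 0.1],
    [0.1, 0.1, 0.6, 0.1, 0.1],
    [0.1, 0.1, 0.6, 0.1, 0.1],
    [0.6, 0.1, 0.1, 0.1, 0.1],
    [0.1, 0.1, 0.1, 0.6, 0.1],
    [0.1, 0.1, 0.1, 0.1, 0.6],
]))
print(greedy_ctc_decode(logits, id2char))  # STOF
```

The repeated `T` frames collapse to a single character, which is why CTC models can emit fewer characters than audio frames.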
| Reference | Prediction |
| ------------- | ------------- |
| DE ABORIGINALS ZIJN DE OORSPRONKELIJKE BEWONERS VAN AUSTRALIË. | DE ABBORIGENALS ZIJN DE OORSPRONKELIJKE BEWONERS VAN AUSTRALIË |
| MIJN TOETSENBORD ZIT VOL STOF. | MIJN TOETSENBORD ZIT VOL STOF |
| ZE HAD DE BANK BESCHADIGD MET HAAR SKATEBOARD. | ZE HAD DE BANK BESCHADIGD MET HAAR SCHEETBOORD |
| WAAR LAAT JIJ JE ONDERHOUD DOEN? | WAAR LAAT JIJ HET ONDERHOUD DOEN |
| NA HET LEZEN VAN VELE BEOORDELINGEN HAD ZE EINDELIJK HAAR OOG LATEN VALLEN OP EEN LAPTOP MET EEN QWERTY TOETSENBORD. | NA HET LEZEN VAN VELE BEOORDELINGEN HAD ZE EINDELIJK HAAR OOG LATEN VALLEN OP EEN LAPTOP MET EEN QUERTITOETSEMBORD |
| DE TAMPONS ZIJN OP. | DE TAPONT ZIJN OP |
| MARIJKE KENT OLIVIER NU AL MEER DAN TWEE JAAR. | MAARRIJKEN KENT OLIEVIER NU AL MEER DAN TWEE JAAR |
| HET VOEREN VAN BROOD AAN EENDEN IS EIGENLIJK ONGEZOND VOOR DE BEESTEN. | HET VOEREN VAN BEUROT AAN EINDEN IS EIGENLIJK ONGEZOND VOOR DE BEESTEN |
| PARKET MOET JE STOFZUIGEN, TEGELS MOET JE DWEILEN. | PARKET MOET JE STOF ZUIGEN MAAR TEGELS MOET JE DWEILEN |
| IN ONZE BUURT KENT IEDEREEN ELKAAR. | IN ONZE BUURT KENT IEDEREEN ELKAAR |
## Evaluation
1. To evaluate on `mozilla-foundation/common_voice_6_0` with split `test`
```bash
python eval.py --model_id jonatasgrosman/wav2vec2-large-xlsr-53-dutch --dataset mozilla-foundation/common_voice_6_0 --config nl --split test
```
2. To evaluate on `speech-recognition-community-v2/dev_data`
```bash
python eval.py --model_id jonatasgrosman/wav2vec2-large-xlsr-53-dutch --dataset speech-recognition-community-v2/dev_data --config nl --split validation --chunk_length_s 5.0 --stride_length_s 1.0
```
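The WER and CER figures reported above were produced by the linked `eval.py` script. As a reference for what word error rate measures, here is an illustrative edit-distance implementation (a plain Levenshtein distance over words, not the exact metric code used by `eval.py`), applied to one of the reference/prediction pairs from the table:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + insertions + deletions) / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution / match
    return dp[len(ref)][len(hyp)] / len(ref)

print(wer("MIJN TOETSENBORD ZIT VOL STOF", "MIJN TOETSENBORD ZIT VOL STOF"))  # 0.0
print(wer("DE TAMPONS ZIJN OP", "DE TAPONT ZIJN OP"))  # 0.25 (1 substitution / 4 words)
```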
## Citation
If you want to cite this model you can use this:
```bibtex
@misc{grosman2021xlsr53-large-dutch,
title={Fine-tuned {XLSR}-53 large model for speech recognition in {D}utch},
author={Grosman, Jonatas},
howpublished={\url{https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-dutch}},
year={2021}
}
``` | 5,601 | [
[
-0.03338623046875,
-0.05322265625,
0.0153350830078125,
0.017822265625,
-0.01593017578125,
-0.0158538818359375,
-0.0278472900390625,
-0.049102783203125,
0.0220794677734375,
0.020263671875,
-0.046417236328125,
-0.0384521484375,
-0.0357666015625,
0.004573822021484375,
-0.02691650390625,
0.0662841796875,
0.00984954833984375,
0.0162200927734375,
-0.0091094970703125,
-0.0115966796875,
-0.0237579345703125,
-0.03485107421875,
-0.04833984375,
-0.026397705078125,
0.026824951171875,
0.0171661376953125,
0.022674560546875,
0.027862548828125,
0.029327392578125,
0.027679443359375,
-0.0242462158203125,
0.005687713623046875,
-0.02008056640625,
0.004886627197265625,
0.0063018798828125,
-0.030364990234375,
-0.0229644775390625,
-0.0022907257080078125,
0.040802001953125,
0.03912353515625,
-0.018951416015625,
0.0260772705078125,
0.001678466796875,
0.031707763671875,
-0.0284271240234375,
0.018951416015625,
-0.03216552734375,
-0.0100250244140625,
-0.006290435791015625,
0.00960540771484375,
-0.0251617431640625,
-0.0169219970703125,
0.015655517578125,
-0.032318115234375,
0.0150909423828125,
-0.00005626678466796875,
0.0765380859375,
0.01090240478515625,
-0.005306243896484375,
-0.0278167724609375,
-0.0513916015625,
0.06787109375,
-0.0706787109375,
0.0306396484375,
0.036590576171875,
0.011810302734375,
-0.0067138671875,
-0.059356689453125,
-0.053466796875,
-0.0120391845703125,
0.004535675048828125,
0.0217132568359375,
-0.03692626953125,
-0.006809234619140625,
0.025054931640625,
0.005390167236328125,
-0.056243896484375,
0.006671905517578125,
-0.05511474609375,
-0.0296173095703125,
0.05364990234375,
-0.00787353515625,
0.0146636962890625,
-0.0162200927734375,
-0.015655517578125,
-0.0465087890625,
-0.0109710693359375,
0.026641845703125,
0.0294342041015625,
0.039398193359375,
-0.042816162109375,
0.04388427734375,
-0.006473541259765625,
0.051544189453125,
0.00441741943359375,
-0.02337646484375,
0.06500244140625,
-0.01540374755859375,
-0.0245208740234375,
0.02001953125,
0.07952880859375,
0.0145416259765625,
0.02008056640625,
0.004589080810546875,
-0.00817108154296875,
0.018096923828125,
-0.0177001953125,
-0.0535888671875,
-0.0202789306640625,
0.029876708984375,
-0.0247955322265625,
-0.0096893310546875,
0.00832366943359375,
-0.04541015625,
-0.0028896331787109375,
-0.017578125,
0.0439453125,
-0.0435791015625,
-0.0218048095703125,
0.025146484375,
-0.0230255126953125,
0.00815582275390625,
-0.000194549560546875,
-0.061492919921875,
0.0123138427734375,
0.03277587890625,
0.062255859375,
0.0142364501953125,
-0.016815185546875,
-0.0253143310546875,
-0.012359619140625,
-0.016815185546875,
0.04486083984375,
-0.021484375,
-0.0209808349609375,
-0.013580322265625,
-0.001972198486328125,
-0.016448974609375,
-0.037506103515625,
0.051849365234375,
-0.0159454345703125,
0.035858154296875,
-0.01033782958984375,
-0.039520263671875,
-0.0201568603515625,
-0.00766754150390625,
-0.043182373046875,
0.0869140625,
0.0004830360412597656,
-0.06378173828125,
0.009857177734375,
-0.051727294921875,
-0.034515380859375,
-0.021728515625,
-0.0007491111755371094,
-0.043243408203125,
-0.01253509521484375,
0.021942138671875,
0.03961181640625,
-0.0150604248046875,
0.00472259521484375,
-0.026641845703125,
-0.0209808349609375,
0.02960205078125,
-0.0255889892578125,
0.0950927734375,
0.014373779296875,
-0.0286102294921875,
0.004123687744140625,
-0.0692138671875,
0.00952911376953125,
0.012481689453125,
-0.031494140625,
-0.014617919921875,
-0.0100860595703125,
0.01922607421875,
0.0159912109375,
0.01296234130859375,
-0.052337646484375,
0.0007476806640625,
-0.047943115234375,
0.04681396484375,
0.04278564453125,
-0.00797271728515625,
0.0112457275390625,
-0.0275115966796875,
0.0245208740234375,
-0.0008130073547363281,
-0.0008482933044433594,
0.006504058837890625,
-0.039215087890625,
-0.0540771484375,
-0.02630615234375,
0.026519775390625,
0.0458984375,
-0.032745361328125,
0.060211181640625,
-0.015869140625,
-0.06298828125,
-0.0555419921875,
-0.01160430908203125,
0.0301666259765625,
0.034332275390625,
0.038116455078125,
-0.004978179931640625,
-0.0728759765625,
-0.06610107421875,
0.0031337738037109375,
-0.0256805419921875,
-0.0164947509765625,
0.037139892578125,
0.046966552734375,
-0.02337646484375,
0.068359375,
-0.0291900634765625,
-0.0172576904296875,
-0.0174102783203125,
0.0118255615234375,
0.037017822265625,
0.051666259765625,
0.046112060546875,
-0.060028076171875,
-0.0285186767578125,
-0.0181732177734375,
-0.039581298828125,
-0.00682830810546875,
-0.0026798248291015625,
-0.00339508056640625,
0.0247955322265625,
0.0280914306640625,
-0.05377197265625,
0.0193939208984375,
0.043487548828125,
-0.0296783447265625,
0.04608154296875,
-0.0079803466796875,
-0.009735107421875,
-0.08447265625,
0.0110015869140625,
0.0081634521484375,
-0.013580322265625,
-0.037322998046875,
-0.0237579345703125,
-0.0131988525390625,
0.00887298583984375,
-0.033233642578125,
0.0386962890625,
-0.03155517578125,
-0.0064239501953125,
0.005237579345703125,
0.0184173583984375,
-0.00562286376953125,
0.040374755859375,
0.00208282470703125,
0.045013427734375,
0.0543212890625,
-0.039764404296875,
0.041839599609375,
0.038543701171875,
-0.04754638671875,
0.01279449462890625,
-0.06292724609375,
0.0214385986328125,
0.00665283203125,
0.0208282470703125,
-0.08416748046875,
-0.0092926025390625,
0.0286102294921875,
-0.06195068359375,
0.01611328125,
0.01168060302734375,
-0.035858154296875,
-0.04864501953125,
-0.008392333984375,
0.00965118408203125,
0.0538330078125,
-0.039337158203125,
0.050872802734375,
0.0333251953125,
-0.0126953125,
-0.0518798828125,
-0.0704345703125,
-0.016998291015625,
-0.011688232421875,
-0.0550537109375,
0.01300048828125,
-0.01073455810546875,
-0.0172271728515625,
-0.00762176513671875,
-0.015380859375,
0.00042557716369628906,
-0.0030841827392578125,
0.028289794921875,
0.027923583984375,
-0.017486572265625,
-0.00594329833984375,
-0.01221466064453125,
0.0026702880859375,
0.011962890625,
-0.011077880859375,
0.055450439453125,
-0.01116180419921875,
-0.00762176513671875,
-0.043182373046875,
0.01013946533203125,
0.040313720703125,
-0.017303466796875,
0.044647216796875,
0.065185546875,
-0.02630615234375,
-0.005191802978515625,
-0.040313720703125,
-0.0099639892578125,
-0.03515625,
0.046142578125,
-0.0204010009765625,
-0.053314208984375,
0.043609619140625,
0.025146484375,
0.0015869140625,
0.047027587890625,
0.042388916015625,
-0.014617919921875,
0.07080078125,
0.032257080078125,
-0.01300048828125,
0.041595458984375,
-0.0411376953125,
0.00463104248046875,
-0.051727294921875,
-0.0301055908203125,
-0.05364990234375,
-0.01904296875,
-0.033203125,
-0.028961181640625,
0.01145172119140625,
0.003612518310546875,
-0.0240936279296875,
0.035675048828125,
-0.04718017578125,
0.01515960693359375,
0.0494384765625,
0.0170135498046875,
-0.01213836669921875,
0.01142120361328125,
-0.023651123046875,
0.00963592529296875,
-0.05029296875,
-0.0283660888671875,
0.0758056640625,
0.0291290283203125,
0.0460205078125,
-0.0006170272827148438,
0.04644775390625,
0.00601959228515625,
-0.016693115234375,
-0.05621337890625,
0.047454833984375,
-0.01177978515625,
-0.048126220703125,
-0.034332275390625,
-0.0260772705078125,
-0.0706787109375,
0.01348876953125,
-0.02191162109375,
-0.07647705078125,
0.020843505859375,
-0.0002758502960205078,
-0.0394287109375,
0.01497650146484375,
-0.050445556640625,
0.059844970703125,
-0.0035533905029296875,
-0.01207733154296875,
-0.01267242431640625,
-0.057464599609375,
0.0161590576171875,
0.0013952255249023438,
0.0080718994140625,
-0.00994110107421875,
0.0224456787109375,
0.09771728515625,
-0.0234527587890625,
0.06591796875,
-0.0160064697265625,
0.0024871826171875,
0.03582763671875,
-0.0212249755859375,
0.0296783447265625,
-0.019378662109375,
-0.0201873779296875,
0.0308074951171875,
0.0186309814453125,
-0.0156402587890625,
-0.02044677734375,
0.043365478515625,
-0.0810546875,
-0.0230712890625,
-0.036773681640625,
-0.0384521484375,
-0.00846099853515625,
0.0164947509765625,
0.041351318359375,
0.03961181640625,
-0.01284027099609375,
0.03460693359375,
0.034881591796875,
-0.0211944580078125,
0.037628173828125,
0.032562255859375,
-0.009735107421875,
-0.03900146484375,
0.0455322265625,
0.01459503173828125,
0.0126953125,
0.01806640625,
0.02679443359375,
-0.042144775390625,
-0.033111572265625,
-0.0128173828125,
0.02239990234375,
-0.053619384765625,
-0.0108795166015625,
-0.058013916015625,
-0.0192718505859375,
-0.056732177734375,
0.008148193359375,
-0.0242919921875,
-0.0282745361328125,
-0.03497314453125,
-0.016998291015625,
0.042388916015625,
0.03570556640625,
-0.031524658203125,
0.019500732421875,
-0.03863525390625,
0.0292205810546875,
0.01038360595703125,
0.0054473876953125,
-0.0060882568359375,
-0.06146240234375,
-0.0265960693359375,
0.0211334228515625,
-0.0103607177734375,
-0.0560302734375,
0.041015625,
0.00608062744140625,
0.0374755859375,
0.028961181640625,
0.010955810546875,
0.054931640625,
-0.02496337890625,
0.06463623046875,
0.0285186767578125,
-0.073974609375,
0.04498291015625,
-0.017425537109375,
0.018585205078125,
0.0267333984375,
0.019989013671875,
-0.043426513671875,
-0.024078369140625,
-0.0576171875,
-0.07550048828125,
0.0699462890625,
0.020355224609375,
0.007030487060546875,
0.00525665283203125,
0.01329803466796875,
-0.004413604736328125,
0.0021457672119140625,
-0.054443359375,
-0.04290771484375,
-0.01416778564453125,
-0.0187835693359375,
-0.02294921875,
-0.0190887451171875,
-0.007297515869140625,
-0.03564453125,
0.0755615234375,
0.016448974609375,
0.031707763671875,
0.024993896484375,
0.006744384765625,
-0.00160980224609375,
0.01290130615234375,
0.050140380859375,
0.0281982421875,
-0.03375244140625,
-0.007503509521484375,
0.01580810546875,
-0.05413818359375,
0.007659912109375,
0.0126953125,
-0.0138092041015625,
0.02655029296875,
0.03704833984375,
0.094482421875,
0.0071868896484375,
-0.044097900390625,
0.0360107421875,
-0.0073699951171875,
-0.033935546875,
-0.053466796875,
0.01035308837890625,
0.0200958251953125,
0.0259246826171875,
0.0304107666015625,
0.0217132568359375,
-0.00913238525390625,
-0.026641845703125,
0.01241302490234375,
0.024017333984375,
-0.0207977294921875,
-0.0165252685546875,
0.051177978515625,
0.006420135498046875,
-0.02740478515625,
0.0452880859375,
-0.003986358642578125,
-0.033966064453125,
0.0621337890625,
0.04510498046875,
0.063232421875,
-0.0202789306640625,
-0.00684356689453125,
0.055511474609375,
0.01531982421875,
-0.0230255126953125,
0.03692626953125,
0.00440216064453125,
-0.0625,
-0.0226287841796875,
-0.045928955078125,
-0.0187835693359375,
0.0264739990234375,
-0.07049560546875,
0.029052734375,
-0.0264739990234375,
-0.020965576171875,
0.0229034423828125,
0.00876617431640625,
-0.052215576171875,
0.0208587646484375,
0.03253173828125,
0.07855224609375,
-0.07183837890625,
0.0712890625,
0.03900146484375,
-0.03448486328125,
-0.08453369140625,
-0.00786590576171875,
-0.0115203857421875,
-0.0550537109375,
0.04229736328125,
0.017608642578125,
-0.0087432861328125,
-0.0005116462707519531,
-0.048065185546875,
-0.07647705078125,
0.0804443359375,
0.03192138671875,
-0.0697021484375,
0.0002608299255371094,
-0.004512786865234375,
0.03546142578125,
-0.0139617919921875,
0.02984619140625,
0.046234130859375,
0.042236328125,
0.005245208740234375,
-0.08404541015625,
-0.005584716796875,
-0.0231475830078125,
-0.01500701904296875,
-0.014373779296875,
-0.041534423828125,
0.07720947265625,
-0.028350830078125,
-0.011138916015625,
0.01457977294921875,
0.05816650390625,
0.0297088623046875,
0.0195465087890625,
0.047821044921875,
0.039215087890625,
0.07464599609375,
-0.0079498291015625,
0.0693359375,
-0.01349639892578125,
0.039825439453125,
0.0845947265625,
-0.01061248779296875,
0.08624267578125,
0.02996826171875,
-0.031951904296875,
0.042938232421875,
0.045013427734375,
-0.0253448486328125,
0.04229736328125,
0.01169586181640625,
-0.0101776123046875,
-0.0166168212890625,
0.011444091796875,
-0.041259765625,
0.060394287109375,
0.01425933837890625,
-0.0216827392578125,
0.00937652587890625,
0.02093505859375,
0.008209228515625,
-0.0113372802734375,
-0.01508331298828125,
0.044342041015625,
0.006816864013671875,
-0.037139892578125,
0.06585693359375,
-0.003936767578125,
0.0703125,
-0.0531005859375,
0.01421356201171875,
0.01380157470703125,
0.020599365234375,
-0.0192718505859375,
-0.05059814453125,
0.00823211669921875,
0.0069122314453125,
-0.02630615234375,
-0.0005865097045898438,
0.037384033203125,
-0.0396728515625,
-0.05169677734375,
0.04327392578125,
0.016815185546875,
0.032257080078125,
0.000015020370483398438,
-0.06622314453125,
0.0174560546875,
0.0196075439453125,
-0.037261962890625,
0.006397247314453125,
0.022796630859375,
0.019012451171875,
0.044403076171875,
0.05810546875,
0.015899658203125,
-0.003162384033203125,
0.0035800933837890625,
0.050750732421875,
-0.04595947265625,
-0.053009033203125,
-0.061767578125,
0.038116455078125,
-0.0029354095458984375,
-0.0237579345703125,
0.055389404296875,
0.059722900390625,
0.07061767578125,
-0.005039215087890625,
0.07745361328125,
-0.0171661376953125,
0.0653076171875,
-0.033966064453125,
0.059112548828125,
-0.03497314453125,
0.006069183349609375,
-0.0261077880859375,
-0.049468994140625,
-0.01450347900390625,
0.07354736328125,
-0.031890869140625,
0.007526397705078125,
0.037841796875,
0.0679931640625,
-0.0023326873779296875,
-0.004756927490234375,
0.019500732421875,
0.042755126953125,
0.01325225830078125,
0.043060302734375,
0.03662109375,
-0.05029296875,
0.06134033203125,
-0.038421630859375,
-0.0017309188842773438,
-0.0204010009765625,
-0.046783447265625,
-0.06634521484375,
-0.05426025390625,
-0.03643798828125,
-0.052520751953125,
-0.006488800048828125,
0.08648681640625,
0.05078125,
-0.0753173828125,
-0.0243377685546875,
0.0116119384765625,
-0.0146484375,
-0.03155517578125,
-0.0165863037109375,
0.04937744140625,
0.01136016845703125,
-0.062164306640625,
0.0276947021484375,
-0.01201629638671875,
0.02154541015625,
-0.0015544891357421875,
-0.01413726806640625,
-0.021759033203125,
0.005680084228515625,
0.029693603515625,
0.0299224853515625,
-0.06463623046875,
-0.0155029296875,
-0.0004405975341796875,
-0.02239990234375,
0.01334381103515625,
0.0153045654296875,
-0.053955078125,
0.0203704833984375,
0.042388916015625,
0.013275146484375,
0.0499267578125,
-0.01629638671875,
0.01953125,
-0.042938232421875,
0.03155517578125,
0.0226898193359375,
0.045684814453125,
0.02703857421875,
-0.0127716064453125,
0.0283966064453125,
0.0188446044921875,
-0.035552978515625,
-0.07513427734375,
-0.017913818359375,
-0.10223388671875,
-0.0231475830078125,
0.1021728515625,
-0.0101318359375,
-0.0244598388671875,
0.0032253265380859375,
-0.030487060546875,
0.04766845703125,
-0.041259765625,
0.04248046875,
0.04571533203125,
-0.010986328125,
0.0003414154052734375,
-0.044158935546875,
0.023193359375,
0.0225372314453125,
-0.033172607421875,
0.004436492919921875,
0.028839111328125,
0.037384033203125,
0.02655029296875,
0.048675537109375,
0.00876617431640625,
0.0203399658203125,
0.0158538818359375,
0.033050537109375,
-0.013153076171875,
-0.0006351470947265625,
-0.033905029296875,
0.0014801025390625,
-0.0265045166015625,
-0.045623779296875
]
] |
pankajmathur/orca_mini_v3_70b | 2023-08-25T23:15:10.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:psmathur/orca_mini_v1_dataset",
"dataset:ehartford/dolphin",
"arxiv:2306.02707",
"license:other",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | pankajmathur | null | null | pankajmathur/orca_mini_v3_70b | 16 | 13,229 | transformers | 2023-08-10T02:28:29 | ---
language:
- en
library_name: transformers
license: other
datasets:
- psmathur/orca_mini_v1_dataset
- ehartford/dolphin
pipeline_tag: text-generation
---
# orca_mini_v3_70b
A Llama2-70b model trained on Orca Style datasets.
<br>

<br>
**P.S. If you're interested in collaborating, please connect with me at www.linkedin.com/in/pankajam.**
<br>
### quantized versions
Big thanks to [@TheBloke](https://huggingface.co/TheBloke)
1) https://huggingface.co/TheBloke/orca_mini_v3_70B-GGML
2) https://huggingface.co/TheBloke/orca_mini_v3_70B-GPTQ
<br>
#### license disclaimer:
This model is bound by the license and usage restrictions of the original Llama-2 model, and comes with no warranty or guarantees of any kind.
<br>
## Evaluation
We evaluated orca_mini_v3_70b on a wide range of tasks using [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) from EleutherAI.
Here are the results on the metrics used by the [HuggingFaceH4 Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard):
|**Task**|**Metric**|**Value**|**Stderr**|
|:------:|:--------:|:-------:|:--------:|
|*arc_challenge*|acc_norm|0.7098|0.0132|
|*hellaswag*|acc_norm|0.8779|0.0032|
|*mmlu*|acc_norm|0.6904|0.0351|
|*truthfulqa_mc*|mc2|0.6196|0.0151|
|**Total Average**|-|**0.722175**||
<br>
## Example Usage
Here is the prompt format:
```
### System:
You are an AI assistant that follows instruction extremely well. Help as much as you can.
### User:
Tell me about Orcas.
### Assistant:
```
Below is a code example showing how to use this model:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
tokenizer = AutoTokenizer.from_pretrained("psmathur/orca_mini_v3_70b")
model = AutoModelForCausalLM.from_pretrained(
"psmathur/orca_mini_v3_70b",
torch_dtype=torch.float16,
load_in_8bit=True,
low_cpu_mem_usage=True,
device_map="auto"
)
system_prompt = "### System:\nYou are an AI assistant that follows instruction extremely well. Help as much as you can.\n\n"
# Generate text
instruction = "Tell me about Orcas."
prompt = f"{system_prompt}### User: {instruction}\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=4096)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
<br>
#### Limitations & Biases:
While this model aims for accuracy, it can occasionally produce inaccurate or misleading results.
Despite diligent efforts in refining the pretraining data, there remains a possibility for the generation of inappropriate, biased, or offensive content.
Exercise caution and cross-check information when necessary.
<br>
### Citation:
Please cite using the following BibTeX:
```
@misc{orca_mini_v3_70b,
author = {Pankaj Mathur},
title = {orca_mini_v3_70b: An Orca Style Llama2-70b model},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/psmathur/orca_mini_v3_70b}},
}
```
```
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```
@software{touvron2023llama2,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron, Louis Martin, Kevin Stone, Peter Albert, Amjad Almahairi, Yasmine Babaei, Nikolay Bashlykov, Soumya Batra, Prajjwal Bhargava,
Shruti Bhosale, Dan Bikel, Lukas Blecher, Cristian Canton Ferrer, Moya Chen, Guillem Cucurull, David Esiobu, Jude Fernandes, Jeremy Fu, Wenyin Fu, Brian Fuller,
Cynthia Gao, Vedanuj Goswami, Naman Goyal, Anthony Hartshorn, Saghar Hosseini, Rui Hou, Hakan Inan, Marcin Kardas, Viktor Kerkez Madian Khabsa, Isabel Kloumann,
Artem Korenev, Punit Singh Koura, Marie-Anne Lachaux, Thibaut Lavril, Jenya Lee, Diana Liskovich, Yinghai Lu, Yuning Mao, Xavier Martinet, Todor Mihaylov,
Pushkar Mishra, Igor Molybog, Yixin Nie, Andrew Poulton, Jeremy Reizenstein, Rashi Rungta, Kalyan Saladi, Alan Schelten, Ruan Silva, Eric Michael Smith,
Ranjan Subramanian, Xiaoqing Ellen Tan, Binh Tang, Ross Taylor, Adina Williams, Jian Xiang Kuan, Puxin Xu , Zheng Yan, Iliyan Zarov, Yuchen Zhang, Angela Fan,
Melanie Kambadur, Sharan Narang, Aurelien Rodriguez, Robert Stojnic, Sergey Edunov, Thomas Scialom},
year={2023}
}
``` | 4,735 | [
[
-0.022003173828125,
-0.062103271484375,
0.0144805908203125,
0.00827789306640625,
-0.0178070068359375,
-0.0005202293395996094,
-0.004421234130859375,
-0.046539306640625,
0.020172119140625,
0.0130462646484375,
-0.053375244140625,
-0.04632568359375,
-0.041748046875,
-0.009124755859375,
-0.015777587890625,
0.08050537109375,
-0.002780914306640625,
-0.01216888427734375,
0.005672454833984375,
-0.013092041015625,
-0.0390625,
-0.029296875,
-0.0760498046875,
-0.031524658203125,
0.02001953125,
0.01824951171875,
0.045867919921875,
0.0546875,
0.0275726318359375,
0.0254974365234375,
-0.0191650390625,
0.0178375244140625,
-0.030975341796875,
-0.0174407958984375,
0.0175628662109375,
-0.044097900390625,
-0.07684326171875,
-0.0007524490356445312,
0.02716064453125,
0.0190887451171875,
-0.0147705078125,
0.032501220703125,
0.01354217529296875,
0.0281219482421875,
-0.0272064208984375,
0.038238525390625,
-0.0215301513671875,
0.00033736228942871094,
-0.028839111328125,
-0.001079559326171875,
-0.0014820098876953125,
-0.02099609375,
0.005016326904296875,
-0.05584716796875,
0.007015228271484375,
-0.0079345703125,
0.08526611328125,
0.028717041015625,
-0.023834228515625,
-0.0243072509765625,
-0.02679443359375,
0.054229736328125,
-0.07135009765625,
0.01331329345703125,
0.01009368896484375,
0.01788330078125,
-0.0215911865234375,
-0.06732177734375,
-0.057159423828125,
-0.0130615234375,
-0.00949859619140625,
0.01509857177734375,
-0.01474761962890625,
-0.00433349609375,
0.026702880859375,
0.025390625,
-0.040771484375,
0.0144195556640625,
-0.03424072265625,
-0.0184783935546875,
0.045196533203125,
0.029388427734375,
0.023101806640625,
-0.01043701171875,
-0.024383544921875,
-0.0286865234375,
-0.051727294921875,
0.034820556640625,
0.032806396484375,
0.012481689453125,
-0.0452880859375,
0.047760009765625,
-0.00794219970703125,
0.04632568359375,
0.01407623291015625,
-0.032745361328125,
0.04107666015625,
-0.03594970703125,
-0.02783203125,
-0.0102996826171875,
0.0648193359375,
0.01438140869140625,
0.00609588623046875,
0.0249176025390625,
-0.0074310302734375,
0.01141357421875,
-0.013427734375,
-0.054046630859375,
-0.0250701904296875,
0.0144805908203125,
-0.034088134765625,
-0.0279083251953125,
-0.0121917724609375,
-0.05816650390625,
-0.01537322998046875,
-0.013092041015625,
0.026031494140625,
-0.03173828125,
-0.028961181640625,
0.017303466796875,
0.01123046875,
0.03997802734375,
0.01079559326171875,
-0.06817626953125,
0.022857666015625,
0.0235137939453125,
0.062744140625,
0.004314422607421875,
-0.0173797607421875,
-0.011627197265625,
0.00145721435546875,
-0.008819580078125,
0.0509033203125,
-0.0180511474609375,
-0.028839111328125,
-0.0254058837890625,
0.000644683837890625,
-0.01068878173828125,
-0.0247344970703125,
0.039886474609375,
-0.026123046875,
0.01953125,
-0.0229949951171875,
-0.02044677734375,
-0.024505615234375,
0.017303466796875,
-0.043182373046875,
0.09173583984375,
-0.0016021728515625,
-0.06158447265625,
0.0198822021484375,
-0.061370849609375,
-0.00968170166015625,
-0.0212554931640625,
-0.015960693359375,
-0.05126953125,
-0.023193359375,
0.033233642578125,
0.0273590087890625,
-0.019775390625,
0.00804901123046875,
-0.030303955078125,
-0.02392578125,
0.004974365234375,
-0.0180206298828125,
0.08270263671875,
0.0128631591796875,
-0.055877685546875,
0.0207977294921875,
-0.05633544921875,
-0.006496429443359375,
0.03350830078125,
-0.024658203125,
0.0037250518798828125,
-0.01403045654296875,
-0.016693115234375,
0.01806640625,
0.033599853515625,
-0.03656005859375,
0.022918701171875,
-0.0262603759765625,
0.04718017578125,
0.059326171875,
-0.010589599609375,
0.0218963623046875,
-0.0316162109375,
0.04071044921875,
-0.00040650367736816406,
0.02325439453125,
0.0094451904296875,
-0.049591064453125,
-0.083251953125,
-0.03082275390625,
0.020172119140625,
0.03656005859375,
-0.0465087890625,
0.043365478515625,
-0.0132293701171875,
-0.0526123046875,
-0.0341796875,
0.0016412734985351562,
0.029266357421875,
0.046783447265625,
0.0263671875,
-0.02618408203125,
-0.050445556640625,
-0.055206298828125,
0.00817108154296875,
-0.01523590087890625,
-0.00360107421875,
0.0301513671875,
0.042083740234375,
-0.0202178955078125,
0.0736083984375,
-0.034271240234375,
-0.031280517578125,
-0.02099609375,
0.00685882568359375,
0.02716064453125,
0.049652099609375,
0.056182861328125,
-0.041046142578125,
-0.024871826171875,
-0.0167694091796875,
-0.06829833984375,
-0.0015954971313476562,
0.00199127197265625,
-0.0243682861328125,
0.01515960693359375,
0.0316162109375,
-0.05859375,
0.049285888671875,
0.042755126953125,
-0.0299835205078125,
0.04766845703125,
-0.01081085205078125,
-0.006072998046875,
-0.0694580078125,
0.021331787109375,
-0.0023136138916015625,
-0.01374053955078125,
-0.0303802490234375,
-0.004673004150390625,
-0.0086822509765625,
0.004261016845703125,
-0.03497314453125,
0.051910400390625,
-0.032196044921875,
0.003162384033203125,
-0.006717681884765625,
0.008453369140625,
-0.0092926025390625,
0.057464599609375,
0.0003616809844970703,
0.047821044921875,
0.05303955078125,
-0.035400390625,
0.0291595458984375,
0.025482177734375,
-0.0289764404296875,
0.0277862548828125,
-0.0723876953125,
0.0264434814453125,
0.006744384765625,
0.03924560546875,
-0.0870361328125,
-0.01666259765625,
0.04290771484375,
-0.041229248046875,
0.033599853515625,
0.002349853515625,
-0.038970947265625,
-0.0360107421875,
-0.033538818359375,
0.03863525390625,
0.038726806640625,
-0.0394287109375,
0.041717529296875,
0.0270233154296875,
-0.003261566162109375,
-0.04913330078125,
-0.054046630859375,
-0.01555633544921875,
-0.0304412841796875,
-0.0556640625,
0.0243377685546875,
-0.0196075439453125,
0.006305694580078125,
-0.006992340087890625,
-0.013916015625,
0.00986480712890625,
-0.0011510848999023438,
0.026702880859375,
0.04052734375,
-0.0143280029296875,
-0.017120361328125,
-0.00009959936141967773,
-0.01139068603515625,
0.004840850830078125,
-0.007472991943359375,
0.05584716796875,
-0.031524658203125,
-0.0220489501953125,
-0.048248291015625,
-0.005401611328125,
0.0313720703125,
-0.0139312744140625,
0.057708740234375,
0.058135986328125,
-0.0251922607421875,
0.0159912109375,
-0.039093017578125,
-0.0197601318359375,
-0.03997802734375,
0.0213623046875,
-0.038055419921875,
-0.058197021484375,
0.059326171875,
0.0175018310546875,
0.015869140625,
0.05548095703125,
0.050384521484375,
-0.0018100738525390625,
0.0792236328125,
0.057830810546875,
-0.00043892860412597656,
0.0462646484375,
-0.051422119140625,
0.00007539987564086914,
-0.07220458984375,
-0.046112060546875,
-0.0308990478515625,
-0.0325927734375,
-0.037261962890625,
-0.020355224609375,
0.0286865234375,
0.0119476318359375,
-0.045135498046875,
0.03289794921875,
-0.04620361328125,
0.007781982421875,
0.0413818359375,
0.0270233154296875,
0.013641357421875,
-0.0022258758544921875,
-0.01432037353515625,
0.00577545166015625,
-0.058624267578125,
-0.0430908203125,
0.091552734375,
0.03155517578125,
0.05633544921875,
0.007205963134765625,
0.04168701171875,
0.005096435546875,
0.0220489501953125,
-0.04071044921875,
0.0408935546875,
0.016326904296875,
-0.05426025390625,
-0.022735595703125,
-0.0258636474609375,
-0.07940673828125,
0.01045989990234375,
-0.00360107421875,
-0.0638427734375,
0.020172119140625,
0.00884246826171875,
-0.045562744140625,
0.019317626953125,
-0.04119873046875,
0.0655517578125,
-0.013092041015625,
-0.002025604248046875,
-0.00274658203125,
-0.058807373046875,
0.04144287109375,
0.003704071044921875,
0.004085540771484375,
-0.007534027099609375,
-0.01102447509765625,
0.07843017578125,
-0.042877197265625,
0.075439453125,
-0.004253387451171875,
-0.00832366943359375,
0.0391845703125,
-0.0079345703125,
0.048828125,
0.01116943359375,
-0.007137298583984375,
0.0270843505859375,
-0.0029659271240234375,
-0.0357666015625,
-0.0216522216796875,
0.04632568359375,
-0.084716796875,
-0.038299560546875,
-0.03131103515625,
-0.025726318359375,
0.00974273681640625,
0.0170440673828125,
0.036407470703125,
0.023834228515625,
0.0186309814453125,
0.0027790069580078125,
0.040374755859375,
-0.0162506103515625,
0.038482666015625,
0.0311431884765625,
-0.0009012222290039062,
-0.03173828125,
0.05377197265625,
0.01326751708984375,
0.01401519775390625,
0.00571441650390625,
0.00225830078125,
-0.033843994140625,
-0.032867431640625,
-0.03155517578125,
0.04046630859375,
-0.0455322265625,
-0.0283050537109375,
-0.049163818359375,
-0.018157958984375,
-0.031890869140625,
-0.001659393310546875,
-0.03485107421875,
-0.0236663818359375,
-0.04913330078125,
-0.01183319091796875,
0.036346435546875,
0.0411376953125,
-0.01163482666015625,
0.0257110595703125,
-0.02667236328125,
0.0128631591796875,
0.02996826171875,
0.005218505859375,
0.00919342041015625,
-0.06524658203125,
-0.00786590576171875,
0.0183258056640625,
-0.045166015625,
-0.048492431640625,
0.03662109375,
0.00640106201171875,
0.045989990234375,
0.0153961181640625,
-0.004604339599609375,
0.07611083984375,
-0.00847625732421875,
0.07159423828125,
0.0247039794921875,
-0.07647705078125,
0.04010009765625,
-0.0237274169921875,
0.015106201171875,
0.0183258056640625,
0.02099609375,
-0.0159759521484375,
-0.0253143310546875,
-0.06964111328125,
-0.0679931640625,
0.065673828125,
0.02801513671875,
0.0036869049072265625,
0.01418304443359375,
0.03515625,
0.0042724609375,
0.0117950439453125,
-0.072265625,
-0.039398193359375,
-0.0262298583984375,
-0.007190704345703125,
0.0012073516845703125,
-0.0169525146484375,
-0.00339508056640625,
-0.025299072265625,
0.057373046875,
-0.002185821533203125,
0.04449462890625,
0.011688232421875,
0.005077362060546875,
-0.00485992431640625,
-0.013153076171875,
0.054412841796875,
0.0428466796875,
-0.0202484130859375,
-0.004344940185546875,
0.029754638671875,
-0.043243408203125,
-0.001461029052734375,
0.006198883056640625,
-0.0016231536865234375,
-0.00774383544921875,
0.029449462890625,
0.051971435546875,
-0.0081939697265625,
-0.03192138671875,
0.02923583984375,
-0.0081329345703125,
-0.007190704345703125,
-0.035003662109375,
0.00875091552734375,
0.01282501220703125,
0.037750244140625,
0.0183258056640625,
0.0124969482421875,
-0.0049285888671875,
-0.041748046875,
-0.00527191162109375,
0.0247039794921875,
-0.00963592529296875,
-0.033721923828125,
0.0728759765625,
0.002185821533203125,
-0.014251708984375,
0.049163818359375,
-0.00469970703125,
-0.03338623046875,
0.06463623046875,
0.02447509765625,
0.044677734375,
-0.01126861572265625,
-0.002605438232421875,
0.03814697265625,
0.01409149169921875,
-0.007755279541015625,
0.0302886962890625,
0.003021240234375,
-0.038970947265625,
-0.0250396728515625,
-0.036956787109375,
-0.015350341796875,
0.0283203125,
-0.0465087890625,
0.04119873046875,
-0.034515380859375,
-0.026580810546875,
-0.0054473876953125,
0.02471923828125,
-0.0633544921875,
0.01528167724609375,
0.014984130859375,
0.0670166015625,
-0.04974365234375,
0.076171875,
0.0447998046875,
-0.050201416015625,
-0.087158203125,
-0.024566650390625,
0.0031147003173828125,
-0.074462890625,
0.0433349609375,
0.002391815185546875,
-0.006778717041015625,
0.0102691650390625,
-0.055023193359375,
-0.07794189453125,
0.10992431640625,
0.038421630859375,
-0.026702880859375,
-0.007038116455078125,
-0.00036835670471191406,
0.04180908203125,
-0.0191497802734375,
0.048126220703125,
0.052703857421875,
0.0311126708984375,
0.01444244384765625,
-0.08074951171875,
0.0255279541015625,
-0.028350830078125,
-0.0049285888671875,
-0.00765228271484375,
-0.08087158203125,
0.08990478515625,
-0.020233154296875,
-0.00769805908203125,
0.0287628173828125,
0.05645751953125,
0.0460205078125,
0.01132965087890625,
0.030975341796875,
0.0430908203125,
0.050994873046875,
-0.00876617431640625,
0.0709228515625,
-0.011688232421875,
0.048309326171875,
0.0638427734375,
0.01490020751953125,
0.04644775390625,
0.01611328125,
-0.02593994140625,
0.0504150390625,
0.07464599609375,
0.000995635986328125,
0.043487548828125,
0.01030731201171875,
0.0005931854248046875,
-0.007160186767578125,
0.00506591796875,
-0.0546875,
0.0252227783203125,
0.03302001953125,
-0.0202789306640625,
-0.01178741455078125,
-0.011505126953125,
0.019317626953125,
-0.0304718017578125,
-0.00750732421875,
0.0428466796875,
0.01299285888671875,
-0.0247955322265625,
0.08203125,
0.00287628173828125,
0.06927490234375,
-0.0499267578125,
0.00312042236328125,
-0.03302001953125,
0.009674072265625,
-0.029083251953125,
-0.046478271484375,
0.007015228271484375,
-0.0018482208251953125,
0.006656646728515625,
-0.0029392242431640625,
0.0379638671875,
-0.0180816650390625,
-0.0270843505859375,
0.01473236083984375,
0.01800537109375,
0.0254364013671875,
0.01104736328125,
-0.0738525390625,
0.020477294921875,
0.008453369140625,
-0.053070068359375,
0.0196533203125,
0.031463623046875,
0.0022869110107421875,
0.055572509765625,
0.047332763671875,
-0.0019102096557617188,
0.0174713134765625,
-0.0201416015625,
0.08380126953125,
-0.033111572265625,
-0.0301971435546875,
-0.071533203125,
0.043243408203125,
0.0023326873779296875,
-0.039154052734375,
0.059967041015625,
0.034637451171875,
0.06610107421875,
0.003658294677734375,
0.04669189453125,
-0.028076171875,
0.017669677734375,
-0.031280517578125,
0.050750732421875,
-0.05126953125,
0.0287322998046875,
-0.017425537109375,
-0.06671142578125,
-0.0130615234375,
0.06787109375,
-0.0247039794921875,
0.01351165771484375,
0.041229248046875,
0.06500244140625,
-0.004726409912109375,
-0.0094451904296875,
-0.0003669261932373047,
0.0239715576171875,
0.041534423828125,
0.0609130859375,
0.043548583984375,
-0.049468994140625,
0.06396484375,
-0.03216552734375,
-0.0248870849609375,
-0.0192413330078125,
-0.060821533203125,
-0.07122802734375,
-0.0266265869140625,
-0.03271484375,
-0.035400390625,
-0.008331298828125,
0.062286376953125,
0.064453125,
-0.05279541015625,
-0.02545166015625,
-0.0083465576171875,
0.00110626220703125,
-0.0278472900390625,
-0.01284027099609375,
0.05108642578125,
0.0015010833740234375,
-0.0655517578125,
0.0066986083984375,
-0.00600433349609375,
0.03302001953125,
-0.018341064453125,
-0.01436614990234375,
-0.01438140869140625,
0.000047087669372558594,
0.02392578125,
0.038726806640625,
-0.05126953125,
-0.022918701171875,
-0.00472259521484375,
-0.0218658447265625,
0.010833740234375,
0.0223236083984375,
-0.061553955078125,
0.0220947265625,
0.023895263671875,
0.0122833251953125,
0.06353759765625,
-0.01190185546875,
0.015716552734375,
-0.040557861328125,
0.026092529296875,
0.002048492431640625,
0.0283203125,
0.0119476318359375,
-0.0275726318359375,
0.05242919921875,
0.0208282470703125,
-0.035491943359375,
-0.06439208984375,
-0.00290679931640625,
-0.09423828125,
0.00640106201171875,
0.08636474609375,
-0.029083251953125,
-0.023956298828125,
0.00878143310546875,
-0.0299072265625,
0.043670654296875,
-0.03863525390625,
0.06719970703125,
0.03216552734375,
-0.02227783203125,
-0.006450653076171875,
-0.03887939453125,
0.03936767578125,
0.02044677734375,
-0.062225341796875,
-0.026702880859375,
0.00856781005859375,
0.03619384765625,
0.0169677734375,
0.052520751953125,
-0.0091705322265625,
0.0150299072265625,
0.00536346435546875,
0.011627197265625,
-0.0253448486328125,
0.00035262107849121094,
-0.0117340087890625,
-0.00868988037109375,
-0.01025390625,
-0.03179931640625
]
] |
facebook/mms-tts-eng | 2023-09-06T13:32:25.000Z | [
"transformers",
"pytorch",
"safetensors",
"vits",
"text-to-audio",
"mms",
"text-to-speech",
"arxiv:2305.13516",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-to-speech | facebook | null | null | facebook/mms-tts-eng | 22 | 13,209 | transformers | 2023-08-24T09:09:22 |
---
license: cc-by-nc-4.0
tags:
- mms
- vits
pipeline_tag: text-to-speech
---
# Massively Multilingual Speech (MMS): English Text-to-Speech
This repository contains the **English (eng)** language text-to-speech (TTS) model checkpoint.
This model is part of Facebook's [Massively Multilingual Speech](https://arxiv.org/abs/2305.13516) project, aiming to
provide speech technology across a diverse range of languages. You can find more details about the supported languages
and their ISO 639-3 codes in the [MMS Language Coverage Overview](https://dl.fbaipublicfiles.com/mms/misc/language_coverage_mms.html),
and see all MMS-TTS checkpoints on the Hugging Face Hub: [facebook/mms-tts](https://huggingface.co/models?sort=trending&search=facebook%2Fmms-tts).
MMS-TTS is available in the 🤗 Transformers library from version 4.33 onwards.
## Model Details
VITS (**V**ariational **I**nference with adversarial learning for end-to-end **T**ext-to-**S**peech) is an end-to-end
speech synthesis model that predicts a speech waveform conditional on an input text sequence. It is a conditional variational
autoencoder (VAE) composed of a posterior encoder, decoder, and conditional prior.
A set of spectrogram-based acoustic features are predicted by the flow-based module, which is formed of a Transformer-based
text encoder and multiple coupling layers. The spectrogram is decoded using a stack of transposed convolutional layers,
much in the same style as the HiFi-GAN vocoder. Motivated by the one-to-many nature of the TTS problem, where the same text
input can be spoken in multiple ways, the model also includes a stochastic duration predictor, which allows the model to
synthesise speech with different rhythms from the same input text.
The model is trained end-to-end with a combination of losses derived from variational lower bound and adversarial training.
To improve the expressiveness of the model, normalizing flows are applied to the conditional prior distribution. During
inference, the text encodings are up-sampled based on the duration prediction module, and then mapped into the
waveform using a cascade of the flow module and HiFi-GAN decoder. Due to the stochastic nature of the duration predictor,
the model is non-deterministic, and thus requires a fixed seed to generate the same speech waveform.
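Because of this stochasticity, two calls on the same text will produce different waveforms unless the random state is fixed first. The idea can be illustrated with a minimal sketch, using Python's `random` module as a stand-in for `torch.manual_seed`:

```python
import random

def sample(seed):
    """Stand-in for a stochastic forward pass: the output depends only on the seed."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(4)]

a = sample(555)
b = sample(555)  # same seed -> identical "waveform"
c = sample(556)  # different seed -> different output

assert a == b
assert a != c
```

In practice this means calling `torch.manual_seed(seed)` immediately before the forward pass whenever a reproducible waveform is required.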
For the MMS project, a separate VITS checkpoint is trained for each language.
## Usage
MMS-TTS is available in the 🤗 Transformers library from version 4.33 onwards. To use this checkpoint,
first install the latest version of the library:
```
pip install --upgrade transformers accelerate
```
Then, run inference with the following code-snippet:
```python
from transformers import VitsModel, AutoTokenizer
import torch
model = VitsModel.from_pretrained("facebook/mms-tts-eng")
tokenizer = AutoTokenizer.from_pretrained("facebook/mms-tts-eng")
text = "some example text in the English language"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
output = model(**inputs).waveform
```
The resulting waveform can be saved as a `.wav` file:
```python
import scipy
scipy.io.wavfile.write("techno.wav", rate=model.config.sampling_rate, data=output.float().numpy())
```
Or displayed in a Jupyter Notebook / Google Colab:
```python
from IPython.display import Audio
Audio(output.numpy(), rate=model.config.sampling_rate)
```
## BibTex citation
This model was developed by Vineel Pratap et al. from Meta AI. If you use the model, consider citing the MMS paper:
```
@article{pratap2023mms,
title={Scaling Speech Technology to 1,000+ Languages},
author={Vineel Pratap and Andros Tjandra and Bowen Shi and Paden Tomasello and Arun Babu and Sayani Kundu and Ali Elkahky and Zhaoheng Ni and Apoorv Vyas and Maryam Fazel-Zarandi and Alexei Baevski and Yossi Adi and Xiaohui Zhang and Wei-Ning Hsu and Alexis Conneau and Michael Auli},
journal={arXiv},
year={2023}
}
```
## License
The model is licensed as **CC-BY-NC 4.0**.
| 3,996 | [
[
-0.02374267578125,
-0.05828857421875,
0.01502227783203125,
0.030853271484375,
-0.00531768798828125,
-0.00562286376953125,
-0.0215606689453125,
-0.0204315185546875,
0.026519775390625,
0.01419830322265625,
-0.05865478515625,
-0.0357666015625,
-0.041839599609375,
0.0029964447021484375,
-0.03582763671875,
0.0703125,
0.0223236083984375,
0.00711822509765625,
0.01277923583984375,
0.007144927978515625,
-0.01200103759765625,
-0.02899169921875,
-0.06671142578125,
-0.01531982421875,
0.0272064208984375,
0.010894775390625,
0.036590576171875,
0.04345703125,
0.03240966796875,
0.02581787109375,
-0.0343017578125,
-0.000018835067749023438,
-0.02178955078125,
-0.006130218505859375,
-0.0015344619750976562,
-0.020477294921875,
-0.040496826171875,
0.004550933837890625,
0.05926513671875,
0.034576416015625,
-0.0234375,
0.0209197998046875,
-0.00464630126953125,
0.02227783203125,
-0.021453857421875,
0.01134490966796875,
-0.0418701171875,
-0.005153656005859375,
-0.00957489013671875,
-0.014923095703125,
-0.0343017578125,
-0.0052490234375,
0.019500732421875,
-0.039703369140625,
0.0137481689453125,
-0.0278778076171875,
0.0733642578125,
0.013427734375,
-0.034210205078125,
-0.021484375,
-0.06524658203125,
0.04693603515625,
-0.064208984375,
0.04486083984375,
0.02496337890625,
0.0299072265625,
0.0031528472900390625,
-0.07049560546875,
-0.0465087890625,
-0.0269012451171875,
0.009124755859375,
0.0249786376953125,
-0.025299072265625,
0.0030765533447265625,
0.018890380859375,
0.0283355712890625,
-0.049835205078125,
0.0006513595581054688,
-0.05645751953125,
-0.031768798828125,
0.044921875,
-0.006427764892578125,
0.0133819580078125,
-0.04473876953125,
-0.01922607421875,
-0.01654052734375,
-0.0194854736328125,
0.0167694091796875,
0.021148681640625,
0.0347900390625,
-0.050537109375,
0.041290283203125,
-0.0074920654296875,
0.048248291015625,
0.003314971923828125,
-0.0266571044921875,
0.0543212890625,
-0.0222625732421875,
-0.019073486328125,
-0.01473236083984375,
0.088134765625,
0.01009368896484375,
0.019287109375,
0.00010341405868530273,
0.0014963150024414062,
0.017120361328125,
-0.0048980712890625,
-0.0648193359375,
0.0010547637939453125,
0.01483154296875,
-0.0232696533203125,
-0.01020050048828125,
-0.006855010986328125,
-0.040802001953125,
0.0098876953125,
-0.0193939208984375,
0.0408935546875,
-0.047088623046875,
-0.031829833984375,
0.00036406517028808594,
-0.01983642578125,
0.002010345458984375,
-0.01227569580078125,
-0.06854248046875,
-0.00911712646484375,
0.01242828369140625,
0.076171875,
0.00844573974609375,
-0.033905029296875,
-0.031585693359375,
0.0209197998046875,
-0.0226287841796875,
0.03277587890625,
-0.019989013671875,
-0.047821044921875,
-0.0069732666015625,
0.00666046142578125,
-0.0128021240234375,
-0.023712158203125,
0.0518798828125,
-0.01201629638671875,
0.023681640625,
-0.0119171142578125,
-0.03900146484375,
-0.006744384765625,
-0.0165252685546875,
-0.037445068359375,
0.0804443359375,
0.003398895263671875,
-0.052276611328125,
0.0164337158203125,
-0.059814453125,
-0.03289794921875,
-0.0163726806640625,
-0.0015392303466796875,
-0.034210205078125,
-0.0025272369384765625,
0.0193939208984375,
0.0423583984375,
-0.0148162841796875,
0.03363037109375,
-0.024078369140625,
-0.0214385986328125,
0.0128631591796875,
-0.054290771484375,
0.08135986328125,
0.043701171875,
-0.0254669189453125,
0.012176513671875,
-0.06549072265625,
-0.0140380859375,
0.005359649658203125,
-0.02593994140625,
0.010894775390625,
0.00948333740234375,
0.0159454345703125,
0.032989501953125,
0.003963470458984375,
-0.041107177734375,
-0.0078887939453125,
-0.039337158203125,
0.06793212890625,
0.04254150390625,
-0.00006687641143798828,
0.0256195068359375,
-0.0259246826171875,
0.0316162109375,
0.0006771087646484375,
0.0183563232421875,
-0.032318115234375,
-0.035186767578125,
-0.048370361328125,
-0.041778564453125,
0.01215362548828125,
0.0465087890625,
-0.054473876953125,
0.0280609130859375,
-0.03582763671875,
-0.05987548828125,
-0.054840087890625,
-0.0199127197265625,
0.0188751220703125,
0.040557861328125,
0.03741455078125,
-0.0026454925537109375,
-0.049102783203125,
-0.0660400390625,
-0.0013475418090820312,
-0.0306549072265625,
-0.019500732421875,
0.0205535888671875,
0.029327392578125,
-0.033294677734375,
0.07525634765625,
-0.0133209228515625,
-0.0143585205078125,
-0.0038433074951171875,
0.01690673828125,
0.01256561279296875,
0.03961181640625,
0.04949951171875,
-0.054443359375,
-0.031585693359375,
-0.0225677490234375,
-0.0509033203125,
-0.022430419921875,
0.0011749267578125,
0.0045623779296875,
0.00870513916015625,
0.048858642578125,
-0.05096435546875,
0.021240234375,
0.05792236328125,
-0.0273590087890625,
0.04180908203125,
0.0025081634521484375,
0.011810302734375,
-0.10968017578125,
0.005573272705078125,
0.00914764404296875,
-0.0246429443359375,
-0.042877197265625,
-0.0196685791015625,
-0.006168365478515625,
-0.0152740478515625,
-0.046417236328125,
0.037872314453125,
-0.02142333984375,
-0.0018672943115234375,
-0.0158233642578125,
0.01047515869140625,
-0.01265716552734375,
0.038360595703125,
0.0011014938354492188,
0.0654296875,
0.0596923828125,
-0.04901123046875,
0.0318603515625,
0.01512908935546875,
-0.00659942626953125,
0.048065185546875,
-0.0611572265625,
-0.00437164306640625,
-0.0010099411010742188,
0.030181884765625,
-0.07366943359375,
-0.0028743743896484375,
0.00508880615234375,
-0.076904296875,
0.0219573974609375,
-0.0127716064453125,
-0.040191650390625,
-0.045684814453125,
0.003528594970703125,
0.011688232421875,
0.0406494140625,
-0.031341552734375,
0.05560302734375,
0.044891357421875,
-0.0115509033203125,
-0.038299560546875,
-0.070068359375,
0.0003845691680908203,
-0.0236053466796875,
-0.06109619140625,
0.036163330078125,
-0.0156402587890625,
0.0115203857421875,
0.0011110305786132812,
0.0036220550537109375,
-0.0025539398193359375,
-0.00911712646484375,
0.022735595703125,
0.01812744140625,
-0.01114654541015625,
0.009063720703125,
0.006076812744140625,
-0.01293182373046875,
0.006473541259765625,
-0.0310821533203125,
0.0450439453125,
-0.0116729736328125,
-0.016815185546875,
-0.059661865234375,
0.026885986328125,
0.052459716796875,
-0.018035888671875,
0.04559326171875,
0.078857421875,
-0.0306396484375,
-0.0020656585693359375,
-0.040374755859375,
-0.01535797119140625,
-0.040130615234375,
0.0443115234375,
-0.030059814453125,
-0.06964111328125,
0.04937744140625,
0.01549530029296875,
0.00571441650390625,
0.06134033203125,
0.054534912109375,
-0.01012420654296875,
0.07098388671875,
0.050872802734375,
-0.02166748046875,
0.056060791015625,
-0.037933349609375,
-0.007427215576171875,
-0.047393798828125,
-0.01331329345703125,
-0.04010009765625,
0.0050048828125,
-0.05682373046875,
-0.0380859375,
0.03131103515625,
-0.008056640625,
-0.023590087890625,
0.036468505859375,
-0.037750244140625,
0.0007567405700683594,
0.042694091796875,
-0.00795745849609375,
0.003910064697265625,
0.01605224609375,
-0.020751953125,
-0.003200531005859375,
-0.05108642578125,
-0.0283966064453125,
0.07904052734375,
0.0347900390625,
0.034698486328125,
0.004825592041015625,
0.037445068359375,
0.015045166015625,
0.019012451171875,
-0.040283203125,
0.03668212890625,
-0.01763916015625,
-0.06915283203125,
-0.0261383056640625,
-0.04693603515625,
-0.0628662109375,
0.01611328125,
-0.0171661376953125,
-0.06396484375,
0.00537109375,
0.0016584396362304688,
-0.021270751953125,
0.02142333984375,
-0.0601806640625,
0.051727294921875,
0.0158538818359375,
-0.004192352294921875,
-0.012359619140625,
-0.049896240234375,
0.01401519775390625,
0.004016876220703125,
0.032196044921875,
-0.0084075927734375,
0.026519775390625,
0.07958984375,
-0.019256591796875,
0.060302734375,
-0.0133819580078125,
0.00020170211791992188,
0.040618896484375,
-0.0232391357421875,
0.0192413330078125,
-0.000006020069122314453,
-0.00411224365234375,
0.029510498046875,
0.004215240478515625,
-0.01824951171875,
-0.0208892822265625,
0.04083251953125,
-0.0616455078125,
-0.019500732421875,
-0.0159912109375,
-0.038665771484375,
-0.0119781494140625,
0.01177215576171875,
0.05584716796875,
0.03558349609375,
-0.01189422607421875,
0.0182342529296875,
0.03662109375,
-0.023468017578125,
0.058135986328125,
0.044891357421875,
-0.021514892578125,
-0.04632568359375,
0.06170654296875,
0.0235443115234375,
0.0313720703125,
0.0192108154296875,
0.01297760009765625,
-0.03057861328125,
-0.0199737548828125,
-0.041900634765625,
0.034912109375,
-0.04901123046875,
-0.0022602081298828125,
-0.05865478515625,
-0.04205322265625,
-0.05206298828125,
-0.0034961700439453125,
-0.04583740234375,
-0.02862548828125,
-0.034576416015625,
-0.019561767578125,
0.031646728515625,
0.0247650146484375,
-0.0254364013671875,
0.045166015625,
-0.0517578125,
0.036956787109375,
0.0174560546875,
0.01468658447265625,
-0.0104522705078125,
-0.08135986328125,
-0.0307159423828125,
0.01593017578125,
-0.0246429443359375,
-0.081298828125,
0.036834716796875,
0.0113372802734375,
0.043487548828125,
0.0244140625,
-0.0218963623046875,
0.055419921875,
-0.04559326171875,
0.06829833984375,
0.024627685546875,
-0.0858154296875,
0.041015625,
-0.04229736328125,
0.0210723876953125,
0.0131378173828125,
0.0210723876953125,
-0.054931640625,
-0.037872314453125,
-0.052276611328125,
-0.0716552734375,
0.0521240234375,
0.0357666015625,
0.02166748046875,
-0.00604248046875,
0.01309967041015625,
-0.027740478515625,
0.0098876953125,
-0.07550048828125,
-0.044677734375,
-0.025054931640625,
-0.0219573974609375,
-0.0272674560546875,
-0.0172576904296875,
0.004657745361328125,
-0.0261383056640625,
0.06671142578125,
0.011871337890625,
0.0419921875,
0.023468017578125,
0.00014960765838623047,
-0.00936126708984375,
0.01593017578125,
0.03826904296875,
0.0311431884765625,
-0.0161895751953125,
-0.01427459716796875,
0.0118560791015625,
-0.041900634765625,
0.01222991943359375,
0.03265380859375,
-0.017059326171875,
0.0293121337890625,
0.0176239013671875,
0.0819091796875,
-0.0015506744384765625,
-0.03204345703125,
0.041290283203125,
0.004207611083984375,
-0.022369384765625,
-0.039337158203125,
-0.01523590087890625,
0.0362548828125,
0.0259857177734375,
0.04278564453125,
-0.007778167724609375,
0.0052642822265625,
-0.02520751953125,
0.02276611328125,
0.02899169921875,
-0.0322265625,
-0.0297393798828125,
0.071533203125,
0.01419830322265625,
-0.030303955078125,
0.019500732421875,
-0.021514892578125,
-0.0242156982421875,
0.043701171875,
0.037933349609375,
0.072509765625,
-0.054840087890625,
0.0163421630859375,
0.04022216796875,
0.038604736328125,
0.0015316009521484375,
0.0287322998046875,
0.00788116455078125,
-0.032867431640625,
-0.046417236328125,
-0.055267333984375,
-0.010345458984375,
0.0184783935546875,
-0.04302978515625,
0.036834716796875,
-0.01058197021484375,
-0.0401611328125,
0.0171966552734375,
-0.01557159423828125,
-0.043487548828125,
0.038360595703125,
0.0214385986328125,
0.05810546875,
-0.07958984375,
0.064453125,
0.0276336669921875,
-0.0380859375,
-0.07769775390625,
-0.01025390625,
0.0033283233642578125,
-0.036407470703125,
0.03497314453125,
0.00943756103515625,
-0.01580810546875,
0.0150299072265625,
-0.034332275390625,
-0.0870361328125,
0.0814208984375,
0.037689208984375,
-0.03656005859375,
-0.012237548828125,
0.0028591156005859375,
0.04144287109375,
-0.0187530517578125,
0.0283355712890625,
0.027496337890625,
0.0161590576171875,
0.01242828369140625,
-0.0968017578125,
-0.0169219970703125,
-0.025634765625,
-0.0004525184631347656,
-0.01378631591796875,
-0.050048828125,
0.0780029296875,
-0.0133514404296875,
-0.0139923095703125,
-0.0085296630859375,
0.07000732421875,
0.0231170654296875,
0.025421142578125,
0.037109375,
0.042022705078125,
0.04718017578125,
0.00003057718276977539,
0.062286376953125,
-0.0206146240234375,
0.03668212890625,
0.0660400390625,
0.0191497802734375,
0.0618896484375,
0.0192108154296875,
-0.02783203125,
0.03759765625,
0.048095703125,
0.003612518310546875,
0.03546142578125,
-0.0003266334533691406,
-0.00484466552734375,
0.0042877197265625,
-0.0009551048278808594,
-0.04949951171875,
0.045257568359375,
0.03131103515625,
-0.039825439453125,
0.00567626953125,
0.01904296875,
0.0085601806640625,
-0.0213623046875,
-0.01116180419921875,
0.03759765625,
0.00955963134765625,
-0.0340576171875,
0.056732177734375,
0.00995635986328125,
0.06610107421875,
-0.059112548828125,
0.0264739990234375,
0.004192352294921875,
-0.00881195068359375,
-0.01093292236328125,
-0.0287322998046875,
0.0289154052734375,
0.01010894775390625,
-0.0265655517578125,
-0.0032196044921875,
0.021942138671875,
-0.042877197265625,
-0.046539306640625,
0.03997802734375,
0.0300445556640625,
0.012359619140625,
0.00028705596923828125,
-0.05511474609375,
-0.0009107589721679688,
0.0012598037719726562,
-0.03326416015625,
-0.004764556884765625,
0.0311126708984375,
0.0174407958984375,
0.054779052734375,
0.05841064453125,
0.0168914794921875,
0.0225677490234375,
0.012969970703125,
0.052398681640625,
-0.044677734375,
-0.0701904296875,
-0.06378173828125,
0.0496826171875,
0.0005803108215332031,
-0.0159759521484375,
0.06170654296875,
0.046844482421875,
0.061370849609375,
0.005615234375,
0.064453125,
-0.0018482208251953125,
0.058441162109375,
-0.032745361328125,
0.0545654296875,
-0.05072021484375,
0.01421356201171875,
-0.049957275390625,
-0.052459716796875,
0.0029735565185546875,
0.0531005859375,
-0.0082244873046875,
0.02337646484375,
0.041168212890625,
0.06414794921875,
0.0013370513916015625,
-0.0037555694580078125,
0.018096923828125,
0.026214599609375,
0.03173828125,
0.031890869140625,
0.04302978515625,
-0.037322998046875,
0.062286376953125,
-0.022735595703125,
-0.021453857421875,
-0.007198333740234375,
-0.056976318359375,
-0.05511474609375,
-0.06671142578125,
-0.0253753662109375,
-0.0396728515625,
-0.006809234619140625,
0.06109619140625,
0.070556640625,
-0.040496826171875,
-0.029510498046875,
0.0036029815673828125,
-0.015411376953125,
-0.0201873779296875,
-0.019744873046875,
0.033203125,
-0.00872802734375,
-0.072509765625,
0.0394287109375,
0.01523590087890625,
0.02587890625,
-0.0137786865234375,
-0.0020961761474609375,
-0.01690673828125,
0.0208587646484375,
0.0380859375,
0.0184173583984375,
-0.0523681640625,
0.0003790855407714844,
0.0096588134765625,
-0.021759033203125,
-0.0008864402770996094,
0.03594970703125,
-0.04296875,
0.03765869140625,
0.03228759765625,
0.0255126953125,
0.062744140625,
-0.0227508544921875,
0.0238037109375,
-0.053314208984375,
0.0279083251953125,
0.00910186767578125,
0.0350341796875,
0.03546142578125,
-0.006946563720703125,
0.019317626953125,
0.040130615234375,
-0.0435791015625,
-0.06475830078125,
0.0120849609375,
-0.08538818359375,
-0.0210723876953125,
0.10791015625,
-0.0010194778442382812,
-0.003917694091796875,
0.00800323486328125,
-0.00537872314453125,
0.060821533203125,
-0.02960205078125,
0.04583740234375,
0.048431396484375,
0.018890380859375,
-0.0019283294677734375,
-0.054901123046875,
0.038116455078125,
0.024932861328125,
-0.0322265625,
-0.006664276123046875,
0.039764404296875,
0.030548095703125,
0.0216522216796875,
0.07269287109375,
-0.0085601806640625,
0.0172576904296875,
0.00617218017578125,
0.0106964111328125,
-0.0035533905029296875,
-0.019256591796875,
-0.0281524658203125,
0.0037479400634765625,
-0.0283050537109375,
-0.0263824462890625
]
] |
nvidia/segformer-b3-finetuned-cityscapes-1024-1024 | 2022-08-09T11:32:45.000Z | [
"transformers",
"pytorch",
"tf",
"segformer",
"vision",
"image-segmentation",
"dataset:cityscapes",
"arxiv:2105.15203",
"license:other",
"endpoints_compatible",
"region:us"
] | image-segmentation | nvidia | null | null | nvidia/segformer-b3-finetuned-cityscapes-1024-1024 | 2 | 13,206 | transformers | 2022-03-02T23:29:05 | ---
license: other
tags:
- vision
- image-segmentation
datasets:
- cityscapes
widget:
- src: https://cdn-media.huggingface.co/Inference-API/Sample-results-on-the-Cityscapes-dataset-The-above-images-show-how-our-method-can-handle.png
example_title: Road
---
# SegFormer (b3-sized) model fine-tuned on Cityscapes
SegFormer model fine-tuned on Cityscapes at resolution 1024x1024. It was introduced in the paper [SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers](https://arxiv.org/abs/2105.15203) by Xie et al. and first released in [this repository](https://github.com/NVlabs/SegFormer).
Disclaimer: The team releasing SegFormer did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
SegFormer consists of a hierarchical Transformer encoder and a lightweight all-MLP decode head to achieve great results on semantic segmentation benchmarks such as ADE20K and Cityscapes. The hierarchical Transformer is first pre-trained on ImageNet-1k, after which a decode head is added and the whole model is fine-tuned on a downstream dataset.
## Intended uses & limitations
You can use the raw model for semantic segmentation. See the [model hub](https://huggingface.co/models?other=segformer) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model to perform semantic segmentation on an image from the COCO 2017 dataset:
```python
from transformers import SegformerFeatureExtractor, SegformerForSemanticSegmentation
from PIL import Image
import requests
feature_extractor = SegformerFeatureExtractor.from_pretrained("nvidia/segformer-b3-finetuned-cityscapes-1024-1024")
model = SegformerForSemanticSegmentation.from_pretrained("nvidia/segformer-b3-finetuned-cityscapes-1024-1024")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = feature_extractor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits # shape (batch_size, num_labels, height/4, width/4)
```
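The logits are returned at a quarter of the input resolution. As a rough sketch (not part of the original card), you might upsample them back to the input size and take a per-pixel argmax to obtain a segmentation map — the dummy tensor below stands in for `outputs.logits`, and the 19-class count matches Cityscapes:

```python
import torch
import torch.nn.functional as F

# Dummy logits standing in for outputs.logits:
# shape (batch_size, num_labels, height/4, width/4).
# SegFormer models fine-tuned on Cityscapes predict 19 classes.
logits = torch.randn(1, 19, 256, 256)

# Upsample to the (assumed) original input resolution of 1024x1024,
# then take the argmax over the class dimension to get per-pixel labels.
upsampled = F.interpolate(
    logits, size=(1024, 1024), mode="bilinear", align_corners=False
)
segmentation_map = upsampled.argmax(dim=1)  # shape (batch_size, 1024, 1024)
print(segmentation_map.shape)
```

The same postprocessing applies to real model outputs; only the target `size` changes with your input image.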
For more code examples, we refer to the [documentation](https://huggingface.co/transformers/model_doc/segformer.html).
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2105-15203,
author = {Enze Xie and
Wenhai Wang and
Zhiding Yu and
Anima Anandkumar and
Jose M. Alvarez and
Ping Luo},
title = {SegFormer: Simple and Efficient Design for Semantic Segmentation with
Transformers},
journal = {CoRR},
volume = {abs/2105.15203},
year = {2021},
url = {https://arxiv.org/abs/2105.15203},
eprinttype = {arXiv},
eprint = {2105.15203},
timestamp = {Wed, 02 Jun 2021 11:46:42 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2105-15203.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
| 3,017 | [
[
-0.06591796875,
-0.05401611328125,
0.017791748046875,
0.020416259765625,
-0.0209503173828125,
-0.0272064208984375,
0.00041675567626953125,
-0.05120849609375,
0.019134521484375,
0.043731689453125,
-0.059844970703125,
-0.043609619140625,
-0.049407958984375,
0.01168060302734375,
-0.02630615234375,
0.05853271484375,
0.003772735595703125,
-0.00897979736328125,
-0.026611328125,
-0.024261474609375,
-0.004306793212890625,
-0.0244293212890625,
-0.0465087890625,
-0.026611328125,
0.0283966064453125,
0.0208587646484375,
0.048736572265625,
0.0628662109375,
0.0545654296875,
0.033447265625,
-0.03271484375,
0.00970458984375,
-0.0242767333984375,
-0.0216064453125,
0.00044727325439453125,
-0.004222869873046875,
-0.025421142578125,
-0.00150299072265625,
0.0300140380859375,
0.049468994140625,
0.00555419921875,
0.0239105224609375,
-0.0010576248168945312,
0.035980224609375,
-0.031890869140625,
0.01027679443359375,
-0.03277587890625,
0.01275634765625,
-0.001220703125,
-0.0025310516357421875,
-0.0169677734375,
-0.016693115234375,
0.018890380859375,
-0.035919189453125,
0.058746337890625,
0.00650787353515625,
0.11309814453125,
0.030364990234375,
-0.0240936279296875,
-0.0026531219482421875,
-0.038360595703125,
0.059844970703125,
-0.045684814453125,
0.037322998046875,
-0.004245758056640625,
0.027435302734375,
0.01013946533203125,
-0.0697021484375,
-0.035797119140625,
0.01529693603515625,
-0.0205078125,
-0.007297515869140625,
-0.031219482421875,
0.002719879150390625,
0.037994384765625,
0.042999267578125,
-0.036376953125,
0.0037403106689453125,
-0.05841064453125,
-0.03033447265625,
0.051025390625,
0.00659942626953125,
0.01459503173828125,
-0.03271484375,
-0.0570068359375,
-0.03253173828125,
-0.029205322265625,
0.01177978515625,
0.01047515869140625,
0.004764556884765625,
-0.0243377685546875,
0.0311737060546875,
-0.00463104248046875,
0.056427001953125,
0.034515380859375,
-0.01143646240234375,
0.038787841796875,
-0.01253509521484375,
-0.027252197265625,
0.0004954338073730469,
0.0693359375,
0.034027099609375,
-0.0031337738037109375,
0.0034637451171875,
-0.00495147705078125,
0.0016946792602539062,
0.0226898193359375,
-0.0992431640625,
-0.0207977294921875,
0.00904083251953125,
-0.0400390625,
-0.024688720703125,
0.010040283203125,
-0.058929443359375,
-0.00543975830078125,
-0.0113677978515625,
0.032470703125,
-0.0229644775390625,
-0.0038909912109375,
0.00780487060546875,
-0.00861358642578125,
0.05731201171875,
0.0199127197265625,
-0.068603515625,
0.01300048828125,
0.039093017578125,
0.06341552734375,
-0.017791748046875,
-0.01229095458984375,
-0.0069122314453125,
-0.0029010772705078125,
-0.0167083740234375,
0.069580078125,
-0.0202789306640625,
-0.0280609130859375,
-0.0232086181640625,
0.04278564453125,
-0.01219940185546875,
-0.051300048828125,
0.058319091796875,
-0.034332275390625,
0.01451873779296875,
-0.0049285888671875,
-0.022064208984375,
-0.0428466796875,
0.0251617431640625,
-0.050048828125,
0.07586669921875,
0.0196533203125,
-0.06024169921875,
0.033843994140625,
-0.047454833984375,
-0.0207366943359375,
-0.0026531219482421875,
0.003482818603515625,
-0.06280517578125,
0.006488800048828125,
0.04022216796875,
0.039642333984375,
-0.0170745849609375,
0.01556396484375,
-0.040496826171875,
-0.01357269287109375,
-0.001178741455078125,
-0.01324462890625,
0.07220458984375,
0.0243682861328125,
-0.0227203369140625,
0.0298309326171875,
-0.049774169921875,
-0.0029773712158203125,
0.0308074951171875,
0.001617431640625,
-0.0007295608520507812,
-0.0199737548828125,
0.018341064453125,
0.0287628173828125,
0.018035888671875,
-0.051544189453125,
0.0005679130554199219,
-0.0266265869140625,
0.03277587890625,
0.047027587890625,
0.006763458251953125,
0.039337158203125,
-0.01181793212890625,
0.0205230712890625,
0.01273345947265625,
0.03436279296875,
-0.0091094970703125,
-0.0166168212890625,
-0.082275390625,
-0.033172607421875,
0.01323699951171875,
0.006633758544921875,
-0.035552978515625,
0.044219970703125,
-0.02032470703125,
-0.0545654296875,
-0.036041259765625,
-0.004474639892578125,
0.0005154609680175781,
0.03643798828125,
0.039703369140625,
-0.0277862548828125,
-0.058990478515625,
-0.09100341796875,
0.01282501220703125,
0.0181732177734375,
-0.004001617431640625,
0.0271148681640625,
0.044219970703125,
-0.049072265625,
0.06890869140625,
-0.0577392578125,
-0.022552490234375,
-0.01544952392578125,
-0.00606536865234375,
0.0272369384765625,
0.04656982421875,
0.052581787109375,
-0.062744140625,
-0.0245819091796875,
-0.0190582275390625,
-0.043731689453125,
-0.0060882568359375,
0.01044464111328125,
-0.027496337890625,
0.0104522705078125,
0.0301513671875,
-0.04229736328125,
0.03155517578125,
0.03900146484375,
-0.040435791015625,
0.0228424072265625,
-0.003177642822265625,
-0.0014476776123046875,
-0.07635498046875,
0.013336181640625,
0.0157623291015625,
-0.023193359375,
-0.044219970703125,
0.01557159423828125,
-0.0014562606811523438,
-0.0174713134765625,
-0.045013427734375,
0.0413818359375,
-0.0205078125,
-0.0054473876953125,
-0.0190277099609375,
-0.009857177734375,
0.012298583984375,
0.059783935546875,
0.017425537109375,
0.0237884521484375,
0.0330810546875,
-0.053497314453125,
0.0152130126953125,
0.041717529296875,
-0.033294677734375,
0.0369873046875,
-0.07501220703125,
0.00933837890625,
-0.005207061767578125,
0.012115478515625,
-0.0526123046875,
-0.026214599609375,
0.0272064208984375,
-0.0210113525390625,
0.0213165283203125,
-0.01861572265625,
-0.0241546630859375,
-0.0482177734375,
-0.01535797119140625,
0.034332275390625,
0.033599853515625,
-0.065185546875,
0.046905517578125,
0.03570556640625,
0.011077880859375,
-0.01496124267578125,
-0.050994873046875,
-0.024169921875,
-0.0280914306640625,
-0.078369140625,
0.047210693359375,
-0.005504608154296875,
0.01361846923828125,
-0.0009756088256835938,
-0.023101806640625,
-0.004543304443359375,
-0.0020503997802734375,
0.029296875,
0.0362548828125,
-0.0022487640380859375,
-0.033233642578125,
-0.004058837890625,
-0.030792236328125,
0.0107879638671875,
-0.007564544677734375,
0.051300048828125,
-0.02691650390625,
-0.024322509765625,
-0.0208587646484375,
0.0007882118225097656,
0.0374755859375,
-0.020538330078125,
0.0293121337890625,
0.08953857421875,
-0.0259246826171875,
-0.00019299983978271484,
-0.04193115234375,
-0.0149078369140625,
-0.042388916015625,
0.0247039794921875,
-0.018402099609375,
-0.08416748046875,
0.03900146484375,
0.00627899169921875,
0.002635955810546875,
0.07391357421875,
0.0401611328125,
0.007015228271484375,
0.09320068359375,
0.048004150390625,
0.0268402099609375,
0.037078857421875,
-0.06085205078125,
0.01557159423828125,
-0.07574462890625,
-0.04510498046875,
-0.037567138671875,
-0.0299072265625,
-0.05914306640625,
-0.056121826171875,
0.03302001953125,
0.00926971435546875,
-0.03179931640625,
0.04632568359375,
-0.0645751953125,
0.020782470703125,
0.040191650390625,
0.007526397705078125,
-0.0121307373046875,
0.007152557373046875,
-0.00730133056640625,
0.005115509033203125,
-0.055572509765625,
-0.023040771484375,
0.0229339599609375,
0.04296875,
0.057159423828125,
-0.007843017578125,
0.047119140625,
-0.005199432373046875,
-0.0009741783142089844,
-0.069091796875,
0.043853759765625,
-0.00659942626953125,
-0.057891845703125,
-0.00730133056640625,
-0.0250396728515625,
-0.0740966796875,
0.0294647216796875,
-0.0133056640625,
-0.06524658203125,
0.0523681640625,
0.006923675537109375,
-0.0242462158203125,
0.0228424072265625,
-0.043853759765625,
0.0870361328125,
-0.0161590576171875,
-0.028778076171875,
0.005260467529296875,
-0.047454833984375,
0.0151824951171875,
0.0143585205078125,
-0.0036182403564453125,
-0.031219482421875,
0.0257415771484375,
0.06707763671875,
-0.048187255859375,
0.051055908203125,
-0.0236968994140625,
0.0167694091796875,
0.040374755859375,
-0.0045318603515625,
0.02520751953125,
-0.0016765594482421875,
0.0203399658203125,
0.043914794921875,
0.021759033203125,
-0.0295867919921875,
-0.033294677734375,
0.04693603515625,
-0.0667724609375,
-0.043701171875,
-0.031402587890625,
-0.026458740234375,
0.0006060600280761719,
0.0261383056640625,
0.035247802734375,
0.03564453125,
-0.006866455078125,
0.035919189453125,
0.04180908203125,
-0.023193359375,
0.04327392578125,
0.01540374755859375,
-0.0169830322265625,
-0.0360107421875,
0.0667724609375,
-0.014862060546875,
-0.0016613006591796875,
0.02374267578125,
0.0255279541015625,
-0.03466796875,
-0.0218505859375,
-0.029998779296875,
0.0176849365234375,
-0.04669189453125,
-0.034759521484375,
-0.056488037109375,
-0.040924072265625,
-0.032989501953125,
-0.02728271484375,
-0.0311737060546875,
-0.021331787109375,
-0.031646728515625,
0.00044727325439453125,
0.024169921875,
0.03570556640625,
-0.0199737548828125,
0.0240020751953125,
-0.053436279296875,
0.0179443359375,
0.027679443359375,
0.024200439453125,
-0.002437591552734375,
-0.040924072265625,
-0.016937255859375,
-0.000591278076171875,
-0.039703369140625,
-0.039093017578125,
0.047821044921875,
-0.0011739730834960938,
0.03680419921875,
0.045257568359375,
-0.01041412353515625,
0.0771484375,
-0.0167694091796875,
0.04888916015625,
0.0300445556640625,
-0.0625,
0.029388427734375,
-0.016357421875,
0.043060302734375,
0.034027099609375,
0.0209503173828125,
-0.042694091796875,
0.0032176971435546875,
-0.06182861328125,
-0.0772705078125,
0.0726318359375,
0.00969696044921875,
-0.0006279945373535156,
0.00281524658203125,
0.002178192138671875,
-0.0005860328674316406,
-0.0052032470703125,
-0.047576904296875,
-0.022552490234375,
-0.0279083251953125,
-0.011871337890625,
-0.005596160888671875,
-0.032958984375,
-0.0016307830810546875,
-0.0440673828125,
0.04962158203125,
-0.0109405517578125,
0.050994873046875,
0.023895263671875,
-0.0200958251953125,
0.0003190040588378906,
-0.0045318603515625,
0.033447265625,
0.023193359375,
-0.02337646484375,
0.00830841064453125,
0.0196380615234375,
-0.0282745361328125,
-0.00907135009765625,
0.0274658203125,
-0.02215576171875,
-0.00504302978515625,
0.025665283203125,
0.08013916015625,
0.02520751953125,
-0.01959228515625,
0.04327392578125,
0.002544403076171875,
-0.03857421875,
-0.030181884765625,
0.0159912109375,
-0.005443572998046875,
0.035430908203125,
0.0218658447265625,
0.033050537109375,
0.024139404296875,
-0.00551605224609375,
0.0156707763671875,
0.0217437744140625,
-0.05487060546875,
-0.0242767333984375,
0.054229736328125,
0.01183319091796875,
-0.002933502197265625,
0.054290771484375,
-0.01129150390625,
-0.05255126953125,
0.0626220703125,
0.041015625,
0.07513427734375,
0.0025234222412109375,
0.0143585205078125,
0.061798095703125,
0.01175689697265625,
0.00839996337890625,
-0.00885772705078125,
-0.0122222900390625,
-0.061187744140625,
-0.029052734375,
-0.0745849609375,
-0.005298614501953125,
0.0093994140625,
-0.05279541015625,
0.039520263671875,
-0.0360107421875,
-0.01044464111328125,
0.01256561279296875,
0.00943756103515625,
-0.08038330078125,
0.0187530517578125,
0.0218048095703125,
0.0828857421875,
-0.04547119140625,
0.035369873046875,
0.05645751953125,
-0.01561737060546875,
-0.0614013671875,
-0.03314208984375,
-0.0056610107421875,
-0.06396484375,
0.030181884765625,
0.0391845703125,
0.00013184547424316406,
0.002635955810546875,
-0.05810546875,
-0.0816650390625,
0.10125732421875,
0.009765625,
-0.02520751953125,
-0.005329132080078125,
0.006702423095703125,
0.0281219482421875,
-0.035247802734375,
0.026397705078125,
0.036163330078125,
0.041839599609375,
0.0504150390625,
-0.0369873046875,
0.0075836181640625,
-0.0169219970703125,
0.017791748046875,
0.0218505859375,
-0.06439208984375,
0.0445556640625,
-0.02294921875,
-0.0188751220703125,
-0.001667022705078125,
0.052703857421875,
0.013641357421875,
0.0170135498046875,
0.0545654296875,
0.061370849609375,
0.0341796875,
-0.03155517578125,
0.06561279296875,
-0.01503753662109375,
0.058837890625,
0.055938720703125,
0.0212249755859375,
0.028533935546875,
0.027435302734375,
-0.006359100341796875,
0.039642333984375,
0.0728759765625,
-0.044921875,
0.03741455078125,
-0.00705718994140625,
0.016632080078125,
-0.0311737060546875,
-0.01363372802734375,
-0.0247039794921875,
0.06414794921875,
0.0165252685546875,
-0.038970947265625,
-0.018951416015625,
-0.01092529296875,
-0.001224517822265625,
-0.03643798828125,
-0.0195465087890625,
0.047698974609375,
0.004245758056640625,
-0.0266265869140625,
0.0455322265625,
0.0116729736328125,
0.04833984375,
-0.0308990478515625,
0.0026645660400390625,
-0.011383056640625,
0.0167388916015625,
-0.02911376953125,
-0.0401611328125,
0.0419921875,
-0.01236724853515625,
-0.00762176513671875,
-0.01261138916015625,
0.06793212890625,
-0.019775390625,
-0.05413818359375,
0.01143646240234375,
0.00739288330078125,
0.005126953125,
0.0097808837890625,
-0.06842041015625,
0.0301513671875,
0.00897979736328125,
-0.0247802734375,
-0.0038509368896484375,
0.01218414306640625,
0.0106353759765625,
0.03778076171875,
0.046539306640625,
-0.0240631103515625,
-0.004802703857421875,
-0.01044464111328125,
0.06829833984375,
-0.057220458984375,
-0.02850341796875,
-0.054473876953125,
0.04071044921875,
-0.0249786376953125,
-0.0217742919921875,
0.0570068359375,
0.057037353515625,
0.08184814453125,
-0.01444244384765625,
0.0225982666015625,
-0.03155517578125,
0.0151824951171875,
-0.0158538818359375,
0.035308837890625,
-0.04656982421875,
-0.0069122314453125,
-0.0312347412109375,
-0.082275390625,
-0.02813720703125,
0.0667724609375,
-0.030364990234375,
0.015899658203125,
0.03948974609375,
0.0712890625,
-0.0208282470703125,
-0.0025482177734375,
0.018951416015625,
0.00859832763671875,
0.0175018310546875,
0.025848388671875,
0.050048828125,
-0.036590576171875,
0.04193115234375,
-0.055572509765625,
0.0029888153076171875,
-0.035369873046875,
-0.047454833984375,
-0.0662841796875,
-0.047271728515625,
-0.031829833984375,
-0.02734375,
-0.031036376953125,
0.0653076171875,
0.08489990234375,
-0.063232421875,
-0.00522613525390625,
0.0027790069580078125,
0.01271820068359375,
-0.01021575927734375,
-0.0229339599609375,
0.035125732421875,
0.0017518997192382812,
-0.0731201171875,
-0.003154754638671875,
0.0225982666015625,
0.01041412353515625,
-0.010009765625,
-0.01373291015625,
0.0019197463989257812,
-0.01261138916015625,
0.0504150390625,
0.019012451171875,
-0.0443115234375,
-0.034027099609375,
0.01142120361328125,
-0.005512237548828125,
0.018829345703125,
0.045745849609375,
-0.044158935546875,
0.0270538330078125,
0.0411376953125,
0.0299224853515625,
0.0765380859375,
0.0047149658203125,
0.0124664306640625,
-0.0384521484375,
0.018157958984375,
0.009246826171875,
0.03887939453125,
0.03619384765625,
-0.014251708984375,
0.042327880859375,
0.0218353271484375,
-0.041595458984375,
-0.03826904296875,
0.007236480712890625,
-0.09454345703125,
-0.00963592529296875,
0.08026123046875,
0.00850677490234375,
-0.04693603515625,
0.022247314453125,
-0.01314544677734375,
0.0215301513671875,
-0.01507568359375,
0.042724609375,
0.01922607421875,
-0.01219940185546875,
-0.028411865234375,
-0.01049041748046875,
0.028411865234375,
-0.0038738250732421875,
-0.040924072265625,
-0.03857421875,
0.038909912109375,
0.032989501953125,
0.0288543701171875,
0.0177459716796875,
-0.0288848876953125,
0.00806427001953125,
0.007640838623046875,
0.0253143310546875,
-0.0161285400390625,
-0.0187225341796875,
-0.0165863037109375,
0.010223388671875,
-0.01062774658203125,
-0.01715087890625
]
] |
Envvi/Inkpunk-Diffusion | 2022-11-29T16:31:21.000Z | [
"diffusers",
"stable-diffusion",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Envvi | null | null | Envvi/Inkpunk-Diffusion | 916 | 13,189 | diffusers | 2022-11-25T06:06:18 | ---
license: creativeml-openrail-m
language:
- en
tags:
- stable-diffusion
- text-to-image
- diffusers
---
# Inkpunk Diffusion
Fine-tuned Stable Diffusion model trained with DreamBooth. Vaguely inspired by Gorillaz, FLCL, and Yoji Shinkawa. Use **_nvinkpunk_** in your prompts.
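Since the repository ships a `StableDiffusionPipeline`, a minimal diffusers sketch might look like the following (the prompt, dtype, and device here are illustrative assumptions, not from the original card):

```python
import torch
from diffusers import StableDiffusionPipeline

# Load the fine-tuned checkpoint; half precision keeps GPU memory usage down.
pipe = StableDiffusionPipeline.from_pretrained(
    "Envvi/Inkpunk-Diffusion", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")

# The trigger token "nvinkpunk" activates the fine-tuned style.
prompt = "nvinkpunk portrait of a robot musician, vibrant ink splashes"
image = pipe(prompt).images[0]
image.save("inkpunk_robot.png")
```

Note that generation requires downloading the model weights and a CUDA-capable GPU (or swap `.to("cuda")` for CPU at the cost of speed).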
# Gradio
We support a [Gradio](https://github.com/gradio-app/gradio) Web UI to run Inkpunk-Diffusion:
[](https://huggingface.co/spaces/akhaliq/Inkpunk-Diffusion)
# Sample images

 | 930 | [
[
-0.04144287109375,
-0.055999755859375,
0.050994873046875,
0.02008056640625,
-0.02581787109375,
0.01500701904296875,
0.01331329345703125,
-0.0164794921875,
0.053955078125,
0.027984619140625,
-0.04461669921875,
-0.033233642578125,
-0.0408935546875,
-0.030548095703125,
0.0057373046875,
0.06793212890625,
-0.00865936279296875,
0.007259368896484375,
-0.0045318603515625,
0.00795745849609375,
-0.0239410400390625,
0.0267486572265625,
-0.067626953125,
-0.0699462890625,
0.04156494140625,
0.039276123046875,
0.04840087890625,
0.01296234130859375,
0.007427215576171875,
0.0130615234375,
-0.0207061767578125,
-0.0193023681640625,
-0.0345458984375,
0.015411376953125,
-0.0022907257080078125,
-0.0037364959716796875,
-0.017669677734375,
-0.00014281272888183594,
0.031341552734375,
0.030975341796875,
-0.022186279296875,
0.0218353271484375,
-0.003498077392578125,
0.051727294921875,
-0.040435791015625,
-0.01027679443359375,
-0.0109710693359375,
0.01293182373046875,
-0.00902557373046875,
0.04827880859375,
0.0081939697265625,
-0.03662109375,
-0.013671875,
-0.0797119140625,
0.0246734619140625,
0.007579803466796875,
0.069091796875,
0.007137298583984375,
-0.007488250732421875,
0.016632080078125,
-0.032318115234375,
0.02264404296875,
-0.031005859375,
0.0160064697265625,
0.0301971435546875,
0.0394287109375,
-0.02557373046875,
-0.07366943359375,
-0.04693603515625,
0.0170135498046875,
0.0248260498046875,
0.0222015380859375,
-0.0007166862487792969,
0.0066680908203125,
0.01300811767578125,
0.015350341796875,
-0.0528564453125,
-0.0117950439453125,
-0.029052734375,
-0.013275146484375,
0.020965576171875,
0.0184173583984375,
0.025115966796875,
0.0221099853515625,
-0.032440185546875,
0.005992889404296875,
-0.0173797607421875,
0.00502777099609375,
0.040283203125,
-0.0014867782592773438,
-0.035858154296875,
0.024566650390625,
-0.01082611083984375,
0.0386962890625,
0.010162353515625,
-0.0012331008911132812,
0.036956787109375,
0.01508331298828125,
-0.013580322265625,
-0.024566650390625,
0.0838623046875,
0.0396728515625,
0.0005125999450683594,
-0.0016527175903320312,
-0.01149749755859375,
-0.01389312744140625,
0.01171112060546875,
-0.0843505859375,
-0.050445556640625,
0.0294952392578125,
-0.038543701171875,
-0.028564453125,
-0.0260467529296875,
-0.07537841796875,
-0.046356201171875,
0.01502227783203125,
0.05438232421875,
-0.034423828125,
-0.05401611328125,
0.023590087890625,
-0.06280517578125,
-0.0180206298828125,
0.044342041015625,
-0.0631103515625,
0.007778167724609375,
0.04180908203125,
0.08392333984375,
0.0052337646484375,
0.006702423095703125,
0.02496337890625,
0.01165771484375,
-0.01617431640625,
0.0704345703125,
-0.036651611328125,
-0.043548583984375,
-0.01165771484375,
0.0030498504638671875,
0.0022563934326171875,
-0.023345947265625,
0.048553466796875,
-0.0075531005859375,
0.0223541259765625,
-0.0035076141357421875,
-0.030242919921875,
-0.036956787109375,
0.0081329345703125,
-0.043121337890625,
0.0550537109375,
0.0250091552734375,
-0.0665283203125,
0.0026264190673828125,
-0.0899658203125,
-0.002429962158203125,
0.0264892578125,
-0.004207611083984375,
-0.047332763671875,
-0.00974273681640625,
-0.030792236328125,
0.043121337890625,
-0.0009403228759765625,
-0.00887298583984375,
-0.061859130859375,
-0.0345458984375,
-0.0037403106689453125,
-0.006488800048828125,
0.079833984375,
0.034027099609375,
-0.01372528076171875,
-0.0030078887939453125,
-0.06103515625,
-0.01451873779296875,
0.022247314453125,
0.009857177734375,
-0.03338623046875,
-0.04022216796875,
0.02459716796875,
0.0032558441162109375,
0.0232696533203125,
-0.0611572265625,
0.03839111328125,
0.01392364501953125,
0.0246734619140625,
0.0518798828125,
0.0205535888671875,
0.0252227783203125,
-0.0263214111328125,
0.0526123046875,
-0.006622314453125,
0.0158233642578125,
0.006195068359375,
-0.0745849609375,
-0.03143310546875,
-0.022186279296875,
0.004222869873046875,
0.0298919677734375,
-0.07696533203125,
0.000499725341796875,
0.017181396484375,
-0.0693359375,
-0.0023345947265625,
-0.0166473388671875,
0.003185272216796875,
0.0728759765625,
0.0029201507568359375,
-0.029754638671875,
-0.0019989013671875,
-0.048919677734375,
0.0130767822265625,
-0.007427215576171875,
-0.01389312744140625,
0.0102081298828125,
0.0260009765625,
-0.033599853515625,
0.0478515625,
-0.058380126953125,
0.0014543533325195312,
0.0076141357421875,
0.024200439453125,
0.049560546875,
0.04705810546875,
0.057586669921875,
-0.053558349609375,
-0.038177490234375,
-0.009521484375,
-0.022003173828125,
-0.0210418701171875,
0.01546478271484375,
-0.0272064208984375,
-0.030426025390625,
0.0010900497436523438,
-0.06890869140625,
0.02117919921875,
0.032928466796875,
-0.060028076171875,
0.05023193359375,
-0.00789642333984375,
0.020782470703125,
-0.0963134765625,
0.0171051025390625,
0.0220489501953125,
-0.032562255859375,
-0.043426513671875,
0.0132598876953125,
-0.01134490966796875,
-0.0292510986328125,
-0.058258056640625,
0.0640869140625,
-0.0196990966796875,
0.0390625,
-0.0200042724609375,
-0.02581787109375,
0.025787353515625,
0.031707763671875,
-0.004840850830078125,
0.045928955078125,
0.057769775390625,
-0.035125732421875,
0.033203125,
0.01491546630859375,
-0.0179443359375,
0.01983642578125,
-0.06463623046875,
0.01485443115234375,
-0.010101318359375,
0.00839996337890625,
-0.10650634765625,
-0.017333984375,
0.072021484375,
-0.034454345703125,
0.016357421875,
-0.0264892578125,
-0.044830322265625,
-0.0240020751953125,
-0.031158447265625,
0.039215087890625,
0.059600830078125,
-0.01508331298828125,
0.034912109375,
0.029815673828125,
0.003505706787109375,
-0.037261962890625,
-0.0440673828125,
-0.03204345703125,
-0.034820556640625,
-0.047882080078125,
0.018402099609375,
-0.027130126953125,
-0.005527496337890625,
-0.0062255859375,
0.01091766357421875,
-0.023193359375,
0.004039764404296875,
0.04156494140625,
0.0147552490234375,
-0.003498077392578125,
-0.005420684814453125,
0.0094146728515625,
0.0004916191101074219,
0.004222869873046875,
-0.0210723876953125,
0.044769287109375,
-0.0005135536193847656,
-0.00498199462890625,
-0.0552978515625,
-0.0011844635009765625,
0.039794921875,
0.041656494140625,
0.048614501953125,
0.081787109375,
-0.037689208984375,
0.00701904296875,
-0.03326416015625,
0.00339508056640625,
-0.0396728515625,
-0.0221099853515625,
-0.0242919921875,
-0.058837890625,
0.067138671875,
-0.027069091796875,
0.00400543212890625,
0.02313232421875,
0.043731689453125,
-0.029388427734375,
0.049346923828125,
0.03265380859375,
0.0124053955078125,
0.07122802734375,
-0.050018310546875,
-0.0161590576171875,
-0.04986572265625,
-0.0159454345703125,
-0.0155792236328125,
-0.044464111328125,
-0.02490234375,
-0.035797119140625,
0.0101470947265625,
0.0201568603515625,
-0.036163330078125,
0.02813720703125,
-0.04632568359375,
0.0491943359375,
0.0167999267578125,
0.0140380859375,
0.0007357597351074219,
0.01216888427734375,
-0.04473876953125,
0.0077056884765625,
-0.047210693359375,
-0.01465606689453125,
0.06072998046875,
0.038848876953125,
0.060089111328125,
0.003139495849609375,
0.058074951171875,
0.0186309814453125,
0.0135650634765625,
-0.03271484375,
0.0390625,
0.000148773193359375,
-0.052032470703125,
-0.0138397216796875,
-0.01220703125,
-0.08026123046875,
0.0267486572265625,
-0.03594970703125,
-0.04510498046875,
0.030975341796875,
0.00911712646484375,
-0.0323486328125,
0.029266357421875,
-0.055511474609375,
0.07293701171875,
0.018646240234375,
-0.048919677734375,
-0.0033740997314453125,
-0.017303466796875,
0.02703857421875,
0.0165863037109375,
0.0164794921875,
-0.040618896484375,
-0.0265960693359375,
0.044830322265625,
-0.0313720703125,
0.050628662109375,
-0.03326416015625,
-0.00362396240234375,
0.0286865234375,
0.005321502685546875,
0.000743865966796875,
0.024749755859375,
-0.00920867919921875,
0.028350830078125,
-0.004863739013671875,
-0.033782958984375,
-0.03076171875,
0.042327880859375,
-0.04913330078125,
-0.011627197265625,
-0.02227783203125,
-0.0206451416015625,
0.010162353515625,
0.045806884765625,
0.055511474609375,
0.01088714599609375,
-0.0224609375,
-0.005817413330078125,
0.065185546875,
0.0172576904296875,
0.050262451171875,
0.025115966796875,
-0.042572021484375,
-0.047088623046875,
0.0650634765625,
-0.00983428955078125,
0.01025390625,
0.004398345947265625,
0.047515869140625,
-0.0067138671875,
-0.050140380859375,
-0.058685302734375,
0.0199127197265625,
-0.021148681640625,
-0.011260986328125,
-0.03009033203125,
0.006046295166015625,
-0.04254150390625,
0.0002593994140625,
-0.02923583984375,
-0.056488037109375,
-0.058074951171875,
0.00966644287109375,
0.05963134765625,
0.040008544921875,
-0.01995849609375,
0.033447265625,
-0.045684814453125,
0.047210693359375,
0.01274871826171875,
0.038543701171875,
-0.012786865234375,
-0.027557373046875,
-0.023468017578125,
0.00185394287109375,
-0.0452880859375,
-0.059967041015625,
0.0267486572265625,
-0.0016841888427734375,
0.04388427734375,
0.04998779296875,
-0.004222869873046875,
0.053253173828125,
-0.050567626953125,
0.06298828125,
0.0477294921875,
-0.0311126708984375,
0.03857421875,
-0.07025146484375,
0.02117919921875,
0.051361083984375,
0.053070068359375,
-0.033599853515625,
-0.03582763671875,
-0.0614013671875,
-0.0648193359375,
0.0124053955078125,
0.01313018798828125,
0.0110626220703125,
0.01442718505859375,
0.044647216796875,
0.01493072509765625,
0.0180816650390625,
-0.05926513671875,
-0.03564453125,
-0.0167999267578125,
-0.011566162109375,
0.0178680419921875,
-0.0002880096435546875,
-0.02008056640625,
-0.0263671875,
0.055877685546875,
-0.007659912109375,
0.0276336669921875,
0.0008697509765625,
0.037261962890625,
-0.024383544921875,
-0.026031494140625,
0.045318603515625,
0.038421630859375,
-0.01430511474609375,
-0.02716064453125,
-0.028167724609375,
-0.05950927734375,
0.0117340087890625,
0.0049896240234375,
-0.02423095703125,
0.01751708984375,
-0.009063720703125,
0.055572509765625,
-0.023529052734375,
-0.0180511474609375,
0.03350830078125,
-0.0269317626953125,
-0.0225830078125,
-0.058837890625,
0.0396728515625,
0.012786865234375,
0.036590576171875,
0.0220947265625,
0.0262298583984375,
0.04046630859375,
-0.0090179443359375,
-0.0018053054809570312,
0.035797119140625,
-0.0233154296875,
-0.0102081298828125,
0.0706787109375,
0.009002685546875,
-0.047637939453125,
0.01558685302734375,
-0.03424072265625,
-0.0180511474609375,
0.049957275390625,
0.0421142578125,
0.08685302734375,
-0.00762939453125,
0.024688720703125,
0.0288848876953125,
-0.0034313201904296875,
-0.0265655517578125,
0.041290283203125,
-0.0030078887939453125,
-0.05511474609375,
-0.0052337646484375,
-0.02130126953125,
-0.05023193359375,
0.0154876708984375,
-0.02630615234375,
0.0478515625,
-0.0650634765625,
-0.030670166015625,
-0.0207672119140625,
-0.0014095306396484375,
-0.016387939453125,
0.002140045166015625,
0.0183563232421875,
0.08599853515625,
-0.077880859375,
0.062408447265625,
0.0589599609375,
-0.01129913330078125,
-0.03802490234375,
0.005565643310546875,
-0.005279541015625,
-0.039794921875,
0.00905609130859375,
0.01378631591796875,
-0.007518768310546875,
-0.00005161762237548828,
-0.054718017578125,
-0.0543212890625,
0.10113525390625,
0.031951904296875,
-0.0164642333984375,
0.0298004150390625,
-0.0232696533203125,
0.0418701171875,
-0.025787353515625,
0.029052734375,
0.0010671615600585938,
0.0214996337890625,
0.0243682861328125,
-0.048126220703125,
-0.021392822265625,
-0.0418701171875,
0.00846099853515625,
0.0157623291015625,
-0.080810546875,
0.05975341796875,
-0.00447845458984375,
-0.00283050537109375,
0.0228424072265625,
0.053131103515625,
0.029083251953125,
0.05828857421875,
0.047821044921875,
0.07708740234375,
0.04071044921875,
-0.009765625,
0.057586669921875,
-0.005420684814453125,
0.03131103515625,
0.062164306640625,
0.00986480712890625,
0.0341796875,
0.02410888671875,
-0.0059356689453125,
0.07635498046875,
0.07440185546875,
0.0026493072509765625,
0.040283203125,
0.0015420913696289062,
-0.0199127197265625,
-0.0037708282470703125,
0.0085601806640625,
-0.03271484375,
-0.0064544677734375,
0.034027099609375,
-0.004116058349609375,
0.006732940673828125,
0.0201416015625,
-0.0021514892578125,
-0.018798828125,
-0.033416748046875,
0.0288848876953125,
0.003116607666015625,
-0.0005545616149902344,
0.049407958984375,
-0.022552490234375,
0.0672607421875,
-0.046966552734375,
-0.036773681640625,
-0.0142974853515625,
-0.0088043212890625,
-0.0216827392578125,
-0.0748291015625,
0.007450103759765625,
-0.0214996337890625,
-0.00527191162109375,
-0.037078857421875,
0.06640625,
-0.0133209228515625,
-0.04425048828125,
0.01462554931640625,
-0.005741119384765625,
0.02825927734375,
0.00955963134765625,
-0.0806884765625,
0.00981903076171875,
-0.00592041015625,
-0.031005859375,
0.01104736328125,
0.0284423828125,
0.02264404296875,
0.06683349609375,
0.03302001953125,
0.0031757354736328125,
0.0030612945556640625,
-0.0012416839599609375,
0.04937744140625,
-0.025848388671875,
-0.052978515625,
-0.05389404296875,
0.05303955078125,
-0.0098419189453125,
-0.0271759033203125,
0.048187255859375,
0.0609130859375,
0.036956787109375,
-0.0288848876953125,
0.040557861328125,
-0.0023097991943359375,
0.044677734375,
-0.0231475830078125,
0.0772705078125,
-0.0614013671875,
-0.00705718994140625,
-0.04150390625,
-0.059661865234375,
-0.009613037109375,
0.057952880859375,
0.0153961181640625,
-0.0033626556396484375,
0.018341064453125,
0.048919677734375,
-0.01849365234375,
-0.0025730133056640625,
0.00962066650390625,
-0.007289886474609375,
0.00756072998046875,
0.0369873046875,
0.050811767578125,
-0.039642333984375,
0.01085662841796875,
-0.042236328125,
-0.0162811279296875,
-0.00789642333984375,
-0.06982421875,
-0.05645751953125,
-0.0312347412109375,
-0.051727294921875,
-0.049285888671875,
-0.015380859375,
0.042022705078125,
0.058013916015625,
-0.04638671875,
-0.0159149169921875,
-0.023223876953125,
0.037322998046875,
-0.00287628173828125,
-0.0208740234375,
-0.00580596923828125,
0.0170745849609375,
-0.07257080078125,
0.01108551025390625,
0.0005974769592285156,
0.045806884765625,
-0.01102447509765625,
0.00506591796875,
-0.0022716522216796875,
-0.031646728515625,
0.01507568359375,
0.03656005859375,
-0.01514434814453125,
-0.0182342529296875,
-0.0260009765625,
0.00905609130859375,
-0.0013360977172851562,
0.044097900390625,
-0.06121826171875,
0.0196533203125,
0.0751953125,
0.0200347900390625,
0.035797119140625,
-0.00965118408203125,
0.0261688232421875,
-0.027557373046875,
0.0345458984375,
0.01044464111328125,
0.039276123046875,
0.0009012222290039062,
-0.031280517578125,
0.054779052734375,
0.049041748046875,
-0.04510498046875,
-0.045501708984375,
0.007232666015625,
-0.0777587890625,
-0.03363037109375,
0.07501220703125,
0.00202178955078125,
-0.0248870849609375,
-0.020294189453125,
-0.020294189453125,
0.006221771240234375,
-0.048553466796875,
0.0357666015625,
0.04669189453125,
-0.04718017578125,
-0.02197265625,
-0.04010009765625,
0.01129913330078125,
-0.00824737548828125,
-0.01953125,
-0.0283203125,
0.045806884765625,
0.051116943359375,
0.04400634765625,
0.05609130859375,
0.008087158203125,
0.02081298828125,
0.0133209228515625,
0.025360107421875,
-0.02020263671875,
0.0027942657470703125,
-0.0382080078125,
0.0074920654296875,
-0.0054168701171875,
-0.050140380859375
]
] |
heegyu/kodialogpt-v0 | 2022-11-09T08:39:22.000Z | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"license:cc-by-nc-sa-4.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | heegyu | null | null | heegyu/kodialogpt-v0 | 1 | 13,185 | transformers | 2022-11-03T00:09:11 | ---
license: cc-by-nc-sa-4.0
---
This model is [skt/kogpt2-base-v2](https://huggingface.co/skt/kogpt2-base-v2) fine-tuned on the AIHub everyday-conversation dataset.<br/>
Training code: https://github.com/HeegyuKim/open-domain-dialog<br/>
Streamlit demo: https://heegyukim-open-domain-dialog-st-demo-1tzktp.streamlitapp.com/<br/>
## Usage example
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

tokenizer = AutoTokenizer.from_pretrained("heegyu/kodialogpt-v0")
model = AutoModelForCausalLM.from_pretrained("heegyu/kodialogpt-v0")
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
generation_args = dict(
num_beams=4,
repetition_penalty=2.0,
no_repeat_ngram_size=4,
eos_token_id=375, # \n
max_new_tokens=64,
do_sample=True,
top_k=50,
early_stopping=True
)
generator(
["0 : **는 게임 좋아하니\n1 :",
"0 : 어제 강남에서 살인사건 났대 ㅜㅜ 너무 무서워\n1 : 헐 왜? 무슨 일 있었어?\n0 : 사진보니까 막 피흘리는 사람있고 경찰들이 떠서 제압하고 난리도 아니었다던데??\n1 :",
"0 : 자기야 어제는 나한테 왜 그랬어?\n1 : 뭔 일 있었어?\n0 : 어떻게 나한테 말도 없이 그럴 수 있어? 나 진짜 실망했어\n1 : "],
**generation_args
)
```
Result:
```
[[{'generated_text': '0 : **는 게임 좋아하니\n1 : 나는 게임을 잘 안 해 키키 '}],
[{'generated_text': '0 : 어제 강남에서 살인사건 났대 ㅜㅜ 너무 무서워\n1 : 헐 왜? 무슨 일 있었어?\n0 : 사진보니까 막 피흘리는 사람있고 경찰들이 떠서 제압하고 난리도 아니었다던데??\n1 : 아이고... 진짜 무섭다... '}],
[{'generated_text': '0 : 자기야 어제는 나한테 왜 그랬어?\n1 : 뭔 일 있었어?\n0 : 어떻게 나한테 말도 없이 그럴 수 있어? 나 진짜 실망했어\n1 : 뭘 잘못 했길래 그래? '}]]
```
Hyperparameters used for training
| 1,346 | [
[
-0.023651123046875,
-0.0806884765625,
0.02105712890625,
0.0270233154296875,
-0.044586181640625,
-0.0030002593994140625,
-0.0077056884765625,
-0.01357269287109375,
0.01552581787109375,
0.033843994140625,
-0.05511474609375,
-0.049530029296875,
-0.03912353515625,
-0.003261566162109375,
0.0223541259765625,
0.067138671875,
-0.0160980224609375,
-0.0208282470703125,
0.0027256011962890625,
0.015777587890625,
-0.0235595703125,
-0.024810791015625,
-0.054840087890625,
-0.0246124267578125,
-0.01371002197265625,
0.0234832763671875,
0.040679931640625,
0.0440673828125,
0.01898193359375,
0.021575927734375,
-0.01293182373046875,
0.01348114013671875,
-0.0245208740234375,
0.00556182861328125,
-0.0031299591064453125,
-0.0126953125,
-0.0352783203125,
-0.0031299591064453125,
0.048675537109375,
0.025390625,
0.0175628662109375,
0.01910400390625,
0.0186614990234375,
0.033203125,
-0.0097808837890625,
0.0312042236328125,
-0.035430908203125,
0.0014581680297851562,
-0.026214599609375,
-0.0269622802734375,
-0.0196075439453125,
-0.03399658203125,
0.022430419921875,
-0.0552978515625,
0.01218414306640625,
0.022674560546875,
0.099853515625,
-0.01142120361328125,
-0.038726806640625,
-0.035064697265625,
-0.024993896484375,
0.062164306640625,
-0.06915283203125,
0.00881195068359375,
0.031768798828125,
0.004940032958984375,
-0.0136260986328125,
-0.07769775390625,
-0.051422119140625,
0.006366729736328125,
-0.01216888427734375,
0.0274658203125,
-0.00836944580078125,
-0.0026035308837890625,
0.00896453857421875,
0.01355743408203125,
-0.04168701171875,
-0.007293701171875,
-0.026885986328125,
-0.03619384765625,
0.0286102294921875,
0.00905609130859375,
0.04931640625,
-0.042449951171875,
-0.042083740234375,
-0.006465911865234375,
-0.0219268798828125,
0.0166473388671875,
0.034759521484375,
0.0199432373046875,
-0.039459228515625,
0.0516357421875,
-0.0237579345703125,
0.04852294921875,
0.0255126953125,
-0.0194854736328125,
0.04656982421875,
-0.03448486328125,
-0.0246124267578125,
0.00159454345703125,
0.0953369140625,
0.035369873046875,
0.0040130615234375,
0.019287109375,
0.00765228271484375,
-0.0112152099609375,
-0.0024814605712890625,
-0.0748291015625,
-0.0171661376953125,
0.0301513671875,
-0.0421142578125,
-0.0253448486328125,
0.00986480712890625,
-0.0849609375,
0.0141754150390625,
-0.00743865966796875,
0.0285186767578125,
-0.04095458984375,
-0.025146484375,
0.00836944580078125,
-0.00435638427734375,
0.0019283294677734375,
0.02435302734375,
-0.05401611328125,
0.0036067962646484375,
0.0247955322265625,
0.059906005859375,
0.0265960693359375,
-0.0148162841796875,
-0.005100250244140625,
0.00020110607147216797,
0.0009055137634277344,
0.039947509765625,
-0.002704620361328125,
-0.0246734619140625,
0.008514404296875,
0.00485992431640625,
-0.031219482421875,
-0.0192413330078125,
0.036224365234375,
-0.0296478271484375,
0.04522705078125,
0.0038547515869140625,
-0.031005859375,
-0.00836181640625,
0.006473541259765625,
-0.0268096923828125,
0.08404541015625,
0.01529693603515625,
-0.07208251953125,
0.023101806640625,
-0.050628662109375,
-0.02545166015625,
0.01934814453125,
-0.01003265380859375,
-0.04754638671875,
-0.0160369873046875,
0.0146026611328125,
0.0290069580078125,
-0.0005421638488769531,
-0.0126953125,
-0.008026123046875,
0.004482269287109375,
0.0249176025390625,
-0.023681640625,
0.0806884765625,
0.0312042236328125,
-0.044586181640625,
0.0085601806640625,
-0.06378173828125,
0.050079345703125,
0.0279998779296875,
-0.027252197265625,
-0.0152435302734375,
-0.0148162841796875,
0.0178680419921875,
0.037811279296875,
0.032562255859375,
-0.041351318359375,
0.0139617919921875,
-0.0567626953125,
0.04229736328125,
0.0654296875,
0.0221099853515625,
0.03863525390625,
-0.01299285888671875,
0.035400390625,
-0.0027141571044921875,
-0.0012693405151367188,
-0.01552581787109375,
-0.047119140625,
-0.052093505859375,
-0.021270751953125,
0.01273345947265625,
0.04876708984375,
-0.0648193359375,
0.0494384765625,
-0.00766754150390625,
-0.03167724609375,
-0.049713134765625,
0.0033588409423828125,
0.0297088623046875,
0.032867431640625,
0.01262664794921875,
-0.00986480712890625,
-0.0513916015625,
-0.035736083984375,
-0.01442718505859375,
-0.0244293212890625,
0.0033321380615234375,
0.042938232421875,
0.053009033203125,
-0.01438140869140625,
0.0751953125,
-0.051727294921875,
-0.0204010009765625,
-0.028717041015625,
0.01849365234375,
0.0251312255859375,
0.0411376953125,
0.046966552734375,
-0.06298828125,
-0.07281494140625,
-0.005168914794921875,
-0.058197021484375,
-0.031494140625,
-0.00922393798828125,
-0.0187225341796875,
0.01224517822265625,
0.018524169921875,
-0.054107666015625,
0.031585693359375,
0.04205322265625,
-0.060028076171875,
0.064208984375,
-0.0267486572265625,
0.0188446044921875,
-0.104736328125,
0.0027008056640625,
-0.0057525634765625,
-0.0220489501953125,
-0.059326171875,
0.015960693359375,
0.004581451416015625,
0.006778717041015625,
-0.04205322265625,
0.061492919921875,
-0.04608154296875,
0.034576416015625,
-0.0165252685546875,
0.026214599609375,
-0.0031566619873046875,
0.03875732421875,
-0.00591278076171875,
0.061767578125,
0.0390625,
-0.044708251953125,
0.0251007080078125,
0.02569580078125,
-0.02630615234375,
0.0254669189453125,
-0.051055908203125,
0.0151519775390625,
-0.0069427490234375,
0.02569580078125,
-0.09625244140625,
-0.044708251953125,
0.05694580078125,
-0.058349609375,
0.0140380859375,
-0.0203399658203125,
-0.03515625,
-0.0408935546875,
-0.01959228515625,
0.033721923828125,
0.04632568359375,
-0.03509521484375,
0.0217437744140625,
0.017242431640625,
-0.02569580078125,
-0.019317626953125,
-0.035797119140625,
0.001857757568359375,
-0.00984954833984375,
-0.054473876953125,
0.02667236328125,
-0.015838623046875,
0.00798797607421875,
-0.01427459716796875,
0.01111602783203125,
-0.00917816162109375,
-0.01427459716796875,
0.0208282470703125,
0.0303955078125,
-0.005832672119140625,
-0.020751953125,
0.018310546875,
-0.0189208984375,
0.004825592041015625,
-0.027801513671875,
0.08111572265625,
-0.003192901611328125,
-0.024322509765625,
-0.06304931640625,
0.0164794921875,
0.024261474609375,
-0.004150390625,
0.053802490234375,
0.054779052734375,
-0.02618408203125,
-0.006130218505859375,
-0.01529693603515625,
0.01201629638671875,
-0.038665771484375,
0.0181732177734375,
-0.0301055908203125,
-0.06610107421875,
0.03173828125,
-0.01702880859375,
-0.00693511962890625,
0.06787109375,
0.053985595703125,
-0.01482391357421875,
0.08026123046875,
0.0171966552734375,
0.0147705078125,
0.02410888671875,
-0.035858154296875,
0.0173492431640625,
-0.05853271484375,
-0.03173828125,
-0.0189208984375,
0.00893402099609375,
-0.04412841796875,
-0.032135009765625,
0.017425537109375,
0.03887939453125,
-0.021270751953125,
0.025787353515625,
-0.055084228515625,
0.0015420913696289062,
0.034088134765625,
0.01226043701171875,
-0.007625579833984375,
0.01348114013671875,
-0.0259857177734375,
-0.0194854736328125,
-0.0396728515625,
-0.04693603515625,
0.06427001953125,
0.034637451171875,
0.0374755859375,
0.004138946533203125,
0.057891845703125,
-8.940696716308594e-7,
0.0018453598022460938,
-0.0345458984375,
0.04486083984375,
0.0252685546875,
-0.039947509765625,
-0.013092041015625,
-0.03179931640625,
-0.064697265625,
0.02105712890625,
-0.01605224609375,
-0.0858154296875,
-0.01328277587890625,
0.01398468017578125,
-0.01410675048828125,
0.03173828125,
-0.03826904296875,
0.059295654296875,
-0.000247955322265625,
-0.0212860107421875,
0.017822265625,
-0.045806884765625,
0.0160675048828125,
0.0047454833984375,
0.0250396728515625,
-0.00988006591796875,
0.005680084228515625,
0.06365966796875,
-0.052642822265625,
0.0687255859375,
-0.0177459716796875,
-0.0182647705078125,
0.052825927734375,
-0.008453369140625,
0.046234130859375,
0.0295257568359375,
0.018646240234375,
0.01358795166015625,
-0.025665283203125,
-0.0269012451171875,
-0.01494598388671875,
0.052581787109375,
-0.053985595703125,
-0.024688720703125,
-0.042694091796875,
-0.00780487060546875,
0.0148162841796875,
0.0401611328125,
0.06475830078125,
0.028472900390625,
0.0055084228515625,
0.0062713623046875,
0.0411376953125,
-0.0229339599609375,
0.031646728515625,
0.01197052001953125,
-0.01953125,
-0.076171875,
0.054901123046875,
-0.00031828880310058594,
0.0335693359375,
0.004955291748046875,
0.00743865966796875,
-0.0389404296875,
-0.040496826171875,
-0.057647705078125,
0.01273345947265625,
-0.037017822265625,
-0.0217437744140625,
-0.0667724609375,
-0.030975341796875,
-0.05731201171875,
-0.02008056640625,
-0.0223846435546875,
-0.0300445556640625,
-0.0283203125,
-0.0079345703125,
0.0310516357421875,
0.0194244384765625,
-0.0025634765625,
0.03887939453125,
-0.055145263671875,
0.039947509765625,
0.0009298324584960938,
0.030487060546875,
0.005832672119140625,
-0.0421142578125,
-0.02117919921875,
0.0300140380859375,
-0.0262908935546875,
-0.0634765625,
0.039093017578125,
-0.01384735107421875,
0.0338134765625,
0.0311279296875,
0.0011310577392578125,
0.0401611328125,
-0.04364013671875,
0.07476806640625,
0.039794921875,
-0.071044921875,
0.053985595703125,
-0.039306640625,
0.0290069580078125,
0.030426025390625,
0.0028076171875,
-0.050537109375,
-0.0266571044921875,
-0.048126220703125,
-0.084716796875,
0.07855224609375,
0.052947998046875,
0.023590087890625,
-0.016326904296875,
0.01922607421875,
0.0001957416534423828,
0.00231170654296875,
-0.0660400390625,
-0.0665283203125,
-0.028228759765625,
-0.032318115234375,
0.0259857177734375,
-0.0206756591796875,
0.005405426025390625,
-0.0189666748046875,
0.0673828125,
0.006378173828125,
0.043670654296875,
0.0306549072265625,
-0.0030994415283203125,
-0.007282257080078125,
0.0282135009765625,
0.048492431640625,
0.02978515625,
-0.0165557861328125,
-0.007663726806640625,
0.006107330322265625,
-0.05621337890625,
0.01419830322265625,
0.003757476806640625,
-0.038177490234375,
0.01371002197265625,
0.018798828125,
0.08477783203125,
-0.001987457275390625,
-0.029296875,
0.0379638671875,
0.0041046142578125,
-0.046722412109375,
-0.030059814453125,
-0.01201629638671875,
0.0006957054138183594,
0.0006508827209472656,
0.030975341796875,
-0.006877899169921875,
-0.01251220703125,
-0.0250396728515625,
0.00806427001953125,
0.0133209228515625,
-0.0245361328125,
-0.02166748046875,
0.07427978515625,
0.008087158203125,
-0.038726806640625,
0.0187530517578125,
-0.028167724609375,
-0.06475830078125,
0.04949951171875,
0.0670166015625,
0.06591796875,
-0.002895355224609375,
0.0272369384765625,
0.055267333984375,
0.006511688232421875,
0.003330230712890625,
0.03619384765625,
0.0072784423828125,
-0.04156494140625,
-0.01073455810546875,
-0.043853759765625,
0.001171112060546875,
0.033233642578125,
-0.0267333984375,
0.00331878662109375,
-0.048858642578125,
-0.0311737060546875,
-0.005863189697265625,
0.0017232894897460938,
-0.04608154296875,
0.04095458984375,
-0.005886077880859375,
0.054443359375,
-0.04034423828125,
0.042572021484375,
0.061279296875,
-0.01934814453125,
-0.048492431640625,
0.006587982177734375,
0.00653076171875,
-0.0528564453125,
0.05413818359375,
0.0160980224609375,
0.01157379150390625,
0.01251983642578125,
-0.056182861328125,
-0.0711669921875,
0.0809326171875,
-0.0225067138671875,
-0.0136566162109375,
-0.0005235671997070312,
0.02996826171875,
0.035003662109375,
-0.02496337890625,
0.0260772705078125,
0.00762939453125,
0.033416748046875,
-0.00685882568359375,
-0.0706787109375,
0.0357666015625,
-0.033355712890625,
-0.0037994384765625,
0.0194091796875,
-0.0679931640625,
0.06427001953125,
-0.0110931396484375,
-0.0350341796875,
0.0217742919921875,
0.051788330078125,
0.0238800048828125,
0.03009033203125,
0.03326416015625,
0.035614013671875,
0.0213775634765625,
-0.0229034423828125,
0.0494384765625,
-0.0267333984375,
0.0654296875,
0.062164306640625,
0.002185821533203125,
0.033111572265625,
0.0233917236328125,
-0.014556884765625,
0.03826904296875,
0.0380859375,
-0.00205230712890625,
0.055145263671875,
0.00905609130859375,
-0.032318115234375,
-0.0009398460388183594,
0.01117706298828125,
-0.0303955078125,
0.0291290283203125,
0.02008056640625,
-0.029449462890625,
-0.01361083984375,
0.006832122802734375,
0.025146484375,
-0.0089111328125,
-0.0031890869140625,
0.06201171875,
-0.007343292236328125,
-0.03863525390625,
0.03216552734375,
0.0206756591796875,
0.06256103515625,
-0.056396484375,
-0.00518035888671875,
0.0175323486328125,
0.01117706298828125,
-0.0100860595703125,
-0.063720703125,
-0.0153350830078125,
0.005542755126953125,
-0.01318359375,
-0.0237579345703125,
0.0682373046875,
-0.03240966796875,
-0.038330078125,
0.0161285400390625,
0.02301025390625,
0.009429931640625,
0.017730712890625,
-0.060089111328125,
-0.007274627685546875,
0.036041259765625,
-0.04296875,
0.016815185546875,
0.033935546875,
0.0192718505859375,
0.036285400390625,
0.053924560546875,
0.0189361572265625,
0.0185699462890625,
-0.0025730133056640625,
0.059906005859375,
-0.055999755859375,
-0.0401611328125,
-0.070068359375,
0.054779052734375,
-0.032501220703125,
-0.043060302734375,
0.06390380859375,
0.05413818359375,
0.06719970703125,
-0.01177978515625,
0.09332275390625,
-0.035186767578125,
0.0377197265625,
-0.052581787109375,
0.0697021484375,
-0.03204345703125,
-0.0129547119140625,
-0.033355712890625,
-0.041412353515625,
0.0034027099609375,
0.051483154296875,
-0.0007920265197753906,
0.01377105712890625,
0.04400634765625,
0.039215087890625,
-0.0008182525634765625,
-0.0117950439453125,
-0.006496429443359375,
0.0250091552734375,
0.02471923828125,
0.041900634765625,
0.0279083251953125,
-0.0660400390625,
0.038421630859375,
-0.03997802734375,
-0.0198211669921875,
-0.0160980224609375,
-0.052764892578125,
-0.07562255859375,
-0.034332275390625,
-0.02557373046875,
-0.0440673828125,
-0.01430511474609375,
0.07562255859375,
0.03863525390625,
-0.055938720703125,
0.0005102157592773438,
-0.0029354095458984375,
-0.00836944580078125,
-0.010406494140625,
-0.028289794921875,
0.04656982421875,
-0.014495849609375,
-0.06744384765625,
0.002593994140625,
-0.0017652511596679688,
0.034088134765625,
0.0263671875,
-0.00576019287109375,
-0.0227813720703125,
0.00974273681640625,
0.03533935546875,
0.01378631591796875,
-0.049072265625,
-0.0185699462890625,
-0.01459503173828125,
-0.0272064208984375,
0.0010843276977539062,
0.0421142578125,
-0.02886962890625,
0.03973388671875,
0.048614501953125,
-0.00153350830078125,
0.032379150390625,
0.00251007080078125,
0.0247650146484375,
-0.055328369140625,
0.0164642333984375,
-0.006317138671875,
0.0239410400390625,
-0.007061004638671875,
-0.03594970703125,
0.02056884765625,
0.037353515625,
-0.03863525390625,
-0.044219970703125,
0.0009565353393554688,
-0.08013916015625,
-0.01497650146484375,
0.092529296875,
-0.01076507568359375,
-0.00528717041015625,
-0.01519775390625,
-0.0592041015625,
0.039459228515625,
-0.04119873046875,
0.0447998046875,
0.0513916015625,
0.0041046142578125,
0.0007634162902832031,
-0.043365478515625,
0.049407958984375,
0.00896453857421875,
-0.05975341796875,
-0.00150299072265625,
0.0171966552734375,
0.0297393798828125,
0.01529693603515625,
0.08355712890625,
-0.00021958351135253906,
0.01146697998046875,
0.007049560546875,
0.006465911865234375,
-0.0095367431640625,
0.01169586181640625,
-0.00423431396484375,
0.0077056884765625,
-0.033203125,
-0.0265350341796875
]
] |
paust/pko-t5-large | 2023-06-28T17:03:42.000Z | [
"transformers",
"pytorch",
"safetensors",
"t5",
"text2text-generation",
"ko",
"arxiv:2105.09680",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text2text-generation | paust | null | null | paust/pko-t5-large | 13 | 13,182 | transformers | 2022-05-16T11:59:52 | ---
language: ko
license: cc-by-4.0
---
# pko-t5-large
[Source Code](https://github.com/paust-team/pko-t5)
pko-t5 is a [T5 v1.1 model](https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/released_checkpoints.md) trained exclusively on Korean data.
To tokenize Korean it uses BBPE (byte-level BPE), which has no OOV tokens, instead of SentencePiece. Training was purely unsupervised: T5's span-corruption objective was applied to Korean corpora such as Namuwiki, Korean Wikipedia, and the Modu Corpus (모두의말뭉치).
When using pko-t5, fine-tune it on your target task.
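The OOV-free property of byte-level BPE can be illustrated with a short sketch (our own illustration, not the actual pko-t5 tokenizer): any string decomposes into UTF-8 bytes, and a base vocabulary covering all 256 byte values can therefore encode arbitrary text.

```python
# Illustrative sketch of why a byte-level vocabulary has no OOV tokens.
# This is NOT the pko-t5 BBPE implementation, just the core idea: every
# string, Korean included, decomposes into UTF-8 bytes, and the base
# vocabulary already contains all 256 possible byte values.

BASE_VOCAB = {bytes([i]): i for i in range(256)}  # one token per byte value

def byte_tokenize(text: str) -> list[int]:
    """Encode text as a sequence of base-vocabulary token ids."""
    return [BASE_VOCAB[bytes([b])] for b in text.encode("utf-8")]

def byte_detokenize(ids: list[int]) -> str:
    """Invert byte_tokenize: ids are byte values, so decoding is lossless."""
    return bytes(ids).decode("utf-8")

text = "당신의 이름은 무엇인가요?"  # Korean never falls outside the vocabulary
ids = byte_tokenize(text)
assert byte_detokenize(ids) == text  # round-trips exactly, no <unk> needed
```

A real BBPE tokenizer additionally merges frequent byte sequences into longer subword tokens, but the byte-level base vocabulary guarantees that no input ever maps to an unknown token.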
## Usage
The model is accessible through the transformers API. For the tokenizer, use `T5TokenizerFast`, not `T5Tokenizer`. The model itself can be used as a standard `T5ForConditionalGeneration`.
### Example
```python
from transformers import T5TokenizerFast, T5ForConditionalGeneration
tokenizer = T5TokenizerFast.from_pretrained('paust/pko-t5-large')
model = T5ForConditionalGeneration.from_pretrained('paust/pko-t5-large')
input_ids = tokenizer(["qa question: 당신의 이름은 무엇인가요?"], return_tensors="pt").input_ids
labels = tokenizer(["T5 입니다."], return_tensors="pt").input_ids
outputs = model(input_ids=input_ids, labels=labels)
print(f"loss={outputs.loss} logits={outputs.logits}")
```
## KLUE evaluation (dev)
| | Model | ynat (macro F1) | sts (pearsonr/F1) | nli (acc) | ner (entity-level F1) | re (micro F1) | dp (LAS) | mrc (EM/F1) |
|-----|------------------------------------------------------------------|-----------------|-------------------|-----------|-----------------------|---------------|-----------|-------------|
| | Baseline | **87.30** | **93.20/86.13** | **89.50** | 86.06 | 71.06 | 87.93 | **75.26/-** |
| FT | [pko-t5-small](https://huggingface.co/paust/pko-t5-small) (77M) | 86.21 | 77.99/77.01 | 69.20 | 82.60 | 66.46 | 93.15 | 43.81/46.58 |
| FT | [pko-t5-base](https://huggingface.co/paust/pko-t5-base) (250M) | 87.29 | 90.25/83.43 | 79.73 | 87.80 | 67.23 | 97.28 | 61.53/64.74 |
| FT | [pko-t5-large](https://huggingface.co/paust/pko-t5-large) (800M) | 87.12 | 92.05/85.24 | 84.96 | **88.18** | **75.17** | **97.60** | 68.01/71.44 |
| MT  | pko-t5-small                                                     | 84.54           | 68.50/72.02       | 51.16     | 74.69                 | 66.11         | 80.40     | 43.60/46.28 |
| MT | pko-t5-base | 86.89 | 83.96/80.30 | 72.03 | 85.27 | 66.59 | 95.05 | 61.11/63.94 |
| MT | pko-t5-large | 87.57 | 91.93/86.29 | 83.63 | 87.41 | 71.34 | 96.99 | 70.70/73.72 |
- FT: single-task fine-tuning / MT: multi-task fine-tuning
- [Baseline](https://arxiv.org/abs/2105.09680): SOTA scores on the dev set as reported in the KLUE paper
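Multi-task fine-tuning in the T5 style is typically implemented by prepending a task prefix to each input, as the `qa question:` prefix in the usage example above suggests. A minimal sketch of building such a mixture (the prefix names here are our own illustration, not the exact prefixes used for the KLUE runs):

```python
import random

# Hypothetical task prefixes in the spirit of the "qa question:" prefix from
# the usage example; the exact prefixes used for the KLUE multi-task runs
# are not documented in this card.
TASK_PREFIXES = {
    "ynat": "ynat title:",
    "nli": "nli premise-hypothesis:",
    "sts": "sts sentence-pair:",
}

def to_multitask_input(task: str, text: str) -> str:
    """Prepend the task prefix so one model can route between tasks."""
    return f"{TASK_PREFIXES[task]} {text}"

def build_mixture(examples_by_task: dict[str, list[str]], seed: int = 0) -> list[str]:
    """Interleave examples from all tasks into one shuffled training mixture."""
    mixture = [
        to_multitask_input(task, text)
        for task, examples in examples_by_task.items()
        for text in examples
    ]
    random.Random(seed).shuffle(mixture)
    return mixture

batch = build_mixture({"ynat": ["기사 제목 분류 예시"], "nli": ["전제/가설 예시"]})
```

In single-task (FT) runs the prefix is fixed, so only one entry of the mapping is ever used; the MT rows in the table correspond to training on the shuffled mixture of all tasks at once.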
## License
pko-t5, created by [PAUST](https://paust.io), is released under the [MIT license](https://github.com/paust-team/pko-t5/blob/main/LICENSE).
[
-0.0418701171875,
-0.018707275390625,
0.0254974365234375,
0.0391845703125,
-0.0352783203125,
0.00995635986328125,
-0.01018524169921875,
-0.021270751953125,
0.0303955078125,
0.01904296875,
-0.0343017578125,
-0.0531005859375,
-0.05511474609375,
0.0186309814453125,
0.010284423828125,
0.06658935546875,
-0.0167236328125,
-0.01556396484375,
0.019134521484375,
-0.003658294677734375,
-0.0439453125,
-0.01078033447265625,
-0.05462646484375,
-0.0249481201171875,
0.00487518310546875,
0.0462646484375,
0.038787841796875,
0.0264892578125,
0.04034423828125,
0.0265960693359375,
-0.007495880126953125,
-0.002002716064453125,
-0.0212860107421875,
-0.01241302490234375,
0.005901336669921875,
-0.03143310546875,
-0.054840087890625,
-0.01500701904296875,
0.050811767578125,
0.041290283203125,
0.009124755859375,
0.01461029052734375,
0.008087158203125,
0.03790283203125,
-0.033905029296875,
0.0200347900390625,
-0.00814056396484375,
0.0162506103515625,
-0.015869140625,
-0.016143798828125,
-0.0164947509765625,
-0.034576416015625,
-0.0007405281066894531,
-0.054290771484375,
0.0083770751953125,
-0.001743316650390625,
0.1094970703125,
0.006931304931640625,
-0.021881103515625,
-0.0008721351623535156,
-0.029541015625,
0.06903076171875,
-0.06927490234375,
0.0293731689453125,
0.03656005859375,
0.01456451416015625,
-0.0031032562255859375,
-0.06243896484375,
-0.037109375,
0.0011835098266601562,
-0.0297088623046875,
0.037689208984375,
-0.00827789306640625,
-0.01059722900390625,
0.035552978515625,
0.0262451171875,
-0.028472900390625,
-0.0012655258178710938,
-0.0472412109375,
-0.02392578125,
0.060699462890625,
0.007045745849609375,
0.03973388671875,
-0.0294342041015625,
-0.04559326171875,
-0.00554656982421875,
-0.0246124267578125,
0.0215911865234375,
0.01288604736328125,
-0.005126953125,
-0.03741455078125,
0.046844482421875,
-0.010040283203125,
0.0228118896484375,
0.03289794921875,
-0.0288543701171875,
0.061065673828125,
-0.043701171875,
-0.0262908935546875,
0.00046896934509277344,
0.07647705078125,
0.051666259765625,
0.00920867919921875,
-0.0041046142578125,
-0.0196990966796875,
-0.0150604248046875,
-0.001171112060546875,
-0.0760498046875,
-0.03387451171875,
0.033111572265625,
-0.04217529296875,
-0.030487060546875,
0.0271759033203125,
-0.08013916015625,
0.005397796630859375,
0.0013055801391601562,
0.044921875,
-0.04779052734375,
-0.032257080078125,
-0.00901031494140625,
-0.0211334228515625,
0.034698486328125,
0.0144195556640625,
-0.0474853515625,
0.00962066650390625,
0.018096923828125,
0.051513671875,
0.012237548828125,
-0.01715087890625,
-0.0172119140625,
0.01526641845703125,
-0.03289794921875,
0.035797119140625,
-0.00670623779296875,
-0.042327880859375,
-0.0181732177734375,
0.01495361328125,
-0.01580810546875,
-0.0293121337890625,
0.056640625,
-0.00817108154296875,
0.0152587890625,
-0.0206298828125,
-0.0276336669921875,
-0.0082550048828125,
0.01548004150390625,
-0.037078857421875,
0.0850830078125,
0.0120697021484375,
-0.07379150390625,
0.043426513671875,
-0.053192138671875,
-0.01073455810546875,
-0.00782012939453125,
0.0185699462890625,
-0.07159423828125,
-0.00908660888671875,
0.0308074951171875,
0.03607177734375,
-0.00362396240234375,
-0.00279998779296875,
-0.0220184326171875,
-0.02239990234375,
0.0110931396484375,
0.00258636474609375,
0.07183837890625,
0.01316070556640625,
-0.042022705078125,
0.021942138671875,
-0.06988525390625,
0.0240936279296875,
0.019439697265625,
-0.027099609375,
-0.005039215087890625,
-0.0277099609375,
-0.0002930164337158203,
0.0318603515625,
0.023681640625,
-0.04095458984375,
0.028228759765625,
-0.03594970703125,
0.035675048828125,
0.061248779296875,
0.01727294921875,
0.03076171875,
-0.03045654296875,
0.02911376953125,
0.0235595703125,
0.0228424072265625,
0.00605010986328125,
-0.026824951171875,
-0.057403564453125,
-0.04156494140625,
0.024749755859375,
0.038848876953125,
-0.0531005859375,
0.06292724609375,
-0.02069091796875,
-0.037017822265625,
-0.044952392578125,
-0.005359649658203125,
0.021697998046875,
0.0287322998046875,
0.031829833984375,
-0.01268768310546875,
-0.054840087890625,
-0.07330322265625,
-0.007904052734375,
0.0026397705078125,
-0.00156402587890625,
0.0271453857421875,
0.0643310546875,
-0.01204681396484375,
0.05523681640625,
-0.043487548828125,
-0.02783203125,
-0.0299072265625,
-0.004802703857421875,
0.04254150390625,
0.04974365234375,
0.056915283203125,
-0.055267333984375,
-0.06939697265625,
-0.0033016204833984375,
-0.05609130859375,
-0.0223846435546875,
0.007297515869140625,
-0.01055908203125,
0.02587890625,
0.02325439453125,
-0.055572509765625,
0.045257568359375,
0.0243682861328125,
-0.05926513671875,
0.06170654296875,
-0.0245513916015625,
0.015716552734375,
-0.107177734375,
0.048736572265625,
-0.0170440673828125,
-0.0182342529296875,
-0.041656494140625,
-0.0111846923828125,
0.0047760009765625,
-0.00888824462890625,
-0.033203125,
0.049560546875,
-0.045440673828125,
-0.00467681884765625,
0.0026493072509765625,
0.006320953369140625,
-0.003879547119140625,
0.039031982421875,
-0.01207733154296875,
0.0645751953125,
0.044921875,
-0.03961181640625,
0.02532958984375,
0.02056884765625,
-0.01666259765625,
0.032684326171875,
-0.053558349609375,
0.0013227462768554688,
0.00031185150146484375,
0.0285186767578125,
-0.07879638671875,
-0.03167724609375,
0.04791259765625,
-0.0523681640625,
0.0316162109375,
-0.00677490234375,
-0.02056884765625,
-0.05218505859375,
-0.051422119140625,
0.0228118896484375,
0.041412353515625,
-0.033843994140625,
0.0303955078125,
0.00945281982421875,
0.001560211181640625,
-0.057281494140625,
-0.03729248046875,
-0.007099151611328125,
-0.031463623046875,
-0.053192138671875,
0.05218505859375,
0.00640106201171875,
-0.0031147003173828125,
0.00925445556640625,
-0.015960693359375,
-0.01020050048828125,
-0.0019140243530273438,
0.0188140869140625,
0.0276336669921875,
-0.014495849609375,
-0.01540374755859375,
-0.01503753662109375,
-0.0294036865234375,
-0.0021839141845703125,
-0.016204833984375,
0.054595947265625,
0.0021877288818359375,
0.00284576416015625,
-0.07470703125,
0.0004887580871582031,
0.06524658203125,
-0.02508544921875,
0.0718994140625,
0.05999755859375,
-0.01194000244140625,
-0.010986328125,
-0.0275115966796875,
-0.01556396484375,
-0.0323486328125,
0.01348114013671875,
-0.05584716796875,
-0.0435791015625,
0.06414794921875,
-0.004528045654296875,
0.003002166748046875,
0.06732177734375,
0.036163330078125,
-0.0181884765625,
0.0709228515625,
0.033447265625,
0.000009894371032714844,
0.032196044921875,
-0.056732177734375,
0.0197601318359375,
-0.0723876953125,
-0.03472900390625,
-0.0300140380859375,
-0.031219482421875,
-0.041717529296875,
-0.0157470703125,
0.030517578125,
0.027587890625,
-0.0291290283203125,
0.025299072265625,
-0.040679931640625,
0.013427734375,
0.033447265625,
0.01398468017578125,
0.0008645057678222656,
-0.01500701904296875,
-0.0217132568359375,
-0.005512237548828125,
-0.047119140625,
-0.021453857421875,
0.0902099609375,
0.033599853515625,
0.03302001953125,
0.01446533203125,
0.055938720703125,
-0.0004341602325439453,
0.001453399658203125,
-0.0538330078125,
0.03741455078125,
0.004428863525390625,
-0.042999267578125,
-0.0267181396484375,
-0.0281219482421875,
-0.07989501953125,
0.024810791015625,
-0.0184783935546875,
-0.06927490234375,
0.01216888427734375,
0.006511688232421875,
-0.0225830078125,
0.037994384765625,
-0.048797607421875,
0.07647705078125,
0.00016391277313232422,
-0.037261962890625,
-0.0012083053588867188,
-0.05120849609375,
0.0251007080078125,
0.004528045654296875,
0.00989532470703125,
0.00841522216796875,
-0.006290435791015625,
0.05609130859375,
-0.060699462890625,
0.040802001953125,
-0.0267181396484375,
-0.00435638427734375,
0.031036376953125,
-0.01165771484375,
0.046661376953125,
0.00864410400390625,
-0.0009851455688476562,
-0.0022716522216796875,
-0.004169464111328125,
-0.048736572265625,
-0.0283203125,
0.04327392578125,
-0.075439453125,
-0.039398193359375,
-0.045928955078125,
-0.01617431640625,
0.00991058349609375,
0.0303497314453125,
0.045074462890625,
0.01268768310546875,
0.00543975830078125,
0.0307464599609375,
0.034698486328125,
-0.0287322998046875,
0.054840087890625,
0.01055145263671875,
-0.01134490966796875,
-0.038055419921875,
0.05908203125,
0.00775909423828125,
0.02020263671875,
0.0159454345703125,
0.0165557861328125,
-0.0382080078125,
-0.0311279296875,
-0.0243988037109375,
0.028533935546875,
-0.031097412109375,
-0.0153350830078125,
-0.0303955078125,
-0.01349639892578125,
-0.038665771484375,
-0.0119171142578125,
-0.03173828125,
-0.026519775390625,
-0.007045745849609375,
-0.00597381591796875,
0.01316070556640625,
0.035430908203125,
-0.012115478515625,
0.00867462158203125,
-0.062408447265625,
0.01465606689453125,
0.0020313262939453125,
0.0277862548828125,
-0.01087188720703125,
-0.04107666015625,
0.0008325576782226562,
-0.0022029876708984375,
-0.0300445556640625,
-0.0806884765625,
0.041046142578125,
-0.005359649658203125,
0.0282745361328125,
0.0244140625,
0.0010786056518554688,
0.048583984375,
-0.013153076171875,
0.059539794921875,
0.0275115966796875,
-0.079833984375,
0.048828125,
-0.0313720703125,
0.0172119140625,
0.0257415771484375,
0.0173492431640625,
-0.03216552734375,
-0.0080108642578125,
-0.0672607421875,
-0.07159423828125,
0.076416015625,
0.033111572265625,
-0.0193328857421875,
0.03082275390625,
0.0112457275390625,
-0.0185394287109375,
0.010711669921875,
-0.05517578125,
-0.031494140625,
-0.038482666015625,
-0.0078582763671875,
-0.0002663135528564453,
-0.01056671142578125,
-0.00013709068298339844,
-0.022491455078125,
0.06231689453125,
0.009521484375,
0.039794921875,
0.030517578125,
-0.0070953369140625,
-0.003692626953125,
0.01617431640625,
0.055419921875,
0.05908203125,
-0.02032470703125,
-0.01004791259765625,
0.036468505859375,
-0.040008544921875,
0.01045989990234375,
-0.0054168701171875,
-0.0198211669921875,
0.00751495361328125,
0.03448486328125,
0.0654296875,
0.0210418701171875,
-0.01174163818359375,
0.040618896484375,
0.01436614990234375,
-0.039703369140625,
-0.03228759765625,
-0.006137847900390625,
0.009429931640625,
0.01506805419921875,
0.0185089111328125,
0.00008088350296020508,
-0.018951416015625,
-0.040740966796875,
0.010650634765625,
0.01502227783203125,
-0.0163116455078125,
-0.0242919921875,
0.044952392578125,
0.0118408203125,
-0.006099700927734375,
0.040802001953125,
-0.009246826171875,
-0.06146240234375,
0.06671142578125,
0.03765869140625,
0.053680419921875,
-0.037628173828125,
0.0047607421875,
0.0706787109375,
0.00569915771484375,
-0.00213623046875,
0.0271453857421875,
0.0105133056640625,
-0.036712646484375,
-0.01776123046875,
-0.047210693359375,
0.0185394287109375,
0.03240966796875,
-0.028533935546875,
0.0257720947265625,
-0.041046142578125,
-0.0250091552734375,
0.0026760101318359375,
0.02777099609375,
-0.038970947265625,
0.032440185546875,
-0.0107421875,
0.05902099609375,
-0.054351806640625,
0.0633544921875,
0.06622314453125,
-0.049224853515625,
-0.087890625,
0.00904083251953125,
-0.01386260986328125,
-0.05108642578125,
0.064453125,
-0.0080108642578125,
0.020111083984375,
0.00445556640625,
-0.028778076171875,
-0.067138671875,
0.10791015625,
0.0035247802734375,
-0.0254974365234375,
0.0021209716796875,
-0.000995635986328125,
0.0338134765625,
0.00751495361328125,
0.0262298583984375,
0.0252838134765625,
0.0570068359375,
0.0142669677734375,
-0.09051513671875,
0.023651123046875,
-0.017578125,
-0.01110076904296875,
0.036956787109375,
-0.08984375,
0.08123779296875,
-0.016143798828125,
-0.0027523040771484375,
-0.006603240966796875,
0.041656494140625,
0.03076171875,
0.0030384063720703125,
0.02423095703125,
0.059539794921875,
0.034576416015625,
-0.0191497802734375,
0.06634521484375,
-0.0257110595703125,
0.0592041015625,
0.03179931640625,
0.02423095703125,
0.037994384765625,
0.03076171875,
-0.03564453125,
0.023406982421875,
0.042388916015625,
-0.031341552734375,
0.0311279296875,
0.010101318359375,
-0.023193359375,
-0.00984954833984375,
0.00836944580078125,
-0.039031982421875,
0.029815673828125,
0.0106201171875,
-0.034576416015625,
-0.0014581680297851562,
-0.0150299072265625,
0.0276947021484375,
-0.0155792236328125,
-0.034027099609375,
0.04595947265625,
0.00669097900390625,
-0.053436279296875,
0.04547119140625,
0.01331329345703125,
0.05072021484375,
-0.036773681640625,
0.00006020069122314453,
-0.00960540771484375,
0.03790283203125,
-0.034637451171875,
-0.07049560546875,
0.023834228515625,
-0.00799560546875,
-0.0085601806640625,
-0.01092529296875,
0.06317138671875,
-0.0200347900390625,
-0.036712646484375,
0.0230255126953125,
0.01523590087890625,
0.01331329345703125,
0.0271759033203125,
-0.059539794921875,
0.0024204254150390625,
0.0137786865234375,
-0.031524658203125,
0.01727294921875,
0.031280517578125,
0.00046944618225097656,
0.04595947265625,
0.042633056640625,
0.01189422607421875,
0.01436614990234375,
-0.01444244384765625,
0.055999755859375,
-0.038421630859375,
-0.04620361328125,
-0.07135009765625,
0.056640625,
-0.0228271484375,
-0.042633056640625,
0.0604248046875,
0.05572509765625,
0.050872802734375,
-0.02166748046875,
0.06610107421875,
-0.030792236328125,
0.0228729248046875,
-0.036407470703125,
0.06591796875,
-0.049652099609375,
-0.01432037353515625,
-0.0286407470703125,
-0.0675048828125,
-0.0161285400390625,
0.06500244140625,
-0.031951904296875,
0.0181121826171875,
0.056365966796875,
0.040740966796875,
-0.010833740234375,
-0.01456451416015625,
0.0014257431030273438,
0.040679931640625,
0.0260467529296875,
0.0718994140625,
0.029571533203125,
-0.052490234375,
0.0413818359375,
-0.038970947265625,
-0.0002617835998535156,
-0.026519775390625,
-0.041778564453125,
-0.0760498046875,
-0.03240966796875,
-0.021026611328125,
-0.037261962890625,
-0.00777435302734375,
0.07110595703125,
0.0445556640625,
-0.055419921875,
-0.0117034912109375,
-0.009124755859375,
0.002166748046875,
-0.015899658203125,
-0.020172119140625,
0.056732177734375,
-0.0238037109375,
-0.07073974609375,
0.00009638071060180664,
0.0008778572082519531,
0.0164947509765625,
0.01462554931640625,
-0.017120361328125,
-0.020965576171875,
-0.017364501953125,
0.032989501953125,
0.0254669189453125,
-0.046630859375,
-0.031005859375,
-0.01090240478515625,
-0.01250457763671875,
0.0183258056640625,
0.0218048095703125,
-0.0258941650390625,
0.01165771484375,
0.0596923828125,
0.0095977783203125,
0.054107666015625,
0.01258087158203125,
0.032989501953125,
-0.04315185546875,
0.0177459716796875,
0.01459503173828125,
0.0171051025390625,
0.003757476806640625,
-0.0123138427734375,
0.050537109375,
0.0439453125,
-0.0322265625,
-0.061248779296875,
-0.01288604736328125,
-0.073974609375,
-0.0272369384765625,
0.06640625,
-0.0170135498046875,
-0.0238037109375,
0.003742218017578125,
-0.04107666015625,
0.036468505859375,
-0.03759765625,
0.047760009765625,
0.06414794921875,
-0.0025157928466796875,
-0.020538330078125,
-0.04193115234375,
0.05645751953125,
0.036285400390625,
-0.0643310546875,
-0.0247650146484375,
0.0085601806640625,
0.0379638671875,
0.023406982421875,
0.054351806640625,
-0.01496124267578125,
0.01947021484375,
0.00595855712890625,
0.0157470703125,
-0.00740814208984375,
-0.0022602081298828125,
-0.0008425712585449219,
0.0223236083984375,
-0.017120361328125,
-0.039581298828125
]
] |
Michau/t5-base-en-generate-headline | 2021-06-23T03:17:34.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | Michau | null | null | Michau/t5-base-en-generate-headline | 50 | 13,180 | transformers | 2022-03-02T23:29:04 | ## About the model
The model has been trained on a collection of 500k articles with headings. Its purpose is to generate a one-line heading suitable for a given article.
Sample code with a WikiNews article:
```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = T5ForConditionalGeneration.from_pretrained("Michau/t5-base-en-generate-headline")
tokenizer = T5Tokenizer.from_pretrained("Michau/t5-base-en-generate-headline")
model = model.to(device)
article = '''
Very early yesterday morning, the United States President Donald Trump reported he and his wife First Lady Melania Trump tested positive for COVID-19. Officials said the Trumps' 14-year-old son Barron tested negative as did First Family and Senior Advisors Jared Kushner and Ivanka Trump.
Trump took to social media, posting at 12:54 am local time (0454 UTC) on Twitter, "Tonight, [Melania] and I tested positive for COVID-19. We will begin our quarantine and recovery process immediately. We will get through this TOGETHER!" Yesterday afternoon Marine One landed on the White House's South Lawn flying Trump to Walter Reed National Military Medical Center (WRNMMC) in Bethesda, Maryland.
Reports said both were showing "mild symptoms". Senior administration officials were tested as people were informed of the positive test. Senior advisor Hope Hicks had tested positive on Thursday.
Presidential physician Sean Conley issued a statement saying Trump has been given zinc, vitamin D, Pepcid and a daily Aspirin. Conley also gave a single dose of the experimental polyclonal antibodies drug from Regeneron Pharmaceuticals.
According to official statements, Trump, now operating from the WRNMMC, is to continue performing his duties as president during a 14-day quarantine. In the event of Trump becoming incapacitated, Vice President Mike Pence could take over the duties of president via the 25th Amendment of the US Constitution. The Pence family all tested negative as of yesterday and there were no changes regarding Pence's campaign events.
'''
text = "headline: " + article
max_len = 256
encoding = tokenizer.encode_plus(text, max_length=max_len, truncation=True, return_tensors="pt")
input_ids = encoding["input_ids"].to(device)
attention_masks = encoding["attention_mask"].to(device)
beam_outputs = model.generate(
    input_ids=input_ids,
    attention_mask=attention_masks,
    max_length=64,
    num_beams=3,
    early_stopping=True,
)
result = tokenizer.decode(beam_outputs[0], skip_special_tokens=True)
print(result)
```
Result:
```Trump and First Lady Melania Test Positive for COVID-19```
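Note that `tokenizer.decode` keeps special tokens such as `<pad>` and `</s>` in the output unless it is called with `skip_special_tokens=True`. A small illustrative helper (our own, not part of the model or transformers) shows the cleanup that flag performs:

```python
import re

# Illustrative only: transformers' tokenizer.decode(..., skip_special_tokens=True)
# already does this. The regex lists the sentinel tokens T5 emits around
# generated text: <pad>, <s>, </s>, and <unk>.
T5_SPECIAL_TOKENS = re.compile(r"<pad>|</?s>|<unk>")

def strip_t5_special_tokens(decoded: str) -> str:
    """Remove T5 sentinel tokens and collapse leftover whitespace."""
    return re.sub(r"\s+", " ", T5_SPECIAL_TOKENS.sub("", decoded)).strip()

raw = "<pad> Trump and First Lady Melania Test Positive for COVID-19</s>"
print(strip_t5_special_tokens(raw))
# Trump and First Lady Melania Test Positive for COVID-19
```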
| 2,639 | [
[
-0.0161285400390625,
-0.0604248046875,
0.0196075439453125,
0.0260467529296875,
-0.0201263427734375,
-0.00872039794921875,
-0.0025615692138671875,
-0.0247650146484375,
-0.0038509368896484375,
0.023040771484375,
-0.0533447265625,
-0.046539306640625,
-0.031585693359375,
-0.00041484832763671875,
-0.03369140625,
0.10369873046875,
0.0183868408203125,
-0.0265655517578125,
-0.0125579833984375,
0.01102447509765625,
-0.030029296875,
-0.03411865234375,
-0.053955078125,
-0.0226593017578125,
0.048919677734375,
0.044403076171875,
0.058013916015625,
0.01332855224609375,
0.04901123046875,
0.0215911865234375,
0.0011358261108398438,
0.011505126953125,
-0.028289794921875,
-0.003162384033203125,
-0.018707275390625,
-0.0127716064453125,
-0.0604248046875,
0.0260467529296875,
0.0051116943359375,
0.01385498046875,
-0.006916046142578125,
0.0133056640625,
-0.01203155517578125,
0.0159912109375,
-0.027618408203125,
-0.0292205810546875,
-0.029022216796875,
0.0003597736358642578,
0.0024433135986328125,
-0.01204681396484375,
-0.02191162109375,
-0.0089874267578125,
0.0224456787109375,
-0.054473876953125,
0.016082763671875,
0.01367950439453125,
0.08917236328125,
-0.0021495819091796875,
-0.04888916015625,
0.0025539398193359375,
-0.061920166015625,
0.0504150390625,
-0.074951171875,
0.01242828369140625,
0.040679931640625,
-0.01204681396484375,
-0.032379150390625,
-0.05755615234375,
-0.034271240234375,
-0.06939697265625,
-0.00478363037109375,
0.0215301513671875,
-0.0014896392822265625,
0.0080413818359375,
0.029571533203125,
0.0276641845703125,
-0.058807373046875,
-0.0196685791015625,
-0.05078125,
-0.00015783309936523438,
0.061737060546875,
0.0117950439453125,
0.027862548828125,
-0.005527496337890625,
-0.042083740234375,
0.02532958984375,
-0.039703369140625,
0.01910400390625,
-0.0176544189453125,
0.0145721435546875,
-0.053680419921875,
0.052520751953125,
-0.03265380859375,
0.044952392578125,
0.04852294921875,
-0.024871826171875,
-0.00524139404296875,
-0.050018310546875,
-0.03009033203125,
-0.007122039794921875,
0.06988525390625,
0.0323486328125,
0.003627777099609375,
-0.0240936279296875,
0.004314422607421875,
0.006244659423828125,
0.0265655517578125,
-0.07470703125,
-0.0140533447265625,
0.039947509765625,
-0.0287017822265625,
-0.041290283203125,
0.0230255126953125,
-0.035919189453125,
-0.0246124267578125,
0.0002646446228027344,
0.07989501953125,
-0.0241241455078125,
-0.005908966064453125,
0.005718231201171875,
-0.0188446044921875,
0.007366180419921875,
0.00791168212890625,
-0.042572021484375,
0.0159149169921875,
0.031646728515625,
0.034515380859375,
-0.0143890380859375,
-0.0163726806640625,
-0.054595947265625,
-0.010345458984375,
-0.049713134765625,
0.028839111328125,
-0.01259613037109375,
-0.009368896484375,
-0.030517578125,
0.00019228458404541016,
0.0018110275268554688,
-0.041717529296875,
0.028289794921875,
-0.0155029296875,
0.02801513671875,
-0.0129547119140625,
-0.0210418701171875,
0.0078582763671875,
0.04107666015625,
-0.0172882080078125,
0.055908203125,
0.0203094482421875,
-0.06866455078125,
0.01374053955078125,
-0.019012451171875,
-0.044403076171875,
-0.01873779296875,
0.0004494190216064453,
-0.058990478515625,
0.001049041748046875,
0.0007691383361816406,
0.0307769775390625,
-0.0176544189453125,
0.0209808349609375,
-0.039154052734375,
-0.01364898681640625,
0.01727294921875,
-0.01348114013671875,
0.07574462890625,
0.01094818115234375,
-0.051361083984375,
0.00041556358337402344,
-0.07818603515625,
-0.0182952880859375,
-0.0009112358093261719,
-0.0250396728515625,
0.0034809112548828125,
-0.009429931640625,
-0.01012420654296875,
0.057403564453125,
0.0128021240234375,
-0.055206298828125,
0.01849365234375,
-0.02569580078125,
0.0144195556640625,
0.019012451171875,
0.041229248046875,
0.018829345703125,
-0.03106689453125,
0.049896240234375,
0.0270233154296875,
0.031707763671875,
0.01409912109375,
-0.03857421875,
-0.059417724609375,
0.01166534423828125,
-0.01233673095703125,
0.066162109375,
-0.0250091552734375,
0.03814697265625,
0.0035686492919921875,
-0.03564453125,
0.007312774658203125,
-0.0188446044921875,
0.0199737548828125,
0.049468994140625,
0.02813720703125,
0.00933837890625,
-0.053802490234375,
-0.0308685302734375,
-0.017822265625,
0.0038585662841796875,
0.01377105712890625,
-0.0140228271484375,
0.045013427734375,
0.0091705322265625,
0.04583740234375,
-0.0248870849609375,
-0.0250091552734375,
-0.020050048828125,
0.004085540771484375,
0.03271484375,
0.052734375,
0.01088714599609375,
-0.044891357421875,
-0.0296478271484375,
0.0249481201171875,
-0.054351806640625,
0.0186004638671875,
-0.021209716796875,
-0.0183258056640625,
0.006061553955078125,
0.03802490234375,
-0.04730224609375,
0.061126708984375,
0.038726806640625,
-0.06036376953125,
0.06689453125,
-0.0284576416015625,
-0.0039005279541015625,
-0.093017578125,
0.0056304931640625,
-0.01491546630859375,
-0.0287322998046875,
-0.06329345703125,
-0.0085906982421875,
-0.0152130126953125,
-0.0106964111328125,
-0.0263671875,
0.0252532958984375,
-0.01560211181640625,
0.01020050048828125,
-0.01495361328125,
0.0029392242431640625,
0.028594970703125,
-0.00634002685546875,
-0.020233154296875,
0.017822265625,
0.024383544921875,
-0.052001953125,
0.04962158203125,
0.00647735595703125,
-0.0022258758544921875,
0.05059814453125,
-0.05322265625,
-0.0010442733764648438,
0.01219940185546875,
0.00730133056640625,
-0.057037353515625,
-0.00969696044921875,
0.029449462890625,
-0.0379638671875,
0.01337432861328125,
0.020965576171875,
0.0089263916015625,
-0.045562744140625,
-0.029327392578125,
-0.00470733642578125,
0.0267333984375,
-0.01113128662109375,
0.016998291015625,
0.0076141357421875,
0.01371002197265625,
-0.052093505859375,
-0.05072021484375,
0.007720947265625,
-0.0124664306640625,
-0.040802001953125,
0.05242919921875,
0.0185089111328125,
-0.03411865234375,
0.0180511474609375,
-0.03387451171875,
-0.008453369140625,
0.00974273681640625,
0.00908660888671875,
0.041961669921875,
-0.006107330322265625,
0.0084686279296875,
-0.0131072998046875,
0.004520416259765625,
-0.007358551025390625,
-0.0229644775390625,
0.038726806640625,
0.0004978179931640625,
0.005077362060546875,
-0.047271728515625,
0.034454345703125,
0.052337646484375,
-0.039825439453125,
0.058135986328125,
0.0760498046875,
-0.052825927734375,
0.00975799560546875,
-0.0274810791015625,
-0.0216522216796875,
-0.030487060546875,
0.035247802734375,
-0.0340576171875,
-0.0280303955078125,
0.043182373046875,
0.045166015625,
0.01096343994140625,
0.06695556640625,
0.0338134765625,
-0.0196533203125,
0.06854248046875,
0.07098388671875,
0.005008697509765625,
0.026092529296875,
-0.039154052734375,
0.0184783935546875,
-0.0560302734375,
-0.03302001953125,
-0.0282745361328125,
0.00022280216217041016,
-0.058013916015625,
-0.00400543212890625,
0.033447265625,
0.0193634033203125,
-0.028289794921875,
0.032806396484375,
-0.050872802734375,
-0.0218048095703125,
0.032012939453125,
0.03204345703125,
0.031707763671875,
-0.027435302734375,
0.0015745162963867188,
0.0103607177734375,
-0.054473876953125,
-0.00926971435546875,
0.07366943359375,
-0.00582122802734375,
0.07342529296875,
-0.04144287109375,
0.050628662109375,
0.0038547515869140625,
0.0272979736328125,
-0.039031982421875,
0.033660888671875,
-0.04205322265625,
-0.07232666015625,
-0.010772705078125,
0.0005106925964355469,
-0.08282470703125,
0.009368896484375,
-0.036529541015625,
-0.042938232421875,
0.0230712890625,
-0.001373291015625,
-0.0565185546875,
0.01153564453125,
-0.03997802734375,
0.086669921875,
-0.0285797119140625,
-0.01323699951171875,
-0.03387451171875,
-0.03765869140625,
0.027557373046875,
0.0272369384765625,
-0.006824493408203125,
-0.036163330078125,
-0.01007843017578125,
0.05767822265625,
-0.04962158203125,
0.0299530029296875,
-0.010467529296875,
0.0221710205078125,
0.0181427001953125,
-0.032135009765625,
0.0369873046875,
0.024871826171875,
0.004421234130859375,
-0.02252197265625,
0.0144195556640625,
-0.023193359375,
-0.0029392242431640625,
0.043792724609375,
-0.06103515625,
-0.041412353515625,
-0.045440673828125,
-0.034942626953125,
0.0125579833984375,
0.0189208984375,
0.04913330078125,
0.05633544921875,
-0.01052093505859375,
0.01788330078125,
0.0263519287109375,
-0.0440673828125,
0.0201263427734375,
0.01043701171875,
-0.039215087890625,
-0.0302734375,
0.050262451171875,
0.034912109375,
0.01076507568359375,
0.025604248046875,
0.04248046875,
0.01253509521484375,
-0.0200958251953125,
0.002864837646484375,
0.03399658203125,
0.0028400421142578125,
-0.0218963623046875,
-0.0452880859375,
-0.043853759765625,
-0.035125732421875,
-0.00003910064697265625,
-0.0203399658203125,
-0.00872039794921875,
-0.0301361083984375,
-0.00623321533203125,
0.020477294921875,
0.05126953125,
0.016510009765625,
-0.00424957275390625,
-0.036895751953125,
0.061370849609375,
0.0277252197265625,
-0.00968170166015625,
-0.031646728515625,
-0.042938232421875,
-0.0287017822265625,
0.003528594970703125,
-0.049163818359375,
-0.0709228515625,
0.0269317626953125,
0.0250091552734375,
0.02001953125,
0.04925537109375,
0.0309295654296875,
0.06365966796875,
-0.015106201171875,
0.0604248046875,
0.00689697265625,
-0.062042236328125,
0.049346923828125,
-0.0307159423828125,
0.036285400390625,
-0.0028858184814453125,
0.025604248046875,
-0.058135986328125,
-0.05889892578125,
-0.062042236328125,
-0.044189453125,
0.040863037109375,
0.00702667236328125,
-0.001407623291015625,
-0.006206512451171875,
0.02044677734375,
-0.0093536376953125,
0.00782012939453125,
-0.03936767578125,
-0.037933349609375,
-0.00753021240234375,
-0.0009126663208007812,
0.005401611328125,
-0.0199127197265625,
-0.00988006591796875,
-0.066650390625,
0.053497314453125,
0.033416748046875,
0.03656005859375,
0.05816650390625,
-0.002880096435546875,
-0.0025730133056640625,
0.0295867919921875,
0.07281494140625,
0.08416748046875,
-0.02655029296875,
0.00878143310546875,
0.0079803466796875,
-0.04156494140625,
0.01459503173828125,
0.0294036865234375,
-0.003101348876953125,
0.017303466796875,
0.036590576171875,
0.0533447265625,
0.01413726806640625,
-0.0166168212890625,
0.04718017578125,
0.0012941360473632812,
-0.011932373046875,
-0.039703369140625,
-0.010009765625,
0.019073486328125,
0.014007568359375,
0.04644775390625,
0.01474761962890625,
-0.001384735107421875,
-0.058807373046875,
0.0289459228515625,
0.057281494140625,
-0.00470733642578125,
-0.059173583984375,
0.0799560546875,
0.004611968994140625,
-0.01296234130859375,
0.058441162109375,
-0.02276611328125,
-0.030609130859375,
0.06341552734375,
0.05474853515625,
0.061248779296875,
-0.0226287841796875,
0.027618408203125,
0.046783447265625,
0.0355224609375,
0.024383544921875,
0.050262451171875,
0.0206756591796875,
-0.0445556640625,
-0.005718231201171875,
-0.03375244140625,
-0.01358795166015625,
0.0418701171875,
-0.04693603515625,
0.0279541015625,
-0.0401611328125,
-0.04718017578125,
0.00801849365234375,
0.037109375,
-0.056427001953125,
0.038665771484375,
0.01313018798828125,
0.072998046875,
-0.06671142578125,
0.0533447265625,
0.041473388671875,
-0.01450347900390625,
-0.0748291015625,
0.0012292861938476562,
-0.041351318359375,
-0.0204620361328125,
0.025421142578125,
0.0266571044921875,
0.0227813720703125,
-0.004512786865234375,
-0.0254364013671875,
-0.074951171875,
0.0908203125,
0.0193939208984375,
-0.0146636962890625,
-0.002864837646484375,
-0.01377105712890625,
0.040618896484375,
-0.0282745361328125,
-0.005352020263671875,
0.02252197265625,
0.034393310546875,
-0.03240966796875,
-0.06365966796875,
0.0083465576171875,
0.007724761962890625,
-0.012786865234375,
-0.0185546875,
-0.048370361328125,
0.07733154296875,
-0.039276123046875,
0.0015382766723632812,
0.004947662353515625,
0.049835205078125,
0.00905609130859375,
0.037109375,
0.0273590087890625,
0.0594482421875,
0.040435791015625,
-0.005817413330078125,
0.09185791015625,
-0.029205322265625,
0.04022216796875,
0.0557861328125,
0.025665283203125,
0.07330322265625,
0.040679931640625,
-0.0443115234375,
0.046051025390625,
0.04583740234375,
0.01262664794921875,
0.038116455078125,
0.00467681884765625,
-0.0271453857421875,
-0.00036072731018066406,
-0.017822265625,
-0.039215087890625,
0.01555633544921875,
0.0204620361328125,
-0.0203094482421875,
-0.01800537109375,
0.00746917724609375,
0.030548095703125,
-0.048919677734375,
-0.03289794921875,
0.0172882080078125,
0.028961181640625,
-0.0660400390625,
0.05657958984375,
-0.0016946792602539062,
0.048065185546875,
-0.044464111328125,
0.0226593017578125,
-0.0015544891357421875,
0.0330810546875,
-0.035491943359375,
-0.022216796875,
0.039764404296875,
-0.01201629638671875,
-0.011138916015625,
0.01953125,
0.06396484375,
-0.0645751953125,
-0.05072021484375,
0.00391387939453125,
0.0262298583984375,
-0.018768310546875,
-0.020843505859375,
-0.048797607421875,
-0.004100799560546875,
-0.0227203369140625,
-0.052703857421875,
-0.0036792755126953125,
0.0091400146484375,
-0.00836181640625,
0.037811279296875,
0.04864501953125,
0.01171875,
0.039520263671875,
-0.015289306640625,
0.03680419921875,
-0.034576416015625,
-0.0249481201171875,
-0.07952880859375,
0.0433349609375,
-0.006702423095703125,
-0.04632568359375,
0.0623779296875,
0.046173095703125,
0.036468505859375,
-0.01568603515625,
0.03387451171875,
-0.039703369140625,
0.00884246826171875,
-0.005069732666015625,
0.0631103515625,
-0.042724609375,
-0.0379638671875,
0.00897216796875,
-0.034881591796875,
-0.0225830078125,
0.036590576171875,
-0.021728515625,
-0.0025348663330078125,
0.0718994140625,
0.047882080078125,
-0.0054168701171875,
-0.05059814453125,
0.042205810546875,
0.040740966796875,
0.029266357421875,
0.01392364501953125,
0.041107177734375,
-0.04638671875,
0.052947998046875,
-0.07244873046875,
-0.00811004638671875,
-0.029205322265625,
-0.03741455078125,
-0.0738525390625,
-0.04156494140625,
-0.031829833984375,
-0.0460205078125,
0.0141754150390625,
0.085205078125,
0.06951904296875,
-0.05670166015625,
-0.00803375244140625,
-0.01334381103515625,
0.00855255126953125,
-0.045013427734375,
-0.02349853515625,
0.0286407470703125,
-0.0088958740234375,
-0.0261688232421875,
0.01104736328125,
0.036834716796875,
-0.013519287109375,
-0.007366180419921875,
0.0108795166015625,
-0.0312347412109375,
0.006122589111328125,
0.05865478515625,
0.020965576171875,
-0.055450439453125,
-0.039337158203125,
0.0163116455078125,
-0.047637939453125,
0.004604339599609375,
0.01143646240234375,
-0.062164306640625,
0.010955810546875,
0.06048583984375,
0.0185699462890625,
0.05322265625,
0.0265655517578125,
0.0286407470703125,
-0.0294342041015625,
-0.0039005279541015625,
0.0195465087890625,
0.0173492431640625,
0.0182342529296875,
-0.036285400390625,
0.005970001220703125,
0.03057861328125,
-0.047119140625,
-0.048431396484375,
0.0034160614013671875,
-0.09527587890625,
0.01226043701171875,
0.062408447265625,
-0.01311492919921875,
-0.02056884765625,
-0.0047760009765625,
-0.00644683837890625,
0.05084228515625,
-0.00327301025390625,
0.052886962890625,
0.045562744140625,
-0.0236358642578125,
-0.0093841552734375,
-0.062164306640625,
0.0299530029296875,
0.028167724609375,
-0.06719970703125,
-0.0338134765625,
-0.01433563232421875,
0.050750732421875,
-0.004657745361328125,
0.0679931640625,
0.0022487640380859375,
0.05963134765625,
0.0009188652038574219,
0.0095672607421875,
-0.01334381103515625,
-0.008453369140625,
-0.0294342041015625,
-0.0204010009765625,
-0.004474639892578125,
-0.05938720703125
]
] |
HuggingFaceH4/starchat-beta | 2023-06-09T10:18:22.000Z | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"gpt_bigcode",
"text-generation",
"generated_from_trainer",
"license:bigcode-openrail-m",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | HuggingFaceH4 | null | null | HuggingFaceH4/starchat-beta | 229 | 13,176 | transformers | 2023-06-07T11:23:47 | ---
tags:
- generated_from_trainer
widget:
- text: "How can I write a Python function to generate the nth Fibonacci number?"
- text: "How do I get the current date using shell commands? Explain how it works."
model-index:
- name: starchat-beta
results: []
license: bigcode-openrail-m
---
<img src="https://huggingface.co/HuggingFaceH4/starchat-beta/resolve/main/model_logo.png" alt="StarChat Beta Logo" width="800" style="margin-left: auto; margin-right: auto; display: block;"/>
# Model Card for StarChat-β
StarChat is a series of language models that are trained to act as helpful coding assistants. StarChat-β is the second model in the series, and is a fine-tuned version of [StarCoderPlus](https://huggingface.co/bigcode/starcoderplus) that was trained on an ["uncensored"](https://erichartford.com/uncensored-models) variant of the [`openassistant-guanaco` dataset](https://huggingface.co/datasets/timdettmers/openassistant-guanaco). We found that removing the in-built alignment of the OpenAssistant dataset boosted performance on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) and made the model more helpful at coding tasks. However, this means that the model is likely to generate problematic text when prompted to do so, and it should only be used for educational and research purposes.
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Model type:** A 16B parameter GPT-like model fine-tuned on an ["uncensored"](https://erichartford.com/uncensored-models) variant of the [`openassistant-guanaco` dataset](https://huggingface.co/datasets/timdettmers/openassistant-guanaco).
- **Language(s) (NLP):** Primarily English and 80+ programming languages.
- **License:** BigCode Open RAIL-M v1
- **Finetuned from model:** [bigcode/starcoderplus](https://huggingface.co/bigcode/starcoderplus)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bigcode-project/starcoder
- **Demo:** https://huggingface.co/spaces/HuggingFaceH4/starchat-playground
## Intended uses & limitations
The model was fine-tuned on a variant of the [`OpenAssistant/oasst1`](https://huggingface.co/datasets/OpenAssistant/oasst1) dataset, which contains a diverse range of dialogues in over 35 languages. As a result, the model can be used for chat, and you can check out our [demo](https://huggingface.co/spaces/HuggingFaceH4/starchat-playground) to test its coding capabilities.
Here's how you can run the model using the `pipeline()` function from 🤗 Transformers:
```python
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="HuggingFaceH4/starchat-beta", torch_dtype=torch.bfloat16, device_map="auto")
# We use a variant of ChatML to format each message
prompt_template = "<|system|>\n<|end|>\n<|user|>\n{query}<|end|>\n<|assistant|>"
prompt = prompt_template.format(query="How do I sort a list in Python?")
# We use a special <|end|> token with ID 49155 to denote ends of a turn
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.2, top_k=50, top_p=0.95, eos_token_id=49155)
# You can sort a list in Python by using the sort() method. Here's an example:\n\n```\nnumbers = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]\nnumbers.sort()\nprint(numbers)\n```\n\nThis will sort the list in place and print the sorted list.
```
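The template above covers a single turn. A multi-turn conversation can be formatted the same way by closing each message with `<|end|>` and ending with an open `<|assistant|>` tag. The helper below is an illustrative sketch (it is not part of the model's API) that follows the same template:

```python
def build_prompt(system: str, turns: list, query: str) -> str:
    """Format a conversation history into the StarChat prompt template.

    `turns` is a list of (user, assistant) message pairs from earlier in
    the conversation; `query` is the new user message to answer.
    """
    prompt = f"<|system|>\n{system}<|end|>\n"
    for user_msg, assistant_msg in turns:
        prompt += f"<|user|>\n{user_msg}<|end|>\n"
        prompt += f"<|assistant|>\n{assistant_msg}<|end|>\n"
    # Leave the final assistant turn open so the model completes it
    prompt += f"<|user|>\n{query}<|end|>\n<|assistant|>"
    return prompt

prompt = build_prompt(
    system="",
    turns=[("Hi!", "Hello! How can I help you with coding today?")],
    query="How do I sort a list in Python?",
)
```

With an empty system message and no history, this reduces to the single-turn `prompt_template` shown above.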
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
StarChat-β has not been aligned to human preferences with techniques like RLHF or deployed with in-the-loop filtering of responses like ChatGPT, so the model can produce problematic outputs (especially when prompted to do so).
Models trained primarily on code data will also have a more skewed demographic bias commensurate with the demographics of the GitHub community; for more on this, see the [StarCoder dataset](https://huggingface.co/datasets/bigcode/starcoderdata), which is derived from The Stack.
Since the base model was pretrained on a large corpus of code, it may produce code snippets that are syntactically valid but semantically incorrect.
For example, it may produce code that does not compile or that produces incorrect results.
It may also produce code that is vulnerable to security exploits.
We have observed the model also has a tendency to produce false URLs which should be carefully inspected before clicking.
StarChat-β was fine-tuned from the base model [StarCoderPlus](https://huggingface.co/bigcode/starcoderplus); please refer to its model card's [Limitations Section](https://huggingface.co/bigcode/starcoderplus#limitations) for relevant information.
In particular, the model was evaluated on some categories of gender biases, propensity for toxicity, and risk of suggesting code completions with known security flaws; these evaluations are reported in its [technical report](https://drive.google.com/file/d/1cN-b9GnWtHzQRoE7M7gAEyivY0kl4BYs/view).
## Training and evaluation data
StarChat-β is trained on an ["uncensored"](https://erichartford.com/uncensored-models) variant of the [`openassistant-guanaco` dataset](https://huggingface.co/datasets/timdettmers/openassistant-guanaco). We applied the same filtering [recipe](https://huggingface.co/datasets/ehartford/WizardLM_alpaca_evol_instruct_70k_unfiltered/blob/main/wizardlm_clean.py) that was used to clean the ShareGPT datasets behind the [WizardLM datasets](https://huggingface.co/datasets/ehartford/WizardLM_alpaca_evol_instruct_70k_unfiltered).
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- gradient_accumulation_steps: 8
- total_train_batch_size: 256
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 6
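As a quick sanity check, the reported effective batch size follows directly from the per-device settings listed above:

```python
# Effective (total) train batch size = per-device batch size
# × number of GPUs × gradient accumulation steps.
train_batch_size = 4
num_devices = 8
gradient_accumulation_steps = 8

total_train_batch_size = train_batch_size * num_devices * gradient_accumulation_steps
# → 256, matching the total_train_batch_size reported above
```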
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.5321 | 0.98 | 15 | 1.2856 |
| 1.2071 | 1.97 | 30 | 1.2620 |
| 1.0162 | 2.95 | 45 | 1.2853 |
| 0.8484 | 4.0 | 61 | 1.3274 |
| 0.6981 | 4.98 | 76 | 1.3994 |
| 0.5668 | 5.9 | 90 | 1.4720 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
## Citation
Although there isn't a blog post or paper associated with StarChat-β, you can find details on the earlier version in the blog post below:
**BibTeX:**
```
@article{Tunstall2023starchat-alpha,
author = {Tunstall, Lewis and Lambert, Nathan and Rajani, Nazneen and Beeching, Edward and Le Scao, Teven and von Werra, Leandro and Han, Sheon and Schmid, Philipp and Rush, Alexander},
title = {Creating a Coding Assistant with StarCoder},
journal = {Hugging Face Blog},
year = {2023},
note = {https://huggingface.co/blog/starchat},
}
``` | 7,106 | [
[
-0.0307159423828125,
-0.061859130859375,
0.0021839141845703125,
0.01206207275390625,
-0.01251983642578125,
-0.01149749755859375,
-0.027923583984375,
-0.051300048828125,
0.01861572265625,
0.0299224853515625,
-0.035369873046875,
-0.040191650390625,
-0.054107666015625,
-0.0077972412109375,
-0.01605224609375,
0.1060791015625,
0.00992584228515625,
0.0132904052734375,
-0.00667572021484375,
-0.0153656005859375,
-0.04205322265625,
-0.044281005859375,
-0.0408935546875,
-0.01198577880859375,
0.0309906005859375,
0.038116455078125,
0.0697021484375,
0.04083251953125,
0.038177490234375,
0.02191162109375,
-0.0165252685546875,
0.00731658935546875,
-0.04693603515625,
-0.0035953521728515625,
0.0130615234375,
-0.03155517578125,
-0.0404052734375,
0.01007080078125,
0.0270843505859375,
0.0270843505859375,
0.00875091552734375,
0.0328369140625,
0.0158843994140625,
0.043975830078125,
-0.035888671875,
0.04052734375,
-0.03424072265625,
-0.025909423828125,
-0.00994110107421875,
0.007122039794921875,
-0.024261474609375,
-0.0345458984375,
-0.0025730133056640625,
-0.0280609130859375,
0.0026569366455078125,
0.01384735107421875,
0.0869140625,
0.00031638145446777344,
-0.01044464111328125,
-0.0325927734375,
-0.0594482421875,
0.034088134765625,
-0.06158447265625,
0.039398193359375,
0.037811279296875,
0.024932861328125,
0.00081634521484375,
-0.04754638671875,
-0.04681396484375,
-0.027740478515625,
-0.01467132568359375,
-0.0006895065307617188,
-0.01076507568359375,
-0.00836944580078125,
0.025299072265625,
0.0252685546875,
-0.054656982421875,
0.013275146484375,
-0.060546875,
-0.0245361328125,
0.037628173828125,
-0.006656646728515625,
0.0160064697265625,
-0.0273590087890625,
-0.0150299072265625,
-0.01251983642578125,
-0.035888671875,
0.01226043701171875,
0.0312347412109375,
0.03472900390625,
-0.035919189453125,
0.0521240234375,
-0.0009360313415527344,
0.043182373046875,
0.0004799365997314453,
0.00038933753967285156,
0.03143310546875,
-0.021209716796875,
-0.0157928466796875,
-0.01654052734375,
0.07763671875,
0.0379638671875,
0.028106689453125,
0.0035839080810546875,
-0.003528594970703125,
0.01192474365234375,
0.005084991455078125,
-0.08001708984375,
-0.016754150390625,
0.027313232421875,
-0.0411376953125,
-0.033935546875,
-0.0006098747253417969,
-0.042449951171875,
-0.013763427734375,
-0.004268646240234375,
0.0227203369140625,
-0.0455322265625,
-0.0189056396484375,
-0.0097198486328125,
0.000396728515625,
0.0209503173828125,
0.009552001953125,
-0.0667724609375,
0.021087646484375,
0.0440673828125,
0.063720703125,
0.01238250732421875,
-0.03033447265625,
-0.016754150390625,
-0.037017822265625,
-0.0188446044921875,
0.035797119140625,
-0.0299835205078125,
-0.01041412353515625,
-0.015350341796875,
-0.0007696151733398438,
-0.0016880035400390625,
-0.0299530029296875,
0.0284881591796875,
-0.039764404296875,
0.0246734619140625,
-0.0199737548828125,
-0.04449462890625,
-0.0024433135986328125,
0.0036830902099609375,
-0.032958984375,
0.0782470703125,
0.0035991668701171875,
-0.06494140625,
0.0226898193359375,
-0.040679931640625,
-0.0012073516845703125,
-0.0018091201782226562,
-0.00795745849609375,
-0.0257568359375,
-0.020050048828125,
0.017578125,
0.037322998046875,
-0.02191162109375,
0.01087188720703125,
-0.034942626953125,
-0.034515380859375,
0.026580810546875,
-0.01361846923828125,
0.0631103515625,
0.031768798828125,
-0.0178680419921875,
0.00982666015625,
-0.0284881591796875,
0.010009765625,
0.024139404296875,
-0.01195526123046875,
-0.020263671875,
-0.02374267578125,
0.0168304443359375,
0.021026611328125,
0.022186279296875,
-0.0452880859375,
0.0176544189453125,
-0.023406982421875,
0.03753662109375,
0.047454833984375,
-0.007022857666015625,
0.035247802734375,
-0.0260467529296875,
0.02435302734375,
-0.002033233642578125,
0.04852294921875,
-0.007770538330078125,
-0.0665283203125,
-0.058807373046875,
-0.019561767578125,
0.0309295654296875,
0.04620361328125,
-0.044097900390625,
0.05230712890625,
-0.01450347900390625,
-0.071044921875,
-0.05401611328125,
0.01263427734375,
0.04364013671875,
0.0270843505859375,
0.0186614990234375,
-0.0227203369140625,
-0.03564453125,
-0.0589599609375,
-0.00023567676544189453,
-0.03955078125,
0.0146636962890625,
0.04473876953125,
0.047515869140625,
-0.0264434814453125,
0.07403564453125,
-0.043121337890625,
-0.004123687744140625,
-0.01244354248046875,
-0.0221099853515625,
0.0247039794921875,
0.0379638671875,
0.042755126953125,
-0.05352783203125,
-0.037994384765625,
-0.007450103759765625,
-0.06878662109375,
0.0011186599731445312,
0.010040283203125,
-0.0227203369140625,
0.0236053466796875,
0.0313720703125,
-0.0657958984375,
0.032623291015625,
0.052703857421875,
-0.040191650390625,
0.056640625,
-0.005279541015625,
0.0110321044921875,
-0.0797119140625,
0.0189056396484375,
0.00963592529296875,
-0.0044097900390625,
-0.047943115234375,
0.00787353515625,
0.01029205322265625,
-0.0248260498046875,
-0.02825927734375,
0.05706787109375,
-0.0311431884765625,
0.003116607666015625,
-0.0170745849609375,
0.003177642822265625,
-0.01268768310546875,
0.052154541015625,
-0.00997161865234375,
0.051177978515625,
0.05413818359375,
-0.0294952392578125,
0.0132598876953125,
0.032928466796875,
-0.009490966796875,
0.01476287841796875,
-0.056976318359375,
0.01074981689453125,
-0.00396728515625,
0.021209716796875,
-0.07421875,
-0.0009565353393554688,
0.047760009765625,
-0.0667724609375,
0.02117919921875,
-0.03173828125,
-0.03460693359375,
-0.04730224609375,
-0.0142059326171875,
0.01373291015625,
0.067138671875,
-0.032379150390625,
0.007442474365234375,
0.0272979736328125,
-0.00923919677734375,
-0.047943115234375,
-0.031768798828125,
-0.00998687744140625,
-0.0116729736328125,
-0.05633544921875,
0.0031414031982421875,
-0.0170745849609375,
-0.005802154541015625,
-0.00458526611328125,
-0.01100921630859375,
0.0007524490356445312,
-0.01544189453125,
0.040496826171875,
0.041656494140625,
-0.004852294921875,
-0.00551605224609375,
-0.02044677734375,
-0.004150390625,
-0.004638671875,
-0.0225677490234375,
0.05767822265625,
-0.0308685302734375,
-0.0106353759765625,
-0.032562255859375,
0.01904296875,
0.04681396484375,
-0.01166534423828125,
0.06927490234375,
0.05950927734375,
-0.0231475830078125,
0.00673675537109375,
-0.0433349609375,
-0.02874755859375,
-0.03839111328125,
0.021270751953125,
-0.003643035888671875,
-0.07086181640625,
0.051971435546875,
0.03125,
0.0166015625,
0.03662109375,
0.0450439453125,
0.003269195556640625,
0.055694580078125,
0.044677734375,
-0.0198822021484375,
0.04180908203125,
-0.0289306640625,
0.0093536376953125,
-0.054168701171875,
-0.03057861328125,
-0.04449462890625,
0.0003769397735595703,
-0.06585693359375,
-0.040191650390625,
0.00891876220703125,
0.0246124267578125,
-0.037689208984375,
0.04718017578125,
-0.052703857421875,
0.0142974853515625,
0.03668212890625,
0.0203704833984375,
0.003421783447265625,
-0.007633209228515625,
0.0006861686706542969,
-0.0012683868408203125,
-0.049560546875,
-0.0445556640625,
0.0911865234375,
0.043212890625,
0.06988525390625,
-0.005802154541015625,
0.0491943359375,
0.01546478271484375,
0.0286712646484375,
-0.044708251953125,
0.043792724609375,
0.00368499755859375,
-0.051422119140625,
-0.01412200927734375,
-0.044769287109375,
-0.07159423828125,
0.007076263427734375,
-0.01934814453125,
-0.0733642578125,
0.004192352294921875,
0.0137786865234375,
-0.00787353515625,
0.01288604736328125,
-0.0767822265625,
0.07861328125,
-0.020782470703125,
-0.01374053955078125,
0.007778167724609375,
-0.0474853515625,
0.026947021484375,
0.006046295166015625,
0.0091705322265625,
-0.005157470703125,
0.016998291015625,
0.0692138671875,
-0.059661865234375,
0.07159423828125,
-0.0217437744140625,
0.00997161865234375,
0.0290985107421875,
-0.0006632804870605469,
0.041961669921875,
0.011322021484375,
0.004791259765625,
0.043792724609375,
-0.00867462158203125,
-0.0276031494140625,
-0.033935546875,
0.0660400390625,
-0.083740234375,
-0.033599853515625,
-0.0214996337890625,
-0.030609130859375,
-0.0027942657470703125,
0.019622802734375,
0.0290374755859375,
0.026214599609375,
-0.007480621337890625,
0.0005354881286621094,
0.0262298583984375,
-0.02886962890625,
0.0171051025390625,
0.027618408203125,
-0.031646728515625,
-0.032257080078125,
0.0592041015625,
0.012054443359375,
0.020782470703125,
0.0023097991943359375,
0.0079193115234375,
-0.0222015380859375,
-0.03131103515625,
-0.03143310546875,
0.0374755859375,
-0.053253173828125,
-0.0172882080078125,
-0.061370849609375,
-0.043487548828125,
-0.051666259765625,
-0.0006070137023925781,
-0.0439453125,
-0.02630615234375,
-0.0179901123046875,
0.003604888916015625,
0.058349609375,
0.043975830078125,
0.01806640625,
0.023681640625,
-0.049072265625,
0.0033626556396484375,
0.00843048095703125,
0.03436279296875,
-0.0021305084228515625,
-0.048828125,
-0.0238189697265625,
0.01505279541015625,
-0.0192108154296875,
-0.060333251953125,
0.0308380126953125,
0.0012359619140625,
0.039794921875,
0.020416259765625,
-0.0025157928466796875,
0.051422119140625,
-0.022979736328125,
0.05975341796875,
0.017730712890625,
-0.045745849609375,
0.0455322265625,
-0.03143310546875,
0.019012451171875,
0.0274658203125,
0.042510986328125,
-0.04754638671875,
-0.040985107421875,
-0.06170654296875,
-0.06329345703125,
0.0555419921875,
0.049041748046875,
0.00623321533203125,
0.00748443603515625,
0.03326416015625,
0.004451751708984375,
0.0110931396484375,
-0.05877685546875,
-0.042572021484375,
-0.025787353515625,
-0.000576019287109375,
-0.016326904296875,
-0.0131988525390625,
0.00501251220703125,
-0.033111572265625,
0.0440673828125,
0.0006895065307617188,
0.0565185546875,
0.0015211105346679688,
-0.006633758544921875,
0.00617218017578125,
0.004520416259765625,
0.03375244140625,
0.0391845703125,
-0.031646728515625,
-0.0213623046875,
-0.00302886962890625,
-0.05401611328125,
-0.0155181884765625,
0.0270538330078125,
-0.00806427001953125,
-0.00836944580078125,
0.0243682861328125,
0.08453369140625,
-0.003376007080078125,
-0.0301055908203125,
0.04449462890625,
-0.00904083251953125,
0.00266265869140625,
-0.024627685546875,
0.01342010498046875,
0.0136871337890625,
0.0294647216796875,
0.0283660888671875,
0.024261474609375,
-0.00864410400390625,
-0.0399169921875,
0.0186614990234375,
0.0182952880859375,
-0.033233642578125,
-0.025054931640625,
0.05712890625,
0.0130767822265625,
-0.028778076171875,
0.061614990234375,
-0.00952911376953125,
-0.0309295654296875,
0.048919677734375,
0.03515625,
0.050872802734375,
-0.0093994140625,
0.0166168212890625,
0.039306640625,
0.0264739990234375,
0.0019130706787109375,
0.01523590087890625,
0.0072174072265625,
-0.054656982421875,
-0.01013946533203125,
-0.042724609375,
-0.0204925537109375,
0.01477813720703125,
-0.047210693359375,
0.03271484375,
-0.035400390625,
-0.0303955078125,
0.0159454345703125,
0.00992584228515625,
-0.07537841796875,
-0.006725311279296875,
0.0211029052734375,
0.07818603515625,
-0.07568359375,
0.057525634765625,
0.045623779296875,
-0.04217529296875,
-0.06512451171875,
-0.0109100341796875,
0.000732421875,
-0.0670166015625,
0.032928466796875,
0.0231170654296875,
0.03173828125,
-0.01342010498046875,
-0.05023193359375,
-0.0640869140625,
0.08392333984375,
0.02142333984375,
-0.031280517578125,
-0.00644683837890625,
0.00164794921875,
0.051025390625,
-0.0303955078125,
0.06475830078125,
0.0250396728515625,
0.027374267578125,
0.01332855224609375,
-0.10137939453125,
0.0185699462890625,
-0.043487548828125,
0.00007200241088867188,
-0.0017108917236328125,
-0.0677490234375,
0.07025146484375,
-0.025360107421875,
0.00799560546875,
-0.00506591796875,
0.044281005859375,
0.033905029296875,
0.00872039794921875,
0.0290679931640625,
0.0232086181640625,
0.04852294921875,
-0.007137298583984375,
0.07708740234375,
-0.0291595458984375,
0.0408935546875,
0.06085205078125,
-0.0074615478515625,
0.04705810546875,
0.02655029296875,
-0.0079345703125,
0.029144287109375,
0.060791015625,
-0.01261138916015625,
0.0273590087890625,
-0.0029163360595703125,
0.0142974853515625,
-0.000530242919921875,
0.01029205322265625,
-0.03314208984375,
0.03369140625,
0.02557373046875,
-0.0280303955078125,
-0.01361083984375,
0.01047515869140625,
0.033233642578125,
-0.0303192138671875,
-0.007720947265625,
0.049652099609375,
0.00836181640625,
-0.053741455078125,
0.07958984375,
-0.0029010772705078125,
0.0712890625,
-0.053314208984375,
0.007198333740234375,
-0.0291748046875,
0.015228271484375,
-0.0186920166015625,
-0.055389404296875,
0.006961822509765625,
0.005863189697265625,
-0.00896453857421875,
-0.0012407302856445312,
0.0565185546875,
-0.02117919921875,
-0.0273284912109375,
0.029754638671875,
0.036407470703125,
0.03912353515625,
-0.0020427703857421875,
-0.0792236328125,
0.024932861328125,
0.003772735595703125,
-0.033233642578125,
0.02716064453125,
0.01172637939453125,
-0.01433563232421875,
0.06524658203125,
0.054412841796875,
0.0093994140625,
-0.002590179443359375,
-0.00809478759765625,
0.089111328125,
-0.04559326171875,
-0.0450439453125,
-0.056488037109375,
0.037872314453125,
-0.01318359375,
-0.0343017578125,
0.073486328125,
0.052154541015625,
0.055206298828125,
-0.0021820068359375,
0.05645751953125,
-0.03778076171875,
0.0181732177734375,
-0.0227508544921875,
0.06976318359375,
-0.0247802734375,
0.0250701904296875,
-0.0384521484375,
-0.0655517578125,
0.00162506103515625,
0.033477783203125,
-0.0234527587890625,
0.0295257568359375,
0.0391845703125,
0.08966064453125,
-0.00836944580078125,
0.0014057159423828125,
0.01110076904296875,
0.0232086181640625,
0.0200042724609375,
0.03460693359375,
0.044708251953125,
-0.056915283203125,
0.059539794921875,
-0.014923095703125,
-0.03460693359375,
-0.0223236083984375,
-0.02960205078125,
-0.08251953125,
-0.058441162109375,
-0.032958984375,
-0.0584716796875,
0.01503753662109375,
0.06671142578125,
0.069580078125,
-0.046661376953125,
-0.0255126953125,
0.0195465087890625,
0.00012034177780151367,
-0.0203399658203125,
-0.01934814453125,
0.0261993408203125,
-0.0110626220703125,
-0.045745849609375,
0.0005965232849121094,
-0.017578125,
0.006927490234375,
-0.01412200927734375,
-0.02716064453125,
-0.00786590576171875,
-0.004276275634765625,
0.035919189453125,
0.034271240234375,
-0.03472900390625,
-0.032257080078125,
0.01528167724609375,
-0.0263519287109375,
0.0016489028930664062,
0.036163330078125,
-0.044769287109375,
0.01361083984375,
0.036376953125,
0.01128387451171875,
0.053741455078125,
-0.005046844482421875,
0.0191650390625,
-0.048126220703125,
0.0182952880859375,
0.00347137451171875,
0.02313232421875,
0.020172119140625,
-0.035919189453125,
0.0308837890625,
0.0234527587890625,
-0.060150146484375,
-0.060638427734375,
-0.0116424560546875,
-0.08642578125,
-0.0171051025390625,
0.1175537109375,
-0.00799560546875,
-0.0257415771484375,
0.0012426376342773438,
-0.0201873779296875,
0.023956298828125,
-0.053741455078125,
0.043060302734375,
0.043609619140625,
-0.006870269775390625,
-0.01276397705078125,
-0.05584716796875,
0.042205810546875,
-0.0007429122924804688,
-0.05224609375,
0.0173492431640625,
0.033660888671875,
0.050140380859375,
0.00750732421875,
0.07049560546875,
-0.012969970703125,
0.01158905029296875,
0.00501251220703125,
0.0294647216796875,
-0.019195556640625,
-0.0186614990234375,
-0.014068603515625,
-0.0103912353515625,
0.002101898193359375,
-0.0221099853515625
]
] |
w11wo/indonesian-roberta-base-sentiment-classifier | 2023-05-13T04:10:11.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"roberta",
"text-classification",
"indonesian-roberta-base-sentiment-classifier",
"id",
"dataset:indonlu",
"arxiv:1907.11692",
"doi:10.57967/hf/0644",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | w11wo | null | null | w11wo/indonesian-roberta-base-sentiment-classifier | 16 | 13,130 | transformers | 2022-03-02T23:29:05 | ---
language: id
tags:
- indonesian-roberta-base-sentiment-classifier
license: mit
datasets:
- indonlu
widget:
- text: "Jangan sampai saya telpon bos saya ya!"
---
## Indonesian RoBERTa Base Sentiment Classifier
Indonesian RoBERTa Base Sentiment Classifier is a sentiment-text-classification model based on the [RoBERTa](https://arxiv.org/abs/1907.11692) model. The model was originally the pre-trained [Indonesian RoBERTa Base](https://hf.co/flax-community/indonesian-roberta-base) model, which is then fine-tuned on [`indonlu`](https://hf.co/datasets/indonlu)'s `SmSA` dataset consisting of Indonesian comments and reviews.
After training, the model achieved an evaluation accuracy of 94.36% and F1-macro of 92.42%. On the benchmark test set, the model achieved an accuracy of 93.2% and F1-macro of 91.02%.
Hugging Face's `Trainer` class from the [Transformers](https://huggingface.co/transformers) library was used to train the model. PyTorch was used as the backend framework during training, though the model remains compatible with other frameworks.
## Model
| Model | #params | Arch. | Training/Validation data (text) |
| ---------------------------------------------- | ------- | ------------ | ------------------------------- |
| `indonesian-roberta-base-sentiment-classifier` | 124M | RoBERTa Base | `SmSA` |
## Evaluation Results
The model was trained for 5 epochs and the best model was loaded at the end.
| Epoch | Training Loss | Validation Loss | Accuracy | F1 | Precision | Recall |
| ----- | ------------- | --------------- | -------- | -------- | --------- | -------- |
| 1 | 0.342600 | 0.213551 | 0.928571 | 0.898539 | 0.909803 | 0.890694 |
| 2 | 0.190700 | 0.213466 | 0.934127 | 0.901135 | 0.925297 | 0.882757 |
| 3 | 0.125500 | 0.219539 | 0.942857 | 0.920901 | 0.927511 | 0.915193 |
| 4 | 0.083600 | 0.235232 | 0.943651 | 0.924227 | 0.926494 | 0.922048 |
| 5 | 0.059200 | 0.262473 | 0.942063 | 0.920583 | 0.924084 | 0.917351 |
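The F1 values above are macro-averaged: the unweighted mean of the per-class F1 scores. The sketch below shows the computation with illustrative precision/recall values (not the model's actual per-class numbers):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall for a single class."""
    return 2 * precision * recall / (precision + recall)

# Hypothetical per-class (precision, recall) pairs for the three SmSA labels
per_class_f1 = [f1(0.95, 0.94), f1(0.90, 0.92), f1(0.93, 0.91)]

# Macro-F1: simple average, so every class counts equally
# regardless of its support in the dataset.
macro_f1 = sum(per_class_f1) / len(per_class_f1)
```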
## How to Use
### As Text Classifier
```python
from transformers import pipeline
pretrained_name = "w11wo/indonesian-roberta-base-sentiment-classifier"
nlp = pipeline(
"sentiment-analysis",
model=pretrained_name,
tokenizer=pretrained_name
)
nlp("Jangan sampai saya telpon bos saya ya!")
```
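The pipeline above returns only the top label and its score. If you need the full probability distribution over all three sentiment classes, you can softmax the raw classifier logits yourself. The snippet below is a minimal, framework-free sketch; the logits are dummy values, and the label order is an assumption — check `model.config.id2label` on the loaded model for the authoritative mapping.

```python
import math

def logits_to_scores(logits, id2label):
    """Convert raw classifier logits into a {label: probability} dict
    via a numerically plain softmax."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return {id2label[i]: e / total for i, e in enumerate(exps)}

# With the real model you would obtain logits along these lines:
#   tokenizer = AutoTokenizer.from_pretrained(pretrained_name)
#   model = AutoModelForSequenceClassification.from_pretrained(pretrained_name)
#   logits = model(**tokenizer(text, return_tensors="pt")).logits[0].tolist()
example_logits = [2.0, 0.5, -1.0]  # dummy values for illustration
scores = logits_to_scores(example_logits, {0: "positive", 1: "neutral", 2: "negative"})
```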
## Disclaimer
Consider the biases that come from both the pre-trained RoBERTa model and the `SmSA` dataset, which may carry over into this model's predictions.
## Author
Indonesian RoBERTa Base Sentiment Classifier was trained and evaluated by [Wilson Wongso](https://w11wo.github.io/). All computation and development are done on Google Colaboratory using their free GPU access.
## Citation
If used, please cite the following:
```bibtex
@misc {wilson_wongso_2023,
author = { {Wilson Wongso} },
title = { indonesian-roberta-base-sentiment-classifier (Revision e402e46) },
year = 2023,
url = { https://huggingface.co/w11wo/indonesian-roberta-base-sentiment-classifier },
doi = { 10.57967/hf/0644 },
publisher = { Hugging Face }
}
``` | 3,245 | [
[
-0.0311431884765625,
-0.050506591796875,
0.011749267578125,
0.0213775634765625,
-0.0243072509765625,
-0.01424407958984375,
-0.03228759765625,
-0.0152740478515625,
0.007610321044921875,
0.03240966796875,
-0.039764404296875,
-0.0518798828125,
-0.06878662109375,
0.01326751708984375,
0.00409698486328125,
0.1148681640625,
0.01242828369140625,
0.014495849609375,
0.0012989044189453125,
-0.023712158203125,
-0.0185089111328125,
-0.047210693359375,
-0.045806884765625,
-0.0189971923828125,
0.0167083740234375,
0.00833892822265625,
0.032196044921875,
0.0036945343017578125,
0.037200927734375,
0.020355224609375,
-0.0235443115234375,
-0.00669097900390625,
-0.0145263671875,
-0.00543212890625,
0.009368896484375,
-0.06396484375,
-0.041259765625,
0.0157470703125,
0.0220947265625,
0.0263214111328125,
0.0149993896484375,
0.037628173828125,
0.0123443603515625,
0.058837890625,
-0.040740966796875,
0.0236358642578125,
-0.02972412109375,
0.008697509765625,
-0.0253753662109375,
-0.01210784912109375,
-0.040618896484375,
-0.031585693359375,
0.02630615234375,
-0.0114898681640625,
0.0107574462890625,
-0.01280975341796875,
0.09857177734375,
0.0261077880859375,
-0.035400390625,
-0.0159912109375,
-0.035980224609375,
0.07452392578125,
-0.04876708984375,
-0.0015697479248046875,
0.014251708984375,
0.019775390625,
0.0084228515625,
-0.0236968994140625,
-0.0516357421875,
0.009490966796875,
-0.0038509368896484375,
0.022430419921875,
-0.006931304931640625,
-0.002933502197265625,
0.0264892578125,
0.048919677734375,
-0.0474853515625,
-0.0169525146484375,
-0.02850341796875,
0.0030803680419921875,
0.030609130859375,
-0.006252288818359375,
0.0045013427734375,
-0.0557861328125,
-0.033447265625,
-0.02667236328125,
-0.01953125,
0.01556396484375,
0.036468505859375,
0.0266265869140625,
-0.0279083251953125,
0.04510498046875,
0.000583648681640625,
0.0518798828125,
0.01180267333984375,
-0.0294342041015625,
0.06304931640625,
-0.005901336669921875,
-0.031829833984375,
-0.0097198486328125,
0.07879638671875,
0.0218353271484375,
0.039276123046875,
0.0262298583984375,
0.0003917217254638672,
0.022979736328125,
0.0083465576171875,
-0.048095703125,
-0.0142822265625,
0.0214080810546875,
-0.055084228515625,
-0.057769775390625,
0.0157012939453125,
-0.05718994140625,
0.013916015625,
-0.01158905029296875,
0.0244140625,
-0.0308837890625,
-0.0423583984375,
-0.01485443115234375,
-0.0280303955078125,
0.02923583984375,
0.005825042724609375,
-0.0560302734375,
0.0020656585693359375,
0.045684814453125,
0.06512451171875,
0.006450653076171875,
0.00011628866195678711,
0.006412506103515625,
-0.0152587890625,
-0.0085906982421875,
0.054901123046875,
-0.021881103515625,
-0.0384521484375,
-0.00714111328125,
0.004779815673828125,
-0.0056915283203125,
-0.0281219482421875,
0.06195068359375,
-0.0234375,
0.06756591796875,
0.006404876708984375,
-0.037078857421875,
-0.03472900390625,
0.0310211181640625,
-0.0325927734375,
0.0965576171875,
0.01531982421875,
-0.08074951171875,
0.03521728515625,
-0.052215576171875,
-0.0203094482421875,
-0.0268707275390625,
0.005199432373046875,
-0.0633544921875,
-0.003936767578125,
0.017547607421875,
0.034942626953125,
-0.02117919921875,
0.0181121826171875,
-0.0164794921875,
-0.0125732421875,
0.01910400390625,
-0.04815673828125,
0.11285400390625,
0.0214996337890625,
-0.0298919677734375,
0.016357421875,
-0.0728759765625,
0.006439208984375,
0.005428314208984375,
-0.025726318359375,
-0.033416748046875,
-0.03289794921875,
0.022216796875,
0.01727294921875,
0.0274810791015625,
-0.036224365234375,
0.0160980224609375,
-0.03814697265625,
0.00910186767578125,
0.060577392578125,
0.0034542083740234375,
0.0225677490234375,
-0.0364990234375,
0.04327392578125,
0.012542724609375,
0.01424407958984375,
-0.014495849609375,
-0.0560302734375,
-0.061767578125,
-0.037261962890625,
0.030670166015625,
0.058746337890625,
-0.0172119140625,
0.07196044921875,
-0.02734375,
-0.05035400390625,
-0.035400390625,
-0.0022106170654296875,
0.032989501953125,
0.040771484375,
0.033203125,
-0.0099639892578125,
-0.066162109375,
-0.04248046875,
-0.0192413330078125,
-0.023345947265625,
0.0016641616821289062,
0.024658203125,
0.0384521484375,
-0.017913818359375,
0.0701904296875,
-0.038177490234375,
-0.0438232421875,
-0.028564453125,
0.022918701171875,
0.046356201171875,
0.039825439453125,
0.06524658203125,
-0.0521240234375,
-0.053070068359375,
-0.0188446044921875,
-0.06964111328125,
0.003772735595703125,
-0.0017423629760742188,
-0.01068878173828125,
0.058929443359375,
0.002986907958984375,
-0.054840087890625,
0.040740966796875,
0.048858642578125,
-0.01360321044921875,
0.043701171875,
0.0011348724365234375,
0.00919342041015625,
-0.106689453125,
0.0069122314453125,
0.01407623291015625,
-0.0139312744140625,
-0.0196380615234375,
-0.003421783447265625,
-0.00923919677734375,
-0.0036067962646484375,
-0.0202789306640625,
0.0254364013671875,
-0.0207061767578125,
0.0017337799072265625,
-0.006664276123046875,
-0.01192474365234375,
-0.00199127197265625,
0.0545654296875,
0.01338958740234375,
0.051910400390625,
0.041473388671875,
-0.0283660888671875,
0.011260986328125,
0.0218505859375,
-0.02099609375,
0.058074951171875,
-0.05267333984375,
-0.0005083084106445312,
0.0010528564453125,
0.039581298828125,
-0.08258056640625,
0.0013027191162109375,
0.034271240234375,
-0.0557861328125,
0.01268768310546875,
-0.04620361328125,
-0.0237274169921875,
-0.03668212890625,
-0.0222930908203125,
0.01329803466796875,
0.06976318359375,
-0.039642333984375,
0.051300048828125,
0.0196075439453125,
0.00414276123046875,
-0.055572509765625,
-0.0631103515625,
-0.01016998291015625,
-0.03460693359375,
-0.045135498046875,
0.007965087890625,
0.00563812255859375,
-0.0053863525390625,
0.0003952980041503906,
0.0098876953125,
-0.0112152099609375,
-0.0070953369140625,
0.04302978515625,
0.0406494140625,
-0.0196685791015625,
-0.00013625621795654297,
-0.0003516674041748047,
-0.0264739990234375,
0.01983642578125,
-0.01168060302734375,
0.057952880859375,
-0.0242462158203125,
0.0009584426879882812,
-0.06390380859375,
-0.00962066650390625,
0.03765869140625,
-0.0181732177734375,
0.06402587890625,
0.045806884765625,
-0.009002685546875,
0.002689361572265625,
-0.025970458984375,
0.0117340087890625,
-0.030364990234375,
0.0106964111328125,
-0.0489501953125,
-0.03448486328125,
0.036163330078125,
-0.0078277587890625,
-0.00428009033203125,
0.060302734375,
0.03607177734375,
-0.003253936767578125,
0.08123779296875,
0.040557861328125,
-0.0282135009765625,
0.0390625,
-0.050628662109375,
0.013916015625,
-0.06842041015625,
-0.0134124755859375,
-0.056976318359375,
-0.0107879638671875,
-0.062408447265625,
-0.0207366943359375,
0.0138397216796875,
-0.00868988037109375,
-0.0232086181640625,
0.01153564453125,
-0.047607421875,
0.0140533447265625,
0.03564453125,
0.01250457763671875,
0.0104217529296875,
-0.0009088516235351562,
-0.001979827880859375,
-0.0077362060546875,
-0.0285797119140625,
-0.049957275390625,
0.11419677734375,
0.0246429443359375,
0.041473388671875,
0.007965087890625,
0.0682373046875,
0.0091705322265625,
0.0247650146484375,
-0.053070068359375,
0.028411865234375,
-0.0236358642578125,
-0.05322265625,
0.0027618408203125,
-0.0289764404296875,
-0.04827880859375,
0.01244354248046875,
-0.0113372802734375,
-0.0245819091796875,
0.00481414794921875,
0.00687408447265625,
-0.00731658935546875,
0.0223388671875,
-0.033782958984375,
0.07391357421875,
-0.00376129150390625,
-0.0004112720489501953,
-0.01471710205078125,
-0.045196533203125,
0.03289794921875,
0.0269927978515625,
0.01119232177734375,
-0.0008540153503417969,
0.0205078125,
0.048065185546875,
-0.05145263671875,
0.059814453125,
-0.04156494140625,
0.0047760009765625,
0.0299072265625,
-0.0210418701171875,
0.01898193359375,
-0.0037689208984375,
0.0008893013000488281,
0.0248870849609375,
-0.01024627685546875,
-0.036651611328125,
-0.03802490234375,
0.050018310546875,
-0.08056640625,
-0.03155517578125,
-0.072021484375,
-0.002777099609375,
0.003734588623046875,
0.016937255859375,
0.032257080078125,
0.014007568359375,
-0.003734588623046875,
0.0232391357421875,
0.05609130859375,
-0.0098876953125,
0.01035308837890625,
0.00907135009765625,
-0.0219879150390625,
-0.029937744140625,
0.057037353515625,
-0.006526947021484375,
0.007411956787109375,
0.0157318115234375,
0.01397705078125,
-0.0184478759765625,
-0.01099395751953125,
-0.044097900390625,
0.01244354248046875,
-0.0469970703125,
-0.01861572265625,
-0.054931640625,
-0.0208282470703125,
-0.029876708984375,
-0.006927490234375,
-0.0191650390625,
-0.03570556640625,
-0.034881591796875,
-0.00530242919921875,
0.046600341796875,
0.035400390625,
0.016021728515625,
0.01910400390625,
-0.043792724609375,
0.0081024169921875,
0.0014190673828125,
0.010284423828125,
-0.005237579345703125,
-0.05963134765625,
0.01076507568359375,
0.007419586181640625,
-0.01560211181640625,
-0.060638427734375,
0.037506103515625,
0.0177764892578125,
0.01543426513671875,
0.0294647216796875,
-0.006389617919921875,
0.055084228515625,
-0.00971221923828125,
0.07537841796875,
0.0051422119140625,
-0.067626953125,
0.07012939453125,
-0.0115814208984375,
0.01324462890625,
0.04034423828125,
0.044097900390625,
-0.0184478759765625,
-0.03094482421875,
-0.07110595703125,
-0.06298828125,
0.038818359375,
-0.00504302978515625,
0.002819061279296875,
0.0022449493408203125,
0.026611328125,
-0.0015840530395507812,
0.00725555419921875,
-0.06646728515625,
-0.0278472900390625,
-0.035552978515625,
-0.0552978515625,
-0.00417327880859375,
-0.0161590576171875,
0.001216888427734375,
-0.0191650390625,
0.062164306640625,
-0.0025310516357421875,
0.0178070068359375,
0.013671875,
-0.023040771484375,
-0.004863739013671875,
0.01325225830078125,
0.03204345703125,
0.01470184326171875,
-0.023162841796875,
-0.010040283203125,
0.0140380859375,
-0.03173828125,
0.00574493408203125,
-0.01081085205078125,
-0.0308380126953125,
0.00594329833984375,
0.019012451171875,
0.06451416015625,
0.007068634033203125,
-0.037261962890625,
0.056976318359375,
-0.01947021484375,
-0.004833221435546875,
-0.07391357421875,
0.0127716064453125,
-0.0255584716796875,
0.0018548965454101562,
0.031463623046875,
0.054229736328125,
0.001522064208984375,
-0.033599853515625,
0.00261688232421875,
0.0323486328125,
-0.0287017822265625,
-0.01947021484375,
0.037933349609375,
0.003520965576171875,
-0.016265869140625,
0.04248046875,
-0.00792694091796875,
-0.0572509765625,
0.045501708984375,
0.0298919677734375,
0.06695556640625,
-0.01325225830078125,
0.015380859375,
0.049102783203125,
0.005352020263671875,
0.0025348663330078125,
0.035430908203125,
-0.0018358230590820312,
-0.03594970703125,
-0.017486572265625,
-0.057098388671875,
-0.021270751953125,
0.004276275634765625,
-0.07403564453125,
0.0252685546875,
-0.053558349609375,
-0.0230865478515625,
-0.004039764404296875,
0.0203399658203125,
-0.048736572265625,
0.0360107421875,
-0.01503753662109375,
0.068359375,
-0.076416015625,
0.055816650390625,
0.062255859375,
-0.0545654296875,
-0.048980712890625,
0.0051116943359375,
0.00714111328125,
-0.03936767578125,
0.0667724609375,
0.0300140380859375,
-0.0038890838623046875,
-0.007427215576171875,
-0.0234375,
-0.04840087890625,
0.06988525390625,
-0.0230560302734375,
-0.01013946533203125,
0.018280029296875,
-0.0021686553955078125,
0.06719970703125,
-0.023956298828125,
0.036956787109375,
0.0261077880859375,
0.0322265625,
-0.012451171875,
-0.058746337890625,
-0.00217437744140625,
-0.040740966796875,
0.00968170166015625,
0.0033321380615234375,
-0.0657958984375,
0.072265625,
-0.0005588531494140625,
0.005702972412109375,
0.019073486328125,
0.061248779296875,
0.023895263671875,
0.0036296844482421875,
0.045684814453125,
0.0692138671875,
0.05194091796875,
-0.03057861328125,
0.07421875,
-0.0250396728515625,
0.039337158203125,
0.05682373046875,
0.00229644775390625,
0.07733154296875,
0.0195770263671875,
-0.03814697265625,
0.07940673828125,
0.04180908203125,
-0.0081024169921875,
0.029998779296875,
-0.006900787353515625,
-0.01468658447265625,
0.0038013458251953125,
0.0100860595703125,
-0.030609130859375,
0.042449951171875,
0.021728515625,
-0.038238525390625,
0.019500732421875,
0.01580810546875,
0.0267791748046875,
0.0027103424072265625,
-0.004253387451171875,
0.0638427734375,
-0.0202789306640625,
-0.052734375,
0.049713134765625,
-0.0003991127014160156,
0.0789794921875,
-0.033599853515625,
0.006786346435546875,
-0.006641387939453125,
0.0367431640625,
-0.0272674560546875,
-0.057037353515625,
0.0265350341796875,
0.0229034423828125,
-0.0244903564453125,
-0.005573272705078125,
0.03973388671875,
-0.02032470703125,
-0.03802490234375,
0.0087127685546875,
0.002696990966796875,
0.009033203125,
0.006633758544921875,
-0.061767578125,
0.0193634033203125,
0.01279449462890625,
-0.04296875,
0.015289306640625,
0.0396728515625,
0.006561279296875,
0.036834716796875,
0.056732177734375,
0.00652313232421875,
0.0107421875,
0.006793975830078125,
0.052978515625,
-0.0438232421875,
-0.0467529296875,
-0.047210693359375,
0.040283203125,
-0.0234375,
-0.054840087890625,
0.065185546875,
0.044158935546875,
0.0609130859375,
-0.01312255859375,
0.07586669921875,
-0.0172119140625,
0.06396484375,
-0.0300750732421875,
0.038665771484375,
-0.037322998046875,
0.00243377685546875,
-0.0303802490234375,
-0.0611572265625,
-0.027862548828125,
0.093017578125,
-0.0310211181640625,
0.0283203125,
0.03271484375,
0.04669189453125,
0.01338958740234375,
0.00885009765625,
-0.019439697265625,
0.029327392578125,
0.023773193359375,
0.033111572265625,
0.03192138671875,
-0.057342529296875,
0.0416259765625,
-0.0304107666015625,
-0.040283203125,
-0.0219879150390625,
-0.046417236328125,
-0.08404541015625,
-0.046600341796875,
-0.0225982666015625,
-0.042144775390625,
-0.0007109642028808594,
0.06768798828125,
0.03759765625,
-0.06475830078125,
-0.01593017578125,
-0.0197601318359375,
-0.0023479461669921875,
-0.00830841064453125,
-0.031463623046875,
0.037567138671875,
-0.037261962890625,
-0.07269287109375,
-0.0159149169921875,
0.0030975341796875,
0.00782012939453125,
-0.037567138671875,
-0.007617950439453125,
-0.0169525146484375,
-0.01161956787109375,
0.0316162109375,
0.00962066650390625,
-0.0572509765625,
-0.0211181640625,
-0.0087738037109375,
-0.0183868408203125,
0.018707275390625,
0.0254974365234375,
-0.0618896484375,
0.0162811279296875,
0.043426513671875,
0.023162841796875,
0.037109375,
0.007015228271484375,
0.015594482421875,
-0.050933837890625,
0.0357666015625,
0.00897216796875,
0.0208740234375,
0.0250091552734375,
-0.024749755859375,
0.032257080078125,
0.030364990234375,
-0.05194091796875,
-0.05841064453125,
0.0013036727905273438,
-0.08746337890625,
-0.0108489990234375,
0.0589599609375,
-0.0296173095703125,
-0.049835205078125,
0.01070404052734375,
-0.038482666015625,
0.0246429443359375,
-0.0218658447265625,
0.059661865234375,
0.0494384765625,
-0.005878448486328125,
-0.017913818359375,
-0.01226043701171875,
0.0438232421875,
0.0389404296875,
-0.046173095703125,
-0.016448974609375,
0.0207672119140625,
0.036163330078125,
0.02801513671875,
0.0369873046875,
-0.019195556640625,
0.0200347900390625,
0.00032401084899902344,
0.0184326171875,
0.01513671875,
-0.016265869140625,
-0.03240966796875,
0.0110931396484375,
-0.004299163818359375,
-0.01340484619140625
]
] |
deepset/deberta-v3-large-squad2 | 2023-09-27T07:59:39.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"question-answering",
"deberta",
"deberta-v3",
"deberta-v3-large",
"en",
"dataset:squad_v2",
"license:cc-by-4.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | question-answering | deepset | null | null | deepset/deberta-v3-large-squad2 | 43 | 13,049 | transformers | 2022-07-25T09:28:55 | ---
language: en
license: cc-by-4.0
tags:
- deberta
- deberta-v3
- deberta-v3-large
datasets:
- squad_v2
model-index:
- name: deepset/deberta-v3-large-squad2
results:
- task:
type: question-answering
name: Question Answering
dataset:
name: squad_v2
type: squad_v2
config: squad_v2
split: validation
metrics:
- type: exact_match
value: 88.0876
name: Exact Match
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZmE0MWEwNjBkNTA1MmU0ZDkyYTA1OGEwNzY3NGE4NWU4NGI0NTQzNjRlNjY1NGRmNDU2MjA0NjU1N2JlZmNhYiIsInZlcnNpb24iOjF9.PnBF_vD0HujNBSShGJzsJnjmiBP_qT8xb2E7ORmpKfNspKXEuN_pBk9iV0IHRzdqOSyllcxlCv93XMPblNjWDw
- type: f1
value: 91.1623
name: F1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZDBkNDUzZmNkNDQwOGRkMmVlZjkxZWVlMzk3NzFmMGIxMTFmMjZlZDcyOWFiMjljNjM5MThlZDM4OWRmNzMwOCIsInZlcnNpb24iOjF9.bacyetziNI2DxO67GWpTyeRPXqF1POkyv00wEHXlyZu71pZngsNpZyrnuj2aJlCqQwHGnF_lT2ysaXKHprQRBg
- task:
type: question-answering
name: Question Answering
dataset:
name: squad
type: squad
config: plain_text
split: validation
metrics:
- type: exact_match
value: 89.2366
name: Exact Match
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMjQ1Yjk3YTdiYTY1NmYxMTI1ZGZlMjRkNTlhZTkyNjRkNjgxYWJiNDk2NzE3NjAyYmY3YmRjNjg4YmEyNDkyYyIsInZlcnNpb24iOjF9.SEWyqX_FPQJOJt2KjOCNgQ2giyVeLj5bmLI5LT_Pfo33tbWPWD09TySYdsthaVTjUGT5DvDzQLASSwBH05FyBw
- type: f1
value: 95.0569
name: F1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2QyODQ1NWVlYjQxMjA0YTgyNmQ2NmIxOWY3MDRmZjE3ZWI5Yjc4ZDE4NzA2YjE2YTE1YTBlNzNiYmNmNzI3NCIsInZlcnNpb24iOjF9.NcXEc9xoggV76w1bQKxuJDYbOTxFzdny2k-85_b6AIMtfpYV3rGR1Z5YF6tVY2jyp7mgm5Jd5YSgGI3NvNE-CQ
- task:
type: question-answering
name: Question Answering
dataset:
name: adversarial_qa
type: adversarial_qa
config: adversarialQA
split: validation
metrics:
- type: exact_match
value: 42.100
name: Exact Match
- type: f1
value: 56.587
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squad_adversarial
type: squad_adversarial
config: AddOneSent
split: validation
metrics:
- type: exact_match
value: 83.548
name: Exact Match
- type: f1
value: 89.385
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts amazon
type: squadshifts
config: amazon
split: test
metrics:
- type: exact_match
value: 72.979
name: Exact Match
- type: f1
value: 87.254
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts new_wiki
type: squadshifts
config: new_wiki
split: test
metrics:
- type: exact_match
value: 83.938
name: Exact Match
- type: f1
value: 92.695
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts nyt
type: squadshifts
config: nyt
split: test
metrics:
- type: exact_match
value: 85.534
name: Exact Match
- type: f1
value: 93.153
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts reddit
type: squadshifts
config: reddit
split: test
metrics:
- type: exact_match
value: 73.284
name: Exact Match
- type: f1
value: 85.307
name: F1
---
# deberta-v3-large for QA
This is the [deberta-v3-large](https://huggingface.co/microsoft/deberta-v3-large) model, fine-tuned using the [SQuAD2.0](https://huggingface.co/datasets/squad_v2) dataset. It's been trained on question-answer pairs, including unanswerable questions, for the task of Question Answering.
## Overview
**Language model:** deberta-v3-large
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** SQuAD 2.0
**Eval data:** SQuAD 2.0
**Code:** See [an example QA pipeline on Haystack](https://haystack.deepset.ai/tutorials/first-qa-system)
**Infrastructure**: 1x NVIDIA A10G
## Hyperparameters
```
batch_size = 2
grad_acc_steps = 32
n_epochs = 6
base_LM_model = "microsoft/deberta-v3-large"
max_seq_len = 512
learning_rate = 7e-6
lr_schedule = LinearWarmup
warmup_proportion = 0.2
doc_stride=128
max_query_length=64
```
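A small per-step batch size with gradient accumulation is typical for fitting deberta-v3-large on a single GPU; the effective batch size seen by the optimizer is the product of the two. A quick sanity check (not part of the original card):

```python
batch_size = 2        # per-step batch size from the hyperparameters above
grad_acc_steps = 32   # gradients accumulated before each optimizer update

# Effective batch size per optimizer step
effective_batch_size = batch_size * grad_acc_steps
print(effective_batch_size)  # 64
```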
## Usage
### In Haystack
Haystack is an NLP framework by deepset. You can use this model in a Haystack pipeline to do question answering at scale (over many documents). To load the model in [Haystack](https://github.com/deepset-ai/haystack/):
```python
from haystack.nodes import FARMReader, TransformersReader  # Haystack v1 import path

reader = FARMReader(model_name_or_path="deepset/deberta-v3-large-squad2")
# or
reader = TransformersReader(model_name_or_path="deepset/deberta-v3-large-squad2", tokenizer="deepset/deberta-v3-large-squad2")
```
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "deepset/deberta-v3-large-squad2"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'Why is model conversion important?',
'context': 'The option to convert models between FARM and transformers gives freedom to the user and let people easily switch between frameworks.'
}
res = nlp(QA_input)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
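Under the hood, a QA pipeline turns the model's start and end logits into an answer span by scoring every valid (start, end) pair and keeping the best one. A minimal sketch of that decoding step on synthetic logits (function name and logits are illustrative, not the pipeline's actual internals):

```python
import torch

def best_span(start_logits, end_logits, max_answer_len=15):
    # Score every (start, end) pair with start <= end <= start + max_answer_len,
    # then pick the pair with the highest combined logit (SQuAD-style decoding).
    scores = start_logits[:, None] + end_logits[None, :]
    valid = torch.triu(torch.ones_like(scores, dtype=torch.bool))  # start <= end
    valid &= ~torch.triu(
        torch.ones_like(scores, dtype=torch.bool), diagonal=max_answer_len + 1
    )  # cap the answer length
    scores = scores.masked_fill(~valid, float("-inf"))
    idx = scores.argmax()
    n = scores.size(1)
    return (idx // n).item(), (idx % n).item()

# Synthetic logits: token 1 is the best start, token 2 the best end
start_logits = torch.tensor([0.1, 3.0, 0.2, 0.0])
end_logits = torch.tensor([0.0, 0.5, 2.5, 0.1])
print(best_span(start_logits, end_logits))  # (1, 2)
```

The real pipeline additionally maps token indices back to character offsets in the context, and for SQuAD 2.0-style models compares the best span score against the no-answer (CLS) score before returning a result.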
## Performance
Evaluated on the SQuAD 2.0 dev set with the [official eval script](https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/).
```
"exact": 87.6105449338836,
"f1": 90.75307008866517,
"total": 11873,
"HasAns_exact": 84.37921727395411,
"HasAns_f1": 90.6732795483674,
"HasAns_total": 5928,
"NoAns_exact": 90.83263246425568,
"NoAns_f1": 90.83263246425568,
"NoAns_total": 5945
```
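As a sanity check, the overall exact-match score is the support-weighted average of the HasAns and NoAns scores reported above:

```python
# Figures copied from the eval output above
has_ans_exact, has_ans_total = 84.37921727395411, 5928
no_ans_exact, no_ans_total = 90.83263246425568, 5945

overall = (has_ans_exact * has_ans_total + no_ans_exact * no_ans_total) / (
    has_ans_total + no_ans_total
)
print(round(overall, 4))  # ≈ 87.6105, matching the reported "exact" score
```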
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://huggingface.co/spaces/deepset/README/resolve/main/haystack-logo-colored.svg" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://huggingface.co/spaces/deepset/README/resolve/main/deepset-logo-colored.svg" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/), which is designed to help you build production-ready NLP systems for question answering, summarization, ranking, and more.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community/join">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
| 8,065 | [
[
-0.03155517578125,
-0.051422119140625,
0.031585693359375,
0.01174163818359375,
-0.0027713775634765625,
0.01183319091796875,
0.0004265308380126953,
-0.03704833984375,
0.030303955078125,
0.01885986328125,
-0.0643310546875,
-0.048492431640625,
-0.02508544921875,
0.006099700927734375,
-0.0271453857421875,
0.06134033203125,
0.004772186279296875,
-0.0029468536376953125,
-0.008544921875,
-0.004032135009765625,
-0.039825439453125,
-0.032989501953125,
-0.05596923828125,
-0.01117706298828125,
0.023101806640625,
0.0283660888671875,
0.054656982421875,
0.022064208984375,
0.03936767578125,
0.0247955322265625,
-0.0049591064453125,
0.0101318359375,
-0.0341796875,
0.0185546875,
-0.00337982177734375,
-0.0297698974609375,
-0.034149169921875,
-0.0016202926635742188,
0.041351318359375,
0.0247650146484375,
-0.0088043212890625,
0.03753662109375,
-0.00911712646484375,
0.05511474609375,
-0.04132080078125,
0.00829315185546875,
-0.050811767578125,
-0.0112457275390625,
0.01776123046875,
0.01497650146484375,
-0.00394439697265625,
-0.0152130126953125,
0.0163421630859375,
-0.047119140625,
0.021484375,
-0.01419830322265625,
0.0889892578125,
0.020843505859375,
-0.006458282470703125,
-0.00859832763671875,
-0.0362548828125,
0.06622314453125,
-0.07366943359375,
0.004413604736328125,
0.040557861328125,
0.0296783447265625,
0.006481170654296875,
-0.062103271484375,
-0.048187255859375,
0.006591796875,
-0.01983642578125,
0.01708984375,
-0.0223541259765625,
-0.0283050537109375,
0.009521484375,
0.021240234375,
-0.05889892578125,
0.01523590087890625,
-0.04022216796875,
0.005062103271484375,
0.06842041015625,
0.0166778564453125,
0.01995849609375,
-0.017364501953125,
-0.020660400390625,
-0.0201873779296875,
-0.035919189453125,
0.01502227783203125,
0.0090789794921875,
0.021331787109375,
-0.0165557861328125,
0.034423828125,
-0.032806396484375,
0.0443115234375,
0.0171356201171875,
0.0330810546875,
0.03656005859375,
-0.04052734375,
-0.0201416015625,
-0.01358795166015625,
0.07330322265625,
0.0341796875,
-0.0006356239318847656,
0.0007829666137695312,
-0.020904541015625,
-0.01146697998046875,
0.017425537109375,
-0.07196044921875,
-0.01000213623046875,
0.0411376953125,
-0.0214996337890625,
-0.0284576416015625,
-0.0005221366882324219,
-0.059722900390625,
-0.024932861328125,
0.0018291473388671875,
0.040313720703125,
-0.0309295654296875,
-0.0299072265625,
0.0242767333984375,
-0.0147247314453125,
0.049163818359375,
0.0165863037109375,
-0.06427001953125,
0.00806427001953125,
0.044525146484375,
0.061370849609375,
0.0166168212890625,
-0.0211181640625,
-0.035736083984375,
-0.01120758056640625,
-0.01149749755859375,
0.0479736328125,
-0.02392578125,
-0.00368499755859375,
0.003589630126953125,
0.019927978515625,
-0.01526641845703125,
-0.0263824462890625,
0.015777587890625,
-0.0443115234375,
0.0291595458984375,
-0.01434326171875,
-0.034881591796875,
-0.019866943359375,
0.032379150390625,
-0.055328369140625,
0.08038330078125,
0.0311737060546875,
-0.042816162109375,
0.0081329345703125,
-0.054962158203125,
-0.0132293701171875,
0.007781982421875,
0.0035343170166015625,
-0.0268096923828125,
-0.02044677734375,
0.02947998046875,
0.04376220703125,
-0.02362060546875,
0.0031833648681640625,
-0.021453857421875,
-0.03424072265625,
0.0196685791015625,
0.0050506591796875,
0.08795166015625,
0.0076904296875,
-0.0330810546875,
-0.0030193328857421875,
-0.05291748046875,
0.021697998046875,
0.0196990966796875,
-0.0109710693359375,
-0.002132415771484375,
-0.0093231201171875,
0.00435638427734375,
0.030517578125,
0.040374755859375,
-0.034027099609375,
0.01456451416015625,
-0.0465087890625,
0.049285888671875,
0.04339599609375,
-0.0029296875,
0.0253143310546875,
-0.025054931640625,
0.051788330078125,
-0.004383087158203125,
0.01294708251953125,
0.013275146484375,
-0.0271148681640625,
-0.0604248046875,
-0.0153045654296875,
0.0289764404296875,
0.05364990234375,
-0.053924560546875,
0.057647705078125,
-0.0137176513671875,
-0.05084228515625,
-0.06170654296875,
0.00994110107421875,
0.0191802978515625,
0.0236358642578125,
0.03826904296875,
0.000873565673828125,
-0.056732177734375,
-0.079345703125,
-0.0012483596801757812,
-0.016021728515625,
-0.017822265625,
0.0142059326171875,
0.057098388671875,
-0.03302001953125,
0.06573486328125,
-0.0531005859375,
-0.0255584716796875,
-0.021453857421875,
-0.0131988525390625,
0.038604736328125,
0.051483154296875,
0.05401611328125,
-0.05908203125,
-0.03704833984375,
-0.0206298828125,
-0.0550537109375,
0.020050048828125,
-0.0015878677368164062,
-0.0259246826171875,
0.01324462890625,
0.028564453125,
-0.055755615234375,
0.0221710205078125,
0.0440673828125,
-0.04730224609375,
0.0211181640625,
0.0016088485717773438,
0.0081939697265625,
-0.10675048828125,
0.022613525390625,
0.0008349418640136719,
-0.0166168212890625,
-0.033416748046875,
0.014404296875,
-0.0142669677734375,
-0.0026988983154296875,
-0.035888671875,
0.048126220703125,
-0.0199737548828125,
0.01116943359375,
0.01013946533203125,
0.0217437744140625,
0.0235595703125,
0.04034423828125,
-0.01181793212890625,
0.0679931640625,
0.045745849609375,
-0.033782958984375,
0.04779052734375,
0.049346923828125,
-0.034393310546875,
0.0241546630859375,
-0.07177734375,
0.0098114013671875,
-0.00165557861328125,
0.01983642578125,
-0.07586669921875,
-0.0190277099609375,
0.018524169921875,
-0.05224609375,
0.00732421875,
-0.00496673583984375,
-0.049530029296875,
-0.03973388671875,
-0.04290771484375,
0.0186767578125,
0.059539794921875,
-0.0302886962890625,
0.0208892822265625,
0.03271484375,
-0.003154754638671875,
-0.046142578125,
-0.06561279296875,
-0.0028667449951171875,
-0.00846099853515625,
-0.054473876953125,
0.01983642578125,
-0.00971221923828125,
-0.0102691650390625,
0.004978179931640625,
0.0030536651611328125,
-0.0384521484375,
0.0209503173828125,
0.0125274658203125,
0.03350830078125,
-0.023895263671875,
0.022125244140625,
-0.01526641845703125,
-0.00203704833984375,
0.0017147064208984375,
-0.0178680419921875,
0.048980712890625,
-0.04541015625,
0.0021610260009765625,
-0.041717529296875,
0.026458740234375,
0.037933349609375,
-0.0278167724609375,
0.070068359375,
0.05755615234375,
-0.028076171875,
-0.0009183883666992188,
-0.039764404296875,
-0.0169525146484375,
-0.03485107421875,
0.03228759765625,
-0.0165863037109375,
-0.06488037109375,
0.04998779296875,
0.0269927978515625,
0.014739990234375,
0.0780029296875,
0.03289794921875,
-0.0240020751953125,
0.08148193359375,
0.03399658203125,
-0.0023345947265625,
0.02752685546875,
-0.06524658203125,
0.0007467269897460938,
-0.07403564453125,
-0.0177154541015625,
-0.040771484375,
-0.03985595703125,
-0.044464111328125,
-0.032196044921875,
0.011627197265625,
0.01454925537109375,
-0.03582763671875,
0.042083740234375,
-0.0587158203125,
0.034088134765625,
0.047637939453125,
0.012542724609375,
0.001148223876953125,
-0.00850677490234375,
0.02099609375,
0.0164031982421875,
-0.053314208984375,
-0.03277587890625,
0.07574462890625,
0.018310546875,
0.03741455078125,
0.01337432861328125,
0.05926513671875,
0.009124755859375,
-0.0146942138671875,
-0.05047607421875,
0.04010009765625,
-0.007205963134765625,
-0.0665283203125,
-0.039031982421875,
-0.02642822265625,
-0.083740234375,
0.00852203369140625,
-0.020050048828125,
-0.05682373046875,
0.0212860107421875,
-0.0012416839599609375,
-0.048431396484375,
0.01788330078125,
-0.047149658203125,
0.07012939453125,
-0.0032806396484375,
-0.0206146240234375,
-0.01262664794921875,
-0.05853271484375,
0.01666259765625,
0.0020427703857421875,
-0.002185821533203125,
-0.017425537109375,
-0.0037517547607421875,
0.059356689453125,
-0.037139892578125,
0.0655517578125,
-0.00858306884765625,
-0.00817108154296875,
0.036712646484375,
-0.0016536712646484375,
0.0343017578125,
0.0234375,
-0.0266876220703125,
0.0179443359375,
0.034210205078125,
-0.04217529296875,
-0.042083740234375,
0.05804443359375,
-0.0679931640625,
-0.0302276611328125,
-0.035491943359375,
-0.03167724609375,
-0.00928497314453125,
0.02178955078125,
0.015167236328125,
0.0287322998046875,
-0.01495361328125,
0.0404052734375,
0.043792724609375,
-0.01314544677734375,
0.034027099609375,
0.03607177734375,
-0.0122528076171875,
-0.0245513916015625,
0.0552978515625,
-0.00262451171875,
0.00856781005859375,
0.0318603515625,
0.004627227783203125,
-0.037139892578125,
-0.026641845703125,
-0.040557861328125,
0.0182952880859375,
-0.0421142578125,
-0.0311737060546875,
-0.048187255859375,
-0.034820556640625,
-0.053375244140625,
-0.00554656982421875,
-0.0272369384765625,
-0.04754638671875,
-0.036956787109375,
-0.00490570068359375,
0.054656982421875,
0.042724609375,
-0.0031890869140625,
0.015625,
-0.055633544921875,
0.0257720947265625,
0.0360107421875,
0.028076171875,
-0.00995635986328125,
-0.039764404296875,
-0.018280029296875,
0.034912109375,
-0.00864410400390625,
-0.050323486328125,
0.0235595703125,
0.004047393798828125,
0.020050048828125,
-0.0095672607421875,
0.0070343017578125,
0.040496826171875,
-0.024017333984375,
0.065185546875,
0.0086822509765625,
-0.059173583984375,
0.0467529296875,
-0.0286407470703125,
0.03656005859375,
0.07647705078125,
0.0213623046875,
-0.043731689453125,
-0.0183563232421875,
-0.05859375,
-0.0731201171875,
0.051788330078125,
0.0293121337890625,
0.01342010498046875,
-0.0052947998046875,
0.0229339599609375,
-0.011444091796875,
0.0190887451171875,
-0.0364990234375,
-0.0173187255859375,
-0.01457977294921875,
-0.0216064453125,
-0.004863739013671875,
-0.0093536376953125,
-0.011505126953125,
-0.032501220703125,
0.07080078125,
-0.005023956298828125,
0.00991058349609375,
0.025177001953125,
-0.01030731201171875,
0.0134124755859375,
0.01074981689453125,
0.036651611328125,
0.06689453125,
-0.030303955078125,
-0.0208282470703125,
0.0207977294921875,
-0.026580810546875,
-0.00017642974853515625,
0.0216064453125,
-0.029266357421875,
0.00804901123046875,
0.0228118896484375,
0.05950927734375,
0.0033321380615234375,
-0.051239013671875,
0.04949951171875,
-0.010894775390625,
-0.03466796875,
-0.0423583984375,
0.0086669921875,
0.01505279541015625,
0.03515625,
0.0282135009765625,
-0.01511383056640625,
0.00954437255859375,
-0.034149169921875,
0.0100250244140625,
0.039642333984375,
-0.032135009765625,
-0.008270263671875,
0.03680419921875,
0.025177001953125,
-0.0233612060546875,
0.05328369140625,
-0.0134124755859375,
-0.04473876953125,
0.07000732421875,
0.0228729248046875,
0.07476806640625,
0.00588226318359375,
0.032196044921875,
0.04998779296875,
0.022430419921875,
-0.00115966796875,
0.021759033203125,
0.01210784912109375,
-0.037261962890625,
-0.0266876220703125,
-0.056915283203125,
-0.012054443359375,
0.0272064208984375,
-0.0528564453125,
0.0161285400390625,
-0.032958984375,
-0.00653839111328125,
-0.0018939971923828125,
0.023529052734375,
-0.07177734375,
0.0196685791015625,
-0.01082611083984375,
0.062347412109375,
-0.038848876953125,
0.03131103515625,
0.062255859375,
-0.053558349609375,
-0.06591796875,
-0.0106048583984375,
-0.0213623046875,
-0.07708740234375,
0.040924072265625,
0.0204620361328125,
-0.00891876220703125,
0.0203857421875,
-0.05810546875,
-0.0714111328125,
0.099609375,
0.004100799560546875,
-0.038330078125,
-0.0215606689453125,
-0.004947662353515625,
0.040740966796875,
-0.0232696533203125,
0.0208892822265625,
0.04388427734375,
0.0311279296875,
0.0062713623046875,
-0.060760498046875,
0.025177001953125,
-0.026214599609375,
-0.0032863616943359375,
0.00551605224609375,
-0.065673828125,
0.0618896484375,
-0.0087127685546875,
-0.0097198486328125,
0.0190277099609375,
0.034759521484375,
0.01413726806640625,
0.0027923583984375,
0.03802490234375,
0.0419921875,
0.054931640625,
0.0017786026000976562,
0.0679931640625,
-0.014068603515625,
0.053070068359375,
0.08135986328125,
-0.0103607177734375,
0.07373046875,
0.026580810546875,
-0.023651123046875,
0.05609130859375,
0.044921875,
-0.0268707275390625,
0.022491455078125,
0.02081298828125,
-0.00566864013671875,
-0.0268402099609375,
0.006076812744140625,
-0.048370361328125,
0.045196533203125,
0.00501251220703125,
-0.0213623046875,
-0.0159759521484375,
-0.0270843505859375,
-0.00862884521484375,
-0.0028591156005859375,
-0.009674072265625,
0.061767578125,
-0.00736236572265625,
-0.04266357421875,
0.07476806640625,
-0.01373291015625,
0.04962158203125,
-0.048431396484375,
-0.005237579345703125,
-0.0168609619140625,
0.01409149169921875,
-0.01290130615234375,
-0.06707763671875,
0.01031494140625,
-0.0016927719116210938,
-0.028717041015625,
-0.00711822509765625,
0.038330078125,
-0.03173828125,
-0.0618896484375,
0.009765625,
0.041656494140625,
0.0167999267578125,
-0.00942230224609375,
-0.07379150390625,
-0.00745391845703125,
-0.0022602081298828125,
-0.0259246826171875,
0.01146697998046875,
0.0248565673828125,
0.021484375,
0.046539306640625,
0.0457763671875,
-0.004146575927734375,
-0.00818634033203125,
-0.0036411285400390625,
0.06719970703125,
-0.055816650390625,
-0.0240020751953125,
-0.06219482421875,
0.045867919921875,
-0.0256500244140625,
-0.0311737060546875,
0.050628662109375,
0.0499267578125,
0.05889892578125,
-0.0118255615234375,
0.05364990234375,
-0.0204315185546875,
0.045806884765625,
-0.034423828125,
0.0693359375,
-0.06573486328125,
0.0016145706176757812,
-0.002681732177734375,
-0.054107666015625,
-0.0098876953125,
0.054779052734375,
-0.004840850830078125,
0.00396728515625,
0.04949951171875,
0.060577392578125,
0.0048980712890625,
-0.0216522216796875,
-0.0011243820190429688,
0.021575927734375,
0.018463134765625,
0.06915283203125,
0.054290771484375,
-0.06158447265625,
0.050048828125,
-0.0212860107421875,
-0.005123138427734375,
-0.021942138671875,
-0.043731689453125,
-0.06951904296875,
-0.0498046875,
-0.022369384765625,
-0.053558349609375,
-0.004024505615234375,
0.0650634765625,
0.0545654296875,
-0.059539794921875,
-0.0068511962890625,
-0.00307464599609375,
0.01149749755859375,
-0.01525115966796875,
-0.0229034423828125,
0.03350830078125,
-0.02105712890625,
-0.05694580078125,
0.0235748291015625,
-0.006565093994140625,
0.00408935546875,
-0.0209197998046875,
0.005889892578125,
-0.055145263671875,
-0.0062408447265625,
0.04010009765625,
0.0287322998046875,
-0.0433349609375,
-0.01253509521484375,
0.0120086669921875,
-0.0196685791015625,
-0.0007767677307128906,
0.0259246826171875,
-0.07159423828125,
0.0115509033203125,
0.048828125,
0.055908203125,
0.03973388671875,
0.0003314018249511719,
0.037750244140625,
-0.043243408203125,
0.01154327392578125,
0.032989501953125,
0.016082763671875,
0.02227783203125,
-0.0367431640625,
0.061553955078125,
-0.002132415771484375,
-0.040924072265625,
-0.06298828125,
0.0006403923034667969,
-0.06854248046875,
-0.029632568359375,
0.0926513671875,
0.000965118408203125,
-0.0208892822265625,
0.01155853271484375,
-0.0173797607421875,
0.01558685302734375,
-0.032806396484375,
0.051116943359375,
0.051666259765625,
0.00888824462890625,
0.0160675048828125,
-0.05194091796875,
0.03643798828125,
0.039459228515625,
-0.06573486328125,
-0.00649261474609375,
0.034423828125,
0.0211944580078125,
0.015899658203125,
0.04718017578125,
0.0100555419921875,
0.029815673828125,
-0.01453399658203125,
0.0023365020751953125,
-0.00403594970703125,
-0.0030956268310546875,
-0.0295562744140625,
-0.00736236572265625,
-0.0191650390625,
-0.0362548828125
]
] |
damo-vilab/modelscope-damo-text-to-video-synthesis | 2023-03-29T02:40:04.000Z | [
"open_clip",
"text-to-video",
"license:cc-by-nc-4.0",
"has_space",
"region:us"
] | text-to-video | damo-vilab | null | null | damo-vilab/modelscope-damo-text-to-video-synthesis | 390 | 13,045 | open_clip | 2023-03-19T10:27:15 | ---
license: cc-by-nc-4.0
pipeline_tag: text-to-video
---
The original repo is [here](https://modelscope.cn/models/damo/text-to-video-synthesis/summary).
**We Are Hiring!** (Based in Beijing / Hangzhou, China.)
If you're looking for an exciting challenge and the opportunity to work with cutting-edge technologies in AIGC and large-scale pretraining, then we are the place for you. We are looking for talented, motivated and creative individuals to join our team. If you are interested, please send your CV to us.
EMAIL: yingya.zyy@alibaba-inc.com
This model is based on a multi-stage text-to-video generation diffusion model, which takes a text description as input and returns a video that matches the description. Only English input is supported.
## Model Description
The text-to-video generation diffusion model consists of three sub-networks: text feature extraction, a text-feature-to-video latent-space diffusion model, and a video-latent-space-to-video-visual-space decoder. The overall model has about 1.7 billion parameters and supports English input. The diffusion model adopts a UNet3D structure and generates video through an iterative denoising process starting from pure Gaussian noise.
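The iterative denoising described above can be sketched schematically. The toy scalar loop below is only an illustration of the sampling idea, not the actual UNet3D sampler; `eps_model` stands in for the learned noise predictor and the update rule is simplified:

```python
def denoise(x, steps, eps_model):
    """Iteratively refine an initial noise sample toward a clean sample,
    in the spirit of diffusion sampling (highly simplified sketch)."""
    for t in range(steps, 0, -1):
        eps = eps_model(x, t)   # predicted noise at step t
        x = x - eps / steps     # illustrative update rule, not the real one
    return x

# Toy run: a "model" that always predicts half the current value as noise.
sample = denoise(x=1.0, steps=10, eps_model=lambda x, t: 0.5 * x)
```

Each step shrinks the sample by a factor of 0.95 here; a real sampler would instead follow the learned reverse-diffusion schedule.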
**This model is meant for research purposes. Please look at the [model limitations and biases](#model-limitations-and-biases) and [misuse, malicious use and excessive use](#misuse-malicious-use-and-excessive-use) sections.**
**How the model is expected to be used and where it is applicable**
This model has a wide range of applications and can generate videos from arbitrary English text descriptions.
## How to use
The model has been launched on [ModelScope Studio](https://modelscope.cn/studios/damo/text-to-video-synthesis/summary) and [Hugging Face](https://huggingface.co/spaces/damo-vilab/modelscope-text-to-video-synthesis), where you can try it directly; you can also refer to the [Colab page](https://colab.research.google.com/drive/1uW1ZqswkQ9Z9bp5Nbo5z59cAn7I0hE6R?usp=sharing#scrollTo=bSluBq99ObSk) to build it yourself.
To get started with the model quickly, users can also refer to the [Aliyun Notebook Tutorial](https://modelscope.cn/headlines/detail/26).
This demo requires about 16 GB of CPU RAM and 16 GB of GPU RAM. Under the ModelScope framework, the model can be used by calling a simple pipeline, where the input must be a dictionary whose only valid key is `'text'`, mapped to a short text. The model currently supports inference only on GPU. A code example follows:
### Operating environment (Python Package)
```
pip install modelscope==1.4.2
pip install open_clip_torch
pip install pytorch-lightning
```
### Code example (Demo Code)
```python
from huggingface_hub import snapshot_download
from modelscope.pipelines import pipeline
from modelscope.outputs import OutputKeys
import pathlib
model_dir = pathlib.Path('weights')
snapshot_download('damo-vilab/modelscope-damo-text-to-video-synthesis',
repo_type='model', local_dir=model_dir)
pipe = pipeline('text-to-video-synthesis', model_dir.as_posix())
test_text = {
'text': 'A panda eating bamboo on a rock.',
}
output_video_path = pipe(test_text,)[OutputKeys.OUTPUT_VIDEO]
print('output_video_path:', output_video_path)
```
### View results
The above code will print the save path of the output video. The output mp4 file can be played with [VLC media player](https://www.videolan.org/vlc/); some other media players may not play it correctly.
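If your player cannot handle the output encoding, re-encoding with ffmpeg usually helps. The helper below only builds such a command; the H.264/`yuv420p` settings and the availability of `ffmpeg` on your `PATH` are assumptions, not part of this model card:

```python
import subprocess

def build_reencode_cmd(src, dst):
    # Re-encode to H.264 with yuv420p pixel format, which most
    # media players handle (assumed, widely compatible settings).
    return ["ffmpeg", "-y", "-i", src,
            "-c:v", "libx264", "-pix_fmt", "yuv420p", dst]

cmd = build_reencode_cmd("output.mp4", "output_h264.mp4")
# To actually run it (requires ffmpeg to be installed):
# subprocess.run(cmd, check=True)
```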
## Model limitations and biases
* The model is trained on public datasets such as WebVid, so the generated results may exhibit biases tied to the distribution of the training data.
* The model cannot yet achieve film- or television-quality generation.
* The model cannot generate clear text.
* The model is mainly trained on an English corpus and does not support other languages at the moment.
* The model's performance on complex compositional generation tasks still needs improvement.
## Misuse, Malicious Use and Excessive Use
* The model was not trained to realistically represent people or events, so using it to generate such content is beyond the model's capabilities.
* It is prohibited to generate content that is demeaning or harmful to people or their environment, culture, religion, etc.
* It is prohibited to generate pornographic, violent, or gory content.
* It is prohibited to generate erroneous or false information.
## Training data
The training data includes [LAION5B](https://huggingface.co/datasets/laion/laion2B-en), [ImageNet](https://www.image-net.org/), [Webvid](https://m-bain.github.io/webvid-dataset/) and other public datasets. After pre-training, images and videos are filtered by aesthetic score and watermark score, and deduplicated.
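A filtering step like the one described can be sketched as follows; the thresholds, field names, and the use of a content hash for deduplication are invented for illustration and are not the actual pipeline:

```python
def filter_samples(samples, min_aesthetic=0.5, max_watermark=0.3):
    """Keep samples passing hypothetical aesthetic / watermark
    thresholds, then deduplicate by content hash."""
    seen = set()
    kept = []
    for s in samples:
        if s["aesthetic"] < min_aesthetic or s["watermark"] > max_watermark:
            continue                 # fails a quality filter
        if s["hash"] in seen:
            continue                 # duplicate content
        seen.add(s["hash"])
        kept.append(s)
    return kept

data = [
    {"hash": "a", "aesthetic": 0.9, "watermark": 0.1},
    {"hash": "a", "aesthetic": 0.9, "watermark": 0.1},  # duplicate
    {"hash": "b", "aesthetic": 0.2, "watermark": 0.1},  # low aesthetic
]
print(len(filter_samples(data)))  # → 1
```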
## Citation
```bibtex
@InProceedings{VideoFusion,
author = {Luo, Zhengxiong and Chen, Dayou and Zhang, Yingya and Huang, Yan and Wang, Liang and Shen, Yujun and Zhao, Deli and Zhou, Jingren and Tan, Tieniu},
title = {VideoFusion: Decomposed Diffusion Models for High-Quality Video Generation},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2023}
}
```
| 5,489 | [
[
-0.0287628173828125,
-0.062225341796875,
0.01403045654296875,
0.015716552734375,
-0.027252197265625,
-0.0206146240234375,
-0.003875732421875,
-0.0030059814453125,
-0.013214111328125,
0.033935546875,
-0.0469970703125,
-0.040374755859375,
-0.050994873046875,
-0.007663726806640625,
-0.05035400390625,
0.08514404296875,
-0.015167236328125,
0.0063629150390625,
-0.006679534912109375,
0.00989532470703125,
-0.021636962890625,
-0.013397216796875,
-0.03857421875,
0.00763702392578125,
-0.00437164306640625,
0.0198516845703125,
0.0209197998046875,
0.044464111328125,
0.041717529296875,
0.0243988037109375,
-0.01198577880859375,
0.01427459716796875,
-0.03240966796875,
-0.0025997161865234375,
0.001865386962890625,
-0.005214691162109375,
-0.034912109375,
0.007083892822265625,
0.061187744140625,
0.015838623046875,
-0.00846099853515625,
0.0302734375,
0.01137542724609375,
0.044464111328125,
-0.04534912109375,
-0.0034389495849609375,
-0.042327880859375,
0.0055389404296875,
0.0005660057067871094,
0.00308990478515625,
-0.0178985595703125,
-0.01325225830078125,
0.0079498291015625,
-0.046722412109375,
0.046417236328125,
-0.002590179443359375,
0.09161376953125,
0.01538848876953125,
-0.0286407470703125,
0.0108489990234375,
-0.05548095703125,
0.04278564453125,
-0.06793212890625,
0.03094482421875,
0.0233306884765625,
0.0221099853515625,
0.0139312744140625,
-0.07586669921875,
-0.04815673828125,
-0.0081634521484375,
0.0218505859375,
0.03369140625,
-0.006595611572265625,
0.00664520263671875,
0.022735595703125,
0.0299224853515625,
-0.045928955078125,
-0.01392364501953125,
-0.04241943359375,
-0.01375579833984375,
0.050750732421875,
0.00917816162109375,
0.029510498046875,
-0.040496826171875,
-0.0225830078125,
-0.0267181396484375,
-0.0226898193359375,
0.005214691162109375,
0.039947509765625,
-0.0041656494140625,
-0.050506591796875,
0.039764404296875,
-0.00983428955078125,
0.039886474609375,
0.0005631446838378906,
-0.0218505859375,
0.01012420654296875,
-0.035980224609375,
-0.044921875,
-0.0088958740234375,
0.08245849609375,
0.037384033203125,
-0.007568359375,
0.0281524658203125,
-0.0006256103515625,
0.0030059814453125,
0.01416778564453125,
-0.08624267578125,
-0.0193634033203125,
0.01001739501953125,
-0.03839111328125,
-0.039276123046875,
0.007415771484375,
-0.057952880859375,
0.0011348724365234375,
-0.0027675628662109375,
0.047454833984375,
-0.0233917236328125,
-0.0257568359375,
-0.004749298095703125,
-0.0176849365234375,
0.0003104209899902344,
0.025238037109375,
-0.06793212890625,
0.01430511474609375,
0.01145172119140625,
0.07366943359375,
-0.001544952392578125,
-0.03302001953125,
-0.031951904296875,
0.01287841796875,
-0.0249176025390625,
0.03277587890625,
-0.0024929046630859375,
-0.0235748291015625,
0.007232666015625,
0.00827789306640625,
0.0131683349609375,
-0.03814697265625,
0.0462646484375,
-0.020477294921875,
0.0176849365234375,
0.00494384765625,
-0.0328369140625,
-0.00296783447265625,
0.011383056640625,
-0.03839111328125,
0.0667724609375,
-0.00482177734375,
-0.07196044921875,
0.009796142578125,
-0.032470703125,
-0.025543212890625,
-0.02618408203125,
0.002532958984375,
-0.053466796875,
-0.0169219970703125,
0.023651123046875,
0.02886962890625,
-0.02545166015625,
0.0469970703125,
-0.008544921875,
-0.01568603515625,
0.021636962890625,
-0.051239013671875,
0.061309814453125,
0.03680419921875,
-0.0234222412109375,
0.00986480712890625,
-0.06927490234375,
-0.0273895263671875,
-0.0002944469451904297,
-0.017913818359375,
0.00859832763671875,
-0.0007739067077636719,
0.0257415771484375,
0.0178375244140625,
-0.00045371055603027344,
-0.032623291015625,
0.0038089752197265625,
-0.035430908203125,
0.04296875,
0.052459716796875,
0.0019931793212890625,
0.044158935546875,
-0.0031185150146484375,
0.0439453125,
0.0136566162109375,
0.025390625,
-0.01690673828125,
-0.040313720703125,
-0.061279296875,
0.001583099365234375,
0.0041656494140625,
0.034393310546875,
-0.053131103515625,
0.0146026611328125,
0.0015802383422851562,
-0.043212890625,
-0.033782958984375,
0.00952911376953125,
0.0262298583984375,
0.0616455078125,
0.020355224609375,
-0.0318603515625,
-0.05828857421875,
-0.0545654296875,
0.008087158203125,
-0.017669677734375,
-0.0038776397705078125,
0.02117919921875,
0.051483154296875,
-0.016326904296875,
0.0562744140625,
-0.047271728515625,
-0.037689208984375,
-0.0238037109375,
0.00629425048828125,
0.017059326171875,
0.035675048828125,
0.03265380859375,
-0.07281494140625,
-0.03839111328125,
-0.005359649658203125,
-0.0767822265625,
-0.005153656005859375,
-0.00042366981506347656,
-0.008514404296875,
0.005855560302734375,
0.0384521484375,
-0.050140380859375,
0.03369140625,
0.059844970703125,
-0.00704193115234375,
0.051788330078125,
-0.0291900634765625,
0.0273895263671875,
-0.1063232421875,
0.003353118896484375,
0.01186370849609375,
-0.012359619140625,
-0.038238525390625,
0.0029735565185546875,
-0.0184173583984375,
-0.03497314453125,
-0.05908203125,
0.04425048828125,
-0.0245208740234375,
0.007572174072265625,
-0.021759033203125,
0.0172271728515625,
0.006916046142578125,
0.035675048828125,
0.00678253173828125,
0.048004150390625,
0.0604248046875,
-0.0623779296875,
0.036163330078125,
0.035003662109375,
-0.03594970703125,
0.0287628173828125,
-0.061187744140625,
-0.020904541015625,
-0.0089111328125,
0.0156402587890625,
-0.07720947265625,
-0.04144287109375,
0.02178955078125,
-0.053802490234375,
0.0074920654296875,
-0.0257568359375,
-0.0187225341796875,
-0.021820068359375,
-0.01450347900390625,
0.035064697265625,
0.057647705078125,
-0.019378662109375,
0.0283355712890625,
0.0310516357421875,
-0.00453948974609375,
-0.04095458984375,
-0.061431884765625,
-0.001163482666015625,
-0.02423095703125,
-0.06622314453125,
0.05224609375,
-0.01000213623046875,
-0.004383087158203125,
-0.005664825439453125,
0.01416778564453125,
-0.0207061767578125,
-0.02227783203125,
0.022735595703125,
0.02606201171875,
0.00787353515625,
-0.01412200927734375,
0.006092071533203125,
0.01219940185546875,
-0.0008144378662109375,
-0.01441192626953125,
0.016693115234375,
0.019134521484375,
-0.0017499923706054688,
-0.048828125,
0.02392578125,
0.0203399658203125,
-0.0165863037109375,
0.038543701171875,
0.0855712890625,
-0.02996826171875,
-0.0112152099609375,
-0.0296630859375,
-0.013397216796875,
-0.0384521484375,
0.056732177734375,
0.0029468536376953125,
-0.048004150390625,
0.03131103515625,
0.0130615234375,
-0.003875732421875,
0.0455322265625,
0.05242919921875,
-0.01538848876953125,
0.0819091796875,
0.042144775390625,
0.0207672119140625,
0.054718017578125,
-0.06378173828125,
-0.0110626220703125,
-0.061431884765625,
-0.0159454345703125,
-0.01241302490234375,
0.0036792755126953125,
-0.046844482421875,
-0.0286407470703125,
0.04693603515625,
0.0008392333984375,
-0.036407470703125,
0.0218505859375,
-0.043975830078125,
0.0029449462890625,
0.0307159423828125,
0.01377105712890625,
0.00870513916015625,
0.01007843017578125,
-0.0029811859130859375,
-0.020782470703125,
-0.047088623046875,
-0.0196075439453125,
0.06951904296875,
0.0207061767578125,
0.05816650390625,
0.01137542724609375,
0.0310516357421875,
0.0204010009765625,
-0.001918792724609375,
-0.032623291015625,
0.050323486328125,
-0.024871826171875,
-0.05078125,
-0.01230621337890625,
-0.0275115966796875,
-0.052001953125,
0.0089111328125,
-0.0246429443359375,
-0.0426025390625,
-0.00601959228515625,
0.016143798828125,
-0.0101318359375,
0.03082275390625,
-0.060333251953125,
0.0826416015625,
-0.0086517333984375,
-0.059234619140625,
-0.00254058837890625,
-0.052825927734375,
0.039764404296875,
0.01508331298828125,
0.035430908203125,
0.01233673095703125,
0.02545166015625,
0.0697021484375,
-0.051666259765625,
0.056915283203125,
-0.0126190185546875,
0.013580322265625,
0.049285888671875,
-0.001422882080078125,
0.0230560302734375,
0.01971435546875,
0.01126861572265625,
0.00382232666015625,
-0.0113677978515625,
-0.001605987548828125,
-0.025848388671875,
0.055023193359375,
-0.0670166015625,
-0.04205322265625,
-0.0256500244140625,
-0.03485107421875,
0.0175018310546875,
0.0196533203125,
0.05950927734375,
0.03521728515625,
-0.0237884521484375,
0.0075836181640625,
0.0374755859375,
-0.035369873046875,
0.05035400390625,
0.00681304931640625,
-0.0294952392578125,
-0.060089111328125,
0.064208984375,
0.0015726089477539062,
0.02471923828125,
0.0251617431640625,
0.0115509033203125,
-0.00630950927734375,
-0.039886474609375,
-0.047454833984375,
0.0108795166015625,
-0.04248046875,
-0.009613037109375,
-0.03912353515625,
-0.034820556640625,
-0.06170654296875,
0.003322601318359375,
-0.04437255859375,
-0.004436492919921875,
-0.038330078125,
-0.01410675048828125,
0.0313720703125,
0.033599853515625,
-0.001598358154296875,
0.0325927734375,
-0.06097412109375,
0.039794921875,
0.0267333984375,
0.01393890380859375,
-0.00193023681640625,
-0.07220458984375,
-0.047149658203125,
0.005367279052734375,
-0.06903076171875,
-0.06451416015625,
0.068115234375,
0.019561767578125,
0.0152130126953125,
0.031951904296875,
-0.017120361328125,
0.05767822265625,
-0.03375244140625,
0.08038330078125,
0.01922607421875,
-0.06365966796875,
0.060577392578125,
-0.036041259765625,
0.0278472900390625,
0.01396942138671875,
0.027557373046875,
-0.026947021484375,
-0.0188751220703125,
-0.04364013671875,
-0.0704345703125,
0.0628662109375,
0.05224609375,
0.01416778564453125,
0.0219879150390625,
0.0196685791015625,
-0.01419830322265625,
0.0115814208984375,
-0.0750732421875,
-0.015472412109375,
-0.048095703125,
0.002719879150390625,
0.00177001953125,
-0.0097808837890625,
0.001270294189453125,
-0.035980224609375,
0.07183837890625,
0.0080108642578125,
0.0281524658203125,
0.038604736328125,
-0.003173828125,
-0.00849151611328125,
0.0013875961303710938,
0.033935546875,
0.0172882080078125,
-0.0322265625,
-0.01137542724609375,
-0.004024505615234375,
-0.050018310546875,
0.020111083984375,
0.0180511474609375,
-0.038726806640625,
0.01326751708984375,
0.01117706298828125,
0.07867431640625,
0.0005140304565429688,
-0.04815673828125,
0.05029296875,
0.0019378662109375,
-0.032867431640625,
-0.040374755859375,
-0.0006260871887207031,
0.00732421875,
0.013397216796875,
0.0240478515625,
0.01456451416015625,
0.0269622802734375,
-0.04302978515625,
0.0087432861328125,
0.0204620361328125,
-0.02471923828125,
-0.0260009765625,
0.10968017578125,
0.00910186767578125,
-0.028228759765625,
0.04302978515625,
-0.0145416259765625,
-0.019927978515625,
0.054779052734375,
0.036895751953125,
0.06231689453125,
-0.004520416259765625,
0.0384521484375,
0.056915283203125,
0.0059356689453125,
-0.0213165283203125,
0.00035858154296875,
0.004486083984375,
-0.04974365234375,
-0.03179931640625,
-0.040374755859375,
-0.0172882080078125,
0.00965118408203125,
-0.040313720703125,
0.052459716796875,
-0.031158447265625,
-0.0238189697265625,
-0.0014438629150390625,
-0.0170135498046875,
-0.036773681640625,
0.047607421875,
0.01271820068359375,
0.057037353515625,
-0.0711669921875,
0.0682373046875,
0.036773681640625,
-0.06494140625,
-0.0654296875,
-0.0235748291015625,
-0.00928497314453125,
-0.01568603515625,
0.04437255859375,
0.01123046875,
0.01123046875,
-0.00470733642578125,
-0.053436279296875,
-0.042449951171875,
0.0888671875,
0.0443115234375,
-0.0241546630859375,
-0.031494140625,
0.0082244873046875,
0.0469970703125,
-0.02734375,
0.025909423828125,
0.00931549072265625,
0.0162353515625,
0.0200347900390625,
-0.053070068359375,
-0.021697998046875,
-0.0267791748046875,
0.0112152099609375,
-0.0206146240234375,
-0.0638427734375,
0.075927734375,
-0.0211334228515625,
-0.001567840576171875,
0.016326904296875,
0.049591064453125,
0.01666259765625,
0.0321044921875,
0.01016998291015625,
0.0474853515625,
0.006557464599609375,
0.002353668212890625,
0.07373046875,
-0.0007481575012207031,
0.0435791015625,
0.06964111328125,
0.00787353515625,
0.050445556640625,
0.03466796875,
-0.0252838134765625,
0.05364990234375,
0.048736572265625,
-0.0196990966796875,
0.048797607421875,
-0.01446533203125,
-0.0038242340087890625,
-0.01248931884765625,
-0.01363372802734375,
-0.0209197998046875,
0.026519775390625,
0.0034008026123046875,
-0.055450439453125,
0.01125335693359375,
0.01739501953125,
-0.0152587890625,
0.002925872802734375,
-0.01418304443359375,
0.039947509765625,
0.00960540771484375,
-0.051300048828125,
0.0452880859375,
0.006130218505859375,
0.0509033203125,
-0.056060791015625,
-0.007694244384765625,
-0.016448974609375,
0.01282501220703125,
-0.0250244140625,
-0.0496826171875,
0.0134735107421875,
0.015899658203125,
-0.0132293701171875,
-0.0144195556640625,
0.043243408203125,
-0.04052734375,
-0.05035400390625,
0.01422882080078125,
0.0081939697265625,
0.0213165283203125,
-0.00902557373046875,
-0.056304931640625,
0.004825592041015625,
0.0097503662109375,
-0.0184173583984375,
0.019805908203125,
0.015869140625,
0.0107421875,
0.0457763671875,
0.03900146484375,
0.010650634765625,
0.0013446807861328125,
0.0145416259765625,
0.04443359375,
-0.037933349609375,
-0.033294677734375,
-0.0628662109375,
0.06005859375,
-0.00333404541015625,
-0.03143310546875,
0.06396484375,
0.05389404296875,
0.07647705078125,
-0.03167724609375,
0.07122802734375,
-0.01363372802734375,
0.002445220947265625,
-0.041015625,
0.0469970703125,
-0.0657958984375,
0.01160430908203125,
-0.05841064453125,
-0.06884765625,
-0.004924774169921875,
0.0443115234375,
0.00565338134765625,
-0.0005784034729003906,
0.04351806640625,
0.07281494140625,
-0.0169219970703125,
-0.0176849365234375,
0.0252838134765625,
0.0301666259765625,
0.0250701904296875,
0.0307769775390625,
0.036163330078125,
-0.0831298828125,
0.07293701171875,
-0.03271484375,
-0.006038665771484375,
-0.0115814208984375,
-0.06231689453125,
-0.048553466796875,
-0.066162109375,
-0.03375244140625,
-0.037750244140625,
-0.0009279251098632812,
0.04547119140625,
0.062408447265625,
-0.036529541015625,
-0.0380859375,
-0.020538330078125,
0.0023708343505859375,
-0.016204833984375,
-0.02099609375,
0.0243377685546875,
0.025299072265625,
-0.078369140625,
0.0130615234375,
0.0130615234375,
0.020416259765625,
-0.033294677734375,
-0.01175689697265625,
-0.03387451171875,
-0.0105133056640625,
0.04168701171875,
0.00908660888671875,
-0.04327392578125,
0.0026187896728515625,
0.0036716461181640625,
0.0017709732055664062,
0.00934600830078125,
0.03948974609375,
-0.0369873046875,
0.050567626953125,
0.05657958984375,
0.0159912109375,
0.08099365234375,
-0.0066986083984375,
0.036224365234375,
-0.058013916015625,
0.031494140625,
-0.013641357421875,
0.036102294921875,
0.03839111328125,
-0.0196685791015625,
0.03240966796875,
0.041839599609375,
-0.05328369140625,
-0.05889892578125,
0.012115478515625,
-0.100341796875,
-0.0145416259765625,
0.10302734375,
-0.0019407272338867188,
-0.01378631591796875,
0.029571533203125,
-0.04266357421875,
0.055694580078125,
-0.026580810546875,
0.054656982421875,
0.030242919921875,
0.0108489990234375,
-0.026824951171875,
-0.043212890625,
0.0307159423828125,
0.00812530517578125,
-0.048553466796875,
-0.0129547119140625,
0.0435791015625,
0.039276123046875,
-0.0075836181640625,
0.05120849609375,
-0.00421905517578125,
0.031646728515625,
0.0175323486328125,
0.0165557861328125,
-0.00543975830078125,
-0.01446533203125,
-0.0214691162109375,
0.0067901611328125,
-0.0144500732421875,
-0.009246826171875
]
] |
flair/pos-english-fast | 2021-03-02T22:19:11.000Z | [
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"en",
"dataset:ontonotes",
"has_space",
"region:us"
] | token-classification | flair | null | null | flair/pos-english-fast | 5 | 13,021 | flair | 2022-03-02T23:29:05 | ---
tags:
- flair
- token-classification
- sequence-tagger-model
language: en
datasets:
- ontonotes
widget:
- text: "I love Berlin."
---
## English Part-of-Speech Tagging in Flair (fast model)
This is the fast part-of-speech tagging model for English that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **98.10** (Ontonotes)
Predicts fine-grained POS tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
|ADD | Email |
|AFX | Affix |
|CC | Coordinating conjunction |
|CD | Cardinal number |
|DT | Determiner |
|EX | Existential there |
|FW | Foreign word |
|HYPH | Hyphen |
|IN | Preposition or subordinating conjunction |
|JJ | Adjective |
|JJR |Adjective, comparative |
|JJS | Adjective, superlative |
|LS | List item marker |
|MD | Modal |
|NFP | Superfluous punctuation |
|NN | Noun, singular or mass |
|NNP |Proper noun, singular |
|NNPS | Proper noun, plural |
|NNS |Noun, plural |
|PDT | Predeterminer |
|POS | Possessive ending |
|PRP | Personal pronoun |
|PRP$ | Possessive pronoun |
|RB | Adverb |
|RBR | Adverb, comparative |
|RBS | Adverb, superlative |
|RP | Particle |
|SYM | Symbol |
|TO | to |
|UH | Interjection |
|VB | Verb, base form |
|VBD | Verb, past tense |
|VBG | Verb, gerund or present participle |
|VBN | Verb, past participle |
|VBP | Verb, non-3rd person singular present |
|VBZ | Verb, 3rd person singular present |
|WDT | Wh-determiner |
|WP | Wh-pronoun |
|WP$ | Possessive wh-pronoun |
|WRB | Wh-adverb |
|XX | Unknown |
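For downstream use it is often handy to collapse these fine-grained tags into coarse word classes. The partial mapping below follows a common Penn-Treebank-to-universal convention; it is an illustration, not part of Flair itself:

```python
COARSE = {
    "NN": "NOUN", "NNS": "NOUN", "NNP": "NOUN", "NNPS": "NOUN",
    "VB": "VERB", "VBD": "VERB", "VBG": "VERB",
    "VBN": "VERB", "VBP": "VERB", "VBZ": "VERB",
    "JJ": "ADJ", "JJR": "ADJ", "JJS": "ADJ",
    "RB": "ADV", "RBR": "ADV", "RBS": "ADV",
    "PRP": "PRON", "PRP$": "PRON", "WP": "PRON", "WP$": "PRON",
}

def coarsen(tag):
    # Fall back to "OTHER" for tags outside the mapping above.
    return COARSE.get(tag, "OTHER")

print(coarsen("VBP"))  # → VERB
```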
Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/pos-english-fast")
# make example sentence
sentence = Sentence("I love Berlin.")
# predict POS tags
tagger.predict(sentence)
# print sentence
print(sentence)
# print predicted POS tags
print('The following POS tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('pos'):
print(entity)
```
This yields the following output:
```
Span [1]: "I" [− Labels: PRP (1.0)]
Span [2]: "love" [− Labels: VBP (0.9998)]
Span [3]: "Berlin" [− Labels: NNP (0.9999)]
Span [4]: "." [− Labels: . (0.9998)]
```
So, the word "*I*" is labeled as a **pronoun** (PRP), "*love*" is labeled as a **verb** (VBP) and "*Berlin*" is labeled as a **proper noun** (NNP) in the sentence "*I love Berlin*".
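If you need `(token, tag, confidence)` triples rather than printed strings, you can read them from the span objects directly, or parse lines in the exact format shown above; the regex sketch below assumes that format (including the `−` minus-sign character):

```python
import re

# Matches lines like: Span [2]: "love" [− Labels: VBP (0.9998)]
LINE = re.compile(
    r'Span \[\d+\]: "(?P<token>[^"]+)" \[. Labels: '
    r'(?P<tag>\S+) \((?P<conf>[\d.]+)\)\]'
)

def parse_line(line):
    m = LINE.match(line)
    return (m["token"], m["tag"], float(m["conf"])) if m else None

print(parse_line('Span [2]: "love" [− Labels: VBP (0.9998)]'))
# → ('love', 'VBP', 0.9998)
```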
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings
# 1. load the corpus (Ontonotes does not ship with Flair, you need to download and reformat into a column format yourself)
corpus: Corpus = ColumnCorpus(
"resources/tasks/onto-ner",
column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
tag_to_bioes="ner",
)
# 2. what tag do we want to predict?
tag_type = 'pos'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize each embedding we use
embedding_types = [
# contextual string embeddings, forward
FlairEmbeddings('news-forward'),
# contextual string embeddings, backward
FlairEmbeddings('news-backward'),
]
# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)
# 5. initialize sequence tagger
from flair.models import SequenceTagger
tagger = SequenceTagger(hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type=tag_type)
# 6. initialize trainer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus)
# 7. run training
trainer.train('resources/taggers/pos-english-fast',
train_with_dev=True,
max_epochs=150)
```
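The `ColumnCorpus` call above expects whitespace-separated CoNLL-style column files matching the `column_format` dictionary (text, pos, upos, ner), with sentences separated by blank lines. The rows below are illustrative, not taken from the actual OntoNotes data:

```
I       PRP   PRON   O
love    VBP   VERB   O
Berlin  NNP   PROPN  B-LOC
.       .     PUNCT  O
```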
---
### Cite
Please cite the following paper when using this model.
```
@inproceedings{akbik2018coling,
title={Contextual String Embeddings for Sequence Labeling},
author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
pages = {1638--1649},
year = {2018}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
| 4,987 | [
[
-0.031158447265625,
-0.050933837890625,
0.0074920654296875,
0.0193634033203125,
-0.0213470458984375,
-0.007038116455078125,
-0.01535797119140625,
-0.022064208984375,
0.050140380859375,
0.01161956787109375,
-0.03057861328125,
-0.0440673828125,
-0.035797119140625,
0.0236968994140625,
0.0006494522094726562,
0.08074951171875,
0.002231597900390625,
0.0259857177734375,
-0.01279449462890625,
-0.0006165504455566406,
-0.0263519287109375,
-0.047821044921875,
-0.032318115234375,
-0.01300811767578125,
0.0423583984375,
0.0289154052734375,
0.044952392578125,
0.056182861328125,
0.0306396484375,
0.02069091796875,
-0.0169525146484375,
0.0110626220703125,
-0.00957489013671875,
-0.0003619194030761719,
-0.014984130859375,
-0.0289764404296875,
-0.05743408203125,
0.009490966796875,
0.047637939453125,
0.0360107421875,
0.01270294189453125,
0.006580352783203125,
-0.0087738037109375,
0.014068603515625,
-0.0206298828125,
0.031982421875,
-0.045623779296875,
-0.01505279541015625,
-0.025604248046875,
-0.007320404052734375,
-0.019561767578125,
-0.0301666259765625,
0.0087432861328125,
-0.038482666015625,
-0.0008797645568847656,
0.0128631591796875,
0.101318359375,
0.01059722900390625,
-0.0261993408203125,
-0.016448974609375,
-0.02996826171875,
0.056915283203125,
-0.0670166015625,
0.024078369140625,
0.0288238525390625,
-0.01078033447265625,
-0.01515960693359375,
-0.050872802734375,
-0.055877685546875,
-0.015655517578125,
-0.0157012939453125,
0.01580810546875,
-0.00661468505859375,
-0.0054779052734375,
0.0152587890625,
0.0169677734375,
-0.04705810546875,
-0.0089263916015625,
-0.007411956787109375,
-0.0204010009765625,
0.055877685546875,
0.01097869873046875,
0.0178680419921875,
-0.03265380859375,
-0.0300750732421875,
-0.004608154296875,
-0.024322509765625,
0.0035400390625,
0.0125732421875,
0.039215087890625,
-0.0098419189453125,
0.046661376953125,
-0.0019321441650390625,
0.053802490234375,
0.008636474609375,
-0.0285186767578125,
0.041412353515625,
-0.03472900390625,
-0.00978851318359375,
-0.01171112060546875,
0.0743408203125,
0.025726318359375,
0.0123443603515625,
-0.00855255126953125,
-0.003948211669921875,
0.0264129638671875,
-0.023712158203125,
-0.0445556640625,
-0.0180816650390625,
0.017059326171875,
-0.017242431640625,
-0.019775390625,
-0.0014362335205078125,
-0.054412841796875,
-0.0094146728515625,
-0.0028324127197265625,
0.040618896484375,
-0.0411376953125,
-0.0105438232421875,
0.0154571533203125,
-0.0232086181640625,
0.0141754150390625,
0.0026798248291015625,
-0.052825927734375,
-0.004138946533203125,
0.031951904296875,
0.04327392578125,
0.0165557861328125,
-0.03277587890625,
-0.02435302734375,
-0.01071929931640625,
-0.0115966796875,
0.057403564453125,
-0.031768798828125,
-0.0186920166015625,
0.00015532970428466797,
0.01222991943359375,
-0.026397705078125,
-0.0200347900390625,
0.052703857421875,
-0.03778076171875,
0.033416748046875,
-0.018402099609375,
-0.061279296875,
-0.0186920166015625,
0.0181427001953125,
-0.042877197265625,
0.06964111328125,
0.0007524490356445312,
-0.08837890625,
0.0274658203125,
-0.033660888671875,
-0.04156494140625,
0.0012798309326171875,
0.0004425048828125,
-0.03350830078125,
-0.0133056640625,
0.00406646728515625,
0.06304931640625,
-0.0096282958984375,
0.0207977294921875,
-0.023712158203125,
-0.0014295578002929688,
0.0135345458984375,
0.0181427001953125,
0.063232421875,
0.0035037994384765625,
-0.015533447265625,
0.007663726806640625,
-0.0660400390625,
-0.0176849365234375,
0.022796630859375,
-0.036346435546875,
-0.0269012451171875,
0.001438140869140625,
0.005390167236328125,
0.021697998046875,
0.015380859375,
-0.045135498046875,
0.04193115234375,
-0.041961669921875,
0.034454345703125,
0.033782958984375,
0.000789642333984375,
0.045074462890625,
-0.038116455078125,
0.03778076171875,
0.0083770751953125,
-0.015380859375,
-0.01116943359375,
-0.04974365234375,
-0.050018310546875,
-0.0325927734375,
0.045562744140625,
0.0616455078125,
-0.0584716796875,
0.057525634765625,
-0.0254669189453125,
-0.055877685546875,
-0.038055419921875,
-0.01454925537109375,
0.0286407470703125,
0.04632568359375,
0.036041259765625,
-0.0157470703125,
-0.0616455078125,
-0.052703857421875,
-0.0233306884765625,
-0.01316070556640625,
0.03179931640625,
0.0206756591796875,
0.060272216796875,
-0.015899658203125,
0.0648193359375,
-0.03753662109375,
-0.033660888671875,
-0.0288848876953125,
0.005680084228515625,
0.0328369140625,
0.037750244140625,
0.0311126708984375,
-0.054931640625,
-0.047454833984375,
-0.023101806640625,
-0.029327392578125,
0.01082611083984375,
-0.006679534912109375,
-0.00789642333984375,
0.021881103515625,
0.0261077880859375,
-0.042877197265625,
0.031219482421875,
0.01971435546875,
-0.052703857421875,
0.053436279296875,
0.009002685546875,
-0.01386260986328125,
-0.10943603515625,
0.0132904052734375,
0.0182037353515625,
-0.01617431640625,
-0.049774169921875,
-0.01336669921875,
-0.00609588623046875,
0.0295867919921875,
-0.03558349609375,
0.06170654296875,
-0.030792236328125,
0.0159759521484375,
0.004581451416015625,
0.009002685546875,
0.0063934326171875,
0.0282135009765625,
0.0222320556640625,
0.035308837890625,
0.04345703125,
-0.044708251953125,
0.0215606689453125,
0.03662109375,
-0.031158447265625,
0.00951385498046875,
-0.0302734375,
-0.018768310546875,
-0.01641845703125,
0.0262603759765625,
-0.08349609375,
-0.0208892822265625,
0.030731201171875,
-0.0594482421875,
0.034515380859375,
-0.0004897117614746094,
-0.04071044921875,
-0.0323486328125,
-0.0193634033203125,
0.01013946533203125,
0.0294036865234375,
-0.018890380859375,
0.033233642578125,
0.03509521484375,
0.007080078125,
-0.0516357421875,
-0.046600341796875,
-0.01568603515625,
-0.017059326171875,
-0.05169677734375,
0.044525146484375,
-0.006427764892578125,
-0.0161590576171875,
0.0100555419921875,
0.0128936767578125,
-0.00846099853515625,
0.0184173583984375,
0.00885009765625,
0.032073974609375,
-0.0128021240234375,
0.0117645263671875,
-0.0232696533203125,
0.00577545166015625,
-0.0102386474609375,
-0.00963592529296875,
0.059539794921875,
-0.01093292236328125,
0.0206298828125,
-0.04266357421875,
0.0133819580078125,
0.0213165283203125,
-0.026336669921875,
0.0604248046875,
0.063720703125,
-0.036102294921875,
-0.0080718994140625,
-0.023284912109375,
-0.01461029052734375,
-0.026397705078125,
0.037811279296875,
-0.039581298828125,
-0.057708740234375,
0.042572021484375,
0.009674072265625,
0.01125335693359375,
0.058319091796875,
0.038421630859375,
-0.007350921630859375,
0.0728759765625,
0.0460205078125,
-0.015655517578125,
0.033782958984375,
-0.03680419921875,
0.01213836669921875,
-0.052642822265625,
-0.0106048583984375,
-0.03997802734375,
-0.01136016845703125,
-0.05352783203125,
-0.0287017822265625,
0.003978729248046875,
0.03314208984375,
-0.0262603759765625,
0.0416259765625,
-0.03729248046875,
0.01551055908203125,
0.0445556640625,
-0.008636474609375,
0.00524139404296875,
-0.01041412353515625,
-0.0290985107421875,
-0.0196075439453125,
-0.054473876953125,
-0.03753662109375,
0.061431884765625,
0.034576416015625,
0.044189453125,
-0.0017242431640625,
0.0628662109375,
0.00341033935546875,
0.018951416015625,
-0.07476806640625,
0.045379638671875,
-0.021148681640625,
-0.05908203125,
-0.0032215118408203125,
-0.01132965087890625,
-0.070556640625,
0.0143585205078125,
-0.023193359375,
-0.06494140625,
0.0189971923828125,
0.01198577880859375,
-0.041717529296875,
0.02557373046875,
-0.0276641845703125,
0.07122802734375,
-0.0033359527587890625,
-0.0263519287109375,
0.0160064697265625,
-0.05902099609375,
0.0221405029296875,
0.013916015625,
0.028350830078125,
-0.0212554931640625,
-0.00672149658203125,
0.07977294921875,
-0.022857666015625,
0.06591796875,
0.00048542022705078125,
0.01201629638671875,
0.02337646484375,
0.005901336669921875,
0.02606201171875,
0.002635955810546875,
-0.00335693359375,
0.006603240966796875,
-0.0017642974853515625,
-0.01461029052734375,
-0.006587982177734375,
0.048065185546875,
-0.05120849609375,
-0.024078369140625,
-0.06231689453125,
-0.024810791015625,
-0.00977325439453125,
0.0218353271484375,
0.050567626953125,
0.0318603515625,
-0.01207733154296875,
-0.0074005126953125,
0.034698486328125,
-0.0165557861328125,
0.045654296875,
0.031280517578125,
-0.0298309326171875,
-0.04766845703125,
0.06982421875,
0.01436614990234375,
-0.00847625732421875,
0.042572021484375,
0.0186004638671875,
-0.031707763671875,
-0.0104522705078125,
-0.01561737060546875,
0.0518798828125,
-0.03656005859375,
-0.0266876220703125,
-0.05242919921875,
-0.0105133056640625,
-0.06787109375,
-0.008209228515625,
-0.0212860107421875,
-0.04229736328125,
-0.05230712890625,
0.0010843276977539062,
0.026519775390625,
0.053558349609375,
-0.0186920166015625,
0.02423095703125,
-0.052032470703125,
-0.0028362274169921875,
-0.000701904296875,
0.0036525726318359375,
-0.017333984375,
-0.06329345703125,
-0.021942138671875,
-0.010101318359375,
-0.0248565673828125,
-0.0789794921875,
0.0654296875,
0.0238189697265625,
0.029510498046875,
0.0279388427734375,
-0.002300262451171875,
0.0413818359375,
-0.0386962890625,
0.07562255859375,
0.004238128662109375,
-0.06866455078125,
0.0384521484375,
-0.026458740234375,
0.01029205322265625,
0.018829345703125,
0.0697021484375,
-0.0401611328125,
-0.008758544921875,
-0.05914306640625,
-0.07696533203125,
0.04376220703125,
-0.008270263671875,
-0.0011157989501953125,
-0.0296630859375,
0.0188446044921875,
-0.0093841552734375,
0.012451171875,
-0.06884765625,
-0.044158935546875,
-0.009735107421875,
-0.01763916015625,
-0.0162200927734375,
-0.019989013671875,
0.006839752197265625,
-0.044677734375,
0.088623046875,
-0.0006356239318847656,
0.042205810546875,
0.036102294921875,
0.002147674560546875,
0.0131988525390625,
0.0165557861328125,
0.04937744140625,
0.0182952880859375,
-0.0246429443359375,
-0.0015726089477539062,
0.00335693359375,
-0.0202789306640625,
-0.00864410400390625,
0.01461029052734375,
-0.0028400421142578125,
0.0234832763671875,
0.033905029296875,
0.059417724609375,
0.009185791015625,
-0.0257110595703125,
0.051055908203125,
-0.0018110275268554688,
-0.01464080810546875,
-0.0322265625,
-0.023651123046875,
0.0143280029296875,
0.0113067626953125,
0.00560760498046875,
0.006839752197265625,
-0.005535125732421875,
-0.04388427734375,
0.0172882080078125,
0.034210205078125,
-0.026458740234375,
-0.039794921875,
0.06414794921875,
0.002559661865234375,
-0.0150146484375,
0.0149078369140625,
-0.046173095703125,
-0.0701904296875,
0.0428466796875,
0.047119140625,
0.054931640625,
-0.0220794677734375,
0.01082611083984375,
0.04827880859375,
0.011505126953125,
-0.00843048095703125,
0.062469482421875,
0.0230255126953125,
-0.07830810546875,
-0.02825927734375,
-0.07183837890625,
0.0025653839111328125,
0.01555633544921875,
-0.039337158203125,
0.03070068359375,
-0.0310211181640625,
-0.034576416015625,
0.0276641845703125,
0.01320648193359375,
-0.058074951171875,
0.0199737548828125,
0.026519775390625,
0.0865478515625,
-0.0753173828125,
0.0794677734375,
0.083251953125,
-0.0543212890625,
-0.08013916015625,
-0.00601959228515625,
-0.005401611328125,
-0.04425048828125,
0.059967041015625,
0.01666259765625,
0.0242156982421875,
0.015411376953125,
-0.042449951171875,
-0.08563232421875,
0.067138671875,
-0.0160369873046875,
-0.0301666259765625,
-0.01690673828125,
-0.0155181884765625,
0.033538818359375,
-0.03289794921875,
0.034423828125,
0.0439453125,
0.03875732421875,
-0.00261688232421875,
-0.07928466796875,
0.003658294677734375,
-0.0201568603515625,
-0.00926971435546875,
0.005687713623046875,
-0.0592041015625,
0.083740234375,
-0.0164337158203125,
-0.0158233642578125,
0.0175323486328125,
0.06756591796875,
-0.002471923828125,
0.004425048828125,
0.02783203125,
0.06634521484375,
0.050140380859375,
-0.0228118896484375,
0.0611572265625,
-0.0270843505859375,
0.037261962890625,
0.08587646484375,
-0.006427764892578125,
0.0787353515625,
0.0264434814453125,
-0.01328277587890625,
0.037567138671875,
0.0599365234375,
-0.004558563232421875,
0.037139892578125,
0.01554107666015625,
-0.007160186767578125,
-0.0248565673828125,
-0.026641845703125,
-0.02996826171875,
0.051300048828125,
0.0247650146484375,
-0.03753662109375,
0.007568359375,
0.0024318695068359375,
0.049560546875,
-0.0017042160034179688,
-0.02154541015625,
0.05853271484375,
-0.0010862350463867188,
-0.047119140625,
0.045806884765625,
0.009613037109375,
0.0782470703125,
-0.0298614501953125,
0.006290435791015625,
-0.01001739501953125,
0.0175323486328125,
-0.0193328857421875,
-0.051910400390625,
0.01091766357421875,
-0.025482177734375,
-0.01500701904296875,
-0.008026123046875,
0.0513916015625,
-0.05352783203125,
-0.0198822021484375,
0.023406982421875,
0.03790283203125,
0.016632080078125,
0.004863739013671875,
-0.054718017578125,
-0.007904052734375,
0.00991058349609375,
-0.030975341796875,
0.01114654541015625,
0.01009368896484375,
0.00530242919921875,
0.035308837890625,
0.029022216796875,
0.019439697265625,
0.00946807861328125,
-0.01305389404296875,
0.0626220703125,
-0.061309814453125,
-0.0313720703125,
-0.0653076171875,
0.050445556640625,
0.00406646728515625,
-0.0386962890625,
0.06451416015625,
0.05712890625,
0.07073974609375,
-0.0104522705078125,
0.061279296875,
-0.0310211181640625,
0.06231689453125,
-0.0114898681640625,
0.05303955078125,
-0.055145263671875,
-0.007476806640625,
-0.0186309814453125,
-0.049560546875,
-0.03924560546875,
0.052703857421875,
-0.0284576416015625,
-0.02203369140625,
0.04681396484375,
0.0604248046875,
0.0147552490234375,
-0.010955810546875,
0.01088714599609375,
0.040130615234375,
0.0021343231201171875,
0.0325927734375,
0.051666259765625,
-0.0438232421875,
0.0209808349609375,
-0.04913330078125,
-0.0166015625,
-0.026458740234375,
-0.0643310546875,
-0.062286376953125,
-0.0709228515625,
-0.036407470703125,
-0.06243896484375,
-0.01424407958984375,
0.09246826171875,
0.0316162109375,
-0.06280517578125,
-0.01334381103515625,
0.01922607421875,
0.00542449951171875,
-0.0018482208251953125,
-0.023162841796875,
0.0306549072265625,
-0.01515960693359375,
-0.058135986328125,
0.0288238525390625,
-0.01849365234375,
0.0102386474609375,
0.00528717041015625,
0.01148223876953125,
-0.056640625,
0.00982666015625,
0.036865234375,
0.028076171875,
-0.05419921875,
-0.01398468017578125,
0.00800323486328125,
-0.0213470458984375,
0.00975799560546875,
0.0204010009765625,
-0.050628662109375,
0.01617431640625,
0.06427001953125,
0.020904541015625,
0.01261138916015625,
0.0025119781494140625,
0.0171051025390625,
-0.047393798828125,
0.003696441650390625,
0.033721923828125,
0.046173095703125,
0.01934814453125,
-0.0140533447265625,
0.0267181396484375,
0.042205810546875,
-0.054962158203125,
-0.05804443359375,
0.0017042160034179688,
-0.07928466796875,
-0.018585205078125,
0.0977783203125,
-0.01390838623046875,
-0.03662109375,
0.0019073486328125,
-0.01611328125,
0.0401611328125,
-0.035675048828125,
0.023651123046875,
0.04046630859375,
-0.011383056640625,
0.0265350341796875,
-0.0167388916015625,
0.0648193359375,
0.0287628173828125,
-0.03265380859375,
-0.0191650390625,
0.020263671875,
0.04144287109375,
0.028076171875,
0.049346923828125,
0.00821685791015625,
0.0036220550537109375,
-0.004322052001953125,
0.03326416015625,
0.01019287109375,
-0.0174560546875,
-0.03143310546875,
-0.00110626220703125,
-0.006725311279296875,
-0.0262298583984375
]
] |
google/owlv2-base-patch16-ensemble | 2023-10-23T09:17:34.000Z | [
"transformers",
"pytorch",
"owlv2",
"zero-shot-object-detection",
"vision",
"object-detection",
"arxiv:2306.09683",
"license:apache-2.0",
"has_space",
"region:us"
] | object-detection | google | null | null | google/owlv2-base-patch16-ensemble | 9 | 13,010 | transformers | 2023-10-13T09:27:09 | ---
license: apache-2.0
tags:
- vision
- object-detection
inference: false
---
# Model Card: OWLv2
## Model Details
The OWLv2 model (short for Open-World Localization) was proposed in [Scaling Open-Vocabulary Object Detection](https://arxiv.org/abs/2306.09683) by Matthias Minderer, Alexey Gritsenko, Neil Houlsby. OWLv2, like OWL-ViT, is a zero-shot text-conditioned object detection model that can be used to query an image with one or multiple text queries.
The model uses CLIP as its multi-modal backbone, with a ViT-like Transformer to get visual features and a causal language model to get the text features. To use CLIP for detection, OWL-ViT removes the final token pooling layer of the vision model and attaches a lightweight classification and box head to each transformer output token. Open-vocabulary classification is enabled by replacing the fixed classification layer weights with the class-name embeddings obtained from the text model. The authors first train CLIP from scratch and fine-tune it end-to-end with the classification and box heads on standard detection datasets using a bipartite matching loss. One or multiple text queries per image can be used to perform zero-shot text-conditioned object detection.
### Model Date
June 2023
### Model Type
The model uses a CLIP backbone with a ViT-B/16 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss. The CLIP backbone is trained from scratch and fine-tuned together with the box and class prediction heads with an object detection objective.
### Documents
- [OWLv2 Paper](https://arxiv.org/abs/2306.09683)
### Use with Transformers
```python3
import requests
from PIL import Image
import torch
from transformers import Owlv2Processor, Owlv2ForObjectDetection
processor = Owlv2Processor.from_pretrained("google/owlv2-base-patch16-ensemble")
model = Owlv2ForObjectDetection.from_pretrained("google/owlv2-base-patch16-ensemble")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
texts = [["a photo of a cat", "a photo of a dog"]]
inputs = processor(text=texts, images=image, return_tensors="pt")
outputs = model(**inputs)
# Target image sizes (height, width) to rescale box predictions [batch_size, 2]
target_sizes = torch.Tensor([image.size[::-1]])
# Convert outputs (bounding boxes and class logits) to COCO API
results = processor.post_process_object_detection(outputs=outputs, threshold=0.1, target_sizes=target_sizes)
i = 0 # Retrieve predictions for the first image for the corresponding text queries
text = texts[i]
boxes, scores, labels = results[i]["boxes"], results[i]["scores"], results[i]["labels"]
# Print detected objects and rescaled box coordinates
for box, score, label in zip(boxes, scores, labels):
box = [round(i, 2) for i in box.tolist()]
print(f"Detected {text[label]} with confidence {round(score.item(), 3)} at location {box}")
```
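For context, `post_process_object_detection` converts the model's normalized, center-format boxes into absolute `(x0, y0, x1, y1)` pixel coordinates using the target sizes passed above. A simplified sketch of just that rescaling step (ignoring OWLv2's padding-aware unnormalization and score thresholding, which the processor handles for you — this is an illustration, not the library implementation):

```python
def to_absolute_xyxy(box_cxcywh, width, height):
    """Convert a normalized (cx, cy, w, h) box to absolute (x0, y0, x1, y1) pixels."""
    cx, cy, w, h = box_cxcywh
    return [
        (cx - w / 2) * width,   # left
        (cy - h / 2) * height,  # top
        (cx + w / 2) * width,   # right
        (cy + h / 2) * height,  # bottom
    ]
```

This is why `target_sizes` takes `(height, width)` per image: the normalized coordinates are scaled by the original image dimensions.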
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, text-conditioned object detection. We also hope it can be used for interdisciplinary studies of the potential impact of such models, especially in areas that commonly require identifying objects whose label is unavailable during training.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
## Data
The CLIP backbone of the model was trained on publicly available image-caption data. This was done through a combination of crawling a handful of websites and using commonly-used pre-existing image datasets such as [YFCC100M](http://projects.dfki.uni-kl.de/yfcc100m/). A large portion of the data comes from our crawling of the internet. This means that the data is more representative of people and societies most connected to the internet. The prediction heads of OWL-ViT, along with the CLIP backbone, are fine-tuned on publicly available object detection datasets such as [COCO](https://cocodataset.org/#home) and [OpenImages](https://storage.googleapis.com/openimages/web/index.html).
(to be updated for v2)
### BibTeX entry and citation info
```bibtex
@misc{minderer2023scaling,
title={Scaling Open-Vocabulary Object Detection},
author={Matthias Minderer and Alexey Gritsenko and Neil Houlsby},
year={2023},
eprint={2306.09683},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` | 4,826 | [
[
-0.02520751953125,
-0.050933837890625,
0.0255279541015625,
-0.01439666748046875,
-0.0213775634765625,
-0.034027099609375,
-0.0033397674560546875,
-0.06854248046875,
0.0014162063598632812,
0.031219482421875,
-0.0242767333984375,
-0.048004150390625,
-0.047332763671875,
0.01331329345703125,
-0.0293121337890625,
0.057708740234375,
0.01551055908203125,
-0.0208587646484375,
-0.03204345703125,
-0.00832366943359375,
-0.030517578125,
-0.016143798828125,
-0.04412841796875,
-0.005771636962890625,
0.018646240234375,
0.031463623046875,
0.046173095703125,
0.08392333984375,
0.06329345703125,
0.0195159912109375,
0.0021533966064453125,
0.00865936279296875,
-0.0309295654296875,
-0.036834716796875,
-0.0099334716796875,
-0.04461669921875,
-0.0238494873046875,
0.006317138671875,
0.03082275390625,
0.0058746337890625,
0.0079345703125,
0.00855255126953125,
-0.0030078887939453125,
0.0224609375,
-0.058868408203125,
0.02301025390625,
-0.06365966796875,
0.01514434814453125,
-0.010162353515625,
-0.01276397705078125,
-0.04254150390625,
0.0234375,
0.0181427001953125,
-0.058563232421875,
0.03826904296875,
0.01331329345703125,
0.0946044921875,
0.017181396484375,
-0.00801849365234375,
-0.0152130126953125,
-0.03912353515625,
0.08782958984375,
-0.034454345703125,
0.0102996826171875,
0.041168212890625,
0.010223388671875,
0.006786346435546875,
-0.05572509765625,
-0.03961181640625,
0.0020885467529296875,
0.0006728172302246094,
0.0204315185546875,
-0.033172607421875,
-0.0113983154296875,
0.0079345703125,
0.0210113525390625,
-0.045013427734375,
0.01004791259765625,
-0.054443359375,
-0.0199127197265625,
0.047027587890625,
0.006336212158203125,
0.017242431640625,
0.0008273124694824219,
-0.04302978515625,
-0.031982421875,
-0.0171661376953125,
0.0165557861328125,
0.019134521484375,
0.0178985595703125,
-0.006450653076171875,
0.0423583984375,
-0.0299835205078125,
0.06683349609375,
-0.0008039474487304688,
-0.0255584716796875,
0.02239990234375,
-0.01558685302734375,
-0.01308441162109375,
-0.006443023681640625,
0.071533203125,
0.04986572265625,
0.017486572265625,
-0.00921630859375,
0.0021800994873046875,
0.0199432373046875,
0.004180908203125,
-0.0692138671875,
-0.020111083984375,
0.026580810546875,
-0.036590576171875,
-0.037933349609375,
0.0170135498046875,
-0.047882080078125,
0.005153656005859375,
-0.007415771484375,
0.04425048828125,
-0.025604248046875,
-0.0174407958984375,
0.0362548828125,
-0.0192718505859375,
0.038818359375,
0.0194854736328125,
-0.0494384765625,
0.0037326812744140625,
0.032470703125,
0.073486328125,
-0.0208282470703125,
-0.0284576416015625,
-0.03289794921875,
-0.00018036365509033203,
-0.0282745361328125,
0.0860595703125,
-0.040496826171875,
-0.0207672119140625,
-0.00354766845703125,
0.0201263427734375,
0.01218414306640625,
-0.036956787109375,
0.037506103515625,
-0.019317626953125,
0.0227508544921875,
-0.00032591819763183594,
-0.019073486328125,
-0.020111083984375,
0.044647216796875,
-0.02947998046875,
0.0902099609375,
-0.006134033203125,
-0.08111572265625,
0.030548095703125,
-0.034149169921875,
-0.00563812255859375,
-0.0192413330078125,
0.01212310791015625,
-0.054443359375,
-0.01497650146484375,
0.034515380859375,
0.038299560546875,
-0.0189666748046875,
-0.0086822509765625,
-0.04278564453125,
-0.031341552734375,
0.0113983154296875,
-0.019073486328125,
0.054443359375,
0.0050506591796875,
0.002300262451171875,
0.011962890625,
-0.052459716796875,
-0.0030879974365234375,
0.029998779296875,
-0.00322723388671875,
-0.017242431640625,
0.0021266937255859375,
0.0029468536376953125,
0.01120758056640625,
0.0170440673828125,
-0.05694580078125,
0.0181121826171875,
-0.0310211181640625,
0.0310211181640625,
0.034423828125,
-0.0121612548828125,
0.02496337890625,
-0.0120391845703125,
0.014862060546875,
0.00968170166015625,
0.034088134765625,
-0.0311126708984375,
-0.057373046875,
-0.049285888671875,
-0.01345062255859375,
-0.0095672607421875,
0.046875,
-0.03662109375,
0.0236663818359375,
-0.010711669921875,
-0.0347900390625,
-0.0196990966796875,
-0.017974853515625,
0.026336669921875,
0.04412841796875,
0.046661376953125,
-0.027069091796875,
-0.035308837890625,
-0.06842041015625,
0.0034236907958984375,
-0.00634765625,
-0.0192718505859375,
0.017852783203125,
0.059051513671875,
-0.0135040283203125,
0.09344482421875,
-0.068603515625,
-0.054534912109375,
0.005084991455078125,
0.0018749237060546875,
0.0020046234130859375,
0.033905029296875,
0.040740966796875,
-0.06396484375,
-0.033905029296875,
-0.005702972412109375,
-0.0723876953125,
0.0230712890625,
0.02386474609375,
-0.00859832763671875,
0.00927734375,
0.0347900390625,
-0.03179931640625,
0.059051513671875,
0.017730712890625,
-0.013916015625,
0.0423583984375,
-0.01154327392578125,
-0.01220703125,
-0.08367919921875,
-0.0024871826171875,
0.005893707275390625,
-0.00980377197265625,
-0.038848876953125,
0.005084991455078125,
-0.0002313852310180664,
-0.0170745849609375,
-0.051055908203125,
0.0389404296875,
-0.039276123046875,
-0.0173492431640625,
-0.028778076171875,
0.0189666748046875,
0.014129638671875,
0.044403076171875,
0.038421630859375,
0.05059814453125,
0.0631103515625,
-0.040252685546875,
0.0160675048828125,
0.023529052734375,
-0.016326904296875,
0.04022216796875,
-0.06610107421875,
0.02142333984375,
-0.0119476318359375,
0.00873565673828125,
-0.07159423828125,
-0.02197265625,
0.01953125,
-0.0521240234375,
0.0179901123046875,
-0.01097869873046875,
-0.0236053466796875,
-0.05059814453125,
-0.040679931640625,
0.03680419921875,
0.0301513671875,
-0.037841796875,
0.024749755859375,
0.022918701171875,
0.04736328125,
-0.06134033203125,
-0.070068359375,
-0.003543853759765625,
-0.00699615478515625,
-0.04925537109375,
0.0299072265625,
-0.003322601318359375,
0.004642486572265625,
0.0222015380859375,
-0.00012624263763427734,
-0.0161895751953125,
-0.01201629638671875,
0.01369476318359375,
0.031494140625,
-0.01611328125,
0.0064239501953125,
-0.03179931640625,
-0.0208587646484375,
-0.004314422607421875,
-0.044830322265625,
0.0299835205078125,
-0.0174713134765625,
-0.01372528076171875,
-0.052001953125,
0.00168609619140625,
0.031402587890625,
-0.039031982421875,
0.049072265625,
0.0689697265625,
-0.040985107421875,
0.008209228515625,
-0.03106689453125,
-0.027099609375,
-0.032135009765625,
0.039459228515625,
-0.0172271728515625,
-0.039337158203125,
0.034881591796875,
0.032623291015625,
-0.0019254684448242188,
0.044830322265625,
0.0291748046875,
0.007465362548828125,
0.070556640625,
0.0543212890625,
0.00909423828125,
0.03619384765625,
-0.072021484375,
0.01092529296875,
-0.08355712890625,
-0.028289794921875,
-0.0289306640625,
-0.0029125213623046875,
-0.0295562744140625,
-0.06634521484375,
0.016143798828125,
0.01454925537109375,
0.00537109375,
0.0364990234375,
-0.08026123046875,
0.037200927734375,
0.035125732421875,
0.037078857421875,
0.0184326171875,
0.01006317138671875,
0.0117950439453125,
0.01471710205078125,
-0.04107666015625,
-0.0301361083984375,
0.101806640625,
0.0269927978515625,
0.04541015625,
-0.028778076171875,
0.032989501953125,
0.014739990234375,
0.00601959228515625,
-0.07537841796875,
0.039886474609375,
-0.0185546875,
-0.051910400390625,
-0.0297698974609375,
0.007236480712890625,
-0.087158203125,
0.0264892578125,
-0.019317626953125,
-0.079833984375,
0.02716064453125,
0.0171966552734375,
-0.0151824951171875,
0.040069580078125,
-0.0390625,
0.07275390625,
0.00838470458984375,
-0.0487060546875,
0.005252838134765625,
-0.036651611328125,
0.029449462890625,
0.0181121826171875,
-0.0024242401123046875,
-0.02545166015625,
-0.004978179931640625,
0.0645751953125,
-0.0287628173828125,
0.05853271484375,
0.006862640380859375,
0.015869140625,
0.06048583984375,
-0.01435089111328125,
0.04083251953125,
-0.004932403564453125,
0.00933074951171875,
0.03717041015625,
0.00453948974609375,
-0.038543701171875,
-0.0171661376953125,
0.0284271240234375,
-0.061859130859375,
-0.03326416015625,
-0.0445556640625,
-0.045623779296875,
0.0233306884765625,
0.020233154296875,
0.059326171875,
0.04608154296875,
0.01324462890625,
0.040924072265625,
0.0491943359375,
-0.0203704833984375,
0.030975341796875,
0.026214599609375,
-0.0298004150390625,
-0.01806640625,
0.07177734375,
0.009307861328125,
0.0063629150390625,
0.0478515625,
0.02392578125,
-0.0274810791015625,
-0.034271240234375,
-0.007480621337890625,
0.01418304443359375,
-0.046875,
-0.036468505859375,
-0.0577392578125,
-0.007175445556640625,
-0.039459228515625,
-0.01055145263671875,
-0.038726806640625,
-0.006259918212890625,
-0.056854248046875,
0.0017938613891601562,
0.03289794921875,
0.038177490234375,
-0.006618499755859375,
0.0253753662109375,
-0.0159912109375,
0.0173492431640625,
0.0291290283203125,
0.01251220703125,
-0.0024509429931640625,
-0.03594970703125,
-0.01146697998046875,
0.006343841552734375,
-0.034454345703125,
-0.04632568359375,
0.033355712890625,
0.005794525146484375,
0.0169525146484375,
0.051483154296875,
0.0110626220703125,
0.045989990234375,
-0.008056640625,
0.058929443359375,
0.0236663818359375,
-0.044219970703125,
0.053466796875,
-0.02581787109375,
0.0234222412109375,
0.00627899169921875,
0.009521484375,
-0.01195526123046875,
-0.023834228515625,
-0.028350830078125,
-0.058135986328125,
0.07867431640625,
0.0208587646484375,
-0.00807952880859375,
-0.0016546249389648438,
0.0201873779296875,
-0.0167083740234375,
-0.01389312744140625,
-0.057373046875,
0.00009441375732421875,
-0.0288848876953125,
-0.0105743408203125,
0.0010271072387695312,
-0.0196075439453125,
0.01739501953125,
-0.0173797607421875,
0.035736083984375,
-0.02130126953125,
0.057098388671875,
0.044464111328125,
-0.001804351806640625,
0.0007877349853515625,
-0.0194854736328125,
0.03936767578125,
0.034515380859375,
-0.03656005859375,
-0.00461578369140625,
0.01357269287109375,
-0.050445556640625,
-0.0111083984375,
-0.0200042724609375,
-0.0184326171875,
-0.01140594482421875,
0.047454833984375,
0.0662841796875,
0.005313873291015625,
-0.03546142578125,
0.044158935546875,
0.002201080322265625,
-0.021209716796875,
-0.037322998046875,
0.00885009765625,
-0.0175323486328125,
0.002170562744140625,
0.03759765625,
0.0145111083984375,
0.0213165283203125,
-0.04058837890625,
0.01739501953125,
0.037200927734375,
-0.0389404296875,
-0.0250244140625,
0.06768798828125,
-0.0115203857421875,
-0.0256500244140625,
0.042510986328125,
-0.0179443359375,
-0.03863525390625,
0.0782470703125,
0.04339599609375,
0.059783935546875,
0.00368499755859375,
0.007167816162109375,
0.06329345703125,
0.024749755859375,
-0.0226593017578125,
-0.0088653564453125,
0.0006194114685058594,
-0.070556640625,
-0.00574493408203125,
-0.0487060546875,
-0.01995849609375,
0.01313018798828125,
-0.06329345703125,
0.04119873046875,
-0.023895263671875,
-0.0208282470703125,
-0.00017762184143066406,
0.00006276369094848633,
-0.065185546875,
0.0179290771484375,
0.00341033935546875,
0.077392578125,
-0.052093505859375,
0.059906005859375,
0.04339599609375,
-0.0421142578125,
-0.051239013671875,
0.0013790130615234375,
-0.00475311279296875,
-0.07879638671875,
0.03631591796875,
0.055328369140625,
-0.010589599609375,
0.00963592529296875,
-0.056549072265625,
-0.0740966796875,
0.09503173828125,
0.0005021095275878906,
-0.017608642578125,
-0.007137298583984375,
-0.004119873046875,
0.034088134765625,
-0.033538818359375,
0.0380859375,
0.018829345703125,
0.022918701171875,
0.0234222412109375,
-0.043487548828125,
-0.019256591796875,
-0.002185821533203125,
0.007640838623046875,
0.0034351348876953125,
-0.06329345703125,
0.08013916015625,
-0.031158447265625,
-0.016357421875,
0.0104827880859375,
0.04766845703125,
0.00556182861328125,
0.03594970703125,
0.027862548828125,
0.059906005859375,
0.030853271484375,
-0.014801025390625,
0.07769775390625,
-0.01399993896484375,
0.053955078125,
0.06805419921875,
0.0035305023193359375,
0.0845947265625,
0.0204315185546875,
-0.018157958984375,
0.0286865234375,
0.05029296875,
-0.0296630859375,
0.03912353515625,
-0.01558685302734375,
0.0186309814453125,
-0.010772705078125,
-0.0196075439453125,
-0.0261993408203125,
0.059783935546875,
0.01959228515625,
-0.015380859375,
-0.0094451904296875,
0.025390625,
-0.0019893646240234375,
-0.005794525146484375,
-0.01824951171875,
0.042022705078125,
-0.02392578125,
-0.03704833984375,
0.04327392578125,
0.0035305023193359375,
0.06427001953125,
-0.0316162109375,
-0.00032329559326171875,
-0.0032291412353515625,
0.02923583984375,
-0.02227783203125,
-0.0743408203125,
0.0303955078125,
-0.00429534912109375,
-0.0183258056640625,
0.01357269287109375,
0.0584716796875,
-0.0255584716796875,
-0.0494384765625,
0.0352783203125,
-0.0171356201171875,
0.04547119140625,
-0.0238189697265625,
-0.046112060546875,
0.027099609375,
-0.00042438507080078125,
-0.015899658203125,
0.01119232177734375,
0.0213775634765625,
-0.005786895751953125,
0.050445556640625,
0.057037353515625,
-0.0153656005859375,
0.00861358642578125,
-0.045440673828125,
0.052276611328125,
-0.0262908935546875,
-0.0270538330078125,
-0.047637939453125,
0.0360107421875,
-0.016815185546875,
-0.02972412109375,
0.04315185546875,
0.0621337890625,
0.081787109375,
-0.0191650390625,
0.028839111328125,
-0.00957489013671875,
0.01849365234375,
-0.032867431640625,
0.0254058837890625,
-0.058685302734375,
-0.007442474365234375,
-0.0048828125,
-0.06414794921875,
-0.0266265869140625,
0.05706787109375,
-0.031219482421875,
-0.01275634765625,
0.04705810546875,
0.072509765625,
-0.0132293701171875,
-0.03814697265625,
0.04132080078125,
0.0107574462890625,
0.0197296142578125,
0.0465087890625,
0.02996826171875,
-0.068603515625,
0.07342529296875,
-0.0435791015625,
-0.01715087890625,
-0.034027099609375,
-0.059295654296875,
-0.0772705078125,
-0.047454833984375,
-0.040252685546875,
-0.011962890625,
-0.00801849365234375,
0.031707763671875,
0.08026123046875,
-0.05572509765625,
-0.0070648193359375,
-0.012603759765625,
0.00519561767578125,
-0.01177978515625,
-0.0229949951171875,
0.038543701171875,
-0.00492095947265625,
-0.0709228515625,
-0.0030765533447265625,
0.033294677734375,
0.005939483642578125,
-0.0158233642578125,
-0.0023097991943359375,
-0.0234375,
0.00495147705078125,
0.05035400390625,
0.0311737060546875,
-0.07421875,
-0.032928466796875,
0.01285552978515625,
0.002246856689453125,
0.023590087890625,
0.0245361328125,
-0.059600830078125,
0.056854248046875,
0.032562255859375,
0.0055389404296875,
0.05462646484375,
-0.0006313323974609375,
-0.01166534423828125,
-0.03753662109375,
0.057525634765625,
-0.0006670951843261719,
0.0257415771484375,
0.0281982421875,
-0.0362548828125,
0.047332763671875,
0.038543701171875,
-0.04119873046875,
-0.07012939453125,
0.0073394775390625,
-0.08697509765625,
-0.0259857177734375,
0.0704345703125,
-0.0224151611328125,
-0.04937744140625,
0.00710296630859375,
-0.025604248046875,
0.0201568603515625,
-0.0298004150390625,
0.0469970703125,
0.03656005859375,
-0.005817413330078125,
-0.02801513671875,
-0.026763916015625,
0.01169586181640625,
-0.0159912109375,
-0.045928955078125,
-0.0213623046875,
0.01187896728515625,
0.0311126708984375,
0.039886474609375,
0.036407470703125,
-0.0093231201171875,
0.01120758056640625,
0.022674560546875,
0.02093505859375,
-0.0218658447265625,
-0.034881591796875,
0.000827789306640625,
0.00811767578125,
-0.011932373046875,
-0.05718994140625
]
] |
hatmimoha/arabic-ner | 2023-04-11T12:02:29.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"token-classification",
"ar",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | hatmimoha | null | null | hatmimoha/arabic-ner | 11 | 12,991 | transformers | 2022-03-02T23:29:05 | ---
language: ar
---
# Arabic Named Entity Recognition Model
Pretrained BERT-based ([arabic-bert-base](https://huggingface.co/asafaya/bert-base-arabic)) Named Entity Recognition model for Arabic.
The pre-trained model can recognize the following entities:
1. **PERSON**
- و هذا ما نفاه المعاون السياسي للرئيس ***نبيه بري*** ، النائب ***علي حسن خليل***
- لكن أوساط ***الحريري*** تعتبر أنه ضحى كثيرا في سبيل البلد
- و ستفقد الملكة ***إليزابيث الثانية*** بذلك سيادتها على واحدة من آخر ممالك الكومنولث
2. **ORGANIZATION**
- حسب أرقام ***البنك الدولي***
- أعلن ***الجيش العراقي***
- و نقلت وكالة ***رويترز*** عن ثلاثة دبلوماسيين في ***الاتحاد الأوروبي*** ، أن ***بلجيكا*** و ***إيرلندا*** و ***لوكسمبورغ*** تريد أيضاً مناقشة
- ***الحكومة الاتحادية*** و ***حكومة إقليم كردستان***
- و هو ما يثير الشكوك حول مشاركة النجم البرتغالي في المباراة المرتقبة أمام ***برشلونة*** الإسباني في
3. **LOCATION**
- الجديد هو تمكين اللاجئين من “ مغادرة الجزيرة تدريجياً و بهدوء إلى ***أثينا*** ”
- ***جزيرة ساكيز*** تبعد 1 كم عن ***إزمير***
4. **DATE**
- ***غدا الجمعة***
- ***06 أكتوبر 2020***
- ***العام السابق***
5. **PRODUCT**
- عبر حسابه ب ***تطبيق “ إنستغرام ”***
- الجيل الثاني من ***نظارة الواقع الافتراضي أوكولوس كويست*** تحت اسم " ***أوكولوس كويست 2*** "
6. **COMPETITION**
- عدم المشاركة في ***بطولة فرنسا المفتوحة للتنس***
- في مباراة ***كأس السوبر الأوروبي***
7. **PRIZE**
- ***جائزة نوبل ل لآداب***
- الذي فاز ب ***جائزة “ إيمي ” لأفضل دور مساند***
8. **EVENT**
- تسجّل أغنية جديدة خاصة ب ***العيد الوطني السعودي***
- ***مهرجان المرأة يافوية*** في دورته الرابعة
9. **DISEASE**
- في مكافحة فيروس ***كورونا*** و عدد من الأمراض
- الأزمات المشابهة مثل “ ***انفلونزا الطيور*** ” و ” ***انفلونزا الخنازير***
## Example
[Find here a complete example to use this model](https://github.com/hatmimoha/arabic-ner)
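As a quick sketch (not the linked repository's complete example), the model can be loaded through the Transformers token-classification pipeline; the `aggregation_strategy` argument and the sample sentence below are illustrative assumptions:

```python
from transformers import pipeline

# Minimal usage sketch; see the linked repository for the full example.
ner = pipeline(
    "token-classification",
    model="hatmimoha/arabic-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

# Example sentence taken from the LOCATION examples above.
results = ner("جزيرة ساكيز تبعد 1 كم عن إزمير")
for entity in results:
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```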
## Training Corpus
The training corpus consists of 378,000 tokens (14,000 sentences) collected from the Web and annotated manually.
It can be obtained from here: https://www.textorch.com/datasets/arabic-named-entity-recognition-corpus
## Results
The results on a validation corpus of 30,000 tokens show an F-measure of ~87%.
| 2,203 | [
[
-0.052001953125,
-0.04107666015625,
0.023162841796875,
0.0173492431640625,
-0.048858642578125,
0.0058135986328125,
0.0023899078369140625,
-0.04541015625,
0.04736328125,
0.032928466796875,
-0.03338623046875,
-0.068359375,
-0.06982421875,
0.029754638671875,
-0.030120849609375,
0.07354736328125,
-0.0167236328125,
0.02532958984375,
0.027191162109375,
-0.020965576171875,
-0.018707275390625,
-0.047332763671875,
-0.046234130859375,
-0.02020263671875,
0.03900146484375,
0.0102386474609375,
0.045501708984375,
0.007354736328125,
0.04461669921875,
0.028778076171875,
0.0126953125,
0.0029811859130859375,
-0.0159149169921875,
-0.01666259765625,
0.0009627342224121094,
-0.0274505615234375,
-0.00435638427734375,
-0.0013875961303710938,
0.0472412109375,
0.0457763671875,
-0.01190948486328125,
0.0192413330078125,
-0.0021190643310546875,
0.063232421875,
-0.03759765625,
0.004638671875,
-0.005962371826171875,
0.0047607421875,
-0.01210784912109375,
0.004611968994140625,
0.00023126602172851562,
-0.05126953125,
-0.00272369384765625,
-0.0242462158203125,
0.010284423828125,
0.0204925537109375,
0.09130859375,
-0.000009000301361083984,
-0.023590087890625,
-0.049346923828125,
-0.054779052734375,
0.07391357421875,
-0.047637939453125,
0.0135955810546875,
0.03961181640625,
0.0157928466796875,
-0.017791748046875,
-0.052642822265625,
-0.057464599609375,
-0.007171630859375,
-0.0038280487060546875,
0.005504608154296875,
-0.0057220458984375,
-0.0293731689453125,
0.0167083740234375,
0.00272369384765625,
-0.02313232421875,
-0.03399658203125,
-0.015594482421875,
-0.01554107666015625,
0.049285888671875,
-0.00872039794921875,
0.04168701171875,
-0.009521484375,
-0.0284423828125,
0.00521087646484375,
-0.0267791748046875,
0.01534271240234375,
0.01065826416015625,
0.001415252685546875,
-0.0226898193359375,
0.02642822265625,
-0.00653076171875,
0.03826904296875,
0.02374267578125,
-0.027984619140625,
0.0295562744140625,
-0.0124359130859375,
-0.02850341796875,
0.026641845703125,
0.04315185546875,
0.0164642333984375,
0.01512908935546875,
-0.0041351318359375,
-0.0199127197265625,
-0.00453948974609375,
0.00618743896484375,
-0.07647705078125,
-0.009490966796875,
0.005115509033203125,
-0.059417724609375,
-0.0249481201171875,
0.0216064453125,
-0.04095458984375,
-0.00296783447265625,
0.015106201171875,
0.037078857421875,
-0.03790283203125,
-0.033935546875,
0.0023193359375,
0.0094146728515625,
0.0240478515625,
0.04180908203125,
-0.059906005859375,
0.035888671875,
0.0133209228515625,
0.0452880859375,
0.01215362548828125,
-0.00791168212890625,
0.00363922119140625,
0.0233154296875,
-0.03045654296875,
0.041259765625,
-0.0243682861328125,
-0.041839599609375,
-0.0089874267578125,
0.0031337738037109375,
-0.01824951171875,
-0.046417236328125,
0.04473876953125,
-0.06396484375,
0.018035888671875,
-0.0219268798828125,
-0.049713134765625,
-0.046783447265625,
0.0195770263671875,
-0.05572509765625,
0.06768798828125,
-0.0056304931640625,
-0.061798095703125,
0.0086517333984375,
-0.056365966796875,
-0.025665283203125,
0.015594482421875,
-0.017425537109375,
-0.041107177734375,
0.0029296875,
0.02874755859375,
0.0192413330078125,
-0.00429534912109375,
0.016571044921875,
0.0005979537963867188,
-0.01904296875,
0.028350830078125,
-0.00087738037109375,
0.0654296875,
0.01336669921875,
-0.045074462890625,
0.00511932373046875,
-0.0726318359375,
0.01024627685546875,
0.032012939453125,
-0.018463134765625,
-0.007602691650390625,
-0.0190887451171875,
0.018890380859375,
0.02642822265625,
0.04345703125,
-0.042083740234375,
0.0085296630859375,
-0.04534912109375,
0.004550933837890625,
0.05511474609375,
0.0108489990234375,
0.01212310791015625,
-0.046905517578125,
0.03302001953125,
0.019012451171875,
-0.01120758056640625,
0.0229339599609375,
-0.03802490234375,
-0.068603515625,
-0.0228424072265625,
0.042694091796875,
0.0308685302734375,
-0.046234130859375,
0.04412841796875,
-0.003978729248046875,
-0.050537109375,
-0.047027587890625,
-0.00174713134765625,
0.0341796875,
0.029388427734375,
0.0196533203125,
-0.02911376953125,
-0.0455322265625,
-0.05047607421875,
-0.0223541259765625,
-0.01580810546875,
0.004451751708984375,
0.0352783203125,
0.06640625,
-0.0263519287109375,
0.0367431640625,
-0.034332275390625,
-0.025390625,
-0.02862548828125,
0.01334381103515625,
0.045684814453125,
0.03326416015625,
0.059417724609375,
-0.074462890625,
-0.05340576171875,
-0.004535675048828125,
-0.05084228515625,
0.0281219482421875,
0.01142120361328125,
-0.007770538330078125,
0.03570556640625,
-0.001461029052734375,
-0.058685302734375,
0.0654296875,
0.022064208984375,
-0.048370361328125,
0.05316162109375,
-0.01535797119140625,
0.02545166015625,
-0.0865478515625,
0.0126800537109375,
-0.037353515625,
-0.01059722900390625,
-0.0264739990234375,
0.0157623291015625,
-0.0038394927978515625,
0.0165252685546875,
-0.038330078125,
0.05633544921875,
-0.046722412109375,
0.0230560302734375,
-0.0180206298828125,
-0.018829345703125,
-0.011474609375,
0.0528564453125,
0.00612640380859375,
0.040802001953125,
0.03057861328125,
-0.0478515625,
0.017974853515625,
0.023223876953125,
-0.06103515625,
0.01352691650390625,
-0.038818359375,
0.0217742919921875,
-0.0173797607421875,
0.00806427001953125,
-0.0765380859375,
-0.0239715576171875,
0.0469970703125,
-0.06390380859375,
0.016845703125,
0.01265716552734375,
-0.022308349609375,
-0.0138397216796875,
-0.01552581787109375,
0.033172607421875,
0.0304107666015625,
-0.031524658203125,
0.0657958984375,
0.021759033203125,
-0.0226898193359375,
-0.046905517578125,
-0.041412353515625,
0.0185546875,
-0.007564544677734375,
-0.03460693359375,
0.0321044921875,
-0.01531219482421875,
-0.00797271728515625,
0.0008025169372558594,
-0.0009965896606445312,
-0.01342010498046875,
0.0077972412109375,
0.039276123046875,
0.013946533203125,
-0.01910400390625,
-0.0027980804443359375,
-0.00771331787109375,
-0.005859375,
0.005847930908203125,
0.0128631591796875,
0.049713134765625,
-0.01849365234375,
-0.040985107421875,
-0.04168701171875,
0.044830322265625,
0.022369384765625,
-0.006237030029296875,
0.081298828125,
0.06134033203125,
-0.051727294921875,
0.0222015380859375,
-0.049285888671875,
0.0005316734313964844,
-0.034454345703125,
0.028472900390625,
-0.0439453125,
-0.05145263671875,
0.050140380859375,
-0.0218505859375,
-0.0195159912109375,
0.0594482421875,
0.06207275390625,
-0.014923095703125,
0.0828857421875,
0.04461669921875,
-0.0364990234375,
0.0069427490234375,
-0.0256805419921875,
0.03460693359375,
-0.03753662109375,
-0.0452880859375,
-0.042266845703125,
-0.050994873046875,
-0.040069580078125,
-0.0075531005859375,
0.01129913330078125,
0.0087890625,
-0.018707275390625,
0.01605224609375,
-0.03594970703125,
0.0207672119140625,
0.045196533203125,
0.01739501953125,
-0.0004596710205078125,
-0.00653076171875,
-0.037200927734375,
-0.01270294189453125,
-0.0263214111328125,
-0.042327880859375,
0.0692138671875,
0.0171356201171875,
0.0380859375,
0.04290771484375,
0.07403564453125,
0.0264129638671875,
0.0294342041015625,
-0.03765869140625,
0.049407958984375,
0.034454345703125,
-0.05078125,
-0.0270233154296875,
0.00054168701171875,
-0.082275390625,
0.003643035888671875,
-0.0020732879638671875,
-0.070068359375,
0.0269775390625,
-0.021575927734375,
-0.0295562744140625,
0.032379150390625,
-0.02593994140625,
0.037628173828125,
-0.0293426513671875,
-0.00963592529296875,
-0.009613037109375,
-0.0712890625,
0.003475189208984375,
0.003978729248046875,
0.03338623046875,
-0.0095062255859375,
0.0043487548828125,
0.08056640625,
-0.053741455078125,
0.03973388671875,
-0.0132598876953125,
0.019439697265625,
0.031402587890625,
0.010223388671875,
0.036712646484375,
0.0224761962890625,
0.00696563720703125,
0.0235137939453125,
0.00429534912109375,
-0.036590576171875,
-0.0217437744140625,
0.07330322265625,
-0.08795166015625,
-0.047607421875,
-0.06365966796875,
-0.007564544677734375,
0.01065826416015625,
0.047943115234375,
0.03411865234375,
0.026519775390625,
-0.020263671875,
-0.008941650390625,
0.03582763671875,
-0.0140380859375,
0.039825439453125,
0.035980224609375,
-0.0149993896484375,
-0.04852294921875,
0.062744140625,
0.0002009868621826172,
-0.030303955078125,
0.035858154296875,
-0.0086669921875,
-0.0254974365234375,
-0.0499267578125,
-0.0283966064453125,
0.02960205078125,
-0.0469970703125,
-0.0218048095703125,
-0.06011962890625,
-0.029205322265625,
-0.03826904296875,
-0.00423431396484375,
-0.006977081298828125,
-0.0250396728515625,
-0.037322998046875,
-0.0204315185546875,
0.022613525390625,
0.052459716796875,
-0.00009721517562866211,
0.0175323486328125,
-0.054412841796875,
0.019134521484375,
0.0097503662109375,
0.01165771484375,
0.0224761962890625,
-0.02496337890625,
-0.032318115234375,
0.001056671142578125,
-0.02490234375,
-0.0947265625,
0.06390380859375,
0.005092620849609375,
0.019683837890625,
0.042266845703125,
-0.0020580291748046875,
0.028594970703125,
-0.0225372314453125,
0.0582275390625,
0.0204315185546875,
-0.057830810546875,
0.0421142578125,
-0.01338958740234375,
0.0182647705078125,
0.048370361328125,
0.046234130859375,
-0.036895751953125,
-0.0003478527069091797,
-0.061187744140625,
-0.057373046875,
0.06060791015625,
0.035125732421875,
0.00146484375,
-0.0184173583984375,
0.005916595458984375,
-0.01220703125,
0.04302978515625,
-0.047210693359375,
-0.052032470703125,
0.002582550048828125,
-0.027069091796875,
0.01538848876953125,
-0.052032470703125,
-0.0085296630859375,
-0.03375244140625,
0.0631103515625,
0.038421630859375,
0.049530029296875,
0.02374267578125,
-0.0152587890625,
-0.01213836669921875,
0.0174713134765625,
0.041656494140625,
0.04718017578125,
-0.0245361328125,
0.003696441650390625,
0.0178070068359375,
-0.060882568359375,
0.0186767578125,
0.0012159347534179688,
-0.01849365234375,
0.0250701904296875,
0.0389404296875,
0.054779052734375,
0.0273284912109375,
-0.053497314453125,
0.039215087890625,
-0.0188140869140625,
-0.0200653076171875,
-0.05450439453125,
-0.01189422607421875,
0.020263671875,
0.006561279296875,
0.044525146484375,
-0.002498626708984375,
0.0167999267578125,
-0.03057861328125,
0.0086669921875,
0.0333251953125,
-0.0189971923828125,
-0.0036678314208984375,
0.048614501953125,
0.00870513916015625,
-0.04302978515625,
0.0634765625,
-0.0172576904296875,
-0.05413818359375,
0.0673828125,
0.041534423828125,
0.0635986328125,
-0.05316162109375,
0.0112457275390625,
0.048614501953125,
0.01053619384765625,
0.021728515625,
0.0531005859375,
0.002567291259765625,
-0.059906005859375,
-0.0021419525146484375,
-0.0672607421875,
-0.0218658447265625,
0.0146026611328125,
-0.0426025390625,
0.00730133056640625,
-0.052001953125,
-0.027618408203125,
0.01561737060546875,
0.0203704833984375,
-0.074462890625,
0.0289764404296875,
0.0231170654296875,
0.0628662109375,
-0.02923583984375,
0.06353759765625,
0.05828857421875,
-0.03448486328125,
-0.07330322265625,
-0.0089874267578125,
-0.0095062255859375,
-0.08135986328125,
0.07318115234375,
0.018218994140625,
-0.0079498291015625,
0.005645751953125,
-0.0418701171875,
-0.09759521484375,
0.06536865234375,
-0.0178375244140625,
-0.04400634765625,
0.0006847381591796875,
0.0160369873046875,
0.02117919921875,
-0.0260162353515625,
0.032684326171875,
0.016510009765625,
0.039764404296875,
0.016845703125,
-0.071044921875,
0.0243377685546875,
-0.04925537109375,
0.0016689300537109375,
0.024749755859375,
-0.042236328125,
0.060699462890625,
-0.02947998046875,
-0.02349853515625,
0.03753662109375,
0.06304931640625,
0.00565338134765625,
0.0290374755859375,
0.019500732421875,
0.059295654296875,
0.038482666015625,
-0.008209228515625,
0.062744140625,
-0.028167724609375,
0.03326416015625,
0.06072998046875,
-0.007358551025390625,
0.04998779296875,
0.0279998779296875,
-0.0367431640625,
0.0640869140625,
0.05126953125,
-0.01480865478515625,
0.0687255859375,
0.0058135986328125,
-0.050537109375,
-0.005878448486328125,
-0.014923095703125,
-0.033538818359375,
0.0209503173828125,
0.0232086181640625,
-0.044525146484375,
-0.0253143310546875,
0.0126953125,
0.0037250518798828125,
-0.00711822509765625,
-0.0184326171875,
0.061279296875,
0.00676727294921875,
-0.032928466796875,
0.048736572265625,
0.0282745361328125,
0.031646728515625,
-0.050079345703125,
-0.0116729736328125,
0.0083465576171875,
0.0236358642578125,
-0.0092010498046875,
-0.03363037109375,
0.006534576416015625,
-0.01715087890625,
-0.00927734375,
0.006107330322265625,
0.0748291015625,
-0.01522064208984375,
-0.06219482421875,
0.0137786865234375,
0.03082275390625,
0.012451171875,
-0.005523681640625,
-0.06781005859375,
0.00984954833984375,
0.013153076171875,
-0.0275726318359375,
0.0248870849609375,
0.01311492919921875,
-0.001949310302734375,
0.039276123046875,
0.037628173828125,
0.00499725341796875,
-0.00782012939453125,
-0.0016660690307617188,
0.06207275390625,
-0.07733154296875,
-0.03515625,
-0.0673828125,
0.0213165283203125,
0.001415252685546875,
-0.0142974853515625,
0.048858642578125,
0.0491943359375,
0.06072998046875,
-0.02630615234375,
0.053009033203125,
-0.0007543563842773438,
0.058197021484375,
-0.0165863037109375,
0.06817626953125,
-0.034698486328125,
-0.0217742919921875,
-0.024017333984375,
-0.039886474609375,
-0.01316070556640625,
0.057403564453125,
-0.033538818359375,
0.022491455078125,
0.03936767578125,
0.04327392578125,
0.020904541015625,
0.0013380050659179688,
0.008819580078125,
0.0179443359375,
-0.0054473876953125,
0.05108642578125,
0.053680419921875,
-0.038665771484375,
0.00609588623046875,
-0.0178375244140625,
-0.0074005126953125,
-0.0389404296875,
-0.03802490234375,
-0.07916259765625,
-0.03436279296875,
-0.020294189453125,
-0.0220947265625,
-0.0009121894836425781,
0.08331298828125,
0.024017333984375,
-0.0859375,
-0.01224517822265625,
0.01535797119140625,
0.0193328857421875,
-0.01776123046875,
-0.0180206298828125,
0.05865478515625,
0.0167999267578125,
-0.06134033203125,
-0.011383056640625,
0.010040283203125,
0.0235595703125,
0.00360107421875,
-0.0004482269287109375,
-0.046173095703125,
0.00899505615234375,
0.0170440673828125,
0.0209197998046875,
-0.06488037109375,
-0.010345458984375,
0.00011986494064331055,
-0.025634765625,
0.0014095306396484375,
0.0228424072265625,
-0.05145263671875,
0.01123809814453125,
0.01448822021484375,
0.03778076171875,
0.045562744140625,
-0.007904052734375,
0.00989532470703125,
-0.030181884765625,
0.007183074951171875,
0.042266845703125,
0.03155517578125,
0.006526947021484375,
-0.05169677734375,
0.024688720703125,
0.01332855224609375,
-0.0400390625,
-0.037322998046875,
0.01303863525390625,
-0.068603515625,
-0.0254058837890625,
0.054107666015625,
-0.00439453125,
-0.0212860107421875,
-0.00696563720703125,
-0.030517578125,
0.03057861328125,
-0.035430908203125,
0.0653076171875,
0.08502197265625,
-0.01192474365234375,
0.002063751220703125,
-0.022369384765625,
0.04693603515625,
0.041107177734375,
-0.039337158203125,
-0.057952880859375,
0.0107879638671875,
0.04052734375,
0.0203399658203125,
0.06390380859375,
0.00028228759765625,
0.0291900634765625,
-0.0135345458984375,
0.025787353515625,
0.01357269287109375,
-0.006988525390625,
0.007080078125,
0.005710601806640625,
0.01142120361328125,
-0.046234130859375
]
] |
cross-encoder/nli-MiniLM2-L6-H768 | 2021-08-05T08:40:39.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"MiniLMv2",
"zero-shot-classification",
"en",
"dataset:multi_nli",
"dataset:snli",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | zero-shot-classification | cross-encoder | null | null | cross-encoder/nli-MiniLM2-L6-H768 | 2 | 12,975 | transformers | 2022-03-02T23:29:05 | ---
language: en
pipeline_tag: zero-shot-classification
license: apache-2.0
tags:
- MiniLMv2
datasets:
- multi_nli
- snli
metrics:
- accuracy
---
# Cross-Encoder for Natural Language Inference
This model was trained using [SentenceTransformers](https://sbert.net) [Cross-Encoder](https://www.sbert.net/examples/applications/cross-encoder/README.html) class.
## Training Data
The model was trained on the [SNLI](https://nlp.stanford.edu/projects/snli/) and [MultiNLI](https://cims.nyu.edu/~sbowman/multinli/) datasets. For a given sentence pair, it will output three scores corresponding to the labels: contradiction, entailment, neutral.
## Performance
For evaluation results, see [SBERT.net - Pretrained Cross-Encoder](https://www.sbert.net/docs/pretrained_cross-encoders.html#nli).
## Usage
Pre-trained models can be used like this:
```python
from sentence_transformers import CrossEncoder
model = CrossEncoder('cross-encoder/nli-MiniLM2-L6-H768')
scores = model.predict([('A man is eating pizza', 'A man eats something'), ('A black race car starts up in front of a crowd of people.', 'A man is driving down a lonely road.')])
# Convert scores to labels
label_mapping = ['contradiction', 'entailment', 'neutral']
labels = [label_mapping[score_max] for score_max in scores.argmax(axis=1)]
```
## Usage with Transformers AutoModel
You can also use the model directly with the Transformers library (without the SentenceTransformers library):
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
model = AutoModelForSequenceClassification.from_pretrained('cross-encoder/nli-MiniLM2-L6-H768')
tokenizer = AutoTokenizer.from_pretrained('cross-encoder/nli-MiniLM2-L6-H768')
features = tokenizer(['A man is eating pizza', 'A black race car starts up in front of a crowd of people.'], ['A man eats something', 'A man is driving down a lonely road.'], padding=True, truncation=True, return_tensors="pt")
model.eval()
with torch.no_grad():
scores = model(**features).logits
label_mapping = ['contradiction', 'entailment', 'neutral']
labels = [label_mapping[score_max] for score_max in scores.argmax(dim=1)]
print(labels)
```
## Zero-Shot Classification
This model can also be used for zero-shot-classification:
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification", model='cross-encoder/nli-MiniLM2-L6-H768')
sent = "Apple just announced the newest iPhone X"
candidate_labels = ["technology", "sports", "politics"]
res = classifier(sent, candidate_labels)
print(res)
``` | 2,567 | [
[
-0.0166778564453125,
-0.055511474609375,
0.02093505859375,
0.01282501220703125,
0.0014600753784179688,
-0.0101776123046875,
-0.00843048095703125,
-0.023040771484375,
0.0113067626953125,
0.032806396484375,
-0.04315185546875,
-0.03594970703125,
-0.0384521484375,
0.01446533203125,
-0.040863037109375,
0.08880615234375,
-0.00030493736267089844,
0.0019311904907226562,
-0.01294708251953125,
-0.0095367431640625,
-0.0174407958984375,
-0.0303955078125,
-0.035186767578125,
-0.04498291015625,
0.0277557373046875,
0.01486968994140625,
0.045623779296875,
0.0301513671875,
0.0112152099609375,
0.0280914306640625,
0.00608062744140625,
-0.01351165771484375,
-0.01552581787109375,
-0.012451171875,
0.0013360977172851562,
-0.044708251953125,
-0.005474090576171875,
0.0142669677734375,
0.0240478515625,
0.0287017822265625,
-0.0013484954833984375,
0.0235443115234375,
-0.00794219970703125,
0.012176513671875,
-0.040802001953125,
0.007282257080078125,
-0.038818359375,
0.0139617919921875,
0.005035400390625,
0.0013217926025390625,
-0.0350341796875,
-0.0245513916015625,
0.006481170654296875,
-0.036834716796875,
0.028594970703125,
0.010955810546875,
0.09722900390625,
0.031585693359375,
-0.0225982666015625,
-0.032745361328125,
-0.0360107421875,
0.07000732421875,
-0.0758056640625,
0.0196380615234375,
0.016082763671875,
0.0024662017822265625,
0.00919342041015625,
-0.060455322265625,
-0.0709228515625,
-0.012115478515625,
-0.0243682861328125,
0.0291748046875,
-0.02569580078125,
-0.004146575927734375,
0.031982421875,
0.027374267578125,
-0.0626220703125,
0.0014362335205078125,
-0.0300750732421875,
-0.0158233642578125,
0.056976318359375,
0.0133514404296875,
0.0183868408203125,
-0.03857421875,
-0.0279998779296875,
-0.010833740234375,
-0.01253509521484375,
0.01323699951171875,
0.020233154296875,
0.0017976760864257812,
-0.0239410400390625,
0.06463623046875,
-0.0264129638671875,
0.05963134765625,
0.016632080078125,
-0.0086822509765625,
0.050628662109375,
-0.0293731689453125,
-0.035888671875,
0.0224151611328125,
0.0797119140625,
0.0303192138671875,
0.0206146240234375,
-0.00418853759765625,
-0.00839996337890625,
0.03131103515625,
-0.01055908203125,
-0.055572509765625,
-0.023773193359375,
0.026458740234375,
-0.0283966064453125,
-0.026336669921875,
0.0004220008850097656,
-0.06298828125,
-0.0028476715087890625,
-0.007633209228515625,
0.061859130859375,
-0.039031982421875,
0.01561737060546875,
0.022674560546875,
-0.02520751953125,
0.032684326171875,
-0.01464080810546875,
-0.060577392578125,
0.002994537353515625,
0.02215576171875,
0.059478759765625,
0.0171356201171875,
-0.034576416015625,
-0.0265045166015625,
0.0022125244140625,
0.0032634735107421875,
0.038330078125,
-0.0282440185546875,
-0.0032978057861328125,
-0.01372528076171875,
0.00658416748046875,
-0.0240325927734375,
-0.026885986328125,
0.04949951171875,
-0.0194244384765625,
0.0467529296875,
0.025054931640625,
-0.05926513671875,
-0.02294921875,
0.019561767578125,
-0.0313720703125,
0.09088134765625,
0.004619598388671875,
-0.06488037109375,
0.01187896728515625,
-0.0440673828125,
-0.0347900390625,
-0.0218963623046875,
-0.006977081298828125,
-0.046905517578125,
0.0064849853515625,
0.0301513671875,
0.029205322265625,
-0.0182037353515625,
0.0301513671875,
-0.020751953125,
-0.0277557373046875,
0.0213470458984375,
-0.03863525390625,
0.08721923828125,
0.007022857666015625,
-0.040496826171875,
0.0178375244140625,
-0.058013916015625,
0.01021575927734375,
0.01157379150390625,
-0.01959228515625,
-0.0027713775634765625,
-0.0225067138671875,
0.01013946533203125,
0.0243988037109375,
0.0006060600280761719,
-0.0552978515625,
-0.000453948974609375,
-0.03875732421875,
0.04486083984375,
0.0305328369140625,
-0.0045013427734375,
0.0268096923828125,
-0.020538330078125,
0.0183868408203125,
-0.0028553009033203125,
0.00572967529296875,
-0.00611114501953125,
-0.053436279296875,
-0.0826416015625,
-0.0017299652099609375,
0.034027099609375,
0.06451416015625,
-0.0712890625,
0.06793212890625,
-0.017822265625,
-0.04547119140625,
-0.058258056640625,
-0.01441192626953125,
0.018280029296875,
0.04290771484375,
0.04620361328125,
0.0008091926574707031,
-0.05718994140625,
-0.0587158203125,
-0.0292816162109375,
-0.00476837158203125,
-0.012542724609375,
0.0014848709106445312,
0.061676025390625,
-0.032958984375,
0.08447265625,
-0.03955078125,
-0.0191497802734375,
-0.040557861328125,
0.02899169921875,
0.0372314453125,
0.05047607421875,
0.0268096923828125,
-0.04278564453125,
-0.03173828125,
-0.021728515625,
-0.0633544921875,
-0.01116180419921875,
-0.025482177734375,
-0.00008291006088256836,
0.00598907470703125,
0.023956298828125,
-0.045135498046875,
0.050506591796875,
0.031341552734375,
-0.03546142578125,
0.04052734375,
-0.00909423828125,
-0.003692626953125,
-0.07403564453125,
-0.01219940185546875,
0.01415252685546875,
-0.01197052001953125,
-0.0596923828125,
-0.011322021484375,
-0.0086822509765625,
-0.0056915283203125,
-0.034088134765625,
0.04498291015625,
-0.02166748046875,
0.007228851318359375,
-0.003452301025390625,
0.00975799560546875,
0.01464080810546875,
0.043670654296875,
0.0181732177734375,
0.037445068359375,
0.054443359375,
-0.039825439453125,
0.037994384765625,
0.023590087890625,
-0.033294677734375,
0.018829345703125,
-0.060302734375,
-0.00337982177734375,
-0.011810302734375,
0.0168609619140625,
-0.0689697265625,
-0.0132293701171875,
0.02886962890625,
-0.048736572265625,
-0.0009965896606445312,
0.0126190185546875,
-0.036956787109375,
-0.038482666015625,
-0.0089111328125,
0.02618408203125,
0.040130615234375,
-0.03472900390625,
0.057037353515625,
0.00982666015625,
0.02783203125,
-0.035675048828125,
-0.08563232421875,
-0.0032367706298828125,
-0.01331329345703125,
-0.038604736328125,
0.018218994140625,
0.0012178421020507812,
0.00012373924255371094,
0.0099639892578125,
0.0019550323486328125,
-0.00962066650390625,
-0.00977325439453125,
0.01537322998046875,
0.0214691162109375,
-0.019378662109375,
-0.00182342529296875,
-0.007022857666015625,
-0.0157012939453125,
0.0108489990234375,
-0.0205078125,
0.04217529296875,
-0.023040771484375,
-0.0183563232421875,
-0.047760009765625,
0.02032470703125,
0.018524169921875,
-0.0091552734375,
0.05474853515625,
0.07269287109375,
-0.03143310546875,
-0.002620697021484375,
-0.033355712890625,
-0.0193023681640625,
-0.0322265625,
0.032562255859375,
-0.0234375,
-0.050628662109375,
0.0254058837890625,
0.0218048095703125,
-0.01416778564453125,
0.04962158203125,
0.0372314453125,
-0.005771636962890625,
0.07275390625,
0.0310821533203125,
-0.0167388916015625,
0.0200653076171875,
-0.045501708984375,
0.0249176025390625,
-0.047882080078125,
-0.0183563232421875,
-0.03594970703125,
-0.020599365234375,
-0.043548583984375,
-0.02685546875,
0.01348114013671875,
0.007579803466796875,
-0.0225067138671875,
0.033966064453125,
-0.03717041015625,
0.033905029296875,
0.056121826171875,
0.004848480224609375,
0.0051727294921875,
0.0015926361083984375,
-0.00943756103515625,
0.0009059906005859375,
-0.06768798828125,
-0.033905029296875,
0.0621337890625,
0.0225677490234375,
0.059539794921875,
-0.0110321044921875,
0.06195068359375,
0.0029392242431640625,
0.0196380615234375,
-0.0574951171875,
0.037872314453125,
-0.0207061767578125,
-0.05963134765625,
-0.0181732177734375,
-0.035797119140625,
-0.063232421875,
0.0188140869140625,
-0.028839111328125,
-0.057830810546875,
0.0189056396484375,
-0.0098419189453125,
-0.040130615234375,
0.02276611328125,
-0.06427001953125,
0.09307861328125,
-0.027679443359375,
-0.018218994140625,
0.012237548828125,
-0.059722900390625,
0.0286407470703125,
0.0094757080078125,
0.00751495361328125,
-0.01549530029296875,
0.01983642578125,
0.065185546875,
-0.01515960693359375,
0.0736083984375,
-0.0034809112548828125,
0.01922607421875,
0.033203125,
-0.0222320556640625,
0.01399993896484375,
0.00673675537109375,
-0.0263519287109375,
0.028778076171875,
-0.0077056884765625,
-0.0277862548828125,
-0.04583740234375,
0.03619384765625,
-0.06976318359375,
-0.027069091796875,
-0.04248046875,
-0.03350830078125,
0.0151214599609375,
0.017730712890625,
0.055908203125,
0.03497314453125,
0.004123687744140625,
0.004730224609375,
0.0266876220703125,
-0.0255584716796875,
0.05584716796875,
0.009063720703125,
-0.011322021484375,
-0.03466796875,
0.06304931640625,
-0.0036907196044921875,
0.01438140869140625,
0.03070068359375,
0.0251312255859375,
-0.041015625,
-0.01404571533203125,
-0.032501220703125,
0.0220794677734375,
-0.0438232421875,
-0.0160675048828125,
-0.04962158203125,
-0.045501708984375,
-0.046356201171875,
-0.00826263427734375,
-0.0156402587890625,
-0.021697998046875,
-0.037322998046875,
-0.007904052734375,
0.0227813720703125,
0.036163330078125,
-0.0037689208984375,
0.032684326171875,
-0.052337646484375,
0.034942626953125,
0.01442718505859375,
0.004268646240234375,
-0.0086669921875,
-0.056304931640625,
-0.013824462890625,
0.0007891654968261719,
-0.0280609130859375,
-0.07232666015625,
0.046844482421875,
0.023590087890625,
0.04803466796875,
0.01861572265625,
0.0184326171875,
0.053375244140625,
-0.026611328125,
0.056793212890625,
0.0286865234375,
-0.09417724609375,
0.04547119140625,
0.01099395751953125,
0.0310821533203125,
0.032989501953125,
0.0316162109375,
-0.05352783203125,
-0.03411865234375,
-0.044403076171875,
-0.0626220703125,
0.056976318359375,
0.036041259765625,
0.00759124755859375,
-0.004604339599609375,
0.01256561279296875,
0.004459381103515625,
0.014892578125,
-0.10369873046875,
-0.040008544921875,
-0.04718017578125,
-0.043914794921875,
-0.023284912109375,
-0.0002849102020263672,
0.004962921142578125,
-0.0469970703125,
0.0634765625,
-0.0014896392822265625,
0.032928466796875,
0.03985595703125,
-0.017059326171875,
0.0245513916015625,
0.0237579345703125,
0.0404052734375,
0.0136566162109375,
-0.0193939208984375,
0.01214599609375,
0.0299835205078125,
-0.0195159912109375,
0.0198974609375,
0.0158843994140625,
-0.032318115234375,
0.0172576904296875,
0.043609619140625,
0.0947265625,
-0.0016374588012695312,
-0.034881591796875,
0.04071044921875,
0.0034999847412109375,
-0.0165252685546875,
-0.0306549072265625,
0.004695892333984375,
-0.004520416259765625,
0.02362060546875,
0.0174102783203125,
0.015960693359375,
0.006671905517578125,
-0.047576904296875,
0.0254364013671875,
0.0098876953125,
-0.041778564453125,
-0.01456451416015625,
0.060760498046875,
0.0017690658569335938,
-0.03497314453125,
0.050323486328125,
-0.0255889892578125,
-0.05157470703125,
0.0478515625,
0.044464111328125,
0.072509765625,
0.0021076202392578125,
0.0241851806640625,
0.048095703125,
0.03564453125,
-0.0017652511596679688,
0.00832366943359375,
0.00418853759765625,
-0.07342529296875,
-0.02276611328125,
-0.0523681640625,
-0.0036163330078125,
0.00858306884765625,
-0.050537109375,
0.01385498046875,
-0.0154876708984375,
-0.006313323974609375,
0.006168365478515625,
-0.0138397216796875,
-0.051177978515625,
0.023956298828125,
0.01971435546875,
0.06549072265625,
-0.08270263671875,
0.0765380859375,
0.038970947265625,
-0.049896240234375,
-0.06298828125,
0.01204681396484375,
-0.0114288330078125,
-0.053436279296875,
0.04864501953125,
0.038116455078125,
0.00614166259765625,
0.01108551025390625,
-0.0308380126953125,
-0.052825927734375,
0.0731201171875,
0.00795745849609375,
-0.033477783203125,
-0.0089874267578125,
0.0228271484375,
0.04736328125,
-0.030059814453125,
0.05584716796875,
0.056732177734375,
0.033721923828125,
-0.004474639892578125,
-0.051116943359375,
0.00467681884765625,
-0.01137542724609375,
-0.00443267822265625,
-0.01198577880859375,
-0.0308380126953125,
0.0684814453125,
-0.022491455078125,
-0.001331329345703125,
0.0140838623046875,
0.05706787109375,
0.0240478515625,
0.042724609375,
0.0384521484375,
0.059539794921875,
0.044586181640625,
-0.01947021484375,
0.07135009765625,
-0.00983428955078125,
0.058868408203125,
0.08209228515625,
-0.01218414306640625,
0.06561279296875,
0.03521728515625,
-0.0098876953125,
0.053863525390625,
0.0506591796875,
-0.0280914306640625,
0.0418701171875,
0.0192108154296875,
-0.006244659423828125,
-0.019775390625,
0.01202392578125,
-0.0240936279296875,
0.053863525390625,
0.00888824462890625,
-0.0311431884765625,
-0.0177459716796875,
0.01346588134765625,
-0.0177001953125,
-0.0038166046142578125,
-0.01323699951171875,
0.04144287109375,
-0.005596160888671875,
-0.0438232421875,
0.051300048828125,
0.007228851318359375,
0.0733642578125,
-0.030303955078125,
0.00896453857421875,
-0.00177764892578125,
0.0203704833984375,
-0.0230712890625,
-0.0670166015625,
0.0244140625,
-0.003948211669921875,
-0.010772705078125,
-0.0043487548828125,
0.03863525390625,
-0.052215576171875,
-0.061798095703125,
0.035919189453125,
0.019989013671875,
0.019073486328125,
0.00907135009765625,
-0.0772705078125,
-0.005573272705078125,
0.0185089111328125,
-0.015228271484375,
-0.0078277587890625,
0.0280914306640625,
0.0247344970703125,
0.035064697265625,
0.038421630859375,
-0.0090179443359375,
0.031646728515625,
0.01267242431640625,
0.043304443359375,
-0.06304931640625,
-0.0283050537109375,
-0.07293701171875,
0.04925537109375,
-0.01262664794921875,
-0.0377197265625,
0.06854248046875,
0.0633544921875,
0.07354736328125,
-0.0223388671875,
0.05169677734375,
-0.0180511474609375,
0.024658203125,
-0.047210693359375,
0.0499267578125,
-0.04718017578125,
0.0054779052734375,
-0.005039215087890625,
-0.048431396484375,
-0.034332275390625,
0.067138671875,
-0.029083251953125,
0.010101318359375,
0.050506591796875,
0.0721435546875,
-0.004230499267578125,
0.0069122314453125,
0.0127410888671875,
0.02569580078125,
0.00598907470703125,
0.05157470703125,
0.060760498046875,
-0.07012939453125,
0.05462646484375,
-0.041961669921875,
-0.001873016357421875,
-0.006160736083984375,
-0.050201416015625,
-0.06982421875,
-0.02886962890625,
-0.03857421875,
-0.0294342041015625,
-0.0119476318359375,
0.058258056640625,
0.056304931640625,
-0.08099365234375,
-0.0210113525390625,
-0.01393890380859375,
0.020599365234375,
-0.019073486328125,
-0.026275634765625,
0.0178680419921875,
-0.0177001953125,
-0.06072998046875,
0.021636962890625,
-0.0012454986572265625,
0.0045013427734375,
-0.003574371337890625,
-0.00914764404296875,
-0.042083740234375,
0.001728057861328125,
0.034149169921875,
0.01313018798828125,
-0.07659912109375,
-0.0297088623046875,
-0.004512786865234375,
-0.018402099609375,
0.01300048828125,
0.03436279296875,
-0.06231689453125,
0.0174713134765625,
0.034637451171875,
0.045501708984375,
0.053558349609375,
-0.01294708251953125,
0.02032470703125,
-0.055908203125,
0.00925445556640625,
0.01145172119140625,
0.032745361328125,
0.022308349609375,
-0.0153656005859375,
0.03680419921875,
0.0380859375,
-0.04034423828125,
-0.04840087890625,
0.003864288330078125,
-0.07171630859375,
-0.0225677490234375,
0.08001708984375,
-0.006290435791015625,
-0.033050537109375,
-0.0116424560546875,
-0.01080322265625,
0.043212890625,
-0.0206298828125,
0.04229736328125,
0.032806396484375,
-0.0222320556640625,
-0.0206146240234375,
-0.037017822265625,
0.0231475830078125,
0.040740966796875,
-0.06060791015625,
-0.0234375,
0.01055908203125,
0.0321044921875,
0.026123046875,
0.03021240234375,
0.01678466796875,
0.00028586387634277344,
0.01904296875,
0.026123046875,
0.00782012939453125,
-0.0077972412109375,
-0.036346435546875,
0.01206207275390625,
-0.045196533203125,
-0.043212890625
]
] |
microsoft/xclip-base-patch32 | 2022-10-12T11:31:02.000Z | [
"transformers",
"pytorch",
"xclip",
"vision",
"video-classification",
"en",
"arxiv:2208.02816",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | video-classification | microsoft | null | null | microsoft/xclip-base-patch32 | 36 | 12,967 | transformers | 2022-08-25T13:06:15 | ---
language: en
license: mit
tags:
- vision
- video-classification
model-index:
- name: nielsr/xclip-base-patch32
results:
- task:
type: video-classification
dataset:
name: Kinetics 400
type: kinetics-400
metrics:
- type: top-1 accuracy
value: 80.4
- type: top-5 accuracy
value: 95.0
---
# X-CLIP (base-sized model)
X-CLIP model (base-sized, patch resolution of 32) trained fully-supervised on [Kinetics-400](https://www.deepmind.com/open-source/kinetics). It was introduced in the paper [Expanding Language-Image Pretrained Models for General Video Recognition](https://arxiv.org/abs/2208.02816) by Ni et al. and first released in [this repository](https://github.com/microsoft/VideoX/tree/master/X-CLIP).
This model was trained using 8 frames per video, at a resolution of 224x224.
Disclaimer: The team releasing X-CLIP did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
X-CLIP is a minimal extension of [CLIP](https://huggingface.co/docs/transformers/model_doc/clip) for general video-language understanding. The model is trained in a contrastive way on (video, text) pairs.

This allows the model to be used for tasks like zero-shot, few-shot or fully supervised video classification and video-text retrieval.
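The CLIP-style contrastive objective described above can be sketched in a few lines. This is an illustrative NumPy sketch, not the model's actual training code: the temperature value and the symmetric cross-entropy form are the standard CLIP formulation, which X-CLIP follows.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def contrastive_loss(video_emb, text_emb, temperature=0.07):
    """Symmetric contrastive loss over a batch of (video, text) embedding pairs."""
    # L2-normalize so the dot product is cosine similarity
    v = video_emb / np.linalg.norm(video_emb, axis=1, keepdims=True)
    t = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    logits = v @ t.T / temperature
    n = logits.shape[0]
    idx = np.arange(n)
    # matched (video, text) pairs sit on the diagonal
    p_video_to_text = softmax(logits, axis=1)[idx, idx]
    p_text_to_video = softmax(logits, axis=0)[idx, idx]
    return -0.5 * (np.log(p_video_to_text).mean() + np.log(p_text_to_video).mean())
```

Training pushes matched pairs up the diagonal: the loss is low when each video embedding is closest to its own caption and high when pairs are shuffled.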
## Intended uses & limitations
You can use the raw model for determining how well text goes with a given video. See the [model hub](https://huggingface.co/models?search=microsoft/xclip) to look for
fine-tuned versions on a task that interests you.
### How to use
For code examples, we refer to the [documentation](https://huggingface.co/transformers/main/model_doc/xclip.html).
## Training data
This model was trained on [Kinetics-400](https://www.deepmind.com/open-source/kinetics).
### Preprocessing
The exact details of preprocessing during training can be found [here](https://github.com/microsoft/VideoX/blob/40f6d177e0a057a50ac69ac1de6b5938fd268601/X-CLIP/datasets/build.py#L247).
The exact details of preprocessing during validation can be found [here](https://github.com/microsoft/VideoX/blob/40f6d177e0a057a50ac69ac1de6b5938fd268601/X-CLIP/datasets/build.py#L285).
During validation, the shorter edge of each frame is resized, after which a center crop to a fixed resolution (e.g. 224x224) is taken. Frames are then normalized per RGB channel using the ImageNet mean and standard deviation.
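A minimal sketch of this validation-time preprocessing in NumPy (the crop size and ImageNet statistics are the standard values; the exact pipeline, which uses bilinear interpolation rather than the nearest-neighbour resize below, is in the linked `build.py`):

```python
import numpy as np

IMAGENET_MEAN = np.array([0.485, 0.456, 0.406])
IMAGENET_STD = np.array([0.229, 0.224, 0.225])

def preprocess_frame(frame, size=224):
    """Resize the shorter edge to `size`, center-crop to size x size, normalize."""
    h, w, _ = frame.shape
    # resize so the shorter edge equals `size` (nearest-neighbour for brevity)
    scale = size / min(h, w)
    new_h, new_w = int(round(h * scale)), int(round(w * scale))
    rows = (np.arange(new_h) / scale).astype(int).clip(0, h - 1)
    cols = (np.arange(new_w) / scale).astype(int).clip(0, w - 1)
    frame = frame[rows][:, cols]
    # center crop to size x size
    top = (new_h - size) // 2
    left = (new_w - size) // 2
    frame = frame[top:top + size, left:left + size]
    # scale to [0, 1], then normalize per RGB channel
    frame = frame.astype(np.float32) / 255.0
    return (frame - IMAGENET_MEAN) / IMAGENET_STD
```

Applied per frame, this yields the (size, size, 3) normalized arrays the model consumes after stacking along the time axis.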
## Evaluation results
This model achieves a top-1 accuracy of 80.4% and a top-5 accuracy of 95.0%.
| 2,728 | [
[
-0.046356201171875,
-0.034637451171875,
0.02557373046875,
0.00395965576171875,
-0.0238494873046875,
0.003971099853515625,
-0.017913818359375,
-0.01543426513671875,
0.025726318359375,
0.025146484375,
-0.06671142578125,
-0.047027587890625,
-0.060546875,
-0.0167999267578125,
-0.04766845703125,
0.07952880859375,
-0.025543212890625,
-0.004604339599609375,
-0.02069091796875,
-0.024688720703125,
-0.025146484375,
-0.04864501953125,
-0.01409149169921875,
-0.01154327392578125,
-0.00923919677734375,
0.02423095703125,
0.0655517578125,
0.0703125,
0.042510986328125,
0.0272979736328125,
-0.0177459716796875,
-0.02008056640625,
-0.0266265869140625,
-0.035369873046875,
-0.00962066650390625,
-0.02734375,
-0.043701171875,
0.016387939453125,
0.05584716796875,
0.0137786865234375,
-0.003063201904296875,
0.034027099609375,
-0.0118560791015625,
0.0292205810546875,
-0.052459716796875,
-0.0026721954345703125,
-0.0281524658203125,
0.0229339599609375,
-0.0019550323486328125,
0.0025157928466796875,
0.0030193328857421875,
-0.002628326416015625,
0.019073486328125,
-0.0289154052734375,
0.033172607421875,
-0.0201416015625,
0.09356689453125,
0.018463134765625,
-0.0025730133056640625,
0.0143890380859375,
-0.0330810546875,
0.057098388671875,
-0.033172607421875,
0.04620361328125,
0.021759033203125,
0.029815673828125,
0.0276031494140625,
-0.06109619140625,
-0.037445068359375,
-0.0029449462890625,
0.020263671875,
-0.00937652587890625,
-0.04498291015625,
0.015472412109375,
0.053955078125,
0.033660888671875,
-0.04425048828125,
0.002925872802734375,
-0.0302734375,
-0.0218505859375,
0.044525146484375,
0.010711669921875,
0.035308837890625,
-0.0126190185546875,
-0.0390625,
-0.041961669921875,
-0.025665283203125,
0.022369384765625,
0.002429962158203125,
-0.0038242340087890625,
-0.038665771484375,
0.0357666015625,
-0.009246826171875,
0.032958984375,
0.0095977783203125,
-0.032257080078125,
0.0296478271484375,
-0.0223236083984375,
-0.034271240234375,
-0.00588226318359375,
0.06884765625,
0.049468994140625,
0.021453857421875,
0.01438140869140625,
-0.0018148422241210938,
0.028594970703125,
0.0157012939453125,
-0.0806884765625,
-0.0224609375,
0.00274658203125,
-0.0292816162109375,
-0.01079559326171875,
-0.006381988525390625,
-0.041778564453125,
0.0277862548828125,
-0.033477783203125,
0.050994873046875,
-0.037139892578125,
-0.0077972412109375,
0.0164642333984375,
-0.01934814453125,
0.00036072731018066406,
0.0264434814453125,
-0.04534912109375,
0.037933349609375,
0.0261688232421875,
0.07672119140625,
-0.0143585205078125,
-0.024993896484375,
-0.03436279296875,
-0.00634002685546875,
-0.0035953521728515625,
0.050262451171875,
-0.0224609375,
-0.0057220458984375,
0.0016994476318359375,
0.01727294921875,
0.01482391357421875,
-0.038177490234375,
0.0290374755859375,
-0.027191162109375,
-0.000728607177734375,
-0.00746917724609375,
-0.035858154296875,
-0.029541015625,
0.02734375,
-0.036834716796875,
0.064208984375,
0.01160430908203125,
-0.043548583984375,
0.034881591796875,
-0.0302581787109375,
0.0008106231689453125,
-0.0206298828125,
-0.0095977783203125,
-0.0418701171875,
-0.006641387939453125,
0.0214996337890625,
0.0297088623046875,
-0.002819061279296875,
0.01494598388671875,
-0.040985107421875,
-0.007762908935546875,
0.007213592529296875,
-0.0249481201171875,
0.045196533203125,
-0.004421234130859375,
-0.0215606689453125,
0.0299072265625,
-0.060882568359375,
0.02471923828125,
-0.00691986083984375,
0.00677490234375,
0.0089263916015625,
-0.033172607421875,
-0.0026760101318359375,
0.039337158203125,
-0.0082244873046875,
-0.044342041015625,
0.0135650634765625,
0.01192474365234375,
0.037567138671875,
0.036834716796875,
-0.01328277587890625,
0.0270538330078125,
0.0006160736083984375,
0.060882568359375,
-0.004375457763671875,
0.0236358642578125,
-0.031280517578125,
-0.018524169921875,
-0.0283660888671875,
-0.027557373046875,
0.0194244384765625,
0.0369873046875,
-0.03546142578125,
0.017791748046875,
-0.0207977294921875,
-0.0426025390625,
-0.022003173828125,
0.015838623046875,
0.041473388671875,
0.027099609375,
0.031280517578125,
-0.04388427734375,
-0.05908203125,
-0.06610107421875,
0.02191162109375,
0.0009169578552246094,
-0.01248931884765625,
0.0137786865234375,
0.049652099609375,
-0.01383209228515625,
0.08636474609375,
-0.060394287109375,
-0.02813720703125,
-0.01418304443359375,
0.0025615692138671875,
0.00634002685546875,
0.033416748046875,
0.06146240234375,
-0.0555419921875,
-0.03961181640625,
-0.0267181396484375,
-0.050384521484375,
0.00861358642578125,
0.0149078369140625,
0.0018033981323242188,
-0.00516510009765625,
0.03594970703125,
-0.0501708984375,
0.057891845703125,
0.056060791015625,
-0.013671875,
0.06622314453125,
-0.0079803466796875,
0.006439208984375,
-0.050537109375,
0.0037746429443359375,
0.0174713134765625,
-0.04705810546875,
-0.0288238525390625,
-0.000016808509826660156,
0.0015659332275390625,
-0.0302734375,
-0.06488037109375,
0.0305938720703125,
-0.029876708984375,
-0.009185791015625,
-0.02423095703125,
0.01010894775390625,
0.00928497314453125,
0.04937744140625,
0.0384521484375,
0.06488037109375,
0.04205322265625,
-0.047698974609375,
0.027130126953125,
0.037994384765625,
-0.0377197265625,
0.03582763671875,
-0.06414794921875,
-0.00513458251953125,
0.0033111572265625,
0.004150390625,
-0.054473876953125,
-0.0229034423828125,
0.0033512115478515625,
-0.0380859375,
0.025634765625,
-0.02276611328125,
-0.02703857421875,
-0.03753662109375,
-0.0305023193359375,
0.041473388671875,
0.05645751953125,
-0.032684326171875,
0.0169830322265625,
0.050567626953125,
0.00966644287109375,
-0.033782958984375,
-0.056182861328125,
-0.0051116943359375,
-0.00970458984375,
-0.054229736328125,
0.05743408203125,
-0.01447296142578125,
0.01132965087890625,
0.00403594970703125,
-0.00702667236328125,
-0.032684326171875,
-0.027191162109375,
0.025238037109375,
0.017791748046875,
-0.0113372802734375,
-0.0178070068359375,
-0.00897979736328125,
-0.00545501708984375,
0.01049041748046875,
0.00838470458984375,
0.0267791748046875,
-0.0083160400390625,
-0.01045989990234375,
-0.042144775390625,
0.027099609375,
0.0377197265625,
-0.004405975341796875,
0.025726318359375,
0.06182861328125,
-0.033416748046875,
0.005802154541015625,
-0.0430908203125,
-0.0163421630859375,
-0.03607177734375,
0.046051025390625,
-0.0091705322265625,
-0.06329345703125,
0.039520263671875,
0.0098419189453125,
-0.0257720947265625,
0.023651123046875,
0.0401611328125,
0.004291534423828125,
0.09674072265625,
0.07733154296875,
-0.0008740425109863281,
0.058258056640625,
-0.048004150390625,
-0.016357421875,
-0.06585693359375,
-0.006092071533203125,
-0.01050567626953125,
-0.0165557861328125,
-0.01910400390625,
-0.040130615234375,
0.0289459228515625,
0.020965576171875,
-0.0209197998046875,
0.06109619140625,
-0.03192138671875,
0.0304412841796875,
0.03369140625,
0.0187835693359375,
-0.0054931640625,
0.006900787353515625,
-0.013641357421875,
-0.034515380859375,
-0.0499267578125,
-0.0286407470703125,
0.051605224609375,
0.04388427734375,
0.056182861328125,
-0.009979248046875,
0.020782470703125,
0.02105712890625,
0.01024627685546875,
-0.061737060546875,
0.053009033203125,
-0.0179290771484375,
-0.047149658203125,
0.000762939453125,
-0.01788330078125,
-0.040985107421875,
-0.00809478759765625,
-0.0222320556640625,
-0.07421875,
0.0028743743896484375,
0.029022216796875,
-0.039703369140625,
0.048309326171875,
-0.042510986328125,
0.0821533203125,
-0.01593017578125,
-0.026123046875,
-0.0085601806640625,
-0.06011962890625,
0.031005859375,
0.0207977294921875,
-0.00504302978515625,
-0.012969970703125,
0.01324462890625,
0.069580078125,
-0.056640625,
0.06622314453125,
-0.00811004638671875,
0.01404571533203125,
0.048675537109375,
-0.004558563232421875,
0.00971221923828125,
-0.0091705322265625,
0.028106689453125,
0.0231170654296875,
0.017974853515625,
-0.00045228004455566406,
-0.05743408203125,
0.0194091796875,
-0.061767578125,
-0.0302734375,
-0.01259613037109375,
-0.02740478515625,
0.00617218017578125,
0.0128173828125,
0.03692626953125,
0.03741455078125,
-0.011566162109375,
0.0192718505859375,
0.05438232421875,
-0.0108184814453125,
0.0289154052734375,
0.01319122314453125,
-0.03863525390625,
-0.061279296875,
0.05841064453125,
0.0144195556640625,
0.0311279296875,
0.0300140380859375,
-0.00015091896057128906,
-0.0147705078125,
-0.0234527587890625,
-0.0310516357421875,
0.014801025390625,
-0.0555419921875,
-0.033905029296875,
-0.039642333984375,
-0.02593994140625,
-0.035125732421875,
0.0015411376953125,
-0.05181884765625,
-0.027496337890625,
-0.03436279296875,
-0.0170440673828125,
0.03173828125,
0.028839111328125,
-0.0225372314453125,
0.042755126953125,
-0.0869140625,
0.02227783203125,
0.0284576416015625,
0.038665771484375,
-0.0123748779296875,
-0.05816650390625,
-0.0211639404296875,
-0.0052337646484375,
-0.0491943359375,
-0.0638427734375,
0.041900634765625,
0.01519012451171875,
0.0357666015625,
0.04339599609375,
-0.017669677734375,
0.0623779296875,
-0.04632568359375,
0.07086181640625,
0.0372314453125,
-0.08233642578125,
0.04608154296875,
-0.0101318359375,
0.024871826171875,
0.03631591796875,
0.05938720703125,
-0.0340576171875,
0.0230255126953125,
-0.05584716796875,
-0.053131103515625,
0.050750732421875,
0.01036834716796875,
0.010986328125,
0.002471923828125,
0.02557373046875,
-0.0081939697265625,
0.022552490234375,
-0.043853759765625,
-0.0203857421875,
-0.0263824462890625,
0.01033782958984375,
-0.0021533966064453125,
-0.033477783203125,
0.0011425018310546875,
-0.047088623046875,
0.04498291015625,
-0.0147247314453125,
0.0307464599609375,
0.0229949951171875,
-0.0158843994140625,
-0.004138946533203125,
-0.01220703125,
0.046600341796875,
0.0277862548828125,
-0.035064697265625,
-0.01605224609375,
0.009796142578125,
-0.0548095703125,
0.024261474609375,
-0.0110015869140625,
-0.00984954833984375,
0.0164031982421875,
0.0264129638671875,
0.09503173828125,
0.03173828125,
-0.05572509765625,
0.06329345703125,
-0.002330780029296875,
-0.03533935546875,
-0.0272216796875,
0.004085540771484375,
-0.03155517578125,
0.01284027099609375,
0.004177093505859375,
0.01058197021484375,
0.00563812255859375,
-0.03656005859375,
0.04290771484375,
0.0341796875,
-0.049407958984375,
-0.056182861328125,
0.0543212890625,
-0.004772186279296875,
-0.0229034423828125,
0.049896240234375,
-0.01251983642578125,
-0.0732421875,
0.036773681640625,
0.0283660888671875,
0.0679931640625,
-0.0017480850219726562,
0.0233612060546875,
0.053436279296875,
-0.00811767578125,
-0.01020050048828125,
0.00775146484375,
-0.0279693603515625,
-0.05462646484375,
-0.047149658203125,
-0.0305938720703125,
-0.04327392578125,
0.00439453125,
-0.07928466796875,
0.056365966796875,
-0.037567138671875,
-0.046234130859375,
-0.0024242401123046875,
-0.003055572509765625,
-0.06396484375,
0.0308074951171875,
0.0289764404296875,
0.1143798828125,
-0.064453125,
0.055938720703125,
0.04315185546875,
-0.03936767578125,
-0.060302734375,
-0.031494140625,
0.00302886962890625,
-0.052764892578125,
0.053131103515625,
0.0204010009765625,
0.01195526123046875,
-0.020751953125,
-0.062255859375,
-0.080322265625,
0.07781982421875,
0.041839599609375,
-0.0301513671875,
-0.0217742919921875,
0.002079010009765625,
0.02587890625,
-0.0280303955078125,
0.03961181640625,
0.03936767578125,
0.0022525787353515625,
0.0306854248046875,
-0.0758056640625,
-0.008087158203125,
-0.019195556640625,
-0.00208282470703125,
0.00617218017578125,
-0.050872802734375,
0.0654296875,
-0.027679443359375,
-0.00640869140625,
0.010467529296875,
0.056427001953125,
0.0159912109375,
0.00540924072265625,
0.037200927734375,
0.06744384765625,
0.0221099853515625,
0.0079193115234375,
0.0770263671875,
-0.0166168212890625,
0.03729248046875,
0.08599853515625,
-0.003925323486328125,
0.063720703125,
0.0263214111328125,
-0.032806396484375,
0.047119140625,
0.035858154296875,
-0.02191162109375,
0.041656494140625,
0.0015516281127929688,
0.0021114349365234375,
-0.03070068359375,
-0.00994110107421875,
-0.02008056640625,
0.0391845703125,
0.01525115966796875,
-0.039093017578125,
-0.024627685546875,
0.0232696533203125,
0.01427459716796875,
-0.012847900390625,
-0.033355712890625,
0.043182373046875,
-0.0007047653198242188,
-0.050018310546875,
0.03656005859375,
-0.0155029296875,
0.050018310546875,
-0.06597900390625,
-0.0120086669921875,
0.0137939453125,
0.0261688232421875,
-0.0205841064453125,
-0.08489990234375,
0.034576416015625,
0.0163116455078125,
-0.032379150390625,
-0.0124053955078125,
0.04400634765625,
-0.0255889892578125,
-0.0270538330078125,
0.0242462158203125,
-0.00513458251953125,
0.024810791015625,
-0.032440185546875,
-0.060760498046875,
0.04498291015625,
0.00629425048828125,
-0.01006317138671875,
0.036407470703125,
0.01380157470703125,
-0.007373809814453125,
0.053192138671875,
0.030364990234375,
-0.00330352783203125,
-0.0036869049072265625,
-0.0091400146484375,
0.0777587890625,
-0.046356201171875,
-0.032440185546875,
-0.05975341796875,
0.0455322265625,
-0.0069427490234375,
-0.027679443359375,
0.053375244140625,
0.035675048828125,
0.07757568359375,
-0.0147857666015625,
0.03814697265625,
0.0018033981323242188,
0.018829345703125,
-0.039215087890625,
0.026611328125,
-0.0369873046875,
-0.0235443115234375,
-0.054595947265625,
-0.051910400390625,
-0.025909423828125,
0.04302978515625,
-0.0193939208984375,
0.00665283203125,
0.035614013671875,
0.0399169921875,
-0.016754150390625,
-0.01361083984375,
0.0250701904296875,
0.0029659271240234375,
0.0189666748046875,
0.033782958984375,
0.032379150390625,
-0.053253173828125,
0.063720703125,
-0.055938720703125,
-0.025909423828125,
-0.00787353515625,
-0.08441162109375,
-0.0653076171875,
-0.0306243896484375,
-0.06512451171875,
-0.02325439453125,
-0.01204681396484375,
0.07464599609375,
0.1021728515625,
-0.045166015625,
-0.0165557861328125,
0.0126800537109375,
-0.00875091552734375,
-0.01552581787109375,
-0.01605224609375,
0.024017333984375,
0.0081329345703125,
-0.04669189453125,
-0.0010242462158203125,
0.01398468017578125,
0.026123046875,
-0.0171356201171875,
0.0029811859130859375,
-0.006439208984375,
-0.02154541015625,
0.0509033203125,
0.049835205078125,
-0.027313232421875,
-0.032318115234375,
0.00833892822265625,
0.0172882080078125,
0.03399658203125,
0.056060791015625,
-0.0703125,
0.021942138671875,
0.035797119140625,
0.02508544921875,
0.06463623046875,
0.0090484619140625,
0.03369140625,
-0.06048583984375,
0.01561737060546875,
-0.00351715087890625,
0.04010009765625,
0.0301971435546875,
-0.0110015869140625,
0.033447265625,
0.027923583984375,
-0.0426025390625,
-0.06298828125,
0.01165008544921875,
-0.0880126953125,
0.0039043426513671875,
0.07012939453125,
-0.0291290283203125,
-0.042327880859375,
0.036285400390625,
-0.03106689453125,
0.014251708984375,
-0.0189666748046875,
0.0229949951171875,
0.03515625,
-0.0036869049072265625,
-0.044586181640625,
-0.02569580078125,
0.0394287109375,
0.0092620849609375,
-0.027099609375,
-0.0149078369140625,
0.015655517578125,
0.047943115234375,
0.030548095703125,
0.03265380859375,
-0.01178741455078125,
0.03106689453125,
0.0011796951293945312,
0.02740478515625,
-0.00482940673828125,
-0.02630615234375,
-0.027191162109375,
0.01364898681640625,
-0.024810791015625,
-0.030517578125
]
] |
TheBloke/Llama-2-7B-GPTQ | 2023-09-27T12:44:46.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"facebook",
"meta",
"pytorch",
"llama-2",
"en",
"arxiv:2307.09288",
"license:llama2",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Llama-2-7B-GPTQ | 64 | 12,943 | transformers | 2023-07-18T17:06:01 | ---
language:
- en
license: llama2
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
model_name: Llama 2 7B
base_model: meta-llama/Llama-2-7b-hf
inference: false
model_creator: Meta
model_type: llama
pipeline_tag: text-generation
prompt_template: '{prompt}
'
quantized_by: TheBloke
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Llama 2 7B - GPTQ
- Model creator: [Meta](https://huggingface.co/meta-llama)
- Original model: [Llama 2 7B](https://huggingface.co/meta-llama/Llama-2-7b-hf)
<!-- description start -->
## Description
This repo contains GPTQ model files for [Meta's Llama 2 7B](https://huggingface.co/meta-llama/Llama-2-7b-hf).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Llama-2-7B-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Llama-2-7B-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Llama-2-7B-GGUF)
* [Meta's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/meta-llama/Llama-2-7b-hf)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: None
```
{prompt}
```
<!-- prompt-template end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
All recent GPTQ files are made with AutoGPTQ, as are all files in non-main branches. Files in the `main` branch which were uploaded before August 2023 were made with GPTQ-for-LLaMa.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The dataset used for quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Llama-2-7B-GPTQ/tree/main) | 4 | 128 | No | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 3.90 GB | Yes | 4-bit, without Act Order and group size 128g. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7B-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 4.28 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7B-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 4.02 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
| [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/Llama-2-7B-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.01 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 3.90 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
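As a rough cross-check of the sizes in the table above, a quantised checkpoint's weight storage can be estimated from the bit width and group size. The helper below is a back-of-the-envelope sketch (the 16-bit-scale-per-group assumption is a simplification, and it ignores fp16 embeddings and other per-file overhead, so real files run somewhat larger):

```python
def gptq_size_gb(n_params, bits=4, group_size=128, scale_bits=16):
    """Rough on-disk size of a GPTQ checkpoint, in GB."""
    # packed quantised weights
    weights_bytes = n_params * bits / 8
    # each group stores a scale plus a packed zero-point
    overhead_bytes = (n_params / group_size) * (scale_bits + bits) / 8
    return (weights_bytes + overhead_bytes) / 1e9
```

For a 7B model this gives roughly 3.6 GB at 4-bit/128g and 4.0 GB at 4-bit/32g, consistent with the pattern in the table: smaller groups cost more storage (and VRAM) in exchange for accuracy.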
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download from branches
- In text-generation-webui, you can add `:branch` to the end of the download name, eg `TheBloke/Llama-2-7B-GPTQ:main`
- With Git, you can clone a branch with:
```
git clone --single-branch --branch main https://huggingface.co/TheBloke/Llama-2-7B-GPTQ
```
- In Python Transformers code, the branch is the `revision` parameter; see below.
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Llama-2-7B-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/Llama-2-7B-GPTQ:main`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Llama-2-7B-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-python start -->
## How to use this GPTQ model from Python code
### Install the necessary packages
Requires: Transformers 4.32.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install "transformers>=4.32.0" "optimum>=1.12.0"
pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7
```
If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
pip3 install .
```
### For CodeLlama models only: you must use Transformers 4.33.0 or later.
If 4.33.0 is not yet released when you read this, you will need to install Transformers from source:
```shell
pip3 uninstall -y transformers
pip3 install git+https://github.com/huggingface/transformers.git
```
### You can then use the following code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/Llama-2-7B-GPTQ"
# To use a different branch, change revision
# For example: revision="main"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=True,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Tell me about AI"
prompt_template=f'''{prompt}
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI).
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility.
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J. Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Meta's Llama 2 7B
# **Llama 2**
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 7B pretrained model, converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The 70B model uses Grouped-Query Attention (GQA) for improved inference scalability.
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** ["Llama-2: Open Foundation and Fine-tuned Chat Models"](https://arxiv.org/abs/2307.09288)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
To get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespaces and breaklines in between (we recommend calling `strip()` on inputs to avoid double-spaces). See our reference code in github for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212).
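The single-turn version of this template can be sketched in Python. The token strings below follow the format described above, but the authoritative template is the `chat_completion` reference code linked above, so treat this as an illustration only:

```python
# Minimal single-turn reconstruction of the Llama-2 chat prompt format.
# The BOS/EOS tokens (<s>, </s>) are normally added by the tokenizer,
# so they are omitted here.

B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_prompt(system_msg: str, user_msg: str) -> str:
    """Wrap one system message and one user turn in the chat template.

    `strip()` is applied to the inputs, per the card's recommendation
    to avoid double spaces around the tags.
    """
    return f"{B_INST} {B_SYS}{system_msg.strip()}{E_SYS}{user_msg.strip()} {E_INST}"

prompt = build_prompt("You are a helpful assistant.", "Tell me about AI")
print(prompt)
```

For multi-turn dialogue the reference code alternates `[INST] … [/INST]` blocks with the model's previous answers between them; see the linked `chat_completion` implementation for the full logic.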
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted(tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
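As a sanity check, the per-model figures in the table can be approximated as energy (GPU-hours times power draw) multiplied by a grid carbon intensity. The intensity factor below (~0.423 tCO<sub>2</sub>eq/MWh) is an assumption chosen to roughly reproduce the published numbers, not a value stated in the card:

```python
def estimated_emissions_tco2eq(gpu_hours: float, power_w: float,
                               carbon_intensity_t_per_mwh: float = 0.4234) -> float:
    """Estimate tCO2eq as energy (MWh) times grid carbon intensity.

    The default intensity of 0.4234 tCO2eq/MWh is an illustrative
    assumption that approximately reproduces the table above; Meta's
    exact methodology is described in the Llama 2 paper.
    """
    energy_mwh = gpu_hours * power_w / 1_000_000  # W*h -> MWh
    return energy_mwh * carbon_intensity_t_per_mwh

# Llama 2 7B: 184,320 GPU-hours at 400 W -> roughly 31.2 tCO2eq
print(round(estimated_emissions_tco2eq(184_320, 400), 1))
```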
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/)
## Reporting Issues
Please report any software “bug,” or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf)|
|70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf)|
| 23,069 | [
[
-0.0374755859375,
-0.058441162109375,
0.01467132568359375,
0.0184326171875,
-0.028839111328125,
-0.005977630615234375,
0.0114593505859375,
-0.041839599609375,
0.0183258056640625,
0.0305023193359375,
-0.05059814453125,
-0.04034423828125,
-0.0294952392578125,
0.0049285888671875,
-0.0292816162109375,
0.08514404296875,
0.0040740966796875,
-0.030242919921875,
-0.00353240966796875,
-0.0121612548828125,
-0.0202178955078125,
-0.0299224853515625,
-0.0496826171875,
-0.0227813720703125,
0.0302886962890625,
0.011322021484375,
0.061859130859375,
0.042266845703125,
0.017303466796875,
0.0266265869140625,
-0.00928497314453125,
-0.0009484291076660156,
-0.036865234375,
-0.007717132568359375,
0.01239013671875,
-0.0206451416015625,
-0.0545654296875,
0.0087738037109375,
0.0278472900390625,
0.0092010498046875,
-0.032196044921875,
0.0211029052734375,
-0.0016698837280273438,
0.04620361328125,
-0.033203125,
0.01271820068359375,
-0.03167724609375,
0.0025920867919921875,
-0.00797271728515625,
0.0098724365234375,
-0.0027027130126953125,
-0.03680419921875,
0.00876617431640625,
-0.06964111328125,
0.0115814208984375,
-0.00583648681640625,
0.09283447265625,
0.01407623291015625,
-0.051177978515625,
0.00809478759765625,
-0.0304107666015625,
0.0501708984375,
-0.07196044921875,
0.017852783203125,
0.0396728515625,
0.0160064697265625,
-0.0178985595703125,
-0.0709228515625,
-0.046844482421875,
-0.00708770751953125,
-0.01056671142578125,
0.021209716796875,
-0.037353515625,
-0.000007569789886474609,
0.03155517578125,
0.053741455078125,
-0.06256103515625,
-0.013336181640625,
-0.022369384765625,
-0.01078033447265625,
0.06573486328125,
0.0155181884765625,
0.0283660888671875,
-0.0219268798828125,
-0.025299072265625,
-0.03155517578125,
-0.055755615234375,
0.003936767578125,
0.0282135009765625,
-0.00004762411117553711,
-0.050201416015625,
0.037933349609375,
-0.02813720703125,
0.034088134765625,
0.016082763671875,
-0.01678466796875,
0.027313232421875,
-0.039031982421875,
-0.03253173828125,
-0.0291748046875,
0.0941162109375,
0.0300445556640625,
-0.01303863525390625,
0.0227203369140625,
-0.00800323486328125,
-0.00870513916015625,
0.002735137939453125,
-0.075927734375,
-0.0305023193359375,
0.033050537109375,
-0.036590576171875,
-0.0300445556640625,
-0.014129638671875,
-0.05560302734375,
-0.00295257568359375,
-0.0045928955078125,
0.037109375,
-0.033935546875,
-0.0286407470703125,
0.008148193359375,
-0.024688720703125,
0.034454345703125,
0.02655029296875,
-0.05657958984375,
0.0309295654296875,
0.02044677734375,
0.0538330078125,
0.013336181640625,
-0.008209228515625,
-0.01502227783203125,
0.01053619384765625,
-0.0050811767578125,
0.03961181640625,
-0.0114288330078125,
-0.040771484375,
-0.021392822265625,
0.01995849609375,
0.0117340087890625,
-0.02130126953125,
0.03570556640625,
-0.0224761962890625,
0.0302734375,
-0.034149169921875,
-0.0328369140625,
-0.032257080078125,
0.0117034912109375,
-0.052459716796875,
0.09515380859375,
0.037200927734375,
-0.06427001953125,
0.0113372802734375,
-0.045684814453125,
-0.0168304443359375,
-0.0011320114135742188,
0.004299163818359375,
-0.04901123046875,
-0.008148193359375,
0.0193939208984375,
0.0225830078125,
-0.028778076171875,
0.00986480712890625,
-0.019805908203125,
-0.020233154296875,
0.0147705078125,
-0.034454345703125,
0.09307861328125,
0.0140228271484375,
-0.034271240234375,
-0.00720977783203125,
-0.059173583984375,
0.00634765625,
0.041168212890625,
-0.0254364013671875,
0.0025920867919921875,
-0.01078033447265625,
0.00711822509765625,
0.00453948974609375,
0.0218048095703125,
-0.0247802734375,
0.040618896484375,
-0.017242431640625,
0.053131103515625,
0.0491943359375,
0.0038127899169921875,
0.0168914794921875,
-0.036102294921875,
0.038421630859375,
0.0003943443298339844,
0.04254150390625,
0.00670623779296875,
-0.055816650390625,
-0.05743408203125,
-0.01568603515625,
0.0294952392578125,
0.04608154296875,
-0.051666259765625,
0.035400390625,
-0.00907135009765625,
-0.0570068359375,
-0.02081298828125,
-0.00498199462890625,
0.02130126953125,
0.024658203125,
0.032196044921875,
-0.034423828125,
-0.03289794921875,
-0.06378173828125,
0.010894775390625,
-0.036712646484375,
-0.0055694580078125,
0.0289459228515625,
0.062042236328125,
-0.0280303955078125,
0.059326171875,
-0.0543212890625,
-0.007427215576171875,
-0.004940032958984375,
0.01003265380859375,
0.026824951171875,
0.040924072265625,
0.05755615234375,
-0.057098388671875,
-0.035491943359375,
-0.007568359375,
-0.051116943359375,
-0.0088958740234375,
0.004383087158203125,
-0.029052734375,
0.0093994140625,
-0.005023956298828125,
-0.0830078125,
0.050994873046875,
0.04278564453125,
-0.04052734375,
0.061676025390625,
-0.011474609375,
0.00832366943359375,
-0.08465576171875,
0.004638671875,
0.01154327392578125,
-0.02447509765625,
-0.0367431640625,
0.01403045654296875,
-0.002170562744140625,
0.01178741455078125,
-0.0309295654296875,
0.045867919921875,
-0.03271484375,
0.0036468505859375,
0.0010242462158203125,
-0.0035076141357421875,
0.0275726318359375,
0.03851318359375,
-0.0164337158203125,
0.0633544921875,
0.0261383056640625,
-0.037109375,
0.041656494140625,
0.037689208984375,
-0.006206512451171875,
0.0194854736328125,
-0.0673828125,
0.0155181884765625,
0.013336181640625,
0.0355224609375,
-0.0693359375,
-0.0257110595703125,
0.042144775390625,
-0.03704833984375,
0.0276947021484375,
-0.02239990234375,
-0.028533935546875,
-0.031951904296875,
-0.042266845703125,
0.0291748046875,
0.061981201171875,
-0.0297088623046875,
0.039031982421875,
0.030609130859375,
0.0019330978393554688,
-0.046417236328125,
-0.052978515625,
-0.0093994140625,
-0.0253143310546875,
-0.04241943359375,
0.039794921875,
-0.01232147216796875,
-0.007587432861328125,
-0.003589630126953125,
0.007091522216796875,
-0.007457733154296875,
0.00020623207092285156,
0.022430419921875,
0.025360107421875,
-0.011871337890625,
-0.01175689697265625,
0.0151214599609375,
0.0009737014770507812,
-0.00475311279296875,
-0.018890380859375,
0.037200927734375,
-0.014068603515625,
-0.0016660690307617188,
-0.035797119140625,
0.0163421630859375,
0.032989501953125,
-0.0007271766662597656,
0.052764892578125,
0.0592041015625,
-0.0266265869140625,
0.01392364501953125,
-0.040802001953125,
-0.007511138916015625,
-0.0386962890625,
0.01084136962890625,
-0.0157470703125,
-0.04949951171875,
0.03936767578125,
0.0230255126953125,
0.017303466796875,
0.0587158203125,
0.03814697265625,
-0.003753662109375,
0.07220458984375,
0.041961669921875,
-0.01320648193359375,
0.036651611328125,
-0.04693603515625,
-0.0170135498046875,
-0.065185546875,
-0.0153350830078125,
-0.0286712646484375,
-0.0167999267578125,
-0.060546875,
-0.033660888671875,
0.026885986328125,
0.0215606689453125,
-0.060821533203125,
0.04608154296875,
-0.052490234375,
0.0133514404296875,
0.040924072265625,
0.021881103515625,
0.0232086181640625,
0.00809478759765625,
-0.006801605224609375,
0.008392333984375,
-0.035491943359375,
-0.0279083251953125,
0.08477783203125,
0.0260009765625,
0.047576904296875,
0.019256591796875,
0.04241943359375,
0.01064300537109375,
0.0275726318359375,
-0.039794921875,
0.041748046875,
0.003692626953125,
-0.048248291015625,
-0.0223541259765625,
-0.042755126953125,
-0.07110595703125,
0.0205535888671875,
-0.009490966796875,
-0.058685302734375,
0.0249176025390625,
0.0017843246459960938,
-0.0267791748046875,
0.0180511474609375,
-0.045623779296875,
0.06402587890625,
-0.0107269287109375,
-0.0278472900390625,
0.001186370849609375,
-0.04736328125,
0.03204345703125,
0.01175689697265625,
0.002445220947265625,
-0.0197296142578125,
-0.0200042724609375,
0.05767822265625,
-0.06610107421875,
0.06463623046875,
-0.01407623291015625,
-0.01458740234375,
0.043304443359375,
-0.008026123046875,
0.048980712890625,
0.0116424560546875,
-0.0010833740234375,
0.0294952392578125,
0.01488494873046875,
-0.031768798828125,
-0.027557373046875,
0.03582763671875,
-0.07861328125,
-0.04522705078125,
-0.033050537109375,
-0.03216552734375,
0.005176544189453125,
0.002197265625,
0.039306640625,
0.02276611328125,
-0.00470733642578125,
-0.002796173095703125,
0.045379638671875,
-0.0296630859375,
0.03521728515625,
0.02264404296875,
-0.0252532958984375,
-0.05120849609375,
0.05987548828125,
0.0019550323486328125,
0.0153350830078125,
0.019805908203125,
0.01491546630859375,
-0.034515380859375,
-0.03271484375,
-0.057220458984375,
0.0269775390625,
-0.0399169921875,
-0.036529541015625,
-0.044708251953125,
-0.0250701904296875,
-0.0279083251953125,
0.017791748046875,
-0.02392578125,
-0.049713134765625,
-0.03948974609375,
-0.00353240966796875,
0.0731201171875,
0.0467529296875,
-0.007320404052734375,
0.033050537109375,
-0.06121826171875,
0.020294189453125,
0.035919189453125,
0.009185791015625,
-0.0013103485107421875,
-0.054779052734375,
-0.0027828216552734375,
0.0174713134765625,
-0.05804443359375,
-0.0751953125,
0.05047607421875,
0.0146942138671875,
0.025604248046875,
0.0271453857421875,
0.005340576171875,
0.06488037109375,
-0.0185546875,
0.078369140625,
0.0209197998046875,
-0.07098388671875,
0.044403076171875,
-0.0447998046875,
0.016845703125,
0.025299072265625,
0.041259765625,
-0.023284912109375,
-0.019317626953125,
-0.05230712890625,
-0.059326171875,
0.0307464599609375,
0.037506103515625,
0.0056304931640625,
0.01247406005859375,
0.0445556640625,
-0.00010180473327636719,
0.01568603515625,
-0.0787353515625,
-0.038055419921875,
-0.029541015625,
-0.01433563232421875,
0.01221466064453125,
-0.00641632080078125,
-0.0204620361328125,
-0.046844482421875,
0.06805419921875,
-0.01261138916015625,
0.0489501953125,
0.0198516845703125,
0.0080413818359375,
-0.01064300537109375,
0.00798797607421875,
0.0290069580078125,
0.0440673828125,
-0.01071929931640625,
-0.0224609375,
0.021942138671875,
-0.060882568359375,
0.01947021484375,
0.023681640625,
-0.01971435546875,
-0.01068878173828125,
0.01123046875,
0.057891845703125,
-0.00035858154296875,
-0.0243377685546875,
0.03778076171875,
-0.0218963623046875,
-0.0268707275390625,
-0.023529052734375,
0.0135955810546875,
0.0200653076171875,
0.0318603515625,
0.02789306640625,
-0.0220184326171875,
0.0212860107421875,
-0.040618896484375,
0.0019330978393554688,
0.034881591796875,
-0.006847381591796875,
-0.0272064208984375,
0.06353759765625,
0.0011138916015625,
0.00354766845703125,
0.056060791015625,
-0.0217742919921875,
-0.034271240234375,
0.0611572265625,
0.0399169921875,
0.056304931640625,
-0.0136871337890625,
0.023773193359375,
0.042633056640625,
0.012054443359375,
-0.00646209716796875,
0.0300445556640625,
-0.0017337799072265625,
-0.040008544921875,
-0.0268707275390625,
-0.054168701171875,
-0.0228424072265625,
0.0234527587890625,
-0.05029296875,
0.01654052734375,
-0.031707763671875,
-0.0347900390625,
-0.0183563232421875,
0.0249176025390625,
-0.042388916015625,
0.0195770263671875,
0.0006585121154785156,
0.066650390625,
-0.051513671875,
0.061859130859375,
0.038604736328125,
-0.039794921875,
-0.07098388671875,
-0.0124053955078125,
0.00821685791015625,
-0.052978515625,
0.0174102783203125,
0.004199981689453125,
0.021453857421875,
0.002033233642578125,
-0.0604248046875,
-0.073486328125,
0.11578369140625,
0.0273590087890625,
-0.048675537109375,
-0.010345458984375,
0.003543853759765625,
0.027740478515625,
-0.007427215576171875,
0.052093505859375,
0.0416259765625,
0.030731201171875,
0.01226043701171875,
-0.07049560546875,
0.036285400390625,
-0.0308074951171875,
0.004489898681640625,
0.01485443115234375,
-0.081298828125,
0.0789794921875,
-0.0036106109619140625,
-0.01068878173828125,
0.02423095703125,
0.047576904296875,
0.037506103515625,
0.0012340545654296875,
0.030670166015625,
0.06353759765625,
0.057281494140625,
-0.026336669921875,
0.08526611328125,
-0.0116119384765625,
0.0457763671875,
0.05389404296875,
-0.0005526542663574219,
0.0587158203125,
0.0157470703125,
-0.05743408203125,
0.0543212890625,
0.0762939453125,
-0.0108489990234375,
0.029998779296875,
0.008453369140625,
-0.0271453857421875,
-0.006572723388671875,
0.007228851318359375,
-0.05694580078125,
0.0124053955078125,
0.0284423828125,
-0.0172119140625,
0.006786346435546875,
-0.017364501953125,
-0.0004608631134033203,
-0.05059814453125,
-0.01099395751953125,
0.0458984375,
0.0229949951171875,
-0.0166778564453125,
0.06689453125,
-0.00588226318359375,
0.0517578125,
-0.043121337890625,
-0.0156097412109375,
-0.031097412109375,
-0.0118408203125,
-0.024169921875,
-0.052032470703125,
0.00768280029296875,
-0.005596160888671875,
-0.00672149658203125,
0.004276275634765625,
0.054107666015625,
-0.00975799560546875,
-0.03607177734375,
0.0228729248046875,
0.03497314453125,
0.02276611328125,
-0.005710601806640625,
-0.08123779296875,
0.01873779296875,
0.0020656585693359375,
-0.055267333984375,
0.033660888671875,
0.026885986328125,
0.01255035400390625,
0.055389404296875,
0.04833984375,
-0.0121612548828125,
-0.00023567676544189453,
-0.00860595703125,
0.07733154296875,
-0.061248779296875,
-0.01947021484375,
-0.0631103515625,
0.051788330078125,
-0.0089263916015625,
-0.0318603515625,
0.049713134765625,
0.037689208984375,
0.05078125,
0.0034885406494140625,
0.05377197265625,
-0.02587890625,
0.00943756103515625,
-0.02801513671875,
0.056732177734375,
-0.06048583984375,
0.01143646240234375,
-0.029541015625,
-0.057891845703125,
0.0007224082946777344,
0.058013916015625,
-0.0009355545043945312,
0.0197601318359375,
0.0282135009765625,
0.06170654296875,
0.0025730133056640625,
0.01253509521484375,
0.00977325439453125,
0.029052734375,
0.01288604736328125,
0.0633544921875,
0.062744140625,
-0.0765380859375,
0.0447998046875,
-0.031158447265625,
-0.0144195556640625,
-0.0050811767578125,
-0.06597900390625,
-0.056121826171875,
-0.032501220703125,
-0.042816162109375,
-0.04193115234375,
-0.0050811767578125,
0.0640869140625,
0.06005859375,
-0.045196533203125,
-0.0264892578125,
-0.0087738037109375,
0.003879547119140625,
-0.012786865234375,
-0.0214385986328125,
0.0222625732421875,
0.024658203125,
-0.056365966796875,
0.0143585205078125,
0.0016546249389648438,
0.038818359375,
-0.0134429931640625,
-0.0181427001953125,
-0.0189361572265625,
-0.0011377334594726562,
0.04730224609375,
0.033599853515625,
-0.0455322265625,
-0.0081787109375,
-0.01505279541015625,
-0.0054931640625,
0.0200653076171875,
0.0208740234375,
-0.05975341796875,
-0.00946044921875,
0.03155517578125,
0.01019287109375,
0.062744140625,
-0.0002930164337158203,
0.031951904296875,
-0.03485107421875,
0.0092315673828125,
0.0007085800170898438,
0.0299835205078125,
0.0093841552734375,
-0.040740966796875,
0.055267333984375,
0.029388427734375,
-0.0511474609375,
-0.05548095703125,
-0.00667572021484375,
-0.0830078125,
-0.0164337158203125,
0.0806884765625,
-0.00978851318359375,
-0.0284423828125,
0.0019207000732421875,
-0.022186279296875,
0.03167724609375,
-0.03875732421875,
0.03240966796875,
0.031494140625,
-0.015655517578125,
-0.0210723876953125,
-0.0552978515625,
0.040008544921875,
0.01294708251953125,
-0.0703125,
-0.006313323974609375,
0.03387451171875,
0.039703369140625,
0.004528045654296875,
0.06121826171875,
-0.01160430908203125,
0.0244293212890625,
0.00855255126953125,
0.00565338134765625,
0.00528717041015625,
0.00975799560546875,
-0.021331787109375,
-0.00344085693359375,
-0.01910400390625,
0.0024471282958984375
]
] |
pankajmathur/model_007 | 2023-09-05T23:15:22.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"arxiv:2306.02707",
"license:llama2",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | pankajmathur | null | null | pankajmathur/model_007 | 21 | 12,938 | transformers | 2023-08-05T04:15:55 | ---
language:
- en
library_name: transformers
license: llama2
---
# model_007
A hybrid (explain + instruct) style Llama2-70b model. Please check the examples below for both prompt styles. Here is the list of datasets used:
* Open-Platypus
* Alpaca
* WizardLM
* Dolly-V2
* Dolphin Samples (~200K)
* Orca_minis_v1
* Alpaca_orca
* WizardLM_orca
* Dolly-V2_orca
<br>
**P.S. If you're interested in collaborating, please connect with me at www.linkedin.com/in/pankajam.**
<br>
### quantized versions
Huge respect to the man, @TheBloke. Here are the GGML/GPTQ/GGUF versions, go crazy :)
https://huggingface.co/TheBloke/model_007-70B-GGML
https://huggingface.co/TheBloke/model_007-70B-GGUF
https://huggingface.co/TheBloke/model_007-70B-GPTQ
<br>
#### license disclaimer:
This model is bound by the license and usage restrictions of the original Llama-2 model, and comes with no warranty or guarantees of any kind.
<br>
## Evaluation
We evaluated model_007 on a wide range of tasks using [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) from EleutherAI.
Here are the results on metrics used by [HuggingFaceH4 Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
|||||
|:------:|:--------:|:-------:|:--------:|
|**Task**|**Metric**|**Value**|**Stderr**|
|*arc_challenge*|acc_norm|0.7108|0.0141|
|*hellaswag*|acc_norm|0.8765|0.0038|
|*mmlu*|acc_norm|0.6904|0.0351|
|*truthfulqa_mc*|mc2|0.6312|0.0157|
|**Total Average**|-|**0.72729**||
<br>
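The "Total Average" above is the plain mean of the four metric values. A quick check (metric names and values taken directly from the table):

```python
# Recompute the leaderboard-style average from the evaluation table above.
scores = {
    "arc_challenge": 0.7108,
    "hellaswag": 0.8765,
    "mmlu": 0.6904,
    "truthfulqa_mc": 0.6312,
}
average = sum(scores.values()) / len(scores)
print(round(average, 5))
```

This gives approximately 0.7272; the card reports 0.72729, which agrees to three decimal places (the small difference is presumably rounding in the per-metric values).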
## Example Usage
Here is the Orca prompt format
```
### System:
You are an AI assistant that follows instruction extremely well. Help as much as you can.
### User:
Tell me about Orcas.
### Assistant:
```
Below shows a code example on how to use this model
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
tokenizer = AutoTokenizer.from_pretrained("psmathur/model_007")
model = AutoModelForCausalLM.from_pretrained(
"psmathur/model_007",
torch_dtype=torch.float16,
load_in_8bit=True,
low_cpu_mem_usage=True,
device_map="auto"
)
system_prompt = "### System:\nYou are an AI assistant that follows instruction extremely well. Help as much as you can.\n\n"
#generate text steps
instruction = "Tell me about Orcas."
prompt = f"{system_prompt}### User: {instruction}\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=4096)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
Here is the Alpaca prompt format
```
### User:
Tell me about Alpacas.
### Assistant:
```
Below shows a code example on how to use this model
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
tokenizer = AutoTokenizer.from_pretrained("psmathur/model_007")
model = AutoModelForCausalLM.from_pretrained(
"psmathur/model_007",
torch_dtype=torch.float16,
load_in_8bit=True,
low_cpu_mem_usage=True,
device_map="auto"
)
#generate text steps
instruction = "Tell me about Alpacas."
prompt = f"### User: {instruction}\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=4096)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
<br>
#### Limitations & Biases:
While this model aims for accuracy, it can occasionally produce inaccurate or misleading results.
Despite diligent efforts in refining the pretraining data, there remains a possibility for the generation of inappropriate, biased, or offensive content.
Exercise caution and cross-check information when necessary.
<br>
### Citation:
Please kindly cite using the following BibTeX:
```
@misc{model_007,
author = {Pankaj Mathur},
title = {model_007: A hybrid (explain + instruct) style Llama2-70b model},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/psmathur/model_007}},
}
```
```
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```
@software{touvron2023llama2,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and
Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and
Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and
Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and
Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and
Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and
Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023}
}
``` | 5,617 | [
[
-0.0240325927734375,
-0.06439208984375,
0.01302337646484375,
0.01239013671875,
-0.0191192626953125,
0.0065460205078125,
-0.004528045654296875,
-0.03955078125,
0.01922607421875,
0.0195465087890625,
-0.051544189453125,
-0.045074462890625,
-0.03955078125,
-0.002506256103515625,
-0.02032470703125,
0.08636474609375,
-0.0128631591796875,
-0.01517486572265625,
0.0093994140625,
-0.0122222900390625,
-0.033050537109375,
-0.031982421875,
-0.057769775390625,
-0.030181884765625,
0.01373291015625,
0.00957489013671875,
0.04742431640625,
0.0423583984375,
0.0228424072265625,
0.026580810546875,
-0.0173797607421875,
0.0246429443359375,
-0.031524658203125,
-0.01849365234375,
0.0182647705078125,
-0.04327392578125,
-0.07012939453125,
0.01155853271484375,
0.0293426513671875,
0.0187835693359375,
-0.00968170166015625,
0.0265655517578125,
0.0085296630859375,
0.017578125,
-0.0295867919921875,
0.0279388427734375,
-0.0248565673828125,
0.0016307830810546875,
-0.024749755859375,
-0.00696563720703125,
-0.00988006591796875,
-0.0191192626953125,
0.0053558349609375,
-0.056640625,
0.0159912109375,
-0.004459381103515625,
0.09014892578125,
0.0206756591796875,
-0.027679443359375,
-0.0217437744140625,
-0.038604736328125,
0.058441162109375,
-0.07470703125,
0.007663726806640625,
0.018463134765625,
0.01739501953125,
-0.0209808349609375,
-0.0654296875,
-0.0653076171875,
-0.0164031982421875,
-0.0155181884765625,
0.017547607421875,
-0.018218994140625,
0.0007648468017578125,
0.02520751953125,
0.035400390625,
-0.040618896484375,
0.0083465576171875,
-0.03741455078125,
-0.0225372314453125,
0.045379638671875,
0.0287017822265625,
0.0261993408203125,
-0.0139923095703125,
-0.03009033203125,
-0.023406982421875,
-0.045806884765625,
0.0233917236328125,
0.03656005859375,
0.0067901611328125,
-0.043182373046875,
0.04925537109375,
-0.0127716064453125,
0.052734375,
0.01447296142578125,
-0.02783203125,
0.042449951171875,
-0.0367431640625,
-0.0288543701171875,
-0.0120697021484375,
0.072021484375,
0.0189056396484375,
0.004856109619140625,
0.0128326416015625,
-0.00727081298828125,
0.0124664306640625,
-0.0112457275390625,
-0.054656982421875,
-0.022308349609375,
0.0214080810546875,
-0.032806396484375,
-0.027191162109375,
-0.00826263427734375,
-0.05877685546875,
-0.011444091796875,
-0.011688232421875,
0.03570556640625,
-0.03302001953125,
-0.034423828125,
0.0198516845703125,
0.00817108154296875,
0.04083251953125,
0.007320404052734375,
-0.06658935546875,
0.022430419921875,
0.0245361328125,
0.06915283203125,
-0.00423431396484375,
-0.0225830078125,
-0.01474761962890625,
0.0017900466918945312,
-0.004535675048828125,
0.045074462890625,
-0.019378662109375,
-0.031768798828125,
-0.0263519287109375,
0.00400543212890625,
-0.017364501953125,
-0.027191162109375,
0.03570556640625,
-0.02459716796875,
0.03350830078125,
-0.0189666748046875,
-0.0264892578125,
-0.0286102294921875,
0.01605224609375,
-0.04144287109375,
0.09515380859375,
0.005199432373046875,
-0.06439208984375,
0.0140838623046875,
-0.059783935546875,
-0.0113677978515625,
-0.0183868408203125,
-0.00948333740234375,
-0.04998779296875,
-0.024993896484375,
0.031280517578125,
0.030975341796875,
-0.020538330078125,
0.0183563232421875,
-0.018707275390625,
-0.0267333984375,
0.00914764404296875,
-0.0261077880859375,
0.086669921875,
0.015411376953125,
-0.057342529296875,
0.02825927734375,
-0.0626220703125,
-0.00908660888671875,
0.0360107421875,
-0.0208282470703125,
0.004913330078125,
-0.0193634033203125,
-0.0099639892578125,
0.0171356201171875,
0.0280914306640625,
-0.0333251953125,
0.0218963623046875,
-0.02655029296875,
0.045379638671875,
0.058685302734375,
-0.007904052734375,
0.021209716796875,
-0.0289459228515625,
0.042205810546875,
0.0015153884887695312,
0.0188140869140625,
0.0011663436889648438,
-0.051055908203125,
-0.08221435546875,
-0.031707763671875,
0.01517486572265625,
0.04229736328125,
-0.04522705078125,
0.051727294921875,
0.0004982948303222656,
-0.055206298828125,
-0.03759765625,
-0.00005900859832763672,
0.02496337890625,
0.0548095703125,
0.032318115234375,
-0.0210113525390625,
-0.047119140625,
-0.06048583984375,
0.01215362548828125,
-0.007602691650390625,
-0.0021228790283203125,
0.01947021484375,
0.04736328125,
-0.0195770263671875,
0.07421875,
-0.036285400390625,
-0.0203399658203125,
-0.026885986328125,
0.0118560791015625,
0.031158447265625,
0.048736572265625,
0.052459716796875,
-0.042205810546875,
-0.02008056640625,
-0.0195465087890625,
-0.06805419921875,
-0.0032482147216796875,
-0.0034351348876953125,
-0.0286102294921875,
0.00955963134765625,
0.0187225341796875,
-0.0634765625,
0.041595458984375,
0.0460205078125,
-0.0377197265625,
0.051483154296875,
-0.01543426513671875,
-0.0018320083618164062,
-0.08172607421875,
0.01995849609375,
0.0018815994262695312,
-0.00572967529296875,
-0.04022216796875,
-0.00045561790466308594,
-0.00746917724609375,
0.00408172607421875,
-0.032470703125,
0.050567626953125,
-0.0300445556640625,
0.0032100677490234375,
-0.01238250732421875,
0.000652313232421875,
-0.002796173095703125,
0.05859375,
-0.004673004150390625,
0.05535888671875,
0.04833984375,
-0.037841796875,
0.033447265625,
0.0250244140625,
-0.02655029296875,
0.020843505859375,
-0.07025146484375,
0.0245361328125,
0.00917816162109375,
0.041290283203125,
-0.08660888671875,
-0.01367950439453125,
0.044647216796875,
-0.034820556640625,
0.0272674560546875,
0.00849151611328125,
-0.040557861328125,
-0.0391845703125,
-0.0285797119140625,
0.033203125,
0.040008544921875,
-0.038604736328125,
0.04931640625,
0.021392822265625,
0.003208160400390625,
-0.0435791015625,
-0.043792724609375,
-0.0132904052734375,
-0.0308990478515625,
-0.0565185546875,
0.027008056640625,
-0.0177154541015625,
0.001361846923828125,
-0.00963592529296875,
-0.004619598388671875,
0.0126495361328125,
0.00434112548828125,
0.0176849365234375,
0.049957275390625,
-0.010284423828125,
-0.0207366943359375,
-0.0029468536376953125,
-0.015716552734375,
0.0014247894287109375,
-0.006694793701171875,
0.06634521484375,
-0.027618408203125,
-0.020904541015625,
-0.052642822265625,
0.0024127960205078125,
0.0316162109375,
-0.02581787109375,
0.06298828125,
0.062408447265625,
-0.0269012451171875,
0.018585205078125,
-0.039459228515625,
-0.0175323486328125,
-0.038818359375,
0.025848388671875,
-0.029449462890625,
-0.049713134765625,
0.06134033203125,
0.0162811279296875,
0.0165863037109375,
0.057098388671875,
0.058013916015625,
0.00653076171875,
0.07843017578125,
0.05072021484375,
0.002227783203125,
0.0401611328125,
-0.056243896484375,
0.00439453125,
-0.07061767578125,
-0.0474853515625,
-0.0277252197265625,
-0.029754638671875,
-0.036041259765625,
-0.025360107421875,
0.0263519287109375,
0.01250457763671875,
-0.0421142578125,
0.0291595458984375,
-0.049407958984375,
0.007904052734375,
0.047637939453125,
0.021331787109375,
0.015655517578125,
-0.0036163330078125,
-0.0108795166015625,
0.0058746337890625,
-0.0489501953125,
-0.04327392578125,
0.08880615234375,
0.031097412109375,
0.05035400390625,
-0.0017538070678710938,
0.044189453125,
-0.0000852346420288086,
0.0189971923828125,
-0.042388916015625,
0.042236328125,
0.018524169921875,
-0.051116943359375,
-0.0124969482421875,
-0.0220794677734375,
-0.07171630859375,
0.01421356201171875,
-0.01317596435546875,
-0.06353759765625,
0.01374053955078125,
0.0092620849609375,
-0.041290283203125,
0.0257568359375,
-0.04803466796875,
0.061859130859375,
-0.01824951171875,
-0.00704193115234375,
-0.00128173828125,
-0.0491943359375,
0.040802001953125,
0.007404327392578125,
0.00012230873107910156,
-0.0111083984375,
-0.0026702880859375,
0.08319091796875,
-0.041595458984375,
0.07110595703125,
-0.005947113037109375,
-0.00807952880859375,
0.037628173828125,
-0.00785064697265625,
0.048736572265625,
-0.0016460418701171875,
-0.00341796875,
0.0268096923828125,
-0.01364898681640625,
-0.033233642578125,
-0.024169921875,
0.042816162109375,
-0.08612060546875,
-0.041290283203125,
-0.031158447265625,
-0.036773681640625,
0.0062408447265625,
0.015411376953125,
0.0411376953125,
0.0236358642578125,
0.024658203125,
-0.005100250244140625,
0.041290283203125,
-0.02386474609375,
0.038177490234375,
0.024017333984375,
-0.0022945404052734375,
-0.0316162109375,
0.055023193359375,
0.01226806640625,
0.021942138671875,
0.0023326873779296875,
0.006610870361328125,
-0.031341552734375,
-0.03424072265625,
-0.036773681640625,
0.039764404296875,
-0.045562744140625,
-0.022674560546875,
-0.055511474609375,
-0.01947021484375,
-0.0303802490234375,
-0.00576019287109375,
-0.031402587890625,
-0.025726318359375,
-0.043701171875,
-0.017486572265625,
0.04583740234375,
0.041046142578125,
0.0008792877197265625,
0.0325927734375,
-0.028411865234375,
0.0201568603515625,
0.02496337890625,
0.00687408447265625,
0.01055908203125,
-0.061187744140625,
-0.0018768310546875,
0.012451171875,
-0.042388916015625,
-0.06207275390625,
0.040679931640625,
0.0004048347473144531,
0.049957275390625,
0.0150146484375,
-0.005199432373046875,
0.06695556640625,
-0.00959014892578125,
0.072021484375,
0.024322509765625,
-0.07489013671875,
0.041351318359375,
-0.025970458984375,
0.0127105712890625,
0.01168060302734375,
0.026123046875,
-0.0191650390625,
-0.027008056640625,
-0.0648193359375,
-0.07293701171875,
0.0633544921875,
0.0304107666015625,
0.003421783447265625,
0.01448822021484375,
0.0309295654296875,
0.009185791015625,
0.00963592529296875,
-0.07269287109375,
-0.0435791015625,
-0.032073974609375,
-0.01284027099609375,
0.0012264251708984375,
-0.01018524169921875,
-0.010894775390625,
-0.0282745361328125,
0.0565185546875,
-0.0016117095947265625,
0.042694091796875,
0.01039886474609375,
0.003726959228515625,
-0.004425048828125,
-0.0081787109375,
0.05108642578125,
0.043792724609375,
-0.019012451171875,
-0.0078277587890625,
0.028411865234375,
-0.04327392578125,
0.004791259765625,
0.0135498046875,
-0.01078033447265625,
-0.00893402099609375,
0.0280303955078125,
0.06488037109375,
-0.00797271728515625,
-0.0276947021484375,
0.023162841796875,
-0.00090789794921875,
-0.00939178466796875,
-0.029998779296875,
0.01255035400390625,
0.01474761962890625,
0.031768798828125,
0.02679443359375,
0.005283355712890625,
-0.00560760498046875,
-0.041748046875,
-0.0061187744140625,
0.026641845703125,
0.004039764404296875,
-0.031982421875,
0.076904296875,
0.005298614501953125,
-0.016510009765625,
0.050506591796875,
-0.00811767578125,
-0.037384033203125,
0.061553955078125,
0.038360595703125,
0.04669189453125,
-0.010223388671875,
0.006290435791015625,
0.038482666015625,
0.0102081298828125,
-0.00586700439453125,
0.0225677490234375,
0.00787353515625,
-0.04217529296875,
-0.0260467529296875,
-0.042633056640625,
-0.0166168212890625,
0.03070068359375,
-0.043426513671875,
0.039764404296875,
-0.035491943359375,
-0.0187835693359375,
-0.005542755126953125,
0.0215911865234375,
-0.06024169921875,
0.0139007568359375,
0.01486968994140625,
0.0628662109375,
-0.059234619140625,
0.069580078125,
0.040313720703125,
-0.050201416015625,
-0.0828857421875,
-0.021240234375,
0.001560211181640625,
-0.07672119140625,
0.034820556640625,
0.00992584228515625,
-0.0032196044921875,
0.0081939697265625,
-0.052032470703125,
-0.07244873046875,
0.110107421875,
0.03240966796875,
-0.0263214111328125,
-0.0129547119140625,
0.002285003662109375,
0.039093017578125,
-0.0169677734375,
0.05010986328125,
0.053466796875,
0.03045654296875,
0.018402099609375,
-0.08001708984375,
0.0276031494140625,
-0.0281982421875,
-0.00885009765625,
-0.006072998046875,
-0.07659912109375,
0.09405517578125,
-0.0214385986328125,
-0.00518035888671875,
0.0219268798828125,
0.058380126953125,
0.047607421875,
0.01213836669921875,
0.0292816162109375,
0.0423583984375,
0.058441162109375,
-0.009368896484375,
0.07440185546875,
-0.0120849609375,
0.04931640625,
0.059967041015625,
0.007289886474609375,
0.044769287109375,
0.0157623291015625,
-0.02978515625,
0.04949951171875,
0.0716552734375,
-0.0085906982421875,
0.039764404296875,
0.005138397216796875,
-0.006267547607421875,
-0.00707244873046875,
0.00907135009765625,
-0.05291748046875,
0.0308685302734375,
0.029388427734375,
-0.02294921875,
-0.01342010498046875,
-0.00707244873046875,
0.0230712890625,
-0.029815673828125,
-0.007129669189453125,
0.035919189453125,
0.0044708251953125,
-0.0306854248046875,
0.07769775390625,
0.006587982177734375,
0.07611083984375,
-0.05377197265625,
0.0015106201171875,
-0.0257720947265625,
0.00811004638671875,
-0.0260467529296875,
-0.047027587890625,
0.006561279296875,
-0.0004429817199707031,
0.00585174560546875,
0.0030994415283203125,
0.03875732421875,
-0.0211944580078125,
-0.0367431640625,
0.01363372802734375,
0.0219879150390625,
0.020843505859375,
0.01187896728515625,
-0.072265625,
0.01617431640625,
0.009002685546875,
-0.05035400390625,
0.0179595947265625,
0.0205535888671875,
-0.0013637542724609375,
0.0506591796875,
0.051055908203125,
-0.0038661956787109375,
0.018707275390625,
-0.017059326171875,
0.0791015625,
-0.039306640625,
-0.031768798828125,
-0.07122802734375,
0.044921875,
-0.0018596649169921875,
-0.040313720703125,
0.059173583984375,
0.034271240234375,
0.061187744140625,
0.00638580322265625,
0.050140380859375,
-0.0223388671875,
0.0159759521484375,
-0.036956787109375,
0.050689697265625,
-0.0384521484375,
0.021331787109375,
-0.0179443359375,
-0.06671142578125,
-0.01043701171875,
0.06890869140625,
-0.026275634765625,
0.01195526123046875,
0.036773681640625,
0.06573486328125,
-0.00502777099609375,
-0.00286102294921875,
-0.00437164306640625,
0.024017333984375,
0.040802001953125,
0.05340576171875,
0.04400634765625,
-0.047210693359375,
0.053375244140625,
-0.037261962890625,
-0.0248870849609375,
-0.01541900634765625,
-0.056793212890625,
-0.0714111328125,
-0.03338623046875,
-0.02972412109375,
-0.04156494140625,
-0.0159912109375,
0.0694580078125,
0.0570068359375,
-0.0567626953125,
-0.0218963623046875,
-0.00533294677734375,
-0.0045318603515625,
-0.0215606689453125,
-0.0154876708984375,
0.051513671875,
0.0024261474609375,
-0.068359375,
0.00848388671875,
-0.007137298583984375,
0.033447265625,
-0.023712158203125,
-0.015167236328125,
-0.0220794677734375,
-0.00127410888671875,
0.0255889892578125,
0.030517578125,
-0.0546875,
-0.0147705078125,
0.0009760856628417969,
-0.013092041015625,
0.0106658935546875,
0.028350830078125,
-0.0633544921875,
0.024658203125,
0.02008056640625,
0.013458251953125,
0.057861328125,
-0.0129852294921875,
0.0234527587890625,
-0.033721923828125,
0.0279083251953125,
0.0028228759765625,
0.033935546875,
0.0183258056640625,
-0.03204345703125,
0.041351318359375,
0.026702880859375,
-0.032745361328125,
-0.0615234375,
-0.004756927490234375,
-0.09271240234375,
0.002796173095703125,
0.09210205078125,
-0.0250701904296875,
-0.024078369140625,
0.00553131103515625,
-0.03546142578125,
0.046905517578125,
-0.031646728515625,
0.06524658203125,
0.035858154296875,
-0.0264892578125,
-0.004100799560546875,
-0.039031982421875,
0.032012939453125,
0.0220489501953125,
-0.06396484375,
-0.0193939208984375,
0.01219940185546875,
0.037628173828125,
0.0168914794921875,
0.05523681640625,
-0.00719451904296875,
0.01287841796875,
0.004161834716796875,
0.0077056884765625,
-0.0286102294921875,
0.00328826904296875,
-0.01079559326171875,
-0.006305694580078125,
-0.017425537109375,
-0.032073974609375
]
] |
timm/vit_base_patch14_dinov2.lvd142m | 2023-09-03T22:00:33.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:2304.07193",
"arxiv:2010.11929",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/vit_base_patch14_dinov2.lvd142m | 0 | 12,934 | timm | 2023-05-09T20:48:20 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
---
# Model card for vit_base_patch14_dinov2.lvd142m
A Vision Transformer (ViT) image feature model. Pretrained on LVD-142M with self-supervised DINOv2 method.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 86.6
- GMACs: 151.7
- Activations (M): 397.6
- Image size: 518 x 518
- **Papers:**
- DINOv2: Learning Robust Visual Features without Supervision: https://arxiv.org/abs/2304.07193
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Original:** https://github.com/facebookresearch/dinov2
- **Pretrain Dataset:** LVD-142M
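As a quick sanity check, the token count of the unpooled features follows directly from these stats: a 518 x 518 image split into 14 x 14 patches gives a 37 x 37 grid, and DINOv2 prepends one class token. A minimal plain-Python sketch:

```python
# Back-of-the-envelope check (no timm needed): derive the sequence length
# of the unpooled output from the image size and patch size above.
image_size = 518
patch_size = 14

grid = image_size // patch_size   # 37 patches per side (518 is divisible by 14)
num_patches = grid * grid         # 1369 patch tokens
seq_len = num_patches + 1         # plus one class token -> 1370
print(grid, num_patches, seq_len)
```

This matches the `(1, 1370, 768)` shape of the `forward_features` output shown under Image Embeddings below.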
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_base_patch14_dinov2.lvd142m', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
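The last line of the snippet above applies a softmax to the logits, scales the probabilities to percentages, and keeps the top 5 entries. A plain-Python illustration of that post-processing, using hypothetical toy logits rather than real model output:

```python
# Plain-Python illustration of the softmax + top-k post-processing above.
# The logits here are hypothetical toy values, not real model output.
import math

def softmax(logits):
    m = max(logits)                              # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def topk(values, k):
    # indices of the k largest values, highest first
    return sorted(range(len(values)), key=lambda i: values[i], reverse=True)[:k]

logits = [2.0, 0.5, 1.0, 3.0, -1.0, 0.0]         # pretend 6-class output
probs = [p * 100 for p in softmax(logits)]       # percentages, as in the snippet above
top5 = topk(probs, 5)
print(top5)
```

`torch.topk` does the same thing batched on the GPU; this sketch just makes the arithmetic explicit.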
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_base_patch14_dinov2.lvd142m',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1370, 768) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
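A common next step with the `(batch_size, num_features)` embeddings is comparing images by cosine similarity. A minimal sketch with toy stand-in vectors (hypothetical values, so it runs without downloading the model):

```python
# Hypothetical sketch: once you have pooled embeddings from the model above,
# images are typically compared with cosine similarity. Shown here with
# small toy vectors standing in for the 768-dim DINOv2 features.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

emb_a = [0.1, 0.3, -0.2, 0.5]   # stand-in for one image's features
emb_b = [0.1, 0.3, -0.2, 0.5]   # identical vector -> similarity 1.0
emb_c = [-0.5, 0.2, 0.4, -0.1]  # dissimilar vector

print(cosine_similarity(emb_a, emb_b))
print(cosine_similarity(emb_a, emb_c))
```

With real model output, the same computation is one line: `torch.nn.functional.cosine_similarity(output_a, output_b)`.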
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{oquab2023dinov2,
title={DINOv2: Learning Robust Visual Features without Supervision},
author={Oquab, Maxime and Darcet, Timothée and Moutakanni, Theo and Vo, Huy V. and Szafraniec, Marc and Khalidov, Vasil and Fernandez, Pierre and Haziza, Daniel and Massa, Francisco and El-Nouby, Alaaeldin and Howes, Russell and Huang, Po-Yao and Xu, Hu and Sharma, Vasu and Li, Shang-Wen and Galuba, Wojciech and Rabbat, Mike and Assran, Mido and Ballas, Nicolas and Synnaeve, Gabriel and Misra, Ishan and Jegou, Herve and Mairal, Julien and Labatut, Patrick and Joulin, Armand and Bojanowski, Piotr},
journal={arXiv:2304.07193},
year={2023}
}
```
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
``` | 3,985 | [
[
-0.03564453125,
-0.0296630859375,
0.0056304931640625,
0.0047607421875,
-0.035888671875,
-0.0254669189453125,
-0.01971435546875,
-0.031829833984375,
0.0092315673828125,
0.0202178955078125,
-0.033416748046875,
-0.0399169921875,
-0.05145263671875,
-0.004974365234375,
-0.0026493072509765625,
0.06298828125,
-0.00879669189453125,
-0.004436492919921875,
-0.0225677490234375,
-0.03448486328125,
-0.01291656494140625,
-0.01175689697265625,
-0.03668212890625,
-0.025787353515625,
0.029144287109375,
0.007740020751953125,
0.043975830078125,
0.060455322265625,
0.0458984375,
0.030792236328125,
-0.01165771484375,
0.0159454345703125,
-0.02459716796875,
-0.0112457275390625,
0.013824462890625,
-0.046722412109375,
-0.033233642578125,
0.01314544677734375,
0.05322265625,
0.0210113525390625,
0.0034027099609375,
0.025726318359375,
0.018646240234375,
0.0316162109375,
-0.02496337890625,
0.02301025390625,
-0.0360107421875,
0.013641357421875,
-0.00878143310546875,
0.0013818740844726562,
-0.018096923828125,
-0.0233612060546875,
0.01279449462890625,
-0.044403076171875,
0.030731201171875,
-0.00765228271484375,
0.10003662109375,
0.0225982666015625,
-0.0101165771484375,
0.0115203857421875,
-0.0211944580078125,
0.06494140625,
-0.038818359375,
0.0252685546875,
0.020294189453125,
0.002185821533203125,
0.001522064208984375,
-0.07568359375,
-0.042633056640625,
0.002803802490234375,
-0.0164031982421875,
0.00833892822265625,
-0.0322265625,
0.0029392242431640625,
0.0260467529296875,
0.022003173828125,
-0.03289794921875,
0.012054443359375,
-0.041717529296875,
-0.025970458984375,
0.031982421875,
-0.00959014892578125,
0.0147247314453125,
-0.027740478515625,
-0.04754638671875,
-0.041351318359375,
-0.0285186767578125,
0.029022216796875,
0.01538848876953125,
0.003696441650390625,
-0.037750244140625,
0.0374755859375,
0.0139923095703125,
0.0404052734375,
0.01229095458984375,
-0.02569580078125,
0.047149658203125,
-0.017364501953125,
-0.030029296875,
-0.021759033203125,
0.086181640625,
0.03875732421875,
0.02789306640625,
0.0108795166015625,
-0.01000213623046875,
-0.0071563720703125,
-0.00907135009765625,
-0.08306884765625,
-0.033599853515625,
0.0112457275390625,
-0.036285400390625,
-0.03271484375,
0.0156402587890625,
-0.0677490234375,
-0.01409912109375,
-0.0037097930908203125,
0.050933837890625,
-0.0323486328125,
-0.027496337890625,
-0.0105133056640625,
-0.00946044921875,
0.036163330078125,
0.0164794921875,
-0.052947998046875,
0.01114654541015625,
0.0193939208984375,
0.0782470703125,
-0.0026035308837890625,
-0.031768798828125,
-0.0299835205078125,
-0.0252838134765625,
-0.02935791015625,
0.04168701171875,
-0.0024814605712890625,
-0.015716552734375,
-0.011993408203125,
0.0264739990234375,
-0.01141357421875,
-0.047393798828125,
0.02032470703125,
-0.007480621337890625,
0.0271759033203125,
-0.00540924072265625,
-0.005096435546875,
-0.02410888671875,
0.0164794921875,
-0.030426025390625,
0.09796142578125,
0.02960205078125,
-0.06689453125,
0.03533935546875,
-0.01873779296875,
-0.006977081298828125,
-0.01462554931640625,
-0.007343292236328125,
-0.09014892578125,
-0.0098114013671875,
0.028411865234375,
0.041961669921875,
-0.01499176025390625,
0.0009250640869140625,
-0.038177490234375,
-0.0205078125,
0.020751953125,
-0.0134735107421875,
0.07147216796875,
0.001941680908203125,
-0.0394287109375,
0.0251617431640625,
-0.04925537109375,
0.0108184814453125,
0.034576416015625,
-0.023162841796875,
-0.00440216064453125,
-0.041107177734375,
0.01122283935546875,
0.022369384765625,
0.0165557861328125,
-0.032470703125,
0.02435302734375,
-0.01280975341796875,
0.034881591796875,
0.05609130859375,
-0.0164947509765625,
0.0306396484375,
-0.021331787109375,
0.0243988037109375,
0.018035888671875,
0.02288818359375,
-0.006313323974609375,
-0.049652099609375,
-0.0634765625,
-0.050933837890625,
0.022796630859375,
0.0419921875,
-0.0523681640625,
0.044586181640625,
-0.0185699462890625,
-0.04400634765625,
-0.03631591796875,
0.021759033203125,
0.042724609375,
0.041351318359375,
0.0423583984375,
-0.04327392578125,
-0.03863525390625,
-0.0640869140625,
0.00377655029296875,
-0.01207733154296875,
-0.005725860595703125,
0.0323486328125,
0.042236328125,
-0.01495361328125,
0.05499267578125,
-0.027435302734375,
-0.0340576171875,
-0.0167388916015625,
0.0010137557983398438,
0.0172119140625,
0.0494384765625,
0.06280517578125,
-0.0467529296875,
-0.0316162109375,
-0.0170745849609375,
-0.0682373046875,
0.00826263427734375,
0.002681732177734375,
-0.009124755859375,
0.026214599609375,
0.0185699462890625,
-0.061920166015625,
0.047515869140625,
0.01506805419921875,
-0.027191162109375,
0.0229339599609375,
-0.0248260498046875,
-0.0019102096557617188,
-0.091552734375,
0.003734588623046875,
0.0303192138671875,
-0.017913818359375,
-0.0290679931640625,
-0.0071868896484375,
0.0157470703125,
0.004787445068359375,
-0.0328369140625,
0.038360595703125,
-0.0457763671875,
-0.015533447265625,
0.00554656982421875,
-0.01096343994140625,
0.00487518310546875,
0.052764892578125,
-0.0007338523864746094,
0.037322998046875,
0.0679931640625,
-0.029876708984375,
0.040802001953125,
0.029876708984375,
-0.0286712646484375,
0.041717529296875,
-0.0537109375,
0.0051727294921875,
-0.0009403228759765625,
0.0135498046875,
-0.0838623046875,
-0.0190887451171875,
0.034332275390625,
-0.0457763671875,
0.053619384765625,
-0.040313720703125,
-0.029327392578125,
-0.044952392578125,
-0.04156494140625,
0.031585693359375,
0.061431884765625,
-0.056396484375,
0.033294677734375,
0.01222991943359375,
0.01493072509765625,
-0.048919677734375,
-0.07098388671875,
-0.018096923828125,
-0.035064697265625,
-0.044708251953125,
0.025421142578125,
0.01287841796875,
0.010162353515625,
0.0071258544921875,
-0.00696563720703125,
0.0006489753723144531,
-0.012908935546875,
0.032989501953125,
0.035797119140625,
-0.02166748046875,
-0.00885009765625,
-0.0229339599609375,
-0.00890350341796875,
0.006328582763671875,
-0.0219268798828125,
0.03533935546875,
-0.0150146484375,
-0.00839996337890625,
-0.062347412109375,
-0.01357269287109375,
0.0462646484375,
-0.027435302734375,
0.0518798828125,
0.0914306640625,
-0.031768798828125,
-0.003753662109375,
-0.0290069580078125,
-0.0211334228515625,
-0.03765869140625,
0.047088623046875,
-0.0226287841796875,
-0.0340576171875,
0.058135986328125,
0.010498046875,
-0.0112762451171875,
0.052215576171875,
0.03546142578125,
-0.002002716064453125,
0.05963134765625,
0.046295166015625,
0.0200347900390625,
0.057525634765625,
-0.0682373046875,
-0.0072784423828125,
-0.0679931640625,
-0.044189453125,
-0.01433563232421875,
-0.03155517578125,
-0.05572509765625,
-0.048919677734375,
0.0291748046875,
0.01412200927734375,
-0.01515960693359375,
0.03558349609375,
-0.061859130859375,
0.004241943359375,
0.06146240234375,
0.04034423828125,
-0.020233154296875,
0.0293731689453125,
-0.023101806640625,
-0.01220703125,
-0.05316162109375,
-0.003398895263671875,
0.0843505859375,
0.035491943359375,
0.06005859375,
-0.005218505859375,
0.046112060546875,
-0.0162353515625,
0.007015228271484375,
-0.0582275390625,
0.0452880859375,
-0.001216888427734375,
-0.022125244140625,
-0.0192108154296875,
-0.0301666259765625,
-0.0767822265625,
0.012237548828125,
-0.0255889892578125,
-0.060943603515625,
0.0361328125,
0.0186614990234375,
-0.0208282470703125,
0.04833984375,
-0.0582275390625,
0.07281494140625,
-0.007495880126953125,
-0.0458984375,
0.00841522216796875,
-0.053497314453125,
0.02008056640625,
0.0111236572265625,
-0.017730712890625,
0.007076263427734375,
0.00954437255859375,
0.069091796875,
-0.0406494140625,
0.067138671875,
-0.028564453125,
0.0170745849609375,
0.041107177734375,
-0.0131988525390625,
0.035858154296875,
0.0054473876953125,
0.01629638671875,
0.0171661376953125,
0.004138946533203125,
-0.02911376953125,
-0.035064697265625,
0.041351318359375,
-0.08221435546875,
-0.0341796875,
-0.041229248046875,
-0.03057861328125,
0.0140533447265625,
0.0162200927734375,
0.05010986328125,
0.042877197265625,
0.0222930908203125,
0.032623291015625,
0.06072998046875,
-0.0236663818359375,
0.0330810546875,
-0.006195068359375,
-0.017669677734375,
-0.051025390625,
0.06982421875,
0.0222015380859375,
0.0180206298828125,
0.01184844970703125,
0.01248931884765625,
-0.0302581787109375,
-0.0380859375,
-0.02728271484375,
0.03009033203125,
-0.054290771484375,
-0.03375244140625,
-0.0408935546875,
-0.0268096923828125,
-0.0308990478515625,
0.0098876953125,
-0.037017822265625,
-0.0251617431640625,
-0.04168701171875,
0.003993988037109375,
0.049774169921875,
0.037933349609375,
-0.02264404296875,
0.0194854736328125,
-0.02606201171875,
0.021453857421875,
0.042022705078125,
0.033416748046875,
-0.0211639404296875,
-0.06939697265625,
-0.013214111328125,
-0.002532958984375,
-0.033416748046875,
-0.04754638671875,
0.035125732421875,
0.0189208984375,
0.037445068359375,
0.0296630859375,
-0.0164031982421875,
0.056915283203125,
-0.004852294921875,
0.045806884765625,
0.029449462890625,
-0.046966552734375,
0.04656982421875,
-0.01194000244140625,
0.00716400146484375,
0.0037097930908203125,
0.0172119140625,
-0.0043487548828125,
0.005847930908203125,
-0.0679931640625,
-0.059417724609375,
0.06146240234375,
0.01499176025390625,
-0.0032291412353515625,
0.0264892578125,
0.0419921875,
-0.01399993896484375,
-0.00302886962890625,
-0.0631103515625,
-0.027496337890625,
-0.03973388671875,
-0.020172119140625,
-0.00643157958984375,
-0.0182037353515625,
-0.00970458984375,
-0.04931640625,
0.041107177734375,
-0.01021575927734375,
0.0572509765625,
0.031402587890625,
-0.004962921142578125,
-0.0192108154296875,
-0.0247039794921875,
0.0360107421875,
0.02569580078125,
-0.0270538330078125,
0.0157012939453125,
0.01343536376953125,
-0.047332763671875,
-0.0032958984375,
0.0248260498046875,
-0.005184173583984375,
-0.0005779266357421875,
0.036376953125,
0.07635498046875,
-0.005847930908203125,
-0.002933502197265625,
0.049652099609375,
-0.00408172607421875,
-0.037933349609375,
-0.0149383544921875,
0.01171112060546875,
-0.00804901123046875,
0.034881591796875,
0.02252197265625,
0.033416748046875,
-0.019378662109375,
-0.026641845703125,
0.0222930908203125,
0.04376220703125,
-0.042022705078125,
-0.03204345703125,
0.06268310546875,
-0.0179443359375,
-0.003040313720703125,
0.057586669921875,
-0.0012950897216796875,
-0.032440185546875,
0.07159423828125,
0.031494140625,
0.061553955078125,
-0.0183563232421875,
0.005321502685546875,
0.056793212890625,
0.0222625732421875,
-0.0089111328125,
0.015899658203125,
0.004093170166015625,
-0.0518798828125,
0.00257110595703125,
-0.040374755859375,
0.0027904510498046875,
0.024749755859375,
-0.0499267578125,
0.032684326171875,
-0.04791259765625,
-0.031097412109375,
0.00873565673828125,
0.01132965087890625,
-0.06951904296875,
0.017669677734375,
0.008636474609375,
0.04931640625,
-0.059326171875,
0.06683349609375,
0.058441162109375,
-0.045501708984375,
-0.069091796875,
-0.0080108642578125,
0.0038967132568359375,
-0.0712890625,
0.03668212890625,
0.03131103515625,
0.005859375,
0.01338958740234375,
-0.05609130859375,
-0.058441162109375,
0.11322021484375,
0.034576416015625,
-0.00969696044921875,
0.01392364501953125,
-0.00412750244140625,
0.0278167724609375,
-0.017669677734375,
0.0248565673828125,
0.01146697998046875,
0.0226898193359375,
0.028045654296875,
-0.058929443359375,
0.01165771484375,
-0.023590087890625,
0.01366424560546875,
0.00957489013671875,
-0.0714111328125,
0.07012939453125,
-0.04364013671875,
-0.00937652587890625,
0.0165557861328125,
0.04931640625,
0.01561737060546875,
0.0054931640625,
0.035797119140625,
0.06146240234375,
0.03985595703125,
-0.028564453125,
0.062042236328125,
-0.00659942626953125,
0.05609130859375,
0.044189453125,
0.030364990234375,
0.043914794921875,
0.037261962890625,
-0.029510498046875,
0.028564453125,
0.07318115234375,
-0.0404052734375,
0.0399169921875,
0.007442474365234375,
0.0028076171875,
-0.0252685546875,
0.00045013427734375,
-0.03668212890625,
0.037261962890625,
0.0171661376953125,
-0.046417236328125,
-0.0014028549194335938,
0.0010461807250976562,
0.0014944076538085938,
-0.022430419921875,
-0.0115814208984375,
0.041259765625,
0.00664520263671875,
-0.03515625,
0.06671142578125,
-0.0025005340576171875,
0.05596923828125,
-0.031341552734375,
0.0028285980224609375,
-0.02294921875,
0.035125732421875,
-0.02508544921875,
-0.06719970703125,
0.0098114013671875,
-0.016754150390625,
-0.00592803955078125,
0.0007944107055664062,
0.0516357421875,
-0.031524658203125,
-0.041259765625,
0.0174407958984375,
0.0227813720703125,
0.0152130126953125,
0.00677490234375,
-0.07427978515625,
0.0011081695556640625,
0.002857208251953125,
-0.0484619140625,
0.0267333984375,
0.039703369140625,
0.002796173095703125,
0.052093505859375,
0.0433349609375,
-0.01020050048828125,
0.0131378173828125,
-0.014404296875,
0.07122802734375,
-0.03057861328125,
-0.022705078125,
-0.061676025390625,
0.04559326171875,
-0.00400543212890625,
-0.041351318359375,
0.045440673828125,
0.039337158203125,
0.06024169921875,
-0.0012912750244140625,
0.03387451171875,
-0.0124664306640625,
-0.0005865097045898438,
-0.030731201171875,
0.04736328125,
-0.048492431640625,
0.0016965866088867188,
-0.0202178955078125,
-0.06719970703125,
-0.03582763671875,
0.0645751953125,
-0.0162506103515625,
0.028656005859375,
0.038177490234375,
0.07659912109375,
-0.0276336669921875,
-0.050018310546875,
0.01462554931640625,
0.0215301513671875,
0.01020050048828125,
0.033447265625,
0.031982421875,
-0.05804443359375,
0.05108642578125,
-0.04327392578125,
-0.015869140625,
-0.01220703125,
-0.0426025390625,
-0.08966064453125,
-0.057952880859375,
-0.045989990234375,
-0.05792236328125,
-0.01055908203125,
0.06072998046875,
0.07135009765625,
-0.049041748046875,
0.00267791748046875,
-0.006114959716796875,
0.01163482666015625,
-0.023895263671875,
-0.0157470703125,
0.0543212890625,
-0.0005855560302734375,
-0.05389404296875,
-0.0220184326171875,
0.004390716552734375,
0.0364990234375,
-0.01230621337890625,
-0.01641845703125,
-0.00975799560546875,
-0.0179290771484375,
0.0165557861328125,
0.02880859375,
-0.0628662109375,
-0.01837158203125,
-0.0081329345703125,
-0.01241302490234375,
0.043060302734375,
0.01499176025390625,
-0.05438232421875,
0.039337158203125,
0.039337158203125,
0.0139923095703125,
0.059051513671875,
-0.01168060302734375,
0.001598358154296875,
-0.05682373046875,
0.04156494140625,
-0.01499176025390625,
0.037689208984375,
0.042144775390625,
-0.018280029296875,
0.037200927734375,
0.053741455078125,
-0.0299835205078125,
-0.06390380859375,
-0.0033702850341796875,
-0.08843994140625,
0.00029969215393066406,
0.07916259765625,
-0.03289794921875,
-0.037811279296875,
0.030731201171875,
-0.00852203369140625,
0.048736572265625,
-0.006809234619140625,
0.037689208984375,
0.01375579833984375,
0.00737762451171875,
-0.0626220703125,
-0.02886962890625,
0.045257568359375,
0.006847381591796875,
-0.03912353515625,
-0.0268402099609375,
0.00821685791015625,
0.049652099609375,
0.0219268798828125,
0.02294921875,
-0.01209259033203125,
0.019500732421875,
0.007518768310546875,
0.0301361083984375,
-0.0237884521484375,
-0.012969970703125,
-0.0269775390625,
-0.004360198974609375,
-0.004253387451171875,
-0.044036865234375
]
] |
pankajmathur/orca_mini_v3_13b | 2023-08-25T23:13:49.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:psmathur/orca_mini_v1_dataset",
"dataset:ehartford/dolphin",
"arxiv:2306.02707",
"license:other",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | pankajmathur | null | null | pankajmathur/orca_mini_v3_13b | 27 | 12,933 | transformers | 2023-08-09T04:01:33 | ---
language:
- en
library_name: transformers
license: other
datasets:
- psmathur/orca_mini_v1_dataset
- ehartford/dolphin
pipeline_tag: text-generation
---
# orca_mini_v3_13b
A Llama2-13b model trained on Orca Style datasets.
<br>

<br>
**P.S. If you're interested in collaborating, please connect with me at www.linkedin.com/in/pankajam.**
<br>
### quantized versions
Big thanks to [@TheBloke](https://huggingface.co/TheBloke)
1) https://huggingface.co/TheBloke/orca_mini_v3_13B-GGML
2) https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ
<br>
#### license disclaimer:
This model is bound by the license and usage restrictions of the original Llama-2 model, and comes with no warranty or guarantees of any kind.
<br>
## Evaluation
We evaluated orca_mini_v3_13b on a wide range of tasks using [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) from EleutherAI.
Here are the results on the metrics used by the [HuggingFaceH4 Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
|||||
|:------:|:--------:|:-------:|:--------:|
|**Task**|**Metric**|**Value**|**Stderr**|
|*arc_challenge*|acc_norm|0.6314|0.0141|
|*hellaswag*|acc_norm|0.8242|0.0038|
|*mmlu*|acc_norm|0.5637|0.0351|
|*truthfulqa_mc*|mc2|0.5127|0.0157|
|**Total Average**|-|**0.6329877193**||
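The total average is the unweighted mean of the four metric values. A quick sanity check using the rounded values from the table (which is why it differs slightly in later digits from the reported figure, computed from unrounded metrics):

```python
# Unweighted mean of the four leaderboard metrics, using the rounded table values.
scores = {
    "arc_challenge": 0.6314,
    "hellaswag": 0.8242,
    "mmlu": 0.5637,
    "truthfulqa_mc": 0.5127,
}
average = sum(scores.values()) / len(scores)
print(round(average, 4))  # 0.633
```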
<br>
## Example Usage
Here is the prompt format:
```
### System:
You are an AI assistant that follows instruction extremely well. Help as much as you can.
### User:
Tell me about Orcas.
### Assistant:
```
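The same format can be assembled programmatically. A minimal sketch — the helper below is illustrative only, not part of any released API:

```python
def build_prompt(
    instruction: str,
    system: str = (
        "You are an AI assistant that follows instruction "
        "extremely well. Help as much as you can."
    ),
) -> str:
    """Assemble a prompt in the ### System / ### User / ### Assistant format."""
    return f"### System:\n{system}\n\n### User:\n{instruction}\n\n### Assistant:\n"

prompt = build_prompt("Tell me about Orcas.")
print(prompt)
```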
Below is a code example showing how to use this model:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
tokenizer = AutoTokenizer.from_pretrained("psmathur/orca_mini_v3_13b")
model = AutoModelForCausalLM.from_pretrained(
"psmathur/orca_mini_v3_13b",
torch_dtype=torch.float16,
load_in_8bit=True,
low_cpu_mem_usage=True,
device_map="auto"
)
system_prompt = "### System:\nYou are an AI assistant that follows instruction extremely well. Help as much as you can.\n\n"
#generate text steps
instruction = "Tell me about Orcas."
prompt = f"{system_prompt}### User: {instruction}\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=4096)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
<br>
#### Limitations & Biases:
While this model aims for accuracy, it can occasionally produce inaccurate or misleading results.
Despite diligent efforts in refining the pretraining data, there remains a possibility for the generation of inappropriate, biased, or offensive content.
Exercise caution and cross-check information when necessary.
<br>
### Citation:
Please kindly cite using the following BibTeX:
```
@misc{orca_mini_v3_13b,
author = {Pankaj Mathur},
title = {orca_mini_v3_13b: An Orca Style Llama2-13b model},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/psmathur/orca_mini_v3_13b}},
}
```
```
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```
@software{touvron2023llama2,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron, Louis Martin, Kevin Stone, Peter Albert, Amjad Almahairi, Yasmine Babaei, Nikolay Bashlykov, Soumya Batra, Prajjwal Bhargava,
Shruti Bhosale, Dan Bikel, Lukas Blecher, Cristian Canton Ferrer, Moya Chen, Guillem Cucurull, David Esiobu, Jude Fernandes, Jeremy Fu, Wenyin Fu, Brian Fuller,
Cynthia Gao, Vedanuj Goswami, Naman Goyal, Anthony Hartshorn, Saghar Hosseini, Rui Hou, Hakan Inan, Marcin Kardas, Viktor Kerkez Madian Khabsa, Isabel Kloumann,
Artem Korenev, Punit Singh Koura, Marie-Anne Lachaux, Thibaut Lavril, Jenya Lee, Diana Liskovich, Yinghai Lu, Yuning Mao, Xavier Martinet, Todor Mihaylov,
Pushkar Mishra, Igor Molybog, Yixin Nie, Andrew Poulton, Jeremy Reizenstein, Rashi Rungta, Kalyan Saladi, Alan Schelten, Ruan Silva, Eric Michael Smith,
Ranjan Subramanian, Xiaoqing Ellen Tan, Binh Tang, Ross Taylor, Adina Williams, Jian Xiang Kuan, Puxin Xu , Zheng Yan, Iliyan Zarov, Yuchen Zhang, Angela Fan,
Melanie Kambadur, Sharan Narang, Aurelien Rodriguez, Robert Stojnic, Sergey Edunov, Thomas Scialom},
year={2023}
}
``` | 4,742 | [
[
-0.02191162109375,
-0.06268310546875,
0.01486968994140625,
0.01071929931640625,
-0.017669677734375,
0.00031304359436035156,
-0.00396728515625,
-0.04681396484375,
0.020111083984375,
0.01157379150390625,
-0.054901123046875,
-0.04534912109375,
-0.04241943359375,
-0.00955963134765625,
-0.01529693603515625,
0.07904052734375,
-0.00128936767578125,
-0.01253509521484375,
0.006244659423828125,
-0.01300048828125,
-0.039520263671875,
-0.028167724609375,
-0.07513427734375,
-0.0323486328125,
0.0206146240234375,
0.0181884765625,
0.04620361328125,
0.055023193359375,
0.0280303955078125,
0.02520751953125,
-0.018280029296875,
0.0190277099609375,
-0.031707763671875,
-0.0177001953125,
0.0174407958984375,
-0.0416259765625,
-0.07269287109375,
0.0015821456909179688,
0.0268707275390625,
0.02056884765625,
-0.0162811279296875,
0.031707763671875,
0.01459503173828125,
0.02642822265625,
-0.0268707275390625,
0.037200927734375,
-0.0218505859375,
0.00006777048110961914,
-0.027099609375,
-0.00269317626953125,
-0.0029163360595703125,
-0.0229949951171875,
0.00341796875,
-0.056915283203125,
0.00557708740234375,
-0.006633758544921875,
0.0870361328125,
0.02984619140625,
-0.0235137939453125,
-0.0249786376953125,
-0.02777099609375,
0.054931640625,
-0.07037353515625,
0.0134735107421875,
0.0092926025390625,
0.0181884765625,
-0.021728515625,
-0.06890869140625,
-0.05560302734375,
-0.0127410888671875,
-0.0111846923828125,
0.015899658203125,
-0.0159454345703125,
-0.004665374755859375,
0.0257415771484375,
0.026580810546875,
-0.04010009765625,
0.017242431640625,
-0.032958984375,
-0.01824951171875,
0.044586181640625,
0.030914306640625,
0.0232696533203125,
-0.01013946533203125,
-0.0245208740234375,
-0.027679443359375,
-0.0521240234375,
0.033935546875,
0.03289794921875,
0.01399993896484375,
-0.044769287109375,
0.047760009765625,
-0.004566192626953125,
0.0455322265625,
0.01459503173828125,
-0.032196044921875,
0.0396728515625,
-0.037322998046875,
-0.02825927734375,
-0.01220703125,
0.06536865234375,
0.01430511474609375,
0.00482177734375,
0.0247650146484375,
-0.00649261474609375,
0.01256561279296875,
-0.0138397216796875,
-0.056915283203125,
-0.026611328125,
0.01346588134765625,
-0.03515625,
-0.0289764404296875,
-0.01306915283203125,
-0.05755615234375,
-0.0171966552734375,
-0.01290130615234375,
0.0270538330078125,
-0.03271484375,
-0.027618408203125,
0.0177154541015625,
0.012725830078125,
0.039642333984375,
0.01085662841796875,
-0.0693359375,
0.0251617431640625,
0.0233612060546875,
0.0618896484375,
0.004756927490234375,
-0.0171966552734375,
-0.01200103759765625,
-0.0008211135864257812,
-0.0100860595703125,
0.05078125,
-0.018829345703125,
-0.0284576416015625,
-0.02734375,
-0.0003821849822998047,
-0.0124053955078125,
-0.0253143310546875,
0.03912353515625,
-0.0249176025390625,
0.0195465087890625,
-0.022247314453125,
-0.022705078125,
-0.0226593017578125,
0.016021728515625,
-0.04205322265625,
0.09246826171875,
-0.0029087066650390625,
-0.06231689453125,
0.019866943359375,
-0.06097412109375,
-0.00870513916015625,
-0.021209716796875,
-0.0177764892578125,
-0.05096435546875,
-0.0226593017578125,
0.03387451171875,
0.0274505615234375,
-0.0201416015625,
0.00785064697265625,
-0.029754638671875,
-0.0237274169921875,
0.00499725341796875,
-0.0167236328125,
0.08428955078125,
0.01309967041015625,
-0.053558349609375,
0.0207366943359375,
-0.057647705078125,
-0.0074920654296875,
0.03350830078125,
-0.024810791015625,
0.0025577545166015625,
-0.01287078857421875,
-0.0157470703125,
0.0189971923828125,
0.0330810546875,
-0.037689208984375,
0.024383544921875,
-0.025543212890625,
0.04754638671875,
0.05999755859375,
-0.0081787109375,
0.0214080810546875,
-0.030487060546875,
0.039581298828125,
0.0011281967163085938,
0.0225372314453125,
0.009002685546875,
-0.0498046875,
-0.081787109375,
-0.032806396484375,
0.0200042724609375,
0.037506103515625,
-0.0458984375,
0.043731689453125,
-0.01244354248046875,
-0.05352783203125,
-0.034423828125,
0.0003542900085449219,
0.0285186767578125,
0.04669189453125,
0.026641845703125,
-0.0238800048828125,
-0.05224609375,
-0.055999755859375,
0.007419586181640625,
-0.0158538818359375,
-0.0033817291259765625,
0.02996826171875,
0.0419921875,
-0.02154541015625,
0.07318115234375,
-0.035308837890625,
-0.031005859375,
-0.020477294921875,
0.00872802734375,
0.027069091796875,
0.05072021484375,
0.055877685546875,
-0.038909912109375,
-0.0238800048828125,
-0.0171661376953125,
-0.0670166015625,
-0.002349853515625,
-0.0003864765167236328,
-0.0236358642578125,
0.01309967041015625,
0.032501220703125,
-0.05804443359375,
0.048370361328125,
0.042388916015625,
-0.030975341796875,
0.04815673828125,
-0.0107421875,
-0.0077056884765625,
-0.0693359375,
0.0209808349609375,
-0.002590179443359375,
-0.01251983642578125,
-0.02996826171875,
-0.004650115966796875,
-0.0082550048828125,
0.004077911376953125,
-0.03228759765625,
0.052001953125,
-0.031158447265625,
0.003032684326171875,
-0.006053924560546875,
0.007720947265625,
-0.01006317138671875,
0.056854248046875,
0.000850677490234375,
0.04901123046875,
0.052947998046875,
-0.0352783203125,
0.0289306640625,
0.027313232421875,
-0.0304107666015625,
0.0269622802734375,
-0.071044921875,
0.0245208740234375,
0.00858306884765625,
0.037506103515625,
-0.08880615234375,
-0.0160675048828125,
0.041748046875,
-0.042572021484375,
0.032867431640625,
0.003814697265625,
-0.040771484375,
-0.036376953125,
-0.033172607421875,
0.036346435546875,
0.038482666015625,
-0.041748046875,
0.040740966796875,
0.025360107421875,
-0.002353668212890625,
-0.048553466796875,
-0.056488037109375,
-0.0157623291015625,
-0.0298309326171875,
-0.055816650390625,
0.024505615234375,
-0.0196990966796875,
0.00650787353515625,
-0.00650787353515625,
-0.01446533203125,
0.00937652587890625,
0.00015103816986083984,
0.025299072265625,
0.04010009765625,
-0.01470947265625,
-0.0179443359375,
-0.001453399658203125,
-0.0112762451171875,
0.004619598388671875,
-0.0066070556640625,
0.054534912109375,
-0.030670166015625,
-0.02105712890625,
-0.047760009765625,
-0.005893707275390625,
0.030548095703125,
-0.013153076171875,
0.059539794921875,
0.058380126953125,
-0.025543212890625,
0.0152740478515625,
-0.038818359375,
-0.0206298828125,
-0.039947509765625,
0.01983642578125,
-0.037109375,
-0.060150146484375,
0.0609130859375,
0.017608642578125,
0.015655517578125,
0.0537109375,
0.050018310546875,
-0.0024776458740234375,
0.0794677734375,
0.058380126953125,
0.0006990432739257812,
0.045562744140625,
-0.05072021484375,
0.00217437744140625,
-0.07098388671875,
-0.045989990234375,
-0.0307159423828125,
-0.033203125,
-0.038726806640625,
-0.0191192626953125,
0.0273590087890625,
0.01318359375,
-0.045501708984375,
0.03411865234375,
-0.0460205078125,
0.007396697998046875,
0.042266845703125,
0.0256500244140625,
0.0144195556640625,
-0.0023250579833984375,
-0.0164794921875,
0.0037670135498046875,
-0.058380126953125,
-0.0413818359375,
0.08966064453125,
0.0306396484375,
0.0574951171875,
0.007503509521484375,
0.042266845703125,
0.0035686492919921875,
0.0204010009765625,
-0.0411376953125,
0.0406494140625,
0.017303466796875,
-0.05322265625,
-0.022552490234375,
-0.0263214111328125,
-0.0806884765625,
0.01155853271484375,
-0.003173828125,
-0.065185546875,
0.021942138671875,
0.00809478759765625,
-0.047393798828125,
0.0189361572265625,
-0.040985107421875,
0.06695556640625,
-0.01104736328125,
-0.00197601318359375,
-0.003170013427734375,
-0.058990478515625,
0.04229736328125,
0.003688812255859375,
0.0041351318359375,
-0.006458282470703125,
-0.01262664794921875,
0.07666015625,
-0.0450439453125,
0.07501220703125,
-0.004222869873046875,
-0.00862884521484375,
0.038482666015625,
-0.00682830810546875,
0.048919677734375,
0.010772705078125,
-0.005786895751953125,
0.0240631103515625,
-0.003253936767578125,
-0.034454345703125,
-0.0223846435546875,
0.0465087890625,
-0.085205078125,
-0.03704833984375,
-0.030792236328125,
-0.0247650146484375,
0.00887298583984375,
0.01654052734375,
0.03631591796875,
0.023529052734375,
0.01806640625,
0.0038394927978515625,
0.039276123046875,
-0.0174407958984375,
0.038726806640625,
0.031524658203125,
-0.0031261444091796875,
-0.0308990478515625,
0.05401611328125,
0.01348114013671875,
0.012786865234375,
0.006282806396484375,
0.002971649169921875,
-0.0330810546875,
-0.033355712890625,
-0.03009033203125,
0.040130615234375,
-0.04437255859375,
-0.0291290283203125,
-0.04876708984375,
-0.019744873046875,
-0.031036376953125,
-0.0011816024780273438,
-0.035186767578125,
-0.0252532958984375,
-0.04913330078125,
-0.01409912109375,
0.0369873046875,
0.040496826171875,
-0.01470947265625,
0.02606201171875,
-0.026641845703125,
0.01303863525390625,
0.0294952392578125,
0.0059814453125,
0.00859832763671875,
-0.06524658203125,
-0.00876617431640625,
0.01549530029296875,
-0.046142578125,
-0.047210693359375,
0.03643798828125,
0.00664520263671875,
0.046051025390625,
0.01549530029296875,
-0.00537109375,
0.07684326171875,
-0.00897979736328125,
0.0711669921875,
0.024383544921875,
-0.07574462890625,
0.0408935546875,
-0.0244140625,
0.01413726806640625,
0.0178680419921875,
0.02069091796875,
-0.016998291015625,
-0.025909423828125,
-0.0703125,
-0.0672607421875,
0.0665283203125,
0.0299835205078125,
0.00201416015625,
0.01206207275390625,
0.03216552734375,
0.0036945343017578125,
0.0108795166015625,
-0.072509765625,
-0.040008544921875,
-0.025390625,
-0.0062255859375,
0.0020198822021484375,
-0.0171661376953125,
-0.002349853515625,
-0.0251312255859375,
0.057037353515625,
-0.0009598731994628906,
0.044677734375,
0.01081085205078125,
0.0051727294921875,
-0.00438690185546875,
-0.0134124755859375,
0.052978515625,
0.04229736328125,
-0.021026611328125,
-0.00518035888671875,
0.0276336669921875,
-0.044036865234375,
-0.00040030479431152344,
0.007701873779296875,
-0.0011348724365234375,
-0.00848388671875,
0.029022216796875,
0.05224609375,
-0.006744384765625,
-0.0311279296875,
0.02960205078125,
-0.0080108642578125,
-0.007389068603515625,
-0.03375244140625,
0.00620269775390625,
0.01409912109375,
0.03643798828125,
0.018310546875,
0.0138092041015625,
-0.005031585693359375,
-0.04071044921875,
-0.005603790283203125,
0.023345947265625,
-0.0074615478515625,
-0.03302001953125,
0.07220458984375,
0.00218963623046875,
-0.0116424560546875,
0.0494384765625,
-0.00485992431640625,
-0.0307159423828125,
0.06378173828125,
0.024627685546875,
0.043426513671875,
-0.0130615234375,
-0.0036773681640625,
0.039398193359375,
0.015777587890625,
-0.007297515869140625,
0.029937744140625,
0.0035610198974609375,
-0.04095458984375,
-0.026397705078125,
-0.03643798828125,
-0.01446533203125,
0.0303192138671875,
-0.047454833984375,
0.04095458984375,
-0.03607177734375,
-0.0243377685546875,
-0.004619598388671875,
0.0258026123046875,
-0.06195068359375,
0.0151519775390625,
0.016693115234375,
0.06866455078125,
-0.0504150390625,
0.07598876953125,
0.043670654296875,
-0.0509033203125,
-0.087890625,
-0.0269317626953125,
0.003894805908203125,
-0.07440185546875,
0.04351806640625,
0.0024547576904296875,
-0.007099151611328125,
0.01146697998046875,
-0.055389404296875,
-0.07879638671875,
0.10968017578125,
0.038116455078125,
-0.0262451171875,
-0.00707244873046875,
-0.0006923675537109375,
0.041656494140625,
-0.0188446044921875,
0.047393798828125,
0.05438232421875,
0.031341552734375,
0.0141754150390625,
-0.0797119140625,
0.0252227783203125,
-0.0283966064453125,
-0.00383758544921875,
-0.007415771484375,
-0.07989501953125,
0.08929443359375,
-0.0189971923828125,
-0.00897216796875,
0.0303955078125,
0.05657958984375,
0.04656982421875,
0.00862884521484375,
0.0286712646484375,
0.045166015625,
0.052490234375,
-0.00949859619140625,
0.0706787109375,
-0.01104736328125,
0.048187255859375,
0.0653076171875,
0.0149078369140625,
0.04583740234375,
0.01690673828125,
-0.02593994140625,
0.05059814453125,
0.07647705078125,
-0.0006012916564941406,
0.044647216796875,
0.0106658935546875,
0.0018815994262695312,
-0.0078125,
0.006282806396484375,
-0.05340576171875,
0.0240936279296875,
0.03509521484375,
-0.0197906494140625,
-0.01309967041015625,
-0.01027679443359375,
0.017791748046875,
-0.028167724609375,
-0.0083770751953125,
0.04290771484375,
0.012725830078125,
-0.023406982421875,
0.081787109375,
0.003101348876953125,
0.0693359375,
-0.0494384765625,
0.00431060791015625,
-0.034271240234375,
0.010498046875,
-0.02923583984375,
-0.04669189453125,
0.0082244873046875,
-0.0015020370483398438,
0.00653076171875,
-0.003398895263671875,
0.037933349609375,
-0.0170440673828125,
-0.0272674560546875,
0.016357421875,
0.019073486328125,
0.024993896484375,
0.012542724609375,
-0.07415771484375,
0.020904541015625,
0.007579803466796875,
-0.0526123046875,
0.0189361572265625,
0.032379150390625,
0.003078460693359375,
0.056243896484375,
0.046234130859375,
-0.0032978057861328125,
0.0185546875,
-0.0189056396484375,
0.08349609375,
-0.033172607421875,
-0.0306396484375,
-0.07208251953125,
0.042205810546875,
0.0025997161865234375,
-0.037506103515625,
0.060638427734375,
0.036773681640625,
0.0662841796875,
0.00335693359375,
0.045806884765625,
-0.027801513671875,
0.020233154296875,
-0.030914306640625,
0.050079345703125,
-0.050323486328125,
0.0285491943359375,
-0.017333984375,
-0.0662841796875,
-0.014739990234375,
0.068603515625,
-0.0249786376953125,
0.0126953125,
0.0426025390625,
0.06536865234375,
-0.006023406982421875,
-0.00827789306640625,
-0.00022673606872558594,
0.0258331298828125,
0.040069580078125,
0.060638427734375,
0.0458984375,
-0.04925537109375,
0.06463623046875,
-0.03173828125,
-0.023773193359375,
-0.019805908203125,
-0.0615234375,
-0.0706787109375,
-0.0266265869140625,
-0.033721923828125,
-0.034393310546875,
-0.00951385498046875,
0.0635986328125,
0.06427001953125,
-0.05322265625,
-0.026031494140625,
-0.0101776123046875,
-0.00032067298889160156,
-0.028594970703125,
-0.01265716552734375,
0.050506591796875,
-0.0004787445068359375,
-0.0634765625,
0.00820159912109375,
-0.006000518798828125,
0.03326416015625,
-0.0157012939453125,
-0.01381683349609375,
-0.0157012939453125,
0.0019817352294921875,
0.0240936279296875,
0.0382080078125,
-0.051605224609375,
-0.023406982421875,
-0.00540924072265625,
-0.0225677490234375,
0.01197052001953125,
0.0228271484375,
-0.061614990234375,
0.0196990966796875,
0.022491455078125,
0.01424407958984375,
0.06378173828125,
-0.01158905029296875,
0.01459503173828125,
-0.0418701171875,
0.0260772705078125,
0.0021381378173828125,
0.0291595458984375,
0.012969970703125,
-0.02685546875,
0.053558349609375,
0.0209197998046875,
-0.0352783203125,
-0.0643310546875,
-0.0025787353515625,
-0.0933837890625,
0.005664825439453125,
0.08453369140625,
-0.0290374755859375,
-0.0223541259765625,
0.01052093505859375,
-0.02947998046875,
0.0423583984375,
-0.037689208984375,
0.06524658203125,
0.033599853515625,
-0.022705078125,
-0.007381439208984375,
-0.03704833984375,
0.039764404296875,
0.0195159912109375,
-0.0618896484375,
-0.024505615234375,
0.01000213623046875,
0.035064697265625,
0.0167083740234375,
0.051666259765625,
-0.00962066650390625,
0.01483917236328125,
0.00641632080078125,
0.011260986328125,
-0.0247650146484375,
-0.00040912628173828125,
-0.01207733154296875,
-0.010833740234375,
-0.010467529296875,
-0.0318603515625
]
] |
Helsinki-NLP/opus-mt-fi-de | 2023-08-16T11:34:22.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"fi",
"de",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-fi-de | 0 | 12,927 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-fi-de
* source languages: fi
* target languages: de
* OPUS readme: [fi-de](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/fi-de/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-04.zip](https://object.pouta.csc.fi/OPUS-MT-models/fi-de/opus-2019-12-04.zip)
* test set translations: [opus-2019-12-04.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/fi-de/opus-2019-12-04.test.txt)
* test set scores: [opus-2019-12-04.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/fi-de/opus-2019-12-04.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.fi.de | 45.2 | 0.637 |
| 818 | [
[
-0.02276611328125,
-0.04119873046875,
0.0194549560546875,
0.0258331298828125,
-0.032958984375,
-0.027618408203125,
-0.032806396484375,
-0.0049285888671875,
0.002735137939453125,
0.03302001953125,
-0.050628662109375,
-0.03936767578125,
-0.04632568359375,
0.0193328857421875,
-0.00970458984375,
0.050628662109375,
-0.0109405517578125,
0.035552978515625,
0.0149078369140625,
-0.03228759765625,
-0.0287322998046875,
-0.0301055908203125,
-0.037078857421875,
-0.0272369384765625,
0.0230255126953125,
0.031524658203125,
0.0321044921875,
0.0274658203125,
0.068359375,
0.0186004638671875,
-0.006397247314453125,
0.0066986083984375,
-0.037384033203125,
-0.0021152496337890625,
0.0009860992431640625,
-0.038909912109375,
-0.051422119140625,
-0.0112152099609375,
0.0765380859375,
0.031646728515625,
0.0022716522216796875,
0.0262298583984375,
-0.007099151611328125,
0.07568359375,
-0.024139404296875,
0.003963470458984375,
-0.04425048828125,
-0.00024080276489257812,
-0.02490234375,
-0.0243072509765625,
-0.052520751953125,
-0.016265869140625,
0.007381439208984375,
-0.0439453125,
-0.004695892333984375,
0.005939483642578125,
0.10723876953125,
0.0298614501953125,
-0.0330810546875,
-0.006946563720703125,
-0.050689697265625,
0.0709228515625,
-0.053802490234375,
0.051849365234375,
0.034332275390625,
0.01983642578125,
0.01025390625,
-0.04168701171875,
-0.0248565673828125,
0.00881195068359375,
-0.0162200927734375,
0.01319122314453125,
-0.01305389404296875,
-0.0183563232421875,
0.021148681640625,
0.056304931640625,
-0.050567626953125,
-0.0026092529296875,
-0.042877197265625,
0.0021877288818359375,
0.05438232421875,
0.00295257568359375,
0.006595611572265625,
-0.007137298583984375,
-0.035491943359375,
-0.041168212890625,
-0.053680419921875,
0.0036067962646484375,
0.0252532958984375,
0.0160675048828125,
-0.03741455078125,
0.052154541015625,
-0.01410675048828125,
0.047393798828125,
0.0016660690307617188,
0.00188446044921875,
0.07403564453125,
-0.0293426513671875,
-0.02789306640625,
-0.01151275634765625,
0.08819580078125,
0.0240936279296875,
0.01110076904296875,
0.0034542083740234375,
-0.0193328857421875,
-0.022125244140625,
0.00899505615234375,
-0.0672607421875,
-0.0045166015625,
0.01471710205078125,
-0.036651611328125,
-0.01229095458984375,
0.005062103271484375,
-0.046661376953125,
0.01520538330078125,
-0.0311279296875,
0.051177978515625,
-0.0498046875,
-0.0234832763671875,
0.0270843505859375,
-0.00023829936981201172,
0.0265350341796875,
-0.0005750656127929688,
-0.044921875,
0.01058197021484375,
0.0267333984375,
0.056640625,
-0.0266265869140625,
-0.014984130859375,
-0.04095458984375,
-0.015899658203125,
-0.0046234130859375,
0.051971435546875,
-0.006107330322265625,
-0.039581298828125,
-0.0078887939453125,
0.033447265625,
-0.03302001953125,
-0.022064208984375,
0.1025390625,
-0.0227508544921875,
0.052825927734375,
-0.028106689453125,
-0.04266357421875,
-0.0269927978515625,
0.0282440185546875,
-0.044158935546875,
0.09088134765625,
0.0141754150390625,
-0.06475830078125,
0.01131439208984375,
-0.06488037109375,
-0.0162506103515625,
-0.00634765625,
0.003818511962890625,
-0.04400634765625,
0.01409912109375,
0.004680633544921875,
0.028778076171875,
-0.02471923828125,
0.0216217041015625,
-0.0027141571044921875,
-0.027587890625,
0.0082855224609375,
-0.0350341796875,
0.076171875,
0.0284423828125,
-0.0244293212890625,
0.016510009765625,
-0.068603515625,
-0.004184722900390625,
-0.00003325939178466797,
-0.03302001953125,
-0.0157318115234375,
0.0104217529296875,
0.0191802978515625,
0.0096893310546875,
0.0209503173828125,
-0.050933837890625,
0.0185546875,
-0.04815673828125,
0.01224517822265625,
0.0479736328125,
-0.02587890625,
0.0243377685546875,
-0.03314208984375,
0.0214080810546875,
0.00798797607421875,
0.006412506103515625,
0.00031375885009765625,
-0.031524658203125,
-0.0599365234375,
-0.0230255126953125,
0.0477294921875,
0.07952880859375,
-0.05517578125,
0.0645751953125,
-0.052764892578125,
-0.0594482421875,
-0.054107666015625,
-0.0137481689453125,
0.0292510986328125,
0.03289794921875,
0.03729248046875,
-0.00786590576171875,
-0.02587890625,
-0.0797119140625,
-0.0038471221923828125,
-0.01320648193359375,
-0.016510009765625,
0.012054443359375,
0.045257568359375,
-0.019134521484375,
0.040802001953125,
-0.041717529296875,
-0.03253173828125,
-0.0048065185546875,
0.009368896484375,
0.036956787109375,
0.0489501953125,
0.04248046875,
-0.06817626953125,
-0.04052734375,
0.0009784698486328125,
-0.049713134765625,
-0.01409912109375,
0.00852203369140625,
-0.0150299072265625,
0.01360321044921875,
0.00783538818359375,
-0.022918701171875,
0.00856781005859375,
0.04736328125,
-0.0537109375,
0.034423828125,
-0.010498046875,
0.025421142578125,
-0.09881591796875,
0.01507568359375,
-0.0171051025390625,
-0.009429931640625,
-0.033721923828125,
-0.00334930419921875,
0.0161895751953125,
0.00902557373046875,
-0.0592041015625,
0.044769287109375,
-0.0095062255859375,
0.0010480880737304688,
---
language: en
license: apache-2.0
tags:
- transformers
- pytorch
- tf
- led
- text2text-generation
- arxiv:2004.05150
pipeline_tag: text2text-generation
---

# allenai/led-base-16384
## Introduction

This model is [Allenai's Longformer Encoder-Decoder (LED)](https://github.com/allenai/longformer#longformer). As described in [Longformer: The Long-Document Transformer](https://arxiv.org/pdf/2004.05150.pdf) by Iz Beltagy, Matthew E. Peters, and Arman Cohan, *led-base-16384* was initialized from [*bart-base*](https://huggingface.co/facebook/bart-base), since the two models share the exact same architecture. To let the model process up to 16K tokens, *bart-base*'s position embedding matrix was simply copied 16 times.

This model is especially well suited to long-range summarization and question answering.
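A minimal usage sketch for long-document summarization with the `transformers` library is shown below. It assumes `transformers` and `torch` are installed and downloads the model weights on first use; the convention of giving the first token global attention follows the standard LED summarization setup, and the `max_length`/`num_beams` values are illustrative choices, not recommendations from this card.

```python
# Sketch: summarizing a long document with LED via Hugging Face transformers.
import torch
from transformers import AutoTokenizer, LEDForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("allenai/led-base-16384")
model = LEDForConditionalGeneration.from_pretrained("allenai/led-base-16384")

long_text = "..."  # your long input document, up to ~16K tokens
inputs = tokenizer(long_text, return_tensors="pt",
                   truncation=True, max_length=16384)

# LED combines local windowed attention with a few global tokens; for
# summarization the first token (<s>) is conventionally made global.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

summary_ids = model.generate(
    inputs["input_ids"],
    global_attention_mask=global_attention_mask,
    max_length=256,
    num_beams=4,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```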
## Fine-tuning for downstream tasks

[This notebook](https://colab.research.google.com/drive/12LjJazBl7Gam0XBPy_y0CTOJZeZ34c2v?usp=sharing) shows how *led-base-16384* can effectively be fine-tuned on a downstream task.